Hi,
yes, '1)' makes a lot of sense.
You may need to bypass the learning-rate scheduling in the 1st epoch on the new data by:
steps/nnet/train.sh --scheduler-opts "--keep-lr-iters 1" ...
As for '2)', I am currently working on it; there will be a tool which sets the '<LearnRateCoef>' inside the models.
The model is the 'exp/.../final.nnet'.
Meanwhile, it can be re-saved as ASCII by:
nnet-copy --binary=false nn.in nn.out
and modified manually by changing the values of '<LearnRateCoef>' and '<BiasLearnRateCoef>'.
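Instead of editing by hand, the same change can be scripted. A minimal sketch, assuming the ASCII model is in 'nn.txt' (the filename and the example component line are hypothetical; the real file comes from 'nnet-copy --binary=false'). Setting both coefficients to 0 freezes the layer, i.e. excludes it from weight updates:

```shell
# Hypothetical ASCII nnet fragment standing in for the output of
# 'nnet-copy --binary=false' (real models have more content):
printf '<AffineTransform> 100 40\n<LearnRateCoef> 1 <BiasLearnRateCoef> 1\n' > nn.txt

# Zero both learning-rate coefficients with sed; note this rewrites
# every occurrence, so restrict the pattern if you only want some layers:
sed -e 's/<LearnRateCoef> [0-9.e+-]*/<LearnRateCoef> 0/g' \
    -e 's/<BiasLearnRateCoef> [0-9.e+-]*/<BiasLearnRateCoef> 0/g' \
    nn.txt > nn_frozen.txt
```

The frozen model can then be converted back to binary with 'nnet-copy' if desired.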
All the best,
Karel.