"Shivam Jain" wrote in message <np6ks3$9be$1...@newscl01ah.mathworks.com>...
> Thanks Greg sir. But I have one doubt. I see that both 'mse' and 'mean(error.^2)' are giving the same value as desired. But why is the nntraintool that pops up when we run the program showing a performance that is different from the mse command? It also mentions that it measures performance through the mean squared error. Does it take something else into account?
Of course. The state of the random number generator determines BOTH
1. the trn/val/tst data division
AND
2. the initial random weights.
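A quick sketch of why the RNG state matters (assuming the Neural Network Toolbox functions simplefit_dataset, newff, and train): resetting the seed before a run reproduces both the random data division and the random initial weights, so two runs from the same state give the same trained net and the same performance.

```matlab
% Sketch: same RNG state => same data division and same initial weights,
% so two runs should produce identical results.
[inputs, targets] = simplefit_dataset;

rng(0)                               % reset the generator
net1 = newff(inputs, targets, 20);   % random initial weights drawn here
[net1, tr1] = train(net1, inputs, targets); % random data division drawn here

rng(0)                               % reset to the SAME state
net2 = newff(inputs, targets, 20);
[net2, tr2] = train(net2, inputs, targets);

isequal(tr1.trainInd, tr2.trainInd)  % same trn/val/tst division
% The two performance values should agree, since nothing random differs:
perform(net1, targets, net1(inputs))
perform(net2, targets, net2(inputs))
```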
From the help documentation:
% >> [inputs,targets] = simplefitdata;
% Undefined function or variable 'simplefitdata'.
close all, clear all, clc
rng(0)                                    % necessary for reproducibility
perf = zeros(1,10);                       % preallocate
for i = 1:10
    [inputs, targets] = simplefit_dataset;
    net = newff(inputs, targets, 20);     % new random initial weights each pass
    net = train(net, inputs, targets);    % new random data division each pass
    outputs = net(inputs);
    errors = outputs - targets;
    perf(i) = perform(net, targets, outputs); % documented order: (net, t, y)
end
result = perf
% result = 0.0000 0.0000 0.0004 0.0000 0.0088
%          0.0000 0.0098 0.0005 0.0027 0.0005
check1 = mse(errors)      % 4.5740e-04 (errors from the last run only)
check2 = mean(errors.^2)  % 4.5740e-04
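One way to see where a discrepancy can come from (a sketch, assuming the training record tr returned by train with its trainInd/valInd/testInd fields): mse(errors) pools every sample, while training tracks performance separately on the training, validation, and test subsets created by the random data division, and a subset MSE generally differs from the pooled MSE.

```matlab
% Sketch: per-subset MSE vs pooled MSE over all samples.
% tr.trainInd/valInd/testInd are the indices of the random data division.
[inputs, targets] = simplefit_dataset;
rng(0)
net = newff(inputs, targets, 20);
[net, tr] = train(net, inputs, targets);
errors = net(inputs) - targets;

mseAll = mean(errors.^2)               % pooled over ALL samples
mseTrn = mean(errors(tr.trainInd).^2)  % training subset only
mseVal = mean(errors(tr.valInd).^2)    % validation subset only
mseTst = mean(errors(tr.testInd).^2)   % test subset only
% In general mseAll differs from each subset value, which is one reason
% a value displayed during training need not match mse(errors).
```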
Hope this helps.
Greg