
Calculation of mean square error for a neural network


Shivam Jain

Aug 18, 2016, 5:23:09 AM
Following is the MATLAB program I am running. b is a 308x7 matrix consisting of 308 instances: columns 1-6 are the inputs and column 7 is the target output. I have to train this neural network.
i1 = b(:,1);
i2 = b(:,2);
i3 = b(:,3);
i4 = b(:,4);
i5 = b(:,5);
i6 = b(:,6);
i = [i1,i2,i3,i4,i5,i6]';
target = b(:,7)';
net = newff(i,target,10);
net = train(net,i,target);
Output = sim(net,i);
error = mean(target-Output)^2
final = sqrt(error)
perf = mse(error)

When I run it, the nntraintool window shows performance = 0.0266, which the toolbox describes as the mean squared error. However, 'error', which I have defined as the mean squared error, gives something else (1.1359e-04), which is not equal to 0.0266. The perf variable also computes the MSE, but it equals 1.2902e-08. Should not all three values be equal, since all of them calculate the mean squared error? Which is the correct mean squared error that I can submit as an answer, and why are the other two wrong?

Shivam Jain

Aug 18, 2016, 9:06:07 AM
Sorry, the fourth-to-last line should read
error = mean((target-Output).^2). There was a copy-and-paste mistake. My question remains the same as given above.
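(Just to make the difference between the two lines concrete, a tiny illustrative check; the numbers below are made up:)

e = [0.1 -0.2 0.3];   % illustrative error vector
mean(e)^2             % square of the mean error   -> 0.0044
mean(e.^2)            % mean of the squared errors -> 0.0467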

Greg Heath

Aug 18, 2016, 12:42:09 PM
"Shivam Jain" wrote in message <np4brr$p2c$1...@newscl01ah.mathworks.com>...
> Sorry, the fourth-to-last line should read
> error = mean((target-Output).^2). There was a copy-and-paste mistake. My question remains the same as given above.

vart1 = mean(var(target',1)) % Reference MSE
vart1 = var(target,1) % Reference MSE for a 1-D target

error = target-output;
MSE = mean(error.^2)
MSE = mse(error)

NMSE = MSE/vart1 % Normalize MSE as a fraction of target variance not modelled
Rsq = 1 - NMSE % Rsquare, coefficient of determination, fraction of target variance modelled

The scale-independent measures NMSE and Rsq are the most descriptive of model accuracy.
Use them for regression and time series (for classification, count the number of discrete errors).
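Put together as a self-contained sketch (using simplefit_dataset purely as stand-in data, and newff with 10 hidden nodes just for illustration):

[x, t] = simplefit_dataset;          % stand-in 1-D regression data
net = newff(x, t, 10);
net = train(net, x, t);
y = net(x);
e = t - y;
vart1 = mean(var(t',1))              % reference MSE (average target variance)
MSE  = mse(e)                        % equals mean(e.^2)
NMSE = MSE/vart1                     % fraction of target variance not modelled
Rsq  = 1 - NMSE                      % fraction of target variance modelled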

https://www.google.com/#q=rsquared

Hope this helps.

Greg

Shivam Jain

Aug 19, 2016, 5:52:09 AM
Thanks, Greg sir. But I have one doubt: I see that both 'mse' and 'mean(error.^2)' give the same value, as desired. But why does the nntraintool that pops up when we run the program show a performance that is different from the mse command? It also mentions that it measures performance through the mean squared error. Does it take something else into account as well?
i1 = b(:,1);
i2 = b(:,2);
i3 = b(:,3);
i4 = b(:,4);
i5 = b(:,5);
i6 = b(:,6);
i = [i1,i2,i3,i4,i5,i6]';
target = b(:,7)';
net = newff(i,target,10);
net = train(net,i,target);
Output = sim(net,i);
error = target - Output;
final = sqrt(mean(error.^2))                 % RMSE
perfo = mse(error)
perfo2 = RootMeanSquareError(target,Output)  % user-defined helper (not a built-in)
MSE = mean(error.^2)

Greg Heath

Aug 20, 2016, 11:08:08 AM
"Shivam Jain" wrote in message <np6ks3$9be$1...@newscl01ah.mathworks.com>...
> Thanks, Greg sir. But I have one doubt: I see that both 'mse' and 'mean(error.^2)' give the same value, as desired. But why does the nntraintool that pops up when we run the program show a performance that is different from the mse command? It also mentions that it measures performance through the mean squared error. Does it take something else into account as well?

Of course. The state of the random number generator determines BOTH

1. the trn/val/tst data division

AND

2. the initial random weights.

From the help documentation:

% >> [inputs,targets] = simplefitdata;
% Undefined function or variable 'simplefitdata'.

close all, clear all, clc
rng(0) % Necessary for reproducibility
for i = 1:10
    [inputs,targets] = simplefit_dataset;
    net = newff(inputs,targets,20);
    net = train(net,inputs,targets);
    outputs = net(inputs);
    errors = outputs - targets;
    perf(i) = perform(net,outputs,targets);
end
result = perf
% result = 0.0000 0.0000 0.0004 0.0000 0.0088
% 0.0000 0.0098 0.0005 0.0027 0.0005

check1 = mse(errors) % 4.5740e-04
check2 = mean(errors.^2) % 4.5740e-04
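
(A further sketch along the same lines, not from the original run: with the seed fixed, the value returned by PERFORM and the MSE computed by hand on the same outputs agree, since both are evaluated on the same trained net and the same data. Differences between numbers quoted in the original question come from values computed in different runs, i.e. different RNG states.)

rng(0)                                   % same seed => same data division and initial weights
[inputs, targets] = simplefit_dataset;
net = newff(inputs, targets, 20);
net = train(net, inputs, targets);
outputs = net(inputs);
errors  = targets - outputs;
check3  = perform(net, targets, outputs) % performance of the trained net on all the data
check4  = mse(errors)                    % same value, computed by hand on the same outputs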

Hope this helps.

Greg

Shivam Jain

Aug 21, 2016, 7:37:09 AM
Thanks a lot, I understood it. Can someone please help me with how to mark an answer as the "Accepted Answer"?

Greg Heath

Aug 22, 2016, 10:29:13 PM
"Shivam Jain" wrote in message <npc3ov$kpo$1...@newscl01ah.mathworks.com>...
> Thanks a lot, I understood it. Can someone please help me with how to mark an answer as the "Accepted Answer"?

That is not part of the MATLAB NEWSREADER.

However, it is part of MATLAB ANSWERS

Greg