
average change in the spread of Pareto solutions less than options.TolFun


guymcheers
Aug 17, 2013, 4:42:07 AM
Can anyone tell me whether the following termination message indicates success or failure when I use the gamultiobj solver?

"Optimization terminated: average change in the spread of Pareto solutions less than options.TolFun."

If it means failure, what is the reason?

If it means success, what exactly does it mean?

And what does "TolFun" mean?


Many thanks:)


guymcheers!

Alan_Weiss
Aug 19, 2013, 3:17:43 PM
This is a good exit message, but you have found some deficiencies in the
documentation that I aim to address in the future. The documentation
does not describe what "spread of Pareto solutions" is. This is a notion
in Kalyanmoy Deb's book "Multi-Objective Optimization using Evolutionary
Algorithms." Basically, the idea is that the Pareto front is not
changing much toward the end of the optimization, so the solver stops.

TolFun is a tolerance described in the documentation:
http://www.mathworks.com/help/optim/ug/tolerances-and-stopping-criteria.html
In this case, the TolFun tolerance is measured against the change in the
spread of the Pareto solutions.
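
For anyone who wants to experiment with this stopping criterion, here is a
minimal sketch of adjusting TolFun through gaoptimset, the options interface
gamultiobj uses; the fitness function and nvars below are placeholders for
your own problem:

% Tighten TolFun so the solver runs longer before it declares the spread
% of the Pareto front converged (the default is 1e-4).
options = gaoptimset(@gamultiobj);             % default gamultiobj options
options = gaoptimset(options,'TolFun',1e-6);   % stricter stopping tolerance

% fitnessFcn and nvars are hypothetical placeholders.
[x,fval,exitflag] = gamultiobj(fitnessFcn,nvars,[],[],[],[],[],[],options);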

Sorry for the deficiencies in the documentation; I'll try to improve it.

Alan Weiss
MATLAB mathematical toolbox documentation

guymcheers
Aug 22, 2013, 12:51:16 PM
Alan_Weiss <awe...@mathworks.com> wrote in message <kutr0n$ns3$1...@newscl01ah.mathworks.com>...
Dear Alan,

Thanks for your nice explanation!
I understand it now :)
Sorry for the late reply; I didn't know it had already been answered.
I should tick the box for email notifications in the future.


guymcheers

Juan
Aug 8, 2014, 12:37:05 PM
"guymcheers " <guymc...@gmail.com> wrote in message <kv5fi4$pv7$1...@newscl01ah.mathworks.com>...
Hello Alan,
I am Juan, and I have the same issue when I work with neural networks and genetic algorithms, that is, when I work with small data sets.
But recently I worked with a larger data set and obtained the following error:
Error using bsxfun
Non-singleton dimensions of the two input arrays must match each other.
Error in nnMATLAB.pc (line 24)
pi = bsxfun(@minus,pi,settings.xoffset);
Error in nncalc.preCalcData (line 20)
data.Pc = calcMode.pc(net,data.X,data.Xi,data.Q,data.TS,calcHints);
Error in nncalc.setup1 (line 99)
calcData = nncalc.preCalcData(matlabMode,matlabHints,net,data,doPc,doPd,calcHints.doFlattenTime);
Error in network/sim (line 295)
[calcMode,calcNet,calcData,calcHints,~,resourceText] = nncalc.setup1(calcMode,net,data);
Error in @(pb)sim(net,pb')

Error in createAnonymousFcn>@(x)fcn(x,FcnArgs{:}) (line 11)
fcn_handle = @(x) fcn(x,FcnArgs{:});
Error in gamultiobjMakeState (line 25)
Score = FitnessFcn(state.Population(1,:));
Error in gamultiobjsolve (line 11)
state = gamultiobjMakeState(GenomeLength,FitnessFcn,output.problemtype,options);
Error in gamultiobj (line 235)
[x,fval,exitFlag,output,population,scores] = gamultiobjsolve(FitnessFcn,nvars, ...
Caused by:
Failure in initial user-supplied fitness function evaluation. GAMULTIOBJ cannot continue.

My commands were:
>> net=fitnet;
>> [net,tr]=train(net,pb',tb');
>> objFcn=@(pb) sim(net,pb');
>> [xOpt,fVal]=gamultiobj(objFcn,2)

My data were:
>> pb
pb =
496.0000 0.1000 1.0000
496.0000 0.1000 0.7500
496.0000 0.1000 1.2500
496.0000 0.1000 1.5000
496.0000 0.1500 1.0000
496.0000 0.1500 0.7500
496.0000 0.1500 1.2500
496.0000 0.1500 1.5000
496.0000 0.1700 1.0000
496.0000 0.1700 0.7500
496.0000 0.1700 1.2500
496.0000 0.1700 1.5000
496.0000 0.2000 1.0000
496.0000 0.2000 0.7500
496.0000 0.2000 1.2500
496.0000 0.2000 1.5000
396.0000 0.1000 1.0000
396.0000 0.1000 0.7500
396.0000 0.1000 1.2500
396.0000 0.1000 1.5000
396.0000 0.1500 1.0000
396.0000 0.1500 0.7500
396.0000 0.1500 1.2500
396.0000 0.1500 1.5000
396.0000 0.1700 1.0000
396.0000 0.1700 0.7500
396.0000 0.1700 1.2500
396.0000 0.1700 1.5000
396.0000 0.2000 1.0000
396.0000 0.2000 0.7500
396.0000 0.2000 1.2500
396.0000 0.2000 1.5000
595.0000 0.1000 1.0000
595.0000 0.1000 0.7500
595.0000 0.1000 1.2500
595.0000 0.1000 1.5000
595.0000 0.1500 1.0000
595.0000 0.1500 0.7500
595.0000 0.1500 1.2500
595.0000 0.1500 1.5000
595.0000 0.1700 1.0000
595.0000 0.1700 0.7500
595.0000 0.1700 1.2500
595.0000 0.1700 1.5000
595.0000 0.2000 1.0000
595.0000 0.2000 0.7500
595.0000 0.2000 1.2500
595.0000 0.2000 1.5000
674.0000 0.1000 1.0000
674.0000 0.1000 0.7500
674.0000 0.1000 1.2500
674.0000 0.1000 1.5000
674.0000 0.1500 1.0000
674.0000 0.1500 0.7500
674.0000 0.1500 1.2500
674.0000 0.1500 1.5000
674.0000 0.1700 1.0000
674.0000 0.1700 0.7500
674.0000 0.1700 1.2500
674.0000 0.1700 1.5000
674.0000 0.2000 1.0000
674.0000 0.2000 0.7500
674.0000 0.2000 1.2500
674.0000 0.2000 1.5000
>> tb
tb =
1.0e+04 *
0.0003 2.3808
0.0003 1.7856
0.0003 2.9760
0.0003 0.0000
0.0003 0.0000
0.0003 2.6784
0.0003 4.4640
0.0003 5.3568
0.0003 4.0472
0.0003 3.0354
0.0003 5.0590
0.0003 6.0708
0.0003 4.7616
0.0003 0.0000
0.0003 8.3328
0.0003 0.0000
0.0003 1.9008
0.0003 1.4256
0.0003 2.3760
0.0003 2.8512
0.0003 2.8512
0.0003 2.1384
0.0003 3.5640
0.0003 4.2768
0.0003 3.2312
0.0003 2.4234
0.0003 4.0390
0.0003 4.8468
0.0003 3.8016
0.0003 2.8512
0.0003 4.7520
0.0003 5.7024
0.0003 2.8560
0.0003 2.1420
0.0003 0.0000
0.0003 4.2840
0.0003 4.2840
0.0003 3.2130
0.0003 5.3550
0.0003 6.4260
0.0003 4.8552
0.0003 3.6414
0.0003 6.0690
0.0003 7.2828
0.0003 5.7120
0.0003 4.2840
0.0003 0.0000
0.0003 8.5680
0.0003 3.2352
0.0003 2.4264
0.0003 4.0440
0.0004 4.8528
0.0003 4.8528
0.0003 3.6396
0.0003 6.0660
0.0003 7.2792
0.0003 5.5000
0.0003 4.1250
0.0003 6.8750
0.0003 8.2500
0.0003 6.4704
0.0004 4.8528
0.0004 8.0880
0.0003 9.7056

What is your opinion about my problem?

Juan

Alan_Weiss
Aug 8, 2014, 3:18:18 PM
Sorry, I don't know anything about neural networks or Simulink. I
suggest that you use the debugger to figure out why you got an error.
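
For anyone who hits the same error, a minimal sketch of that kind of check:
call the fitness function by hand so the failure surfaces outside gamultiobj
(the trial point below is hypothetical and assumes the three-column pb shown
above):

% Stop inside the failing function the moment the error is thrown.
dbstop if error

% Evaluate the fitness function on one hand-built trial row.  gamultiobj
% passes rows with nvars entries, so the length of xTrial must match the
% number of inputs the network was trained on (pb has 3 columns above).
xTrial = [500 0.15 1.0];   % hypothetical point, one value per column of pb
fTrial = objFcn(xTrial)    % expect one value per network output

% If this direct call works but gamultiobj(objFcn,2) does not, one thing to
% check is nvars: the network was trained on 3 inputs, so gamultiobj(objFcn,3)
% may be what was intended.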