Training problem in AENET2

seunghun jang

Jun 11, 2018, 5:06:55 AM
to aenet


Dear AENET developers,

I have installed aenet 2 and generated a training set (TiO.train) using 'generate.x'.
I then ran the training with 'train.x', but I could not get normal training results, as shown below.

Could you point out what might be going wrong?

Thanks.

Seunghun Jang

----------------------------------------------------------------------
                            Training process
 ----------------------------------------------------------------------

 Weight optimization for 5000 epochs using the Limited Memory BFGS method.

 Sampling type               : sequential
        |------------TRAIN-----------|  |------------TEST------------|
 epoch             MAE          <RMSE>             MAE          <RMSE>
     0    4.401635E-01    5.324868E-01    4.344536E-01    5.311586E-01 <
     1    4.401635E-01    5.324868E-01    4.344536E-01    5.311586E-01 <
     2    7.854604E-01    8.975107E-01    7.665935E-01    8.809382E-01 <
     3    4.246132E-01    5.384436E-01    4.383693E-01    5.527252E-01 <
     4    4.251773E-01    5.394986E-01    4.388682E-01    5.537715E-01 <
     5    4.251802E-01    5.395040E-01    4.388707E-01    5.537768E-01 <
     6    4.251802E-01    5.395040E-01    4.388707E-01    5.537768E-01 <
     7    4.251802E-01    5.395040E-01    4.388707E-01    5.537768E-01 <
     8    4.251802E-01    5.395040E-01    4.388707E-01    5.537768E-01 <
     9    4.251802E-01    5.395040E-01    4.388707E-01    5.537768E-01 <
 The optimization has converged. Training stopped.


 Training finished.

 ----------------------------------------------------------------------
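For context, a train.in along the following lines would drive such an L-BFGS run (a sketch modeled on the aenet example inputs; the test-set fraction, network file names, and hidden-layer architecture shown here are placeholders, not taken from the run above):

     ! training set produced by generate.x
     TRAININGSET TiO.train
     ! fraction of structures held out for testing (placeholder value)
     TESTPERCENT 10
     ITERATIONS  5000

     ! optimization method (L-BFGS)
     METHOD
     bfgs

     ! one network per species: element, output file, number of hidden
     ! layers, then nodes:activation for each hidden layer
     NETWORKS
       Ti   Ti.10t-10t.ann   2   10:tanh 10:tanh
       O    O.10t-10t.ann    2   10:tanh 10:tanh

With a working aenet build, running train.x on such an input (e.g. 'train.x train.in > train.out') should show the train and test errors decreasing over the epochs rather than stalling as in the output above.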

Nong Artrith

Jun 11, 2018, 11:06:41 AM
to aenet
Dear Seunghun Jang,

This looks like a bug in version 2.0.0 that has since been fixed.  
Could you please confirm that you are using aenet version 2.0.2?

Best,
Nong

seunghun jang

Jun 16, 2018, 11:47:53 PM
to aenet

Dear Nong,


Thanks a lot for your answer.


Best,

Seunghun Jang
