memory problem


Emily

Apr 12, 2017, 3:55:25 PM4/12/17
to am...@googlegroups.com
Dear all,
I have constructed a Benders decomposition in AMPL and am trying to solve it on
my 64-bit Windows computer.
My algorithm has to run many iterations. As the number of iterations grows,
ampl.exe stores all the history information from every iteration, and its
memory use keeps increasing until AMPL runs out of memory. However, there is
no need to store all of this information. How can I clear the history
information after each iteration?
Is there any possible way to solve this problem?
Thank you.
Some detailed memory information is attached below:
iteration2-3.txt
<http://ampl.996311.n3.nabble.com/file/n14767/iteration2-3.txt>



--
View this message in context: http://ampl.996311.n3.nabble.com/memory-problem-tp14767.html
Sent from the AMPL mailing list archive at Nabble.com.

Robert Fourer

Apr 13, 2017, 12:28:06 PM4/13/17
to am...@googlegroups.com
AMPL does not automatically keep any "history information" and so there is no general command for freeing information that has been stored. I do see that there is a large increase in memory allocation every time that you update the master problem (especially "shipment_price") but it is not possible to tell the reason from only the information given. Can you post some files you are using?

Bob Fourer
am...@googlegroups.com
--
You received this message because you are subscribed to the Google Groups "AMPL Modeling Language" group.
To unsubscribe from this group and stop receiving emails from it, send an email to ampl+uns...@googlegroups.com.
To post to this group, send email to am...@googlegroups.com.
Visit this group at https://groups.google.com/group/ampl.
For more options, visit https://groups.google.com/d/optout.

Emily

Apr 16, 2017, 1:07:28 AM4/16/17
to am...@googlegroups.com
Thank you for your response.
Here are all the files I used:
benders.dat <http://ampl.996311.n3.nabble.com/file/n14784/benders.dat> ;
benders.mod <http://ampl.996311.n3.nabble.com/file/n14784/benders.mod> ;
benders.run <http://ampl.996311.n3.nabble.com/file/n14784/benders.run> ;
In addition, a memory record at this scale is also posted:
till_12th_iteration.txt
<http://ampl.996311.n3.nabble.com/file/n14784/till_12th_iteration.txt> ;




Robert Fourer

Apr 17, 2017, 11:36:06 AM4/17/17
to am...@googlegroups.com
You are saving all of the dual values:

param product_price {scenario,company,period,1..nCUT};
param shipment_price {scenario,company,company,period,1..nCUT};
param satisfy_price {scenario,order,company,1..nCUT};

With 15 scenarios, 20 orders, 63 companies, and 30 periods, this is over 1.8 million values added for each cut. However, for your cut constraints you only need to save the coefficients of the master-problem variables, which are

for each {i in company, k in 1..nCUT} --
sum {s in scenario, t in period_company[i]} product_price[s,i,t,k] * production_capacity[s,i,t]
for each {i in company, j in company, k in 1..nCUT} --
sum {s in scenario, t in period_company[i]} shipment_price[s,i,j,t,k] * ship_capacity[i,j,t]
for each {k in 1..nCUT} --
sum {s in scenario, j in order, i in order_Company[j]} satisfy_price[s,j,i,k]*1;

Instead of product_price, shipment_price, and satisfy_price, consider defining params that hold just this smaller number of values.
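A minimal sketch of what that could look like, following the coefficient expressions above. The param names cut_prod, cut_ship, and cut_const, and the subproblem constraint names Prod_Limit, Ship_Limit, and Satisfy, are placeholders, not the actual names in benders.mod:

```
# Aggregated cut coefficients -- one value per master variable per cut,
# instead of millions of raw dual values per cut.
param cut_prod {company, 1..nCUT};           # coefficient for company i's production term
param cut_ship {company, company, 1..nCUT};  # coefficient for the i->j shipment term
param cut_const {1..nCUT};                   # constant term of the cut

# Inside the Benders loop, after solving the subproblem(s), compute the
# aggregates directly from the constraints' .dual values and discard the
# individual duals:
let {i in company} cut_prod[i,nCUT] :=
  sum {s in scenario, t in period_company[i]}
    Prod_Limit[s,i,t].dual * production_capacity[s,i,t];

let {i in company, j in company} cut_ship[i,j,nCUT] :=
  sum {s in scenario, t in period_company[i]}
    Ship_Limit[s,i,j,t].dual * ship_capacity[i,j,t];

let cut_const[nCUT] :=
  sum {s in scenario, o in order, i in order_Company[o]}
    Satisfy[s,o,i].dual;
```

The cut constraints in the master problem would then reference cut_prod, cut_ship, and cut_const rather than sums over the full dual-value params, so memory per iteration grows only with the number of master variables, not with the subproblem size.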

Bob Fourer
am...@googlegroups.com



fei

Apr 27, 2017, 1:17:45 PM4/27/17
to am...@googlegroups.com
Hey,

Good morning. May I ask some questions about your model? Would you explain a
little more about it?


