Hyper Parameter Tuning for MIP

Parisa Keshavarz

Mar 19, 2021, 7:13:20 PM
to YALMIP
Hello,

I am trying to solve a mixed-integer model in MATLAB using YALMIP, based on the following paper (it is not exactly the same model, but very similar); I have attached the model as well:

There is a parameter lambda used to regularize one of the variables, and I am wondering how I can tune it. I know I have to try different values, but is there an easier way to do this than a loop? I know MATLAB has a Regression Learner app, but it only works for predefined models.

I appreciate your help. 

[Attachment: SharedScreenshot.jpg]

Johan Löfberg

Mar 20, 2021, 3:53:03 AM
to YALMIP
No, there is no magic, except that you can speed the sweep up using an optimizer construct (you then have to reparameterize the model slightly, writing the penalty as z'*z with q == z*lambda for a new variable z). Note also that you can write max(b*(d-q), h*(q-d)) and skip the manual modelling of the max.
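
A minimal sketch of what that could look like, assuming a generic newsvendor-style objective with invented data d, b, h and bounds (the actual model from the paper will differ, and 'gurobi' is only a placeholder for any MIQP-capable solver):

% Hypothetical data, for illustration only
n = 10;
d = rand(n,1)*50;          % demand samples (invented)
b = 4; h = 1;              % backlog/holding costs (invented)

q      = intvar(n,1);      % integer decision variable
z      = sdpvar(n,1);      % auxiliary variable for the reparameterization
lambda = sdpvar(1,1);      % regularization weight, declared as a parameter

% The parameter only multiplies a variable (q == z*lambda), never a quadratic
% term, so the model can be precompiled. Note that with this substitution the
% effective penalty on q is q'*q/lambda^2, so pick the sweep grid accordingly.
Constraints = [q == z*lambda, 0 <= q <= 100];
Objective   = sum(max(b*(d-q), h*(q-d))) + z'*z;

ops = sdpsettings('solver','gurobi','verbose',0);
P   = optimizer(Constraints, Objective, ops, lambda, q);

% Sweep candidate lambda values without re-parsing the model each time
for lam = logspace(-2, 2, 25)
    qopt = P(lam);
    % ... evaluate training/test performance of qopt here ...
end

The point of optimizer is that the symbolic model is compiled once; each call P(lam) only plugs in the new parameter value and calls the solver, which is much faster than rebuilding the model with optimize inside a loop.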

Parisa Keshavarz

Mar 27, 2021, 6:53:58 PM
to YALMIP
Thank you so much for the answer. Is it possible that tuning the hyperparameter simply does not improve the results? I have tried many values: some give the worst training and test performance at the same time, and many others just give the same result as the untuned model. I have several datasets to try, but all of them have fewer than 1000 data points, so I thought the small sample size might be the issue.

Johan Löfberg

Mar 28, 2021, 7:47:59 AM
to YALMIP

No idea what you are talking about. As I said, there is no magic; the proof is in the pudding. If a certain setting works better, then that is the case. If not, well, then it didn't improve anything.