
### TD

Sep 20, 2023, 10:17:55 AM
to Artelys Knitro forum
Hello there,

I just want to inquire a bit about the SQP algorithm and where it works best. First of all, I am aware that for NLPs you really need to tune the solver settings (e.g., which linear solver to use, how the barrier parameter is updated in the interior-point methods, etc.) quite a bit for the solver to perform best on a given problem.

I have found that the SQP method works quite well for my problem. In my case, the number of decision variables is quite small (n <= 20), but I have a huge number of constraints (more than 100*n). However, only very few constraints will be active at the optimal solution (less than 5% of them, I would say). Also, my function evaluations dominate the total solve time.
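To make the problem shape concrete, here is a small sketch (not Knitro; SciPy's SLSQP, itself an SQP method, stands in purely for illustration, and all problem data below are made up): few variables, hundreds of linear inequality constraints, and only a handful active at the optimum.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 5                    # few decision variables
m = 100 * n              # many inequality constraints: A x <= b
A = rng.normal(size=(m, n))
b = rng.uniform(1.0, 5.0, size=m)
c = np.full(n, 10.0)     # target chosen infeasible, so some constraints bind

# Minimize ||x - c||^2 subject to A x <= b (SciPy expects ineq fun >= 0).
res = minimize(lambda x: np.sum((x - c) ** 2),
               np.zeros(n),
               method="SLSQP",
               constraints={"type": "ineq", "fun": lambda x: b - A @ x})

# Count constraints that are (near-)tight at the returned solution.
active = int(np.sum(b - A @ res.x < 1e-6))
print(res.success, active)
```

At a non-degenerate optimum of a problem like this, at most n of the m constraints can be active, so the active fraction is tiny even though m is large.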

In general, is SQP well suited for such problems? If there is any additional information (or tips and tricks) you can provide on the algorithms beyond what is presented at https://www.artelys.com/docs/knitro/2_userGuide/algorithms.html, that would be great!

Thanks and kind regards

### Richard Waltz

Sep 21, 2023, 10:18:37 AM
to Artelys Knitro forum
Hi,

The SQP algorithm typically works best on smaller problems (say, at most a few thousand variables and constraints) where the function/gradient evaluations are a dominant cost. It also helps if few constraints are active. The SQP algorithm can also exploit a good initial point to converge quickly, so it may be a good choice when one is available (for example, when solving a sequence of closely related problems). Beyond that generalization it is hard to say much, as performance is always quite problem-dependent.
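The warm-start point above can be sketched as follows (again using SciPy's SLSQP as a stand-in for an SQP method, not Knitro; the parametric problem is invented for illustration): when solving a sequence of closely related problems, feeding the previous optimum in as the initial point typically lets the SQP iteration converge in very few steps.

```python
import numpy as np
from scipy.optimize import minimize

def solve(t, x0):
    # Minimize a t-shifted quadratic subject to one linear inequality.
    obj = lambda x: (x[0] - t) ** 2 + (x[1] - 2.0 * t) ** 2
    con = {"type": "ineq", "fun": lambda x: 1.0 - x[0] - x[1]}  # x0 + x1 <= 1
    return minimize(obj, x0, method="SLSQP", constraints=con)

cold = solve(1.0, np.zeros(2))   # cold start from the origin
warm = solve(1.05, cold.x)       # warm start from the previous optimum
print(cold.nit, warm.nit)        # iteration counts for each solve
```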

The reason it is mainly restricted to smaller problems is that it is not as scalable as the interior-point algorithm: the cost per iteration of SQP becomes prohibitive as models get large, since it solves a quadratic program at every iteration (and the combinatorics of sorting out the active constraints can also become prohibitive as the problem size grows). The reason it is recommended for problems with expensive function evaluations is that SQP typically does more work per iteration than the interior-point method, which often allows it to converge in fewer iterations/function evaluations. For example, the Knitro SQP algorithm has been used successfully in derivative-free, black-box optimization competitions where most of the time is spent in function evaluations.
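For anyone wanting to try this, a hypothetical Knitro options-file sketch for selecting the SQP algorithm might look like the following (verify the option name and value against the documentation for your Knitro version):

```
# knitro.opt sketch -- check against your Knitro version's docs
algorithm  4    # 4 = active-set SQP (KN_ALG_ACT_SQP)
```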

I hope this helps.

Regards,
-Richard Waltz


### TD

Sep 22, 2023, 9:49:44 AM
to Artelys Knitro forum

Ok, I see. If I recall correctly, the size of the underlying QP depends on the number of variables and active constraints, right? If I have a lot of constraints but only very few should be active at once, does that still have implications for the scalability of the SQP algorithm? I understand that more constraints mean more computation, but I guess there are also implications for identifying which constraints might be active? I believe this is what you were referring to with "the combinatorics of sorting out the active constraints also can become prohibitive as the problem size becomes too large", right?

Thanks for the insights!

### Richard Waltz

Sep 22, 2023, 12:58:46 PM
to Artelys Knitro forum
Hi,

Yes, generally the expense of the underlying QP/subspace minimizations depends on the number of variables and active constraints.
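For reference, the quadratic subproblem that an SQP method solves at each iterate $x_k$ has the standard textbook form (this is the generic formulation, not a Knitro-specific one):

```latex
\begin{aligned}
\min_{d \in \mathbb{R}^n} \quad & \nabla f(x_k)^T d + \tfrac{1}{2}\, d^T H_k\, d \\
\text{s.t.} \quad & c_i(x_k) + \nabla c_i(x_k)^T d = 0, \quad i \in \mathcal{E}, \\
& c_i(x_k) + \nabla c_i(x_k)^T d \ge 0, \quad i \in \mathcal{I},
\end{aligned}
```

where $H_k$ approximates the Hessian of the Lagrangian. An active-set method applied to this QP works in the subspace defined by the currently active inequalities, which is why the number of active constraints, rather than the total number, drives much of the cost.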

> but I guess there are also implications for identifying which constraints might be active? I believe this is what you were referring to here: "the combinatorics of sorting out the active constraints also can become prohibitive as the problem size becomes too large", right?

Yes, correct.

If you have a problem with a small number of variables but lots of inequality constraints, you might also experiment with the default interior-point algorithm using the option "bar_linsys=3". Though, again, if function evaluations are the dominant cost and the problem is not too large, SQP is often the best option.
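The interior-point suggestion above could be tried via a Knitro options file, along the lines of this hypothetical sketch (verify the option name and accepted values against your Knitro version's documentation):

```
# knitro.opt sketch -- check against your Knitro version's docs
bar_linsys  3    # linear-system form suggested above for many inequalities
```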

Regards,
-Richard
