LPV controller design with SOS relaxation of LMIs - help in turning theory into code

Patryk Cieslak

Aug 22, 2015, 12:27:16 PM
to YALMIP
Hello!

I am a PhD student trying to implement a gain scheduling controller for an up/down configuration of the Acrobot, following an article titled "Attitude Control of Acrobot by Gain Scheduling Control Based on Sum of Squares" by H. Ichihara and M. Kawata.

I am still learning the concepts presented in this article and I am new to optimization so please be kind :)
It is really important to me to understand and implement this so that I can modify it for my system and be sure it is correct.

I have gathered the equations from the article and simplified them a bit by removing damping from the Acrobot model and substituting some of the parameters used by the authors for the up/down configuration. The accompanying text is mostly taken from the article. I am attaching the theory in a PDF.

I am interested in solving the SOS relaxed problem which is formulated in the last section. I am attaching the Matlab code I have now.
There are several things I don't know how to write in YALMIP syntax and have trouble finding answers to:
1) Eq. 7 is a list of constraints on the parameters rho. How do I implement constraints that involve derivatives of rho?
2) There is a determinant in the Delta variable. Should I just use the standard det() function?
3) I have defined Xs and Zs in the code. Is what I did OK? Is there perhaps some "automatic" way to generate them?
4) How do I define matrix SOS polynomials with respect to monomials up to some maximum degree?
5) How do I define the derivative of Xs, which appears in Eq. 26?

I will greatly appreciate any help. It would be much easier for me to understand everything if I had a working example.

Best,
Patryk
acrobot_gain_scheduling.pdf
acrobot_LPV.m

Johan Löfberg

Aug 23, 2015, 9:32:55 AM
to YALMIP
1. You treat the derivative as its own variable, I guess.
2. Yes.
3. Yes.
4. Well, if you define a polynomial matrix and then use the sos module, it will be handled automatically, so I don't know what you want to set up (see the sketch after this list).
5. It is a polynomial in rho, and you've defined the rho derivatives earlier, I guess.
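
A minimal sketch of what 1 and 4 could look like in YALMIP (the variable names, matrix size, and degrees here are purely illustrative, not taken from the paper):

% Scheduling parameters and their time derivatives, each declared as its own variable
sdpvar rho1 rho2 drho1 drho2
rho = [rho1; rho2];

% Parameterized symmetric 2x2 polynomial matrix in rho, entries of degree <= 2
n = 2;
X = sdpvar(n, n);
coefs = [];
for i = 1:n
    for j = i:n
        [pij, cij] = polynomial(rho, 2);   % free polynomial entry with unknown coefficients
        X(i,j) = pij;  X(j,i) = pij;       % fill both triangles to keep X symmetric
        coefs = [coefs; cij];
    end
end

% Matrix-valued SOS constraint; the sos module handles the matrix case by itself.
% 'coefs' (and the drho variables, where they appear) are later passed to solvesos
% as the parametric decision variables.
F = sos(X);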

Patryk Cieslak

Aug 23, 2015, 12:40:57 PM
to YALMIP
Thank you, Johan, for the quick response. I have gone a bit further with my code and implemented all the equations.

I "automatized" the matrix polynomials declaration with monolist() and a for loop. Is it ok what I did?
  
I have defined derivatives of rho as new variables and created a matrix polynomial dXs with these variables. 
The problem is how to connect dXs and Xs? I don't think it is right to leave them unconnected....

After defining everything and writing the constraints (sos(), equalities, and inequalities), I tried to run the optimization, but I got an error:
"Error using compilesos.....No independent variables? Perhaps you added a constraint (p(x)) when you meant (sos(p(x))). It could also be that you added a constraint directly in the independents, such as p(x)>=0 or similarily." 

I don't understand where the problem in my code is.
I have tried disabling, e.g., the equality on F2, and it does run something, but I don't know what.
There is still the problem with dXs and Xs: how do I write a constraint to connect them?
acrobot_LPV.m

Johan Löfberg

Aug 23, 2015, 1:04:13 PM
to YALMIP
It looks to me as if you don't understand how SOS programs are constructed.

With something like "F should be SOS when g>=0" (a very pseudo-model), you must apply a Positivstellensatz and arrive at something like sos(F - multiplier*g). You don't add the constraint g>=0 to the model, because YALMIP would then think all variables in that constraint are parameters, and in the end it finds no variables left to perform the SOS over.
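
In a scalar toy example (nothing here is taken from the Acrobot problem), the pattern looks roughly like this:

sdpvar x
g = 1 - x^2;                 % region of interest: g(x) >= 0, i.e. x in [-1,1]
f = 2 - x;                   % polynomial to be certified nonnegative on that region

% SOS multiplier s(x): if f - s*g is SOS and s is SOS, then f >= 0 wherever g >= 0
[s, cs] = polynomial(x, 2);
F = [sos(f - s*g), sos(s)];
solvesos(F, [], [], cs);     % cs are the parametric decision variables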

Regarding connecting dXdt and X, isn't that given by the definition? If X is 5*rho^2, then dX is 10*drhodt*rho.

Patryk Cieslak

Aug 24, 2015, 5:49:54 AM
to YALMIP
OK, I think I got it now.
For example, one of the equations from the paper has the form F2(rho) = S20(rho) + g1(rho)*S21(rho) + g2(rho)*S22(rho) + epsilon*I, so to define an SOS problem I should move everything except S20 to the left, F2(rho) - g1(rho)*S21(rho) - g2(rho)*S22(rho) - epsilon*I = S20(rho), and read it as "what is on the left IS SOS", which would translate into YALMIP as sos(F2 - g1*S21 - g2*S22 - epsilon*I).
Am I thinking about this right?
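
A toy-sized sketch of that constraint in YALMIP (2x2 matrices, made-up region constraints g1 and g2, and a free F2 purely so the snippet runs; in the real problem F2 is built from the model data, and the polymatrix helper at the bottom is just local convenience, not a YALMIP function):

sdpvar rho1 rho2
rho = [rho1; rho2];
g1 = 1 - rho1^2;             % illustrative parameter-region constraints
g2 = 1 - rho2^2;
epsilon = 1e-6;
n = 2;

[F2,  cF ] = polymatrix(rho, n, 2);
[S21, c21] = polymatrix(rho, n, 2);
[S22, c22] = polymatrix(rho, n, 2);

% "F2 - g1*S21 - g2*S22 - epsilon*I is SOS", plus SOS constraints on the multipliers
Constraints = [sos(F2 - g1*S21 - g2*S22 - epsilon*eye(n)), sos(S21), sos(S22)];
solvesos(Constraints, [], [], [cF; c21; c22]);

% Helper: symmetric n-by-n matrix with free polynomial entries of a given degree
function [P, c] = polymatrix(x, n, deg)
P = sdpvar(n, n);
c = [];
for i = 1:n
    for j = i:n
        [pij, cij] = polynomial(x, deg);
        P(i,j) = pij;  P(j,i) = pij;
        c = [c; cij];
    end
end
end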

BTW: I was thinking about the meaning of epsilon. Is it there to force strict definiteness?

I know dXs and Xs are connected by definition, but I had a problem formulating it in the code until I came up with an idea.
I used the jacobian() function and defined dXs like this:

dXs = sdpvar(5,5);
for i = 1:5
    for j = 1:5
        % chain rule: dXs_ij/dt = (dXs_ij/drho1)*drho1 + (dXs_ij/drho2)*drho2
        dXs(i,j) = jacobian(Xs(i,j), [rho1 rho2]) * [drho1; drho2];
    end
end

I think it makes sense....

I have also passed all polynomial coefficients as decision variables and added gamma as the objective, following an example in the SOS tutorial.
I then tried to run it.
It looks like it has started working now, because it goes through the YALMIP phase to the solver phase.
The problem is that it returns "Run into numerical problems" / "No sensible solution found" with SeDuMi and reaches the maximum iteration count with SDPT3.
I don't know where the problem is or how to search for it. I am not sure whether what I defined is right, whether something is missing, or whether some parts are wrong.
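
For reference, a self-contained toy in the style of the SOS tutorial's lower-bound example (not the actual Acrobot problem), showing the gamma-as-objective call pattern; whether gamma is minimized or maximized depends on the formulation, and the sos.model option is sometimes worth toggling when the solver struggles numerically:

% Toy example: maximize a lower bound gamma on a fixed polynomial p
sdpvar x y gamma
p = (1 + x)^4 + (1 - y)^2;
F = sos(p - gamma);
ops = sdpsettings('solver', 'sdpt3', 'sos.model', 2, 'verbose', 1);
sol = solvesos(F, -gamma, ops, gamma);   % gamma is the parametric decision variable
if sol.problem ~= 0
    disp(sol.info)                       % e.g. numerical problems, infeasibility, ...
end
value(gamma)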
acrobot_LPV.m

Johan Löfberg

Aug 25, 2015, 2:41:35 AM
to YALMIP
A strict matrix inequality would be X >= margin*I, where you pick margin to make it sufficiently strict, while not limiting the feasible space too much, and still being significant in terms of the tolerances used by the solver.

For polynomials, it might not make sense to use sos(p - margin*I); then you are saying the polynomial is larger than 0, not positive definite. Typically you use sos(p - margin*m(x)), where m(x) is a positive definite polynomial. How to select m(x) is not easy and depends on the problem.
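
A minimal sketch of that second pattern (everything illustrative; m(x) = x'*x is just one common choice, and the degree and margin are made up):

sdpvar x1 x2
x = [x1; x2];
[p, cp] = polynomial(x, 4);      % free polynomial whose coefficients the solver decides
margin = 1e-4;
m = x' * x;                      % positive definite polynomial: m(x) > 0 for all x ~= 0
F = sos(p - margin * m);         % enforces p(x) >= margin * m(x)
solvesos(F, [], [], cp);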