Answers to some questions from the QA session

Maksym Andriushchenko

Feb 21, 2019, 6:28:11 AM
to Machine Learning WS18/19
Hi all,


- Regarding Slater's condition for the hard-margin SVM:
"Slater condition fulfilled if data is linearly separable ⇒ strong duality, we can solve primal problem via the dual problem."
To recall Slater's condition: strong duality holds if there exists a strictly feasible point, i.e. some x with g_i(x) < 0 for every inequality constraint; if all inequality constraints are affine, it already suffices that there exists some x with g_i(x) <= 0 for every i.
We have the latter case, since the hard-margin constraints 1 - y_i*(w^T*x_i + b) <= 0 are affine in (w, b). Therefore, for a duality gap of 0 we only need to check that some feasible point exists, i.e. that the optimization problem has any feasible solution at all, which is exactly the question of whether the data is linearly separable.
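If you want to double-check the "zero duality gap" part numerically, here is a small sketch (it assumes numpy and scikit-learn are available, uses an arbitrary separable toy dataset, and uses SVC with a very large C as a stand-in for the hard-margin SVM; the primal and dual objectives should then agree up to solver tolerance):

import numpy as np
from sklearn.svm import SVC

# Toy linearly separable data: the classes are well separated, so a feasible
# point for the hard-margin constraints exists and Slater's condition holds.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=+2.0, size=(20, 2)),
               rng.normal(loc=-2.0, size=(20, 2))])
y = np.array([+1] * 20 + [-1] * 20)

# A very large C approximates the hard-margin SVM.
clf = SVC(kernel="linear", C=1e6).fit(X, y)

w = clf.coef_.ravel()
alpha = np.abs(clf.dual_coef_.ravel())   # dual_coef_ stores y_i * alpha_i for the support vectors

primal = 0.5 * w @ w                     # primal objective: 0.5 * ||w||^2 (slacks are ~0 here)
dual = alpha.sum() - 0.5 * w @ w         # dual objective: sum_i alpha_i - 0.5 * ||w||^2

print("primal objective:", primal)
print("dual objective:  ", dual)
print("duality gap:     ", abs(primal - dual))   # ~0 under strong duality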


- I think the correct analysis of the KKT conditions for the soft-margin SVM (with careful handling of inequality vs. equality constraints) is given by the following case distinction on alpha_i:
alpha_i = 0        =>  y_i*f(x_i) >= 1
0 < alpha_i < C    =>  y_i*f(x_i) = 1
alpha_i = C        =>  y_i*f(x_i) <= 1
Note that the direction of the implications matters: they go from the value of alpha_i to the condition on y_i*f(x_i), not the other way around. And with this case distinction we indeed get non-strict inequalities on y_i*f(x_i) in the 1st and 3rd row.
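This case distinction can also be sanity-checked numerically. Here is a small sketch (again assuming numpy and scikit-learn; the toy data, C = 1 and the tolerance are only for illustration, and the inequalities hold up to the tolerance of the QP solver):

import numpy as np
from sklearn.svm import SVC

# Toy data that is NOT linearly separable, so that all three cases of alpha_i occur.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] + 0.7 * rng.normal(size=200) > 0, 1, -1)

C = 1.0
clf = SVC(kernel="linear", C=C).fit(X, y)

# alpha_i for every point (alpha_i = 0 for non-support vectors);
# dual_coef_ stores y_i * alpha_i for the support vectors only.
alpha = np.zeros(len(X))
alpha[clf.support_] = np.abs(clf.dual_coef_.ravel())

margin = y * clf.decision_function(X)        # y_i * f(x_i)
tol = 1e-8

zero  = alpha <= tol                         # alpha_i = 0
free  = (alpha > tol) & (alpha < C - tol)    # 0 < alpha_i < C
bound = alpha >= C - tol                     # alpha_i = C

print("alpha_i = 0     -> min y_i*f(x_i):", margin[zero].min())    # expected >= 1
if free.any():
    print("0 < alpha_i < C -> y_i*f(x_i) in:", margin[free].min(), "...", margin[free].max())  # expected ~ 1
print("alpha_i = C     -> max y_i*f(x_i):", margin[bound].max())   # expected <= 1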


Best wishes,
Maksym