Something new in algorithm theory: Smoothed analysis

Dilawar Singh

Dec 13, 2013, 8:59:53 AM
to wncc...@googlegroups.com
Well, it's more than a decade old, pioneered by Daniel Spielman (together with
Shang-Hua Teng). I am leaving the rigorous results for self-study; here is the
main idea.

When a new algorithm is discovered, one often writes down its worst-case and
average-case running time (or some other measure). The worst case is measured
by choosing the input in an "adversarial" spirit: keep looking for inputs on
which the algorithm performs worst. People in cryptography seem to be obsessed
with "worst-case" inputs, and they should be: there, the input really is chosen
by an adversary.

For many of us, average-case analysis is the most useful: draw inputs at
random and see how the algorithm performs on average. The trouble is that in
many cases the worst case is too conservative while the average case is too
optimistic or "unrealistic".

The perfect example of this is the simplex method versus Karmarkar's
interior-point algorithm in linear programming (mind the term linear
programming; it has nothing to do with computer programming). The worst-case
running time of the simplex method is really bad (exponential), yet on typical
inputs it is quite competitive with Karmarkar's algorithm, whose worst case is
polynomial.
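
To make the worst-case/average-case gap concrete, here is a minimal toy sketch
of my own (it is not from Spielman's work; the helper name, input size and
pivot rule are my choices): quicksort with a naive first-element pivot, for
which an already-sorted array is a worst-case input and a random permutation
stands in for the average case. Comparisons are counted instead of timing the
code.

import random

def quicksort_comparisons(a):
    """Sort a copy of `a` with first-element-pivot quicksort and return
    the number of comparisons made. An explicit stack is used so the
    worst case does not hit Python's recursion limit."""
    a = list(a)
    comparisons = 0
    stack = [(0, len(a) - 1)]
    while stack:
        lo, hi = stack.pop()
        if lo >= hi:
            continue
        pivot = a[lo]                    # naive pivot choice
        i = lo + 1
        for j in range(lo + 1, hi + 1):  # Lomuto-style partition
            comparisons += 1
            if a[j] < pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[lo], a[i - 1] = a[i - 1], a[lo]
        stack.append((lo, i - 2))        # elements left of the pivot
        stack.append((i, hi))            # elements right of the pivot
    return comparisons

n = 2000
worst = list(range(n))                # already sorted: worst case for this pivot rule
average = random.sample(range(n), n)  # random permutation: "average" case

print("worst case  :", quicksort_comparisons(worst))    # roughly n^2 / 2
print("average case:", quicksort_comparisons(average))  # roughly 1.4 * n * log2(n)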

The idea is that if you take the worst-case input and add some noise to it,
i.e. perturb it a bit, the algorithm's running time may improve a lot: it can
be much closer to the average case. Formally, one bounds the expected running
time under such a perturbation, maximised over all inputs. My reading is that
this is what Spielman has done; H Narayanan thinks he has done much more. He
takes a worst-case input (chosen deterministically) and adds a small random
perturbation to it. He then went on to analyse ill-conditioned matrices. The
rest can be read in his papers and lecture notes, which are quite readable.
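
Continuing the toy sketch above (again my own illustration, reusing
quicksort_comparisons from it; this is not Spielman's actual analysis, which
concerns the simplex method and ill-conditioned matrices): add a little random
noise to the keys of the worst-case sorted input and the comparison count drops
sharply towards the average case. The perturb helper, the uniform noise model
and the sigma values are arbitrary choices of mine; the real results use
Gaussian perturbations.

import random

def perturb(a, sigma):
    """Return a copy of `a` with uniform noise of magnitude sigma * len(a)
    added to every key (a crude stand-in for the Gaussian perturbations
    used in the smoothed-analysis literature)."""
    spread = sigma * len(a)
    return [x + random.uniform(0, spread) for x in a]

n = 2000
worst = list(range(n))  # sorted keys: worst case for first-element-pivot quicksort

print("no noise    :", quicksort_comparisons(worst))
print("sigma = 0.01:", quicksort_comparisons(perturb(worst, 0.01)))
print("sigma = 0.10:", quicksort_comparisons(perturb(worst, 0.10)))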

His research, as described by a professional mathematician:
http://www.ams.org/notices/201109/rtx110901288p.pdf

His homepage: http://cs-www.cs.yale.edu/homes/spielman/

--
Dilawar
NCBS Bangalore
EE, IIT Bombay