Well, Quad MINOS exists now. So what I'm suggesting is, if Quad version
is "easy" to do, then it can be used for really difficult problems, and
then just accept that doing it in software will be very slow. I leave
it to you how feasible/effective a hybrid approach would be, where the
computation starts in hardware double precision and switches to software
quad precision when needed, or just does small parts of the calculation
in Quad.
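Just to make the idea concrete, here's a toy sketch (my own, nothing to do with MINOS internals), using Python's Decimal at 34 digits as a stand-in for software quad. The small root of x^2 - bx + c = 0 with b = 1e9, c = 1 cancels to exactly zero in hardware double, while the software-"quad" path recovers it:

```python
from decimal import Decimal, getcontext
import math

# Small root of x^2 - b*x + c = 0 with b = 1e9, c = 1; true value ~1e-9.
b, c = 1e9, 1.0

# Hardware double: b*b - 4*c rounds back to b*b (ulp at 1e18 is 128),
# so the naive formula cancels to exactly 0.
naive = (b - math.sqrt(b * b - 4 * c)) / 2

# Software "quad": Decimal at 34 digits, roughly the 33-36 significant
# decimal digits of IEEE binary128. Very slow next to hardware double,
# which is exactly the trade-off being discussed.
getcontext().prec = 34
bd, cd = Decimal(b), Decimal(c)
quad = (bd - (bd * bd - 4 * cd).sqrt()) / 2

print(naive)   # 0.0 -- catastrophic cancellation in double
print(quad)    # close to the true root, ~1e-9
```

A hybrid scheme would run the cheap double version first and fall back to the Decimal path only when a cancellation check trips.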
BTW, I think I did some 128-bit extended precision, i.e., Quad Precision
more or less, on an IBM 370/168 in 1980 (yes, I'm ancient).
Yeah, maybe Kahan's right. In 1990, I thought that within 10 to 15
years, quad precision would replace double precision as the standard,
much as double precision had recently replaced single precision as the
standard.
Instead, we now have people computing with massive data sets in single
precision on GPUs, using numerically unstable algorithms. Gee, not much
can go wrong there.