
Bicubic Interpolation


Gernot Hoffmann

Apr 3, 2002, 2:09:12 PM
A search in Google about Bicubic Interpolation resulted more
or less in bla-bla, with two exceptions:
S-Spline and P.Bourke's document.
Mr. Bourke's concise algorithm causes much blur, therefore
the problem is still not solved.

Some results are here (400kBytes, because of low compression):

http://www.fho-emden.de/~hoffmann/bicubic03042002.pdf

Any improvement is much appreciated.

Best regards --Gernot Hoffmann

Dave Martindale

Apr 3, 2002, 3:01:22 PM
hoff...@fho-emden.de (Gernot Hoffmann) writes:
>A search in Google about Bicubic Interpolation resulted more
>or less in bla-bla, with two exceptions:
>S-Spline and P.Bourke's document.
>Mr. Bourke's concise algorithm causes much blur, therefore
>the problem is still not solved.

Bicubic interpolation just means using cubic interpolation functions
in two directions. The functions themselves could be B-splines or
Catmull-Rom splines or other cubic interpolating splines.
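
For concreteness, here's a minimal sketch of one such choice, the
Catmull-Rom cubic, applied separably in x and then y (Python with
numpy; the names are illustrative, this is nobody's published code,
and edge handling is omitted):

  import numpy as np

  def catmull_rom_weights(t):
      # Weights for the four neighbouring samples p0..p3, where the
      # sample position lies a fraction t (0 <= t < 1) past p1.
      return np.array([
          -0.5*t**3 + 1.0*t**2 - 0.5*t,
           1.5*t**3 - 2.5*t**2 + 1.0,
          -1.5*t**3 + 2.0*t**2 + 0.5*t,
           0.5*t**3 - 0.5*t**2])

  def bicubic_sample(img, x, y):
      # Cubic interpolation along x, then along y ("bicubic").
      # Assumes 1 <= floor(x) <= width-3 and likewise for y.
      ix, iy = int(np.floor(x)), int(np.floor(y))
      wx = catmull_rom_weights(x - ix)
      wy = catmull_rom_weights(y - iy)
      patch = img[iy-1:iy+3, ix-1:ix+3]      # 4x4 neighbourhood
      return wy @ patch @ wx

A rotation is then just: for every destination pixel, map its centre
back through the inverse rotation and call bicubic_sample at the
resulting (x, y). Unlike a plain B-spline, this kernel passes through
the original samples, but it can overshoot at hard edges.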

You really ought to grab a copy of Paul Heckbert's "zoom" program and
see what it does for resampling.

For lots of background info, find "Digital Image Warping" by George Wolberg.
ISBN 0-8186-8944-7, IEEE Computer Society Press order number 1944.

If you send me your source image, I'll send you back examples of it
rotated by whatever angle you want using a dozen different resampling
filters.

Dave

Marco Schmidt

Apr 3, 2002, 3:07:30 PM
Dave Martindale wrote:

[...]

>If you send me your source image, I'll send you back examples of it
>rotated by whatever angle you want using a dozen different resampling
>filters.

Another test of various methods can be found at
<http://www.fh-furtwangen.de/~dersch/interpolator/interpolator.html>.

Regards,
Marco

Gernot Hoffmann

Apr 4, 2002, 2:47:00 AM
Marco Schmidt <marcos...@geocities.com> wrote in message news:<43omaukti5mo6ar16...@4ax.com>...

Thanks, Dave and Marco.

Mr.Bourke's formula is elegant, but the result is softened.
Mr.Dersch's expressions require too many calculations.
Isn't there something like an agreement about the best
algorithm for photo-like images, for rotations and moderate
scaling, let's say s=0.5..2 ?
So far, I would integrate a sharpening filter into Mr.Bourke's
algorithm.

Best regards --Gernot Hoffmann

Gernot Hoffmann

Apr 4, 2002, 5:02:20 AM
Marco Schmidt <marcos...@geocities.com> wrote in message news:<43omaukti5mo6ar16...@4ax.com>...

Some further comments:

Mr.Dersch found that Photoshop Bicubic has
a sharpening effect. That's not true.
The result is blurred with halos for single
pixel lines, therefore not so good.
Another question is RGB clipping. Normal poly-
nomials can easily create results which need to be
clipped. B-Splines don't require clipping, IMO,
because the non-interpolating spline is inside
the convex hull of the control points.

G.Hoffmann

Dave Martindale

Apr 4, 2002, 5:43:49 PM
hoff...@fho-emden.de (Gernot Hoffmann) writes:

>Mr.Bourke's formula is elegant, but the result is softened.
>Mr.Dersch's expressions require too many calculations.
>Isn't there something like an agreement about the best
>algorithm for photo-like images, for rotations and moderate
>scaling, let's say s=0.5..2 ?

In the first place, you need slightly different filtering techniques
for enlarging (scale factor 1 and up) and reduction (scale factor 1 or
less). You'll generally need different code for the two cases no matter
what filter you choose. In addition, some filters are harder to use
for one of these cases.
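
As a rough sketch of why the two cases differ (Python with numpy;
the names and conventions here are my own, not from any particular
package): when reducing, the kernel has to be widened by the inverse
of the scale factor so that it also acts as a low-pass prefilter;
when enlarging it is used at its natural width.

  import numpy as np

  def resample_1d(row, scale, kernel, support):
      # scale < 1 reduces, scale > 1 enlarges; 'kernel' is a symmetric
      # 1-D kernel of half-width 'support' (e.g. 2 for a cubic).
      stretch = max(1.0, 1.0 / scale)        # widen only when reducing
      out = np.empty(int(round(len(row) * scale)))
      for i in range(len(out)):
          center = (i + 0.5) / scale - 0.5   # source position of output i
          lo = int(np.floor(center - support * stretch)) + 1
          hi = int(np.floor(center + support * stretch))
          k = np.arange(lo, hi + 1)
          w = kernel((k - center) / stretch)
          w = w / w.sum()                    # normalise edge/partial coverage
          out[i] = w @ row[np.clip(k, 0, len(row) - 1)]
      return out

For enlargement this is ordinary linear or cubic interpolation, one
axis at a time; for reduction each output sample averages roughly
1/scale source samples, which is the prefiltering that a plain
point-sampled lookup lacks.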

It also depends on whether you can afford to read the entire image into
memory or not - some interpolation techniques have "finite support" and
some do not.

Second, there is no "best" algorithm. There may be a best algorithm
given a particular upper limit on cost, but you can always spend more
CPU cycles and get better results in some cases. If you want advice on
best, you'll have to tell us what cost you're willing to pay.

Which of Prof. Dersch's expressions have "too many calculations"? He
describes multiple interpolation techniques.

Also, if you're scaling but not rotating, keep in mind that the
coefficients used for the vertical interpolation at one pixel are identical
for every pixel in that row. Similarly, the horizontal interpolation
coefficients are the same for every pixel in that column. So you can
precompute row and column weight tables just once.
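
A minimal sketch of that idea (Python with numpy; 'kernel' is any
symmetric 1-D kernel of half-width 'support', e.g. a cubic with
support 2 or the triangle kernel with support 1 - my own illustrative
code, written for enlargement):

  import numpy as np

  def weight_table(out_size, in_size, kernel, support):
      # One (indices, weights) pair per output coordinate; the same
      # table serves every row (horizontal pass) or column (vertical).
      scale = in_size / out_size
      table = []
      for i in range(out_size):
          center = (i + 0.5) * scale - 0.5
          k = np.arange(int(np.floor(center)) - support + 1,
                        int(np.floor(center)) + support + 1)
          w = kernel(k - center)
          table.append((np.clip(k, 0, in_size - 1), w / w.sum()))
      return table

  def scale_image(img, new_h, new_w, kernel, support):
      cols = weight_table(new_w, img.shape[1], kernel, support)
      rows = weight_table(new_h, img.shape[0], kernel, support)
      tmp = np.stack([img[:, k] @ w for k, w in cols], axis=1)   # horizontal
      return np.stack([w @ tmp[k, :] for k, w in rows], axis=0)  # vertical

The kernel is evaluated only O(width + height) times instead of once
per output pixel, which is exactly the saving described above.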

Dave

Dave Martindale

Apr 4, 2002, 5:56:21 PM
hoff...@fho-emden.de (Gernot Hoffmann) writes:
>Some further comments:

>Mr.Dersch found that Photoshop Bicubic has
>a sharpening effect. That's not true.

It is true. Try rotating an image by 5 degrees 36 times in Photoshop.
The high frequencies are boosted, which is a sharpening effect.

>The result is blurred with halos for single
>pixel lines, therefore not so good.

No interpolation algorithm can do a good job with single pixel lines.
Images with single-pixel lines have not been properly prefiltered before
sampling. If you resample such an image, you'll get ringing (overshoot)
with any interpolation technique that has good high-frequency performance.
The better the interpolation technique (in terms of high frequency
preservation), the more cycles of ringing you'll get.

In an image that has been properly filtered before sampling, there are
no abrupt black/white edges. Any transition from black to white takes
several pixels.

>Another question is RGB clipping. Normal poly-
>noms can easily create results which need to be
>clipped. B-Splines don't require clipping, IMO,
>because the non-interpolating spline is inside
>the convex hull of the control points.

Right. But it's equally true that using a Bspline alone for interpolation
gives you a curve that doesn't go through the original data points. This
causes the blurring you complain about - a loss of high frequencies.

Several people have described a technique for pre-processing the data
before using B-spline interpolation. Essentially, you calculate a set of
B-spline control points with the property that the splines do go through
the original data points. Then you do your resampling. But, again, you
get overshoots.
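
A common concrete form of that pre-processing is the recursive
prefilter of Unser et al. As a sketch, scipy's ndimage exposes it
directly (the random test image and the 10-degree angle are arbitrary
choices of mine):

  import numpy as np
  from scipy import ndimage

  img = np.random.rand(256, 256)     # stand-in for a real test image

  # Interpolating cubic B-spline: first convert the pixel values to
  # B-spline control-point coefficients (the recursive prefilter),
  # then evaluate the spline at the rotated sample positions.
  interp = ndimage.rotate(img, angle=10.0, reshape=False, order=3,
                          prefilter=True, mode='nearest')

  # Skipping the prefilter uses the raw pixels as control points: the
  # curve no longer passes through the data and the result is blurred.
  smoothed = ndimage.rotate(img, angle=10.0, reshape=False, order=3,
                            prefilter=False, mode='nearest')

The interpolating version restores the high frequencies - and, as
noted, also restores the overshoot near sharp edges.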

Another way to see what is going on: Start with a square wave, and take
its Fourier transform. Now eliminate some of the high frequencies. Convert
back into the spatial domain. Notice the edges now overshoot. This is
just the way a square wave works - to get a square edge with no overshoots
requires an infinite number of harmonics. With only a finite number, you
have to settle for either having the fastest transition available, plus some
overshoot, or you have to gradually roll off the amplitude of the high
frequencies. If you do the latter, you can reduce or eliminate the
overshoot, but the edge has a lower slope as well. There's no way of
avoiding this.
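
A quick numerical check of this (Python with numpy; the signal length
and the cutoff are arbitrary):

  import numpy as np

  n = 512
  square = np.where(np.arange(n) < n // 2, 1.0, -1.0)  # one period

  spectrum = np.fft.rfft(square)
  spectrum[20:] = 0.0            # discard all but the lowest harmonics
  truncated = np.fft.irfft(spectrum, n)

  print(square.max(), truncated.max())
  # 1.0 vs. something noticeably larger than 1.0: the band-limited
  # edge overshoots (the Gibbs phenomenon), about 9% of the jump.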

Your articles seem to suggest that you're looking for a magic algorithm
that can rotate single-pixel horizontal and vertical lines and have them
still look (a) smooth and (b) as narrow as before. There's no such thing.

Dave

Gernot Hoffmann

Apr 5, 2002, 10:36:56 AM
da...@cs.ubc.ca (Dave Martindale) wrote in message news:<a8ilml$4s5$1...@newcastle.cs.ubc.ca>...

Dave,

some remarks and conclusions. All are related to examples:
http://www.fho-emden.de/~hoffmann/bicubic03042002.pdf
1 MByte, low compression (otherwise no clear results).
In Acrobat use zoom=200% or 400% (pixel-synchronized PDF)
and turn off image smoothing.

1. The majority of applications use 8 bits per channel and
require good quality for one rotation and moderate scaling.
2. All the research by Erik Meijering and Philippe Thévenaz
(Lausanne, URLs in another thread) works entirely in floating point.
Then a sequence of rotations makes sense in tests.
Multiple-rotation tests are quite useless if the intermediate
results are rounded to 0...255 and clipped (if the algorithm
needs clipping).
3. Photoshop 5.0 blurs and creates halos even for one rotation.
One may call this sharpening - I would say oversharpening or
simply: not the best filter (page 2).
4. Bilinear interpolation is very reasonable. The lines are not
much blurred and don't have halos (page 2). My algorithm works
excellently for downscaling as well (slightly modified, compared
to standard bilinear).
5. Mr.Bourke's bicubic filter is indeed too smooth. The results
are very good if a weak sharpening filter is applied afterwards.
This should be combined with the interpolation.
6. Photoshop offers two interpolations, bilinear and bicubic
(besides nearest neighbour). This is good style.
"Normal" users are not interested in changing filters for different
applications too often. Bilinear for screenshots etc., bicubic
for photos - that's enough.
Therefore it's legitimate to ask for the best interpolation, or the
two best methods, in the sense of bilinear and bicubic. The question
is not a result of my ignorance.
7. Of course all these remarks concern image processing for desktop
publishing, as opposed to specific applications in medical data
processing etc. Therefore I would like to state again that the
scientific work of E.Meijering and P.Thévenaz is probably very good,
but it's not helpful for desktop publishing (as long as we don't
consistently have 16 bits per channel).

I don't think that we disagree much, but a few different conclusions
are probably allowed.

Best regards --Gernot Hoffmann

Dave Martindale

Apr 5, 2002, 3:09:02 PM
hoff...@fho-emden.de (Gernot Hoffmann) writes:

>2. All the research by Erik Meijering and Philippe Thévenaz
> (Lausanne, URLs in another thread) works entirely in floating point.
> Then a sequence of rotations makes sense in tests.
> Multiple-rotation tests are quite useless if the intermediate
> results are rounded to 0...255 and clipped (if the algorithm
> needs clipping).

Sure. So keep the intermediate values as floats until you're finished
doing the series of rotations. That's easy if you're testing your own
code, harder if you're trying to test something like Photoshop.

>3. Photoshop 5.0 blurs and creates halos even for one rotation.
> One may call this sharpening - I would say oversharpening or
> simply: not the best filter (page 2).

Photoshop's bicubic interpolation works well for what it's designed for:
resampling continuous images such as photographs. It produces overshoots
given an original with abrupt brightness transitions across single edges,
but so does *any* resampling technique with good high-frequency performance.
Just try the same test using a 4-lobed Lanczos-windowed sinc function, and
see how much ringing you get! But the problem is that your source material
is not properly bandlimited, and so algorithms designed for resampling
material that was properly sampled in the first place are not really
suitable for your test material. If you used high-quality photographs,
you might come to different conclusions. For example, grab a copy of
the "mandrill" image that's available from many places.
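
For reference, the 4-lobed windowed sinc mentioned above is easy to
write down; a minimal sketch (Python with numpy, a=4 giving the four
lobes per side):

  import numpy as np

  def lanczos(x, a=4):
      # sinc(x) * sinc(x/a) inside |x| < a, zero outside.
      # np.sinc is the normalised sinc, sin(pi*x)/(pi*x).
      x = np.asarray(x, dtype=float)
      return np.where(np.abs(x) < a, np.sinc(x) * np.sinc(x / a), 0.0)

Dropped into any separable resampler as the kernel (with support a),
it keeps more high frequencies than a cubic but rings more strongly
at hard edges - the tradeoff being described here.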

>4. Bilinear interpolation is very reasonable. The lines are not
> much blurred and don´t have halos (page 2). My algorithm works
> excellently for downscaling as well (slightly modified, compared
> to standard bilinear).

Bilinear is cheap, and doesn't overshoot. It also has poor high-frequency
performance, visible as blurring in just one resampling of a good image.
Also, it gives only zeroth-derivative (C0) continuity in intensity in the
result; even the first derivative is discontinuous. I've seen this produce
visible artifacts when an image is greatly enlarged. In comparison,
polynomial cubic interpolation gives first-derivative (C1) continuity, while
cubic B-spline gives second-derivative (C2) continuity. These produce images
that look much more "smooth" at high magnifications - while retaining more
real sharpness.

>5. Mr.Bourke´s bicubic filter is indeed too smooth. The results
> are very good, if a weak sharpening filter is applied afterwards.
> This should be combined with the interpolation.

I don't know what his filter actually does. If it's a standard B-spline,
then it doesn't interpolate the original points. There is a specific method
of applying a sharpening filter, or solving a set of equations, that
modifies the B-spline method so it *does* interpolate the original points,
and this should be used. Using some randomly chosen sharpening algorithm
is better than nothing, but it will not exactly reconstruct the original
data points.

>6. Photoshop offers two interpolations, bilinear and bicubic
> (besides nearest neighbour). This is good style.
> "Normal" users are not interested in changing filters for different
> applications too often. Bilinear for screenshots etc., bicubic
> for photos - that's enough.
> Therefore it's legitimate to ask for the best interpolation, or the
> two best methods, in the sense of bilinear and bicubic. The question
> is not a result of my ignorance.

If you want to keep sharp edges in "desktop publishing", you may actually
want to use nearest-neighbour interpolation. But that has other costs -
really visible jaggies and geometric distortion. And it works particularly
poorly for photos.

To some extent, it depends on the pixel size. If you're working at 2500 DPI
(phototypesetter resolution), nearest-neighbour is just fine - individual
pixels are too small to see anyway. If you're working at 100 or 300 DPI,
where individual pixels *are* visible, then it's worthwhile paying much
more attention to filtering before sampling, and using more accurate
resampling techniques. I notice that desktop publishing is making much
more use of antialiasing in rendering fonts, for example, now that the
computation to do that isn't prohibitively expensive.

>7. Of course all these remarks concern image processing for desktop
> publishing, as opposed to specific applications in medical data
> processing etc. Therefore I would like to state again that the
> scientific work of E.Meijering and P.Thévenaz is probably very good,
> but it's not helpful for desktop publishing (as long as we don't
> consistently have 16 bits per channel).

I don't recall you saying before that you were interested in desktop
publishing. Anyway, I think the real difference is that you are
interested in sharp-edged rendering of symbols at relatively low
resolution, created without anti-aliasing, and you want to preserve
that look. I'm interested in real-world photographic and
computer-generated images that are as realistic as possible, including
proper anti-aliasing during sampling, with as few clues as possible
that the images are discrete, not continuous. The work of Meijering
and Thévenaz is done in the context of medical imaging, but applies to
all continuous-tone imaging. This means it's relevant to me, but maybe
not to you.

Dave

Gernot Hoffmann

Apr 6, 2002, 2:51:22 AM
da...@cs.ubc.ca (Dave Martindale) wrote in message news:<a8l08u$clp$1...@newcastle.cs.ubc.ca>...
> ...
> To some extent, it depends on the pixel size. If you're working at 2500 DPI
> (phototypesetter resolution), nearest-neighbour is just fine - individual
> pixels are too small to see anyway. If you're working at 100 or 300 DPI,
> where individual pixels *are* visible, then it's worthwhile paying much
> more attention to filtering before sampling, and using more accurate
> resampling techniques. I notice that desktop publishing is making much
> more use of antialiasing in rendering fonts, for example, now that the
> computation to do that isn't prohibitively expensive.
> ....

Dave, are you joking ?

For imagesetter resolution 2500 dpi and linescreen 180..200 Lpi
(maximum raster frequency for offset printing) we don't need a
higher source image resolution than 300 dpi.

If a sublimation printer is used, then it's also not necessary to
have more than 300 source image pixels per inch on the paper,
because the eye's resolution is no better - the single pixels are
not visible. The eye's resolution is roughly 0.1 mm at a viewing
distance of 1 ft.

Best regards --Gernot Hoffmann

Dave Martindale

Apr 6, 2002, 3:36:50 AM
hoff...@fho-emden.de (Gernot Hoffmann) writes:
>da...@cs.ubc.ca (Dave Martindale) wrote in message news:<a8l08u$clp$1...@newcastle.cs.ubc.ca>...
>> ...
>> To some extent, it depends on the pixel size. If you're working at 2500 DPI
>> (phototypesetter resolution), nearest-neighbour is just fine - individual
>> pixels are too small to see anyway. If you're working at 100 or 300 DPI,
>> where individual pixels *are* visible, then it's worthwhile paying much
>> more attention to filtering before sampling, and using more accurate
>> resampling techniques. I notice that desktop publishing is making much
>> more use of antialiasing in rendering fonts, for example, now that the
>> computation to do that isn't prohibitively expensive.
>> ....

>Dave, are you joking ?

No.

>For imagesetter resolution 2500 dpi and linescreen 180..200 Lpi
>(maximum raster frequency for offset printing) we don't need a
>higher source image resolution than 300 dpi.

What I meant was, if you're working directly with what the imagesetter
writes, which is a bilevel image at something like 2500 DPI, where all
the outline fonts have been converted to raster images, and all of the
continuous-tone images have been converted to halftone dots, then rotation
of *that* image is probably fine using nearest-neighbour.

On the other hand, if you're working with greyscale or colour continuous-
tone images, which have not yet been halftoned, then 300 DPI or less is
sufficient resolution - but you really want to use something better than
nearest-neighbour resampling when working with these images.

The process of high-quality printing uses these two different
representations at different stages of the process, but they're both raster
images and both can be rotated or scaled.

But I would normally never use a bilevel image at 300 DPI.

Dave
