
Fitting a hyperbola


MC
May 24, 2004, 6:53:19 PM
Hi everyone,

I'm new to curve fitting and I have several datasets that should
theoretically fit rectangular hyperbolas. By this, I mean hyperbolas
of the form:

y = a/(bx + c)

What I'd like to do is derive values of a, b, & c that maximally fit
this equation for each dataset. I'm not quite sure how to do this.

The method I'm using isn't really for approximate curve fitting; it's
simply an analytic technique for determining the equation of the
hyperbola given 2 actual data points. (I've included my code and a
small dataset below.)

I'm solving this by rewriting the equation as y*x*b + y*c - a = 0 and
then setting to zero the determinant of the matrix in the system:

|x*y    y   1| |c1|
|x1*y1  y1  1| |c2| = 0
|x2*y2  y2  1| |c3|

(A nontrivial coefficient vector exists only when the determinant
vanishes.) Solving det = 0 for y gives me the rectangular hyperbola.

The problem is that this requires only 2 points to fit the hyperbola,
and different selections of points give slightly different
hyperbolas. I could try all combinations of points, e.g., all (n
choose 2) of them, and pick the hyperbola with lowest error, but I'm
thinking there must be a better way to do this.

Thank you in advance for any help you can provide me.

Here's my code: (Output is below.)

-------------------------------
% Sample data points
xpts = 1:15;
ypts = [9999 4975 3245 2433 1908 1582 1349 1179 1012 896 808 741 681 ...
        633 592];

figure;
hold on;
plot(xpts,ypts, 'Color', 'red') % Plot the actual data points

syms x y real;

xvals = xpts([1,10]); % Use these 2 pts
yvals = ypts([1,10]); % Use these 2 pts

% Our equation is y = a/(x*b + c)
% So we are solving: y*x*b + y*c - a = 0
M = [x*y xvals.*yvals; y yvals; 1 1 1];

M'

determinant = det(M')

hyperbola = solve(determinant, 'y')

hypvalues = subs(hyperbola, x, xpts);

plot(xpts, hypvalues, '--', 'Color', 'blue');

----------------------------------------

The output is:

ans =

[ x*y, y, 1]
[ 9999, 9999, 1]
[ 8960, 896, 1]


determinant =

9103*x*y-1039*y-80631936


hyperbola =

80631936/(9103*x-1039)

john kerry
May 24, 2004, 7:10:12 PM
Why don't you do least squares?
JFK
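One concrete way to follow this suggestion (a sketch of my own, assuming the model y = a/(b*x + c)): since 1/y = (b/a)*x + c/a is linear in x, an ordinary least-squares line through (x, 1/y) determines the hyperbola up to the free overall scale.

```
% Linear least squares on 1/y (illustrative sketch)
xpts = 1:15;
ypts = [9999 4975 3245 2433 1908 1582 1349 1179 1012 896 808 741 681 ...
        633 592];
p = polyfit(xpts, 1./ypts, 1);    % p(1) = b/a, p(2) = c/a
yfit = 1 ./ polyval(p, xpts);     % equals a/(b*x + c) with a = 1
plot(xpts, ypts, '.', xpts, yfit, '-')
```

Note that fitting 1/y instead of y changes the error measure: points with small y dominate the fit, so this is best used as a quick first pass or an initial guess.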

Cleve Moler
May 24, 2004, 10:23:40 PM
In article <eede0...@webx.raydaftYaTP>, MC <m...@nospam.org> wrote:
>Hi everyone,
>
>I'm new to curve fitting and I have several datasets that should
>theoretically fit rectangular hyperbolas. By this, I mean hyperbolas
>of the form:
>
>y = a/(bx + c)
>
> <text deleted>
>
>% Sample data points
>xpts = 1:15;
>ypts = [9999 4975 3245 2433 1908 1582 1349 1179 1012 896 808 741 681 633 592];
>
> <Code deleted>

Hi --

You realize, of course, that you can scale the three coefficients a, b, and c
by any scale factor and still have the same hyperbola.

Consider the tall, skinny matrix A with columns [xi.*yi yi 1].
If your data fall exactly on a hyperbola, then the columns of this
matrix are linearly dependent and the coefficient vector is a null vector.
With errors in the data, the columns are not dependent. The singular
value decomposition provides the nearest thing to a null vector.

Try this:

>> x = xpts';
>> y = ypts'/1000;              % scale y so the columns are comparable
>> A = [x.*y y ones(size(y))];  % columns [x.*y, y, 1]
>> [u,s,v] = svd(A,0);          % economy-size SVD
>> c = v(:,3);                  % right singular vector nearest a null vector
>> Y = -c(3)./(c(1)*x + c(2));  % y = a/(b*x+c) with b=c(1), c=c(2), a=-c(3)
>> plot(x,y,'.',x,Y)

-- Cleve
mo...@mathworks.com

Peter Spellucci
May 25, 2004, 2:58:12 PM

In article <eede0...@webx.raydaftYaTP>,

MC <m...@nospam.org> writes:
>Hi everyone,
>
>I'm new to curve fitting and I have several datasets that should
>theoretically fit rectangular hyperbolas. By this, I mean hyperbolas
>of the form:
>
>y = a/(bx + c)

This results in a nonlinear least squares problem (if that is the
adequate measure of error, for example for random noise in y and no
noise in x), and the job will be done by lsqnonlin (or lsqcurvefit in
this case) in the Optimization Toolbox. For this you will need
reasonable initial values, and what you describe below is one way to
obtain such an initial guess. Better, use the approach of Cleve, which
is also a kind of linearization: fix one of the parameters first
(since there is an arbitrary common factor in your model), say

y = a/(b*x + 1)

Multiplying by the denominator and using least squares, we get

sum_i ( y(i)*x(i)*b + y(i) - a )^2 = min over a, b

or, in MATLAB,

ab = [-ones(N,1), x.*y]\(-y);   % ab = [a; b]

If the fit is not very good, this will not be a good approximation to
the solution of the original nonlinear problem (in the sense of
statistics, it introduces "bias" in the estimated coefficients), but
you might use it as the initial guess for the nonlinear fit. For that
you write a function

function res = ratfit(param,x,y)
res = y - param(1)./(param(2)*x + 1);

and use lsqnonlin:

param = lsqnonlin(@ratfit, [-ones(N,1), x.*y]\(-y), [], [], [], x, y);

It might be necessary to put bounds on param(2) in order to avoid a
pole of the rational function inside the interval spanned by the x;
lsqnonlin allows you to do so.
hth
peter

Matt Feinstein
May 25, 2004, 3:28:49 PM
On Mon, 24 May 2004 18:53:19 -0400, MC <m...@nospam.org> wrote:

>Hi everyone,
>
>I'm new to curve fitting and I have several datasets that should
>theoretically fit rectangular hyperbolas. By this, I mean hyperbolas
>of the form:
>
>y = a/(bx + c)
>
>What I'd like to do is derive values of a, b, & c that maximally fit
>this equation for each dataset. I'm not quite sure how to do this.

The fact that your function has a singularity makes the question of
what you mean by 'maximally fit' non-trivial. In any case, as has been
noted, your function has only two independent parameters-- so it's a
good candidate for a simpler approach.

Personally, I would start by looking at the data and try to spot the
location of the singularity at x = -c/b . It's likely that this is the
'true' critical parameter in any case.
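One way to estimate that singularity numerically (a sketch of my own, reusing Cleve's SVD fit from earlier in the thread): the fitted curve y = -c(3)/(c(1)*x + c(2)) blows up where its denominator vanishes, i.e. at x = -c(2)/c(1).

```
% Estimate the pole x = -c/b from the SVD fit (illustrative sketch)
xpts = 1:15;
ypts = [9999 4975 3245 2433 1908 1582 1349 1179 1012 896 808 741 681 ...
        633 592];
x = xpts'; y = ypts'/1000;      % same scaling as in Cleve's post
A = [x.*y y ones(size(y))];
[u,s,v] = svd(A,0);
c = v(:,3);                     % near-null vector of A
xpole = -c(2)/c(1)              % x where c(1)*x + c(2) = 0
```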


Matt Feinstein

--
There is no virtue in believing something that can be proved to be true.

MC
May 25, 2004, 9:20:14 PM
Cleve, thanks very much for your response. I had wondered about the
numeric stability of my approach -- in the event the points were
exactly on the hyperbola -- but it hadn't occurred to me how to use
that concern to find a better solution.

(It was really quite a treat to get a response from MATLAB's
inventor.)

Schimmel
May 6, 2016, 9:35:11 PM
Several years ago MC described fitting a hyperbola to 15 (x,y) data
points that appear in his post on this newsgroup. His interpolating
equation is y = A/(Bx + C), a three-parameter equation. In most
laboratory practice it is customary to include a background term, so an
alternative three-parameter interpolating equation is y = A + B/(x + C).
This form has the advantage that the interpolating equation is invariant
under translation of the data. I found a new interpolating equation for
the data given by MC in his post. The coefficients are rounded for
convenience:

y = -99.5727154693 + 10105.7817647/(x + 0.000420087187497)    (1)

MC's equation, given in his post, is closer to five of the data points
than Eq. (1). However, Eq. (1) is closer to ten of the fifteen data
points than MC's equation. On this basis I suggest that Eq. (1) is
better than MC's equation for two reasons: it is closer to more of the
data, and the method generating it is based on least squares. A
literature citation is "Hyperbola for curvilinear interpolation",
Applied Mathematical Sciences 7 (30) (2013) 1477-1481. The journal is
online and the article can be downloaded for free; it illustrates the
individual steps and is accompanied by examples.

Lab data can also be modelled by the hyperbola plus a background term
plus a linear term. See "Curvilinear Interpolation: Hyperbola with a
Linear Term", Applied Mathematical Sciences 7 (48) (2013) 2395-2398.
The method is also easy and examples are given. The author of both
articles is Silver. -Schimmel
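Schimmel's three-parameter form y = A + B/(x + C) can be fitted with lsqcurvefit from the Optimization Toolbox (a sketch of my own, not the method from Silver's papers; the starting guess p0 is an assumption):

```
% Nonlinear least squares for y = A + B/(x + C)  (illustrative sketch)
xpts = 1:15;
ypts = [9999 4975 3245 2433 1908 1582 1349 1179 1012 896 808 741 681 ...
        633 592];
model = @(p,x) p(1) + p(2)./(x + p(3));
p0 = [0 10000 0];                       % rough starting guess
p = lsqcurvefit(model, p0, xpts, ypts)  % p = [A B C]
```

As with any nonlinear fit, the result depends on the starting guess; a linearized fit such as the ones discussed earlier in the thread makes a reasonable p0.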