Thanks -
L
The concept of an asymptote already embraces curves -- see, e.g.,
http://mathworld.wolfram.com/Asymptotic.html
Your first suggestion -- f(x)-g(x) -> 0 -- is closest, but fails
to capture the curve f(x)=1/x being asymptotic to the line x=0.
f(x)/g(x) -> 1 has problems -- if f(x)=1/x and g(x)=0 (admittedly
rather a flat curve) then f(x)/g(x) is undefined. Consider also
f(x)=2/x and g(x)=1/x ; in this case f(x)/g(x) -> 2.
The limit need not be as x -> inf, e.g. e^x is asymptotic to y=0
as x -> -inf. Wiser minds in sci.math will doubtless point out
further tricky aspects :-)
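A quick numerical look at that last pair, in Python (just a sketch, with
arbitrary sample points):

    f = lambda x: 2.0 / x
    g = lambda x: 1.0 / x
    for x in [10, 100, 1000, 10000]:
        # the difference f - g = 1/x goes to 0, yet the ratio f/g stays at 2
        print(x, f(x) - g(x), f(x) / g(x))

So the two curves do get arbitrarily close even though the ratio does not
tend to 1.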
--
f(x) is asymptotically equal to g(x) as x goes to a (symbolized as f~g)
if lim(x->a)(f(x)/g(x))= 1.
Notice that this means that cos(x) ~ 1-x^2/2 as x->0. Asymptotes are no
longer unique. As well they shouldn't be, since they are estimates, and
there are lots of estimates, some better than others.
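A quick Python check of the cos(x) claim (sample points arbitrary):

    import math
    for x in [0.5, 0.1, 0.01]:
        # both ratios go to 1, so 1 - x^2/2 and the constant 1 both qualify,
        # which is exactly the non-uniqueness mentioned above
        print(x, math.cos(x) / (1 - x**2 / 2), math.cos(x) / 1.0)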
Jon Miller
Clearly one cannot have a vertical non-straight asymptote.
but for non-vertical:
Given y = f(x) and y= g(x), and suppose that as x increases
without bound, the distance from (x,f(x)) to the nearest
point on the graph of g(x) goes to zero.
Wouldn't this be a case of f being asymptotic to g as x
increases without bound?
You could define "asymptote" as an equivalence relation, noticing that
a line has itself as an asymptote. So a curve has another as an
asymptote if they share an asymptote in the usual sense (a line).
Karin
I'm not sure why you say this. The graphs of y = 1/(1-x) and y =
2/(1-x) are asymptotic to each other as x -> 1.
> but for non-vertical:
>
> Given y = f(x) and y= g(x), and suppose that as x increases
> without bound, the distance from (x,f(x)) to the nearest
> point on the graph of g(x) goes to zero.
>
> Wouldn't this be a case of f being asymptotic to g as x
> increases without bound?
A counter-example: f(x) = 0 and g(x)= sin(x^2) (or even better, g(x) =
x*sin(x^2)) satisfy your condition, but it's not reasonable to call
the two graphs asymptotic. If you add the converse condition that the
distance from (x, g(x)) to the nearest point on the graph of f(x) goes
to zero, the definition would probably be more useful.
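To see numerically that f(x) = 0 and g(x) = sin(x^2) satisfy the stated
condition, it is enough to look at the zeros of g, which sit at
t = sqrt(k*pi) and get denser as x grows. A rough Python sketch (the
nearest zero only gives an upper bound on the distance, but that is all
we need):

    import math

    def dist_bound(x):
        # the zero of sin(t^2) nearest to x bounds the distance
        # from the point (x, 0) to the graph of sin(t^2) from above
        k = math.floor(x * x / math.pi)
        return min(abs(x - math.sqrt(j * math.pi)) for j in (k, k + 1))

    for x in [10, 100, 1000, 10000]:
        print(x, dist_bound(x))   # at most about pi/(4x), so it goes to 0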
I think it would be better to formulate the definition in terms of
parametrized curves in the plane (x(t), y(t)) rather than in terms of
graphs of functions y = f(x). The latter approach introduces
complications that are not intrinsic to the geometry. A definition
similar to the one I suggested above for graphs is: f(t) and g(t) are
asymptotic if, for large t, the "tails" of the curves are close. The
"tail" of f(t) is the set of points {f(t) | t >= T} for some large T,
and the tails are "close" if each point of one set lies within any
pre-assigned epsilon > 0 of some point of the other set, once T is
large enough. A similar definition applies if t -> -infinity.
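A crude sampled version of this test can be written down directly; the
helper name, window and step count below are mine, purely for
illustration:

    import math

    def tail_distance(f, g, T, span=50.0, n=400):
        # discretized form of "every point of one tail is close to the other":
        # the largest distance from a sample point on either tail to the
        # nearest sample point on the other tail
        A = [f(T + span * k / n) for k in range(n + 1)]
        B = [g(T + span * k / n) for k in range(n + 1)]
        near = lambda p, S: min(math.hypot(p[0] - q[0], p[1] - q[1]) for q in S)
        return max(max(near(a, B) for a in A), max(near(b, A) for b in B))

    f = lambda t: (t, 1.0 / t)   # one branch of the hyperbola y = 1/x
    g = lambda t: (t, 0.0)       # the x-axis
    for T in [10.0, 100.0]:
        print(T, tail_distance(f, g, T))   # shrinks as T grows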
Keep in mind that different definitions of "asymptotic" may be useful
in different situations. For example, in the study of manifolds of
non-positive curvature, one says that two (unit speed) geodesics f(t),
g(t) are asymptotic if their distance remains bounded as t ->
infinity. This can then be used to define a compactification and an
"ideal boundary" of the manifold under certain conditions.
John Mitchell
> > Clearly one cannot have a vertical non-straight asymptote.
>
> I'm not sure why you say this. The graphs of y = 1/(1-x) and y =
> 2/(1-x) are asymptotic to each other as x -> 1.
Which one of them is purely vertical? Anything non-straight
is also non-vertical at some point.
What is true is that all such examples have a vertical
straight line as asymptote (as well as any non-vertical
non-linear asymptotic curves they might have).
Not claiming to be wiser, I propose:
Two curves, parametrized by
(first:) x=f1(t), y=f2(t)
(second:) x=g1(s), y=g2(s)
(without loss of generality, we can assume both domains of the
parameters are of type [a, infinity))
can be called asymptotically close if there is a
re-parametrization of the second curve, s=S(t), such that
the distance of
(f1(t), f2(t)) from (g1(S(t)), g2(S(t)))
goes to 0 as t goes to infinity.
(A proper pair of asymptotic curves would assume that the
points (f1(t), f2(t)) run to infinity, that is, they eventually
stay out of every disk in the plane.)
Certainly, weird examples can be invented, but I tried to cover
the familiar cases, as well as the asymptotic point of the
logarithmic spiral and similar curves (take g1, g2 constant),
and even the type represented by the asymptotic circle of the
spiral in polar coordinates
r = theta / (1 + theta).
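For that last polar example, a small Python check (taking the identity
re-parametrization S(t) = t and the unit circle as the second curve):

    import math
    r = lambda t: t / (1 + t)
    for t in [10.0, 100.0, 1000.0]:
        p = (r(t) * math.cos(t), r(t) * math.sin(t))   # point on the spiral
        q = (math.cos(t), math.sin(t))                 # point on the unit circle
        # the distance is exactly 1 - r(t) = 1/(1 + t), which goes to 0
        print(t, math.hypot(p[0] - q[0], p[1] - q[1]))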
Cheers, ZVK(Slavek).
Interesting. But there is a problem here. The curves f(x) = x^2 and g(x)
= x^2 + 1/x do not share an asymptote in the "usual sense" (a line). Yet,
they should belong to the same equivalence class as the distance between the
curves becomes arbitrarily small for large values of x.
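A two-line Python check of the claim (the vertical gap 1/x already bounds
the distance between the two curves from above):

    for x in [10, 100, 1000]:
        print(x, (x**2 + 1.0 / x) - x**2)   # gap 1/x -> 0, yet no line asymptote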
Len
"karin" <k3e...@yahoo.dk> wrote in message
news:9418506b.03032...@posting.google.com...
I assumed that you didn't intend to say that a non-straight curve
cannot be vertical everywhere, since that is too obvious to need
saying. So I guessed that you meant that two non-straight asymptotic
curves could not be asymptotically vertical. That's false, so I was
left to wonder what you meant, hence my response.
John Mitchell
Below are some of the variations that I've encountered.
1. ASYMPTOTE
Any rational function R(x) can be written as
R(x) = p(x) + r(x),
where p(x) is a polynomial (which might be the zero polynomial)
and r(x) is such that r(x) --> 0 as |x| --> oo. Thus, we could
call p(x) the "polynomial asymptote" to R(x). Sometimes precalculus
texts introduce this notion, although the name for this notion
can vary from book to book.
Note that rational functions have a unique polynomial asymptote.
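The polynomial asymptote of a rational function falls out of ordinary
polynomial long division. A short Python sketch (the coefficient-list
convention and the helper name are mine):

    def polydiv(num, den):
        # long division of polynomials given as coefficient lists,
        # highest degree first; returns (quotient, remainder)
        num = list(num)
        q = []
        while len(num) >= len(den):
            c = num[0] / den[0]
            q.append(c)
            for i, dc in enumerate(den):
                num[i] -= c * dc
            num.pop(0)
        return q, num

    # R(x) = (x^3 + 1)/x = x^2 + 1/x, so p(x) = x^2 is the polynomial asymptote
    print(polydiv([1, 0, 0, 1], [1, 0]))   # quotient x^2, remainder 1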
More generally, it isn't difficult to see that if any function f(x)
has a polynomial asymptote p(x) in the sense that f(x) - p(x) --> 0
as |x| --> oo, then p(x) is unique. This is because if p_1 and p_2
are distinct polynomials, then (p_1 - p_2)(x), being a nonzero
polynomial, will not approach zero as |x| --> oo. If we broaden the
notion of what to allow for an asymptote, then it is easy to see that
we might lose uniqueness.
A nontrivial example of an asymptote in this broader (non-polynomial)
sense: the difference between the sum 1 + 1/2 + ... + 1/n and the value
gamma + ln(n) tends to zero -->
http://primes.utm.edu/glossary/page.php?sort=Gamma
http://mathworld.wolfram.com/Euler-MascheroniConstant.html
http://numbers.computation.free.fr/Constants/Gamma/gamma.html
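A quick numerical look at this example in Python (the value of gamma is
hard-coded below):

    import math
    gamma = 0.5772156649015329   # Euler-Mascheroni constant
    for n in [10, 1000, 100000]:
        H = sum(1.0 / k for k in range(1, n + 1))
        # the difference behaves roughly like 1/(2n), so gamma + ln(n)
        # is an asymptote of the partial sums in this broader sense
        print(n, H - (gamma + math.log(n)))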
2. ASYMPTOTIC
A weaker notion (at least for functions that don't approach zero as
|x| --> oo) is "f(x) is asymptotic to g(x) as x --> oo" (or as
|x| --> oo). To be an asymptote, we require that the actual distance
between the curves approaches zero. To be asymptotic, we only require
that the relative distance between the curves approaches zero. If
the two functions are asymptotic as |x| --> oo, then as you zoom out
equally horizontally and vertically, the two graphs will appear to
get closer and closer to each other.
In practice it is often easy to obtain a simple asymptotic
equivalent of a function. The general rule of thumb is that you
can ignore "less dominate" terms in any additive situation.
By "f is less dominate than g", I mean limit(x -> oo) of f(x)/g(x)
is zero, or f = o(g) (i.e. f is little 'Oh' of g).
Thus, (3x^3 - 2x)/(5x^2 + 4x + 1) is asymptotic to (3x^3)/(5x^2),
or (3/5)x. Sqrt[9x^4 + 2x^2 + 2] is asymptotic to 3x^2.
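A numerical check of those two ratios in Python (sample points arbitrary):

    import math
    f1 = lambda x: (3 * x**3 - 2 * x) / (5 * x**2 + 4 * x + 1)
    g1 = lambda x: (3.0 / 5.0) * x
    f2 = lambda x: math.sqrt(9 * x**4 + 2 * x**2 + 2)
    g2 = lambda x: 3 * x**2
    for x in [10, 100, 1000]:
        # both ratios approach 1, which is what "asymptotic" asks for
        print(x, f1(x) / g1(x), f2(x) / g2(x))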
A nontrivial example of this behavior is the number of primes less
than n and the value of n/(ln n).
http://www.utm.edu/research/primes/howmany.shtml
http://mathworld.wolfram.com/PrimeNumberTheorem.html
http://primes.utm.edu/glossary/page.php?sort=PrimeNumberThm
Another nontrivial example is Stirling's formula for n!.
http://mathworld.wolfram.com/StirlingsApproximation.html
http://www.sosmath.com/calculus/sequence/stirling/stirling.html
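For Stirling's formula, a small Python check of the ratio of n! to
sqrt(2*pi*n)*(n/e)^n:

    import math
    stirling = lambda n: math.sqrt(2 * math.pi * n) * (n / math.e) ** n
    for n in [5, 10, 20, 50]:
        # the ratio tends to 1 (from above, roughly like 1 + 1/(12n))
        print(n, math.factorial(n) / stirling(n))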
3. ORDER
A still weaker notion (at least for functions that don't approach zero
as x --> oo) is what I'll call f(x) is "the same order as" g(x) as
x --> oo, which means that the ratio f(x)/g(x) is bounded away from
zero and infinity as x --> oo. Thus, 7x^3 and (2x-3)^3 have the same
order, [2x + sin(x)]^4 and x^4 have the same order, and exp(ax) has
the same order as exp(bx) iff a=b. The notion of "the same order as"
is used in the limit comparison test for infinite series that one
encounters in a second semester calculus course.
http://mathworld.wolfram.com/LimitComparisonTest.html
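A numeric illustration in Python of "same order" without being asymptotic:

    import math
    f = lambda x: (2 * x + math.sin(x))**4
    g = lambda x: x**4
    for x in [10, 100, 1000]:
        # the ratio stays near 16: bounded away from 0 and infinity
        # (same order), but it does not approach 1 (not asymptotic)
        print(x, f(x) / g(x))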
4. EXPONENTIAL ORDER
There is a notion of f(x) is "the same exponential order as" g(x)
as x --> oo that arises in complex analysis. The idea is that any
polynomial has exponential order 0 and exp(p) has exponential order
n where p is a polynomial of degree n.
http://www.math.tamu.edu/~harold.boas/courses/618-1999a/feb12.pdf
5. C^1 ASYMPTOTES
Everything above has to do with f(x) not being "too different" from
g(x) for large x. We could also require that their derivatives are
not "too different" for large x. I suppose this could be done in
a variety of ways, like above, but I've only encountered this in
the context of the derivatives approaching each other. Below is
an old post of mine where this is discussed. Note that we could
also require that the first n derivatives, for any fixed n > 0,
behave similarly (e.g. in the sense of asymptote, in the sense of
asymptotic, in the sense of order, etc.) for large x.
********************************************************************
********************************************************************
http://mathforum.org/epigone/ap-calc/snedikha
Subject: [ap-calculus] Re: sin x/x
Author: Dave L. Renfro <dlre...@gateway.net>
Date: Wed, 20 Sep 2000 20:01:53 -0500
Curtis James <jam...@postnet.com>
[AP-CALC Mon, 18 Sep 2000 19:20:51 -0500]
<http://forum.swarthmore.edu/epigone/ap-calc/snedikha>
wrote
> The definition, according to Leithold (The Calculus Seven),
> is y= b is a horizontal asymptote of f if lim f, x->inf = b
> and for some number N, if x>N, then f(x) is not equal to b;
> there is a similar condition for x->-inf. Since this means
> that after x reaches N, f(x) cannot equal b, I think that
> disqualifies lim sinx/x, x->inf from having y=0 as a
> horizontal asymptote. (Kind of a sticky question, however
> in my opinion. Perhaps we need to designate this situation
> with some other name--maybe there already is one.)
The general consensus in this thread appears to be that the
line y=L is a (right) horizontal asymptote to the graph of
y = f(x) <==> limit as x --> infinity of f(x) is L, and this
is what I had always considered the definition to be. But missing
from these discussions, as far as I can tell, is any mention of
*WHY* one would use one definition over another. I can think of
many situations where knowing that "limit as x --> infinity of
f(x) is L" does or does not hold would be of interest, but I'm
at a loss to come up with any mathematically significant
situation in which Leithold's definition would be of interest.
I looked in some older texts out of curiosity ---->>>>
(1) From page 144 of Charles H. Sisam, ANALYTIC GEOMETRY, 1936:
"If a branch of a curve extends toward infinity in such a
way that it approaches indefinitely near to a fixed line,
this line is called an asymptote to the curve."
(2) From page 136 of Walter Burton Ford, A FIRST COURSE IN THE
DIFFERENTIAL AND INTEGRAL CALCULUS", revised edition, 1937:
"Let AB be any curve, or branch of a curve, which extends
indefinitely to distant portions of the plane, or as
commonly stated, 'extends to infinity.' As the point of
contact P of a tangent to such a curve is chosen farther
and farther away the accompanying tangent ST may approach
a definite limiting position MN. In this case MN is called
an asymptote to the curve AB." [A figure is provided.]
The definition Sisam gives is equivalent, when the 'fixed line'
is horizontal, to the limit definition I gave earlier. On the
other hand, the definition Ford gives is more restrictive. When
specialized to the graph of y = f(x), Ford requires both (i)
and (ii):
(i) limit as x --> infinity of f(x) exists
(ii) limit as x --> infinity of f'(x) exists
Ford doesn't give any examples that distinguish "(i)" from
"(i) and (ii)", but they are not hard to come by. The function
(sin x)/x satisfies both (i) and (ii), while (sin x^2)/x and
(sin x^3)/x each satisfy (i) only. [Note that the derivative of
(sin x^2)/x is bounded as x --> infinity, while the derivative
of (sin x^3)/x is unbounded as x --> infinity.] These examples
show that Ford's definition doesn't imply Leithold's definition.
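The bracketed claim about the derivatives can be checked directly;
computing the derivatives by hand and sampling them in Python (sample
points arbitrary):

    import math
    # d/dx [sin(x^2)/x] = 2*cos(x^2) - sin(x^2)/x^2     (bounded)
    # d/dx [sin(x^3)/x] = 3*x*cos(x^3) - sin(x^3)/x^2   (unbounded)
    d1 = lambda x: 2 * math.cos(x**2) - math.sin(x**2) / x**2
    d2 = lambda x: 3 * x * math.cos(x**3) - math.sin(x**3) / x**2
    for x in [10.0, 100.0, 1000.0]:
        # d1 stays within [-2.01, 2.01]; |d2| reaches size about 3x
        print(x, d1(x), d2(x))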
The functions 1/x + (sin x^3)/(x^2) and 1/x + (sin x^4)/(x^2)
have y=0 as a horizontal asymptote according to Leithold's
definition, but neither satisfies Ford's definition. [The former
has a bounded derivative as x --> infinity, while the latter has
an unbounded derivative as x --> infinity.] Therefore, Ford's
definition neither implies, nor is implied by, Leithold's
definition.
Ford's definition is what I suppose one would call a 1'st order
of contact at infinity, with Sisam's definition being a 0'th order
of contact at infinity. [The analogy being with, for example,
y = x-a having a 0'th order of contact at x=a and y = (x-a)^2
having a 1'st order of contact at x=a. The former involves only
the behavior of y as x --> a, while the latter involves the
behavior of both y and y' as x --> a.]
I can see how Ford's definition could be useful, but I continue
to be puzzled as to why anyone would be interested in the
condition Leithold gives.
********************************************************************
********************************************************************
Dave L. Renfro
"What is an asymptote?", P. J. Giblin, Math. Gaz. 56 (Dec. 1972) 274-284
Regards,
David