
Help with 1/2 - (1/8)*x < 1/x - 1/(e^x - 1)


Dave L. Renfro

Mar 18, 2001, 2:57:19 AM
I'm trying to verify the inequality


1/2 - (1/8)*x < 1/x - 1/(e^x - 1) < 1/2

for x > 0.

I can verify the right-hand <.

This inequality is supposed to follow from tanh(x) < x
(and I can verify this as well), but I'm not getting anywhere.

The inequality above is in the middle of a proof in

S. K. Lakshmana Rao, "On the sequence for Euler's constant",
Amer. Math. Monthly 63 (1956), 572-573.

Everything else in this paper is fine--it's only the left-hand
side of the inequality above that's giving me trouble.

Can anyone help?

Dave L. Renfro

David Petry

Mar 18, 2001, 4:11:56 AM

Dave L. Renfro wrote:

>I'm trying to verify the inequality
>
>
> 1/2 - (1/8)*x < 1/x - 1/(e^x - 1) < 1/2
>
>for x > 0.
>
>I can verify the right-hand <.
>
>This inequality is supposed to follow from tanh(x) < x
>(and I can verify this as well), but I'm not getting anywhere.

Here's a hint which I'm pretty sure will work.

Recall that (d/dx) tanh(x) = 1 - tanh(x)^2 > 1 - x^2 for x > 0
(using the fact that 0 <= tanh(x) < x there),

so by integrating, we have tanh(x) > x - x^3/3

and go from there.
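A quick numerical sanity check of the hint and of the target inequality, as a minimal Python sketch (the helper name lhs_gap is just illustrative):

import math

def lhs_gap(x):
    # the middle quantity 1/x - 1/(e^x - 1) from the original inequality
    return 1.0 / x - 1.0 / math.expm1(x)

for x in [0.01, 0.1, 0.5, 1.0, 2.0, 5.0]:
    hint_ok = math.tanh(x) > x - x**3 / 3        # the integrated hint
    bound_ok = 0.5 - x / 8 < lhs_gap(x) < 0.5    # the original two-sided bound
    print(f"x = {x:5.2f}   hint holds: {hint_ok}   bound holds: {bound_ok}")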

David C. Ullrich

Mar 18, 2001, 11:35:33 AM
On 18 Mar 2001 02:57:19 -0500, ren...@central.edu (Dave L. Renfro)
wrote:

>I'm trying to verify the inequality
>
>
> 1/2 - (1/8)*x < 1/x - 1/(e^x - 1) < 1/2
>
>for x > 0.
>
>I can verify the right-hand <.
>
>This inequality is supposed to follow from tanh(x) < x
>(and I can verify this as well), but I'm not getting anywhere.

If you're willing to settle for a _proof_ (as opposed to
a clever proof or a proof that explains why it's true)
it seems like you can do this: Note this is equivalent to

1/2 + 1/(e^x - 1) < 1/x + (1/8)*x .

Now everything in sight is positive - hence you can
"multiply it out" to reduce it to

x^2 + 4*x + 8 < (x^2 - 4*x +8)*e^x .

That's what I got just now, anyway. You note that
both sides are equal when x = 0 so it suffices to
show that the derivative of the LHS is less than
the derivative of the RHS when x > 0. This reduces
to

2*x + 4 < (x^2 - 2*x + 4)*e^x.

Again, both sides are equal when x = 0 so
it suffices to prove the derivative of the LHS is
less than the derivative of the RHS, which reduces
to

2 < (x^2 + 2)*e^x,

which is clear.

(The fact that I kept getting equality when x = 0
seems like evidence I may have done the
calculations correctly. If not you can still
possibly do it this way - you can prove
many inequalities this way. Calculus.)
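For what it's worth, each step of this reduction is easy to spot-check numerically; a minimal Python sketch:

import math

for x in [0.1, 0.5, 1.0, 2.0, 5.0, 10.0]:
    e = math.exp(x)
    step1 = x**2 + 4*x + 8 < (x**2 - 4*x + 8) * e   # the "multiplied out" form
    step2 = 2*x + 4 < (x**2 - 2*x + 4) * e          # after differentiating once
    step3 = 2 < (x**2 + 2) * e                      # after differentiating again
    print(x, step1, step2, step3)                   # all three should print True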

Robin Chapman

Mar 18, 2001, 11:20:29 AM
Dave L. Renfro <ren...@central.edu> wrote:

> I'm trying to verify the inequality
>
>
> 1/2 - (1/8)*x < 1/x - 1/(e^x - 1) < 1/2
>
> for x > 0.
>
> I can verify the right-hand <.
>
> This inequality is supposed to follow from tanh(x) < x
> (and I can verify this as well), but I'm not getting anywhere.


The left inequality isn't too hard if you proceed in a naive way.

Rearranging, it's equivalent to
e^x > (x^2 + 4x + 8)/(x^2 - 4x + 8).
Let
f(x) = (x^2 - 4x + 8)e^x - (x^2 + 4x + 8).
Then
f'(x) = (x^2 - 2x + 4)e^x - (2x + 4)
and
f''(x) = (x^2 + 2)e^x - 2.
Then f(0) = f'(0) = 0 and for x > 0,
f''(x) = x^2 e^x + 2(e^x -1) > 0.
This gives in turn, f'(x) > 0 and f(x) > 0 for x > 0,
and the last inequality is the one we want.
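Those derivatives are quick to confirm symbolically, for instance with sympy (a minimal sketch):

import sympy as sp

x = sp.symbols('x', positive=True)
f = (x**2 - 4*x + 8) * sp.exp(x) - (x**2 + 4*x + 8)

f1 = sp.diff(f, x)
f2 = sp.diff(f, x, 2)

# Both prints should give 0, confirming the formulas for f' and f'' above.
print(sp.simplify(f1 - ((x**2 - 2*x + 4) * sp.exp(x) - (2*x + 4))))
print(sp.simplify(f2 - (x**2 * sp.exp(x) + 2 * (sp.exp(x) - 1))))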

Robin Chapman, www.maths.ex.ac.uk/~rjc/rjc.html

Rainer Rosenthal

Mar 18, 2001, 12:04:43 PM

Dave L. Renfro <ren...@central.edu> wrote in message
news:3nkkv9...@forum.mathforum.com...
|
| 1/2 - (1/8)*x < 1/x - 1/(e^x - 1) for x > 0.
|

Hello Dave,

this looked so nice that I tried my hand at it and found
this lovely expression:

e^x > ( x^2 + 4x + 8) / ( x^2 - 4x + 8)

which is equivalent to your inequality.

I am not sure whether this will help. But aren't there
techniques to do the polynomial division and compare
the outcome to the power series expansion of e^x?
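One way to make that comparison concrete is to expand both sides as power series, for instance with sympy (a minimal sketch):

import sympy as sp

x = sp.symbols('x')
approx = (x**2 + 4*x + 8) / (x**2 - 4*x + 8)

print(sp.series(sp.exp(x), x, 0, 5))          # 1 + x + x**2/2 + x**3/6 + x**4/24 + ...
print(sp.series(approx, x, 0, 5))             # 1 + x + x**2/2 + x**3/8 + ...
print(sp.series(sp.exp(x) - approx, x, 0, 5)) # difference starts with a positive x**3 term

The two agree through the x^2 term, and the difference e^x - (x^2 + 4x + 8)/(x^2 - 4x + 8) begins with a positive multiple of x^3, which is consistent with the inequality.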

Anyhow, this inequality struck me, so I used Excel to plot
the "left" and "right" functions, and indeed the right-hand
side gives a wonderfully close approximation of e^x near x = 0.

Regards
Rainer

Peter L. Montgomery

Mar 19, 2001, 3:32:21 AM
In article <992ppn$46n49$1...@ID-54909.news.dfncis.de>
"Rainer Rosenthal" <r.ros...@web.de> writes:
>
>Dave L. Renfro <ren...@central.edu> wrote in message
>news:3nkkv9...@forum.mathforum.com...
>|
>| 1/2 - (1/8)*x < 1/x - 1/(e^x - 1) for x > 0.
>|
>
>Hello Dave,
>
>this looked so nice, that I tried my hand and found
>this lovely expression:
>
> e^x > ( x^2 + 4x + 8) / ( x^2 - 4x + 8)
>
>which is equivalent to your inequality.

With a little more work, you can translate this to

tanh(x/2) > 4*x/(x^2 + 8) (x > 0)

You can check these by letting

f(x) = x - 2 arctanh( 4*x/(x^2 + 8))
= x + ln(x^2 - 4*x + 8) - ln(x^2 + 4*x + 8).

Obviously f(0) = 0. Whichever form you choose
to differentiate, f'(x) is a rational function in x
(this is why we are using logarithms). A calculation gives

f'(x) = x^2 (x^2 + 8) / ((x^2 - 4*x + 8) (x^2 + 4*x + 8)),

which is nonnegative for x >= 0.

Looking at the power series for tanh(x/2), we may guess

tanh(x/2) > 6*x/(x^2 + 12) (x > 0)

This is easily checked in the same way.
Going backwards, we can replace the constant
1/8 by 1/12 in the original problem. Setting x = 1 in

1/2 - x/12 < 1/x - 1/(e^x - 1)

gives 0.41667 < 0.41802, much tighter than 0.37500 < 0.41802.
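These numbers are easy to reproduce; a minimal Python sketch:

import math

def middle(x):
    # the quantity 1/x - 1/(e^x - 1) from the original problem
    return 1.0 / x - 1.0 / math.expm1(x)

for x in [0.5, 1.0, 2.0, 4.0]:
    print(f"x = {x}:  1/2 - x/8 = {0.5 - x/8:.5f},"
          f"  1/2 - x/12 = {0.5 - x/12:.5f},"
          f"  1/x - 1/(e^x - 1) = {middle(x):.5f}")

At x = 1 this prints the 0.37500, 0.41667 and 0.41802 quoted above.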

>I am not sure whether this will help. But aren't there
>techniques to do the polynomial division and compare
>the outcome to the power series expansion of e^x?

--
The 21st century is starting after 20 centuries complete,
but we say someone is age 21 after 21 years (plus fetus-hood) complete.
Peter-Lawren...@cwi.nl Home: San Rafael, California
Microsoft Research and CWI

Rainer Rosenthal

Mar 20, 2001, 6:21:02 PM

Peter L. Montgomery <Peter-Lawren...@cwi.nl> wrote in message
news:GAFrp...@cwi.nl...

|
| 1/2 - x/12 < 1/x - 1/(e^x - 1)
|
| gives 0.41667 < 0.41802, much tighter than 0.37500 < 0.41802.
|

Thanks a lot, Peter !

I will have to think a while about your reasoning. In the meantime
I took the easy way and, using your 12 instead of 8, computed the
much better approximation

e^x ~ (x^2 + 12 + 6x) / (x^2 + 12 - 6x)

which really took me by surprise when I tabulated and plotted it
in Excel - a wonderful approximation!

x      e^x         (x^2+12+6x)/(x^2+12-6x)
0.1    1.105170    1.105170
0.2    1.221402    1.221402
0.3    1.349858    1.349854
0.4    1.491824    1.491803
0.5    1.648721    1.648648
0.6    1.822118    1.821917
0.7    2.013752    2.013268
0.8    2.225540    2.224489
0.9    2.459603    2.457489
1.0    2.718281    2.714285
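The same comparison can be generated directly, e.g. with a short Python loop (a minimal sketch; the function name is just a label):

import math

def approx12(x):
    # the "12 instead of 8" approximation (x^2 + 12 + 6x)/(x^2 + 12 - 6x)
    return (x*x + 6*x + 12) / (x*x - 6*x + 12)

for i in range(1, 11):
    x = i / 10
    print(f"{x:.1f}   {math.exp(x):.6f}   {approx12(x):.6f}")

Incidentally, (x^2 + 12 + 6x)/(x^2 + 12 - 6x) is the (2,2) Padé approximant of e^x, which is why the agreement near x = 0 is so close.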

I especially appreciate the ABRACADABRA math which, after some
strong "smoke", produces such a rabbit as this "12 instead of 8".
Thanks again and kind regards

Rainer Rosenthal

Ayatollah Potassium

Mar 20, 2001, 8:16:39 PM
"Dave L. Renfro" wrote:

> 1/2 - (1/8)*x < 1/x - 1/(e^x - 1) < 1/2 for x > 0.

> [...]

The secret here seems to be that z/(e^z - 1) is the generating
function for the Bernoulli numbers, which (after some adjustment to
the first couple of terms) is one of those series with alternating
signs where truncating it at any order gives valid upper and lower
bounds (for the function at all x > 0, not just where the series
converges). I think the right adjustment of the function where this
is literally true was z*coth(z) (consistent with the remark below),
but I don't remember precisely.


> This inequality is supposed to follow from tanh(x) < x
> (and I can verify this as well), but I'm not getting anywhere.

Meaning x*coth(x) > 1 ...
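For the record, the relevant series is easy to display, for instance with sympy (a minimal sketch):

import sympy as sp

x = sp.symbols('x')
print(sp.series(1/x - 1/(sp.exp(x) - 1), x, 0, 8))
# roughly: 1/2 - x/12 + x**3/720 - x**5/30240 + ...

After the constant term 1/2 the nonzero coefficients alternate in sign, which is the pattern behind one-sided bounds such as 1/2 - x/12 < 1/x - 1/(e^x - 1) < 1/2.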

Dave L. Renfro

Mar 22, 2001, 3:16:25 PM
Dave L. Renfro <ren...@central.edu>
[sci.math 18 Mar 01 00:08:19 -0500 (EST)]
<http://forum.swarthmore.edu/epigone/sci.math/chilhildwi>

wrote (in part):

> I'm trying to verify the inequality
>
>
> 1/2 - (1/8)*x < 1/x - 1/(e^x - 1) < 1/2
>
> for x > 0.

I'd like to thank those who have responded (7 responses at this
time). I haven't had a chance to look over the replies carefully
yet (maybe this weekend?), but if I have any additional questions
I'll certainly reply again.

Here's the story behind this, if anyone is interested. I decided
to change a job interview (a tenure-track math position)
presentation the evening before I was to leave town because I
decided that what I had originally planned to talk about might not
be sufficiently focused to pull off very easily. [Square/cube law
in biology and engineering, applications of scaling principles in
mathematics (the Pythagorean theorem can be proved in this manner,
for instance), how the Cantor set and other "fractals" scale, and
using big 'Oh' and little 'Oh' to describe growth rates (those
already discussed, along with linear approximation errors,
trapezoid and Simpson rule errors, etc.).]

The audience would consist of upper level mathematics students and
I could assume a background through sophomore level multivariable
calculus. The topic needed to be something relating to undergraduate
level real analysis. Although I could assume a bit more mathematical
maturity than the typical student who had just finished multivariable
calculus (the students were, after all, upper level math majors),
I shouldn't assume that they would have already taken a real
analysis or advanced calculus course.

I decided to discuss the rate at which the harmonic series diverges.
By grouping the terms in groups of 2, 4, 8, ..., in two different
ways, you can obtain a divergence rate that is within a factor of 2
of log-base-2 of the number of terms added. By looking at appropriate
upper and lower Riemann sums for y = 1/x, you can show that the
divergence is asymptotic to log-base-e of the number of terms added.
Indeed, you're able to show that the sum of the first n terms lies
between ln(n) + 1/n and ln(n) + 1, a much stronger result. By
considering appropriate triangular regions that get ignored during
the lower Riemann sum calculation, you can improve this by showing
that the sum lies between ln(n) + 1/2 + 1/(2n) and ln(n) + 1.

At this point I would show (a bounded monotone sequence argument)
that the difference between the sum and ln(n) actually approaches a
limit--Euler's constant, and I would then show some of the many
ways it comes up in math. [See, for example,
<http://www.mathsoft.com/asolve/constant/euler/euler.html>.
A really strange way is this: The limit as n --> infinity of
[(ln n)^(-1)]*[product of (p+1)/p for all primes p less than
or equal to n] is [6*exp(Euler's constant)]/(Pi^2) (?!).]

So now we know that (sum) - ln(n) - (gamma) ['gamma' is Euler's
constant] approaches 0 as n --> infinity. There is a nice proof in
a 1956 Amer. Math. Monthly paper (reference in my earlier post)
that shows (sum) - ln(n) - (gamma) lies between
(1/2)*n^(-1) - (1/8)*n^(-2) and (1/2)*n^(-1), which shows that
(1/2)*n^(-1) is the leading power-function approximation for the
rate at which (sum) - ln(n) - (gamma) approaches 0. It turns out
that other methods (e.g. the Euler-Maclaurin sum formula, which
I wasn't going to discuss) can be used to improve this to show that
(sum) - ln(n) - (gamma) lies between (1/2)*n^(-1) - (1/12)*n^(-2)
and (1/2)*n^(-1). Indeed, one can show that (sum) - ln(n) - (gamma)
has the asymptotic expansion

(1/2)*n^(-1) - (1/12)*n^(-2) + (1/120)*n^(-4) - (1/252)*n^(-6) + ...,

where the coefficients wind up being easily related to the Bernoulli
number sequence.
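A quick numerical illustration of those bounds (a minimal Python sketch; the value of Euler's constant below is a rounded literal):

import math

GAMMA = 0.5772156649015329   # Euler's constant, rounded

for n in [5, 10, 50, 100]:
    h = sum(1.0 / k for k in range(1, n + 1))
    d = h - math.log(n) - GAMMA                           # (sum) - ln(n) - (gamma)
    monthly = 0.5 / n - 1 / (8 * n * n) < d < 0.5 / n     # the Monthly paper's bounds
    improved = 0.5 / n - 1 / (12 * n * n) < d < 0.5 / n   # the improved bounds
    print(n, monthly, improved)                           # both should print True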

I left for the airport less than 10 hours after sending in my
sci.math plea, and since there were no replies by the time I had
to leave, I decided to let the inequality be something their
math students might want to work on. However, I wound up going
over the earlier stuff (which originated as some introductory
remarks that I wrote during the trip) in sufficient detail that
I decided to skip the last part (the Monthly paper proof), which
in retrospect was what I should have done anyway.

Dave L. Renfro
