I know the solution is probably quite simple, and I'm simply overlooking it.
Any help, even so much as an "it (con/di)verges" would be appreciated.
Joe
:There is one problem that has been bothering me for a few years now
Diverges. n^((n+1)/n) = n * n^(1/n), and n^(1/n) --> 1 as n --> infinity
(any book on analysis, or take logs and use L'Hopital). Thus the ratio of
1/n and 1/n^((n+1)/n)
approaches 1, and since the first of these diverges, by the ratio test so
does the second.
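A quick numerical sanity check of that limit (a Python sketch, not part of the original argument): the ratio (1/n) / (1/n^((n+1)/n)) simplifies to n^(1/n), which drifts down to 1.

```python
import math

def ratio(n):
    # (1/n) / (1/n^((n+1)/n)) simplifies to n^((n+1)/n)/n = n^(1/n)
    return n ** (1.0 / n)

for n in (10, 10**3, 10**6):
    # cross-check via logs: n^(1/n) = exp(ln(n)/n)
    assert abs(ratio(n) - math.exp(math.log(n) / n)) < 1e-12
    print(n, ratio(n))
```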
--Ron Bruck
In article <ekHcTp$9#GA.251@cpmsnbbsa02>, "ashwood" <ash...@email.msn.com>
writes:
Suspicion is not a proof, though. And, the harmonic series is borderline
divergent. Each term of the new series is less than the corresponding
term of the divergent harmonic series, so the comparison test does not
lead to anything useful (despite what I recall seeing in another
response, probably yesterday). If that (n+1) term were (n-1), the
comparison test would prove divergence. As given, though ...
Lynn Killingbeck
[The result actually holds (with essentially the same
proof) under the hypothesis that the magnitude of
the ratios of the terms is bounded away from 0 and inf.]
Dave L. Renfro
Lynn Killingbeck wrote (in part):
Dave L. Renfro
Your series diverges. It follows by using the Limit Comparison Test on
your series and the Harmonic Series (1/n). Since both series always
have positive terms and the limit as n-> infinity of (nth term of your
series)/(1/n)=1, the test says that in this case, either both series
converge or both diverge. Well, since the Harmonic series diverges
(one way to show this is to use the Integral Test), yours also does.
Take Care
J.
If you're referring to Ron Bruck's post, didn't he mention the ratio
test (rather than a comparison test) ??
I understood him to be applying a result of the following flavor: Let
(a_n), (b_n) be sequences of positive terms such that a_n/b_n converges
to a positive limit. Then Sum a_n diverges if Sum b_n diverges.
Proof: Suppose a_n/b_n --> L as n --> infinity. Then, for all sufficiently
large n, (L/2)b_n < a_n < (3L/2)b_n and the first inequality implies
the result.
Clearly, that could be strengthened/refined (this version being just
"off the top of my head"), but it seems to do the trick ...
--
Ed Hook | Copula eam, se non posit
MRJ Technology Solutions, Inc. | acceptera jocularum.
NAS, NASA Ames Research Center | I can barely speak for myself, much
Internet: ho...@nas.nasa.gov | less for my employer
> We already know that the series of 1/n diverges. As n gets sufficiently large,
> 1/(n^((n+1)/n)) will get arbitrarily close to 1/n. Therefore, I expect this
> series will also diverge.
There are various things that "gets arbitrarily close to" could mean - this
is correct with some interpretations, not with others. In fact the ratio of the
two quantities tends to 1, hence the series diverges.
(Otoh: the sum of 1/n^2 converges, and as n tends to infinity this gets
arbitrarily close to 1/n, in the sense that the _difference_ tends to 0. It
does not follow that the sum of 1/n converges.)
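To make the difference-versus-ratio point concrete, here is a small Python sketch (my illustration, not from the post): the termwise difference of 1/n and 1/n^2 tends to 0, yet the partial sums part company.

```python
N = 10**5
diff = abs(1.0 / N - 1.0 / N**2)                 # termwise difference -> 0
H = sum(1.0 / n for n in range(1, N + 1))        # harmonic: grows like log N
P = sum(1.0 / n**2 for n in range(1, N + 1))     # converges to pi^2/6 ~ 1.6449
print(diff, H, P)
```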
My mistake, here. I read 'what I wanted to see', rather than what was
actually written. Worse, I did it from memory a day or two later,
without going back and re-reading the actual response. I stand
corrected.
Lynn Killingbeck
Is there a series for which it's REALLY difficult to tell if it con/diverges?
============================================================================
Of course it depends on how much is REALLY. But just using the integral
test and comparison tests, most of the well-known nasty series can be
easily squeezed in somewhere inside one of these two lists...
1/n 1/n^(1+eps)
1/(n logn) 1/(n (logn)^(1+eps) )
1/(n logn loglogn) 1/(n logn (loglogn)^(1+eps) )
1/(n logn loglogn logloglogn) ... ...
...where the sums in the first list all diverge and those in the second all converge.
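For a feel of how slowly these borderline series move, a hedged Python sketch comparing one divergent and one convergent member of the scale (taking eps = 1 in the second column):

```python
import math

def partial(f, N, start=2):
    # partial sum from start to N; start=2 avoids log(1) = 0
    return sum(f(n) for n in range(start, N + 1))

slow_div = lambda n: 1.0 / (n * math.log(n))       # diverges, like log log N
conv = lambda n: 1.0 / (n * math.log(n) ** 2)      # converges by the integral test

for N in (10**3, 10**6):
    print(N, partial(slow_div, N), partial(conv, N))
```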
It's hard to imagine anything that resists such squeezing, but I suppose
there could be some sort of omega-ly defined series that fits in between
the two lists.
Alternatively, there might be a series defined in some totally different
way... using primes or partition numbers is out, because we know their
limiting density, as we do for most sequences of simply-defined integers.
So we want a "naturally" definable series; that is not something defined
according to whether <unsolved conjecture> is true or false. That would
be cheating most foul.
So anyway, interpreting the question in accordance with the spirit of the
provisos above:- DOES anyone know of a really nasty series, for which the
determination of con/divergence is really difficult?
Cheers.
--------------------------------------------------------------------------
Bill Taylor W.Ta...@math.canterbury.ac.nz
--------------------------------------------------------------------------
Physics confines itself to that thin sliver of reality
that can be accurately quantified. - John Baez
--------------------------------------------------------------------------
It depends on what you mean by "really difficult", but one example of a
series whose convergence is a genuinely hard theorem is the sum of the
reciprocals of the twin primes (Brun's theorem, 1919).
--
--Daniel Giaimo
:This little thread reminded me of something that floats around the back
:of my mind from time to time.
:
:Is there a series for which it's REALLY difficult to tell if it con/diverges?
There was a thread sometime back about a series (whose terms were a
function of x) where the question was whether the function was bounded or
not. It turned out they were unbounded, which was an old theorem of Hardy
and Littlewood, and which could be proved using the fact that if a
function is almost periodic, and its primitive is bounded, then the
primitive is also a.p. Exceedingly indirect, in other words: the
technique was to show the primitive was NOT a.p., hence couldn't be
bounded.
The series was something like \sum sin(x/n)/n, which definitely
converges--the crux being the boundedness. You might build something
around this (e.g. along the lines of your "not fair" criterion of an
unsolved conjecture, but using the SOLVED problem).
Of course, there's also the observation that the series
\sum 1/(2^n sqrt{|x-r_n|})
converges for a.e. x \in (0,1), where {r_n} is an enumeration of the
rationals. A consequence of the monotone convergence theorem. If you
specify a particular x, say sqrt{2}/2, and a particular enumeration, say
the usual Cantor zigzag enumeration, it might be VERY difficult to decide
whether the result converges or not.
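Just to illustrate (not to settle anything), a Python peek at the partial sums for one specific zigzag-style enumeration — listing p/q in lowest terms by increasing denominator q. The choice of enumeration is my assumption; the post leaves it open, and that openness is the whole point.

```python
import math

def rationals_in_01():
    # enumerate p/q in lowest terms, q = 2, 3, 4, ...
    q = 2
    while True:
        for p in range(1, q):
            if math.gcd(p, q) == 1:
                yield p / q
        q += 1

x = math.sqrt(2) / 2
s = 0.0
gen = rationals_in_01()
for n in range(1, 61):
    r_n = next(gen)
    # terms spike when r_n happens to fall near x
    s += 1.0 / (2**n * math.sqrt(abs(x - r_n)))
print(s)
```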
--Ron Bruck
It diverges, as everyone says. I don't know if you are aware of the
following well-known theorem:
A series Sum{a_n} (a_n > 0) converges if
there exists e > 0 such that for n > n_0 we have ln(1/a_n)/ln(n) >= 1+e;
it diverges if
for n > n_0 we have ln(1/a_n)/ln(n) <= 1.
For our case
ln(1/a_n)/ln(n) = (n+1)/n === 1
where "===" means "asymptotically equal to".
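For this series the test ratio can be computed directly; a quick Python check (just confirming the algebra ln(1/a_n)/ln(n) = (n+1)/n):

```python
import math

def log_test_ratio(n):
    a_n = 1.0 / n ** ((n + 1) / n)   # the series' n-th term
    return math.log(1.0 / a_n) / math.log(n)

for n in (10, 100, 10**4):
    print(n, log_test_ratio(n), (n + 1) / n)
```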
Here is my problem (I do not know the solution).
Motivation: It is easy to show that the sum of sin(n)/n converges (to
(pi-1)/2) and that the sum of abs(sin(n))/n diverges.
The proof of the latter rests on finding that after adding a suitable
convergent series, the terms are greater than c/n for some c>0.
Since the average of abs(sin(x)) is 2/pi, and 1/pi is irrational, we
might ask:
Will the sum of (1/n) * (abs(sin(n)) - 2/pi) converge or diverge?
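I can't resist a numerical peek (a Python sketch; of course this proves nothing about convergence either way):

```python
import math

def partial_sum(N):
    # terms (|sin n| - 2/pi)/n; 2/pi is the average value of |sin|
    c = 2.0 / math.pi
    return sum((abs(math.sin(n)) - c) / n for n in range(1, N + 1))

for N in (10**3, 10**4, 10**5):
    print(N, partial_sum(N))
```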
Cheers, ZVK(Slavek).
> There is one problem that has been bothering me for a few years now
> regarding whether or not the sum of a series is convergent or divergent. The
> series is expressed as:
> 1/(n^((n+1)/n)) where n goes from 1 to infinite
>
> I know the solution is probably quite simple, and I'm simply overlooking it.
> Any help, even so much as an "it (con/di)verges" would be appreciated.
>
There are many ways to prove that the series
is divergent.
One possible way:
since n^(1/n)->1 as n->infinity
we have that there exists n_0 such that
for any n>n_0 n^(1/n)-1<1
(it follows from the definition of limit of sequence).
So n^(1/n)<2 for n>n_0.
Hence
1/(n^((n+1)/n)) = ( 1/n )*[ 1/n^(1/n) ] > (1/n)*(1/2) for n>n_0.
Finally 1/(n^((n+1)/n)) > 0.5*(1/n) > 0 for all
n > n_0.
Because the series Sum_{n=1}^{infinity} 1/n
is divergent it follows that the series
0.5*Sum_{n=1}^{infinity} 1/n also diverges.
Now applying the comparison test we conclude
that Sum_{n=1}^{infinity} 1/n^( (n+1)/n )
diverges.
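In fact the key bound n^(1/n) < 2 holds for every n >= 1 (the maximum of n^(1/n) is at n = 3, about 1.442), so the comparison is easy to sanity-check in Python:

```python
def term(n):
    return 1.0 / n ** ((n + 1) / n)

# n^(1/n) < 2 for all n >= 1, hence term(n) > (1/2)*(1/n)
for n in range(1, 10001):
    assert n ** (1.0 / n) < 2.0
    assert term(n) > 0.5 / n
print("bound holds up to n = 10000")
```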
> Joe
Jan
Bill Taylor wrote:
[entire post chopped!]
> Physics confines itself to that thin sliver of reality
> that can be accurately quantified. - John Baez
Just how accurately do physicists quantify anything? The chief accountant of a
bank could know to the nearest penny how much money was in the bank. How does
that compare to the accuracy with which a physicist can measure the mass of, say,
a shoe? If John is right, is it a warning to other sciences *not* to try too hard
to emulate physics for fear of limiting their scope too much? Am I right in my
belief that far too many people wrongly think that something can only be
understood if a number can be attached to it?
Note that this is not a criticism of either BT or JB, I'm just bored....
> Is there a series for which it's REALLY difficult to tell if it con/diverges?
Consider sum_n 1/|n^3 sin n|. Whether it converges or not depends on
how well Pi can be approximated by rationals. Specifically, if there
is c > 0 such that |Pi - n/m| >= c/m^2 for all positive integers n,m,
then | n sin n| is bounded below, and the series converges. On the
other hand, if there are infinitely many n,m for which |Pi - n/m| < c/m^4,
then the terms of the series don't even go to 0, and it diverges.
AFAIK neither of these questions has been decided, although it is
known (M. Hata, Acta Arithmetica 63 (1993) 335-349), that
|Pi - n/m| > m^(-8.0161) for sufficiently large m. Thus
sum_n 1/|n^p sin n| does converge if p > 9.0161, but this was not
known before 1993.
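The spikes are easy to see numerically: 355/113 is a famously good rational approximation to Pi, so n = 355 sits very close to 113*Pi and |sin n| is tiny there. A Python peek (illustrative only; floating point can't probe the deep approximation questions):

```python
import math

def term(n):
    return 1.0 / (n**3 * abs(math.sin(n)))

# 355 ~ 113*pi, so |sin 355| ~ 3e-5 and the term spikes
for n in (354, 355, 356):
    print(n, term(n))
```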
Robert Israel isr...@math.ubc.ca
Department of Mathematics http://www.math.ubc.ca/~israel
University of British Columbia
Vancouver, BC, Canada V6T 1Z2
|> Note that this is not a criticism of either BT or JB,
I'm sure we wouldn't take it so, but dashed nice of you to say so, sir!
|> I'm just bored....
And why not. This *is* Usenet, after all!
|> Just how accurately do physicists quantify anything?
I think it could be fairly said that astronomers, in particular, take
accuracy to extremes!
|> The chief accountant of a
|> bank could know to the nearest penny how much money was in the bank.
Is measuring the same as mere counting? Does either include the other, even?
Hmmm... measuring, counting, observing... hmmm...
|> If John is right, is it a warning to other sciences *not* to try too hard
|> to emulate physics for fear of limiting their scope too much?
Maybe. Most other subjects are said to suffer from "physics envy".
|> Am I right in my
|> belief that far too many people wrongly think that something can only be
|> understood if a number can be attached to it?
Possibly. This belief seems a little unjustified. I'm sure many humanities
topics can be well understood without having to resort to anything like
measurement. However, there may be some justification for thinking that
nothing can be terribly *objective* without some sort of measurement being
involved. No doubt millions of words have been written about all
the ramifications of this one.
Cheers. Hope I woke you up some...
----------------------------------------------------------------------------
Bill Taylor W.Ta...@math.canterbury.ac.nz
----------------------------------------------------------------------------
The chief difference between mathematics and physics is that
in mathematics we have much more direct contact with reality.
----------------------------------------------------------------------------
wrote (in part)
>This little thread reminded me of something that floats around
>the back of my mind from time to time.
>
>Is there a series for which it's REALLY difficult to tell if
>it con/diverges?
>===============================================================
>
>Of course it depends on how much is REALLY. But just using
>the integral test and comparison tests, most of the well-known
>nasty series can be easily squeezed in somewhere inside one of
>these two lists...
>
>1/n 1/n^(1+eps)
>
>1/(n logn) 1/(n (logn)^(1+eps) )
>
>1/(n logn loglogn) 1/(n logn (loglogn)^(1+eps) )
>
>1/(n logn loglogn logloglogn) ... ...
>
>...for whose sums the first list all diverge and the
>second converge.
>
>
>It's hard to imagine anything that resists such squeezing,
>but I suppose there could be some sort of omega-ly defined
>series that fits in between
>the two lists.
>
>Alternatively, there might be a series defined in some totally
>different way... using primes or partition numbers is out,
>because we know their limiting density, as we do for most
>sequences of simply-defined integers.
>
>So we want a "naturally" definable series; that is not
>something defined according to whether <unsolved conjecture>
>is true or false. That would be cheating most foul.
There appear to be two questions asked. Since all the
responses I've seen thus far have been to the second
question (which begins with "Alternatively, ..."),
I thought I'd address the first question.
I don't off-hand know of a "naturally" definable
example, but in
R. P. Agnew, "A slowly divergent series", Amer. Math.
Monthly 54 (1947), 273-274
the following example is shown to diverge more slowly
than any series in the logarithmic scale you gave:
Sum (n=1 to inf) of [P(n)]^(-1), where
P(n) = (n)*(log n)*(log log n)*(log log log n)* ...
is the product (NOT an infinite product) of all
iterated logs of n that are greater than 1.
The fact that such an example exists has been known
at least since the early 1870's, but Agnew's example has
the appeal of being easy to verify.
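Agnew's P(n) is easy to compute; a sketch in Python (I'm assuming natural logs, as is usual in this context):

```python
import math

def P(n):
    # n times every iterated natural log of n that exceeds 1
    prod = float(n)
    x = math.log(n)
    while x > 1.0:
        prod *= x
        x = math.log(x)
    return prod

for n in (10, 10**3, 10**6):
    print(n, P(n), 1.0 / P(n))
```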
Another example is given in
Bruce Christianson, "Condensing a slowly convergent
series", Mathematics Magazine 68 (1995), 298-301.
Du Bois-Reymond proved in the 1870's that function
growth scales are much more densely packed together
than the real numbers are. For example, given any
countable set S of sequences and given another
sequence {b_n} that grows infinitely faster than
any of the sequences in S [in the sense that if you
take any {a_n} belonging to S, then the limit
as n --> infinity of the ratio (a_n)/(b_n) goes to 0],
then there exists a sequence {c_n} that grows infinitely
faster than any of the sequences in S and yet it is
still the case that {b_n} grows infinitely faster
than {c_n}. Note that for the set of real numbers it is
easy to find countable sets S each of whose elements
is less than some given number b, but for which there
is no number c greater than all the numbers in S
while still being less than b. [Let S = rationals less
than sqrt(2) and c = sqrt(2), or let S = {.9, .99,
.999, ...} and c = 1.] A similar result holds for
how slowly sequences can converge to some real number.
In particular, you can't take a "Dedekind cut" to obtain
either a slowest diverging series of positive terms
or a slowest converging series of positive terms.
A nice proof of this appears in
J. Marshall Ash, "Neither a worst convergent series
nor a best divergent series exists", The College
Mathematics Journal 28 (1997), 296-297.
Ash proves:
(1) If Sum(a_n) is a convergent series with positive
terms, then there exists a convergent series
Sum(b_n) such that limit as n --> infinity
of (b_n)/(a_n) goes to infinity. In other words,
there exists a convergent series whose terms become
"infinitely larger" than the terms of Sum(a_n).
(2) If Sum(c_n) is a divergent series of positive
terms, then there exists a divergent series
Sum(d_n) such that limit as n --> infinity
of (c_n)/(d_n) goes to 0. In other words, there
exists a divergent series whose terms become
"infinitely smaller" than the terms of Sum(c_n).
A very thorough historical survey of du Bois-Reymond's
work in this area is
Gordon Fisher, "The infinite and infinitesimal
quantities of du Bois-Reymond and their reception",
Archive for History of Exact Sciences 24 (1981),
101-163.
Du Bois-Reymond's work was continued by Hardy (his
"ORDERS OF INFINITY" book), Hausdorff, Rothberger,
and many others. This later work (especially after
Hardy) quickly gets into sophisticated set theoretic
matters. A recent survey is
Marion Scheepers, "Gaps in $\omega^{\omega}$", pages
439-561 in Haim Judah (editor), SET THEORY OF THE
REALS [Israel Mathematical Conference Proceedings 6,
Bar-Ilan University, January 1991], published by
the American Math. Society, 1993.
Note also the issue of oscillation. If f, g have growth rates
from the two lists or otherwise, such that Sum f(n) converges and Sum g(n)
diverges, some f(n)(sin h(n))^2 + g(n)(cos h(n))^2 might well have growth
rate oscillating between the two in quite a tricky way.
@
@[... du Bois-Reymond growth-scale discussion snipped ...]
@
@J. Marshall Ash, "Neither a worst convergent series
@nor a best divergent series exists", The College
@Mathematics Journal 28 (1997), 296-297.
@
@Ash proves:
@
@(1) If Sum(a_n) is a convergent series with positive
@ terms, then there exists a convergent series
@ Sum(b_n) such that limit as n --> infinity
@ of (b_n)/(a_n) goes to infinity. In other words,
@ there exists a convergent series whose terms become
@ "infinitely larger" than the terms of Sum(a_n).
@
@(2) If Sum(c_n) is a divergent series of positive
@ terms, then there exists a divergent series
@ Sum(d_n) such that limit as n --> infinity
@ of (c_n)/(d_n) goes to 0. In other words, there
@ exists a divergent series whose terms become
@ "infinitely smaller" than the terms of Sum(c_n).
I haven't seen Ash's paper, but an easy proof is: let s_n =
c_1 + ... + c_n; if Sum c_n diverges, then Sum c_n/s_n also diverges.
Let t_n = a_n + a_{n+1} + ...; if Sum a_n converges, then Sum a_n/sqrt(t_n)
also converges.
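A quick Python illustration of both halves of that argument, with c_n = 1 and a_n = 1/n^2 as concrete choices (my choices, just for the demo):

```python
import math

def slower_divergent(N):
    # c_n = 1 gives s_n = n, so c_n/s_n = 1/n: still divergent
    s, total = 0.0, 0.0
    for n in range(1, N + 1):
        s += 1.0
        total += 1.0 / s
    return total

def larger_convergent(N):
    # a_n = 1/n^2; tails t_n start at pi^2/6, and a_n/sqrt(t_n) ~ n^(-3/2)
    tail, total = math.pi**2 / 6, 0.0
    for n in range(1, N + 1):
        a = 1.0 / n**2
        total += a / math.sqrt(tail)
        tail -= a          # t_{n+1} = t_n - a_n
    return total

print(slower_divergent(10**5), larger_convergent(10**5))
```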
I'm sure the spirit of Bill's question involves _absolute_
convergence; e.g. think about what goes into the conditional convergence
of the likes of Sum a_n/sin(xn*pi) for various a_n and (irrational!) x...
If we stay within the logarithmico-exponential functions, asympto-
tics as n -> inf are a routine matter (as an exercise, locate on Bill's
scale a_n = n^(n^(1/n)) - n - (log n)^2 /(log n)^5 ). If we also allow
explicit operations with n terms (just shy of Agnew's series above), it
gets a bit tougher; for an example that _can_ be handled, try
Sum[n=1...inf] (2 - e)(2 - e^(1/2))(2 - e^(1/3))...(2 - e^(1/n))
If we stray outside... well, some inventiveness is required even for non-
blatant (= monotonic) use of the sin function:
(sin 1)^2 + (sin sin 1)^2 + (sin sin sin 1)^2 + ...
(I won't spoil your fun; do it and you'll see what I mean).
Ilias