
Infinitesimals


Paul N

Feb 19, 2024, 8:52:00 AM
Hi all

Recently there was a discussion in comp.theory about infinitesimals. It seems I can't post to that group but can post to this one, so hopefully people will not mind too much and some kind person might even post a link there to my post here?

I wanted to point out that Ian Stewart had written an article called "Beyond the vanishing point" in which he discusses the strange situation in which doing calculus by using very small numbers and then treating these numbers as zero after you've divided through by them is not valid but nevertheless seems to work. Here is some of the article as a taster:
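To make that concrete, here is the textbook example (my own illustration, not from the article). To differentiate f(x) = x^2, take an "infinitely small" increment \varepsilon and form the difference quotient:

\frac{(x+\varepsilon)^2 - x^2}{\varepsilon} = \frac{2x\varepsilon + \varepsilon^2}{\varepsilon} = 2x + \varepsilon

Having just divided by \varepsilon, which is only allowed if \varepsilon is not zero, one then discards it as though it were zero and declares the derivative to be 2x. That final step is the one Berkeley mocked, famously calling such vanished increments the "ghosts of departed quantities".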


As far as we know, the first people to ask questions about the proper use of logic
were the ancient Greeks, although their work is flawed by modern standards. And
in about 500BC the philosopher Zeno of Elea invented four famous paradoxes to show
that infinity was a dangerous weapon, liable to blow up in its user’s hands. Even so,
the use of "infinitesimal" arguments was widespread in the sixteenth and seventeenth
centuries, and formed the basis of many presentations of (for example) the calculus.
Indeed it was often called "Infinitesimal Calculus". The logical inconsistencies involved
were pointed out forcibly by Bishop Berkeley in a 104-page pamphlet of 1734 called
The Analyst: A Discourse Addressed to an Infidel Mathematician. The trouble was,
calculus was so useful that nobody took much notice. But, as the eighteenth century
wore on, it became increasingly difficult to paper over the logical cracks. By the middle
of the nineteenth century, a number of mathematicians, including Augustin-Louis Cauchy, Bernard
Bolzano and Karl Weierstraß, had found ways to eliminate the use of infinities and
infinitesimals from the calculus.

The use of infinitesimals by mathematicians rapidly became "bad form", and
university students were taught rigorous analysis, involving virtuoso manipulations
of complicated expressions in the Greek letters epsilon and delta imposed by
the traditional definitions. There is even a colloquial term for the process: epsilontics.
Despite this, generations of students in Engineering departments cheerfully used the
outdated infinitesimals; and while the occasional bridge has been known to fall down,
nobody to my knowledge has ever traced such a disaster to illogical use of infinitesimals.

In other words, infinitesimals may be wrong - but they work. Indeed, in the
hands of an experienced practitioner, who can skate carefully round the thin ice, they
work very well indeed. Although the lessons of this circumstance have been learned
repeatedly in the history of science, it took mathematicians a remarkably long time to
see the obvious: that there must be a reason why they work; and if that reason can
be found, and formulated in impeccable logic, then the mathematicians could use the
"easy" infinitesimal arguments too!

It took them a long time because it’s very hard to get right. It relies on some deep
ideas from mathematical logic that derive from work in the 1930s. The resulting theory
is called Nonstandard Analysis, and is the creation of Abraham Robinson. It allows the
user to throw real infinities and infinitesimals around with gay abandon. Despite these
advantages, it has yet to displace orthodox epsilontics, for two main reasons:

* The necessary background in mathematical logic is difficult and, except for this
one application, relatively remote from the mathematical mainstream.
* By its very nature, any result that can be proved by nonstandard analysis can also
be proved by epsilontics: it’s just that the nonstandard proof is usually simpler.
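
To give a flavour of the contrast Stewart is drawing, here is a quick sketch of my own (not from the article). In epsilontics, the claim f'(x) = L unpacks to

\forall \varepsilon > 0 \; \exists \delta > 0 : \; 0 < |h| < \delta \implies \left| \frac{f(x+h) - f(x)}{h} - L \right| < \varepsilon

and every proof becomes a game of producing a suitable \delta for each \varepsilon. In Robinson's nonstandard analysis the hyperreal numbers contain genuine nonzero infinitesimals, and the derivative is obtained by taking the standard part (the nearest ordinary real) of the difference quotient:

f'(x) = \operatorname{st}\left( \frac{f(x+\varepsilon) - f(x)}{\varepsilon} \right) \quad \text{for any nonzero infinitesimal } \varepsilon

For f(x) = x^2 this is \operatorname{st}(2x + \varepsilon) = 2x, so the dubious "now pretend \varepsilon is zero" step becomes the perfectly legitimate operation of discarding an infinitesimal remainder.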

You can get the article (published in "Eureka") by going to https://www.archim.org.uk/eureka/archive/index.html and downloading Issue 50 - April 1990. Enjoy!

Paul.

root

Feb 19, 2024, 11:41:34 AM
Paul N <gw7...@aol.com> wrote:
> Hi all
>
> Recently there was a discussion in comp.theory about infinitesimals. It seems
> I can't post to that group but can post to this one, so hopefully people will
> not mind too much and some kind person might even post a link there to my post
> here?
>
> I wanted to point out that Ian Stewart had written an article called
> "Beyond the vanishing point" in which he discusses the strange situation in
> which doing calculus by using very small numbers and then treating these numbers
> as zero after you've divided through by them is not valid but nevertheless seems
> to work. Here is some of the article as a taster:
>

I respect Ian Stewart because of his books, but here he is being an alarmist.

Continuous functions are defined as those for which infinitesimal analysis
works. Calculus applies to such functions.
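
Put in terms of infinitesimals (roughly speaking): f is continuous at an ordinary point a exactly when

f(a + \varepsilon) \approx f(a) \quad \text{for every infinitesimal } \varepsilon

i.e. an infinitesimal change in the input produces at most an infinitesimal change in the output. That is the setting in which the infinitesimal manipulations are safe.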