
What a waste.


Ben Golub

Jul 14, 2002, 8:47:52 PM
World class trolling.

Ben

James Harris <jst...@msn.com> wrote in message
news:uNdF#b5KCHA.1496@cpimsnntpa03...
> Being someone with a certain perspective based on years of study, both in
> college and on my own, with a certain level of accomplishment under my belt,
> I can only say that the continual ignoring of my work by both the scientific
> and mathematical establishments is a very bad sign.
>
> I'm sure at least some of you know about the Riemann Hypothesis.
>
> Well I posted a partial differential equation which defines the pi function
> which is what the Riemann Hypothesis is about. Yeah, it was a casual thing
> for me, one small step you might say.
>
> That information has been out for over a *month* folks, and the people I'm
> sure many of you respect either still don't have a clue, or they're holding
> their breath.
>
> The sad fact is that there are more signs than my personal travails that the
> system may, in fact, be broken.
>
> Theoretical physics is a mess, especially cosmological physics, where
> basically they don't have a clue. There are theories all over the place,
> but who knows which is right? Worse, when important work like the recent
> paper about "gravastars" comes out, the physics establishment chooses to let
> it slide by? Why? Because people like Hawking are celebrities and they
> don't want to admit he might be even a little bit wrong.
>
> Come on people, it's not even that bad, and most of Hawking's work should
> still be valid. So what if black holes aren't actually singularities?
>
> Isn't the TRUTH what's important?
>
> The real problem is that we're still having battles over continuous versus
> discrete when it should be over already.
>
> Space is not continuous. Period.
>
> Let me repeat--space is not continuous.
>
> And in mathematics, number theory is a shambles. Near as I've been able to
> determine, number theory hasn't advanced much since Dedekind. Um folks,
> what I'm saying is that number theory has stagnated for over a hundred
> years.
>
> That's actually why there was all the hullabahoo over algebraic integers.
>
> You see, Dedekind came up with algebraic integers, following up on Gauss'
> "gaussian integers", and mathematicians didn't make the obvious extension.
> That is, they failed to do the full abstraction.
>
> I can see no reason for that failure. It's actually kind of bizarre.
>
> What's bad though is now they're just lying to cover themselves, if they
> even talk about the problem. The disarray of number theory is actually more
> of a concern to me than the problems in physics *because* space is
> continuous.
>
> Basically, apparently, we're falling behind at an ever increasing rate,
> while our scientists get increasingly muddled and driven more by politics
> than science.
>
> I'd hoped to find some ray of hope on these newsgroups, but instead I find a
> lot of people who are vicious to an extent that's hard to believe until you
> see it in person, as I have, who don't seem to know jackshit.
>
> I'm beginning to think there's no hope anywhere.
>
> If so, there's no one to solve the problems we're facing.
>
> These are dark times, and it looks like they're darkening. From where I
> sit, we're in some kind of Dark Ages in both math and physics.
>
> The only really bright spots in the sciences are material's science,
> biology, evolutionary psychology, and object oriented computer programming
> theory and practice.
>
> That's the good news, but it's not good enough to keep us from having a
> miserable future.
>
> Folks, it can start getting bad and just never get better within the
> lifetime of our species. But humanity won't die quickly in terms of its own
> perception. My guess is that it could take two to three hundred years for
> it all to play out, before the last human being finally gasps their last
> breath.
>
> But as far as the Universe is concerned, that's tomorrow and not even a day.
>
>
> James Harris
>
>



Peter John Lawton

Jul 14, 2002, 8:20:01 PM
James Harris wrote:

> > of a concern to me than the problems in physics *because* space is
> > continuous.
>

> Damn. Damn. Damn.
>
> I had such high hopes for this damn post and I make a damn mistake.
>
> It's because space is NOT continuous.
>
> Damn. Damn. Damn.

Well I still think it's a very interesting and very clearly expressed
post, in spite of that small slip - and you did stress your opinion
earlier very clearly.
What hopes do you have for the post, I wonder.
I regret I have no knowledge of the Riemann hypothesis, but I would like
to know why you say that space is not continuous - perhaps you could
elaborate on that point while awaiting more informed comments on Riemann
and the pi function.

Peter Lawton

LarryLard

Jul 14, 2002, 9:17:59 PM
James Harris wrote:
[snip]

> Folks, it can start getting bad and just never get better within the
> lifetime of our species. But humanity won't die quickly in terms of
> its own perception. My guess is that it could take two to three
> hundred years for it all to play out, before the last human being
> finally gasps their last breath.

Predictions about what will happen in two to three hundred years are cheap.
Why? Because no one can call you on them.

I was gonna post a point-by-point snipe, but on reflection I think it's
better just to say this: I think the time has come to reclassify James from
'crank' to 'kook'. At least when he posts to sci.physics.

--
Larry Lard
Replies to group please.

Santiago Canez

Jul 14, 2002, 10:22:21 PM
> Because people like Hawking are celebrities and they don't want to
> admit he might be even a little bit wrong.

You do realize that Hawking has lost bets where others have shown him
to be wrong right? You also do realize that he himself has admitted to
being wrong right?

Steve Leibel

Jul 14, 2002, 10:25:01 PM
In article <uNdF#b5KCHA.1496@cpimsnntpa03>,
"James Harris" <jst...@msn.com> wrote:

> Being someone with a certain perspective based on years of study, both in
> college and on my own, with a certain level of accomplishment under my belt,
> I can only say that the continual ignoring of my work by both the scientific
> and mathematical establishments is a very bad sign.
>

Why are you complaining to us? We're all in on the conspiracy. We know
you really proved FLT, but we are going to suppress that information at
all costs. So stop telling us about it.

Steve Leibel

Jul 14, 2002, 10:28:29 PM
In article <uNdF#b5KCHA.1496@cpimsnntpa03>,
"James Harris" <jst...@msn.com> wrote:

>
> I'd hoped to find some ray of hope on these newsgroups, but instead I find a
> lot of people who are vicious to an extent that's hard to believe until you
> see it in person, as I have, who don't seem to know jackshit.
>

I know Jack Shit, he used to attend the same International Math
Conspiracy meetings I did.

> I'm beginning to think there's no hope anywhere.
>


Isn't that the punchline of the story of Pandora's box? That there's
always hope?

> If so, there's no one to solve the problems we're facing.
>

But wait! There's JSH. He will lead us out of the darkness!!


>
> But as far as the Universe is concerned, that's tomorrow and not even a day.
>
>

You are a sick puppy.

Dr Arm®

Jul 14, 2002, 10:44:58 PM

For instance...?

Ben Golub

Jul 14, 2002, 11:11:38 PM
> > You do realize that Hawking has lost bets where others have shown him
> > to be wrong right? You also do realize that he himself has admitted to
> > being wrong right?
>
> For instance...?

Go look on Google before wasting the newsgroups' time with such trivial questions.

Two separate bets that Hawking lost:
----------

AUTHOR: Johnson, George, 1952-
TITLE: What a physicist finds obscene. (S. Hawking loses bet over existence of
naked singularities) SOURCE: New York Times (Late New York Edition) (Feb. 16 '97)
p. 4 (Sec 4) il.
STANDARD NO: 0362-4331
DATE: 1997
RECORD TYPE: art
CONTENTS: feature article
ABSTRACT: Renowned physicist Stephen Hawking had bet with colleagues that a naked
singularity, one not shielded by a black hole, was impossible. A singularity is a
tear in the fabric of time and space such that the laws of physics break down and
anything is possible. Hawking was recently forced to concede the bet when a
computer simulation by Matthew Choptuik of the University of Texas in Austin
indicated that under very rare and contrived circumstances, a black hole might
collapse in a manner that would expose its singularity.
SUBJECT:
Hawking, S. W. (Stephen W.)
Thorne, Kip S.
Preskill, John P.
Black holes (Astronomy).
Naked singularities (Astrophysics).

--------

Hawking made an insurance bet with one of John Wheeler's students, that black
holes would not exist - so that if his life's work turned out to be wasted, at
least he would have some recompense.

Astronomers began to search for black holes. A Russian physicist, Yakov Zeldovich,
had realised that if two stars are orbiting each other and one becomes a black
hole, then the remaining star will seem to orbit nothing. By the speed of the
orbiting star, scientists can work out how big the invisible object could be.
Several objects have been found which seem to satisfy this description - so much
so, that Stephen Hawking is convinced and has settled his bet.
http://www.psyclops.com/hawking/shu/shu5.html

Sam Wormley

Jul 14, 2002, 11:39:47 PM
James Harris wrote:
>
> Being someone with a certain perspective based on years of study, both in
> college and on my own, with a certain level of accomplishment under my belt,
> I can only say that the continual ignoring of my work by both the scientific
> and mathematical establishments is a very bad sign.
>

Either the mathematical establishment doesn't see your work because it isn't
disseminated (published) or, more likely, it is just plain wrong and would be
rejected for publication because you are unwilling to correct errors.

David Kastrup

Jul 15, 2002, 4:21:07 AM
"James Harris" <jst...@msn.com> writes:

> And in mathematics, number theory is a shambles. Near as I've been
> able to determine, number theory hasn't advanced much since
> Dedekind. Um folks, what I'm saying is that number theory has
> stagnated for over a hundred years.

Since you proudly admit to refusing to read even introductory number
theory texts, how would you know? You can't simultaneously blindfold
yourself and then clamor about how people can tolerate the darkness
they are living in.

--
David Kastrup, Kriemhildstr. 15, 44793 Bochum
Email: David....@t-online.de

Nico Benschop

Jul 15, 2002, 5:19:37 AM
James Harris wrote:
>
> [...]

> I'm beginning to think there's no hope anywhere.
> If so, there's no one to solve the problems we're facing.
> These are dark times, and it looks like they're darkening.
> From where I sit,

that's Atlanta ?

> we're in some kind of Dark Ages in both math and physics.

The END of the World? Where have I heard that before..?

> The only really bright spots in the sciences are material's science,
> biology, evolutionary psychology, and object oriented computer
> programming theory and practice. That's the good news, but it's
> not good enough to keep us from having a miserable future.

True. But look at it from the bright side, James: probably those
beautifully object-oriented programmed computers will survive us,
and _they_ will have a bright future, so what are you moping about?

> Folks, it can start getting bad and just never get better within
> the lifetime of our species. But humanity won't die quickly in
> terms of its own perception. My guess is that it could take

> two to three hundred years for it all to play out, ..[*]


> before the last human being finally gasps their last breath.

Re[*]: Good, no worries then, for us and you.
The last Jehovah's Witness I talked to was definite in his
judgement that it would be much sooner; but then, who knows..?
There've been many similar reports, the last 2000 years or so,
and all were refuted by "Time-will-tell", and Time did indeed tell
(until the next meteorite hits...;-(

> But as far as the Universe is concerned,

> that's tomorrow and not even a day. -- James Harris

-- NB

José Carlos Santos

Jul 15, 2002, 5:32:20 AM
Dr Arm® <Genuin...@Yahoo.com> wrote in message news:<3D3237...@Yahoo.com>...

> > You do realize that Hawking has lost bets where others have shown him
> > to be wrong right? You also do realize that he himself has admitted to
> > being wrong right?
>
> For instance...?

Just check Hawking's "The Universe in a Nutshell".

Best regards

Jose Carlos Santos

Chas Brown

Jul 15, 2002, 6:26:16 AM

Ben Golub wrote:
>
> James Harris <jst...@msn.com> wrote in message
> news:uNdF#b5KCHA.1496@cpimsnntpa03...

> > Being someone with a certain perspective ...

<snip>

> World class trolling.
>
> Ben
>

Absolutely. Just when you think he's tapped out, he bounces back with
another incredible performance.

> >You see, Dedekind came up with algebraic integers, following up on Gauss'
> >"gaussian integers", and mathematicians didn't make the obvious extension.

Maybe it would have helped if they'd called them "dedekindian integers".

Really, he's in a class by himself. Kudos, James, kudos.

Cheers -Chas

David C. Ullrich

Jul 15, 2002, 8:25:34 AM
On Sun, 14 Jul 2002 20:32:11 -0400, "James Harris" <jst...@msn.com>
wrote:

>Being someone with a certain perspective based on years of study, both in
>college and on my own, with a certain level of accomplishment under my belt,
>I can only say that the continual ignoring of my work by both the scientific
>and mathematical establishments is a very bad sign.
>

>I'm sure at least some of you know about the Riemann Hypothesis.
>
>Well I posted a partial differential equation which defines the pi function
>which is what the Riemann Hypothesis is about.

Where and when did you post this differential equation? I missed it
somehow. _What_ differential equation was it?

>Yeah, it was a casual thing
>for me, one small step you might say.
>
>That information has been out for over a *month* folks, and the people I'm
>sure many of you respect either still don't have a clue, or they're holding
>their breath.
>
>The sad fact is that there are more signs than my personal travails that the
>system may, in fact, be broken.
>
>Theoretical physics is a mess, especially cosmological physics, where
>basically they don't have a clue. There are theories all over the place,
>but who knows which is right? Worse, when important work like the recent
>paper about "gravastars" comes out, the physics establishment chooses to let
>it slide by? Why? Because people like Hawking are celebrities and they
>don't want to admit he might be even a little bit wrong.

Your ignorance of almost everything is truly astonishing. People argue
about things Hawking says all the time. (How do I know? It's not that
I'm a physicist, I've just read a few popular books by _Hawking_...)

>Come on people, it's not even that bad, and most of Hawking's work should
>still be valid. So what if black holes aren't actually singularities?
>
>Isn't the TRUTH what's important?
>
>The real problem is that we're still having battles over continuous versus
>discrete when it should be over already.

We are?

>Space is not continuous. Period.

How do you know this?

>Let me repeat--space is not continuous.

Oh - you prove it by repetition. Very good.

>And in mathematics, number theory is a shambles. Near as I've been able to
>determine, number theory hasn't advanced much since Dedekind. Um folks,
>what I'm saying is that number theory has stagnated for over a hundred
>years.

Let me repeat--your ignorance of almost everything is astonishing.

>That's actually why there was all the hullabahoo over algebraic integers.

What hullabloo? (You mean the hullabaloo here on sci.math? _That_
hullabaloo was caused by your bitterness about the fact that people
never told you about things that in fact people had been telling you
about for years.)

>You see, Dedekind came up with algebraic integers, following up on Gauss'
>"gaussian integers", and mathematicians didn't make the obvious extension.

They did? What _is_ the obvious extension, btw?

>That is, they failed to do the full abstraction.
>
>I can see no reason for that failure. It's actually kind of bizarre.
>
>What's bad though is now they're just lying to cover themselves, if they
>even talk about the problem. The disarray of number theory is actually more
>of a concern to me than the problems in physics *because* space is
>continuous.
>

>Basically, apparently, we're falling behind at an ever increasing rate,
>while our scientists get increasingly muddled and driven more by politics
>than science.
>

>I'd hoped to find some ray of hope on these newsgroups, but instead I find a
>lot of people who are vicious to an extent that's hard to believe until you
>see it in person, as I have, who don't seem to know jackshit.
>

>I'm beginning to think there's no hope anywhere.

If you're hoping for the things I suspect you're hoping for, then
in fact there _is_ no hope anywhere.

Let me repeat--there is no hope anywhere.

>If so, there's no one to solve the problems we're facing.
>
>These are dark times, and it looks like they're darkening. From where I
>sit, we're in some kind of Dark Ages in both math and physics.

Of course it looks that way from where you sit. That's because you sit
in a place where people _proudly_ refuse to learn any of the basic
material in areas they insist on being world-shattering experts in.
You insist you're the greatest mathematician on the planet. If
that were so then things would indeed be very dark. But luckily
that's not so.

>The only really bright spots in the sciences are material's science,
>biology, evolutionary psychology, and object oriented computer programming
>theory and practice.

No, there's another bright spot: You're actually not the greatest
mathematician on the planet.

>That's the good news, but it's not good enough to keep us from having a
>miserable future.
>

>Folks, it can start getting bad and just never get better within the
>lifetime of our species. But humanity won't die quickly in terms of its own
>perception. My guess is that it could take two to three hundred years for
>it all to play out, before the last human being finally gasps their last
>breath.

If only we'd acknowledge the greatness of your contributions. Then the
race would have eternal life. Jesus (as it were.)

>But as far as the Universe is concerned, that's tomorrow and not even a day.
>
>

>James Harris
>


David C. Ullrich

David C. Ullrich

Jul 15, 2002, 8:27:36 AM
On 15 Jul 2002 02:22:21 GMT, Santiago Canez <sca...@u.arizona.edu>
wrote:

You must be new here. James never lets details like facts distract
him.

David C. Ullrich

David C. Ullrich

Jul 15, 2002, 8:30:53 AM
On 15 Jul 2002 10:21:07 +0200, David Kastrup
<David....@t-online.de> wrote:

>"James Harris" <jst...@msn.com> writes:
>
>> And in mathematics, number theory is a shambles. Near as I've been
>> able to determine, number theory hasn't advanced much since
>> Dedekind. Um folks, what I'm saying is that number theory has
>> stagnated for over a hundred years.
>
>Since you proudly admit to refusing to read even introductory number
>theory texts, how would you know? You can't simultaneously blindfold
>yourself and then clamor about how people can tolerate the darkness
>they are living in.

He can't? Of course he can, he just did - you must have inadvertently
omitted a hypothesis or something...

>--
>David Kastrup, Kriemhildstr. 15, 44793 Bochum
>Email: David....@t-online.de


David C. Ullrich

David Kastrup

Jul 15, 2002, 8:48:51 AM
David C. Ullrich <ull...@math.okstate.edu> writes:

> On 15 Jul 2002 10:21:07 +0200, David Kastrup
> <David....@t-online.de> wrote:
>
> >"James Harris" <jst...@msn.com> writes:
> >
> >> And in mathematics, number theory is a shambles. Near as I've been
> >> able to determine, number theory hasn't advanced much since
> >> Dedekind. Um folks, what I'm saying is that number theory has
> >> stagnated for over a hundred years.
> >
> >Since you proudly admit to refusing to read even introductory number
> >theory texts, how would you know? You can't simultaneously blindfold
> >yourself and then clamor about how people can tolerate the darkness
> >they are living in.
>
> He can't? Of course he can, he just did - you must have inadvertently
> omitted a hypothesis or something...

A frequent mistake of mine. Please add "unless looking like a
complete moron is an option for you" to that last sentence.

David C. Ullrich

Jul 15, 2002, 10:11:02 AM
On 15 Jul 2002 14:48:51 +0200, David Kastrup
<David....@t-online.de> wrote:

>David C. Ullrich <ull...@math.okstate.edu> writes:
>
>> On 15 Jul 2002 10:21:07 +0200, David Kastrup
>> <David....@t-online.de> wrote:
>>
>> >"James Harris" <jst...@msn.com> writes:
>> >
>> >> And in mathematics, number theory is a shambles. Near as I've been
>> >> able to determine, number theory hasn't advanced much since
>> >> Dedekind. Um folks, what I'm saying is that number theory has
>> >> stagnated for over a hundred years.
>> >
>> >Since you proudly admit to refusing to read even introductory number
>> >theory texts, how would you know? You can't simultaneously blindfold
>> >yourself and then clamor about how people can tolerate the darkness
>> >they are living in.
>>
>> He can't? Of course he can, he just did - you must have inadvertently
>> omitted a hypothesis or something...
>
>A frequent mistake of mine. Please add "unless looking like a
>complete moron is an option for you" to that last sentence.

Thanks for clarifying that. I had a few conjectures what the
missing bit was, but I didn't want to put words in your mouth.

>--
>David Kastrup, Kriemhildstr. 15, 44793 Bochum
>Email: David....@t-online.de


David C. Ullrich

JSH Info

Jul 15, 2002, 10:01:04 AM

For information about James Harris see:
http://www.geocities.com/williamrexmarshall/cranks/jsh1.html

SURGEON GENERAL'S WARNING:
"Try not to respond to James.
Together we can do it."

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
All you've succeeded in doing is _ignoring_ the many
explanations of why there are numerous gaping holes in
your non-proof.
-- The above is part of a post by David Ullrich to sci.math.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

A message similar to this one is automatically
posted to every thread in sci.math started by James.
The primary purpose of these posts is to give
newbies a pointer to some background information on him.
It appears that he starts a new thread whenever he considers
himself to be losing the argument in a previous one.
This is automatic post number 259.

New threads started by James in the last 7 days: 9 (1.3 threads/day)
New threads started by James in the last 30 days: 33 (1.1 threads/day)
New threads started by James in the last 90 days: 103 (1.1 threads/day)

James: Thank you for including JSH in the subject.

Some news readers allow the user to filter out
postings with a particular phrase in the subject.

--
JSH Info, version 1.0.0.235

Randy Poe

Jul 15, 2002, 10:19:16 AM
James Harris wrote:
>
> Being someone with a certain perspective based on years of study,

I thought you've said you never cracked an abstract algebra
book and have no intention of ever doing so.

> both in
> college and on my own, with a certain level of accomplishment under my belt,

I'm afraid of the answer, but... such as?

> Well I posted a partial differential equation which defines the pi function
> which is what the Riemann Hypothesis is about. Yeah, it was a casual thing
> for me, one small step you might say.

You've made many allusions as to what you think the "Riemann Hypothesis"
is about and that you've "solved" it. You've never actually stated
the Riemann Hypothesis or made the connection beyond the bald claim.

> Theoretical physics is a mess, especially cosmological physics, where
> basically they don't have a clue. There are theories all over the place,
> but who knows which is right?

Please cite specifics for this "all over the place" claim.

As to your question, are you familiar with the concept of "experiment"?

- Randy

Pythagoras Titanium

Jul 15, 2002, 12:49:00 PM
In article <3D3293A9...@chello.nl>, Nico Benschop wrote:
> True. But look at it from the bright side, James: probably those
> beautifully object-oriented programmed computers will survive us,
> and _they_ will have a bright future, so what are you moping about?

I (think I) know that you are joshing, but this presents an
opportunity for me to be serious. IMO, OO is a programming
methodology which has turned out to be surprisingly useful in many
instances (GUI toolkits for instance).

When I think about it, it seems clear that OO can be understood in
already familiar mathematical terms. For instance, a type with a
natural number and a string might be defined as a member of N x E*,
where E is an alphabet. Extensions of this type could be understood
as types of the form N x E* x t1 x t2, etc. Sets of functions should
be allowed as elements of types, like funcs(N,N) (functions from N
to N). You can go on to define "types with default initialisers";
for instance ((N x funcs(N,N)), (0, f:N->N where f(x)=x+1)) could be
an example.

My point? My point is that OO can be perfectly well described by
existing math, and ought to be viewed as a programming methodology,
not a math methodology.

OK, I admit that I'm not really qualified to talk about this
subject, so other posters are invited to clobber me or agree with
me now.

Steve Leibel

Jul 15, 2002, 1:36:15 PM
In article <slrnaj5sns.1...@d226-23-159.home.cgocable.net>,
Pythagoras Titanium <mrjw...@home.com> wrote:

> My point? My point is that OO can be perfectly well described by
> existing math, and ought to be viewed as a programming methodology,
> not a math methodology.
>

Of course OO is a programming methodology, and not even a particularly
new one; people have been playing with the concepts since the 60's. JSH
is an idiot who confuses buzzwords with knowledge.

Nico Benschop

Jul 16, 2002, 4:33:36 AM
Pythagoras Titanium wrote:
>
> In article <3D3293A9...@chello.nl>, Nico Benschop wrote:
> > True. But look at it from the bright side, James: probably those
> > beautifully object-oriented programmed computers will survive us,
> > and _they_ will have a bright future, so what are you moping about?
>
> I (think I) know that you are joshing,

Just kidding, really.

I've nothing against the Object Oriented approach to anything,
say in math or in CS. In fact my take is roughly that syntax
(of language, as in CS or in math) should be balanced with a similar
'amount' of semantics - the latter being the representation of
'variables' that occur in the syntax, which is more 'string like'.

Their balanced combination, in a two-level approach, is appealing
to me as an EE (in digital network analysis & synthesis 'on silicon')
And in CS, the 'representation' (semantics) part of math would be
taken care of by 'data-structure' (the class concept in C++ ?).

Trying to do 'pure syntax' - which pure mathematicians seem to
consider the ultimate goal (correct me if I'm wrong) - is, as
far as I'm concerned, throwing the baby out with the bathwater.
What I'd call the single-focus syndrome (= standing on one leg,
and trying to walk;-)

> but this presents an opportunity for me to be serious.
> IMO, OO is a programming methodology which has turned out to be
> surprisingly useful in many instances (GUI toolkits for instance).

> [...]


>
> My point? My point is that OO can be perfectly well described by
> existing math, and ought to be viewed as a programming methodology,
> not a math methodology.

As I tried to sketch above, it is - in general - a rather fundamental
and very useful approach in many contexts. A while ago (dec'98)
I even made a post in this NG on 'BOOA-constructor' :
(Boolean Object Oriented Algebra)
http://groups.google.com/groups?q=BOOA-constructor+group:sci.math.*&hl=en&lr=&ie=UTF-8&safe=off&selm=745m05%24g1e%241%40nnrp1.dejanews.com&rnum=1

> OK, I admitted that I'm not really qualified to talk about
> this subject, so other posters are invited to clobber me or
> agree with me now.

I appreciate your remarks 'in the sideline' (but then: aren't those
sidelines often most interesting, like conversations in the hallways,
or at lunch, at a conference..;-)

-- NB - http://home.iae.nl/users/benschop

Nico Benschop

Jul 16, 2002, 4:40:38 AM
Nico Benschop wrote:
>
> [...]

> As I tried to sketch above, it is - in general - a rather fundamental
> and very useful approach in many contexts. A while ago (dec'98)
> I even made a post in this NG on 'BOOA-constructor' :
> (Boolean Object Oriented Algebra)
corr: Balanced O O A

Andrew Taylor

Jul 16, 2002, 11:03:46 AM
David C. Ullrich <ull...@math.okstate.edu> wrote in message news:<83f5jug69t0v1v0ti...@4ax.com>...

> On Sun, 14 Jul 2002 20:32:11 -0400, "James Harris" <jst...@msn.com>
> wrote:
>
> >Being someone with a certain perspective based on years of study, both in
> >college and on my own, with a certain level of accomplishment under my belt,
> >I can only say that the continual ignoring of my work by both the scientific
> >and mathematical establishments is a very bad sign.
> >
> >I'm sure at least some of you know about the Riemann Hypothesis.
> >
> >Well I posted a partial differential equation which defines the pi function
> >which is what the Riemann Hypothesis is about.
>
> Where and when did you post this differential equation? I missed it
> somehow. _What_ differential equation was it?

It's a _partial_ differential equation... probably in the sense of
"not actually all there yet".

> [snipped the rest]

Andrew Taylor
Cambridge UK

David C. Ullrich

Jul 16, 2002, 11:28:23 AM
On 16 Jul 2002 08:03:46 -0700, andrew...@analysys.com (Andrew
Taylor) wrote:

>David C. Ullrich <ull...@math.okstate.edu> wrote in message news:<83f5jug69t0v1v0ti...@4ax.com>...
>> On Sun, 14 Jul 2002 20:32:11 -0400, "James Harris" <jst...@msn.com>
>> wrote:
>>

>> >[...]
>> >
>> >Well I posted a partial differential equation which defines the pi function
>> >which is what the Riemann Hypothesis is about.
>>
>> Where and when did you post this differential equation? I missed it
>> somehow. _What_ differential equation was it?
>
>It's a _partial_ differential equation... probably in the sense of
>"not actually all there yet".

Hmm. Ingenious theory, but James wouldn't make a claim like this
if it weren't so. At least I don't _think_ he would...

>> [snipped the rest]
>
>Andrew Taylor
>Cambridge UK


David C. Ullrich

George Johnson

Jul 16, 2002, 8:01:40 PM
"James Harris" <jst...@msn.com> wrote in message
news:uNdF#b5KCHA.1496@cpimsnntpa03...
| Being someone with a certain perspective based on years of study, both in
| college and on my own, with a certain level of accomplishment under my belt,
| I can only say that the continual ignoring of my work by both the scientific
| and mathematical establishments is a very bad sign.

I have a certain level of accomplishment below my belt (wink wink - come
over to this dark alley and I'll show it to you).

| I'm sure at least some of you know about the Riemann Hypothesis.

Incredulous gasps fill the audience ("The Riemann Hypothesis", they
whisper among themselves, "Get this man a Nobel Prize posthaste!")

| Well I posted a partial differential equation which defines the pi
| function which is what the Riemann Hypothesis is about. Yeah, it was a
| casual thing for me, one small step you might say.

Casual Pi Sundays, with Crazy Tie - Zeta Function Mondays, and
Comfortable Prime Number Shorts Tuesdays. It's the new mathematician Casual
Wear Attire.

| That information has been out for over a *month* folks, and the people I'm
| sure many of you respect either still don't have a clue, or they're
| holding their breath.

I don't have a clue and I'm holding my breath and I respect it.

| The sad fact is that there are more signs than my personal travails that
| the system may, in fact, be broken.

The mental care institutional system I ponder?

| Theoretical physics is a mess, especially cosmological physics, where
| basically they don't have a clue. There are theories all over the place,
| but who knows which is right? Worse, when important work like the recent
| paper about "gravastars" comes out, the physics establishment chooses to
| let it slide by? Why? Because people like Hawking are celebrities and
| they don't want to admit he might be even a little bit wrong.

I once knew a theoretical physicist, but I couldn't be certain. I only
had a theory about him. I do find many openings in the field of theoretical
automotive repair, theoretical dish washing, and theoretical Get Rich Quick
sales. Perhaps you should apply.

| Come on people, it's not even that bad, and most of Hawking's work should
| still be valid. So what if black holes aren't actually singularities?

I've seen many singular black holes (oh, I bet you mean not in the porn
industry).

| Isn't the TRUTH what's important?

I cannot handle the TRUTH. All those misconceptions, hyperbole, and
dogma tend to make it pretty slippery at first touch.

| The real problem is that we're still having battles over continuous versus
| discrete when it should be over already.

Ah, back to the porn industry again (choose a company that lists their
services as "Glove Cleaning" on your credit card statement. That is very
continually discrete when viewing moist black holes.)

| Space is not continuous. Period.

It comes in bite-sized pieces with delicious bits of peanut butter
spread within.

| Let me repeat--space is not continuous.

Because then it would be segmented into periodic sections which are
then, of course, contained in ???. The Etheric Continuum makes me sleepy.

| And in mathematics, number theory is a shambles. Near as I've been able
| to determine, number theory hasn't advanced much since Dedekind. Um folks,
| what I'm saying is that number theory has stagnated for over a hundred
| years.

This just in - FAUX NEWS REPORTS - NUMBER THEORY IN SHAMBLES - Sesame
Street's "The Count" arrested for numerous cases of fraud.

| That's actually why there was all the hullabaloo over algebraic integers.

This just in - FAUX NEWS REPORTS - RIOTS IN THE STREETS - Dramatic
Conflicts Over Algebraic Integers cause panic and confusion.

| You see, Dedekind came up with algebraic integers, following up on Gauss'
| "gaussian integers", and mathematicians didn't make the obvious extension.
| That is, they failed to do the full abstraction.

Ah, so I cannot do (2.5)*X^2 + (1.7774)*X + 3.444 with the Quadratic
Equation Solver I guess.
I sure would hate to know the roots of:
(-0.35548 plus 1.1185857006059034 i), (-0.35548 minus 1.1185857006059034 i)
http://www.1728.com/quadratc.htm
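As it happens, those roots are easy to verify without the web solver. A minimal Python sketch of the standard quadratic formula, using the coefficients quoted above (the function name is invented for illustration):

```python
import cmath

def quadratic_roots(a, b, c):
    """Return both roots of a*x^2 + b*x + c = 0, complex if needed."""
    disc = cmath.sqrt(b * b - 4 * a * c)
    return ((-b + disc) / (2 * a), (-b - disc) / (2 * a))

r1, r2 = quadratic_roots(2.5, 1.7774, 3.444)
# Real part is -b/(2a) = -1.7774/5 = -0.35548; the imaginary parts come
# out to about +/- 1.1185857, matching the values quoted above.
print(r1, r2)
```

So (2.5)*X^2 + (1.7774)*X + 3.444 does have the complex-conjugate roots George lists.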

| I can see no reason for that failure. It's actually kind of bizarre.

Yeah, things that actually work consistently are damn bizarre. I tend
to call them reliable.

| What's bad though is now they're just lying to cover themselves, if they
| even talk about the problem. The disarray of number theory is actually
| more of a concern to me than the problems in physics *because* space is
| continuous.

Thank goodness you're concerned. We desperately need a defender of
innumeracy to cloud the clear and simple established truths with
misconceptions and ignorance.
For those who want to laugh at poor James Harris you can visit:
http://innumeracy.com/ Not as amusing as the Darwin Awards yet still
informative.

| Basically, apparently, we're falling behind at an ever increasing rate,
| while our scientists get increasingly muddled and driven more by politics
| than science.

Poor muddled scientists! Watch as they cower in their formal
educations. Watch as they cringe in comfortable high-paying jobs serving
the computer, space, statistical fields, and research rooms. Sneer as they
ignore passionate fools who can barely formalize proofs, much less create
functional programs which utilize their conceptual designs.

| I'd hoped to find some ray of hope on these newsgroups, but instead I
| find a lot of people who are vicious to an extent that's hard to believe
| until you see it in person, as I have, who don't seem to know jackshit.

Are we projecting James? Are we describing emotions and conclusions
which you alone, James, are actually guilty of failing to acknowledge?
Perhaps we are not the cruel arbiters of poor judgement, but it is yourself
who wallows in ignorance and a gross failure to learn the necessary skills
to understand your own work.

| I'm beginning to think there's no hope anywhere.

Which would explain succinctly why you feel that your ideas are not
accepted BECAUSE EVERYBODY ELSE IS AN IDIOT. Whereas the simpler
explanation is obviously that you alone James Harris are the unteachable
fool.

| If so, there's no one to solve the problems we're facing.

Ah yes, the ever impending doom of the general population to grasp the
looming nightmare of the "continual ignoring of my work" and how the daily
"man on the street" will expire if not knowing "what if black holes aren't
actually singularities?"

| These are dark times, and it looks like they're darkening. From where I
| sit, we're in some kind of Dark Ages in both math and physics.

And only you can save us James. I fall at your feet in humble fealty in
the glare of your omnipresent genius. Now I light the firecracker I just
slid under your shoe.

| The only really bright spots in the sciences are materials science,
| biology, evolutionary psychology, and object oriented computer programming
| theory and practice.

Whatever would we have done without OOP? Oh yeah, create an entire
computer industry, a functional civilization, and create all that you
disdain.

| That's the good news, but it's not good enough to keep us from having a
| miserable future.

(Sarcastically) Oh yes James, tell us humble folk what we need to
learn.

| Folks, it can start getting bad and just never get better within the
| lifetime of our species. But humanity won't die quickly in terms of its
| own perception. My guess is that it could take two to three hundred years
| for it all to play out, before the last human being finally gasps their
| last breath.

I figure humanity tends to die on a more individualistic basis, but that
is just those damn actuarial tables talking.

| But as far as the Universe is concerned, that's tomorrow and not even a
| day.

So save your bottom dollar. God knows I don't want that smelly buck.

| James Harris



Virgil

unread,
Jul 16, 2002, 8:59:31 PM7/16/02
to
In article <#bP$ejSLCHA.1336@cpimsnntpa03>,
"James Harris" <jst...@msn.com> wrote:

> My experience indicates that the people who post on this newsgroup are about
> at the level of a 10 year old in the year 2060, and at least the ten year
> olds of that period are a lot more polite because they should be a tad bit
> smarter!

And our experience is that "James Harris" <jst...@msn.com> has the
emotional maturity of a 10 year old of 1960, though a lot less
polite because he is a tad bit dumber.

Ben Golub

unread,
Jul 16, 2002, 10:57:00 PM7/16/02
to
Just wondering, Mr. Harris claims that nobody has shown an error in his last bogus
"poof" located at http://groups.msn.com/ProofofFermatsLastTheorem/yourwebpage.msnw

Has somebody torn this up yet or is that a task that is still available for the
taking?

Ben

Nico Benschop

unread,
Jul 17, 2002, 3:27:30 AM7/17/02
to
George Johnson wrote a masterpiece that made my day,
in which I discovered (I think) only one typo [*]:
>
> "James Harris" <jst...@msn.com> wrote a lot of doomsday stuff:
> [...]

> | These are dark times, and it looks like they're darkening.
> | From where I sit, we're in some kind of Dark Ages in both
> | math and physics.
>
> And only you can save us James. ..[*]

Shouldn't that be:
And only you can save James. -- NB

George Johnson

unread,
Jul 17, 2002, 9:11:20 AM7/17/02
to
"Nico Benschop" <n.ben...@chello.nl> wrote in message
news:3D351C62...@chello.nl...

Oh, I think it may be too late. Perhaps if he was introduced to the
element Lithium (for treating schizophrenia).

Otherwise the situation is:
* Poor James sees himself as the revolutionary storming the gates of the
haughty slow-to-change altars of science and mathematics.

* While we, the knowledgeable folks with a basic yet complete math
education (something James is obviously lacking), tend to see James as a
vagrant pounding furiously on the door of the Porta-Potty where he sleeps to
evict those that dare intrude into his squatter's turf.

There will be no common ground until poor James seeks a fuller and much
more complete education on basic mathematics truths. He lacks many of the
basic skills which he strangely rails so vigorously against. I won't even
get into the appalling lack of basic social skills James has never begun to
grasp.

So to sum it up. He appears to have been home schooled by parents who were
dumb as fenceposts. I will also add that James had some damn unsocial
parents too, as his ability to communicate is below the boastful bullshit
commonly spat out in a drum loop of current rap music.

AHA! Now there is a concept! Perhaps we ought to get James into doing
"Algebra Rap" or "Beginning Number Theory Rap". He has the attitude, it is
a crying shame that he presents his work as a boastful moron that cannot
explain how he achieved his conclusions (remember kids - The teacher
appreciates you showing them your thought process on paper so if something
is off they can help you understand where you goofed). The big downside on
this of course is that poor James would be leading a legion of spoiled rude
children into shameless math ignorance along the lines that Rush Limbaugh
has led a generation of politically motivated teens into the "Seduction of
the Ignorant" that allowed an ass like Newt Gingrich to rise to power.


David C. Ullrich

unread,
Jul 17, 2002, 10:34:54 AM7/17/02
to
On Tue, 16 Jul 2002 22:57:00 -0400, "Ben Golub" <b...@nerc.com> wrote:

>Just wondering, Mr. Harris claims that nobody has shown an error in his last bogus
>"poof" located at http://groups.msn.com/ProofofFermatsLastTheorem/yourwebpage.msnw
>
>Has somebody torn this up yet or is that a task that is still available for the
>taking?

There's no question that there are problems he hasn't addressed. I
haven't followed the details, but I've watched the threads: It seems
that a key step is that exactly two of the a's (there are three a's
in all) are divisible by f^{something}, and he gives _no_
justification for this statement, he just asserts that it's so.

People have given counterexamples showing that the things he says
he knows about the a's simply do _not_ imply what he says. He
disputes the relevance of the counterexamples. Of course it
becomes clear after a point that the only counterexample he's
going to agree is relevant is a counterexample to FLT itself,
which of course does not exist. But (not surprisingly if you've
watched him over the years) he's never acknowledged the fact
that even if the counterexamples _were_ irrelevant (the experts
find them extremely relevant) they would _still_ show that he
needs to _prove_ that exactly two of the a's have a factor of
whatever it is - he's never given any explanation for why this
is so, he just continues to assert it.

Which puts his proof of FLT in the same class as _my_ proof
of FLT, which reads as follows:

Proof: Suppose that x, y, z are positive integers and n > 2
is an integer. Then x^n + y^n does not equal z^n. QED.

_Every_ statement in that proof is _true_. Nonetheless it's
not much of a proof of FLT. Which shows that to show a
proof is incorrect it's not required to show that one
of the statements in the proof is actually false - he's
never acknowledged this fact about what is and what is
not a proof, he won't admit his proof is wrong until
someone shows that a statement in it is actually false.

(And of course when people _do_ show that some statement
is false that doesn't do it either, because he simply
calls them morons and denies the truth of the matter.)

>Ben


David C. Ullrich

Arturo Magidin

unread,
Jul 17, 2002, 10:45:31 AM7/17/02
to
In article <ojvajucen6dei6p3v...@4ax.com>,

David C. Ullrich <ull...@math.okstate.edu> wrote:

>There's no question that there are problems he hasn't addressed. I
>haven't followed the details, but I've watched the threads: It seems
>that a key step is that exactly two of the a's (there are three a's
>in all) are divisible by f^{something}, and he gives _no_
>justification for this statement, he just asserts that it's so.
>
>People have given counterexamples showing that the things he says
>he knows about the a's simply do _not_ imply what he says. He
>disputes the relevance of the counterexamples.

Actually, it went pretty much beyond that. Rupert, Jan, and I have
given different arguments to SHOW that the statement is necessarily
false whenever the original polynomial is irreducible. In that case,
given any rational power of an integer, i^j (i an integer, j a
rational), it must either divide ALL the a's, or NONE of them; it
cannot divide exactly two of them. This is true for every value of n;
in the case of n=3, the polynomial is irreducible whenever James' v is
positive, so that's that. We even gave a specific case in which we
showed none of the a's had the property (you'll recall, that was what
set up that whole "if x is a root of a primitive nonmonic irreducible
polynomial with integer coefficients then it is not an algebraic
integer" thing); it sent him spinning for a few days before he decided
it must be wrong for unspecified reasons.


======================================================================
"It's not denial. I'm just very selective about
what I accept as reality."
--- Calvin ("Calvin and Hobbes")
======================================================================

Arturo Magidin
mag...@math.berkeley.edu

Randy Poe

unread,
Jul 17, 2002, 11:15:10 AM7/17/02
to
Ben Golub wrote:
>
> Just wondering, Mr. Harris claims that nobody has shown an error in his last bogus
> "poof" located at http://groups.msn.com/ProofofFermatsLastTheorem/yourwebpage.msnw
>
> Has somebody torn this up yet or is that a task that is still available for the
> taking?

Feel free. Be aware of the "discussion" cycle.

New poster: James, you have an error on line 31.
JSH: Well, it's obvious you don't know any mathematics, and you're
one more person I can ignore.
New poster: (explicates error)
James: Well, there are no objections to my proof anymore.

- Randy

Pythagoras Titanium

unread,
Jul 17, 2002, 1:10:41 PM7/17/02
to
In article <#bP$ejSLCHA.1336@cpimsnntpa03>, James Harris wrote:
>
> "Pythagoras Titanium" <mrjw...@home.com> wrote in message
> news:slrnaj5sns.1...@d226-23-159.home.cgocable.net...

>> In article <3D3293A9...@chello.nl>, Nico Benschop wrote:
>> > True. But look at it from the bright side, James: probably those
>> > beautifully object-oriented programmed computers will survive us,
>> > and _they_ will have a bright future, so what are you moping about?
>>
>> I (think I) know that you are joshing, but this presents an
>> opportunity for me to be serious. IMO, OO is a programming
>> methodology which has turned out to be surprisingly useful in many
>> instances (GUI toolkits for instance).
>
> Object oriented programming is merely a useful abstraction, like ALL of math
> and physics.
>
> These people get more excited about the hype than the substance, which
> simply tells you about the level of sophistication of their neural nets.

>
> My experience indicates that the people who post on this newsgroup are about
> at the level of a 10 year old in the year 2060, and at least the ten year
> olds of that period are a lot more polite because they should be a tad bit
> smarter!
>
>> When I think about it it seems clear that OO can be understood in
>> already understood mathematical terms. For instance a type
>> with a natural number and a string, might be defined as a member
>> of N x E*, where E is an alphabet. Extensions of this type
>> could be understood as types of the form N x E* x t1 x t2 etc.
>> Sets of functions should be allowed as elements of types.
>> like funcs(N,N) (functions from N to N). You can go on
>> to define "types with default initialisers", for insance
>> ((Nxfuncs(N,N)),(0,f:N->N where f(x)=x+1)) could be an example.
>
> Everything that is knowledge has a basis in mathematics. Well in
> Mathematics I should say.
>
> From what I've gathered, what goes for mathematics in the world today is a
> lot of abstruse nonsense where much simpler, shorter work can do far, far
> more.
>
> Remember, I showed you the *short* FLT proof, which you 1st generation types
> claim to have proven with a bit of work that's at least a thousand times
> longer (and probably wrong).
>
> Let me give you a statistic--99.99% of the mathematics produced over the
> last fifty years will never be referenced, looked at, or in any way
> considered again. Just lots of wasted time wasting away in moldering
> journals of no use to anyone.

>
>> My point? My point is that OO can be perfectly well described by
>> existing math,
>> and ought to be viewed as a programming methodology, not a math
>> methodology.
>
> Sounds a bit contradictory, but hey, I think you should get VERY excited
> about OO. It could save your life.

>
>> OK, I admitted that I'm not really qualified to talk about this
>> subject, so other posters are invited to clobber me or
>> agree with me now.
>
> And that's what's truly sad about these newsgroups is that you feel the need
> to say that.
>
> In fact, it's the reason why the newsgroups can't be used for my full plans,
> and I'll just have to take them for what they are useful for--a dumping
> ground where I fiddle with ideas and rough drafts of things I'm working on.
>
> Now my suspicion is that you're already so beaten down by half-truths and
> posturing by apes who actually barely know anything at all, that you're
> already lost. Oh well, evolution will win in the end, though I suspect it
> will be with the replacement of our species by one that's, well, more
> LOGICAL.
>
>
> James Harris
>
>

Pythagoras Titanium

unread,
Jul 17, 2002, 1:58:30 PM7/17/02
to
In article <#bP$ejSLCHA.1336@cpimsnntpa03>, James Harris wrote:
>
> These people get more excited about the hype than the substance, which
> simply tells you about the level of sophistication of their neural nets.
>
Well this characterises my suspicion of you and OO. What ideas from
OO programming would you like to transport into math methodology,
and what would be the point, even if you succeeded to some extent?
I was trying to explain in my last post that set theory could be used to make
a complete description of any theory of OO, but what does OO
have to say about set theory? Is a set like a "class" with a bunch of default
initialisations (for functions and variables)? Generally, no,
although you could define certain sets to describe exactly those things,
to any level of formalisation necessary.

There are some "intuitive" linkages between OO and some math ideas.
For instance when you think of rationals and integers you might
think of the rationals as extending the integers in the same way that
a subclass extends a class. But that's not really what happens
because the rationals are not that kind of extension. (See the sci.
math faq for exact set theoretic definitions of these sets).
An extension in OO is a supersequence of types (with
possibly new default instances (they call this "overriding")).
Mathematicians should correct me because I'm not a mathematician,
but I would guess that an extension in math is a set such
that there is an embedding function between the two such that
corresponding operations between the original set and the embedded
subset of the new set are isomorphic. You see? different concepts,
with a superficial veneer of similarity.
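To make that contrast concrete, here is a small, hypothetical Python sketch (the class and function names are invented for illustration): the OO-style extension literally reuses the integer type, while the math-style extension builds a new kind of object and merely carries the integers over by an embedding map.

```python
from fractions import Fraction

# OO-style "extension": the subclass *is* an int, inheriting its
# representation and all of its operations.
class TaggedInt(int):
    def describe(self):
        return f"integer {int(self)}"

# Math-style extension: the rationals are a separate construction;
# the integers are carried into them by the embedding n -> n/1.
def embed(n):
    return Fraction(n, 1)

x = TaggedInt(3)   # still an int underneath
q = embed(3)       # a new object that merely *represents* 3
# The embedding respects the operations on the original set:
assert embed(2 + 3) == embed(2) + embed(3)
```

Same number 3 in both cases, but the relationship between old and new types is quite different, which is the point being made above.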

>
> Everything that is knowledge has a basis in mathematics. Well in
> Mathematics I should say.

People have argued
endlessly about whatever knowledge is in the past. I take the
view that its something almost ineffable so you can't
really talk about it. You get into that "how do you know that you know?"
infinite loop, and then of course someone rudely asks "what
do you mean by 'you'? anyway", and it all ends without any convincing
resolution.


>
> Sounds a bit contradictory, but hey, I think you should get VERY excited
> about OO. It could save your life.

It might be exciting in some cases. I did say that OO turned out
to be "surprisingly useful". That's a personal opinion. I used
to scoff at OO until I started programming with OO myself,
and I was actually surprised that it really made some programming tasks
a lot easier. It seems to me that OO is analogous to the black box
idea where you always have a common set of buttons and dials on the box
but each box actually does a slightly different thing. OO is useful
when you can successfully divide work into "similar" black boxes,
which are similar enough in function
that they can have similar control sets. If the work
cannot be so divided, then OO won't help.
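That black-box picture can be sketched in a few lines of Python (hypothetical classes, invented for illustration): every box exposes the same control set, but each one does a slightly different thing.

```python
class BlackBox:
    """Common control set: every box exposes press()."""
    def press(self, x):
        raise NotImplementedError

class Doubler(BlackBox):
    def press(self, x):
        return 2 * x      # this box doubles its input

class Squarer(BlackBox):
    def press(self, x):
        return x * x      # same dial, different behaviour

# Code that only knows the common controls works with any box:
boxes = [Doubler(), Squarer()]
results = [box.press(5) for box in boxes]
print(results)   # [10, 25]
```

The caller never needs to know which box it holds, which is exactly the "similar control sets" condition described above.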

>
>> OK, I admitted that I'm not really qualified to talk about this
>> subject, so other posters are invited to clobber me or
>> agree with me now.
>

> And that's what's truly sad about these newsgroups is that you feel the need
> to say that.

Well, I am aware that there is a field called "object oriented mathematics",
(someone once directed you to a web site)
and I am aware that some CS types are interested in things called proof
objects, and I haven't studied any of this personally, which is why
I felt that I shouldn't talk with maximal confidence about this subject.

Nico Benschop

unread,
Jul 18, 2002, 5:36:40 AM7/18/02
to
Pythagoras Titanium wrote:
>
> In article <#bP$ejSLCHA.1336@cpimsnntpa03>, James Harris wrote:
> >
> > These people get more excited about the hype than the substance,
> > which simply tells you about the level of sophistication of their
> > neural nets.
> >
> Well this characterises my suspicion of you and OO. What ideas from
> OO programming would you like to transport into math methodology,
> and what would be the point, even if you succeeded to some extent?
> I was trying to explain in my last post that set theory could be
> used to make a complete description of any theory of OO, but what
> does OO have to say about set theory? Is a set like a "class"
> with a bunch of default initialisations (for functions and
> variables)? Generally, no, although you could define certain sets
> to describe exactly those things, to any level of formalisation
> necessary.
> There are some "intuitive" linkages between OO and some math ideas.
> [...]

My understanding of why OOP (Object Oriented Programming) was introduced
into CS language, beyond the early concept of
'subroutine' (which allows hierarchy to control complexity),
is that it explicitly includes,
along with the (I/O) function of a subroutine S, _also_ its
'context' in terms of internal variables, thus the complete
internal state of all memory used by S. [*]

And _that_ makes a difference: blocking unexpected 'side effects',
which are due to unobserved (possibly changed) internal states
(=memory).

In other words, the 'class' concept goes beyond 'subroutine' in
the sense that subfunctions are _combined_ with their 'support'
(data-structure, internal state, context, or whatever you want
to call it).
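A minimal sketch of that difference, in hypothetical Python (the names are invented for illustration): the bare subroutine keeps its state in shared memory, where side effects between callers are possible, while the class bundles the function with its own internal state.

```python
# Subroutine style: internal state lives in a module-level variable,
# so two callers can interfere with each other.
_total = 0
def add(x):
    global _total
    _total += x
    return _total

# OO style: each Accumulator carries its own state, so separate
# instances cannot produce unexpected side effects on one another.
class Accumulator:
    def __init__(self):
        self._total = 0
    def add(self, x):
        self._total += x
        return self._total

a, b = Accumulator(), Accumulator()
a.add(10)
print(add(10), add(5))   # shared state: prints 10 15
print(b.add(5))          # b is unaffected by a: prints 5
```

The object is the subfunction _combined_ with its support, in the sense described above.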

In mathematics, IMO, this would mean to include with the 'syntax'
(of strings of variables and operators) _also_ their 'context' -
in the sense of REPRESENTATIONS of the various types of OBJECTS,
with corresponding syntax rules, like:
associative, commutative, idempotent etc (for say:
functions, integer arithmetic, sets & operations)
occurring in a theorem resp. proof sequence.

My idea is that State Machines (finite # states, or infinite set
of integers as state set: ISM = Integer State Machines) and
coupled networks thereof, could model whatever constructive
mathematics can do (and 'better' than Turing Machines, in the
sense of more like executable algorithms do in CS).

PS[*]: Just think of Louis de Funès (French comedian) in one of
his films, about an Alien from another planet, visiting
him with the request from the Ruler there to bring him (LdF)
and his excellent soup. Louis declines at first, saying the
ingredients would not be available over there. So in fact
when LdF departs, after all (in fact after he dies;-),
he is brought over with the whole patch of ground & farm,
including everything - the complete 'context' of LdF and his
Vegetables + Chickens + Soup!
That is the essence of OOP, in a nutshell...

http://home.iae.nl/users/benschop/ism.htm

Pythagoras Titanium

unread,
Jul 18, 2002, 6:46:20 PM7/18/02
to

Rupert

unread,
Jul 19, 2002, 4:14:56 AM7/19/02
to
"Ben Golub" <b...@nerc.com> wrote in message news:<3d34d984$1...@corp.newsgroups.com>...

> Just wondering, Mr. Harris claims that nobody has shown an error in his last bogus
> "poof" located at http://groups.msn.com/ProofofFermatsLastTheorem/yourwebpage.msnw
>
> Has somebody torn this up yet or is that a task that is still available for the
> taking?
>
> Ben
>
>

Modulo fillable gaps in the argument, the first problem is "exactly
two of the a's have a factor of f^j".

This is false. I first pointed that out and proved it, oh, sometime in
May I think, maybe April. (A proof-idea of Bengt Mansson's helped a
lot. Also, as Arturo Magidin showed, a simple generalization of one of
his proof-ideas would have worked as well).

I only kept pointing it out and proving it over and over again for
about four weeks. And, boy, I REALLY had to make the proof simple. And
get insulted quite a few times. Then I actually got him to admit it
was false. That was quite exciting.

Then, even though he had admitted the proposition was false, he still
kept on claiming there was no error in the proof. This was quite
interesting. Then I started coming up with alternative interpretations
for what he might mean (even though they weren't actually what he
said). Then I managed to prove that *they* were false as well. The
conversation wasn't very civilized at this point.

Then, after a while, I realized I had better things to do like pick my
nose. (Andrzej suggested to me via private email correspondence that
there really was no point; it was a valuable suggestion).

You can read the history of it all on Google.

If you like, you can have fun trying to make sense of the paragraphs
between the first mistake and "Proof Complete".

Ben Golub

unread,
Jul 19, 2002, 10:31:00 AM7/19/02
to
> Modulo fillable gaps in the argument, the first problem is "exactly
> two of the a's have a factor of f^j".
>
> This is false. I first pointed that out and proved it, oh, sometime in
> May I think, maybe April. (A proof-idea of Bengt Mansson's helped a
> lot. Also, as Arturo Magidin showed, a simple generalization of one of
> his proof-ideas would have worked as well).

[...]

> You can read the history of it all on Google.
>
> If you like, you can have fun trying to make sense of the paragraphs
> between the first mistake and "Proof Complete".

I had a read of all this on Google (thanks everybody for telling me where to
look). Following the argument, one can see that you certainly showed a huge and
gaping hole ... I guess I was just amazed that he could so blindly claim that he
had no error. It's hard to accept such complete lack of reason.

Anyway, you have spent quite some time on him, and all of us who have responded to
his posts I guess have thrown a little bit of time into that bottomless pit of
stupidity. He is, however, a world-class troll and crank, and very amusing to
read. I don't know if I'll be able to wean myself from the JSH addiction.

Joona I Palaste

unread,
Jul 31, 2002, 9:55:01 AM7/31/02
to
Nico Benschop <n.ben...@chello.nl> scribbled the following
on sci.math:

No... he was just missing the word "from" there.

--
/-- Joona Palaste (pal...@cc.helsinki.fi) ---------------------------\
| Kingpriest of "The Flying Lemon Tree" G++ FR FW+ M- #108 D+ ADA N+++|
| http://www.helsinki.fi/~palaste W++ B OP+ |
\----------------------------------------- Finland rules! ------------/
"We sorcerers don't like to eat our words, so to say."
- Sparrowhawk

Nico Benschop

unread,
Aug 1, 2002, 2:41:23 AM8/1/02
to
Joona I Palaste wrote:
>
> Nico Benschop <n.ben...@chello.nl> scribbled the following
> on sci.math:
> > George Johnson wrote a masterpiece that made my day,
> > in which I discovered (I think) only one typo [*]:
> >>
> >> "James Harris" <jst...@msn.com> wrote a lot of doomsday stuff:
> >> [...]
> >> | These are dark times, and it looks like they're darkening.
> >> | From where I sit, we're in some kind of Dark Ages in both
> >> | math and physics.
> >>
> >> And only you can save us James. ..[*]
>
> > Shouldn't that be:
> > And only you can save James. -- NB
>
> No... he was just missing the word "from" there. -- Joona Palaste

Well, I was not so much concerned about 'us' (since a killfile
and the decent habit of James to put 'JSH' in the subject line,
solve that problem - that is: for the non-addicted;-)
but more about James himself, who keeps going - wasting his time -
"with an energy that's worth a better cause", as the saying goes.
