
JSH: Assessing the truth


G. Frege

Dec 7, 2002, 4:42:29 PM
On 7 Dec 2002 12:29:12 -0800, jst...@msn.com (James Harris) wrote:

> Some of you may honestly be confused about whether or not I've
> presented a short proof of Fermat's Last Theorem.
>
IMHO the only person who may be honestly confused about whether or not
you've presented a short proof of Fermat's Last Theorem is you.

>
> I'm quite certain that mathematicians have strong motivations not
> to tell you the truth.
>
Sure, I am quite certain that you have strong motivations to think
that. :-)

>
> While I admit it's quite possible that many mathematicians have
> not heard of the short FLT Proof...
>
Actually it's _quite likely_ that MOST mathematicians have not heard
of the short FLT Proof. :-)

>
> I suggest your common sense is a good start...
>
Actually (imho) "common sense" is NOT a good start in mathematics!

>
> 1. I did find a way to count prime numbers that is not in any
> established reference, which is easily verifiable.
>
Great, but HELL why don't you PUBLISH it then?!

>
> 2. Most of the short FLT proof I've presented has been verified as
> being correct, including points that *had* been controversial.
>
This may be so. So what?

>
> Now I've contacted leading mathematicians by email, and for important
> reasons I'm keeping their names and the complete contents of those
> emails private, and what I've been told by them is not that it's not
> new, but that they don't think it's important.
>
Yes. This may be so. STILL the work should be published if "original"
imho.

>
> Also, they keep talking about fast prime counting, and what I have is
> not the fastest around, though I still believe that algorithms based
> on it can be.
>
Well... I just TOLD YOU: "common sense" is NOT a good start in
mathematics! You should keep that in mind. PROVE your "belief" (or at
least present some ARGUMENTS in favor of it), or shut up!

>
> What's important here is for you to ask yourselves, how unimportant
> could a method for counting PRIME NUMBERS truly be...
>
Actually there are MUCH BETTER (=much faster) methods known already.
So what do you want?!

>
> Some of you may labor under the misapprehension that what is happening
> is strange historically, but it's not.
>
> You may have seen the life of Evariste Galois...
>
Sure. The parallels between YOU and Evariste Galois are STRIKING!

> ...mentioned or read my post linking John Nash's schizophrenia to lack
> of proper recognition...
>
...same, same.

F.

David C. Ullrich

Dec 8, 2002, 11:19:00 AM
On 7 Dec 2002 12:29:12 -0800, jst...@msn.com (James Harris) wrote:

>Some of you may honestly be confused about whether or not I've

>presented a short proof of Fermat's Last Theorem,

You think some of them may be confused about that? I think
anyone reading the threads you've started in the past will
get a pretty clear picture. For the benefit of anyone who
doesn't have time to do that: No, you have not presented
a valid proof of FLT.

Now that we've got that straight, we can move on to something else.

[...]
>
>3. Despite my prime counting function not being in any reference, the
>same people who are telling you that the short FLT Proof is wrong, are
>usually claiming that the prime counting work is unimportant, or in
>even more bizarre cases, claiming that I haven't presented anything
>new!

Yeah. A few days ago you were bragging about being in contact with
top mathematicians, like that proved something. But you neglected
to mention what they had to say. In particular you didn't include the
comment from an unnamed top mathematician that you relayed to us a
little while ago - he said that probably 20% of the grad students
working in the field come up with something similar.

>Now I've contacted leading mathematicians by email, and for important
>reasons I'm keeping their names and the complete contents of those
>emails private, and what I've been told by them is not that it's not
>new, but that they don't think it's important.
>

>Also, they keep talking about fast prime counting, and what I have is
>not the fastest around, though I still believe that algorithms based
>on it can be.

You believe that because you don't understand about algorithmic
complexity calculations. Curious, given that you're a professional
programmer.

>What's important here is for you to ask yourselves, how unimportant

>could a method for counting PRIME NUMBERS truly be such that
>mathematicians don't seem bothered with it NOT being published?

Easy. It could be unimportant if it's not all that new, doesn't
lead to anything new theoretically, and also doesn't lead to
anything like new records for pi(n) for n large.

>Here the proper assessment, I strongly suggest, is that it is VERY
>important, which is why mathematicians, to my knowledge, aren't
>talking about it.
>
>Here what you believe probably depends on whether or not you're one of
>those "conspiracy people" who suppose that it is possible aliens
>landed at Roswell and the government covered it up, and those who
>think it's complete hogwash.
>
>Personally, since I mentioned it, I don't think aliens crash landed at
>Roswell, but I'm hoping some of you will smell the possibility of a
>conspiracy of silence here.

I'm so glad you said that. Makes things so much clearer for the
people you address in your first sentence - the ones who aren't
sure who's right.

>Now I hope those three points are a start.


>
>Some of you may labor under the misapprehension that what is happening
>is strange historically, but it's not.
>

>You may have seen the life of Evariste Galois mentioned or read my
>post linking John Nash's schizophrenia to lack of proper recognition,
>based on an established psychological theory called the double bind
>theory, which you can easily enough look up on the web to see that I
>didn't make it up.
>
>Do you think that intelligent people--people intelligent enough to get
>doctorates in mathematics--really don't know the profound effect that
>simply *ignoring* a person's work can have on that person?
>
>Maybe those who repeatedly post in reply to me at least acknowledging
>the existence of my work are better than that shadowy majority of
>mathematicians who by now may know much about it,

The vast majority of mathematicians have heard about your work.
Even though it's never been published. Got it.

If you had any idea how this sort of thing sounded you wouldn't
talk this way.

>but realize that
>their most potent and vicious weapon--is silence. James Harris


David C. Ullrich

David Kastrup

Dec 8, 2002, 2:07:16 PM
David C. Ullrich <ull...@math.okstate.edu> writes:

> You believe that because you don't understand about algorithmic
> complexity calculations. Curious, given that you're a professional
> programmer.

Sorry, but you are confusing programmers with computer scientists.
Computer scientists are the people that can do the algorithmic
complexity calculations and invariably just use linear lists when
programming.

Programmers are the ones that actually _use_ optimization techniques,
just the wrong ones.

In short, we are living in a world where there is a division between
the people getting taught the science of criticism, which they don't
apply to their own works, and natural artists that produce mostly
well-meant sub-par art for pay.

Good programming is an art. You can teach handiwork. You can teach
criticism. That way you can crank out enough people that cater for
basic needs in that area. And you generate an atmosphere that is
more likely to foster art. But you seemingly can't install the
creative spark itself.

Anyhow, not having a clue about algorithmic complexity is not at all
clashing with being a professional programmer. And not thinking
about applying algorithmic complexity to one's own work is not at all
clashing with being a computer scientist.

Sad, but true.

--
David Kastrup, Kriemhildstr. 15, 44793 Bochum

Dik T. Winter

Dec 8, 2002, 10:30:13 PM
In article <3c65f87.02120...@posting.google.com> jst...@msn.com (James Harris) writes:
...
> The prime counting function I found is
>
> dS(x,y) = (pi((x/y), y-1) - pi(y-1, sqrt(y-1)))*[pi(y, sqrt(y)) -
> pi(y-1, sqrt(y-1))],
>
> S(x,1) = 0.
>
> And pi(x, y) = floor(x) - S(x, y) - 1, and you get S by summing dS
> from y = 2 to sqrt(x).

So S(x, y) = sum{y = 2 to sqrt(x)} dS(x, y)?

I do not understand. What is the meaning of the second parameter 'y'
in S(x, y)?
--
dik t. winter, cwi, kruislaan 413, 1098 sj amsterdam, nederland, +31205924131
home: bovenover 215, 1025 jn amsterdam, nederland; http://www.cwi.nl/~dik/

Randy Poe

Dec 9, 2002, 12:20:44 PM
G. Frege wrote:
> On 7 Dec 2002 12:29:12 -0800, jst...@msn.com (James Harris) wrote:
>>1. I did find a way to count prime numbers that is not in any
>>established reference, which is easily verifiable.
>>
>
> Great, but HELL why don't you PUBLISH it then?!

Because an editor would point out that Legendre already
did a couple of hundred years ago, so the market would
be somewhat limited.

- Randy
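[For readers who don't know the reference: Legendre's nineteenth-century
method counts primes by inclusion-exclusion over the primes up to sqrt(x),
via pi(x) = phi(x, a) + a - 1 with a = pi(sqrt(x)). The sketch below is the
editor's illustration of that classical recursion, not code posted in the
thread; the helper names (_sieve, _phi, legendre_pi) are invented for it.]

```python
from math import isqrt
from functools import lru_cache

def _sieve(n):
    """Return the primes <= n by a simple sieve of Eratosthenes."""
    if n < 2:
        return []
    flags = [True] * (n + 1)
    flags[0] = flags[1] = False
    for p in range(2, isqrt(n) + 1):
        if flags[p]:
            flags[p * p :: p] = [False] * len(flags[p * p :: p])
    return [i for i, is_p in enumerate(flags) if is_p]

@lru_cache(maxsize=None)
def _phi(x, a, primes):
    # phi(x, a): count of integers <= x divisible by none of the
    # first a primes.  phi(x, 0) = x; then peel off one prime at a time.
    if a == 0:
        return x
    return _phi(x, a - 1, primes) - _phi(x // primes[a - 1], a - 1, primes)

def legendre_pi(x):
    """Legendre's formula: pi(x) = phi(x, a) + a - 1, a = pi(sqrt(x))."""
    if x < 2:
        return 0
    root = isqrt(x)
    a = legendre_pi(root)
    return _phi(x, a, tuple(_sieve(root))) + a - 1
```

For instance, legendre_pi(100) returns 25, the number of primes up to 100.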

James Harris

Dec 9, 2002, 1:05:49 PM
"Dik T. Winter" <Dik.W...@cwi.nl> wrote in message news:<H6u1q...@cwi.nl>...

> In article <3c65f87.02120...@posting.google.com> jst...@msn.com (James Harris) writes:
> ...
> > The prime counting function I found is
> >
> > dS(x,y) = (pi((x/y), y-1) - pi(y-1, sqrt(y-1)))*[pi(y, sqrt(y)) -
> > pi(y-1, sqrt(y-1))],
> >
> > S(x,1) = 0.
> >
> > And pi(x, y) = floor(x) - S(x, y) - 1, and you get S by summing dS
> > from y = 2 to sqrt(x).
>
> So S(x, y) = sum{y = 2 to sqrt(x)} dS(x, y)?
>
> I do not understand. What is the meaning of the second parameter 'y'
> in S(x, y)?

It's sum up to sqrt(x), if y>=sqrt(x).

It turns out though that I'm not sure if that's really required, as I
think you can just sum to y, whatever it is, as long as, for now, it's
positive, and you will simply find that dS(x,y) drops to zero if
y>sqrt(x).

It's actually fascinating, and not well worked out because
mathematicians keep acting as if it's nothing new or interesting,
which leaves me not only as the principal researcher, but maybe the
only researcher in this area, which is starting to go beyond bizarre
to scary. It's as if our modern mathematicians don't think
rationally, which is a surprise.

I mean, if we now have clear evidence that mathematicians of today
don't actually think rationally, what are they actually doing in
their day-to-day activities? James Harris


Tris

Dec 9, 2002, 1:27:51 PM

"James Harris" <jst...@msn.com> wrote in message
news:3c65f87.02120...@posting.google.com...

I hear they have regular meetings with a Grand Dragon - some
conspiracy or something. They probably wear funny hats too.


David C. Ullrich

Dec 9, 2002, 6:23:09 PM
On 9 Dec 2002 10:08:51 -0800, jst...@msn.com (James Harris) wrote:

>David C. Ullrich <ull...@math.okstate.edu> wrote in message news:<dtr6vu4tbdbolubuv...@4ax.com>...


>> On 7 Dec 2002 12:29:12 -0800, jst...@msn.com (James Harris) wrote:
>

><deleted>


>
>> >Now I've contacted leading mathematicians by email, and for important
>> >reasons I'm keeping their names and the complete contents of those
>> >emails private, and what I've been told by them is not that it's not
>> >new, but that they don't think it's important.
>> >
>> >Also, they keep talking about fast prime counting, and what I have is
>> >not the fastest around, though I still believe that algorithms based
>> >on it can be.
>>
>> You believe that because you don't understand about algorithmic
>> complexity calculations. Curious, given that you're a professional
>> programmer.
>

><deleted>
>
>Hmmm...I've seen this algorithmic complexity thing come up several
>times from different posters, so let's see if you actually know
>anything Ullrich.
>
>I admit my ignorance of exactly what you're talking about, so why
>don't you explain?

I'm talking about the fact that known algorithms for calculating
pi(x) have much better _asymptotic_ running time than yours.
I could check mathworld if I felt like getting the details right;
offhand I seem to recall that there are algorithms that run in
time O(N^(1/2 + epsilon)) or maybe it was 1/3+epsilon, while
when people analyzed yours my recollection is it came out to
O(N log(N)) or worse. And as several people pointed out, it's
very funny that you don't realize that an O(f(N)) algorithm
is never going to beat an O(g(N)) algorithm (where my _recollection_
is that f is N log(N) and g is a smaller power of N.)

This is your chance to make fun of the fact that I don't recall
exactly what the asymptotics were for your algorithm and the
standard ones. Don't do that - if you do I'll just look up
the asymptotic running time for the standard ones and ask you
to _calculate_ the asymptotic running time for yours - then
you'll look ridiculous, not being able to do that.

>Of course, I'm sure some other poster, will as usual, post ahead of
>you to try and save you from actually having to show that you know
>anything.

Doesn't look like that happened - could be there _was_ a previous
reply that hasn't appeared here yet.

> But we'll see. James Harris


David C. Ullrich


Dik T. Winter

Dec 9, 2002, 11:26:22 PM
In article <3c65f87.02120...@posting.google.com> jst...@msn.com (James Harris) writes:
> "Dik T. Winter" <Dik.W...@cwi.nl> wrote in message news:<H6u1q...@cwi.nl>...
...

> > > And pi(x, y) = floor(x) - S(x, y) - 1, and you get S by summing dS
> > > from y = 2 to sqrt(x).
> >
> > So S(x, y) = sum{y = 2 to sqrt(x)} dS(x, y)?
> >
> > I do not understand. What is the meaning of the second parameter 'y'
> > in S(x, y)?
>
> It's sum up to sqrt(x), if y>=sqrt(x).

So your intention is:
S(x, y) = sum{i = 2 to min(sqrt(x),y)} dS(x, i)?
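[Winter's reading can be checked mechanically. The sketch below is the
editor's direct transcription of the recurrence as quoted, under the
assumption that pi(n, y) with y >= sqrt(n) is meant to be the ordinary
prime count. It is far too slow for serious use, but it reproduces pi(n)
on small inputs; the name prime_count is the editor's.]

```python
from math import floor, sqrt

# pi(x, y) = floor(x) - S(x, y) - 1, with S(x, 1) = 0 and
# S(x, y) summing dS(x, j) for integer j = 2 .. min(y, sqrt(x)).

def S(x, y):
    total = 0
    j = 2
    while j <= min(y, sqrt(x)):
        total += dS(x, j)
        j += 1
    return total

def dS(x, j):
    # The bracketed factor pi(j, sqrt(j)) - pi(j-1, sqrt(j-1)) is 1
    # exactly when j is prime and 0 otherwise, so dS(x, j) counts the
    # composites <= x whose smallest prime factor is j.
    return (pi(x / j, j - 1) - pi(j - 1, sqrt(j - 1))) * \
           (pi(j, sqrt(j)) - pi(j - 1, sqrt(j - 1)))

def pi(x, y):
    return floor(x) - S(x, y) - 1

def prime_count(n):
    # Editor's assumption: pi(n, y) with y >= sqrt(n) equals pi(n).
    return pi(n, sqrt(n))
```

On small arguments this agrees with the ordinary prime-counting function,
e.g. prime_count(100) gives 25.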

Dik T. Winter

Dec 9, 2002, 11:31:05 PM
In article <3c65f87.02120...@posting.google.com> jst...@msn.com (James Harris) writes:
> David C. Ullrich <ull...@math.okstate.edu> wrote in message news:<l69avu88fhsp5029m...@4ax.com>...
...

> > >I admit my ignorance of exactly what you're talking about, so why
> > >don't you explain?
> >
> > I'm talking about the fact that known algorithms for calculating
> > pi(x) have much better _asymptotic_ running time than yours.
> > I could check mathworld if I felt like getting the details right;
> > offhand I seem to recall that there are algorithms that run in
> > time O(N^(1/2 + epsilon)) or maybe it was 1/3+epsilon, while
> > when people analyzed yours my recollection is it came out to
> > O(N log(N)) or worse. And as several people pointed out, it's
> > very funny that you don't realize that an O(f(N)) algorithm
> > is never going to beat an O(g(N)) algorithm (where my _recollection_
> > is that f is N log(N) and g is a smaller power of N.)
>
> However, the record shows that I kept improving those running times,
> which seemed to impress at least some people who actually posted about
> it! And one of the major improvements was my latest posted program
> which was 100 times faster than my original.

Improving an algorithm with O(N log(N)) complexity by a factor 100 does
not change the complexity. The only thing that changes is that the
cross-over point where an O(N^(1/2 + epsilon)) algorithm is better increases.

> Why don't you do something better and tell the newsgroup what it is
> for *my* latest posted program?

It is *still* O(N log(N)).

John

Dec 10, 2002, 12:58:22 AM
jst...@msn.com (James Harris) wrote in message news:<3c65f87.02120...@posting.google.com>...

>
> Of course, I'm sure some other poster, will as usual, post ahead of
> you to try and save you from actually having to show that you know
> anything. But we'll see. James Harris

It will come as no surprise to you that DU has almost never had anything
to say about your mathematics. Either he stiffs you without bothering
to look at the math, or he waits for somebody to do the math before
he stiffs you.

Which shows how much more Magidin grasps than Ullrich. Magidin
grasps what you put forward well enough to be able to, er,
collaborate. But he is way too much of a mathie to credit you with
those insights, which--thanks to his superior training--he is
able to validate in ways to which his buddies are accustomed.

--John

David Kastrup

Dec 10, 2002, 3:41:50 AM
jst...@msn.com (James Harris) writes:

> However, the record shows that I kept improving those running times,
> which seemed to impress at least some people who actually posted about
> it! And one of the major improvements was my latest posted program

> which was 100 times faster than my original. And yes Ullrich, before
> you try some smart aleck reply, it also had better running time.

100 times is nothing when you are toy-testing an algorithm intended
for numbers like 10^20 which has worse than linear complexity.

I have rewritten algorithms operating on digitized map data, with
equal results. They were basically O(N^2), and the people came up with
optimizations (based on some sort of triangle inequality exclusion)
cutting the running time in half, so that it dropped to below one day
again for medium size maps. The code was too hard to figure out for
me, so I rewrote from scratch, with identical results but a
completely different algorithm. First time it ran, it took 2 minutes
for the same set of data. And the original stuff was written by
people with a _degree_ in computer science.

And for the really big maps, their code would have taken a month to
run. Mine about 10 minutes.

In the end, algorithmic complexity will always kill you on large
scale tasks.


David C. Ullrich

Dec 10, 2002, 10:32:52 AM
On 9 Dec 2002 18:11:55 -0800, jst...@msn.com (James Harris) wrote:

>David C. Ullrich <ull...@math.okstate.edu> wrote in message news:<l69avu88fhsp5029m...@4ax.com>...

>However, the record shows that I kept improving those running times,
>which seemed to impress at least some people who actually posted about
>it! And one of the major improvements was my latest posted program
>which was 100 times faster than my original. And yes Ullrich, before
>you try some smart aleck reply, it also had better running time.

And once _again_ you show that you have no idea what the relevant
concepts even mean! Taking an O(N log(N)) algorithm and speeding
it up by a factor of 100 is not going to change the fact that
it's O(N log(N)).

>In fact, my research has shown that the running time can be further
>improved.

Great. Last I checked, you were going to be able to get pi(n)
for the largest n for which it's currently known by about the
time the Sun explodes. If you can improve that so it only takes
a hundredth of the time til the Sun explodes that will be a big
improvement.

>> This is your chance to make fun of the fact that I don't recall
>> exactly what the asymptotics were for your algorithm and the
>> standard ones. Don't do that - if you do I'll just look up
>> the asyptotic running time for the standard ones and ask you
>> to _calculate_ the asymptotic running time for yours - then
>> you'll look ridiculous, not being able to do that.
>

>Why don't you do something better and tell the newsgroup what it is
>for *my* latest posted program?

Why don't _you_ do so?

>Show what you know Ullrich.
>
>After all, given your post, people might have assumed you actually had
>something to compare against, so what is it?
>
>You see Ullrich, being a smart aleck is worse than futile, when
>someone decides to catch you on it, because people don't like smart
>alecks.

For heaven's sake, exactly what do you think you've "caught" me
on here? I said something, you asked me to explain what I meant,
suggesting I didn't know what I meant. So I explained what I
meant. You've caught me knowing what I'm talking about...

>> >Of course, I'm sure some other poster, will as usual, post ahead of
>> >you to try and save you from actually having to show that you know
>> >anything.
>>
>> Doesn't look like that happened - could be there _was_ a previous
>> reply that hasn't appeared here yet.
>

>I made my comment to prevent it. And in fact, I'd like those who
>think it might be something they can do sneakily in a post, to still
>stay away.
>
>Besides, you always have email. James Harris


David C. Ullrich

David C. Ullrich

Dec 10, 2002, 10:39:59 AM
On 10 Dec 2002 06:07:47 -0800, jst...@msn.com (James Harris) wrote:

>"Dik T. Winter" <Dik.W...@cwi.nl> wrote in message news:<H6vz7...@cwi.nl>...


>> In article <3c65f87.02120...@posting.google.com> jst...@msn.com (James Harris) writes:
>> > David C. Ullrich <ull...@math.okstate.edu> wrote in message news:<l69avu88fhsp5029m...@4ax.com>...
>> ...
>> > > >I admit my ignorance of exactly what you're talking about, so why
>> > > >don't you explain?
>> > >
>> > > I'm talking about the fact that known algorithms for calculating
>> > > pi(x) have much better _asymptotic_ running time than yours.
>> > > I could check mathworld if I felt like getting the details right;
>> > > offhand I seem to recall that there are algorithms that run in
>> > > time O(N^(1/2 + epsilon)) or maybe it was 1/3+epsilon, while
>> > > when people analyzed yours my recollection is it came out to
>> > > O(N log(N)) or worse. And as several people pointed out, it's
>> > > very funny that you don't realize that an O(f(N)) algorithm
>> > > is never going to beat an O(g(N)) algorithm (where my _recollection_
>> > > is that f is N log(N) and g is a smaller power of N.)
>> >
>> > However, the record shows that I kept improving those running times,
>> > which seemed to impress at least some people who actually posted about
>> > it! And one of the major improvements was my latest posted program
>> > which was 100 times faster than my original.
>>
>> Improving an algorithm with O(N log(N)) complexity by a factor 100 does
>> not change the complexity. The only thing that changes is that the
>> cross-over point where an O(N^(1/2 + epsilon)) algorithm is better increases.
>

>Not surprisingly, a poster couldn't resist trying to help Ullrich out
>though I requested that posters let Ullrich answer a question on his
>own!

Um, the question that you requested nobody help me with was two
posts up in the thread - this is an answer to a _different_ question,
one that you _didn't_ ask that nobody help me with.
In fact it's not an answer to a question at all, it's just pointing
out the wrongheadedness in something you said. An _obvious_ bit
of wrongheadedness, by the way.

>And to you Winter, other posters have disagreed with you,

Exactly who has disagreed with what he said here? This should be good.

>so provide
>evidence to support your claim.

Here's a _proof_ of the "claim" that improving an algorithm with
O(N log(N)) complexity by a factor 100 does not change the
complexity:

If the running time is O(N log(N)) then there is a constant c so
that the program takes time <= c (N log(N)) to execute. If we
improve that by a factor of 100 the program takes time <=
(c/100) (N log(N)) to execute. This is still O(N log(N)).

That was pretty hard.

>> > Why don't you do something better and tell the newsgroup what it is
>> > for *my* latest posted program?
>>
>> It is *still* O(N log(N)).
>

>It seems to me that if you were an Ullrich supporter, which you
>clearly have shown yourself to be, then people can *reasonably*
>suppose that you'd say anything to keep Ullrich from showing how
>ignorant he actually is.
>
>So back up your statements please.

I just backed up his statement for him.

You really have no idea how stupid you look disputing this
particular "claim". Because the "claim" is _so_ obvious to
anyone who has _any_ idea what the words mean (note the
utterly trivial nature of the proof of the claim above.)

David C. Ullrich

Dec 10, 2002, 10:44:20 AM

Hint: you're making an idiot of yourself again.

Exactly which "insights" of James do you think that Magidin has
_validated_? The only thing I can possibly imagine you might
be referring to is the M&M theorem. That's not an "insight"
James had. James is confused enough that he has insisted many
times that the M&M theorem is _false_, providing bogus
counterexamples, even though he _uses_ this fact that he
insists is false in his proof.

That can't be so because it doesn't make any sense? No.
I agree it doesn't make any sense, but it doesn't follow
that it can't be an accurate description of James' behavior.

>--John


David C. Ullrich

Dik T. Winter

Dec 10, 2002, 11:00:16 AM
In article <3c65f87.02121...@posting.google.com> jst...@msn.com (James Harris) writes:
> "Dik T. Winter" <Dik.W...@cwi.nl> wrote in message news:<H6vz7...@cwi.nl>...
...

> > Improving an algorithm with O(N log(N)) complexity by a factor 100 does
> > not change the complexity. The only thing that changes is that the
> > cross-over point where an O(N^(1/2 + epsilon)) algorithm is better increases.
>
> Not surprisingly, a poster couldn't resist trying to help Ullrich out
> though I requested that posters let Ullrich answer a question on his
> own!

Tsk.

> And to you Winter, other posters have disagreed with you, so provide


> evidence to support your claim.

Which poster has disagreed with me? If follows from definitions in
complexity theory. An algorithm is O(f(N)) if there is a constant C
such that the time the algorithm takes is asymptotically at most C.f(N).
Note that it is *independent* of the value of C.

Bob Pease

Dec 10, 2002, 11:50:01 AM

"James Harris" <jst...@msn.com> wrote in message
news:3c65f87.02120...@posting.google.com...

I have only a limited amount of time to read stuff.
This used to be fun, but it's now descended to pity.

Bye, Jimmy!!

PLONK!!!

RJ Pease




James Harris

Dec 10, 2002, 12:14:52 PM
David Kastrup <David....@t-online.de> wrote in message news:<x5u1hmz...@lola.goethe.zz>...

> jst...@msn.com (James Harris) writes:
>
> > However, the record shows that I kept improving those running times,
> > which seemed to impress at least some people who actually posted about
> > it! And one of the major improvements was my latest posted program
> > which was 100 times faster than my original. And yes Ullrich, before
> > you try some smart aleck reply, it also had better running time.
>
> 100 times is nothing when you are toy-testing an algorithm intended
> for numbers like 10^20 which has worse than linear complexity.

The asymptotic running time improved as well.



> I have rewritten algorithms operating on digitized map data, with
> equal results. They were basically O(N^2),

<deleted>

Your personal experiences are not relevant to what Ullrich knows.

Ullrich, smart aleck, can't give the asymptotic running time of my
programs.

Since you seem to want to help him, why don't you?

Remember for those who don't understand the heat here, or why I'm
upset, that people like Kastrup and Ullrich are trying to discount the
value of my prime counting research.

My own investigations have indicated that optimizations improved both
speed *and* asymptotic running times, so why would that be ignored?

I think it's clear here that Ullrich wants it to be ignored, so that's
why. Forget mathematical or research value, it's what Ullrich wants,
so people try to support Ullrich so that he can get it.

One thing I find interesting is that often someone like this Kastrup
starts posting about *their* own supposed accomplishments in the world
of algorithms, and that pattern is, I think, telling. James Harris

Randy Poe

Dec 10, 2002, 11:54:20 AM
David C. Ullrich wrote:
> On 9 Dec 2002 10:08:51 -0800, jst...@msn.com (James Harris) wrote:
> I'm talking about the fact that known algorithms for calculating
> pi(x) have much better _asymptotic_ running time than yours.
> I could check mathworld if I felt like getting the details right;
> offhand I seem to recall that there are algorithms that run in
> time O(N^(1/2 + epsilon)) or maybe it was 1/3+epsilon, while
> when people analyzed yours my recollection is it came out to
> O(N log(N)) or worse. And as several people pointed out, it's
> very funny that you don't realize that an O(f(N)) algorithm
> is never going to beat an O(g(N)) algorithm (where my _recollection_
> is that f is N log(N) and g is a smaller power of N.)

According to the results at Mathworld,
http://mathworld.wolfram.com/PrimeCountingFunction.html
the best algorithms are O(N^(0.5+epsilon)) in runtime.
Storage is also a consideration, and the same algorithm
seems to be the most efficient one for storage. Still,
O(N^0.25) for storage is a lot of bytes when N>10^22.

> This is your chance to make fun of the fact that I don't recall
> exactly what the asymptotics were for your algorithm and the
> standard ones. Don't do that - if you do I'll just look up
> the asymptotic running time for the standard ones and ask you
> to _calculate_ the asymptotic running time for yours - then
> you'll look ridiculous, not being able to do that.
>
>
>>Of course, I'm sure some other poster, will as usual, post ahead of
>>you to try and save you from actually having to show that you know
>>anything.
>
>
> Doesn't look like that happened - could be there _was_ a previous
> reply that hasn't appeared here yet.

Well, the snide tone of James' challenge made me decide to
hold off till you responded.

- Randy

Randy Poe

Dec 10, 2002, 12:04:31 PM
James Harris wrote:
> David C. Ullrich <ull...@math.okstate.edu> wrote in message news:<l69avu88fhsp5029m...@4ax.com>...

>>I'm talking about the fact that known algorithms for calculating
>>pi(x) have much better _asymptotic_ running time than yours.
>>I could check mathworld if I felt like getting the details right;
>>offhand I seem to recall that there are algorithms that run in
>>time O(N^(1/2 + epsilon)) or maybe it was 1/3+epsilon, while
>>when people analyzed yours my recollection is it came out to
>>O(N log(N)) or worse. And as several people pointed out, it's
>>very funny that you don't realize that an O(f(N)) algorithm
>>is never going to beat an O(g(N)) algorithm (where my _recollection_
>>is that f is N log(N) and g is a smaller power of N.)
>
>
> However, the record shows that I kept improving those running times,
> which seemed to impress at least some people who actually posted about
> it! And one of the major improvements was my latest posted program
> which was 100 times faster than my original. And yes Ullrich, before
> you try some smart aleck reply, it also had better running time.
>
> In fact, my research has shown that the running time can be further
> improved.

You're still missing the point on what O(f(N)) means.

If the runtime of an algorithm is a*N^2, then we say it is
O(N^2). If you have improved the running time by a factor of 100,
then you have reduced a by a factor of 100. But it's still
O(N^2). You haven't changed the asymptotic performance.

What that means is that if you have another algorithm that has
run time b*N, then sooner or later for large enough N, the
second algorithm will beat the first. And the difference will
be more and more as N increases.

It would be instructive for you to plot the runtime on log
axes of three different algorithms, as N goes from 10^1 to
10^25.

Algorithm 1a: Runtime = N^2.
Algorithm 1b (new improved superfast version of 1a):
Runtime = N^2/1000.
Algorithm 2: Runtime = 1000*N.
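[Randy Poe's three hypothetical runtimes make the point concretely; a few
lines of throwaway code (the constants are his illustrative ones, not
measurements of any real program) show where the "slower" linear algorithm
overtakes the "improved" quadratic one.]

```python
# Hypothetical runtimes from the example above (arbitrary units).

def alg_1a(n):
    # Original quadratic algorithm.
    return n ** 2

def alg_1b(n):
    # "New improved superfast" version of 1a: a 1000x constant speedup,
    # but still O(N^2).
    return n ** 2 / 1000

def alg_2(n):
    # Plain linear algorithm with a big constant.
    return 1000 * n

for n in (10 ** 3, 10 ** 6, 10 ** 9):
    print(n, alg_1b(n), alg_2(n))
```

alg_1b beats alg_2 only while n < 10^6; past that crossover the linear
algorithm wins by a growing factor (1000x at n = 10^9), no matter how much
the quadratic one's constant is improved.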

>>This is your chance to make fun of the fact that I don't recall
>>exactly what the asymptotics were for your algorithm and the
>>standard ones. Don't do that - if you do I'll just look up
>>the asyptotic running time for the standard ones and ask you
>>to _calculate_ the asymptotic running time for yours - then
>>you'll look ridiculous, not being able to do that.
>

> Why don't you do something better and tell the newsgroup what it is
> for *my* latest posted program?

The measurements I've seen were spotty, but seemed to
be no better than N*log(N) at best. Some indications
were that it was still O(N^2). In either case, far worse than
virtually every complexity result quoted in the table
at Mathworld.

> Show what you know Ullrich.

What does tabulating run times have to do with mathematical
knowledge?

> After all, given your post, people might have assumed you actually had
> something to compare against, so what is it?

He has something to compare against. He is talking about
the standard ones, and the asymptotic results are tabulated.
We've all seen them.
http://mathworld.wolfram.com/PrimeCountingFunction.html

> You see Ullrich, being a smart aleck is worse than futile, when
> someone decides to catch you on it, because people don't like smart
> alecks.

What did you "catch" him on?

He's claiming your algorithm's scaling is worse, much
worse, than the best-performing algorithms known and tabulated
at the above link. Do you think this is wrong?

- Randy

Randy Poe

unread,
Dec 10, 2002, 12:30:38 PM12/10/02
to
James Harris wrote:
> "Dik T. Winter" <Dik.W...@cwi.nl> wrote in message news:<H6vz7...@cwi.nl>...

>
>>In article <3c65f87.02120...@posting.google.com> jst...@msn.com (James Harris) writes:
>> > David C. Ullrich <ull...@math.okstate.edu> wrote in message news:<l69avu88fhsp5029m...@4ax.com>...
>> ...
>> > > >I admit my ignorance of exactly what you're talking about, so why
>> > > >don't you explain?
>> > >
>> > > I'm talking about the fact that known algorithms for calculating
>> > > pi(x) have much better _asymptotic_ running time than yours.
>> > > I could check mathworld if I felt like getting the details right;
>> > > offhand I seem to recall that there are algorithms that run in
>> > > time O(N^(1/2 + epsilon)) or maybe it was 1/3+epsilon, while
>> > > when people analyzed yours my recollection is it came out to
>> > > O(N log(N)) or worse. And as several people pointed out, it's
>> > > very funny that you don't realize that an O(f(N)) algorithm
>> > > is never going to beat an O(g(N)) algorithm (where my _recollection_
>> > > is that f is N log(N) and g is a smaller power of N.)
>> >
>> > However, the record shows that I kept improving those running times,
>> > which seemed to impress at least some people who actually posted about
>> > it! And one of the major improvements was my latest posted program
>> > which was 100 times faster than my original.
>>
>>Improving an algorithm with O(N log(N)) complexity by a factor 100 does
>>not change the complexity. The only thing that changes is that the
>>cross-over-point where a O(N^(1/2 + epsilon)) is better increases.
>
>
> Not surprisingly, a poster couldn't resist trying to help Ullrich out
> though I requested that posters let Ullrich answer a question on his
> own!

Which he did. We are now trying to explain an answer which
you have completely misunderstood.

> And to you Winter, other posters have disagreed with you, so provide
> evidence to support your claim.

Really? I haven't seen those posts. Could you cite one and
what you think it says that disagrees with the above?

What we are talking about is often called "Big-O" notation.
Mathworld covers a couple different versions of this. One
relevant description is #1 on this page:

http://mathworld.wolfram.com/LandauSymbols.html

> So back up your statements please.

Let's say the run time is f(N). The statement
"f(N) = O(N*log(N)) for N>10" means that f(N) < A*N*log(N)
for some constant A and all N>10. It means
we can draw a curve with an N*log(N) shape, and a scale factor
A, and f(N) will lie under that curve.

With algorithms, usually there's some large set up time
that dominates for small N. For instance, you might have
f(N) = 1000*N*log(N) + 5000.

We say this has "asymptotic" N*log(N) behavior because as
N increases, the first term becomes dominant. In terms
of big-O notation, you could choose A large enough
that 1000*N*log(N) + 5000 < A*N*log(N). The value of A
you would choose would be something larger than 1000.

An algorithm that is O(N) is better, asymptotically,
than an algorithm that is O(N*log(N)) because no matter
what constants a and b you choose, eventually
a*N will cross over and be considerably less than
b*N*log(N).

That's why we say reducing b by a factor of 100 while
not changing the N*log(N) shape does not affect the
O(N*log(N)) behavior. It's still N*log(N).

Another example is that if the runtime is f(N) = a*N^2 +
b*N + c, we say this is O(N^2), whether a = 1000000 or
a = 0.0000001. The constant multipliers don't matter.

- Randy
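[Editor's illustration: the constant-multiplier point in the post above can be checked numerically. A minimal sketch; the constants a and b here are made up for illustration.]

```python
import math

# f(N) = a*N (linear) vs g(N) = b*N*log(N): for ANY positive constants,
# the linear algorithm eventually wins, because log(N) grows without bound.
def crossover(a, b):
    """Smallest power of 10 at which a*N < b*N*log(N)."""
    for exp in range(1, 100):
        n = 10 ** exp
        if a * n < b * n * math.log(n):
            return n
    return None

# A huge linear constant vs a modest N*log(N) constant only pushes
# the crossover out; it never removes it.
print(crossover(a=1, b=1))         # crosses almost immediately, at N = 10
print(crossover(a=1000, b=100))    # crosses later, at N = 100000
```
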

Christian Bau

unread,
Dec 10, 2002, 12:52:18 PM12/10/02
to
In article <3c65f87.02121...@posting.google.com>,
jst...@msn.com (James Harris) wrote:

> David Kastrup <David....@t-online.de> wrote in message
> news:<x5u1hmz...@lola.goethe.zz>...
> > jst...@msn.com (James Harris) writes:
> >
> > > However, the record shows that I kept improving those running times,
> > > which seemed to impress at least some people who actually posted about
> > > it! And one of the major improvements was my latest posted program
> > > which was 100 times faster than my original. And yes Ullrich, before
> > > you try some smart aleck reply, it also had better running time.
> >
> > 100 times is nothing when you are toy-testing an algorithm intended
> > for numbers like 10^20 which has worse than linear complexity.
>
> The asymptotic running time improved as well.

Bullshit. I proved that as long as your algorithm doesn't avoid the
utter stupidity of checking whether certain numbers are prime by
calculating pi (n+1) - pi (n-1), you can't get much better than O (N).

Brian Quincy Hutchings

unread,
Dec 10, 2002, 1:30:56 PM12/10/02
to
I keep asking for a precis of your method,
mister Harris. in particular,
do you "combine" two of the variables into some unified form?

"Dik T. Winter" <Dik.W...@cwi.nl> wrote in message news:<H6vyz...@cwi.nl>...

> So your intention is:
> S(x, y) = sum{i = 2 to min(sqrt(x),y)} dS(x, i)?

--A church-school McCrusade (Blair's ideals):
Harry-the-Mad-Potter wants US to kill Iraqis,
so does Usama's MacJihad:
"HEY, GEORGE; LET'S YOU & SADDAM FIGHT" -Dame Maggie
http://www.tarpley.net/bush25.htm ("Thyroid Storm" ch.)

http://www.rwgrayprojects.com/synergetics/plates/plates.html


David C. Ullrich

unread,
Dec 10, 2002, 6:21:46 PM12/10/02
to
On 10 Dec 2002 14:09:05 -0800, jst...@msn.com (James Harris) wrote:

>"Dik T. Winter" <Dik.W...@cwi.nl> wrote in message news:<H6wv...@cwi.nl>...


>> In article <3c65f87.02121...@posting.google.com> jst...@msn.com (James Harris) writes:
>> > "Dik T. Winter" <Dik.W...@cwi.nl> wrote in message news:<H6vz7...@cwi.nl>...
>> ...
>> > > Improving an algorithm with O(N log(N)) complexity by a factor 100 does
>> > > not change the complexity. The only thing that changes is that the
>> > > cross-over-point where a O(N^(1/2 + epsilon)) is better increases.
>> >
>> > Not surprisingly, a poster couldn't resist trying to help Ullrich out
>> > though I requested that posters let Ullrich answer a question on his
>> > own!
>>
>> Tsk.
>>
>> > And to you Winter, other posters have disagreed with you, so provide
>> > evidence to support your claim.
>>
>> Which poster has disagreed with me? If follows from definitions in
>> complexity theory. An algorithm is O(f(N)) if there is a constant C
>> such that the time the algorithm takes is asymptotically C.f(N).
>> Note that it is *independent* of the value of C.
>

>What is the basis for your claims about the asymptotic running times?
>
>others in posts have disagreed with your claims.

Exactly when did anyone disagree with the "claim" that if you
improve an O(N log(N)) algorithm by a factor of 100 it's still
an O(N log(N)) algorithm?

>What is your evidence? How did you come up with your figures?
>
>It's not a complicated question.
>
>And note, the asymptotic running times of my programs improved as well.


Maxim Stepin

unread,
Dec 11, 2002, 12:06:21 AM12/11/02
to

"James Harris" <jst...@msn.com> wrote in message
news:3c65f87.02121...@posting.google.com...

> Now then, what is the asymptotic running time of my latest posted
> program for counting primes?

Yes, what is the asymptotic running time of your latest posted
program for counting primes, James?

is it O( N*N ) ?
is it O( N*ln(N) ) ?
is it O( N ) ?
is it O( sqrt(N) ) ?

Just curious...


Christian Bau

unread,
Dec 11, 2002, 4:14:13 AM12/11/02
to
In article <3c65f87.02121...@posting.google.com>,
jst...@msn.com (James Harris) wrote:

> Christian Bau <christ...@freeserve.co.uk> wrote in message
> news:<christian.bau-A0D...@slb-newsm1.svr.pol.co.uk>...

> And anyone looking at replies to me in this thread can see different
> values being given for the asymptotic running times.

You have never given any asymptotic running times. You are probably too
stupid and uneducated to know an asymptotic running time when it bites
you in your arse.

> So why are these people attacking me when they're falling over each
> other?

Because they are disgusted at your stupidity.

> Well, just imagine that you were arguing with people about the
> correctness of a short proof of Fermat's Last Theorem you discovered,
> as they worked *very* hard to convince other people that you were
> wrong.
>
> And while you were arguing you thought of something interesting and
> three weeks later were talking about your *new* discovery which
> happened to be a fascinating prime counting function.

"Discovery"? You just repeated Legendre's formula of 200 years ago. You
never managed to demonstrate how your formula was different. Tell me,
from which book did you copy your "discovery"?

> Then there's this Bau fellow here, who seems to use a lot of swear
> words, and is remarkably critical.

Some others are remarkably patient with idiots. I can't stand idiots.
And you fall squarely into that category.

> Then you have David Kastrup who replied in this thread as well, who
> has said some, um, very interesting things (Google search with him as
> author using the words "race" and "breed") and who also rarely
> actually puts math in his posts.

> A more subdued character is Winter who parrots words from a guy named
> Magidin, who keeps claiming to have proven all sorts of things with
> "Galois Theory", and Winter seems to echo his words a lot, often
> praising him for what he calls the "Magidin-McKinnon Theorem".

Then there is this disgusting little shit named James Harris who has
never posted any maths that wasn't copied from somewhere else, who at
the same time uses the "Magidin-McKinnon Theorem" in his laughable
attempt at an FLT proof _and_ claims that it is false _and_ trivial (are
you schizophrenic by any chance?)

David Kastrup

unread,
Dec 11, 2002, 4:32:36 AM12/11/02
to
"Maxim Stepin" <maxim...@worldnet.att.net> writes:

He wouldn't be asking if he knew it. He calls Ullrich "lazy" because
he does not go and derive the complexity for him. The same attribute
would not hold for himself since he as a professional programmer is
simply lacking the potential for doing such analysis.

James: there is an absolutely fascinating and (in contrast to the
usual mathematician-cliquiness) very accessible book about all the
math surrounding the analysis of algorithms and some more that is
called "Concrete Mathematics" by Graham, Knuth, Patashnik. It is
enjoyable to read, easily accessible, though of course it requires
work when you delve into the more complicated sections. But it is
one of the few books which is not dense or frightening even while it
reaches a high level of stuff. It even contains some number theory.
I'd recommend making an exception to your dislike of reading and at
least check it out at a library. It will also contain quite a bit of
asymptotics and other stuff needed for complexity analysis.

David C. Ullrich

unread,
Dec 11, 2002, 9:44:46 AM12/11/02
to
On 10 Dec 2002 19:05:09 -0800, jst...@msn.com (James Harris) wrote:

>David C. Ullrich <ull...@math.okstate.edu> wrote in message news:<9ttcvu492tkqqc17a...@4ax.com>...


>> On 10 Dec 2002 14:09:05 -0800, jst...@msn.com (James Harris) wrote:
>

><deleted>


>
>> >What is the basis for your claims about the asymptotic running times?
>> >
>> >others in posts have disagreed with your claims.
>>
>> Exactly when did anyone disagree with the "claim" that if you
>> improve an O(N log(N)) algorithm by a factor of 100 it's still
>> an O(N log(N)) algorithm?
>

>See Ullrich your problem is that you're lazy.
>
>Now because you're lazy you routinely rely on other people's post (and
>were stupid enough to admit it in your own posts).
>
>And as laziness often doesn't play when it comes to math, you get
>caught.

Um, this doesn't _quite_ answer the question. Exactly when did

anyone disagree with the "claim" that if you improve an
O(N log(N)) algorithm by a factor of 100 it's still an O(N log(N))
algorithm?

>Now then, what is the asymptotic running time of my latest posted
>program for counting primes?
>

>> >What is your evidence? How did you come up with your figures?
>> >
>> >It's not a complicated question.
>> >
>> >And note, the asymptotic running times of my programs improved as well.
>> >
>> >
>> >James Harris
>>
>>
>> David C. Ullrich
>

>Now people can see your name at the bottom past two question I'm still
>asking, so they can assume you saw those questions.
>
>Show what you know.

Why don't _you_ show us what _you_ know? I've never claimed to
be one of the world's greatest anythings.

It's kind of funny, you complaining about my laziness in relying
on others and at the same time asking me to do your homework
for you...

David C. Ullrich

unread,
Dec 11, 2002, 9:50:43 AM12/11/02
to
On 10 Dec 2002 17:06:43 -0800, jst...@msn.com (James Harris) wrote:

>And anyone looking at replies to me in this thread can see different
>values being given for the asymptotic running times.

They _can_? I don't recall anything like that.

I said that I _thought_ I recalled someone saying that it's N log(N).
That's not a value "given", that's something I said I _thought_ I
recalled. And I don't recall anyone else saying anything about it.
(Um, I think it was Bau who said "not better than O(N)". That's
perfectly consistent with what I say I think I recall.)

>So why are these people attacking me when they're falling over each
>other?
>

>Well, just imagine that you were arguing with people about the
>correctness of a short proof of Fermat's Last Theorem you discovered,
>as they worked *very* hard to convince other people that you were
>wrong.
>
>And while you were arguing you thought of something interesting and
>three weeks later were talking about your *new* discovery which
>happened to be a fascinating prime counting function.
>

>Now then, do you think suddenly those people who were arguing with you
>would start cheering?
>
>If so, then I give you the replies in this post, as you see Ullrich, a
>real live math professor, who seems to have a full time job replying
>to my posts, though he just about never has a lick of mathematics in
>his replies.


>
>Then there's this Bau fellow here, who seems to use a lot of swear
>words, and is remarkably critical.
>

>Then you have David Kastrup who replied in this thread as well, who
>has said some, um, very interesting things (Google search with him as
>author using the words "race" and "breed") and who also rarely
>actually puts math in his posts.
>
>A more subdued character is Winter who parrots words from a guy named
>Magidin, who keeps claiming to have proven all sorts of things with
>"Galois Theory", and Winter seems to echo his words a lot, often
>praising him for what he calls the "Magidin-McKinnon Theorem".
>

>Some cast of characters, eh?
>
>And that folks is what you get, if you happen to be musing about math,
>and stumble upon something cool, and are brave enough to talk about it
>on Usenet and the Internet.
>
>Think it's better if you talk to people who don't live on Usenet like
>these characters?
>
>Well I did. Didn't matter. At least they were *nicer* though.

They were nicer because you hadn't got around to abusing them
the way you've abused us for years.

The idea that niceness has something to do with mathematical
truth is curious. You've tried sci.math, nobody thinks the Proof
is right (and for good reason, since people have pointed out
exactly what's wrong). You try talking to people other than
on sci.math, and they don't think it's right either. I wonder
if there's an explanation for this... maybe it's because it's
not right?

No, that can't be it.

>Mathematicians on Usenet, like Ullrich and Magidin, with their
>sycophants like Bau, Winter, and Kastrup are a fascinating group.
>
>Sometimes I feel like I'm dealing with vampires and their familiars.

Randy Poe

unread,
Dec 11, 2002, 11:42:33 AM12/11/02
to
David C. Ullrich wrote:
> On 10 Dec 2002 17:06:43 -0800, jst...@msn.com (James Harris) wrote:
>>And anyone looking at replies to me in this thread can see different
>>values being given for the asymptotic running times.
>
> They _can_? I don't recall anything like that.
>
> I said that I _thought_ I recalled someone saying that it's N log(N).

That is one estimate. There's another estimate I've seen that
it may still be O(N^2).

> That's not a value "given", that's something I said I _thought_ I
> recalled. And I don't recall anyone else saying anything about it.
> (Um, I think it was Bau who said "not better than O(N)".

All estimates stated agree with that. 100%. No disagreement.

And to the best of my knowledge, the number of estimates
from James is zero. Nil. Nada. Zilch.

- Randy

Christian Bau

unread,
Dec 11, 2002, 12:48:23 PM12/11/02
to
In article <at7pt...@enews3.newsguy.com>, Randy Poe <rp...@nospam.com>
wrote:

> David C. Ullrich wrote:
> > On 10 Dec 2002 17:06:43 -0800, jst...@msn.com (James Harris) wrote:
> >>And anyone looking at replies to me in this thread can see different
> >>values being given for the asymptotic running times.
> >
> > They _can_? I don't recall anything like that.
> >
> > I said that I _thought_ I recalled someone saying that it's N log(N).
>
> That is one estimate. There's another estimate I've seen that
> it may still be O(N^2).

I am quite sure that at some point he posted a version that was
significantly worse than O (N). His later versions aren't. On the other
hand, straightforward sieve eliminating multiples of primes achieves
that.

> > That's not a value "given", that's something I said I _thought_ I
> > recalled. And I don't recall anyone else saying anything about it.
> > (Um, I think it was Bau who said "not better than O(N)".

One problem that will keep him from getting as good as O (N^alpha) for
any alpha < 1 is the fact that he still checks whether an odd number c
is prime by calculating pi (c+1) - pi (c-1). If this is done for a
substantial portion of all values c < sqrt (N) then one can show that
execution time is not O (N^alpha), alpha < 1. If this is done for too
many values then execution time will be O (N^beta) for beta > 1; I
believe this may have happened in his first attempt.

However, the rest of his algorithm is at least O (N) anyway; as usual
there might be an additional factor for very large N because operations
on very large integers cannot be done in constant time, but this seems
irrelevant because his code won't go over N = 2^63 anyway.
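[Editor's illustration: the "straightforward sieve eliminating multiples of primes" mentioned above, which counts primes up to N in O(N log log N) operations (essentially linear), can be sketched as follows. This is the standard sieve of Eratosthenes, not Harris's posted program.]

```python
def pi_sieve(n):
    """Count primes <= n with a sieve of Eratosthenes.

    Total work is the sum over primes p <= sqrt(n) of n/p, which is
    O(n log log n) -- essentially linear, with no per-number
    pi(c+1) - pi(c-1) primality checks.
    """
    if n < 2:
        return 0
    is_prime = bytearray([1]) * (n + 1)
    is_prime[0] = is_prime[1] = 0
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            # Cross off every multiple of p starting at p*p.
            for m in range(p * p, n + 1, p):
                is_prime[m] = 0
    return sum(is_prime)

print(pi_sieve(100))   # there are 25 primes below 100
```
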

James Harris

unread,
Dec 11, 2002, 3:39:14 PM12/11/02
to
David C. Ullrich <ull...@math.okstate.edu> wrote in message news:<t3kevu8u60sn740qj...@4ax.com>...

<deleted>

And yet again Ullrich demonstrates that he hasn't a clue what he's
talking about, but hey, he's a math professor, so it doesn't matter,
right?

What should at least be clear now is that if he's not trying what he
probably thinks are smart ways to interject racial slurs into the
discussion Ullrich doesn't have a clue what's going on.

Now then, if some of you are starting to think there *may* be
something to my prime counting work, and while you're VERY skeptical,
there might just be *something* interesting with my FLT work, why do
you suppose Ullrich is still making his posts?

I suggest to you that he believes that my being black is relevant to
those questions, so he keeps posting believing racial stereotypes
protect him.

That is, if his firm belief is that a black person could never produce
math of the value I claim then he must also believe he's completely
safe from repercussions for his behavior.

And don't worry, if he ever has to be responsible for what he's said,
I'm sure he'll come begging to many of you, of a particular skin
color, for help.


James Harris


David C. Ullrich

unread,
Dec 11, 2002, 5:04:21 PM12/11/02
to
On 11 Dec 2002 13:53:49 -0800, jst...@msn.com (James Harris) wrote:

>"Maxim Stepin" <maxim...@worldnet.att.net> wrote in message news:<at6h28$10su19$1...@ID-57378.news.dfncis.de>...

>Hmmm...well it's not a big deal to me, but I haven't made a big deal
>out of it.
>
>You have to remember that I'm the guy who discovered a way to count
>primes that isn't in the books.

Not in the books, right. You're proud of "having contact" with "top
mathematicians", but you always seem to forget that thing that that
"top mathematician" told you about 20% of the grad students in the
field coming up with something similar.

>So after I do this I expect to get famous for it, but instead I get
>all of this abuse on the newsgroup!!!
>
>Now a lot of that abuse centers around asymptotic running times, and I
>noticed that my own research into algorithms based on my prime
>counting function, showed improvement in those.

And if you had something with _better_ running time than well-known
methods that would indeed be significant. So why don't you answer
the question people have been asking? What _is_ the asymptotic running
time of your algorithm? I mean it's your algorithm, and you're one
of the top number theorists on the planet, so surely this question
can't be too hard for you...

>Now we get Ullrich, Winter, Kastrup, and Bau falling over themselves
>trying to diminish my discovery, and basing a lot of their attacks on
>these asymptotic running times, so I ask little questions about what
>they know, and how they know.

Yeah. Hoping someone will slip and answer the question people
keep asking you, so you can stop looking like you don't know the
answer.

>I'd rather be signing autographs or something though.

David C. Ullrich

unread,
Dec 11, 2002, 5:05:25 PM12/11/02
to
On Wed, 11 Dec 2002 11:42:33 -0500, Randy Poe <rp...@nospam.com> wrote:

>David C. Ullrich wrote:
>> On 10 Dec 2002 17:06:43 -0800, jst...@msn.com (James Harris) wrote:
>>>And anyone looking at replies to me in this thread can see different
>>>values being given for the asymptotic running times.
>>
>> They _can_? I don't recall anything like that.
>>
>> I said that I _thought_ I recalled someone saying that it's N log(N).
>
>That is one estimate. There's another estimate I've seen that
>it may still be O(N^2).

Was that in this thread?

>> That's not a value "given", that's something I said I _thought_ I
>> recalled. And I don't recall anyone else saying anything about it.
>> (Um, I think it was Bau who said "not better than O(N)".
>
>All estimates stated agree with that. 100%. No disagreement.
>
>And to the best of my knowledge, the number of estimates
>from James is zero. Nil. Nada. Zilch.
>
> - Randy


David C. Ullrich

Dann Corbit

unread,
Dec 11, 2002, 5:34:51 PM12/11/02
to
"James Harris" <jst...@msn.com> wrote in message
news:3c65f87.02121...@posting.google.com...
> "Maxim Stepin" <maxim...@worldnet.att.net> wrote in message
news:<at6h28$10su19$1...@ID-57378.news.dfncis.de>...
> Hmmm...well it's not a big deal to me, but I haven't made a big deal
> out of it.
>
> You have to remember that I'm the guy who discovered a way to count
> primes that isn't in the books.

It's clearly a reformulation of known methods. I will admit it was
interesting.

> So after I do this I expect to get famous for it,

Do you know the name of the man who invented the Radix sort? (It's
Hollerith)
Do you know the name of the men who invented the technique of the double
exponential numerical integration transform? (it's Takahasi and Mori)
Do you know the name of the man who invented quicksort? (It's Hoare) Or the
one who dramatically improved it (it's Singleton)

The reason I mention these, is that they are some of the most important
algorithms in computer science. I suspect that you have never even heard of
them. If (therefore) these men invented fundamental algorithms of
incredible commercial and scientific importance and are hardly known by the
average person, and not well known even among those in the field...
Then how would you expect to become famous for reinventing a technique for
finding primes that is not even very efficient in comparison to some of the
really good methods.

> but instead I get
> all of this abuse on the newsgroup!!!

The abuse you receive has nothing to do with your accomplishments and
everything to do with your attitude.

> Now a lot of that abuse centers around asymptotic running times, and I
> noticed that my own research into algorithms based on my prime
> counting function, showed improvement in those.

You don't understand what asymptotic complexity is.

> Now we get Ullrich, Winter, Kastrup, and Bau falling over themselves
> trying to diminish my discovery, and basing a lot of their attacks on
> these asymptotic running times, so I ask little questions about what
> they know, and how they know.

They have explained the difficulty clearly, so that even someone such as
myself (who only has a Bachelor's degree) can easily understand exactly what
the problem is.

> I'd rather be signing autographs or something though.

Don't forget the women and buckets of money. They're right around the
corner in 'fairy land.'
--
C-FAQ: http://www.eskimo.com/~scs/C-faq/top.html
"The C-FAQ Book" ISBN 0-201-84519-9
C.A.P. FAQ: ftp://cap.connx.com/pub/Chess%20Analysis%20Project%20FAQ.htm

Maxim Stepin

unread,
Dec 11, 2002, 6:07:24 PM12/11/02
to

"James Harris" <jst...@msn.com> wrote in message
news:3c65f87.02121...@posting.google.com...
> "Maxim Stepin" <maxim...@worldnet.att.net> wrote in message
news:<at6h28$10su19$1...@ID-57378.news.dfncis.de>...
> Hmmm...well it's not a big deal to me, but I haven't made a big deal
> out of it.
>
> You have to remember that I'm the guy who discovered a way to count
> primes that isn't in the books.

There may be hundreds of ways to count primes out there - in the books and "out of the books".
The 999th method of counting primes is not very interesting, unless it's BETTER than any known methods.
"Asymptotic running time" IS a big deal, because it's a practical criterion for "BETTER".

So... do you have the answer or not, James?

is it O( N*N ) ?
is it O( N*ln(N) ) ?
is it O( N ) ?
is it O( sqrt(N) ) ?


> So after I do this I expect to get famous for it, but instead I get


> all of this abuse on the newsgroup!!!

None of my business.
I'm here to discuss only math, nothing else.

James Harris

unread,
Dec 11, 2002, 7:16:30 PM12/11/02
to
Christian Bau <christ...@freeserve.co.uk> wrote in message news:<christian.bau-C73...@slb-newsm1.svr.pol.co.uk>...

> In article <at7pt...@enews3.newsguy.com>, Randy Poe <rp...@nospam.com>
> wrote:
>
> > David C. Ullrich wrote:
> > > On 10 Dec 2002 17:06:43 -0800, jst...@msn.com (James Harris) wrote:
> > >>And anyone looking at replies to me in this thread can see different
> > >>values being given for the asymptotic running times.
> > >
> > > They _can_? I don't recall anything like that.
> > >
> > > I said that I _thought_ I recalled someone saying that it's N log(N).
> >
> > That is one estimate. There's another estimate I've seen that
> > it may still be O(N^2).
>
> I am quite sure that at some point he posted a version that was
> significantly worse than O (N). His later versions aren't. On the other
> hand, straightforward sieve eliminating multiples of primes achieves
> that.

Yet I pointed out that it's my fastest *posted* program, so that
reasonable people would realize that you or Ullrich could have
actually checked if you cared about the truth.

But I'm the guy who made an exciting discovery and got abuse from
people like you for my troubles.

> > > That's not a value "given", that's something I said I _thought_ I
> > > recalled. And I don't recall anyone else saying anything about it.
> > > (Um, I think it was Bau who said "not better than O(N)".
>
> One problem that will keep him from getting as good as O (N^alpha) for
> any alpha < 1 is the fact that he still checks whether an odd number c
> is prime by calculating pi (c+1) - pi (c-1). If this is done for a
> substantial portion of all values c < sqrt (N) then one can show that
> execution time is not O (N^alpha), alpha < 1. If this is done for too
> many values then execution time will be O (N^beta) for beta > 1; I
> believe this may have happened in his first attempt.

That's not relevant to my fastest *posted* program, but it's not like
you or Winter or Ullrich care about the truth, is it?

You just want to bash a discoverer and the newsgroup format lets you
get away with it.

> However, the rest of his algorithm is at least O (N) anyway; as usual
> there might be an additional factor for very large N because operations
> on very large integers cannot be done in constant time, but this seems
> irrelevant because his code won't go over N = 2^63 anyway.

And why should anyone believe you when you so clearly care nothing for
the truth?

It seems to me that you really just want to attack me, and you started
attacking me seriously when I made my discovery, as a Google search
can verify for those curious.

> > All estimates stated agree with that. 100%. No disagreement.
> >
> > And to the best of my knowledge, the number of estimates
> > from James is zero. Nil. Nada. Zilch.

I'm a discoverer. In our modern high tech world I find that Usenet
and the Internet seem very effective at getting discoverers verbally
abused.

Those who wonder should consider that there isn't a doubt that I
discovered a way to count primes that you can't find in any textbook
or math reference that isn't connected to me, but I get abuse, and
when I ask these abusive people to provide facts to back up their
assertions, you see what happens.

Now David Ullrich is an actual math professor at a state university,
so if you're an American, your tax dollars help pay his salary.

And I hate to put something like that out but I'm tired. I *should*
be famous with some kind of book deal, but I have to get past these
people first.

My motivation is clear as I'd like some recognition for my
discoveries.

Now ask yourself, what do any of these people get if I get proper
recognition for *my* discovery?

Well they can look back on their fun times bashing me, some possibly
at your expense.


James Harris

Christian Bau

unread,
Dec 11, 2002, 7:27:35 PM12/11/02
to
In article <3c65f87.02121...@posting.google.com>,
jst...@msn.com (James Harris) wrote:

> "Maxim Stepin" <maxim...@worldnet.att.net> wrote in message
> news:<at6h28$10su19$1...@ID-57378.news.dfncis.de>...

> Hmmm...well it's not a big deal to me, but I haven't made a big deal
> out of it.
>
> You have to remember that I'm the guy who discovered a way to count
> primes that isn't in the books.
>

> So after I do this I expect to get famous for it, but instead I get
> all of this abuse on the newsgroup!!!
>

> Now a lot of that abuse centers around asymptotic running times, and I
> noticed that my own research into algorithms based on my prime
> counting function, showed improvement in those.
>

> Now we get Ullrich, Winter, Kastrup, and Bau falling over themselves
> trying to diminish my discovery, and basing a lot of their attacks on
> these asymptotic running times, so I ask little questions about what
> they know, and how they know.
>

> I'd rather be signing autographs or something though.

Another post from James Harris without any mathematical content. Are you
too lazy or too stupid to analyse your own copy of Legendre's method, or
is it both?

John

Dec 11, 2002, 8:31:08 PM12/11/02
to
jst...@msn.com (James Harris) wrote in message news:<3c65f87.02121...@posting.google.com>...
> Christian Bau <christ...@freeserve.co.uk> wrote in message news:<christian.bau-A0D...@slb-newsm1.svr.pol.co.uk>...

> > In article <3c65f87.02121...@posting.google.com>,
> > jst...@msn.com (James Harris) wrote:
> >

> If so, then I give you the replies in this post, as you see Ullrich, a
> real live math professor, who seems to have a full time job replying
> to my posts, though he just about never has a lick of mathematics in
> his replies.

Ullrich snipped this portion of your post. Have you any idea why?

--John

Dann Corbit

Dec 11, 2002, 9:32:27 PM12/11/02
to

James Harris

Dec 11, 2002, 11:05:18 PM12/11/02
to
"Maxim Stepin" <maxim...@worldnet.att.net> wrote in message news:<at8gd6$1107uj$1...@ID-57378.news.dfncis.de>...

> "James Harris" <jst...@msn.com> wrote in message
> news:3c65f87.02121...@posting.google.com...
> > "Maxim Stepin" <maxim...@worldnet.att.net> wrote in message
> news:<at6h28$10su19$1...@ID-57378.news.dfncis.de>...
> > > "James Harris" <jst...@msn.com> wrote in message
> > > news:3c65f87.02121...@posting.google.com...
> > >
> > > > Now then, what is the asymptotic running time of my latest posted
> > > > program for counting primes?
> > >
> > > Yes, what is the asymptotic running time of your latest posted
> > > program for counting primes, James?
> > >
> > > is it O( N*N ) ?
> > > is it O( N*ln(N) ) ?
> > > is it O( N ) ?
> > > is it O( sqrt(N) ) ?
> > >
> > > Just curious...
> >
> > Hmmm...well it's not a big deal to me, but I haven't made a big deal
> > out of it.
> >
> > You have to remember that I'm the guy who discovered a way to count
> > primes that isn't in the books.
>
> There may be hundreds of ways to count primes out there - in the books and "out of the books".
> A 999th method of counting primes is not very interesting, unless it's BETTER than any known methods.
> "Asymptotic running time" IS a big deal, because it's a practical criterion for "BETTER".
>

Really? Then why don't you name *one* that doesn't depend on your
having a list of primes up to sqrt(N)?

Show what you know.

And I know of only one besides my own, and besides that which follows
from the Riemann Hypothesis.

That one is a brute force method that's not elegant at all, and it
doesn't lead to anything that in any way looks like a differential
equation.

It seems to me that people like you are very anxious to try and prove
a falsehood that, who knows?, might have worldwide implications
depending on how important a new discovery in the world of primes is.

But then, what do you care?

On the newsgroup you get to feel important attacking me.

You'd get nothing but satisfaction for love of the truth by telling
the truth here.

> So... do you have the answer or not, James?
>
> is it O( N*N ) ?
> is it O( N*ln(N) ) ?
> is it O( N ) ?
> is it O( sqrt(N) ) ?

You answer mine first. As the discoverer, hounded by people who seem
very intent on trashing me, I find myself less than interested in
jumping through hoops at the drop of a hat.

I'm one guy.

There's almost half a dozen of you in this thread attacking me and my
work.

Why don't you give one of those "hundreds of ways to count primes out
there"?

> > So after I do this I expect to get famous for it, but instead I get
> > all of this abuse on the newsgroup!!!
>
> None of my business.
> I'm here to discuss only math, nothing else.

Then why don't you actually put in some math?

So far in all the posts you've made I've seen very little of that, as
it looks like you have an agenda, but when I specifically asked you in
one of my replies about that, you replied back that you didn't.

The newsgroup format makes it easy to lie and never expect to be held
accountable, eh?


James Harris

Maxim Stepin

Dec 12, 2002, 1:29:45 AM12/12/02
to

Since I don't claim anything, I don't have to show anything.
This is your show.

> [skip]

> > So... do you have the answer or not, James?
> >
> > is it O( N*N ) ?
> > is it O( N*ln(N) ) ?
> > is it O( N ) ?
> > is it O( sqrt(N) ) ?
>
> You answer mine first. As the discoverer, hounded by people who seem
> very intent on trashing me, I find myself less than interested in
> jumping through hoops at the drop of a hat.

Since it's your method, you should know it better than I do.
Are you really not interested in finding the answer?

> There's almost half a dozen of you in this thread attacking me and my
> work.

> Why don't you give one of those "hundreds of ways to count primes out
> there"?

I don't have to. I don't claim anything not obvious.

> > > So after I do this I expect to get famous for it, but instead I get
> > > all of this abuse on the newsgroup!!!
> >
> > None of my business.
> > I'm here to discuss only math, nothing else.
>
> Then why don't you actually put in some math?

If I claimed something, I'd certainly put in some math.

> So far in all the posts you've made I've seen very little of that, as
> it looks like you have an agenda, but when I specifically asked you in
> one of my replies about that, you replied back that you didn't.
>
> The newsgroup format makes it easy to lie and never expect to be held
> accountable, eh?

I repeat - I don't have a hidden agenda.
I'm just curious to find the answer to that question.
You seem like the best person to ask - being the author of the method...


David Kastrup

Dec 12, 2002, 1:56:41 AM12/12/02
to
john_...@yahoo.com (John) writes:

It is common courtesy when quoting to omit stuff not pertaining to
the point one is making. For those that want the full story,
references to the full text are given in the header and often in the
article. It is called Usenet. Take a look at it some time.

Randy Poe

Dec 12, 2002, 12:21:16 PM12/12/02
to
James Harris wrote:
> "Maxim Stepin" <maxim...@worldnet.att.net> wrote in message news:<at6h28$10su19$1...@ID-57378.news.dfncis.de>...
>
> Hmmm...well it's not a big deal to me, but I haven't made a big deal
> out of it.
>
> You have to remember that I'm the guy who discovered a way to count
> primes that isn't in the books.
>

Go to Mathworld's "Prime Counting Function" page. Look at the
references. You will see names and publication dates in the same
table that shows prime counting CPU complexity and storage efficiency.
Every single one of those references is a paper about a new
way to count primes that wasn't in the books. Guess what
those people did when they came up with their new way?

> So after I do this I expect to get famous for it, but instead I get
> all of this abuse on the newsgroup!!!

(Not mutually exclusive.)

Do you consider the people mentioned above "famous"? If so,
what do you think would be the path for you to achieve
the same fame? Have any of those people ever been in
this newsgroup? How did their names get to a page on
Mathworld? If they did and got ridiculed, would it detract
from whatever you might consider their fame?

- Randy

Randy Poe

Dec 12, 2002, 12:26:24 PM12/12/02
to
Maxim Stepin wrote:
> "James Harris" <jst...@msn.com> wrote in message
> news:3c65f87.02121...@posting.google.com...
> There may be hundreds of ways to count primes out there - in the books and "out of the books".
> A 999th method of counting primes is not very interesting, unless it's BETTER than any known methods.
> "Asymptotic running time" IS a big deal, because it's a practical criterion for "BETTER".

Even without that, if James could extend the record for
largest N for which pi(N) is known, he'd have a publishable
result and have his moment in the spotlight, a la the
_Guinness Book of World Records_.

Of course, if his algorithm is O(N) or worse, it might take
him a while to get to N>10^22...

And the dancing girls, autograph-signings and appearances
on Letterman for achieving this record might be a little
longer in coming.

- Randy

James Waldby

Dec 12, 2002, 1:59:31 PM12/12/02
to
James Harris wrote:
> Christian Bau ... wrote ...

> > ... Randy Poe ... wrote:
> > > David C. Ullrich wrote:
> > > > ... James Harris wrote:
> > > >>And anyone looking at replies to me in this thread can see different
> > > >>values being given for the asymptotic running times.
> > > >
> > > > They _can_? I don't recall anything like that.
> > > >
> > > > I said that I _thought_ I recalled someone saying that it's N log(N).
> > >
> > > That is one estimate. There's another estimate I've seen that
> > > it may still be O(N^2).
> >
> > I am quite sure that at some point he posted a version that was
> > significantly worse than O (N). His later versions aren't. On the other
> > hand, straightforward sieve eliminating multiples of primes achieves
> > that.
>
> Yet I pointed out that it's my fastest *posted* program,
...

You (Harris) seem to be talking about a different issue than is Bau.
Bau is discussing the complexity of your algorithm. Such discussion
is on-topic in sci.math. Talking about who knew what when is not
really on-topic except with regard to knowledge about your "fastest
posted program", which for brevity let us refer to as YFPP. YFPP
either is or is not the same as your fastest program (YFP). If YFPP
is not the same as YFP then only you can know what YFP's computational
complexity is, and you need to tell us its complexity, rather than us
tell you. On the other hand, if YFPP == YFP, then Bau's analysis would
show that YFP's complexity is "at least O(N)" [by which, Bau probably
means "definitely Omega(N)"].

> > However, the rest of his algorithm is at least O (N) anyway; as usual
> > there might be an additional factor for very large N because operations
> > on very large integers cannot be done in constant time, but this seems
> > irrelevant because his code won't go over N = 2^63 anyway.
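
For comparison with the quoted remark that a straightforward sieve eliminating multiples of primes already achieves roughly linear work: a plain sieve of Eratosthenes counts primes up to N in O(N log log N) time and O(N) space. The sketch below is mine, for illustration, and is not code from the thread:

```python
from math import isqrt

def count_primes_sieve(n):
    # Straightforward sieve of Eratosthenes: O(n log log n) time, O(n) space.
    if n < 2:
        return 0
    composite = bytearray(n + 1)  # composite[i] == 1 once i is crossed off
    for p in range(2, isqrt(n) + 1):
        if not composite[p]:
            # cross off p*p, p*p + p, ... up to n
            composite[p * p :: p] = b"\x01" * len(range(p * p, n + 1, p))
    return sum(1 for v in composite[2:] if not v)

print(count_primes_sieve(100))  # 25
```

Note that, exactly as discussed above, the sieve needs memory proportional to N, which is what stops simple implementations long before N = 2^63.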

The meaning of O, Theta, and Omega notation is explained fairly
well at http://www.cs.mcgill.ca/~cs251/OldCourses/1997/topic3/ .

Let us use the timings from Wilma Scranton's 11 July 2002 post,
http://groups.google.com/groups?selm=20020711083304.08825.00003836%40mb-mo.aol.com
(url all on one line) and empirically exhibit C's and n0's
for Omega and O bounds. Times are in milliseconds:

> pi( 100000000)= 5761455 Time: 5168
> pi( 200000000)=11078937 Time: 10276
> pi( 400000000)=21336326 Time: 20470
> pi( 800000000)=41146179 Time: 40841
> pi(1600000000)=79451833 Time: 82201

From this data, n0=0 and C1=51, C2=52 milliseconds/million are such
that C1*N < t(N) < C2*N. Also from this data, we would predict the
following minimum times on the computer that gave the above times, if
the program and computer were able to work at the N shown:
N time (ms) time
10^9 51000 51 seconds
10^12 51000000 14 hours
10^15 51*10^9 1.6 years
10^18 51*10^12 1600 years
10^21 51*10^15 16 millennia
10^24 51*10^18 16000 millennia

Do you agree that these calculations are correct?
-jiw
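
Waldby's constants and extrapolations can be reproduced mechanically. The sketch below (mine, not from the thread) refits the per-N rate from the quoted timings and regenerates the lower-bound table under the same linear, Omega(N), assumption:

```python
# Timings from the quoted post: (N, milliseconds)
data = [(100_000_000, 5168), (200_000_000, 10276), (400_000_000, 20470),
        (800_000_000, 40841), (1_600_000_000, 82201)]

# Cost per million units of N; a linear-time model predicts this is constant.
rates = [t / (n / 1_000_000) for n, t in data]  # all fall between 51 and 52

def lower_bound_seconds(n, c1=51.0):
    # Predicted minimum time assuming t(N) >= c1 * N, with c1 in ms/million.
    return c1 * (n / 1_000_000) / 1000.0

for exp in (9, 12, 15, 18, 21, 24):
    print(f"10^{exp}: {lower_bound_seconds(10 ** exp):,.0f} s")
```

The 10^12 row comes out at 51,000 seconds, i.e. the 14 hours in the table above.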

Message has been deleted

David Kastrup

Dec 13, 2002, 12:30:22 PM12/13/02
to
jst...@msn.com (James Harris) writes:

> My own research has shown both improvements in total time and in
> asymptotic running times, while half a dozen or so of you have
> attacked my prime research as if it were useless.

You have not given any numbers to support that contention, and
so far it is not even apparent that you know what "improvement
in asymptotic running time" would mean. And even without those
numbers, it's all just your claims. And when you did mention numbers,
those were far from impressive, anyway.

David Bernier

Dec 13, 2002, 12:43:42 PM12/13/02
to
James Harris wrote:
[...]

> Why do that when the program is *posted*, unless you hope to fool
> people?
[...]


I'd be happy to test drive your latest posted program.
I would only need a reference to the post which contains
your latest program.

David Bernier

James Waldby

Dec 13, 2002, 12:56:14 PM12/13/02
to
James Harris wrote:
> James Waldby... wrote ...

> > James Harris wrote:
> > > Christian Bau ... wrote ...
> > > > ... Randy Poe ... wrote:
> > > > > David C. Ullrich wrote:
> > > > > > ... James Harris wrote:
...

> > > Yet I pointed out that it's my fastest *posted* program,
> > ...
> >
> > You (Harris) seem to be talking about a different issue than is Bau.
> > Bau is discussing the complexity of your algorithm. Such discussion
> > is on-topic in sci.math. Talking about who knew what when is not
> > really on-topic except with regard to knowledge about your "fastest
> > posted program", which for brevity let us refer to as YFPP. YFPP
> > either is or is not the same as your fastest program (YFP). If YFPP
> > is not the same as YFP then only you can know what YFP's computational
> > complexity is, and you need to tell us its complexity, rather than us
> > tell you. On the other hand, if YFPP == YFP, then Bau's analysis would
> > show that YFP's complexity is "at least O(N)" [by which, Bau probably
> > means "definitely Omega(N)"].
>
> No, I'm defending my research from people trying to denigrate it.

>
> My own research has shown both improvements in total time and in
> asymptotic running times, while half a dozen or so of you have
> attacked my prime research as if it were useless.
>
> The trouble is that you haven't bothered with the pertinent facts.
>
> Like you've shown with your numbers in this post.

>
> > > > However, the rest of his algorithm is at least O (N) anyway; as usual
> > > > there might be an additional factor for very large N because operations
> > > > on very large integers cannot be done in constant time, but this seems
> > > > irrelevant because his code won't go over N = 2^63 anyway.
> >
> > The meaning of O, Theta, and Omega notation is explained fairly
> > well at http://www.cs.mcgill.ca/~cs251/OldCourses/1997/topic3/ .
>
> Condescension won't help you with the people who are about to see how
> badly you've treated their trust.

>
> > Let us use the timings from Wilma Scranton's 11 July 2002 post,
> > http://groups.google.com/groups?selm=20020711083304.08825.00003836%40mb-mo.aol.com
> > (url all on one line) and empirically exhibit C's and n0's
> > for Omega and O bounds. Times are in milliseconds:

...


> >
> > > pi( 100000000)= 5761455 Time: 5168
>

> I'm running the program on my PC now and it gives the answer in 0.06
> seconds.
>
> > > pi( 200000000)=11078937 Time: 10276
>
> It gives this one in 0.08 seconds.
>
> > > pi( 400000000)=21336326 Time: 20470
>
> This one comes out at 0.1 seconds.
>
> > > pi( 800000000)=41146179 Time: 40841
>
> Here it's 0.141 seconds.
>
> > > pi(1600000000)=79451833 Time: 82201
>
> And here it's 0.22 seconds.


>
> > From this data, n0=0 and C1=51, C2=52 milliseconds/million are such
> > that C1*N < t(N) < C2*N. Also from this data, we would predict the
> > following minimum times on the computer that gave the above times, if
> > the program and computer were able to work at the N shown:
> > N time (ms) time
> > 10^9 51000 51 seconds
> > 10^12 51000000 14 hours
> > 10^15 51*10^9 1.6 years
> > 10^18 51*10^12 1600 years
> > 10^21 51*10^15 16 millenia
> > 10^24 51*10^18 16000 millenia
> >
> > Do you agree that these calculations are correct?
> > -jiw

...

Ok, to summarize, the numbers you give are:
N Time (seconds)
100000000 .06
200000000 .08
400000000 .10
800000000 .14
1600000000 .22
which is a fairly small baseline, but for this sample
.13*N < t(N) <= .6*N, with t in seconds, and N in units
of billions. Rebuilding the chart from above we get
N time
10^9 .13 seconds
10^12 2.2 minutes
10^15 36 hours
10^18 4.1 years
10^21 4.1 millennia
10^24 4.1 billennia [1 bill.=1000 mill.]
where the times shown are lower bounds based on the
assumption that your program's complexity is Omega(N)
and C is .13 seconds/billion.


Do you agree that these calculations are correct?

Also, what times do you actually measure when you
calculate with N=10^12,15,18,21,and 24?
-jiw

Christian Bau

Dec 13, 2002, 1:28:18 PM12/13/02
to
In article <3DFA1F3E...@pat7.com>,
James Waldby <j-wa...@pat7.com> wrote:

> Ok, to summarize, the numbers you give are:
> N Time (seconds)
> 100000000 .06
> 200000000 .08
> 400000000 .10
> 800000000 .14
> 1600000000 .22
> which is a fairly small baseline, but for this sample
> .13*N < t(N) <= .6*N, with t in seconds, and N in units
> of billions.

Looks very much like 0.1*N + 0.06 to me; that would be a perfect match
if the first time was 0.07. Which makes it very linear. And I can see
why an untalented schmock like James Harris would think it is better
than O (N).
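
For what it's worth, the proposed affine model is easy to check against the five reported times (N in billions, t in seconds); and an affine t = a*N + b with a > 0 is still Theta(N), which is the point being made. The check below is my sketch, not Bau's:

```python
# James's reported times: (N in billions, t in seconds)
pts = [(0.1, 0.06), (0.2, 0.08), (0.4, 0.10), (0.8, 0.14), (1.6, 0.22)]

# Bau's proposed model: a constant startup cost plus a linear term.
def model(n):
    return 0.1 * n + 0.06

residuals = [t - model(n) for n, t in pts]
# Only the first point is off, by -0.01; the rest match up to float rounding.
print([round(r, 3) for r in residuals])
```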

Message has been deleted

Arturo Magidin

Dec 13, 2002, 7:26:25 PM12/13/02
to
In article <3c65f87.02121...@posting.google.com>,
James Harris <jst...@msn.com> wrote:

[.snip.]

>You have full professors in Ullrich and Magidin.

The term "full professor" usually is used as short hand to indicate a
position with tenure. If that's what you mean, you are misapplying it
when you use it for me.

======================================================================
"It's not denial. I'm just very selective about
what I accept as reality."
--- Calvin ("Calvin and Hobbes")
======================================================================

Arturo Magidin
mag...@math.berkeley.edu

James Waldby

Dec 13, 2002, 9:18:32 PM12/13/02
to
James Harris wrote:
> James Waldby ... wrote ...
> > James Harris wrote:
> > > James Waldby... wrote ...

...
> > Ok, to summarize, the numbers you give are:
>
> The "Ok" is a concession that you gave times spectacularly different
> from my fastest posted program, yet now you proceed as if that's
> meaningless.

No, it is no such concession. I agree your numbers are several
hundredfold smaller than Wilma's, but both sets of times show a
linear trend, which is what is relevant re order of complexity.

...
> Now you posted times apparently trying to convince others that what I
> have is useless, when I myself have pointed out that algorithms based
> on my prime counting function discovery have given me faster times and
> faster asymptotic running times as I proceeded, so what's your point
> now?
...
> Everything you've posted is clearly an attack.
>
> When caught in your first attempt, you simply backed up and came again
> as if nothing had happened and as if people should still trust you.
...

Your comments are wrong because I posted nothing in the nature
of an attack. You should apologize for making such uncalled-for
pejorative and paranoid remarks. I wish to have a calm and
reasoned discussion regarding the order of complexity of your
algorithm. Please see my remarks about that in other posts.
-jiw

James Harris

Dec 13, 2002, 9:41:51 PM12/13/02
to
jst...@msn.com (James Harris) wrote in message news:<3c65f87.02120...@posting.google.com>...

I'm adding in the prime counting function definition correction that
I've already posted, but I thought it worth it to include everything
from the original post as I think it helps explain some things if you
read it all.

I do hope you read my entire post here. JSH

> Some of you may honestly be confused about whether or not I've
> presented a short proof of Fermat's Last Theorem, whether or not
> there's a counter example or counter argument to my proof, and whether
> or not people have been caught in actual lies about the proof.
>
> Unfortunately, I'm quite certain that mathematicians have strong
> motivations not to tell you the truth. While I admit it's quite
> possible that many mathematicians have not heard of the short FLT
> Proof, I'm also very suspicious to the extent that I point out that
> they may simply believe that not saying anything is not only a potent
> weapon against anyone knowing the truth, but it might seem like the
> best defense for the future.
>
> So, if you're someone who prides themselves on knowing the actual
> truth versus political or social "truths" then you may wonder what
> hope you have here, especially if you don't know a lot of number
> theory.
>
> I suggest your common sense is a good start, along with a
> consideration of the facts NOT in dispute.
>
> You may wonder about relevance of some of these facts, but I've chosen
> them and their order quite deliberately.
>
> 1. I did find a way to count prime numbers that is not in any
> established reference, which is easily verifiable. It is also easy to
> check anyone claiming that is not true by asking them to provide the
> reference, and then simply looking.
>
> The prime counting function I found is
>
> dS(x,y) = (pi((x/y), y-1) - pi(y-1, sqrt(y-1)))[pi(y, sqrt(y)) -
> pi(y-1, sqrt(y-1))],
>
> S(x,1) = 0.
>
> And pi(x, y) = floor(x) - S(x, y) - 1, and you get S by summing dS
> from y = 2 to sqrt(x).

Correction:

And pi(x, y) = floor(x) - S(x, y) - 1, and you get S by summing dS
from dS(x,2) to dS(x,y).

> though
>
> dS(x,y) = (pi((x/y), sqrt(x/y)) - pi(y-1, sqrt(y-1)))[pi(y, sqrt(y))
> - pi(y-1, sqrt(y-1))]
>
> when sqrt(x/y) < y-1, makes for faster calculation.
>
> Here pi(x,sqrt(x)) gives the prime count for x.
>
> For instance pi(10,sqrt(10)) will give you 4, corresponding to 2, 3, 5
> and 7.
>
> Now it should be easy enough to see if that's in any reference, and
> please note there is NO dispute about whether or not it works.
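
The definition above is easy to transcribe into code. Here is a minimal memoized Python sketch of the recurrence as literally stated, with the later correction (sum dS from dS(x,2) to dS(x,y)) applied; it is an illustrative transcription of the posted formulas, not the posted Java program:

```python
from functools import lru_cache
from math import isqrt

@lru_cache(maxsize=None)
def S(x, y):
    # S(x, 1) = 0; otherwise S is the sum of dS(x, j) for j = 2 .. y
    # (the corrected definition above).
    return sum(dS(x, j) for j in range(2, y + 1))

def pi(x, y):
    # pi(x, y) = floor(x) - S(x, y) - 1
    return x - S(x, y) - 1

def dS(x, y):
    # The bracketed factor pi(y, sqrt(y)) - pi(y-1, sqrt(y-1)) is 1
    # exactly when y is prime, 0 otherwise.
    return ((pi(x // y, y - 1) - pi(y - 1, isqrt(y - 1)))
            * (pi(y, isqrt(y)) - pi(y - 1, isqrt(y - 1))))

def prime_count(x):
    # "Here pi(x, sqrt(x)) gives the prime count for x."
    return pi(x, isqrt(x))

print(prime_count(10))  # 4, matching the example: 2, 3, 5 and 7
```

This naive version is only meant for checking the recurrence at small x; it does not use the faster sqrt(x/y) variant and is far slower than the posted Java program.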
>
> 2. Most of the short FLT proof I've presented has been verified as
> being correct, including points that *had* been controversial. Here
> the important thing is to pay attention to those claiming some error
> has been found.
>
> For me that's an important point because I was a central figure in
> arguments that went on for MONTHS, and even when I was vindicated time
> and time again, these same people would keep acting as if they'd never
> been wrong!!!

And they're still at it.

> Verifying that is something that could take a bit of effort, so it's
> not as easy as the prime counting function.
>
> However, I did make it somewhat easy for you by separating out a key
> argument as "Area One" so you can just go to groups.google.com and do
> a search where I am the author on the sci.math newsgroup for the first
> mention of the phrase "Area One" and you will see me talking quite a
> while back about what I was going to do, and what it would show.
>
> Then you can see how long the arguing went on, and most importantly
> *who* I argued with.

I strongly suggest you do that, use advanced search, and do a search
by history going back to my first usage of "Area One" in posts, and
*please* notice who was arguing with me then, and what they were
arguing about.

> Of course, you may know that a "proof" can be mostly correct and still
> be wrong, but what's fascinating here is that certain people have
> clearly been caught arguing against results before they were
> independently shown to be proven, pausing, and then arguing against
> those *same* results later!
>
> If you don't believe it, be sure to read what I said when I first
> started talking about "Area One" in posts, and pay attention to what
> actually happened over a period of months.
>
> 3. Despite my prime counting function not being in any reference, the
> same people who are telling you that the short FLT Proof is wrong, are
> usually claiming that the prime counting work is unimportant, or in
> even more bizarre cases, claiming that I haven't presented anything
> new!
>
> Now I've contacted leading mathematicians by email, and for important
> reasons I'm keeping their names and the complete contents of those
> emails private, and what I've been told by them is not that it's not
> new, but that they don't think it's important.
>
> Also, they keep talking about fast prime counting, and what I have is
> not the fastest around, though I still believe that algorithms based
> on it can be.
>
> What's important here is for you to ask yourselves, how unimportant
> could a method for counting PRIME NUMBERS truly be such that
> mathematicians don't seem bothered with it NOT being published?
>
> Here the proper assessment, I strongly suggest, is that it is VERY
> important, which is why mathematicians, to my knowledge, aren't
> talking about it.
>
> Here what you believe probably depends on whether or not you're one of
> those "conspiracy people" who suppose that it is possible aliens
> landed at Roswell and the government covered it up, and those who
> think it's complete hogwash.
>
> Personally, since I mentioned it, I don't think aliens crash landed at
> Roswell, but I'm hoping some of you will smell the possibility of a
> conspiracy of silence here.
>
> Now I hope those three points are a start.
>
> Some of you may labor under the misapprehension that what is happening
> is strange historically, but it's not.
>
> You may have seen the life of Evariste Galois mentioned or read my
> post linking John Nash's schizophrenia to lack of proper recognition,
> based on an established psychological theory called the double bind
> theory, which you can easily enough look up on the web to see that I
> didn't make it up.
>
> Do you think that intelligent people--people intelligent enough to get
> doctorates in mathematics--really don't know the profound effect that
> simply *ignoring* a person's work can have on that person?
>
> Maybe those who repeatedly post in reply to me at least acknowledging
> the existence of my work are better than that shadowy majority of
> mathematicians who by now may know much about it, but realize that
> their most potent and vicious weapon--is silence. James Harris

David Bernier

Dec 13, 2002, 10:09:55 PM12/13/02
to

James Harris wrote:

[...]


> The program is posted. Test it for yourself.


I found PrimeCountH.java at
http://www.msnusers.com/AmateurMath/_whatsnew.msnw
(BTW, I had to use an MSN Passport to logon to the
page with the source code)

This is what I got:
--------------------------------------------

D:\wind\j2\bin>java PrimeCountH 1000000000000 [ 10^12 ]
Sieve Time: 641
m_max=1000001
pi(1000000000000)=37607912018 (CHECKED)

Total Time: 32277


D:\wind\j2\bin>java PrimeCountH 10000000000000 [ 10^13 ]
Sieve Time: 2083
m_max=3162278
pi(10000000000000)=346065536839 (CHECKED)

Total Time: 263108


D:\wind\j2\bin>java PrimeCountH 100000000000000 [ 10^14 ]
Sieve Time: 6890
m_max=10000001
pi(100000000000000)=3204941750802 (CHECKED)

Total Time: 2256184


D:\wind\j2\bin>java PrimeCountH 1000000000000000 [ 10^15 ]
Exception in thread "main" java.lang.OutOfMemoryError

--------------------------------------------------------------------

So it works up to 10^14 at least.

David Bernier

James Waldby

Dec 13, 2002, 10:30:30 PM12/13/02
to
James Harris wrote:
> James Waldby ... wrote ...
...

> > The meaning of O, Theta, and Omega notation is explained fairly
> > well at http://www.cs.mcgill.ca/~cs251/OldCourses/1997/topic3/ .
...
> > Ok, to summarize, the numbers you give are:
...

> > N Time (seconds)
> > 100000000 .06
> > 200000000 .08
> > 400000000 .10
> > 800000000 .14
> > 1600000000 .22
> > which is a fairly small baseline, but for this sample
> > .13*N < t(N) <= .6*N, with t in seconds, and N in units
> > of billions. Rebuilding the chart from above we get
> > N time
> > 10^9 .13 seconds
> > 10^12 2.2 minutes
>
> I checked this number, and found you way off in an over estimate.
>
> Given your attack behavior I feel less than interested in telling you
> the time I got except to say it was less than a minute.

>
> > 10^15 36 hours
> > 10^18 4.1 years
> > 10^21 4.1 millennia
> > 10^24 4.1 billennia [1 bill.=1000 mill.]

> > where the times shown are lower bounds based on the
> > assumption that your program's complexity is Omega(N)
> > and C is .13 seconds/billion.
> > Do you agree that these calculations are correct?
> >
> > Also, what times do you actually measure when you
> > calculate with N=10^12,15,18,21,and 24?
> > -jiw
...

The over-estimate implies that the value of C that I used,
.13 seconds/billion, is too large. Your comment of "less
than a minute" at 10^12, vs. the 2.2 minute lower-bound
estimate, means that C is less than .0585, which I will
conservatively approximate by .05. Hence the following table.
N Lower-bound on time
10^9 .05 seconds
10^12 50 seconds
10^15 14 hours
10^18 1.6 years
10^21 1600 years
10^24 1.6 million years

Do you agree that these calculations are correct? If not,
what should the numbers be?

My question is whether you agree that the arithmetic was done
right, rather than whether you agree that these are ok lower
bounds on times the program would take.

If your program is Omega(N) [see Omega definition at eg
http://www.cs.mcgill.ca/~cs251/OldCourses/1997/topic3/ ]
then the numbers might be ok. If your program is Omega(N^alpha)
with alpha < 1, or is Omega of some other sublinear function,
then they might be overestimates. I don't have a fast computer
to run tests on so it would be helpful if you post the times
you've observed on your computer.

Recall that O, Omega, and Theta notation is not concerned with the
actual values of multiplicative constants, but merely with whether
they exist. A few program timings prove nothing about a program's
complexity, but they can be suggestive of what it might be, and of
what to look for when analyzing the program.
-jiw
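
One way to make that caution concrete: successive doublings of N give local estimates of the exponent alpha in t(N) ~ C * N^alpha. Applied to James's reported times (my sketch, not Waldby's), the estimates drift noticeably, which is exactly why a handful of timings is only suggestive:

```python
import math

# James's reported times: (N, seconds)
times = [(100_000_000, 0.06), (200_000_000, 0.08), (400_000_000, 0.10),
         (800_000_000, 0.14), (1_600_000_000, 0.22)]

# Each doubling of N yields a local exponent estimate alpha = log2(t2 / t1).
alphas = [math.log2(t2 / t1) for (_, t1), (_, t2) in zip(times, times[1:])]
print([round(a, 2) for a in alphas])  # noisy, sub-1 values trending upward
```

The upward trend is what an affine model with a fixed startup cost would predict: the constant overhead washes out as N grows, and the apparent exponent climbs toward 1.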

John

Dec 14, 2002, 3:18:59 AM12/14/02
to
Christian Bau <christ...@freeserve.co.uk> wrote in message news:<christian.bau-2C8...@slb-newsm1.svr.pol.co.uk>...

Literates in English write "schmuck".

--John

Message has been deleted

Maxim Stepin

Dec 15, 2002, 12:56:21 PM12/15/02
to
"James Harris" <jst...@msn.com> wrote in message
news:3c65f87.02121...@posting.google.com...
> David Bernier <davi...@sympatico.ca> wrote in message news:<3DFAA103...@sympatico.ca>...

> > This is what I got:
[skipped a little]
> > [ 10^12 ]
> > Total Time: 32277
> > [ 10^13 ]
> > Total Time: 263108
> > [ 10^14 ]
> > Total Time: 2256184
> > [ 10^15 ]
> > OutOfMemoryError

> Granted you can't see it here because of the out of memory error, but
> with the numbers before it looks like it'd take about 8 hours.

Maybe even 5 hours.
Since each new step takes at least 8 times longer: 2256 sec * 8 = 18048 sec, about 5 hours.
It is a good result, no doubt.

But let's continue:

N time
10^15 5 hours
10^16 40 hours
10^17 320 hours
10^18 100 days
10^19 800 days
10^20 17 years
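
The table follows mechanically from Bernier's measured ratios (263108/32277 is about 8.2, and 2256184/263108 is about 8.6, per tenfold step in N). This sketch of the extrapolation is mine; with the conservative factor of 8 it lands on roughly 17-19 years at 10^20, matching the table up to rounding:

```python
t = 2256.184  # seconds for pi(10^14), from Bernier's run (2256184 ms)
for exp in range(15, 21):
    t *= 8  # conservative growth factor per tenfold step in N
    print(f"10^{exp}: about {t / 3600:,.0f} hours")

years_at_1e20 = t / (365.25 * 24 * 3600)
```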

Just for comparison, on this page:
http://numbers.computation.free.fr/Constants/Primes/pixtable.html
I found something interesting:
-----
pi(1.5*10^22)=299,751,248,358,699,805,270
P. Demichel, X. Gourdon, 2001 Feb (70 days on PIII 1 Ghz)
-----
just 70 days for 1.5*10^22 !
I guess Gourdon's method is very good too...

David Bernier

Dec 15, 2002, 5:44:49 PM12/15/02
to

James Harris wrote:
[...]

> Now I *can* probably tweak the sieve so that it'll do the 10^15 count,
> but I'm not at all motivated as I find it all so depressing.
[...]

If you decide to write another version, I'd be glad to try it out.
It's easier for me to test the programs you write than to try to
understand your mathematics.

You say that your Prime Counting function and the Riemann Hypothesis
are related. So I ask you, can you write a program to compute
the non-trivial zeroes of the Riemann zeta function?

Best wishes,

David Bernier

Message has been deleted

Maxim Stepin

Dec 15, 2002, 10:16:24 PM12/15/02
to

"James Harris" <jst...@msn.com> wrote in message
news:3c65f87.02121...@posting.google.com...

[skipped]

> Um, did it ever occur to you that I'm an *amateur* who, having made his
> own prime counting discovery, managed to generate the times you said
> were a good result in just a few weeks of fiddling?

[skipped]

> At best indifference or a "good result" here and there.

I mean, good result for amateur.
No offense, but Gourdon's method is at least 5000 times faster.
( I'm comparing his 70 days for pi(1.5*10^22) against yours approx. 1000 years. )

> Mostly though I have abuse like that from Ullrich, a full math
> professor at a *state* university so my taxes may help pay his salary.
>
> That's not exactly what I expected when I went looking for fame and
> fortune through discovery in math.

That's like going to the North Pole looking for Santa.
But there is no Santa, and the North Pole is still far, far away...

You can continue your journey, but only for the love of math.


Message has been deleted

David Kastrup

unread,
Dec 16, 2002, 10:48:46 AM12/16/02
to
jst...@msn.com (James Harris) writes:

> Also those who've followed posts all the way back to the original
> thread might have noticed that I was pointing out that I managed to
> improve asymptotic running time.

How did you measure this improvement? Since you have not even
posted anything that would suggest you even _know_ what asymptotic
running time is, much less how to measure it, one can't assume you
know what you are talking about unless you go into something more
specific.
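
Kastrup's question has a standard empirical answer: time the program at several sizes and look at the log-log slope between runs. A minimal sketch of that measurement (Python, an editorial choice; the toy workload is mine, not anything from the thread):

```python
import math
import time

def growth_exponents(f, sizes):
    # Estimate p in time ~ N^p from the log-log slope between
    # consecutive runs; constant factors cancel out of the ratio.
    times = []
    for n in sizes:
        t0 = time.perf_counter()
        f(n)
        times.append(time.perf_counter() - t0)
    return [math.log(times[i + 1] / times[i]) / math.log(sizes[i + 1] / sizes[i])
            for i in range(len(times) - 1)]

def quadratic_work(n):
    # O(n^2) on purpose, so the measured exponent should come out near 2.
    total = 0
    for i in range(n):
        for j in range(n):
            total += i ^ j
    return total

print(growth_exponents(quadratic_work, [500, 1000, 2000]))
```

This is the kind of evidence a claim of improved asymptotic running time calls for: the slope over growing inputs, not a single constant-factor speedup.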

David C Ullrich

unread,
Dec 16, 2002, 11:09:38 AM12/16/02
to
On 16 Dec 2002 07:37:59 -0800, jst...@msn.com (James Harris) wrote:

>"Maxim Stepin" <maxim...@worldnet.att.net> wrote in message news:<atjgfu$14b9pe$1...@ID-57378.news.dfncis.de>...


>> "James Harris" <jst...@msn.com> wrote in message
>> news:3c65f87.02121...@posting.google.com...
>>
>> [skipped]
>>
>> > Um, did it ever occur to you that I'm an *amateur* who having made his
>> > own prime counting discovery managed to generate the times you said
>> > were a good result in just a few weeks of fiddling?
>>
>> [skipped]
>>
>> > At best indifference or a "good result" here and there.
>>
>> I mean, good result for amateur.
>> No offense, but Gourdon's method at least 5000 times faster.
>> ( I'm comparing his 70 days for pi(1.5*10^22) against yours approx. 1000 years. )
>>
>

>Oh, glad you thought it necessary to point that out.
>
>I notice you deleted out where I mentioned that my prime counting
>function can be used with any prime counting algorithm.

You should _thank_ people for deleting that comment - leaving
it in makes you look stupid. Because what you found is an
_algorithm_, not a _function_.

>Also those who've followed posts all the way back to the original
>thread might have noticed that I was pointing out that I managed to
>improve asymptotic running time.

A person might also notice that in spite of repeated requests
you have not said _anything_ about what the current asymptotic
running time actually _is_.

>That was in a few weeks of trying.
>
>My point is that continuing research might reveal more routes to
>greater improvement.
>
>And my main point is that people should be suspicious of all this
>effort to justify a find in the area of primes not even being
>published, as if it didn't matter.


>
>> > Mostly though I have abuse like that from Ullrich, a full math
>> > professor at a *state* university so my taxes may help pay his salary.
>> >
>> > That's not exactly what I expected when I went looking for fame and
>> > fortune through discovery in math.
>>
>> That's like you went to the North Pole looking for Santa.
>> But there is no Santa, and North Pole is still far far away...
>>
>> You can continue your journey, but only for the love of math.
>

>Care to mention how much research time went into Gourdon's method?
>
>As I found the prime counting function from first principles, the
>total research time on it is my time, and is a few months.
>
>I'd guess that Gourdon's work rests on other research work that goes
>back to the mid 1800's.
>
>That goes back all the way to the time of Gauss.
>
>So why are *modern* mathematicians so sanguine about my find not being
>published?

That "top mathematician" explained it when he said that 20% of the
grad students working in the field find something similar.

>James Harris

Christian Bau

unread,
Dec 16, 2002, 11:22:50 AM12/16/02
to
In article <imurvuk8e3gjsf5lt...@4ax.com>,

I think it is necessary to explain what that means: 20% of the grad
students stumble upon the problem, find it interesting to invest some
time to solve it on their own, and find some solution.

The remaining 80% is _not_ students who are not clever enough; it is a
combination of students who never thought about counting prime numbers,
who thought it was boring or not worth their time, those who read an
article describing a solution and so couldn't discover it themselves
(any article I read so far included Legendre's algorithm which Harris
rediscovered and then obfuscated), and an unknown number who were not
bright enough.
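
For readers who haven't seen it, Legendre's method is short enough to state in code. A sketch (Python, my choice; Bau's post gives no code) of the classical identity pi(x) = phi(x, a) + a - 1 with a = pi(sqrt(x)), where phi(x, a) counts the integers up to x not divisible by any of the first a primes:

```python
from functools import lru_cache

def primes_up_to(n):
    # Simple sieve of Eratosthenes.
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = bytes((n - p * p) // p + 1)
    return [i for i in range(2, n + 1) if sieve[i]]

def prime_count(x):
    # Legendre's method: pi(x) = phi(x, a) + a - 1, with a = pi(sqrt(x)).
    if x < 2:
        return 0
    ps = primes_up_to(int(x ** 0.5))
    a = len(ps)

    @lru_cache(maxsize=None)
    def phi(x, a):
        # phi(x, a): how many n <= x survive sieving by the first a
        # primes.  Note phi also counts 1, hence the "- 1" above.
        if a == 0:
            return x
        return phi(x, a - 1) - phi(x // ps[a - 1], a - 1)

    return phi(x, a) + a - 1

print(prime_count(100))  # 25
```

The recursion phi(x, a) = phi(x, a-1) - phi(x // p_a, a-1) is well founded because a strictly decreases, which is the contrast Maxim Stepin draws elsewhere in this thread with the circular dS/S/pi presentation.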

David Bernier

unread,
Dec 16, 2002, 11:21:07 PM12/16/02
to

James Harris wrote:
[skipped]

> So why are *modern* mathematicians so sanguine about my find not being
> published?

[skipped]

If a group of physicists of today found a way of keeping
time 1000 times better than quartz clocks, but that was
much less accurate than atomic clocks, National Labs
like NIST, NRC (Canada) or those of the UK or France
wouldn't necessarily rush to adopt the new
technology, unless perhaps it showed great promise
of further improvement, no?

David Bernier

Message has been deleted

David Kastrup

unread,
Dec 17, 2002, 3:31:04 AM12/17/02
to
jst...@msn.com (James Harris) writes:

> David Kastrup <David....@t-online.de> wrote in message news:<x58yyq3...@lola.goethe.zz>...

> Childish.

[entirely off-topic waffle deleted]

Apparently you really don't know about asymptotic running times and
how to estimate them, or you would have answered just with your
measurements or at least an explanation of how you interpreted them
instead of another large smoke screen.

David C Ullrich

unread,
Dec 17, 2002, 8:23:51 AM12/17/02
to
On 16 Dec 2002 21:55:42 -0800, jst...@msn.com (James Harris) wrote:

>David Kastrup <David....@t-online.de> wrote in message news:<x58yyq3...@lola.goethe.zz>...

>Childish.
>
>But Ullrich will keep replying.
>
>Do you know that I ignored him for quite some time? I think it was
>over a year, and he just kept replying.
>
>When he read a post of mine that made him think I'd been actually
>reading his posts, he gleefully posted as if he'd caught me on
>something.

Tee-hee. How could you possibly know about that reply? It
was posted when you weren't reading my posts.

Tee-hee.

>So, now he claims he's some kind of victim and replies to my posts
>because I complained to his university. It doesn't make any sense,
>but such justifications rarely do.
>
>You may wonder about all these people who keep replying to me,
>especially as they're so negative.
>
>I mean how can you not wonder with all these monster threads.
>
>There was one 200+ thread that I pulled off what I thought was the
>pertinent post and started another one, which promptly soared to more
>than a hundred posts. I think that was in October.
>
>What's going on here is that I'm a true discoverer. I made a prime
>counting discovery, and before that found a short proof of Fermat's
>Last Theorem.
>
>If you want to believe that no way mathematicians would ignore such
>things, then of course, you're going to believe that, but consider
>this prime counting thing:
>
>How can it be old information when I have pi(x,y)?
>
>Look all around the world. Read Gauss, Legendres, Riemann, and you
>will not find a pi(x,y) function, just pi(x).
>
>I'm beginning to think that my work is being ignored now because it's
>so big that mathematicians can't quite absorb the information. The
>shock to their systems is just too great.
>
>That's ok. I don't necessarily need mathematicians, as there are
>others.
>
>Still, I find modern mathematician's behavior to be *very* annoying,
>and quite childish.
>
>They're acting like a bunch of spoiled children as far as I'm
>concerned. And I think *you* are part of the problem as the world let
>things get this bad, and you are part of the world.
>
>Don't you know that mathematics is important for humanity's future?
>
>
>James Harris

Message has been deleted

Randy Poe

unread,
Dec 17, 2002, 2:17:21 PM12/17/02
to
James Harris wrote:
> David Kastrup <David....@t-online.de> wrote in message news:<x54r9d6...@lola.goethe.zz>...
>
>>jst...@msn.com (James Harris) writes:

> Your suppositions about what I would or would not do are of no
> interest to me.

Indisputable fact, not supposition: You have made no claim
about the asymptotic performance of your current algorithm.

All you've said is, "it runs 100 times faster than the
previous version".

These are the facts.

Do you have a claim or data on the asymptotic performance of
your algorithm?

- Randy

Maxim Stepin

unread,
Dec 17, 2002, 3:02:53 PM12/17/02
to

"James Harris" <jst...@msn.com> wrote in message
news:3c65f87.02121...@posting.google.com...
> David Kastrup <David....@t-online.de> wrote in message
news:<x54r9d6...@lola.goethe.zz>...

> > jst...@msn.com (James Harris) writes:
> >
> > > David Kastrup <David....@t-online.de> wrote in message
news:<x58yyq3...@lola.goethe.zz>...
> > > > jst...@msn.com (James Harris) writes:
> > > >
> > > > > Also those who've followed posts all the way back to the original
> > > > > thread might have noticed that I was pointing out that I managed to
> > > > > improve asymptotic running time.
> > > >
> > > > How did you measure this improvement? Since you have not even
> > > > posted anything that would suggest you even _know_ what asymptotic
> > > > running time is, much less how to measure it, one can't assume you
> > > > know what you are talking about unless you go into anything more
> > > > specific.
> > >
> > > Childish.
> >
> > [entirely off-topic waffle deleted]
> >
> > Apparently you really don't know about asymptotic running times and
> > how to estimate them, or you would have answered just with your
> > measurements or at least an explanation of how you interpreted them
> > instead of another large smoke screen.
>
> Hmmm...I replied to your post thinking you were Ullrich, and I see you
> deleted out that content.

>
> Your suppositions about what I would or would not do are of no
> interest to me.

So the asymptotic running time of your algorithm is of no interest to you.
Now everyone can see - you lost interest just in time to avoid answering the question.
How convenient...

Message has been deleted

5

unread,
Dec 17, 2002, 10:02:14 PM12/17/02
to

"James Harris" <jst...@msn.com> wrote in message
news:3c65f87.02121...@posting.google.com...
> "Dik T. Winter" <Dik.W...@cwi.nl> wrote in message
news:<H6wv...@cwi.nl>...
> > In article <3c65f87.02121...@posting.google.com>
jst...@msn.com (James Harris) writes:
> > > "Dik T. Winter" <Dik.W...@cwi.nl> wrote in message
news:<H6vz7...@cwi.nl>...
> > ...
> > > > Improving an algorithm with O(N long(N)) complexity by a factor 100
does
> > > > not change the complexity. The only thing that changes is that the
> > > > cross-over-point where a O(N^(1/2 + epsilon)) is better increases

Ummm... can I ask a question?

I don't know anything about complexity theory, and what you're writing looks
like something I vaguely remember seeing in my Numerical Analysis book, but
that was 10 years ago and I haven't looked at it since...what exactly (as
exactly as you care to get) does O(N log N) etc. mean?

Matt


whu...@atlsci.com

unread,
Dec 17, 2002, 11:18:06 PM12/17/02
to

Although it seems unlikely that James really understands asymptotic
running time, I think it is becoming clear that he is beginning to
understand that his programs are never going to lead to new
world records in enumerating pi(n). He is now retreating
into such statements as:

My programs represent *different* algorithms based on the prime
counting function itself, and because it is THE prime counting
function you can actually use it with ANY KNOWN ALGORITHM FOR
COUNTING PRIMES!!!

and mumbling about how he uses pi(x,y) when all other mathematicians
have used only pi(x). The advantage is that stuff like this is such
complete nonsense it is difficult to attack. We are very close to
"not even wrong" territory here.

We have been here before. When James first presented the world with
"the prime counting function" he gleefully predicted that world records
for enumerating pi(n) would be set very soon. When people pointed out
that his program was ludicrously slow when compared to modern methods
of enumerating pi(n) he first attacked by pointing out that his
algorithm only required using primes up to sqrt(n). This led to
my personal favourite, the suggestion that pi(10^30) could be
calculated by simply feeding all the primes up to 10^15 to
a computer. When this had to be abandoned, James next
claimed that his program needed to be "optimized". However, as
it became clear that the magic optimization was not happening,
James started to point out how run time was not really
that important. Then we got into pi(x) vs. pi(x,y) and difference
equations that magically become differential equations and
how all this was leading to a proof of the Riemann Hypothesis.
(James Harris never did explain how this was
supposed to work. Too bad I was looking forward to that
explanation.)

However, James never actually stated that his program was no longer
a contender for world records, or that the "optimizations" had
been abandoned. Thus, after people had stopped writing about this,
James felt free to once again make grandiose claims.

Karnac Predicts: James will stop claiming his prime counting function
is faster than any other. However, he will simply
abandon the threads; he will not make any concessions.
In the future James will once again claim that his
prime counting function is faster than anything
else.

It will snow in Canada this winter.

-William Hughes

Mike Kent

unread,
Dec 17, 2002, 11:55:05 PM12/17/02
to
5 wrote:

> I don't know anything about complexity theory, and what you're writing looks
> like something I vaguely remember seeing in my Numerical Analysis book, but
> that was 10 years ago and I haven't looked at it since...what exactly (as
> exactly as you care to get) does O(N log N) etc. mean?

It means that the (number of elementary steps / amount of time)
required to execute the algorithm is approximated by k*N*log(N)
for some nonzero constant k, and some variable N that measures
the size of the problem, in the following sense:

lim time / (k*N*log(N)) = 1
N->oo

A standard example is that the best general list sorting routines
have O(N log N) performance where N is the length of the list.

James Waldby

unread,
Dec 18, 2002, 2:10:32 AM12/18/02
to
Mike Kent wrote:
> 5 wrote:
> > ...what exactly ... does O(N log N) etc. mean?

>
> It means that the (number of elementary steps / amount of time)
> required to execute the algorithm is approximated by k*N*log(N)
> for some nonzero constant k, and some variable N that measures
> the size of the problem, in the following sense:
...

What Mike describes is more closely related to Omega than to O.
O(N log N) means that steps (or time) is bounded above by some
constant multiple of N log N, for N > some N_0. Bounded above
means that we can compute limits on worst-case execution times.

The meaning of O, Theta, and Omega notation is explained fairly
well at http://www.cs.mcgill.ca/~cs251/OldCourses/1997/topic3/ .

-jiw

Christian Bau

unread,
Dec 18, 2002, 3:51:03 AM12/18/02
to
In article <3dffe2dc$1...@news.teranews.com>,
"5" <matth...@cavtel.net> wrote:

This isn't James Harris posting under a false name?

A function f (N) is "O (N log N)" if f (N) is at most c times (N log N)
for some positive constant c, and for all large N. More precisely:

f (N) is O (N log N) if there is a real number c > 0 and an integer
M > 0 such that |f (N)| <= c * (N log N) for all N >= M.

And if g (N) is another function, then

f (N) is O (g (N)) if there is a real number c > 0 and an integer M
> 0 such that |f (N)| <= |c * g (N)| for all N >= M.

We are discussing here the speed of an algorithm. If an algorithm solves
a problem P(N), then one tries to analyze the execution time of the
algorithm E(N) depending on N. But you can have computers of different
speeds: If your computer is ten times faster than mine, then you will
get a smaller function E(N).

That is why these things are discussed with O(N) notation: If the
algorithm takes 0.73N microseconds on your computer and 4.9N
microseconds on mine, then its execution time is O(N). Only the constant
c has changed. If you improve the algorithm to run twice as fast, then
it is still O(N), you only made the constant c smaller.
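
Bau's point - the constant c changes, the O(N log N) shape does not - can be made concrete by counting comparisons instead of seconds. A sketch (Python, my choice; not from the thread) that instruments merge sort; the comparison count divided by N*log2(N) hovers near a constant as N grows:

```python
import math
import random

def merge_sort(a, counter):
    # Sorts a list while tallying element comparisons in counter[0].
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid], counter)
    right = merge_sort(a[mid:], counter)
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        counter[0] += 1
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

random.seed(1)
for n in (1000, 4000, 16000):
    counter = [0]
    merge_sort([random.random() for _ in range(n)], counter)
    # The last column stays roughly constant as n grows 16-fold:
    print(n, counter[0], round(counter[0] / (n * math.log2(n)), 3))
```

Speeding the inner loop up by 100x would change none of the printed ratios - which is exactly Dik Winter's point about the factor 100 quoted above Matt's question.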

David C Ullrich

unread,
Dec 18, 2002, 7:24:28 AM12/18/02
to
On 17 Dec 2002 16:47:52 -0800, jst...@msn.com (James Harris) wrote:

>[...]
>Hmmm...indifference from leading mathematicians, hostility from people
>who can post on the sci.math newsgroup, i.e. anyone with a computer
>and Usenet access, with a knowledge of history, and the travails
>discoverers often face, and it's beginning to look like an even bigger
>discovery.

That's indisputably valid reasoning. Supposing for the sake of
argument that every single person on the planet said out loud
that the Proof was wrong and the Function was unoriginal
and unimportant, that would prove that it _was_ without a
doubt the single most important discovery in the history of
the human race.

I love these bits, where the fact that people don't believe
something becomes evidence that it's true.

>James Harris

Message has been deleted

Tris

unread,
Dec 18, 2002, 9:38:40 AM12/18/02
to

"David C Ullrich" <ull...@math.okstate.edu> wrote in message
news:89q00v0o2co9brjke...@4ax.com...

Well of course unicorns exist.


Maxim Stepin

unread,
Dec 18, 2002, 4:08:18 PM12/18/02
to

"James Harris" <jst...@msn.com> wrote in message
news:3c65f87.02121...@posting.google.com...

<deleted>

> dS(x,y) = (pi((x/y), y-1) - pi(y-1, sqrt(y-1))[ pi(y, sqrt(y)) -
> pi(y-1, sqrt(y-1))],
>
> S(x,1) = 0.
>
> And pi(x, y) = floor(x) - S(x, y) - 1, and you get S as the sum of dS
> from dS(x,2) to dS(x,y).

1) You define dS(x, y) by using pi(x,y).
2) You define pi(x,y) by using S(x, y).
3) You define S(x,y) as the sum of dS.

Do you see the problem? It's a loop.

It's a common rule for math texts to avoid such loops - define the function BEFORE you use it, not
after.

4) There are more "(" brackets than ")" brackets.

5) You wrote: "Note: pi(x,sqrt(x)) here gives the same value as the traditional pi(x)."
It is clear that "your pi(x,y)" and "traditional pi(x)" are two different functions - because they
take different numbers of arguments.
It's a common rule for math texts: if you have two different functions, don't give them the same
name.

6) You wrote: "it seems that mathematicians decided that the prime counting function had only one
variable, and worked from that assumption from then on".
It's not "assumption", it's what called a definition.
If you don't like traditional definition, you are free to define your own function as you like.
Just give you function a different name to aviod unnessesary confusion.
Like this: JSHpi(x,y) = pi(x)

It's all cosmetic, of course. But important, if you like clarity.

David Kastrup

unread,
Dec 18, 2002, 4:18:16 PM12/18/02
to
"Tris" <nu...@127.0.0.1> writes:

[...]


> >
> > I love these bits, where the fact that people don't believe
> > something becomes evidence that it's true.
>
> Well of course unicorns exist.

You have been had. Literally. The story of the virgin that has just
to sit there with closed eyes and the unicorn will put its head into
her lap and all that probably sounded good to you, inebriated as you
were at the time, but let me tell you that the unicorn's "horn" was
probably not quite what you in your naivety assumed it to be.

Message has been deleted

Arturo Magidin

unread,
Dec 18, 2002, 6:40:37 PM12/18/02
to
In article <3c65f87.02121...@posting.google.com>,
James Harris <jst...@msn.com> wrote:

[.snip.]


>When I was a kid I wished that unicorns existed. Some of you seem to
>be wishing that math is valueless at least partly I think because you
>don't like the discoverer.

Here we go again. So far, the only person saying "this is worthless",
based exclusively on who brought it to your attention, is you. You've
done it in at least three threads so far, and recently.

You really have to stop blaming others for your own shortcomings.

>Actually, it's not just that people are denying the value of my work,
>it's their energy in fighting me.
>
>And it's that energy which I've used to help gain the visibility that
>I have.
>
>It's not me people, it's the work. If my research were as valueless
>as these people spend so much energy claim it is, well then they
>wouldn't be spending so much energy making their claims!

Aha. So I guess the result on factorization must be very good and
valued, since you spend so much energy trying to claim it is trivial.

I guess that theorem about roots of polynomials must be valid, since
you spend so much energy doing everything you can to avoid looking at
the proof.

>As people fight the value of mathematical work that is clearly
>valuable, they find themselves replying to me negatively to hold on to
>the reality they wish to accept. So by posting in reply to me simply
>claiming I'm wrong, they may gain some relief.

You seem to be looking at the mirror again, James.

>It's like denying the truth screws up with their heads, as their minds
>fight social values. You see it's supposed to be that valuable
>research brings kudos to the researcher, but they're delivering
>insults to me and downgrading valuable research work.

No, they are delivering insults to you because your work is worthless.

[.snip.]

>Wishing and posting might make some of these people feel better, like
>they're in control of their reality, but it's like with the unicorns,
>wishing that my work is valueless doesn't make it so.

Indeed it does not. It is your work itself which earns its worthless
status, not who presents it.

======================================================================
"Mufrius, non magister."
-- Petronius, Satyricon
======================================================================

Arturo Magidin
mag...@math.berkeley.edu

David Kastrup

unread,
Dec 18, 2002, 7:01:13 PM12/18/02
to
mag...@math.berkeley.edu (Arturo Magidin) writes:

[...]


> >Wishing and posting might make some of these people feel better, like
> >they're in control of their reality, but it's like with the unicorns,
> >wishing that my work is valueless doesn't make it so.
>
> Indeed it does not. It is your work itself which earns its worthless
> status, not who presents it.

Curiously enough, the _way_ in which it is presented earns the
_presenter_ some status.

Wayne Brown

unread,
Dec 18, 2002, 6:56:52 PM12/18/02
to
James Harris <jst...@msn.com> wrote:

> It's not me people, it's the work. If my research were as valueless
> as these people spend so much energy claim it is, well then they
> wouldn't be spending so much energy making their claims!

Malaria must have very great value; look at all the energy people have
spent over the years fighting against it.

--
Wayne Brown | "When your tail's in a crack, you improvise
fwb...@bellsouth.net | if you're good enough. Otherwise you give
| your pelt to the trapper."
"e^(i*pi) = -1" -- Euler | -- John Myers Myers, "Silverlock"

Andy Spragg

unread,
Dec 18, 2002, 7:30:03 PM12/18/02
to
jst...@msn.com (James Harris) pushed briefly to the front of the queue
on 18 Dec 2002 14:59:49 -0800, and nailed this to the shed door:

^ It's not me people, it's the work. If my research were as valueless
^ as these people spend so much energy claim it is, well then they
^ wouldn't be spending so much energy making their claims!

That's quite right. If you had indeed discovered an elementary proof
of FLT, it would of course be valuable. The problem here is, your
"research" is not an elementary proof, it is only a fatally flawed
attempt at one.

The concept of value presupposes the concept of correctness - if
research is correct, its value can be judged. If it is incorrect, its
value is "greyed out"; contextually meaningless. Your misfortune was
in choosing the one field of endeavour where there is no scope for
shades of grey in being correct.

People expend energy in perpetually trying to demonstrate to you that
your endeavour has been in vain, according to precisely-defined rules
of engagement that you are not prepared to concede. Stalemate.

Andy

--
sparge at globalnet point co point uk

"I imagine most people would agree with me
when I say that most people seem to imagine
that most people would agree with them"
Bob Goddard, uk.rec.sheddizen

Mike Kent

unread,
Dec 18, 2002, 11:11:30 PM12/18/02
to
James Harris wrote:

> It's not me people, it's the work. If my research were as valueless

> as these people spend so much energy claim it is, well then they

> wouldn't be spending so much energy making their claims!

If crabgrass weren't so valuable, I wouldn't spend time
getting it out of my lawn?

Brian Quincy Hutchings

unread,
Dec 18, 2002, 11:27:42 PM12/18/02
to
the reason that I stressed Ore's presentation
-- sorry, if it was in another topic -- is,
most assume that the "decimals" are part-and-parcel
of the "Arabic/Hindoo numerals,"
when those were strictly integral. so,
when Simon Stevin created them,
they revolutionized the merchants. of course,
most of the properties are well-known,
grade-school stuff, but
it's always nice to have a thorough comprehension
of our most basic tool; eh?

I didn't even grok the O() and o() notation, although
I had once briefly done so, in Vinogradov's monograph,
but I think it should be obvious
why trigonometric series are so useful in this;
is it?

mag...@math.berkeley.edu (Arturo Magidin) wrote in message news:<atdtrh$1jvd$1...@agate.berkeley.edu>...
>
> position with tenure. If that's what you mean, you are misapplying it
> when you use it for me.

--A church-school McCrusade (Blair's ideals):
Harry-the-Mad-Potter want's US to kill Iraqis,
so does Usama's MacJihad:
"HEY, GEORGE; LET'S YOU & SADDAM FIGHT" -Dame Maggie
http://www.tarpley.net/bush25.htm ("Thyroid Storm" ch.)

http://www.rwgrayprojects.com/synergetics/plates/plates.html
