
JSH: Two proofs equals a lot of denial


jst...@msn.com

Dec 8, 2006, 8:43:52 PM
So I had to find two proofs of the same problem, where the second proof
had to be obvious in such a way that there was no room for anyone to
find even the hint of an objection.

That's a lot to ask someone, but then again, the proofs show that much
of what many of you may have cherished as brilliant mathematics is just
wrong.

The theory of ideals goes, and Galois Theory, as many of you have been
taught it, isn't quite right.

That's a lot to absorb.

But the wrong answer is to decide that you can't handle that truth, and
figure if I were right then mathematicians in charge would tell the
truth--even if it meant they were not as brilliant as they thought, and
even if for some of them it might mean losing positions of high
prestige.

I think this situation leaves the door open for a lot of finger
pointing at who is supposed to do the right thing.

So, should Andrew Wiles, if he were to find out about all of this, come
forward and step up to the world and say, hey, there was this
incredible mistake made in the math field over a hundred years ago,
long before anyone alive today was born, and um, it just so happens
because of this mistake that only was just noticed by some, um, amateur
math guy that people were calling a crackpot, I didn't prove
Taniyama-Shimura and much of the accomplishments I think I have in
mathematics are crap?

You people are very cruel if you expect the people currently at the top
in the field by a strong opinion of the majority to come forward and do
something like that.

It's just not a human thing for you to do to them.

In many ways, it's NOT THEIR FAULT.

The mistake entered the field after Gauss, and it kind of snow-balled
over the years.

The mistake is not your fault--hiding it would be.

Hiding it would be a very big mistake.

Especially trying to hide it knowing that I'm the person you would have
to out-think, and beat down the line, indefinitely, knowing that the
day I pushed it through against a math community in denial would be the
day the world would see you not as people caught up in a remarkable
situation difficult for anyone---but as cons and frauds.

I am asking you to protect people like Wiles, Ribet, Taylor and all
those others who had so many reasons to believe they were brilliant and
right, who will have to live with learning they were wrong, and I am
asking you not to say it is their responsibility to tell the truth
here.

Some one of you needs to step up here.

Don't leave this on "leading mathematicians" who are getting the full
kick in the gut as if you can just blame them later if this thing drags
out while students keep getting taught wrong stuff!!!

Those students deserve better. The world deserves better.

This story can still be rather remarkable and kind of neat with some
people, yes, having to live through some extreme disappointment.

But you people do not want to live with the disappointment you would
feel later for denying easy mathematical proofs that show one of the
most dramatic events in the history of mathematics, opening the door to
a surge in mathematical knowledge about the fundamental properties of
numbers, and who knows what else we could figure out?

Mathematics is the "queen of the sciences" for good reason.

There may have been a block to mathematical knowledge we can no more
comprehend than cavemen could comprehend integral calculus that was
just removed.

Over a hundred years with a subtle mistake that JUST recently got found
out, opening the door to huge increases and leaps in knowledge that
could be beyond our imaginations.

Your choice.

Keep fighting the math or go with it.

But you know the answer here, if you fight the math, go against
mathematical proof, no matter what happens, you lose.


James Harris

jshs...@yahoo.com

Dec 8, 2006, 8:52:30 PM

There sure does seem to be a lot of people coming up with problems with
your proofs in the other 3 threads you started in the past day or so.

There is a lot of shoulder room for way more than just a hint of an
objection. You are actually looking quite foolish, but you are used to
that.

That seems to be your gift to the world.

Comic relief. :-)

jst...@msn.com

Dec 8, 2006, 9:02:59 PM

jshsu...@yahoo.com wrote:
<deleted>

>
> There sure does seem to be a lot of people coming up with problems with
> your proofs in the other 3 threads you started in the past day or so.
>
> There is a lot of shoulder room for way more than just a hint of an
> objection. You are actually looking quite foolish, but you are used to
> that.
>
> That seems to be your gift to the world.
>
> Comic relief. :-)

So why hasn't Rupert replied again? And what about William Hughes and
his simple "why?".

I am making an appeal to the humanity of these groups.

It's not right to act like nothing can happen until leading
mathematicians come forward as they get hit the hardest.

I can imagine few things harder than being in the position of having
been celebrated for brilliance around the world for years, only to come
forward to say it was all wrong because of some subtle mistake from
over a hundred years ago.

And remember, there is a silver lining.

Already you now know you can probe into non-rationals in terms of
factors to see things that no one knew you could see before.

And then there is my prime counting research, where with the proper
attention now finally the Riemann Hypothesis may soon be resolved.

The choice here is for knowledge or for appearances where deep down you
know that not only are you going through the motions with crap math if
you are doing number theory, but you're sitting by while young people
get taught it when they could have had a chance to start fresh.

Go over the latest proof, in my thread "JSH: End of an error" and go
over it and over it and over it until you are satisfied and then move
forward.

And think about other mathematicians please. Especially the people who
will get hit the hardest and do something now to help protect them.


James Harris

jshs...@yahoo.com

Dec 8, 2006, 9:11:18 PM

I have gone over your "proof". I have also read the replies to it, and
read their objections.

You are in error once again.

As for your paper sent to the Annals, I read that too. It is hardly
a paper. I predict now that it will not be published.

I hate to say it, but the laughs I get from your posts have started to
grow thin. You are just getting too repetitive and monotonous. I might
need to take a break for a few months and see if you come up with
something new to shake the math world to its core. I'm still deciding.
I actually enjoy the replies to your posts way more than the pearls of
wisdom you come up with. Your starting multiple threads saying pretty
much the exact same thing within hours of each other is just sad.

Rupert

Dec 8, 2006, 9:57:04 PM

jst...@msn.com wrote:
> So I had to find two proofs of the same problem, where the second proof
> had to be obvious in such a way that there was no room for anyone to
> find even the hint of an objection.
>

You've found about 50 "proofs", and they're all wrong.

Jada

Dec 8, 2006, 10:20:46 PM

<snip>

> Galois Theory is wrongly used in much of number theory.

I think you are fibbing, you do not even use the same notation;

http://www.math.niu.edu/~beachy/aaol/galois.html


>And the theory of ideals does not work.

Please show what is wrong with it
http://www.andrew.cmu.edu/user/avigad/Papers/ideals71.pdf

Rupert

Dec 8, 2006, 10:34:11 PM

jst...@msn.com wrote:
> jshsu...@yahoo.com wrote:
> <deleted>
> >
> > There sure does seem to be a lot of people coming up with problems with
> > your proofs in the other 3 threads you started in the past day or so.
> >
> > There is a lot of shoulder room for way more than just a hint of an
> > objection. You are actually looking quite foolish, but you are used to
> > that.
> >
> > That seems to be your gift to the world.
> >
> > Comic relief. :-)
>
> So why hasn't Rupert replied again?

I was asleep. I'm in Australia, you know. Give me a few hours, can't
you?

jst...@msn.com

Dec 8, 2006, 11:58:35 PM

Dude. Don't think this is all about you.

With people like you disagreeing, you can give room for other people to
think there is some doubt.

There is no reasonable doubt, as the mathematical proofs are basic.

Now then, if you fight this, consider who you may hurt.

People like Wiles and Ribet are the most vulnerable at this point.

Like, what do YOU really lose?

You've argued on Usenet for some years and got some stuff wrong against
one of the greatest errors in the history of mathematics, and one of
the most subtle.

Actually, it probably is the greatest error in the history of
mathematics.

But Wiles, he learns of this and he finds out he didn't prove Fermat's
Last Theorem.

Think about it.

You won't hurt me. It's not about me. I now have a rather nice paper
I can send to mathematicians and end this, and I have another paper
that is at the Annals of Mathematics already.

And I have a high tolerance for the passage of time, if you hadn't
noticed.

Nope. You don't hurt me.

But some of these other people, like Wiles and Ribet, well, yeah, you
can hurt them, whether you completely understand exactly how or not.

What I am doing now is actually about looking for a soft landing for
some people I think have a lot of value to give to this world, who will
probably be questioning their own worth very deeply in a little bit.

You fight me here, and you don't hurt me. You may hurt them.


James Harris

jshs...@yahoo.com

Dec 9, 2006, 12:07:05 AM

Since you are yet again completely in the wrong, the only one hurt is
you. All the people who post to these math NG's care about is the math.
You care about fame and your place in history. Well, your place in
history is as a minor crank who polluted a few usenet groups for a
decade or two. Nice.

Rupert

Dec 9, 2006, 12:09:08 AM

jst...@msn.com wrote:
> Rupert wrote:
> > jst...@msn.com wrote:
> > > So I had to find two proofs of the same problem, where the second proof
> > > had to be obvious in such a way that there was no room for anyone to
> > > find even the hint of an objection.
> > >
> >
> > You've found about 50 "proofs", and they're all wrong.
>
> Dude. Don't think this is all about you.
>

What on earth gives you the idea that I think that?

> With people like you disagreeing, you can give room for other people to
> think there is some doubt.
>
> There is no reasonable doubt, as the mathematical proofs are basic.
>

There most certainly is reasonable doubt. I have found mistakes in your
arguments.

> Now then, if you fight this, consider who you may hurt.
>
> People like Wiles and Ribet are the most vulnerable at this point.
>
> Like, what do YOU really lose?
>

I have absolutely nothing to win or lose either way. That's why a
rational person would conclude that my motivation is pursuit of the
truth.

> You've argued on Usenet for some years and got some stuff wrong against
> one of the greatest errors in the history of mathematics, and one of
> the most subtle.
>

I haven't got anything wrong. There is no error in accepted
mathematics.

> Actually, it probably is the greatest error in the history of
> mathematics.
>
> But Wiles, he learns of this and he finds out he didn't prove Fermat's
> Last Theorem.
>

You're so absurd. You're incapable of understanding Wiles' argument for
Fermat's last theorem. How could you possibly know whether it's
correct, or whether your arguments would have any bearing on it if they
were correct? You can't even tell us what accepted theorem is supposed
to be wrong.

> Think about it.
>
> You won't hurt me. It's not about me. I now have a rather nice paper
> I can send to mathematicians and end this, and I have another paper
> that is at the Annals of Mathematics already.
>

Go ahead.

> And I have a high tolerance for the passage of time, if you hadn't
> noticed.
>
> Nope. You don't hurt me.
>

Good. I wish I could get you to see reason, though.

> But some of these other people, like Wiles and Ribet, well, yeah, you
> can hurt them, whether you completely understand exactly how or not.
>

No, I don't think so.

Karenia Brevis

Dec 9, 2006, 12:12:35 AM

<jst...@msn.com> wrote in message
news:1165640315.8...@f1g2000cwa.googlegroups.com...


You are a wacko, JSH. You show everybody that you cannot even do simple math,
and then you want to hurt people.
Submitting papers to A.M. is another one of your not-funny jokes.


>
>
> James Harris
>


Proginoskes

Dec 9, 2006, 1:18:28 AM

This part of JSH's post reminds me of a joke I once heard:

A researcher is presenting a result at a math conference. He outlines a
proof of this result.

During the question and answer period, a member of the audience says,
"The result that you mentioned is not true. I have worked out a
counterexample."

The researcher responds, "That's okay; I have another proof."

--- Christopher Heckman

David C. Ullrich

Dec 10, 2006, 7:23:23 AM
On 8 Dec 2006 17:43:52 -0800, jst...@msn.com wrote:

>So I had to find two proofs of the same problem, where the second proof
>had to be obvious in such a way that there was no room for anyone to
>find even the hint of an objection.

Curious that people _found_ objections, eh?

>That's a lot to ask someone, but then again, the proofs show that much
>of what many of you may have cherished as brilliant mathematics is just
>wrong.
>
>The theory of ideals goes, and Galois Theory, as many of you have been
>taught it, isn't quite right.

Exactly what result is false?

>That's a lot to absorb.

Actually, until you finally answer the question of exactly what
standard result you're refuting it's nothing at all to absorb?

People have asked you many times exactly what result it is that
you're showing to be false. You've never replied. Why is that?

Hint: Eventually people will get the idea that you have no idea.

Um, nobody's _been_ leaving anything to "leading mathematicians"
here - people have been explaining exactly what's wrong with your
arguments without dropping any names.

>Those students deserve better. The world deserves better.
>
>This story can still be rather remarkable and kind of neat with some
>people, yes, having to live through some extreme disappointment.
>
>But you people do not want to live with the disappointment you would
>feel later for denying easy mathematical proofs that show one of the
>most dramatic events in the history of mathematics, opening the door to
>a surge in mathematical knowledge about the fundamental properties of
>numbers, and who knows what else we could figure out?
>
>Mathematics is the "queen of the sciences" for good reason.
>
>There may have been a block to mathematical knowledge we can no more
>comprehend than cavemen could comprehend integral calculus that was
>just removed.
>
>Over a hundred years with a subtle mistake that JUST recently got found
>out, opening the door to huge increases and leaps in knowledge that
>could be beyond our imaginations.
>
>Your choice.
>
>Keep fighting the math or go with it.
>
>But you know the answer here, if you fight the math, go against
>mathematical proof, no matter what happens, you lose.
>
>
>James Harris


************************

David C. Ullrich

jst...@msn.com

Dec 10, 2006, 2:33:00 PM

David C. Ullrich wrote:
> On 8 Dec 2006 17:43:52 -0800, jst...@msn.com wrote:
>
> >So I had to find two proofs of the same problem, where the second proof
> >had to be obvious in such a way that there was no room for anyone to
> >find even the hint of an objection.
>
> Curious that people _found_ objections, eh?
>

You're betting your career on that Ullrich, so you better read what
they have carefully.

I'm kind of puzzled by people like you and Magidin who actually have
something to lose here.

You should know by now that I have a paper on my prime counting
function currently under review at the Annals of Mathematics. I also
have a NEW paper just written a few days ago with this simplified proof
showing the problem with the ring of algebraic integers.

And noting that you can move a factor of 2 with

175x^2 - 15x + 2 = (f(x) + 2)*(g(x) + 1)

is just so basic and irrefutable that now for the first time, even
casual readers can easily determine that posters are wrong when they
dispute this result.

If you sit back and wait, assuming nothing will happen, then when it
does and my results are published, then your career as a teaching
professor is gone, and not even tenure will save you.

Your only hope at this point is to claim confusion and start defending
the correct mathematical proofs now that you can see a simple and
obvious explanation.

After all, with the loss of your position and tenure, you would
probably lose most of your retirement funds as well, and have to start
over with some new career with a major black mark against you.

Why throw the dice?


James Harris

David Moran

Dec 10, 2006, 3:08:24 PM

<jst...@msn.com> wrote in message
news:1165779180.1...@f1g2000cwa.googlegroups.com...

>
> David C. Ullrich wrote:
>> On 8 Dec 2006 17:43:52 -0800, jst...@msn.com wrote:
>>
>> >So I had to find two proofs of the same problem, where the second proof
>> >had to be obvious in such a way that there was no room for anyone to
>> >find even the hint of an objection.
>>
>> Curious that people _found_ objections, eh?
>>
>
> You're betting your career on that Ullrich, so you better read what
> they have carefully.

No they aren't; no one will believe a CRACKPOT.


>
> I'm kind of puzzled by people like you and Magidin who actually have
> something to lose here.

No they don't; you're just a blathering idiot.

>
> You should know by now that I have a paper on my prime counting
> function currently under review at the Annals of Mathematics. I also
> have a NEW paper just written a few days ago with this simplified proof
> showing the problem with the ring of algebraic integers.

And it'll be rejected too, I guarantee it.


>
> And noting that you can move a factor of 2 with
>
> 175x^2 - 15x + 2 = (f(x) + 2)*(g(x) + 1)
>
> is just so basic and irrefutable that now for the first time, even
> casual readers can easily determine that posters are wrong when they
> dispute this result.
>
> If you sit back and wait, assuming nothing will happen, then when it
> does and my results are published, then your career as a teaching
> professor is gone, and not even tenure will save you.

Blah blah blah whatever. Jealous that you're not smart enough to do math?


>
> Your only hope at this point is to claim confusion and start defending
> the correct mathematical proofs now that you can see a simple and
> obvious explanation.
>
> After all, with the loss of your position and tenure, you would
> probably lose most of your retirement funds as well, and have to start
> over with some new career with a major black mark against you.

BULL CRAP


>
> Why throw the dice?
>
>
> James Harris
>

Dave (a REAL mathematician)


Tim Peters

Dec 10, 2006, 5:02:59 PM
[jst...@msn.com]

>>> So I had to find two proofs of the same problem, where the second proof
>>> had to be obvious in such a way that there was no room for anyone to
>>> find even the hint of an objection.

[David C. Ullrich]


>> Curious that people _found_ objections, eh?

[jst...@msn.com]


> You're betting your career on that Ullrich, so you better read what they
> have carefully.
>
> I'm kind of puzzled by people like you and Magidin who actually have
> something to lose here.

Yet nobody else is puzzled. Don't you tire of being so thick?

> You should know by now that I have a paper on my prime counting
> function currently under review at the Annals of Mathematics.

From which nothing follows. In addition, you're the only one who wouldn't
bet a great deal on that the paper will be rejected. Don't you tire of
being so thick?

> I also have a NEW paper just written a few days ago with this simplified
> proof showing the problem with the ring of algebraic integers.

Which is just another crock. Cut through all the irrelevant bullshit, and
it boils down to your naive hope that:

7*a = 5*b

implies 7 is a factor of b. But there isn't a single mathematician on the
planet you can sell that to, because (a) you have no proof (you merely keep
/repeating/ it); and, (b) it's false (and, worse, is obviously false at
first sight to everyone with relevant knowledge).

> And noting that you can move a factor of 2 with
>
> 175x^2 - 15x + 2 = (f(x) + 2)*(g(x) + 1)
>
> is just so basic and irrefutable that now for the first time, even
> casual readers can easily determine that posters are wrong when they
> dispute this result.

There's a different explanation for why the set of casual readers who see
that you're right appears to be empty, year after year: it is in fact
impossible for a casual reader to make sense of your technical writing.
It's terrible, choked with magical leaps in the places it's not plain wrong.
People like Rupert, Hughes, and marcus_b expend a great deal of effort
trying to figure out what you intended to say, were you /capable/ of
coherent technical writing. Casual readers only see a ranting lunatic.
That's why you never "win them over". Your technical points are expressed
incoherently, while your much more frequent "social commentary" appears to
be the insane raving of a megalomaniac.

> If you sit back and wait, assuming nothing will happen, then when it
> does and my results are published, then your career as a teaching
> professor is gone, and not even tenure will save you.

You really think he's gonna piss his pants now? Don't you tire of being so
thick?

> Your only hope at this point is to claim confusion and start defending
> the correct mathematical proofs now that you can see a simple and
> obvious explanation.

In truth, his hope is the same one he's relied on to protect him from a
decade of your threats: "nobody else can be that dense".

> After all, with the loss of your position and tenure, you would
> probably lose most of your retirement funds as well, and have to start
> over with some new career with a major black mark against you.
>
> Why throw the dice?

"Gambling" on that you're wrong here is like betting the sun will rise
tomorrow. Except it's more certain: there are reasons, however unlikely,
for why the sun may not rise tomorrow. OTOH, the only possibility that your
"7*a = 5*b implies 7|b" argument is correct lies with that mathematics may
be inconsistent.


jst...@msn.com

Dec 10, 2006, 5:09:32 PM

Tim Peters wrote:
> [jst...@msn.com]
> >>> So I had to find two proofs of the same problem, where the second proof
> >>> had to be obvious in such a way that there was no room for anyone to
> >>> find even the hint of an objection.
>
> [David C. Ullrich]
> >> Curious that people _found_ objections, eh?
>
> [jst...@msn.com]
> > You're betting your career on that Ullrich, so you better read what they
> > have carefully.
> >
> > I'm kind of puzzled by people like you and Magidin who actually have
> > something to lose here.
>
> Yet nobody else is puzzled. Don't you tire of being so thick?
>

Come on Ullrich. Even the dodge of posting as your "Tim Peters"
persona doesn't protect you.

And it's your choice, I just find it curious.

Why roll the dice one more time when the consequences can be the end of
your career, no more teaching period, as you wouldn't even be able to
teach high school math, and the loss of your retirement as well?

Why take that gamble?

> > You should know by now that I have a paper on my prime counting
> > function currently under review at the Annals of Mathematics.
>
> From which nothing follows. In addition, you're the only one who wouldn't
> bet a great deal on that the paper will be rejected. Don't you tire of
> being so thick?
>

I doubt the Annals editors wish to put their careers on the line the
way you do.

If I had any concerns about whether or not they will carefully review
the material in terms of mathematical importance, novelty and value to
the discipline then I wouldn't be talking about it like I do, now would
I?

> > I also have a NEW paper just written a few days ago with this simplified
> > proof showing the problem with the ring of algebraic integers.
>
> Which is just another crock. Cut through all the irrelevant bullshit, and
> it boils down to your naive hope that:
>
> 7*a = 5*b
>
> implies 7 is a factor of b. But there isn't a single mathematician on the
> planet you can sell that to, because (a) you have no proof (you merely keep
> /repeating/ it); and, (b) it's false (and, worse, is obviously false at
> first sight to everyone with relevant knowledge).
>

Fine.

I've given you your chance, so you roll the dice.

It's your decision to make.


James Harris

sg...@hotmail.co.uk

Dec 10, 2006, 5:21:36 PM
James Harris wrote:

> Come on Ullrich. Even the dodge of posting as your "Tim Peters"
> persona doesn't protect you.

Haha. You're on form today, James.

-Rotwang's "Rotwang" persona

William Hughes

Dec 10, 2006, 5:49:29 PM

Doesn't the fact that they did not publish your paper
on non polynomial factorization give you pause?

- William Hughes

jst...@msn.com

Dec 10, 2006, 6:07:44 PM

No. That was a messy situation.

Besides you people get so used to discounting what I say that you
forget that I actually do have more information than you have, and when
I say I contact Princeton regularly about my paper, that's for real.

So far no word back from the editors is what the Annals staff tell me,
but hey, they are talking to me, get it?

If I had sent the paper on non-polynomial factorization to the Annals
first, then history might be different, as, of course, there is no way
there would have been an attempt at de-publication, and any math
journal where the editors followed the rules would publish the paper.

So what happened after the SWJPAM did its weird thing?

Well, their publication legally stands so Cameron University in
Oklahoma has the publication whether they want it or not, so other
journals could back off from the only somewhat revised paper on legal
grounds.

Or, um, that's my guess as I'm not an expert on copyright law.

The paper on my prime counting function doesn't have any of those
problems.

Ullrich and Magidin and people like them are the most vulnerable on
this newsgroup because if they just sit quiet and hope, then I make the
argument that they shouldn't be allowed to teach because they lack the
necessary ethics for teaching at universities and shouldn't be trusted
with young minds.


James Harris

Tim Peters

Dec 10, 2006, 6:14:03 PM
[jst...@msn.com]
>>>>> So I had to find two proofs of the same problem, where the second
>>>>> proof
>>>>> had to be obvious in such a way that there was no room for anyone to
>>>>> find even the hint of an objection.

[David C. Ullrich]
>>>> Curious that people _found_ objections, eh?

[jst...@msn.com]
>>> You're betting your career on that Ullrich, so you better read what they
>>> have carefully.
>>>
>>> I'm kind of puzzled by people like you and Magidin who actually have
>>> something to lose here.

[Tim Peters]


>> Yet nobody else is puzzled. Don't you tire of being so thick?

[jst...@msn.com]


> Come on Ullrich. Even the dodge of posting as your "Tim Peters"
> persona doesn't protect you.

The prosecution rests ;-) While you don't see it, everyone else does: here
as in everything else, your convictions are based on delusion.

BTW, it remains true that nobody else is puzzled. /Don't/ you tire of being
the only one without a clue?

> And it's your choice, I just find it curious.
>
> Why roll the dice one more time when the consequences can be the end of
> your career, no more teaching period, as you wouldn't even be able to
> teach high school math, and the loss of your retirement as well?
>
> Why take that gamble?

As I said last time:

"Gambling" on that you're wrong here is like betting the sun will
rise tomorrow. Except it's more certain: there are reasons, however
unlikely, for why the sun may not rise tomorrow. OTOH, the only
possibility that your "7*a = 5*b implies 7|b" argument is correct lies
with that mathematics may be inconsistent.

Get it? There is no debate here. You're wrong again. That's the end of the
story so far as the math goes: you have nothing here, apart from a trivial
rediscovery of what everyone else has known for a century (that the
algebraic integers form neither a field nor a unique factorization domain --
it's not news, and is not indicative of "a problem" of any kind).

>>> You should know by now that I have a paper on my prime counting
>>> function currently under review at the Annals of Mathematics.

>> From which nothing follows. In addition, you're the only one who
>> wouldn't
>> bet a great deal on that the paper will be rejected. Don't you tire of
>> being so thick?

> I doubt the Annals editors wish to put their careers on the line the
> way you do.

That the Annals editors are competent is what everyone is betting on. The
difference is that, because you're both incompetent and endlessly deluded,
you have exactly the wrong expectations about what they'll do. Your paper
will be rejected. The reasons why have been explained, but you prefer to
cling to your ignorance.

> If I had any concerns about whether or not they will carefully review
> the material in terms of mathematical importance, novelty and value to
> the discipline then I wouldn't be talking about it like I do, now would
> I?

If you're seeking a psychological explanation for why you post the crazy
stuff you do, you should ask a mental health professional. Self-diagnosis
didn't seem to do you much good:

http://mathforum.org/kb/message.jspa?messageID=444857&tstart=0

>>> I also have a NEW paper just written a few days ago with this simplified
>>> proof showing the problem with the ring of algebraic integers.

>> Which is just another crock. Cut through all the irrelevant bullshit,
>> and
>> it boils down to your naive hope that:
>>
>> 7*a = 5*b
>>
>> implies 7 is a factor of b. But there isn't a single mathematician on
>> the
>> planet you can sell that to, because (a) you have no proof (you merely
>> keep
>> /repeating/ it); and, (b) it's false (and, worse, is obviously false at
>> first sight to everyone with relevant knowledge).

> Fine.
>
> I've given you your chance, so you roll the dice.
>
> It's your decision to make.

Already did. So did Professor Ullrich. Not to deny that it's very generous
of you to allow us to decide for ourselves ;-)


jst...@msn.com

Dec 10, 2006, 6:19:14 PM

Fine.

But now I suggest to the rest of the newsgroup and anyone coming later
when certain events transpire that when Ullrich is forced out of
teaching permanently, remember, he was given a chance.

And he quite deliberately chose not to take it.

And please, don't let him try to play victim later, as if he didn't get
an opportunity.

He was given the chance here and now, but possibly thinks he is too
smart to get caught, no matter what, when I say, before the end of next
year, he will no longer be a math professor.


James Harris

Rupert

Dec 10, 2006, 6:24:23 PM

Would you like to make a bet about that?

>
> James Harris

William Hughes

Dec 10, 2006, 6:26:42 PM

Once they noticed how important the paper
was they could have contacted Cameron University and
asked for a release. Cameron University would not
refuse such a request.

They could also have said, "We cannot publish the paper
in this form; we need a complete rewrite" (the ideas in
the paper are not covered by copyright, only the expression
of those ideas).

Doesn't the fact that they did neither of these give you
pause?

- William Hughes

jst...@msn.com

Dec 10, 2006, 6:36:35 PM

No.

The internal politics at the journal aren't a big deal to me, since I
expect them to follow the rules closely with this paper.

But, I will say to you that there had to be some other things happening
of which you are not aware and of which I have more information than
you do, so I am not concerned as I think I know a lot about what
happened.

Make no mistake, the paper should get an objective and careful
consideration which should lead to a resolution which I will find
agreeable.

That is vague but at this level there is a lot of bowing to politics
that is necessary.

You should feel privileged to have as much information as you do, as
most of the time, people on the outside have no clue about what is
going on at top level organizations.


James Harris

William Hughes

Dec 10, 2006, 6:49:59 PM

jst...@msn.com wrote:
>
> Come on Ullrich. Even the dodge of posting as your "Tim Peters"
> persona doesn't protect you.
>

Wow! From Tim Peters to David Ullrich in one fell swoop.
Right to the top of the pantheon of JSH villains.
I gaze in wonder and envy.

-William (a lesser member of the pantheon) Hughes

Solbek

Dec 10, 2006, 7:44:31 PM

<jst...@msn.com> wrote in message
news:1165792754.3...@n67g2000cwd.googlegroups.com...
>
<snip>

>> >
>> > It's your decision to make.
>>
>> Already did. So did Professor Ullrich. Not to deny that it's very
>> generous
>> of you to allow us to decide for ourselves ;-)
>
> Fine.
>
> But now I suggest to the rest of the newsgroup and anyone coming later
> when certain events transpire that when Ullrich is forced out of
> teaching permanently, remember, he was given a chance.
>
> And he quite deliberately chose not to take it.
>
> And please, don't let him try to play victim later, as if he didn't get
> an opportunity.
>
> He was given the chance here and now, but possibly thinks he is too
> smart to get caught, no matter what, when I say, before the end of next
> year, he will no longer be a math professor.
>
>
> James Harris
>

Why so much hatred and jealousy?

Just because you can't understand math?

Or are you just playing troll?

Troll it is.

That makes you evil.

Evil James Harris.


jshs...@yahoo.com

Dec 10, 2006, 8:02:30 PM

James, if you are so confident that your paper will be published by the
Annals, why don't you actually back it up with something? If your paper
is not published (which of course you know it will be, since you have
more info than the rest of us), you will make a post declaring that you
admit to being a crank, and you agree to go get some help for it. If
your paper gets published, I will make a post declaring you a great
mathematician and will never criticize any of your work again.

William Hughes

Dec 10, 2006, 8:18:30 PM

So first you tell us you think that the reason it was not published
was that the copyright is held by another institution.
Now you tell us that you are confident you know what happened
but you can't talk about it.

You know, if such a thing were possible, this would lower your
credibility.

It's too bad you have decided to be so coy. It would be interesting
to learn why you think that the Annals rejected a previous paper,
but will not reject this one.

- William Hughes

Tim Peters

Dec 10, 2006, 9:29:37 PM
[jshs...@yahoo.com]

> James, if you are so confident that your paper will be published by the
> Annals, why don't you actually back it up with something? If your paper
> is not published (which of course you know it will be, since you have
> more info than the rest of us), you will make a post declaring that you
> admit to being a crank, and you agree to go get some help for it. If
> your paper gets published, I will make a post declaring you a great
> mathematician and will never criticize any of your work again.

Ya, James with a sense of fair play -- LOL. He never puts anything at stake
in recent years. If he's right, his imagined enemies should suffer the
torments of the damned -- but if he's wrong, so what?

Tell you what, though. If he agrees to your proposal, and his
prime-counting paper is published by the Annals, I'll give him a cashier's
check for US $10,000.00.


jshs...@yahoo.com

Dec 10, 2006, 9:45:16 PM

Yeah, but we all know that you are an imposter. :-)

You are Hughes, or Ullrich, or Santa Claus.

I am pretty sure I know what James will do also :-)

sg...@hotmail.co.uk

Dec 10, 2006, 9:58:15 PM
James Harris wrote:


> He was given the chance here and now, but possibly thinks he is too
> smart to get caught, no matter what, when I say, before the end of next
> year, he will no longer be a math professor.

I bet he's crapping his pants now; your predictions always come true. I
guess he's one of the lucky ones though. After all, Rick Decker and Tim
Peters only have 21 days to live.

-Rotwang

Tim Peters

Dec 10, 2006, 10:03:04 PM
[jst...@msn.com, to "Tim Peters"]

>> Come on Ullrich. Even the dodge of posting as your "Tim Peters"
>> persona doesn't protect you.

["William Hughes"]


> Wow! From Tim Peters to David Ullrich in one fell swoop.
> Right to the top of the pantheon of JSH villains.
> I gaze in wonder and envy.
>
> -William (a lesser member of the pantheon) Hughes

Nice one, Terry!

Love to Clarissa and the kids,
Greg


junoexpress

Dec 10, 2006, 10:11:59 PM

>
> The internal politics at the journal aren't a big deal to me, since I
> expect them to follow the rules closely with this paper.
>
> But, I will say to you that there had to be some other things happening
> of which you are not aware and of which I have more information than
> you do, so I am not concerned as I think I know a lot about what
> happened.
>
Bullshit. You don't know anything more about it.

> Make no mistake, the paper should get an objective and careful
> consideration which should lead to a resolution which I will find
> agreeable.
>

You've never published one paper in a real journal, or you would know
(as most people here already do), that your abstract ALONE is enough to
raise red flags with ANY REAL reviewer. Be prepared to wait awhile for
the pink slip though: a good journal can take a year to get fully
reviewed and ironed out. Also, be prepared, if it even gets to the
review stage ( hey, it's your fantasy world, so I'll play along), to go
through a lot of back and forth with the editors who will probably not
be so nice as those on sci.math have been with you.

> That is vague but at this level there is a lot of bowing to politics
> that is necessary.

That vagueness is called bullshit.

> You should feel privileged to have as much information as you do, as
> most of the time, people on the outside have no clue about what is
> going on at top level organizations.

Indeed.

What a heaping, stinking, load of crap. You know nothing, and yet you
lie and then wonder why people don't believe you. Geez....
>
>
> James Harris

Matt

Jesse F. Hughes

Dec 10, 2006, 10:48:23 PM
"Tim Peters" <tim...@comcast.net> writes:

> Tell you what, though. If he agrees to your proposal, and his
> prime-counting paper is published by the Annals, I'll give him a cashier's
> check for US $10,000.00.

Great. Thanks to you, the editor will likely make a deal with JSH and
agree to publish it for an 80-20 cut.

--
"Eventually the truth will come out, and you know what I'll do then?
Probably go to the beach. I'll also hang out in some bars. Yup, I'll
definitely hang out in some bars, preferably near a beach."
-- JSH on the rewards of winning a mathematical revolution

Tim Peters

Dec 10, 2006, 10:49:03 PM
[junoexpress]
>>...

> You've never published one paper in a real journal, or you would know
> (as most people here already do), that your abstract ALONE is enough to
> raise red flags with ANY REAL reviewer. Be prepared to wait awhile for
> the pink slip though: a good journal can take a year to get fully
> reviewed and ironed out.

The Annals has a "fast track" for short papers, and we know pretty much for
sure that James already had one short paper rejected by them. He also said
something cryptic a week or two ago implying he had another short-paper
rejection from them. So he may well have a realistic sense of how long it
"should" take to get a rejection in this case.

> Also, be prepared, if it even gets to the review stage ( hey, it's your
> fantasy world, so I'll play along), to go through a lot of back and forth
> with the editors who will probably not be so nice as those on sci.math
> have been with you.

Strongly disagree with that, and indeed believe it's part of "the problem":
James has proved beyond any doubt that he can't recognize a polite
brush-off, and that's what editors /normally/ hand out. They don't want to
get into a futile pissing contest with a crank on "work time". No sane
person does. That's why the only honest feedback James gets is on Usenet,
and even here people tend to treat him politely until he goes postal on them
first (which he invariably does).

Indeed, if James already had a paper or two rejected by the Annals, the fact
that he hasn't yet vilified the Annals here in dozens of hysterical new
threads proves that he was able to /interpret/ whatever they said to him as
"nice work! please try again!". Keep in mind that his capacity for
self-flattering delusion is far beyond yours and mine combined ;-) He's
even posted polite-brush-off emails as "proof" that Big Names can't find any
problems with his work.

> ...


junoexpress

Dec 10, 2006, 11:14:30 PM

Tim Peters wrote:

> > Also, be prepared, if it even gets to the review stage ( hey, it's your
> > fantasy world, so I'll play along), to go through a lot of back and forth
> > with the editors who will probably not be so nice as those on sci.math
> > have been with you.
>
> Strongly disagree with that, and indeed believe it's part of "the problem":
> James has proved beyond any doubt that he can't recognize a polite
> brush-off, and that's what editors /normally/ hand out. They don't want to
> get into a futile pissing contest with a crank on "work time". No sane
> person does. That's why the only honest feedback James gets is on Usenet,
> and even here people tend to treat him politely until he goes postal on them
> first (which he invariably does).
>
> Indeed, if James already had a paper or two rejected by the Annals, the fact
> that he hasn't yet vilified the Annals here in dozens of hysterical new
> threads proves that he was able to /interpret/ whatever they said to him as
> "nice work! please try again!". Keep in mind that his capacity for
> self-flattering delusion is far beyond yours and mine combined ;-) He's
> even posted polite-brush-off emails as "proof" that Big Names can't find any
> problems with his work.
>
> > ...

You're right: thank you for the reality check. :>)

Michael Press

Dec 10, 2006, 11:27:25 PM
In article
<1165792064....@80g2000cwy.googlegroups.com>,
jst...@msn.com wrote:

> Besides you people get so used to discounting what I say that you
> forget that I actually do have more information than you have, and when
> I say I contact Princeton regularly about my paper, that's for real.
>
> So far no word back from the editors is what the Annals staff tell me,
> but hey, they are talking to me, get it?

Yes. They know what is at stake, and will be very very
careful. One false step and they will have no end of
trouble. You were wise to submit your paper for review
there.

I do not know why you keep trying to educate the nay
sayers here. It must take immense time from your great
work, and supernal patience. All you get in return is
derision and denial. It is not as if there are many of
them either; far fewer than the number of different
names posting negative messages. We know how easy it is
to post under different names and accounts. Honestly, I
do not know why you bother.

--
Michael Press

jst...@msn.com

Dec 11, 2006, 12:53:28 AM

Well, idealism has its price.

There are of course other factors.

If any of you really knew my place in history, and really understood
just how big these results are, would you dare talk to me?

No. You wouldn't. But now, thinking that I am wrong and incapable of
getting my research accepted unless I convince some nobodies on Usenet,
you do.

I know what the future holds. You clearly do not.


James Harris

jshs...@yahoo.com

Dec 11, 2006, 1:01:10 AM

If we are nobodies and you talk to important people, then why do you
post here?

Seems to me that you spend a good part of your life trying to convine
those on usenet that you are a great mathematician.

!0+ years and still few if any(probably zero) think of you as anything
but a crank. So to all of the nobodies who post in these math
newsgroups, you are a joke. Must make you feel important.

jshs...@yahoo.com

Dec 11, 2006, 1:03:14 AM

That was supposed to be "convince those on usenet" and 10+ years.

junoexpress

Dec 11, 2006, 1:09:54 AM

> > > So far no word back from the editors is what the Annals staff tell me,
> > > but hey, they are talking to me, get it?
> >
NO.
First, you don't contact the editors bugging them like this. They send
you a form saying they received your paper, and then you do like
everyone else, ...you wait. Oh no, I'm sorry, I forgot, you're James
Harris, saving the world, the youth of tomorrow, and the top
mathematical minds in the world from mass suicide all in one fell
swoop. No, you sir, get to go to the front of the line, as you _clearly_
are not like the other 1,000 PhDs in math who submitted their papers.

>
> If any of you really knew my place in history, and really understood
> just how big these results are, would you dare talk to me?
>
> No. You wouldn't. But now, thinking that I am wrong and incapable of
> getting my research accepted unless I convince some nobodies on Usenet,
> you do.
>
> I know what the future holds. You clearly do not.
>

And by James's logic, he is absolutely correct.
Here's the proof:

Premise: A poster thinks JSH's place in history is not important.

Now here we go, ready?

If the poster knew JSH's (correct) place in history, then he would not
(be able to) talk to him.
But the poster _does_ post; therefore he does not know JSH's true
place in history.
Therefore the premise is incorrect, and JSH's place in history _is_
important.

I love it: the pure, undistilled, classic essence of JSH.

Matt

sg...@hotmail.co.uk

Dec 11, 2006, 1:13:17 AM
James Harris wrote:

> I know what the future holds. You clearly do not.

Don't be cagey. Give us more predictions, damnit! I love your
predictions.

-Rotwang

Proginoskes

Dec 11, 2006, 1:25:36 AM

A lot of messages have been posted, and I can't reply to all of them.
But what I can do (and am doing) is compile the relevant parts, in a
stream-of-consciousness sort of way. ("Extreme posting") JSH doesn't
need much context.

jst...@msn.com wrote:
>
> You should know by now that I have a paper on my prime counting
> function currently under review at the Annals of Mathematics.

For those of you in sci.mathland who don't know what "under review"
means, it means they have received the paper but haven't gotten
feedback from the "referees" whether to publish or not. Every paper
goes through this process.

> I also
> have a NEW paper just written a few days ago with this simplified proof
> showing the problem with the ring of algebraic integers.

I also have new papers.

In another post:

> So what happened after the SWJPAM did its weird thing?
>
> Well, their publication legally stands so Cameron University in
> Oklahoma has the publication whether they want it or not, so other
> journals could back off from the only somewhat revised paper on legal
> grounds.
>

> Or, um, that's my guess as I'm not an expert on copyright law.

Or (apparently) anything else, for that matter.

Elsewhere in the thread:

> But now I suggest to the rest of the newsgroup and anyone coming later
> when certain events transpire that when Ullrich is forced out of
> teaching permanently, remember, he was given a chance.

I read an article in the newspaper today about how the Taliban killed
two women because they were *teachers*. JSH has wonderful company,
doesn't he?

> He was given the chance here and now, but possibly thinks he is too
> smart to get caught, no matter what, when I say, before the end of next
> year, he will no longer be a math professor.

Just in case he removes that post, it was posted Sun, Dec 10, 2006, at
4:19 pm. Does anyone want to take any bets on whether Ullrich will
still be teaching then? ((Later:)) I guess I'm not the first one to ask
that.

A piece of a post from William Hughes, for a change:

> It would be interesting
> to learn why you think that the Annals rejected a previous paper,
> but will not reject this one.

Simple; because this result is *derived*.

One from "junoexpress":

> You've never published one paper in a real journal, or you would know
> (as most people here already do), that your abstract ALONE is enough to
> raise red flags with ANY REAL reviewer. Be prepared to wait awhile for
> the pink slip though: a good journal can take a year to get fully
> reviewed and ironed out. Also, be prepared, if it even gets to the
> review stage ( hey, it's your fantasy world, so I'll play along), to go
> through a lot of back and forth with the editors who will probably not
> be so nice as those on sci.math have been with you.

Makes you kind of wish you could monitor the communication, doesn't it?

(But you know, you *can*; after all Surrogate Factoring, another JSH
invention, can be used to crack any coded message. It caused the Great
Fall of Civilization in February of 2006, you know.) (Sarcasm, in case
anyone is wondering.)

Tim Peters (in one of his guises) posted:

> The Annals has a "fast track" for short papers, and we know pretty much for
> sure that James already had one short paper rejected by them. He also said
> something cryptic a week or two ago implying he had another short-paper
> rejection from them. So he may well have a realistic sense of how long it
> "should" take to get a rejection in this case.

Oh, so JSH wasn't BS'ing about his paper taking less time than usual,
then? That's a first, then. (I posted a few weeks back that my paper
had been in review since March 2006.)

> Strongly disagree with that, and indeed believe it's part of "the problem":
> James has proved beyond any doubt that he can't recognize a polite
> brush-off, and that's what editors /normally/ hand out. They don't want to
> get into a futile pissing contest with a crank on "work time". No sane
> person does. That's why the only honest feedback James gets is on Usenet,
> and even here people tend to treat him politely until he goes postal on them
> first (which he invariably does).

But if he starts going postal on Princeton, they might just call the
police and have JSH picked up. JSH could be blogging from jail! (A new
twist for him.)

> Indeed, if James already had a paper or two rejected by the Annals, the fact
> that he hasn't yet vilified the Annals here in dozens of hysterical new
> threads proves that he was able to /interpret/ whatever they said to him as
> "nice work! please try again!". Keep in mind that his capacity for
> self-flattering delusion is far beyond yours and mine combined ;-) He's
> even posted polite-brush-off emails as "proof" that Big Names can't find any
> problems with his work.

He's not the only one, either; Archimedes Plutonium proudly posted a
rejection e-mail that he received about his "proof" of the Twin Prime
Conjecture.

Michael Press:

> I do not know why you keep trying to educate the nay
> sayers here. It must take immense time from your great
> work, and supernal patience.

Does anyone know what JSH does for a living?

JSH:

> If any of you really knew my place in history, and really understood
> just how big these results are, would you dare talk to me?

Delusions of grandeur.

--- Christopher Heckman

Tim Peters

Dec 11, 2006, 2:41:39 AM
...

[Tim Peters (in one of his guises)]


>> The Annals has a "fast track" for short papers, and we know pretty much
>> for sure that James already had one short paper rejected by them. He
>> also said something cryptic a week or two ago implying he had another
>> short-paper rejection from them. So he may well have a realistic sense
>> of how long it "should" take to get a rejection in this case.

[Proginoskes]


> Oh, so JSH wasn't BS'ing about his paper taking less time then usual,
> then?

I find it least stressful to assume JSH is always BS'ing. However, in the
specific case of the Annals:

http://annals.princeton.edu/EditorsStatement.html

...

To encourage the submission of excellent short papers to the Annals,
the editors announce that Annals papers under 20 printed pages in
length will be published on an accelerated schedule. We will also
make efforts to expedite the refereeing of excellent short papers.

For example, Cao and Titi's 19-page "Global well-posedness of the
three-dimensional viscous primitive equations of large scale ocean and
atmosphere dynamics" was submitted 2-Nov-2005 and accepted 28-Feb-2006, less
than 4 months total. OTOH, Minsky's 102-page "The classification of
Kleinian surface groups, I. Models and bounds" spanned 10-Apr-2003 to
20-Apr-2006, 3 years.

I don't think those are /typical/, probably just extremes that hit my eye
early on. In the other direction, e.g., Bromberg's 17-page "Projective
structures with degenerate holonomy and the Bers density conjecture" spanned
6-Dec-2002 to 7-Jul-2006.

In the case of James's paper, you have to suspect that the first reviewer
will either die of shock or bounce it immediately. In one of those cases,
response will be swift ;-)

> That's a first, then. (I posted a few weeks back that my paper had been
> in review since March 2006.)

You don't think that's unusual, right? There's a reason arXiv is so widely
lauded.

If you work Python into your next paper, you can submit it to the next
Python conference. In exchange for what some people consider to be pocket
change, I'll push it through the review process and ensure it's accepted
within just a few months ;-)

> ...


John

Dec 11, 2006, 2:56:15 AM

<jshs...@yahoo.com> wrote in message
news:1165816870.4...@l12g2000cwl.googlegroups.com...

Because he is a TROLL

>
> Seems to me that you spend a good part of your life trying to convine
> those on usenet that you are a great mathematician.
>
> !0+ years and still few if any(probably zero) think of you as anything
> but a crank. So to all of the nobodies who post in these math
> newsgroups, you are a joke. Must make you feel important.
>

He is a TROLL, that is a fact.


Tim Peters

Dec 11, 2006, 3:20:43 AM
[jst...@msn.com]
> ...

> If any of you really knew my place in history, and really understood
> just how big these results are, would you dare talk to me?

That depends. If your place in history is that of a serial killer who
eventually murders hundreds of imagined enemies, and people knew that now, I
bet a lot of people would stop talking to you entirely.

Is that what you believe? If, for example, you're sane, then you know that
if your place in history were that of a great mathematical discoverer,
people would be very eager to talk to you here; and, if you're also honest,
you wouldn't pretend otherwise.

> No. You wouldn't. But now, thinking that I am wrong and incapable of
> getting my research accepted unless I convince some nobodies on Usenet,
> you do.

You misunderstand that last part: people on Usenet can tell you when a
piece of your mathematical work is badly wrong. They can do that because
they know far more math than you know. It's a fact that whenever the usual
cast of characters here has told you a piece of work was badly wrong, it was
badly wrong. That's not going to change, and you have no chance of getting
badly wrong work accepted -- think! If even nobodies can see it's badly
wrong, who are you to think you know better? Less than nobody. In more
than a decade, you've never been right about anything the serious people on
Usenet have told you was wrong (although you're still denying that about a few
of your more cherished wrong claims).

> I know what the future holds. You clearly do not.

You tried telling us, though:

http://mathforum.org/kb/message.jspa?messageID=444857&tstart=0

I don't think I'd be looking forward to ending life as "an oddball recluse -
derided, feared, and loathed in equal measures", but to each his own.


Keith Ramsay

unread,
Dec 11, 2006, 4:24:45 AM12/11/06
to

Tim Peters wrote:
[...]

|Which is just another crock. Cut through all the irrelevant bullshit,
|and it boils down to your naive hope that:
|
| 7*a = 5*b
|
|implies 7 is a factor of b. But there isn't a single mathematician on
|the planet you can sell that to, because (a) you have no proof (you
|merely keep /repeating/ it); and, (b) it's false (and, worse, is
|obviously false at first sight to everyone with relevant knowledge).

I haven't been following this latest round in detail. If 7a=5b,
then b=15b-14b=3*(5b)-14b=3*(7a)-7*(2b)=7*(3a-2b). If a
and b are algebraic integers, then 3a-2b is an algebraic
integer, and b is divisible by 7. Is this a case where we
don't have that a and b are algebraic integers, or the
divisibility isn't in the algebraic integers, or what? I think
you could be more plain about the nature of the problem.
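
For anyone who wants to check the bookkeeping, here is a minimal sympy
sketch of that rearrangement; the variable names and the sympy framing are
mine, and it says nothing about which ring anything lives in:

    from sympy import symbols, expand

    a, b = symbols('a b')

    # The identity: b = 15b - 14b = 3*(5b) - 14b = 3*(7a) - 7*(2b) = 7*(3a - 2b),
    # where the middle step substitutes 5b = 7a.  Verify that the difference
    # b - 7*(3a - 2b) is a multiple of (7a - 5b), so it vanishes exactly when
    # the hypothesis 7a = 5b holds.
    diff = expand(b - 7*(3*a - 2*b))
    print(diff)                           # -21*a + 15*b
    print(expand(diff + 3*(7*a - 5*b)))   # 0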

Earlier in this stuff people looked up some interesting facts
about the ring of algebraic integers. One is that it's a
Schreier domain: an integrally closed integral domain where
any two factorizations have equivalent refinements. This is
not the same as unique factorization, but makes the ring of
algebraic integers a bit more like unique factorization
domains than the ring of integers of an arbitrary number
field is, for instance. It makes certain kinds of handwaving
arguments work out in spite of being based on naive hopes.

Keith Ramsay

Tim Peters

unread,
Dec 11, 2006, 6:18:18 AM12/11/06
to
[Tim Peters]

> [...]
> |Which is just another crock. Cut through all the irrelevant bullshit,
> |and it boils down to your naive hope that:
> |
> | 7*a = 5*b
> |
> |implies 7 is a factor of b. But there isn't a single mathematician on
> |the planet you can sell that to, because (a) you have no proof (you
> |merely keep /repeating/ it); and, (b) it's false (and, worse, is
> |obviously false at first sight to everyone with relevant knowledge).


[Keith Ramsay]


> I haven't been following this latest round in detail. If 7a=5b,
> then b=15b-14b=3*(5b)-14b=3*(7a)-7*(2b)=7*(3a-2b). If a
> and b are algebraic integers, then 3a-2b is an algebraic
> integer, and b is divisible by 7. Is this a case where we
> don't have that a and b are algebraic integers, or the
> divisibility isn't in the algebraic integers, or what?

Who knows? James has refused hundreds of requests (and at least a dozen in
this thread) to specify which ring he's working in. Instead he endlessly
pushes symbols around with no mention of rings, and repeats that "algebra"
proves b has 7 as a factor. Then he acknowledges that in the much messier
expression he's actually working with (of the /form/ 7a=5b) it's the case
that b is an algebraic integer that does not have 7 as a factor. (In this
case, it so happens that `a` is an algebraic number but not an algebraic
integer -- which is why your cogent argument above doesn't apply.)

His conclusion is threefold: (1) the ring of algebraic integers is "flawed"
(sometimes he says they "have a coverage problem" instead); (2) Galois
theory "as usually taught" (but not Galois theory itself) is wrong; and, (3)
the "theory of ideals does not work" (the theory itself, as well as how it's
taught).

> I think you could be more plain about the nature of the problem.

Easily: Harris is a crank, and that's the only problem here :-( If you
want to know what he means by any of his 3 conclusions, you'll have to ask
him.

The derivation of the expressions he's working with is an unmotivated mess,
and I don't believe it would make it any plainer to reproduce that again
(James has repeated it several times each day). In the /end/, he has:

7*g(x) = 5*a_2(x) [1]

where a_2(x) is one of the roots of the quadratic (in `a`):

a^2 - (7*x-1)*a + (49*x^2 - 14*x) = 0 [2]

He stares at [1] and concludes that 7 must be a factor of a_2(x), because of
"algebra". Then he plugs x=1 into [2], to get:

a^2 - 6*a + 35 = 0 [3]

Then he notes that neither root of [3] has 7 as a factor in the ring of
algebraic integers. And that "contradicts algebra".

The roots of [3] are 3 +/- sqrt(-26). Only the "+" choice makes sense in
context, but that's been ignored. Instead the claim is that exactly one of
the roots must be divisible by 7 (and the other root is "really a unit",
again according to algebra, and again without specifying a ring).

As a ratio of algebraic integers, g(1) = 5*(3+sqrt(-26))/7, which is an
algebraic number but not an algebraic integer. Since the denominator of
g(1) /is/ 7, there's no mystery to anyone (except James) why multiplying it
by 7 doesn't magically force a_2(x) to be divisible by 7 too.
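
For anyone who wants to poke at the x=1 numbers directly, here is a small
sympy sketch; the names a2 and g1 are mine, and "algebraic integer" is
checked the pedestrian way, by looking at whether the minimal polynomial
over the integers is monic:

    from sympy import symbols, solve, sqrt, simplify, minimal_polynomial

    a, x = symbols('a x')

    # Roots of [3]: a^2 - 6*a + 35 = 0 are 3 +/- sqrt(-26).
    print(solve(a**2 - 6*a + 35, a))      # [3 - sqrt(26)*I, 3 + sqrt(26)*I]

    a2 = 3 + sqrt(-26)                    # the "+" root
    g1 = 5*a2/7                           # chosen so that [1] holds at x = 1
    print(simplify(7*g1 - 5*a2))          # 0

    # a2 is an algebraic integer: monic minimal polynomial over the integers.
    print(minimal_polynomial(a2, x))      # x**2 - 6*x + 35
    # g1 is an algebraic number but not an algebraic integer: the leading
    # coefficient 7 -- the denominator -- can't be cleared.
    print(minimal_polynomial(g1, x))      # 7*x**2 - 30*x + 125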

He also has an even more confused "proof" that "trivial algebra" shows that

g(x) must divide off some factors in common with 7

is false. He clings to that despite that 7 is sitting all alone in the
denominator of g(1) in his own x=1 example.

And ... well, let me know whether this was "plain enough". How can such
relentless nonsense be summarized clearly?

> Earlier in this stuff people looked up some interesting facts
> about the ring of algebraic integers. One is that it's a
> Schreier domain: an integrally closed integral domain where
> any two factorizations have equivalent refinements. This is
> not the same as unique factorization, but makes the ring of
> algebraic integers a bit more like unique factorization
> domains than the ring of integers of an arbitrary number
> field is, for instance. It makes certain kinds of handwaving
> arguments work out in spite of being based on naive hopes.

LOL -- Keith, when's the last time you really read a JSH thread? Maybe only
half LOL, though: to my eyes his ability to make plausible technical
arguments has been going downhill steadily for at least two years. Schreier
domains? I wish. He rarely makes technical arguments that make it above
"not even wrong" anymore.

For example, the real reason he never specifies a ring when talking about
factors and units (or any other term requiring a ring for context) is his
rarely /stated/ belief that there is, in fact, only one ring. Here from a
post in October:

...

[Rupert]
> Meaningless. Specify what ring you're working in.

[JSH]
You are going backwards. The point of the demonstration is that the
OLD WAYS DO NOT WORK so the old way of emphasizing what ring don't
work.

It turns out that there is really only one ring.

I call it the ring of objects.

But rather than go into that, I like pointing out that the old ideas
lead to the appearance of contradictions.

Like how you can prove factors in common with 3 and then find that you
can't get algebraic integer results that agree!!!

That's because the ring of algebraic integers has some issues--it has
problems.

Easily proven problems when you have the right tools.

...

Stuff like this is more delusional than "wrong" -- and this is as good as it
gets anymore.


Richard Tobin

unread,
Dec 11, 2006, 7:18:32 AM12/11/06
to
In article <1165788572.7...@80g2000cwy.googlegroups.com>,
<jst...@msn.com> wrote:

>Come on Ullrich. Even the dodge of posting as your "Tim Peters"
>persona doesn't protect you.

Did you just notice that "Tim Peters" is an anagram of "prime test"?

-- Richard
--
"Consideration shall be given to the need for as many as 32 characters
in some alphabets" - X3.4, 1963.

Tim Peters

unread,
Dec 11, 2006, 7:36:18 AM12/11/06
to
[jst...@msn.com]

>> Come on Ullrich. Even the dodge of posting as your "Tim Peters"
>> persona doesn't protect you.


[Richard Tobin]


> Did you just notice that "Tim Peters" is an anagram of "prime test"?

Probably not, but if he ever figures out that "David Ullrich" is an anagram
of "dull rich diva" too, he'll start paying a lot more attention ;-)


Chip Eastham

unread,
Dec 11, 2006, 8:40:53 AM12/11/06
to

Keith Ramsay wrote:
> Tim Peters wrote:
> [...]
> |Which is just another crock. Cut through all the irrelevant bullshit,
> |and it boils down to your naive hope that:
> |
> | 7*a = 5*b
> |
> |implies 7 is a factor of b. But there isn't a single mathematician on
> |the planet you can sell that to, because (a) you have no proof (you
> |merely keep /repeating/ it); and, (b) it's false (and, worse, is
> |obviously false at first sight to everyone with relevant knowledge).
>
> I haven't been following this latest round in detail. If 7a=5b,
> then b=15b-14b=3*(5b)-14b=3*(7a)-7*(2b)=7*(3a-2b). If a
> and b are algebraic integers, then 3a-2b is an algebraic
> integer, and b is divisible by 7. Is this a case where we
> don't have that a and b are algebraic integers, or the
> divisibility isn't in the algebraic integers, or what? I think
> you could be more plain about the nature of the problem.

[snip]

Hi, Keith:

To restore a bit more of the flavor of the JSH claim,
he says something like:

7g = 5a (sorry, notation conflicts with that above)

where a = 3 - sqrt(-26), root of x^2 - 6x + 35 = 0. But
JSH seems to insert g into the picture "by hand". So
my interpretation is this is a situation that one but not
both of the "letters" is an algebraic integer.

To revisit your argument, if a,b are algebraic integers
such that 7a = 5b, then 7 does divide b, and similarly 5
divides a. So, even though 5 and 7 are not primes in the
ring of algebraic integers, as ideals (5) \cap (7) = (35)
just as in the rational integers.
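
For instance (a small sympy check; the witness 3 +/- sqrt(-26) is my pick,
reusing the numbers already in this thread): 5 divides the product
(3 + sqrt(-26))*(3 - sqrt(-26)) = 35 but divides neither factor, which is
why 5 is not prime in the ring of algebraic integers, and 7 behaves the
same way.

    from sympy import sqrt, expand, minimal_polynomial, symbols

    x = symbols('x')
    s = sqrt(-26)

    print(expand((3 + s)*(3 - s)))           # 35, divisible by 5 (and by 7)

    # Neither factor is divisible by 5 in the algebraic integers: the
    # quotients have non-monic minimal polynomials over the integers.
    print(minimal_polynomial((3 + s)/5, x))  # 5*x**2 - 6*x + 7
    print(minimal_polynomial((3 - s)/5, x))  # 5*x**2 - 6*x + 7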

regards, chip

David Moran

unread,
Dec 11, 2006, 8:41:18 AM12/11/06
to

<jst...@msn.com> wrote in message
news:1165816408.4...@f1g2000cwa.googlegroups.com...

Ahhhh classic crackpot behavior.

Dave


David C. Ullrich

unread,
Dec 11, 2006, 9:57:35 AM12/11/06
to
On 10 Dec 2006 11:33:00 -0800, jst...@msn.com wrote:

>
>David C. Ullrich wrote:


>> On 8 Dec 2006 17:43:52 -0800, jst...@msn.com wrote:
>>
>> >So I had to find two proofs of the same problem, where the second proof
>> >had to be obvious in such a way that there was no room for anyone to
>> >find even the hint of an objection.
>>

>> Curious that people _found_ objections, eh?
>>
>

>You're betting your career on that Ullrich, so you better read what
>they have carefully.

How bizarre. I could swear it was just yesterday or a few days
ago that you made a post where you asked us to pay no attention
to these ravings about Consequences.

>I'm kind of puzzled by people like you and Magidin who actually have
>something to lose here.
>

>You should know by now that I have a paper on my prime counting
>function currently under review at the Annals of Mathematics.

Yes, I know that. I've seen the "paper". They're not going to
publish it.

>I also
>have a NEW paper just written a few days ago with this simplified proof

>showing the problem with the ring of algebraic integers.
>
>And noting that you can move a factor of 2 with
>
>175x^2 - 15x + 2 = (f(x) + 2)*(g(x) + 1)
>
>is just so basic and irrefutable that now for the first time, even
>casual readers can easily determine that posters are wrong when they
>dispute this result.
>
>If you sit back and wait, assuming nothing will happen, then when it
>does and my results are published, then your career as a teaching
>professor is gone, and not even tenure will save you.

God you're stupid. Even if you _were_ right about the math, which
of course is simply not so, the idea that my saying you were wrong
would have some adverse consequences for my job is simply hilarious.

>Your only hope at this point is to claim confusion and start defending
>the correct mathematical proofs now that you can see a simple and
>obvious explanation.
>
>After all, with the loss of your position and tenure, you would
>probably lose most of your retirement funds as well, and have to start
>over with some new career with a major black mark against you.
>
>Why throw the dice?
>
>
>James Harris


************************

David C. Ullrich

David C. Ullrich

unread,
Dec 11, 2006, 10:04:21 AM12/11/06
to
On Sun, 10 Dec 2006 22:49:03 -0500, "Tim Peters" <tim...@comcast.net>
wrote:

>[junoexpress]
>>>...
>> You've never published one paper in a real journal, or you would know
>> (as most people here already do), that your abstract ALONE is enough to
>> raise red flags with ANY REAL reviewer. Be prepared to wait awhile for
>> the pink slip though: a good journal can take a year to get fully
>> reviewed and ironed out.
>
>The Annals has a "fast track" for short papers, and we know pretty much for
>sure that James already had one short paper rejected by them. He also said
>something cryptic a week or two ago implying he had another short-paper
>rejection from them. So he may well have a realistic sense of how long it
>"should" take to get a rejection in this case.

My conjecture is that they're dragging their feet because as long
as this one is "under review" they don't have to put up with any
crap from James except for queries about how the review process
is coming along. Once they reject it they have to start dealing
with the next one (as well as with whatever other bizare behavior
he exhibits when a journal rejects a paper of his. He has information
we don't, remember. Who knows what that information is? Maybe he
knows that when a journal rejects his paper he threatens to get
them all fired.)

Oops. Just realized I'm talking to myself. Never mind...

>> Also, be prepared, if it even gets to the review stage ( hey, it's your
>> fantasy world, so I'll play along), to go through a lot of back and forth
>> with the editors who will probably not be so nice as those on sci.math
>> have been with you.
>
>Strongly disagree with that, and indeed believe it's part of "the problem":
>James has proved beyond any doubt that he can't recognize a polite
>brush-off, and that's what editors /normally/ hand out. They don't want to
>get into a futile pissing contest with a crank on "work time". No sane
>person does. That's why the only honest feedback James gets is on Usenet,
>and even here people tend to treat him politely until he goes postal on them
>first (which he invariably does).
>
>Indeed, if James already had a paper or two rejected by the Annals, the fact
>that he hasn't yet villified the Annals here in dozens of hysterical new
>threads proves that he was able to /interpret/ whatever they said to him as
>"nice work! please try again!". Keep in mind that his capacity for
>self-flattering delusion is far beyond yours and mine combined ;-) He's
>even posted polite-brush-off emails as "proof" that Big Names can't find any
>problems with his work.
>
>> ...
>


************************

David C. Ullrich

Michael Press

unread,
Dec 11, 2006, 3:35:16 PM12/11/06
to
In article
<1165816408.4...@f1g2000cwa.googlegroups.com>,
jst...@msn.com wrote:

I hope that I did not overstep my bounds by talking to you.

> No. You wouldn't. But now, thinking that I am wrong and incapable of
> getting my research accepted unless I convince some nobodies on Usenet,
> you do.
>
> I know what the future holds. You clearly do not.

--
Michael Press

Tim Peters

unread,
Dec 11, 2006, 4:05:34 PM12/11/06
to
[Tim Peters]

>> [...]
>> |Which is just another crock. Cut through all the irrelevant bullshit,
>> |and it boils down to your naive hope that:
>> |
>> | 7*a = 5*b
>> |
>> |implies 7 is a factor of b. But there isn't a single mathematician on
>> |the planet you can sell that to, because (a) you have no proof (you
>> |merely keep /repeating/ it); and, (b) it's false (and, worse, is
>> |obviously false at first sight to everyone with relevant knowledge).

[Keith Ramsay]


>> I haven't been following this latest round in detail. If 7a=5b,
>> then b=15b-14b=3*(5b)-14b=3*(7a)-7*(2b)=7*(3a-2b). If a
>> and b are algebraic integers, then 3a-2b is an algebraic
>> integer, and b is divisible by 7. Is this a case where we
>> don't have that a and b are algebraic integers, or the
>> divisibility isn't in the algebraic integers, or what? I think
>> you could be more plain about the nature of the problem.

[Chip Eastham]


> To restore a bit more of the flavor of the JSH claim,
> he says something like:
>
> 7g = 5a (sorry, notation conflicts with that above)
>
> where a = 3 - sqrt(-26), root of x^2 - 6x + 35 = 0.

Not that it matters ;-), but in context only the other root (3+sqrt(-26))
fits here. (3-sqrt(-26) fits his other equation, respelled in your choice
of notation as f = 5a+5; these root choices are forced by the requirement
(in James's notation) f(0)=g(0)=0)

> But JSH seems to insert g into the picture "by hand". So
> my interpretation is this is a situation that one but not
> both of the "letters" is an algebraic integer.

In his specific example (picking f(1) and g(1) and a_1(1) and a_2(1)), yes,
`a` happens to be an algebraic integer while `g` happens to be an algebraic
number that's not also an algebraic integer. But these are all "after the
fact". James refuses to identify the domain(s) of his variables in advance,
insisting that his divisibility "results" follow from "simple algebra" alone.

> To revisit your argument, if a,b are algebraic integers
> such that 7a = 5b, then 7 does divide b, and similarly 5
> divides a. So, even though 5 and 7 are not primes in the
> ring of algebraic integers, as ideals (5) \cap (7) = (35)
> just as in the rational integers.

Note that Keith's argument is an instance of the general truth that

xa = yb

implies x|b whenever x and y are coprime, assuming that all variables are
elements of a Bezout domain (which the algebraic integers are -- although,
incidentally, James has also claimed that Dedekind was wrong about that(!)).
It doesn't really matter that x and y are rational integers. The important
thing is that x' and y' exist s.t.

xx' + yy' = 1

It's not a coincidence ;-) that -2*7 + 3*5 = 1, and that Keith started his
argument by noting b = 3*5*b - 2*7*b. More generally, it's

b = 1*b =
(xx' + yy')b =
xx'b + yy'b =
xx'b + xay' =
x(x'b + y'a)
-> x|b

Your observation that (x) \cap (y) = (xy) also extends to any coprime x & y
(rational integer or not; "prime" or not).
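
A sympy sketch of that chain, spelling x' and y' as xp and yp (the names
and the sanity-check framing are mine):

    from sympy import symbols, expand

    x, y, a, b, xp, yp = symbols('x y a b xp yp')

    # Claim: x*a = y*b together with x*xp + y*yp = 1 forces b = x*(xp*b + yp*a).
    # Check it by writing the difference as a combination of the two hypotheses,
    # so it vanishes whenever both hold.
    diff = b - x*(xp*b + yp*a)
    combo = -b*(x*xp + y*yp - 1) - yp*(x*a - y*b)
    print(expand(diff - combo))   # 0

    # Keith's instance: x=7, y=5, xp=-2, yp=3, since -2*7 + 3*5 = 1, giving
    # b = 7*(-2*b + 3*a) = 7*(3*a - 2*b).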


Proginoskes

unread,
Dec 11, 2006, 5:51:38 PM12/11/06
to

I just hope he doesn't find out "Proginoskes" anagrams to "Pinko
Ogress", or "Pokier Songs", or "Pokes Groins" ...

--- Christopher Heckman

Richard Tobin

unread,
Dec 11, 2006, 7:50:58 PM12/11/06
to
In article <1165877497.9...@n67g2000cwd.googlegroups.com>,

Proginoskes <CCHe...@gmail.com> wrote:
>I just hope he doesn't find out "Proginoskes" anagrams to "Pinko
>Ogress", or "Pokier Songs", or "Pokes Groins" ...
>
> --- Christopher Heckman

AKA
Sharp Chicken Mother
Rotherham Pinchecks
Pinch Thermos Hacker

Dave Rosoff

unread,
Dec 11, 2006, 8:26:11 PM12/11/06
to
On Mon, 12 Dec 2006, Richard Tobin wrote:

> In article <1165877497.9...@n67g2000cwd.googlegroups.com>,
> Proginoskes <CCHe...@gmail.com> wrote:
>> I just hope he doesn't find out "Proginoskes" anagrams to "Pinko
>> Ogress", or "Pokier Songs", or "Pokes Groins" ...
>>
>> --- Christopher Heckman
>
> AKA
> Sharp Chicken Mother
> Rotherham Pinchecks
> Pinch Thermos Hacker
>
> -- Richard

Richard Tobin = Bird Cat Rhino

jst...@msn.com

unread,
Dec 11, 2006, 8:47:37 PM12/11/06
to
David C. Ullrich wrote:
> On 10 Dec 2006 11:33:00 -0800, jst...@msn.com wrote:
>
> >
> >David C. Ullrich wrote:
> >> On 8 Dec 2006 17:43:52 -0800, jst...@msn.com wrote:
> >>
> >> >So I had to find two proofs of the same problem, where the second proof
> >> >had to be obvious in such a way that there was no room for anyone to
> >> >find even the hint of an objection.
> >>
> >> Curious that people _found_ objections, eh?
> >>
> >
> >You're betting your career on that Ullrich, so you better read what
> >they have carefully.
>
> How bizarre. I could swear it was just yesterday or a few days
> ago that you made a post where you asked us to pay no attention
> to these ravings about Consequences.

I reminded people to consider context.

The second proof removes any area for confusion, so there is a new
situation.

My hope has been that some of you would see reason rather than waiting
for the inevitable, as if it won't happen, just because it didn't
before now.


> >I'm kind of puzzled by people like you and Magidin who actually have
> >something to lose here.
> >
> >You should know by now that I have a paper on my prime counting
> >function currently under review at the Annals of Mathematics.
>
> Yes, I know that. I've seen the "paper". They're not going to
> publish it.
>

So you're betting your career on that? Why?

It'd be smarter to just back off from such rigid statements, like,
notice, I haven't said they will publish my paper, though of course, I
hope they do.

Why dig your own grave here?

I still find it puzzling.

> >I also
> >have a NEW paper just written a few days ago with this simplified proof
> >showing the problem with the ring of algebraic integers.
> >
> >And noting that you can move a factor of 2 with
> >
> >175x^2 - 15x + 2 = (f(x) + 2)*(g(x) + 1)
> >
> >is just so basic and irrefutable that now for the first time, even
> >casual readers can easily determine that posters are wrong when they
> >dispute this result.
> >
> >If you sit back and wait, assuming nothing will happen, then when it
> >does and my results are published, then your career as a teaching
> >professor is gone, and not even tenure will save you.
>
> God you're stupid. Even if you _were_ right about the math, which
> of course is simply not so, the idea that my saying you were wrong
> would have some adverse consequences for my job is simply hilarious.
>

But this result is a first in mathematics, a "core" error that goes all
the way back to Dedekind, and impacts mathematical arguments and famous
mathematicians over more than a hundred years.

I first got published a proof of the core error three years ago, but
the paper was shot down by sci.math'ers bushwhacking the editors of a
journal which coincidentally was published by Cameron University, a
part of the Oklahoma State University system, with false emails.

The journal editors after trying to de-publish only managed one more
edition before calling it quits, so that's the death of a journal in
your own university system from false accusations.

Beyond that I have my prime counting research which explains the 'why'
of the prime distribution from why the prime count matches as closely
as it does with continuous functions like Li(x), to why it's not exact
to them, to how you can go from the prime counting function directly to
a continuous function moving from a sieve equation, to a partial
difference equation, to a partial differential equation--the three
major forms in my paper currently under review at the Annals of
Mathematics.

That research may lead to resolution of the Riemann Hypothesis.

Yet despite these facts you continue to do dumb things like call me
stupid on newsgroups.

> >Your only hope at this point is to claim confusion and start defending
> >the correct mathematical proofs now that you can see a simple and
> >obvious explanation.
> >
> >After all, with the loss of your position and tenure, you would
> >probably lose most of your retirement funds as well, and have to start
> >over with some new career with a major black mark against you.
> >
> >Why throw the dice?
> >
> >
> >James Harris
>
>
> ************************
>
> David C. Ullrich


And readers notice he didn't answer that final question. I still find
the behavior puzzling.

Other sci.math'ers are in a similar circumstance, if maybe not as
dramatic as Ullrich's, as staying quiet is not a defense.

The simple reality is that university professors are supposed to be
highly intelligent and gifted individuals who can be entrusted with the
care and growth of young minds to help ensure continued progress and
learning--crucial to the future of the world.

It simply is not possible to conclude that people who can ignore the
facts at this point could possibly be considered to be at the high
level needed to be university professors, especially when they ignore
an opportunity, given by me, for them to step forward now, and say
something as simple as, I'm sorry, I made a mistake.


James Harris

Proginoskes

unread,
Dec 11, 2006, 8:52:59 PM12/11/06
to

Dave Rosoff wrote:
> On Mon, 12 Dec 2006, Richard Tobin wrote:
>
> > In article <1165877497.9...@n67g2000cwd.googlegroups.com>,
> > Proginoskes <CCHe...@gmail.com> wrote:
> >> I just hope he doesn't find out "Proginoskes" anagrams to "Pinko
> >> Ogress", or "Pokier Songs", or "Pokes Groins" ...
> >>
> >>
> >
> > AKA
> > Sharp Chicken Mother
> > Rotherham Pinchecks
> > Pinch Thermos Hacker
> >
> > -- Richard
>
> Richard Tobin = Bird Cat Rhino

"A birth cord in" --- maybe his umbilical cord got sucked into his
navel?
"Rich or Bandit"
"Roach bird nit"
"Rid no rich bat" (Do you notice an animal theme here?)

And we know what's next, right?

((Er ...

Hmmm ...

No, that one isn't really that good ...))

OH BULLOCKS!*

Well, I can do James S. Harris:

"Jam her ass, sir."

James! Shame on you!!! 8-)

--- Christopher Heckman

* An interesting remark, since I'm not British.

jshs...@yahoo.com

unread,
Dec 11, 2006, 8:56:56 PM12/11/06
to

That's right, James. You should just say that you made a mistake
and that you are sorry.

gjed...@gmail.com

unread,
Dec 11, 2006, 9:00:24 PM12/11/06
to

Nothing, I repeat nothing, is more certain than the fact that the
Annals will not publish your paper. (Unless someone has a sick sense of
humour on April 1). Trust me, David's position is safe. I'll join Tim
in saying that I too will write you a cashier's check for $10,000 if the
Annals publish it.

Rupert

unread,
Dec 11, 2006, 9:08:14 PM12/11/06
to

jst...@msn.com wrote:
> David C. Ullrich wrote:
> > On 10 Dec 2006 11:33:00 -0800, jst...@msn.com wrote:
> >
> > >
> > >David C. Ullrich wrote:
> > >> On 8 Dec 2006 17:43:52 -0800, jst...@msn.com wrote:
> > >>
> > >> >So I had to find two proofs of the same problem, where the second proof
> > >> >had to be obvious in such a way that there was no room for anyone to
> > >> >find even the hint of an objection.
> > >>
> > >> Curious that people _found_ objections, eh?
> > >>
> > >
> > >You're betting your career on that Ullrich, so you better read what
> > >they have carefully.
> >
> > How bizarre. I could swear it was just yesterday or a few days
> > ago that you made a post where you asked us to pay no attention
> > to these ravings about Consequences.
>
> I reminded people to consider context.
>
> The second proof removes any area for confusion, so there is a new
> situation.
>

In my view, you have not done a very good job of making your latest
argument clear. I doubt that it is clear even in your own mind. I have
made one possible interpretation of this argument and refuted it.

> My hope has been that some of you would see reason rather than waiting
> for the inevitable, as if it won't happen, just because it didn't
> before now.
>
>
> > >I'm kind of puzzled by people like you and Magidin who actually have
> > >something to lose here.
> > >
> > >You should know by now that I have a paper on my prime counting
> > >function currently under review at the Annals of Mathematics.
> >
> > Yes, I know that. I've seen the "paper". They're not going to
> > publish it.
> >
>
> So you're betting your career on that? Why?
>

He's not betting his career on it, you nitwit. Although it would be
perfectly safe to do so.

> It'd be smarter to just back off from such rigid statements, like,
> notice, I haven't said they will publish my paper, though of course, I
> hope they do.
>
> Why dig your own grave here?
>
> I still find it puzzling.
>

Try paying more attention to Tim's patient explanations of how you're
the only person round here who hasn't got a clue. Maybe the light bulb
will come on eventually.

> > >I also
> > >have a NEW paper just written a few days ago with this simplified proof
> > >showing the problem with the ring of algebraic integers.
> > >
> > >And noting that you can move a factor of 2 with
> > >
> > >175x^2 - 15x + 2 = (f(x) + 2)*(g(x) + 1)
> > >
> > >is just so basic and irrefutable that now for the first time, even
> > >casual readers can easily determine that posters are wrong when they
> > >dispute this result.
> > >
> > >If you sit back and wait, assuming nothing will happen, then when it
> > >does and my results are published, then your career as a teaching
> > >professor is gone, and not even tenure will save you.
> >
> > God you're stupid. Even if you _were_ right about the math, which
> > of course is simply not so, the idea that my saying you were wrong
> > would have some adverse consequences for my job is simply hilarious.
> >
>
> But this result is a first in mathematics, a "core" error that goes all
> the way back to Dedekind,

For the millionth time, what's the error? Name an accepted result
that's false.

I *constantly* ask you this, and you *never* respond. Why?

Obviously, because you haven't got a clue what the content of accepted
number theory is.

> and impacts mathematical arguments and famous
> mathematicians over more than a hundred years.
>
> I first got published a proof of the core error three years ago,

That paper proved nothing, it was complete rubbish.

> but
> the paper was shot down by sci.math'ers bushwhacking the editors of a
> journal which coincidentally was published by Cameron University, a
> part of the Oklahoma State University system, with false emails.
>
> The journal editors after trying to de-publish only managed one more
> edition before calling it quits, so that's the death of a journal in
> your own university system from false accusations.
>
> Beyond that I have my prime counting research which explains the 'why'
> of the prime distribution from why the prime count matches as closely
> as it does with continuous functions like Li(x),

Nonsense. Your "research" on this topic yields absolutely no insight
about this matter.

> to why it's not exact
> to them, to how you can go from the prime counting function directly to
> a continuous function moving from a sieve equation, to a partial
> difference equation, to a partial differential equation--the three
> major forms in my paper currently under review at the Annals of
> Mathematics.
>
> That research may lead to resolution of the Riemann Hypothesis.
>

And I may be crowned King of Uganda.

> Yet despite these facts you continue to do dumb things like call me
> stupid on newsgroups.
>

Hmmm, bit of a mystery, eh? What could the explanation be?

> > >Your only hope at this point is to claim confusion and start defending
> > >the correct mathematical proofs now that you can see a simple and
> > >obvious explanation.
> > >
> > >After all, with the loss of your position and tenure, you would
> > >probably lose most of your retirement funds as well, and have to start
> > >over with some new career with a major black mark against you.
> > >
> > >Why throw the dice?
> > >
> > >
> > >James Harris
> >
> >
> > ************************
> >
> > David C. Ullrich
>
>
> And readers notice he didn't answer that final question. I still find
> the behavior puzzling.
>

No-one else does.

> Other sci.math'ers are in a similar circumstance, if maybe not as
> dramatic as Ullrich's, as staying quiet is not a defense.
>

You are completely and utterly mad. Get some help.

> The simple reality is that university professors are supposed to be
> highly intelligent and gifted individuals who can be entrusted with the
> care and growth of young minds to help ensure continued progress and
> learning--crucial to the future of the world.
>
> It simply is not possible to conclude that people who can ignore the
> facts at this point could possibly be considered to be at the high
> level needed to be university professors, especially when they ignore
> an opportunity, given by me, for them to step forward now, and say
> something as simple as, I'm sorry, I made a mistake.
>

Why should they in the absence of a compelling argument for believing
that they are mistaken?

>
> James Harris

Jesse F. Hughes

unread,
Dec 11, 2006, 9:08:22 PM12/11/06
to
jst...@msn.com writes:

> David C. Ullrich wrote:
>
>> Yes, I know that. I've seen the "paper". They're not going to
>> publish it.
>>
>
> So you're betting your career on that? Why?
>
> It'd be smarter to just back off from such rigid statements, like,
> notice, I haven't said they will publish my paper, though of course, I
> hope they do.
>
> Why dig your own grave here?
>
> I still find it puzzling.

But you shouldn't find it puzzling at all. He's already settled it.
He does not believe that there is a chance your paper will be
accepted, so he does not believe his action carries any risk at all.

Why are you so puzzled?

>
>> >
>> >Why throw the dice?


>> >
>
>
> And readers notice he didn't answer that final question. I still find
> the behavior puzzling.

Sure he did. He wrote "I've seen the 'paper'. They're not going to
publish it."

Hence, there is no doubt in his mind and there are no dice involved.
He is confident of the outcome.


--
Jesse F. Hughes
"If you hadn't noticed, basically every result I have destroys some
precious belief of mathematicians and they have from what I've gathered
basically gone collectively bonkers." -- James S. Harris

marcus_b

unread,
Dec 11, 2006, 11:05:40 PM12/11/06
to
jst...@msn.com wrote:
> David C. Ullrich wrote:
> > On 10 Dec 2006 11:33:00 -0800, jst...@msn.com wrote:
> >
> > >
> > >David C. Ullrich wrote:
> > >> On 8 Dec 2006 17:43:52 -0800, jst...@msn.com wrote:
> > >>
> > >> >So I had to find two proofs of the same problem, where the second proof
> > >> >had to be obvious in such a way that there was no room for anyone to
> > >> >find even the hint of an objection.
> > >>
> > >> Curious that people _found_ objections, eh?
> > >>
> > >
> > >You're betting your career on that Ullrich, so you better read what
> > >they have carefully.
> >
> > How bizarre. I could swear it was just yesterday or a few days
> > ago that you made a post where you asked us to pay no attention
> > to these ravings about Consequences.
>
> I reminded people to consider context.
>
> The second proof removes any area for confusion, so there is a new
> situation.
>

You know, I think people have been misinterpreting what you are
actually saying in this second proof.

The core of it is the equation

7*g(x) = 5*a_2(x).

You are saying that in any ring in which this equation is written,
7 must be a factor of a_2(x) because 5 and 7 are coprime.

Of course both g(x) and a_2(x) must be elements of the ring.

I think people have an overpowering urge to think you are talking
about the ring of algebraic integers, partly because a_2(x) is an
algebraic integer. But that ring is explicitly NOT what you are
talking about here. You have tried to tell people that, but they
seem not to be listening.

In fact your claim is that the ring of algebraic integers is lacking
the desirable property that 7 is a factor of a_2(x), in
that ring. That's partly what's wrong with it.

It's simply a fact. If R is a ring which contains the integers and
contains both g(x) and a_2(x), and the equation above holds, then
7 is a factor of a_2(x) in that ring. That's all you're saying.
Right?

I don't think everyone here understands this.
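
To make that concrete, here is a small sympy sketch (the notation is mine,
reusing Tim's x=1 numbers): the element that would witness 7 | a_2(1) is
3*g(1) - 2*a_2(1), and while it exists as an algebraic number, it is not an
algebraic integer -- which is exactly why the divisibility can hold in a
ring containing both g(1) and a_2(1) and still fail in the ring of
algebraic integers, where g(1) isn't an element in the first place.

    from sympy import sqrt, symbols, simplify, minimal_polynomial

    x = symbols('x')
    a2 = 3 + sqrt(-26)        # a_2(1), an algebraic integer
    g1 = 5*a2/7               # g(1), so that 7*g(1) = 5*a_2(1)

    # The would-be witness for 7 | a_2(1):
    w = 3*g1 - 2*a2
    print(simplify(7*w - a2))         # 0, i.e. a_2(1) = 7*w

    # But w is not an algebraic integer (non-monic minimal polynomial), so
    # the witness is simply unavailable inside the algebraic integers.
    print(minimal_polynomial(w, x))   # 7*x**2 - 6*x + 5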

But I think you are also saying something more. You are saying
that this result contradicts the way Galois theory is usually taught,
and it contradicts ideal theory.

I wonder if you could be more explicit about that.

What theorems of the as-usually-taught Galois theory are
contradicted?

And, what theorems of ideal theory are contradicted?

In what kinds of rings do those theorems apply?

You are going to eventually have to answer these questions if
you are going to convince anybody that you are right. You have
to be very precise in math and say everything explicitly. It's not
like, say, politics, where you can wave your hands and imply and
have people fill in the gaps for themselves.


> My hope has been that some of you would see reason rather than waiting
> for the inevitable, as if it won't happen, just because it didn't
> before now.
>

[snip]

>
> But this result is a first in mathematics, a "core" error that goes all
> the way back to Dedekind, and impacts mathematical arguments and famous
> mathematicians over more than a hundred years.
>

[snip]

You need particularly to say here exactly what results of
Dedekind are cast into doubt, or shown to be false, by your
reasoning. No one is ever going to believe you if you keep
leaving this undefined.

Marcus.

Tim Peters

unread,
Dec 12, 2006, 12:21:08 AM12/12/06
to
...

[jst...@msn.com]


>>> I'm kind of puzzled by people like you and Magidin who actually have
>>> something to lose here.
>>>
>>> You should know by now that I have a paper on my prime counting
>>> function currently under review at the Annals of Mathematics.

[David C. Ullrich]


>> Yes, I know that. I've seen the "paper". They're not going to
>> publish it.

[jst...@msn.com]


> So you're betting your career on that? Why?

Try applying modern problem-solving techniques? Person A writes a
mathematical paper. Person B reads that paper, and states that it's certain
it will never be published by a competent mathematical journal. Now write
down all the reasons you can think of for why that /might/ be. Brainstorm!
Forget that you and David are involved. It's irrelevant at this stage. If
it helps, picture me as the one writing the paper (A), and you as the one
stating the paper will never be published (B).

Dang, I gave you the chance, but you didn't do it. Too bad!

For those who did play along, add some more facts to the mix, and cross off
the ideas you came up with that don't match: person B is a math Ph.D., is a
tenured professor of mathematics, and is widely respected on public
newsgroups for the depth and breadth of his mathematical knowledge. Person
A is not a mathematician, has no relevant training in the topic of his
paper, has repeatedly demonstrated gross incompetence in both making and
following mathematical argument, and in particular has repeatedly done so
across years wrt the specific topic of his paper.

Yet, despite all that, B acts /exactly as if/ A's paper is utterly without
merit. How can that be?

Tell you what, James. At /this/ point treat the question as if it were on
an IQ test. You'd have trouble scoring an IQ of 80 if you missed questions
this easy. Is it because:

a. B is engaged in a conspiracy to suppress A's genius.
b. In B's professional judgment, A's paper is without merit.
c. Elephants from the Rings of Saturn are controlling B's mind via
injecting mind-control drugs in circus peanuts teleported into
B's stomach while B sleeps.
d. Liar.

> It'd be smarter to just back off from such rigid statements, like,
> notice, I haven't said they will publish my paper, though of course, I
> hope they do.

No, chucklehead. While he's under no legal obligation to do so, he may well
feel a /moral/ obligation to speak out on matters in his areas of expertise.
One of the purposes of tenure is precisely this: to reduce the risk that
speaking the truth as he sees it will be punished by loss of employment.
Your threats are empty.

David is far more knowledgeable in mathematics than I am, but even I can see
that your paper is utterly without merit. Many here can, and have said so
plainly, and explained why to you. I don't think it's worth David's time to
give you similar explanations, and it may be harder for him because he'd
have to "dumb down" his knowledge so far to have any chance of being
understood by you.

> Why dig your own grave here?

He didn't. And no matter how much you would love to dig it for him, you
don't have a shovel that works.

> I still find it puzzling.

Then there's something seriously wrong with your ability to think.
Possibilities include:

http://mathforum.org/kb/message.jspa?messageID=444857&tstart=0

http://en.wikipedia.org/wiki/Crank_%28person%29

> ...

