Possible proof of Gabriel's Theorem?


Jason

unread,
Mar 1, 2005, 6:22:38 PM3/1/05
to
Proof of Gabriel's Theorem:

What if we simplify gabriel's proof by keeping the ens fixed and using
s in the definition:

i.e. we let n be fixed for each of f'(x), f'(x+w/n), f'(x+2w/n), etc.
In this case, the following are true:


f(x + w/s) - f(x)
f'(x) = Lim ------------------
s->Infinity w/s


f'(x+ w/n) =


f(x + w/n + w/s) - f(x + w/n)
Lim ------------------------------
s->Infinity w/s


f'(x+ 2w/n) =


f(x + 2w/n + w/s) - f(x + 2w/n)
Lim --------------------------------
s->Infinity w/s


f'(x+ 3w/n) =


f(x + 3w/n + w/s) - f(x + 3w/n)
Lim --------------------------------
s->Infinity w/s


and so on:


However, since both n and s tend to infinity, the limit is taken
once over the entire sum. With n and s coinciding, we get
gabriel's proof.

This seems to work, but is this last step legal? If not, why not?
After all, we are ultimately trying to find the limit of the sum as
n approaches infinity.

Jason Wells
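A small numerical sketch may help fix what is actually being computed above.
The choices f = sin, x = 0.3 and w = 1.0 in the Python below are purely
illustrative and not part of Gabriel's statement; the code compares the
averaged difference quotients with the inner variable s kept independent of
n, the same average with s set equal to n, and the secant slope
(f(x+w)-f(x))/w.

import math

f = math.sin                       # illustrative choice of f
x, w = 0.3, 1.0                    # illustrative point and width

def avg_quotients(n, s):
    # (1/n) * sum over k of [f(x + k*w/n + w/s) - f(x + k*w/n)] / (w/s):
    # the average of the n difference quotients before any limit is taken.
    h = w / s
    return sum((f(x + k*w/n + h) - f(x + k*w/n)) / h for k in range(n)) / n

secant = (f(x + w) - f(x)) / w     # the value the argument aims at

for n in (10, 100, 1000):
    print(n, avg_quotients(n, 10**6), avg_quotients(n, n), secant)

For this smooth f both columns drift toward the secant slope as n grows, but
such numerical agreement is only evidence; it does not by itself justify the
limit interchange being asked about.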

denis feldmann

unread,
Mar 2, 2005, 1:29:56 AM3/2/05
to
Jason wrote:

> Proof of Gabriel's Theorem:
>
> What if we simplify gabriel's proof by keeping the ens fixed and using
> s in the definition:
>
> i.e. we let n be fixed for each of f'(x), f'(x+w/n), f'(x+2w/n), etc.
> In this case, the following are true:
>
>
> f(x + w/s) - f(x)
> f'(x) = Lim ------------------
> s->Infinity w/s
>
>
> f'(x+ w/n) =
>
>
> f(x + w/n + w/s) - f(x + w/n)
> Lim ------------------------------
> s->Infinity w/s
>
>
> f'(x+ 2w/n) =
>
>
> f(x + 2w/n + w/s) - f(x + 2w/n)
> Lim --------------------------------
> s->Infinity w/s
>
>
> f'(x+ 3w/n) =
>
>
> f(x + 3w/n + w/s) - f(x + 3w/n)
> Lim --------------------------------
> s->Infinity w/s
>
>
> and so on:
>
>

As long as you are willing to copy this at length (and twice)...


> However, since both n and s tend to infinity, the limit is taken
> once over the entire sum. With n and s coinciding, we get
> gabriel's proof.

Why don't you take the trouble to write out the following argument in
full, so that those of us who didn't follow the discussion have the
slightest idea of what you are talking about? It could also give you a
written-out correct proof (or perhaps a hint of what is still missing).

OK, I'll try to guess: you sum the equalities above, getting something
like

sum(f'(x+kw/n), k=0..m) = lim(s->+oo) s(f(x+mw/n+w/s)-f(x))/w

(except this is wrong, of course: there is no cancellation of anything,
and this last limit is infinite...)

But let's pray for a miraculous cancellation and go on


Now what? Put m=n? =>

sum(f'(x+kw/n), k=0..n) = lim(s->+oo) s(f(x+w(1+1/s))-f(x))/w = lim
(t->0) (1/w)*

Or something deeper?

Oh, I see, next you take the limit as n->oo. Why?

Anyway, we get

lim (n->oo) sum(f'(x+kw/n), k=0..n) = lim(s->+oo) s(f(x+w(1+1/s))-f(x))/w

At this stage, obviously, something went badly wrong...
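To see concretely why the claimed telescoping fails, here is a sketch in
which (for illustration only) f = sin, so that f' = cos is available
exactly, with x = 0.3, w = 1.0 and n = m = 10. The left side of the claimed
identity is a fixed finite number, while the right side grows without bound
as s increases.

import math

f, fprime = math.sin, math.cos     # illustrative f with known derivative
x, w, n, m = 0.3, 1.0, 10, 10      # illustrative values

# Left side: the sum of the derivatives, a fixed finite number.
lhs = sum(fprime(x + k*w/n) for k in range(m + 1))

# Right side: s*(f(x + m*w/n + w/s) - f(x))/w, which diverges as s grows.
for s in (10, 100, 1000, 10000):
    rhs = s * (f(x + m*w/n + w/s) - f(x)) / w
    print(s, lhs, rhs)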

William Elliot

unread,
Mar 2, 2005, 5:18:38 AM3/2/05
to
On Tue, 1 Mar 2005, Jason wrote:

> Proof of Gabriel's Theorem:
>
> What if we simplify gabriel's proof by keeping the ens fixed and using
> s in the definition:
>
> i.e. we let n be fixed for each of f'(x), f'(x+w/n), f'(x+2w/n), etc.
> In this case, the following are true:
>
>
> f(x + w/s) - f(x)
> f'(x) = Lim ------------------
> s->Infinity w/s
>
>

How is this so, for when 0 < w it's
lim(h->0+) (f(x+h) - f(x))/h
which is not
lim(h->0) (f(x+h) - f(x))/h

As for example f(x) = |x|
right hand f'(0) = 1 = lim(s->oo) |1/s|/(1/s)
left hand f'(0) = -1 = lim(s->oo) |-1/s|/(-1/s)
f'(0) undefined
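The point is easy to check numerically: with f(x) = |x| at x = 0 and w held
fixed, the increment w/s shrinks but never changes sign, so the quotient
only ever probes one side of 0. The particular values of w and s below are
illustrative.

f = abs                            # the example f(x) = |x|
x = 0.0

for w in (1.0, -1.0):              # w > 0 probes the right, w < 0 the left
    for s in (10, 1000, 100000):
        h = w / s                  # h -> 0 but keeps the sign of w
        print(w, s, (f(x + h) - f(x)) / h)

Every line prints +1 for w = 1 and -1 for w = -1, the two one-sided
derivatives; the two-sided f'(0) does not exist.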

David C. Ullrich

unread,
Mar 2, 2005, 7:34:26 AM3/2/05
to

_What_ last step?

>If not, why not,
> because ultimately we are trying to find the limit of the sum as
> n approaches infinity?
>
> Jason Wells


************************

David C. Ullrich

denis feldmann

unread,
Mar 2, 2005, 7:57:32 AM3/2/05
to
denis feldmann wrote:

Answering my own post, I took the trouble of looking for Gabriel's
theorem. Mostly, the main result looks like
lim (n->oo) (w/n) sum(f'(x+kw/n), k=0..n) = f(x+w)-f(x), which is not
hard to prove using Riemann sums. What I believe is the main point this
absurdly long thread (or threads) tries to make is to validate the
(JSH-like) incredible pretension of Gabriel, who tries to prove this
result by, let's say, somewhat illegal means, *and* argues at the same
time that his approach is the only right way to analysis, everybody
else (including Newton, Cauchy and Riemann) being dumb fools unable to
see the stupid mistakes they made when "proving" elementary Calculus
results. As for "Jason", I don't know. I would suspect he is Gabriel in
disguise (or perhaps his smarter brother, Mycroft)...

Jason

unread,
Mar 2, 2005, 10:15:30 AM3/2/05
to
> > However, since both n and s tend to infinity, the limit is taken
> > once over the entire sum. With n and s coinciding, we get
> > gabriel's proof.
> >
> > This seems to work but is this last step legal?

> _What_ last step?

David,

The last step is that, since n coincides with s and both limits are
taken as the variable tends to infinity, the inner limit falls away and
is replaced by the outer limit.

What do you think?

Jason

Jason

unread,
Mar 2, 2005, 10:31:45 AM3/2/05
to
Denis,

The original post included everything. It then became unmanageable as
cyber thugs kept posting garbage to the thread.

You can find the details at: http://www.geocities.com/john_gabriel

In response to your assumptions about someone trying to discredit
newton, cauchy and all the others: I don't see how you can say this,
because even gabriel acknowledges newton's work. The above result is
not something you can prove using riemann sums. Please be kind enough
to prove the result in full using riemann sums as you claim so that we
can all see.
Have never seen gabriel's theorem stated before, nor a connection
between f(x+w)-f(x) and the integral in the way gabriel
shows it. If I can prove it, then I intend to use it as a teaching aid
because it requires nothing else, i.e. no knowledge of
real analysis or deep properties of the real number system.

Jason

Jason

unread,
Mar 2, 2005, 10:35:38 AM3/2/05
to
Which is quite correct. I believe this agrees with the classic
definition. f'(0) is undefined for f(x) =abs(x)

denis feldmann

unread,
Mar 2, 2005, 5:02:14 PM3/2/05
to
Jason wrote:

> Denis,
>
> The original post included everything. It then became unmanageable as
> cyber thugs kept posting garbage to the thread.
>
> You can find the details at: http://www.geocities.com/john_gabriel
>
> In response to your assumptions about someone trying to discredit
> newton, cauchy and all the others: I don't see how you can say this,
> because even gabriel acknowledges newton's work.

By saying poor Newton was driven to illegal contortions and
infinitesimals, as he could not see the truth...

> The above result is
> not something you can prove using riemann sums. Please be kind enough
> to prove the result in full using riemann sums as you claimso that we
> can all see.


For the interval [x,x+w], the function t->f'(t) is Riemann integrable;
the associated Riemann sum is (w/n) sum(f'(x+kw/n), k=0..n-1), with
limit (n->+oo) = integral(f'(t), t=x..x+w) = f(x+w)-f(x).

Done.
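For readers who want to watch this convergence, a sketch with an
illustrative f whose derivative is continuous (f = sin, f' = cos, x = 0.3,
w = 1.0; these choices are not part of the argument):

import math

f, fprime = math.sin, math.cos     # illustrative f with continuous f'
x, w = 0.3, 1.0                    # illustrative interval [x, x+w]

exact = f(x + w) - f(x)            # the value given by the FTC
for n in (10, 100, 1000, 10000):
    riemann = (w / n) * sum(fprime(x + k*w/n) for k in range(n))
    print(n, riemann, exact)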

As this is really trivial, I conclude you are a troll. Please be
kind enough to prove otherwise.

> Have never seen gabriel's theorem stated before, nor a connection
> between f(x+w)-f(x) and the integral in the way gabriel
> shows it. If I can prove it, then I intend to use it as a teaching aid
> because it requires nothing else, i.e. no knowledge of
> real analysis or deep properties of the real number system.

Ha! How do you prove it *without* any definition of anything, like
derivative or Riemann sum?


>
> Jason
>

David C. Ullrich

unread,
Mar 2, 2005, 6:38:40 PM3/2/05
to

I think two things:

First, I think that you think I know what you're talking
about. There are no inner and outer limits in any of the
posts in this thread, nor any sums visible - if you think
people are going to try to make sense of your corrections
to the incoherent things we can find elsewhere you're wrong -
you should post the entire proof in a coherent form.

Also, I _know_ that in general interchanging limits is
the hard part when you're proving almost anything in
analysis - if you just assume that that works you're
usually sweeping the entire proof under the rug.

>Jason


************************

David C. Ullrich

Larry Hammick

unread,
Mar 3, 2005, 9:59:44 AM3/3/05
to
"Jason"

> Proof of Gabriel's Theorem:
Still putting lipstick on this pig? :)


Jason

unread,
Mar 3, 2005, 12:14:08 PM3/3/05
to

Yes, I thought you had looked at gabriel's stuff so I assumed you knew
what I was talking about.

I will post an entire proof which I think might be correct in the next
few days.

Jason

unread,
Mar 3, 2005, 1:43:57 PM3/3/05
to
Here is my attempt to prove Gabriel's Theorem:

Have changed some of his wording and am calling it
the AFD rather than ATG as he calls it.

Given an interval [x;x+w] subdivided into n equal parts
and a function f which is continuous on [x;x+w] and
differentiable on [x;x+w), the secant gradient or mean
value of f is equal to the average first derivative of f
[or AFD(f)] defined as follows:

w is the width of the interval
n is the number of partitions

f(x + w/n) - f(x)
AFD(f) = ------------------ --- L.H.S
w/n

1 n-1 ws
AFD(f) = Lim - SIGMA f'(x+ --- ) --- R.H.S
n->oo n s=0 n


Proof:

Start with RHS.


1
AFD(f) = Lim - [ f'(x) + f'(x+w/n) + f'(x+2w/n) + ...
n->oo n

f'(x+(n-2)w/n) + f'(x+(n-1)w/n)]


Now we simplify Gabriel's proof by keeping the ens fixed and using
t in the definition:


i.e. we let n be fixed for each of f'(x), f'(x+w/n), f'(x+2w/n), etc.
In this case, the following are true:


f(x+w/t)-f(x)
f'(x) = Lim -------------
t->oo w/t


f'(x+w/n) =


f(x+w/n+w/t)-f(x+w/n)
Lim ---------------------
t->oo w/t


f'(x+2w/n) =


f(x+2w/n+w/t)-f(x+2w/n)
Lim -----------------------
t->oo w/t


f'(x+ 3w/n) =


f(x+3w/n+w/t)-f(x+3w/n)
Lim -----------------------
t->oo w/t


and so on:


Now,
1 f(x+w/t)-f(x)
AFD(f) = Lim - [ Lim ------------- +
n->oo n t->oo w/t

f(x+w/n+w/t)-f(x+w/n)
Lim --------------------- +
t->oo w/t


f(x+2w/n+w/t)-f(x+2w/n)
Lim ----------------------- +
t->oo w/t


f(x+3w/n+w/t)-f(x+3w/n)
Lim ----------------------- +
t->oo w/t

+ ...

f(x+(n-2)w/n+w/t)-f(x+(n-2)w/n)
Lim ------------------------------- +
t->oo w/t


f(x+(n-1)w/n+w/t)-f(x+(n-1)w/n)
Lim ------------------------------- ]
t->oo w/t


Looking at the above we have two limits to infinity.
i.e. n and t. However, since both n and t tend to infinity,
the outer limit is taken once over the entire sum.
With n and t coinciding, we have:


1 f(x+w/n)-f(x)
AFD(f) = Lim - [ Lim ------------- +
n->oo n n->oo w/n

f(x+2w/n)-f(x+w/n)
Lim ------------------ +
n->oo w/n


f(x+3w/n)-f(x+2w/n)
Lim ------------------- +
n->oo w/n


f(x+4w/n)-f(x+3w/n)
Lim ------------------- +
n->oo w/n
+ ...

f(x+(n-1)w/n)-f(x+(n-2)w/n)
Lim --------------------------- +
n->oo w/n


f(x+w)-f(x+(n-1)w/n)
Lim -------------------- ]
n->oo w/n

Which leads to (**) :

1 f(x+w/n)-f(x)
AFD(f) = Lim - [ ------------- +
n->oo n w/n

f(x+2w/n)-f(x+w/n)
------------------ +
w/n


f(x+3w/n)-f(x+2w/n)
------------------- +
w/n


f(x+4w/n)-f(x+3w/n)
------------------- +
w/n
+ ...

f(x+(n-1)w/n)-f(x+(n-2)w/n)
---------------------------
w/n


f(x+w)-f(x+(n-1)w/n)
-------------------- ]
w/n


Some more simplification:

1 f(x+w)-f(x)
AFD(f) = Lim - [ ----------- ]
n->oo n w/n


Then,

1 n f(x+w)-f(x)
AFD(f) = Lim -. [ ----------- ]
n->oo n w


So,

f(x+w)-f(x)
AFD(f) = Lim [ ----------- ]
n->oo w

And finally,

f(x+w)-f(x)
AFD(f) = -----------
w

which is the desired result.

(**) Dropping the inner limit is what I am not sure about.

Jason Wells
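Step (**) can at least be probed numerically. The sketch below (with the
illustrative choices f = sin, x = 0.3, w = 1.0) compares, for each n, the
average that keeps the inner limit, i.e. uses the true f', with the average
obtained after the inner limit has been dropped; the latter telescopes to
(f(x+w)-f(x))/w exactly, and the gap between the two columns is precisely
what a rigorous proof has to control.

import math

f, fprime = math.sin, math.cos     # illustrative f with continuous f'
x, w = 0.3, 1.0                    # illustrative interval

for n in (10, 100, 1000):
    with_inner_limit = sum(fprime(x + k*w/n) for k in range(n)) / n
    inner_limit_dropped = sum((f(x + (k+1)*w/n) - f(x + k*w/n)) / (w/n)
                              for k in range(n)) / n   # telescopes exactly
    print(n, with_inner_limit, inner_limit_dropped,
          inner_limit_dropped - with_inner_limit)

For this f the gap shrinks as n grows, which is consistent with the claim
when f' is continuous, but the numbers do not by themselves license dropping
the inner limit.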

denis feldmann

unread,
Mar 4, 2005, 1:55:17 AM3/4/05
to
Jason wrote:

> Here is my attempt to prove Gabriel's Theorem:
>
> Have changed some of his wording and am calling it
> the AFD rather than ATG as he calls it.
>
> Given an interval [x;x+w] subdivided into n equal parts
> and a function f which is continuous on [x;x+w] and
> differentiable on [x;x+w), the secant gradient or mean
> value of f is equal to the average first derivative of f
> [or AFD(f)] defined as follows:
>
> w is the width of the interval
> n is the number of partitions
>
> f(x + w/n) - f(x)
> AFD(f) = ------------------ --- L.H.S
> w/n


Wrong LHS; as said below, you meant in fact

f(x + w) - f(x)
AFD(f) = ------------------
w

>
> 1 n-1 ws
> AFD(f) = Lim - SIGMA f'(x+ --- ) --- R.H.S
> n->oo n s=0 n
>
>
> Proof:
>
> Start with RHS.
>
>
> 1
> AFD(f) = Lim - [ f'(x) + f'(x+w/n) + f'(x+2w/n) + ...
> n->oo n
>
> f'(x+(n-2)w/n) + f'(x+(n-1)w/n)]
>
>
> Now we simplify Gabriel's proof by keeping the ens fixed and using
> t in the definition:
> i.e. we let n be fixed for each of f'(x), f'(x+w/n), f'(x+2w/n), etc.
> In this case, the following are true:
>
>
> f(x+w/t)-f(x)

> f'(x) = Lim --------------
> t->oo w/t
>
>

This works only *on the right* of x

This is illegal (but works in this case for some other reason, as the
final result is correct):

Compare with lim x->0 lim y->0 (x-y)/(x+y) = 1 and lim y->0 lim x->0
(x-y)/(x+y) = -1; you cannot just set x=y in the inner limit, can you?

It's even worse: in your notation, n has disappeared when taking the
inner limit, so you certainly cannot drop it.
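The (x-y)/(x+y) example is worth evaluating once, since it is the standard
warning against interchanging limits. A tiny sketch; the specific small
numbers are arbitrary stand-ins for "close to 0".

def g(x, y):
    # The example above: the two iterated limits of (x - y)/(x + y) differ.
    return (x - y) / (x + y)

print("y -> 0 first, then x -> 0:", g(1e-3, 1e-9))   # close to +1
print("x -> 0 first, then y -> 0:", g(1e-9, 1e-3))   # close to -1
print("along the diagonal x = y :", g(1e-3, 1e-3))   # identically 0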

>
> Jason Wells
>

David C. Ullrich

unread,
Mar 4, 2005, 7:08:28 AM3/4/05
to
On 3 Mar 2005 10:43:57 -0800, "Jason" <loga...@yahoo.com> wrote:

>Here is my attempt to prove Gabriel's Theorem:
>
>Have changed some of his wording and am calling it
>the AFD rather than ATG as he calls it.
>
>Given an interval [x;x+w] subdivided into n equal parts
>and a function f which is continuous on [x;x+w] and
>differentiable on [x;x+w), the secant gradient or mean
>value of f is equal to the average first derivative of f

Exactly what do you mean by "derivative" here? You've
given several mutually inconsistent definitions at
various times in these threads.

For now I'll just assume that you mean the same thing
as everyone else by "derivative".

>[or AFD(f)] defined as follows:
>
>w is the width of the interval
>n is the number of partitions
>
> f(x + w/n) - f(x)
> AFD(f) = ------------------ --- L.H.S
> w/n

As Denis said, surely you meant (f(x+w) - f(x))/w here.

> 1 n-1 ws
> AFD(f) = Lim - SIGMA f'(x+ --- ) --- R.H.S
> n->oo n s=0 n

As has been pointed out many times, _if_ you assume in
addition that f' is continuous (and assuming that the
correction above is what you really meant) then the
result is a trivial consequence of the fundamental
theorem of calculus. If you assume only that f is
differentiable then I actually doubt that the theorem
is true, although I don't have a counterexample handy.
Let's see about the proof:

This step is simply wrong. For example,

f(x+2w/n)-f(x+w/n)
Lim ------------------ +
n->oo w/n

is not f'(x + w/n), it is actually f'(x) (at least
if f is continuously differentiable.)

You can't just drop the inner limit that way - as I said
yesterday, interchanging limits is the hard part of most
theorems in analysis, just assuming it works with no
justification is a way to prove anything, including
things that are false.
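The quotient singled out here can be watched directly. With the illustrative
choices f = sin, x = 0.3, w = 1.0, it settles on f'(x) = cos(0.3) as n
grows, rather than on anything one could meaningfully call f'(x + w/n).

import math

f, fprime = math.sin, math.cos     # illustrative f with known derivative
x, w = 0.3, 1.0

for n in (10, 100, 1000, 10000):
    q = (f(x + 2*w/n) - f(x + w/n)) / (w / n)
    print(n, q)

print("f'(x) =", fprime(x))        # the value the quotient approaches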

Jason

unread,
Mar 4, 2005, 9:55:33 AM3/4/05
to
No. By derivative, I mean:

f(x + w/n) - f(x)

f'(x) = Lim ------------------

n->oo w/n

w is the width of the interval
n is the number of partitions

The above definition is *correct* and it produces the same end results
as the classic definition.

> > 1 n-1 ws
> > AFD(f) = Lim - SIGMA f'(x+ --- ) --- R.H.S
> > n->oo n s=0 n
>
> As has been pointed out many times, _if_ you assume in
> addition that f' is continuous (and assuming that the
> correction above is what you really meant) then the
> result is a trivial consequence of the fundamental
> theorem of calculus. If you assume only that f is
> differentiable then I actually doubt that the theorem
> is true, although I don't have a counterexample handy.

Did you even bother reading my proof David? I think I stated it is
continuous everywhere and differentiable everywhere except possibly
at x+w. And once again David, please show me how this is the same as
the fundamental theorem of calculus. Thus far you have only been able
to haw-hem. Show me where you have seen this result before. Can you
do what gabriel has done with the classic definition? If yes, please
show me!

> Let's see about the proof:

> This step is simply wrong. For example,
>
> f(x+2w/n)-f(x+w/n)
> Lim ------------------ +
> n->oo w/n
>
> is not f'(x + w/n), it is actually f'(x) (at least
> if f is continuously differentiable.)

Now you are beginning to get it David, but you are not quite there I
think. I am not saying that the above result is equal to f'(x + w/n).
It is not actually f'(x) either after we equate both limits.
This is part of the result of the Average Sum Theorem. The above result
is obtained *only* once we let n and t run through to infinity.

> You can't just drop the inner limit that way - as I said
> yesterday, interchanging limits is the hard part of most
> theorems in analysis, just assuming it works with no
> justification is a way to prove anything, including
> things that are false.

This part seems rather straightforward: if we consider that both limits
are taken as n and t approach infinity, we can drop the inner limit.

Jason

unread,
Mar 4, 2005, 9:57:42 AM3/4/05
to
No, I meant what you see. Read my response to Ullrich please.

BTW: You are incorrect in saying that it *only works from the
right-hand side*.

denis feldmann

unread,
Mar 4, 2005, 11:45:32 AM3/4/05
to
Jason wrote:

> No. By derivative, I mean:
>
> f(x + w/n) - f(x)
> f'(x) = Lim ------------------
> n->oo w/n
>
> w is the width of the interval
> n is the number of partitions
>
> The above definition is *correct* and it produces the same end results
> as the clasic definition.

Does it? What does it say for f(x)=|x| (at 0)? Or for f(x)=exp(-1/x)?


>
>
>>> 1 n-1 ws
>>>AFD(f) = Lim - SIGMA f'(x+ --- ) --- R.H.S
>>> n->oo n s=0 n
>>
>>As has been pointed out many times, _if_ you assume in
>>addition that f' is continuous (and assuming that the
>>correction above is what you really meant) then the
>>result is a trivial consequence of the fundamental
>>theorem of calculus. If you assume only that f is
>>differentiable then I actually doubt that the theorem
>>is true, although I don't have a counterexample handy.
>
>
> Did you even bother reading my proof David? I think I stated it is
> continuous everywhere and differentiable everywhere except possibly
> at x+w.

You did. So what? Did you bother to read David's answer? (Hint: f'
continuous is stronger than f differentiable)


> And once again David, please show me how this is the same as
> the fundametal theorem of calculus.

I did. I note you never answered that. For your information, here it is
again: by the fundamental theorem, f(x+w)-f(x) = integral(f'(t)dt,
t=x..x+w). Then we use the definition of the integral, via Riemann sums
with fixed step w/n, and that's it.

> Thus far you have only been able
> to haw-hem. Show me where you have seen this result before. Can you
> do what gabriel has done with the classic definition? If yes, please
> show me!
>
>
>>Let's see about the proof:
>>This step is simply wrong. For example,
>
>> f(x+2w/n)-f(x+w/n)
>> Lim ------------------ +
>> n->oo w/n
>>
>>is not f'(x + w/n), it is actually f'(x) (at least
>>if f is continuously differentiable.)
>
>
> Now you are beginning to get it David,


How condescending of you. The point is if you write

>> f(x+2w/n)-f(x+w/n)
>> Lim ------------------ = f'(x+2w/n),
>> n->oo w/n

you make an absurd mistake. If you wait until the whole double limit is
taken, you make an interversion of limits which is absolutely illegal


> but you are not quite there I
> think. I am not saying that the above result is equal to f'(x + w/n).
> It is not actually f'(x) either after we equate both limits.
> This is part of the result of the Average Sum Theorem. The above result
> is obtained *only* once we let n and t run through to infinity.
>
>
>>You can't just drop the inner limit that way - as I said
>>yesterday, interchanging limits is the hard part of most
>>theorems in analysis, just assuming it works with no
>>justification is a way to prove anything, including
>>things that are false.
>
>
> This part seems rather straight forward: if we consider that both
> limits
> are taken as n and t approaches infinity, we can drop the inner limit.

If you say so. But then, why ask us? I told you yesterday that what you
were writing actually made no sense (mismatch of free and bound
variables), let alone being allowed...


>

denis feldmann

unread,
Mar 4, 2005, 11:46:43 AM3/4/05
to
Jason wrote:

> No, I meant what you see. Read my response to Ullrich please.
>
> BTW: You are incorrect in saying that it *only works from right hand
> side*.
>

Sure. Just try it for f(x)=|x| (Hint: w/n -> 0 *and stays > 0*)

Jason

unread,
Mar 4, 2005, 12:33:34 PM3/4/05
to
Denis,

> Sure. Just try it for f(x)=|x| (Hint : w/n ->0 *and stays >0*)

Okay, let x = 0 and w = 1:


|0 + 1/n| - |0| 1/n
----------------- = ---- = 1
1/n 1/n

Now let w = -1;

|0 - 1/n| - |0| 1/n
----------------- = ---- = -1
- 1/n -1/n

So you have exactly the same result as you would for the classic
definition:


|0 + 1| - |0|
-------------- = 1
1

|0 - 1| - |0|
-------------- = -1
-1

See, in this particular function it does not even matter if w goes to
zero or not.

Jason Wells

denis feldmann

unread,
Mar 4, 2005, 12:53:22 PM3/4/05
to
Jason wrote:

Oh, I see. Where on earth was w->0 included in your "definition"? So you
get f'(x) = lim (w->0) lim (n->+oo) (f(x+w/n)-f(x))/(w/n), right?

But if you include limits towards 0, what's wrong with the much simpler
(and usual) f'(x) = lim (w->0) (f(x+w)-f(x))/w, may I ask?

Jason

unread,
Mar 4, 2005, 12:57:38 PM3/4/05
to
> Does it? What does it says for f(x)=|x| (at 0)? Or for f(x)=exp(-1/x)?

Of course it does. Check it out!

> You did. So what? Did you bother to read David's answer? (Hint : f'
> continuous is stronger than f differentiable)

Absolute rubbish! This is not about one being stronger than the other.
It is not required that
f be differentiable at x+w. Yes, it is different to David's statement.
(Hint: Read gabriel's theorem)

> I did. I note you never answered that. For your information, here it is
> again : by the fundamental theorem, f(x+w)-f(x)= integral(f'(t)dt ,
> t=x..x+w) Then we use the definition of integral, by riemann sum with

> fixed step w/n, and that's it.

You did nothing of the sort! This is the original result which says
nothing about the *average tangent* or
*average derivative* - did you get this? Furthermore, it is not a
riemann sum because a riemann sum is
finite. Neither is it a riemann integral! So you have not shown me
anything at all.

> How condescending of you. The point is if you write
>> f(x+2w/n)-f(x+w/n)
>> Lim ------------------ = f'(x+2w/n),
>> n->oo w/n
> you make an absurd mistake. If you wait until the whole double limit is
> taken, you make an interversion of limits which is absolutely illegal


I'll ignore the condescending comment. I do not write that. This is
what David wrote and *misunderstood* just as he has
been misunderstanding almost everything else from the beginning.

So gabriel makes an absurd mistake here - why?

> If you wait until the whole double limit is taken, you make an
> interversion of limits which is absolutely illegal

Oh really, this makes no sense to me whatsoever. What are you saying
exactly? What on earth
is an *interversion*? Is this a French word because it does not exist
in the English language?

Jason Wells

Jason

unread,
Mar 4, 2005, 1:02:37 PM3/4/05
to
> Oh, I see. Where on earth was w->0 included in your "definition"? So
> you
> get f'(x)= lim (w->0) lim (n->+oo) f(x+w/n)-f(x))/(w/n) , right?
> But if you include limits towards 0, what's wrong with the much
> simpler
> (and usual) f'(x)=lim (w->0) lf(x+w)-f(x))/w ; may I ask?

w->0 is not included in my definition. It's part of the classical
definition. And no, I don't get what you stated:

f'(x)= lim (w->0) lim (n->+oo) f(x+w/n)-f(x))/(w/n)

This looks like absolute nonsense. The fact of the matter is that you
are making posts without carefully thinking
about what is written. How about you just slow down a bit and try to
see what I am saying, instead of impulsively posting new comments?

David C. Ullrich

unread,
Mar 5, 2005, 7:26:09 AM3/5/05
to
On 4 Mar 2005 06:55:33 -0800, "Jason" <loga...@yahoo.com> wrote:

>No. By derivative, I mean:
>
> f(x + w/n) - f(x)
> f'(x) = Lim ------------------
> n->oo w/n
>
>w is the width of the interval
>n is the number of partitions
>
>The above definition is *correct* and it produces the same end results
>as the clasic definition.

No, it defines a function of two variables x and w.

>> > 1 n-1 ws
>> > AFD(f) = Lim - SIGMA f'(x+ --- ) --- R.H.S
>> > n->oo n s=0 n
>>
>> As has been pointed out many times, _if_ you assume in
>> addition that f' is continuous (and assuming that the
>> correction above is what you really meant) then the
>> result is a trivial consequence of the fundamental
>> theorem of calculus. If you assume only that f is
>> differentiable then I actually doubt that the theorem
>> is true, although I don't have a counterexample handy.
>
>Did you even bother reading my proof David? I think I stated it is
>continuous everywhere and differentiable everywhere except possibly
>at x+w.

You stated this:

>Given an interval [x;x+w] subdivided into n equal parts
>and a function f which is continuous on [x;x+w] and
>differentiable on [x;x+w), the secant gradient or mean
>value of f is equal to the average first derivative of f

That does not state that f' is continuous.

>And once again David, please show me how this is the same as
>the fundametal theorem of calculus.

I didn't say it was the same, I said it was a trivial
consequence of ftc. It is, for exactly the reason various
people have explained to you: that sum is a Riemann sum
for the integral of f'.

>Thus far you have only been able
>to haw-hem. Show me where you have seen this result before. Can you
>do what gabriel has done with the classic definition? If yes, please
>show me!
>
>> Let's see about the proof:
>> This step is simply wrong. For example,
>>
>> f(x+2w/n)-f(x+w/n)
>> Lim ------------------ +
>> n->oo w/n
>>
>> is not f'(x + w/n), it is actually f'(x) (at least
>> if f is continuously differentiable.)
>
>Now you are beginning to get it David, but you are not quite there I
>think. I am not saying that the above result is equal to f'(x + w/n).
>It is not actually f'(x) either after we equate both limits.
>This is part of the result of the Average Sum Theorem. The above result
>is obtained *only* once we let n and t run through to infinity.
>
>> You can't just drop the inner limit that way - as I said
>> yesterday, interchanging limits is the hard part of most
>> theorems in analysis, just assuming it works with no
>> justification is a way to prove anything, including
>> things that are false.
>
>This part seems rather straight forward: if we consider that both
>limits
>are taken as n and t approaches infinity, we can drop the inner limit.

You have no idea what you're talking about.


************************

David C. Ullrich

Jason

unread,
Mar 5, 2005, 11:12:53 AM3/5/05
to
> No, it defines a function of two variables x and w.

Once again: No David! Both you and Yan got this wrong. It is a function
of *one* variable, i.e. x; w, which is the width, is *constant*. The only
thing that's changing is w/n, not w. If this is a function of two
variables, then so is the classic definition. How is it a function of
two variables in your mind? Is w a variable in the classic definition?
Most certainly not!

You just don't seem to get this, do you? What is bothering you about
this?
You have not grasped this since the beginning.

> You stated this:
>
> >Given an interval [x;x+w] subdivided into n equal parts
> >and a function f which is continuous on [x;x+w] and
> >differentiable on [x;x+w), the secant gradient or mean
> >value of f is equal to the average first derivative of f
> That does not state that f' is continuous.

Oh yes it does! What an uninformed thing of you to say. Let's see:

If f is differentiable everywhere, this means that f' exists everywhere
in the interval! And if f' exists everywhere in the interval, then it
is continuous everywhere in the interval. Give me one example of where
this is untrue. Gabriel states that *only* f'(x+w) need not exist.

> I didn't say it was the same, I said it was a trivial
> consequence of ftc. It is, for exactly the reason various
> people have explained to you: that sum is a Riemann sum
> for the integral of f'.

By saying it is a trivial consequence of the ftc, you are effectively
saying it is the same. It is not a Riemann sum either, for if it were,
then
it would be *finite*. It's actually infinite because it is summed over
as n tends to infinity. Riemann sums are approximations, this is not an
approximation, it is *exact*. The Riemann sum becomes a *riemann
integral* as the size of each partition approaches 0.
It would be nice to use the riemann integral except that it is not
easily understood by students and you cannot use it to prove the
*missing link* which I believe is Gabriel's average tangent theorem.
Just try showing how

x+w
f(x+w) - f(x) = INT f'(x) dx
x

using the riemann sum! However, gabriel's theorem fits exactly between
the LHS and RHS in the above formula:

x+w
f(x+w) - f(x) = w * ATG(f) = INT f'(x) dx
x

No real analysis is required here. Furthermore, the riemann sum is not
really a riemann sum, but an archimedean sum. It was Archimedes who
discovered the integral and used the first methods of exhaustion. Sorry
to
downplay your German roots a little!

> You have no idea what you're talking about.

I don't think so David. Maybe you can stop being so narrow-minded and
hardheaded and look at this theorem without any preconceived ideas.
Perhaps if you try, you might see something you did not see before.
The average tangent theorem is a small step to the average sum theorem
which I find fascinating but do not completely understand. It is a very
interesting theorem. I believe that had anyone known gabriel's theorems
back then, real analysis may not even have been around today. Careful
David, you may end up looking the fool in a few years time!


Jason Wells.

Jason

unread,
Mar 5, 2005, 11:38:18 AM3/5/05
to
Okay David,

Have just thought of an example where f' may not be continuous.
So let's just go ahead and say that Gabriel's theorem should include
a statement about f' being continuous everywhere except at x+w where it
need
not even exist according to Gabriel's theorem.

Jason Wells

denis feldmann

unread,
Mar 5, 2005, 3:12:05 PM3/5/05
to
Jason wrote:

>>No, it defines a function of two variables x and w.
>
>
> Once again: No David! Both you and Yan got this wrong. It is a function
> of *one* variable, i.e. x. w which is the width is *constant*. The only
> thing that's changing is w/n, not w. If this is a function of two
> variables, then so is the classic definition. How is it a function of
> two variables in your mind? Is w a variable in the classic definition?
> Most certainly not!


Where is w in the classic definition?


>
> You just don't seem to get this, do you? What is bothering you about
> this?
> You have not grasped this since the beginning.
>
>

Someone indeed doesn't grasp things

>>You stated this:
>>
>>
>>>Given an interval [x;x+w] subdivided into n equal parts
>>>and a function f which is continuous on [x;x+w] and
>>>differentiable on [x;x+w), the secant gradient or mean
>>>value of f is equal to the average first derivative of f
>>
>>That does not state that f' is continuous.
>
>
> Oh yes it does! What an uninformed thing of you to say.


Troll, or clueless; probably both. See below

> Let's see:
>
> If f is differntiable everywhere, this means that f' exists everywhere
> in the interval! And if f' exists everywhere in the interval, then it
> is continuous everywhere in the interval.

That's the stupid clueless thing alluded to above


> Give me one example of where
> this is untrue.

Try f(x)= 0 for x=0, f(x)=x^2sin(1/x) for x<>0

> Gabriel states that *only* f'(x+w) need not exist.
>
>
>>I didn't say it was the same, I said it was a trivial
>>consequence of ftc. It is, for exactly the reason various
>>people have explained to you: that sum is a Riemann sum
>>for the integral of f'.
>
>
> By saying it is a trivial consequence of the ftc, you are effectively
> saying it is the same. It is not a Riemann sum either for it if were,
> then
> it would be *finite*. It's actually infinite because it is summed over
> as n tends to infinity.


This ends the debate. I will probably read you a bit longer for fun
(and to see if you have any coherent answer to the classical function
above), but don't expect any answer. Good luck with your life, as your
maths are irredeemably lost.

Troll, hubris, and JSH-like behaviour. You have got a Hammer, too?


>
>
> Jason Wells.
>

Jason

unread,
Mar 5, 2005, 4:51:17 PM3/5/05
to
> Where is w in the classic definition?

f(x+w) - f(x)


f'(x) = Lim -------------

w->0 w

Are you sure you know how to read English? Do you understand what is
being written here or are you just posting stuff because you have
nothing better to do?

> Someone indeed don't grasp things

Yes, someone is not grasping anything from what I see.

> Troll, or clueless; probably both. See below

Okay, unless you apologize for your foul language and your tantrums, I
will no longer respond to any of your posts. See if you can understand
this. Perhaps I should translate it into French for you?

> That's the stupid clueless thing alluded above

> Try f(x)= 0 for x=0, f(x)=x^2sin(1/x) for x<>0

You obviously did not read my second post to Ullrich where I said f' is
continuous but does not have to be continuous at x+w and in fact does not
even have to exist there. Go back and reread it and this time you must
*think* a little bit.

> Yhis ends the debate. I will probably read you a bit longer for fun;
> (and to see if you have any coherent answer to the classical function

> above), but don't expect any answer. Good luck wiith your life, as
> your
> maths are irredeemably lost.

Read as much as you like and post as much as you like. If you do not
apologize for calling me a troll and clueless, you no longer exist as
far as I am concerned. Au revoir, my little frog!!

Jason Wells.

Jason

unread,
Mar 5, 2005, 6:20:09 PM3/5/05
to
>Given an interval [x;x+w] subdivided into n equal parts
>and a function f which is continuous on [x;x+w] and
>differentiable on [x;x+w), the secant gradient or mean
>value of f is equal to the average first derivative of f

Ullrich replied:


> That does not state that f' is continuous.

Well, if it does not imply f' is continuous, then all your real
analysis goes down the toilet!
And you call yourself a college math professor?

Giggle, giggle. You put your foot in it again!!

Jason

unread,
Mar 5, 2005, 6:56:44 PM3/5/05
to

So you think I don't know what I am talking about? Well, you probably
need to study what a riemann integral is and then compare it with
gabriel's average derivative and you will see they are vastly
different. Riemann's integral is a joke next to gabriel's average
derivative. Gabriel does not use a mesh value and gabriel's result if
applied to numeric integration/differentiation yields a far better
result than anything riemann ever dreamt of! In fact the Lebesgue
integral definition is not as strong as gabriel's ATT either.

See David, you need to do your homework carefully before you continue
to blabber all the junk you have been blabbering out
on this forum. Evidently *you* have no idea what you are talking
about!!

Jason

denis feldmann

unread,
Mar 6, 2005, 12:08:43 AM3/6/05
to
Jason wrote:
Did you send that one before or after the one where we showed you that
f' need not be continuous (like x^2*sin(1/x^2) if x<>0, 0 if x=0)? If
after, you are a troll; if before, don't you think you could have been,
well, a little bit less petulant in your answer? And you ask for an
apology from me? (Remember, I have no connection with David Ullrich.)

ma...@mimosa.csv.warwick.ac.uk

unread,
Mar 6, 2005, 7:28:18 AM3/6/05
to
In article <1110039173.4...@f14g2000cwb.googlegroups.com>,

"Jason" <loga...@yahoo.com> writes:
>> No, it defines a function of two variables x and w.
>
>Once again: No David! Both you and Yan got this wrong. It is a function
>of *one* variable, i.e. x. w which is the width is *constant*. The only
>thing that's changing is w/n, not w.

But if w is a constant, then you have to tell us what its value is,
don't you? If I want to use this definition to compute a derivative of
some function f at some point x, then how do I decide which w to use?

Perhaps you will answer something like it does not matter which w I
choose, provided that f is continuous in [x,x+w]. But if that is the
case, then how do I know that I will not get different answers for the
derivative, depending on whether I choose, say w = 1 or w = 1/2 ?

Derek Holt.
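Derek's question can be tested on an ordinary differentiable function. In
the sketch below (f = exp and x = 0.3 are illustrative), the choice w = 1
versus w = 1/2 makes no difference in the limit, because only the ratio
w/n -> 0+ ever enters the quotient; the same experiment with f(x) = |x| at 0
would of course pick up only the one-sided derivative on the side determined
by the sign of w.

import math

f, fprime = math.exp, math.exp     # illustrative differentiable f
x = 0.3

for w in (1.0, 0.5):               # the two widths mentioned above
    for n in (10, 1000, 100000):
        h = w / n                  # h -> 0+ for either choice of w
        print(w, n, (f(x + h) - f(x)) / h)

print("classical f'(x) =", fprime(x))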


David C. Ullrich

unread,
Mar 6, 2005, 8:34:51 AM3/6/05
to
On 5 Mar 2005 08:12:53 -0800, "Jason" <loga...@yahoo.com> wrote:

>> No, it defines a function of two variables x and w.
>
>Once again: No David! Both you and Yan got this wrong. It is a function
>of *one* variable, i.e. x. w which is the width is *constant*. The only
>thing that's changing is w/n, not w. If this is a function of two
>variables, then so is the classic definition. How is it a function of
>two variables in your mind? Is w a variable in the classic definition?
>Most certainly not!
>
>You just don't seem to get this, do you? What is bothering you about
>this?
>You have not grasped this since the beginning.

Guffaw.

>> You stated this:
>>
>> >Given an interval [x;x+w] subdivided into n equal parts
>> >and a function f which is continuous on [x;x+w] and
>> >differentiable on [x;x+w), the secant gradient or mean
>> >value of f is equal to the average first derivative of f
>> That does not state that f' is continuous.
>
>Oh yes it does! What an uninformed thing of you to say. Let's see:
>
>If f is differntiable everywhere, this means that f' exists everywhere
>in the interval! And if f' exists everywhere in the interval, then it
>is continuous everywhere in the interval. Give me one example of where
>this is untrue.

Your ignorance of calculus is astounding. It's a very standard
example:

Let f(x) = x^2 sin(1/x^2) for x <> 0, f(0) = 0. Then f'(x) exists
for every x, although f' is not continuous.
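For concreteness, here is that standard example written out with a few
sample values; the sample points are arbitrary.

import math

def f(t):
    # f(0) = 0, f(t) = t^2 sin(1/t^2) otherwise: differentiable everywhere.
    return 0.0 if t == 0 else t*t * math.sin(1.0 / (t*t))

# f'(0) exists and equals 0, since |f(h)/h| = |h sin(1/h^2)| <= |h|:
for h in (1e-2, 1e-4, 1e-6):
    print("difference quotient at 0 with h =", h, ":", f(h) / h)

# Away from 0, f'(t) = 2t sin(1/t^2) - (2/t) cos(1/t^2), which is unbounded
# near 0, so f' cannot be continuous there:
for t in (1e-2, 1e-3, 1e-4):
    print("f'(", t, ") =", 2*t*math.sin(1/(t*t)) - (2/t)*math.cos(1/(t*t)))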

>Gabriel states that *only* f'(x+w) need not exist.
>
>> I didn't say it was the same, I said it was a trivial
>> consequence of ftc. It is, for exactly the reason various
>> people have explained to you: that sum is a Riemann sum
>> for the integral of f'.
>
>By saying it is a trivial consequence of the ftc, you are effectively
>saying it is the same.

No.

>It is not a Riemann sum either for it if were,
>then
>it would be *finite*. It's actually infinite because it is summed over
>as n tends to infinity.

You're gibbering here - the fact that we're taking a limit as
n -> infinity does not mean that n is infinite.

The sum

sum_j=1^n f'(x+jw/n)*(w/n)

_is_ a Riemann sum for int_x^{x+w} f'.

Just curious: Why are you continuing this series of posts?

I mean by now it must be clear that _everyone_ disagrees. So
that proves that we're all stupid and you're the only one who
can see the truth, fine. Why are you wasting your breath on us?

>Jason Wells.


************************

David C. Ullrich

David C. Ullrich

unread,
Mar 6, 2005, 8:38:59 AM3/6/05
to
On 5 Mar 2005 08:38:18 -0800, "Jason" <loga...@yahoo.com> wrote:

>Okay David,
>
> Have just thought of an example where f' may not be continuous.

That's a lie and you know it. Your weak understanding of these
things could not possibly allow you to think of an example.

You mean you just _saw_ an example in some book or somewhere
online, and although you didn't understand it you decided
for some reason it must be right.

Now I'm curious about something. In your previous post you
said this, when I said that f differentiable does not imply
that f' is continuous:

"Oh yes it does! What an uninformed thing of you to say. Let's see:

If f is differntiable everywhere, this means that f' exists everywhere
in the interval! And if f' exists everywhere in the interval, then it
is continuous everywhere in the interval. Give me one example of where
this is untrue. Gabriel states that *only* f'(x+w) need not exist."

Now you've decided that there _is_ such an example. What I'm
curious about is whether this has any effect on your estimation
of who's the "uninformed" one here.

> So let's just go ahead and say that Gabriel's theorem should include
>a statement about f' being continuous everywhere except at x+w where it
>need
>not even exist according to Gabriel's theorem.

Fine, let's say that. Then the amazing theorem _is_ a trivial
consequence of the fundamental theorem of calculus.

And I mean _trivial_: it follows from nothing but ftc plus
the _definition_ of the integral.

David C. Ullrich

unread,
Mar 6, 2005, 8:43:17 AM3/6/05
to

Fascinating. A minute ago I saw a post from _you_ admitting
that in fact f' need not be continuous:

"Have just thought of an example where f' may not be continuous."

So far you've called me uninformed for my opinion on this
question. Above you say it makes my real analysis go down
the toilet and question whether I should be a professor.

Now it turns out that you were wrong about this (and
you _admit_ you were wrong). But there's been no apology
yet for the things you said.

You complain a lot about people being insulting to _you_.
This is one reason people call you a crackpot - it's
precisely typical crackpot behavior: When a crackpot
complains that people are whatever to him, you can
be certain that in fact _he_ is being whatever to them.

************************

David C. Ullrich

David C. Ullrich

unread,
Mar 6, 2005, 8:46:22 AM3/6/05
to
On 5 Mar 2005 15:56:44 -0800, "Jason" <loga...@yahoo.com> wrote:

>
>So you think I don't know what I am talking about?

_every_ competent mathematician reading these threads
_knows_ that you don't know what you're talking about.
You give proofs of this in just about every post.

Let's think about this. We've "disagreed" about many things.
So far there's _one_ question where we disagreed, where
finally both sides have agreed on which answer is right:

Q: If f' exists everywhere need f' be continuous?
A: No.

Given that on the _one_ question where the two of us
finally agreed it turned out you finally agreed that
I was right, do you think you might want to revise
any of what you say above?

W. Dale Hall

unread,
Mar 6, 2005, 12:36:43 PM3/6/05
to
Jason wrote:

Jason,

I think you forgot to put in that little thing you used to say
about not being a supporter of Gabriel's work, but merely are
curious about its validity.

Here, I'll look it up for you:

Disclaimer: I am not endorsing any of Gabriel's work.
Neither am I absolutely certain that all of it is correct
without any doubt. However, I am interested in his average
sum theorem and in particular, a special case of it which
leads to what he calls the average tangent theorem (ATT).
The ATT (or Average Derivative) if true can be used to
prove the mvt and ftoc and several others.

There. Now your objectivity is plain for all to see.

Dale.

Jason

unread,
Mar 6, 2005, 7:43:32 PM3/6/05
to
> Facinating. A minute ago I saw a post from _you_ admitting
> that in fact f' need not be continuous:
> "Have just thought of an example where f' may not be continuous."

Yes, I did say this. But, I am telling you that gabriel's theorem does
not require f' to be continuous at x+w. I see no discrepancies here.

> So far you've called me uninformed for my opinion on this
> question. Above you say it makes my real analysis go down
> the toilet and question whether I should be a professor.
> Now it turns out that you were wrong about this (and
> you _admit_ you were wrong). But there's been no apology
> yet for the things you said.

Actually I have been the only one that's forthright thus far. You were
wrong about gabriel's definition containing two variables and you are
still wrong about a lot of other things. You can be absolutely sure we
disagree on most things but this doesn't say that you are particularly
good at anything. So far, you have made several unsubstantiated claims:

   - You have said Gabriel's derivative definition contains more
than one variable. You are wrong!
- You have ignorantly stated that gabriel's theorem is a
trivial consequence of the ftoc. You are wrong!

Did you admit to any of this? No. I don't think so. As for complaining,
I am bothered when supposedly educated men like you say things like:
*You don't know what you are talking about.* I really don't care what
most people on this forum say. I have little or no respect for them.
Shouldn't they be apologizing? David, every time you see you are wrong
about something, you conveniently bring up an issue with the other
posters on this forum. Again, their opinions mean nothing to me.

Me a crackpot? Giggle, are you reading any of the other comments?
Please tell me what on earth a contributor's opinion of me (troll, or
whatever) has to do with the fucking subject I am discussing. They can
all go to hell for all I care. Anyone who is rude, who attacks the
character of someone else without knowing jackshit can in fact get
stuffed! Trust me David, I don't lose any sleep over anyone's opinion -
not in this forum or any other. And hey, am I the only one to make
mistakes? You have all made mistakes. Please spare me the sanctimonious
'better than thou' crap. Just answer the questions as I have tried to
answer your questions. So far, you have answered nothing and have made
a few serious mistakes.
Forget about my character: None of you on this site even have a clue of
what kind of individual I am. And I would not bother with you in any
other situation were it outside this forum.

If you think my messages are so annoying, then why do you respond?
if I have offended you so much, why on earth do you bother?

Troll: I have not left any annoying messages on this forum yet I get
the likes of *real trolls* like feldman and hale and whoever else
frequents this forum being guilty of exactly what they accuse JSH and
me. Now I have not read JSH's posts and know little about what he is
trying to achieve - this is not the point. Point is you are in fact
guilty of what you accuse others to be and not the other way around.

Make a useful contribution or shut up I say. The choice is yours. I am
responding to you because I still respect you. All the other assholes
are not going to get a response from me. I asked you to take this
offline with me so that we can discuss it without all the two cents
worth of every tom, dick and harry in this forum. I know they are all
basically very stupid. Have sent you a private email but so far you
have not responded. Why don't you take this off-line with me? Too
afraid I might convince you? :-) Truth is you may end up convincing me
that gabriel is indeed wrong because I still have some questions. What
really keeps me going is the fact neither you nor anyone else has been
able to disprove or prove *anything* aside from rhetoric, ignorant
posts and irrelevant troll-like comments.

So let's see what you are really made of David - all talk or just
another troll on sci.math...

Jason

unread,
Mar 6, 2005, 7:46:41 PM3/6/05
to
The more I discuss it on this forum, the more convinced I am becoming
that it is true and every one here is a blundering fool!

So how is that for objectivity? Now dale, do you have anything
*mathematical* you care to post or do you just like posting crap for
the heck of it? Now careful! If you disrespect me by posting another
crappy message I simply won't respond. get it?
By the way, I prefer to read only stuff pertaining to the subject. I
don't care what you think of gabriel or of me or jesus christ.

MFolz

unread,
Mar 6, 2005, 8:52:37 PM3/6/05
to
> Riemann's integral is a joke next to gabriel's average
> derivative. Gabriel does not use a mesh value


I don't claim to be an expert on analysis, nor do I have the time to
decipher Gabriel's (pseudo?) math, but the Riemann integral does not
necessarily use a mesh value. Most analysis texts these days seem to
define the upper integral F as inf U(g,P) and the lower integral f as
sup L(g,P). If F=f, then the function g is integrable.
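A small sketch of the upper/lower-sum formulation described here, for one
illustrative integrand g = exp on [0, 1] and uniform partitions only (the
full definition takes the inf/sup over all partitions, but refining uniform
ones already shows the squeeze):

import math

def upper_lower(g, a, b, n):
    # U(g,P) and L(g,P) for the uniform partition of [a,b] into n pieces.
    # For this monotone g the sup/inf on each piece are the endpoint values.
    dx = (b - a) / n
    U = L = 0.0
    for j in range(n):
        lo, hi = g(a + j*dx), g(a + (j+1)*dx)
        U += max(lo, hi) * dx
        L += min(lo, hi) * dx
    return U, L

g = math.exp                       # illustrative integrand
for n in (10, 100, 1000):
    U, L = upper_lower(g, 0.0, 1.0, n)
    print(n, L, U, U - L)          # U - L -> 0; both approach e - 1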

Jason

unread,
Mar 6, 2005, 11:00:10 PM3/6/05
to
Actually, they all use a mesh value. However, the riemann integral is
an *approximation*. Gabriel's average tangent/derivative theorem is not
an approximation - it is *natural integration*. You can take any
definite integral and compute its value using the average derivative.

So you do not have the time to *decipher* gabriel's math yet you
question its validity? Strange, I did not know that math needs
deciphering for it is either true or false. If you have not looked at
something, why do you by default cast doubt on its truth? You are
simply following in the footsteps of most fools on this forum. The
majority of posts including those by math professors demonstrate they
(the real trolls) did not even bother to so much as read gabriel's
stuff. All I have seen so far is sheer arrogance, stupidity and the
utmost ignorance.

Now allow me to criticize your post:

> I don't claim to be an expert on analysis, nor do I have the time to
> decipher Gabriel's (pseudo?) math, but the Riemann integral does not
> necessarily use a mesh value. Most analysis texts these days seem to
> define the upper integral F as inf U(g,P) and the lower integral f as
> sup L(g,P). If F=f, then the function g is integrable.

Most analysis texts are a load of rubbish and are taught by insecure
individuals the likes of whom can be found on this forum. What are you
trying to say? What does *integrable* mean? It's not an English word.
Let's suppose you mean that g is a function and that by *integrable*
you mean it can be integrated. So what is P then? Do you think that
most people who read this forum know what you mean?

Jason

unread,
Mar 6, 2005, 11:22:16 PM3/6/05
to
One more thing: the father of real analysis was Karl Weierstrass, who,
by the way, did not complete his degree, choosing rather to nurture his
beer belly and his skills in fencing. He was a pathetic drunk whose
works were largely ignored for a long time. Today, his slippery and
vague ideas and concepts are taught in *real analysis*.

The use of epsilon-delta arguments is highly questionable for real
analysis. Most supporters of real analysis will argue
in strong support of no hyperreal numbers existing in *real analysis*,
yet the very idea of epsilon-delta assumes real numbers are points
defined on a number line.

Weierstrass did not put calculus on a firm footing or more robust
ground, he confounded most of today's so-called mathematicians who are
nothing more than fat, beer bellied drunks who are by nature thugs and
slimebags.
How can anything sound proceed from an unhealthy mind and a sack of
beer?

Wow, am I gonna get responses to this!! Sheeesh, I just don't want to
think how many trolls are going to jump on their band wagon... Fat and
ugly boys like hammick and company are sure to splatter their beer's
worth without any doubt.

Okay trolls, do your thing...

David C. Ullrich

unread,
Mar 7, 2005, 8:25:32 AM3/7/05
to
On 6 Mar 2005 16:43:32 -0800, "Jason" <loga...@yahoo.com> wrote:

>> Facinating. A minute ago I saw a post from _you_ admitting
>> that in fact f' need not be continuous:
>> "Have just thought of an example where f' may not be continuous."
>
>Yes, I did say this. But, I am telling you that gabriel's theorem does
>not require f' to be continuous at x+w. I see no discrepancies here.
>
>> So far you've called me uninformed for my opinion on this
>> question. Above you say it makes my real analysis go down
>> the toilet and question whether I should be a professor.
>> Now it turns out that you were wrong about this (and
>> you _admit_ you were wrong). But there's been no apology
>> yet for the things you said.
>
>Actually I have been the only one that's forthright thus far. You were
>wrong about gabriel's definition containing two variables

Nope.

>and you are
>still wrong about a lot of other things. you can be absolutely sure we
>disagree on most things but this doesn't say that you are particularly
>good at anything. So far, you have made several unsubstantiated claims:
>
> - You have said Gabriel's derivative definitioin contains more
> than one variable. You are wronog!
> - You have ignorantly stated that gabriel's theorem is a
> trivial consequence of the ftoc. You are wrong!

Nope. _If_ we add the assumption that f' is continuous then
it _is_ a trivial consequence of ftoc. As many people have
pointed out, in varying degrees of detail.

>Did you admit to any of this? No. I don't think so. As for complaining,
>I am bothered when supposedly educated men like you say things like:
>*You don't know what you are talking about.*

That's a fact, that you prove with every post, whether it bothers
you or not.

>I really don't care what
>most people on this forum say. I have little or no respect for them.
>Shouldn't they be apologizing? Daivd, every time you see you are wrong
>about something,

Huh? It has not happened yet that I've seen I was wrong about
anything that's come up between the two of us.

>you conveniently bring up an issue with the other
>posters on this forum. Again, their opinions mean nothing to me.
>
>Me a crackpot? Giggle, are you reading any of the other comments?
>Please tell me what on earth a contributor's opinion of me (troll, or
>whatever) has to do with the fucking subkect I am discussing. They can
>all go to hell for all I care. Anyone who is rude, who attacks the
>character of someone else without knowing jackshit can in fact get
>stuffed!

Do you read _your_ posts? People who are rude can get stuffed.

>Trust me David, I don't lose any sleep over anyone's opinion -
>not in this forum or any other. And hey, am I the only one to make
>mistakes? You have all made mistakes.

Yes. Doesn't change the fact that more or less everything you've
said about all this is wrong.

>Please spare me the sanctimonious
>'better than thou' crap. Just answer the questions as I have tried to
>answer your questions. So far, you have answered nothing and have made
>a few serious mistakes.
>Forget about my character:

I don't recall ever saying anything about your character.

>None of you on this site even have a clue of
>what kind of individual I am. And I would not bother with you in any
>other situation were it outside this forum.
>
>If you think my messages are so annoying, then why do you respond?
>if I have offended you so much, why on earth do you bother?

If anyone's curious, this is the part that made me decide to reply
to this message instead of just dropping it as hopeless. Back to
Jason:

Huh? What makes you think that I find your messages annoying?
I haven't told _you_ to get stuffed, I've just commented on the
math.

In fact I don't find your messages annoying, I find them hilarious.
In case you didn't know, so do a lot of people - every time you
make a post here it leads to people all over the planet rolling
on the floor.

Honest.

>Troll: I have not left any annoying messages on this forum yet I get
>the likes of *real trolls* like feldman and hale and whoever else
>frequents this forum being guilty of exactly what they accuse JSH and
>me. Now I have not read JSH's posts and know little about what he is
>trying to achieve - this is not the point. Point is you are in fact
>guilty of what you accuse others to be and not the other way around.
>
>Make a useful contribution or shut up I say.

I have. The fact that you keep saying I'm wrong doesn't make me wrong.

>The choice is yours. I am
>responding to you because I still respect you. All the other assholes
>are not going to get a response from me. I asked you to take this
>offline with me so that we can discuss it without all the two cents
>worth of every tom, dick and harry in this forum. I know they are all
>basically very stupid. Have sent you a private email but so far you
>have not responded. Why don't you take this off-line with me? Too
>afraid I might convince you? :-)

No. The reason is that it was clear long ago that there's no
possibility anyone will ever convince you that you're wrong
about more or less everything, so there would be no point to
an offline "discussion".

As opposed to the online stuff, where one point is that it's
public - when people make incorrect statements in a place
like this where others come to learn about math then it's
proper for others to point out the errors. Also it's all
extremely amusing - wouldn't be anything funny about an
offline exchange because nobody else would be able to see it.

>Truth is you may end up convincing me
>that gabriel is indeed wrong because I still have some questions. What
>really keeps me going is the fact neither you nor anyone else has been
>able to disprove or prove *anything* aside from rhetoric, ignorant
>posts and irrelevant troll-like comments.

I don't believe that anyone has _said_ that his theorem is false
as stated. But nobody has given a proof of it, and people
_suspect_ it's false, for example if there does indeed exist
a non-constant differentiable f such that f'(r) = 0 for all
rational r, as I suspect, then the theorem's certainly false.

>So let's see what you are really made of David - all talk or just
>another troll on sci.math...


************************

David C. Ullrich

David C. Ullrich

unread,
Mar 7, 2005, 8:26:53 AM3/7/05
to
On 6 Mar 2005 16:46:41 -0800, "Jason" <loga...@yahoo.com> wrote:

>The more I discuss it on this forum, the more convinced I am becoming
>that it is true and every one here is a blundering fool!

When you're convinced that _everyone_ else is wrong there's usually
a more plausible explanation.

>So how is that for objectivity? Now dale, do you have anything
>*mathematical* you care to post or do you just like posting crap for
>the heck of it? Now careful! If you disrespect me by posting another
>crappy message I simply won't respond. Get it?
>By the way, I prefer to read only stuff pertaining to the subject. I
>don't care what you think of gabriel or of me or jesus christ.


************************

David C. Ullrich

David C. Ullrich

unread,
Mar 7, 2005, 8:33:01 AM3/7/05
to

Yes, most people reading this thread know what "integrable" means.
The fact that you obviously don't is one of the many things that
makes it all amusing enough to persuade some people to keep reading
your posts.

Definition: Suppose that f:[a,b] -> R is bounded. We say f is
(Riemann) integrable if for every epsilon > 0 there exists
delta > 0 such that if a = t_0 < ... < t_n = b,
I_j = [t_{j-1}, t_j], M_j is the sup of f on I_j and m_j is
the inf of f on I_j then the sum of (M_j - m_j)*(length(I_j))
is less than epsilon.
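
For concreteness, here is a minimal numerical sketch of that definition
(Python; the sample function and the uniform partitions are arbitrary
illustrative choices, nothing from Gabriel's papers, and it uses the
mesh condition length(I_j) < delta that gets spelled out a bit further
down the thread):

def oscillation_sum(f, a, b, n):
    # Uniform partition a = t_0 < ... < t_n = b, so every I_j has
    # length (b - a)/n, which plays the role of the mesh delta.
    ts = [a + (b - a) * j / n for j in range(n + 1)]
    total = 0.0
    for j in range(1, n + 1):
        lo, hi = ts[j - 1], ts[j]
        # For a monotone f the sup M_j and inf m_j on I_j = [lo, hi]
        # are attained at the endpoints.
        M_j, m_j = max(f(lo), f(hi)), min(f(lo), f(hi))
        total += (M_j - m_j) * (hi - lo)
    return total

f = lambda x: x * x                  # sample function, increasing on [0, 1]
for n in (10, 100, 1000):
    print(n, oscillation_sum(f, 0.0, 1.0, n))

The printed sums come out to roughly 1/n (about 0.1, 0.01, 0.001), so
for any epsilon > 0 a fine enough partition makes the sum less than
epsilon, which is exactly what the definition asks for.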

You really didn't know that? If you don't know what the word
"integrable" means then the fact that you've been replying
to posts where people have been discussing the concept seems
very curious.

************************

David C. Ullrich

Jason

unread,
Mar 7, 2005, 9:39:40 AM3/7/05
to
> Definition: Suppose that f:[a,b] -> R is bounded. We say f is
> (Riemann) integrable if for every epsilon > 0 there exists
> delta > 0 such that if a = t_0 < ... < t_n = b,
> I_j = [t_{j-1}, t_j], M_j is the sup of f on I_j and m_j is
> the inf of f on I_j then the sum of (M_j - m_j)*(length(I_j))
> is less than epsilon.
> You really didn't know that? If you don't know what the word
> "integrable" means then the fact that you've been replying
> to posts where people have been discussing the concept seems
> very curious.

Actually I know exactly what it is. I am questioning the relevance of
his post. I am not sure he knows exactly what he is talking about.

See, this is an example of you skirting the issues. Forget about all
the other bozos on here! I am trying to discourage anyone who does not
know what they are talking about from posting to this thread.

As for the definition you provided, although I know it, I do not fully
agree with it. As I stated earlier, I don't like to talk about constant
functions such as f(x) = c for c some constant as being differentiable.
They are not differentiable in my opinion. Now the above definition
covers your ass for such functions but in truth, the ftoc fails
miserably for any such function. i.e. f' = 0. The indefinite integral
of 0 is some constant and its evaluation is 0 since f(x+w)-f(x) = 0 for
any such constant function. So here you have a constant function which
is differentiable but not *integrable*. Wow, makes a lot of sense for a
learner, doesn't it?
The epsilon-delta definition is a load of hogwash in my opinion. In
this case of a constant function it is stating that f(x) = c is *not*
integrable since no epsilon > 0 exists. I hinted at this earlier but
seeing you were all so stupid, I simply avoided it.

So why don't you try to answer the questions and stop being so arrogant!

Tim Peters

unread,
Mar 7, 2005, 10:34:18 AM3/7/05
to
[David C. Ullrich]

>> Definition: Suppose that f:[a,b] -> R is bounded. We say f is
>> (Riemann) integrable if for every epsilon > 0 there exists
>> delta > 0 such that if a = t_0 < ... < t_n = b,
>> I_j = [t_{j-1}, t_j], M_j is the sup of f on I_j and m_j is
>> the inf of f on I_j then the sum of (M_j - m_j)*(length(I_j))
>> is less than epsilon.
>> You really didn't know that? If you don't know what the word
>> "integrable" means then the fact that you've been replying
>> to posts where people have been discussing the concept seems
>> very curious.

...

[Jason]


> The epsilon-delta definition is a load of hogwash in my opinion. In
> this case of a constant function it is stating that f(x) = c is *not*
> integrable since no epsilon > 0 exists.

Of course every epsilon > 0 exists, so "no epsilon > 0 exists" is
nonsensical. Maybe you meant to say that, given an epsilon > 0, no delta
exists satisfying the condition? Nope, a constant function f(x)=c is
especially _easy_ this way. Then M_j - m_j = c-c = 0 regardless of the
partition, so the sum of (M_j - m_j)*length(I_j) is 0 regardless of the
partition, which is indeed less than epsilon. It doesn't matter which delta
you pick in this case; e.g., always picking delta = b-a, regardless of
epsilon, is enough to show the Riemann integrability of a constant function.
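
For what it's worth, the same computation in Python (the constant, the
interval, and the partition below are arbitrary illustrative choices):

c = 3.7
f = lambda x: c                          # constant function on [a, b]
partition = [0.0, 0.2, 0.5, 0.9, 1.0]    # any a = t_0 < ... < t_n = b

total = 0.0
for lo, hi in zip(partition, partition[1:]):
    M_j = max(f(lo), f(hi))              # sup of f on [lo, hi], which is c
    m_j = min(f(lo), f(hi))              # inf of f on [lo, hi], also c
    total += (M_j - m_j) * (hi - lo)

print(total)                             # 0.0, less than any epsilon > 0

As Tim says, the choice of delta never matters here: the sum is 0 for
every partition.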

> I hinted at this earlier but seeing you were all so stupid, I simply
> avoided it.

Thank you for your attempted kindness.


David C. Ullrich

unread,
Mar 7, 2005, 10:49:07 AM3/7/05
to
On 7 Mar 2005 06:39:40 -0800, "Jason" <loga...@yahoo.com> wrote:

By the way, looking at what I wrote I see a typo. We need
to insert a few words to make the definition I gave correct:

>> Definition: Suppose that f:[a,b] -> R is bounded. We say f is
>> (Riemann) integrable if for every epsilon > 0 there exists
>> delta > 0 such that if a = t_0 < ... < t_n = b,
>> I_j = [t_{j-1}, t_j],

length(I_j) < delta for all j,

>> M_j is the sup of f on I_j and m_j is
>> the inf of f on I_j then the sum of (M_j - m_j)*(length(I_j))
>> is less than epsilon.
>> You really didn't know that? If you don't know what the word
>> "integrable" means then the fact that you've been replying
>> to posts where people have been discussing the concept seems
>> very curious.
>
>Actually I know exactly what it is. I am questioning the relevance of
>his post. I am not sure he knows exactly what he is talking about.

That's very curious - I don't see any reason to doubt that.

>See, this is an example of you skirting the issues. Forget about all
>the other bozos on here! I am trying to discourage anyone who does not
>know what they are talking about from posting to this thread.
>
>As for the definition you provided, although I know it,

Curious that you didn't notice the _error_ in the version
I posted then.

>I do not fully
>agree with it. As I stated earlier, I don't like to talk about constant
>functions such as f(x) = c for c some constant as being differentiable.
>They are not differentiable in my opinion.

Right. Again, this is the sort of "opinion" that keeps some
people reading your posts...

>Now the above definition
>covers your ass for such functions but in truth, the ftoc fails
>miserably for any such function. i.e. f' = 0. The indefinite integral
>of 0 is some constant and its evaluation is 0 since f(x+w)-f(x) = 0 for
>any such constant function. So here you have a constant function which
>is differentiable but not *integrable*. Wow, makes a lot of sense for a
>learner, doesn't it?
>The epsilon-delta definition is a load of hogwash in my opinion. In
>this case of a constant function it is stating that f(x) = c is *not*
>integrable since no epsilon > 0 exists. I hinted at this earlier but
>seeing you were all so stupid, I simply avoided it.
>
>So why don't you try to answer the questions and stop being so arrogant!

Because you say so many things that are utterly ridiculous, at the
same time exhibiting remarkable arrogance yourself.

Here for example, you say that a constant function does not
satisfy that definition? That's hilarious.

Here's an excruciatingly detailed proof that if f = c then
f satisfies the definition above:

Suppose that epsilon > 0. Let delta = 1. Now assume that
a = t_0 < ... < t_n = b, let I_j = [t_{j-1}, t_j],
assume that length(I_j) < delta for all j, and let
M_j and m_j be the sup and the inf of f on I_j,
respectively. Then M_j = c and m_j = c, so
sum (M_j - m_j)*(length(I_j)) = 0, and hence
sum (M_j - m_j)*(length(I_j)) < epsilon. QED.

************************

David C. Ullrich

Angus Rodgers

unread,
Mar 7, 2005, 10:41:37 AM3/7/05
to

Yikes! I wasn't expecting that. I killfiled Jason in exasperation
some time ago, and I haven't been following these threads regularly,
but this made me sit up and take notice.

You're saying that, in spite of the strange presentation on Gabriel's
website, and in spite of Jason's many hilarious misconceptions (about
constant functions not being differentiable, and so on), there is a
coherent conjecture here that hasn't been settled?

I don't suppose you could save me the effort of deciphering Gabriel's
handwriting by pointing me to an article where this conjecture has
been competently formulated, could you?

Sorry I haven't been paying attention. I wasn't expecting actually
to have to look at the maths, and I only hope my rusty analysis is
up to the job. I'd better get that hat and humble pie ready!
--
Angus Rodgers
(angus_prune@ eats spam; reply to angusrod@)
Contains mild peril

Jesse F. Hughes

unread,
Mar 7, 2005, 11:57:22 AM3/7/05
to
"Jason" <loga...@yahoo.com> writes:

>> "Jason" <loga...@yahoo.com> writes:
>>>What does *integrable* mean? It's not an English word.

[...]

> Actually I know exactly what it is. I am questioning the relevance of
> his post. I am not sure he knows exactly what he is talking about.

You got a curious way of expressing yourself.

--
17:49 3/4/05: "The proof is actually not hard, and it is perfect."
07:25 3/5/05: "Nope. I made a mistake."
11:06 3/5/05: "Maybe I screwed up[...] Otherwise, um, it's very easy to factor."
11:48 3/5/05: "The answer is just that simple." -- JSH: A day in the life.

Jason

unread,
Mar 7, 2005, 12:31:19 PM3/7/05
to
To what do I owe this pleasure? :-) Are you the famous JSH?

> You got a curious way of expressing yourself.

Well, considering all the nonsense and irrelevant junk that's posted on
this forum, it's a miracle that I am even able to *express* myself. Ha,
ha.

I have a college math professor in this forum who is bent on
reprimanding me and trolls from as far as France mouthing off at me! So
forgive me if I am sometimes unable to express myself the way I should.
Unfortunately, I know nothing about surrogate factoring so can't
comment on your stuff.

Ullrich and the others will be having a good laugh at this post. Some
possible reactions might be:

"Giggle, a meeting/mating of fools..."

OR

"You said it, I didn't have to..."

OR

"See, I told you: Wells and Hughes are one and the same. Finally..."

OR

"Wells finally flipped over..."

The biggest and nastiest troll is Oliver! I think this miserable life
form is studying psychotherapy with a major in BS (bullshit). So when
he gets his BS, it will be a double BS! I am targeting Ullrich because
I think if he tries, he might understand it - maybe better than I do -
but I am running out of hope fast. I have very little hope for all the
others. Ed Hook initially seemed to be quite knowledgeable but then
gave up on me. Yan (from MIT) decided I was not worth the effort. Rodgers
quit after he got some insight into my character and realized I wasn't
such a bad guy. But he is still having a good laugh at me. :-) Oh
well,....

Now I am studying for my PhD in math. The only module I need to complete
now is the Analysis of 6-packs: I decided that the more I drink, the
more everyone on this forum begins to make sense. However, drinking a
6-pack will result in me losing my 6-pack, and not drinking it means I
will never have a beer belly and thus shall never be awarded my PhD.
Ha, ha. So it's quite a dilemma, as you realize I am sure.
I wonder what Ullrich looks like because he appeared to be annoyed at
one post in which I vilified Chapman's personal hygiene and appearance.
The last name implies Germanic ancestry. Think Ullrich may be related
to the great Weierstrass somewhere down the line? Ha, ha.

So glad to have made your acquaintance Mr. Hughes. I wish you well in
your endeavours. I am proud to be associated with you and gabriel!
Here's to the revolution!

Jason Wells

Okay, I don't tell jokes ever but now will be the first time:

Have you heard this one? What do BS, MS, and PhD stand for?

BS - Bull Shit
MS - More Shit
PhD - Piled higher and deeper.

Okay, so it's probably an old slapstick joke. I ask your forgiveness. I
never was good at telling jokes (and no doubt someone will say I was
never good at math either or I told you so). So please trolls, spare me
the comments. I would like to keep this thread from getting any worse.
Otherwise I will just have to start a new one and, God knows, some of
you trolls will have a multiple fit! Ha, ha.

Jesse F. Hughes

unread,
Mar 7, 2005, 1:17:14 PM3/7/05
to
"Jason" <loga...@yahoo.com> writes:

> To what do I owe this pleasure? :-) Are you the famous JSH?

I'm merely his sycophant. I'm the unfamous JFH, not the famous JSH,
but sometimes my .sig conveys the wrong message.

--
Jesse F. Hughes
"Well, I guess that's what a teacher from Oklahoma State University
considers proper as Ullrich has said it, and he is, in fact, a teacher
at Oklahoma State University." -- James S. Harris presents a syllogism

David C. Ullrich

unread,
Mar 8, 2005, 7:20:56 AM3/8/05
to
On Mon, 07 Mar 2005 15:41:37 +0000, Angus Rodgers
<angus...@bigfoot.com> wrote:

>[...]


>You're saying that, in spite of the strange presentation on Gabriel's
>website, and in spite of Jason's many hilarious misconceptions (about
>constant functions not being differentiable, and so on), there is a
>coherent conjecture here that hasn't been settled?
>
>I don't suppose you could save me the effort of deciphering Gabriel's
>handwriting by pointing me to an article where this conjecture has
>been competently formulated, could you?

Suppose that f is differentiable on [0,1] (or differentiable
on (0,1) and continuous on [0,1]). Does it follow that

f(1) - f(0) = lim_N sum_1^N f'(j/N)/N ?

Of course this is clear from the fundamental theorem of calculus
if f' is continuous. It seems unlikely that it's true under the
stated hypotheses, but "seems unlikely" is not quite a counterexample.

(Also it's not entirely clear whether this is exactly Gabriel's
claim, because of confusion over the definition of "derivative";
we've been given several mutually inconsistent definitions,
together with the assertion that we mean the same thing as
usual by f'... The question I was talking about takes the
standard definition of the derivative.)
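
A small numerical sketch of the sum in question (Python; the sample f
below has a continuous derivative, so the limit is already guaranteed
by the fundamental theorem of calculus, and the sketch only illustrates
what the formula says, not the open question about badly behaved f'):

import math

f  = lambda x: math.sin(x)     # sample f, with continuous derivative
fp = lambda x: math.cos(x)     # f'

target = f(1.0) - f(0.0)
for N in (10, 100, 1000, 10000):
    s = sum(fp(j / N) for j in range(1, N + 1)) / N
    print(N, s, s - target)

The sums approach f(1) - f(0) = sin(1) = 0.84147..., with the error
shrinking roughly like 1/N.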

>Sorry I haven't been paying attention. I wasn't expecting actually
>to have to look at the maths, and I only hope my rusty analysis is
>up to the job. I'd better get that hat and humble pie ready!


************************

David C. Ullrich

Jason

unread,
Mar 8, 2005, 12:21:18 PM3/8/05
to
> In fact I don't find your messages annoying, I find them hilarious.
> In case you didn't know, so do a lot of people - every time you
> make a post here it leads to people all over the planet rolling
> on the floor. Honest.

Hate to tell you that *you* are the one being laughed at. :-) Why?
Well, I don't claim to know. That's why I post messages on this board
soliciting information from *those who claim to know*. Now you claim to
know yet not once have you been able to answer any of the questions I
asked you. Your responses are hilarious and non-committal, e.g.

Nope.
Honest. etc.

> I have. The fact that you keep saying I'm wrong doesn't make me
> wrong.

You have been wrong two times that I am aware of. The rest of your
comments say nothing, so how can you have said much that is wrong?

You are supposed to be a mathematician, no? Well, show me through
mathematics, in a detailed explanation, where and why Gabriel's
theorem is wrong. You keep changing your tone! Well, what exactly are
you saying?

> No. The reason is that it was clear long ago that there's no
> possibility anyone will ever convince you that you're wrong
> about more or less everything, so there would be no point to
> an offline "discussion".

Nothing is clear from what you have written. You have a non-committal
approach and you have made at least two mistakes which you refuse to
admit.

Once again David, you need to admit your mistakes and support your
criticisms. You have not been able to convince me of anything because you
have not said much that makes sense yet. You pick out some vague topic
and then present an example which is irrelevant. As a mathematician you
would need to show in detail (giving reasons) why a proof is incorrect.
Just remember this is not my work and I am in fact answering questions
which gabriel himself should be answering. I may not be answering
correctly all the time but I am trying...

My occupation is *hobby mathematician*. I openly admit this. You are
supposed to be a professional?!

> I don't believe that anyone has _said_ that his theorem is false
> as stated. But nobody has given a proof of it, and people
> _suspect_ it's false, for example if there does indeed exist
> a non-constant differentiable f such that f'(r) = 0 for all
> rational r, as I suspect, then the theorem's certainly false.

Giggle, giggle. About 90% of the contributors on this forum have said
his theorem is false! Can you *read* ?! Have you understood any of the
posts?
Now I am rolling in laughter.

As for a proof: I attempted a proof and you have not been able to show
me anywhere that this proof is incorrect. I don't know if it is indeed
correct or not. I outlined my concerns which once again you failed to
address. Probably because this is how you teach your classes too: you
simply dismiss questions as irrelevant when you don't have answers or
you are afraid to commit yourself.

So David, if indeed the planet is laughing at me, I am at least
accomplishing some good: laughter is good for the bones!

Here's to the *good health* of all those reading my funny posts!! ha,
ha.
Please continue to read and improve your health! Your bill will be in
the mail soon. What good are you doing?

Dr. Jason Wells. Hee, hee.

Jason

unread,
Mar 8, 2005, 12:43:54 PM3/8/05
to
Ullrich wrote:

> Suppose that f is differentiable on [0,1] (or differentiable
> on (0,1) and continuous on [0,1]). Does it follow that

> f(1) - f(0) = lim_N sum_1^N f'(j/N)/N ?

> Of course this is clear from the fundamental theorem of calculus
> if f' is continuous. It seems unlikely that it's true under the
> stated hypotheses, but "seems unlikely" is not quite a
> counterexample.

Now you talk about my posts being funny! Let's see how funny yours is:

First you say "suppose f is differentiable on [0,1]" and in parentheses
you write "or differentiable on (0,1) and continuous on [0,1]".
May I remind you the two are not *equivalent*. Your statement seems
to imply they are. In mathematics you need to be precise - either/or
makes a big difference. Then you casually proceed to ask a question
which follows on a shaky statement. Okay, let's see: Does your question
pertain to the first part or second part of your statement or both?
Wow! And you call yourself a mathematician. But wait, I am not done
with you yet!

> f(1) - f(0) = lim_N sum_1^N f'(j/N)/N ?

What are you writing here? Let me see. What does *j* represent? Is it
a variable of summation maybe? Giggle. Are you summing from 1 to N?
Because if you are, it is incorrect. What does j/N mean?

Pheeeww. I think I'll stop here. Talk about clarity!! Just out of
curiosity, what percentage of your students pass? Or maybe I should
ask: How many of them have learned anything from your classes?

Jason Wells

John Gabbriel

unread,
Mar 8, 2005, 12:48:44 PM3/8/05
to
David C. Ullrich wrote:
> On Mon, 07 Mar 2005 15:41:37 +0000, Angus Rodgers
> <angus...@bigfoot.com> wrote:
>
> >[...]
>
> Suppose that f is differentiable on [0,1] (or differentiable
> on (0,1) and continuous on [0,1]). Does it follow that
>
> f(1) - f(0) = lim_N sum_1^N f'(j/N)/N ?
>
> Of course this is clear from the fundamental theorem of calculus
> if f' is continuous. It seems unlikely that it's true under the
> stated hypotheses, but "seems unlikely" is not quite a
> counterexample.

This is just the first part of Gabriel's theorem.

Part ii) states that the integral from x to x+w of f'(t) dt equals a
certain limit on the right-hand side. We have many counterexamples
which show that if f is
differentiable, then f' need not be Riemann (or even Lebesgue)
integrable.

So the theorem as stated is wrong. Part i) might be true, but I am
confident there will be counterexamples to this. I read somewhere that
there are strictly increasing differentiable functions g such that
g'(x) = 0 almost everywhere (supposedly this is given as Example
2.1.2.1 in B. R. Gelbaum and J. M. H. Olmsted, Theorems and
Counterexamples in Mathematics, Springer 1990), which leads me to
believe there will be functions such that f(0) < f(1) and f'(r) = 0 for
rational r.
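
For the Riemann half of that remark, a standard textbook example (not
anything taken from Gabriel's papers) is f(x) = x^2 sin(1/x^2) with
f(0) = 0: it is differentiable at every point of [0,1], but f' is
unbounded near 0, and an unbounded function cannot be Riemann
integrable. A quick Python sketch of the blow-up:

import math

def fprime(x):
    # For f(x) = x^2 * sin(1/x^2) with f(0) = 0:
    #   f'(x) = 2x*sin(1/x^2) - (2/x)*cos(1/x^2)   for x != 0
    #   f'(0) = 0, straight from the difference quotient
    if x == 0.0:
        return 0.0
    return 2 * x * math.sin(1 / x**2) - (2 / x) * math.cos(1 / x**2)

# At x_k = 1/sqrt(2*pi*k) we have cos(1/x_k^2) = 1 and sin(1/x_k^2) = 0,
# so f'(x_k) = -2*sqrt(2*pi*k), whose magnitude grows without bound.
for k in (1, 10, 100, 1000):
    x = 1 / math.sqrt(2 * math.pi * k)
    print(k, x, fprime(x))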

Arturo Magidin

unread,
Mar 8, 2005, 12:47:35 PM3/8/05
to
In article <1110303834.7...@l41g2000cwc.googlegroups.com>,

Jason <loga...@yahoo.com> wrote:
>Ullrich wrote:
>
>> Suppose that f is differentiable on [0,1] (or differentiable
>> on (0,1) and continuous on [0,1]). Does it follow that
>
>> f(1) - f(0) = lim_N sum_1^N f'(j/N)/N ?
>
>> Of course this is clear from the fundamental theorem of calculus
>> if f' is continuous. It seems unlikely that it's true under the
>> stated hypotheses, but "seems unlikely" is not quite a
>> counterexample.
>
>Now you talk about my posts being funny! Let's see how funny yours is:
>
>First you say "suppose f is differentiable on [0,1]" and in parentheses
>you write "or differentiable on (0,1) and continuous on [0,1]".
>May I remind you the two are not the *equivalent*. Your statement seems
>to imply they are.

Ehr, no, it does not. If instead of "or" he had written "that is",
then the statement would indeed imply the two conditions are
equivalent. But the statement, as written, implies no such thing: it
says "suppose A (or, if you prefer, suppose B instead)".

You seem to be reading the statement as if he had written "Suppose A
(i.e., B)". But he did not.

--
======================================================================
"It's not denial. I'm just very selective about
what I accept as reality."
--- Calvin ("Calvin and Hobbes")
======================================================================

Arturo Magidin
mag...@math.berkeley.edu

Jason

unread,
Mar 8, 2005, 12:56:20 PM3/8/05