
The New Calculus is the only correct formulation of calculus.


Jew Lover

unread,
Jan 10, 2019, 7:59:49 AM1/10/19
to
Mainstream mathematics academics are incorrigibly stupid creatures. They will tell you that a straight line function has a derivative and change the definition of derivative in terms of their bullshit "analytic derivative".

Let's see how the orangutans fail!

The slope of a straight line is given by:

[ f(x+h)-f(x) ] / h [A]

The derivative of a straight line is given by:

Lim (h -> 0) [ f(x+h)-f(x) ] / h [B]

Therefore, since a straight line's derivative is its slope, we have:

Lim (h -> 0) [ f(x+h)-f(x) ] / h = [ f(x+h)-f(x) ] / h

Can you get any dumber? I have produced a New Calculus which has no ill-formed concepts and have solved the tangent line problem and removed ALL ill-formed concepts such as infinity, infinitesimals and limit theory from differential and integral calculus.

A derivative can be found geometrically using Ancient Greek mathematics: https://lnkd.in/dhN6Gzw

Download the most important mathematics book ever written here (it's free!!!):

https://lnkd.in/dEkeshd

Alan Mackenzie

unread,
Jan 10, 2019, 8:57:15 AM1/10/19
to
Jew Lover <thenewc...@gmail.com> wrote:
> Mainstream mathematics academics are incorrigibly stupid creatures.

That is false and borders on defamation.

> They will tell you that a straight line function has a derivative and
> change the definition of derivative in terms of their bullshit
> "analytic derivative".

As you note below, a straight line function does indeed have a
derivative.

> The slope of a straight line is given by:

> [ f(x+h)-f(x) ] / h [A]

In fact, that is the slope of a chord through (x, f(x)) of any real
function of one variable.

> The derivative of a straight line is given by:

> Lim (h -> 0) [ f(x+h)-f(x) ] / h [B]

Correct. How well do you understand this definition?

> Therefore, since a straight line's derivative is its slope, we have:

> Lim (h -> 0) [ f(x+h)-f(x) ] / h = [ f(x+h)-f(x) ] / h

That is plain nonsense. On the left of the = sign is a function of one
variable, x, on the right of the = is a function of two variables, x and
h.

> Can you get any dumber?

I suspect not.

> I have produced a New Calculus which has no ill-formed concepts and
> have solved the tangent line problem and removed ALL ill-formed
> concepts such as infinity, infinitesimals and limit theory from
> differential and integral calculus.

What is "New Calculus" and what can be done with it? Given that calculus
is overwhelmingly successful (with applications in science, engineering
and statistics, amongst others), and is rigorously well defined, what's
the point of "New Calculus", whatever it might be?

> A derivative can be found geometrically using Ancient Greek
> mathematics: https://lnkd.in/dhN6Gzw

> Download the most important mathematics book ever written here (it's
> free!!!):

> https://lnkd.in/dEkeshd

Is that Euclid's Elements?

--
Alan Mackenzie (Nuremberg, Germany).

Jew Lover

unread,
Jan 10, 2019, 10:28:44 AM1/10/19
to
On Thursday, 10 January 2019 08:57:15 UTC-5, Alan Mackenzie wrote:
> Jew Lover <thenewc...@gmail.com> wrote:
> > Mainstream mathematics academics are incorrigibly stupid creatures.
>
> That is false and borders on defamation.

It is indisputably true.

>
> > They will tell you that a straight line function has a derivative and
> > change the definition of derivative in terms of their bullshit
> > "analytic derivative".
>
> As you note below, a straight line function does indeed have a
> derivative.

Either your eyesight is really bad or you're just plain stupid.

>
> > The slope of a straight line is given by:
>
> > [ f(x+h)-f(x) ] / h [A]
>
> In fact, that is the slope of a chord through (x, f(x)) of any real
> function of one variable.
>
> > The derivative of a straight line is given by:
>
> > Lim (h -> 0) [ f(x+h)-f(x) ] / h [B]
>
> Correct. How well do you understand this definition?

Better than you or anyone else will ever be able to understand it.

>
> > Therefore, since a straight line's derivative is its slope, we have:
>
> > Lim (h -> 0) [ f(x+h)-f(x) ] / h = [ f(x+h)-f(x) ] / h
>
> That is plain nonsense. On the left of the = sign is a function of one
> variable, x, on the right of the = is a function of two variables, x and
> h.

What you write is nonsense. Both are functions in two variables - x and h.

>
> > Can you get any dumber?
>
> I suspect not.
>
> > I have produced a New Calculus which has no ill-formed concepts and
> > have solved the tangent line problem and removed ALL ill-formed
> > concepts such as infinity, infinitesimals and limit theory from
> > differential and integral calculus.
>
> What is "New Calculus" and what can be done with it?

Go and study it, you crank!

> Given that calculus
> is overwhelmingly successful (with applications in science, engineering
> and statistics, amongst others), and is rigorously well defined, what's
> the point of "New Calculus", whatever it might be?
>
> > A derivative can be found geometrically using Ancient Greek
> > mathematics: https://lnkd.in/dhN6Gzw
>
> > Download the most important mathematics book ever written here (it's
> > free!!!):
>
> > https://lnkd.in/dEkeshd
>
> Is that Euclid's Elements?

Watch the video, you moron!

j4n bur53

unread,
Jan 10, 2019, 11:07:56 AM1/10/19
to
So you are using Euler's blunder, S = Lim S?

j4n bur53

unread,
Jan 10, 2019, 11:32:45 AM1/10/19
to
It's rather John Garbageiel's blunder.
What is this nonsense, moron?

Lim (h -> 0) [ f(x+h)-f(x) ] / h = [ f(x+h)-f(x) ] / h

On the LHS there is a value and on the right hand
side there is a function in h. Do you always compare
values with functions?

Such a comparison usually fails. R and R -> R
are distinct spaces. Maybe in Dan's DC Proof they
are the same, but usually, in ZFC:

there is no real number r,
which is also a function f : R -> R

and there is no function f : R -> R,
which is also a real number r

How stupid are you bird brain?

Me

unread,
Jan 10, 2019, 4:38:28 PM1/10/19
to
On Thursday, January 10, 2019 at 1:59:49 PM UTC+1, Jew Lover wrote:

> The slope of a straight line is given by:
>
> [ f(x+c)-f(x) ] / c [A]

for some c e IR, c > 0.

> The derivative of a straight line is given by:
>
> lim_(h -> 0) [ f(x+h)-f(x) ] / h [B]

Right.

> Therefore, since a straight line's derivative is its slope, we have:
>
> lim_(h -> 0) [ f(x+h)-f(x) ] / h = [ f(x+c)-f(x) ] / c

for some c e IR, c > 0.

Right.

> A derivative can be found geometrically [in this case].

Right.

Well done, troll boy.

Jew Lover

unread,
Jan 10, 2019, 7:11:09 PM1/10/19
to
On Thursday, 10 January 2019 16:38:28 UTC-5, Me wrote:
> On Thursday, January 10, 2019 at 1:59:49 PM UTC+1, Jew Lover wrote:
>
> > The slope of a straight line is given by:
> >
> > [ f(x+c)-f(x) ] / c [A]
>
> for some c e IR, c > 0.
>
> > The derivative of a straight line is given by:
> >
> > lim_(h -> 0) [ f(x+h)-f(x) ] / h [B]
>
> Right.
>
> > Therefore, since a straight line's derivative is its slope, we have:
> >
> > lim_(h -> 0) [ f(x+h)-f(x) ] / h = [ f(x+c)-f(x) ] / c

Changing the variable does not solve the problem, you imbecile.

Not even a good try. Fef.

Dan Christensen

unread,
Jan 10, 2019, 8:43:26 PM1/10/19
to
On Thursday, January 10, 2019 at 7:59:49 AM UTC-5, Jew Lover wrote:
> Mainstream mathematics academics are incorrigibly stupid creatures. They will tell you that a straight line function has a derivative and change the definition of derivative in terms of their bullshit "analytic derivative".
>
> Let's see how the orangutans fail!
>

Hey, Troll Boy, did you ever figure out the derivative of y=x? No??? Hint: It is NOT undefined as you previously claimed. What else have you got?


Interested readers should see: “About the spamming troll John Gabriel in his own words (December 2018)” at https://groups.google.com/forum/#!topic/sci.math/PcpAzX5pDeY


Dan

Download my DC Proof 2.0 freeware at http://www.dcproof.com
Visit my Math Blog at http://www.dcproof.wordpress.com

Me

unread,
Jan 11, 2019, 2:13:50 AM1/11/19
to
On Friday, January 11, 2019 at 1:11:09 AM UTC+1, Jew Lover wrote:
> On Thursday, 10 January 2019 16:38:28 UTC-5, Me wrote:
> > On Thursday, January 10, 2019 at 1:59:49 PM UTC+1, Jew Lover wrote:
> > >
> > > The slope of a straight line is given by:
> > >
> > > [ f(x+c)-f(x) ] / c [A]
> > >
> > for some c e IR, c > 0.
> > >
> > > The derivative of a straight line is given by:
> > >
> > > lim_(h -> 0) [ f(x+h)-f(x) ] / h [B]
> >
> > Right.
> >
> > > Therefore, since a straight line's derivative is its slope, we have:
> > >
> > > lim_(h -> 0) [ f(x+h)-f(x) ] / h = [ f(x+c)-f(x) ] / c
> > >
for some c e IR, c > 0.

Actually, ANY c e IR, c > 0 will do.

Well done, troll boy!

Jew Lover

unread,
Jan 11, 2019, 6:04:13 AM1/11/19
to
Except 0. What a moron! The fact is that

lim_(h -> 0) [ f(x+h)-f(x) ] / h = [ f(x+c)-f(x) ] / c

is NOT true in general. That is too much for your syphilitic brain to digest. Got more decrees? Chuckle.

Me

unread,
Jan 11, 2019, 7:37:09 AM1/11/19
to
On Friday, 11 January 2019 at 12:04:13 UTC+1, Jew Lover wrote:
> On Friday, 11 January 2019 02:13:50 UTC-5, Me wrote:
> >
> > Actually, ANY c e IR, c > 0 will do.
> >
> Except 0.

Oh, so c > 0 allows for c = 0 in your "New Calculus"? Fascinating!

Now you claim

> lim_(h -> 0) [ f(x+h)-f(x) ] / h = [ f(x+c)-f(x) ] / c
>
> is NOT true in general.

It seems that you can't read.

Hint: For all x e IR and for any c e IR, c > 0:

lim_(h -> 0) [ f(x+h)-f(x) ] / h = [ f(x+c)-f(x) ] / c ,

where f is a function "describing" a "straight line". In other words:

f(x) = m*x + d for all x e IR (with m e IR and d e IR) .

Zelos Malum

unread,
Jan 11, 2019, 8:14:16 AM1/11/19
to
>Better than you or anyone else will ever be able to understand it.

Not really, considering you cannot see the difference between a two-variable function and a one-variable function and then proclaim they are the same.

>What you write is nonsense. Both are functions in two variables - x and h.

Not at all. lim(h->0) binds the h, so it is not a variable and hence it is not a function of 2 variables, but of 1, because the h is bound.
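
For illustration, a small Python sketch using sympy (assuming it is available; f(x) = x^2 is just an example function): the difference quotient has two free variables, x and h, while the limit binds h and leaves a function of x alone.

import sympy as sp

x, h = sp.symbols('x h')
f = lambda t: t**2                     # example function, f(x) = x^2

quotient = (f(x + h) - f(x)) / h       # depends on x and h
derivative = sp.limit(quotient, h, 0)  # the limit binds h

print(quotient.free_symbols)           # the set {x, h} -- two free variables
print(derivative)                      # 2*x
print(derivative.free_symbols)         # {x} -- h is bound and no longer appears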

Jew Lover

unread,
Jan 11, 2019, 10:47:37 AM1/11/19
to
On Friday, 11 January 2019 07:37:09 UTC-5, Me wrote:
> On Friday, 11 January 2019 at 12:04:13 UTC+1, Jew Lover wrote:
> > On Friday, 11 January 2019 02:13:50 UTC-5, Me wrote:
> > >
> > > Actually, ANY c e IR, c > 0 will do.
> > >
> > Except 0.
>
> Oh, so c > 0 allows for c = 0 in your "New Calculus"? Fascinating!

You can't refute the New Calculus, for in order to do so, you would have to reject sound geometry.

In your bogus calculus, you do set h = 0 even though you have produced a whole lot of hand waving arguments that do you no good whatsoever.

f(x,h)= [ f(x+h)-f(x) ] / h

f'(x) = f'(x) + Q(x,h) which is only possible if Q(x,h)=0 meaning h=0.

Therefore, you have NO systematic way of finding the derivative. Your first principles method is a load of crap.

<gibberish>

Me

unread,
Jan 11, 2019, 11:09:09 AM1/11/19
to
On Friday, January 11, 2019 at 4:47:37 PM UTC+1, Jew Lover wrote:
> On Friday, 11 January 2019 07:37:09 UTC-5, Me wrote:
> > On Friday, 11 January 2019 at 12:04:13 UTC+1, Jew Lover wrote:
> > > On Friday, 11 January 2019 02:13:50 UTC-5, Me wrote:
> > > >
> > > > Actually, ANY c e IR, c > 0 will do.
> > > >
> > > Except 0.
> > >
> > Oh, so c > 0 allows for c = 0 in your "New Calculus"? Fascinating!
> >
> You can't <bla>

Yeah, whatever.

> In calculus, you do set h = 0

Nope! h = 0 is explicitly excluded when calculating the limit

lim_(h -> 0) [ f(x+h)-f(x) ] / h ,

idiot! (Hint: [ f(x+h)-f(x) ] / h is not defined for h = 0.)

What a moron!

Alan Mackenzie

unread,
Jan 11, 2019, 2:58:03 PM1/11/19
to
Jew Lover <thenewc...@gmail.com> wrote:

> You can't refute the New Calculus, for in order to do so, you would
> have to reject sound geometry.

Seeing as how you won't or can't even say what this "New Calculus" is, or
what it's for, there doesn't seem to be anything worth refuting. Don't
try going into advertising for a living; it's not your strong point
(assuming you've got one).

> In your bogus calculus, you do set h = 0 even though you have produced
> a whole lot of hand waving arguments that do you no good whatsoever.

No. I asked you yesterday how well you understood the formulation of a
derivative as a limit, and you said "better than me". It's now clear you
barely understand it at all, and haven't a clue with respect to its
rigorous foundation. You clearly belong to the mathematically less
sophisticated part of the world's population.

[ .... ]

> Therefore, you have NO systematic way of finding the derivative. Your
> first principles method is a load of crap.

If you really did have something new to offer, you wouldn't be needing
all this sweary language.

Jew Lover

unread,
Jan 11, 2019, 4:59:18 PM1/11/19
to
On Friday, 11 January 2019 14:58:03 UTC-5, Alan Mackenzie wrote:
> Jew Lover <thenewc...@gmail.com> wrote:
>
> > You can't refute the New Calculus, for in order to do so, you would
> > have to reject sound geometry.
>
> Seeing as how you won't or can't even say what this "New Calculus" is

<crap>

What a crank you are. I referred you to a publication and you said "No thanks."

Yes, you are a fucking moron. Shut up already.

Jew Lover

unread,
Jan 11, 2019, 5:04:09 PM1/11/19
to
On Friday, 11 January 2019 11:09:09 UTC-5, Me wrote:
> On Friday, January 11, 2019 at 4:47:37 PM UTC+1, Jew Lover wrote:
> > On Friday, 11 January 2019 07:37:09 UTC-5, Me wrote:
> > > On Friday, 11 January 2019 at 12:04:13 UTC+1, Jew Lover wrote:
> > > > On Friday, 11 January 2019 02:13:50 UTC-5, Me wrote:
> > > > >
> > > > > Actually, ANY c e IR, c > 0 will do.
> > > > >
> > > > Except 0.
> > > >
> > > Oh, so c > 0 allows for c = 0 in your "New Calculus"? Fascinating!
> > >
> > You can't <bla>
>
> Yeah, whatever.
>
> > In calculus, you do set h = 0
>
> Nope! h = 0 is explicitly excluded when calculating the limit
>
> lim_(h -> 0) [ f(x+h)-f(x) ] / h ,
>
> idiot! (Hint: [ f(x+h)-f(x) ] / h is not defined for h = 0.)
>

Quite telling how you failed to address any of the following. Chuckle.

You can't refute the New Calculus, for in order to do so, you would have to reject sound geometry.

In your bogus calculus, you do set h = 0 even though you have produced a whole lot of hand waving arguments that do you no good whatsoever.

Jew Lover

unread,
Jan 11, 2019, 5:17:20 PM1/11/19
to
On Friday, 11 January 2019 11:09:09 UTC-5, Me wrote:
> On Friday, January 11, 2019 at 4:47:37 PM UTC+1, Jew Lover wrote:
> > On Friday, 11 January 2019 07:37:09 UTC-5, Me wrote:
> > > On Friday, 11 January 2019 at 12:04:13 UTC+1, Jew Lover wrote:
> > > > On Friday, 11 January 2019 02:13:50 UTC-5, Me wrote:
> > > > >
> > > > > Actually, ANY c e IR, c > 0 will do.
> > > > >
> > > > Except 0.
> > > >
> > > Oh, so c > 0 allows for c = 0 in your "New Calculus"? Fascinating!
> > >
> > You can't <bla>
>
> Yeah, whatever.
>
> > In calculus, you do set h = 0
>
> Nope! h = 0 is explicitly excluded when calculating the limit
>
> lim_(h -> 0) [ f(x+h)-f(x) ] / h ,
>
> idiot! (Hint: [ f(x+h)-f(x) ] / h is not defined for h = 0.)

In your bogus calculus, you do set h = 0 even though you have produced a whole lot of hand waving arguments that do you no good whatsoever.

Me

unread,
Jan 11, 2019, 5:56:09 PM1/11/19
to
On Friday, January 11, 2019 at 11:04:09 PM UTC+1, Jew Lover wrote:
> On Friday, 11 January 2019 11:09:09 UTC-5, Me wrote:
> > On Friday, January 11, 2019 at 4:47:37 PM UTC+1, Jew Lover wrote:
> > > On Friday, 11 January 2019 07:37:09 UTC-5, Me wrote:
> > > > On Friday, 11 January 2019 at 12:04:13 UTC+1, Jew Lover wrote:
> > > > > On Friday, 11 January 2019 02:13:50 UTC-5, Me wrote:
> > > > > >
> > > > > > Actually, ANY c e IR, c > 0 will do.
> > > > > >
> > > > > Except 0.
> > > > >
> > > > Oh, so c > 0 allows for c = 0 in your "New Calculus"? Fascinating!
> > > >
> > > You can't <bla>
> >
> > Yeah, whatever.
> >
> > > In calculus, you do set h = 0
> >
> > Nope! h = 0 is explicitly excluded when calculating the limit
> >
> > lim_(h -> 0) [ f(x+h)-f(x) ] / h ,
> >
> > idiot! (Hint: [ f(x+h)-f(x) ] / h is not defined for h = 0.)
> >
>
> Quite telling <bla>

If you say so.

> In calculus, you do set h = 0 <bla>

No. See above.

Me

unread,
Jan 11, 2019, 5:56:52 PM1/11/19
to
On Friday, January 11, 2019 at 11:17:20 PM UTC+1, Jew Lover wrote:

> In calculus, you do set h = 0

No.

Jew Lover

unread,
Jan 11, 2019, 7:46:55 PM1/11/19
to
On Friday, 11 January 2019 17:56:52 UTC-5, Me wrote:
> On Friday, January 11, 2019 at 11:17:20 PM UTC+1, Jew Lover wrote:
>
> > In calculus, <crap>

In your bogus calculus, you DO set h = 0 even though you have produced a whole lot of hand waving arguments that do you no good whatsoever.

Me

unread,
Jan 11, 2019, 7:53:02 PM1/11/19
to
On Saturday, January 12, 2019 at 1:46:55 AM UTC+1, Jew Lover wrote:

> In calculus, you DO set h = 0

No, idiot.

Jew Lover

unread,
Jan 12, 2019, 2:22:09 AM1/12/19
to
You can't deny it, you moron:

f(x,h)= [ f(x+h)-f(x) ] / h

f'(x) = f'(x) + Q(x,h) which is only possible if Q(x,h)=0 meaning h=0.

Therefore, you have NO systematic way of finding the derivative. Your first principles method is a load of crap.

Epsilon-delta arguments are "verifinitions", not definitions, and what is more, they are circular because you need to know the limit before you can show that your first principles guess is true.

Chuckle. Yes, you are stupid beyond belief.

Alan Mackenzie

unread,
Jan 12, 2019, 8:19:22 AM1/12/19
to
Jew Lover <thenewc...@gmail.com> wrote:
> On Friday, 11 January 2019 14:58:03 UTC-5, Alan Mackenzie wrote:
>> Jew Lover <thenewc...@gmail.com> wrote:

>> > You can't refute the New Calculus, for in order to do so, you would
>> > have to reject sound geometry.

>> Seeing as how you won't or can't even say what this "New Calculus" is

> <crap>

> What a crank you are. I referred you to a publication and you said "No
> thanks."

Yes. I'm inviting you to try to persuade me to look at it. Right at the
moment, it just seems like a waste of time. The said publication was
written by somebody who doesn't even understand the concept of a limit in
maths, and likely contains offensive material.

Like I said, it seems like it has nothing to offer, and you can't even
describe in a paragraph what "New Calculus" is. Why should anybody waste
their time with it?

> Yes, you are a fucking moron. Shut up already.

And that really motivates me to go and read more of the same.

Have a good Saturday.

Me

unread,
Jan 12, 2019, 8:47:36 AM1/12/19
to
On Saturday, January 12, 2019 at 8:22:09 AM UTC+1, Jew Lover wrote:

> f(x,h) = [ f(x+h)-f(x) ] / h

No, you can't "do" that, since at the rhs of the equation f is a function with just ONE argument while at the lhs of the equation f is a function with TWO arguments. That's just NONSENSE.

Actually, we have:

f'(x) = lim_(h->0) [ f(x+h)-f(x) ] / h

Hint: The variable "h" is BOUND in the formula "lim_(h->0) [ f(x+h)-f(x) ] / h", hence it's not a "free variable" in it. In other words, you can't "access" "h" from the outside of this formula (as if it were a FREE variable).

See: https://en.wikipedia.org/wiki/Free_variables_and_bound_variables

READ IT, IDIOT!

It's ALWAYS THE SAME:

"Cranks who contradict some mainstream opinion in some highly technical field, (e.g. mathematics [...]) frequently:

1. exhibit a marked lack of technical ability,
2. misunderstand or fail to use standard notation and terminology,
3. ignore fine distinctions which are essential to correctly
understand mainstream belief."

(Here you are on par with Mückenheim.)

Now you consider the equation:

> f'(x) = f'(x) + Q which is only possible if Q = 0

Yeah, that's right. :-)

For all r,c e IR: if r + c = r, then c = 0.

Jew Lover

unread,
Jan 12, 2019, 9:24:55 AM1/12/19
to
On Saturday, 12 January 2019 08:47:36 UTC-5, Me wrote:
> On Saturday, January 12, 2019 at 8:22:09 AM UTC+1, Jew Lover wrote:
>
> > f(x,h) = [ f(x+h)-f(x) ] / h
>
> No, you can't "do" that.

Look moron. I can and did do that and what's more, it's completely correct.

> Since at the rhs of the equation f is a function with just ONE argument while at the lhs of the equation f is a function with TWO arguments. That just NONSENSE.

Bullshit. Both sides take two arguments: x and h. It does not matter that one of them remains constant. If you follow that misguided logic, it also applies to your bullshit derivative f '(x) = lim(h->0) [ f(x+h)-f(x) ] / h which doesn't even include the h on the left hand side. You are one sorry dumb cunt and I get a lot of pleasure out of pissing and shitting all over you. It's a sorry fetish, I know. Chuckle.

>
> Actually, we have:
>
> f'(x) = lim_(h->0) [ f(x+h)-f(x) ] / h
>
> Hint: The variable "h" is BOUND in the formula "lim_(h->0) [ f(x+h)-f(x) ] / h", hence it's not a "free variable" in it. With other words, you can't "access" "h" from the outside of this formula (as if it were a FREE variable).

BWAAAAAAA HAAAAAAA HAAAAAAAA. That is hilarious. Fail. Even more hilarious is that you now have an entry on the Moronica about it. I will destroy you, you dumb bastard, and everyone like you, when the world realises that I am the real mathematician and ALL of you are just fake, jealous, hateful cunts!

>
> See: http

<mind boggling shit>

Jew Lover

unread,
Jan 12, 2019, 9:30:19 AM1/12/19
to
On Saturday, 12 January 2019 08:19:22 UTC-5, Alan Mackenzie wrote:
> Jew Lover <thenewc...@gmail.com> wrote:
> > On Friday, 11 January 2019 14:58:03 UTC-5, Alan Mackenzie wrote:
> >> Jew Lover <thenewc...@gmail.com> wrote:
>
> >> > You can't refute the New Calculus, for in order to do so, you would
> >> > have to reject sound geometry.
>
> >> Seeing as how you won't or can't even say what this "New Calculus" is
>
> > <crap>
>
> > What a crank you are. I referred you to a publication and you said "No
> > thanks."
>
> Yes. I'm inviting you to try to persuade me to look at it.

Considering that you are a nobody, I simply treat you with the disdain that you so deserve. Chuckle.

<too much nonsense to read>

Jew Lover

unread,
Jan 12, 2019, 9:39:55 AM1/12/19
to
On Saturday, 12 January 2019 08:19:22 UTC-5, Alan Mackenzie wrote:
> Jew Lover <thenewc...@gmail.com> wrote:
> > On Friday, 11 January 2019 14:58:03 UTC-5, Alan Mackenzie wrote:
> >> Jew Lover <thenewc...@gmail.com> wrote:
>
> >> > You can't refute the New Calculus, for in order to do so, you would
> >> > have to reject sound geometry.
>
> >> Seeing as how you won't or can't even say what this "New Calculus" is
>
> > <crap>
>
> > What a crank you are. I referred you to a publication and you said "No
> > thanks."
>
> Yes. I'm inviting you to try to persuade me to look at it.

Chuckle. I couldn't care less whether you looked at it or not. You are a nobody.

<No time to read crap>

Me

unread,
Jan 12, 2019, 10:13:36 AM1/12/19
to
On Saturday, January 12, 2019 at 3:24:55 PM UTC+1, Jew Lover wrote:
> On Saturday, 12 January 2019 08:47:36 UTC-5, Me wrote:
> > On Saturday, January 12, 2019 at 8:22:09 AM UTC+1, Jew Lover wrote:
> > >
> > > f(x,h) = [ f(x+h)-f(x) ] / h
> > >
> > No, you can't "do" that [since it isn't correct].
> >
> Look moron [...] it's completely correct.

Nope, it isn't. Hint:

> > at the rhs of the equation f is a function with ONE argument,
> > at the lhs of the equation f is a function with TWO arguments.
> > That's just NONSENSE.
> >
> Both sides take two arguments: x and h.

I didn't talk about the rhs/lhs formulas, but about "f" (occurring in the rhs formula and occurring in the lhs formula).

You claimed that you are/were a programmer. So do you think that the strings "foo(x)" and "foo(x, y)" refer to the same C function? Sure?

Ever heard anything about the "signature" of a function?

> It does not matter that one of them remain constant.

I didn't claim otherwise. Actually, I didn't even mention the term "constant".

Now you mention that

> f'(x) = lim_(x->h) (f(x+h)-f(x)) / h doesn't even include the h on the left
> hand side.

Right. I already told you the reason for this fact:

> > The variable "h" is BOUND in the formula "lim_(h->0) [ f(x+h)-f(x) ] / h",
> > hence it's not a "free variable" in it. With other words, you can't
> > "access" "h" from the outside of this formula (as if it were a FREE
> > variable).

That's why the expression "f'(x)" doesn't contain a reference to "h".

Hint: Same situation here:

c := lim_(h->0) h + 1

Now c is a constant, actually c = 1. To write "c(h)" at the lhs would be nonsense. c does not "depend" on "h".

See: https://en.wikipedia.org/wiki/Free_variables_and_bound_variables

Jew Lover

unread,
Jan 12, 2019, 10:27:33 AM1/12/19
to
On Saturday, 12 January 2019 10:13:36 UTC-5, Me wrote:
> On Saturday, January 12, 2019 at 3:24:55 PM UTC+1, Jew Lover wrote:
> > On Saturday, 12 January 2019 08:47:36 UTC-5, Me wrote:
> > > On Saturday, January 12, 2019 at 8:22:09 AM UTC+1, Jew Lover wrote:
> > > >
> > > > f(x,h) = [ f(x+h)-f(x) ] / h
> > > >
> > > No, you can't <shit>

Afraid that conjuring a new ill-formed concept such as "free variable" does not help your cause. Chuckle. You just keep digging yourself in deeper and deeper.

Jew Lover

unread,
Jan 12, 2019, 10:29:34 AM1/12/19
to
On Saturday, 12 January 2019 10:13:36 UTC-5, Me wrote:
> On Saturday, January 12, 2019 at 3:24:55 PM UTC+1, Jew Lover wrote:
> > On Saturday, 12 January 2019 08:47:36 UTC-5, Me wrote:
> > > On Saturday, January 12, 2019 at 8:22:09 AM UTC+1, Jew Lover wrote:
> > > >
> > > > f(x,h) = [ f(x+h)-f(x) ] / h
> > > >
> > > No, you can't <shit>

I can actually provide numerous reasons for why everything you write is bullshit, but I am concerned you might create a new Wikipedia entry to give some credibility to your rot. Chuckle.

So no. I will refrain from showing you what a moron you are.

This time: Answer not a fool according to his folly, lest he ...

Jew Lover

unread,
Jan 12, 2019, 10:37:51 AM1/12/19
to
" f'(x) = Lim (h->0) [f(x+h)-f(x)] / h

x is a free variable and h is a bound variable; consequently the value of this expression depends on the value of x, but there is nothing called h on which it could depend."

1. Does not the limit depend on h? If h approached anything else BUT ZERO, would that expression still be true? Ha, ha.

2. Does not the limit depend on EVERY possible value of h (except the one that actually matters, i.e. ZERO!) because it is in fact the limit of the FINITE DIFFERENCE quotients that is f '(x)?

Chuckle.

Jew Lover

unread,
Jan 12, 2019, 10:43:29 AM1/12/19
to
I loved this one:

"A bound variable is a variable that was previously free, but has been bound to a specific value or set of values called domain of discourse or universe."
- http://en.wikipedia.org/wiki/Free_variables_and_bound_variables#CITEREFThompson1991

Gee, so if h is bound, it means that it can be anything except 0?

Time to create a new Wikipedia Moronica entry:

A bound variable is one that has certain restrictions.

See, I am not such a bad guy after all, am I?

The truth is that you fucking morons are so wrong that no matter how many edits, updates and new articles you produce on the Moronica, you will ALWAYS be wrong. Why? Because you are morons.

Me

unread,
Jan 12, 2019, 10:48:22 AM1/12/19
to
On Saturday, January 12, 2019 at 4:27:33 PM UTC+1, Jew Lover wrote:
>
> You can't deny it, you moron:
>
> f(x,h) = [ f(x+h)-f(x) ] / h

I already told you that this is nonsense. Hence I won't explain it again.

> [bullshit]

Jew Lover

unread,
Jan 12, 2019, 10:52:00 AM1/12/19
to
Read as: Oh shit! I am really in the piss hole. I give up. There is no way I can refute John Gabriel.

Chuckle.

Me

unread,
Jan 12, 2019, 10:54:05 AM1/12/19
to
Yeah, whatever.

Jew Lover

unread,
Jan 12, 2019, 10:59:20 AM1/12/19
to
On Saturday, 12 January 2019 10:52:30 UTC-5, Me wrote:
> On Saturday, January 12, 2019 at 4:37:51 PM UTC+1, Jew Lover wrote:
>
> > "f'(x) = lim_(h->0) [f(x+h)-f(x)] / h
> >
> > x is a free variable and h is a bound variable; consequently the value of
> > this expression depends on the value of x, but there is nothing called h
> > on which it could depend."
>
> Exactly.
>
> > 1. [...] If h approached anything else BUT ZERO ...
>
> Then we would not get the limit any more (by definition).

Therefore, the limit depends on h, doesn't it, you ignoramus? Not even an attempt at answering all the other questions, eh? Chuckle.

You are inferior to me in all respects. Never, ever forget this. You are privileged that I even bother responding to your shit. You are no doubt incorrigibly stupid with an IQ no greater than 40, if that!

Me

unread,
Jan 12, 2019, 11:00:24 AM1/12/19
to
On Saturday, January 12, 2019 at 4:43:29 PM UTC+1, Jew Lover wrote:
> On Saturday, 12 January 2019 10:37:51 UTC-5, Jew Lover wrote:
> >
> > "f'(x) = lim_(h->0) [f(x+h)-f(x)] / h
> >
> > x is a free variable and h is a bound variable; consequently the value of
> > this expression depends on the value of x, but there is nothing called h
> > on which it could depend."

Right.

> "A bound variable is a variable that was previously free, but has been bound
> to a specific value or set of values called domain of discourse or universe."

Well, actually, it is "bound" by a certain "expression". In this case, "h" in the formula "[f(x+h)-f(x)] / h" is bound by the expression "lim_(h->0)".

> Gee, so if h is bound

Right. That's why it does not occur in the lhs formula, i.e. in "f'(x)".

Me

unread,
Jan 12, 2019, 11:16:56 AM1/12/19
to
On Saturday, January 12, 2019 at 4:59:20 PM UTC+1, Jew Lover wrote:

> Therefore, the limit depends on h, doesn't it?

I don't think so.

Consider:

c := lim_(h->0) h + 1

Now you may claim that c is the limit of h + 1 as h approaches 0 (or something like that).

Can't see how this (constant) value "depends on h". Sorry.

Now we could introduce a PARAMETER, say "p", with p e IR. Then we might consider

c(p) := lim_(h->p) h + 1

and claim that c(p) is the limit of h + 1 as h approaches p.

Now here I can see (and hence say) that "lim_(h->p) h + 1" or "c(p)" depends on p. No? Actually, it DOES depend on p.

Hint: c(0) differs from c(1).
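
A quick sympy check of the two cases above (Python; assuming sympy is installed):

import sympy as sp

h, p = sp.symbols('h p')

c = sp.limit(h + 1, h, 0)      # c := lim_(h->0) h + 1
print(c)                       # 1
print(c.free_symbols)          # set() -- a constant, no dependence on h

c_p = sp.limit(h + 1, h, p)    # c(p) := lim_(h->p) h + 1, with parameter p
print(c_p)                     # p + 1
print(c_p.free_symbols)        # {p} -- depends on the parameter p, still not on h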

gabriel...@gmail.com

unread,
Jan 12, 2019, 11:16:56 AM1/12/19
to
On Saturday, 12 January 2019 11:00:24 UTC-5, Me wrote:
> On Saturday, January 12, 2019 at 4:43:29 PM UTC+1, Jew Lover wrote:
> > On Saturday, 12 January 2019 10:37:51 UTC-5, Jew Lover wrote:
> > >
> > > "f'(x) = lim_(h->0) [f(x+h)-f(x)] / h
> > >
> > > x is a free variable and h is a bound variable; consequently the value of
> > > this expression depends on the value of x, but there is nothing called h
> > > on which it could depend."
>
> Right.
>
> > "A bound variable is a variable that was previously free, but has been bound
> > to a specific value or set of values called domain of discourse or universe."

Bwaaa haaaaa haaaaaa. No idiot. h can take on ANY value close to x, except 0, which means it very much depends on EVERY value in your bullshit "universe". Chuckle.

>
> Well, actually, it is "bound" by a certain "expression". In this case, "h" in the formula "[f(x+h)-f(x)] / h" is bound by the expression "lim_(h->0)".

Rubbish. It is restricted by a rule: h cannot be 0. And yet, without breaking that very rule EVERY time using your bogus "first principles", you have NO systematic way of determining the derivative so that you can verify with your bullshit epsilonics definition which requires that you already know the derivative! Chuckle.

>
> > Gee, so if h is bound
>
> Right. That's why it does not occure in the lhs formula, i.e. in "f'(x)".

Nope. A variable is a variable. Period. There is no such thing as "free" and "bound" variable. In the expression \sum_{k=1}^n, only n is a variable. k is NOT a variable. You are by far too stupid to realise these things and I hold out little hope you ever will.

In your bogus calculus, you DO set h = 0 even though you have produced a whole lot of hand waving arguments that do you no good whatsoever.

Me

unread,
Jan 12, 2019, 11:20:19 AM1/12/19
to
On Saturday, January 12, 2019 at 4:37:51 PM UTC+1, Jew Lover wrote:

> "f'(x) = lim_(h->0) [f(x+h)-f(x)] / h
>
> x is a free variable and h is a bound variable; consequently the value of
> this expression depends on the value of x, but there is nothing called h
> on which it could depend."

Exactly.

> 1. [...] If h approached anything else BUT ZERO ...

Then we would not get the derivation of f any more (by definition).

Me

unread,
Jan 12, 2019, 11:24:08 AM1/12/19
to
On Saturday, January 12, 2019 at 4:59:20 PM UTC+1, Jew Lover wrote:
> On Saturday, 12 January 2019 10:52:30 UTC-5, Me wrote:
> >
> > On Saturday, January 12, 2019 at 4:37:51 PM UTC+1, Jew Lover wrote:
> > >
> > > "f'(x) = lim_(h->0) [f(x+h)-f(x)] / h
> > >
> > > x is a free variable and h is a bound variable; consequently the value of
> > > this expression depends on the value of x, but there is nothing called h
> > > on which it could depend."
> > >
> > Exactly.
> > >
> > > 1. [...] If h approached anything else BUT ZERO ...
> > >
> > Then we would not get the limit any more (by definition).

Sorry about that. Should read:

> > Then we would not get the derivative of f any more (by definition).

It seems that you actually have learned something concerning the (essential) difference between free and bound variables. Well done, man!

Me

unread,
Jan 12, 2019, 11:44:28 AM1/12/19
to
On Saturday, January 12, 2019 at 5:16:56 PM UTC+1, gabriel...@gmail.com wrote:

> h can take on ANY value close to x, except 0

Exactly. :-)

Hint: This contradicts your former claim:

"In your bogus calculus, you do set h = 0"

I'm glad that we agree now: No, we don't set h = 0 (in this context).

Now...

> > Actually, [a bound variable] is "bound" by a certain "expression".
> > In this case, "h" in the formula "[f(x+h)-f(x)] / h" is bound by
> > the expression "lim_(h->0)".
> >
> h cannot be 0.

That's a DIFFERENT matter, but still true. Right!

> A variable is a variable.

Sure. But there are two VERY DIFFERENT "types" of variables, see:
https://en.wikipedia.org/wiki/Free_variables_and_bound_variables

> In the expression "sum_{k=1}^n", only "n" is a variable.

Nope. We do have two variables here: "k" and "n" (if "n" actually is a variable).

> "k" is NOT a variable.

Sure it is. :-)

Hint: Consider the expression "k" where "k" is just a variable which occurs free in it.

Now if we consider the expression

sum_{k=1}^10 k

"k" is STILL a variable. As you can see, it now occures at two "locations". Moreover NOW it is bound (in this expression) by the expression "sum_{k=1}^10" in front of the expression "k".

Hint: The result, again, is a certain NUMBER. This number does not "depend on k". But "the result" of this summation may, say, depend on "n" with n e IN:

sum_{k=1}^n k .

HERE (in this expression) "n" is a free variable.

HENCE we may as well define/introduce the term "s(n)" and write:

s(n) := sum_{k=1}^n k .
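
The same point, sketched with sympy in Python (assuming it is available): the summation index k is bound, so the result depends only on n.

import sympy as sp

k, n = sp.symbols('k n', integer=True, positive=True)

s = sp.Sum(k, (k, 1, n)).doit()   # s(n) := sum_{k=1}^n k
print(s)                          # n**2/2 + n/2, i.e. n*(n + 1)/2
print(s.free_symbols)             # {n} -- k is bound by the summation sign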

Me

unread,
Jan 12, 2019, 11:54:33 AM1/12/19
to
On Saturday, January 12, 2019 at 5:44:28 PM UTC+1, Me wrote:
> On Saturday, January 12, 2019 at 5:16:56 PM UTC+1, gabriel...@gmail.com wrote:
> >
> > In the expression "sum_{k=1}^n", only "n" is a variable.
> >
> Nope. We do have two variables here: "k" and "n" (if "n" actually is a
> variable).
> >
> > "k" is NOT a variable.
> >
> Sure it is. :-)
>
> Hint: Consider the expression "k" where "k" is just a variable which occurs
> free in it.
>
> Now if we consider the expression
>
> sum_{k=1}^10 k
>
> "k" is STILL a variable. As you can see, it now occures at two "locations".
> Moreover NOW it is bound (in this expression) by the expression
> "sum_{k=1}^10" in front of the expression "k".

Check the first example here:
https://en.wikipedia.org/wiki/Free_variables_and_bound_variables#Examples

Me

unread,
Jan 12, 2019, 12:10:25 PM1/12/19
to
On Saturday, January 12, 2019 at 5:44:28 PM UTC+1, Me wrote:
> On Saturday, January 12, 2019 at 5:16:56 PM UTC+1, gabriel...@gmail.com wrote:
> >
> > h can take on ANY value close to x, except 0
> >
> Exactly. :-)
> >
> > h cannot be 0.

Right. I'm glad we could clarify this misconception on your part.

Peter Percival

unread,
Jan 12, 2019, 3:17:52 PM1/12/19
to
Jew Lover wrote:
> " f'(x) = Lim (h->0) [f(x+h)-f(x)] / h
>
> x is a free variable and h is a bound variable; consequently the value of this expression depends on the value of x, but there is nothing called h on which it could depend."

One way of seeing that it doesn't depend on h is to change 'h' to
something else. The above statement is the same as this one:

f'(x) = Lim (k->0) [f(x+k)-f(x)] / k

in which 'h' has been replaced by 'k'. [One cannot change things ad
lib, e.g., 'h' cannot be changed to 'x'.]
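
For what it's worth, a one-off Python/sympy sketch of this renaming (x^2 is used only as an example function):

import sympy as sp

x, h, k = sp.symbols('x h k')
f = lambda t: t**2                                 # example function

with_h = sp.limit((f(x + h) - f(x)) / h, h, 0)
with_k = sp.limit((f(x + k) - f(x)) / k, k, 0)

print(with_h, with_k)      # 2*x 2*x -- renaming the bound variable changes nothing
print(with_h == with_k)    # True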

>
> 1. Does not the limit depend on h? If h approached anything else BUT ZERO

That's right. It *does* depend on 0 (which isn't h).

> , would that expression still be true? Ha, ha.
>
> 2. Does not the limit depend on EVERY possible value of h (except the one that actually matters, i.e. ZERO!) because it is in fact the limit of the FINITE DIFFERENCE quotients that is f '(x)?
>
> Chuckle.
>


--
"He who will not reason is a bigot;
he who cannot is a fool;
he who dares not is a slave."
- Sir William Drummond

Dan Christensen

unread,
Jan 12, 2019, 3:57:51 PM1/12/19
to
Geez, Troll Boy, if you couldn't get any converts to your Wacky New Calclueless at YT, I would think your prospects at sci.math are 100 times worse. Everyone here but BKK is wise to you, and he is totally f---ing insane!


Dan

Jew Lover

unread,
Jan 12, 2019, 9:45:04 PM1/12/19
to
On Saturday, 12 January 2019 15:17:52 UTC-5, Peter Percival wrote:
> Jew Lover wrote:
> > " f'(x) = Lim (h->0) [f(x+h)-f(x)] / h
> >
> > x is a free variable and h is a bound variable; consequently the value of this expression depends on the value of x, but there is nothing called h on which it could depend."
>
> One way of seeing that it doesn't depend on h is to change 'h' to
> something else. The above statement is the same as this one:
>
> f'(x) = Lim (k->0) [f(x+k)-f(x)] / k
>
> in which 'h' has been replaced by 'k'. [One cannot change things ad
> lib, e.g., 'h' cannot be changed to 'x'.]

It is not talking about changing the name of h, but rather its value. Since the value of h changes in such a way that the finite difference quotient approaches a certain limit, it is pretty clear that the limit depends on h for it (the limit) cannot be realised if h is not changing in a certain way, that is, getting smaller.

>
> >
> > 1. Does not the limit depend on h? If h approached anything else BUT ZERO
>
> That's right. It *does* depend on 0 (which isn't h).

In fact it is h, because in your "first principles method" which is the only way you have of finding the derivative, you set h equal to 0.

f(x,h)= [ f(x+h)-f(x) ] / h

f'(x) = f'(x) + Q(x,h) which is only possible if Q(x,h)=0 meaning h=0.

Therefore, you have NO systematic way of finding the derivative. Your first principles method is a load of crap.

Epsilon-delta arguments are "verifinitions", not definitions, and what is more, they are circular because you need to know the limit before you can show that your first principles guess is true.

>
> > , would that expression still be true? Ha, ha.
> >
> > 2. Does not the limit depend on EVERY possible value of h (except the one that actually matters, i.e. ZERO!) because it is in fact the limit of the FINITE DIFFERENCE quotients that is f '(x)?
> >
> > Chuckle.
> >
>
>
> --
> "He who will not reason is a bigot;
> he who cannot is a fool;
> he who dares not is a slave."
> - Sir William Drummond

Perhaps a more serious attempt? Chuckle.

Jew Lover

unread,
Jan 12, 2019, 9:49:53 PM1/12/19
to
On Thursday, 10 January 2019 07:59:49 UTC-5, Jew Lover wrote:
> Mainstream mathematics academics are incorrigibly stupid creatures. They will tell you that a straight line function has a derivative and change the definition of derivative in terms of their bullshit "analytic derivative".
>
> Let's see how the orangutans fail!
>
> The slope of a straight line is given by:
>
> [ f(x+h)-f(x) ] / h [A]
>
> The derivative of a straight line is given by:
>
> Lim (h -> 0) [ f(x+h)-f(x) ] / h [B]
>
> Therefore, since a straight line's derivative is its slope, we have:
>
> Lim (h -> 0) [ f(x+h)-f(x) ] / h = [ f(x+h)-f(x) ] / h
>
> Can you get any dumber? I have produced a New Calculus which has no ill-formed concepts and have solved the tangent line problem and removed ALL ill-formed concepts such as infinity, infinitesimals and limit theory from differential and integral calculus.
>
> A derivative can be found geometrically using Ancient Greek mathematics: https://lnkd.in/dhN6Gzw
>
> Download the most important mathematics book ever written here (it's free!!!):
>
> https://lnkd.in/dEkeshd

See, it's not possible to refute facts. You can make up rules/decrees, but these have no place in mathematics. Only facts and logic are allowed in mathematics. Ill-formed definitions are rejected.

1: f(x,h)= [ f(x+h)-f(x) ] / h

2: f'(x) = f'(x) + Q(x,h) which is only possible if Q(x,h)=0 meaning h=0.

666

unread,
Jan 13, 2019, 1:35:02 AM1/13/19
to
On Friday, 11 January 2019 at 18:09:09 UTC+2, Me wrote:
> On Friday, January 11, 2019 at 4:47:37 PM UTC+1, Jew Lover wrote:

> > In calculus, you do set h = 0
>
> Nope! h = 0 is explicitly excluded when calculating the limit
>
> lim_(h -> 0) [ f(x+h)-f(x) ] / h ,
>
> idiot! (Hint: [ f(x+h)-f(x) ] / h is not defined for h = 0.)


lim_(h -> 0) [ f(x+h)-f(x) ] / h = 0/0



if

Lim (h -> 0) [ f(x+h)-f(x) ] / h = [ f(x+h)-f(x) ] / h

we have

[ f(x+h)-f(x) ] / h = 0/0

and is possible only if you set h = 0

Me

unread,
Jan 13, 2019, 5:13:55 AM1/13/19
to
On Sunday, January 13, 2019 at 3:45:04 AM UTC+1, Jew Lover wrote:

> the only way you have of finding the derivative, you set h equal to 0.

Nonsense. Back at square A, crank?

Me

unread,
Jan 13, 2019, 5:21:36 AM1/13/19
to
On Sunday, January 13, 2019 at 3:49:53 AM UTC+1, Jew Lover wrote:

> The slope of a straight line is given by:
>
> [ f(x+c)-f(x) ] / c [A]

for some c e IR, c > 0.

> The derivative of a straight line is given by:
>
> lim_(h -> 0) [ f(x+h)-f(x) ] / h [B]

Right.

> Therefore, since a straight line's derivative is its slope, we have:
>
> lim_(h -> 0) [ f(x+h)-f(x) ] / h = [ f(x+c)-f(x) ] / c

for any c e IR, c > 0.

Right.

> A derivative can be found geometrically [in this case].

Right.

> Ill-formed definitions are rejected [in mathematics].

Indeed! That's the reason why we have to reject the following blunder:

> 1: f(x,h) = [ f(x+h)-f(x) ] / h

Nope. Since at the rhs of the equation, f is a function with just ONE argument while at the lhs of the equation, f is a function with TWO arguments. That's just NONSENSE.

Actually, we have:

f'(x) = lim_(h->0) [ f(x+h)-f(x) ] / h

Hint: The variable "h" is BOUND in the formula "lim_(h->0) [ f(x+h)-f(x) ] / h", hence it's not a "free variable" in it. In other words, you can't "access" "h" from the outside of this formula (as if it were a FREE variable).

See: https://en.wikipedia.org/wiki/Free_variables_and_bound_variables

Now you consider the trivial equation:

> f'(x) = f'(x) + Q which is only possible if Q = 0

Yeah, that's right. :-)

For all r,c e IR: if r + c = r, then c = 0.

Me

unread,
Jan 13, 2019, 5:36:40 AM1/13/19
to
On Sunday, January 13, 2019 at 7:35:02 AM UTC+1, 666 wrote:

> [nonsense]

Running low on medication?

Me

unread,
Jan 13, 2019, 5:48:58 AM1/13/19
to
Yesterday you knew it better:

> On Saturday, January 12, 2019 at 5:16:56 PM UTC+1, gabriel...@gmail.com
> wrote:
> >
> > h can take on ANY value close to x, except 0
> >
> > ...
> >
> > h cannot be 0.

666

unread,
Jan 13, 2019, 6:10:13 AM1/13/19
to
maybe you have never heard of limits:

Jew Lover

unread,
Jan 13, 2019, 8:14:46 AM1/13/19
to
They don't do it that way. They do it this way:

1: f(x,h)= [ f(x+h)-f(x) ] / h


2: f'(x) = f'(x) + Q(x,h)

which is only possible if Q(x,h)=0 meaning h=0. Through sheer luck, Newton knew he was finding the general derivative even though his method is kludgy.

Today's academic morons know this too and that's why they added a whole lot of limit theory in an attempt to make "rigorous" their bogus calculus.

You seem to know that the mainstream definition is shit. But you still have a problem with infinity being a junk concept.

The mainstream orangutans don't claim that h becomes 0, but in order to find f'(x), they have to set h=0. They have no systematic way of finding f'(x) as I have shown is possible in the New Calculus, the first and only rigorous formulation in human history.

Jew Lover

unread,
Jan 13, 2019, 8:17:51 AM1/13/19
to
The other problem is that f(x,h) very much depends on the value of h and introducing bullshit like irrelevant "free" and "bound" variables is an unsuccessful attempt to obfuscate.

They hate me because I instantly dismiss their bullshit. It would actually have been much easier for them to admit they are wrong first and then they might stand a chance of learning.

Jew Lover

unread,
Jan 13, 2019, 8:27:03 AM1/13/19
to
On Sunday, 13 January 2019 05:21:36 UTC-5, Me wrote:

<much whining and bla, bla>

> Now you consider the trivial equation:
>
> > f'(x) = f'(x) + Q which is only possible if Q = 0
>
> Yeah, that's right. :-)

Now think moron: What is the only way that Q(x,h) can be 0? Yes! h = 0.

It doesn't help your cause that you call h bound or free at all. Hand waving is as transparent as glass.

See, it's not possible to refute facts. You can make up rules/decrees, but these have no place in mathematics. Only facts and logic are allowed in mathematics. Ill-formed definitions are rejected.

1: f(x,h)= [ f(x+h)-f(x) ] / h

2: f'(x) = f'(x) + Q(x,h) which is only possible if Q(x,h)=0 meaning h=0.

Therefore, you have NO systematic way of finding the derivative. So far, you haven't even attempted to refute this fact. Your first principles method is a load of crap.

j4n bur53

unread,
Jan 13, 2019, 8:46:09 AM1/13/19
to
Q(x,h) := f[x,x+h] - f'(x)

Where:

f[x,y] = (f(x) - f(y)) / (x - y)   if x <> y
f[x,y] = f'(x)                     if x = y

The operator "divdiff" f[x,y] is well knowen it, has
a lot of laws that are very similar to derivative.

See for example:

Symbolic Computation of Divided Differences
W. Kahan and Richard J. Fateman - 1999
http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.38.9483&rank=1

But it's much older.

j4n bur53

unread,
Jan 13, 2019, 8:54:31 AM1/13/19
to
Corr:

D(x,h) := f[x,x+h]

Then:

f(x+h) = f(x) + D(x,h)*h

D(x,0) = f'(x)
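
A minimal numeric sketch of this operator in Python (f(x) = x^2 and its known derivative are hard-coded just for illustration; divdiff and D are the names used above):

def f(t):
    return t**2

def fprime(t):
    return 2*t

def divdiff(x, y):
    # f[x,y]: divided difference, extended by f'(x) on the diagonal x = y
    if x == y:
        return fprime(x)
    return (f(x) - f(y)) / (x - y)

def D(x, h):
    return divdiff(x, x + h)

x, h = 3.0, 0.5
print(D(x, h))                       # 6.5 = slope of the chord from 3 to 3.5
print(D(x, 0.0))                     # 6.0 = f'(3), the x = y branch
print(f(x + h), f(x) + D(x, h)*h)    # 12.25 12.25, i.e. f(x+h) = f(x) + D(x,h)*h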

j4n bur53

unread,
Jan 13, 2019, 9:01:16 AM1/13/19
to
divdiff satisfies a form of chain rule:

f(g)[x, y] = f[g(x), g(y)] g[x, y]

See section 1.4 Rules of the paper.

For x = y we get the special case:

f(g)'(x) = f'(g(x)) g'(x)

https://en.wikipedia.org/wiki/Chain_rule
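
A quick numeric check of this chain rule in Python (the concrete f, g and the points x, y below are arbitrary choices, not taken from the paper):

def divdiff(fn, x, y):
    # plain divided difference fn[x,y] for x != y
    return (fn(x) - fn(y)) / (x - y)

f = lambda t: t**3
g = lambda t: 2*t + 1
fg = lambda t: f(g(t))        # the composition f(g(t))

x, y = 1.0, 2.5
lhs = divdiff(fg, x, y)                            # f(g)[x, y]
rhs = divdiff(f, g(x), g(y)) * divdiff(g, x, y)    # f[g(x), g(y)] * g[x, y]
print(lhs, rhs)                                    # 126.0 126.0 -- they agree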

666

unread,
Jan 13, 2019, 9:13:39 AM1/13/19
to
On Sunday, 13 January 2019 at 15:14:46 UTC+2, Jew Lover wrote:
> On Sunday, 13 January 2019 06:10:13 UTC-5, 666 wrote:
> > On Sunday, 13 January 2019 at 12:36:40 UTC+2, Me wrote:
> > > On Sunday, January 13, 2019 at 7:35:02 AM UTC+1, 666 wrote:
> > >
> > > > [nonsense]
> > >
> > > Running low on medication?
> >
> > maybe you have never heard of limits:
> >
> > lim_(h -> 0) [ f(x+h)-f(x) ] / h = 0/0
>
> They don't do it that way. They do it this way:
>
> 1: f(x,h)= [ f(x+h)-f(x) ] / h

no, it makes no sense. [f(x+h)-f(x)]/h is a finite difference quotient:

Delta f/ Delta h = [f(x+h)-f(x)]/h


>
>
> 2: f'(x) = f'(x) + Q(x,h)
>
> which is only possible if Q(x,h)=0 meaning h=0. Through sheer luck, Newton knew he was finding the general derivative even though his method is kludgy.

obviously f'(x) = f'(x)

so what?


>
> Today's academic morons know this too and that's why they added a whole lot of limit theory in an attempt to make "rigorous" their bogus calculus.

there's nothing wrong with the limit theory if properly understood.

>
> You seem to know that the mainstream definition is shit.

no, it is not shit if properly understood. But no-one seems to be able to take the limit:

lim_(h -> 0) [f(x+h)-f(x)]/h = 0/0


> But you still have a problem with infinity being a junk concept.

no, it is not junk. It is a basic mathematical concept.
Until you learn to handle it, you can't call yourself a mathematician.


> The mainstream orangutans don't claim that h becomes 0, but in order to find f'(x), they have to set h=0.

if

Lim (h->0) [f(x+h)-f(x)]/h = [f(x+h)-f(x)]/h

we have

[f(x+h)-f(x)]/h = 0/0

and is possible only if you set h = 0


>They have no systematic way of finding f'(x) as I have shown is possible in the New Calculus, the first and only rigorous formulation in human history.

You never calculate the slope of the tangent line. You end up calculating
the slope of a non-parallel secant line. The reason is:
your rejection of infinitesimals.

Me

unread,
Jan 13, 2019, 9:41:49 AM1/13/19
to
On Sunday, January 13, 2019 at 2:14:46 PM UTC+1, Jew Lover wrote:

We started with

f(x) := m*x + d

where m,d e IR. (Then f'(x) is defined for all x e IR.)

Now you consider the following equation:

> f'(x) = f'(x) + Q(x,h) [for all x,h e IR]
>
> [This] is only possible if Q(x,h) = 0 [for all x,h e IR].

Indeed!

> meaning h = 0.

Of course not, idiot.

Actually, it suffices that Q(x,h) = 0 for all x,h e IR.

Hence Q(x,h) may be

x*(h - h)

for example. Then the equation

f'(x) = f'(x) + Q(x,h)

holds for ALL x e IR and _all_ h e IR. Hence h = 0 does NOT follow (contrary to your claim).

j4n bur53

unread,
Jan 13, 2019, 9:42:21 AM1/13/19
to
But divdiff could also be helpful with your Geogebra
tricks, bird brain John Garbageiel. Let's say we
want a secant parallel to f'(x). Then we have to

solve the following:

f[x-m,x+n] = f'(x)

Which has a solution sometimes, but not always
for, let's say, a given m. For example, inflection points
cause problems with your secant nonsense.

But if we have solved the above equation, then
obviously, setting m,n = 0,0 in divdiff gives
the following:

f[x-m,x+n] |_(m,n=0,0) = f[x,x] = f'(x)

This is independent of whether you have solved
the equation or not, because the equation is
obviously satisfied for m,n = 0,0.

Lets say you have solved the equation:

f[x-m,x+n] = f'(x)

Then your secant would trivially read:

s(x_) = f[x-m,x+n]*(x_-(x-m))+f(x-m)

s(x-m) = f(x-m)

s(x+n) = f(x+n)

s'(x_) = f[x-m,x+n] = f'(x)

But how to solve the equation?

Me

unread,
Jan 13, 2019, 9:50:52 AM1/13/19
to
On Sunday, January 13, 2019 at 2:27:03 PM UTC+1, Jew Lover wrote:
> On Sunday, 13 January 2019 05:21:36 UTC-5, Me wrote:
> >
> > Now you considert the trivial equation:
> > >
> > > f'(x) = f'(x) + Q which is only possible if Q = 0
> > >
> > Yeah, that's right. :-)
> >
> Now think moron: What is the only way that Q(x,h) can be 0?

Depends. If Q(x,h) = x*(h - h) there are INFINITELY MANY ways. :-)

Hint: In this case

f'(x) = f'(x) + Q(x,h)

for ALL h e IR and ALL x e D(f').

> It doesn't help your cause that you call h bound or free at all.

In this case "h" should be free in Q(x,h) by "convention". If "h" weren't free in Q(x,h) we shouldn't use it as a parameter in the first place.

j4n bur53

unread,
Jan 13, 2019, 9:52:01 AM1/13/19
to
But how to play your Geogebra demonstration tricks?
You need to solve:

f[x-m,x+n] = (f(x-m)-f(x+n))/(x-m-x-n)

= (f(x+n)-f(x-m))/(m+n) = f'(x)

Hence:

f(x+n)-f(x-m) = f'(x)*(m+n)

Hence:

f(x+n)-f'(x)*n = f(x-m)+f'(x)*m

So you can look at this function:

g(n) = f(x+n) - f'(x)*n

If you can solve it, take a root different
from -m, in case it exists:

n = g^(-1)(g(m))

j4n bur53

unread,
Jan 13, 2019, 9:53:02 AM1/13/19
to
Corr.:

n = g^(-1)(g(-m))

Me

unread,
Jan 13, 2019, 9:58:35 AM1/13/19
to


On Sunday, January 13, 2019 at 2:14:46 PM UTC+1, Jew Lover wrote:

> 1: f(x,h)= [ f(x+h)-f(x) ] / h

TELL ME, idiot, does the function f (from above) take ONE argument or TWO arguments?

You know, there's a difference, troll boy.

No function which takes two arguments can be _identical_ with a function that only takes one argument.

Hint: "A function that takes a single argument as input (such as f(x) = x^2) is called a unary function. A function of two or more variables is considered to have a domain consisting of ordered pairs or tuples of argument values. "

See: https://en.wikipedia.org/wiki/Argument_of_a_function

Me

unread,
Jan 13, 2019, 10:02:08 AM1/13/19
to
On Sunday, January 13, 2019 at 3:13:39 PM UTC+1, 666 wrote:

> no-one seems to be able to take the limit:
>
> [...] = 0/0

The reason for this is, that "0/0" is considered UNDEFINED in the context of real analysis. Hence there's no such limit.

j4n bur53

unread,
Jan 13, 2019, 10:03:56 AM1/13/19
to
Take, for example, this trivial function, often seen
in John Garbageiel's Geogebra demonstrations:

f(x) = x^2

Now we have:

g(-m) = (x-m)^2 + 2*x*m

= x^2 - 2*x*m + m^2 + 2*x*m

= x^2 + m^2.

Which can be easily inverted:

g^(-1)(v) = sqrt(v - x^2) or g^(-1)(v) = -sqrt(v - x^2)

So that for this function we have:

f[x-m,x+m] = f'(x)

Right?

Let's check:

[(x-m)^2 - (x+m)^2] / [(x-m) - (x+m)] = (-4*x*m) / (-2*m) = 2*x

Yep!
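
The same check can be run numerically in Python (x and m below are arbitrary sample values):

def f(t):
    return t**2

def chord_slope(a, b):
    return (f(a) - f(b)) / (a - b)

x, m = 3.0, 0.75
print(chord_slope(x - m, x + m))   # 6.0
print(2*x)                         # 6.0 = f'(x): the symmetric chord is parallel to the tangent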

j4n bur53

unread,
Jan 13, 2019, 10:10:09 AM1/13/19
to
See page 98 ff of John Gabriel's compiled nonsense:
https://drive.google.com/file/d/1CIul68phzuOe6JZwsCuBuXUR8X-AkgEO/view

Me

unread,
Jan 13, 2019, 10:27:08 AM1/13/19
to
On Sunday, January 13, 2019 at 2:14:46 PM UTC+1, Jew Lover wrote:

> [Mathematicians] don't claim that h becomes 0

Yeah, since this case is EXPLICITLY excluded in this context.

> but in order to find f'(x), they have to set h=0.

Nope. Where did you get that nonsensical idea from?

> They have no systematic way of finding f'(x)

Of course we have. Want to see how we get f' for f defined with

f(x) = m*x + d (x e IR)

where m,d are some (fixed) elements in IR?

Here it is:

f'(x) = lim_(h->0) {[f(x+h) - f(x)] / h}
      = lim_(h->0) {[(m*(x + h) + d) - (m*x + d)] / h}
      = lim_(h->0) {[m*x + m*h + d - m*x - d] / h}
      = lim_(h->0) {m*h / h}
      = lim_(h->0) {m}
      = m.

So f' is just a constant function with f'(x) = m for all x e IR. m is the slope of the straight line "described" by the equation

m*x + d ,

for x e IR (and with fixed m,d e IR).

Remember, you wrote:

> Therefore, since a straight line's derivative is its slope, we have:
>
> lim_(h -> 0) {[ f(x+h)-f(x) ] / h} = [ f(x+c)-f(x) ] / c

for any c e IR, c > 0.

We may check this here. Let c e IR, c > 0, then

[ f(x+c)-f(x) ] / c = [(m*(x + c) + d) - (m*x + d)] / c
                    = [m*x + m*c + d - m*x - d] / c
                    = m*c / c
                    = m   (since c =/= 0).

In other words, f'(x) = [ f(x+c)-f(x) ] / c for any c e IR, c > 0, and all x e IR.

qed
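
A small Python sketch of this result (m, d, x and the sample values of c are arbitrary): for a straight line the difference quotient is the same number m for every c =/= 0, so it trivially equals the limit.

m, d = 2.5, -1.0
f = lambda t: m*t + d      # a straight line, f(x) = m*x + d

x = 0.7
for c in (0.001, 0.1, 1.0, 42.0):
    print(c, (f(x + c) - f(x)) / c)   # prints (up to rounding) 2.5 every time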

Me

unread,
Jan 13, 2019, 10:41:47 AM1/13/19
to
On Sunday, January 13, 2019 at 4:10:09 PM UTC+1, j4n bur53 wrote:

> See page 98 ff of John Gabriels compiled nonsense:
> https://drive.google.com/file/d/1CIul68phzuOe6JZwsCuBuXUR8X-AkgEO/view

Mind blowing stuff. Not even a single reasonable (or "rational") line.

Jew Lover

unread,
Jan 13, 2019, 12:02:07 PM1/13/19
to
On Sunday, 13 January 2019 09:41:49 UTC-5, Me wrote:
<crap>

> Now you consider the following equation:
>
> > f'(x) = f'(x) + Q(x,h) [for all x,h e IR]
> >
> > [This] is only possible if Q(x,h) = 0 [for all x,h e IR].
>
> Indeed!
>
> > meaning h = 0.
>
> Of course not...

Of course YES. Look you mega moron, the only way Q(x,h) can be 0 is if h=0. Otherwise f'(x) =/= f'(x) + Q(x,h) EVER!

Jew Lover

unread,
Jan 13, 2019, 12:02:47 PM1/13/19
to
When you can't refute shit, then all you have is shit. Chuckle.

Jew Lover

unread,
Jan 13, 2019, 12:08:50 PM1/13/19
to
On Sunday, 13 January 2019 10:27:08 UTC-5, Me wrote:

> Yeah, since this case is EXPLICIT...

<rot>

f'(x) = lim (h->0) [f(x+h)-f(x)]/h is very embarrassing to you.

It's as if you produced a massive turd that stinks more than anything on the planet and so you are trying to hide it by spraying cologne thereon. Tsk, tsk. That strategy hasn't worked for you that well, has it? Chuckle.

f'(x)=f'(x)+Q(x,h) MEANS h = 0, for if not, then you don't have any way of finding that L (limit) so you can pretend that you are verifying it using your delusional limit theory:

0 < |x - c| < delta => | [f(c+h)-f(c)] / h - L|<epsilon

See, everything you touch is SHIT because you have shit for brains. Chuckle.

Me

unread,
Jan 13, 2019, 12:15:23 PM1/13/19
to
On Sunday, January 13, 2019 at 6:02:07 PM UTC+1, Jew Lover wrote:
> On Sunday, 13 January 2019 09:41:49 UTC-5, Me wrote:
> <crap>
>
> > Now you consider the following equation:
> >
> > > f'(x) = f'(x) + Q(x,h) [for all x,h e IR]
> > >
> > > [This] is only possible if Q(x,h) = 0 [for all x,h e IR].
> >
> > Indeed!
> > >
> > > meaning h = 0.
> > >
> > Of course not...
> >
> Of course YES.

No.

> Look, the only way Q(x,h) can be 0 is if h = 0.

No.

How about Q(x,h) = x*(h - h)?

Clearly Q(x,h) = 0 for all x,h e IR.

Hence

f'(x) = f'(x) + Q(x,h) for _all_ h e IR and x e D(f')

in this case.
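
A quick sympy check of that counterexample (assuming sympy is available; Q is just the name used above):

    import sympy as sp

    x, h = sp.symbols('x h')
    Q = x*(h - h)                           # the counterexample Q(x,h) = x*(h - h)
    print(sp.simplify(Q))                   # 0, for every x and h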

Me

unread,
Jan 13, 2019, 12:17:52 PM1/13/19
to
On Sunday, January 13, 2019 at 6:08:50 PM UTC+1, Jew Lover wrote:

> then you don't have any way of finding that L (limit) so <bla>

Me

unread,
Jan 13, 2019, 2:15:52 PM1/13/19
to
On Sunday, January 13, 2019 at 6:08:50 PM UTC+1, Jew Lover wrote:

f(x) = m*x + d (for all x e IR)

We say that

lim_(h->0) [f(x+h) - f(x)] / h = L

if for every epsilon > 0 there exists a delta > 0 such that, for all h =/= 0,

> if 0 < |h| < delta, then |[f(x+h) - f(x)] / h - L| < epsilon

Let's see if L = m satisfies this condition. :-)

We had that for all h e IR, h =/= 0: [f(x+h) - f(x)] / h = ... = m.

Hence if L = m, the condition becomes:

> for every epsilon > 0 there exists a delta > 0 such that, for all h =/= 0,
> if 0 < |h| < delta, then |0| < epsilon

Since for every epsilon > 0, |0| < epsilon holds, the condition is satisfied.

With other words,

lim_(h->0) [f(x+h) - f(x)] / h = m .

So where's your problem, troll boy?

Note that in the conditions above h = 0 is explicitly excluded.
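
Here is a minimal numerical sketch of that check in Python (the concrete m, d, x, epsilon and the sampled h are arbitrary; floating-point rounding is the only reason the difference isn't exactly 0):

    m, d = 2.0, 5.0
    def f(x):                             # f(x) = m*x + d
        return m*x + d

    x = 3.0
    L = m                                 # the claimed limit L = m
    epsilon = 1e-6
    delta = 1.0                           # for a straight line any delta > 0 works

    for h in (0.5, 1e-3, 1e-5, -1e-5):    # sample h with 0 < |h| < delta
        q = (f(x + h) - f(x)) / h         # the difference quotient
        assert abs(q - L) < epsilon       # |[f(x+h) - f(x)] / h - L| < epsilon
    print("condition satisfied for every sampled h")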

Jew Lover

unread,
Jan 13, 2019, 2:22:03 PM1/13/19
to
On Sunday, 13 January 2019 14:15:52 UTC-5, Me wrote:

> We baboons, say that
>
> lim_(h->0) [f(x+h) - f(x)] / h = L

Fail. f'(x)=L is what you need to determine first, you dumb cunt.

>
> if for every epsilon > 0 there exists a delta > 0 such that, for all h =/= 0,
>
> > if 0 < |h| < delta, then |[f(x+h) - f(x)] / h - L| < epsilon

No moron. This is circular. You use that faux "definition" to find f'(x) which is the L you are using. Moron! You have NO systematic way of finding L. Get it ballsack?

<mind numbing stupidity>

Jew Lover

unread,
Jan 13, 2019, 2:23:13 PM1/13/19
to
On Sunday, 13 January 2019 14:15:52 UTC-5, Me wrote:
<crapola...>

Me

unread,
Jan 13, 2019, 2:50:04 PM1/13/19
to
On Sunday, January 13, 2019 at 8:22:03 PM UTC+1, Jew Lover wrote:
> On Sunday, 13 January 2019 14:15:52 UTC-5, Me wrote:
> >
> > We say that
> >
> > lim_(h->0) [f(x+h) - f(x)] / h = L
> >
> > [etc.]
> >
> f'(x) = L is what you need to determine

Exactly. Since f'(x) = lim_(h->0) [f(x+h) - f(x)] / h BY DEFINITION, all I need to do is to check if for some suitable L:

lim_(h->0) [f(x+h) - f(x)] / h = L .

With other words,

> > if for every epsilon > 0 there exists a delta > 0 such that, for all h =/= 0,
> >
> > > if 0 < |h| < delta, then |[f(x+h) - f(x)] / h - L| < epsilon

So my CLAIM now is, that

lim_(h->0) [f(x+h) - f(x)] / h = m

for f(x) = m*x + d, and hence

f'(x) = m .

To prove that

lim_(h->0) [f(x+h) - f(x)] / h = m

(and hence f'(x) = m) I only have to show that

> for every epsilon > 0 there exists a delta > 0 such that, for all h =/= 0,
> if 0 < |h| < delta, then |[f(x+h) - f(x)] / h - m| < epsilon .

That's all. And that's indeed the case.

> You have NO systematic way of finding L.

In the example above I actually just GUESSED that L = m might be the correct value. As you can see, I guessed correctly.

Usually we don't proceed this way, but by applying "limit laws".

Want to see how we get f' for f defined with

f(x) = m*x + d (x e IR)

where m,d are some (fixed) elements in IR?

Here it is:

f'(x) = lim_(h->0) {[f(x+h) - f(x)] / h}
      = lim_(h->0) {[(m*(x + h) + d) - (m*x + d)] / h}
      = lim_(h->0) {[m*x + m*h + d - m*x - d] / h}
      = lim_(h->0) {m*h / h}
      = lim_(h->0) {m}
      = m.

Hint: We *do* determine f' in a _systematic way_ here.
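
The same systematic computation can be carried out mechanically by a computer algebra system; a short sketch using Python's sympy (assuming sympy is available; the symbols are the same m, d, x, h as above):

    import sympy as sp

    x, h, m, d = sp.symbols('x h m d')
    def f(t):                                   # f(t) = m*t + d
        return m*t + d

    quotient = (f(x + h) - f(x)) / h            # [f(x+h) - f(x)] / h
    print(sp.simplify(quotient))                # m          (valid for h != 0)
    print(sp.limit(quotient, h, 0))             # m, i.e. f'(x) = m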

Me

unread,
Jan 13, 2019, 2:59:10 PM1/13/19
to
On Sunday, January 13, 2019 at 8:50:04 PM UTC+1, Me wrote:

> Usually we don't proceed this way, but by applying "limit laws".
>
> Want to see how we get f' for f defined with
>
> f(x) = m*x + d (x e IR)
>
> where m,d are some (fixed) elements in IR?
>
> Here it is:
>
> f'(x) = lim_(h->0) {[f(x+h) - f(x)] / h} = lim_(h->0) {[(m*(x + h) + d) -
> (m*x + d)] / h} = lim_(h->0) {[m*x + m*h + d - m*x - d] / h} = lim_(h->0)
> {m*h / h} = lim_(h->0) {m} = m.
>
> Hint: We *do* determine f' in a _systematic way_ here.

Here's another example:

f(x) = x^2 (x e IR)

Then f'(x) = lim_(h->0) {[f(x+h) - f(x)] / h}
           = lim_(h->0) {[(x + h)^2 - x^2] / h}
           = lim_(h->0) {[x^2 + 2xh + h^2 - x^2] / h}
           = lim_(h->0) {[2xh + h^2] / h}
           = lim_(h->0) {2x + h}
           = lim_(h->0) {2x} + lim_(h->0) {h}
           = 2x + 0
           = 2x.

With other words, f'(x) = 2x for all x e IR.
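
Again, the whole chain of equalities can be reproduced mechanically; a short sympy sketch (assuming sympy is available):

    import sympy as sp

    x, h = sp.symbols('x h')
    def f(t):                                   # f(t) = t^2
        return t**2

    quotient = (f(x + h) - f(x)) / h            # [(x+h)^2 - x^2] / h
    print(sp.simplify(quotient))                # 2*x + h (sympy may print h + 2*x); valid for h != 0
    print(sp.limit(quotient, h, 0))             # 2*x, i.e. f'(x) = 2x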

Where's your problem troll boy?

Jew Lover

unread,
Jan 13, 2019, 3:20:09 PM1/13/19
to
On Sunday, 13 January 2019 14:59:10 UTC-5, Me wrote:


> f(x) = x^2 (x e IR)
>
> Then f'(x) = lim_(h->0) {[f(x+h) - f(x)] / h} = lim_(h->0) {[(x + h)^2 - x^2] / h} = lim_(h->0) {[x^2 + 2xh + h^2 - x^2] / h} = lim_(h->0) {[2xh + h^2] / h} = lim_(h->0) {2x + h}

Right here you big monkey!

> = lim_(h->0) {2x} + lim_(h->0) {h} = 2x + 0 = 2x.

What did you do to h there you fucking moron? Do you just pretend it no longer exists? Do you just turn a blind eye to it, you incorrigible retard!!

YES, YOU SET h = 0, you fucking ape!!

Thanks.

> With other ...

Shut the fuck up idiot. You don't know shit. Learn some English you dumb Kraut. It's not "with other", but "in other".

I don't know why I even bother to waste my time with you.

Once again, YOU HAVE NO SYSTEMATIC WAY OF DETERMINING f'(x). You can't have h =/= 0 before f'(x)+Q(x,h) and then h = 0 afterwards. Unless you set h=0, you NEVER have f'(x), only f'(x)+Q(x,h) which is NOT equal to f'(x).

Now fuck off kindly.

Me

unread,
Jan 13, 2019, 4:16:30 PM1/13/19
to
On Sunday, January 13, 2019 at 9:20:09 PM UTC+1, Jew Lover wrote:
> On Sunday, 13 January 2019 14:59:10 UTC-5, Me wrote:
> >
> > f(x) = x^2 (x e IR)
> >
> > Then f'(x) = lim_(h->0) {[f(x+h) - f(x)] / h} = lim_(h->0) {[(x + h)^2 -
> > x^2] / h} = lim_(h->0) {[x^2 + 2xh + h^2 - x^2] / h} = lim_(h->0) {[2xh +
> > h^2] / h} = lim_(h->0) {2x + h}

It seems that you have a question.

> Right here
> >
> > = lim_(h->0) {2x} + lim_(h->0) {h} = 2x + 0 = 2x.
> >
> What did you do to h [?]

Let's see. Actually, it's quite obvious, isn't it?

We have

lim_(h->0) {h} = 0 ,

didn't you know that?

Proof:

You know that (by definition) we have

lim_(x->0) f(x) = L

if for every epsilon > 0 there exists a delta > 0 such that, for all x e D(f),

if 0 < |x| < delta, then |f(x) - L| < epsilon .

Now, I claim that for f(x) = x, L = 0. Let's see (and check). If L = 0 the condition becomes

for every epsilon > 0 there exists a delta > 0 such that, for all x e D(f),
if 0 < |x| < delta, then |x| < epsilon

And that is indeed the case, just take delta = epsilon/2 for any epsilon > 0.

So we have PROVED that

lim_(h->0) {h} = 0 .

Got that, you moron?

Hint: In practice, lim_(x->0) x = 0 is just another "limit law".

From THIS we get

lim_(h->0) {2x} + lim_(h->0) {h} = 2x + 0 = 2x.

> YES

Great.

> Once again,

you have seen that we *do* have

> A SYSTEMATIC WAY OF DETERMINING f'(x).

Actually, you may have heard about additional systematic rules for determining f':

https://en.wikipedia.org/wiki/Differentiation_rules
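
Those rules are mechanical enough that a computer algebra system applies them directly; a small sympy sketch (the example functions are arbitrary, chosen only to exercise the rules):

    import sympy as sp

    x = sp.symbols('x')

    print(sp.diff(x**2, x))                 # 2*x  (power rule)
    print(sp.diff(3*x + 7, x))              # 3    (derivative of a straight line is its slope)
    print(sp.diff(x**3 * sp.sin(x), x))     # x**3*cos(x) + 3*x**2*sin(x)  (product rule)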

Me

unread,
Jan 13, 2019, 4:28:59 PM1/13/19
to
On Sunday, January 13, 2019 at 10:16:30 PM UTC+1, Me wrote:

> Hint: In practice, lim_(x->0) x = 0 is just another "limit law".

Actually, it is

lim_(x->c) x = c .

Another one is

lim_(x->c) a = a (for any constant a).

You really should learn some math, man.

You might start with this text here: "The Limit Laws"
http://www.oxfordmathcenter.com/drupal7/node/95
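
Both limit laws are easy to check with sympy as well; a minimal sketch using concrete instances of c and a (the values 5 and 7 are arbitrary):

    import sympy as sp

    x = sp.symbols('x')
    a = sp.Integer(7)                       # a fixed constant, standing in for "a"
    c = 5                                   # a concrete value of c

    print(sp.limit(x, x, c))                # 5, an instance of lim_(x->c) x = c
    print(sp.limit(a, x, c))                # 7, an instance of lim_(x->c) a = a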

Jew Lover

unread,
Jan 13, 2019, 4:52:24 PM1/13/19
to
On Sunday, 13 January 2019 16:16:30 UTC-5, Me wrote:

<crapola>

> From THIS we get
>
> lim_(h->0) {2x} + lim_(h->0) {h} = 2x + 0 = 2x.

No moron. h = 0 gives you f'(x) = 2x. The effect is the same and you are simply taking the limit after you reduce the quotient, because if you tried it before, you have no way of finding f'(x)=2x systematically. You can only find f'(x) by using your kludgy circular bullshit definition lim_(h->0) [ f(x+h)-f(x)] / h and breaking arithmetic in the process, which forbids you from dividing by 0!

So, to summarise, you have NO systematic way. What does systematic mean? It means that you perform a sequence of logical steps and arrive at the derivative without breaking any arithmetic.

<crap>

Jew Lover

unread,
Jan 13, 2019, 4:56:50 PM1/13/19
to
One more thing:

To get from lim_(h->0) {2x} + lim_(h->0) {h}

to

= 2x + 0 = 2x

You need 0 < |h| < delta => |[ f(c+h)-f(c) ] / h - L| < epsilon

BUT L = f'(x) which is a problem because you still haven't found f'(x). You need to apply your limit process, which can't be applied because you don't know f'(x) or L, the very thing your limit process requires.

What a complete idiot you are.

j4n bur53

unread,
Jan 13, 2019, 5:00:22 PM1/13/19
to
May I make a comment? To find f'(x), we do not
always use if 0 < |h| < delta, then |[f(x+h) - f(x)] / h - m| < epsilon .

Since we are no longer apes and have evolved, our
arsenal is much wider: Chain Rule, l'Hopital's Rule, etc.

These rules can in turn be reduced to the lim definitions.
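
For instance (a short sketch in Python's sympy; the particular functions are just illustrative):

    import sympy as sp

    x = sp.symbols('x')

    # Chain rule: d/dx sin(x^2) = 2*x*cos(x^2)
    print(sp.diff(sp.sin(x**2), x))

    # A 0/0 limit that l'Hopital's rule settles:
    print(sp.limit(sp.sin(x)/x, x, 0))      # 1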

Me

unread,
Jan 13, 2019, 5:05:58 PM1/13/19
to
On Sunday, January 13, 2019 at 10:52:24 PM UTC+1, Jew Lover wrote:
> On Sunday, 13 January 2019 16:16:30 UTC-5, Me wrote:
> >
> > From THIS we get
> >
> > lim_(h->0) {2x} + lim_(h->0) {h} = 2x + 0 = 2x.
> >
> No

Yes. Didn't you see the proof for lim_(h->0) {h} = 0?


> So, to summarise, we have a systematic way.

Right.

> What does systematic mean? It means that you perform a sequence of logical
> steps and arrive at the derivative without breaking any arithmetic.

Exactly!

Thanks for your valuable contribution!

Jew Lover

unread,
Jan 13, 2019, 5:07:37 PM1/13/19
to
On Sunday, 13 January 2019 17:00:22 UTC-5, j4n bur53 wrote:
> May I make a comment? To find f'(x), we do not
> always use if 0 < |h| < delta, then |[f(x+h) - f(x)] / h - m| < epsilon .

Yes, you do!

>
> Since we are no longer apes and have evolved, our
> arsenal is much wider: Chain Rule, l'Hopital's Rule, etc.
>
> These rules can in turn be reduced to the lim definitions.

Correct, which means you DO always use the limit process.

Me

unread,
Jan 13, 2019, 5:08:36 PM1/13/19
to
On Sunday, January 13, 2019 at 11:00:22 PM UTC+1, j4n bur53 wrote:

> May I make a comment?

Sure.

> To find f'(x), we do not
> always use if 0 < |h| < delta, then |[f(x+h) - f(x)] / h - m| < epsilon .
>
> Since we are no longer apes and have evolved,

Well, at least SOME of us, you know...

> our arsenal is much wider: Chain Rule, l'Hopital's Rule, etc.
>
> These rules can in turn be reduced to the lim definitions.

Right. I just mentioned that in some of my recent posts. :-P

Very nice text: "The Limit Laws"
http://www.oxfordmathcenter.com/drupal7/node/95
