Presumably it is obvious that for every x in X,
|f(x)g(x)| <= (1/2)(f(x)f(x) + g(x)g(x))
How do you prove this rigorously? I've tried looking at the RHS as the
area of a rectangle, I've tried the formula for the long side of an
obtuse triangle, I've tried looking at the diagonals of a parallelogram,
and nothing works.
Thanks,
-sto
(f(x) + g(x))^2 >=0
so
f(x)f(x) + 2f(x)g(x) + g(x)g(x) >=0
so
-2f(x)g(x) <= f(x)f(x) + g(x)g(x).
so
f(x)g(x) <= (1/2)(f(x)f(x) + g(x)g(x)).
Doing the same thing with (f(x)-g(x))^2 gives
-f(x)g(x) <= (1/2)(f(x)f(x) + g(x)g(x)).
Since |f(x)g(x)| is either f(x)g(x) or -f(x)g(x), in either case it is
less than or equal to (1/2)(f(x)f(x)+g(x)g(x)).
This has nothing to do with the functions being L_2; all you need is
for the values to be real.
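A quick numerical sanity check of the final inequality for arbitrary
real values (a sketch, with a and b standing in for f(x) and g(x); no
L_2 structure is used):

```python
import random

# Check |a*b| <= (1/2)(a^2 + b^2) on random real samples.
# A tiny tolerance absorbs floating-point rounding when |a| ~ |b|.
random.seed(0)
for _ in range(10000):
    a = random.uniform(-100.0, 100.0)
    b = random.uniform(-100.0, 100.0)
    assert abs(a * b) <= 0.5 * (a * a + b * b) + 1e-8
print("inequality holds on all samples")
```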
--
Arturo Magidin
Sigh. Quick correction:
> (f(x) + g(x))^2 >=0
>
> so
>
> f(x)f(x) + 2f(x)g(x) + g(x)g(x) >=0
>
> so
>
> -2f(x)g(x) <= f(x)f(x) + g(x)g(x).
>
> so
>
> f(x)g(x) <= (1/2)(f(x)f(x) + g(x)g(x)).
Should be
-f(x)g(x) <= (1/2)(f(x)f(x) + g(x)g(x))
> Doing the same thing with (f(x)-g(x))^2 gives
>
> -f(x)g(x) <= (1/2)(f(x)f(x) + g(x)g(x)).
Should be
f(x)g(x) <= (1/2)(f(x)f(x) + g(x)g(x)).
The cumulative effect is the same, though.
--
Arturo Magidin
Just to nit-pick, why not just look at (|f|-|g|)^2 >=0?
>
> Just to nit-pick, why not just look at (|f|-|g|)^2 >=0?
Because I wanted to open the possibility of screwing up the signs? (-;
Didn't really think about finessing it; if I had, I probably would
have.
--
Arturo Magidin
I just hate proof by cases, because all too often I forget one.
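For the record, the (|f|-|g|)^2 route the nit-pick suggests needs no
cases at all (writing a = f(x), b = g(x), and using |a|^2 = a^2):

```latex
(|a| - |b|)^2 \ge 0
\;\Longrightarrow\; a^2 - 2|a|\,|b| + b^2 \ge 0
\;\Longrightarrow\; |ab| = |a|\,|b| \le \tfrac{1}{2}\left(a^2 + b^2\right).
```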