Find all f such that
f : R+ ---> R continuous such that for any x in R+
f(x) = int_[0;Ax]f(t)dt
Thank you, m.o.
>Find all f such that
>f : R+ ---> R continuous such that for any x in R+
>f(x) = int_[0;Ax]f(t)dt
f(0) = 0 and f'(x) = A f(Ax).
Then f''(x) = A d/dx f(Ax) = A^2 f'(Ax) = A^3 f(A^2 x)
and f^(n)(x) = A^(n(n+1)/2) f(A^n x).
In particular, f is C^infinity and all its derivatives at 0 are 0.
So the only possible analytic solution is f(x) = 0.
I don't know if there are non-analytic solutions.
Robert Israel isr...@math.ubc.ca
Department of Mathematics http://www.math.ubc.ca/~israel
University of British Columbia
Vancouver, BC, Canada V6T 1Z2
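[The exponent n(n+1)/2 above can be sanity-checked mechanically. Each differentiation of A^(e_n) f(A^n x) picks up a factor A^n from the chain rule and one more A from f'(y) = A f(Ay), so the exponent satisfies e_{n+1} = e_n + n + 1 with e_0 = 0. A few lines of Python (my sketch, not part of the thread) confirm the bookkeeping:]

```python
# Exponent bookkeeping for f^(n)(x) = A^(e_n) f(A^n x):
# differentiating multiplies by A^n (chain rule) and by A (since f'(y) = A f(Ay)),
# so e_{n+1} = e_n + n + 1 with e_0 = 0.  Check that e_n = n(n+1)/2:
e = 0
for n in range(100):
    assert e == n * (n + 1) // 2
    e += n + 1
```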
You could try differentiating both sides wrt x.
This gives f'(x) = A f(Ax), maybe this is easier to deal with.
> Let A > 1 be real
>
> Find all f such that
>
> f : R+ ---> R continuous such that for any x in R+
> f(x) = int_[0;Ax]f(t)dt
Actually A > 1 is not needed.
Theorem: Let f : [0,oo) -> R be continuous. Let g be nonnegative,
increasing, and continuous on [0,oo), with g(0) = 0, and g(x) -> oo as x ->
oo. If
|f(x)| <= |int_[0, g(x)] f(t) dt| for all x in [0,oo),
then f is identically 0.
proof/ Let x* = sup {x >= 0 : f is 0 on [0,g(x)]}. (Note f(0) = 0, so the
sup is taken over a non-empty set.) Suppose x* < oo. Then f = 0 on [0,
g(x*)]. Choose x** > x* such that g(x**) - g(x*) < 1/2. Let M be the
maximum value of |f| on [g(x*),g(x**)]. Then M = |f(x)| for some x in
[g(x*),g(x**)]. We have
M = |f(x)| <= |int_[0, g(x)] f(t) dt|
= |int_[g(x*), g(x)] f(t) dt |
<= (g(x**) - g(x*))M <= M/2.
It follows that M = 0, so f = 0 on [0, g(x**)]; then x** belongs to the set
defining x*, contradicting x* = sup. Therefore x* = oo, finishing the proof.
>Actually A > 1 is not needed.
>Theorem: Let f : [0,oo) -> R be continuous. Let g be nonnegative,
>increasing, and continuous on [0,oo), with g(0) = 0, and g(x) -> oo as x ->
>oo. If
> |f(x)| <= |int_[0, g(x)] f(t) dt| for all x in [0,oo),
>then f is identically 0.
Counterexample: f(x) = x, g(x) = sqrt(2 x).
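[Indeed, with f(t) = t we get int_[0, g(x)] t dt = g(x)^2/2 = x = |f(x)|, so the hypothesis holds with equality while f is nonzero. A short numerical check of this (my sketch, plain trapezoid rule, not part of the thread):]

```python
import math

# Numerical check of the counterexample f(x) = x, g(x) = sqrt(2x):
# int_[0, g(x)] t dt = g(x)^2 / 2 = x, so |f(x)| <= |int_[0, g(x)] f(t) dt|
# holds (with equality) even though f is not identically 0.
def f(t):
    return t

def g(x):
    return math.sqrt(2 * x)

def integral(fn, a, b, n=10_000):
    """Composite trapezoid rule for int_a^b fn(t) dt."""
    h = (b - a) / n
    return h * (0.5 * (fn(a) + fn(b)) + sum(fn(a + i * h) for i in range(1, n)))

for x in (0.5, 1.0, 2.0, 10.0):
    assert abs(abs(f(x)) - abs(integral(f, 0.0, g(x)))) < 1e-6
```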
> Let a > 1 be real
>
> Find all f such that
>
> f : R+ ---> R continuous such that for any x in R+
> f(x) = int_[0;Ax]f(t)dt
>
----------------------------------------------------
f(0) = 0 , f'(x) = a.f(ax)
Taking Laplace transforms (and using f(0) = 0): L(f')(x) = x.Lf(x), with
Lf(x) = int_R+ f(t).exp(-xt)dt
x.Lf(x) = a.Lf(ax)
ax.Lf(ax) = aLf(a^2x)
................................
a^nxLf(a^nx) = aLf(a^(n+1)x)
Multiplying these equalities gives
Lf(x) = (1/x^(n+1))(1/a^(n(n+1)/2-n-1))Lf(a^(n+1)x)
Suppose x >= 1; since a > 1, letting n ---> +oo gives
Lf(x) = 0 for x >= 1
my question is can we deduce f is 0 on [1;+oo[ ?
Thanks for your answers
m.o.
>>Find all f such that
>>f : R+ ---> R continuous such that for any x in R+
>>f(x) = int_[0;Ax]f(t)dt
>f(0) = 0 and f'(x) = A f(Ax).
>Then f''(x) = A d/dx f(Ax) = A^2 f'(Ax) = A^3 f(A^2 x)
>and f^(n)(x) = A^(n(n+1)/2) f(A^n x).
>In particular, f is C^infinity and all its derivatives at 0 are 0.
>So the only possible analytic solution is f(x) = 0.
>I don't know if there are non-analytic solutions.
No. One can bound the derivatives, and the power series
must converge. See the theory of Volterra integral equations.
--
This address is for information only. I do not claim that these views
are those of the Statistics Department or of Purdue University.
Herman Rubin, Department of Statistics, Purdue University
hru...@stat.purdue.edu Phone: (765)494-6054 FAX: (765)494-0558
>> Let a > 1 be real
>> Find all f such that
>> f : R+ ---> R continuous such that for any x in R+
>> f(x) = int_[0;Ax]f(t)dt
>f(0)=0 , f '(x) = a.f(ax)
>Lf '(x ) = laplace transform = x.Lf(x) with
>Lf(x) = int_R+ f(t).exp(-xt)dt
Note that this integral won't necessarily converge, unless you
make an assumption such as |f(t)| <= exp(ct) where x > c.
...
>Lf(x) = 0 for x>= 1
>my question is can we deduce f is 0 on [1;+oo[ ?
We could deduce f is 0 on all of R+. But only on the assumption
of an exponential bound |f(t)| <= exp(ct) for some constant c.
Otherwise the Laplace transform of f might not be defined anywhere.
It was interesting to read the amusing replies!
The Fabius function satisfies this with A=2. This function Fb(x) is
infinitely differentiable, but nowhere analytic. It is increasing on
[0,1], Fb(0)=0, Fb(1) = 1, Fb'(x) = 2 Fb(2x). Fabius [1] initially
constructs it for x in [0,1], but it can be extended easily to all real
x, using the defining equation for x>1, and by reflection for x<0 [2].
See a picture at <http://www.math.ohio-state.edu/~edgar/selfdiff/>
references:
[1] J. Fabius, "A probabilistic example of a nowhere analytic
C^\infty-function". Z. Wahrsch. Verw. Geb. 5 (1966) 173--174.
[2] K. Stromberg, PROBABILITY FOR ANALYSTS (Chapman & Hall, 1994), pp.
117--120.
I have a student with some surprising results for f'(x) = a f(2x) ...
for most a, this is quite different from the case a=2. [But note this
is not the original problem of this thread.]
Here is the construction of Fabius: Let (X_k) be an iid sequence of
random variables, each uniformly distributed on [0,1]. Let X = sum(k=1
to infinity) 2^{-k} X_k. Let Fb(x) be the cumulative distribution
function of X. Check Fb(x) = integral(0 to 2x) Fb(t) dt for all x in
[0,1/2].
--
G. A. Edgar http://www.math.ohio-state.edu/~edgar/
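[The construction above is easy to check by simulation. The sketch below (my addition, not part of the thread) estimates Fb as the empirical CDF of X = sum 2^{-k} X_k and tests Fb(x) = integral(0 to 2x) Fb(t) dt at a few points of [0, 1/2]; the sample size, the 40-term truncation of the series, and the tolerance are arbitrary choices:]

```python
import numpy as np

# Monte Carlo sketch of the Fabius CDF: X = sum_{k>=1} 2^-k X_k, X_k iid U[0,1].
# Sample size, 40-term truncation, and tolerance are arbitrary choices.
rng = np.random.default_rng(0)
N, K = 200_000, 40
U = rng.random((N, K))                               # X_k ~ Uniform[0,1], iid
X = (U * 2.0 ** -np.arange(1, K + 1)).sum(axis=1)    # X lives in [0, 1)

grid = np.linspace(0.0, 1.0, 2001)
Fb = np.searchsorted(np.sort(X), grid, side='right') / N   # empirical CDF

# check Fb(x) = int_0^{2x} Fb(t) dt at a few points of [0, 1/2]
dt = grid[1] - grid[0]
for x in (0.1, 0.2, 0.3, 0.4, 0.5):
    i, j = round(x * 2000), round(2 * x * 2000)      # grid indices of x and 2x
    rhs = dt * (Fb[:j + 1].sum() - 0.5 * (Fb[0] + Fb[j]))  # trapezoid rule
    assert abs(Fb[i] - rhs) < 0.02
```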
>>>Find all f such that
>>>f : R+ ---> R continuous such that for any x in R+
>>>f(x) = int_[0;Ax]f(t)dt
>>f(0) = 0 and f'(x) = A f(Ax).
>>Then f''(x) = A d/dx f(Ax) = A^2 f'(Ax) = A^3 f(A^2 x)
>>and f^(n)(x) = A^(n(n+1)/2) f(A^n x).
>>In particular, f is C^infinity and all its derivatives at 0 are 0.
>>So the only possible analytic solution is f(x) = 0.
>>I don't know if there are non-analytic solutions.
>No. One can bound the derivatives, and the power series
>must converge. See the theory of Volterra integral equations.
How do you get a bound on the derivatives, without an a priori
bound on the function f? E.g. in another part of this thread,
an exponential bound |f(x)| <= exp(c x) allows Laplace transform
methods to be used to show f = 0. But without any bound of that
sort, I don't see how you can proceed.
>> Let a > 1 be real
>> Find all f such that
>> f : R+ ---> R continuous such that for any x in R+
>> f(x) = int_[0;Ax]f(t)dt
>f(0)=0 , f '(x) = a.f(ax)
>Lf '(x ) = laplace transform = x.Lf(x) with
>Lf(x) = int_R+ f(t).exp(-xt)dt
>x.Lf(x) = a.Lf(ax)
Oops, no:
L(f')(x) = int_0^infinity exp(-xt) a f(at) dt
= int_0^infinity exp(-xu/a) f(u) du (taking u = at)
= L(f)(x/a)
And there's no reason to think L(f) = 0 (as in fact it need not be,
if you look at the Fabius function that Gerald Edgar mentioned).
I take that back, it doesn't show anything of the sort. There are,
as Gerald Edgar mentioned, bounded nonzero solutions.
> >Actually A > 1 is not needed.
>
> >Theorem: Let f : [0,oo) -> R be continuous. Let g be nonnegative,
> >increasing, and continuous on [0,oo), with g(0) = 0, and g(x) -> oo as x ->
> >oo. If
>
> > |f(x)| <= |int_[0, g(x)] f(t) dt| for all x in [0,oo),
>
> >then f is identically 0.
>
> Counterexample: f(x) = x, g(x) = sqrt(2 x).
Thanks Robert. Interesting thread. Let me at least get something right
(although way less interesting):
Theorem: Let f : [0,oo) -> R be continuous. Let g: [0,oo) -> [0,oo) satisfy
g(x) <= x. If
|f(x)| <= |int_[0, g(x)] f(t) dt| for all x in [0,oo),
then f is identically 0.
proof/ It's not hard to show f = 0 on [0,1/2] (you get M <= M*(1/2), where
M is the maximum value of |f| on [0,1/2]). Now bootstrap this up to all of
[0,oo) using intervals of length 1/2 (or give a fancier all-in-one proof).
>> Let A > 1 be real
>> Find all f such that
>> f : R+ ---> R continuous such that for any x in R+
>> f(x) = int_[0;Ax]f(t)dt
>> thank you m.o.
>The Fabius function satisfies this with A=2. This function Fb(x) is
>infinitely differentiable, but nowhere analytic. It is increasing on
>[0,1], Fb(0)=0, Fb(1) = 1, Fb'(x) = 2 Fb(2x). Fabius [1] initially
>constructs it for x in [0,1], but it can be extended easily to all real
>x, using the defining equation for x>1, and by reflection for x<0 [2].
>See a picture at <http://www.math.ohio-state.edu/~edgar/selfdiff/>
>references:
>[1] J. Fabius, "A probabilistic example of a nowhere analytic
>C^\infty-function". Z. Wahrsch. Verw. Geb. 5 (1966) 173--174.
>[2] K. Stromberg, PROBABILITY FOR ANALYSTS (Chapman & Hall, 1994), pp.
>117--120.
>Here is the construction of Fabius: Let (X_k) be an iid sequence of
>random variables, each uniformly distributed on [0,1]. Let X = sum(k=1
>to infinity) 2^{-k} X_k. Let Fb(x) be the cumulative distribution
>function of X. Check Fb(x) = integral(0 to 2x) Fb(t) dt for all x in
>[0,1/2].
Beautiful! And more generally, for any A > 1 we can define
X = sum_{k=1}^infinity A^(-k) X_k; its cumulative distribution function
F(x) = Prob(X <= x) satisfies
F(x) = int_{Ax-1}^{Ax} F(t) dt
(which we can get by conditioning on X_1)
and thus F'(x) = A (F(Ax) - F(Ax-1)).
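[This identity can also be checked by simulation. The sketch below (my addition, with A = 3, sample size, grid, and tolerance chosen arbitrarily) builds the empirical CDF F of X = sum A^{-k} X_k and tests F(x) = int_{Ax-1}^{Ax} F(t) dt, taking F = 0 to the left of the support and F = 1 to the right:]

```python
import numpy as np

# Monte Carlo check of F(x) = int_{Ax-1}^{Ax} F(t) dt for A = 3.
# A, sample size, grid, and tolerance are arbitrary choices.
rng = np.random.default_rng(1)
A, N, K = 3.0, 200_000, 40
U = rng.random((N, K))                                     # X_k ~ Uniform[0,1], iid
Xs = np.sort((U * A ** -np.arange(1, K + 1)).sum(axis=1))  # samples of X

def F(t):
    """Empirical CDF of X: 0 left of the support, 1 right of it."""
    return np.searchsorted(Xs, t, side='right') / N

for x in (0.05, 0.1, 0.2, 0.3, 0.4):
    ts = np.linspace(A * x - 1, A * x, 2001)               # integration grid
    vals = F(ts)
    dt = ts[1] - ts[0]
    rhs = dt * (vals.sum() - 0.5 * (vals[0] + vals[-1]))   # trapezoid rule
    assert abs(F(x) - rhs) < 0.02
```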
Let R be the set of polynomials p(t) with coefficients in {0,1}.
For x >= 0, let f(x) = sum_{p in R} (-1)^p(1) F(x-p(A))
(note that on any bounded interval, there are only finitely many nonzero
terms in the sum).
Then f'(x) = A sum_{p in R} (-1)^p(1) (F(Ax-Ap(A)) - F(Ax-Ap(A)-1))
Noting that every p in R can be uniquely written in the form
p(t) = t q(t) or t q(t)+1 for q in R, we see that
f'(x) = A sum_{q in R} (-1)^q(1) F(Ax-q(A)) = A f(Ax)
So f is a nonzero solution of moubinool's problem.
Next question: are there more solutions, or is f unique (up to constant
multiples)?