Any help will be very much appreciated.
Let g=g(p). Differentiate both sides with respect to x, then do the
same with respect to y. It seems like you will end up
with two equations, with dg/dp and df/dx as unknowns. Solve this
system, then solve the two resulting differential equations. I haven't
tried this myself, but it seems like it may work.
Oleg
From the equation f(-x) = -f(x),
the special case x = 0 gives f(0) = 0.
Therefore, from the main functional equation,
the special case y = 0 gives g(f(x)) = x + g(2f(x))
and the special case y = x gives g(2f(x)) = x + g(f(x)).
These last two equations imply 2x = 0, so every real number is
zero. Perhaps your functional equation is a little unhealthy? :-)
Ken Pledger.
NO!!! For y=x, g(2f(x)) = x + g(-f(x)), so the problem is fine.
g(x)=2*x+g(2*x).
Oleg
Sorry, the last line of the previous post should read g(f(x)) = 2*x + g(-f(x)).
Also, Ken's first equation (the y = 0 case) is wrong there too.
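For the record, the two special cases can be checked symbolically with sympy, treating f and g as undefined functions and substituting f(0) = 0 and f(-x) = -f(x):

```python
import sympy as sp

x, y = sp.symbols('x y')
f, g = sp.Function('f'), sp.Function('g')

# the original functional equation
eq = sp.Eq(g(f(x) + f(y)), x + g(f(y - x) + f(-x)))

# special case y = 0, using f(0) = 0 and f(-x) = -f(x):
e0 = eq.subs(y, 0).subs({f(0): 0, f(-x): -f(x)})
print(e0)  # Eq(g(f(x)), x + g(-2*f(x)))  -- not x + g(2*f(x))

# special case y = x, same substitutions:
e1 = eq.subs(y, x).subs({f(0): 0, f(-x): -f(x)})
print(e1)  # Eq(g(2*f(x)), x + g(-f(x)))
```

So both of Ken's special-case equations come out with the opposite sign inside g.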
> In article
> <12193236.1172760032...@nitrogen.mathforum.org>,
> Mikhail V. Sokolov <mmm_...@rambler.ru> wrote:
>
> > I am stuck with the following functional equation:
> > g(f(x)+f(y)) = x + g(f(y-x)+f(-x)) for any real x and y,
> > where functions
> > 1. g:R->R is strictly monotonic and differentiable;
> > 2. f:R->R is strictly increasing, differentiable, and satisfying f(-x) =
> > -f(x).
> >
> > Any help will be very much appreciated.
>
>
> From the equation f(-x) = -f(x),
>
> the special case x = 0 gives f(0) = 0.
>
> Therefore, from the main functional equation,
>
> the special case y = 0 gives g(f(x)) = x + g(2f(x))
>
> and the special case y = x gives g(2f(x)) = x + g(f(x))....
Alas! Mikhail e-mailed me privately to point out the error: since
f is an odd (not even) function,
the special case y = x actually gives g(2f(x)) = x + g(-f(x)).
However, here's another idea.
The special case y = 2x gives g(f(x) + f(2x)) = x + g(0).
This suggests letting f be _any_ suitable function, then defining g as a
constant plus the inverse function of f(x) + f(2x).
For example, let f(x) = x.
Then f(x) + f(2x) = 3x,
so let g(x) = x/3 + a. (The constant a could be 0.)
Then, if I haven't made another blunder, each side of the functional
equation comes out to (x + y)/3 + a, so the equation is satisfied.
Ken Pledger.
A nice solution, but are you sure that you are not missing any answers
this way?
Oleg (not Mikhail)
> In article <ken.pledger-4662...@bats.mcs.vuw.ac.nz>,
> Ken Pledger <ken.p...@mcs.vuw.ac.nz> wrote:
>
> > In article
> > <12193236.1172760032...@nitrogen.mathforum.org>,
> > Mikhail V. Sokolov <mmm_...@rambler.ru> wrote:
> >
> > > I am stuck with the following functional equation:
> > > g(f(x)+f(y)) = x + g(f(y-x)+f(-x)) for any real x and y,
> > > where functions
> > > 1. g:R->R is strictly monotonic and differentiable;
> > > 2. f:R->R is strictly increasing, differentiable, and satisfying f(-x) =
> > > -f(x)....
>
> The special case y = 2x gives g(f(x) + f(2x)) = x + g(0).
>
> This suggests letting f be _any_ suitable function, then defining g as a
> constant plus the inverse function of f(x) + f(2x).
>
> For example, let f(x) = x.
>
> Then f(x) + f(2x) = 3x,
>
> so let g(x) = x/3 + a. (The constant a could be 0.)
>
> Then, if I haven't made another blunder, each side of the functional
> equation comes out to (x + y)/3 + a, so the equation is satisfied....
Wrong again! The above condition about g^(-1) is necessary but
not sufficient for a solution. My simple example f(x) = x just
luckily turned out to work.
Clearly I should leave this problem alone. :-(
Ken Pledger.
Differentiation with respect to y yields an identity, so it's useless.
Oleg
For one (of presumably many) solutions, how about
f(x) = a*x
g(x) = x/(3*a) + b
for any b and any a > 0?
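This family is easy to check mechanically; a quick sympy verification (both sides reduce to (x + y)/3 + b):

```python
import sympy as sp

x, y, b = sp.symbols('x y b')
a = sp.symbols('a', positive=True)  # a > 0, so f is strictly increasing

f = lambda t: a*t             # f(x) = a*x
g = lambda t: t/(3*a) + b     # g(x) = x/(3*a) + b

lhs = g(f(x) + f(y))
rhs = x + g(f(y - x) + f(-x))
print(sp.simplify(lhs - rhs))  # 0
```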
Is (dg/dx)*(df/dx) = 1/3 sufficient? (It looks like it's necessary:
differentiate the original equation with respect to x.)
Differentiating the original equation with respect to x and with respect
to y, and combining the two results, we obtain
g'(f(x)+f(y))*f'(x) = 1 - g'(f(x)+f(y))*f'(y)/f'(y-x)*(f'(y-x)+f'(-x)),
that is,
g'(f(x)+f(y))*(f'(x) + f'(y)/f'(y-x)*(f'(y-x)+f'(-x))) = 1.
Taking y = -x in the last equation (and using that f' is even, since f is odd), we get
2*f'(x) + f'(x)^2/f'(2*x) = 1/g'(0).
Maybe the last equation helps to solve the problem...
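As a consistency check, this necessary condition holds for the solution f(x) = a*x, g(x) = x/(3*a) + b found earlier in the thread, since f'(x) = a and g'(0) = 1/(3*a):

```python
import sympy as sp

x, b, t = sp.symbols('x b t')
a = sp.symbols('a', positive=True)

f = a*x           # f(x) = a*x, so f'(x) = a
g = t/(3*a) + b   # g(t) = t/(3*a) + b, so g'(0) = 1/(3*a)

fp = sp.diff(f, x)
gp0 = sp.diff(g, t).subs(t, 0)

# necessary condition: 2*f'(x) + f'(x)^2/f'(2x) = 1/g'(0); here both sides are 3*a
lhs = 2*fp + fp**2 / fp.subs(x, 2*x)
print(sp.simplify(lhs - 1/gp0))  # 0
```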
I don't see what you mean... these g(x) and f(x) satisfy conditions (1)
and (2) respectively, don't they? And if you directly substitute
these definitions into g(f(x)+f(y)) = x + g(f(y-x)+f(-x)), then it
comes out exactly right. What else is there to achieve?
You are certainly right. The problem is: are there any other solutions?
Mikhail, there is a problem in your derivations above: you are taking
partial derivatives with respect to two different variables and then
forgetting about that.
Continuing the communal effort:
y = 2*x yields
g(f(x)+f(2*x)) = g(0) + x;
then, because f(-x) = -f(x), the substitution x -> -x yields
g(-f(x)-f(2*x)) = g(0) - x.
Add the two together and rename f(x)+f(2*x) -> x (this removes the
dependence on x on the right-hand side, and is valid at least for x in
the range of f(x)+f(2*x)) to obtain
g(x) + g(-x) = 2*g(0).
Some progress, I'm guessing, but who can pick up from here?
Or is this actually IT?
Yes, it is a useful result. Moreover, without loss of generality we may assume g(0) = 0 (since g is determined only up to an additive constant), so g(-x) = -g(x).
I don't know if this is any use...
If we define h(x, y) = g(f(x) + f(y)) then the equation to be
satisfied is h(x, y) = x + h(y - x, -x). Clearly also h(x, y) = h(y,
x) and, from a result elsewhere in this thread, if we assume g(0) = 0
(which we may as well), then h(-x, -y) = -h(x, y). Simply by equating
terms it seems we are forced to have:
h(x, y) = x/3 + y/3 + a*x^3 - 3*a*x^2*y/2 - 3*a*x*y^2/2 + a*y^3 +
b*x^5 - 5*b*x^4*y/2 + b*x^3*y^2 + b*x^2*y^3 - 5*b*x*y^4/2 + b*y^5 ...
where a and b are arbitrary constants. I haven't gone any further than
fifth powers, but maybe there is some pattern if this is continued. Or
it might stop dead at some point, with no further higher powers
possible. And then maybe there's some way to figure out whether or not
a function h(x, y) of this form can be written as g(f(x) + f(y)) for
some f and g satisfying the remaining requirements. A lot of maybes!
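For what it's worth, a sympy check confirms that the degree-1, degree-3 and degree-5 terms above do satisfy all three constraints:

```python
import sympy as sp

x, y, a, b = sp.symbols('x y a b')
R = sp.Rational

# the power series for h, truncated at fifth powers, as given above
h = (x/3 + y/3
     + a*(x**3 - R(3,2)*x**2*y - R(3,2)*x*y**2 + y**3)
     + b*(x**5 - R(5,2)*x**4*y + x**3*y**2 + x**2*y**3 - R(5,2)*x*y**4 + y**5))

swap = lambda e, u, v: e.subs({x: u, y: v}, simultaneous=True)

print(sp.expand(h - x - swap(h, y - x, -x)))  # functional equation h(x,y) = x + h(y-x,-x): 0
print(sp.expand(h - swap(h, y, x)))           # symmetry h(x,y) = h(y,x): 0
print(sp.expand(h + swap(h, -x, -y)))         # oddness h(-x,-y) = -h(x,y): 0
```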
What terms are you equating?
I'm just supposing that h(x, y) is expressible as a two-variable power
series; i.e. a sum of terms of the form c[i,j]*x^i*y^j for non-
negative integers i and j, where the c[i,j] are constant coefficients.
Given that h(x, y) = x + h(y - x, -x), we expand h(y - x, -x) and, for
every i, j, the coefficient of x^i*y^j on the l.h.s must match the
coefficient of x^i*y^j on the r.h.s. This equating of coefficients can
be carried out independently for each distinct value of i + j (done
above for i + j = 3 and i + j = 5), yielding a set of linear
simultaneous equations. The condition h(-x, -y) = -h(x, y) means that
we only need to consider i + j odd, and h(x, y) = h(y, x) imposes
additional restrictions.
Because of the condition h(x, y) = h(y, x), there are, for each
distinct value of i + j, about twice as many equations to be satisfied
as there are free coefficients. My hope was that the above solutions
for smaller powers (smaller values of i + j) were just "coincidence",
and we'd reach a point with higher powers where no more solutions are
possible, thus limiting the possibilities for h (assuming that it can
be expressed in this way as a power series). In fact, the opposite
seems to happen: with higher powers not only do there continue to be
solutions, but there are "more" of them, in the sense that the
solutions include increasing numbers of free constants (such as the a
and b above). Failing that, I hoped for some very simple pattern to
emerge, but this seems not to be the case either.
Ho-hum.
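For i + j = 3 the procedure just described can be carried out in sympy; the functional equation plus symmetry leaves exactly the one-parameter family quoted above:

```python
import sympy as sp

x, y = sp.symbols('x y')
c0, c1, c2, c3 = cs = sp.symbols('c0:4')

# generic degree-3 part; the degree-1 part x/3 + y/3 is already fixed
h = (x + y)/3 + c0*x**3 + c1*x**2*y + c2*x*y**2 + c3*y**3

sub = lambda e, u, v: e.subs({x: u, y: v}, simultaneous=True)
eqs = sp.Poly(sp.expand(h - x - sub(h, y - x, -x)), x, y).coeffs()  # h(x,y) = x + h(y-x,-x)
eqs += sp.Poly(sp.expand(h - sub(h, y, x)), x, y).coeffs()          # h(x,y) = h(y,x)

print(sp.solve(eqs, cs, dict=True))
# one free parameter remains: c0 = c3, c1 = c2 = -3*c3/2,
# i.e. the family a, -3a/2, -3a/2, a from above
```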
Let h(x, y) = (x + y)/3 + t(-x, x - y, y). Then from the functional
equation for h it follows that:
t(-x, x - y, y) = t(x - y, y, -x) (1)
Let u = -x, v = x - y and w = y =>
t(u, v, w) = t(v, w, u) (2)
For example t(x, y, z) = x * y * z =>
h(x, y) = (x + y + 3*x*y*(y - x))/3
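(A quick sympy check confirms that this h does satisfy the functional equation:)

```python
import sympy as sp

x, y = sp.symbols('x y')
h = (x + y + 3*x*y*(y - x))/3   # the example above, with t(x, y, z) = x*y*z

# residual of the functional equation h(x,y) = x + h(y-x,-x)
residual = h - x - h.subs({x: y - x, y: -x}, simultaneous=True)
print(sp.expand(residual))  # 0
```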
Theron
Since h is commutative (h(x, y) = g(f(x) + f(y)) = g(f(y) + f(x)) =
h(y, x)), it follows that: t(-x, x - y, y) = t(-y, -(x - y), x)
=>
t(u, v, w) = t(-w, -v, -u) (3)
From (2) and (3) =>
t(u, v, w) = t(v, w, u) = t(-w, -v, -u) (4)
Theron
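A sympy check shows why (3) matters here: t(u, v, w) = u*v*w satisfies the cyclic condition (2) but violates (3), so the earlier example h is not symmetric and hence cannot be of the form g(f(x) + f(y)):

```python
import sympy as sp

u, v, w = sp.symbols('u v w')
t = u*v*w

# condition (2): t(u, v, w) = t(v, w, u)
print(sp.expand(t - t.subs({u: v, v: w, w: u}, simultaneous=True)))     # 0
# condition (3): t(u, v, w) = t(-w, -v, -u) -- fails, residual is 2*u*v*w
print(sp.expand(t - t.subs({u: -w, v: -v, w: -u}, simultaneous=True)))  # 2*u*v*w
```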