f(x) is a real-valued function on the reals with a continuous derivative, and f'(x)^2 + f(x)^3 → 0 as x → ∞. Show that f(x) → 0 and f'(x) → 0 as x → ∞.
The key to getting started is to notice that if f' = 0 for arbitrarily large values of x, then the result is certainly true. Suppose f'(x_n) = 0 and x_n → ∞. Then since f'^2 + f^3 → 0, we have f(x_n)^3 = f'(x_n)^2 + f(x_n)^3 → 0, so f(x_n) → 0. Moreover, on each interval [x_n, x_{n+1}], f attains its maximum and minimum either at an endpoint or at an interior critical point; in every case the derivative vanishes there, so the extreme values of f on the interval tend to 0 as n → ∞. Hence f → 0, and then f'^2 → -f^3 → 0, so f' → 0 also. So we may assume that for sufficiently large x, f' does not change sign.
Now suppose f tends to a finite limit as x → ∞. Then f'^2 = (f'^2 + f^3) - f^3 also tends to a limit, and since f' has constant sign for large x, f' itself tends to a limit. If that limit were non-zero, then for sufficiently large x, f would increase or decrease faster than some non-constant linear function and so could not tend to a finite limit. Hence f' → 0, and then f^3 = (f'^2 + f^3) - f'^2 → 0, so f → 0 as well.
So we may assume that either (1) for sufficiently large x, f is strictly monotonic increasing and tends to infinity, or (2) for sufficiently large x, f is strictly monotonic decreasing and tends to minus infinity.
The first case is impossible, because then f^3, and hence also f^3 + f'^2, would tend to infinity.
Showing that the second case is impossible needs a little more work. Suppose that for x ≥ X we have f(x) < -1 and -1/4 < f(x)^3 + f'(x)^2 < 1/4. Then (1/2)f(x)^3 < -1/2, so -(1/2)f(x)^3 > 1/2. Hence f'(x)^2 + (1/2)f(x)^3 > -1/4 - (1/2)f(x)^3 > -1/4 + 1/2 = 1/4 > 0. So f'(x)^2 > -(1/2)f(x)^3 = (1/2)|f(x)|^3, giving |f'(x)| > (1/√2)|f(x)|^{3/2}. Since f'(x) is negative, in particular f'(x) < -(1/2)|f(x)|^{3/2} (*).
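The inequality chain above can be sanity-checked numerically: sample pairs (a, b), with a standing in for f(x) < -1 and b for f'(x) < 0, and confirm that every pair satisfying the hypothesis -1/4 < a^3 + b^2 < 1/4 also satisfies the conclusion (*). A quick Python sketch (the sampling ranges and sample count are arbitrary choices made for this illustration):

```python
import random

# a plays the role of f(x) < -1, b the role of f'(x) < 0.
# Whenever the hypothesis -1/4 < a^3 + b^2 < 1/4 holds, the
# conclusion (*), b < -(1/2)|a|^(3/2), should follow.
random.seed(0)
hits = 0
while hits < 200:
    a = random.uniform(-3.0, -1.0001)
    b = -random.uniform(0.0, 6.0)
    if -0.25 < a**3 + b**2 < 0.25:
        assert b < -0.5 * abs(a) ** 1.5
        hits += 1
```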
Now define g(x) by g(X) = -1 and g' = -(1/2)|g|^{3/2}. Solving this separable equation (g stays negative, so |g| = -g) gives g(x) = -(1 - (x - X)/4)^{-2} for X ≤ x < X + 4. A comparison argument using (*) shows that f(x) < g(x) on this interval: f(X) < -1 = g(X), and if f first caught up with g at some point x_0 > X, then somewhere in (X, x_0) we would need f' > g', whereas f < g ≤ -1 there gives |f| > |g|, so by (*) f' < -(1/2)|f|^{3/2} ≤ -(1/2)|g|^{3/2} = g'. But g(x) → -∞ as x → X + 4, so f would be unbounded below on the bounded interval (X, X + 4), contradicting the fact that f is continuous on [X, X + 4]. Contradiction.
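The closed form for g can be checked directly. Taking X = 0 for concreteness (a choice made only for this illustration), g(x) = -(1 - x/4)^(-2) should satisfy g(0) = -1 and the equation g' = -(1/2)|g|^(3/2) on [0, 4), and should blow up as x approaches 4. A numerical sketch using central differences:

```python
# Closed-form candidate for g with X = 0: g(x) = -(1 - x/4)^(-2).
def g(x):
    return -(1.0 - x / 4.0) ** -2

# Initial condition g(X) = -1.
assert g(0.0) == -1.0

# Check the ODE g' = -(1/2)|g|^(3/2) via a central-difference
# approximation at a few sample points in [0, 4).
h = 1e-6
for x in [0.0, 1.0, 2.0, 3.0, 3.9]:
    deriv = (g(x + h) - g(x - h)) / (2.0 * h)
    assert abs(deriv - (-0.5 * abs(g(x)) ** 1.5)) < 1e-3 * abs(g(x)) ** 1.5

# g tends to -infinity as x -> 4, dragging any f < g down with it.
assert g(3.999) < -1e6
```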
Comment. I found this hard. Note that the exact form of the relation is important. If, for example, it were replaced by f'(x)^3 + f(x)^2 → 0 as x → ∞, then the result would not be true - we could have f(x) = -x^3/27, in which case f'^3 + f^2 is identically zero.
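The counterexample can be verified exactly with rational arithmetic (a quick check using Python's fractions module):

```python
from fractions import Fraction

# f(x) = -x^3/27 gives f'(x) = -x^2/9, so
# f'(x)^3 + f(x)^2 = -x^6/729 + x^6/729 = 0 for every x.
def f(x):
    return -x**3 / 27

def fprime(x):
    return -x**2 / 9

for k in range(-20, 21):
    x = Fraction(k, 3)
    assert fprime(x)**3 + f(x)**2 == 0
```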
35th Putnam 1974
© John Scholes
18 Aug 2001