### 52nd Putnam 1991

Problem B2

R is the real line. f, g: R → R are non-constant, differentiable functions satisfying: (1) f(x + y) = f(x)f(y) - g(x)g(y) for all x, y; (2) g(x + y) = f(x)g(y) + g(x)f(y) for all x, y; and (3) f '(0) = 0. Prove that f(x)² + g(x)² = 1 for all x.

Solution

Moderately hard.

Let f(0) = A, g(0) = B. Setting x = y = 0 in (1) and (2) gives A = A² - B² and B = 2AB. The latter implies that B = 0 or A = 1/2. But the former rules out A = 1/2, since it would force B² = -1/4, and B² cannot be negative. Hence B = 0, and the former becomes A = A², so A = 0 or 1.
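As a quick check (a sketch, assuming the sympy library), we can solve the pair A = A² - B², B = 2AB symbolically and confirm that the only real solutions are (A, B) = (0, 0) and (1, 0):

```python
from sympy import symbols, solve

A, B = symbols('A B')
# f(0) = A, g(0) = B must satisfy the x = y = 0 instances of (1) and (2)
sols = solve([A**2 - B**2 - A, 2*A*B - B], [A, B])
# discard the complex solutions (A, B) = (1/2, ±i/2)
real = [s for s in sols if all(v.is_real for v in s)]
print(real)
```

The complex pair (1/2, ±i/2) is exactly the A = 1/2 branch that the sign argument above eliminates.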

Putting y = 0 in (2) gives g(x) = f(x)B + g(x)A = Ag(x) (using B = 0). So A ≠ 0, for otherwise g would be identically zero, contradicting that g is non-constant. Hence A = 1, i.e. f(0) = 1 and g(0) = 0.

We now use the differentiability of f and g. Let g'(0) = C. Since f(0) = 1, f '(0) = 0, g(0) = 0, differentiability at 0 gives f(δx) = 1 + o(δx), g(δx) = Cδx + o(δx). Hence, using (1), f(x + δx) = f(x) f(δx) - g(x) g(δx) = f(x) (1 + o(δx)) - g(x) (C δx + o(δx) ). So f '(x) = - C g(x). Similarly, from (2), we get g'(x) = C f(x). So the derivative of f(x)² + g(x)² is 2f(x)f '(x) + 2g(x)g'(x) = -2Cf(x)g(x) + 2Cf(x)g(x) = 0 for all x, and hence f(x)² + g(x)² = f(0)² + g(0)² = 1 for all x.
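The identities can be sanity-checked numerically for the motivating solution f = cos, g = sin (here C = 1); this is only an illustration of the three relations and the conclusion, not part of the proof:

```python
import math
import random

# f = cos, g = sin satisfy (1), (2), (3) with C = 1, and f² + g² = 1.
f, g = math.cos, math.sin
random.seed(0)
for _ in range(1000):
    x, y = random.uniform(-10, 10), random.uniform(-10, 10)
    assert abs(f(x + y) - (f(x)*f(y) - g(x)*g(y))) < 1e-9   # relation (1)
    assert abs(g(x + y) - (f(x)*g(y) + g(x)*f(y))) < 1e-9   # relation (2)
    assert abs(f(x)**2 + g(x)**2 - 1) < 1e-12               # the conclusion
```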

Comment. The relations given are modeled on those for cos and sin, and the proof above shows that these are essentially the only solutions. [Since f '' = -Cg' = -C²f with f(0) = 1 and f '(0) = 0, we get f(x) = cos(Cx), and then g(x) = sin(Cx) from g' = Cf, g(0) = 0. Note that C ≠ 0, since C = 0 would make f and g constant.]
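The bracketed claim can be checked symbolically (a sketch, assuming sympy; C is taken positive so the oscillatory form of the general solution is produced): solving the initial value problem f '' = -C²f, f(0) = 1, f '(0) = 0 recovers cos(Cx).

```python
from sympy import Function, dsolve, symbols, Eq, cos, simplify

x = symbols('x', real=True)
C = symbols('C', positive=True)
f = Function('f')

# Solve f'' = -C² f with the initial conditions f(0) = 1, f'(0) = 0
sol = dsolve(Eq(f(x).diff(x, 2), -C**2 * f(x)), f(x),
             ics={f(0): 1, f(x).diff(x).subs(x, 0): 0})
print(sol)
```

With g then determined by g = -f '/C, the pair is forced to be (cos(Cx), sin(Cx)).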