15th Putnam 1955

------
 
 
Problem A3

(a_n) is a monotonically decreasing sequence of positive terms such that ∑ a_n converges. S is the set of all sums ∑ b_n, where (b_n) is a subsequence of (a_n). Show that S is an interval iff a_{n-1} ≤ ∑_{i=n}^∞ a_i for all n.

 

Solution

Fairly hard.

The condition is certainly necessary. For suppose a_{n-1} = k_2 > k_3 = ∑_{i=n}^∞ a_i. Let k_1 = ∑_{i=1}^{n-2} a_i. Then k_1 and k_1 + k_2 belong to S, but no number in the non-empty interval (k_1 + k_3, k_1 + k_2) belongs to S: a subsequence containing all of a_1, ... , a_{n-1} sums to at least k_1 + k_2, while a subsequence missing some a_j with j ≤ n-1 sums to at most (k_1 + k_2 + k_3) - a_j ≤ k_1 + k_3, since a_j ≥ a_{n-1} = k_2. So S cannot be an interval.
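
For a concrete illustration (my addition, not part of the original solution): the sequence a_n = 3^{-n} violates the condition, since

    a_{n-1} = 3^{-(n-1)}  but  ∑_{i=n}^∞ 3^{-i} = (3/2)·3^{-n} = (1/2)·3^{-(n-1)} < a_{n-1}.

Indeed the set of its subsums is a scaled copy of the Cantor set, which contains no interval.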

Now assume that the condition holds. Let k = ∑ a_n. We show that S = [0, k]. We get the endpoints by taking the subsequence to be the empty set or the whole sequence. So take h ∈ (0, k). Define b_n to be the earliest member of the sequence not so far chosen such that ∑_{i=1}^n b_i ≤ h. This is always possible, since the partial sums stay strictly below h and a_m → 0. If at any point we get equality, we are home.
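
As an illustration only (not part of the original solution), here is a minimal Python sketch of this greedy construction. The sequence a_n = 2^{-n}, the truncation length, and the target h = 0.3 are my own hypothetical choices; a_n = 2^{-n} satisfies the condition with equality. A single forward pass implements "earliest member not so far chosen", because the running sum only grows, so a term that once fails to fit never fits later.

    # Sketch: greedily take each term iff it keeps the running sum <= h.
    # One forward pass suffices: the running sum never decreases, so a
    # term rejected once can never fit at a later step.
    def greedy_subsequence(a, h):
        chosen, total = [], 0.0
        for i, term in enumerate(a):
            if total + term <= h:
                chosen.append(i)
                total += term
        return chosen, total

    # a_n = 1/2^n, truncated to 50 terms; h = 0.3 is an arbitrary target.
    a = [2.0 ** -(n + 1) for n in range(50)]
    indices, total = greedy_subsequence(a, 0.3)
    print(total)    # ~0.3: the greedy total converges to h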

So assume that the resulting {b_n} is infinite. Clearly ∑ b_n ≤ h. If {b_n} is missing infinitely many members of {a_n}, then given any ε > 0, we can find a_m < ε missing from the subsequence. But that means that for some n we rejected a_m because b_1 + b_2 + ... + b_n + a_m > h. So ∑ b_i ≥ b_1 + b_2 + ... + b_n > h - a_m > h - ε. Since ε was arbitrary, ∑ b_i = h.
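
Continuing the sketch above (again my illustration, reusing greedy_subsequence): with a_n = 2^{-n} and h = 0.3 the greedy skips infinitely many terms, since the binary expansion of 0.3 has infinitely many zero digits, yet the shortfall h - ∑ b_i vanishes as smaller terms become available, exactly as the ε-argument predicts.

    # Reuses greedy_subsequence from the sketch above.  The gap between
    # h and the greedy total shrinks roughly like 2^-N as more terms of
    # a_n = 1/2^n are supplied.
    for N in (10, 20, 40):
        a = [2.0 ** -(n + 1) for n in range(N)]
        _, total = greedy_subsequence(a, 0.3)
        print(N, 0.3 - total)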

It remains to consider the case where only finitely many members are missing from the subsequence. Let a_m be the missing member of largest index, so that every a_i with i > m belongs to the subsequence. Since a_m was never chosen, for some n we have b_1 + b_2 + ... + b_n + a_m > h. Moreover, while a_m still fitted, the greedy rule always chose a term of smaller index, so b_1, ... , b_n all have index less than m; hence b_1, ... , b_n and a_{m+1}, a_{m+2}, ... are distinct members of the subsequence, giving b_1 + b_2 + ... + b_n + ∑_{i=m+1}^∞ a_i ≤ ∑ b_i ≤ h. But the condition gives a_m ≤ ∑_{i=m+1}^∞ a_i. So we have a contradiction and this case cannot occur.

 


 


© John Scholes
jscholes@kalva.demon.co.uk
26 Nov 1999