A, B, C, D are n x n matrices with entries in some field F. The transpose of a matrix X is denoted as X' (defined as X'ij = Xji). Given that A B' and C D' are symmetric, and A D' - B C' = 1, prove that A'D - C'B = 1.
Moderately hard, unless you have seen something similar before.
Notice first three obvious facts: X is symmetric iff X = X'; X'' = X; and (XY)' = Y'X'. So we are given that AB' = BA', CD' = DC', and AD' - BC' = 1. A little playing around shows that the result does not follow trivially, so we need some additional piece.
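These transpose identities are easy to confirm numerically. A quick sanity check (an illustration of mine, not part of the original solution) using random integer matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.integers(-5, 5, size=(3, 3))
Y = rng.integers(-5, 5, size=(3, 3))

# X'' = X: transposing twice returns the original matrix
assert np.array_equal(X.T.T, X)

# (XY)' = Y'X': the transpose of a product reverses the factors
assert np.array_equal((X @ Y).T, Y.T @ X.T)

# X is symmetric iff X = X'; for instance X + X' is always symmetric
S = X + X.T
assert np.array_equal(S, S.T)
```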
The commonest non-trivial fact about matrices is that one-sided inverses are two-sided inverses. In other words, if XY = 1, then YX = 1. But it is not immediately obvious how to use it.
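The two-sidedness of inverses for square matrices can be illustrated with a small integer example (my own; any unimodular matrix would do):

```python
import numpy as np

# A 2 x 2 integer matrix with determinant 1, so its inverse is also integral
X = np.array([[2, 1],
              [1, 1]])
Y = np.array([[ 1, -1],
              [-1,  2]])
I = np.eye(2, dtype=int)

# XY = 1 ...
assert np.array_equal(X @ Y, I)
# ... forces YX = 1 as well: a one-sided inverse of a square matrix is two-sided
assert np.array_equal(Y @ X, I)
```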
Playing around may lead us to write AB' = BA' as AB' - BA' = 0. Further playing around might lead us to take transposes of AD' - BC' = 1 to get DA' - CB' = 1. Some inspiration might lead us to notice that we can write the relations as:
AD' - BC' = 1
AB' - BA' = 0
CD' - DC' = 0
-CB' + DA' = 1
If we are lucky these formulae may remind us of 2 x 2 matrix multiplication (the only obstacle to instant recognition is the minus signs). It is a well-known trick that one can group matrix entries into blocks and multiply blockwise to get the matrix product (the proof is trivial). So we can summarise the four equations as: (A, B; C, D) (D', -B'; -C', A') = 1, where (A, B; C, D) denotes the 2n x 2n matrix with upper left n x n block A, upper right block B, lower left block C, and lower right block D.
Now we can use the fact noted at the start to get (D', -B'; -C', A') (A, B; C, D) = 1, and hence four more relations, including, from the bottom right blocks, the required -C'B + A'D = 1, i.e. A'D - C'B = 1.
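The whole argument can be checked numerically. Below is a sketch of mine (the construction of A, B, C, D from two symmetric matrices S1, S2 is my own device for producing matrices that satisfy the hypotheses; it is not part of the solution):

```python
import numpy as np

n = 3
rng = np.random.default_rng(1)

def random_symmetric(rng, n):
    # M + M' is symmetric for any square M
    M = rng.integers(-3, 4, size=(n, n))
    return M + M.T

S1 = random_symmetric(rng, n)
S2 = random_symmetric(rng, n)
I = np.eye(n, dtype=int)

# Blocks chosen so that the hypotheses hold (hypothetical construction)
A = I + S1 @ S2
B = S1
C = S2
D = I

# Hypotheses: AB' and CD' are symmetric, and AD' - BC' = 1
assert np.array_equal(A @ B.T, (A @ B.T).T)
assert np.array_equal(C @ D.T, (C @ D.T).T)
assert np.array_equal(A @ D.T - B @ C.T, I)

# The block-matrix identity: (A, B; C, D)(D', -B'; -C', A') = 1 (size 2n)
M = np.block([[A, B], [C, D]])
N = np.block([[D.T, -B.T], [-C.T, A.T]])
assert np.array_equal(M @ N, np.eye(2 * n, dtype=int))

# One-sided inverses are two-sided, so NM = 1 too; its bottom right
# block gives the conclusion -C'B + A'D = 1
assert np.array_equal(N @ M, np.eye(2 * n, dtype=int))
assert np.array_equal(A.T @ D - C.T @ B, I)
```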
47th Putnam 1986
© John Scholes
30 Sep 1999