Linear algebra eigenvalue question

I can’t find anything online that addresses this question, and can’t really figure out whether it’s impossible (or just very hard).

I have an n x n correlation matrix (so it’s symmetric and the main diagonal is all 1’s). All other elements are either 0 or r (you may take r > 0 if it makes the problem easier). The r values are scattered through the matrix (of course, the matrix must remain symmetric). The number of r values is twice a triangular number (so that the arrangement described below is possible).

The question is: what arrangement gives the largest maximum eigenvalue? I think the answer is to make the matrix block diagonal, with the upper-left block all 1’s and r’s and the lower block the identity matrix.

That works out “experimentally”, but I can’t prove it. Any thoughts?
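For what it’s worth, the conjecture is easy to verify by brute force for small cases. Here’s a quick numpy sketch for n = 4 with d = 2 (so three r’s above the diagonal); the values of n and r are just arbitrary picks for illustration:

```python
import itertools
import numpy as np

n, r = 4, 0.5
k = 3  # r's above the diagonal: a triangular number, d(d+1)/2 with d = 2

pairs = list(itertools.combinations(range(n), 2))  # upper-triangle positions

best_val, best_pos = -np.inf, None
for chosen in itertools.combinations(pairs, k):
    M = np.eye(n)
    for i, j in chosen:
        M[i, j] = M[j, i] = r  # keep the matrix symmetric
    lam = np.linalg.eigvalsh(M).max()
    if lam > best_val:
        best_val, best_pos = lam, chosen

# Conjectured optimum: a 3x3 all-1-and-r block plus identity,
# whose largest eigenvalue is 1 + 2r
print(best_val, best_pos)
```

All 20 arrangements get checked, and the winner is indeed the triangular block, with largest eigenvalue 1 + 2r.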

An n x n matrix will have up to n distinct eigenvalues. What are you maximizing?

For a 2x2 matrix as you have described, and assuming 0 < r < 1, the max eigenvalue comes from a matrix of all ones, with lambda = 2.

I think an n x n matrix of all 1’s will have a max eigenvalue of n.

As I said, I’m looking for the largest maximum eigenvalue; that is, I wish to maximize the largest eigenvalue.

And I’m not maximizing by changing r or the number of r’s. r and the number of them are fixed. I’m maximizing by changing the arrangement of the r values. There’s no point in doing this in a 2x2. There is only one arrangement possible.

I think there’s no point in checking a 3x3 either.

In order for anything interesting to be possible, the upper half-triangle needs at least one r and at least one zero. In the 3x3, this means you have either exactly one r or exactly one zero. But if this is the case, you can always transform one arrangement into another via

M’ = P[sup]-1[/sup] M P

where P is a permutation matrix. Clearly M and M’ have the same eigenvalues.
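(That invariance is easy to sanity-check numerically; here’s a minimal sketch, with an arbitrary 3x3 example and an arbitrary permutation. Since P is orthogonal, P[sup]-1[/sup] = P[sup]T[/sup].)

```python
import numpy as np

n, r = 3, 0.4

# A 3x3 correlation matrix with exactly one r above the diagonal
M = np.eye(n)
M[0, 1] = M[1, 0] = r

# A permutation matrix P; for permutation matrices, P^{-1} = P^T
P = np.eye(n)[[2, 0, 1]]
M2 = P.T @ M @ P

# Same spectrum: the rearrangement is just a relabeling of the variables
print(np.linalg.eigvalsh(M), np.linalg.eigvalsh(M2))
```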

So unless I’m saying something foolish – always a possibility! – the simplest case you could get anything meaningful for is a 4x4 problem.

I realize this goes basically nowhere toward answering your question.

I don’t have an answer either, but a suggestion: This may be hard enough that you would be better off asking at either math.stackexchange.com or mathoverflow.net. Just be aware that the latter group takes seriously that the questions asked should be research-level math.

No proof, but a vague intuition of why it’s true. (I think Old Guy already has this but much less vague than mine.)

Maximizing the correlation matrix’s largest eigenvalue is the same as maximizing the expected energy of the principal component (PC) of the associated signal, which itself will be a sort of shadow of one of the largest-normed correlation rows.

Increasing the correlations among the highest-weighted elements of that PC-eigenvector (as OP’s conjectured solution does) will clearly(!) best increase the expected energy of the signal’s principal component! Q.E.D. :slight_smile:

Well, say r is a positive real number. If I understand you correctly, if you start from a matrix which is block diagonal with the upper-left block all 1 and r, and the lower block the identity matrix, its maximal eigenvalue will be 1 + dr, where d is the number of r’s in each row of the block. If you then re-arrange the entries in such a way that each row still has one 1 and at most d r’s, then each real eigenvalue of the resulting matrix will be less than or equal to 1+dr, as can be seen by a coordinate-by-coordinate argument (as in Gershgorin’s circle theorem). Is that sufficient for your purposes?

The problem with this argument (I think) is that the number of r’s above the diagonal in the first matrix is d(d-1)/2 (since there are no zeros in this block). In principle, one could then create a matrix where all of these r’s are in the first row, and Gershgorin’s theorem would then only bound the largest eigenvalue to be less than 1 + r d (d-1)/2. This isn’t a strong enough bound to preclude an eigenvalue greater than 1 + d r.
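To make this concrete, here’s a numpy sketch comparing the two arrangements (n, r, and d are arbitrary picks for illustration). The “star” arrangement piles all k = d(d+1)/2 r’s into the first row and column; its Gershgorin disc gives the loose bound 1 + kr, but its actual largest eigenvalue is only 1 + √k·r, which stays below the block arrangement’s 1 + dr:

```python
import numpy as np

n, r, d = 7, 0.3, 3
k = d * (d + 1) // 2  # r's above the diagonal: 6

# "Star" arrangement: all k r's in the first row (and, by symmetry, first column)
S = np.eye(n)
S[0, 1:k + 1] = S[1:k + 1, 0] = r

# Block arrangement: (d+1)x(d+1) all-1-and-r block, identity elsewhere
B = np.eye(n)
B[:d + 1, :d + 1] = r
np.fill_diagonal(B, 1.0)

gershgorin = 1 + k * r                   # disc around the first diagonal entry of S
star_max = np.linalg.eigvalsh(S).max()   # equals 1 + sqrt(k)*r for a star
block_max = np.linalg.eigvalsh(B).max()  # equals 1 + d*r

print(gershgorin, star_max, block_max)
```

So the star arrangement is not actually a counterexample; it’s just that Gershgorin alone is too weak to rule it out.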

Let me check on this. All the r’s can’t be in the first row because it’s a correlation matrix so it must be symmetric. Only half the r’s could be there.

And I did ask on one of the sites suggested, but haven’t looked back yet.

Actually, that should be d(d+1)/2. But I think my argument still holds: this is a triangular number, corresponding to the number of r’s above the diagonal. Under the rearrangement I propose, you’d have d(d+1)/2 r’s in the first row, and the other d(d+1)/2 r’s in the first column.

Here’s an interesting bound based on Schur’s inequality. Consider the matrix obtained by subtracting the nxn identity matrix from your correlation matrix; in other words, it has zeroes on the diagonal instead of 1’s. Since every vector is an eigenvector of the identity matrix with eigenvalue 1, this new matrix has the same eigenvectors as your original correlation matrix, and the eigenvalues of the new matrix are obtained by subtracting 1 from each of the old eigenvalues.

Now, by Schur’s inequality, we have that

 Σ |λ|[sup]2[/sup] <= Σ |a[sub]ij[/sub]|[sup]2[/sup],

where the sum on the left-hand side is over the n eigenvalues, and the sum on the right is over all n[sup]2[/sup] matrix entries. The new matrix will have d(d+1) entries (i.e., twice a triangular number) that are equal to r, and the rest will be equal to zero. Thus, applying Schur’s inequality we have

 Σ |λ|[sup]2[/sup] <= d(d+1) r[sup]2[/sup].

But since all the terms on the left-hand side are nonnegative, no individual term can exceed the right-hand side. Thus the absolute value of any eigenvalue of our new matrix is bounded above by √(d(d+1)) r.

The block-diagonal matrix you’ve described yields a largest eigenvalue of d*r when transformed in this way, so this constitutes a known lower bound on the largest eigenvalue of all such matrices. There isn’t much “room” between this upper bound and the known lower bound; fractionally, they differ by √(1+1/d), which is relatively small for large d.
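A quick numeric check of all of this (n, r, and d are arbitrary picks; for a real symmetric matrix, Schur’s inequality actually holds with equality):

```python
import numpy as np

n, r, d = 8, 0.25, 3

# Shifted block matrix: (d+1)x(d+1) block of r's off the diagonal, zeros elsewhere
A = np.zeros((n, n))
A[:d + 1, :d + 1] = r
np.fill_diagonal(A, 0.0)

eigs = np.linalg.eigvalsh(A)
frob_sq = np.sum(A**2)               # sum of |a_ij|^2 = d(d+1) r^2

upper = np.sqrt(d * (d + 1)) * r     # Schur bound on any |lambda|
actual = eigs.max()                  # d*r for the block arrangement

print(np.sum(eigs**2), frob_sq)      # equal: Schur with equality (A symmetric)
print(actual, upper, upper / actual) # ratio is sqrt(1 + 1/d)
```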