An interesting problem by Mallat and Zeitouni

Last week, Ramon van Handel showed me a really cool open problem that I want to share with you today. The problem was posed by Stephane Mallat and Ofer Zeitouni in a nice short note available here.

The problem is remarkably simple to pose: Let {g} be a Gaussian random vector in {\mathbb{R}^N} with a known covariance matrix {\Sigma}, and let {M<N}. Now, for any orthonormal basis {\mathcal{Q}} of {\mathbb{R}^N}, consider the following random variable: given a draw of the random vector {g}, take the largest {\ell_2} norm among the projections of {g} onto subspaces spanned by {M} elements of the basis {\mathcal{Q}}. The question is:

What is the basis {\mathcal{Q}} that maximizes the expected value of this random variable?
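To make the quantity concrete, here is a minimal Monte Carlo sketch (my own illustration, not from the note; the function names are mine). Since {\mathcal{Q}} is orthonormal, the best {M} basis elements for a given draw are simply the ones carrying the largest coefficients in magnitude:

```python
import numpy as np

def best_projection_norm(g, Q, M):
    """For a draw g, the largest l2 norm of the projection of g onto a
    span of M columns of the orthonormal basis Q.  With Q orthonormal,
    the optimal M columns are those with the largest |<g, q_i>|."""
    c = Q.T @ g                      # coefficients of g in the basis Q
    top = np.sort(np.abs(c))[-M:]    # M largest coefficients in magnitude
    return np.sqrt(np.sum(top ** 2))

def expected_value(Sigma, Q, M, n_samples=20000, seed=0):
    """Monte Carlo estimate of the expected value of the random variable."""
    rng = np.random.default_rng(seed)
    N = Sigma.shape[0]
    G = rng.multivariate_normal(np.zeros(N), Sigma, size=n_samples)
    return np.mean([best_projection_norm(g, Q, M) for g in G])
```

For {M=N} the random variable is just {\|g\|_2}, and for {M=1} it is the largest absolute coefficient of {g} in the basis.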

To help build some intuition, suppose that one had to choose the projection before seeing the outcome of the Gaussian vector. In other words, what is the {M}-dimensional subspace that maximizes the expected value of the {\ell_2} norm of the projection of the random vector? The answer is easily seen to be the subspace spanned by the {M} leading eigenvectors of {\Sigma}. This means that, in this case, the optimal basis is the one that diagonalizes {\Sigma}, which is essentially the premise of principal component analysis.
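The non-adaptive claim is easy to check numerically; here is a quick sketch (the spectrum below is an assumption of mine for illustration). With a diagonal {\Sigma} and {M=1}, projecting onto coordinate {i} gives expected norm {\sigma_i \sqrt{2/\pi}}, so the leading coordinate dominates:

```python
import numpy as np

# Non-adaptive setting: the subspace is fixed before seeing g.
# With a diagonal Sigma (assumed spectrum below, M = 1), projecting onto
# coordinate i gives expected norm E|g_i| = sigma_i * sqrt(2/pi),
# which is largest for the leading variance.
rng = np.random.default_rng(0)
variances = np.array([9.0, 1.0, 0.01])
G = rng.normal(size=(50000, 3)) * np.sqrt(variances)

ests = [np.mean(np.abs(G[:, i])) for i in range(3)]
for i, est in enumerate(ests):
    print(f"coordinate {i}: {est:.3f}")  # the leading coordinate wins
```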

The very believable conjecture posed by Mallat and Zeitouni is that this basis remains optimal in the adaptive setting of the problem. In fact, they prove this for {M=1} in the same note, but their argument unfortunately does not seem to generalize to {M>1}; see the comment added on page 3.

I think the problem is super interesting: such a simple, natural question, and yet it seems to be quite difficult to resolve.

Looking forward to hearing your thoughts on it!

2 thoughts on “An interesting problem by Mallat and Zeitouni”

    1. Hi Joel, if \Sigma is the identity then every basis has the same performance, since the Gaussian is rotation invariant in that case. Note that one can assume, without loss of generality, that \Sigma is a diagonal matrix.
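The commenter's observation is also easy to verify numerically. A sketch (helper names are mine) comparing the identity basis against a random orthonormal basis when \Sigma is the identity:

```python
import numpy as np

def expected_best_M(Q, M, G):
    """Mean over the draws (rows of G) of the largest l2 norm of a
    projection onto M columns of the orthonormal basis Q."""
    C = G @ Q                                  # coefficients in basis Q
    top = np.sort(np.abs(C), axis=1)[:, -M:]   # M largest per draw
    return np.mean(np.sqrt(np.sum(top ** 2, axis=1)))

rng = np.random.default_rng(1)
N, M = 4, 2
G = rng.normal(size=(50000, N))                    # Sigma = identity
Q_rand, _ = np.linalg.qr(rng.normal(size=(N, N)))  # random orthonormal basis

a = expected_best_M(np.eye(N), M, G)
b = expected_best_M(Q_rand, M, G)
print(a, b)  # nearly equal: a standard Gaussian is rotation invariant
```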
