Given non-negative variance values $\sigma_{ij}^2$ that sum to $1$, consider a random matrix whose entries are independent Gaussians with mean zero and variance equal to $\sigma_{ij}^2$. For which choice of the variances $\sigma_{ij}^2$ is the expected value of the average singular value of that matrix maximized?
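As a small numerical illustration of the question (not an answer!), here is a pure-Python Monte Carlo sketch for the $2\times 2$ case, comparing a uniform variance profile against one that puts all the mass on a single entry. The closed-form singular values of a $2\times 2$ matrix keep it dependency-free; the two profiles chosen are just hypothetical examples.

```python
import math
import random

def singular_values_2x2(a, b, c, d):
    # Closed-form singular values of [[a, b], [c, d]], obtained from the
    # eigenvalues of M^T M: (s1 +/- sqrt(s1^2 - 4 det^2)) / 2.
    s1 = a*a + b*b + c*c + d*d
    disc = math.sqrt(max(s1*s1 - 4*(a*d - b*c)**2, 0.0))
    return math.sqrt((s1 + disc) / 2), math.sqrt(max((s1 - disc) / 2, 0.0))

def avg_singular_value(variances, trials=20000, seed=0):
    # Monte Carlo estimate of E[(sigma_1 + sigma_2)/2] for a 2x2 Gaussian
    # matrix with entrywise variances (v11, v12, v21, v22).
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        a, b, c, d = (rng.gauss(0.0, math.sqrt(v)) for v in variances)
        s, t = singular_values_2x2(a, b, c, d)
        total += (s + t) / 2
    return total / trials

# Two variance profiles summing to 1: spread uniformly vs. all mass on one entry.
uniform = avg_singular_value((0.25, 0.25, 0.25, 0.25))
spiked = avg_singular_value((1.0, 0.0, 0.0, 0.0))
print(uniform, spiked)
```

For the spiked profile the matrix is rank one, so the average singular value is $|g|/2$ for a standard Gaussian $g$, with expectation $\sqrt{2/\pi}/2 \approx 0.40$; the uniform profile does noticeably better in this little experiment.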
Happy holidays! Afonso asked me a while back to write a guest blog post, and I figured this would make a decent Christmas gift. Just to quickly introduce myself, I did some undergraduate mathematical research at Princeton University with Afonso. Today’s post is on some findings in this paper (to appear in the ITCS 2014 conference proceedings), which is joint work with Afonso, Amit Singer, and Moses Charikar from Princeton.
To align a pair of rotated images on the unit disc, one simply measures the similarity of the two images under different relative rotations and picks the best one (using, say, cross-correlation in signal space, or phase correlation in frequency space). How about multiple noisy, rotated images? For a fixed noise distribution and sufficiently many observations, we certainly expect to be able to recover the underlying template. The main difficulty is that nothing is known about the rotations or about the underlying image itself. We describe an approach to this multiple-observation alignment problem using theory developed for the Unique Games problem.
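As a one-dimensional analogue of the pairwise step, here is a sketch (with a hypothetical template and shift) that aligns a noisy, circularly shifted copy of a signal by brute-force cross-correlation, the discrete stand-in for rotating an image on the disc:

```python
import math
import random

def circular_shift(signal, s):
    # Rotate a 1-D signal by s samples (the analogue of rotating an image).
    return signal[-s:] + signal[:-s]

def align_by_cross_correlation(template, observation):
    # Score every candidate shift by its cross-correlation with the
    # observation and return the best one.
    n = len(template)
    best_shift, best_score = 0, float("-inf")
    for s in range(n):
        candidate = circular_shift(template, s)
        score = sum(c * o for c, o in zip(candidate, observation))
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift

rng = random.Random(1)
n = 64
template = [math.exp(-((i - n / 2) ** 2) / 20.0) for i in range(n)]
true_shift = 17
observation = [v + rng.gauss(0, 0.05)
               for v in circular_shift(template, true_shift)]
print(align_by_cross_correlation(template, observation))
```

In practice the brute-force search would be replaced by an FFT-based correlation, and the hard part — the subject of the post — is combining many noisy pairwise alignments consistently.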
Today’s post is about a recent paper of mine with Christopher Kennedy and Amit Singer entitled “Approximating the Little Grothendieck Problem over the Orthogonal and Unitary Groups”. Earlier this week, we finished a major revision to the paper (even the title was slightly changed) and I decided to take the opportunity to blog about it (in a previous blog post I blogged about an open problem that is raised by this work).
— Generalized orthogonal Procrustes Problem —
In this post I will motivate the problem using the generalized orthogonal Procrustes problem: Given $n$ point clouds in $\mathbb{R}^d$ of $k$ points each, the generalized orthogonal Procrustes problem consists of finding $n$ orthogonal transformations that best simultaneously align the point clouds. If the points of the $i$-th cloud are represented as the columns of a matrix $A_i \in \mathbb{R}^{d\times k}$, where $i = 1,\dots,n$, then the orthogonal Procrustes problem consists of solving

$$\min_{O_1,\dots,O_n \in \mathcal{O}(d)} \ \sum_{i<j} \left\| O_i A_i - O_j A_j \right\|_F^2. \qquad (1)$$
Since $\|O_i A_i\|_F^2 = \|A_i\|_F^2$ does not depend on the choice of the $O_i$'s, (1) has the same solution as the complementary version of the problem

$$\max_{O_1,\dots,O_n \in \mathcal{O}(d)} \ \sum_{i,j=1}^{n} \operatorname{tr}\left( C_{ij}\, O_j^T O_i \right),$$
where $C$ is the $dn \times dn$ matrix whose $d\times d$ blocks are given by $C_{ij} = A_i A_j^T$. Note that $C$ is positive semidefinite, since $C = \bar{A}\bar{A}^T$ for the matrix $\bar{A}$ obtained by stacking the $A_i$'s.
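As a sanity check on the smallest instance ($n = 2$ clouds in the plane, restricted to rotations for simplicity), the optimal alignment has a closed form: minimizing $\|R A - B\|_F^2$ is the same as maximizing $\operatorname{tr}(R\,M)$ with $M = AB^T$, which for $R = R(\theta)$ gives $\theta = \operatorname{atan2}(M_{01} - M_{10},\, M_{00} + M_{11})$. (The general $\mathcal{O}(d)$ two-cloud case would use an SVD of $M$; this planar shortcut is just for illustration, with made-up points.)

```python
import math

def best_rotation_angle(A, B):
    # A, B: lists of 2-D points (the columns of the matrices in the post).
    # Maximizes tr(R(theta) * A * B^T), i.e. minimizes ||R A - B||_F^2
    # over rotations R(theta): the two-cloud Procrustes problem.
    m00 = sum(a[0] * b[0] for a, b in zip(A, B))
    m11 = sum(a[1] * b[1] for a, b in zip(A, B))
    m01 = sum(a[0] * b[1] for a, b in zip(A, B))
    m10 = sum(a[1] * b[0] for a, b in zip(A, B))
    # tr(R M) = cos(theta)(m00 + m11) + sin(theta)(m01 - m10).
    return math.atan2(m01 - m10, m00 + m11)

# Rotate a toy cloud by a known angle and recover that angle.
A = [(1.0, 0.0), (0.0, 2.0), (-1.0, 1.0)]
theta0 = 0.7
B = [(math.cos(theta0) * x - math.sin(theta0) * y,
      math.sin(theta0) * x + math.cos(theta0) * y) for x, y in A]
theta_hat = best_rotation_angle(A, B)
print(theta_hat)
```

With more than two clouds the pairwise trick no longer composes consistently, which is exactly why the quadratic objective over $C$ above (and its SDP relaxation) enters the picture.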
A couple of weeks ago I gave a talk at the PACM Graduate Student Seminar where I showed a neat, simple proof of exact recovery for Synchronization in the presence of random outliers. Although this particular argument is not available in the literature (yet – it will appear as part of a future publication; keep an eye on this blog!), the result has been shown before by Lanhui Wang and Amit Singer with a different technique.
Synchronization over $\mathbb{Z}_2$ on a complete graph
Recover, up to a global sign, $x \in \{\pm 1\}^n$, given noisy relative measurements $Y_{ij}$ of $x_i x_j$.
The problem is essentially the same as the one described in my last post, with the difference that it is over $\mathbb{Z}_2$ and not over $\mathcal{O}(d)$. However, while in the Cheeger inequality for the Connection Laplacian paper and post we showed stability in the presence of adversarial noise, here we will show that if the noise is random one can achieve, with high probability, exact recovery.
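To make the setting concrete, here is a toy pure-Python sketch of the random outlier model, recovered with a spectral heuristic (power iteration plus entrywise sign rounding) rather than the argument from the talk; the sizes and outlier probability are made up for illustration.

```python
import random

def synchronize_z2(Y, iters=200, seed=0):
    # Approximate the leading eigenvector of Y by power iteration, then
    # round its entries to signs; this recovers x up to a global sign flip.
    n = len(Y)
    rng = random.Random(seed)
    v = [rng.gauss(0, 1) for _ in range(n)]
    for _ in range(iters):
        w = [sum(Y[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(t * t for t in w) ** 0.5
        v = [t / norm for t in w]
    return [1 if t >= 0 else -1 for t in v]

rng = random.Random(42)
n = 60
x = [rng.choice([-1, 1]) for _ in range(n)]
p = 0.1  # each off-diagonal measurement is an outlier with probability p
Y = [[0] * n for _ in range(n)]
for i in range(n):
    for j in range(i + 1, n):
        good = rng.random() > p
        Y[i][j] = Y[j][i] = x[i] * x[j] if good else rng.choice([-1, 1])

recovered = synchronize_z2(Y)
exact = (recovered == x) or ([-s for s in recovered] == x)
print(exact)
```

With a small outlier probability the clean part $xx^T$ dominates the spectrum of $Y$, so the rounded eigenvector typically matches $x$ on every coordinate; the point of the post is to prove such exact recovery rigorously (for an SDP, with high probability).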
I plan to write a few posts about approximation algorithms. I decided to start with my favorite one, and the first non-trivial one I read about: the SDP relaxation for Max-Cut. This post will be about this particular approximation algorithm and the proof of its approximation guarantee.
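As a preview of the rounding step, here is a small sketch of Goemans–Williamson random-hyperplane rounding on the 5-cycle. Rather than calling an SDP solver, it hard-codes a circle embedding of $C_5$ with adjacent vectors at angle $4\pi/5$ (assumed optimal for this toy instance); everything else is illustrative.

```python
import math
import random

# Vertices of the 5-cycle, with edges (i, i+1 mod 5).
edges = [(i, (i + 1) % 5) for i in range(5)]

# Unit vectors on the circle at angles 4*pi*i/5, so adjacent vectors make
# an angle of 4*pi/5. In general these vectors come from solving the SDP.
vectors = [(math.cos(4 * math.pi * i / 5), math.sin(4 * math.pi * i / 5))
           for i in range(5)]

def round_with_random_hyperplane(vectors, rng):
    # Goemans-Williamson rounding: draw a random direction g and assign
    # each vertex the sign of <g, v_i>.
    g = (rng.gauss(0, 1), rng.gauss(0, 1))
    return [1 if g[0] * vx + g[1] * vy >= 0 else -1 for vx, vy in vectors]

rng = random.Random(0)
trials = 4000
total = 0
for _ in range(trials):
    signs = round_with_random_hyperplane(vectors, rng)
    total += sum(1 for i, j in edges if signs[i] != signs[j])
avg_cut = total / trials
print(avg_cut)
```

Each edge is cut with probability $\theta/\pi = (4\pi/5)/\pi = 0.8$, so the expected rounded cut is $0.8 \cdot 5 = 4$, which happens to equal the max cut of $C_5$; the guarantee proved in the post is that this ratio is never worse than about $0.878$ times the SDP value.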