Today’s post is about a recent paper of mine with Christopher Kennedy and Amit Singer entitled “Approximating the Little Grothendieck Problem over the Orthogonal and Unitary Groups”. Earlier this week we finished a major revision of the paper (even the title changed slightly), and I decided to take the opportunity to blog about it (in a previous blog post I discussed an open problem raised by this work).
— Generalized orthogonal Procrustes Problem —
In this post I will motivate the problem using the orthogonal Procrustes problem: given n point clouds in \mathbb{R}^d of k points each, the generalized orthogonal Procrustes problem consists of finding n orthogonal transformations that best simultaneously align the point clouds. If the points are represented as the columns of matrices A_1,\dots,A_n \in \mathbb{R}^{d\times k}, then the orthogonal Procrustes problem consists of solving

\min_{O_1,\dots,O_n \in \mathcal{O}_d} \sum_{i<j} \| O_i A_i - O_j A_j \|_F^2. \qquad (1)

Since \| O_i A_i - O_j A_j \|_F^2 = \|A_i\|_F^2 + \|A_j\|_F^2 - 2\,\mathrm{tr}\!\left( (A_i A_j^T)^T O_i^T O_j \right), (1) has the same solution as the complementary version of the problem

\max_{O_1,\dots,O_n \in \mathcal{O}_d} \sum_{i,j=1}^n \mathrm{tr}\!\left( C_{ij}^T O_i^T O_j \right), \qquad (2)

where C \in \mathbb{R}^{dn\times dn} has d\times d blocks given by C_{ij} = A_i A_j^T. Note that C is positive semidefinite, since it is of the form C = AA^T for the matrix A \in \mathbb{R}^{dn\times k} obtained by stacking A_1,\dots,A_n.

In fact, we will focus on problems of the form (2), for any positive semidefinite C (which encodes a few other problems, see more here).
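To make the reduction concrete, here is a minimal numpy sketch (the function and variable names are mine, not from the paper) that builds the block matrix C from point clouds A_1,\dots,A_n and evaluates the objective of (2) for a given collection of orthogonal transformations:

import numpy as np

def build_C(A_list):
    # Stack the A_i (each d x k) vertically and form C = A A^T,
    # whose (i, j) block is A_i A_j^T; C is positive semidefinite by construction.
    A = np.vstack(A_list)
    return A @ A.T

def procrustes_objective(C, O_list, d):
    # Evaluates sum_{i,j} tr(C_{ij}^T O_i^T O_j) for d x d orthogonal matrices O_i.
    n = len(O_list)
    total = 0.0
    for i in range(n):
        for j in range(n):
            C_ij = C[i*d:(i+1)*d, j*d:(j+1)*d]
            total += np.trace(C_ij.T @ O_list[i].T @ O_list[j])
    return total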
— The little Grothendieck problem (d=1) —
For d = 1, (2) reduces to the well-studied little Grothendieck problem in combinatorial optimization (see this reference)

\max_{x_1,\dots,x_n \in \{\pm 1\}} \sum_{i,j=1}^n C_{ij}\, x_i x_j, \qquad (3)

where C \in \mathbb{R}^{n\times n} is a positive semidefinite matrix.
Problem (3) is known to be NP-hard (if C is a Laplacian matrix of a graph then (3) reduces to the Max-Cut problem). In a previous blog post I briefly described a semidefinite relaxation for (3) proposed by Goemans and Williamson in the context of the Max-Cut problem:

\max_{x_1,\dots,x_n \in \mathbb{R}^n,\ \|x_i\| = 1} \sum_{i,j=1}^n C_{ij}\, x_i^T x_j. \qquad (4)

Note that (4) is equivalent to the semidefinite program

\max_{X \in \mathbb{R}^{n\times n}} \mathrm{tr}(CX) \quad \text{subject to} \quad X \succeq 0,\ X_{ii} = 1 \text{ for all } i, \qquad (5)

and x_1,\dots,x_n can be recovered from X by the Cholesky decomposition, and so (4) can be solved, up to arbitrary precision, in polynomial time.
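As an illustration (this is not code from the paper), the semidefinite program (5) takes only a few lines of cvxpy, assuming an SDP-capable solver is installed; the unit vectors x_i are then read off from a factorization of the optimal X:

import numpy as np
import cvxpy as cp

def solve_little_grothendieck_sdp(C):
    # SDP (5): maximize tr(CX) subject to X being PSD with unit diagonal.
    n = C.shape[0]
    X = cp.Variable((n, n), PSD=True)
    prob = cp.Problem(cp.Maximize(cp.trace(C @ X)), [cp.diag(X) == 1])
    prob.solve()
    # Recover unit vectors x_i as the columns of a factor V with V^T V = X
    # (an eigendecomposition is used instead of Cholesky to tolerate rank deficiency).
    w, U = np.linalg.eigh(X.value)
    V = (U * np.sqrt(np.clip(w, 0, None))).T
    return V   # column V[:, i] is (numerically) the vector x_i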
In the same blog post I explained how a simple rounding technique (to compute a solution to (3) from a solution x_1,\dots,x_n to (4)) is guaranteed to produce a solution whose objective value is, in expectation, at least 0.878 of the optimum. It turns out that one can show an approximation ratio of 2/\pi \approx 0.637 (instead of 0.878) for the general positive semidefinite case using the same relaxation. This implies, in particular, that the value of (3) can never be smaller than 2/\pi times the value of (4). Interestingly, this was already known from the influential work of Grothendieck on norms of tensor products of Banach spaces.
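For concreteness, the rounding technique mentioned above amounts to projecting the vectors x_i onto a random Gaussian direction and keeping only the signs; a minimal sketch, reusing the factor V returned by the previous snippet (its columns are the x_i):

import numpy as np

def round_to_signs(V, seed=None):
    # Project the unit vectors x_i (the columns of V) onto a random Gaussian
    # direction and keep the signs, producing a feasible point of (3) in {-1, +1}^n.
    rng = np.random.default_rng(seed)
    g = rng.standard_normal(V.shape[0])
    x = np.sign(g @ V)
    x[x == 0] = 1.0   # ties occur with probability zero; break them arbitrarily
    return x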
— Little Grothendieck problem over the Orthogonal Group —
In our paper we propose, and analyze, a natural semidefinite relaxation of problem (2). It is a natural generalization of the d = 1 case:

\max_{X_1,\dots,X_n \in \mathbb{R}^{dn\times d},\ X_i^T X_i = I_{d\times d}} \sum_{i,j=1}^n \mathrm{tr}\!\left( C_{ij}^T X_i^T X_j \right). \qquad (6)

Just as in the unidimensional case, (6) is equivalent to a semidefinite program

\max_{G \in \mathbb{R}^{dn\times dn}} \mathrm{tr}(CG) \quad \text{subject to} \quad G \succeq 0,\ G_{ii} = I_{d\times d} \text{ for all } i, \qquad (7)

where G is the Gram matrix with d\times d blocks G_{ij} = X_i^T X_j.
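In code, (7) differs from (5) only in that the d\times d diagonal blocks, rather than the diagonal entries, are constrained to be identity matrices; a hedged cvxpy sketch (again with names of my own choosing):

import numpy as np
import cvxpy as cp

def solve_orthogonal_relaxation_sdp(C, d):
    # SDP (7): maximize tr(CG) with G PSD and every d x d diagonal block equal to I_d.
    nd = C.shape[0]
    n = nd // d
    G = cp.Variable((nd, nd), PSD=True)
    constraints = [G[i*d:(i+1)*d, i*d:(i+1)*d] == np.eye(d) for i in range(n)]
    prob = cp.Problem(cp.Maximize(cp.trace(C @ G)), constraints)
    prob.solve()
    return G.value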
We propose the following approximation algorithm.

Algorithm: Compute X_1,\dots,X_n \in \mathbb{R}^{dn\times d}, a solution to (6). Let R \in \mathbb{R}^{dn\times d} be a random matrix with i.i.d. Gaussian entries. Compute V_1,\dots,V_n \in \mathcal{O}_d, a solution to (2), as

V_i = \mathcal{P}\!\left( X_i^T R \right),

where \mathcal{P}(Y) := Y \left( Y^T Y \right)^{-1/2} is the polar component of Y.
Note that the Polar Decomposition can be efficiently computed via the Singular Value Decomposition: if Y = U \Sigma W^T is the SVD of Y, then \mathcal{P}(Y) = U W^T.
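Putting the two previous displays together, the rounding step can be sketched in numpy as follows; here G is an optimal point of (7), Y is any factor with Y^T Y = G (so that its i-th block of d columns plays the role of X_i), and the Gaussian scale 1/d is chosen only to match the normalization of the constant defined below, since the polar component is insensitive to scaling:

import numpy as np

def polar(Y):
    # Polar component of Y via the SVD: if Y = U S W^T, then P(Y) = U W^T.
    U, _, Wt = np.linalg.svd(Y)
    return U @ Wt

def round_to_orthogonal(G, d, seed=None):
    # Factor G = Y^T Y; the i-th block of d columns of Y plays the role of X_i in (6).
    rng = np.random.default_rng(seed)
    nd = G.shape[0]
    n = nd // d
    w, U = np.linalg.eigh(G)
    Y = (U * np.sqrt(np.clip(w, 0, None))).T        # Y^T Y = G
    R = rng.standard_normal((nd, d)) / np.sqrt(d)   # i.i.d. N(0, 1/d) entries
    return [polar(Y[:, i*d:(i+1)*d].T @ R) for i in range(n)]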
One of the main contributions of the paper is showing that this algorithm gives a constant factor approximation of \alpha_{\mathbb{R}}(d)^2 to (2), where \alpha_{\mathbb{R}}(d) is defined as

\alpha_{\mathbb{R}}(d) := \mathbb{E}\left[ \frac{1}{d} \sum_{j=1}^d \sigma_j(G) \right], \qquad (8)

where G \in \mathbb{R}^{d\times d} is a Gaussian random matrix with i.i.d. entries \mathcal{N}(0, 1/d) and \sigma_j(G) is the j-th singular value of G.
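This constant is easy to estimate by Monte Carlo; a quick numpy sketch (for d = 1 the estimate should be close to \sqrt{2/\pi} \approx 0.7979, consistent with the 2/\pi ratio above):

import numpy as np

def estimate_alpha(d, trials=20000, seed=0):
    # Monte Carlo estimate of alpha_R(d): the expected average singular value
    # of a d x d Gaussian matrix with i.i.d. N(0, 1/d) entries, as in (8).
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(trials):
        G = rng.standard_normal((d, d)) / np.sqrt(d)
        total += np.linalg.svd(G, compute_uv=False).mean()
    return total / trials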
Theorem 1. Let C \in \mathbb{R}^{dn\times dn} be positive semidefinite. Let V_1,\dots,V_n \in \mathcal{O}_d be the (random) output of the algorithm above. Then

\mathbb{E}\left[ \sum_{i,j=1}^n \mathrm{tr}\!\left( C_{ij}^T V_i^T V_j \right) \right] \ \geq\ \alpha_{\mathbb{R}}(d)^2 \max_{O_1,\dots,O_n \in \mathcal{O}_d} \sum_{i,j=1}^n \mathrm{tr}\!\left( C_{ij}^T O_i^T O_j \right),

where \alpha_{\mathbb{R}}(d) is the constant defined in (8).
Moreover, we can also show a matching integrality gap, meaning that this constant is optimal. I don’t want to go into this in much detail, but it essentially means that no other algorithm based on this natural semidefinite relaxation can have a better approximation ratio. This does not rule out the existence of a different efficient approximation algorithm that achieves a better ratio, but in this sort of problem it tends to be the case that beating the approximation ratio of the natural semidefinite relaxation is hard.
One of the interesting questions raised by this work is the behavior of \alpha_{\mathbb{R}}(d)^2. This was the subject of a previous blog post and I won’t discuss it much here. In a nutshell, we can give good lower bounds for this constant (which, in particular, show that our approximation ratio is larger than the best previously known one, for all dimensions d) and conjecture that it is monotonically increasing with d. I find it quite counter-intuitive that problem (2) should get easier to approximate as the dimension d increases! Perhaps even more strange is that if we do all of the above working over \mathbb{C}, considering the unitary group \mathcal{U}_d, the analogous constant \alpha_{\mathbb{C}}(d)^2 seems to be decreasing with the dimension instead! We discussed this strange phenomenon in this blog post.