Friday, January 7, 2011

Approximation Algorithms and Structural Complexity

In Approximation Algorithms, we try to go as far as possible towards constructing good approximation algorithms. In Complexity Theory, people try to go as far as possible towards proving that a problem cannot be approximated.

There is a natural meeting point at the limit of our respective searches: the instances that give us the greatest difficulty in designing approximation algorithms are precisely the instances from which, on the other side of the barrier, people build their lower bounds. Ultimately, it is the same combinatorial structure that frustrates our attempted algorithms and that provides the ingredients for those lower bounds.

Researchers like me, whose specialty is algorithms, work on only one side of the barrier; those whose core strength is in combinatorial structure can work on both sides.

5 comments:

  1. One example of such a meeting point is the complexity of weighted independent set counting, reached from the tractability side in 2006 by Weitz and from the intractability side in 2010 by Sly. I wonder if there are other interesting cases where the two sides have met.

  2. Set cover? Or is that not "interesting" on the algorithm side?

  3. Not really familiar with set cover; in what sense do the two sides meet?

  4. Set Cover (cheapest subfamily of sets, from the input collection, covering all elements of the input universe) has an O(ln n) approximation algorithm, and an Omega(log n) hardness result assuming P is not equal to NP. Another way in which the two sides meet: the greedy algorithm achieves ratio (1+o(1))*ln(n) (sketched after the comments), and there is a matching (1-o(1))*ln(n) hardness result under a slightly stronger assumption (Feige 1998).

    Plus, there are all the problems for which SDP algorithms give the optimal approximation ratio assuming the Unique Games Conjecture; MAX-CUT is the canonical example (also sketched below).

  5. I think it's useful to keep both points of view in mind. Recently I have done some work with a more algorithmic flavor than usual, trying to figure out the complexity of some specific problem (let's call it problem A). Initially I was not even sure whether it was a "hard" or "easy" problem. I first tried to give an efficient algorithm, and after some effort noticed some connections with problem B, a problem which looked easier to study. So the goal became to give an efficient algorithm for problem B, and then hopefully to adapt it to problem A. But that didn't work, and after a while I realized that problem B was likely to be computationally hard. So the goal then became to give a reduction from B to A, thereby establishing the hardness of A.

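A minimal sketch, in Python, of the greedy algorithm behind the ln(n) upper bound mentioned in comment 4 (unweighted case; the weighted version would instead pick the set minimizing cost per newly covered element):

```python
def greedy_set_cover(universe, sets):
    """Greedy Set Cover: repeatedly pick the set that covers the most
    still-uncovered elements. Uses at most (ln n + 1) * OPT sets."""
    uncovered = set(universe)
    cover = []
    while uncovered:
        best = max(sets, key=lambda s: len(uncovered & s))
        if not uncovered & best:
            raise ValueError("the sets do not cover the universe")
        cover.append(best)
        uncovered -= best
    return cover

# Example: greedy picks {1, 2, 3} first, then {4, 5}.
print(greedy_set_cover({1, 2, 3, 4, 5},
                       [{1, 2, 3}, {2, 4}, {3, 4}, {4, 5}]))
```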
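And a minimal sketch of the canonical SDP example alluded to in the same comment: the Goemans-Williamson algorithm for MAX-CUT, whose approximation ratio of about 0.878 is optimal assuming the Unique Games Conjecture. This sketch assumes numpy and cvxpy are available for the SDP step; any SDP solver would do.

```python
import numpy as np
import cvxpy as cp

def gw_max_cut(W, trials=100, seed=0):
    """Goemans-Williamson MAX-CUT: solve the SDP relaxation
    max sum_{ij} w_ij * (1 - X_ij) / 4  s.t.  X PSD, X_ii = 1,
    then round with a random hyperplane (expected ratio ~0.878)."""
    n = W.shape[0]
    X = cp.Variable((n, n), PSD=True)  # Gram matrix of unit vectors v_i
    prob = cp.Problem(cp.Maximize(cp.sum(cp.multiply(W, 1 - X)) / 4),
                      [cp.diag(X) == 1])
    prob.solve()
    # Recover vectors: rows of V satisfy <v_i, v_j> = X_ij (up to jitter).
    V = np.linalg.cholesky(X.value + 1e-6 * np.eye(n))
    rng = np.random.default_rng(seed)
    best_cut, best_sides = -1.0, None
    for _ in range(trials):
        sides = np.sign(V @ rng.standard_normal(n))  # random hyperplane
        cut = 0.25 * np.sum(W * (1 - np.outer(sides, sides)))
        if cut > best_cut:
            best_cut, best_sides = cut, sides
    return best_cut, best_sides

# 5-cycle: the optimum cut has value 4.
W = np.zeros((5, 5))
for i in range(5):
    W[i, (i + 1) % 5] = W[(i + 1) % 5, i] = 1.0
print(gw_max_cut(W))
```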
