Thursday, December 16, 2010

On which topics is there progress in Theoretical Computer Science?

Here are the best papers at STOC and FOCS over the last few years, as best I could gather. What picture does that paint of research activity in core theoretical computer science? Crypto, approximation, and hardness of approximation are progressing swiftly, as are error-correcting codes. Structural space complexity continues strong. (Some outliers: discrete math, game theory, and the paper on faster integer multiplication.)

Are those the areas where new ideas are emerging, where things are happening, and therefore where one might consider hiring? I'm just considering the core here, not the exploding areas of interdisciplinary research.

ACM STOC 2010
"QIP=PSPACE," and "An Improved LP-based Approximation for Steiner Tree."
ACM STOC 2009
"A Constructive Proof of the Lovasz Local Lemma", and "Public-Key Cryptosystems from the Worst-Case Shortest Vector Problem".
ACM STOC 2008
"Optimal Hierarchical Decompositions for Congestion Minimization in Networks", and "Algorithms and Inapproximability Results for Every CSP?".
ACM STOC 2007
"Faster Integer Multiplication", and "Towards 3-Query Locally Decodable Codes of Subexponential Length".
ACM STOC 2006
"The PCP Theorem via Gap Amplification".
ACM STOC 2005
"Undirected ST-connectivity in log-space".
ACM STOC 2004
"Expander Flows, Geometric Embeddings, and Graph Partitioning", and "Multi-linear formulas for permanent and determinant are of super-polynomial size".

IEEE FOCS 2010
"Subexponential Algorithms for Unique Games and Related Problems".
IEEE FOCS 2009
Best Paper Award: none?
IEEE FOCS 2008
"Two Query PCP with Sub-Constant Error"
IEEE FOCS 2007
"Space-Efficient Identity Based Encryption without Pairings"
IEEE FOCS 2006
"Settling the Complexity of 2-Player Nash-Equilibrium"
IEEE FOCS 2005
"The Unique Games Conjecture, Integrality Gap for Cut Problems and Embeddability of Negative Type Metrics into L1", and "Correcting Errors Beyond the Guruswami-Sudan Radius in Polynomial Time".
IEEE FOCS 2004
"Hardness of Approximating the Shortest Vector Problem in Lattices", and "Cryptography in NC0".

4 comments:

  1. Of course, one may conjecture that these are simply the areas which have more political power at the moment, and can flex the right muscles to get their own members the awards.

    On the question of whether to focus hiring in such areas, one may answer: "Why not? It's beneficial for an institution to hire people in politically influential areas."

    -mihai

  2. Hello Claire, a mildly self-serving comment: I think arithmetic circuit complexity is another area which has made good progress recently. We have seen progress on lower bounds (you already have one such paper in your list) and on polynomial identity testing, with both conditional results ("hardness versus randomness") and unconditional results (for restricted classes of circuits); we have also seen progress on depth reduction ("the chasm at depth four").

    In algorithms, a field that I do not know that well, it seems that we have also had interesting results on "non-approximation" algorithms (exact exponential time algorithms, fixed parameter tractability...).
    But for some reason these topics do not seem to be so well represented at STOC and FOCS.

    Now I suppose that the readers of this blog will follow suit by listing their favourite results in (or even out of!) their area of expertise.
    At least we will learn in what areas your readers work.

    PK.

  3. Hi Claire,

    I think the list raises some interesting questions. I definitely see
    a lack of efficient algorithms papers in the list.

    I don't think the list says much about relative quality between
    fields. How can you, say, compare an algorithms paper with a crypto paper?
    Some of the awards are for problems that may be a bit contrived to
    outsiders, so it often boils down to "these famous people like the
    problem/result". This means that not only do you need a star to solve
    the problem, you also need famous stars to support it, and it is
    particularly useful if you have strong supporters on the PC. Fields
    that have already captured previous best paper awards are more
    likely to feel that their best papers deserve the award. Finally,
    it helps to have a unified field that knows to pick and rally
    behind its best paper, proclaiming victory in blogs long before the
    PC gets to the decision.

    One issue for non-approximation algorithms is that we have too
    many important but unrelated problems dividing the field into
    pockets.

    An excellent example of a non-award-winning algorithms paper is from
    STOC'03 (the first best paper award year):

    [DI] Camil Demetrescu, Giuseppe F. Italiano: A new approach to
    dynamic all pairs shortest paths. STOC 2003:159-166.

    The paper showed that we can update the all-pairs shortest path matrix in
    near-quadratic time, which is the best one can hope for. There had been plenty
    of previous work with weaker results; e.g., the previous best was
    n^2.5 for unweighted graphs, whereas DI handles arbitrary weights.

    Note that shortest path computations are something that comes up in many
    places in real computing.

    The paper introduced a new approach to shortest paths, considering
    far fewer potential shortest paths than Dijkstra. This led to their
    strong theoretical result for the dynamic case. They also found it to
    work very well in practice in the static case. For the static case
    they had no theoretical result, but at this year's FOCS the
    DI algorithm was proved to be optimal on random graphs:

    Yuval Peres, Dmitry Sotnikov, Benny Sudakov, Uri Zwick: All-Pairs
    Shortest Paths in O(n^2) Time with High Probability. FOCS 2010:
    663-672.
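    For context on what these results improve, the textbook static baseline is to run Dijkstra once from each source, giving all-pairs distances in roughly O(n(m + n log n)) time. The sketch below is that classical baseline only, not the DI data structure or the Peres et al. algorithm:

    ```python
    import heapq

    def dijkstra(adj, src):
        """Single-source shortest paths with a binary heap.
        adj maps each node to a list of (neighbor, weight) pairs."""
        dist = {src: 0}
        pq = [(0, src)]
        while pq:
            d, u = heapq.heappop(pq)
            if d > dist.get(u, float("inf")):
                continue  # stale heap entry, already improved
            for v, w in adj.get(u, []):
                nd = d + w
                if nd < dist.get(v, float("inf")):
                    dist[v] = nd
                    heapq.heappush(pq, (nd, v))
        return dist

    def all_pairs(adj):
        """All-pairs shortest paths: one Dijkstra run per source."""
        return {u: dijkstra(adj, u) for u in adj}
    ```

    The point of the dynamic APSP problem is to maintain the output of `all_pairs` under edge updates much faster than recomputing it from scratch like this.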

    Now take the current champion of the complexity folks:

    http://blog.computationalcomplexity.org/2010/11/breakthrough-circuit-lower-bound.html

    As far as I understand this, we already know that there are things we cannot
    do with mod_p gates for primes p, and now Ryan can prove limits on what
    we can do with mod_6 gates. Not that we have either type of gate on a real
    computer, but Ryan has the support of Dick, Scott,
    Luca, and Reingold, so surely he will get the award.

    One thing I like from the blog praise of Ryan is that it "doesn't use deep
    mathematical techniques but rather puts together a series of known
    tools in amazingly clever ways". Cleverness often leads to
    breakthroughs in algorithms, but an algorithms paper would surely
    be criticized for lacking new fundamental techniques.

    Contrasting with the above mod_6 result: at STOC'06, Patrascu and I were
    the first to prove a separation in query efficiency between
    polynomial and near-linear space. Near-linear space is what we
    need to store large data sets, so understanding its limits should be
    fairly relevant to computing. I don't think anyone who understands the
    result would dispute that it is as strong as can be within its field,
    but you need strong supporters on the PC who understand and
    appreciate the result. Otherwise you are simply not on the radar for
    best paper.

    So back to Claire's question. I think best paper is very much a
    question of politics. Areas like efficient algorithms may be pretty
    bad if you want best paper awards, but they can be very satisfying if
    you want to do something of relevance to computing.

    Best, Mikkel

  4. Mikkel, Pascal and Mihai, I do not think that anything can be inferred from the absence of a field from the list; it does not mean that the field is not thriving. But for an area to be present, it needs not only advocates but also exciting advances. So I think that the presence of a field is meaningful.

