Monday, May 28, 2012

An interview with Maurice Herlihy

Q: You just received the Dijkstra prize for the second time. What did you receive this award for?
A: The Transactional Memory paper. The basic idea is that designing highly concurrent data structures is much too hard and people who work in that area have to focus on too many tricks. Mechanical tricks are too easily confused with essential issues, and that paper was an attempt to make it easier to deal with the mechanics, to allow researchers to focus on the core aspects of synchronization algorithms.
Q: What's the result?
A: A hardware design.
Q: What do you mean?
A: A way of designing cache coherence protocols to allow the programmer to avoid conventional mechanisms such as locks. You can say what it is that you want to be atomic, without specifying how. It's an implementation of a very simple abstraction. The paper was very concrete, with hardware architecture.
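To make that abstraction concrete, here is a toy sketch of the programming model (not from the interview; the `atomic` block and account names are illustrative). The programmer declares *what* should be atomic; here the "how" is naively a single global lock, whereas real transactional memory executes the block speculatively and uses the cache coherence protocol to detect conflicts:

```python
import threading

# Toy illustration of the programming model, not the hardware mechanism.
_global_lock = threading.Lock()

class atomic:
    """Hypothetical atomic block: declare atomicity, don't implement it."""
    def __enter__(self):
        _global_lock.acquire()
    def __exit__(self, *exc):
        _global_lock.release()
        return False

account_a = {"balance": 100}
account_b = {"balance": 0}

def transfer(amount):
    # No per-account locks, no lock-ordering discipline:
    # the programmer only says that the transfer is atomic.
    with atomic():
        account_a["balance"] -= amount
        account_b["balance"] += amount

threads = [threading.Thread(target=transfer, args=(1,)) for _ in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(account_a["balance"], account_b["balance"])  # 0 100
```

The point of the abstraction is exactly this separation: the same `atomic` block could be backed by a global lock, by software transactions, or by the hardware scheme of the 1993 paper, without changing the program.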
Q: How did it have impact?
A: The paper is from 1993. For the first 10 years, nobody paid any attention to this paper. There was an average of 5 or 6 citations per year. Then in 2004 changes in technology meant that processor clock speeds could no longer improve, so people went to multicore where there is a lot of concurrency, and then they discovered that no one had any idea how to use them.
Q: You make programming multicore machines easier to do and to reason about?
A: It has other software engineering benefits, and some performance benefits. The entire story is complicated, but the big picture is that it makes it much easier to write concurrent programs.
Q: Can you give an example of a concrete impact?
A: The Intel Haswell processor, their next processor, has transactional memory built into it. Intel laptops from now on will support transactional memory in a form not much different from the 1993 paper.
Q: Do you have a patent?
A: I do, but it's not mine; I assigned it to my employer at the time, DEC. The patent may have expired.

Q: What did you think of it when you were doing it?
A: A very nice idea solving an important problem, but I knew no one would care about it. I was convinced that it could become a very important idea, and at the same time I was very surprised when it did.
Q: Do you have any other papers waiting to be discovered?
A: [...]
Q: How did you think about it?
A: It came up in a conversation with Eliot Moss, my coauthor on the paper. I was interested in lock-free data structures and he was interested in architecture. We were talking about how difficult it was to build lock-free data structures on modern architectures, and Eliot asked whether one could use cache coherence protocols to address the problem.
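For readers unfamiliar with the term, here is a sketch of the kind of lock-free structure the conversation was about: a Treiber stack, built on compare-and-swap. Python has no hardware CAS, so the `_cas` helper below simulates one with a lock; on real hardware it would be a single atomic instruction supported by the cache coherence protocol. This is an illustrative sketch, not code from the paper:

```python
import threading

_cas_lock = threading.Lock()

class Node:
    def __init__(self, value, next):
        self.value, self.next = value, next

class TreiberStack:
    """Lock-free stack sketch: every update is one compare-and-swap."""
    def __init__(self):
        self.top = None

    def _cas(self, expected, new):
        # Stand-in for an atomic compare-and-swap instruction.
        with _cas_lock:
            if self.top is expected:
                self.top = new
                return True
            return False

    def push(self, value):
        while True:                      # retry until our CAS wins
            old = self.top
            if self._cas(old, Node(value, old)):
                return

    def pop(self):
        while True:
            old = self.top
            if old is None:
                return None
            if self._cas(old, old.next):
                return old.value

s = TreiberStack()
for i in range(3):
    s.push(i)
print([s.pop(), s.pop(), s.pop()])  # [2, 1, 0]
```

The retry loop is the characteristic pattern: no thread ever holds a lock over the data structure, so no thread can block the others; a failed CAS just means someone else made progress.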
Q: This piece of research was instantaneous?
A: We had been office mates as grad students at MIT before, so we would meet from time to time casually for conversation, so this is an example of why it's a good idea to talk to people from other areas. What made it so unusual at that time was that it was a linkage between concurrent data structures (very high level) and hardware design (very low level). It required a broad CS culture.
Q: What does that say about how to do research?
A: You need to have conversations with people who are not goal directed. Neither one of us working individually would have done this. We knew something about one another's areas but would not have concentrated enough to see this connection.

Q: Do you have any opinion of the streaming computational model?
A: It's something that has gone, in a very short time, from ad hoc to methods that are much better founded in theory.
Q: What about MapReduce?
A: It's been enormously successful, but it's very limited in what it can do, and everybody is trying to invent or discover the successor to MapReduce.
Q: Does it mean it's not worth investing a lot of time into it, say, designing algorithms for it?
A: Because MapReduce is so important, any improvement will have immediate impact, but that kind of work is better done at a large company, in industry rather than in the academic world. There's also a lot of engineering that goes on in this area.
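For context, the MapReduce model being discussed can be sketched in a few lines. This is a minimal single-process illustration (the function names are ours, not from any framework): the programmer writes only `mapper` and `reducer`, while a real framework handles partitioning, shuffling across machines, and fault tolerance, all omitted here:

```python
from collections import defaultdict
from itertools import chain

def mapper(line):
    # Emit (key, value) pairs for each input record.
    for word in line.split():
        yield word, 1

def reducer(word, counts):
    # Combine all values that share a key.
    return word, sum(counts)

def map_reduce(inputs, mapper, reducer):
    groups = defaultdict(list)           # the "shuffle" phase
    for key, value in chain.from_iterable(mapper(x) for x in inputs):
        groups[key].append(value)
    return dict(reducer(k, vs) for k, vs in groups.items())

result = map_reduce(["to be or", "not to be"], mapper, reducer)
print(result)  # {'to': 2, 'be': 2, 'or': 1, 'not': 1}
```

The limitation Herlihy alludes to is visible even in this sketch: the model is a single map phase followed by a single reduce phase, which is why iterative and graph computations pushed people to look for successors.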

Q: How do you go about doing research?
A: I am opportunistic. I have a general area of interest, namely, concurrency, but I worry about becoming too specialized, so I make an effort to balance my attention between more applied and more theoretical areas, because, often, that's where you have the opportunity to write the first "easy" paper and let the ones who write the difficult papers cite your paper.
Q: So you're a pioneer?
A: Maybe more of a vagabond.
Q: Are you ever tired of research?
A: No.
Q: Do you do research on vacation?
A: Yes, in my head.
Q: Do you always have some problems you're working on?
A: Yes, usually something that's good for thinking about in the shower, or while riding the metro.
Q: Usually, do you work on several problems at once, or focus on a single one?
A: Usually several small problems at once, because when you get tired of one, then you can turn to the others.
Q: What are your current interests?
A: Right now, I'm writing a book on combinatorial topology and distributed computing, and I'm also interested in systems aspects of multicore. Since almost nobody is interested in both of these topics, it means that I can talk to twice as many people!
Q: Where are your collaborators?
A: Almost everywhere except at Brown. I'm writing the book with one coauthor in Mexico and the other in Germany. I am visiting several collaborators here in Paris. I work a day a week at Oracle Labs in Boston.

Q: Distributed computing is not taught much in the undergrad curriculum. What do you think of that state of affairs?
A: I think that that will inevitably change. What will be taught will be multiprocess computing, because soon every computer will be multicore. Today I think that it is a little bit of a scandal that we don't teach parallel computing to our undergrads until late.
Q: Are you familiar with the CLRS textbook on Algorithms?
A: There is a chapter on parallel computing but it's an area that changes rapidly.
Q: So, what about distributed computing?
A: The problem with distributed computing is that there are many models of computation. We're trying to capture many different kinds of phenomena. The culture isn't one where people think standardization is important. Every time someone writes a paper, there is a tendency to use a slightly different model and we're not organized enough as a community to present a coherent story to the world. We're a little like European parliaments, we split into many small groups and argue with each other instead of presenting a unified front.
Q: I take it that that's a criticism?
A: It's an observation. This situation is exactly the reason why I'm writing the book. At present, when you work in this area, you have to read a dozen badly written papers. That presents a large barrier to students and outsiders, and the hope of the book is to present a single coherent story, at a level that's accessible to advanced undergraduates.
Q: When is it coming out?
A: End of this year?
Q: Have people been teaching from drafts of this book?
A: No, people teach from drafts of my other book, a much more applied textbook.
Q: What's your community, the conference where you feel most at home?
A: PODC and DISC, which are basically the same conference but on different continents.

Q: What got you interested in distributed computing?
A: When I was a grad student at MIT, Barbara Liskov had an opening (a physical opening: a chair) in one of her offices, so I drifted into her group. Who you sit next to is perhaps the most important influence in your life. I was floating around looking for a group, and the other groups on that floor weren't very friendly, but Barbara was happy to talk to a new student.
Q: Why did you go to MIT?
A: Because I was already living in Cambridge and my girlfriend didn't want to move.
Q: Why did you go to grad school?
A: I got a degree in pure math and I wasn't sure what I wanted to do. At the end of the summer after I graduated I needed to start paying rent, so I applied for jobs at random. I had one offer to teach math at a boys' school in Michigan, and one to write Fortran programs for the DOT in Cambridge. From there I went to a research group at the MIT business school, but one day they lost their grant money. They didn't fire me, but they fired the guy who did the tape backups and told me I now had to do the tape backups, and so I applied to grad school.

Q: What's your biggest regret in your career?
A: I think it's being slow on the uptake. There are many ideas that turned out to be successful that should have been obvious much sooner. I could have gone to grad school right away instead of working as a Fortran programmer for 3 years.
Q: That was a waste of time?
A: Not really because, whenever I get frustrated, I remember that time and realize how fortunate I now am.


  1. Stunning interview Claire, thanks for posting it!

    1. A great interview (if somewhat short)!

      One comment:

      "Today I think that it is a little bit of a scandal that we don't teach parallel computing to our undergrads until late."

      What does the situation look like in the US? In Warsaw parallel computing is taught to all CS undergrads during their 2nd year.


