Tuesday, January 31, 2012

Transportation woes

I recently bought a ticket to go to France in May. I paid a little more than the lowest price, so that I could fly direct, and was very happy with my plan.

But two days ago I learned that my flight was cancelled. After calls to the various partners involved (airfare.com sold me an Iberia ticket for a flight operated by American Airlines...), I saw that the alternatives offered, all with a connection, were significantly less convenient than what I could otherwise get (as well as a little more expensive). I gave up and asked for a refund. They are no longer offering the product that I bought, so let them give me my money back and we'll just forget the whole thing. That struck me as the simplest solution and the end of my frustrations.

But it's not easy to get a refund! After a few hours on the phone, this is my current understanding: if a customer buys a ticket and the airline later cancels the flight (even three months in advance), then the customer is refunded the price of the ticket, minus a $100 fee.

The moral of the story is: don't get into a fight with airlines. They're bigger than you and it's not worth your time.

Also: don't buy your ticket more than three months in advance!

Monday, January 23, 2012

"Proving something does not make it true"

This weekend I accidentally stumbled on a page for science teachers. http://digipac.ca/chemical/proof/index.htm

I was appalled by the twisted message, which reduces logic to a suspicious rhetorical device. Money quotes:

Proof does not equal truth

Here's a definition of what it means to prove something: "Proof is arriving at a logical conclusion, based on the available evidence." Notice that this has absolutely nothing to do with being right or wrong.

Proving something does not make it true. It just means that you have convinced other people that the evidence supports your conclusion.

I was particularly stung by the third quote, because I use similar terms to try to explain how to write a proof: I usually say that it should be just formal enough, but not too formal, and that "just formal enough" depends on the recipient - just enough to convince the reader that you indeed have a valid proof.

Sunday, January 22, 2012

Adaptive Online Adversaries

This year there are presidential elections in both France and the US. The leading party of the opposition is the Republican party in the US and the Socialist party in France.

This year, for the first time in France, the Socialist candidate was chosen by election, in a quick two-round affair: a one-day vote to winnow the field down to two candidates, and a one-day runoff vote one week later.

The Republican primaries in the US are a prolonged game of suspenseful fractional elections, state by state, little by little.
Since the voters of the early states are not a random sample of the whole electorate, their votes are not necessarily representative of the final outcome. Information about voters' intentions is revealed gradually, and late voters can adapt their choices given knowledge of previous voters' choices. They have more information than in France, but that information is not uniformly distributed among voters: early voters have none. It's like an online algorithm in which votes arrive one by one and the adversary's choice of the ith vote can depend on votes 1 through i-1. That makes for exciting news!
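For the algorithmically inclined, here is a minimal sketch of that difference, in Python (toy code of my own, with made-up names and probabilities): an oblivious voter ignores the history, while an adaptive voter, like the adaptive online adversary above, may base the ith vote on votes 1 through i-1.

```python
import random

def run_primary(num_voters, voter_model):
    """Simulate a sequential primary: votes arrive one at a time,
    and each voter sees the full history of earlier votes."""
    votes = []
    for i in range(num_voters):
        votes.append(voter_model(votes))  # vote i may depend on votes 0..i-1
    return votes

def oblivious_voter(history):
    """Oblivious: ignores the history, like a one-day national vote."""
    return random.choice(["A", "B"])

def adaptive_voter(history):
    """Adaptive: sometimes jumps on the current front-runner's bandwagon,
    otherwise votes independently (the 30% is an arbitrary made-up number)."""
    if history and random.random() < 0.3:
        return max(set(history), key=history.count)  # current leader
    return random.choice(["A", "B"])

if __name__ == "__main__":
    random.seed(0)
    print("oblivious:", run_primary(20, oblivious_voter).count("A"), "votes for A")
    print("adaptive: ", run_primary(20, adaptive_voter).count("A"), "votes for A")
```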

Saturday, January 21, 2012

The education contract

From fourth grade up to some time in college, before I learned how to study for class, I was never better than second best in my class. After I learned how to study (see yesterday's post), the year I switched to computer science I decided to apply that studying method and give my all to learning CS. That enabled me to get the most out of my courses (with amazing results on my grades, but that's a detail).

Still, I did not learn as much as I could have, but the fault lay not with me but with the teachers. Even now, I am still a bit annoyed with those of my instructors that year who taught bad courses. Dedicated students deserve good courses. When students put a lot of effort into a course, the instructor has a responsibility to make it worthwhile by also putting a lot of effort into the course - and conversely. It's a dynamic of mutual emulation.

I am thinking about this as I re-read "La Sorbonne en guerre", my grandfather's book about being a professor during the Occupation. At the beginning of the war, universities were temporarily closed, students and professors were scattered, and no one knew what was going to happen. In the middle of the chaos, students kept trying to study for their exams, and my grandfather's students resorted to sending him assignments by postal mail. He would receive translations of classical ancient Greek texts, grade them and mail them back to his students. Word got around, and he started receiving translations from students who were not his! And so he graded work even from complete strangers and mailed it back to them. It seems to me that there was a sense of duty and of mutual respect: one party was there to learn, and the other was there to help them learn, like an implicit contract. Even in the middle of destruction, the survival of intellectual life was for them a matter of the highest priority.

That view of education, resolutely non-utilitarian, is to some extent present at Brown. It can also be found in France: for example, when I taught with Jean-Jacques Levy, I was amazed by the time he put into preparing lectures and assignments, true perfectionist that he is. That makes teaching a large class intimidating: each typo in a homework problem, each poorly specified exercise, each algorithm whose presentation in class is botched means that many dedicated students will waste time on something non-constructive, and that is damaging. It puts a dent in the contract.

Friday, January 20, 2012

On studying for class

As the beginning of the semester is approaching, it may be a good time to think about how to study.

Here is how I used to go about learning when I was a student, before I knew better.

During Math class, which I enjoyed, I paid close attention, and whenever I had an inkling of how to do the next step - for example, after the instructor had stated a lemma but before he or she had proved it - I would work through the argument by myself, oblivious to the lecture; after a couple of minutes, I would raise my head and catch up with the lecture. Occasionally I would go down the wrong track, waste some time, then have trouble catching up, but more often it was a very effective way to stay engaged. It's about active learning. Even now, I would still recommend that way of attending lectures.

To study material from lecture, I would quickly skim over my lecture notes to memorize the main definitions and theorems. Then, to work through assignments, I would do many exercises with ease, stop when I encountered a difficulty and, after giving it a fair try, either solve the problem or fail to, call it good and turn in what I had - usually about 80 or 90 percent of the assignment. Since the assignment usually had problems of increasing difficulty and I stopped before the last problem, my studying reinforced my knowledge of material which I already understood fairly well, but never gave me insights that I had not already acquired in class. I do not recommend that way of studying.

One year I made a friend with whom I worked closely, and, from her, I learned how to study.

To study material from lecture, here is what she did: she would reconstruct the lecture. She had her lecture notes in her notebook next to her, but usually kept it closed. She would try to write down the definitions and main theorems from memory, checking by briefly opening her notebook after writing. Moreover, she would also try to reconstruct the arguments by writing down the statements of the lemmas and outlining a proof sketch for each of them! Of course, that was time-consuming: even a well-understood two-hour lecture that was fresh in our memory would take at least one hour and often close to two hours to redo. But if any subtlety had gone unnoticed during class, it really put the spotlight on it. Then, to do the assignments, she would put in about twice as much work as I used to: indeed, she proudly claimed that she would never turn in an incomplete assignment. Since the last problems were much more difficult than the rest, she would spend the larger part of her time working on those challenging problems. At first that seemed stupid to me: why would one spend so much time reviewing lecture notes, when there was no grade associated with that activity, and we had understood the class fine anyway? Why would one spend more than half of one's study time on a tiny part of the assignment, for a tiny part of the grade? What an inefficient use of one's time! But then, as I watched her work, a realization dawned on me: the assignment was a way for her not just to practice, but to gain a better understanding of the material. She was learning in depth.

Thursday, January 19, 2012

On parenting

I had children younger than most of my US friends, and when it was their turn to raise young kids, whenever they saw me they asked questions such as: "How did you do it? I never realized it was so hard!", or "I love him/her/them, but I'm barely making it through the days", or "Tell me, how many more years do I have to go through before it gets easier?"

Yes, I do have a few friends who blossom as parents and seem to take everything in stride, usually leaving their career by the wayside along the way to perpetual parental bliss. It's a sacrifice that seems to come easily to those friends, who appear to have a natural vocation for pure parenthood.

But most of my friends are juggling career and parenting, and, while parents of young children, live in a state of constant tension, stress and tiredness. They do not have quite enough energy and time to perform as well as they expect from themselves at work, nor to be as attentive to their children as they would like to be.

As a parent, one of the most helpful things anyone ever said to me came from the mother of a colleague. I only met her once, when they came to dinner at my house.
Making small talk, she asked: "So, how is it, doing research, having two young children, and all that?"
I answered succinctly: "It's not easy".
She glanced at me, surprised by my frankness. After a short pause she replied: "It will get better."
Then, seeing that her answer hadn't sunk in, she insisted: "I have had two children myself. I remember what it's like. It will get better. It really will!"
That time, I paid attention. Unnoticed by the other people present at dinner, we exchanged a look of mutual understanding; and the conversation moved on to other topics.
In the following months, whenever things were almost overwhelming, I recalled what she had told me and, instead of thinking "Things cannot go on like this," I soothed myself by quietly repeating: "It will get better."

Wednesday, January 18, 2012

Independence

Whenever I read about "independence", a red flag goes up in my mind. If a paper contains a probabilistic analysis, bugs or even outright errors can often hide behind the word "independent". I am slow to believe claims of independence, and especially of conditional independence. That is where I have seen many bugs come up in proofs, by me and by others. I am now skeptical of any unproved claim of independence.
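To make the worry concrete, here is a tiny Python example (illustrative only, not taken from any particular paper) of the standard pitfall: X and Y are independent fair bits, yet they are not conditionally independent given Z = X XOR Y, so a proof that conditions on Z and keeps treating X and Y as independent has a bug.

```python
import itertools

# X and Y are independent fair bits; Z = X XOR Y.
outcomes = [(x, y, x ^ y) for x, y in itertools.product([0, 1], repeat=2)]

def prob(event, condition=lambda o: True):
    """Probability of `event` in the uniform space, optionally conditioned."""
    space = [o for o in outcomes if condition(o)]
    return sum(1 for o in space if event(o)) / len(space)

# Unconditionally, X and Y are independent: both numbers below are 0.25.
print(prob(lambda o: o[0] == 1 and o[1] == 1),
      prob(lambda o: o[0] == 1) * prob(lambda o: o[1] == 1))

# Conditioned on Z = 0, independence is gone: 0.5 versus 0.25.
given_z0 = lambda o: o[2] == 0
print(prob(lambda o: o[0] == 1 and o[1] == 1, given_z0),
      prob(lambda o: o[0] == 1, given_z0) * prob(lambda o: o[1] == 1, given_z0))
```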

Is the pro-Colbert super-PAC “Americans for a Better Tomorrow, Tomorrow” independent of Colbert? As it exists in real life, we assume, barring evidence to the contrary, that it is not coordinated with him. But if it came up in a science paper, we would ask to see a proof.

Monday, January 16, 2012

Teaching algorithms: the concepts

The other day I saw a student studying a handout with a vocabulary list of figures of speech in English, with definitions and examples. For example, there was "Juvenalian irony", "Metaphor", "Meter", etc. What struck me was that the list was in alphabetical order! What a way to teach! I started imagining what would happen if I organized my Algorithms class or my lecture notes using that principle. The reader would find topics in the following order:

Asymptotics
Average case
Big oh
Bipartite matching
Divide and conquer
Dynamic programming
Fast Fourier transforms
Greedy algorithms
Heaps
Huffman coding
Induction
Knapsack
Little oh
Matching
Max cut
Max flow
Median finding
Min cut
Minimum spanning tree
Randomized algorithms
Recurrences
Run time
Scheduling
Shortest paths
Sorting
Strassen's algorithm
Union find
Worst case

Dealing with email: the subject line

I stumbled on articles about how people manage email.

-"Revisiting Whittaker and Sidney's "email overload" ten years later."
-"Everything through email"
http://eprints.rclis.org/bitstream/10760/13707/1/2007_Email_PIM_WBG_PIM_book_Chapter_authors_final.pdf
-"Am I wasting my time organizing email? A study of email refinding." http://people.ucsc.edu/~swhittak/papers/chi2011_refinding_email_camera_ready.pdf

I saw one piece of advice that would make our lives easier: to put the subject line to good use.

That made me wonder what subject lines I had been choosing. Here is a representative sample from my recently composed work-related emails:
- my attempt at a high-level description of Elkin's algorithm
- The problem with your proof
- notes on dp
- we can chat on google gmail?
- back
- Back at Brown
- (no subject)
- dynamic networks reference (a model)
- sujet cours College de France? [English: topic course at College de France?]
- your student
- on another topic
- ton avis [English: your advice]
- resume [English: summary]
Before I looked, I was ready to give myself a pat on the back for the care with which I compose my subject lines. Now I realize that my subject lines are only well done when I am careful: for every message with a carefully chosen subject line, it turns out that I write several that are completely uninformative. I mean, come on: "on another topic": what kind of a subject is that?? Even "notes on dp" is highly context-dependent and does little to describe my work.

This suggests a late new year's resolution: when I take the time to compose an email, I should take a little extra time to compose the subject line, keeping in mind that the message may get archived and may need to be retrieved several months from now.

Saturday, January 14, 2012

What do professors do between semesters?

Here is how my time was spent this week, a typical light week. It was light in the sense that on each day, besides tending to that day's tasks, I had some "remaining time" that could be spent working by myself on longer-term projects. The longer-term project that was my focus this week was the preparation of a journal submission.

Monday: Attended ITCS in Boston. 4 hours of transportation, 4 hours of attending talks, 2 hours of chatting with colleagues in the hallways (on topics similar to the ones discussed on TCS blogs). Started to prepare a scientific blog post (still pending).

Tuesday: Sent some recommendation letters - the most urgent matter on my to-do list. Had a hiring-related email conversation. Met with a colleague to discuss what ideas to convey at an upcoming conference talk. Remaining time was spent proofreading a paper about to be submitted to a journal. Also attended a Skype/google chat/phone meeting to discuss another paper to be submitted to a journal.

Wednesday: Did two important items on my to-do list, involving photocopying, scanning and faxing. Cleaned up a little bit of my overflowing email box. Met a student over lunch. Remaining time was spent editing a subsection of the paper to be submitted to a journal.

Thursday: Attended a faculty meeting about hiring. Met with head TA to prepare course for this coming semester. Remaining time was spent continuing to edit the subsection of the paper in preparation, as well as the next section.

Friday: Did three items on my to-do list, involving emails and phone calls, related to hiring and to teaching. Met with a graduate student to read a research paper together. Missed a deadline for turning in a report, not by oversight but by a conscious decision (I put it on my to-do list for Tuesday): instead, remaining time was spent continuing to edit the section of the paper in preparation. At the end of the day, when it was about finished, I started thinking that maybe it should be written in a completely different way, and that I should re-write those two sections from scratch.

Friday, January 13, 2012

Rules and regulations for university students

Imposing arbitrary rules on my students in the courses I teach is one of the perplexing aspects of this job. Why set deadlines? Why give this or that penalty for missed deadlines? Why allow this and disallow that?

In general my approach is: the default is to have no rules, and rules can only be added if they serve some purpose related to the course. Broadly speaking, there are two purposes: one is to help the student learn, and the other one is to help the logistics so that the course runs smoothly. For example, requiring students to turn in homeworks at specified dates gives them an incentive to study with some regularity across the entire semester instead of cramming a few days of intensive study at the very end: it helps reinforce their self-discipline. Forbidding late homeworks after a certain point ensures that we have all the assignments turned in by the time we grade, preventing loose, isolated hand-ins from floating around at random times and simplifying the graders' task.

The other day I was learning about Stanislas, a private prep school in Paris for students at the level and age of freshmen and sophomores in US colleges. I stumbled on some of their rules: no sneakers allowed; students must wear leather shoes. No T-shirts allowed: students must wear shirts with collars. No long hair allowed for men: the students' ears must be uncovered. I was amazed. Where does that come from? How can it be justified? These rules seem completely arbitrary. Perhaps one might try to argue that students hear better if hair does not cover their ears, but what about the other rules? How does having or not having a shirt collar help one study? I think that the person who created those rules took the opposite approach. Instead of starting from a default state with no rules, he or she started from some mental picture of an ideal student, and then created a set of rules to try to force the students to comply with that ideal image. The only limit was that the rules had to be mild enough to be tolerated by the students.

What about our curricula, when we have them? Brown is very free in that respect, but other universities have mandatory requirements. Are those requirements arbitrary creations reflecting someone's ideal image of what students ought to know?

Wednesday, January 11, 2012

A small world in time

Do you know someone who knew someone who knew someone who knew someone who knew someone who knew someone who knew someone who knew George Washington personally?

We could imagine trying to run Milgram's experiment: give a message to someone who we think might lead towards Washington; that person thinks of someone they used to know, to whom they could have given the message, and who might lead closer to Washington; and so on. But it is impossible to implement: we cannot run an experiment in the past. The social networks of the past are gone. They have disappeared forever. They seem inaccessible.

Then, how could you estimate the number of degrees of separation between you and George Washington?
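One crude way to guess, where every number is an assumption of mine rather than data: treat one hop as a long acquaintance overlap, a person who as a child or young adult knew someone sixty-odd years older, and count how many such hops fit between Washington's birth and a person alive today.

```python
# Back-of-the-envelope guess, not an experiment: every number is an assumption.
washington_birth = 1732   # George Washington's birth year
present_birth = 1970      # hypothetical birth year of someone alive today
years_per_hop = 65        # assumed age gap bridged by one acquaintance link

hops = (present_birth - washington_birth) / years_per_hop
print(round(hops, 1))     # roughly 3 to 4 hops under these assumptions
```

Under these (entirely debatable) assumptions, far fewer links are needed than the chain of seven in the opening question.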

Tuesday, January 10, 2012

Gossip from a hundred years ago

Last week I visited Marie Curie's birth house in Warsaw. Talking about it with relatives, I learned a piece of gossip from a hundred years ago. My grandmother was a physics student at the Ecole Normale Superieure de Jeunes Filles (ENSJF), where, a few years earlier, she would have had classes with Marie Curie; instead, she was taught by Curie's replacement, Paul Langevin.

As the family story goes, Langevin was a terrible teacher. He was also sexist, and claimed that the only good thing about teaching women was that it did not require any preparation. But one day, when my grandmother arrived at the school, she found a group gathered in great excitement around one of her friends, who had bought the newspapers: "Look! They are talking about our teacher! Look at what they are saying about Langevin!" The newspapers were saying that Langevin was having an affair with the (then widowed) Mme Curie. That much I knew from Wikipedia, but the little additional bit that I did not know was that it was not only the journalists who spread the rumor: Langevin's wife also made a big scandal about it - so perhaps there was some truth behind the tabloids' stories.

Had I not visited Curie's house, I would not have heard that anecdote. How many stories about the past are sleeping in our elderly relatives' memories, I wonder?

Monday, January 9, 2012

ITCS Graduating bits: a big hit

As I type, students and postdocs looking for jobs are giving 5-minute presentations. The first hour was absolutely great. For each speaker I quickly checked
- their web page (if they kept their name on the screen long enough for me to catch it),
- the DBLP page for their publications,
- their Facebook page,
- whether they were applying for a job at Brown,
and jotted down for myself a few notes on their presentation and on whether I thought their interests might be a fit.

What a great way to get a quick first impression! Also a great way to get a quick sense of the pool of theory-inclined applicants this year. If I see other applicants in our database, then, even if they are not here today, this session will make it easier for me to evaluate them.

The room is full and I saw some people standing in the back.

If only this had been split into two sessions, it would have been easier to pay attention. As it is, early speakers had a big advantage. If I were staying around, I would compare impressions with other Brown people and try to corner a few of those applicants to chat with them. Unfortunately I've about had it with traveling (I've logged close to 80 hours of travel in the past 22 days and have only spent 12 hours at home since getting back from France), so I am not taking full advantage of this great opportunity to get a first look at job candidates; but it is a great, great idea.

Models

It is a paradox that reality can be better apprehended by thinking, not about the real world directly, but about simple, pure models that are not real themselves but that capture one dimension of reality better than the messy and confusing real objects.

The Turing machine is not a real computer but a model for real computers. It's a computer in "hyperbolic form". It is not real, yet it is a tool that helps us think about some essential aspects of reality. We can understand reality better by starting from that model. Genesis is not a real account of the start of humanity but a model. It is not real, but it helps us think about some essential aspects of the start of humanity, such as all human beings forming a family-like community. Polynomial runtime is not a real execution time but a model. It is not real, but it helps us think about the design of efficient computations. Ensoulment at conception is not a real account of the beginning of personhood but a model. The Ising model. The state of nature. Classical mechanics. The free market. Alice and Bob.

All models.

All those models enrich our understanding, but they are also limited. When we use one and are led to exciting insights, it is tempting to give that model a broader scope and view everything through its lens, but that is a mistake. We must not lose sight of the fact that models are not a catch-all substitute for reality. The risk is that asking the wrong questions about them leads to nonsensical answers or to sterile quests. How is Turing computation affected by the presence of several tapes? How does one make archeological and paleontological finds fit the story of the Creation? Can you reduce runtime from n^{1000} to 2^{100000}n^{999}? What happens to the soul during twinning with late separation, or for conjoined twins? Such questions are close to vacuous: by and large, they do not produce new understanding of the real world but only serve to show the limitations of our models. Any question that exploits the aspects of a model that diverge from the real world is of lesser interest at best: it is only about the model, not about the phenomenon that the model aims to describe.

What to do with FOCS rejected papers?

What to do with rejected FOCS submissions? Many people say that the conference should accept many more papers, but there is strong resistance to doing that.

How about creating a conference reserved for presenting rejected FOCS papers? The only criterion for acceptance would be that the paper was rejected from FOCS. The format would depend on the number of people who wish to present their work despite the rejection. This new conference could be co-located with FOCS, and could happen on the day just before, the day just after, or as poster sessions in the evenings during the same days. Then the two communities could mingle. Imagine the conversation in the hotel elevator:
"-What is your research area?
- Arbitrary analysis. What's your research area?
- Forgetful algorithms.
- That sounds interesting.
- I'd like to hear more about your arbitrary analysis. Are you going to present a paper?
- Yes, I am giving a FOCS talk tomorrow. How about you, are you going to give a talk on forgetfulness?
- Yes, I am giving a FOCS Rejected talk tonight.
- Great! I look forward to hearing it!"

Sunday, January 8, 2012

The computational society

The best way to fly across the Atlantic: with a good book that keeps you occupied from beginning to end. Having two flights, I had two books. The first one had nothing to do with computer science. The second one, "The empire of the lesser evil" by Michea, contains the following footnote (my translation).

"It is probably this obsessive fear of civil war that explains why 17th and 18th century philosophers […] almost always describe the "state of nature" as a state of inevitable […] war of all against all. This obviously consists of a transposition into philosophy of the situation of civil wars of that period, pushed by assumption - as in all thought experiments - to their imaginary limit […]. This is a hyperbolic formulation […]. Similarly, at a metaphysical level, the cartesian doubt takes a hyperbolic form to create the possibility of cogito. Note that modern solutions must always be deduced of philosophical situations that are not merely negative or even hopeless (absolute doubt, absolute violence) but also fictive (the hypothesis of the dream and the genie with Descartes, the state of nature with Hobbes, the fable of original trade of goods with economists). That is not the least paradox of a society […] claiming to be entirely "realist" and procedural - that is, founded on the purely mechanical protocols of Law and of Market - that it thus generates its own foundational myths."

Saturday, January 7, 2012

The Sorbonne at war

This month a book is appearing, entitled "La Sorbonne en Guerre (1940-1944)", followed by the short "Journal de la Liberation de Versailles". It is a primary historical document, from a manuscript written by my grandfather Georges Mathieu about life as a professor (of ancient Greek) at the Sorbonne during the Occupation. My father typed, edited, prefaced and indexed the text.

In it, you read about such topics as faculty meetings discussing how to deal with the new laws restricting Jewish students and faculty. If a Jewish professor goes into hiding, should the department hire someone to fill the prestigious position, or should it make do with temporary replacements? If loudly voicing disagreement with the laws means that you will be fired and arrested, then what is the best way to react to unjust measures? How do you work with a colleague whose collaborationist attitude you despise? How does one deal with anonymous letters at such a time? The book is not a guide to what one ought to do, but a report of what happened.

The Liberation of Versailles is a thrilling account of the Allied forces arriving in Versailles in 1944 and being greeted with great rejoicing by the ecstatic population.

http://www.editions-harmattan.fr/index.asp?navig=catalogue&obj=livre&no=35756

Friday, January 6, 2012

Push-pull

I just saw a talk by Thomas Sauerwald analyzing the push-pull protocol for broadcasting in graph-theoretical models for social networks. If you ask your search engine for push-pull, many answers come up, related to aviation, physics, flirting, and marketing. But this is different.

To broadcast information, in the push protocol, at every round every node that has the information rudely shares it with a random neighbor, whether that neighbor wants it or not. In the pull protocol, at every round every node that does not yet have the information extracts it from a random neighbor, if that neighbor has the information. In the push-pull protocol, both behaviors occur.
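To make that concrete, here is a minimal Python sketch of the synchronous push-pull protocol on an arbitrary graph (my own toy code, not the speaker's); `graph` maps each node to its list of neighbors, and every node is assumed to have at least one neighbor. The second function counts the rounds needed to inform a given fraction of the nodes, which is exactly the quantity discussed in the questions below.

```python
import random

def push_pull_round(graph, informed):
    """One synchronous round: every node contacts a uniformly random neighbor."""
    newly_informed = set()
    for v in graph:
        u = random.choice(graph[v])
        if v in informed and u not in informed:
            newly_informed.add(u)   # push: v gives the rumor to u
        if v not in informed and u in informed:
            newly_informed.add(v)   # pull: v takes the rumor from u
    return informed | newly_informed

def rounds_to_inform(graph, fraction=0.99):
    """Rounds until at least `fraction` of the nodes know the rumor,
    starting from a single uniformly random informed node."""
    informed = {random.choice(list(graph))}
    rounds = 0
    while len(informed) < fraction * len(graph):
        informed = push_pull_round(graph, informed)
        rounds += 1
    return rounds
```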

Q: How many rounds before 99% of the nodes are informed?
A: log log n, if the network is a Chung-Lu random graph model for social networks (parameterized so that the average distance between two nodes is log log n). That's the result.

Q: What's the Chung-Lu model?
A: each node i has a weight w(i), and edge {i,j} is in the network independently with probability min(1,w(i)w(j)/W), where W is the sum of all weights. The weights w(i) are chosen so that the degree distribution follows a power law (with parameter beta).
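And here is a toy sketch, in the same spirit, of sampling a Chung-Lu graph from given weights; the power-law weight formula at the end is one common parameterization that I picked for illustration, not necessarily the one used in the talk.

```python
import random

def chung_lu_graph(weights):
    """Chung-Lu random graph: edge {i, j} is present independently with
    probability min(1, w(i) * w(j) / W), where W is the sum of all weights."""
    n = len(weights)
    W = sum(weights)
    graph = {i: [] for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if random.random() < min(1.0, weights[i] * weights[j] / W):
                graph[i].append(j)
                graph[j].append(i)
    return graph

# One common choice of weights giving a power-law degree distribution with
# exponent beta (an assumption of mine, not necessarily the talk's setup).
beta = 2.5
n = 1000
weights = [(n / (i + 1)) ** (1.0 / (beta - 1)) for i in range(n)]
social_network = chung_lu_graph(weights)
```

Feeding such a graph (after discarding any isolated nodes) into rounds_to_inform above gives a quick, unscientific way to watch how slowly the number of rounds grows as n increases.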

Q: How about informing not just 99% but 100% of the nodes?
A: the diameter is roughly logarithmic, so there is no way to inform 100% of the nodes in time log log n. In a social network, when something goes viral, 1% of the crowd will remain clueless in spite of your efforts. The solution: just forget about them. We're the 99%!

Q: What's the rough idea of the proof?
A: Partition the nodes into classes according to the log log of their degree, rounded to the nearest integer.
0. The information initially belongs to a random node.
1. After log log n rounds it reaches a node in the highest class.
2. After a few more rounds it reaches (50% of) the nodes in the highest class.
3. After another log log n rounds it goes from there to almost all network nodes.

Step 2 contains some intuition: in the Chung-Lu model, two nodes x and y of high class (i.e. high degree) are probably connected by a path of length 2 via an intermediate node z of degree 2. If x has the information, then z will pull it from x within a couple of steps, and then z will pass it on to y after a couple more steps, so nodes of degree 2 transmit information in a constant number of rounds. In other words, small-degree nodes are good at quickly passing on information to everyone they know. So what matters for the large-degree nodes is really their degree-2 neighbors: those will immediately pull the information and share it with their other neighbor, thus roughly reducing the GOSSIP mode of communication to the LOCAL mode of communication.

Q: Is the push-pull protocol realistic?
A: It's not inconceivable. Pushing corresponds to, say, sending an email to a single (or a few) recipient(s) to share information: "Have you heard? Abe and Amy broke up!". Pulling corresponds to asking your friend: "How are Abe and Amy doing? I haven't seen them lately" or checking their Facebook page, and then, if you find the piece of information which you were waiting for "Status: single", adding a note about it on your own Facebook wall: "Hey everyone, look, Abe is no longer in a relationship. He's single! How exciting!". (I am having trouble imagining why someone seemingly so interested in that piece of information would want to broadcast it to the rest of the world, but if that's the only reason why the protocol is not perfectly realistic, I'll give it a pass.)

Q: Is the Chung-Lu model realistic?
A: Maybe: it's a reasonably good fit with reality according to several statistics. But there is no way to know for sure, is there?

Q: Is the above log log n result robust? Does it still hold if the push-pull protocol or the Chung-Lu network is replaced by something similar but different?
A: The proof presumably breaks down. The result might still be true if the graph has small average distance and if it has some small degree nodes adjacent to large degree nodes.

Q: Is the synchronous assumption realistic?
A: In the asynchronous model, in which each node has a Poisson clock, the log log n becomes a constant. That's because x and y don't just have one path of length 2 going through node z: they have many such paths, going through z1, z2, …, and one of them goes through a node whose Poisson clock rings twice in rapid succession ("Zap! Get that info from x! Zap! Tell y about it, quick, to be the first one!") and transmits the information almost instantaneously. So we don't mind if synchronicity does not hold - quite the opposite!

Q: What about other graphs?
A: In general graphs, the idea of reducing GOSSIP to LOCAL has been explored by Kelner, Censor-Hillel, Haeupler and Maymounkov. The two modes differ by at most an additive polylog(n).

Thursday, January 5, 2012

An algorithm

The other day a 5-year-old child was showing me her kindergarten work. I saw the instructions: "follow the pattern and color the beads", and a necklace with a sequence of pearls carefully drawn along the curving line: red, blue, purple, red, blue, purple, red, blue, purple, etc. I said to her: "I see that you're doing Math in school". She answered: "No, that's not Math. It's an Algorithm."

This is how I discovered the French kindergarten curriculum: drawing, singing, writing letters, the alphabet, math, and algorithms.

So: "Math" means numbers and arithmetic, while "Algorithms" means logical reasoning. What a conquest!

Sunday, January 1, 2012

A new year challenge: life as in the 20th century

As you read this, I am off the internet.
I have nothing with me that connects to the internet.
No laptop.
No iPhone.

I am disconnected. I challenge you to do the same for 24 hours!

What happens to your email correspondents when you disconnect?
Do disasters and catastrophic events suddenly start happening?
Does all hell break loose?
Do you lose your job?

I will know in a few days, when I get back online.