In 2001, Amit Chakrabarti, now a professor of computer science, co-authored a paper that introduced a new concept in theoretical computer science called “information complexity.” Twenty years later, his research has been recognized with the Test of Time award at the 2021 IEEE Symposium on Foundations of Computer Science.

The award spotlights the long-term impact of research that opened up a new area of study, introduced new techniques, or solved a problem of lasting importance.

At the time of the paper’s publication, Chakrabarti was a graduate student at Princeton working with computational theorist Andrew Yao, currently a professor and dean of the Institute for Interdisciplinary Information Sciences at Tsinghua University in Beijing.

The 2001 paper, published in *Proceedings of the 42nd IEEE Symposium on Foundations of Computer Science* by Chakrabarti and co-authors Yaoyun Shi, Anthony Wirth, and Yao, solved what is called “the direct sum problem,” a question in the subfield of communication complexity that theoretical computer scientists had not known how to answer until then.

Communication complexity analyzes computational processes in which two or more “players” jointly solve a problem whose inputs are distributed among them. The players could, for example, be two people in two different places who are comparing a very large file, say a database, to make sure they match.

The aim is to find the “communication complexity” of the task—the minimum amount of communication between parties needed to solve the task. So, in the example, can the players ascertain that their files match without one of them sending the whole file to the other?
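The file-comparison task above is the classic “equality” problem in communication complexity. One standard way to beat sending the whole file (the article does not spell out the protocol) is randomized fingerprinting: using shared randomness, one player sends a short fingerprint of their file instead of the file itself. A minimal Python sketch, with SHA-256 standing in for the fingerprint and hypothetical players Alice and Bob:

```python
import hashlib
import secrets

def fingerprint(data: bytes, salt: bytes) -> bytes:
    """Short fingerprint of a (possibly huge) file under shared randomness."""
    return hashlib.sha256(salt + data).digest()  # 32 bytes, regardless of file size

# Hypothetical players with megabyte-sized files.
alice_file = b"x" * 1_000_000
bob_file = b"x" * 1_000_000

salt = secrets.token_bytes(16)               # shared random string
msg = fingerprint(alice_file, salt)          # Alice sends only 32 bytes
match = msg == fingerprint(bob_file, salt)   # Bob compares locally
```

The communication here is a constant 32 bytes no matter how large the files are, at the cost of an astronomically small chance of a false match when the files differ.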

“Understanding the minimum possible communication necessary to solve a particular problem is a key step of analysis in a broad range of areas,” says Chakrabarti.

In chip design, for instance, an optimum layout of components can serve to save energy by minimizing the distance that electrical signals travel. “Communication complexity is often used to figure out the right layout,” says Chakrabarti. It also finds use in the study of data structures, quantum computing, game theory, and optimization, he adds.

“What the direct sum problem comes down to is asking whether solving many copies of a task costs many times as much as solving one copy,” says Chakrabarti. He goes on to illustrate the question with the unlikely example of dicing potatoes.

“Imagine you are an amateur who has never cut a potato before,” Chakrabarti begins. You might figure out a method to dice one potato in two minutes, he says, but if you then had to chop 100 of them, it wouldn’t take you 200 minutes, because you become more efficient and economies of scale kick in.

The same phenomenon exists in computation, but in the case of communication complexity it was suspected, but not proven, that there *wasn’t* an economy of scale effect. “In our paper, we showed how to prove this for a certain type of problem,” says Chakrabarti.
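In the standard notation of the field (not used in the article), the direct sum question can be stated as follows, where $\mathrm{CC}(f)$ denotes the communication complexity of a task $f$ and $f^n$ denotes solving $n$ independent copies of it:

```latex
% Direct sum question: is solving n copies roughly n times as costly as one?
\mathrm{CC}(f^{n}) \;\overset{?}{=}\; \Omega\bigl(n \cdot \mathrm{CC}(f)\bigr)
```

A positive answer means there is no economy of scale: the per-copy cost does not drop as the number of copies grows.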

While that result was, in effect, a limitation, arriving at it required an entirely new mode of mathematical reasoning, which recast the question in a new paradigm they called information complexity. “It turns out that being able to reason in terms of fractions of a bit is the key,” according to Chakrabarti, and that’s what information theory enables. Previous methods of analysis weren’t able to tackle this type of question because they reasoned at the level of whole bits of communication, he explains.
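Shannon’s information theory is what makes “fractions of a bit” precise: a random event can carry less than one full bit of information. As an illustrative sketch (not taken from the paper), the binary entropy function measures how many bits a biased coin flip actually conveys:

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy, in bits, of a coin that lands heads with probability p."""
    if p in (0.0, 1.0):
        return 0.0  # a deterministic outcome conveys no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

binary_entropy(0.5)   # a fair coin conveys exactly 1 bit per flip
binary_entropy(0.01)  # a heavily biased coin conveys only about 0.08 bits
```

Accounting for communication at this fractional granularity, rather than in whole bits, is what the information complexity framework exploits.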

“The idea was a game-changer—it is now regularly used in the field,” says Chakrabarti. He and other researchers have, in the years since, extended the information complexity paradigm by applying it to several other algorithmic problems to obtain new results.