Test of Time 1992 Awards

Sanjeev Arora, Carsten Lund, Rajeev Motwani, Madhu Sudan, Mario Szegedy:
Proof verification and hardness of approximation problems

This seminal paper proves one of the fundamental theorems of modern computational complexity, the remarkable PCP Theorem: every proof has a robust version, only polynomially larger, that can be probabilistically verified by reading just a constant number of bits. If the input is in the language, then the verifier always accepts; if it is not, then the verifier rejects with high probability. This was the culmination of a line of work that has been highlighted in recent Test of Time Awards, including the related work of Arora-Safra that is a co-winner of this year’s award.
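The constant-query, one-sided-error verification described above can be illustrated with a toy sketch. This is not the construction from the paper: it simply assumes the PCP-style promise that a proof either satisfies every local constraint or violates a constant fraction of them, and shows how sampling a constant number of constraints then suffices. All names and parameters are illustrative.

```python
import random

def pcp_style_verifier(proof_bits, constraints, queries=20, rng=None):
    """Toy illustration of constant-query, one-sided-error verification.

    `constraints` is a list of (indices, predicate) pairs, each reading a
    constant number of proof bits. Under a PCP-style promise -- either every
    constraint holds (valid proof) or at most, say, a 0.9 fraction do (no
    valid proof exists) -- checking `queries` random constraints accepts
    valid proofs always, and rejects otherwise with probability at least
    1 - 0.9**queries.
    """
    rng = rng or random.Random()
    for _ in range(queries):
        idxs, pred = rng.choice(constraints)          # pick a random local check
        if not pred(*(proof_bits[i] for i in idxs)):  # read only those bits
            return False                              # inconsistency found: reject
    return True                                       # all sampled checks held: accept
```

The one-sided error mirrors the theorem's completeness/soundness conditions: a valid proof can never be rejected, while soundness error shrinks geometrically in the (still constant) number of queries.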

Besides being central to our modern understanding of the class NP, this work has had ramifications for many other areas of TCS, notably hardness of approximation and foundations of cryptography. As observed in the paper, the PCP theorem implies that MAXSNP-hard problems are NP-hard to approximate to within a constant factor, and by work of Feige et al. it also implies that MAX CLIQUE is NP-hard to approximate to within a power of n. The PCP Theorem has had a huge influence on subsequent work on inapproximability, and increasingly tight bounds on the best achievable approximation ratios for problems have been obtained by designing PCPs tailored to particular approximation problems. PCPs are not just a theoretical concept: they are currently being implemented to obtain efficient cryptographic proofs of knowledge, an ingredient in blockchain technology. While this paper was immediately recognized as a breakthrough, its implications are still being explored today, and continually surprise and delight.

Sanjeev Arora, Shmuel Safra:
Probabilistic Checking of Proofs: A new characterization of NP

This exceptional paper was the first to characterize NP in a form very close to the PCP theorem. Specifically, it characterizes NP as the class of languages whose proofs of membership can be made robust, in the sense that they can be probabilistically verified by reading just o(log n) bits. If the input is in the language, then the probabilistic verifier always accepts; if it is not, then it rejects with high probability. This was a major step in a line of work that has been highlighted in recent Test of Time Awards, including the related work of Arora-Lund-Motwani-Sudan-Szegedy that improved the number of queried bits to a constant and is a co-winner of this year’s award.

Besides being central to our modern understanding of the class NP, the PCP theorem has had ramifications for many other areas of TCS, notably hardness of approximation and foundations of cryptography. By work of Arora et al., the PCP theorem implies that MAXSNP-hard problems are NP-hard to approximate to within a constant factor, and by work of Feige et al. it also implies that MAX CLIQUE is NP-hard to approximate to within a power of n. The PCP Theorem has had a huge influence on subsequent work on inapproximability, and increasingly tight bounds on the best achievable approximation ratios for problems have been obtained by designing PCPs tailored to particular approximation problems. PCPs are not just a theoretical concept: they are currently being implemented to obtain efficient cryptographic proofs of knowledge, an ingredient in blockchain technology. While this paper was immediately recognized as a breakthrough, its implications are still being explored today, and continually surprise and delight.

Leonard J. Schulman:
Communication on Noisy Channels: A coding theorem for computation

This seminal paper formally introduced the problem of error correction in the setting of interactive communication, extending the problem of error correction for one-way transmission of information that is addressed by the classical theory of error correcting codes. In addition to defining an important model and problem, the paper provides a general protocol for making interactive communication robust to noise that incurs only a constant factor overhead. The lasting impact of this paper is demonstrated by the many papers in the past decade that were inspired by, and built upon, its insights, and by the ongoing research challenges that grew out of it.
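To see why constant-factor overhead is the interesting target, consider the naive approach the paper improves upon: encode each round of a T-round protocol separately, e.g. by repetition over a binary symmetric channel. A union bound over rounds then forces the per-round repetition to grow like log T, a logarithmic blowup overall, whereas the paper's protocol achieves a constant factor. The sketch below is only this naive baseline, with illustrative names, not the protocol from the paper.

```python
import random

def bsc(bits, p, rng):
    """Binary symmetric channel: flips each bit independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

def send_round(bit, reps, p, rng):
    """Naive per-round coding: repeat one round's bit and majority-decode.

    To make all T rounds of an interactive protocol succeed simultaneously
    with high probability, `reps` must grow like log T -- the logarithmic
    overhead that the paper's constant-overhead protocol avoids.
    """
    received = bsc([bit] * reps, p, rng)
    return int(sum(received) * 2 > reps)  # majority vote over the copies
```

The difficulty the paper addresses is that in an interactive protocol each party's next message depends on what it heard so far, so a single decoding error can derail everything that follows; one-way block coding across the whole transcript is not available, which is what makes constant overhead nontrivial.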