Department of Computer Science

The Internet is a fertile source of interesting and relevant algorithm design problems. We describe two such problems in this talk.

Congestion Probing

We formulate congestion control as an optimal probing problem in which the goal is to determine the maximum bandwidth currently available; the result of each probe is either a successful transmission of all packets sent or the dropping of one or more of them. TCP's congestion control algorithm increases the sending rate additively upon success and decreases it multiplicatively upon failure. We attempt to deepen our understanding of such probing algorithms through a series of simplified models.

We assume that time is divided into successive periods, and in each period $t$ there is a threshold $u_t$ representing the maximum number of packets that a host $A$ can transmit to a host $B$ without experiencing packet drops. In each period $t$, $A$ transmits some number of packets $x_t$ and receives immediate feedback as to whether packet drops have occurred; {\it i.e.}, whether $x_t > u_t$. A cost function $c(x,u)$ is given, representing the cost of transmitting $x$ packets in a period with threshold $u$. This cost reflects such factors as the bandwidth consumed by the flow, the buffer space and other resources consumed at the routers, the number of packets that get through to $B$, and the number of packets dropped. The goal of $A$ is to minimize the total cost incurred over all periods or, in the case of an infinite sequence of periods, the average cost per period. Since $A$'s only feedback from period $t$ is whether $x_t > u_t$, $A$ knows neither $u_t$ nor $c(x_t, u_t)$.
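The additive-increase/multiplicative-decrease probing rule described above can be sketched as a small simulation. The threshold value, increase step, and decrease factor below are illustrative assumptions, not values from the talk; the only feedback the sender receives in each period is whether its rate exceeded the threshold:

```python
def aimd_probe(threshold, periods, increase=1, decrease=0.5, start=1):
    """Probe an unknown static bandwidth threshold with AIMD.

    Each period we send x packets; the only feedback is whether
    x exceeded the threshold (i.e., a packet drop occurred).
    Returns a list of (rate_sent, drop_occurred) pairs.
    """
    x = start
    history = []
    for _ in range(periods):
        dropped = x > threshold
        history.append((x, dropped))
        if dropped:
            x = max(1, int(x * decrease))  # multiplicative decrease
        else:
            x = x + increase               # additive increase
    return history

history = aimd_probe(threshold=50, periods=200)
```

With a static threshold, the sending rate traces the familiar sawtooth: it climbs linearly until it overshoots, is cut in half, and climbs again, so the probes oscillate around the unknown bandwidth without ever learning it exactly.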

We consider both the static version of the problem, in which the available bandwidth does not change from period to period, and various dynamic versions, in which the bandwidth changes under the control of a restricted adversary. Joint work with Elias Koutsoupias, Christos Papadimitriou and Scott Shenker.

The Update Distribution Problem

Consider a situation in which many Internet servers are maintaining copies of a shared database. Each server may perform an update at any time, and each update must be distributed promptly to all the servers with high probability. However, we do not require transactional consistency, merely eventual consistency: while updates are being propagated, the copies of the database may temporarily be inconsistent. We are interested in minimizing the time required to disseminate each update, as well as the number of times each update is transmitted from one server to another.

We assume that the Internet enables each server $A$ to initiate contact with a random participating server $B$, but that neither $A$ nor $B$ learns the identity of the other. In this case one can show that the number of rounds for an update to reach all participating servers with high probability must be $\Omega(\log N)$ and the number of transmissions of each update must be $\Omega(N \log \log N)$. We present algorithms achieving these bounds. Joint work with Scott Shenker, Christian Schindelhauer and Berthold Voecking.

Richard M. Karp was educated at the Boston Latin School and Harvard University, where he received a Ph.D. in Applied Mathematics in 1959. From 1959 to 1968 he was a member of the Mathematical Sciences Department at the IBM Thomas J. Watson Research Center. From 1968 to 1994 he was a Professor of Computer Science, Mathematics and Operations Research at the University of California, Berkeley. From 1988 to 1995 he was also associated with the International Computer Science Institute in Berkeley. In 1995 he became a Professor of Computer Science and Engineering and an Adjunct Professor of Molecular Biotechnology at the University of Washington.

The unifying theme in Karp's work has been the study of combinatorial algorithms. His most significant work is the 1972 paper "Reducibility Among Combinatorial Problems", which shows that many of the most commonly studied combinatorial problems are disguised versions of a single underlying problem, and thus are all of essentially the same computational complexity. Much of his subsequent work has concerned the development of parallel algorithms, the probabilistic analysis of combinatorial optimization problems, and the construction of randomized algorithms for combinatorial problems. His current research is concerned with strategies for sequencing the human genome, the physical mapping of large DNA molecules, the analysis of gene expression data, and other combinatorial problems arising in molecular biology.

Karp has received the U.S. National Medal of Science, the Turing Award (ACM), the Fulkerson Prize (AMS and the Mathematical Programming Society), the von Neumann Theory Prize (ORSA-TIMS), the Lanchester Prize (ORSA), the von Neumann Lectureship (SIAM), and the Distinguished Teaching Award (Berkeley). He is a member of the National Academy of Sciences, the National Academy of Engineering and the American Philosophical Society. He is also a Fellow of the American Academy of Arts and Sciences. He holds four honorary degrees.


R. J. Miller Last modified: Tue Oct 10 18:58:44 EDT 2000