Theoretical Computer Science

In the theory of computation, we use mathematical techniques to understand the nature of computation and to design and analyze algorithms for important and fundamental problems.

The members of the theory group are all interested, in one way or another, in the limitations of computation: What problems are not feasible to solve on a computer? How can the infeasibility of a problem be used to rigorously construct secure cryptographic protocols? What problems cannot be solved faster using more machines? What are the limits to how fast a particular problem can be solved or how much space is needed to solve it? How do randomness, parallelism, the operations that are allowed, and the need for fault tolerance or security affect this? Here are some of the research areas of current interest:

Algorithmic Game Theory/Mechanism Design

Game theory and mechanism design study processes in which self-interested agents participate so as to maximize their own utility (e.g. their value for the items they obtain minus the price paid for those items). Such self-interested behaviour may not align with more global objectives (e.g. the revenue of the seller or the total valuation of all items allocated). Game theory and mechanism design are well-established fields within economics. More recently, algorithmic considerations have shown that many of the beautiful results in these fields have to be rethought in light of computational limitations.
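
For illustration, here is a minimal Python sketch (the bidder names and values are hypothetical) of the sealed-bid second-price auction, a classic mechanism in which the highest bidder wins but pays the second-highest bid, so that bidding one's true value is a dominant strategy for each self-interested agent:

    def second_price_auction(bids):
        # bids: dict mapping each bidder to a sealed bid
        ranked = sorted(bids.items(), key=lambda item: item[1], reverse=True)
        winner, _ = ranked[0]
        price = ranked[1][1] if len(ranked) > 1 else 0  # pay the second-highest bid
        return winner, price

    # Each agent's utility is its value for the item minus the price paid (zero if it loses).
    winner, price = second_price_auction({"alice": 10, "bob": 7, "carol": 3})
    print(winner, price)  # alice wins and pays 7; if 10 is her true value, her utility is 3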

Algorithms

Algorithm design and analysis is an essential aspect of every area of computer science. Over the past decade, the big data phenomenon has brought the design of fast algorithms into sharp focus. Our group focuses on the design of efficient (optimal or approximation) algorithms for classic TCS problems such as maximum flow, and for novel problems arising from machine learning, statistics, and differential privacy. Such algorithm design brings together techniques from diverse areas such as combinatorics, high-dimensional geometry, continuous optimization, and approximation theory.
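
As a small example of an approximation algorithm of the kind studied here (the input graph below is hypothetical), the following Python sketch computes a vertex cover at most twice the minimum size by repeatedly taking both endpoints of an uncovered edge:

    def vertex_cover_2approx(edges):
        # Greedy 2-approximation: the chosen edges form a matching, and any
        # cover must contain at least one endpoint of each matched edge.
        cover = set()
        for u, v in edges:
            if u not in cover and v not in cover:
                cover.update((u, v))
        return cover

    # On the path 1-2-3-4 the optimum cover {2, 3} has size 2;
    # the algorithm returns {1, 2, 3, 4}, within a factor of two.
    print(vertex_cover_2approx([(1, 2), (2, 3), (3, 4)]))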

Complexity Theory

Motivated by questions such as the P vs NP problem, the goal of complexity theory is to find the limits of efficient computation. Which problems are inherently hard to solve, and what makes them so? How do resource constraints (such as limited time and memory) inhibit algorithms, and why? And which mathematical and algorithmic tools can we use to study these questions?
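
To make the P vs NP question concrete, the following Python sketch (with a hypothetical formula) contrasts its two sides: verifying a proposed satisfying assignment of a CNF formula takes polynomial time, while the obvious search over all assignments takes exponential time, and whether that blow-up is avoidable is precisely the open question.

    from itertools import product

    def satisfies(clauses, assignment):
        # Polynomial-time verification: check that every clause has a true literal.
        # Literals are signed integers: +i means x_i, -i means NOT x_i.
        return all(any(assignment[abs(lit)] == (lit > 0) for lit in clause)
                   for clause in clauses)

    def brute_force_sat(clauses, n_vars):
        # Exhaustive search over all 2^n assignments.
        for bits in product([False, True], repeat=n_vars):
            assignment = dict(enumerate(bits, start=1))
            if satisfies(clauses, assignment):
                return assignment
        return None

    # (x1 OR NOT x2) AND (x2 OR x3)
    print(brute_force_sat([[1, -2], [2, 3]], n_vars=3))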

Research in this area interacts broadly with other areas in theoretical computer science and in mathematics, and integrates various focal points of interest. Among these focal points are lower bounds for circuits and for Turing machines, proof complexity, algebraic and arithmetic computation, pseudorandomness and derandomization, error-correcting codes, interactive protocols, circuit-analysis algorithms, and more.

Cryptography

Traditionally, cryptography concerned the question of how two people who share a key can communicate securely over an insecure channel. This question is related to the existence of pseudo-random generators and one-way functions, and involves deep open questions in computational complexity. Public-key encryption and digital signatures go further, allowing parties to communicate securely and authenticate messages even without a pre-shared key. Recently, we have considered fine-grained questions about ways in which information can remain secret, leading to applications such as secure and efficient computing using an insecure "cloud".
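
The shared-key setting can be made concrete with the one-time pad, sketched below in Python: XORing the message with a truly random key of the same length hides it perfectly, and a pseudo-random generator is, roughly, what would let a short shared key stand in for that long pad.

    import secrets

    def keygen(n_bytes):
        # A uniformly random shared key, as long as the message.
        return secrets.token_bytes(n_bytes)

    def xor_bytes(key, data):
        # XOR is its own inverse, so the same function encrypts and decrypts.
        return bytes(k ^ d for k, d in zip(key, data))

    message = b"hello"
    key = keygen(len(message))
    ciphertext = xor_bytes(key, message)
    assert xor_bytes(key, ciphertext) == message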

Differential Privacy

Differential privacy is an area of research which formalizes private data analysis, analogously to how cryptography formalizes secure communication. Machine learning and statistics are increasingly applied to data sets which contain sensitive information about people. A slew of attacks has shown that it is possible to link seemingly aggregate statistics of such data sets with publicly available data and reveal private information about specific individuals. The central goal of differential privacy is to design algorithms that make such attacks provably impossible while still extracting useful aggregate information from the data. Our focus is on designing efficient differentially private algorithms for problems in statistics and machine learning, and on understanding the inherent limitations of private data analysis.
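
A canonical example is the Laplace mechanism for a counting query, sketched below in Python (the records and the privacy parameter are hypothetical): adding or removing one person changes the count by at most 1, so adding Laplace noise with scale 1/epsilon makes the released answer epsilon-differentially private while keeping it close to the truth.

    import numpy as np

    def private_count(records, predicate, epsilon):
        true_count = sum(1 for r in records if predicate(r))
        # Counting queries have sensitivity 1, so Laplace(1/epsilon) noise suffices.
        noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
        return true_count + noise

    ages = [34, 29, 51, 42, 67, 23]
    print(private_count(ages, lambda age: age >= 40, epsilon=0.5))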

Discrete Mathematics

Discrete Mathematics involves the study of the properties, algorithms and applications of mathematical structures built on discrete objects. There is a rich interplay between theoretical computer science and discrete mathematics: structural results about discrete objects often underlie algorithmic advances, while an algorithmic viewpoint can lead to a deeper understanding of the mathematics. Our group is primarily concerned with the mathematical and algorithmic issues pertaining to Graph Theory and Discrepancy Theory.

Discrepancy Theory

Discrepancy theory studies how well discrete objects can approximate continuous ones: How uniform can a set of n points in the plane be? How balanced can a hypergraph colouring be? Such questions have many connections to computer science, from proving data structure lower bounds to designing efficient approximation algorithms. Our group is interested in these connections, and in algorithmic problems in discrepancy theory.
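
For a concrete instance of the colouring question, the Python sketch below (on a hypothetical set system) computes the discrepancy of a +1/-1 colouring, i.e. the largest imbalance over all sets; discrepancy theory asks how small this quantity can be made by choosing the colouring well.

    def discrepancy(sets, colouring):
        # Largest |sum of colours| over any set in the system.
        return max(abs(sum(colouring[e] for e in s)) for s in sets)

    sets = [{0, 1}, {1, 2, 3}, {0, 2}]
    colouring = {0: +1, 1: -1, 2: -1, 3: +1}
    print(discrepancy(sets, colouring))  # 1: every set is nearly balanced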

Graph Theory

A graph consists of V (a set of vertices) and E (a set of edges, either ordered or unordered, joining pairs of vertices in V). It is clear that any binary relation may be modeled as a graph; thus, applications of graphs pervade computer science. The interests of our group include computational biology, random structures and the structure of electrical circuits, as well as more traditional graph theory such as graph colouring, perfect graphs, Hamiltonicity and generalizations of graphs.
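
As a minimal illustration of the definition (the relation below is hypothetical), here is how a binary relation on a vertex set is stored as a directed graph in adjacency-list form in Python:

    # Vertices V and directed edges E encoding a binary relation: (u, v) in E means u R v.
    V = {1, 2, 3, 4}
    E = {(1, 2), (2, 3), (3, 1), (3, 4)}

    adjacency = {v: set() for v in V}
    for u, v in E:
        adjacency[u].add(v)

    print(adjacency[3])  # vertices related to 3, i.e. {1, 4}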

Quantum Computing

Quantum computing presents the tantalizing possibility that we will be able to construct machines capable of solving certain problems exponentially faster than any classical computer. When quantum computers are built, fields ranging from cryptography to physics will be profoundly affected. Our group is interested in fundamental questions about the interplay between computation, cryptography, and quantum information. Questions include: how can quantum entanglement be used to certify and expand secure randomness? What is the complexity of quantum state transformations? Which computational problems admit a quantum speedup?
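
As a tiny numerical illustration (a classical simulation, not a quantum speedup), the Python sketch below applies a Hadamard gate to the state |0>, producing an equal superposition whose measurement yields a uniformly random bit, the simplest example of quantum randomness:

    import numpy as np

    ket0 = np.array([1.0, 0.0])                   # the basis state |0>
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # the Hadamard gate

    state = H @ ket0                              # (|0> + |1>) / sqrt(2)
    probabilities = np.abs(state) ** 2            # measurement probabilities
    print(probabilities)                          # [0.5, 0.5]: a fair random bit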

Theory of Distributed Computing

Our research in the theory of distributed computing studies both shared memory and message passing systems. We are particularly interested in issues of fault tolerance, synchronization, and the limits on the computational power of shared memory primitives.
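
To illustrate what is meant by the power of a shared memory primitive, here is a small Python sketch (using a lock only to simulate the atomicity that hardware compare-and-swap provides) of how compare-and-swap lets any number of processes reach wait-free consensus: the first successful swap decides the value for everyone.

    import threading

    class CASRegister:
        def __init__(self, value=None):
            self._value = value
            self._lock = threading.Lock()  # simulates hardware atomicity

        def compare_and_swap(self, expected, new):
            # Atomically write `new` if the current value equals `expected`.
            with self._lock:
                if self._value == expected:
                    self._value = new
                    return True
                return False

        def read(self):
            with self._lock:
                return self._value

    def propose(register, my_value):
        # Wait-free consensus: only the first compare-and-swap succeeds,
        # and every process returns that agreed-upon value.
        register.compare_and_swap(None, my_value)
        return register.read()

    r = CASRegister()
    print(propose(r, "a"), propose(r, "b"))  # both calls return "a"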