Theoretical Computer Science
In the theory of computation area, we use mathematical techniques to understand the nature of computation and to design and analyze algorithms for important and fundamental problems.
The members of the theory group are all interested, in one way or another, in the limitations of computation: What problems are not feasible to solve on a computer? How can the infeasibility of a problem be used to rigorously construct secure cryptographic protocols? What problems cannot be solved faster using more machines? What are the limits to how fast a particular problem can be solved or how much space is needed to solve it? How do randomness, parallelism, the operations that are allowed, and the need for fault tolerance or security affect this? Here are some of the research areas of current interest:
Algorithmic Game Theory/Mechanism Design
Game theory and mechanism design study processes in which self-interested agents participate so as to maximize their desired utility (e.g., their value for the items obtained minus the price paid for those items). Such self-interested behaviour may not align with more global objectives (e.g., the revenue of the seller or the total valuation of all items allocated). Game theory and mechanism design are well-established fields within economics. More recently, algorithmic considerations have shown that many of the beautiful results in these established fields must be rethought in light of computational limitations.
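The utility model above can be made concrete with a sealed-bid second-price (Vickrey) auction, a classic mechanism in which bidding one's true value is a dominant strategy. The following sketch (names and numbers are illustrative) computes an agent's utility as its value for the item minus the price paid:

```python
# A minimal sketch of a second-price (Vickrey) auction: the highest
# bidder wins and pays the second-highest bid. Illustrative only.

def second_price_auction(bids):
    """Return (winner_index, price): highest bid wins, pays second-highest."""
    order = sorted(range(len(bids)), key=lambda i: bids[i], reverse=True)
    winner = order[0]
    price = bids[order[1]] if len(bids) > 1 else 0
    return winner, price

def utility(value, bids, agent):
    """Utility = value for the item obtained minus the price paid (0 if lost)."""
    winner, price = second_price_auction(bids)
    return value - price if winner == agent else 0

# Truthful bidding is a dominant strategy: deviating never helps agent 0.
values = [10, 7, 5]
truthful = utility(values[0], [10, 7, 5], 0)   # wins, pays 7 -> utility 3
overbid  = utility(values[0], [12, 7, 5], 0)   # still wins, still pays 7
underbid = utility(values[0], [6, 7, 5], 0)    # loses -> utility 0
```

Note that overbidding changes neither the winner nor the price here, while underbidding can only forfeit a profitable win; this is the intuition behind the truthfulness of the Vickrey mechanism.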
Algorithm Design and Analysis
Algorithm design and analysis is an essential aspect of every area of computer science. Over the past decade, the big-data phenomenon has brought the design of fast algorithms into sharp focus. Our group focuses on the design of efficient (optimal or approximation) algorithms for classic TCS problems such as maximum flow, and for novel problems arising from machine learning, statistics, and differential privacy. Such algorithm design brings together techniques from many diverse areas, including combinatorics, high-dimensional geometry, continuous optimization, and approximation theory.
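As a concrete instance of the classic problems mentioned, here is a sketch of the Edmonds-Karp algorithm (breadth-first Ford-Fulkerson) for maximum flow; the dictionary-based graph representation and the example network are illustrative choices, not the only ones:

```python
# A sketch of the Edmonds-Karp max-flow algorithm: repeatedly find a
# shortest augmenting path by BFS in the residual graph and push flow
# along it until no augmenting path remains.
from collections import deque

def max_flow(capacity, s, t):
    """capacity: nested dict {u: {v: cap}}. Returns the max s-t flow value."""
    # residual capacities, copied so the input is left unmodified
    residual = {u: dict(nbrs) for u, nbrs in capacity.items()}
    for u in capacity:
        for v in capacity[u]:
            residual.setdefault(v, {}).setdefault(u, 0)  # reverse edges
    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v, cap in residual[u].items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return flow                    # no augmenting path remains
        # find the bottleneck along the path, then augment
        path = []
        v = t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[u][v] for u, v in path)
        for u, v in path:
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
        flow += bottleneck

# Illustrative network: the maximum flow from 's' to 't' is 5.
caps = {'s': {'a': 3, 'b': 2}, 'a': {'b': 1, 't': 2}, 'b': {'t': 3}, 't': {}}
```

Using BFS (rather than arbitrary augmenting paths) bounds the number of augmentations by O(VE), giving a polynomial running time independent of the capacity values.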
There is great beauty in theoretical computer science. It is exciting to discover what makes a particular problem difficult, to find a provably optimal solution to a problem, or to discover unexpected connections between seemingly unrelated problems. Theoretical computer science provides important new ways of thinking about computation and provides lasting insights that are applicable to a wide variety of systems. There are also great challenges and opportunities since so many basic problems remain unsolved.
Cryptography and Differential Privacy
Traditionally, cryptography concerned the question of how two people who share a key can communicate securely over an insecure channel. This question is related to the existence of pseudo-random generators and one-way functions, and involves deep open questions in computational complexity. Public-key encryption and digital signatures extend secure communication to parties who share no secret in advance. More recently, we have considered fine-grained questions about the ways in which information can remain secret, leading to applications such as secure and efficient computation using an insecure "cloud".
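The shared-key setting can be sketched as follows: the two parties expand their short shared key into a long key stream and XOR it with the message. Using SHA-256 in counter mode as the stream generator is an illustrative stand-in for a provably secure pseudo-random generator, whose existence is among the open questions the text refers to; this is a sketch, not a vetted cipher.

```python
# Shared-key encryption via a key stream: expand the key into a
# pseudo-random pad, then XOR. SHA-256 in counter mode stands in,
# illustratively, for a pseudo-random generator.
import hashlib

def key_stream(key: bytes, n: int) -> bytes:
    """Expand `key` into n pseudo-random bytes (illustrative PRG)."""
    out = b''
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, 'big')).digest()
        counter += 1
    return out[:n]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Encrypt or decrypt: XOR with the key stream is its own inverse."""
    pad = key_stream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, pad))

msg = b'meet at noon'
ct = xor_cipher(b'shared-key', msg)   # encrypt
pt = xor_cipher(b'shared-key', ct)    # decrypt with the same shared key
```

If the key stream were truly random (a one-time pad), secrecy would be information-theoretic; with a pseudo-random stream, security rests on computational assumptions.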
Differential privacy is an area of research that formalizes private data analysis, analogously to how cryptography formalizes secure communication. Machine learning and statistics are increasingly applied to data sets which contain sensitive information about people. A slew of attacks has shown that it is possible to link seemingly aggregate statistics of such data sets with publicly available data and reveal private information about specific individuals. The central goal of differential privacy is to design algorithms that make such attacks provably impossible while extracting useful aggregate information from the data. Our focus is on designing efficient differentially private algorithms for problems in statistics and machine learning, and on understanding the inherent limitations of private data analysis.
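The basic building block of such algorithms is the Laplace mechanism: add noise calibrated to a query's sensitivity (how much one person's data can change the answer) divided by the privacy parameter epsilon. The sketch below, with illustrative numbers, releases a private count:

```python
# The Laplace mechanism: perturb a numeric query with Laplace noise of
# scale sensitivity/epsilon, so that changing any one person's data
# changes the output distribution by at most a factor of exp(epsilon).
import math
import random

def laplace_mechanism(true_answer, sensitivity, epsilon, rng=random):
    """Return an epsilon-differentially-private estimate of true_answer."""
    scale = sensitivity / epsilon
    # sample Laplace(scale) by inverse transform on a uniform draw
    u = rng.random() - 0.5
    noise = -scale * (1 if u >= 0 else -1) * math.log(1 - 2 * abs(u))
    return true_answer + noise

# Example: privately release a count. A count has sensitivity 1, since
# adding or removing one person changes it by at most 1.
random.seed(0)  # fixed seed so the sketch is reproducible
private_count = laplace_mechanism(true_answer=1234, sensitivity=1.0, epsilon=0.5)
```

Smaller epsilon means stronger privacy but larger noise; quantifying this trade-off precisely is one form the "inherent limitations" question takes.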
Discrete Mathematics
Discrete mathematics involves the study of the properties, algorithms, and applications of mathematical structures built on discrete objects. There is a rich interplay between theoretical computer science and discrete mathematics: structural results about discrete objects often underlie algorithmic advances, while an algorithmic viewpoint can lead to a deeper understanding of the mathematics. Our group is primarily concerned with the mathematical and algorithmic issues pertaining to graph theory and discrepancy theory.
Discrepancy theory studies how well discrete objects can approximate continuous ones: how uniform can a set of n points in the plane be? How balanced can a hypergraph coloring be? Such questions have many connections to computer science, from proving data structure lower bounds, to designing efficient approximation algorithms. Our group is interested in these connections, and in algorithmic problems in discrepancy theory.
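The "balanced colouring" question can be stated concretely: colour the ground set with +1/-1 so that every set in the system is as balanced as possible. The brute-force sketch below (the set systems are illustrative) computes this minimum imbalance, the combinatorial discrepancy:

```python
# Brute-force combinatorial discrepancy of a set system on n points:
# the minimum over all +1/-1 colourings of the largest |sum| over a set.
from itertools import product

def discrepancy(n, sets):
    """min over chi: {0..n-1} -> {+1,-1} of max over S of |sum_{i in S} chi(i)|."""
    best = float('inf')
    for chi in product((+1, -1), repeat=n):
        worst = max(abs(sum(chi[i] for i in S)) for S in sets)
        best = min(best, worst)
    return best

# Each set has even size and the sets nest, so a perfect split exists.
disc_even = discrepancy(4, [{0, 1}, {2, 3}, {0, 1, 2, 3}])
# A "triangle" of pairs on 3 points: by pigeonhole some pair is
# monochromatic, so no colouring balances all three sets.
disc_tri = discrepancy(3, [{0, 1}, {1, 2}, {0, 2}])
```

Brute force takes 2^n time; whether low-discrepancy colourings guaranteed by existence proofs can be found efficiently is exactly the kind of algorithmic question our group studies.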
A graph consists of V (a set of vertices) and E (a set of edges, either ordered or unordered, joining pairs of vertices in V). Since any binary relation may be modeled as a graph, applications of graphs pervade computer science. The interests of our group include computational biology, random structures, and the structure of electrical circuits, as well as more traditional graph theory such as graph colouring, perfect graphs, Hamiltonicity, and generalizations of graphs.
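The definitions above translate directly into code: a graph as an adjacency structure over V and E, here with a simple greedy colouring as an example of the colouring problems mentioned. The graph and vertex names are illustrative:

```python
# A graph as an adjacency list, with greedy colouring: scan the vertices
# and give each the smallest colour not used by an already-coloured
# neighbour. This uses at most (max degree + 1) colours.

def greedy_colouring(adj):
    """Return a dict vertex -> colour (nonnegative int); adjacent vertices differ."""
    colour = {}
    for v in adj:                        # the vertex order affects the result
        used = {colour[u] for u in adj[v] if u in colour}
        c = 0
        while c in used:
            c += 1
        colour[v] = c
    return colour

# The binary relation {(a,b), (b,c), (a,c), (c,d)} as an undirected graph.
adj = {'a': ['b', 'c'], 'b': ['a', 'c'], 'c': ['a', 'b', 'd'], 'd': ['c']}
colours = greedy_colouring(adj)
num_colours = len(set(colours.values()))  # the triangle a,b,c forces 3
```

Greedy colouring is fast but not optimal in general; deciding whether a graph is 3-colourable is NP-complete, tying this topic back to the limits-of-computation theme above.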
Quantum Computing
Quantum computing presents the tantalizing possibility that we will be able to construct machines capable of solving certain problems exponentially faster than any classical computer. When quantum computers are built, fields ranging from cryptography to physics will be profoundly affected. Our group is interested in fundamental questions about the interplay between computation, cryptography, and quantum information. Questions include: how can quantum entanglement be used to certify and expand secure randomness? What is the complexity of quantum state transformations? Which computational problems admit a quantum speedup?
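The simplest example of a quantum speedup is Deutsch's algorithm: one quantum query to an oracle for f: {0,1} -> {0,1} decides whether f is constant or balanced, where any classical algorithm needs two queries. The stdlib-only simulation below represents a 2-qubit state as four amplitudes (basis |x, y> at index 2x + y); the encoding is an illustrative choice:

```python
# A classical simulation of Deutsch's algorithm on 2 qubits.
from math import sqrt

def apply_hadamard(state, qubit):
    """Apply the Hadamard gate H to one qubit of a 2-qubit state vector."""
    new = list(state)
    for i in range(4):
        if (i >> qubit) & 1 == 0:
            j = i | (1 << qubit)
            a, b = state[i], state[j]
            new[i] = (a + b) / sqrt(2)   # H|0> = (|0>+|1>)/sqrt(2)
            new[j] = (a - b) / sqrt(2)   # H|1> = (|0>-|1>)/sqrt(2)
    return new

def apply_oracle(state, f):
    """The oracle U_f |x, y> = |x, y XOR f(x)>; qubit 1 is x, qubit 0 is y."""
    new = [0.0] * 4
    for i, amp in enumerate(state):
        x, y = (i >> 1) & 1, i & 1
        new[(x << 1) | (y ^ f(x))] += amp
    return new

def deutsch(f):
    """Decide 'constant' vs 'balanced' with a single oracle query."""
    state = [0.0] * 4
    state[0b01] = 1.0                 # start in |x=0, y=1>
    state = apply_hadamard(state, 1)  # H on x
    state = apply_hadamard(state, 0)  # H on y
    state = apply_oracle(state, f)    # the one quantum query
    state = apply_hadamard(state, 1)  # H on x again; interference reveals f
    # probability that measuring x yields 0
    p0 = sum(abs(state[i]) ** 2 for i in range(4) if (i >> 1) & 1 == 0)
    return 'constant' if p0 > 0.5 else 'balanced'
```

The speedup here is only one query versus two, but the same interference idea scales: the Deutsch-Jozsa generalization achieves one query versus exponentially many in the deterministic classical setting.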
Theory of Distributed Computing
Our research in the theory of distributed computing studies both shared memory and message passing systems. We are particularly interested in issues of fault tolerance, synchronization, and the limits on the computational power of shared memory primitives.
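A shared-memory primitive and the synchronization it enables can be sketched as follows: a spinlock built from atomic test-and-set. On real hardware test-and-set is a single atomic instruction; in this runnable Python sketch its atomicity is simulated with a `threading.Lock`, and the worker counts are illustrative.

```python
# A spinlock from an atomic test-and-set primitive, used to enforce
# mutual exclusion among threads incrementing a shared counter.
import threading

class TestAndSetLock:
    def __init__(self):
        self._flag = False
        self._atomic = threading.Lock()   # models hardware atomicity only

    def _test_and_set(self):
        """Atomically set the flag to True and return its previous value."""
        with self._atomic:
            old, self._flag = self._flag, True
            return old

    def acquire(self):
        while self._test_and_set():       # spin until the flag was False
            pass

    def release(self):
        self._flag = False

lock = TestAndSetLock()
counter = 0

def worker():
    global counter
    for _ in range(1000):
        lock.acquire()
        counter += 1                      # critical section
        lock.release()

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Test-and-set suffices for mutual exclusion but, in the wait-free hierarchy, is strictly weaker than primitives like compare-and-swap; such separations are exactly the "limits on the computational power of shared memory primitives" referred to above.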