I am a graduate student in the DCS Theory Group at the University of Toronto, where I am lucky to be supervised by Stephen Cook and Toni Pitassi. My current research interests lie in different aspects of computational complexity theory. I also have an interest in the intersection of computer science and cognitive science (my primary collaborators here are Tarek Besold and Todd Wareham). My most recent project in this area applied concepts from computational complexity to models of metaphor and concept re-representation.
A package for converting a halfspace representation of a convex polytope into a vertex representation of that polytope, written in the R programming language. It is a port of lrslib, written by David Avis.
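The conversion the package performs can be sketched independently of its API (which is not shown here). The following is an illustrative Python example, using scipy rather than the R port, of turning a halfspace description into the corresponding vertex description; the halfspace data for the unit square is a made-up toy input.

```python
import numpy as np
from scipy.spatial import HalfspaceIntersection

# H-representation of the unit square, one row [A | b] per inequality Ax + b <= 0:
#   -x <= 0,  -y <= 0,  x - 1 <= 0,  y - 1 <= 0
halfspaces = np.array([
    [-1.0,  0.0,  0.0],
    [ 0.0, -1.0,  0.0],
    [ 1.0,  0.0, -1.0],
    [ 0.0,  1.0, -1.0],
])
interior_point = np.array([0.5, 0.5])  # any strictly interior point works

hs = HalfspaceIntersection(halfspaces, interior_point)
vertices = hs.intersections  # the V-representation: here, the square's corners
print(np.round(vertices, 6))
```

Unlike scipy's approach, lrslib enumerates vertices by reverse search over pivots of the constraint system, which lets it handle degenerate inputs in exact rational arithmetic; the sketch above only illustrates the input/output relationship.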
Consider the following problem: we are given a collection of sets of isotopic measurements (say, concentrations of particular isotopes of carbon and nitrogen) taken from a fixed "predator" species and from a collection of "prey" species that the predator is known to prey on. In an ideal world, each set of predator measurements would be a convex combination of isotopic measurements sampled from the prey distributions, and standard machine learning techniques (Gaussian mixture models) would give us an estimate of the proportion of each type of prey in the predator's diet. Unfortunately, the world is not ideal in this way: in practice the above experiment can only be run up to certain "error terms", known in the computational ecology literature as discrimination factors. EDFIR (Estimating Discrimination Factors in R) is a collection of functions for reading in data sets of isotopic measurements and generating a prior distribution over discrimination factors for later input into a Gaussian mixture model. Joint work with Alex Bond.
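The ideal-world model above can be made concrete with a small sketch (again in Python rather than R, and not using EDFIR itself): given mean isotopic signatures for each prey species, the best convex combination explaining a predator sample is a constrained least-squares problem, and the leftover residual is one crude proxy for the kind of error term that discrimination factors capture. All numbers below are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical mean (d13C, d15N) signatures for three prey species (one row each).
prey_means = np.array([
    [-20.0,  5.0],
    [-15.0, 10.0],
    [-25.0,  8.0],
])
predator = np.array([-19.0, 8.0])  # one hypothetical predator sample

# Find diet proportions p with p >= 0 and sum(p) = 1 minimizing
# the squared distance between p @ prey_means and the predator sample.
def loss(p):
    return float(np.sum((p @ prey_means - predator) ** 2))

res = minimize(
    loss,
    x0=np.full(3, 1.0 / 3.0),                       # start from a uniform diet
    bounds=[(0.0, 1.0)] * 3,                         # proportions lie in [0, 1]
    constraints={"type": "eq", "fun": lambda p: p.sum() - 1.0},
    method="SLSQP",
)
proportions = res.x

# Whatever the best convex combination cannot explain is the "error term";
# a prior over such terms is what a discrimination-factor estimate supplies.
residual = predator - proportions @ prey_means
```

In this toy instance the predator sample happens to lie inside the convex hull of the prey means, so the residual is essentially zero; real data generally leaves a nonzero residual, which is why a prior over discrimination factors is needed before fitting the mixture model.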