What Does an Aberrated Photo Tell Us about the Lens and the Scene?

Huixuan Tang and Kiriakos N. Kutulakos. In ICCP 2013.


Abstract
    We investigate the feasibility of recovering lens properties, scene appearance, and depth from a single photo containing optical aberrations and defocus blur. Starting from the ray intersection function of a rotationally-symmetric compound lens and the theory of Seidel aberrations, we obtain three basic results. First, we derive a model for the lens PSF that (1) accounts for defocus and primary Seidel aberrations and (2) describes how light rays are bent by the lens. Second, we show that the problem of inferring depth and aberration coefficients from the blur kernel of just one pixel has three degrees of freedom in general, and therefore cannot be solved unambiguously. Third, we show that these degrees of freedom can be eliminated by inferring scaled aberration coefficients and depth from the blur kernels at multiple pixels (at least three) in a single photo. These theoretical results suggest that single-photo aberration estimation and depth recovery may indeed be possible, given recent progress on blur kernel estimation and blind deconvolution.
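The forward direction of such a PSF model is easy to illustrate: a blur kernel can be simulated from defocus plus primary Seidel aberration coefficients by building the pupil wavefront error and taking the squared magnitude of its Fourier transform. Below is a minimal NumPy sketch of this idea; the function name, parameterization, and coefficient units (wavelengths of wavefront error at the pupil edge) are assumptions for illustration, and the sketch uses standard Fourier optics rather than the paper's ray intersection formulation.

```python
import numpy as np

def seidel_psf(defocus=0.5, spherical=0.3, coma=0.2, astig=0.1,
               field_pos=1.0, n=256):
    """Simulate a monochromatic PSF from defocus plus primary Seidel
    aberrations via a pupil-function FFT (illustrative sketch only).

    Coefficients are wavefront error in wavelengths at the pupil edge;
    `field_pos` is a normalized image height h that scales the
    field-dependent terms (coma, astigmatism)."""
    # Normalized pupil coordinates on [-1, 1]^2.
    x = np.linspace(-1.0, 1.0, n)
    X, Y = np.meshgrid(x, x)
    rho2 = X**2 + Y**2
    pupil = rho2 <= 1.0  # circular aperture mask

    # Seidel wavefront error W in wavelengths. Classic dependencies:
    #   defocus     ~ rho^2
    #   spherical   ~ rho^4
    #   coma        ~ h * rho^3 cos(theta)   (written as h * y * rho^2)
    #   astigmatism ~ h^2 * rho^2 cos^2(theta) (written as h^2 * y^2)
    h = field_pos
    W = (defocus * rho2
         + spherical * rho2**2
         + coma * h * Y * rho2
         + astig * h**2 * Y**2)

    # Complex pupil function; the PSF is |FFT|^2, zero-padded 2x
    # for finer sampling of the kernel.
    P = pupil * np.exp(2j * np.pi * W)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(P, s=(2 * n, 2 * n))))**2
    return psf / psf.sum()  # normalize to unit energy
```

Because the coma and astigmatism terms scale with the field position h, the simulated kernel varies across the image plane; this is the same field dependence that makes blur kernels at multiple pixels more informative than the kernel at a single pixel.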
Publication
  • Tang, H. and Kutulakos, K.N., What Does an Aberrated Photo Tell Us about the Lens and the Scene?, In Proc. 5th Int. Conf. on Computational Photography (ICCP), Boston, MA, 2013. Oral. [pdf][slide]
Data (under construction)