Course Description

Significant recent advances in science have been driven by machine learning methods. However, while machine learning methods enable new capabilities in solving challenging modeling problems and inverse problems, they often ignore physics-based models. As a result, they may exhibit brittleness or fail to generalize to new sensor technologies, measurement scenarios, or physics. Conversely, physics-based models generalize well because they are rooted in sound theoretical principles; yet, they fail to capture effects that are difficult to model analytically and are more easily learned from data.

In the areas of computer vision, computer graphics, computational imaging, and numerical simulation, physics-informed neural representations have emerged as a paradigm that combines the advantages of machine learning methods and physics-based models. These representations consist of a neural network used to represent physical parameters such as appearance, geometry, reflectance, or other material properties. Compared to conventional representations based on pixels, voxels, point clouds, or meshes, neural representations are memory-efficient, easy to optimize (because they are differentiable and continuous), and able to incorporate data-driven priors over a desired solution space. In this course, we will probe the foundations of physics-informed neural representations and discuss their relevance to application scenarios across vision, graphics, imaging, and simulation.
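To make the idea concrete, the following minimal sketch (in PyTorch; the architecture, target field, and training loop are illustrative, not course material) fits a coordinate network to a continuous 1D field. A physics-informed variant would add a PDE residual term, computed via automatic differentiation, to the loss.

    # Minimal sketch (illustrative only): a coordinate MLP u_theta(x) that
    # represents a continuous 1D field and is fit by gradient descent.
    import torch
    import torch.nn as nn

    model = nn.Sequential(
        nn.Linear(1, 64), nn.Tanh(),
        nn.Linear(64, 64), nn.Tanh(),
        nn.Linear(64, 1),
    )

    # Hypothetical target field u(x) = sin(2*pi*x), sampled on a grid in [0, 1].
    x = torch.linspace(0.0, 1.0, 256).unsqueeze(-1)
    u = torch.sin(2.0 * torch.pi * x)

    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(2000):
        opt.zero_grad()
        loss = ((model(x) - u) ** 2).mean()  # data-fitting loss; a PINN would
        loss.backward()                      # add a PDE residual via autograd
        opt.step()

    # Because the representation is continuous and differentiable, it can be
    # queried (and differentiated) at arbitrary coordinates, not just on the
    # training grid.
    u_new = model(torch.rand(10, 1))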

Recommended preparation: Background in machine learning, including familiarity with common network architectures and optimization methods; graduate-level exposure to one or more of computer vision, computer graphics, sensing or numerical simulation is desirable but not required.

Teaching Assistant

Course Logistics

Discussions: Mondays 10:00am–12:00pm in Earth Sciences Centre (ES) 1047.

Instructor office hours: David: Mondays 9:00am–10:00am (BA7228); Aviad: Mondays 12:00pm–1:00pm (BA7250), starting January 15. Office hours are for discussing projects, course material, etc., and will only be held by the instructor leading that week's discussion; please refer to the course schedule.

Role assignments & schedule: See Quercus for the link to the Google Sheet with all role and scheduling information.

Contact: Course announcements and general information will be posted on the course forum on Piazza.

Coursework

Role-playing participation (55%)

Marks are distributed equally across the 11 paper-discussion lectures (5% per lecture). All role-specific written components are due at midnight before the paper is discussed in class (i.e., Sunday at 11:59pm). See Quercus for download links to all reading materials.

Term project (45%)

Project proposal is due February 14 (5% of total grade). Final report is due at the end of classes (40% of total grade).

The final report grade takes into account your source code submission (code organization and documentation) and the report itself (appropriate format and length, abstract, introduction, related work, description of your method, quantitative and qualitative evaluation of your method, results, discussion & conclusion, bibliography).

You can work in teams of up to 3 students for the project. Submit only one proposal and one final report per team. The expected amount of work scales with the number of team members: if two teams work on a similar project, we expect less work from the smaller team.

The project proposal is a 1–2 page document that should contain the following elements:
  • a clear motivation of your idea;
  • a discussion of related work, with at least 3 scientific references (i.e., scientific papers, not blog articles or websites);
  • an overview of what exactly your project is about and what its final goals are;
  • milestones for your team, with a timeline and intermediate goals.
Once you submit your proposal, we may ask you to revise it, and we will assign a project mentor to your team.

The final project report should look like a short (~6 pages) conference paper. We expect the following sections, which are standard practice for conference papers: abstract, introduction, related work, theory (i.e., your approach), analysis and evaluation, results, discussion and conclusion, references. Use the CVPR 2025 LaTeX template for your report. A detailed rubric can be found on Quercus under the final project report assignment.
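For orientation, a report skeleton along these lines can serve as a starting point; this sketch assumes the cvpr.sty file and option names from the official CVPR 2025 template release, so check the template's own documentation for the exact interface:

    % Skeleton only; assumes cvpr.sty from the official CVPR 2025 template.
    \documentclass[10pt,twocolumn,letterpaper]{article}
    \usepackage[review]{cvpr}  % option names may differ by template release
    \usepackage{graphicx}

    \begin{document}

    \title{Your Project Title}
    \author{Team Member 1 \and Team Member 2 \and Team Member 3}
    \maketitle

    \begin{abstract}
    ...
    \end{abstract}

    \section{Introduction}
    \section{Related Work}
    \section{Theory}
    \section{Analysis and Evaluation}
    \section{Results}
    \section{Discussion and Conclusion}

    {\small
    \bibliographystyle{plain}  % substitute the .bst shipped with the template
    \bibliography{refs}
    }

    \end{document}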

Late policy

All homework is due at midnight on the due date. For role-specific homework, there is a 30% deduction if you submit late but before the start of that week's lecture (i.e., anytime between 12:01am and 10:00am on Monday). No homework will be accepted after the start of lecture.

If you need more time to submit your project proposal or final project report, you will need to discuss your timeline with the instructor and get approval at least 1 week before the posted due date.

ChatGPT policy

You may use any tools you find productive in preparing your reports, but you are responsible for any misrepresentations, inaccuracies, or plagiarism in the result. If we find such defects in submitted work, we reserve the right to impose penalties ranging from grade deductions to a report of academic misconduct, depending on the severity of the offense.

Lecture Schedule

Week 1 (Mon Jan 6): Introduction (Aviad/David)
Course overview, role-playing class format, course components, grading, etc.

Part I: Physics-informed neural networks

Week 2 (Mon Jan 13): PINNs (Aviad)
Main paper:
  • Raissi et al., Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations (Journal of Computational Physics, 2019).
To probe further:
  • Lagaris et al., Artificial Neural Networks for Solving Ordinary and Partial Differential Equations (IEEE Trans. Neural Netw., 1998).
  • Sitzmann et al., Implicit Neural Representations with Periodic Activation Functions (Proc. NeurIPS 2020).
  • Karniadakis et al., Physics-informed machine learning (Nature Reviews Physics, 2021).

Week 3 (Mon Jan 20): Neural Operators (Aviad)
Main paper:
  • Li et al., Fourier Neural Operator for Parametric Partial Differential Equations (Proc. ICLR 2021).
To probe further:
  • Kovachki et al., Neural Operator: Learning Maps Between Function Spaces (JMLR, 2023).
  • Lu et al., Learning nonlinear operators via DeepONet based on the universal approximation theorem of operators (Nat. Mach. Intell., 2021).

Week 4 (Mon Jan 27): Fourier Features (David)
Main paper:
  • Tancik et al., Fourier Features Let Networks Learn High Frequency Functions in Low Dimensional Domains (NeurIPS 2020).
To probe further:
  • Jacot et al., Neural Tangent Kernel: Convergence and Generalization in Neural Networks (Proc. NeurIPS 2018).

Week 5 (Mon Feb 3): Graph Representations (David)
Main paper:
  • Pfaff et al., Learning Mesh-Based Simulation with Graph Networks (Proc. ICLR 2021).
To probe further:
  • Sanchez-Gonzalez et al., Learning to Simulate Complex Physics with Graph Networks (Proc. ICML 2020).
  • Kochkov et al., Machine learning–accelerated computational fluid dynamics (PNAS, 2021).

Part II: Equation discovery & interpretable ML

Week 6 (Mon Feb 10): PINN-SR (Aviad)
Event: Project proposals due at 11:59pm.
Main paper:
  • Chen et al., Physics-informed learning of governing equations from scarce data (Nature Communications, 2021).
To probe further:
  • Brunton et al., Discovering governing equations from data by sparse identification of nonlinear dynamical systems (PNAS, 2016).
  • Schmid, Dynamic mode decomposition of numerical and experimental data (Journal of Fluid Mechanics, 2010).

Week 7 (Mon Feb 17): Winter break (no lecture)

Week 8 (Mon Feb 24): Distillation (Aviad)
Main paper:
  • Cranmer et al., Discovering Symbolic Models from Deep Learning with Inductive Biases (NeurIPS 2020).
To probe further:
  • Schmidt & Lipson, Distilling Free-Form Natural Laws from Experimental Data (Science, 2009).
  • Cranmer, Interpretable Machine Learning for Science with PySR and SymbolicRegression.jl (arXiv, 2023).

Week 9 (Mon Mar 3): HNNs (Aviad)
Main paper:
  • Greydanus et al., Hamiltonian Neural Networks (NeurIPS 2019).
To probe further:
  • Chen et al., Neural Ordinary Differential Equations (NeurIPS 2018).
  • Cranmer et al., Lagrangian Neural Networks (ICLR 2020).

Part III: Applications in vision and sensing

Week 10 (Mon Mar 10): Wavefront Shaping (Aviad)
Main paper:
  • Feng et al., NeuWS: Neural wavefront shaping for guidestar-free imaging through static and dynamic scattering media (Sci. Adv., 2023).
To probe further:
  • Hampson et al., Adaptive optics for high-resolution imaging (Nat. Rev. Methods Prim., 2021).

Week 11 (Mon Mar 17): Optical Neural Networks (David)
Main paper:
  • Lin et al., All-optical machine learning using diffractive deep neural networks (Science, 2018).
To probe further:
  • Chang et al., Hybrid optical-electronic convolutional neural networks with optimized diffractive optics for image classification (Sci. Rep., 2018).

Week 12 (Mon Mar 24): 3D Reconstruction (David)
Main paper:
  • Chen et al., 3D Reconstruction with Fast Dipole Sums (CVPR 2024).
To probe further:
  • Mildenhall et al., NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis (Proc. ECCV 2020).
  • Wang et al., NeuS2: Fast Learning of Neural Implicit Surfaces for Multi-view Reconstruction (Proc. ICCV 2023).

Week 13 (Mon Mar 31): Generative Models (David)
Main paper:
  • Wu et al., CAT4D: Create Anything in 4D with Multi-View Video Diffusion Models (arXiv, 2024).
To probe further:
  • Gao et al., CAT3D: Create Anything in 3D with Multi-View Diffusion Models (Proc. NeurIPS 2024).
  • Kerbl et al., 3D Gaussian Splatting for Real-Time Radiance Field Rendering (SIGGRAPH 2023).

Fri Apr 25: Final project reports due at 11:59pm.

Additional Information

Related courses at UofT