Significant recent advances in science have been driven by machine learning methods. However, while machine learning methods enable new capabilities in solving challenging modeling problems and inverse problems, they often ignore physics-based models. As a result, they may exhibit brittleness or fail to generalize to new sensor technologies, measurement scenarios, or physics. Conversely, physics-based models generalize well because they are rooted in sound theoretical principles; yet, they fail to capture effects that are difficult to model analytically and are more easily learned from data.
In the areas of computer vision, computer graphics, computational imaging, and numerical simulation, physics-informed neural representations have emerged as a paradigm that combines the advantages of machine learning methods and physics-based models. These representations consist of a neural network that is used to represent physical parameters such as appearance, geometry, reflectance, or other material properties. Compared to conventional representations based on pixels, voxels, point clouds, or meshes, neural representations are memory-efficient, easily optimized (because they are differentiable and continuous), and able to incorporate data-driven priors over a desired solution space. In this course, we will probe the foundations of physics-informed neural representations and discuss their relevance to application scenarios across vision, graphics, imaging, and simulation.
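To make the idea concrete, the short sketch below shows what a physics-informed neural representation can look like in code: a small coordinate-based MLP is trained so that its automatic derivatives satisfy a differential equation. The specific equation (a harmonic-oscillator ODE), the architecture, and the hyperparameters are illustrative assumptions chosen for brevity, not course code; the PINNs paper covered in Week 2 develops the general formulation.

```python
# Illustrative sketch: a coordinate-based MLP trained with a physics-informed loss.
# The network u_theta(x) is fit so that its autograd derivatives satisfy the ODE
#   u''(x) + u(x) = 0,  u(0) = 0,  u'(0) = 1   (exact solution: u(x) = sin(x)).
# All choices below (architecture, ODE, sampling, optimizer) are assumptions for brevity.
import torch

torch.manual_seed(0)

# A small MLP mapping a 1-D coordinate x to a scalar field value u(x).
model = torch.nn.Sequential(
    torch.nn.Linear(1, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)


def u_and_derivatives(x):
    """Evaluate u(x), u'(x), u''(x) using automatic differentiation."""
    x = x.requires_grad_(True)
    u = model(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    return u, du, d2u


for step in range(5000):
    # Random collocation points on [0, 2*pi] where the ODE residual is enforced.
    x = 2 * torch.pi * torch.rand(256, 1)
    u, du, d2u = u_and_derivatives(x)
    pde_loss = (d2u + u).pow(2).mean()            # residual of u'' + u = 0

    # Initial conditions at x = 0: u(0) = 0 and u'(0) = 1.
    u0, du0, _ = u_and_derivatives(torch.zeros(1, 1))
    ic_loss = u0.pow(2).mean() + (du0 - 1).pow(2).mean()

    optimizer.zero_grad()
    (pde_loss + ic_loss).backward()
    optimizer.step()

# After training, model(x) should approximate sin(x); e.g. u(pi/2) should be close to 1.
print(model(torch.tensor([[torch.pi / 2]])).item())
```

Swapping the differential-equation residual for a PDE residual and adding a data-fitting term on measurements turns the same recipe into the forward- and inverse-problem setting that many of the course papers address.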
Recommended preparation: Background in machine learning, including familiarity with common network architectures and optimization methods; graduate-level exposure to one or more of computer vision, computer graphics, sensing, or numerical simulation is desirable but not required.
Discussions: Mondays 10:00am–12:00pm in Earth Sciences Centre (ES) 1047.
Instructor office hours: David: Mondays 9:00am–10:00am (BA7228); Aviad: Mondays 12:00pm–1:00pm (BA7250), starting January 15. Office hours are for discussing projects, course material, and related questions. Each week, office hours are held only by the instructor leading that week's discussion; please refer to the course schedule.
Role assignments & schedule: See Quercus for the link to the Google Sheet containing role assignments and the full schedule.
Contact: Course announcements and general information will be posted on the course forum on Piazza.
The project proposal is due February 14 (5% of the total grade). The final report is due at the end of classes (40% of the total grade).
The final report grade takes into account your source code submission (code organization and documentation) and the report itself (appropriate format and length, abstract, introduction, related work, description of your method, quantitative and qualitative evaluation of your method, results, discussion & conclusion, bibliography).
You can work in teams of up to 3 students for the project. Submit only one proposal and one final report per team. The expected amount of work scales with the number of team members: if two teams work on a similar project, we expect less work from the smaller team.
The project proposal is a 1–2 page document that should contain the following elements: a clear motivation for your idea; a discussion of related work, citing at least 3 scientific references (i.e., scientific papers, not blog articles or websites); an overview of what exactly your project is about and what the final goals are; and milestones for your team, with a timeline and intermediate goals. Once you send us your proposal, we may ask you to revise it, and we will assign a project mentor to your team.
The final project report should look like a short (~6 pages) conference paper. We expect the following sections, which are standard practice for conference papers: abstract, introduction, related work, theory (i.e., your approach), analysis and evaluation, results, discussion and conclusion, references. Use the CVPR 2025 LaTeX template for your report. A detailed rubric can be found on Quercus under the final project report assignment.
All homework is due at midnight on the due date. For role-specific homework, there will be a 30% deduction if you submit late, but before the start of that week's lecture (i.e., if you submit anytime between 12:01am and 10am on Monday). No homework will be accepted after the start of lecture.
If you need more time to submit your project proposal or final project report, you will need to discuss your timeline with the instructor and get approval at least 1 week before the posted due date.
You may use any tools you find productive in preparing your reports, but you are responsible for any misrepresentations, inaccuracies, or plagiarism in the submitted work. If we find such problems, we reserve the right to impose penalties, ranging from grade deductions to a report of academic misconduct depending on the severity of the offense.
| Week | Date | Topic | Paper(s) | Event |
|---|---|---|---|---|
| Week 1 | Mon Jan 6 | Introduction (Aviad/David) | Course overview, role-playing class format, course components, grading, etc. | |
| Part I: Physics-informed neural networks | | | | |
| Week 2 | Mon Jan 13 | PINNs (Aviad) | Main paper: Raissi et al., "Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations" (Journal of Computational Physics, 2019). To probe further: | |
| Week 3 | Mon Jan 20 | Neural Operators (Aviad) | Main paper: Li et al., "Fourier Neural Operator for Parametric Partial Differential Equations" (Proc. ICLR 2021). To probe further: | |
| Week 4 | Mon Jan 27 | Fourier Features (David) | Main paper: Tancik et al., "Fourier Features Let Networks Learn High Frequency Functions in Low Dimensional Domains" (NeurIPS 2020). To probe further: | |
| Week 5 | Mon Feb 3 | Graph Representations (David) | Main paper: Pfaff et al., "Learning Mesh-Based Simulation with Graph Networks" (Proc. ICLR 2021). To probe further: | |
| Part II: Equation discovery & interpretable ML | | | | |
| Week 6 | Mon Feb 10 | PINN-SR (Aviad) | Main paper: Chen et al., "Physics-informed learning of governing equations from scarce data" (Nature Communications, 2021). To probe further: | Project proposals due at 11:59pm |
| Week 7 | Mon Feb 17 | Winter break (No Lecture) | | |
| Week 8 | Mon Feb 24 | Distillation (Aviad) | Main paper: Cranmer et al., "Discovering Symbolic Models from Deep Learning with Inductive Biases" (NeurIPS 2020). To probe further: | |
| Week 9 | Mon Mar 3 | HNNs (Aviad) | Main paper: Greydanus et al., "Hamiltonian Neural Networks" (NeurIPS 2019). To probe further: | |
| Part III: Applications in vision and sensing | | | | |
| Week 10 | Mon Mar 10 | Wavefront Shaping (Aviad) | Main paper: Feng et al., "NeuWS: Neural wavefront shaping for guidestar-free imaging through static and dynamic scattering media" (Science Advances, 2023). To probe further: | |
| Week 11 | Mon Mar 17 | Optical Neural Networks (David) | Main paper: Lin et al., "All-optical machine learning using diffractive deep neural networks" (Science, 2018). To probe further: | |
| Week 12 | Mon Mar 24 | 3D Reconstruction (David) | Main paper: Chen et al., "3D Reconstruction with Fast Dipole Sums" (CVPR 2024). To probe further: | |
| Week 13 | Mon Mar 31 | Generative Models (David) | Main paper: Wu et al., "CAT4D: Create Anything in 4D with Multi-View Video Diffusion Models" (arXiv, 2024). To probe further: | |
| | Fri Apr 25 | | | Final project reports due at 11:59pm |