# Advanced Topics in Computational Physics
Ten lessons spanning numerical optimization, statistical estimation, and machine learning & AI — the computational backbone of modern physics research and engineering.
## Optimization
### Lesson 29: 1D Optimization: Bracketing & GSS
Local vs. global minima, bracketing algorithm, golden section search, and MATLAB implementation.
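The lesson's implementation is in MATLAB; purely as an illustration, here is a minimal Python sketch of the golden section search loop, with a hypothetical toy objective and bracket chosen for the example:

```python
import math

def golden_section_search(f, a, b, tol=1e-8):
    """Narrow a bracketing interval [a, b] around the minimum of a
    unimodal function f using golden section search."""
    invphi = (math.sqrt(5) - 1) / 2  # 1/phi ~ 0.618, the golden ratio
    c = b - invphi * (b - a)         # interior points at golden-ratio
    d = a + invphi * (b - a)         # fractions of the interval
    while (b - a) > tol:
        if f(c) < f(d):              # minimum lies in [a, d]
            b, d = d, c
            c = b - invphi * (b - a)
        else:                        # minimum lies in [c, b]
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2

# Toy example: minimum of (x - 2)^2 on the bracket [0, 5]
x_min = golden_section_search(lambda x: (x - 2) ** 2, 0.0, 5.0)
```

Each iteration shrinks the bracket by a constant factor of about 0.618 while reusing one interior point, which is what makes the golden ratio the efficient choice.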
### Lesson 30: Gradient-Based Optimization
Gradient review, gradient descent with fixed step size, steepest descent with optimal step, convergence criteria.
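As a sketch of the fixed-step variant with a gradient-norm convergence criterion (Python for illustration; the objective and step size below are made up for the example):

```python
import numpy as np

def gradient_descent(grad, x0, alpha=0.1, tol=1e-8, max_iter=10_000):
    """Gradient descent with a fixed step size alpha.
    Stops when the gradient norm falls below tol."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # convergence criterion
            break
        x = x - alpha * g             # step against the gradient
    return x

# Toy example: minimize f(x, y) = (x - 1)^2 + 4*(y + 2)^2
grad_f = lambda p: np.array([2 * (p[0] - 1), 8 * (p[1] + 2)])
x_star = gradient_descent(grad_f, [0.0, 0.0], alpha=0.05)
```

A fixed step must be small enough for the steepest direction (here the y term, with curvature 8) or the iteration diverges; steepest descent replaces the fixed alpha with a 1D line search at each step.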
## Statistical Estimation
### Lesson 31: Statistical Estimation I: Frequentist Methods & Maximum Likelihood
Gaussian distributions, frequentist framework, bias/variance/MSE, maximum likelihood estimator, Cramér-Rao bound.
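For i.i.d. Gaussian data the maximum likelihood estimators have closed forms: the sample mean and the sample variance with a 1/N factor (which is biased). A small Python check with synthetic data (the true parameters below are invented for the demonstration):

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=3.0, scale=2.0, size=100_000)  # true mu=3, sigma=2

# Maximizing the Gaussian log-likelihood over (mu, sigma^2) gives:
mu_ml = data.mean()                       # ML estimate of the mean
var_ml = ((data - mu_ml) ** 2).mean()     # divides by N, not N-1 -> biased
```

The ML variance estimator's bias shrinks as 1/N, while its variance sets how close a single estimate sits to the true value, which is exactly the bias/variance/MSE decomposition the lesson develops.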
### Lesson 32: Bayesian Estimation: MAP & MMSE
Prior, likelihood, posterior, Bayes' rule, MAP estimator, MMSE estimator, ML vs. MMSE comparison.
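In the Gaussian-prior, Gaussian-noise case the posterior is itself Gaussian, so the MAP and MMSE estimates coincide at the posterior mean. A hypothetical scalar example in Python (all parameter values are chosen for illustration only):

```python
import numpy as np

# Observe x_i = theta + n_i with noise n_i ~ N(0, s^2)
# and prior theta ~ N(mu0, s0^2).
mu0, s0 = 0.0, 1.0            # prior mean and std (assumed)
s = 0.5                       # noise std (assumed)
rng = np.random.default_rng(1)
theta_true = 0.8
x = theta_true + rng.normal(0, s, size=50)

# Gaussian posterior: precision adds, mean is a precision-weighted average
n = x.size
post_var = 1.0 / (1.0 / s0**2 + n / s**2)
post_mean = post_var * (mu0 / s0**2 + x.sum() / s**2)
theta_map = theta_mmse = post_mean    # identical for a Gaussian posterior

# ML ignores the prior; the Bayesian estimate shrinks toward mu0
theta_ml = x.mean()
```

With many informative samples the prior's pull becomes negligible and the Bayesian estimate approaches the ML estimate, which is the ML vs. MMSE comparison the lesson makes.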
## Machine Learning & Artificial Intelligence
### Lesson 33: What is ML & AI?
Definitions, types of learning, the ML pipeline, regression, regularization, and real-world physics applications.
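Regression with L2 regularization (ridge) is a compact example of the pipeline: data, linear model, penalized loss, closed-form optimization. A Python sketch with synthetic data (the weights and noise level are invented for the example):

```python
import numpy as np

# Fit y ~ X w by minimizing ||X w - y||^2 + lam * ||w||^2.
# The closed-form solution is w = (X^T X + lam I)^{-1} X^T y.
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true + rng.normal(0, 0.1, size=200)

lam = 1e-2   # regularization strength (assumed)
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)
```

The `lam * np.eye(3)` term both regularizes the weights and keeps the normal equations well conditioned, which matters whenever features are nearly collinear.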
### Lesson 34: Linear Classification & Loss Functions
Linear score function, decision boundaries, sigmoid activation, MSE vs. binary cross-entropy, logistic regression.
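Putting those pieces together, logistic regression is a linear score passed through a sigmoid, trained by gradient descent on the binary cross-entropy. A Python toy sketch (the two-cluster data and learning rate are invented for illustration):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy 1D data: two Gaussian clusters with labels 0 and 1
rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(-2, 1, 100), rng.normal(2, 1, 100)])
y = np.concatenate([np.zeros(100), np.ones(100)])
X = np.column_stack([np.ones_like(x), x])   # bias column + feature

# Gradient descent on binary cross-entropy; for logistic regression
# the gradient simplifies to X^T (p - y) / N.
w = np.zeros(2)
for _ in range(2000):
    p = sigmoid(X @ w)                 # predicted probabilities
    w -= 0.5 * X.T @ (p - y) / len(y)

acc = np.mean((sigmoid(X @ w) > 0.5) == y)
```

The clean gradient `X^T (p - y)` is one reason cross-entropy is preferred over MSE here: with MSE the sigmoid's derivative appears as an extra factor and flattens the gradient for confident wrong predictions.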
### Lesson 35: Neural Networks — Architecture & Forward Pass
Layers, neurons, weights, biases, activation functions (sigmoid, tanh, ReLU), forward pass equations, Universal Approximation Theorem.
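The forward pass is just the layer equation a(l) = f(W(l) a(l-1) + b(l)) applied in sequence. A minimal Python sketch of a hypothetical 2-3-1 network (the random weights are placeholders, not trained values):

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def forward(x, params):
    """Forward pass: at each layer apply a = f(W a_prev + b)."""
    a = x
    for W, b, f in params:
        a = f(W @ a + b)
    return a

# Hypothetical 2-3-1 network: ReLU hidden layer, linear output
rng = np.random.default_rng(4)
W1, b1 = rng.normal(size=(3, 2)), np.zeros(3)
W2, b2 = rng.normal(size=(1, 3)), np.zeros(1)
identity = lambda z: z
y_hat = forward(np.array([0.5, -1.0]), [(W1, b1, relu), (W2, b2, identity)])
```

The nonlinearity between layers is essential: without it the composition collapses to a single linear map, and the Universal Approximation Theorem no longer applies.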
### Lesson 36: Backpropagation & Training Neural Networks
Chain rule, computational graph, forward and backward pass, backpropagation equations, mini-batch SGD, full training loop.
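A compact Python sketch of that full loop for a hypothetical 1-8-1 tanh network on a toy regression target (architecture, learning rate, and data are all invented for the example):

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.uniform(-1, 1, size=(256, 1))
Y = X ** 2                                  # toy target: y = x^2

W1, b1 = rng.normal(0, 0.5, (8, 1)), np.zeros((8, 1))
W2, b2 = rng.normal(0, 0.5, (1, 8)), np.zeros((1, 1))
lr, losses = 0.1, []

for epoch in range(500):
    idx = rng.permutation(256)
    for start in range(0, 256, 32):         # mini-batches of 32
        xb = X[idx[start:start + 32]].T     # shape (1, batch)
        yb = Y[idx[start:start + 32]].T
        # forward pass, caching intermediates for the backward pass
        z1 = W1 @ xb + b1
        a1 = np.tanh(z1)
        y_hat = W2 @ a1 + b2
        # backward pass: chain rule applied layer by layer
        d_out = 2 * (y_hat - yb) / xb.shape[1]   # dL/dy_hat for MSE
        dW2 = d_out @ a1.T
        db2 = d_out.sum(axis=1, keepdims=True)
        d_a1 = W2.T @ d_out
        d_z1 = d_a1 * (1 - a1 ** 2)              # tanh'(z) = 1 - tanh(z)^2
        dW1 = d_z1 @ xb.T
        db1 = d_z1.sum(axis=1, keepdims=True)
        # mini-batch SGD update
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    losses.append(float(np.mean((W2 @ np.tanh(W1 @ X.T + b1) + b2 - Y.T) ** 2)))
```

Note how every quantity saved in the forward pass (`xb`, `a1`, `y_hat`) is reused in the backward pass; that caching is exactly what the computational-graph view of backpropagation formalizes.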
### Lesson 37: Convolutional Neural Networks (CNNs)
Parameter sharing, convolution operation, max pooling, translation invariance, CNN architecture, and physics applications.
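The two core operations can be written in a few lines each. A Python sketch with a hypothetical 6x6 image and a hand-picked edge kernel (chosen only to make the result easy to check):

```python
import numpy as np

def conv2d(img, kernel):
    """Valid 2D cross-correlation: slide one shared kernel over the image.
    Parameter sharing: the same weights are reused at every position,
    which is what gives CNNs translation invariance."""
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, size=2):
    """Non-overlapping max pooling: keep the largest value per patch."""
    H2, W2 = x.shape[0] // size, x.shape[1] // size
    return x[:H2 * size, :W2 * size].reshape(H2, size, W2, size).max(axis=(1, 3))

# Toy image: left half 0, right half 1; the kernel responds at the edge
img = np.zeros((6, 6)); img[:, 3:] = 1.0
edge = conv2d(img, np.array([[-1.0, 1.0]]))   # x[j+1] - x[j]
pooled = max_pool(edge)
```

The single 2-element kernel detects the vertical edge at every row with the same two weights, while pooling then coarsens the response map, trading spatial precision for robustness to small shifts.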
### Lesson 38: Transformers, LLMs & Diffusion Models
Attention mechanism, transformer architecture, large language models, pre-training and fine-tuning, diffusion models and generative AI.
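The attention mechanism at the heart of the transformer is a short formula: Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V. A Python sketch with hypothetical toy dimensions (4 tokens, 8-dimensional embeddings, random values for illustration):

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)   # subtract max for stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)   # query-key similarity, scaled
    weights = softmax(scores, axis=-1)  # each row is a distribution over keys
    return weights @ V, weights       # output: weighted mixture of values

# Toy example: 4 tokens with 8-dimensional embeddings
rng = np.random.default_rng(6)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
out, w = attention(Q, K, V)
```

Each output row is a convex combination of the value vectors, with mixing weights set by how strongly that token's query matches every key; the sqrt(d_k) scaling keeps the softmax from saturating as the embedding dimension grows.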
## Homework Assignments
Each assignment is an HTML page with MATLAB tasks. Submit as a single .m script (or Live Script) unless otherwise noted.
| Assignment | Topic | Related Lesson |
|---|---|---|
| HW 29 | GSS on the Buckingham Potential | Lesson 29 |
| HW 30 | Gradient Descent UDF | Lesson 30 |
| HW 31 | ML Estimation: Laser Speckle | Lesson 31 |
| HW 32 | ML vs. MMSE: Recreating MBIP Fig. 2.1 | Lesson 32 |
| HW 33 | Regression from an ML Perspective | Lesson 33 |
| HW 34 | Linear Classifier (GEO Space Objects) | Lesson 34 |
| HW 35 | Shallow Neural Network | Lessons 35–36 |
| HW 36 | Training Neural Networks | Lesson 36 |
| HW 37 | Convolutional Neural Networks | Lesson 37 |
| HW 38 | Transformers & Diffusion Models | Lesson 38 |
## Projects
| Project | Topic | Related Lessons |
|---|---|---|
| Project 5 | Gradient Descent Optimization & Non-Convex Functions | Lessons 29–30 |
| Final Project | Applied Machine Learning (student-chosen topic) | Lessons 33–38 |
## Learning Flow
The ten lessons build deliberately on each other across three connected topics:
- L29 — 1D optimization: bracket the minimum, then narrow with Golden Section Search.
- L30 — extend to N dimensions using the gradient: gradient descent (GD) and steepest descent (SD) algorithms.
- L31 — estimation from data: frequentist framework, maximum likelihood.
- L32 — incorporate prior knowledge: Bayesian estimation, MAP, MMSE.
- L33 — the big picture of ML: data → model → loss → optimization → evaluation.
- L34 — linear classification: score function, sigmoid, binary cross-entropy loss, logistic regression.
- L35 — neural networks: layers, neurons, activation functions, forward pass.
- L36 — backpropagation: chain rule, computational graph, parameter gradients, mini-batch SGD, full training loop.
- L37 — convolutional neural networks: parameter sharing, convolution, pooling, CNN architecture.
- L38 — state of the art: transformers, large language models, and diffusion models.