Physics 356 — Lessons 29–38

Advanced Topics in Computational Physics

Ten lessons spanning numerical optimization, statistical estimation, and machine learning & AI — the computational backbone of modern physics research and engineering.

Optimization

Lesson 29

1D Optimization: Bracketing & GSS

Local vs. global minima, bracketing algorithm, golden section search, and MATLAB implementation.
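The lesson's implementation is in MATLAB; as a language-neutral illustration of the idea, here is a minimal Python sketch of golden section search narrowing a bracket around a minimum (the test function and bracket are made-up):

```python
import math

def golden_section_search(f, a, b, tol=1e-8):
    """Narrow a bracket [a, b] around the minimum of a unimodal f."""
    invphi = (math.sqrt(5) - 1) / 2   # 1/phi ~ 0.618
    c = b - invphi * (b - a)          # interior points placed by the golden ratio
    d = a + invphi * (b - a)
    while (b - a) > tol:
        if f(c) < f(d):               # minimum lies in [a, d]
            b, d = d, c
            c = b - invphi * (b - a)
        else:                         # minimum lies in [c, b]
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2

# Toy example: minimum of (x - 2)^2 is at x = 2
x_min = golden_section_search(lambda x: (x - 2)**2, 0.0, 5.0)
```

Because each iteration reuses one interior point, only one new function evaluation is needed per step, and the bracket shrinks by a constant factor of about 0.618.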

Lesson 30

Gradient-Based Optimization

Gradient review, gradient descent with fixed step size, steepest descent with optimal step, convergence criteria.
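The course develops these algorithms in MATLAB; a minimal Python sketch of fixed-step gradient descent with a gradient-norm convergence criterion (the quadratic test function is made-up):

```python
import numpy as np

def gradient_descent(grad, x0, alpha=0.1, tol=1e-8, max_iter=10000):
    """Fixed-step gradient descent: x_{k+1} = x_k - alpha * grad(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # stop when the gradient is (nearly) zero
            break
        x = x - alpha * g
    return x

# f(x, y) = (x - 1)^2 + 4*(y + 2)^2 has its minimum at (1, -2)
grad = lambda v: np.array([2 * (v[0] - 1), 8 * (v[1] + 2)])
x_star = gradient_descent(grad, [0.0, 0.0])
```

Steepest descent replaces the fixed `alpha` with a 1D line search along `-grad(x)` at each step, which is where Lesson 29's golden section search plugs in.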

Statistical Estimation

Lesson 31

Statistical Estimation I: Frequentist Methods & Maximum Likelihood

Gaussian distributions, frequentist framework, bias/variance/MSE, maximum likelihood estimator, Cramér-Rao bound.
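As a quick illustration of the frequentist machinery (in Python rather than the course's MATLAB; the true parameters are made-up): for a Gaussian with known variance, the ML estimate of the mean is the sample average, and its variance attains the Cramér-Rao bound.

```python
import numpy as np

rng = np.random.default_rng(0)
# N i.i.d. samples from a Gaussian with true mean mu = 3, sigma = 2
sigma, N = 2.0, 100_000
data = rng.normal(loc=3.0, scale=sigma, size=N)

# ML estimator of the mean = sample average (unbiased)
mu_ml = data.mean()

# Cramér-Rao lower bound on the variance of any unbiased estimator of mu
crb = sigma**2 / N
```

Here the MSE of `mu_ml` equals its variance `sigma^2/N` (zero bias), so the sample mean is an efficient estimator.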

Lesson 32

Bayesian Estimation: MAP & MMSE

Prior, likelihood, posterior, Bayes' rule, MAP estimator, MMSE estimator, ML vs. MMSE comparison.
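A minimal Python sketch of the Gaussian-prior, Gaussian-likelihood case (the prior and noise parameters are made-up): the posterior is Gaussian, so the MAP and MMSE estimates coincide at the posterior mean, a precision-weighted blend of prior and data, while ML ignores the prior entirely.

```python
import numpy as np

# Scalar model: theta ~ N(mu0, s0^2) prior, observations x_i ~ N(theta, s^2)
mu0, s0 = 0.0, 1.0        # prior mean and std (assumed)
s = 2.0                   # known noise std (assumed)
rng = np.random.default_rng(1)
x = rng.normal(loc=1.5, scale=s, size=50)   # data from true theta = 1.5

# Posterior precision = prior precision + data precision
prec_post = 1 / s0**2 + x.size / s**2
theta_map = (mu0 / s0**2 + x.sum() / s**2) / prec_post   # = MMSE here

# ML ignores the prior: just the sample mean
theta_ml = x.mean()
```

With a zero-mean prior, the Bayesian estimate is shrunk toward zero relative to ML; as the number of samples grows, the data term dominates and the two estimates converge.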

Machine Learning & Artificial Intelligence

Lesson 33

What is ML & AI?

Definitions, types of learning, the ML pipeline, regression, regularization, and real-world physics applications.

Lesson 34

Linear Classification & Loss Functions

Linear score function, decision boundaries, sigmoid activation, MSE vs. binary cross-entropy, logistic regression.
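A minimal Python sketch of logistic regression trained with binary cross-entropy (the 1D toy data set is made-up): the sigmoid maps the linear score to a probability, and the gradient of BCE with respect to the score reduces to the simple residual `p - y`.

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def bce_loss(y, p):
    """Binary cross-entropy: the natural loss for a sigmoid output."""
    eps = 1e-12   # guard against log(0)
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

# Toy 1D data: class 0 clustered near -1, class 1 near +1
rng = np.random.default_rng(2)
X = np.concatenate([rng.normal(-1, 0.5, 50), rng.normal(1, 0.5, 50)])
y = np.concatenate([np.zeros(50), np.ones(50)])

w, b, lr = 0.0, 0.0, 0.5
for _ in range(500):
    p = sigmoid(w * X + b)
    # gradient of BCE w.r.t. the linear score is simply (p - y)
    w -= lr * np.mean((p - y) * X)
    b -= lr * np.mean(p - y)

acc = np.mean((sigmoid(w * X + b) > 0.5) == y)
```

The decision boundary is where the score `w*x + b` crosses zero, i.e. where the predicted probability is 0.5.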

Lesson 35

Neural Networks — Architecture & Forward Pass

Layers, neurons, weights, biases, activation functions (sigmoid, tanh, ReLU), forward pass equations, Universal Approximation Theorem.
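The forward pass for a one-hidden-layer network can be traced in a few lines of Python (the weights below are made-up numbers chosen only to make the arithmetic easy to follow):

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

def forward(x, params):
    """One hidden layer: a1 = relu(W1 x + b1), y = W2 a1 + b2."""
    W1, b1, W2, b2 = params
    a1 = relu(W1 @ x + b1)   # hidden activations
    return W2 @ a1 + b2      # linear output layer

# Tiny fixed weights: 2 inputs -> 2 hidden units -> 1 output
params = (np.array([[1.0, -1.0], [0.5, 0.5]]), np.array([0.0, -0.25]),
          np.array([[1.0, 2.0]]), np.array([0.1]))
y = forward(np.array([1.0, 0.5]), params)
```

Each layer is just a matrix multiply, a bias add, and an elementwise nonlinearity; the Universal Approximation Theorem says that with enough hidden units this structure can approximate any continuous function on a compact set.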

Lesson 36

Backpropagation & Training Neural Networks

Chain rule, computational graph, forward and backward pass, backpropagation equations, mini-batch SGD, full training loop.
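The full training loop fits in a short Python sketch (the toy regression target and hyperparameters are made-up): forward pass, backward pass layer by layer via the chain rule, and a mini-batch SGD update.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(200, 1))
Y = X**2                                  # toy target: y = x^2

# One hidden tanh layer (8 units), MSE loss, manual backprop
W1 = rng.normal(0, 1, (8, 1)); b1 = np.zeros((8, 1))
W2 = rng.normal(0, 1, (1, 8)); b2 = np.zeros((1, 1))
lr, batch = 0.1, 32

for epoch in range(500):
    idx = rng.permutation(len(X))         # shuffle each epoch
    for s in range(0, len(X), batch):
        xb = X[idx[s:s+batch]].T          # (1, B) mini-batch
        yb = Y[idx[s:s+batch]].T
        # --- forward pass ---
        z1 = W1 @ xb + b1; a1 = np.tanh(z1)
        yhat = W2 @ a1 + b2
        B = xb.shape[1]
        # --- backward pass (chain rule, output to input) ---
        d_yhat = 2 * (yhat - yb) / B      # dL/dyhat for MSE
        dW2 = d_yhat @ a1.T; db2 = d_yhat.sum(axis=1, keepdims=True)
        d_a1 = W2.T @ d_yhat
        d_z1 = d_a1 * (1 - a1**2)         # tanh'(z) = 1 - tanh(z)^2
        dW1 = d_z1 @ xb.T; db1 = d_z1.sum(axis=1, keepdims=True)
        # --- SGD update ---
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

mse = np.mean((W2 @ np.tanh(W1 @ X.T + b1) + b2 - Y.T)**2)
```

Each gradient is the local derivative of one node in the computational graph multiplied by the gradient flowing back from the layer above, which is exactly the backpropagation recursion.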

Lesson 37

Convolutional Neural Networks (CNNs)

Parameter sharing, convolution operation, max pooling, translation invariance, CNN architecture, and physics applications.
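The two core CNN operations can be sketched directly in Python (the tiny image and 1x2 edge kernel are made-up): a "valid" 2D cross-correlation, which is what deep-learning conv layers actually compute, followed by 2x2 max pooling.

```python
import numpy as np

def conv2d_valid(img, kernel):
    """'Valid' 2D cross-correlation: slide the kernel, sum the products."""
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i+kh, j:j+kw] * kernel)
    return out

def max_pool2(x):
    """2x2 max pooling with stride 2 (gives translation tolerance)."""
    H, W = x.shape
    return x[:H//2*2, :W//2*2].reshape(H//2, 2, W//2, 2).max(axis=(1, 3))

img = np.arange(16.0).reshape(4, 4)
edge = np.array([[1.0, -1.0]])        # horizontal difference kernel (1x2)
feat = conv2d_valid(img, edge)        # feature map, shape (4, 3)
pooled = max_pool2(feat)              # shape (2, 1)
```

Parameter sharing is visible here: the same two kernel weights are reused at every spatial position, instead of a separate weight per pixel as in a fully connected layer.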

Lesson 38

Transformers, LLMs & Diffusion Models

Attention mechanism, transformer architecture, large language models, pre-training and fine-tuning, diffusion models and generative AI.
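The attention mechanism at the heart of the transformer reduces to a few matrix operations; a minimal Python sketch of scaled dot-product attention (the query/key/value shapes are made-up):

```python
import numpy as np

def softmax(z, axis=-1):
    e = np.exp(z - z.max(axis=axis, keepdims=True))   # stable softmax
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)        # query-key similarities
    weights = softmax(scores, axis=-1)   # each query's weights sum to 1
    return weights @ V, weights

rng = np.random.default_rng(4)
Q = rng.normal(size=(3, 4))   # 3 queries, dimension 4
K = rng.normal(size=(5, 4))   # 5 keys
V = rng.normal(size=(5, 4))   # 5 values
out, w = attention(Q, K, V)   # out: one blended value per query
```

Each output row is a weighted average of the values, with weights set by how well the query matches each key; transformers run many such heads in parallel and stack them into deep models.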

📋 Homework Assignments

Each assignment is an HTML page with MATLAB tasks. Submit as a single .m script (or Live Script) unless otherwise noted.

Assignment | Topic | Related Lesson
HW 29 | GSS on the Buckingham Potential | Lesson 29
HW 30 | Gradient Descent UDF | Lesson 30
HW 31 | ML Estimation: Laser Speckle | Lesson 31
HW 32 | ML vs. MMSE: Recreating MBIP Fig. 2.1 | Lesson 32
HW 33 | Regression from an ML Perspective | Lesson 33
HW 34 | Linear Classifier (GEO Space Objects) | Lesson 34
HW 35 | Shallow Neural Network | Lessons 35–36
HW 36 | Training Neural Networks | Lesson 36
HW 37 | Convolutional Neural Networks | Lesson 37
HW 38 | Transformers & Diffusion Models | Lesson 38

🧪 Projects

Project | Topic | Related Lessons
Project 5 | Gradient Descent Optimization & Non-Convex Functions | Lessons 29–30
Final Project | Applied Machine Learning (student-chosen topic) | Lessons 33–38

🗺 Learning Flow

The ten lessons build deliberately on each other across three connected topics:

  1. L29 — 1D optimization: bracket the minimum, then narrow with Golden Section Search.
  2. L30 — extend to N dimensions using the gradient; gradient descent (GD) and steepest descent (SD) algorithms.
  3. L31 — estimation from data: frequentist framework, maximum likelihood.
  4. L32 — incorporate prior knowledge: Bayesian estimation, MAP, MMSE.
  5. L33 — the big picture of ML: data → model → loss → optimization → evaluation.
  6. L34 — linear classification: score function, sigmoid, binary cross-entropy loss, logistic regression.
  7. L35 — neural networks: layers, neurons, activation functions, forward pass.
  8. L36 — backpropagation: chain rule, computational graph, parameter gradients, mini-batch SGD, full training loop.
  9. L37 — convolutional neural networks: parameter sharing, convolution, pooling, CNN architecture.
  10. L38 — state of the art: transformers, large language models, and diffusion models.

Prerequisites: Basic linear algebra (matrix multiplication, dot products), calculus (derivatives, partial derivatives), MATLAB programming, and introductory probability are sufficient for L29–L32. L33–L38 build on those foundations.