See the Spring 2026 Academic Calendar for semester dates. Each week below lists the readings, lecture topics, and deliverables you should complete.
Week 1

1. Read AIMA Chapters 1 & 2
Systems approach to AI. Intelligent agents and their environments.

2. Review lecture: Introduction to AI
Overview of AI agents, course roadmap, and the systems approach. See lecture notes.

3. Watch videos
The Robotic AI Agent — A practical map for navigating robotic AI systems.
Mathematical Prerequisites — Review the math foundations needed for the course.

4. Set up your development environment
Follow the Dev Environment guide to install Docker and configure your container.

5. Import the course repository
Import eng-ai-agents to your GitHub account and clone it locally.

6. Clone the Hands-On ML repository
Clone the companion repository for Hands-On Machine Learning with Scikit-Learn and PyTorch by Aurélien Géron: github.com/ageron/handson-mlp. This repository contains all the notebook exercises referenced throughout the course.
Week 2

1. Read AIMA Chapter 19
Learning from examples: classification and regression with classical ML.

2. Review lecture: Supervised Learning
Perception subsystem, reflex agents, the learning problem. See The Learning Problem.

3. Review lecture: Linear Regression
Regression fundamentals and empirical risk minimization. See Linear Regression.

4. Review lecture: SGD Optimization
Stochastic gradient descent for minimizing the empirical risk. See SGD. A sketch after this week's list fits a linear model both in closed form and with SGD.

5. Read GERON Chapter 4 — SGD sections
Read the Gradient Descent, Batch Gradient Descent, Stochastic Gradient Descent, and Mini-Batch Gradient Descent sections from Chapter 4: Training Linear Models.

6. Run the GERON Chapter 4 notebook
Work through the Training Linear Models notebook.

7. Run the SGD notebook
Execute the SGD Sinusoidal Dataset notebook in your container.

8. Watch videos
The Learning Problem — The Vapnik block diagram.
Linear Regression — Extracting non-linear patterns with linear models.
Gradient Descent — Optimizing complicated functions with iterative methods.
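To make the week's thread concrete, here is a minimal NumPy sketch (not course code) that fits the same linear model two ways: the closed-form least-squares solution, which minimizes the squared empirical risk exactly, and SGD, which approaches it one noisy gradient step at a time. The synthetic data, learning rate, and epoch count are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 4 + 3x + noise
X = 2 * rng.random((100, 1))
y = 4 + 3 * X[:, 0] + rng.normal(0, 0.5, 100)

# Add a bias column so the model is y_hat = Xb @ w
Xb = np.c_[np.ones(len(X)), X]

# Closed-form least squares: the exact empirical risk minimizer for squared loss
w_exact, *_ = np.linalg.lstsq(Xb, y, rcond=None)

# Stochastic gradient descent on the same objective:
# each step uses one random example's gradient of (x.w - y)^2 / 2
w = np.zeros(2)
lr = 0.05
for epoch in range(50):
    for i in rng.permutation(len(y)):
        grad = (Xb[i] @ w - y[i]) * Xb[i]
        w -= lr * grad

print("closed form:", w_exact)   # ~ [4, 3]
print("SGD:        ", w)         # should land close to the closed form
```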
Week 3

1. Review lecture: Entropy
Information theory principles and cross-entropy. See Entropy. A short sketch follows this week's list.

2. Review lecture: Marginal Maximum Likelihood
Marginal likelihood and parameter estimation. See Marginal Maximum Likelihood.

3. Review lecture: Conditional Maximum Likelihood
Conditional likelihood for supervised learning. See Conditional Maximum Likelihood.

4. Review lecture: Classification Introduction
Classification fundamentals and decision boundaries. See Classification Introduction.

5. Review lecture: Logistic Regression
Binary classification with logistic regression. See Logistic Regression. A second sketch after this week's list ties the two themes together.

6. Submit Assignment 1
Complete and submit Assignment 1.

7. Watch videos
Entropy — Information theory principles.
Maximum Likelihood Estimation — The workhorse of statistical modeling.
Binary Classification — Binary classification and logistic regression.
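A minimal sketch of this week's information-theory quantities; the helper names `entropy` and `cross_entropy` are mine, not from the course materials.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i (natural log, in nats)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # 0 log 0 is taken as 0
    return -np.sum(p * np.log(p))

def cross_entropy(p, q):
    """H(p, q) = -sum p_i log q_i; always >= H(p), equal iff q == p."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return -np.sum(p * np.log(q))

p = [0.5, 0.25, 0.25]
print(entropy(p))                          # ~1.04 nats
print(cross_entropy(p, p))                 # equals H(p)
print(cross_entropy(p, [1/3, 1/3, 1/3]))   # larger: the price of the wrong model
```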
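And a second sketch connecting conditional maximum likelihood to classification: logistic regression trained by gradient descent on the binary cross-entropy, which is exactly the negative conditional log-likelihood. The toy Gaussian data and step size are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two Gaussian blobs as a toy binary classification set
X0 = rng.normal(-1.0, 1.0, (100, 2))
X1 = rng.normal(+1.0, 1.0, (100, 2))
X = np.vstack([X0, X1])
y = np.r_[np.zeros(100), np.ones(100)]
Xb = np.c_[np.ones(len(X)), X]        # bias column

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Maximizing the conditional likelihood = minimizing binary cross-entropy.
# Its gradient takes the compact form X^T (sigmoid(Xw) - y) / n.
w = np.zeros(3)
lr = 0.5
for step in range(2000):
    p = sigmoid(Xb @ w)
    w -= lr * Xb.T @ (p - y) / len(y)

acc = np.mean((sigmoid(Xb @ w) > 0.5) == y)
print("weights:", w, "train accuracy:", acc)
```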
Week 4

1. Read AIMA Chapter 21 and DL Chapter 6
From Perceptron to MLPs, backpropagation fundamentals.

2. Review lecture: DNN Introduction
Neural network architectures and the forward pass. See DNN Introduction.

3. Review lecture: Backpropagation
Gradient computation and the chain rule. See Backpropagation. A short sketch follows this week's list.

4. Read GERON Chapter 9 — Classification MLPs
Read the Classification MLPs section from Chapter 9: Artificial Neural Networks.

5. Run the GERON Chapter 9 notebook
Work through the Artificial Neural Networks notebook.

6. Watch videos
Feature Extraction — Using a simple network to understand how features are extracted.
Multiclass Classifier — A simple multiclass classifier example.
Backpropagation — How to calculate gradients in a neural network.
Regularization — How to control the complexity of a neural network.
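A minimal backpropagation sketch (my notation, not the lecture's code): a two-layer network where each gradient is obtained by applying the chain rule layer by layer, with a finite-difference check on one weight to confirm the math.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-layer network: x -> tanh(W1 x + b1) -> W2 h + b2, squared loss.
x = rng.normal(size=3)
t = np.array([1.0])                 # target
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)

# Forward pass, keeping intermediates for the backward pass
z1 = W1 @ x + b1
h = np.tanh(z1)
y = W2 @ h + b2
loss = 0.5 * np.sum((y - t) ** 2)

# Backward pass: the chain rule, one layer at a time
dy = y - t                          # dL/dy
dW2 = np.outer(dy, h)               # dL/dW2
db2 = dy
dh = W2.T @ dy                      # push the gradient through W2
dz1 = dh * (1 - h ** 2)             # tanh'(z) = 1 - tanh(z)^2
dW1 = np.outer(dz1, x)
db1 = dz1

# Finite-difference check on one weight
eps = 1e-6
W1p = W1.copy()
W1p[0, 0] += eps
loss_p = 0.5 * np.sum((W2 @ np.tanh(W1p @ x + b1) + b2 - t) ** 2)
print(dW1[0, 0], (loss_p - loss) / eps)   # the two numbers should agree closely
```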
Week 5

1. Read DL Chapter 9 and AIMA Chapter 25
Convolutional Neural Network architecture and applications.

2. Review lecture: CNN Introduction
Convolution operations, pooling, and spatial feature hierarchies. See CNN Introduction. A shape-tracing sketch follows this week's list.

3. Review lecture: CNN Layers
Layer types and architectural patterns. See CNN Layers.

4. Review lecture: CNN Architectures and ResNets
ResNet, VGG, and other architectures. See CNN Example Architectures and Feature Extraction with ResNet. A feature-extraction sketch also follows this week's list.

5. Read GERON Chapter 12 — CNN sections
Read the Convolutional Layers, Pooling Layers, and CNN Architectures sections from Chapter 12: Deep Computer Vision with CNNs.

6. Run the GERON Chapter 12 notebook
Work through the Deep Computer Vision with CNNs notebook.

7. Watch videos
CNN Architectures — Looking inside a CNN layer and understanding architectural patterns.
What CNNs Learn — Visualizing the features learned by CNNs.
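To make the convolution and pooling shape arithmetic concrete, a minimal PyTorch sketch; the two-stage architecture and the 28x28 grayscale input size are illustrative assumptions, not the course's model.

```python
import torch
import torch.nn as nn

# A minimal CNN for 28x28 grayscale images: two conv/pool stages, then a classifier.
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),  # 1x28x28 -> 16x28x28
    nn.ReLU(),
    nn.MaxPool2d(2),                             # 16x28x28 -> 16x14x14
    nn.Conv2d(16, 32, kernel_size=3, padding=1), # -> 32x14x14
    nn.ReLU(),
    nn.MaxPool2d(2),                             # -> 32x7x7
    nn.Flatten(),
    nn.Linear(32 * 7 * 7, 10),                   # 10 class logits
)

x = torch.randn(8, 1, 28, 28)                    # a batch of 8 fake images
print(model(x).shape)                            # torch.Size([8, 10])
```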
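And a sketch of feature extraction with a pretrained ResNet, using the common pattern of dropping the final fully connected layer; this assumes torchvision is installed and will download ResNet-18 weights on first run.

```python
import torch
import torch.nn as nn
from torchvision import models

# Load a pretrained ResNet-18 and drop its final classification layer,
# keeping everything up to the global average pool as a feature extractor.
resnet = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone = nn.Sequential(*list(resnet.children())[:-1])
backbone.eval()

with torch.no_grad():
    x = torch.randn(4, 3, 224, 224)       # batch of 4 ImageNet-sized images
    feats = backbone(x).flatten(1)        # -> (4, 512) feature vectors
print(feats.shape)
```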
Week 6

1. Read AIMA Chapter 25
Object detection, semantic and instance segmentation.

2. Review lecture: Scene Understanding
Detection and segmentation pipelines. See Scene Understanding.

3. Review lecture: Object Detection
R-CNN family and YOLO architectures. See Object Detection. A short sketch follows this week's list.

4. Watch videos
Introduction to Object Detection — Fundamentals of object detection pipelines.
Region-based Object Detectors — R-CNN, Fast R-CNN, Faster R-CNN.
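A minimal sketch of intersection over union (IoU), the box-overlap measure that detection pipelines such as the R-CNN family and YOLO use to match predictions to ground truth; the corner-coordinate box format is an assumption.

```python
def iou(box_a, box_b):
    """Intersection over union of two boxes given as (x1, y1, x2, y2)."""
    # Intersection rectangle (empty if the boxes do not overlap)
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)

    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25 / 175 ~ 0.143
```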
Week 7

1. Read AIMA Chapters 12, 13 & 14
Probability, Bayesian networks, and recursive state estimation.

2. Review lecture: Recursive State Estimation
Dynamic Bayesian Networks and Kalman filters. See Recursive State Estimation.

3. Review lecture: Kalman Filters
Linear Gaussian models and the Kalman update. See Kalman Filters. A short sketch follows this week's list.

4. Watch videos
Coming soon — Probabilistic models video lectures are in development.
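A minimal one-dimensional Kalman filter sketch, a scalar special case of the lecture's linear Gaussian model; the noise variances and the random-walk motion model are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1-D Kalman filter for a random-walk state observed with Gaussian noise.
# Model: x_t = x_{t-1} + w,  z_t = x_t + v,  w ~ N(0, Q), v ~ N(0, R)
Q, R = 1e-4, 0.25
true_x = 1.0
mu, var = 0.0, 1.0                           # prior belief N(mu, var)

for _ in range(50):
    z = true_x + rng.normal(0, np.sqrt(R))   # noisy measurement

    # Predict: the random walk only inflates the variance
    var += Q

    # Update: the Kalman gain blends prediction and measurement
    K = var / (var + R)
    mu = mu + K * (z - mu)
    var = (1 - K) * var

print(f"estimate {mu:.3f} +/- {np.sqrt(var):.3f} (truth {true_x})")
```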
Week 8

1. Read AIMA Chapter 23
Natural language processing problem formulation and component mechanics.

2. Review lecture: NLP Pipelines
Tokenization, embeddings, and NLP pipeline components. See NLP Pipelines.

3. Review lecture: Word2Vec
Word embeddings and distributional semantics. See Word2Vec. A short sketch follows this week's list.

4. Watch videos
Coming soon — NLP fundamentals video lectures are in development.
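A minimal skip-gram sketch in PyTorch. Real Word2Vec training uses negative sampling and large corpora; this toy version uses a full softmax, a nine-word corpus, and illustrative dimensions, which is enough to show the (center, context) prediction task that shapes the embeddings.

```python
import torch
import torch.nn as nn

corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}

# Skip-gram training pairs: (center word, context word) within a +/-2 window
window = 2
pairs = [(idx[corpus[i]], idx[corpus[j]])
         for i in range(len(corpus))
         for j in range(max(0, i - window), min(len(corpus), i + window + 1))
         if i != j]
centers = torch.tensor([c for c, _ in pairs])
contexts = torch.tensor([c for _, c in pairs])

# Embedding -> linear scores over the vocabulary; cross-entropy does the softmax.
# (Real Word2Vec replaces the full softmax with negative sampling for scale.)
emb = nn.Embedding(len(vocab), 8)
out = nn.Linear(8, len(vocab))
opt = torch.optim.SGD(list(emb.parameters()) + list(out.parameters()), lr=0.1)

for _ in range(500):
    loss = nn.functional.cross_entropy(out(emb(centers)), contexts)
    opt.zero_grad()
    loss.backward()
    opt.step()

print("final loss:", loss.item())
print("embedding for 'fox':", emb.weight[idx["fox"]].detach())
```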
Week 9

1. Read AIMA Chapter 24, DL Chapter 10
RNN/LSTM architectures, attention mechanisms, and the Transformer framework.

2. Review lecture: RNNs and LSTMs
Recurrent architectures for sequence modeling. See RNN Introduction. A shape sketch follows this week's list.

3. Review lecture: Transformers
Self-attention and the Transformer architecture. See Transformers Introduction. An attention sketch also follows this week's list.

4. Run the GERON Chapter 13 notebook
Work through the Processing Sequences Using RNNs and CNNs notebook.

5. Midterm preparation
Review all material from Weeks 1-8. Focus on key concepts: supervised learning, DNNs, CNNs, probabilistic models, and NLP fundamentals.

6. Watch videos
Introduction to Transformers — The Transformer architecture and the simple attention mechanism.
The Learnable Attention Mechanism — Implementing the scaled dot-product self-attention mechanism.
Multi-Head Self Attention — Using multiple attention heads to capture different aspects of input sequences.
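First, a minimal nn.LSTM usage sketch showing the tensor shapes involved in sequence modeling; the feature, hidden, and sequence sizes are illustrative.

```python
import torch
import torch.nn as nn

# An LSTM that reads a batch of sequences and predicts one value per sequence.
lstm = nn.LSTM(input_size=5, hidden_size=16, batch_first=True)
head = nn.Linear(16, 1)

x = torch.randn(8, 20, 5)          # batch of 8 sequences, 20 steps, 5 features
output, (h_n, c_n) = lstm(x)       # output: (8, 20, 16), one hidden state per step
y = head(output[:, -1, :])         # predict from the last step's hidden state
print(output.shape, y.shape)       # (8, 20, 16) (8, 1)
```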
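Second, a sketch of scaled dot-product attention as covered in the videos; in a full Transformer, Q, K, and V come from learned projections of the input, which this sketch omits for brevity.

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.transpose(-2, -1) / d_k ** 0.5   # (..., n_q, n_k) similarities
    weights = F.softmax(scores, dim=-1)             # each row sums to 1
    return weights @ V, weights

# Self-attention: queries, keys, and values all come from the same sequence.
x = torch.randn(2, 10, 64)                          # batch 2, 10 tokens, d_model 64
out, w = scaled_dot_product_attention(x, x, x)
print(out.shape, w.shape)                           # (2, 10, 64) (2, 10, 10)
```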
Week 10

1. Read AIMA Chapter 7
Symbolic AI, propositional logic, theorem proving, and Knowledge Base construction.

2. Review lecture: Reasoning
Logical reasoning and knowledge representation. See Reasoning. A short sketch follows this week's list.

3. Watch videos
Coming soon — Knowledge representation video lectures are in development.
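A minimal sketch of entailment by model checking in the style of AIMA Chapter 7: enumerate every truth assignment and test whether the query holds in all models of the knowledge base. Representing sentences as Python functions over a model dictionary is my encoding, not the book's.

```python
from itertools import product

# A sentence is a function from a model (dict: symbol -> bool) to bool.
# Model checking: KB entails query iff query is true in every model of KB.
def entails(symbols, kb, query):
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if kb(model) and not query(model):
            return False              # a model of KB where the query fails
    return True

# Toy knowledge base: (P -> Q) and P
kb = lambda m: (not m["P"] or m["Q"]) and m["P"]
print(entails(["P", "Q"], kb, lambda m: m["Q"]))        # True (modus ponens)
print(entails(["P", "Q"], kb, lambda m: not m["Q"]))    # False
```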
Week 11

1. Read AIMA Chapters 16 & 17
Sequential decision making, reward signals, and policy computation.

2. Review lecture: MDP Introduction
Markov Decision Processes and the Bellman equations. See MDP Introduction.

3. Review lecture: Bellman Equations
Expectation and optimality backups. See Bellman Expectation. A short sketch follows this week's list.

4. Run the GERON Chapter 19 notebook
Work through the Reinforcement Learning notebook.

5. Watch videos
Introduction to MDPs - Part 1 — Defining Markov Decision Processes.
Introduction to MDPs - Part 2 — Defining Markov Decision Processes.
Bellman Expectation Equations - Part 1 — Deriving the Bellman Expectation Equations.
Bellman Expectation Equations - Part 2 — Deriving the Bellman Expectation Equations.
Bellman Optimal Value Functions — Deriving the Bellman Optimality Equations.
Policy Iteration and Value Iteration — Using the Bellman Optimality Equations for optimal control.
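A minimal value-iteration sketch applying the Bellman optimality backup to a toy deterministic chain MDP of my own construction; the reward layout and discount factor are illustrative.

```python
import numpy as np

# A toy deterministic chain MDP with 4 states: actions move left or right,
# and the agent collects reward 1 every time it enters (or stays in) state 3.
n_states, gamma = 4, 0.9
actions = [-1, +1]

def step(s, a):
    """Deterministic transition: move, clipped to the ends of the chain."""
    return max(0, min(n_states - 1, s + a))

R = np.array([0.0, 0.0, 0.0, 1.0])    # reward for entering each state

# Value iteration: repeatedly apply the Bellman optimality backup
#   V(s) <- max_a [ R(s') + gamma * V(s') ]   with s' = step(s, a)
V = np.zeros(n_states)
for _ in range(1000):
    V_new = np.array([max(R[step(s, a)] + gamma * V[step(s, a)] for a in actions)
                      for s in range(n_states)])
    if np.max(np.abs(V_new - V)) < 1e-8:
        break
    V = V_new

# The greedy policy with respect to V* moves right (+1) everywhere.
policy = [max(actions, key=lambda a, s=s: R[step(s, a)] + gamma * V[step(s, a)])
          for s in range(n_states)]
print("V*:", np.round(V, 3))
print("policy:", policy)
```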