First edition cover | Second edition cover
The book is aimed at computer science and engineering undergraduates studying machine learning and artificial intelligence.
The table of contents for the second edition can be found here.
There are lots of Python/NumPy code examples in the book, and the code is available here. Datasets (either the actual data, or links to the appropriate resources) are given at the bottom of the page.
Note that the chapter headings and order below refer to the second edition. However, the chapter titles should enable users of the first edition to find the relevant sections. In addition, a zip file of the code for the first edition is available here.
All of the code is freely available to use (with appropriate attribution), but comes with no warranty of any kind.
Option 1: Zip file of all code, arranged into chapters
Option 2: Choose what you want from here:
- Chapter 2 (Preliminaries):
- Chapter 3 (Neurons, Neural Networks, and Linear Discriminants):
- The Perceptron (see the sketch after this chapter's list)
- The Linear Regressor
- Another Perceptron (for use with logic.py)
- Demonstration of Perceptron with logic functions
- Demonstration of Linear Regressor with logic functions
- Demonstration of Perceptron with Pima Indian dataset
- Demonstration of Linear Regressor with auto-mpg dataset
- Demonstration of Perceptron with the MNIST dataset
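To give a flavour of what these scripts do, here is a minimal batch perceptron in NumPy. This is a sketch only: the bias-as-extra-input convention matches the book's presentation, but the function name, learning rate, iteration count, and seed are my own choices, not the downloadable code.

```python
import numpy as np

def train_perceptron(inputs, targets, eta=0.25, n_iterations=20, seed=3):
    """Batch perceptron training; returns the learned weight matrix."""
    rng = np.random.default_rng(seed)
    n_samples, n_features = inputs.shape
    # Append a constant -1 column so the threshold is learned as a weight.
    inputs = np.concatenate((inputs, -np.ones((n_samples, 1))), axis=1)
    weights = rng.uniform(-0.05, 0.05, (n_features + 1, targets.shape[1]))
    for _ in range(n_iterations):
        activations = np.where(inputs @ weights > 0, 1, 0)   # threshold units
        weights -= eta * inputs.T @ (activations - targets)  # perceptron rule
    return weights

# The OR function is linearly separable, so the perceptron can learn it.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([[0], [1], [1], [1]])
w = train_perceptron(X, y)
Xb = np.concatenate((X, -np.ones((4, 1))), axis=1)
print(np.where(Xb @ w > 0, 1, 0).ravel())  # should print [0 1 1 1]
```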
- Chapter 4 (The Multi-Layer Perceptron):
- The Multi-Layer Perceptron (see the backpropagation sketch after this chapter's list)
- Demonstration of the MLP on logic functions
- Demonstration of the MLP for classification on the Iris dataset
- Demonstration of the MLP for regression on data from a sine wave
- Demonstration of the MLP for time series on the Palmerston North Ozone dataset
- Demonstration of MLP with the MNIST dataset
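As a companion to the demonstrations above, here is a compact sketch of a one-hidden-layer MLP trained by batch backpropagation on XOR. Sigmoid activations and the bias-as-extra-input convention follow the book's presentation, but the layer size, learning rate, iteration count, and seed are assumptions; XOR training can occasionally stall in a local minimum, so a different seed may be needed.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_mlp(X, y, n_hidden=3, eta=0.5, n_iterations=5000, seed=1):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Xb = np.concatenate((X, -np.ones((n, 1))), axis=1)  # bias input
    w1 = rng.uniform(-0.5, 0.5, (d + 1, n_hidden))
    w2 = rng.uniform(-0.5, 0.5, (n_hidden + 1, y.shape[1]))
    for _ in range(n_iterations):
        hidden = sigmoid(Xb @ w1)
        hidden = np.concatenate((hidden, -np.ones((n, 1))), axis=1)
        output = sigmoid(hidden @ w2)
        # Gradients of the squared error, pushed back through each layer.
        delta_o = (output - y) * output * (1.0 - output)
        delta_h = hidden[:, :-1] * (1.0 - hidden[:, :-1]) * (delta_o @ w2[:-1].T)
        w2 -= eta * hidden.T @ delta_o
        w1 -= eta * Xb.T @ delta_h
    return w1, w2

def predict(X, w1, w2):
    Xb = np.concatenate((X, -np.ones((X.shape[0], 1))), axis=1)
    hidden = np.concatenate((sigmoid(Xb @ w1), -np.ones((X.shape[0], 1))), axis=1)
    return sigmoid(hidden @ w2)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([[0], [1], [1], [0]])  # XOR: not linearly separable
w1, w2 = train_mlp(X, y)
print(np.round(predict(X, w1, w2)).ravel())  # hopefully [0. 1. 1. 0.]
```

XOR is the classic example of a problem the single-layer perceptron above cannot solve, which is why it makes a good smoke test for the hidden layer.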
- Chapter 5 (Radial Basis Functions and Splines):
- Chapter 6 (Dimensionality Reduction):
- Chapter 7 (Probabilistic Learning):
- Chapter 8 (Support Vector Machines):
- Chapter 9 (Optimisation and Search):
- Steepest Descent (see the sketch after this chapter's list)
- Newton's method
- Levenberg-Marquardt
- Conjugate Gradients
- The version of the MLP algorithm trained using conjugate gradients
- Demonstration of the MLP algorithm trained using conjugate gradients on the Iris dataset
- Demonstration of Levenberg-Marquardt on a least-squares fitting problem
- Demonstration of four solution methods for the Travelling Salesman Problem
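For orientation, here is a bare-bones steepest-descent loop on a two-dimensional quadratic. The test function, the fixed step size, and the stopping rule are my own illustrative choices; the book's routines are more general.

```python
import numpy as np

def steepest_descent(grad, x0, eta=0.1, tol=1e-8, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    for i in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # stop when the gradient vanishes
            break
        x = x - eta * g              # step downhill along -gradient
    return x, i

# Minimise f(x) = 0.5 x^T A x - b^T x, whose gradient is A x - b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star, n_steps = steepest_descent(lambda x: A @ x - b, x0=[0.0, 0.0])
print(x_star, np.linalg.solve(A, b))  # the two should (nearly) agree
```

For a quadratic like this, conjugate gradients would reach the same minimum in at most two steps, which is the point of the conjugate-gradient scripts above.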
- Chapter 10 (Evolutionary Learning):
- Chapter 11 (Reinforcement Learning):
- Chapter 12 (Learning with Trees):
- Chapter 13 (Decision by Committee: Ensemble Learning):
- Chapter 14 (Unsupervised Learning):
- Chapter 15 (Markov Chain Monte Carlo Methods):
- Chapter 16 (Graphical Models):
- The Gibbs Sampler for the Exam Panic dataset
- The Hidden Markov Model
- A simple 1D Kalman Filter (see the sketch after this chapter's list)
- A complete Kalman Filter
- The Extended Kalman Filter
- The Basic Particle Filter
- A Tracking Particle Filter
- The Markov Random Field for Image Denoising
- A demonstration of finding paths in graphs
- An image for denoising
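As a taste of the filtering code, here is a minimal one-dimensional Kalman filter for a random-walk state observed with noise. The noise variances and the simulated constant signal are assumptions for the demonstration, not values from the book's scripts.

```python
import numpy as np

def kalman_1d(observations, q=1e-4, r=0.1, x0=0.0, p0=1.0):
    """Filter scalar observations; q = process noise, r = measurement noise."""
    x, p = x0, p0
    estimates = []
    for z in observations:
        p = p + q            # predict: the state is a random walk
        k = p / (p + r)      # Kalman gain
        x = x + k * (z - x)  # update with the innovation
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

# Noisy measurements of a constant true value of 1.0.
rng = np.random.default_rng(0)
zs = 1.0 + rng.normal(0.0, np.sqrt(0.1), size=50)
xs = kalman_1d(zs)
print(xs[-1])  # should settle close to 1.0
```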
- Chapter 17 (Symmetric Weights and Deep Belief Networks):
- Chapter 18 (Gaussian Processes):
Datasets
Many of the datasets used in the book are available from the UCI Machine Learning Repository. In particular, look for the Iris data, the Pima Indian data, the car safety data, the auto-mpg data, the wine data, and the mushroom data. Two of the most popular machine learning demonstration datasets are the MNIST set of handwritten digits, which is available here, and the binary alpha digits dataset, which can be downloaded here. Finally, there are a couple of smaller datasets that are not available elsewhere, at least in their current form, and so should be downloaded from this website (a short loading sketch follows the list):
- The Palmerston North Ozone dataset
- Training data for the prostate dataset
- Test data for the prostate dataset (the columns are log cancer volume, log prostate weight, age, lbph, svi, lcp, Gleason score, and pgg45; the final column is the response, lpsa)
- The Ruapehu dataset (thanks to Mark Bebbington)
- Short version of the E. coli dataset
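The downloadable files above are plain text, so they load directly with NumPy. The filename below is a hypothetical local name for the prostate training file; adjust it (and the delimiter, if necessary) to match whatever you save.

```python
import numpy as np

data = np.loadtxt('prostate_train.txt')  # hypothetical local filename
X, y = data[:, :-1], data[:, -1]         # last column is the response (lpsa)
print(X.shape, y.shape)
```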