

This is the Machine Learning Series as presented at the London Machine Learning Study Group events.

Machine Learning Lecture 1: Linear Regression and Gradient Descent

This is the first of a series of introductory Machine Learning lectures suitable for beginners. In this first module we define what Machine Learning is, point to resources you can use to expand your knowledge of the topics in the series, and drill down into Linear Regression models.

We focus on parameter estimation using Gradient Descent and Normal Equations, and we see both methods in action on a simple dataset.
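The parameter-estimation idea can be sketched with a few lines of NumPy. This is a minimal illustration, not the lecture's own code: the toy dataset, learning rate, and iteration count are assumptions chosen so that batch gradient descent converges on a line y = 2x + 1.

```python
import numpy as np

# Toy dataset lying exactly on y = 2x + 1 (illustrative assumption).
# First column of X is the bias term, second is the feature value.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])

theta = np.zeros(2)      # parameters [intercept, slope], initialised to zero
alpha = 0.1              # learning rate (hypothetical choice)

for _ in range(2000):    # fixed number of batch gradient descent steps
    # Gradient of the mean squared error cost with respect to theta
    gradient = X.T @ (X @ theta - y) / len(y)
    theta -= alpha * gradient

print(theta)  # converges close to [1., 2.]
```

Each step moves the parameters a small distance against the gradient of the cost; with this dataset the estimates settle near the true intercept 1 and slope 2.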

Machine Learning Lecture 2: Gradient Descent and Normal Equations

In this second session, we revisit Gradient Descent (GD) and talk about strategies for selecting the learning rate parameter. We also look at some tricks for improving the learning process (e.g. feature normalisation) and why they work. We then introduce Normal Equations and discuss their pros and cons compared to GD. Finally, we run tests against the Auto MPG data set using our own GD and Normal Equations implementations.
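The Normal Equations solve for the parameters in closed form, with no learning rate or iterations. A minimal sketch, using the same hypothetical toy dataset as above (`np.linalg.solve` is used rather than an explicit matrix inverse, which is the numerically preferred form):

```python
import numpy as np

# Same toy dataset as before: points lying exactly on y = 2x + 1.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])

# Normal equations: solve (X^T X) theta = X^T y for theta directly.
theta = np.linalg.solve(X.T @ X, X.T @ y)
print(theta)  # recovers intercept 1 and slope 2 in one step
```

Unlike GD, this gives the least-squares solution directly, but solving the system scales poorly as the number of features grows, which is one of the trade-offs discussed in the session.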

Machine Learning Lecture 3: Curve Fitting and Model Validation

In Lecture 3 we discuss the various assumptions behind linear regression models and how to validate them. We look at high-order polynomial curves and how we can use them to capture curvilinear relationships. We also introduce the holdout method and k-fold cross validation for estimating how well the models generalise to unseen data.
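K-fold cross validation can be sketched by hand in a few lines. This is an illustrative example, not the lecture's code: the quadratic dataset, noise level, and choice of k = 5 are assumptions.

```python
import numpy as np

# Hypothetical noisy quadratic data for a degree-2 polynomial fit.
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 20)
y = x**2 + rng.normal(0, 0.05, size=x.shape)

k = 5
folds = np.array_split(np.arange(len(x)), k)  # split indices into k folds
errors = []
for i in range(k):
    test_idx = folds[i]
    train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
    # Fit a degree-2 polynomial on the training folds only
    coeffs = np.polyfit(x[train_idx], y[train_idx], deg=2)
    # Score on the held-out fold
    pred = np.polyval(coeffs, x[test_idx])
    errors.append(np.mean((pred - y[test_idx]) ** 2))

cv_error = np.mean(errors)  # average test error across the k folds
```

Each data point is held out exactly once, so the averaged error is a less wasteful estimate of generalisation than a single holdout split.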

Machine Learning Lecture 4: Decision Trees

In this session we look at our first logical model – Decision Trees. We cover how a tree structure is organised, how machine learning algorithms construct decision trees, and how decision trees are used to make predictions in classification tasks. We also introduce metrics like entropy and information gain, and we discuss the advantages and disadvantages of the decision tree model.
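Entropy and information gain are short enough to compute directly. A minimal sketch, assuming binary labels and a boolean split; the label array and split are hypothetical:

```python
import numpy as np

def entropy(labels):
    """Shannon entropy (in bits) of a 1-D array of class labels."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(labels, split_mask):
    """Entropy reduction from splitting labels by a boolean mask."""
    n = len(labels)
    left, right = labels[split_mask], labels[~split_mask]
    remainder = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(labels) - remainder

# Hypothetical example: six labels, and a split that separates them perfectly.
labels = np.array([1, 1, 1, 0, 0, 0])
mask = np.array([True, True, True, False, False, False])
print(entropy(labels))                 # 1.0 bit for a 50/50 class mix
print(information_gain(labels, mask))  # 1.0 - a perfect split removes all uncertainty
```

Tree construction greedily picks, at each node, the attribute whose split yields the highest information gain, which is exactly what this function measures.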