Outcomes
Learning Outcomes Week 1
In this lecture, the following concepts were introduced.
The ability to identify likely applications of artificial intelligence and machine learning in modern computers, including:
- Targeted adverts
- Speech recognition
- Ranking of news feed items
- Suggestion of likely friends in social networks
- Pose identification in games consoles
- Recommender systems, like Amazon and Netflix
- Face recognition systems, such as those in Picasa, Google, and Facebook
Modern artificial intelligence is heavily reliant on data.
Machine learning requires that data be combined with assumptions (a model) to make a prediction.
The history of prediction with data goes back as far as Laplace and Gauss (200 years ago).
Many of the principles of prediction have changed little since then, but the availability of data and computing power has increased enormously.
Learning Outcomes Week 2
In this lecture, the following concepts were introduced.
- An overview of the idea of classification, including:
- Understanding a basic classification algorithm like the perceptron algorithm (a minimal sketch appears after this list)
- Understanding what a feature matrix is.
- Understand what the data labels are.
- The concept of a learning rate
- The concept of linear separability
- An overview of the idea of regression, including:
- How basis functions can be used to make a linear regression non-linear (a short example appears after this list)
- An example of a commonly used basis set (like polynomials or radial basis functions).
- A commonly used error (or objective) function such as the sum of squared errors.
- The difference between a model and an algorithm
- The concept of generalization
- The idea of a training set
- The use of the error function (also known as an objective function)
- The importance of the mathematical concepts of
- vectors
- differentiation
- minima
- The idea behind the optimization approach of steepest descent.
- How stochastic gradient descent differs from steepest descent and why this is important (both update rules are sketched after this list).
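As a concrete illustration of the classification outcomes above, here is a minimal sketch of the perceptron algorithm in Python/NumPy. The variable names (X for the feature matrix, y for the labels in {-1, +1}, learn_rate for the learning rate) are illustrative choices, not notation from the lecture.

```python
import numpy as np

def perceptron(X, y, learn_rate=0.1, max_iters=100):
    """Fit a linear decision boundary to a feature matrix X
    (one row per data point) and labels y in {-1, +1}."""
    n_points, n_features = X.shape
    w = np.zeros(n_features)  # weight vector
    b = 0.0                   # bias
    for _ in range(max_iters):
        errors = 0
        for i in range(n_points):
            # A point is misclassified if it falls on the wrong
            # side of the current decision boundary.
            if y[i] * (X[i] @ w + b) <= 0:
                # Nudge the boundary towards the misclassified
                # point; the learning rate sets the step size.
                w += learn_rate * y[i] * X[i]
                b += learn_rate * y[i]
                errors += 1
        if errors == 0:
            # No mistakes on a full pass: the algorithm has
            # converged, which is only guaranteed when the data
            # are linearly separable.
            break
    return w, b
```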
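The regression outcomes can be made concrete in the same way. The sketch below assumes a polynomial basis and the sum-of-squared-errors objective (both named in the list above); the data are synthetic and purely illustrative.

```python
import numpy as np

def polynomial_basis(x, degree):
    """Map a 1-D input to a design matrix whose columns are
    the polynomial basis functions 1, x, x**2, ..., x**degree."""
    return np.vander(x, degree + 1, increasing=True)

# Synthetic, purely illustrative data: a noisy sine wave.
x = np.linspace(0, 1, 50)
y = np.sin(2 * np.pi * x) + 0.1 * np.random.randn(50)

# The model stays linear in the parameters w but becomes
# non-linear in x through the basis functions.
Phi = polynomial_basis(x, degree=4)

# Minimise the sum of squared errors E(w) = ||y - Phi w||^2;
# lstsq solves the resulting normal equations stably.
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
y_fit = Phi @ w  # the fitted non-linear function of x
```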
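Finally, the two optimization schemes in the last outcomes can be contrasted in a few lines. This sketch reuses the sum-of-squared-errors objective from above; the function names are illustrative.

```python
import numpy as np

def steepest_descent_step(w, Phi, y, learn_rate):
    """One update using the gradient of the sum of squared
    errors over the FULL training set."""
    grad = -2 * Phi.T @ (y - Phi @ w)
    return w - learn_rate * grad

def sgd_step(w, Phi, y, learn_rate):
    """One update using the gradient at a SINGLE randomly
    chosen point: a cheap, noisy estimate of the full gradient,
    which is what lets the method scale to large data sets."""
    i = np.random.randint(len(y))
    grad_i = -2 * Phi[i] * (y[i] - Phi[i] @ w)
    return w - learn_rate * grad_i
```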
Learning Outcomes Week 4
In this lecture, the following concepts were introduced.
- The difference between unsupervised learning and supervised learning.
- Examples of unsupervised learning approaches (both are sketched in code after this list):
- Clustering: e.g. k-means clustering
- Dimensionality reduction: e.g. principal component analysis.
- The algorithms for dimensionality reduction and clustering involve optimization of objective functions.
- The different characteristics of these two approaches: clustering represents the data as discrete groups, while dimensionality reduction represents it by a reduced number of continuous variables.
- An understanding that machine learning has two broad approaches:
- The optimization approach to ML
- The probabilistic approach to ML
- These two approaches are related: the error function often has a probabilistic interpretation as the negative log likelihood.
- The basic probability rules, including:
- The properties of a probability distribution.
- The sum rule of probability.
- The product rule of probability.
- Bayes’ rule
- How these rules are applied in a simple robot navigation example (a toy version is sketched after this list).
- The difference between a machine learning model and a machine learning algorithm.
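To make the two unsupervised approaches above concrete, here is a minimal NumPy sketch of each; the initialisation and stopping rules are deliberately simplistic and illustrative.

```python
import numpy as np

def kmeans(X, k, n_iters=50):
    """Cluster the rows of X into k discrete groups by
    minimising the within-cluster sum of squared distances."""
    X = np.asarray(X, dtype=float)
    # Simplistic initialisation: k random data points as centres.
    centres = X[np.random.choice(len(X), k, replace=False)]
    for _ in range(n_iters):
        # Assignment step: each point joins its nearest centre.
        dists = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        # Update step: each centre moves to the mean of its points.
        for j in range(k):
            if (labels == j).any():
                centres[j] = X[labels == j].mean(axis=0)
    return labels, centres

def pca(X, n_components):
    """Represent the data by a reduced number of continuous
    variables: its coordinates along the directions of
    largest variance."""
    Xc = X - X.mean(axis=0)
    # The right singular vectors of the centred data are the
    # eigenvectors of its covariance matrix.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T
```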
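The probability rules above can be exercised in a toy version of the robot navigation example; the numbers below are invented for illustration and are not the lecture's own.

```python
import numpy as np

# The robot sits in one of four corridor cells; cells 0 and 2
# have a door. All probabilities here are invented.
prior = np.array([0.25, 0.25, 0.25, 0.25])  # P(location)
has_door = np.array([True, False, True, False])
# Likelihood of the sensor reporting "door" from each cell.
likelihood = np.where(has_door, 0.8, 0.1)   # P(door | location)

# Product rule: the joint P(door, location).
joint = likelihood * prior
# Sum rule: the evidence P(door), marginalising out location.
evidence = joint.sum()
# Bayes' rule: the posterior P(location | door).
posterior = joint / evidence
print(posterior)  # mass shifts towards the two door cells
```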
Learning Outcomes Week 5
This lecture covers the following learning outcomes:
- A review of continuous probability densities.
- A review of the Gaussian density.
- The equivalence between least squares and a Gaussian noise model (a short derivation appears below).
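A sketch of the standard derivation of that equivalence, assuming independent Gaussian noise with known variance \(\sigma^2\):

```latex
% Model: targets are the regression function plus Gaussian noise,
%   y_i = f(x_i; w) + \epsilon_i, \quad \epsilon_i \sim N(0, \sigma^2).
\begin{align}
p(y_i \mid x_i, w)
  &= \frac{1}{\sqrt{2\pi\sigma^2}}
     \exp\!\left(-\frac{\bigl(y_i - f(x_i; w)\bigr)^2}{2\sigma^2}\right) \\
-\log \prod_{i=1}^{n} p(y_i \mid x_i, w)
  &= \frac{1}{2\sigma^2}\sum_{i=1}^{n}\bigl(y_i - f(x_i; w)\bigr)^2
     + \frac{n}{2}\log\bigl(2\pi\sigma^2\bigr).
\end{align}
% The second term does not depend on w, so minimising the negative
% log likelihood over w is exactly minimising the sum of squared errors.
```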
Learning Outcomes Week 6
This lecture covers the following learning outcomes:
- Mapping basic programming concepts onto algorithms for machine learning.
- Ability to make small modifications to existing code to change an algorithm.
- Be able to relate lines in a programming language to mathematical formulae.
- Understanding that the mathematical derivations we create can map to implementations in code.
- Understanding how mathematics is implemented as code: for example, data structures like arrays can map to mathematical structures like vectors (see the short example after this list).
- Understanding the particular needs of interacting with data: an environment that allows the data to be displayed (e.g. the IPython notebook).
- Reinforcing the previous lectures’ learning outcomes.
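A tiny example of that mapping, with illustrative values: the expression \(w^\top x + b\) translates line for line into NumPy, with arrays standing in for vectors.

```python
import numpy as np

# The mathematical objects...
w = np.array([0.5, -1.2, 2.0])  # the vector w
x = np.array([1.0, 0.0, 3.0])   # the vector x
b = 0.3                         # the scalar b

# ...and the formula f = w^T x + b, where the array maps to the
# vector and @ to the inner product.
f = w @ x + b
print(f)  # 6.8
```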