Week 4
Unsupervised Learning and Probability Review
Lecture Notes
Unsupervised Learning and Probabilities
Learning Outcomes Week 4
In this lecture, the following concepts were introduced.
- The difference between unsupervised learning and supervised learning.
- Examples of unsupervised learning approaches (a sketch of each appears after this list):
  - Clustering, e.g. k-means clustering.
  - Dimensionality reduction, e.g. principal component analysis.
- The algorithms for clustering and dimensionality reduction both work by optimising an objective function.
- The different characteristics of these two approaches to unsupervised learning: clustering represents the data as a set of discrete groups, whereas dimensionality reduction represents it by a reduced number of continuous variables.
- Understand that machine learning has two broad approaches:
  - The Optimisation Approach to ML.
  - The Probabilistic Approach to ML.
  These approaches are related: often the error function has a probabilistic interpretation as a negative log likelihood (see the numerical check after this list).
- The basic probability rules, including:
  - The properties of a probability distribution.
  - The sum rule of probability.
  - The product rule of probability.
  - Bayes' rule.
- How these rules are applied in a simple robot navigation example (a sketch of such an update appears after this list).
- The difference between a machine learning model and a machine learning algorithm.
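
Sketches and worked examples

As a concrete illustration of clustering, here is a minimal k-means sketch in numpy. It is my own illustration of the standard formulation (alternating assignment and update steps that reduce the total squared distance to the assigned centres); the function and variable names are not the lecture's notation, and the toy data are invented.

```python
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    rng = np.random.default_rng(seed)
    # initialise the centres at k distinct data points chosen at random
    centres = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        # assignment step: each point is assigned to its nearest centre
        dists = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # update step: each centre moves to the mean of its assigned points
        new_centres = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new_centres, centres):
            break
        centres = new_centres
    # recompute assignments against the final centres and evaluate the
    # objective being minimised: total squared distance to assigned centres
    labels = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2).argmin(axis=1)
    objective = ((X - centres[labels]) ** 2).sum()
    return labels, centres, objective

# invented toy data: two well-separated Gaussian blobs in 2-D
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc, 0.3, size=(50, 2)) for loc in [(0.0, 0.0), (3.0, 3.0)]])
labels, centres, J = kmeans(X, k=2)
print("centres:\n", centres, "\nobjective:", J)
```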
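
For dimensionality reduction, the following is a minimal principal component analysis sketch, again my own illustration rather than the lecture's code: it finds the directions of maximum variance via the eigendecomposition of the sample covariance and projects the data onto a reduced number of continuous variables.

```python
import numpy as np

def pca(X, n_components):
    # centre the data so the components describe variance about the mean
    X_centred = X - X.mean(axis=0)
    # eigendecomposition of the sample covariance matrix
    cov = np.cov(X_centred, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)           # ascending eigenvalue order
    order = np.argsort(eigvals)[::-1][:n_components]
    components = eigvecs[:, order]                   # directions of maximum variance
    Z = X_centred @ components                       # reduced continuous representation
    return Z, components, eigvals[order]

# invented toy data: 3-D points that mostly vary along a single direction
rng = np.random.default_rng(0)
t = rng.normal(size=(200, 1))
X = t @ np.array([[2.0, 1.0, 0.5]]) + 0.1 * rng.normal(size=(200, 3))
Z, W, retained_variance = pca(X, n_components=1)
print(Z.shape, retained_variance)
```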
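
The link between the optimisation and probabilistic approaches can be checked numerically: for data modelled as Gaussian with fixed variance, the negative log likelihood equals the sum of squared errors up to a scale factor and an additive constant, so both objectives pick out the same parameter. The snippet below is my own small check of this and not the lecture's example; the data and the grid of candidate values are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(loc=2.0, scale=1.0, size=100)      # invented data
sigma = 1.0                                       # assumed known noise level

mus = np.linspace(0.0, 4.0, 401)                  # candidate values of mu
sse = np.array([((y - m) ** 2).sum() for m in mus])
# Gaussian negative log likelihood: 0.5*SSE/sigma^2 + 0.5*N*log(2*pi*sigma^2)
nll = 0.5 * sse / sigma**2 + 0.5 * len(y) * np.log(2 * np.pi * sigma**2)

print("mu minimising squared error      :", mus[sse.argmin()])
print("mu minimising negative log lik.  :", mus[nll.argmin()])   # the same value
```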
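
The lecture's robot navigation example is referred to above but its specific numbers are not reproduced here, so the sketch below uses an invented one-dimensional corridor with door cells, an invented prior, and an invented sensor model, purely to show where the product rule, sum rule, and Bayes' rule enter a localisation update.

```python
import numpy as np

# The robot is in one of 5 cells; cells 0 and 3 contain a door (the "map").
door = np.array([1, 0, 0, 1, 0])

# Prior P(x): before sensing, every cell is equally likely (a valid
# distribution: non-negative and summing to 1).
prior = np.full(5, 1 / 5)

# Likelihood P(z = "door" | x): the sensor reports a door correctly 80% of
# the time and false-alarms 10% of the time (invented values).
likelihood = np.where(door == 1, 0.8, 0.1)

# Product rule: joint P(z, x) = P(z | x) P(x).
joint = likelihood * prior

# Sum rule: evidence P(z) = sum over x of P(z, x).
evidence = joint.sum()

# Bayes' rule: posterior P(x | z) = P(z | x) P(x) / P(z).
posterior = joint / evidence

print("posterior over cells:", np.round(posterior, 3))  # mass concentrates on the door cells
```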