Week 2
Classification, Regression, Error Functions and Optimization
Lecture Notes
Learning Outcomes Week 2
In this lecture, the following concepts were introduced:
- An overview of the idea of classification, including:
  - Understanding a basic classification algorithm such as the perceptron algorithm (see the perceptron sketch after this list)
  - Understanding what a feature matrix is
  - Understanding what the data labels are
  - The concept of a learning rate
  - The concept of linear separability
- An overview of the idea of regression, including:
  - How basis functions can be used to make a linear regression model non-linear
  - An example of a commonly used basis set (such as polynomials or radial basis functions)
  - A commonly used error (or objective) function, such as the sum of squared errors (see the regression sketch after this list)
- The difference between a model and an algorithm
- The concept of generalization
- The idea of a training set
- The use of the error function (also known as an objective function)
- The importance of the mathematical concepts of:
  - vectors
  - differentiation
  - minima
- The idea behind the optimization approach of steepest descent (see the gradient descent sketch after this list).
- How stochastic gradient descent differs from steepest descent and why this is important.
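
The sketches below are minimal illustrations of the ideas in the list above, not implementations from the lecture; all data values, learning rates, and iteration counts are assumptions chosen for illustration.

First, the perceptron algorithm on a toy, linearly separable dataset, showing a feature matrix, a label vector, and the role of the learning rate:

```python
import numpy as np

# Toy feature matrix X (one row per example, one column per feature)
# and label vector y in {-1, +1}; the values are made up and linearly separable.
X = np.array([[2.0, 1.0],
              [1.5, 2.0],
              [-1.0, -1.5],
              [-2.0, -0.5]])
y = np.array([1, 1, -1, -1])

eta = 0.1                  # learning rate (assumed value)
n_epochs = 20              # passes over the training set (assumed value)
w = np.zeros(X.shape[1])   # weight vector
b = 0.0                    # bias term

for epoch in range(n_epochs):
    for x_n, t_n in zip(X, y):
        # Classify with the current weights: the sign of the linear score.
        prediction = 1 if np.dot(w, x_n) + b > 0 else -1
        # Update the weights only when this example is misclassified.
        if prediction != t_n:
            w = w + eta * t_n * x_n
            b = b + eta * t_n

print("learned weights:", w, "bias:", b)
```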
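
Next, a sketch of how basis functions make linear regression non-linear: the inputs are mapped through a polynomial basis, and the weights are chosen to minimise the sum of squared errors E(w) = 0.5 * sum_n (t_n - w^T phi(x_n))^2, here via NumPy's least-squares solver rather than an iterative method. The data and the polynomial degree are assumed values.

```python
import numpy as np

# Made-up 1-D regression data: noisy samples of a sine curve.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 20)
t = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.shape)

def polynomial_design_matrix(x, degree):
    """Map scalar inputs to polynomial basis functions phi_j(x) = x**j."""
    return np.vander(x, degree + 1, increasing=True)

degree = 3                            # assumed polynomial degree
Phi = polynomial_design_matrix(x, degree)

# Minimise the sum of squared errors E(w) = 0.5 * ||t - Phi w||^2.
# lstsq returns the least-squares solution, i.e. the minimiser of E(w).
w, *_ = np.linalg.lstsq(Phi, t, rcond=None)

predictions = Phi @ w
sse = 0.5 * np.sum((t - predictions) ** 2)
print("weights:", w)
print("sum of squared errors:", sse)
```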
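
Finally, a sketch contrasting steepest (batch) gradient descent, which takes each step using the gradient of the error over the whole training set, with stochastic gradient descent, which updates after every individual example. Both are applied to the same squared-error linear regression model, and the step size and iteration counts are assumed values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Made-up linear regression data: t = Phi @ w_true + noise.
Phi = rng.standard_normal((100, 3))
w_true = np.array([1.0, -2.0, 0.5])
t = Phi @ w_true + 0.05 * rng.standard_normal(100)

eta = 0.005   # step size (assumed value)

def gradient(w, Phi, t):
    """Gradient of E(w) = 0.5 * ||t - Phi w||^2 with respect to w."""
    return Phi.T @ (Phi @ w - t)

# Steepest descent: one update per step, using the full-dataset gradient.
w_sd = np.zeros(3)
for step in range(200):
    w_sd -= eta * gradient(w_sd, Phi, t)

# Stochastic gradient descent: one update per training example,
# using the gradient of that single example's squared error.
w_sgd = np.zeros(3)
for epoch in range(20):
    for n in rng.permutation(len(t)):
        phi_n, t_n = Phi[n], t[n]
        w_sgd -= eta * (phi_n @ w_sgd - t_n) * phi_n

print("steepest descent estimate:   ", w_sd)
print("stochastic gradient estimate:", w_sgd)
```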