Theory materials

  1. Lecture 1: Machine Learning Introduction
  2. Lecture 2: Probability and Information
  3. Lecture 3: Linear Algebra & Optimization
  4. Lecture 4: Linear Regression & Bayesian Linear Regression
  5. Lecture 5: Perceptron & Logistic Regression
  6. Lecture 6 (9): Multilayer Perceptrons 
  7. Lecture 7 (6): Learning theory, Bias-Variance
  8. Lecture 8 (7): Model Selection
  9. Lecture 9 (10): Deep Learning
  10. Lecture 10 (11): Convolutional Neural Networks
  11. Lecture 11 (12): Recurrent Neural Networks
  12. Lecture 12 (13): Autoencoders
  13. Lecture 13 (14): Feature Extraction
  14. Lecture 14 (8): k-Nearest Neighbours & Locally Weighted Regression
  15. Lecture 15: K-Means Clustering, EM-Clustering (Expectation Maximisation)
  16. Lecture 16: PCA, ICA
  17. Lecture 17: Decision Trees
  18. Lecture 18: Ensemble Methods
  19. Lecture 19: Kernel Methods
  20. Lecture 20: Support Vector Machines
  21. Lecture 21: Bayesian Networks
  22. Lecture 22: Stochastic Methods
  23. Lecture 23: Applications
  24. Lecture 24: Conclusion

Additional materials

  Jupyter Notebook from Hands-On Machine Learning with Scikit-Learn & TensorFlow by Aurélien Géron, with examples of convolutional neural networks and transfer learning in TensorFlow (Chapter 13):

Practicals

Homeworks

  1. pen & paper, solution, grades
  2. pen & paper, solution, grades
  3. code, solution, grades
  4. pen & paper, solution, grades
  5. code (longer)

Exam

Grades
