Miaomiao Zhang

CSE 426 / 326: Fundamentals of Machine Learning

Description: This introductory course offers a broad overview of the main techniques in machine learning. Students will study the basic concepts behind modern machine learning methods as well as their theoretical background. Topics include learning theory (bias/variance tradeoffs, VC theory); supervised learning (parametric/nonparametric methods, Bayesian models, support vector machines, neural networks); unsupervised learning (dimensionality reduction, kernel methods, clustering); and reinforcement learning.

  • Class meetings: MW 2:35pm-3:50pm @ Neville Hall 003

  • Instructor: Miaomiao Zhang (miaomiao -at- cse.lehigh.edu)

  • Office hours: MW 4:00pm-5:00pm @ PL514A

  • Teaching Assistant: Yuan Xue (yux715-at-lehigh.edu), TTh 9:00am-11:00am @ PL114, or by appointment

Lecture materials, including slides and notes, will be posted on CourseSite.

Prerequisites by Topic

  • Linear algebra: matrices, vectors, vector spaces, basic matrix transformations, eigenvalues, and linear differential equations

  • Probability and statistics: distributions of random variables, Bayes' rule, random sampling, hypothesis testing, correlation and regression

  • Programming: algorithm design and implementation in Matlab, Python, or C/C++
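As a quick self-check of the prerequisite material, the sketch below works one small example from each of the first two topics in Python (one of the accepted course languages). The matrix and probabilities are made-up illustration values, not course data.

```python
import numpy as np

# Linear algebra: eigenvalues of a small symmetric matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigenvalues = np.linalg.eigvalsh(A)  # symmetric input -> real eigenvalues, ascending order
print(eigenvalues)  # [1. 3.]

# Probability: Bayes' rule, P(H|E) = P(E|H) P(H) / P(E),
# with P(E) expanded by the law of total probability.
p_h = 0.01            # prior P(H)         (illustrative value)
p_e_given_h = 0.90    # likelihood P(E|H)  (illustrative value)
p_e_given_not_h = 0.05
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)
posterior = p_e_given_h * p_h / p_e
print(round(posterior, 4))  # 0.1538
```

If both computations feel routine, the mathematical prerequisites should not be an obstacle.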

Textbooks

Shai Shalev-Shwartz and Shai Ben-David, “Understanding Machine Learning: From Theory to Algorithms”, 1st Edition, Cambridge University Press, 2014, ISBN 978-1107057135

Mehryar Mohri, Afshin Rostamizadeh and Ameet Talwalkar, “Foundations of Machine Learning (Adaptive Computation and Machine Learning series)”, The MIT Press, 2012, ISBN 978-0262018258

Christopher Bishop, “Pattern Recognition and Machine Learning”, Springer, 2006, ISBN 978-0-387-31073-2

Grading

  • Midterm (15%)

  • Problem sets and projects (65%)

  • Final (15%)

  • Participation (5%)

All programming will be in Matlab, which is available to Lehigh students free of charge: https://software.lehigh.edu/install/. Reports must be written in LaTeX and submitted as PDF.

Homework is due by midnight (11:59:59 PM) on the due date. The late penalty is 15% per day; assignments more than three days late will not be accepted.
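The policy above can be sketched as a short arithmetic example. This assumes the 15% per day is a simple linear deduction from the raw score (the policy does not say whether it compounds); the helper name `late_score` is hypothetical.

```python
def late_score(raw_score: float, days_late: int) -> float:
    """Apply the late policy: 15% off per day late, assuming a linear
    (non-compounding) deduction; not accepted after three days."""
    if days_late <= 0:
        return raw_score
    if days_late > 3:
        return 0.0  # late assignments after three days are not accepted
    return raw_score * (1 - 0.15 * days_late)

print(late_score(100.0, 0))  # 100.0
print(late_score(100.0, 2))  # 70.0  (two days late: 30% deducted)
print(late_score(100.0, 4))  # 0.0   (past the three-day cutoff)
```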

Cheating

Discussions with classmates are strongly encouraged. However, every assignment must be your own work, based on your own understanding of the questions. Both copying someone else's solution and letting others copy yours are considered cheating.

Exams must be completed independently within the allotted time. Students caught cheating will fail this course. Please don't take that chance.

Schedule

Date     Topics                                             Problem sets
Week 1
Jan 22   Course Introduction                                PS0, due on Jan 24
Jan 24   Basic Concepts                                     PS0 due
Week 2
Jan 29   Linear Regression Methods
Jan 31   The Perceptron                                     PS1, due on Feb 12
Week 3
Feb 5    Support Vector Machines
Feb 7    Learning Theory
Week 4
Feb 12   Clustering Algorithms: Nearest Neighbors, K-means  PS1 due
Feb 14   Probabilistic Modeling, MLE                        PS2, due on Feb 26
Week 5
Feb 19   Gaussian Mixture Models and Graphical Models
Feb 21   The EM Algorithm
Week 6
Feb 26   Midterm Review by TA (instructor travel)           PS2 due
Feb 28   Midterm Exam
Week 7
Mar 5    Data Dimensionality Reduction: PCA                 PS3, due on Mar 19
Mar 7    Kernel Methods
Week 8
Mar 12   Spring break
Mar 14   Spring break
Week 9
Mar 19   Neural Networks I                                  PS3 due
Mar 21   Neural Networks II                                 PS4, due on Apr 2
Week 10
Mar 26   Stochastic Gradient Descent
Mar 28   Bayesian Models, MAP
Week 11
Apr 2    Markov Chain Monte Carlo Methods                   PS4 due
Apr 4    Random Sampling                                    PS5, due on Apr 16
Week 12
Apr 9    Boosting
Apr 11   Model Selection
Week 13
Apr 16   Random Forests                                     PS5 due
Apr 18   Structured Prediction I
Week 14
Apr 23   Structured Prediction II
Apr 25   Basics of Reinforcement Learning
Week 15
Apr 30   Final project presentations
May 2    Final project presentations
May 9    Final project due

Disclaimer

The instructor reserves the right to make changes to the course schedule, syllabus, and project deadlines. Changes will be announced well in advance.