
Machine Learning and Pattern Recognition: Schedule


[ Course Homepage | Schedule and Course Material | Mailing List ]

This page contains the schedule, slides from the lectures, lecture notes, reading lists, assignments, and web links.

I urge you to download the DjVu viewer and view the DjVu versions of the documents below. They display faster, are higher quality, and generally have smaller file sizes than the PS and PDF versions.

Full-text search is provided for the entire collection of slides and papers. Click here to search.

You can have a look at the schedule and class material for the version of this course taught during the Spring 2004 semester, but be warned that the new edition is significantly different.

09/07: Introduction and basic concepts

Subjects treated: Intro, types of learning, nearest neighbor, how biology does it, linear classifier, perceptron learning procedure, linear regression.

Slides: [DjVu | PDF | PS]
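The perceptron learning procedure listed above fits in a few lines. The course homework uses Lush; this is only an illustrative NumPy sketch, with data and names of my own choosing:

```python
import numpy as np

def perceptron_train(X, y, epochs=100):
    """Perceptron procedure: on each mistake, move the weight vector
    toward (for y = +1) or away from (for y = -1) the example."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):          # labels yi are in {-1, +1}
            if yi * (xi @ w + b) <= 0:    # misclassified (or on the boundary)
                w += yi * xi
                b += yi
                errors += 1
        if errors == 0:                   # converged: the data are separated
            break
    return w, b

# A tiny linearly separable problem (an OR-like labeling).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1, 1, 1, 1])
w, b = perceptron_train(X, y)
```

For separable data the procedure is guaranteed to stop with zero training errors; for non-separable data it cycles forever, which is one motivation for the loss-based view taken in the next lecture.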

Recommended Reading:

  • Hastie/Tibshirani/Friedman: Chapter 2
  • Refresher on random variables and probabilities by Andrew Moore: (slides 1-27) [DjVu | PDF]
  • Refresher on joint probabilities, Bayes' theorem by Chris Williams: [DjVu | PDF]
  • Refresher on statistics and probabilities by Sam Roweis: [DjVu | PS]
  • If you are interested in the early history of self-organizing systems and cybernetics, have a look at this book available from the Internet Archive's Million Book Project: Self-Organizing Systems, proceedings of a 1959 conference edited by Yovits and Cameron (DjVu viewer required for full text).

09/14: Energy-Based Models, Loss Functions, Linear Machines

Subjects treated: Energy-based models, minimum-energy machines, loss functions. Linear machines: perceptron, logistic regression. Linearly parameterized classifiers: Polynomial classifiers, basis function expansion, RBFs, Kernel-based expansion.

Slides: [DjVu | PDF | PS]
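Logistic regression is the simplest example of the loss-minimization view taken in this lecture: pick a loss (here the negative log-likelihood) and drive it down by gradient descent. A minimal NumPy sketch, with data of my own choosing:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_train(X, y, lr=0.5, steps=2000):
    """Batch gradient descent on the logistic negative log-likelihood;
    labels y are in {0, 1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(steps):
        p = sigmoid(X @ w + b)              # predicted P(y = 1 | x)
        w -= lr * X.T @ (p - y) / len(y)    # gradient of the NLL w.r.t. w
        b -= lr * np.mean(p - y)
    return w, b

# Four points on a line, separable with a threshold near 1.5.
X = np.array([[0.], [1.], [2.], [3.]])
y = np.array([0., 0., 1., 1.])
w, b = logistic_train(X, y)
p = sigmoid(X @ w + b)
```

Replacing the sigmoid/NLL pair with other energy and loss functions gives the other linear machines discussed in class.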

09/21: Gradient-Based Learning I, Multi-Module Architectures and Back-Propagation

Subjects treated: Multi-Module learning machines. Vector modules and switches. Multilayer neural nets. Backpropagation Learning. Intro to Model Selection, structural risk minimization, regularization.

Slides on Regularization: [DjVu | PDF | PS]

Slides on Multi-Module Back-Propagation: [DjVu | PDF | PS]
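The multi-module view treats a learning machine as a chain of modules, each with a forward pass (fprop) and a backward pass (bprop) that propagates gradients. A small illustrative sketch in NumPy (class and variable names are my own, not the course's Lush library):

```python
import numpy as np

class Linear:
    """A linear module with fprop/bprop, in the multi-module style."""
    def __init__(self, n_in, n_out, rng):
        self.W = rng.standard_normal((n_out, n_in)) * 0.1
        self.b = np.zeros(n_out)
    def fprop(self, x):
        self.x = x
        return self.W @ x + self.b
    def bprop(self, grad_out):
        self.dW = np.outer(grad_out, self.x)   # gradient w.r.t. parameters
        self.db = grad_out
        return self.W.T @ grad_out             # gradient w.r.t. input

class Tanh:
    def fprop(self, x):
        self.y = np.tanh(x)
        return self.y
    def bprop(self, grad_out):
        return grad_out * (1.0 - self.y ** 2)

def forward(net, x):
    for m in net:
        x = m.fprop(x)
    return x

# A 2-3-1 network trained on a single example by gradient descent.
rng = np.random.default_rng(0)
net = [Linear(2, 3, rng), Tanh(), Linear(3, 1, rng)]
x = np.array([0.5, -1.0])
target = np.array([1.0])

loss_before = float(np.sum((forward(net, x) - target) ** 2))
for _ in range(50):
    grad = 2.0 * (forward(net, x) - target)    # d(squared error)/d(output)
    for m in reversed(net):                    # back-propagation
        grad = m.bprop(grad)
    for m in net:                              # gradient step on parameters
        if isinstance(m, Linear):
            m.W -= 0.1 * m.dW
            m.b -= 0.1 * m.db
loss_after = float(np.sum((forward(net, x) - target) ** 2))
```

The point of the module abstraction is that any directed arrangement of such fprop/bprop pairs trains the same way; the chain above is just the simplest topology.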

Required Reading:

Gradient-based Learning Applied to Document Recognition by LeCun, Bottou, Bengio, and Haffner; read from page 1 through the first column of page 18: [DjVu | .ps.gz ]

09/28: Gradient-Based Learning II: Special Modules and Architectures

Subjects treated: Trainers; complex topologies; special modules; Cross-entropy and KL-divergence; RBF-nets, Mixtures of Experts; Parameter space transforms; weight sharing; convolution module; TDNN; Recurrent nets.

Slides: [DjVu | PDF | PS]
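Cross-entropy and KL-divergence, used as loss functions in this lecture, are linked by a simple identity: KL(p || q) is the cross-entropy H(p, q) minus the entropy H(p), and it vanishes exactly when p = q. A short numeric check (distributions of my own choosing):

```python
import numpy as np

def cross_entropy(p, q):
    """H(p, q) = -sum_i p_i log q_i  (in nats)."""
    return -np.sum(p * np.log(q))

def kl_divergence(p, q):
    """KL(p || q) = sum_i p_i log(p_i / q_i) >= 0, zero iff p == q."""
    return np.sum(p * np.log(p / q))

p = np.array([0.5, 0.25, 0.25])
q = np.array([0.25, 0.5, 0.25])
entropy = -np.sum(p * np.log(p))
```

Minimizing cross-entropy in q for a fixed target p is therefore the same as minimizing KL(p || q), since H(p) is a constant.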

Homework Assignment 01: implementing the Perceptron Algorithm, MSE Classifier (linear regression), and Logistic Regression. Details and datasets below:

  • Download this tar.gz archive. It contains the datasets and the homework description.
  • Decompress it with "tar xvfz homework-01.tgz" on Unix/Linux or with Winzip in Windows.
  • The file homework01.txt contains the questions and instructions.
  • Most of the necessary Lush code is provided.
  • Due Date is Tuesday October 19th, before the lecture.
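Of the three methods in this assignment, the MSE classifier has a closed-form solution: minimize the squared error with the normal equations, then classify by the sign of the output. An illustrative NumPy sketch (not a substitute for the provided Lush code; the data are my own):

```python
import numpy as np

# Least-squares ("MSE") linear classifier: fit w minimizing ||Xw - y||^2
# in closed form, then classify by sign. The first column of X is a
# constant 1, which folds the bias into w.
X = np.array([[1., 0., 0.],
              [1., 0., 1.],
              [1., 1., 0.],
              [1., 1., 1.]])
y = np.array([-1., 1., 1., 1.])

w, *_ = np.linalg.lstsq(X, y, rcond=None)   # solves the normal equations
preds = np.sign(X @ w)
```

Unlike the perceptron, this always returns a unique answer, but the squared loss penalizes even correctly classified points that are far from the boundary.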

10/05: Convolutional Nets, Image Recognition, Convergence and Optimization

Subjects treated: Convolutional Networks; Image recognition, object detection, and other applications; Convergence of gradient-based optimization and acceleration techniques.

Slides: talk on object recognition with convolutional nets: DjVu

Slides on optimization: [DjVu | PDF | PS]
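The basic operation inside a convolutional network is a learned kernel slid over the input. A minimal "valid"-mode 2-D convolution (really cross-correlation, as in most convolutional-net code; the edge-detector example is my own):

```python
import numpy as np

def conv2d_valid(image, kernel):
    """'Valid' 2-D convolution: slide the kernel over the image and
    take dot products, producing a smaller output map."""
    H, W = image.shape
    kh, kw = kernel.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

# A vertical-edge detector applied to a step image (left half 0, right half 1).
image = np.zeros((5, 5))
image[:, 3:] = 1.0
kernel = np.array([[1., -1.],
                   [1., -1.]])
response = conv2d_valid(image, kernel)
```

The response is nonzero only in the column where the step occurs, which is the weight-sharing idea from the previous lecture: one small set of weights detects the same feature at every location.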

Required Reading:

If you haven't read it already: Gradient-based Learning Applied to Document Recognition by LeCun, Bottou, Bengio, and Haffner; read from page 1 through the first column of page 18: [ DjVu | .ps.gz ]

Optional Reading: Fu-Jie Huang, Yann LeCun, Leon Bottou: "Learning Methods for Generic Object Recognition with Invariance to Pose and Lighting.", Proc. CVPR 2004. .ps.gz

10/12: NO LECTURE


10/19: Bayesian Learning, MLE, MAP

Subjects treated: Refresher probability theory; Bayesian Estimation, Maximum Likelihood Estimation, Maximum A Posteriori Estimation, Negative Log-Likelihood Loss Functions.

Slides: Refresher on Probability Theory: [DjVu | PDF | PS]

Slides: Bayesian Learning: [DjVu | PDF | PS]
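The difference between MLE and MAP is easiest to see for the mean of a Gaussian with known variance: MLE returns the sample mean, while MAP with a Gaussian prior returns a precision-weighted average that shrinks the sample mean toward the prior mean (equivalently, it minimizes the negative log-likelihood plus the negative log-prior). An illustrative sketch with numbers of my own choosing:

```python
import numpy as np

# Estimate the mean of N(mu, sigma^2) with sigma^2 known,
# under a Gaussian prior mu ~ N(mu0, tau^2).
rng = np.random.default_rng(0)
sigma2, tau2, mu0 = 1.0, 0.25, 0.0
x = rng.normal(loc=2.0, scale=np.sqrt(sigma2), size=50)
n = len(x)

mu_mle = x.mean()                              # maximum likelihood
mu_map = (n / sigma2 * mu_mle + mu0 / tau2) \
         / (n / sigma2 + 1.0 / tau2)           # maximum a posteriori
```

As n grows, the data term n/sigma^2 dominates the prior term 1/tau^2 and the MAP estimate converges to the MLE.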

Required Reading:

Homework 01 due TODAY!

10/26: Unsupervised Learning

Subjects treated: Unsupervised Learning: Principal Component Analysis. Density Estimation: Parzen Windows, Mixtures of Gaussians, Auto-Encoders. Latent variables. Intro to the Expectation-Maximization algorithm.

Slides:
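Principal Component Analysis can be computed directly from the SVD of the centered data matrix: the principal directions are the right singular vectors, and the squared singular values give the variance captured along each. A short NumPy sketch on synthetic data of my own choosing:

```python
import numpy as np

# 200 points lying almost exactly on the line y = 2x, plus small noise.
rng = np.random.default_rng(0)
t = rng.standard_normal(200)
X = np.column_stack([t, 2.0 * t + 0.05 * rng.standard_normal(200)])

Xc = X - X.mean(axis=0)                  # center the data
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Vt[0]                              # first principal direction
explained = S[0] ** 2 / np.sum(S ** 2)   # fraction of variance explained
```

Here one component explains essentially all the variance, and its direction is (up to sign) proportional to (1, 2) — the line the data were generated on.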

11/02: Efficient Optimization, Latent Variables, Graph Transformer Networks

Subjects treated: Modeling distributions over sequences. Learning machines that manipulate graphs. Finite-state transducers. Graph Transformer Networks.

Efficient learning: Newton's algorithm, Levenberg-Marquardt.
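In one dimension, Newton's algorithm is the iteration x <- x - f'(x)/f''(x), which jumps to the minimum of the local quadratic model; Levenberg-Marquardt stabilizes the multidimensional analogue by damping the second-derivative term. A tiny worked example (function and names of my own choosing):

```python
def newton_minimize(fprime, fsecond, x0, steps=20):
    """Newton's method for minimization: x <- x - f'(x) / f''(x)."""
    x = x0
    for _ in range(steps):
        x -= fprime(x) / fsecond(x)
    return x

# f(x) = x - log(x): f'(x) = 1 - 1/x, f''(x) = 1/x^2, minimum at x = 1.
x_star = newton_minimize(lambda x: 1.0 - 1.0 / x,
                         lambda x: 1.0 / x ** 2,
                         x0=0.5)
```

Convergence is quadratic near the minimum (the error roughly squares at each step), which is why these second-order methods can be dramatically faster than plain gradient descent when the Hessian is well-conditioned.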

Required Reading: Note: the slides used in class are not provided because the two following papers cover the material.

Homework Assignment 02: implementing Gradient-Based Learning and back-propagation. You must implement gradient-based learning using the object-oriented, module-based approach described in class. Various architectures, including a multilayer neural net, must be implemented and tested on two datasets.

  • Download this tar.gz archive. It contains the datasets and the homework description.
  • Decompress it with "tar xvfz homework-02.tgz" on Unix/Linux or with Winzip in Windows.
  • The file homework-02.txt contains the questions and instructions.
  • Most of the necessary Lush code is provided.
  • Due Date is Friday Nov 19.

11/09: Expectation-Maximization, Hidden Markov Models I

Subjects treated: More on optimization methods for learning: Gauss-Newton, Levenberg-Marquardt, BFGS, Conjugate Gradient.

Expectation-Maximization Algorithm (EM).

Introduction to Hidden Markov Models (HMM).
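The EM algorithm alternates a soft assignment step with a re-estimation step. An illustrative sketch for a two-component 1-D Gaussian mixture, with the variances held fixed to keep it short (data and initialization are my own; the homework version estimates variances too):

```python
import numpy as np

# Two clusters of 200 points each, centered at -2 and 3.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2.0, 0.5, 200), rng.normal(3.0, 0.5, 200)])

mu = np.array([-1.0, 1.0])   # initial means
pi = np.array([0.5, 0.5])    # mixing weights
var = 0.25                   # known, fixed variance
for _ in range(50):
    # E-step: unnormalized component likelihoods, then responsibilities.
    lik = pi * np.exp(-((x[:, None] - mu) ** 2) / (2.0 * var))
    r = lik / lik.sum(axis=1, keepdims=True)
    # M-step: responsibility-weighted means and mixing proportions.
    mu = (r * x[:, None]).sum(axis=0) / r.sum(axis=0)
    pi = r.mean(axis=0)
```

Each iteration is guaranteed not to decrease the data log-likelihood, which is the key property proved in the lecture.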

Required Reading:

11/16: HMMs, Learning Theory, Bagging, Boosting, VC-Dimension

Subjects treated: HMM learning. Ensemble methods, Full Bayesian Learning, Bagging, Boosting. Learning Theory, Bounds, VC-Dimension.

Slides:
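Bagging's effect is variance reduction: train each predictor on a bootstrap resample and average. In this deliberately crude sketch the "weak predictor" is just the mean of a tiny bootstrap sample standing in for a real learner (all choices here are my own, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=5.0, scale=2.0, size=100)

def weak_predict(rng, data, m=5):
    """A high-variance 'weak' estimator: mean of a small bootstrap sample."""
    return rng.choice(data, size=m, replace=True).mean()

single = weak_predict(rng, data)   # noisy: std ~ 2 / sqrt(5)
bagged = np.mean([weak_predict(rng, data) for _ in range(500)])
```

Averaging 500 such predictors shrinks the variance of the estimate by roughly a factor of 500 while leaving its bias unchanged, which is exactly the trade bagging makes with real learners such as decision trees.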

Homework Assignment 03: K-Means and Mixture of Gaussians estimation with EM.

  • The subject of this homework is to implement the K-means algorithm and the Expectation-Maximization algorithm for a Mixture of Gaussians model. The algorithms must be tested on image data for a simulated image compression task.
  • Download this tar.gz archive. It contains the datasets and the homework description.
  • Decompress it with "tar xvfz homework-03.tgz" on Unix/Linux or with Winzip in Windows.
  • The file homework-03.txt contains the questions and instructions.
  • DUE DATE: Friday Dec 3
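For reference, the K-means loop alternates two steps: assign each point to its nearest centroid, then move each centroid to the mean of its assigned points. An illustrative NumPy sketch on synthetic 2-D blobs (not the homework's Lush code; data and names are my own):

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain K-means: nearest-centroid assignment, then centroid update."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]  # init at data points
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    # Final assignment, consistent with the returned centroids.
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return centroids, d.argmin(axis=1)

# Two well-separated blobs; K-means should recover their means.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)),
               rng.normal(4.0, 0.3, (50, 2))])
centroids, labels = kmeans(X, k=2)
```

K-means is the hard-assignment limit of the EM algorithm for a Mixture of Gaussians, which is why the two appear together in this homework.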

11/23: Intro to Graphical Models

Subjects treated: Intro to graphical models, Belief Networks and Factor Graphs, Inference, Belief Propagation, Boltzmann Machines.

Homework Assignment: Final Project

  • A list of possible project topics is available here. You are welcome to pick from this list or to propose a project of your own (possibly in line with your main research interests). To make a final project proposal, send a short description by email to YLC and to the TA.
  • This project will count for 40% of the final grade.
  • Collaboration: you can do your final project in groups of two students.
  • Due Date: If you need a grade right away (e.g. if you are graduating this semester), you must turn in your final project by December 17th.
  • Extra Time: Extensions can be granted for ambitious projects by students who are not graduating this semester. Send requests for extensions to YLC.

11/30

12/07
