
CS274A: Probabilistic Learning

CLOSED: 2011 OFFERING

Assignments and Exams:

HW1       04/13/11    Soln
HW2       04/27/11    Soln
Midterm               Soln
HW3       05/23/11    Soln
HW4       06/03/11    Soln
Final                 Soln

Final exam   06/06/11   4:00-6:00
Discussion Page

Bren Hall 1200, MWF 3-4pm


Introduction to probabilistic models, inference, and learning.

CS274A is an introductory course on probabilistic approaches to learning from data. Probabilistic models form an important part of many areas of computer science, and probabilistic learning (in this context, automatically constructing probabilistic models from data) has become an important tool in sub-fields such as artificial intelligence, data mining, speech recognition, computer vision, bioinformatics, signal processing, and many more. CS274A will provide an introduction to the concepts and principles which underlie probabilistic models, and apply these principles to the development, analysis, and practical application of machine learning algorithms.

The course will focus primarily on parametric probabilistic modeling, including data likelihood, parameter estimation using likelihood and Bayesian approaches, hypothesis testing and classification problems, density estimation, clustering, and regression. Related problems, including model selection, overfitting, and bias/variance trade-offs, will also be discussed.
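As a purely illustrative taste of what "parameter estimation using likelihood" means (this sketch is not part of the course materials, and the variable names are invented for the example), the following short Matlab snippet fits a univariate Gaussian to simulated data by maximum likelihood:

    % Illustrative sketch only: maximum likelihood fit of a univariate Gaussian.
    n = 500;
    mu_true = 2; sigma_true = 1.5;            % parameters used to simulate the data
    x = mu_true + sigma_true * randn(n, 1);   % n simulated observations
    mu_hat     = mean(x);                     % ML estimate of the mean
    sigma2_hat = mean((x - mu_hat).^2);       % ML estimate of the variance (divides by n, so slightly biased)
    fprintf('ML estimates: mu = %.3f, sigma^2 = %.3f\n', mu_hat, sigma2_hat);

The division by n (rather than n-1) in the variance estimate is one simple instance of the bias issues mentioned above.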

Background.

The course is intended to be an introduction to probabilistic learning, and thus has few explicit requirements. Students are expected to be familiar with basic concepts from probability, linear algebra, multivariate calculus, etc. Homeworks will use the MATLAB programming environment, but no prior experience with MATLAB is required for the course.

Course format.

Three lectures per week (MWF). Homeworks due in class approximately every two weeks. Two exams (midterm and final). Grading: 40% homework, 25% midterm, 35% final.

Office Hours.

Office hours for the course are 3pm Tuesdays, or by appointment.

Collaboration.

Discussion of the course concepts and methods among the students is encouraged; however, all work handed in should be completely your own. In order to strike a balance, we'll use the "work product" rule: while discussing anything related to the homework, you should retain no work product created during the discussion. In other words, you can meet and discuss the problems, describe the solution, etc., but then all parties must go away from the meeting with no record (written notes, code, etc.) from the meeting and do the homework problem on your own. If you work on a whiteboard, just erase it when you're done discussing. Don't show someone else your homework, or refer to it during the discussion, since by this policy you must then throw it away.

Textbooks.

The required textbook for the course is Bishop's "Pattern Recognition and Machine Learning", but lectures are likely to follow the book only loosely. Other recommended readings include MacKay's "Information Theory, Inference, and Learning Algorithms" (available online at http://www.inference.phy.cam.ac.uk/mackay/itila/), Duda, Hart, and Stork's "Pattern Classification", and Hastie, Tibshirani, and Friedman's "Elements of Statistical Learning".

Matlab

We will often write code for the course in the Matlab environment. Matlab is accessible through NACS computers at several campus locations (e.g., MSTB-A, MSTB-B, and the ICS lab), and if you want a copy for yourself, student licenses are fairly inexpensive ($100). Personally, I do not recommend the open-source Octave program as a replacement, as its syntax is not 100% compatible and may cause problems (for me or you).

If you are not familiar with Matlab, there are a number of tutorials on the web.

You may want to start with one of the very short tutorials, then use the longer ones as a reference during the rest of the term.
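As a quick reminder of what basic Matlab looks like, here is a short illustrative snippet (not taken from any particular tutorial):

    A = [1 2; 3 4];          % a 2x2 matrix
    b = [1; 1];              % a column vector
    x = A \ b;               % solve the linear system A*x = b
    m = mean(A, 1);          % column-wise means
    plot(1:10, (1:10).^2);   % a simple plot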

(Tentative) Schedule of Topics.

All lectures are recorded, but in some the audio quality is poor (due to a failure somewhere between my hands-free microphone and the recording software); apologies for those.

Week 1   Introduction, probability distributions; frequentist vs. Bayesian viewpoints (Slides, Lecture)
         Bayes' rule, exponential family distributions (Slides, Lecture)
         Conditional independence; graphical models; multivariate Gaussians (Slides, Lecture)
         For a review of probability, a few good references are: Prof. Smyth's 274A handout #1 on probability; the textbook by Olofsson, "Probability, Statistics & Stochastic Processes" (Bayes' rule, pp. 43-56; random variables and expectation, pp. 77-108; joint distributions, pp. 159-?); and a UCLA stat wiki that is not verbose but might serve as a reminder; see e.g. Fundamentals, Rules, RVs, Expectations.
Week 2   More Gaussians; intro to learning, likelihood, parameters (Read Prof. Smyth's handout #2; Slides, Lecture)
         ML learning I: data likelihood, univariate ML; bias & variance (Slides, Lecture)
         ML learning II: multivariate; exponential family (Slides, Lecture)
Week 3
         Bayesian learning I: priors, posterior distributions; MAP & MPE estimates (Slides, Lecture)
         Bayesian learning II: conjugate priors; beta-binomial (Slides, Lecture)
Week 4   Bayesian learning III: Gaussian models; Bayes optimal decisions (Slides, Lecture)
         Classification and regression problems as parameter estimation (Slides, Lecture)
         Bias/variance and Bayesian priors for regression (Slides, Lecture)
Week 5   (continued) (Slides, Lecture)
         Logistic regression; Reading: PRML Ch. 4 (Slides, Lecture)
         Review
Week 6   MIDTERM
         Mixture models and EM; Reading: Smyth handout, PRML Ch. 9 (Slides, Lecture)
         Complexity and model selection; marginal likelihood, BIC approximation; Reading: PRML 3.5, 4.4 (Slides, Lecture)
Week 7   Latent space representations; PCA & probabilistic PCA (Slides, Lecture)
         Hidden Markov models (Slides, Lecture)
         Hidden Markov models (Slides, Lecture)
Week 8   Monte Carlo methods (Slides, Lecture)
         (Slides, Lecture)
         (Slides, Lecture)
Week 9   A brief return to graphical models
Week 10  Misc. topics and review
Final Exam   06/06/2011   4:00-6:00pm