CS274A: Probabilistic Learning (2010 offering)
Lecture: Roland Hall (RH) 184, MWF 2-3pm
Instructor: Prof. Alex Ihler

Introduction to probabilistic models, inference, and learning.

CS274A is an introductory course on probabilistic approaches to learning from data. Probabilistic models form an important part of many areas of computer science, and probabilistic learning (in this context, automatically constructing probabilistic models from data) has become an important tool in sub-fields such as artificial intelligence, data mining, speech recognition, computer vision, bioinformatics, signal processing, and many more. CS274A will provide an introduction to the concepts and principles that underlie probabilistic models, and apply these principles to the development, analysis, and practical application of machine learning algorithms. The course will focus primarily on parametric probabilistic modeling, including data likelihood, parameter estimation using likelihood and Bayesian approaches, hypothesis testing and classification problems, density estimation, clustering, and regression. Related problems, including model selection, overfitting, and bias/variance trade-offs, will also be discussed.

Background. The course is intended to be an introduction to probabilistic learning, and thus has few explicit requirements. Students are expected to be familiar with basic concepts from probability, linear algebra, multivariate calculus, etc. Homeworks will use the MATLAB programming environment, but no prior experience with MATLAB is required for the course.

Course format. Three lectures per week (MWF). Homeworks are due in class approximately every two weeks. Two exams (midterm and final). Grading: 40% homework, 25% midterm, 35% final.

Office Hours. Office hours for the course are 4pm Fridays, or by appointment.

Collaboration. Discussion of the course concepts and methods among students is encouraged; however, all work handed in should be completely your own. In order to strike a balance, we'll use the "work product" rule: while discussing anything related to the homework, you should retain no work product created during the discussion. In other words, you can meet and discuss the problems, describe the solution, etc., but then all parties must leave the meeting with no record (written notes, code, etc.) and do the homework problems on their own. If you work on a whiteboard, just erase it when you're done discussing. Don't show someone else your homework, or refer to it during the discussion, since by this policy you must then throw it away.

Textbooks. The required textbook for the course is Bishop's "Pattern Recognition and Machine Learning", but lectures are likely to follow the book only loosely. Other recommended readings include MacKay's "Information Theory, Inference, and Learning Algorithms" (available online at http://www.inference.phy.cam.ac.uk/mackay/itila/), Duda, Hart, and Stork's "Pattern Classification", and Hastie, Tibshirani, and Friedman's "Elements of Statistical Learning".

Matlab. Often we will write code for the course using the Matlab environment. Matlab is accessible through NACS computers at several campus locations (e.g., MSTB-A, MSTB-B, and the ICS lab), and if you want a copy for yourself, student licenses are fairly inexpensive ($100). Personally, I do not recommend the open-source Octave program as a replacement, as the syntax is not 100% compatible and may cause problems (for me or you). If you are not familiar with Matlab, there are a number of tutorials on the web; you may want to start with one of the very short tutorials, then use the longer ones as a reference during the rest of the term. A small example of the flavor of code we'll write appears below.
(Tentative) Schedule of Topics.