CS274b: Learning in Graphical Models

Assignments and Exams:

  • HW1, Data: Due 4/15
  • HW2, Code & Data: Due 4/29
  • HW4, Data: Due 6/03
  • Projects: Due 6/3

Lecture: ICS 180, TR 3:30pm-5:00pm

Instructor: Prof. Alex Ihler (ihler@ics.uci.edu), Office Bren Hall 4066

  • Office Hours: Mondays 2-3pm, Bren Hall 4066, or by appointment

Reader: Qi Lou, Office Bren Hall 4051

Graphical models have assumed a central role in representing and reasoning about complex systems across many scientific domains. Examples of graphical models include Bayesian networks and constraint networks from artificial intelligence, Markov random fields from statistics and statistical physics, and factor graphs from coding and information theory. Graphical models provide a common language to represent, make explicit, and communicate modeling assumptions, as well as providing a useful structure for organizing computation and approximations. Today, graphical models are used in many application areas: signal and image processing, computer vision, game theory, operations research, error-correcting codes, and computational biology.

The primary goal of this course is to familiarize students with the concepts underlying graphical models, and in particular with learning these models from data. A student who has successfully completed the course should be able to understand a wide variety of well known models in terms of this unifying framework and feel comfortable using it to design new models. The course will contain: (1) formal mathematical sections necessary for the development of the theory, (2) examples of probabilistic models (re)formulated in the language of graphical models and (3) examples of successful applications to real data.

The assumed prerequisite for the course is CS274a (Probabilistic Learning); I will also assume familiarity with Python and/or Matlab.


Two excellent references are Koller & Friedman (2009), "Probabilistic Graphical Models"; and Murphy (2012), "Machine Learning: A Probabilistic Perspective". We will roughly follow (selected portions of) those texts.


We will use Piazza for discussions and some posted materials; our course page is:

Syllabus and Schedule (subject to change)

  • Probability basics pdf
  • Bayesian networks pdf
  • Markov networks & factor graphs pdf
  • Variable elimination pdf, Junction trees pdf
  • Markov chains pdf, Python Example
  • Maximum likelihood learning (complete data)
  • Model selection in Bayes nets: Chow-Liu, TAN (tree-augmented naive Bayes), DAGs pdf
  • Maximum entropy connections pdf
  • Loopy models: iterative scaling, IPF, pseudolikelihood pdf
  • Latent variable models; Expectation maximization pdf
  • Monte Carlo approximations; MCMC-MLE, Contrastive divergence
  • Variational approximations; loopy BP & variants, entropic learning
  • Conditional random fields and SSVMs
  • Copula models
  • Structure learning: basics; sparse learning; independence tests; etc.

For more information, see the Spring 2014 CS274b page.


For the class, I am providing some of my own Matlab and Python code for graphical models, mostly for discrete or Gaussian distributions. I may need to update the code during the class; if so, I will include the updates with the relevant assignment. The main component is a factor class for representing and manipulating the elemental functions that make up a graphical model. In addition to the help text in each function, there is some simple documentation here.
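To give a feel for what such a factor class does, here is a minimal sketch in Python for discrete variables, using numpy tables. This is illustrative only, not the actual course code: the class name `Factor` and the methods `product` and `marginalize` are assumptions, chosen to show the two core operations (multiplying factors and summing out a variable) that inference algorithms like variable elimination are built from.

```python
import numpy as np

class Factor:
    """A toy discrete factor: an ordered list of variable labels plus a
    numpy table with one axis per variable. Illustrative only."""

    def __init__(self, variables, table):
        self.vars = list(variables)              # ordered variable labels
        self.table = np.asarray(table, float)    # one axis per variable

    def _expand(self, all_vars):
        # Reshape/permute this factor's table so its axes line up with
        # all_vars, broadcasting (size-1 axes) over variables it lacks.
        t = self.table.reshape(self.table.shape
                               + (1,) * (len(all_vars) - self.table.ndim))
        dest = [all_vars.index(v) for v in self.vars]
        return np.moveaxis(t, list(range(len(self.vars))), dest)

    def product(self, other):
        # Multiply two factors over the union of their variables.
        all_vars = self.vars + [v for v in other.vars if v not in self.vars]
        return Factor(all_vars, self._expand(all_vars) * other._expand(all_vars))

    def marginalize(self, var):
        # Sum out a single variable (one step of variable elimination).
        i = self.vars.index(var)
        return Factor(self.vars[:i] + self.vars[i + 1:], self.table.sum(axis=i))

# Example: p(A, B) proportional to f1(A) * f2(A, B); eliminate A.
f1 = Factor(['A'], [0.6, 0.4])
f2 = Factor(['A', 'B'], [[0.9, 0.1], [0.2, 0.8]])
pB = f1.product(f2).marginalize('A')
print(pB.vars, pB.table)   # marginal over B: [0.62, 0.38]
```

A real toolkit adds much more (evidence conditioning, max-marginals, log-space tables, Gaussian factors), but the product/marginalize pair above is the computational core that the course code wraps.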

There are many other software packages available that aim to simplify the use or study of graphical models, often maintained as the personal code of the lead researcher. Some good ones include:

  • BNT: Bayes Net Toolbox (Matlab)
  • PMTK3: Probabilistic Modeling Toolkit (Matlab)
  • UGM: Undirected Graphical Models Toolkit (Matlab)
  • libDAI (C++)
  • Grante (C++)
  • UnBBayes (Java)

(Note: if you have other suggestions feel free to share them with me and I may add them; but this is not intended to be a complete list of all GM software.)

Last modified May 24, 2016, at 03:22 PM
Bren School of Information and Computer Science
University of California, Irvine