This is a collection of (mostly) pen-and-paper exercises in machine learning. Each exercise comes with a detailed solution. The following topics are covered:
- linear algebra
- optimisation
- directed graphical models
- undirected graphical models
- expressive power of graphical models
- factor graphs and message passing
- inference for hidden Markov models
- model-based learning (including ICA and unnormalised models)
- sampling and Monte Carlo integration
- variational inference
Conditions of Use
This book is licensed under a Creative Commons License (CC BY-NC-SA). You can download the ebook Pen and Paper Exercises in Machine Learning for free.
- Title: Pen and Paper Exercises in Machine Learning
- Author(s): Michael U. Gutmann
- Published: 2022-06-27
- Edition: 1
- Format: eBook (PDF, EPUB, MOBI)
- Pages: 211
- Language: English
- License: CC BY-NC-SA
- Book Homepage: Free eBook, Errata, Code, Solutions, etc.
Contents
- Preface
- Linear Algebra: Gram–Schmidt orthogonalisation; Linear transforms; Eigenvalue decomposition; Trace, determinants and eigenvalues; Eigenvalue decomposition for symmetric matrices; Power method
- Optimisation: Gradient of vector-valued functions; Newton's method; Gradient of matrix-valued functions; Gradient of the log-determinant; Descent directions for matrix-valued functions
- Directed Graphical Models: Directed graph concepts; Canonical connections; Ordered and local Markov properties, d-separation; More on ordered and local Markov properties, d-separation; Chest clinic (based on Exercise 3.3 in Barber, 2012); More on the chest clinic (based on Exercise 3.3 in Barber, 2012); Hidden Markov models; Alternative characterisation of independencies; More on independencies; Independencies in directed graphical models; Independencies in directed graphical models
- Undirected Graphical Models: Visualising and analysing Gibbs distributions via undirected graphs; Factorisation and independencies for undirected graphical models; Factorisation and independencies for undirected graphical models; Factorisation from the Markov blankets I; Factorisation from the Markov blankets II; Undirected graphical model with pairwise potentials; Restricted Boltzmann machine (based on Exercise 4.4 in Barber, 2012); Hidden Markov models and change of measure
- Expressive Power of Graphical Models: I-equivalence; Minimal I-maps; I-equivalence between directed and undirected graphs; Moralisation: Converting DAGs to undirected minimal I-maps; Moralisation exercise; Moralisation exercise; Triangulation: Converting undirected graphs to directed minimal I-maps; I-maps, minimal I-maps, and I-equivalency; Limits of directed and undirected graphical models
- Factor Graphs and Message Passing: Conversion to factor graphs; Sum-product message passing; Sum-product message passing; Max-sum message passing; Choice of elimination order in factor graphs; Choice of elimination order in factor graphs
- Inference for Hidden Markov Models: Predictive distributions for hidden Markov models; Viterbi algorithm; Forward filtering backward sampling for hidden Markov models; Prediction exercise; Hidden Markov models and change of measure; Kalman filtering
- Model-Based Learning: Maximum likelihood estimation for a Gaussian; Posterior of the mean of a Gaussian with known variance; Maximum likelihood estimation of probability tables in fully observed directed graphical models of binary variables; Cancer-asbestos-smoking example: MLE; Bayesian inference for the Bernoulli model; Bayesian inference of probability tables in fully observed directed graphical models of binary variables; Cancer-asbestos-smoking example: Bayesian inference; Learning parameters of a directed graphical model; Factor analysis; Independent component analysis; Score matching for the exponential family; Maximum likelihood estimation and unnormalised models; Parameter estimation for unnormalised models
- Sampling and Monte Carlo Integration: Importance sampling to estimate tail probabilities (based on Exercise 3.5 in Robert, 2010); Monte Carlo integration and importance sampling; Inverse transform sampling; Sampling from the exponential distribution; Sampling from a Laplace distribution; Rejection sampling (based on Exercise 2.8 in Robert, 2010); Sampling from a restricted Boltzmann machine; Basic Markov chain Monte Carlo inference; Bayesian Poisson regression; Mixing and convergence of Metropolis-Hastings MCMC
- Variational Inference: Mean field variational inference I; Mean field variational inference II; Variational posterior approximation I; Variational posterior approximation II
- Bibliography