Dana Randall, Georgia Tech.
In this tutorial, we introduce the notion of a Markov chain and explore how it can be used to sample from a large set of configurations. Our primary focus is determining how quickly a Markov chain "mixes," or converges to its stationary distribution, as this is the key factor in the running time. We provide an overview of several techniques used to establish good bounds on the mixing time. The applications are mostly chosen from statistical physics, although the methods are much more general.
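The sampling idea the abstract describes can be sketched with a minimal Metropolis chain; this is an illustrative example only, not taken from the tutorial, and the state space, weights, and function names here are assumptions. The chain proposes a uniformly random state and accepts with the Metropolis ratio, so its stationary distribution is proportional to the given weights; running it long enough (past the mixing time) makes the empirical visit frequencies approximate that distribution.

```python
import random

def metropolis_step(state, weights, rng):
    """One step of a Metropolis chain on states 0..n-1 with
    target distribution pi(i) proportional to weights[i].
    Propose a uniform random state; accept with prob min(1, pi(new)/pi(cur))."""
    n = len(weights)
    proposal = rng.randrange(n)
    if rng.random() < min(1.0, weights[proposal] / weights[state]):
        return proposal
    return state

def sample_distribution(weights, steps=200_000, seed=0):
    """Run the chain for `steps` steps and return the empirical
    distribution of visited states (an estimate of pi)."""
    rng = random.Random(seed)
    counts = [0] * len(weights)
    state = 0
    for _ in range(steps):
        state = metropolis_step(state, weights, rng)
        counts[state] += 1
    return [c / steps for c in counts]

# Target pi = (0.1, 0.2, 0.3, 0.4); the empirical frequencies
# should be close to these after many steps.
weights = [1.0, 2.0, 3.0, 4.0]
print(sample_distribution(weights))
```

How fast the empirical frequencies converge to the target is exactly the mixing-time question the tutorial addresses: the number of steps needed before the chain's distribution is close to stationary.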
Machine Learning: my favorite results, directions, and open problems
Avrim Blum, CMU
In this tutorial I will start from scratch and give an introduction to some of the main themes and results in the theory of machine learning. I will describe (a biased subset of) some of the main accomplishments of computational learning theory to date, as well as current directions and open problems. One of my goals in this tutorial is to give my sense of why the theory of machine learning is a fundamental part of the theory of computing, intimately connected to complexity theory, cryptography, approximation algorithms, and online algorithms, with a useful perspective that can be brought to bear in these other areas. I will also talk a bit about the relationship between theory and practice in machine learning, and describe some current important issues in the practice of machine learning where theoretical work may be able to have substantial impact. Slides for this tutorial are available on the web at http://www.cs.cmu.edu/~avrim/Talks/FOCS03/.