January 19-23, 2005
Optimization and inference are two important computational problems that arise in many machine learning contexts. For instance, Bayesian inference is at the core of many applications such as computer vision, robotics, expert systems and pattern recognition. Optimization is found in MAP estimation, optimal control, reinforcement learning and Markov Decision Processes.
In the last few years, there has been a surge of interest in a variety of message passing methods, known to different communities by different names (Belief Propagation, cluster variation methods, the Bethe approximation, Survey Propagation, TAP methods and Expectation Propagation). In application domains such as 3-SAT, LDPC decoding and inference in graphical models, these methods often outperform previous approaches in both speed and accuracy, on optimization as well as inference tasks.
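As a concrete illustration of the message passing idea, the sketch below runs sum-product Belief Propagation on a small chain-structured Markov random field, where it computes marginals exactly. The potentials and the three-node chain are arbitrary choices for illustration, not taken from the announcement; the result is checked against brute-force enumeration.

```python
# Sum-product belief propagation on a 3-node binary chain MRF (illustrative
# sketch; the potentials below are arbitrary example values).
import numpy as np

# Unary potentials phi[i](x_i) and pairwise potentials psi[i](x_i, x_{i+1}).
phi = [np.array([1.0, 2.0]), np.array([3.0, 1.0]), np.array([1.0, 1.5])]
psi = [np.array([[2.0, 1.0], [1.0, 2.0]]),   # between nodes 0 and 1
       np.array([[1.0, 3.0], [3.0, 1.0]])]   # between nodes 1 and 2

# Forward messages m_f[i]: message arriving at node i from the left.
m_f = [np.ones(2)]
for i in range(2):
    m_f.append(psi[i].T @ (phi[i] * m_f[i]))

# Backward messages m_b[i]: message arriving at node i from the right.
m_b = [None, None, np.ones(2)]
for i in (1, 0):
    m_b[i] = psi[i] @ (phi[i + 1] * m_b[i + 1])

# Marginal at each node: local potential times both incoming messages.
marg = [phi[i] * m_f[i] * m_b[i] for i in range(3)]
marg = [m / m.sum() for m in marg]

# Brute-force check: enumerate all 2^3 joint states.
p = np.zeros((2, 2, 2))
for a in range(2):
    for b in range(2):
        for c in range(2):
            p[a, b, c] = (phi[0][a] * phi[1][b] * phi[2][c]
                          * psi[0][a, b] * psi[1][b, c])
p /= p.sum()
```

On tree-structured graphs such as this chain, the message updates converge in a single forward-backward sweep and the resulting marginals are exact; on loopy graphs the same updates are iterated and yield the approximations discussed at the workshop.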
These message passing methods, as well as other efficient optimization and inference methods (Monte Carlo methods, variational and mean-field theories), were originally developed in the physics community, where progress is still being made.
The aim of this workshop is to bring together researchers in Europe in the fields of machine learning and statistical physics to discuss work in progress in an informal setting. The workshop will focus on message passing methods and their application to approximate statistical inference and optimization.
The meeting will be held in Lavin, Switzerland, in the house of Ruedi Stoop of ETH Zurich, as part of the MABiC series. See the map for further information. Ruedi Stoop has organized some excellent meetings in the past.