NIPS*2008 Workshop/Organizers

Organizer Mailing List: probabilistic-programming@mit.edu

Daniel Roy is a Ph.D. student in computer science at MIT. His theoretical interests lie at the intersection of computer science and probability theory. Using probabilistic programming languages, he is developing a computational perspective on fundamental ideas in probability theory and statistics. His recent research has focused on recursive stochastic processes that induce coherent nonparametric models for Bayesian inference, including generalizations of the Dirichlet process via hierarchical Poisson processes and clustering models based on Kingman's coalescent.

Vikash Mansinghka is a Ph.D. student at MIT. His work centers on probabilistic programming languages, virtual machines for stochastic computation, and physical machines built from natively stochastic digital circuits that recover classical abstractions in a deterministic limit. He also develops models and algorithms motivated by cognitive and industrial applications, with an emphasis on nonparametric Bayesian methods.

John Winn (Microsoft Research Cambridge) works on automating probabilistic inference and on applying these inference methods to problems in vision and computational biology. He developed Variational Message Passing, a technique for automatically applying variational Bayesian methods to large models, as well as several software inference frameworks, including VIBES and (with Tom Minka) the probabilistic language Csoft used in Infer.NET. He is interested in developing probabilistic programming frameworks that are both flexible and efficient enough to meet the challenges of today's inference tasks.

David McAllester (Toyota Technological Institute at Chicago) is a prominent figure in several closely related areas. He is well known in the programming languages community for static analysis algorithms and for work on the semantics of types and contracts, and in the machine learning community for the PAC-Bayesian theorem relating Bayesian priors to frequentist generalization bounds. Recently he has been working on grammar-based models for computer vision and was part of a team that took second place in the 2007 PASCAL object detection challenge.

Joshua Tenenbaum (MIT) is a world leader in computational models of human cognition, probabilistic knowledge representation, and inductive learning. He has a strong record of machine learning research based on intricately structured probabilistic models, including best paper or best student paper awards at NIPS (for learning hierarchically structured semantic models), at CVPR (for separating style and content in visual recognition), and at the Cognitive Science conference (for probabilistic schema-based models of human causal learning and categorization). He is also a co-creator of Isomap, an influential approach to learning nonlinear manifolds.