PIMS Distinguished Chair Lecture Series - Jeffrey Rosenthal

  • Start Date: 06/04/2009
  • End Date: 06/10/2009

Greater Vancouver Regional District


TALK #1: June 4, 2009, UBC

7:30 PM in the Fairmont Lounge at St. John's College

The Curious World of Probabilities by Jeffrey S. Rosenthal
(Professor, Department of Statistics, University of Toronto, and author of the bestseller "Struck by Lightning: The Curious World of Probabilities")

Probabilities and randomness arise whenever we're not sure what will happen next. They apply to everything from lottery jackpots to airplane crashes; casino gambling to homicide rates; medical studies to election polls to surprising coincidences. This talk will explain how a Probability Perspective can shed new light on many familiar situations. It will also discuss "Monte Carlo" computer algorithms which use randomness to solve problems in many branches of science.
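The "Monte Carlo" idea mentioned in the abstract can be illustrated with a classic toy example (not from the talk itself): estimating pi by throwing random points at a square and counting how many land inside the inscribed quarter circle. This is a minimal sketch, with the function name and sample count chosen here for illustration.

```python
import random

def estimate_pi(n_samples: int, seed: int = 42) -> float:
    """Estimate pi by sampling points uniformly in the unit square
    and counting the fraction that fall inside the quarter circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # The fraction inside approximates (pi/4), so multiply by 4.
    return 4.0 * inside / n_samples

print(estimate_pi(100_000))  # typically prints a value near 3.14
```

The estimate's error shrinks like one over the square root of the number of samples, which is the same principle that makes Monte Carlo methods useful across many branches of science.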


TALK #2: June 5, 2009, SFU

1:30 - 3:00 PM, IRMACS Theatre, ASB 10900 

Adaptive MCMC: Challenges and Opportunities

To sample from a given target probability distribution, a wide variety of Markov chain Monte Carlo (MCMC) schemes and tunings are available, and it can be difficult to choose among them. One possibility is to have the computer automatically "adapt" the algorithm while it runs, tuning it on the fly in an effort to improve performance. However, natural-seeming adaptive schemes can destroy the ergodicity properties necessary for MCMC algorithms to be valid. In this talk, we review adaptive MCMC and explain how it can fail, using a very simple graphical example (http://probability.ca/jeff/java/adapt.html). We then present a theorem (joint with G.O. Roberts) which gives simple conditions that ensure ergodicity. We apply adaptive MCMC to several high-dimensional adaptive Metropolis and Metropolis-within-Gibbs examples. Finally, we briefly discuss a preliminary general-purpose adaptive MCMC software package (http://probability.ca/amcmc).
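To make the adaptation idea concrete, here is a minimal one-dimensional sketch, not taken from the talk or the software package: a random-walk Metropolis sampler whose proposal scale is nudged toward a roughly 0.44 acceptance rate (a common target for one-dimensional updates), using adaptation steps that shrink over time so the tuning eventually settles down. The target density, step-size schedule, and names are all illustrative assumptions.

```python
import math
import random

def adaptive_metropolis_1d(n_iters: int, seed: int = 1):
    """Random-walk Metropolis on a standard normal target, with the
    log proposal scale adapted toward ~0.44 acceptance.  The adaptation
    step delta shrinks with n ("diminishing adaptation"), one of the
    simple conditions under which adaptive MCMC remains valid."""
    rng = random.Random(seed)
    log_sigma = 0.0          # log of the proposal standard deviation
    x = 0.0                  # current state of the chain
    samples = []

    def log_target(z: float) -> float:
        # Standard normal target, up to an additive constant.
        return -0.5 * z * z

    for n in range(1, n_iters + 1):
        y = x + rng.gauss(0.0, math.exp(log_sigma))      # propose
        log_ratio = log_target(y) - log_target(x)
        accepted = log_ratio >= 0 or rng.random() < math.exp(log_ratio)
        if accepted:
            x = y
        # Diminishing adaptation: nudge the scale up on acceptance,
        # down on rejection, equilibrating near a 0.44 acceptance rate.
        delta = min(0.01, n ** -0.5)
        log_sigma += delta * ((1.0 if accepted else 0.0) - 0.44)
        samples.append(x)

    return samples, math.exp(log_sigma)

samples, sigma = adaptive_metropolis_1d(50_000)
```

After a burn-in period, the samples should resemble draws from the standard normal target, and the adapted scale typically lands near the theoretically efficient value for this problem; a naive scheme whose adaptation steps did not shrink could instead break ergodicity, which is exactly the failure mode the talk's graphical example illustrates.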


TALK #3: June 10, 2009, SFU

1:30 - 3:00 PM, AQ3005

Theoretical Rates of Convergence for MCMC Algorithms

A fundamental question about Markov chain Monte Carlo (MCMC) algorithms concerns the rate of convergence: How long should the algorithm be run before it gives satisfactory answers? While various convergence diagnostics have been proposed, none are completely satisfactory. An alternative approach involves proving theoretical, a priori bounds on the time required for convergence. We shall describe a method for computing explicit, rigorous bounds on the distance to stationarity of MCMC algorithms, using coupling constructions based on minorisation and drift conditions. The method is in principle quite general, and does not require special properties such as reversibility. We apply our method to some specific examples of MCMC, including the Gibbs sampler for variance components models and for hierarchical Poisson models.
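To indicate the flavour of such bounds, here is a sketch of a well-known quantitative result of this type (cf. Rosenthal, JASA 1995); the exact hypotheses and constants should be checked against the paper, and this rendering is only illustrative.

```latex
% Drift condition: for some function V \ge 1 and constants
% 0 < \lambda < 1, b < \infty,
%   E\left[\, V(X_1) \mid X_0 = x \,\right] \le \lambda V(x) + b
%   \quad \text{for all } x.
% Minorisation condition on the small set C = \{x : V(x) \le d\}:
%   P(x, \cdot) \ge \epsilon \, Q(\cdot) \quad \text{for all } x \in C.
% Then, for any 0 < r < 1, the distance to stationarity satisfies
\left\| P^k(x, \cdot) - \pi \right\|_{\mathrm{TV}}
  \;\le\; (1 - \epsilon)^{rk}
  \;+\; \left( \alpha^{-(1-r)} A^{r} \right)^{k}
        \left( 1 + \frac{b}{1 - \lambda} + V(x) \right),
\qquad
\alpha^{-1} = \frac{1 + 2b + \lambda d}{1 + d},
\quad
A = 1 + 2(\lambda d + b).
```

The first term reflects how often the chain couples on the small set, and the second controls excursions away from it via the drift function; verifying the two conditions for a specific sampler, such as the Gibbs samplers mentioned above, is what yields an explicit, a priori run-length guarantee.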
