Abstract
Uncertainty in the behavior of quantities of interest gives rise to risk. Statistics is therefore used to estimate these quantities and to assess their variability. Classical statistical inference does not allow expert knowledge to be incorporated or the influence of modeling assumptions on the resulting estimates to be assessed. This is, however, possible with a Bayesian approach, which has therefore gained increasing attention in recent years. Its advantage over the classical approach is that the uncertainty in quantities of interest can be quantified through the posterior distribution. We first introduce the Bayesian approach and illustrate its use in simple examples, including linear regression models. For more complex statistical models, Markov Chain Monte Carlo (MCMC) methods are needed to obtain an approximate sample from the posterior distribution. Owing to the increase in computing power over recent years, such methods have become more and more attractive for solving complex problems that are intractable with classical statistics, for instance spam e-mail filtering or the analysis of gene expression data. We illustrate why these methods work and introduce the two most commonly used algorithms: the Gibbs sampler and the Metropolis-Hastings algorithm. Both methods are derived and applied to statistical models useful in risk analysis. In particular, a Gibbs sampler is developed for change point detection in yearly counts of events and for a regression model with time dependence, while a Metropolis-Hastings algorithm is derived for modeling claim frequencies in an insurance context.
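To give a flavor of the two samplers named in the abstract, the following is a minimal sketch of a Gibbs sampler for change point detection in yearly counts. The specific model is an assumption, not taken from the chapter: counts before the change point are Poisson with rate lam1, counts after it are Poisson with rate lam2, both rates carry Gamma(a, b) priors, and the change point k has a uniform prior. The function name and hyperparameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_changepoint(y, n_iter=5000, a=2.0, b=1.0):
    """Gibbs sampler for a Poisson change-point model:
    y[0:k] ~ Poisson(lam1), y[k:n] ~ Poisson(lam2),
    Gamma(a, b) priors on both rates, uniform prior on k."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    cum = np.concatenate(([0.0], np.cumsum(y)))   # cum[k] = sum of the first k counts
    total = cum[-1]

    k = n // 2                                    # arbitrary starting value
    samples = np.empty((n_iter, 3))

    for t in range(n_iter):
        # rate before the change point: Gamma(a + sum of first k counts, b + k)
        lam1 = rng.gamma(a + cum[k], 1.0 / (b + k))
        # rate after the change point: Gamma(a + sum of remaining counts, b + n - k)
        lam2 = rng.gamma(a + total - cum[k], 1.0 / (b + n - k))
        # change point: discrete full conditional over k = 1, ..., n-1
        ks = np.arange(1, n)
        log_p = (cum[ks] * np.log(lam1) - ks * lam1
                 + (total - cum[ks]) * np.log(lam2) - (n - ks) * lam2)
        p = np.exp(log_p - log_p.max())
        k = rng.choice(ks, p=p / p.sum())
        samples[t] = (k, lam1, lam2)

    return samples

# simulated yearly counts with a change after year 40
y = np.concatenate([rng.poisson(3.0, 40), rng.poisson(1.0, 60)])
draws = gibbs_changepoint(y)
print("posterior means of k, lam1, lam2:", draws[1000:].mean(axis=0))
```

Similarly, a Metropolis-Hastings sampler can be sketched for a simple claim-frequency model; again the concrete setup (a single Poisson claim rate per unit of exposure with a lognormal prior, updated by a random-walk proposal on the log scale) is an assumption chosen so that the posterior is non-conjugate and MCMC is actually needed, not the model developed in the chapter.

```python
import numpy as np

rng = np.random.default_rng(1)

def mh_claim_rate(counts, exposure, n_iter=10000, step=0.1, mu0=0.0, sigma0=1.0):
    """Random-walk Metropolis-Hastings for counts[i] ~ Poisson(exposure[i] * lam),
    with a normal(mu0, sigma0) prior on theta = log(lam)."""
    counts = np.asarray(counts, dtype=float)
    exposure = np.asarray(exposure, dtype=float)

    def log_post(theta):
        lam = np.exp(theta)
        loglik = np.sum(counts * np.log(lam * exposure) - lam * exposure)
        logprior = -0.5 * ((theta - mu0) / sigma0) ** 2
        return loglik + logprior

    theta = np.log(counts.sum() / exposure.sum())   # start at the crude rate estimate
    lp = log_post(theta)
    draws = np.empty(n_iter)
    accepted = 0

    for t in range(n_iter):
        prop = theta + step * rng.normal()          # symmetric random-walk proposal
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:    # accept with prob min(1, ratio)
            theta, lp = prop, lp_prop
            accepted += 1
        draws[t] = np.exp(theta)                    # store lam = exp(theta)

    return draws, accepted / n_iter

# ten policy years with varying exposure and a true claim rate of 0.3
exposure = rng.uniform(50, 150, size=10)
counts = rng.poisson(0.3 * exposure)
lam_draws, acc = mh_claim_rate(counts, exposure)
print("acceptance rate:", round(acc, 2), "posterior mean rate:", lam_draws[2000:].mean())
```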
| Original language | English |
| --- | --- |
| Title of host publication | Risk - A Multidisciplinary Introduction |
| Publisher | Springer International Publishing |
| Pages | 207-240 |
| Number of pages | 34 |
| ISBN (Electronic) | 9783319044866 |
| ISBN (Print) | 3319044850, 9783319044859 |
| DOIs | |
| State | Published - 1 Jan 2014 |
Keywords
- Bayesian inference
- Bayesian risk
- Markov Chain Monte Carlo samplers
- Posterior
- Prior