
A Beginner's Guide to Bayesian Modelling

Peter England, PhD


EMB
GIRO 2002

Outline

An easy one parameter problem


A harder one parameter problem
Problems with multiple parameters
Modelling in WinBUGS
Stochastic Claims Reserving
Parameter uncertainty in DFA

Bayesian Modelling: General Strategy

Specify distribution for the data


Specify prior distributions for the parameters
Write down the joint distribution
Collect terms in the parameters of interest
Recognise the (conditional) posterior distribution?
Yes: Estimate the parameters, or sample directly
No: Sample using an appropriate scheme

Forecasting: Recognise the predictive distribution?


Yes: Estimate the parameters
No: Simulate an observation from the data distribution,
conditional on the simulated parameters

A One Parameter Problem

Data sample: [3, 8, 5, 9, 5, 8, 4, 8, 7, 3]
Distributed as a Poisson random variable?
Use a Gamma prior for the mean of the Poisson
Predicting a new observation?
Negative Binomial predictive distribution

Poisson Example 1: Estimation

y_i ~ Poisson(θ), i = 1, …, n (independent)

θ ~ Gamma(α, β)

f(θ | y) ∝ f(y | θ) f(θ)

         ∝ ∏_{i=1}^{n} (e^{-θ} θ^{y_i} / y_i!) · θ^{α-1} e^{-βθ}

         ∝ θ^{Σ y_i + α - 1} e^{-(n + β)θ}

θ | y ~ Gamma(Σ y_i + α, n + β)
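The conjugate update above can be verified directly for the sample on the previous slide. A minimal sketch; the prior values Gamma(1, 1) are an illustrative assumption, not taken from the slides:

```python
# Conjugate Gamma-Poisson update for the sample [3,8,5,9,5,8,4,8,7,3].
data = [3, 8, 5, 9, 5, 8, 4, 8, 7, 3]
alpha, beta = 1.0, 1.0               # assumed prior Gamma(alpha, beta)

# Posterior: theta | y ~ Gamma(alpha + sum(y), beta + n)
post_shape = alpha + sum(data)       # 1 + 60 = 61
post_rate = beta + len(data)         # 1 + 10 = 11
post_mean = post_shape / post_rate   # posterior mean of the Poisson rate
```

No simulation is needed here: the posterior is available in closed form.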

Poisson Example 1: Prediction

f(ỹ | y) = ∫ f(ỹ | θ) f(θ | y) dθ

Carrying out the integral gives a negative binomial predictive distribution:

ỹ ~ Negative Binomial(x, p)

x = α + Σ y_i,   p = (β + n) / (β + n + 1)
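The predictive result can be checked by simulation: draw θ from the posterior, then draw ỹ | θ from a Poisson, and compare with direct negative binomial sampling. A sketch with NumPy, assuming the same illustrative Gamma(1, 1) prior as before:

```python
import numpy as np

rng = np.random.default_rng(42)
data = [3, 8, 5, 9, 5, 8, 4, 8, 7, 3]
alpha, beta = 1.0, 1.0                       # assumed prior
a_post = alpha + sum(data)                   # 61
b_post = beta + len(data)                    # 11

# Route 1: sample the negative binomial predictive directly
# (NumPy's parametrisation: number of failures before a_post successes,
# success probability p)
p = b_post / (b_post + 1.0)
y_direct = rng.negative_binomial(a_post, p, size=100_000)

# Route 2: simulate theta from the posterior, then y~ | theta from a Poisson
theta = rng.gamma(shape=a_post, scale=1.0 / b_post, size=100_000)
y_two_stage = rng.poisson(theta)
```

Both routes target the same predictive distribution, with mean a_post / b_post.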

One Parameter Problem: Simple Case

We can recognise the posterior distribution of the parameter
We can recognise the predictive distribution
No simulation required
(We can use simulation if we want to)

Variability of a Forecast

Includes estimation variance and process variance:

(prediction error)² = process variance + estimation variance

Analytic solution: estimate the two components
Bayesian solution: simulate the parameters, then simulate the forecast conditional on the parameters
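For the conjugate example this decomposition can be checked exactly: with a Gamma(a, b) posterior the process variance is E[θ] = a/b, the estimation variance is Var(θ) = a/b², and their sum equals the negative binomial predictive variance. The a = 61, b = 11 values carry over from the illustrative prior assumed earlier, not from the slides:

```python
a_post, b_post = 61.0, 11.0              # assumed posterior from the Gamma-Poisson example

process_var = a_post / b_post            # E[theta]: Poisson variance, averaged over the posterior
estimation_var = a_post / b_post**2      # Var(theta): parameter uncertainty
prediction_var = process_var + estimation_var

# Negative binomial predictive variance: a(b + 1) / b^2
negbin_var = a_post * (b_post + 1.0) / b_post**2
```

The two components add up exactly to the predictive variance, mirroring the prediction-error identity above.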

Main Features of Bayesian Analysis

Focus is on distributions (of parameters or forecasts), not just point estimates
The mode of the posterior or predictive distribution is analogous to maximum likelihood in classical statistics

One Parameter Problem: Harder Case

Use a log link between the mean and the parameter, that is:

Mean = e^θ

Use a normal distribution for the prior
What is the posterior distribution?
How do we simulate from it?

Poisson Example 2: Estimation

y_i ~ Poisson(e^θ), i = 1, …, n (independent)

θ ~ N(μ, σ²)

f(θ | y) ∝ f(y | θ) f(θ)

         ∝ ∏_{i=1}^{n} (e^{-e^θ} e^{θ y_i} / y_i!) · exp(-(θ - μ)² / (2σ²))

         ∝ e^{θ Σ y_i} e^{-n e^θ} exp(-(θ - μ)² / (2σ²))

log density = θ Σ y_i - n e^θ - (θ - μ)² / (2σ²) + constant

Poisson Example 2

Step 1: Use adaptive rejection sampling (ARS) on the log density to sample the parameter θ
Step 2: For prediction, sample from a Poisson distribution with mean e^θ, with θ simulated at Step 1
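ARS itself is not shown on the slides; as a stand-in, a short random-walk Metropolis sampler targets the same log density (the density is log-concave, so ARS would also apply). The data are the earlier sample, and the prior values μ = 0, σ² = 100 are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
data = [3, 8, 5, 9, 5, 8, 4, 8, 7, 3]
s, n = sum(data), len(data)
mu, sigma2 = 0.0, 100.0                  # assumed vague normal prior

def log_post(theta):
    # theta * sum(y) - n * e^theta - (theta - mu)^2 / (2 sigma^2), up to a constant
    return theta * s - n * np.exp(theta) - (theta - mu) ** 2 / (2.0 * sigma2)

# Step 1: random-walk Metropolis in place of ARS
theta, chain = np.log(s / n), []         # start at the MLE, log(ybar)
for _ in range(20_000):
    prop = theta + 0.3 * rng.standard_normal()
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta)
chain = np.array(chain[5_000:])          # discard burn-in

# Step 2: predictive draws, Poisson with mean e^theta
y_pred = rng.poisson(np.exp(chain))
```

With these data the chain settles around θ ≈ log 6, so predictive draws have mean near 6.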

A Multi-Parameter Problem

From Scollnik (NAAJ, 2001)

3 groups of workers' compensation policies
Exposure measured using payroll as a proxy
Number of claims available for each of the last 4 years
Problem is to describe claim frequencies in the forecast year

Scollnik Example 1

Year |     Payroll P(i,j)      |      Claims X(i,j)      |      Probabilities
     | Group 1 Group 2 Group 3 | Group 1 Group 2 Group 3 | Group 1 Group 2 Group 3
  1  |   280     260      —    |    9       6       —    |  0.032   0.023     —
  2  |   320     275     145   |    7       4       8    |  0.022   0.015   0.055
  3  |   265     240     120   |    6       2       3    |  0.023   0.008   0.025
  4  |   340     265     105   |   13       8       4    |  0.038   0.030   0.038
  5  |   285      —      115   |    —       —       —    |    —       —       —

Average probabilities: Group 1 = 0.029, Group 2 = 0.019, Group 3 = 0.039

X_ij ~ Poisson(P_ij θ_i)

θ_i ~ Gamma(α, β)
α ~ Gamma(5, 5)
β ~ Gamma(25, 1)

Scollnik Example 1: Posterior Distributions

θ_1 | X, θ_2, θ_3, α, β ~ Gamma(Σ_j X_1j + α, Σ_j P_1j + β)

θ_2 | X, θ_1, θ_3, α, β ~ Gamma(Σ_j X_2j + α, Σ_j P_2j + β)

θ_3 | X, θ_1, θ_2, α, β ~ Gamma(Σ_j X_3j + α, Σ_j P_3j + β)

β | X, θ_1, θ_2, θ_3, α ~ Gamma(3α + 25, Σ_i θ_i + 1)

f(α | X, θ_1, θ_2, θ_3, β) ∝ (β^α / Γ(α))³ (θ_1 θ_2 θ_3)^{α-1} α⁴ e^{-5α}

Scollnik Example 1

Use Gibbs sampling
Iterate through each parameter in turn
Sample from the conditional posterior distribution, treating the other parameters as fixed

Sampling is easy for θ_1, θ_2, θ_3 and β
Use ARS for α
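The full Gibbs scheme can be sketched in Python. A discretised (grid) draw for α stands in for ARS here, and the payroll/claim figures are those in the table above (group 3 enters in year 2):

```python
import numpy as np
from math import lgamma

rng = np.random.default_rng(1)

# Payroll and claim counts for the 4 observed years, per group
P = [np.array([280, 320, 265, 340]), np.array([260, 275, 240, 265]), np.array([145, 120, 105])]
X = [np.array([9, 7, 6, 13]), np.array([6, 4, 2, 8]), np.array([8, 3, 4])]
sx = np.array([x.sum() for x in X])      # total claims per group: [35, 20, 15]
sp = np.array([p.sum() for p in P])      # total payroll per group: [1205, 1040, 370]

alpha, beta = 1.0, 25.0                  # starting values
grid = np.linspace(0.05, 20.0, 400)      # grid draw for alpha stands in for ARS
lgam = np.vectorize(lgamma)

draws = []
for it in range(4000):
    # theta_i | rest ~ Gamma(sum_j X_ij + alpha, sum_j P_ij + beta)
    theta = rng.gamma(sx + alpha, 1.0 / (sp + beta))
    # beta | rest ~ Gamma(3 alpha + 25, sum_i theta_i + 1)
    beta = rng.gamma(3 * alpha + 25, 1.0 / (1.0 + theta.sum()))
    # alpha | rest: sample from the discretised full conditional
    logf = (3 * grid * np.log(beta) - 3 * lgam(grid)
            + (grid - 1) * np.log(theta).sum() + 4 * np.log(grid) - 5 * grid)
    w = np.exp(logf - logf.max())
    alpha = rng.choice(grid, p=w / w.sum())
    if it >= 1000:                       # discard burn-in
        draws.append(theta)

theta_mean = np.mean(draws, axis=0)      # posterior claim frequency per group
```

The posterior means land close to the empirical frequencies (about 0.029, 0.020 and 0.040), as expected given the large payroll exposures.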

WinBUGS

WinBUGS is an expert system for Bayesian analysis
You specify:
The distribution of the data
The prior distributions of the parameters
WinBUGS works out the conditional posterior distributions
WinBUGS decides how to sample the parameters
WinBUGS uses Gibbs sampling for multiple parameter problems

Stochastic Claims Reserving


Changes the focus from a best estimate of reserves
to a predictive distribution of outstanding liabilities
Most stochastic methods to date have only
considered 2nd moment properties (variance) in
addition to a best estimate
Bayesian methods can be used to investigate a full
predictive distribution, and incorporate judgement
(through the choice of priors).
For more information, see England and Verrall (BAJ,
2002)

The Bornhuetter-Ferguson Method

Useful when the data are unstable


First get an initial estimate of ultimate
Estimate chain-ladder development factors
Apply these to the initial estimate of
ultimate to get an estimate of outstanding
claims

Conceptual Framework

Reserve estimate (measure of location)
Variability (prediction error)
Predictive distribution

Figure 1. Predictive Aggregate Distribution of Total Reserves

[Histogram of the predictive distribution; x-axis: Total Reserves, from 10,000 to 34,000]

Estimates of Outstanding Claims

To estimate ultimate claims using the chain ladder technique, you would multiply the latest cumulative claims in each row by f, a product of development factors.
Hence, an estimate of what the latest cumulative claims should be is obtained by dividing the estimate of ultimate by f. Subtracting this from the estimate of ultimate gives an estimate of outstanding claims:

Estimated Ultimate × (1 - 1/f)
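As a numerical check with hypothetical figures (an ultimate of 1,000 and a cumulative development factor f = 1.25, neither taken from the slides):

```python
ultimate = 1000.0     # hypothetical estimate of ultimate claims
f = 1.25              # hypothetical product of development factors

latest = ultimate / f                       # implied latest cumulative claims
outstanding = ultimate * (1.0 - 1.0 / f)    # estimate of outstanding claims
```

Here the implied latest cumulative claims are 800, leaving 200 outstanding; the two pieces add back to the ultimate.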

The Bornhuetter-Ferguson Method

Let the initial estimate of ultimate claims for accident year i be M_i
The estimate of outstanding claims for accident year i is

M_i (1 - 1/(λ_{n-i+2} λ_{n-i+3} ⋯ λ_n))

  = [M_i / (λ_{n-i+2} λ_{n-i+3} ⋯ λ_n)] (λ_{n-i+2} λ_{n-i+3} ⋯ λ_n - 1)

Comparison with Chain-ladder

M_i / (λ_{n-i+2} λ_{n-i+3} ⋯ λ_n) replaces the latest cumulative claims for accident year i, to which the usual chain-ladder parameters are applied to obtain the estimate of outstanding claims. For the chain-ladder technique, the estimate of outstanding claims is

D_{i,n-i+1} (λ_{n-i+2} λ_{n-i+3} ⋯ λ_n - 1)

Multiplicative Model for Chain-Ladder

C_ij ~ Poisson(m_ij) (independent)

E[C_ij] = m_ij = x_i y_j

with  Σ_{k=1}^{n} y_k = 1

x_i is the expected ultimate claims for origin year i
y_j is the proportion paid in development year j

BF as a Bayesian Model

Put a prior distribution on the row parameters. The Bornhuetter-Ferguson method assumes there is prior knowledge about these parameters, and therefore uses a Bayesian approach. The prior information could be summarised as the following prior distributions for the row parameters:

x_i ~ independent Gamma(α_i, β_i)

BF as a Bayesian Model
Using a perfect prior (very small variance)
gives results analogous to the BF method
Using a vague prior (very large variance)
gives results analogous to the standard
chain ladder model
In a Bayesian context, uncertainty
associated with a BF prior can be
incorporated
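The tight-prior/vague-prior behaviour can be illustrated with the one-parameter Gamma-Poisson setup from earlier in the talk. The numbers below (a prior mean of 4 playing the role of the BF initial estimate, a data mean of 6 playing the role of the chain-ladder estimate) are illustrative assumptions:

```python
data_mean, n = 6.0, 10          # "chain-ladder" estimate from the data (hypothetical)
prior_mean = 4.0                # "BF initial estimate" (hypothetical)

def posterior_mean(prior_var):
    # Gamma prior with the given mean and variance: mean = a/b, var = a/b^2
    b = prior_mean / prior_var
    z = n / (n + b)             # credibility weight on the data
    return z * data_mean + (1.0 - z) * prior_mean

tight = posterior_mean(1e-6)    # prior dominates: BF-like answer, close to 4
vague = posterior_mean(1e6)     # data dominate: chain-ladder-like answer, close to 6
```

The posterior mean is a credibility blend of the two estimates, with the prior variance controlling the weight, which is exactly the BF-versus-chain-ladder spectrum described above.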

Parameter Uncertainty in DFA


Often, in DFA, forecasts are obtained using simulation, assuming the underlying parameters are fixed (for example, a standard application of Wilkie's model)
Including parameter uncertainty may not be
straightforward in the absence of a Bayesian
framework, which includes it naturally
Ignoring parameter uncertainty will
underestimate the true uncertainty!

Summary
Bayesian modelling using simulation
methods can be used to fit complex models
Focus is on distributions of parameters or
forecasts
The mode is analogous to maximum likelihood
It is a natural way to include parameter
uncertainty when forecasting (e.g. in DFA)

References
Scollnik, DPM (2001) Actuarial Modeling with MCMC
and BUGS, North American Actuarial Journal, 5 (2), pages
96-124.
England, PD and Verrall, RJ (2002) Stochastic Claims
Reserving in General Insurance, British Actuarial Journal
Volume 8 Part II (to appear).
Spiegelhalter, DJ, Thomas, A and Best, NG (1999),
WinBUGS Version 1.2 User Manual, MRC Biostatistics
Unit.
