
Lecture Notes: Actuarial Risk Management II [STAT40460]

Shane Whelan, 2010

Chapter 1: (Actuarial) Modelling


I have yet to see any problem, however complicated, which, when you looked at it in the right way, did not become still more complicated.

Poul Anderson, Science fiction writer, New Scientist, 25th September, 1969.

Models
A model is a simple, stylised caricature of a real-world system or process. A model is not designed to capture the full complexity of reality, but rather to capture the essence or key features of the system. Accordingly, as Poul Anderson observed above, building a model requires simplification of complex reality. The power of the model comes from the fact that it does not faithfully reflect reality in all its sophistication but throws into relief how the key influences determine the state of the system. So a good modeller strikes a happy balance between realism (fidelity to the underlying process) and simplicity (and so utility).

Models can be used to understand how a system will evolve in the future. More pragmatically, models are used to predict how the process might respond to given changes, thus enabling the results of possible actions to be assessed. Using a model, we can explore the consequences of our actions, allowing us to select the action that leads to the most desirable outcome. Rather than build a model, the alternative way of studying a system is to experiment and observe how the real-world system reacts to changing influences. This approach, though, is often too slow, too unethical, too risky, or too expensive. More philosophically, a model aids the organisation of empirical observations in such a way that the consequences of that organisation can be deduced. It highlights the relevant and shows, at times, the need for detail.

Definition: A model is a simplification of a real system, facilitating understanding, prediction and perhaps control of the real system.

Models can take many forms. Analogue models, for instance, use a set of physical properties to represent the properties of the system studied. The models treated here may be termed abstract, mathematical, or computer-based models. Actuarial models are simply models that actuaries use to help formulate and perhaps communicate advice. Given the nature of an actuary's work, actuarial models typically have at least the following as inputs: interest rates (so monetary values can be compared across time); future mortality, morbidity or other contingency probabilities; and inflation (so expenses can be forecast over time).


Example 1: A life office that sells life assurance to individuals wants to model the number and size of claims in each future year so it can set up suitable reserves. Here the nature of the problem prohibits a wait-and-see attitude: we need to set up reserves now. The model will require inputs such as the age, sex, and other influences determining the mortality experience of policyholders, the interest that can be earned on reserves, and the likely future expenses of administering the policies.

Example 2: The central bank wants to control inflation without curtailing economic growth in an economy. Its chief tool to achieve its ends is that it can adjust the interest rate on short-term deposits. The central bank will need a macro-econometric model of the economy, incorporating the short-term interest rate as an independent variable. Outputs would include both inflation and GDP growth. Other variables that might also be included would be current and past GDP, inflation, unemployment rates, etc. Clearly, it would be too expensive (in terms of the opportunity cost of lost potential growth) to experiment with the economy rather than build such a model.

Example 3: An astronomer wants to know the location of Mars at each future period. A very accurate model would be based on Newton's Law of Gravitation and the Laws of Motion, with inputs the masses of Mars and the Sun (and perhaps some neighbouring planets) and the current location and momentum of each. Most modelling exercises cannot achieve this level of accuracy, as either the relationship between the driving variables is not fully understood or the inputs cannot be measured with sufficient accuracy.

"The astonishing success of celestial mechanics in predicting the behaviour of the solar system has set a standard of predictability that is impossible for models of more open systems [i.e., systems not isolated from external influences] to attain. Our standards for models might not be so high [in terms of accuracy] if the center of the solar system were twin stars rather than a single dominant sun." James Hickman (1997)

Example 4: An investor wants to model the maximum value that a share will attain in the next two years, so that s/he may sell it at or close to that price.


Consider, as an exercise, the inputs that might be required to build such a model.

The above examples show that good modelling requires a thorough understanding of the system modelled. As noted above, models are built for a purpose: either to understand a phenomenon or to help anticipate how the system will evolve under several different scenarios. The purpose or objective of the model is paramount in assessing the adequacy or otherwise of the model. In short, a model is satisfactory if it meets the objectives of the modelling exercise satisfactorily. Note that the best model does not generally coincide with the most accurate model: there is a need to balance cost with benefit. In actuarial applications, cost (which includes timeliness) is typically the key constraint.

Actuarial models will typically help the actuary to form an opinion, and recommend a course of action, on contingencies relating to uncertain future events. Inevitably, there will be uncertainty about the correct course of action and judgement is required. The actuary should resist any attempt to characterise an actuarial opinion as nothing more than speculation, and should defend the concept of uncertainty against any attempt to use it to discredit the validity of actuarial work or opinion. Models would, ideally, help the actuary assess the magnitude and financial significance of the uncertainties, capturing them in the stochastic variability of the output for any given inputs.

Categorizing and Decomposing Models


Models can be categorized as to whether they are deterministic or stochastic. A deterministic model has a unique output for a given set of inputs: the output is not a random variable or, more strictly, takes a single value (a degenerate random variable) for each input. On the other hand, the output of a stochastic model is a (non-degenerate) random variable, and perhaps some inputs are also random variables. A deterministic model can be seen as a special case of a stochastic model, a degenerate stochastic model. Better, a stochastic model can be seen as a richer form of a deterministic model, where not only an estimate of the output is given but also the range of uncertainty about that estimate. Stochastic models give a range of outputs, each with an associated probability.

Example 5: It is desired to model the salary progression of an actuarial student from graduation (time 0) until retirement in 44 years' time. The following model has been proposed for the salary level at time t years since graduation:

Salary(t) = 25000 e^{0.05t}

A graph of Salary(t) against time t is given below.


[Figure: Salary(t) plotted against time t from graduation (t = 0) to retirement (t = 44), rising from 25,000 to roughly 225,000.]

This model is deterministic because Salary(t) is a function of t alone: for each t it takes a single value, not a (non-degenerate) random variable. On the other hand, if the following model were used:

Salary(t) = 25000 e^{0.05t} + X_t,  where X_t ~ N(0, (500t)^2),

then the model is stochastic, as the value of Salary(t) at each future time is a random variable. Salary(t) is graphed against t for several different possible outcomes below.
[Figure: several simulated paths of Salary(t) plotted against t under the stochastic model.]

Note the distinction between the two: the latter has several values, a distribution of values, associated with each t.
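The contrast can be made concrete in a few lines of code. The following is a minimal sketch (the language, seed and number of paths are arbitrary choices, not part of the notes): the deterministic model gives one salary figure at each t, while each simulated row is one possible outcome of the stochastic model of the kind graphed above.

```python
import numpy as np

rng = np.random.default_rng(seed=1)
t = np.arange(0, 45)                           # years since graduation: 0, 1, ..., 44
base = 25000 * np.exp(0.05 * t)                # deterministic model, Salary(t)
paths = base + rng.normal(0.0, 500.0 * t, size=(5, t.size))  # five possible stochastic outcomes

print(base[-1])       # deterministic salary at retirement (about 225,600)
print(paths[:, -1])   # five simulated salaries at retirement
```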


Table 1: Contrasting Deterministic & Stochastic Models


Deterministic Model
- Single output for each input.
- Simpler than a stochastic model to create.
- Easier and quicker to use and explain, so generally required by the regulator.
- Results can be compared over time to see trends and allow reassessment of assumptions.
- Uncertainty of model predictions not assessed (at least not explicitly quantified).

Stochastic Model
- Produces a distribution of results.
- More difficult to formulate.
- More difficult to interpret the output, more time-consuming, and more difficult to communicate results.
- Comparisons over time more difficult to make, as there is no single output.
- Allows us to interpret whether an observed outcome is significantly different from that expected under the model.

Note that some applications demand a stochastic model, as in option pricing or reserving, where the substance of the contract is the exchange of variability for certainty at a price; hence variability must be explicitly modelled. In actuarial applications, options generally appear as guarantees: the guarantee that the surrender value or maturity proceeds of a policy are never less than a stated amount is clearly a put option sold by the company. Similarly, guaranteed terms on which to increase the level of death benefit amount to an option granted by the company. One cannot be too careful when granting these guarantees: the only cases of life offices getting into financial difficulties in recent years have arisen from granting such guarantees and believing that they were not valuable. (A minimal Monte Carlo sketch of costing such a maturity guarantee follows the exercise below.)

The decision to use deterministic or stochastic models is often made on the following considerations:
1. A deterministic model is, in general, simpler than a stochastic model.
   - In particular, its results are easier to communicate (try talking of gamma distributions and Lévy processes to the trustees of a pension fund!).
   - It is easier to develop, easier to interpret and quicker to run.
   - It is also clearer what scenarios have been tested, but these are not implicit in the model; they are made external to it as part of the wider modelling process.
2. However, only a limited number of scenarios are run in a deterministic model.
   - The modelling exercise may have missed one that is particularly detrimental. This is important when the contract has embedded options (e.g., to extend life cover without underwriting) or guarantees (e.g., surrender value not lower than premiums paid). So we make scenario testing implicit in the model, that is, we use a stochastic model.
   - We might need to model explicitly the probability of each outcome (e.g., to price an embedded option). Hence we require a stochastic model.
3. Sometimes the model must allow for dynamic feedback, that is, the future evolution of the system depends on what happens in the future; this requires a stochastic model to trace the different possible paths and their likelihood.
   - E.g., bonus declarations on a policy depend on the performance of the assets.


   - E.g., discretionary rises to pensions in payment depend on the level of inflation and the past service surplus at the future time.
4. Stochastic models are more complex, so we need to be satisfied that the extra output (and the time needed to develop, run, interpret and communicate) is justified.
   - Do we know the underlying distributions of the parameter(s) with sufficient accuracy? (Or are we just introducing spurious accuracy?)
   - Considerable judgement is required to factor in the variability of parameters, the relationships between parameters (correlations, copulas), and the dynamic decision feedback.
5. Often a combination of stochastic and deterministic models is used.
   - Economic models (where the output has a high dependency on the inputs) are often stochastic.
   - Demographic models are often deterministic (as their variability is less material to the output).

Exercise: Indicate whether a deterministic or stochastic model is appropriate for:
1. Pricing a guarantee on the lowest interest rate at which an annuity will be sold in the future.
2. Setting a contribution rate for a defined benefit pension scheme.
3. Determining how much capital a company should maintain so that the probability of insolvency within a year is less than x%.
4. The statutory valuation of a life office.
5. Deciding which reinsurance arrangement (excess of loss, stop loss, etc.) gives best value for money when claims variability is to be kept within a prescribed limit.
6. The net present value of a project.
7. The asset portfolio that best matches salary-related benefits.
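To illustrate why embedded guarantees force the issue, here is a minimal Monte Carlo sketch (not from the notes) of costing a guarantee that maturity proceeds are at least the premium paid. The lognormal fund assumption, the parameter values and the flat discount rate are illustrative assumptions only; this is not a market-consistent valuation. The point is that a single best-estimate projection would value the guarantee at nil, while the stochastic model reveals a real cost.

```python
import numpy as np

def guarantee_cost(premium=10_000, guarantee=10_000, term=10,
                   mu=0.06, sigma=0.15, discount=0.03,
                   n_sims=100_000, seed=42):
    """Monte Carlo cost of guaranteeing that maturity proceeds are at least `guarantee`.

    Illustrative assumptions only: the fund is taken to be lognormal,
    F_T = premium * exp((mu - sigma^2/2)*T + sigma*sqrt(T)*Z), Z ~ N(0, 1),
    and the guarantee pays max(guarantee - F_T, 0) at maturity (a put option).
    """
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_sims)
    fund = premium * np.exp((mu - 0.5 * sigma**2) * term + sigma * np.sqrt(term) * z)
    shortfall = np.maximum(guarantee - fund, 0.0)
    return np.exp(-discount * term) * shortfall.mean()

print(f"Estimated guarantee cost per policy: {guarantee_cost():.2f}")
# A single best-estimate projection at mu = 6% ends well above the guarantee,
# so a deterministic model would report a cost of zero.
```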

Terminology in Modelling
Models can be decomposed into two main components. The structural part of the model establishes the relationships between the variables modelled (the inputs) so as to determine the functioning of the system (the outputs). These relationships are generally expressed in logical or mathematical terms. The complexity of a model is a function of the number of variables modelled and the form of the relationships posited between them. The other part of the model is the parameters, that is, the estimated values of the fixed inputs of the model. The parameters are often estimated from past data, using appropriate statistical techniques, but can also come from current observation, subjective assessment, or other forms of estimation.

Models can also be sub-divided according to whether time is modelled as being discrete or continuous. Accordingly, we might have a discrete time stochastic model (or process) or a continuous time stochastic model (or process). Similarly, the state space, meaning the set of all possible values (or states) the process can take, can be modelled with a discrete set or a continuous set. With two ways of modelling time and two ways of modelling the state space, this gives a fourfold classification of stochastic models. The decision of which of the four model types to use in a particular situation has more to do with the purpose of the model than with the underlying real system being


modelled. For instance, the prices a security can take are clearly discrete, yet in many modelling situations the price is assumed to be continuous, with the state space taken as all positive real numbers. This is done for modelling convenience.

Certain phrases have developed over time in the actuarial literature to describe various approaches to modelling or parts of the modelling process. Below is a brief description of the more important phrases in common use.

Scenario modelling: using different sets of economic conditions to forecast deterministically the model predictions, e.g., best estimate, prudent, pessimistic and optimistic bases. This outlines the range of possible outcomes in a simpler and quicker manner than stochastic modelling. One can use percentile values of parameters, if their distribution is known, to approximate a stochastic model.

Scenario testing: scenario modelling as above, but used to ensure compliance with a minimum standard set by the regulator. Generally each of the scenarios tested is adverse relative to current conditions. Resilience testing is an example of scenario testing for life offices, where the appointed actuary must certify that assets will cover liabilities even if long-term interest rates rise or fall by 3% and equity values fall by 25% on the valuation date. The extra amount of reserves needed to comply with this is known as the resilience test reserve. (A rough numerical sketch of such a stress test is given below.)

Sensitivity testing: running the model with different assumptions (e.g., mean values of parameters, distribution of parameters, etc.) to assess which assumptions the output is most sensitive to. Sensitivity testing is crucial in validating the model for use. It highlights the key dependencies of the model, which must be communicated. It is used to obtain a deeper understanding of the process. For instance, in profit testing a new contract, a better understanding of the sensitivity of the profit to key design features might prompt a change in product design.

Actuarial models can be divided into:
Demographic models: used to model numbers of individuals in different categories, e.g., the decrement model of the life table or a multi-state model for sickness or disability insurance. For pension funds this would include the proportion of members married (when there is a spouse's pension and the marital status of members is not known) and the age difference between spouses.
Economic models: used to model the relationships between economic, financial and investment drivers, e.g., inflation, earnings, interest rates, equity yields and returns. Particular attention must be paid to the relationships between them.

Models can also be divided by the (mathematical or financial) properties of the model. Such classifications include the generic form of model: market-consistent models, no-arbitrage models, diffusion processes, mean-reverting processes, ARCH, Markov processes, Lévy processes (see the earlier Stochastic Models course).

A fully dynamic model is a stochastic model which incorporates decision-making rules that depend on future output; e.g., if used for a model office it would change bonus declarations on with-profits business depending on prevailing interest rates and forecast asset shares, etc. Dynamic modelling is used in dynamic financial analysis (DFA).
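As a concrete, deliberately crude illustration of the resilience test described above, the sketch below stresses a simple asset-liability position. The 25% equity fall and the 3% yield movements follow the text; the duration approximation and all the figures are illustrative assumptions, not the statutory basis.

```python
def resilience_reserve(equity, bonds, liabilities,
                       bond_duration=8.0, liability_duration=12.0):
    """Extra reserve needed so assets still cover liabilities under the stresses:
    equities fall 25%; long-term yields rise or fall by 3% (values adjusted by a
    crude duration approximation). Returns the worst shortfall (0 if none)."""
    worst = 0.0
    for dy in (-0.03, 0.03):
        stressed_assets = 0.75 * equity + bonds * (1 - bond_duration * dy)
        stressed_liabs = liabilities * (1 - liability_duration * dy)
        worst = max(worst, stressed_liabs - stressed_assets)
    return worst

# Hypothetical office: 400 in equities, 600 in bonds, liabilities valued at 900.
print(resilience_reserve(equity=400.0, bonds=600.0, liabilities=900.0))
```

In this made-up example the binding scenario is the fall in yields, because the liabilities are assumed to be of longer duration than the bond portfolio.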


Customer lifetime value (CLV) models value customers rather than contracts, as customer loyalty and the propensity to purchase more products is an intangible asset. Such models are required to answer questions such as the discount or special terms banks might give to students to start an account with them, or the goodwill to be paid for a business whose customers overlap with yours.

Finally, we note that sometimes the model is run on one, or a relatively small number of, representative data points rather than on every single data point. A model point is a representative single point used to model the key characteristics of a larger group. Each member of the larger group acts, in all important respects, like the model point. So a judicious selection of model points enables us to model the entire system far more quickly than putting each data point through the model and combining the outputs: we just need to scale up the model point, multiplying the result at the model point by the expected number in the subgroup it represents (as sketched below). In the pricing example, a model point is a policy. Some experimentation might be needed to establish the model points; then choose their number so that, when scaled up, the expected new business is satisfactorily modelled. Normally one does not use model points in valuing existing business, either because regulation requires valuation policy-by-policy or because it is simpler to treat each data point in turn (as in pension fund valuations) rather than group them in homogeneous categories. Model points are very useful in "what if" investigations, such as model office studies.
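A minimal sketch of the model-point idea follows. The pricing function, ages, sums assured and policy counts are hypothetical placeholders; the point is only that results at a few representative policies are scaled up by the number of policies each represents.

```python
# Each model point stands for a homogeneous subgroup of the expected new business.
model_points = [
    # (age, sum assured, expected number of policies represented)
    (30, 100_000, 2_500),
    (45, 150_000, 1_800),
    (60,  80_000,   700),
]

def profit_per_policy(age, sum_assured):
    """Placeholder for the full pricing/profit-test model run on one policy."""
    return 0.004 * sum_assured - 3.0 * age       # purely illustrative formula

total = sum(count * profit_per_policy(age, sa) for age, sa, count in model_points)
print(f"Projected profit on expected new business: {total:,.0f}")
```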

Evaluation of Suitability of a Model


"And what is good, Phaedrus, and what is not good... need we ask anyone to tell us these things?" Plato, Phaedrus (quoted in Robert M. Pirsig's Zen and the Art of Motorcycle Maintenance)

Does the model help achieve its primary purpose of simplifying the real system, so facilitating the understanding, prediction and perhaps control of that system? The key question is whether the proposed model is fit for purpose. This requires a critical appraisal of the model. A helpful checklist of considerations that might form part of the evaluation is set out below.

Checklist for evaluation of a model
1. Evaluate in the context of the objectives and purpose to which it is put. In particular, does the model reflect the risk profile adequately?
2. Consider the data and techniques used to calibrate the model, especially estimation errors. Are the inputs credible? Do the inputs reflect the key features of the business being modelled that affect the advice given?
3. Consider the relationships between the variables driving the model: their correlation structure, order of cointegration, or appropriate copulas to capture their non-linear interactions. These must be sensible, or it is a case of garbage in, garbage out (GIGO).
4. Consider the relationships between the output variables (correlation structure, etc.). The joint behaviour of the outputs should be sensible. Can they be at least partially independently checked?
5. Consider the continued relevance of the model (if using a previously developed model).


6. Consider the credibility of the outputs.
7. Be alert to the dangers of spurious accuracy.
8. Consider the ease of use and how the results can be communicated.
9. The model should, ideally, be capable of development and refinement in the future.

The checklist above is by no means complete. The objective of the model is clearly paramount and the checklist must be structured so that the model is evaluated in that context. Some further considerations, often important in evaluating models for actuarial applications, are set out below to help determine whether the model is fit for purpose.

Further evaluation checklist for actuarial models
1) Consider the short-run and long-run properties of the model.
   i. Are the coded relationships stable over time?
   ii. Should we factor in relationships that are second order in the short term but manifest over the long term?
2) Analyse the output.
   i. Generally by statistical sampling techniques, but beware: the observations are, in general, correlated, so the IID assumption is not, in general, valid.
   ii. Use failure in a Turing-type (or Working) test to improve the model.
3) Sensitivity testing.
   i. Check that small changes to inputs produce small changes to outputs. Check that results are robust to the statistical distribution of the inputs.
   ii. Explore and, perhaps, expand on the key sensitivities in the model.
   iii. Use optimistic, best estimate, and pessimistic assumptions.
4) Communication and documentation of results.
   i. Take account of the knowledge and background of the audience.
   ii. Build confidence in the model so that it is seen as a useful tool.
   iii. Outline the limitations of the model.

For actuarial models in particular, we must ask a few more focused questions to ensure that the model is acceptable to inform our professional advice.

Checklist for evaluation of an actuarial model
1. The model must be relevant to the exercise at hand, produce outputs that are credible, and be adequately documented. This is the minimum that can be expected.
2. The model should shed light on the risk profile of the process modelled (e.g., financial product, scheme or contract design).
3. All factors that could significantly affect the advice being given should be incorporated in the model or the modelling exercise: any financial drivers (risk discount rate, statutory reserves, etc.) and reasonable variation in the parameters.
4. The estimated parameter values of the model should reflect the business being modelled and the economic and business environment. This means the peculiarities of the product: size of premium, early lapse rates, presence of options or guarantees, mortality and morbidity experience given the population selected and the method of selection, etc.
5. The parameters in the model should be self-consistent.


So inflation, return on assets, risk discount rate, lapse rates, and escalation of expenses should all be mutually consistent.
6. The outputs of the model should appear reasonable:
   - reproduce historic episodes;
   - be capable of independent verification/peer review;
   - allow the key results to be communicated to the client.
7. Subject to all the above, the simplest model is the best, as it is cheaper to develop and run, and easier to interpret and communicate.

"Everything should be made as simple as possible, but not simpler." Albert Einstein

Building a Model
"It is clear, then, that the idea of a fixed method [in building scientific models], or a fixed theory of rationality, rests on too naive a view of man and his social surroundings... it will become clear that there is only one principle that can be defended under all circumstances and in all stages of human development. It is the principle: anything goes." Paul Feyerabend (1993), Against Method (3rd Edition), Verso Press, London, pp. 18-19.

Feyerabend, the philosopher of science, claims that there is no unique method of building scientific models: anything goes. Accordingly, it is not possible to give a formulaic way of building a model, but we can give helpful hints. One overall tip is to have modest ambitions. It is difficult and time-consuming to come up with a half-way decent model even for a relatively straightforward system. The econophysicist Bertrand Roehner classified the complexity of models by the number of distinct objects simultaneously modelled and the nature of their interactions. He concludes that science has had notable successes in modelling phenomena of first and second level complexity but has, as yet, no adequate model of level 3 or higher.

Orders of Complexity in Modelling
Level 1: two-body problem, e.g., gravity, light through a prism.
Level 2: N identical bodies with local interactions, e.g., Maxwell-Boltzmann thermodynamics, the Ising model of ferromagnetism.
Level 3: N identical bodies with long-range interactions, e.g., modelling markets.
Level 4: N non-identical bodies with multiple interactions, e.g., modelling economic systems generally, general actuarial modelling.
The history of science gives us no example of a complex problem of Level 3 or 4 being adequately modelled.
(Adapted from Roehner, B.M. (2002), Patterns of Speculation: A Study in Observational Econophysics, Cambridge University Press)


As all actuarial applications are level 4 in the above ordering, the history of science underlines the need to be modest about the aims of the model and the accuracy of the forecasts from the model. In particular, actuaries must not be too dismissive of scientific models for their failure to faithfully reproduce the complexity of reality, but should adapt their insights to enrich their own modelling (see, for instance, Whelan (2006) for a discussion of the historical tension between pragmatic actuaries and theoretical financial economists).

Typically, an actuarial model will be a meta-model, that is, a model incorporating many smaller models. For instance, a life table is a model of mortality but can be used as an input into product pricing or reserving for term assurance. The level 4 complexity of problems where actuarial advice is sought requires that the actuary has many models as part of their tool-kit, with an appreciation of their limitations, from which to build more complex models. So, depending on the problem, the actuary may buy and adapt an existing commercial modelling product (e.g., Prophet), develop their own from scratch, or some combination of the two.

Example 6: Indicate how you would turn a deterministic life table into a stochastic mortality model (a minimal sketch is given after the 10-step guide below). Would you use a deterministic or stochastic model for mortality when pricing and reserving for short-term assurances on a small group of lives? Would your answer differ if the life assurance was only a financially small part of a savings product?

We can give further and more pragmatic advice to help build a model. A key constraint in building models in a business context is that they must be completed on time and within budget. The following ten steps set out a logical and practical approach to building models in such a context. Note that typically one cycles between the steps a few times before completing the modelling exercise.

The 10 Step Guide to Building a Model
1. Set well-defined objectives for the model.
2. Plan how the model is to be validated,
   o i.e., the diagnostic tests to ensure it meets the objectives.
3. Define the essence of the structural model, the first-order approximation. Refinement, if necessary, can come later.
4. Collect and analyse the data to estimate the parameters in the model.
5. Involve experts on the real-world system to get feedback on the conceptual model,
   o a Turing-type test (explained in lectures).
6. Decide how to implement the model,
   o e.g., C, Excel, or some statistical package;
   o often a random number generator is needed.
7. Write and debug the program.
8. Test the reasonableness of the output from the model,
   o and otherwise analyse the output.
9. Test the sensitivity of the output to the input parameters.
   o We do not want a chaotic system (explained in lectures) in actuarial applications.
10. Communicate and document the results and the model.
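Relating to Example 6, one simple way to turn a deterministic mortality basis into a stochastic mortality model is to treat the number of deaths each year as a Binomial(n, q_x) random variable rather than using the expected value n·q_x. The sketch below does this for a small group; the q_x values are invented for illustration and are not taken from any real table.

```python
import numpy as np

def simulate_total_deaths(lives=100, qx=(0.002, 0.003, 0.004, 0.005, 0.006),
                          n_sims=10_000, seed=7):
    """Simulate total deaths over the term of a short-term assurance on a group.

    The deterministic life-table calculation would use the expected deaths
    n * q_x each year; here deaths each year are Binomial(n, q_x), giving a
    full distribution of claim numbers rather than a single figure."""
    rng = np.random.default_rng(seed)
    totals = np.empty(n_sims, dtype=int)
    for sim in range(n_sims):
        alive, deaths = lives, 0
        for q in qx:
            d = rng.binomial(alive, q)
            deaths += d
            alive -= d
        totals[sim] = deaths
    return totals

deaths = simulate_total_deaths()
print(deaths.mean(), np.percentile(deaths, 95))  # mean near the deterministic figure; the tail is much higher
```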


Finally, the model needs to be constantly updated in the light of new data and other changes. This might be regarded as the 11th point above: monitor changes and update the model in the light of them.

Remember the model is merely an aid to formulating and perhaps communicating actuarial advice. It is a good model if it helps achieve this aim. The model output only partially determines the advice: it is the actuary who interprets the output, with a knowledge of its limitations and in the light of other information that was not modelled, to form an opinion.

The step-by-step guide makes modelling seem a routine activity. It is anything but: it requires insight, diligence, and patience. The mathematician turned modeller Bernard Beauzamy gives an excellent summary of the challenges facing a modeller in his short essay, "Real Life Mathematics", Irish Math. Soc. Bulletin 48 (Summer 2002), 43-46. Some quotes from his essay are given below to lend colour and context to our earlier guide.

"It is always our duty to put the problem in mathematical terms, and this part of the work represents often one half of the total work..."

"My concern is, primarily, to find people who are able and willing to discuss with our clients, trying to understand what they mean and what they want. This requires diplomacy, persistence, sense of contact, and many other human qualities."

"Since our problem is real life, it never fits with the existing academic tools, so we have to create our own tools. The primary concern for these new tools is the robustness."

"...real life mathematics do not require distinguished mathematicians. On the contrary, it requires barbarians: people willing to fight, to conquer, to build, to understand, with no predetermined idea about which tool should be used."

Advantages of Modelling
Modelling can claim all the advantages of the scientific programme: a logical, critical, and evidence-based study of a phenomenon that builds, often incrementally, into a body of knowledge. Models offer a structured framework to update our knowledge, as a model is only superseded when a better one comes along. Models help us in our study of complex systems, including stochastic systems that are otherwise not tractable mathematically (in closed form). Models allow the consequences of different policy actions to be assessed in a shorter timeframe and at less expense than alternative methods of assessment.

However, the modelling exercise has pitfalls that must be guarded against. A checklist of such drawbacks is given below.

Checklist of Pitfalls in Modelling
1. Model building requires a considerable investment of time and expertise, so it is not free.


2. Models are often time-consuming to use: many simulations are needed and the results must be analysed.
3. It is sometimes difficult to interpret the output from the model.
4. The model is only as good as the parameter inputs, so we need to critically assess the quality and credibility of the data.
5. Models are often not especially good at optimising outputs but are better at comparing the results of input variations.
6. Impressive-looking models (especially complex ones) can lead to over-confidence in the model.
7. We must understand the limitations of the model to apply it properly.
8. We must recognise that a model may become obsolete with a change in circumstances.

Computers & Modelling


The computer is revolutionising modelling, and in particular actuarial modelling. Hickman & Heacox (1999) give an excellent overview of how computers shaped actuarial science and actuarial practice in the couple of decades following World War II.

The first generation of electronic business computers (say, the UNIVAC computer of 1951, the IBM 650 in 1955 or the earlier IBM 702/705 machines) were used as calculators, performing repetitive calculations. The first of these UNIVAC machines was bought by the Census Bureau, the second by A.C. Nielsen Market Research and the third, for actuarial purposes, by the Prudential Insurance Company. By the end of the 1950s the life assurance industry was further advanced in re-engineering its businesses to harness the potential of computers than any other industry in the US (and therefore the world). The first applications of electronic computers to actuarial work were to do similar calculations as had been done manually, but now faster, with less approximation, and with less chance of error. The computers, though extraordinarily limited and tedious to use compared to modern machines, made more practical the building of stochastic models, explored using simulation. An early example of such an analysis was Boermeester's paper of 1956 estimating the distribution of the costs of an annuity. While simulations had been done previously (see Chapter 3), such computing power made the technique more practical. Simulation was to develop as an important technique in modelling from this time, and actuaries, such as Phelim Boyle, have played an important role in its development and dissemination.

The second generation of computers (say, the IBM 360 series) were used not only as calculators but also as real-time databases, for airline reservations processing and inventory control. Once again, actuaries were quick to exploit their new capabilities, by semi-automating the back office of insurance companies and using them to help in valuation work. In fact, from the early 1950s it was apparent that the advent of such computers necessitated the re-engineering of life office organisations, and actuaries played a key role in planning and managing the changes.

Subsequent generations of computers have developed even greater uses. Significantly, they now aid in modelling of all kinds, from the design of cars and airplanes to designing the next generations of computer chips. J. Bradford DeLong noted


the general use of computers in modelling and, in particular, the spreadsheet program as a general modelling (or what-if) tool:

"The value of this use as a what-if machine took most computer scientists and computer manufacturers by surprise... nobody before Dan Bricklin programmed Visicalc had any idea of the utility of a spreadsheet program... Indeed, the computerization of America's white-collar offices in the 1980s was largely driven by the spreadsheet program's utility: first Visicalc, then Lotus 1-2-3, and finally Microsoft Excel." DeLong, J.B. (2002), Productivity Growth in the 2000s. Working Paper (Draft 1.2), University of California at Berkeley and NBER, pp. 35-36.

Again, such increased utility has been exploited in actuarial applications by, for instance, simulating the future profitability, and the associated risks, of life offices (the so-called model life office). This latter stage also saw the introduction of new assurance products, such as unit-linked policies from the early 1970s and flexible universal life policies from the late 1970s. The administration of such products, with their embedded options and choices, was brought down by the computer to a cost customers were willing to pay. Rieder (1948), quoted in Hickman & Heacox (1999), had earlier foreseen this evolution:

"If the new electronic machinery, with its tremendous computing and memory capacity, had been available from the outset, we might have developed life contracts and procedures along entirely different lines... it might have been possible to design one policy which would have been flexible enough to meet every policyholder's insurance needs for the rest of his lifetime."

The impetus to change is not slowing, as computer speed rises and memory costs continue to fall exponentially. Actuarial science, the science of the possible in risk protection, has been revitalised by the possibilities, both in the possibility of modelling risks previously intractable and in developing products to transfer them efficiently. Stochastic modelling, made feasible by computing speed, has allowed us to price investment guarantees and to contemplate even modelling the complete risks faced by a company, so-called enterprise risk. The computer, as cost-efficient record-keeper, has allowed transaction sizes to fall and numbers to increase, enabling ever greater volumes of risk to transfer to traditional intermediary institutions or to the capital markets. The change is a revolution in financial services generally and is far from complete. Perhaps, with all the new developments, actuarial practice must take leave of some traditional products, such as the with-profits policy and the defined benefit pension promise, which appear increasingly anachronistic in the brave new world.


Macro-Econometric Modelling: A Case Study

[This section is largely based on "Economic Models and Policy-Making", Bank of England Quarterly Bulletin, May 1997.]

A case study can give focus to some of the above comments. Many actuarial models require as inputs some estimate of future general economic growth, of inflation, and of interest rates. Macro-economic models, or more accurately macro-econometric models, provide forecasts of these variables, so it is useful to evaluate the reliability of these forecasts. Moreover, the history of developing macro-econometric models provides us with valuable lessons that are pertinent to any model-building exercise. Accordingly, we take as a case study the development of macro-econometric models in the UK.

Macro-economics and Macro-econometrics
Macro-economics and macro-econometrics study the same thing. The subject matter for both is the modelling of aggregate measures for the whole economy, such as: inflation and monetary growth, economic growth, capital investment, unemployment, trade, interest rates (both long and short), levels of stock markets, and exchange rates. Macro-economics contents itself with describing the structural form of the relationships between the variables (see earlier), while macro-econometrics takes the structural form and completes the model by estimating all unknown parameters and the variance of any error terms from historic economic data. Clearly, macro-econometric models are more useful to policymakers (and actuaries).

So how good is macro-economics at describing the form of the relationships between macro-economic variables? Then, building on this, how good are the calibrated econometric models? The answer to both questions is the same: pretty much awful. The four decades that macro-econometrics has been informing economic policy decisions have seen a constant retreat into uncertainty. With so many and such big mistakes made in its brief history, the development of macro-econometrics provides us with a great many lessons.

Model Building
It all began in the 1960s, and the fad was most acute in the UK. The macro-economists thought they had cracked how the economy worked: they thought they had a reliable model of the whole economy. Macro-economists were high on their enthusiasm and convinced governments to spend heavily on building complex models of the economy that could answer all questions. The logic was that, as one thing in an economy is connected to everything else, either one models everything or nothing. Below we graph how two key macro-economic variables in the UK economy evolved from shortly after WWII to the late 1960s.


UK Inflation and Real Economic Growth, 1948-1968.


[Figure: annual UK real GDP growth and UK inflation plotted over the period.]

Growth was the primary output variable to be maximised, the "target variable" as they called it. The input variables were those under the control of the government: short-term interest rates, taxation, public spending, etc. Inflation can be seen as an unintended output and, if it rises, it breaks down the order in the system. In short, rising inflation is a key undesirable output.

Real GDP growth was healthily positive over the 1950s and much of the 1960s and inflation was benign. In short, it was a great time. Being slightly cynical, maybe the certainty that economists had in their modelling was connected to the extraordinary economic growth and well-being of the 1960s: everyone likes to take credit for a good thing. In fairness to the UK economists, there was enthusiasm worldwide that all the major problems in macro-economics were solved or nearly solved. There was a dominant economic doctrine, with few challengers. As Richard Nixon declared: "We are all Keynesians now." And we find legislation giving credence to the belief that economic growth could be controlled in the US, the UK, and even monetarist Germany with its 1967 Act to Promote Economic Stability & Growth.

By the early 1970s, four main models had been developed in the UK with generous support from public funds. These were at the:
- Bank of England
- Treasury
- NIESR (the National Institute of Economic and Social Research)
- London Business School

These were the original black boxes. They were a labyrinth of small equations, with the output of one forming the input of another: a total of between 500 and 1,000 equations. The models were calibrated and initialised to prevailing conditions. Now, if one of the inputs, say interest rates or taxation or government spending, was altered, the model would predict its effects, as the change ricochets through all the equations to ultimately


impact on everything else. The key output variables monitored at that time were economic growth, employment and inflation.

The Models in Practice
All four models were fundamentally the same. Each was based on the dominant economic theory of the time: Keynesianism. But just because it was popular did not make it right. Extending the earlier graph of the two key macro-economic outputs up to 1980 shows that the trajectory taken by the UK economy was diametrically opposite to that intended.

UK Inflation and Real Economic Growth, 1948-1980.


[Figure: annual UK real GDP growth and UK inflation plotted from 1948 to 1980.]

The graph above shows inflation taking off into double digits while economic growth crosses the x-axis twice, indicating two recessions in the 1970s. What went wrong?

Errors in Modelling
There are three generic sorts of errors associated with any modelling exercise. They are, in ascending order of importance (a toy numerical illustration follows the list):
- General uncertainty in the model, represented by the error term in the model. This arises because the model does not model everything relevant, just the main drivers; the effect of the rest is gathered in this term.
- Parameter mis-estimation: the form of the model is right but the parameters are not. This can lead to the modelled output being systematically above or below its true value.
- Model misspecification: the form of the model is wrong, so that what is observed in practice might not be anticipated at all by the model.
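A toy numerical illustration of the three error types may help fix ideas (every number here is invented): data are generated from a mildly non-linear process with noise; one forecast has the right form but a mis-estimated slope, the other omits the non-linear term altogether and so drifts increasingly far from the truth.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(20)
true_level = 2.0 + 0.5 * t + 0.02 * t**2            # the "real" process
observed = true_level + rng.normal(0, 1.0, t.size)   # 1. general uncertainty (error term)

mis_estimated = 2.0 + 0.4 * t + 0.02 * t**2          # 2. right form, wrong slope parameter
mis_specified = 2.0 + 0.7 * t                        # 3. wrong form (quadratic term omitted)

for name, forecast in (("parameter mis-estimation", mis_estimated),
                       ("model misspecification", mis_specified)):
    rmse = np.sqrt(np.mean((observed - forecast) ** 2))
    print(f"{name}: RMSE against observations = {rmse:.2f}")
```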

The macro-econometric models made all three errors. First, captured in the general uncertainty in the model, was the oil price shock, which pushed the models to extremes. This could be put down to bad luck. Second, Bretton Woods, the fixed exchange rate system, failed at the start of the 1970s, so that exchange rates became floating. With no data with which to


model floating exchange rates, the models simply ignored them. This might be regarded as parameter mis-estimation. Finally, all the models suffered the most serious defect of all: model misspecification. The 1974-1975 stagflation, that is, inflation rising at the same time that economic growth falters, simply was not thought possible under Keynesian economics. Remember, macro-economic theory should give the right structure of the model; here, though, something happened that was not thought possible in theory. There was model failure almost as soon as the models had begun to be used to inform policy decisions.

The Finger-pointing
The models could be, and were, adjusted, and some new models came along (e.g., Liverpool's new classical model and the City University Business School model). But the models were getting even more complicated and it was increasingly difficult to know how to fix them when they went wrong. The models were not learning by experience and sometimes the outputs were just plain silly. We can see this best from some quotes.

"Treasury forecasters [in 1980] were predicting the worst economic downturn since the Great Slump of 1929-1931. Yet they expected no fall in inflation at all. This clearly was absurd and underlined the inadequacies of the model." Nigel Lawson, The View from No. 11.

The users could no longer explain why the model was producing the results it was, or why one model gave one result and another model a significantly different one. Soon everyone turned cold on the big macro-econometric models. They turned on the modellers: "Modelling was seen as a second-rate activity done by people who were not good enough to get proper academic jobs." "Earlier expectations of what models might achieve had evidently been set too high, with unrealistic claims about their reliability and scope." Quoted from "Economic Models and Policy-Making", Bank of England Quarterly Bulletin, May 1997, p. 165 and p. 164.

Lessons Learned
We can make two observations:
- Economists adopted a very optimistic view of their creations when selling the blueprints to the government agencies for financing and, thus committed, they could not judge the output impartially.
- There were large vested interests in the models by those who signed the cheques. Who was going to shout that the emperor had no clothes on?

These big monolithic models faded out of existence in the early 1980s. When Keynesianism was challenged by the monetarism of Milton Friedman, there was no longer a widespread consensus on the structural form of the models, so there were no uncontentious assumptions around which to build macro-econometric models. Yet policymakers, governments in the fiscal sphere and central banks in the monetary one, still require some idea of the impact of their policies. As Alan Blinder, a former Vice-Chairman of the Board of Governors of the Federal Reserve, remarked: "you can


get your information about the economy from admittedly fallible statistical relationships, or you can ask your uncle" (Blinder, A.S. (1999)).

However, lessons have been learned and the ambitions of macro-econometric modellers have been reduced. Models in use today satisfy four criteria:
- Models and their outputs can be explained in a way consistent with basic economic analysis.
- The judgement part of the process is made explicit.
- Models must be able to produce results consistent with historic economic episodes.
- Results must be consistent over time (e.g., parameters must not be sensitive to the period studied).

The above criteria tend to result in small-scale models. Smaller, more stylised models are now the order of the day, recognising the uncertainty inherent in the underlying economic structure, relating not just to parameter values but to structural changes too. Models are now seen as flexible friends. In fact, the most complicated model used in practice is probably the Dornbusch overshooting model, that is, an equation linking five key variables: growth, money, prices, exchange rates, and interest rates. Models should get no more elaborate than that and, in fact, the modeller should begin with just two or three parameters, introducing just enough parameters to reduce the error term in the model to a level acceptable for the purpose. The user of the model must then allow for the risk of model misspecification (or, as economists say to deflect blame from their modelling, "structural shifts in the economy"). No model can be used blindly when so much uncertainty surrounds how the underlying economy functions.

The Brave New World
So what has policy decision-making been like under this new, less ambitious way of modelling? Well, projecting forward from 1980, the time that Nigel Lawson was ridiculing the output of the old-style Treasury model, we see how the two key variables evolved in the UK.


UK Inflation and Real Economic Growth, 1948-1997.


[Figure: annual UK real GDP growth and UK inflation, now extended beyond 1980 into the 1990s.]

The graph shows that there was no great depression as the Treasury model predicted but, on the contrary, 1981 was pretty much the turning point, with inflation falling and growth picking up. And the picture has become brighter since the early 1990s, with strong growth and low inflation.

How much credit can the new style of modelling take for the strength of the UK economy? First, the modellers no longer confidently make terrible, unthinking blunders: they are more modest and more circumspect. This is the major breakthrough. Indeed, the new models are less sure that they are modelling cause-and-effect and can be seen, at one level, as a pithy shorthand of the past, simple summaries of past behaviour. Second, structural changes are still occurring in the economies, meaning that even the appropriate form of macro-economic model is changing. In fact, the US economy, the largest in the world, entered a new phase of growth in 1995/1996 when it extended its growth cycle without inflation rising, despite the unemployment rate falling below the 6% non-accelerating inflation rate of unemployment (NAIRU). Almost a decade later the debate is still raging as to what is going on, with the equity market taking a very positive view and then, dramatically (as is the way of the market), a not-so-positive view. Some economists conjecture that the ubiquitous use of computers in modelling, at the micro and macro level, has helped to shift sustainable economic growth in the developed economies higher.

Lessons for Modellers from this Case Study
So what are the lessons for other modellers from the history of econometric modelling? They can be summarised simply:
- Have limited ambitions for models.
- Build disposable models.
- Small, stylised, parsimonious models are beautiful.


Concluding Thoughts
Modelling obviously requires a sound knowledge of the process to be modelled, but it also requires something of the art of the modeller. The good modeller will have the insight to know what is possible, a toolbox of robust methods, and the experience of managing projects and teams to deliver, on time and within budget, a model that is fit for purpose.

Actuaries have deep but narrow expertise in building models for actuarial applications. However, there is reason to believe that this modelling experience can be applied in broader fields. Hickman & Heacox (1998) recount how the modelling expertise of American actuaries was called upon during World War II. General-purpose modellers were required in the war effort to model anything from the optimum depth at which to explode an underwater bomb, to devising optimum aircraft search strategies against enemy submarines, to calibrating radar systems. Actuaries were recruited to the original Anti-Submarine Warfare Operations Research Group, which, on demonstrable success, widened its scope to all US Navy operations when it became the Operations Research Group (ORG), and then the Operations Evaluation Group (OEG). This group comprised some 80 scientists (many very distinguished, such as the Nobel Prize winner William Shockley), of whom no fewer than 18 were actuaries. This was the origin of the discipline of Operations Research, defined as the application of the scientific method to providing decision-makers with a quantitative basis for decisions on operations under their control. Donald Cody, one of the actuaries with the original group, suggests that the modeller must pay "unswerving attention to the objectives" and "must treat truly important problems with the simplest available techniques and not seek any old problems which enable use of fancy techniques" (op. cit., p. 8).

We conclude with a final remark on models from one of the twentieth century's most insightful modellers:

"One thing I have learned in a long life: that all our science, measured against reality, is primitive and childlike and yet it is the most precious thing we have."

Albert Einstein (1879-1955)


Further Reading
Beauzamy, B. (2002), "Real Life Mathematics", Irish Math. Soc. Bulletin 48 (Summer), 43-46. Currently available on the web at: www.maths.tcd.ie/pub/ims/bull48/M4801.pdf
Blinder, A.S. (1999), Central Banking in Theory and Practice, MIT Press.
Bowie, D. et al. (1996), "Models, Useful Models, and Reality", The Actuary (UK), December 1996, 27-28.
DeLong, J.B. (2002), "Productivity Growth in the 2000s", Working Paper (Draft 1.2), University of California at Berkeley and NBER.
Faculty/Institute of Actuaries, Core Reading for Subject 103: Stochastic Modelling, Chapter 1. Edinburgh, London & Oxford.
Faculty/Institute of Actuaries, Core Reading for CA1: Unit 12, Modelling. Edinburgh, London & Oxford.
Feyerabend, P. (1993), Against Method (3rd Edition), Verso Press, London.
Hickman, J.C. (1997), "Introduction to Actuarial Modeling", North American Actuarial Journal, 1, 3, 1-5.
Hickman, J.C. & Heacox, L. (1998), "Actuaries in History: The Wartime Birth of Operations Research", North American Actuarial Journal, 2, 4 (Oct.), 1-10.
Hickman, J.C. & Heacox, L. (1999), "Actuaries at the Dawn of the Computer Age", North American Actuarial Journal, 3, 3, 1-13.
Roehner, B.M. (2002), Patterns of Speculation: A Study in Observational Econophysics, Cambridge University Press.
Whelan, S. (2006), "Not Such a Great Controversy: Actuarial Science & Financial Economics", The Actuary (US), Society of Actuaries, December 2006/January 2007.
Whitelock-Jones, A. (2003), "Chapter 8: Modelling", pp. 153-172, in Bellis, C., Shepherd, J. & Lyon, R. (eds), Understanding Actuarial Management: the actuarial control cycle, Institute of Actuaries of Australia, 462 pp.
Whitley, J. (1997), "Economic Models and Policy-Making", Bank of England Quarterly Bulletin, May 1997.

Shane Whelan 2010
