
# Forecasting Techniques
What is Forecasting?
•  Process of predicting a future event
•  Underlying basis of all business decisions:
   -  Production
   -  Inventory
   -  Personnel
   -  Facilities
FORECAST:
•  A statement about the future value of a variable of interest, such as demand.
•  Forecasting is used to make informed decisions.
   -  Long-range
   -  Short-range

Forecasts
•  Forecasts affect decisions and activities throughout an organization:
   -  Accounting, finance
   -  Human resources
   -  Marketing
   -  MIS
   -  Operations
   -  Product / service design

Uses of Forecasts
•  Accounting: cost/profit estimates
•  Product/service design: new products and services

Features of Forecasts
•  Assumes a causal system: past ==> future
•  Forecasts are rarely perfect, because of randomness
•  Forecasts are more accurate for groups than for individuals
•  Forecast accuracy decreases as the time horizon increases

Elements of a Good Forecast
•  Timely
•  Reliable
•  Accurate
•  Written

Steps in the Forecasting Process
Step 1  Determine the purpose of the forecast
Step 2  Establish a time horizon
Step 3  Select a forecasting technique
Step 4  Obtain, clean, and analyze data
Step 5  Make the forecast ("the forecast")
Step 6  Monitor the forecast

The basic forecasting methods are:

1.  Regression and Correlation
2.  Learning Curve
3.  Time series methods, which look only at the historical pattern of one variable and generate a forecast by extrapolating the pattern:
    •  Simple Moving Average
    •  Weighted Moving Average
    •  Exponential Smoothing
4.  Expected Value (Probability)
5.  Sensitivity Analysis
The Nature of Forecasting
•  Involves the future
•  Involves uncertainty
•  Relies on history
•  Accuracy? (usually less than desired)
•  Revise as conditions change
•  Plan to cover deviations from the forecast

Regression Analysis
A statistical procedure used to find relationships among a set of variables.

In regression analysis there is a dependent variable, which is the one you are trying to explain, and one or more independent variables that are related to it.

You can express the relationship as a linear equation, such as:

y = a + bx

In the equation y = a + bx:
•  y is the dependent variable
•  a is a constant (the intercept)
•  b is the slope of the line (the coefficient of the independent variable)
•  x is the independent variable
•  For every increase of 1 in x, y changes by an amount equal to b
•  Some relationships are perfectly linear and fit this equation exactly.
Your cell phone bill, for instance, may be:

Total Charges = Base Fee + 30 fils × (overage minutes)

If you know the base fee and the number of overage minutes, you can predict the total charges exactly.
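This perfectly linear relationship can be sketched as a one-line function. Python is used here purely for illustration; the 5,000-fils base fee in the example call is a made-up value, since only the 30-fils rate appears in the text.

```python
def total_charges(base_fee_fils: int, overage_minutes: int) -> int:
    """y = a + b*x: total bill = base fee + 30 fils per overage minute."""
    return base_fee_fils + 30 * overage_minutes

# With a hypothetical base fee of 5,000 fils and 10 overage minutes:
bill = total_charges(5_000, 10)   # 5,300 fils
```

Because the relationship is exactly linear, the prediction is exact: every extra overage minute adds exactly b = 30 fils.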

Correlation Coefficient, r:

The quantity r, called the linear correlation coefficient, measures the strength and the direction of a linear relationship between two variables.
The value of r is such that -1 ≤ r ≤ +1.

The + and - signs are used for positive linear correlations and negative linear correlations, respectively.
Positive correlation:
If x and y have a strong positive linear correlation, r is close to +1.

Negative correlation:
If x and y have a strong negative linear correlation, r is close to -1.

No correlation:
If there is no linear correlation or only a weak linear correlation, r is close to 0. A value near zero means there is no linear relationship between the two variables (any relationship that exists is random or nonlinear).

Relationship explained: r runs from +1 (perfect positive) through 0 (none) to -1 (perfect negative).
Coefficient of Determination, r² (or R²):

The coefficient of determination, r², is useful because it gives the proportion of the variance (fluctuation) of one variable that is predictable from the other variable.

The coefficient of determination represents the proportion of the variation in the data that is explained by the line of best fit.

For example, if r = 0.922, then r² = 0.850, which means that 85% of the total variation in y can be explained by the linear relationship between x and y (as described by the regression equation). The other 15% of the total variation in y remains unexplained.
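Both r and r² can be computed directly from their definitions. A minimal sketch, with invented sample data chosen to be nearly linear:

```python
import math

def correlation(xs, ys):
    """Pearson linear correlation coefficient r."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 8.0, 9.9]   # nearly perfectly linear, so r is close to +1
r = correlation(xs, ys)
r_squared = r ** 2               # share of the variation in y explained by x
```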

Learning Curves: introduction

The more we do a job, the quicker we do it …
(the man who made 6 million boxes!)
Learning Curves
The learning curve was first observed in the aircraft industry in the USA in the 1920s, where the number of direct labour hours needed to produce a unit of output was seen to decline as additional units were manufactured.

Of course, education and training are designed to improve people's learning curves!

Learning curves are usually described by a percentage (%) value. This value is the fraction of the measured value (Y) retained each time cumulative volume (X) doubles; equivalently, 1 minus the fractional drop per doubling. For example, a "100%" learning curve would be a flat, horizontal line for all X. A "90%" learning curve would drop 10% each time X doubled. An "80%" learning curve would drop 20% each time X doubled, and so on.

There is the story of the man in the USA whose job it has been for several years to fold pieces of cardboard into box shapes. We are led to believe that he has made 6 million boxes to date and STILL he is learning how to improve his technique!
Learning Curves

An 80% learning curve is usually assumed.

Cumulative average-time learning model
The cumulative average-time learning model assumes that each time the cumulative quantity of units produced doubles, there is a constant percentage decline in the average time per unit required for the entire (cumulative) amount produced.

If a plant is subject to an 80% learning curve, and the time required to build the first unit is 10 hours, then the total time required to build 2 units will be 10 hours × (2 × 0.80), or 16 hours. This works out to an average of 8 hours per unit; however, the first unit will have taken 10 hours and the second unit only 6 hours.

After 4 units have been built, the total time required to build 4 units will be 10 × (2 × 0.80)², or 25.6 hours, so the third and fourth units took a total of 9.6 hours, or only 4.8 hours each.
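The 10-hour / 80% example above reduces to doubling-point arithmetic; a minimal sketch:

```python
def cumulative_average(first_unit_hours: float, rate: float, doublings: int) -> float:
    """Average time per unit after `doublings` doublings of cumulative output:
    under the cumulative average-time model the average falls to `rate`
    times its previous value at each doubling."""
    return first_unit_hours * rate ** doublings

# 80% curve, first unit takes 10 hours:
avg_2_units = cumulative_average(10, 0.80, 1)   # 8.0 hours per unit
total_2 = avg_2_units * 2                       # 16.0 hours in total
avg_4_units = cumulative_average(10, 0.80, 2)   # 6.4 hours per unit
total_4 = avg_4_units * 4                       # 25.6 hours in total
units_3_and_4 = total_4 - total_2               # 9.6 hours, i.e. 4.8 hours each
```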
Incremental unit-time learning model
The incremental unit-time learning model assumes that each time the cumulative quantity of units produced doubles, the time needed to produce the last unit (the incremental unit time) declines by a constant percentage.

If the learning curve is 80% and the time required to build the first unit is 10 hours, then the time required to manufacture the second unit will be 0.80 × 10 = 8 hours. Thus, the total time required to produce the two units will be:

10 + 8 = 18 hours

The average time per unit will be 18 / 2 = 9 hours.
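The incremental model can be sketched the same way. Under it, the time for the x-th unit is t(x) = a·x^b with b = log(rate)/log 2, so the unit time falls by the learning-curve percentage at each doubling of x:

```python
import math

def unit_time(first_unit_hours: float, rate: float, unit: int) -> float:
    """Time for the x-th unit under the incremental unit-time model:
    t(x) = a * x**b, where b = log(rate) / log(2)."""
    b = math.log(rate) / math.log(2)
    return first_unit_hours * unit ** b

# 80% curve, first unit takes 10 hours:
t2 = unit_time(10, 0.80, 2)             # 8.0 hours for the second unit
total = unit_time(10, 0.80, 1) + t2     # 10 + 8 = 18 hours
average = total / 2                     # 9.0 hours per unit
```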

Learning Curves

80% Learning Rate
Batches   Total Time      CAT
1             50.00     50.00
2             80.00     40.00
4            128.00     32.00
8            204.80     25.60
…              …          …
128        1,342.18     10.49
Learning  Curves

No Learning
Batches Total Time CAT
1 50.00 50.00
2 100.00 50.00
4 200.00 50.00
8 400.00 50.00
… … …
128 6,400.00 50.00
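The doubling-point tables above can be generated mechanically. A short sketch that reproduces the 80% table; passing rate = 1.0 gives the no-learning table:

```python
def cat_table(first_batch_hours, rate, doublings):
    """Rows of (batches, total time, CAT) at each doubling point,
    under the cumulative average-time model."""
    rows = []
    for d in range(doublings + 1):
        batches = 2 ** d
        cat = first_batch_hours * rate ** d   # cumulative average time per batch
        rows.append((batches, round(cat * batches, 2), round(cat, 2)))
    return rows

learning = cat_table(50, 0.80, 7)      # last row: (128, 1342.18, 10.49)
no_learning = cat_table(50, 1.00, 7)   # last row: (128, 6400.0, 50.0)
```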
Learning Curves

Implications of learning taking place, as opposed to no learning taking place …
Learning Curves:
Suppose the labour rate of pay is $1/hour.
80% Learning Rate
Batches Total Time Labour Cost
1 50.00 50.00
2 80.00 80.00
4 128.00 128.00
8 204.80 204.80
… … …
128 1,342.18 1,342.18
Learning  Curves:

No Learning
Batches Total Time Labour Cost
1 50.00 50.00
2 100.00 100.00
4 200.00 200.00
8 400.00 400.00
… … …
128 6,400.00 6,400.00
Learning Curves

Imagine now that each batch is made up of 5 units …
Learning Curves

Average Cost/unit
Units   80% Learning   No Learning
5          10.00          10.00
10          8.00          10.00
20          6.40          10.00
40          5.12          10.00
…            …              …
640         2.10          10.00
Learning Curves

[Charts: cumulative average time (CAT) against batch number (0-600) for 80%, 90%, and 60% learning curves. The lower the learning-curve percentage, the more steeply CAT falls: the 90% curve declines most gently, the 60% curve most steeply.]
Learning Curves: formulae

It is easy to calculate the CAT at the doubling points 1, 2, 4, 8, 16 … 128 …, but difficult anywhere in between.

Now it gets mathematical!

With the learning-curve formula we can calculate or predict the CAT for any number of batches X:

CAT = a × X^b,   where b = log(learning rate) / log 2

eg  a = 50 hours
    X = 128 batches
    learning rate = 80%  (b = log 0.8 / log 2 ≈ -0.3219)
Learning Curves: formulae

CAT = 50 × 128^(-0.3219) = 10.48576 hours

So total time = 10.48576 × 128 = 1,342.18 hours

Learning Curves: exercise

eg  a = 100 hours
    X = 64 batches
    learning rate = 80%
Learning Curves: solution

CAT = 100 × 64^(-0.3219) = 26.2144 hours

So total time = 26.2144 × 64 = 1,677.72 hours

Learning Curves: exercise
Try again …

eg  a = 60 hours
    X = 32 batches
    learning rate = 90%
Learning Curves: solution

CAT = 60 × 32^(log 0.9 / log 2) = 35.4294 hours

So total time = 35.4294 × 32 = 1,133.74 hours
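All three worked examples follow from CAT = a·X^b with b = log(rate)/log 2; a quick check in code:

```python
import math

def total_time(a: float, batches: int, rate: float) -> float:
    """Total time = CAT * X, where CAT = a * X**b and b = log(rate)/log(2)."""
    b = math.log(rate) / math.log(2)
    return a * batches ** b * batches

t1 = total_time(50, 128, 0.80)   # ~1342.18 hours
t2 = total_time(100, 64, 0.80)   # ~1677.72 hours
t3 = total_time(60, 32, 0.90)    # ~1133.74 hours
```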

Time Series
An ordered sequence of values of a variable at equally spaced time intervals.

Time series analysis is used for many applications, such as:
•  Economic forecasting
•  Sales forecasting
•  Budgetary analysis
•  Stock market analysis
•  Inventory studies
•  Census analysis

The four patterns a time series can take are:

Time Series Forecasts
•  Trend - long-term movement in data
•  Seasonality - short-term regular variations in data
•  Cycle - wavelike variations of more than one year's duration
•  Irregular variations - caused by unusual circumstances
•  Random variations - caused by chance
Forecast Variations

[Chart: a demand series over time illustrating trend, cycles, seasonal variations, and irregular variation.]
1)  Trend - a gradual shifting to a higher or lower level.
    Example: long-term sales growth.

2)  Cyclical - data fluctuate greatly from year to year due to cyclical factors, such as the cyclical nature of the economy.

3)  Seasonal - fluctuations in a time series due to seasonal influences; these can take place within any time period that is less than one year in length. Even fluctuations within a day are considered seasonal (a within-the-day seasonal component).

4)  Irregular - fluctuations that are caused by short-term, nonrecurring factors. The irregular component's impact on a time series cannot be predicted.
Time Series Methods
•  Naive - just use last period's number, or last period's number plus or minus a percentage or fixed amount
•  Example: 2011 room sales were $150,000
•  The forecast for 2012 room sales is made using the 2011 figure plus an anticipated 10% increase in sales:
   $150,000 × (1.1) = $165,000
Time Series Pattern: Stationary
•  The result of many influences that act independently so as to yield nonsystematic, non-repeating variation around an average value.
•  Forecasting methods: naive, moving average, exponential smoothing
Time Series Pattern: Trend
•  Represents a general increase or decrease in a time series over several consecutive periods (some sources require six to seven or more periods).
•  Forecasting methods: linear trend projection, exponential smoothing with trend, etc.
Time Series Pattern: Seasonal
•  Seasonal patterns are periodic and recurrent (usually on a quarterly, monthly, or annual basis).
•  Forecasting methods: exponential smoothing with trend and seasonality, time series decomposition, etc.
Time Series Pattern: Cyclical
•  The result of economic expansions (periods of growing demand) and contractions (recessions and depressions), which usually repeat every two to five years. Cyclical influences are difficult to forecast because cyclical demands are recurrent but not periodic (they happen at different intervals of time, with great variability of demands).
•  Forecasting methods: time series decomposition, multiple regression
What is a Time Series?
•  A set of evenly spaced numerical data
   -  Obtained by observing a response variable at regular time periods
•  Forecast based only on past values
   -  Assumes that factors influencing past and present will continue to influence the future
•  Example
   Year:  1993  1994  1995  1996  1997
   Sales: 78.7  63.5  89.7  93.2  92.1

Moving Average Method
•  MA is a series of arithmetic means
•  Used if there is little or no trend and no seasonal or cyclical pattern
•  Often used for smoothing
   -  Provides an overall impression of the data over time
•  Equation:

   MA = (Demand in previous n periods) / n
Moving Average Example
You are the manager of a museum store that sells historical replicas. You want to forecast sales of item 123 for 2011 using a 3-period moving average.

Year    Sales
2006      4
2007      6
2008      5
2009      3
2010      7
Moving Average Solution

Time   Response Yi   Moving Total (n=3)   Moving Average (n=3)
2006        4              NA                   NA
2007        6              NA                   NA
2008        5              NA                   NA
2009        3           4+6+5=15             15/3 = 5.0
2010        7           6+5+3=14             14/3 ≈ 4.7
2011       NA           5+3+7=15             15/3 = 5.0
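The museum-store table reduces to a sliding mean; a minimal sketch of the 3-period moving-average forecast:

```python
def moving_average_forecasts(series, n=3):
    """Forecast for each period = mean of the previous n observations
    (None while there is not yet enough history)."""
    return [sum(series[t - n:t]) / n if t >= n else None
            for t in range(len(series) + 1)]

sales = [4, 6, 5, 3, 7]          # 2006-2010
f = moving_average_forecasts(sales)
# f[3] = 5.0, f[4] ≈ 4.7, and f[5] = 5.0 is the forecast for 2011
```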
Weighted Moving Average Method
•  Used when a trend is present
   -  Older data are usually less important
•  Weights are based on intuition
   -  They often lie between 0 and 1, and sum to 1.0
•  Equation:

   WMA = Σ(Weight for period n)(Demand in period n) / Σ Weights
Average Methods
•  Increasing n makes the forecast less sensitive to changes
•  Do not forecast trend well, due to the delay between the actual outcome and the forecast
•  Difficult to trace seasonal and cyclical patterns
•  Require much historical data
•  A weighted MA may perform better
Time Series Methods
•  Moving Averages - a better approach!
   -  Take into account the past n periods and remove randomness (unanticipated events) by averaging or "smoothing"

   Moving Avg. = (Activity in previous n periods) / n

•  Examples of n-week moving averages:
•  Consider the last 3 periods
Time Series Methods

Moving Avg. = (Activity in previous n periods) / n

•  Forecast demand for meals during week 13:

   3-week Moving Avg. = (1,025 + 1,000 + 1,050) / 3 = 1,025 meals
   (the forecast for week 13)
Moving Average Method
-  Better than the simple naive approach
-  Using more weeks "dampens out" any random variations that took place
-  Need to continually store/update historical data
-  Gives equal weight to each observation (i.e., past monthly room sales, or number of covers)
Weighted Moving Average
•  Used when a trend is present
•  Older data are usually less important
•  Weights are based on experience and intuition

   Weighted moving average = Σ(weight for period n)(demand in period n) / Σ weights
WMA

Weights: 60% on the most recent month, then 20%, 10%, 10% on the older months. Each month's forecast is the weighted sum of the previous four months' sales; SE is the error (forecast - actual) and %SE the percentage error.

Month   Sales      Forecast   SE     %SE
Sep     5,480.00
Oct     5,550.00
Nov     5,500.00
Dec     5,520.00
Jan     5,460.00   5,515.00    55    1.01
Feb     5,450.00   5,485.00    35    0.64
Mar     5,480.00   5,464.00   -16   -0.29
Apr     5,550.00
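A sketch of the weighted-moving-average forecasts behind the table above. The assignment of the 60/20/10/10% weights to months (heaviest on the most recent) is inferred from the partial products shown in the original table (e.g. 0.6 × 5,520 = 3,312):

```python
def wma(last_four, weights=(0.6, 0.2, 0.1, 0.1)):
    """Weighted moving average of the last four months;
    weights are listed most-recent month first."""
    return sum(w * x for w, x in zip(weights, reversed(last_four))) / sum(weights)

sales = [5480, 5550, 5500, 5520, 5460, 5450, 5480]   # Sep .. Mar
jan = wma(sales[0:4])   # 5515.0
feb = wma(sales[1:5])   # 5485.0
mar = wma(sales[2:6])   # 5464.0
```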

Exponential Smoothing
•  Uses only the last 2 periods of data

Exponential Smoothing
Accounts for forecasting errors and requires less data.

New forecast = (Last period's actual × smoothing factor) + (1 - α) × (Last period's forecast)

Ft = α Xt-1 + (1 - α) Ft-1

where  Ft   = new forecast
       Ft-1 = previous forecast
       α    = smoothing (or weighting) constant (0 ≤ α ≤ 1)
Exponential Smoothing Method
•  A form of weighted moving average
   -  Weights decline exponentially
   -  Most recent data weighted most
•  Requires a smoothing constant (α)
   -  Ranges from 0 to 1
•  Involves little record keeping of past data

Exponential Smoothing Solution
Ft = Ft-1 + α(At-1 - Ft-1),  with α = 0.10

Time   Actual   Forecast, Ft (α = 0.10)
1995    180     175.00 (given)
1996    168     175.00 + 0.10(180 - 175.00) = 175.50
1997    159     175.50 + 0.10(168 - 175.50) = 174.75
1998    175     174.75 + 0.10(159 - 174.75) = 173.18
1999    190     173.18 + 0.10(175 - 173.18) = 173.36
2000    NA      173.36 + 0.10(190 - 173.36) = 175.02
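The whole table can be generated by iterating the update Ft = Ft-1 + α(At-1 - Ft-1); a short sketch:

```python
def exponential_smoothing(actuals, alpha=0.10, initial=175.0):
    """One smoothed forecast per period, plus the forecast
    for the next (unseen) period."""
    forecasts = [initial]
    for a in actuals:
        forecasts.append(forecasts[-1] + alpha * (a - forecasts[-1]))
    return forecasts

f = exponential_smoothing([180, 168, 159, 175, 190])
# f[-1] ≈ 175.02, the forecast for 2000
```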

MGMT 6020 Forecast

Expected Value
Expected value is a means of associating a dollar amount with each of the possible outcomes of a probability distribution.
Expected Value
The expected value of perfect information is the difference between the expected payoff we could receive if we had perfect information about the future, and the expected payoff we can receive by using our best analytical decision-making skills, but without perfect information about the future.

What is the current value of a policy that has several possible outcomes?
Probabilities
•  Probability is the chance that a particular outcome will be realized
•  A certain outcome has a probability ρ = 1.0
•  An uncertain outcome has a probability less than 1.0, ρ < 1
•  If one outcome is uncertain, then there are at least two possible outcomes
•  The sum of the probabilities of all outcomes = 1 (Σρ = 1), i.e. one of them will occur
Probabilities of various outcomes

Project A
Outcome #1: NP = $10,000, ρ = 0.3
Outcome #2: NP = $50,000, ρ = 0.5
Outcome #3: NP = $70,000, ρ = 0.2
Expected value of a policy with uncertain outcomes
•  The expected value of a policy or project with various possible outcomes is calculated by multiplying the value of each outcome by its probability and then summing these values.

•  Generically:
   EV = O#1×ρ1 + O#2×ρ2 + O#3×ρ3 + … + O#n×ρn

   where O = outcome, ρ = probability

Expected value of a policy with uncertain outcomes

Project A
Outcome #1: $10,000 × 0.3  +  Outcome #2: $50,000 × 0.5  +  Outcome #3: $70,000 × 0.2

Expected value of Project A = $3,000 + $25,000 + $14,000 = $42,000
Sensitivity Analysis
•  Change the values of
   -  inputs
   -  parameters
•  Measure the changes in
   -  outputs
   -  performance indices
Sensitivity Analysis
Sensitivity analysis is a variety of techniques used to determine how an amount will change if one variable in the analysis is changed.

If a small change in the value of one of the inputs would cause a change in the recommended decision alternative, then the solution to the decision analysis problem is sensitive to that input, and we should take extra care to make sure that the value assigned to that input is as accurate as possible, because of the greater risk.
Sensitivity Analysis
When used with decision trees and expected values, different values are selected for the various probabilities and payoffs, and these are changed one at a time to see how the recommended decision alternative changes.
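One-at-a-time sensitivity on Project A's probabilities can be sketched as follows; the ±0.1 probability shifts are illustrative values, not from the text:

```python
def expected_value(outcomes):
    """EV = Σ payoff × probability."""
    return sum(payoff * p for payoff, p in outcomes)

def project_a_ev(delta=0.0):
    """Shift probability `delta` from outcome #3 to outcome #1,
    holding outcome #2 fixed, and recompute the EV."""
    return expected_value([(10_000, 0.3 + delta),
                           (50_000, 0.5),
                           (70_000, 0.2 - delta)])

base = project_a_ev(0.0)      # ≈ 42,000
lower = project_a_ev(+0.1)    # ≈ 36,000: the EV is sensitive to this probability
higher = project_a_ev(-0.1)   # ≈ 48,000
```

Because a 0.1 change in one probability moves the EV by $6,000, the analyst would want that probability estimated as accurately as possible.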