Steven S Prevette
Senior Statistician
Savannah River Nuclear Solutions, LLC
Introduction to Measurement
Creating a Management System
Five Critical Issues – Dr. Ackoff (1, 2, 3)
Five Critical Issues – Dr. Ackoff (4 and 5)
Designing a Management System - Ackoff
Dr. Ackoff’s message
Three Information Sources
Survey Processing Sidebar
Some Approaches for Choosing PIs
• Top-Down
• Process Approach
• Bottom-Up
• Customer Focus
• Idealized Design
• Leading Indicators
Top-Down Approach
Process Approach
[Diagram: a process model, labeled THEORY, linking inputs (Dollars, Hours, Materials, Data) to in-process measures (Cycle Time, Budget, Inventory, Schedule, Safety, Procedure Compliance, Rework, Idle, Waste), outputs (Product, Service, Commitments Met), and outcomes (Mission Progress, Stewardship, Profit, Satisfaction). Credit to Phil Monroe, DEMCOM.]
Connecting Process, Output, and Outcome
Examples
• Input Rate (units per time, or Dollars, Hours)
• Efficiency (Input vs. Budget, Input vs. Idle)
Bottom-Up Approach
Combination Approach
Customer Focus
Idealized Design
Low Frequency - High Impact Events
• There are several “outcome” based measures, such as Deaths, Bankruptcy, Property Damage, and Environmental Damage, which are infrequent occurrences but carry a high perceived risk and emotional reaction
• Even lucky success – “Dow Jones breaks 12,000” – can be problematic
• When one of these events occurs, we are susceptible to overreacting to the event itself and justifying reasons for it in hindsight
The Black Swan: The Impact of the Highly Improbable, Nassim Nicholas Taleb
Jump Start with Leading Indicators
Creating a Better Future
Choosing Leading Indicators
Safety Leading Indicators at Fluor and SRNS
• Events – first-aid cases, occurrences, near misses
• Safety inspections – number and score
• Employee input – safety concerns and survey responses
• Behavior Based Safety – number of observations and at-risk behavior rates
• Management Observations – number and results
Low Threshold Events (example at Fluor Hanford)
[Charts: weekly First Aid Case Rate, ORPS Events (does not include RC3), and Near Misses, for weeks beginning 1/6/06 through 7/6/07; annotation on the Near Misses chart: 8 weeks since the last event.]
Safety Inspections (Fluor Hanford)
[Chart: number of safety inspections (0–500) and inspection score (90%–98%), Jan-04 through Jul-07.]
Employee Sentiment (Fluor Hanford and SRNS)
[Charts: employee survey sentiment by month taken, Jan-04 through Jul-07 (Fluor Hanford) and Aug-08 through Aug-10 (SRNS).]
Behavior Based Safety (SRNS)
[Chart: number of personnel observed by BBS (0–7,000) and percent safe behaviors (98%–100%), Feb-08 through Aug-10.]
Results
• Leading indicator trends at both sites allowed management to stay ahead of issues
• Fluor Hanford OSHA Total Recordable Case (TRC) rate dropped 28% since the start of use (May 03 – Apr 04 compared to Feb 05 – Jan 06), and 2007 had the lowest TRC rate
• SRNS corrective actions from the adverse events of summer 2009 included leading indicators
• SRNS Operations achieved a 53% reduction in TRC; the FY 2010 rate was the best since 1985
Correlations at SRNS
[Four charts pairing a leading indicator (green line) with a lagging rate, Aug-08 through Aug-10: SRNS Total (Ops, Construction, Subs) S&H Concerns vs. DART rate; SRNS Operations Management Observations vs. TRC; SRNS Operations BBS Observations vs. TRC; SRNS Total First Aid (no insect cases) vs. TRC.]
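The correlation analyses on these slides pair a leading indicator with a lagging rate. A minimal sketch of that calculation, using made-up monthly values rather than actual SRNS data:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical monthly data: BBS observations (leading) and TRC (lagging).
observations = [1000, 1200, 1500, 1800, 2200, 2600]
trc          = [1.2, 1.0, 0.9, 0.7, 0.5, 0.4]

r = pearson(observations, trc)
print(round(r, 2))  # strongly negative: more observations, lower TRC
```

A strongly negative coefficient here is what the slides suggest: as observation counts rise, the recordable case rate falls.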
Correlations at Fluor Hanford
[Charts: OSHA Case Rate versus Near Misses; OSHA Case Rate versus First Aid Case Rate; OSHA Case Rate versus Occurrence Reports (ORPS Non-Nuc); OSHA Case Rate versus Employee Safety Perception Survey (HGET Survey); Jan-04 through Jul-07.]
Hierarchy of Indicators
[Diagram: Corporation → Project → Facility → Team; indicators are more lagging at the corporation level and more leading at the team level.]
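Rolling indicator data up such a hierarchy is straightforward to sketch; the corporation, project, facility, and team names below are hypothetical:

```python
from collections import defaultdict

# (corporation, project, facility, team) -> near misses this period
team_counts = {
    ("Corp", "ProjA", "Fac1", "Team1"): 2,
    ("Corp", "ProjA", "Fac1", "Team2"): 0,
    ("Corp", "ProjA", "Fac2", "Team3"): 1,
    ("Corp", "ProjB", "Fac3", "Team4"): 3,
}

def rollup(counts, level):
    """Sum counts up to the given hierarchy depth (1=corporation .. 4=team)."""
    totals = defaultdict(int)
    for path, n in counts.items():
        totals[path[:level]] += n
    return dict(totals)

print(rollup(team_counts, 3))  # facility-level totals
print(rollup(team_counts, 1))  # corporation-level total
```

Lower levels keep the fast-moving, leading detail; each rollup step aggregates toward the slower, lagging corporate view.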
Performance Indicator Evolution
As a process matures, one may end up evolving the
indicators used. For example, if interested in completing
actions by commitment dates, one may end up using (as
the process matures):
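As a hedged illustration, suppose the source data is a list of actions with commitment and completion dates; the same records can feed an early, coarse indicator and a later, finer-grained one. The specific stages and records below are assumptions, not from the slide:

```python
from datetime import date

# Hypothetical records: (commitment date, completion date or None if open).
actions = [
    (date(2010, 1, 15), date(2010, 1, 10)),
    (date(2010, 2, 1),  date(2010, 2, 5)),
    (date(2010, 2, 20), None),
]

today = date(2010, 3, 1)

# Early-maturity indicator: how many actions are overdue right now?
overdue = sum(1 for due, done in actions if (done or today) > due)

# Later-maturity indicator: average days early (negative = late),
# over completed actions only.
completed = [(due - done).days for due, done in actions if done]
avg_days_early = sum(completed) / len(completed)

print(overdue)          # count of overdue actions
print(avg_days_early)   # mean days ahead of commitment
```

The evolution is in the indicator, not the data: the same commitment records support both calculations.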
Can we really measure anything?
Barrier: The Search for the “Perfect” Indicator
Barrier: Fear
Barrier: 3 Kinds of Numbers for Management
“Just Do It”
Data Gathering
• Plan ahead
• Establish Operational Definitions
• Check data quality
• Use existing databases before creating new ones
Plan Ahead
• “It’s absolutely vital for business that you settle this method of counting, measuring, definition of faults, mistake, defect, before you do business. It’s too late afterwards.”
– Dr. W. Edwards Deming
Operational Definitions
Context
Data Quality
• Data should be replicable
• Operational Definitions are a must
• Source Data must be defined
• There is no “true value” of any measure, but a good operational definition can save much trouble in the future
ANYONE at ANYTIME in the future should be able to apply the same operational definition to the same source data, and get the same results.
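One way to make that concrete is to write the operational definition as a deterministic function of the source data. The case categories and threshold below are hypothetical illustrations, not an actual OSHA definition:

```python
def is_recordable(case):
    """Operational definition (illustrative): a case is 'recordable' if it
    required treatment beyond first aid or caused one or more days away
    from work."""
    return case["treatment"] != "first_aid" or case["days_away"] >= 1

# Hypothetical source data.
cases = [
    {"treatment": "first_aid", "days_away": 0},
    {"treatment": "medical",   "days_away": 0},
    {"treatment": "first_aid", "days_away": 3},
]

recordable_count = sum(is_recordable(c) for c in cases)
print(recordable_count)  # same data + same definition -> same count, always
```

Because the definition is an explicit function with no hidden judgment calls, anyone applying it to the same records reproduces the same indicator value.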
Conclusion
• Performance Indicators are part of the Management
System
• Good use of data will encourage Performance Indicator
Development
• All data are flawed – Set up good Operational Definitions
to minimize flaws
• “Just Do It” – Start collecting data and use it
• There is no such thing as a bad performance indicator,
only bad use of performance indicators
• Good use of performance indicators will lead to continual
improvement