
Choosing Performance Indicators

Steven S Prevette
Senior Statistician
Savannah River Nuclear Solutions, LLC

SPC Trending Primer / Two Day Training


http://www.efcog.org/wg/esh_es/Statistical_Process_Control/index.htm
Management F-Law from Dr. Ackoff

Managers who don't know how to measure what they want settle for wanting what they can measure.

Dr. Russell L. Ackoff and Herbert J. Addison,
http://www.f-laws.com/pdf/A_Little_Book_of_F-LawsE.pdf

2
Introduction to Measurement

• It is more important how the measure is used than what the measure is
• Self-fulfilling prophecies can prevent us from gathering any data
• We are drowning in data, but little knowledge is derived
• Context and Operational Definitions are crucial
• Don't ignore "gut feelings" and emotions

3
Creating a Management System

• Dr. Ackoff, Creating the Corporate Future
• Three Management Functions:
  – Identification of actual and potential problems (threats and opportunities)
  – Decision making (what to do and doing it, or having it done)
  – Maintenance and improvement of performance under changing and unchanging conditions

4
Five Critical Issues – Dr. Ackoff (1, 2, 3)

• Managers suffer from an overabundance of irrelevant information.
• Managers don't know what information they need. Look at the decision process to determine this.
• Even if given the information they need, decision making will not necessarily improve.

5
Five Critical Issues – Dr. Ackoff (4 and 5)

• More communication does not necessarily lead to better performance. Information can be used destructively.
• Managers do need to know how the information system works. Just because it came from a computer doesn't mean it is right.

6
Designing a Management System - Ackoff

• The information system should be designed as an integral part of the management system
• Most information systems are designed independently, leading to failure
• Information systems should serve management, not vice versa

7
Dr. Ackoff’s message

• We do need to know the context within which the performance indicators will be used
• Forecasting and living with the forecasted future is important, but what about designing a better future?

8
Three Information Sources

• Worker and Customer Opinion
• Expert Review
• Process Measures

We will focus on Process Measures for this topic. Note that opinions can be converted to measurement data with survey analysis, and results can be converted to measurement data through grading criteria.

9
Survey Processing Sidebar

• Surveys usually are not analyzed well
• Most analyses arithmetically average the 1-to-5 Likert scale and don't assess the variation (see the sketch below)
  – This assumes people think linearly
  – This assumes each category is of equal width
  – It can over-react to random variation
• For a better approach, see
  http://www.hanford.gov/rl/uploadfiles/VPP_AnaSurveyData.pdf
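As a minimal illustration of the point above, the sketch below uses made-up Likert responses to show two survey questions with identical averages but very different distributions; a bare average hides exactly the variation that matters.

from collections import Counter
from statistics import mean, pstdev

# Made-up Likert responses (1 = strongly disagree ... 5 = strongly agree)
q1 = [3, 3, 3, 3, 3, 3, 3, 3]   # everyone neutral
q2 = [1, 1, 1, 1, 5, 5, 5, 5]   # sharply polarized

for name, responses in (("Q1", q1), ("Q2", q2)):
    print(name,
          "mean =", mean(responses),
          "std dev =", round(pstdev(responses), 2),
          "distribution =", dict(Counter(responses)))
# Both questions average 3.0, yet Q2 describes a divided workforce.
# That difference is what a simple arithmetic average throws away.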

10
Some Approaches for Choosing PIs

• Top-Down
• Process Approach
• Bottom-Up
• Customer Focus
• Idealized Design
• Leading Indicators

11
Top-Down Approach

• Look at your Mission and Vision
• What are your Products, Services, and Customers?
• What is your Business Objective?
• What are the desired Outcomes?
• What are the Processes that accomplish the above? (Drawing a flow chart may help.)
• Decide on Measures (see next page)
• Go set up data sources and gather data

12
Process Approach
[Flow diagram: Inputs (Dollars, Hours, Materials, Data, Budget, Procedure) feed the Process, characterized by Cycle Time, In-Process Inventory, Schedule, and Compliance, with losses to Idle time, Waste, and Rework. The Process produces Outputs (Product, Service), and the THEORY connects those outputs to Outcomes: Mission Progress, Commitments Met, Stewardship, Profit, Safety, Satisfaction.]

Credit to Phil Monroe, DEMCOM
13
Connecting Process, Output, and Outcome

• Outcomes are only achieved as the result of a process
• Focusing ONLY on outcomes is a sure path to failure
• Ignoring outcomes is a sure path to failure
• When I provide a Product or a Service, what is my THEORY that connects it to a favorable outcome?

Example: I provide statistical training to you. My product is you, as you leave this room. My theory is that you will apply the knowledge you have been given to performance indicator work, which will cause continual improvement and have a positive impact on accomplishing the Mission of your Organization.

14
Examples
• Input Rate (units per time, or Dollars, Hours)
• Efficiency (Input vs. Budget, Input vs. Idle)
• Cycle Time (the Baldrige Criteria push cycle time)
• Backlog (inventory)
• Procedure Compliance, Completion without Stoppage
• Output Rate (units per time of Product or Service)
• Productivity (Output divided by Input)
• Defect Rate (Waste + Rework vs. Output)
• Effectiveness (Outcome measures, Outcome per Input, Percent Compliant, Output vs. Schedule)

A small worked sketch of several of these follows.
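The sketch below computes a few of the measures above from made-up monthly figures; the names and numbers are illustrative only.

# Hypothetical monthly figures for one process; all values are illustrative.
output_units = 480      # units of product or service delivered
input_hours  = 1200     # labor hours consumed (Input)
budget_hours = 1150     # hours planned in the budget
waste_units  = 8
rework_units = 12

productivity    = output_units / input_hours                   # Output divided by Input
defect_rate     = (waste_units + rework_units) / output_units  # Waste + Rework vs. Output
input_vs_budget = input_hours / budget_hours                   # Efficiency: Input vs. Budget

print(f"Productivity:    {productivity:.2f} units per labor hour")
print(f"Defect rate:     {defect_rate:.1%}")
print(f"Input vs budget: {input_vs_budget:.1%}")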

15
Bottom Up Approach

• Go find out what data you currently have
• Why are you collecting it?
• What could it tell you?
• Choose measures from the available data
• Refine by trial and error

Advantage: cost effective; utilizes existing resources
Disadvantage: focuses only on "visible" data, is reactive, and is not "designed" (re: Ackoff)

16
Combination Approach

1st: Work down from the Aim of the Organization.
2nd: Work up from the data on hand.
3rd: Meet in the middle and identify gaps.

17
Customer Focus

• Put yourself in your customer’s place.


• What is important to the customer?
• Customer Satisfaction, Loyalty.
• Can it be measured or inferred?

Do the same for your employees.

18
Idealized Design

• Your current organization and systems are gone
• What would you put in place TODAY?
• You are free to replace them with anything you like, as long as:
  – It is technologically feasible
  – It is operationally viable

Dr. Russ Ackoff

19
Low Frequency - High Impact Events
• There are several “outcome” based measures
such as Deaths, Bankruptcy, Property Damage,
Environmental Damage which are infrequent
occurrences but carry a high perceived risk and
emotional reaction
• Even lucky success – “Dow Jones breaks 12,000”
can be problematic
• When one of these events occurs, we are susceptible to overreacting to the event itself and to justifying it in hindsight
The Black Swan:
The Impact of the Highly Improbable
Nassim Nicholas Taleb

20
Jump Start with Leading Indicators

Just what are leading indicators, anyway?

Predictions of the future?


or
A means to create a better future?

21
Creating a Better Future

Distracted by calls to predict the future, we delay development of leading indicators.

• At low injury rates, little information exists in outcome indicators
• Trending response time is long at low rates
• Use leading indicators to measure lower-threshold data and activities
• This quickens trend response and improves outcomes (a rough sketch of the arithmetic follows)
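The sketch below is a rough illustration, not taken from the slides: it assumes events arrive as a Poisson process and uses an assumed rule of thumb that roughly nine events are needed before a sizeable rate change stands out from ordinary count-to-count variation. The rates are made up for illustration.

# Why trending response is slow at low event rates (illustrative assumption:
# roughly nine events needed before a large rate change becomes visible).
rates_per_year = {            # illustrative rates, not actual site data
    "OSHA recordable injuries": 2,
    "First-aid cases": 20,
    "Near misses": 60,
}

EVENTS_NEEDED = 9
for name, rate in rates_per_year.items():
    years = EVENTS_NEEDED / rate
    print(f"{name:26s}: ~{years:.2f} years of data before a trend can emerge")

The lower-threshold indicators accumulate events far faster, which is why they respond to a change in months rather than years.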

22
Choosing Leading Indicators

What are you doing to "Create a Better Future"?

• What are your program elements, your grass-roots efforts to improve?
• What activities are you conducting that you expect will lead to better performance on outcomes?
• What is your theory for achieving better outcomes?
• What are lower-threshold indicators that could be measured or inferred?

23
Safety Leading Indicators at Fluor and SRNS
• Events – first-aid cases, occurrences, near misses
• Safety inspections – number and score
• Employee input – safety concerns and survey responses
• Behavior Based Safety – number of observations and at risk
behavior rates
• Management Observations – number and results

Reviewed weekly by senior management and published in company scorecards.

24
Low Threshold Events (example at Fluor Hanford)

[Three weekly trend charts, Week Beginning 1/6/06 through 7/6/07: First Aid Case Rate (0 to 30), ORPS Events (does not include RC3) (0 to 8), and Near Misses (0 to 5), annotated "8 weeks since the last event".]
25

Safety Inspections (Fluor Hanford)

[Two monthly trend charts, Month Conducted Jan-04 through Jul-07: number of Safety Inspections (0 to 600) and Safety Inspection Scores (90% to 100%).]

26
Employee Sentiment (Fluor Hanford and SRNS)

[Two trend charts: HGET Safety Survey relative agreement (0.70 to 0.90) by Month Taken, Jan-04 through Jul-07, and SRNS (Ops, Construction, Subs) S&H Related Employee Concerns (0 to 25) by month, Aug-08 through Aug-10.]

27
Behavior Based Safety (SRNS)

[Two monthly trend charts, Feb-08 through Aug-10: Number of Personnel Observed by BBS (0 to 7,000) and Percent Safe Behaviors (98% to 100%).]
28
Results
• Leading indicator trends at both sites allowed management to stay ahead of issues
• The Fluor Hanford OSHA Recordable Case Rate (TRC) dropped 28% after the indicators came into use (May 03 to Apr 04 compared to Feb 05 to Jan 06), and 2007 had the lowest TRC rate
• SRNS corrective actions from the adverse events of summer 2009 included leading indicators
• SRNS Operations achieved a 53% reduction in TRC; the FY 2010 rate was the best since 1985

Allows focus on doing the right things right

29
Correlations at SRNS
[Four leading-indicator correlation charts, Aug-08 through Aug-10: SRNS Total (Ops, Construction, Subs) S&H Concerns vs. DART rate per 200,000 hours; SRNS Operations Management Observations vs. TRC; SRNS Operations BBS Observations vs. TRC; and SRNS Total First Aid cases (no insect cases) per 200,000 hours vs. TRC. Note: the black line is the OSHA case rate, the green line is the leading indicator.]
30
Correlations at Fluor Hanford

[Four correlation charts, Jan-04 through Jul-07: OSHA Case Rate versus First Aid Case Rate, versus Number of Near Misses, versus Employee Safety Perception (HGET Survey relative agreement), and versus Occurrence Reports (ORPS non-Reporting-Criteria-3 rate).]
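As a minimal sketch of the calculation behind charts like these, the snippet below compares a monthly leading-indicator series against the recordable case rate; the numbers are invented for illustration and are not Fluor Hanford or SRNS data.

from statistics import correlation   # available in Python 3.10+

# Invented monthly series for illustration only.
bbs_observations = [620, 700, 810, 900, 980, 1050, 1150, 1240]        # leading indicator
trc_rate         = [1.10, 1.00, 0.90, 0.85, 0.70, 0.65, 0.55, 0.50]   # cases per 200,000 hours

r = correlation(bbs_observations, trc_rate)
print(f"Pearson r = {r:.2f}")   # strongly negative: more observations, lower case rate

Correlation alone does not prove the leading indicator causes the improvement; it is evidence to be weighed against the theory connecting the activity to the outcome.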
31
Leading and Lagging Indicators

• Lagging Indicators dominate at the higher levels, reflecting outcomes. They tend to be standardized and dictated from above.

• Leading Indicators dominate at the lower levels, reflecting the processes that achieve the outcomes. They tend to be customized and driven from the bottom up.

32
Hierarchy of Indicators

[Pyramid diagram: Corporation at the top, then Project, Facility, and Team, with LAGGING indicators dominating toward the top and LEADING indicators toward the bottom.]

Lagging indicators dominate at high levels, leading indicators at lower levels.

33
Performance Indicator Evolution
As a process matures, one may end up evolving the indicators used. For example, if interested in completing actions by commitment dates, one might progress through the following indicators (a small sketch of two of them follows the list):

• Percent of Actions completed by due date in effect at


time of completion
• Percent of Actions completed without missing any
due dates during their life
• Percent of Actions completed by the original due date
• Average days Actions completed ahead of original due
date
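The sketch below is a hypothetical computation of the first and third indicators in the list from the same set of action records; the field names and dates are invented for illustration.

from datetime import date

# Invented action records; field names are illustrative only.
actions = [
    {"completed": date(2011, 3, 1),  "original_due": date(2011, 3, 15), "due_at_completion": date(2011, 3, 15)},
    {"completed": date(2011, 4, 20), "original_due": date(2011, 4, 1),  "due_at_completion": date(2011, 4, 30)},
    {"completed": date(2011, 5, 5),  "original_due": date(2011, 5, 10), "due_at_completion": date(2011, 5, 10)},
]

met_current_due  = sum(a["completed"] <= a["due_at_completion"] for a in actions) / len(actions)
met_original_due = sum(a["completed"] <= a["original_due"] for a in actions) / len(actions)

print(f"Completed by due date in effect at completion: {met_current_due:.0%}")   # 100%
print(f"Completed by original due date:                {met_original_due:.0%}")  # 67%
# The second action looks "on time" only because its due date slipped;
# the stricter, more mature indicator exposes that gap.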

34
Can we really measure anything?

• Perhaps, but there are limits (there is a good book on "how to measure anything")
• Don't become a Robert McNamara "whiz kid" (Vietnam War "body counts")
• Pay attention to gut feelings and subjective risk; reconcile any differences with the numbers

Can we support or refute the statement "I love my spouse more than you love your spouse" with a measurement? Should we?

35
Barrier: The Search for the “Perfect” Indicator

• When committees get together and try to table-top the perfect indicator, paralysis often sets in.
• Realize all data are flawed, there is no "true value", and indicators can always be "gamed."
• Putting the right culture of HOW to use performance indicators in place minimizes adverse impacts.
• Gain experience with simple indicators, then move on to more complex indicators if needed.
• With proper analysis, flaws in existing data can be detected and fixed. If you never look at the data, there will never be an incentive to fix the data.

36
Barrier: Fear

• Higher-ups will use it as a "hammer"
• Subjected to quotas and targets imposed from above
• Fear ("accountability") used as a "motivator"
• Actions and explanations demanded in response to random fluctuations
• Perceived loss of control over the portrayal of performance
• Belief that the "perfect" indicator must be developed the first time

Use of SPC can minimize these fears (a minimal control-chart sketch follows).
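The sketch below shows one common SPC calculation, an individuals-and-moving-range (XmR) chart using the standard 2.66 constant; the weekly counts are made up for illustration.

# Minimal XmR (individuals and moving range) control chart sketch.
# Data are made-up weekly counts, e.g. first-aid cases per week.
data = [4, 6, 3, 5, 7, 4, 5, 6, 4, 5, 4, 15]

moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
center  = sum(data) / len(data)
mean_mr = sum(moving_ranges) / len(moving_ranges)

ucl = center + 2.66 * mean_mr            # standard XmR control-limit constant
lcl = max(0.0, center - 2.66 * mean_mr)  # counts cannot go below zero

for week, x in enumerate(data, start=1):
    flag = "  <-- signal, investigate" if x > ucl or x < lcl else ""
    print(f"week {week:2d}: {x}{flag}")
print(f"center = {center:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")

Points inside the limits are treated as routine variation rather than grounds for explanations or blame; only the flagged signal calls for investigation.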

37
Barrier: 3 Kinds of Numbers for Management

• Facts of life. If we don't make this profit figure, we will go out of business.
• Planning, prediction, and budget. Can be used to compare alternative plans.
• Arbitrary numerical targets. Generally used to judge workers by.

Avoid the use of the third kind of number.

Henry Neave, The Deming Dimension

38
“Just Do It”

• All data are flawed


• Make good use of your data
• Endless conference table discussions won’t
cause any data to appear
• Initial prototype successes will lead to
experience, and will further the spread of the
use of indicators

39
Data Gathering
• Plan ahead
• Establish Operational Definitions
• Check data quality
• Use existing databases before creating new

40
Plan Ahead
• “It’s absolutely vital for business that you settle this
method of counting, measuring, definition of faults,
mistake, defect, before you do business. It’s too late
afterwards”
-Dr. W. Edwards Deming

How many initiatives have we embarked upon without a clear set of indicators established up front, only to be left, a year afterwards, trying to figure out "what happened"?

41
Operational Definitions

• “Clean the Table”

42
Operational Definitions

• Most arguments about conflicting data come down to


the definition of how to count the data
• Try to be precise in your definitions, but likely
something unforeseen will arise
• Record the detail someplace, for example:
  – EM-SR--SRNS prefix occurrence reports
  – Counted by categorization date
  – Does not include canceled reports
  (A sketch of this definition written as a data filter follows.)
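As a hypothetical sketch only, the operational definition above can be written down as an explicit data filter; the field names and sample records below are invented for illustration, not an actual ORPS schema.

from datetime import date

# Invented sample records; "id", "categorized", and "status" are illustrative field names.
reports = [
    {"id": "EM-SR--SRNS-EXAMPLE-2010-0001", "categorized": date(2010, 6, 3),  "status": "open"},
    {"id": "EM-SR--SRNS-EXAMPLE-2010-0002", "categorized": date(2010, 6, 17), "status": "canceled"},
    {"id": "EM-XX--OTHER-EXAMPLE-2010-0003", "categorized": date(2010, 6, 21), "status": "closed"},
]

def monthly_count(records, year, month):
    """Operational definition: SRNS-prefix occurrence reports, counted by
    categorization date, excluding canceled reports."""
    return sum(
        1 for r in records
        if r["id"].startswith("EM-SR--SRNS")
        and r["status"] != "canceled"
        and (r["categorized"].year, r["categorized"].month) == (year, month)
    )

print(monthly_count(reports, 2010, 6))   # -> 1

Anyone applying the same filter to the same source data should get the same count, which is the replicability test described on the Data Quality slide.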

43
Context

• Do not look at a chart (or any data) in a vacuum


• Reconcile any differences between the data and “gut
feeling”
• Combine experience and the data
• Lessons from the data should lead to insight in the
field, and vice versa

44
Data Quality


• Data should be replicable
• Operational Definitions are a must
• Source Data must be defined

There is no "true value" of any measure, but a good operational definition can save much trouble in the future. ANYONE at ANY TIME in the future should be able to apply the same operational definition to the same source data and get the same results.

45
Conclusion
• Performance Indicators are part of the Management
System
• Good use of data will encourage Performance Indicator
Development
• All data are flawed – Set up good Operational Definitions
to minimize flaws
• “Just Do It” – Start collecting data and use it
• There is no such thing as a bad performance indicator,
only bad use of performance indicators
• Good use of performance indicators will lead to continual
improvement
46
