
Attribute Measurement System Analysis

Vendor Invoices
Cost Data Integrity Project

Purpose
We need to make sure that our operating definitions are rock solid and that no variation is introduced by how we measure things. For example, if two different people measure the same thing and come up with two different answers, then some degree of variation has been introduced through bias, and we want to measure only real variation. So it is important to evaluate or gage two things:

1. Repeatability - If the same person measures the same thing, do they still get the same answer?
2. Reproducibility - If someone else measures the same thing, do they get the same answer the first person got?
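These two checks can be expressed as simple agreement counts. The sketch below is a minimal illustration with hypothetical trial data (not taken from this worksheet): an observer "repeats" on a unit when all of their own trials match, and two observers "reproduce" on a unit when all trials by both observers match.

```python
# Minimal sketch of attribute agreement scoring.
# The trial data below is hypothetical, not from the worksheet.

def repeatability(trials):
    """Fraction of units where one observer's repeated trials all match."""
    agree = sum(1 for runs in trials if len(set(runs)) == 1)
    return agree / len(trials)

def reproducibility(trials_a, trials_b):
    """Fraction of units where both observers' trials all match each other."""
    agree = sum(
        1 for a, b in zip(trials_a, trials_b)
        if len(set(a) | set(b)) == 1
    )
    return agree / len(trials_a)

# Two observers rate the same 4 units twice each (hypothetical values).
obs1 = [(2, 2), (3, 3), (4, 3), (5, 5)]
obs2 = [(2, 2), (3, 3), (3, 3), (4, 4)]

print(repeatability(obs1))          # 3 of 4 units repeat -> 0.75
print(reproducibility(obs1, obs2))  # units 1 and 2 fully agree -> 0.5
```

Note that reproducibility can never exceed repeatability on the same data: if an observer's own trials disagree, the unit cannot count as agreement between observers either.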

Note: You may see this referred to as "Gage R & R". Here are a few more tips to keep in mind:

- If physical equipment or instruments are involved in measurement readings, make sure the equipment
is properly calibrated to ensure accuracy.
- People who do the measuring should be well versed in the process and follow a documented step by step
procedure.
- Methods, materials, and even the environment can impact the readings. You want a typical or usual environment
that is representative of how the process really works.
- Over time, you want to have a stable environment. Plotting natural variation using X̄ & R charts is recommended.
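The X̄ & R charts mentioned above track subgroup means and ranges over time. A rough sketch of how the control limits are derived, assuming the standard Shewhart control-chart constants (A2, D3, D4) for a subgroup size of 2 and hypothetical subgroup data:

```python
# Sketch of X-bar and R control limits for subgroups of size 2.
# A2, D3, D4 are the standard Shewhart constants for n = 2;
# the subgroup readings here are hypothetical.

A2, D3, D4 = 1.880, 0.0, 3.267  # constants for subgroup size n = 2

subgroups = [(2, 2), (3, 3), (2, 3), (4, 4), (4, 3)]  # e.g. paired trial readings

xbars = [sum(g) / len(g) for g in subgroups]   # subgroup means
ranges = [max(g) - min(g) for g in subgroups]  # subgroup ranges

xbar_bar = sum(xbars) / len(xbars)  # grand mean (center line of X-bar chart)
r_bar = sum(ranges) / len(ranges)   # mean range (center line of R chart)

ucl_x = xbar_bar + A2 * r_bar       # X-bar chart upper control limit
lcl_x = xbar_bar - A2 * r_bar       # X-bar chart lower control limit
ucl_r = D4 * r_bar                  # R chart upper control limit
lcl_r = D3 * r_bar                  # R chart lower control limit (0 for small n)

print(round(xbar_bar, 3), round(r_bar, 3), round(ucl_x, 3), round(ucl_r, 3))
```

Points falling outside these limits suggest the measurement environment is not stable, which undermines any Gage R&R result collected during that period.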

Step 1 - Select at least two observers or appraisers to participate in the Measurement System Analysis:

Name of Observers / Appraisers:
1.1 Sherry Minzer
1.2 Robert Bowers
1.3 Jamin Musului

1.6 In order to measure repeatability, each person should make at least two trial runs. Indicate the number of trial runs each person will take: 2

Step 2 - Establish the acceptance criteria for whether the measurement passes or fails:

2.1 Percent that must be in Agreement > 95%


2.2 Confidence Level in Results > 95%
2.3 Indicate the number of measurements to be taken > 20

Step 3 - Select the units to be measured, have each observer measure each unit in random order, repeat the process
according to the number of trial runs, and tabulate the results in the table below:

Observer   Sherry Minzer      Robert Bowers      Jamin Musului
No.        Trial 1  Trial 2   Trial 1  Trial 2   Trial 1  Trial 2
1          2        2         2        2         2        2
2          3        3         3        3         3        3
3          2        2         2        2         2        2
4          4        4         4        4         4        4
5          4        4         4        4         4        4
6          2        2         4        4         4        4
7          0        0         1        1         1        1
8          2        2         2        2         2        2
9          3        3         3        3         3        3
10         5        5         5        5         5        4
11         4        3         3        3         3        3
12         3        3         3        3         3        3
13         2        2         2        2         2        2
14         2        2         2        2         2        2
15         3        3         3        3         3        3
16         6        5         6        6         5        5
17         4        4         4        4         4        4
18         5        5         4        4         4        4
19         3        3         3        3         3        3
20         2        2         2        2         2        2

Step 4 - Determine the Degree of Agreement in the Data

Observations per Trial in Agreement by Observer (1 = that observer's trials agree, 0 = they disagree):

No.   Observ 1.1   Observ 1.2   Observ 1.3
1     1            1            1
2     1            1            1
3     1            1            1
4     1            1            1
5     1            1            1
6     1            1            1
7     1            1            1
8     1            1            1
9     1            1            1
10    1            1            0
11    0            1            1
12    1            1            1
13    1            1            1
14    1            1            1
15    1            1            1
16    0            1            1
17    1            1            1
18    1            1            1
19    1            1            1
20    1            1            1
Total Agreements: 18  20  19

Observations in Agreement Between Observers (1 = all trials of both observers agree, 0 = otherwise):

No.   1.1 vs 1.2   1.1 vs 1.3   1.2 vs 1.3
1     1            1            1
2     1            1            1
3     1            1            1
4     1            1            1
5     1            1            1
6     0            0            1
7     0            0            1
8     1            1            1
9     1            1            1
10    1            0            0
11    0            0            1
12    1            1            1
13    1            1            1
14    1            1            1
15    1            1            1
16    0            0            0
17    1            1            1
18    0            0            1
19    1            1            1
20    1            1            1
Total Agreements: 15  14  18
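The Step 4 tabulation can be reproduced programmatically. The sketch below uses the trial data recorded in Step 3 (two trials per observer, 20 units) and counts within-observer and between-observer agreement per unit:

```python
# Agreement counts recomputed from the Step 3 trial data
# (two trials per observer, 20 units).

sherry = [(2,2),(3,3),(2,2),(4,4),(4,4),(2,2),(0,0),(2,2),(3,3),(5,5),
          (4,3),(3,3),(2,2),(2,2),(3,3),(6,5),(4,4),(5,5),(3,3),(2,2)]
robert = [(2,2),(3,3),(2,2),(4,4),(4,4),(4,4),(1,1),(2,2),(3,3),(5,5),
          (3,3),(3,3),(2,2),(2,2),(3,3),(6,6),(4,4),(4,4),(3,3),(2,2)]
jamin  = [(2,2),(3,3),(2,2),(4,4),(4,4),(4,4),(1,1),(2,2),(3,3),(5,4),
          (3,3),(3,3),(2,2),(2,2),(3,3),(5,5),(4,4),(4,4),(3,3),(2,2)]

def within(obs):
    """Units where an observer's own trials agree (repeatability count)."""
    return sum(1 for runs in obs if len(set(runs)) == 1)

def between(a, b):
    """Units where all trials of both observers agree (reproducibility count)."""
    return sum(1 for x, y in zip(a, b) if len(set(x) | set(y)) == 1)

print(within(sherry), within(robert), within(jamin))  # 18 20 19
print(between(sherry, robert))                        # 15
print(between(sherry, jamin))                         # 14
print(between(robert, jamin))                         # 18
```

These counts match the "Total Agreements" rows above, so the worksheet's manual tabulation checks out.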

Repeatability Results
Was each observer able to repeat within our target of > 95%?

Observer / Appraiser   No. of Obs   Number in Agreement   Percent   Meets Target?
Sherry Minzer          20           18                    90%       No
Robert Bowers          20           20                    100%      Yes
Jamin Musului          20           19                    95%       Yes

Reproducibility Results
How well did each observer do in reproducing the same results as the other observers?

Observer / Appraiser   No. of Obs   Number in Agreement   Percent   Meets Target?
Observ 1.1 vs 1.2      20           15                    75%       No
Observ 1.1 vs 1.3      20           14                    70%       No
Observ 1.2 vs 1.3      20           18                    90%       No
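The pass/fail calls follow directly from the counts and the Step 2 target. A small sketch of the criterion check (the worksheet marks exactly 95% as passing, so a `>=` comparison is assumed here):

```python
# Check each repeatability count against the Step 2 target.
# The worksheet treats exactly 95% as passing, so >= is used.

TARGET = 0.95  # from Step 2: percent that must be in agreement

results = {
    "Sherry Minzer": (18, 20),
    "Robert Bowers": (20, 20),
    "Jamin Musului": (19, 20),
}

for name, (agree, total) in results.items():
    pct = agree / total
    meets = pct >= TARGET
    print(f"{name}: {pct:.0%} -> {'Yes' if meets else 'No'}")
```

The same check applies to the reproducibility rows; at 75%, 70%, and 90% all three observer pairs fall short of the target, so the measurement system needs work before the cost data can be trusted.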
