Total Quality Management


BE. Mechanical Final Year –VIII Semester
Statistical Process Control (SPC)
UNIT – III
SYLLABUS
The seven tools of quality, Statistical Fundamentals – Measures of central Tendency and
Dispersion, Population and Sample, Normal Curve, Control Charts for variables and attributes,
Process capability, Concept of six sigma, New seven Management tools

One of the best technical tools for improving product and service quality is Statistical Process
Control (SPC). There are seven basic techniques.
This technical tool not only controls the process but also has the capability to improve it.
Pareto Diagram
Vilfredo Pareto (1848–1923) conducted extensive studies of the distribution of wealth in
Europe.
v He found that there were a few people with a lot of money and many people with little
money.
v This unequal distribution of wealth became an integral part of economic theory.
v A Pareto diagram is a graph that ranks data classifications in descending order from left
to right.
v In this case, the data classifications are types of coating machines.
v Other possible data classifications are problems, complaints, causes, and types of
nonconformities.
v The vital few are on the left, and the useful many are on the right.
v It is sometimes necessary to combine some of the useful many into one classification
called “other.”
v When this category is used, it is placed on the far right.
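The descending-order ranking described above can be sketched in a few lines of Python. The defect categories and counts here are hypothetical, chosen only to illustrate the ordering and the cumulative percentage that a Pareto diagram displays:

```python
# Rank hypothetical nonconformity counts in descending order (Pareto ordering).
defects = {"scratches": 45, "dents": 12, "paint runs": 30, "chips": 5, "other": 3}

# Sort classifications from largest to smallest count (vital few on the left).
ranked = sorted(defects.items(), key=lambda kv: kv[1], reverse=True)

total = sum(defects.values())
cumulative = 0
for category, count in ranked:
    cumulative += count
    print(f"{category:12s} {count:4d} {100 * cumulative / total:6.1f}%")
```

The printed cumulative percentages are the values plotted as the cumulative line on a conventional Pareto diagram.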
Cause-and-Effect Diagram
The cause-and-effect (C&E) diagram is a picture composed of lines and symbols designed to
represent a meaningful relationship between an effect and its causes.
• It was developed by Dr. Kaoru Ishikawa in 1943 and is sometimes referred to as an
Ishikawa diagram.
• C&E diagrams are used to investigate either a bad effect, so that action can be taken to
correct its causes, or a good effect, so as to learn which causes are responsible.
• Each major cause is further subdivided into numerous minor causes.
Figure: Cause-and-effect diagram

Attention to a few essentials will provide a more accurate and usable result:
• Participation by every member of the team is facilitated by each member taking a turn
giving one idea at a time; if a member cannot think of a minor cause, he or she passes for
that round.
• Quantity of ideas, rather than quality, is encouraged.
• One person’s idea will trigger someone else’s idea, and a chain reaction occurs.
• Frequently, a trivial, or “dumb,” idea will lead to the best solution.
• Criticism of an idea is not allowed.
• There should be a freewheeling exchange of information that liberates the imagination.
• All ideas are placed on the diagram.
• Evaluation of ideas occurs at a later time.
Visibility of the diagram is a primary factor in participation.
v Create a solution-oriented atmosphere, not a gripe session.
v The team leader should ask questions using the why, what, where, when, who, and how
techniques.
v Let the ideas incubate for a period of time and then have another brainstorming session.
v Provide team members with a copy of the ideas after the first session. When no more
ideas are generated, the brainstorming activity is terminated.

Check Sheets
The main purpose of check sheets is to ensure that the data is collected carefully and accurately
by operating personnel.
• Data should be collected in such a manner that it can be quickly and easily used and
analyzed.
• The form of the check sheet is individualized for each situation and is designed by the
project team.
• The figure shows a check sheet for paint nonconformities for bicycles.

v Whenever possible, check sheets are also designed to show location.


v For example the check sheet for bicycle paint nonconformities could show an
outline of a bicycle.
v Creativity plays a major role in the design of a check sheet.
v It should be user-friendly and whenever possible include information on time and
location.

Histogram
The first “statistical” SPC technique is the histogram. It describes the variation in the process, as
shown in the figure. The histogram graphically estimates the process capability and, if desired,
the relationship to the specifications and the nominal (target) value.

Ungrouped Data
The graphical technique is a plot or picture of a frequency distribution, which is a summarization
of how the data (observations) occur within each subdivision of observed values or groups of
observed values. Analytical techniques summarize data by computing a measure of central
tendency and a measure of dispersion.
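A frequency distribution of ungrouped data can be tallied with Python's standard library. The observations below are hypothetical values used only to show the tally:

```python
from collections import Counter

# Hypothetical observed values (e.g. number of nonconformities per unit).
observations = [2, 3, 3, 4, 4, 4, 5, 5, 6, 4, 3, 5, 4, 2, 4]

# Frequency distribution: how often each observed value occurs.
freq = Counter(observations)

# Print a simple text histogram, smallest value first.
for value in sorted(freq):
    print(f"{value:2d} | {'*' * freq[value]}")
```

Each row of asterisks corresponds to one cell of the histogram.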


Statistical Fundamentals
Before a description of the next SPC technique, it is necessary to have a background in
statistical fundamentals.
v Statistics is defined as the science that deals with the collection, tabulation, analysis,
interpretation, and presentation of quantitative data.
v Each division is dependent on the accuracy and completeness of the preceding one.
v Data may be collected by a technician measuring the tensile strength of a plastic part or
by an operator using a check sheet.
v It may be tabulated by a simple paper-and-pencil technique or by computer. The final
results are interpreted and presented to assist in making decisions concerning quality.

Measures of Central Tendency


A measure of central tendency of a distribution is a numerical value that describes the central
position of the data, or how the data tend to build up in the center. There are three measures in
common use in quality:
• The average
• The median
• The mode
The average is the sum of the observations divided by the number of observations. It is the most
common measure of central tendency and is represented by the equation

X̄ = (X1 + X2 + … + Xn) / n = ΣXi / n

Another measure of central tendency is the median, Md, which is defined as the value that
divides a series of ordered observations so that the number of items above it is equal to the
number below it.
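The three measures can be computed with Python's `statistics` module; the data values below are hypothetical:

```python
import statistics

# Hypothetical ordered observations (e.g. tensile strengths).
data = [5, 7, 8, 8, 10, 12, 13]

average = sum(data) / len(data)    # sum of observations / number of observations
median = statistics.median(data)   # middle value of the ordered observations
mode = statistics.mode(data)       # most frequently occurring value

print(average, median, mode)  # 9.0 8 8
```

Note that here the median and mode coincide at 8 even though the average is 9.0; the three measures agree only for a symmetrical distribution.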
Measures of Dispersion
v A second tool of statistics is composed of the measure of dispersion which describes how
the data are spread out or scattered on each side of the central value.
v Measures of dispersion and measure of central tendency are both needed to describe a
collection of data.

One of the measures of dispersion is the range, which for a series of numbers is the difference
between the largest and the smallest values of the observations. It is expressed by the equation

R = Xh − Xl

where Xh is the highest observation and Xl is the lowest observation in the series.

Ø The other measure of dispersion used in quality is the standard deviation.
Ø It is a numerical value in the units of the observed values that measures the spreading
tendency of the data.
Ø A larger standard deviation shows greater variability of the data than does a smaller
standard deviation. The sample standard deviation is represented by the equation

s = √( Σ(Xi − X̄)² / (n − 1) )
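Both measures of dispersion can be computed directly from the formulas above; the same hypothetical data set as before is used:

```python
import math

# Hypothetical observations.
data = [5, 7, 8, 8, 10, 12, 13]

# Range: difference between the largest and smallest observation.
r = max(data) - min(data)

xbar = sum(data) / len(data)
# Sample standard deviation: s = sqrt(sum((x - xbar)^2) / (n - 1)).
s = math.sqrt(sum((x - xbar) ** 2 for x in data) / (len(data) - 1))

print(r, round(s, 3))  # 8 2.828
```

The range is trivial to compute but uses only two observations; the standard deviation uses every observation and is therefore the preferred measure for control-chart work.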

Population and Sample


v The population is the whole collection of data. When averages, standard deviations,
and other measures are computed from samples, they are referred to as statistics.
v Because the composition of samples fluctuates, the computed statistics will be larger or
smaller than their true population values, or parameters.
v Parameters are considered to be fixed reference values, or the best estimate of those
values available at a particular time.
v Sampling is necessary when it may be impossible to measure the entire population.
Normal curve
Although there are as many different populations as there are conditions, they can be described
by a few general types.
• The normal curve is a symmetrical, unimodal, bell-shaped distribution with the mean,
median, and mode having the same value.
• The figure shows a curve of a normal population for the resistance, in ohms, of an
electrical device, with population mean μ and population standard deviation σ.
• Much of the variation in nature and in industry follows the frequency distribution of the
normal curve.

• The normal curve is such a good description of the variations that occur to most quality
characteristics in industry that it is the basis for many quality control techniques.
• There is a definite relationship among the mean, the standard deviation, and the normal
curve. The figure shows three normal curves with different mean values; the only change
is the location.
• The normal distribution is fully defined by the population mean and the population
standard deviation.
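The well-known percentages of a normal population falling within ±1σ, ±2σ, and ±3σ of the mean follow directly from the normal model and can be checked with the error function:

```python
import math

def fraction_within(k):
    """Fraction of a normal population within +/- k standard deviations of the mean."""
    return math.erf(k / math.sqrt(2))

for k in (1, 2, 3):
    print(f"+/-{k} sigma: {100 * fraction_within(k):.2f}%")
```

The loop prints approximately 68.27%, 95.45%, and 99.73%; the last value is the basis for the 3σ control limits discussed below.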

Introduction to Control Charts


Variation
The variation may be quite large and easily noticeable, such as the height of human beings, or
the variation may be very small.
v When variations are very small, it may appear that items are identical; however, precision
instruments will show differences.
v There are three categories of variation in piece-part production:
v Within-piece variation is illustrated by the surface roughness of a piece, wherein one
portion of the surface is rougher than another portion, or the width of one end of a
keyway varies from the other end.
v Piece-to-piece variation occurs among pieces produced at the same time.
v Thus, the light intensity of four consecutive light bulbs produced from a machine will be
different.
v Time-to-time variation is illustrated by the differences in product produced at different
times of the day.
v Thus, product produced in the early morning is different from that produced later in the
day.
v Categories of variation for other types of processes, such as continuous and batch, are
not exactly the same; however, the concept is similar.
v Variation is present in every process due to a combination of the equipment, materials,
environment, and operator.
v The first source of variation is the equipment, the second source is the material, the third
source is the environment, and the fourth is the operator.

Subgroup Size and Method


The data that are plotted on the control chart consist of groups of items called rational subgroups.
It is important to understand that data collected in a random manner do not qualify as rational.
v A rational subgroup is one in which the variation within the group is due only to chance
causes. This within-subgroup variation is used to determine the control limits.
v Variation between subgroups is used to evaluate long-term stability.
v Subgroup samples are selected from product or a service produced at one instant of time
or as close to that instant as possible, such as four consecutive parts from a machine or
four documents from a tray.
Decisions on the size of the sample or subgroup require a certain amount of empirical judgment;
however, some helpful guidelines are:
• As the subgroup size increases, the control limits become closer to the central
value, which makes the control chart more sensitive to small variations in the
process average.
• As the subgroup size increases, the inspection cost per subgroup increases.
Does the increased cost of larger subgroups justify the greater sensitivity?
• When costly and/or destructive testing is used and the item is expensive, a small
subgroup size of two or three is necessary, because it will minimize the
destruction of expensive product.
• Because of the ease of computation, a sample size of five is quite common in
industry; however, now that inexpensive electronic hand calculators are available,
this reason is no longer valid.

Trial central lines and control limits

The trial central lines and control limits for the X̄ and R charts are

UCL_X̄ = X̿ + A2·R̄    LCL_X̄ = X̿ − A2·R̄
UCL_R = D4·R̄    LCL_R = D3·R̄

where X̿ is the grand average of the subgroup averages, R̄ is the average subgroup range, and
A2, D3, and D4 are factors that vary with the subgroup size and are found in Appendix A.
v For the X̄ chart, the upper and lower control limits are symmetrical about the central
line. Theoretically, the control limits for an R chart should also be symmetrical about the
central line.
v But for this situation to occur, with subgroup sizes of six or less, the lower control limit
would need to have a negative value.
v Because a negative range is impossible, the lower control limit is located at zero by
assigning to D3 the value of zero for subgroups of six or less.
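As a sketch of the calculation, the standard tabulated factors for a subgroup size of four are used below; the grand average and average range are hypothetical values, not from the text:

```python
# X-bar and R chart trial control limits for subgroup size n = 4.
# A2, D3, D4 are the standard tabulated factors (Appendix A in most texts).
A2, D3, D4 = 0.729, 0.0, 2.282   # values for n = 4

xbarbar = 6.40   # grand average of subgroup averages (hypothetical)
rbar = 0.038     # average subgroup range (hypothetical)

ucl_x = xbarbar + A2 * rbar
lcl_x = xbarbar - A2 * rbar
ucl_r = D4 * rbar
lcl_r = D3 * rbar   # zero for subgroup sizes of six or less

print(round(ucl_x, 4), round(lcl_x, 4), round(ucl_r, 4), lcl_r)
```

Note how D3 = 0 forces the lower control limit of the R chart to zero, exactly as described above.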

State of control

Ø When the assignable causes have been eliminated from the process to the extent
that the points plotted on the control chart remain within the control limits, the
process is in a state of control.
Ø No higher degree of uniformity can be attained with the existing process.
Ø However, greater uniformity can be attained through a change in the basic
process resulting from quality improvement ideas.
Ø When a process is in control, there occurs a natural pattern of variation, which is
illustrated by the control chart in Figure.

When a process is in control, certain practical advantages accrue to the producer and
consumer:

Ø Individual units of the product will be more uniform or, stated another way, there will be
less variation.
Ø Because the product is more uniform, fewer samples are needed to judge the quality.
Ø The cost of inspection can be reduced to a minimum.
Ø The process capability, or spread of the process, is easily attained.
Ø With knowledge of the process capability, a number of reliable decisions relative to
specifications can be made, such as: the product specifications; the amount of rework or
scrap when there is insufficient tolerance; and whether to produce the product to tight
specifications and permit interchangeability of components, or to produce the product to
loose specifications and use selective matching of components.
Ø The percentage of product that falls within any pair of values can be predicted with the
highest degree of assurance.
Ø It permits the customer to use the supplier’s data and, therefore, to test only a few
subgroups as a check on the supplier’s records. The X̄ and R charts are used as statistical
evidence of process control.
Ø The operator is performing satisfactorily from a quality viewpoint. Further improvement
in the process can be achieved only by changing the input factors: materials, equipment,
environment, and operators. These changes require action by management.

OUT-OF-CONTROL PROCESS
Figure illustrates the effect of assignable causes of variation over time.
The unnatural, unstable nature of the variation makes it impossible to predict future variation.
The assignable causes must be found and corrected before a natural, stable process can
continue.
The term out of control is usually thought of as being undesirable; however, a process can also
be considered out of control even when the points fall inside the ±3σ limits.
This situation, as shown in Figure, occurs when unnatural runs of variation are present in the
process.
It is not natural for seven or more consecutive points to be above or below the central line, as
shown at (a). Another unnatural run occurs at (b), where six points in a row are steadily
increasing or decreasing. At (c), the space between the control limits is divided into four equal
bands of 1.5σ. The process is out of control when there are two successive points at 1.5σ or
beyond.
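The first of these run rules (seven or more consecutive points on the same side of the central line) can be checked mechanically; this is a sketch, and the sample points are hypothetical:

```python
def run_above_or_below(points, central, run_length=7):
    """Return True if run_length consecutive points fall on the same side of the central line."""
    streak = 0
    last_side = 0
    for p in points:
        side = 1 if p > central else (-1 if p < central else 0)
        if side != 0 and side == last_side:
            streak += 1
        else:
            streak = 1 if side != 0 else 0
        last_side = side
        if streak >= run_length:
            return True
    return False

# Seven points above a central line of 10: out of control even inside the limits.
print(run_above_or_below([11, 12, 11, 13, 12, 11, 12], 10))  # True
```

The same pattern of streak counting extends naturally to the other run rules (steadily increasing or decreasing points, successive points beyond 1.5σ).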

There are some common questions to ask when investigating an out-of-control process:

v Are there differences in the measurement accuracy of the instruments used?


v Are there differences in the methods used by different operators?
v Is the process affected by the environment? If so, have there been any changes?
v Is the process affected by tool wear?
v Were any untrained workers involved in the process?
v Has there been any change in the source of the raw materials?
v Is the process affected by operator fatigue?
v Has there been any change in maintenance procedures?
v Is the equipment being adjusted too frequently?
v Did samples come from different shifts, operators, or machines?

Process Capability
Control limits are established as a function of the averages; in other words, control limits are for
averages. Specifications, on the other hand, are the permissible variation in the size of the part
and are, therefore, for individual values.
v The specification or tolerance limits are established by design engineers to meet a
particular function.
v Figure shows that the location of the specifications is optional and is not related to any of
the other features in the figure.
v The control limits, process spread (process capability), distribution of averages, and
distribution of individual values are interdependent.
v They are determined by the process, whereas the specifications have an optional location.
Control charts cannot determine if the process is meeting specifications.

v The true process capability cannot be determined until the X̄ and R charts have achieved
the optimal quality improvement without a substantial investment in new equipment or
equipment modification.
v When the process is in statistical control, the process capability is equal to 6σ0, where
σ0 = R̄/d2 and d2 is a factor from Appendix Table A.

v It is frequently necessary to obtain the process capability by a quick method rather than
by using the X̄ and R charts.
v This method assumes the process is stable, or in statistical control, which may or may not
be the case.
The procedure is as follows:

Relationship of process capability to tolerance

Remember that this technique does not give the true process capability and should be used only
if circumstances require its use.
Also, more than 25 subgroups can be used to improve accuracy.
The relationship of process capability and specifications is shown in Figure.
Tolerance is the difference between the upper specification limit (USL) and the lower
specification limit (LSL). Process capability and the tolerance are combined to form a capability
index, defined as

Cp = (USL − LSL) / 6σ

If the capability index is greater than 1.00, the process is capable of meeting the specifications; if
the index is less than 1.00, the process is not capable of meeting the specifications.
Ø Because processes are continually shifting back and forth, a Cp value of 1.33 has become
a de facto standard, and some organizations use a 2.00 value.
Ø Using the capability index concept, we can measure quality, provided the process is
centered. The larger the capability index, the better the quality.
Ø We should strive to make the capability index as large as possible.
Ø This result is accomplished by having realistic specifications and by continually striving
to improve the process capability.
Ø The capability index does not measure process performance in terms of the nominal or
target value. This measure is accomplished using Cpk, which is

Cpk = min( USL − X̿, X̿ − LSL ) / 3σ
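The two indices are one-line calculations; the specification limits, mean, and standard deviation below are hypothetical values chosen for illustration:

```python
def cp(usl, lsl, sigma):
    """Capability index: tolerance divided by the 6-sigma process spread."""
    return (usl - lsl) / (6 * sigma)

def cpk(usl, lsl, mean, sigma):
    """Capability index measured relative to the process center."""
    return min(usl - mean, mean - lsl) / (3 * sigma)

# Hypothetical process: USL = 6.50, LSL = 6.30, mean = 6.38, sigma = 0.02.
print(round(cp(6.50, 6.30, 0.02), 2))          # 1.67
print(round(cpk(6.50, 6.30, 6.38, 0.02), 2))   # 1.33
```

Because the mean (6.38) is off the specification center (6.40), Cpk is smaller than Cp; the two are equal only for a perfectly centered process.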

Quality professionals use these eight items to improve the process. For example, if a Cpk value
is less than one, then corrective action must occur.
v Initially, 100% inspection is necessary to eliminate nonconformities.
v One solution would be to increase the tolerance of the specifications.
v Another would be to work on the process to reduce its standard deviation, or variability.

Different Control Charts for Variables


Although most of the quality control activity for variables is concerned with the X̄ and R charts,
there are other charts that find application in some situations.
Control Charts for Attributes
An attribute, as defined in quality, refers to those quality characteristics that conform to
specifications or do not conform to specifications. There are two types:
Ø Where measurements are not possible, for example, visually inspected items such as
color, missing parts, scratches, and damage.
Ø Where measurements can be made but are not made because of time, cost, or need.
Ø In other words, although the diameter of a hole can be measured with an inside
micrometer, it may be more convenient to use a “go–no go” gauge and determine whether
it conforms or does not conform to specifications.
Where an attribute does not conform to specifications, various descriptive terms are used. A
nonconformity is a departure of a quality characteristic from its intended level.

The term nonconforming unit is used to describe a unit of product or service containing at least
one non conformity.
v Defective is analogous to nonconforming unit and is appropriate for use when a unit of
product or service is evaluated in terms of usage rather than conformance to
specifications.
v In this section we use the terms nonconformity and nonconforming unit.
v This practice avoids the confusion and misunderstanding that occur with defect and
defective in product-liability lawsuits.
v Variable control charts are an excellent means for controlling quality and subsequently
improving it; however, they do have limitations.
v One obvious limitation is that these charts cannot be used for quality characteristics that
are attributes.
v The converse is true, because a variable can be changed to an attribute by stating that it
conforms or does not conform to specifications.
v In other words, nonconformities such as missing parts, incorrect color, and so on, are not
measurable, and a variable control chart is not applicable.
v Another limitation concerns the fact that there are many variables in a manufacturing
entity.

v Even a small manufacturing plant could have as many as 1,000 variable quality
characteristics.
v Because X̄ and R charts are needed for each characteristic, 1,000 charts would be
required.

There are two different groups of control charts for attributes. One group of charts is for
nonconforming units.
Ø A proportion, p, chart shows the proportion nonconforming in a sample or subgroup.
Ø The proportion is expressed as a fraction or a percent. Another chart in the group is for
number nonconforming, np.
Ø Another group of charts is for nonconformities.
Ø A c chart shows the count of nonconformities in an inspected unit such as an
automobile, bolt of cloth, or roll of paper.
Ø Another closely-related chart is the u chart, which is for the count of nonconformities per
unit.
Ø Much of the information on control charts for attributes is similar to that already given in
Variable Control Charts. Also see the information on State of Control.

Objectives of the Chart


The objectives of attribute charts are to
• Determine the average quality level. Knowledge of the quality average is essential as a
benchmark. This information provides the process capability in terms of attributes.
• Bring to the attention of management any changes in the average. Changes, either
increasing or decreasing, become significant once the average quality is known.
• Improve the product quality. In this regard, an attribute chart can motivate operating and
management personnel to initiate ideas for quality improvement.
• Evaluate the quality performance of operating and management personnel. Supervisors
should be evaluated by a chart for nonconforming units.
• Suggest places to use X̄ and R charts. Even though the cost of computing and charting X̄
and R charts is greater than that of charts for attributes, they are much more sensitive to
variations and are more helpful in diagnosing causes. In other words, the attribute chart
suggests the source of difficulty, and the X̄ and R charts find the cause.
• Determine acceptance criteria of a product before shipment to the customer. Knowledge
of attributes provides management with information on whether or not to release an
order.
Use of the Chart
The general procedures that apply to variable control charts also apply to the p chart. The first
step in the procedure is to determine the use of the control chart. The p chart is used for data that
consist of the proportion of the number of occurrences of an event to the total number of
occurrences. It is used in quality control to report the proportion nonconforming in a product,
quality characteristic, or group of quality characteristics. As such, the proportion nonconforming
is the ratio of the number nonconforming in a sample or subgroup to the total number in the
sample or subgroup. In symbolic terms, the equation is

p = np / n

Where p = proportion (fraction or percent) nonconforming in the sample or subgroup


n = number in the sample or subgroup
np = number nonconforming in the sample or subgroup
The p chart is an extremely versatile control chart. It can be used to control one quality
characteristic, as is done with the X̄ and R charts; to control a group of quality characteristics of
the same type or of the same part; or to control the entire product.
v The p chart can be established to measure the quality produced by a work center, by a
department, by a shift, or by an entire plant.
v It is frequently used to report the performance of an operator, group of operators, or
management as a means of evaluating their quality performance.
Subgroup Size
The second step is to determine the size of the subgroup. The subgroup size of the p chart can be
either variable or constant. A constant subgroup size is preferred; however, there may be many
situations, such as changes in mix and 100% automated inspection, where the subgroup size
changes.
• Therefore, the selection of the subgroup size requires some preliminary observations to
obtain a rough idea of the proportion nonconforming, and some judgment as to the
average number of nonconforming units that will make an adequate graphical chart.
• A minimum subgroup size of 50 is suggested as a starting point. Inspection can be
either by audit or on-line.

Trial Central Lines and Control Limits


The fourth step is the calculation of the trial central line and control limits. The average
proportion nonconforming, p̄, is the central line, and the control limits are established at 3σ. The
equations are

UCL = p̄ + 3√( p̄(1 − p̄) / n )    LCL = p̄ − 3√( p̄(1 − p̄) / n )

v In the example, calculations for the lower control limit resulted in a negative value, which
is only a theoretical result. Therefore, the lower control limit value of −0.005 is changed
to zero.
v When the lower control limit is positive, it may in some cases be changed to zero.
v If the p chart is to be viewed by operating personnel, it would be difficult to explain why
a proportion nonconforming that is below the lower control limit is out of control. In
such cases, the lower control limit is left off the chart.
In this manner, exceptionally good performance (below the lower control limit) will be treated as
an out-of-control situation and be investigated for an assignable cause.
It is hoped that the assignable cause will indicate how the situation can be repeated.
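The trial limit calculation, including the replacement of a negative lower limit by zero, can be sketched as follows; the p̄ and subgroup size used are hypothetical:

```python
import math

def p_chart_limits(pbar, n):
    """Trial 3-sigma control limits for a p chart with subgroup size n."""
    spread = 3 * math.sqrt(pbar * (1 - pbar) / n)
    ucl = pbar + spread
    lcl = max(0.0, pbar - spread)   # a negative lower limit is replaced by zero
    return ucl, lcl

# Hypothetical average proportion nonconforming of 0.02 with subgroups of 100.
ucl, lcl = p_chart_limits(0.02, 100)
print(round(ucl, 4), lcl)  # 0.062 0.0
```

Here p̄ − 3σ is negative (0.02 − 0.042), so the lower control limit is set to zero, as the text describes.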

Revised Central Line and Control Limits


The fifth step is completed by discarding any out-of-control points that have assignable
causes and recalculating the central line and control limits.

Scatter Diagrams
The simplest way to determine if a cause-and-effect relationship exists between two variables is
to plot a scatter diagram.
Ø Figure shows the relationship between automotive speed and gas mileage. The figure
shows that as speed increases, gas mileage decreases.
Ø Automotive speed is plotted on the x-axis and is the independent variable. The
independent variable is usually controllable.
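One simple numerical companion to a scatter diagram is the correlation coefficient, which summarizes the strength and direction of the relationship. The speed and mileage figures below are hypothetical, chosen to mimic the decreasing trend in the text:

```python
import math

# Hypothetical paired data: speed (mph) vs. gas mileage (mpg).
speed = [30, 40, 50, 60, 70]
mpg = [35, 33, 30, 26, 22]

n = len(speed)
mx, my = sum(speed) / n, sum(mpg) / n
sxy = sum((x - mx) * (y - my) for x, y in zip(speed, mpg))
sxx = sum((x - mx) ** 2 for x in speed)
syy = sum((y - my) ** 2 for y in mpg)

r = sxy / math.sqrt(sxx * syy)   # correlation coefficient
print(round(r, 3))
```

A value near −1, as here, indicates the strong negative relationship that the scatter plot shows visually; a value near zero would indicate no linear relationship.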

. SIX SIGMA
v The roots of six sigma as a measurement standard can be traced back to Carl Friedrich
Gauss (1777–1855), who introduced the concept of the normal curve.
v Walter Shewhart showed that three sigma from the mean is the point where a process
requires correction.
requires correction.

v The objective of six sigma quality is to reduce process output variation so that six
standard deviations lie between the mean and the nearest specification limit.
v This will allow no more than 3.4 defective parts per million (PPM) opportunities, also
known as defects per million opportunities (DPMO), to be produced.
Sigma
Sigma (σ) is a Greek letter used to represent the distribution or spread (variation) about the
mean of a process.
• In manufacturing processes, σ is used to evaluate the capability of the process to
produce defect-free output, leading to customer satisfaction.
• It enables us to make comparisons with similar and dissimilar products and services.
• It is essential to understand the relationship between the sigma value and its implication
for meeting the required specifications of a process.
• The process capability is estimated from data collected on a process under statistical
control (only chance-cause variations are present).
• σ is called the standard deviation in a stable process (a process under statistical control).
Predictions about the process can be made from knowledge of σ.
• The pattern of variation of measurable characteristics (diameters, lengths, life, etc.) will
conform to the statistical model known as the normal distribution.
• The total process variation is ±3σ.
• A property of the normal distribution is that 99.73% of the entire output will be within
±3σ; only 0.27% of the output remains beyond ±3σ.

This sort of result can be expected only if the middle value of the specification is set at the mean
level, or at the centre; but in the actual process, the mean value may not be at the middle.
When this happens, even though the process is not affected by any assignable-cause variation
and is within the chance-cause variation level only, we call it a positive shift, or movement
toward the higher side.
Positive Shift
When the process level shifts to the positive side, there is no change in spread, but there is an
increase in defects on the positive side.

Negative Shift
When the process level shifts to the negative side without any change in spread, there is an
increase in defects on the negative side.

When we are clear that the process is capable of giving the desired output level, then with the
help of control charts the process can be monitored to ensure that it is kept under control, or
within the acceptable level.
v Random sampling can be used to monitor the process and also to identify the assignable
causes, so that necessary action can be taken to bring the process back under control.
v The Japanese quality guru Dr. Hitoshi Kume changed the system of stating defects from
percentage to parts per million to bring about a psychological change in the people
involved in the process.
v While 0.27% may look small, 2,700 ppm looks big, and this makes people react and act,
which can help in reducing defects.
v When we start taking improvement efforts, the chance causes of variation slowly start
disappearing, and improvements reduce the ±3σ spread to a smaller span, as shown in
the figure.

Six Sigma
Six sigma strategies can be used in an organization to achieve incredible levels of efficiency.
Ø The defects can be brought down to a level of 3.4 parts per million.
Ø This level allows for a shift of 1.5σ. If the process can be centered properly, the value can
be still smaller (two defects per billion).
Ø We check the output, and if it is prone to be on the lower or higher side (even if within
tolerance), we adjust the process until we get an acceptable level of performance, i.e.,
when the output is close to the mean value of the specification and the variations occur
on the positive as well as the negative side of the defect level.
Ø This necessitates a realistic margin to be provided for dynamic variations (shifts and
drifts); therefore, the long-term capability is more significant in the six sigma approach.
Motorola stipulated 1.5σ as the typical mean shift after a lot of empirical research.
Therefore, 1.5σ is the mean shift used in determining long-term capability.
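The 3.4 DPMO figure follows from the normal model when the mean is allowed to drift 1.5σ toward the nearer specification limit. As a sketch (counting only defects beyond that nearer limit, since the far tail is negligible):

```python
import math

def norm_cdf(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def dpmo(sigma_level, shift=1.5):
    """Long-term defects per million opportunities with a one-sided mean shift of 1.5 sigma."""
    # Fraction of output beyond the nearer specification limit after the mean shifts.
    tail = 1 - norm_cdf(sigma_level - shift)
    return tail * 1_000_000

print(round(dpmo(6), 1))  # 3.4 defects per million (a 6-sigma process)
print(round(dpmo(3)))     # 66807 defects per million (a 3-sigma process)
```

With the 1.5σ shift, a 6σ process performs like a 4.5σ process in the long term, which is exactly where the 3.4 DPMO figure comes from.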
For a 6σ process with tolerance T = 12σ:
v The process mean is at a distance of 6σ from both the LSL and the USL.
v 3.4 defects/million implies a 1.5σ mean shift.
v Such a process is 6σ compliant: the short-term capability of the process is 6σ, and the
long-term performance is 4.5σ.
Units Used to Measure Defects
The different units used to measure defects are:

Six Sigma and Process Capability


The sigma level corresponding to the defect levels can be found from defects-per-million or six
sigma tables.
v The higher the Cp (capability index) value, the lower the variation.
v The higher the Cpk (achieved capability index) value, the lower the variation and the
nearer the process average will be to the target.
v A process is said to be 6σ when Cp ≥ 2 and Cpk ≥ 1.5.

THE NEW SEVEN MANAGEMENTAND PLANNING TOOLS


The “advanced” tools that are used to manage cross-functionality include the “seven new QC
tools,” also known as the “seven management and planning tools” or “7 MP tools.”
These are summarized and illustrated in Mizuno (1979) and Brassard (1989).
• Identify a system owner and team members for each critical system.
• Describe the system under study.
• Identify all subsystems that contribute to the critical system.
• Define the interdependencies of the subsystems.
• Prioritize the subsystems as to their contribution to the critical system.
• Develop a detailed “as is” description of the critical system.
• This includes identifying the interfaces between all system components as well as
expanding the level of detail for major contributing subsystems.
• Identify obvious system deficiencies.
• Identify possible causes of system deficiencies.
• Establish “base line” measures for the system and major subsystems.
• Assess the performance of the system and major subsystems.
• Develop a “should be” description of the system and subsystems.
• Recommend changes to improve system and subsystem performance (even if this may
mean creating a new system)

These are very important tools used in management and planning activities. They are mainly used
to organize unorganized, unstructured, complex ideas and bring them into a proper structure.
The new seven management and planning tools are:
v Affinity diagrams
v Interrelationship diagram
v Tree diagram
v Matrix diagram
v Matrix data analysis
v Process decision programme chart and
v Arrow diagram.

Affinity Diagrams
In affinity diagrams, large volumes of data are gathered and organized. Ideas, opinions and facts
relating to a problem are grouped.
• A sequence or pattern formation is the main aim. This is mainly useful in addressing
issues such as customer dissatisfaction etc.
• Affinity diagrams are tools for verbal data.
• It is applied to organize a large number of ideas and opinions about a particular topic
into groups.

Interrelationship Diagram
The relationships between different causative factors and the main issue are established.
v This tool helps in identifying the relationships between the different factors which cause a
problem or issue.
v It also helps in determining the interrelationship between these factors.
v This tool is used to identify the major causes which help in solving a problem on the
basis of logical analysis and linkage of causes associated with the problems.

Tree Diagram
Tree diagram is listed as a tool for non-numerical data. It is used to show the relationships
between an issue and its component elements.
• This is a tool used for operational planning after initial diagnosis of issues.
• The final chart shows steps from the initial problem to the sequential development till the
final conclusion.
• It looks more like a tree; therefore it is called a tree diagram.

Matrix Diagram
A matrix diagram consists of a set of columns and rows. The intersections of these rows and
columns are checked for determining the nature and strength of the problem.
v This helps us to arrive at the key ideas, determine the relationships and find an effective
way of pursuing the problem.
v The ideas are conceived on two dimensional relationship basis.
Matrix Data Analysis
It is a multi variate analysis technique also known as the “principal component analysis”. This
technique quantifies and arranges data presented in a matrix diagram to find more general
indicators that would differentiate and give clarity to large amounts of complexly intertwined
information.
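Since matrix data analysis is principal component analysis, the idea can be sketched for two variables with the standard library alone, using the closed-form eigenvalues of the 2×2 sample covariance matrix (the data values below are invented for illustration):

```python
import math

def principal_components_2d(xs, ys):
    """Eigenvalues of the 2x2 sample covariance matrix, largest first.
    The largest eigenvalue's share of the total is the variance
    explained by the first principal component."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    # Closed-form eigenvalues of the symmetric matrix [[sxx, sxy], [sxy, syy]].
    mean_var = (sxx + syy) / 2
    radius = math.sqrt(((sxx - syy) / 2) ** 2 + sxy ** 2)
    return mean_var + radius, mean_var - radius

# Strongly correlated data: one component carries nearly all the variance.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.1, 2.0, 2.9, 4.2, 5.0]
lam1, lam2 = principal_components_2d(xs, ys)
print(round(lam1 / (lam1 + lam2), 3))  # share of variance in component 1
```

When most of the variance collapses onto one component like this, the two original variables can be summarized by a single more general indicator, which is exactly the purpose of the technique.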

Process Decision Programme Chart (PDPC)


It is a method which maps out conceivable events and contingencies that can occur in any
implementation plan along with appropriate counter measures
v This tool is used to plan each possible chain of events that needs to occur when the
problem or goal is an unfamiliar one.
v This is a qualitative tool.
Thus PDPC is useful whenever uncertainty exists in a proposed implementation plan. The
background situation for a PDPC application is:
• The task on hand is either new or unique
• The implementation plan has sufficient complexity
• The consequences of failure are serious
• Efficiency of implementation is critical
• Contingencies must be plannable.

Arrow Diagram
Arrow diagram is a tool to plan the most appropriate schedule for the completion of a complex
task and its related sub-tasks.
v It projects likely completion time and monitors all sub-tasks for adherence to necessary
schedule.
v The total work or task is broken down to sub-tasks or activities.
v The sub-tasks and the total work are linked by arrows, and a diagram is constructed to depict
the activities.
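The arrow-diagram idea of projecting the likely completion time can be sketched as a forward pass over a small activity network (the activities, durations and dependencies here are invented for illustration):

```python
def earliest_finish(durations, predecessors):
    """Forward pass: earliest finish time of each activity, given its
    duration and the activities that must finish before it starts."""
    finish = {}
    def ef(act):
        if act not in finish:
            start = max((ef(p) for p in predecessors.get(act, [])), default=0)
            finish[act] = start + durations[act]
        return finish[act]
    for act in durations:
        ef(act)
    return finish

# Hypothetical sub-tasks of a project with their durations (days):
durations = {"A": 3, "B": 2, "C": 4, "D": 1}
predecessors = {"C": ["A", "B"], "D": ["C"]}
finish = earliest_finish(durations, predecessors)
print(finish["D"])  # project completion: 3 + 4 + 1 = 8 days
```

Monitoring then amounts to comparing each sub-task's actual finish time against these projected values.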

Assignment –III

Part-A
1. What is SPC?
2. What are the seven QC tools?
3. What is the Pareto principle?
4. What is a cause and effect diagram?
5. Define a frequency histogram.
6. What are the different types of control charts?
7. How is data different from information?
8. Define population and sample.
9. Define process capability.
10. What are the new seven management tools?

Part-B

1. Explain in detail seven QC tools.


2. Explain in detail various types of control charts.
3. Explain in detail the various aspects of population and sampling techniques.
4. What is six sigma? Explain the five stages of six sigma.
5. Explain in detail the new seven management tools.
6. Explain in detail process capability and process capability index.
7. Explain the following.
i) Different levels of ppm with shift and without shift.
ii) Normal distribution.
iii) Mean mode, median and standard deviation.
iv) Specification limits and control limits.
