
CLASSIFICATION OF STANDARDS

Primary, Secondary, Tertiary & Working Standards:
Primary standard: Only one such material standard exists; it is preserved under the most careful conditions and is used only for comparison with the secondary standard.
Secondary standard: It is made to match the primary standard as nearly as possible, is distributed to a number of places for safe custody, and is used for occasional comparison with tertiary standards.
Tertiary standard: It is used for reference purposes in laboratories and workshops and for comparison with working standards.
Working standard: It is used daily in laboratories and workshops; lower grades of material may be used.
Examples
Line & End Standards:
In the Line standard, the length is the distance
between the centres of engraved lines
whereas in End standard, it is the distance
between the end faces of the standard.
Examples: a measuring scale is a line standard; a gauge block (slip gauge) is an end standard.
Definitions
Five major elements
For a measurement system, these elements
contribute to the variability of a measurement
process:
1) the standard
2) the workpiece
3) the instrument
4) the people and
5) the environment
HISTORY
The earliest standards were based on the human body.
4000 BC: the common unit was a king's elbow = 1.5 feet, 2 hand-spans, 6 hand widths, or 24 finger thicknesses.
AD 1101: the yard became standard, defined as the distance from the nose of King Henry I to the tip of his finger.
1600s: various standards were based on the length of a pendulum with a given period.
1799: the first definition of the meter, and a standard block was made.
1870s: new international standard meter = one ten-millionth (10^-7) of the distance from the north pole to the equator.
1960: the meter was officially defined as 1,650,763.73 wavelengths in vacuum of the orange light given off by electrically excited krypton-86.
Definitions
Measuring instrument:
A device used to inspect, measure, test, or examine parts
in order to determine compliance with required
specifications.
Gage:
A device that determines whether or not a part feature is
within specified limits.
Most gages do not provide an actual measurement value.
However, measuring instruments are also sometimes
called gages.
Definitions
Accuracy
Quantitative measure of the degree of conformance to
recognized national or international standards of measurement
Repeatability
Measure of the ability of a machine to sequentially position a tool with respect to a workpiece, or to produce a part within a small range, under similar conditions.
Resolution
The smallest change in a measured value that the instrument
can detect.
The least increment of a measuring device; the least significant
bit on a digital machine.
Resolution is also known as sensitivity.
Precision: The ability of the instrument to reproduce its readings or observations again and again for a constant input signal.
Range: The limits of the measurement values that an instrument is capable of reading; the dimension being measured must fit inside this range.
The physical variable is measured between two values: the higher calibration value Hc and the lower calibration value Lc.
True size: Theoretical size of a dimension which is free from errors.
If a precision measuring instrument is carefully calibrated and its constant error of measurement is known in advance, then the accurate (true) value can be obtained as follows:
True value = Measured value - Error
Hence a calibrated precision measuring instrument is more reliable and is therefore used in metrological laboratories.
Actual size: Size obtained through measurement with permissible error.

Error: Error in measurement is the difference between the measured value and the true value of the measured dimension.
Error in measurement = Measured value - True value
Correction: Correction is defined as a value which is added algebraically to the uncorrected result of the measurement to compensate for an assumed systematic error.
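As a minimal sketch with hypothetical numbers (the values and variable names below are illustrative, not from the source), the two relations above can be applied directly:

# Minimal illustrative sketch (hypothetical values): applying the relations
# Error = Measured value - True value and Correction = -Error.
measured_value = 25.004   # reading indicated by the instrument, in mm
true_value = 25.000       # value of the calibrated standard, in mm

error = measured_value - true_value   # systematic error of the reading
correction = -error                   # value added algebraically to the reading

corrected_result = measured_value + correction
print(error, correction, corrected_result)   # approximately 0.004, -0.004, 25.0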
Stability: The ability of a measuring instrument to
retain its calibration over a long period of time.
Stability determines an instrument's consistency over
time.
Standard: A recognized true value. Calibration must
compare measurement values to a known standard.
Span: The algebraic difference between the higher and lower calibration values: Span = Hc - Lc.
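For example (illustrative figures), an instrument calibrated from Lc = 0 mm to Hc = 25 mm has a range of 0-25 mm and a span of 25 - 0 = 25 mm.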
Dead Zone: It is the largest change in the physical
variable to which the measuring instrument does not
respond.
Threshold: The minimum value of the input signal that is required to produce a detectable change in the output, starting from zero.
Backlash: The maximum distance through which one
part of the instrument is moved without disturbing
the other part.
Bias: A characteristic of a measure or measuring instrument whereby the indicated values of a measured quantity have an average that differs from the true value.
Response Time: The time taken by the instrument to begin responding to a change in the measured quantity.
Magnification: The magnitude of the output signal of the measuring instrument is increased many times to make it more readable.
Drift: If an instrument does not reproduce the same reading at different times of measurement for the same input signal, it is said to exhibit measurement drift.
Reproducibility: The consistency of the pattern of variation in measurement; the closeness of the agreement between the results of measurements of the same quantity when the individual measurements are carried out under changed conditions (for example, by different operators).
Uncertainty: The range about the measured value within which the true value of the measured quantity is likely to lie, at the stated level of confidence.
Traceability: Establishing a calibration by step-by-step comparison with better standards.
TRACEABILITY
The chain of calibrations (the genealogy) that establishes the value of a standard or measurement.
In the U.S., traceability for most physical and some chemical standards goes back to NIST.
From Basic Laboratory Methods for Biotechnology: Textbook and Laboratory Reference, Seidman and Moore, 2000.
TRACEABILITY

Note how, in this catalog example, the standard is stated to be traceable to NIST.
STABILITY
Ability of a measuring instrument's
metrological characteristics to remain constant
over time. (paraphrased from the ISO
International guide of basic and general terms
in metrology, 1993; item 5.14.)
From this, stability is a property of an individual measuring instrument: its variation over time.
Precision vs Accuracy
Precision

Repeatability problem: the same person (or station) can't get the same result twice on the same subject (within-inspector error).
Reproducibility problem: different people (or stations) can't agree on the result obtained on the same subject (between-inspector error).

Repeatability: the ability of the same gage to give consistent measurement readings no matter how many times the same operator of the gage repeats the measurement process.
Reproducibility: the ability of the same gage to give consistent measurement readings regardless of who performs the measurements. The evaluation of a gage's reproducibility, therefore, requires measurement readings to be acquired by different operators under the same conditions.

Of course, in the real world there are no gages or measuring devices that give exactly the same measurement readings every time for the same parameter. (Example: cloning.)
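To make the distinction concrete, here is a minimal sketch in Python with made-up readings (the operator names and values are hypothetical, and this is a simplified illustration rather than a full gage R&R study):

import statistics

# Hypothetical readings: three operators each measure the same part five times (mm).
readings = {
    "operator_A": [10.01, 10.02, 10.00, 10.01, 10.02],
    "operator_B": [10.03, 10.04, 10.03, 10.05, 10.04],
    "operator_C": [10.00, 10.01, 10.00, 10.02, 10.01],
}

# Repeatability: average spread of readings within each operator (equipment variation).
repeatability = statistics.mean(statistics.stdev(vals) for vals in readings.values())

# Reproducibility: spread of the operator averages (appraiser variation).
reproducibility = statistics.stdev(statistics.mean(vals) for vals in readings.values())

print(f"repeatability   ~ {repeatability:.4f} mm")
print(f"reproducibility ~ {reproducibility:.4f} mm")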
Error: The difference between the measured value and the true value is known as measurement error.
Error = Vm - Vt, where Vm is the measured value and Vt is the true value.
Reliability: It is defined as the probability that a given
system will perform its function adequately for its specified
period of lifetime under specified operating conditions.
Hysteresis: Not all of the energy put into a stressed component on loading is recovered upon unloading, so the output of the measurement partially depends on the loading history of the input; this effect is called hysteresis.
SELECTION OF MEASURING INSTRUMENTS
Accuracy
The degree of agreement of the measured dimension
with its true magnitude
Magnification (amplification)
Precision
Resolution
the smallest dimension that can be read on an instrument
Rule of 10 (gage maker's rule)
at least 10 times as accurate as the tolerance to be measured (see the example after this list)
Sensitivity
Stability (drift); capability to maintain calibrated status
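For instance (illustrative numbers), if a part tolerance is 0.1 mm, the rule of 10 suggests choosing an instrument whose accuracy (and resolution) is about 0.01 mm or finer.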
READABILITY
Readability is a term frequently used for analog measurement. Readability depends on both the instrument and the observer.
Readability is defined as the closeness with which the scale
of an analog instrument can be read.
It is the susceptibility of a measuring instrument to having
its indications converted to a meaningful number. It implies
the ease with which observations can be made accurately.
For better readability, the instrument scale should be as large as possible.
SENSITIVITY
Sensitivity of the instrument is defined as the
ratio of the magnitude of the output signal to
the magnitude of the input signal.
It denotes the smallest change in the measured variable to which the instrument responds.
Sensitivity has no unique unit; its units depend on the instrument or measuring system.
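For example (illustrative figures), if a dial indicator's pointer sweeps 5 mm along the scale for a plunger movement of 0.05 mm, its sensitivity is 5 / 0.05 = 100.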
CALIBRATION
A known input is given to the measurement system.
If the output deviates from the given input, corrections are made in the instrument and the output is measured again.
This process is called calibration.
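As a minimal sketch (the calibration points and readings below are hypothetical), the comparison step can be pictured as recording, for each known input, the correction needed to bring the instrument's indication back to the standard value:

# Minimal illustrative sketch (hypothetical values): compare instrument readings
# against known standard inputs and record the correction for each point.
standard_inputs = [0.0, 25.0, 50.0, 75.0, 100.0]     # known values applied to the system
indicated_reads = [0.2, 25.1, 50.3, 75.2, 100.4]     # what the instrument indicated

# Correction = value added algebraically to the indicated reading.
corrections = [std - ind for std, ind in zip(standard_inputs, indicated_reads)]

for std, ind, corr in zip(standard_inputs, indicated_reads, corrections):
    print(f"standard {std:6.1f}   indicated {ind:6.1f}   correction {corr:+.2f}")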
