
Micrometer



This article is about the measuring device. For the unit of length, see Micrometre.


Modern micrometer (value: 1.64 millimetres)


Outside, inside, and depth micrometers
A micrometer (/maɪˈkrɒmɪtər/), sometimes known as a micrometer screw gauge, is a device incorporating a calibrated screw widely used for precise measurement of components[1] in mechanical engineering and machining as well as most mechanical trades, along
with other metrological instruments such as dial, vernier, and digital calipers. Micrometers are
usually, but not always, in the form of calipers (opposing ends joined by a frame), which is
why micrometer caliper is another common name. The spindle is a very accurately machined
screw and the object to be measured is placed between the spindle and the anvil. The spindle is
moved by turning the ratchet knob or thimble until the object to be measured is lightly touched by
both the spindle and the anvil.
Micrometers are also used in telescopes or microscopes to measure the apparent diameter of
celestial bodies or microscopic objects. The micrometer used with a telescope was invented about
1638 by William Gascoigne, an English astronomer.
Colloquially the word micrometer is often shortened to mike or mic (/maɪk/).
Contents
1 History of the device and its name
2 Types
o 2.1 Basic types
o 2.2 Specialized types
3 Operating principles
4 Parts
5 Reading
o 5.1 Imperial system
o 5.2 Metric system
o 5.3 Vernier
6 Torque repeatability via torque-limiting ratchets or sleeves
7 Calibration: testing and adjusting
o 7.1 Testing
o 7.2 Adjustment
8 See also
9 References
o 9.1 Bibliography
10 External links
History of the device and its name


Gascoigne's Micrometer as drawn by Robert Hooke
The word micrometer is a neoclassical coinage from Greek micros, meaning "small", and metron,
meaning "measure". The Merriam-Webster Collegiate Dictionary
[2]
says that English got it from
French and that its first known appearance in English writing was in 1670. Neither the metre nor
the micrometre nor the micrometer (device) as we know them today existed at that time. However,
the people of that time did have much need for, and interest in, the ability to measure small things
and small differences. The word was no doubt coined in reference to this endeavor, even if it did not
refer specifically to its present-day senses.
The first ever micrometric screw was invented by William Gascoigne in the 17th century, as an
enhancement of the vernier; it was used in a telescope to measure angular distances between stars
and the relative sizes of celestial objects.
Henry Maudslay built a bench micrometer in the early 19th century that was jocularly nicknamed "the
Lord Chancellor" among his staff because it was the final judge on measurement accuracy and
precision in the firm's work.
The first documented development of handheld micrometer-screw calipers was by Jean Laurent Palmer of Paris in 1848;[3] the device is therefore often called palmer in French, and tornillo de Palmer ("Palmer screw") in Spanish. (Those languages also use the micrometer cognates: micromètre, micrómetro.) The micrometer caliper was introduced to the mass market in anglophone countries by Brown & Sharpe in 1867,[4] allowing the penetration of the
instrument's use into the average machine shop. Brown & Sharpe were inspired by several earlier
devices, one of them being Palmer's design. In 1888 Edward Williams Morley added to the precision
of micrometric measurements and proved their accuracy in a complex series of experiments.
The culture of toolroom accuracy and precision, which started with interchangeability pioneers
including Gribeauval, Tousard, North, Hall, Whitney, and Colt, and continued through leaders such
as Maudslay, Palmer, Whitworth, Brown, Sharpe, Pratt, Whitney, Leland, and others, grew during
the Machine Age to become an important part of combining applied science with technology.
Beginning in the early 20th century, one could no longer truly master tool and die making, machine
tool building, or engineering without some knowledge of the science of metrology, as well as the
sciences of chemistry and physics (for metallurgy, kinematics/dynamics, and quality).
Types
Basic types


Large micrometer caliper.


Another large micrometer in use.
The topmost image shows the three most common types of micrometer; the names are based on
their application:
Outside micrometer (aka micrometer caliper), typically used to measure wires, spheres, shafts
and blocks.
Inside micrometer, used to measure the diameter of holes.
Depth micrometer, measures depths of slots and steps.
Specialized types
Each type of micrometer caliper can be fitted with specialized anvils and spindle tips for particular
measuring tasks. For example, the anvil may be shaped in the form of a segment of screw thread, in
the form of a v-block, or in the form of a large disc.
Universal micrometer sets come with interchangeable anvils, such as flat, spherical, spline,
disk, blade, point, and knife-edge. The term universal micrometer may also refer to a type of
micrometer whose frame has modular components, allowing one micrometer to function as
outside mic, depth mic, step mic, etc. (often known by the brand names Mul-T-Anvil and Uni-Mike).
Blade micrometers have a matching set of narrow tips (blades). They allow, for example, the
measuring of a narrow o-ring groove.
Pitch-diameter micrometers (aka thread mics) have a matching set of thread-shaped tips for
measuring the pitch diameter of screw threads.
Limit mics have two anvils and two spindles, and are used like a snap gauge. The part being
checked must pass through the first gap and must stop at the second gap in order to be within
specification. The two gaps accurately reflect the top and bottom of the tolerance range.
Bore micrometer, typically a three-anvil head on a micrometer base used to accurately
measure inside diameters.
Tube micrometers have a cylindrical anvil positioned perpendicularly to the spindle and are used
to measure the thickness of tubes.
Micrometer stops are micrometer heads that are mounted on the table of a manual milling
machine, bedways of a lathe, or other machine tool, in place of simple stops. They help the
operator to position the table or carriage precisely. Stops can also be used to actuate kickout
mechanisms or limit switches to halt an automatic feed system.
Ball micrometers have ball-shaped (spherical) anvils. They may have one flat and one ball
anvil, in which case they are used for measuring tube wall thickness, distance of a hole to an
edge, and other distances where one anvil must be placed against a rounded surface. They
differ in application from tube micrometers in that they may be used to measure against rounded
surfaces which are not tubes, but the ball anvil may also not be able to fit into smaller tubes as
easily as a tube micrometer. Ball micrometers with a pair of balls can be used when single-
tangential-point contact is desired on both sides. The most common example is in measuring the
pitch diameter of screw threads (which is also done with conical anvils or the 3-wire method, the latter of which uses geometry similar to the pair-of-balls approach).
Bench micrometers are tools for inspection use whose accuracy and precision are around half
a micrometre (20 millionths of an inch, "a fifth of a tenth" in machinist jargon) and
whose repeatability is around a quarter micrometre ("a tenth of a tenth"). An example is the Pratt
& Whitney Supermicrometer brand.
Digit mics are the type with mechanical digits that roll over.
Digital mics are the type that uses an encoder to detect the distance and displays the result on
a digital screen.
V mics are outside mics with a small V-block for an anvil. They are useful for measuring the
diameter of a circle from three points evenly spaced around it (versus the two points of a
standard outside micrometer). An example of when this is necessary is measuring the diameter
of 3-flute endmills and twist drills.
Operating principles


Animation of a micrometer used to measure an object (black) of length 4.14 mm
Micrometers use the principle of a screw to amplify small distances (that are too small to measure
directly) into large rotations of the screw that are big enough to read from a scale. The accuracy of a
micrometer derives from the accuracy of the thread-forms that are at its heart. In some cases it is
a differential screw. The basic operating principles of a micrometer are as follows:
1. The amount of rotation of an accurately made screw can be directly and precisely correlated
to a certain amount of axial movement (and vice versa), through the constant known as the
screw's lead (/liːd/). A screw's lead is the distance it moves forward axially with one complete turn (360°). (In most threads [that is, in all single-start threads], lead and pitch refer to essentially the same concept.)
2. With an appropriate lead and major diameter of the screw, a given amount of axial
movement will be amplified in the resulting circumferential movement.
For example, if the lead of a screw is 1 mm, but the major diameter (here, outer diameter) is 10 mm,
then the circumference of the screw is 10π mm, or about 31.4 mm. Therefore, an axial movement of
1 mm is amplified (magnified) to a circumferential movement of 31.4 mm. This amplification allows a
small difference in the sizes of two similar measured objects to correlate to a larger difference in the
position of a micrometer's thimble. In some micrometers, even greater accuracy is obtained by using
a differential screw adjuster to move the thimble in much smaller increments than a single thread
would allow.[5][6][7]
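The amplification arithmetic above can be made concrete with a short, illustrative Python sketch (the function name is hypothetical; the values are the 1 mm lead and 10 mm major diameter from the example):

```python
import math

def circumferential_travel(lead_mm: float, major_diameter_mm: float, axial_travel_mm: float) -> float:
    """Circumferential distance swept at the screw's major diameter for a given axial travel.

    One full turn advances the screw by one lead and sweeps one circumference,
    so the amplification factor is circumference / lead.
    """
    circumference_mm = math.pi * major_diameter_mm
    return axial_travel_mm * circumference_mm / lead_mm

# Example values from the text: 1 mm lead, 10 mm major diameter, 1 mm of axial movement.
print(round(circumferential_travel(1.0, 10.0, 1.0), 1))  # 31.4 mm of circumferential movement
```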

In classic-style analog micrometers, the position of the thimble is read directly from scale markings
on the thimble and shaft. A vernier scale is often included, which allows the position to be read to a
fraction of the smallest scale mark. In digital micrometers, an electronic readout displays the length
digitally on an LCD display on the instrument. There also exist mechanical-digit versions, like the
style of car odometers where the numbers "roll over".
Parts


The parts of a micrometer caliper, labeled. (Notice also that there is a handy decimal-fraction equivalents chart
printed right on the frame of this inch-reading micrometer.)
A micrometer is composed of:
Frame
The C-shaped body that holds the anvil and barrel in constant relation to each other. It is
thick because it needs to minimize flexion, expansion, and contraction, which would distort
the measurement.
The frame is heavy and consequently has a high thermal mass, to prevent substantial
heating up by the holding hand/fingers. It is often covered by insulating plastic plates which
further reduce heat transference.
Explanation: if you hold the frame long enough that it heats up by 10 °C, then the increase in length of any 10 cm linear piece of steel is of magnitude 1/100 mm, which is comparable to a micrometer's typical accuracy range (a worked version of this estimate appears in the sketch after this parts list).
Micrometers typically have a specified temperature at which the measurement is correct (often 20 °C [68 °F], which is generally considered "room temperature" in a room with HVAC). Toolrooms are generally kept at 20 °C [68 °F].
Anvil
The shiny part that the spindle moves toward, and that the sample rests against.
Sleeve / barrel / stock
The stationary round part with the linear scale on it; it sometimes also carries vernier markings.
Lock nut / lock-ring / thimble lock
The knurled part (or lever) that one can tighten to hold the spindle stationary, such as when
momentarily holding a measurement.
Screw
(not seen) The heart of the micrometer, as explained under "Operating principles". It is inside
the barrel. (No wonder that the usual name for the device in German is Messschraube,
literally "measuring screw".)
Spindle
The shiny cylindrical part that the thimble causes to move toward the anvil.
Thimble
The part that one's thumb turns. Graduated markings.
Ratchet stop
(not shown in illustration) Device on end of handle that limits applied pressure by slipping at
a calibrated torque.
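As a rough, illustrative check of the thermal-expansion estimate given under Frame above (the coefficient is an assumed typical value for steel of about 11.5 × 10⁻⁶ per °C, and the function name is hypothetical):

```python
ALPHA_STEEL_PER_C = 11.5e-6  # assumed linear expansion coefficient for steel, 1/°C

def expansion_mm(length_mm: float, delta_t_c: float, alpha_per_c: float = ALPHA_STEEL_PER_C) -> float:
    """Change in length (mm) for a given temperature rise, using delta_L = alpha * L * delta_T."""
    return alpha_per_c * length_mm * delta_t_c

# 10 cm (100 mm) of steel warming by 10 °C, as in the Frame explanation:
print(round(expansion_mm(100.0, 10.0), 4))  # ~0.0115 mm, i.e. on the order of 1/100 mm
```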
Reading
Imperial system


Micrometer thimble showing 0.276 inch
The spindle of an imperial micrometer has 40 threads per inch, so
that one turn moves the spindle axially 0.025 inch (1 ÷ 40 = 0.025),
equal to the distance between two graduations on the frame. The
25 graduations on the thimble allow the 0.025 inch to be further
divided, so that turning the thimble through one division moves the
spindle axially 0.001 inch (0.025 ÷ 25 = 0.001). Thus, the reading is
given by the number of whole divisions that are visible on the scale
of the frame, multiplied by 25 (the number of thousandths of an
inch that each division represents), plus the number of that division
on the thimble which coincides with the axial zero line on the frame.
The result will be the diameter expressed in thousandths of an inch.
As the numbers 1, 2, 3, etc., appear below every fourth sub-division
on the frame, indicating hundreds of thousandths, the reading can
easily be taken.
Suppose the thimble were screwed out so that graduation 2, and
three additional sub-divisions, were visible (as shown in the image),
and that graduation 1 on the thimble coincided with the axial line on
the frame. The reading would then be 0.200 + 0.075 + 0.001, or 0.276 inch.
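The assembly of an imperial reading can be expressed as a small, illustrative Python sketch (the function and argument names are hypothetical; the constants are the 0.100 inch numbered divisions, 0.025 inch sub-divisions, and 0.001 inch thimble graduations described above):

```python
def imperial_reading(numbered_divisions: int, sub_divisions: int, thimble_division: int) -> float:
    """Combine the visible scale elements of an inch micrometer into a reading in inches.

    numbered_divisions: numbered frame divisions visible (each 0.100 inch)
    sub_divisions:      additional 0.025 inch sub-divisions visible
    thimble_division:   thimble graduation on the axial line (each 0.001 inch)
    """
    return numbered_divisions * 0.100 + sub_divisions * 0.025 + thimble_division * 0.001

# Worked example from the text: graduation 2, three extra sub-divisions, thimble at 1.
print(round(imperial_reading(2, 3, 1), 3))  # 0.276
```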

Metric system


Micrometer thimble reading 5.78 mm
The spindle of an ordinary metric micrometer has 2 threads per
millimetre, and thus one complete revolution moves the spindle
through a distance of 0.5 millimetre. The longitudinal line on the
frame is graduated with 1 millimetre divisions and 0.5 millimetre
subdivisions. The thimble has 50 graduations, each being 0.01
millimetre (one-hundredth of a millimetre). Thus, the reading is
given by the number of millimetre divisions visible on the scale of
the sleeve plus the particular division on the thimble which
coincides with the axial line on the sleeve.
Suppose that the thimble were screwed out so that graduation 5,
and one additional 0.5 subdivision were visible (as shown in the
image), and that graduation 28 on the thimble coincided with the
axial line on the sleeve. The reading then would be 5.00 + 0.5 +
0.28 = 5.78 mm.
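The same arithmetic for a metric micrometer, as an illustrative sketch (hypothetical function name; the constants follow the 1 mm divisions, 0.5 mm sub-divisions, and 0.01 mm thimble graduations described above):

```python
def metric_reading(mm_divisions: int, half_mm_visible: bool, thimble_division: int) -> float:
    """Combine the visible scale elements of a metric micrometer into a reading in millimetres.

    mm_divisions:     whole-millimetre divisions visible on the sleeve
    half_mm_visible:  whether an additional 0.5 mm sub-division is exposed
    thimble_division: thimble graduation on the axial line (each 0.01 mm)
    """
    return mm_divisions * 1.0 + (0.5 if half_mm_visible else 0.0) + thimble_division * 0.01

# Worked example from the text: graduation 5, one extra 0.5 mm sub-division, thimble at 28.
print(round(metric_reading(5, True, 28), 2))  # 5.78
```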
Vernier


Micrometer sleeve (with vernier) reading 5.783 mm
Some micrometers are provided with a vernier scale on the sleeve
in addition to the regular graduations. These permit measurements
within 0.001 millimetre to be made on metric micrometers, or
0.0001 inches on inch-system micrometers.
The additional digit of these micrometers is obtained by finding the
line on the sleeve vernier scale which exactly coincides with one on
the thimble. The number of this coinciding vernier line represents
the additional digit.
Thus, the reading for metric micrometers of this type is the number
of whole millimetres (if any) and the number of hundredths of a
millimetre, as with an ordinary micrometer, and the number of
thousandths of a millimetre given by the coinciding vernier line on
the sleeve vernier scale.
For example, a measurement of 5.783 millimetres would be
obtained by reading 5.5 millimetres on the sleeve, and then adding
0.28 millimetre as determined by the thimble. The vernier would
then be used to read the 0.003 (as shown in the image).
Inch micrometers are read in a similar fashion.
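Adding the vernier digit extends the same sum by one more term, as in this illustrative sketch (hypothetical function name; each vernier line represents 0.001 mm as described above):

```python
def vernier_metric_reading(sleeve_mm: float, thimble_division: int, vernier_line: int) -> float:
    """Combine sleeve, thimble, and vernier readings of a metric vernier micrometer (millimetres).

    sleeve_mm:        value read from the sleeve scale, including any exposed 0.5 mm sub-division
    thimble_division: thimble graduation on the axial line (each 0.01 mm)
    vernier_line:     number of the coinciding vernier line (each 0.001 mm)
    """
    return sleeve_mm + thimble_division * 0.01 + vernier_line * 0.001

# Worked example from the text: 5.5 mm on the sleeve, 28 on the thimble, vernier line 3.
print(round(vernier_metric_reading(5.5, 28, 3), 3))  # 5.783
```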
Note: 0.01 millimetre = 0.000393 inch, and 0.002 millimetre =
0.000078 inch (78 millionths) or alternatively, 0.0001 inch =
0.00254 millimetres. Therefore, metric micrometers provide smaller
measuring increments than comparable inch-unit micrometers: the smallest graduation of an ordinary inch-reading micrometer is
0.001 inch; the vernier type has graduations down to 0.0001 inch
(0.00254 mm). When using either a metric or inch micrometer,
without a vernier, smaller readings than those graduated may of
course be obtained by visual interpolation between graduations.
Torque repeatability via torque-limiting ratchets or sleeves
A micrometer reading is not accurate if the thimble is overtorqued.
A useful feature of many micrometers is the inclusion of a torque-
limiting device on the thimble: either a spring-loaded ratchet or a
friction sleeve. Without this device, workers may overtighten the
micrometer on the work, causing the mechanical advantage of the
screw to squeeze the material or tighten the screw threads, giving
an inaccurate measurement. However, with a thimble that will
ratchet or friction slip at a certain torque, the micrometer will not
continue to advance once sufficient resistance is encountered. This
results in greater accuracy and repeatability of measurements,
most especially for low-skilled or semi-skilled workers, who may not
have developed the light, consistent touch of a skilled user.
Calibration: testing and adjusting
Testing
A standard one-inch micrometer has readout divisions of 0.001 inch and a rated accuracy of ±0.0001 inch[8] ("one tenth", in machinist
parlance). Both the measuring instrument and the object being
measured should be at room temperature for an accurate
measurement; dirt, abuse, and low operator skill are the main
sources of error.[9]

The accuracy of micrometers is checked by using them to
measure gauge blocks,[10] rods, or similar standards whose lengths
are precisely and accurately known. If the gauge block is known to
be 0.7500" .00005" ("seven-fifty plus or minus fifty millionths", that
is, "seven hundred fifty thou plus or minus half a tenth"), then the
micrometer should measure it as 0.7500". If the micrometer
measures 0.7503", then it is out of calibration. Cleanliness and low
torque are especially important when calibrating: each tenth (that is, ten-thousandth of an inch), or hundredth of a millimetre, "counts"; each is important. A mere speck of dirt, or a mere bit too much squeeze, obscures the truth of whether the instrument is able to read correctly. The solution is simply conscientiousness: cleaning, patience, due care and attention, and repeated
measurements (good repeatability assures the calibrator that
his/her technique is working correctly).
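The pass/fail arithmetic behind such a gauge-block check can be sketched as follows (an illustrative example only; the function name is hypothetical, and the tolerance is the ±0.0001 inch rated accuracy cited above):

```python
def within_calibration(measured_in: float, nominal_in: float, rated_accuracy_in: float = 0.0001) -> bool:
    """Return True if a measurement of a gauge block falls within the micrometer's rated accuracy."""
    return abs(measured_in - nominal_in) <= rated_accuracy_in

# Gauge block nominally 0.7500 inch; a reading of 0.7503 inch is out of calibration.
print(within_calibration(0.7500, 0.7500))  # True
print(within_calibration(0.7503, 0.7500))  # False
```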
Calibration typically records the error at approximately five points along the range, but only one of them can be adjusted to zero. If the micrometer is in good condition, then they are all so near to zero that the instrument seems to read essentially "dead-on" all along its range; no noticeable
error is seen at any locale. In contrast, on a worn-out micrometer
(or one that was poorly made to begin with), one can "chase the
error up and down the range", that is, move it up or down to any of
various locales along the range, by adjusting the barrel, but one
cannot eliminate it from all locales at once.
Calibration can also include the condition of the tips (flat and
parallel), any ratchet, and linearity of the scale.[11] Flatness and
parallelism are typically measured with a gauge called an optical
flat, a disc of glass or plastic ground with extreme accuracy to have
flat, parallel faces, which allows light bands to be counted when the
micrometer's anvil and spindle are against it, revealing their amount
of geometric inaccuracy.
Commercial machine shops, especially those that do certain
categories of work (military or commercial aerospace, nuclear
power industry, and others), are required by various standards
organizations (such as ISO, ANSI, ASME, ASTM, SAE, AIA, the
U.S. military, and others) to calibrate micrometers and other
gauges on a schedule (often annually), to affix a label to each
gauge that gives it an ID number and a calibration expiration date,
to keep a record of all the gauges by ID number, and to specify in
inspection reports which gauge was used for a particular
measurement.
Not all calibration is an affair for metrology labs. A micrometer can
be calibrated on-site anytime, at least in the most basic and
important way (if not comprehensively), by measuring a high-grade
gauge block and adjusting to match. Even gauges that are
calibrated annually and within their expiration timeframe should be
checked this way every month or two, if they are used daily. They
usually will check out OK as needing no adjustment.
The accuracy of the gauge blocks themselves is traceable through
a chain of comparisons back to a master standard such as
the international prototype metre. This bar of metal, like
the international prototype kilogram, is maintained under controlled
conditions at the International Bureau of Weights and
Measures headquarters in France, which is one of the
principal measurement standards laboratories of the world. These
master standards have extreme-accuracy regional copies (kept in
the national laboratories of various countries, such as NIST), and
metrological equipment makes the chain of comparisons. Because
the definition of the metre is now based on the speed of light, the international prototype metre is not quite as indispensable as it
once was. But such master gauges are still important for calibrating
and certifying metrological equipment. Equipment described as
"NIST traceable" means that its comparison against master
gauges, and their comparison against others, can be traced back
through a chain of documentation to equipment in the NIST labs.
Maintaining this degree of traceability requires some expense,
which is why NIST-traceable equipment is more expensive than
non-NIST-traceable. But applications needing the highest degree of
quality control mandate the cost.
Adjustment
A micrometer that has been tested and found to be off might be
restored to accuracy by recalibration. On most micrometers, a
small pin spanner is used to turn the barrel relative to the frame, so
that its zero line is repositioned relative to the screw and thimble.
(There is usually a small hole on the barrel to accept the spanner's
pin.)
This calibration procedure will cancel a zero error: the problem that
the micrometer reads nonzero when its jaws are closed.
However, if the error originates from the parts of the micrometer
being worn out of shape and size, then restoration of accuracy by
this means is not possible; rather, repair (grinding, lapping, or
replacing of parts) is required.
