I've always used 33K6-4-15-1 for micrometer calibration. Table A-1 of that procedure lists accuracy for micrometers of various resolutions and ranges; for 0.001" resolution, the stated accuracy is ±0.001" up to a 36" range. However, in looking at GGG-C-105C and ASME B89.1.13, both make a blanket statement about the maximum permissible error for outside micrometers: for example, ±0.0001" for 0-1" mics, ±0.0002" for 1-2" through 7-9" mics, and so on. B89.1.13 specifically states that this is independent of flatness and parallelism.
My question is this, then: when reading a 0.001" resolution micrometer, how can one possibly discriminate ±0.0001"? I see no way this is humanly possible when using gage blocks to calibrate the micrometer. The only way I could see this tolerance being achievable would be to use a procedure something like this for measurement error (accuracy) verification:
Use a supermicrometer, or similar standard.
Mount the micrometer such that the spindle is parallel to the spindle of the supermic.
Rotate the micrometer spindle to exactly 0.
Bring the supermic spindle into contact with the mic spindle.
Zero the supermic.
Retract the supermic to an approximate test point (.210, .420, .605, etc.)
Rotate the micrometer spindle to read exactly the appropriate test point.
Bring the supermic spindle into contact with the mic spindle.
Read the error on the supermic.
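The arithmetic behind that last step can be sketched like this. This is a minimal illustration only; the numbers are made-up assumptions, not values from any standard or real calibration:

```python
# Hedged sketch of the supermic comparison described above.
# Because the supermic was zeroed at the mic's 0 setting, it now
# reads the micrometer spindle's actual travel directly.
mic_setting = 0.2100       # mic thimble rotated to read exactly this, inch
supermic_reading = 0.2099  # actual spindle travel per the supermic (assumed)

# The micrometer indicated 0.2100" but the spindle actually moved
# 0.2099", so at this test point the mic reads about 0.0001" high.
error = round(mic_setting - supermic_reading, 4)
print(f"micrometer error at {mic_setting:.4f} in: {error:+.4f} in")
```

The point of the comparison is that the supermic, not the micrometer's own graduations, supplies the 0.0001" discrimination.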
Other than some method similar to this, using an indicating standard, the only way I could see to actually verify a 0.001" mic to the tolerances specified in GGG-C-105C and B89.1.13 would be to create several gage block stacks, in 0.0001" increments, bracketing each test point. Measure each stack with the micrometer being verified until you find the one that reads closest to the exact test point. That stack's deviation from the nominal value would indicate the actual micrometer deviation from nominal, in ten-thousandths of an inch.
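The bracketing method above can be sketched numerically. This is a hypothetical illustration only; the stack sizes and the "closest" stack are assumptions, and finding that stack at the bench would come down to feel:

```python
# Hedged sketch of the gage-block bracketing method.
TEST_POINT = 0.2100  # nominal test point, inch

# Gage block stacks bracketing the test point in 0.0001" increments
stacks = [round(TEST_POINT + i * 1e-4, 4) for i in range(-5, 6)]

# Suppose the 0.2102" stack turns out to be the one the micrometer
# reads closest to exactly 0.210" (an assumed bench result).
closest_stack = 0.2102

# If a 0.2102" stack reads as 0.210" on the mic, the mic reads low:
# error = indicated value minus the stack's true size.
mic_error = round(TEST_POINT - closest_stack, 4)
print(f"estimated micrometer error: {mic_error:+.4f} in")
```

This turns the 0.0001" discrimination problem over to the gage block stacks rather than the micrometer's graduations, which is the whole point of the bracketing approach.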
Can someone possibly enlighten me, because apparently I'm missing how I'm supposed to reliably read a 0.001 resolution micrometer to ±0.0001"?
