Micrometer Calibration - Resolution, Accuracy and Range


kgriff

#1
I've always used 33K6-4-15-1 for micrometer calibration. Table A-1 states accuracy for micrometers of various resolutions and ranges. For 0.001" resolution the accuracy is stated as ±0.001" up to 36" range. However, in looking at GGG-C-105C and ASME B89.1.13, both make a blanket statement that the maximum permissible error for outside micrometers is, for example, ±0.0001" for 0-1" mics, ±0.0002" for 1-2" through 7-9" mics, and so on. In the case of B89.1.13 it specifically states that this is independent of flatness and parallelism.
My question is this, then. When reading a 0.001" resolution micrometer, how can one possibly discriminate ±0.0001"? I see no way that this can be humanly possible when using gage blocks to calibrate the micrometer. The only way I could see this tolerance being achievable would be to use a procedure something like this, for measurement error (accuracy) verification:

1. Use a supermicrometer, or similar standard.
2. Mount the micrometer such that its spindle is parallel to the spindle of the supermic.
3. Rotate the micrometer spindle to exactly 0.
4. Bring the supermic spindle into contact with the mic spindle.
5. Zero the supermic.
6. Retract the supermic to an approximate test point (.210, .420, .605, etc.).
7. Rotate the micrometer spindle to read exactly the appropriate test point.
8. Bring the supermic spindle into contact with the mic spindle.
9. Read the error on the supermic.
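
The arithmetic behind those steps is just a subtraction at each test point. As a minimal sketch (the test points and supermic readings below are hypothetical example values, not data from any real calibration):

```python
# Sketch of the error calculation behind the supermic procedure above.
# The supermic was zeroed against the mic spindle at 0, so its reading
# at each test point is the true spindle displacement; the mic itself
# was set to read exactly the test point.
def mic_error(test_point, supermic_reading):
    """Micrometer error at a test point, as indicated by the supermic."""
    return supermic_reading - test_point

# Hypothetical readings: mic set to each test point, supermic then read.
readings = {0.2100: 0.21008, 0.4200: 0.41996, 0.6050: 0.60503}
for tp, sm in readings.items():
    print(f'{tp:.4f}"  error = {mic_error(tp, sm):+.5f}"')
```

Because the error is read on the supermic's 0.0001" scale rather than the mic's 0.001" scale, the mic's own resolution never limits the verification.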

Other than some method similar to this, using an indicating standard, the only way I could see to actually verify a 0.001" mic to the tolerances specified in GGG-C-105C and B89.1.13 would be to create several gage block stacks, in 0.0001" increments, bracketing each test point. Measure each gage block stack, using the micrometer being verified, until the stack is found that measures closest to the exact test point. The deviation of this stack from the nominal value would give an indication of the actual micrometer deviation from nominal, in ten-thousandths of an inch.
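
One way to sanity-check that bracketing can recover sub-resolution error is to simulate it. This sketch works in integer tenths of a thousandth (0.0001") to keep the rounding exact; the mic's "true" error of +0.0003" is a made-up number for illustration:

```python
# Simulate the gage-block bracketing method at one test point.
# All sizes are integer tenths (0.0001"); the +3-tenth mic error
# is invented for illustration.
def mic_reading(true_tenths, error_tenths=3):
    """Simulated 0.001" mic: true size plus a fixed error, rounded
    half-up to the nearest 10 tenths (the mic's 0.001" resolution)."""
    return ((true_tenths + error_tenths + 5) // 10) * 10

test_point = 5000                                 # 0.5000" in tenths
stacks = range(test_point - 8, test_point + 9)    # +/-0.0008" bracket
# Every stack that the mic indicates as exactly 0.500"
matches = [s for s in stacks if mic_reading(s) == test_point]
# The midpoint of that run of stacks estimates the error, in tenths
estimate = test_point - sum(matches) / len(matches)
print(matches[0], matches[-1], estimate)
```

Note that with the mic rounding to 0.001", a whole run of ten consecutive stacks all indicate exactly 0.500"; a single "closest" stack is ambiguous, but the midpoint of the matching run recovers the error to within the half-graduation bias of the rounding, which is about the best a bracketing scheme can do.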

Can someone possibly enlighten me, because apparently I'm missing how I'm supposed to reliably read a 0.001 resolution micrometer to ±0.0001"?
 

BradM

Staff member
Admin
#2
Can someone possibly enlighten me, because apparently I'm missing how I'm supposed to reliably read a 0.001 resolution micrometer to ±0.0001"?
No light here. :D:tg:

Based on my experience and teaching, you cannot have a tolerance tighter than the resolution. You simply cannot verify something that you cannot read.

I'm not familiar with the standards you listed. I'm wondering if they are assuming that all the micrometers that are applicable have 4 place resolution. :)
 

bobdoering

Stop X-bar/R Madness!!
Trusted Information Resource
#3
I'm wondering if they are assuming that all the micrometers that are applicable have 4 place resolution. :)
And even if they did, there is not enough resolution in a gage that reads to .0001" to measure a spec of ±.0001". When you add gage error, you have no resolution left.
 

kgriff

#4
I'm wondering if they are assuming that all the micrometers that are applicable have 4 place resolution.
Unfortunately I don't know what they are assuming. That thought has crossed my mind many times, though. I haven't seen anything in either of the standards noted that says anything about the resolution of the mic. Of course, it's entirely possible that they do say something about resolution and I've just missed it. I will be looking again later today.
Likely, I'll wind up getting in contact with one of the committee members for that ASME standard and see what they say, unless someone here can provide enlightenment.
 

adamt

Involved In Discussions
#5
Good question! It is not just the Fed spec, ASME, or GIDEP. If you look up the accuracy of, say, a Mitutoyo 103-177, it is ±.0001" with a resolution of .001". They say that with a calibrated eyeball you can split that space into 10 divisions.
20 years of calibration experience tells me that some things will never be explained. We take an educated guess.
 

Hodgepodge

#6
Good question! It is not just the Fed spec, ASME, or GIDEP. If you look up the accuracy of, say, a Mitutoyo 103-177, it is ±.0001" with a resolution of .001". They say that with a calibrated eyeball you can split that space into 10 divisions.
20 years of calibration experience tells me that some things will never be explained. We take an educated guess.
adamt, I think you are right about the guessing. If you look through some of the other Fed-Specs you will see tolerances such as "1/5 least graduation". I do believe the intent was for you to make a reasonable determination by eye. Oddly enough, I couldn't find a Fed-Spec for calibration of eyeballs :tg:.
 

Hershal

Metrologist-Auditor
Staff member
Super Moderator
#7
When the resolution is 10 times the required accuracy, only two things are possible: change the spec or change the instrument.

Based on the conversation thus far, it appears uncertainty is not the subject of discussion in this case.
 

kgriff

#8
Based on the conversation thus far, it appears uncertainty is not the subject of discussion in this case.
Correct. My query was not related to uncertainty, but simply to stated accuracy. I'm just asking if anyone has any insight into the specifications stated in GGG-C-105C and ASME B89.1.13. Both state micrometer "maximum permissible error." In my example above, a 0-1" OD mic would have a "maximum permissible error" of ±0.0001". My question is, how is this possible with, for example, a 0.001" resolution micrometer?
I agree with you about change the spec or change the instrument. However, this is just a question about the intent of the standards, since they don't really make sense to me.
 

falconer65

#9
kgriff,

My interpretation would be to line up the 0.001" micrometer to within half of a graduation. That is, align the lines so that they just touch.

Then read from the supermic, which has a resolution of 0.0001". As pointed out earlier, old-school methods depended on skill to divide the 0.0001" graduations into fractions.

That is how I remember being taught 20+ years ago.
 

TOOLMAKER

#10
Hello All,
I have read the thread and would like to add a comment.

The Max Permissible Error is basically the total uncertainty associated with the micrometer. The accuracy of the micrometer can be finer than the resolution, since the lead screw of the micrometer is ground to the same requirements.

The method for calibration can be an interpolation of the vernier under magnification. The 4:1 vs 10:1 ratios are utilized to reduce measurement error. This is applied to the standards used for calibration, not to the part being measured. The accuracy/uncertainty of the standard should be 4 to 10 times tighter than the stated accuracy of the micrometer.

The GGG and ASME standards are identifying the uncertainty (MPE) for the micrometers. This is quantified by utilizing an uncertainty budget that accounts for the influence factors in the calibration, e.g., CTE, parallelism/flatness of the anvils, leadscrew hysteresis, etc.

The simplest answer is to assure that the .001" micrometer is only used to measure parts with a minimum tolerance of ±.004".
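
That last rule of thumb can be written out as a one-liner; `min_tolerance` is a hypothetical helper name, and the 4:1 ratio is applied per side of the tolerance band, matching the ±.004" example above:

```python
# Rule-of-thumb check: a gage should only be used on part tolerances
# at least `ratio` times its resolution (4:1 here, per the example).
def min_tolerance(resolution, ratio=4):
    """Smallest symmetric part tolerance (the +/- value) for which a
    gage of the given resolution is adequate under a ratio:1 rule."""
    return ratio * resolution

# A 0.001" mic is adequate only for +/-0.004" or looser work.
print(min_tolerance(0.001))
```

Swapping `ratio=10` gives the more conservative 10:1 version of the same rule.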

Hope this helps,
TM
 