Micrometer Calibration - Resolution, Accuracy and Range


dv8shane

Correct. My query was not related to uncertainty, but simply to the stated accuracy. I'm just asking if anyone has any insight into the specifications stated in GGG-G-105C and ASME B89.1.13. Both state a micrometer "maximum permissible error." In my example below, a 0-1" OD mic would have a "maximum permissible error" of ±0.0001". My question is, how is this possible with, for example, a 0.001" resolution micrometer?
I agree with you about changing the spec or changing the instrument. However, this is just a question about the intent of the standards, since they don't really make sense to me.
I have read the post but not the standards quoted. I would surmise that the amount of error allowed is more than likely related to equipment requirements. Therefore I would look at the end use, as perhaps a super micrometer is required to meet the measurement requirements.
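The root of the question can be shown with a minimal sketch in Python. The values below are assumptions for illustration, not requirements taken from GGG-G-105C or ASME B89.1.13; the point is simply that the worst-case rounding error of a 0.001" scale is already larger than a ±0.0001" MPE, which is why interpolation or a finer-reading sleeve comes up later in this thread.

```python
# Minimal sketch (assumed values, not requirements from any standard):
# compare a micrometer's reading resolution against a stated maximum
# permissible error (MPE).

RESOLUTION_IN = 0.001   # smallest graduation on the micrometer, inches
MPE_IN = 0.0001         # stated maximum permissible error, inches

# Worst-case rounding error from the scale alone is half a graduation.
rounding_error_in = RESOLUTION_IN / 2

if rounding_error_in > MPE_IN:
    print(f"Rounding error from resolution alone (±{rounding_error_in:.4f} in) "
          f"exceeds the MPE (±{MPE_IN:.4f} in); the scale must be interpolated "
          "or a finer-reading (tenths/vernier) sleeve used to verify it.")
else:
    print("Resolution is fine enough relative to the stated MPE.")
```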
 

aliasJohnQ

I need to know what the column "Deviation from" means when it's used to calibrate instruments.
It's labeled in the center of a form. From left to right the columns are: dimension, actual, deviation from, acceptable, and actions taken.
The deviation column lists: -8 mil, -7 mil, -4 mil, 0 mil.
What are they talking about?
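On calibration forms laid out like this, the "Deviation from" column is typically the actual reading minus the nominal dimension, which is then compared against the acceptance limit. A hedged sketch in Python; the nominal/actual pairs and the ±0.005" limit are assumptions for illustration, not taken from the form described:

```python
# Hypothetical illustration: deviation = actual reading - nominal dimension.
# The nominal/actual pairs and the acceptance limit are assumed values.

readings = [              # (nominal dimension, actual reading), inches
    (0.100, 0.092),       # deviation -8 mil
    (0.200, 0.193),       # deviation -7 mil
    (0.500, 0.496),       # deviation -4 mil
    (1.000, 1.000),       # deviation  0 mil
]
ACCEPT_LIMIT_IN = 0.005   # assumed acceptance limit, inches

for nominal, actual in readings:
    deviation_in = actual - nominal
    status = "acceptable" if abs(deviation_in) <= ACCEPT_LIMIT_IN else "out of tolerance"
    print(f"nominal {nominal:.3f}  actual {actual:.3f}  "
          f"deviation {deviation_in * 1000:+.0f} mil  -> {status}")
```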
 

jameslaz

This calibration issue has been bugging me for a while now. I have been calibrating gages for many years and I usually follow the rule that says your calibration standard must have a calibrated tolerance of at most 25% of the tolerance of the gage being calibrated; say, a .001 gage must use a calibration standard that is calibrated to .00025. However, what I don't know is what tolerance to calibrate to. I always like the idea of using the smallest graduation on the gage, i.e., calibrating a gage that reads to .001 to +/-.001. But I cannot find any specification that clearly defines what I am looking for. I read ASME B89.1.13-2001 and I do not agree with the general rule that a micrometer must be calibrated to +/-.00010. I agree that you cannot possibly interpret .0001 on a .001 scale. If anyone has any further insight into this I would love to hear from them. I wish there were an easy-to-understand specification that covered all standard gage types with a broader scope of tolerance limits based on the scale of the gage.
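For what it's worth, the 25% rule described above is the same arithmetic as a 4:1 test accuracy ratio. A minimal sketch; the function name and the 4:1 default are illustrative assumptions, not something taken from ASME B89.1.13:

```python
# Sketch of the 25% (4:1) accuracy-ratio rule of thumb described above.
# Assumed helper, not from any standard.

def required_standard_tolerance(gage_tolerance: float, ratio: float = 4.0) -> float:
    """Largest tolerance the reference standard may have under the given ratio."""
    return gage_tolerance / ratio

# Example from the post: a gage held to +/-.001 in needs a standard
# calibrated to +/-.00025 in under a 4:1 ratio.
print(required_standard_tolerance(0.001))   # 0.00025
```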
 

Jim Wynne

This calibration issue has been bugging me for a while now. I have been calibrating gages for many years and I usually follow the rule that says your calibration standard must have a calibrated tolerance of at most 25% of the tolerance of the gage being calibrated; say, a .001 gage must use a calibration standard that is calibrated to .00025. However, what I don't know is what tolerance to calibrate to. I always like the idea of using the smallest graduation on the gage, i.e., calibrating a gage that reads to .001 to +/-.001. But I cannot find any specification that clearly defines what I am looking for. I read ASME B89.1.13-2001 and I do not agree with the general rule that a micrometer must be calibrated to +/-.00010. I agree that you cannot possibly interpret .0001 on a .001 scale. If anyone has any further insight into this I would love to hear from them. I wish there were an easy-to-understand specification that covered all standard gage types with a broader scope of tolerance limits based on the scale of the gage.

It's better to think of it in terms of the measurements to be made with the calibrated device, and what amount of the product tolerance you can afford to lose to calibration error or, in a more general sense, measurement uncertainty. The thing that most of us need to be concerned about is our level of confidence as measurements approach a specification limit.
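One common way to put that idea into practice is guard banding: the acceptance limits are pulled inside the specification limits by the measurement uncertainty, so a reading near a limit is only accepted when there is confidence it is truly in spec. A hedged sketch with assumed values:

```python
# Illustrative guard-banding sketch; all values are assumptions.

LOWER_SPEC_IN = 0.9950            # product lower specification limit, inches
UPPER_SPEC_IN = 1.0050            # product upper specification limit, inches
EXPANDED_UNCERTAINTY_IN = 0.0004  # assumed expanded measurement uncertainty, inches

# Tighten each limit by the uncertainty to form the acceptance zone.
lower_accept = LOWER_SPEC_IN + EXPANDED_UNCERTAINTY_IN
upper_accept = UPPER_SPEC_IN - EXPANDED_UNCERTAINTY_IN

reading = 1.0048                  # example reading close to the upper limit
in_spec = LOWER_SPEC_IN <= reading <= UPPER_SPEC_IN
confident_accept = lower_accept <= reading <= upper_accept

print(f"acceptance zone: {lower_accept:.4f} to {upper_accept:.4f} in")
print(f"reading {reading:.4f} in -> in spec: {in_spec}, "
      f"accept with confidence: {confident_accept}")
```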
 

erbearica

Good question! It is not just the Fed spec, ASME, or GIDEP. If you look up the accuracy of, say, a Mitutoyo 103-177, it is +/- .0001" with a resolution of .001". They say that with a calibrated eyeball you can split that space into 10 divisions.
20 years of calibration experience tells me that some things will never be explained. We take an educated guess.

I went to training at Mitutoyo in LA this past summer and that is exactly how they do it. They encouraged me to calibrate everything out to the tenths.
For a couple of the toolmakers here I have swapped the sleeves out on their mics so they could have the tenths readings and take the guesswork out of it. This is a cheaper alternative to ordering new mics.
 