Micrometer and Vernier Caliper simple calibration process

#1
Hi all

My client (a small UK engineering company manufacturing turned, not ground, components) is calibrating its external micrometers (0-25mm) and vernier/digital calipers (0-150mm) in-house. To do this they are using a gauge block set which is calibrated externally to international standards by an approved laboratory.

My question is regarding the permissible observed error for these types of equipment. At the moment the calibration work instruction states:

"Verniers are used to measure to an accuracy of 0.015”. Vernier calipers must be accurate to one hundredth of this figure i.e. 0.000150” (There is currently no stated figures for micrometers or digital calipers in the work instruction.)

This is a small company, not a high-precision environment, so I am looking for a simple calibration procedure and, most importantly, realistic permitted observed error figures for calibrating the three types of equipment, please.

Thank you in advance.
 

Jim Wynne

Super Moderator
#2
The general expectation is 10:1, not 100:1. Perhaps descriptions are different in the UK than in the US, but this is a vernier caliper:

[Image: vernier caliper]


You're probably using dial calipers (in addition to digital calipers).
 
#3
I agree with Jim. And I don't believe that there is a vernier caliper available with a resolution of .00015". A more appropriate target would be ±0.0015".

As for the micrometers, they are inherently more accurate, and I generally set my system up to ±0.0005".
 
#4
Thank you Jim and Ron. Just to clarify:
I am aware of the difference between digital and vernier calipers; they have both on site.
Re the resolution of 0.000150": totally agree, hence the question.

This was picked up during an audit of measuring systems on the shop floor. During the audit I determined that the VERNIER 0-6" calipers and the MANUAL 0-1" micrometers were both actually being checked to 0.015". I believe this is overly generous, however, and that 0.0015" for verniers and 0.0005" for mics would be more realistic?
 
#5
The resolution and calibration of your measuring instrument need to be driven by the tolerance of the measurement. If you are measuring something with a tolerance of ±0.005" (total of 0.010"), then your resolution needs to be better than .001", and your calibration also needs to be better than .001" (1:10 rule).
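
In code form that arithmetic is just a division; here is a minimal Python sketch, with the function name and example numbers purely illustrative:

```python
# Minimal sketch of the N:1 accuracy-ratio arithmetic described above.
# The function name and the example numbers are illustrative only.

def max_instrument_error(total_tolerance_band: float, ratio: float = 10.0) -> float:
    """Largest acceptable instrument error/resolution for a given total tolerance band."""
    return total_tolerance_band / ratio

# Part toleranced at +/-0.005" -> total band of 0.010"
print(max_instrument_error(0.010))  # 0.001" -- resolution and calibration should be better than this
```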
 

Mike S.

#6
If you are measuring something with a tolerance of ±0.005" (total of 0.010") then your resolution needs to be better than .001", and your calibration also needs to be better than .001" (1:10 rule).
Ahhhh, the old "accuracy rules". They have caused a lot of problems and misunderstandings in their day and are rarely clearly defined as to exactly what they mean or intend to say, IMO.

And it's not always a "10:1 rule" -- sometimes that is not even possible at the state of the art, and it is often not practical or necessary. Sometimes it is a 4:1 "rule" set by the customer.

ARP9013 states: "It is recommended that measurement capability meets or exceeds a 4:1 ratio between the product tolerance and the certified accuracy of the measurement device to ensure identification of nonconforming product consistent with the quality parameters."

But in any event +/- .015" for calipers or micrometers is way too loose, IMO. I always saw a maximum of +/- .0015" for calipers and +/- .00015" for micrometers.
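
As a rough illustration of that 4:1 check (the feature tolerance below is hypothetical, and standards differ on whether "product tolerance" means the ± value or the total band, so confirm against your own requirements):

```python
# Rough illustration of the 4:1 ratio check quoted from ARP9013 above.
# The feature tolerance is hypothetical; check whether your standard means
# the +/- value or the total band when it says "product tolerance".

def accuracy_ratio(product_tolerance: float, device_accuracy: float) -> float:
    """Ratio of product tolerance to the device's certified accuracy."""
    return product_tolerance / device_accuracy

feature_tol = 0.002  # hypothetical +/- tolerance, inches
print(accuracy_ratio(feature_tol, 0.0015))  # caliper at +/-0.0015"   -> ~1.3:1, short of 4:1
print(accuracy_ratio(feature_tol, 0.0005))  # micrometer at +/-0.0005" -> 4:1, meets the guideline
```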
 
#7
Yes Mike, you are correct; there are times when 1:10 is not feasible, or when the customer has mandated some other ratio.

Calibration of a micrometer to ±.00015" is, in my experience, not realistic. Very few micrometers have a resolution to five decimal places, and the infinitesimal difference of half a tenth of a thousandth of an inch is so reliant on operator bias and handling that (again, in my opinion) even attempting it is futile. If I need to measure something THAT accurately, then I want something better than a micrometer :)
 

Mike S.

#9
Yes Mike, you are correct; there are times when 1:10 is not feasible, or when the customer has mandated some other ratio.

Calibration of a micrometer to ±.00015" is, in my experience, not realistic. Very few micrometers have a resolution to five decimal places, and the infinitesimal difference of half a tenth of a thousandth of an inch is so reliant on operator bias and handling that (again, in my opinion) even attempting it is futile. If I need to measure something THAT accurately, then I want something better than a micrometer :)
I'm not a metrologist. But our metrologist at a former company lived and breathed metrology; it was his hobby as much as his profession. He had lasers and all sorts of top-notch standards in his lab. He was a metrology ambassador with some professional society, etc. He knew his stuff. He considered our Mitutoyo digital mics calibrated if they read within +/- .0001" or .00015" (I can't remember which) of the standard. I think .0001". We would use them for tolerances of +/- .0004" or higher.
 
