Selecting reference standards for calibration


Involved In Discussions
I work for a company that makes driveshafts, in the metrology lab. Over the past couple of years, we have developed calibration processes for most of our internal gauges (calipers, micrometers, dial indicators, etc.). We just hired someone from another department and I am training her. Our work instruction (which I wrote) has a step that spells out the requirements for our reference standards, usually along these lines:
-Must be calibrated by an external ISO 17025 accredited calibration company
-Must be listed as a "Master" in the gauge software
-Must be calibrated to a minimum resolution of xxxxx
-Must be within 15-85% of the full range of the gauge

Now, that last one is what I have a question about. Some time ago, before I got into the actual process of calibrating items, I remember someone telling me that you never want to use a gauge to its full capacity. I'm not really sure of the logic behind it, but since he seemed to know what he was talking about at the time, it stuck with me. Now that I have been developing processes, I usually apply a 15-85% rule when selecting reference standards for calibration, so as not to reach the limits of the gauge. Since we do our own internal calibrations and we don't use most of the gauges for extremely accurate measurements, we usually just use 3-4 test points to check linearity and accuracy with ceramic gauge blocks and call it a day. Some of the issues I am running into are:

1) That step has a picture showing some reference standards for micrometers, including metal gauge blocks, a temp/humidity gauge, and gauge rings (for inside mics). The picture was never meant to show exactly what must be selected, just generic examples of the types of standards needed. She's not happy that it doesn't spell out exactly which reference standards to use. I've tried to explain that I can't list one set of reference standards to be used for all of them, because we have different sizes of micrometers (0-1", 1-2", 2-3", etc.), which is why I use the 15-85% rule to guide that decision, along with most of the calibration setups already being created in the gauge control software.

2) Maybe there is a better way to explain how to choose which standards to use. Is there something in a standard somewhere that states the 15-85% rule, or something like it about not reaching the full limits of the gauge? Or is that something I created with no real basis behind it?
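For what it's worth, the way I apply the 15-85% rule when picking blocks could even be written down as a small script for training purposes. This is purely illustrative; the function names, the example block sizes, and the 4-point default are my own choices here, not anything taken from a standard:

```python
from typing import List, Tuple

def usable_window(range_min: float, range_max: float,
                  low_frac: float = 0.15, high_frac: float = 0.85) -> Tuple[float, float]:
    """Return the 15-85% usable span of a gauge's range."""
    span = range_max - range_min
    return (range_min + low_frac * span, range_min + high_frac * span)

def pick_standards(range_min: float, range_max: float,
                   available_blocks: List[float], n_points: int = 4) -> List[float]:
    """Pick roughly evenly spaced gauge blocks inside the usable window."""
    lo, hi = usable_window(range_min, range_max)
    # Evenly spaced target values across the usable window.
    targets = [lo + i * (hi - lo) / (n_points - 1) for i in range(n_points)]
    candidates = [b for b in available_blocks if lo <= b <= hi]
    chosen: List[float] = []
    for t in targets:
        # Closest available block to each target, no duplicates.
        best = min((b for b in candidates if b not in chosen),
                   key=lambda b: abs(b - t), default=None)
        if best is not None:
            chosen.append(best)
    return sorted(chosen)

# Example: a 1-2" micrometer with blocks in 0.1" steps
blocks = [round(1.0 + 0.1 * i, 1) for i in range(11)]
print(pick_standards(1.0, 2.0, blocks))  # [1.2, 1.4, 1.6, 1.8]
```

For a 1-2" micrometer, the usable window works out to 1.15-1.85", so the script lands on four blocks spread across that window instead of the 1.0" and 2.0" extremes.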

I apologize for this being so long, but I wanted to get all of the details in there.


Involved In Discussions
ASME has standards for device calibration.
ASME B89.1.13-2013, the standard for micrometers, states this:
[image attachment: excerpt from ASME B89.1.13-2013 on selecting reference standards]

ASME B89.1.14-2018, for calipers, states this:
[image attachment: excerpt from ASME B89.1.14-2018 on selecting reference standards]


We typically calibrate calipers and micrometers at 5%, 50%, and 95% of range. I don't know exactly where we got that from, or whether it is anything more than common practice.
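Those points are just fractions of the gauge's span, so the nominal check values fall out of a one-liner (a sketch; the function name and the example ranges are mine):

```python
from typing import List, Tuple

def cal_points(range_min: float, range_max: float,
               fracs: Tuple[float, ...] = (0.05, 0.50, 0.95)) -> List[float]:
    """Nominal check points at the given fractions of the gauge's span."""
    span = range_max - range_min
    return [range_min + f * span for f in fracs]

# A 0-1" micrometer checks out near 0.05", 0.5", and 0.95";
# a 1-2" micrometer near 1.05", 1.5", and 1.95".
print(cal_points(0.0, 1.0))
print(cal_points(1.0, 2.0))
```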

Now... as to your observation about using instruments at the extreme low end or at 100% of their range: I too have always avoided that practice. Using something at, say, 1% of its operating range tells me it is not the correct tool/instrument for the measurement. If you're measuring 5 psig, I wouldn't recommend using a 0-500 psig gauge.

I do know that analytical balances have a sweet spot somewhere around 50% of full scale. Under USP <41>, one is supposed to challenge the balance at lower and lower loads until the readings at a test point become too varied. So for a 0-200 gram balance, you take ten readings at, say, 5 grams: the variation is acceptable. You try 2 grams: the variation is unacceptable. The minimum weight on that balance is then 5 grams.
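That pass/fail decision can be sketched as a repeatability check. I'm assuming the USP <41> criterion that two times the standard deviation of the ten readings, divided by the test load, must not exceed 0.10% (verify against the current chapter; the function names and example readings are mine):

```python
import statistics
from typing import Dict, List, Optional

def repeatability_ok(readings: List[float], test_load: float,
                     criterion: float = 0.0010) -> bool:
    """Pass if 2 x sample std dev of the readings, divided by the test
    load, does not exceed 0.10% (assumed USP <41> criterion)."""
    s = statistics.stdev(readings)  # sample standard deviation (n-1)
    return 2 * s / test_load <= criterion

def minimum_weight(results: Dict[float, List[float]]) -> Optional[float]:
    """Given {test_load: ten readings}, return the smallest load that
    still passes the repeatability check, or None if none pass."""
    passing = [load for load, readings in sorted(results.items())
               if repeatability_ok(readings, load)]
    return passing[0] if passing else None

# Example: tight readings at 5 g pass, noisier readings at 2 g fail.
r5 = [5.000, 5.001, 4.999, 5.000, 5.000, 5.001, 4.999, 5.000, 5.001, 4.999]
r2 = [2.000, 2.002, 1.998, 2.001, 1.999, 2.002, 1.998, 2.000, 2.002, 1.998]
print(minimum_weight({2.0: r2, 5.0: r5}))  # 5.0
```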