Calibration uncertainty

Jayfaas

Involved In Discussions
I have been doing gauge work for some time now, covering maintenance/repair and calibration, and have slowly built up knowledge while working under ISO 17025 programs, but it is only at my most recent job that I was able to dive into the calibration functions, developing uncertainty, and so forth. Typically, we calculated uncertainty using information from previous calibration certificates and also by taking 20 repeated measurements and calculating an uncertainty from those, combining the two into a standard uncertainty and then expanding to k=2. I have some questions that I am hoping to get cleared up today, since I have never had formal classes on uncertainty, just what we have traditionally done in the past.
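The combination described above (a repeatability study plus a certificate value, root-sum-squared and expanded to k=2) can be sketched in a few lines. This is only an illustration with made-up readings for a hypothetical 500 g weight; the certificate value and its k=2 assumption are placeholders, not real data.

```python
import math
import statistics

# Hypothetical repeatability data: 20 repeated readings of a 500 g weight (g).
readings = [500.02, 499.98, 500.01, 500.00, 499.99, 500.03, 500.01, 499.97,
            500.00, 500.02, 499.99, 500.01, 500.00, 499.98, 500.02, 500.01,
            499.99, 500.00, 500.01, 499.98]

# Type A: standard uncertainty of the mean from the repeated measurements.
s = statistics.stdev(readings)              # sample standard deviation
u_typeA = s / math.sqrt(len(readings))      # standard uncertainty of the mean

# Type B: expanded uncertainty from the reference standard's certificate.
# Assumed to be reported at k=2, so divide by 2 for the standard uncertainty.
U_cert_k2 = 0.010                           # g, placeholder certificate value
u_typeB = U_cert_k2 / 2

# Combine in quadrature (root-sum-of-squares), then expand to k=2 (~95 %).
u_combined = math.sqrt(u_typeA**2 + u_typeB**2)
U_expanded = 2 * u_combined
print(f"u_A = {u_typeA:.4f} g, u_B = {u_typeB:.4f} g, U (k=2) = {U_expanded:.4f} g")
```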

1) Is there a difference between calibration uncertainty and gauge uncertainty? When developing calibration procedures, I would think you would typically have an uncertainty that encompasses all of the gauges under that procedure, but I could be totally wrong on that. Or do you just have a procedure, while all gauges have their own individual uncertainties? For example, take weight scales. You may have 10 different scales that can all measure up to 500 g, but some may have 0.1 g resolution, some 1 g, some 0.001 g, and they may each have a different individual uncertainty. I guess I am trying to find out whether you have a blanket uncertainty for all gauges covered under that procedure, or whether the procedure is separate from the uncertainty.

2) What if you look up the manufacturer's data and they do not list an uncertainty, but do have accuracy specs? Is there a way to use accuracy toward uncertainty? And what if the uncertainty is not a hard number but comes from a formula such as 1.9+L/400?

dwperron

Trusted Information Resource
Jayfaas said: "Is there a difference between calibration uncertainty and gauge uncertainty? [...] I guess I am trying to find out if you have a blanket uncertainty for all gauges covered under that procedure, or the procedure is separate from uncertainty."

Gauge uncertainty is not a term I am familiar with. Calibration uncertainty is the uncertainty reported for the calibration of your standard. Suppose you are calibrating a micrometer using a 100 mm class 0 gauge block. Say the deviation of that block was measured as +0.12 µm, with a calibration uncertainty reported as 0.08 µm. You would need to include that calibration uncertainty as a term in your uncertainty budget for the micrometer.

As for a "blanket uncertainty" for a class of gauges, you may have a hard time justifying that. You would need to sample a significant part of your tool inventory to come up with something like a "blanket" repeatability term for your measurement uncertainty calculations, and an auditor might question how you can assume the number you get is relevant. I have always done my uncertainty calculations for a specific manufacturer/model; then I can say, with some confidence, that those instruments will all have essentially the same measurement uncertainty.

Jayfaas said: "What if you try to look up manufacturer's data for uncertainty and they do not have listed uncertainty, but do have accuracy specs? [...] What if it is an uncertainty that is not a hard number, but one within a formula such as 1.9+L/400?"

You can use manufacturer specifications in your uncertainty calculations as Type B contributors. For instance, the specifications for a balance might list terms for linearity, repeatability, resolution, eccentric (off-center pan) loading, etc. You take each of those numbers, normalize it to the appropriate distribution, and use it as an uncertainty term.

As for uncertainties given as a formula like the one you quote, you can substitute the length you are measuring for L in the formula. Another solution is to evaluate the worst-case uncertainty at the maximum of the instrument's range and use that.
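The normalize-then-combine step above can be sketched as a small budget. All the contributor values below are invented for illustration; the divisors follow the usual convention of √3 for a rectangular distribution and 2 for a certificate value reported at k=2, which you would confirm against your own data sheets.

```python
import math

# Hypothetical Type B contributors for a 500 g balance, taken from a
# manufacturer's data sheet (all values are made up for illustration).
contributors = {
    # name: (value in g, divisor for the assumed distribution)
    "linearity":      (0.002,     math.sqrt(3)),  # rectangular distribution
    "resolution":     (0.001 / 2, math.sqrt(3)),  # half a digit, rectangular
    "eccentric_load": (0.003,     math.sqrt(3)),  # rectangular
    "certificate":    (0.004,     2.0),           # U at k=2 -> divide by 2
}

# Normalize each contributor to a standard uncertainty, then RSS-combine.
u_terms = [value / divisor for value, divisor in contributors.values()]
u_combined = math.sqrt(sum(u**2 for u in u_terms))
U_expanded = 2 * u_combined  # k=2, ~95 % coverage
print(f"u_c = {u_combined:.5f} g, U (k=2) = {U_expanded:.5f} g")
```

The same spreadsheet logic extends to however many rows the data sheet gives you; only the value and divisor per row change.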

Jayfaas

Involved In Discussions
I guess I have been confused, because when I started delving into the world of uncertainty we used the "Type A and Type B" methods for gathering it. One method was repeating a measurement 20 times, which generates a standard uncertainty; the other was extracting uncertainty from calibration certificates. Once we calculated the combined and expanded uncertainty in a spreadsheet, we would use that for our calibration process. However, it would make sense that you can't have one calibration procedure covering micrometers up to 150 mm, because surely they would not all be the same, right? The value for a 0-25 mm gauge might be different from that for a 100-150 mm gauge. It has all been confusing.
We have a calibration company that changed some of their practices: they now have to report when they can't meet a 4:1 TUR. When they brought it up to us, we had no idea what that meant. They also told us they had to split up some of their uncertainties, and I believe it is for the same reason stated above, where maybe they were covering too much of a range under one uncertainty value. Is there a difference, then, between the uncertainty of a measurement system and calibration uncertainty? Or would you call it something other than measurement system uncertainty and factor it into the budget?
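For reference, the 4:1 TUR (test uncertainty ratio) check can be sketched as below. This uses one common definition (the tolerance span of the unit under test divided by twice the lab's k=2 expanded uncertainty, as in ANSI/NCSL Z540.3); the tolerance and uncertainty numbers are hypothetical.

```python
# Hypothetical unit under test: a scale with a tolerance of +/- 0.5 g,
# calibrated by a lab whose expanded uncertainty (k=2) is 0.15 g.
tol_low, tol_high = -0.5, 0.5   # g, tolerance limits
U_k2 = 0.15                     # g, lab's expanded uncertainty at k=2

# TUR = tolerance span / (2 * expanded uncertainty)
tur = (tol_high - tol_low) / (2 * U_k2)
print(f"TUR = {tur:.1f}:1")     # 1.0 / 0.3 -> about 3.3:1, below the 4:1 target
```

With these numbers the ratio falls below 4:1, which is exactly the situation where the lab would have to flag it on the certificate.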

Also, occasionally we get values from calibration certificates that don't really compute. Sometimes they are like the one above, 1.9+L/400 for length systems; other times we have gotten 0.82% of IV (indicated value) for torque wrenches. How do you convert those into a calculable format to put into the budget? Thank you in advance for the help.
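Both formula styles reduce to plugging in the point you are actually measuring. A minimal sketch, assuming the length spec gives µm for L in mm and that the results are expanded (k=2) values, which you would verify against the certificate's stated units and coverage factor:

```python
def length_uncertainty_um(L_mm: float) -> float:
    """Evaluate '1.9 + L/400' at a length L in mm (assumed result: µm, k=2)."""
    return 1.9 + L_mm / 400.0

def torque_uncertainty(indicated_value: float) -> float:
    """Evaluate '0.82 % of IV' at a reading (same units as the reading)."""
    return 0.0082 * indicated_value

# Evaluate at the actual calibration points, then divide by k before
# entering the result in the budget if it was reported as expanded.
U_len = length_uncertainty_um(100.0)   # 100 mm point -> 1.9 + 0.25 = 2.15 um
u_len_std = U_len / 2                  # standard uncertainty if reported at k=2
U_torque = torque_uncertainty(50.0)    # 50 N*m reading -> 0.41 N*m
print(U_len, u_len_std, U_torque)
```

In other words, a formula spec is just a different table lookup: one number per calibration point rather than one number for the whole range.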