Hi everyone,
I am working on creating an uncertainty budget for calibrating a digital caliper and a digital micrometer. One of the first factors I thought to include was resolution. Recently, I have begun to think that resolution should NOT be included, but every sample budget or source I can find does include it. I will explain my reasoning below (for a caliper, but it's the same for a micrometer) and hopefully people can chime in with their thoughts!
Okay, so when calibrating a caliper, the measurand is the difference between what the caliper is reading and the length of the gage block (i.e., the error of the caliper). I express this with the equation error_caliper = reading - L_nominal*(1 + CTE*delta_T). I am using the GUM method: take the partial derivatives of this equation to obtain the sensitivity coefficients, then combine them with the standard uncertainty of each variable in the equation.
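To make the propagation concrete, here is a minimal Python sketch of that GUM combination for my model. All of the numbers are placeholders just to show the mechanics, not values from my actual budget:

```python
import math

# Placeholder inputs (illustrative only, not my real numbers)
L_nom   = 25.0        # mm, nominal gage block length
cte     = 11.5e-6     # 1/degC, thermal expansion coefficient of the block
delta_T = 0.5         # degC, deviation from the 20 degC reference temperature

# Standard uncertainties of each input (also placeholders)
u_reading = 0.003     # mm, e.g. repeatability of the caliper reading
u_Lnom    = 0.0005    # mm, gage block calibration uncertainty
u_cte     = 1.0e-6    # 1/degC
u_dT      = 0.2       # degC

# Sensitivity coefficients = partial derivatives of
# E = reading - L_nom*(1 + cte*delta_T)
c_reading = 1.0
c_Lnom    = -(1.0 + cte * delta_T)
c_cte     = -L_nom * delta_T
c_dT      = -L_nom * cte

# Combined standard uncertainty, assuming the inputs are uncorrelated
u_c = math.sqrt(
    (c_reading * u_reading) ** 2
    + (c_Lnom * u_Lnom) ** 2
    + (c_cte * u_cte) ** 2
    + (c_dT * u_dT) ** 2
)
print(f"u_c(error_caliper) = {u_c * 1000:.2f} um")
```

The question below is really about what goes into u_reading in that sum: repeatability only, or repeatability plus a resolution term.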
This leads me to the uncertainty of the reading. The only contributor I think should be included is repeatability, not resolution. If we assume for a moment that repeatability is zero, then the uncertainty of the reading would be zero: it would be an exact value, since the reading could only ever be that value. There is no doubt about what the value truly is, because the value in my model is the reading itself and NOT the length of what is being measured.
The calibration situation I have described is in contrast to using a caliper to measure the length of a part. In that case, I would say the measurand is the length of the part, so resolution would be included, since the "true length" of the part could lie anywhere within a range of values for a given reading on the caliper (see the sketch below).
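For reference, here is how the resolution term is conventionally handled when it IS included: the display rounds to the nearest increment, so the half-width of the rounding interval is resolution/2 and a rectangular distribution gives u_res = (resolution/2)/sqrt(3). This short sketch (placeholder numbers again) just shows how much that term changes u(reading) compared with repeatability alone:

```python
import math

resolution = 0.01     # mm, display resolution of the caliper (assumed)
u_repeat   = 0.003    # mm, repeatability standard uncertainty (assumed)

# Rectangular distribution over the rounding interval +/- resolution/2
u_res = (resolution / 2.0) / math.sqrt(3.0)

u_with    = math.sqrt(u_repeat**2 + u_res**2)   # resolution included
u_without = u_repeat                            # resolution excluded

print(f"u_res               = {u_res * 1000:.2f} um")
print(f"u(reading), incl.   = {u_with * 1000:.2f} um")
print(f"u(reading), excl.   = {u_without * 1000:.2f} um")
```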
The fact that all the sources I have looked at seem to include resolution (even the Z540.3 handbook!) is causing me confusion. I would love to hear others' thoughts on this!
Thanks