Acceptable Testing Range for Calibration Studies on a Measurement Device

camcnown

What is considered an "acceptable" range of values for performing a calibration study on a measurement device? That is, how far outside the usable range of the device ought one to go when performing the calibration? The danger of going too far outside the range of use is that the device may lack linearity there, fail in that region, and then be flagged as a failed device. Is there some documented "acceptable" range, i.e. +/- some percentage, that is considered acceptable?
 

Jerry Eldred

Forum Moderator
Super Moderator
Re: Acceptable testing range for calibration studies

I think it is a little subjective. An MSA or GR&R study differs from calibration in that you want to understand the statistical properties of a measurement process, whereas calibration is designed to verify that an instrument meets a given spec.

My (less than expert) input would be that the study (if it is an MSA) should be based on a process need or requirement. If it is outside the acceptable range of the device (as long as it does not damage the device), that MAY show up as unstable (depending on a lot of parameters).

MSA and Calibration are different. I think it needs to be determined whether you want an MSA or Calibration and be careful to understand their differences if you must intermingle them. If they must be intermingled, understand where the incompatibilities may lie in your specific application, and account for them adequately.

If the issue is how much you may rely on the "calibratable" range of a device to make it acceptable for use in an MSA, that would be a factor of its stability and accuracy. So how far outside its acceptable range you may go should be governed by what is considered statistically acceptable in the MSA. The MSA experts here can answer some of those statistical rules better than I can.
 
world quality

Re: Acceptable testing range for calibration studies

camcnown,

If you do work for the auto industry, then it is 10% max.

If it is for food or medical, then there will be responses from other Covers.

If it is for another industry, that will be up to your customer.
 

Hershal

Metrologist-Auditor
Trusted Information Resource
Two places to begin.....

First, the manufacturer's manual and specifications.....they give the expected (OK, maybe what sales expects) operating specifications.....

Next, the calibration procedure (e.g., NA 17-20 series, USAF T.O. 33K series, manufacturer's procedure) which gives what the parameters are for the calibration.....

Now, if you don't have the answers, and for some devices you truly still won't, then I would take the zero point (whatever zero is for the instrument, mechanical or electronic) and run the cal at 10%, 50%, and 90% of range.....run the cal forward and reverse, ending with a zero check again.
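That run plan (zero check, points up the range, the same points back down, final zero check) can be sketched as a small helper. This is just an illustration of the sequence described above; the function name and the 0 to 100 range are made up for the example:

```python
def cal_points(lo, hi, fractions=(0.10, 0.50, 0.90)):
    """Build a calibration run plan: zero check, forward points at the
    given fractions of span, the same points in reverse, then a final
    zero check to catch drift or hysteresis."""
    span = hi - lo
    forward = [lo + f * span for f in fractions]
    return [lo] + forward + forward[::-1] + [lo]

print(cal_points(0, 100))  # → [0, 10.0, 50.0, 90.0, 90.0, 50.0, 10.0, 0]
```

Running forward and reverse over the same points lets you compare the up-scale and down-scale readings at each point, which is how hysteresis in the device would show up.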

Make sure you record what you get, and also calculate your uncertainty to maintain traceability.
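As one small illustration of the uncertainty part, the standard uncertainty of the mean of repeated readings (a Type A evaluation, in GUM terms) is the sample standard deviation divided by the square root of the number of readings. The readings here are invented for the example; a full uncertainty budget would also include Type B contributions from the standard, resolution, environment, and so on:

```python
import statistics

def type_a_uncertainty(readings):
    """Standard uncertainty of the mean from repeated readings:
    s / sqrt(n), where s is the sample standard deviation."""
    n = len(readings)
    s = statistics.stdev(readings)  # sample (n-1) standard deviation
    return s / n ** 0.5

readings = [10.01, 9.99, 10.02, 10.00, 9.98]
print(round(type_a_uncertainty(readings), 4))  # → 0.0071
```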

Hope this helps.