Accuracy on calibration certificates - How to calculate the accuracy of UUT


calib_eng

Hi,
I am sorry to disturb you all again, but the situation is getting out of my hands.

I have given a dozen ETI thermometers to an external calibration laboratory.
After the calibration they issued the calibration certificates. The error in all of the thermometers is not more than 1 degree centigrade, but they have printed the accuracy as +/- 6 degrees centigrade. This is too much; the error is not more than 0.6 degrees centigrade, so how can the accuracy of the UUT be +/- 6 deg C?
When I called them, they said they will either print whatever accuracy I give them or they will follow the standard accuracy.
My question is: what do ISO 17025 or ISO 9001 say about how to calculate the accuracy of the UUT, and how should this accuracy be expressed on calibration certificates?
Anxiously waiting for a reply.
Again, thank you very much, and sorry to disturb all of you.
 

Jerry Eldred

Forum Moderator
Super Moderator
The specified accuracy of your thermometers is a separate matter from the amount of error. The amount of error is how far your thermometer was off in comparison with the standard. The specified accuracy comes from either their total measurement uncertainty or whatever the manufacturer of your thermometers specified them to be.

Not knowing the particular thermometers you are using (manufacturer/model/range/resolution/etc.) it is difficult to tell whether the tolerance they gave you was correct or not.

Essentially, the tolerance is a plus-or-minus range limit within which you can expect the thermometer to perform. Regardless of what error was measured, your thermometer is defined as being as accurate as its tolerance.

The error measured during the calibration is what the instrument read at that moment in time. This is a confusing detail to many. I have dealt with many users who took two different meters after I calibrated them, used them both to measure the same quantity, saw a difference between the meters, and didn't understand why. This was because of the difference between measured error and tolerance limits. If I use two meters to measure 10 VDC, one reads 10.04 VDC, the other reads the same 10 VDC as 9.96 VDC, and the tolerance on the meters is +/- 0.05 VDC, both are legitimate readings. When I provide measurement data to the user for the first meter, it has an error of +0.04 VDC versus a tolerance of +/- 0.05 VDC. In this case, the error was close to the tolerance, so the measurements taken during calibration were not too confusing to the user.

However, on another meter, I might have a tolerance of +/- 0.05 VDC at 10 VDC and a lot more resolution. In this case (as a different example), I measured 10.0002 VDC with a tolerance of +/- 0.05 VDC. This meter is performing very well in comparison to its tolerance limits.

Remembering that across a diversity of manufacturers, some are much better at specifying appropriate tolerances than others (I won't mention any manufacturer names here), the difference between measured error and specified tolerance varies greatly. In your case, your +/- 6 degree centigrade thermometer appears to be performing well in comparison to its tolerance limits.
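To put the distinction in concrete terms, here is a minimal sketch of the in-tolerance check described above. The nominal value, readings, and tolerance are the illustrative numbers from my meter example, not data from any real certificate.

```python
# Minimal sketch: compare measured readings against a tolerance limit.
# The numbers are the illustrative ones from the meter example above.

def in_tolerance(reading, nominal, tolerance):
    """Return the measured error and whether it falls within +/- tolerance."""
    error = reading - nominal
    return error, abs(error) <= tolerance

nominal = 10.0      # value applied by the standard, VDC
tolerance = 0.05    # specified accuracy / tolerance of the meter, +/- VDC

for reading in (10.04, 9.96, 10.0002):
    error, ok = in_tolerance(reading, nominal, tolerance)
    print(f"reading {reading:8} VDC  error {error:+.4f} VDC  "
          f"{'in tolerance' if ok else 'OUT of tolerance'}")
```

Both of the first two meters come out in tolerance even though their readings differ from each other by 0.08 VDC, which is exactly the effect described above.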
 

calib_eng

Question was about accuracy

Thank you very much for your reply. I think I could not explain my question well, so please help me a little more.

Tolerance, if I am not mistaken, is the allowable variation in the reading/measurement of an instrument and comes from the supplier or from the applicable standard.

My question was not about tolerance.

Accuracy means how close a reading is to the standard value applied (as per the VIM definition). The less the error, the more accurate the instrument, and vice versa.


Accuracy can vary depending upon the condition of the equipment, and an instrument's accuracy may degrade with the passage of time.

Tolerance, however, will remain the same and is the accept/reject criterion for the instrument: say the accuracy is +/- 6 deg C and the tolerance is +/- 3 deg C, we will reject the instrument because it crossed the tolerance limit.


My question was: after the calibration of the thermometer, the error was very small, say 0.5 degrees, so how can the accuracy (not the tolerance) of this instrument be +/- 6 deg C? This thermometer has less error, which means it is more accurate, so the accuracy should be error + uncertainty. So why is this lab putting an accuracy of +/- 6 deg C when the thermometer is more accurate than that?
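To show with numbers what I mean, here is a rough sketch of the error-plus-uncertainty idea. The 0.5 degree error is the one I mentioned, the 0.3 deg C uncertainty is just an assumed value, and the +/- 3 deg C tolerance is the hypothetical limit from above.

```python
# Sketch of the "accuracy = error + uncertainty" view described above.
# 0.5 deg C error: from this thread; 0.3 deg C expanded uncertainty: assumed;
# +/- 3 deg C tolerance: the hypothetical accept/reject limit mentioned earlier.

error = 0.5          # measured error of the thermometer, deg C
uncertainty = 0.3    # lab's expanded measurement uncertainty, deg C (assumed)
tolerance = 3.0      # accept/reject limit, +/- deg C (hypothetical)

observed_accuracy = abs(error) + uncertainty
print(f"observed accuracy: +/- {observed_accuracy:.1f} deg C")
print("accept" if observed_accuracy <= tolerance else "reject")
```

By this reasoning the thermometer looks like a +/- 0.8 deg C instrument, so I do not understand the +/- 6 deg C figure on the certificate.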

And what do the standards say about expressing accuracy?

Thanks,
waiting for an answer.
 

Jerry Eldred

Forum Moderator
Super Moderator
The terms "tolerance" and "accuracy" are sometimes used interchangeably. When you receive a certificate of calibration, there should only be a tolerance and a measured value for the instrument being calibrated. There are no separate tolerance and accuracy specifications. The only other +/- number may be the uncertainty (which is basically the accuracy of the lab's measurement, not of your instrument).

They will report a measurement of your instrument, which is a single number without a plus-or-minus range associated with it. In your case, that is the -0.6 degrees centigrade. The tolerance (the manufacturer's limits for your instrument) appears to be +/- 6 degrees centigrade.

If you believe they have mis-stated the tolerance on your instrument, you could do some research to find out whether they are correct.

But my opinion in this case (since I have not actually seen your certificate of calibration) is that the "accuracy" of your instrument appears to be the same as the "tolerance" of your instrument.
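As a minimal sketch of how the numbers on such a certificate relate to each other: the -0.6 deg C error and +/- 6 deg C tolerance are the figures from this thread, while the 0.2 deg C expanded uncertainty is only an assumed stand-in for whatever the lab actually reports.

```python
# Sketch: the three separate numbers that belong on a calibration certificate.
# -0.6 deg C error and +/- 6 deg C tolerance come from this thread;
# the 0.2 deg C expanded uncertainty is an assumed placeholder value.

measured_error = -0.6   # reading minus the standard's value, deg C (a single number, no +/-)
tolerance = 6.0         # manufacturer's specified accuracy for the UUT, +/- deg C
lab_uncertainty = 0.2   # lab's expanded uncertainty, deg C (accuracy of THEIR measurement)

within_tolerance = abs(measured_error) <= tolerance
print(f"measured error : {measured_error:+.1f} deg C")
print(f"UUT tolerance  : +/- {tolerance} deg C -> {'in tolerance' if within_tolerance else 'out of tolerance'}")
print(f"lab uncertainty: +/- {lab_uncertainty} deg C (quality of the lab's measurement, not of the UUT)")
```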
 