Calibration Discrimination - Acceptance criteria does not have the accuracy required?


Andrews

A temperature indicator is used to check the temperature of a bath that has to be maintained at 35.2 deg Celsius to 39.4 deg Celsius. This temperature indicator is being calibrated, and the acceptance criteria is fixed as +/- 1 deg Celsius. Isn't this a non-conformity, since the acceptance criteria does not have the accuracy required by the test? Shouldn't the acceptance criteria be +/- 0.1 deg Celsius?
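A quick way to see the concern is to compare the process tolerance band with the error the acceptance criteria would allow. Below is a minimal sketch of that arithmetic; the 4:1 and 10:1 ratios it uses are common metrology rules of thumb, not requirements taken from this thread.

```python
# Rough numbers from the question above. The 4:1 and 10:1 ratios are common
# rules of thumb for choosing measurement accuracy, not stated requirements.
lower, upper = 35.2, 39.4        # bath specification, deg C
acceptance = 1.0                 # calibration acceptance criteria, +/- deg C

band = upper - lower             # full process tolerance band: 4.2 deg C
ratio = band / (2 * acceptance)  # tolerance-to-allowed-error ratio

print(f"Tolerance band:          {band:.1f} deg C")
print(f"Allowed indicator error: +/-{acceptance:.1f} deg C ({2 * acceptance:.1f} deg C span)")
print(f"Ratio:                   {ratio:.1f}:1")
print(f"A 4:1 rule would want:   +/-{band / 8:.2f} deg C or tighter")
print(f"A 10:1 rule would want:  +/-{band / 20:.2f} deg C or tighter")
```

By those rules of thumb, +/- 1 deg Celsius leaves only about a 2:1 margin against the 4.2 deg Celsius band, while +/- 0.1 deg Celsius would be tighter than even a 10:1 rule asks for.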

Comments pls.

Andrews
 

The Taz!

Depends . . . you must have the discrimination to read to 0.1 or 0.05 to see the variation in the process. IMHO, I would like to see the instrument calibrated within the working range and at the resolution required. You are paying for the calibration, so you are the customer and you set the requirements.
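To put numbers on the discrimination point, here is a minimal sketch showing how many distinct increments different display resolutions give across the 4.2 deg C band; the resolutions listed are illustrative, not recommendations.

```python
# Illustrative only: how many distinct increments an indicator can resolve
# across the 4.2 deg C process band at a few example display resolutions.
band = 39.4 - 35.2  # deg C

for resolution in (1.0, 0.5, 0.1, 0.05):
    steps = band / resolution
    print(f"Resolution {resolution:>4} deg C -> about {steps:.0f} increments across the band")
```

At 1 deg C resolution there are only about four increments across the whole band, which is generally too coarse to see process variation; at 0.1 or 0.05 deg C there are dozens.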

You might want to check around and find out what is common practice for that type of process before getting into a dispute.
 

tomvehoski

There are several questions to ask:

How critical is 35.2 to 39.4? What happens if it is 39.8? Or 42?

If nothing happens, why is the spec so tight, and why are you bothering to calibrate the gage? If it may create bad material, slow down the process, etc., it is probably best to have it calibrated. If the building explodes, it should definitely be calibrated. You need to make the call based on risk and a cost/benefit analysis.

What type of device is being used? RTD, thermocouple, thermometer, etc.?

For a range that narrow, you may need to look at calibrating both the sensor (RTD/TC) and the readout device. It's been a while, but I recall you can get a TC certified - for big $$$. They are not very accurate devices. Even if you have resolution to 0.1 degree, you need to make sure the device is still accurate enough.
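To illustrate that last point, here is a minimal sketch of one common way to combine independent error contributions (root-sum-square); the sensor and readout error values are illustrative placeholders, not specifications for any particular thermocouple or meter.

```python
import math

# Illustrative placeholders, not real device specifications.
sensor_error = 1.1   # deg C, example thermocouple tolerance
readout_error = 0.3  # deg C, example indicator accuracy
resolution = 0.1     # deg C, display resolution

# Root-sum-square combination of independent error contributions.
combined = math.sqrt(sensor_error**2 + readout_error**2)

print(f"Display resolution:    {resolution} deg C")
print(f"Combined system error: about +/-{combined:.1f} deg C")
```

With those example numbers the display resolves 0.1 deg C, but the measuring system's error is still over +/- 1 deg C: resolution is not accuracy.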

Hope this helps,

Tom
 