Ryan Wilde
Mike,
I would say that it depends on the type of calibrations you are performing. If you are doing dimensional calibration to within a micron, it is a huge problem, but if you are calibrating a handheld multimeter, then it is no problem at all. (I can't tell from your spec'd temperature range, because dimensional labs should be at 20°C [68°F] with very tight temperature constraints, while most others are supposed to be at 23°C [73°F] and have greater tolerance for temperature swings.)
Also, he may be correct about "The laboratory shall ensure that the environmental conditions do not invalidate the results or adversely affect the required quality of any measurements". It would be prudent to point out to him that if he goes with a blanket generic statement in the manual, then the specific temperature range and rate of change would have to be specified in each individual calibration procedure.
What it comes down to is this:
* What effect does temperature have on the uncertainty of the measurements I am performing?
* Is this effect acceptable?
A metrologist has to ask these questions and calculate the answer. If you go to your boss with hard numbers, you may change his mind, or you may just find that it truly is insignificant to the measurements you are making.
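To show what "hard numbers" might look like, here is a minimal sketch of that back-of-the-envelope calculation for the two example cases above. The coefficients (steel gauge block expansion of roughly 11.5e-6 /°C, a multimeter tempco of 0.001 %/°C of reading) are illustrative assumptions, not specs for any particular instrument:

```python
def thermal_expansion_um(length_mm, alpha_per_c, delta_t_c):
    """Length change in micrometres: dL = alpha * L * dT (mm converted to um)."""
    return alpha_per_c * length_mm * delta_t_c * 1000.0

# Dimensional case: a 100 mm steel gauge block (alpha ~ 11.5e-6 /degC,
# an assumed typical value) in a lab sitting 2 degC off the 20 degC reference.
gauge_err_um = thermal_expansion_um(100.0, 11.5e-6, 2.0)
print(f"100 mm gauge block, 2 degC offset: {gauge_err_um:.2f} um")

# Electrical case: an assumed tempco of 0.001 %/degC of reading,
# same 2 degC swing, on a 10 V calibration point.
dmm_err_uv = 10.0 * 0.001e-2 * 2.0 * 1e6
print(f"10 V point, 2 degC offset: {dmm_err_uv:.0f} uV")
```

The dimensional case comes out around 2.3 um, which swamps a micron-level calibration, while the multimeter case is a couple hundred microvolts on 10 V, well inside a typical handheld spec. That is the asymmetry the two questions above are getting at.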
Ryan