Hello All,
I am currently working on a solution for a corporate requirement on Calibration Range vs Use Range. The corporate requirement decrees that both ranges must be the same, or that the Use Range must fall within the Calibration Range. I am using the rationale that most instruments are calibrated at specific points to prove linearity. However, we have auditors who insist on calibration of the entire range from ZERO to MAXIMUM.
I would be interested to know whether any calibration documents offer guidance in this regard. I would prefer a written standard to resolve this issue.
Have a great day,
Regards,
SB
I see that you are entangled in a dilemma here. You say that your auditors insist on calibration from zero to maximum. Have they ever shown you the document they are asking you to comply with? I have run into customers with auditors who claim similar requirements based on UL and FDA requirements - except that those requirements don't exist (or at least I have never found a document that requires zero to maximum calibrations).
What does exist out there are the various calibration standards being used by industry. Let's look at what they say:
ISO 17025 5.4.1
The laboratory shall use appropriate methods and procedures for all tests and/or calibrations within its scope...
ANSI Z540-1 10.2 and 10.2 a)
The laboratory shall use appropriate methods and procedures for all calibrations / verifications...
Calibration procedures shall contain the required range and tolerance or uncertainty of each item or unit parameter being calibrated or verified...
ANSI Z540.3
5.3.1 Calibration procedures
Calibrations shall be performed using calibration procedures that:
- address the measuring and test equipment performance requirements;
- are acceptable to the customer;
- are current and appropriate for the calibrations; and
- provide reasonable assurance that the calibration results are as described.
As you can see, appropriate is the operative word here. The lab is supposed to determine the appropriate calibration requirements. This is where your auditors may be right: if you have no way of proving that the instruments will not be used over their full ranges on all functions (can you assume that they will not be used over their entire range?), then it would be appropriate to calibrate over the full range. If the equipment is dedicated to specific tests or usage, then it would be appropriate to calibrate to those tests. In a world where you have multiple instruments that can be used in multiple applications, and no way of controlling which instrument is used for a specific test, it may be best and safest to calibrate over the full ranges.
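As an illustration only (this comes from no standard), the corporate rule stated above, that the Use Range must equal or fall within the Calibration Range, amounts to a simple interval-containment check. The function name and the (low, high) tuple representation here are my own for the sketch:

```python
def use_range_within_cal_range(use_range, cal_range):
    """Return True if the use range equals or falls within the calibration range.

    Each range is a (low, high) tuple in the same units.
    """
    use_lo, use_hi = use_range
    cal_lo, cal_hi = cal_range
    if use_lo > use_hi or cal_lo > cal_hi:
        raise ValueError("ranges must be given as (low, high)")
    # Containment: the calibrated span must cover the entire span of use.
    return cal_lo <= use_lo and use_hi <= cal_hi

# A gauge calibrated from 0 to 100 psi but used only from 10 to 80 psi complies:
print(use_range_within_cal_range((10, 80), (0, 100)))   # True
# Using it up to 120 psi would fall outside the calibrated range:
print(use_range_within_cal_range((10, 120), (0, 100)))  # False
```

The check deliberately treats equal ranges as compliant, matching the "same or within" wording of the corporate requirement; it says nothing about which points inside the range were actually calibrated, which is exactly the linearity question the auditors are raising.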