Calibration Documents that offer guidance on Calibration Range vs Use Range

SHOPBRAT

Hello All,
I am currently working on a solution for a corporate requirement on Calibration Range vs Use Range. The corporate requirement decrees that both ranges must be the same or the Use Range must be within the Calibration Range. I am using the rationale that most instruments are calibrated at specific points to prove linearity. However, we have auditors that insist on calibration of the entire range from ZERO to MAXIMUM.

I am interested if there are any calibration documents that offer guidance in this regard. I would prefer a written standard to resolve this issue.

Have a great day,
Regards,
SB
 

normzone

Trusted Information Resource
I'm not a calibration expert - one will be along shortly to quote chapter and verse.

But it seems to me that the instrument should be calibrated using points along the full range, since the potential for use is the full range.

And if you do that, then the actual use range (whatever THAT might be) WILL be within the calibration range.

So what exactly is the challenge again?
 

BradM

Leader
Admin
Hello SB!
I don't know of a Standard or such specifying such a thing. I did find a working paper from NCSL on the subject:
http://www.ncsli.org/c/f/p12/REG_2012.PRE.467.1654.pdf

In short, one calibrates an instrument to assure working order and confidence. Most of the time an instrument is purchased to satisfy a measurement need in a process. I have to measure 50-150°C ±1°C, so I buy a thermometer capable of that. My calibration requirement (among other requirements) is to assure calibration across that requirement range.

Saying all that...
If the thermometer will never be used at 20°C or at 200°C and is clearly marked as such, why am I mandated to calibrate at those points?

Now, for some devices with zero/span adjustment, the device may have to achieve a low and high value for adjustment. But if I can verify the device is accurate across the range of use, I find no argument (either in compliance or within metrology) that states I am required to calibrate across the mfg. specified range of the instrument.
 

BradM

Leader
Admin
shopbrat, sorry, but you have confused me with your post. sorry for being obtuse... but when someone says 'use' range to me, they are normally referring to a piece of equipment (product) being operationally calibrated to a specific range before delivery to, say, a customer.

when i hear 'calibration range', i normally believe it refers to the calibration equipment's operational measurement range, which provides the 'standard' the product's 'use' range is checked against.

if i ask nicely, could you quantify a smidgen?
(and if i am in timbuktu, so state, as there are times i get derailed)

J

brad, you are correct with your statement about the temp measurement, but it must be decided by an authority (normally the product's engineering staff), certified to that criteria, and run through the gauntlet of management reviews, and documentation must be generated so the customer doesn't come back saying 'i bought this to measure zero kelvin and the best it does is 50K, i want my money back!!'

one thing this accomplishes: it exposes design errors being covered up by broad statements from the design engineers, et al.

i have worked with equipment which only worked at, and therefore was checked only at, certain calibration points. when mgmt stated in sales promos that the product worked throughout the range, they became extremely agitated when i jumped up and down pointing out the fraud involved and the ramification(s) from judicial proceedings. they changed the promo on their product, specifying only the calibration points the equipment was designed to measure.

this mentality also mitigates any assessor's insistence on checking the entire range.

J

You have made some good points/ assertions here. I'll add some perspective from my viewpoint.

It still depends on the purpose/use of a piece of equipment and what you are trying to attest to by performing the calibration. If you are the manufacturer and you are stating (through the calibration) that the device complies with the stated manufacturer's specifications, then yes, a full calibration is needed.

If, however, all I need to do is provide assurance that the device is within the allowed tolerance at the points where it is actually used, then a full calibration is not required.

Say I have a digital thermometer with a mfg. range of 0-300°C, but our use is only 75-150°C. During calibration it is within tolerance at 75-150°C but exceeds tolerance at 0°C and 300°C. Do we have to initiate an investigation? What if that happens the next two calibration intervals? Do we fail the instrument and spend who knows how much to replace it? All the while, the device met our needs. :)
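The thermometer scenario above can be sketched as a simple tolerance check against the use range only. Everything here is illustrative: the use range, the ±2°C tolerance, and the calibration readings are invented to mirror the example, not taken from any real procedure.

```python
# Illustrative sketch: judge an instrument against its *use* range only.
# All values are hypothetical.

USE_RANGE = (75.0, 150.0)   # degrees C, the range the process actually needs
TOLERANCE = 2.0             # allowed error at any point, degrees C

# (set point, instrument reading) pairs from a calibration run
cal_results = [
    (0.0, 3.1),      # out of tolerance, but outside the use range
    (75.0, 75.4),
    (100.0, 100.9),
    (150.0, 151.2),
    (300.0, 303.5),  # out of tolerance, but outside the use range
]

def in_use_range(point, use_range=USE_RANGE):
    """True if a calibration set point falls inside the documented use range."""
    lo, hi = use_range
    return lo <= point <= hi

# Only points inside the use range count toward pass/fail.
failures = [
    (pt, rd) for pt, rd in cal_results
    if in_use_range(pt) and abs(rd - pt) > TOLERANCE
]

print("fit for use" if not failures else f"investigate: {failures}")
```

With these made-up readings, the out-of-tolerance results at 0°C and 300°C are ignored because they fall outside the use range, which is exactly the judgment call being debated in the thread.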

If I had to maintain standards to check all instruments across their entire ranges (as opposed to a use range), it would cost millions and take forever to calibrate everything; failures would increase; etc.

If you have a business need to calibrate across the entire range, then it is needed. Supporting a sales/engineering/ business claim made by the mfg. of a piece of equipment would be one of those needs.
 

BradM

Leader
Admin
if it's a standard, you can have a piece of equipment red-lined to show it has been calibrated only to a certain parameter, and any measurements outside that range are suspect. therefore, an assessor who comes in, sees the calibration documentation for only a certain parameter, and sees the unit being used in this fashion should be ok with the concept. if, however, the assessor sees it being used outside that range, then it should raise questions about the processes.

J

I don't know what red-lined means; but I believe we agree on this point. :agree1:
 

dwperron

Trusted Information Resource
I see that you are entangled in a dilemma here. You say that your auditors insist on calibration from zero to maximum. Have they ever shown you the document they are asking you to comply with? I have run into customers with auditors who claim similar requirements based on UL and FDA requirements - except that those requirements don't exist (or at least I have never found a document that requires zero to maximum calibrations).

What does exist out there are the various calibration standards being used by industry. Let's look at what they say:

ISO 17025 5.4.1
The laboratory shall use appropriate methods and procedures for all tests and/or calibrations within its scope...

ANSI Z540-1 10.2 and 10.2 a)
The laboratory shall use appropriate methods and procedures for all calibrations / verifications...
Calibration procedures shall contain the required range and tolerance or uncertainty of each item or unit parameter being calibrated or verified...

ANSI Z540.3
5.3.1 Calibration procedures
Calibrations shall be performed using calibration procedures that:
- address the measuring and test equipment performance requirements;
- are acceptable to the customer;
- are current and appropriate for the calibrations; and
- provide reasonable assurance that the calibration results are as described.

As you can see, "appropriate" is the operative word here. The lab is supposed to determine the appropriate calibration requirements.

This is where your auditors may be right: if you have no way of proving that the instruments are not going to be used over their full ranges on all functions (can you assume that they will not be used over their entire range?), then it would be appropriate to calibrate over the full range. If the equipment is dedicated to specific tests/usage, then it would be appropriate to calibrate to those tests. In a world where you have multiple instruments that can be used in multiple applications, and no way of controlling which instrument is used for a specific test, it may be best and safest to calibrate over the full ranges.
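The decision logic above could be captured in a tiny helper. This is entirely my own illustrative sketch (the function name, inputs, and rule are mine, not from ISO 17025 or Z540): calibrate the full range unless usage is both documented and controlled.

```python
def appropriate_cal_range(full_range, use_range=None, usage_controlled=False):
    """Pick a calibration range per the reasoning above (illustrative only).

    full_range and use_range are (low, high) tuples. If usage of the
    instrument is not controlled (anyone might grab it for any job),
    calibrate the full range; otherwise calibrate the documented use range.
    """
    if use_range is None or not usage_controlled:
        return full_range   # can't prove limited use: full range is appropriate
    return use_range        # dedicated, documented use: use range is appropriate

# Shared lab thermometer, no control over where it is used:
print(appropriate_cal_range((0.0, 300.0)))                       # (0.0, 300.0)
# Dedicated process thermometer with a documented, controlled use range:
print(appropriate_cal_range((0.0, 300.0), (75.0, 150.0), True))  # (75.0, 150.0)
```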
 

Charles Wathen

Involved - Posts
I'm going through the same issue right now. Normally for instruments that we can't calibrate internally, we send those out to an approved vendor or back to the OEM.

As I'm looking at my first family, an Air Velocity Meter, the instrument's range is 0 to 20 m/s. Looking at the calibration certificate, I noticed the OEM only calibrated it up to 14.6 m/s. So now I'm in discussion with the OEM about their documentation.

I've been looking for a guidance document that may say 10-90% of full scale can be considered a full scale calibration. My thinking is the points below and above can be interpolated. If this is not practical, then the only other option would be to tag the instrument as a limited calibration which may cause issues if users fail to read the label.
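The interpolation idea can be sketched as straight-line interpolation between calibrated points. The data here is invented, and whether interpolated (let alone extrapolated) values are acceptable in lieu of actual calibration at those points is exactly the open question in this thread:

```python
# Sketch: estimate instrument error at an unmeasured point by straight-line
# interpolation between the two nearest calibrated points. Data is invented.

cal_points = [(2.0, 0.02), (8.0, 0.05), (14.6, 0.11)]  # (m/s, observed error)

def interpolated_error(x, points=cal_points):
    """Linearly interpolate the observed error at x between calibrated points."""
    pts = sorted(points)
    lo, hi = pts[0][0], pts[-1][0]
    if not (lo <= x <= hi):
        # Beyond the calibrated span this becomes extrapolation, which is
        # much harder to defend to an auditor.
        raise ValueError("outside calibrated span: would be extrapolation")
    for (x0, e0), (x1, e1) in zip(pts, pts[1:]):
        if x0 <= x <= x1:
            t = (x - x0) / (x1 - x0)
            return e0 + t * (e1 - e0)

print(round(interpolated_error(5.0), 4))  # halfway between 2.0 and 8.0 -> 0.035
```

Note that the 14.6 m/s ceiling in this sketch mirrors the certificate problem above: asking for the error at 20 m/s raises an exception rather than guessing.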

For Equipment, we have created a URS that specifies the equipment range and qualification range. From this, we use the qualification range for our calibration range and test points.
 