95% reliability for calibration frequency

J Allen

Involved In Discussions
We calibrate basic inspection tools (micrometers, calipers, height gages) in house, using lab-grade gage blocks traceable to NIST as our calibration standard.

We just had a customer audit that resulted in a finding that we were not verifying whether our gages meet 95% reliability before extending/reducing calibration intervals.

Can someone help by explaining how to determine 95% reliability for the small number of calibrations that we perform?
Please use a 0-1" micrometer calibration as an example.

They say I should have a procedure or instruction.
 

normzone

Trusted Information Resource
I don't tell my auditors this, but I have never gotten beyond having some ninety-several percent of my gear in calibration.

There's always something that can't be found, something an employee took with them when they left, etc. I just track it all and document when it can't be found, on a different list, so that I always have a list with 100% of the gear on it calibrated.
 

dwperron

Trusted Information Resource
OK. What they are looking for here is that you have a program in place to evaluate the reliability of your calibration instruments. The target set by ASQR-01 is that >95% of your instruments are found to be in tolerance after each calibration event. They require that you have a method to adjust calibration intervals (shorter or longer) so that you calibrate your instruments often enough to meet the 95% reliability target.

They are looking for you to have a process where, if an instrument is found out of tolerance, you reduce its calibration interval. There are many ways of doing this. ASQR-01 states: "Supplier may reference ILAC-G24, OIML D 10 Guidelines for the determination of calibration intervals of measuring instruments and the NCSL RP-1: Establishment and Adjustment of Calibration Intervals as guidelines for determining their interval analysis methodology." You can also use this to justify extending calibration intervals on reliable instruments that stay in tolerance. You must create and implement an interval adjustment program.
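As an illustration only (not the wording of ASQR-01 or those guideline documents), a simple "react to the as-found condition" rule could look like the sketch below. The 0.75x / 1.25x factors and the day limits are assumptions you would define in your own procedure.

```python
# Minimal sketch of a simple "reaction" interval-adjustment rule, an
# illustration in the spirit of the basic methods in ILAC-G24 / NCSLI RP-1,
# not the text of either document. The 0.75x / 1.25x factors and the
# day limits are assumptions to be set in your own procedure.

def adjust_interval(current_days: int, found_in_tolerance: bool,
                    shorten_factor: float = 0.75,
                    extend_factor: float = 1.25,
                    min_days: int = 30, max_days: int = 730) -> int:
    """Shorten the interval after an out-of-tolerance as-found result,
    cautiously extend it after an in-tolerance result."""
    if found_in_tolerance:
        return min(round(current_days * extend_factor), max_days)
    return max(round(current_days * shorten_factor), min_days)

# A 0-1" micrometer on a 365-day interval, found out of tolerance:
print(adjust_interval(365, found_in_tolerance=False))  # 274 days
# The same mic found in tolerance at its next calibration:
print(adjust_interval(274, found_in_tolerance=True))   # 342 days
```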

Then you will need to analyze your calibration results and see what percentage are found to be in tolerance. If it is above 95%, you are good to go. If it is lower, it is time to replace unreliable instruments or further reduce intervals.

My experience in multiple labs over the years is that the 95% target is not hard to meet. You just need to document that you have a process in place to deal with the requirement.
 

sol49720

Let's say you have history for the last 5 calibrations on a 1" micrometer, and the same for 9 more, for a total of 10 separate micrometers each with their last five calibrations. The requirements for all of them must be the same, let's say ±.0002". Of all the data used (50 calibrations), 49 were in tolerance as received and 1 was not. So 49/50 x 100 = 98% reliability; if 48 were in tolerance, you would have 96% reliability. We use grouping of identical measuring devices, but if you have enough calibration history on an individual gage you could go that route instead, though then you would need to do them all that way.
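A rough sketch of that arithmetic in Python (the gage IDs and the single out-of-tolerance record are made up for illustration):

```python
# Sketch of the pooled-reliability arithmetic described above: the last 5
# calibrations on each of 10 identical 0-1" micrometers (50 records), each
# counted as in or out of tolerance as received. Gage IDs are illustrative.

# as-found results keyed by gage ID: True = in tolerance (within ±.0002")
history = {f"MIC-{n:02d}": [True] * 5 for n in range(1, 11)}
history["MIC-07"][2] = False   # one out-of-tolerance result in the pool

results = [r for gage in history.values() for r in gage]
in_tol = sum(results)
reliability = 100.0 * in_tol / len(results)

print(f"{in_tol}/{len(results)} in tolerance = {reliability:.0f}% reliability")
# -> 49/50 in tolerance = 98% reliability
print("Meets 95% target:", reliability >= 95.0)   # True
```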
 

Charles Wathen

Involved - Posts
I've been using free Calibration Interval Analysis software:
Calibration Interval Analysis Freeware from Integrated Sciences Group

We perform an interval analysis every 2 years. Our target is 90% reliability, but you can adjust the software as needed for your requirement.

 