Determining Calibration Intervals - Extend or change the length of calibration intervals


Dan De Yarman

#1
Calibration Intervals

Can anything be done to extend or change the length of a calibration interval if the calibration records and procedures state it as an annual interval? I'm thinking I can change the procedure and update the information on the calibration records when the equipment gets calibrated this time, so we will be prepared the next time we want to calibrate our IMTE. Is this correct, or can I change the intervals without calibrating some of the IMTE again? We are trying to avoid sending out IMTE that do not need to be calibrated. This is our first renewal, if you will, for calibration. We now know that not all of the IMTE needs to be calibrated annually, as was originally stated.

Thank you in advance for your help.

------------------
 

Jerry Eldred

Forum Moderator
Super Moderator
#2
The definition of a calibration interval (loosely stated) is basically the period of time within which the IMTE (as you call them) can be expected to remain within its specified tolerance. The interval may be lengthened on any item for which you have adequate data to support an acceptable level of confidence that the equipment will stay in tolerance for the new interval. One such example is the Fluke 77 meter (I am not trying to sell Fluke meters, only using them as an objective example). I have gathered data in the past to extend their interval to two or even three years. I am not certain whether it was the Fluke 77 or some other similar meter, or which company made the interval adjustment. But I have even heard of cases where a calibration interval on an instrument was extended to "no calibration required". That would have to be based, of course, on some extremely impressive data over many years (I would strongly hesitate to do that - as a matter of fact, I don't recommend it).

I suggest getting a copy of the NCSL (National Conference of Standards Laboratories) recommended practice for the establishment and adjustment of calibration intervals. To adjust an interval to longer than the original manufacturer's recommendation, you will need to document your method and data: how you adjust, and how you define your confidence level.
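As a rough illustration of what "define your confidence level" can look like in practice, here is a minimal sketch (my own, not taken from the NCSL recommended practice; the 90% reliability target and 95% confidence are assumed figures for illustration). It puts an exact lower confidence bound on the observed end-of-interval in-tolerance rate and supports extension only if the bound clears the target:

```python
# Minimal sketch: does calibration history support extending an interval?
# Assumptions (illustrative, not from the thread or from RP-1): we require
# 95% confidence that the end-of-interval in-tolerance rate is >= 90%.
from scipy.stats import beta

def supports_extension(n_cals: int, n_in_tol: int,
                       target_reliability: float = 0.90,
                       confidence: float = 0.95) -> bool:
    """True if the history supports claiming the reliability target."""
    if n_in_tol == 0:
        return False
    # Exact (Clopper-Pearson) one-sided lower bound on the in-tolerance rate.
    lower = beta.ppf(1 - confidence, n_in_tol, n_cals - n_in_tol + 1)
    return lower >= target_reliability

print(supports_extension(30, 30))  # True: lower bound is about 0.905
print(supports_extension(30, 29))  # False: one failure drops it to about 0.85
```

Note how unforgiving small samples are: with 30 calibrations on record, even a single out-of-tolerance result is enough to sink a 90% reliability claim at 95% confidence.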

Hope that is of help.

------------------
 

Dan De Yarman

#3
I forgot one very important piece of information in my first posting: we used an outside calibration house to do our calibration. We are planning on doing some of the micrometers ourselves this time, but all the rest will be calibrated outside.

------------------
 

Rick Goodson

#4
Just as a follow-up to Dan's post: I am sure that when he mentioned the Fluke 77 meter he meant a specific meter. You cannot extend the calibration cycle of a class or family of instruments based on the results of a single instrument. Each one stands alone.
 

CJacobsen

#5
In your SOP or procedure that covers the elements pertaining to calibration, you should have provisions of some sort to lengthen or shorten a calibration cycle.

Typically I use the following, and as long as things are properly documented, I have never had a problem (a quick sketch of this rule in code follows at the end of this post).

"Interval Adjustment:
When using historical data to determine or adjust a calibration interval:
1. The interval can be shortened if two successive calibrations are found to be out of tolerance. The recommended reduction is 20%.
2. The interval may be lengthened if three successive calibrations are found to be in tolerance. The recommended increase is 20%.

This should not be applied to instruments that have a particular accuracy or rating based upon length of time between calibrations.

Calibration cycles may be changed for other reasons as long as there is a written justification for changing the cycle and it is approved. Justification may be expanded capabilities or accuracies; significantly increased/decreased use; change to a more extreme environment where the instrument is used, etc.

All cycle changes will be documented and maintained in the history folder or other permanent file."

All the systems I implement for calibration recall use a permanent history folder for each instrument maintained in the system.
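CJacobsen's two-out/three-in rule above is easy to automate. A minimal sketch (my own; it assumes the history is simply a list of pass/fail results with the newest result last):

```python
# Sketch of the quoted interval-adjustment rule:
# - shorten by 20% after two successive out-of-tolerance calibrations
# - lengthen by 20% after three successive in-tolerance calibrations
def adjust_interval(interval_days: int, history: list[bool]) -> int:
    """history: calibration results, True = in tolerance, newest last."""
    if len(history) >= 2 and not any(history[-2:]):
        return round(interval_days * 0.80)  # two straight failures: shorten
    if len(history) >= 3 and all(history[-3:]):
        return round(interval_days * 1.20)  # three straight passes: lengthen
    return interval_days                    # otherwise leave it alone

print(adjust_interval(365, [True, True, True]))    # 438 days
print(adjust_interval(365, [True, False, False]))  # 292 days
```

Per the caveat in the quoted procedure, a rule like this should be gated so it never fires for instruments whose rated accuracy depends on the length of time since calibration.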
 

Marc

Hunkered Down for the Duration
Staff member
Admin
#6
Originally posted by Rick Goodson:

You cannot extend the calibration cycle of a class or family of instruments based on the results of a single instrument. Each one stands alone.
I have seen it accepted where a certain instrument type and model was used for a certain measurement. Many stations did the same measurement with the same instrument type and model. The cal cycle for all was determined by the one with the 'worst' calibration result.
 

CJacobsen

#7
Having been the Quality Manager for a calibration facility in the past, and having worked with many as a consultant, I cannot say that I have ever seen the interval of an entire class of instruments changed based on the performance of one.

I have seen the initial calibration interval for a class of instruments set to the same length when multiple units were put into service at the same time, or even an initial interval set for a class of instruments as a whole. But subsequent use, condition, and historical calibration performance would govern, individually, whether the cycle was changed.

What type of instrument was it, Marc?
 

Jerry Eldred

Forum Moderator
Super Moderator
#8
I can see where this could become somewhat contentious. I have heard some equally good philosophies on a few different methodologies for interval adjustment. Some of the philosophies are:

1. Use a convenient fixed interval (such as 12 months). This makes work forecasting much easier, but 'over-calibrates' some and 'under-calibrates' others.

2. Family/Class Interval Adjustment. This has good reviews for electronic test equipment (such as the Fluke 77 example; again, I am not trying to sell them, only using a common model). Much electronic test equipment is unaffected or only slightly affected by level of use, so a statistically significant sample of many units of the same model or model family may be adequate to determine an interval adjustment. This is with the implicit exception of 'dogs' and 'gems' (mostly 'dogs'), and it is not too difficult to define a dog in terms of reliability (see the sketch after this list). This type of adjustment, however, may not work well at all with mechanical instruments, as they tend to go out of tolerance based more on usage.

3. Individual Instrument Adjustment. This is also quite common, and it most readily adapts to the uniqueness of each piece of test equipment. It is higher maintenance, though, and may be a more costly method (in our lab we currently use this method, but it is fully automated in our custom database package).
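To make the family/class idea in item 2 concrete, here is a minimal sketch (my own illustration; the two-failure 'dog' threshold is an assumption, not a standard figure). It pools one model's calibration results, sets aside units that fail repeatedly, and judges the family on the rest:

```python
# Sketch of family/class evaluation: pool results for one model,
# exclude 'dogs' (units with repeated failures), and compute the
# in-tolerance rate of the remaining population.
from collections import defaultdict

def family_reliability(results: list[tuple[str, bool]],
                       dog_threshold: int = 2):
    """results: (serial_number, in_tolerance) pairs for one model."""
    failures = defaultdict(int)
    for serial, in_tol in results:
        if not in_tol:
            failures[serial] += 1
    dogs = {s for s, n in failures.items() if n >= dog_threshold}
    kept = [in_tol for s, in_tol in results if s not in dogs]
    return sum(kept) / len(kept), sorted(dogs)

data = [("A1", True), ("A1", True), ("B2", False), ("B2", False),
        ("C3", True), ("C3", True), ("D4", True), ("D4", False)]
print(family_reliability(data))  # (0.833..., ['B2']) - B2 is the 'dog'
```

The dogs then get handled individually (repair, replacement, or a shortened interval of their own) rather than being allowed to drag the whole family's interval down.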

I have seen numerous variations of the above. The bottom line is that your well-defined method must be adequate to eliminate calibration-interval-related risk to the product (or service). That is really the bottom line of calibration. Suppose I were a tire manufacturer with temperature controllers used to cure an intermediate adhesive layer between the tire tread and the steel belts (a fictitious example only; no implication that any tire company has this problem). The bottom line would be that I calibrate those temperature controllers accurately enough, and at adequate intervals, to assure that controller accuracy is continuously maintained, so that it is precluded as a contributing factor to a tire defect.

I have only one more small hand grenade to toss into this discussion. While visiting with some process engineers in Asia last year, one of them showed me a detailed, lengthy report on how important it is in process control NOT to over-control the process. All quantitative process variables have variability (a given). If, in controlling a process, a single anomalous variation throws the process from nominal out toward a control limit, the instinctive reaction is to adjust the process back to nominal and correct for the error. These engineers showed me a mountain of statistical data that basically says that when this happens (the process holds at about nominal for a long period, then abruptly shifts away from nominal for one reading), you should adjust nothing. The process will statistically come back to nominal, and the probability is that it never truly drifted away from nominal (I can't, by the way, relate any details of the report, for company reasons). When a single measurement drifts away from nominal, leave it alone. It was an anomaly.
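The cost of chasing single readings is easy to demonstrate numerically. A toy sketch (my own, not from the report Jerry mentions): compare a stable process left alone with one that is re-centered after every reading. The compensation roughly doubles the variance, which is the classic "funnel experiment" result:

```python
# Toy demonstration of over-control: "correcting" a stable process
# after every reading inflates its variance instead of reducing it.
import random
import statistics

random.seed(1)
N = 100_000
noise = lambda: random.gauss(0.0, 1.0)

# Hands-off: readings are just nominal plus noise.
hands_off = [noise() for _ in range(N)]

# Over-controlled: after each reading, shift the setting by the
# negative of the observed deviation ("put it back on nominal").
adjusted, offset = [], 0.0
for _ in range(N):
    x = offset + noise()
    adjusted.append(x)
    offset -= x  # the instinctive correction Jerry warns against

print(statistics.stdev(hands_off))  # about 1.00
print(statistics.stdev(adjusted))   # about 1.41 - variance has doubled
```

The same logic carries over to intervals: reacting to one out-of-tolerance result with an immediate interval change is over-control of the calibration program.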

Now to explain my long-winded rambling: I think we should apply some of this philosophy to calibration interval adjustments. Say I have a family of Fluke 77s (30 of them, for example), and my data tells me they should have a two-year interval. If I then calibrate one of the units and find a gross out-of-tolerance, then in accordance with the theory above, that gross out-of-tolerance was an anomaly. And in my lengthy experience with Fluke 77s, that is what it would have been. I have seen corroded function/range switches (somebody left their meter in their car and it got rained on). This is an anomaly; it doesn't truly reflect the propensity of the entire population, and in this case it doesn't even reflect what that specific unit is likely to do next interval. The owner of the meter feels stupid when I give him the compassionate, fatherly lecture about leaving such a nice meter out in the rain. Next year, all the units are in tolerance. My point is that I believe in adjusting intervals only on the basis of a statistically significant sample. I am doing some short-term variability testing on an HP 3458A high-accuracy meter at the moment; short-term (smaller) data sets tend to show different patterns of variability than long-term ones with a significant amount of data to analyze. We need to be careful about over-adjusting calibration intervals.

Mechanical instruments are a different matter. There is much more of a physical wear-and-tear issue with those instruments. Heavily used micrometers will exhibit different wear than those that sit in a toolbox all the time, and so they must be treated differently. I won't comment further on that area, as I do not have much experience with mechanical tools, but individual instrument interval adjustment may well be more appropriate in those cases.

------------------
 

Ryan Wilde

#9
Sorry to rehash such an old topic, but this may be of some interest to you QS folks. ISO 17025 (the standard that your calibration suppliers must meet) states that we, the suppliers of calibration, are no longer allowed to make any suggestion concerning a due date unless it has been agreed upon with the client. What this means for you is that you have to determine your own calibration intervals for your own standards. Be prepared to specify calibration intervals on your purchase order, along with the myriad other information that keeps getting lumped on you to include on your PO.

ISO 17025 is slightly more prescriptive in this area: we have no choice, we HAVE to have policies on adjusting intervals, and they have to be well documented. A steady one-year interval won't hold up; we have to justify every interval.

I've noticed that ISO works on a trickle-down theory: when one committee finds something important enough to require it, eventually all of them require it. An earlier comment in this thread recommended getting the NCSL International recommended practice (to be specific, it is RP-1, Establishment & Adjustment of Calibration Intervals, 3-96; it costs $30 USD if you're not a member, and if you are a member, you already have it). I recommend it highly; we saved close to 50% on our calibration costs thanks to the years of history we have on our reference standards. In these days of 3% cost cuts, don't spend money where it isn't needed.

As a calibration provider, we see the ramifications of improper calibration intervals on a daily basis. Our favorite is the set of micrometers that comes in faithfully every year and is used once per month by the customer. The set hasn't required an adjustment in 10 years, which to us is known as "free money".

The other ramification scares me when I go to my auto dealer to buy my wife a new grocery-getter: the set of gage blocks that comes in once per year with 40-60% of the blocks out of tolerance. A very quick interval study on the set, and on other sets owned by the same company, showed that merely reducing their calibration interval to 9 months would have brought the problem under control. They said no way, keep it at a year. As a calibration provider I have no say in the interval, so I begrudgingly put one year on a set that I knew had no chance of being remotely good after a year. Don't even get me started on surface plates...
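Ryan doesn't describe his "very quick interval study", but the flavor of it can be sketched with simple drift scaling (my own illustration, assuming roughly linear drift, which is a strong assumption for gage blocks): if wear consumes more than the tolerance band over the current interval, shrink the interval proportionally so the expected drift stays inside a safety fraction of the tolerance.

```python
# Hypothetical quick interval study via drift scaling (illustrative only;
# assumes drift is roughly linear over time).
def scaled_interval(interval_months: float,
                    drift_over_interval: float,  # in units of the tolerance
                    safety_fraction: float = 0.8) -> float:
    """Interval at which expected drift stays within safety_fraction
    of the tolerance band."""
    return interval_months * safety_fraction / drift_over_interval

# Blocks drifting to ~110% of tolerance in 12 months -> roughly 9 months.
print(round(scaled_interval(12, 1.10), 1))  # 8.7
```

The point is not the exact arithmetic but that even a back-of-the-envelope model, fed with the set's own history, would have flagged the one-year interval as untenable.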

Just my thoughts, but keep in mind that it can save quality problems on one extreme, and your budget on the other.



------------------
Ryan Wilde, Technical Manager
Quality Control Sales & Services
 

Ryan Wilde

#10
Originally posted by Ryan Wilde:
An earlier comment in this thread recommended getting the NCSL International recommended practice (to be specific, it is RP-1, Establishment & Adjustment of Calibration Intervals, 3-96; it costs $30 USD if you're not a member, and if you are a member, you already have it).
What kind of jerk doesn't give you a handy-dandy link to the thing he references? Well, me, for one, but I can fix my oversight. The link is:
http://www.ncslinternational.org/publications/abstract.cfm?ID=29

Note: It comes with a disk that helps you do the interval test. I like any book that comes with media that simplifies my job, but I'm simple that way.

------------------
Ryan Wilde, Technical Manager
Quality Control Sales & Services
 