The Elsmar Cove Business Standards Discussion Forums
  Measurement, Test and Calibration
  Calibration Frequencies

Author Topic:   Calibration Frequencies
MK
unregistered
posted 12 April 2001 08:09 AM
Are there set standards for determining calibration frequencies for gages?


CarolX
Forum Contributor

Posts: 108
From:Illinois, USA
Registered: Jun 2000

posted 12 April 2001 10:23 AM
MK,

Nope. It is up to you to determine how often.

We calibrate standard measuring equipment (micrometers, calipers, etc.) monthly. Everything else is yearly.

Regards,
CarolX


JerryStem
Forum Contributor

Posts: 11
From:W Chester OH USA
Registered: Apr 2001

posted 12 April 2001 12:53 PM
When we were going thru ISO Guide 25 accreditation by A2LA, I asked the program manager there about due dates.

Their opinion was to not even list one when certifying a customer's equipment/standards. Since we weren't "guaranteeing" the calibration for 6 mo/1 yr/..., they said, why list one? (If I certified a foil for 6 months and the next day the customer rips it in half, obviously I can't say it's still the right thickness.)

I know this is slightly off topic, but I just wanted to give my 2 cents. They also went into cycle dates, saying they had no guidance for them either. You set your own, but it had to be "reasonable". 10 years was a little long...

Jerry


Jerry Eldred
Forum Wizard

Posts: 136
From:
Registered: Dec 1999

posted 12 April 2001 01:38 PM
There are a few options for setting calibration intervals.

1. Use manufacturer's recommended intervals (drawbacks: they don't all give you a recommended interval; you don't know for certain that it is adequate; it doesn't allow for equipment aging, variety of use conditions, etc.).

2. Set fixed intervals. Using some info from (1), and good "engineering intuition", set conservative fixed intervals (normally should be short enough to give a high confidence that the unit will maintain in-tolerance throughout the interval). This is convenient because you can create a somewhat predictable workload, and adjust for even workflow from month to month. This also has the same drawbacks as (1), that it doesn't account for variety of conditions, usages, equipment aging and user needs.

3. Set intervals statistically, based on a percent probability (typically about 95%) that the instruments will remain in-tolerance for the length of the interval. This is done by evaluating the prior calibration history of units calibrated at a given interval: measure the percentage of them that were found in-tolerance at that interval. If the percent in-tolerance meets your defined confidence level, the interval remains the same. If too many of the units were in-tolerance, you are calibrating too often and need to lengthen the interval; if too few were in-tolerance, the interval needs to be shortened (a sketch of this adjustment logic appears below). The drawback is that this is more cumbersome to maintain. Our in-house database has a module programmed into it which automatically adjusts intervals so that all instruments in the program (about 12,000 units at our site) stay in-tolerance at the defined confidence level. The advantage is that you arrive at the optimum calibration interval length, which helps keep our operating cost at a minimum and our instruments at an acceptable confidence level that they will remain in-tolerance.

Problems with the statistical method are that it is more difficult to manage in a small lab (there is sometimes not enough population to make statistically significant evaluations). I used to run a small lab and set up an interval analysis system on paper; I ran a report and did the interval analysis and adjustment once each year.
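A minimal sketch of that kind of percent-in-tolerance adjustment, in Python. The function name, the 95% target, the adjustment step, and the interval limits are illustrative assumptions, not Jerry's actual database module or the NCSL practice:

    # Sketch: adjust a calibration interval from observed in-tolerance history.
    # All names and numbers (95% target, 25% step, 30/1095-day limits, minimum
    # population of 20) are illustrative assumptions, not any lab's real method.
    def adjust_interval(current_days, cal_results, target=0.95, band=0.02,
                        step=0.25, min_days=30, max_days=1095):
        """cal_results: one True/False per calibration performed at the current
        interval, True = unit was found in-tolerance."""
        if len(cal_results) < 20:
            # Small populations (the small-lab caveat above) rarely support a
            # statistically meaningful adjustment; leave the interval alone.
            return current_days
        observed = sum(cal_results) / len(cal_results)
        if observed > target + band:
            new_days = current_days * (1 + step)   # too many in-tolerance: lengthen
        elif observed < target - band:
            new_days = current_days * (1 - step)   # too few in-tolerance: shorten
        else:
            new_days = current_days                # within the confidence band: hold
        return int(min(max(new_days, min_days), max_days))

    # Example: 40 calibrations at a 365-day interval, 39 found in-tolerance (97.5%)
    history = [True] * 39 + [False]
    print(adjust_interval(365, history))   # -> 456 (interval lengthened)

Run something like this once per review cycle (in the small lab above, once a year) over each instrument's history at its current interval.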

NCSL (National Conference of Standards Laboratories) has a recommended practice for the establishment and adjustment of calibration intervals. That is probably one of the best practices on that topic. I believe their web address is ncsl-hq.org.

I will also not recommend a specific confidence level; that needs to be decided locally to meet your particular company's needs.


Marc Smith
Cheech Wizard

Posts: 4119
From:West Chester, OH, USA
Registered:

posted 12 April 2001 01:41 PM
quote:
Originally posted by JerryStem:

When we were going thru ISO Guide 25 accreditation by A2LA, I asked the program manager there about due dates.

Their opinion was to not even list one when certifying a customer's equipment/standards.


True, true. You only calibrate it. You cannot determine its use cycle unless you are part of the company.


Dan Larsen
Forum Contributor

Posts: 137
From:Sussex, WI
Registered: Feb 2001

posted 13 April 2001 05:35 PM
Extending on Jerry's point (3)...

The statistical approach is good if you have the time and resources. YOU set the calibration frequency, and the driving force, I think, is the requirement that you justify all prior results if you find a gage out of calibration. If you set the calibration frequency too close, you'll never have a situation to react to, but you may be spending too much on calibration. If you set the calibration frequency out too far, you'll likely have an out-of-calibration situation that will be difficult and time-consuming to justify (and you may have a recall situation as well). Look at the history of the gage, not just whether it's "OK" or "NOT OK", but the actual as-found values. Set the calibration frequency based on the expected drift of the gage, so that you calibrate before it drifts to the point where you have to react.

I suggest (for small companies with limited resources) that they start out with a high frequency, then evaluate the situation after three cal cycles. If the gage doesn't drift, extend the cal cycle. Repeat. When you see a change, recalibrate and hold the cycle. Monitor for three more cycles and see if it works for you. This may not be statistically accurate, but it works from a logical standpoint. And in my mind, statistics (properly applied) is nothing but logic!
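A rough sketch of that review-after-three-cycles heuristic, assuming as-found values are recorded at each calibration. The function name, the 50% drift allowance, and the 1.5x extension step are assumptions for illustration, not anything Dan specifies:

    # Sketch of the heuristic: look at the last few as-found values and extend
    # the cycle only while the gage stays well inside its tolerance.
    # Thresholds and the extension factor are illustrative assumptions.
    def review_gage_cycle(as_found, nominal, tolerance, current_days,
                          cycles=3, drift_allowance=0.5, extend_factor=1.5):
        recent = as_found[-cycles:]
        if len(recent) < cycles:
            return current_days, "hold - not enough history yet"
        worst_drift = max(abs(v - nominal) for v in recent)
        if worst_drift <= drift_allowance * tolerance:
            # Stable: stretch the cycle and review again after three more cals.
            return int(current_days * extend_factor), "extend"
        # Drifting toward the limit: recalibrate and hold the cycle here.
        return current_days, "hold"

    # Example: a 25.000 mm setting standard checked monthly, tolerance +/- 0.005 mm
    print(review_gage_cycle([25.0004, 24.9996, 25.0011], 25.000, 0.005, 30))
    # -> (45, 'extend')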


All times are Eastern Standard Time (USA)
