Calibration Interval based on Usage


kgriff

I've got a question I'd like to pose. I'm sure other people are doing this, so I'd like to hear what you do.
We have equipment ranging from fixed gages (plug gages, ring gages, etc.) to RF and microwave, and everything in between.
For a fair number of the fixed gages, we have our cal interval based on usage. These gages live in a gage crib, with the usage incrementing when they are "checked out" and "returned".
Our current, old, homegrown database controls the counting of usage. I might have, for instance, a plug gage on a 90-use interval.

The problem with this system is that there is no actual "drop dead" calibration due date. Once a gage is calibrated and placed in the gage crib, it is "good forever" unless it gets used the requisite number of times. I've been trying to change this practice for years but, with our internal system, change is slow and difficult. I finally have some traction for change, so I'd like to see what kind of "drop dead" date other people use for this sort of cal interval.
Thoughts?
 

AndyN

Moved On
What you've described is the heart of an effective calibration system! If the gauges don't get used, then it's not necessary to do anything as long as you keep them in a protected condition.

I only wish more people would do this, instead of some arbitrary, time-based system like "once a year..."
 

Marc

Fully vaccinated are you?
Leader
Beware - usage is not the only factor, even if a gage/measurement device is rarely used. Rarely used devices are typically "Calibrate Before Use". A gage can be dropped or otherwise compromised/damaged even if it is only used once a year.

Calibration frequency should be based upon a number of factors. A few are:

Type of measurement device.
Frequency of use.
Environment of use.
Impact of an out-of-calibration use.
Device age and wear factors.
History of calibrations of the device.

Also see:
http://www.nist.gov/calibrations/recommendedcalibrationinterval.cfm
 

JRKH

Something in the description bothered me... you describe "use" as being checked out and returned. This counts as one use?

Yet one check-out and return could mean the thread gage (for example) was used once or a hundred times during the run. The actual number of uses would be a more important factor than the number of times it was checked out.

Or maybe I misunderstood what you were saying.

That said, my rule of thumb was at minimum once per year, but mostly more often, depending on factors such as those listed by Marc above.

James
 

BradM

Leader
Admin
You pose a good question. There are different ways to establish an instrument recall system.

1. Calendar recall

This is where you establish a set interval for an instrument, say, "3;9" (months 3 and 9). So every March and September, that instrument is calibrated.

Pros: It allows for managing an annual workload; planning which instruments are serviced at a particular time; more flexibility in scheduling the calibration of the standards used; and grouping like activities for economies of scale.

Cons: Short-changes the interval; not all intervals are a true 6 months; does not address the usage frequency of the instrument; discourages effective risk/interval management.

2. Last service recall

The calibration interval is always a period of time from the last calibration performed. If the calibration date is 02/12/14, then for a six-month interval the next due date is 08/12/14, and so on. The recall period can (and does) change for instruments, based on their last service.

Pros:
Instrument truly has the desired interval; not short-changed.

Cons:
Does not facilitate advanced planning well, as the instruments due in any given month can (and will) change. Also, it makes it more difficult to choose the optimal time to have standards calibrated.

3. # of occurrence recall

Pros:
Allows for a better risk-based approach to calibration recall (usage). This method can optimize calibration intervals for maximum benefit.

Cons:
Instruments may experience drift not associated with # of uses. If there is variable demand/variable use, it can hinder advanced planning also, as the instruments due in a month can (and do) change.


Anyway, every approach has its goods (and bads). If your current system seems to have been working for you, then I would stick with it. Manage the exceptions. If you have some instruments that need to be re-calibrated prior to the usage-frequency requirement, place a note in the system for those and service them a little sooner. :)
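
To make the three approaches concrete, here is a minimal Python sketch of how each one might compute "next due". This is not from any particular calibration package; the function names, parameters, and the 182-day stand-in for "six months" are assumptions for illustration only.

```python
from datetime import date, timedelta

# Rough sketch only -- names and parameters are hypothetical, not from any real system.

def next_due_calendar(today: date, recall_months: list) -> date:
    """Calendar recall: calibrate in fixed months, e.g. recall_months = [3, 9] (March/September)."""
    for month in sorted(recall_months):
        if month > today.month:
            return date(today.year, month, 1)
    return date(today.year + 1, min(recall_months), 1)

def next_due_last_service(last_cal: date, interval_days: int) -> date:
    """Last-service recall: the interval always runs from the most recent calibration."""
    return last_cal + timedelta(days=interval_days)

def due_by_usage(uses_since_cal: int, use_limit: int) -> bool:
    """Occurrence recall: due once the gage has accrued the allowed number of uses."""
    return uses_since_cal >= use_limit

# A 182-day interval from a 02/12/14 calibration comes due in mid-August 2014.
print(next_due_last_service(date(2014, 2, 12), 182))  # 2014-08-13
```

Note that a fixed day count like 182 only approximates "six months", so a real system would have to decide whether intervals are tracked in days or in calendar months.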
 

drgnrider

Quite Involved in Discussions
As Marc mentioned: wear. Using a gage on one item vs. 100 matters, but so does the material type they are used in/on (stainless, cast iron, clean vs. burrs, chips, etc.) and how they are used (dry, oiled, turned with a wrench, etc.). For wear-type gages (thread, pins, etc.), some materials (not limited to cast iron and stainless steel) can wear them down or even cause them to "grow"; some gages can even "grow" on their own (http://www.meyergage.com/abcs-of-gages/why-gages-grow/).

I, too, am using calendar dates for calibration, but I am also trying to start a times-used interval, as our tool crib/calibration software will monitor this. I have some gages that will end up calibrated more often (most in-house) and some less often. Unfortunately, I am doing more of this :frust: than accomplishing change. :(

BTW: we have calipers (to 60"), indicators, OD Mics (to 60"), plug/ring gages, etc. and I do most of our calibration in-house.
 

kgriff

That's a lot of information. Most of it has already been considered, either by me or in our systems. Perhaps it will help if I provide a better description of our current (legacy) methodology.
We have multiple methods of assigning "calibration due".

* One is strictly date based, as most calibration systems are. If I calibrate it today (14-Feb) on a 3-month interval, it's due 3 months from now (14-May), regardless of usage, storage conditions, user care, etc. For many reasons, as we all know, this is not a perfect system.

* One is nominally usage based. This is the type of calibration interval I'm attempting to address at this point in time. There were a lot of good points raised about this in the responses. It's "low hanging fruit", so to speak. Currently, here's what happens:
--Gage sits in tool crib not accruing usage.
--Gage gets checked out today and returned today. This counts as one use.
--Gage gets checked out today, returned today, and checked out again. This counts as two uses.
--Gage gets checked out today, but returned tomorrow. This counts as two uses.
There are some obvious flaws with this. (How many times was it actually used? What material was it used on, which matters for some gages? How was it cleaned before and/or during use? This list could go on for a while.) Unfortunately, I cannot affect any of these aspects at this point in time. I have future plans, but cannot impact this today.
What I can impact is the fact that, if not used, these gages never accrue uses and, therefore, never come due for calibration. I'm attempting to effect a system change whereby these gages will be due for calibration after a certain amount of time if they have not accrued enough use to be due. In other words, I want an either/or type cal interval. "It's due after 30 uses or 30 days, whichever comes first."
I'm looking for insight into what I can propose as the "xx days" part of this interval. Understanding, of course, that this will be different for different gages. I might, for example, have the same "xx days" for plain plug and ring gages, but something different for micrometers, and another interval for calipers. I know there is no single answer, but I also know that I don't know everything, so I'm appealing to the metrology community at large to provide some insights into what other folks do.
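
Purely as an illustration of the either/or rule (the day caps and field names below are placeholders I'm making up, not recommendations for any gage type), the logic might look something like this in Python:

```python
from datetime import date

# Placeholder day caps by gage type -- example values only, not recommendations.
DAY_CAP_BY_TYPE = {
    "plain plug/ring gage": 365,
    "micrometer": 180,
    "caliper": 90,
}

def is_due(today: date, last_cal: date, uses_since_cal: int,
           use_limit: int, day_cap: int) -> bool:
    """Due when EITHER the usage count OR the elapsed days hit their limit,
    whichever comes first."""
    over_usage = uses_since_cal >= use_limit
    over_time = (today - last_cal).days >= day_cap
    return over_usage or over_time

# Example: a plug gage on a "90 uses or 365 days" interval, with 40 uses and
# 370 days since its last calibration, comes due even though usage never hit 90.
print(is_due(date(2015, 2, 19), date(2014, 2, 14), 40, 90,
             DAY_CAP_BY_TYPE["plain plug/ring gage"]))  # True
```

The important design point is the "or": either condition alone makes the gage due, so the day cap acts as the "drop dead" date for gages that never accrue enough uses.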

I do have long-term plans that include all of the variables mentioned in the responses, and others not yet covered, but here where I work, change is very slow to happen. I have been pushing this specific issue for several years and only now do I have enough momentum on my side to make a change to the process. I want to have some data-based starting point as I implement this change.
 

BradM

Leader
Admin
Why would you want to change? Is this system working for you? There is nothing wrong with having two approaches to estimating intervals, IMO.

When working with forecasts, you teach the simple stuff. Then you expand/combine as needed.

I guess I'm lost now. :) You want a system that catches instruments that require adjustment before they exceed tolerance, while maintaining adequate intervals to maximize return. If your system accomplishes that, I wouldn't touch it. You're better off than most.
 

normzone

Trusted Information Resource
An interesting challenge, and cases can be made for differing approaches.

In a previous lifetime, I worked with a process equipment manufacturer whose product line went back the better part of a century.

I inspected new electronic components and parts made from wood ordered against sixty-year-old drawings.

There was a bank of vernier and digital measurement tools - some were calibrated every year, some never.

There was a concealed vault of hundreds of acme thread gages that were antiques. Some saw use once every decade or three.
 