kgriff
I've got a question I'd like to pose. I'm sure other people are doing this, so I'd like to hear what you do.
We have equipment ranging from fixed gages (plug gages, ring gages, etc.) to RF and microwave, and everything in between.
For a fair number of the fixed gages, we base our cal interval on usage. These gages live in a gage crib, with the usage count incrementing when they are "checked out" and "returned".
Our current, old, homegrown database controls the counting of usage. I might have, for instance, a plug gage on a 90-use interval. The problem with this system is that there is no actual "drop dead" calibration due date. Once a gage is calibrated and placed in the gage crib, it is "good forever" unless it gets used the requisite number of times. I've been trying to change this practice for years but, with our internal system, change is slow and difficult. I finally have some traction for change, so I'd like to see what kind of "drop dead" date other people use for this sort of cal interval.
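For reference, the policy I'm after is a "whichever comes first" rule: due at the usage limit OR at a hard calendar date, whichever hits first. A minimal sketch of that logic (all names and the 24-month figure are hypothetical, not our database's actual schema or interval):

```python
from datetime import date, timedelta

class Gage:
    """Hypothetical gage record with a usage interval AND a drop-dead date."""

    def __init__(self, usage_interval, max_months, cal_date):
        self.usage_interval = usage_interval   # e.g. 90 uses
        self.uses_since_cal = 0
        self.cal_date = cal_date
        # Drop-dead date: calibration expires this many months after cal,
        # regardless of how few times the gage has been used.
        self.drop_dead = cal_date + timedelta(days=30 * max_months)

    def check_out(self):
        self.uses_since_cal += 1

    def is_due(self, today=None):
        """Due when EITHER the usage count or the drop-dead date is reached."""
        today = today or date.today()
        return (self.uses_since_cal >= self.usage_interval
                or today >= self.drop_dead)

# Example: a plug gage calibrated Jan 1, 2024 on a 90-use interval
# with an assumed 24-month drop-dead date.
gage = Gage(usage_interval=90, max_months=24, cal_date=date(2024, 1, 1))
gage.check_out()
print(gage.is_due(today=date(2024, 6, 1)))   # False: 1 use, date not reached
print(gage.is_due(today=date(2026, 1, 1)))   # True: drop-dead date passed
```

The open question is what that `max_months` value should be for a crib-stored fixed gage, which is exactly what I'd like to hear from others.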
Thoughts?