Defining the frequency of measurement tool calibration

magic185

Registered
Hello!
What is the best way to define the frequency of calibration for measurement tools? Should it be based on measurement capability?
Thanks in advance.
 

Tagin

Trusted Information Resource
Beyond the manufacturer's recommendations, you may also want to consider things like the following (a rough sketch of how these might be weighed is shown after the list):
  • Your organization's risk tolerance for the quantity of product produced with out-of-calibration tools.
  • Your customer's risk tolerance for the quantity of product produced with out-of-calibration tools.
  • The usage rates of the tools.
  • The risk of damage to the tools (dropping, etc.).
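To make that concrete, here is one way those factors could be folded into an interval decision. This is purely illustrative, not a method from any standard: the 0-to-3 scoring scale, the 6%-per-point trim, and the 25% floor are all assumptions you would need to justify in your own calibration procedure.

```python
# Purely illustrative: a hypothetical risk-scoring scheme, not a method
# from any standard. It starts from the manufacturer's recommended
# interval and shortens it as the risk factors above accumulate.

def adjusted_interval_months(base_months: float,
                             product_risk: int,    # 0 (low) .. 3 (high)
                             customer_risk: int,   # 0 .. 3
                             usage_rate: int,      # 0 .. 3
                             damage_risk: int) -> float:  # 0 .. 3
    """Trim the manufacturer's interval in proportion to a crude risk score."""
    score = product_risk + customer_risk + usage_rate + damage_risk  # 0..12
    # Assumed rule: each risk point trims 6% off the interval,
    # floored at one quarter of the base interval.
    factor = max(0.25, 1.0 - 0.06 * score)
    return base_months * factor

# Heavily used tool on a critical product feature: 12 months -> ~6.2 months
print(adjusted_interval_months(12.0, product_risk=3, customer_risk=2,
                               usage_rate=2, damage_risk=1))
```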
 

dwperron

Trusted Information Resource
You've gotten good advice here.
The manufacturer ought to know how well their devices hold their tolerances.
Risk should be the driving factor in adjusting calibration intervals. How much will it cost you in investigations, recalls, rework, reputation, and so on if your tool turns out to have been out of tolerance for a month, or for a year?

As for adjusting the intervals, there are many methods out there; books have been written on the subject. A Google search on "calibration interval adjustments" returned about 69,000,000 hits, and there is plenty of good information there.
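To give a flavor of those methods, here is a minimal sketch of one of the simplest: a reaction rule that lengthens the interval a little after each in-tolerance calibration and cuts it sharply after an out-of-tolerance result, loosely in the spirit of the simple response methods described in NCSL RP-1. The 10%/50% adjustment factors and the 1-to-24-month bounds are illustrative assumptions, not values from the recommended practice.

```python
# A minimal sketch of a reaction-style interval adjustment, loosely in the
# spirit of the simple response methods in NCSL RP-1. The adjustment
# factors and bounds below are illustrative assumptions, not values taken
# from the recommended practice.

def next_interval_months(current_months: float,
                         found_in_tolerance: bool,
                         min_months: float = 1.0,
                         max_months: float = 24.0) -> float:
    if found_in_tolerance:
        # Tool was in tolerance at calibration: cautiously extend.
        proposed = current_months * 1.10
    else:
        # Out-of-tolerance result: cut the interval back hard.
        proposed = current_months * 0.50
    # Keep the interval within policy bounds.
    return min(max_months, max(min_months, proposed))

# Example: a 12-month interval after one good result, then an OOT finding.
interval = next_interval_months(12.0, found_in_tolerance=True)       # 13.2
interval = next_interval_months(interval, found_in_tolerance=False)  # 6.6
print(round(interval, 1))
```

The drawback of a simple reaction rule like this is that it converges slowly; the statistical methods in RP-1 instead use the tool's whole calibration history to estimate an interval that meets a target in-tolerance probability.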
 