Thank you all in advance for the help.
Basically, I am trying to change the culture around calibration standards and methods at my work. We have a lot of local procedures in place covering processes and responsibilities for each engineer.
Currently there is an instruction in place stating that if an out-of-tolerance (OOT) condition is found during calibration (this applies to all equipment: oscilloscopes, calipers, flow modules, etc.), the asset's interval gets cut in half. To lengthen the interval back to its original value, the asset must then complete three consecutive calibration cycles without any OOT finding.
So now we have assets with a wide range of intervals, nothing is consistent, and we are burning time, money, and equipment availability.
For example: we had an asset on a 3-year cycle. An OOT was found (even though it left in tolerance), so it went to an 18-month cycle; another OOT was found, so it dropped to 9 months. After repeated OOT findings, this asset is now on a 1.5-month cycle.
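Just to show how fast this rule collapses an interval, here is a quick back-of-envelope sketch of the halve-on-OOT logic (the function and the restore rule are my own illustration of the instruction, not our actual procedure text):

```python
# Sketch of the "halve on OOT, restore after 3 clean cycles" rule.
# Illustrative only; exact rounding of fractional months varies in practice.

def next_interval(interval_months, original_months, oot_found, clean_streak):
    """Return (new_interval, new_clean_streak) after one calibration event."""
    if oot_found:
        return interval_months / 2, 0      # interval cut in half, streak resets
    clean_streak += 1
    if clean_streak >= 3:
        return original_months, 0          # three clean cycles: original interval restored
    return interval_months, clean_streak

# Walk the 3-year (36-month) asset through repeated OOT findings:
interval, streak = 36.0, 0
for n in range(5):
    interval, streak = next_interval(interval, 36.0, oot_found=True, clean_streak=streak)
    print(f"after OOT #{n + 1}: {interval:g} months")
# 18, 9, 4.5, 2.25, 1.125 -- right around the 1.5-month territory we ended up in
```

Five findings and a 3-year interval is down to roughly a month, with no statistical evidence that the drift rate actually changed.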
I am working on a proposal to follow NCSLI RP-1, Establishment and Adjustment of Calibration Intervals, so we can save money, keep equipment in service longer, etc. To be clear: each time an asset has an OOT, we already provide an impact report covering the period from the last fully in-tolerance calibration up to the OOT finding.
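The core idea I want to pitch from RP-1 is adjusting intervals based on observed in-tolerance reliability against a target, instead of reflexively halving. Very roughly, the shape of that logic looks like this (my simplified illustration of a reliability-target approach; RP-1 defines the formal methods, and the target and cap values here are just placeholders):

```python
# Simplified reliability-target interval adjustment (illustrative only;
# RP-1 describes formal methods -- this only shows the general shape).

def adjust_interval(interval_months, in_tol_count, total_cals,
                    target=0.95, max_change=0.25):
    """Nudge the interval so observed in-tolerance reliability tracks a target.

    in_tol_count / total_cals = fraction of calibrations received in tolerance.
    max_change caps any single adjustment at +/-25% so intervals stay stable.
    """
    observed = in_tol_count / total_cals
    factor = observed / target    # beat the target: lengthen; miss it: shorten
    factor = max(1 - max_change, min(1 + max_change, factor))
    return interval_months * factor

# Asset with 9 of 10 calibrations received in tolerance on a 12-month interval:
print(adjust_interval(12, 9, 10))   # ~11.4 months -- a measured trim, not a halving
```

The point is that the adjustment is proportional to evidence and bounded in both directions: good history earns a longer interval, not just a slow climb back to the original one.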
This all seems odd to me. As long as the asset was readjusted back into tolerance, we should not keep reducing intervals. I want to apply an A3 model to build a better system. The assets whose intervals we reduce always come back “as left in tolerance.”
I also feel that the frequent calibrations put extra stress on the equipment, since every additional cycle exposes it to handling and shipping risk whenever it leaves our facility.
Can anyone help me strengthen my case that this practice is not value-added, or share more specific information that would help with my situation? Thank you!