I've just been given this as a science project. It relates to physical gages (threads, rings, plugs, etc.). The practice has been to allow internal calibration failures of up to 1.5 times the total gage tolerance without investigating the impact on the product; the total gage tolerance is 0.005 mm, so the allowance works out to 0.0075 mm (about 0.0003"). The question was posed: "Where did this come from? How is it justified?" So far the only answer is "we've always done it that way," and that's been the case for 25+ years.
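Just to sanity-check the arithmetic in that practice, here is a minimal sketch using only the numbers stated above (the 1.5 factor and the 0.005 mm total gage tolerance):

```python
# Acceptance allowance described in the practice above.
MM_PER_INCH = 25.4

gage_tolerance_mm = 0.005            # total gage tolerance (from the post)
allowance_factor = 1.5               # internal practice: 1.5x tolerance
allowance_mm = allowance_factor * gage_tolerance_mm

print(round(allowance_mm, 4))                # 0.0075 mm
print(round(allowance_mm / MM_PER_INCH, 4))  # 0.0003 in (rounded)
```

So the 0.0075 mm / 0.0003" figures check out against the stated tolerance; what's missing is the justification for the 1.5 factor itself.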
At a guess it came from the "old" school, since the procedures reference NCLS-150. So instead of using uncertainty, we used the 10:1 rule to develop the gage/method and a 4:1 ratio for the calibration accuracy. Am I off base?
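For anyone following along, the ratio chain I mean looks like this. The product tolerance value is purely hypothetical for illustration; I chose it so the result matches our 0.005 mm gage tolerance:

```python
# Classic accuracy-ratio chain (hypothetical product tolerance for illustration).
product_tolerance_mm = 0.05                     # assumed, not from our specs
gage_tolerance_mm = product_tolerance_mm / 10   # 10:1 gage maker's rule
cal_accuracy_mm = gage_tolerance_mm / 4         # 4:1 calibration accuracy ratio

print(gage_tolerance_mm)  # 0.005
print(cal_accuracy_mm)    # 0.00125
```

Note that nothing in this chain produces a 1.5x out-of-tolerance allowance, which is why I suspect that factor came from somewhere else entirely.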
I definitely want to get to an uncertainty budget for our lab, but we aren't there yet.