Re: Calibration of Pin Gages
I'm not much of a dimensional person (more RF and precision electrical standards), but my only thought would be the uncertainty ratio. Take the various uncertainty contributors for the calipers (their tolerance, 1/2 the resolution, repeatability, temperature coefficient, and anything else), combine them using the root-sum-square (RSS) method, and multiply by two for k=2 (roughly 95% confidence); then compare that with the tolerance of the pin gages. The tolerance of the pin gage needs to be at least 4 times greater than the k=2 combined and expanded uncertainty of the measurement. If it is not, the method is not generally acceptable.
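If it helps, here is a rough sketch of that math in Python. Every numeric value is a made-up placeholder, not a real spec, and to keep it simple I've skipped the step a full budget would take of dividing Type B limits by sqrt(3) for a rectangular distribution:

```python
import math

# Hypothetical uncertainty contributors for the calipers, in inches.
# These numbers are placeholders only - substitute your own budget.
components = {
    "caliper_tolerance": 0.001,    # accuracy spec of the calipers
    "half_resolution":   0.00025,  # 1/2 of a 0.0005" resolution
    "repeatability":     0.0002,   # from a repeatability study
    "temperature":       0.0001,   # temperature-coefficient contribution
}

# Root-sum-square combination of the contributors
u_combined = math.sqrt(sum(u**2 for u in components.values()))

# Expanded uncertainty at k=2 (~95% confidence)
U_expanded = 2 * u_combined

# Hypothetical pin gage tolerance (e.g., a tight-class pin at +/-0.0002")
pin_tolerance = 0.0002

# Test uncertainty ratio: tolerance of the unit under test
# divided by the expanded uncertainty of the measurement
tur = pin_tolerance / U_expanded

print(f"Expanded uncertainty (k=2): {U_expanded:.5f} in")
print(f"TUR = {tur:.2f}:1 -> {'acceptable' if tur >= 4 else 'NOT acceptable at 4:1'}")
```

With those placeholder numbers the ratio comes out far below 4:1, which is exactly why calipers are usually a poor choice for calibrating pin gages.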
However, if you don't have to comply with any ISO standard for these, it is a matter of assessing the risk of working with an inadequate uncertainty ratio. If it is 1.5 to 1 or 2 to 1, your quality people (or whoever has that authority) would need to make the judgment as to whether it is acceptable.
I'm certain there are procedures out there for calibrating pin gages, and I would guess a laser micrometer or something equivalent would normally be used. Theoretically (and I'll probably stir the pot on this one), you can calibrate anything any way you want. However, any method you use is only as appropriate as its compliance with whatever standards you fall under, whatever your quality system dictates, whatever your end-product quality or reliability can tolerate, or whatever is acceptable to your customer.
So there are right and wrong ways to calibrate, but the definition of right and wrong is determined by the factors above.
I was a supervisor in a lab in an FDA-regulated environment where every instrument had to be calibrated by the 100% full OEM procedure only, no exceptions. That is not a bad practice. But (and maybe one of my old bosses there may read this - and they know me) there are times when the full OEM procedure is inferior to other methods. They also required full before-and-after data on every calibration (and I mean EVERY calibration). That is not a bad thing. But there are a lot of instruments where the data adds no value (I'll probably stir the pot on that one as well).

Then there is the whole issue of ISO/IEC 17025 accredited cals. Some companies get accredited cals when they don't need them. I used to do nuclear-compliant cals where we had to hand-write the manufacturer, model, serial number, cal date, and due date of every standard used on a separate form (in addition to what was printed on the certificate). And in the hand-written before-and-after data, we had to write in the units (e.g., mV, VDC, kHz, etc.). I got a call from a nuclear customer who had to give me a deviation because I forgot to write "mV" after a value, even though it was clearly identified as such elsewhere on that data line.
I could go on literally for hours with stories like those. The bottom line is that there are so many details in calibration that your company or your regulatory or compliance body decides for you that it's a moving target.
Is it okay to use calipers to calibrate pin gages? Yes and no. I won't elaborate, as it would be a rerun of the above.
I've been very busy at work and I'm a little behind on my posts here. I'll try to do better.