Calibration of Pin Gages using a Calibrated Caliper that measures to 4 decimal places

DAHSR

Registered Visitor
I have 30 Pin Gage sets to "calibrate". They are all 3 decimal place (X.XXX) with either a plus or minus accuracy. My plan is to use a calibrated caliper that measures to 4 decimal places, which we maintain as a standard (not used for production) with documentation, a certificate of calibration, and variable data traceable to NIST. Is there a standard in the world of calibration, such as ISO, ANSI, etc., that would specifically not permit me to use calipers to verify the accuracy of each pin? The pins are not used for final inspection; that is done by QC at the end of the manufacturing process. Our product dimensional tolerance is measured to a maximum of 3 decimal places, and the tightest tolerance is +/- .005".
 

alspread

Re: Calibration of Pin Gages

I don’t have a specific answer to your question, but I had a similar problem and found a solution that was acceptable to my auditor and customers without putting too much burden on the organization.
Basically, the fundamental problem I had was that it was nearly impossible to identify individual pin gages, which would be necessary to perform any kind of meaningful calibration. Add to this the problem that the pins would invariably “float” around the shop from set to set with little ability to control them.
I had multiple sets of gage pins located in many different areas on the shop floor. These pins were in addition to Go/No Go pins installed in the typical red and green handles. The pins mounted in the handles were used for acceptance or rejection and could be easily identified (labeled), which made calibration easy within the calibration system.
I wrote my internal procedure to specify that loose pins were not to be used for product acceptance and that they must be calibrated prior to use for all applications.
In order to be sure that everyone knew the procedure for pin calibration, I laminated and posted a copy of the loose pin calibration procedure in the cover of all of the pin set cases. My procedure specified that a calibrated micrometer equipped with a tenths barrel was required and that the pin was to be inspected (i.e., calibrated) in 2 places 90 degrees apart at both ends and the middle. All results had to be within +/-.0004" of the stated pin size.
I couldn’t tell you that everyone who used a pin performed this process. But I could tell you that they knew what the requirement was and they understood the importance.
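That spot-check can be sketched as a simple pass/fail routine. This is only an illustrative sketch of the rule described above; the function name and the example readings are hypothetical, not from the actual posted procedure:

```python
def pin_check(nominal, readings, limit=0.0004):
    """Pass/fail check for a loose gage pin.

    nominal  -- stated pin size in inches
    readings -- six micrometer readings: two orientations 90 degrees
                apart at each end and at the middle
    limit    -- allowed deviation from nominal, +/- inches
    """
    if len(readings) != 6:
        raise ValueError("expected 6 readings (2 x 90 deg at each end and middle)")
    return all(abs(r - nominal) <= limit for r in readings)

# Hypothetical example: a nominal 0.250" pin, worst reading 0.0002" off
readings = [0.2501, 0.2500, 0.2499, 0.2502, 0.2500, 0.2501]
print(pin_check(0.250, readings))  # True: all within +/-0.0004"
```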

Good Luck
 

Jerry Eldred

Forum Moderator
Super Moderator
Re: Calibration of Pin Gages

I'm not much of a dimensional person (more RF and precision electrical standards), but my only thought would be the uncertainty ratio. Take the various uncertainty contributors for the calipers (their tolerance, 1/2 the resolution, repeatability, temperature coefficient, and anything else), combine them using the Root Sum Square method, and multiply by two for K=2 confidence; then compare that with the tolerance of the pin gages. The tolerance of the pin gage needs to be at least 4 times the K=2 combined and expanded uncertainty of the measurement. If it is not, the method is not generally acceptable.
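The RSS-then-expand arithmetic can be sketched numerically. The component values below are placeholders for illustration only, not real caliper data; the logic is the standard combination u_c = sqrt(sum of u_i squared), expanded by K=2, then compared against the pin tolerance:

```python
import math

def expanded_uncertainty(components, k=2):
    """Root-sum-square the standard-uncertainty components, then expand by k."""
    return k * math.sqrt(sum(u * u for u in components))

# Placeholder caliper contributors (inches) -- illustrative values only:
components = [
    0.0005,   # instrument tolerance
    0.00025,  # half of a 0.0005" display resolution
    0.0002,   # repeatability
    0.0001,   # temperature effects
]
U = expanded_uncertainty(components)        # K=2 expanded uncertainty
pin_tolerance = 0.0002                      # e.g. a Class ZZ pin, +/-0.0002"
tur = pin_tolerance / U                     # want tur >= 4
print(f"U = {U:.5f} in, TUR = {tur:.2f}:1")
```

With numbers anywhere near these, the ratio comes out far below 4:1, which is the point of the post above: the caliper's combined uncertainty swamps the pin tolerance.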

However, if you don't have to comply with any ISO standard for these, it is a matter of assessing the risk of not having an adequate uncertainty ratio. If it is 1.5 to 1, or 2 to 1, your quality people (or whoever has that authority) would need to judge whether it is acceptable.

I'm certain there are procedures out there for calibrating pin gages, and I would guess a laser mic or something equivalent would normally be used. Theoretically (and I'll probably stir the pot up on this one), you can calibrate anything any way you want. However, any method you use is only as appropriate as its compliance with whatever your requirements are: whatever your quality system dictates, whatever end-product quality or reliability can tolerate, or whatever is acceptable to your customer.

So there are right and wrong ways to calibrate, but the definition of right and wrong is determined by the factors above.

I was a supervisor in a lab in an FDA-regulated environment where every instrument had to be calibrated by the 100% full OEM procedure only, no exceptions. That is not a bad practice. But (and maybe one of my old bosses there may read this - and they know me) there are times when the full OEM procedure is inferior to other methods. They also required full before-and-after data on every calibration (and I mean EVERY calibration). That is not a bad thing either, but there are a lot of instruments where the data adds no value (I'll probably stir up the pot on that one as well).

Then there is the whole issue of ISO 17025 accredited cals. Some companies get accredited cals when they don't need them. I used to do nuclear-compliant cals where we had to hand write the manufacturer, model, serial number, cal date, and due date of every standard used on a separate form (in addition to what was printed on the certificate). And in the hand-written before-and-after data, we had to write in the units (i.e., mV, VDC, kHz, etc.). I got a call from a nuclear customer who had to give me a deviation because I forgot to write "mV" after a value, even though it was clearly identified as such elsewhere on that data line.

I could go on literally for hours with stories like those. The bottom line is that there are so many details in calibration that your company or your regulatory or compliance body decides for you that it's a moving target.

Is it okay to use calipers to calibrate pin gages? Yes and No. I won't elaborate as it would be a rerun of the above.

I've been very busy at work and I'm a little behind on my posts here. I'll try to do better.
 

JAltmann

Re: Calibration of Pin Gages using a Calibrated Caliper that measures to 4 decimal places

I would echo Jerry's thoughts on the uncertainty ratio. The caliper itself really has an absolute best-case uncertainty of .0005". That would require the uncertainty of the gage blocks and the environment used to calibrate it to be perfect, and that same perfect environment to be used to calibrate/certify the gage pins.

Now we shift to reality, which puts that caliper at closer to .001" uncertainty of measurement. That alone puts you at only a 10:1 ratio against the final inspection criteria of the product you're using these pins to measure.

Typically gage pins are checked with a .0001" or better micrometer. The use of 30 sets of gage pins seems rather high; does the product line require full sets of gage pins, or could reduced-size sets be used?

Depending on your industry/company certifications, you may be able to use the calipers in your procedures to calibrate these gage pins, but your company's risk of having non-conforming product go out the door will be increased. You may also get some pushback from an auditor who does not like this tool for calibrating gage pins.
 

Jeff Frost

Re: Calibration of Pin Gages

A caliper cannot be used for calibration of pin gages. The accuracy of a caliper is typically only .001 inch for both digital and dial models unless otherwise specified by the manufacturer. You will need to maintain a 4:1 accuracy ratio to the pin gages (accuracy of .0002 to .00002 depending on class), which will mandate use of a bench mic at a minimum.
 

Sturmkind

Re: Calibration of Pin Gages using a Calibrated Caliper that measures to 4 decimal places

Hi, DAHSR!

Sounds like an acceptable plan to me. Just one caution: if the calipers are digital, the 4th decimal place will either be a zero or a 5, though that may not be significant considering the tolerance range you are using the pins for. My personal preference would be a 0.0001" OD micrometer with a friction thimble, which I believe would be more likely to pick up any burrs and may more accurately detect an undersize condition at the pin tips.

Also, make sure there is no wear on the jaws and that light can't be seen through them when fully closed, but I suspect you already check that as standard practice.

Best of luck!
 

JRKH

Re: Calibration of Pin Gages using a Calibrated Caliper that measures to 4 decimal places

I don't know of any specific requirement that would prevent you from doing as you propose, but personally I can't think of a circumstance where I would be able to justify it to myself, let alone an auditor or customer.

I echo the others here that a caliper is absolutely the wrong instrument to use. At the very least you should be using a standard micrometer that measures to 4 decimal places. My personal preference is an old-fashioned analog readout - nothing electronic, but that's just me...
I like alspread's plan of requiring a floor calibration check before use, particularly since you have so many sets to check. If your circumstances allow it, that's the way I would go.

Then you can call in each set on a rotating basis and do a followup check for wear and accuracy, replacing individual pins as needed.

Peace
James
 

Jeff Frost

Re: Calibration of Pin Gages using a Calibrated Caliper that measures to 4 decimal places

Sturmkind said:
"Hi, DAHSR! Sounds like an acceptable plan to me. Just a caution is that if the calipers are digital, the 4th decimal place will either be a zero or a 5, though that may not be significant considering the tolerance range you are using the pins for. My personal preference would be a 0.0001" OD micrometer. Best of luck!"

The lowest tolerance pin gage (Class ZZ) is .0002". If you plan to use an OD mic with an accuracy of .0001", you will need to reclassify the tolerance of the pins to .001" (10:1) or .0004" (4:1). This is why you should be using a bench mic, which supports a higher accuracy ratio.
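The reclassification arithmetic above reduces to one rule: the smallest tolerance an instrument can verify is its accuracy times the required ratio. A minimal sketch, with the function name being my own illustration:

```python
def min_verifiable_tolerance(instrument_accuracy, ratio):
    """Smallest +/- tolerance an instrument can verify at a given accuracy ratio."""
    return instrument_accuracy * ratio

# A 0.0001" mic at the two ratios discussed in the thread:
for ratio in (10, 4):
    print(f"{ratio}:1 -> +/-{min_verifiable_tolerance(0.0001, ratio):.4f} in")
```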
 

D.MOTA

Re: Calibration of Pin Gages

Very interesting... I myself have the same problem: around the shop we have pins everywhere. Are you willing to share that procedure?
 

Hershal

Metrologist-Auditor
Trusted Information Resource
Re: Calibration of Pin Gages using a Calibrated Caliper that measures to 4 decimal places

There have been many good replies here so I will add just a little.

If the caliper is used, who did the calibration? What is the uncertainty of that calibration, from the calibration certificate? That is absolutely critical for calculating the uncertainty for your pins.

The calibration of each size pin also must have its uncertainty calculated, and the influences have been listed a bit before my reply. Uncertainty calculations must include both Type A and Type B components.
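A minimal sketch of combining Type A and Type B components, following the usual GUM-style rules (Type A as the standard uncertainty of the mean of repeated readings; Type B from a specification bound assumed rectangular, so divided by sqrt(3)). All numbers here are illustrative, not real data:

```python
import math
import statistics

def type_a(readings):
    """Type A: standard uncertainty of the mean from repeated readings."""
    return statistics.stdev(readings) / math.sqrt(len(readings))

def type_b_rectangular(half_width):
    """Type B: a bound of +/-half_width, assumed rectangular, gives half_width/sqrt(3)."""
    return half_width / math.sqrt(3)

# Illustrative numbers only (inches):
u_a = type_a([0.25012, 0.25010, 0.25011, 0.25013, 0.25009])
u_b = type_b_rectangular(0.0005)        # e.g. an instrument accuracy spec
u_c = math.sqrt(u_a**2 + u_b**2)        # combined standard uncertainty
U = 2 * u_c                             # expanded uncertainty, K=2
print(f"u_A = {u_a:.6f}, u_B = {u_b:.6f}, U(k=2) = {U:.6f} in")
```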

Only once the uncertainty has been calculated for each size pin can the Test Uncertainty Ratio (TUR) be calculated. To achieve the 4:1 that has been mentioned, the K=2 expanded uncertainty of the pin's calibration is compared to the tolerance of the pin; the tolerance must be at least four times that uncertainty. This is using the two-sided tolerance, of course.

The likelihood of even a digital caliper being able to achieve that is really small.

Best solution is to negotiate with an accredited calibration provider to cal your pins. Yes it will cost, but the application of any science, done correctly, is not cheap.
 