Criteria for Defining a Measurement Device's Tolerance

MalibuJim

Starting to get Involved
Some very basic questions that are eluding me:

1. How does one define a gage's tolerance (i.e., arbitrarily or based on a spec)?
1.1 From the manufacturer's documents (they don't seem to like to specify this)?
1.2 From a specification (e.g., GGG-C-105C is obsolete; is there a replacement or other document that is more comprehensive)?
1.3 What are the best resources for learning more about gage tolerance?
1.4 Is it acceptable to apply the 10:1 rule based on the tolerances of the parts I will be inspecting? Can anyone provide a citation that invokes the 10:1 rule?

2.0 The basis for my question includes 0-1" Digimatic Micrometers with .00005" resolution (blade, 1/4" barrel, and .080" step barrel); a 0-1" Digimatic Height Gage; an SJ-400 Surface Finish Gage; and even a Nikon vision system. The manuals for these instruments speak to "accuracy" but are silent about their uncertainty (at least in numeric terms, inches or mm).

Thank You,

Jim
 

Antonio53

Any feedback?
Usually, if you need a gage, you will choose the one that meets your needs (specs), so the tolerance will be the one defined by the manufacturer of the gage you choose to accept. Sometimes the tolerance can be redefined by the end user, but in that case the new one must be fully validated.
 

samsung

Usually, if you need a gage, you will choose the one that meets your needs (specs), so the tolerance will be the one defined by the manufacturer of the gage you choose to accept. Sometimes the tolerance can be redefined by the end user, but in that case the new one must be fully validated.

Though I am not very experienced with measuring devices, IMO what you call 'tolerance' here is the 'least count' that the instrument can measure, and it is certainly different from the term 'tolerance', also known as 'acceptance', the right to define which is (or should be) mostly reserved for the customer and not the other way around.

The 'tolerance' (by which I suppose you mean the least count here), even though defined by the manufacturer, should be validated (confirmed) prior to putting the instrument into use.
 

Antonio53

Of course, even if you accept the manufacturer's specs, you should, in any case, qualify/validate the gage in order to demonstrate its effectiveness/efficiency in the process that you want to keep under control.
 

samsung

Of course, even if you accept the manufacturer's specs, you should, in any case, qualify/validate the gage in order to demonstrate its effectiveness/efficiency in the process that you want to keep under control.

You validate or verify the gage just to ensure that it reads correctly whatever is subjected to measurement. What demonstrates the efficiency of the process is not the measuring accuracy but the decision or action that you take based upon the measurement so as to keep the process on track/under control. Isn't it?
 

Antonio53

You should validate the gage in order to demonstrate that it is the tool you need to assure that the part of the process monitored with it will give product according to the specs you have defined. The decision/action you will take, based on the measurement, is part of the control of that specific part of the process; that's it. If the measured value is out of spec, that means the process needs some correction in order to have, again, something suitable for the next step.
 

samsung

You should validate the gage in order to demonstrate that it is the tool you need to assure that the part of the process monitored with it will give product according to the specs you have defined. The decision/action you will take, based on the measurement, is part of the control of that specific part of the process; that's it. If the measured value is out of spec, that means the process needs some correction in order to have, again, something suitable for the next step.

Correct. That's the point.
 

BradM

Leader
Admin
Jim, you ask a million-dollar question.

For starters, I will use the manufacturer's tolerance for a device, because, in theory, that is what the device is supposed to be able to do. However, some devices never meet the manufacturer's specification, no matter how short the calibration interval. So, that leads me to #2.

#2: Set the tolerance so that the device has the required accuracy for the process. Say I have a pressure gauge that the manufacturer says is ±0.1 psi, and I use the gauge on a process where I'm measuring something to ±5 psi. I might set the gauge tolerance to ±0.5 or even ±1 psi.

So, I would start with the manufacturer's tolerance. Then, if your process spec is known, work backwards from there and determine what the device needs to meet.
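
In case it helps to see the arithmetic, here is a minimal sketch in Python of working backwards from a process tolerance to the tolerance the device needs to hold. The numbers mirror the pressure-gauge example above, and the 4:1 alternative is just a commonly quoted rule of thumb for comparison, not anything from a manufacturer's document:

```python
# Minimal sketch: derive the device tolerance from the process tolerance
# using a target accuracy ratio (10:1 here, 4:1 shown for comparison).
# Values are illustrative only -- substitute your own process specs.

def required_device_tolerance(process_tolerance: float, ratio: float) -> float:
    """Largest device tolerance that still satisfies the chosen ratio."""
    return process_tolerance / ratio

process_tol_psi = 5.0   # process spec: +/- 5 psi
mfg_tol_psi = 0.1       # manufacturer's stated accuracy: +/- 0.1 psi

for ratio in (4.0, 10.0):
    needed = required_device_tolerance(process_tol_psi, ratio)
    verdict = "meets" if mfg_tol_psi <= needed else "does not meet"
    print(f"{ratio:.0f}:1 rule -> device must hold +/- {needed} psi; "
          f"mfg spec of +/- {mfg_tol_psi} psi {verdict} it")
```

Either way, the gauge in this example has plenty of headroom, which is why you can loosen its calibration tolerance to ±0.5 or even ±1 psi and still protect the process.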

Hope that helps.
 

MalibuJim

Starting to get Involved
Brad and Everyone else,

Thank you for the guidance on the gage meeting the minimum requirements for any given application. To me this is a straightforward 10:1: I have an O.D. with a tolerance of ±.0001", so my gage must be accurate to within ±.00001" (ten millionths). Accuracy being the ability to repeat and to distinguish between two different parts whose values might be .10001" and .10002". So, the "read-out" (resolution) on the gage by itself is meaningless. In my example I would probably only be able to use a super-micrometer.
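
For what it's worth, here is a rough sketch of that 10:1 check in Python, working in microinches so the comparison stays exact; the accuracy figures for the two candidate gages are hypothetical placeholders, not manufacturer data:

```python
# Rough sketch of the 10:1 check for the +/- .0001" O.D. example above.
# Accuracies are in microinches (millionths of an inch) and are hypothetical.

part_tol_uin = 100                # +/- .0001" = 100 microinches
required_uin = part_tol_uin / 10  # 10:1 -> +/- 10 microinches (ten millionths)

candidate_gages_uin = {
    "0-1 in digital micrometer (hypothetical)": 50,
    "bench super-micrometer (hypothetical)":    10,
}

for name, accuracy_uin in candidate_gages_uin.items():
    verdict = "acceptable" if accuracy_uin <= required_uin else "not adequate"
    print(f"{name}: +/- {accuracy_uin} uin vs required +/- {required_uin:.0f} uin -> {verdict}")
```

The point stands: the display resolution tells you nothing by itself; it is the stated accuracy (or better, the measurement uncertainty) that has to clear the 10:1 hurdle.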

Next issue: does anyone have any suggestions for specifications or textbooks that define the flow of gage selection and incorporate the 10:1 rule?

Perhaps I'm hung up on having a published document that directs this. Obviously I could write my own procedure that follows the first paragraph in this post and then put it into practice. No problem. I just like to have a body-of-knowledge reference (plus I like to read).

Take Care,

Jim ASQ-CQE; SSBB
 