Proper gage to use for receiving inspection

Williamk_85

Registered
Apologies if this has been asked before, I've searched the forums for 2 hours trying to find a thread about this.

I recently had our internal auditor write a nonconformance for using digital calipers to measure the I.D. of a slot on a bracket. He stated that because the feature on the print is 11.11mm +0.3/-0.1mm, we would need a gage that can measure to 0.001mm (based on the feature size), whereas I was under the impression the gage had to measure to 10% of the total tolerance (0.04mm), which the digital calipers are capable of.

What is the correct interpretation? 10% of the feature size or 10% of the total tolerance? Everything I've found in the MSA 4th edition states the 10-to-1 rule should divide the tolerance.
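For reference, the arithmetic under the tolerance-based reading of the 10-to-1 rule is small enough to sketch (an illustration using only the numbers quoted above; the 0.01mm caliper resolution is the value mentioned later in the thread):

```python
# 10-to-1 rule per the MSA manual: instrument discrimination should
# divide the TOTAL tolerance (not the feature size) into ten parts.
upper_tol = 0.3           # mm, from the +0.3 callout
lower_tol = -0.1          # mm, from the -0.1 callout

total_tolerance = upper_tol - lower_tol      # 0.4 mm total zone
required_resolution = total_tolerance / 10   # 0.04 mm under 10:1

caliper_resolution = 0.01  # mm, typical digital caliper readout

print(f"total tolerance    = {total_tolerance:.2f} mm")
print(f"10:1 requirement   = {required_resolution:.2f} mm")
print(f"caliper meets 10:1? {caliper_resolution <= required_resolution}")
```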
 

Mike S.

Happy to be Alive
Trusted Information Resource
The way I look at it, a gage capable of +/- 0.02mm would be great (10:1), but many companies allow 4:1, or +/- 0.05mm.
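Those ratios can be checked with a couple of lines of arithmetic (an illustrative sketch; the 0.4mm total tolerance comes from the +0.3/-0.1 callout in the original post):

```python
# Allowed instrument spread at common test-accuracy ratios, for a
# total tolerance of 0.4 mm (+0.3 / -0.1 on the print).
total_tolerance = 0.4  # mm

for ratio in (10, 4):
    span = total_tolerance / ratio  # total allowed instrument spread
    half = span / 2                 # same thing expressed as +/-
    print(f"{ratio}:1 -> {span:.2f} mm total, i.e. +/- {half:.2f} mm")
```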
 

Williamk_85

Registered
I'm not sure I explained my question clearly.

The calipers we are using read out in 0.01mm increments on the digital display.

The auditor is stating we would need to use a measuring tool that reads to three decimal places, since the feature size is called out to two decimal places (slot width 11.11mm).

I was under the impression the calipers would be acceptable, since the tolerance is one decimal place (+0.3mm / -0.1mm), so my measuring instrument can accurately measure to 10% of the total tolerance (it reads to 0.01mm).

The NC was written up against ISO 9001:2015 clause 7.1.5.1, but within that clause I fail to see where it is defined that the measurement tool has to be accurate to 10% of the feature size. That takes me to the MSA 4th Edition, which states: "Adequate discrimination and sensitivity. The commonly known rule of Tens, or 10-to-1 Rule, states that instrument discrimination should divide the tolerance into ten parts or more".

I personally do not believe this NC is justified. Am I misinterpreting the standard/requirement?
 

Ron Rompen

Trusted Information Resource
Your internal auditor is mistaken. There is no requirement in the standard to adhere to the 10:1 rule on all measurement devices. In fact, the 10:1 rule is not a 'rule' per se, just a general guideline that everyone quotes as a rule.

Although it would be nice to have all the precision measuring instruments you want, in the current economic climate, that is not always feasible. If what you have works, and works well, then there is no need to change it.

Have you performed an MSA analysis on the caliper on that feature? If so, were the results acceptable?
Do you have any history of customer complaints for the slot ID which were related to measurement error?
Have you recently changed the method of measurement (from a more precise gauge to the 'new' caliper)?
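If an MSA study hasn't been run on this specific feature, a short-form (range method) study is one quick way to sanity-check the caliper. The sketch below uses invented readings and an assumed d2* value from the MSA tables, so it is illustrative only, not the method required by any standard:

```python
# MSA "range method" short study sketch: two appraisers each measure
# the same five parts once; GRR sigma is estimated as mean range / d2*.
# The readings below are INVENTED for illustration.
appraiser_a = [11.12, 11.15, 11.10, 11.14, 11.13]  # mm
appraiser_b = [11.13, 11.14, 11.11, 11.15, 11.12]  # mm

d2_star = 1.19       # assumed d2* for 2 readings per part, 5 parts
tolerance = 0.4      # mm, total tolerance from the print (+0.3 / -0.1)

ranges = [abs(a - b) for a, b in zip(appraiser_a, appraiser_b)]
mean_range = sum(ranges) / len(ranges)
grr_sigma = mean_range / d2_star
pct_grr = 100 * 6 * grr_sigma / tolerance   # 6-sigma spread vs tolerance

print(f"%GRR (vs tolerance) = {pct_grr:.1f}%")
# Common MSA guidance: under 10% is acceptable, 10-30% may be
# acceptable depending on the application, over 30% is not.
```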
 

Williamk_85

Registered
Ron Rompen said:
Your internal auditor is mistaken. There is no requirement in the standard to adhere to the 10:1 rule on all measurement devices. In fact, the 10:1 rule is not a 'rule' per se, just a general guideline that everyone quotes as a rule.

Although it would be nice to have all the precision measuring instruments you want, in the current economic climate, that is not always feasible. If what you have works, and works well, then there is no need to change it.

Have you performed an MSA analysis on the caliper on that feature? If so, were the results acceptable?
Do you have any history of customer complaints for the slot ID which were related to measurement error?
Have you recently changed the method of measurement (from a more precise gauge to the 'new' caliper)?


MSA was done on the caliper I.D. jaws and was deemed acceptable. It was not performed on the specific slot feature mentioned.
No customer complaints on slot size.
No change in measurement method; it has always been calipers.

Thank you for the clarification.
 

BradM

Leader
Admin
Hello William!

Williamk_85 said:
Apologies if this has been asked before, I've searched the forums for 2 hours trying to find a thread about this.

There is no apology needed!! We do appreciate you doing a search.

I recently had our internal auditor write a nonconformance for using digital calipers to measure the I.D. of a slot on a bracket. He stated that because the feature on the print is 11.11mm +0.3/-0.1mm, we would need a gage that can measure to 0.001mm (based on the feature size), whereas I was under the impression the gage had to measure to 10% of the total tolerance (0.04mm), which the digital calipers are capable of.

What is the correct interpretation? 10% of the feature size or 10% of the total tolerance? Everything I've found in the MSA 4th edition states the 10-to-1 rule should divide the tolerance.

If you are measuring a dimension of 11.11mm +0.3/-0.1mm, you want a capable instrument. You will want a device that reads to 0.01mm resolution, and I would suggest it should have an accuracy of somewhere around ±0.02mm.

With that... I see nothing that would dictate needing 0.001mm.
 

Mike S.

Happy to be Alive
Trusted Information Resource
Just as an aside, people often quote "the 10:1 rule," and it is ideal if you can do it, but it is often not practical and sometimes not possible. SAE, the military, and Boeing will all quite often accept 4:1.

If 10:1 is an issue, see what the customer, and the process, actually requires.
 

Cari Spears

Super Moderator
Leader
Super Moderator
The auditor is wrong. The 10:1 rule applies to the total tolerance of 0.4mm (+0.3mm / -0.1mm), not to the feature size.

The decimal places don't matter either; what matters is the size of the tolerance zone. The exception would be if the number of decimal places invoked an "unless otherwise specified" default tolerance in the title block, which is not the case here, where the print specifies +0.3/-0.1.
 