Gage R&R with a unilateral specification tolerance - What about the other limit?



How can we calculate the R&R for a gauge if the process where it is used has only a unilateral tolerance? e.g., min 35 lb-in

David Drue Stauffer

In the R&R process, the values are recorded and tabulated, and the result is expressed as a percentage of the total tolerance. If the low end of the tolerance is 0 and the high end is 35 (or, equivalently, 0 and -35), the absolute value of the total tolerance is 35. When the R&R values are computed, squared, and added together, then the square root is taken and divided by the tolerance, the resulting number is the gage R&R percentage.
10% or less is excellent; 11-30% is acceptable depending on the application, but should be closely monitored; and 31% or greater is unacceptable.
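The arithmetic described above can be sketched in Python (the function name and the EV/AV numbers are illustrative, not from the thread):

```python
import math

def percent_grr(ev, av, tolerance):
    """Combine equipment variation (EV) and appraiser variation (AV)
    by root-sum-of-squares, then express as a percent of tolerance."""
    grr = math.sqrt(ev**2 + av**2)
    return 100.0 * grr / tolerance

# Illustrative numbers: EV = 2 lb-in, AV = 1 lb-in, tolerance = 35 lb-in
pct = percent_grr(2.0, 1.0, 35.0)

# Acceptance bands as stated in the post above
if pct <= 10:
    verdict = "excellent"
elif pct <= 30:
    verdict = "acceptable (monitor closely)"
else:
    verdict = "unacceptable"
```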


I understand what you say, and that works if you have only specified the limit at the high side of the spec, but what happens when you have only the low side of the spec?


Perhaps you could use the actual error as a guide. When you calculate a GR&R, you get two values: the actual error and a percent of tolerance. With respect to your minimum requirement of 35 lb-in, if the actual error in your R&R is, say, 3 lb-in, and you can demonstrate that your process is capable to 38 lb-in minimum, then the actual gage error of 3 lb-in will not cause you to accept bad product. Sign it off and make parts.
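That check can be formalized as a small sketch, using the thread's example numbers (the function name is mine, not an established convention):

```python
def gage_error_safe(spec_min, process_min, gage_error):
    """With a one-sided lower spec, the gage error is tolerable if the
    process sits far enough above the spec limit that measurement error
    alone cannot make a bad part read as good."""
    margin = process_min - spec_min  # guard band available above the limit
    return gage_error <= margin

# Thread example: spec min 35 lb-in, process capable to 38 lb-in,
# actual gage error 3 lb-in -> the error fits within the margin.
ok = gage_error_safe(35.0, 38.0, 3.0)
```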


Would this apply equally to leakage?
The maximum allowable leak rate is 4 ml/min.

Above 4 ml/min the part is rejected.

Attribute wise the gauge is capable, but our customer insists on
variable gauge R&R.

This is causing a headache, as my results are well over 30%



In my opinion, if the test for product conformity is pass/fail, regardless of whether a value is registered, then an attribute study would apply. Your product doesn't perform any better at 41 than at 37; you just need to hit that minimum target. That's why the specification is written as it is. That holds only if the test doesn't destroy the product or compromise its integrity. If it does, a Gage R&R couldn't appropriately be conducted, as the reference values could change during the study. In these cases, I have simply explained this to the customer and shown them calibration records demonstrating that the load cell performs to the specification being used. Good luck and let us know how you make out. :bigwave:
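For what an attribute study's basic output looks like, here is a minimal sketch of within-appraiser agreement on pass/fail calls (the data and function name are hypothetical; a full attribute agreement analysis would also compute between-appraiser agreement and kappa statistics):

```python
def within_appraiser_agreement(trials_by_appraiser):
    """Fraction of parts on which an appraiser's repeated pass/fail
    calls all agree with each other (a basic attribute-study metric)."""
    results = {}
    for appraiser, trials in trials_by_appraiser.items():
        # trials: one list of per-part calls for each repeat trial
        parts = zip(*trials)  # group the calls part by part
        agree = sum(1 for calls in parts if len(set(calls)) == 1)
        results[appraiser] = agree / len(trials[0])
    return results

# Hypothetical data: 2 appraisers x 2 trials x 5 parts (P = pass, F = fail)
data = {
    "A": [["P", "P", "F", "P", "F"], ["P", "P", "F", "F", "F"]],
    "B": [["P", "P", "F", "P", "F"], ["P", "P", "F", "P", "F"]],
}
scores = within_appraiser_agreement(data)
```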


Stop X-bar/R Madness!!
Test pressure is 1.38 bar +/- 60 mbar

I have been working on trying to get a good Gage R&R for a 5 cc/min limit at 3 bar, and it is tough. I get about 0.12 cc/min variation on just the zero. The total variation is a combination of normal and non-normal distributions. The non-normal distributions include the overall leak distribution, because the closer you get to zero, the less normal it is; conversely, the further you get from zero, the more normal it is. Similar to a Weibull curve. The individual measurement error behaves the same way: you can get a good seat, giving the best leak value, or any number of worse seats, with fewer occurring the further away you get. Again, very similar to a Weibull distribution. Then you get your zero error, which may be normal. Because these errors are significant at low leak rates, it is not easy to get a good Gage R&R. At our 50 cc/min requirement, it was like hitting the broad side of a barn, with 3% GRR to tolerance.
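The tolerance effect alone explains much of the difficulty: with the same measurement system, shrinking the limit by a factor of ten inflates %GRR-to-tolerance by the same factor. A sketch with an illustrative gage standard deviation (the 0.25 cc/min figure is assumed, not from the thread):

```python
def pct_grr_to_tolerance(grr_sd, tolerance, k=6.0):
    """%GRR to tolerance, using the common 6-sigma spread convention."""
    return 100.0 * k * grr_sd / tolerance

# Same measurement system (illustrative GRR standard deviation):
wide  = pct_grr_to_tolerance(0.25, 50.0)  # roomy 50 cc/min limit
tight = pct_grr_to_tolerance(0.25, 5.0)   # tight 5 cc/min limit
# wide = 3.0 %, tight = 30.0 % -- the same gage looks 10x worse
```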

Of course, customers do not realize this...



I will try again and use Minitab to analyse the results. I agree, however, that as a pass/fail test
with calibration to +/- 0.05, the equipment is biased towards failing good parts, NOT passing bad parts; therefore an attribute R&R is more appropriate.
Convincing our friends from OEM STA is a harder nut to crack.