Gage R&R - Tolerance vs. Total Variation - Process Variation vs. Gage Variation

Jrodrigu

I'm implementing Gage R&R and Measurement Process Evaluation at my facility and have a few questions.

I understand the rules of thumb for %R&R against the tolerance, i.e. <10% is good, 10-30% is acceptable but may need improvement, and >30% means the measurement system is inadequate. How does this relate to the newer process-control view of Gage R&R, where the %R&R is computed against the process (total) variation rather than the tolerance? What does the process-variation %R&R actually mean? What criteria or rules of thumb should be followed for it, and how does it correlate with the traditional Gage R&R based on the tolerance?

I believe the rules of thumb hold true for both. But how would you explain a %Tolerance R&R of 13.27%, which is acceptable, alongside a process-variation %R&R of 97.60%, which is inadequate? Would improving (reducing) the process variation lower the process %R&R, or is the variation inherent in the measurement method, so that it cannot be improved?
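For reference, here is how I understand the two percentages are computed from the same gage standard deviation. This is a minimal Python sketch with made-up sigma and tolerance values (not my actual study data), assuming the common 6-sigma study convention:

```python
import math

# Hypothetical numbers chosen only to illustrate the effect; not real study data.
sigma_grr = 0.011   # gage repeatability & reproducibility (std dev)
sigma_pv  = 0.0025  # part-to-part (process) std dev
tolerance = 0.50    # USL - LSL

# Total variation combines gage and part variation in quadrature.
sigma_tv = math.sqrt(sigma_grr**2 + sigma_pv**2)

# %R&R against the tolerance: 6-sigma gage spread over the tolerance width
# (older references use 5.15 instead of 6).
pct_tolerance = 100 * (6 * sigma_grr) / tolerance

# %R&R against total (process) variation: gage std dev over total std dev.
pct_total_var = 100 * sigma_grr / sigma_tv

print(f"%R&R vs tolerance:       {pct_tolerance:.1f}%")   # ~13% -> acceptable
print(f"%R&R vs total variation: {pct_total_var:.1f}%")   # ~98% -> inadequate
```

With numbers like these, the gage is small compared with the tolerance but large compared with a very tight process spread, which is exactly the 13% vs. 97% situation I am asking about.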

Any information is helpful including websites or literature.

 
wmchale

Gauge R&R

For those just starting out with GR&R studies, rather than spell it all out here, I would recommend the following reading material: "Concepts for R&R Studies" by Larry B. Barrentine. It's available from ASQC Quality Press; the ISBN is 0-87389-108-2.

In general, the %GR&R tells you how much of the observed variability comes from the measuring system itself. The part (process) variation percentage should be as close to 100% as possible, which means very little of the total variation is due to the actual measuring device.
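As a quick illustration (the numbers below are made up), the gage and part contributions are each expressed as a percentage of total variation and combine in quadrature, so a small %GR&R automatically means a part-variation percentage near 100%:

```python
import math

# Made-up standard deviations, for illustration only.
sigma_grr = 0.004   # gage (R&R) std dev
sigma_pv  = 0.020   # part-to-part std dev

sigma_tv = math.sqrt(sigma_grr**2 + sigma_pv**2)   # total variation

pct_grr = 100 * sigma_grr / sigma_tv   # share of total variation from the gage
pct_pv  = 100 * sigma_pv  / sigma_tv   # share of total variation from the parts

print(f"%GR&R: {pct_grr:.1f}%")   # ~19.6% -> marginal gage
print(f"%PV:   {pct_pv:.1f}%")    # ~98.1% -> most variation comes from the parts

# The two percentages combine in quadrature, not arithmetically:
print(f"sqrt(%GR&R^2 + %PV^2) = {math.sqrt(pct_grr**2 + pct_pv**2):.1f}")  # 100.0
```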
 
Jrodrigu

From the literature I'm reading and from other posts, the %R&R against process variation should follow the same criterion of less than 30%. The difference is in how you want to use the measurement. If you want the measurement system to distinguish good parts from bad, use the %Tolerance figure. If you want it to detect changes in the process, use the %Process variation figure.
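A related check is the number of distinct categories (ndc), which says directly whether the gage can resolve process changes. Here is a short sketch, assuming the AIAG convention ndc = 1.41 x (PV/GRR) and the same kind of made-up sigmas as in my first post:

```python
import math

# Made-up standard deviations, for illustration only.
sigma_grr = 0.011   # gage (R&R) std dev
sigma_pv  = 0.0025  # part-to-part std dev

pct_process = 100 * sigma_grr / math.sqrt(sigma_grr**2 + sigma_pv**2)

# Number of distinct categories: how many groups of parts the measurement
# system can reliably tell apart (AIAG uses ndc = 1.41 * PV / GRR).
ndc = math.floor(1.41 * sigma_pv / sigma_grr)

print(f"%R&R vs total variation: {pct_process:.1f}%")  # ~97.5% -> inadequate
print(f"ndc: {ndc}")  # 0 -> the gage cannot resolve process changes at all
```

The commonly cited guideline is an ndc of 5 or more if the gage is to be used to monitor the process, which lines up roughly with the <30% process-variation criterion.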
 