Jrodrigu
I'm implementing Gage R&R and Measurement Process Evaluation at my facility and have a few questions.
I understand the rules of thumb regarding %R&R percentages, i.e. <10% is good, 10-30% is acceptable but may need improvement, and >30% means the measurement system is inadequate. How does this relate to the newer philosophy of the Gage R&R process-control view, where the %R&R is judged against process variation rather than the tolerance? What does the process-variation %R&R truly mean? What criteria or rules of thumb should be followed for the process %R&R? How does this correlate with traditional Gage R&R against the tolerance?
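For reference, here is a minimal sketch of how I understand the two percentages are computed, assuming the usual AIAG-style definitions (a 6-sigma gage spread against the tolerance; older editions used 5.15 sigma). The function names are my own, just for illustration:

```python
def pct_rr_to_tolerance(sigma_grr, usl, lsl, k=6.0):
    """%R&R against the tolerance: the gage spread (k * sigma_GRR,
    k = 6 in current AIAG practice, 5.15 in older editions) as a
    percentage of the specification width."""
    return 100.0 * (k * sigma_grr) / (usl - lsl)


def pct_rr_to_process(sigma_grr, sigma_pv):
    """%R&R against total study variation: sigma_GRR as a percentage
    of sqrt(sigma_GRR^2 + sigma_PV^2), where sigma_PV is the
    part-to-part (process) standard deviation."""
    sigma_tv = (sigma_grr**2 + sigma_pv**2) ** 0.5
    return 100.0 * sigma_grr / sigma_tv
```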
I believe the rules of thumb hold true for both. But how would you explain a %Tolerance R&R of 13.27%, which is acceptable, alongside a %Process Variation R&R of 97.60%, which is inadequate? Would improving the process variation lower the process %R&R, or is the variation inherent in the measurement method, with no real way to improve it?
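Plugging hypothetical numbers into those same definitions shows how the two views can diverge when the part-to-part spread is small relative to the tolerance (the values below are made up purely to illustrate, not taken from my study):

```python
# Hypothetical study values, for illustration only.
sigma_grr = 0.0221            # gage (R&R) standard deviation
sigma_pv  = 0.0049            # part-to-part (process) standard deviation
tolerance = 1.00              # USL - LSL

# Total study variation combines the gage and part-to-part components.
sigma_tv = (sigma_grr**2 + sigma_pv**2) ** 0.5

print(f"%R&R to tolerance:         {100 * 6 * sigma_grr / tolerance:.1f}%")  # ~13.3%
print(f"%R&R to process variation: {100 * sigma_grr / sigma_tv:.1f}%")       # ~97.6%
```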
Any information is helpful, including websites or literature.
