I have a question regarding the "Sigma Multiplier" in Gage R&R %Tolerance. I've read many of the past threads but don't see an answer to my question. Hope someone can help.
It seems that historically 5.15 has been used, but now many references say to use 6. My Master Black Belt says to use 5.15, while JMP, AIAG and Minitab now have 6 as the default. I understand the difference: 5.15 corresponds to 99% of the area under the normal curve and 6 corresponds to 99.73%. I also understand that the difference between the two is small and that either value will give a correct indication of the variation. What I'd like is a good explanation of why the value changed.
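For what it's worth, I did check the coverage figures myself. Here's a quick sketch using only the Python standard library (a multiplier k spans ±k/2 standard deviations, so coverage under the normal curve is 2Φ(k/2) − 1):

```python
import math

def normal_coverage(k):
    """Fraction of a normal distribution falling within +/- k/2 standard
    deviations of the mean, i.e. the coverage of a k-sigma spread."""
    # Standard normal CDF via the error function: Phi(z) = (1 + erf(z/sqrt(2))) / 2
    phi = 0.5 * (1.0 + math.erf((k / 2) / math.sqrt(2)))
    return 2 * phi - 1

print(f"k = 5.15 -> {normal_coverage(5.15):.4%} coverage")  # ~99.00%
print(f"k = 6.00 -> {normal_coverage(6.00):.4%} coverage")  # ~99.73%
```

So the numbers themselves check out; my question is about the rationale for the switch, not the arithmetic.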
I'm establishing a Gage R&R standard to be used throughout our entire company (globally) and I've already had questions about this. I'd like to provide a good reason for which one we'll use as standard.
Any thoughts?