Question on very low NDC number with tolerable GRR ratios

We have a plastic thin film thickness gauge with a precision and accuracy of +/-1.5 microns, and our thickness specification is 42 +/-6 microns. The GRR data shows a very low NDC (number of distinct categories) while the other ratios seem adequate. My guess is that the GRR is compromised because the gauge precision is not good enough?

The NDC is so low that most would say the thin film gauge is useless for this +/-6 micron (12 micron total) tolerance, which would imply the GRR is too. Your review?

Actually, the person who ran this GRR entered the wrong total tolerance, making the whole GRR bogus. Not me, I swear! I will get him to redo it.
The correct TV is 12.
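For reference, the NDC that a study reports comes from the AIAG MSA formula NDC = 1.41 × (PV / GRR), where PV and GRR are the part-variation and gage R&R sigmas. A minimal sketch (the sigma values below are illustrative, not the poster's actual study data):

```python
def ndc(pv_sigma: float, grr_sigma: float) -> int:
    """Number of distinct categories per the AIAG MSA manual:
    NDC = 1.41 * (part variation sigma / gage R&R sigma), truncated."""
    return int(1.41 * pv_sigma / grr_sigma)

# Hypothetical numbers: a gage sigma of 0.5 microns and a very tight
# part spread. When PV barely exceeds GRR, NDC collapses.
print(ndc(pv_sigma=0.75, grr_sigma=0.5))  # prints 2
```

Note that a low NDC can come from a small numerator (little part-to-part variation in the sample) just as easily as from a large denominator (a noisy gage).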

Bev D

Heretical Statistician
The answer is fairly simple - you don't have enough variation in the parts you measured. This may be because your process is very tight, or because you picked parts that are very close to each other (sequentially produced, perhaps; this type of process often has larger batch-to-batch variation than piece-to-piece variation).

Now here is the tough love: the calculation of mathematical formulas is no substitute for thinking. A valuable Gauge R&R cannot be approached as a plug-and-play exercise. Unless you are only doing this to satisfy a customer requirement, you should understand the process that creates the characteristic you are measuring, how that characteristic varies over time, how the measurement system works, and how a gauge R&R works. It is not sufficient to get some data and enter it into a spreadsheet. There is no value in that.

I have attached a document that begins to explain how gauge R&R studies work and how to perform and understand a real R&R study. I have also attached a spreadsheet that might help you visualize and understand your specific study. The article contains several references for further study. The one thing to always remember is to PLOT YOUR DATA. Statistical summaries and equations are not informative or intuitive, but graphs are when they are properly done. Try it.

I suggest that plotting your data be your professional new year's resolution - it's one you'll keep.



Stop X-bar/R Madness!!
I have not seen the key point of the gage's usage identified. NDC applies ONLY if you are using the gage for process control, and the Gage R&R should then be calculated using process variation. It tells you whether you have adequate resolution to see how the process behaves within its expected variation - NOT within its tolerance! If the sample is not adequate for describing the true process variation, historical process variation can be used instead; this is the preferred method in Minitab. NDC does NOT apply to gages used for process release, since its calculation is based on process variation, not tolerance.
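As a sketch of that point (all sigmas are hypothetical, and the formula is the standard AIAG one): computing NDC with historical process variation instead of a tight sample's PV can change the verdict entirely.

```python
# NDC from sample PV vs. historical PV (hypothetical sigmas, in microns)
grr_sigma = 0.5            # combined repeatability + reproducibility sigma (assumed)
sample_pv_sigma = 0.5      # PV from a tight, sequentially pulled sample (assumed)
historical_pv_sigma = 1.5  # PV estimated from long-term historical data (assumed)

ndc_sample = int(1.41 * sample_pv_sigma / grr_sigma)          # looks useless
ndc_historical = int(1.41 * historical_pv_sigma / grr_sigma)  # usable for control
print(ndc_sample, ndc_historical)  # prints 1 4
```

Same gage, same GRR sigma; only the estimate of process variation changed.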


Also, look at your PV. A correct Gage R&R has a PV that represents the variation you expect to see over the life of the process. The PV in your data says you expect to see a 3 micron variation. I really, really doubt that. If you DO, then a gage with enough statistically accurate resolution to divide that by at least 5 (to get a "passing" NDC) would be a very special gage. If your process were capable, you would likely have a process spread closer to 75% of the tolerance, or 0.75 * 12 microns = 9 microns. Still very tight, but more reasonable than 3 microns. That is why the recommended PV calculation uses the historical standard deviation (or its estimate based on a capable process, since that should be your goal), NOT your samples. Getting samples that accurately portray long-term process variation is extremely difficult. Beyond Bev's comments, this is also a sanity check on what you are doing with the Gage R&R tool.
