I'd like to know if anyone could tell me if my interpretation is correct.
Usually I work with calibration processes, but I'm pursuing new fields of knowledge, so I just did my first MSA study.
I'm still trying to understand what the results mean.
The study is about our paint fineness-of-grind evaluation method (it's measured with a grindometer).
I got these numbers based on the tolerance (the idea of the study is to check whether my QC lab is capable of identifying out-of-spec batches):
Ndc = 5
Both the R and Xbar charts are in control.
Does the %R&R = 42% mean that 42% of 1/6 of the specification is due to the variation of the measurement system alone?
So if my production were somehow able to make identical batches every time, I'd still find results spread over 42/6 = 7% of my specification? Is that it?
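To check my own reading of that number, this is how I understand the tolerance-based calculation (the values below are just placeholders to show the arithmetic, not my actual study data):

```python
# Tolerance-based %R&R as I understand it from the AIAG MSA manual:
#   %R&R (to tolerance) = 100 * (6 * sigma_R&R) / tolerance
# i.e. the 6-sigma spread of the measurement system alone, expressed as a
# percentage of the full tolerance width (USL - LSL).
# (Older templates use 5.15 instead of 6.)

sigma_rr = 0.7      # placeholder: std. dev. of the measurement system
tolerance = 10.0    # placeholder: tolerance width (USL - LSL)

pct_rr = 100 * (6 * sigma_rr) / tolerance
print(f"%R&R to tolerance: {pct_rr:.0f}%")
# If the template uses this formula, the 42% would already be expressed
# relative to the full tolerance width rather than to one sixth of it.
```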
A higher %EV (equipment variation) could mean that there is an equipment problem. The usual grindometer could be replaced by another model with better resolution, which might reduce the spread of readings taken by the same operator on the same sample.
Any help would be appreciated and sorry if I wasn't clear enough.
I've attached the (very crude and poorly translated) original data in case it's needed.
The first thing I would recommend is that you validate this R&R template: I checked these results against a validated template and obtained somewhat different values. That said, the first issue you should address is measurement resolution. A resolution of 0.5 is inadequate. You can assess this from the range chart: there need to be a minimum of 5 distinct possible measurement values between zero and the UCL of the range chart, and this gage has only 3. A resolution of 0.1 should be adequate.
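A rough sketch of that resolution check, in case it helps (the R-bar and D4 values below are placeholders, since I don't know your actual average range or number of trials):

```python
# Rough check of gage resolution against the range chart.
# Placeholders: substitute your own R-bar and the D4 constant for your
# number of trials (D4 = 3.267 for 2 trials, 2.574 for 3 trials).

r_bar = 0.5          # placeholder: average range from the R chart
d4 = 2.574           # control-chart constant for subgroups of 3
resolution = 0.5     # smallest increment the grindometer can report

ucl_r = d4 * r_bar   # upper control limit of the range chart

# Distinct values the gage can report between zero and UCL_R,
# counting zero itself (0, 0.5, 1.0, ...).
possible_values = int(ucl_r // resolution) + 1

print(f"UCL_R = {ucl_r:.2f}")
print(f"Possible distinct range values below UCL_R: {possible_values}")
print("resolution OK" if possible_values >= 5 else "resolution too coarse")
```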