In Reply to Parent Post by jmarcus
Our product is solvent-based, has a tolerance of +/- 30 mPa, and it seems to change over the 20 minutes it takes to measure it; not much, 2-3 units, but it still changes.
I plug it into our GR&R spreadsheet and I get numbers in the 40-50% range.
I do not understand how the statistical data is working out. I would think a 5% change could still be controlled, with us knowing that it will change slightly over time.
Plus, all of the readings were well within tolerance.
Is there a trick in running GR&R on a product that isn't 100% stable?
Thank you in advance.
When you state that the readings change over the 20 minutes, does this mean that one measurement takes 20 minutes, or that it takes 20 minutes to perform all of the R&R measurements?
Does the product eventually stabilize after 20 minutes, or does it continue to change?
You would definitely need to use the method for a destructive gage study. If the product eventually stabilizes, perform the measurements after it has stabilized. If it does not stabilize, you may need to get inventive: perform a regression study of the reading versus time, then apply a correction factor to your measurements based on the elapsed time.
Please review earlier posts in this forum for a discussion on collecting samples for an MSA based on whether the gage is used as an inspection gage or as a process-control gage. David made a good point about the effect that the variation of your test samples has on your results. Make sure that you are using the correct metric (P/T ratio or %GRR), and if %GRR is used, that your samples represent the full process variation.
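The distinction between the two metrics is easy to show numerically. The sketch below uses the standard definitions (P/T compares gage spread to the tolerance width; %GRR compares it to total observed variation); the sigma values and the k = 6 multiplier are my own illustrative choices, not numbers from your study.

```python
import math

def pt_ratio(sigma_gage, tol_width, k=6.0):
    # Precision-to-tolerance ratio, in percent.
    # k = 6.0 is common current practice; older references use 5.15.
    return 100.0 * k * sigma_gage / tol_width

def pct_grr(sigma_gage, sigma_part):
    # %GRR: gage spread as a percentage of total observed spread,
    # where total variance = gage variance + part-to-part variance.
    sigma_total = math.sqrt(sigma_gage**2 + sigma_part**2)
    return 100.0 * sigma_gage / sigma_total

# Illustrative: a gage sigma of 2 units against a +/-30 tolerance
# (width 60), evaluated with two different sample sets.
print(round(pt_ratio(2.0, 60.0), 1))    # fine against the tolerance
print(round(pct_grr(2.0, 2.5), 1))      # looks awful: samples barely vary
print(round(pct_grr(2.0, 10.0), 1))     # same gage, representative samples
```

This is exactly David's point: the same gage can show a reasonable P/T ratio yet a 40-50%+ %GRR if the parts in the study do not span the real process variation.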
Please note that a gage stability study is intended to assess the stability of the measurement system. Its basic assumption is that the standard itself is stable and unchanging, and your situation seems to violate that assumption. A similar study would be worthwhile to assess the stability of the product over time, but that serves a different purpose than a gage stability study.