I was asked to perform a gage R&R study for an X-ray instrument measuring plating thickness. The total %R&R was very high despite every intervention I tried. I think this is due to the variation I get in the second decimal place (hundredths of a micron), which can't really be controlled. How do I get a valid gage R&R? The spec limits are given only in whole microns. Can I, for instance, just drop the decimal portion of the readings and run the gage study on the rounded values?
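To make the arithmetic behind the question concrete, here is a minimal Python sketch using made-up readings and an assumed 1 µm tolerance (neither is from the original post). It computes a simple repeatability-based %R&R against the tolerance and shows what rounding the readings to whole microns does to the data:

```python
import numpy as np

# Hypothetical repeated readings (microns) of one part; a real gage R&R
# uses multiple parts and operators, but this illustrates the point.
readings = np.array([5.03, 5.07, 4.98, 5.05, 5.01, 4.96, 5.04, 5.02])

sigma_gage = readings.std(ddof=1)   # repeatability estimate from the readings
tolerance = 1.0                     # assumed: a 1-micron-wide spec window

# A common convention: study variation = 6 * sigma (99.73% spread)
pct_rr = 100 * (6 * sigma_gage) / tolerance
print(f"%R&R vs tolerance: {pct_rr:.1f}%")

# Rounding to whole microns doesn't remove the gage variation; here it
# collapses every reading to the same value, which would make the gage
# look artificially perfect while adding quantization error on the order
# of resolution / sqrt(12).
rounded = readings.round(0)
print(rounded, rounded.std(ddof=1))
```

In this toy example the rounded readings all become 5.0, so the "rounded" study would report zero repeatability, which is exactly why dropping the decimals would invalidate the study rather than fix it.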
