Definition of Accuracy and Resolution in a Measurement System: What are the differences?

bobdoering

Stop X-bar/R Madness!!
Trusted Information Resource
Wait a minute! Is the micrometer actually used in service to measure dimensions with the smaller tolerance?

Good point. If the "tolerance" specified on the NC is what they are trying to measure on the part, then the gage is useless. If the tolerance is the gage tolerance, then there really is no basis for complaint (it just looks like a standard gage spec, see this) - the NC needs to identify the relationship to the part tolerance for it to be an issue.
 

msbettyhunt

Starting to get Involved
Hi! The main difference I have found between accuracy and resolution in measurement is that accuracy can be defined as the amount of uncertainty in a measurement with respect to an absolute standard, while resolution refers to the range of the scale divided by the display readout. You could also say that it is the capacity divided by the readability. High readability delivers great resolution, but it doesn't always result in the best accuracy.
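A quick numeric sketch of that distinction (the instrument and readings below are hypothetical, just to show the arithmetic):

```python
# Hypothetical digital caliper: 0-150 mm capacity, display reads to 0.01 mm
capacity_mm = 150.0      # range of the scale
readability_mm = 0.01    # smallest display increment

# Resolution in the sense above: capacity divided by readability,
# i.e. how many distinct counts the display can show across the range
resolution_counts = capacity_mm / readability_mm
print(f"Resolution: {resolution_counts:.0f} counts ({readability_mm} mm per count)")

# Accuracy is a separate question: how far the readings sit from an absolute standard.
# Measuring a 25.000 mm gauge block (hypothetical readings):
readings_mm = [25.02, 25.01, 25.02]
bias_mm = sum(readings_mm) / len(readings_mm) - 25.000
print(f"Bias vs. standard: {bias_mm:+.3f} mm")  # fine resolution, but still biased
```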
 

bobdoering

Stop X-bar/R Madness!!
Trusted Information Resource
My favorite example of this is the 1/100" steel rule. Accuracy would be how well the rule is marked to the standard. Resolution is 1/100", but there ends up being a lot of error in readability trying to see or count the lines - especially without magnification (an opportunity to increase readability). The gage "system" includes the person who has to read it. I had a gage R&R for a hand gage that failed because one of the operators had taken Excedrin and was shaking enough to affect the repeatability!
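For anyone curious how a shaky operator shows up in the numbers, here is a bare-bones average-and-range repeatability sketch - the readings are made up and the K1 constant is the usual value assumed for two trials, not figures from the study I mentioned:

```python
# Two operators, each measuring the same three parts twice (hypothetical data, mm)
operator_a = {"part1": (10.01, 10.02), "part2": (10.05, 10.04), "part3": (9.98, 9.99)}
operator_b = {"part1": (10.00, 10.06), "part2": (10.03, 10.09), "part3": (9.97, 10.03)}

def avg_range(trials_by_part):
    """Average within-part range for one operator (trial-to-trial spread)."""
    ranges = [max(trials) - min(trials) for trials in trials_by_part.values()]
    return sum(ranges) / len(ranges)

# Average range across both operators
r_bar = (avg_range(operator_a) + avg_range(operator_b)) / 2

# Equipment variation (repeatability) in the average-and-range method:
# EV = R-bar * K1, with K1 ~ 0.8862 assumed here for two trials per part
K1 = 0.8862
ev = r_bar * K1
print(f"R-bar = {r_bar:.4f} mm, repeatability (EV) = {ev:.4f} mm")
# A shaky operator inflates the within-part ranges, which inflates EV directly.
```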
 