Accuracy of calibration relationship to tolerance


fun4rudy

Who can explain the relationship between accuracy of calibration and tolerance of the indicator, given this acceptance standard:

Accuracy of calibration will be plus or minus one unit of graduation on the indicator. On indicators graduated to .0001 inch or less the acceptable tolerance is one graduation. On indicators where graduations are .0005 inch or more the tolerance is .0002 inch.

For a .0001 indicator, it says the tolerance is half the accuracy.
For a .0005 indicator, it says the tolerance is about half the accuracy.

I'm having trouble comprehending how the tolerance can be less than the accuracy. Would anyone care to say what the accuracy and tolerance would be for a .001 indicator?
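
To check whether I'm reading the standard right, here is a rough sketch of the rule exactly as quoted (how to treat graduations that fall strictly between .0001 and .0005 is my own guess, since the wording doesn't say):

```python
# Sketch of the acceptance standard exactly as quoted above.
# Accuracy of calibration: plus or minus one graduation.
# Tolerance: one graduation when graduations are .0001" or less,
#            .0002" when graduations are .0005" or more.
# Graduations strictly between .0001" and .0005" aren't covered by the
# quoted wording, so they are left as "not covered" here.

def accuracy(graduation):
    """Calibration accuracy as a +/- band: one graduation."""
    return graduation

def tolerance(graduation):
    """Acceptable tolerance per the quoted standard, or None if not covered."""
    if graduation <= 0.0001:
        return graduation   # one graduation
    if graduation >= 0.0005:
        return 0.0002
    return None             # the quoted wording doesn't say

for grad in (0.0001, 0.0005, 0.001):
    print(f'{grad:.4f}" indicator: accuracy +/- {accuracy(grad):.4f}", '
          f'tolerance {tolerance(grad):.4f}"')
```

If that reading is right, a .001 indicator falls under the ".0005 or more" clause, which is exactly the part I'm having trouble with.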
 

Marc

Fully vaccinated are you?
Leader
I've moved this thread to the Calibration forum for now - it will probably get a better response here.
 

Hershal

Metrologist-Auditor
Trusted Information Resource
The accuracy of an instrument is a qualitative indication of its ability to give responses close to the true value of the parameter being measured (VIM, 5.8).

A tolerance represents the maximum allowable deviation, and is a property of the item being measured.

Put another way, whatever you are measuring will likely have some value, plus or minus some allowable variation. That allowable variation is the tolerance. The instrument you use to measure it will have an accuracy, which describes how repeatably close it can come to the true value.

Now, the (adjustable crescent, for Claes) wrench in the works is the measurement uncertainty (MU). Thanks to MU, the most you can say is that the true value lies within the indicated value plus or minus the MU. However, you will never know whether you actually hit the true value. Fortunately the MU - generally - is much less than the tolerance, but that is where an uncertainty study comes in. The study will tell you just what your uncertainty is.
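
To put some purely made-up numbers on that (nothing here comes from a real study):

```python
# Illustrative numbers only: an indicated reading, an expanded MU, and a
# tolerance band on the item being measured.
indicated = 1.00012   # inches, what the instrument reads
mu        = 0.00005   # inches, expanded measurement uncertainty (made up)
nominal   = 1.00000   # inches, nominal value of the item
tol       = 0.00050   # inches, allowable +/- deviation (the tolerance)

# The true value lies somewhere in [indicated - mu, indicated + mu];
# you never know exactly where.
true_low, true_high = indicated - mu, indicated + mu

# Because MU is much smaller than the tolerance, the whole uncertainty
# interval can still sit comfortably inside the tolerance band.
tol_low, tol_high = nominal - tol, nominal + tol
within = tol_low <= true_low and true_high <= tol_high

print(f"true value lies in [{true_low:.5f}, {true_high:.5f}] in")
print(f"tolerance band is  [{tol_low:.5f}, {tol_high:.5f}] in")
print("uncertainty interval inside tolerance band:", within)
```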

Hope this helps.

Hershal
 