Hello forum
I know this subject has been discussed many times in various threads, some dedicated to the subject. If I'm dragging the issue through old mud, then may this thread languish.
To be clear, the question is about estimating a measurement when the indicated reading falls between the smallest graduations of a tool's scale.
The best example is a 0-1" micrometer that does not provide a .0001" scale. An experienced machinist using a micrometer that DOES provide the .0001" scale can estimate the ten-thousandths reading within +/- .0001" without referring to the vernier.
But by rule of thumb, do not estimate.
Are we to consider 0-1" micrometers without ten-thousandths reading verniers basically useless?
I am having a hard time deciding whether or not to estimate in certain cases when calibrating instruments.
How would you operate? An Interapid indicator with .0005" resolution carries a published accuracy of .0012". During calibration, the error result lies somewhere between .0011" and .0013", but the indicator can't discriminate which. Should you:
- Define acceptance criteria as .0015"
- Estimate
- Include the .0012" in MU calculations, and then round the uncertainty estimate up to a value that can be discriminated by the indicator
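The third option can be put in numbers. A minimal sketch, assuming the published .0012" accuracy and the .0005" resolution are both treated as rectangular (Type B) distributions and expanded with a k=2 coverage factor — the usual GUM-style treatment, not anything Interapid publishes:

```python
import math

resolution = 0.0005   # indicator graduation, inches
accuracy = 0.0012     # published accuracy spec, inches

# Type B contributions, rectangular distribution: u = half-width / sqrt(3)
u_res = (resolution / 2) / math.sqrt(3)  # resolution contribution
u_acc = accuracy / math.sqrt(3)          # accuracy-spec contribution

# Combined standard uncertainty (root-sum-square), expanded at k=2
u_c = math.sqrt(u_res**2 + u_acc**2)
U = 2 * u_c

# Round the expanded uncertainty up to the next graduation the
# indicator can actually discriminate
U_rounded = math.ceil(U / resolution) * resolution

print(f'U = {U:.6f}"  ->  reported as {U_rounded:.4f}"')
```

Under those assumptions U works out to roughly .0014", which rounds up to .0015" — the same figure as the acceptance-criteria option above, arrived at without estimating between graduations.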
The old federal spec for 1-2" micrometers is +/- .00015", for example. The Interapid indicator with .0005" resolution is rated at .0012". A Mitutoyo 0-1" micrometer with .001" resolution is rated at an accuracy of .0001". How were these numbers arrived at if not by estimating?
--Conflicted in Calibration