Estimating between a gage's smallest increments - What is the true story?

apestate

Quite Involved in Discussions
#1
Hello forum

I know this subject has been discussed many times in various threads, some dedicated to the subject. If I'm dragging the issue through old mud, then may this thread languish.

To be clear, the question is about estimating a measurement when the indicated reading falls between the smallest increments of a tool's scale.

The best example is a 0-1" micrometer that does not provide a .0001" scale. An experienced machinist using a micrometer that DOES provide the .0001" scale can estimate the ten-thousandths reading within +/- .0001" without referring to the vernier.

But by rule of thumb, do not estimate.

Are we to consider 0-1" micrometers without ten-thousandths reading verniers basically useless?

I am having a hard time deciding whether or not to estimate in certain cases when calibrating instruments.

How would you handle this? An Interapid indicator with .0005" resolution carries a published accuracy of .0012". In a calibration, the error comes out somewhere between .0011" and .0013", but the dial can't discriminate which. Should you:
  • Define acceptance criteria as .0015"
  • Estimate
  • Include the .0012" in MU calculations, and then round the uncertainty estimate up to a value that can be discriminated by the indicator
I don't know how to define the acceptance criteria. First, published accuracies are for new instruments and they can be expected to wear. Second, this company is not interested in measurement uncertainty. Third, I want to control instrument accuracy as much as possible.

The old federal spec for micrometers is +/- .00015" for the 1-2" size, for example. The Interapid indicator with .0005" resolution is rated at .0012". A Mitutoyo 0-1" micrometer with .001" resolution has a stated accuracy of .0001". How do these numbers come about if we are not estimating?
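For what it's worth, the third option above can be sketched in a few lines: round the computed limit up to the next value the dial can actually resolve. This is purely an illustration (the function name is mine, not from any standard):

```python
import math

def round_up_to_resolution(value, resolution):
    """Round a computed limit UP to the next multiple of the instrument's
    resolution, so the acceptance limit is a value the dial can display."""
    # subtract a tiny epsilon so exact multiples are not bumped up a step
    return math.ceil(value / resolution - 1e-9) * resolution

# Interapid example: .0012" published accuracy on a .0005"-resolution dial
limit = round_up_to_resolution(0.0012, 0.0005)  # -> 0.0015
```

Rounding up (never down) means the acceptance criterion is always at least as wide as the published figure, so you never fail an instrument on a digit the dial cannot show.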

--Conflicted in Calibration
 

Al Rosen

Staff member
Super Moderator
#2
Re: What is the true story on estimating between a gage's smallest increments

apestate said:
A Mitutoyo 0-1" micrometer with .001" resolution has an accuracy of .0001". How do these numbers come about if we are not estimating?
Accuracy and resolution are independent.
 

Wesley Richardson

Wes R
Trusted Information Resource
#3
Re: What is the true story on estimating between a gage's smallest increments

apestate said:
To be clear, the question is about estimating a measurement when the indicated reading falls between the smallest increments of a tool's scale. A Mitutoyo 0-1" micrometer with .001" resolution has an accuracy of .0001". How do these numbers come about if we are not estimating?
Hi Apestate,

There are two schools of thought about interpolating when using a measuring instrument. My position is that you do not interpolate. The resolution is the smallest increment that can be read. For a 0 to 1 inch micrometer with increments of 0.001", you can only use it to read to the nearest 0.001". There are still many circumstances where this type of micrometer has application.

Note that on some digital micrometers the display may show 0.00000", but the next displayed increment is 0.00005" rather than 0.00001". For such an instrument the resolution is 0.00005".
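Wes's point about display increments can be shown concretely: the recorded value should land on the grid the instrument actually reports. A minimal sketch (my own helper, purely illustrative):

```python
def snap_to_resolution(reading, resolution):
    """Round a reading to the nearest multiple of the instrument's
    resolution -- the finest step the display can actually show."""
    steps = round(reading / resolution)
    return round(steps * resolution, 10)  # trim float noise

# a digital micrometer stepping in 0.00005" increments
snap_to_resolution(0.463137, 0.00005)  # -> 0.46315
```

Any digits finer than the increment grid never survive this step, which is exactly the "do not interpolate" position.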

For the question about the Mitutoyo 0-1" micrometer with .001" resolution and an accuracy of .0001": I suspect Mitutoyo is stating that if the micrometer is set at a given reading, say 0.463", the actual dimension of the gap will be 0.463" +/- 0.0001". Note this is not the same as using the micrometer to measure a part and determining the part dimension within +/- 0.001".

Today, measurement uncertainty has replaced many of the terms regarding accuracy, precision, and resolution. Measurement uncertainty is basically the total variance of the measurement system, for a given device. This includes factors such as operator, environment, set-up, reference standards, resolution, and so on.
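The "total variance" idea Wes describes is usually computed GUM-style: independent standard-uncertainty components add in quadrature. A hedged sketch with invented component values (a resolution increment of r, treated as a uniform distribution, contributes about (r/2)/sqrt(3)):

```python
import math

def combined_standard_uncertainty(components):
    """Root-sum-of-squares of independent standard uncertainties:
    variances add, so the combined u is the sqrt of the summed squares."""
    return math.sqrt(sum(u * u for u in components))

# illustrative budget (inches) -- values invented for the example
u_reference = 0.00005                  # calibration of the reference standard
u_repeat = 0.0001                      # repeatability
u_resolution = 0.00025 / math.sqrt(3)  # half of a .0005" increment, uniform
u_c = combined_standard_uncertainty([u_reference, u_repeat, u_resolution])
U = 2 * u_c  # expanded uncertainty at coverage factor k = 2
```

Note how the resolution term enters the budget as just one component among several, which is why accuracy and resolution are independent numbers.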

I am sure it will be many years before measurement uncertainty is widely understood, and stated in advertising literature.

The other related question is about the meaning of a specification such as 1.000" +/- 0.005". Some people interpret this as 1.0000000..." +/- 0.00500000...", with acceptance based on whatever your instrument indicates; for example, a reading of 1.0050000001" is out of specification. I do not agree with this position. It is my opinion that you measure with a device to the nearest 0.001" and accept the part if it reads 0.995" to 1.005", otherwise reject. If the design engineer required 1.000000" +/- 0.00500", then the specification should have been written that way. That is a much more expensive measurement to make.
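That interpretation can be stated as a tiny decision rule: round the reading to the instrument's resolution, then compare against the printed limits. A sketch only, with my own names:

```python
def accept(reading, nominal, tol, resolution):
    """Round the reading to the instrument's resolution, then accept if it
    falls within nominal +/- tol (limits taken at face value)."""
    snapped = round(round(reading / resolution) * resolution, 10)
    return nominal - tol <= snapped <= nominal + tol

# 1.000" +/- 0.005" checked with a 0.001"-resolution instrument
accept(1.0048, 1.000, 0.005, 0.001)  # reads 1.005 -> True
accept(1.0056, 1.000, 0.005, 0.001)  # reads 1.006 -> False
```

Under this rule a part that snaps to exactly 1.005" is accepted; the stricter literal-limit interpretation would call the same part borderline.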

Wes R.
 

Hershal

Metrologist-Auditor
Staff member
Super Moderator
#4
Re: What is the true story on estimating between a gage's smallest increments

Wes is essentially correct. To be specific, if your caliper has a resolution of 0.001", then anything finer is a pure guess and must be reflected as such in your uncertainty calculations.

Hope this helps.

Hershal
 

apestate

Quite Involved in Discussions
#5
Thanks guys, especially Wesley

I think the situation Wesley describes answers the question of how manufacturers can state accuracy to a precision finer than the indicator's resolution.

I really wish we could use measurement uncertainty at this plant. With the systems now in place, it would be utterly impossible. It is my understanding that measurement uncertainty handles the case where a part's measurement is very near a size limit, describing that part's acceptance as unknown.

OK, we're making tractor parts in short runs and the customers are well pleased. In the light of that, it's hard to make the case for establishing measurement uncertainty.

Still, if all the specifications of a product were contained in a software database and measurements were reported electronically, it would allow us to shorten tolerance windows based on the instrument in use, its specific accuracy, etc.
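That tolerance-shortening idea has a name: guard banding. You shrink the acceptance window by the expanded uncertainty of the instrument actually in use, so a "pass" implies conformance even in the worst case. A hypothetical sketch:

```python
def guard_banded_limits(nominal, tol, expanded_uncertainty):
    """Return acceptance limits shrunk by the expanded uncertainty U, so a
    reading inside them guarantees the true value is within tolerance."""
    margin = tol - expanded_uncertainty
    if margin <= 0:
        raise ValueError("uncertainty consumes the entire tolerance")
    return nominal - margin, nominal + margin

# 1.000" +/- 0.005" checked with an instrument carrying U = 0.0012"
lo, hi = guard_banded_limits(1.000, 0.005, 0.0012)
# window narrows from [0.995, 1.005] to roughly [0.9962, 1.0038]
```

The error raised when margin goes non-positive is the software version of "this gage is useless for this tolerance": the uncertainty has eaten the whole window.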

:(

But, thank you for your replies. I think the proper course in this situation is to derate the working accuracy of a tool where there is a special case, and otherwise maintain accuracy requirements based on the manufacturer's specs.
 