Why do micrometer manufacturers use these specific calibration points?

Hello everyone,

I am reviewing a Certificate of Inspection for a micrometer, and I noticed that the manufacturer always checks the instrument at specific measuring points, for example:
  • 0,7 mm
  • 7,70 mm
  • 12,90 mm
  • 17,80 mm
  • 25,00 mm
Other manufacturers seem to follow similar fixed measurement points during calibration or inspection.

My question is:
Why are these exact lengths selected for the calibration/inspection?
Is there a technical or standards-based reason behind choosing these particular values instead of evenly spaced points (e.g., every 5 mm)?

I would appreciate any explanation related to standards, metrology practice, error distribution, or instrument design.

Thank you!
 
I don't know about metric micrometers as I have never actually used one. However, on an imperial micrometer, I tend to use:
.000
.0625
.250
.500
1.000

The .0625 point is used so the reading falls on the opposite side of "0" on the thimble (i.e., between 12 and 13). Again, I don't know if this answers your question, but it might help.
 
These points are distributed at approximately 0%, 25%, 50%, 75% and 100% of the measuring range. The slight difference between those points and the exact percentages might be due to the particular gauge-block standards the manufacturer has available.
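As a quick arithmetic check (assuming the 0–25 mm range implied by the certificate), the points work out to these fractions of the range:

```python
# Check points from the certificate, expressed as a fraction of the 0-25 mm range
points_mm = [0.70, 7.70, 12.90, 17.80, 25.00]
range_mm = 25.0

for p in points_mm:
    pct = 100.0 * p / range_mm
    print(f"{p:5.2f} mm -> {pct:5.1f}% of range")
```

The values come out at 2.8%, 30.8%, 51.6%, 71.2% and 100%, so "approximately" is doing some work here: the points sit near, but not exactly at, the nominal quarter marks of the range.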
 
Those points are chosen so that the spindle is rotated to present a different face position for every measurement.
 
The points specified typically rotate the spindle to four or more distinct positions, so that a "drunken thread" (periodic screw error) or wear on the measuring faces is more easily detected.
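That spindle-rotation effect can be checked arithmetically. A minimal sketch, assuming the standard 0,5 mm spindle pitch of a metric micrometer and the check points from the certificate above — the fractional part of point/pitch tells you where the thimble stops:

```python
PITCH_MM = 0.5  # standard spindle pitch on a metric micrometer (assumed here)
points_mm = [0.70, 7.70, 12.90, 17.80, 25.00]

for p in points_mm:
    frac_turn = (p / PITCH_MM) % 1.0       # fraction of one full spindle turn
    print(f"{p:5.2f} mm -> thimble at {round(frac_turn * 360)} deg")
```

For this particular set the thimble lands at 144, 144, 288, 216 and 0 degrees, so most (though not all) of the points present a different rotational position of the spindle face.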
 