Hello everyone,
I am reviewing a Certificate of Inspection for a micrometer, and I noticed that the manufacturer always checks the instrument at specific measuring points, for example:
- 0,7 mm
- 7,70 mm
- 12,90 mm
- 17,80 mm
- 25,00 mm
My question is:
Why are these exact lengths selected for the calibration/inspection?
Is there a technical or standards-based reason behind choosing these particular values instead of evenly spaced points (e.g., every 5 mm)?
I would appreciate any explanation related to standards, metrology practice, error distribution, or instrument design.
Thank you!