Confused: Calibration using master equipment - Resolution - Micrometer


nick1980

I want to calibrate a micrometer internally but I have some questions about resolution.

Resolution of:
Micrometer A: 0.001mm (it was calibrated externally, and the results showed only a 0.0002mm deviation from the nominal values)
Micrometer B: 0.001mm

If I use a master to calibrate other equipment, the master should be at least 4 times (ideally 10 times) more accurate than the unit under test.
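
As a quick sketch of that rule of thumb (this is just my framing: the unit-under-test tolerance divided by the master's accuracy, often called a test accuracy ratio, TAR):

```python
# Sketch of the accuracy-ratio rule of thumb; the function name and the
# framing of the ratio are my own, not from any standard.
def accuracy_ratio(uut_tolerance_mm: float, master_accuracy_mm: float) -> float:
    return uut_tolerance_mm / master_accuracy_mm

ratio = accuracy_ratio(uut_tolerance_mm=0.001, master_accuracy_mm=0.001)
print(f"ratio = {ratio:.0f}:1")    # 1:1 for two 0.001 mm instruments
print("meets 4:1 :", ratio >= 4)   # False
print("meets 10:1:", ratio >= 10)  # False
```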

In the above example, micrometer A was calibrated externally by more accurate equipment, so in my view micrometer A can be used to calibrate micrometer B.

However, both A and B have the same resolution:
1. Does this satisfy the 10:1 ratio rule, even though micrometer A was calibrated externally by a more accurate master?
2. Can I use micrometer A to calibrate micrometer B when they have the same resolution?
3. Or should I use a micrometer C with a resolution of 0.0001mm to calibrate micrometer B?
 
Confused: Calibration using master equipment

I want to know whether micrometer A is an internal micrometer and micrometer B is an external micrometer. If so, you cannot calibrate B with A.

Explanation:

1. Calibration of a micrometer cannot be done at one point only; it has to be done along the full travel at definite points. Hence we need to know the error in A at different points in order to calibrate B.

2. The values reported by A at different points can themselves vary with temperature.

3. If we say that we are maintaining the temperature at exactly 20 degrees, it amounts to using an adjustable test rod, which has nothing to do with the least count of micrometer A.
 
Clarifying my question

Sorry, I think my question may be misleading.

Assume all other factors are OK; please concentrate on the following:

Gauge block (external calibration result):

Nominal (mm)   Actual (mm)
1.000          1.0001
1.500          1.5001
2.000          1.9999

Micrometer A’s resolution is 0.001mm; its acceptable tolerance is +/-0.001mm.
Result of calibrating micrometer A against the gauge blocks:

Nominal of gauge block (mm)   Actual from micrometer A (mm)   Deviation (mm)
1.000                         1.000                            0.000
1.500                         1.501                           +0.001
2.000                         1.999                           -0.001
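
In code form, the deviation column is just the micrometer reading minus the nominal size; a small sketch:

```python
# Sketch: reproduce the deviation column from the table above.
readings = {1.000: 1.000, 1.500: 1.501, 2.000: 1.999}  # nominal -> micrometer A (mm)

for nominal, measured in readings.items():
    print(f"{nominal:.3f} mm -> reads {measured:.3f} mm, "
          f"deviation {measured - nominal:+.3f} mm")
```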

Question:
1) I am not sure whether micrometer A is within my acceptable tolerance of +/-0.001mm after calibrating it with the gauge blocks. Can anyone explain?

2) I want to use micrometer A to calibrate a block A of thickness 1.000mm. The resolution of both micrometer A and block A is the same. Can I use micrometer A (which was calibrated by a more accurate gauge block) to calibrate block A?

According to MSA,
1) The instrument must have discrimination that allows at least one-tenth of the expected process variation of the characteristic to be read directly. For example, if the characteristic’s variation is 0.001mm, the equipment should be able to read a change of 0.0001mm.

2) In reading the equipment, the reading should be estimated to the nearest number that can be obtained. If possible, readings should be made to one-half of the smallest graduation. For example, if the smallest graduation is 0.0001mm, the estimate for each reading should be rounded to the nearest 0.00005mm.
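
A rough sketch of those two rules with the example numbers (the helper function is my own, not from the MSA manual):

```python
# Sketch of the two MSA rules quoted above, using the example numbers.
process_variation_mm = 0.001  # expected variation of the characteristic
graduation_mm = 0.0001        # smallest graduation of the instrument

# Rule 1: discrimination of at least one-tenth of the process variation.
print("discrimination OK:", graduation_mm * 10 <= process_variation_mm)  # True

# Rule 2: estimate each reading to the nearest half graduation.
def to_half_graduation(reading_mm: float, graduation_mm: float = 0.0001) -> float:
    half = graduation_mm / 2  # 0.00005 mm steps
    return round(reading_mm / half) * half

print(f"{to_half_graduation(1.50012):.5f}")  # 1.50010
```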

(I don’t quite understand what the MSA manual says, so I’m very confused about the above case.)

Please help!!!
 
The calipers wind up giving you at best a 1:1 TAR... the gage blocks will act as a transfer standard...

Also, if the best resolution is 0.001mm, how can the calipers be certified to 0.0002mm? The parallax alone will throw that off...

The best answer is to have the gage blocks calibrated by your accredited calibration provider (accredited by HKAS or CNAL) and use the gage blocks to cal the calipers.

Hope this helps.

Hershal
 
Thanks, Hershal, but could you answer my question directly? I don't understand your answer; it seems complicated.
 
Hi,

1) Do a minimum of 3 runs and take the average. Use the "actual" gauge-block results to calculate the error. You should get 4 decimals instead of 3 (see the sketch below).

2) Yes; strictly speaking, a 1:1 ratio is not calibration... but everything depends on your application and procedures.
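
A sketch of point 1 (the three run values here are made up for illustration):

```python
# Average three runs at one point and compute the error against the
# *actual* (certified) gauge-block value, not the nominal.
runs_mm = [1.501, 1.500, 1.501]  # three micrometer readings at the 1.5 mm point
actual_gb_mm = 1.5001            # certified "actual" gauge-block value

average = sum(runs_mm) / len(runs_mm)
error = average - actual_gb_mm
print(f"average = {average:.4f} mm, error = {error:+.4f} mm")
# -> average = 1.5007 mm, error = +0.0006 mm (four decimals, as noted)
```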
 
Let me try...

Hi all,

I don't know if this would help, but this is what I think...

1) I am not sure whether micrometer A is within my acceptable tolerance of +/-0.001mm after calibrating it with the gauge blocks. Can anyone explain?

Your micrometer A is within your acceptable tolerance. Against a very accurate 1 mm gauge block, your micrometer's reading is expected to fall between 0.999 mm and 1.001 mm, and it did. At 1.5 mm, micrometer A is expected to fall between 1.499 mm and 1.501 mm, and it did (the same goes for the 2 mm measurement).
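
As a sketch of that check, working in whole micrometres to avoid floating-point noise:

```python
# Each reading must fall within nominal +/- 0.001 mm to pass.
readings = {1.000: 1.000, 1.500: 1.501, 2.000: 1.999}  # nominal -> measured (mm)
tolerance_um = 1  # +/- 0.001 mm expressed in micrometres

for nominal, measured in readings.items():
    deviation_um = round((measured - nominal) * 1000)
    verdict = "PASS" if abs(deviation_um) <= tolerance_um else "FAIL"
    print(f"{nominal:.3f} mm: deviation {deviation_um:+d} um -> {verdict}")
```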

2) I want to use micrometer A to calibrate a block A of thickness 1.000mm. The resolution of both micrometer A and block A is the same. Can I use micrometer A (which was calibrated by a more accurate gauge block) to calibrate block A?

Now, correct me if I’m wrong, but as far as I know, blocks don’t have a resolution, because they cannot read anything; they have a tolerance. Assuming that the tolerance of your block A is +/- 0.001 mm (the same as your micrometer A's tolerance), you cannot calibrate your block A with your micrometer A.

Assuming that block A is 1.5 mm +/- 0.001 mm, and your micrometer A measured 1.501 mm, the true value of your block A will lie between 1.500 mm and 1.502 mm, because:

lower limit = 1.501 mm - 0.001 mm = 1.500 mm (passed)
higher limit = 1.501 mm + 0.001 mm = 1.502 mm (failed)

So… you won’t be able to tell whether your block A passed the acceptable tolerance.
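
Here is that interval argument as a sketch (the variable names are just for illustration):

```python
# The true value could be anywhere in measured +/- the micrometer's
# tolerance; that interval must sit inside the block's tolerance band
# for a clear pass.
measured = 1.501                       # micrometer A reading (mm)
mic_tol = 0.001                        # micrometer A acceptable tolerance (mm)
blk_nominal, blk_tol = 1.500, 0.001    # block A and its tolerance (mm)

true_low, true_high = measured - mic_tol, measured + mic_tol        # 1.500 .. 1.502
band_low, band_high = blk_nominal - blk_tol, blk_nominal + blk_tol  # 1.499 .. 1.501

if band_low <= true_low and true_high <= band_high:
    print("clear pass")
elif true_high < band_low or band_high < true_low:
    print("clear fail")
else:
    print("cannot tell")  # this case: 1.502 mm falls outside the 1.501 mm limit
```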

Even if you used the value from the gauge-block calibration (micrometer A read 1.501 mm at the 1.5 mm point, i.e. it reads 0.001 mm high) to correct the measurement:

lower limit = 1.501 mm - 0.001 mm - 0.001 mm (correction) = 1.499 mm (passed)
higher limit = 1.501 mm + 0.001 mm - 0.001 mm (correction) = 1.501 mm (passed)

there would still be an uncertainty of +/- 0.0005 mm because of the resolution of your micrometer; +/- 0.0005 mm is the half-width of the 0.001 mm resolution. In a measured value of 1.501 mm, you are not sure whether it is really 1.5010, 1.5011, 1.5012, etc. (if it were 1.5015 or more, it would round up to 1.502).

So, from your previous lower and higher limits:

low limit 1 = 1.499 mm + 0.0005 mm = 1.4995 mm (passed)
low limit 2 = 1.499 mm - 0.0005 mm = 1.4985 mm (failed)

high limit 1 = 1.501 mm + 0.0005 mm = 1.5015 mm (failed)
high limit 2 = 1.501 mm - 0.0005 mm = 1.5005 mm (passed)

So, again, you won’t be able to tell whether your block A is within your tolerance of +/- 0.001 mm.
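
And the corrected case with the +/- 0.0005 mm half-width, as a sketch that reproduces the limits above:

```python
# Apply the calibration correction, then widen the possible-true-value
# interval by the half-width of the 0.001 mm resolution.
measured = 1.501
correction = -0.001    # micrometer A read +0.001 mm high at this point
mic_tol = 0.001
half_width = 0.0005    # half of the 0.001 mm resolution
blk_nominal, blk_tol = 1.500, 0.001

corrected = measured + correction           # 1.500 mm
low = corrected - mic_tol - half_width      # 1.4985 mm (below 1.499)
high = corrected + mic_tol + half_width     # 1.5015 mm (above 1.501)
band_low, band_high = blk_nominal - blk_tol, blk_nominal + blk_tol

print(f"possible true value:  {low:.4f} .. {high:.4f} mm")
print(f"block tolerance band: {band_low:.4f} .. {band_high:.4f} mm")
print("clear pass:", band_low <= low and high <= band_high)  # False
```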


cheers!
 
nick1980 said:
Thanks, Hershal, but could you answer my question directly? I don't understand your answer; it seems complicated.


Actually, the answer is simple. Use the gage blocks to calibrate the calipers.

The gage blocks are more precise than the calipers, so using the gage blocks to calibrate the calipers will give you the 4:1 accuracy ratio that you want.

Hope this helps.

Hershal
 