Are there special techniques when you calibrate digital micrometers?

Jerry Eldred

Forum Moderator
Super Moderator
I wouldn't categorize these as "special techniques". However, you do need an appropriate, documented procedure, the correct standards (gage blocks, monochromatic light, optical flats, surface plate, cloth gloves, etc.), a proper environment, properly trained personnel, and so on.

The only difference (without actually having micrometer specs in front of me) between a digital and an analogue micrometer could be that some digital micrometers have tighter operating tolerances than some analogue ones. Other than that, the procedure should be the same.

I think the analogue versus digital multimeter comparison is a good analogy here.

The fact that the question of differences comes up at all suggests you need a good procedure for calibrating either type. If you are in a single-function lab that calibrates only micrometers and perhaps calipers, I recommend reviewing whatever procedure you have for analogue micrometers to be sure you have adequate uncertainty ratios between your standards and the items being calibrated. Then run through the same exercise for digital micrometers.

I don't know how much help this is. Please post again if you have more questions.

------------------

Libnani

There is a French standard, NF E11-095 (it may have been translated into English), which describes the calibration of micrometers in detail.

You can ask for it (assuming an English version exists).