Environmental Conditions for Calibration - Temperature and Humidity

paulmorrow

Is there a standard set of environmental conditions, e.g., temperature and humidity, that must be met during the calibration of standard measuring instruments (calipers, micrometers, etc.)? A prospective supplier of calibration services is offering to perform such calibrations on site (to save transportation time and $), but I am concerned that the proposed location does not meet the required controls. :confused:
 
Ryan Wilde

The standard condition for almost all dimensional calibrations is 20°C (68°F). Now here's where it gets tricky: to hold the uncertainties you're looking for, the allowable temperature range varies.

In a nutshell, gauge blocks and instruments grow as temperature increases. They grow (assuming steel gauge blocks and caliper/mic spindles) at a rate of 11.5 µm/m/°C (6.4 µin/in/°F), ± about 10%.
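The growth rule above is just linear thermal expansion, ΔL = α·L·ΔT. A minimal sketch (the function name and example numbers are mine, using the 11.5 µm/m/°C figure from the post):

```python
# Linear thermal expansion of a steel gauge block: dL = alpha * L * dT
ALPHA_STEEL = 11.5e-6  # per deg C (approximate, +/- ~10% as noted above)

def thermal_growth_um(length_mm: float, delta_t_c: float) -> float:
    """Growth in micrometres of a steel length for a temperature change."""
    length_m = length_mm / 1000.0
    return ALPHA_STEEL * length_m * delta_t_c * 1e6  # metres -> micrometres

# A 25 mm (1 in) block warmed 5 deg C above 20 deg C grows about 1.4 um:
growth = thermal_growth_um(25.0, 5.0)
```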

Therefore, each °C of departure from 20°C adds an uncertainty of 2.3 µm/m (or about 0.06 µm per 25 mm). For a micrometer with an uncertainty of ±2.5 µm (0.0001") over a 25 mm (1") span, you would not want to stray from 20°C by more than about 5°C (68°F ± 9°F). If your micrometers are less accurate, you can stray further; if they are more accurate, you need a tighter environment still. This all rests on the base uncertainty of the company calibrating your tools, because if they start high, it leaves less room for temperature error.
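Ryan's per-degree figure translates directly into a quick check. A sketch, assuming his 2.3 µm/m/°C number (the function and the comparison are illustrative, not from his actual budget):

```python
# Temperature-induced uncertainty using the 2.3 um/m per deg C figure above
U_TEMP = 2.3  # um of added uncertainty, per metre, per deg C away from 20 C

def temp_uncertainty_um(length_mm: float, delta_t_c: float) -> float:
    """Uncertainty (um) added by being delta_t_c away from 20 C."""
    return U_TEMP * (length_mm / 1000.0) * abs(delta_t_c)

# Each deg C over a 25 mm span adds ~0.06 um, matching the text:
per_degree = temp_uncertainty_um(25.0, 1.0)   # ~0.0575 um

# At the suggested 5 deg C limit the contribution is ~0.29 um, still
# comfortably below a +/-2.5 um (0.0001 in) micrometer uncertainty:
at_limit = temp_uncertainty_um(25.0, 5.0)     # ~0.2875 um
```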

The biggest worry is usually rate of change, because the mass of a gauge block and the mass of a set of calipers or a micrometer spindle are very different. If your environment can't stay within about 2°C/hour, calibrations really start giving false measurements. If the temperature of the instrument and the temperature of the standard differ by much, errors add up quickly (hence the little chunks of plastic on a micrometer, known as 'heat shields', which are meant to slow the transfer of heat from your hand to the micrometer).
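The standard-versus-instrument point can be sketched the same way: what the reading picks up is the difference in expansion between the two. A rough illustration (the 3°C mismatch is an invented example, not from the post):

```python
ALPHA_STEEL = 11.5e-6  # per deg C, steel (approximate)

def differential_error_um(length_mm: float, t_block_c: float,
                          t_instrument_c: float) -> float:
    """Apparent error (um) when block and instrument sit at different temps."""
    length_m = length_mm / 1000.0
    return ALPHA_STEEL * length_m * (t_block_c - t_instrument_c) * 1e6

# A 25 mm block at 20 C measured with a micrometer warmed to 23 C by
# handling: the 3 deg C mismatch alone is worth roughly 0.86 um.
mismatch = differential_error_um(25.0, 20.0, 23.0)
```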

Calipers are more forgiving (because of the roughly 10:1 accuracy difference between a mic and a caliper), and you can generally get away with 20°C ± a ballpark amount.

I omitted a lot of the calculations here for brevity and sanity (the savvy folks reading this will pick up on distributions, etc.), but this is a pretty good rule of thumb for a quick environmental check.

Humidity does not affect caliper or micrometer calibration, although as a supplier, I never liked taking my blocks into an environment with >60% humidity because they rust so quickly.

Hope this helps,

Ryan
 
Ken K

Ryan Wilde said:
Humidity does not affect caliper or micrometer calibration, although as a supplier, I never liked taking my blocks into an environment with >60% humidity because they rust so quickly.


The lab which does our gage blocks and weights does so in an environment with humidity < 45%. Temperature range is 68.0 °F ± 1 °C.
 
Ryan Wilde

Originally posted by Ken K:
The lab which does our gage blocks and weights does so in an environment with humidity < 45%. Temperature range is 68.0 °F ± 1 °C.

That is actually a fairly broad temp range for a lab doing gage blocks. Even at 20°C ± 0.25°C, temperature accounted for over 50% of our uncertainty on gage blocks, even with thermal compensation. We also had problems with humidity (we spec'd 25-45% RH, and in the winter it always tried to go below 25%, which is not healthy for electronic devices such as gage block comparators).
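For what it's worth, the "over 50% of our uncertainty" effect is easy to see with a root-sum-square budget. The component values below are invented for illustration (a 100 mm block in a lab held to ±0.5 °C, with a 10% residual after thermal compensation), not Ryan's actual numbers:

```python
import math

def rss_um(*components_um: float) -> float:
    """Combine independent uncertainty components root-sum-square (GUM style)."""
    return math.sqrt(sum(c * c for c in components_um))

# Hypothetical budget for a 100 mm gauge block, lab held to +/- 0.5 deg C:
u_reference = 0.030                   # um, reference block/comparator (invented)
u_repeat    = 0.020                   # um, repeatability (invented)
u_temp      = 0.1 * 11.5 * 0.1 * 0.5  # um: 10% CTE residual x 0.1 m x 0.5 C

total = rss_um(u_reference, u_repeat, u_temp)
temp_share = (u_temp / total) ** 2    # fraction of variance from temperature
# temp_share comes out around 0.7 -- temperature dominates the budget.
```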

I now rarely touch dimensional, and I work in a comfy 23° lab. Life is good.

Ryan
 
AlbertPaglinawan

Hello,

I would like to know the ideal environment for calibrating dimensional and electrical equipment. Ours is currently 21°C +/- 5°C and 50% +/- 10% RH. Now I need to justify these figures. Where did we get them? What possible justifications can we give?

Thanks in advance!

cheers,
Lord Ituralde
 
Graeme

Lord Ituralde said:
I would like to know the ideal environment for calibrating dimensional and electrical equipment. Ours is currently 21°C +/- 5°C and 50% +/- 10% RH. Now I need to justify these figures. Where did we get them? What possible justifications can we give?
Here are some figures from a couple of readily available documents --

TEMPERATURE
Dimensional, Optical and Mass
NCSL: 20 °C +/- 0.5 °C for general calibrations

ISA: 20 °C +/- 1 °C overall and +/- 0.3 °C at the point of measurement

All other disciplines
NCSL: 23 °C +/- 2 °C for general calibrations

ISA: 23 °C +/- 1.5 °C

RELATIVE HUMIDITY
Dimensional, Optical, Mass
NCSL: 40% +/- 5% RH at 20 °C

ISA: 45% RH maximum at 20 °C

All other disciplines
NCSL: 40% +/- 5% RH at 23 °C

ISA: 20 - 55% RH at 23 °C

Each of these recommended practices contains a lot of other information, of course. Both of them are targeted at "standards" laboratories; if you are in a company's production lab, this is the type of lab you would send your transfer standards to. In both cases, the values above are the most relaxed but still might be tighter than what you want to maintain. On the other hand, they also agree with what I remember (10 years ago) of the US Navy's requirements for general calibration labs --
Measurements using gage blocks: 20 °C +/- 0.5 °C and 10 - 45% RH

Other dimensional/optical/mass: 23 °C +/- 3 °C and 20 - 60% RH
Electrical/Electronic/RF/Microwave: 23 °C +/- 5 °C and 20 - 60% RH.

What you have appears to be a compromise. Do you perform all calibrations in the same area? The temperature range means that you are giving up some accuracy capability in dimensional measurements, and the high humidity is increasing the risk of corrosion (rust) on your standards. The only way to get better control is to divide the lab into two areas with separate temperature/humidity control systems. (There may be other ways, maybe an air-conditioning engineer would know ...)


You might also want to check the NIST web site. A couple of items that may be useful are the State Weights and Measures Laboratory Handbook (NIST Handbook 143), and the Gage Block Handbook (NIST Monograph 180).
 

howste

Reading this thread brought to mind a few questions:

1) If you calibrate all of your gages in a laboratory with a controlled environment, then use the gages in an uncontrolled environment, what kind of results do you get?

2) Does it make sense to let the standards "soak" and calibrate the equipment in the environment it will be used in?

3) Is it better to keep gages in the environment they will be used in, or in a controlled lab?
 

Mike S.

howste said:
Reading this thread brought to mind a few questions:

1) If you calibrate all of your gages in a laboratory with a controlled environment, then use the gages in an uncontrolled environment, what kind of results do you get?

2) Does it make sense to let the standards "soak" and calibrate the equipment in the environment it will be used in?

3) Is it better to keep gages in the environment they will be used in, or in a controlled lab?

I think it depends on the accuracy you need, but some folks surely need to think about it. People making very precise measurements may need to do a few tests and calculations, look at the data, and decide. The standards labs have to be concerned with much more accuracy than I do for my mics and calipers.

If I consider a 1" gage block to have a 15 ppm/°C thermal expansion coefficient, and the temp is either 15 or 25 °C (59-77 °F), the block will change size by ~0.000075" from its size at 20 °C. I never calibrate at those extremes, and this is more accuracy than I need anyway. In the shop area, let's say the A/C is dead one day and it gets to 32 °C (90 °F). My gage block is now ~0.00018" oversize -- still very unlikely this would cause me any measurement issues, but I would not cal at that temperature. If the tool (mics or calipers) is hot, any effect is cancelled out by zeroing it at the use temperature. Other folks may need more accuracy, so they would need to look at their particular situation.

As for electronic equipment, same deal: do a few tests and calculations, look at the data, and decide.
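Mike's arithmetic above checks out; here is the same estimate in a few lines (15 ppm/°C is his assumed coefficient, a bit above the ~11.5 usually quoted for steel):

```python
def growth_inches(length_in: float, ppm_per_c: float, delta_t_c: float) -> float:
    """Length change (inches) for a given expansion coefficient and delta T."""
    return length_in * ppm_per_c * 1e-6 * delta_t_c

# 1 in block, 15 ppm/C, 5 C away from 20 C -> 0.000075 in
cal_extreme = growth_inches(1.0, 15.0, 5.0)
# A/C failure day: a 32 C shop is 12 C above 20 C -> 0.00018 in
hot_shop = growth_inches(1.0, 15.0, 12.0)
```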

JMO.
 
Ryan Wilde

Lord Ituralde said:
Hello,

I would like to know the ideal environment for calibrating dimensional and electrical equipment. Ours is currently 21°C +/- 5°C and 50% +/- 10% RH. Now I need to justify these figures. Where did we get them? What possible justifications can we give?

Thanks in advance!

cheers,
Lord Ituralde

I seem to remember that you had a Datron/Wavetek/Fluke/whatever 1281. The accuracy spec for that particular piece of equipment is 21 - 28°C, which may pose a problem for you with your present environmental specification.

Many labs, including ours, have separate rooms for dimensional and electrical calibrations, with entirely different specifications. The crux of the problem is that 23°C is the standard laboratory temperature for electrical, and temperature does have a measurable effect on high-accuracy equipment. Dimensional, on the other hand, has a standard laboratory temperature of 20°C, and departures from that temperature have a measurable effect with very little temperature change.

You also may wish to worry about your humidity specification, as 60% RH on tool steel tends to cause rust.


Ryan
 