Temperature Requirements For In House Calibration - AS9100

Tyler

Involved In Discussions
We are looking to save money by performing in-house calibration/verification of certified monitoring and measuring devices. There is quite a lot of discussion about this issue on the forum, yet I still need help determining what I need to do to be properly set up for in-house calibration/verification.

The tightest tolerances we have ever held have been around ±.0005", therefore the most accurate devices we require must be accurate to the tenth (.0001"). Given this information, we need to determine what the requirements would be for the temperature range of our inspection room as well as the accuracy requirements for the thermometer in that room. Based on this post, the temperature range required for verifying devices with an accuracy to the tenth (.0001") is 20°C ± 5°C (68°F ± 9°F).

We have found a NIST traceable temperature gauge that has an accuracy of ±1°C. That seems more than sufficient to monitor a temperature range of 20°C ± 5°C; we would just need to keep the readings within 20°C ± 4°C to be certain that the true temperature is within range.

Am I on the right track here?
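To sanity-check that guard-banding logic, here is a rough Python sketch. The 20°C ± 5°C range and the ±1°C gauge accuracy are the figures above; subtracting the gauge accuracy from the room tolerance is the assumption being tested.

```python
# Rough guard-band check: if the gauge can read up to 1°C off, only accept
# displayed readings that stay inside the room limit minus that gauge error.
ROOM_NOMINAL_C = 20.0     # reference temperature
ROOM_TOL_C = 5.0          # allowed deviation, i.e. 20°C ± 5°C
GAUGE_ACCURACY_C = 1.0    # stated accuracy of the NIST-traceable gauge

reading_limit_c = ROOM_TOL_C - GAUGE_ACCURACY_C   # 4.0, i.e. keep readings within 20°C ± 4°C

def reading_is_acceptable(reading_c: float) -> bool:
    """True if the displayed reading guarantees the true temperature is
    within 20°C ± 5°C even with worst-case gauge error."""
    return abs(reading_c - ROOM_NOMINAL_C) <= reading_limit_c

print(reading_is_acceptable(23.9))  # True  (true temperature is at worst 24.9°C)
print(reading_is_acceptable(24.5))  # False (true temperature could be 25.5°C)
```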
 

mattador78

Quite Involved in Discussions
I would probably bring it within ±3°C; then you are giving yourself a tighter parameter to control, but you're not on the edge if you reach your upper or lower limit. That's what we have done with our chemistry controls here: we keep a tighter band in the middle, which still lets us operate, potentially outside our own parameters but within the allowed range, while we rectify the issue.
 

Big Jim

Admin
The temperature gauge would be needed, as you suspect. Don't perform calibration if the temperature is outside of the range. Make sure you have been in range for a while and allow both the testing device and the target device to soak in that temperature range so they are stable. How long a soak is needed depends on the mass of the instruments involved.

Although it is not specifically called out in AS9100D, it is good practice to allow whatever you are measuring to soak as well. The part you are measuring could grow or shrink depending on whether it is too warm or too cold.
 

mattador78

Quite Involved in Discussions
Woke up this morning, read that for the first time, and thought, "What if there is no solution to soak it in?" :uhoh:
Following a coffee and a re-read, I declare ignorance on my first read-through, your honour.
 

Welshwizard

Involved In Discussions
Hello Tyler,

A lot of the aerospace suppliers I see in the UK and Europe are being compelled to have their devices calibrated to UKAS standards by labs accredited to ISO 17025, so the vast majority of them send their devices out. The key for me is that these accredited labs have undergone quite a high level of scrutiny, in particular over the derivation and use of measurement uncertainty budgets and declarations, among many other aspects.

If you undertake your own calibration and don't derive these uncertainty statements, which effectively demonstrate your measurement capability, then you don't really demonstrate traceability. For example, if you allow temperature deviations of ±3°C around the reference (perfectly reasonable), that largely controls the level of uncertainty you can demonstrate for your devices and your calibration process.
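To give a flavour of what such an uncertainty budget involves, here is a deliberately simplified sketch. Every contributor and value below is invented for illustration; a real ISO 17025 budget would be derived from your own equipment, environment and method, and would have more terms than this.

```python
import math

# Illustration-only uncertainty budget for verifying a 1" dimension on a
# caliper against a gauge block. All values are invented for the example.
contributors_in = {
    "gauge block (from its calibration certificate)": 0.000010,  # standard uncertainties, inches
    "caliper resolution (0.0005\" / sqrt(12))":        0.000144,
    "temperature deviation of +/-3 degC on 1\" steel": 0.000020,
    "repeatability of repeated readings":              0.000100,
}

combined = math.sqrt(sum(u**2 for u in contributors_in.values()))  # root-sum-square
expanded = 2 * combined                                            # coverage factor k = 2 (~95%)

print(f"combined standard uncertainty: {combined:.6f} in")
print(f"expanded uncertainty (k = 2):  {expanded:.6f} in")
```

The point of the exercise is that the expanded figure, not the thermometer alone, is what you compare against the accuracy you are trying to verify.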

If your customer is fine with what you are proposing then great, but sometimes saving money is not all it seems. In my long experience, when you factor in all the overheads and various costs, it very rarely makes sense to calibrate in house the gauges used for aerospace requirements, unless there is a strategic intent of course.

True traceability, driven by a knowledge of the associated measurement uncertainty and the other aspects that come with using UKAS-accredited labs, could be a blessing if the unthinkable happens and you have to be accountable for a measurement result. It all depends on the level of component you are making, of course, and on your customer's needs.
 

Kronos147

Trusted Information Resource
Don't forget humidity. And procedures. And references to standards used in the records.
 

Tyler

Involved In Discussions
Thanks for all the responses.

Perhaps a little more context is helpful. The calibration lab we have used in the past comes to our facilities to calibrate/verify our devices. They bring an insulated trailer that they heat up with a little portable heater. When they arrive, they grab all of our devices and take them into the trailer. There is practically no soak time for the devices. In addition, in the many years that we have used their services, they have never reported an out-of-tolerance device. All of these facts have led management to believe that we are wasting our money just to get a sticker from an accredited lab.

We know of a machine shop in the US that performs calibration/verification in house. Well, actually it is only verification; their procedures only cover verifying that devices are within tolerance, not calibrating devices. They use NIST certified reference standards to verify their calipers and micrometers. This, they say, demonstrates traceability to NIST. I was hoping that our company could do something like this.

If you undertake your own calibration and don't derive these uncertainty statements, which effectively demonstrate your measurement capability, then you don't really demonstrate traceability. For example, if you allow temperature deviations of ±3°C around the reference (perfectly reasonable), that largely controls the level of uncertainty you can demonstrate for your devices and your calibration process.

My naive assumption is that we could buy highly accurate gauge blocks, something like this gauge block set, and use them to verify that our devices are reading within their stated accuracy. For example, if checking a pair of 6" calipers with a stated accuracy of ±.001", we would check the caliper against the gauge blocks at every inch within its range to verify that it reads the value of the gauge block ±.001" (e.g., 1" ± .001", 2" ± .001", etc.). However, based on the quote above, it sounds like we would have to do a lot more. Is this the case?
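To make that concrete, here is a minimal sketch of the gauge-block check described above. The block values and caliper readings are invented for illustration; the ±.001" limit is the caliper accuracy mentioned in the paragraph.

```python
# Minimal sketch of verifying a 6" caliper against gauge blocks at each inch.
# The readings below are invented for illustration.
CALIPER_TOLERANCE_IN = 0.001   # stated accuracy of the caliper being verified

checks = [   # (nominal gauge block length, caliper reading) in inches
    (1.0, 1.0005),
    (2.0, 1.9995),
    (3.0, 3.0010),
    (4.0, 4.0000),
    (5.0, 5.0015),   # deliberately out of tolerance for the example
    (6.0, 6.0005),
]

for nominal, reading in checks:
    error = reading - nominal
    status = "PASS" if abs(error) <= CALIPER_TOLERANCE_IN else "FAIL"
    print(f"{nominal:4.1f} in block: read {reading:.4f} in, error {error:+.4f} in -> {status}")
```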
 

Ninja

Looking for Reality
Trusted Information Resource
Haven't seen it mentioned yet, so I figured I would throw it in...

Are any of your gages used in a manner where temperature correction is even a remote factor?

We had a laser-etched glass plate for calibrating our CMM...the TCE of SLS glass is about 9.1 ppm/°C, and the plate was 6" x 8".
We were talking about 90 millionths of an inch per degree C...nothing we measured came anywhere close to needing that accuracy, so we ignored temperature correction entirely with no issues.
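For anyone who wants to re-run that back-of-the-envelope number, a quick sketch using the ~9.1 ppm/°C figure above; taking the plate diagonal as the longest dimension is an assumption.

```python
# Quick check of the thermal growth described above.
# TCE of soda-lime (SLS) glass ~9.1 ppm/°C, plate is 6" x 8".
TCE_GLASS_PER_C = 9.1e-6                       # in/in per °C
plate_diagonal_in = (6.0**2 + 8.0**2) ** 0.5   # 10.0" corner to corner (assumed worst case)

for length_in in (8.0, plate_diagonal_in):
    growth_uin_per_degC = TCE_GLASS_PER_C * length_in * 1e6
    print(f"{length_in:.1f} in of glass: ~{growth_uin_per_degC:.0f} millionths of an inch per °C")
# -> ~73 µin/°C along the 8" side and ~91 µin/°C across the diagonal,
#    roughly the "90 millionths per degree C" quoted above.
```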

just a thought...
 

Tyler

Involved In Discussions
Are any of your gages used in a manner where temperature correction is even a remote factor?

The tightest tolerances we have ever held have been ±.0003"; however, this is rare. Thus, the highest accuracy we require would be .0001". Even then, we are talking dimensions that typically don't exceed an inch. The largest micrometer we have is a 4-5" micrometer, but that thing rarely sees the light of day.

I created a few tables in a spreadsheet (attached) that compute uncertainty with respect to length and temperature. However, I am not a metrologist, so my approach could be off base. Thoughts and comments are appreciated.
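For anyone who doesn't open the spreadsheet, the sketch below shows the kind of table described above. It covers only the thermal-expansion term, and the 11.5 ppm/°C coefficient for steel is an assumed value, not necessarily what the attached tables use.

```python
# Sketch of a thermal-shift table versus length and temperature deviation.
# Thermal-expansion term only; 11.5 ppm/°C for steel is an assumed coefficient.
TCE_STEEL_PER_C = 11.5e-6          # in/in per °C (assumed)

lengths_in = [1, 2, 4, 6]          # measured length, inches
temp_deviations_c = [1, 3, 5]      # deviation from the 20°C reference, °C

print("length (in) | " + " | ".join(f"dT = {dt}°C" for dt in temp_deviations_c))
for length in lengths_in:
    shifts = [TCE_STEEL_PER_C * length * dt for dt in temp_deviations_c]
    print(f"{length:>11} | " + " | ".join(f"{s:.6f}\"" for s in shifts))
# A 1" dimension sitting 5°C off reference shifts about 57.5 millionths of an
# inch, already over half of the 0.0001" accuracy being relied on.
```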
 

Attachments

  • Uncertainty Tables.xlsx (13.6 KB)

Ninja

Looking for Reality
Trusted Information Resource
So do the math with the maximum shift that temperature change could cause...is it pertinent?

FWIW, I always used "What would happen if the room changed by 10°C?" Our rooms never shifted that far.
If yours do...use a temperature swing larger than what is possible...
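Following that suggestion, a one-line worked check; a 1" steel dimension at ~11.5 ppm/°C is an assumed worst case, not a figure from this thread.

```python
# Worst case if the room moved a full 10°C, per the rule of thumb above,
# assuming a 1" steel dimension at ~11.5 ppm/°C (assumed values).
shift_in = 11.5e-6 * 1.0 * 10.0
print(f"max thermal shift over 10°C: {shift_in:.6f} in")   # 0.000115 in, vs the 0.0001 in needed
```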
 