Common Sense Calibration
Subject: Re: Q: Calibration Issues for Small Firms
Date: Mon, 29 Nov 1999 10:37:41 -0600
From: ISO Standards Discussion
> It is not necessary or desirable to take every measurement to the
> greatest possible accuracy. It is only necessary to show that the range
> of uncertainty is within the tolerance of the process. For example, if
> a piece of lab equipment requires room temperatures below 120 degrees F,
> a garden thermometer is probably all you need. It would be a waste of
> money to buy and calibrate an instrument accurate to .001 degree just
> because "ISO says we have to do it." Write down the reasons you believe
> the garden thermometer is working properly, and you're done.
Thomas' common sense approach reflects the spirit of proper ISO9000 implementation, and it gives me an opportunity to illustrate how a firm that does NOT need a NIST-certifiable paper trail for its measuring equipment can still adequately plan for and document the degree of uncertainty in that equipment.
Let me tell a story first to set the stage: A clerk at the local auto parts store wanted to stop me when I began rolling an inflated car tire, mounted on a rim, into his store. I asked him to follow me instead of kicking me out, and we went to the rack of $1.00 tire pressure gauges. I pulled down at least thirteen gauges and checked the pressure of my "standard" tire with each one. Only 6 came close (um, I knew the pressure in the tire when I rolled it into the store...). I then tested those 6 closest gauges one more time, and finally chose the single gauge that seemed the most accurate. I can't tell you how many gas station attendants have complimented me on the comparable accuracy of my $1.00 tire pressure gauge since then.
The same purchase/test approach can be applied to the garden thermometers from Thomas' example. The more reputable manufacturers of these thermometers will furnish, upon written request, their QA/QC test plans, the "standard" against which the tests are performed, and the acceptable amount of uncertainty (or inaccuracy) that allows release for sale. Typically, the QA/QC test procedures involve one or more pieces of expensive equipment with a traceable calibration back to a NIST standard, checked at specified intervals (e.g. every 5 degrees, every 3 degrees, every 1 degree, etc.) AND recalibrated at specified periods (every three months, six months, every week... whatever). So in this example, if temperature is your game and (say) a one-degree or three-degree tolerance is acceptable according to your quality plan(s), then there's a two-step method that, if used, will establish reliable traceability to the NIST standard without the expensive "requirement" of hiring a sub with a $750/day NIST-calibrated thermometer accurate to .001 degree (the one Thomas was talking about) to calibrate your firm's thermometers.
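(If it helps to see Thomas' principle written down as a check: here's a minimal sketch, not from his post or mine above, of the "uncertainty within tolerance" test. The 4:1 ratio is a common metrology rule of thumb I'm assuming here, not something the standard dictates, and the function name is my own invention.)

# A minimal sketch of the quoted principle: the instrument's uncertainty
# only has to fit comfortably inside the tolerance of the process.
# The 4:1 minimum ratio is an assumed rule of thumb, not a requirement
# from the original post.
def instrument_adequate(process_tolerance, instrument_uncertainty, min_ratio=4.0):
    """True if the instrument is accurate enough for the process."""
    return process_tolerance / instrument_uncertainty >= min_ratio

# The garden-thermometer case: the room just has to stay below 120 F,
# and we can live with (say) +/- 10 F of slop, so a +/- 2 F garden
# thermometer is plenty.
print(instrument_adequate(10.0, 2.0))    # True  -- garden thermometer
print(instrument_adequate(10.0, 0.001))  # True, but wildly overkill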
Step One: State in your quality system documentation that you are going to rely, in part, on the manufacturer's traceability back to a NIST standard. Then write to the manufacturers you've selected, tell them you plan on using their products, and ask for the traceability documentation they hold, from NIST to their manufacturing plant, so you can achieve ISO900x compliance for Control of Inspection, Measuring, and Test Equipment. You'll probably get a great deal more than you asked for, as soon as the manufacturer determines that your firm does not also manufacture thermometers.
Step Two, Part One: If your entire collection of processes calls for the continuous "on the floor" use of 30 thermometers (just picking a number), then your quality plan should call for the purchase of 33 (again, just picking a number -- in this case ten percent more than the "actual use" count; I would pick ten percent based on 30 thermometers and gut feeling; you might have different reasons, but like Thomas said, just document them).
Step Two, Part Two: Apply a unique number to all 33 thermometers, put them in the same 'fridge, and leave them there long enough for all of them to settle at the temperature. Yank them all out at the same time, and a quick look will probably reveal that at least one thermometer reads a degree or more off from the rest. Document the "oddballs," change the temperature of the 'fridge, and repeat. Repeat again at ambient indoor and ambient (but stable) outdoor temperatures. In all, a QC guy could have fun for a day, at best. (Lots cheaper than a NIST-monster accurate to .001 degree -- sheesh.)
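(For the curious, here's a minimal sketch of that lot-comparison test in code. Everything in it -- the names, the use of the lot median as the "norm," the sample readings -- is my own assumption about one reasonable way to run it, not a prescribed method from any standard.)

# A sketch of the Step Two, Part Two comparison test: flag any
# thermometer whose reading differs from the lot consensus by more
# than the documented tolerance at any test condition.
from statistics import median

def flag_oddballs(readings_by_test, tolerance=1.0):
    """readings_by_test: {test_name: {thermometer_id: reading}}.
    Returns the set of thermometer ids to pull from service."""
    oddballs = set()
    for readings in readings_by_test.values():
        consensus = median(readings.values())  # the lot "norm" for this test
        for tid, value in readings.items():
            if abs(value - consensus) > tolerance:
                oddballs.add(tid)
    return oddballs

# Example: three test conditions (cold 'fridge, warmer 'fridge, ambient);
# thermometer 3 reads about 2 degrees high in the cold.
readings = {
    "fridge_4C":  {1: 4.0, 2: 4.1, 3: 6.2},
    "fridge_10C": {1: 10.1, 2: 9.9, 3: 12.0},
    "ambient":    {1: 21.0, 2: 21.1, 3: 21.2},
}
print(flag_oddballs(readings))  # -> {3}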
With a little luck on your side, you will have experimentally shown that you have a group of thermometers (for this example, say, 30) that are backed by the manufacturer's traceability documents to a NIST standard, and from which you have eliminated, by your own documented method, any thermometers that fall outside the variance displayed by the norm of the lot.
When a manufacturer documents to you that its thermometers are manufactured to be NIST-accurate within 1 degree (C, F, whatever) between (say) -40 degrees and +500 degrees, AND you've tested 33 of them and found that (say) 30 read the same on each of several temperature tests within the range of intended use, then you're almost done once you document that. (And of course, when you need to replace a percentage of thermometers due to normal wear and tear, employee borrowing, or whathaveyou, you'd want to run this internal testing cycle again, and document it.)
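(The final head-count check is trivial to write down, too. Again, the name and the numbers are hypothetical, just echoing the 33-bought/30-needed example above.)

# A sketch of the acceptance step: the fleet passes if enough
# thermometers survive the comparison test to cover the floor
# requirement. Numbers follow the example in the text.
def fleet_acceptable(n_purchased, oddballs, n_required):
    """True if the surviving thermometers cover the floor requirement."""
    return n_purchased - len(oddballs) >= n_required

print(fleet_acceptable(33, {3, 17, 29}, 30))  # True: exactly 30 survive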
Granted, some Registrar lead auditors would insist that at least one thermometer be tested and calibrated against that $750/day machine accurate to .001 degree. If plus or minus 3, or 5, or even 10 degrees is all you're after, document that somewhere "obvious" in your quality system so that when the Registrar's auditor squawks, you can point it out to him, and at that point he'll be unable to write an NCR against it.
David Kozenko