Subject: Re: Calibration Issues for Small Firms /Scalies/Hellmann
Date: Wed, 3 Nov 1999 17:37:06 -0600
From: Moderator
From: JJH2000
Charley,
MEASUREMENT UNCERTAINTY is a rather obscure concept because it is precisely what we strive to eliminate, and assume has been eliminated, whenever a measurement is taken of an unknown process variable. No one wants to hear that the temperature of water boiling in a pot is sorta 100 degrees Celsius, "give-or-take" 1 degree, or maybe 2 degrees, or 3 degrees... and so on. Many variables determine and affect the temperature measurement: the elevation above sea level, the ambient air pressure around the pot, the accuracy of the temperature measuring device, its precision in giving the same indication every time under the same conditions, and so on. It's this "give-or-take" that is the uncertainty of the measurement. To make an analogy, a proficient archer can consistently hit a smaller area of the larger target. But even Robin Hood didn't split his arrow with every other arrow shot. He was, though, good at hitting the bull's eye every time, or so the legend goes. The bull's eye itself is not a point in space, but rather a very small area of the whole target.
To consistently hit a 1-inch-diameter circle at 30 yards would be the standard for the proficient archer. If Robin used his archery skills in hunting, for example, the results would be consistent: the deer would fall every time, and venison would be turning on the spit back at camp every evening. We can quantify and define the MEASUREMENT UNCERTAINTY of Robin's skill by documenting the size of the bull's eye.
Now the issue becomes how one quantifies and defines MEASUREMENT UNCERTAINTY to make it "known". That is something you must decide. What is the acceptable percent error of your instrumentation? It is determined by the needs of the process being measured.
What is an acceptable percentage of the total range for error? I maintain 0.25% of total range as my acceptable accuracy. In other words, the instrument I am calibrating must indicate the same value as the standard instruments (certified by NIST) within 0.25% of the range being measured. For an instrument ranged 0 to 100 degrees Celsius, it must be within 0.25 degrees of the reading of the standardized instrument, or else it fails. This standard has validity in the pharmaceutical, petrochemical, and chemical process control environments where I work. But it might not be appropriate for you. Why, for example, should you spend hundreds or thousands of dollars on a 0.25%-accuracy temperature instrument to cook your Thanksgiving turkey when a $3 thermometer from the drugstore would be perfectly adequate?
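That 0.25%-of-range pass/fail check is simple enough to sketch in a few lines of Python. This is only an illustration of the arithmetic; the function name, the default percentage, and the sample readings below are my own hypothetical examples, not anything standardized:

```python
def within_tolerance(reading, standard, range_min, range_max, pct=0.25):
    """Return True if the unit-under-test reading agrees with the
    certified standard reading within pct percent of the instrument's
    calibrated range (the "percent of total range" criterion)."""
    span = range_max - range_min
    tolerance = span * pct / 100.0  # 0.25% of a 100-degree span = 0.25 degrees
    return abs(reading - standard) <= tolerance

# 0-100 degree C instrument: must agree with the standard within 0.25 deg C
print(within_tolerance(50.20, 50.00, 0, 100))  # passes: error 0.20 <= 0.25
print(within_tolerance(50.30, 50.00, 0, 100))  # fails:  error 0.30 >  0.25
```

Note that because the tolerance is a percentage of the full range, not of the reading, a wider-ranged instrument is allowed a larger absolute error at any given point.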
Knowing the MEASUREMENT UNCERTAINTY means having a quantified and documented size of the bull's eye that works for you.
John