Charles Wathen
Number of Cal Test Points
Hi guys,
I have an Engineer who does not understand why I use certain test points. The instrument in question has a pressure transducer with a range of zero to 1000 psi. When we perform a calibration, we use 10%, 30%, 50%, 70%, and 90% of full scale as our test points.
This Engineer wants to know why we don't check it at 40 psi, since the points we chose don't cover the full scale (only 10-90%), and he says that we did not check the instrument within its range. I tried to explain to him that a calibration requires a minimum of 3 points within the instrument's range, to prove that the instrument has linearity.
Is there some documentation that I can use to prove to this guy that what I'm doing is acceptable?
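To make the setup concrete, here is a minimal sketch (not from the original post) of how the five test points fall on a 0-1000 psi span, and how one might quantify linearity from calibration readings with a least-squares line. The `readings` values are hypothetical as-found data invented purely for illustration; the acceptance limit and fit method are assumptions, not a statement of any particular standard.

```python
# Five test points at 10/30/50/70/90% of a 0-1000 psi full-scale span,
# as described in the post.
span = 1000.0  # full-scale range, psi
percents = [10, 30, 50, 70, 90]
test_points = [span * p / 100 for p in percents]
print(test_points)  # [100.0, 300.0, 500.0, 700.0, 900.0]

# Hypothetical as-found readings at each test point (illustrative only).
readings = [100.4, 299.8, 500.5, 699.7, 900.2]

# Fit a least-squares line through (applied, indicated) pairs; the largest
# residual from that line is one common measure of nonlinearity.
n = len(test_points)
mean_x = sum(test_points) / n
mean_y = sum(readings) / n
slope = sum((x - mean_x) * (y - mean_y)
            for x, y in zip(test_points, readings)) / \
        sum((x - mean_x) ** 2 for x in test_points)
intercept = mean_y - slope * mean_x
max_dev = max(abs(y - (slope * x + intercept))
              for x, y in zip(test_points, readings))
print(f"max deviation from best-fit line: {max_dev:.2f} psi")
```

With points spread across the span like this, any nonlinearity between them would show up as a large residual, which is the usual argument for spaced multi-point calibration rather than a single check at one value such as 40 psi.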