Pressure Transducer - Number of Calibration Test Points

Charles Wathen


Hi guys,
I have an engineer who does not understand why I use certain test points. The instrument in question has a pressure transducer with a range from zero to 1000 psi. When we perform a calibration, we use 10%, 30%, 50%, 70%, and 90% of full scale as our test points.

This engineer wants to know why we don't check it at 40 psi, since the points we chose do not cover the full scale (only 10-90%), and says that we did not check the instrument within its range. I tried to explain to him that a calibration is defined as a minimum of 3 points within the instrument's range, to prove that the instrument has linearity.

Is there some documentation that I can use to prove to this guy that what I'm doing is acceptable?
 
I'm not sure what documentation to use. However, I've been in the electronics business since 1977, and in metrology since the early 80's. My understanding is that the principles upon which A-to-D or D-to-A converters work are based upon three premises: the overall gain of the device, establishing a zero point, and a mid-point check to assure linearity. Therefore, the three checkpoints would be zero offset, gain, and linearity. Any additional checks are either redundant or may be required for specific applications. I've had customers (fictitious numbers for purposes of illustration) who always used a 0-100 PSI gage at 15 to 20 PSIG. They did not care about full-scale gain, but pushed the envelope on the low end, and therefore needed improved confidence at the low end of the range.
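The zero/gain/linearity idea above can be sketched in a few lines of Python. All of the numbers here (full scale, tolerance, readings) are made up for illustration, not taken from any real procedure:

```python
# Minimal sketch of a three-point check: zero offset, mid-scale (linearity),
# and full-scale (gain). Hypothetical numbers; adjust to your instrument.

FULL_SCALE = 1000.0   # psi, assumed transducer range
TOLERANCE = 4.0       # psi, assumed manufacturer's accuracy spec

def three_point_check(readings):
    """readings: dict mapping applied pressure (psi) -> indicated pressure (psi).
    Returns each point's error and whether it is within tolerance."""
    results = {}
    for applied, indicated in readings.items():
        error = indicated - applied
        results[applied] = (error, abs(error) <= TOLERANCE)
    return results

# Example run: zero, mid-scale, and full-scale checks
readings = {0.0: 0.5, 500.0: 501.2, 1000.0: 998.9}
for applied, (error, in_tol) in sorted(three_point_check(readings).items()):
    print(f"{applied:7.1f} psi: error {error:+.1f} psi, {'PASS' if in_tol else 'FAIL'}")
```

Any extra points beyond these three are, as noted, either redundancy or application-driven confidence at a specific part of the range.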

However, adding a 40%-of-range check when you already check at 10, 30, 50, 70, and 90 is redundant.

A couple of places you might check would be at NIST.GOV or ANSI. If there is a procedure or defined method for your type and range gage, that should be the final answer. If you have GIDEP access (Government Industry Data Exchange Program) there may also be resources there.

The other detail I note is that the engineer is asking for 40 PSI on a 0 to 1000 PSI gage. This is 4% of range (the very low end of the range). Is it rated for operation at 4% of range, and do you use it at 4% of range? Those are both important questions. The object of the calibration is to provide a defined level of confidence that the user's measurement is accurate to a defined tolerance. If the gage is not rated that low, the user needs to understand that. If it is, and the user has an application at 4% of range, you may want to consider testing it there. If it is rated that low, and the user's request is simply a matter of his or her general principles, then besides the above, you could explain that testing at zero and ten percent of range implies 4 percent of range is in tolerance in the same way that all of the other test points imply it is in tolerance at any intermediate points (for example, testing at 30 and 50 percent of range implies that 31, 32, 33, ..., 47, 48, and 49 percent of range are all in tolerance). You could as easily pick any other point along the instrument's range and make similar contentions.
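That intermediate-point argument can be shown numerically: for a linear device, the implied error anywhere between two in-tolerance test points is bounded by interpolating the two measured errors. The tolerance and error values below are invented for illustration:

```python
# Sketch of the "implied intermediate points" argument for a linear device.
# If both bracketing test points are within tolerance, linear interpolation
# of their errors bounds every point in between. Illustrative numbers only.

def implied_error(p, p_lo, err_lo, p_hi, err_hi):
    """Linearly interpolate the expected error at pressure p (psi)
    from the measured errors at the two bracketing test points."""
    frac = (p - p_lo) / (p_hi - p_lo)
    return err_lo + frac * (err_hi - err_lo)

TOLERANCE = 4.0  # psi, assumed tolerance on a 0-1000 psi transducer

# Invented measured errors at the 30% (300 psi) and 50% (500 psi) test points
err_300, err_500 = +1.5, -0.8

# Every intermediate point is implied in tolerance for a linear device
for p in range(300, 501, 10):
    assert abs(implied_error(p, 300.0, err_300, 500.0, err_500)) <= TOLERANCE
print("300-500 psi implied in tolerance")
```

The same reasoning applied to the 0 and 10% points is what covers the engineer's 40 psi (4% of range) question.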
 
Hi Jerry,
What brought all this into focus was his request to change my calibration procedure, which I declined to do. I told him that I can re-calibrate the instrument to specific points at his request, and we would label the instrument with a limited calibration label indicating the range. I also told him that the accuracy of the instrument will remain the same at ±4 psi, since that is the mfg's tolerance, and I'm not about to change that.

He's having a problem with my test points of 10%, 30%, 50%, 70%, and 90% of full scale, so I was hoping to find something written somewhere that I can point in his direction. He still wants it checked at my test points, but wants additional test points below mine, and wants to change the calibration procedure. The bottom line is he's not using the right tool (or in this case, instrument) for his testing purposes. We have others that have a maximum range of 100 psi and an accuracy of ±0.5 psi, but he still questions the test points.

The calibration procedure covers 5 different pressure models, and regardless of the model, each would still be calibrated at the same points based on the maximum scale of the instrument, i.e. 10%, 30%, 50%, 70%, and 90% of full scale. So if he were to use a 100 psi model, he wants to know why we don't check it at 40%. I told him that we can't check every single possible point, as it would take all month (hehe).
 
Charles,

I'm not a cal expert by any measure, but FWIW I have a few comments:

First, as Jerry said, where on the scale do they usually use it (~ 40 psi ??? 400 psi??? full range??? ) and is it rated for that/those point(s)?

Second, why not call the manufacturer and explain how you use the gage and ask how they recommend you should calibrate it based on that information. I would think the mfr. would be happy to advise you.
 
Hi Mike,
The mfg says to contact your internal calibration lab. :)

According to this Engineer, they run it around the 40 psi range. He has a difficult time understanding that he's not using the correct instrument for his job. Almost all our calibrations are based on the 10%, 30%, 50%, 70%, and 90% sample points.
 
Do you mind if I ask who this mfr. is, because this is an unacceptable answer to me. If they don't help customers any more than this, I wanna make sure I do not buy any of their products. If you don't want to post it publicly, you can send me a private message.
 
Well, they just say it's up to our company. It's a custom piece of equipment designed just for our company, which is why they don't want to get into telling us what we should calibrate it to. It's not an off-the-shelf piece of equipment like a micrometer.
 
Throw this at your engineer.

Pressure transducers use strain gauge technology to measure pressure. If he had any experience with strain gauges, he would know that below 10% of full scale (such as 4%), their accuracy falls way off (due to mechanical resistance, which is most prevalent in the low end), to the level that the better manufacturers don't even spec them in that range, or give a wider tolerance (unless they have given themselves a pretty wide tolerance to start with).

You check it at 10% because it makes sense; besides, I have NEVER seen a procedure check below that. But if you need proof, try NAVAIR NA 17-20MX-157. It is the Navy general procedure for pressure transducers, and it is the method that I use locally (I hate writing procedures). Granted, it is a bit more in-depth than your method, but it is nationally accepted. It ascends from 10-100% in 10% increments, then descends from 100% to 10% in 10% increments to prove that there are no hysteresis issues.
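That ascending/descending scheme can be sketched like this. This is only a rough sketch of the up-down idea, not the actual NAVAIR procedure, and the example readings are invented:

```python
# Sketch of an up-down test-point scheme: 10% to 100% of full scale in 10%
# steps, then back down, with hysteresis taken as the difference between the
# descending and ascending readings at the same applied pressure.
# Illustrative only; not the actual NAVAIR NA 17-20MX-157 procedure.

def up_down_points(full_scale):
    """Ascending then descending test points, in 10% steps of full scale."""
    up = [full_scale * pct / 100 for pct in range(10, 101, 10)]
    return up, list(reversed(up))

def hysteresis(up_readings, down_readings):
    """up_readings in ascending order, down_readings in descending order;
    returns the down-minus-up difference at each applied pressure."""
    return [d - u for u, d in zip(up_readings, reversed(down_readings))]

up, down = up_down_points(1000.0)
print(up[0], up[-1])    # first and last ascending points
print(hysteresis([100.1, 200.2], [200.6, 100.3]))  # invented two-point example
```

A large down-minus-up difference at any point would flag a hysteresis problem that a single ascending pass would never see.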

Ryan
 
Military procedures

GIDEP is one source, MIDAS is another. However, you need to be a member. If you do a search for either of these, you should be able to get your information.

*** RESTRICTED LINK REMOVED ***
 