How does Non-Linear UUT affect TAR (Test Accuracy Ratio) Calculation?

Eddie Faulkner

Hi,

How does Non-Linear UUT affect TAR calculation?

For example, I have a D/P air flow transmitter that performs square root extraction on the measured D/P and outputs a flow reading in CFM.

Parameters are:
INPUT: 0 to 1.3622 "WC
OUTPUT: 0 to 1760 CFM
Tolerance: 17.6 CFM (basically, 1% of range)

What minimum test standard is needed in order to test the instrument with a standard at least as good as the UUT?

In this specific example, how does the square root calculation affect the TAR calculation?
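To put numbers on the square-root effect, here is a rough sketch (the relation Q = Q_FS * sqrt(dP / dP_FS) and the 25/50/75/100% check points are assumptions for illustration) showing how the fixed 17.6 CFM tolerance translates into an equivalent D/P tolerance that shrinks toward the low end of the range:

```python
import math

# Assumed square-root extraction: Q = Q_FS * sqrt(dP / dP_FS)
Q_FS = 1760.0    # CFM full scale (from the transmitter spec above)
DP_FS = 1.3622   # "WC full scale
TOL_CFM = 17.6   # tolerance, 1% of output span

def dp_from_flow(q_cfm):
    """Differential pressure ("WC) corresponding to a flow reading."""
    return DP_FS * (q_cfm / Q_FS) ** 2

def equivalent_dp_tolerance(q_cfm):
    """Translate the fixed 17.6 CFM tolerance into an equivalent "WC
    tolerance at this point, using the local slope dQ/d(dP)."""
    slope = Q_FS / (2.0 * math.sqrt(DP_FS * dp_from_flow(q_cfm)))  # CFM per "WC
    return TOL_CFM / slope

for pct in (25, 50, 75, 100):
    q = Q_FS * pct / 100.0
    print(f"{pct:3d}% flow: dP = {dp_from_flow(q):.4f} \"WC, "
          f"equivalent tolerance = {equivalent_dp_tolerance(q):.4f} \"WC")
```

At full flow the 17.6 CFM corresponds to about 0.027 "WC, but at 25% flow it is only about 0.007 "WC, which is what makes the choice of pressure standard harder at the low end.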

Thanks,

Eddie
 

Hershal

Metrologist-Auditor
Trusted Information Resource
If you are using TAR instead of TUR, you will first need to define the UUT over its range, that is, an initial curve based on the manufacturer's specs. Then the calibration standards can be charted using specific data, with uncertainties, from the calibration of those standards.

Once you have that (for simplicity it may need to be done by range breakdown), the information can be put into Excel. That in turn gives you a visual, and the specific calculations for the selected points can easily be completed.

Now you are ready for the actual calibration of the UUT; afterward, replot the UUT's points. You can specify that the calibration provider give you the data for each point rather than just a pass/fail.

You will also need to establish a minimum acceptable TAR so you know whether you have met your requirements.
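As a minimal sketch of the point-by-point charting described above (the check points, the standard uncertainties, and the 4:1 minimum TAR are placeholder values for illustration; substitute the data from your standards' calibration certificates and your own acceptance requirement):

```python
# Hypothetical per-point TAR chart for the 0-1760 CFM transmitter.
TEST_POINTS_CFM = [440, 880, 1320, 1760]                  # example check points
UUT_TOL_CFM = 17.6                                        # 1% of the 1760 CFM span
STD_UNC_CFM = {440: 3.0, 880: 4.5, 1320: 6.0, 1760: 8.8}  # placeholder uncertainties
MIN_TAR = 4.0                                             # example acceptance criterion

print(f"{'Point (CFM)':>12} {'UUT tol':>8} {'Std unc':>8} {'TAR':>6}  OK?")
for q in TEST_POINTS_CFM:
    tar = UUT_TOL_CFM / STD_UNC_CFM[q]
    ok = "yes" if tar >= MIN_TAR else "no"
    print(f"{q:12.0f} {UUT_TOL_CFM:8.1f} {STD_UNC_CFM[q]:8.1f} {tar:6.2f}  {ok}")
```

The same columns dropped into Excel give the visual mentioned above.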

Hope this helps.
 

dgriffith

Quite Involved in Discussions
I've been thinking about this since it was posted. You could use a rotameter transfer standard calibrated to 0.5%, since most cannot be read by eye to better than 1%, so why waste money on a better calibration?
If the output units are correct, then the measured input differential pressure is likely correct. Since the result is in CFM, wouldn't you calibrate in CFM? Who cares about linearity? :mg:
Okay, we all do, but I don't see it as important here. It's already in the CFM result. When you compare CFM to CFM, you get CFM error. The rotameter is non-linear internally (volume vs. piston size), but the scale you read the piston against is linear.
Your TAR is the rotameter's 0.5% against the 17.6 CFM (1% of range) tolerance: the 1% UUT tolerance divided by the 0.5% standard uncertainty, for a 2:1 TAR. But that's not the best way to do it, as TAR is fast becoming an obsolete parameter of quality.
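In consistent units, a minimal check of that ratio (assuming the rotameter's 0.5% is taken against the same 1760 CFM span as the UUT's 1% tolerance):

```python
uut_tol = 0.01 * 1760    # 17.6 CFM, the UUT tolerance
std_unc = 0.005 * 1760   # 8.8 CFM, assumed 0.5% of the same span
print(f"TAR = {uut_tol / std_unc:.1f}:1")   # prints "TAR = 2.0:1"
```

If the 0.5% is of reading rather than of span, the ratio would vary with the point checked.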
You certainly could find a better instrument at more cost, or you could have the cal done at a vendor that uses a primary standard.
:2cents:
 