Accuracy & Precision

nealster75

Registered
Have a CAPA re: accuracy and precision requirements for EQ. Our SOP says:

Determine the accuracy & precision requirements of the equipment.

If accuracy and precision are not specified by the manufacturer, then default to a Test Uncertainty Ratio (TUR) of 4:1.
If a TUR previously determined through analysis will be used in lieu of the 4:1 TUR, then record the rationale in this section.
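
For scale, the 4:1 default works out like this (a minimal sketch, assuming the common ANSI/NCSL Z540.3-style definition TUR = UUT tolerance span / (2 × expanded uncertainty at k = 2); all values are hypothetical):

    # TUR sketch (hypothetical values).
    # Assumed definition: TUR = UUT tolerance span / (2 * U95),
    # where U95 is the expanded uncertainty of the cal process, k = 2.
    tol_low, tol_high = -0.5, 0.5   # UUT tolerance, e.g. +/-0.5 V (hypothetical)
    U95 = 0.1                       # expanded uncertainty (hypothetical)

    tur = (tol_high - tol_low) / (2 * U95)
    print(f"TUR = {tur:.1f}:1")     # -> TUR = 5.0:1, which meets the 4:1 default

The arithmetic itself is trivial; the burden is producing a defensible expanded uncertainty to put into it.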

This seems onerous for much of our equipment, and calculating a TUR does not seem feasible for technicians or others.

Is there a simple, logical, defensible process for assigning (or not assigning) accuracy and precision requirements that works for all EQ?

Miner

Forum Moderator
Leader
Admin
Leaving out the requirements for uncertainty and focusing strictly on accuracy and repeatability:
  • Accuracy should already be addressed through your calibration system, specifically the bias and linearity results from calibration.
  • Precision can be addressed through repeatability and reproducibility (gage R&R) studies or, in special cases, a Type 1 repeatability study; a minimal sketch of the latter follows below.
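
To make the Type 1 option concrete, here is a minimal sketch, assuming one common convention (Cg compares 20% of the tolerance span against 6 repeatability standard deviations; Cgk also penalizes bias); the readings, reference value, tolerance, and the 1.33 limit are all hypothetical/illustrative:

    import statistics

    # Type 1 repeatability study sketch (hypothetical data):
    # repeated measurements of one calibrated reference standard.
    readings = [10.02, 10.01, 10.03, 10.00, 10.02,
                10.01, 10.02, 10.03, 10.01, 10.02]
    reference = 10.00        # certified value of the standard (hypothetical)
    tol_span = 0.5           # tolerance span of the measurement task (hypothetical)

    bias = statistics.mean(readings) - reference
    s = statistics.stdev(readings)               # repeatability (sample std dev)

    cg = (0.2 * tol_span) / (6 * s)              # capability, spread only
    cgk = (0.1 * tol_span - abs(bias)) / (3 * s) # capability, spread plus bias

    print(f"bias = {bias:+.4f}, s = {s:.4f}")
    print(f"Cg = {cg:.2f}, Cgk = {cgk:.2f}")     # >= 1.33 is a common acceptance limit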

dwperron

Trusted Information Resource
First question: Who is supposed to be determining the accuracy and precision requirements of your equipment? That role should fall to the people who set the test requirements for your processes; they are the ones who know the requirements for the instrumentation being used.

"If accuracy and precision are not specified by the manufacturer, then default to a Test Uncertainty Ratio (TUR) of 4:1.
If a TUR previously determined through analysis will be used in lieu of the 4:1 TUR, then record the rationale in this section."

The TUR calculation uses the measurement uncertainty of your test equipment, and that uncertainty includes both accuracy and precision. In the ISO/IEC 17025 world, determining the measurement uncertainty for every calibration test point is a requirement, and if you are claiming a minimum 4:1 TUR, then that calculation is also required.
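
As an illustration of that last point, a minimal sketch of the calculation: combine a few standard uncertainty contributors by root-sum-square, expand at k = 2, and compare against the UUT tolerance. The contributor names and values are hypothetical placeholders for a real uncertainty budget:

    import math

    # Hypothetical uncertainty budget for one calibration test point.
    # Each entry is a standard uncertainty (k = 1) in the unit of measure.
    budget = {
        "reference standard": 0.020,  # from the standard's cal certificate
        "resolution":         0.003,  # UUT resolution / sqrt(12), rectangular
        "repeatability":      0.010,  # std dev of repeated readings
    }

    u_c = math.sqrt(sum(u**2 for u in budget.values()))  # combined uncertainty (RSS)
    U95 = 2 * u_c                                        # expanded uncertainty, k = 2

    tol_span = 0.5                                       # UUT tolerance span (hypothetical)
    tur = tol_span / (2 * U95)

    print(f"u_c = {u_c:.4f}, U95 = {U95:.4f}, TUR = {tur:.1f}:1")
    print("meets 4:1" if tur >= 4 else "does not meet 4:1")

Where the result falls short of 4:1, the SOP's documented-rationale clause would come into play.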