# What is the difference between Bias and Calibration?


#### sramasam

Bias Vs Calibration

Can someone explain to me the difference between bias and calibration?

If they are different, please let me know in what way.

Thanks.

#### Jerry Eldred

##### Forum Moderator
Super Moderator
They are related terms, but not the same. "Bias" is used in statistics to denote the difference between a given value (the process value, or the value being evaluated) and nominal, i.e. the amount of error. Calibration has a broader meaning: it refers to the comparison of an unknown measurand against a known one (a very rough definition). In this application, calibration is a verb and bias is a noun; calibration is used to determine and correct for bias.

I must add that, as one who has worked in calibration for more than 25 years, I had never once seen the term "bias" used in calibration training or documentation until I received training in statistics in recent years. So they are quite different.


#### sramasam

Bias Vs Calibration

Jerry,
I agree with you that they are related terms, but they often seem very close to me. I would like to understand the distinction better.

We measure the same part, or a master whose value is known, ten times repeatedly and take the average of those measurements for comparison with the known value. Bias is then the difference between the average of the 10 measurements and the known value.
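The bias calculation described above can be sketched as follows. All of the numbers here are invented for illustration, not taken from any real gauge study:

```python
# Estimate bias: average of repeated measurements minus the known reference value.
known_value = 25.000  # mm, certified value of the master (assumed for illustration)

# Ten repeated measurements of the same master (illustrative values)
measurements = [25.002, 24.998, 25.003, 25.001, 24.999,
                25.002, 25.000, 25.004, 24.997, 25.002]

average = sum(measurements) / len(measurements)
bias = average - known_value  # positive bias means the gauge reads high

print(f"average = {average:.4f} mm, bias = {bias:+.4f} mm")
```

With these numbers the average is 25.0008 mm, so the estimated bias is +0.0008 mm.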

While calibrating a gauge or instrument, for example a micrometer, Grade 0 slip gauges are used. What the technician actually does is measure the slips using the micrometer and record the readings. The difference between the slip value and the observed reading is reported as the gauge/instrument error. So in the calibration process we are also comparing a measured value against a known value (in this case, the slip). The observed difference is then compared against the ISO/DIN standards to see whether it is acceptable or not.
This is exactly where I am not able to differentiate the bias calculation from the calibration.
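The calibration check described above can be sketched like this: compare each micrometer reading against the certified slip value and test the error against an acceptance limit. The slip values, readings, and the limit below are all assumed for illustration; the actual limit comes from the applicable ISO/DIN specification:

```python
# Calibration check: compare micrometer readings against certified slip gauge
# values and flag whether each observed error is within an assumed limit.
slip_values = [2.5, 5.1, 7.7, 10.3, 12.9]            # mm, certified slip sizes (illustrative)
readings    = [2.501, 5.102, 7.699, 10.304, 12.898]  # mm, micrometer observations (illustrative)
limit = 0.004  # mm, assumed acceptance limit; check the applicable standard

results = []
for slip, reading in zip(slip_values, readings):
    error = reading - slip
    status = "PASS" if abs(error) <= limit else "FAIL"
    results.append(status)
    print(f"slip {slip:6.3f} mm: error {error:+.3f} mm -> {status}")
```

Note that each error here is a per-point gauge error; averaging repeated readings at one point and comparing to the slip value would be the bias estimate at that point.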

Please clarify.

Thanks.

#### Jerry Eldred

##### Forum Moderator
Super Moderator
I guess the simplest relationship between them I can think of is that BIAS is something that may be determined as a part of the calibration process.

From your latest posting, it sounds like it is the difference between the "bias calculation process" and the "calibration process" that is in question.

There are two uses of the term calibration:
a. Verification that a measuring instrument is still measuring within its specified tolerances, or if not, documenting/adjusting as needed to bring it back within specified tolerances. This implies doing so with measurement standards of known accuracy (traceable to national or international standards). This also implies a certification that the instrument can be expected to remain within tolerance for a given period of time, to a given percent confidence.

b. The adjustment process on a measuring instrument to optimize its measurement accuracy.

(I tend toward (a) above).

The "Bias Calculation Process" is a little different in that it is a subset of (a) above (I am not an expert in this specific term, so I am commenting based on my background in calibration), and it does not necessarily apply only to measuring instruments.

The "Bias Calculation Process" is specifically determining the amount of error between a unit under test and nominal. The unit under test may be a product. If you manufacture a screw with a given thread pitch, you may sample units from the production line, and measure their actual thread pitch in comparison with a calibrated thread pitch "master" (or other source of determining nominal). This would be a bias calculation, but not a calibration. The nominal may be a thread master. This thread master may be sent out to an accredited calibration lab and certified to a given value and a given uncertainty (this is a calibration). Then the calibrated thread master is used to determine bias of the product (not a calibration).
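The screw-thread example above can be sketched numerically. The calibration step assigns a certified value to the thread master (done by an accredited lab); the bias calculation then uses that master against sampled product. Every number below is invented for illustration:

```python
# Step 1 (calibration, done by an accredited lab): the thread master is
# certified to a given value with a stated uncertainty.
master_certified_pitch = 1.5000  # mm, value assigned to the master (assumed)

# Step 2 (bias calculation, done in-house): sample screws from the line and
# compare their measured pitch against the certified master value.
sampled_pitches = [1.5021, 1.4987, 1.5010, 1.5005, 1.4992]  # mm, illustrative

sample_mean = sum(sampled_pitches) / len(sampled_pitches)
bias = sample_mean - master_certified_pitch  # this is the bias of the product

print(f"sample mean = {sample_mean:.4f} mm, bias = {bias:+.4f} mm")
```

Only step 1 is a calibration; step 2 reuses the calibrated master to quantify product bias, which is Jerry's distinction.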

Calibration refers more to certification that an instrument or gauge meets specified tolerances.

I hope the above helps clarify this for you.


#### Atul Khandekar

From NIST Engineering Statistics Handbook:
http://www.itl.nist.gov/div898/handbook/mpc/section4/mpc45.htm

Definition of bias:
The terms 'bias' and 'systematic error' have the same meaning in this handbook. Bias is defined as the difference between the measurement result and its unknown 'true value'. It can often be estimated and/or eliminated by calibration to a reference standard.

Potential problem:
Calibration relates output to 'true value' in an ideal environment. However, it may not assure that the gauge reacts properly in its working environment. Temperature, humidity, operator, wear, and other factors can introduce bias into the measurements. There is no single method for dealing with this problem, but the gauge study is intended to uncover biases in the measurement process.

Sources of bias:
Sources of bias that are discussed in this Handbook include:
- Lack of gauge resolution
- Lack of linearity
- Drift
- Hysteresis
- Differences among gauges
- Differences among geometries
- Differences among operators
- Remedial actions and strategies
