Definitions of Accuracy and Resolution in a Measurement System: What are the differences?


jhoniegudel

Dear all members of the Cove,

Can anyone explain Accuration and Resolution in a Measurement System?

Thanks,

jhonie
 

Stijloor

Leader
Super Moderator
Re: Accuration - Resolution: What are the differences?

Dear all members of the Cove,

Can anyone explain Accuration and Resolution in a Measurement System?

Thanks,

jhonie

Are you referring to this?

Accuracy: The closeness of a measurement to the actual value being measured.

Resolution: The smallest detectable increment that an instrument will measure/display to.

Stijloor.
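To make the two definitions above concrete, here is a small Python sketch (purely illustrative; the true value, bias, and resolution figures are invented). It simulates an instrument that displays to a fine increment yet still reads away from the true value because of a systematic bias: fine resolution, poor accuracy.

# Illustrative only: invented numbers, not from the thread.

def read_instrument(true_value, bias, resolution):
    """Simulate one reading: apply a systematic bias, then quantize
    the result to the instrument's smallest displayable increment."""
    raw = true_value + bias
    return round(raw / resolution) * resolution

true_length = 25.400  # mm, the actual value being measured
reading = read_instrument(true_length, bias=0.030, resolution=0.001)

print(f"Displayed reading: {reading:.3f} mm")                 # 25.430
print("Resolution       : 0.001 mm (smallest increment shown)")
print(f"Accuracy (error) : {reading - true_length:+.3f} mm")  # +0.030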
 

Wes Bucey

Prophet of Profit
Re: Accuration - Resolution: What are the differences?

Dear all members of the Cove,

Can anyone explain Accuration and Resolution in a Measurement System?

Thanks,

jhonie
Is there a possibility you mean the word "accuracy" rather than "accuration"? If so, click the Report button to explain, and whichever moderator is on duty at the time will help you change the thread title.

If it is "accuracy,"perhaps this might help:
From the NIST Handbook

NIST/SEMATECH e-Handbook of Statistical Methods, http://www.itl.nist.gov/div898/handbook/, date
Specific page for glossary:
http://www.itl.nist.gov/div898/handbook/glossary.htm

accuracy: In metrology, the total measurement variation, including not only precision (reproducibility), but also the systematic offset between the average of measured values and the true value.

resolution:

  1. In experimental design, especially for two-level designs, the length of the shortest word in the defining (confounding) relation. Geometrically, design resolution corresponds to one plus the strength.
  2. In metrology, the number of significant digits of a measurement system that can be meaningfully interpreted.
 

bobdoering

Stop X-bar/R Madness!!
Trusted Information Resource
Re: Accuracy and Resolution in a Measurement System: What are the differences?

My definitions (everybody has one...), with a few bonus definitions:

Precision: degree of “fineness” or resolution
Can be checked by Repeatability – the ability to duplicate the result

Accuracy: conformity to a standard
Checked against a standard

Resolution (or Discrimination):
-The amount of change from a reference value that an instrument can detect and faithfully indicate. This is also referred to as readability or resolution
-The measure of this ability is typically the value of the smallest graduation on the scale of the instrument. If the instrument has ‘coarse’ graduations, then a half-graduation can be used
-A general rule of thumb is that the measuring instrument's discrimination ought to be no coarser than one-tenth of the range to be measured (the 10-to-1 rule), or one-tenth of the process variation.

ndc (number of distinct categories): an evaluation of the statistically significant discrimination (see the sketch below)
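As a rough Python sketch of how the 10-to-1 check and ndc might be computed (the tolerance, resolution, and variation figures below are invented; the ndc formula is the commonly used AIAG MSA form, ndc = 1.41 x PV / GRR, truncated to an integer):

import math

# Invented figures, for illustration only.

# --- 10-to-1 rule of thumb ---
tolerance = 0.010    # total tolerance of the feature being measured
resolution = 0.001   # smallest graduation/increment of the gage
print("10:1 rule satisfied:", resolution <= tolerance / 10)   # True

# --- ndc (number of distinct categories) ---
part_variation = 0.0042   # PV, in standard-deviation units
gage_rr = 0.0011          # GRR, in standard-deviation units
ndc = math.floor(1.41 * part_variation / gage_rr)
print("ndc =", ndc)       # 5 -- a common guideline asks for 5 or more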

Metrology is the science of measurement: of mass, time, and length (the primary quantities)
Measurement is the “language” of science
Mensuration is the branch of applied geometry that is concerned with finding the length of lines, areas of surfaces, and volumes of solids from simple data of lines and angles

Types of Measurement Error
Tool Error – inherent instrument error
Observational Error – error from the eye
Manipulative Error – error from the hand
Bias – unconscious influence causing error

For more measurement lingo:
Measurement System Analysis from MoreSteam
 

dmadance

Re: Accuracy and Resolution in a Measurement System: What are the differences?

Resolution is the smallest change that an instrument can detect; accuracy is a more subjective term, referring to how close an instrument can be assumed to measure the true value of a quantity.

It is often the case that a given instrument has less accuracy than resolution. This is not an unusual situation. Say one measures the ratio between two voltages: one measurement gives 1.0 +/- 0.1 V and the other gives 3.0 +/- 0.2 V. An instrument could divide 1.0 by 3.0 and display the ratio as 0.33333, even though the propagated uncertainty justifies only about the first two of those digits. In other words, instruments can give more significant digits in a result than are justified by their accuracy. This is not necessarily a bad thing; sometimes an instrument can measure changes in a quantity more accurately than the quantity itself, for instance.
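A quick Python sketch of the ratio example above, propagating the stated uncertainties (for a quotient, relative uncertainties add in quadrature), shows how many of the displayed digits are actually justified:

import math

# From the example above: V1 = 1.0 +/- 0.1 V, V2 = 3.0 +/- 0.2 V
v1, u1 = 1.0, 0.1
v2, u2 = 3.0, 0.2

ratio = v1 / v2
# Relative uncertainties of a quotient combine in quadrature.
u_ratio = ratio * math.sqrt((u1 / v1) ** 2 + (u2 / v2) ** 2)

print(f"Displayed ratio       : {ratio:.5f}")        # 0.33333 (five digits shown)
print(f"Propagated uncertainty: +/- {u_ratio:.2f}")  # ~0.04 (only two digits justified)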

My website, LearningMeasure.com, has courses available that explain issues like this in more detail.
 

lego55

Hello-

Speaking of accuracy and resolution, we just had a customer audit and they gave us a finding for our calibration data. They were just audited by an AS9100 auditor who also gave them the same finding. Unfortunately, I don't understand enough about these figures to tell if it is truly a finding.

The customer's finding...
"Section 7.6 Control of Measuring Devices-Micrometer-Digital #125 in use for measurement with tolerance of .00005 is only certified to accuracy .0003."

In short, our lab's data sheet reads...
"Instrument: Digital Micrometer, Size: .00005 x 1.00000, Accuracy Tol.: .0003."

The lab sheet has more information, but it's mostly traceability, and non-measurement info.

Do you need more info from me to help?

Thank you!!
 

bobdoering

Stop X-bar/R Madness!!
Trusted Information Resource
The customer's finding...
"Section 7.6 Control of Measuring Devices-Micrometer-Digital #125 in use for measurement with tolerance of .00005 is only certified to accuracy .0003."

In short, our lab's data sheet reads...
"Instrument: Digital Micrometer, Size: .00005 x 1.00000, Accuracy Tol.: .0003."

Good call by the auditor. The auditor probably knew that a micrometer would not be a good gage for measuring to a .00005 tolerance, and found one supporting piece of evidence for the point. A Gage R&R should also show a very dubious ndc for using that gage at that tolerance.
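A back-of-the-envelope check of the figures in the finding, as a Python sketch (the 10:1 and 4:1 guidelines are common rules of thumb rather than anything quoted in the thread):

# Figures from the audit finding quoted above (inches).
tolerance = 0.00005   # tolerance of the measured feature
accuracy = 0.0003     # certified accuracy of the micrometer

print(f"Gage error is {accuracy / tolerance:.0f}x the tolerance")  # 6x
print("Meets 10:1 guideline:", accuracy <= tolerance / 10)         # False
print("Meets  4:1 guideline:", accuracy <= tolerance / 4)          # False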
 

Wes Bucey

Prophet of Profit
Good call by the auditor. The auditor probably knew that a micrometer would not be a good gage for measuring to a .00005 tolerance, and found one supporting piece of evidence for the point. A Gage R&R should also show a very dubious ndc for using that gage at that tolerance.
Wait a minute! Is the micrometer actually used in service to measure dimensions with the smaller tolerance? If the dimensions it is used to measure fall within the calibrated tolerance, where is the violation?
 

lego55

Wait a minute! Is the micrometer actually used in service to measure dimensions with the smaller tolerance? If the dimensions it is used to measure fall within the calibrated tolerance, where is the violation?

Yes, it is used in service for smaller dimensions only.
 