Acceptable Tolerance - how to derive?

QCJS

Do what you can
Hi,

This is the info I saw on the calibration cert of a Dial Test Indicator.
Range = 0 to 0.2mm
Graduation = 0.002mm
Allowable error = 0.004mm
Measured value = 0.0010mm
Expanded uncertainty for measured value = 0.0007mm
Can I derive the equipment's Acceptable Tolerance with this info?

Please help. Thanks,
QCJS

Jen Kirley

Quality and Auditing Expert
Leader
Admin
I have most often seen tolerance set at plus or minus one graduation on the gage. When calibrating, I would check at different parts of the range, as error tends to be more often present at the low and/or high ends of the range.

QCJS

Do what you can
Hi Jen,

I ask this because my MSA tutor mentioned I need to calculate the acceptance tolerance after every calibration. I'm supposed to include this in my calibration master list. If not, it will not be complete.

So from your last statement, can I say that I can find my acceptable tolerance by:

(Max error - Min error) / 2?

From the above example, is it Acceptable tolerance = (allowable error - expanded uncertainty)/2 = (0.004 - 0.0007)/2 = ±0.00165mm (≈ ±0.0017mm)?
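
Worked through as a quick arithmetic sketch (only the numbers; whether the formula itself is right is exactly what I'm asking):

# Quick check of the arithmetic only, using the values from my cal cert.
# Whether this formula is the right one is what I am asking about.
allowable_error = 0.004        # mm, from the cal cert
expanded_uncertainty = 0.0007  # mm, from the cal cert

proposed = (allowable_error - expanded_uncertainty) / 2
print(f"proposed acceptable tolerance = +/-{proposed:.5f} mm")
# prints: proposed acceptable tolerance = +/-0.00165 mm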

Jen Kirley

Quality and Auditing Expert
Leader
Admin
Hello QCJS,

I don't have the MSA Manual in front of me, but I do have NIST Handbook 44 - 2015. It says (page 6):
Tolerances are performance requirements. They fix the limit of allowable error or departure from true performance or value.
In other words, the acceptable distance from zero error. If I understand your tutor's math correctly, the tolerance would expand if the instrument were found to have greater error during calibration. Why would we want that?

Can you tell us your indicator's make and model? They might make recommendations in a manual.

dwperron

Trusted Information Resource
Hi,

This is the info I saw on the calibration cert of a Dial Test Indicator.
Range = 0 to 0.2mm
Graduation = 0.002mm
Allowable error = 0.004mm
Measured value = 0.0010mm
Expanded uncertainty for measured value = 0.0007mm
Can I derive the equipment's Acceptable Tolerance with this info?

Please help. Thanks,
QCJS

This string has confused me for several days now.
I'm not sure what you mean by "Acceptable Tolerance", and what Acceptable Tolerance is used for in your application.

For instance, in your Dial Test example it would seem logical to me that the Allowable error is what the manufacturer has determined to be the Tolerance of the device. I am not sure how to make the leap to what is an "Acceptable" tolerance.

QCJS

Do what you can
This string has confused me for several days now.
I'm not sure what you mean by "Acceptable Tolerance", and what Acceptable Tolerance is used for in your application.

For instance, in your Dial Test example it would seem logical to me that the Allowable error is what the manufacturer has determined to be the Tolerance of the device. I am not sure how to make the leap to what is an "Acceptable" tolerance.

Hi dwperron,

I may have confused Allowable error with Acceptable tolerance too. It's because I couldn't find anything on the Internet or the forum that defines Acceptable tolerance, and my cal cert only shows me this info.

If I say that Acceptable tolerance is Allowable error, and "Allowable error is what the manufacturer has determined to be the Tolerance of the device", why is there a need to send my Dial Indicator for yearly calibration? There must be something on the cert that determines the Acceptable tolerance of a device. Maybe I need to compare a value (I don't know which) on the cert with the Allowable error; if that value is less than the Allowable error, then the Allowable error becomes my Acceptable tolerance? I'm not sure either. I'm totally clueless at this.

I hope someone can help me clear this doubt, because I'm the only one in charge of the equipment's calibration.

QCJS

Do what you can
Hi Jen,

Mine's a Mitutoyo Dial Test Indicator. Model: 513-405-10E, 0 to 0.2 mm range, 0.002 mm graduation

dwperron

Trusted Information Resource
Hi Jen,

Mine's a Mitutoyo Dial Test Indicator. Model: 513-405-10E, 0 to 0.2 mm range, 0.002 mm graduation

OK, now I'm really confused!
Here is what Mitutoyo gives for their Indication Accuracy Specifications:

[Attached image: Mitutoyo indication accuracy specification chart]

From this chart comes this interpretation:
The Indication Accuracy (Tolerance) from 0 to ±20 µm (0 to ±10 divisions) is ±1 division (±2 µm).
From 20 to 200 µm the Indication Accuracy is ±2 divisions (±4 µm).

On top of these linearity specifications there is also a specification for hysteresis (±3 µm, or ±1.5 divisions).
There is a repeatability specification of ±1 µm (±0.5 division).
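
As a rough sketch of how I read those bands (the helper function and its names are mine, not anything from Mitutoyo):

# Hypothetical helper reflecting my reading of the chart above.
def linearity_tolerance_um(position_um: float) -> float:
    """Linearity tolerance in µm at a point in the 0-200 µm travel."""
    if not 0 <= position_um <= 200:
        raise ValueError("position outside the 0-200 um travel")
    return 2.0 if position_um <= 20 else 4.0  # ±1 division vs ±2 divisions

print(linearity_tolerance_um(10))   # 2.0
print(linearity_tolerance_um(150))  # 4.0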

These are what you are calling the "Allowable tolerance". These will not change with calibration - these are what the manufacturer claims as the accuracy they built into the tool.
These are the tolerances you would be verifying with your calibrations. These are the accuracies you can expect when you make measurements.

You need to get your indicator calibrated at regular intervals because there is the chance, however small, that something could be damaged in the mechanism and the results you get may be out of tolerance.
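
And for the earlier question of what to compare on the cert: one common acceptance decision rule (standard guard-banding practice along the lines of ILAC-G8, not something printed on the cert itself) is to add the expanded uncertainty to the measured error before comparing against the allowable error:

# Sketch of a guard-banded acceptance check; the variable names are mine.
measured_error = 0.0010        # mm, from the cert
expanded_uncertainty = 0.0007  # mm, from the cert
allowable_error = 0.004        # mm, the manufacturer's spec

worst_case = abs(measured_error) + expanded_uncertainty  # 0.0017 mm
print("PASS" if worst_case <= allowable_error else "FAIL")  # PASS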

Ninja

Looking for Reality
Trusted Information Resource
Hope I don't derail a good discussion...but I'm getting confused by terminology here...
These are what you are calling the "Allowable tolerance". These will not change with calibration - these are what the manufacturer claims as the accuracy they built into the tool.
I can't see that being a good path...
I heartily agree with what Jen said...in the book quote: "Tolerances are performance requirements."

Allowable tolerance, assuming this means "Pass" criteria for calibrating the gage, should be based on what the parts being measured require.

If I used the gage in question above to measure something with a ±15 mm tolerance, why would I care at all what the gage manufacturer claims?
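
A quick ratio check makes the point (a sketch using the numbers above; the 10:1 rule of thumb is common gage-selection practice, not anything from the cert):

# Sketch of a simple accuracy-ratio check with the exaggerated numbers above.
part_tolerance = 15.0   # mm, the (deliberately loose) part requirement
gage_accuracy = 0.004   # mm, the indicator's allowable error

print(f"accuracy ratio = {part_tolerance / gage_accuracy:.0f}:1")
# prints: accuracy ratio = 3750:1 -- far beyond the usual 10:1 rule of thumb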

Am I just lost in the terminology?

dwperron

Trusted Information Resource
Am I just lost in the terminology?

I think we are all muddling through terminology on this, maybe with some language issues to boot.

I was basing my response on this statement:
"There must be something on the cert that determines Acceptable tolerance of a device."
Looking at the example of the calibration cert data in the original post, I was left totally confused as to what the actual tolerance was.
When he produced the model number of the indicator I was able to get the actual specs from Mitutoyo.
Even those were really fuzzy on how they were intended to be interpreted - not good for a spec sheet and very odd for Mitutoyo.

As for your use example, yes, you can certainly derate the manufacturer's specs, as long as you identify the indicator as having a Limited Calibration, or somehow assure that it was only used on that one measurement.