Dimensional Measurement: 10% Accuracy and 25% Uncertainty Standards


wintay

Dear all,

I would like to verify whether there are standards to comply with regarding the 10% accuracy and 25% uncertainty requirements for dimensional measuring equipment.
Please advise.
Million Thanks.
 

dgriffith

Quite Involved in Discussions
Yes, of course. However, (opens can of worms...) we would need to know what dimensional equipment is being calibrated. Generally, a gage block will suffice as one type of dimensional standard.

As for 10% and 25%, don't confuse the two concepts.
I think you are referring to the standard having a resolution 10 times finer than the unit under test: if the UUT reads to 0.001, the standard should read to 0.0001. It's a mostly outdated concept, but it is still used everywhere. By itself it does not guarantee sufficient accuracy.

The 25% refers to the standard, or the collective standards if several are used in a calibration, having an uncertainty that is 25% or less of the UUT specification. If it does, the standards are considered to have a minimal influence on the UUT measurements and can largely be ignored.
That's the superficial explanation.
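
To make the arithmetic concrete, here is a minimal sketch of the two checks described above. All of the numbers are hypothetical examples, not values taken from any standard.

```python
# Illustrative check of the two rules described above.
# All numbers are hypothetical example values, not from any standard.

uut_resolution = 0.001    # UUT reads to 0.001 (e.g., inches)
std_resolution = 0.0001   # standard reads to 0.0001

uut_tolerance = 0.005     # UUT accuracy specification (+/-)
std_uncertainty = 0.001   # expanded uncertainty of the standard(s) (+/-)

# "10:1" resolution rule of thumb: standard resolves 10x finer than the UUT
resolution_ratio = uut_resolution / std_resolution
print(f"Resolution ratio {resolution_ratio:.0f}:1 -> "
      f"{'meets' if resolution_ratio >= 10 else 'fails'} the 10:1 rule of thumb")

# "25%" rule: standard uncertainty is no more than 25% of the UUT specification
uncertainty_fraction = std_uncertainty / uut_tolerance
print(f"Standard uncertainty is {uncertainty_fraction:.0%} of the UUT tolerance -> "
      f"{'meets' if uncertainty_fraction <= 0.25 else 'fails'} the 25% (4:1) criterion")
```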

Fire away, all.
 

wintay

Thanks dgriffith.

To understand this further:
When you mentioned that the 10% accuracy (resolution) concept is already outdated, may I know what the current concept is? And which commercial or ISO standard can I refer to?

Thanks.
 

JAltmann

You won't find the 10% in any standard. It is a generally accepted rule that many claim is derived from a standard, because that is what they were told, but they can never produce the source. The last I read, it dates back to somewhere in the 1950s.

In today's world, the closest I have seen is the AIAG MSA manual with its 10% GR&R acceptance criterion.

As for the 10% being outdated, I think the poster was referring to ever-tightening tolerances, which make 10% difficult to achieve in some cases. I still target 10%, but ultimately apply common sense and a cost-benefit analysis to justify going higher.
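
For reference, the %GRR acceptance bands from the AIAG MSA manual are usually applied along these lines. This is only a minimal sketch with made-up study numbers; consult the manual itself for the actual study method.

```python
# Minimal sketch of the AIAG MSA %GRR acceptance bands mentioned above.
# grr and total_variation are hypothetical study results (same units/basis).

grr = 0.8               # gage repeatability & reproducibility spread
total_variation = 10.0  # total variation from the study

pct_grr = 100.0 * grr / total_variation

if pct_grr < 10:
    verdict = "generally acceptable (< 10%)"
elif pct_grr <= 30:
    verdict = "may be acceptable depending on application and cost (10-30%)"
else:
    verdict = "generally unacceptable (> 30%)"

print(f"%GRR = {pct_grr:.1f}% -> {verdict}")
```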
 

dwperron

Trusted Information Resource
Thanks for the question!
These are numbers I have always taken for granted, but you spurred me on to do a little research on the matter.

Regarding the 10:1 ratio, the sources I found refer to this as a "rule of thumb" used in control charting. I cannot find any document that cites it, even though, like many others, I have been taught that this is the "preferred" test accuracy ratio, and that if you cannot achieve 10:1 then 4:1 is acceptable.

So where does the 4:1 ratio come from? I read in NASA publications that it goes back to the 1950s, when the Navy discovered that it was having difficulty maintaining a 10:1 ratio and wondered if this was really necessary. Jerry Eagle of the Naval Ordnance Lab studied the matter and produced a statistical analysis showing that a 1% consumer risk on calibration results would be considered acceptable. This translated into about a 3:1 test accuracy ratio, and he then built in a "cushion" to recommend a more conservative 4:1 ratio. That became accepted as the standard.

As to standards requiring calibration accuracy ratios, I have found these:

ISO 10012-1:1992 in Section 4.3 Guidance states:
"The error attributable to the calibration should be as small as possible. In most areas of measurement it should be no more than one third and preferably one tenth of the permissible error of the confirmed equipment when in use." This is the only reference to a 10:1 ratio that I can find, and it is "preferred" rather than required. Also, the current version, ISO 10012:2003, drops these requirements.

MIL-STD-45662A, Section 5.2, requires a maximum 25% uncertainty-to-tolerance ratio (i.e., a minimum 4:1 ratio)

ANSI Z540-1, Section 10.2 b), requires that the "collective uncertainty of the measurement standard shall not exceed 25% of the acceptable tolerance" (4:1)

ANSI Z540.3 duplicates this requirement in section 5.3 b)

So it appears that the only current accuracy ratio requirement is the 25% (4:1 TUR).
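
As a quick illustration of how that 25% (4:1 TUR) check is typically applied in practice (example numbers only; the exact definition of TUR varies slightly between documents):

```python
# Quick TUR check against the 25% (4:1) criterion cited above.
# Numbers are illustrative only.

uut_tolerance = 0.0005    # UUT acceptable tolerance, +/- (e.g., inches)
std_uncertainty = 0.0001  # collective expanded uncertainty of the standards, +/-

tur = uut_tolerance / std_uncertainty
print(f"TUR = {tur:.1f}:1")
print("Meets the 4:1 (25%) requirement" if tur >= 4
      else "Does not meet the 4:1 (25%) requirement")
```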

Hope that this helps.
 

dgriffith

Quite Involved in Discussions
DW,
Nice research, thanks.
It can now be demonstrated that, for consumer risk, using a 2% probability of false accept (claimed in tolerance when actually out of tolerance), the corresponding test uncertainty ratio is about 4.6:1. This is for as-left, out-the-door MTE, not as-found.
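
For anyone who wants to play with that idea, here is a rough Monte Carlo sketch of probability of false accept versus TUR. The normal-distribution and population assumptions below are mine, chosen only for illustration, so the results will not reproduce the 4.6:1 figure exactly.

```python
# Rough Monte Carlo sketch: probability of false accept (PFA) vs. TUR.
# Assumptions (for illustration only): UUT errors and measurement errors are
# normally distributed, tolerance = +/-1, and the UUT population has an error
# standard deviation of tolerance/2. Formal PFA analyses use specific
# reliability targets and will give different numbers.

import random

def pfa(tur, tolerance=1.0, uut_sigma=0.5, trials=200_000):
    std_sigma = tolerance / (2 * tur)  # standard's uncertainty: ~2 sigma = tolerance/TUR
    false_accepts = 0
    for _ in range(trials):
        true_error = random.gauss(0.0, uut_sigma)              # true UUT error
        measured = true_error + random.gauss(0.0, std_sigma)   # what the calibration sees
        if abs(true_error) > tolerance and abs(measured) <= tolerance:
            false_accepts += 1                                 # out of tolerance, but passed
    return false_accepts / trials

for tur in (2, 3, 4, 4.6, 10):
    print(f"TUR {tur:>4}:1 -> PFA ~ {pfa(tur):.2%}")
```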
 

dsanabria

Quite Involved in Discussions


...and yes, top customers in the industry will require you to follow these principles, just as with the temperature principles, without scientific or research data to back them up.

Do it because it's been done before... so once in a while, challenge back ;)
 