Dimensional Requirement: Precision vs. Accuracy of its Measurement


Paul A. Pelletier

#1
I recently submitted the following question to ASME. I am curious what the experience of others has been on this topic.

My question is:

What is the proper way to specify and evaluate a dimensional requirement in terms of its designed precision versus the uncertainty and accuracy of its measurement?

Background

ASME Y14.5M section 2.4 (Interpretation of Limits) states: "All limits are absolute. Dimensional limits, regardless of the number of decimal places, are used as if they were continued with zeros (ex. 12.2 means 12.20....0). To determine conformance with limits, the measured value is compared directly with the specified value and any deviation outside the specified limiting value signifies nonconformance with the limits".
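To put that strict reading in concrete terms, here is a minimal sketch (Python; the values and function name are mine for illustration, not anything from the standard):

```python
# The "absolute limits" reading of section 2.4: the measured value is
# compared directly to the limits, with no rounding of extra digits.

def conforms(measured: float, lower: float, upper: float) -> bool:
    """True only if the measured value lies within the absolute limits."""
    return lower <= measured <= upper

# A .125 +/- .005" requirement: limits .120" and .130",
# treated as if continued with zeros (.1200...0" and .1300...0").
print(conforms(0.1301, 0.120, 0.130))    # False - any deviation past a limit fails
print(conforms(0.130001, 0.120, 0.130))  # False as well, under the strict reading
```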

The as-specified precision of a dimensional requirement (or any value, for that matter) is generally represented by the number of decimal places to the right of the decimal point. For instance, a value of .125" is a less precise value than .1250". I cannot quote the source of this axiom. I can tell you that I was instructed this way as an engineering undergraduate, and that in the 20+ years since then, I have never encountered an engineer, machinist, inspector, draftsman, or designer who disagreed with it. If ASME does not concur with this axiom, I would like that confirmed, as well as the rationale or specification that ASME believes governs the conveyance of the level of precision in a dimensional requirement.

If this axiom is true, then section 2.4 of ASME Y14.5M is at the very least confusing (i.e. 12.2 does not mean 12.20.....0).

Related to the correct way to specify a level of precision for a dimensional requirement is the correct way to inspect the part for conformance to that requirement. A second axiom that I have applied throughout my professional experience is that one should inspect a technical requirement with an instrument that is more precise than the requirement as specified. This "incremental precision" is generally accepted to be one order of magnitude, the one caveat being that this would not be done if one order of magnitude exceeded the "state of the art" in measurement capability. This axiom was at one time documented in the military specifications (MIL-STD-45662, I believe). Regardless, it is certainly logical that if one wishes to evaluate a requirement specified to .000", it should be evaluated with an instrument capable of resolving .0000"; otherwise the measured value in the third decimal place is too uncertain to be considered reliable.
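As a sketch of that 10:1 convention (my own illustration, not language from any standard):

```python
# The "incremental precision" rule of thumb: resolve one order of
# magnitude finer than the requirement is specified, state of the
# art permitting.

def required_resolution(decimal_places: int, ratio: int = 10) -> float:
    """Instrument resolution needed for a value specified to N decimal places."""
    return 10.0 ** -decimal_places / ratio

# A requirement specified to .000" (three places) calls for an
# instrument resolving .0001" (i.e. reading to .0000"):
print(f"{required_resolution(3):.4f}")  # 0.0001
```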

The third element of my question is what one does with the value measured in the "incremental precision position". The generally accepted practice that I have encountered is to round the "incremental precision position" up or down to the level of precision specified by the designer (as represented by the number of decimal places in the dimensional requirement). This third piece of my question is the part that seems most at odds with ASME Y14.5M section 2.4. When the standard states "All limits are absolute. .....To determine conformance with limits, the measured value is compared directly with the specified value and any deviation outside the specified limiting value signifies nonconformance with the limits", one interpretation is that the "generally accepted practice" of rounding the "incremental precision position" in any direction is wrong. This phrase in the standard would imply that a dimensional requirement of .125 +/- .005" (specified value of .125", specified limiting value of .130") should be rejected for an as-measured value of .1301", or for that matter .130001". Is that true? Is a deviation of .0001", or .000001", what ASME means by "outside the specified limiting value"?
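To illustrate the two competing readings side by side (a sketch; the rounding variant encodes the "generally accepted practice" as I have encountered it):

```python
# Strict Y14.5M reading vs. round-then-compare, for .125 +/- .005"
# (upper limit .130") and a measurement carrying one extra digit.

def conforms_absolute(measured, lower, upper):
    """Compare the raw measured value directly to the limits."""
    return lower <= measured <= upper

def conforms_rounded(measured, lower, upper, places):
    """Round to the designer's specified precision, then compare."""
    return lower <= round(measured, places) <= upper

measured = 0.1301
print(conforms_absolute(measured, 0.120, 0.130))    # False - rejected outright
print(conforms_rounded(measured, 0.120, 0.130, 3))  # True - .1301 rounds to .130
```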

This is the fundamental problem that I am requesting assistance in understanding: can one legitimately round the least significant digit(s) of a measurement to the number of decimal places specified by the designer (which, for the sake of this question, is the defined level of precision for the requirement) without violating the specified limiting value?

As a final illustration, consider the following: a hole diameter is specified to be .121 +/- .001". The part is measured with an instrument that is capable of (and calibrated for) resolving .00001", with the result that the part measures .12210". Should the part be accepted? If the answer is no, then should it be accepted if the measured value is .12209"? Should it be accepted if the measured value is .12201"?
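Run through both policies, the three readings come out like this (a sketch with illustrative values from the example above):

```python
# The .121 +/- .001" hole (limits .120"/.122") measured to five places.
# Strict comparison rejects all three readings; rounding to the
# specified three places accepts all three.

for measured in (0.12210, 0.12209, 0.12201):
    strict = 0.120 <= measured <= 0.122
    rounded = 0.120 <= round(measured, 3) <= 0.122
    print(f"{measured:.5f}: strict={strict}, rounded={rounded}")
```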

If your answer is that an instrument capable of resolving five decimal places is the wrong instrument, then what is the correct interpretation if the part were inspected with an instrument capable of resolving only four decimal places, and the measured value is .1220" when in reality the actual value is .12204"? Is the instrument not rounding down? Does this not violate the intention of the standard?
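That four-place scenario can be modeled as the instrument quantizing the true value before anyone sees it (again an illustrative sketch):

```python
# A four-place instrument effectively rounds the true value to its
# resolution in hardware - the "rounding" happens before the reading.

def instrument_reading(true_value: float, decimal_places: int = 4) -> float:
    """Model an indicator that can only report a fixed number of places."""
    return round(true_value, decimal_places)

actual = 0.12204
reading = instrument_reading(actual)
print(f"{reading:.4f}")           # 0.1220 - the instrument rounded down
print(0.120 <= reading <= 0.122)  # True: the part passes as read
```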

I would greatly appreciate clarification on this issue. My company's intention is to unambiguously comply with internationally recognized standards, of which we consider ASME Y14.5M a critical element. Until an internal difference of opinion resulted in thorough research of the applicable standards (we are in the process of researching ANSI/IEEE 268 for additional guidance), we thought that we were doing so. After reviewing the standard, it is not clear whether we are or are not. I would submit that section 2.4 of ASME Y14.5M falls just short of defining the interpretation of limits in terms of the evaluation of parts for their conformance to those limits. This is the area on which we need clarification.
 

Jerry Eldred

Forum Moderator
Super Moderator
#2
I apologize that I am in the midst of some rather 'brain-frying' activities, so I only gave this posting a quick read.

My gut response is that it is important to differentiate between precision and accuracy. It is correct that precision is, in layman's terms, a minimum resolution. I don't have all of my standards handy, but precision is like 'discrimination': the instrument's ability to discriminate small incremental changes in the measurand.

From the text of the posting, it seemed that 'precision' was being blended into 'accuracy.' They are most definitely different and distinct from each other.

I don't have time to go into great detail. But absolutely, an 8 digit display has greater 'precision' (resolution, discrimination) than a 6 digit display. But that doesn't say anything about what difference there may be in accuracy. You could theoretically have an 8 digit resolution piece of measuring equipment with lower accuracy than a 6 digit piece of equipment.
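A toy numeric illustration of that distinction (all values invented for the example):

```python
# Resolution (precision/discrimination) and accuracy are independent:
# more displayed digits does not mean a truer reading.

true_value = 1.0000000          # inches
eight_digit_biased = 1.0000312  # finer display, but carries a bias
six_digit_accurate = 1.00000    # coarser display, reads essentially true

print(f"{abs(eight_digit_biased - true_value):.7f}")  # 0.0000312 error
print(f"{abs(six_digit_accurate - true_value):.7f}")  # 0.0000000 error
```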

As for the requirements for a level of precision in the standard, I am not specifically familiar with the mentioned standard. But as a 24-year metrologist, it makes good sense. This makes it simpler and less confusing when working with measurands. If a measurand is 1 inch and has a resolution of only 1 (versus, say, 1.000000 inches), then as a user of high-'accuracy' dimensional equipment, the added digits give me a better understanding of the actual measurand. If I am using a gage block, maintaining a standard resolution of, say, 0.1 microinch standardizes the reading resolution.

As to the last part of your question, I am going to ask some of my dimensional gurus (I am not a dimensional guru - my claim to fame is in electrical areas). If I can get some worthwhile input, I will post for you.

Initially, my gut reaction would be that you define the level of resolution necessary for a given measurement. If you define 10X from the item under test (e.g. 0.125" +/- 0.005", with the measurement standard read to 0.1250"), then read only those digits defined as being part of the measurement. What seems to be happening is that you have much better measurement standards than the units being tested or calibrated. In having that much better resolution, you have the added baggage of digits of resolution that are not essential to the measurement process. If you document how you will make the measurement, then doing it that way every time has more value than using the extra, unnecessary digits of resolution.
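A sketch of "read only the defined digits" (using decimal truncation here; whether to truncate or to round those extra digits is exactly the question Paul raises):

```python
from decimal import Decimal, ROUND_DOWN

# Keep only the documented decimal places and discard the rest of the
# instrument's resolution; Decimal avoids binary floating-point surprises.

def read_defined_digits(value: str, places: int) -> Decimal:
    """Truncate a reading to the number of places defined for the measurement."""
    quantum = Decimal(1).scaleb(-places)  # e.g. 0.0001 for four places
    return Decimal(value).quantize(quantum, rounding=ROUND_DOWN)

print(read_defined_digits("0.12213", 4))  # 0.1221 - fifth digit ignored
```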

The further thought that comes to mind is to bear in mind the required degree of precision of the unit being tested. If the unit being tested doesn't need that degree of precision, there is, it seems, no value in testing to the 'ultra-high' level of precision.

Hope I haven't muddied the waters too much on this.

------------------
 

Steven Truchon

#3
I agree with Jerry for the most part. Having spent the majority of my career in "Silicon Valley" metrology functions, I have the following comments, which are simplistic yet reflect actual practice.

Limits are limits. A high limit of .13 is no different than .130000. If the measurements were legitimately accurate enough, .129999 is acceptable and .130001 is out of tolerance. This is not arguable in the ideal sense of the numbers. What I have always tried to keep in mind is to keep the measurements in sync with the requirements AND expectations. Most of my travels have revealed usage of a 4X rule over the 10X rule for measurement precision. It's just a starting point, and it all depends on what's being made. Irrigation pumps have much more flexibility outside a tolerance limit than does a hard-drive read head, for instance. So quantities of zeros are not equal across the board in that sense. I know I am crossing into engineering and design intent, but it does become a factor.

A feature with a tolerance of X should be measured with an instrument that is accurate to 4-10 times the single-direction feature tolerance, depending on the functional intent. When one reads beyond the requirement in terms of resolution and accuracy, it becomes a moot point. It also becomes a matter of economics: why measure in millionths when your tolerance is in thousandths? I know that's a stretch, but I used it to make a point.
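Stated numerically (a sketch; the choice of ratio is the engineering judgment described above):

```python
# The 4:1 to 10:1 accuracy rule: instrument error should be no more
# than the single-direction tolerance divided by the chosen ratio.

def max_instrument_error(tolerance: float, ratio: float) -> float:
    """Largest acceptable instrument error for a tolerance and accuracy ratio."""
    return tolerance / ratio

print(max_instrument_error(0.001, 4))   # 0.00025 - relaxed 4X rule
print(max_instrument_error(0.001, 10))  # 0.0001  - stricter 10X rule
```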

I remember one company I worked at had a 10% out-of-tolerance rule. On a +/-.005 tolerance one could accept +/-.0055. I dunno. That was their rule, and it worked for them, and it was under a MIL-Q-9858A system.
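As I read that house rule, it amounts to widening each limit by 10% of the one-sided tolerance before comparing (a sketch of that interpretation, not of anything in MIL-Q-9858A):

```python
# The "10% out of tolerance" house rule: relax each limit by 10%
# of the single-direction tolerance before checking conformance.

def conforms_relaxed(measured, nominal, tol, relax=0.10):
    """Accept up to tol * (1 + relax) on either side of nominal."""
    expanded = tol * (1 + relax)
    return nominal - expanded <= measured <= nominal + expanded

# +/-.005 becomes +/-.0055 in effect:
print(conforms_relaxed(0.1304, 0.125, 0.005))  # True  - inside .1305
print(conforms_relaxed(0.1306, 0.125, 0.005))  # False - beyond .1305
```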

From the mfg. world I live in, .12 and .120000000 are identical, depending...

Whaddya makin??




------------------
Steven Truchon
Precision Resource - Florida Division
www.precisionresource.com
[email protected]
 