Thoughts on Traceability and Measurement Uncertainty (MU)

Marc

Hunkered Down for the Duration
Staff member
Admin
#1
Date: Wed, 17 Oct 2001 11:13:26 -0700
From: "Dr. Howard Castrup"
To: Greg Gogates
Subject: Traceability/Uncertainty

Mike, Steve, et al.,

In Traceability/Uncertainty RE08, Steve Ellison presents what I believe to be a cogent argument for a definition of traceability that is not tied to uncertainty propagation through a traceability chain. The following definition, quoted by Mehul Joshi in this thread, contains at least the elements of what should be preserved in discussing traceability:

Traceability: "The property of a result of measurement whereby it can be related to appropriate standards, generally international or national standards, through an unbroken chain of comparisons."

Note that this definition does not include any mention of an accompanying uncertainty propagation analysis. Admittedly, requiring that parameters be calibrated by references that have themselves been calibrated by higher-level references in an unbroken chain traceable to a national or international standard or set of standards does not, in itself, control measurement uncertainty. However, such a requirement still has value. For one thing, the analysis of uncertainty propagation through a test and calibration support hierarchy requires that the chain exist in the first place. Moreover, by leaving uncertainty propagation analysis out of the definition, we avoid several practical difficulties. These difficulties can be addressed by a separate uncertainty propagation analysis requirement.

In addition to traceability, other elements to be considered in uncertainty propagation analysis include the factors mentioned by Mike Ouellette in Traceability/Uncertainty RE03. These include location, environment, and time since calibration. The last factor is especially relevant to this discussion. As Mike and others have suggested or implied, the uncertainty of a calibrated parameter grows with time elapsed since calibration. With this in mind, it is important to note that the uncertainty statement that accompanies a calibrated parameter is nearly always just an estimate of the uncertainty of the calibration process -- not an estimate of the uncertainty in the reported parameter value at the time of use and under usage conditions.
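As a purely illustrative sketch of that distinction (the numbers and the simple root-sum-square growth model below are assumptions, not anything reported on a certificate):

```python
import math

def uncertainty_at_time_of_use(u_cal, u_drift_rate, t):
    """Combine the standard uncertainty stated on the calibration
    report with an assumed drift-related growth term, added in
    quadrature as independent contributions.

    u_cal        -- standard uncertainty stated at calibration
    u_drift_rate -- assumed uncertainty growth per unit time
                    (hypothetical; in practice taken from drift history)
    t            -- time elapsed since calibration, same time unit
    """
    return math.sqrt(u_cal**2 + (u_drift_rate * t)**2)

# Example: a parameter reported with u = 2 uV at calibration and an
# assumed growth of 1 uV per month, evaluated 6 months after calibration.
print(uncertainty_at_time_of_use(2e-6, 1e-6, 6))  # ~6.3e-6 V, about triple
```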

I mention this as a lead-in to a comment that was made during an uncertainty analysis panel session at the 2001 International Dimensional Workshop. Responding to a question from the audience, one of the panelists, presumably in an effort to be consistent with the VIM, stated that, if the uncertainty during use is not the same as the stated uncertainty on the report of calibration, then the traceability of the calibrated parameter is lost.

Experience in the field of calibration interval analysis, replete with instances of observed changes in parameter value over time and/or reductions in in-tolerance probability as a function of time, argues that the uncertainty of a calibrated parameter begins to grow immediately following calibration and continues to grow during use. Given this, one might suspect that the reason ISO/IEC 17025 discourages projections of uncertainty over time on cal certs is that acknowledging the existence of uncertainty growth throws the VIM definition of traceability into controversy.
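As one illustration of how such interval-analysis experience translates into numbers, the sketch below assumes the simple exponential reliability model with a hypothetical out-of-tolerance rate; it is only one of several models used in practice, not a prescription:

```python
import math

def interval_for_target_reliability(r_target, oot_rate):
    """Solve R(t) = exp(-lambda * t) = r_target for t, i.e. the longest
    calibration interval that keeps the end-of-period in-tolerance
    probability at or above the target. The exponential model is one
    of several used in interval analysis (Weibull, random walk, ...).

    r_target -- desired end-of-period in-tolerance probability (0..1)
    oot_rate -- observed out-of-tolerance rate per unit time (lambda)
    """
    return -math.log(r_target) / oot_rate

# Example: lambda = 0.02 per month from history, 85 % end-of-period target.
print(interval_for_target_reliability(0.85, 0.02))  # ~8.1 months
```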

To see where this supposition derives from, I return to a comment made by Mike Ouellette: "... if the reported measurement is not accompanied by a (meaningful) uncertainty in that measurement, then THERE IS NO TRACEABILITY as defined by the VIM." Since a report of the measurement process uncertainty of a calibrating entity is hardly a meaningful uncertainty estimate for a parameter under conditions of use and time since calibration, by recognizing uncertainty growth, we either invalidate the VIM definition or conclude that we never have traceability.

So, Steve's objection to letting definitions dictate debate is needed, as we are apparently dealing with an example of the tail wagging the dog. The easiest solution, it seems to me, is to keep the concept of an unbroken chain of calibrations leading to a national or international standard separate from, but necessary to, the concept of uncertainty propagation through the chain. The former provides "error" control by fostering agreement between standards and equipment parameters, while the latter provides a means of assessing the quality of this control and also the quality of measurements made under conditions of use.

Dr. Howard Castrup
President, Integrated Sciences Group
 

Jerry Eldred

Forum Moderator
Super Moderator
#2
Fascinating, well put, and still fundamental. I couldn't possibly add enough to that to spit at.

I am involved in evaluating some 'stuff' produced by a metrology person at a site across the globe. I reviewed a treatise the person wrote, replete with calculus formulas that gave me a headache, but which nevertheless missed a very basic point in calibration.

Calibration exists for the sole (fundamental) purpose of getting a handle on measurement uncertainty. In some ways it is nothing more, and nothing less. But in this person's lengthy treatise (which I'm not free to share here), not a word was mentioned about the measurement uncertainty of their standards. They went on at length to discuss how, according to readings taken with their "Standard", they observed short-term and long-term drift. And, borrowing from methodology used in MSAs, they defined a continuous monitoring method of adjusting calibration intervals. Nowhere was traceability or uncertainty mentioned.

Regarding the article above, I fully agree. Calibration and uncertainty management are a big club whose members take their little standard thingies and see how well all their measuring stuff lines up with them. Then, every once in a while, we send our standard thingies to a senior club, where they compare our standards with the nicer ones they have. Their standards are a lot nicer than ours. They in turn (and quite invisibly to us) send theirs on up the line to the Shriners Club (no offense to any Shriners here -- only an illustration of a highest echelon). They send 'em back, and we repeat the process. Meanwhile, these little rascals called Entropy sneak around with these bright colored T-shirts on called Drift and Uncertainty, and try to mess up all those things we knew were working really well last time we calibrated them.

And there are, it seems, an infinite number of those little rascals in T-shirts sneaking around messing up our stuff. So, we figure out with a little math how long it takes them to totally mess up our stuff. They begin to mess the stuff up as soon as we get it back. How long it takes for them to mess everything up so badly we can't stand to look at it any more depends on how badly behaved the little rascals at our company are.

Everything in the world that measures drifts all the time. Usually in predictable ways, often in ways we can't predict. It begins an infinitesimally small moment after we finish calibrating, and stops when we re-calibrate.

The best we can do is try to keep those little rascals locked up in their cages, and keep adjusting everything back at appropriate intervals.

Stepping out of my it's-been-a-long-week mode, I think it would be much more fair to say we can never know exactly what an instrument will measure. The purpose of understanding uncertainty is simply to try to quantify how well we understand the range around which an instrument will measure over a predictable period of time. Uncertainty analysis is nothing more than SPC applied to the calibration world with a few of the names changed.
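A minimal sketch of that "SPC with the names changed" idea, using made-up check-standard readings:

```python
import statistics

# Repeated readings of a 10 V check standard (hypothetical values, volts).
readings = [9.999998, 10.000001, 10.000003, 9.999999, 10.000002,
            10.000000, 10.000004, 9.999997, 10.000002, 10.000001]

mean = statistics.mean(readings)
s = statistics.stdev(readings)  # sample standard deviation

# SPC view: 3-sigma limits for charting the check standard over time
# (a simplification -- an individuals chart would normally estimate
# sigma from the average moving range).
ucl, lcl = mean + 3 * s, mean - 3 * s

# Uncertainty view: the same scatter relabelled as a Type A standard
# uncertainty of the mean.
u_type_a = s / len(readings) ** 0.5

print(f"mean = {mean:.6f} V, UCL = {ucl:.6f} V, LCL = {lcl:.6f} V")
print(f"Type A standard uncertainty of the mean = {u_type_a:.2e} V")
```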

Sorry folks, long day. Hope this is useful reading for some of us out there who get confused by uncertainty.
 

Unregistered

#3
Originally posted by Jerry Eldred
Everything in the world that measures drifts all the time. Usually in predictable ways, often in ways we can't predict. It begins an infinitesimally small moment after we finish calibrating, and stops when we re-calibrate.
Actually, the best voltage reference standard available to the masses drifts, but it does so in such a predictable fashion that on any given day I can tell you its output with an uncertainty of about 0.3 PPM, which isn't too shabby. The trick is that you NEVER adjust it; you only chart it. It seems the best standards are not necessarily those that stay closest to nominal, but rather those that you can predict with little uncertainty.
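A rough sketch of the chart-it-and-predict-it approach, with made-up drift data (a real program would use a proper prediction interval rather than just the residual scatter):

```python
import numpy as np

# Charted deviations of a 10 V reference from nominal, in microvolts,
# at roughly monthly intervals. The reference is never adjusted, only charted.
days   = np.array([0, 30, 61, 92, 120, 151, 181, 212])
dev_uv = np.array([1.2, 1.9, 2.7, 3.3, 4.1, 4.8, 5.6, 6.2])

# Fit a straight line to the drift history.
slope, intercept = np.polyfit(days, dev_uv, 1)

# Predict the deviation on a future day; use the residual scatter as a
# rough 1-sigma figure for how well the prediction can be trusted.
t_future = 240
predicted = slope * t_future + intercept
residual_sd = np.std(dev_uv - (slope * days + intercept), ddof=2)

print(f"predicted deviation at day {t_future}: {predicted:.2f} uV "
      f"(scatter ~{residual_sd:.2f} uV, i.e. ~{residual_sd / 10:.3f} ppm of 10 V)")
```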

Let go, Luke, use the drift...

Ryan
 

DICKIE

#4
I agree wholeheartedly with Dr. Castrup. I take everything he says about uncertainty to heart and believe him to be the preeminent expert in the field.

Greg
 

Marc

Hunkered Down for the Duration
Staff member
Admin
#5
> Meanwhile, these little rascals called Entropy sneak
> around with these bright colored T-shirts on called Drift
> and Uncertainty, and try to mess up all those things we
> knew were working really well last time we calibrated
> them.

Jerry, I really liked your description, and the part about the "...rascals..." A good way to present it. I pasted this thread from the ListServe because of the confusion about uncertainty and how it plays a part in the whole chain. I hadn't thought of it as Entropy.

It's interesting how all the factors add up in some critical situations. I remember years ago at Cincinnati Electronics, the cal lab had a 'special' section with certain standards like resistors that were kept immersed in liquid at a certain temperature, etc. I didn't work in the area, but was in there a few times during some projects. I really didn't pull everything together in my mind until the last 4 or 5 years. QS-9000's Measurement Systems Analysis brought together the 'last mile' for me, so to speak.

Anyway, one last snippet:

************************
Date: Thu, 18 Oct 2001 10:26:29 EDT
From: CoachIngal
Subject: Re: Traceability/Uncertainty

VIM definition of traceability:

"property of the result of a measurement or the value of standards whereby it can be related to stated references, usually national or international standards, through an unbroken chain of comparisons all having stated uncertainties."

- Paul Ingallinera
L-A-B Assessor

*********************************************

Date: Thu, 18 Oct 2001 08:45:14 -0400
From: Philip Stein
To: Greg Gogates
Subject: Re: Traceability/Uncertainty

Howard Castrup wrote, below, that

>"it is important to note that the uncertainty statement that
>accompanies a calibrated parameter is nearly always just an estimate of the
>uncertainty of the calibration process -- not an estimate of the uncertainty
>in the reported parameter value at the time of use and under usage
conditions."

But in fact many uncertainty budgets I have assessed DO estimate uncertainty growth with time. The best of this breed use appropriate statistical analyses of historical data to do so. Fluke, for example, conducts a program whereby voltage reference standards are calibrated and the report produced includes the slope of the drift. It is therefore possible for the user of the standard to estimate the actual voltage at any time in the future during the calibration interval. This is still an estimate, of course; the actual voltage is not known, but it is being predicted with a narrower confidence interval than would be possible without this program. This drift is included as part of an overall budget that encompasses other major influence quantities. For the most picky work, the drift is seen as a 'bias' and corrected for as of the time of use of the standard. The uncertainty in this 'bias' falls to the budget. Most practitioners, however, just include the maximum predicted drift for the entire cal interval as a term in their budget. Either approach is acceptable, although many thinkers, including the GUM, frown on moving uncorrected 'bias' into the uncertainty.
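A minimal sketch of the two approaches, with entirely hypothetical report values (not any vendor's actual format or data):

```python
import math

# Hypothetical report values for a 10 V reference:
reported_value = 10.000012   # volts, as reported at calibration
u_cal          = 2.0e-6      # standard uncertainty of that calibration, V
drift_slope    = 0.5e-6      # reported drift slope, V per month (hypothetical)
u_slope        = 0.1e-6      # standard uncertainty of the slope, V per month
interval       = 12          # calibration interval, months

def corrected_value_and_u(t_months):
    """Approach 1: correct the reported value for the predicted drift at
    the time of use; the uncertainty of that prediction goes in the budget."""
    value = reported_value + drift_slope * t_months
    u = math.sqrt(u_cal**2 + (u_slope * t_months)**2)
    return value, u

def uncorrected_drift_term():
    """Approach 2: leave the value uncorrected and carry the maximum
    predicted drift over the whole interval as a rectangular term."""
    max_drift = drift_slope * interval
    return max_drift / math.sqrt(3)  # standard uncertainty of a rectangular distribution

value, u = corrected_value_and_u(6)
print(f"corrected value at 6 months: {value:.8f} V, u = {u:.2e} V")
print(f"uncorrected full-interval drift term: {uncorrected_drift_term():.2e} V")
```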

In my experience, most budget writers include specific influence quantities and ignore or are unaware of others. Change of the standard between external calibrations is commonly (but not universally) included.

So what about the traceability question that is being discussed here? The definition

> "Traceability: "The property of a result of measurement whereby it can be
> related to appropriate standards, generally international or national
> standards, through an unbroken chain of comparisons."

is one we lived with, more or less, for years. All we are doing today is highlighting the fact that the word 'related' in the definition must be understood as requiring a certain level of quantitation. Any relaxation of the current definition of traceability must still require 'related' to be a quantitative statement if traceability is to mean anything at all. Alternative methods of quantifying 'related' may be acceptable to some, but since we already have an accepted world-wide standard for accomplishing such a task, namely the GUM, there should be considerable justification for doing it a different way.

Phil

Philip Stein
Fellow, ASQ and member of its board of directors
A2LA Lead Assessor
Past Chair, ASQ Measurement Quality Division
 

Ruebenn

#6
Re: Thoughts on Traceability and Uncertainty

[Quotes Dr. Howard Castrup's message from post #1 above in full.]
Dear Sir,

Thanks for the help, truly appreciate it.
Advanced Christmas greetings.
I suppose others have personally asked you for similar advice before, and I would truly be obliged if you could let me know, in brief point form, what is required of an approved signatory of an RF/microwave lab under the 17025 scheme.
I have come to know that the technical requirements are more important than the quality aspects.
I know we have to know the standards (both operation and service aspects) by heart, which is what I am trying so hard to do now.
And I also know that we have to know something about measurement uncertainty (BMC) and all?
Do you have a sample of such a calculation, using an example such as a measuring receiver HP 8902S calibrating a signal generator like an HP 8648C?
I am still unclear on the details -- accuracy, resolution and so on.
Appreciate the reply.

Rgds
Ruben
 