Definition of "Calibration" vs. Verification - ANSI/NCSL Z540

Date: Mon, 04 Oct 1999 11:38:19 -0500
From: Steven R Stahley - Cummins.com
To: Greg Gogates
Subject: Re: Definition of "Calibration"

Marcus, the definition of calibration has been an issue for several years. When ANSI/NCSL Z540 was written, a full day was dedicated to discussing the definition of calibration vs. verification. The difference comes from the traditional US definition, which has generally included adjustment, and the generally accepted definition in the VIM, in which calibration includes only the measurements or comparison to standards of known value. When the arguments were over, a caveat was placed in the Z540 definition of "verification" stating that in some instances the terms verification and calibration may be used interchangeably, implying that calibration is only measurement, not adjustment.

Generally, in the US many calibration service suppliers will offer pre-adjustment and post-adjustment data as part of the "calibration". In many instances no adjustment is performed, either due to the nature of the standard (e.g., a gage block is not adjusted but simply has a new value reported) or, in the case of most NMIs (National Measurement Institutes), because the policy is to simply measure and report a value rather than perform any type of adjustment. The reason for the NMIs' position is that adjustment is in many instances seen as a form of repair. These cases help to support the VIM definition of calibration as being just measurement, not adjustment.

The safest and most consistent policy you may wish to follow is, when quoting a calibration, to break the costing into pre-adjustment, post-adjustment, and adjustment fees so it is clear what the customer is getting for his money. This may also prompt the customer to question his other calibration service suppliers (your competitors) as to what their quotes include.

Steven Stahley
Cummins Engine Co.
 

Date: Wed, 6 Oct 1999 15:07:06 -0400
From: "James D. Jenkins" - quametec.com
To: Greg Gogates
Subject: RE: Definition of "Calibration"

This discussion has become very interesting and many good points have been made. To summarize we have:

1. The definition of calibration is vague and debatable.

2. Some items, such as gage blocks, standard resistors, etc., cannot be adjusted, while other items, such as some DMMs, have adjustment specifications provided by the manufacturer which, when utilized, can give the user of the device a high "In-Tolerance Probability".

3. In-tolerance adjustment of items being trended can corrupt trend analysis.

4. Not providing in-tolerance adjustments can increase occurrences of out-of-tolerance conditions.

5. Preventative maintenance in most cases can only be performed during the calibration without voiding the calibration.

All of these points are correct. I would say that "calibration" in its basic definition is comparing an unknown to a known, while calibration can also be assumed to include a requirement of "Preventative Maintenance", which may include out-of-tolerance adjusting and/or in-tolerance adjusting, cleaning, etc. The conclusion is that the type of service expected from the calibration laboratory should be spelled out by the client, as this is not a one-size-fits-all situation.

The client should identify on which items they wish to have in-tolerance adjustments performed. They should also specify the "Adjustment Policy" that they wish to have applied, as described in Fluke's "Calibration: Philosophy in Practice" book.

Fluke's book talks about two different types of equipment: those with adjustments and those without (reference standards vs. devices used for nominal-value assumption with error contained by specifications). They point out that many laboratories do not attempt adjustment unless a point is more than 70% of its specified tolerance (the Adjustment Policy), thus reducing the chance of an out-of-tolerance incident while preserving the advantages of minimal adjustment. Fluke also discusses the pros and cons of the adjustment policy, pointing out that the policy, when applied, should be model- or class-specific, such as for 3.5-digit multimeters. Routine maintenance is also discussed from a logical perspective.
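To make the decision rule concrete, here is a minimal Python sketch of a 70% adjustment policy; the function name, threshold argument, and example numbers are my own illustrations, not taken from Fluke's book:

def needs_adjustment(measured_error, tolerance, policy=0.70):
    # Adjust when the error at a test point consumes more than
    # `policy` (e.g. 70%) of the specified tolerance.
    return abs(measured_error) > policy * abs(tolerance)

# Example: a point with a +/-0.5% tolerance found reading 0.4% high
print(needs_adjustment(0.4, 0.5))   # True: adjust, even though still in tolerance

Under such a policy the point in the example would be adjusted even though it is still within specification, which is exactly the trade-off described above: fewer out-of-tolerance incidents at the cost of more frequent adjustment.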

Unfortunately, just identifying that the client is responsible for these decisions does not mean that the client is aware of or prepared for this. Generally, clients have historically expected their calibration supplier to somehow know what they need and provide it. This puts the supplier at a great disadvantage. If the supplier sets an adjustment policy on applicable classes or models of items and their competitor does not, it becomes difficult to compete when the customer is comparing prices while unaware of the differences between the two suppliers' services. Plus, in most cases there is no clear guidance on setting these policies from the manufacturer or any other authority.

With today's ISO9000 and QS9000 requirements of impact analysis on out-of-tolerance conditions for items where the specifications are used as error containment, an adjustment policy can be very beneficial to the company's quality. But how is the supplier to know if this is the case or if the company is trending the device? And when setting an "Adjustment Policy", what items should this include and what should the policy be? 50%? 70%? 80% of specification? If adjustment specifications (an Adjustment Policy) are necessary to ensure a high in-tolerance probability based on the design, the manufacturer is the best-suited source to identify this, but only a few manufacturers address this in their calibration procedures. They usually do this with the provision of "Adjustment Specifications".

We can see that the calibration suppliers do have a dilemma on their hands, and the only clear way to solve this is for the client to become educated on these topics. Calibration suppliers can play a role in this process by clearly identifying the options available to the client, thus putting the requirements clearly on the client's shoulders.

ISO Guide 25 laboratories should understand the importance of maintaining a high in-tolerance probability for items for which they have used the specifications as error containment of the measuring-parameter bias in computing their measurement uncertainty. I don't think they would be happy to find that their supplier left one of their standards on the verge of going out of tolerance. Then again, if they are trending the device, they would also be upset if the supplier performed an adjustment. But in most cases they know enough to specify to their supplier what they expect pertaining to adjustment of the device. Unfortunately, their clients do not!

As a measurement quality consultant, I have seen firsthand the need for calibration customer education in making these important decisions. That is why we at Quametec (https://measurementuncertainty.com) are working on a book to assist clients of calibration in making these decisions and in choosing a calibration supplier that is right for their needs.

I want to thank all the contributors to this discussion for giving us some additional ideas for topics to include in our book. If any of you have any additional ideas that you would like to see included in a calibration customer handbook, please contact me. We hope that through this book the relationships and understandings between calibration suppliers and their customers can be strengthened and the result will be an improvement in measurement quality.

A special thanks to Greg Gogates for hosting this discussion group, we recommend this group to all of our clients.

Sincerely,
James D. Jenkins

"Opportunities are usually disguised as hard work, so most people don't
recognize them."
 

Date: Wed, 6 Oct 1999 15:48:10 -0700
From: Larry
Subject: RE: Definition of "Calibration"

Greetings all-

Although there have been some disagreements about whether or not adjustments are to be considered in the price of a calibration, there seems to be a fairly consistent definition of calibration in the minds of most of the contributors to this list. However, you may not find this to be true if you move very far outside the circle of testing and calibration laboratories...

Once upon a time there was a vendor of ultrasonic flow meters who, upon completion of the design phase of a line of cross-correlation meters, sent a dozen or so of his pre-production units out to a qualified flow laboratory for "calibration". Three of these meters were installed on a straight section of 3-in PVC pipe, five were installed on a straight section of 12-in PVC pipe and four were installed on a 12-in PVC pipe with a 90° elbow immediately upstream.

Using its gravimetric calibration system, the laboratory measured flow at 10 test points for the meters installed on the 3-in pipe, 22 test points for the meters installed on the straight 12-in pipe and 38 test points for the meters installed on the 12-in pipe with the 90° upstream elbow. A total of 292 flow measurements were recorded. This was done while the manufacturer's personnel recorded the data from the flow meters via computer.

Since the laboratory simply operated its equipment and did not evaluate the data from the flow meters, it issued a test report indicating that the uncertainty of the flow measurements was ±0.25 %, representing its ability to state the mass of the collected flow in terms of volumetric flow rate.

When the manufacturer's personnel evaluated the flow meter data, they found such inconsistency and scatter, that most of it was considered unsuitable for analysis. Out of the 292 flow measurements, they could only find 5 readings that were suitable for analysis. When they fit the 5 remaining points to a linearized version of the flow coefficient equation, they had to eliminate one more data point to achieve the desired degree of linearity.

Due to the limitations of the flow laboratory equipment, all measurements were conducted at flow stream Reynolds numbers between 0.4 and 7 million, but the anticipated application of the meters was at approximately 25 million. In spite of the elimination of 288 calibration data points and the projection to higher flow regimes, the manufacturer claimed the uncertainty of his product was 0.25 % based on the results of the test.

To prove the meters performed as claimed, he installed some of them in nuclear power plants in Europe, Canada and the United States and showed that they basically agreed with the installed equipment consisting mostly of ASME venturi tube-based systems with uncertainties on the order of 0.5 to 1.0 %. Based on the success of this test, he concluded that calibration of subsequent production units was unnecessary and that those already installed never required re-calibration.

Over the eons the meters were operated in a 180 °F room which was contaminated by ionizing radiation. According to the vendor, the meters could be used to "calibrate" the same in-plant flow systems that were used to verify the performance of the pre-production units, and he expected the 0.25 % uncertainty to transfer to the in-plant systems.

And woe it came to be done.

Following "calibration" of the in-plant systems to this higher level of accuracy, it was possible to increase the reactor feed water flow rate boosting the efficiency of the plant, thereby producing more electrical power without otherwise affecting operational parameters. Increasing plant efficiency while remaining within the operational envelope resulted in increased revenue.

As the revenue increased and the money poured in, it so mesmerized management and quality assurance that they didn't notice that the air conditioning systems in their offices on the mezzanine level of the plant would no longer keep pace with what was otherwise an unseasonably cool October morning.

However, as the first few waves of the azure Pacific Ocean breached the security door on the third level near the west side of the building, and with the first and second floors of the plant sinking sublimely below grade on a driftwood-strewn beach, they realized they should have heeded the warnings of the metrology engineering department and checked on this guy's definition of calibration.

****************************************************
Larry E. Nielsen
So. Cal. Edison - Metrology
7300 Fenwick Lane
Westminster, CA 92683
(714) 895-0489; fax (714) 895-0686
****************************************************
 

Date: Fri, 8 Oct 1999 04:54:09 -0700 (PDT)
From: James
Subject: Re: Definition of "Calibration" RE5

So, by your definition one could not "calibrate" a gage block.

You all make good points, but one point that everyone will agree on is that calibration consists of a comparison test for the purposes of determining if an instrument is performing to a set of "specifications".

If the instrument is within those specs it is said to be calibrated. A better description might be "certified" to be operating within its specs.

In the competitive world we live in today, labs charge rates that reflect their technician time and offer lower rates for less technician time. If a unit is found to be within established tolerances and requires no adjustments, then less time is spent on the "calibration" of that instrument.

It follows then that an adjustment will take more time and therefore cost more.

A lab manager might consider this to be more cost effective than a flat charge situation where you actually pay for the technician time even when no adjustments were needed.

Myself, I think charging for the actual time involved in the "certification" process is fair and equitable.

For too long companies have buried their heads in the sand while spouting off about producing a quality product. It's about time they open their eyes to the cost of real quality.
 

Date: Tue, 12 Oct 1999 09:43:09 +0100
Subject: Re: Definition of "Calibration" RE15

>So, by your definition one could not "calibrate" a
>gage block.
>
>You all make good points, but one point that everyone
>will agree on is that calibration consists of a
>comparison test for the purposes of determining if an
>instrument is performing to a set of "specifications".

Hi,

I don't agree. The word calibration as used in metrology has been standardized in the ISO International Vocabulary of Basic and General Terms in Metrology (or VIM). While we might not agree with it, Guide 25 is an ISO document and it must be compliant with the VIM. The definition is:

Calibration: set of operations that establish, under specified conditions, the relationship between values of quantities indicated by a measuring instrument or measuring system, or values represented by a material measure or reference material, and the corresponding values realized by standards.

When you adjust an instrument, you are adjusting an instrument. Sounds very Zen, somehow. After adjustment, the final measurement to see how the output (e.g., of a multimeter) or material measure (e.g., a gage block) corresponds to the definition of the quantity (volts, meters, etc.) is a calibration. The calibration report will state how the output of the instrument corresponds to the real unit.

The lab may ALSO make a decision as to whether the instrument meets some specification. This is not a calibration. The VIM doesn't have a term for this yet, but I will call it a validation, which uses the results of the calibration and some decision rule to decide if the tolerance or spec is met.

bye

Ted Doiron
(301)975-3472
National Institute of Standards and Technology
Precision Engineering Division
(301)869-0822
Metrology Bldg., Rm. B113
Gaithersburg, MD 20899-8211
U.S. Department of Commerce Technology Administration
____________________________________________________________________

Todd's Two Political Principles:
1. No matter what they’re telling you, they’re not telling you the whole truth.
2. No matter what they’re talking about, they’re talking about money.
 

Date: Wed, 13 Oct 1999 13:02:59 -0400
Subject: RE: Uncertainty and Proficiency RE4

Mike, this should answer your question regarding the accounting for "In-Tolerance Probability" when using instrument specifications as containment for the Measuring Parameter Bias in computing a measurement uncertainty.

In making a Category A estimate and using it to construct confidence limits, we apply the following procedure taken from the GUM and elsewhere:

1. Take a sample of data representative of the population of interest.
2. Compute a sample standard deviation, u.
3. Assume an underlying distribution, e.g., normal.
4. Develop a coverage factor (e.g., a t-statistic) based on the degrees of freedom associated with the sample standard deviation and a desired level of confidence.
5. Multiply the sample standard deviation by the coverage factor to obtain
L = tu and use +/-L as confidence limits.
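
As an illustration of the five steps above (my own sketch, not from the GUM; the function name, sample readings, and use of numpy/scipy are assumptions), a Category A estimate might look like this in Python:

import numpy as np
from scipy import stats

def category_a_limits(sample, confidence=0.95):
    # Steps 2-5: sample standard deviation, t-statistic coverage factor
    # from the degrees of freedom, and the resulting +/-L confidence limits.
    sample = np.asarray(sample, dtype=float)
    u = sample.std(ddof=1)                        # step 2
    dof = len(sample) - 1
    t = stats.t.ppf(0.5 + confidence / 2.0, dof)  # step 4 (two-sided)
    L = t * u                                     # step 5
    return u, t, L

# Example: ten repeat readings of a nominally 10 V source
readings = [9.998, 10.001, 10.000, 9.999, 10.002, 10.000, 9.997, 10.001, 10.000, 9.999]
u, t, L = category_a_limits(readings)
print(u, t, L)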

In making a Category B estimate, we reverse the process. The procedure is

1. Take a set of confidence limits, e.g., parameter tolerance limits +/-L.
2. Estimate the confidence level, e.g., the in-tolerance probability.
3. Assume an underlying distribution, e.g., normal.
4. Compute a coverage factor, t, based on the confidence level.
5. Compute the standard deviation for the quantity of interest (e.g., parameter bias) by dividing the confidence limit by the coverage factor: u = L / t.
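
A matching sketch of the reverse (Category B) calculation, again purely illustrative and assuming a normal distribution with symmetric limits:

from scipy import stats

def category_b_std(limit, in_tol_probability=0.95):
    # Steps 3-5: assume a normal distribution, compute the coverage factor
    # from the in-tolerance probability, then u = L / t.
    t = stats.norm.ppf(0.5 + in_tol_probability / 2.0)
    return t, limit / t

# Example: +/-0.01 V tolerance limits with a 95% estimated in-tolerance probability
t, u = category_b_std(0.01, 0.95)
print(t, u)   # t is about 1.96, u is about 0.0051 V

Note that a higher assumed in-tolerance probability gives a larger coverage factor and therefore a smaller standard uncertainty, which is why the in-tolerance probability matters whenever specifications are used for error containment.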

Dr. Castrup of Integrated Sciences Group has refined the above procedure so that standard deviations can be estimated for non-normal populations and in cases where the confidence limits are asymmetric or even single-sided.

He has also developed a method for estimating the "delta u" variable in Equation G.3 of the GUM so that the degrees of freedom can be computed for Category B uncertainties. Both the refinement of the procedure and the delta u method are too complicated for elaboration here. Both methodologies have been built into uncertainty analysis software. For a description of the delta u methodology, as well as a freeware application that applies it, go to https://www.isgmax.com , click on 'Category B Degrees of Freedom Estimator,' and elect to either view the methodology or download the software.

I hope this makes sense to you, as I believe it is obvious that the "In-Tolerance Probability" must be considered when using specifications as error containment. Those of us in metrology can attest to the fact that not all devices have equal probabilities of being in-tolerance, for example an HP multimeter vs. a $39 Chinese multimeter with similar specifications. The applied calibration interval can also affect this probability. If I use a meter calibrated 5 years ago and you are using the same model meter calibrated last week, it should be obvious that we do not have the same measurement uncertainty. In-tolerance probability can easily be supported with information from historical calibrations. In fact, any calibration technician with experience in calibrating a particular device model line can provide an estimate. Give that technician the "Category B Degrees of Freedom Estimator" and you will have a statistical quantity to fit into your analysis. There is no need to "fudge" Category B values. Using my computer and specialized software, I can perform a technically correct analysis in less than half the time of those who take risky shortcuts and apply a lot of fudge.

Sincerely,
James
 

Date: Tue, 12 Oct 1999 13:05:10 -0400
Subject: RE: Definition of "Calibration" RE14

Greetings Tom,

Thank you for your interest!

As I pointed out before, unfortunately, the process of calibration is not one-size-fits-all. Pertaining to the reporting of calibration when using an "Adjustment Policy" to a tolerance such as 70% of specification, the example you gave is ideal. The options that I believe should be given to the client are:

1. Calibration without data with no in-tolerance adjustments.
2. Calibration without data with a defined in-tolerance "Adjustment Policy" (Such as 70%).
3. Calibration with data with no in-tolerance adjustments.
4. Calibration with data with a defined in-tolerance "Adjustment Policy" (Such as 70%).

Pertaining to the in-tolerance "Adjustment Policy" options, I would recommend that the calibration laboratory provide the data which triggered the "In-tolerance Adjustment Policy" (before and after). Personally, I would think that having an option for an applied in-tolerance "Adjustment Policy" and the reporting of the triggering data would be a great benefit worth paying extra for.

With the above options available to the client, the client would be able to select the service most applicable to the device and their needs. If the client has the option for calibration with or without data (QS9000 requires data on all calibrations), the client can determine the need for data based on the importance of the application of the device to their product quality. Also, the client can identify devices where the "Specifications" are used as a component of containment of uncertainty and where the use of an in-tolerance "Adjustment Policy" is recommended by the manufacturer of said device for giving a specified high "In-Tolerance Probability" for a defined calibration interval.

Regarding your case in point, consider a device calibrated using option 2 (above): as you can see, even without data, knowing that the device performed somewhere between 70 and 100% of the allowed deviation relative to the specification, the client knows more than they would have without the application of the "Adjustment Policy". Plus, one can assume that if the in-tolerance adjustments had not been performed, the device most likely would have been found "Out-of-Tolerance" on every second or third interval. Many years of personally calibrating my own DMMs has proven this to be true.

That leaves us with the question of whether the calibration interval is adequate when the applied "Adjustment Policy" mandates adjustment on every calibration of said device. Again, only the client knows the importance of this device to their product quality. But if the device is of some importance and the laboratory provides the "Adjustment Policy" triggering data (before and after), we have enough information to evaluate the adequacy of the applied interval. To maintain a high degree of "In-Tolerance Probability" with an applied 70% adjustment policy, I would want an interval that produced a measurement performance drift of less than 30% of specification. I might add that a device that consistently requires adjustment to maintain its accuracy to the "Adjustment Policy", and yet consistently does not perform outside of its "Specified Accuracy", has an ideally applied interval for maximum savings with relatively few cases of "Out-of-Tolerance" performance. But in cases where the device is of great importance to product quality, having an interval which produces drift of less than 30% of specification and having a calibration service which applies a 70% adjustment policy would yield a high and consistent in-tolerance probability. The reality in this situation might be that adjustments need only be performed on every second or third interval.
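
As a rough illustration of that reasoning (a sketch with invented numbers and names, not a standard method), the before-and-after data lets one compare the drift accumulated over one interval against the headroom a 70% policy leaves:

def interval_adequate(as_found, as_left_previous, tolerance, policy=0.70):
    # Drift since the last adjustment vs. the (1 - policy) * tolerance headroom.
    drift = abs(as_found - as_left_previous)
    return drift < (1.0 - policy) * abs(tolerance)

# Example: left at +0.05 last time, found at +0.28 now, specification of +/-1.0
print(interval_adequate(0.28, 0.05, 1.0))   # True: 0.23 of drift is under the 0.30 headroom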

With an applied "Adjustment Policy", the client is receiving more than just a watchdog service for their calibration dollars; they are receiving continued confidence in their measurement quality due to the "Preventative Maintenance" feature in the calibration process.

The conclusion is that even without data, the application of an in-tolerance "Adjustment Policy" not only provides "Preventative Maintenance", but gives the client some indication of "Out-of-Tolerance" risk relative to the applied calibration interval. Isn't that what this is all about, confidence in measurement quality? Shouldn't the benefit of having equipment calibrated be one of maintained quality? Without some "Preventative Maintenance" component in the calibration service, all the client has is after-the-fact awareness that their product quality was potentially corrupted sometime during the previous interval by the "Out-of-Tolerance" device.

Most customers of calibration believe that the money they are spending should give them improved measurement quality. Many assume that their calibration supplier is adjusting their equipment as needed to ensure continued reliability. But how can that be if all their supplier does is stand by and monitor, reporting and reacting only when an "Out-of-Tolerance" occurs? Add a good "Adjustment Policy", where applicable, and the maximum benefits of calibration can be experienced through the provision of "Preventative Maintenance" actions.

I should also point out that when computing measurement uncertainty using specifications as error containment, a higher "In-Tolerance Probability" produces a smaller component uncertainty. I hope everyone computing measurement uncertainty is factoring the "In-Tolerance Probability" into their equations as described in GUM Annex G, as this is a fundamental influence on the component "Confidence Level"!

Sincerely,
James D. Jenkins

-----Original Message-----
Subject: Definition of "Calibration" RE14
Date: Thu, 7 Oct 1999 10:58:29 -0500

Greetings James,

This whole issue brings a splintered requirement change to the act of adjustment during calibration. If the instrument receives an in-tolerance adjustment (because it consumed 70% of the tolerance) and this is now to be reported, would one report this event as:

Condition Received: In-tolerance
Condition Returned: In-tolerance
Adjustment Made: Yes
Tolerance Used: 70%

So now if the client has an annual calibration cycle and for the past 4 calibration events he sees an in-tolerance adjustment at a tolerance consumption of 70%, would he feel the instrument is stable, would he need to adjust his intervals, or would he just plug along until he sees an out-of-tolerance adjustment before he takes any action?

Would he have benefited from the reporting of the in-tolerance adjustment value? Or does the absence of a reported value give the client enough information?

This is a precedent-setting thread and will impact the industry if all users of metrology services become more detailed and educated.

Let's do some good here. Let's clean this up, make some good agreements on interpretation, and standardize.

Tom Smith

Earl of Metrology
Duke of NCTL
 