Calibration to a Known Standard - Theory 101

Marc

Some interesting thoughts here:

------snippo------

From: [email protected] (Doug Pfrang)
Subject: Re: Calibration definition/.../Pfrang/Gazley/Pfrang

>From: "John Gazley"<[email protected]>
>Subject: Re: Cal definition/.. /Pfrang/Gazley/Pfrang/Gazley
>
> Pfrang writes
> >..... I call my supplier and tell him I want a particular part to be
> >5.2 inches long. He sends me a box of parts, which I measure with my
> >calipers; my calipers tell me the parts are 5.2 inches long. No harm,
> >no foul....... the parts meet specifications.
>
>- - - - - - - -
> The idea that it is acceptable to approve parts because a pair of
> uncalibrated gages says they're 5.2 and that's what the supplier was
> told to send is ludicrous. What if both of your calipers are off?
> There is no way this would fly in any audit I have ever been involved
> with.
>
> Have you ever worked at a business that utilized these elaborately
> thought-out scenarios? Because if you have, I would like to know your
> reason for not just owning a set of certified gage blocks like any
> other ISO company that uses calipers.
>
> As to your prototype scenario: yes, in prototyping, transfer
> measurements from model to first production piece(s) are common. Please
> don't expect me to believe that this would be acceptable once actual
> production of the product was in full swing.
>
> I too could devise hundreds of scenarios that would let me operate
> outside the standard. But my warning to others is: save yourself the
> hassle come audit time. If there is an industry-standard way of doing
> something, do everything you can to do it that way. Most auditors
> come from a manufacturing background and, whether they're supposed to
> or not, they bring some preconceptions into their audit. Save the
> well-thought-out scenarios for the areas in which you absolutely must
> operate contrary to industry-standard practices.
>
> When implementing ISO 9000, take note of the participants who ask "is
> that EXACTLY (strong emphasis on exactly) what ISO says we have to do?"
> Those are going to be the people you need to work with most.
>
>-John Gazley


Well, I agree there are times when it is easier to save yourself the hassle at audit time to just have everything calibrated by an outside test house, but that is rarely the most cost-effective thing to do. Moreover, just because a certain practice saves hassles at audit time does not prove that ISO9000 requires it; it merely proves that you'd rather tell your employer to spend his money on outside calibration than spend your time defending his existing practices, even if those practices are valid. Not every client, or every situation, requires calibrating every measurement device to an external (e.g., national or international) reference standard.

The examples I gave are valid. The situations I described do not require the calipers to be calibrated to an external reference standard, because they were merely being used to confirm that the dimensions of the ordered parts are the same as the dimensions of the part taken off the prototype. Not only does this not require the calipers to be calibrated to an external reference standard, but it does not even require the calipers to have a numerical output. A simple pair of woodworking calipers, which have no numbers on them at all and therefore could not even be calibrated, could have done the same job.

What if both my calipers and my supplier's calipers were off? Well, in some situations that could be a problem, but it doesn't matter in my examples, because in my examples the goal is simply to make sure the purchased parts are the same dimensions as the prototype. For the same reason stated above and in my previous posting, the calipers do not necessarily need to be calibrated to an external standard to accomplish that task. They are just being used to make a relative comparison, not an absolute measurement. And, yes, it works just as well for production "in full swing" as it does for the initial run.

As for why I would not need to own a set of certified gage blocks, the examples I described don't require them either. As long as I can show that my calipers read the same today as they did the day I used them to determine my specifications, then that's all I need. I couldn't care less if they match some external standard, because I am not using them to make absolute measurements -- I am only using them to make relative measurements (i.e., today's measurement on my parts relative to the measurement I made previously on my prototype). I might still need to calibrate them using a CONTROLLED standard -- to compensate for drift that might occur over time -- but it does not matter if that controlled standard is internal to my company or an international reference standard. I could perform the calibration using a gage block I make myself; it does not need to be a set of certified gage blocks.
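For readers who find it easier to see this as a procedure, here is a minimal Python sketch of the relative-comparison workflow described above: confirm the calipers still read a controlled in-house block the same as they did at setup (to catch drift), then accept parts by comparison to the reading originally taken off the prototype. Every name, value, and tolerance below is a hypothetical illustration, not something taken from the posts.

```python
# A minimal sketch (invented names, values, and tolerances) of the relative-comparison
# workflow: check the calipers against a controlled internal standard, then compare
# parts against the reading originally taken off the prototype.

REFERENCE_BLOCK_READING_AT_SETUP = 2.000   # caliper reading on the in-house block the day the spec was set
PROTOTYPE_READING_AT_SETUP = 5.200         # caliper reading taken off the prototype that same day
DRIFT_LIMIT = 0.002                        # allowable change in the block reading before re-setting the calipers
MATCH_TOLERANCE = 0.005                    # allowable part-to-prototype difference

def calipers_still_consistent(block_reading_today: float) -> bool:
    """Has the caliper's reading of the controlled internal standard drifted?"""
    return abs(block_reading_today - REFERENCE_BLOCK_READING_AT_SETUP) <= DRIFT_LIMIT

def part_matches_prototype(part_reading_today: float) -> bool:
    """Relative comparison: does today's part read the same as the prototype did?"""
    return abs(part_reading_today - PROTOTYPE_READING_AT_SETUP) <= MATCH_TOLERANCE

if calipers_still_consistent(block_reading_today=2.001):
    print("accept" if part_matches_prototype(part_reading_today=5.198) else "reject")
else:
    print("re-set the calipers against the controlled internal standard before measuring")
```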

As for the suggestion that my examples are "outside the Standard," my previous posting quoted the language from the Standard that supports my examples. I have not suggested that anyone do anything "outside" the Standard; I have suggested that people understand what it means for something to be "calibrated," and work out a method of calibration that is valid, practical, and still compliant with the Standard.

As for my examples being "elaborately thought out scenarios," I can't give that objection much merit because it could be applied to ANY example someone might use to try to illustrate a point, regardless of whether the example is valid or not. (I.e., even if I accept your assertion that my examples are "elaborately thought out," that does not mean they are invalid; it just means they are hypothetical.) Obviously, I tried to select examples that I thought would best illustrate the key principles of what it means for a device to be calibrated -- to either an internal standard or an external one. I hoped that my examples would be clear enough that readers would be able to apply the same principles to answer their own calibration questions. If you think my examples are invalid, then explain why and I will clarify; but to discount them merely because they are "elaborately thought out" is insufficient to demonstrate they are wrong. Most other respondents who disagreed with my posting gave a similar response -- they stated that my examples are wrong in their "opinion," but they failed to support their opinions with valid reasoning.

I don't understand why so many people have such a strong belief that calibration always requires reference to an external standard. At the risk of being accused of using another "elaborately thought out scenario," consider the well-known example of soldiers synchronizing their watches. Five guys sitting in a foxhole might not need to coordinate their actions with anyone but each other. If so, all they need is for their watches to all read the same. Thus, they pick one watch in the group (i.e., their own internal reference standard), and they all set (i.e., calibrate) their watch to match that one. It doesn't matter if the watch they pick is reading the absolute correct time (i.e., if it matches some international reference standard), because they are only using their watches to coordinate their actions with each other. If, sitting in that foxhole, you tried to tell them the mission could not proceed until everyone calibrated their watches to an international reference standard, they would probably laugh in your face. Of course, if they needed to coordinate their actions with some other group of soldiers in a completely separate location, then they might use an external reference standard (e.g., Greenwich Mean Time) to synchronize all the watches, if that were the easiest way to get everyone in synch, but even that is not the only way to do it. (A reference standard which is internal to the army, but unknown to the rest of the world, would also work just fine.)

The situation in your factory might be similar. Five guys sitting in a production cell might not need to calibrate their test equipment to any external reference standard. If they only use a particular device for purely internal comparisons, then they don't necessarily need to calibrate it to an external reference standard. Instead, they can calibrate their measurement devices to their own internal reference standard, and make perfectly good product and be completely within the Standard. In fact, this method is commonly used for validating test fixtures.

On the other hand, if you use a particular gauge for comparison to other measurements that are taken OUTSIDE your facility by OTHER gauges, then you probably do need to calibrate that gauge to an external (e.g., national or international) reference standard. (Of course, if the gauge outside your facility (vendor gauge, customer gauge, etc.) is not calibrated to an external reference standard, then it does you little good to calibrate your gauge to an external standard, because that still won't guarantee the readings of the two gauges will match.)

As for who "people...need to work with most," everyone is of course free to choose between those who want them to blindly pay to send everything out for external calibration, and those, like me, who try to find more practical alternatives. I don't have a strong preference for either approach; I'm just trying to help all those people who are writing to this list and complaining about being told to pay for external calibration they obviously don't need. While I am sure it is easier to get through audits by automatically sending everything out for external calibration, most business people I've worked with agree that "saving the hassle come audit time" is not the main goal of their company, nor is it the main goal of their quality system. Moreover, telling a client to do something for ISO that otherwise makes no sense for their business tends to make the entire quality system look like a joke, and that is not a message I want to send. But if you and others want to advise your clients to spend money on unnecessary external calibration, merely because it makes the audit easier and doesn't require anyone to think of a better alternative, then go ahead. For some situations, some clients, or some ISO registrars, that might be the best approach. It is not, however, the best approach for all situations, all clients, or all ISO registrars; and it is not the only one that meets ISO 9000.

-- Doug Pfrang
 

Marc

From: [email protected] (Doug Pfrang)
Subject: Re: Calibration definitions/.../Andrews/Pfrang/Andrews/Pfrang

Ethan,

Thank you for your clarification. Now we're getting somewhere. The language you have cited from the Standard does not exactly fit the situation in my examples. Although the calipers are used for determining product acceptance, this acceptance is based on a relative comparison to the prototype. In essence, the prototype is the "certified equipment" mentioned in 4.11.2(b). And since the prototype has not been the subject of any national/international recognized standard, no such standard exists which would apply to the prototype as set forth in the first clause of 4.11.2(b). Therefore, the last clause of 4.11.2(b) would cover this situation; namely, the prototype, or some other controlled internal reference standard, can be used as the basis for calibrating the calipers, as long as it is documented.

I think I understand, now, the concern you raised in your last posting; namely, that "the prototype is not traceable to national/international standards (unless you calibrated your prototype, which you did not indicate)." What you meant is that since the prototype does not have a "known valid relationship to internationally or nationally recognized standards," the Standard requires that a known valid relationship to some national/international standard be created. If that were true, then the last clause of 4.11.2 would be unnecessary. But it's there, so I choose to use it.

Still, in some situations, your approach might be preferred. One might prefer to use a certified gage block to calibrate the calipers -- and to select a certified gage block for which there is a suitable national/international standard. Then one could recalibrate the calipers using that certified gage block, go back to the prototype and take new measurements of the prototype using the recalibrated calipers, and then use those measurements as the basis for establishing product pass/fail criteria. My point -- and the point that many members of this list have raised before me -- is that this approach is impractical and unnecessary in many situations. In those situations, the Standard provides the last sentence of 4.11.2(b) as a solution.
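For contrast, here is an equally hypothetical Python sketch of the certified-gage-block route described in the paragraph above: derive an offset correction from a certified block, re-measure the prototype with the corrected calipers, and set absolute pass/fail limits from that measurement. The block value, readings, and tolerance are all invented for illustration.

```python
# A minimal sketch (invented values) of the certified-gage-block route: correct the
# calipers against a certified block, re-measure the prototype, and derive absolute
# pass/fail limits from that measurement.

CERTIFIED_BLOCK_VALUE = 2.0000                       # value from the gage block's certificate
block_reading = 2.0006                               # what the calipers read on that block
correction = CERTIFIED_BLOCK_VALUE - block_reading   # simple offset correction

prototype_reading = 5.2004
prototype_size = prototype_reading + correction      # prototype dimension, now tied to the certified block

TOLERANCE = 0.005                                    # pass/fail band around the prototype dimension

def accept(part_reading: float) -> bool:
    """Absolute pass/fail against limits derived from the recalibrated prototype measurement."""
    return abs((part_reading + correction) - prototype_size) <= TOLERANCE

print(accept(5.203))   # True
print(accept(5.209))   # False
```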

Personally, I find it better to try to work within the language of the Standard in this way, rather than to foster the bizarre distortions that inevitably result from the rigid interpretation of the Standard that you suggest. Distortions such as auditees claiming a measurement is "for reference only" (wink, wink), or claiming the measurement is "qualitative rather than quantitative" (whatever that means), or claiming there is an "unwritten rule" that tape measures and the like don't need to be calibrated. To me, these are absurd contrivances that practical people are forced to invent merely to bypass an auditor who insists that the Standard requires everything to be calibrated using an external reference standard. As long as auditors interpret the Standard in this rigid, impractical way, these distortions will occur. I believe this does not advance the cause of quality, or improve conformance to the Standard in any way. To the contrary, as evidenced by some of the postings to this list, it leads to auditees who are confused and cynical.

In the approach I describe, people do not have to invent such contrivances to use a practical solution that works; they can stay within the Standard, and simply use a different clause in 4.11.2(b) to get there. But if you prefer to ignore the last sentence of 4.11.2(b), to insist that the Standard requires everything to be calibrated using an external reference standard, and to have your auditees inventing ways to exclude their test equipment from their quality system merely because you allow them no practical way to include it, then don't let me stop you.

As I said before, I don't have a strong preference for either approach; I'm just trying to help all those people who are writing to this list and complaining about being told to pay for external calibration they obviously don't need.

-- Doug Pfrang


>Ethan responds:

>Doug,
>I believe that I had given a "logical" explanation of my position; however,
>for your benefit, I will do so again.
>
>I believe that you are wrong in your assertion that you do not need to
>calibrate the calipers you used in your example. In your example you
>indicated that the calipers would be utilized for product acceptance. If
>that is the case, the ISO 9001/2 Standards are quite clear. You must
>"calibrate and adjust them at prescribed intervals, or prior to use,
>against certified equipment having a known valid relationship to
>internationally or nationally recognized standards." In your example, the
>prototype had no such valid relationship.
>
>As for my assertions being based "solely on my opinion" - they are not.
>They are based on the very clear requirements of Para. 4.11 of ISO 9001/2.
>
>Ethan Andrews
>[email protected]
 

Marc

From: [email protected] (Doug Pfrang)
Subject: Re: Calibration definitions/.../Andrews/Pfrang/Stein/Pfrang

>From: Philip Stein <[email protected]>
>Subject: Re: Calibration definitions/.../Andrews/Pfrang/Stein
>>
>>I keep thinking of that poor guy who removed all the pressure gauges from
>>his machines -- gauges which had been validated by years of operation --
>>just because some idiot auditor wouldn't let him validate the gauges using
>>any method other than calibration to an international/national standard.
>>Did removing the gauges improve quality? Heck no.
>
>How can you validate a gauge by operation? This is silly. They could have
>been off by 1%, 10%, or 100%. How is one to know?
>
>Philip Stein <[email protected]) O-

Have you used the gauges on your car's dashboard for any length of time? If so, then you have validated the gauges by operation.

I did not suggest that validation by operation could be used to determine the precision of the gauge relative to any external reference standard. To the contrary, the reason for using validation by operation is to help AVOID the need to calibrate the gauge using an external reference standard. Validation by operation teaches you the relationship between the gauge's readings and the impact those readings have on product quality (i.e., the Significance of the Gauge Readings). Once you have that information, you can calibrate the gauge using an internal reference standard (merely to keep the gauge readings from drifting), and then use the Significance of the Gauge Readings to make pass/fail decisions about your product.
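As a rough illustration of what "validation by operation" could look like in practice, here is a hypothetical Python sketch: pair historical gauge readings with the eventual quality outcome of the product, derive an acceptance window from the readings that produced good product, and use that window (rather than an external standard) for pass/fail decisions. All of the data are invented.

```python
# A hypothetical sketch (invented data) of "validation by operation": learn which gauge
# readings historically correspond to good product, then use that window for pass/fail.

history = [
    (48.2, True), (49.0, True), (50.1, True), (51.3, True),   # (gauge reading, product turned out good)
    (45.5, False), (54.8, False),
]

good_readings = [reading for reading, good in history if good]
LOW, HIGH = min(good_readings), max(good_readings)   # the learned "significance" of the readings

def accept(gauge_reading: float) -> bool:
    """Pass/fail against the operationally validated window, not an external standard."""
    return LOW <= gauge_reading <= HIGH

print(accept(49.5))   # True: inside the range known to yield good product
print(accept(55.0))   # False: outside the validated window
```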

-- Doug Pfrang
 

Dawn

We have all of our plug gages sent out for calibration at the same time for easier tracking. When we order new ones in, I throw a mic on them and get them calibrated with the rest the next time calibration is due. We are up for pre-assessment next month. Think it will fly?
 

Marc

I guess if your mics are 'standardized' and there is a procedure, you document it, etc., it might. Why don't you order them so they come with a cert?
 

Roger Eastin

There have been several snippos from Mr. Pfrang in this and the old forum. It seems that he likes to "avoid" having measurement instruments as a formal part of calibration and instead "calibrate" the gauge through feedback from its usage. I remember one snippo that was posted about "brute force" calibration which, if I recall, used feedback from product usage/function results. Though I agree he makes some valid points, does his more unconventional metrology fly with auditors? I have been wondering this for some time. Is there a snippo that addresses this, or has someone out there had experience with Mr. Pfrang's methods (with third party auditors)?
 

Marc

Brute Force Calibration

Yeah - I originally posted this because of his unusual point of view and logic. Just some thoughts. :thedeal:
 

Jerry Eldred

I guess I come from another school of thought. I admittedly skimmed and scanned the postings. My impression is that the point of view expressed was that there may be some situations where calibration to an external standard may not be necessary if you are concerned only with matching relatively to a prototype (in the example). I'll give a few separate replies to cover a few different scenarios:

1. It is perfectly acceptable to calibrate to a house standard (golden unit) where no traceability exists (with some added qualifiers, such as being well characterized, etc.).

2. If you are not comparing against something of some sort of known accuracy, it simply isn't a calibration, by definition. Calibration must involve comparing something of unknown accuracy to something of known accuracy.

3. Intercomparison of a few items of unknown accuracy to match their readings is not calibration, it is (I believe) called correlation.

4. If you want to use an uncalibrated set of calipers to check a length, and you do not wish to introduce traceability to an externally known standard, then not only do you not know how long that length actually is (to a stated uncertainty), you also do not know how reliable those calipers are. You calibrate an instrument not only to measure and correct for bias (error), but also to periodically check stability (inherently/implicitly in a calibration). The calipers in the example may also have a repeatability problem that, without some sort of external verification, can't be quantified. (A short sketch below illustrates both quantities.)
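A short, hypothetical sketch of the two quantities in play in items 2 and 4, with invented readings: once you have some stable reference artifact to measure repeatedly, you can estimate the spread of the calipers, and once that artifact's value is known from a calibrated source, you can also estimate the bias; with neither, both stay unquantified.

```python
# A hypothetical sketch (invented numbers): repeated readings of one stable artifact
# expose the calipers' spread (repeatability); knowing the artifact's value from a
# calibrated source exposes the bias.

from statistics import mean, stdev

readings = [5.201, 5.199, 5.203, 5.198, 5.202]   # repeated caliper readings of one stable artifact

repeatability = stdev(readings)                  # spread of the instrument on that artifact
average = mean(readings)

ARTIFACT_CERTIFIED_VALUE = 5.200                 # assumed value from a calibrated source (hypothetical)
bias = average - ARTIFACT_CERTIFIED_VALUE        # error of the average reading vs. the known value

print(f"repeatability (std dev): {repeatability:.4f}")
print(f"bias vs. certified value: {bias:+.4f}")
```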

You may be able to get away without traceability to national or international standards, because the purpose of traceability is to standardize your measured value to internationally accepted values. If you don't have a need for your process to correspond to any absolute measured value, you don't need traceability. But I can think of a good, very fundamental example of the need for traceability: look at your product specification. If anywhere on the specification you list any quantitative value, you need traceability. Something as simple as the L x W x H of a case in inches counts. If you don't have traceability, you can't claim that you meet your own specs.

Suppose you produce a case assembly of (for example) 10" x 12" x 20" and sell it to customer ABC. Customer ABC produces a console and designs it to use your case assembly. They use traceable dimensions on their console; you don't. They attempt to install your case assembly in their console. It's a precision fit, which requires very tight tolerances. Your case doesn't fit properly and has to be reworked, costing your company large $$$'s.

There are probably some examples of products that have no quantitative specifications. I can't think of any. But if you have a product that needs to be measured with calipers, there must be some reason the product needs to be measured.

To discuss the soldiers synchronizing their watches from the original posting... Every measurement has tolerance limits. I would take the analytical view on the soldiers and ask: how many hours will they need their watches to stay synchronized, and what level of accuracy do they need? Let's say the time span is twelve hours, and the allowable soldier-to-soldier error is 15 seconds. Let's say four of them had good digital watches (brand new). To make it more interesting, four of them had wind-up pocket watches, and four of them had self-winding watches.

Time comes for whatever strategic action to happen, and two of the soldiers with pocket watches had loose springs (or something) that made their watches lose 10 seconds an hour. They just missed the strategic action. Two of the soldiers with self-winding watches unwittingly didn't shake their wrists around enough to keep them wound up, and so their watches went dead. One of the soldiers with a brand new digital watch had a battery go dead.

A further complication to the example could be that the entire platoon is supposed to send a radio message in to the battalion command at 0800 hours. There is a security requirement that they start their transmission over the special communications equipment within plus or minus 5 seconds, or the transmission won't go through.
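A quick back-of-the-envelope check makes the point: at the stated drift rate, the faulty watches blow the stated tolerance long before the mission ends. The sketch below is only arithmetic on the figures given above.

```python
# Back-of-the-envelope check of the watch scenario, using only the figures given above.

mission_hours = 12
allowable_error_s = 15            # allowed soldier-to-soldier error for the mission
drift_rate_s_per_hour = 10        # losing rate of the faulty wind-up pocket watches
radio_window_s = 5                # +/- window for the 0800 transmission

accumulated_drift_s = drift_rate_s_per_hour * mission_hours          # 120 seconds over the mission
hours_until_out_of_tolerance = allowable_error_s / drift_rate_s_per_hour   # 1.5 hours

print(f"drift after {mission_hours} h: {accumulated_drift_s} s (allowance: {allowable_error_s} s)")
print(f"those watches exceed the allowance after {hours_until_out_of_tolerance} h,")
print(f"let alone the +/- {radio_window_s} s radio window")
```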

I have regular discussions with users who have very good handheld multimeters, but who use them for very low accuracy measurements. As an example, the Fluke 87, a very good handheld multimeter, is commonly used. Some users think it is so good that they shouldn't have to calibrate it because they are just checking line voltage with it. I take issue with that philosophy. Just because your instrument is more accurate than you need does not in any way eliminate the need for calibration. All equipment changes accuracy in comparison with a known value over time. And all equipment develops reliability problems over time; it merely depends on the equipment how quickly those reliability problems develop.

If you are using very good equipment to make very low accuracy measurements, you may well be able to derate the equipment and calibrate at longer intervals to compensate. The way I typically illustrate it is this: if, when you use that Fluke 87 to measure 120 VAC, you don't care whether it reads 120 VAC, 190 VAC, or 10 VAC, then it may not need to be calibrated.

If you have any expectation of any kind of accuracy (and they all do), then you need it calibrated. Many instrument users confuse what we could call a very good %R&R (precision-to-tolerance ratio) with a lack of need for calibration. If you dig into the mind of many users, you find that a good piece of test equipment was purchased because they wanted to be able to depend on it. There is a misperception by many that very good test equipment shouldn't need to be calibrated (that the need for calibration is a weakness in the equipment).
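As a hypothetical illustration of the ratio mentioned above (all numbers invented), the sketch below computes a precision-to-tolerance figure for an instrument whose spread is small compared with the product tolerance, and notes why an uncorrected bias, which calibration exists to catch, never shows up in that figure.

```python
# Hypothetical illustration (invented numbers): a gauge can have an excellent
# precision-to-tolerance ratio and still carry an unknown, drifting bias.

gauge_std_dev = 0.01        # spread of the instrument, same units as the tolerance
USL, LSL = 120.5, 119.5     # product tolerance limits

p_to_t = 6 * gauge_std_dev / (USL - LSL) * 100
print(f"precision-to-tolerance ratio: {p_to_t:.0f}%")   # 6%: looks comfortably good

# A bias of, say, 0.4 units accumulated since the last calibration shifts every reading
# by the same amount; the ratio above is blind to it, which is why the instrument still
# needs periodic calibration against a known value.
```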

If it was designed to make measurements to a known tolerance, it needs to be calibrated. The only exception is if you just don't care what the measurements are. As a long-time metrologist, I have a good Fluke 77 at home, and a pair of digital calipers. I don't calibrate them; maybe I should. But I know good and well that there's no telling what errors I have in my measurements.

Longwinded enough for the moment.
 

Dawn

I'm looking for any updates to the ISO 10012-1 standard. Has anyone seen a revision?
 