Safeguarding Calibration on Digital Equipment

Thread starter: snglcoin
ISO 9001:2000 Section 7.6 d) states that measuring devices shall “be safeguarded from adjustments that would invalidate the measurement result.” I asked five different gage manufacturers at the Quality Exposition in Chicago this week about this requirement as it pertains to digital calipers and micrometers. Every gage salesman looked at me like I was the first person ever to ask the question, and even though they understood my dilemma, none of them had a satisfactory answer.

My opinion is that every time an operator re-zeros their digital caliper, they have just made an adjustment that could invalidate the measurement. I have seen plenty of digital equipment no longer reading zero when returned for calibration because of user adjustments. Even in the case where the caliper has an absolute scale, which reverts to the calibrated zero when it’s turned off and on again, the absolute or origin reset button is completely accessible to the user on the face of the tool. There should never be any reason for a user to reset the absolute origin on a digital tool. Why not put the button where it can’t be inappropriately or inadvertently pressed (like under the battery cover)? I need some feedback: do I just not get it, or is there really an issue here?
 
The approach we took with digital calipers and micrometers in the testing lab I supervised was to have the operators close them and zero them at the beginning of the turn, and then use a gauge block to verify them. Results were recorded in a logbook. Both calipers and micrometers were calibrated on a quarterly basis using a range of gauge blocks by me, the supervisor. If there were any problems during the turn, they were to re-zero and re-verify. If that didn't work, they were to contact the lab tech (a clerk-type position) or the lab supervisor.

We had no problem with auditors comparing the lab system to the requirements of ISO 9001:1994, QS-9000, and ISO Guide 25 (the predecessor to ISO/IEC 17025).

Verifying measuring instruments on a regular basis is just good preventive practice - make it part of your work instructions, and you limit the amount of work you may need to do to identify and address potentially non-conforming material if an instrument comes in for calibration and is found out of tolerance. :)
 
My dilemma is that what you have described, what everybody else I’ve talked to does, what most training videos demonstrate, and what most gage manufacturers promote with their digital equipment is tantamount to having a user turn the barrel on their vernier micrometer because they think the zero is off a little. I don’t know anybody who would find that practice acceptable, yet we accept it when it comes to digital equipment. I agree with having an operator verify their measuring equipment before each use, but if it’s not reading correctly they should, in my mind, turn it in to be recalibrated. Most of the time, if it doesn’t read zero, it’s because there are minute amounts of dirt on the measuring faces or natural variations in operator “feel,” not because the gage needs to be reset. Yet we have an entire workforce trained to just hit the zero button, or worse yet, the origin button.
 
The key is not to disable the zero function but to train top management and users on the science of measurement (metrology). If the calipers have been calibrated and there is no damage to the jaws, they will read zero when closed without needing verification.

The zero function allows you to set your caliper, mic, or drop indicator at the nominal measurement value and check the plus-or-minus variance. And if you follow the same line of thought, what about dial indicators, pitch or thread mics, or height gages, which can all be zeroed at a nominal measurement?

Before someone comments on this: in metrology, a “gage” is a device that makes a measurement and a “gauge” indicates a pressure value.

Jeff
 
Jeff Frost said:
In metrology a “gage” is a device that makes a measurement and a “gauge” indicates a pressure value.

Jeff

Jeff:
Good advice about zeroing of instruments, but not so good on the English. Do you mean that "pressure value" is not the result of a measurement? "Gage" is a variant spelling of "gauge." The former is fine in informal contexts, but the latter should be used in formal writing. There is no special differentiation between the two in metrology--it would serve no purpose.
 
There are situations where zeroing out the calipers can be beneficial for operators. If a .25 diameter hole is supposed to be .75 from the edge of the part to the center of the hole, measure from the edge of the hole to the edge of the part, zero out the calipers, move out .125 (half the hole), and zero again; now close the calipers and your reading should be .750.

I have worked around a lot of machinists who do this rather than pulling out the calculator or doing the math. It's not always as simple as my example, so there are situations where it is a nice feature. Every person I have ever trained, I have taught to always close the calipers completely and zero out before taking any measurements.
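The arithmetic behind that zero-shift trick can be sketched in a few lines. This is a hypothetical illustration of the example above (a .250 hole whose near edge sits .625 from the part edge); the function name is mine, not from any caliper software.

```python
def edge_to_center(edge_to_hole_edge: float, hole_diameter: float) -> float:
    """Edge-of-part to hole-center distance produced by the zero-shift trick:
    the jaw span to the near edge of the hole plus half the hole diameter."""
    return edge_to_hole_edge + hole_diameter / 2

# The example above: a .250 hole whose near edge is .625 from the part edge.
print(edge_to_center(0.625, 0.250))  # 0.75
```

The caliper is effectively doing this addition for you by accumulating the two zero shifts.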
 
Since I am also a machinist by trade, I couldn’t agree more. I use the functions of a digital gage and the ability to zero it to do more complex measurements all the time, and I appreciate the capabilities it provides and not having to use a calculator to do the math. That’s one of the reasons why I think digital gages with an absolute scale are so important. Whenever I turn my gage off and turn it back on, it retains the absolute zero that was set during calibration. I wouldn’t want to lose the measuring ability that the zeroing capability gives me. But I wouldn’t re-zero the caliper before making any measurements. I may check my zero setting to see if anything has changed, but I still depend on the absolute zero as my starting reference. The problem still exists, however, that the absolute zero button is completely accessible and therefore not safeguarded against invalidating the calibration. If an auditor asks me how I safeguard our digital calipers or micrometers against a user invalidating the calibration setting, I have no answer. Why haven’t any of the gage manufacturers been asked to address that issue? I still feel like I’m the only one concerned about it. Am I missing something? Am I misinterpreting the intention of the standard? Has anybody else ever wrestled with this as an issue?
 
Do you know of a real-world instance where this has been a problem? In other words, have you ever been able to trace nonconforming conditions back to a spuriously-zeroed caliper? Do you even have strong suspicions that such a thing has happened? If so, then perhaps you need to switch to dial calipers, which are less likely to be tampered with. If not, then perhaps you should do some experimentation by calibrating a caliper, doing a GR&R, then re-zeroing it and doing another GR&R with the same parts and operators so that you have some actual data, rather than just possibly-groundless fear.
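For the experiment suggested here, even a rough range-method repeatability estimate before and after re-zeroing would give you actual data to compare. A minimal sketch, with made-up readings and the standard d2 constants; the function name and data are illustrative, not from any particular MSA software.

```python
from statistics import mean

def repeatability_sigma(trials):
    """Rough range-method repeatability: average range across parts divided
    by the d2 constant for the number of repeat trials (2 or 3)."""
    d2 = {2: 1.128, 3: 1.693}[len(trials[0])]
    return mean(max(t) - min(t) for t in trials) / d2

# Illustrative readings (inches) on the same three parts, two trials each,
# taken before and after re-zeroing the caliper.
before = [(1.000, 1.001), (2.002, 2.001), (2.999, 3.000)]
after = [(1.001, 1.000), (2.001, 2.002), (3.000, 2.999)]

print(repeatability_sigma(before), repeatability_sigma(after))
```

If the two estimates are comparable, re-zeroing did not degrade the gage's repeatability; a real study would of course use a full GR&R with multiple operators.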
 
It doesn't matter if the calipers are digital or dial, you can fudge with the zero display. IMO it doesn't matter and the key is remembering what calibrating a pair of calipers is all about.

Our calibration procedure for calipers states:
Close the calipers and inspect the jaws for a clean closure; if they have a gap or are not closing parallel, they need mechanical adjustment. Open the calipers and measure a master block: are they within .001 (normally calipers are only accurate to .001)? Repeat the measurements with the inside-measuring jaws on top of the calipers. Do they give good inside and outside measurements when used properly? Yes or no determines whether they go back on the floor.

The words “when used properly” are the key. The adjustment of setting a reference zero, digital or dial, does not change the accuracy of the calipers. It only changes the returned value based on the method of measurement. The calipers are still calibrated to operate accurately within the set tolerance limits under proper use. The zero is referenced to absolute zero, but digital calipers can properly be used as an incremental measuring instrument. If a caliper measures a 3.000 block as 3.000 from the closed state, it doesn't matter what number is on the screen.

Example: If I zero my calipers at 1.000 when closed and then measure a 3.000 block, the returned value would be 4.000, which is the sign of a calibrated instrument (and possibly an uncalibrated person). Changing zero does not change calibration! I don't think you have to worry about the zero button; I would argue that with an auditor all day and never back down.
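The point that a zero offset cancels out can be shown in a couple of lines. A minimal sketch, using the numbers from the example above:

```python
zero_offset = 1.000  # display deliberately zeroed to read 1.000 when closed

def display(true_opening: float) -> float:
    """What the screen shows: the true jaw opening plus the zero offset."""
    return true_opening + zero_offset

# A 3.000 block reads 4.000, but the difference from the closed reading
# is still exactly the block size, so an accuracy check still passes.
print(display(3.0) - display(0.0))  # 3.0
```

The offset shifts every reading by the same amount, so any measurement taken as a difference between two readings is unaffected.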
 
As far as “real world” issues, I can only reference a Quality Digest article written way back in February of 1999. The article doesn’t go into depth about safeguarding equipment, but it does point out the ease with which measurement errors can be introduced because of the zeroing capabilities of digital equipment. The link to the article is https://www.insidequality.wego.net/?v2_group=0&p=4499&ct=cdisplay&nt=true&cd_eid=233

It is important to remember the original issue in this thread and that is that ISO requires that measuring devices be safeguarded from adjustments that would invalidate the measurement result. I get the feeling that this is something nobody wants to talk about when it comes to digital equipment. Maybe an auditor will never question our practices but I wouldn't bet on that. I would like to be proactive and I would like gage manufacturers to address the dilemma.
 