Never put an Expiration (Due) Date on a Calibration Certificate

Wayne

Gage Crib Worldwide
I have some fundamental problems with the concept of a certification having an Expiration Date, or a Recalibration Date.

My problems begin with the basic fact that a certification is only valid at the moment the instrument is being checked. By the time the certification has been typed, the instrument may already be out of calibration. Some may call this an absurd statement, but I contend that it is absolutely true.

Let me give some examples:

1. A Lab Tech finishes his calibration work on an instrument, leaves the tool on his bench, and goes to his computer to transfer his measurement data to the certificate. While he is away from his workbench a co-worker accidentally knocks the instrument onto the floor. For reasons known only to the co-worker, he picks up the instrument, replaces it on the workbench, and does not report the incident. The fall, in this situation, has put the instrument out of whack and invalidated the calibration as it was being typed.

2. A properly calibrated instrument has been packaged for shipping back to the owner of the tool. The shipment is by overnight courier. During the return trip the instrument passes through several extreme temperature situations: heated in a sun-baked delivery truck; frozen in the belly of a high-flying airplane; heated again in another truck turned into an oven by the sun. The result is that critical components of the instrument, which are held in tension and locked in place, are loosened by thermal expansion and contraction, so the tool is no longer at the certified set size.

3. Upon arrival at the owner's factory, the Receiving Clerk, in the process of unpacking the instrument, handles it roughly by whatever untoward method you can imagine. He never reports the possible damage to the tool because it looks just fine. Regardless, the instrument has been rendered inaccurate by his actions.

4. A properly calibrated thread plug gage is placed into service with instructions to the user to measure every 100th part. The operator, being a good employee in his own eyes, figures that if measuring one part in a hundred is good, measuring every part is lots better. A week later the poor thread plug gage is worn to a nub, though the wear is unnoticeable to visual inspection. It is returned to the crib and stored for future use even though it has been worn beyond its limits.

Let me say that these situations may or may not cause a tool to fall out of calibration, and I would hope that many instruments can handle some abuse without going out of whack. Let me also say that the opportunities for an instrument's settings to be silently altered are numerous and beyond total control. It is for these reasons that placing an Expiration Date on a certification, in my opinion, should be avoided like the plague.

Yes, we all know that an Expiration Date is for reference only. Yes, we all know that all instruments should be validated for function frequently. Or do we? To those not as deeply involved in calibration and not familiar with the fragility of measurement instruments, an Expiration Date gives the false impression that all will be fine until that date arrives. It might even be argued that an Expiration Date is a guarantee of performance up to that date, which is something that no calibration laboratory would be willing to assure. For these reasons I contend that no certification should ever have an Expiration Date. A Recalibration Date is nearly as bad. It implies the same thing as an Expiration Date, with the subtle differences being lost on the masses.

My preference would be to have no time-based calibration cycle stated on a certification. Calibration cycles are best determined based on usage. But customers have become trained to use time-based systems, and expect to find such data on the certification. Calibration Laboratories, wanting to meet customer requirements and keep the customer happy, have complied and placed such specious data on the certification.

The solution: on the certification, any recalibration date should be identified as 'suggested'. Further caveats should also be included in the fine print as to proper determination of the calibration cycle, or some other disclaimer against the absolute value of any given suggested, time-based recalibration date.
 

silentrunning

I agree with everything you have said here. Since there is no "perfect" way to do calibration at a reasonable cost, I use it as a fence. Once a device is proven to be in calibration, everything checked prior to that calibration is considered acceptable. As you stated, a gage can go out at any point during the inspection process. It is not feasible to calibrate a gage after each measurement, so we keep track of what product is checked by which tool. At the end of a production run the inspection tools are rechecked for accuracy. Not recalibrated! We send our inspection tools out every 6 months for outside calibration to verify our in-house measurements. When they are proven accurate we approve the job. The extra 10 or 15 minutes this takes for each job has paid huge dividends over the last few years.
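The "fence" bookkeeping described above could be sketched roughly like this (a hypothetical illustration only; the class and method names are mine, not anything from an actual crib system):

```python
from collections import defaultdict

class FenceLog:
    """Log which jobs each tool checked, then release or quarantine
    those jobs based on the tool's end-of-run accuracy recheck."""

    def __init__(self):
        self._jobs_by_tool = defaultdict(list)

    def record_check(self, tool_id, job_id):
        # Each inspection is logged against the tool that performed it.
        self._jobs_by_tool[tool_id].append(job_id)

    def close_run(self, tool_id, recheck_passed):
        # At the end of the run the tool is rechecked (not recalibrated).
        # If it passes, every job it checked is fenced off as acceptable;
        # if it fails, those same jobs need review.
        jobs = self._jobs_by_tool.pop(tool_id, [])
        return ("accept" if recheck_passed else "review", jobs)

log = FenceLog()
log.record_check("mic-042", "job-1001")
log.record_check("mic-042", "job-1002")
status, jobs = log.close_run("mic-042", recheck_passed=True)
print(status, jobs)  # accept ['job-1001', 'job-1002']
```

The point of the design is that a passing recheck retroactively vouches for everything inside the fence, so nothing has to be re-inspected when the tool proves good.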

Doug
 

Tim Folkerts

Trusted Information Resource
I would tend to treat an expiration date as a maximum time, not necessarily a guarantee.

For example, the egg salad in my fridge has an expiration date a week from now. But if I leave it out of the fridge overnight, I'd throw it out - even if it is still a week to the stated expiration.

So an instrument - if shipped and stored and used as expected - should be good up until the expiration date. Eventually the calibration will fail - due to normal use, corrosion, evaporation, etc. The due date (IMHO) is an effort to estimate these sorts of expectations.

If not shipped and stored and used as expected, then it could well go out of calibration earlier. As you say, it is hard (or impossible) to know if it has been handled appropriately. That is where in-house checks help to ensure it is still operating as expected.


Tim F
 

Jerry Eldred

Forum Moderator
Super Moderator
Let me give a reply that sounds like I am talking out both sides of my mouth.

On the one hand, a Certification and due date are not in any way a guarantee of how long the instrument will stay within spec. When an instrument is calibrated/certified, it is returned to the owner, and if anything happens between the cal date and due date, the lab is not liable for its remaining within tolerance.

On the other hand, the due date (or let me shift gears and use the term "interval" for the remainder of my reply) is based on some valid reasoning. The calibration interval is the amount of time, to a defined statistical confidence level, that an instrument can be expected to remain within tolerance. This means that the assignment of a due date is based on a predicted interval of time that the instrument will "probably" remain in tolerance. 95% is a common confidence, based on statistical probability.

There are other ways to assign or adjust intervals (but I won't go into all that now). So the assignment of a due date on a certificate means whoever assigned it is stating (if using the statistical method) that the unit will have a 95% probability (using my example above) of remaining in tolerance until that date.
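The statistical method can be illustrated with a toy model. This is my own hedged sketch, not any lab's actual procedure: it assumes in-tolerance "survival" decays exponentially with time since calibration, so from the observed end-of-period reliability of a pool of like instruments one can back out the interval that would hit a 95% target.

```python
import math

def suggested_interval(t_current_days, r_observed, r_target=0.95):
    """Toy exponential-reliability model (an assumption, not a standard):
    given the fraction r_observed still in tolerance at the current
    interval, solve for the interval meeting the target reliability."""
    if not (0.0 < r_observed < 1.0):
        raise ValueError("observed reliability must be in (0, 1)")
    theta = -t_current_days / math.log(r_observed)  # characteristic time
    return -theta * math.log(r_target)

# Example: if only 90% of like instruments were still in tolerance
# after a 365-day interval, a shorter interval is needed for 95%.
print(round(suggested_interval(365, 0.90)))  # 178
```

Real interval-analysis methods are more involved, but the shape of the reasoning is the same: the due date falls where the predicted in-tolerance probability drops to the chosen confidence level.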

So there is valid thought that goes into assigning the due date.

I would have to go on record that a certification (with some exceptions) must have a due date on it. This is a part of what a certificate of calibration is - a prediction of how long an instrument will probably remain in tolerance.
 

Jim Wynne

Leader
Admin
I would have to go on record that a certification (with some exceptions) must have a due date on it. This is a part of what a certificate of calibration is - a prediction of how long an instrument will probably remain in tolerance.

I agree, and thanks to Wayne for raising an interesting topic.

We need to not forget that part of the information we receive from calibration is the "as found" condition, and we should be using it wisely. If, for example, we find that 6-inch calipers in similar use consistently need adjustment when being calibrated, we need to (A) shorten the interval, and (B) investigate the situation. Are the calipers being abused? Are they worn such that maintaining set points is impossible?

The problem, as I see it, is one of education; people need to know that the due date on the label represents a maximum and not an absolute value. This seems obvious, and I think most people understand that there is risk in assuming that devices are properly calibrated when the state of calibration isn't intuitive. But you can't get people to use measurement equipment prudently by masking significant information.
 

Wes Bucey

Prophet of Profit
For my part, I had only the date of calibration/certification put on the label or tag, never a due date. To my mind, the interval between calibration/certification dates doesn't even start until the gage/instrument is put into service. Some gages might lay in a drawer in a temperature-controlled lab for 6 months before going into use. I kept track of usage in a database keyed to serial numbers of each gage or instrument. If the gage got excessive use, it got checked more frequently in-house to confirm it was still within tolerance.
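A usage-keyed record like the one Wes describes might look something like this minimal sketch (field names and thresholds are mine, purely for illustration): the interval clock starts only at first use, and heavy use pulls the in-house check forward regardless of the calendar.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class GageRecord:
    serial: str
    in_service: date = None   # None while the gage sits in the drawer
    uses: int = 0

    def log_use(self, on):
        if self.in_service is None:
            self.in_service = on   # interval starts at first use
        self.uses += 1

    def check_due(self, today, max_days=180, max_uses=500):
        if self.in_service is None:
            return False           # unused gages accrue no interval
        overdue = (today - self.in_service).days >= max_days
        overused = self.uses >= max_uses
        return overdue or overused

g = GageRecord("SN-1001")
g.log_use(date(2024, 1, 15))
print(g.check_due(date(2024, 9, 1)))  # True: over 180 days in service
```

Either trigger (calendar or use count) sends the gage for an in-house tolerance check, which matches the idea that calibration cycles are best driven by usage rather than the certificate date alone.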

I have had some gages go out of tolerance very quickly. I heard an apocryphal story about employees using a 2-inch micrometer as a nut cracker!
 

Zuggy


I would have to go on record that a certification (with some exceptions) must have a due date on it. This is a part of what a certificate of calibration is - a prediction of how long an instrument will probably remain in tolerance.

I would also agree with this statement. I used to work for a Machining/Millwright company. Our standard practice was to check tools such as I/S and O/S mics before use. Furthermore, we (for the most part) had two sets of standards for the QA system. I agree that a calibration sticker or certificate is no promise that the tool is within spec; however, it does prove to the customer that your QA program is important enough to ensure best practice.


JMT Tim
 

Marc

Fully vaccinated are you?
Leader
What about a 'Due Date' on calibration stickers?

A 'Due Date' on a calibration certificate is one thing. What about a 'Due Date' on calibration stickers which most companies put on individual measurement instruments?
 

amanbhai

Our calibration agency calibrates instruments but does not give a next calibration due date, since they say it is the lab that decides what the due date should be. Why? Because the lab is the user of that equipment, not the calibration agency.
Yet they do put a next calibration due date on physical equipment like weighing scales, measuring tapes, etc.
 

BradM

Leader
Admin
Wayne, thank you. It is always appreciated having posts that make you think.

Just my opinion:

1. Should you never put a recall/expiration date on a certificate?

It depends on the customer. I, for one, do not want recall dates, as I have my own system from historical data which I use for recall. A date on the certificate is just something else to explain.

For many customers I have done work for, a recall date is yet another matter. They have no system, and rely on the objectivity/advice of the calibration vendor for the date. Do I know that well? No. But the customer has confidence in the little piece of data being on the sheet. Most of the time the recall is agreed upon with the customer, and it's on there for their convenience.

2. If you put the date on there, what should it be based upon? I would like to say that it is always based on compilation of data, accounting for uncertainty, etc., but most of the time, it isn't. It is based on history; not only of the instrument, but of the vendor performing the calibration, and the variables mentioned by Wayne.

A calibration activity is similar to an audit: on this date, this is what I found. The calibration lab cannot be responsible (to an extent) of what happened before and after the calibration. The individual with the Unit Under Test has the responsibility of controlling (or measuring) the variables.

I really never have viewed recall dates as anything more than a reminder tool. Rather, a suggestion for the recall system that simply states, "unless anyone has a more brilliant idea, why don't we calibrate this next time at XX/YY". If instruments did not have a recall date on the stickers, it would be profoundly difficult to have half of them returned for calibration. Should the system be more robust than that? Probably. But sometimes, I take whatever I can get.
 