Brute Force Validation - Difficult to show 10-year calibration cycle is sufficient

Marc

A good read:

From: (Doug Pfrang)
Subject: Re: Calibration/Pfrang/Volker/Pfrang

What is "validation by brute force"?

Validation by brute force means that you try a tool in your process and you see if it works; if it works, then you have validated that tool by brute force. It is also known as trial and error. For example, let's say you're repairing a customer's car, and the shop manual says to use a 5 mm wrench. You've never calibrated your 5 mm wrench, so you don't know if it's really 5 mm or not. But you grab it anyway and discover that it works. You have just validated that tool by brute force, because you have tried it and found that it works in your process. You don't care if the wrench really is calibrated to 5 mm or not, because you know that it fits the bolt you're trying to remove and that's good enough.

The same goes -- potentially -- for any tool in any process: given enough time, money, and effort, you could, in theory, validate every tool in every process by brute force trial and error. The problem is that some tools and some processes are too expensive, time-consuming, or dangerous to validate by brute force trial and error.
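
In code terms, brute force validation is nothing more than a try-it-and-check loop. Here is a minimal Python sketch of the wrench example; the sizes, the clearance figure, and the fits_bolt check are invented purely for illustration, not anything from the original post:

# Minimal sketch of "validation by brute force" (trial and error),
# using the wrench example above. Sizes and clearance are made up.

def fits_bolt(wrench_size_mm, bolt_size_mm, clearance_mm=0.2):
    """The 'process': does this wrench actually turn this bolt?"""
    gap = wrench_size_mm - bolt_size_mm
    return 0.0 <= gap <= clearance_mm

def brute_force_validate(wrenches_mm, bolt_size_mm):
    """Try each (uncalibrated) wrench on the real bolt; whatever works
    is validated by brute force for THIS job, calibrated or not."""
    return [w for w in wrenches_mm if fits_bolt(w, bolt_size_mm)]

if __name__ == "__main__":
    wrenches = [4.95, 5.05, 5.40]   # nominal "5 mm" wrenches, true size unknown
    print(brute_force_validate(wrenches, bolt_size_mm=5.0))   # -> [5.05]

The point of the sketch is that nothing in the loop ever asks whether the wrench is calibrated; it only asks whether the wrench works in the process.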

>> If the manufacturer recommends that you recalibrate a tool annually,
>> but you find it perfectly adequate -- in YOUR process -- to revalidate
>> that tool every decade, then you are perfectly justified in
>> revalidating that tool every decade.
>
> Won't it be a little difficult to show 10-year recalibration is
> adequate??? Say you do happen to have 2 calibration records for one
> instrument 10 years apart, and they show it's remained in calibration --
> is that one sample of a test of the 10-year hypothesis statistically
> valid? If the manufacturer has a 1-year guideline, shouldn't you start
> from there, and then maybe see if you can lengthen it?

Your questions cannot be answered in the abstract. You have to use a specific example, because every situation is unique. With some tools, in some processes, it is perfectly acceptable to go 10 years between validations; with other tools, or other processes, it is not. The rule is to start with the manufacturer's guideline (or your own experience, if there is no guideline), and then let your (DOCUMENTED!) data and your common sense determine what is reasonable in YOUR process. Maybe ten years is OK, maybe it's not.

Also, I did NOT say that you only needed to do one validation and then you could immediately jump to a ten year duration. In most situations, you should indeed start with a shorter duration, say one year, and use that duration for a year or two until you gain enough (DOCUMENTED!) experience with YOUR particular tool in YOUR particular process to justify extending the cycle. All I said in my posting was that IF you have the (DOCUMENTED!) data to justify extending a validation cycle, then there is nothing necessarily wrong with doing so, because this is definitely NOT an ISO non-conformity.
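
As a sketch of what "let your documented data decide" might look like in practice, here is a minimal Python example; the tolerance, the 50% guard band, the minimum number of records, and the record format are all assumptions made for the illustration, not rules from this post or from ISO:

# Illustrative only: decide whether documented as-found calibration
# results justify extending the cycle. The guard band and minimum
# record count are assumed values, not requirements.

def can_extend_cycle(as_found_errors, tolerance, guard_band=0.5, min_records=3):
    """Allow an extension only if there is enough documented history and
    every as-found error stayed comfortably inside tolerance."""
    if len(as_found_errors) < min_records:
        return False                        # not enough documented data yet
    limit = guard_band * tolerance          # e.g., within 50% of the tolerance
    return all(abs(err) <= limit for err in as_found_errors)

if __name__ == "__main__":
    history = [0.02, 0.03, 0.01, 0.02]      # as-found errors from annual calibrations
    print(can_extend_cycle(history, tolerance=0.10))        # True  -> consider a longer cycle
    print(can_extend_cycle([0.02, 0.09], tolerance=0.10))   # False -> keep the short cycle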

Never forget, however, that there is a danger of extending the validation cycle too far: the longer the cycle, the more likely it will be that: (a) a tool WILL BE out of spec at its next validation (assuming drift is an increasing function of time); and (b) you will have shipped a lot MORE PRODUCT in the interval since the last validation. Therefore, a long validation cycle not only increases (a) the PROBABILITY that you will have to do a recall; it also increases (b) the TOTAL NUMBER OF UNITS involved if a recall is required. Your exposure to financial loss (due to shipping defective product) is roughly the PRODUCT of those two factors, so it grows much faster than linearly as you lengthen your validation cycle! Keep that in mind before you kick your validation cycle out to ten years.
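
To make the arithmetic behind that warning concrete, here is a toy Python sketch; the drift rate, shipment rate, per-unit cost, and the linear drift model are invented for the example only:

# Toy model of recall exposure vs. validation cycle length. All numbers
# are invented for illustration; real drift behavior must come from
# documented data for YOUR tool in YOUR process.

def out_of_spec_probability(years, drift_rate=0.05):
    """Chance the tool has drifted out of spec by the end of the cycle,
    assumed to increase with time (toy model: 5% per year, capped at 1)."""
    return min(1.0, drift_rate * years)

def recall_exposure(cycle_years, units_per_year=10_000, cost_per_unit=2.0):
    """Expected loss ~ P(out of spec) x units shipped since last validation."""
    p = out_of_spec_probability(cycle_years)
    units_at_risk = units_per_year * cycle_years
    return p * units_at_risk * cost_per_unit

if __name__ == "__main__":
    for years in (1, 5, 10):
        print(years, "year cycle -> expected exposure of about $", round(recall_exposure(years)))
    # 1 year: $1,000; 5 years: $25,000; 10 years: $100,000 -- in this toy
    # model the exposure grows with the SQUARE of the cycle length.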

Oh, yeah, and remember that the data you use to justify your validation cycle must be DOCUMENTED!

-- Doug Pfrang
 