100% verification of special processes & risk-based process validation?

Thanks @Ronen E
But forgive me, I don’t see that it says 100% verification, only verification. Sample inspection has been a commonly accepted approach for most characteristics, with the possible exception of highly critical characteristics…

No, don’t get me wrong: I am not a real fan of acceptance sampling - especially for critical characteristics - and I do believe very strongly in robust development and validation that actually guarantees conformance to the output specifications, in conjunction with strong SPC or other process controls such as error proofing; but I’m not seeing the standard backing up validation for characteristics that are only sampled. Can you help clarify?
 
A subtle point (w.r.t. purchased parts) is that ISO 13485 has a separate section on Purchasing (7.4), as opposed to Production and service provision (7.5). In that section (and as described in the guidance), the intent is that the requirements for purchased product have been met and that the controls the organization has put in place are proportionate to the risk associated with the purchased product.

"Skip lot testing" and "certificates of conformance" are just a couple of examples that might be chosen (by the receiver's QMS), per the ISO/TC 210 guidance. I see a LOT of opportunity for different approaches to satisfy this.
True. From the finished device manufacturer's perspective, purchased components may be handled differently because they are not directly responsible for the manufacturing process. But if the component manufacturer is applying ISO 13485:2016 they'd be required to apply 7.5.6.
I only write the ^ above ^ because I have known people that interpreted 7.5.6 for molded parts as "if a part is supposed to be made to the print, we have to check every (special, critical, starred, whatever) dimension on every part that comes through the door". Those people didn't even allow for sampling, even when the tools had been qualified and the parts included features that would indicate whether the molding process was non-conforming in some way that affected any other dimension.
That's a fine example of suboptimal design and component specification practices, as well as specification interpretation. The component manufacturer might have a different internal culture/policy (e.g. checking everything on the drawing), but from a Good Design Control Practices standpoint they should only be accountable for the characteristics that have been specified as significant. That accountability could then translate into verification (where verification is possible) OR validation.
 
Thanks @Ronen E
But forgive me, I don’t see that it says 100% verification, only verification. Sample inspection has been a commonly accepted approach for most characteristics, with the possible exception of highly critical characteristics…

No, don’t get me wrong: I am not a real fan of acceptance sampling - especially for critical characteristics - and I do believe very strongly in robust development and validation that actually guarantees conformance to the output specifications, in conjunction with strong SPC or other process controls such as error proofing; but I’m not seeing the standard backing up validation for characteristics that are only sampled. Can you help clarify?
I'm not sure I can help prove this without putting in a significant amount of time. You have the choice to completely disregard what I wrote or trust me on this... It has been my cumulative understanding that in this context "verification" is interpreted - for regulatory/compliance purposes - as 100% verification (or very close to it, which AQL-style sampling isn't, normally). I have seen this in multiple places & contexts, though I can't, off the top of my head, come up with an "absolute", indisputable reference.

If sampling is considered acceptable, where do you draw the line? It's a slippery slope.

Please note that I'm talking mostly about "highly critical characteristics". I don't use that terminology lightly, because I think "critical" is notoriously and commonly poorly defined. In my medical device designer/engineer mind, only "highly critical characteristics" should be observed during routine manufacturing; all the rest needs to be designed robustly, so as not to be a concern anymore at that stage.
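
To illustrate the point about AQL-style sampling not being "very close to" 100% verification, here's a rough sketch of the acceptance probability of a single-sampling plan. The plan (inspect n = 125, accept with up to c = 3 nonconforming) is hypothetical and chosen purely for illustration - it is not taken from any Z1.4 / ISO 2859 table - and the function name is mine:

```python
import math

def p_accept(n: int, c: int, p: float) -> float:
    """Probability that a single-sampling plan (inspect n units, accept the
    lot if at most c nonconforming are found) accepts a lot whose true
    nonconforming rate is p, via the binomial distribution."""
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

# Hypothetical plan, for illustration only:
n, c = 125, 3
for p in (0.005, 0.01, 0.02, 0.04):
    print(f"true nonconforming rate {p:.1%}: P(accept lot) = {p_accept(n, c, p):.2f}")
```

Under that plan, a lot running around 2% nonconforming is still accepted roughly three times out of four - a long way from the assurance that 100% verification is meant to provide.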
 
I was going to do some digging through materials to try to get a grasp on where "100% verification" may have entered the vernacular, but before starting I felt like this was going to be something of an effort in confirming personal biases! I'll share where I was going to start... it was the old GHTF guidance (c. 1999) on process validation.

From memory: In that guidance, validation was defined as "establishing via objective evidence that the process consistently produces a result-or-product that meets pre-determined requirements". Verification that the result-or-product meets requirements (the alternative to Validation) was defined as "confirmation by examination (and provision of) objective evidence that the result-or-product meets the pre-determined requirements."

I remember that, not long after that guidance was in circulation, several consulting groups were circulating/presenting a very simplistic template/decision tree for determining "what exactly needs to be validated?" The template/decision tree essentially shortcut the statistical guidance of the GHTF group's paper by asking something like "are the outputs 100% verified?"... because "yes" meant "no validation required", and "100%" meant "no statistician required."

I'm being somewhat snarky about the latter point, but as a rather junior member of a department that validated a LOT of manufacturing processes, *I* was the one (fourth or fifth person down on the totem pole) who was eventually tapped to explain to an auditor what our statistical approach was for certain validations... and since the auditor had no special insight into those approaches (honestly, the processes were pretty low stakes anyway), I came to believe that was partially an exercise to see how much panic he could induce in the organization.
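
Coming back to that decision tree: the contrast looked something like the sketch below. This is my paraphrase from memory - the function names and wording are mine, not GHTF's or the consultants' - but it captures the difference: the GHTF-style tree asks a couple of questions, the circulated template asked one.

```python
def ghtf_style_decision(output_verifiable: bool,
                        verification_sufficient_and_cost_effective: bool) -> str:
    """Rough paraphrase (from memory) of the GHTF-style decision logic."""
    if not output_verifiable:
        return "validate the process"
    if not verification_sufficient_and_cost_effective:
        return "validate the process anyway"
    return "verify the output and control the process"

def consultant_shortcut(outputs_100_percent_verified: bool) -> str:
    """The circulated template, more or less: one question, no statistician."""
    return ("no validation required" if outputs_100_percent_verified
            else "validation required")

print(ghtf_style_decision(output_verifiable=False,
                          verification_sufficient_and_cost_effective=False))
# -> "validate the process"  (e.g. sterilization, weld strength)
print(consultant_shortcut(outputs_100_percent_verified=True))
# -> "no validation required"
```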
 
…I will add that “interpreting” what a Standard means is more of a slippery slope into a quagmire of debate that does nothing to further the influence of Quality professionals or the manufacture of quality products. It only promotes gamesmanship. If we are to have standards, then let's be specific in our language and not vague.
 
…I will add that “interpreting” what a Standard means is more of a slippery slope into a quagmire of debate that does nothing to further the influence of Quality professionals or the manufacture of quality products. It only promotes gamesmanship. If we are to have standards, then let's be specific in our language and not vague.
As an aspiration, I agree.

Sadly, this is the reality with published standards (and regulations) - they all contain language that requires interpretation. Interpretation is not necessarily bad, as long as it's reasonable and transparent. The problem starts when things are left to the whims of individuals, no reasoning is provided, no open debate is held and no real accountability is observed.
 
Sec. 820.75 Process validation.

(a) Where the results of a process cannot be fully verified by subsequent inspection and test, the process shall be validated with a high degree of assurance and approved according to established procedures.

How would you interpret "fully"?
 
Well, no single word stands alone. Context matters. Sentence structure matters. The full context is where it says a process ‘cannot’ be fully verified. It doesn’t say that it has to be fully verified (i.e. 100% inspection), nor does it say “where it is not fully verified”. I know what “fully” means, and I know the difference between what was stated and what the other terms mean - I still haven’t seen anything that says it must be fully verified or else you must validate. And this is a substantial difference for non-critical characteristics… even for critical characteristics that have no other requirement from the customer.

Sorry - again, I’m not a fan of sample inspection, nor of wriggling out of validation. Neither am I a fan of statements that are open to very different interpretations, or where the auditor may interpret something beyond the standard meaning of the words, creating nothing but distrust and meaningless debate.
 
I think this is key:
Sorry - again, I’m not a fan of sample inspection, nor of wriggling out of validation. Neither am I a fan of statements that are open to very different interpretations, or where the auditor may interpret something beyond the standard meaning of the words, creating nothing but distrust and meaningless debate.

Outside of medical device (or other safety-focused industries') regulations: we validate processes (at an appropriate level) so that we don't have to inspect everything (or a lot of things)... because of scrap and waste. What is "appropriate"? That will depend on a lot of different things, and this is where critical thinking has to happen... and ought to be documented, certainly for cases where there may not be consensus (e.g. something like a flammability standard... trust me, plastics manufacturers aren't doing UL94 sustained burn tests on every lot). Within medical device regulations the requirements derive not just from scrap and waste, but from patient/user safety w.r.t. the use of finished goods.

21 CFR 820.75(a), when viewed with the general requirements of 21 CFR 820.70, is essentially saying (my words): "You need a process to do a thing; if you can't tell that the thing happened, you had better generate enough OE (objective evidence) to convince yourself that it did." Here, the regulators mostly care about patient and user safety concerns; they really aren't concerned (at least not directly) with scrap and waste. This is the main reason why sterilization is the ur-example; but we also see it in things like the strength of welds.

"You need a process to do a thing" is the specified output. "You can't tell that the thing happened" is the inability to verify. The OE is the validation effort, hopefully including some mathematical understanding of (the confidence and power of) the OE.
 