Verification Protocols - when to implement them (and when to skip them in favor of a report)

placebo_master

Starting to get Involved
This question concerns 21 CFR 820.30(f) - Design Verification (within Subpart C, Design Controls):

I'm working for a client that is assembling their 510(k) submission to the FDA for a Class II IVD medical device. That effort includes writing and executing verification protocols, then summarizing the results in verification reports.

Recently, they've tasked me with writing protocols for verifying component specifications. For example, let's say they need to verify that their computer system has a hard drive of a certain size and read/write speeds. Their verification plan dictates that the verification protocol shall instruct a tester to obtain the specification datasheet for the hard drive, record their findings in a datasheet, then verify that the specifications meet their size and read/write requirements. A verification report would then follow that summarizes the tester's findings.

What I'm wondering is whether 21 CFR 820 requires a protocol in this context. Isn't it possible to skip the protocol and jump right to writing a technical report that serves as proof that verification activities were performed--those activities being me, the author, collecting and synthesizing specification datasheet information that demonstrates the hard drive requirements are fulfilled? My interpretation of Subpart C (f) is that it should be possible to skip a protocol as long as the necessary information is communicated in the technical report, but I'd like to hear critiques of this interpretation (if any exist).
 

pziemlewicz

Involved In Discussions
I write protocols for validation testing, but not verification. The technical report you suggest only needs to meet the items outlined below:
(f) Design verification. Each manufacturer shall establish and maintain procedures for verifying the device design. Design verification shall confirm that the design output meets the design input requirements. The results of the design verification, including identification of the design, method(s), the date, and the individual(s) performing the verification, shall be documented in the DHF.

You do have to follow the Design Control procedure the company established: it could be that the procedure itself is overly burdensome.
 

Tidge

Trusted Information Resource
Specifications (such as "minimum size of a hard drive") are almost certainly NOT design inputs. It is very likely that such a specification isn't even a critical element of satisfying a design input. It is necessary to establish the provenance of all components that are used in design verification (and validation) builds, but the individual inspection records (which is what I read the OP as providing) strike me as having very little prima facie value for the submission. I have typically seen production DHR (or records equivalent to DHRs) that match an established DMR in submissions, but not the equivalent of inspection records in 510(k) submissions.

Having written the above: IF the risk file specifically identifies some specific feature of a design output as the sole control for a risk that would be otherwise unacceptable (in the absence of such a control, or in the presence of a non-conformance for that specific characteristic) I can see an argument to be made that the 510(k) submission could include a directory of evidence of implementation of risk controls... but these would be instances of records of the (and methods of) inspection and not just a one-time report.
 

placebo_master

Starting to get Involved
I think both of you have touched on important issues that I've noticed with this client as well, so I'm thankful for this feedback you've both provided me.

You do have to follow the Design Control procedure the company established: it could be that the procedure itself is overly burdensome.

This is an accurate assessment of the situation. This is a startup company producing their first device, so their SOP on this topic is more conservative in how it prescribes the use of protocols.

Specifications (such as "minimum size of a hard drive") are almost certainly NOT design inputs. It is very likely that such a specification isn't even a critical element of satisfying a design input.

In this context I think the hard drive does satisfy a critical element of a design input, but I see your point about how the "minimum size" attribute shouldn't be treated as a design input itself. This is a recurring issue, but one that would be incredibly difficult to mitigate at this point in development. Good advice for them for whatever device comes next, perhaps :)
 

Tidge

Trusted Information Resource
In this context I think the hard drive does satisfy a critical element of a design input, but I see your point about how the "minimum size" attribute shouldn't be treated as a design input itself.

Counter-point: If the HDD is purchased to meet the minimum spec, but is then later partitioned (or otherwise "fills up") to have "too small" disk partitions, it would still meet the purchase requirements but it may not be able to support the application. Purchasing inspections are in my experience not sufficient (by themselves) to demonstrate that a product will meet user needs.
 