Seeking reference guides / documentation / tips on verification best practices

Given2Fly

Registered
Hi Everyone,

Having recently moved into the design verification space, though within the same medical device company, I am interested in finding some reference documentation I could read up on to guide me through verification best practices. Test method development, planning, data analysis, reporting, and change control/regression analysis are the sorts of topics I am looking for. Appreciate the help.
 

Tidge

Trusted Information Resource
If you can look past the title and specific subject matter, Roger Pressman's Software Engineering: A Practitioner's Approach (many editions exist; I like the 5th) has many chapters dedicated to diverse elements of planning, design, testing, etc. that are applicable to all projects, not just software. The chapter on testing covers fundamentals that apply to more than just software, although some of what he describes has a relatively unique application (at least in terminology) to software.

One of the things I like best about the Pressman book is that it is so complete, in so many different dimensions. For example, his chapters on projects should be required reading for anyone who has to evaluate, plan, or execute projects (not just software projects). A grasp of the material he presents will allow the student not only to recognize a project that is destined for trouble, but also to recover from disasters in the making.
 

yodon

Leader
Super Moderator
One of the biggest issues I see is the collection of objective evidence. I see a lot of "did the system do this?" ... "yep" (with no evidence to support the conclusion). We develop protocols that drive the collection of objective evidence. On the flip side is collecting too much data; I see this especially in software testing, where a screenshot is taken of every screen transition.
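To make that concrete, here is a minimal sketch (in Python, with invented names and fields) of a protocol step record that simply cannot pass without attached objective evidence; the point is the structure, not the specific fields.

# Illustrative sketch only: a protocol step that cannot "pass" without
# attached objective evidence. Names and fields are hypothetical.
from dataclasses import dataclass, field

@dataclass
class StepResult:
    step_id: str
    expected: str                       # acceptance criterion from the protocol
    observed: str                       # what the tester actually measured/saw
    evidence: list[str] = field(default_factory=list)  # file refs, log excerpts

    def passed(self) -> bool:
        # A bare "yep" is rejected: no evidence, no pass.
        return bool(self.evidence) and self.observed == self.expected

The screenshot-flood problem is the mirror image: the protocol should name which evidence each step requires, rather than capturing everything.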

Another issue I see is testing the implementation as opposed to the requirements. The implementation can change and still meet the requirement, but there is now a discrepancy with the protocol.
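A hypothetical illustration of the difference, using an invented requirement and an invented pump object (pytest-style, with the pump assumed to come from a fixture; none of this is from an actual protocol):

# REQ-042 (invented): "The pump shall raise an audible alarm within
# 5 seconds of detecting an occlusion."

def test_occlusion_alarm_against_requirement(pump):
    # Asserts only the externally observable behavior the requirement
    # specifies; internals can be redesigned without breaking the protocol.
    pump.simulate_occlusion()
    assert pump.alarm_latency_seconds() <= 5.0

def test_occlusion_alarm_against_implementation(pump):
    # Fragile: asserts a private implementation detail. The requirement
    # can still be met after a refactor while this step is now wrong.
    pump.simulate_occlusion()
    assert pump._controller.state == "ALARM_STATE_3"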

If it isn't obvious already, you'll need to be able to show traceability between tests and requirements.
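A toy check of that traceability, assuming each test declares the requirement IDs it verifies (IDs and structure are invented):

# Toy traceability check; requirement and test IDs are invented.
requirements = {"REQ-001", "REQ-002", "REQ-003"}
test_to_reqs = {
    "TC-101": {"REQ-001"},
    "TC-102": {"REQ-002", "REQ-003"},
}

covered = set().union(*test_to_reqs.values())
print("Untraced requirements:", sorted(requirements - covered))
print("Tests citing unknown requirements:",
      sorted(t for t, reqs in test_to_reqs.items() if reqs - requirements))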

Data integrity is always necessary. Get to know ALCOA (Attributable, Legible, Contemporaneous, Original, Accurate). This also covers ensuring units are captured, precision is correct, etc.

Somewhat along those same lines is ensuring that equipment used is qualified, identified / identifiable, capable (precision), and in calibration if required. All this should be documented as part of the test data.
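Tying those last two points together, one way to make the ALCOA attributes and equipment status hard to omit is to bake them into the raw measurement record itself. A sketch with illustrative field names:

# Sketch only: a measurement record that carries ALCOA-style fields and
# equipment status along with the data. Field names are illustrative.
from dataclasses import dataclass
from datetime import date, datetime

@dataclass(frozen=True)
class Measurement:
    value: float
    unit: str              # e.g. "kPa" -- never record a bare number
    recorded_by: str       # Attributable
    recorded_at: datetime  # Contemporaneous
    instrument_id: str     # identified / identifiable equipment
    cal_due: date          # calibration status travels with the data

    def instrument_in_cal(self) -> bool:
        return self.recorded_at.date() <= self.cal_due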

We incorporate a reviewer in our process who looks at the evidence to determine whether they can come to the same conclusion the tester made, that the results have data integrity, and that all the i's are dotted and t's crossed.
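Extending the step-record sketch above, that review step might be modeled as a gate that re-walks the evidence rather than just checking initials (again, entirely illustrative):

# Illustrative review gate over the StepResult records sketched earlier:
# sign-off is withheld unless the reviewer can reach the same conclusion
# from the attached evidence.
def independent_review(results, reviewer):
    findings = []
    for r in results:
        if not r.evidence:
            findings.append(f"{r.step_id}: no objective evidence attached")
        elif not r.passed():
            findings.append(f"{r.step_id}: observed result does not meet the acceptance criterion")
    return {"reviewer": reviewer, "approved": not findings, "findings": findings}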
 

Tidge

Trusted Information Resource
We incorporate a reviewer in our process who looks at the evidence to determine whether they can come to the same conclusion the tester made, that the results have data integrity, and that all the i's are dotted and t's crossed.

This is an item of MAJOR importance for me. I inherited a group (people, practices, procedures) that nominally had an "independent review" of a tester's assessment but practically offered no assessment of the actual results beyond "did they initial the test step correctly?" I'd like to report that this useless attitude has been re-aligned, but I am still encountering the problem. For young employees, after an explanation from first principles, I ask whether they want recognition and promotion for having demonstrable expertise, or are hoping to advance because they are good at "pencil-whipping." For older employees I am more blunt: you can stick with the old ways, forgo advancement, and get treated to the exact same monologue from me every time, or you can generate objective evidence that you really are smarter than the rest of us by doing the proper assessments.
 

Given2Fly

Registered
Thanks Tidge and yodon for the helpful responses - I appreciate you taking the time to cover the key aspects of good verification practice. I can certainly see the need to ensure data has been gathered through appropriate test methods and objectively reviewed by independent reviewers. We have set up this process too, and it seems to work well, though not without challenges: program pressures/expectations, plus product and test complexity, create a steep learning curve for the independent reviewers. Coupled with the "pencil-whipping" Tidge rightly points out, that makes for a strained process. I will take a look at Software Engineering: A Practitioner's Approach - if there are any other references, I'd love to hear them.
 

Hi_Its_Matt

Involved In Discussions
It may be more high-level than you are looking for, but check out "Thinking Ahead to System Verification and System Validation" by Louis S. Wheatcraft of Requirements Experts. It provides a good overview of the principles and of how to start thinking about verification/validation during the project planning and requirements definition stages.

I will say that the terms and definitions Lou uses differ from those I commonly see, but the guidance is still good, and you should be able to map the terms he uses to the ones your organization uses.

(For example: the companies I have worked for haven't used the terms "requirements verification" or "requirements validation," instead just talking generally about "requirements review." And I think most med dev companies would use "Design Verification / Validation" where he uses "System Verification / Validation.")
 