Is Design Verification Software Required to be Calibrated?

Chennaiite

Never-say-die
Trusted Information Resource
#1
We use quite a few software packages for design analysis, such as FEA, crash analysis, etc. The software gets updated periodically over the internet, something like 'Windows updates'. Do we need to periodically confirm the *ability of this software to satisfy the intended application* (quoting ISO/TS 16949)?
According to the software developer, this is uncalled for.
Any guidance on managing such software would help.
Thanks in advance
 
#2
Hello! Software isn't 'calibrated' as such. However, it can be verified as performing correctly, and developers often overlook this small but important point (they tend to believe there's nothing wrong with their product).

You can run some numbers through the software and compare them with results obtained by another method - or better still, ask the developer for their results! Somehow, at some time, they must have tested whether the figures in the output were mathematically correct. If they didn't do it and can't provide you with the results, then you should do it yourself (if only because they seem not to understand that it should be done!)
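As a minimal sketch of that "compare against another method" idea, one known test case for structural software is a cantilever beam with a tip load, where a closed-form answer exists. All values below are made-up illustration inputs, not from any particular FEA package: the curvature M(x)/(EI) is integrated twice numerically and compared against the textbook tip deflection P·L³/(3·E·I).

```python
# Hedged illustration: verify a numerical bending calculation against the
# closed-form cantilever tip deflection P*L^3 / (3*E*I).
P = 1000.0    # tip load, N (assumed)
L = 2.0       # beam length, m (assumed)
E = 210e9     # Young's modulus, Pa (steel, assumed)
I = 8.0e-6    # second moment of area, m^4 (assumed)

n = 10000
dx = L / n
slope = 0.0   # v'(x), zero at the fixed end
defl = 0.0    # v(x), zero at the fixed end
for i in range(n):
    x0, x1 = i * dx, (i + 1) * dx
    # curvature v'' = M/(EI), with M(x) = P*(L - x) for a tip load
    k0, k1 = P * (L - x0) / (E * I), P * (L - x1) / (E * I)
    new_slope = slope + 0.5 * (k0 + k1) * dx   # trapezoidal step for v'
    defl += 0.5 * (slope + new_slope) * dx     # trapezoidal step for v
    slope = new_slope

analytic = P * L**3 / (3 * E * I)
# For this step size the two answers should agree very closely; a large
# discrepancy would point to a bug in one of the two calculations.
print(defl, analytic)
```

The same pattern applies to any package: pick a case with a hand-calculable answer, run it through the tool, and compare.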
 

Sidney Vianna

Post Responsibly
Leader
Admin
#3
Please note that this requirement of the TS 16949 standard is in section 7.6, which deals primarily with software used to test and/or inspect products. The type of software you are using for finite element analysis, structural analysis, etc., doesn't fall into that category.

Imagine having to validate MS Excel, for example, when you use a sophisticated spreadsheet to make calculations for a product design. That is not a requirement of the standard.
 

Chennaiite

Never-say-die
Trusted Information Resource
#4
That said, can we safely consider that the design validation process itself is a sort of verification of that software and of complicated Excel calculations? When validation fails, one of the causes, rare though it may be, could be wrong results produced by the software. For analysis software in particular, we have a target for correlation between the result achieved theoretically and the one obtained through testing. When the target is not met, the subsequent cause investigation will reveal the bug in the software. Does this help?
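The correlation check described above can be sketched in a few lines. This is an illustration only; the 0.9 target and the function names are assumptions, not anything prescribed by TS 16949:

```python
# Hedged sketch: compare analysis predictions against physical test results
# and flag when the correlation falls below an assumed target.
import math

def pearson_r(predicted, measured):
    """Pearson correlation between predicted and measured values."""
    n = len(predicted)
    mp = sum(predicted) / n
    mm = sum(measured) / n
    cov = sum((p - mp) * (m - mm) for p, m in zip(predicted, measured))
    sp = math.sqrt(sum((p - mp) ** 2 for p in predicted))
    sm = math.sqrt(sum((m - mm) ** 2 for m in measured))
    return cov / (sp * sm)

def meets_correlation_target(predicted, measured, target=0.9):
    """True when theory and test correlate at least as well as the target."""
    return pearson_r(predicted, measured) >= target
```

A failed check would then trigger the cause investigation the post describes, which may or may not point back to the software.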
 

BradM

Leader
Admin
#5
I would say it depends. First, I'm assuming the software received an initial qualification against an identified functional specification. So say everything is in good shape: you have verified that a particular software application on a certain machine, with a certain operating system, etc., produces these outputs given these inputs. OK, so far, so good.

Now, some updates to the application may affect that verification, and some may not. For example, if some hot fixes address issues with the application running on a middle-tier server, a full performance qualification may not be necessary. However, some abbreviated level of testing may be in order to assure the intended level of service is not interrupted.

Any level of qualification/testing should be contingent on the upgrade. If there is a new version with all kinds of bells and whistles (which you might intend to use), new this and new that, your system may require extensive testing.

If you followed a testing plan laid out by the manufacturer, and the manufacturer can provide you documentation that a particular rollout will not affect things, that is good justification for very abbreviated testing.
But I would still always perform some abbreviated observational testing,
just to make sure... nothing changed. :)
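The abbreviated post-update check described above can be sketched as a simple baseline comparison: re-run a small set of known test cases after each update and compare against results stored before it. File names and tolerances here are assumptions for illustration:

```python
# Hedged sketch of an abbreviated regression check after a software update.
import json
import math

def compare_to_baseline(baseline_path, new_results, rel_tol=1e-9):
    """Return (case, baseline, new) triples that drifted past rel_tol.

    baseline_path: JSON file of {case_name: expected_value} saved pre-update.
    new_results:   dict of {case_name: value} from the re-run after the update.
    """
    with open(baseline_path) as f:
        baseline = json.load(f)
    drifted = []
    for case, old in baseline.items():
        new = new_results.get(case)
        if new is None or not math.isclose(old, new, rel_tol=rel_tol):
            drifted.append((case, old, new))
    return drifted
```

An empty result supports the "nothing changed" conclusion; any drifted case is a reason to escalate to fuller testing.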
 

Chennaiite

Never-say-die
Trusted Information Resource
#6
All nice insights into the subject, really.
I get the feeling that relying on physical test data to verify the software's conformance to its intended use could be the most appropriate mechanism. Performing an independent test of the software could be a touch tedious, given that it provides a lot of parameters as output. It's like a machine inspecting 100 visual parameters, which may require as many limit samples.
 

ncwalker

#7
A long time ago, I used to write heavy-duty calculation software. We tested it: fed it known inputs and verified the outputs. By 'verified', I mean we used alternate means to calculate the results. And our customers would NEVER blindly accept our updates without retesting. But it was a highly regulated industry.

More recently, for grins, I fed the MSA manual's example Gauge R&R numbers into a piece of Gauge R&R software we purchased from a professional company. The results didn't match.

Having written such code myself (years ago) and witnessed supposedly professional-grade (tested?) software fail, I can attest to how easy it is to have a bug in the software.

You would be foolish, foolish, foolish not to test.

But your problem is FEA. And while the method is the same, my gosh, how many individual results must you have? I don't know what you're using it for, but in a simple model every node would have at least 3 pieces of data attached to it, if not 12 or more. Verifying all of that would be daunting.

I would try this: these packages usually consist of three parts: a pre-processor (which builds the mesh, displays it graphically, and allows you to assign the initial conditions), a processor (which goes away and does the calculations), and a post-processor (which takes the big results file and displays the pretty colored pictures).

Well, it's really the output of the processor you're concerned with, and that is some file of results that the post-processor reads. I would run some model before updates (a simple one, since processing time is expensive, but not TOO simple), generate that output file, and store it. Then run the same model AFTER updates and use a utility to compare the two files bit by bit. Unless the update does something to the header of the output file, it really shouldn't change anything in the results. The software vendor ought to be able to tell you whether that is the case. You may also be able to open the files in a text editor and check for yourself whether the headers differ.

And a bit-by-bit comparison of the output files before and after wouldn't take any longer than copying them somewhere (which is essentially moving them bit by bit).
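The bit-by-bit comparison described above can be sketched by hashing the two result files. The paths and header size here are illustrative assumptions; real solver output formats vary, and the vendor would need to confirm whether a version header should be skipped:

```python
# Hedged sketch: detect whether a solver's results file changed across an
# update, optionally ignoring a fixed-size version header at the front.
import hashlib

def file_digest(path, skip_header_bytes=0, chunk=1 << 20):
    """SHA-256 of a file's contents, read in chunks, skipping an optional header."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        f.seek(skip_header_bytes)
        while True:
            block = f.read(chunk)
            if not block:
                break
            h.update(block)
    return h.hexdigest()

def results_unchanged(before_path, after_path, skip_header_bytes=0):
    """True when the two result files are byte-identical past the header."""
    return (file_digest(before_path, skip_header_bytes)
            == file_digest(after_path, skip_header_bytes))
```

Comparing digests is equivalent in outcome to a byte-by-byte comparison for this purpose, and it lets you store just the baseline hash instead of the whole baseline file.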
 

Chennaiite

Never-say-die
Trusted Information Resource
#8
That's a lot of information, ncwalker. Thanks. I'll take it up with the right person, who can explore this to our benefit.
 