This thread is carried over and continued in the Current Elsmar Cove Forums
The New Elsmar Cove Forums
Measurement, Test and Calibration
17025 section 5.4.7.2.a
Marc Smith (Cheech Wizard), Posts: 4119
From [email protected] Tue Feb 15 11:11:18 2000 Date: Tue, 15 Feb 2000 10:19:42 -0500 (EST) From: Greg Gogates [email protected] To: [email protected] Subject: 17025 Software Validation Concern Sender: [email protected] Folks, I have been reviewing 17025 section 5.4.7.2.a which states that "computer software developed by the user is documented in sufficient detail and is suitably validated as being adequate for use." There is also a note after this section that states that "COTS software in general use within their designed application range may be considered sufficiently valiated" This clause and it's note are riddled with ambiguities! 1.) What is "sufficient detail"? 2.) What does suitably in "suitably validated" mean? 3.) COTS by definition of "off the shelf" can only have one designed application range. If it was used outside of it's application range then it would be MOTS (Modified Off the Shelf). Hence, this statement is menaingless, I think? 4.) What does "Sufficiently validated" mean? 5.) The note states that if you buy COTS it's OK. Well, I've performed many software vendor audit and have found that many loosly follow (if at all) software validation practices. This leaves us all with a false sense of hope. If I was a lab reading this, I would either ensure I buy COTS software or give my brother-in-law my custom software and have him sell it back to me and claim that it is Commerical Off the Shelf! Assessors would be fooled! As an assessor, I am both disturbed at the subjective statements along with the obviuos holes. This is quite sad considering that software use in the labs is becoming more prevalant. There needs to be some consistant guidance given to both labs and assessors by all accreditation agencies with regard to this subject. I would like to spearhead this discussion as it is near and dear to my software QA background. Please give me your comments. 
Greg

----------snippo-------------
Date: Tue, 15 Feb 2000 12:36:51 -0500

Do the criteria described in the file "adequate_for_use.doc" on your web site still hold?

John
John D. M. Osburn
P: (770) 494-8673

----------snippo-------------
Moderator Note: My criteria still hold for the ftp://ftp.microserve.com/accts/i/iso25/adequate_for_use.doc software validation document. I will be updating it with regard to the stronger/looser criteria in 17025. I am actually looking for everyone's thoughts on this, so that my revised white paper will reflect current thinking.

Greg

----------snippo-------------
Date: Tue, 15 Feb 2000 15:05:05 -0600

Refer to: O. Lopez, "Spreadsheet Qualification Applications," Journal of Validation Technology, Vol. 3, No. 3, 1997. A copy can be purchased at www.ivthome.com/products/publications/articles/ivt0112.cfm?sid=72403&token=0. The Institute of Validation Technology (home page) has many other useful references. Check it out.

Bob Nicolotti, Ph.D.

----------snippo-------------
Date: Tue, 15 Feb 2000 15:08:19 -0600 (CST)

Greg, you have quoted one of the "stock" lines found throughout the "quality scheme" business. The standards are developed by a consensus process, and the language gets softer and softer until it is meaningless, in order to get people to accept it. What does "management shall determine the required level of training of analysts" mean? The list of these fluffy statements is endless, thus my frustration and my desire to put more specific requirements on people, not labs. In your example, specific requirements on software would seem to be appropriate.
David Klein

----------snippo-------------
Date: Tue, 15 Feb 2000 16:28:38 -0800

Greg,

Interesting discussion topic. Software verification and control can be a pricey requirement to fulfill. In my firm's quality system we accept off-the-shelf software (e.g., Lotus 1-2-3 or other approved spreadsheet software, data logger software, or other commercially available software) without requiring further verification. Commercially available is the key; brothers/sisters-in-law don't count.

We use the off-the-shelf (COTS) software as the platform for producing what we call "standard software," which is used to manipulate test data. Standard software is controlled. We conduct independent handwritten verification of all standard software. The verification is a step-by-step independent check of both the logic and the algorithms. The handwritten verification is filed by our QA Administrator with a "Master Copy" of the standard software. A control copy of the standard software is put on the server for use as needed. When someone wants to use standard software, he or she must download a copy from the server for each application. Occasionally, the control number is changed on the master copy and the control copy is resaved/overwritten on the server with the updated master (i.e., reflecting the new control number). Test reports are audited to confirm that the current editions of standard software were used.

Sufficiently validated? How long is a piece of string? Currently, we validate once (i.e., with one data scenario) as a final QC step before approval as standard software. If the logic and algorithms are consistent with the test method, and if the hand-calculated proof returns the same answer as the standard software, we accept it. By way of confession, I have not demonstrated that validating with one data scenario is sufficient as a final QC verification for approval as standard software. I am interested in hearing from other labs on this subject.
Thanks

----------snippo-------------
Date: Wed, 16 Feb 2000 15:54:16 +1000

It is true that the Standard is vague and open to interpretation in many areas. However, our assessors circumvent this by requiring us to make an interpretation (one that suits our work practices), and then requiring us to justify that interpretation. Naturally, the justification requires some kind of investigation, measurement, and data, along with the accompanying documentation.

Mark Hanlon

----------snippo-------------
Moderator Note:
Date: Wed, 16 Feb 2000 08:01:12 -0500

Greg,

As someone who has spent a considerable amount of time writing spreadsheet applications for the chem lab, I would propose a simple validation scheme. First, sample calculations should be assembled, verified, and kept on file. To "test" the spreadsheet, one would type in the sample set of numbers and verify that the results agree with the manual calculations. I think this would be sufficient.

George Hueber
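[Editor's illustration] George's scheme (assemble and verify sample calculations, keep them on file, re-enter them, and compare against the spreadsheet's output) can be made repeatable as a small script. The formula and numbers below are purely illustrative, not from any poster's lab; the point is the shape of the check: a filed set of hand-verified inputs and results, replayed against the calculation on demand.

```python
# Hypothetical spreadsheet formula, reimplemented here only for checking:
# blank-subtracted reading scaled by the dilution factor.
def dilution_corrected_conc(reading: float, blank: float, dilution: float) -> float:
    return (reading - blank) * dilution

# The sample calculations "kept on file": inputs plus the hand-worked result.
SAMPLE_SETS = [
    # (reading, blank, dilution, hand_result)
    (0.500, 0.010, 10.0, 4.90),
    (0.010, 0.010, 10.0, 0.00),
    (2.000, 0.000, 100.0, 200.00),
]

def spreadsheet_agrees(tol: float = 0.005) -> bool:
    """Replay every filed sample set and compare to the manual answer."""
    return all(
        abs(dilution_corrected_conc(r, b, d) - expected) <= tol
        for r, b, d, expected in SAMPLE_SETS
    )
```

Run after any edit to the spreadsheet logic; a `False` result means the current edition no longer reproduces the filed manual calculations.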
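[Editor's illustration] David's confessed worry, that one data scenario may not be enough for final QC approval, can be eased by filing several hand-calculated proofs that bracket the working range (typical, degenerate, and high-end cases) rather than a single mid-range point. A sketch under assumed names (the `tensile_strength` formula and the reference values are hypothetical, not from the thread):

```python
import math

def tensile_strength(load_n: float, width_mm: float, thickness_mm: float) -> float:
    """Hypothetical 'standard software' calculation: stress in MPa
    from peak load (N) and rectangular specimen cross-section (mm)."""
    area_mm2 = width_mm * thickness_mm
    return load_n / area_mm2  # N/mm^2 is numerically MPa

# Hand-calculated proofs filed with the master copy, e.g.:
# 5000 N / (10 mm * 2 mm) = 250.0 MPa
HAND_PROOFS = [
    {"load_n": 5000.0, "width_mm": 10.0, "thickness_mm": 2.0, "expected_mpa": 250.0},
    {"load_n": 1234.0, "width_mm": 12.5, "thickness_mm": 3.2, "expected_mpa": 30.85},
]

def validate(tolerance: float = 0.01) -> bool:
    """Approve only if every scenario matches its hand proof within tolerance."""
    return all(
        math.isclose(
            tensile_strength(c["load_n"], c["width_mm"], c["thickness_mm"]),
            c["expected_mpa"],
            abs_tol=tolerance,
        )
        for c in HAND_PROOFS
    )
```

Adding a scenario is just appending another filed proof, so widening coverage beyond one data set costs little once the harness exists.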
All times are Eastern Standard Time (USA)
Your Input Into These Forums Is Appreciated! Thanks!
