
Vision Inspection System Validation


makingchanges

#1
I am in the process of implementing an Oasis vision inspection system for inspecting small threaded fasteners. We had previously decided to validate each program by alternative inspection methods (micrometer, caliper, etc.) as each program is created individually. If a program is then copied and only one dimension changed, only that one dimension within the program would be validated.

The programs are set up by creating the appropriate feature box and then the identified tolerances. An actual measurement is reported, with pass or fail based on the tolerance.

As we progress through the creation of programs, the question has arisen whether each program requires validation, or whether top-level features such as diameters, lengths, etc. can be validated instead. How are others managing this requirement? We currently do not have any customers with tight software controls, but we are looking to enter that market more strongly, and I do not want to backtrack on the programs already created.
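The accept/reject logic described above (a measured value checked against an identified tolerance) can be sketched in a few lines. This is a minimal illustration only; the function name and tolerance conventions are hypothetical, not taken from the Oasis software.

```python
# Minimal sketch of per-feature accept/reject logic: a measurement is
# compared against a nominal with plus/minus tolerances. Names and
# values are illustrative, not from the Oasis system.

def check_feature(measured, nominal, tol_minus, tol_plus):
    """Return 'pass' or 'fail' for one measured dimension."""
    low = nominal - tol_minus
    high = nominal + tol_plus
    return "pass" if low <= measured <= high else "fail"

# Example: a major diameter of nominal 6.00 mm with +/-0.05 mm tolerance
print(check_feature(6.03, 6.00, 0.05, 0.05))  # -> pass
print(check_feature(6.07, 6.00, 0.05, 0.05))  # -> fail
```

Validating a program then amounts to confirming, with an independent method (micrometer, caliper), that each such feature check reports the correct result on known-good and known-bad parts.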
 

Bev D

Heretical Statistician
Staff member
Super Moderator
#2
Why wouldn't you validate each feature inspected in the program?
(an ounce of validation is worth a ton of corrective action...)

I would tend towards at least validating the most difficult to inspect features...
 

makingchanges

#3
My only hesitation with that is that validating a feature once doesn't mean the feature is created correctly in each subsequent program, hence the thought of individual program validation. However, when you deal with thousands of part numbers this gets very redundant, and I can see the benefit of feature validation.
 

Ninja

Looking for Reality
Trusted
#4
Howdy makingchanges,

Welcome to the Cove!

I don't use the Oasis, I use a MicroVu Vertex...but the concept of using a programmed set of tasks to generate a measured output is exactly the same (it is just how the tool measures that is different).

My thought process runs this way:

Each program you write is a piece of software which generates outputs (data) that will be used for accept/reject decisions and/or fitness for use decisions.

Therefore the software must be validated, and the measurement accuracy must be established and compared to Spec.

Therefore thread A: Each program you write is its own piece of software, so every program must be validated separately for every feature it measures.
(When working with thousands of programs you might consider taking a "copy and paste" shortcut...but don't lose sight of the fact that it IS a shortcut, and it leaves you with a "should be OK" rather than an "OK".)

Therefore thread B: The data outputs need to be accurate, so a "known" is put into the mix when doing GRR, so that we are not only measuring reproducibility and repeatability but also getting a reality check on accuracy.
Repeatably and reproducibly wrong doesn't get the job done.
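The point about putting a "known" into the GRR can be illustrated with a small sketch: measure a certified reference alongside the study so bias (accuracy) falls out of the same data. All values here are hypothetical, and the statistics are deliberately simplified; a real study would follow the full AIAG MSA procedure.

```python
import statistics

# Hypothetical readings of one certified reference pin (known value
# 5.000 mm) taken by two operators. Numbers are illustrative only.
reference = 5.000
operator_a = [5.002, 5.001, 5.003, 5.002]
operator_b = [4.999, 5.000, 5.001, 5.000]

all_readings = operator_a + operator_b

# Accuracy check: how far the overall average sits from the known value
bias = statistics.mean(all_readings) - reference

# Crude repeatability proxy: worst within-operator standard deviation
repeatability = max(statistics.stdev(operator_a), statistics.stdev(operator_b))

# Crude reproducibility proxy: offset between operator averages
reproducibility = abs(statistics.mean(operator_a) - statistics.mean(operator_b))

print(f"bias = {bias:+.4f} mm")
print(f"repeatability (worst operator stdev) = {repeatability:.4f} mm")
print(f"reproducibility (operator mean offset) = {reproducibility:.4f} mm")
```

A system can show excellent repeatability and reproducibility yet still carry a large bias; only the comparison against the known reference catches that.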
 

makingchanges

#5
Many of the programs are based on part series, so they share a base part number and sizes with just a length change. The thought process was to create and validate the entire base program for that nominal size, then copy and paste and validate only the length change for subsequent programs, thereby validating the only physical change to the program. Even then we are still talking 148 different lengths for one base program, but once the validation is complete it is good to go. I am just trying to prove out my thought process, as there is some challenge to it ("there must be an easier way") and I am sure there is, but I do not want to jeopardize the integrity of the programs themselves and their acceptance of product...hence me asking what others are doing!
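The copy-and-validate-only-the-delta approach can be sketched as tracking which features changed between the base program and each derived one. The data structure and names here are purely illustrative, not how Oasis stores programs; the point is only that the set of features needing fresh validation is exactly the set of edits.

```python
# Hypothetical sketch of "validate only the changed dimension":
# a derived program copies a validated base, applies edits, and marks
# only the edited features as needing re-validation.

base_program = {
    "part": "M6-BASE",
    "features": {"major_dia": 6.00, "pitch_dia": 5.35, "length": 20.0},
    "validated": {"major_dia", "pitch_dia", "length"},
}

def derive(base, part, changes):
    """Copy a base program, apply dimension changes, and carry over
    validation status only for the unchanged features."""
    return {
        "part": part,
        "features": {**base["features"], **changes},
        "validated": base["validated"] - set(changes),
    }

p = derive(base_program, "M6-25", {"length": 25.0})
needs_validation = set(p["features"]) - p["validated"]
print(sorted(needs_validation))  # -> ['length']
```

For 148 lengths off one base, each derived program would then carry a short, auditable list of exactly what remains to be validated, rather than an assumed "should be OK".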
 

Ninja

Looking for Reality
Trusted
#6
Understood.

I validate all parts of each program, cut&paste or not.

In your position, I might do the cut&paste...put the equipment in use based on the reasonable belief that all is good, and then wrap up the number crunching (validation) afterwards.

So far, I have not assumed any part of the validation based on similar programs...though I have caught up the paperwork formality after the fact.
 