Inspection with 3D Scanning - Reference Data

Hello everyone,

Currently, I am in the process of validating a 3D scanner to inspect one-off medical devices against the approved CAD (STL) design.

Background Info:

During the first part of validation, we ran across some strange findings, so we are performing additional studies to fully understand the deviation.
This isn't the whole issue, but, long story short, we have a scannable NIST calibration standard along with STLs of it created in different 3D CAD applications (the STLs were made in-house, based on the calibration standard).

We scanned the calibration standard and are comparing the scan to the STLs (with the STLs set as the reference). We have also matrixed out the differently sourced STLs, setting different STLs as the reference, and in some runs we even set the scan data as the reference against the STL. The results are shown in the following image:

[Attached image: 1547050148306.png]


Question:

With the NX STL set as the measured data and the Freeform STL set as the reference data, the % In Tolerance is 96.6512%.
However, when the same files are swapped (Freeform STL as measured and NX STL as reference), the % In Tolerance is 99.9943%.
Shouldn't the results be the same value? This discrepancy is consistent, yet the same file, when compared to itself, yields 100% In Tolerance.

This issue is going to present a problem for us with our GR&R on actual products, so I am trying to understand how this system actually calculates these values. I don't know where to look for more information, and contacting the vendor/developer directly has only led to more confusion.
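My working assumption, which I would love someone to confirm or correct, is that the software samples deviation points on whichever mesh is set as the measured data and takes each point's distance to the nearest point on the reference surface. Since the NX and Freeform STLs have different triangulations and vertex densities, swapping which file is the reference changes both the sample points and the target surface, so the two percentages would not have to agree. A rough Python sketch of that kind of calculation (the trimesh library and the tolerance value here are purely illustrative, not the vendor's actual algorithm):

```python
# Sketch of my assumed "% In Tolerance" calculation: sample points are taken
# from the MEASURED mesh and the deviation is the distance from each point to
# the nearest point on the REFERENCE surface. trimesh and tol are my own
# illustrative choices, not the vendor's implementation.

import numpy as np
import trimesh

def percent_in_tolerance(measured_path: str, reference_path: str, tol: float = 0.05) -> float:
    measured = trimesh.load_mesh(measured_path)
    reference = trimesh.load_mesh(reference_path)

    # Distance from every vertex of the measured mesh to the closest point
    # on the reference surface (an inherently asymmetric operation).
    _, distances, _ = trimesh.proximity.closest_point(reference, measured.vertices)

    return 100.0 * float(np.mean(distances <= tol))

# Swapping the two files changes both the sample points and the target surface,
# so these two results are not forced to agree:
#   percent_in_tolerance("NX.stl", "Freeform.stl")
#   percent_in_tolerance("Freeform.stl", "NX.stl")
# Comparing a file to itself gives zero distance everywhere, hence 100%.
```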

Any help, guidance, or feedback would be greatly appreciated!!!!

ValGal
 

