Determining that an as-received OOT condition found during calibration did not affect measurements taken prior to calibration.

ChrisM242021

Registered
I am a CMM/Metrology Technician in the automotive industry. My company recently underwent certification to IATF standards. During our formal audit it was brought to our attention that our Romer arm, which was recently calibrated by the OEM, had an as-received OOT condition. As a result, we are required to verify in writing that the OOT condition did not affect our dimensional assessments conducted prior to sending the Romer arm out for calibration.

As a Journeyman Calibration Technician trained by the Navy, I was taught that a Golden Standard (Type 1 level) was considered 10:1, a Calibration Standard (Type 2 level) was considered 5:1, and a Working Standard (Type 3 & 4 level) was 4:1. However, I have been unable to locate that information anywhere outside of my own memory. I did find a thread here that alluded to the same understanding. Does such a standard exist, and if so, where can I find it for referencing purposes?

For the sake of discussion, the OOT condition is in the neighborhood of 10 microns, while our tightest part tolerance is no less than +/- 0.5 mm. Would it be acceptable to simply state that the ratio between the OOT condition and the minimum customer tolerance meets a 50:1 ratio and therefore would not influence the stated results of measurements taken prior to the Romer arm calibration?
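To show my arithmetic, here is a quick sketch of where the 50:1 figure comes from (variable names are just mine for illustration):

```python
# Illustrative arithmetic only: 10 micron OOT condition vs. the
# tightest customer tolerance of +/- 0.5 mm.
oot_mm = 0.010        # as-received OOT condition, 10 microns expressed in mm
tolerance_mm = 0.5    # one side of the +/- 0.5 mm tolerance band

ratio = tolerance_mm / oot_mm
print(f"{ratio:.0f}:1")  # -> 50:1
```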
 

Funboi

On Holiday
ChrisM242021 said: "...Would it be acceptable to simply state that the ratio between the OOT condition and the minimum customer tolerance meets a 50:1 ratio and therefore would not influence the stated results of measurements taken prior to the Romer arm calibration?"
In essence, yes. Create a worksheet/record, show your data analysis, and have someone with appropriate authority/competency sign off. That should take care of it; also make sure this practice is covered in your procedure.
 

ChrisM242021

Registered
Thank you for your reply. I settled on using the TUR as the acceptability standard, wrote a memo describing those measurements and the use of the TUR, showed the ratio to be far in excess of 10:1, and formally accepted the Romer arm as valid for use during the period leading up to its calibration and subsequent correction.
 

Mike S.

Happy to be Alive
Trusted Information Resource
In many real-world cases, 4:1 is about all you can get, and is reasonable.
 

Miner

Forum Moderator
Leader
Admin
You can take an alternative approach if you retained the data for all of the measurements taken between calibrations. Simply add (or subtract) the discrepancy to/from all of your values, then determine whether the adjusted values would have changed the decisions made regarding the acceptability of the product.
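As a rough sketch of that re-check (all names and values below are illustrative, not from any standard):

```python
# Rough sketch: shift each retained reading by the known discrepancy
# (sign as appropriate) and flag any part whose accept/reject decision changes.
def recheck_decisions(readings_mm, discrepancy_mm, lower_mm, upper_mm):
    """Return (original, adjusted) pairs whose acceptance decision flips."""
    flipped = []
    for value in readings_mm:
        adjusted = value - discrepancy_mm
        original_ok = lower_mm <= value <= upper_mm
        adjusted_ok = lower_mm <= adjusted <= upper_mm
        if original_ok != adjusted_ok:
            flipped.append((value, adjusted))
    return flipped

# Example: 10 micron discrepancy on a 25.0 mm +/- 0.5 mm feature
print(recheck_decisions([24.505, 25.100, 25.496], 0.010, 24.5, 25.5))
# Flags the 24.505 reading, whose adjusted value falls below the lower limit;
# only readings within 10 microns of a limit can flip.
```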

And remember, you are not required to prove "...that the OOT condition did not affect our dimensional assessments..." Obviously, it did affect them. What you do have to provide is a risk assessment regarding the acceptance decisions that you made based on the measurements.
 

jerry bambach

Registered
This will require a CAR (Corrective Action Report) to document an impact study. If the determination is that you still have sufficient accuracy for a 5:1 TUR, then that is what you state on the CAR. You would then close the CAR as investigated and show it closed on your CAR log.
This is how it is addressed in both ISO/IEC 17025 and ISO 9001.
Jerry Bambach
 

Mike S.

Happy to be Alive
Trusted Information Resource
This will require a CAR (Corrective Action Report)...

Ummmm...you're gonna have to "show me the shall" on that one. I see nothing in the standard requiring a CAR.

You can "determine if the validity of previous measurement results has been adversely affected" without a CAR.
 

Ron Rompen

Trusted Information Resource
jerry bambach said: "This will require a CAR (Corrective Action Report) to document an impact study... This is how it is addressed in both ISO/IEC 17025 and ISO 9001."
Jerry, I disagree with your statement. ISO 9001:2015, 7.1.5.2: 'The organization shall determine ... and shall take appropriate action as necessary.'
IATF 16949:2016, 7.1.5.2.1 c): 'an assessment of the risk of the intended use of the product ... condition'.

Nowhere does it state that a CAR is required. Yes, there are actions that shall (should/could) be taken, but a CAR is not one of the mandatory ones.
 