The big, expensive set of gage pins is used in making the fixtures and tooling that produce customer product. The fixtures and tooling are verified (or not) by inspecting the product with calibrated measuring equipment. Not all companies look the same. It works for us.
It "works for you" solely because the gages (pins) happen to be accurate. They were verified once upon a time, even if only by the manufacturer, so they presumably are accurate. (Remember, calibration doesn't make a gage accurate; it merely verifies whether it is.)
Otherwise, it wouldn't work for you.
However, if it turned out your gage pins were not accurate, the failure to verify/calibrate them would lead to fixtures and tooling that are not right, which could make the product nonconforming. Only at that point in your explanation would you finally discover that you have a problem. That would be a costly way to find out that the mics your tool guy uses are not correct.
So, does that mean your system is "working for you"? Or is it simply that, at this time, your gages happen to be correct and therefore don't cause a problem?
My house has never been broken into, so I could presume I don't need locks. The only time I really need locks is those few times when a bad guy happens to be in my yard. But since I can't predict that, I need to periodically verify my security is working.
I would propose a different angle to this discussion.
I think we need to make our calibration activities more efficient and develop more effective methods. I suspect we calibrate more than we need to.
Basically, the purpose of calibration is to periodically verify that our gages are capable of giving us readings that are adequately accurate and repeatable for our purposes. But, different measuring systems at different companies have different needs.
Ex 1. I want my watch to tell me the correct time. It is not precisely accurate, but it is adequate for my needs. Even so, I want a reasonably accurate result.
2. If I am using it to time a sports event, there is a greater need for accuracy and precision.
3. And, if I am applying it to NASA or space telescopes, I would need an even higher level of accuracy and precision.
All three levels require the accuracy to be known: the needed level of accuracy must be determined, and the system must be evaluated to confirm that it meets that level.
For level 1, I merely calibrate it to the TV weather channel or my cell phone. That is adequate.
For levels 2 and 3, a more structured, formal calibration approach would be necessary. We can't shoot a rocket into space and merely assume the gages were correct.
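To make "the needed level of accuracy must be determined" concrete, here is a minimal sketch of the common 4:1 accuracy-ratio rule of thumb: the gage should be roughly four times more accurate than the tolerance it is used to verify. The function names and the example numbers are mine, purely for illustration, not from this discussion.

```python
def accuracy_ratio(tolerance_band: float, gage_accuracy: float) -> float:
    """Ratio of the tolerance being checked to the gage's accuracy."""
    return tolerance_band / gage_accuracy

def adequate(tolerance_band: float, gage_accuracy: float,
             min_ratio: float = 4.0) -> bool:
    """Common 4:1 rule of thumb: the gage should be at least
    min_ratio times more accurate than the tolerance it verifies."""
    return accuracy_ratio(tolerance_band, gage_accuracy) >= min_ratio

# A 0.010" total tolerance band checked with a mic accurate to
# 0.0001": ratio of 100, comfortably adequate.
print(adequate(0.010, 0.0001))   # True
# The same tolerance checked with a caliper accurate to 0.003":
# ratio of about 3.3, below the rule of thumb.
print(adequate(0.010, 0.003))    # False
```

The point is only that "adequate" is relative to the tolerance being checked, which is why the watch, the sports timer, and the space telescope need different calibration rigor.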
So, if we understand why we calibrate, and really understand what the requirements specify, then we can fashion a more appropriate approach to calibration.
The issue should not be just whether ISO requires it, but what is best for you. Your gage pins need to be accurate, or the consequences will cost you money and waste. If you bought them with a cert and they haven't been damaged, they are likely still just as accurate as they were; in other words, still in calibration. So you can factor that into your planning for calibration frequency and for how extensive the cal needs to be. Maybe, for lab gage pins, a visual check of condition is enough, if your tolerances are not too precise.
But limiting calibration to final gages only, and assuming all the rest are fine, will eventually lead to problems and waste. Not protecting yourself will lead to problems eventually.
Remember, on Sept. 11, the US government said they "weren't focused on airplane hijackings because we had not had any in 15 years." Not a very effective argument...
I would apply the concept so that the frequency and depth of verification are the variables, rather than excluding gages altogether. If I never check my watch, eventually it will become wrong.
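Treating frequency as the variable can be sketched as a simple "reward/penalize" interval rule, similar in spirit to the simple response methods described in ILAC-G24: stretch the interval after a passing as-found check, pull it in sharply after a failure. The multipliers and limits below are illustrative assumptions, not a recommendation.

```python
def next_interval_months(current_months: int, found_in_tolerance: bool,
                         min_months: int = 3, max_months: int = 36) -> int:
    """Adjust the calibration interval based on the as-found result.
    A pass lengthens the interval by 50%; a failure halves it.
    All numbers here are illustrative, not prescriptive."""
    if found_in_tolerance:
        return min(int(current_months * 1.5), max_months)
    return max(int(current_months * 0.5), min_months)

# A gage on a 12-month cycle, found in tolerance: stretch to 18 months.
print(next_interval_months(12, True))    # 18
# The same gage found out of tolerance: pull back to 6 months.
print(next_interval_months(12, False))   # 6
```

A scheme like this keeps every gage in the system while letting the stable, low-risk ones (like certified lab gage pins that are never abused) earn longer intervals and lighter checks.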