"Uncertainty of the measurement standards shall not exceed 25% of the
acceptable tolerance." To elaborate, this statement is somewhat misleading: it is the measurement uncertainty (MU) of the entire calibration process that should not exceed 25% of the acceptable tolerance, not just the uncertainty of the master standard. The idea is to take into account all the major sources of uncertainty, including environmental and operator effects.
So you would need to know the accuracy of your instrument. In the case of your micrometer, let's say you have an accuracy of +/- 0.0001"; you would then need a calibration process with an uncertainty of less than 0.000025". At that point you would need to know the uncertainty of your calibration process and ensure it stays below the allowable 0.000025" if you want to meet the 4:1 TUR.
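The arithmetic above can be sketched in a few lines. This is just an illustration of the 25% rule using the micrometer numbers from this thread; the function name `tur` and the variable names are my own, not from any standard:

```python
def tur(tolerance, cal_uncertainty):
    """Test Uncertainty Ratio: acceptable tolerance of the unit
    under test divided by the uncertainty of the calibration process."""
    return tolerance / cal_uncertainty

tolerance = 0.0001                    # micrometer accuracy, +/- 0.0001 in
max_cal_uncertainty = 0.25 * tolerance  # 25% rule -> 0.000025 in

print(max_cal_uncertainty)            # allowable process uncertainty
print(tur(tolerance, max_cal_uncertainty))  # ratio at the 25% limit
```

A process uncertainty right at 25% of tolerance gives a ratio of exactly 4:1; anything smaller than 0.000025" gives a ratio better than 4:1.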
The two items that come into play are your instrument's accuracy and the uncertainty of your calibration process.
Please discuss further if this doesn't sound correct to you. I don't deal with TUR on any significant basis, so I may not be completely on par when it comes to its practice.
Seems a handy description of the TUR idea. It's easily confused with TAR, but accuracy and uncertainty are two different things.