In Reply to Parent Post by Dongzkie
Okay sir, I understand now. I'm just very confused right now, right after attending my seminar on Industrial Calibration, because I asked the speaker about this issue and he said that my pin gage must be more accurate than 0.001 mm (micrometer accuracy), so it must be 0.5 microns.
Well - go to a metrology-type seminar and you get a metrology-type answer...
In other words, the answer relates to what is necessary to calibrate the instrument based on the instrument's designed (or potential) capability, not to the capability actually needed for how the instrument is being used.
If the instrument's "potential" is three decimal places, then the testing standard needs to be accurate to four decimal places in order to assure that the instrument is able to perform up to its full capability.
However, if the accuracy requirement of the work is only one decimal place, then a testing standard with an accuracy of two decimal places is sufficient to assure that the instrument performs to the needed capability.
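The "one more decimal place" rule of thumb above is roughly a 10:1 test accuracy ratio (TAR) between the instrument's tolerance and the standard's tolerance. A minimal sketch of that arithmetic (the function name, the 10:1 default, and the 4:1 alternative are my own illustration, not anything from the seminar):

```python
def required_standard_tolerance(instrument_tolerance_mm: float,
                                tar: float = 10.0) -> float:
    """Tolerance the reference standard must hold to verify an
    instrument at the given test accuracy ratio (TAR).

    tar=10 matches the "one more decimal place" rule of thumb;
    some labs accept a 4:1 ratio as a practical minimum.
    """
    return instrument_tolerance_mm / tar


# Instrument used to its full 0.001 mm potential:
# the standard must be good to about 0.0001 mm.
assert abs(required_standard_tolerance(0.001) - 0.0001) < 1e-9

# Work that only needs 0.1 mm accuracy:
# a 0.01 mm standard is already sufficient.
assert abs(required_standard_tolerance(0.1) - 0.01) < 1e-9
```

The point of the sketch is the same as above: the required standard is driven by how the instrument is actually used, not by its best-case resolution.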
So the answer you received at the seminar was not wrong. It simply exceeded what is needed for your application; it did not apply common sense to the equation.
I hope this helps some.