In Reply to Parent Post by howste
Let's assume that the collective uncertainty of the measurement system hasn't been calculated. If a gage block or mass standard (weight) is used for the calibration, what would be the measurement accuracy of the standard? There are no divisions or increments to be read on the standard, so would the "measurement accuracy" be the accuracy of the instrument used to calibrate the standard?
The two standards you mentioned (gage blocks and mass standards) are each calibrated to industry standard specifications: classes for weights, grades for gauge blocks.
Mass standard classifications can be found at Rice Lake and Troemner (the ASTM E617 weight classes, for example). Gauge block grades are listed under ASME B89.1.9-2002, for one.
I know you know some of this... but just for explanation:
The classification of each goes along with how it's constructed: you don't buy an inexpensive set of Class 3 weights and then calibrate them to an Ultra Class tolerance.
Same with gauge blocks. So you send them to a competent (accredited) vendor and have both calibrated, with the actual measured values reported on the certificate.
Depending on your application, the blocks and weights should be at least four times more accurate than what you are verifying (the classic 4:1 test accuracy ratio; see the sketch below).
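If it helps to see that ratio as numbers, here's a minimal sketch. The tolerance figures are invented for illustration (real numbers come from your standard's certificate and your device's spec), and the function name is just mine:

```python
# Minimal sketch of the 4:1 test accuracy ratio (TAR) check.
# These tolerances are invented for illustration; use the actual
# numbers from your standard's certificate and your device's spec.

def test_accuracy_ratio(uut_tol: float, standard_tol: float) -> float:
    """Ratio of the unit-under-test tolerance to the standard's tolerance."""
    return uut_tol / standard_tol

# Example: a micrometer with a +/-0.0001 in tolerance checked against
# a gauge block certified to +/-0.000008 in (both numbers hypothetical).
tar = test_accuracy_ratio(uut_tol=0.0001, standard_tol=0.000008)
print(f"TAR = {tar:.1f}:1 -> {'OK' if tar >= 4 else 'standard not good enough'}")
```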
Think of it like this: you have a standard thermometer (NIST-traceable, of course) and a well-constructed ice bath, and you're calibrating some thermometers. The ice bath is 0 °C, good to about 0.01 °C. It doesn't have to indicate anything; the standard thermometer does. Same with blocks and weights. The gauge block (if properly certified) is 1 inch. Period. So the device should read 1.000 inch, or whatever.
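Here's that idea as a toy pass/fail check. The nominal value, the acceptance tolerance, and the helper name are all hypothetical; the point is just that the standard contributes a certified value, not a reading:

```python
# Toy verification of an indicating device against a fixed-value standard
# (gauge block, weight, ice bath). The standard displays nothing; its
# certified value IS the reference.

NOMINAL_IN = 1.0         # certified 1 in gauge block (hypothetical cert)
DEVICE_TOL_IN = 0.0005   # made-up acceptance tolerance for the device

def within_tolerance(reading: float,
                     nominal: float = NOMINAL_IN,
                     tol: float = DEVICE_TOL_IN) -> bool:
    """True if the device's reading of the standard is within tolerance."""
    return abs(reading - nominal) <= tol

print(within_tolerance(1.0003))  # True  -> reads the block within tolerance
print(within_tolerance(1.0010))  # False -> out of tolerance; investigate
```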
A lot of babbling there.
Not sure if I helped or not.