Russ,
Example:
Full calibration of an attribute gage (Let's say a Class X Plug Gage at 12 mm diameter - single member - Go): This would require a full temperature soak in a stable 20°C room for at least 4 hours (although 24 hours is more widely accepted) in order to adequately measure the plug (tolerance +1 µm/-0 µm). The generally accepted method is six measurements - front, middle, back, spin 90°, front, middle, back. Document all six measurements. The size of the gage can be reported either as all six measurements or as the largest measurement found (the largest diameter is the effective size, since it determines the smallest hole the plug will enter). Correlate the 90° points against each other to ensure that the gage is essentially round.
In other words, fully documented, with traceability (which, per the VIM, means an accompanying uncertainty statement), etc.
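To make the arithmetic concrete, here is a minimal sketch in Python of reducing the six readings to a reported size and a roundness check - the readings and the roundness limit are made-up values for illustration, not anything from a standard:

# Six diameter readings on a 12 mm Class X Go plug, in mm (hypothetical values).
# Order: front, middle, back at 0 degrees, then front, middle, back after a 90 degree spin.
readings_0deg = [12.0007, 12.0006, 12.0008]
readings_90deg = [12.0006, 12.0007, 12.0007]

all_readings = readings_0deg + readings_90deg

# Report either all six readings or the largest one found --
# the largest diameter is the effective size of the plug.
reported_size = max(all_readings)

# Correlate the 0 and 90 degree readings at each position to check roundness.
# The allowable difference here is an assumed in-house limit, not a standard value.
ROUNDNESS_LIMIT = 0.0005  # mm, hypothetical
positions = ["front", "middle", "back"]
for label, d0, d90 in zip(positions, readings_0deg, readings_90deg):
    if abs(d0 - d90) > ROUNDNESS_LIMIT:
        print(f"{label}: possible ovality, delta = {abs(d0 - d90):.4f} mm")

print(f"Reported size: {reported_size:.4f} mm")
# Check all readings against the gage tolerance: +1 um / -0 um on 12 mm.
in_tolerance = min(all_readings) >= 12.0000 and max(all_readings) <= 12.0010
print(f"Within +1/-0 um tolerance: {in_tolerance}")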
Using an in-service Check Standard to verify gage condition (Let's use the same gage):
A corresponding set of rings (calibrated, of course) designated for checking plug condition may be used. If the plug fits in the go ring but not in the no-go ring, the plug is verified to lie between the dimensions of the two rings. BUT...what is the actual size of the plug? Can you do a Gage R&R without knowledge of the actual gage attributes? Is the gage oval? If you are planning on meeting ISO 17025, how will you compute uncertainty?
In other words, nothing has gone seriously wrong since the last calibration.
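As a sketch of how little the ring check actually reports - assuming hypothetical ring sizes and a simple fits/doesn't-fit model:

# Calibrated check rings bracketing the expected plug size (hypothetical values, mm).
GO_RING = 12.0012     # plug must fit (enter) this ring
NO_GO_RING = 12.0000  # plug must NOT fit this ring

def plug_fits(ring_diameter: float, plug_diameter: float) -> bool:
    # Simplified model: the plug enters the ring if its largest
    # diameter is smaller than the ring bore.
    return plug_diameter < ring_diameter

def in_service_check(plug_diameter: float) -> bool:
    # Pass/fail only: all we learn is that the plug lies between the two rings.
    return plug_fits(GO_RING, plug_diameter) and not plug_fits(NO_GO_RING, plug_diameter)

# Note what is NOT returned: the actual size, the roundness, or an
# uncertainty statement -- only a bracket between the two ring sizes.
print(in_service_check(12.0007))  # True -- but 12.0007 itself remains unknown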
Unless you are using very creative calibration procedures that do not check all attributes, I don't see it as feasible to fully calibrate a gage each time it is issued. But a mixture of the two systems gives very good assurance of quality: calibrate on a cycle, do an in-service check upon issue, and if that works out well, extend the scheduled calibration cycle based on the in-service check data combined with calibration history - thereby saving money, which is the bottom line.
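A hedged sketch of what that decision logic might look like - the pass-rate rule and the 50% extension factor are assumptions for illustration, not values from any standard:

# Decide whether to extend a calibration interval based on in-service
# check history plus calibration history (illustrative policy only).
def extend_interval(check_results: list[bool], last_cals_in_tolerance: int,
                    current_interval_months: int) -> int:
    # Assumed policy: every in-service check passed AND at least the last
    # two scheduled calibrations found the gage in tolerance.
    if all(check_results) and last_cals_in_tolerance >= 2:
        return int(current_interval_months * 1.5)  # extend by 50% (assumed factor)
    return current_interval_months

print(extend_interval([True] * 20, last_cals_in_tolerance=3, current_interval_months=12))  # 18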
Ryan