I could reasonably argue both sides of this issue. However, I strongly lean toward your side. As a matter of fact, I am in an engineering lab at this very moment, involved (among other things) with calibrating power supplies. Here are some criteria I use in such circumstances:
1. Test and Measuring Equipment requires calibration no matter what. The simple justification is that if accuracy didn't matter, the test equipment would not be needed at all. A multimeter, for example, is used because someone wants to check a voltage value. If the value didn't matter, a lightbulb with two test leads attached would suffice to show whether voltage was present. The very fact that they used a "METER" says they want to know a value. An uncalibrated meter is an unreliable meter, and an unreliable meter is a potential safety hazard.
2. Uncalibrated Test and Measuring Equipment can sometimes erroneously find its way into production use. Unless you can prove that the uncalibrated test equipment could NOT possibly be used in production, a quality risk is introduced.
3. Engineering/R&D produces results which directly or indirectly feed into product specs. Uncalibrated test and measurement equipment used in engineering can lead to faulty design decisions. When the product is then brought to production, those faulty assumptions cost money through inaccurate product specs.
4. Power Supply calibrations. The user does not measure AC ripple, transient response, or the full specs of the power supply; calibration provides confidence in those characteristics and in the supply's reliability. Additionally, to be quite frank, I am pretty cynical about users externally monitoring the output. They can monitor under a given set of operating conditions, but they very likely do not monitor under the full variety of loads, etc. Some older-technology power supplies are not designed to any output accuracy spec. However, some of the newer GPIB-controlled precision supplies (such as the HP 6620 series, 6650 series, E3630, etc.) are highly accurate, some to specs of around +/-0.05%. Regardless of what the spec sheet claims, when users see a high-resolution display on a power supply, they expect the output values to be correct. If the supply is not calibrated, that expectation can cause some potentially costly errors in product. My customers use supplies in (fictitious numbers to protect proprietary details) TTL circuitry to determine component characteristics. The supplies are set for +5.25 VDC (fictitious). The circuit is loaded and run under various conditions to characterize product performance and provide feedback to the factory. Current-to-voltage relationships are measured over the GPIB bus, plotted, and sent to design teams for feedback. These results are as much a part of determining product specs as factory testing is a part of verifying that the product meets those specs.
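To make the risk in item 3 concrete, here is a minimal sketch of how a meter drifted out of calibration can hide a real spec violation. The 2% gain error and the 5.00 V +/-1% window are hypothetical numbers for illustration only, not figures from any actual product:

```python
# Sketch: how an uncalibrated meter's gain error skews a design decision.
# The 2% gain error and the 5.00 V +/-1% spec are hypothetical.

def within_spec(measured_v, nominal_v=5.00, tol_pct=1.0):
    """True if a measured voltage falls inside nominal +/- tol_pct percent."""
    return abs(measured_v - nominal_v) <= nominal_v * tol_pct / 100.0

true_output = 5.08           # actual circuit output: outside the 1% window
meter_gain_error = 0.98      # uncalibrated meter reads 2% low
indicated = true_output * meter_gain_error  # engineer sees about 4.978 V

print(within_spec(true_output))  # False: the real output violates spec
print(within_spec(indicated))    # True: the drifted meter hides the violation
```

The engineer signs off on a circuit that actually fails its own spec; the error only surfaces later, in production, where it is expensive.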
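The +/-0.05% figure in item 4 can be checked directly: sweep the supply through its setpoints, read each output back with a calibrated DMM, and flag any point whose error exceeds the spec. The helper below is a sketch of that comparison only; the setpoints and DMM readings are made-up illustration data, and the actual instrument I/O (SCPI/GPIB commands) is omitted since it varies by supply:

```python
# Sketch: flag supply setpoints whose measured output exceeds an
# accuracy spec. tol_pct=0.05 matches the +/-0.05% class of supply
# mentioned above; the sweep data is invented for illustration.

def out_of_spec(points, tol_pct=0.05):
    """points: iterable of (setpoint_v, dmm_reading_v) pairs taken with a
    calibrated DMM. Returns the pairs whose error exceeds tol_pct percent."""
    bad = []
    for setpoint, reading in points:
        limit = abs(setpoint) * tol_pct / 100.0
        if abs(reading - setpoint) > limit:
            bad.append((setpoint, reading))
    return bad

sweep = [(5.25, 5.2508), (5.25, 5.2471), (12.0, 12.004)]
print(out_of_spec(sweep))  # [(5.25, 5.2471)] -- only the middle point fails
```

Note that this check is only as good as the DMM it relies on, which circles back to item 1: the reference instrument must itself be calibrated.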
I even have lower-tech users who, after I described the potential risks to them, agreed they didn't want to chance it and had their power supplies calibrated.
It is understandable that an engineering manager wants to guard his/her budget. However, they must intelligently understand the potential risk to product and knowingly decide whether they are willing to accept it.
If you have ISO requirements in that area, that kicks the entire topic up a notch (to quote Emeril Lagasse). If you have to send the supplies out for calibration and pass a per-item charge back to the user's department, that adds some other implications as well.
Hope this is of some help. I've fought this battle many times, and each time it was by helping the customer understand the risk that I was able to convince them.