Re: How to Tolerance for determining Working Voltage
Selecting what to measure in production depends on a number of factors, including expected variation, margin to the limit, and criticality of the limit (as you say, the limit might have some buffer built in).
For working voltage, it is not standard (normal) practice to measure it in production, as far as I am aware. It is not stated anywhere, but the limits do have some buffer built in, and you may in any case have some margin in your design. For example, if you are using triple insulation in the transformer for the 460 Vrms, it is probably rated (and production tested) for 700 Vrms.
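As a rough illustration of that margin argument (a minimal sketch, using only the 460 Vrms and 700 Vrms figures from the paragraph above; substitute your own datasheet values):

```python
# Rough margin check between the working voltage and the insulation rating.
# The 460 Vrms working voltage and 700 Vrms insulation rating are the
# illustrative numbers from this post, not general requirements.

working_voltage_rms = 460.0    # working voltage across the insulation (Vrms)
insulation_rating_rms = 700.0  # triple-insulation rating, production tested (Vrms)

margin = insulation_rating_rms - working_voltage_rms
margin_pct = 100.0 * margin / working_voltage_rms

print(f"Margin: {margin:.0f} Vrms ({margin_pct:.0f}% above the working voltage)")
# -> Margin: 240 Vrms (52% above the working voltage)
```

With that kind of headroom, chasing a few volts of production spread in the measured working voltage usually adds little.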
It is worth noting that Vrms and Vp measurements at high frequency are dubious in the first place. The main problem is the probes: experiments in several qualified labs and manufacturers' labs comparing two "calibrated" HV probes often find 10-20% differences in measurements at frequencies above 10 kHz. The probe "calibration" is only for DC/50-60 Hz, where the probes operate in a reliable resistive region. At higher frequencies the probes rely on a capacitive divider, which is not checked by the calibration lab and often has errors > 10%. Capacitive dividers are also unstable with temperature, voltage, frequency and the environment; they are really there just to show the shape of the waveform, not absolute values.
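To show why a few percent of capacitor drift matters, here is a toy calculation (my own numbers, purely illustrative): at high frequency the probe behaves as a capacitive divider, Vout/Vin = C_top / (C_top + C_bottom), so any drift in the compensation capacitance shifts the reading by roughly the same fraction, and DC/50-60 Hz calibration never sees it.

```python
# Toy capacitive-divider sensitivity check (assumed component values).

def divider_ratio(c_top: float, c_bottom: float) -> float:
    """High-frequency division ratio of a compensated (capacitive) divider."""
    return c_top / (c_top + c_bottom)

c_top = 3e-12        # 3 pF compensation capacitor (assumed)
c_bottom = 2.997e-9  # chosen for a 1000:1 nominal ratio (assumed)

nominal = divider_ratio(c_top, c_bottom)
drifted = divider_ratio(c_top * 1.10, c_bottom)  # +10% drift with temperature/voltage

error_pct = 100.0 * (drifted - nominal) / nominal
print(f"Nominal ratio 1:{1/nominal:.0f}, reading error with 10% cap drift: {error_pct:+.1f}%")
# -> Nominal ratio 1:1000, reading error with 10% cap drift: +10.0%
```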
Oscilloscopes also have frequency errors which are not covered by calibration, although these tend to be within 3%. On top of that, stray paths, even with differential measurements, can add errors; just the layout of the cables changes the result.
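A back-of-envelope way to see how these stack up, assuming the error sources are independent and can be root-sum-squared (my simplification; the percentages are only the rough figures quoted in this post, and the stray-path figure is an assumption):

```python
# Combine the error contributions mentioned above into a single rough figure.
import math

error_sources_pct = {
    "HF probe ratio": 10.0,              # capacitive divider error above ~10 kHz
    "scope frequency response": 3.0,
    "stray paths / cable layout": 5.0,   # assumed figure, varies with setup
}

total_pct = math.sqrt(sum(e**2 for e in error_sources_pct.values()))
print(f"Combined measurement uncertainty: ~{total_pct:.0f}%")
# -> ~12%, i.e. comparable to or larger than typical unit-to-unit spread
```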
The upshot is that any measurement at HF needs to be taken with a grain of salt. Even the production variations are likely to be, at least in part, measurement related.
All of this assumes, of course, that the 462 Vrms is measured at HF in a switching transformer ... if not, please ignore the above rambling!