If a meter, for example, is 0-100 VDC (full-scale specs are most common on analogue meters, since the main source of error is variability of the meter movement/needle), and the spec is +/-1% F.S., that works out to +/-1 VDC. You apply that same calculated tolerance anywhere in the range.
If, as in the example, you are checking 10 VDC, the acceptance limits are 9 to 11 VDC. At 50 VDC the limits are 49 to 51 VDC, and at 100 VDC they are 99 to 101 VDC.
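Here's a quick sketch of that full-scale calculation in Python (the function and variable names are just for illustration):

```python
def fs_limits(nominal, full_scale, fs_pct):
    """Acceptance limits for a +/- percent-of-full-scale spec."""
    tol = full_scale * fs_pct / 100.0   # same absolute tolerance everywhere in the range
    return nominal - tol, nominal + tol

# 0-100 VDC meter, spec +/-1% F.S.
for point in (10, 50, 100):
    print(point, fs_limits(point, 100, 1.0))   # (9, 11), (49, 51), (99, 101)
```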
There are many ways manufacturers spec tolerances. One is as a portion of full scale, another is as a portion of reading, and often you'll see the two combined. Using the same 0-100 VDC meter, if the spec is +/-1% F.S. +/-1% RDG, you add the two together. At 100 VDC the tolerance band is 98 to 102 VDC; at 50 VDC it is 48.5 to 51.5 VDC (1 V for the full-scale term plus 0.5 V for the reading term), and so on.
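And a sketch of the combined spec, again with illustrative names only:

```python
def combined_limits(nominal, full_scale, fs_pct, rdg_pct):
    """Limits for a +/-%F.S. +/-%RDG spec: the two terms are simply added."""
    tol = full_scale * fs_pct / 100.0 + nominal * rdg_pct / 100.0
    return nominal - tol, nominal + tol

# 0-100 VDC meter, spec +/-1% F.S. +/-1% RDG
print(combined_limits(100, 100, 1.0, 1.0))  # (98.0, 102.0)
print(combined_limits(50, 100, 1.0, 1.0))   # (48.5, 51.5)
```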
Hope this clears things up.