The manufacturer sets the tolerance, accuracy, specification, or whatever else they may call it. They know how the instrument was built, the design tolerances, and the components used. They have done their own internal testing to determine the level of accuracy the instrument can maintain over a period of time.
There are instruments and tools out there that have no specifications at all, or only "typical" specifications. For many non-critical uses these can be fine. The danger is that many users will infer an instrument's accuracy from its resolution, and that is not a valid assumption: a meter that displays five digits is not necessarily accurate to five digits.
There are times when the user will set their own specification based on their process requirements. However, this is only valid when the process tolerance is looser than the manufacturer's tolerance. People often use equipment more accurate than actually required to limit the risk of failure due to out-of-tolerance equipment.
When people try to use equipment at tighter tolerances than the manufacturer specifies, they do so entirely at their own risk. Unless you have done studies on the equipment and validated that it is capable of operating at a tighter tolerance, you cannot justify claiming one.
For all intents and purposes, the F.S. (Full Scale/Full Span) accuracy statement (really an uncertainty) is the tolerance. Ex: 1% FS on a 15 psia absolute pressure gauge is 0.15 psi, a fixed uncertainty across the entire range. Because it is a %-of-scale tolerance and not a %-of-reading tolerance, it equals 1% at a reading of 15 psia but balloons to a huge percentage at 1 psia (0.15 psi is 15% of a 1 psia reading). If a tighter tolerance is needed at lower readings, a %-of-reading gauge should be used.
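To make the %FS arithmetic concrete, here is a small sketch (the gauge range and function names are illustrative, not from any particular instrument) that converts a %-of-full-scale spec into the equivalent %-of-reading at several points on the scale:

```python
# Sketch: how a fixed %FS tolerance band looks as a % of the actual reading.
# The 15 psia range and 1% FS spec match the example above; names are made up.

FULL_SCALE_PSIA = 15.0   # assumed gauge range
FS_TOLERANCE_PCT = 1.0   # assumed accuracy spec, % of full scale

def fs_tolerance_psi(full_scale: float, pct_fs: float) -> float:
    """Fixed tolerance band implied by a %FS spec (same at every reading)."""
    return full_scale * pct_fs / 100.0

def equivalent_pct_of_reading(reading: float, tol: float) -> float:
    """Express that fixed band as a percentage of the actual reading."""
    return tol / reading * 100.0

tol = fs_tolerance_psi(FULL_SCALE_PSIA, FS_TOLERANCE_PCT)  # 0.15 psi
for reading in (15.0, 7.5, 1.0):
    pct = equivalent_pct_of_reading(reading, tol)
    print(f"{reading:5.1f} psia -> ±{tol:.2f} psi = ±{pct:.1f}% of reading")
# 15.0 psia -> ±0.15 psi = ±1.0% of reading
#  7.5 psia -> ±0.15 psi = ±2.0% of reading
#  1.0 psia -> ±0.15 psi = ±15.0% of reading
```

The fixed ±0.15 psi band is harmless at full scale but becomes 15% of the reading near the bottom of the range, which is why a %-of-reading gauge is preferred for low readings.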