I was wondering whether it would work to define a percentage of shift beyond which we would no longer allow an interval change for the instrument.
This is your company's call. You will need to document what your procedures will be, be aware of the risks that go with them, and accept those risks. I recommend having management (QA, the using department, a director, etc.) sign off on the deviation, or at least sending them an e-mail and printing it for your records.
My current company allows micrometer end standards 13" and over to be accurate to .0001", versus the manufacturer tolerance of .000001", since the micrometers they verify only read to .001". A very negligible risk.
One company that I am aware of decided to allow certain gages to be off by .0004" instead of the typically acceptable .0001". So instead of spending a few thousand dollars to replace them, they are accepting the risk, aware that this "shift" is more than HALF of the allowable tolerance on some of their high-dollar parts (5-6 figures). VERY little room for error, and these parts have to be balanced, fit on a shaft, and spun up to 40,000-50,000 RPM. :mg:
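If you go this route, the comparison is simple enough to script: how much of the part tolerance does the observed gage shift consume, and is that under whatever ceiling management signed off on? Here is a minimal sketch; the 25% ceiling and the .0007" part tolerance are made-up illustrative numbers, not anyone's approved procedure.

```python
# Hypothetical sketch: flag whether an observed gage shift consumes
# too much of the part tolerance. Threshold and tolerance values here
# are illustrative assumptions, not an approved procedure.

def shift_ratio(shift: float, tolerance: float) -> float:
    """Fraction of the part tolerance consumed by the gage shift."""
    return abs(shift) / abs(tolerance)

def acceptable(shift: float, tolerance: float, max_ratio: float = 0.25) -> bool:
    """True if the shift uses no more than max_ratio of the tolerance."""
    return shift_ratio(shift, tolerance) <= max_ratio

# A .0004" shift against a hypothetical .0007" part tolerance:
# the shift eats more than half of the allowance.
ratio = shift_ratio(0.0004, 0.0007)
print(f"shift uses {ratio:.0%} of tolerance")   # more than 50%
print(acceptable(0.0004, 0.0007))               # False under a 25% rule
```

Whatever ceiling you pick, the point is that the number is documented and signed off, so the acceptance decision is a recorded policy rather than a judgment call made at the bench.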
It all comes down to this: what risk (quality, reputation, safety, etc.) is acceptable to management?