The question: if you show a GRR of 10%, does that mean you should cut your tolerance by 10% to ensure product integrity?
I think what you're saying is that with 10% R&R, there's a slight probability of accepting bad product near the high and low spec limits. Because of this, you might tighten the tolerance to reduce the risk of shipping bad product. Is that correct? If so, I see two potential problems, which I'll describe before I give my final answer.
First, if you tighten the tolerance, then the GRR value will actually increase because the measurement system variation is now a higher percentage of the (now smaller) tolerance. Then you'll have to tighten the tolerance again to be safe, and the GRR value will be higher still! Pretty soon you'll have zero tolerance and an infinite GRR %.
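That runaway loop can be sketched numerically. This is a made-up illustration, assuming the common convention %GRR = 100 × (6 × gage sigma) / tolerance, with a gage sigma and starting tolerance chosen so the initial %GRR is 10%:

```python
# Hypothetical numbers: watch %GRR climb as the tolerance is repeatedly
# tightened by the current %GRR.
sigma_gage = 0.01      # gage standard deviation (assumed)
tolerance = 0.6        # USL - LSL, chosen so the initial %GRR is 10%

for step in range(5):
    grr_pct = 100 * (6 * sigma_gage) / tolerance
    print(f"step {step}: tolerance = {tolerance:.4f}, %GRR = {grr_pct:.1f}")
    tolerance *= (1 - grr_pct / 100)  # tighten by the current %GRR
```

Each pass through the loop shrinks the tolerance, which raises %GRR, which shrinks the tolerance even more — the spiral toward zero tolerance described above.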
Second, if your process is capable, you shouldn't be near the limits anyway. When you do a capability study on a process, the variation of the measurement system is included. The product variance (std. dev. squared) and the measurement system variance add to make the total variance reported. So if your process capability study shows it's good, gage error has already been factored in.
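To make the "variances add" point concrete, here's a minimal sketch with made-up numbers — the sigmas are assumptions, not data from any real study:

```python
import math

# Variances add; standard deviations do not.
sigma_product = 0.05   # true part-to-part std. dev. (assumed)
sigma_gage = 0.01      # measurement system std. dev. (assumed)

var_total = sigma_product**2 + sigma_gage**2
sigma_total = math.sqrt(var_total)  # what a capability study actually observes

print(f"observed sigma = {sigma_total:.5f}")  # ~0.05099, slightly above 0.05
```

The capability study sees the inflated sigma_total, so any Cp/Cpk computed from it already carries the gage error.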
Having said that, if your process is not capable (i.e., you're using the gage to sort out bad product), you may consider creating an "internal spec" that is used to reduce the risk of accepting bad product (as you mentioned) while keeping the same actual specification. Most GO/NO-GO gage systems have wear factors built in that are already doing just that.
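An internal spec of that kind is usually built by guard-banding: pulling each limit in by some multiple of the gage standard deviation. A minimal sketch, where the spec limits, gage sigma, and the 3-sigma guard-band factor are all assumptions for illustration:

```python
# Guard-banded "internal spec": accept only parts measured inside the
# internal limits, so a part that passes is unlikely to be truly out of
# the actual specification.
lsl, usl = 9.0, 11.0   # actual specification limits (assumed)
sigma_gage = 0.05      # measurement system std. dev. (assumed)
k = 3                  # guard-band factor; 3-sigma is one common choice

internal_lsl = lsl + k * sigma_gage
internal_usl = usl - k * sigma_gage
print(internal_lsl, internal_usl)  # 9.15 10.85
```

The actual spec limits never move; only the accept/reject decision tightens, which avoids the runaway %GRR problem from the first point above.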