They are pretty close, but I think there is a little difference between them.
Uncertainty is the combined inaccuracy of a measurement. In a calibration context, if multiple instruments are used to make a calibration measurement and you use the Root Sum of Squares (RSS) method to calculate a total uncertainty, that total falls under the 'uncertainty' umbrella. The total uncertainty of a measurement may simply be the accuracy tolerance of a measurement standard. Or, in the case of multiple instruments, and perhaps if environmental variation is a contributing factor, that variation would also be factored in. Uncertainty refers to making a measurement.
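As a quick sketch of the RSS method mentioned above (the component names and values here are made up purely for illustration; real budgets would use your own instruments' specs):

```python
import math

# Hypothetical, independent uncertainty components in the same units
# (say, mV) -- these numbers are illustrative, not from a real budget.
components = {
    "standard_tolerance": 0.5,
    "dmm_tolerance": 0.3,
    "environmental_variation": 0.2,
}

# Root Sum of Squares: combine independent sources into one number
total_uncertainty = math.sqrt(sum(u ** 2 for u in components.values()))
print(f"Combined uncertainty (RSS): {total_uncertainty:.3f}")
```

Note that RSS only applies when the components are independent of each other; correlated errors would have to be added differently.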
Accuracy on the other hand refers to the plus and minus tolerance of an instrument.
TAR (Test Accuracy Ratio) is a simpler quantity, referring to the ratio of the accuracies of two pieces of equipment. In calibration, uncertainty is probably the more correct term, whereas in some simpler applications, accuracy may be applicable. If you are using a handheld multimeter with a stated accuracy of +/- x% DC volts to adjust a power supply that has a stated accuracy of +/- y%, TAR would be appropriate. A complicated temperature calibration system with multiple instruments and many opportunities for error would need all of its contributions calculated together into a measurement uncertainty.
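The multimeter/power-supply case above boils down to one division. A minimal sketch, with assumed tolerance values standing in for the x% and y% in the text:

```python
# Hypothetical stated accuracies, as % of reading (assumed values):
# the multimeter is the standard, the power supply is the unit under test.
standard_tolerance = 0.05  # multimeter, +/- % DC volts
uut_tolerance = 0.50       # power supply, +/- %

# Test Accuracy Ratio: how much tighter the standard is than the UUT
tar = uut_tolerance / standard_tolerance
print(f"TAR = {tar:.0f}:1")
```

Many calibration programs require the standard to be at least four times more accurate than the unit under test, i.e. a TAR of 4:1 or better.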