Hey all,

This is my first post. I'm trying to set up a calibration system for a fairly high-accuracy torque measurement: I'm checking torque tools with a torque measuring system.

The %StudyVar I'm getting is below 30%, but the %Tolerance is well above 30%. I suspect the high %Tolerance comes from the tight tolerance being tested, combined with large part-to-part variation.

If the tool being tested changes a little every time I measure it, should I treat this as a destructive test? And what would give me a definitive answer about the cause of the high Gage R&R:

- Is it the part-to-part variation?
- Or is it insufficient resolution / reading noise in the measurement system?
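For what it's worth, a small numeric sketch (the standard deviations below are made up for illustration, not your study data) shows how the same gage can give a low %StudyVar and a high %Tolerance at the same time, because the two metrics use different denominators: %StudyVar compares the gage spread to the total observed spread, while %Tolerance compares it to the tolerance width.

```python
# Hypothetical Gage R&R components (illustrative values only).
sd_grr = 0.05    # measurement system std dev (repeatability + reproducibility)
sd_part = 0.40   # part-to-part std dev (large, as suspected in the post)
tol = 0.60       # tolerance width USL - LSL (deliberately tight)

# Total observed variation combines gage and part variation in quadrature.
sd_total = (sd_grr**2 + sd_part**2) ** 0.5

# %StudyVar: gage spread as a fraction of total study variation.
pct_study_var = 100 * sd_grr / sd_total

# %Tolerance: 6-sigma gage spread as a fraction of the tolerance width.
pct_tolerance = 100 * 6 * sd_grr / tol

print(f"%StudyVar  = {pct_study_var:.1f}%")   # ~12.4%  -> looks fine (<30%)
print(f"%Tolerance = {pct_tolerance:.1f}%")   # 50.0%   -> fails (>30%)
```

With these numbers the gage passes on %StudyVar only because the huge part-to-part variation inflates the denominator, while the tight tolerance makes %Tolerance fail, which matches the situation described above.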