I need to run a linearity test on a tester I designed, but I don't know how to select which values I have to measure to demonstrate linearity. For example, my tester can measure 1000 ohm with a %error = 0.02%, but if I measure 10 ohm on the same scale, that measurement has a %error = 1%.
Is it right to make the linearity test using standard (reference) values from 10 ohm to 1000 ohm?
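Here is roughly the procedure I had in mind, just as a sketch: spread a handful of standard resistors over the 10 ohm to 1000 ohm scale, fit a straight line to reading vs. standard value, and report the worst residual as a percentage of full scale. The specific standard values, the placeholder readings, and the least-squares fit are my own assumptions, not an established procedure.

```python
# Sketch of the linearity test I have in mind. The standard values below are
# hypothetical, and the readings are placeholder numbers, not real data.
import numpy as np

# Hypothetical standard ("patron") resistors spread over the 10 ohm - 1000 ohm scale
standards = np.array([10.0, 50.0, 100.0, 250.0, 500.0, 750.0, 1000.0])  # ohm

# Readings the tester would return for each standard (placeholder values)
readings = np.array([10.1, 50.02, 100.03, 250.05, 500.10, 750.10, 1000.20])  # ohm

# Fit a straight line: reading = a * standard + b, then look at the residuals
a, b = np.polyfit(standards, readings, 1)
fit = a * standards + b
residuals = readings - fit

# Non-linearity expressed as a percentage of full scale (1000 ohm)
nonlinearity_fs = 100.0 * np.max(np.abs(residuals)) / 1000.0
print(f"slope = {a:.5f}, offset = {b:.3f} ohm, non-linearity = {nonlinearity_fs:.3f} % FS")
```

Is something along these lines acceptable, or should the points be chosen differently (for example only in the upper part of the scale where the %error is small)?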