Linearity - Which reference values do I have to measure to determine linearity?


Zorro

I need to perform a linearity test on a tester I designed, but I don't know how to select which values I have to measure to determine linearity. For example, my tester can measure 1000 ohms with a %error of 0.02%, but if I measure 10 ohms on the same scale, that measurement has a %error of 1%.
Is it right to run the linearity test using reference (standard) values from 10 ohms to 1000 ohms?
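To illustrate the behaviour: if the instrument has a roughly fixed absolute error across the scale, the relative error grows as the reference value shrinks. The 0.1-ohm figure in the sketch below is a hypothetical value chosen only to show the effect, not a specification of the tester.

```python
# Hypothetical illustration: with a fixed absolute error over the whole scale,
# the relative (%) error grows as the measured value shrinks.
# The 0.1-ohm absolute error is an assumed value, used only for illustration.

def percent_error(reference_ohms, absolute_error_ohms=0.1):
    """Relative error in % for a fixed absolute error."""
    return 100.0 * absolute_error_ohms / reference_ohms

for ref in (10, 100, 500, 1000):
    print(f"{ref:>5} ohm -> {percent_error(ref):.3f} % error")
```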
 

Al Dyer

Zorro,

You would be correct to measure values throughout the working range of the gage. I can't say exactly where in the scale you should take the measurements.

100/200/300/400 etc... might work. What you could do is define which values you will be working with and incorporate them into the study.

Start with the 100/200/300/400 etc... and then review which intervals are called out in a particular requirement. You might have a customer requirement for 152 +/- 5 ohms. Incorporate the 152 as part of the gage study.
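As a rough sketch of how those readings are then used (this is the usual bias-versus-reference-value regression from an MSA-style linearity study, not anything specific to Zorro's tester; the reference list, replicate count, and the simulated reading inside measure_with_tester() are placeholder assumptions):

```python
import random
import statistics

# Reference standards across the working range, including the customer-specific
# 152-ohm point mentioned above. These values are placeholders, not a requirement.
reference_values = [100, 152, 200, 300, 400, 500, 600, 700, 800, 900, 1000]  # ohms
replicates = 10  # repeat readings per reference standard

def measure_with_tester(reference_ohms):
    """Placeholder for the real instrument readout: here a reading is simulated
    with a small fixed offset plus random noise, purely for illustration."""
    return reference_ohms + 0.05 + random.gauss(0.0, 0.02)

def linearity_study(references, n_replicates):
    """Average bias at each reference value, then a least-squares line
    bias = intercept + slope * reference."""
    points = []
    for ref in references:
        readings = [measure_with_tester(ref) for _ in range(n_replicates)]
        points.append((ref, statistics.mean(readings) - ref))  # (reference, bias)

    # Ordinary least-squares fit of bias against reference value
    n = len(points)
    sx = sum(r for r, _ in points)
    sy = sum(b for _, b in points)
    sxy = sum(r * b for r, b in points)
    sxx = sum(r * r for r, _ in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

slope, intercept = linearity_study(reference_values, replicates)
print(f"bias = {intercept:.4f} + {slope:.6f} * reference")
```

A slope near zero means the bias stays roughly constant across the range; a noticeable slope means the gage reads progressively high or low as the value increases, which is exactly what a linearity study is meant to flag.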

There has got to be an easier way to explain it, so maybe one of the pros here can help me!

ASD...
 

Atul Khandekar

Zorro,

I'm not sure whether you are the manufacturer or also the end user of this instrument.

I'll use the example of a vernier caliper. As a user, if I have to measure jobs that are below 100 mm, I would go and buy a 0-100 vernier. If the dimensions to be measured are larger (say, 300 mm and above), I would use a 0-500 vernier. It would be a very rare case for someone to measure at the 150 level with an instrument having a range of 0-1000.

For a 0-1000 instrument, IMHO, you can use intervals that are more widely spaced at its lower end.
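One possible way to read that suggestion (the 500-ohm split point and the step sizes below are arbitrary illustration values, not a recommendation):

```python
# Hedged sketch: for a 0-1000 ohm instrument mostly used near the top of its
# range, pick reference points coarsely at the lower end and more finely where
# measurements actually happen. All numbers here are illustrative assumptions.

def reference_points(low=10, split=500, high=1000, coarse_step=245, fine_step=100):
    coarse = list(range(low, split, coarse_step))    # sparse points at the low end
    fine = list(range(split, high + 1, fine_step))   # denser points at the high end
    return coarse + fine

print(reference_points())  # e.g. [10, 255, 500, 600, 700, 800, 900, 1000]
```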
 