It's the same concept applied to different things
psavijay said:

Hi
What is the difference between Gage Discrimination and Gage Least Count?
A.Vijayakumar
The two concepts are similar but usually apply to different types of instruments. In general, both terms refer to the smallest measurement that can be resolved by the gage, tool or instrument.
- The discrimination of a gage is the smallest division on its scale. This term generally applies to tools with engraved or printed scales. Examples of engraved scales are those on micrometers, steel rulers, or vernier calipers. An example of a printed scale is an analog voltage meter or pressure gage dial.
- The term least count (or least significant digit) applies to instruments with electronic digital displays, and is the smallest difference in reading that can be shown on the display. The magnitude of the difference depends on the range or scale of the display. This is also very similar to the sensitivity, which is the amount of input change required to change the reading by one count.
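To make the range dependence concrete, here is a minimal sketch (the voltmeter values are hypothetical, not from the post): the least count of a digital display is set by the selected range divided by the number of counts the display can show.

```python
# Minimal sketch with hypothetical values: the least count of a digital
# display depends on the selected range, not just the number of digits.
def least_count(full_scale, counts=2000):
    """Smallest reading step for a display with `counts` total counts.

    A typical "3-1/2 digit" meter shows 2000 counts.
    """
    return full_scale / counts

print(least_count(2.0))    # 2 V range  -> 0.001 V per count
print(least_count(20.0))   # 20 V range -> 0.01 V per count
```

Switching the same instrument to a higher range makes the least count ten times coarser, which is why the least count is quoted per range rather than per instrument.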
Both of these terms may also be referred to as the resolution of the gage.
While these specifications are both important, neither is sufficient by itself to determine the suitability of a gage. Consider a couple of calipers selected at random from a manufacturer's on-line catalog.
- Vernier caliper; range 0 to 300 mm; resolution (discrimination) 0.02 mm; accuracy ± 0.04 mm.
- Digital display caliper; range 0 to 300 mm; resolution (least count) 0.01 mm; accuracy ± 0.03 mm.
Many people would look at the resolution alone and conclude that the digital caliper is "better" because "its resolution is half that of the vernier caliper". Looking at the accuracy as well shows that there is actually little difference between them. Other features may then become decisive, such as the ability of the digital instrument to transmit values to a computer for record keeping and SPC analysis ...
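The comparison above can be sketched in code. This is an illustrative example only: the 10:1 rule-of-thumb ratio and the 0.8 mm tolerance band are assumptions I have added, not figures from the post, and the `suitability` helper is hypothetical.

```python
# Illustrative sketch (assumed tolerance and ratio): judge each catalog
# caliper against a part tolerance using both resolution and accuracy,
# rather than resolution alone.
def suitability(tolerance_band, resolution, accuracy, ratio=10):
    """True if the gage's worst error is no more than 1/ratio of the
    tolerance band (the 10:1 figure is a common rule of thumb, assumed here)."""
    worst = max(resolution, 2 * accuracy)  # spread of the +/- accuracy band
    return worst <= tolerance_band / ratio

# Specs taken from the two calipers quoted above (in mm):
vernier = dict(resolution=0.02, accuracy=0.04)
digital = dict(resolution=0.01, accuracy=0.03)

for name, g in (("vernier", vernier), ("digital", digital)):
    ok = suitability(0.8, g["resolution"], g["accuracy"])
    print(name, "suitable" if ok else "marginal")
```

For an assumed 0.8 mm tolerance band, both calipers pass the check, which matches the point above: once accuracy is taken into account, the two gages are much closer than their resolutions suggest.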