
Diameter Tolerance for Milled Cylinder - Question


grantmaker

#1
I apologize if this is in the wrong forum, but I recently saw a drawing for a small stainless milled cylinder part that had a specification of 1.098 inches, plus .002 inches and minus .000 inches. The incoming part is being inspected with a non-digital dial-type caliper with an accuracy of +/- .001 inches.

1) What does the -.000 inches mean? and 2) can it be inspected with the caliper they are using?

Thanks,
 
normzone

#2
Can it be? Yes. Somebody is doing it.

Should it be? No. If the device only discriminates to .001, then you can only reliably settle arguments about measurements that differ by .002 or more.

Technical arguments might be made to the contrary, but the inspection practice as described is not reliable.
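
To put rough numbers on that: a common rule of thumb compares the total tolerance to the measuring instrument's accuracy (a test uncertainty ratio, often targeted at 4:1 or better). A minimal Python sketch, assuming the caliper's +/-.001 accuracy stands in for the whole measurement uncertainty (the function name is made up for illustration):

    def test_uncertainty_ratio(tolerance_span, accuracy_plus_minus):
        # Ratio of the total tolerance to the instrument's full accuracy span.
        # (Definitions of TUR vary; this is a simplified, common form.)
        return tolerance_span / (2 * accuracy_plus_minus)

    # The drawing allows 1.098 to 1.100, i.e. .002 total tolerance.
    print(test_uncertainty_ratio(0.002, 0.001))   # 1.0, far below a typical 4:1 target
    print(test_uncertainty_ratio(0.002, 0.0001))  # about 10, a .0001 micrometer would clear it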
 

PaulJSmith

#3
1) It means that the acceptable range for that feature is 1.098 to 1.100. It can be .002 larger, but cannot be smaller than the nominal value.

2) See normzone's reply above.
 
#4

1) What does the -.000 inches mean?
The specification is calling for 1.098 +.002/-.000. Another way to look at it is 1.099 +/-.001. The -.000 means there is no minus tolerance at all below the dimension as specified: the part may not measure under 1.098. This is a common tolerancing method where 'fit' is important, particularly press fits.


2) Can it be inspected with the caliper they are using?
As stated above, it can be done but calipers are not a good device to measure this with.
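
To make the equivalence concrete, here is a quick Python sketch (the helper names are made up for illustration) showing that both ways of writing the spec produce the same acceptance limits:

    from decimal import Decimal as D  # exact decimal arithmetic for the example

    def limits(nominal, plus_tol, minus_tol):
        # Acceptance limits for a dimension given as nominal +plus/-minus.
        return nominal - minus_tol, nominal + plus_tol

    def in_spec(measured, low, high):
        return low <= measured <= high

    # 1.098 +.002/-.000 and 1.099 +/-.001 give the same limits: 1.098 to 1.100.
    print(limits(D("1.098"), D("0.002"), D("0.000")))  # (Decimal('1.098'), Decimal('1.100'))
    print(limits(D("1.099"), D("0.001"), D("0.001")))  # (Decimal('1.098'), Decimal('1.100'))
    print(in_spec(D("1.0995"), *limits(D("1.098"), D("0.002"), D("0.000"))))  # True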
 

grantmaker

#5
Thanks all for the quick response. I don't think the spec is meant to be 1.099 + or - .001 as they are adamant they have .002 inches 'to play with' rather than .001 inch.

Would a reasonable recommendation be to use a calibrated micrometer with an accuracy of at least .0001 for a proper TUR? Also, in a rare instance (using a mic with .0001 accuracy), I'm assuming the part would be out of spec if the micrometer showed exactly 1.098.
 
#6
Thanks all for the quick response. I don't think the spec is meant to be 1.099 + or - .001 as they are adamant they have .002 inches 'to play with' rather than .001 inch.
In both cases, 1.099 +/-.001 or 1.098 +.002/-.000, you have a total of .002 tolerance to play with. The part has to measure between 1.098 and 1.100.
 

grantmaker

#7
Right! Thanks for the clarification. The two methods of showing the spec are saying the same thing.
 

Ronen E

Problem Solver
Staff member
Super Moderator
#8
Right! Thanks for the clarification. The two methods of showing the spec are saying the same thing.
That's true only in the old QC school, where all that mattered was "are we within tolerance, or not?", i.e. does the part pass or fail?

In the "new" QA school (which is actually decades old), importance is also placed on the nominal, i.e. the preferred or "ideal" value, and how close to it the part is. A symmetric tolerance means that the preferred value is in the middle, and deviations in both directions are equally "bad" (of course, the bigger the worse), to the point that the tolerance is exceeded and the part fails. In a one-sided tolerance specification, it is actually preferred that the part is as close as possible to the nominal, without going past it. Deviating in the other direction may still be acceptable (within tolerance) but is not preferred.

Cheers,
Ronen.
 