Al essentially repeated what D Scott said - and as far as Mr. Scott's explanation goes, I agree.
Basically it boils down to the fact that during early design (in the auto world, early APQP) critical characteristics start emerging. At that point you should be determining what precision each dimension needs to be held to. This is where the M&TE people come in, sit down, and say either 'we have equipment to measure that' or 'we don't, and thus will have to buy it'.
> Don't lock yourself into a whole bunch of decimals unless
> the part/process spec requires it. We found that when we
> put in our new measurement system and "showed off" with 8
> decimals on a layout that the part we used to measure at 3
> decimals now became a requirement at 6.
I agree with the first sentence, but will say that the precision of the instrument should never dictate the dimension precision. That should be a design function. If I had a part whose print gave me 'y' mm +/- 0.2 mm and I used a CMM precise to 0.0001, I would only use the first 2 digits after the decimal point from the CMM readings. Mr. Scott amply described these as the 'significant digits' - see also Al's 10 to 1 rule. If you don't do this you're heading for trouble: if the print calls out 0.01 but you report to 0.0001, the fact that you're capable of precision beyond the print callout may get you into trouble simply because your manufacturing equipment is not capable of holding it. As a last comment, tightening a dimension's precision to match your measurement capability - which is what I think is being implied here - is typically going to make the part more expensive to manufacture.
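To make the significant-digits point concrete, here's a minimal sketch in Python (all numbers are hypothetical, not from any real part):

    # A print callout of +/- 0.2 mm means two decimal places are the
    # significant digits; the CMM's 0.0001 resolution beyond that is
    # instrument capability, not design intent.
    cmm_reading_mm = 25.3472   # raw reading from a 0.0001-precise CMM
    print_decimals = 2         # dictated by the +/- 0.2 mm print tolerance

    reported_mm = round(cmm_reading_mm, print_decimals)
    print(reported_mm)         # 25.35 - record this, not 25.3472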
The Batavia experiment was good proof that if you don't bring the M&TE people in early, and you have 'company idiot' design engineers, you can end up on the other end of the stick - the precision necessary is more than your instrument (gage, CMM, whatever) can measure to.
There is another 'rule' to consider: the combined 'uncertainty' of the measurement system as a whole should be no more than about 25% of the tolerance (the 4 to 1 rule). That should be taken into consideration when setting tolerances during the design phase as well. Make sure the M&TE people are there when decisions like this are reviewed during the design stage.
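As a rough sketch of that 4 to 1 check (Python again, made-up numbers - and note that exact definitions of the ratio vary; this just follows the 25%-of-tolerance wording above):

    # Hypothetical part: total tolerance band of 0.4 mm (+/- 0.2 mm) and a
    # combined measurement-system uncertainty of 0.08 mm.
    tolerance_band_mm = 0.4
    combined_uncertainty_mm = 0.08

    limit_mm = 0.25 * tolerance_band_mm    # the 25% / 4-to-1 ceiling
    adequate = combined_uncertainty_mm <= limit_mm
    print(adequate)                        # True: 0.08 <= 0.10 mm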
I think.... Me not always right :thedeal: Not my specialty.