
#### ncwalker

Now let us roll 2 dice and sum the result, then plot the frequency of the sum, which can range from 2 to 12. This will now look normal, simply because snake-eyes and boxcars (both dice = 1 or both dice = 6) occur much less often than 6, 7, or 8, which have many more combinations that sum to those values.
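For concreteness, a quick simulation of the two-dice frequencies (Python, arbitrary seed and sample count):

```python
import random
from collections import Counter

random.seed(0)

# roll two dice 100,000 times and tally the sums
rolls = [random.randint(1, 6) + random.randint(1, 6) for _ in range(100_000)]
freq = Counter(rolls)

# frequencies peak at 7 and fall off toward 2 and 12
for total in range(2, 13):
    print(total, freq[total])
```

The 7 bin dominates because six of the 36 equally likely outcomes sum to 7, while only one sums to 2 and one to 12.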

Is this because of the Central Limit Theorem?

This would also hold true if I averaged the dice instead of summing them: more combinations would result in an average of 3 or 4 than of 1 or 6.
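The averaging version is easy to check by simulation: the mean of n dice stays centered near 3.5, and the spread around that center shrinks as n grows (sketch below; the seed and trial counts are arbitrary):

```python
import random
import statistics

random.seed(1)

def avg_of_dice(n_dice, trials=50_000):
    """Return `trials` samples of the average of `n_dice` fair dice."""
    return [statistics.mean(random.randint(1, 6) for _ in range(n_dice))
            for _ in range(trials)]

# center stays ~3.5; standard deviation shrinks roughly as 1/sqrt(n)
for n in (1, 2, 10):
    sample = avg_of_dice(n)
    print(n, round(statistics.mean(sample), 2),
          round(statistics.pstdev(sample), 3))
```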

Which then brings me to the question of doing "math" on data in any form.

1) A CMM that takes probe hits to measure a diameter. What it reads is n point coordinates; it then "does math" to best-fit a circle, resulting in a diameter.

2) A leak tester looking for a leak rate. It pressurizes a part, lets it stabilize, and then takes several measurements to generate an average leak rate. It "does math" before the result is reported.
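To make 1) concrete, here is a sketch of the kind of best fit a CMM might do, using a Kasa least-squares circle fit on made-up probe coordinates (the center, radius, probe count, and "noise" below are all illustrative, not any real machine's algorithm):

```python
import math

# hypothetical probe hits: 8 points on a circle of diameter 50,
# centered at (10, -5), with a little deterministic radial "noise"
true_cx, true_cy, true_r = 10.0, -5.0, 25.0
pts = []
for k in range(8):
    t = 2 * math.pi * k / 8
    noise = 0.01 * math.sin(17 * k)
    pts.append((true_cx + (true_r + noise) * math.cos(t),
                true_cy + (true_r + noise) * math.sin(t)))

# Kasa fit: find D, E, F minimizing residuals of
#   x^2 + y^2 + D*x + E*y + F = 0  over the probe points,
# via the 3x3 normal equations  M @ [D, E, F] = v
M = [[0.0] * 3 for _ in range(3)]
v = [0.0] * 3
for x, y in pts:
    row = (x, y, 1.0)
    rhs = -(x * x + y * y)
    for i in range(3):
        v[i] += row[i] * rhs
        for j in range(3):
            M[i][j] += row[i] * row[j]

# solve the 3x3 system by Gauss-Jordan elimination with pivoting
for col in range(3):
    pivot = max(range(col, 3), key=lambda r: abs(M[r][col]))
    M[col], M[pivot] = M[pivot], M[col]
    v[col], v[pivot] = v[pivot], v[col]
    for r in range(3):
        if r != col:
            f = M[r][col] / M[col][col]
            for c in range(3):
                M[r][c] -= f * M[col][c]
            v[r] -= f * v[col]
D, E, F = (v[i] / M[i][i] for i in range(3))

# recover center and diameter from the fitted coefficients
cx, cy = -D / 2, -E / 2
diameter = 2 * math.sqrt(cx * cx + cy * cy - F)
print(round(diameter, 3))
```

The single reported diameter is a function of all n probe hits at once, which is exactly the "doing math" step in question.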
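And for 2), a minimal sketch of the "does math" step in a leak tester, assuming the device simply averages per-interval pressure drops (the readings, interval, and units below are invented for illustration):

```python
# hypothetical pressure readings (kPa) taken at 1 s intervals
# after the part has stabilized
readings = [200.00, 199.96, 199.93, 199.89, 199.85, 199.82]

# per-interval pressure drops, then the averaged decay rate
drops = [a - b for a, b in zip(readings, readings[1:])]
avg_drop = sum(drops) / len(drops)   # kPa per second
print(round(avg_drop, 4))
```

Converting that decay rate to a flow in ccm would additionally require the test volume and pressure, but the averaging itself is already a "math on data" step that combines several raw measurements into one number.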

One could go on and on, but there are a lot of automated devices out there that have sensors connected to transducers in some manner, and that then take several inputs to compose or derive an output result, "doing math."

Because of the CLT, do the results then look more normal than the values the sensors are actually reporting?

In other words, I have a leak tester that, through math and controls internal to the device, gives me a leak rate in ccm. But what the sensors are outputting is volts. Would my volts be, say, Weibull, but my leak rate show up as normal because the device "did math," and because of the CLT?
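That question can be simulated directly: draw skewed "sensor volts" from a Weibull distribution, let a hypothetical device average a batch of them into each reported "leak rate," and compare the skewness before and after (the Weibull shape, batch size, and counts are all made up):

```python
import random
import statistics

random.seed(7)

def sensor_volts(n):
    # raw "sensor" readings: Weibull(shape=1.5), noticeably right-skewed
    return [random.weibullvariate(1.0, 1.5) for _ in range(n)]

def skewness(data):
    """Population skewness: third standardized moment."""
    m = statistics.mean(data)
    s = statistics.pstdev(data)
    return sum((x - m) ** 3 for x in data) / (len(data) * s ** 3)

raw = sensor_volts(100_000)

# the device's "math": average 30 raw readings into each reported result
averaged = [statistics.mean(sensor_volts(30)) for _ in range(5_000)]

# raw skewness ~1; averaged skewness shrinks toward 0 (more normal-looking)
print(round(skewness(raw), 2), round(skewness(averaged), 2))
```

The averaged results are markedly more symmetric than the raw readings, which is the CLT effect the question describes, so yes, the reported values can look far more normal than what the sensors actually produce.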