# Non-normal Distribution Selection for a System That Is Constantly Being Corrected

#### rmf180

##### Involved In Discussions
Most things in life, left to their own nature, follow the normal distribution. However, GD&T features and other characteristics that drive toward zero are by design not normal (log-normal, exponential, etc.). Does anyone have a list, beyond these few, of appropriate non-normal distributions?

The specific example I am working with is a water heater controlled by a PID system (non-normal, or tampering) that constantly corrects for "error" from the set point. Heated water from this system cools molds that contain internal thermocouples. The process is monitored: each cycle, the temperature is evaluated and product is dispositioned based on alarm limits.

What distribution would be appropriate for a system where the system is constantly being corrected?

#### bobdoering

##### Trusted
> What distribution would be appropriate for a system where the system is constantly being corrected?
If the process is constantly adjusted, generating a random-looking curve, it tends toward normal through overcontrol.

If the process is adjusted between limits, generating a sawtooth curve, it tends toward a continuous uniform distribution.
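A quick simulation shows why sampling a sawtooth uniformly in time yields an approximately uniform distribution between the adjustment limits. All parameters here (limits, drift per cycle) are invented for illustration, not taken from any real process:

```python
def sawtooth_samples(lower, upper, drift_per_cycle, n):
    """Simulate a process that drifts upward at a constant rate and is
    reset to the lower limit whenever it reaches the upper limit."""
    samples, x = [], lower
    for _ in range(n):
        samples.append(x)
        x += drift_per_cycle
        if x >= upper:
            x = lower  # adjustment back to the low side
    return samples

data = sawtooth_samples(lower=0.0, upper=1.0, drift_per_cycle=0.013, n=10_000)

# Sampled evenly in time, the values spread evenly between the limits:
mean = sum(data) / len(data)                          # near 0.5 for uniform on [0, 1)
var = sum((d - mean) ** 2 for d in data) / len(data)  # near 1/12 for that uniform
```

The mean and variance land near the theoretical uniform values (1/2 and 1/12 of the squared range), which is what a histogram of such a process would also show.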

#### isolytical

##### Involved In Discussions
Ad hoc application of distributions to data is not the best use of time. Visualize the data by plotting it first. From the description, the data may have a time-related component, so a run-sequence plot comes first; it gives an idea of variation and outliers. Next, plot the lag-1 autocorrelation to check for randomness. Lastly, make probability plots: normal, plus any other distribution suggested by the previous plots.
These plots will show whether the process is in control before any distribution is applied, with the added advantage of possibly identifying the distribution.
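As a sketch of that randomness check, the lag-1 autocorrelation can be computed directly. The AR(1) coefficient of 0.95 below is just an illustrative stand-in for a drifting, constantly corrected process, not a value from the thread:

```python
import random

def lag1_autocorrelation(xs):
    """Correlation between x[t] and x[t-1]: near 0 for random data,
    near +1 for slowly drifting (time-dependent) data."""
    n = len(xs)
    mean = sum(xs) / n
    num = sum((xs[i] - mean) * (xs[i - 1] - mean) for i in range(1, n))
    den = sum((x - mean) ** 2 for x in xs)
    return num / den

random.seed(1)
white = [random.gauss(0, 1) for _ in range(5000)]       # random, independent
drift = [0.0]
for _ in range(4999):                                    # AR(1) drifting process
    drift.append(0.95 * drift[-1] + random.gauss(0, 1))

r_white = lag1_autocorrelation(white)   # close to 0: random
r_drift = lag1_autocorrelation(drift)   # close to 0.95: time-structured
```

A lag-1 value well away from zero is the numeric counterpart of the visible structure an autocorrelation plot would reveal.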

#### bobdoering

##### Trusted
Actually, you should consider the expected distributions before analyzing your data. First, prepare your total variance equation to determine which variances you expect and what distributions they are likely to generate. The process output will be the sum of those distributions. It is key to reduce the effects of gage, measurement, and other minor process variances so they do not mask the main variances.
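The additivity behind a total variance equation can be sketched numerically. The component distributions and their spreads below are invented for illustration only; a real equation would use the variances identified for the actual process:

```python
import random

random.seed(2)
N = 20_000
# Hypothetical independent variance components (values are illustrative):
tool_wear   = [random.uniform(-0.5, 0.5) for _ in range(N)]  # sawtooth-like, uniform
gage_error  = [random.gauss(0, 0.05) for _ in range(N)]      # measurement noise
temperature = [random.gauss(0, 0.02) for _ in range(N)]      # minor thermal effect

observed = [a + b + c for a, b, c in zip(tool_wear, gage_error, temperature)]

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# For independent components, the variances add:
total = var(observed)
parts = var(tool_wear) + var(gage_error) + var(temperature)
```

Because the minor components contribute so little variance here, the observed distribution is dominated by the uniform tool-wear term, which is the masking argument in numeric form.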

Then do your time-ordered sequence chart. Look for signs of a function (variation as a function of time, such as a rate; e.g., tool wear rate, heating rate, etc.), such as runs in one direction or another throughout the chart. If you are not getting your expected distribution, you may need to go back to the equation and either find or reduce the effect of the other variances before you can ever claim control or determine randomness and independence. After all, you can often be in control yet not random and independent, if you can identify your process output as a function. And if that is the case, you cannot use Shewhart charts to monitor that kind of process. They will yield signals exactly opposite of the true process condition.
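A toy example (limits and wear rate invented) of why a run-based Shewhart rule fires on a perfectly functioning sawtooth process: the steady drift produces long monotone runs, which the run rules read as out-of-control even though the drift-and-reset cycle is exactly how the process is supposed to behave.

```python
def longest_increasing_run(xs):
    """Length of the longest run of strictly increasing consecutive points."""
    best = cur = 1
    for a, b in zip(xs, xs[1:]):
        cur = cur + 1 if b > a else 1
        best = max(best, cur)
    return best

# An in-control tool-wear process: steady drift, periodic reset at the limit
xs, x = [], 0.0
for _ in range(200):
    xs.append(x)
    x += 0.02
    if x > 0.5:
        x = 0.0

run = longest_increasing_run(xs)
# A typical run rule flags 6-7 consecutive increasing points as a signal;
# this process produces runs far longer than that on every cycle.
```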

Of course, curve fitting is a far more straightforward analysis of the resulting distribution than a probability plot.

Rummaging through the data as-is, without this thought process, will not be the best use of time.


#### bobdoering

##### Trusted
I know, right? Your approach, which assumes a single homogeneous distribution, is not only vintage; the assumption is also false for most industrial processes.


##### Staff member
> Most things in life, left to their own nature, follow the normal distribution. However, GD&T features and other characteristics that drive toward zero are by design not normal (log-normal, exponential, etc.). Does anyone have a list, beyond these few, of appropriate non-normal distributions?
>
> The specific example I am working with is a water heater controlled by a PID system (non-normal, or tampering) that constantly corrects for "error" from the set point. Heated water from this system cools molds that contain internal thermocouples. The process is monitored: each cycle, the temperature is evaluated and product is dispositioned based on alarm limits.
>
> What distribution would be appropriate for a system where the system is constantly being corrected?
For your system, a straight line.

I'm not sure if this is what you're looking for, but a process controller like that shouldn't have swing in it. The controller is not tuned correctly; it sounds like the integral and/or derivative term is not set right. Does the controller have an auto-tune function?

Once it's tuned correctly, the controller should bring the heat up, level off, and hold the temperature right at the setpoint.
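As a rough sketch of that behavior, here is a minimal first-order heater model under PID control. All gains and plant constants are invented for illustration and would need tuning for a real system; the point is only that a reasonably tuned loop settles at the set point:

```python
def simulate_pid(kp, ki, kd, setpoint=60.0, steps=600, dt=1.0):
    """First-order water-heater model under PID control (all constants
    are hypothetical, chosen only to demonstrate settling behavior)."""
    temp, integral, prev_err = 20.0, 0.0, setpoint - 20.0
    history = []
    for _ in range(steps):
        err = setpoint - temp
        integral += err * dt
        deriv = (err - prev_err) / dt
        power = max(0.0, kp * err + ki * integral + kd * deriv)
        # Plant: heating proportional to power, loss toward 20 degree ambient
        temp += (0.05 * power - 0.02 * (temp - 20.0)) * dt
        prev_err = err
        history.append(temp)
    return history

history = simulate_pid(kp=2.0, ki=0.05, kd=1.0)
final = history[-1]                                 # settles near the setpoint
swing = max(history[-100:]) - min(history[-100:])   # residual oscillation, tiny
```

With badly chosen gains the same loop would oscillate around the set point instead of settling, which is the "swing" being discussed here.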

#### bobdoering

##### Trusted
Even with auto-tune, you get a sawtooth of on and off cycles. They may be small or large depending on the requirements. As in any sawtooth process, the tighter the requirement, the more frequent the adjustment, even if it is electronic. But it won't be a straight line unless you stand back a ways....

#### Miner

##### Forum Moderator
##### Staff member
##### Super Moderator
I agree with Bob. That has been my experience also. The degree of sawtooth depends on a lot of variables (e.g., controller, inertia of the system being controlled, etc.).

##### Staff member
It all depends....

I have seen perfectly tuned systems. If the rate term is set properly, it should anticipate the rate of change and adjust accordingly. There are many applications (aerospace, aluminum heat treating to +/-5 °F) where there cannot be any overshoot.

Now... here's the thing... it also depends on the sensitivity of the probe and its location. They may have an MgO-sheathed probe, which is not as sensitive, while the recorder is connected to a wire thermocouple, which is much more sensitive. The controller can still be adjusted to remove a lot of the swing.

Also, I will get completely different results using 22 AWG versus 16 AWG thermocouples.

So the type of distribution can end up being any of a number of things, depending on the type of control error.