
Transformation of Data Normality Failed

#1
Hello All,

Recently I collected around 24,000 readings from a camera measuring a distance of 2 mm ± 1 mm. I am trying to understand the capability of the system. This is a manufacturing line.
  1. Data was cleaned
  2. Outliers were removed
The normality test failed. I tried to transform the data with all the possible options in Minitab, including Box-Cox, Johnson, and the rest. I am now stuck, as I cannot predict the PPMs.
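
For reference, this is roughly the equivalent of what I tried, sketched in Python rather than Minitab (the file name, column name, and the 1-3 mm spec limits are my assumptions):

```python
import numpy as np
import pandas as pd
from scipy import stats

# Load the ~24,000 camera readings (placeholder file/column names)
readings = pd.read_excel("camera_readings.xlsx")["distance"].to_numpy()

# Anderson-Darling normality test on the raw readings
ad = stats.anderson(readings, dist="norm")
print("A-D statistic:", ad.statistic, "5% critical value:", ad.critical_values[2])

# Box-Cox needs strictly positive data; Yeo-Johnson does not
transformed, lam = stats.boxcox(readings)
print("Box-Cox lambda:", lam)
print("A-D after transform:", stats.anderson(transformed, dist="norm").statistic)

# With this many readings, a distribution-free PPM estimate against the
# 2 mm +/- 1 mm spec (LSL = 1, USL = 3) is at least a fallback:
lsl, usl = 1.0, 3.0
ppm = 1e6 * np.mean((readings < lsl) | (readings > usl))
print("Empirical out-of-spec PPM:", ppm)
```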

Any ideas on what my next step should be, please? I can upload the data if required.

Thank you
D.
 
#2
Try running ALL of the data (before cleaning and outlier removal). One thing I have always questioned is the rationale used for removing outliers. If they are part of your 'normal' process, then they should be included (IMHO).
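
To illustrate with made-up numbers why trimming can mislead (the 2 mm target and 1-3 mm limits are taken from the first post):

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulated process: mostly stable, plus occasional shifted readings
base = rng.normal(2.0, 0.15, 23000)
shifted = rng.normal(2.6, 0.15, 1000)  # "outliers" that are really part of the process
all_data = np.concatenate([base, shifted])

# Typical 3-sigma trimming rule
trimmed = all_data[np.abs(all_data - all_data.mean()) < 3 * all_data.std()]

lsl, usl = 1.0, 3.0
for label, d in [("all data", all_data), ("trimmed", trimmed)]:
    ppm = 1e6 * np.mean((d < lsl) | (d > usl))
    print(f"{label}: n={len(d)}, sigma={d.std(ddof=1):.3f}, empirical PPM={ppm:.0f}")
```

Trimming shrinks both the estimated sigma and the out-of-spec count, so the capability looks better than what the customer actually receives.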
 

Miner

Forum Moderator
Staff member
Admin
#3
Please upload the data as an Excel file. Many here do not use Minitab.

There are several reasons it might fail a normality test. You may have an unstable process, mixed process streams, or chunky data. It will be easier to tell with the data.
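
In the meantime, a rough screen for those three causes looks something like this (Python sketch; the file and column names are placeholders for whatever your export uses):

```python
import numpy as np
import pandas as pd

readings = pd.read_excel("camera_readings.xlsx")["distance"]  # placeholder names

# 1. Unstable process: shifts and drift show up in a rolling mean
rolling = readings.rolling(window=500).mean()
print("Rolling-mean spread:", float(rolling.max() - rolling.min()))

# 2. Mixed streams: multiple peaks in the histogram are the telltale sign
counts, edges = np.histogram(readings, bins=50)
print("Top-3 bin left edges:", edges[np.argsort(counts)[-3:]])

# 3. Chunky data: too few distinct values for a continuous measurement
print("Distinct values:", readings.nunique(), "of", len(readings), "readings")
```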
 
#6
Thank you, Miner.
I tried removing the outliers, but the data is still not normal.

Do you think the data will be biased if I remove the outliers?
 

Miner

Forum Moderator
Staff member
Admin
#7
Even with the outliers removed, you have distinct shifts and drift in the process that will result in a non-normal distribution that cannot be transformed. Do you have more than one process stream? Are there changes in material or tooling that would cause these shifts? Additional information about the process would help us.
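
If there is more than one stream, a quick check is to test each stream on its own; a sketch, assuming your export has columns I will call 'stream_id' and 'distance':

```python
import pandas as pd
from scipy import stats

df = pd.read_excel("camera_readings.xlsx")  # placeholder file name

# A mixture of streams with different means fails normality even if
# every individual stream is perfectly well-behaved on its own
for stream, grp in df.groupby("stream_id"):
    result = stats.anderson(grp["distance"].to_numpy(), dist="norm")
    print(f"stream {stream}: n={len(grp)}, mean={grp['distance'].mean():.3f}, "
          f"A-D={result.statistic:.2f} (5% crit={result.critical_values[2]:.2f})")
```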
 
#8
So basically, what you are seeing is one big web of paper, with foil being applied onto the paper. There are 4 traverses being done at the same time, and the camera is reading the position of the foil. So the same machine is applying the foil stripes at the same time.
 

Miner

Forum Moderator
Staff member
Admin
#9
The out-of-control issue is primarily driven by abrupt process shifts. Do you have any automatic (or manual) adjustments that might be occurring? That is the major cause of the normality failure. There is a secondary issue you will have to deal with after addressing the shifts: the poor resolution of the measurement system. In the graph below, I extracted a portion of the data that was stable and in control. It still failed a normality test due to the chunkiness of the measurements, which shows up as the discrete vertical groupings of points in the plot.
[Attachment: Probability Plot of 1.85.jpg]
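
You can reproduce the chunkiness effect with simulated numbers: round a perfectly normal sample to a coarse resolution and the Anderson-Darling test starts rejecting (illustrative values only):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
smooth = rng.normal(2.0, 0.05, 5000)   # a truly normal process
chunky = np.round(smooth, 1)           # same data at 0.1 mm resolution

for label, d in [("smooth", smooth), ("chunky", chunky)]:
    result = stats.anderson(d, dist="norm")
    print(label, "A-D:", round(result.statistic, 2),
          "5% critical:", round(result.critical_values[2], 2),
          "distinct values:", len(np.unique(d)))
```

The rounded series ends up with only a handful of distinct values, which is exactly the vertical banding in the plot above.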
 
#10
Hello Miner and thank you.
The data is being captured automatically by a camera system that does the measurements. So would three decimal places make things better?

The machine is adjusted manually, so it could be that the operator was making adjustments, or the machine is varying due to mechanical issues. I have to look into that.
 