
Capability Analysis of Non-Normal Data


Seyed

#1
Hey,

I have some problems with my capability analyses of my data. The data that I have are from a temperature uniformity survey of a heat treatment furnace in the automotive industry. I have used 20 different thermocouples positioned throughout the cross-section of the furnace (imagine two parallel boxes with 10 thermocouples mounted on each).

The total process time is 7 hours, and every three seconds the temperatures are stored from all the thermocouples. Since the data set is huge, I have averaged the data over each 90-second interval.

How do I know that my data are non-normal? I have performed the Anderson-Darling test and the p-values are less than 0.05, and therefore I assumed that the data are non-normal.

I have also tried to use the Box-Cox transformation to transform the data, but I really don't understand the output, as the input is temperature (in the range of 890 - 920 °C). The output is values in the range of 650E+15, which I don't understand.

How can I perform a correct capability analysis and calculate my Pp, Ppk, Cp and Cpk using my non-normal data? And how can I transform the data and understand the output?

Is there anybody who can guide me through this problem? I can give you more information upon request.
 

Miner

Forum Moderator
Staff member
Admin
#2
Re: Capability Analyses of non-normal data

Seyed said:
Hey,

I have some problems with my capability analyses of my data. The data that I have are from a temperature uniformity survey of a heat treatment furnace in the automotive industry. I have used 20 different thermocouples positioned throughout the cross-section of the furnace (imagine two parallel boxes with 10 thermocouples mounted on each).
Before analyzing the aggregate data, I recommend that you analyze the individual process streams (i.e., each thermocouple) separately. Individual streams may be normal, yet the aggregate be non-normal because each stream average is different.
Seyed said:
The total process time is 7 hours, and every three seconds the temperatures are stored from all the thermocouples. Since the data set is huge, I have averaged the data over each 90-second interval.
Temperature data are almost guaranteed to be autocorrelated. This means that each temperature reading is dependent on the temperature reading taken 3 seconds earlier. Perform an autocorrelation analysis. Once you have identified the period of autocorrelation (i.e., the period during which the dependency exists), select an individual temperature measurement at a period longer than that. Do not average.
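
Minitab has a built-in autocorrelation function for this; for readers working outside Minitab, here is a minimal Python sketch of the same check (the file name, column name, and number of lags are assumptions):

```python
# A minimal sketch of the autocorrelation check described above, assuming the
# raw 3-second readings for one thermocouple are in a CSV column named "tc01"
# (file and column names are hypothetical).
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import acf

readings = pd.read_csv("furnace_tc01.csv")["tc01"].to_numpy()

# ACF with an approximate 95% band for "no autocorrelation"
lags = 200
r = acf(readings, nlags=lags, fft=True)
conf = 1.96 / np.sqrt(len(readings))

# First lag at which the ACF falls inside the +/- conf band
inside = np.where(np.abs(r[1:]) < conf)[0]
first_free_lag = inside[0] + 1 if inside.size else None

sample_interval_s = 3  # readings are stored every 3 seconds
if first_free_lag is not None:
    print(f"Autocorrelation dies out at lag {first_free_lag} "
          f"(about {first_free_lag * sample_interval_s} seconds)")
else:
    print("Autocorrelation persists beyond the lags examined")
```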

Seyed said:
How do I know that my data are non-normal? I have performed the Anderson-Darling test and the p-values are less than 0.05, and therefore I assumed that the data are non-normal.
You are correct in your approach and decision. The trick is in figuring out WHY the data are non-normal. I suspect that it is multi-modal from the mixing of 20 process streams.
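
As an illustration of the per-thermocouple normality check, a hedged Python sketch follows; note that scipy's Anderson-Darling routine reports a test statistic and critical values rather than the p-value Minitab prints (file and column names are hypothetical):

```python
# A sketch of the per-thermocouple Anderson-Darling normality check, assuming
# the steady-state data sit in a DataFrame with one column per thermocouple.
import pandas as pd
from scipy import stats

df = pd.read_csv("furnace_steady_state.csv")   # hypothetical file

for col in df.columns:
    res = stats.anderson(df[col].dropna(), dist="norm")
    # critical value at the 5% significance level
    crit_5pct = res.critical_values[list(res.significance_level).index(5.0)]
    verdict = "non-normal" if res.statistic > crit_5pct else "consistent with normal"
    print(f"{col}: AD = {res.statistic:.3f}, 5% critical = {crit_5pct:.3f} -> {verdict}")
```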

Once you have determined whether autocorrelation exists, and have sampled at an interval greater than the autocorrelation period, plot the data on I-MR charts by thermocouple. This will tell you whether each thermocouple zone is stable and what mean temperature and variance exist in that zone. This will be extremely important if improvements are necessary.
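
For reference, the I-MR limits can also be computed by hand using the usual moving-range constants (2.66 = 3/d2 and 3.267 = D4 for a span of 2); a minimal Python sketch, with the data source and column name as assumptions:

```python
# A minimal sketch of I-MR control limits for one thermocouple.
import numpy as np
import pandas as pd

x = pd.read_csv("furnace_steady_state.csv")["tc01"].to_numpy()  # hypothetical

mr = np.abs(np.diff(x))            # moving ranges of consecutive points
mr_bar = mr.mean()
x_bar = x.mean()

i_ucl = x_bar + 2.66 * mr_bar      # 2.66 = 3 / d2 for a moving range of 2
i_lcl = x_bar - 2.66 * mr_bar
mr_ucl = 3.267 * mr_bar            # D4 for n = 2; the MR LCL is 0

print(f"Individuals chart: CL = {x_bar:.2f}, UCL = {i_ucl:.2f}, LCL = {i_lcl:.2f}")
print(f"Moving range chart: CL = {mr_bar:.2f}, UCL = {mr_ucl:.2f}")
```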

You may consider other tests, such as a one-way ANOVA followed by multiple comparisons of means, to determine whether the differences between thermocouples are significant. You can safely pool those thermocouples that are not statistically different. Do not pool those that are different from each other. A sketch of this step follows.
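
Here is a hedged Python sketch of that ANOVA-plus-multiple-comparisons step, assuming the readings have been stacked into a "long" table with hypothetical column names "temp" and "thermocouple":

```python
# A sketch of a one-way ANOVA across thermocouples followed by Tukey HSD.
import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

long_df = pd.read_csv("furnace_long.csv")      # hypothetical file

groups = [g["temp"].to_numpy() for _, g in long_df.groupby("thermocouple")]
f_stat, p_val = stats.f_oneway(*groups)
print(f"One-way ANOVA: F = {f_stat:.2f}, p = {p_val:.4f}")

# Tukey HSD identifies which thermocouple pairs actually differ;
# pairs that are not significantly different are candidates for pooling.
tukey = pairwise_tukeyhsd(long_df["temp"], long_df["thermocouple"], alpha=0.05)
print(tukey.summary())
```
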
Seyed said:
I have also tried to use the Box-Cox transformation to transform the data, but I really don't understand the output, as the input is temperature (in the range of 890 - 920 °C). The output is values in the range of 650E+15, which I don't understand.
I am not a big fan of transforming data for SPC or capability studies, and this is one reason among many: once you transform the data, the capability indices are the only numbers with any meaning.
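
To see why the transformed output no longer resembles a temperature: Box-Cox raises the data to the fitted power lambda, and for readings near 900 even a moderate lambda produces astronomically large numbers. A small illustrative Python sketch (the lambda used here is an assumption, not the value Minitab actually fitted):

```python
# A sketch showing why Box-Cox output can look like 1E+17 rather than a
# temperature: the transform is (x**lam - 1) / lam, so x**lam dominates.
import numpy as np
from scipy.special import boxcox   # applies (x**lam - 1) / lam for lam != 0

temps = np.array([890.0, 905.0, 920.0])
lam = 6.0                          # illustrative lambda, assumed for the example
print(boxcox(temps, lam))          # roughly 8e16 to 1e17 -- no longer "temperatures"
```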

Seyed said:
How can I perform a correct capability analysis and calculate my Pp, Ppk, Cp and Cpk using my non-normal data? And how can I transform the data and understand the output?

Is there anybody who can guide me through this problem? I can give you more information upon request.
I use Minitab's non-normal capability analysis, which allows you to analyze the data in the untransformed state. You do not have this issue with meaningless numbers.
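
For readers without Minitab, one common percentile-based approach to non-normal capability works directly on the untransformed data: fit a distribution and replace the plus/minus 3-sigma span with its 0.135th and 99.865th percentiles. A minimal Python sketch, where the spec limits, distribution choice, file and column names are all assumptions:

```python
# A sketch of a percentile-based non-normal capability calculation.
import pandas as pd
from scipy import stats

x = pd.read_csv("furnace_steady_state.csv")["tc01"]   # hypothetical data
LSL, USL = 890.0, 920.0                                # hypothetical spec limits

# Fit a candidate distribution (Weibull shown; in practice choose the best fit)
params = stats.weibull_min.fit(x)
p00135, p50, p99865 = stats.weibull_min.ppf([0.00135, 0.5, 0.99865], *params)

Pp = (USL - LSL) / (p99865 - p00135)
Ppk = min((USL - p50) / (p99865 - p50), (p50 - LSL) / (p50 - p00135))
print(f"Pp = {Pp:.2f}, Ppk = {Ppk:.2f}")
```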
 

Seyed

#3
Thanks for your support,

More information (comment to the reply from Mr. Miner):

As the temperature increases from 600 °C (when the charge enters the furnace) up to 910 °C and back down to room temperature (when the charge leaves the furnace), it is difficult to analyze the entire process time at once. One has to divide the process time into several sections; however, as NADCAP advises, one can select the region where the temperature should be stable. That stable period is approximately 2 hours, during which the temperature is in the range of 890 - 920 °C.
Within that period I have averaged the data, as mentioned before, in order to reduce the number of data points, and I have analyzed each thermocouple separately. Only three channels are normally distributed and 17 are non-normal.

The question is: should the data be non-normal? The process should be stable in the region of interest; however, we know that when the door (entrance or exit) is opened there will be a drop in temperature, which will influence the readings.

So I think the idea of autocorrelation is important, and the analysis should be performed in order to identify the period of autocorrelation. But should the data still be averaged, or not?

Thanks in advance for your expertise and comments.
Seyed
 

Miner

Forum Moderator
Staff member
Admin
#4
Seyed said:
As the temperature increases from 600 °C (when the charge enters the furnace) up to 910 °C and back down to room temperature (when the charge leaves the furnace), it is difficult to analyze the entire process time at once. One has to divide the process time into several sections; however, as NADCAP advises, one can select the region where the temperature should be stable. That stable period is approximately 2 hours, during which the temperature is in the range of 890 - 920 °C.
Within that period I have averaged the data, as mentioned before, in order to reduce the number of data points, and I have analyzed each thermocouple separately. Only three channels are normally distributed and 17 are non-normal.
Are the non-normal thermocouples those closest to the doors and most likely to be influenced by the temperature loss when the doors are opened?

Seyed said:
The question is: should the data be non-normal? The process should be stable in the region of interest; however, we know that when the door (entrance or exit) is opened there will be a drop in temperature, which will influence the readings.
I would stratify the data into three (or more) stages: 1) ramp-up after the doors are closed; 2) steady state; and 3) ramp-down when the doors are opened. Analyze autocorrelation, normality and capability for stage 2 (steady state). I would analyze stages 1 and 3 separately, using the same approach: analyze autocorrelation, then use time series analysis to understand the ramp-up and ramp-down effects. If you have more stages than this, use a similar approach for each stage.
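
A minimal Python sketch of that stratification step, assuming a time-stamped long table and purely illustrative stage boundaries (file, column names and cut points are assumptions):

```python
# A sketch of splitting the run into ramp-up / steady state / ramp-down stages.
import pandas as pd

df = pd.read_csv("furnace_long.csv", parse_dates=["timestamp"])   # hypothetical
t0 = df["timestamp"].min()
elapsed_h = (df["timestamp"] - t0).dt.total_seconds() / 3600

# Hypothetical stage boundaries in hours; replace with the actual event times.
df["stage"] = pd.cut(elapsed_h, bins=[0, 2.0, 4.0, 7.0],
                     labels=["ramp-up", "steady state", "ramp-down"],
                     include_lowest=True)

# Each stage (and each thermocouple within it) is then analyzed separately.
steady = df[df["stage"] == "steady state"]
```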

Seyed said:
So I think the idea of autocorrelation is important, and the analysis should be performed in order to identify the period of autocorrelation. But should the data still be averaged, or not?
Do not average the data before performing the autocorrelation study. When you average data, you lose information about your process. As I said before, once you know the period of autocorrelation select one measurement at intervals slightly greater than the period of autocorrelation. This will reduce your data set.
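
A small Python sketch of that thinning step, with the autocorrelation period, file and column names as assumptions:

```python
# Keep one reading per interval slightly longer than the autocorrelation
# period instead of averaging.
import pandas as pd

raw = pd.read_csv("furnace_tc01.csv")["tc01"]   # hypothetical raw 3-second data

sample_interval_s = 3
autocorr_period_s = 120          # e.g. from the ACF study; illustrative value
step = int(autocorr_period_s / sample_interval_s) + 1   # slightly longer

thinned = raw.iloc[::step]       # approximately independent readings
print(f"Kept {len(thinned)} of {len(raw)} readings")
```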
 

bobdoering

Stop X-bar/R Madness!!
Trusted Information Resource
#5
Miner said:
When you average data, you lose information about your process.
That is the bottom line!! Any chance of sharing the raw data in spreadsheet form, by thermocouple, with conditions identified (ramp start, door open, etc.)?
 

Seyed

#6
Hello again,

I have tried to perform the autocorrelation analysis on the unaveraged data in the steady state. I have also tried to read through different guidelines on how to perform an autocorrelation analysis and how to interpret the results.

However, I have a couple of questions:
1) Should I perform the autocorrelation for each channel by itself, or should I put all the data in one row in Minitab and perform the autocorrelation for all the data at once?
2) After performing the autocorrelation for channel 1, at all the lags the autocorrelation values (α = 0.05) are larger than the 95% significance limits. What does this mean?

Check the PDF file attached.

Thanks for your support.
Seyed
 

Attachments

Miner

Forum Moderator
Staff member
Admin
#7
Analyze each thermocouple separately. The attached graphs do show a high degree of autocorrelation. When the blue bars dip between the two red lines, autocorrelation no longer exists. The corresponding number of lags multiplied by the sampling interval is the period of autocorrelation; for example, if the bars first fall inside the limits at lag 40 and the readings are 3 seconds apart, the period is roughly 120 seconds.

Use data points selected at an interval greater than the period of autocorrelation and analyze them as discussed previously.
 

Seyed

#8
Yes, of course,

I will attach an Excel file with 4 different thermocouples. The time period selected is within the stable zone. Thanks for your support!

Seyed
 

Seyed

#9
Hey,

Here is the Excel file for those who would like a challenge. And of course I am really thankful for all your support and expertise.

Seyed

P.S. Looking forward to your help.
 

Attachments

bobdoering

Stop X-bar/R Madness!!
Trusted Information Resource
#10
Seyed said:
Hey,

Here is the Excel file for those who would like a challenge. And of course I am really thankful for all your support and expertise.

Seyed

P.S. Looking forward to your help.
Since you are looking for capability, what is the tolerance for the region of the data you supplied?
 