# What would be the appropriate control chart for the attached data?

#### D.Salman

Dear Experts,
What would be the appropriate control chart for the attached data?
Data from one month
Specification limits between 0 to 1200 Minutes.
In most cases, sample size was variable, except the light turquoise cases, we have one sample per day.
Thanks.

#### Attachments

• (attachment, 17 KB)

#### David DeLong

With a sample size of one, you could use an individuals and moving range (I-MR) chart. Variable sample sizes???
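For individual values, the standard I-MR limits come from the average moving range. A minimal sketch of the arithmetic; the data values here are hypothetical, not taken from the thread's attachment:

```python
# Sketch of individuals & moving range (I-MR) chart limits.
# The data values below are hypothetical stand-ins for the attached data.
values = [410, 380, 455, 390, 420, 370, 440, 400, 415, 385]

# Moving ranges: absolute difference between consecutive points
moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)
x_bar = sum(values) / len(values)

# Standard I-MR constants for n = 2: d2 = 1.128, so 3/d2 ≈ 2.66; D4 = 3.267
ucl_x = x_bar + 2.66 * mr_bar
lcl_x = x_bar - 2.66 * mr_bar
ucl_mr = 3.267 * mr_bar

print(f"X chart:  mean={x_bar:.1f}  UCL={ucl_x:.1f}  LCL={lcl_x:.1f}")
print(f"MR chart: mean={mr_bar:.1f}  UCL={ucl_mr:.1f}")
```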

#### Steve Prevette

##### Deming Disciple
Staff member
Super Moderator
Assuming that these are individual results, I went with a simple X chart. I used the initial data from 7/1 to 7/15, ignoring any results greater than 700. That gave a good prediction of the 7/22-and-on data, which seems to show that the cause of the >700 points has been eliminated.

#### Attachments

• (attachment, 35 KB)

#### Pudge 72

HMMM.....
The data troubles me.
Assuming that this is a process or operation, we go from 38 minutes in one subset on a given date to 1,125 minutes for the same operation on the same day? That's 18 hours of work vs. less than 1 hr.?
Something seems to be awry. I would scrub the data and the process before analyzing this any further and wasting time trying to see something that isn't there.
1st: Set up a precise measurement method for the data being taken; measure whatever activity is taking place in "groups" of five and get at least 5 recorded readings if possible.
2nd: Take the "apples to apples" approach. With the kind of variance we are seeing, are we recording the different subsets in the same manner? I think we can establish, even without further investigation, that either a huge variable is lurking or the data is corrupt and improperly recorded; whether that is a result of technique or another factor, we need to analyze that as well.
Then, after those issues are dealt with, let's retake the data with some semblance of order and method and see what we get......

#### D.Salman

Dear Experts,
Thanks for the information.
As Mr. Steve said, these are individual values.
Kindly, may I ask the following questions?
• Why ignore any results greater than 700?
• Why only from 7/1 to 7/15? What about 7/16 to 7/21 and 7/22 to 7/31 as well?
Many thanks in advance.

#### Steve Prevette

##### Deming Disciple
Staff member
Super Moderator
• Why ignore any results greater than 700?
When I set up the initial average and control limits using the first 25 points, the UCL was around 500. That gave me an indication that the points above 500 were outliers, so I reran the average and control limits with those outliers thrown out. Visually, you can plot the dots and see a wide gulf between the majority of the data and the relatively small number of points above 500.
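The screening step Steve describes can be sketched as follows. The cutoff of 700 comes from his earlier post; the data values and the resulting limits are hypothetical, not the actual attachment:

```python
# Sketch of the outlier-screening step: drop the points above the visible
# gap (Steve used 700), then recompute the X-chart average and limits.
# All data values are hypothetical stand-ins for the thread's attachment.

def x_chart_limits(data):
    """Individuals-chart centerline and 3-sigma limits via the moving range."""
    mean = sum(data) / len(data)
    mrs = [abs(b - a) for a, b in zip(data, data[1:])]
    mr_bar = sum(mrs) / len(mrs)
    return mean, mean + 2.66 * mr_bar, mean - 2.66 * mr_bar

baseline = [410, 380, 455, 390, 1125, 420, 370, 980, 440, 400]

cutoff = 700  # chosen from the wide gulf visible in the dot plot
screened = [x for x in baseline if x <= cutoff]
mean, ucl, lcl = x_chart_limits(screened)
print(f"screened mean={mean:.1f}  UCL={ucl:.1f}  LCL={lcl:.1f}")
```

Recomputing without the outliers keeps the inflated moving ranges around the >700 points from widening the limits.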

• Why only from 7/1 to 7/15? What about 7/16 to 7/21 and 7/22 to 7/31 as well?
Dr. Shewhart's original work said not to declare a process stable without 25 stable data points, so I usually use the first 25 points when I set up my average and control limits. If there are more than 25 points (as in this case), I can validate the initial average and UCL by seeing whether they predict the future values. The baseline average and UCL I came up with using the 7/1 to 7/15 data (with outliers removed) predicted the data after 7/15 quite well (no statistical signals). As a double-check, I computed the average and UCL from the 7/15 to 7/31 data, and they were within a few percent of the 7/1 to 7/15 baseline.
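The validation step above can be sketched as: freeze the baseline limits, then check whether later points produce any statistical signals. All numbers here are hypothetical, and the run-of-8 rule is one common signal test, not necessarily the exact set Steve applied:

```python
# Sketch of baseline validation: frozen limits from the 7/1-7/15 baseline,
# checked against later data. All numbers are hypothetical illustrations.
baseline_mean, baseline_ucl, baseline_lcl = 408.0, 545.0, 271.0

later = [430, 395, 460, 410, 385, 420, 445, 400]  # e.g. results after 7/15

# Signal 1: any point beyond the frozen control limits
beyond_limits = [x for x in later if x > baseline_ucl or x < baseline_lcl]

# Signal 2: a run of 8 consecutive points on the same side of the mean
def longest_run(data, mean):
    """Length of the longest run of points strictly on one side of the mean."""
    best = run = 0
    last_side = 0
    for x in data:
        side = 1 if x > mean else -1 if x < mean else 0
        run = run + 1 if side != 0 and side == last_side else (1 if side else 0)
        last_side = side
        best = max(best, run)
    return best

stable = not beyond_limits and longest_run(later, baseline_mean) < 8
print("baseline predicts the new data" if stable else "statistical signal found")
```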

For more information on this logic, see http://www.hanford.gov/rl/uploadfiles/VPP_4_SPC.ppt and http://www.hanford.gov/rl/uploadfiles/VPP_20_Life_Cycle.ppt