It’s important to emphasize that control limits are not the same thing as engineering specification limits.
Control limits are designed to monitor output over time to ensure that the system continues to produce
consistent output. Output produced outside of control limits sends a signal that something extraordinary
has occurred and the firm should investigate. One way to illustrate this in class is to bump into the
projector in the middle of introducing control charts. Then point out that if you did not turn around
periodically to look at the screen, you would never know that the slide is now projecting to the ceiling. So
even though the equipment was set up perfectly at the beginning, we use control charts (turn around and
look at our output) to periodically check to ensure that output still looks OK. Spec limits, on the other
hand, are fixed engineering measurements that define exactly what determines acceptable output or not.
Any output produced outside of spec limits must be rejected. It is possible for output to exceed control
limits and still be within spec limits. And, theoretically at least, it is possible for output that exceeds spec limits to appear to be within statistical control.
With SPC charts, students can get tripped up in several places. First, they must be able to choose the
appropriate chart for the problem at hand. For variables (anything that can be measured along a number
line, such as length, temperature, volume, etc.), both x-bar charts and R-charts must be used. Figure S6.5 (and slides
S6-34 and S6-35) show nicely why both are necessary. Choosing the correct chart for attributes (binary
output such as percent defects) can be a bit tricky. The p-chart is used to monitor the percentage over
time, while the c-chart is used to monitor the number of occurrences over time. But what if students are
provided a set of number of errors over time, as in Example S4, but without a “Fraction Defective”
column? The answer is that they should convert those numbers to percentages and use a p-chart. What’s
the difference? A p-chart is used for output that has a specific sample size. A c-chart, on the other hand,
has no sample size. A c-chart looks at an “event,” which has no practical upper limit on the number of
occurrences. For example, students could make a c-chart to monitor the number of mistakes in the
instructor’s lectures over time. The instructor could have one mistake in a day, or five, or ten, or...there’s
really no upper limit. Hence, there is no sample size and it is impossible to convert the numbers into a
comparable fraction from day to day.
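To make the c-chart case concrete in class, the following sketch (plain Python, with made-up mistake counts) computes the c-chart center line and limits as c-bar ± 3√c-bar for the lecture-mistakes example; note that no sample size appears anywhere in the calculation.

    # Sketch of c-chart limits for the lecture-mistakes example; counts are hypothetical.
    import math

    mistakes_per_lecture = [1, 0, 3, 2, 5, 1, 0, 2, 4, 1]
    c_bar = sum(mistakes_per_lecture) / len(mistakes_per_lecture)   # average count per "event"
    ucl_c = c_bar + 3 * math.sqrt(c_bar)
    lcl_c = max(0.0, c_bar - 3 * math.sqrt(c_bar))   # a negative LCL is rounded up to 0
    print(f"c-chart: center {c_bar:.2f}, UCL {ucl_c:.2f}, LCL {lcl_c:.2f}")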
Another potential area of confusion for students is mixing up sample size and number of samples,
particularly for x-bar and R-charts, and especially if the data are presented with output from each sample
going down the columns of a table instead of across the rows. Students sometimes also do not fully
realize that x-bar charts examine sample averages. So it is perfectly acceptable to have an individual product
with output that exceeds the control limits as long as the average for that entire sample still lies within the
limits (assuming, of course, that the individual item was still produced within spec limits). With p-charts,
students sometimes enter the number of samples for n instead of the sample size, and they sometimes
round too much within the square root sign for σp. They should be told to carry about four decimal places, at least until after the square root function is applied. Finally, students are sometimes confused by the idea that a percent defective below the LCL is out of control. Here, instructors can stress that
control limits are pointing out the unusual, whether unusually good or unusually bad. If, say, a firm
determined that no defects occurred on the day that the boss walked around the production floor, perhaps
the boss should walk around the production floor more often.
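A short sketch, with hypothetical defect counts and a hypothetical sample size of 100, can be used to walk students through the p-chart arithmetic while hitting the trouble spots just mentioned: n is the sample size (not the number of samples), σp should be carried to about four decimal places, and a negative LCL is rounded up to zero.

    # Sketch of p-chart limits; the counts and sample size are hypothetical.
    import math

    defects_per_sample = [4, 6, 3, 5, 7, 2, 4, 5, 6, 3]  # 10 samples of defect counts
    n = 100                     # sample size (NOT the number of samples, which is 10)
    p_bar = sum(defects_per_sample) / (len(defects_per_sample) * n)
    sigma_p = math.sqrt(p_bar * (1 - p_bar) / n)          # keep ~4 decimals: about 0.0207 here
    ucl_p = p_bar + 3 * sigma_p
    lcl_p = max(0.0, p_bar - 3 * sigma_p)                 # a negative LCL is rounded up to 0
    print(f"p-bar = {p_bar:.4f}, sigma_p = {sigma_p:.4f}, UCL = {ucl_p:.4f}, LCL = {lcl_p:.4f}")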
If instructors cover process capability, this can be a place where the distinction between control limits and
spec limits can be emphasized. It is probably worth covering the process capability ratio Cp in case
students see it at their firms; however, the process capability index Cpk is the one that should be stressed.
It is also easy to calculate but overcomes a major flaw with the Cp value.
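A quick numerical sketch (hypothetical spec limits, mean, and standard deviation) shows why Cpk is preferred: Cp = (USL − LSL)/6σ ignores where the mean sits, while Cpk = min[(USL − μ)/3σ, (μ − LSL)/3σ] penalizes an off-center process.

    # Sketch comparing Cp and Cpk; spec limits, mean, and sigma are hypothetical.
    usl, lsl = 10.0, 4.0     # engineering specification limits
    mu, sigma = 8.5, 0.8     # process mean and standard deviation (mean is off center)

    cp  = (usl - lsl) / (6 * sigma)
    cpk = min((usl - mu) / (3 * sigma), (mu - lsl) / (3 * sigma))
    print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")   # Cp looks fine (1.25) but Cpk flags the off-center mean (0.63)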
Acceptance sampling calculations are rather complicated. They are skipped in this textbook and in most
other introductory operations texts. Interested instructors can refer to Tutorial 2 on the website for more
background.
1. It can be a fun (if humbling) exercise to ask the students to help set up a c-chart to monitor the
instructor’s mistakes during class from day to day. The students could identify the types of errors that
would be counted as a mistake, as well as possible causes for those errors when the number of
mistakes becomes out of control. An alternative exercise might set up a c-chart that would monitor
the number of mistakes displayed in student oral presentations, which could include mistakes of
content as well as presentation skills. Student-identified causes of poor presentations might be
humorous.
2. Perhaps the most common mistake students make when working SPC problems is to confuse sample
size and number of samples. Set up some simple sampling exercises and have the students identify
both values. This seems to work best when the instructor is careful to avoid using either phrase in his
or her description of the processes.
3. Deming, W. Edwards. 1986. "Deming's Experiment." Out of the Crisis. MIT Center for Advanced
Engineering Studies, Cambridge, MA, 346–354. This activity is a substantial variation on the glass
bead experiment. It is used to illustrate the impact of variation which exists within a system and the extent to
which that variation limits the effectiveness with which individuals can be evaluated. After seeing
that the variation in the proportion of red beads is similar to that in the proportion of defectives, the
students should recognize that system variation should be a primary focus of attention rather than
individual efforts.
Company Videos
1. Frito-Lay’s Quality-Controlled Potato Chips (10:15)
Frito-Lay is committed to quality in four key areas: (1) quality ingredients, (2) strict adherence to
recipes, (3) adherence to all process parameters, and (4) twice per shift inspections that mimic
consumer inspections (e.g., bag appearance, snack taste, etc.). Continuous improvement is the heart of
the firm’s quality assurance program. Frito-Lay focuses on two key metrics: (1) customer complaints
per million bags, and (2) hitting the center line on SPC charts for various attributes such as oil
content, moisture, seasoning, salt, thickness, and weight. The plant has nine critical checkpoints in the
production process, which are all shown in the video. A significant portion of the video includes a
thorough explanation of SPC charts, including an example of how to produce an x-bar chart, with a
known population standard deviation, for the percent salt content in potato chips. The video then
shows us how the operator at Frito-Lay produces an SPC chart observation, from scooping the
samples to measuring the salt content to updating the chart. At Frito-Lay, action is taken whenever
the SPC chart displays the following: (1) an observation outside the 3σ control limits, (2) two
consecutive observations very near either control limit, (3) five consecutive observations that trend in
the same direction, or (4) five consecutive observations that fall on the same side of the mean. “Star
Fleet” teams are available from other plants to help individual plants solve some of their more
difficult production problems. All of these practices contribute to Frito-Lay’s whopping 60% market
share.
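The four action rules listed above can also be turned into a short coding or spreadsheet exercise. The sketch below is plain Python with hypothetical data and limits; in particular, "very near either control limit" is interpreted here as beyond the 2σ line, which is an assumption rather than anything stated in the video.

    # Sketch of Frito-Lay-style out-of-control checks; data, limits, and the
    # interpretation of "very near a control limit" (beyond 2 sigma) are assumptions.
    center, sigma = 2.0, 0.1                      # hypothetical center line and sigma of x-bar
    ucl, lcl = center + 3 * sigma, center - 3 * sigma
    xbars = [2.02, 1.95, 2.27, 2.26, 2.05, 2.08, 2.11, 2.14, 2.16, 1.60]  # hypothetical

    def out_of_control_signals(data):
        signals = []
        for i, x in enumerate(data):
            # Rule 1: a point outside the 3-sigma limits
            if x > ucl or x < lcl:
                signals.append((i, "outside 3-sigma limits"))
            # Rule 2: two consecutive points "very near" a limit (here: beyond 2 sigma, same side)
            if i >= 1:
                prev = data[i - 1]
                if (x > center + 2 * sigma and prev > center + 2 * sigma) or \
                   (x < center - 2 * sigma and prev < center - 2 * sigma):
                    signals.append((i, "two consecutive points near a control limit"))
            if i >= 4:
                window = data[i - 4:i + 1]
                # Rule 3: five consecutive points trending in the same direction
                diffs = [b - a for a, b in zip(window, window[1:])]
                if all(d > 0 for d in diffs) or all(d < 0 for d in diffs):
                    signals.append((i, "five points trending in one direction"))
                # Rule 4: five consecutive points on the same side of the center line
                if all(w > center for w in window) or all(w < center for w in window):
                    signals.append((i, "five points on one side of the mean"))
        return signals

    for index, reason in out_of_control_signals(xbars):
        print(f"sample {index + 1}: {reason}")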
Prior to showing the video, the instructor might ask the students to guess how many and what types of
inspections occur at a plant that makes potato chips. Afterwards, the nine types of inspection shown
in the video, in addition to the “twice per shift consumer-like inspection” and the in-store inspections,
could be compared to the student guesses. Further discussion could attempt to see if students
understand the difference between inspection and statistical process control. In particular, “control
limits” are not the same as “specification limits.” A sample that fails an SPC test would not
necessarily be rejected as being an unsuitable product. SPC is an ongoing exercise that is looking for
unusual circumstances or some sort of shift/wear in the production process so that the process can be
corrected quickly and before serious output problems emerge. In other words, failure of an SPC test
would not necessarily cause a full shutdown of the line or the rejection of an entire production lot.
2. Quality at Darden Restaurants
Darden inspects its ingredients at the source, before products ever leave their countries of origin, and continues checking quality throughout the supply chain, right up until the food is served to the customer. All of these approaches to quality assurance helped Darden win the prestigious Black Pearl Award for quality in the food industry.
Prior to showing the video, instructors might ask the students to consider what type of inspections
they expect for the food served in their favorite restaurant, and at what point in the process they
would expect those inspections to take place. Discussion following the video could cover some of
these initial impressions. Students might be surprised to know that Darden conducts so many
inspections before the products leave the respective countries of origin. Such a program clearly
involves a great deal of resources. Students could be asked to identify the pros and cons of such an
approach of attempting to ensure quality at the source. Instructors could then ask if this approach
makes sense in every industry, or is there something special about food that makes quality at the
source particularly important? Clearly not all manufacturing firms today are conducting such
extensive testing at their overseas suppliers. Can students identify examples from the news that have
described recent quality problems from overseas suppliers?
Cinematic Ticklers
1. The Simpsons, Season 4: “Duffless,” 20th Century Fox Video, 2004 (1992-1993)
Homer visits a Duff beer plant where the inspector pulls out beer bottles containing syringes and rats,
but fails to catch Hitler’s head rushing by on the conveyor belt because he was talking to a customer.
Teaching Tips
1. Teaching Tip: Using an SPC Chart to Examine American Airlines’ Pilots “Sick Out”
The WSJ (Sep. 24, 2012) reports on the high number of flight delays and cancellations at American
Airlines. The company argues that pilots intentionally pretended to be sick to disrupt operations.
Using the same data, the union argues that sick rates have not deviated from historical norms.
Students can use the data provided to plot a p-chart to see for themselves.
http://heizerrenderom.wordpress.com/2012/09/26/teaching-tip-using-an-spc-chart-to-examine-american-
airlines-pilots-sick-out/
2. Teaching Tip: Building a P-Chart Using Airline Frequent Flier Award Data
This WSJ (May 26, 2011) article provides airline frequent flier award data that can be turned into a
teaching exercise with a p-chart.
http://heizerrenderom.wordpress.com/2011/05/27/teaching-tip-building-a-p-chart-using-airline-frequent-
flier-award-data/
Presentation Slides
INTRODUCTION (S6-1 through S6-5)
(Slide thumbnails S6-4, S6-14, S6-15, S6-16, S6-17, S6-21, S6-28, S6-29, S6-33, S6-37, and S6-38.)
Slides 45-46: This example shows the formula and an application (Example S5) for c-charts. The c-chart
limits are the easiest to calculate and don’t require any sample size information. Here again,
Slide 46 shows that a negative LCL should be rounded up to 0. For this example, the
process appears to be in control.
(Slide thumbnails S6-45, S6-46, and S6-56.)
Slides 62-66: These slides provide the formula and an example (Example S7) for the process capability
index Cpk. This measure is generally preferable to Cp because it does a better job of
signaling “capability” on both sides of the mean, and thus picks up the effect of a process
mean that is not centered between the specification limits.
Slide 67: Figure S6.8 helps to visualize what different values of the process capability index imply.
In this slide, the first three examples are not capable; whereas, the last two are.
(Slide thumbnails S6-66 and S6-67.)
Slides 68-69: Capability studies focus on whether or not a process is capable of producing within specs
under normal circumstances; control charts monitor output over time to make sure that
something unusual hasn’t occurred; and inspection determines whether completed items are
acceptable or not. Firms practice acceptance sampling to avoid having to inspect every single
incoming or finished-goods item. It is assumed that the defect rate of the sample reflects the
defect rate of the entire lot.
Slides 70-77: The text cannot devote enough space to show how to compute everything needed to draw
an operating characteristic curve, so these slides just describe the concepts. The average
outgoing quality can be computed (Slide 76) if (1) the sampling plan replaces all defective
items encountered, and (2) the true incoming percent defective for the lot is known.
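As a numerical aside, the average outgoing quality calculation from Slide 76 is just a few lines; the lot size, sample size, incoming fraction defective, and probability of acceptance below are all hypothetical, and the formula used is AOQ = (Pd)(Pa)(N − n)/N.

    # Sketch of the average outgoing quality (AOQ) calculation; all inputs are hypothetical.
    N   = 1000    # lot size
    n   = 50      # sample size
    p_d = 0.03    # true incoming fraction defective for the lot
    p_a = 0.95    # probability the sampling plan accepts a lot this bad (from the OC curve)

    aoq = (p_d * p_a * (N - n)) / N
    print(f"AOQ = {aoq:.4f}")   # about 0.0271, i.e., roughly 2.7% defective leaving the process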
Slide 78: Certain automated inspection plans have eliminated the need for acceptance sampling.
Slide 79: The final slide (Figure S6.10) can be used to summarize the three major methods
introduced in this supplement.
Internet Resources
American Society for Quality www.asq.org
American Statistical Association www.amstat.org
BPI Consulting: SPC for Excel www.spcforexcel.com
Statistical Engineering Division of NIST http://www.nist.gov/itl/sed/index.cfm
Total Quality Engineering www.tqe.com
Learning Games
o Price, B. and Zhang, X. (2007). The Power of Doing: A Learning Exercise that Brings the Central
Limit Theorem to Life. Decision Sciences Journal of Innovative Education, 5(2), 405-411.
Teaching brief that demonstrates an active learning technique for teaching the Central Limit
Theorem. Groups of students conduct experiments tossing a die in sets of 5 rolls (or 10 rolls)
and are asked to calculate the sample average.
o Fish, L. (2007). Statistical Quality Control: Developing Students' Understanding of Variable
Control Charts using String. Decision Sciences Journal of Innovative Education, 5(1), 191-196.
Teaching brief offers a mini-demonstration for variable control charting.
o Reyes, P.M. (2006). Using a Rubber Band to Teach the Management of Quality. Decision
Sciences Journal of Innovative Education, 4(1), 123-128. Teaching brief that helps students
understand measurement system analysis and its effects on process improvement using a simple
two-phase, hands-on gage of repeatability and reproducibility (GR&R) study.
o Wright, C. and Smith, M. (2003). Serving Up the Red Beads Experience. Decision Sciences
Journal of Innovative Education, 1(1), 127-131. Teaching brief useful when teaching quality
management, statistical process control, and the management of people within a manufacturing
setting. Uses an airplane example and incorporates many of the principles from Deming’s 14
Points.
o LaPoint, Gary. Dice Game for Statistical Process Control. (See below.)
Teaching Note
This exercise works best after a normal lecture on statistical process control. It does an excellent job of
getting the students to understand the mechanics of the process. The difference in student understanding
of SPC with this exercise and without the exercise is significant.
Material needed: (1) A pair of dice for each team. (Usually available at the local Dollar Store or hobby
shop.); (2) Work template (see following pages.)
Exercise: Organize the class into teams of two. Give each team a pair of dice and a work template. One
student rolls the pair of dice while the other records the number on the template. Take 10 samples with a
sample size of 4. (This requires rolling the dice 40 times.)
Then have students calculate the X-bar, the Range, X-double bar, and the R-bar.
When students have completed the above, have them then calculate the UCL and LCL for X-bar and R
charts. (I have them calculate the Upper and Lower control limits using Table S6.1 in the text.) Students
cannot manually calculate the upper and lower control limits using the σ-based formulas because they do not
have the standard deviation of the dice-rolling process. (If you do work it out, the theoretical standard
deviation of the sum of two fair dice is √(35/6) ≈ 2.415.)
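For instructors who want to check the numbers, here is a small sketch (plain Python, not part of the original exercise) that computes the theoretical mean and standard deviation of the sum of two dice and then builds the x-bar and R chart limits from simulated rolls using the Table S6.1 factors for a sample size of 4 (A2 = 0.729, D3 = 0, D4 = 2.282).

    # Sketch: theoretical standard deviation of two dice plus simulated chart limits.
    import math
    import random

    # Theoretical mean and standard deviation of the sum of two fair dice.
    faces = range(1, 7)
    sums = [a + b for a in faces for b in faces]
    mu = sum(sums) / len(sums)                                    # 7.0
    sd = math.sqrt(sum((s - mu) ** 2 for s in sums) / len(sums))  # about 2.415

    # Simulate the classroom exercise: 10 samples, each of 4 rolls of the pair.
    samples = [[random.randint(1, 6) + random.randint(1, 6) for _ in range(4)] for _ in range(10)]
    xbars   = [sum(s) / 4 for s in samples]
    ranges  = [max(s) - min(s) for s in samples]
    xbarbar = sum(xbars) / len(xbars)
    rbar    = sum(ranges) / len(ranges)

    A2, D3, D4 = 0.729, 0.0, 2.282          # control chart factors for sample size 4
    print(f"theoretical sd of one roll of the pair: {sd:.3f}")
    print(f"x-bar chart: UCL {xbarbar + A2 * rbar:.2f}, LCL {xbarbar - A2 * rbar:.2f}")
    print(f"R chart:     UCL {D4 * rbar:.2f},  LCL {D3 * rbar:.2f}")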
Have the students develop the X-bar and R charts (you can provide the templates that follow or have the
students prepare them themselves). Once the students have created their graphs, ask them if their process
is in control.
Then give them upper and lower tolerance limits as defined by the customer that are tighter than the
control limits they calculated, say 7.5 and 4. Following that discussion, ask them to determine the
Cpk Index, and decide if the process is capable of operating within those tolerances. Several teams will not
be capable. Then collect the students’ charts and go over them on an overhead so the entire class can see
different types of processes.
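A hedged sketch of the Cpk step follows, using the 7.5 and 4 tolerance limits from above, the long-run mean of 7, and a process standard deviation estimated from a hypothetical class R-bar with the usual R-bar/d2 estimate (d2 = 2.059 for samples of 4).

    # Sketch of the Cpk check for the dice exercise; R-bar here is a hypothetical class result.
    usl, lsl = 7.5, 4.0          # customer tolerance limits from the exercise
    mu = 7.0                     # long-run mean of the sum of two dice
    rbar = 5.0                   # hypothetical average range from a team's 10 samples
    d2 = 2.059                   # factor for estimating sigma from R-bar when n = 4
    sigma = rbar / d2            # about 2.43

    cpk = min((usl - mu) / (3 * sigma), (mu - lsl) / (3 * sigma))
    print(f"Cpk = {cpk:.2f}")    # far below 1, so the process is not capable of meeting 4 to 7.5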
Finally, ask what is inherently wrong with the exercise. The sharp students will point out that each roll of
the pair can only produce a whole number between 2 and 12, which is correct. The exercise therefore only
approximates the kind of random variation found in a real process.
On the following pages are instructions to the students and the required tables and graphs for creating the
data set and control charts.
Student Instructions
Record of Observations
Sample Number |  1  |  2  |  3  |  4  | X-bar |  R
            1 |     |     |     |     |       |
            2 |     |     |     |     |       |
          ... |     |     |     |     |       |
           10 |     |     |     |     |       |
      Average |     |     |     |     |       |
UCLR = UCLX-bar =
LCLR = LCLX-bar =
Cpk =
R-Chart (blank chart: plot R for Samples 1 through 10)

X-bar Chart (blank chart: plot X-bar for Samples 1 through 10)