[Figure: The transformation process. Inputs (material, machines, labor, management, capital) pass through a transformation process to produce outputs (goods and services), with feedback of requirements from output back to input.]
In the late 1950s and early 1960s, scholars began to deal specifically with operations
management as opposed to industrial engineering or operations research. Writers such as Edward
Bowman and Robert Fetter (Analysis for Production and Operations Management, 1957) and
Elwood S. Buffa (Modern Production Management, 1961) noted the commonality of problems
faced by all production systems and emphasized the importance of viewing production
operations as a system. They also stressed the useful application of waiting-line theory,
simulation, and linear programming, which are now standard topics in the field. In 1973, Chase
and Aquilano stressed the need “to put the management back in operations management.” Bob
Jacobs joined the team in 1997. The following are some of the theories and developments that
shaped operations management.
The 1980s saw a revolution in the management philosophies and technologies by which
production is carried out. Just-in-time (JIT) production was the major breakthrough in
manufacturing philosophy. Pioneered by the Japanese, JIT is an integrated set of activities
designed to achieve high-volume production using minimal inventories of parts that arrive at the
workstation exactly when they are needed. The philosophy—coupled with total quality control
(TQC), which aggressively seeks to eliminate causes of production defects—is now a
cornerstone in many manufacturers' production practices, and the term “lean manufacturing” is
used to refer to this set of concepts. Of course, the Japanese were not the first to develop a highly
integrated, efficient production system. In 1913, Henry Ford developed an assembly line to make
the Model-T automobile. Ford developed a system for making the Model-T that was constrained
only by the capabilities of the workforce and existing technology. Quality was a critical
prerequisite for Ford: The line could not run steadily at speed without consistently good
components. On-time delivery was also critical for Ford; the desire to keep workers and
machines busy with materials flowing constantly made scheduling critical. Product, processes,
material, logistics, and people were well integrated and balanced in the design and operation of
the plant.
The late 1970s and early 1980s saw the development of the manufacturing strategy paradigm by
researchers at the Harvard Business School. This work by professors William Abernathy, Kim
Clark, Robert Hayes, and Steven Wheelwright (built on earlier efforts by Wickham Skinner)
emphasized how manufacturing executives could use their factories' capabilities as strategic
competitive weapons. Central to their thinking was the notion of factory focus and
manufacturing trade-offs. They argued that because a factory cannot do extremely well on all
performance measures, its management must devise a focused strategy, creating a focused
factory that performs a limited set of tasks extremely well. This required trade-offs among such
performance measures as low cost, high quality, and high flexibility in designing and managing
factories. Ford seems to have realized this about 60 years before the Harvard professors.
The ISO 9000 certification standards, created by the International Organization for
Standardization, now play a major role in setting quality standards for global manufacturers.
Many European companies require that their vendors meet these standards as a condition for
obtaining contracts.
The need to become lean to remain competitive during the global economic recession of the 1990s
pushed companies to seek innovations in the processes by which they run their operations. The
flavor of business process reengineering (BPR) is conveyed in the title of Michael Hammer's
influential article in Harvard Business Review: “Reengineering Work: Don't Automate,
Obliterate.” The approach seeks to make revolutionary changes as opposed to evolutionary
changes (which are commonly advocated in TQM). It does this by taking a fresh look at what the
organization is trying to do in all its business processes, and then eliminating non–value-added
steps and computerizing the remaining ones to achieve the desired outcome.
Hammer actually was not the first consultant to advocate eliminating non–value-added steps and
reengineering processes. In the early 1900s, Frederick W. Taylor developed principles of
scientific management that applied scientific analysis to eliminating wasted effort from manual
labor. Around the same time, Frank and Lillian Gilbreth used the new technology of the time,
motion pictures, to analyze such diverse operations as bricklaying and medical surgery
procedures. Many of the innovations this husband-and-wife team developed, such as time and
motion study, are widely used today.
Six-sigma quality
Six Sigma is a process for developing and delivering virtually perfect products and services. The
word “sigma” is a familiar statistical term for the standard deviation, a measure of variability
around the mean of a normal distribution. In Six Sigma it is a measure of how much a given
product or process deviates from perfection, or zero defects. The main idea behind Six Sigma is
that if the number of “defects” in a process can be measured, then it can be systematically
determined how to eliminate them and get as close to zero defects as possible.
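This measurement idea is often made concrete with the defects-per-million-opportunities (DPMO) metric used in Six Sigma programs. The sketch below is a minimal illustration; the function name and the sample figures are invented for demonstration, not taken from the text:

```python
def dpmo(defects, units, opportunities_per_unit):
    """Defects per million opportunities: a standard Six Sigma defect metric."""
    return defects / (units * opportunities_per_unit) * 1_000_000

# Hypothetical figures: 25 defects found in 10,000 units,
# each unit offering 4 opportunities for a defect.
print(dpmo(25, 10_000, 4))  # about 625 defects per million opportunities
```

Driving DPMO toward zero is the operational meaning of "getting as close to zero defects as possible."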
Supply chain management
The central idea of supply chain management is to apply a total system approach to managing the
flow of information, materials, and services from raw material suppliers through factories and
warehouses to the end customer. Recent trends such as outsourcing and mass customization are
forcing companies to find flexible ways to meet customer demand. The focus is on optimizing
core activities to maximize the speed of response to changes in customer expectations.
Electronic commerce
The quick adoption of the Internet and the World Wide Web during the late 1990s was
remarkable. The term electronic commerce refers to the use of the Internet as an essential
element of business activity. The Internet is an outgrowth of a government network called
ARPANET, which was created in 1969 by the Defense Department of the U.S. government. The
use of Web pages, forms, and interactive search engines has changed the way people collect
information, shop, and communicate. It has changed the way operations managers coordinate
and execute production and distribution functions.
Service science
A direct response to the growth of services is the development of a major industry and university
program called Service Science, Management, and Engineering (SSME). SSME aims to apply the
latest concepts in information technology to continue to improve service productivity of
technology-based organizations. An interesting question raised by Jim Spohrer, leader of the
IBM team that started the effort, is where the labor will go once productivity improves in the
service sector. “The short answer is new service sector industries and businesses—recall the
service sector is very diverse and becoming more so every day. Consider the growth of retail
(franchises, ecommerce, Amazon, eBay), communication (telephones, T-Mobile, Skype),
transportation (airlines, FedEx), financial (discount ebrokers, Schwab), as well as information
(television, CNN, Google) services. Not to mention all the new services in developing nations of
the world. The creative capacity of the service sector for new industries and business has
scarcely been tapped.”
1.3.1 Manufacturing operations: convert inputs such as materials, labor, and capital into
tangible outputs. The objective of each process is to change the shape or physical
characteristics of the raw materials or inputs.
Service operations, by contrast, have several distinguishing characteristics.
Intangibility: a service is not tangible like a physical good. It cannot be seen physically, but it
can be felt. Non-inventoriability: as opposed to physical goods, services are not inventoriable,
because a service is produced and consumed simultaneously. In this sense a service does not
exist as a stored item, although the result of a service lasts for some time.
Customer involvement: besides the quality aspects, the non-inventoriability of services also
means that the customer may be directly involved in operations, where production and
consumption take place simultaneously. So the service, the service provider, and the customer
are present together. Controlled flexibility: flexibility is a characteristic of a firm’s operations
that enables it to react to customer needs quickly and efficiently. But where there is flexibility,
the possibility of a chaotic production and delivery system arises; because service operations
deal in intangibles, this flexibility must be carefully controlled.
Differences: manufacturing and service operations differ in the physical nature of their
products, in productivity measurement, in customer contact, in location and size, and in
maintenance and repair. Generally, the distinction between manufacturing and service
operations is based on the following features:
TABLE 1
Characteristic                   Manufacturing        Service
Output                           Tangible             Intangible
Consumer contact                 Low                  High
Nature of work                   Capital intensive    Labor intensive
Uniformity of output             High                 Low
Difficulty of quality assurance  Low                  High
Measurement of performance       Easy                 Difficult
Similarities: despite many differences, there are a lot of similarities between manufacturing and
service operations. There is an interdependency of products and services; for example, customers
want both good food and good service at a restaurant. Again, though service providers cannot
inventory their outputs, they must inventory their inputs; for example, hospitals must maintain an
adequate supply of medications, nurses, and doctors. When buying a car we buy not only a
product but also a guarantee. Hospital care involves medication, bandages, X-ray film, and so
on. So, despite many differences, products and services are part of each other.
At the operational level hundreds of decisions are made in order to achieve local outcomes
that contribute to the achievement of a company’s overall strategic goal. These local outcomes
are usually not measured directly in terms of profit, but instead are measured in terms of
quality, cost-effectiveness, efficiency, productivity, and so forth. Achieving good results for
local outcomes is an important objective for individual operational units and individual
operations managers. However, all these decisions are interrelated and must be coordinated for
the purpose of attaining the overall company goals. Decision making is analogous to a great
stage play, in which all the actors, the costumes, the music, the orchestra, and the script must
be choreographed and staged by the director, the stage managers, the author, and the
conductor so that everything comes together for the performance.
For many topics in operations management, there are quantitative models and techniques
available that help managers make decisions. Some techniques simply provide information
that the operations manager might use to help make a decision; other techniques recommend a
decision to the manager. Some techniques are specific to a particular aspect of operations
management; others are more generic and can be applied to a variety of decision-making
categories.
These different models and techniques are the “tools” of the operations manager. Simply
having these tools does not make someone an effective operations manager, just as owning a
saw and a hammer does not make someone a carpenter. An operations manager must know
how to use decision-making tools. How these tools are used in the decision-making process is
an important and necessary part of the study of operations management.
Many decision-making situations occur under conditions of uncertainty. For example, the
demand for a product may not be 100 units next week but may vary between 0 and 200 units,
depending on the state of the market, which is uncertain. Decision analysis is a set of
quantitative decision-making techniques to aid the decision maker in dealing with a decision
situation in which there is uncertainty. Decision analysis is also a beneficial topic to study
because it reflects a structured, systematic approach to decision making that many decision
makers follow intuitively without ever consciously thinking about it.
Decision analysis represents not only a collection of decision-making techniques but also an
analysis of the logic underlying decision making.
When probabilities can be assigned to the occurrence of states of nature in the future, the
situation is referred to as decision making under risk. When probabilities cannot be assigned to
the occurrence of future events, the situation is called decision making under uncertainty.
To facilitate the analysis of decision situations, they are organized into payoff tables. A
payoff table is a means of organizing and illustrating the payoffs from the different
decisions, given the various states of nature, and has the general form shown in Table.
                        State of nature
Decision       a               b
1              Payoff 1a       Payoff 1b
2              Payoff 2a       Payoff 2b
Each decision, 1 or 2, in the table will result in an outcome, or payoff, for each state of nature
that will occur in the future. Payoffs are typically expressed in terms of profit, revenues, or
cost (although they may be expressed in terms of a variety of quantities). For example, if
decision 1 is to expand a production facility and state of nature a is good economic conditions,
payoff 1a could be $100,000 in profit.
Once the decision situation has been organized into a payoff table, several criteria are
available to reflect how the decision maker arrives at a decision, including maximax,
maximin, minimax regret, Hurwicz, and equal likelihood. These criteria reflect different
degrees of decision-maker conservatism or liberalism. On occasion they result in the same
decision; however, they often yield different results. These decision-making criteria are
demonstrated by the following example.
Example:
The Southern Textile Company is contemplating the future of one of its plants located in
South Carolina. Three alternative decisions are being considered:
(1) Expand the plant and produce lightweight, durable materials for possible sale to the
military, a market with little foreign competition;
(2) Maintain the status quo at the plant, continuing production of textile goods that are
subject to heavy foreign competition; or
(3) Sell the plant now. If one of the first two alternatives is chosen, the plant will still be
sold at the end of the year. The amount of profit that could be earned by selling the
plant in a year depends on foreign market conditions, including the status of a trade
embargo bill in Congress.
                          State of nature
Decision              Good Competitive Conditions    Poor Competitive Conditions
Expand                $800,000                       $500,000
Maintain status quo   $1,300,000                     -$150,000
Sell now              $320,000                       $320,000
Solution:
1. Maximax
The decision is selected that will result in the maximum of the maximum payoffs. This is how
this criterion derives its name—the maximum of the maxima. The maximax criterion is very
optimistic. The decision maker assumes that the most favorable state of nature for each
decision alternative will occur. Thus, for this example, the company would optimistically
assume that good competitive conditions will prevail in the future, resulting in the following
maximum payoffs and decisions:
Expand: $800,000
Status quo: $1,300,000 ← Maximum
Sell: $320,000
Decision: Maintain status quo
2. Maximin
The maximin criterion is pessimistic. With the maximin criterion, the decision maker selects
the decision that will reflect the maximum of the minimum payoffs. For each decision
alternative, the decision maker assumes that the minimum payoff will occur; of these, the
maximum is selected as follows:
Expand: $500,000 ← Maximum
Status quo: -$150,000
Sell: $320,000
Decision: Expand
3. Minimax Regret
With the minimax regret criterion, a regret (opportunity loss) is first computed for each decision
by subtracting each payoff from the best payoff attainable under the same state of nature:
Expand: $1,300,000 - 800,000 = $500,000; $500,000 - 500,000 = $0
Status quo: $1,300,000 - 1,300,000 = $0; $500,000 - (-150,000) = $650,000
Sell: $1,300,000 - 320,000 = $980,000; $500,000 - 320,000 = $180,000
These values represent the regret for each decision that would be experienced by the decision
maker if a decision were made that resulted in less than the maximum payoff. The maximum
regret for each decision is then determined, and the decision corresponding to the minimum
of these regret values is selected as follows:
Expand: $500,000 ← Minimum
Status quo: $650,000
Sell: $980,000
Decision: Expand
4. Hurwicz (named for economist Leonid Hurwicz)
A compromise is made between the maximax and maximin criteria. The decision maker is
neither totally optimistic (as the maximax criterion assumes) nor totally pessimistic (as the
maximin criterion assumes). With the Hurwicz criterion, the decision payoffs are weighted
by a coefficient of optimism, a measure of the decision maker’s optimism. The coefficient of
optimism, defined as α, is between 0 and 1 (i.e., 0 ≤ α ≤ 1). If α = 1, the decision maker is
completely optimistic; if α = 0, the decision maker is completely pessimistic. (Given this
definition, 1 - α is the coefficient of pessimism.) For each decision alternative, the maximum
payoff is multiplied by α and the minimum payoff is multiplied by 1 - α. For our Southern
Textile example, if α equals 0.3 (i.e., the company is slightly optimistic) and 1 - α = 0.7, the
following decision will result:
Expand: $800,000(0.3) + 500,000(0.7) = $590,000 ← Maximum
Status quo: $1,300,000(0.3) - 150,000(0.7) = $285,000
Sell: $320,000(0.3) + 320,000(0.7) = $320,000
Decision: Expand
5. Equal Likelihood
The equal likelihood (or Laplace) criterion weights each state of nature equally, thus
assuming that the states of nature are equally likely to occur. Since there are two states of
nature in our example, we assign a weight of 0.50 to each one. Next, we multiply these
weights by each payoff for each decision and select the alternative with the maximum of these
weighted values:
Expand: $800,000(0.50) + 500,000(0.50) = $650,000 ← Maximum
Status quo: $1,300,000(0.50) - 150,000(0.50) = $575,000
Sell: $320,000(0.50) + 320,000(0.50) = $320,000
Decision: Expand
The decision to expand the plant was designated by four of the five decision criteria.
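These criteria are straightforward to automate. The sketch below applies all five to the Southern Textile payoff table; the function names and data layout are my own, and the coefficient α = 0.3 matches the Hurwicz example above:

```python
payoffs = {  # Southern Textile payoff table, in dollars
    "Expand":              {"good": 800_000,   "poor": 500_000},
    "Maintain status quo": {"good": 1_300_000, "poor": -150_000},
    "Sell now":            {"good": 320_000,   "poor": 320_000},
}

def maximax(p):            # optimistic: best of the best payoffs
    return max(p, key=lambda d: max(p[d].values()))

def maximin(p):            # pessimistic: best of the worst payoffs
    return max(p, key=lambda d: min(p[d].values()))

def minimax_regret(p):     # minimize the maximum opportunity loss
    best = {s: max(p[d][s] for d in p) for s in next(iter(p.values()))}
    return min(p, key=lambda d: max(best[s] - p[d][s] for s in best))

def hurwicz(p, alpha):     # weighted compromise between the two extremes
    return max(p, key=lambda d: alpha * max(p[d].values())
               + (1 - alpha) * min(p[d].values()))

def equal_likelihood(p):   # weight every state of nature equally
    return max(p, key=lambda d: sum(p[d].values()) / len(p[d]))

print(maximax(payoffs))           # Maintain status quo
print(maximin(payoffs))           # Expand
print(minimax_regret(payoffs))    # Expand
print(hurwicz(payoffs, 0.3))      # Expand
print(equal_likelihood(payoffs))  # Expand
```

Four of the five calls return "Expand", which is why the example concludes that expansion dominates.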
The decision to sell was never indicated by any criterion. This is because the payoffs for
expansion, under either set of future economic conditions, are always better than the payoffs
for selling. Given any situation with these two alternatives, the decision to expand will always
be made over the decision to sell. The sell decision alternative could have been eliminated
from consideration under each of our criteria. The alternative of selling is said to be
dominated by the alternative of expanding. In general, dominated decision alternatives can be
removed from the payoff table and not considered when the various decision-making criteria
are applied, which reduces the complexity of the decision analysis.
Different decision criteria often result in a mix of decisions. The criteria used and the resulting
decisions depend on the decision maker. For example, the extremely optimistic decision
maker might disregard the preceding results and make the decision to maintain the status quo,
because the maximax criterion reflects his or her personal decision-making philosophy.
When probabilities can be assigned to the states of nature (decision making under risk), the best
decision can be selected by computing the expected value of each alternative:

EV(x) = Σ_{i=1}^{n} p(x_i) x_i

where
p(x_i) = probability of outcome i
x_i = outcome i
Assume that it is now possible for the Southern Textile Company to estimate a probability of
0.70 that good foreign competitive conditions will exist and a probability of 0.30 that poor
conditions will exist in the future. Determine the best decision using expected value.
Solution
The expected values for each decision alternative are computed as follows.
EV(expand) = $800,000(0.70) + $500,000(0.30) = $710,000
EV(status quo) = $1,300,000(0.70) - $150,000(0.30) = $865,000 ← Maximum
EV(sell) = $320,000(0.70) + $320,000(0.30) = $320,000
The decision according to this criterion is to maintain the status quo, since it has the highest
expected value.
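The expected-value computation translates directly into code. This is a sketch using the example's figures; the names are illustrative:

```python
probabilities = {"good": 0.70, "poor": 0.30}  # P(state of nature)

payoffs = {  # Southern Textile payoff table, in dollars
    "Expand":              {"good": 800_000,   "poor": 500_000},
    "Maintain status quo": {"good": 1_300_000, "poor": -150_000},
    "Sell now":            {"good": 320_000,   "poor": 320_000},
}

def expected_value(decision_payoffs, probs):
    """EV(x) = sum over states of p(x_i) * x_i."""
    return sum(probs[s] * v for s, v in decision_payoffs.items())

evs = {d: expected_value(payoffs[d], probabilities) for d in payoffs}
best = max(evs, key=evs.get)
print(evs)   # expand ~710,000; status quo ~865,000; sell ~320,000
print(best)  # status quo has the highest expected value
```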
Multifactor productivity:

Multifactor productivity = Output / (Labor + Materials + Overhead)
          or             = Output / (Labor + Energy + Capital)

Total productivity = Goods and services produced / All inputs used to produce them
Example:
Osborne Industries is compiling the monthly productivity report for its Board of Directors.
From the following data, calculate (a) labor productivity, (b) machine productivity, and (c) the
multifactor productivity of dollars spent on labor, machines, materials, and energy. The
average labor rate is $15 an hour, and the average machine usage rate is $10 an hour.
Units produced 100,000
Labor hours 10,000
Machine hours 5,000
Cost of materials $35,000
Cost of energy $15,000
Solution
(a) Labor productivity = 100,000 units / 10,000 labor hours = 10 units per labor hour
(b) Machine productivity = 100,000 units / 5,000 machine hours = 20 units per machine hour
(c) Labor cost = 10,000 hours × $15 = $150,000; machine cost = 5,000 hours × $10 = $50,000
    Total input cost = $150,000 + $50,000 + $35,000 + $15,000 = $250,000
    Multifactor productivity = 100,000 units / $250,000 = 0.4 units per dollar of input
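A short script computes the three measures from the Osborne Industries data; the variable names are my own, and part (c) converts labor and machine hours into dollars before dividing:

```python
units = 100_000                            # units produced
labor_hours, labor_rate = 10_000, 15       # hours, $/hour
machine_hours, machine_rate = 5_000, 10    # hours, $/hour
materials_cost, energy_cost = 35_000, 15_000  # dollars

labor_productivity = units / labor_hours        # (a) units per labor hour
machine_productivity = units / machine_hours    # (b) units per machine hour

total_cost = (labor_hours * labor_rate + machine_hours * machine_rate
              + materials_cost + energy_cost)   # total dollar input
multifactor_productivity = units / total_cost   # (c) units per dollar

print(labor_productivity)        # 10.0
print(machine_productivity)      # 20.0
print(multifactor_productivity)  # 0.4
```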