Genichi Taguchi
Introduction
Taguchi is an engineer and statistician.
Some of his statistical methods and ideas have been controversial among Western statisticians, but others have accepted many of the concepts introduced by him as valid extensions to the body of knowledge.
Background
Taguchi was born in 1924 in Tokamachi, Japan, a city famous for its kimono industry, so it was only natural for him to study textile engineering, as he was expected to assume responsibility for the family kimono business.
But in 1942 he became interested in statistics, and his statistical skills were nurtured in the years that followed.
Professional Career
He graduated from Kiryu Technical College in textile engineering. During World War II, he was drafted into the Astronomical Department of the Navigational Institute of the Imperial Japanese Navy. He joined the Ministry of Public Health and Welfare in 1948. During 1948-1950 he also worked at the Institute of Statistical Mathematics, where he was involved in experimental work on the production of penicillin at Morinaga Pharmaceuticals.
Professional Career
In 1950 he joined the Nippon Telegraph and Telephone Corporation (NTT) and worked there until 1962.
Professional Career
In 1964 he became professor of engineering at Aoyama Gakuin University in Tokyo; he also collaborated with a professor from Cheng Kung University in Taiwan who later emigrated to the U.S.
Since 1982, Genichi Taguchi has been an advisor to the Japanese Standards Institute and an executive director of the American Supplier Institute, an international consulting organisation.
Taguchi's quality philosophy can be summarised in the following points:
1. An important dimension of the quality of a manufactured product is the total loss generated by that product to society.
2. In a competitive economy, continuous quality improvement and cost reduction are necessary for staying in business.
3. A continuous quality improvement program includes continuous reduction in the variation of product performance characteristics about their target values.
4. The customer's loss due to a product's performance variation is often approximately proportional to the square of the deviation of the performance characteristic from its target value.
5. The final quality and cost of a manufactured product are determined to a large extent by the engineering designs of the product and its manufacturing process.
6. A product's (or process's) performance variation can be reduced by exploiting the nonlinear effects of the product (or process) parameters on the performance characteristics.
7. Statistically designed experiments can be used to identify the settings of product (and process) parameters that reduce performance variation.
The Taguchi approach achieves quality improvement through the identification of easily controllable factors and their settings. By setting those factors at their optimal levels, the product can be made robust to changes in operating and environmental conditions, so that more stable and higher-quality products are obtained. This is achieved during the Taguchi parameter-design stage by removing the bad effect of the cause rather than the cause of the bad effect.
The Taguchi methodology can be summarised in eight steps:
1. Identify the main function, side effects and failure mode.
2. Identify the noise factors, testing conditions and quality characteristics.
3. Identify the objective function to be optimized.
4. Identify the control factors and their levels.
5. Select the orthogonal array and design the matrix experiment.
6. Conduct the matrix experiment.
7. Analyze the data; predict the optimum levels and performance.
8. Perform the verification experiment and plan the future action.
Robust Design
Taguchi methods emphasised quality through robust design, not quality through inspection. Taguchi breaks the design process into three stages:
System design - involves creating a working prototype
Parameter design - involves experimenting to find which factors influence product performance most
Tolerance design - involves setting tight tolerance limits for the critical factors and looser tolerance limits for less important factors
The Quality Loss Function gives a financial value for customers' increasing dissatisfaction as the product performance goes below the desired target performance. Equally, it gives a financial value for increasing costs as product performance goes above the desired target performance. Determining the target performance is an educated guess, often based on customer surveys and feedback. The quality loss function allows financial decisions to be made at the design stage regarding the cost of achieving the target performance.
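The quadratic form of this loss can be sketched in a few lines of code. The following is a minimal illustration, not taken from Taguchi's own material: the target value, tolerance half-width and repair cost are invented numbers used only to show how the loss coefficient k and the loss L(y) = k(y - T)^2 are computed.

```python
# Minimal sketch of Taguchi's quadratic quality loss function, L(y) = k * (y - T)^2.
# The loss coefficient k is usually derived from the cost of exceeding a tolerance:
# k = A0 / d0^2, where A0 is the loss at the tolerance limit and d0 the half-width.
# All numbers below are illustrative, not from the original case material.

def loss_coefficient(cost_at_limit: float, tolerance_half_width: float) -> float:
    """k = A0 / d0^2: converts squared deviation from target into money."""
    return cost_at_limit / tolerance_half_width ** 2

def quality_loss(y: float, target: float, k: float) -> float:
    """Loss to society for a single unit whose measured value is y."""
    return k * (y - target) ** 2

if __name__ == "__main__":
    # Hypothetical part: target 10.0 mm, 50 currency units of loss at 0.5 mm deviation.
    k = loss_coefficient(cost_at_limit=50.0, tolerance_half_width=0.5)
    for y in (10.0, 10.2, 10.5):
        print(f"y = {y:.1f} mm  ->  loss = {quality_loss(y, 10.0, k):.1f}")
```

The key point the sketch illustrates is that loss grows continuously with deviation from the target, rather than jumping from zero to full cost at the specification limit.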
Taguchi defines quality as follows: quality is the loss a product causes to society after being shipped, other than any losses caused by its intrinsic functions. By loss, Taguchi refers to the following two categories:
1. Loss caused by variability of function.
2. Loss caused by harmful side effects.
[Figure: quadratic quality loss curve, showing cost rising as performance deviates from the Target toward the LSL and USL, where the scrap cost is incurred.]
Robust Engineering
Products and services should be designed so that they are inherently robust, i.e. so that their performance does not deviate from its target values when disturbances occur. Taguchi divides disturbances into three categories:
External disturbances - variations in the environment where the product is used
Internal disturbances - wear and tear inside a specific unit
Disturbances in the production process - deviation from target values
Robustness is built in through three stages: concept design, parameter design and tolerance design.
Robust Engineering
1. Concept Design
The process of examining competing technologies for producing a product
- Includes choices of technology and process design
- A prototype design that can be produced and meets customers' needs under ideal conditions without disturbances
Robust Engineering
2. Parameter Design
The selection of control factors (parameters) and their optimal levels
- The control factors are those process variables that management can influence.
  Ex. the procedures used and the type and amount of training
- Often a complex (non-linear) relationship between the control factors and product/design performance
- The optimal levels are determined through experimentation
Robust Engineering
3. Tolerance Design
Development of specification limits
- Necessary because there will always be some variation in the production process
- Taguchi fiercely advocates aiming for the target value, not just settling for anything inside the specification limits!
- Occurs after the parameter design
- Often results in increased production costs; more expensive input material might have to be used to meet specifications
Robustness Strategy
The Robustness Strategy uses five primary tools:
1. P-Diagram is used to classify the variables associated with the product into noise, control, signal (input), and response (output) factors.
2. Ideal Function is used to mathematically specify the ideal form of the signal-response relationship as embodied by the design concept for making the higher-level system work perfectly.
3. Quadratic Loss Function (also known as Quality Loss Function) is used to quantify the loss incurred by the user due to deviation from target performance.
4. Signal-to-Noise Ratio is used for predicting the field quality through laboratory experiments.
5. Orthogonal Arrays are used for gathering dependable information about control factors (design parameters) with a small number of experiments.
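To make the last point concrete, here is a small sketch (not from the original text) showing the standard L8 two-level orthogonal array and a check of the balance property that makes such arrays dependable: every pair of columns contains every combination of levels equally often, so seven factors can be studied in only eight runs.

```python
# The standard L8 (2^7) orthogonal array, written out by hand.
# Each row is one experimental run; each column is a two-level factor (1 or 2).
from itertools import combinations, product

L8 = [
    [1, 1, 1, 1, 1, 1, 1],
    [1, 1, 1, 2, 2, 2, 2],
    [1, 2, 2, 1, 1, 2, 2],
    [1, 2, 2, 2, 2, 1, 1],
    [2, 1, 2, 1, 2, 1, 2],
    [2, 1, 2, 2, 1, 2, 1],
    [2, 2, 1, 1, 2, 2, 1],
    [2, 2, 1, 2, 1, 1, 2],
]

def is_orthogonal(array):
    """True if every pair of columns contains each (level, level) pair equally often."""
    n_cols = len(array[0])
    for i, j in combinations(range(n_cols), 2):
        counts = {pair: 0 for pair in product((1, 2), repeat=2)}
        for row in array:
            counts[(row[i], row[j])] += 1
        if len(set(counts.values())) != 1:
            return False
    return True

print(is_orthogonal(L8))  # True: 7 two-level factors studied in only 8 runs
```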
Static Problems
Generally, a process to be optimized has several control factors which directly decide the target or desired value of the output. The optimization then involves determining the best control factor levels so that the output is at the target value.
If the product to be optimized has a signal input that directly decides the output, the optimization involves determining the best control factor levels so that the "input signal / output" ratio is closest to the desired relationship.
Optimization of the signal-to-noise (S/N) ratio is the primary aim of the Taguchi experiments. Three S/N ratios are of common interest for static problems.
(a) SMALLER-THE-BETTER:
n = -10 Log10 [mean of sum of squares of measured data]
This is usually the chosen S/N ratio for all undesirable characteristics like "defects" for which the ideal value is zero. It is also used when an ideal value is finite and its maximum or minimum value is defined (like maximum purity is 100%, maximum Tc is 92K, or minimum time for making a telephone connection is 1 sec); then the difference between measured data and the ideal value is expected to be as small as possible.
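As a quick illustration (invented data, not from the source), the smaller-the-better ratio can be computed directly from the formula above:

```python
# Smaller-the-better S/N ratio: n = -10 * log10(mean of squares of the measured data).
import math

def sn_smaller_the_better(values):
    return -10 * math.log10(sum(v * v for v in values) / len(values))

print(round(sn_smaller_the_better([0.2, 0.3, 0.25]), 2))  # illustrative defect measurements
```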
(b) LARGER-THE-BETTER :
n = -10 Log10 [mean of sum of squares of reciprocals of measured data]
This case is converted to SMALLER-THE-BETTER by taking the reciprocals of the measured data and then taking the S/N ratio as in the smaller-the-better case.
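Again a brief sketch with made-up numbers, simply applying the reciprocal trick described above:

```python
# Larger-the-better S/N ratio: take reciprocals of the data and apply the
# smaller-the-better formula, i.e. n = -10 * log10(mean of squares of 1/y).
import math

def sn_larger_the_better(values):
    return -10 * math.log10(sum((1.0 / v) ** 2 for v in values) / len(values))

print(round(sn_larger_the_better([40.0, 45.0, 50.0]), 2))  # illustrative strength values
```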
(c) NOMINAL-THE-BEST :
n = 10 Log10 [square of mean / variance]
This case arises when a specified value is MOST desired, meaning that neither a smaller nor a larger value is desirable.
Examples are:
(i) Most parts in mechanical fittings have dimensions which are nominal-the-best type.
(ii) Ratios of chemicals or mixtures are nominal-the-best type, e.g. aqua regia (1:3 of HNO3:HCl), or the ratio of sulphur, KNO3 and carbon in gunpowder.
(iii) Thickness should be uniform in deposition/growth/plating/etching.
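A minimal sketch of this ratio, again with illustrative measurements rather than data from the text:

```python
# Nominal-the-best S/N ratio: n = 10 * log10(mean^2 / variance),
# using the sample variance of the measured data.
import math
import statistics

def sn_nominal_the_best(values):
    mean = statistics.mean(values)
    var = statistics.variance(values)  # sample variance of the measurements
    return 10 * math.log10(mean ** 2 / var)

print(round(sn_nominal_the_best([9.8, 10.1, 10.0, 10.2]), 2))  # target dimension ~10.0
```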
Dynamic Problems
In dynamic problems, we come across many applications where the output is supposed to follow the input signal in a predetermined manner. Generally, a linear relationship between input and output is desirable. Examples include volume control in audio amplifiers, document copiers (with magnification or reduction), various types of moldings, etc.
The slope of the input/output characteristic should be at a specified value (usually 1). It is often treated as Larger-The-Better when the output is a desirable characteristic (as in the case of sensors, where the slope indicates the sensitivity):
n = 10 Log10 [square of slope or beta of the I/O characteristics]
On the other hand, when the output is an undesired characteristic, it can be treated as Smaller-The-Better.
LINEARITY (LARGER-THE-BETTER) :
Most dynamic characteristics are required to have direct proportionality between the input and output. These applications are therefore called "TRANSFORMATIONS".
The straight-line relationship between input and output must be truly linear, i.e. with as little deviation from the straight line as possible:
n = 10 Log10 [square of slope or beta / variance]
where the variance measures the deviations of the measured data points from the best-fit straight line (linear regression).
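The sketch below (hypothetical input/output pairs, not from the source) fits the best-fit line by ordinary least squares and plugs the slope and residual variance into the formula above:

```python
# Dynamic (linearity) S/N ratio: n = 10 * log10(beta^2 / variance), where beta is
# the least-squares slope and variance is the residual scatter about the fitted line.
import math

def sn_dynamic(inputs, outputs):
    n = len(inputs)
    mean_x = sum(inputs) / n
    mean_y = sum(outputs) / n
    beta = (sum((x - mean_x) * (y - mean_y) for x, y in zip(inputs, outputs))
            / sum((x - mean_x) ** 2 for x in inputs))
    intercept = mean_y - beta * mean_x
    residuals = [y - (intercept + beta * x) for x, y in zip(inputs, outputs)]
    variance = sum(r * r for r in residuals) / (n - 2)  # residual variance
    return 10 * math.log10(beta ** 2 / variance)

signal = [1.0, 2.0, 3.0, 4.0, 5.0]        # illustrative input levels
response = [1.1, 1.9, 3.2, 3.9, 5.1]      # illustrative measured outputs
print(round(sn_dynamic(signal, response), 2))
```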
Facilitator's job:
- Design experiment
- Run experiment
- Analyze results
- Confirm experiment
Step 1: Problem Identification
Which products, services or processes need improvement?
Step 2: Brainstorming
- Identify critical variables in the service that affect quality.
- Open and honest discourse with all people involved.
- Decide which factors are controllable and which are not.
Step 3: Experiment Design
Factors and levels are chosen during the brainstorming session.
Step 4: Experiment
Use of ANOVA requires that managers understand its use.
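As a sketch of the kind of analysis involved (hypothetical data, and assuming SciPy is available), a one-way ANOVA tests whether a controllable factor examined at three levels has a significant effect on the response:

```python
# One-way ANOVA on made-up data: does a factor tested at three levels change the response?
from scipy.stats import f_oneway

level_1 = [8.2, 8.5, 8.1]
level_2 = [9.0, 9.3, 9.1]
level_3 = [8.4, 8.6, 8.3]

f_stat, p_value = f_oneway(level_1, level_2, level_3)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # small p suggests the factor matters
```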
Facilitator, although in charge of the experiment,
Step 5: Analysis
Factor settings that bring performance closest to the target specification are identified.
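This step can be illustrated with a small sketch (entirely hypothetical numbers): for each control factor in a matrix experiment, the S/N ratio is averaged at each level and the level with the best average is selected.

```python
# Main-effects analysis of S/N ratios from a tiny L4 matrix experiment (illustrative only):
# pick, for each factor, the level with the higher average S/N (higher = more robust).

L4 = [            # 3 two-level factors A, B, C in 4 runs (standard L4 array)
    (1, 1, 1),
    (1, 2, 2),
    (2, 1, 2),
    (2, 2, 1),
]
sn_ratios = [12.4, 15.1, 13.0, 17.8]   # one S/N value per run (made up)

for factor, name in enumerate("ABC"):
    level_means = {}
    for level in (1, 2):
        runs = [sn for row, sn in zip(L4, sn_ratios) if row[factor] == level]
        level_means[level] = sum(runs) / len(runs)
    best = max(level_means, key=level_means.get)
    print(f"Factor {name}: level means {level_means}, choose level {best}")
```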
Writings
During this period, Taguchi also found time to write Experimental Design and Life Test Analysis and Design of Experiments for Engineers. In 1960, the latter book helped him earn Japan's Deming Prize for his contributions to quality engineering. Two years later he earned his doctorate in science, and he also wrote a second edition of Design of Experiments that introduced industrial research on the signal-to-noise ratio.
Taguchi received the Indigo Ribbon from the Emperor of Japan in 1986 for his outstanding contributions to Japanese economics and industries. He also received the International Technology Institute's Willard F. Rockwell Medal for combining engineering and statistical methods to achieve rapid improvements in cost and quality by optimizing product design and manufacturing processes. He became an honorary member of the Japanese Society of Quality Control in 1995.
This case study concerns an FM demodulator for receivers used in aircraft. It should be designed to minimize the bit error rate (BER).
The main concept of demodulation is to convert the received RF signal to a baseband voltage signal, sample it at the midpoints of the bits, and identify each bit as 0 or 1 using a threshold voltage value. Much design time and cost can be saved by focusing on this main concept.
The ideal function relates the recovered baseband signal to the transmitted bits: the spacing of the level crossings should be proportional to the number of corresponding bits; the signal should hold the desired level, depending on whether it is a 0 or 1 bit, between the level crossings; and that level is proportional to the frequency deviation.
A test sequence of bits was created that has consecutive "0" bits 1, 2, ..., 7 times, and the same for the "1" bit. The number of consecutive bits is the signal factor. The parameter C/kT defines the broadband thermal noise affecting the transmission and is a key noise factor for this design. The bits 0 or 1 can also be viewed as a noise factor, because the proportionality must hold for both the 0 and the 1 bit.
Conclusion
Computation of BER requires generation of millions of bits, which is costly. By using the ideal function and the corresponding zero-point proportional type S/N ratio, one can greatly cut down the simulation effort: the team needed to generate a sequence of only 100 bits to evaluate the S/N ratio. The P-diagram for the FM demodulator project (figure not shown here) classifies the signal, noise and control factors described above. The optimum design achieved a 2 dB improvement in S/N ratio, which amounted to a 37% reduction in BER.
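To make the zero-point proportional S/N ratio concrete, here is a hedged sketch with invented numbers (the real study would use the measured spacing between level crossings for each consecutive-bit count): the ideal function is y = beta * M, a straight line through the origin, and the ratio is 10 log10(beta^2 / residual variance).

```python
# Zero-point proportional dynamic S/N ratio: fit y = beta * M through the origin
# by least squares, then compare beta^2 to the residual variance.
import math

def sn_zero_point_proportional(signal, response):
    beta = sum(m * y for m, y in zip(signal, response)) / sum(m * m for m in signal)
    residuals = [y - beta * m for m, y in zip(signal, response)]
    variance = sum(r * r for r in residuals) / (len(residuals) - 1)
    return 10 * math.log10(beta ** 2 / variance)

M = [1, 2, 3, 4, 5, 6, 7]                      # consecutive-bit counts (signal factor)
y = [1.05, 1.98, 3.1, 3.95, 5.02, 6.1, 6.9]    # illustrative spacings between level crossings
print(round(sn_zero_point_proportional(M, y), 1))
```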