
Taguchi Loss Function



Taguchi's Loss Function

Definition

Simply put, the Taguchi loss function is a way to show how each non-perfect part produced results in a loss for the company. Deming states that it shows

"a minimal loss at the nominal value, and an ever-increasing loss with departure either way from the nominal value."   - W. Edwards Deming Out of the Crisis. p.141

A technical definition is

A parabolic representation that estimates the quality loss, expressed monetarily, that results when quality characteristics deviate from the target values. The cost of this deviation increases quadratically as the characteristic moves farther from the target value. - Duncan, William, Total Quality Key Terms, p. 171

Graphically, the loss function is represented as a parabola centered on the target value.

Interpreting the chart

This standard representation of the loss function demonstrates a few of the key attributes of loss.  For example, the target value and the bottom of the parabolic function intersect, implying that when parts are produced at the nominal value, little or no loss occurs.  Also, the curve flattens as it approaches the target value and steepens as it departs from it, showing that products near the nominal value incur less loss than those farther from the target.  Any departure from the nominal value results in a loss!

Loss can be measured per part.  Measuring loss encourages a focus on achieving less variation.  Once we understand how even a little variation from the nominal results in a loss, the tendency is to keep product and process as close to the nominal value as possible.  This is what is so beneficial about the Taguchi loss function: it keeps our focus on the need to continually improve.

A business that misuses what it has will continue to misuse what it can get. The point is--cure the misuse. - Ford and Crowther

Application

A company that manufactures parts requiring a large amount of machining grew tired of the high costs of tooling. To avoid premature replacement of these expensive tools, the manager suggested that operators set the machine to run at the high end of the specification limits. As the tool wore down, the products would end up measuring at the low end of the specification limits. So the machine would start by producing parts at the high end, and after a period of time it would produce parts that fell just inside the specs.

The variation of parts produced on this machine was much greater than it should have been, since the strategy was to use the entire spec width allowed rather than to produce the highest-quality part possible.  Products may fall within spec, but they will not be produced close to the nominal.  Several of these "good parts" may not assemble well, may require recall, or may come back under warranty.  The Taguchi loss would be very high.

We should consider these vital questions:

* Is the savings of tool life worth the cost of poor products?


* Would it be better to replace the tool twice as often, reduce variation, or look at incoming part quality?

Calculations

Formulas:

Loss at a point: L(x) = k * (x - t)^2
where:
k = loss coefficient
x = measured value
t = target value

Average loss of a sample set: L = k * (s^2 + (pm - t)^2)
where:
s = standard deviation of the sample
pm = process mean

Total Loss = Avg. Loss * number of samples
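These formulas translate directly into a short calculation. The sketch below is a minimal illustration of the three formulas above (the function and variable names are ours, chosen for readability):

```python
def point_loss(x, target, k):
    """Taguchi loss for a single part: L(x) = k * (x - t)^2."""
    return k * (x - target) ** 2

def average_loss(std_dev, process_mean, target, k):
    """Average loss for a sample set: L = k * (s^2 + (pm - t)^2)."""
    return k * (std_dev ** 2 + (process_mean - target) ** 2)

def total_loss(avg_loss, n_samples):
    """Total loss = average loss * number of samples."""
    return avg_loss * n_samples
```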

For example:  A medical company produces a part that has a hole measuring 0.500" ± 0.050".  The tooling used to make the hole is worn and needs replacing, but management doesn't feel it necessary since it still makes "good parts".  All parts pass QC, but several parts have been rejected by assembly.  The failure cost per part is $45.00.  Using the loss function, explain why it may be to the benefit of the company and the customer to replace or sharpen the tool more frequently.  Use the data below:

Measured values:
0.459 | 0.478 | 0.495 | 0.501 | 0.511 | 0.527
0.462 | 0.483 | 0.495 | 0.501 | 0.516 | 0.532
0.467 | 0.489 | 0.495 | 0.502 | 0.521 | 0.532
0.474 | 0.491 | 0.498 | 0.505 | 0.524 | 0.533
0.476 | 0.492 | 0.500 | 0.509 | 0.527 | 0.536

Solution:

The average of the points is 0.501 and the standard deviation is about 0.022.

First, find k using L(x) = k * (x - t)^2:
$45.00 = k * (0.550 - 0.500)^2
k = 18000

Next, using the average loss equation: L = k * (s^2 + (pm - t)^2)

L = 18000 * (0.022^2 + (0.501 - 0.500)^2) = 8.73

So the average loss per part in this set is $8.73.

For the loss of the total 30 parts produced:

Total loss = average loss * number of samples = $8.73 * 30 = $261.90
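As a check on the arithmetic, the worked example can be reproduced in a few lines using Python's standard statistics module; small differences from the figures above come from the text rounding the standard deviation to 0.022:

```python
import statistics

measurements = [
    0.459, 0.478, 0.495, 0.501, 0.511, 0.527,
    0.462, 0.483, 0.495, 0.501, 0.516, 0.532,
    0.467, 0.489, 0.495, 0.502, 0.521, 0.532,
    0.474, 0.491, 0.498, 0.505, 0.524, 0.533,
    0.476, 0.492, 0.500, 0.509, 0.527, 0.536,
]

target = 0.500
k = 45.00 / (0.550 - target) ** 2              # failure cost at the spec limit -> k = 18000

pm = statistics.mean(measurements)             # ~0.501
s = statistics.stdev(measurements)             # ~0.022 (sample standard deviation)

avg_loss = k * (s ** 2 + (pm - target) ** 2)   # ~$8.7 per part ($8.73 with s rounded to 0.022)
print(f"k = {k:.0f}, mean = {pm:.3f}, s = {s:.4f}")
print(f"average loss per part   = ${avg_loss:.2f}")
print(f"total loss for 30 parts = ${avg_loss * len(measurements):.2f}")   # ~$260 (vs. $261.90 above)
```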

From the calculations above, one can determine that at 0.500", no loss is experienced.  At a measured value of 0.501", the loss is $0.018, and at a value of 0.536", the loss would be as much as $23.00.

Even though all measurements were within specification limits and the average hole size was 0.501", the Taguchi loss shows that the company lost about $261.90 for every 30 parts made.  If the batch size were increased to 1,000 parts, the loss would be $8,730 per batch.  Because of the variation caused by the old tooling, the department is losing a significant amount of money.

From the chart, we can see that deviation from the nominal could cost as much as $0.30 per part.  In addition, we would want to investigate whether this kind of deviation would compromise the integrity of the final product after assembly, to the point of product failure.

Taguchi loss function

From Wikipedia, the free encyclopedia

The Taguchi Loss Function is a graphical depiction of loss developed by the Japanese business statistician Genichi Taguchi to describe a phenomenon affecting the value of products produced by a company. Praised by Dr. W. Edwards Deming (the business guru of the 1980s American quality movement),[1] it made clear the concept that quality does not suddenly plummet when, for instance, a machinist exceeds a rigid blueprint tolerance. Instead "loss" in value progressively increases as variation increases from the intended condition. This was considered a breakthrough in describing quality, and helped fuel the continuous improvement movement that since has become known as lean manufacturing.

Overview

The Taguchi Loss Function is important for a number of reasons. Primarily, it helps engineers better understand the importance of designing for variation. It was important to the Six Sigma movement by driving an improved understanding of the importance of Variation Management (a concept described in the Shingo Prize-winning book Breaking the Cost Barrier[2]). Finally, it was important in describing the effects of changing variation on a system, which is a central characteristic of Lean Dynamics, a business management discipline focused on better understanding the impact of dynamic business conditions (such as sudden changes in demand seen during the 2008-2009 economic downturn) on loss, and thus on creating value.[3]

Taguchi methods

From Wikipedia, the free encyclopedia

Taguchi methods are statistical methods developed by Genichi Taguchi to improve the quality of manufactured goods, and more recently also applied to engineering,[1] biotechnology,[2][3] marketing and advertising.[4] Professional statisticians have welcomed the goals and improvements brought about by Taguchi methods, particularly by Taguchi's development of designs for studying variation, but have criticized the inefficiency of some of Taguchi's proposals.[5]

Taguchi's work includes three principal contributions to statistics:

A specific loss function — see Taguchi loss function;
The philosophy of off-line quality control; and
Innovations in the design of experiments.

Loss functions

Loss functions in statistical theory

Traditionally, statistical methods have relied on mean-unbiased estimators of treatment effects: Under the conditions of the Gauss-Markov theorem, least squares estimators have minimum variance among all mean-unbiased estimators. The emphasis on comparisons of means also draws (limiting) comfort from the law of large numbers, according to which the sample means converge to the true mean. Fisher's textbook on the design of experiments emphasized comparisons of treatment means.

Gauss proved that the sample-mean minimizes the expected squared-error loss-function (while Laplace proved that a median-unbiased estimator minimizes the absolute-error loss function). In statistical theory, the central role of the loss function was renewed by the statistical decision theory of Abraham Wald.

However, loss functions were avoided by Ronald A. Fisher.[6]

Taguchi's use of loss functions

Taguchi knew statistical theory mainly from the followers of Ronald A. Fisher, who also avoided loss functions. Reacting to Fisher's methods in the design of experiments, Taguchi interpreted Fisher's methods as being adapted for seeking to improve the mean outcome of a process. Indeed, Fisher's work had been largely motivated by programmes to compare agricultural yields under different treatments and blocks, and such experiments were done as part of a long-term programme to improve harvests.

However, Taguchi realised that in much industrial production, there is a need to produce an outcome on target, for example, to machine a hole to a specified diameter, or to manufacture a cell to produce a given voltage. He also realised, as had Walter A. Shewhart and others before him, that excessive variation lay at the root of poor manufactured quality and that reacting to individual items inside and outside specification was counterproductive.

He therefore argued that quality engineering should start with an understanding of quality costs in various situations. In much conventional industrial engineering, the quality costs are simply represented by the number of items outside specification multiplied by the cost of rework or scrap. However, Taguchi insisted that manufacturers broaden their horizons to consider cost to society. Though the short-term costs may simply be those of non-conformance, any item manufactured away from nominal would result in some loss to the customer or the wider community through early wear-out; difficulties in interfacing with other parts, themselves probably wide of nominal; or the need to build in safety margins. These losses are externalities and are usually ignored by manufacturers, which are more interested in their private costs than social costs. Such externalities prevent markets from operating efficiently, according to analyses of public economics. Taguchi argued that such losses would inevitably find their way back to the originating corporation (in an effect similar to the tragedy of the commons), and that by working to minimise them, manufacturers would enhance brand reputation, win markets and generate profits.

Such losses are, of course, very small when an item is near to nominal. Donald J. Wheeler characterised the region within specification limits as where we deny that losses exist. As we diverge from nominal, losses grow until the point where losses are too great to deny and the specification limit is drawn. All these losses are, as W. Edwards Deming would describe them, unknown and unknowable, but Taguchi wanted to find a useful way of representing them statistically. Taguchi specified three situations:

1. Larger the better (for example, agricultural yield);
2. Smaller the better (for example, carbon dioxide emissions); and
3. On-target, minimum-variation (for example, a mating part in an assembly).

The first two cases are represented by simple monotonic loss functions. In the third case, Taguchi adopted a squared-error loss function for several reasons:

* It is the first "symmetric" term in the Taylor series expansion of real analytic loss-functions.
* Total loss is measured by the variance. For uncorrelated random variables, as variance is additive, the total loss is an additive measurement of cost.
* The squared-error loss function is widely used in statistics, following Gauss's use of the squared-error loss function in justifying the method of least squares.
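For illustration, the forms commonly used for these three cases can be sketched as follows; the smaller-the-better and larger-the-better expressions are the standard textbook forms and are not written out in this text:

```python
def loss_nominal_the_best(y, target, k):
    """On-target, minimum-variation: loss grows quadratically with deviation from target."""
    return k * (y - target) ** 2

def loss_smaller_the_better(y, k):
    """Smaller the better (e.g. emissions): the ideal value is zero, so L = k * y^2."""
    return k * y ** 2

def loss_larger_the_better(y, k):
    """Larger the better (e.g. yield): commonly written L = k / y^2, falling as y grows."""
    return k / y ** 2
```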

Reception of Taguchi's ideas by statisticians

Though many of Taguchi's concerns and conclusions are welcomed by statisticians and economists, some ideas have been especially criticized. For example, Taguchi's recommendation that industrial experiments maximise some signal-to-noise ratio (representing the magnitude of the mean of a process compared to its variation) has been criticized widely.[citation needed]

Off-line quality control

Taguchi's rule for manufacturing

Taguchi realized that the best opportunity to eliminate variation is during the design of a product and its manufacturing process. Consequently, he developed a strategy for quality engineering that can be used in both contexts. The process has three stages:

1. System design
2. Parameter (measure) design
3. Tolerance design

System design

This is design at the conceptual level, involving creativity and innovation.

Parameter design

Once the concept is established, the nominal values of the various dimensions and design parameters need to be set, the detail design phase of conventional engineering. Taguchi's radical insight was that the exact choice of values required is under-specified by the performance requirements of the system. In many circumstances, this allows the parameters to be chosen so as to minimize the effects on performance arising from variation in manufacture, environment and cumulative damage. This is sometimes called robustification.

Tolerance design

With a successfully completed parameter design, and an understanding of the effect that the various parameters have on performance, resources can be focused on reducing and controlling variation in the critical few dimensions (see Pareto principle).


Design of experiments

Taguchi developed his experimental theories independently. Taguchi read works following R. A. Fisher only in 1954. Taguchi's framework for design of experiments is idiosyncratic and often flawed, but contains much that is of enormous value.[citation needed] He made a number of innovations.

Outer arrays

Taguchi's designs aimed to allow greater understanding of variation than did many of the traditional designs from the analysis of variance (following Fisher). Taguchi contended that conventional sampling is inadequate here as there is no way of obtaining a random sample of future conditions.[7] In Fisher's design of experiments and analysis of variance, experiments aim to reduce the influence of nuisance factors to allow comparisons of the mean treatment-effects. Variation becomes even more central in Taguchi's thinking.

Taguchi proposed extending each experiment with an "outer array" (possibly an orthogonal array); the "outer array" should simulate the random environment in which the product would function. This is an example of judgmental sampling. Many quality specialists have been using "outer arrays".

Later innovations in outer arrays resulted in "compounded noise." This involves combining a few noise factors to create two levels in the outer array: First, noise factors that drive output lower, and second, noise factors that drive output higher. "Compounded noise" simulates the extremes of noise variation but uses fewer experimental runs than would previous Taguchi designs.
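A rough sketch of the inner/outer-array idea follows; the factor names, levels and array sizes are invented for illustration and are not taken from the text. Each control-factor run in the inner array is repeated under every noise condition in the outer array, and compounded noise collapses the outer array to just two conditions:

```python
from itertools import product

# Hypothetical inner array: an L4 design over three two-level control factors A, B, C
inner_array = [(1, 1, 1), (1, 2, 2), (2, 1, 2), (2, 2, 1)]

# Hypothetical outer array with compounded noise: one condition pushing the
# response low, one pushing it high, instead of a full array of noise factors
outer_array = ["noise_low", "noise_high"]

# Crossing the arrays: every control setting is run under every noise condition,
# so the variation of each control setting across noise can be observed directly
for control_setting, noise in product(inner_array, outer_array):
    print(control_setting, noise)   # 4 x 2 = 8 experimental runs
```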

Management of interactions

Interactions, as treated by Taguchi

Many of the orthogonal arrays that Taguchi has advocated are saturated arrays, allowing no scope for estimation of interactions. This is a continuing topic of controversy. However, this is only true for "control factors" or factors in the "inner array". By combining an inner array of control factors with an outer array of "noise factors", Taguchi's approach provides "full information" on control-by-noise interactions, it is claimed. Taguchi argues that such interactions have the greatest importance in achieving a design that is robust to noise factor variation. The Taguchi approach provides more complete interaction information than typical fractional factorial designs, its adherents claim.

Followers of Taguchi argue that the designs offer rapid results and that interactions can be eliminated by proper choice of quality characteristics. That notwithstanding, a "confirmation experiment" offers protection against any residual interactions. If the quality characteristic represents the energy transformation of the system, then the "likelihood" of control factor-by-control factor interactions is greatly reduced, since "energy" is "additive".

Inefficiencies of Taguchi's designs

Interactions are part of the real world. In Taguchi's arrays, interactions are confounded and difficult to resolve.

Statisticians in response surface methodology (RSM) advocate the "sequential assembly" of designs: In the RSM approach, a screening design is followed by a "follow-up design" that resolves only the confounded interactions judged worth resolution. A second follow-up design may be added (time and resources allowing) to explore possible high-order univariate effects of the remaining variables, as high-order univariate effects are less likely in variables already eliminated for having no linear effect. With the economy of screening designs and the flexibility of follow-up designs, sequential designs have great statistical efficiency. The sequential designs of response surface methodology require far fewer experimental runs than would a sequence of Taguchi's designs.[8]


Analysis of experiments

Taguchi introduced many methods for analysing experimental results including novel applications of the analysis of variance and minute analysis.

Assessment

Genichi Taguchi has made valuable contributions to statistics and engineering. His emphasis on loss to society, techniques for investigating variation in experiments, and his overall strategy of system, parameter and tolerance design have been influential in improving manufactured quality worldwide.[9] Although some of the statistical aspects of the Taguchi methods are disputable, there is no dispute that they are widely applied to various processes. A quick search in related journals, as well as the World Wide Web, reveals that the method is being successfully implemented in diverse areas, such as the design of VLSI; optimization of communication and information networks; development of electronic circuits; laser engraving of photo masks; cash-flow optimization in banking; government policymaking; runway utilization improvement in airports; and even robust eco-design.[10]

See also: Design of experiments; Optimal design; Orthogonal array; Quality management; Response surface methodology; Sales process engineering; Six sigma; Tolerance (engineering); Probabilistic design.

Loss function

In statistics and decision theory a loss function is a function that maps an event onto a real number intuitively representing some "cost" associated with the event. Typically it is used for parameter estimation, and the event in question is some function of the difference between estimated and true values for an instance of data. In the context of economics, for example, this is usually economic cost or regret. In Machine Learning, it is the penalty for an incorrect classification of an example.

Definition

Formally, we begin by considering some family of distributions for a random variable X, that is indexed by some θ.

More intuitively, we can think of X as our "data", perhaps X = (X_1, ..., X_n), where the X_i are i.i.d. The X is the set of things the decision rule will be making decisions on. There exists some number of possible ways F_θ to model our data X, which our decision function can use to make decisions. For a finite number of models, we can thus think of θ as the index to this family of probability models. For an infinite family of models, it is a set of parameters to the family of distributions.

On a more practical note, it is important to understand that, while it is tempting to think of loss functions as necessarily parametric (since they seem to take θ as a "parameter"), the fact that θ is non-finite-dimensional is completely incompatible with this notion; for example, if the family of probability functions is uncountably infinite, θ indexes an uncountably infinite space.

From here, given a set A of possible actions, a decision rule is a function δ : 𝒳 → A, from the possible data to the actions.


A loss function is a real lower-bounded function L on Θ × A for some θ ∈ Θ. The value L(θ, δ(X)) is the cost of action δ(X) under parameter θ.[1]

Decision rules

A decision rule makes a choice using an optimality criterion. Some commonly used criteria are:

Minimax: Choose the decision rule with the lowest worst loss — that is, minimize the worst-case (maximum possible) loss: choose the δ that minimizes max_θ R(θ, δ).

Invariance: Choose the optimal decision rule which satisfies an invariance requirement.

Expected loss: Choose the decision rule that minimizes the expected value of the loss function (discussed next).

Expected loss

The value of the loss function itself is a random quantity because it depends on the outcome of a random variable X. Both frequentist and Bayesian statistical theory involve making a decision based on the expected value of the loss function: however this quantity is defined differently under the two paradigms.

Frequentist risk

Main article: risk function

The expected loss in the frequentist context is obtained by taking the expected value with respect to the probability distribution, P_θ, of the observed data, X. This is also referred to as the risk function[2] of the decision rule δ and the parameter θ. Here the decision rule depends on the outcome of X. The risk function is given by R(θ, δ) = E_θ[L(θ, δ(X))].

Bayesian expected loss

In a Bayesian approach, the expectation is calculated using the posterior distribution π* of the parameter θ: ρ(π*, a) = E_π*[L(θ, a)] = ∫ L(θ, a) dπ*(θ).

One then should choose the action a* which minimises the expected loss. Although this will result in choosing the same action as would be chosen using the Bayes risk, the emphasis of the Bayesian approach is that one is only interested in choosing the optimal action under the actual observed data, whereas choosing the actual Bayes optimal decision rule, which is a function of all possible observations, is a much more difficult problem.
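As a toy illustration (the posterior probabilities and the 0-1 loss below are invented for the sketch), the Bayes action is simply the action whose posterior expected loss is smallest:

```python
# Toy posterior over two parameter values, with 0-1 loss: L(theta, a) = 0 if a == theta else 1
posterior = {0: 0.7, 1: 0.3}                   # pi*(theta | data), assumed for illustration
actions = [0, 1]

def expected_loss(a):
    """Posterior expected loss: rho(pi*, a) = sum over theta of L(theta, a) * pi*(theta)."""
    return sum((0 if a == theta else 1) * p for theta, p in posterior.items())

print({a: expected_loss(a) for a in actions})  # {0: 0.3, 1: 0.7}
print("Bayes action:", min(actions, key=expected_loss))   # 0
```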

Selecting a loss function

Sound statistical practice requires selecting an estimator consistent with the actual loss experienced in the context of a particular applied problem. Thus, in the applied use of loss functions, selecting which statistical method to use to model an applied problem depends on knowing the losses that will be experienced from being wrong under the problem's particular circumstances, which results in the introduction of an element of teleology into problems of scientific decision-making.


A common example involves estimating "location." Under typical statistical assumptions, the mean or average is the statistic for estimating location that minimizes the expected loss experienced under the Taguchi or squared-error loss function, while the median is the estimator that minimizes expected loss experienced under the absolute-difference loss function. Still different estimators would be optimal under other, less common circumstances.
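A small numerical check of this point, using made-up data with one outlier: the mean minimizes the total squared error, while the median minimizes the total absolute error.

```python
import statistics

data = [2.0, 2.1, 2.3, 2.4, 9.0]               # one outlier at 9.0

def total_squared_error(c):
    return sum((x - c) ** 2 for x in data)

def total_absolute_error(c):
    return sum(abs(x - c) for x in data)

mean, median = statistics.mean(data), statistics.median(data)
# The mean (pulled toward the outlier) beats the median under squared-error loss ...
print(total_squared_error(mean) <= total_squared_error(median))    # True
# ... while the median beats the mean under absolute-error loss.
print(total_absolute_error(median) <= total_absolute_error(mean))  # True
```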

In economics, when an agent is risk neutral, the loss function is simply expressed in monetary terms, such as profit, income, or end-of-period wealth.

But for risk averse (or risk-loving) agents, loss is measured as the negative of a utility function, which represents satisfaction and is usually interpreted in ordinal terms rather than in cardinal (absolute) terms.

Other measures of cost are possible, for example mortality or morbidity in the field of public health or safety engineering.

For most optimization algorithms, it is desirable to have a loss function that is globally continuous and differentiable.

Two very commonly used loss functions are the squared loss, L(a) = a^2, and the absolute loss, L(a) = |a|. However, the absolute loss has the disadvantage that it is not differentiable at a = 0. The squared loss has the disadvantage that it tends to be dominated by outliers: when summing over a set of a's (as in Σ_i L(a_i)), the final sum tends to be the result of a few particularly large a-values, rather than an expression of the average a-value.

Loss functions in Bayesian statistics

One of the consequences of Bayesian inference is that in addition to experimental data, the loss function does not in itself wholly determine a decision. What is important is the relationship between the loss function and the prior probability. So it is possible to have two different loss functions which lead to the same decision when the prior probability distributions associated with each compensate for the details of each loss function.

Combining the three elements of the prior probability, the data, and the loss function then allows decisions to be based on maximizing the subjective expected utility, a concept introduced by Leonard J. Savage.

Regret

Main article: Regret (decision theory)

Savage also argued that using non-Bayesian methods such as minimax, the loss function should be based on the idea of regret, i.e., the loss associated with a decision should be the difference between the consequences of the best decision that could have been taken had the underlying circumstances been known and the decision that was in fact taken before they were known.

Quadratic loss function

The use of a quadratic loss function is common, for example when using least squares techniques or Taguchi methods. It is often more mathematically tractable than other loss functions because of the properties of variances, as well as being symmetric: an error above the target causes the same loss as the same magnitude of error below the target. If the target is t, then a quadratic loss function is

L(x) = C * (x - t)^2

for some constant C; the value of the constant makes no difference to a decision, and can be ignored by setting it equal to 1.

Many common statistics, including t-tests, regression models, design of experiments, and much else, use least squares methods applied through linear model theory, which is based on the quadratic loss function.

The quadratic loss function is also used in linear-quadratic optimal control problems.


Taguchi Loss Function and Capability Analysis

Good News - Bad News

We recently sent out a QI Macros ezine about analyzing manufacturing performance data (link to rubber ezine). Many readers asked an interesting question: if my data fits between the specification limits of the histogram, but the control chart is unstable, is that good or bad?

Warranty Example
Many years ago I read about an example from the automotive industry. One company was building transmissions for cars in both Japan and America. The American transmissions had five times the warranty issues.

To determine the problem, five transmissions were selected at random from both the Japanese factory and the American factory. Then, they took them apart and measured all of the specifications.


American Transmissions
All of the American transmissions had parts that fell within the USL-LSL. Some measures were a little higher and some a little lower.

Japanese Transmissions
When the inspectors measured the Japanese transmissions, they got worried, because they got the same value on each of the parts on each of the five transmissions. They began to suspect that their gauges were incorrect.

The Japanese transmissions measured identically on all of the key specifications. There was no variation to speak of. Their measurements clustered closely around the target.

Here's my point: To truly serve your customer, your process has to be both stable and capable. It can't just be one or the other.

Stable - the control chart is in control (no unstable conditions)
Capable - the histogram fits inside the specification limits (USL, LSL)

Stabilize your process
When the process moves around like this example, it probably means that someone is changing the settings without any real need to. Let the process run and then adjust the settings to move it onto the target. Then leave it alone unless it starts to drift.

Reduce Variation
Once the process is stable, use process improvement to reduce the variation (adjust the process to reduce the variation from the target).

Reduce the Loss
Stabilizing your process and reducing the variation will, in turn, reduce the Taguchi loss. This will save you and your customers time and money (rework, waste, and delay). And customers are smart: they can tell the difference between two different transmissions, and they can tell the difference in quality between you and your competitors.

Make sure you're the one your customers return to year after year. Hitting the goal posts isn't good enough any more; you have to hit the target value most of the time. Your customers will love you for it.

Taguchi's Loss Function


Genichi Taguchi's impact upon North American product design and manufacturing processes began in November 1981, when Ford Motor Company requested that Dr. Taguchi make a presentation.  Fortunately, I was invited to hear about this powerful design technique.  A different method of measuring quality is central to Taguchi's approach to design: the loss function measures quality.  The loss function establishes a financial measure of the user's dissatisfaction with a product's performance as it deviates from a target value.  Thus, both average performance and variation are critical measures of quality.  Selecting a product design or a manufacturing process that is insensitive to uncontrolled sources of variation improves quality.  Dr. Taguchi calls these uncontrolled sources of variation noise factors.  This term comes from early applications of his methods in the communications industry.  Applying Taguchi's concept entails evaluating both the variance and the average for the technical benchmarking in QFD.  The loss function provides a single metric for comparison.

Static Taguchi applications search for a product design or manufacturing process that attains one fixed performance level.  A static application for an injection molding machine finds the best operating conditions for a single mold design.  Dynamic applications use mold dimensions as the signal and search for operating conditions which yield the same percentage shrinkage for any dimension in any orientation.  The dynamic approach allows an organization to produce a design that satisfies today's requirements but can be easily changed to satisfy tomorrow's demands.  You can consider this latter approach as contingency planning for some unknown future requirement.  In dynamic applications, a signal factor  moves the performance to some value and an adjustment factor modifies the design's sensitivity to this factor.  If you plot a straight line relationship, with the horizontal axis as the signal factor and the vertical axis as the response, the adjustment factor changes the slope of the line.  Being able to reduce a product's sensitivity to changes in the signal is useful.  For example, if you are designing a sports car your desired outcome might be a car that allows the driver to change the feel of the road.  The signal factor would be a control knob setting. The analysis could determine that the suspension system is the adjustment factor.  The adjustment factor adjusts the magnitude of change in road feel to a given change in the knob setting.  Several other design specifications would assure a predictable relationship in the control knob setting and the feel.  Changes in road conditions and weather would have minimal effect upon the relationship between knob adjustment and feel of the road.

Listening to the voice of the customer helps organizations create good systems designs.  Some teams use focus groups to gather input for these designs.  Fortune Magazine (April 1995) has an article by Justin Martin entitled "Ignore Your Customer."  It presents several examples of products that were strongly influenced by the voice of focus groups, but they were not purchased by consumers.  The author suggests that studying the customers under natural conditions would provide additional useful information.  Going to the Gemba is a crucial step in QFD.  This has been advocated since 1985. The Gemba is the total environment in which the customer lives and works.

A Voice of the Customer Table has three components: the customer's verbatim response, the context of use, and the integration of verbatim response and context (Chapter 4, Figure 4-5).  Eventually, the team divides this expanded list of customer information into demanded qualities, failure modes, solutions, etc.  The Context of Application Table (Figure 4-3) identifies some of the environmental sources of uncontrolled variation in product performance.  Some examples of sources of variation for an easel pad are the force applied to the paper, humidity, and whether it is being used inside or outside.  Taguchi's Robust Design reduces the impact of uncontrolled sources of variation upon the product's performance.

How to Measure Quality

Traditionally, quality is viewed as a step function, as shown by the heavy line graph in figure 1.  A product is either good or bad.  This view assumes a product is uniformly good between the specifications (LS, the lower specification, and US, the upper specification).  The vertical axis represents the degree of displeasure the customer has with the product's performance.  Curves A and B represent the frequencies of performance of two designs during a certain time period.  B has a higher fraction of "bad" performance and therefore is less desirable than A.

figure 1

Sometimes traditional decision makers and those using Taguchi's loss function will make the same judgments.  If organizations consider both the position of the average and the variance, and if the averages are equal and/or the variances are equal, then the traditional decision maker and one using Taguchi's loss function will make the same decision.  However, when both the average and the variance are different, the traditional decision-maker simply calculates the percent defective over time.

figure 2

Taguchi believes that the customer becomes increasingly dissatisfied as performance departs farther away from the target.

He suggests a quadratic curve to represent a customer's dissatisfaction with a product's performance. The quadratic term is the leading term of a Taylor series expansion of the loss about the target: because the loss is zero and at its minimum at the target, the constant and first-derivative terms vanish, leaving the squared term.  The curve is centered on the target value, which provides the best performance in the eyes of the customer.  Identifying the best value is not an easy task.  Targets are sometimes the designer's best guess.
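In symbols, the standard argument runs as follows (notation ours): expanding the loss L about the target t, the constant and linear terms vanish because the loss is zero and at its minimum at the target, so the quadratic term is the leading approximation.

```latex
L(y) = L(t) + L'(t)(y - t) + \tfrac{1}{2} L''(t)(y - t)^2 + \cdots
     \approx k (y - t)^2,
\qquad \text{since } L(t) = 0,\; L'(t) = 0,\; k = \tfrac{1}{2} L''(t).
```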

LCT represents the lower consumer tolerance and UCT represents the upper consumer tolerance.  This is a customer-driven design rather than an engineer's specification.  Experts often define the consumer tolerance as the performance level where 50% of the consumers are dissatisfied.  Your organization's particular circumstance will shape how you define consumer tolerance for a product.

The equation for the target-is-best loss function uses both the average and the variance for selecting the best design.  The equation for average loss is L = k * (s^2 + (ȳ - t)^2), where s is the standard deviation, ȳ is the average performance, and t is the target (the same form used in the calculations earlier).

Calculating the average loss permits a design team to consider the cost-benefit analysis of alternate designs with different costs yielding different average losses.  As seen in figure 2, there is some financial loss incurred at the upper consumer tolerance.  This could be a warranty charge to the organization or a repair expense.

Most applications of the loss function in QFD can use a value of 1 for k since the constant would be the same for all competitors as it relates to the customer.

The graphics show a symmetric loss about the target, but this is not always the case.

If two products have the same variance but different averages, then the product with the average that is closer to the target (A) has better quality (figure 3).

figure 3

If two products have the same average but different variances, then the product with the smaller variance has better quality (figure 4).  Product B performs near target less often than its competitor.

figure 4

What if both average and variance are different?  Calculating the average loss assumes you agree with the concept of the loss function.  The product with the smaller loss has the better quality (figure 5).  If curve A is far to the right, then curve B would be better.  If curve A is centered on the target, then curve A would be better.  Somewhere in between, both have the same loss.


figure 5

Loss Function and Technical Benchmarking

Teams should gather the data collected for technical benchmarking in a real environment.  A real environment is one in which everything is not controlled and ideal.  Our product and the competitor's product would be evaluated at different temperatures, humidity levels and other conditions.  The laboratory can simulate these conditions.  By evaluating the product's performance in several environmental conditions, you would have realistic data to calculate the real-world variance.

An orthogonal array can define a balanced study of different environmental conditions. The two or three important environmental conditions, each at two levels, provide a good estimate of the environmental variation.  The humidity is represented by H, the weight of items taped to the sheet on the wall by W, and the surface texture by T.  The 1 and 2 under H represent high and low humidity.  The four different combinations of environments are used to determine the average and variance of each product's performance.  

 

Instead of using all eight different combinations, the orthogonal array uses a special subset of the eight.  Due to the balanced nature of these four combinations, the effect of the missing four can be predicted.
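A standard L4 array for three two-level factors, shown below as a plausible form of the table referred to above (the original table is not reproduced here), keeps the study balanced: each factor appears at each level in half of the runs, and every pair of factors sees all four level combinations exactly once, which is what lets the effect of the missing combinations be predicted.

```python
from itertools import combinations

# Standard L4(2^3) orthogonal array over humidity (H), weight (W) and surface texture (T);
# level 1 = high, level 2 = low (a plausible reconstruction of the table described above)
l4 = [
    {"H": 1, "W": 1, "T": 1},
    {"H": 1, "W": 2, "T": 2},
    {"H": 2, "W": 1, "T": 2},
    {"H": 2, "W": 2, "T": 1},
]

# Balance check: every pair of factors sees all four level combinations exactly once
for a, b in combinations(["H", "W", "T"], 2):
    pairs = sorted((run[a], run[b]) for run in l4)
    print(a, b, pairs)   # -> [(1, 1), (1, 2), (2, 1), (2, 2)] for each pair
```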

Another option is to select the best and the worst environmental combination of the eight combinations.  This approach further reduces the number of environments evaluated to two.

The average loss for the data is:

The calculations of the variance and loss can be entered in two additional rows at the bottom of the Demanded Quality vs. Performance matrix used in QFD.  The ratio of the average loss of one competitor to another is independent of k.  The information of the average, variance and loss ratio identifies the directions for improvement as defined by the average loss equation.