Within the scientific method and its techniques for pursuing knowledge, experiments are the vehicle through which empirical facts are established. The early approach of “trial and error” is what design of experiments (DOE) aims to improve upon. By applying statistical analysis to natural phenomena, experimenters can improve the setup, execution, and conclusions drawn from trials – and errors. Experiments in modern times are critical for researchers and manufacturers alike, and occur much earlier in product life cycles than in pre-industrial eras. As good as a product concept may be, manufacturers must provide quality not only by making an efficient product, but by making it consistently throughout the product’s lifecycle.
Through experimentation, something as potentially simple as establishing causation between two factors can create a cascade of effects that ripple through a product’s design, saving costs while improving functionality. But how can these factors be considered and quantified along with the design of the product? And what if these same factors could provide knowledge about what aspects of the product are most central in determining quality and satisfaction?
These were among the questions that Genichi Taguchi considered as he worked to improve Japan’s telephone network in the 1950s. Himself an engineer, Taguchi proposed an approach to the design of experiments that coupled critical analysis of a product and its crucial factors with a statistical, numerical process. This approach not only aimed to cut costs by establishing a single, optimal iteration of a product, but also sought to cut deviation from that optimal state by considering the relationship between noise (uncontrollable) and signal (controllable) factors in design and improvement.
The Taguchi method uses the concept of a loss function to determine the quality of a product, which can offer experiment facilitators and data analysts alternative perspectives on the data being collected and processed. For Taguchi, loss is measured as a product’s loss to society, calculated from variation in performance as well as its downstream effects. A product that functions consistently regardless of environment and user is considered robust, and for Taguchi, this is the key feature of a high-quality product. At a glance, the Taguchi method presents the case for robustification, along with a methodology for achieving it.
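In its most common form, Taguchi’s loss function is quadratic: L(y) = k(y − m)², where m is the target value and k is a cost coefficient. A minimal sketch in Python (the target, coefficient, and measurements below are illustrative, not taken from any real product):

```python
def taguchi_loss(y, target, k):
    """Quadratic Taguchi loss: cost grows with the square of the
    deviation from the target, even for units that are 'in spec'."""
    return k * (y - target) ** 2

# Two units, both within a hypothetical +/- 0.6 tolerance, yet the one
# nearer the target represents far less loss to society:
print(taguchi_loss(10.25, target=10.0, k=500.0))  # 31.25
print(taguchi_loss(10.5, target=10.0, k=500.0))   # 125.0
```

Note that loss is nonzero even for units inside the specification limits; that continuous view of quality is the core contrast with pass/fail inspection.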
LASER-FOCUSED ON DESIGN: APPLYING THE TAGUCHI METHOD
If a company were developing a laser used to create tiny patterns on materials (a rudimentary description of the etching process used in semiconductor manufacturing), then the quality of the laser would be, in part, determined by the amount of variance from the standard found in the patterns it creates. In a case where one such laser could cost millions to develop and produce, the Taguchi method would devote greater time to the research and development stage to establish that every laser will etch a pattern that meets specified requirements.
Following the Taguchi method, loss could be measured as any negative effect resulting from the design of the product. The potential for an operator to be injured while operating the laser and materials rendered useless by incorrect or imprecise patterns are two clear ways that loss could occur, and both would receive special attention at the design and early iteration stages of the product.
Additional considerations for loss would include broader, less tangible negative outcomes of the product. Waste produced, loss of future sales due to a drop in brand confidence, and any post-production costs to fix problems with the product can and would be included in Taguchi’s loss function.
SEEING QUALITY AS NON-LINEAR
Certain aspects of the Taguchi method are philosophical in nature, describing the way a company should analyze or conceptualize its products. They are often summarized as three fundamental concepts:
1) Quality Must be Designed into the Product:
Understanding the aspects of product design that influence quality implies an understanding of the product itself. Knowing the product, the user, and its intended use-cases may seem a simple task, but accounting for them in a pre-manufacturing stage – that is, before the product, user, or use-case exist – is part of the overall reimagination of how products should be created.
Implicit in the Taguchi method is the belief that manufacturing processes are inherently flawed and can only introduce problems into a design. Thus, adjustments and iterations take place before a design ever reaches a manufacturing or assembly line. Consequently, this approach is also called “off-line design” or “off-line quality control.”
2) Quality is Realized by Minimizing Deviation from the Target:
Investments that reduce variation from a target optimal state in a product have favorable return on investment (ROI), especially when customer satisfaction, replacements, and post-production improvements are factored into cost. Along with bolstering brand loyalty, addressing these factors early – and continuously – makes design robust and helps eliminate loss resulting from the aforementioned pathways.
3) Quality is a Function of Deviation:
Placing a primary emphasis on the relationship between quality and cost of failure establishes a guide for optimal improvement of a product. The Taguchi method measures losses at the systemic level and can factor in any costs associated with the return of a product; warranty, re-inspection, replacement, and even costs extended to the customer are all factors that contribute to loss under the Taguchi method.
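One common way to anchor the loss curve to real costs is to calibrate the coefficient k from the cost of a failure at the functional limit, k = A / Δ², where A is the repair or replacement cost incurred when deviation reaches Δ. The average loss over a production run then splits into a variance term and a mean-offset term. A sketch with illustrative numbers (the $80 repair cost and the limits are assumptions, not figures from this article):

```python
def loss_coefficient(failure_cost, functional_limit):
    """Calibrate k so the loss curve equals the repair/replacement
    cost at the deviation where the product effectively fails:
    k = A / Delta^2."""
    return failure_cost / functional_limit ** 2

def expected_loss(k, variance, mean_offset):
    """Average loss per unit over a production run:
    E[L] = k * (variance + (mean - target)^2)."""
    return k * (variance + mean_offset ** 2)

# Illustrative: an $80 warranty repair when deviation reaches 0.5 units.
k = loss_coefficient(80.0, 0.5)  # 320.0
print(expected_loss(k, variance=0.0625, mean_offset=0.25))  # 40.0
```

The decomposition makes the systemic view concrete: both process spread (variance) and being off-target on average (mean offset) translate directly into dollars of loss.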
FACTORS IN THE TAGUCHI METHOD
Taguchi’s approach allows experimenters to identify factor settings that yield more consistent results, and it approaches design with explicit consideration for uncontrollable factors. There are three central stages in the overall structure:
Systems design is the “brainstorming” and synthesis of a product or process to be used. It occurs early on, during conceptualization of a product, and focuses on achieving functionality through innovation. After these creative avenues have been exhausted, the basis for parameter design, the next stage in Taguchi’s process, should be established.
Parameter design in the Taguchi method achieves the goal of creating a product that is robust to both the environment and the user. Designing a set of rules that determine design elements, then defining each rule using parameters and components, helps quantify and diagnose variation in a given product. The term “parameter design” is often used interchangeably with “robust design” given this focus.
Within parameter design, Taguchi makes use of orthogonal arrays, which fall under the broader scheme of orthogonal array testing strategies (OATS) and are meant to provide an alternative to quality control methods that can be prohibitive as a result of setup cost, time constraints, or other practical limitations. In this sense, Taguchi’s orthogonal arrays are an alternative to full factorial experimental design, which simply – and exhaustively – tests every possible combination of states and variables.
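To make the contrast concrete, the smallest standard Taguchi array, L4, covers three two-level factors in four runs instead of the eight a full factorial would require, while preserving pairwise balance between columns. A sketch (factor levels here are generic placeholders, not a real process):

```python
from itertools import product

# Standard L4 orthogonal array: 4 runs for 3 two-level factors.
# In any pair of columns, each combination of levels appears exactly once.
L4 = [
    (1, 1, 1),
    (1, 2, 2),
    (2, 1, 2),
    (2, 2, 1),
]

full_factorial = list(product([1, 2], repeat=3))
print(len(L4), "runs instead of", len(full_factorial))  # 4 runs instead of 8

# Check the pairwise-balance property for columns 0 and 1:
pairs = sorted({(row[0], row[1]) for row in L4})
print(pairs)  # [(1, 1), (1, 2), (2, 1), (2, 2)]
```

The savings grow quickly: an L8 array handles seven two-level factors in eight runs, where the full factorial would need 128.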
The robustness of a product is determined using a signal-to-noise ratio (signal / noise, or S/N), computed from the ratio of mean to variation, as well as the mean response, or mean output variables. Whereas other methodologies may look to minimize noise in the experiment, Taguchi’s approach makes use of both the signal – the desired value – and the noise – the undesired value. The resulting distribution around desired values shows which control factor settings are most robust to noise factor variation.
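For a “nominal is best” characteristic, the S/N ratio is commonly computed as 10·log10(mean² / variance), so a higher value means output clustered more tightly around its mean. A sketch (the measurement values are invented for illustration):

```python
import math

def sn_nominal_is_best(values):
    """Taguchi 'nominal is best' signal-to-noise ratio:
    10 * log10(mean^2 / sample variance). Higher = more robust."""
    n = len(values)
    mean = sum(values) / n
    variance = sum((v - mean) ** 2 for v in values) / (n - 1)
    return 10 * math.log10(mean ** 2 / variance)

# Two candidate control-factor settings, each measured under varying
# noise conditions; the tighter distribution earns the higher S/N.
print(sn_nominal_is_best([9.9, 10.0, 10.1, 10.0]))  # ~41.8
print(sn_nominal_is_best([9.5, 10.4, 10.6, 9.5]))   # ~24.7
```

In a Taguchi study, an S/N value like this would be computed per orthogonal-array run, and the factor levels associated with the highest ratios would be selected.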
Tolerance design generally comes after parameter design studies. This stage specifies tightened tolerances for a product in order to improve quality, and it also identifies the crucial tolerances in a product or process design. A complex process or product may already have tight tolerance requirements, but selecting materials that are insensitive to variability (preferably during the system or parameter design stages) helps quantify how crucial further improvements are to achieving a robust design. In Taguchi’s experience, simply meeting tolerances is not as favorable as an approach that seeks to meet the target while minimizing variance around it.
Although it may intuitively make sense that a product with a tolerated deviation of ±1 micrometer could be made better by tolerating only ±0.5 micrometers, such a change could be cost prohibitive for a manufacturer. Without a preliminary parameter design and subsequent testing, the effect on overall product quality may also be minimal.
Taguchi’s consideration of product deviation from target values also runs contrary to a mindset in manufacturing that treats quality as binary, where items are either within or beyond specification. Taguchi includes tolerance ranges, with different levels of tolerance for components of varying importance to the overall design. As a result, loss under the Taguchi method is a continuous curve, taking on a parabolic shape as a function of deviation from the target (Fig. 1, above).
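The difference between the binary (“goalpost”) view and the parabolic view is easy to see numerically. In the sketch below (the tolerance and costs are illustrative), a unit deviating 0.99 from target and one deviating 1.01 are nearly identical products, yet binary inspection treats them as perfect and worthless, respectively, while the quadratic loss changes smoothly:

```python
def parabolic_loss(deviation, k=100.0):
    """Taguchi view: loss grows continuously with deviation from target."""
    return k * deviation ** 2

def goalpost_loss(deviation, tolerance=1.0, failure_cost=100.0):
    """Binary view: zero loss inside spec, full cost outside."""
    return 0.0 if abs(deviation) <= tolerance else failure_cost

for d in (0.0, 0.5, 0.99, 1.01):
    print(d, goalpost_loss(d), parabolic_loss(d))
```

The goalpost column jumps from 0 to 100 across the spec limit, while the parabolic column penalizes every unit in proportion to how far it drifts from the target.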
TAGUCHI’S APPROACH IN THE DIGITAL AGE
Many modern manufacturing life cycles reflect values inherent in the Taguchi method. Greater emphasis on R&D and baselining means that many companies go through more iterations of a product or prototype before continuing to the production and logistics stages. In such cases, this is generally a result of the cost-effectiveness that greater off-line quality control offers. For some specific industries like software and digital products, physical manufacturing may not even factor into a product’s lifecycle. However, system, parameter, and tolerance controls under Taguchi’s approach are still applicable, and quality assurance continues to play a major role in identifying and fixing problems in digital environments.
Established automobile manufacturers represent the opposite side of the spectrum, where complex manufacturing processes take considerable effort and resources. Mistakes in design (for Taguchi, the only mistakes there are) can result in global recalls of their vehicles and unforeseen repair costs. Any manufacturer facing bankruptcy as a result of a recall would likely determine that their testing and design experiments were not robust enough. The incurred costs would also factor into the Taguchi loss function for the company.
ROBUST MULTIPHYSICS DESIGNS
A thorough understanding of a product’s physics is relevant to its adoption. In the case of an electric circuit breaker, it is important to understand how fast it disconnects from the circuit, how resilient it is to mechanical impacts, and its effectiveness despite adverse weather conditions. Given the variety of conditions that equipment could be exposed to, experimental testing – and implementation of the Taguchi method – for the circuit breaker design becomes challenging. A thorough sweep of parameters via experimental investigation becomes impractical given the manual effort and experimental costs involved.
A high-fidelity multiphysics solver presents a solution for design insights in these cases. Understanding the physics behind the product for various operating and abuse conditions not only makes a product robust, but also catalyzes a revolution in product design. VizSpark™, as a high-fidelity thermal plasma flow solver, is already being used in the industry to provide further insights on conventional design, achieve faster design iterations, and reduce product iteration cycle times.
The figures below show published work by Ranjan et al., where VizSpark™ was used to simulate electric disconnection in electric vehicle relays. The varying factor across the simulations is the gas composition. The assessment was made for different levels of hydrogen in hydrogen-nitrogen mixtures. Taguchi’s approach could be implemented in a similar way for different levels of purity of a given gas mixture. The use of multiphysics solvers and simulations offers system, parameter, and tolerance insights without the attached costs of physical experiments.
Thanks for reading! If you’re still curious about the topics discussed in this article, check out the following journal papers (and ask us for a free copy!):
Interested in learning more about plasma flow simulations? Click here to take a look at our previous article. Feel free to follow us on Twitter and LinkedIn for more related news, or reach out to us directly at firstname.lastname@example.org.