|No Belts Required:
The Advantages and Limitations of Statistical Quality Control
|by Major Donovan O. Fuqua
Lean Six Sigma (LSS) is a popular trend associated with making any process or organization better. This article suggests that, although LSS can be a powerful tool, it is poorly understood in the field, does not fit current military doctrine, and has severe limitations for use with complex military sustainment operations. The two main tenets of LSS, statistical quality control and waste reduction, are important to logisticians. The Army logistics community must find ways to apply those concepts to logistics operations and educate multifunctional logisticians.
The idea for this article developed from the tendency of Army logistics professionals to misunderstand LSS and use it as a catchphrase. According to many logisticians in the Army, LSS is simply about becoming more efficient. This article, however, proposes that merely repeating slogans does not create learning in an organization.
The current use of LSS within the Army logistics community generated the questions that form the basis of this article: How many logisticians who have used the acronym LSS can accurately describe what it is or, more importantly, how it relates to military operations? How does LSS relate to systems theory, complexity science, and supply chain networks? Based on those questions, this article will address the following questions relating to LSS and military logistics:
- What is LSS?
- How is LSS used in organizations to improve processes and reduce system variation and waste?
- Where does LSS succeed and where does it fail when describing complex adaptive systems found in real-world organizations?
- If LSS is flawed, how should the Army use statistical quality control and waste reduction techniques?
What the article does not do is discount the positive work done by professional logisticians seeking to make Army processes more efficient and capable, regardless of the methodologies used. Six Sigma and Lean are two separate processes. Understanding the differences between the two is critical to understanding how the concepts should be used.
The Six Sigma Model
Six Sigma is a statistical quality control method of reducing variation and limiting defects within a process. Because defects are costly to businesses both in terms of potential excess process costs and lost business, Six Sigma is attractive to businesses that want to produce outputs with consistent specifications. Organizations in a variety of industries, such as manufacturing, healthcare, and even customer service, have institutionalized Six Sigma with varied success.
The Six Sigma model uses the “define, measure, analyze, improve, control” (DMAIC) method of improving quality in a system. The DMAIC method is described as follows:
- Define: Set goals for a project.
- Measure: Find the current performance.
- Analyze: Find the causes of variation.
- Improve: Fix the problems.
- Control: Monitor or control the process.
The goal of the process is to cut costs and reduce variation and defects by “tying quality control directly to financial results.” Six Sigma training is usually concentrated on the DMAIC model and attempts to carry the model into all areas of the business.
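As an illustration of the “measure” step, the following Python sketch estimates how many process standard deviations fit between the sample mean and the nearest specification limit. The helper function and the sample data are invented for illustration; real capability studies use much larger samples and formal indices such as Cpk.

```python
import statistics

def sigma_level(samples, target, tolerance):
    """Distance, in process standard deviations, from the sample mean
    to the nearest specification limit. A hypothetical helper for
    illustration, not an official Six Sigma tool."""
    mean = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    usl, lsl = target + tolerance, target - tolerance
    return min(usl - mean, mean - lsl) / sigma

# Invented ball bearing diameters (inches), target 0.50000 +/- 0.00005
diameters = [0.500004, 0.499991, 0.500008, 0.499997, 0.500002,
             0.499995, 0.500006, 0.499999, 0.500003, 0.499994]
level = sigma_level(diameters, 0.50000, 0.00005)   # well above 6 here
```

A level of 6 or more would indicate the process meets the nominal Six Sigma goal for this characteristic.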
The basic premise of Six Sigma is meeting the statistical goal of having six standard deviations between the product target and each of the upper and lower specification limits. For example, if a ball bearing process has a target of 0.50000 inches +/- 0.00005 inches, a Six Sigma process keeps its variation small enough that those specification limits sit six standard deviations (identified by the Greek letter σ) from the target; a centered process at that level produces roughly two defective parts per billion. (See the table below.)
|This table shows the percent of values within certain standard deviations. The premise of Six Sigma is meeting the statistical goal of having plus or minus six standard deviations between the product target and the upper and lower specification limits.
The Six Sigma model, however, allows the actual process mean to drift a completely arbitrary +/- 1.5 σ away from the target. Because of this, the true goal is for products to fall within 4.5 σ of the nearest specification limit, which corresponds to the process generating only 3.4 defective parts per million. (See the figure below for a normal distribution of values.)
|This curve shows the percent of values in a sample given a normal distribution.
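These defect rates follow directly from the tails of the normal distribution and can be checked with a few lines of Python (an illustrative calculation, not part of the original analysis):

```python
from math import erfc, sqrt

def upper_tail(z):
    """P(standard normal exceeds z), via the complementary error function."""
    return 0.5 * erfc(z / sqrt(2))

# Centered process: spec limits sit 6 sigma from the mean on each side.
centered_ppb = 2 * upper_tail(6.0) * 1e9                  # defects per billion

# Six Sigma convention: the mean may drift 1.5 sigma, so the nearest
# spec limit is only 4.5 sigma away (the far limit is 7.5 sigma).
shifted_ppm = (upper_tail(4.5) + upper_tail(7.5)) * 1e6   # defects per million
```

The 1.5 σ allowance is what turns a nominal six-sigma specification into the widely quoted figure of 3.4 defects per million.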
Although controlling variation has always been a core component of any quality control program, engineers working for Motorola in the early 1980s were the first to coin the term “Six Sigma.” Bill Smith, a Motorola reliability engineer, found that the actual defect rates at Motorola were higher than the company had figured based on the defects found in the factory (type I defects). The defects overlooked in the factory were then, unfortunately, found by consumers (type II defects). At the Motorola Research Institute, he and Mikel Harry refined their methodology and helped establish the idea of defect-free manufacturing in all other sectors of business at Motorola.
Motorola, Texas Instruments, Microsoft, American Express, and General Electric have all used Six Sigma. To varying degrees, these businesses have incorporated the Six Sigma process into their culture as an overarching method of improving quality and reducing costs. Because success almost always breeds imitation, Six Sigma has become a catchphrase and a business in itself, often far from its original statistics-based roots. A quick search on Amazon confirms that a great number of books have been written about Six Sigma. A roughly equal number of businesses will train managers from any paying business on the “best” practices. In the past few years, the Army has become a voracious customer.
As Six Sigma has expanded from a statistical tool to more of a managerial tool, model instructors have expanded the program from improving manufacturing quality control to enabling cultural changes within any organization. Companies such as iSix Sigma and Motorola University offer classes for cultural change within institutions. Training is geared toward different levels in the institutions and is often given karate-style titles: green belts, black belts, master black belts, champions, and executive leaders. The idea is that all levels of the business must be committed to instituting Six Sigma quality control as a core component of the firm. The Army has embraced this belt system—even offering additional skill identifiers (1X, 1Y, and 1Z) for training completed.
The Lean Model
Lean is a philosophy of reducing cost, increasing speed, and eliminating waste in warehousing, ordering, and manufacturing. Unlike Six Sigma, Lean is not a method of statistical quality control. The two main tenets of Lean are just-in-time logistics and smart automation. A good metaphor for the difference between Six Sigma and Lean is the difference between a microscope and a telescope. Six Sigma focuses on a single variable in order to control variation. Lean takes a wide view of the entire process in order to identify wasteful actions.1
Lean grew out of the Toyota Production System of the 1970s, which focused on reducing the “seven wastes”: defects, overproduction, overprocessing, conveyance, inventory, motion, and waiting. The goal was to improve customer value and profits. The later combination of Lean with Six Sigma was likely driven by the logic that pairing two good things will produce something great.
Other Quality Control Methods
Total Quality Management (TQM) grew out of a program, instituted by W.E. Deming, that worked to reestablish Japanese industry after World War II. The program called for continuous improvement, process feedback, and a focus on quality within an organization. Generally, TQM is considered to be a precursor of Six Sigma. The basic difference is that Six Sigma imposes an arbitrary statistical boundary to define what quality should resemble. However, in many ways, Six Sigma training is very similar to TQM training in that they both work to instill cultural changes in an organization and focus on improving product quality.
ISO 9000, which is maintained by the International Organization for Standardization, is an international quality standard used in contractual situations and is often the standard in international trade. It uses third-party registration and is a quality tool in industry. Unlike Six Sigma, ISO 9000 is not easily applied to nonmanufacturing industries.
LSS and Complex Adaptive Systems
LSS cannot be effectively applied to complex adaptive systems. Complex adaptive systems are characterized by nonlinearity (defined in this article as unproportional and nonadditive relationships between variables2), complex variable interaction, the presence of many variables, and the mixing of deterministic, stochastic, and self-organizing variables. Generally, these systems are identified by their tendencies toward emergent behavior, unpredictable ordered effects, many possible feedback loops, and interdependencies. A system is adaptive if it changes its behavior in response to external stimuli. Complex adaptive systems are commonly described as “wicked problems” because they are difficult to frame and require iterative problem-solving techniques; even identifying the right problem to solve is difficult. This article suggests that most real-world strategic and operational systems are both complex and highly adaptive.
LSS focuses on controlling a single variable without considering the effects of interaction on the system. This is apparent when you consider that LSS originated in manufacturing. When you are constructing a silicon wafer with specification limits within one micron, LSS is a powerful tool in limiting variation. But when you are dealing with highly interdependent social systems, reducing variation in a single variable could have unintended consequences.
If no distinct process specifications exist, LSS is unable to measure success. Simply wanting to make something better or faster is an unquantifiable goal. On the other hand, placing sufficiently large specification limits on a process can guarantee compliance to six standard deviations.
LSS fails to consider inherent process turbulence seen in dynamic systems when working to control variation. Natural process turbulence can often appear as random variation if it is not thoroughly analyzed. Turbulence (even within one variable) can result from periodicity3, autocorrelation4, Chaotic system behavior5, or fractional geometries6 within time series data.
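As a minimal sketch of how such analysis might look, the following Python computes the sample autocorrelation described in footnote 4 for an invented demand series with a 12-month cycle. High correlation at the cycle length reveals periodicity that a casual look might dismiss as random variation; the series and cycle length are assumptions made purely for illustration.

```python
import math

def autocorrelation(series, lag):
    """Sample autocorrelation: autocovariance at the given lag
    divided by the variance of the series (see footnote 4)."""
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series)
    cov = sum((series[t] - mean) * (series[t + lag] - mean)
              for t in range(n - lag))
    return cov / var

# A hypothetical monthly demand signal with a 12-month cycle.
demand = [math.sin(2 * math.pi * t / 12) for t in range(120)]
r12 = autocorrelation(demand, 12)   # strong positive patterning at the cycle
r6 = autocorrelation(demand, 6)     # strong inverse patterning at a half cycle
```

Values near +1 at the cycle length and near -1 at a half cycle are the signature of periodicity rather than randomness.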
Statistical quality control, whether you are using Shewhart7 control charts, LSS, TQM, or another methodology, is a powerful tool in controlling quality in linear systems or nonlinear systems not influenced by process interactions. However, in the Army, very few operational processes can be classified through linear causation models. It is much more likely for real-world systems to be dynamical, contain self-organizing and adaptive variables, and have complex variable interaction.
For example, imagine a situation in which you want to reduce variation of delivery time (a variable) from seaports in the United States to a seaport in a deployed location. Without a doubt, apparent and possibly unknown variables in that model are deterministic, stochastic, and self-organizing. You can work to control the speed of the vessels, but you cannot control the weather processes (which are not constant factors) or the political processes on vessel selection (a set of self-organizing variables). Although you can control the speed variables, there would be second- and third-order effects based on decisions. This is a relatively simple example, but it is representative of what happens when you attempt to control a more complex system.
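A rough Monte Carlo sketch of this example, with every figure invented for illustration, shows why controlling a single variable cannot remove the system's variation:

```python
import random
import statistics

random.seed(1)

def delivery_days(speed_knots):
    """Hypothetical sealift model: a deterministic transit time plus a
    stochastic weather delay plus an occasional self-organizing effect
    from vessel selection. All numbers are invented for illustration."""
    transit = 6000 / (speed_knots * 24)              # deterministic: distance/speed
    weather = max(0.0, random.gauss(1.0, 0.8))       # stochastic delay in days
    vessel_penalty = random.choice([0.0, 0.0, 2.0])  # occasional slower charter
    return transit + weather + vessel_penalty

# Even with vessel speed held perfectly constant, delivery time still varies.
times = [delivery_days(18.0) for _ in range(10_000)]
spread = statistics.stdev(times)
```

Holding speed constant removes only the deterministic term; the stochastic and self-organizing terms leave substantial spread in delivery times, which is exactly the residual variation a single-variable control effort cannot touch.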
Logisticians must recognize the type of system they are attempting to control before determining control methodologies. When systems are simple or mechanical, statistical quality control is an acceptable tool for reducing variation and increasing product quality. In complex systems, a different methodology must be used.
One method that has shown promise in dealing with complex adaptive systems is the process of Design,8 which has its roots in General Systems Theory.9 This process relies on the cyclical actions of system framing, operations framing, reflective learning and reframing, design formulation, and developing concepts for intervention. The Design process should be both qualitative and quantitative since it forms a continuous background for planning environments. It is a command process for understanding and intervening in complex systems to produce positive, anticipated process shifts and emergence. Logisticians should read about the Design process in order to determine how it will influence and shape our doctrine.
LSS is a powerful tool for manufacturing and technical business applications, but I offer the following recommendations for the Army logistics community:
- Write an Army field manual on the use of statistical quality control and waste reduction with the caveat that, in complex problems, these tools are not always applicable. Separate this doctrine from LSS and make it applicable to military operations.
- Define logistics preparation of the battlefield (LPB) as a continuous process with feedback loops. Be prepared to synchronize LPB with the emerging doctrine of Design.
- Teach basic statistical methods and waste reduction classes at officer, warrant officer, and noncommissioned officer basic and advanced courses and in the corresponding civilian education programs. The Army is a world-class teaching organization; there is no reason to hire civilian business consultants and contract out instruction for “belts.”
Many organizations have successfully used LSS to increase profits, improve public perceptions, and focus their workforces on quality. By successfully helping organizations limit process variation and reduce wasteful processes, LSS has generated interest and a substantial following in business and management circles. This interest has helped found an industry focused on selling training courses and books on how to copy the success of Motorola, General Electric, and others who have benefited from LSS.
The Army should recognize the positive aspects of LSS while being careful to exclude elements that are not applicable to military supply chains and processes. The U.S. military is not a business, is not organized as a corporation, and does not view organizational success based on a quarterly earnings statement or a stock price. This distinction separates the Army from Motorola, General Electric, and other organizations that have found success through LSS. For example, the Army should not institute just-in-time logistics because we are an expeditionary force that requires some stockpiling and warehousing (thanks to our long and often tenuous lines of communication). Also, because we operate in complex environments, the Army logistics community must be willing and able to accept variation both as a source of adaptation and as a necessary requirement for supporting disparate operations with often-changing measures of effectiveness and performance.
In an expeditionary environment, efficiency and effectiveness can be inversely related. Risk in supply chains dictates the amount of stockpiling required to adequately support operations. As a general rule, greater risk increases the requirement to stockpile. This nonlinear relationship is another reason that a “corporate model” and LSS are not compatible with military operational requirements. While businesses tend to view success in terms of stock prices or profits, the military is successful if we defend the Nation and provide sovereign options for our political leaders.
Major Donovan O. Fuqua is a student at the School of Advanced Military Studies and an Army multifunctional logistician. He is a graduate of the Air Command and Staff College, the Combined Logistics Officers Advanced Course, and the Transportation Officer Basic Course. He has a B.S. degree from Tulane University, a master’s degree in military operational art and science from the Air University, and a master’s degree in industrial engineering and operations research from New Mexico State University.
1 Martin C. Jennings, “How the Army Should Use Lean Six Sigma as a Transformation Strategy for Logisticians in the 21st Century,” Army War College, Carlisle Barracks, Pennsylvania, 13 February 2006.
2 Functions are linear if and only if they satisfy additivity and homogeneity: f(x + y) = f(x) + f(y) and f(ax) = af(x) for all inputs x and y and any constant a.
3 Periodicity is the quality of recurrence at a regular interval and may be subject to the combined effects of multiple waves within a process. This can be detected through Fourier analysis (a technique of describing a time series in terms of the frequency domain of its periodic constituents).
4 Autocorrelation is the tendency for time series data to form patterns or correlate with itself. It is the autocovariance divided by the variance. In statistical analysis, data is normally examined with lags of 1 through N/4 (N = number of data points). A value of +1 indicates perfect patterning, -1 indicates perfectly inverse patterning, and 0 indicates no patterning. Where m = lag number and x̄ = the sample mean, this value is: r_m = [Σ from t=1 to N−m of (x_t − x̄)(x_{t+m} − x̄)] / [Σ from t=1 to N of (x_t − x̄)²].
5 Chaos (as opposed to small ‘c’ chaos) is a phenomenon where systems appear random but actually have repeatable patterns and are dynamical, deterministic, and nonlinear. Chaotic systems exhibit sensitivity to initial conditions and the potential for attractors. The author recommends Chaos Theory Tamed by Garnett P. Williams (Joseph Henry Press, Washington DC, 1997) for more information on this phenomenon.
6 A fractal is a set of points whose dimension is not a whole number. This definition refers to fractional geometries (or dimensions) that are non-integer. This phenomenon was first described by Benoit Mandelbrot in the early 1960s in his study of cotton futures prices, where he described linear processes in fractal geometry that appeared random in two-dimensional space.
7 Walter Shewhart is often referred to as the “father of statistical quality control” for his work in standardizing and controlling manufacturing at Bell Telephone Laboratories from 1925 through 1956. He developed a series of control charts that indicated when a process was moving out of tolerance based on process mean (a variable), numbers of parts nonconforming in a sample (an attribute), or as an exponentially weighted average of a sample (to reduce process memory). His control charts are normally built around a three standard deviation limit.
8 This methodology is explained in TRADOC Pamphlet 525–500, Commander’s Appreciation and Campaign Design, and SAMS Text—Art of Design Version 1.0, Booz Allen Hamilton, 24 September 2008.
9 Ludwig von Bertalanffy, “An Outline for General Systems Theory,” British Journal for the Philosophy of Science, Vol. 1, No. 2, 1950, pp. 134–165.