Implementing Quality by Design
Much has been learned about the use of Quality by Design (QbD) in the five years since it was proposed by the FDA and ICH (2005). While the benefits of the approach are generally acknowledged, implementation has been slower than expected.
Resistance is to be expected when any new approach is brought forward. Common concerns are: "This is new and different - will it work?" "I don't understand the pieces and how they fit together." "I'm afraid that it will take too long and be too costly." "We don't have the time to do it." Previous articles have discussed the reasons for resistance (Snee et al. 2008) and the critical elements of QbD and how they fit together (Snee 2009a, b).
Central to QbD is experimentation and how to be effective in doing experimental work. Strategy for experimentation is addressed here, including a roadmap for the sequence and linkage of the tools. Experience has shown that this approach speeds up the data collection process and helps assure that no critical variables are overlooked.
A Disciplined, Systematic Approach is Needed
The first issue in developing any experimental strategy is to ensure that the critical elements are in place at the strategic, managerial, and operational levels of work. At the strategic level, our experimental approach needs to:
• Develop process understanding.
• Focus on sources of variation - the sources of risk (Hulbert et al. 2008).
• Identify the critical Xs - the major sources of variation (Pareto principle).
• Ensure that the strategy matches the experimental environment.
At the managerial level, we need methods to guide the work and the tools used. This is accomplished by developing roadmaps for how the various pieces of experimental work fit together, including the sequencing and linkage of the tools.
Operational work involves data collection methods including the experimental designs (DOE) used, process modeling approaches and tools, graphics, and software. Since all of these activities may be new to many, opportunities should be available to develop the needed skills. The critical elements of the approach are shown in Figure 1 and discussed in the following section.
Critical Elements of Experimental Strategies for Implementing QbD
Process and product Xs and Ys: The Ys - Critical Quality Attributes (CQAs) - are the critical process output measurements linked to patient needs. The Xs - Critical Process Parameters (CPPs) - encompass the process input (API and excipient), control, and environmental factors that have major effects on the CQAs. Raw material factors are included among the Xs.
Experiments define the data collection processes used to study the relationships between the Xs and Ys so as to identify the critical variables. These experiments typically make use of statistical design of experiment (DOE) techniques.
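As an illustration of how such a data collection plan is laid out, the short sketch below generates a two-level full factorial design in coded units (-1 = low, +1 = high). The factor names are hypothetical and stand in for whatever CPPs are under study.

```python
from itertools import product

# Hypothetical factors (Xs) to be studied; substitute the actual CPPs.
factors = ["temperature", "pressure", "catalyst"]

# Full 2^3 factorial: every combination of low (-1) and high (+1) settings.
design = [dict(zip(factors, levels))
          for levels in product([-1, 1], repeat=len(factors))]

for run, settings in enumerate(design, start=1):
    print(run, settings)
```

Each dictionary is one experimental run; randomizing the run order before execution guards against time-trend bias.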
Strategy of Experimentation is the process of diagnosing the experimental environment and determining the best experimental strategy to satisfy the objectives of the experiment.
Process and Measurement Robustness is the ability of the process and measurement system to perform when faced with uncontrolled variation in process, input, and environmental variables.
Analytical Methods play two roles in the approach: as the process used to collect the data and as the subject of QbD studies that assure the stability and quality of the measurements and the robustness of the measurement methods (see Schweitzer et al. 2010). Measurement system quality is a critical element of QbD that should not be overlooked.
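To illustrate the measurement-quality point, the sketch below estimates repeatability (within-analyst variation) and reproducibility (analyst-to-analyst variation) from a hypothetical two-analyst gauge study. It is a deliberately simplified variance-component calculation; a full gauge R&R study would use an ANOVA-based analysis with the appropriate corrections.

```python
from statistics import mean, pvariance

# Hypothetical gauge study: two analysts each measure the same sample 4 times.
readings = {
    "analyst_A": [10.1, 10.2, 9.9, 10.0],
    "analyst_B": [10.4, 10.5, 10.3, 10.4],
}

# Repeatability: average within-analyst variance (same analyst, same sample).
repeatability = mean(pvariance(r) for r in readings.values())

# Reproducibility: variance of the analyst means (analyst-to-analyst shift).
analyst_means = [mean(r) for r in readings.values()]
reproducibility = pvariance(analyst_means)

gauge_variance = repeatability + reproducibility
print(f"repeatability={repeatability:.5f}, reproducibility={reproducibility:.5f}")
```

If the total gauge variance is an appreciable fraction of the CQA specification range, improving the measurement system comes before improving the process.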
Process Model provides a quantitative model Y=f(X) relating the product and process outputs (Ys) to the inputs (Xs). The resulting model is typically based on both fundamental and statistical relationships and is used to create the design space.
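Because the columns of a two-level factorial design are orthogonal, the coefficients of the statistical part of Y = f(X) can be estimated with simple averages of signed responses. The sketch below fits a saturated model to a hypothetical coded 2x2 experiment; the response values are invented for illustration.

```python
# Coded 2^2 design (X1, X2) with a hypothetical response Y.
runs = [(-1, -1), (1, -1), (-1, 1), (1, 1)]
y = [62.0, 70.0, 58.0, 74.0]

n = len(runs)
b0 = sum(y) / n                                                # intercept = grand mean
b1 = sum(x1 * yi for (x1, _), yi in zip(runs, y)) / n          # X1 coefficient
b2 = sum(x2 * yi for (_, x2), yi in zip(runs, y)) / n          # X2 coefficient
b12 = sum(x1 * x2 * yi for (x1, x2), yi in zip(runs, y)) / n   # X1*X2 interaction

def predict(x1, x2):
    """Fitted model Y = f(X1, X2) in coded units."""
    return b0 + b1 * x1 + b2 * x2 + b12 * x1 * x2

print(b0, b1, b2, b12)
```

A model like this, validated by confirmation runs, is what the design space is built from.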
Design Space is the combination of input variables and process parameters that provide assurance of product quality.
Process and Measurement Control is the use of control procedures, including statistical process control (SPC), to keep the process and measurement system on target and within the desired variation.
Process and Measurement Capability tracks process performance relative to CQA specifications and assesses the repeatability and reproducibility of the CQA measurements.
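A minimal sketch of the SPC idea, using hypothetical CQA subgroup data: estimate short-term sigma from within-subgroup variation and flag subgroup means that fall outside three-sigma limits. Standard X-bar charts refine this with control-chart constants (e.g., dividing the average standard deviation by c4); that correction is omitted here for brevity.

```python
from statistics import mean, stdev

# Hypothetical CQA data: five subgroups of four measurements each.
subgroups = [
    [99.8, 100.2, 100.0, 99.9],
    [100.1, 100.3, 99.7, 100.0],
    [99.9, 100.0, 100.2, 100.1],
    [100.4, 100.2, 100.3, 100.1],
    [99.8, 99.9, 100.0, 100.2],
]

n = len(subgroups[0])
xbar = [mean(g) for g in subgroups]
grand_mean = mean(xbar)

# Short-term sigma estimated from the average within-subgroup std dev
# (simplified; standard charts apply the c4 bias correction).
sigma = mean(stdev(g) for g in subgroups)

ucl = grand_mean + 3 * sigma / n ** 0.5
lcl = grand_mean - 3 * sigma / n ** 0.5

out_of_control = [i for i, x in enumerate(xbar) if not lcl <= x <= ucl]
print(f"center={grand_mean:.3f}  LCL={lcl:.3f}  UCL={ucl:.3f}  flags={out_of_control}")
```

The same chart applied to repeated measurements of a stable reference sample monitors the measurement system itself.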
Reduced Risk and Enhanced Compliance is a function of the design space, process and measurement capability, control, and robustness.
Central to the approach is a strategy for experimentation, summarized in Table 1 (Snee 2009c). The strategy identifies three experimental environments - screening, characterization, and optimization - and Table 1 also summarizes the objective (desired information) of each phase.
The Screening Phase explores the effects of a large number of variables with the objective of identifying a smaller number of variables to study further in characterization or optimization experiments. Additional screening experiments involving additional factors may be needed when the results of the initial screening experiments are not promising. On several occasions I’ve seen the screening experiment solve the problem.
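The economy of screening comes from fractionation. The sketch below builds a hypothetical 2^(4-1) half fraction - four factors in eight runs rather than sixteen - by setting the fourth factor equal to the product of the first three (defining relation I = ABCD), at the cost of aliasing each main effect with a three-factor interaction.

```python
from itertools import product

# Full 2^3 factorial in factors A, B, C (coded -1/+1).
base = list(product([-1, 1], repeat=3))

# Half fraction: generator D = A*B*C, so the defining relation is I = ABCD.
# The D main effect is aliased with the ABC interaction (usually negligible).
design = [(a, b, c, a * b * c) for a, b, c in base]

for run in design:
    print(run)
```

Eight runs now estimate four main effects; the factors that matter move on to characterization or optimization.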
When very little is known about the system being studied, "range-finding" experiments are sometimes used, in which candidate factors are varied one at a time to get an idea of appropriate factor levels. Yes, varying one factor at a time can be useful.
The Characterization Phase helps us better understand the system by estimating interactions as well as linear (main) effects. The process model is thus expanded to quantify how the variables interact with each other as well as to measure the effects of the variables individually.
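An interaction is easy to see numerically: the effect of one factor changes with the level of another. The sketch below computes this for hypothetical 2^2 data in which the effect of X1 more than doubles when X2 is at its high level - exactly the behavior a one-factor-at-a-time study would miss.

```python
# Hypothetical 2^2 characterization data: response at coded (X1, X2) settings.
y = {(-1, -1): 60.0, (1, -1): 72.0, (-1, 1): 64.0, (1, 1): 90.0}

# Effect of X1 evaluated separately at each level of X2.
effect_x1_at_low_x2 = y[(1, -1)] - y[(-1, -1)]
effect_x1_at_high_x2 = y[(1, 1)] - y[(-1, 1)]

# Interaction effect: half the change in the X1 effect across X2 levels.
interaction = (effect_x1_at_high_x2 - effect_x1_at_low_x2) / 2

print(effect_x1_at_low_x2, effect_x1_at_high_x2, interaction)
```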
The Optimization Phase develops a predictive model for the system that can be used to find useful operating conditions (design space) using response surface contour plots and, perhaps, mathematical optimization.
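The sketch below shows the optimization idea in its simplest form: fit a quadratic through hypothetical responses at three coded levels of a single factor, then solve the fitted model for its stationary point. Real design-space work uses response surface designs (e.g., central composite) in several factors, but the logic is the same.

```python
# Hypothetical responses at three coded levels of one factor.
x = [-1.0, 0.0, 1.0]
y = [78.0, 86.0, 82.0]

# Exact quadratic through the three points: y = b0 + b1*x + b2*x^2.
b0 = y[1]
b1 = (y[2] - y[0]) / 2
b2 = (y[2] + y[0]) / 2 - y[1]

# Stationary point (a maximum when b2 < 0): dy/dx = b1 + 2*b2*x = 0.
x_opt = -b1 / (2 * b2)
y_opt = b0 + b1 * x_opt + b2 * x_opt ** 2
print(f"optimum at x={x_opt:.3f}, predicted y={y_opt:.3f}")
```

A confirmation run at the predicted optimum is the essential last step before the result is trusted.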
The SCO Strategy (screening, characterization, optimization), in fact, embodies several strategies, which are subsets of the overall SCO strategy, namely:
• Screening - Characterization - Optimization.
• Screening - Optimization.
• Characterization - Optimization.
• Screening - Characterization.
The end result of each of these sequences is a completed project. There is no guarantee of success in a given instance, only that the SCO strategy will "raise your batting average" (Snee 2009c). Some examples of using this strategy are discussed in the Sidebar.
Guidance on Using Strategy of Experimentation
The strategy used depends on the experimental environment, which includes the objectives of the experimental program. Criteria that can be used to characterize the experimental environment are outlined in Table 2.
These characteristics involve program objectives, the nature of the factors (Xs) and responses (Ys), resources available, quality of the information to be developed, and the theory available to guide the experiment design and analysis. A careful diagnosis of the experimental environment along these lines can have a major effect on the success of the experimental program.
Table 3 summarizes some tips and traps that can improve the effectiveness of your experimentation.
The critical issues include strategic thinking, identifying and understanding interactions, running confirmation experiments to check model validity, the importance of variation and its relation to process understanding and risk, and process robustness and the measurement system. Paying attention to these tips and traps, which are based on extensive use of the approach, can go a long way toward speeding up the development process and getting useful results.
Over the years we have learned that experimentation can be used to improve all types of processes, manufacturing and service processes alike. As with any endeavor it is important to have a disciplined, systematic approach and a strategy to guide your work. Sequential experimentation, sometimes involving several phases of the SCO strategy, has been a high-yield approach to guide experimentation. Integrated with the approach is the diagnosis of the experimental environment, which helps define the appropriate strategy. The approach and its associated tips and traps have stood the test of time and are worthy of your consideration.
References
Aggarwal, V. K., A. C. Staubitz and M. Owen (2006) "Optimization of the Mizoroki-Heck Reaction Using Design of Experiments (DOE)," Organic Process Research and Development, Vol. 10, 64-69.
Covey, Stephen R. (1989) The 7 Habits of Highly Effective People - Powerful Lessons in Personal Change, Simon and Schuster, New York, NY.
Hulbert, M. H., L. C. Feely, E. L. Inman, A. D. Johnson, A. S. Kearney, J. Michaels, M. Mitchell and E. Zour (2008) "Risk Management in Pharmaceutical Product Development," White Paper Prepared by the PhRMA Drug Product Technology Group, Journal of Pharmaceutical Innovation, 3: 227-248.
International Conference on Harmonisation (2005) "ICH Harmonised Tripartite Guideline: Pharmaceutical Development," Q8, Current Step 4 Version, November 10, 2005.
Schweitzer, M., M. Pohl, M. Hanna-Brown, P. Nethercote, P. Borman, G. Hanson, K. Smith and J. Larew (2010) "Implications and Opportunities of Applying QbD Principles to Analytical Measurements," Position Paper: QbD Analytics, Pharmaceutical Technology, February 2010, 52-59.
Snee, R. D., P. Cini, J. J. Kamm and C. Meyers (2008) "Quality by Design - Shortening the Path to Acceptance," Pharmaceutical Processing, February 2008.
Snee, R. D. (2009a) "Quality by Design - Four Years and Three Myths Later," Pharmaceutical Processing, February 2009, 14-16.
Snee, R. D. (2009b) "Building a Framework for Quality by Design," Pharmaceutical Technology Online, October 2009.
Snee, R. D. (2009c) "'Raising Your Batting Average' - Remember the Importance of Strategy in Experimentation," Quality Progress, December 2009, 64-68.
Yan, L. and M. Le-he (2007) "Optimization of Fermentation Conditions for P450 BM-3 Monooxygenase Production by Hybrid Design Methodology," Journal of Zhejiang University SCIENCE B, 8(1): 27-32.
About the author:
Ronald D. Snee, PhD is founder and president of Snee Associates, a firm dedicated to the successful implementation of process and organizational improvement initiatives. He provides guidance to senior executives in their pursuit of improved business performance using Quality by Design, Lean Six Sigma, and other improvement approaches that produce bottom-line results. Ron received his BA from Washington and Jefferson College and MS and PhD degrees from Rutgers University. He is an academician in the International Academy for Quality and a fellow of the American Society for Quality, the American Statistical Association, and the American Association for the Advancement of Science. He has been awarded ASQ's Shewhart and Grant Medals and ASA's Deming Lecture Award, as well as numerous other awards and honors. He is a frequent speaker and has published four books and more than 200 papers in the fields of performance improvement, quality, management, and statistics. He can be reached at Ron@SneeAssociates.com.