This is to certify that the dissertation entitled

Exploring the Direct Versus Indirect Linkages among Operations Practices, Operations Capabilities and Operations Performance: Does Competitive Context Moderate the Key Relationships?

presented by Sarah Jinhui Wu has been accepted towards fulfillment of the requirements for the Ph.D. degree in Marketing and Supply Chain Management.

Major Professor's Signature

April 23, 2007
Date

MSU is an affirmative-action, equal-opportunity employer.

Exploring the Direct versus Indirect Linkages among Operations Practices, Operations Capabilities and Operations Performance: Does Competitive Context Moderate the Key Relationships?

By Sarah Jinhui Wu

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

DOCTOR OF PHILOSOPHY

Department of Marketing and Supply Chain Management

2007

ABSTRACT

Exploring the Direct versus Indirect Linkages among Operations Practices, Operations Capabilities and Operations Performance: Does Competitive Context Moderate the Key Relationships?

By Sarah Jinhui Wu

Past research in operations strategy has identified several elements involved in establishing the linkages between operations decisions and business performance.
However, the nature of the relationships among these elements has not been fully understood. This dissertation focuses on exploring the impact on operations performance of two inter-related but distinct elements: operations practice initiatives (i.e., operations practices viewed at a middle level of aggregation) and operations capabilities. The goal is to identify where firms should focus the implementation of practice initiatives and the development of operations capabilities, and how the focus of operations strategy should differ across business environments. The overall objective of this study is accomplished through the exploration of four primary research questions. (1) What are operations practices and operations capabilities? What are the critical differentiators between them? (2) Are operations practice initiatives compensatory or additive in enhancing operations performance? Are operations capabilities compensatory or additive in enhancing operations performance? (3) What are the relationships among operations practice initiatives, operations capabilities, and operations performance? (4) To what extent are the key relationships among operations practice initiatives, operations capabilities, and operations performance influenced by the competitive environment? This is primarily theory-building research that follows Wacker's (1998) procedure. Instead of separating theory building from theory validation, the study integrated both activities in one cycle. In the first stage, the grounded theory method was used to derive and extract theory from a focus group study; at the end of this stage, a tentative model was proposed for validation. In the second stage, data were collected from a large professional organization to test the tentative model and further refine it in light of the data. After analyzing the data, the study came to the following conclusions.
First, with refined definitions, operations practice initiatives and operations capabilities can be clearly distinguished from one another, and their validity was confirmed. Second, the nature of the intra-relationships among operations practices was quite different from that among operations capabilities. Third, the importance of operations practice initiatives and operations capabilities in improving operations performance depended on the performance goals. Lastly, the findings established the moderating roles of market competitiveness and market dynamism in the relationships among operations practices, operations capabilities, and operations performance. Overall, the results emphasized the value of cultivating operations capabilities while implementing operations practice initiatives, and the significant influence of the business environment on choosing the focus of operations strategy. In addition, they showed that firms should be more focused in their strategic decisions about which practice initiatives to implement and which operations capabilities to develop, since these exhibit a more compensatory than additive nature. The research is one of the few to explore these issues. The findings of this study not only enrich the theory of operations strategy, but also motivate future research by proposing, testing, and refining general hypotheses.

Copyright by
Sarah Jinhui Wu
2007

DEDICATED TO MY PARENTS, JUN WU AND XIUZHEN LI

ACKNOWLEDGEMENTS

Every step towards the completion of this dissertation was a struggle. I could not have completed the work without the help, input, and encouragement of a number of people. First and foremost, my dissertation chairman, Dr. Steven A. Melnyk, deserves special thanks, as he devoted a tremendous amount of time and energy to making sure that I not only did interesting work, but also did it well. The research would not have been possible without his active participation and guidance.
The comments and suggestions from the other members of my dissertation committee (Dr. Cornelia Droge, Dr. Barbara B. Flynn, Dr. Morgan Swink, and Dr. Shawnee K. Vickery) were invaluable in keeping the dissertation concise, crisp, and rigorous. I feel blessed to have had the opportunity to work with all of these distinguished scholars. I would also like to thank MASCO Corporation and The American Production and Inventory Control Society for their involvement in and contribution to the study. I appreciate their understanding of the value of the dissertation research and the access they provided to their management team and membership database. The managers who contributed their time and thoughts at every stage of the dissertation project made the research questions more relevant and the empirical validation possible. I hope that the results of this work provide them with insights and benefits in the field. Finally, to all my family members, thank you for supporting me all the way to the completion of my doctoral study. Especially during the most difficult moments, you gave me hope, you cheered me up, and you pushed me forward. Without all of this, I could not be where I am today. I would like to share all my achievements with you.

TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES

Chapter 1 OVERVIEW OF THE RESEARCH
1.1 Introduction
1.2 Research Questions
1.3 Research Model
1.4 The Motivation of the Research
1.5 Research Methodology
1.5.1 Qualitative Study - Grounded Theory Method
1.5.2 Quantitative Study - Survey Method
1.5.3 Data Analysis Methodology
1.6 Contribution of the Research
1.7 Structure of the Dissertation
1.8 Summary

Chapter 2 LITERATURE REVIEW
2.1 Operations Practices
2.1.1 Current Status of Studies on Operations Practices
2.1.2 Operations Practices in This Study
2.2 Operations Capabilities
2.2.1 Construct Conceptualization
2.2.2 Intra-relationships among Operations Capabilities
2.2.3 Relationships with Business Performance
2.3 Competitive Context
2.4 Gaps and Opportunities
2.5 Summary

Chapter 3 CONCEPTUAL FRAMEWORK
3.1 Wacker's Theory Building Approach
3.2 Definitions and Measurements of the Constructs
3.2.1 Operations Practices
3.2.2 Operations Capabilities
3.2.3 Operations Performance
3.2.4 Competitive Context
3.3 Domain of the Theory
3.4 Tentative Conceptual Framework
3.5 Chapter Summary

Chapter 4 RESEARCH DESIGN AND METHODOLOGY
4.1 Theory Development - Grounded Theory Method
4.1.1 Grounded Theory Method
4.1.2 Focus Group Study
4.2 Survey Development and Validation
4.2.1 Q-sort
4.2.2 Pre-test of the Survey Instrument
4.3 Survey Administration
4.3.1 Data Quality
4.3.2 Data Quantity and Response Rate
4.3.3 Survey Delivery Method
4.4 Limitations of Methodology
4.4.1 Common Method Bias
4.4.2 Response Rate and Sample Size
4.5 Data Analysis Methodology
4.6 Chapter Summary

Chapter 5 DATA ANALYSIS AND DISCUSSION
5.1 Results from the Focus Group Study
5.2 Q-sort Results
5.3 Data Collection
5.3.1 Survey Responses
5.3.2 Demographic Information of the Sample
5.3.3 Sample Representativeness
5.4 Data Analysis and Discussion
5.4.1 Analysis and Discussion for Research Question 1
5.4.2 Analysis Results and Discussion for Research Question 2
5.4.2.1 Analysis Results
5.4.2.2 Discussions
5.4.3 Analysis Results and Discussion on Research Question 3
5.4.3.1 Analysis Results
5.4.3.2 Discussions
5.4.4 Analysis and Discussion on Research Question 4
5.4.4.1 Analysis Results
5.4.4.2 Discussions
5.5 Refined Framework
5.6 Chapter Summary

Chapter 6 CONCLUSION AND FUTURE RESEARCH
6.1 Conclusions
6.2 Theoretical Contributions
6.3 Contributions for Management Practice
6.4 Limitations
6.5 Directions of Future Research
6.6 Chapter Summary

APPENDICES
Appendix A. Protocol of Focus Group Study
Appendix B. Protocol of Q-sort
Appendix C. Questionnaire in the Survey Study

BIBLIOGRAPHY

LIST OF TABLES

Table 3.1 Definitions and Measurements of Operations Practice Initiatives
Table 3.2 Definitions and Suggested Measurements for Operations Capabilities
Table 5.1 Core Operations Practices and Operations Capabilities
Table 5.2 Q-sort Results
Table 5.3 Summary of Respondents' Demographic Information
Table 5.4 Respondents' Representativeness
Table 5.5 Measurement Model for Operations Practice Initiatives
Table 5.6 Measurement Model for Operations Capabilities
Table 5.7 Correlation Matrix of Core Operations Practices and Core Operations Capabilities
Table 5.8 Measurement Model for Operations Performance
Table 5.9 Regression Results of the Nature of the Relationship between Operations Practice Initiatives and Performance
Table 5.10 Regression Results of the Nature of the Relationships between Operations Capabilities and Performance
Table 5.11 Mediating Effect in the Compensatory Model
Table 5.12 Mediating Effect in the Additive Model
Table 5.13 Relationships among Operations Practices, Operations Capabilities, and Operations Performance in More Competitive Markets
Table 5.14 Relationships among Operations Practices, Operations Capabilities, and Operations Performance in Less Competitive Markets
Table 5.15 Relationships among Operations Practices, Operations Capabilities, and Operations Performance in More Dynamic Markets
Table 5.16 Relationships among Operations Practices, Operations Capabilities, and Operations Performance in More Stable Markets
LIST OF FIGURES

Figure 1.1 The Proposed Model
Figure 2.1 Operations Practices Hierarchy
Figure 2.2 The Boundary of This Study
Figure 4.1 Theory Development - Theory Validation Cycle
Figure 5.1 Two Competing Second-order Construct Measurement Models

Chapter 1 OVERVIEW OF THE RESEARCH

1.1 Introduction

In the field of operations strategy, two concepts continue to play a critical role: operations practices and operations capabilities. For example, when laying out their framework for operations strategy, Hayes and Wheelwright (1984) identified the critical role that operations capabilities play and recognized a collective pattern of decisions that shapes the operations capabilities of a firm: capacity, facilities, technology, vertical integration, workforce, quality, production planning/materials control, and organization. Similarly, operations practices have been found to significantly influence firms' abilities to compete in the market (Flynn, Schroeder and Sakakibara, 1995a; Fullerton, McWatters and Fawson, 2003; Giffi, Roth and Seal, 1990; Schonberger, 1996; Ward and Zhou, 2006). In all of these studies, two critical observations are evident. First, operations practices and operations capabilities have been viewed as having a strong impact on the ability of a firm to use its operations management system as a competitive weapon. Second, the relationships between operations practices and operations capabilities have not yet been addressed explicitly or modeled collectively with respect to their contribution to operations performance. This dissertation focuses on achieving three major goals.
First, it defines and differentiates operations practices and operations capabilities, which serve as the building blocks for theory construction. Second, it examines the intra-relationships among a set of operations practices and among a set of operations capabilities individually. Third, it explores the inter-relationships among operations practices, operations capabilities, and operations performance, and the extent to which the key relationships are influenced by the competitive environment. This is primarily theory-building research, as it clarifies the distinction between operations practices and operations capabilities and evaluates their direct and indirect impact on operations performance. In scope, the study reported in this dissertation transcends the typical theory-building/theory-validation dichotomy, as presented by researchers such as Hunt (1991). Rather, the study follows Wacker's (1998) approach, which integrates the two activities. His approach consists of four stages: (1) define variables, (2) limit the domain, (3) build relationships, and (4) seek empirical support. This approach allows us to develop and refine the dual constructs of operations practices and operations capabilities through a qualitative study and then validate the relationships among the resulting constructs using quantitative data generated by a survey.

1.2 Research Questions

The overall objective of this study is accomplished through the exploration of four primary research questions.

1. What are operations practices and operations capabilities? What are the critical differentiators between them?

Given the centrality of these two concepts in operations strategy and the lack of rigorous definitions of both in the literature, the very first step in theory building is to clarify the definitions and make a clear distinction between them. If the two constructs could not be distinguished from one another, it would be impossible to evaluate their individual impact on operations performance.
In this study, operations practices are defined as task-specific ways of organizing resources [1] with the aim of maintaining and/or improving operations performance. For example, early supplier involvement can best be described as an operations practice in that manufacturers involve suppliers at an early stage (generally at the level of concept and design) in the product development process in order to achieve new product success (Bidault, Despres and Butler, 1998; LaBahn and Krapfel, 2000). Operations practices can be viewed at different levels of aggregation, and this study examines them at the middle level. That is, each "operations practice" (for instance, integrated product development) is actually an operations practice initiative supported by a cluster of concrete practices (e.g., early supplier involvement, cross-functional collaboration). A detailed discussion is provided in Chapter Two. Operations capabilities are defined as demonstrated potentials to execute a specified course of action in operations in a unique and proficient way [2]. This study identifies operations capabilities from a process perspective, not from an outcome perspective. Take Wal-Mart for example. Its competitiveness in cost is not considered an operations capability (cost reflects the outcome of a process). Rather, crossdocking can be viewed as an operations capability. Wal-Mart uses real-time demand data and its own fleet to rapidly consolidate shipments from disparate sources at its distribution centers and move them to outgoing trailers without storing them in between. Crossdocking allows Wal-Mart to realize economies of scale in both inbound and outbound transportation and to eliminate much of the inventory in the distribution centers. This results in Wal-Mart's competitive advantage in cost.

[1] Resources refer to people, technology, equipment, and anything else necessary for production.
[2] This is the finalized definition of operations capabilities after the focus group study.
Nevertheless, the difference between operations practices and operations capabilities cannot be resolved from the definitions alone; the fundamental difference is further discussed in terms of their nature in Chapter Three.

2. Are operations practices compensatory or additive in enhancing operations performance? Are operations capabilities compensatory or additive in enhancing operations performance?

It is not unusual for a firm to implement more than one strategic initiative and develop multiple capabilities; world-class manufacturing firms, in particular, have shown evidence of developing capabilities that reinforce each other (Corbett and Van Wassenhove, 1993). Accordingly, a firm faces a resource allocation decision. That is, is it better off spreading resources across various operations practice initiatives, or focusing on a few? There is therefore a need to understand the intra-relationships among the set of operations practice initiatives and among the set of operations capabilities, because such understanding provides guidance on how to split resource investment within each set. This study incorporates a set of core operations practices (i.e., a group of operations practice initiatives) and a set of core operations capabilities. It specifically answers the question of whether a firm needs to invest in all the core practice initiatives to enhance performance. If operations practice initiatives can compensate for each other, a firm that is good at some practices but poor at others can still run successfully. In stark contrast, poor performance in any dimension could generate adverse consequences if it cannot be compensated by excellence in other dimensions. If this turns out to be true, a firm cannot be better off unless it makes a minimum investment in all the core practice initiatives. Similarly, the question can be applied to the set of core operations capabilities: are they compensatory or additive?

3.
What are the relationships among operations practices, operations capabilities, and operations performance?

This research question focuses on the inter-relationships among the core constructs: operations practices, operations capabilities, and operations performance. In particular, what is the role of operations capabilities in the linkage between operations practices and performance? Grounded in the literature and the resource-based view of the firm (Schroeder, Bates and Junttila, 2002), operations capabilities are hypothesized to mediate the impact of operations practices on performance. This perspective serves as a starting point for theory development and refinement. The inter-relationships among the key constructs can be studied under the concept of "fit". Among the six frameworks of "fit" proposed by Venkatraman (1989), "fit as mediation" is appropriate because the criterion variable (i.e., operations performance) is specific and the functional form of fit is relatively precise (i.e., viewed as an indirect effect). The goal here is to find out whether the mediator variable (i.e., operations capabilities) accounts for a significant proportion of the relationship between the predictor (i.e., operations practices) and the criterion. The results could help a firm weigh the importance of operations practices and operations capabilities in setting the focus of its operations strategy.

4. To what extent are the key relationships among operations practices, operations capabilities, and operations performance influenced by the competitive environment?

Every organization exists in an open system in which several factors cannot be controlled (for instance, changes in technology, government regulation, or competitors' actions). The best a firm can do is to build and maintain fit with its environment. Among various potential influencing factors, this study concentrates on the impact of the competitive environment on the key relationships discussed in research question (3).
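As a toy illustration of the "fit as mediation" logic behind research question 3, the following sketch uses synthetic data (all variable names, effect sizes, and the data-generating process are assumptions for illustration, not the dissertation's survey data or analysis). If capabilities fully mediate, the coefficient on practices should shrink toward zero once the mediator enters the regression.

```python
import numpy as np

def ols_coef(X, y):
    """OLS estimates with an intercept; returns [intercept, slopes...]."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta

rng = np.random.default_rng(7)
n = 500
# Hypothetical fully mediated world: practices build capabilities,
# and capabilities (not practices directly) drive performance.
practices = rng.normal(size=n)
capabilities = 0.8 * practices + rng.normal(scale=0.5, size=n)
performance = 0.7 * capabilities + rng.normal(scale=0.5, size=n)

# Total effect of practices on performance (mediator omitted)
c_total = ols_coef(practices.reshape(-1, 1), performance)[1]
# Direct effect after controlling for the mediator
beta = ols_coef(np.column_stack([practices, capabilities]), performance)
c_direct, b_mediator = beta[1], beta[2]

print(f"total effect: {c_total:.2f}, direct effect: {c_direct:.2f}, "
      f"mediator coefficient: {b_mediator:.2f}")
```

In this simulated setup the direct coefficient collapses relative to the total effect, which is the pattern a mediation test looks for; in practice, formal significance tests of the indirect effect would accompany this comparison.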
The competitive environment has been widely proposed in configuration frameworks of operations strategy (Miller, 1988; Ward, Bickford and Leong, 1996) and tested in empirical studies (Dean and Snell, 1996; Ward and Duray, 2000). This research question is designed to specifically examine the role of the environment in operations strategy and to increase the explanatory power of the derived theory under various contexts.

1.3 Research Model

When doing theory-driven empirical research, one does not have to go to the field blind. Theory building and validation is an iterative process from data to theory and then from theory to data (Handfield and Melnyk, 1998; Meredith, 1993). Consequently, the study begins with the following initial conceptual model (Figure 1.1), which serves as a starting point for the iteration. The model is grounded in the existing evidence in the literature and captures the area of interest in this study. However, it is important to emphasize that this model is tentative and revisions are expected in light of the actual data.

[Figure 1.1 The Proposed Model: operations practices influence operations performance both directly and indirectly through operations capabilities, with the competitive environment moderating these relationships.]

The model portrays the direct impact of operations practices on performance and the potential indirect impact through operations capabilities. In other words, the model hypothesizes that operations capabilities mediate the relationship between operations practices and operations performance. In addition, the relative impact of operations practices and operations capabilities could be influenced by the competitive environment. The basic model in Figure 1.1 is similar to some recent studies (Rosenzweig, Roth and Dean, 2003; Swink, Narasimhan and Kim, 2005) that have focused on competitive capabilities. Operations capabilities and competitive capabilities should not be perceived as identical.
Rather, they can be clearly distinguished from one another in a manner similar to that used by Vickery (1991) to differentiate production competence from manufacturing competence. Competitive capabilities are more observable, given the dimensions typically used to measure them (for instance, cost, quality, and lead time). When customers buy products, they can recognize all the competitive dimensions directly. However, operations capabilities are embedded in the production process, where customers generally cannot observe them. In essence, operations capabilities are process capabilities whereas competitive capabilities are outcome capabilities. Process capabilities can affect outcome capabilities. The research model is studied in a sequence of four steps. First, in order to operationalize the model, operations practices and operations capabilities need to be clearly defined and properly measured. These definitions draw upon the literature, but are not limited to it. The measurements of operations practices are synthesized and structured based on the extant research in order to capture multiple core operations practice initiatives. Guidelines for measuring operations capabilities can be found in the work of Swink and Hegarty (1998). Detailed information on the measurement of operations practices and operations capabilities is presented in Chapter Three. Second, the compensatory versus additive nature of the core operations practices and operations capabilities is assessed. That is, can weaknesses in certain dimensions of operations capabilities be compensated by strengths in other dimensions to improve operations performance? Alternatively, is a certain threshold level required for all the dimensions of operations capabilities? Similarly, can operations practices compensate for each other to enhance operations performance? These questions are critical to operations strategy, yet little research has been done on them so far.
The answers to these questions give operations managers clear suggestions on how to set up the portfolio of operations practice initiatives and operations capabilities to improve operations performance.

Third, the study explores and evaluates the relationships among operations practices, operations capabilities, and operations performance. As can be seen in Figure 1.1, operations practices are hypothesized to have not only a direct relationship with operations performance, but also an indirect one through operations capabilities. The interest is to investigate whether operations capabilities are the intermediate stage between operations practices and operations performance and what roles operations capabilities play in the proposed framework.

Lastly, from a contingency perspective, it is interesting to re-evaluate the relationships while taking into consideration the potential impact of competitive environment. The results indicate the robustness of the base model and how significant the pattern differences are under various environments.

1.4 The Motivation of the Research

This research is motivated by both the conceptual development of operations strategy and empirical findings in this area. Conceptually, there are basically two perspectives on operations strategy. From a content perspective, operations strategy involves a sequence of decisions that enable a business unit to develop a set of specific capabilities to implement a chosen competitive strategy over time (Hayes and Wheelwright, 1984). From a process perspective, operations strategy encompasses the identification and weighing of manufacturing's competitive priorities, strategic manufacturing decision-making, implementation, and manufacturing performance measurement (Vickery, 1991). Thus, both perspectives implicitly assume that operations strategy comprises several interrelated decisions/stages. However, how these decisions are related has not been specified in the existing conceptual frameworks.
When researchers tried to answer whether operations strategy matters, they used different "lenses" (for instance, operations practices, operations capabilities, core competence, or production competence). Though these lenses are different in the sense that they may not be at the same level or at the same stage, they are complementary dimensions of operations strategy. For example, manufacturing competitiveness was often examined as the outcome of operations processes which included a series of operations practices (Flynn, Sakakibara and Schroeder, 1995b). Production competence considered the fit between manufacturing competences and the strategic objectives of the firm (Vickery, 1991). Competence, sometimes used interchangeably with capability, can be viewed as a higher-order construct of capabilities (Mills, Platts and Bourne, 2003). Using these different "lenses", various empirical endeavors have been made to test the direct relationships between each of these elements and operations/business performance. The goal is to seek evidence to confirm/disconfirm the operations strategy framework proposed by Skinner (1969), that is, that operations decisions have an impact on firms' performance. However, the findings are mixed. Take research on operations practices for example. Empirical studies that examined the direct relationship between Just-In-Time (JIT) implementation and financial performance have reported mixed results (Balakrishnan, Linsmeier and Venkatachalam, 1996; Callen, Fader and Krinsky, 2000; Huson and Nanda, 1995; Inman and Mehra, 1993). There also seemed to be no distinctive patterns of Total Quality Management (TQM) factors that affect performance. For instance, some authors (Flynn, Schroeder and Sakakibara, 1994; Parzinger and Nath, 2000; Powell, 1995) found that top management commitment or leadership is positively correlated with many firm performance measures such as financial and operational results as well as customer satisfaction.
However, Wilson and Collier (2000) reported that top management commitment is not related to financial results, and Li (1997) reported similar results for service quality performance. Therefore, even the "best" practices are contextual: they are "best" only in the context of a certain business, company culture, and competitive strategy (Heibeler, Kelly and Ketteman, 1998). These mixed results imply that there may exist omitted variables, or the relationship may not be completely specified, or the mechanism may be more complex than speculated. Essentially, prior studies focused more on the direct linkage between operations practices and performance (Balakrishnan et al., 1996; Challis, Samson and Lawson, 2005; Cua, McKone and Schroeder, 2001; Dean and Snell, 1996; Flynn et al., 1995b) but neglected the critical process/path necessary for operations practices to generate a positive performance effect via operations capabilities. As operations strategy research matures, it is desirable, though challenging, to examine the mechanisms and processes that explain how or why the elements of operations strategy combine to generate high operations performance.

Inspired by the interaction of conceptual framework and empirical evidence, this study aims to expand upon prior research by (1) exploring the potential mechanisms of how implementing various operations practice initiatives could enhance operations performance; (2) examining the roles of operations practices and operations capabilities in improving operations performance; and (3) studying the interactions between internal decision-making and the external business environment on operations performance.

1.5 Research Methodology

Wacker's (1998) theory building approach, which incorporates theory development and theory validation, is adopted in this study. At the first stage, the grounded theory method is used to revise the tentative model obtained from the literature review.
The second stage is to collect data from a large sample, test the revised tentative model, and further refine the model based on the feedback of the data. This section addresses the methodology issues at both stages and the data analysis techniques.

1.5.1 Qualitative Study — Grounded Theory Method

This study takes a grounded theory method to generate theory through the interplay with data. The grounded theory method, according to Strauss and Corbin (1990, p. 24), is "a qualitative research method that uses a systematic set of procedures to develop an inductively derived grounded theory about a phenomenon". The primary objective of this method is to build an explanation of a phenomenon by identifying the key elements of that phenomenon and then categorizing the relationships among those elements (Strauss and Corbin, 1990). Therefore, the grounded theory is developed from the data, rather than the other way around. That makes the grounded theory method inductive in that it moves from the specific to the general. The grounded theory method is attractive in at least two ways. First, it does not require researchers to suspend or ignore all pre-existing theoretical knowledge. Instead, it encourages developing and enriching theories by drawing upon broad theoretical approaches (Glaser, 1978). Second, the method fits the research situation where research questions are open and general rather than formed as specific hypotheses, and where the emergent theory accounts for a phenomenon that is relevant to participants (Strauss and Corbin, 1990). In this study, the grounded theory method started from a tentative framework and corresponding general research questions after the literature was reviewed and gaps and concerns were identified. The method was executed through a focus group study.
The focus group consisted of eight knowledgeable, experienced, and capable middle-level operations managers from a large manufacturing corporation headquartered in Michigan who met our recruiting requirements. Two researchers from Michigan State University were present during the focus group study, one leading the discussion and the other tape recording, taking notes, and clarifying questions. The discussion followed a protocol and lasted for one and a half hours. The written notes and tapes were reviewed and used for content analysis. At the end of this phase, the key constructs were sharpened; the tentative framework was refined; and additional insights were obtained for the quantitative study at the next phase. (Both the focus group study and the survey study were approved by the Michigan State University Institutional Review Board, IRB# X06-270. The human subjects were not exposed to any physical, emotional, or psychological risk in this study.)

1.5.2 Quantitative Study — Survey Method

The second phase involves the implementation of a large-scale survey designed to confirm the validity of the measurement scales of the underlying constructs and verify the presence of the hypothesized relationships. The survey instrument was designed to incorporate demographic information and the key constructs identified from the literature review and the focus group study. As the measurements for operations capabilities have not been well documented in the literature, a Q-sort was done to pre-assess initial construct validity and reliability for operations capabilities (Moore and Benbasat, 1991). After the scale refinement for operations capabilities, a pre-test was conducted for the whole questionnaire. The primary purpose of the pre-test was to ensure that all the questions were clearly articulated and appropriately understood.
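The Q-sort assessment mentioned above can be illustrated with a toy tally: judges sort candidate survey items into the construct each item is intended to measure, and items with low placement "hit ratios" are reworded or dropped before the full survey. The items, judges, and numbers below are purely illustrative, not the dissertation's actual instrument.

```python
# Toy Q-sort tally: two hypothetical judges sort eight candidate survey
# items into the construct each item is intended to tap ("practice" vs.
# "capability"). Low hit ratios flag ambiguous items for rewording.
intended = ["practice"] * 4 + ["capability"] * 4

judge_1 = ["practice", "practice", "practice", "capability",
           "capability", "capability", "capability", "capability"]
judge_2 = ["practice", "practice", "practice", "practice",
           "capability", "capability", "practice", "capability"]

def hit_ratio(placements, intended):
    """Share of items a judge placed in their intended construct."""
    hits = sum(p == t for p, t in zip(placements, intended))
    return hits / len(intended)

print(hit_ratio(judge_1, intended))  # item 4 misplaced -> 0.875
print(hit_ratio(judge_2, intended))  # item 7 misplaced -> 0.875
```

In practice, an item that most judges misplace would be reworded or removed, which is the sense in which the Q-sort "pre-assesses" construct validity before data collection.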
Once changes were made based on the feedback from the pre-test group, the survey instrument was transformed and mounted onto a Michigan State University server using PERSEUS — an online survey design software package. Considering the key research questions, the appropriate unit of analysis is the plant level. Consequently, the proper sampling frame is operations managers in plants. The American Society for Production and Inventory Control - The Association for Operations Management (APICS) was selected as the most appropriate organization to work with. The majority of APICS members work in the area of operations and planning, which fits well with our target population. A survey announcement with a link to the survey website was sent out by the association to its members' email accounts through its semi-monthly e-newsletter in June, 2006. In order to improve the response rate and obtain enough responses, the study used multiple tactics that complement and reinforce each other: support from a professional organization, multiple forms of incentives, follow-up, multiple modes of delivery, and a moderate survey length. In addition, the survey was conducted in June, 2006, in advance of the typical vacation time for employees in American firms. Last but not least, using APICS as a vehicle to administer the survey enhanced the quality of the data. In the announcement, operations managers were invited to participate in the study, which limited the potential respondents to those directly involved in producing a product or providing a service. The survey itself also screened out those whose main responsibility was not operations with the first question. All these procedures ensured that the survey was delivered directly to qualified respondents.

1.5.3 Data Analysis Methodology

Data analysis had two steps.
The first step addressed the first research question, with the purpose of validating the two constructs — operations practices and operations capabilities — in terms of convergent validity and discriminant validity. Confirmatory Factor Analysis (CFA) was carried out to show that these constructs were valid, related, but distinct. In particular, the difference between operations practices and operations capabilities was confirmed by empirical evidence.

The second step addressed the remaining three research questions regarding the relationships among the constructs. First, regression analysis was employed to test the competing models of the compensatory versus additive nature of core operations practice initiatives and core operations capabilities, respectively. The setup of the regression analysis followed the human judgment model, which was originally developed for individuals making decisions among competing alternatives when facing complex multi-attribute information (Patton and King, 1992). In this context, customers decide to buy a product from an organization because it carries certain desirable performance outcomes. In other words, customers determine business success by voting for the products of different organizations. Each organization has a heterogeneous combination of operations practices and operations capabilities, which can be viewed as multiple attributes from the customers' perspective. Models of human judgment basically fall into two classes: linear compensatory models and non-linear noncompensatory models (Bettman, 1979; Peter and Olson, 1987). Among the three types of noncompensatory models, the study was particularly interested in the conjunctive (i.e., additive) model as it was relevant to research question (2) — does an organization need to implement all the core practice initiatives or develop all the core operations capabilities to a certain extent to be successful in the market?
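As a concrete illustration of this second step, the sketch below simulates plant-level data and (a) compares a linear compensatory specification of capabilities against a conjunctive, minimum-based specification, and (b) runs the three regressions of the standard mediation test (Baron and Kenny, 1986) for the practices → capabilities → performance chain. All variable names, scales, and data are hypothetical; this is not the dissertation's actual data or analysis code.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300

def ols(X, y):
    """Least-squares fit of y on X (with intercept); returns (betas, R^2)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return beta, 1 - resid.var() / y.var()

# --- (a) Compensatory vs. conjunctive specifications -------------------
caps = rng.uniform(1, 7, size=(n, 3))              # three capability dimensions, 1-7 scale
perf_a = caps.min(axis=1) + rng.normal(0, 0.5, n)  # simulated "weakest link" world

_, r2_compensatory = ols(caps, perf_a)             # weighted sum of all dimensions
_, r2_conjunctive = ols(caps.min(axis=1), perf_a)  # minimum dimension only
# The better-fitting specification suggests which judgment model the data follow.

# --- (b) Three-step mediation regressions (Baron and Kenny, 1986) -------
practices = rng.normal(size=n)                       # X: operations practices
capabilities = 0.6 * practices + rng.normal(size=n)  # M: partly driven by X
perf_b = 0.5 * capabilities + 0.1 * practices + rng.normal(size=n)  # Y

c_total = ols(practices, perf_b)[0][1]        # Step 1: X must predict Y
a = ols(practices, capabilities)[0][1]        # Step 2: X must predict M
betas, _ = ols(np.column_stack([capabilities, practices]), perf_b)
b, c_prime = betas[1], betas[2]               # Step 3: M predicts Y, controlling for X
# Mediation is indicated when a and b are substantial and |c'| < |c_total|.
```

In part (a), the data are deliberately generated so that performance depends on the weakest capability dimension; in that world the minimum-based regression fits better, which is the kind of pattern the study's competing-model test is designed to detect. Part (b) mirrors the logic of the three-step approach: a drop from the total effect c to the direct effect c' when the mediator is added is the signature of (partial) mediation.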
The additive model concerns whether a minimum level of each attribute is met for a decision to be made. Therefore, the relative importance of the attributes does not matter in the decision-making process.

Regression analysis was then carried out to investigate the inter-relationships among operations practices, operations capabilities, and operations performance, as the focus of the question was the relative importance of operations practices and operations capabilities in enhancing operations performance. Their relative importance can be revealed from the significance and magnitude of the regression coefficients. To test the potential mediating effect of operations capabilities, the standard three-step regression approach was used (Baron and Kenny, 1986; Venkatraman, 1989). Finally, the impact of competitive environment on the key relationships among operations practices, operations capabilities, and operations performance was further tested by dividing the total sample into sub-samples and observing the pattern differences derived from the regression results.

1.6 Contribution of the Research

The results of the research are of interest to both academics and practitioners. From an academic perspective, the study first clarifies the definitions of two critical but poorly defined constructs in operations strategy — operations practices and operations capabilities. Second, the study develops a better understanding of the two constructs in terms of the intra-relationships among the set of operations practice initiatives and among the set of operations capabilities. Third, instead of focusing merely on identifying the critical elements of operations strategy, the study examines operations strategy from multiple dimensions and underscores potential mechanisms to improve operations performance. The study explores the total impact of operations practices, operations capabilities, and other contingent factors on operations performance.
Moreover, the study provides a picture of how operations practices and operations capabilities are interrelated and what the foci of operations strategy should be. It addresses a critical question in operations strategy — how operations can be used as a competitive weapon. The research enriches the theory of operations strategy and motivates future research by proposing, testing, and refining general hypotheses. The multi-method approach combining qualitative and quantitative studies lends itself well to (1) the development of valid and reliable scales for the latent constructs and (2) the buildup and refinement of relationships among the latent constructs.

The results also deliver important messages to managers facing investment decisions in building operations capabilities and launching various practice initiatives to improve operations performance. Business resources are limited, and therefore how to optimally allocate them is critical. The answers to whether firms need to implement all core practice initiatives and develop all core capabilities have a direct impact on resource deployment. The answers to whether firms need to focus on capability development indicate what managers should care about most. In addition, competitive context has been found to play a role in determining the effectiveness of operations strategy. For instance, investing in operations capabilities was critical in competitive markets, whereas investing in operations practices worked well in less competitive markets. All these findings create more avenues for managers to shape their operations strategies.

1.7 Structure of the Dissertation

Chapter Two offers a comprehensive review of the literature on the major concepts such as operations practices, operations capabilities, and competitive environment. Chapter Three contains the conceptual framework. As the research is a theory building exercise, the procedures of theory building are introduced first. Chapter Four describes the research design, data collection,
and data analysis methods. Chapter Five presents the results and discussion of the data analysis. Chapter Six concludes the research, addresses the contributions of the research, recognizes the limitations of the dissertation, and points out directions for further studies.

1.8 Summary

The study explores the mechanisms by which operations practices generate a positive impact on operations performance. It links the critical elements identified in operations strategy research and examines the nature of the relationships between operations practices and operations capabilities. The study enriches the theory of operations strategy by incorporating the potential interactions between these elements and other influencing factors. The results of the study are also of interest to practitioners, as they offer more options for differentiating operations strategies via diverse combinations of operations practices and operations capabilities, given firms' specific business environments and desired performance goals.

Chapter 2 LITERATURE REVIEW

This study aims to contribute to the theory building of operations strategy. To do this, the study deals with two separate but interrelated constructs — operations practices and operations capabilities. The situation is similar to what Venkatraman (1989) encountered in examining "fit" in operations strategy: there are many central and commonly used terms that tend to be poorly, imprecisely, and inconsistently defined. Operations practices and operations capabilities, while central to operations strategy research, have often been poorly and inconsistently defined. In this chapter, the literature pertaining to the major constructs is reviewed and the key issues and concerns to be addressed in Chapter Three are summarized.
The definitions and measurements of the key constructs, the domain of the theory, and the relationships among the constructs are elaborated in Chapter Three, so as to keep Wacker's (1998) approach as a whole piece in one chapter.

2.1 Operations Practices

There has been a great deal of research devoted to manufacturing/operations practices (Cua et al., 2001; Dean and Snell, 1996; Flynn et al., 1995b; Samson and Ford, 2000; Swink et al., 2005) and best/world-class practices (Davies and Kochhar, 2002; Laugen, Boer, Boer and Frick, 2005; Schonberger, 1996). (In this study, manufacturing practices and operations practices, a broader construct, are viewed as the same.) Underlying these studies is the argument that operations practices are drivers that enhance performance at the operations and corporate levels. Given the potential importance of operations practices, it is important to develop a good understanding of them. The review focuses on three issues: definition, level of aggregation, and relationships with performance.

2.1.1 Current Status of Studies on Operations Practices

First, even though the construct of operations practices has been frequently used in the literature and there has been a long-term awareness of its importance in operations strategy research (particularly since attention shifted to world-class manufacturing practices in the early 1980s), the construct has not been consistently defined. It was often treated as something clearly well known, and some literature omitted the need to define it (Bolden, Waterson, Warr, Clegg and Wall, 1997; Christiansen, Berry, Bruun and Ward, 2003; Clegg, Axtell, Damodaran, Farbey, Hull, Lloyd-Jones, Nicholls, Sell, Tomlinson, Ainger and Sewart, 1996; Yusuff, 2004). Alternatively, operations practices were defined through examples (Guisinger and Ghorashi, 2004; Nahm, Vonderembse and Koufteros, 2004; Sakakibara, Flynn, Schroeder and Morris, 1997).
For instance:

"Time-based manufacturing practices are employed to achieve fast response to customer needs; and such practices include cellular manufacturing, reengineering setups, quality improvement efforts, preventive maintenance, and pull production." — Nahm et al., 2004

"An agile company can be defined as… The five most prevalent agile practices can be summarized as improving relationships with suppliers; formation of strategic partnerships, adaptation of advanced technology/research…" — Guisinger and Ghorashi, 2004

The definitions of "best practices" are even more problematic because they are prescriptive and tautological. For example, best practices have been defined as those "that will lead to the superior performance of a company" (Camp, 1989), or "the best ways to perform a business process" (Heibeler et al., 1998). These kinds of definitions describe best practices in terms of an explicitly positive relationship with performance, which makes it impossible to identify them ex ante. Instead, best practices can only be identified ex post, when they show evidence of enhancing performance. However, there are plenty of factors that could influence what counts as a best practice. Given that the best practices for one firm could differ from those for another, it is difficult to offer prescriptions to other firms for their investment decisions.

Second, the domain (i.e., the boundary) of operations practices was not clear, in that practices have been treated in different ways and viewed at different levels of aggregation. Operations practices have been viewed at a low level of aggregation — as relatively independent activities to achieve a specific and concrete goal. For instance, statistical process control uses statistical tools to observe the performance of a production line, to predict significant deviations that may result in rejected products, and to detect whether the current process is under control.
It is one of the building blocks for achieving quality control in operations processes. Operations practices have also been viewed at a middle level of aggregation, which includes sets of mutually consistent practices at the low level. For instance, the JIT practices studied by Flynn et al. (1995b) have four dimensions: Kanban controls, lot size reduction, setup time reduction, and JIT scheduling. Operations practices have also been viewed at a high level of aggregation. For instance, lean production and agile manufacturing practices were referred to as "systems of practices" or "collections of practices" (Narasimhan, Swink and Kim, 2006; Shah and Ward, 2003). These systems include many of the practices that other researchers studied at the middle level of aggregation. As a result of varying levels of aggregation as well as different contents, it is difficult to compare findings across studies. For example, total quality management was treated at the middle level of aggregation in one study and at the high level of aggregation in another. Some authors studied specific practices at a detailed level (Swamidass, 1992; Swamidass, 1994) whereas others focused on general practices (Hanson, Voss, Blackmon and Claxton, 1994).

Third, much attention has been given to verifying the relationships between operations practices and operations/business performance. For instance, a great deal of work has been done examining the individual impact of JIT practices (Fullerton et al., 2003; Sakakibara et al., 1997; White, Pearson and Wilson, 1999), TQM practices (Flynn et al., 1995a; Powell, 1995), and Advanced Manufacturing Technology (AMT) practices (Gupta and Whitehouse, 2001; Kotha and Swamidass, 2000) on performance, but only a few studies investigated the integrated impact of multiple operations practices. For instance, Cua et al.
(2001) bundled JIT, TQM and Total Preventive Maintenance (TPM) practices together and studied their overall impact on performance, while Challis et al. (2005) examined the impact of integrated manufacturing (comprising AMT, JIT, and TQM) on manufacturing performance. Given the reality that firms implement multiple practice initiatives to improve performance, it is interesting to examine the interactions among them in order to enhance operations performance.

Underlying much of this research has been the premise that adopting operations practices is associated with improved or maintained performance. However, the research has yielded inconsistent results. For instance, unique JIT practices were found to positively impact both TQM and JIT outcomes (Flynn et al., 1995b). However, JIT practices alone were not found to be a substantial factor influencing operations performance; rather, they generated an indirect effect that worked through improvement of the manufacturing infrastructure (Sakakibara et al., 1997). Large-sample empirical studies on the AMT-performance relationship have also suggested mixed results. AMT has been found to be linked to higher performance (Boyer, Leong, Ward and Krajewski, 1997; Kotha and Swamidass, 2000; Ward, Leong and Boyer, 1994), but not in all situations (Beaumont and Schroder, 1997; Jaikumar, 1986). The failure to obtain consistent results could be explained by several factors, most notably: (1) The focus of the study. Some studies focused on the impact of operations practices on specific performance dimensions (Flynn et al., 1995b; Nahm et al., 2004) while others studied their impact on general operations performance (Challis et al., 2005; Sakakibara et al., 1997). (2) The measures used to capture operations practices and performance. As Kaynak (2003) pointed out, the inconsistency could be the result of using a single construct to measure practices or performance.
For instance, TQM was operationalized as a single construct in some studies (Douglas and Judge Jr., 2001) while in other studies TQM was operationalized as a multidimensional construct (Samson and Terziovski, 1999). Some authors (Dean and Snell, 1996) included only the core practices while others (Cua et al., 2001; Sakakibara et al., 1997) examined infrastructure practices as well. (3) Other omitted variables. Another issue in the majority of the extant studies was that they focused only on the direct causal relationship. Some recent studies (Boyer et al., 1997; Kotha and Swamidass, 2000; White et al., 1999) used more sophisticated designs and considered the potential interaction between operations practices and variables such as organization size, strategy, and infrastructure. Yet, very few examined the indirect impact of operations practices on performance through other variables. In particular, operations capabilities have hardly been considered as intermediate variables. (4) The nature of the relationships. This is the question of what exactly the relationship is between operations practices and performance. York and Miree (2004) specifically posed this kind of question in studying TQM and financial performance: does TQM lead to improved performance, or are better-performing firms more likely to adopt TQM? Put alternatively, is this a causal relationship or a covariant relationship? All these studies indicated several avenues for further understanding the relationships between operations practices and performance.

2.1.2 Operations Practices in This Study

This study made three changes in the way it handled operations practices. First, it stayed away from the notion of "best practices". The reason is that "best practices" is highly normative and implies one best way to operate the business. The definition of "operations practices" in this study is similar to that in the study of Flynn et al.
(1995b), which emphasizes that operations practices are specific activities with objectives to achieve.

Second, as noted in the prior section, operations practices can be examined at different levels of aggregation. These different levels of aggregation can be viewed as a hierarchy of operations practices; an example is demonstrated in Figure 2.1. The first tier of the hierarchy (lean production practices) is a practice system which covers many operations practice initiatives with the goal of eliminating waste in operations processes and making more by using less. Operations practice initiatives are on the second tier, each of which contains a group of detailed practices at the third tier. The group of detailed practices is a set of mutually inclusive, supportive activities that tend to be used together to achieve a specific goal. For instance, all the practices listed under JIT share the same goal of eliminating/reducing inventory in the production system. In this study, operations practices were studied at the middle level of aggregation (i.e., the second tier in Figure 2.1). That is, operations practices were seen as practice initiatives. Each of the core operations practices identified and tested was a practice cluster. This level of aggregation is appropriate and adequate for general managers to understand the implications of operations strategy.

[Figure 2.1 Operations Practices Hierarchy: a practice system (lean production practices) at the first tier; practice initiatives (e.g., JIT, TQM, TPM) at the second tier; groups of detailed practices at the third tier.]

Lastly, the inconsistent findings on the relationships between operations practices and performance indicate that the nature of the relationships has not yet been completely understood, which opens up many research opportunities.
Though it is possible to explore many ways to resolve the inconsistency, this study focused on three aspects. (1) The study enlarged the scope of operations practices by studying a core set of operations practices (i.e., operations practice initiatives) at the same time. Empirical research on operations practices has been inundated with articles focusing on one or a few types of practice initiatives. In reality, firms tend to implement several of them simultaneously to imitate world-class manufacturing firms. Thus, operations/business performance is the result of multiple practice initiatives rather than only one or a few. Consequently, the linkage between operations practices and operations performance could be affected by other practice initiatives not incorporated in the models if their contributions to performance cannot be decomposed. Collins, Cordon, and Julien (1996) systematically examined the relationship between multiple core practice initiatives and performance and found that companies are unlikely to build long-term sustained performance if they choose to concentrate on only one or two areas. Therefore, they suggested that operations practice initiatives have to be put in place in all areas to gain the best performance outcomes. The core set of operations practices in this study included "hard" practices as well as "soft" practices. Operations practices were traditionally interpreted as tools and techniques. Yet, "soft" aspects (i.e., human resource management practices that are based on beliefs, philosophy, and organizational culture) have also caught researchers' attention. For instance, Flynn et al. (1995b) emphasized that the infrastructure practices (e.g., workforce development) shared by both JIT and TQM are crucial to JIT performance. Clegg et al.
(1996) found that 80% of information technology investments fail to achieve their performance goals not because of the technology itself but because of the lack of attention given to the crucial role played by human and organizational factors. Sakakibara et al. (1997) also mentioned that activities that provide support for the use of JIT practices are neglected. Jayaram et al. (1999) identified ten key dimensions of human resource practices that are associated with manufacturing competitive dimensions. Gagnon (1999) called for a new paradigm of operations strategy that underscores supportive managerial and organizational practices that are not tied directly to operations processes per se. All this evidence is mutually reinforcing and suggests that “soft” practices be considered in conjunction with “hard” practices. (2) Operations practices should not be viewed as independent. However, as pointed out by Laugen et al. (2005), too little attention has been paid to examining the relationships among different operations practices (i.e., the relative effects of individual practices and their interactions on performance). Facing a limited amount of resources, managers have to select a portfolio of practice initiatives. The interesting question is whether they need to invest in all of them or focus on a few. The question is essentially about the intra-relationships among various operations practices. Specifically, are they compensatory or additive? Operations practices are compensatory if the weakness of some practices can be offset by the strength of others. So far, no literature has explicitly addressed the question; at best it has been relegated to discussions of future research. Yet, the answer to this question has strategic implications for investment decisions. If the compensatory assumption proves valid, a firm can concentrate intensively on a small set of practice initiatives.
Otherwise, the firm has to spread its investment across all major initiatives to improve performance. (3) The linkage between operations practices and performance was explored by introducing an intermediate variable (i.e., operations capabilities). The idea is largely grounded in the Resource-Based View of the firm (RBV), which emphasizes the importance of inimitable, non-substitutable, path-dependent abilities that a firm has developed over time, combined with other assets, and merged with organizational culture (Schroeder et al., 2002). The study differentiated operations practices from operations capabilities based on their nature and proposed logical connections between them with respect to operations performance. The focus of this research is to explore the potential mechanisms of how operations practices effectively impact operations performance. 2.2 Operations Capabilities The review on operations capabilities focuses on their definition, the intra-relationships among operations capabilities, and their impact on performance. 2.2.1 Construct Conceptualization Skinner (1969; 1974) was the first to observe that a company’s operations function could do more than simply produce and deliver products: operations offer certain “capabilities” that could be used as competitive weapons for an organization. Much research following Skinner’s seminal work has specified operations capabilities as cost, quality, delivery speed, delivery dependability, and flexibility (Boyer and Lewis, 2002; Cleveland, Schroeder and Anderson, 1989; Ferdows and De Meyer, 1990; Flynn and Flynn, 2004; Noble, 1995; Safizadeh, Ritzman and Mallick, 2000; Swink et al., 2005; Ward, McCreery, Ritzman and Sharma, 1998). This illustrates that operations capabilities have been defined in a fairly consistent manner, but there are further opportunities to enhance the definition. This section identifies three of them.
First, operations capabilities are so closely related to other constructs, such as operations objectives/competitive priorities, that it is difficult to differentiate among them. Competitive priorities are the foci of decision-making in the manufacturing strategy planning framework, and include cost, quality, dependability, flexibility, and service (Boyer and Lewis, 2002; Van Dierdonck and Miller, 1980; Ward and Duray, 2000; Youndt, Snell, Dean and Lepak, 1996). On the other hand, the same dimensions are used to describe manufacturing capabilities (Noble, 1995; Roth and Miller, 1992; White, 1996). The study viewed competitive priorities as goals at the strategy level and operations capabilities as actual abilities at the operations level. Second, there is no clear discrimination between operations process capabilities and operations outcome capabilities (Swink and Hegarty, 1998). Rosenzweig et al. (2003) labeled cost, quality, reliability, and dependability “manufacturing-based competitive capabilities” mainly because they recognized that these dimensions are the outcomes of operations processes. They represent a manufacturer’s actual competitive strengths relative to primary competitors in its target market. While operations outcomes are visible to and appreciated by customers (e.g., cost/prices, quality, and delivery time), operations capabilities are invisible to customers (McGrath, Tsai, Venkataraman and MacMillan, 1996; Penrose, 1959). Customers do not need to figure out how flexible/agile/lean the operations process is when making purchasing decisions, but firms achieve desired outcomes by deploying their capabilities. Simply put, operations capabilities are a means to an end. [Footnote 5: Customers are aware of prices rather than costs. However, prices are closely related to costs from an economics perspective. The price of a product is usually set as the cost plus a markup. Therefore, customers can sense costs from prices.]
Third, the treatment of operations capabilities is inconsistent with that of organizational capabilities in the strategy literature. Organizational capabilities have been developed largely based on the RBV and other frameworks extended from it. RBV builds on the premise that what a firm can and cannot do is greatly influenced by the resources/capabilities available to it. In this framework, “resources” refers to the tangible and intangible assets firms use to develop and implement business strategies (Ray, Barney and Muhanna, 2004). A resource has to exhibit certain attributes (being rare, valuable, inimitable, and non-substitutable) to be a source of competitive advantage (Collis and Montgomery, 1995). Apparently, tangible physical resources that are tradable in the market can rarely satisfy these criteria. Therefore, several scholars have shifted from the general resource perspective to the more specific capability perspective (Leonard-Barton, 1992; Nelson, 1991). This shift highlighted the importance of a variety of organizational mechanisms, such as an organization's ability to coordinate specialized units, organizational culture, and communication channels, as sources of competitive advantage and key determinants of organizational performance. Capabilities aim at deploying and coordinating organizational resources. The ability to control resources is path dependent and difficult to identify and decode (Amit and Schoemaker, 1993; Prahalad and Hamel, 1990), which meets the requirements for a source of competitive advantage. The Knowledge-Based View of the firm (KBV) is an extension of RBV (Grant, 1996). Under KBV, knowledge becomes a resource critical for product, process, material, and organizational innovation, as well as a resource for the application, acquisition, and calibration of other resources for a firm’s objectives.
More recently, scholars (Teece, Pisano and Shuen, 1997) have extended RBV to dynamic markets and proposed dynamic capabilities that rest on distinctive processes, specific asset positions, and the evolution paths a firm has adopted. Though RBV, KBV, and the dynamic capability framework emphasize different critical assets, all of them attempt to identify the source(s) of competitive advantage from the process perspective. Moreover, it is apparent that the source(s) of competitive advantage has/have been specified more clearly, from general and vague “resources” to specific “capabilities” and “knowledge”. This becomes a motivation to identify operations capabilities from operations processes. In contrast, operations capabilities tend to be stated from the operations outcomes perspective rather than from the operations processes perspective. If cost, quality, and time are the dimensions on which a firm wants to outperform the competition, a question remains unanswered: what kind of capabilities need to be developed in production processes to achieve these outcomes? Some recent studies have provided greater insights into the definition of operations capabilities. Hayes and Pisano (1996) suggested that capabilities are activities that a firm can do better than its competitors. However, their definition needs to be further refined to distinguish capabilities from operations practices. As noted in the prior section, operations practices are activities, but operations capabilities are more latent than concrete activities. Swink and Hegarty (1998) referred to operations capabilities as fundamental proficiencies in operations processes. Both papers recognized that capabilities exist at a different level from outcomes and that capabilities are associated with operations processes. Though still quite vague, those definitions tended to converge with capabilities as defined in the strategy area and pointed out an important direction to consider when crafting the definition of operations capabilities in this study.
2.2.2 Intra-relationships among Operations Capabilities Research on operations capabilities deals with the intra-relationships among capabilities themselves and the relationships between capabilities and business performance. This subsection reviews the literature on the former; the next subsection addresses the latter. There has been a persistent, ongoing debate on the relationships among operations capabilities: trade-off, simultaneous, or cumulative. The trade-off perspective, which originated in Skinner’s (1969) operations strategy framework, is based on the premise that in the absence of slack resources the achievement of a higher level of performance on one capability can only be obtained at the expense of performance on other capabilities. Therefore, plants need to prioritize their strategic objectives and focus on specific capabilities. In recent years the existence of trade-offs has been challenged. Global competition has intensified the pressure on plants to improve along all dimensions. World-class manufacturing firms set the standard, developing capabilities that reinforce one another. The most frequently quoted example is that high quality enables plants to become more responsive to customer needs, more reliable, and more cost efficient (Schonberger, 1996; Szwejczewski, Mapes and New, 1997). Additionally, advanced manufacturing technology (AMT) allowed plants to develop multiple capabilities simultaneously (Corbett and Van Wassenhove, 1993). Roth and Miller (1992) provided evidence that business performance is positively related to a company’s performance on a set of operations capabilities. Pursuing the idea that multiple capabilities are desirable, the question becomes which capability a company should develop first. Ferdows and De Meyer (1990) proposed the “sand cone” model based on the proposition that competences are cumulative rather than mutually exclusive.
They specified a particular sequence in which a company’s operations capabilities should be developed: quality, then reliability, then flexibility, then cost efficiency. Empirical efforts to validate such sequences have been inconclusive, with unsuccessful studies (Flynn and Flynn, 2004; Wood, 1991) and successful cases (Lapre and Scudder, 2004; Noble, 1995; Rosenzweig and Roth, 2004). Even though operations capabilities do not mean cost, quality, reliability, and flexibility in this study, the nature of the relationships among them is worth further examination. While the trade-off, simultaneous, and cumulative discussion is theoretically interesting, it was not central to the major thrust of this study. Rather, this study took a different perspective to examine the intra-relationships among operations capabilities. That is, are these operations capabilities compensatory or additive? If they are compensatory, strengths in certain operations capabilities would counteract weaknesses in other dimensions. Therefore, firms are free to configure their operations capabilities: they can either continue developing those they are good at or focus on those where they are weak. However, if the nature of the relationships is additive, then firms may be forced to compete on all dimensions, and overall performance is largely determined by the poorest operations capability. A more thorough investigation of such relationships offers a way of better understanding the diverse strategy formulations consisting of different sets of operations capabilities. These kinds of questions were addressed in this study with refined definitions and measurements of operations capabilities. 2.2.3 Relationships with Performance Skinner (1969) argued that operations has the potential to strengthen or weaken a company’s competitive advantage. Hayes and Wheelwright (1984) also indicated that operations capabilities can play a major role in helping a company achieve a desired competitive advantage.
Theoretically, it can be argued that there is a chain effect: operations capabilities improve operations performance, which in turn enhances competitive advantage (Figure 2.2). Thanks to the theory of production competence, the relationship between operations performance and business performance has been validated (Cleveland et al., 1989; Vickery, 1991; Vickery, Droge and Markland, 1993).

[Figure 2.2 The Boundary of This Study: operations capabilities lead to operations performance (the purpose of this study), and operations performance leads to business performance (the theory of production competence).]

Production competence captures two pieces of information: dimensions of operations performance and the strategic importance of each dimension (Vickery et al., 1993). It is the latter that helps connect operations performance with business performance. Given that the interest of this study is to explain how and why variables affect operations performance, the linkage between operations performance and business performance lies beyond the scope of this study. Yet, the findings of this study can be easily extended to the corporate level using the theory of production competence. To sum up, this section suggests that the definition of operations capabilities can be improved along three dimensions. First, it needs to be distinguished from other related but different constructs such as competitive dimensions/priorities. Second, it needs to focus on capabilities along operations processes. Third, it needs to be consistent with organizational capabilities in the strategy literature. Therefore, to further advance the study of operations capabilities, this research gives a definition addressing all these problems in the next chapter. With the new definition, the relationships among operations capabilities and their impact on operations performance are investigated.
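The compensatory-versus-additive question raised in Section 2.2.2 can be sketched formally. The following is an illustrative formalization only, not part of the original framework: the capability levels $x_i$, weights $w_i$, and performance indices $P$ are hypothetical symbols introduced here for exposition.

```latex
% Let x_1, ..., x_n denote a plant's levels on n operations capabilities.
% Compensatory view: strengths offset weaknesses, so performance behaves
% like a weighted sum and a firm may concentrate on a few capabilities.
% Additive (all-dimensions) view: overall performance is bounded by the
% poorest capability, so the minimum dominates.
\[
P_{\text{compensatory}} = \sum_{i=1}^{n} w_i x_i ,
\qquad \sum_{i=1}^{n} w_i = 1,\; w_i \ge 0 ;
\qquad
P_{\text{additive}} \approx \min_{1 \le i \le n} x_i .
\]
```

Under the first reading, a firm may concentrate investment on a few strong capabilities; under the second, raising the weakest capability is the only way to raise overall performance.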
2.3 Competitive Context As argued in Section 2.1, the mixed results on the linkage between operations practices and performance could be the result of omitted variables. Accordingly, the scope of operations practices is enlarged to include “soft” practices, and possible misspecification of the relationships is suggested. However, those discussions are based on a generic situation in which firm characteristics and the competitive environment are largely neglected. In fact, a business organization exists in an open system in which some factors cannot be controlled (for instance, changes in technology, government regulation, or competitors’ actions). All it can do is build and maintain fit with the environmental context. The notion of “fit” has long been recognized in the operations strategy literature (Hayes and Wheelwright, 1984; Venkatraman, 1989). An effective operations strategy not only fits the environment by differentiating a company (or its products and services) from the competition, but also fits the way the company configures, equips, and manages operations functions (Hayes and Wheelwright, 1984). Thus, the competitive environment is a factor influencing operations strategy and is particularly considered in this study for at least three reasons. First, the business environment has been captured as one of the dimensions in the configuration approach to studying operations strategy. Miller (1988) argued that environment (environmental uncertainty measured by unpredictability, dynamism, and heterogeneity) and strategy are interdependent and that firm performance results from the fit between environment, strategy, and organizational structure. Similarly, Ward et al. (1996) conceptualized four types of operations strategy based on the congruence between the environment, competitive strategy, manufacturing strategy, and structure. However, configuration is more like a framework than a complete theory (Miller, 1996) because it lacks predictive power.
To move toward theory, frameworks need to become more precise, detail the mechanisms that explain the phenomena, and suggest some implications that can be tested (Schmenner and Swink, 1998). Ward and Duray (2000) went a step beyond the configuration approach by depicting the relationships among environment, competitive strategy, and operations strategy in their framework. They found that environmental dynamism affects product differentiation strategy but not cost leadership strategy, and affects quality and flexibility manufacturing strategies but not cost and delivery. The empirical findings not only confirmed the critical role of environment in the conceptual operations strategy framework, but also implied that its role could vary across different operations strategies. That is the primary reason to bring an environmental factor into the model. Second, market competition has an impact on the effectiveness of operations practices implementation. Dean and Snell (1996) specifically examined how the utilization of integrated manufacturing relates to performance as a function of industry competitiveness. They found that the relationship is magnified or diminished by the competitive environment. Integrated manufacturing seemed to fit better with quality-oriented strategies and environments with limited competition. Their findings suggest competitive context could be a moderator in operations strategy, which helps specify its role in the proposed model. Third, market competition has also influenced the pattern of capabilities. For instance, Flynn and Flynn (2004) argued that industry competitiveness influences the relationships between cumulative capabilities and plant performance. That is, plants in more competitive industries are likely to gain less advantage from the law of cumulative capabilities. Eisenhardt and Martin (2000) also observed that effective patterns of dynamic capabilities vary with market dynamism.
The dynamic capability framework complemented the RBV by specifically considering market dynamism, because the RBV has not adequately explained how and why certain firms have competitive advantage in situations of rapid and unpredictable change. All the evidence together indicates that market competitiveness could influence firms’ decisions in developing certain operations capabilities. As operations capabilities are a key construct in the model, it is reasonable to take into consideration the potential impact of competitive context. To sum up, the competitive environment is a relevant construct that has been considered in operations strategy research. Though it is well developed in conceptual frameworks, little empirical research has explored its role in implementation choices of operations practices and the development of operations capabilities. Therefore, its influencing role was explicitly considered in this study. 2.4 Gaps and Opportunities The review of the literature centers on the major constructs in the area of operations strategy, reveals several gaps and concerns, and becomes the starting point of this dissertation research. First, the constructs of operations practices and operations capabilities are important to studies of operations strategy. Yet, they have not been well defined and are subject to varying levels of disagreement. Rigorous definitions and solid measurements are needed as the first step toward theory building. The review provides guidance on how to refine their definitions. Second, the literature on operations practices leaves unanswered the question of what relationships exist among them. Similarly, the literature on operations capabilities leaves unanswered questions regarding the composite nature of capabilities: are they compensatory or additive? Assuming a set of operations practice initiatives and operations capabilities is identified, what kinds of operations practice initiatives need to be adopted to foster performance? Is a subset of the initiatives enough?
Do certain initiatives need to be present? Does a plant need to implement all the operations practice initiatives to a certain extent? In the same vein, similar questions can be asked about operations capabilities. Answers to these questions could offer valuable information on how to develop a portfolio of operations practice initiatives and operations capabilities to improve operations performance. Third, as operations practices and operations capabilities are two central elements for improving operations performance, do they interact in one way or another? Current operations strategy research has focused on the individual contributions of critical elements such as operations practices or operations capabilities, while relationships among the elements have been largely neglected. The empirical findings challenge the simple direct relationship and indicate that studying only one element at a time may not be sufficient to bridge the missing linkage between operations decisions and performance. Some recent studies have started to explore the relationships between operations practices and capabilities (Swink et al., 2005; Tu, Vonderembse, Ragu-Nathan and Ragu-Nathan, 2004), but still treated capabilities as outcome variables, much like operations performance. Consequently, there is a need to improve the understanding of operations strategy by simultaneously studying multiple elements and seeking intermediate relationships among them. Finally, the competitive environment has not only been considered in the operations strategy framework, but has also been shown to be an influencing factor on operations practices and operations capabilities. Therefore, it needs to be incorporated in the study. In particular, it is interesting to examine how the competitive environment could influence the role of operations practices and operations capabilities in improving performance. The results could help plant managers fine-tune operations strategies in light of their business environments.
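The moderating role of competitive environment argued for in Section 2.3 can be expressed, for illustration only, as an interaction term in a moderated regression. The variable names and coefficients below are hypothetical placeholders, not estimates or specifications from this study:

```latex
\[
\mathit{Performance} \;=\; \beta_0
  + \beta_1\,\mathit{Practices}
  + \beta_2\,\mathit{Environment}
  + \beta_3\,(\mathit{Practices} \times \mathit{Environment})
  + \varepsilon .
\]
% A significant interaction coefficient \beta_3 would indicate that the
% effect of practice implementation on performance is magnified or
% diminished by the competitive environment, consistent with the pattern
% Dean and Snell (1996) report.
```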
2.5 Summary Chapter Two reviews the literature regarding the critical constructs relevant to the purpose of the study. This literature review has established the constructs/elements but not many of their relationships. Establishing and refining the constructs and the relationships are the foci of this study. Overall, the study positions itself in the operations strategy area as a step further in theory building by examining the relationships among operations practices, operations capabilities, competitive context, and operations performance. Chapter 3 CONCEPTUAL FRAMEWORK In the prior chapter, the constructs and elements that could be viewed as the building blocks of this study were examined. In this chapter, the conceptual model that forms the foundation of this study is proposed. To facilitate the development, refinement, and validation of this conceptual framework, the study draws extensively on the process and guidelines of the theory-building approach presented by Wacker (1998). Consequently, this chapter begins by presenting an overview of his process. 3.1 Wacker’s Theory Building Approach This study aims to extend operations theory. It expands the literature by first refining the constructs’ definitions and measurements and then introducing operations capabilities as potential mediators between operations practices and performance. Since there is no well-developed prior theoretical foundation for the proposed relationships, and opportunities to better explain operations performance were explored, this research is primarily theory-building work. The other reason to call it theory-building research is that general research hypotheses were tested, rather than specific hypotheses derived from a well-developed body of relevant literature; this is the same argument used by Vickery et al. (1993) in developing the theory of production competence.
Though the academic literature has suggested many different theory-building procedures for specific types of research projects (Bacharach, 1989; Eisenhardt, 1989; Swamidass, 1986), Wacker (1998) proposed a generic procedure that ensures all guidelines for “good” theory building are met regardless of the type of research project. His approach was adopted because, unlike other procedures that separate theory development from theory validation (Hunt, 1991), it integrates the two. That is to say, it is not enough to propose a theory; the theory is not useful and rigorous unless it is tested. Wacker’s (1998) four-step procedure includes conceptual definitions, domain limitation, relationship building, and theory prediction/empirical support. First, clear, precise, and concise definitions of constructs are required to limit the area of investigation by defining “who” and “what”. A literature review generally provides a base for defining constructs. New definitions are proposed only if the current ones are inadequate. After precise definitions of constructs are established, the domain needs to be specified to limit “when” and “where” the theory holds. The domain of the theory directly limits its generalizability: the more specific the domain, the lower the generalizability. The third step is to build logical relationships among constructs/variables. The relationship between any two constructs/variables must be explicitly stated, or else the theory cannot be shown to be internally consistent. Wacker (1998) emphasized the importance of the academic literature in suggesting potential relationships and raising the abstraction level for theory development. The goal is to address the common questions of “why” and “how” through logical reasoning. The last stage is theory validation and prediction. To be useful, a theory has to pass the empirical test.
Therefore, empirical evidence needs to be presented to verify that a proposed theory can be applied in the real world. Different methodologies (e.g., experimental research, survey research, or case research) use different types of empirical evidence to verify the predictive validity of a theory. This chapter covers the first three steps, leaving the last step to Chapter Four and Chapter Five. 3.2 Definitions and Measurements of the Constructs This section details how the major constructs in this research are defined and operationalized. 3.2.1 Operations Practices Based on the discussion in Chapter Two, the study defined “operations practices” as task-specific ways of organizing resources with an aim to maintain and/or improve operations performance. This includes not only specific activities but also general practices that can be applied to the operations management context. As pointed out in Chapter Two and noted by other researchers (Bolden et al., 1997), operations practices have been addressed with different content and at varying levels of aggregation, which has made comparison across studies difficult. This study focused on identifying multiple core operations practices at the middle level of aggregation (i.e., practice initiatives). The scope of the core practice initiatives was expanded to contain both practices aiming to achieve specific strategic objectives and those supporting general strategic objectives. It comprised both “hard” practices and “soft” practices. This improvement is in line with Davies and Kochhar’s (2002) suggestion that operations practices should be approached holistically. Table 3.1 contains the definitions and measurements of seven core operations practice initiatives generated by synthesizing the existing studies.
They are quality management practices, JIT flow practices, customer orientation practices, supplier relationship management practices, integrated product development practices, workforce development practices, and leadership practices. It should be noted that this is not a complete list of operations practice initiatives, but it represents those widely used and tested in the literature and the foci of this study. Being at the middle level of aggregation, these operations practice initiatives consist of detailed practices at the low level of aggregation. Therefore, low-level practices are used to measure middle-level practice initiatives.

[Table 3.1 Definitions and Measurements of Operations Practice Initiatives: the table, which is not recoverable from the scan, lists a definition, measurement items (detailed practices), and supporting citations for each of the seven practice initiatives named above.]
3.2.2 Operations Capabilities Insights from organizational capabilities are valuable input for deriving a definition of operations capabilities. Organizational capabilities have long been treated as firm-specific assets from the RBV perspective (Ray et al., 2004). Sometimes they are defined in wide latitude as “anything which could be thought of as a strength or weakness of a given firm” (Wernerfelt, 1984), and at other times as a high-level routine (or collection of routines) that is highly patterned, repetitious, and founded in part in tacit knowledge (Winter, 2003).
Briefly, capabilities are institutionalized routines embedded in processes and demonstrated in the organization’s ability to do something (Subramaniam and Youndt, 2005; Teece et al., 1997). The focus of this study is operations capabilities, which limits the domain to the operations function. However, as operations is a functional area of an organization, it intrinsically carries certain characteristics of the organization. In line with this logic, some general organizational capabilities are also evident, to a certain extent, at the functional level. Taking this into consideration and emphasizing the strengths and weaknesses of an operations process, “operations capabilities” were defined as demonstrated potentials to execute a specified course of action in operations in a unique and proficient way. This definition of operations capabilities circumvents the tautological criticism. Operations capabilities are desirable in generating positive intermediate outcomes in terms of the way a firm carries out an action or a series of actions. However, they do not automatically link to performance. The line of reasoning is similar to that arguing that dynamic capabilities are not tautological because their definition underlines the ability to integrate/reconfigure resources rather than the ability to create value in business performance (Eisenhardt and Martin, 2000). In reviewing the definitions, operations practices are clearly different from operations capabilities. The former refers to specific, task/goal-oriented, and contextually bounded activities, while the latter is broad-based, context-free routines/mechanisms that enable the most efficient use of a firm’s assets (Day, 1994). If operations practices can be articulated in the way things are done, the elusive nature of capabilities makes them difficult to itemize and imitate. “Statistical process control” is an example of a quality management practice.
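As a purely illustrative sketch of how codifiable such a practice is (the measurement data and limits below are hypothetical and not part of the dissertation’s analysis), the familiar 3-sigma control limits of an x-bar chart can be computed in a few lines:

```python
import statistics

def xbar_control_limits(subgroup_means, sigma_est, n):
    """3-sigma control limits for an x-bar chart.

    subgroup_means: the mean of each sampled subgroup
    sigma_est: an estimate of the process standard deviation
    n: number of measurements per subgroup
    """
    center = statistics.mean(subgroup_means)
    margin = 3 * sigma_est / n ** 0.5
    return center - margin, center, center + margin

# Hypothetical shaft-diameter subgroup means (mm), n = 5 per subgroup
means = [10.2, 9.8, 10.1, 10.0, 9.9, 10.3]
lcl, center, ucl = xbar_control_limits(means, sigma_est=0.5, n=5)
signals = [m for m in means if not lcl <= m <= ucl]  # points to investigate
```

Because the rule is this explicit, any firm can adopt it; the point of the surrounding argument is precisely that capabilities resist this kind of itemization.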
Statistical process control can be learned as a method for achieving quality control in operations processes and adopted by firms that care about quality. Operations capabilities, however, are path dependent and difficult to identify and decode. Take “development of proprietary processes” as an example. Every organization is endowed with various kinds of assets; it can even exchange assets with the market and use assets in different ways. But how it extends, customizes, and combines the use of those assets evolves inside the organization over time, contingent upon such factors as business strategy and organizational culture. In addition, this capability is formulated upon a combination of unique organizational actions, learning, and accumulated knowledge. Developing a proprietary process is therefore a very complicated undertaking. Consequently, it is difficult to find a one-to-one correspondence between what an organization does and what capabilities it possesses (Dierickx and Cool, 1989; Hart, 1995). That is to say, it is extremely hard to decipher a capability because there is no standard path an organization can take to develop it overnight. As the RBV framework provides guidelines/criteria to identify organizational capabilities, a similar counterpart is necessary at the operations level. White’s (1996) meta-analysis of operations capabilities showed that the enormous amount of research done so far had identified cost, quality, delivery speed, and delivery dependability as operations capabilities. Swink and Hegarty (1998) proposed the first operations capabilities framework, which put capabilities in the context of operations processes and categorized them into static and dynamic capabilities. Even though it has not been empirically tested, the Swink and Hegarty framework serves as a starting point for this research. Since then, considerable related work has appeared (Escrig-Tena and Bou-Llusar, 2005; Subramaniam and Youndt, 2005).
Insights from those works were used to augment Swink and Hegarty’s (1998) static-dynamic capabilities framework. Static capabilities are demonstrated potentials to perform an operations action in a unique and proficient way at the steady state, for instance, cooperation skills (Escrig-Tena and Bou-Llusar, 2005), development of proprietary processes (Escrig-Tena and Bou-Llusar, 2005; Schroeder et al., 2002), and responsiveness (Swink and Hegarty, 1998). Dynamic capabilities are demonstrated potentials to execute changes that affect resources or routines by developing new capabilities to adapt to the environment. They include incremental process improvement (Swink and Hegarty, 1998), radical process innovation (Subramaniam and Youndt, 2005; Swink and Hegarty, 1998), and process reconfiguration (Teece et al., 1997). Detailed definitions and suggested measurements are summarized in Table 3.2. As the goal is to identify a list of core operations capabilities (not a comprehensive list), the difference between static capabilities and dynamic capabilities is not emphasized. In light of the new definitions of operations capabilities, scale development must also take place. Measurements are suggested based on the available and relevant sources.

Table 3.2 Definitions and Suggested Measurements for Operations Capabilities
3.2.3 Operations Performance

Given that the study focuses on explaining what causes differences in operations performance, the analysis should be carried out at the plant level. There are many ways to measure operations performance, depending on the focus of the study. For instance, the impact of JIT practices on operations performance was measured in terms of inventory turnover, on-time delivery, lead time, and cycle time (Sakakibara et al., 1997). Lean manufacturing’s impact on operational performance was evaluated by manufacturing cycle time, scrap and rework costs, labor productivity, unit manufacturing cost, first pass yield, and customer lead time (Shah and Ward, 2003). Even though different operations practices emphasized distinctive sets of performance measurements, the predominant approach in the literature was to use cost, quality, delivery, and flexibility as the four basic indicators of overall operations performance. The use of these indicators can be traced back to Skinner (1969), who proposed operations performance measurements in his seminal article. These measurements have been widely used by many other researchers (Cua et al., 2001; Miller and Roth, 1994; Schroeder et al., 2002; Ward, Duray, Leong and Sum, 1995). This study focused on the same elements of cost, quality, and delivery in capturing operations performance. The only exception was that flexibility was excluded, a decision made for two reasons. From the conceptual perspective, flexibility is generally defined as the ability of an operations management system to respond quickly to changes at low cost (Gerwin, 1993; Swink et al., 2005). Therefore, it can be viewed as a combined/derived measure.
Even though flexibility portrays an indispensable area of competition, it in fact reflects the interaction between cost and time/range. From the statistical analysis perspective, there should be discriminant validity among constructs. The constructs are expected to be not only correlated, but also clearly differentiated. Flexibility is closely related to “responsiveness,” one of the operations capabilities in this study. In order to minimize the possibility of item cross-loading and avoid redundancy in the items of operations performance, this study only considered cost (e.g., unit cost of production, manufacturing overhead cost, total cost), quality (e.g., conformance quality, product reliability, product features), and delivery (e.g., delivery accuracy, delivery dependability, delivery quality, delivery availability) as the three dimensions of operations performance.

3.2.4 Competitive Context

Competitive environment is also a complex and multifaceted construct that can be conceptualized in a number of different ways. This research limited itself to two critical dimensions: market competitiveness and market dynamism. The treatment of market competitiveness is based on Porter’s (1980) five forces model. Specifically, “intensity of rivalry among existing competitors” has been identified as a driver of business strategy. As noted, industry concentration is an appropriate measurement for market competitiveness in the organization theory literature and therefore became a proxy measure in this study. Industries marked by high concentration are less competitive than those that are not (Dean and Snell, 1996). The more competitive market carries such features as (1) more major competitors in the market, (2) narrow price differences among competitors (Brynjolfsson and Smith, 2000), and (3) small growth/decline in sales (Flynn and Flynn, 2004). The second dimension is market dynamism, which underscores how rapidly the industry is changed by new products/processes.
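For concreteness, industry concentration is commonly operationalized as the Herfindahl-Hirschman index over firms’ market shares; the sketch below is a generic illustration with hypothetical shares, not the specific instrument administered in this study:

```python
def herfindahl(shares):
    """Herfindahl-Hirschman index: the sum of squared market shares.

    shares: market shares of all firms in an industry (any scale; they
    are normalized first). A higher index means a more concentrated,
    hence less competitive, market.
    """
    total = sum(shares)
    return sum((s / total) ** 2 for s in shares)

# Hypothetical industries
concentrated = herfindahl([0.60, 0.25, 0.15])  # a few dominant firms
fragmented = herfindahl([0.10] * 10)           # many small rivals
```

In line with Dean and Snell (1996), the fragmented industry (lower index) would be classed as the more competitive environment.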
Market dynamism is particularly worth examining in today’s environment and has been addressed in the dynamic capability framework. The measurements for market dynamism, largely indicated by the rate of innovation and changes in customers’ preferences, follow the work of Anand and Ward (2004).

3.3 Domain of the Theory

The framework developed here describes a mid-range theory, and certain boundaries have to be placed on this study. First of all, the resulting theory is limited to organizations whose main responsibility is operations (e.g., plants). Second, the resulting theory is built upon the current perspective and knowledge, and is therefore limited to the current business environment. It has been observed that operations practices change with time; what has been captured as major practice initiatives in the current framework may not hold years later. Lastly, the resulting theory can be generalized across a certain range of environmental settings. Findings could be more applicable to moderately dynamic markets because dynamic capabilities have different implications in different types of markets (Eisenhardt and Martin, 2000). Dynamic capabilities can become a source of sustained competitive advantage in moderately dynamic markets. However, in extremely dynamic markets, the results become unpredictable. All these observations show that the constructs and relationships identified in this research may not be universal. However, it goes beyond the scope of the study to test the generalizability of the theory in other contexts and/or at another point in time.

3.4 Tentative Conceptual Framework

Some researchers (Eisenhardt, 1989) argued that theory building research does not need a conceptual framework to start with and should begin as close as possible to the ideal of no theory under consideration and no hypotheses to test.
However, recognizing that researchers neither need nor are able to capture all the data in the field, an initial lens drawn from a theoretical foundation is critical. The chosen lens is one way, but not the only way, to study the research questions; yet it helps the researchers focus on the research questions and data collection. Research questions and a priori specification of constructs were suggested to shape the initial design of theory building research (Eisenhardt, 1989). Without them, it is easy to be overwhelmed by the volume of data in the field. Wacker (1998) further recommended stating the relationships among the constructs explicitly and arguing the connections logically through reviewing the academic literature before any empirical support is sought. Although early identification of the research questions, possible constructs, and relationships is helpful, it is equally important to recognize that the framework is tentative and subject to change based on the insights from the field study. Figure 1.1 (in Chapter One) provides the initial framework for this research. It is only a simplified graphic model; in fact, both operations practices and operations capabilities contain a list of items (as discussed in the previous sections). Research questions and construct conceptualizations have been addressed before and are not repeated. The focus of this section is to describe the relationships among the constructs based on insights from the relevant literature. The literature review in Chapter Two indicated that the direct relationship between operations practice initiatives and operations performance has been widely proposed and tested (Cua et al., 2001; Flynn et al., 1995b; Sakakibara et al., 1997; Shah and Ward, 2003). However, Schroeder et al.
(2002) argued that the direct relationship is not enough to explain the phenomenon from the RBV perspective, because research on operations practices did not explicitly address the effects of competitors imitating a successful innovation and failed to recognize the importance of proprietary processes that cannot be obtained from factor markets. That is, operations practices adopted by imitating world-class manufacturers may contribute to competitive parity but not to competitive advantage. Accordingly, these researchers promoted further studies from the resource/capability perspective. Other researchers have also called for resource-based research in the manufacturing plant setting (Amundson, 1998; Swamidass, 1991), but Schroeder et al. (2002) are among the few that empirically validated the applicability of the RBV to operations capabilities. Following this line of research, operations capabilities were introduced as another contributor to operations performance. The strategy literature also argued that dynamic capabilities are the source of sustained competitive advantage (at least in moderately dynamic markets). But dynamic capabilities exhibit commonalities across effective firms, which are called “best practice” (Eisenhardt and Martin, 2000). For example, product innovation is an important dynamic capability. Effective product development practices typically involve the participation of cross-functional teams that bring together different sources of knowledge and expertise. With these teams, coordination among operations, marketing, and design people becomes more efficient, which eventually accelerates the process of product development. Therefore, operations capabilities are not born in a vacuum; rather, they are nurtured during the implementation of operations practices.
These arguments and evidence indicate that operations practices and operations capabilities may be interrelated and that operations capabilities may play an important role in converting adopted practices into intrinsic and inimitable abilities that enhance operations performance. Specifically, the following question about the proposed model is raised: how important is the indirect path through operations capabilities, compared with the direct path between operations practices and performance? The indirect path suggests the mechanism by which operations practices adoption enhances operations performance, which has not been addressed in the literature. The proposed indirect path and the well-researched direct path constitute the tentative model in Figure 1.1. Put simply, operations capabilities are hypothesized to mediate the relationship between operations practices and performance. Beyond the main frame of the model presented in Figure 1.1, the market environment is proposed as having an impact on the interrelationships among operations practices, operations capabilities, and operations performance (dashed lines). As argued in the prior chapter, the competitive environment may influence decisions on practice implementation and desirable capabilities. From the contingency perspective, it is reasonable to argue that it could moderate the key relationships in the basic model.

3.5 Chapter Summary

This chapter began by selecting a general procedure for the theory building research. At the heart of theory building are clear, precise, and concise definitions of constructs. Therefore, the research spends much time and effort in defining and differentiating operations practices and operations capabilities, and in introducing other influencing factors. The theoretical framework that portrays the relationships among the key constructs provides a foundation and driving vehicle for the methodology chapter.
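Before turning to methodology, the mediation structure hypothesized in this chapter can be made concrete with a toy simulation. All coefficients and data below are fabricated for illustration only; this is not the estimation approach used in the dissertation. Practices influence capabilities (the a-path), and performance is driven both directly by practices and through capabilities:

```python
import random

def ols_slope(x, y):
    """Slope of the simple OLS regression of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var

random.seed(42)
n = 500
# Hypothetical plant-level scores (standardized)
practices = [random.gauss(0, 1) for _ in range(n)]
# a-path: capabilities are nurtured during practice implementation
capabilities = [0.7 * p + random.gauss(0, 1) for p in practices]
# Performance: a direct path (0.2) plus a path through capabilities (0.5)
performance = [0.2 * p + 0.5 * c + random.gauss(0, 1)
               for p, c in zip(practices, capabilities)]

a_path = ols_slope(practices, capabilities)
total_effect = ols_slope(practices, performance)
# In this construction the total effect is roughly 0.2 + 0.7 * 0.5 = 0.55,
# so most of the practice-performance link runs through capabilities.
```

The comparison of the direct coefficient with the indirect product term is exactly the question the tentative model poses about the real data.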
Chapter 4 RESEARCH DESIGN AND METHODOLOGY

Having laid out the general theoretical model and approach in the preceding chapters, this chapter describes the research methodology used to implement the model. Since the research follows the guidelines set out by Wacker (1998), theory and data are linked in one complete cycle (Figure 4.1). While most studies focus on one of the two arcs (i.e., from theory to data or from data to theory), this study unifies both. Since the first step in this process is to build theory, the study begins with the link from data to theory. To implement this link, the grounded theory method is adopted.

Figure 4.1 Theory Development - Theory Validation Cycle

4.1 Theory Development - Grounded Theory Method

As shown in Figure 4.1, theory building is an iterative process: going from data to theory through observation/description, empirical generalization, and explanation, and then from theory to data through hypothesis testing (Handfield and Melnyk, 1998; Meredith, 1993). This study took a grounded theory method (Glaser and Strauss, 1967) to develop and refine theory through interaction with data collected in a focus group study.

4.1.1 Grounded Theory Method

The grounded theory method is largely based on a general method of comparison/contrast analysis (Glaser and Strauss, 1967). Cases similar on many variables but with different outcomes are compared to see where the key causal differences lie. Conversely, cases with similar outcomes are examined to see which conditions they have in common, thereby revealing the possible causes. The grounded theory method is appealing in theory building studies. The theory it generates allows researchers not only to develop a theoretical description of the general features of a topic but also to ground the explanation in empirical observations or data (Glaser and Strauss, 1967).
Therefore, such a theory is usually not completely refuted by one set of data or replaced by another theory, despite its inevitable modification and reformulation. This robustness reflects a great advantage of grounded theory over theories derived deductively from a grand theory; without the help of data, the latter could turn out to fit no data set at all. The grounded theory method is particularly desirable for this study because it helps resolve the gaps and concerns pointed out in Chapter Two. First, the key constructs in this study, operations practices and operations capabilities, have not been well defined and are subject to varying levels of disagreement. Before they can be validated, their definitions and measurements have to be refined using the grounded theory method. Second, the study explores the nature of the relationships within the operations practice initiatives set and the operations capabilities set, an issue under-researched in the existing literature yet important from the resource investment perspective. The grounded theory method could help gain insights into the nature of these relationships and establish an initial framework for validation. Third, the tentative model proposes an alternative way to explain operations performance by considering the total effect of operations practices, operations capabilities, and competitive context. The grounded theory method could provide evidence of the interrelationships or interactions among these constructs and offer an opportunity to refine the model. The data for a grounded theory can come from at least four sources: interviews, direct observations, focus groups, and case studies; anything that may shed light on the questions under study qualifies (Corbin and Strauss, 1990; Strauss and Corbin, 1994). This study used a semi-structured focus group interview design.
A focus group study uses a small group of people selected from a wider population in order to solicit their opinions about, or emotional responses to, a particular subject (Stewart and Shamdasani, 1990). A focus group was preferred in this study over a single case study because more diversified information can be extracted from discussions with managers in different organizations. It is also less costly and less time consuming than a multiple case study (Bonoma, 1985); each case study can be an extensive and expensive endeavor, making the acquisition of such qualitative expertise arduous or slow. Instead of making many trips to different organizations, all the interviewees in a focus group were met at a specific time and their opinions were collected at once. The focus group study was also preferred over a Delphi study because it allowed participants to elaborate their views and to interact with others’ perspectives (Abbott and Eubanks, 2005). In the process of expressing their views aloud and having them discussed by others, participants are exposed to other opinions and are able to reflect on and reassess their own interpretations. That is a consequence of the dialectic process that focus groups engender (Eubanks and Abbott, 2003). While a Delphi study emphasizes the achievement of a reliable consensus of opinion among a group of people through a series of questionnaires combined with controlled feedback from the study coordinator (McKenna, 1994), the primary objective of a focus group study is to provide greater insight into how people view a phenomenon and why they view it that way. The goal was not to reach consensus but to listen to all the possibly different explanations.

4.1.2 Focus Group Study

The appropriate participants in the focus group are those in charge of operations at the plant level because they match the level of analysis in the study.
They represent the middle-level operations managers in a corporation who are most likely to be interested in this study and who have the most relevant knowledge to offer insightful opinions. Second, it is desirable to have participants working in different business environments (e.g., good-performance plants versus poor-performance plants, plants that underscore product innovation versus plants that emphasize process innovation). Consequently, they can bring together diversified knowledge based on their different experiences and provide alternative views of the same phenomenon. Third, participants who have a common understanding of the terms used in the study are highly preferred. Given the limited amount of on-site discussion time, more time could then be spent collecting their viewpoints on the key research questions than explaining terminology. A corporation (the name is not disclosed to protect its identity) was found that was willing to sponsor this study and could provide candidates that fit the recruiting requirements. The company has a unique relationship with Michigan State University: over the last ten years, an education program was jointly established to provide training to the corporation’s middle-level management team. The education program helped build trust between the corporation and Michigan State University. As a result, the corporation was willing to give access to its management team, and its managers were more likely to participate in the study. Also, the managers who had gone through the intensive training program were familiar with the terms used in the study and had less confusion about them. In addition, the corporation has a well-developed operations planning and execution system committed to quality and operational excellence. The company is aware of various kinds of operations practices and has implemented practices such as lean manufacturing,
value engineering, value analysis, formal product innovation, collaboration in the supply chain, and six sigma. The management team of the corporation also recognizes and develops sufficient operations capabilities to compete in the market. Therefore, they have the skill set and knowledge to comment on the issues raised in the study. Apart from this, the corporation houses 30 major manufacturing divisions or companies in the home decoration and construction industry, so its managers are exposed to a wide variety of business environments. Some work in environments with fast clock speed while others work in environments with slow clock speed (Fine, 1999). Some plants have experience in product innovation while others have experience in process innovation. The data collected from the management team in such a corporation are comprehensive and representative, which provides generality in that the theory includes extensive variation and is abstract enough to be applicable to a wide variety of contexts. Most focus groups consist of six to twelve people (Chan and Man, 2005; Jarvenpaa and Lang, 2005); however, the number of participants depends on the objectives of the research (Stewart and Shamdasani, 1990). For example, smaller groups (four to six people) are preferable when the participants have a great deal to share about the topic or have intense or lengthy experience with the topic of discussion (Krueger, 1988). In general, Merton, Fiske, and Kendall (1990) suggested that the size of the group should be governed by two considerations: it should not be so large as to preclude adequate participation by most members, nor so small that it fails to provide substantially greater coverage than an interview with one individual. In order to secure a certain number of participants, Krueger (1988) suggested inviting twice the desired number of people.
In this study, 25 invitations were sent to managers in the selected corporation in order to form a focus group of six to twelve people. Eight managers responded and agreed to join the focus group at the designated time. Some of the other invitees were willing to participate but could not attend due to prior travel commitments. Two researchers from Michigan State University acted as moderators in the study, one leading the discussion and the other tape recording, taking notes, and clarifying questions. Participants were encouraged to speak one at a time to avoid garbling the tape (Krueger, 1988). The discussion lasted one and a half hours. The focus group discussion was largely guided by a protocol (Appendix A). Stewart and Shamdasani (1990) proposed that most interview guides consist of fewer than a dozen questions. Krueger (1988) suggested that a focused interview include fewer than 10 questions, often around five or six. The protocol contained four major questions so that the researchers could develop an in-depth discussion with the participants. All the questions were arranged in a logical sequence and had a natural flow. “Yes” or “no” questions were avoided and open-ended questions were frequently used. The protocol started with the definitions of operations capabilities and operations practices. The goal was to make a clear distinction between the two constructs so that the focus group could list examples of core operations capabilities and operations practices without confusion. Participants were then asked to brainstorm the relationships among operations practices, operations capabilities, and operations performance. Field notes were taken and the conversation was tape recorded during the discussion. The written notes and the tape were used for analyzing the content of the discussion. The aim was to seek trends and patterns that emerged from the focus group (Krueger, 1988).
In particular, the goal was to look for overall opinions on (a) the categorization of operations practices and operations capabilities, (b) the intra-relationships of the operations practices set and the operations capabilities set, (c) the role of operations capabilities in improving operations performance, and (d) comments on the tentative framework. At the end of this phase, the constructs’ definitions were revised, the tentative framework was refined, and additional insights were collected. All of these became necessary inputs for the next phase of the study.

4.2 Survey Development and Validation

To accomplish theory validation (the arc from theory to data), the survey method was employed to collect perceptual data from people knowledgeable in the subject and then evaluate the presence/nature of the relationships proposed in the tentative model. The data collection process is broken down into two steps: Section 4.2 covers survey development and validation, while Section 4.3 addresses survey administration. Survey research has dominated data collection methods in empirical research in operations management (Flynn, Sakakibara, Schroeder, Bates and Flynn, 1990; Scudder and Hill, 1998). Data collected from survey research are largely perceptual. Starbuck and Mezias (1996) classified 249 articles published in the Journal of Organizational Behavior from 1988-1992 and found that 210 presented perceptual data. They further argued that other journals show a similar pattern: perceptual data are used more frequently than archival data. Even though there may be divergence between archival and perceptual measures, the correlation between them has been argued to be stronger when respondents are from higher positions in an organizational hierarchy and when data come from the same level of analysis (Boyd, Dess and Rasheed, 1993).
Therefore, the analysis was kept consistent at the plant level, and operations managers in charge of plant operations were surveyed to increase the validity of the perceptual data. A tentative survey instrument was developed based on the research model, existing measures, and the additional information gathered through the focus group study. The survey instrument can be largely divided into two parts: demographic information and questions related to the major constructs in the model. The survey instrument was validated by Q-sort and pre-testing.

4.2.1 Q-sort

While the literature is inundated with measurements for operations practices, operations performance, and competitive context (discussed in Chapter Three), it has not laid a solid foundation for scale development for the operations capabilities defined in this dissertation. Therefore, a Q-sort was conducted to pre-assess initial construct validity and reliability for operations capabilities. The basic concept of the Q-sort method is to have experts act as judges and sort items into several groups, with each group corresponding to a factor or dimension, based on a pre-determined agreement criterion (Boon-itt and Paul, 2006; Moore and Benbasat, 1991). In the Q-sort method, two evaluation indices are normally used to measure inter-judge agreement levels when observing or coding qualitative/categorical variables: (1) Cohen’s Kappa (Cohen, 1960) and (2) Moore and Benbasat’s hit ratio (Moore and Benbasat, 1991). Cohen’s Kappa is a more robust measure than a simple percent agreement calculation since it takes into account agreement occurring by chance (Fleiss, 1981). Several studies have considered a score greater than 0.65 to be acceptable (Jarvenpaa, 1999; Li, Rao, Ragu-Nathan and Ragu-Nathan, 2005). In this study, six senior operations management doctoral students at Michigan State University were given the definitions of all six operations capabilities constructs and a list of measures.
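For reference, Cohen’s Kappa for a pair of judges is kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e the agreement expected by chance. A minimal computation (with hypothetical sortings and shortened category labels, not the study’s actual Q-sort data) might look like:

```python
from collections import Counter

def cohens_kappa(judge1, judge2):
    """Chance-corrected agreement between two judges over the same items."""
    n = len(judge1)
    p_o = sum(a == b for a, b in zip(judge1, judge2)) / n
    c1, c2 = Counter(judge1), Counter(judge2)
    p_e = sum(c1[k] * c2.get(k, 0) for k in c1) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical sortings of 10 measurement items into three capability
# categories (resp = responsiveness, coop = cooperation skills,
# prop = proprietary processes) by two judges
j1 = ["resp", "resp", "coop", "prop", "coop",
      "resp", "prop", "coop", "resp", "prop"]
j2 = ["resp", "resp", "coop", "prop", "resp",
      "resp", "prop", "coop", "coop", "prop"]
kappa = cohens_kappa(j1, j2)   # about 0.70 here
acceptable = kappa > 0.65      # threshold used in prior studies
```

With more than two judges, the same statistic is typically computed pairwise and averaged, or Fleiss’s multi-rater extension is used.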
They were asked to assign each measure to a capability based on the supplied definitions. Their assignments were evaluated with Cohen's Kappa to judge inter-rater agreement. Their feedback helped remove and/or reword some confusing items. As a result, respondents are more likely to make consistent assessments of the linkages between items and constructs, which increases scale reliability and validity. The Q-sort protocol is presented in Appendix B.

4.2.2 Pre-test of the Survey Instrument

After the items for the operations capability constructs were validated through the Q-sort, a pre-test of the full survey instrument was conducted with a group of 15 managers. These managers also came from the selected corporation, but they were a different group from the focus group participants. They were given the survey instrument with a cover letter. The cover letter directed their attention to certain important areas: the time needed to fill out the survey, the clarity of the questions, the necessity of the questions, and key missing questions. They were asked to submit the completed surveys along with written comments on the questions raised in the cover letter, and to mark changes and comments on the survey itself. This round of pre-testing had three purposes. First, it ensured that the questions contained no ambiguity and that researchers and potential respondents shared the same understanding. Second, the pre-test served as a cross-check of the clarity of the measures, as they were proposed by one group of managers and tested by another. Third, it was another chance to modify the survey instrument and the flow of the questions, which helped minimize the possible occurrence of common method bias (this point is further elaborated in Section 4.4.1).

Once the survey instrument was revised, it was transformed into an online version using PERSEUS, an online survey development software package. The survey was uploaded to a secure server at Michigan State University.
The online survey was the main means of collecting data, while a downloadable version was also made available because some organizations' firewalls may block employees from accessing certain websites. The finalized survey instrument is presented in Appendix C.

4.3 Survey Administration

The survey administration process was designed with three goals in mind: quantity of the data, response rate, and quality of the data. That is, the aim was to obtain a large, high-quality dataset with a decent response rate.

4.3.1 Data Quality

To ensure quality data, the most appropriate respondents must be identified first, and then access to them must be obtained. Accordingly, the unit of analysis and the desirable respondents are discussed first, followed by the solicitation of help from a professional organization that could provide access to such respondents.

Implications of operations strategy can be discussed at the strategic business unit (SBU) level, plant level, or functional level. Swink and Way (1995) pointed out a problem in existing research: the unit of analysis has not always been consistent with the objectives of the research. For instance, operations strategy studies frequently assess strategy at a functional or plant level while performance is measured at the SBU level. To avoid this problem, the unit of analysis was kept consistent across all the variables. The manufacturing plant was set as the unit of analysis in this study for the following reasons. First, this is consistent with the operations practices literature (Flynn et al., 1995b; New and Szwejczewski, 1995; Shah and Ward, 2003). Second, the majority of operations capabilities arise from operations processes (i.e., transformation processes that convert inputs into outputs), which can be observed more easily in a simple context, such as a plant. Third, managers at this level who oversee operations are knowledgeable and appropriate to answer the research questions.
As a result, the desired respondents were operations managers at the plant level. In this study, "operations managers" were broadly viewed as those who are directly involved in the various activities necessary to produce a product or provide a service (e.g., planning, scheduling, performance measurement, procurement/purchasing, logistics/warehousing, and delegating and supervising the work and activities of others involved in the operations process).

Based on the prior discussion of the traits of the desired respondents, a professional association was sought that could provide access to knowledgeable and competent people fitting the target population. Consequently, APICS was selected as the most appropriate organization to work with. A detailed research project proposal was provided to APICS along with a request for collaboration. Upon approval, APICS granted access to the members who subscribe to its semi-monthly e-newsletter. Considering that APICS has members outside the target population (e.g., educators, consultants), a self-screening question was placed at the beginning of the survey. This can be viewed as another step toward obtaining quality data from the target population.

4.3.2 Data Quantity and Response Rate

The quantity of a dataset is affected by the number of potential respondents and the response rate. A survey with a higher response rate often ends with a larger dataset, and vice versa. Because of this connection between quantity and response rate, the discussion of these two dimensions is combined. APICS has 30,000 members who subscribe to its e-newsletter, and 60% of them work in the area of operations and planning. Since a great number of potential respondents exist, the key is to enhance the response rate. It is well known that survey research has been plagued by low response rates (Dennis Jr., 2003; Larson, 2005; Sivo, Saunders, Chang and Jiang, 2006).
Facing this challenge, researchers have discussed different tactics for increasing response rates in operations management survey research (Frohlich, 2002). Several tactics have emerged as having a potentially positive impact on response rates. The most important ones include pre-notification/pre-qualification (Lambert and Harrington, 1990; Yu and Cooper, 1983), incentives (Greer, Chuchinprakarn and Seshadri, 2000), support (Larson, 2005), questionnaire length (Deutskens, de Ruyter, Wetzels and Oosterveld, 2004), and follow-up (Dillman, 2000). It is important to recognize that these tactics are not mutually exclusive. Rather, they reinforce each other and are typically used as a set, as recommended by Dillman (2000). Therefore, this study used a combination of support, multiple types of incentives, multiple waves of delivery, multiple modes of delivery, and a moderate survey length to improve the response rate as much as possible.

First, support can come from multiple sources: professional societies, governments, and organizations. Larson (2005) found that professional organizations' support is the most effective at enhancing response rates. Therefore, APICS was solicited to support the research and administer the survey distribution. APICS agreed to announce the survey study in its semi-monthly e-newsletter, delivered to its member subscribers through email.

Second, multiple incentives were designed to encourage APICS members to participate and to complete the survey. It was made clear at the beginning of the survey how important it was for respondents to complete all the questions to the best of their knowledge, and respondents were even encouraged to consult colleagues about answers they were not sure of. All respondents were promised confidentiality: their individual names and opinions would not be disclosed.
However, only those who completed the survey were provided with a summary of the project findings and entered into a lottery (a grand prize of a $75 Visa gift certificate and five first prizes of $25 Barnes and Noble gift cards). To preserve anonymity while still being able to contact the winners, two datasets were created. After the original data were obtained, unfinished responses and responses with half of the data missing were removed, and the rest were copied to a new data file. The email addresses in the new data file were extracted for the lottery drawing and for sending out the executive summary. The remaining data (without contact information) were used for data analysis.

Third, multiple waves of the survey were used to remind respondents to participate. Two weeks after the first survey announcement, a reminder was placed in the APICS semi-monthly e-newsletter and sent to its subscribers. Those who had not had a chance to fill out the survey therefore had another opportunity to do so. Additionally, both waves occurred in June, in advance of vacation time for most employees in American firms.

Fourth, although the survey was mounted on the server, respondents were given two options (i.e., fill it out online or download it). This combined use of delivery methods gave respondents more flexibility and made them feel comfortable.

Lastly, the questionnaire was designed to be as concise as possible; it came to 12 screen pages. In the pre-test, the time needed to fill out the survey averaged 18.35 minutes with a standard deviation of 1.83 minutes, and 70 percent of the respondents finished in less than 20 minutes. These results showed that the length of the questionnaire was reasonable.

4.3.3 Survey Delivery Method

Surveys can be delivered through different modes: mail, fax, telephone, personal interview, and the Internet.
There is an ongoing debate regarding the relative effectiveness and efficiency of online surveys compared with traditional mail surveys (Cook, Heath and Thompson, 2000; Deutskens et al., 2004; Ilieva, Baron and Healey, 2002; Sheehan and Hoy, 1999). The evidence is mixed. While some have found that online surveys are more cost-effective, others have found that they can discourage participation by being perceived as "impersonal" and "cold" (Deutskens et al., 2004). The survey in this study was delivered through the Internet for the following reasons.

First, web-based surveys offer appealing possibilities (Cook et al., 2000). People seem to find the technology easy to use (Parker, 1992). Like a mail survey, an electronic survey can be completed at the pace the respondent chooses. Unlike a mail survey, which can easily be mislaid, an electronic contact remains in place until purposefully deleted (Sheehan and Hoy, 1999). In a University of Colorado survey, 55 percent of the respondents cited ease of use as one of the things they liked most about answering a Web survey (University of Colorado at Boulder, 1996). Once people discover how easily a survey can be completed online, they are more willing to do it.

Second, online surveys are less costly, and the data are obtained quickly and in structured form (Ilieva et al., 2002). Coding errors are significantly reduced. At the same time, disadvantages of online surveys (for example, unfamiliarity with the technology) can be mitigated to some extent by a downloadable paper version.

Third, online surveys have not been shown to be plagued by missing values. For instance, one study showed that 69.4 percent of email respondents completed 95 percent of the survey, whereas only 56.6 percent of mail respondents did so (Schaefer and Dillman, 1998).
Furthermore, the email participants answered open-ended questions with 40 words on average, whereas the mail respondents' answers were briefer, with 10 words on average. Data with many missing values have a direct impact on the quality of the data and the data analysis. If an observation is deleted because some answers are missing, the number of observations available for analysis is reduced. Mean substitution and other techniques for imputing missing values are never as good as real answers.

4.4 Limitations of Methodology

Even though every effort was made to make the research design a well-thought-out one, every study is subject to certain limitations. In this section, those limitations are recognized and the techniques used to counteract them are reviewed.

4.4.1 Common Method Bias

Using self-reported measures as the primary or sole type of data, though common in survey research, is subject to common method biases. Common method variance is the variance attributable to the measurement method rather than to the constructs of interest. Method biases are one of the main sources of systematic measurement error, which threatens the validity of conclusions about relationships between constructs (Bagozzi and Yi, 1991; Nunnally, 1978; Spector, 1987). Podsakoff, MacKenzie, Lee, and Podsakoff (2003) provided a comprehensive summary of the potential sources of method biases and techniques for controlling them. Common method biases arise from having a common rater, a common measurement context, a common item context, or from the characteristics of the items themselves. They argued that method biases are likely to be a substantial problem in studies where the data for predictor and criterion variables are collected from the same source in the same measurement context, using the same item context and similar item characteristics.
Generally, there are two primary ways to control for method biases: through the research design phase and through statistical control (Podsakoff et al., 2003). In the research design phase, the key is to minimize the connection between predictor and criterion variables arising from contextual cues, specific wording, or question format. Measures of predictors and criteria can be obtained from different sources. If this is impossible or infeasible, another potential remedy is to introduce a temporal (e.g., a time lag), psychological (e.g., a cover story), or methodological (e.g., different scales) separation between the measurement of predictor and criterion variables. Another important issue is to improve the quality of the scale items: reduce ambiguity, avoid vague concepts, and keep questions simple and concise. If little can be done in the research design phase, statistical remedies can also help tackle the problem of common method biases (for instance, Harman's single-factor test, partialling out "common" factors, or multi-trait multi-method analysis).

Following the suggestions offered by Podsakoff et al. (2003), the issue of common method biases was addressed up front in the research design phase. First of all, as previously noted, the survey instrument was pre-tested by target respondents before it was posted online. The purpose of the pre-test was to clean up the questionnaire, reduce the ambiguity of the questions, properly mix positively and negatively worded items, and condense the length of the questionnaire. Consequently, items were presented to respondents without producing artifactual covariance in the observed relationships.

Second, it can be argued that the design features of an online survey help reduce the emergence of common method bias. Respondents had no chance to glance through the whole questionnaire; they could only see one or two questions on each screen.
The appearance of a new page can be viewed as a break in the respondents' train of thought, and they were unlikely to turn back to previous pages and change their answers. Therefore, respondents could not easily or consciously build connections among the questions.

Third, although an effort was made to minimize common method biases in the research design, it is extremely difficult to eliminate their unfavorable impact entirely. Statistical analysis (i.e., Harman's single-factor test) was therefore conducted later to evaluate how significant the problem was.

4.4.2 Response Rate and Sample Size

Response rate is a major concern of survey research. However, the response rate is hard to estimate when the survey is delivered via the Internet, because the true pool of potential respondents cannot be accurately assessed. With the Internet and email, surveys can be redistributed from one person to another electronically when the initial respondent believes someone else is more appropriate to fill them out. It is also possible that respondents simply skip these emails. In this case, the survey announcement was sent to APICS e-newsletter subscribers' email accounts as part of the e-newsletter, but how many subscribers read each issue is unknown. Even frequent readers may have skipped this particular issue or the survey announcement within it. Under all these circumstances, some proportion of the subscribers never knew about the survey. It is thus almost impossible to estimate the real pool of potential respondents and calculate an accurate response rate, which is a problematic aspect of using an online survey. However, considering its great advantages (discussed in Section 4.3.3) and the fact that APICS no longer allows researchers to access its current mailing list, this delivery method was still preferred.
Though APICS past mailing lists are managed by a third party (Infocus) and can be purchased for a fee, using the most current list is appealing because it contains correct contact information.

Besides the response rate, the size of the dataset is another issue. In this study, the desired number of responses was a minimum of 240 in order to run confirmatory factor analysis (CFA). This estimate is the product of the number of constructs (16), the minimum number of items for measuring each construct (3), and the minimum number of observations needed to generate a reliable and convergent parameter estimate (5) (Bollen, 1989). As previously noted, many tactics were introduced to obtain enough responses from knowledgeable managers given the time and budget of the research. Yet, in case the responses fell below the minimum requirement, CFA could be conducted for the operations practices set, the operations capabilities set, and operations performance separately, instead of including all of them in one CFA model. If necessary (i.e., if the dataset were too small to run even a small CFA), the bootstrap technique could be used to estimate the sampling distributions of estimates by re-sampling from the original sample with replacement (Bollen and Stine, 1993). The purpose of bootstrapping is to derive robust estimates of standard errors and confidence intervals for a population parameter such as a mean or regression coefficient. Accordingly, the bootstrap technique has been advocated as a method of internal replication to assess the replicability of the results of an individual study (Thompson, 1993). The application of the bootstrap is most appropriate in situations where theoretical assumptions are unlikely to be tenable or the statistical theory is weak (Bone, Sharma and Shimp, 1989; Fan and Thompson, 1998).
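The bootstrap fallback described above (note that the minimum sample size works out to 16 × 3 × 5 = 240) can be sketched as follows. This is an illustrative percentile bootstrap on made-up plant-level scores, not the dissertation's analysis, which was carried out in EQS and SPSS.

```python
# A minimal percentile-bootstrap sketch: re-sample the observed data with
# replacement and take percentiles of the re-computed statistic as a CI.
import random
import statistics

random.seed(42)  # for reproducibility of this illustration

def bootstrap_ci(sample, statistic, n_boot=2000, alpha=0.05):
    """Percentile bootstrap confidence interval for an arbitrary statistic."""
    estimates = []
    for _ in range(n_boot):
        resample = [random.choice(sample) for _ in sample]  # with replacement
        estimates.append(statistic(resample))
    estimates.sort()
    lo = estimates[int((alpha / 2) * n_boot)]
    hi = estimates[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Hypothetical plant-level performance scores (illustrative only).
scores = [3.1, 3.8, 2.9, 4.2, 3.5, 3.9, 2.7, 4.0, 3.3, 3.6]
low, high = bootstrap_ci(scores, statistics.mean)
print(low, high)  # the interval brackets the sample mean of 3.5
```

The same routine works unchanged for a regression coefficient: pass a `statistic` that refits the model on each resample, which is the internal-replication use Thompson (1993) advocates.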
Once the validity and reliability of the latent constructs were confirmed in CFA, techniques that are less data-demanding could be used for the statistical tests of research questions (2) to (4), for instance regression analysis rather than structural equation modeling.

4.5 Data Analysis Methodology

Data analysis was divided into two steps. The first step addresses the first research question, with the purpose of validating the measurements of the key constructs. The second step addresses the remaining three research questions regarding the relationships among the constructs. The significance level α was set at 0.05 to assess relationships in all research questions except question (4).

For research question (1), CFA was employed in EQS to test construct validity (i.e., convergent validity and discriminant validity) for the set of operations practice initiatives, the set of operations capabilities, and operations performance, respectively. Another CFA including the measures of all operations practice initiatives and operations capabilities was conducted to show that the two sets are valid and distinct construct groups.

The rest of the analyses were carried out through regression analysis using SPSS. For research question (2), the human judgment model was borrowed to evaluate whether the intra-relationships within the operations practice initiatives set and within the operations capabilities set are compensatory or additive. The basic model setup followed Patton and King's work (1992), and the original dataset was recoded to make the test feasible. For example, do the operations practice initiatives compensate for each other in improving performance? Two competing models were established. In the compensatory model, weakness in some practices can be cancelled out by other practices. What matters is the average level of all the practice initiatives, and their average usage becomes the independent variable in the model.
A respondent with a higher average usage would be predicted to attain a higher level of performance if the compensatory model is valid. In the additive model, the minimum level of use (threshold value) of any practice initiative puts a limit on the effectiveness of the other practices. Theoretically, the higher the threshold value, the better the performance. Thus, what matters is the lowest evaluation across all practice initiatives, which acts as the independent variable in the model. Each respondent was assigned a score corresponding to the lowest evaluation received on all the dimensions of operations practices. A significant positive relationship between this score and operations performance would support the additive model.

Essentially, research question (3) concerns the potential mediating effect of operations capabilities. The potential mediation effect was tested following the standard three-step approach in regression analysis (Baron and Kenny, 1986; Judd and Kenny, 1981).

(1) Show that the initial variable is correlated with the outcome, so there is an effect that may be mediated. This first step used operations performance (cost, quality, or delivery performance) as the criterion variable in a regression equation and operations practices as predictors.

(2) Show that the initial variable is correlated with the mediator. This step essentially involves treating the mediator as if it were an outcome variable; here, operations capabilities were used as the criterion variable and operations practices as predictors.

(3) Show that the mediator affects the outcome variable, using operations performance as the criterion variable in a regression equation with operations practices and operations capabilities as predictors.
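The three-step logic above can be sketched with ordinary least squares on synthetic data. Here X stands in for an operations practice score, M for an operations capability (the candidate mediator), and Y for operations performance; the variable names, data, and effect sizes are all illustrative assumptions, and the dissertation's actual tests were run in SPSS rather than in code like this.

```python
# Sketch of the Baron and Kenny (1986) three-step mediation test.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)                       # practice usage (illustrative)
m = 0.8 * x + rng.normal(scale=0.5, size=n)  # capability, driven by practice
y = 0.9 * m + rng.normal(scale=0.5, size=n)  # performance, driven by capability

def ols_coefs(outcome, *predictors):
    """Slope estimates (intercept included, then dropped) via least squares."""
    X = np.column_stack([np.ones_like(outcome)] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
    return beta[1:]

step1 = ols_coefs(y, x)[0]   # step (1): X -> Y, the total effect
step2 = ols_coefs(m, x)[0]   # step (2): X -> M
step3 = ols_coefs(y, x, m)   # step (3): X and M -> Y jointly
direct, mediator = step3[0], step3[1]

# Full mediation: the direct effect of X shrinks toward zero once M is
# controlled for, while M remains a strong predictor of Y.
print(step1, step2, direct, mediator)
```

With these synthetic data, the step-3 coefficient on X collapses relative to the step-1 total effect while the coefficient on M stays large, which is exactly the full-mediation pattern the next paragraph describes.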
To establish that operations capabilities completely mediate the operations practices–operations performance relationship, the effect of operations practices on operations performance, controlling for operations capabilities, should not be significantly different from zero.

The goal of research question (4) is to study how the competitive environment influences the relationships among operations practices, operations capabilities, and operations performance. In other words, are the results obtained for research question (3) robust across different market environments? Following the literature, two dimensions of the competitive environment were examined: market competitiveness and market dynamism. The former was measured by the number of competitors, the growth/decline of sales, and the price difference among competitors. Low market concentration, slow growth or decline of sales, and marginal price differences among competitors indicate that the market is more competitive. Market dynamism was measured by the rate of change in product introductions, process innovation, and customer tastes and preferences. The faster the change, the more dynamic the market. The whole sample was divided into two sub-samples based on the market competitiveness index and the market dynamism index. Two sets of models (compensatory vs. additive) were run, and results were compared between model structures and across sub-samples.

At this stage, a significance level of α = 0.1 was used rather than α = 0.05 for two reasons. First, the probability of correctly rejecting null hypotheses is reduced as the sample size decreases (Labovitz, 1968). The standard error varies inversely with sample size. Consequently, a small difference is likely to be statistically significant in a large database, while with a small sample size even large differences may not reach the predetermined significance level.
Therefore, a small α (e.g., 0.01 or 0.001) should accompany a large sample, and a large α (e.g., 0.10, 0.20) should be used for a small database. As the sample sizes became smaller due to the split, the significance level was relaxed to p < 0.1 so that fewer "true" hypotheses would be rejected. Second, Labovitz (1968) pointed out that the selection of the significance level also depends on the research purpose: theory development versus theory testing. When testing well-reasoned and well-developed hypotheses (as in most confirmatory research settings), it is logical to select a small significance level so that researchers have greater confidence in supporting one theory over the others. On the other hand, it is inappropriate to set a stringent significance level in exploratory research, where researchers explore a set of interrelations for the purpose of developing hypotheses to be tested in other studies. A larger significance level is suggested so as to yield more hypotheses, any of which may be subsequently validated. As discussed in Chapter Three, the goal of this dissertation is to develop theory and generate more specific hypotheses for future studies. Therefore, it is reasonable and acceptable to choose a relatively lenient significance level.

4.6 Chapter Summary

This chapter details the methodology used to carry out the study. The grounded theory approach was used to develop theory through a focus group study, and the survey method was employed to collect data to validate the relationships among the key constructs. The conduct of the focus group study, the development of the survey instrument, and the administration of the survey were described, followed by the plan for data analysis. The next chapter reports the results of the data analysis.

Chapter 5 DATA ANALYSIS AND DISCUSSION

This chapter presents the qualitative and quantitative results generated in this study. It also discusses the findings and their implications as they pertain to the key research questions.
As noted previously, the research methodology employed by this study consists of the following stages:

- Focus group: used to further refine the constructs of operations capabilities and operations practices and to explore relationships between these constructs and operations performance.
- Development and refinement of the survey instrument.
- Administration of the survey instrument.
- Analysis of the data generated by the survey instrument.
- Refinement of the framework.

This chapter uses this flow of activities to structure the presentation of the results.

5.1 Results from the Focus Group Study

After invitations were sent to 25 middle-level operations managers from the selected organization, eight responded and indicated that they were willing to participate in the focus group. No further attempt was made to recruit additional members, since the number of participants was exactly within the range suggested by Krueger (1988). The discussion followed the protocol set out in the preceding chapter, and the resulting meeting lasted approximately 90 minutes. The focus group was held in December 2005 at the corporate headquarters and took place during an event that allowed the participants to attend.

The participants were asked to provide their input on the definitions of the two core constructs: operations practices and operations capabilities. They agreed that operations capabilities are unique ways of doing something by extending and modifying a firm's assets. For example, the company extended its insulation technology, which efficiently prevented heat transfer and reduced consumers' energy bills. Insulation technology is a physical asset available to every firm. However, it can be combined with other assets and/or modified to fit the special needs of a firm in different ways. The corporation developed a proprietary process of insulation installation and generated a strong stream of business with homebuilders.
This proprietary process was regarded as a potential core capability. Some participants pointed out that operations capabilities are not something an organization simply claims to possess; rather, they must first be demonstrated. They therefore suggested adding one more level of specificity to the existing capability conceptualization, which would provide a clear judgment rule for firms and their management to ascertain whether they have a certain capability. Accordingly, the definition of "operations capabilities" was refined as the demonstrated potential to execute a specified course of action in operations in a unique and proficient way.

After consensus on this definition was achieved, the discussion turned to generating a list of core operations practice initiatives and operations capabilities. Before that, the difference between capabilities and practices was emphasized using examples from personal life so that the participants could make the distinction easily in operations settings. For instance, "time management" is viewed as a capability for an individual: a person good at using time (work time and leisure time) efficiently. This could be measured by how much output (e.g., jobs) the person can generate or process in a given amount of time. "Making schedules every day and keeping appointments in a Palm Pilot" are examples of practices, since these activities are used to help people better manage their time. The critical difference between practices and capabilities is that practices are specific, task/goal-oriented, and contextually bounded activities, whereas operations capabilities are broad-based, context-free routines and mechanisms that enable the most efficient use of the firm's assets. The former can be observed and imitated easily, whereas the latter are elusive, intangible, and hard to pin down. The focus group was next asked to list core operations practice initiatives and operations capabilities.
They came up with 16 operations practice initiatives and 14 operations capabilities (Table 5.1). In reviewing these two lists, it is interesting to note that there was significant overlap between these sets and those contained within the original framework. Consequently, it was decided to combine some of the items found in the lists generated by the focus group and to incorporate them into the original framework.

Table 5.1 Core Operations Practices and Operations Capabilities

Core Operations Practices:
1. Small lot size production
2. Information sharing with supplier
3. Standard work practices
4. Statistical process control
5. ISO 9000 and ISO 14000
6. Feedback collection from customers
7. Customer training to identify defects
8. Customer championship
9. Customers' complaint analysis
10. Design for manufacturing
11. Collaborated new product development
12. Employee training
13. Talent development
14. Performance measurement and evaluation (daily scorecard)
15. Supplier certification
16. Formal communication

Core Operations Capabilities:
1. Value creation for core customers
2. Sense of urgency to meet short lead time
3. Fulfillment of customers' orders
4. Process improvement to make price competitive
5. Process standardization
6. Responsiveness
7. Dependability and reliability
8. Intellectual property and know-how (specialized tooling, technology, equipment)
9. Specialization (service experts)
10. Customization
11. New product testing facility
12. Product innovation
13. Control of the supply chain
14. Relationships and trust with partners

In terms of the nature of the operations practices set and the operations capabilities set, the majority of participants agreed that firms do not need to have all of them. More importantly, deficiencies in one or more practices/capabilities can be offset by others, which supported the compensatory model. Alternatively, they suspected that some of them are core and are complemented by other practices/capabilities. Both views argued against the additive model, in which the various operations practices/capabilities are equally important and all need to be implemented or developed. The primary reason given by the participants was that firms compete on different dimensions, and they choose different sets of practice initiatives and operations capabilities to support those competitive dimensions. Firms do not need to do well on all dimensions; they need to do well on those dimensions critical to the successful achievement of their strategic objectives.

As shown in Figure 1.1, there is a direct connection between operations practices and operations performance, with the former impacting the latter. There also exists an indirect linkage between them that goes through operations capabilities. The focus group was asked to comment on the inter-relationships among operations practices, operations capabilities, and operations performance. The focus group first pointed out the direct connection between operations practices and performance, as it was both intuitive and straightforward. It is natural to attribute what a firm achieves to what it puts into action. Yet the impact of operations capabilities on performance could not be seen as easily or clearly.
Though it is reasonable to believe that a firm's performance relies on its ability to execute activities in efficient ways, the linkage is not evident due to the subtle and elusive nature of operations capabilities. However, the participants showed great interest in finding out the exact role of operations capabilities. In terms of the relationship between operations practices and operations capabilities, the focus group recognized that they reinforce each other over time. Capabilities cannot come from nothing in an organization; they are knowledge gained or drawn from everyday practices. With capabilities developed over time, an organization can more quickly and easily implement new practices or use existing practices more efficiently. However, due to the limitations of empirical analysis techniques, it was hard to incorporate the reciprocal relationships into the conceptual model for testing. Therefore, only the linkage from operations practices to operations capabilities was tested.

5.2 Q-sort Results

The focus group study helped refine the constructs and provided evidence for the tentative framework. In order to validate the proposed framework, data were collected through a large-scale survey. Before the survey was distributed, the theoretical linkages between the measurement items and the constructs of operations capabilities were ensured through a Q-sort analysis. As noted in Chapter Four, six senior operations management doctoral students at Michigan State University were asked to assign a list of 25 measures to six capabilities constructs based on the supplied definitions. The resulting assignment is summarized in Table 5.2. To assess the degree of inter-rater reliability, the study used Cohen's Kappa. Cohen's Kappa between any two researchers was 0.65 or higher, which was deemed to indicate an acceptable level of agreement. As can be seen from Table 5.2, most of the evaluations between raters were generally very consistent with the original design.
The only exception was item #15: two-thirds of the researchers assigned it to the "wrong" construct. Consequently, item #15 was removed from the measurement model. Cohen's Kappa was computed again, with all values improving to above 0.70. These results demonstrated the face validity of the operations capability constructs.

Table 5.2 Q-sort Results

Item #   R1   R2      R3   R4   R5   R6   IDEAL   Correct rate (%)
1        4    4       4    4    4    4    4       100
2        2    2       2    2    2    2    2       100
3        3    3       3    3    3    3    3       100
4        6    6       6    3    3    6    6       67
5        5    5       5    5    5    5    5       100
6        4    4       4    4    4    4    4       100
7        2    2       2    2    2    6    2       84
8        3    3       3    3    3    3    3       100
9        6    3       6    6    3    6    6       67
10       2    6       2    2    2    6    2       67
11       6    3/4/6   4    6    6    6    6       67
12       1    1       1    1    1    1    1       100
13       3    3       3    3    3    3    3       100
14       5    5       5    5    5    5    5       100
15       1    2       1    2    1    1    2       33
16       3    3       4    3    3    3    3       84
17       1    1       2    1    1    1    1       84
18       6    6       2    6    6    2    6       67
19       1    1       1    1    1    1    1       100
20       5    5       5    5    5    5    5       100
21       4    4       4    4    4    4    4       100
22       2    2       2    2    6    6    2       67
23       2    2       6    6    2    2    2       67
24       1    1       1    1    1    1    1       100
25       4    2       4    4    2    4    4       67

Note: COS — cooperation skills; PPD — proprietary processes development; RSP — responsiveness; IPI — incremental process improvement; RPI — radical process innovation; PRC — process reconfiguration. Cell entries are the construct numbers (1 = COS, ..., 6 = PRC) assigned by the six raters (R1-R6); IDEAL is the construct number each item was designed to measure. Correct rate is the ratio of the number of correct assignments to the total number of participants.

Besides item #15, some inconsistency remained in fewer than half of the items: the correct rates were below 100% for items #4, #7, #9, #10, #11, #16, #17, #18, #22, #23, and #25. Consequently, the researchers were asked to explain the rationale for their assignments on the inconsistent items. The feedback from this discussion was then used to rephrase the wording of those items. The result was a set of items that were linked to the underlying constructs and that were worded to minimize the chance of misinterpretation and confusion.
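Inter-rater agreement of the kind reported above can be computed with a short script. The following is an illustrative sketch with made-up ratings, not the actual Q-sort data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters
    assigning the same items to nominal categories."""
    n = len(rater_a)
    # Observed agreement: share of items assigned identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal category frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical assignments of ten items to constructs 1-6 by two raters.
rater_1 = [4, 2, 3, 6, 5, 4, 2, 3, 6, 2]
rater_2 = [4, 2, 3, 3, 5, 4, 2, 3, 6, 6]
print(round(cohens_kappa(rater_1, rater_2), 2))  # 0.75
```

A kappa of 0.75 would clear the 0.65 cutoff used in the study; the correction for chance agreement is what distinguishes kappa from a raw percent-agreement figure.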
5.3 Data Collection

An online survey was developed using PERSEUS software and pre-tested by managers in the selected corporation before it was uploaded to the university server. The survey targeted APICS members who are primarily in charge of operations at the plant level. To gain access to this target group, the survey announcement was sent out through the APICS semi-monthly e-newsletter, in which potential respondents were invited to participate in the study and provided with the link to the survey webpage (www.msu.edu/~wuiinhui).

5.3.1 Survey Responses

The survey announcement was first sent out on June 6, 2006, followed by another round two weeks later. After the first round, 103 responses were received; 50 more were received after the second round, by June 30, 2006. According to the 2003 APICS membership directory, there are 2,600 members whose primary area of responsibility is operations. The 153 responses thus represent an estimated 5.88% response rate, although, as previously noted, the exact response rate was hard to establish. Among the 153 responses, 19 were considered invalid because they were almost completely empty. The remaining 134 responses were used in the analysis. In general, missing values were not a problem, since respondents seemed very committed to finishing all the questions once they started. Where missing values were present, they were replaced with the mean values of the corresponding variables. If non-response bias can be viewed as a continuum ranging from fast responders to slow responders (with non-responders defining the end of the continuum), a comparison between first-round and second-round respondents can indicate the seriousness of non-response bias (Armstrong and Overton, 1977). Pair-wise t-tests were carried out between the two groups on all the questions related to operations practices, operations capabilities, and operations performance.
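A per-question wave comparison of this kind can be sketched in a few lines. This is an illustrative example with made-up Likert responses, using the Welch two-sample t statistic and a normal-approximation cutoff (reasonable at these group sizes):

```python
import math

def welch_t(x, y):
    """Welch two-sample t statistic (early vs. late responders)."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)  # sample variance, wave 1
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)  # sample variance, wave 2
    return (mx - my) / math.sqrt(vx / nx + vy / ny)

# Hypothetical 1-7 answers to one survey question from the two waves.
early = [5, 6, 5, 7, 4, 6, 5, 5, 6, 4]
late = [5, 5, 6, 4, 6, 5, 7, 5]
t = welch_t(early, late)
# |t| below ~1.96 (approximate 5% two-tailed cutoff): no significant difference.
print(abs(t) < 1.96)
```

Repeating this test for every practice, capability, and performance question gives the proportion of items on which the two waves differ, which is the figure reported next.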
Overall, 95 percent of the questions showed no significant difference between the two groups, which suggests that non-response bias is not a substantial problem in this study.

In typical hypothesis-testing research, a researcher wants to control Type I error (α) but must also be concerned about Type II error (β). Therefore, the research design must ensure that power (1 − β) is reasonably high to detect reasonable departures from the null hypothesis, so that a researcher will not accept a hypothesis when it is false. Power is positively related to sample size: the larger the sample, the more likely a false hypothesis can be detected (i.e., the greater the power). However, with too much power, trivial effects become highly significant. Cohen (1988) suggested 80 percent power for estimating the required sample size; for a one-sample t-test this implies 196 observations for a small effect size (d = 0.2, the null hypothesis is wrong by a small amount), 33 for a medium effect size (d = 0.5), and 14 for a large effect size (d = 0.8). Considering that regression was employed in most of the analyses and t-tests were used to evaluate the significance of the coefficients, Cohen's suggestion was followed to evaluate the power of this study. Given a usable sample size of 134, the power was high (above 80%) for detecting a medium effect size, but relatively low for uncovering a small effect size.

5.3.2 Demographic Information of the Sample

Respondents' demographic information was captured through job title, the number of years in the position, the number of years in the field, and confidence in assessing plant-level issues. Detailed information is presented in Table 5.3(a), (b), and (c) respectively.
Table 5.3 Summary of Respondents' Demographic Information

(a) Respondents' Job Title                  Frequency
Chief operations manager                    4
VP operations                               4
Plant manager                               7
Director of operations                      10
Production supervisor                       2
Operations manager                          31
Supply chain manager                        13
Planning and inventory system manager       19
Others                                      44

(b) Years in this Position                  Frequency
Less than one year                          24
1-5 years                                   74
6-10 years                                  21
Over 10 years                               15

Years in the Field of Operations            Frequency
Less than one year                          2
1-5 years                                   15
6-10 years                                  32
Over 10 years                               85

(c) Confidence in Assessing Plant-Level Issues   Frequency
Not at all                                  0
Somewhat confident                          31
Confident                                   67
Extremely confident                         36

Respondents were asked to identify their job title from a provided list. If they could not find it in the list, they checked "others" and specified their exact job title. Among the 44 job titles provided this way, 36 were titles such as "continuous improvement leader", "demand planning supervisor", "director of quality assurance", "director/manager of logistics/supply chain", "lean coordinator/leader", "master planning and operations systems manager", and "material control/coordination manager". Therefore, the majority of the respondents work in the area of operations and were appropriate informants for the survey. A typical respondent had been in the current position for more than five years and had worked in the general area of operations for more than eight years; respondents had thus accumulated enough experience and knowledge to answer the questions in the survey. The unit of analysis in this study is the plant level, and most of the questions in the survey are related to operations at this level; 77 percent of the respondents claimed to be confident in assessing plant-level issues. Overall, the respondents represented the target population and were capable of addressing the issues in the questionnaire.

5.3.3 Sample Representativeness

Considering that the estimated response rate was relatively low (i.e., 5.88%), sample representativeness was further analyzed with respect to organizational size and industry. The survey asked a number of questions to infer the industry in which each respondent's business competes, for instance, North American Industry Classification System (NAICS) code, Standard Industrial Classification (SIC) code, major products, and the name of the firm to which the plant belongs. Among all the options, the majority of respondents filled out the information about their major products. Based on the product descriptions, their industry information was generalized using three-digit NAICS codes. A detailed breakdown is provided in Table 5.4(a). Except for a few businesses in the utilities, construction, transportation, wholesale, and professional services sectors, the majority are in manufacturing. Within manufacturing, the businesses are widely spread among food/beverage, machinery, computer/electronic products, chemical/pharmaceutical products, electrical equipment, and transportation equipment. In addition, market competitiveness and market dynamism were used as proxies for industry representativeness. The sample spans well from more competitive to less competitive markets, and from dynamic to stable markets (Table 5.4b). Therefore, the sample has good industry representativeness. The survey had two questions to estimate the size of the organization: annual sales and the number of employees. Since there was a lot of missing information on annual sales, only the number of employees is reported in Table 5.4(c). Apparently, the sample covers fairly small businesses as well as large businesses.
Table 5.4 Sample Representativeness

(a) NAICS Code   NAICS Title                                        Frequency
22               Utilities                                          1
23               Construction                                       2
31-33            Manufacturing                                      106
311              Food                                               4
312              Beverage and tobacco product                       2
313              Textile mills                                      1
314              Textile product mills                              2
315              Apparel                                            1
316              Leather and allied product                         1
321              Wood product                                       -
322              Paper                                              4
323              Printing and related support activities            3
324              Petroleum and coal product                         -
325              Chemical manufacturing                             12
326              Plastics and rubber products                       4
327              Nonmetallic mineral product                        2
331              Primary metal                                      4
332              Fabricated metal                                   11
333              Machinery                                          12
334              Computer and electronic product                    6
335              Electrical equipment, appliance and components     10
336              Transportation equipment                           13
337              Furniture and related product                      6
339              Miscellaneous (medical devices)                    8
42               Wholesale                                          2
48               Transportation                                     1
54               Professional services                              1

(b) Market Dynamism        Frequency      Market Competitiveness      Frequency
Dynamic market             59             More competitive market     57
Stable market              75             Less competitive market     77

(c) Number of Employees    Frequency
X ≤ 50                     17
50 < X ≤ 200               37
200 < X ≤ 500              33
500 < X ≤ 1000             12
X > 1000                   18

5.4 Data Analysis and Discussion

The statistical analyses are presented in the order of the research questions, each followed immediately by its discussion.

5.4.1 Analysis and Discussion for Research Question 1

Research question (1) aims to provide empirical evidence for the validity and reliability of the operations capabilities set and the operations practices set. Confirmatory factor analysis (CFA) was the technique used to achieve this research objective. Since the 134 usable responses fell below the minimum requirement for running a CFA with all the constructs in one model, CFA was run separately for the operations practices set and the operations capabilities set. The results of these analyses are presented in Table 5.5 and Table 5.6 respectively. In the operations practices set, CFA started with seven practice initiatives, as presented in the initial framework.
However, the factor loadings of "supplier relationship management" were problematic: its items were found to have stronger relationships with other constructs, which indicated that this construct is not distinct from the other practice initiatives. As the goal of this study is not to identify a comprehensive list of operations practices but to obtain a set of core practice initiatives and investigate their relationships with operations capabilities and performance, this construct was deleted from further analysis to achieve a better fit with the data.

[Table 5.5 Measurement Model for Operations Practices]

[Table 5.6 Measurement Model for Operations Capabilities]

All factor loadings were significant at p < 0.01, with values above 0.59, which denoted good convergent validity of the operations capabilities. Factor loadings of the individual items on their respective operations capability were of greater magnitude than those on the other capabilities in the measurement model. The correlation between any two operations capabilities was between 0.45 and 0.77. Discriminant validity was further assessed by using two-factor CFA models involving each possible pair of operations capabilities, with the correlation between them first set free and then constrained to one (Bagozzi et al., 1991). In all cases,
the χ² value of the unconstrained model was significantly lower than that of the constrained one. These results established discriminant validity among all the operations capabilities in this study. Cronbach's alphas were computed to assess construct reliability; all were greater than 0.73, so the reliability of the operations capabilities was ensured.

As noted in Chapter Four, survey research is exposed to common method bias, which can either inflate or deflate the observed relationships between items and constructs. To assess the presence of common method bias, Harman's single-factor test (Podsakoff et al., 2003) was applied in a CFA (containing all the constructs of operations practices and operations capabilities) in which all the measurements loaded on one latent variable. This model generated poor results: χ² = 1914.32 with 740 degrees of freedom, p < .001, BNNFI = 0.57, CFI = 0.59, IFI = 0.59, RMSEA = 0.11. This model was compared with a first-order measurement model in which each item was linked to the core operations practice or operations capability it was supposed to measure. The overall measurement model fit was significantly improved: χ²(674) = 999.14, p < .001, BNNFI = 0.88, CFI = 0.90, IFI = 0.90, RMSEA = 0.06. The comparison of the χ² values of these two models rejected the hypothesis of a single factor accounting for the majority of variance among the measures. Therefore, common method bias did not seem problematic in this study.

Given the overall validity of the operations practices set and the operations capabilities set, a CFA involving both sets was conducted to ensure the distinction between them. Two competing second-order measurement models were proposed in Figure 5.1 and tested. (Note that the number of items and constructs shown in Figure 5.1 does not match the CFA exactly, due to graphic limitations.)
The left model hypothesizes that all the operations practice initiatives and operations capabilities reflect a single higher-level latent concept (i.e., they cannot be distinguished from one another). In contrast, the right model hypothesizes that all the operations practice initiatives reflect one higher-order concept while all the operations capabilities reflect another. The left model in Figure 5.1 generated χ²(728) = 1130.14, p < .001, BNNFI = 0.85, CFI = 0.86, IFI = 0.86, RMSEA = 0.06, while the right model produced χ²(727) = 1075.10, p < .001, BNNFI = 0.87, CFI = 0.88, IFI = 0.88, RMSEA = 0.06. Comparing the two second-order CFAs on the fit indices and the change in χ² per change in degrees of freedom (Δχ²/Δdf), it is apparent that the left-hand model did not fit as well as the right-hand model. That is to say, the practice initiatives set and the operations capabilities set do not seem to come from one higher-level concept, but more likely from two.

[Figure 5.1 Two Competing Second-Order Measurement Models]

The distinction between the operations practice initiatives and operations capabilities can also be demonstrated by examining the correlations among them. As presented in Table 5.7, all of the correlations were significant at p < 0.05. Given that the study uses perceptual data, it is not surprising to see moderately strong correlations among constructs. However, none of the correlations was high enough (above 0.90) to threaten the validity of the constructs. Consequently, there was no need to combine any two constructs into one.
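Returning to the second-order model comparison: the two models are nested and differ by one degree of freedom (freeing the correlation between the two higher-order concepts), so the χ² difference test reduces to the square of a standard normal. A quick check, using the χ² values reported above:

```python
import math

def chi2_diff_p_1df(chi2_restricted, chi2_free):
    """p-value of a chi-square difference test with one degree of freedom.
    A chi-square(1) variate is the square of a standard normal, so
    P(X > d) = 2 * (1 - Phi(sqrt(d)))."""
    delta = chi2_restricted - chi2_free  # restricted (one-concept) model fits worse
    z = math.sqrt(delta)
    phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF
    return 2.0 * (1.0 - phi)

# chi-square values reported for the one-concept vs. two-concept models.
p = chi2_diff_p_1df(1130.14, 1075.10)
print(p < 0.001)  # the two-concept model fits significantly better
```

With Δχ² = 55.04 on one degree of freedom, the improvement from separating practices and capabilities is significant far beyond the 3.84 critical value.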
Table 5.7 Correlation Matrix of Core Operations Practices and Core Operations Capabilities

      COS   PPD   RSP   IPI   RPI   PRC   QMP   JFP   COP   IPD   WDP   LDP
COS   1
PPD   0.55  1
RSP   0.45  0.45  1
IPI   0.59  0.45  0.35  1
RPI   0.46  0.51  0.40  0.67  1
PRC   0.67  0.66  0.47  0.70  0.58  1
QMP   0.26  0.53  0.41  0.49  0.45  0.47  1
JFP   0.48  0.50  0.35  0.65  0.46  0.63  0.65  1
COP   0.45  0.49  0.27  0.42  0.29  0.59  0.53  0.61  1
IPD   0.49  0.41  0.26  0.43  0.35  0.55  0.49  0.59  0.62  1
WDP   0.41  0.36  0.24  0.58  0.45  0.52  0.65  0.77  0.58  0.60  1
LDP   0.59  0.42  0.25  0.50  0.37  0.57  0.58  0.66  0.65  0.54  0.76  1

Note: COS — cooperation skills; PPD — proprietary processes development; RSP — responsiveness; IPI — incremental process improvement; RPI — radical process innovation; PRC — process reconfiguration; QMP — quality management practices; JFP — JIT flow practices; COP — customer orientation practices; IPD — integrated product development practices; WDP — workforce development practices; LDP — leadership practices.

After establishing the construct validity of the operations practice initiatives and operations capabilities, the measures of each construct were averaged, and a single measure per construct was used in the regression analysis.

5.4.2 Analysis Results and Discussion for Research Question 2

Research question (2) addresses the nature of the relationships within the core operations capabilities set and within the core operations practices set. Two competing models were tested: the compensatory model and the additive model. The former argues that operations capabilities (or operations practice initiatives) can compensate for one another, while the latter suggests that a minimum level of each is needed to produce good performance.

5.4.2.1 Analysis Results

Following the setup of human judgment models (Patton and King, 1992), the original dataset was recoded to test the two types of models.
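A minimal sketch of this recoding, with made-up data (AVGCC and LOWCC would be built the same way from the capability items):

```python
# Hypothetical 1-7 usage scores for six practice initiatives per respondent.
responses = {
    "plant_a": [6, 5, 7, 6, 5, 6],
    "plant_b": [3, 6, 2, 5, 4, 3],
}

# Compensatory predictor: average usage across all initiatives (AVGPR).
avgpr = {plant: sum(scores) / len(scores) for plant, scores in responses.items()}
# Additive predictor: the weakest initiative determines performance (LOWPR).
lowpr = {plant: min(scores) for plant, scores in responses.items()}

print(lowpr["plant_a"], lowpr["plant_b"])  # 5 2
```

The two recodings embody the competing logics: the mean lets a strong initiative offset a weak one, while the minimum makes the weakest initiative binding.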
Specifically, since the average level of all the operations capabilities (or practice initiatives) was hypothesized as a predictor of operations performance (denoted PERF) in the compensatory model, new variables were generated to reflect each respondent's average value across the operations capabilities (denoted AVGCC) and average usage of the practice initiatives (denoted AVGPR). In contrast, the lowest value across all operations capabilities (or practice initiatives) was assumed to determine operations performance in the additive model. New variables were therefore generated to represent the minimum of each respondent's answers across all the operations capabilities (denoted LOWCC) and practice initiatives (denoted LOWPR) respectively.

Operations performance comprised three constructs: cost, quality, and delivery. Each construct was measured using multiple items, and confirmatory factor analysis was conducted to establish construct validity; the results are summarized in Table 5.8. These three dimensions of operations performance served as the dependent variables.

Table 5.8 Measurement Model for Operations Performance

Constructs and Items                                          Standardized    t-value
                                                              Coefficient
Relative to your competition, how would you rate the performance of your plant
operation on the following dimensions of performance?
Cost (Cronbach's Alpha = 0.88)
  Manufacturing unit cost                                     0.81            10.75
  Manufacturing overhead cost                                 0.82            11.05
  Total cost (acquisition, setup, maintenance, service, etc.) 0.90            12.65
Quality (Cronbach's Alpha = 0.89)
  Product conformance                                         0.89            12.66
  Product reliability                                         0.91            13.19
  Product features                                            0.79            10.65
Delivery (Cronbach's Alpha = 0.84)
  Delivery accuracy                                           0.83            10.99
  Delivery dependability                                      0.77            9.89
  Delivery quality                                            0.84            11.2
  Delivery availability                                       0.60            7.19
Model fit: χ²(32) = 57.82, p < .001, BNNFI = 0.95, CFI = 0.97, IFI = 0.97, RMSEA = 0.08

To examine question (2), a series of regressions was run in SPSS between the average of the operations practice initiatives (AVGPR) and the three dimensions of operations performance. Another set of regressions was then run between the minimum level of the practice initiatives (LOWPR) and the three dimensions of operations performance. The coefficients of determination (R²) and the regression coefficients (β) are summarized in Table 5.9.

Table 5.9 Regression Results for the Nature of the Relationship between Operations Practice Initiatives and Performance

                                            Dependent Variable
Model Type      Independent Variable        Cost     Quality   Delivery
ADDITIVE        LOWPR                 R²    0.17     0.14      0.02
                                      β     0.41*    0.37*     0.15
COMPENSATORY    AVGPR                 R²    0.17     0.13      0.06
                                      β     0.41*    0.36*     0.25*
Note: * denotes p < 0.05.

Similarly, regressions were run between the average of the operations capabilities (AVGCC) and the three dimensions of operations performance, and then between the minimum level of the operations capabilities (LOWCC) and the three dimensions of operations performance. The results are reported in Table 5.10.

Table 5.10 Regression Results for the Nature of the Relationships between Operations Capabilities and Performance

                                            Dependent Variable
Model Type      Independent Variable        Cost     Quality   Delivery
ADDITIVE        LOWCC                 R²    0.12     0.03      0.04
                                      β     0.35*    0.18*     0.20*
COMPENSATORY    AVGCC                 R²    0.21     0.08      0.04
                                      β     0.46*    0.28*     0.20*
Note: * denotes p < 0.05.

5.4.2.2 Discussion

In terms of the intra-relationships among the operations practice initiatives, Table 5.9 shows that the compensatory and additive models have approximately the same explanatory power for the cost and quality dimensions of performance. The additive model is more resource demanding than the compensatory model, because it requires resource investment in all the operations practice initiatives. Given that both models fit the data equally well, it is the additive model that should govern decisions regarding practice implementation: firms have to use all the practice initiatives to a certain level to improve their competitiveness in cost and quality. Yet the compensatory model outperformed the additive model in explaining a firm's delivery performance. That is to say, to improve delivery performance firms do not need to adopt all the practice initiatives; rather, they can be effective in certain areas and still compete well in the market. Going back to the data, it can be seen that firms tend to implement all the practice initiatives at roughly the same level, no matter how intensively they use them. Thus, a firm with high average usage of the practice initiatives tends to have a high minimum usage as well, while a firm with low average usage tends to have a low minimum usage. That could be why both models show approximately the same explanatory power.

As to the nature of the relationships among the operations capabilities, the results in Table 5.10 indicate that the compensatory model outperforms the additive model in explaining a firm's cost and quality performance, while the two models tie in explaining delivery performance. This can be interpreted to mean that firms can improve the first two dimensions of performance by developing various types of operations capabilities. What matters is not a "threshold" that needs to be met by all the operations capabilities.
Rather, firms can develop different capability portfolios, which could render them the same competitive advantage. If a firm is weak in one capability, it can still compete well in the market as long as it excels at other capabilities. This is evidence of the presence of "equifinality". Equifinality, as introduced by Hambrick (1984), recognizes that there are multiple paths available for firms and their management teams to reach a desired state.

Combining the results above, it is interesting to find that the nature of the relationships among the operations practice initiatives and that among the operations capabilities are quite different in terms of their impact on a firm's operations performance. Implementing all the practice initiatives is a necessary condition for improving cost and quality performance, whereas no such condition holds for the operations capabilities. Put alternatively, there could be a variety of ways to configure a firm's operations capability portfolio, but fewer choices for the practice initiatives. On one hand, the results confirm the conclusion of the Collins et al. (1996) study, in which the researchers found that operations practices have to be put in place in all areas to build long-term sustained performance. On the other hand, the results also indicate that firms have great flexibility in building operations capabilities and can be equally successful with different capability portfolios.

5.4.3 Analysis Results and Discussion for Research Question 3

After examining the nature of the relationships within the operations practice initiatives set and the operations capabilities set individually, both sets were brought into one model with the purpose of identifying the relative role of each set in enhancing operations performance.

5.4.3.1 Analysis Results

Figure 1.1 suggests that the operations capabilities set could mediate the relationship between the practice initiatives set and operations performance.
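The three-step mediation logic (practices → capabilities → performance) can be sketched with ordinary least squares on synthetic data. Everything below is illustrative: `x` stands in for a practice measure such as AVGPR, `m` for a capability measure such as AVGCC, and `y` for a performance dimension; the actual analysis was run in SPSS on the survey constructs.

```python
def ols1(x, y):
    """Slope from a simple regression of y on x (with intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return sxy / sxx

def ols2(x, m, y):
    """Slopes (b_x, b_m) from regressing y on x and m jointly."""
    n = len(x)
    mx, mm, my = sum(x) / n, sum(m) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    smm = sum((a - mm) ** 2 for a in m)
    sxm = sum((a - mx) * (b - mm) for a, b in zip(x, m))
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    smy = sum((a - mm) * (b - my) for a, b in zip(m, y))
    det = sxx * smm - sxm ** 2  # solve the 2x2 normal equations
    return (smm * sxy - sxm * smy) / det, (sxx * smy - sxm * sxy) / det

# Synthetic data: m is roughly 2*x, and y is exactly 0.5*m (full mediation).
x = [1, 2, 3, 4, 5, 6, 7, 8]
m = [2.0, 4.1, 5.9, 8.0, 10.0, 12.1, 13.9, 16.0]
y = [1.0, 2.05, 2.95, 4.0, 5.0, 6.05, 6.95, 8.0]

b_yx = ols1(x, y)         # step 1: practices predict performance
b_mx = ols1(x, m)         # step 2: practices predict the mediator
b_x, b_m = ols2(x, m, y)  # step 3: direct effect of x vanishes, m carries it
print(round(b_yx, 2), round(b_mx, 2), round(b_m, 2))
```

Full mediation appears as a significant total effect in step 1 that drops to (near) zero in step 3 once the mediator is included, which is the pattern reported next for cost performance.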
The three-step regression approach described in Chapter Four was then used to detect the potential mediation. As noted previously, operations practices and operations capabilities can be modeled as either compensatory or additive, so the potential mediating effect was tested under both scenarios. The results of the compensatory and additive scenarios are summarized in Table 5.11 and Table 5.12 respectively. Adjusted R² is reported for regressions with more than one independent variable. Regression diagnostics were performed, and there was no substantial evidence of multicollinearity, heteroscedasticity, or omitted variables in the regression models; therefore the ordinary least squares results are unbiased and reliable.

Two things were evident after testing the proposed model under the two scenarios. First, the compensatory model worked better in explaining the differences in firms' cost and delivery performance (in terms of how much variance of the dependent variables can be explained by the independent variables). This implies that firms have greater freedom to implement various practice initiatives and develop different operations capabilities and still be equally successful on both performance dimensions. However, the additive and compensatory models worked similarly well in explaining quality performance, which indicates that a minimum level of every core practice initiative is crucial for providing quality products or services.

Second, capabilities played a full mediation role in explaining cost performance. That is to say, the usage of practice initiatives has to be transformed into a set of unique capabilities to reduce cost. However, the intensive use of all practice initiatives still played the major role in explaining quality performance, while the role of capabilities in that relationship was not substantial. Unfortunately, neither practices nor capabilities could explain delivery performance well (as indicated by R² = 0.07).
Even though the linkage between operations practices and performance was significant, the explanatory power of the full model was weak: only seven percent of the variance in operations performance could be traced to practice initiative implementation.

[Table 5.11 and Table 5.12: regression results for the mediation tests under the compensatory and additive scenarios.]

5.4.3.2 Discussion

The reported results indicate three situations that need to be explained separately. First, why do operations practice initiatives have an indirect impact on cost performance through operations capabilities? Second, why do operations practice initiatives have a direct impact on quality performance? Third, why do neither practice initiatives nor operations capabilities substantially influence delivery performance?

The answer to the first question lies in the measurement of cost and in the nature of operations practices and operations capabilities. Cost in this study was actually a comprehensive measure of performance, because it included manufacturing unit cost as well as total cost.
The total cost covers both the direct purchasing cost and the hidden costs associated with using a product or service, for instance the costs of setup, maintenance, service, and operation. The total cost therefore implicitly embraces other dimensions of performance. For instance, it is hard to imagine a low total cost if a product has poor quality: though the purchasing cost of the product could be low, the maintenance cost could be extremely high, eventually resulting in a high total cost. In turn, competitiveness in "total cost" can also mean high quality.

From a trade-off perspective, improving quality performance or delivery performance could come at the price of losing competitiveness in cost. Though some firms have managed to resolve the trade-off thanks to the advent of new technology and innovative management philosophies, the pressure of rising costs still exists. How firms overcome the trade-off impasse cannot be traced to the practices they use, for at least two reasons. First, practices are specific, task-oriented activities with very detailed goals, whereas performance is measured at a high level in terms of cost, quality, and delivery; it is difficult to build a one-to-one correspondence between a specific activity and broad performance. Moreover, practices can be identified, communicated, and disseminated easily. This nature of practices means they can be imitated quickly, so the potential rent from adopting them is exploited rapidly and is not sustained. In contrast, capabilities are long lasting because they are elusive and unique. Firms develop their capabilities inside their organizations through various activities and practices in everyday operations over a long period of time. Once established, capabilities tend to have a long-lasting influence on performance, as they are hard to identify, articulate, and decode.
Therefore, operations capabilities generate a strong mediating effect between operations practices and cost performance.

The second question concerns why operations practices have a direct impact on quality performance. It can be argued as follows. The significance of quality in business has led many organizations to conclude that effective quality management can enhance their competitive abilities (Anderson, Rungtusanatham and Schroeder, 1994). Since Japan's leading industrialists instituted W. Edwards Deming's quality control methods, Japanese quality, productivity, and competitive position have improved and strengthened enormously (Buffa, 1984; Garvin, 1984; Juran, 1981; Riggs and Felix, 1983). Deming's ideas spread quickly in the USA afterwards: the Malcolm Baldrige National Quality Award was established in 1987, and in 1988 the U.S. government established the Federal Quality Prototype Award. Ferdows and De Meyer's (1990) "sandcone" model even suggested a path to improved performance in which quality lays a solid foundation for improving the other dimensions of performance and for improving performance continuously.

As quality holds such a critical position in competition, it has been studied intensively for more than half a century, and improving quality has long been emphasized in practice. Juran (1986) described quality management with three elements: quality planning, quality control, and quality improvement. However, he found that a very low priority is often given to planning and improvement, while top priority and resources are given to control. With the tremendous growth of the quality literature in both academic and practitioner-oriented journals, the term quality management has been diluted to mean different things, and the scope of activities underlying it lacks consensus (Watson and Korukonda, 1995). Yet it is widely believed that the underlying practices in quality management are fundamental and essential for effective management (Nair, 2006).
Over time, the tacit knowledge of managing quality has become increasingly explicit. Guidelines and procedures have been documented to facilitate the articulation of this tacit knowledge and the communication of quality improvement. Training is given from the top management team down to frontline operators on commitment, philosophy, tools, and teamwork. Tools such as process flowcharting, scatter diagrams, Pareto analysis, cause-and-effect analysis, and control charts have become standardized. All of these help convert operations capabilities into operations practices, which reinforces the direct relationship between practice initiatives and quality performance.

In addition, quality means quality assurance for many companies, because quality organizations stress quality control and assurance (Spencer, 1994). Quality performance links directly to the quality assurance approach, a systematic approach to the pursuit of quality (Collins, 1994). The purpose of quality assurance is the conformance of products, services, and processes to given requirements and standards (Crosby, 1979; Moreno-Lonzo and Peris, 1998). This conformance is achieved through systematic measurement and control to detect special causes of variation and achieve process standardization (Dale, Boaden and Lascelles, 1990). Quality assurance includes, and is an extension of, quality control (Garvin, 1988; Moreno-Lonzo and Peris, 1998). If quality is the degree to which an item or process meets or exceeds the user's requirements, then quality assurance comprises the actions that provide confidence that quality was in fact achieved. Essentially, the goal of quality assurance is to meet the standard and be on target through the use of tools, methods, and training: in other words, practices. Alternatively put, quality performance can be achieved through the use of quality management practices.
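The detection of special causes of variation through systematic measurement, mentioned above, can be illustrated with a standard Shewhart X-bar control chart. This is a generic sketch on hypothetical measurement data, not part of the study; the control-limit constant A2 = 0.577 is the textbook value for subgroups of five.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical process data: 25 samples of 5 measured units each.
samples = rng.normal(loc=10.0, scale=0.2, size=(25, 5))
samples[20] += 1.0  # inject a shifted sample to mimic a special cause

xbar = samples.mean(axis=1)  # subgroup means
grand_mean = xbar.mean()

# Average subgroup range, used to estimate process spread.
rbar = (samples.max(axis=1) - samples.min(axis=1)).mean()

# Three-sigma X-bar limits via the A2 constant for subgroup size 5.
ucl = grand_mean + 0.577 * rbar  # upper control limit
lcl = grand_mean - 0.577 * rbar  # lower control limit

# Points outside the limits signal special-cause variation to investigate.
out_of_control = np.where((xbar > ucl) | (xbar < lcl))[0]
print("flagged samples:", out_of_control)
```

The chart embodies the quality-assurance logic described in the text: routine measurement against fixed limits turns conformance checking into a repeatable practice rather than a judgment call.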
Finally, the findings strongly indicated that neither operations practices nor operations capabilities are good indicators for explaining delivery performance. Two explanations for this situation exist: (1) there may be more than one homogeneous group, with the result that the relationships are disguised when the whole sample is used for the test; and (2) delivery is rarely an independent performance variable. Few firms would consider delivery performance alone without also attending to quality and/or cost; it is likely that delivery is bundled with other performance dimensions. Undoubtedly, more research is required to answer this question.

5.4.4 Analysis and Discussion on Research Question 4

Research question (4) suggests that the pattern of relationships found in question (3) may be contextual. That is, the competitive context could moderate the key relationships. To explore this possibility, two dimensions of competitive context were investigated: market competitiveness and market dynamism.

5.4.4.1 Analysis Results

Market competitiveness was measured by the number of competitors in the market, growth/decline of sales, and the price difference among competitors. Market dynamism was measured by the rate of introduction of new products and new processes and the rate of change in customers' tastes and preferences. For each construct, an exploratory factor analysis was performed to verify the validity of the measures, and the measures of each construct were then averaged to form an index. The whole sample was split into two subgroups by the mean values of the market competitiveness index and the market dynamism index respectively. Firms were viewed as competing in more competitive markets when the competitiveness index (MKTCOM) was no more than 3.8. This group consisted of 57 firms; the remaining 77 were viewed as being in less competitive markets.
Similarly, firms were viewed as operating in more dynamic markets when the dynamism index (MKTDY) was no less than 3.7 (59 firms were assigned to this subgroup). The remaining 75 were viewed as being in more stable markets.

Table 5.13 and Table 5.14 contain the results on the interrelationships among operations practices, operations capabilities, and performance for firms in more competitive markets and in less competitive markets. Three observations follow from the results. First, the compensatory model dominates the additive model in explaining all dimensions of performance in both situations. This is largely consistent with the results of the previous subsection and provides empirical support for the concept of "equifinality": firms have greater freedom to manage their operations practice portfolio and operations capability portfolio regardless of whether they are in more or less competitive markets. Second, operations capabilities play a significant mediating role between operations practices and all three dimensions of performance in more competitive markets, while operations practices play the more critical role in less competitive markets. Third, the pattern of relationships among operations practices, operations capabilities, and operations performance differs significantly between more competitive and less competitive markets. This inconsistency strongly suggests that market competitiveness moderates the key relationships among them.
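The mean-split subgrouping described above can be sketched as follows. The data and the simple per-group slope comparison are illustrative assumptions (the study itself compared full regression models within each subgroup); only the variable name MKTCOM and the 57/77 split logic come from the text.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 134

# Hypothetical seven-point index scores and performance measures.
df = pd.DataFrame({
    "MKTCOM": rng.uniform(1, 7, n),      # market-competitiveness index
    "practices": rng.uniform(1, 7, n),   # practice-initiative index
})
df["quality"] = 0.4 * df["practices"] + rng.normal(0, 1, n)

# Split at the sample mean of the index; in the study, the subgroup with
# MKTCOM no more than 3.8 (57 firms) was treated as the more competitive one.
cutoff = df["MKTCOM"].mean()
groups = {
    "low_index": df[df["MKTCOM"] <= cutoff],
    "high_index": df[df["MKTCOM"] > cutoff],
}

# Fit the same simple regression within each subgroup; a marked difference
# in slopes across subgroups is informal evidence of moderation.
slopes = {}
for name, g in groups.items():
    slope, intercept = np.polyfit(g["practices"], g["quality"], 1)
    slopes[name] = slope
    print(f"{name}: n={len(g)}, slope={slope:.2f}")
```

An alternative to sample splitting, not used in the chapter, would be to test an interaction term (practices × MKTCOM) in a single pooled regression.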
[Table 5.13: relationships among operations practices, operations capabilities, and operations performance in more competitive markets; regression results not legible in the scanned original]

[Table 5.14: relationships among operations practices, operations capabilities, and operations performance in less competitive markets; regression results not legible in the scanned original]

Table 5.15 and Table 5.16 show the interrelationships among operations practices, operations capabilities, and performance for firms in dynamic markets and in stable markets. Two observations follow from the results. First, the compensatory model works better in dynamic markets, whereas the additive model works well in stable markets. Consequently, market dynamism does moderate the key relationships among operations practices, operations capabilities, and operations performance. Second, operations capabilities play an important role in mediating the relationship between operations practices and cost performance, but not quality or delivery performance, in dynamic markets. In stable markets, firms that implement every operations practice initiative at a certain level compete better on cost and quality, while a minimum level of all kinds of operations capabilities is of great help in improving delivery performance.
[Table 5.15: relationships among operations practices, operations capabilities, and operations performance in dynamic markets; regression results not legible in the scanned original]

[Table 5.16: relationships among operations practices, operations capabilities, and operations performance in stable markets; regression results not legible in the scanned original]

Part II. Operations Practices and Operations Capabilities

[Introductory instructions and the first item grids of Part II are not legible in the scanned original]
[Operations capability rating grids (seven-point scales); largely not legible in the scanned original]

9. Operations Practices: please indicate the use of the following operations practices in your plant (seven-point scale). [Partially legible items include: statistical process control; quality training and tools; supplier certification for quality; competitive benchmarking in quality; small lot sizes; setup reduction; pull system production; equipment maintenance; measuring customer satisfaction; maintaining close contact with customers; supplier development; and supplier partnering. Remaining items are not legible in the scanned original.]
.5» 0.030% cozmntawho 0:0 0:03 8031 080.0103 005000-380 0:0E0>0EE_ L00 x03000m 00305 00 830E 0000.58.00 0mD €800.08 .x0v 3530:. 8008.850 106000500: 0:0 3000::— EBmxm 00:00:30 .250”— mm0005 swig 003003-085 wct005w=0 80:30:00 3600 002005 30: E 00080202: 8:095 938308309: 00% @300 20300.50 032:8 :0 0030 00300.8 3:005 00:260. 093 00:00.5 157 10. Relative to your competition, how would you rate the performance of your plant operation on the following dimensions of performance? Poor Excellent 1 2 3 4 5 6 7 Manufacturing unit cost F F f‘ f‘ r r“ F Manufacturing overhead cost (‘ r‘ r‘ r“ r“ r c Total cost of ownership t“ r" r' r r p p Product conformance t" r r‘ r r ('~ (~ Product durability r‘ r" r r F F r‘ Product overall quality F t“ r‘ r r (‘ p Product reliability 1" r r' (‘ t‘ (- (~ Product features F F r‘ r r‘ F r Delivery accuracy (correct items were .L’ r) ’) "1 “i “'3 ") delivered) Delivery dependability (delivered on the (L (L (L (L (L (L (L agreed upon date) Delivery quality (condition of product (L (L (L (L (L (L (L after shlpment) Delivery availability (probability that (L (L (L (L (L (L (L Items are 1n stock at order time) Delivery speed (short elapsed time) P f“ t" F P t" f" Ability to adjust product volumes f" f“ 1" f” F 4'" F" Ability to produce a range of products r 1" P f" t” f“ F Lead time to introduce new products F f” r" f“ f“ F F Number of new products introduced each (L (L (L (L (L (L (L year 158 11. How would you indicate the rate of change for the following dimensions? Slow Rapid 1 2 3 4 5 6 7 The industry rate of introduction of new products (L (L or services The Industry rate of introduction of new (L (L (L (L (L (L (L operating processes The tastes and preferences of the customers in (L (L (L (L (L (L (L your industry The rate at which products or services in your . t" t“ t“ F F f“ (‘ 1ndustry become outdated 12. To what extent do you agree or disagree with the applicability of the following phrases to describe the culture of your plant? 
(1 = strongly disagree, 4 = neutral, 7 = strongly agree)

Willingness to take risks
Has a value-added mentality
Customer orientated
Employees feel empowered
Process orientated

13. How would you describe the industry your plant competes in? (1 = none/low/short, 7 = numerous/high/long)

The number of major competitors
Price difference between you and your major competitors
Growth/decline in sales
Product life cycle length

14. Please indicate the industry in which your business competes. If you know the NAICS code, please specify your three-digit code (please refer to the website for the NAICS code information: http://www.census.gov/epcd/naics02/naicod02.htm). If you know the SIC code, please specify your three-digit code.

Major product: ____
Name of the firm to which your plant belongs: ____

15. What is the approximate annual sales dollars in 2005 (in 1,000$) in the plant?

16. What is the approximate number of employees in the plant?

Part III. Follow-up

17. During the course of any study such as this one, interesting or unexpected findings often emerge. Determining what has happened requires some additional work. In some cases, this means asking some of the respondents for additional feedback in the form of a short (i.e., one-page) survey. This information greatly enhances the quality of the findings and improves the nature of the insights gained from the study. Would you be interested in participating in this small-group feedback initiative?

( ) Yes. In exchange for your participation, you will receive a more detailed executive summary.
( ) No.

18. Thank you very much for participating in the survey. Please enter the following information so that you can qualify for the drawing and/or receive an electronic version of the executive summary.
Your email address is optional, but please note that we cannot send anything to you if we do not know who you are.

Email address: ____

BIBLIOGRAPHY

Abbott, C. and Eubanks, P. (2005), "How academics and practitioners evaluate technical texts: A focus group study," Journal of Business and Technical Communication, Vol. 19, No.2, pp.171-218.

Amit, R. and Schoemaker, P. J. H. (1993), "Strategic assets and organizational rent," Strategic Management Journal, Vol. 14, No.1, pp.33-46.

Amundson, S. D. (1998), "Relationships between theory-driven empirical research in operations management and other disciplines," Journal of Operations Management, Vol. 16, No.4, pp.341-359.

Anand, G. and Ward, P. T. (2004), "Fit, flexibility and performance in manufacturing: Coping with dynamic environments," Production and Operations Management, Vol. 13, No.4, pp.369-385.

Anderson, J. C., Rungtusanatham, M. and Schroeder, R. G. (1994), "A theory of quality management underlying the Deming management method," Academy of Management Review, Vol. 19, No.3, pp.472-509.

Armstrong, S. and Overton, T. (1977), "Estimating nonresponse bias in mail surveys," Journal of Marketing Research, Vol. 14, No.3 (special issue), pp.396-402.

Bacharach, S. B. (1989), "Organizational theories: Some criteria for evaluation," Academy of Management Review, Vol. 14, No.4, pp.496-515.

Bagozzi, R. P. and Yi, Y. (1991), "Multitrait-multimethod matrices in consumer research," Journal of Consumer Research, Vol. 17, No.4, pp.426-439.

Bagozzi, R. P., Yi, Y. and Philips, L. M. (1991), "Assessing construct validity in organizational research," Administrative Science Quarterly, Vol. 36, No.3, pp.421-458.

Balakrishnan, R., Linsmeier, T. J. and Venkatachalam, M. (1996), "Financial benefits from JIT adoption: Effects of customer concentration and cost structure," The Accounting Review, Vol. 71, No.2, pp.183-205.

Barney, B. J. (1991), "Firm resources and sustained competitive advantage," Journal of Management, Vol. 17, No.1, pp.99-120.
Barney, B. J. (1995), "Looking inside for competitive advantage," Academy of Management Executive, Vol. 9, No.4, pp.49-61.

Baron, R. and Kenny, D. (1986), "The moderator-mediator variable distinction in social psychological research: Conceptual, strategic, and statistical considerations," Journal of Personality & Social Psychology, Vol. 51, No.6, pp.1173-1182.

Beaumont, N. B. and Schroder, R. M. (1997), "Technology, manufacturing performance and business performance amongst Australian manufacturers," Technovation, Vol. 17, No.6, pp.297-307.

Bettman, J. R. (1979). An Information Processing Theory of Consumer Choice. Reading, MA: Addison-Wesley.

Bidault, F., Despres, C. and Butler, C. (1998), "The drivers of cooperation between buyers and suppliers for product innovation," Research Policy, Vol. 26, No.7, pp.719-732.

Bolden, R., Waterson, P., Warr, P., Clegg, C. and Wall, T. (1997), "A new taxonomy of modern manufacturing practices," International Journal of Operations and Production Management, Vol. 17, No.11, pp.1112-1130.

Bollen, K. A. (1989). Structural Equations with Latent Variables. New York, NY: John Wiley & Sons.

Bollen, K. A. and Stine, R. A. (1993). "Bootstrapping goodness-of-fit measures in structural equation models." In Testing Structural Equation Models. Eds. K. A. Bollen and J. S. Long: Sage Publications, pp.111-135.

Bonoma, T. V. (1985), "Case research in marketing: Opportunities, problems, and a process," Journal of Marketing Research, Vol. 22, No.2, pp.199-208.

Bone, P. F., Sharma, S. and Shimp, T. A. (1989), "A bootstrap procedure for evaluating goodness-of-fit indices of structural equation and confirmatory factor models," Journal of Marketing Research, Vol. 26, No.1, pp.105-111.

Boon-itt, S. and Paul, H. (2006), "A study of supply chain integration in Thai automotive industry: A theoretical framework and measurement," Management Research News, Vol. 29, No.4, pp.194-205.

Boyd, B. K., Dess, G. G. and Rasheed, A. M. A.
(1993), "Divergence between archival and perceptual measures of the environment: Causes and consequences," Academy of Management Review, Vol. 18, No.2, pp.204-226.

Boyer, K. K., Leong, G. K., Ward, P. T. and Krajewski, L. J. (1997), "Unlocking the potential of advanced manufacturing technologies," Journal of Operations Management, Vol. 15, No.4, pp.331-347.

Boyer, K. K. and Lewis, M. W. (2002), "Competitive priorities: Investigating the need for trade-offs in operations strategy," Production and Operations Management, Vol. 11, No.1, pp.9-20.

Brynjolfsson, E. and Smith, M. D. (2000), "Frictionless commerce? A comparison of Internet and conventional retailers," Management Science, Vol. 46, No.4, pp.563-585.

Buffa, E. S. (1984), "Making American manufacturing competitive," California Management Review, Vol. 16, No.1, pp.29-46.

Callen, J. L., Fader, C. and Krinsky, I. (2000), "Just-in-time: A cross-sectional plant analysis," International Journal of Production Economics, Vol. 63, No.3, pp.277-301.

Camp, R. C. (1989). Benchmarking: The Search for Industry Best Practices that Lead to Superior Performance. Milwaukee, WI: ASQC Quality Press.

Challis, D., Samson, D. and Lawson, B. (2005), "Impact of technological, organizational and human resource investments on employee and manufacturing performance: Australian and New Zealand evidence," International Journal of Production Research, Vol. 43, No.1, pp.81-107.

Chan, S. K. K. and Man, D. W. K. (2005), "Barriers to returning to work for people with spinal cord injuries," Work, Vol. 25, No.4, pp.325-332.

Chase, R. B., Jacobs, F. R. and Aquilano, N. J. (2005). Operations Management for Competitive Advantage. New York, NY: McGraw-Hill/Irwin.

Christensen, C. M. (1992), "Exploring the limits of the technology S-curve. Part 1: Component technologies," Production and Operations Management, Vol. 1, No.4, pp.334-357.

Christiansen, T., Berry, W. L., Bruun, P. and Ward, P.
(2003), "A mapping of competitive priorities, manufacturing practices, and operational performance in groups of Danish manufacturing companies," International Journal of Operations & Production Management, Vol. 23, No.10, pp.1163-1183.

Clegg, C., Axtell, C., Damodaran, L., Farbey, B., Hull, R., Lloyd-Jones, R., Nicholls, J., Sell, R., Tomlinson, C., Ainger, A. and Sewart, T. (1996). The Performance of Information Technology and the Role of Human and Organizational Factors. Sheffield: report to the Economic and Social Research Council, Institute of Work Psychology.

Cleveland, G., Schroeder, R. G. and Anderson, J. C. (1989), "A theory of production competence," Decision Sciences, Vol. 20, pp.655-668.

Cohen, J. (1960), "A coefficient of agreement for nominal scales," Educational and Psychological Measurement, Vol. 20, No.1, pp.37-46.

Cohen, J. (1988). Statistical Power Analysis for the Behavioral Sciences. Hillsdale, NJ: Lawrence Erlbaum.

Collins, P. (1994), "Approaches to quality," The TQM Magazine, Vol. 6, No.3, pp.39-43.

Collins, R., Cordon, C. and Julien, D. (1996), "Lessons from the 'Made in Switzerland' study: What makes a world-class manufacturer?," European Management Journal, Vol. 14, No.6, pp.576-589.

Collis, D. J. and Montgomery, C. A. (1995), "Competing on resources: Strategy for the 1990s," Harvard Business Review, Vol. 73, No.4, pp.118-128.

Cook, C., Heath, F. and Thompson, R. (2000), "A meta-analysis of response rates in Web- or Internet-based surveys," Educational and Psychological Measurement, Vol. 60, No.6, pp.821-836.

Corbett, C. J. and Van Wassenhove, L. N. (1993), "Trade-offs? What trade-offs? Competence and competitiveness," California Management Review, Vol. 35, No.4, pp.107-122.

Corbin, J. and Strauss, A. (1990), "Grounded theory research: Procedures, canons, and evaluative criteria," Qualitative Sociology, Vol. 13, No.1, pp.3-21.

Cronbach, L. (1951), "Coefficient Alpha and the Internal Structure of Tests," Psychometrika, Vol.
16, No., pp.297-334.

Crosby, P. T. (1979). Quality Is Free: The Art of Making Quality Certain. New York, NY: McGraw-Hill.

Cua, K. O., McKone, K. E. and Schroeder, R. G. (2001), "Relationships between implementation of TQM, JIT, and TPM and manufacturing performance," Journal of Operations Management, Vol. 19, pp.675-694.

Dale, B. G., Boaden, R. J. and Lascelles, D. (1990), "Total quality management: An overview." In Managing Quality. Englewood Cliffs, NJ: Prentice-Hall.

Davies, A. J. and Kochhar, A. K. (2002), "Manufacturing best practice and performance studies: A critique," International Journal of Operations and Production Management, Vol. 22, No.3, pp.289-305.

Davy, J. A., White, R. E., Merritt, N. J. and Gritzmacher, K. (1992), "A derivation of the underlying constructs of just-in-time management systems," Academy of Management Journal, Vol. 35, No.3, pp.653-670.

Day, G. S. (1994), "The capabilities of market-driven organizations," Journal of Marketing, Vol. 58, No.4, pp.37-52.

Dean, J. W. and Snell, S. A. (1996), "The strategic use of integrated manufacturing: An empirical examination," Strategic Management Journal, Vol. 17, No.6, pp.459-480.

Dennis Jr., W. J. (2003), "Raising response rates in mail surveys of small business owners: Results of an experiment," Journal of Small Business Management, Vol. 41, No.3, pp.278-295.

Deutskens, E., de Ruyter, K., Wetzels, M. and Oosterveld, P. (2004), "Response rate and response quality of Internet-based surveys: An experimental study," Marketing Letters, Vol. 15, No.1, pp.21-36.

Dierickx, I. and Cool, K. (1989), "Asset stock accumulation and sustainability of competitive advantage," Management Science, Vol. 35, No.12, pp.1504-1511.

Dillman, D. A. (2000). Mail and Internet Surveys: The Tailored Design Method. New York, NY: John Wiley & Sons.

Douglas, T. J. and Judge Jr., W. Q.
(2001), "Total quality management implementation and competitive advantage: The role of structural control and exploration," Academy of Management Journal, Vol. 44, No.1, pp.158-169.

Droge, C., Jayaram, J. and Vickery, S. K. (1999), "The ability to minimize the timing of new product development and introduction: An examination of antecedent factors in the North American automobile supplier industry," Journal of Product Innovation Management, Vol. 17, No.1, pp.24-40.

Eisenhardt, K. M. (1989), "Building theories from case study research," Academy of Management Review, Vol. 14, No.4, pp.532-550.

Eisenhardt, K. M. and Martin, J. A. (2000), "Dynamic capabilities: What are they?," Strategic Management Journal, Vol. 21, No.10-11 (special issue: The Evolution of Firm Capabilities), pp.1105-1121.

Escrig-Tena, A. B. and Bou-Llusar, J. C. (2005), "A model for evaluating organizational competencies: An application in the context of a quality management initiative," Decision Sciences, Vol. 36, No.2, pp.221-257.

Eubanks, P. and Abbott, C. (2003), "Using focus groups to supplement the assessment of technical communication texts, pedagogy, and programs," Technical Communication Quarterly, Vol. 12, No.1, pp.25-46.

Fan, X. and Thompson, B. (1998). Bootstrap estimation of sample statistic bias in structural equation modeling. Annual Meeting of the American Educational Research Association, San Diego, CA.

Ferdows, K. and De Meyer, A. (1990), "Lasting improvements in manufacturing performance: In search of a new theory," Journal of Operations Management, Vol. 9, No.2, pp.168-184.

Fine, C. H. (1999). Clockspeed: Winning Industry Control in the Age of Temporary Advantage. New York, NY: Perseus Books Group.

Fleiss, J. (1981). Statistical Methods for Rates and Proportions. New York, NY: John Wiley & Sons.

Flynn, B., Sakakibara, S., Schroeder, R., Bates, K. and Flynn, J. (1990), "Empirical research methods in operations management," Journal of Operations Management, Vol. 9, No.2, pp.250-284.
Flynn, B. B. and Flynn, E. J. (2004), "An exploratory study of the nature of cumulative capabilities," Journal of Operations Management, Vol. 22, No.5, pp.439-457.

Flynn, B. B., Sakakibara, S. and Schroeder, R. G. (1995b), "Relationship between JIT and TQM: Practices and performance," Academy of Management Journal, Vol. 38, No.5, pp.1325-1360.

Flynn, B. B., Schroeder, R. G. and Sakakibara, S. (1994), "A framework for quality management research and an associated measurement instrument," Journal of Operations Management, Vol. 11, No.4, pp.339-366.

Flynn, B. B., Schroeder, R. G. and Sakakibara, S. (1995a), "The impact of quality management practices on performance and competitive advantage," Decision Sciences, Vol. 26, No.5, pp.659-691.

Frohlich, M. T. (2002), "Techniques for improving response rates in OM survey research," Journal of Operations Management, Vol. 20, No.1, pp.53-62.

Fullerton, R. R., McWatters, C. S. and Fawson, C. (2003), "An examination of the relationships between JIT and financial performance," Journal of Operations Management, Vol. 21, No.4, pp.383-404.

Gagnon, S. (1999), "Resource-based competition and the new operations strategy," International Journal of Operations and Production Management, Vol. 19, No.2, pp.125-138.

Garvin, D. A. (1984), "Japanese quality management," Columbia Journal of World Business, Vol., No., pp.3-12.

Garvin, D. A. (1988). Managing Quality: The Strategic and Competitive Edge. New York, NY: Free Press.

Gerwin, D. (1993), "Manufacturing flexibility: A strategic perspective," Management Science, Vol. 39, No.4, pp.395-410.

Giffi, C., Roth, A. and Seal, G. M. (1990). Competing in World Class Manufacturing: America's 21st Century Challenge. Homewood, IL: Business One Irwin.

Gigerenzer, G. and Goldstein, D. G. (1996), "Reasoning the fast and frugal way: Models of bounded rationality," Psychological Review, Vol. 103, No.4, pp.650-669.

Glaser, B. G. (1978).
Theoretical Sensitivity: Advances in the Methodology of Grounded Theory. San Francisco, CA: Sociology Press.

Glaser, B. G. and Strauss, A. L. (1967). The Discovery of Grounded Theory: Strategies for Qualitative Research. Chicago: Aldine Publishing Company.

Grant, R. (1996), "Toward a knowledge-based theory of the firm," Strategic Management Journal, Vol. 17 (special issue: Knowledge and the Firm), pp.109-122.

Greer, T. V., Chuchinprakarn, N. and Seshadri, S. (2000), "Likelihood of participating in mail survey research: business respondents' perspective," Industrial Marketing Management, Vol. 29, No.2, pp.97-109.

Guisinger, A. and Ghorashi, B. (2004), "Agile manufacturing practices in the specialty chemical industry," International Journal of Operations & Production Management, Vol. 24, No.6, pp.625-635.

Gupta, A. and Whitehouse, F. R. (2001), "Firms using advanced manufacturing technology management: An empirical analysis based on size," Integrated Manufacturing Systems, Vol. 12, No.5, pp.346-350.

Hambrick, D. C. (1984), "Taxonomic approaches to studying strategy: Some conceptual and methodological issues," Journal of Management, Vol. 10, No.1, pp.27-41.

Handfield, R. B. and Melnyk, S. A. (1998), "The scientific theory-building process: A primer using the case of TQM," Journal of Operations Management, Vol. 16, No.4, pp.321-339.

Hanson, P., Voss, C., Blackmon, K. and Claxton, T. (1994). Made in Europe: A Four Nations Best Practice Study. IBM Consulting Group and London Business School.

Hart, S. L. (1995), "A natural-resource based view of the firm," Academy of Management Review, Vol. 20, No.4, pp.986-1014.

Hartley, J. L., Zirger, B. J. and Kamath, R. R. (1997), "Managing the buyer-supplier interface for on-time performance in product development," Journal of Operations Management, Vol. 15, No.1, pp.57-70.

Hartley, R. H. (1992). Concurrent Engineering: Shortening Lead Time, Raising Quality and Lowering Cost. Cambridge, MA: Productivity Press.

Hayes, R. H.
and Wheelwright, S. C. (1984). Restoring Our Competitive Edge: Competing through Manufacturing. New York: Wiley.

Hiebeler, R., Kelly, T. B. and Ketteman, C. (1998). Best Practices -- Building Your Business with Customer-focused Solutions. New York, NY: Simon & Schuster.

Hunt, S. (1991). Modern Marketing Theory: Critical Issues in the Philosophy of Marketing Science. Cincinnati, OH: South-Western Publishing Co.

Huson, M. and Nanda, D. (1995), "The impact of just-in-time manufacturing on firm performance in the US," Journal of Operations Management, Vol. 12, No.3, pp.297-310.

Ilieva, J., Baron, S. and Healey, N. M. (2002), "Online surveys in marketing research: pros and cons," International Journal of Market Research, Vol. 44, No.3, pp.361-382.

Imai, M. (1986). Kaizen: The Key to Japan's Competitive Success. New York, NY: McGraw-Hill/Irwin.

Inman, R. A. and Mehra, S. (1993), "Financial justification of JIT implementation," International Journal of Operations & Production Management, Vol. 13, No.4, pp.32-39.

Jaikumar, R. (1986), "Postindustrial manufacturing," Harvard Business Review, Vol. 64, No.1, pp.69-76.

Jarvenpaa, S. (1999), "The effect of task demand and graphical format on information process strategies," Management Science, Vol. 21, No.5, pp.285-303.

Jarvenpaa, S. and Lang, K. R. (2005), "Managing the paradoxes of mobile technology," Information Systems Management, Vol. 22, No.4, pp.7-23.

Judd, C. M. and Kenny, D. A. (1981), "Process analysis: Estimating mediation in treatment evaluations," Evaluation Review, Vol. 5, No.5, pp.602-619.

Juran, J. M. (1981), "Product quality - a prescription for the west: Part 1: training and improvement programs," Management Review, Vol. 70, No.6, pp.8-14.

Kaynak, H. (2003), "The relationship between total quality management practices and their effects on firm performance," Journal of Operations Management, Vol. 21, No.4, pp.405-435.

Khan, A. M. (1987).
"Assessing venture capital investments with noncompensatory behavioral decision models," Journal of Business Venturing, Vol. 2, No.3, pp.193-205.

Kitazawa, S. and Sarkis, J. (2000), "The relationship between ISO 14001 and continuous source reduction programs," International Journal of Operations and Production Management, Vol. 20, No.2, pp.225.

Kotha, S. and Swamidass, P. M. (2000), "Strategy, advanced manufacturing technology and performance: Empirical evidence from U.S. manufacturing firms," Journal of Operations Management, Vol. 18, No.3, pp.257-277.

Koufteros, X. A., Vonderembse, M. A. and Doll, W. J. (1998), "Developing measures of time-based manufacturing," Journal of Operations Management, Vol. 16, No.1, pp.21-41.

Krause, D. R., Scannell, T. V. and Calantone, R. J. (2000), "A structural analysis of the effectiveness of buying firms' strategies to improve supplier performance," Decision Sciences, Vol. 31, No.1, pp.33-55.

Krueger, R. A. (1988). Focus Groups: A Practical Guide for Applied Research. London: Sage Publications.

LaBahn, D. W. and Krapfel, R. (2000), "Early Supplier Involvement in Customer New Product Development: A Contingency Model of Component Supplier Intentions," Journal of Business Research, Vol. 47, No.3, pp.173-190.

Labovitz, S. (1968), "Criteria for selecting a significance level: a note on the sacredness of 0.05," The American Sociologist, Vol. 68, No.3, pp.220-222.

Lambert, D. M. and Harrington, T. C. (1990), "Measuring nonresponse bias in customer service mail surveys," Journal of Business Logistics, Vol. 11, No.2, pp.5-25.

Lapre, M. A. and Scudder, G. D. (2004), "Performance improvement paths in the U.S. airline industry: linking trade-offs to asset frontiers," Production and Operations Management, Vol. 13, No.2, pp.123-134.

Larson, P. (2005), "A note on mail surveys and response rates in logistics research," Journal of Business Logistics, Vol. 26, No.2, pp.211-221.

Laugen, B. T., Acur, N., Boer, H. and Frick, J.
(2005), "Best manufacturing practices: What do the best-performing companies do?," International Journal of Operations and Production Management, Vol. 25, No.2, pp.131-150.

Leonard-Barton, D. (1992), "Core capabilities and core rigidities: A paradox in managing new product development," Strategic Management Journal, Vol. 13, No.2, pp.111-125.

Li, L. (1997), "Relationships between determinants of hospital quality management and service quality performance - a path analytical model," Omega, Vol. 25, No.5, pp.535-545.

Li, S., Rao, S. S., Ragu-Nathan, T. S. and Ragu-Nathan, B. (2005), "Development and validation of a measurement instrument for studying supply chain management practices," Journal of Operations Management, Vol. 23, No.6, pp.618-641.

McGrath, R. G., Tsai, M., Venkataraman, S. and MacMillan, I. C. (1996), "Innovation, competitive advantage and rent: a model and test," Management Science, Vol. 42, No.3, pp.389-403.

McKenna, H. P. (1994), "The Delphi technique: a worthwhile research approach for nursing?," Journal of Advanced Nursing, Vol. 19, No.6, pp.1221-1225.

McLachlin, R. (1997), "Management initiatives and just-in-time manufacturing," Journal of Operations Management, Vol. 15, No.4, pp.271-292.

Meredith, J. (1993), "Theory building through conceptual methods," International Journal of Operations and Production Management, Vol. 13, No.5, pp.3-11.

Merton, R. K., Fiske, M. and Kendall, P. L. (1990). The Focused Interview: A Manual of Problems and Procedures. London: Collier MacMillan.

Miles, R. E., Snow, C. C., Meyer, A. D. and Coleman Jr., H. J. (1978), "Organizational strategy, structure, and process," Academy of Management Review, Vol. 3, No.3, pp.546-562.

Miller, D. (1988), "Relating Porter's business strategies to environment and structure," Academy of Management Journal, Vol. 31, No.2, pp.280-308.

Miller, D. (1996), "Configurations revisited," Strategic Management Journal, Vol. 17, No.7, pp.505-512.

Miller, J. G. and Roth, A.
(1994), "A taxonomy of manufacturing strategies," Management Science, Vol. 40, No.3, pp.285-304.

Mills, J., Platts, K. and Bourne, M. (2003), "Competence and resource architectures," International Journal of Operations and Production Management, Vol. 23, No.9, pp.977-994.

Moore, G. C. and Benbasat, I. (1991), "Development of an instrument to measure the perceptions of adopting an information technology innovation," Information Systems Research, Vol. 2, No.2, pp.192-222.

Moreno-Luzon, M. D. and Peris, F. J. (1998), "Strategic approach, organizational design and quality management," International Journal of Quality Science, Vol. 3, No.4, pp.328-347.

Nahm, A. Y., Vonderembse, M. A. and Koufteros, X. A. (2004), "The impact of organizational culture on time-based manufacturing performance," Decision Sciences, Vol. 35, No.4, pp.579-607.

Nair, A. (2006), "Meta-analysis of the relationship between quality management practices and firm performance - implications for quality management theory development," Journal of Operations Management, Vol. 24, No.6, pp.948-975.

Narasimhan, R., Swink, M. and Kim, S. W. (2006), "Disentangling leanness and agility: An empirical investigation," Journal of Operations Management, Vol. 24, No.5, pp.440-457.

Nelson, R. (1991), "Why do firms differ and how does it matter?," Strategic Management Journal, Vol. 12, No.1, pp.61-74.

New, C. C. and Szwejczewski, M. (1995), "Performance measurement and the focused factory: Empirical evidence," International Journal of Operations and Production Management, Vol. 15, No.4, pp.63-79.

Noble, M. A. (1995), "Manufacturing strategy: Testing the cumulative model in a multiple country context," Decision Sciences, Vol. 26, No.5, pp.693-720.

Nunnally, J. C. (1978). Psychometric Theory. New York: McGraw-Hill.

Parker, L. (1992), "Collecting data the email way," Training and Development, Vol. 46, No.7, pp.52-54.

Parzinger, M. J. and Nath, R.
(2000), "A study of the relationships between total quality management implementation factors and software quality," Total Quality Management, Vol. 11, No.3, pp.353-371.

Patton, W. E., III and King, R. H. (1992), "The Use of Human Judgment Models in Sales Force Selection Decisions," The Journal of Personal Selling & Sales Management, Vol. 12, No.2, pp.1-14.

Penrose, E. (1959). The Theory of the Growth of the Firm. New York, NY: Wiley.

Peter, J. P. and Olson, J. C. (1987). Consumer Behavior: Marketing Strategy Perspectives. Homewood, IL: Irwin.

Podsakoff, P. M., MacKenzie, S. B., Lee, J.-Y. and Podsakoff, N. P. (2003), "Common method biases in behavioral research: A critical review of the literature and recommended remedies," Journal of Applied Psychology, Vol. 88, No.5, pp.879-903.

Powell, T. C. (1995), "Total quality management as competitive advantage: A review and empirical study," Strategic Management Journal, Vol. 16, No.1, pp.15-27.

Prabhu, V., Yarrow, D. and Gordon-Hart, G. (2000), "Best practice and performance within Northeast manufacturing," Total Quality Management, Vol. 11, No.1, pp.113-122.

Prahalad, C. K. and Hamel, G. (1990), "The core competence of the corporation," Harvard Business Review, Vol. 68, No.3, pp.79-91.

Ray, G., Barney, J. B. and Muhanna, W. A. (2004), "Capabilities, business processes, and competitive advantage: Choosing the dependent variable in empirical tests of the resource-based view," Strategic Management Journal, Vol. 25, No.1, pp.23-37.

Riggs, J. L. and Felix, G. H. (1983). Productivity by Objectives. Englewood Cliffs, NJ: Prentice-Hall.

Rosenzweig, E. D. and Roth, A. V. (2004), "Towards a theory of competitive progression: evidence from high-tech manufacturing," Production and Operations Management, Vol. 13, No.4, pp.354-368.

Rosenzweig, E. D., Roth, A. V. and Dean, J. W., Jr.
(2003), "The influence of an integration strategy on competitive capabilities and business performance: An exploratory study of consumer products manufacturers," Journal of Operations Management, Vol. 21, No.4, pp.437-456.

Roth, A. V. and Miller, J. G. (1992), "Success factors in manufacturing," Business Horizons, Vol. 35, No.4, pp.73-81.

Safizadeh, M. H., Ritzman, L. and Mallick, D. (2000), "Alternative paradigms in manufacturing strategy," Production and Operations Management, Vol. 9, No.2, pp.111-127.

Sahal, D. (1981). Patterns of Technological Innovation. London: Addison-Wesley.

Sakakibara, S., Flynn, B. B. and Schroeder, R. G. (1993), "A framework and measurement instrument for just-in-time manufacturing," Production and Operations Management, Vol. 2, No.3, pp.177-194.

Sakakibara, S., Flynn, B. B., Schroeder, R. G. and Morris, W. T. (1997), "The impact of Just-in-Time manufacturing and its infrastructure on manufacturing performance," Management Science, Vol. 43, No.9, pp.1246-1257.

Samson, D. and Ford, S. (2000), "Manufacturing practices and performance: Comparisons between Australia and New Zealand," International Journal of Production Economics, Vol. 65, No.3, pp.243-255.

Samson, D. and Terziovski, M. (1999), "The relationship between total quality management practices and operational performance," Journal of Operations Management, Vol. 17, No.4, pp.393-409.

Schaefer, D. R. and Dillman, D. A. (1998), "Development of a standard e-mail methodology: results of an experiment," Public Opinion Quarterly, Vol. 62, No.3, pp.378-397.

Schmenner, R. W. and Swink, M. L. (1998), "On theory in operations management," Journal of Operations Management, Vol. 17, No.1, pp.97-113.

Schonberger, R. J. (1996). World Class Manufacturing: The Next Decade. New York: Free Press.

Schroeder, R. G., Bates, K. A. and Junttila, M. A. (2002), "A resource-based view of manufacturing strategy and the relationship to manufacturing performance," Strategic Management Journal, Vol.
23, No.2, pp.105-117.

Scudder, G. D. and Hill, C. A. (1998), "A review and classification of empirical research in operations management," Journal of Operations Management, Vol. 16, No.1, pp.91-101.

Shah, R. and Ward, P. T. (2003), "Lean manufacturing: Context, practice bundles, and performance," Journal of Operations Management, Vol. 21, No.2, pp.129-149.

Sheehan, K. B. and Hoy, M. G. (1999), "Using e-mail to survey internet users in the United States: methodology and assessment," Journal of Computer-Mediated Communication, Vol. 4, No.3.

Shewhart, W. A. (1939). Statistical Method from the Viewpoint of Quality Control. New York, NY: Dover.

Sivo, S. A., Saunders, C., Chang, Q. and Jiang, J. J. (2006), "How low should you go? Low response rates and the validity of inferences in IS questionnaire research," Journal of the Association for Information Systems, Vol. 7, No.6, pp.351-414.

Skinner, W. (1969), "Manufacturing -- Missing link in corporate strategy," Harvard Business Review, Vol. 47, No.3, pp.136-144.

Skinner, W. (1974), "The focused factory," Harvard Business Review, Vol. 52, No.3, pp.112-121.

Sloss, E. S. (1995), "Child care choices in a lexicographic framework," Journal of Economic Issues, Vol. 29, No.2, pp.629-637.

Snell, S. A. and Dean, J. W. (1992), "Integrated manufacturing and human resource management: A human capital perspective," Academy of Management Journal, Vol. 35, No.3, pp.467-504.

Spear, S. and Bowen, H. K. (1999), "Decoding the DNA of the Toyota Production System," Harvard Business Review, Vol. 77, No.5, pp.96-106.

Spector, P. E. (1987), "Method variance as an artifact in self-reported affect and perceptions at work: Myth or significant problem," Journal of Applied Psychology, Vol. 72, No.3, pp.438-443.

Spencer, B. A. (1994), "Models of organization and total quality management: a comparison and critical evaluation," Academy of Management Review, Vol. 19, No.3, pp.446-471.

Starbuck, W. H. and Mezias, J. M.
(1996), "Opening Pandora's box: studying the accuracy of managers' perceptions," Journal of Organizational Behavior, Vol. 17, No.2, pp.99-117.

Stewart, D. W. and Shamdasani, P. N. (1990). Focus Groups: Theory and Practice. Applied Social Research Methods Series. Newbury Park, CA: Sage Publications.

Strauss, A. and Corbin, J. (1990). Basics of Qualitative Research. Newbury Park, CA: Sage.

Strauss, A. and Corbin, J. (1994). "Grounded theory methodology: an overview." In Handbook of Qualitative Research. Eds N. K. Denzin and Y. S. Lincoln. Thousand Oaks, CA: Sage. pp.273-285.

Subramaniam, M. and Youndt, M. A. (2005), "The influence of intellectual capital on the types of innovative capabilities," Academy of Management Journal, Vol. 48, No.3, pp.450-463.

Swamidass, P. M. (1986), "Manufacturing strategy: Its assessment and practice," Journal of Operations Management, Vol. 6, No.4, pp.471-484.

Swamidass, P. M. (1991), "Empirical science: New frontier in operations management research," Academy of Management Review, Vol. 16, No.4, pp.793-814.

Swamidass, P. M. (1992). Technology on the factory floor: A survey of advanced manufacturing technologies: today's use and plans for tomorrow. Thomas Walter Center for Manufacturing Technology, Auburn University, Auburn.

Swamidass, P. M. (1994). Technology on the factory floor II: Benchmarking manufacturing technology use in the United States. The Manufacturing Institute, Washington, DC.

Swink, M. and Hegarty, W. H. (1998), "Core manufacturing capabilities and their links to product differentiation," International Journal of Operations and Production Management, Vol. 18, No.4, pp.374-396.

Swink, M., Narasimhan, R. and Kim, S. W. (2005), "Manufacturing Practices and Strategy Integration: Effects on Cost Efficiency, Flexibility, and Market-Based Performance," Decision Sciences, Vol. 36, No.3, pp.427-457.

Swink, M. and Way, M. H.
(1995), "Manufacturing strategy: Propositions, current research, renewed directions," International Journal of Operations and Production Management, Vol. 15, No.7, pp.4-26.

Szwejczewski, M., Mapes, J. and New, C. (1997), "Delivery and trade-offs," International Journal of Production Economics, Vol. 53, No.3, pp.323-330.

Teece, D. J., Pisano, G. and Shuen, A. (1997), "Dynamic capabilities and strategic management," Strategic Management Journal, Vol. 18, No.7, pp.509-533.

Thompson, B. (1993), "The use of statistical significance tests in research: bootstrap and other alternatives," The Journal of Experimental Education, Vol. 61, pp.681-686.

Tu, Q., Vonderembse, M. A., Ragu-Nathan, T. S. and Ragu-Nathan, B. (2004), "Measuring Modularity-Based Manufacturing Practices and Their Impact on Mass Customization Capability: A Customer-Driven Perspective," Decision Sciences, Vol. 35, No.2, pp.147-168.

University of Colorado at Boulder (1996). Senior and alumni survey data collection: sampling methodology and response rate [online]. Available: http://www.colorado.edu/SARS/srsurvey/samplehtm.

Van de Ven, A. H. and Drazin, R. (1985). "The concept of fit in contingency theory." In Research in Organizational Behavior. Eds B. M. Staw and L. L. Cummings. Greenwich, CT: JAI Press. pp.333-375.

Van Dierdonck, R. and Miller, J. G. (1980), "Designing production planning and control systems," Journal of Operations Management, Vol. 1, No.1, pp.37-46.

Venkatraman, N. (1989), "The concept of fit in strategy research: toward verbal and statistical correspondence," Academy of Management Review, Vol. 14, No.3, pp.423-444.

Vickery, S. (1991), "The theory of production competence revisited," Decision Sciences, Vol. 22, pp.635-643.

Vickery, S., Droge, C. and Markland, R. (1993), "Production competence and business strategy: Do they affect business performance?," Decision Sciences, Vol. 24, No.2, pp.435-455.

Wacker, J. G.
(1998), "A definition of theory: research guidelines for different theory-building research methods in operations management," Journal of Operations Management, Vol. 16, No.4, pp.361-385.

Ward, P. and Zhou, H. (2006), "Impact of information technology integration and lean/Just-In-Time practices on lead-time performance," Decision Sciences, Vol. 37, No.2, pp.177-203.

Ward, P. T., Bickford, D. J. and Leong, G. K. (1996), "Configurations of manufacturing strategy, business strategy, environment, and structure," Journal of Management, Vol. 22, No.4, pp.597-626.

Ward, P. T. and Duray, R. (2000), "Manufacturing strategy in context: Environment, competitive strategy and manufacturing strategy," Journal of Operations Management, Vol. 18, No.2, pp.123-138.

Ward, P. T., Duray, R., Leong, G. K. and Sum, C. C. (1995), "Business environment, operations strategy and performance: An empirical study of Singapore manufacturers," Journal of Operations Management, Vol. 13, No.1, pp.99-115.

Ward, P. T., Leong, G. K. and Boyer, K. K. (1994), "Manufacturing proactiveness and performance," Decision Sciences, Vol. 25, No.3, pp.337-358.

Ward, P. T., McCreery, J. T., Ritzman, L. and Sharma, D. (1998), "Competitive priorities in operations management," Decision Sciences, Vol. 29, No.4, pp.1035-1046.

Watson, J. and Korukonda, A. (1995), "The TQM jungle: a dialectical analysis," International Journal of Quality and Reliability Management, Vol. 12, No.1, pp.100-109.

Watts, C. A. and Hahn, C. K. (1993), "Supplier development programs: An empirical analysis," International Journal of Purchasing and Materials Management, Vol. 29, No.2, pp.10-17.

Wernerfelt, B. (1984), "A resource-based view of the firm," Strategic Management Journal, Vol. 5, No.2, pp.171-180.

White, G. P. (1996), "A meta-analysis model of manufacturing capabilities," Journal of Operations Management, Vol. 14, No.4, pp.315-331.

White, R. E., Pearson, J. N. and Wilson, J. R.
(1999), "JIT manufacturing: A survey of implementations in small and large U.S. manufacturers," Management Science, Vol. 45, No.1, pp.1-15.

Williams, H. P. (1994), "An alternative explanation of disjunctive formulations," European Journal of Operational Research, Vol. 72, No.1, pp.200-203.

Wilson, D. D. and Collier, D. A. (2000), "An empirical investigation of the Malcolm Baldrige National Quality Award causal model," Decision Sciences, Vol. 31, No.2, pp.361-390.

Winter, S. G. (2003), "Understanding dynamic capabilities," Strategic Management Journal, Vol. 24, No.10, pp.991-995.

Wood, C. H. (1991). Operations strategy: Decision patterns and measurement. Ph.D. dissertation. The Ohio State University, Columbus, OH.

Youndt, M. A., Snell, S. A., Dean, J. W. and Lepak, D. P. (1996), "Human resource management, manufacturing strategy and firm performance," Academy of Management Journal, Vol. 39, No.4, pp.836-866.

Yu, J. and Cooper, H. (1983), "A quantitative review of research design effects on response rates to questionnaires," Journal of Marketing Research, Vol. 20, No.1, pp.36-44.

Yusuff, R. M. (2004), "Manufacturing best practices of the electric and electronic firms in Malaysia," Benchmarking: An International Journal, Vol. 11, No.4, pp.361-369.