COMPARISON OF SELECTIVE TREATMENT BASED ON WHITE BLOOD CELL COUNT TO METAPHYLAXIS TO TREAT BOVINE RESPIRATORY DISEASE AND REDUCE ANTIBIOTIC USAGE IN FEEDLOTS

By

Elizabeth Frey

A THESIS

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of Animal Science - Master of Science

2019

ABSTRACT

COMPARISON OF SELECTIVE TREATMENT BASED ON WHITE BLOOD CELL COUNT TO METAPHYLAXIS TO TREAT BOVINE RESPIRATORY DISEASE AND REDUCE ANTIBIOTIC USAGE IN FEEDLOTS

By

Elizabeth Frey

Metaphylaxis is commonly used to treat bovine respiratory disease (BRD) in the United States (US) feedlot industry. It contributes to the spread of antibiotic resistance and is expensive for producers to implement (Dean et al., 2011). Neutrophils and lymphocytes, two types of white blood cells (WBC), are key components early in the immune response to bacterial infection, and their numbers fluctuate in response to stress and bacterial pathogens (Parkin and Cohen, 2001). We hypothesize that selective treatment based on abnormal WBC counts would result in significantly reduced morbidity rates compared to cattle that did not receive antibiotics, and that selective treatment based on WBC count would also lessen overall antibiotic usage compared to metaphylaxis, without significant increases in morbidity or mortality rates. Two studies were conducted in which cattle were randomly assigned to treatments. The control treatment (CON) was not treated with antibiotics, the second treatment was treated metaphylactically (MET), and the third treatment was selectively treated based on WBC count (SEL). In the first study, no significant treatment differences were observed; the study was considered inconclusive due to low BRD incidence. In the second study, cattle with a rectal temperature >39.4°C at processing were given antibiotics (FEV). No treatment differences were observed, supporting the theory that selective antibiotic treatment may reduce antibiotic usage. Seasonal differences were significant, with the highest morbidity observed in the fall.

ACKNOWLEDGEMENTS

To my academic advisor and research primary investigator, Dr. Steven Rust, thank you for the opportunity to be involved in this research project, have direct contact with industry partners, and grow academically over the past two years. This has been an immense learning experience. To my committee members, Dr. Paul Bartlett, Dr. Bo Norby, and Dr. Andrew Huff, I appreciate the time and guidance you invested in me throughout this program. It has been a pleasure to work with a committee that represents such diverse components of animal health. To Tristan Foster and the management team at the MSU Beef Cattle Teaching & Research Center (BCTRC), thank you for accommodating our research, answering my numerous questions, and helping process cattle no matter the time or weather. Special thanks to Dr. Dan Buskirk for connecting me with this research opportunity, for coordinating operations at BCTRC, and for your mentorship throughout this program. To the members of the Animal Science Graduate Student Association, thank you for being my adopted lab group, social outlet, and go-to people for any and all department-related questions. This research would not have been possible without the assistance of Madison Sokacz, Tori Mires, and Lainey Selkirk. Thank you for the hours of work, endless laughs, and dedication to this research process. I can’t wait to see the big things you all do in your own careers.
Finally, to my family and closest friends, I can’t thank you enough for the constant support and confidence throughout this graduate school experience. I never would have made it here without you.

TABLE OF CONTENTS

LIST OF TABLES ........ vii
LIST OF FIGURES ........ viii
KEY TO ABBREVIATIONS ........ ix
INTRODUCTION ........ 1
CHAPTER 1: Literature Review ........ 3
    Management Decisions ........ 4
    Cattle Characteristics ........ 13
    Symptoms ........ 15
    Significance of Bovine Respiratory Disease ........ 17
    Bacterial Components of Bovine Respiratory Disease ........ 22
    Immunology Overview ........ 27
    Emerging Technologies ........ 30
    Summary ........ 37
CHAPTER 2: Neutrophil and lymphocyte enumeration for early detection of bovine respiratory disease ........ 38
    Abstract ........ 39
    Key Words ........ 39
    Introduction ........ 40
    Materials and Methods ........ 41
    Results and Discussion ........ 45
APPENDIX ........ 48
CHAPTER 3: Comparison of selective treatment based on white blood cell counts to metaphylaxis to treat bovine respiratory disease and reduce antibiotic usage in feedlots ........ 57
    Abstract ........ 58
    Key Words ........ 59
    Introduction ........ 59
    Materials and Methods ........ 65
    Results ........ 69
    Discussion ........ 73
    Summary ........ 77
APPENDIX ........ 78
CHAPTER 4: Interpretive Summary ........ 87
LITERATURE CITED ........ 94

LIST OF TABLES

Table 1.1: Feed analysis and dry matter (DM) basis of feed used d 1-56 in groups 2 and 3 ........ 50
Table 1.2: Correlation of biological measures with health ........ 54
Table 1.3: Treatment and gender effect on health, growth, and efficiency ........ 55
Table 2.1: Comparison of rectal temperature at processing and subsequent morbidity ........ 79
Table 2.2: Comparison of neutrophil count and subsequent morbidity ........ 80
Table 2.3: Comparison of lymphocyte count and subsequent morbidity ........ 81
Table 2.4: Correlation of biological measures and health based on antibiotic treatment ........ 83
Table 2.5: Effect of processing treatment on health management ........ 84
Table 2.6: Effects of season on white blood cell counts and health management ........ 85
Table 2.7: Effects of white blood cell counts and metaphylactic treatment on morbidity and mortality ........ 86

LIST OF FIGURES

Figure 1.1: Decision-making tree for treatment assignment used in initial processing ........ 49
Figure 1.2: Initial rectal temperature and correlation with morbidity ........ 51
Figure 1.3: Initial neutrophil count and correlation with morbidity ........ 52
Figure 1.4: Initial lymphocyte count and correlation with morbidity ........ 53
Figure 1.5: Percent of cattle given antibiotics at processing per treatment ........ 56
Figure 2.1: Comparison of rectal temperature at processing and subsequent morbidity ........ 79
Figure 2.2: Comparison of neutrophil count and subsequent morbidity ........ 80
Figure 2.3: Comparison of lymphocyte count and subsequent morbidity ........ 81
Figure 2.4: Purchase weight ........ 82

KEY TO ABBREVIATIONS

AAD - Advanced Animal Diagnostics
Abn-A - Abnormal WBC count, treated with antibiotics at processing
Abn-NA - Abnormal WBC count, not treated with antibiotics at processing
ADG - Average Daily Gains
ANOVA - Analysis of Variance
ARG - Antibiotic Resistant Gene
ASL - Air-Surface Liquid
BAM - Bronchoalveolar Macrophages
BCTRC - Beef Cattle Teaching & Research Center (affiliated with Michigan State University)
BHV-1 - Bovine Herpesvirus-1
BRD - Bovine Respiratory Disease
BRSV - Bovine Respiratory Syncytial Virus
BVDV - Bovine Viral Diarrhea Virus
CON - Control treatment
DM - Dry Matter
DMI - Dry Matter Intake
FEV - Treatment assigned due to fever at processing
GR - Glucocorticoid Receptor
LPS - Lipooligosaccharides
MET - Metaphylaxis treatment
MLV - Modified Live Virus
NET - Neutrophil Extracellular Traps
NSAID - Nonsteroidal Anti-inflammatory Drug
Norm-A - Normal WBC count, treated with antibiotics at processing
Norm-NA - Normal WBC count, not treated with antibiotics at processing
PAMP - Pathogen Associated Molecular Patterns
PI-3 - Parainfluenza Virus 3
PRR - Pattern Recognition Receptors
PVP - Process Verified Programs
RFID - Radio-Frequency Identification
ROI - Return on Investment
SEL - Selective treatment based on white blood cell count
WBC - White Blood Cell

INTRODUCTION

On an annual basis, bovine respiratory disease (BRD) costs the industry an estimated $1 billion through treatment costs, decreased performance and feed efficiency, increased days on feed, decreased carcass value, and mortality (Fulton, 2002; Hodgins, 2004). The lack of improvement in industry BRD rates in the United States (US) over the last 40 years, and the annual financial loss associated with BRD, indicates the need for a more precise tool for predicting BRD susceptibility (Wilson et al., 2017). Increased public interest in livestock healthcare practices and the spread of antimicrobial resistance have also contributed to interest in improving the efficacy of BRD treatment on feedlots. As public pressure on agricultural decision-making systems continues to rise, it will become increasingly important for livestock producers to have management practices in place which are based on quantifiable research findings and which emphasize judicious usage of antibiotics. This study aims to address these issues by testing the strength of correlation between white blood cell (WBC) count and morbidity due to BRD, as well as the real-world application potential of selective antibiotic treatment based on WBC count at processing. Individual WBC counts were obtained chute-side during processing of cattle via a new technology developed by Advanced Animal Diagnostics (AAD) called the QScout. The QScout scans a drop of blood drawn from each animal to determine each individual’s differential WBC count, which can then be classified as “normal” or “abnormal” based on expected profiles determined in previous studies (Roland et al., 2014).
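As a conceptual illustration of this screening step, the sketch below shows what a rule-based normal/abnormal flag on a differential count might look like. The reference intervals, function names, and input values are hypothetical placeholders for illustration only; they are not the proprietary reference profiles used by the QScout.

```python
# Illustrative only: a rule-based "normal"/"abnormal" flag on a chute-side WBC
# differential. Thresholds below are hypothetical placeholders, not the QScout's
# reference profiles (Roland et al., 2014).

from dataclasses import dataclass

@dataclass
class Differential:
    neutrophils: float  # cells x 10^3 per uL
    lymphocytes: float  # cells x 10^3 per uL

# Hypothetical reference intervals (low, high), cells x 10^3 per uL
REFERENCE = {
    "neutrophils": (0.6, 4.0),
    "lymphocytes": (2.5, 7.5),
}

def classify(diff: Differential) -> str:
    """Flag the animal as 'abnormal' if any cell line falls outside its interval."""
    in_range = (
        REFERENCE["neutrophils"][0] <= diff.neutrophils <= REFERENCE["neutrophils"][1]
        and REFERENCE["lymphocytes"][0] <= diff.lymphocytes <= REFERENCE["lymphocytes"][1]
    )
    return "normal" if in_range else "abnormal"

print(classify(Differential(neutrophils=5.2, lymphocytes=3.1)))  # -> "abnormal"
```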
Cattle with “normal” counts are considered adequately healthy and less susceptible to BRD pathogens, while cattle with “abnormal” counts are considered highest risk and are thought to experience greater benefit from antibiotic treatment. By focusing on management decisions and individual host susceptibility, our objective is to determine if WBC differential counts are an effective predictor of BRD susceptibility and of therapeutic success. We hypothesize that utilizing WBC count as a primary screening factor for antibiotic administration at the time of cattle processing will result in a more effective and judicious use of antibiotics, as reflected in BRD-related morbidity and mortality rates. Potential implications of these findings include a decreased BRD-related financial burden on producers, proactive action against the development of antimicrobial resistance through targeted antibiotic usage, and the introduction of a quantifiable decision-making system for earlier BRD diagnosis across the feedlot industry.

CHAPTER 1: Literature Review

Bovine respiratory disease (BRD) is a multifactorial bacterial disease complex which affects the respiratory tract of cattle and impairs lung function, resulting in decreased animal productivity and welfare. Severe or chronic cases of BRD can also result in death. The BRD complex encompasses multiple types of respiratory impairment including shipping fever, enzootic calf pneumonia, acute interstitial pneumonias, and metastatic pneumonia (Callan and Garry, 2002). Historically, BRD has affected 16.1% of all beef cattle and caused 75% of all feedlot morbidity in the United States (US), making it the most widespread cattle health issue in the nation (Wilson et al., 2017). Due to the severe and widespread nature of the disease, farmers, scientists, and veterinarians are dedicated to finding ways to reduce the prevalence of BRD. Despite extensive research efforts focused on BRD over the last 40 years, little definitive progress has been made in terms of effective prevention, management, or treatment of BRD (Currin, 2009; Wolfger et al., 2015). This is particularly true for high-concentration beef operations in the US (Wilson et al., 2017). The disease is particularly hard to control because the effects of management decisions, environmental conditions, host susceptibility, and bacterial agent pathogenesis vary greatly on a case-by-case basis (Taylor et al., 2010). Antibiotic usage associated with BRD is also cause for concern. In 2016, 43% of the domestic sales of medically important antibiotics prescribed for use in food animal production were used on cattle operations (FDA, 2017). Improvements to the management and prevention of BRD could substantially affect the finances of the US beef industry and have global implications for human and animal healthcare (Avra et al., 2017). Management considerations, US beef industry structure, pathogenesis, antibiotic usage, and financial implications of BRD are all factors to bear in mind when conducting BRD-related research and will be discussed in the following literature review.

Management Decisions

The most important component of managing the prevalence of BRD on a feedlot is to minimize the stress cattle experience. Stress impairs the immune system’s ability to effectively fight invasive bacteria by increasing cortisol levels in the body, resulting in reduced efficacy of the adaptive immune response (Edwards, 2010; Wilson et al., 2017).
If stress is prolonged early in life, then cattle are more likely to develop serious or chronic conditions because the immune system is unable to develop fully or respond to pathogens effectively (Neibergs et al., 2014). Management practices can be modified to reduce stress levels in cattle and reduce future susceptibility to chronic conditions (Mosier, 2015). Key management practices to consider include preconditioning, weaning and castration, vaccination and deworming, handling and shipment, commingling, and access to appropriate nutrients (Currin, 2009). These factors typically act in combination with each other and have a synergistic effect (Callan and Garry, 2002). Although processing costs represent only 2-6% of expected total production costs, early management and processing decisions can affect animal performance, health, and overall profitability, making these choices directly related to the financial status of an operation (Griffin, 1997). The following section provides an overview of these stressful events feedlot cattle typically experience early in life, as well as key factors to consider when transitioning cattle to a feedlot.

Preconditioning

Preconditioning is an industry term used to describe the phase of life between weaning and shipment to another facility such as a feedlot, backgrounding facility, or pasture-based finishing system. The overarching goal of the preconditioning period is to minimize stress and enhance immune system strength. Minimizing stress levels early in life allows time for the immune system to develop, enhances feedlot performance, and reduces morbidity throughout the growing phase (Arthington et al., 2008). Although this can be achieved through various combinations of management decisions, preconditioning programs most often focus on preventative health protocols and nutritional programs over at least a 45-day period post-weaning (Laborie, 2018). Preventative health management through preconditioning programs is a multi-faceted effort and is considered value-added for producers. The additional capital invested in calves prior to sale and shipment results in better performance in feedlots compared to cattle which are not preconditioned (Arthington et al., 2008; Wilson et al., 2017). The hands-on portion of preconditioning can often be executed in one day, can be verified through USDA process verified programs (PVPs), enhances the cost-effectiveness of processing, and typically results in higher sale prices for feeder cattle. Two common foci of preconditioning protocols are to vaccinate effectively against common diseases and increase weight. Ideally, the method used for weight gain during preconditioning should reflect the operation the calves are destined for. For example, if a group is going to be raised in a feedlot, it is best to allow them to adapt to feed bunks and water troughs. However, if they are going to be backgrounded or finished on pasture, it makes sense to feed grass hay or pasture during preconditioning (Shelley and Matney, 2016). Optimal average daily gain over a 45-day preconditioning program should be approximately 0.68 kg/d. This rate allows for continuous growth without calves becoming overly filled-out, an important consideration because numerous studies have shown that preconditioned calves with too high a body condition score actually receive discounted sale prices (Thrift and Thrift, 2011).
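As a quick worked example of that target (illustrative arithmetic only, based on the figures above):

\[
\text{Expected preconditioning gain} \approx 0.68\ \tfrac{\text{kg}}{\text{d}} \times 45\ \text{d} \approx 30.6\ \text{kg}
\]

A calf gaining much more than this over the same window is the kind of overly filled-out animal that tends to be discounted at sale.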
In addition to strategic weight gain, preconditioning programs should include a scheduled sequence of vaccinations to help the immune system develop. One recommended vaccination protocol is a two-part injection series of clostridium and viral vaccines prior to shipping to allow proper immune system development (Wilson et al., 2017). Cattle vaccinated with a modified live virus (MLV) at approximately 7 weeks and again 3 weeks before shipment were associated with lower morbidity rates than those treated with killed viral vaccines much closer to the date of shipment (Fulton et al., 2002). A second recommended vaccination protocol involves injections to combat viral agents such as Bovine Herpesvirus 1 (BHV-1), Parainfluenza Virus 3 (PI-3), Bovine Viral Diarrhea Virus (BVDV), and Bovine Respiratory Syncytial Virus (BRSV), as well as bacterial pathogens such as Mannheimia haemolytica, Pasteurella multocida, and Histophilus somni. These pathogens are associated with increased susceptibility to BRD (Fulton et al., 2000). Time is a key element of immune system development and proper vaccination schedules. In most cases it takes at least 1 to 3 weeks after vaccination for immunity to develop, and some vaccines may also require booster doses given at a later date (Urban-Chmiel and Grooms, 2012). Consultation with a veterinarian is recommended to ensure proper vaccination schedules are observed to help cattle develop immunity to these viruses and bacteria as effectively as possible (Laborie, 2018). Much like vaccination protocols, a proper de-worming schedule can have a substantial impact on the health of calves early in the growing phase (Gould, 2011). High parasite load is also associated with lower quality carcass traits (Clark et al., 2015). For young calves, de-worming treatment should begin at three to four months of age and be repeated at weaning to effectively minimize the impact parasites have on their growth and immune status (Drovers, 2013). The main classes of de-wormers include avermectins, milbemycins, and benzimidazoles, which are often available in pour-on or injectable forms. Overall, immune status and growth rates often go hand in hand throughout the preconditioning period. Incidence of BRD was associated with decreased growth during preconditioning (P <.001) for heifers with an average body weight of 241.3 ± 16.6 kg observed over a 63-day preconditioning period in Oklahoma in 2007 (Holland et al., 2010). Not only does a well-rounded preconditioning program help prevent morbidity incidence, it often provides an opportunity for producers to generate additional income. A summary of multiple studies found that buyers were willing to pay a premium of $1.43 to $6.15/45.4 kg for calves that were preconditioned compared to non-preconditioned calves (Thrift and Thrift, 2011). This value may rise as breeders develop a positive reputation amongst buyers (Shelley and Matney, 2016). Thus, investments into preventative animal health, including vaccination and de-worming, tend to have a favorable return on investment in terms of animal immune status and future performance.

Weaning & Castration

Traditionally, calves are weaned at 150-210 days of age; however, as genetic improvements progress across the industry, calves are growing at faster rates, meaning producers are often able to wean closer to 120-160 days of age. The weaning process is inherently stressful, particularly for young, light-weight calves, and this stress is reflected by elevated cortisol levels in the bloodstream following weaning.
Elevated cortisol levels impair the ability of the immune system to respond effectively to antigens and increase the likelihood of respiratory-related illness (Avra et al., 2017). Weaning is also associated with a decline in overall immune function, with symptoms including a fever lasting for 2 days post-weaning and a decrease in phagocytic function for 7 days after weaning (Lynch et al., 2010). Calves weaned prior to shipment exhibit lower rates of BRD morbidity compared to calves weaned and loaded directly on the truck. Calves weaned prior to shipment have more time to recover from the stress before adjusting to unfamiliar surroundings (Arthington et al., 2008; Wilson et al., 2017). Allowing calves to remain on the site of origin for 45 days after weaning was associated with a significantly lower rate of BRD-related morbidity (P <.001) throughout the first 42 days after arrival into a feedlot (Step et al., 2008). Preconditioned calves have been shown to have improved average daily gains (ADG; P <.01), improved rates of consumption of concentrate (P <.03), and improved efficiency of gain (P <.01) compared to those shipped on the day of weaning (Arthington et al., 2008). Accordingly, there is incentive for buyers to purchase calves which are weaned 30 to 70 days prior to shipment, even if the initial purchase price is higher. In addition to weaning, castration is an inherently stressful experience for calves which can affect future feedlot health and performance. Increased incidence of BRD in steers compared to heifers (P <.05) was attributed to stress induced by castration of bulls (Snowder et al., 2006; Taylor et al., 2010). Castration of bulls is necessary because it helps ensure farm safety and improves food quality traits. Ideally, castration occurs between birth and 4 months of age to allow recovery prior to weaning. There is financial incentive for producers to castrate during this time frame, because bulls castrated in advance of weaning have lower morbidity rates and improved overall performance compared to bulls castrated and weaned at the same time (Hilton, 2009). These benefits to performance are well known across the industry, and buyers are typically willing to pay a premium for bulls castrated prior to weaning and shipment.

Handling and Shipment

Good animal husbandry practices are essential to effectively minimize the stress cattle experience during handling and shipment. Cattle should always be handled in a calm and quiet environment, avoiding excessive loud noises, overcrowding, and the use of cattle prods. Sorting systems with good footing and solid-sided, curved walls also reduce stress and minimize risk of injury (Grandin, 1994; Edwards, 2010). These factors should be considered during on-farm handling, while preparing for shipment, and during the shipment process. Shipment is the most universally accepted risk factor for BRD but is often a necessary element of beef production. In the US, cow-calf operations are typically established on marginal lands that are unsuitable for crop production, while finishing operations are frequently located in proximity to the “corn belt.” Cattle therefore often have to travel long distances from cow-calf units or backgrounding operations to feedlots (Taylor et al., 2010). This means a large majority of cattle must be shipped long distances at least once early in life, causing significant stress (Cernicchiaro et al., 2012).
The initial phases of shipment, including animal handling, truck loading, and the first 150 miles of transport in particular, have been shown to temporarily increase stress-related hormone levels in animals and impair immune response efficacy (Odore et al., 2004; Taylor et al., 2010). Neutrophilia has also been observed in response to shipment, although cortisol levels and increased neutrophil counts are not strongly correlated (Buckham Sporer et al., 2007). Length of shipment has also been directly correlated to shrink, the industry term for a decline in animal bodyweight due to elevated stress levels, loss of gut fill, and loss of bodily fluids (Taylor et al., 2010; Cernicchiaro et al., 2012). Across the industry, a shrink rate of 3-4% is considered normal and acceptable (Wilson et al., 2017). It is generally recommended that cattle be allowed to rest for 12 to 48 hours in a pen separate from the existing herd after arrival to allow time to regain gut fill and nutritional status before being processed and commingled. The extent of shrink and shipment stress can also be affected by weather, particularly excessive heat or changing environmental conditions. Adverse weather conditions can directly affect cattle health and decrease the immune system’s ability to fight infectious bacterial agents. Seasonality effects during shipping have been shown to influence BRD prevalence, with many studies indicating that cattle shipped in the late summer and fall are at elevated risk of excessive shrink and development of BRD. While this trend is true across much of the cattle industry, it is difficult to determine the exact relationship between the weather, human effects, and market conditions which contribute to higher morbidity rates (Taylor et al., 2010).

Facilities Conditions

Factors such as facility maintenance and airflow can also affect the prevalence of BRD in feedlots. Ambient airborne dust particles, particle size, and the clearance ability of cattle have all been implicated in BRD prevalence in feedlots. Elevated airborne particles in housing facilities can impair an animal’s ability to effectively clear debris and pathogens from the respiratory tract, increasing opportunity for bacteria to colonize lower in the respiratory tract and cause cellular damage (MacVean et al., 1986). Although most bacteria in the air are not pathogenic, dead bacteria or bacteria which fail to thrive in the respiratory pathway pose a burden to the respiratory tract’s clearance ability and inadvertently assist BRD-inducing bacteria (Wathes et al., 1983; Nordlund, 2006). Proper year-round ventilation is key to minimizing the number of BRD cases due to housing conditions. Bacterial density, ambient temperature management, humidity, and carbon dioxide levels are all dependent on proper ventilation and can affect overall animal health (Nordlund, 2006). Methods to manage ventilation are largely influenced by geographical location, as well as operation size. Consideration of housing cattle in barns versus open lots, weather conditions, facilities design, and directional positioning of pens based on natural airflow are key to ensuring proper airflow throughout the operation. Stocking density and overall cleanliness also impact herd health. Overcrowding cattle induces stress and reduces immune system responsiveness, while simultaneously increasing total airborne bacterial count (Gay and Barnouin, 2009).
Survivability of viral respiratory pathogens in the external environment is typically limited to minutes or hours, while bacteria tend to survive longer, particularly in wet conditions with abundant organic material (Callan and Garry, 2002). Thus, providing cattle ample space and bedding allows them to stay cool and dry, reducing moisture and organic material levels and effectively reducing environmental pathogen survivability.

Commingling

Commingling is another major source of stress and contributes to BRD development because it disrupts herd social hierarchy and often exposes cattle to foreign pathogens. Young cattle have had less time to develop a robust arsenal of adaptive immune responses and are at increased risk of being affected by foreign pathogens. When compared to single-source cattle, cohorts that were commingled had significantly greater morbidity rates and incurred greater health-related expenses during the growing period (Step et al., 2008). A simple step feedlots can take to reduce commingling stress is to purchase directly from a producer, rather than going through a sale barn or online auction. When purchasing directly, buyers are more likely to have access to health records and can minimize commingling and exposure to foreign pathogens. However, it is still important to vaccinate herds that have never been exposed to unfamiliar cattle, because naïve cattle are at elevated risk for a severe outbreak if inadvertently exposed to an unfamiliar pathogen. If direct purchase is not an option, or new cattle are being added to an existing herd, steps can still be taken to minimize commingling as much as possible. Cattle should be quarantined upon arrival to the feedlot or farm, and the herd should be maintained under a uniform and documented vaccination protocol.

Bunk Acclimation and Nutrition

Acclimation to bunk feed systems and community water sources prior to feedlot arrival also helps reduce herd stress. Cattle are more likely to return to optimal nutritional levels after experiencing stress from shipment if they are familiar with a feed bunk system. This is important in preventing BRD because timely return to normal nutritional status after shipping-induced fasting is associated with resumption of normal immune system function (Galyean et al., 1999). Another important element of preventing BRD is maintaining normal nutritional status and immune system function via proper ration balancing. The ration offered to cattle upon arrival to the feedlot can be vital to productivity and health. If given a ration mix that is too “hot,” or energy-dense, ruminal pH may drop to acidic levels, killing some species of ruminal bacteria and disrupting normal digestive processes. This can lead to acidosis, a condition which damages the lining of the digestive tract and prevents cattle from absorbing adequate nutrients and water.
The key to the management decision-making process for BRD minimization is to consider each factor critically, keep detailed operational records, and focus on improving management strategies rather than relying on medical interventions. Cattle Characteristics Susceptibility to BRD is influenced on an individual basis by genetics, age, arrival weight and nutrition status, pathogen exposure history, and existing immune challenges (Avra et al., 2017). While it can affect animals at any age, BRD is typically observed in young, stressed calves in domestic agricultural settings. Incidence is greatest in cattle that weigh 318 kg or less and within 3-10 days after arrival to a new location (Wolfger et al., 2015). Clinical cases identified later in life are usually associated with subclinical or chronic infections (Taylor et al., 2010). The disease does not typically spread between species, however outbreaks of much smaller scale can be caused by similar strains of bacteria and have been observed in goats, sheep, and even the Tibetan Antelope (Yu et al., 2013). Prior exposure to pathogens and overall herd health history also affects BRD susceptibility. Prior exposure to respiratory related pathogens can be beneficial if it occurs in small doses because it activates the adaptive immune response, helping the body develop the antibodies necessary to combat more aggressive exposures. 13 Heritability There is also a variety of genetic components to BRD susceptibility. Estimates of the heritability of BRD susceptibility traditionally were considered low at h2 =.06 +/- .07 (Muggli- Cockett et al., 1992). Similarly, the heritability of resistance to BRD ranged from h2 =.04 to .08 +/- .01 when 18,112 calves representing 9 breeds were analyzed (Snowder et al., 2006). It is not clear whether these heritability estimates are influenced by the phenotypic indicators traditionally used to identify BRD in a feedlot setting (Bishop and Woolliams, 2014). However, more recent studies indicate that specific loci were associated with BRD susceptibility with local population heritability of h2 =.21. When heritability data from multiple populations were combined, total population heritability fell to h2 =.13 suggesting that requirements for BRD resistance vary between specific populations (Neibergs et al., 2014). Some studies suggest that due to passive immunity transfer, dams with high BRD resistance are more likely to raise young with poor BRD resistance. It is hypothesized that dams with strong immune systems impart similarly robust passive immunity to their calves. While this is beneficial when offspring are young, it may delay calves’ development of adaptive immunity, resulting in susceptibility to sickness when exposed to foreign pathogens at later in life (Snowder et al., 2006). Little difference in BRD susceptibility has been observed between breeds, aside from Herefords who are generally considered more slightly susceptible compared to other common North American breeds (P <.05; Snowder et al., 2006). Alternatively, animal temperament is a heritable trait which has been shown to affect ADG, meat quality, and BRD related morbidity rates (Voisinet et al., 1997). Significant correlations exist between high stress animals, elevated cortisol, hematological variables, decreased animal ADG, and increased morbidity rates. This 14 indicates an incentive for producers to select for animals with traits such as relaxed temperament and slow flight times (Fell et al., 1999). 
Cattle that were aggressive and stressed easily were worth $62.19 less per head than docile cattle due to decreased ADG and carcass quality. Temperament was also directly correlated to animal health, farm safety, and profitability per head. These factors should be considered seriously when breeding future generations of beef cattle (Haskell et al., 2014). Genetic differences in reactivity to vaccination, as determined by neutrophil and lymphocyte counts pre versus post vaccination, were also observed (r = 0.73 ± 0.08 and 0.67 ± 0.06, respectively; P <.001). Genetic correlations between total WBC, neutrophil, and lymphocyte counts were also significant (P <.001). Changes in lymphocyte counts after vaccination were correlated with health records related to BRD (P <.01). Lymphocyte response was associated with a heritability of h2 > .41 (Leach et al., 2013). Additional research is necessary to determine the exact genetic mechanisms that affect response to vaccination. However, these initial findings suggest that strength of immune response is strongly heritable and may be a useful genetic selection tool to reduce animal health issues.

Symptoms

Currently, identification of animals with clinical BRD is dependent on visual assessments made by farm workers who examine pens once or twice daily. During these visual assessments, workers check for animals with external symptoms of sickness. The most common visual symptoms of BRD include fever, lack of appetite, lethargy and self-imposed isolation from the herd, nasal discharge, repetitive coughing or wheezing, labored breathing, and drooping eyes, head, and ears (Snowder et al., 2006; Gay and Barnouin, 2009; Neibergs et al., 2014). If these symptoms are observed, animals are pulled from the pen and brought to a handling facility for assessment.
The relationship between commensal bacteria and the host shifts to parasitic when colonization of the lower respiratory tract begins and impairs respiratory efficiency of the host. To combat the costly and often unpredictable development of BRD, producers who handle highly stressed animals and struggle with BRD commonly utilize a treatment strategy known as metaphylaxis. 16 Metaphylaxis is defined as the mass medication of a group of animals to eliminate or minimize an expected outbreak of disease (Edwards, 2010; Urban-Chmiel and Grooms, 2012). Generally, in assessing BRD outbreak potential, producers first determine the level of risk each group of cattle represents. Groups designated as “high risk” based on health and management history are treated with additional vaccines or antibiotics which specifically target BRD pathogens. Mass medication at the pen level is often recommended if 10% of a single pen is treated for 2 to 3 days in a row, or more than 25% are treated in one day (Edwards, 2010). Current estimates in the beef industry indicate that only one fifth of animals treated using metaphylaxis are at high risk for developing BRD. This suggests that successful targeted screening protocols could reduce mass antibiotic treatment by 80% (Maday, 2018). Significance of Bovine Respiratory Disease This bacterial disease complex has varied importance at the local, national, and global level. The prevalence of BRD in the feedlot industry affects the financial status of the agriculture industry damages public perception of agriculture, and has serious implications for global health. Industry Concerns The two most important concerns in the beef production industry are animal well-being and profitability. First and foremost, BRD is an issue of animal well-being. This disease spreads easily in the intensive management system common in the US and contributes to animal suffering. Producers are charged with the care and well-being of their herd, and should strive to do everything ethically feasible to prevent this type of outbreak on their operation. One major challenge in addressing BRD associated well-being issues and financial losses is the prevalence of subclinical cases. Subclinical cases of BRD are difficult to identify prior to 17 harvest, but contribute greatly to the spread of harmful bacterial pathogens and impair herd productivity. Nearly 65% of all lungs analyzed post-mortem have shown signs of BRD challenge as indicated by lung consolidation, fibrin tissue in lung, hyperinflation, and lung lesions. Only 50% of those animals were identified as sick based on clinical symptoms while alive (Kiser et al., 2017). This indicates that a substantial undetected portion of the national herd is underperforming, spreading bacterial pathogens, and experiencing chronic subclinical BRD challenge. This reduces industry productivity and increases production costs without farmers realizing it. This observation confirmed findings from 1997 when it was discovered that cattle with lung lesions were associated with a 0.03-0.06 kg/day lower ADG compared to those with healthy lungs (Griffin, 1997; Bryant et al., 1999). Subclinical cases of BRD are particularly difficult to diagnose early on in the infection process because cattle are prey animals and instinctively mask symptoms for as long as possible as a self-preservation measure (Griffin, 2014). 
The challenge of detecting and treating BRD rapidly and effectively is exacerbated by the fact that bacterial pathogens adapt over time, constantly changing the efficacy of treatment. This impairs producers’ and scientists’ abilities to compare historical trends with current-day BRD outbreaks. Our current inability to rapidly identify specific strains of bacteria also limits treatment efficacy. Unfortunately, these challenges have been amplified by the “cocked syringe” mentality that previously dominated the industry. The old saying “if you start treatment early enough they will respond to any antibiotic” is not accurate and may have contributed to the spread of resistant bacteria (Griffin, 1997). Although the industry has made great strides in better understanding antibiotic stewardship, this antiquated mentality still exists in some pockets of the industry and must be addressed. In terms of financial ramifications for the beef industry, BRD is an extremely costly disease complex. In the North American market, BRD costs an estimated $1 billion annually (Griffin, 1997; Fulton et al., 2002; Snyder et al., 2017). Unfortunately, despite a rise in understanding of the pathogenesis of BRD and gradual progress in preventative and treatment options, little improvement in BRD rates has been observed over the last 40 years (Currin, 2009; Wolfger et al., 2015). This is a clear indicator that current management methods and associated technology require further research and improvement. Finding solutions will be increasingly vital, as beef demand continues to rise and US cattle numbers alone are projected to grow from 94.4 million to 96 million head by 2023 to accommodate it (Westcott and Trostle, 2014; USDA, 2018b). Costs associated with BRD accumulate in the form of medical expenses for preventative and clinical treatment, labor required to monitor animals and provide treatment, decreased animal performance, and lower quality animal products. Cattle treated multiple times were less likely to finish with their cohort, incurred higher treatment costs during the growing phase, and were associated with lower net value overall (Fulton et al., 2002; Avra et al., 2017). The average cost per treatment is currently estimated at $23.60, and calves treated two or more times after arrival to the feedlot exhibited lower ADG, had lower marbling scores, and recorded a net loss at time of sale (USDA, 2013). Exact values vary depending on market prices for the year, but in 2009 it was estimated that declines in performance and carcass merit resulted in a loss of total carcass value of $23.23 if an animal was treated once with antibiotics to treat BRD, $30.15 if treated twice, and $54.01 if treated three times (Schneider et al., 2009). Total losses per head were estimated at $40.46 if treated once, $58.35 if treated twice, and $291.93 per head if treated 3 or more times (Fulton et al., 2002).

Global health concerns

Any opportunity to reduce antibiotic usage in modern medicine should be considered seriously because of the natural survival strategies bacteria utilize. When faced with a bactericide, most bacteria will be killed, but a small number of resistant bacteria often survive and multiply. As the resistant bacteria grow in numbers, the efficacy of the bactericide declines (Snyder et al., 2017).
This resistance can be achieved when bacteria develop protective functions such as the ability to neutralize antibiotics, rapidly pump the antibiotic away from bacterial cells, or change the conformation of binding sites on their cell walls (Cunha, 2017). This is a fact of natural life, but the widespread usage of antibiotics in modern medicine has sped up the rise of antibiotic-resistant bacteria worldwide. To slow this process, unnecessary or inappropriate usage of antibiotics should be avoided at all times in both animal and human medicine (Hoelzer et al., 2017). This is a major driving force for current research surrounding early detection and targeted treatment of BRD. Scientists and pharmaceutical companies involved in the development and distribution of antibiotics used to treat BRD have a particularly high interest in antibiotic resistance. The time and resources required to develop new antibiotics to combat BRD are extensive, and the process is difficult and extremely costly. Thus, increased resistance to antibiotics is a concern from both a public health and a financial perspective. The longer antibiotics are efficacious in the field, the more time pharmaceutical companies can devote to the development of other products with long-lasting efficacy for human and animal healthcare. In the last few decades, only two new types of antibiotics have been developed and made available for general use. Even if recent government incentive programs to develop new antibiotic classes are successful, it may take a decade or more for those products to become available on the market, and they will likely only be available in human medicine. The limitation on new classes of antibiotics available for use in animal medicine is especially important for beef and dairy producers. In 2016, 43% of the domestic sales of medically important and 55% of non-medically important antibiotics prescribed for use in food animal production were used on cattle operations (FDA, 2017).

Public perception

Improved BRD management and a reduction in the financial weight the disease complex holds over the industry bode well for consumers as well as producers. Any time a feedlot can reduce production costs without damaging food safety and quality or risking animal health, the result is a mutually beneficial scenario. Reduced production costs can translate to lower prices in grocery stores, increased capital for producers to improve production practices, greater product demand, and overall stability in the marketplace. Over the last decade or so, there has also been a significant increase in social pressure on producers to decrease their usage of antibiotics in the food animal system due to the fear of antibiotic residues. It is important to remember that products raised in an agricultural setting and processed in state- and government-inspected facilities are routinely tested for antibiotic residues, and processes to improve and ensure food safety and inspection are updated annually by the federal government (USDA, 2018). Thus, the public perception that meat or milk products are contaminated with residues from antibiotics is inaccurate.
To combat this form of resistant transmission, discovery of mechanisms of antibiotic resistant gene (ARG) transfer are a growing field of research (Sun et al., 2017). To aid in this effort, producers must be diligent about keeping records on antibiotic usage, strive to reduce cross contamination between farms, and maintain high sanitation standards. While there is a legitimate cause for concern surrounding antibiotic resistance on a global scale, there are also health, production and welfare concerns associated with decreasing usage. Outright removal of antibiotic treatment in animal agriculture without alternative management and health strategies in place is irresponsible and could contribute to the development of serious disease outbreaks. Consideration of the complex interaction between industry concerns, global health issues, and societal pressures will be key to the future of antibiotic usage in animal production. Ultimately, producers and scientists must strive to establish a balance between judicious and efficacious usage of antibiotics in food animals, while protecting the well-being of animals and ensuring food safety. Bacterial Components of Bovine Respiratory Disease Bacteria which cause the development of BRD often act opportunistically after viruses create conditions which drain the body’s immune capabilities. Common viral agents identified as precursors to BRD include Bovine Herpesvirus 1 (BHV-1), Parainfluenza Virus 3 (PI-3), Bovine Viral Diarrhea Virus (BVDV), and Bovine Respiratory Syncytial Virus (BRSV; Fulton et al., 2000; Toaff-Rosenstein et al., 2016). These viruses impair the efficacy of the immune system by 22 eliciting a heightened immune response, causing cellular death, and taxing the body’s ability to fight additional foreign pathogens. This creates an opportunity for commensal bacteria to rapidly replicate deeper in the respiratory tract, leading to the development of BRD. Mannheimia haemolytica, Pasteurella multocida, Histophilus somni, and Mycoplasma bovis are bacterial agents most commonly associated with BRD (Ellis, 2001; Callan and Garry, 2002). A study conducted in Oklahoma found that 92% of herds involved in an animal health feedlot study tested positive for one or more of M. haemolytica, P. multocida, and H. somni upon arrival to the feed yard, confirming the suspected endemic presence of those bacteria across the US industry. However, presence of these bacteria was not predictive of illness, indicating that positive nasal swab results at processing do not necessarily indicate that antibiotic treatment is necessary (Fulton et al., 2002). Antibiotics typically used to treat BRD include florfenicol, tulathromycin, tildipirosin, gamithromycin, danofloxacin, ceftiofur, enrofloxacin, and tilmicosin (Snyder et al., 2017). Mannheimia haemolytica Mannheimia haemolytica is a Gram-negative, anaerobic, non-spore-forming and non- motile bacteria (Rice et al., 2008). It is one of the most common bacterial agents associated with BRD in the USA, causing acute hemorrhagic fibrinonecrotic pneumonia and is associated with a grayish brown or reddish black coloration in areas of lung consolidation (Campbell, 2018). Neuraminidase and neutral protease produced by the bacteria are thought to enhance colonization and adherence to the epithelium of the respiratory tract (Whiteley et al., 1992). 
Mannheimia haemolytica also produces a leukotoxin which binds primarily to CD18 adhesion molecules, resulting in the upregulation of pathways which destroy the targeted host respiratory cells. The leukotoxin causes activation of neutrophils, induces apoptosis of leukocytes, inhibits phagocytosis and targeting of bacterial cells, and upregulates cytokines such as interleukin-1 and interleukin-8, which have proinflammatory functions in body tissue and contribute to tissue damage (Hodgins et al., 2002). Interleukin-8 acts as a chemoattractant and contributes to the excessive recruitment of neutrophils to the site of infection. When associated with an acute stress response, these proinflammatory effects are amplified and can further contribute to cell death within the lungs, impairing respiratory function (Malazdrewich et al., 2004). In North America, there are two major genotypes of M. haemolytica which can be further subdivided into distinct subtypes. In 2016, only genotype 2 was associated with a higher incidence of BRD-related lung damage and had a higher prevalence of resistance to antibiotic treatment. Specific loci for resistance traits have been identified, suggesting that further genotyping may aid in the development of targeted treatment strategies tailored to the specific genotypes and subtypes of harmful bacteria (Clawson et al., 2016). In Georgia, nasal swab samples were obtained from a herd with an average weight of 229 kg, and M. haemolytica was isolated in 16% of all cattle upon arrival. The herd was swabbed again 10-14 days after arrival, and 72.8% of animals cultured positive for M. haemolytica. This increase in prevalence across the herd indicated a high transmission potential for BRD-pertinent bacteria throughout a herd (P <.001). Additionally, 98.6% of isolates from the second swab sampling were partially or completely resistant to macrolides and the fluoroquinolone enrofloxacin, and 69.4% were partially or completely resistant to florfenicol (Snyder et al., 2017). These findings confirm that the transmission of antibiotic-resistant bacteria can be rapid and rampant.

Pasteurella multocida

Pasteurella multocida is a non-motile, Gram-negative strain of bacteria associated with subacute and chronic bronchopneumonia (Dabo et al., 2007). Compared to other BRD-related bacteria, larger populations of this bacterium are typically required to induce respiratory damage (Ellis, 2001). However, if substantial numbers of P. multocida are present, infection results in rapid development of acute fibrinous bronchopneumonia or chronic suppurative bronchopneumonia, potentially leading to toxemia (Hodgins et al., 2002; Currin, 2009). There are three subspecies, five capsular serogroups, and 16 serotypes of P. multocida, with type A:3 most commonly isolated in cases of BRD. It is thought that the difference between commensal and pathogenic P. multocida is due to the ability to induce inflammatory responses through toll-like receptor (TLR) signaling (Dabo et al., 2007). Although P. multocida is often considered a less severe infectious agent compared to M. haemolytica, low antibody titers against the bacteria have been associated with decreased herd net value (P <.001). For individual animals, low antibody titers were associated with decreased net value (P <.05) and gross margin (P <.01), and high titers were associated with increased average daily gain (P <.01), indicating that strong immunity against this bacterium could have financial benefits for the industry (Fulton et al., 2000).
Histophilus somni

Histophilus somni is Gram-negative, is typically associated with bronchopneumonia, and is often isolated in cases of subacute infection. This bacterium is more common in northern regions of North America and has been linked to peracute septicemia, fibrinous pleuritis, and arthritis development (Campbell, 2018). Virulence factors of H. somni include lipooligosaccharide, iron-binding proteins, and an Fc receptor-like protein which aids its survival (Ellis, 2001). Lesions including necrotizing bronchiolitis and alveolitis are associated with H. somni. Lipooligosaccharides (LPS) found in the outer membrane of the bacteria are thought to be a key virulence factor for H. somni. The LPS causes endothelial changes in the host which trigger thrombosis in alveolar vessels, leading to further lung damage (Sylte et al., 2001). The bacterium also triggers apoptosis of neutrophils, which aids in evasion of the immune response, so the host’s response to the bacteria is rendered less effective (Hodgins et al., 2002).

Mycoplasma bovis

Mycoplasma bovis is a bacterium which was linked to BRD in the United States more recently than the three agents discussed above. It is associated with chronic bronchopneumonia characterized by caseation, coagulative necrosis, and chronic lameness (Campbell, 2018). Symptoms associated with M. bovis infection include lung lesions, impaired lung function, droopy ears, head tilt, and joint swelling. It is suspected that, rather than the cell-mediated response associated with other BRD bacteria, a specific immunoglobulin is responsible for the lung lesions observed in cattle infected with M. bovis. When isolated, M. bovis IgG1 and IgG2 promote the killing of macrophages, while IgG2 attacks the host’s neutrophils (Ellis, 2001). An additional unique characteristic of M. bovis is that up to 50% lung damage can occur in cattle before these symptoms are ever expressed externally. This makes diagnosis and treatment prior to irreversible lung damage extremely difficult. Adding to the challenge of treating M. bovis effectively, this strain of bacteria does not have the cell wall composition observed in most strains of bacteria. This makes many antibiotics commonly used to treat BRD, such as penicillin, ceftiofur crystalline free acid, or ceftiofur hydrochloride, ineffective against infection. Certain antibiotics, such as florfenicol (Nuflor) or tulathromycin, have been associated with higher treatment success. These particular antibiotics tend to be on the higher end of the price range for treatment options, meaning some producers shy away from using them (Rosenbusch et al., 2005). Because of the unique nature of M. bovis and the likelihood of relapse after treatment, early detection and prolonged treatment with veterinarian consultation are recommended if Mycoplasma infection is suspected within a herd (Currin, 2009).

Immunology Overview

Ruminants like beef cattle may be naturally predisposed to respiratory issues compared to other species due to the structure of their respiratory tract. Compared to their body size, cattle have small lungs with a long tracheobronchial tree. While this structure does not necessarily impair respiratory function compared to other animals, the large amount of surface area within the respiratory tract may lead to increased transit time for inhaled substances and contribute to particle and bacterial deposition (Ackermann et al., 2010).
Additionally, bovine lungs have many lobules, which means there is a high degree of lung segmentation, low elasticity, and more connective tissue than in other species. The lack of collateral ventilation in the lungs also means that occlusion of one bronchus causes collapse of the distal lung segment (Muller and Berg, 2011). The respiratory tract also naturally provides a unique opportunity for pathogens to gain access to the body because alveoli require free access to air. A unique defense system is necessary to protect the lungs from this direct interaction between the external and internal environments. The multiple layers of protection within the respiratory tract include initial barriers to entry for pathogens, known as mucosal and epithelial barriers. These protective barriers utilize antimicrobial mucus, cilia, and lining epithelial cells. Mucus typically contains factors such as lysozyme and immunoglobulin A, and performs a trapping and antiseptic function. Antimicrobial peptides such as defensins and cathelicidins are also involved in initial protective functions and can aid in triggering more specific immune responses (Ganz, 2003; Kościuczuk et al., 2012). Cells such as neutrophils and bronchoalveolar macrophages (BAM) are also crucial to the innate nonspecific response to infectious agents (Ellis, 2001).

The Gram-negative bacteria involved in BRD often produce lipopolysaccharides (LPS), which act as pathogen associated molecular patterns (PAMP). These PAMPs are then recognized by epithelia, alveolar macrophages, and intravascular macrophages, as well as more specialized cells in cases of acute infection (Ackermann et al., 2010). Recognition of invasive pathogens triggers a response from the second line of defense in the innate immune response, involving phagocytes such as neutrophils, macrophages, and dendritic cells. These cells identify, engulf, and destroy invasive pathogens or damaged host cells and aid in beginning the process of activation in the adaptive immune response (Rosales and Uribe-Querol, 2017). Neutrophils, a type of white blood cell, are key to this innate immune response and flow freely throughout the bloodstream to aid in rapid response to antigens (Parkin and Cohen, 2001). They have a key role in basic immune functions including apoptosis, adhesion, and inflammation and are involved in a rapid response to glucocorticoids such as cortisol. The phagocytic function of these cells can be beneficial in rapid response to infection; however, they can also cause harm to body tissue when rapid degranulation and release of proteolytic enzymes or reactive oxygen species occurs. Following stressful events, neutrophil counts have been shown to be significantly correlated with BPI (bactericidal/permeability-increasing protein), an antibacterial granule protein associated with response to Gram-negative bacterial infections (P =.02; Buckham-Sporer et al., 2007).

The adaptive immune response is activated following the initiation of the innate response to stress or infection. Adaptive immune response differs from innate immunity in that it requires both specificity and memory. This type of response utilizes white blood cells known as B and T lymphocytes, which are further categorized into classes based on function. B cells are involved in humoral immunity, which is mediated by antibodies, while T cells are involved in cell-mediated immunity. B and T lymphocytes are generated through a clonal process and are designed to respond to a single type of antigen, as determined by cell surface markers.
When the B cell receptor (BCR) or T cell receptor (TCR) complex is activated by the binding of an appropriate antigen, a signaling cascade is initiated by a transmembrane signaling complex. This in turn triggers proliferation of the appropriate lymphocyte, which is released into the bloodstream to help protect the body (Elsevier, 2019). This response is very precise; however, it can take days or weeks to develop a full adaptive immune response (Parkin and Cohen, 2001).

The stress caused by transportation has been associated with impaired lymphocyte upregulation and increased blood cortisol. The increased cortisol was associated with a downregulation of lymphocyte glucocorticoid receptor (GR) and beta-adrenergic receptor expression for the first 24 hours after shipping (Odore et al., 2004). Additionally, BRD-inducing pathogens are able to impede the adaptive immune response via altered alveolar macrophage function, induced apoptosis of immune cells, suppression of lymphocyte proliferation, or modification to the release of inflammatory mediators including cytokines (Panciera and Confer, 2010).

If an animal's immune response is too slow or ineffective, cellular damage occurs and respiratory function may be impaired. For example, bronchopneumonia is a condition in which inflammation of the bronchi impairs the airways and gaseous exchange or causes patchy consolidation and pus formation in the alveoli. Alveolitis is associated with granulocyte destruction and reticuloendothelial proliferation, while necrosis results in caseation, the coagulation of dead cells until they form soft white protein-dense structures with a cheese-like appearance throughout the lungs (Thomas, 1978). Respiratory bronchiolitis occurs when macrophages die rapidly and begin to fill the lumen of bronchioles and peribronchiolar alveoli. It has been shown that immune system vigor upon arrival to a feedlot plays a role in an individual's susceptibility to developing these conditions while at the feedlot. High levels of antibodies to viral and bacterial antigens upon arrival to feedlots were associated with lower morbidity rates, and low antibody levels were associated with higher morbidity rates (Fulton et al., 2002). Unfortunately, these antibody levels at arrival were determined via serologic tests which are conducted in a laboratory setting, making them largely inaccessible to producers. The correlation between immune system vigor at time of arrival and future health has encouraged further lines of research focused on technology which allows producers to detect BRD challenge earlier and make rapid treatment decisions.

Emerging Technologies

To effectively reduce BRD prevalence across the beef industry, emerging technologies and management strategies must be adaptable and reliable in a wide array of scenarios. Real-world application must always be considered when creating new health management tools for the beef industry. If producers can't implement new tools or strategies in functional ways on their operations, these methods will have little to no impact on the beef industry even if they were effective in pilot research settings. Perhaps the most succinct way to summarize the requirements for new management strategies or tools for farmers is that they must be rapid, reliable, and represent the prospect for return on investment (ROI; Maday, 2018).
While there is debate over how exactly to approach the issue, many scientists and producers agree that earlier detection will be key to reducing BRD prevalence in the beef industry. Currently, clinical diagnosis of BRD tends to have poor sensitivity but high specificity in American feedlots (Timsit et al., 2016). Extensive work is currently being conducted to develop technology which better detects immune system challenge or the earliest stages of BRD and allows for targeted antibiotic treatment.

Whisper

The Whisper is an electronic auscultation device designed to help diagnose BRD based on an electronic scoring of lung function. The device is shaped similarly to a traditional stethoscope and must be placed on the chest of the animal for eight seconds. The Whisper uses sound frequency to determine lung condition and provide a lung health score of 1-5. Designed by Merck Animal Health, this tool is currently available to veterinarians and is undergoing continued modification to optimize predictive strength (www.merck-animal-health-usa.com/whisper, accessed January 3, 2019). This method provides a quantifiable and trackable scoring system for lung condition on an individual and herd basis. Early studies using the Whisper led to reduced treatment costs and enhanced producers' ability to administer treatment with precision. It is important to note that while the system can assist in diagnosing early signs of BRD and improve targeted treatment early in disease development, it does not necessarily predict which animals are at highest risk of developing the disease (Maday, 2018).

Biosensors and Wearable Technology

Biosensors and wearable technology are a rapidly expanding avenue of research across human and animal medicine. According to Harrop et al. (2016), this sector of the health industry is expected to expand from $0.91 billion in 2017 to $2.6 billion by 2027. These technologies hold promise for animal health because they provide real-time, on-farm monitoring, with results often accessible via producers' smartphones or computers (Neethirajan, 2017). Poor sensitivity and specificity in detection of BRD (61.8% and 62.8%, respectively) have been a driving motivation to improve monitoring and disease detection across the US beef industry (White and Renter, 2009). Common focuses for biosensors and wearable technologies in the beef industry include physical behavior trackers, feeding and watering monitors, and body temperature indicators. Movement monitoring is a key focus for these technologies because physical behavior and lethargy have been associated with BRD for years, most likely due to the metabolic requirements of the immune system and associated effects of the immune response to bacterial infection (Hart, 1988). These trackers are typically attached to the ear, the leg, or a collar, with various brands and sensitivities available across the industry. Similar tracking systems can be used to track feeding and watering behaviors, most often achieved by assigning each individual animal an electronic identifier number registered to a computer database. Each feed bunk or water trough can then be outfitted with a scale, and an attached computer system is able to measure the total change in feed weight while each animal is at the bunk (Bach et al., 2004). Other technologies measure the timing, frequency, and duration of feed bunk visits and upload the data to a remote monitoring system (Richeson et al., 2018).
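To make the bunk-scale approach concrete, the short sketch below shows how per-animal intake and time at the bunk could be derived from paired RFID reads and bunk-scale weights. It is a minimal illustration of the general idea described by Bach et al. (2004), written in Python with hypothetical names and data; it is not the software of any commercial monitoring system.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class BunkVisit:
    """One feed bunk visit recorded by an RFID reader paired with a bunk scale."""
    animal_id: str          # electronic identifier registered to the herd database
    start_time_s: float     # seconds since midnight when the animal arrived at the bunk
    end_time_s: float       # seconds since midnight when the animal left the bunk
    start_weight_kg: float  # feed weight in the bunk when the visit began
    end_weight_kg: float    # feed weight in the bunk when the visit ended

def summarize_feeding_behavior(visits):
    """Aggregate per-animal intake, visit count, and minutes at the bunk for one day."""
    summary = defaultdict(lambda: {"intake_kg": 0.0, "visits": 0, "minutes_at_bunk": 0.0})
    for v in visits:
        # Feed disappearance during the visit approximates that animal's intake.
        intake = max(v.start_weight_kg - v.end_weight_kg, 0.0)
        s = summary[v.animal_id]
        s["intake_kg"] += intake
        s["visits"] += 1
        s["minutes_at_bunk"] += (v.end_time_s - v.start_time_s) / 60.0
    return dict(summary)

# Hypothetical example: two visits by steer "1042" and one by steer "1043"
visits = [
    BunkVisit("1042", 28800, 29400, 112.4, 111.1),
    BunkVisit("1043", 30000, 30720, 111.1, 109.8),
    BunkVisit("1042", 54000, 54300, 95.0, 94.6),
]
print(summarize_feeding_behavior(visits))
```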
This information can be valuable, as feeding behavior is correlated with morbidity both before and after diagnosis of BRD (Buhman et al., 2000). Fever is closely associated with clinical cases of BRD, particularly in young cattle after shipment, making remote monitoring of body temperature a large focus for biosensor development (Schwartzkopf-Genswein and Grandin, 2014; Toaff-Rosenstein et al., 2016). In beef cattle these technologies typically focus on body core temperature via ruminal bolus, peripheral temperature through microchips in the skin, or infrared thermal imaging which creates a thermogram of the body's temperature (Neethirajan, 2017). The applicability of these technologies varies based on the size and layout of each operation; however, temperature has been shown to be predictive of BRD 12 to 136 h prior to external symptoms (Timsit et al., 2011).

Serum Haptoglobin Levels

In the early stages of response to infectious agents, an acute phase response is triggered within the body. These reactions are targeted at the site of infection and include the rapid release of acute phase proteins (APP) in blood serum. Using polyacrylamide gel electrophoresis, scientists identified specific APP as the alpha and beta subunits of haptoglobin. These haptoglobin molecules are found in the serum of cattle battling BRD but are absent in healthy cattle (Godson et al., 1996). The presence or absence of these serum haptoglobin molecules has been associated with the presence of respiratory challenge, and changes in levels can serve as an indicator of response to antibiotic treatment. However, haptoglobin levels and case severity are not necessarily correlated (Wittum et al., 1996). Additionally, this testing method has been found to be an effective indicator of general inflammatory status but must be measured during precise windows of time to be useful as a disease prediction tool (Maday, 2018). This creates barriers for early BRD detection and treatment across the industry.

Non-esterified fatty acids

Non-esterified fatty acids (NEFA) have been identified as potential contributors to system-wide inflammation, including chronic respiratory conditions. Certain fatty acids, including saturated and n-6 polyunsaturated fatty acids, have been associated with heightened innate immune response. Thus, it is thought that elevated NEFA levels in the blood may be an appropriate indicator of increased risk for respiratory challenge (Wood et al., 2009). Although NEFAs may serve as an accurate indicator of negative energy balance or lipolysis, more research is necessary to determine how strongly NEFA levels are associated with BRD susceptibility (Maday, 2018).

Biosurveillance Systems

As modern technology advances, information, individuals, and pathogens are able to travel faster than ever before. These advancements represent a major opportunity for growth and collaboration, but also increase the threat for disease outbreak at an international level. In response to this, one of the largest focuses for public health officials in recent years has been biosurveillance. The four basic focuses of this effort are to detect cases of disease within specific populations, analyze and confirm reported cases, provide timely and appropriate regional responses, and synthesize epidemiologic information for long-term management and health care programs (Kman and Bachmann, 2012). These principles are applicable to human and animal health in addition to supporting bioterrorism preparedness.
One example of a biosurveillance system currently under development is the Global Rapid Identification of Threats System (GRITS). This system is designed to sift through online publications ranging from news reports to scientific publications to blogs and identify key epidemiology-related terms associated with increased risk of an infectious outbreak. The goal of the system is to track infectious disease related information via non-traditional sources to identify emerging threats at an international scale (Huff et al., 2016). While this type of biosurveillance system may not be immediately applicable to BRD prevention or treatment, it speaks to the widespread effort underway to improve early disease detection. With continued refinement, surveillance systems like GRITS could be adapted to interpret data collected from monitors or databases at on-farm and regional levels. For example, if producers utilize electronic monitoring systems or upload BRD-related morbidity data to a database regularly, a system like GRITS could track risk factors such as weather conditions, seasonality effects, animal health history, and regional trends to identify periods of high risk for BRD outbreak at the farm, community, and national level. Bolstered by large numbers of data points and rapid statistical analysis, this type of monitoring system could alert producers of risk and recommend specific treatment strategies based on historical trends. However, biosurveillance systems are extremely complex and often costly to operate and maintain. The key to determining the true value of a biosurveillance system will be to conduct cost-benefit analysis which includes consideration of fixed versus variable costs compared to the quantified value of detecting emergent diseases prior to large-scale disease outbreak (Yang, 2017). At this stage most biosurveillance systems are not accessible for small- to medium-scale livestock producers. However, with continued investment, biosurveillance systems may advance enough to enable producers to use similar small-scale surveillance systems at the local or even farm level.

QScout

White blood cell (WBC) counts are objective measures which provide insight into the body's ability to effectively mount innate and adaptive immune responses after vaccination or infection (Leach et al., 2013). Due to their vital role in the mammalian immune response, WBCs have become a focal point for researchers working to improve early detection of BRD and enable targeted antibiotic treatment. Increased understanding of the role of WBCs in the immune response has led to the development of technology which provides a chute-side differential WBC count. The technology specifically tested in this study is called the QScout and was developed by Advanced Animal Diagnostics. Using the QScout, producers can perform chute-side blood tests and receive a differential white blood cell count in approximately 1 minute. The machine is calibrated to distinguish cattle with "normal" WBC counts from those with "abnormal" counts, enabling producers to administer treatment to the individuals who are determined most likely to experience immune system challenge. Early studies conducted by AAD indicate that treating cattle based on results from the QScout can help reduce antibiotic usage by about half without significant differences in subsequent treatment rate, mortality rate, or average daily gain for the first 42 days in the feedlot (www.qscoutlab.com/beef, accessed January 12, 2019).
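As a simple illustration of how a chute-side differential count could drive a treat-or-not decision, the sketch below flags a sample as abnormal using the neutrophil and lymphocyte thresholds applied in the studies reported in the following chapters. This is an illustrative outline only, written in Python with hypothetical function and variable names; it is not the QScout's proprietary calibration.

```python
def flag_for_treatment(neutrophils, lymphocytes,
                       neut_low=1.5, neut_high=4.5, lymph_high=10.0):
    """Return True if a differential WBC count (x10^3 cells/uL) falls outside the
    'normal' window used for selective treatment in these studies: neutrophils
    between 1.5 and 4.5, and lymphocytes at or below 10."""
    abnormal_neutrophils = neutrophils < neut_low or neutrophils > neut_high
    abnormal_lymphocytes = lymphocytes > lymph_high
    return abnormal_neutrophils or abnormal_lymphocytes

# Hypothetical chute-side readings (x10^3 cells/uL)
print(flag_for_treatment(3.8, 6.9))   # False: both counts within the normal window
print(flag_for_treatment(0.9, 6.9))   # True: neutropenia
print(flag_for_treatment(3.8, 11.2))  # True: lymphocytosis
```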
The same technology was originally designed to detect early-onset mastitis in dairy cows and has proven to be effective for this purpose. A recent partnership with Zoetis will make the technology more widely accessible to dairy producers internationally. This tool continues to undergo refinement and adaptation for application in the beef industry.

Summary

Bovine respiratory disease is the most costly, widespread, and chronic health condition the US beef industry faces (Wilson et al., 2017). Young, stressed, and light-weight calves are at the highest risk of developing BRD because their immune systems are still under development (Wolfger et al., 2015). Exposure to stressful experiences early in life, such as weaning, castration, and transportation, has been shown to increase morbidity (Currin, 2009). Producers can reduce BRD susceptibility through preconditioning, vaccination, proper animal handling, and improvements to facilities. However, diagnosis of BRD remains a challenge because cattle are prey animals and naturally mask their symptoms to avoid being singled out from the herd (Griffin, 2014). As a result, it has become common industry practice to treat cattle perceived to be at high risk metaphylactically with antibiotics. This practice is costly for producers and may contribute to the global spread of antibiotic resistance. The QScout was developed to address these concerns and was designed to capitalize on the insight that abnormal WBC counts in response to immune system challenge may provide. This technology and theory were tested in the accompanying studies.

CHAPTER 2: Neutrophil and lymphocyte enumeration for early detection of bovine respiratory disease*+

Elizabeth Frey1, Andrew Huff2, Bo Norby2, Paul Bartlett2, Steven Rust1
1Department of Animal Science, Michigan State University, East Lansing, Michigan 48823
2Large Animal Clinical Sciences, Michigan State University, East Lansing, Michigan 48823
*In consortium with Advanced Animal Diagnostics
+ Special thanks to the Michigan State University Beef Cattle Teaching and Research Center for providing facilities for this research

Abstract

Bovine Respiratory Disease (BRD) is the most widespread and costly health challenge the US beef industry faces, with financial losses exceeding $1 billion annually. BRD will continue to directly relate to human and animal health concerns as antibiotic resistance spreads and treatment efficacy declines. The objective of this study was to address financial and public health concerns by testing a newly developed selective antibiotic treatment strategy for BRD. Abnormal neutrophil or lymphocyte counts at the time of processing may indicate the animal is at higher risk for impaired immune function and onset of bacterial infection, providing an opportunity for effective targeted antibiotic treatment (Parkin and Cohen, 2001; Odore et al., 2004; Buckham-Sporer et al., 2007). We hypothesize that cohorts of cattle selectively treated with antibiotics based on abnormal neutrophil or lymphocyte count will experience significantly lower morbidity rates compared to cattle that do not receive an antibiotic at processing. Additionally, selective treatment with antibiotics will result in less overall antibiotic usage than metaphylaxis treatment with similar morbidity and mortality rates. Two hundred twenty-eight Holstein and native-type cattle with an average initial weight of 200.2 kg were enrolled in the study.
Average daily gain, morbidity, and mortality were measured for group 1 through d 28, and for groups 2 and 3 through d 56. Selective treatment resulted in a 53.7% reduction in antibiotic usage. Morbidity and mortality rates were similar between control, metaphylaxis, and selective treatment. Significant gender differences were observed for dry matter intake (DMI) d 0 to 27, but were not significant for any other variable. Due to limited natural challenge resulting in low rates of BRD-related morbidity, few decisive conclusions can be drawn as to the efficacy of this selective treatment method.

Key Words

antibiotics, cattle, metaphylaxis, white blood cells

Introduction

Bovine Respiratory Disease (BRD) is a disease complex that affects the respiratory tract of cattle. Despite technological advances in animal health over the years, 16.1% of all US beef cattle experience health challenges due to BRD (USDA, 2013). This prevalence rises to 21% for cattle that arrive at feedlots weighing less than 318 kg (Wolfger et al., 2015). A singular solution to BRD has proven particularly difficult to identify due to the multifactorial nature of the disease. Due to the dynamic causative factors of BRD, producers often use antibiotics metaphylactically to preemptively treat cattle perceived to be at high risk of sickness prior to an outbreak. Metaphylaxis is defined as the timely mass medication of a high-risk group of animals to eliminate or minimize an expected outbreak of disease, and is used by 59.3% of all US feedlots (Edwards, 2010; Westcott and Trostle, 2014). Unfortunately, this treatment strategy is extremely costly in terms of financial investment and increasingly falls under scrutiny due to its potential contribution to the spread of antibiotic resistance throughout animal agriculture.

This study was designed primarily with the goal of developing a quantitative decision-making strategy to serve as an alternative to metaphylaxis. It was also designed to determine the application potential of new technology developed by Advanced Animal Diagnostics (AAD). The machine known as the QScout provides a chute-side analysis of each individual animal's differential white blood cell count (AAD; www.qscoutlab.com). This tool may be able to help producers identify and treat cattle which are experiencing the early stages of immune system challenge.

We hypothesize that groups of cattle treated selectively with antibiotics during processing based on abnormal neutrophil or lymphocyte counts would experience significantly lower morbidity rates than cattle that did not receive antibiotics. Additionally, selective treatment based on white blood cell counts (WBC) would lessen overall antibiotic usage compared to metaphylaxis without significant increases in morbidity or mortality rates. This early detection strategy has the potential to reduce producers' total treatment costs, slow the spread of antibiotic resistance, and provide producers a quantifiable strategy for health management and antibiotic usage.

Materials and Methods

From February to December 2017, 228 high-risk, mixed-breed, native-type steers and heifers were obtained in three groups from sale barns and auctions in the greater Midwest region. All cattle were held overnight at the auction market before being shipped to the Michigan State University Beef Cattle Teaching & Research Center (BCTRC). Group 1 was enrolled in the study in February, group 2 was enrolled in April, and group 3 was enrolled in December.
These months were chosen because of their expected correlation with seasonal weather and temperature changes in Michigan. Average initial animal weight was 200.2 kg, measured over two consecutive days after arrival and averaged. Group 1 was enrolled in the study for 28 d, while groups 2 and 3 were enrolled for 56 d.

Initial processing took place 24-48 h after arrival at BCTRC. During processing, cattle were dehorned and castrated as needed, then assigned radio-frequency identification (RFID) and bangle animal identification tags. Additionally, cattle were vaccinated for Infectious Bovine Rhinotracheitis and blackleg with BoviShield Gold 5 and the clostridial Ultrabac7/Somubac (Zoetis, www.zoetis.com). The fenbendazole oral drench dewormer Safeguard was also administered (Merck Animal Health, www.merck-animal-health-usa.com). Blood samples (<10 ml) were obtained with 18 mm x 0.6 cm needles from the tail-head vein, held in whole blood collection tubes, applied to a slide, and scanned by the QScout developed by AAD. The scanning process took approximately 40 s and provided chute-side differential WBC counts for each sample.

Cattle were randomly assigned to one of three treatments in the order they appeared in the handling chute and were treated accordingly. Heifers and steers were penned by gender and treatment to allow for customized nutrition programs. Animals with a rectal temperature greater than 40°C were treated with antibiotics regardless of treatment assignment to ensure animal well-being was not sacrificed for the sake of research. Two animals had a fever upon arrival, were treated with Draxxin (Zoetis, www.zoetis.com), a tulathromycin, per label instructions, and were placed in the pen with their assigned treatments. Treatments are categorized as: CON – control group, no antibiotic treatment; MET – metaphylactic antibiotic treatment (all received tulathromycin); SEL – selective antibiotic treatment based on differential WBC count. Tulathromycin was administered to cattle in treatment SEL if their neutrophil count was <1.5 or >4.5 x10³ cells/μL and/or their lymphocyte count was >10 x10³ cells/μL, as indicated by the chute-side QScout test. A decision-making tree of these treatment assignments is shown in Figure 1.1. Treatment parameters for treatment SEL were set based on the "normal" healthy WBC profile of cattle at this age (Roland et al., 2014). Some blood samples obtained from animals in treatments CON and MET were tested within 8 hours of initial processing rather than chute-side to save time during processing. Proprietary studies conducted by AAD have shown that similar values are observed between scan results if the test is conducted within 8 hours; however, testing as soon as possible after blood collection is recommended.

Following initial processing, cattle were housed by treatment in pens of 5-6 and managed by individuals blind to the treatment assignment of each pen. Each pen was separated from the next by a one-pen gap to prevent nose-to-nose contact, minimizing risk of direct cross-contamination between treatments. Records of which animals were treated with antibiotics upon arrival were kept by the barn manager and only accessed to ensure proper antibiotic treatment guidelines and withdrawal periods were observed in the event cattle required treatment after initial processing. If an animal exhibited clinical symptoms accompanied by a rectal temperature of 40°C or higher after initial processing, antibiotic treatment was deemed necessary and treatment records were pulled.
If that animal did not receive antibiotics at the time of processing, it was treated with tulathromycin, followed by Nuflor, a florfenicol (Merck Animal Health, www.merck-animal-health-usa.com), 48 h later if fever persisted. If necessary based on internal temperature, Excenel, a ceftiofur hydrochloride (Zoetis, www.zoetis.com), was administered after an additional 48 h. If cattle did receive antibiotic treatment during processing, the treatment regimen started with florfenicol, followed by ceftiofur hydrochloride, then tulathromycin at 48 h intervals as necessary based on fever. Cattle did not receive antibiotic treatment more than three times after processing.

Feed was provided via a bunk system and the ration profile was formulated based on the growing phase and weight of each pen. Hay was included in the diet for up to three weeks after arrival to BCTRC. High moisture corn, corn silage, dried distillers grains, a protein-mineral supplement mix called BFS 50, and weigh-back comprised of refused feed from the MSU Dairy Cattle Teaching and Research Center were fed throughout the growing phase. Free access to water was provided via automatic water troughs available to each pen. Dry matter intake (DMI) was calculated for each pen after adjustment for feed refusals. Cattle were weighed at 28-d intervals to determine average daily gain (ADG) and gain efficiency. Measurements for dry matter (DM), DMI, ADG, and gain efficiency were collected for groups 2 and 3 only. Average DM for each feed type and total mixed ration (TMR) for groups 2 and 3 are shown in Table 1.1.

Statistical analysis was conducted with the SAS program version 9.4 utilizing the Proc Glimmix function. The response variables morbidity and mortality were analyzed as binomial events, while ADG, DMI, and gain per unit of DMI were analyzed as continuous responses. Categorical variables included treatment and gender. The experimental unit for these responses was the pen, with accommodation for variation in pen size and adjustment for the random effect of group. A logit link function, diagonal variance matrix, and residual degrees of freedom were used. Treatment CON was used as the reference group for treatment differences. Associations of initial biological measures, including temperature, neutrophil count, and lymphocyte count, with morbidity were analyzed separately. The experimental unit for these measures was the individual animal. The SAS version 9.4 function Proc Glimmix was also used for this portion of the analysis. Group, treatment, and gender were used as categorical variables, while neutrophil count, lymphocyte count, and rectal temperature at processing were considered continuous variables. This analysis was based on a normal distribution with group and treatment nested within pen acting as random effects. Treatment and gender differences for the binomial variables morbidity and mortality were determined using odds ratio estimates generated by Proc Glimmix. Treatment and gender differences for continuous responses such as gain and efficiency were determined by ANOVA and Tukey's test using Proc Glm. Treatment and gender differences were considered significant at α <0.05. Animal procedures were approved and followed internal guidelines recommended by the Animal Care and Use Committee of Michigan State University (IACUC study number 02/17-017-00).

Results and Discussion

Protein-mineral supplement, corn silage, dried distillers grains, hay, and high-moisture corn were fed to all groups.
Unlike group 3, the ration fed to group 2 included weigh-back comprised of refused feed from the MSU Dairy Cattle Teaching and Research Center. Feed intake was only measured for groups 2 and 3. Ration composition is provided in Table 1.1.

Average daily gain, DMI, and gain efficiency were similar among treatments (Table 1.3). Female DMI was significantly lower than male DMI from d 1 to 27 (P =.04). No other gender differences were observed. Variation in DMI between genders throughout the growing phase is not considered abnormal due to differences in body size, net energy requirements, and overall growth rates (Zinn et al., 2008). Gender comparison through d 56 for ADG, DMI, or gain per DMI was not available because females were only present in group 1.

Average initial rectal temperature was 38.7°C, with a minimum of 36.7°C and maximum of 40.2°C (Figure 1.2). Average neutrophil count at processing was 3.8 x10³ cells/μL, with a minimum of 0.7 x10³ cells/μL and maximum of 9.9 x10³ cells/μL (Figure 1.3). Average lymphocyte count at processing was 6.9 x10³ cells/μL, with a minimum of 3.2 x10³ cells/μL and maximum of 11.3 x10³ cells/μL (Figure 1.4). No significant correlations between morbidity or mortality and initial temperature, neutrophil count, or lymphocyte count were observed (Table 1.2). However, the correlation between neutrophil count and initial weight was significant (r = -0.30, P <0.001), indicating that an increase in initial weight was correlated with a decrease in neutrophil count at processing. The correlation between lymphocyte count and initial weight was also significant (r = 0.19, P =0.004), indicating that an increase in initial weight was correlated with a larger lymphocyte count at processing.

Morbidity rates, mortality rates, ADG, DMI, and gain efficiency were similar among treatments (Table 1.3). These results do not support the hypothesis that selective treatment would result in significantly reduced morbidity rates compared to cattle that received no antibiotic treatment. However, only 46.3% of animals in treatment SEL required antimicrobial treatment based on the treatment thresholds (Figure 1.5). Thus, the QScout selective treatment method effectively reduced antibiotic usage at processing by 53.7% compared to metaphylactic treatment (MET), with no significant effects on morbidity. While supportive of the initial hypothesis, this result may not be reflective of real-world conditions due to abnormally low observed morbidity and mortality rates.

The morbidity and mortality rates in this study were notably lower than the 30% anticipated when designing the study, in addition to being much lower than the US beef industry BRD-related morbidity rate of 16.1% (USDA, 2013). High morbidity rates were expected for these cattle because they were sourced from auction barns where they were commingled and housed overnight, their health and preconditioning history was unknown, and groups arrived on the farm during months that are traditionally associated with high morbidity rates across the industry. The limited natural challenge observed in this study did not mirror expected real-world conditions and made it impossible to draw decisive conclusions about the industry application potential for the tested technology and antibiotic management strategy. The lack of BRD challenge for these high-risk animals may be attributed to unknown preconditioning of calves which bolstered the immune system, or to the management practices, farm layout, and group monitoring unique to BCTRC.
The current study design did not allow for decisive conclusions regarding these possibilities. If successful after more rigorous testing, the selective treatment strategy used in this study has the potential to enable early detection of immune system challenge and allow targeted antibiotic usage in feedlots. Further study is needed to test the efficacy of differential WBC counts as a tool in BRD management. Future studies with greater BRD challenge and larger groups are necessary.

APPENDIX

Figure 1.1: Decision-making tree for treatment assignment used in initial processing. Rectal temperature was measured at processing and any animal >40°C was treated; the remainder were randomly assigned to CONTROL (no antibiotics), METAPHYLAXIS (all treated with antibiotics), or SELECTIVE TREATMENT (treated only if neutrophil count was <1.5 or >4.5 x10³ cells/μL or lymphocyte count was >10 x10³ cells/μL).

Table 1.1: Feed analysis and dry matter (DM) basis of feed used d 1-56 in groups 2 and 3* (feed types: protein-mineral supplement BFS-50, corn silage, dried distillers grains, hay, high moisture corn, weigh-back**, and total mixed ration; reported as crude protein, crude fiber, ether extract fat, total digestible nutrients, DM, and ration composition for each group)
*Feed analysis was not conducted for Group 1, but rations were comprised of the same feeds used for Groups 2 and 3
**Weigh-back is comprised of refused feed from the MSU Dairy Cattle Teaching and Research Center

Figure 1.2: Initial rectal temperature and correlation with morbidity (x-axis: initial rectal temperature, °C; left axis: percent of total herd; right axis: morbidity within each temperature category, %). N = 228; r = 0.05; Pr = 0.51; mean = 38.7°C; SE = 0.04°C; min = 36.7°C; max = 40.2°C.

Figure 1.3: Initial neutrophil count and correlation with morbidity (x-axis: initial neutrophil count, x10³ cells/μL; left axis: percent of total herd; right axis: morbidity within each neutrophil category, %). N = 228; r = -0.10; Pr = 0.14; mean = 3.79 x10³ cells/μL; SE = 0.12 x10³ cells/μL; min = 0.70 x10³ cells/μL; max = 9.90 x10³ cells/μL. *Cattle in the selective treatment were treated if neutrophil count was <1.5 or >4.5 x10³ cells/μL.

Figure 1.4: Initial lymphocyte count and correlation with morbidity (x-axis: initial lymphocyte count, x10³ cells/μL; left axis: percent of total herd; right axis: morbidity within each lymphocyte category, %). N = 228; r = 0.05; Pr = 0.50; mean = 6.86 x10³ cells/μL; SE = 0.11 x10³ cells/μL; min = 3.20 x10³ cells/μL; max = 11.3 x10³ cells/μL. *The threshold for selective treatment based on lymphocyte count was >10 x10³ cells/μL.

Table 1.2: Correlation of biological measures with health

                          Temperature (°C)   Neutrophil count      Lymphocyte count
                                             (x10³ cells/μL)       (x10³ cells/μL)
Morbidity (%)             0.04 (0.51)a       -0.10 (0.14)          0.05 (0.50)
Mortality (%)             0.01 (0.86)        0.02 (0.81)           0.03 (0.64)
Initial weight (kg)       0.02 (0.73)        -0.30 (<0.001)        0.19 (0.004)
Temperature (°C)          1.0                -0.05 (0.45)          -0.06 (0.36)
Neutrophil count          -0.05 (0.45)       1.0                   0.08 (0.22)
Lymphocyte count          -0.06 (0.36)       0.08 (0.22)           1.0
a Within a row and column, the first number is a correlation coefficient (r value), followed by the probability value in parentheses
b Groups 2 and 3 only

Table 1.3: Treatment and gender effect on health, growth, and efficiency (morbidity and mortality, %; ADG, kg; DMI, kg per head per day; gain per DMI, g/kg; reported for d 1-27, d 28-56, and d 0-56 by treatment and gender)
a CON = control; MET = metaphylaxis; SEL = treated if neutrophil count was <1.5 x10³ cells/μL or >4.5 x10³ cells/μL, or if lymphocyte count was >10 x10³ cells/μL
b M = steer; F = heifer
cd Means within a row with unlike superscripts differ
e Groups 2 and 3 only

Figure 1.5: Percent of cattle given antibiotics at processing per treatment (Control: 0%; Metaphylaxis: 100%; Selective Treatment: 46.3%). *Selective treatment = treated if neutrophil count was <1.5 x10³ cells/μL or >4.5 x10³ cells/μL, or if lymphocyte count was >10 x10³ cells/μL

CHAPTER 3: Comparison of selective treatment based on white blood cell counts to metaphylaxis to treat bovine respiratory disease and reduce antibiotic usage in feedlots*+

Elizabeth Frey1, Andrew Huff2, Bo Norby2, Paul Bartlett2, Steven Rust1
1Department of Animal Science, Michigan State University, East Lansing, Michigan 48823
2Large Animal Clinical Sciences, Michigan State University, East Lansing, Michigan 48823
*In consortium with Advanced Animal Diagnostics
+ Special thanks to Lewis Farms for providing facilities for this research

Abstract

Costs associated with bovine respiratory disease (BRD) reach $1 billion in the US annually (Wolfger et al., 2015). Neutrophils and lymphocytes, two types of white blood cells (WBC), have been identified as key components early in the immune response to bacterial infection and aid in the response to bacteria which cause BRD (Parkin and Cohen, 2001). We hypothesize that selective antibiotic treatment based on abnormal white blood cell (WBC) count would result in significantly reduced morbidity rates compared to cattle that did not receive antibiotics. Selective treatment based on WBC count would also lessen overall antibiotic usage compared to metaphylaxis, without significant increases in morbidity or mortality rates. Four hundred ninety-seven Holstein steers purchased from Michigan auction barns were enrolled in the study from April to November 2018. Castrated and polled or previously dehorned males were eligible for enrollment. Blood samples (<10 ml) were obtained from all steers and analyzed to provide differential WBC counts. Cattle with a rectal temperature greater than 39.4°C were assigned to treatment FEV and treated with Zuprevo, a tildipirosin. Cattle with normal rectal temperatures were randomly assigned to treatments based on chute order. Cattle in the control treatment were not treated with antibiotics (CON), all cattle were treated with antibiotics in the metaphylactic treatment (MET), and cattle were selectively treated with antibiotics based on WBC in the third treatment (SEL).
No treatment differences in morbidity or mortality were observed despite a 61.1% reduction in antibiotic usage in treatment SEL. Secondary analyses were conducted to determine seasonal effects and to isolate the effects of abnormal WBC and receipt of antibiotics. Cattle were more likely to require antibiotic treatment for clinical BRD in the fall, compared to spring and summer (P <0.0001). Neutrophil and lymphocyte counts did not have significant correlation with morbidity and mortality rates.

Key Words

metaphylaxis, respiratory disease, white blood cell

Introduction

Development of bovine respiratory disease (BRD) in feedlot cattle can be caused by a variety of infectious agents and factors which induce stress in cattle. Although it is difficult to identify a single pathway by which disease is transmitted, scientists and producers generally understand how this bacterial infection develops within the respiratory tract and causes cellular damage. In the US, treatment and productivity costs associated with bovine respiratory disease reach $1 billion annually, and little improvement in BRD-related morbidity rates has been seen in the last 40 years (Currin, 2009; Wolfger et al., 2015). As a result, BRD is a major focal point for health and management research in the US beef industry.

Primary BRD-relevant bacterial agents identified in North America include Mannheimia haemolytica, Pasteurella multocida, Histophilus somni, and Mycoplasma bovis (Panciera and Confer, 2010; Taylor et al., 2010). These bacteria live commensally in the upper respiratory tract of beef cattle before they act opportunistically to cause infection. Transmission can occur via shared airspace, feed bunks, or water troughs. The relationship between host and bacteria typically becomes infectious after cattle are subjected to stress or a viral infection, which lowers their ability to prevent further bacterial colonization and cellular damage. Common symptoms observed in clinical cases of BRD include labored breathing, coughing, fever, nasal discharge, and lack of appetite (Schneider et al., 2009; Neibergs et al., 2014).

Cattle are anatomically prone to respiratory issues due to their long tracheobronchial tree and relatively small lungs compared to body size. This structure provides ample opportunity for particle deposition throughout the airways, which restricts the animal's ability to effectively clear debris. Bacteria are allowed increased opportunity for deposition and colonization compared to most other species (Ackermann et al., 2010). Additionally, bovine lungs have many lobules and low elasticity. If one bronchus is damaged it can cause collapse of the distal lung segment, which would further impair respiratory function (Muller and Berg, 2011).

To combat infectious bacteria, the bovine respiratory tract is lined with physical and chemical protective barriers such as hairs, specialized epithelial cells, mucus, and air surface liquid (ASL) (Ellis, 2001). The initial line of defense includes hairs, which provide a physical barrier to trap large airborne particles, and squamous cells, which create a stratified surface and protect against microbial adhesion. Mucus known as air-surface liquid is also produced by goblet cells and submucosal glands and lines the upper respiratory tract (Ackermann et al., 2010). Antimicrobial molecules present within the ASL play a direct role in pro- and anti-inflammatory responses.
Infection is combatted via lysozymes, which disrupt bacterial membranes as they come in contact with the ASL, and antimicrobial peptides such as defensins and cathelicidins, which are able to induce rapid lysis of bacteria. As these initial physical and chemical barriers are attacked, chemical signals are sent from the respiratory tract to help trigger the upregulation of more advanced innate and adaptive immune responses, which are the foci of this study (Bartlett et al., 2008).

In response to these bacterial pathogens, the acute inflammatory response is initiated, followed by an upregulation of the innate immune system and antigen presentation within hours of initial infection. The adaptive humoral and cell-mediated immune responses then develop over the days and weeks following antigen presentation. If infection persists, internal body temperature rises, causing fever, an indication which producers often use to identify and treat sick cattle.

Neutrophils

Neutrophils are granulocytes which play a key role in the innate immune response. These cells have a typical life span of 8-24 hours in the bloodstream after recruitment to the site of infection. Pattern recognition receptors (PRRs), located on the surface of neutrophils, are used to identify pathogen associated molecular patterns (PAMPs) found on the surface of bacteria. Lipopolysaccharides (LPS) are the key PAMP found on Gram-negative bacteria, which is relevant to BRD because all major BRD-inducing bacteria are Gram-negative. Neutrophils are also recruited and utilized in the ASL, where they release antimicrobial alpha defensins and cathelicidins. Additionally, neutrophils phagocytose bacteria, release granules, and create neutrophil extracellular traps (NETs) to trap invasive organisms (Ackermann et al., 2010).

The initial neutrophil response can be relatively weak in cattle because they maintain a small reserve of granulocytes in the bone marrow. As a result, neutropenia (abnormally low blood neutrophil counts) is common for the first 1-2 days after an acute inflammatory response is mounted to combat respiratory pathogens. Approximately 3-5 days are required for neutrophil counts to rebound after a substantial inflammatory response (Roland et al., 2014). It may take even longer for neutrophil counts to rebound to normal levels if cattle did not receive adequate passive immunity from their dams (Galyean et al., 1999). The acute inflammatory response is associated with both neutropenia and the early stages of BRD development. Neutropenia is therefore a key biological measure of interest in this study, used to determine whether a link exists between neutropenia and BRD.

A phenomenon known as the "neutrophil paradox" sometimes occurs after neutrophil counts are restored. At times, high levels of neutrophil activity can make a disease state worse by contributing to extensive chronic inflammation and damage to the host's epithelial lining. This damage typically occurs due to unregulated pro-inflammatory neutrophil activity such as degranulation and release of proteolytic enzymes or reactive oxygen species. This activity can affect the host's ability to rapidly identify bacterial PAMPs (Buckham-Sporer et al., 2007). The efficacy of the neutrophil response is often further impaired when environmental stressors such as handling or transport cause an elevation in blood cortisol levels, which reduces the expression of defensin genes in response to LPS (Buckham-Sporer et al., 2007; Ackermann et al., 2010).
Abnormally high neutrophil counts are another biological indicator of interest in this study because chronic inflammation and stress are associated with both neutrophilia and increased susceptibility to BRD (Roland et al., 2014). Based on an average estimate from multiple publications, the normal neutrophil range in healthy cattle is 1.3-5 x10³ cells/μL (Kramer, 2000; Kraft and Durr, 2005; Jones and Allison, 2007; George et al., 2010; Wood and Quiroz-Rocha, 2010; Roland et al., 2014). To be conservative in data collection, neutrophil counts considered normal for this study are within the range of 1.5-4.5 x10³ cells/μL.

Lymphocytes

Lymphocytes act as part of the adaptive immune response and are visually distinct from neutrophils at the microscopic level. There are two primary types of lymphocytes, T and B cells, which act in response to specific antigens. Approximately 95% of the body's T lymphocytes are sequestered within the lymphatic system and circulate throughout the body periodically via the bloodstream and lymph. When a foreign antigen is detected via the innate immune response, it is brought directly to the lymph by an antigen-presenting cell and presented to T helper (Th) and T cytotoxic (Tc) lymphocytes. These cells recognize the attached peptides as foreign and invasive material, which triggers activation of the adaptive immune response to target that specific foreign antigen. Upregulation of this antigen-specific response escalates for days to weeks afterwards (Parkin and Cohen, 2001). In response to antigen presentation, B lymphocytes produce antibodies which neutralize toxins and prevent pathogens from binding to mucosal surfaces such as the ASL. The IgA antibody is most closely associated with mucosal immunity. Prior to activation, B cells are most commonly found in lymphoid tissue and circulate in the bloodstream after infectious pathogens have been detected (Parkin and Cohen, 2001).

Lymphopenia, abnormally low blood lymphocyte levels, has been observed in cattle in response to acute stress, bacterial and viral infection, and corticosteroids (Jones and Allison, 2007). Exposure to stressful experiences such as weaning or shipment has been associated with an increase in blood cortisol levels and a related downregulation of lymphocyte glucocorticoid receptor presentation. This has been observed in cattle for up to 24 hours after shipment and may provide a window for increased pathogen activity (Odore et al., 2004). Conversely, lymphocytosis, abnormally high blood lymphocyte counts, has been observed in cases of purulent disease such as nephritis, hepatitis, or bronchopneumonia, which are all commonly associated with BRD (Roland et al., 2014). Because of their direct association with BRD risk factors and symptoms, lymphopenia and lymphocytosis are the final focal points for selective treatment in this study. The normal lymphocyte range for cattle is 2.1-6.7 x10³ cells/μL, based on averaged results from multiple publications (Kramer, 2000; Kraft and Durr, 2005; Jones and Allison, 2007; George et al., 2010; Wood and Quiroz-Rocha, 2010; Roland et al., 2014). However, in this study lymphocyte counts were considered normal up to 10 x10³ cells/μL in order to focus on the most severe cases of lymphocytosis and place selection emphasis more heavily on abnormal neutrophil counts.

Motivating Factors

Currently, many feedlot operations in the United States treat BRD preemptively following a management practice known as metaphylaxis.
Metaphylaxis is defined as the mass medication of a group of high-risk or newly received animals to eliminate or minimize an acute onset or outbreak of disease (Urban-Chmiel and Grooms, 2012; Snyder et al., 2017). This widespread use of antibiotics has fallen under increased scrutiny from consumers and health specialists due to concerns regarding the spread of antibiotic resistance. Additionally, the practice is extremely expensive for producers, and it has been estimated that only 1 in 5 cattle treated are actually at high risk of developing BRD (Maday, 2018). Social and financial pressure serve as motivating factors for the industry and researchers to identify more targeted and responsible ways to treat BRD.

An additional common practice producers use to treat BRD is to use rectal temperature as a measure to identify cattle which may benefit from antibiotic treatment. This allows for more targeted antibiotic usage and reduces treatment costs compared to metaphylaxis, but it allows bacterial activity in the respiratory tract to progress to the later stages of infection, when substantial cellular damage to the respiratory tract occurs. Researchers have theorized that it may be possible to address the concerns about antibiotic usage in the livestock industry by selectively treating cattle based on WBC count rather than fever. White blood cell counts change in response to infection days or even weeks prior to the development of fever, which may provide a window for selective antibiotic treatment. If effective, this strategy would allow producers to provide more targeted antibiotic treatment earlier in the development of disease, which may prevent both clinical and subclinical respiratory damage and reduce total antibiotic usage.

One purpose of this study is to gain knowledge of the "normal" distribution of WBC in light-weight Holstein steers and to determine the relationship between initial WBC counts and BRD-related morbidity. We hypothesize that selective treatment based on abnormal white blood cell (WBC) counts would result in significantly reduced morbidity rates compared to cattle that did not receive antibiotics. Additionally, selective treatment based on WBC counts would lessen overall antibiotic usage compared to metaphylaxis without causing significant increases in morbidity or mortality rates.

Materials and Methods

Holstein steers were obtained from auction markets across the state of Michigan from April to November 2018 and raised in a straw bedded-pack feedlot in North Street, Michigan, known as Lewis Farms. A total of 497 steers were enrolled in the study with a purchase weight range of 128 to 259 kg. On-site scales were not available to verify purchase weight. Cattle were eligible for enrollment if they were castrated and polled or previously dehorned Holstein steers. Date of arrival and date of processing were recorded, and the length of time between these dates (processing delay) was calculated for each pen. Cattle were categorized into pens based on the date of their arrival to the farm, as well as their purchase location. This allowed for distinction between subgroups which were processed on the same date and commingled based on body weight. Pen sizes ranged from 1 to 37 head, and a total of 46 pens were enrolled throughout the study. The average processing delay was 5.8 d, with a range of 0 to 19 d.
Some pens were processed the same day as arrival due to manpower availability, and a few pens were fed and observed for health issues for an extended period before processing due to extenuating farm circumstances. Each steer was assigned a bangle tag ID number, implanted with Encore, an estradiol implant (Elanco, www.elanco.us), and received a generic-brand topical ivermectin dewormer following standard farm procedure. Injections of Presponse SQ, a Mannheimia haemolytica toxoid (Boehringer Ingelheim, www.boehringer-ingelheim.com), were administered per label instructions. Cattle were also injected with Inforce 3, a BRSV/PI-3 vaccine; Bovi-Shield Gold, an IBR/BVD vaccine; and the Histophilus somni bacterin Somubac (Zoetis, www.zoetis.com) per label instructions. Rectal temperature was measured and blood samples (<10 ml) were obtained with 18 mm x 0.6 cm needles from the tail-head vein. Blood samples were held in whole blood collection tubes, applied to a slide, and scanned by the QScout, a machine developed by Advanced Animal Diagnostics (AAD; www.qscoutlab.com). The scanning process took approximately 40 s and provided chute-side differential WBC counts for each steer.

Following standard practice for the farm, if a steer had a rectal temperature greater than 39.4°C during processing, it was treated per label instructions with a dose of the antibiotic Zuprevo, a tildipirosin, as well as Banamine, a flunixin meglumine nonsteroidal anti-inflammatory drug (NSAID; Merck, www.merck.com), and was assigned to treatment group FEV. Cattle with a rectal temperature less than 39.4°C were randomly assigned to one of three treatment groups based on order of appearance in the processing chute. Cattle randomly assigned to the control treatment CON received no antibiotics. Those assigned to treatment MET were treated following the metaphylaxis treatment strategy, meaning all received tildipirosin. Cattle assigned to the selective treatment group SEL were selectively treated with tildipirosin based on each steer's differential WBC counts, measured by the QScout. Selective antibiotic treatment was administered if cattle exhibited a neutrophil count of <1.5 or >4.5 x10³ cells/μL or a lymphocyte count of >10 x10³ cells/μL.

Cattle processed on the same day were sorted based on body weight and housed together for 21 d after processing regardless of assigned treatment. After 21 d cattle were commingled based on estimated finishing date and housed together until d 60, at which point they were transferred to a nearby finishing barn. Morbidity and mortality data were recorded through d 60. Records of which steers received tildipirosin at processing were kept by the barn manager to ensure antibiotic treatment and withdrawal guidelines were followed. Pens were observed daily by farm workers and cattle were pulled from the pen for closer examination if they exhibited common external symptoms of BRD. Once pulled, if the steer had a rectal temperature greater than 39.4°C, it was given a dose of the tulathromycin Draxxin (Zoetis, www.zoetis.com) based on visual estimation of animal weight. Injections of the enrofloxacin Baytril and the NSAID flunixin meglumine (Bayer, www.baytril.com) were also administered per label instructions. If that steer was pulled a second time for clinical symptoms and still had a rectal temperature greater than 39.4°C, it received an injection of Resflor Gold, a florfenicol and flunixin meglumine combination (Merck, www.merck.com), per label instructions.
If that steer required an additional retreatment due to fever, it received the tulathromycin treatment regimen a second time. Season of arrival was also assigned, based on average monthly temperature, to account for seasonal changes in weather. Cattle processed within the months of April, May, and June were considered spring arrivals. Summer months included July, August, and September, while cattle classified as fall arrivals were processed within the months of October and November. Previous studies and common industry perceptions indicate that season of arrival in feedlots may affect BRD susceptibility.

Statistical analysis

Statistical analysis was conducted to examine the effects of treatment assigned at processing and of season of arrival on morbidity and mortality. Correlations between individual variables were determined with the SAS procedure Proc Corr and are shown in Tables 2.1-2.4. In addition to treatments assigned at processing, cattle were reassigned into treatments to isolate the effects of WBC counts and antibiotic treatment on BRD-related morbidity. Cattle with a fever upon arrival were still identified as FEV. Those with no fever at arrival and normal WBC counts that did not receive tildipirosin were identified as Normal-NoA. Cattle with no fever and normal WBC counts which were treated metaphylactically with tildipirosin were identified as Normal-A. Cattle with no fever but abnormal WBC counts at arrival that did not receive tildipirosin at processing were identified as Abnormal-NoA. Those with no fever treated selectively with tildipirosin based on abnormal WBC counts as described previously were identified as Abnormal-A (Table 2.4). The analysis of original treatments, seasonal effects, and the reassigned treatments based on WBC count and antibiotic treatment at processing was conducted with the Proc Glimmix procedure of SAS version 9.4. Morbidity and mortality were modeled as binary events with the logit link function. Categorical variables included treatment group and season of arrival. Treatments CON and Normal-NoA and the season spring were reference groups for their respective analyses. Pen was treated as a random effect to account for cattle which arrived on farm on the same date from the same purchase location but were commingled, processed, and randomly assigned to a treatment on the same day as other pens (an illustrative code sketch of the processing decision rule and this model is provided at the end of this chapter). Treatment and seasonal comparisons for continuous variables were calculated using Proc Glimmix of SAS version 9.4, with treatment and seasonal differences detected via the LSMeans differences test, as indicated by alphabetic superscripts in Tables 2.5-2.7. Standard errors of the means (SEM) presented were also generated via the LSMeans test, with the largest treatment SEM shown in Tables 2.5-2.7. Morbidity is defined as requiring one or more antibiotic treatments due to elevated rectal temperature post-processing. Treatment and seasonal differences were considered significant at α = 0.05, with the experimental unit defined at the animal level. Animal procedures were approved by and followed internal guidelines recommended by the Animal Care and Use Committee of Michigan State University (IACUC study number 02/17-017-00).

Results

The mean initial rectal temperature for all cattle at processing was 39.2°C, with a range of 37.9°C to 41.7°C (Figure 2.1). The rectal temperature threshold for antibiotic treatment was 39.4°C. Trend lines for the correlation between morbidity and temperature indicate that cattle with elevated temperatures at processing were more likely to receive antibiotics post-processing.
Cattle with a low rectal temperature that did not receive antibiotics at processing were more likely to receive subsequent antibiotic therapy than those that did receive antibiotics at processing. The correlation between initial rectal temperature and morbidity was not significant for all cattle (P = 0.11) or for cattle grouped by antibiotic treatment at processing (P = 0.71, no antibiotics at processing; P = 0.15, treated with antibiotics at processing; Table 2.1). The mean neutrophil count for all cattle at processing was 3.9 x10³ cells/μL, with a range of 0.5 to 20 x10³ cells/μL (Figure 2.2). Trend lines suggest that cattle that were treated with antibiotics at processing were more likely to require antibiotic treatment post-processing if they had particularly low or high neutrophil counts. Cattle not treated with antibiotics at processing were less likely to require antibiotic treatment post-processing if they had particularly low or high neutrophil counts. The distribution of neutrophil counts was skewed to the right. The correlation between neutrophil count and morbidity was not significant for all cattle (P = 0.54) or for cattle grouped by antibiotic treatment at processing (P = 0.82, no antibiotics at processing; P = 0.53, treated with antibiotics at processing; Table 2.2). The mean lymphocyte count for all cattle at processing was 8.3 x10³ cells/μL, with a range of 1.7 to 21.3 x10³ cells/μL (Figure 2.3). Lymphocyte count at processing had a normal distribution. Overall, cattle with high lymphocyte counts which were treated with antibiotics at processing were less likely to require antibiotic treatment post-processing compared to those that did not receive antibiotics at processing. The correlation between lymphocyte count and morbidity was not significant for all cattle (P = 0.35) or for cattle grouped by antibiotic treatment at processing (P = 0.71, no antibiotics at processing; P = 0.16, treated with antibiotics at processing; Table 2.3). Purchase weight of cattle ranged from 128 kg to 259 kg, with an average purchase weight of 177 kg (Figure 2.4). Few significant correlations between initial processing data and future health were observed when comparing cattle that received antibiotic treatment to those with no antibiotic treatment at processing. However, neutrophil and lymphocyte counts for cattle not treated with antibiotics at processing were positively correlated with morbidity d 8-14 post-processing (P = 0.02; Table 2.4). Lymphocyte count in cattle treated with antibiotics had a negative correlation with processing delay (P = 0.03). The overall lack of correlation between neutrophil or lymphocyte count and morbidity challenges the theory that these particular WBC counts indicate an immediate need for antibiotic treatment. Purchase weight for all cattle had a positive correlation with processing delay (Table 2.4). There is no clear explanation for this relationship; however, it may reflect farmer or seasonal bias in processing-delay habits. The correlation between morbidity and mortality was significant (P = 0.03; Table 2.4), suggesting that Lewis Farms successfully identifies sick cattle post-processing and attempts to support their health through therapeutic antibiotic treatment. When considering the original treatments assigned at processing, selective treatment based on WBC count (SEL) resulted in a 61.1% reduction in antibiotic usage compared to metaphylaxis (MET).
No significant differences in overall morbidity were observed, as indicated by the percent of cattle per treatment requiring antibiotic treatment one time post-processing due to BRD (P = 0.09; Table 2.5). As described previously, selective antibiotic treatment was administered to cattle that exhibited a neutrophil count of <1.5 or >4.5 x10³ cells/μL or a lymphocyte count of >10 x10³ cells/μL. However, treatment differences were detected for the percent of cattle per treatment requiring antibiotics two and three times post-processing. Cattle in treatments SEL and FEV were more likely to require antibiotic treatment twice post-processing compared to CON (P = 0.01; Table 2.5). Additionally, cattle in treatments SEL and FEV were more likely to require antibiotic treatment three times post-processing compared to CON and MET (P = 0.04). Numerically, the lowest mortality was observed in treatment CON (2.7%) and the highest mortality was observed in treatment FEV (5.1%). However, treatment differences for mortality could not be calculated because the model would not converge due to low overall incidence. Rectal temperature at processing was higher in treatment FEV than in all other treatments (P < 0.0001). Neutrophil count was also higher in treatment FEV compared to all other treatments (mean count = 4.4 x10³ cells/μL; P = 0.03). No other treatment differences were detected (Table 2.5). Processing delay was significantly longer in the fall compared to spring (6.6 d; P = 0.001; Table 2.6). Morbidity, defined as the percent of cattle per season requiring one antibiotic treatment post-processing, was greater in the fall compared to spring or summer (46.6%; P < 0.0001). Similarly, the percent of cattle requiring antibiotic treatment twice post-processing was significantly greater in the fall compared to spring or summer (24.8%; P = 0.003). Seasonal differences for cattle requiring three antibiotic treatments post-processing and for mortality could not be calculated due to lack of model convergence. However, the observed values indicate that morbidity requiring three antibiotic treatments in the fall (8.7%) as well as mortality in the fall (6.8%) were higher compared to spring or summer (Table 2.6). No other seasonal differences were detected. To allow closer analysis of the effects of antibiotic treatment based on WBC counts, cattle in treatment CON were reassigned to treatments Normal-NoA and Abnormal-NoA based on WBC count. Cattle originally assigned to treatment MET were reassigned to treatments Normal-A and Abnormal-A depending on WBC count. Cattle in treatment SEL were reassigned to treatments Normal-NoA and Abnormal-A as appropriate per the treatment thresholds. Mean temperature at processing was higher in treatment FEV than in the other reassigned treatments, which was expected due to the selection criteria (40.0°C; P < 0.0001). Initial neutrophil counts were higher in treatments Abnormal-NoA (4.9 x10³ cells/μL), Abnormal-A (5.0 x10³ cells/μL), and FEV (4.4 x10³ cells/μL) compared to Normal-NoA (2.7 x10³ cells/μL) and Normal-A (2.7 x10³ cells/μL), which was also expected based on the treatment selection criteria (P < 0.0001). Similarly, lymphocyte count was higher in treatments Abnormal-NoA (9.8 x10³ cells/μL), Abnormal-A (9.9 x10³ cells/μL), and FEV (7.9 x10³ cells/μL) compared to Normal-NoA (7.3 x10³ cells/μL) and Normal-A (7.2 x10³ cells/μL), which was also expected based on the treatment selection criteria (P < 0.0001; Table 2.7).
Cattle in treatment FEV (20.9%) were more likely to require antibiotic treatment twice post-processing compared to treatments Abnormal-NoA (7.0%) and Normal-A (7.1%; P = 0.03; Table 2.7). Treatment differences for mortality could not be calculated; however, mortality in treatment FEV (5.1%) appeared to be higher compared to the other treatments. No other treatment differences were observed. This provides support for the theory that selective treatment would result in morbidity rates comparable to those seen after metaphylactic treatment. However, higher morbidity rates were not observed in treatment Abnormal-NoA, which challenges the theory that abnormal WBC counts directly cause higher morbidity rates if cattle are not treated with antibiotics.

Discussion

Overall, the results of this study suggest that there is an opportunity to reduce antibiotic usage in feedlots compared to metaphylaxis, but that seasonal effects on morbidity outweigh treatment effects. Selective treatment resulted in a 61.1% reduction in antibiotic usage compared to metaphylaxis, with no significant effect on morbidity or mortality rates for the first 60 d post-processing. These results provide preliminary support for the hypothesis that selective treatment based on WBC counts would result in morbidity rates comparable to those seen after metaphylaxis while reducing the total amount of antibiotics used at processing. However, neutrophil and lymphocyte counts were not correlated with overall morbidity rates, even for cattle that were not treated with antibiotics at processing. The lack of correlation between WBC count and morbidity challenges the theory that WBC counts directly reflect immune system capability, but suggests that antibiotic treatment may reduce morbidity during specific time frames post-processing. Specifically, the correlation between neutrophil and lymphocyte counts in cattle not treated with antibiotics and morbidity d 8-14 suggests that high neutrophil and lymphocyte counts at processing indicate increased susceptibility to BRD after d 7 in the feedlot. Automatically pulling these cattle for observation or antibiotic treatment at d 7 could reduce morbidity d 8-14. Research measuring weekly changes in WBC counts during the initial transition phase in feedlots and their association with morbidity is recommended prior to widespread adoption of this strategy, as it would allow researchers to better understand the immune response post-processing and optimize the timing of selective antibiotic treatment. While no treatment differences in morbidity were present, seasonal effects on morbidity were notable in this study. Cattle processed and enrolled in the study within the months of October and November and observed through January were significantly more likely to experience BRD-related morbidity over their first 60 d in the feedlot. These results support the industry perception that BRD incidence and severity increase in the fall. Debate continues as to whether these trends are due to seasonal changes in the weather, increased marketing of cattle which contributes to greater pathogen spread, or other operational demands which draw attention away from the cattle, such as crop harvest and farm maintenance, allowing BRD to spread undetected throughout the herd (Dabo et al., 2007; Taylor et al., 2010). Behavioral studies of seasonal differences in farm activity may help shed light on this issue.
For feedlot operations where morbidity rates are consistently higher in the fall compared to other seasons, it may be beneficial to utilize the QScout on a seasonal basis as a diagnostic tool and treat cattle based on abnormal WBC counts in addition to cattle with fever. Farm workers are likely to have other farm responsibilities which draw attention away from monitoring herd health. In these cases, selective treatment via the QScout may increase antibiotic usage at processing compared to treatment based on fever alone, but could effectively reduce total herd pathogen load and prevent a widespread BRD outbreak. Further research into this concept is recommended. The nonsignificant correlation between WBC counts and morbidity for the first, third, and fourth through eighth weeks post-processing suggests that the timing of managerial intervention is key to the success of selective feedlot health management. Timing is particularly important for treatment based on WBC counts because these counts are dynamic and can change rapidly based on physiological changes such as variations in blood pressure, changes to cortisol levels after transportation, or the upregulation of acute stress responses due to processing (Roland et al., 2014). Currently, producers and scientists know very little about the timing of these changes, the magnitude of their effects on animal health, or how they would affect the accuracy of selective treatment. Additional research on how processing delay affects WBC counts and how WBC counts change after processing would help scientists and producers better understand whether an optimal window exists for selective antibiotic treatment. In addition to the issue of timing, the thresholds used for selective treatment in this study were preliminary and based on limited data for Holstein cattle of this weight. Additional studies and data collection focused on normal WBC counts and changes to WBC counts throughout the growing period for light-weight Holstein steers may help to establish more effective thresholds for selective treatment. The housing structure used in this study enabled all cattle to benefit from herd immunity. Future studies where cattle are penned based on treatment group are recommended to isolate the effects of each treatment group. Isolating cattle with fever at processing and comparing morbidity rates across treatments may also help determine if sick pens are an effective way to further reduce feedlot morbidity and antibiotic usage in addition to selective treatment. Due to the study design and treatment group designation, conclusions cannot be drawn to determine if selective treatment based on WBC counts is definitively more effective than treatment based on fever. The cattle in the FEV group were at a different stage of the immune response than those in the Abnormal WBC and SEL treatments, so it cannot be assumed that antibiotic efficacy would be equal for these groups. However, antibiotic treatment at processing did not reduce future morbidity compared to cattle that were not treated with antibiotics. It may be financially beneficial to conduct a further study with similar treatments, but to include a group of cattle with fever at processing that does not receive antibiotic treatment. This may shed light on the efficacy of antibiotic treatment due to fever compared to antibiotic treatment due to abnormal WBC counts. The average length of time from processing to first clinical treatment was between 13.5 and 25.7 days.
Cattle are most likely to exhibit clinical BRD symptoms due to pathogens which already exist in their respiratory tract within the first 3-10 days after arrival in a feedlot (Taylor et al., 2010). These results suggest that cattle may have developed clinical BRD due to some form of pathogen exposure or increased stress at the Lewis farm, rather than preexisting commensal pathogens. This could be the result of exposure to foreign pathogens through shared water and feed troughs or airspace, issues with airflow throughout the facility, or delayed identification of sick cattle which increased total herd pathogen load. It is also possible that management or handling practices on the farm contribute to stress and increase BRD susceptibility. Morbidity rates after d 13.5 at the Lewis farm could potentially be reduced via changes to animal or facility management, rather than through changes to processing procedures alone.

Summary

Overall, the results of this study indicate that seasonal effects are more impactful for feedlot morbidity rates than antibiotic treatment at processing or rectal temperature at arrival. These results align with common industry perceptions about the seasonality of BRD. Selective treatment successfully reduced antibiotic usage by 61.1%, with no change in morbidity rates compared to cattle treated metaphylactically. However, the lack of correlation between neutrophil or lymphocyte count and morbidity suggests a need for additional research before widespread adoption of this selective treatment strategy. Future research focused on establishing a timeline of normal versus abnormal WBC counts for Holstein steers, comparing morbidity rates within isolated treatment pens, and evaluating the economic ramifications of adopting each treatment strategy is recommended.
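For reference, the processing decision rule and the morbidity model described in Materials and Methods can be summarized in code form. The sketch below is illustrative only and is not the code used in the study: the dataset name (processing), the variable names (temp_c, neut, lymph, pen, season, morb1x), and the modulo-based stand-in for chute-order randomization are all hypothetical assumptions for demonstration.

/* Illustrative sketch: assign processing treatments per the decision rule
   described in Materials and Methods. Dataset and variable names are hypothetical. */
data assigned;
   set processing;                                   /* one record per steer */
   length trt $3;
   if temp_c > 39.4 then trt = 'FEV';                /* febrile steers treated, not randomized */
   else trt = scan('CON MET SEL', 1 + mod(_n_, 3));  /* stand-in for randomization by chute order */
   abnormal_wbc = (neut < 1.5 or neut > 4.5 or lymph > 10);   /* WBC thresholds for SEL */
   if trt in ('MET', 'FEV') then treated_at_processing = 1;
   else if trt = 'SEL' and abnormal_wbc then treated_at_processing = 1;
   else treated_at_processing = 0;
run;

/* Morbidity as a binary outcome with a logit link, pen as a random effect,
   and CON and spring as reference groups. */
proc glimmix data=assigned method=laplace;
   class trt(ref='CON') season(ref='Spring') pen;
   model morb1x(event='1') = trt season / dist=binary link=logit solution oddsratio;
   random intercept / subject=pen;
   lsmeans trt season / ilink diff;
run;

Modeling pen as a random intercept reflects the rationale stated in the Statistical analysis section: steers within a pen share an arrival date and purchase location, so their outcomes are not independent observations.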
APPENDIX

[Figure 2.1: Comparison of rectal temperature at processing and subsequent morbidity. The figure shows the percent of the herd within each rectal temperature category at processing (x-axis: temperature, °C) and the morbidity within each temperature category (%), with trend lines for cattle that received no antibiotics at processing (R² = 0.41) and cattle treated with antibiotics at processing (R² = 0.30).]
a Threshold for antibiotic treatment based on fever was 39.4°C

Table 2.1: Comparison of rectal temperature at processing and subsequent morbidity
All cattle (treatments includeda: Normal-NA, Normal-A, Abnormal-NA, Abnormal-A, FEV): N = 497; mean = 39.2°C; SEM = 0.03°C; min = 37.9°C; max = 41.7°C; P* = 0.11
No antibiotics at processing (Normal-NA, Abnormal-NA): N = 183; mean = 39.0°C; SEM = 0.02°C; min = 37.9°C; max = 39.4°C; P* = 0.71
Received antibiotics at processing (Normal-A, Abnormal-A, FEV): N = 314; mean = 39.5°C; SEM = 0.03°C; min = 38.3°C; max = 41.7°C; P* = 0.15
a Normal-NA = normal white blood cell (WBC) count and rectal temperature, steers were not treated with antibiotics at processing; Normal-A = normal WBC count and rectal temperature, steers were treated with antibiotics at processing; Abnormal-NA = abnormal WBC count and normal rectal temperature, steers were not treated with antibiotics at processing; Abnormal-A = abnormal WBC count and normal rectal temperature, steers were treated with antibiotics at processing; FEV = abnormally elevated rectal temperature at processing, steers were treated with antibiotics at processing
* This P value indicates the significance of the correlation between morbidity and rectal temperature

[Figure 2.2: Comparison of neutrophil count and subsequent morbidity. The figure shows the percent of the herd within each neutrophil count category at processing (x-axis: neutrophil count, x10³ cells/μLa) and the morbidity within each neutrophil category (%), with trend lines for cattle that received no antibiotics at processing (R² = 0.30) and cattle treated with antibiotics at processing (R² = 0.11).]
* 5 cattle had neutrophil counts >16 x10³ cells/μL
a Threshold for selective antibiotic treatment was <1.5 x10³ cells/μL or >4.5 x10³ cells/μL

Table 2.2: Comparison of neutrophil count and subsequent morbidity
All cattle (Normal-NA, Normal-A, Abnormal-NA, Abnormal-A, FEV)a: N = 497; mean = 3.9 x10³ cells/μL; SEM = 0.13; min = 0.5; max = 20; P* = 0.54
No antibiotics at processing (Normal-NA, Abnormal-NA): N = 183; mean = 3.4 x10³ cells/μL; SEM = 0.15; min = 0.9; max = 13.8; P* = 0.82
Received antibiotics at processing (Normal-A, Abnormal-A, FEV): N = 314; mean = 4.2 x10³ cells/μL; SEM = 0.18; min = 0.5; max = 20; P* = 0.53
a Treatment group definitions as in Table 2.1; mean, SEM, min, and max are in x10³ cells/μL
* This P value indicates the significance of the correlation between morbidity and neutrophil count

[Figure 2.3: Comparison of lymphocyte count and subsequent morbidity. The figure shows the percent of the herd within each lymphocyte count category at processing (x-axis: lymphocyte count, x10³ cells/μLa) and the morbidity within each lymphocyte category (%), with trend lines for cattle that received no antibiotics at processing (R² = 0.08) and cattle treated with antibiotics at processing (R² = 0.21).]
a Threshold for selective antibiotic treatment was >10 x10³ cells/μL
* 3 steers had lymphocyte counts <3.5 x10³ cells/μL
** 3 steers had lymphocyte counts >16.5 x10³ cells/μL

Table 2.3: Comparison of lymphocyte count and subsequent morbidity
All cattle (Normal-NA, Normal-A, Abnormal-NA, Abnormal-A, FEV)a: N = 497; mean = 8.3 x10³ cells/μL; SEM = 0.11; min = 1.7; max = 21.3; P* = 0.35
No antibiotics at processing (Normal-NA, Abnormal-NA): N = 183; mean = 8.1 x10³ cells/μL; SEM = 0.16; min = 4.5; max = 16.8; P* = 0.71
Received antibiotics at processing (Normal-A, Abnormal-A, FEV): N = 314; mean = 8.4 x10³ cells/μL; SEM = 0.15; min = 1.7; max = 21.3; P* = 0.16
a Treatment group definitions as in Table 2.1; mean, SEM, min, and max are in x10³ cells/μL
* This P value indicates the significance of the correlation between morbidity and lymphocyte count

[Figure 2.4: Purchase weight. The figure shows the percent of the total herd within each purchase weight category (x-axis: purchase weight, kg).]
N = 497; mean = 177 kg; SEM = 1.3 kg; min = 128 kg; max = 259 kg
Table 2.4: Correlation of biological measures and health based on antibiotic treatment
Within each row, valuesb are given in the column order: processing delay, morbidity, mortality, treated d 1-7, treated d 8-14, treated d 15-21, treated d 22-60.
Temperature, NAa: 0.01 (0.85)*, -0.07 (0.70), -0.07 (0.35), -0.05 (0.47), 0.003 (0.97), 0.06 (0.44), -0.03 (0.067)
Temperature, A: 0.10 (0.08), 0.08 (0.15), 0.10 (0.08), 0.06 (0.34), 0.06 (0.32), 0.03 (0.62), 0.04 (0.44)
Neutrophil counts, NA: -0.10 (0.19), -0.02 (0.82), 0.07 (0.34), -0.05 (0.51), 0.17 (0.02), -0.08 (0.26), -0.04 (0.57)
Neutrophil counts, A: -0.02 (0.74), 0.04 (0.53), 0.08 (0.17), 0.11 (0.06), 0.05 (0.39), -0.06 (0.28), -0.02 (0.73)
Lymphocyte counts, NA: -0.09 (0.25), 0.03 (0.71), -0.02 (0.83), -0.04 (0.56), 0.11 (0.02), -0.08 (0.26), 0.03 (0.73)
Lymphocyte counts, A: -0.12 (0.03), -0.08 (0.16), -0.08 (0.14), -0.07 (0.19), 0.02 (0.69), -0.11 (0.06), -0.07 (0.25)
Processing delay, NA: 1.0, -0.08 (0.28), -0.004 (0.96), -0.01 (0.86), -0.11 (0.13), 0.04 (0.57), -0.05 (0.46)
Processing delay, A: 1.0, -0.10 (0.09), 0.04 (0.52), -0.09 (0.10), -0.05 (0.39), 0.06 (0.28), -0.01 (0.86)
Purchase weight, NA: 0.21 (0.005), -0.08 (0.28), 0.10 (0.20), 0.05 (0.52), 0.02 (0.77), -0.06 (0.39), -0.09 (0.21)
Purchase weight, A: 0.22 (<0.001), -0.08 (0.17), 0.006 (0.92), -0.02 (0.66), -0.01 (0.84), -0.04 (0.54), -0.04 (0.46)
Morbidity, all cattle: -0.09 (0.05), 1.0, 0.10 (0.03), 0.37 (<0.001), 0.39 (<0.001), 0.36 (<0.001), 0.63 (<0.001)
a NA = no antibiotic treatment at processing; A = antibiotic treatment given at processing
b Within a row and column, the first number is a correlation coefficient (r value), followed by the significance value in parentheses
* This P value indicates the correlation between the pair of variables for the respective column and row

Table 2.5: Effect of processing treatment on health management
Values are given in the column order CONa, MET, SEL, FEV, followed by the SEM and the P value*.
N: 113, 113, 113, 158
Antibiotics at processing (%): 0, 100, 38.9, 100
Temperature (°C): 39.0c, 39.0c, 39.0c, 40.0d; SEM 0.18; P <0.0001
Neutrophil counts (x10³ cells/μL): 3.8c, 3.7c, 3.5c, 4.4d; SEM 0.53; P 0.03
Lymphocyte counts (x10³ cells/μL): 8.5, 8.2, 8.5, 7.9; SEM 0.23; P 0.27
Initial weight (kg): 178.6, 180.4, 177.5, 171.7; SEM 5.9; P 0.15
Processing delay (d): 5.7, 5.6, 5.6, 6.2; SEM 0.38; P 0.94
Morbidity and Mortality
Treated 1x post-processing, % (n)b: 23.0 (26), 23.9 (27), 27.4 (31), 36.1 (57); P 0.09
Treated 2x post-processing, % (n): 6.2c (7), 10.6cd (12), 15.0d (17), 20.8d (33); P 0.01
Treated 3x post-processing, % (n): 0.9c (1), 2.6c (3), 7.1d (8), 8.9d (14); P 0.04
Mortality, % (n): 2.7 (3), 3.5 (4), 4.4 (5), 5.1 (8); P NA
Time of Antibiotic Treatment Post-processing
Day 1-7, % (n): 7.9cd (9), 7.9cd (9), 7.1c (8), 12.0d (19); P 0.04
Day 8-14, % (n): 2.6 (3), 4.4 (5), 2.7 (3), 6.9 (11); P 0.40
Day 15-21, % (n): 1.8 (2), 2.7 (3), 3.5 (4), 2.5 (4); P NA
Day 22-60, % (n): 10.6 (12), 8.8 (10), 14.1 (16), 14.6 (23); P 0.62
Time to first treatment (d): 20.7, 16.9, 21.1, 19.4; SEM 6.0; P 0.39
a CON = control; MET = metaphylaxis; SEL = treated based on white blood cell counts; FEV = cattle treated with a fever at processing
b Number of steers
cd Means within a row with unlike superscripts differ, as determined by odds ratio analysis
NA = model would not converge due to lack of observations
* This P value indicates the significance of treatment differences for each outcome
Table 2.6: Effects of season on white blood cell counts and health management
Values are given in the column order spring, summer, fall, followed by the SEM and the P value*.
N: 161, 146, 190
Antibiotics at processing (%): 56.5, 64.4, 67.4
Temperature (°C): 39.1, 39.4, 39.3; SEM 0.10; P 0.08
Neutrophil counts (x10³ cells/μL): 3.7, 3.8, 4.2; SEM 0.81; P 0.05
Lymphocyte counts (x10³ cells/μL): 8.5, 7.9, 8.4; SEM 0.51; P 0.17
Initial weight (kg): 178.3, 176.9, 174.8; SEM 9.5; P 0.05
Processing delay (d): 4.9b, 5.8bc, 6.6c; SEM 0.35; P 0.001
Morbidity and Mortality
Treated 1x post-processing, % (n)a: 23.0b (37), 18.5b (27), 46.6c (75); P <0.0001
Treated 2x post-processing, % (n): 7.5b (12), 8.2b (12), 24.8c (40); P 0.003
Treated 3x post-processing, % (n): 3.7 (6), 1.4 (2), 8.7 (14); P 0.07
Mortality, % (n): 2.5 (4), 0 (0), 6.8 (10); P NA
Time of Antibiotic Treatment Post-processing
Day 1-7, % (n): 8.7bc (14), 2.1b (3), 5.0c (8); P 0.05
Day 8-14, % (n): 5.6 (9), 5.5 (8), 6.8 (11); P 0.06
Day 15-21, % (n): 4.3 (7), 0.7 (1), 9.9 (16); P NA
Day 22-60, % (n): 5.0 (8), 10.3 (15), 26.7 (43); P NA
Time to first treatment (d): 13.5, 25.7, 25.6; SEM 9.3; P 0.09
a Number of steers
bc Means within a row with unlike superscripts differ, as determined by odds ratio analysis
NA = model would not converge due to lack of observations
* This P value indicates the significance of seasonal differences for each outcome

Table 2.7: Effects of white blood cell counts and metaphylactic treatment on morbidity and mortality
Values are given in the column order Normal WBC count-NoAa, Normal WBC count-Ab, Abnormal WBC count-NoA, Abnormal WBC count-A, Fever at processing (FEV, A), followed by the SEM and the P value*.
N: 126, 64, 57, 92, 158
Antibiotics at processing (%): 0, 100, 0, 100, 100
Temperature at processing (°C): 39.0d, 39.0d, 39.0d, 39.0d, 40.0e; SEM 0.07; P <0.0001
Neutrophil counts (x10³ cells/μL): 2.7d, 2.7d, 4.9e, 5.0e, 4.4e; SEM 0.57; P <0.0001
Lymphocyte counts (x10³ cells/μL): 7.3d, 7.2d, 9.8e, 9.9e, 7.9d; SEM 0.40; P <0.0001
Purchase weight (kg): 177.5, 180.5, 180.9, 178.1, 171.7; SEM 6.5; P 0.21
Processing delay (d): 5.9, 5.1, 5.7, 5.6, 6.2; SEM 0.54; P 0.79
Morbidity and Mortality
Treated 1x post-processing, % (n)c: 23.8 (30), 26.6 (17), 28.1 (16), 22.8 (21), 34.8 (55); P 0.22
Treated 2x post-processing, % (n): 7.1d (9), 9.4de (6), 7.0d (4), 13.0de (12), 20.9e (33); P 0.03
Treated 3x post-processing, % (n): 1.6 (2), 3.1 (2), 1.8 (1), 4.3 (4), 8.2 (13); P 0.11
Mortality, % (n): 1.6 (2), 0 (0), 3.5 (2), 3.2 (3), 5.1 (8); P NA
Time of Antibiotic Treatment Post-processing
D 1-7, % (n): 7.1 (9), 6.3 (4), 3.5 (2), 1.1 (1), 5.7 (9); P 0.25
D 8-14, % (n): 1.6 (2), 3.1 (2), 5.3 (5), 8.7 (8), 7.0 (11); P 0.19
D 15-21, % (n): 2.4 (3), 6.3 (4), 10.5 (3), 3.3 (3), 7.0 (11); P NA
D 22-60, % (n): 12.7 (16), 26.6 (17), 28.1 (6), 9.8 (9), 17.7 (28); P NA
Time to first treatment (d): 21.3, 19.2, 23.8, 22.0, 23.7; SEM 9.3; P 0.90
a Not treated with antibiotics at processing
b Treated with antibiotics at processing
c Number of steers
de Treatments within a row with unlike superscripts differ
NA = model would not converge due to lack of observations
* This P value indicates the significance of treatment differences for each outcome

CHAPTER 4: Interpretive Summary

The two research studies discussed in this thesis focused on similar hypotheses, but were each unique in their scope and overall success. General observations about each study and the technology tested will be discussed in this chapter.

MSU Farm Study

The study conducted at MSU's Beef Cattle Teaching and Research Center (BCTRC) served as a useful trial run of the study concepts and helped familiarize research staff with the QScout, but overall did not provide a sufficient test of the original hypothesis. Morbidity rates associated with BRD for all herds of cattle on trial at Michigan State University were much lower than anticipated. High BRD morbidity rates were anticipated in this study because the steers were commingled at sale barns, held overnight without feed, and shipped a minimum of 52 miles from sale barn to feedlot. All of these factors are considered well-established stressors linked to high BRD rates. However, the unknown health history of the steers meant that it was impossible to predict their future morbidity rates with certainty.
Genetic predisposition, previous exposure to pathogens, timing and methods used for weaning and castration, and the antibiotic and vaccination history of the animals were unknown. Each of these factors can positively or negatively affect future feedlot morbidity on an individual basis. The lack of information about the cattle in this study prior to their arrival at the feedlot was intentional, because health history is typically unknown for cattle obtained in a sale barn environment. The low observed morbidity rates across all three herds suggest that farms which utilize metaphylaxis due to the perception that cattle are at "high risk" for developing BRD may be using antibiotics excessively and unnecessarily. This in turn increases production costs and contributes to the spread of antibiotic resistance. Conditions unique to BCTRC, such as increased opportunity for observation due to farm layout and staff size, low pen density, prevention of nose-to-nose contact with unfamiliar cattle, and small total herd size, may also have contributed to low morbidity rates. These are factors which could not be assessed through this study, but they should be considered seriously before conducting further health studies at BCTRC which are intended to be reflective of large-scale feedlot conditions. This research may have been more impactful if feed conversion had been recorded through slaughter or if lung lesion analysis had been conducted post-mortem. These measurements may have shed light on the financial ramifications of each antibiotic treatment strategy.

Lewis Farm Study

The Lewis farm provided an opportunity to test the QScout theory and its real-world applicability in a Midwest feedlot. Lewis Farms had a historical morbidity rate of approximately 40% over the past 10 years. They intentionally purchase light-weight calves from auction barns with no medical history, with the intent to offset the financial risk of purchasing high-risk cattle by negotiating low purchase prices and utilizing good health management at the farm. Due to this buying strategy, Lewis Farms frequently has cattle arrive in small groups that have been shipped or commingled extensively. All of these factors are thought to contribute to the development of BRD and led to a strong test of the hypothesis. However, a few logistical changes may have helped achieve a stronger test of the hypothesis. Processing delay for all seasons of the study was longer than is typical for the industry. Cattle were purchased sporadically from across the state, which meant farm and research staff were not always available to process within 48 hours of arrival. This may mimic some of the challenges associated with running a diversified farm, but it complicated the analysis process. Future research is also recommended to determine how processing delays affect cortisol levels and WBC counts, as this may have changed treatment efficacy for this study. Pen structure also affected the way treatments were housed at the farm. It would have been preferable to house cattle by treatment or in isolated pens based on date of processing to reduce commingling and the effects of herd immunity. This would also have allowed a more clear-cut treatment comparison. As it was conducted, the study more accurately reflected real-world conditions, but it was difficult to draw decisive conclusions about treatment effects on morbidity.
It also would have been beneficial to have exact records of weight gain, feed efficiency, and antibiotic usage per head to enable a financial analysis of each treatment.

Review of QScout Functionality

Overall, the QScout in its current state should be considered a prototype rather than a finished design. The most glaring issue with the QScout is the processing time required to obtain WBC counts. Although very rapid from the perspective of obtaining reliable scientific data, 40 s was the standard run time for a single blood sample. This does not include the time required to obtain the blood sample, mix the blood with anticoagulants in the collection tube, prepare a pipet, enter the animal's ID number, apply blood to a slide, or rerun slides that did not process on the first attempt. In total, it took approximately 2 minutes to take all the necessary steps to obtain a differential WBC count. This process also prevented farm workers from doing their jobs as efficiently because they could not administer vaccinations or apply ear tags while blood was being sampled, and they had to wait for the machine to process the sample before administering antibiotics. This process would slow down large-scale producers significantly, costing them time and money. Other barriers to expedient use of the QScout include the technical skill required when handling blood and ease of access to supplies. Blood must be carefully applied to the slide without any spillage because the QScout imaging system is easily damaged. Pipet tips and blood collection tubes are also not likely to be used frequently in feedlots, meaning producers would need access to a research or medical supplier. One solution to these issues is to have a veterinarian on hand for all processing, have them transport and use the QScout, and have them provide all additional supplies necessary. However, logistical barriers such as scheduling processing dates and financial feasibility for veterinarians may prevent implementation. Additionally, the QScout machine was extremely sensitive to cold and took up to 25 minutes to boot up on cold fall days, even after being transported in the cab of a temperature-controlled vehicle. Even after successfully starting up and reaching a functional temperature, the machine sometimes required breaks during processing to regain optimal internal temperature. These issues occurred in the fall in a barn protected from the elements, which casts considerable doubt on the machine's ability to perform satisfactorily in the winter months, particularly in barns which are not insulated or temperature controlled. Although the QScout program itself was not difficult to navigate, there were multiple occasions where it was necessary to contact technology support due to errors in program updates. Successfully reaching a technology support person was a challenge and slowed down processing. In the event a support person could be reached, it was not possible for support staff to access the machine to troubleshoot without internet access, meaning that all barns where the QScout is used would need to be outfitted with Wi-Fi or Ethernet cables. Internet access is also necessary in order to upload WBC data to an external source for consultation with AAD. This may be a major barrier for producers and may limit use of the QScout on the day of processing due to technology difficulties.
Beyond the technical challenges associated with using the QScout, the machine requires a flat, hard, clean space near the processing chute with access to electricity, good lighting, protection from rain or snow, and ventilation to prevent overheating in warm weather. If veterinarians were to have a QScout available to transport to each of their clients on processing day, they would need to set up this type of space at each client's barn to be able to use the machine. Economic analysis should also be done to determine how feasible it would be for producers or veterinarians to purchase and use the QScout. Consideration of the duration of use, the reduction of antibiotic usage, or the degree of improvement in morbidity rates required to offset the initial purchase cost is necessary on a case-by-case basis.

Additional Studies

Additional studies which include feedlots in other geographical regions of the United States, test the efficacy of selective treatment based on WBC count, and include more frequent blood testing are recommended prior to widespread implementation of the QScout treatment strategy. This would help determine the ease of use of the QScout in larger-scale facilities and shed light on the financial benefits or fallacies of the selective treatment strategy. Further investigation into how WBC counts change in response to stress, aging, transport, and commingling would also be beneficial. The selective treatment thresholds used in this study were arbitrary and based on limited WBC count data. Measuring WBC counts before and after shipment, during processing, and periodically throughout the growing phase is recommended.

Concluding Thoughts

Overall, these studies indicated that there is opportunity to reduce antibiotic usage in feedlots in the US, but the QScout, the timing of WBC testing, and the exact treatment thresholds used in these studies need further refinement before widespread implementation. The weather-related challenges associated with using the QScout, as discussed previously, may be a major barrier to marketability within the industry. Nearly all producers I spoke with were interested in the idea of the QScout and thought the WBC theory had potential, but were deterred by the length of processing time and the supplies required. The results from the Lewis farm research suggest there is merit to the theory that selective treatment at arrival can reduce antibiotic usage compared to metaphylaxis. All treatments effectively reduced overall BRD-related morbidity compared to the 40% historical rate at Lewis Farms, most likely due to herd immunity. However, moving forward, the most valuable research test for the Lewis farm may be to determine if, after refinement of treatment thresholds, selective treatment based on WBC counts is more effective at reducing morbidity than treatment based on fever alone. The correlation between neutrophil and lymphocyte counts and morbidity d 8-14 also suggests that timing is an important element of the selective antibiotic treatment approach. To reduce morbidity d 8-14, it may be beneficial to test and selectively treat cattle for abnormal WBC counts at d 7 post-processing rather than at processing. Alternatively, producers could screen all cattle with the QScout at processing and selectively pull those that had high WBC counts at processing for a recheck at d 7. In the meantime, I would recommend exploring other health management practices before purchasing the QScout for full-time use.
Management practices such as improving facilities cleanliness or introducing a sick pen for cattle with fever at processing may effectively reduce morbidity with minimal capital investment. 93 LITERATURE CITED 94 REFERENCES Abutarbush, S. M., C. M. Pollock, B. K. Wildman, T. Perrett, O. C. Schunicht, R. K. Fenton, S. J. Hannon, A. R. Vogstad, G. K. Jim, and C. W. Booker. 2012. Evaluation of the diagnostic and prognostic utility of ultrasonography at first diagnosis of presumptive bovine respiratory disease. Can. J. Vet. Res. 76:23–32. Available from: http://www.ncbi.nlm.nih.gov/pubmed/22754091 Ackermann, M. R., R. Derscheid, and J. A. Roth. 2010. Innate immunology of bovine respiratory disease. Vet. Clin. North Am. Food Anim. Pract. 26:215–28. doi:10.1016/j.cvfa.2010.03.001. Available from: http://www.ncbi.nlm.nih.gov/pubmed/20619180 Arthington, J. D., X. Qiu, R. F. Cooke, J. M. B. Vendramini, D. B. Araujo, C. C. Chase, and S. W. Coleman. 2008. Effects of preshipping management on measures of stress and performance of beef steers during feedlot receiving. J. Anim. Sci. 86:2016–2023. doi:10.2527/jas.2008-0968. Avra, T. D., K. M. Abell, D. D. Shane, M. E. Theurer, R. L. Larson, and B. J. White. 2017. A retrospective analysis of risk factors associated with bovine respiratory disease treatment failure in feedlot cattle. J. Anim. Sci. 95:1521–1527. doi:10.2527/jas2016.1254. Bach, A., C. Iglesias, and I. Busto. 2004. Technical Note: A Computerized System for Monitoring Feeding Behavior and Individual Feed Intake of Dairy Cattle. J. Dairy Sci. 87:4207– 4209. doi:10.3168/jds.S0022-0302(04)73565-1. Bartlett, J., A. Fischer, and P. J. McCray. 2008. Innate immune functions of the airway epithelium. Contrib Microbiol. 15:147–163. Bishop, S. C., and J. A. Woolliams. 2014. Genomics and disease resistance studies in livestock. Livest. Sci. 166:190–198. doi:10.1016/J.LIVSCI.2014.04.034. Available from: https://www.sciencedirect.com/science/article/pii/S1871141314002352 Bryant, L. K., L. J. Perino, D. Griffin, A. R. Doster, and T. E. Wittum. 1999. A method for recording pulmonary lesions of beef calves at slaughter, and the association of lesions with average daily gain. Bov. Pract. 33:163–173. Available from: https://eurekamag.com/research/003/021/003021827.php Buckham-Sporer, K. R., J. L. Burton, B. Earley, and M. A. Crowe. 2007. Transportation stress in young bulls alters expression of neutrophil genes important for the regulation of apoptosis, tissue remodeling, margination, and anti-bacterial function. Vet. Immunol. Immunopathol. 118:19–29. doi:10.1016/j.vetimm.2007.04.002. Buhman, M. J., L. J. Perino, M. L. Galyean, T. E. Wittum, T. H. Montgomery, and R. S. Swingle. 2000. Association between changes in eating and drinking behaviors and respiratory tract disease in newly arrived calves at a feedlot. Am. J. Vet. Res. 61:1163–1168. doi:10.2460/ajvr.2000.61.1163. Available from: http://avmajournals.avma.org/doi/abs/10.2460/ajvr.2000.61.1163 95 Callan, R. J., and F. B. Garry. 2002. Biosecurity and bovine respiratory disease. Vet. Clin. North Am. Food Anim. Pract. 18:57–77. doi:10.1016/S0749-0720(02)00004-X. Available from: https://www.sciencedirect.com/science/article/pii/S074907200200004X Campbell, J. 2018. Bacterial Pneumonia in Cattle - Respiratory System. Merck Man. Vet. Man. Available from: https://www.merckvetmanual.com/respiratory-system/respiratory-diseases-of- cattle/bacterial-pneumonia-in-cattle Cernicchiaro, N., B. J. White, D. G. Renter, A. H. Babcock, L. Kelly, and R. Slattery. 2012. 
Effects of body weight loss during transit from sale barns to commercial feedlots on health and performance in feeder cattle cohorts arriving to feedlots from 2000 to 20081. J. Anim. Sci. 90:1940–1947. doi:10.2527/jas.2011-4600. Available from: http://www.ncbi.nlm.nih.gov/pubmed/22247120 Clark, C. A., W. D. Busby, and P. J. Gunn. 2015. Effects of internal parasite infection at feedlot arrival on performance and carcass characteristics of beef steers. Prof. Anim. Sci. 31:412–416. doi:10.15232/pas.2014-01381. Available from: https://linkinghub.elsevier.com/retrieve/pii/S1080744615300619 Clawson, M. L., R. W. Murray, M. T. Sweeney, M. D. Apley, K. D. DeDonder, S. F. Capik, R. L. Larson, B. V. Lubbers, B. J. White, T. S. Kalbfleisch, G. Schuller, A. M. Dickey, G. P. Harhay, M. P. Heaton, C. G. Chitko-McKown, D. M. Brichta-Harhay, J. L. Bono, and T. P. L. Smith. 2016. Genomic signatures of Mannheimia haemolytica that associate with the lungs of cattle with respiratory disease, an integrative conjugative element, and antibiotic resistance genes. BMC Genomics. 17:1–14. doi:10.1186/s12864-016-3316-8. Available from: http://dx.doi.org/10.1186/s12864-016-3316-8 Cunha, J. 2017. Antibiotic Resistance: Questions & Answers. RXList. Available from: https://www.rxlist.com/antibiotic_resistance/drugs-condition.htm Currin, J. F. 2009. Strategic Use of Antibiotics in Stocker Cattle. Virginia Coop. Ext. 400–307. Dabo, S. M., J. D. Taylor, and A. W. Confer. 2007. Pasteurella multocida and bovine respiratory disease. Anim. Heal. Res. Rev. 8:129–150. doi:10.1017/S1466252307001399. Drovers. 2013. Deworming cattle is a springtime chore. Drovers. Available from: https://www.drovers.com/article/deworming-cattle-springtime-chore Edwards, T. A. 2010. Control methods for bovine respiratory disease for feedlot cattle. Vet. Clin. North Am. - Food Anim. Pract. 26:273–284. doi:10.1016/j.cvfa.2010.03.005. Available from: http://dx.doi.org/10.1016/j.cvfa.2010.03.005 Ellis, J. A. 2001. The Immunology of the Bovine Respiratory Disease Complex. Vet. Clin. North Am. Food Anim. Pract. 17:535–550. doi:10.1016/S0749-0720(15)30005-0. Available from: https://www.sciencedirect.com/science/article/pii/S0749072015300050?via%3Dihub Elsevier, B. 2019. Lymphocyte ScienceDirect Topics. ScienceDirect. Available from: https://www.sciencedirect.com/topics/neuroscience/lymphocyte 96 FDA. 2017. 2016 Summary Report on Antimicrobials Sold or Distributed for Use in Food- Producing Animals. Washington DC. Available from: https://www.fda.gov/downloads/ForIndustry/UserFees/AnimalDrugUserFeeActADUFA/UCM58 8085.pdf Fell, L. R., I. G. Colditz, K. H. Walker, and D. L. Watson. 1999. Associations between temperament, performance and immune function in cattle entering a commercial feedlot. Aust. J. Exp. Agric. 39:795. doi:10.1071/EA99027. Available from: http://www.publish.csiro.au/?paper=EA99027 Fulton, R. W., B. J. Cook, D. L. Step, A. W. Confer, J. T. Saliki, M. E. Payton, L. J. Burge, R. D. Welsh, and K. S. Blood. 2002. Evaluation of health status of calves and the impact on feedlot performance: Assessment of a retained ownership program for postweaning calves. Can. J. Vet. Res. 66:173–180. Fulton, R. W., C. W. Purdy, A. W. Confer, J. T. Saliki, R. W. Loan, R. E. Briggs, and L. J. Burge. 2000. Bovine viral diarrhea viral infections in feeder calves with respiratory disease: interactions with Pasteurella spp., parainfluenza-3 virus, and bovine respiratory syncytial virus. Can. J. Vet. Res. 64:151–9. 
Available from: http://www.ncbi.nlm.nih.gov/pubmed/10935880 Galyean, M. L., L. J. Perino, and G. C. Duff. 1999. Interaction of Cattle Health/Immunity and Nutrition. Available from: https://www.webpages.uidaho.edu/ruminant_nutrition/Optional Reading Materials/Interaction of cattle health-immunity and nutrition.pdf Ganz, T. 2003. Defensins: antimicrobial peptides of innate immunity. Nat. Rev. Immunol. 3:710–720. doi:10.1038/nri1180. Available from: http://www.ncbi.nlm.nih.gov/pubmed/12949495 Gay, E., and J. Barnouin. 2009. A nation-wide epidemiological study of acute bovine respiratory disease in France. Prev. Vet. Med. 89:265–271. doi:10.1016/J.PREVETMED.2009.02.013. Available from: https://www.sciencedirect.com/science/article/pii/S0167587709000427?via%3Dihub George, J., J. Snipes, and V. Lane. 2010. Comparison of bovine Clin, hematology reference intervals from 1957 to 2006. Vet Clin Pathol. 39:138–148. Godson, D. L., M. Campos, S. K. Attah-Poku, M. J. Redmond, D. M. Cordeiro, M. S. Sethi, R. J. Harland, and L. A. Babiuk. 1996. Serum haptoglobin as an indicator of the acute phase response in bovine respiratory disease. Vet. Immunol. Immunopathol. 51:277–292. doi:10.1016/0165- 2427(95)05520-7. Available from: https://www.sciencedirect.com/science/article/pii/0165242795055207 Gould, K. 2011. Beef Cattle Deworming Strategies. Michigan State Univ. Ext. Available from: https://www.canr.msu.edu/news/beef_cattle_deworming_strategies Grandin, T. 1994. Solving livestock handling problems. Vet. Med. 89:989–998. Available from: http://agris.fao.org/agris-search/search.do?recordID=US9517358 97 Griffin, D. 1997. Economic Impact Associated with Respiratory Disease in Beef Cattle. Vet. Clin. North Am. Food Anim. Pract. 13:367–377. doi:10.1016/S0749-0720(15)30302-9. Available from: https://linkinghub.elsevier.com/retrieve/pii/S0749072015303029 Griffin, D. 2014. The monster we don’t see: subclinical BRD in beef cattle. Anim. Heal. Res. Rev. 15:138–141. doi:10.1017/S1466252314000255. Available from: https://www.cambridge.org/core/product/identifier/S1466252314000255/type/journal_article Harrop, P., R. Das, and N. Tsao. 2016. Wearable Technology for Animals 2017-2027: Technologies, Markets, Forecasts. Available from: https://www.idtechex.com/research/reports/wearable-technology-for-animals-2017-2027- technologies-markets-forecasts-000488.asp Hart, B. L. 1988. Biological basis of the behavior of sick animals. Neurosci. Biobehav. Rev. 12:123–137. doi:10.1016/S0149-7634(88)80004-6. Available from: https://www.sciencedirect.com/science/article/pii/S0149763488800046?via%3Dihub Haskell, M. J., G. Simm, and S. P. Turner. 2014. Genetic selection for temperament traits in dairy and beef cattle. Front. Genet. 5:368. doi:10.3389/fgene.2014.00368. Available from: http://journal.frontiersin.org/article/10.3389/fgene.2014.00368/abstract Hilton, W. M. 2009. When to castrate beef calves, PLUS: 9 bull castration facts to consider. BEEF. Available from: https://www.beefmagazine.com/health/0401-castrate-calves-timing Hodgins, D. C., J. A. Conlon, and P. E. Shewen. 2002. Respiratory Viruses and Bacteria in Cattle. Available from: https://www.ncbi.nlm.nih.gov/books/NBK2480/?report=printable Hoelzer, K., N. Wong, J. Thomas, K. Talkington, E. Jungman, and A. Coukell. 2017. Antimicrobial drug use in food-producing animals and associated human health risks: what, and how strong, is the evidence? BMC Vet. Res. 13:211. doi:10.1186/s12917-017-1131-3. Available from: http://www.ncbi.nlm.nih.gov/pubmed/28676125 Holland, B. P., L. O. 
Burciaga-Robles, D. L. VanOverbeke, J. N. Shook, D. L. Step, C. J. Richards, and C. R. Krehbiel. 2010. Effect of bovine respiratory disease during preconditioning on subsequent feedlot performance, carcass characteristics, and beef attributes. J. Anim. Sci. 88:2486–2499. doi:10.2527/jas.2009-2428. Available from: https://academic.oup.com/jas/article/88/7/2486-2499/4745686 Huff, A. G., N. Breit, T. Allen, K. Whiting, and C. Kiley. 2016. Evaluation and Verification of the Global Rapid Identification of Threats System for Infectious Diseases in Textual Data Sources. Interdiscip. Perspect. Infect. Dis. 2016:1–5. doi:10.1155/2016/5080746. Available from: http://www.hindawi.com/journals/ipid/2016/5080746/ Jones, M. L., and R. W. Allison. 2007. Evaluation of the Ruminant Complete Blood Cell Count. Vet. Clin. North Am. Food Anim. Pract. 23:377–402. doi:10.1016/J.CVFA.2007.07.002. Available from: https://www.sciencedirect.com/science/article/pii/S0749072007000461?via%3Dihub 98 Kiser, J. N., T. E. Lawrence, M. Neupane, C. M. Seabury, J. F. Taylor, J. E. Womack, and H. L. Neibergs. 2017. Rapid Communication: Subclinical bovine respiratory disease – loci and pathogens associated with lung lesions in feedlot cattle. J. Anim. Sci. 95:2726. doi:10.2527/jas2017.1548. Available from: https://www.animalsciencepublications.org/publications/jas/abstracts/95/6/2726 Kman, N. E., and D. J. Bachmann. 2012. Biosurveillance: a review and update. Adv. Prev. Med. 2012:301408. doi:10.1155/2012/301408. Available from: http://www.ncbi.nlm.nih.gov/pubmed/22242207 Kościuczuk, E. M., P. Lisowski, J. Jarczak, N. Strzałkowska, A. Jóźwik, J. Horbańczuk, J. Krzyżewski, L. Zwierzchowski, and E. Bagnicka. 2012. Cathelicidins: family of antimicrobial peptides. A review. Mol. Biol. Rep. 39:10957–70. doi:10.1007/s11033-012-1997-x. Available from: http://www.ncbi.nlm.nih.gov/pubmed/23065264 Kraft, W., and U. Durr. 2005. Klinische Labordiagnostik in der Tiermedizin [Clinical laboratory diagnostics in veterinary medicine]. 6th ed. Stuttgart, Germany. Kramer, J. 2000. Normal hematology of cattle; sheep; and goats. In: Schalm’s veterinary hematology. 5th ed. p. 1075–1084. Laborie, E. 2018. Weighing the Costs and Benefits of Preconditioning | Drovers. Drovers. Available from: https://www.drovers.com/article/weighing-costs-and-benefits-preconditioning Leach, R. J., C. G. Chitko-McKown, G. L. Bennett, S. A. Jones, S. D. Kachman, J. W. Keele, K. A. Leymaster, R. M. Thallman, and L. A. Kuehn. 2013. The change in differing leukocyte populations during vaccination to bovine respiratory disease and their correlations with lung scores, health records, and average daily gain. J. Anim. Sci. 91:3564–3573. doi:10.2527/jas.2012-5911. Available from: https://academic.oup.com/jas/article/91/8/3564/4731328 Lynch, E., B. Earley, M. McGee, and S. Doyle. 2010. Effect of abrupt weaning at housing on leukocyte distribution, functional activity of neutrophils, and acute phase protein response of beef calves. BMC Vet. Res. 6:39. doi:10.1186/1746-6148-6-39. Available from: http://bmcvetres.biomedcentral.com/articles/10.1186/1746-6148-6-39 MacVean, D. W., D. K. Franzen, T. J. Keefe, and B. W. Bennett. 1986. Airborne particle concentration and meteorologic conditions associated with pneumonia incidence in feedlot cattle. Am. J. Vet. Res. 47:2676–82. Available from: http://www.ncbi.nlm.nih.gov/pubmed/3800131 Maday, J. 2018. Target your Feedlot Treatments. Drovers. Available from: https://www.drovers.com/article/target-your-feedlot-treatments Malazdrewich, C., P. 
Thumbikat, M. . Abrahamsen, and S. . Maheswaran. 2004. Pharmacological inhibition of Mannheimia haemolytica lipopolysaccharide and leukotoxin- induced cytokine expression in bovine alveolar macrophages. Microb. Pathog. 36:159–169. doi:10.1016/J.MICPATH.2003.11.002. Available from: https://www.sciencedirect.com/science/article/pii/S088240100300216X 99 Mosier, D. 2015. Review of BRD pathogenesis: The old and the new. Anim. Heal. Res. Rev. 15:166–168. doi:10.1017/S1466252314000176. Muggli-Cockett, N. E., L. V. Cundiff, and K. E. Gregory. 1992. Genetic analysis of bovine respiratory disease in beef calves during the first year of life. J. Anim. Sci. 70:2013–2019. doi:10.2527/1992.7072013x. Available from: https://academic.oup.com/jas/article/70/7/2013- 2019/4632005 Muller, K., and R. Berg. 2011. Thoracic Cavity. In: K. Budras, P. Greenough, R. Habel, and C. Mulling, editors. Bovine Anatomy . 2nd ed. Schlutersche Verlagsgesellschaft mbH & Co. KG. p. 141–143. Neethirajan, S. 2017. Recent advances in wearable sensors for animal health management. Sens. Bio-Sensing Res. 12:15–29. doi:10.1016/J.SBSR.2016.11.004. Available from: https://www.sciencedirect.com/science/article/pii/S2214180416301350#bb0145 Neibergs, H. L., C. M. Seabury, A. J. Wojtowicz, Z. Wang, E. Scraggs, J. N. Kiser, M. Neupane, J. E. Womack, A. Eenennaam, G. Hagevoort, T. W. Lehenbauer, S. Aly, J. Davis, and J. F. Taylor. 2014. Susceptibility loci revealed for bovine respiratory disease complex in pre-weaned holstein calves. BMC Genomics. 15:1164. doi:10.1186/1471-2164-15-1164. Available from: http://bmcgenomics.biomedcentral.com/articles/10.1186/1471-2164-15-1164 Nordlund, K. 2006. Housing Factors To Optimize Respiratory Health of Calves. Vet. Med. doi:10.13031/2013.22789. Odore, R., A. D’Angelo, P. Badino, C. Bellino, S. Pagliasso, and G. Re. 2004. Road transportation affects blood hormone levels and lymphocyte glucocorticoid and β-adrenergic receptor concentrations in calves. Vet. J. 168:297–303. doi:10.1016/j.tvjl.2003.09.008. Owens, F. N., D. S. Secrist, W. J. Hill, and D. R. Gill. 1998. Acidosis in Cattle: A Review. J. Anim. Sci. 76:275–286. doi:10.2527/1998.761275x. Panciera, R. J., and A. W. Confer. 2010. Pathogenesis and pathology of bovine pneumonia. Vet. Clin. North Am. Food Anim. Pract. 26:191–214. doi:10.1016/j.cvfa.2010.04.001. Available from: http://www.ncbi.nlm.nih.gov/pubmed/20619179 Parkin, J., and B. Cohen. 2001. An overview of the immune system. Lancet. 357:1777–1789. doi:10.1016/S0140-6736(00)04904-7. Available from: https://www.sciencedirect.com/science/article/pii/S0140673600049047#cesec170 Rice, J. A., Carrasco-Medina, L., Hodgins, D. C., Shewen, P. E. 2008. Mannheimia haemolytica and bovine respiratory disease. Anim. Heal. Res. Rev. 8:117–128. Available from: ftp://173.183.201.52/Inetpub/wwwroot/DairyWeb/Resources/Research/AHRR8/AHRR8_117.pd f Richeson, J. T., T. E. Lawrence, and B. J. White. 2018. Using advanced technologies to quantify beef cattle behavior. Transl. Anim. Sci. 2:223–229. doi:10.1093/tas/txy004. Available from: https://academic.oup.com/tas/article/2/2/223/4904093 100 Roland, L., M. Drillich, and M. Iwersen. 2014. Hematology as a diagnostic tool in bovine medicine. J. Vet. Diagnostics. 26:592–598. doi:10.1177/1040638714546490. Available from: https://journals.sagepub.com/doi/pdf/10.1177/1040638714546490 Rosales, C., and E. Uribe-Querol. 2017. Phagocytosis: A Fundamental Process in Immunity. Biomed Res. Int. 2017:9042851. doi:10.1155/2017/9042851. 
Rosenbusch, R. F., J. M. Kinyon, M. Apley, N. D. Funk, S. Smith, and L. J. Hoffman. 2005. In Vitro Antimicrobial Inhibition Profiles of Mycoplasma bovis Isolates Recovered from Various Regions of the United States from 2002 to 2003. J. Vet. Diagnostic Investig. 17:436–441. doi:10.1177/104063870501700505. Available from: http://www.ncbi.nlm.nih.gov/pubmed/16312234
Schneider, M. J., R. G. Tait, W. D. Busby, and J. M. Reecy. 2009. An evaluation of bovine respiratory disease complex in feedlot cattle: Impact on performance and carcass traits using treatment records and lung lesion scores. J. Anim. Sci. 87:1821–1827. doi:10.2527/jas.2008-1283.
Schwartzkopf-Genswein, K., and T. Grandin. 2014. Cattle Transport by Road. In: Livestock Handling and Transport. 4th ed. CAB International. p. 143–173. Available from: http://www.cabi.org/cabebooks/ebook/20143217260
Shelley, C., and C. Matney. 2016. Bovine Respiratory Disease: Preconditioning Calves. Fort Collins, CO. Available from: https://extension.colostate.edu/docs/pubs/livestk/08023.pdf
Snowder, G. D., L. D. Van Vleck, L. V. Cundiff, and G. L. Bennett. 2006. Bovine respiratory disease in feedlot cattle: Environmental, genetic, and economic factors. J. Anim. Sci. 84:1999–2008. doi:10.2527/jas.2006-046. Available from: https://academic.oup.com/jas/article/84/8/1999/4777251
Snyder, E., B. Credille, R. Berghaus, and S. Giguère. 2017. Prevalence of multidrug antimicrobial resistance in Mannheimia haemolytica isolated from high-risk stocker cattle at arrival and two weeks after processing. J. Anim. Sci. 95:1124–1131. doi:10.2527/jas2016.1110.
Step, D. L., C. R. Krehbiel, H. A. DePra, J. J. Cranston, R. W. Fulton, J. G. Kirkpatrick, D. R. Gill, M. E. Payton, M. A. Montelongo, and A. W. Confer. 2008. Effects of commingling beef calves from different sources and weaning protocols during a forty-two-day receiving period on performance and bovine respiratory disease. J. Anim. Sci. 86:3146–3158. doi:10.2527/jas.2008-0883. Available from: https://academic.oup.com/jas/article/86/11/3146/4789170
Sun, J., T. Huang, C. Chen, T.-T. Cao, K. Cheng, X.-P. Liao, and Y.-H. Liu. 2017. Comparison of Fecal Microbial Composition and Antibiotic Resistance Genes from Swine, Farm Workers and the Surrounding Villagers. Sci. Rep. 7:4965. doi:10.1038/s41598-017-04672-y. Available from: http://www.ncbi.nlm.nih.gov/pubmed/28694474
Sylte, M. J., L. B. Corbeil, T. J. Inzana, and C. J. Czuprynski. 2001. Haemophilus somnus Induces Apoptosis in Bovine Endothelial Cells In Vitro. Infect. Immun. 69:1650–1660. doi:10.1128/IAI.69.3.1650-1660.2001. Available from: http://www.ncbi.nlm.nih.gov/pubmed/11179340
Taylor, J. D., R. W. Fulton, T. W. Lehenbauer, D. L. Step, and A. W. Confer. 2010. The epidemiology of bovine respiratory disease: What is the evidence for predisposing factors? Can. Vet. J. 51:1095–102. Available from: http://www.ncbi.nlm.nih.gov/pubmed/21197200
Thomas, P. 1978. Fibrosing alveolitis. Can. Med. Assoc. J. 119:1211–6. Available from: http://www.ncbi.nlm.nih.gov/pubmed/369671
Thrift, F. A., and T. A. Thrift. 2011. REVIEW: Update on preconditioning beef calves prior to sale by cow-calf producers. Prof. Anim. Sci. 27:73–82. doi:10.15232/S1080-7446(15)30452-6. Available from: https://www.sciencedirect.com/science/article/pii/S1080744615304526
Timsit, E., S. Assié, R. Quiniou, H. Seegers, and N. Bareille. 2011. Early detection of bovine respiratory disease in young bulls using reticulo-rumen temperature boluses. Vet. J. 190:136–142. doi:10.1016/J.TVJL.2010.09.012. Available from: https://www.sciencedirect.com/science/article/pii/S1090023310003035?via%3Dihub
Timsit, E., N. Dendukuri, I. Schiller, and S. Buczinski. 2016. Diagnostic accuracy of clinical illness for bovine respiratory disease (BRD) diagnosis in beef cattle placed in feedlots: A systematic literature review and hierarchical Bayesian latent-class meta-analysis. Prev. Vet. Med. 135:67–73. doi:10.1016/j.prevetmed.2016.11.006.
Toaff-Rosenstein, R. L., L. J. Gershwin, and C. B. Tucker. 2016. Fever, feeding, and grooming behavior around peak clinical signs in bovine respiratory disease. J. Anim. Sci. 94:3918–3932. doi:10.2527/jas2016-0346. Available from: https://dott-scivet.campusnet.unito.it/att/Art_B3.pdf
Urban-Chmiel, R., and D. L. Grooms. 2012. Prevention and Control of Bovine Respiratory Disease. Available from: https://livestockscience.in/wp-content/uploads/2012/Bovine_Respiratory_Disease.pdf
USDA. 2013. Types and Costs of Respiratory Disease Treatments in U.S. Feedlots. Available from: http://nahms.aphis.usda.gov
USDA. 2018. United States National Residue Program for Meat, Poultry, and Egg Products 2019 Residue Sampling Plans. Washington, DC. Available from: https://www.fsis.usda.gov/wps/wcm/connect/394f0bd4-2c5d-47bc-ba4f-f65992972e43/2019-blue-book.pdf?MOD=AJPERES
USDA, NASS. 2018. January 1 U.S. All Cattle and Calves Inventory 1867-2018. USDA. Available from: https://www.nass.usda.gov/Charts_and_Maps/Cattle/inv.php
Voisinet, B. D., T. Grandin, S. F. O’Connor, J. D. Tatum, and M. J. Deesing. 1997. Bos indicus-cross feedlot cattle with excitable temperaments have tougher meat and a higher incidence of borderline dark cutters. Meat Sci. 46:367–377. doi:10.1016/S0309-1740(97)00031-4. Available from: https://www.sciencedirect.com/science/article/abs/pii/S0309174097000314?via%3Dihub
Wathes, C., C. Jones, and A. Webster. 1983. Ventilation, air hygiene and animal health. Vet. Rec. 554–559. Available from: https://www.aivc.org/resource/ventilation-air-hygiene-and-animal-health
Westcott, P., and R. Trostle. 2014. USDA Agricultural Projections to 2023. Washington, DC. Available from: www.ers.usda.gov/topics/farm-economy/agricultural-baseline-projections/usdas-long-
Whisper - Veterinary Stethoscope. Merck Co. Available from: https://www.merck-animal-health-usa.com/whisper
White, B. J., and D. G. Renter. 2009. Bayesian Estimation of the Performance of Using Clinical Observations and Harvest Lung Lesions for Diagnosing Bovine Respiratory Disease in Post-weaned Beef Calves. J. Vet. Diagnostic Investig. 21:446–453. doi:10.1177/104063870902100405. Available from: http://journals.sagepub.com/doi/10.1177/104063870902100405
Whiteley, L. O., S. K. Maheswaran, D. J. Weiss, T. R. Ames, and M. S. Kannan. 1992. Pasteurella haemolytica A1 and bovine respiratory disease. J. Vet. Intern. Med. 6:11–22.
Wilson, B. K., C. J. Richards, D. L. Step, and C. R. Krehbiel. 2017. Beef Species Symposium: Best management practices for newly weaned calves for improved health and well-being. J. Anim. Sci. 95:2170–2182. doi:10.2527/jas.2016.1006. Available from: https://academic.oup.com/jas/article/95/5/2170/4703679
Wittum, T. E., C. R. Young, L. H. Stanker, D. D. Griffin, L. J. Perino, and E. T. Littledike. 1996. Haptoglobin response to clinical respiratory tract disease in feedlot cattle. Am. J. Vet. Res. 57:646–9. Available from: http://www.ncbi.nlm.nih.gov/pubmed/8723875
Wolfger, B., E. Timsit, B. J. White, and K. Orsel. 2015. A systematic review of bovine respiratory disease diagnosis focused on diagnostic confirmation, early detection, and prediction of unfavorable outcomes in feedlot cattle. Vet. Clin. North Am. Food Anim. Pract. 31:351–365. doi:10.1016/j.cvfa.2015.05.005. Available from: http://www.ncbi.nlm.nih.gov/pubmed/26210764
Wood, D., and G. Quiroz-Rocha. 2010. Normal hematology of cattle. In: D. Weiss and K. Wardrop, editors. Schalm’s veterinary hematology. 6th ed. Wiley, Ames, IA. p. 829–835.
Wood, L., H. Scott, M. Garg, and P. Gibson. 2009. Innate immune mechanisms linking non-esterified fatty acids and respiratory disease. Prog. Lipid Res. 48:27–43. doi:10.1016/j.plipres.2008.10.001. Available from: http://www.ncbi.nlm.nih.gov/pubmed/19017534
Yang, W. 2017. Early warning for infectious disease outbreak: theory and practice. Academic Press.
Yu, Z., T. Wang, H. Sun, Z. Xia, K. Zhang, D. Chu, Y. Xu, Y. Xin, W. Xu, K. Cheng, X. Zheng, G. Huang, Y. Zhao, S. Yang, Y. Gao, and X. Xia. 2013. Contagious caprine pleuropneumonia in endangered Tibetan antelope, China, 2012. Emerg. Infect. Dis. 19:2051–3. doi:10.3201/eid1912.130067. Available from: http://www.ncbi.nlm.nih.gov/pubmed/24274020
Zinn, R. A., A. Barreras, F. N. Owens, and A. Plascencia. 2008. Performance by feedlot steers and heifers: Daily gain, mature body weight, dry matter intake, and dietary energetics. J. Anim. Sci. 86:2680–2689. doi:10.2527/jas.2007-0561. Available from: http://www.ncbi.nlm.nih.gov/pubmed/18539825