The Lancet Commission on diabetes: using data to transform diabetes care and patient lives
Chan JCN , Lim LL , Wareham NJ , Shaw JE , Orchard TJ , Zhang P , Lau ESH , Eliasson B , Kong APS , Ezzati M , Aguilar-Salinas CA , McGill M , Levitt NS , Ning G , So WY , Adams J , Bracco P , Forouhi NG , Gregory GA , Guo J , Hua X , Klatman EL , Magliano DJ , Ng BP , Ogilvie D , Panter J , Pavkov M , Shao H , Unwin N , White M , Wou C , Ma RCW , Schmidt MI , Ramachandran A , Seino Y , Bennett PH , Oldenburg B , Gagliardino JJ , Luk AOY , Clarke PM , Ogle GD , Davies MJ , Holman RR , Gregg EW . Lancet 2020 396 (10267) 2019-2082

2020 will go down in history as the year when the global community was awakened to the fragility of human health and the interdependence of the ecosystem, economy, and humanity. Amid the COVID-19 pandemic, people with diabetes proved especially vulnerable during a public health emergency: their risk of severe disease or death was at least doubled, particularly among individuals with poorly controlled diabetes, comorbidities, or both. The disease burden caused by COVID-19, exacerbated by chronic conditions like diabetes, has inflicted a heavy toll on health-care systems and the global economy.

In this Lancet Commission on diabetes, which embodies 4 years of extensive work on data curation, synthesis, and modelling, we urge policy makers, payers, and planners to collectively change the ecosystem, build capacity, and improve the clinical practice environment. Such actions will enable practitioners to systematically collect data during routine practice and to use these data effectively to diagnose early, stratify risks, define needs, improve care, evaluate solutions, and drive changes at patient, system, and policy levels to prevent and control diabetes and other non-communicable diseases.
Emerging evidence regarding the possible damaging effects of severe acute respiratory syndrome coronavirus 2 on pancreatic islets implies the potential worsening of the COVID-19 pandemic and the diabetes epidemic, adding to the urgency of these collective actions.

Prevention, early detection, prompt diagnosis, and continuing care with regular monitoring and ongoing evaluation are key elements in reducing the growing burden of diabetes. Given the silent and progressive nature of diabetes, epidemiological analyses have provided a framework for identifying populations and subgroups at risk of diabetes and its complications. Although the total prevalence of diabetes reflects disease burden, incidence rates might reflect the effects of interventions among determinant factors that include, but are not limited to, political, socioeconomic, and technological changes within a population, area, or both.

In 2019, 463 million people had diabetes worldwide, with 80% from low-income and middle-income countries. Over 70% of global deaths are due to non-communicable diseases, including diabetes, cardiovascular disease, cancer, and respiratory disease. On average, diabetes reduces life expectancy in people aged 40–60 years by 4–10 years and independently increases the risk of death from cardiovascular disease, renal disease, and cancer by 1·3–3·0 times. Diabetes is among the leading causes of non-traumatic lower extremity amputation and blindness, especially in people of working age. The co-occurrence of these morbidities severely impairs quality of life, reduces productivity, and causes major suffering.
Imputed state-level prevalence of achieving goals to prevent complications of diabetes in adults with self-reported diabetes - United States, 2017-2018
Chen Y , Rolka D , Xie H , Saydah S . MMWR Morb Mortal Wkly Rep 2020 69 (45) 1665-1670 Diabetes increases the risk for developing cardiovascular, neurologic, kidney, eye, and other complications. Diabetes and related complications also pose a huge economic cost to society: in 2017, the estimated total economic cost of diagnosed diabetes was $327 billion in the United States (1). Diabetes complications can be prevented or delayed through the management of blood glucose (measured by hemoglobin A1C), blood pressure (BP), and non-high-density lipoprotein cholesterol (non-HDL-C) levels, and by avoiding smoking; these are collectively known as the ABCS goals (hemoglobin A1C, Blood pressure, Cholesterol, Smoking) (2-5). Assessments of achieving ABCS goals among adults with diabetes are available at the national level (4,6); however, studies that assess state-level prevalence of meeting ABCS goals have been lacking. This report provides imputed state-level proportions of adults with self-reported diabetes meeting ABCS goals in each of the 50 U.S. states and the District of Columbia (DC). State-level estimates were created by raking and multiple imputation methods (7,8) using data from the 2009-2018 National Health and Nutrition Examination Survey (NHANES), 2017-2018 American Community Survey (ACS), and 2017-2018 Behavioral Risk Factor Surveillance System (BRFSS). Among U.S. adults with diabetes, an estimated 26.4% met combined ABCS goals, and 75.4%, 70.4%, 55.8%, and 86.0% met A1C <8%, BP <140/90 mmHg, non-HDL-C <130 mg/dL, and nonsmoking goals, respectively. Public health departments could use these data in their planning efforts to achieve ABCS goal levels and reduce diabetes-related complications at the state level.
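For readers reproducing such analyses, the four ABCS thresholds quoted in the abstract can be encoded directly. The sketch below is a minimal illustration, not code from the report; the function name and the dictionary layout are my own.

```python
def meets_abcs_goals(a1c_pct, sbp_mmhg, dbp_mmhg, non_hdl_mgdl, is_smoker):
    """Check the four ABCS goals against the thresholds in the abstract:
    A1C <8%, BP <140/90 mmHg, non-HDL-C <130 mg/dL, and nonsmoking.
    Returns each individual goal plus the combined goal."""
    goals = {
        "A1C": a1c_pct < 8.0,
        "BP": sbp_mmhg < 140 and dbp_mmhg < 90,
        "Cholesterol": non_hdl_mgdl < 130,
        "Smoking": not is_smoker,
    }
    goals["combined"] = all(goals.values())
    return goals

# Example: controlled A1C and BP, elevated non-HDL-C, nonsmoker
print(meets_abcs_goals(7.2, 128, 82, 145, False))
```

In the example, the elevated non-HDL-C alone makes the combined goal fail, mirroring how the report counts only 26.4% of adults as meeting all four goals despite much higher attainment of each goal individually.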
Angiotensin-converting enzyme inhibitor or angiotensin receptor blocker use among hypertensive US adults with albuminuria
Chu CD , Powe NR , McCulloch CE , Banerjee T , Crews DC , Saran R , Bragg-Gresham J , Morgenstern H , Pavkov ME , Saydah SH , Tuot DS . Hypertension 2020 77 (1) 94-102 Since 2003, US hypertension guidelines have recommended ACE (angiotensin-converting enzyme) inhibitors or ARBs (angiotensin receptor blockers) as first-line antihypertensive therapy in the presence of albuminuria (urine albumin/creatinine ratio ≥300 mg/g). To examine national trends in guideline-concordant ACE inhibitor/ARB utilization, we studied adults participating in the National Health and Nutrition Examination Surveys 2001 to 2018 with hypertension (defined by self-report of high blood pressure, systolic blood pressure ≥140 mm Hg or diastolic ≥90 mm Hg, or use of antihypertensive medications). Among 20 538 included adults, the prevalence of albuminuria ≥300 mg/g was 2.8% in 2001 to 2006, 2.8% in 2007 to 2012, and 3.2% in 2013 to 2018. Among those with albuminuria ≥300 mg/g, no consistent trends were observed for the proportion receiving ACE inhibitor/ARB treatment from 2001 to 2018 among persons with diabetes, without diabetes, or overall. In 2013 to 2018, ACE inhibitor/ARB usage in the setting of albuminuria ≥300 mg/g was 55.3% (95% CI, 46.8%-63.6%) among adults with diabetes and 33.4% (95% CI, 23.1%-45.5%) among those without diabetes. Based on US population counts, these estimates represent 1.6 million adults with albuminuria ≥300 mg/g currently not receiving ACE inhibitor/ARB therapy, nearly half of whom do not have diabetes. ACE inhibitor/ARB underutilization represents a significant gap in preventive care delivery for adults with hypertension and albuminuria that has not substantially changed over time.
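The study's eligibility logic reduces to two checks: its hypertension definition and the albuminuria threshold at which first-line ACE inhibitor/ARB therapy is guideline-concordant. A minimal sketch under those stated definitions (the function names are illustrative, not from the paper):

```python
def is_hypertensive(self_report, sbp_mmhg, dbp_mmhg, on_antihypertensives):
    """Hypertension as defined in the study: self-reported high blood
    pressure, SBP >=140 mm Hg, DBP >=90 mm Hg, or antihypertensive use."""
    return self_report or sbp_mmhg >= 140 or dbp_mmhg >= 90 or on_antihypertensives

def ace_arb_indicated(hypertensive, uacr_mg_g):
    """Guideline-concordant first-line ACE inhibitor/ARB therapy applies
    to hypertensive adults with albuminuria >=300 mg/g."""
    return bool(hypertensive) and uacr_mg_g >= 300

# Example: untreated adult with SBP 150 mm Hg and UACR 350 mg/g
print(ace_arb_indicated(is_hypertensive(False, 150, 85, False), 350))  # True
```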
Estimated number of eligible Part B beneficiaries for the medicare diabetes prevention program at the county level and by urban-rural classification
Ng BP , Cheng YJ , Rutledge S , Cannon MJ , Zhang P , Smith BD . PLoS One 2020 15 (11) e0241757 INTRODUCTION: Diabetes imposes large health and financial burdens on Medicare beneficiaries. Type 2 diabetes can be prevented or delayed through lifestyle modification programs. In 2018, Medicare began to offer the Medicare Diabetes Prevention Program (MDPP), a lifestyle intervention, to eligible beneficiaries nationwide. The number of MDPP-eligible beneficiaries is not known, but this information is essential in efforts to expand the program and increase enrollment. This study aimed to estimate the number and spatial variation of MDPP-eligible Part B beneficiaries at the county level and by urban-rural classification. METHODS: Data from 2011-2016 National Health and Nutrition Examination Surveys and a survey-weighted logistic regression model were used to estimate proportions of prediabetes in the United States by sex, age, and race/ethnicity based on the MDPP eligibility criteria. The results from the predictive model were applied to 2015 Medicare Part B beneficiaries to estimate the number of MDPP-eligible beneficiaries. The National Center for Health Statistics' Urban-Rural Classification Scheme for Counties from 2013 was used to define urban and rural categories. RESULTS: An estimated 5.2 million (95% CI = 3.5-7.0 million) Part B beneficiaries were eligible for the MDPP. By state, estimates ranged from 13,000 (95% CI = 8,500-18,000) in Alaska to 469,000 (95% CI = 296,000-641,000) in California. There were 2,149 counties with ≤1,000 eligible beneficiaries and 11 with >25,000. Consistent with demographic patterns, urban counties had more eligible beneficiaries than rural counties. CONCLUSIONS: These estimates could be used to plan locations for new MDPPs and reach eligible Part B beneficiaries for enrollment.
Risk of cervical precancer and cancer among uninsured and underserved women from 2009 to 2017
Saraiya M , Cheung LC , Soman A , Mix J , Kenney K , Chen X , Perkins RB , Schiffman M , Wentzensen N , Miller J . Am J Obstet Gynecol 2020 224 (4) 366 e1-366 e32 BACKGROUND: New guidelines for managing cervical precancer among women in the United States use risk directly to guide clinical actions for individuals who are being screened. These risk-based management guidelines have previously only been based on risks from a large integrated healthcare system. We present here data representative of women of low income without continuous insurance coverage to inform the 2019 guidelines and ensure applicability. OBJECTIVE: We examined the risks of high-grade precancer after human papillomavirus and cytology tests in underserved women and assessed the applicability of the 2019 guidelines to this population. STUDY DESIGN: We examined cervical cancer screening and follow-up data among 363,546 women enrolled in the Centers for Disease Control and Prevention's National Breast and Cervical Cancer Early Detection Program from 2009 to 2017. We estimated the immediate (prevalent) risks of cervical intraepithelial neoplasia grade 3 or cancer by using prevalence-incidence mixture models. Risks were estimated for each combination of human papillomavirus and cytology result and were stratified by screening history. We compared these risks with published estimates used in new risk-based management guidelines. RESULTS: Women who were up-to-date with their screening, defined as being screened with cytology within the past 5 years, had immediate risks of cervical intraepithelial neoplasia grade 3 or higher similar to those of women at Kaiser Permanente Northern California, whose data were used to develop the management guidelines. However, women in the Centers for Disease Control and Prevention's National Breast and Cervical Cancer Early Detection Program had greater immediate risks if they were never screened or not up-to-date with their screening. 
CONCLUSION: New cervical risk-based management guidelines are applicable for underinsured and uninsured women with a low income in the United States who are up-to-date with their screening. The increased risk observed here among women who received human papillomavirus-positive, high-grade cytology results, who were never screened, or who were not up-to-date with their cervical cancer screening led to a recommendation in the management guidelines for immediate treatment among these women.
Cost-effectiveness of the new 2018 American College of Physicians Glycemic Control Guidance Statements Among US Adults With Type 2 Diabetes
Shao H , Laxy M , Gregg EW , Albright A , Zhang P . Value Health 2020 24 (2) 227-235 Objectives: This study aims to estimate the national impact and cost-effectiveness of the 2018 American College of Physicians (ACP) guidance statements compared to the status quo. Methods: Survey data from the 2011-2016 National Health and Nutrition Examination Survey were used to generate a nationally representative sample of individuals with diagnosed type 2 diabetes in the United States. Individuals with A1c <6.5% on antidiabetic medications are recommended to deintensify their A1c level to 7.0% to 8.0% (group 1); individuals with A1c 6.5% to 8.0% and a life expectancy of <10 years are recommended to deintensify their A1c level to >8.0% (group 2); and individuals with A1c >8.0% and a life expectancy of >10 years are recommended to intensify their A1c level to 7.0% to 8.0% (group 3). We used a Markov-based simulation model to evaluate the lifetime cost-effectiveness of following the ACP-recommended A1c level. Results: 14.41 million (58.1%) persons with diagnosed type 2 diabetes would be affected by the new guidance statements. Treatment deintensification would lead to a saving of $363 600 per quality-adjusted life-year (QALY) lost for group 1 and a saving of $118 300 per QALY lost for group 2. Intensifying treatment for group 3 would lead to an additional cost of $44 600 per QALY gained. Nationally, the implementation of the guidance would add 3.2 million life-years and 1.1 million QALYs and reduce healthcare costs by $47.7 billion compared to the status quo. Conclusions: Implementing the new ACP guidance statements would affect a large number of persons with type 2 diabetes nationally. The new guidance is cost-effective.
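The three guidance groups described in the abstract can be expressed as a simple classifier. This is a hedged sketch of the stated criteria only (function name and return codes are my own; edge cases the abstract does not specify, such as A1c between 6.5% and 8.0% with long life expectancy, fall through to None):

```python
def acp_group(a1c_pct, on_antidiabetic_meds, life_expectancy_yrs):
    """Assign the ACP guidance groups as stated in the abstract.

    Group 1: A1c <6.5% on antidiabetic medication -> deintensify to 7.0-8.0%.
    Group 2: A1c 6.5-8.0%, life expectancy <10 years -> deintensify to >8.0%.
    Group 3: A1c >8.0%, life expectancy >10 years -> intensify to 7.0-8.0%.
    Returns None when no guidance statement applies.
    """
    if a1c_pct < 6.5 and on_antidiabetic_meds:
        return 1
    if 6.5 <= a1c_pct <= 8.0 and life_expectancy_yrs < 10:
        return 2
    if a1c_pct > 8.0 and life_expectancy_yrs > 10:
        return 3
    return None
```

Applying such a classifier to a representative NHANES-based sample is how the study arrives at its estimate that 58.1% of diagnosed type 2 diabetes cases would be affected.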
Trajectories in estimated glomerular filtration rate in youth-onset type 1 and type 2 diabetes: The SEARCH for Diabetes in Youth Study
Westreich KD , Isom S , Divers J , D'Agostino R , Lawrence JM , Kanakatti Shankar R , Dolan LM , Imperatore G , Dabelea D , Mayer-Davis EJ , Mottl AK . J Diabetes Complications 2020 35 (2) 107768 AIMS: We sought to characterize the direction and associated factors of eGFR change following diagnosis of youth-onset type 1 and type 2 diabetes. METHODS: We assessed the direction of eGFR change at two visits (mean 6.6 years apart) in SEARCH, a longitudinal cohort study of youth-onset type 1 and type 2 diabetes. We used the CKiD (Cr-CysC) equation to estimate GFR and categorized 'rising' and 'declining' eGFR as an annual change of ≥3 ml/min/1.73 m(2) in either direction. Multivariable logistic regression evaluated factors associated with directional change in eGFR. RESULTS: Estimated GFR declined in 23.8% and rose in 2.8% of participants with type 1 diabetes (N = 1225; baseline age 11.4 years), and declined in 18.1% and rose in 15.6% of participants with type 2 diabetes (N = 160; baseline age 15.0 years). Factors associated with rising and declining eGFR (versus stable) in both type 1 and type 2 diabetes included sex, age at diagnosis, baseline eGFR and difference in fasting glucose between study visits. Additional factors in type 1 diabetes included time from baseline visit, HbA1c and body mass index. CONCLUSIONS: Over the first decade of diabetes, eGFR decline is more common in type 1 diabetes whereas eGFR rise is more common in type 2 diabetes.
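The trajectory rule used in the abstract, an annual change of ≥3 ml/min/1.73 m² in either direction, can be sketched as follows (an illustrative helper under the stated definition, not SEARCH study code):

```python
def egfr_trajectory(egfr_baseline, egfr_followup, years_between):
    """Categorize eGFR change as in the SEARCH analysis: an annual change
    of >=3 ml/min/1.73 m^2 in either direction is 'rising' or 'declining';
    smaller annual changes are 'stable'."""
    annual_change = (egfr_followup - egfr_baseline) / years_between
    if annual_change >= 3:
        return "rising"
    if annual_change <= -3:
        return "declining"
    return "stable"

# Example over the study's mean 6.6-year interval:
print(egfr_trajectory(100, 75, 6.6))  # declining (about -3.8 per year)
```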
Within-trial cost-effectiveness of a structured lifestyle intervention in adults with overweight/obesity and type 2 diabetes: Results from the Action for Health in Diabetes (Look AHEAD) Study
Zhang P , Atkinson KM , Bray G , Chen H , Clark JM , Coday M , Dutton GR , Egan C , Espeland MA , Evans M , Foreyt JP , Greenway FL , Gregg EW , Hazuda HP , Hill JO , Horton ES , Hubbard VS , Huckfeldt PJ , Jackson SD , Jakicic JM , Jeffery RW , Johnson KC , Kahn SE , Killean T , Knowler WC , Korytkowski M , Lewis CE , Maruthur NM , Michaels S , Montez MG , Nathan DM , Patricio J , Peters A , Pi-Sunyer X , Pownall H , Redmon B , Rushing JT , Steinburg H , Wadden TA , Wing RR , Wyatt H , Yanovski SZ . Diabetes Care 2020 44 (1) 67-74 OBJECTIVE: To assess the cost-effectiveness (CE) of an intensive lifestyle intervention (ILI) compared with standard diabetes support and education (DSE) in adults with overweight/obesity and type 2 diabetes, as implemented in the Action for Health in Diabetes study. RESEARCH DESIGN AND METHODS: Data were from 4,827 participants during their first 9 years of study participation, from 2001 to 2012. Information on Health Utilities Index Mark 2 (HUI-2) and HUI-3, Short-Form 6D (SF-6D), and Feeling Thermometer (FT), cost of delivering the interventions, and health expenditures was collected during the study. CE was measured by incremental CE ratios (ICERs) in costs per quality-adjusted life year (QALY). Future costs and QALYs were discounted at 3% annually. Costs were in 2012 U.S. dollars. RESULTS: Over the 9 years studied, the mean cumulative intervention costs and mean cumulative health care expenditures were $11,275 and $64,453 per person for ILI and $887 and $68,174 for DSE. Thus, ILI cost $6,666 more per person than DSE. Additional QALYs gained by ILI were not statistically significant as measured by the HUIs and were 0.07 and 0.15, respectively, measured by SF-6D and FT. The ICERs ranged from no health benefit with a higher cost based on HUIs to $96,458/QALY and $43,169/QALY, respectively, based on SF-6D and FT. 
CONCLUSIONS: Whether ILI was cost-effective over the 9-year period is unclear because different health utility measures led to different conclusions.
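The ICERs above follow the standard definition, incremental cost divided by incremental QALYs. A minimal sketch (my own helper, not Look AHEAD code; the published ratios were computed from unrounded person-level data, so plugging in the rounded figures from the abstract reproduces them only approximately):

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio in dollars per QALY gained.
    Returns +/- infinity when the QALY difference is zero, i.e. extra
    cost (or saving) with no measurable health benefit."""
    if delta_qaly == 0:
        return float("inf") if delta_cost > 0 else float("-inf")
    return delta_cost / delta_qaly

# Rounded abstract figures: ILI cost $6,666 more per person; QALY gains
# of 0.07 (SF-6D) and 0.15 (FT). Compare with the published $96,458 and
# $43,169 per QALY, which used unrounded data.
print(round(icer(6666, 0.07)))
print(round(icer(6666, 0.15)))
```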
HIV drug resistance profile in South Africa: Findings and implications from the 2017 national HIV household survey.
Moyo S , Hunt G , Zuma K , Zungu M , Marinda E , Mabaso M , Kana V , Kalimashe M , Ledwaba J , Naidoo I , Takatshana S , Matjokotja T , Dietrich C , Raizes E , Diallo K , Kindra G , Mugore L , Rehle T . PLoS One 2020 15 (11) e0241071 BACKGROUND: HIV drug resistance (HIVDR) testing was included in the 2017 South African national HIV household survey. We describe the prevalence of HIVDR by drug class, age, sex and antiretroviral drugs (ARV) status. METHODS: Dried blood spots were tested for HIV; viral load (VL) measurement, assessment of ARV exposure, and HIVDR testing were performed on HIV-positive samples. HIVDR testing was conducted on samples with VL ≥1000 copies/ml using next-generation sequencing. Weighted percentages of HIVDR are reported. RESULTS: 697/1,105 (63%) of HIV positive samples were sequenced. HIVDR was detected in samples from 200 respondents (27.4%; 95% confidence interval (CI) 22.8-32.6). Among these, 130 (18.9%; 95% CI 14.8-23.8) had resistance to non-nucleoside reverse transcriptase inhibitors (NNRTIs) only, 63 (7.8%; 95% CI 5.6-10.9) had resistance to NNRTIs and nucleoside reverse transcriptase inhibitors, and 3 (0.5%; 95% CI 0.1-2.1) had resistance to protease inhibitors. Sixty-five ARV-positive samples (55.7%; 95% CI 42.6-67.9) had HIVDR, compared with 112 ARV-negative samples (22.8%; 95% CI 17.7-28.7). HIVDR was found in 75.6% (95% CI 59.2-87.3; n = 27) of samples from respondents who reported ARV use but tested ARV-negative, and in 15.3% (95% CI 6.3-32.8; n = 7) of those who reported no ARV use and tested ARV-negative. There were no significant age and sex differences in HIVDR. CONCLUSION: 27% of virally unsuppressed respondents had HIVDR, increasing to 75% among those who had discontinued ARV. Our findings support strengthening first-line ARV regimens by including drugs with a higher resistance barrier and treatment adherence strategies, and close monitoring of HIVDR.
Invasive pneumococcal strain distributions and isolate clusters associated with persons experiencing homelessness during 2018.
Metcalf BJ , Chochua S , Walker H , Tran T , Li Z , Varghese J , Snippes Vagnone PM , Lynfield R , McGee L , Li Y , Pilishvili T , Beall B . Clin Infect Dis 2020 72 (12) e948-e956 OBJECTIVES: We aimed to characterize invasive pneumococcal disease (IPD) isolates collected from multistate surveillance in the USA during 2018 and examine within-serotype propensities of isolates to form related clusters. METHODS: We predicted strain features using whole genome sequence obtained from 2885 IPD isolates obtained through the Centers for Disease Control and Prevention's Active Bacterial Core surveillance (ABCs) that has a surveillance population of approximately 34.5 million individuals distributed among 10 states. Phylogenetic analysis was provided for serotypes accounting for >27 isolates. RESULTS: Thirteen-valent conjugate vaccine (PCV13) serotypes together with 6C accounted for 23/105 (21.9%) of isolates from children aged <5 years and 820/2780 (29.5%) isolates from those aged >5 years. The most common serotypes from adult IPD isolates were serotypes 3 (413/2780, 14.9%), 22F (291/2780, 10.5%) and 9N (191/2780, 6.9%). Among IPD isolates from children, serotypes 15BC (18/105, 17.1%), 3 (11/105, 10.5%) and 33F (10/105, 9.5%) were most common. Serotypes 4, 12F, 20, and 7F had the highest proportions of isolates that formed related clusters together with highest proportions of isolates from persons experiencing homelessness (PEH). Among 84 isolates from long-term care facilities, two instances of highly related isolate pairs from co-residents were identified. CONCLUSIONS: Non-PCV13 serotypes accounted for more than 70% of IPD in ABCs; however, PCV13 serotype 3 remained the most common IPD serotype overall. Serotypes most common among PEH were more often associated with temporally related clusters identified both among PEH and among persons not reportedly experiencing homelessness.
Risk Assessment and Management of COVID-19 Among Travelers Arriving at Designated U.S. Airports, January 17-September 13, 2020.
Dollard P , Griffin I , Berro A , Cohen NJ , Singler K , Haber Y , de la Motte Hurst C , Stolp A , Atti S , Hausman L , Shockey CE , Roohi S , Brown CM , Rotz LD , Cetron MS , Alvarado-Ramy F . MMWR Morb Mortal Wkly Rep 2020 69 (45) 1681-1685 In January 2020, with support from the U.S. Department of Homeland Security (DHS), CDC instituted an enhanced entry risk assessment and management (screening) program for air passengers arriving from certain countries with widespread, sustained transmission of SARS-CoV-2, the virus that causes coronavirus disease 2019 (COVID-19). The objectives of the screening program were to reduce the importation of COVID-19 cases into the United States and slow subsequent spread within states. Screening aimed to identify travelers with COVID-19-like illness or who had a known exposure to a person with COVID-19 and separate them from others. Screening also aimed to inform all screened travelers about self-monitoring and other recommendations to prevent disease spread and obtain their contact information to share with public health authorities in destination states. CDC delegated postarrival management of crew members to airline occupational health programs by issuing joint guidance with the Federal Aviation Administration.* During January 17-September 13, 2020, a total of 766,044 travelers were screened, 298 (0.04%) of whom met criteria for public health assessment; 35 (0.005%) were tested for SARS-CoV-2, and nine (0.001%) had a positive test result. CDC shared contact information with states for approximately 68% of screened travelers because of data collection challenges and some states' opting out of receiving data. The low case detection rate of this resource-intensive program highlighted the need for fundamental change in the U.S. border health strategy. 
Because SARS-CoV-2 infection and transmission can occur in the absence of symptoms and because the symptoms of COVID-19 are nonspecific, symptom-based screening programs are ineffective for case detection. Since the screening program ended on September 14, 2020, efforts to reduce COVID-19 importation have focused on enhancing communications with travelers to promote recommended preventive measures, reinforcing mechanisms to refer overtly ill travelers to CDC, and enhancing public health response capacity at ports of entry. More efficient collection of contact information for international air passengers before arrival and real-time transfer of data to U.S. health departments would facilitate timely postarrival public health management, including contact tracing, when indicated. Incorporating health attestations, predeparture and postarrival testing, and a period of limited movement after higher-risk travel might reduce risk for transmission during travel and translocation of SARS-CoV-2 between geographic areas and help guide more individualized postarrival recommendations.
A Nationwide Outbreak of Invasive Pneumococcal Disease in Israel Caused by Streptococcus pneumoniae Serotype 2.
Dagan R , Ben-Shimol S , Benisty R , Regev-Yochay G , Lo SW , Bentley SD , Hawkins PA , McGee L , Ron M , Givon-Lavi N , Valinsky L , Rokney A . Clin Infect Dis 2020 73 (11) e3768-e3777 BACKGROUND: Invasive pneumococcal disease (IPD) caused by Streptococcus pneumoniae serotype 2 (Sp2) is infrequent. Large-scale outbreaks have not been reported following pneumococcal conjugate vaccine (PCV) implementation. We describe a Sp2 IPD outbreak in Israel, in the 13-valent PCV (PCV13) era, with focus on Sp2 population structure and evolutionary dynamics. METHODS: Data derived from population-based, nationwide active surveillance of IPD conducted since 2009. The 7-valent PCV (PCV7) and PCV13 were introduced in July 2009 and November 2010, respectively. Sp2 isolates were tested for antimicrobial susceptibility, multilocus sequence typing (MLST), and whole-genome sequencing (WGS) analysis. RESULTS: Overall, 170 Sp2 IPD cases were identified during 2009-2019; Sp2 increased in 2015 and caused 6% of IPD during 2015-2019, a 7-fold increase compared with 2009-2014. The outbreak was caused by a previously unreported molecular type (ST-13578), initially observed in Israel in 2014. This clone caused 88% of Sp2 during 2015-2019. ST-13578 is a single-locus variant of ST-1504, previously reported globally, including in Israel. WGS analysis confirmed clonality among the ST-13578 population. Single-nucleotide polymorphism-dense regions support a hypothesis that the ST-13578 outbreak clone evolved from ST-1504 by recombination. All tested strains were penicillin-susceptible (MIC <0.06 μg/mL). The ST-13578 clone was identified almost exclusively (99%) in the Jewish population and was mainly distributed in 3/7 Israeli districts. The outbreak is still ongoing, although declining since 2017. CONCLUSIONS: To the best of our knowledge, this is the first widespread Sp2 outbreak since PCV13 introduction worldwide, caused by the emerging ST-13578 clone.
Features of Streptococcus agalactiae strains recovered from pregnant women and newborns attending different hospitals in Ethiopia.
Ali MM , Woldeamanuel Y , Asrat D , Fenta DA , Beall B , Schrag S , McGee L . BMC Infect Dis 2020 20 (1) 848 BACKGROUND: Streptococcus agalactiae (Group B Streptococcus, GBS) serotypes, sequence types, and antimicrobial resistance profile vary across different geographic locations affecting disease patterns in newborns. These differences are important considerations for vaccine development efforts and data from large countries in Africa is limited. The aim of this study was to determine serotypes and genotypes of GBS isolates from pregnant women and their newborns in Ethiopia. METHODS: A hospital based cross-sectional study was conducted at three hospitals in Ethiopia from June 2014 to September 2015. Of 225 GBS isolates, 121 were recovered, confirmed and characterized at CDC's Streptococcus Laboratory using conventional microbiology methods and whole genome sequencing. RESULTS: Of the 121 isolates, 87 were from rectovaginal samples of pregnant women, 32 from different body parts of their newborns and 2 from blood of newborns with suspected sepsis. There were 25 mother-infant pairs and 24 pairs had concordant strains. The most prevalent serotypes were II, Ia and V (41.5%, 20.6% and 19.5% among mothers; 40.6%, 25% and 15.6% among newborns, respectively). Multilocus sequence typing (MLST) on 83 isolates showed ST10 (24; 28.9%) and ST2 (12; 14.5%) as most predominant sequence types. All GBS strains were susceptible to penicillin, cefotaxime and vancomycin, which correlated with the presence of wildtype PBP2x types and the lack of known vancomycin-resistance genes. Tetracycline resistance was high (73; 88%, associated primarily with tetM, but also tetO and tetL). Five isolates (6%) were resistant to erythromycin and clindamycin and 3 isolates were fluoroquinolone-resistant, containing associated mutations in gyrA and parC genes. All isolates were positive for one of four homologous Alpha/Rib family determinants and 1-2 of the three main pilus types. 
CONCLUSIONS: Predominant serotypes were II, Ia, and V. A limited number of clonal types were identified, with two STs accounting for about half of the isolates. All strains collected in this study were susceptible to beta-lactam antibiotics and vancomycin. Typical of most GBS, these isolates were positive for a single alpha-like family protein and a serine-rich repeat gene, as well as 1-2 pilus determinants.
Remote Household Observation for Non-influenza Respiratory Viral Illness.
Emanuels A , Heimonen J , O'Hanlon J , Kim AE , Wilcox N , McCulloch DJ , Brandstetter E , Wolf CR , Logue JK , Han PD , Pfau B , Newman KL , Hughes JP , Jackson ML , Uyeki TM , Boeckh M , Starita LM , Nickerson DA , Bedford T , Englund JA , Chu HY . Clin Infect Dis 2020 73 (11) e4411-e4418 BACKGROUND: Non-influenza respiratory viruses are responsible for a substantial burden of disease in the United States. Household transmission is thought to contribute significantly to subsequent transmission through the broader community. In the context of the COVID-19 pandemic, contactless surveillance methods are of particular importance. METHODS: From November 2019 to April 2020, 303 households in the Seattle area were remotely monitored in a prospective longitudinal study for symptoms of respiratory viral illness. Enrolled participants reported weekly symptoms and submitted respiratory samples by mail in the event of an acute respiratory illness (ARI). Specimens were tested for fourteen viruses, including SARS-CoV-2, using RT-PCR. Participants completed all study procedures at home without physical contact with research staff. RESULTS: In total, 1171 unique participants in 303 households were monitored for ARI. Of participating households, 128 (42%) included a child aged <5 years and 202 (67%) included a child aged 5-12 years. Of the 678 swabs collected during the surveillance period, 237 (35%) tested positive for one or more non-influenza respiratory viruses. Rhinovirus, common human coronaviruses, and respiratory syncytial virus were the most common. Four cases of SARS-CoV-2 were detected in three households. CONCLUSIONS: This study highlights the circulation of respiratory viruses within households during the winter months amid the emergence of SARS-CoV-2. Contactless methods of recruitment, enrollment and sample collection were used throughout this study, demonstrating the feasibility of home-based, remote monitoring for respiratory infections.
Factors associated with typical enteropathogenic Escherichia coli infection among children <5 years old with moderate-to-severe diarrhoea in rural western Kenya, 2008-2012.
Fagerli K , Omore R , Kim S , Ochieng JB , Ayers TL , Juma J , Farag TH , Nasrin D , Panchalingam S , Robins-Browne RM , Nataro JP , Kotloff KL , Levine MM , Oundo J , Parsons MB , Laserson KF , Mintz ED , Breiman RF , O'Reilly CE . Epidemiol Infect 2020 148 1-37 Typical enteropathogenic Escherichia coli (tEPEC) infection is a major cause of diarrhoea and contributor to mortality in children <5 years old in developing countries. Data were analysed from the Global Enteric Multicenter Study examining children <5 years old seeking care for moderate-to-severe diarrhoea (MSD) in Kenya. Stool specimens were tested for enteric pathogens, including by multiplex polymerase chain reaction for gene targets of tEPEC. Demographic, clinical and anthropometric data were collected at enrolment and ~60-days later; multivariable logistic regressions were constructed. Of 1778 MSD cases enrolled from 2008 to 2012, 135 (7.6%) children tested positive for tEPEC. In a case-to-case comparison among MSD cases, tEPEC was independently associated with presentation at enrolment with a loss of skin turgor (adjusted odds ratio (aOR) 2.08, 95% confidence interval (CI) 1.37-3.17), and convulsions (aOR 2.83, 95% CI 1.12-7.14). At follow-up, infants with tEPEC were more likely than those without to be underweight (OR 2.2, 95% CI 1.3-3.6) and wasted (OR 2.5, 95% CI 1.3-4.6). Among MSD cases, tEPEC was associated with mortality (aOR 2.85, 95% CI 1.47-5.55). This study suggests that tEPEC contributes to morbidity and mortality in children. Interventions aimed at defining and reducing the burden of tEPEC and its sequelae should be urgently investigated, prioritised and implemented.
Characteristics of Hospitalized COVID-19 Patients Discharged and Experiencing Same-Hospital Readmission - United States, March-August 2020.
Lavery AM , Preston LE , Ko JY , Chevinsky JR , DeSisto CL , Pennington AF , Kompaniyets L , Datta SD , Click ES , Golden T , Goodman AB , Mac Kenzie WR , Boehmer TK , Gundlapalli AV . MMWR Morb Mortal Wkly Rep 2020 69 (45) 1695-1699 Coronavirus disease 2019 (COVID-19) is a complex clinical illness with potential complications that might require ongoing clinical care (1-3). Few studies have investigated discharge patterns and hospital readmissions among large groups of patients after an initial COVID-19 hospitalization (4-7). Using electronic health record and administrative data from the Premier Healthcare Database,* CDC assessed patterns of hospital discharge, readmission, and demographic and clinical characteristics associated with hospital readmission after a patient's initial COVID-19 hospitalization (index hospitalization). Among 126,137 unique patients with an index COVID-19 admission during March-July 2020, 15% died during the index hospitalization. Among the 106,543 (85%) surviving patients, 9% (9,504) were readmitted to the same hospital within 2 months of discharge through August 2020. More than a single readmission occurred among 1.6% of patients discharged after the index hospitalization. Readmissions occurred more often among patients discharged to a skilled nursing facility (SNF) (15%) or those needing home health care (12%) than among patients discharged to home or self-care (7%). The odds of hospital readmission increased with older age among persons aged ≥65 years, the presence of certain chronic conditions, hospitalization within the 3 months preceding the index hospitalization, and discharge from the index hospitalization to a SNF or to home with health care assistance. 
These results support recent analyses that found chronic conditions to be significantly associated with hospital readmission (6,7) and could be explained by the complications of underlying conditions in the presence of COVID-19 (8), COVID-19 sequelae (3), or indirect effects of the COVID-19 pandemic (9). Understanding the frequency of, and risk factors for, readmission can inform clinical practice, discharge disposition decisions, and public health priorities such as health care planning to ensure availability of resources needed for acute and follow-up care of COVID-19 patients. With the recent increases in cases nationwide, hospital planning can account for these increasing numbers along with the potential for at least 9% of patients to be readmitted, requiring additional beds and resources. |
Lessons Learned from a COVID-19 Biohazard Spill During Swabbing at a Quarantine Facility.
Mayer O , Pfundt T , Fortenberry GZ , Harcourt BH , Bower WA . Disaster Med Public Health Prep 2020 16 (3) 1-9 The need for increased testing for SARS-CoV-2, the virus that causes COVID-19, has resulted in an increase in testing facilities outside of traditional clinical settings and sample handling by individuals without appropriate biohazard and biocontainment training. During the repatriation and quarantine of passengers from the Grand Princess cruise ship at a U.S. military base, biocontainment of a potentially infectious sample from a passenger was compromised. This paper describes the steps taken to contain the spill and decontaminate the area, and discusses the need for adequate training in biohazard response. |
Lack of antibodies to SARS-CoV-2 in a large cohort of previously infected persons.
Petersen LR , Sami S , Vuong N , Pathela P , Weiss D , Morgenthau BM , Henseler RA , Daskalakis DC , Atas J , Patel A , Lukacs S , Mackey L , Grohskopf LA , Thornburg N , Akinbami LJ . Clin Infect Dis 2020 73 (9) e3066-e3073 BACKGROUND: Reports suggest that some persons previously infected with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) lack detectable IgG antibodies. We aimed to determine the proportion IgG seronegative and predictors for seronegativity among persons previously infected with SARS-CoV-2. METHODS: We analyzed serologic data collected from health care workers and first responders in New York City and the Detroit metropolitan area with history of a positive SARS-CoV-2 reverse transcriptase polymerase chain reaction (RT-PCR) test result and who were tested for IgG antibodies to SARS-CoV-2 spike protein at least 2 weeks after symptom onset. RESULTS: Of 2,547 persons with previous confirmed SARS-CoV-2 infection, 160 (6.3%) were seronegative. Of 2,112 previously symptomatic persons, the proportion seronegative slightly increased from 14 to 90 days post symptom onset (p=0.06). The proportion seronegative ranged from 0% among 79 persons previously hospitalized to 11.0% among 308 persons with asymptomatic infections. In a multivariable model, persons taking immunosuppressive medications were more likely to be seronegative (31.9%, 95% confidence interval [CI] 10.7%-64.7%), while participants of non-Hispanic Black race/ethnicity (versus non-Hispanic White) (2.7%, 95% CI 1.5%-4.8%), with severe obesity (versus under/normal weight) (3.9%, 95% CI 1.7%-8.6%), or with more symptoms were less likely to be seronegative. CONCLUSIONS: In our population with previous RT-PCR confirmed infection, approximately one in 16 persons lacked IgG antibodies. Absence of antibodies varied independently by illness severity, race/ethnicity, obesity, and immunosuppressive drug therapy. 
The proportion seronegative remained relatively stable among persons tested up to 90 days post symptom onset. |
Preparing for the 2020-2021 Influenza Season.
Uyeki TM , Santoli J , Jernigan DB . JAMA 2020 324 (22) 2318-2319 As health care systems across the US are experiencing or preparing for surges in individuals with coronavirus disease 2019 (COVID-19) this fall and winter, the potential for cocirculation of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) and influenza viruses poses added challenges for clinicians and public health. Recent reports suggest that influenza activity can be influenced substantially by nonpharmaceutical measures implemented to control the spread of SARS-CoV-2 (eg, use of face masks, social distancing, restrictions on public gatherings, travel restrictions) and other factors influenced by the COVID-19 pandemic (eg, reduced domestic and international travel). In early spring of 2020, sharp declines in influenza activity coincided with implementation of SARS-CoV-2 control measures in the US.1 |
Characterization of COVID-19 in Assisted Living Facilities - 39 States, October 2020.
Yi SH , See I , Kent AG , Vlachos N , Whitworth JC , Xu K , Gouin KA , Zhang S , Slifka KJ , Sauer AG , Kutty PK , Perz JF , Stone ND , Stuckey MJ . MMWR Morb Mortal Wkly Rep 2020 69 (46) 1730-1735 The coronavirus disease 2019 (COVID-19) pandemic has highlighted the vulnerability of residents and staff members in long-term care facilities (LTCFs) (1). Although skilled nursing facilities (SNFs) certified by the Centers for Medicare & Medicaid Services (CMS) have federal COVID-19 reporting requirements, national surveillance data are less readily available for other types of LTCFs, such as assisted living facilities (ALFs) and those providing similar residential care. However, many state and territorial health departments publicly report COVID-19 surveillance data across various types of LTCFs. These data were systematically retrieved from health department websites to characterize COVID-19 cases and deaths in ALF residents and staff members. Limited ALF COVID-19 data were available for 39 states, although reporting varied. By October 15, 2020, among 28,623 ALFs, 6,440 (22%) had at least one COVID-19 case among residents or staff members. Among the states with available data, the proportion of COVID-19 cases that were fatal was 21.2% for ALF residents, 0.3% for ALF staff members, and 2.5% overall for the general population of these states. 
To prevent the introduction and spread of SARS-CoV-2, the virus that causes COVID-19, in their facilities, ALFs should 1) identify a point of contact at the local health department; 2) educate residents, families, and staff members about COVID-19; 3) have a plan for visitor and staff member restrictions; 4) encourage social (physical) distancing and the use of masks, as appropriate; 5) implement recommended infection prevention and control practices and provide access to supplies; 6) rapidly identify and properly respond to suspected or confirmed COVID-19 cases in residents and staff members; and 7) conduct surveillance of COVID-19 cases and deaths, facility staffing, and supply information (2). |
A Proposed Framework and Timeline of the Spectrum of Disease Due to SARS-CoV-2 Infection: Illness Beyond Acute Infection and Public Health Implications.
Datta SD , Talwar A , Lee JT . JAMA 2020 324 (22) 2251-2252 Although much of the response to the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) pandemic has focused on acute coronavirus disease 2019 (COVID-19) illness, accumulating evidence demonstrates morbidity beyond acute SARS-CoV-2 infection.1-4 At least 2 other periods of illness appear to be temporally associated with SARS-CoV-2 infection: a rare postacute hyperinflammatory illness and late inflammatory and virological sequelae. These 3 illness periods not only define the temporal course of SARS-CoV-2 infection at the population level but also capture distinct phases of host-viral interaction. |
Household Transmission of SARS-CoV-2 in the United States.
Lewis NM , Chu VT , Ye D , Conners EE , Gharpure R , Laws RL , Reses HE , Freeman BD , Fajans M , Rabold EM , Dawson P , Buono S , Yin S , Owusu D , Wadhwa A , Pomeroy M , Yousaf A , Pevzner E , Njuguna H , Battey KA , Tran CH , Fields VL , Salvatore P , O'Hegarty M , Vuong J , Chancey R , Gregory C , Banks M , Rispens JR , Dietrich E , Marcenac P , Matanock AM , Duca L , Binder A , Fox G , Lester S , Mills L , Gerber SI , Watson J , Schumacher A , Pawloski L , Thornburg NJ , Hall AJ , Kiphibane T , Willardson S , Christensen K , Page L , Bhattacharyya S , Dasu T , Christiansen A , Pray IW , Westergaard RP , Dunn AC , Tate JE , Nabity SA , Kirking HL . Clin Infect Dis 2020 73 (7) 1805-1813 BACKGROUND: Although many viral respiratory illnesses are transmitted within households, the evidence base for SARS-CoV-2 is nascent. We sought to characterize SARS-CoV-2 transmission within US households and estimate the household secondary infection rate (SIR) to inform strategies to reduce transmission. METHODS: We recruited laboratory-confirmed COVID-19 patients and their household contacts in Utah and Wisconsin during March 22-April 25, 2020. We interviewed patients and all household contacts to obtain demographics and medical histories. At the initial household visit, 14 days later, and when a household contact became newly symptomatic, we collected respiratory swabs from patients and household contacts for testing by SARS-CoV-2 rRT-PCR and sera for SARS-CoV-2 antibodies testing by enzyme-linked immunosorbent assay (ELISA). We estimated SIR and odds ratios (OR) to assess risk factors for secondary infection, defined by a positive rRT-PCR or ELISA test. RESULTS: Thirty-two (55%) of 58 households had evidence of secondary infection among household contacts. The SIR was 29% (n = 55/188; 95% confidence interval [CI]: 23-36%) overall, 42% among children (<18 years) of the COVID-19 patient and 33% among spouses/partners. 
Household contacts of COVID-19 patients with immunocompromising conditions had increased odds of infection (OR: 15.9, 95% CI: 2.4-106.9). Household contacts who themselves had diabetes mellitus had increased odds of infection (OR: 7.1, 95% CI: 1.2-42.5). CONCLUSIONS: We found substantial evidence of secondary infections among household contacts. People with COVID-19, particularly those with immunocompromising conditions or those with household contacts with diabetes, should take care to promptly self-isolate to prevent household transmission. |
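The household secondary infection rate (SIR) above is a simple binomial proportion, and the reported 95% CI (23-36%) is consistent with a normal-approximation (Wald) interval; the authors' exact method is not stated in the abstract, so the following is a sketch assuming that approach, with an illustrative function name:

```python
import math

def secondary_infection_rate(infected, contacts, z=1.96):
    """Binomial point estimate with an approximate (Wald) 95% CI."""
    p = infected / contacts
    half_width = z * math.sqrt(p * (1 - p) / contacts)
    return p, max(0.0, p - half_width), min(1.0, p + half_width)

# 55 secondary infections among 188 household contacts (from the abstract)
p, lo, hi = secondary_infection_rate(55, 188)
print(f"SIR = {p:.0%} (95% CI: {lo:.0%}-{hi:.0%})")  # SIR = 29% (95% CI: 23%-36%)
```

For small cell counts, an exact (Clopper-Pearson) or Wilson interval would usually be preferred over the Wald approximation.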
Coronavirus Disease 2019 (COVID-19) in a Patient with Disseminated Histoplasmosis and HIV-A Case Report from Argentina and Literature Review.
Messina FA , Marin E , Caceres DH , Romero M , Depardo R , Priarone MM , Rey L , Vázquez M , Verweij PE , Chiller TM , Santiso G . J Fungi (Basel) 2020 6 (4) The disease caused by the new SARS-CoV-2, known as Coronavirus disease 2019 (COVID-19), was first identified in China in December 2019 and rapidly spread around the world. Coinfections with fungal pathogens in patients with COVID-19 add challenges to patient care. We conducted a literature review on fungal coinfections in patients with COVID-19. We describe the case of a patient with disseminated histoplasmosis who was likely infected with SARS-CoV-2 and experienced COVID-19 during hospital care in Buenos Aires, Argentina. This patient presented with advanced HIV disease, a well-known risk factor for disseminated histoplasmosis; we suspected that COVID-19 was acquired during hospitalization, but there is not enough evidence to support this hypothesis. Clinical correlation and the use of specific Histoplasma and COVID-19 rapid diagnostic assays were key to the timely diagnosis of both infections, permitting appropriate treatment and patient care. |
Mental Health-Related Emergency Department Visits Among Children Aged <18 Years During the COVID-19 Pandemic - United States, January 1-October 17, 2020.
Leeb RT , Bitsko RH , Radhakrishnan L , Martinez P , Njai R , Holland KM . MMWR Morb Mortal Wkly Rep 2020 69 (45) 1675-1680 Published reports suggest that the coronavirus disease 2019 (COVID-19) pandemic has had a negative effect on children's mental health (1,2). Emergency departments (EDs) are often the first point of care for children experiencing mental health emergencies, particularly when other services are inaccessible or unavailable (3). During March 29-April 25, 2020, when widespread shelter-in-place orders were in effect, ED visits for persons of all ages declined 42% compared with the same period in 2019; during this time, ED visits for injury and non-COVID-19-related diagnoses decreased, while ED visits for psychosocial factors increased (4). To assess changes in mental health-related ED visits among U.S. children aged <18 years, data from CDC's National Syndromic Surveillance Program (NSSP) from January 1 through October 17, 2020, were compared with those collected during the same period in 2019. During weeks 1-11 (January 1-March 15, 2020), the average reported number of children's mental health-related ED visits overall was higher in 2020 than in 2019, whereas the proportion of children's mental health-related visits was similar. Beginning in week 12 (March 16) the number of mental health-related ED visits among children decreased 43% concurrent with the widespread implementation of COVID-19 mitigation measures; simultaneously, the proportion of mental health-related ED visits increased sharply beginning in mid-March 2020 (week 12) and continued into October (week 42) with increases of 24% among children aged 5-11 years and 31% among adolescents aged 12-17 years, compared with the same period in 2019. 
The increased proportion of children's mental health-related ED visits during March-October 2020 might be artefactually inflated as a consequence of the substantial decrease in overall ED visits during the same period and variation in the number of EDs reporting to NSSP. However, these findings provide initial insight into children's mental health in the context of the COVID-19 pandemic and highlight the importance of continued monitoring of children's mental health throughout the pandemic, ensuring access to care during public health crises, and improving healthy coping strategies and resiliency among children and families. |
Multiple COVID-19 Outbreaks Linked to a Wedding Reception in Rural Maine - August 7-September 14, 2020.
Mahale P , Rothfuss C , Bly S , Kelley M , Bennett S , Huston SL , Robinson S . MMWR Morb Mortal Wkly Rep 2020 69 (45) 1686-1690 Large indoor gatherings pose a high risk for transmission of SARS-CoV-2, the virus that causes coronavirus disease 2019 (COVID-19), and have the potential to be super-spreading events (1,2). Such events are associated with explosive growth, followed by sustained transmission (3). During August 7-September 14, 2020, the Maine Center for Disease Control and Prevention (MeCDC) investigated a COVID-19 outbreak linked to a wedding reception attended by 55 persons in a rural Maine town. In addition to the community outbreak, secondary and tertiary transmission led to outbreaks at a long-term care facility 100 miles away and at a correctional facility approximately 200 miles away. Overall, 177 COVID-19 cases were epidemiologically linked to the event, including seven hospitalizations and seven deaths (four in hospitalized persons). Investigation revealed noncompliance with CDC's recommended mitigation measures. To reduce transmission, persons should avoid large gatherings, practice physical distancing, wear masks, stay home when ill, and self-quarantine after exposure to a person with confirmed SARS-CoV-2 infection. Persons can work with local health officials to increase COVID-19 awareness and determine the best policies for organizing social events to prevent outbreaks in their communities. |
COVID-19 Outbreak in an Amish Community - Ohio, May 2020.
Ali H , Kondapally K , Pordell P , Taylor B , Martinez GM , Salehi E , Ramseyer S , Varnes S , Hayes N , de Fijter S , Lloyd S . MMWR Morb Mortal Wkly Rep 2020 69 (45) 1671-1674 In the United States, outbreaks of SARS-CoV-2, the virus that causes coronavirus disease 2019 (COVID-19), were initially reported in densely populated urban areas (1); however, outbreaks have since been reported in rural communities (2,3). Rural residents might be at higher risk for severe COVID-19-associated illness because, on average, they are older, have higher prevalences of underlying medical conditions, and have more limited access to health care services.* In May, after a cluster of seven COVID-19 cases was identified in a rural Ohio Amish community, access to testing was increased. Among 30 additional residents tested by real-time reverse transcription-polymerase chain reaction (RT-PCR; TaqPath COVID-19 Combo Kit),(†) 23 (77%) received positive test results for SARS-CoV-2. Rapid and sustained transmission of SARS-CoV-2 was associated with multiple social gatherings. Informant interviews revealed that community members were concerned about having to follow critical mitigation strategies, including social distancing(§) and mask wearing.(¶) To help reduce the ongoing transmission risk in a community, state and county health department staff members and community leaders need to work together to develop, deliver, and promote culturally responsive health education messages to prevent SARS-CoV-2 transmission and ensure that access to testing services is timely and convenient. Understanding the dynamics of close-knit communities is crucial to reducing SARS-CoV-2 transmission. |
Declines in SARS-CoV-2 Transmission, Hospitalizations, and Mortality After Implementation of Mitigation Measures - Delaware, March-June 2020.
Kanu FA , Smith EE , Offutt-Powell T , Hong R , Dinh TH , Pevzner E . MMWR Morb Mortal Wkly Rep 2020 69 (45) 1691-1694 Mitigation measures, including stay-at-home orders and public mask wearing, together with routine public health interventions such as case investigation with contact tracing and immediate self-quarantine after exposure, are recommended to prevent and control the transmission of SARS-CoV-2, the virus that causes coronavirus disease 2019 (COVID-19) (1-3). On March 11, the first COVID-19 case in Delaware was reported to the Delaware Division of Public Health (DPH). The state responded to ongoing community transmission with investigation of all identified cases (commencing March 11), issuance of statewide stay-at-home orders (March 24-June 1), a statewide public mask mandate (from April 28), and contact tracing (starting May 12). The relationship among implementation of mitigation strategies, case investigations, and contact tracing and COVID-19 incidence and associated hospitalization and mortality was examined during March-June 2020. Incidence declined by 82%, hospitalization by 88%, and mortality by 100% from late April to June 2020, as the mask mandate and contact tracing were added to case investigations and the stay-at-home order. Among 9,762 laboratory-confirmed COVID-19 cases reported during March 11-June 25, 2020, two thirds (6,527; 67%) of patients were interviewed, and 5,823 (60%) reported completing isolation. Among 2,834 contacts reported, 882 (31%) were interviewed and among these contacts, 721 (82%) reported completing quarantine. Implementation of mitigation measures, including mandated mask use coupled with public health interventions, was followed by reductions in COVID-19 incidence and associated hospitalizations and mortality. The combination of state-mandated community mitigation efforts and routine public health interventions can reduce the occurrence of new COVID-19 cases, hospitalizations, and deaths. |
Kaposi Sarcoma Rates Among Persons Living With Human Immunodeficiency Virus in the United States: 2008-2016.
Luo Q , Satcher Johnson A , Hall HI , Cahoon EK , Shiels M . Clin Infect Dis 2020 73 (7) e2226-e2233 BACKGROUND: Recent studies have suggested that Kaposi sarcoma (KS) rates might be increasing in some racial/ethnic groups, age groups, and US regions. We estimated recent US trends in KS incidence among people living with human immunodeficiency virus (HIV; PLWH). METHODS: Incident KS patients aged 20-59 years were obtained from 36 cancer registries and assumed to be living with HIV. The number of PLWH was obtained from national HIV surveillance data from 2008 to 2016. Age-standardized KS rates and annual percent changes (APCs) in rates were estimated by age, sex, race/ethnicity, state, and region. RESULTS: Between 2008 and 2016, the age-adjusted KS rate among PLWH was 116/100 000. Rates were higher among males, in younger age groups, and among white PLWH. Washington, Maine, and California had the highest KS rates among PLWH. KS rates among PLWH decreased significantly (average APC = -3.2% per year, P < .001) from 136/100 000 to 97/100 000 between 2008 and 2016. There were no statistically significant increases in KS rates in any age, sex, or racial/ethnic group or in any geographic region or state. However, there were nondecreasing trends in some states and in younger age groups, primarily among black PLWH. CONCLUSIONS: KS incidence rates among PLWH have decreased nationally between 2008 and 2016. Though there were no statistically significant increases in KS rates in any demographic or geographic group, nondecreasing/stagnant KS trends in some states and among younger and black PLWH highlight the need for early diagnosis and treatment of HIV infection. |
Persistence of Positive RT-PCR Results for Over 70 Days in Two Travelers with COVID-19.
Kandetu TB , Dziuban EJ , Sikuvi K , Beard RS , Nghihepa R , van Rooyen G , Shiningavamwe A , Katjitae I . Disaster Med Public Health Prep 2020 16 (3) 1-7 The relationship between continued positive test results for SARS-CoV-2 by reverse transcription real-time polymerase chain reaction (RT-PCR) and infectivity remains unclear, with numerous consequences. This report describes two otherwise healthy patients with persistent viral detection by RT-PCR for 77 and 72 days, longer than in other reported cases. |
National-level effectiveness of ART to prevent early mother-to-child transmission of HIV in Namibia.
Agabu A , Baughman AL , Fischer-Walker C , de Klerk M , Mutenda N , Rusberg F , Diergaardt D , Pentikainen N , Sawadogo S , Agolory S , Dinh TH . PLoS One 2020 15 (11) e0233341 BACKGROUND: Namibia introduced the prevention of mother-to-child HIV transmission (MTCT) program in 2002 and lifelong antiretroviral therapy (ART) for pregnant women (option B-plus) in 2013. We sought to quantify MTCT measured at 4-12 weeks post-delivery. METHODS: During Aug 2014-Feb 2015, we recruited a nationally representative sample of 1040 pairs of mother and infant aged 4-12 weeks at routine immunizations in 60 public health clinics using a two-stage sampling approach. Of these, 864 HIV-exposed infants had DNA-PCR HIV test results available. We defined an infant as HIV-exposed if born to an HIV-positive mother with documented status or a mother diagnosed at enrollment using rapid HIV tests. Dried blood spot samples from HIV-exposed infants were tested for HIV. Interview data and laboratory results were collected on smartphones and uploaded to a central database. We measured MTCT prevalence at 4-12 weeks post-delivery and evaluated associations between infant HIV infection and maternal and infant characteristics including maternal treatment and infant prophylaxis. All statistical analyses accounted for the survey design. RESULTS: Based on the 864 HIV-exposed infants with test results available, nationally weighted early MTCT measured at 4-12 weeks post-delivery was 1.74% (95% confidence interval (CI): 1.00%-3.01%). Overall, 62% of mothers started ART pre-conception, 33.6% during pregnancy, 1.2% post-delivery and 3.2% never received ART. Mothers who started ART before pregnancy and during pregnancy had low MTCT prevalence, 0.78% (95% CI: 0.31%-1.96%) and 0.98% (95% CI: 0.33%-2.91%), respectively. MTCT rose to 4.13% (95% CI: 0.54%-25.68%) when the mother started ART after delivery and to 11.62% (95% CI: 4.07%-28.96%) when she never received ART. 
The lowest MTCT, 0.76% (95% CI: 0.36%-1.61%), was achieved when the mother received ART and the infant received ARV prophylaxis within 72 hours, and the highest, 22.32% (95% CI: 2.78%-74.25%), when neither mother nor infant received ARVs. After adjusting for mother's age, maternal ART (Prevalence Ratio (PR) = 0.10, 95% CI: 0.03-0.29) and infant ARV prophylaxis (PR = 0.32, 95% CI: 0.10-0.998) remained strong predictors of HIV transmission. CONCLUSION: As of 2015, Namibia achieved MTCT of 1.74%, measured at 4-12 weeks post-delivery. Women already on ART pre-conception had the lowest prevalence of MTCT, emphasizing the importance of early HIV diagnosis and treatment initiation before pregnancy. Studies are needed to measure MTCT and maternal HIV seroconversion during breastfeeding. |
Risk scores for predicting early antiretroviral therapy mortality in sub-Saharan Africa to inform who needs intensification of care: a derivation and external validation cohort study
Auld AF , Fielding K , Agizew T , Maida A , Mathoma A , Boyd R , Date A , Pals SL , Bicego G , Liu Y , Shiraishi RW , Ehrenkranz P , Serumola C , Mathebula U , Alexander H , Charalambous S , Emerson C , Rankgoane-Pono G , Pono P , Finlay A , Shepherd JC , Holmes C , Ellerbrock TV , Grant AD . BMC Med 2020 18 (1) 311 BACKGROUND: Clinical scores to determine early (6-month) antiretroviral therapy (ART) mortality risk have not been developed for sub-Saharan Africa (SSA), home to 70% of people living with HIV. In the absence of validated scores, WHO eligibility criteria (EC) for ART care intensification are CD4 < 200/μL or WHO stage III/IV. METHODS: We used Botswana XPRES trial data for adult ART enrollees to develop CD4-independent and CD4-dependent multivariable prognostic models for 6-month mortality. Scores were derived by rescaling coefficients. Scores were developed using the first 50% of XPRES ART enrollees, and their accuracy validated internally and externally using South African TB Fast Track (TBFT) trial data. Predictive accuracy was compared between scores and WHO EC. RESULTS: Among 5553 XPRES enrollees, 2838 were included in the derivation dataset; 68% were female and 83 (3%) died by 6 months. Among 1077 TBFT ART enrollees, 55% were female and 6% died by 6 months. Factors predictive of 6-month mortality in the derivation dataset at p < 0.01 and selected for the CD4-independent score included male gender (2 points), ≥ 1 WHO tuberculosis symptom (2 points), WHO stage III/IV (2 points), severe anemia (hemoglobin < 8 g/dL) (3 points), and temperature > 37.5 °C (2 points). The same variables plus CD4 < 200/μL (1 point) were included in the CD4-dependent score. Among XPRES enrollees, a CD4-independent score of ≥ 4 would provide 86% sensitivity and 66% specificity, whereas WHO EC would provide 83% sensitivity and 58% specificity. If WHO stage alone was used, sensitivity was 48% and specificity 89%. 
Among TBFT enrollees, the CD4-independent score of ≥ 4 would provide 95% sensitivity and 27% specificity, whereas WHO EC would provide 100% sensitivity but 0% specificity. Accuracy was similar between CD4-independent and CD4-dependent scores. Categorizing CD4-independent scores into low (< 4), moderate (4-6), and high risk (≥ 7) gave 6-month mortality of 1%, 4%, and 17% for XPRES and 1%, 5%, and 30% for TBFT enrollees. CONCLUSIONS: Sensitivity of the CD4-independent score was nearly twice that of WHO stage in predicting 6-month mortality and could be used in settings lacking CD4 testing to inform ART care intensification. The CD4-dependent score improved specificity versus WHO EC. Both scores should be considered for scale-up in SSA. |
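The point-based scores described in the abstract above lend themselves to a direct implementation. The sketch below encodes the published point values, the ≥4 threshold, and the low/moderate/high categorization; the function name and input fields are illustrative, not from the study, and the categorization is described in the abstract only for the CD4-independent score:

```python
def art_mortality_score(male, tb_symptom, who_stage_3_4,
                        hemoglobin_g_dl, temperature_c, cd4=None):
    """6-month ART mortality risk score (CD4-independent unless cd4 is given)."""
    score = 0
    if male:
        score += 2
    if tb_symptom:             # >=1 WHO tuberculosis symptom
        score += 2
    if who_stage_3_4:          # WHO clinical stage III/IV
        score += 2
    if hemoglobin_g_dl < 8:    # severe anemia
        score += 3
    if temperature_c > 37.5:   # fever
        score += 2
    if cd4 is not None and cd4 < 200:  # CD4-dependent variant adds 1 point
        score += 1
    if score < 4:
        category = "low"
    elif score <= 6:
        category = "moderate"
    else:
        category = "high"
    return score, category
```

Using the derivation cohort's threshold, a CD4-independent score of ≥4 would flag a patient for care intensification; for example, a febrile man with at least one WHO TB symptom scores 2 + 2 + 2 = 6 ("moderate").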
Sporotrichosis cases in commercial insurance data, United States, 2012-2018
Benedict K , Jackson BR . Emerg Infect Dis 2020 26 (11) 2783-2785 The geographic distribution of sporotrichosis in the United States is largely unknown. In a large commercial health insurance database, sporotrichosis was rare but most frequently occurred in southern and south-central states. Knowledge about where sporotrichosis is most likely to occur is essential for increasing clinician awareness of this rare fungal disease. |
Vital Signs: Deaths among persons with diagnosed HIV infection, United States, 2010-2018
Bosh KA , Johnson AS , Hernandez AL , Prejean J , Taylor J , Wingard R , Valleroy LA , Hall HI . MMWR Morb Mortal Wkly Rep 2020 69 (46) 1717-1724 BACKGROUND: Life expectancy for persons with human immunodeficiency virus (HIV) infection who receive recommended treatment can approach that of the general population, yet HIV remains among the 10 leading causes of death among certain populations. Using surveillance data, CDC assessed progress toward reducing deaths among persons with diagnosed HIV (PWDH). METHODS: CDC analyzed National HIV Surveillance System data for persons aged ≥13 years to determine age-adjusted death rates per 1,000 PWDH during 2010-2018. Using the International Classification of Diseases, Tenth Revision, deaths with a nonmissing underlying cause were classified as HIV-related or non-HIV-related. Temporal changes in total deaths during 2010-2018 and deaths by cause during 2010-2017 (2018 excluded because of delays in reporting), by demographic characteristics, transmission category, and U.S. Census region of residence at time of death were calculated. RESULTS: During 2010-2018, rates of death decreased by 36.6% overall (from 19.4 to 12.3 per 1,000 PWDH). During 2010-2017, HIV-related death rates decreased 48.4% (from 9.1 to 4.7), whereas non-HIV-related death rates decreased 8.6% (from 9.3 to 8.5). Rates of HIV-related deaths during 2017 were highest by race/ethnicity among persons of multiple races (7.0) and Black/African American persons (5.6), followed by White persons (3.9) and Hispanic/Latino persons (3.9). The HIV-related death rate was highest in the South (6.0) and lowest in the Northeast (3.2). CONCLUSION: Early diagnosis, prompt treatment, and maintaining access to high-quality care and treatment have been successful in reducing HIV-related deaths and remain necessary for continuing reductions in HIV-related deaths. |
Implementing TB preventive treatment within differentiated HIV service delivery models in global programs
Boyd AT , Moore B , Shah M , Tran C , Kirking H , Cavanaugh JS , Al-Samarrai T , Pathmanathan I . Public Health Action 2020 10 (3) 104-110 Global HIV program stakeholders, including the US President's Emergency Plan for AIDS Relief (PEPFAR), are undertaking efforts to ensure that eligible people living with HIV (PLHIV) receiving antiretroviral treatment (ART) receive a course of TB preventive treatment (TPT). In PEPFAR programming, this effort may require providing TPT not only to newly diagnosed PLHIV as part of HIV care initiation, but also to treatment-experienced PLHIV stable on ART who may not have been previously offered TPT. TPT scale-up is occurring at the same time as a trend to provide more person-centered HIV care through differentiated service delivery (DSD). In DSD, PLHIV stable on ART may receive less frequent clinical follow-up or receive care outside the traditional clinic-based model. The misalignment between traditional delivery of TPT and care delivery in innovative DSD may require adaptations to TPT delivery practices for PLHIV. Adaptations include components of planning and operationalization of TPT in DSD, such as determination of TPT eligibility and TPT initiation, and clinical management of PLHIV while on TPT. A key adaptation is alignment of timing and location for TPT and ART prescribing, monitoring, and dispensing. Conceptual examples of TPT delivery in DSD may help program managers operationalize TPT in HIV care. |
Characteristics of and meningococcal disease prevention strategies for commercially insured persons receiving eculizumab in the United States
Bozio CH , Isenhour C , McNamara LA . PLoS One 2020 15 (11) e0241989 INTRODUCTION: Eculizumab is a licensed treatment for several rare, complement-mediated diseases. Eculizumab use is associated with an approximately 2,000-fold increased meningococcal disease risk. In the United States, meningococcal vaccines are recommended for eculizumab recipients but there are no recommendations on use of long-term antibiotic prophylaxis. We describe characteristics of and meningococcal vaccine and antibiotic receipt in U.S. eculizumab recipients to inform meningococcal disease prevention strategies. METHODS: Persons in the IBM® MarketScan® Research Databases with ≥1 claim for eculizumab injection during 2007-2017 were included. Indication for eculizumab use, meningococcal vaccine receipt, and antibiotic receipt were assessed using International Classification of Diseases-9/10 diagnosis codes, vaccine administration procedure codes, and antibiotic codes from pharmacy claims, respectively. RESULTS: Overall 696 persons met the inclusion criteria. Paroxysmal nocturnal hemoglobinuria (PNH) and atypical hemolytic uremic syndrome (aHUS) were the most common indications for eculizumab use (41% and 37%, respectively); 20% had an undetermined indication. From June 2015 through December 2017, 28% (41/148) of continuously-enrolled patients received ≥1 serogroup B vaccine dose. For serogroup ACWY conjugate vaccine, 45% (91/201) of patients received ≥1 dose within five years of their most recent eculizumab dose, as recommended. Of eculizumab recipients with outpatient prescription data, 7% (41/579) received antibiotics for ≥50% of the period of increased risk for meningococcal disease. CONCLUSION: Many eculizumab recipients had an undetermined indication for eculizumab use; few were up-to-date for recommended meningococcal vaccines or were prescribed antibiotics long-term. These findings can inform further investigation of how to best protect this population from meningococcal disease. |
Aminoglycosides and capreomycin in the treatment of multidrug-resistant tuberculosis: Individual patient data meta-analysis of 12 030 patients from 25 countries, 2009-2016
Cegielski JP , Chan PC , Lan Z , Udwadia ZF , Viiklepp P , Yim JJ , Menzies D . Clin Infect Dis 2020 73 (11) e3929-e3936 BACKGROUND: As new drugs are developed for multidrug-resistant tuberculosis (MDR-TB), the role of currently used drugs must be reevaluated. METHODS: We combined individual-level data on patients with pulmonary MDR-TB published during 2009-2016 from 25 countries. We compared patients receiving each of the injectable drugs and those receiving no injectable drugs. Analyses were based on patients whose isolates were susceptible to the drug they received. Using random-effects logistic regression with propensity score matching, we estimated the effect of each agent in terms of standardized treatment outcomes. RESULTS: More patients received kanamycin (n = 4330) and capreomycin (n = 2401) than amikacin (n = 2275) or streptomycin (n = 1554), opposite to their apparent effectiveness. Compared with kanamycin, amikacin was associated with 6 more cures per 100 patients (95% confidence interval [CI], 4-8), while streptomycin was associated with 7 (95% CI, 5-8) more cures and 5 (95% CI, 4-7) fewer deaths per 100 patients. Compared with capreomycin, amikacin was associated with 9 (95% CI, 6-11) more cures and 5 (95% CI, 2-8) fewer deaths per 100 patients, while streptomycin was associated with 10 (95% CI, 8-13) more cures and 10 (95% CI, 7-12) fewer deaths per 100 patients treated. In contrast to amikacin and streptomycin, patients treated with kanamycin or capreomycin did not fare better than patients treated with no injectable drugs. CONCLUSIONS: When aminoglycosides are used to treat MDR-TB and drug susceptibility test results support their use, streptomycin and amikacin, not kanamycin or capreomycin, are the drugs of choice. |
Pre-exposure prophylaxis use and detected sexually transmitted infections among men who have sex with men in the United States - National HIV Behavioral Surveillance, 5 US Cities, 2017
Chapin-Bardales J , Johnson Jones ML , Kirkcaldy RD , Bernstein KT , Paz-Bailey G , Phillips C , Papp JR , Raymond HF , Opoku J , Braunstein SL , Spencer EC , Khuwaja S , Wejnert C . J Acquir Immune Defic Syndr 2020 85 (4) 430-435 BACKGROUND: Men who have sex with men (MSM) using HIV pre-exposure prophylaxis (PrEP) may be at high risk for bacterial sexually transmitted infections (STIs). We examined the prevalence of extragenital gonorrhea and chlamydia by PrEP status among a multisite sample of US MSM. METHODS: MSM aged ≥18 years were recruited through venue-based sampling to participate in the 2017 National HIV Behavioral Surveillance. In 5 cities (San Francisco, Washington DC, New York City, Miami, and Houston), participants completed a questionnaire, HIV testing, and pharyngeal and rectal STI specimen self-collection. We measured prevalence of pharyngeal and rectal gonorrhea and chlamydia among self-reported non-HIV-positive MSM who reported using or not using PrEP in the previous 12 months. RESULTS: Overall, 29.6% (481/1627) of non-HIV-positive MSM reported PrEP use in the past year. MSM who reported PrEP use were more likely to have any STI (ie, extragenital gonorrhea and/or chlamydia) than MSM not on PrEP [14.6% vs. 12.0%, adjusted prevalence ratio (aPR) = 1.5, 95% confidence interval (CI): 1.1 to 2.0], reflecting differences in rectal chlamydia prevalence (8.7% vs. 6.0%, aPR = 1.6, 95% CI: 1.1 to 2.4). PrEP use was not associated with pharyngeal chlamydia, pharyngeal gonorrhea, or rectal gonorrhea. CONCLUSIONS: The prevalence of extragenital STI was high for both MSM on PrEP and those not on PrEP in the past year. MSM on PrEP were more likely to have rectal chlamydia but not pharyngeal STIs or rectal gonorrhea. Our findings support regular STI testing at exposed anatomic sites as recommended for sexually active MSM, including those on PrEP. |
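The prevalence ratios in the abstract above are adjusted estimates from regression models. As a hedged illustration only (not the study's model, which adjusts for covariates; the function name is ours), a crude prevalence ratio is simply the ratio of the two group prevalences:

```python
def crude_prevalence_ratio(prev_exposed_pct, prev_unexposed_pct):
    """Crude prevalence ratio: prevalence among exposed / prevalence among unexposed."""
    return prev_exposed_pct / prev_unexposed_pct

# Rectal chlamydia: 8.7% among MSM on PrEP vs 6.0% among MSM not on PrEP
print(round(crude_prevalence_ratio(8.7, 6.0), 2))   # 1.45

# Any extragenital STI: 14.6% vs 12.0%
print(round(crude_prevalence_ratio(14.6, 12.0), 2)) # 1.22
```

The crude ratios (1.45 and 1.22) differ from the adjusted values reported in the abstract (1.6 and 1.5) precisely because the published estimates account for confounders.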
Disability among adults with diagnosed HIV in the United States, 2017
Chowdhury PP , Beer L , Shu F , Fagan J , Luke Shouse R . AIDS Care 2020 33 (12) 1-5 In the United States, one in four adults is living with a disability. Age-related changes, disease-related pathology and treatments can place a person with HIV at risk for a disability. We analyzed nationally representative data to describe disability status among adults ≥18 years with diagnosed HIV in the United States and Puerto Rico by demographic characteristics, health behaviors, quality of care, clinical outcomes and mental health status. We reported weighted percentages and prevalence ratios with predicted marginal means to evaluate significant differences between groups (P < .05). Overall, 44.5% reported any disability; the most frequently reported disabilities were related to mobility (24.8%) and cognition (23.9%). Persons who lived in households at or below the poverty level or who experienced homelessness in the last 12 months reported a higher prevalence of any disability than persons who were not poor or not homeless (60.2% vs. 33.4% and 61.8% vs. 42.8%, respectively). Prevalence of depression and anxiety was higher among persons with any disability compared with those with no disability (32.8% and 26.6% versus 10.1% and 7.0%, respectively). Enhancing support from clinicians and ancillary providers may help optimize long-term health outcomes among HIV-positive persons with disabilities. |
Prevalence of latent tuberculosis infection among non-U.S.-born persons by country of birth - United States, 2012-2017
Collins JM , Stout JE , Ayers T , Hill AN , Katz DJ , Ho CS , Blumberg HM , Winglee K . Clin Infect Dis 2020 73 (9) e3468-e3475 BACKGROUND: Most tuberculosis (TB) disease in the U.S. is attributed to reactivation of remotely acquired latent TB infection (LTBI) in non-U.S.-born persons who were likely infected with Mycobacterium tuberculosis in their countries of birth. Information on LTBI prevalence by country of birth could help guide local providers and health departments to scale up the LTBI screening and preventive treatment needed to advance progress towards TB elimination. METHODS: 13 805 non-U.S.-born persons at high risk of TB infection or progression to TB disease were screened for LTBI at 16 clinical sites located across the United States with a tuberculin skin test, QuantiFERON®-TB Gold In-Tube test, and T-SPOT®.TB test. Bayesian latent class analysis was applied to test results to estimate LTBI prevalence and associated credible intervals (CRI) for each country or world region of birth. RESULTS: Among the study population, the estimated LTBI prevalence was 31% (95% CRI 26%-35%). Country-of-birth-level LTBI prevalence estimates were highest for persons born in Haiti, Peru, Somalia, Ethiopia, Vietnam, and Bhutan, ranging from 42%-55%. LTBI prevalence estimates were lowest for persons born in Colombia, Malaysia, and Thailand, ranging from 8%-13%. CONCLUSIONS: LTBI prevalence in persons born outside the United States varies widely by country. These estimates can help target community outreach efforts to the highest risk groups. |
Diverging trends in US male-female condom use by STI risk factors: a nationally representative study
Copen CE , Dittus PJ , Leichliter JS , Kumar S , Aral SO . Sex Transm Infect 2020 98 (1) 50-52 OBJECTIVE: Condom use behaviours are proximal to recent STI increases in the USA, yet it remains unclear whether the use of condoms has changed over time among unmarried, non-cohabiting young men who have sex with women (MSW) and how this variability is influenced by STI risk factors. METHODS: To examine condom use over time among MSW aged 15-29, we used three cross-sectional surveys from the 2002, 2006-2010 and 2011-2017 National Survey of Family Growth. We estimated weighted percentages, adjusted prevalence ratios (APRs) and 95% confidence intervals (CI) to assess changes in condom use, stratified by whether MSW reported any STI risk factors in the past 12 months (ie, perceived partner non-monogamy, male-to-male sex, sex in exchange for money or drugs, sex partner who injects illicit drugs, or an HIV-positive sex partner). RESULTS: We observed a divergence in trends in condom use at last sex between men aged 15-29 with STI risk factors in the past 12 months and those without such history. We saw significant declines in condom use from 2002 to 2011-2017 among men with STI risk factors (APR=0.80, 95% CI 0.68 to 0.95), specifically among those aged 15-19 (APR=0.73, 95% CI 0.57 to 0.94) or non-Hispanic white (APR=0.71, 95% CI 0.54 to 0.93). In contrast, trends in condom use among men with no STI risk factors remained stable or increased. Across all time periods, the most prevalent STI risk factor reported was perception of a non-monogamous female partner (23.0%-26.9%). Post-hoc analyses examined whether condom use trends changed once this variable was removed from analyses, but no different patterns were observed. CONCLUSIONS: While STIs have been increasing, men aged 15-29 with STI risk factors reported a decline in condom use. Rising STI rates may be sensitive to behavioural shifts in condom use among young MSW with STI risk factors. |
Assessing partner services provided by state and local health departments, 2018
Cuffe KM , Gift TL , Kelley K , Leichliter JS . Sex Transm Dis 2020 48 (6) 429-435 BACKGROUND: Surveillance reports have shown that reported sexually transmitted diseases (STD) are increasing. The provision of partner services is an effective tool for preventing and reducing the spread of STDs. We examined partner services provided by health departments and assessed for associations with jurisdiction size, STD morbidity, and region. METHODS: We used stratified random sampling to select 668 local health departments (LHDs) and selected all (N=50) state health departments (SHDs). Rao-Scott chi-square analyses were performed to examine partner services by health department (HD) type (SHD vs LHD), region, jurisdiction size (LHD only), and STD morbidity (LHD only). RESULTS: Approximately 49.0% of LHDs and 88.0% of SHDs responded to the survey. Most LHDs (81.6%) and SHDs (79.5%) provided partner services for some STDs (P=0.63). Compared to SHDs, a higher proportion of LHDs provided expedited partner therapy for chlamydia (66.8% vs 34.2%, P<.01) and gonorrhea (39.3% vs 22.9%, P=0.09). Partner service staff performed other activities such as conducting enhanced surveillance activities (23.0% of LHDs, 34.3% of SHDs, P=0.20) and participating in outbreak response and emergency preparedness (84.8% of LHDs, 80.0% of SHDs, P=0.51). Associations were found when partner services were stratified by HD type, jurisdiction size, STD morbidity, and region. All LHDs in high morbidity areas provided partner services and 45.4% performed serologic testing of syphilis contacts in the field. CONCLUSIONS: A majority of STD programs in LHDs and SHDs provide a variety of partner services and partner service-related activities. It is imperative to continue monitoring the provision of partner services to understand how critical public health needs are being met. |
Modeling the impact of voluntary medical male circumcision (VMMC) on cervical cancer in Uganda
Davis SM , Habel MA , Pretorius C , Yu T , Toledo C , Farley T , Kabuye G , Samuelson J . J Acquir Immune Defic Syndr 2020 86 (3) 323-328 BACKGROUND: In addition to providing millions of men with lifelong lower risk for HIV infection, voluntary medical male circumcision (VMMC) also provides female partners with health benefits including decreased risk for human papillomavirus (HPV) and resultant cervical cancer (CC). SETTING: We modeled potential impacts of VMMC on cervical cancer incidence and mortality in Uganda as an additional benefit beyond HIV prevention. METHODS: HPV and CC outcomes were modeled using the CC model from the Spectrum policy tool suite, calibrated for Uganda, to estimate HPV infection incidence and progression to CC, using a 50-year (2018-2067) time horizon. 2016 Demographic Health Survey data provided baseline VMMC coverage. The baseline (no VMMC scale-up beyond current coverage, minimal HPV vaccination coverage) was compared to multiple scenarios to assess the varying impact of VMMC according to different implementations of HPV vaccination and HPV screening programs. RESULTS: Without further intervention, annual CC incidence was projected to rise from 16.9 to 31.2 per 100,000 women in 2067. VMMC scale-up alone decreased 2067 annual CC incidence to 25.3, averting 13,000 deaths between 2018-2067. With rapidly-achieved 90% HPV9 vaccination coverage for adolescent girls and young women, 2067 incidence dropped below 10 per 100,000 with or without a VMMC program. With 45% vaccine coverage, the addition of VMMC scale-up decreased incidence by 2.9 per 100,000 and averted 8,000 additional deaths. Similarly, with HPV screen-and-treat without vaccination, the addition of VMMC scale-up decreased incidence by 5.1 per 100,000 and averted 10,000 additional deaths. CONCLUSIONS: Planned VMMC scale-up to 90% coverage from current levels could prevent a substantial number of CC cases and deaths in the absence of rapid scale-up of HPV vaccination to 90% coverage. |
Incidence of influenza during pregnancy and association with pregnancy and perinatal outcomes in three middle-income countries: a multisite prospective longitudinal cohort study
Dawood FS , Kittikraisak W , Patel A , Rentz Hunt D , Suntarattiwong P , Wesley MG , Thompson MG , Soto G , Mundhada S , Arriola CS , Azziz-Baumgartner E , Brummer T , Cabrera S , Chang HH , Deshmukh M , Ellison D , Florian R , Gonzales O , Kurhe K , Kaoiean S , Rawangban B , Lindstrom S , Llajaruna E , Mott JA , Saha S , Prakash A , Mohanty S , Sinthuwattanawibool C , Tinoco Y . Lancet Infect Dis 2020 21 (1) 97-106 BACKGROUND: Influenza vaccination during pregnancy prevents influenza among women and their infants but remains underused among pregnant women. We aimed to quantify the risk of antenatal influenza and examine its association with perinatal outcomes. METHODS: We did a prospective cohort study in pregnant women in India, Peru, and Thailand. Before the 2017 and 2018 influenza seasons, we enrolled pregnant women aged 18 years or older with expected delivery dates 8 weeks or more after the season started. We contacted women twice weekly until the end of pregnancy to identify illnesses with symptoms of myalgia, cough, runny nose or nasal congestion, sore throat, or difficulty breathing and collected mid-turbinate nasal swabs from symptomatic women for influenza real-time RT-PCR testing. We assessed the association of antenatal influenza with preterm birth, late pregnancy loss (≥13 weeks gestation), small for gestational age (SGA), and birthweight of term singleton infants using Cox proportional hazards models or generalised linear models to adjust for potential confounders. FINDINGS: Between March 13, 2017, and Aug 3, 2018, we enrolled 11 277 women with a median age of 26 years (IQR 23-31) and gestational age of 19 weeks (14-24). 1474 (13%) received influenza vaccines. 310 participants (3%) had influenza (270 [87%] influenza A and 40 [13%] influenza B). 
Influenza incidences weighted by the population of women of childbearing age in each study country were 88·7 per 10 000 pregnant woman-months (95% CI 68·6 to 114·8) during the 2017 season and 69·6 per 10 000 pregnant woman-months (53·8 to 90·2) during the 2018 season. Antenatal influenza was not associated with preterm birth (adjusted hazard ratio [aHR] 1·4, 95% CI 0·9 to 2·0; p=0·096) or having an SGA infant (adjusted relative risk 1·0, 95% CI 0·8 to 1·3, p=0·97), but was associated with late pregnancy loss (aHR 10·7, 95% CI 4·3 to 27·0; p<0·0001) and reduction in mean birthweight of term, singleton infants (-55·3 g, 95% CI -109·3 to -1·4; p=0·0445). INTERPRETATION: Women had a 0·7-0·9% risk of influenza per month of pregnancy during the influenza season, and antenatal influenza was associated with increased risk for some adverse pregnancy outcomes. These findings support the added value of antenatal influenza vaccination to improve perinatal outcomes. FUNDING: US Centers for Disease Control and Prevention. TRANSLATIONS: For the Thai, Hindi, Marathi and Spanish translations of the abstract see Supplementary Materials section. |
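The incidence figures in the influenza cohort abstract above are person-time rates. As a small hedged sketch (the function name is ours, not the study's), converting a rate per 10,000 pregnant woman-months into the per-month risk quoted in the interpretation:

```python
def monthly_risk_pct(rate_per_10k_woman_months):
    """Convert an incidence rate per 10,000 woman-months to percent risk per month."""
    return rate_per_10k_woman_months / 10_000 * 100

print(round(monthly_risk_pct(88.7), 1))  # 0.9  (2017 season)
print(round(monthly_risk_pct(69.6), 1))  # 0.7  (2018 season)
```

This recovers the "0.7-0.9% risk of influenza per month of pregnancy" stated in the interpretation from the two seasonal rates.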
Low human immunodeficiency virus (HIV) testing rates and no HIV preexposure prophylaxis prescribed among female patients diagnosed with a sexually transmitted infection, 2017-2018
Henny KD , Huang YA , Hoover KW . Obstet Gynecol 2020 136 (6) 1083-1085 INTRODUCTION | Our primary objective was to estimate human immunodeficiency virus (HIV) testing rates among female patients with a gonorrhea or syphilis diagnosis. Our secondary objective was to estimate the rate of preexposure prophylaxis prescriptions among these patients. | | METHODS | We analyzed data from both the IBM MarketScan commercial and Medicaid insurance databases. Nonpregnant female patients aged 15–64 years without a prior HIV diagnosis who had a diagnosis of gonorrhea or syphilis (Appendix 1, available online at http://links.lww.com/AOG/C94) in 2017 and who were continuously enrolled in their health insurance plans for at least 6 months before and 11 months after their first sexually transmitted infection (STI) diagnosis date were included in the analysis (Appendix 2, available online at http://links.lww.com/AOG/C94). To estimate associations between HIV testing (Appendix 3, available online at http://links.lww.com/AOG/C94) and female patients’ characteristics and STI diagnoses, we performed multivariate logistic regression analyses for those with commercial insurance and those with Medicaid insurance separately. In the commercial model, we included age, U.S. geographic region, and urban compared with rural location as covariates; in the Medicaid model, we included age and race–ethnicity. | | RESULTS | Among female patients with commercial insurance, 3,709 were diagnosed with gonorrhea and 1,696 with syphilis (Table 1). Among female patients with Medicaid insurance, 6,172 were diagnosed with gonorrhea and 1,497 with syphilis (Table 1). HIV testing rates among female patients with Medicaid insurance who were diagnosed with gonorrhea (716/6,172, 11.6%) or syphilis (146/1,497, 9.8%) were higher than among those with commercial insurance (282/3,709, 7.6% and 102/1,696, 6.0%) (P<.001) (Table 1). |
Benchmarks for HIV testing: What is needed to achieve universal testing coverage at U.S. ambulatory healthcare facilities
Hoover KW , Khalil GM , Cadwell BL , Rose CE , Peters PJ . J Acquir Immune Defic Syndr 2020 86 (2) e48-e53 BACKGROUND: Black and Hispanic men have the highest rates of HIV diagnoses. To decrease the number of U.S. men who are unaware of their HIV status, they should be tested at least once. Our objective was to estimate the increases needed in HIV testing rates at ambulatory healthcare visits to achieve universal coverage. METHODS: We analyzed nationally representative medical record abstraction data to estimate the number of visits per person to physician offices, emergency departments, and outpatient clinics among men aged 18-39 years during 2009-2016, and the percentage of visits with an HIV test. We calculated the increase in the percentage of visits with an HIV test needed to achieve universal testing coverage of men by age 39 years. RESULTS: Men had a mean of 75.3 million ambulatory visits per year and 1.67 visits per person. An HIV test was performed at 0.9% of the ambulatory visits made by white men, 2.5% by black men, and 2.4% by Hispanic men. A 3-fold increase in the percentage of visits with an HIV test would result in coverage of 46.2% of white, 100% of black, and 100% of Hispanic men; an 11-fold increase would be needed to result in coverage of 100% of white men. CONCLUSIONS: HIV testing rates of men at ambulatory healthcare visits were too low to provide HIV testing coverage of all men by age 39 years. A 3-fold increase in the percentage of visits with an HIV test would result in universal testing coverage of black and Hispanic men by age 39 years. |
Progress toward poliomyelitis eradication - Pakistan, January 2019-September 2020
Hsu CH , Rehman MS , Bullard K , Jorba J , Kader M , Young H , Safdar M , Jafari HS , Ehrhardt D . MMWR Morb Mortal Wkly Rep 2020 69 (46) 1748-1752 Pakistan and Afghanistan are the only countries where wild poliovirus type 1 (WPV1) is endemic (1,2). In 2019, Pakistan reported 147 WPV1 cases, approximately 12 times the number reported in 2018. As of September 15, 72 cases had been reported in 2020. Since 2019, WPV1 transmission has also spread from Pakistan's core poliovirus reservoirs (Karachi, Peshawar, and Quetta block) to southern districts of Khyber Pakhtunkhwa (KP), Punjab, and Sindh provinces. Further, an outbreak of circulating vaccine-derived poliovirus type 2 (cVDPV2), first detected in July 2019, has caused 22 paralytic cases in 2019 and 59 as of September 15, 2020, throughout the country. The coronavirus disease 2019 (COVID-19) pandemic has substantially reduced delivery of polio vaccines through essential immunization (formerly routine immunization) and prevented implementation of polio supplementary immunization activities (SIAs)* during March-July 2020. This report describes Pakistan's progress in polio eradication during January 2019-September 2020 and updates previous reports (1,3,4). The Pakistan polio program has reinitiated SIAs and will need large, intensive, high-quality campaigns with strategic use of available oral poliovirus vaccines (OPVs)(†) to control the surge and widespread transmission of WPV1 and cVDPV2. |
Influenza surveillance capacity improvements in Africa during 2011-2017
Igboh LS , McMorrow M , Tempia S , Emukule GO , Talla Nzussouo N , McCarron M , Williams T , Weatherspoon V , Moen A , Fawzi D , Njouom R , Nakoune E , Dauoda C , Kavunga-Membo H , Okeyo M , Heraud JM , Mambule IK , Sow SO , Tivane A , Lagare A , Adebayo A , Dia N , Mmbaga V , Maman I , Lutwama J , Simusika P , Walaza S , Mangtani P , Nguipdop-Djomo P , Cohen C , Azziz-Baumgartner E . Influenza Other Respir Viruses 2020 15 (4) 495-505 BACKGROUND: Influenza surveillance helps time prevention and control interventions especially where complex seasonal patterns exist. We assessed influenza surveillance sustainability in Africa where influenza activity varies and external funds for surveillance have decreased. METHODS: We surveyed African Network for Influenza Surveillance and Epidemiology (ANISE) countries about 2011-2017 surveillance system characteristics. Data were summarized with descriptive statistics and analyzed with univariate and multivariable analyses to quantify sustained or expanded influenza surveillance capacity in Africa. RESULTS: Eighteen (75%) of 24 ANISE members participated in the survey; their cumulative population of 710 751 471 represents 56% of Africa's total population. All 18 countries scored a mean 95% on WHO laboratory quality assurance panels. The number of samples collected from severe acute respiratory infection case-patients remained consistent between 2011 and 2017 (13 823 vs 13 674 respectively) but decreased by 12% for influenza-like illness case-patients (16 210 vs 14 477). Nine (50%) gained capacity to determine influenza B lineage. The number of countries reporting each week to WHO FluNet increased from 15 (83%) in 2011 to 17 (94%) in 2017. CONCLUSIONS: Despite declines in external surveillance funding, ANISE countries gained additional laboratory testing capacity and continued influenza testing and reporting to WHO. These gains represent important achievements toward sustainable surveillance and epidemic/pandemic preparedness. |
Pre-exposure prophylaxis (PrEP) awareness and prescribing behaviors among primary care providers: DocStyles Survey, 2016-2020, United States
Jones JT , deCastro BR , August EM , Smith DK . AIDS Behav 2020 25 (4) 1267-1275 Few studies have assessed providers' intent to prescribe PrEP in the future. We analyzed cross-sectional web-based surveys to estimate trends from 2016 to 2020 in PrEP awareness and prescribing behaviors in the United States among primary care providers. Multivariable logistic regression was used to estimate prevalence of PrEP awareness, prescribing behaviors, and likelihood of prescribing PrEP in the next 12 months. The adjusted prevalence for PrEP awareness was significantly higher in 2019 (93.7%, 95% CI 91.9%, 95.2%) compared to 2018 (88.1%, 95% CI 85.5%, 90.3%). The adjusted prevalence for prescribing PrEP was significantly higher in 2019 (16.4%, 95% CI 13.6%, 19.6%) and 2020 (15.6%, 95% CI 13.0%, 18.7%) compared to 2018 (12.2%, 95% CI 10.0%, 14.7%). Practicing in the West and regularly screening for HIV were associated with higher PrEP awareness and provision. Studies should examine factors associated with PrEP provision for groups with increased risk for HIV. |
Improving injection safety practices of Cambodian healthcare workers through training
Kanagasabai U , Singh A , Shiraishi RW , Ly V , Hy C , Sanith S , Srun S , Sansam S , SopHeap ST , Liu Y , Jones G , Ijeoma UC , Bock N , Benech I , Selenic D , Drammah B , Gadde R , Mili FD . PLoS One 2020 15 (10) e0241176 BACKGROUND: This study evaluated the impact of an injection safety training on healthcare worker (HCW) practice and knowledge following an HIV outbreak in Roka commune, Cambodia. METHODS: Surveys were conducted at baseline (September 2016) and seven months after a training intervention (March 2018) using the World Health Organization standardized injection practices assessment tool. HCWs were sampled at 15 purposively selected government health facilities in two provinces. HCWs were observed during injection practices and interviewed by trained experts from Becton-Dickinson and the Ministry of Health Cambodia. The Rao-Scott chi-square test was used to test for differences between baseline and follow-up. RESULTS: We completed 115 observations of practice at baseline and 206 at post-training follow-up. The proportion of patients whose identification was confirmed by HCWs prior to the procedure being performed increased from 40.4% to 98% (p <0.0001). The proportion of HCWs who practiced correct hand hygiene increased from 22.0% to 80.6% (p = 0.056) [therapeutic observations] and 17.2% to 63.4% (p = 0.0012) [diagnostic observations]. Immediate disposal of sharps by HCWs decreased from 96.5% to 92.5% (p = 0.0030). CONCLUSIONS: We found significant improvements in the practice of patient identity confirmation and hand hygiene but not in the immediate disposal of sharps in the post-training intervention. However, findings are not representative of all HCWs in the country. Further pre-service and in-service training and monitoring are necessary to ensure sustained behavior change. |
Quality improvement approach for increasing linkage to HIV care and treatment among newly-diagnosed HIV-infected persons in Kenyan urban informal settlements during 2011-2015
Kegoli S , Ondondo R , Njoroge A , Motoku J , Muriithi C , Ngugi E , Katana A , Waruru A , Weyanga H , Mutisya I . East Afr Med J 2019 96 (2) 2396-2408 Background: Pre-enrollment loss to follow-up and delayed linkage to HIV care and treatment (C&T) of newly-diagnosed HIV-infected individuals are associated with increased morbidity and mortality. Objective: To describe the quality improvement approach utilized by Eastern Deanery AIDS Relief Program (EDARP) to increase linkage to HIV C&T of newly-diagnosed HIV-infected individuals. Design: Cross-sectional descriptive assessment of a three-phased continuous quality improvement (CQI) project among 20,972 newly diagnosed HIV patients at 14 EDARP health facilities in Nairobi, Kenya. Phase 1: physically escorting patients to the HIV C&T clinic; Phase 2: use of linkage registers and timely tracking and tracing of individuals who missed appointments; Phase 3: use of patient HIV literacy materials. Routine patient data collected during the CQI interventions implemented between October 2011 and September 2015 were analyzed. Results: Implementation of the three CQI phases significantly increased linkage to HIV C&T from 60% at baseline in 2011 to 98% in 2015 (p<0.0001). Factors associated with decreased linkage to HIV C&T through this CQI intervention were: age (adolescents aged 10-19 years) [odds ratio (OR) 0.60, 95% confidence interval (CI): 0.51-7.0]; female sex [OR 0.64, (95% CI: 0.59-0.70)] and unemployment [OR 0.84, (95% CI: 0.77-0.92)]. First-time testers [OR 1.9, (95% CI: 1.8-2.1)] and divorcees [OR 2.0, (95% CI: 1.7-2.3)] had an increased likelihood of linkage to HIV C&T (p<0.001). Conclusion: Successful linkage to HIV C&T services for newly-diagnosed HIV-infected individuals is achievable through adoption of feasible and low-cost multi-pronged CQI interventions. |
Challenges to achieving measles elimination, Georgia, 2013-2018
Khetsuriani N , Sanadze K , Chlikadze R , Chitadze N , Dolakidze T , Komakhidze T , Jabidze L , Huseynov S , Ben Mamou M , Muller C , Zakhashvili K , Hübschen JM . Emerg Infect Dis 2020 26 (11) 2565-2577 Controlling measles outbreaks in the country of Georgia and throughout Europe is crucial for achieving the measles elimination goal for the World Health Organization's European Region. However, large-scale measles outbreaks occurred in Georgia during 2013-2015 and 2017-2018. The epidemiology of these outbreaks indicates widespread circulation and genetic diversity of measles viruses and reveals persistent gaps in population immunity across a wide age range that have not been sufficiently addressed thus far. Historic problems and recent challenges with the immunization program contributed to outbreaks. Addressing population susceptibility across all age groups is needed urgently. However, conducting large-scale mass immunization campaigns under the current health system is not feasible, so more selective response strategies are being implemented. Lessons from the measles outbreaks in Georgia could be useful for other countries that have immunization programs facing challenges related to health-system transitions and the presence of age cohorts with historically low immunization coverage. |
Enterovirus D68-associated acute flaccid myelitis, United States, 2020
Kidd S , Lopez AS , Konopka-Anstadt JL , Nix WA , Routh JA , Oberste MS . Emerg Infect Dis 2020 26 (10) Acute flaccid myelitis (AFM) is a serious neurologic condition that causes limb weakness or paralysis in previously healthy children. Since clusters of cases were first reported in 2014, nationwide surveillance has demonstrated sharp increases in AFM cases in the United States every 2 years, most occurring during late summer and early fall. Given this current biennial pattern, another peak AFM season is expected during fall 2020 in the United States. Scientific understanding of the etiology and the factors driving the biennial increases in AFM has advanced rapidly in the past few years, although areas of uncertainty remain. The Centers for Disease Control and Prevention and AFM partners are focused on answering key questions about AFM epidemiology and mechanisms of disease. This article summarizes the current understanding of AFM etiology and outlines priorities for surveillance and research as we prepare for a likely surge in cases in 2020. |
Tuberculosis among newly arrived immigrants and refugees in the United States
Liu Y , Phares CR , Posey DL , Maloney SA , Cain KP , Weinberg MS , Schmit KM , Marano N , Cetron MS . Ann Am Thorac Soc 2020 17 (11) 1401-1412 Rationale: U.S. health departments routinely conduct post-arrival evaluation of immigrants and refugees at risk for tuberculosis (TB), but this important intervention has not been thoroughly studied. Objectives: To assess outcomes of the post-arrival evaluation intervention. Methods: We categorized at-risk immigrants and refugees as having had recent completion of treatment for pulmonary TB disease overseas (including in Mexico and Canada); as having suspected TB disease (chest radiograph/clinical symptoms suggestive of TB) but negative culture results overseas; or as having latent TB infection (LTBI) diagnosed overseas. Among 2.1 million U.S.-bound immigrants and refugees screened for TB overseas during 2013-2016, 90,737 were identified as at risk for TB. We analyzed a national data set of these at-risk immigrants and refugees and calculated rates of TB disease for those who completed post-arrival evaluation. Results: Among 4,225 persons with recent completion of treatment for pulmonary TB disease overseas, 3,005 (71.1%) completed post-arrival evaluation within 1 year of arrival; of these, TB disease was diagnosed in 22 (732 cases/100,000 persons), including 4 sputum culture-positive cases (133 cases/100,000 persons), 13 sputum culture-negative cases (433 cases/100,000 persons), and 5 cases with no reported sputum-culture results (166 cases/100,000 persons). Among 55,938 with suspected TB disease but negative culture results overseas, 37,089 (66.3%) completed post-arrival evaluation; of these, TB disease was diagnosed in 597 (1,610 cases/100,000 persons), including 262 sputum culture-positive cases (706 cases/100,000 persons), 281 sputum culture-negative cases (758 cases/100,000 persons), and 54 cases with no reported sputum-culture results (146 cases/100,000 persons). 
Among 30,574 with LTBI diagnosed overseas, 18,466 (60.4%) completed post-arrival evaluation; of these, TB disease was diagnosed in 48 (260 cases/100,000 persons), including 11 sputum culture-positive cases (60 cases/100,000 persons), 22 sputum culture-negative cases (119 cases/100,000 persons), and 15 cases with no reported sputum-culture results (81 cases/100,000 persons). Of 21,714 persons for whom treatment for LTBI was recommended at post-arrival evaluation, 14,977 (69.0%) initiated treatment and 8,695 (40.0%) completed treatment. Conclusions: Post-arrival evaluation of at-risk immigrants and refugees can be highly effective. To optimize the yield and impact of this intervention, strategies are needed to improve completion rates of post-arrival evaluation and treatment for LTBI. |
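The per-100,000 rates quoted in this abstract follow directly from the counts it reports; a minimal Python sketch (using only figures stated in the abstract) reproduces them:

```python
# Crude TB disease rates per 100,000 among at-risk immigrants and refugees
# who completed post-arrival evaluation, using counts from the abstract.
def rate_per_100k(cases, persons):
    """TB cases per 100,000 persons evaluated, rounded to the nearest whole number."""
    return round(cases / persons * 100_000)

# (overseas screening category, TB cases diagnosed, persons completing evaluation)
groups = [
    ("completed TB treatment overseas", 22, 3_005),            # -> 732/100,000
    ("suspected TB, culture-negative overseas", 597, 37_089),  # -> 1,610/100,000
    ("LTBI diagnosed overseas", 48, 18_466),                   # -> 260/100,000
]
for name, cases, n in groups:
    print(f"{name}: {rate_per_100k(cases, n)} cases/100,000")
```

The computed values match the 732, 1,610, and 260 cases/100,000 reported above.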
A prolonged cholera outbreak caused by drinking contaminated stream water, Kyangwali refugee settlement, Hoima District, Western Uganda: 2018
Monje F , Ario AR , Musewa A , Bainomugisha K , Mirembe BB , Aliddeki DM , Eurien D , Nsereko G , Nanziri C , Kisaakye E , Ntono V , Kwesiga B , Kadobera D , Bulage L , Bwire G , Tusiime P , Harris J , Zhu BP . Infect Dis Poverty 2020 9 (1) 154 BACKGROUND: On 23 February 2018, the Uganda Ministry of Health (MOH) declared a cholera outbreak affecting more than 60 persons in Kyangwali Refugee Settlement, Hoima District, bordering the Democratic Republic of the Congo (DRC). We investigated to determine the outbreak scope and risk factors for transmission, and recommend evidence-based control measures. METHODS: We defined a suspected case as sudden onset of watery diarrhoea in any person aged ≥ 2 years in Hoima District, 1 February-9 May 2018. A confirmed case was a suspected case with Vibrio cholerae cultured from a stool sample. We found cases by active community search and record reviews at Cholera Treatment Centres. We calculated case-fatality rates (CFR) and attack rates (AR) by sub-county and nationality. In a case-control study, we compared exposure factors among case- and control-households. We estimated the association between the exposures and outcome using the Mantel-Haenszel method. We conducted an environmental assessment in the refugee settlement, including testing samples of stream water, tank water, and spring water for presence of fecal coliforms. We tested suspected cholera cases using cholera rapid diagnostic test (RDT) kits, followed by culture for confirmation. RESULTS: We identified 2122 case-patients and 44 deaths (CFR = 2.1%). Case-patients originating from the Democratic Republic of the Congo were the most affected (AR = 15/1000). The overall attack rate in Hoima District was 3.2/1000, with Kyangwali sub-county being the most affected (AR = 13/1000). The outbreak, which lasted 4 months, followed a multiple point-source pattern. Environmental assessment showed that a stream separating two villages in Kyangwali Refugee Settlement was a site of open defecation for refugees. 
Among three water sources tested, only stream water was faecally contaminated, yielding > 100 CFU/100 ml. Of 130 stool samples tested, 124 (95%) yielded V. cholerae by culture. Stream water was most strongly associated with illness (odds ratio [OR] = 14.2, 95% CI: 1.5-133), although tank water also appeared to be independently associated with illness (OR = 11.6, 95% CI: 1.4-94). Persons who drank tank and stream water had a 17-fold higher odds of illness compared with persons who drank from other sources (OR = 17.3, 95% CI: 2.2-137). CONCLUSIONS: Our investigation demonstrated that this was a prolonged cholera outbreak that affected four sub-counties and two divisions in Hoima District, and was associated with drinking of contaminated stream water. In addition, tank water also appears to be unsafe. We recommended boiling drinking water, increasing latrine coverage, and provision of safe water by the District and the High Commission for Refugees. |
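The study estimated exposure-illness associations with the Mantel-Haenszel method; as an illustration only, the sketch below computes a crude (unstratified) odds ratio with a Woolf-type 95% confidence interval from a 2x2 case-control table. The counts are invented for the example, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Woolf 95% CI for a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Hypothetical counts: households drinking stream water vs. other sources
or_, lo, hi = odds_ratio_ci(30, 10, 15, 45)
print(f"OR = {or_:.1f} (95% CI: {lo:.1f}-{hi:.1f})")
```

A CI whose lower bound exceeds 1 (as with the stream-water OR of 14.2 above) indicates a statistically significant association.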
Large outbreak of Guillain-Barré syndrome, Peru, 2019
Munayco CV , Gavilan RG , Ramirez G , Loayza M , Miraval ML , Whitehouse E , Gharpure R , Soares J , Soplopuco HV , Sejvar J . Emerg Infect Dis 2020 26 (11) 2778-2780 Outbreaks of Guillain-Barré syndrome (GBS) are uncommon. In May 2019, national surveillance in Peru detected an increase in GBS cases in excess of the expected incidence of 1.2 cases/100,000 population. Several clinical and epidemiologic findings call into question the suggested association between this GBS outbreak and Campylobacter. |
The effect of TB treatment on health-related quality of life for people with advanced HIV
Opollo V , Sun X , Lando R , Miyahara S , Torres TS , Hosseinipour MC , Bisson GP , Kumwenda J , Gupta A , Nyirenda M , Katende K , Suryavanshi N , Beulah F , Shah NS . Int J Tuberc Lung Dis 2020 24 (9) 910-915 BACKGROUND: Study A5274 was an open-label trial of people with HIV (PLHIV) with CD4 cell count <50 cells/µL who were randomized to empirical TB treatment vs. isoniazid preventive therapy (IPT) in addition to antiretroviral therapy (ART). We evaluated health-related quality of life (HRQoL) by study arm, changes over time, and association with sociodemographic and clinical factors. METHODS: Participants aged >13 years were enrolled from outpatient clinics in 10 countries. HRQoL was assessed at Weeks 0, 8, 24 and 96 with questions about daily activity, hospital or emergency room visits, and general health status. We used logistic regression to examine HRQoL by arm and association with sociodemographic and clinical factors. RESULTS: Among 850 participants (424 empiric arm, 426 IPT arm), HRQoL improved over time with no difference between arms. At baseline and Week 24, participants with WHO Stage 3 or 4 events, or those who had Grade 3 or 4 signs/symptoms, were significantly more likely to report poor HRQoL using the composite of four HRQoL measures. CONCLUSION: HRQoL improved substantially in both arms during the study period. These findings show that ART, TB screening, and IPT can not only reduce mortality, but also improve HRQoL in PLHIV with advanced disease. |
Altered antibody responses in persons infected with HIV-1 while using PrEP
Parker I , Khalil G , Martin A , Martin MT , Vanichseni S , Leelawiwat W , McNicholl JM , Hickey A , Garcia-Lerma JG , Choopanya K , Curtis K . AIDS Res Hum Retroviruses 2020 37 (3) 189-195 BACKGROUND: Pre-exposure prophylaxis (PrEP) is an effective HIV prevention tool, although effectiveness is dependent upon adherence. It is important to characterize the impact of PrEP on HIV antibody responses in people who experience breakthrough infections in order to understand the potential impact on timely diagnosis and treatment. METHODS: Longitudinal HIV-1-specific antibody responses were evaluated in 42 people who inject drugs (PWID) from the Bangkok Tenofovir Study (placebo=28; PrEP=14) who acquired HIV while receiving PrEP. HIV-1 antibody levels and avidity to three envelope proteins (gp41, gp160, and gp120) were measured in the plasma using a customized Bio-Plex (Bio-Rad Laboratories) assay. A survival analysis was performed for each biomarker to compare the distribution of times at which study subjects exceeded the recent/long-term assay threshold, comparing PrEP and placebo treatment groups. We fit mixed-effects models to identify longitudinal differences in antibody levels and avidity between groups. RESULTS: Overall, longitudinal antibody levels and avidity were notably lower in the PrEP breakthrough group compared to the placebo group. Survival analyses demonstrated a difference in time to antibody reactivity between treatment groups for all Bio-Plex biomarkers. Longitudinal gp120 antibody levels within the PrEP breakthrough group were decreased compared to the placebo group. When accounting for PrEP adherence, both gp120 and gp160 antibody levels were lower in the PrEP breakthrough group compared to the placebo group. CONCLUSION: We demonstrate hindered envelope antibody maturation in PWID who became infected while receiving PrEP in the Bangkok Tenofovir Study, which has significant implications for HIV diagnosis. 
Delayed maturation of the antibody response to HIV may increase the time to detection for antibody-based tests. |
Progress toward regional measles elimination - worldwide, 2000-2019
Patel MK , Goodson JL , Alexander JP Jr , Kretsinger K , Sodha SV , Steulet C , Gacic-Dobo M , Rota PA , McFarland J , Menning L , Mulders MN , Crowcroft NS . MMWR Morb Mortal Wkly Rep 2020 69 (45) 1700-1705 In 2010, the World Health Assembly (WHA) set the following three milestones for measles control to be achieved by 2015: 1) increase routine coverage with the first dose of measles-containing vaccine (MCV1) among children aged 1 year to ≥90% at the national level and to ≥80% in every district, 2) reduce global annual measles incidence to <5 cases per 1 million population, and 3) reduce global measles mortality by 95% from the 2000 estimate* (1). In 2012, WHA endorsed the Global Vaccine Action Plan,(†) with the objective of eliminating measles(§) in five of the six World Health Organization (WHO) regions by 2020. This report describes progress toward WHA milestones and regional measles elimination during 2000-2019 and updates a previous report (2). During 2000-2010, estimated MCV1 coverage increased globally from 72% to 84% but has since plateaued at 84%-85%. All countries conducted measles surveillance; however, approximately half did not achieve the sensitivity indicator target of two or more discarded measles and rubella cases per 100,000 population. Annual reported measles incidence decreased 88%, from 145 to 18 cases per 1 million population during 2000-2016; the lowest incidence occurred in 2016, but by 2019 incidence had risen to 120 cases per 1 million population. During 2000-2019, the annual number of estimated measles deaths decreased 62%, from 539,000 to 207,500; an estimated 25.5 million measles deaths were averted. To drive progress toward the regional measles elimination targets, additional strategies are needed to help countries reach all children with 2 doses of measles-containing vaccine, identify and close immunity gaps, and improve surveillance. |
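The percentage declines quoted in this abstract can be checked from its endpoint figures; a one-function Python sketch:

```python
# Percent decline between two time points, as reported in the abstract.
def pct_decline(start, end):
    return round((start - end) / start * 100)

print(pct_decline(145, 18))           # measles incidence per 1 million, 2000 vs 2016 -> 88
print(pct_decline(539_000, 207_500))  # estimated measles deaths, 2000 vs 2019 -> 62
```

Both values match the 88% and 62% declines stated above.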
Association of schistosomiasis and HIV infections: a systematic review and meta-analysis
Patel P , Rose CE , Kjetland EF , Downs JA , Mbabazi PS , Sabin K , Chege W , Watts DH , Secor WE . Int J Infect Dis 2020 102 544-553 BACKGROUND: Female genital schistosomiasis (FGS) affects up to 56 million women in sub-Saharan Africa and may increase risk of HIV infection. METHODS: To assess the association of schistosomiasis with HIV infection, we examined peer-reviewed literature published until December 31, 2018 and generated a pooled estimate for the odds ratio using Bayesian random effects models. RESULTS: Of the 364 abstracts identified, 26 were included in the summary. Eight reported odds ratios of the association between schistosomiasis and HIV; one reported a transmission hazard ratio (HR) of 1·8 (95% confidence interval [CI]: 1·2-2·6) among women and 1·4 (95% CI: 1·0-1·9) among men; 11 described the prevalence of schistosomiasis among HIV-positive persons (range, 1·5%-36·6%); and six reported the prevalence of HIV among persons with schistosomiasis (range, 5·8%-57·3%). Six studies were selected for quantitative analysis. The pooled estimate for the odds ratio of HIV among persons with schistosomiasis was 2·3 (95% CI: 1·2-4·3). CONCLUSIONS: We found a significant association of schistosomiasis with HIV. However, we could not generate a specific summary estimate for FGS. We provide a research agenda to determine the effect of FGS on HIV infection. WHO's policy on mass drug administration for schistosomiasis may prevent HIV. |
Leprosy post-exposure prophylaxis with single-dose rifampicin (LPEP): an international feasibility programme
Richardus JH , Tiwari A , Barth-Jaeggi T , Arif MA , Banstola NL , Baskota R , Blaney D , Blok DJ , Bonenberger M , Budiawan T , Cavaliero A , Gani Z , Greter H , Ignotti E , Kamara DV , Kasang C , Manglani PR , Mieras L , Njako BF , Pakasi T , Pandey BD , Saunderson P , Singh R , Smith WCS , Stäheli R , Suriyarachchi ND , Tin Maung A , Shwe T , van Berkel J , van Brakel WH , Vander Plaetse B , Virmond M , Wijesinghe MSD , Aerts A , Steinmann P . Lancet Glob Health 2020 9 (1) e81-e90 BACKGROUND: Innovative approaches are required for leprosy control to reduce cases and curb transmission of Mycobacterium leprae. Early case detection, contact screening, and chemoprophylaxis are the most promising tools. We aimed to generate evidence on the feasibility of integrating contact tracing and administration of single-dose rifampicin (SDR) into routine leprosy control activities. METHODS: The leprosy post-exposure prophylaxis (LPEP) programme was an international, multicentre feasibility study implemented within the leprosy control programmes of Brazil, India, Indonesia, Myanmar, Nepal, Sri Lanka, and Tanzania. LPEP explored the feasibility of combining three key interventions: systematically tracing contacts of individuals newly diagnosed with leprosy; screening the traced contacts for leprosy; and administering SDR to eligible contacts. Outcomes were assessed in terms of number of contacts traced, screened, and SDR administration rates. FINDINGS: Between Jan 1, 2015, and Aug 1, 2019, LPEP enrolled 9170 index patients and listed 179 769 contacts, of whom 174 782 (97·2%) were successfully traced and screened. Of those screened, 22 854 (13·1%) were excluded from SDR mainly because of health reasons and age. Among those excluded, 810 were confirmed as new patients (46 per 10 000 contacts screened). Among the eligible screened contacts, 1182 (0·7%) refused prophylactic treatment with SDR. Overall, SDR was administered to 151 928 (86·9%) screened contacts. 
No serious adverse events were reported. INTERPRETATION: Post-exposure prophylaxis with SDR is safe; can be integrated into different leprosy control programmes with minimal additional efforts once contact tracing has been established; and is generally well accepted by index patients, their contacts, and health-care workers. The programme has also invigorated local leprosy control through the availability of a prophylactic intervention; therefore, we recommend rolling out SDR in all settings where contact tracing and screening have been established. FUNDING: Novartis Foundation. |
Same-day prescribing of daily oral pre-exposure prophylaxis for HIV prevention
Rowan SE , Patel RR , Schneider JA , Smith DK . Lancet HIV 2020 8 (2) e114-e120 Pre-exposure prophylaxis (PrEP) is highly effective in reducing HIV transmission but remains underutilised globally. Same-day PrEP prescribing and medication provision is an emerging implementation approach. The experiences of three same-day PrEP programmes support the feasibility of the approach. Key elements of safe and effective same-day PrEP programmes include the ability to order laboratory tests at the time of the clinical visit and the ability to contact patients when laboratory results are available. Same-day PrEP has the potential to alleviate the attrition seen in usual care between initial evaluation and receipt of a PrEP prescription. Widespread application of same-day prescribing will be needed to assess its effect on PrEP usage. |
Optimal allocation of societal HIV prevention resources to reduce HIV incidence in the United States
Sansom SL , Hicks KA , Carrico J , Jacobson EU , Shrestha RK , Green TA , Purcell DW . Am J Public Health 2020 111 (1) e1-e8 Objectives. To optimize combined public and private spending on HIV prevention to achieve maximum reductions in incidence. Methods. We used a national HIV model to estimate new infections from 2018 to 2027 in the United States. We estimated current spending on HIV screening, interventions that move persons with diagnosed HIV along the HIV care continuum, pre-exposure prophylaxis, and syringe services programs. We compared the current funding allocation with 2 optimal scenarios: (1) a limited-reach scenario with expanded efforts to serve eligible persons and (2) an ideal, unlimited-reach scenario in which all eligible persons could be served. Results. A continuation of the current allocation projects 331 000 new HIV cases over the next 10 years. The limited-reach scenario reduces that number by 69%, and the unlimited-reach scenario by 94%. The most efficient funding allocations resulted in prompt diagnosis and sustained viral suppression through improved screening of high-risk persons and treatment adherence support for those infected. Conclusions. Optimal allocations of public and private funds for HIV prevention can achieve substantial reductions in new infections. Achieving reductions of more than 90% under current funding will require that virtually all infected receive sustained treatment. (Am J Public Health. Published online ahead of print November 19, 2020: e1-e8. https://doi.org/10.2105/AJPH.2020.305965). |
Nowcasting (short-term forecasting) of influenza epidemics in local settings, Sweden, 2008-2019
Spreco A , Eriksson O , Dahlström Ö , Cowling BJ , Biggerstaff M , Ljunggren G , Jöud A , Istefan E , Timpka T . Emerg Infect Dis 2020 26 (11) 2669-2677 The timing of influenza case incidence during epidemics can differ between regions within nations and states. We conducted a prospective 10-year evaluation (January 2008-February 2019) of a local influenza nowcasting (short-term forecasting) method in 3 urban counties in Sweden with independent public health administrations by using routine health information system data. Detection-of-epidemic-start (detection), peak timing, and peak intensity were nowcasted. Detection displayed satisfactory performance in 2 of the 3 counties for all nonpandemic influenza seasons and in 6 of 9 seasons for the third county. Peak-timing prediction showed satisfactory performance from the influenza season 2011-12 onward. Peak-intensity prediction also was satisfactory for influenza seasons in 2 of the counties but poor in 1 county. Local influenza nowcasting was satisfactory for seasonal influenza in 2 of 3 counties. The less satisfactory performance in 1 of the study counties might be attributable to population mixing with a neighboring metropolitan area. |
Social media recruitment for a web survey of sexual and gender minority youth: An evaluation of methods used and resulting sample diversity
Stern MJ , Fordyce E , Hansen C , Heim Viox M , Michaels S , Schlissel A , Avripas S , Harper C , Johns M , Dunville R . LGBT Health 2020 7 (8) 448-456 Purpose: The purpose of this study was to assess the feasibility and efficacy of using advertisements (ads) on Facebook, Instagram, and Snapchat to recruit a national sample of adolescent sexual minority males ages 13-18 and transgender youth ages 13-24 for a web survey. Methods: The Survey of Today's Adolescent Relationships and Transitions (START) used targeted ads as survey recruitment tools. We assessed the efficacy of these varied forms of recruitment ads in reaching our target population. To understand how our sample differed from a national probability sample targeting the general adolescent population, we compared START respondents with sexual minority men identified from the 2017 Youth Risk Behavior Survey (YRBS). Results: The use of targeted language produced higher rates of completes per click compared with ads without targeted language. Video ads (compared with static images) were more effective at recruiting younger respondents. START and YRBS samples differed along lines of sexual identity, race and ethnicity, and age. The START sample had a greater percentage of Hispanic/Latino and Other/Multiracial respondents relative to the YRBS sample, thus providing additional data on these underserved sexual minority youth. Conclusion: The factors associated with design decisions for a hard-to-reach, non-probability sample impact the likelihood that respondents engage in and complete a survey. The ads proved to be effective and efficient at recruiting the targeted population. |
Burkholderia pseudomallei in soil, US Virgin Islands, 2019
Stone NE , Hall CM , Browne AS , Sahl JW , Hutton SM , Santana-Propper E , Celona KR , Guendel I , Harrison CJ , Gee JE , Elrod MG , Busch JD , Hoffmaster AR , Ellis EM , Wagner DM . Emerg Infect Dis 2020 26 (11) 2773-2775 The distribution of Burkholderia pseudomallei in the Caribbean is poorly understood. We isolated B. pseudomallei from US Virgin Islands soil. The soil isolate was genetically similar to other isolates from the Caribbean, suggesting that B. pseudomallei might have been introduced to the islands multiple times through severe weather events. |
High-risk groups for influenza complications
Uyeki TM . JAMA 2020 324 (22) 2334 Many groups of people are at high risk of complications from influenza. | | Persons considered to be at increased risk of complications from influenza include young children, pregnant women and postpartum women up to 2 weeks after delivery, older adults, people with certain chronic medical problems, people who live in nursing homes, and certain racial and ethnic minority groups. From 2009 to 2019, non-Hispanic Black people had the highest influenza hospitalization rates, followed by non-Hispanic American Indian or Alaska Native people, then Hispanic or Latino people, and then non-Hispanic White people. |
Performance of symptom-based case definitions to identify influenza virus infection among pregnant women in middle-income countries: findings from the Pregnancy and Influenza Multinational Epidemiologic (PRIME) Study
Wesley MG , Tinoco Y , Patel A , Suntarratiwong P , Hunt D , Sinthuwattanawibool C , Soto G , Kittikraisak W , Das PK , Arriola CS , Hombroek D , Mott J , Kurhe K , Bhargav S , Prakash A , Florian R , Gonzales O , Cabrera S , Llajaruna E , Brummer T , Malek P , Saha S , Garg S , Azziz-Baumgartner E , Thompson MG , Dawood FS . Clin Infect Dis 2020 73 (11) e4321-e4328 BACKGROUND: The World Health Organization (WHO) recommends case definitions for influenza surveillance that are also used in public health research, though their performance has not been assessed in many risk groups, including pregnant women, in whom influenza may manifest differently. We evaluated the performance of symptom-based definitions to detect influenza in a cohort of pregnant women in India, Peru, and Thailand. METHODS: In 2017 and 2018, we contacted 11,277 pregnant women twice weekly during the influenza season to identify illnesses with new or worsened cough, runny nose, sore throat, difficulty breathing, or myalgia, and collected data on other symptoms and nasal swabs for influenza rRT-PCR testing. We calculated the sensitivity, specificity, positive predictive value, and negative predictive value of each symptom-predictor, the WHO respiratory illness case definitions, and a de novo definition derived from results of multivariable modelling. RESULTS: Of 5,444 eligible illness episodes among 3,965 participants, 310 (6%) were positive for influenza. In a multivariable model, measured fever ≥38°C (adjusted odds ratio = 4.6, 95% confidence interval [CI] = 3.1, 6.8), myalgia (3.0, 95% CI: 2.2, 4.0), cough (2.7, 95% CI: 1.9, 3.9), and chills (1.6, 95% CI: 1.1, 2.4) were independently associated with influenza illness. A definition based on these four symptoms (measured fever, cough, chills, or myalgia) was 95% sensitive and 27% specific. The WHO influenza-like illness (ILI) definition was 16% sensitive and 98% specific. 
CONCLUSIONS: The current WHO ILI case definition was highly specific but had low sensitivity. The intended use of case definitions should be considered when evaluating the tradeoff between sensitivity and specificity. |
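The sensitivity-specificity tradeoff the conclusion describes comes from the standard 2x2 screening metrics; the sketch below computes them, with hypothetical counts chosen only to mirror the derived definition's reported 95% sensitivity and 27% specificity:

```python
def screening_metrics(tp, fp, fn, tn):
    """Standard 2x2 screening metrics.
    tp/fn: influenza-positive illnesses the definition did/did not flag;
    fp/tn: influenza-negative illnesses the definition did/did not flag."""
    return {
        "sensitivity": tp / (tp + fn),  # flagged, among true influenza
        "specificity": tn / (tn + fp),  # not flagged, among non-influenza
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts matching 95% sensitivity and 27% specificity
m = screening_metrics(tp=95, fp=73, fn=5, tn=27)
print({k: round(v, 2) for k, v in m.items()})
```

A broad definition catches nearly all true cases (high sensitivity) at the cost of many false positives (low specificity); a strict one, like the WHO ILI definition here, does the reverse.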
Genetic diversity of Nipah virus in Bangladesh
Rahman MZ , Islam MM , Hossain ME , Rahman MM , Islam A , Siddika A , Hossain MSS , Sultana S , Islam A , Rahman M , Rahman M , Klena JD , Flora MS , Daszak P , Epstein JH , Luby SP , Gurley ES . Int J Infect Dis 2020 102 144-151 BACKGROUND: Nipah virus (NiV) infection, often fatal in humans, is primarily transmitted in Bangladesh through consumption of date palm sap contaminated by Pteropus bats. Person-to-person transmission is also common and increases the concern of large outbreaks. This study aimed to characterize the molecular epidemiology, phylogenetic relationship, and evolution of the nucleocapsid gene (N gene) of NiV. METHODS: We conducted molecular detection, genetic characterization, and Bayesian time-scale evolution analyses of NiV using pooled Pteropid bat roost urine samples from an outbreak area in 2012 and archived RNA samples from NiV case-patients identified during 2012-2018 in Bangladesh. RESULTS: NiV RNA was detected in 19% (38/456) of bat roost urine samples, and nine N gene sequences were recovered from them. We also retrieved sequences from 53% (21 out of 39) of archived RNA samples from patients. Phylogenetic analysis revealed that all Bangladeshi strains belonged to the NiV-BD genotype and had an evolutionary rate of 4.64 × 10(-4) substitutions/site/year. The analyses suggested that the strains of the NiV-BD genotype diverged around 1995 and formed two sub-lineages. CONCLUSION: This analysis provides further evidence that the NiV strains of the Malaysian and Bangladesh genotypes diverged recently and continue to evolve. More extensive surveillance of NiV in bats and humans will be helpful to explore strain diversity and virulence potential to infect humans via direct or person-to-person virus transmission. |
Asymptomatic infection of Marburg virus reservoir bats is explained by a strategy of immunoprotective disease tolerance
Guito JC , Prescott JB , Arnold CE , Amman BR , Schuh AJ , Spengler JR , Sealy TK , Harmon JR , Coleman-McCray JD , Kulcsar KA , Nagle ER , Kumar R , Palacios GF , Sanchez-Lockhart M , Towner JS . Curr Biol 2020 31 (2) 257-270 e5 Marburg virus (MARV) is among the most virulent pathogens of primates, including humans. Contributors to severe MARV disease include immune response suppression and inflammatory gene dysregulation ("cytokine storm"), leading to systemic damage and often death. Conversely, MARV causes little to no clinical disease in its reservoir host, the Egyptian rousette bat (ERB). Previous genomic and in vitro data suggest that a tolerant ERB immune response may underlie MARV avirulence, but no significant examination of this response in vivo yet exists. Here, using colony-bred ERBs inoculated with a bat isolate of MARV, we use species-specific antibodies and an immune gene probe array (NanoString) to temporally characterize the transcriptional host response at sites of MARV replication relevant to primate pathogenesis and immunity, including CD14(+) monocytes/macrophages, critical immune response mediators, primary MARV targets, and skin at the inoculation site, where highest viral loads and initial engagement of antiviral defenses are expected. Our analysis shows that ERBs upregulate canonical antiviral genes typical of mammalian systems, such as ISG15, IFIT1, and OAS3, yet demonstrate a remarkable lack of significant induction of proinflammatory genes classically implicated in primate filoviral pathogenesis, including CCL8, FAS, and IL6. Together, these findings offer the first in vivo functional evidence for disease tolerance as an immunological mechanism by which the bat reservoir asymptomatically hosts MARV. 
More broadly, these data highlight factors determining disparate outcomes between reservoir and spillover hosts and defensive strategies likely utilized by bat hosts of other emerging pathogens, knowledge that may guide development of effective antiviral therapies. |
The changing triad of plague in Uganda: invasive black rats (Rattus rattus), indigenous small mammals, and their fleas
Enscore RE , Babi N , Amatre G , Atiku L , Eisen RJ , Pepin KM , Vera-Tudela R , Sexton C , Gage KL . J Vector Ecol 2020 45 (2) 333-355 Rattus rattus was first reported from the West Nile Region of Uganda in 1961, an event that preceded the appearance of the first documented human plague outbreak in 1970. We investigated how invasive R. rattus and native small mammal populations, as well as their fleas, have changed in recent decades. Over an 18-month period, a total of 2,959 small mammals were captured, sampled, and examined for fleas, resulting in the identification of 20 small mammal taxa that were hosts to 5,109 fleas (nine species). Over three-fourths (75.8%) of captured mammals belonged to four taxa: R. rattus, which predominated inside huts, and Arvicanthis niloticus, Mastomys sp., and Crocidura sp., which were more common outside huts. These mammals were hosts for 85.8% of fleas collected, including the efficient plague vectors Xenopsylla cheopis and X. brasiliensis, as well as likely enzootic vectors, Dinopsyllus lypusus and Ctenophthalmus bacopus. Flea loads on small mammals were higher in certain environments in villages with a recent history of plague compared to those that lacked such a history. The significance of these results is discussed in relation to historical data, the initial spread of plague in the West Nile Region, and the continuing threat posed by the disease. |
Short Report: Eastern equine encephalitis virus seroprevalence in Maine cervids, 2012-2017
Kenney JL , Henderson E , Mutebi JP , Saxton-Shaw K , Bosco-Lauth A , Elias SP , Robinson S , Smith RP , Lubelczyk C . Am J Trop Med Hyg 2020 103 (6) 2438-2441 Eastern equine encephalitis virus (EEEV) first emerged in Maine in the early 2000s and resulted in an epizootic outbreak in 2009. Since 2009, serum samples from cervids throughout Maine have been collected and assessed for the presence of neutralizing antibodies to EEEV to assess EEEV activity throughout the state. We tested 1,119 Odocoileus virginianus (white-tailed deer) and 982 Alces americanus (moose) serum samples collected at tagging stations during the hunting seasons from 2012 to 2017 throughout the state of Maine. Odocoileus virginianus from all 16 counties were EEEV seropositive, whereas A. americanus were seropositive only in the northwestern counties of Aroostook, Somerset, Piscataquis, and Franklin. Seroprevalence in O. virginianus ranged from 6.6% to 21.2% and in A. americanus from 6.6% to 10.1%. Data from this report in conjunction with findings previously reported from 2009 to 2011 indicate that EEEV is endemic throughout Maine. |
Exploring the benefits and value of public health department internships for environmental health students
Gerding JA , Hall SK , Gumina CO . J Environ Health 2020 83 (4) 20-25 Internships are an essential component of preparing prospective college graduates for entering the practice-based field of environmental health (EH). EH professionals continually encounter events or hazards of high complexity and impact, and many experienced EH professionals are expected to retire within the next several years. Efforts are needed to ensure a supply of highly qualified and prepared graduates is available to sustain and strengthen the EH workforce. The National Environmental Public Health Internship Program (NEPHIP) addresses this need by supporting health department internships for EH students of academic programs accredited by the National Environmental Health Science and Protection Accreditation Council. We conducted an assessment to examine former NEPHIP intern and mentor experiences and perspectives on 1) how well the internships prepared interns for careers in EH and 2) to what extent the internships provided value to the host health department. Overall, the internships appeared to provide EH students with a well-rounded professional and practice-based experience, while health departments benefited from hosting interns with a foundational knowledge and college education in EH. Promoting the value of public health department EH internships could encourage more students and graduates to seek internship or employment opportunities with health departments, ultimately strengthening the EH workforce. |
Associations of breast milk consumption with urinary phthalate and phenol exposure biomarkers in infants
Henderson NB , Sears CG , Calafat A , Chen A , Lanphear B , Romano M , Yolton K , Braun JM . Environ Sci Technol Lett 2020 7 (10) 733-739 Phthalates and phenols are readily detected in human breast milk, but the contribution of breast milk feeding to an infant's exposure to these chemicals is unknown. Among 248 mother-infant pairs in the HOME Study, we assessed breast milk consumption via maternal report and quantified concentrations of eight phthalate metabolites and three phenols (bisphenol A, triclosan, benzophenone-3) in infants' urine at age 12 months. We estimated covariate-adjusted percent differences in phthalate metabolite and phenol concentrations by breast milk consumption. Seventy infants (28%) were fed some breast milk up to age 12 months. Urinary phenol concentrations were similar in infants who were or were not fed breast milk at 12 months. In contrast, urinary concentrations of monocarboxyisooctyl phthalate (MCOP) and mono-3-carboxypropyl phthalate (MCPP) were 42% (95% CI: 3, 97) and 24% (95% CI: -8, 62) higher among infants fed breast milk at 12 months, respectively. Moreover, MCOP and MCPP concentrations were positively associated with the quantity of breast milk consumed in the prior month and 24 h. In this cohort, we found evidence suggesting that breast milk consumption may be a source of exposure to some phthalates. Future studies should examine whether plastic feeding bottles or pumped milk is a potential exposure source among infants. |
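The covariate-adjusted percent differences reported in this abstract are conventionally obtained by regressing the natural log of the urinary biomarker concentration on feeding status and back-transforming the coefficient. A minimal sketch of that standard back-transformation (it is an assumption that the study used exactly this form; the function name is illustrative):

```python
import math

def percent_difference(beta: float) -> float:
    """Back-transform a regression coefficient from a model with a
    ln-transformed outcome into a percent difference: 100 * (e^beta - 1)."""
    return 100.0 * (math.exp(beta) - 1.0)
```

Under this convention, a coefficient of ln(1.42) ≈ 0.35 corresponds to the 42% higher MCOP concentration reported for breast milk-fed infants.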
Per- and polyfluoroalkyl substance mixtures and gestational weight gain among mothers in the Health Outcomes and Measures of the Environment study
Romano ME , Gallagher LG , Eliot MN , Calafat AM , Chen A , Yolton K , Lanphear B , Braun JM . Int J Hyg Environ Health 2020 231 113660 BACKGROUND: Per- and polyfluoroalkyl substances (PFAS) are environmentally persistent chemicals commonly used in the production of household and consumer goods. While exposure to PFAS has been associated with greater adiposity in children and adults, less is known about associations with gestational weight gain (GWG). METHODS: Using mass spectrometry, we quantified perfluorooctanoate (PFOA), perfluorooctanesulfonate (PFOS), perfluorohexanesulfonate (PFHxS), and perfluorononanoate (PFNA) in maternal serum from 18 ± 5 weeks' gestation (mean ± standard deviation (std)) in a prospective pregnancy and birth cohort (2003-2006, Cincinnati, Ohio) (n = 277). After abstracting weight data from medical records, we calculated GWG from 16 ± 2 weeks' gestation (mean ± std) to the measured weight at the last visit or at delivery, the rate of weight gain in the 2nd and 3rd trimesters (GWR), and total weight gain z-scores standardized for gestational age at delivery and pre-pregnancy BMI. We investigated covariate-adjusted associations between individual PFAS and these outcomes using multivariable linear regression; we assessed potential effect measure modification (EMM) by overweight/obese status (pre-pregnancy BMI <25 kg/m(2) vs. ≥25 kg/m(2)). Using weighted quantile sum regression, we assessed the combined influence of these four PFAS on GWG and GWR. RESULTS: Each doubling in serum concentrations of PFOA, PFOS, and PFNA was associated with a small increase in GWG (range 0.5-0.8 lbs) and GWR (range 0.03-0.05 lbs/week) among all women. The association of PFNA with GWG was stronger among women with BMI ≥25 kg/m(2) (β = 2.6 lbs; 95% CI: -0.8, 6.0) than those with BMI <25 kg/m(2) (β = -1.0 lbs; 95% CI: -3.8, 1.8; p-EMM = 0.10). We observed associations close to the null between PFAS and z-scores and between the PFAS exposure index (a combined summary measure) and the outcomes. 
CONCLUSION: Although there were consistent small increases in gestational weight gain with increasing PFOA, PFOS, and PFNA serum concentrations in this cohort, the associations were imprecise. Additional investigation of the association of PFAS with GWG in other cohorts would be informative and could consider pre-pregnancy BMI as a potential modifier. |
Analysis of fecal sludges reveals common enteric pathogens in urban Maputo, Mozambique
Capone D , Berendes D , Cumming O , Knee J , Nalá R , Risk BB , Stauber C , Zhu K , Brown J . Environ Sci Technol Lett 2020 7 (12) 889-895 Sewage surveillance is increasingly used in public health applications; metabolites, biomarkers, and pathogens are detectable in wastewater and can provide useful information about community health. Work on this topic has been limited to wastewaters in mainly high-income settings, however. In low-income countries, where the burden of enteric infection is high, nonsewered sanitation predominates. In order to assess the utility of fecal sludge surveillance as a tool to identify the most prevalent enteric pathogens circulating among at-risk children, we collected 95 matched child stool and fecal sludge samples from household clusters sharing latrines in urban Maputo, Mozambique. We analyzed samples for 20 common enteric pathogens via multiplex real-time quantitative PCR. Among the 95 stools matched to fecal sludges, we detected the six most prevalent bacterial pathogens (enteroaggregative E. coli, Shigella/enteroinvasive E. coli, enterotoxigenic E. coli, enteropathogenic E. coli, Shiga toxin-producing E. coli, Salmonella) and all three protozoan pathogens (Giardia duodenalis, Cryptosporidium parvum, Entamoeba histolytica) in the same rank order in both matrices. We did not observe the same trend for viral pathogens or soil-transmitted helminths, however. Our results suggest that sampling fecal sludges from onsite sanitation offers potential for localized pathogen surveillance in low-income settings where enteric pathogen prevalence is high. |
Modelling airport catchment areas to anticipate the spread of infectious diseases across land and air travel
Huber C , Watts A , Grills A , Yong JHE , Morrison S , Bowden S , Tuite A , Nelson B , Cetron M , Khan K . Spat Spatiotemporal Epidemiol 2021 36 100380 Air travel is an increasingly important conduit for the worldwide spread of infectious diseases. However, methods to identify which airports an individual may use to initiate travel, or where an individual may travel to upon arrival at an airport, are not well studied. This knowledge gap can be addressed by estimating airport catchment areas: the geographic extent from which the airport derives most of its patronage. While airport catchment areas can provide a simple decision-support tool to help delineate the spatial extent of infectious disease spread at a local scale, observed data for airport catchment areas are rarely made publicly available. Therefore, we evaluated a probabilistic choice behavior model, the Huff model, as a potential methodology to estimate airport catchment areas in the United States in data-limited scenarios. We explored the impact of varying input parameters to the Huff model on estimated airport catchment areas: distance decay exponent, distance cut-off, and measures of airport attractiveness. We compared Huff model catchment area patterns for Miami International Airport (MIA) and Harrisburg International Airport (MDT). We specifically compared our model output to observed data sampled for MDT to align model parameters with an established, observed catchment area. Airport catchment areas derived using the Huff model were highly sensitive to changes in model parameters. We observed that a distance decay exponent of 2 and a distance cut-off of 500 km represented the most realistic spatial extent and heterogeneity of the MIA catchment area. When these parameters were applied to MDT, the Huff model produced similar spatial patterns to the observed MDT catchment area. 
Finally, our evaluation of airport attractiveness showed that travel volume to the specific international destinations of interest for infectious disease importation risks (e.g., Brazil) had little impact on the predicted choice of airport when compared to all international travel. Our work is a proof of concept for use of the Huff model to estimate airport catchment areas as a generalizable decision-support tool in data-limited scenarios. While our work represents an initial examination of the Huff model as a method to approximate airport catchment areas, an essential next step is to conduct a quantitative calibration and validation of the model based on multiple airports, possibly leveraging local human mobility data such as call detail records or online social network data collected from mobile devices. Ultimately, we demonstrate how the Huff model could be potentially helpful to improve the precision of early warning systems that anticipate infectious disease spread, or to incorporate when local public health decision makers need to identify where to mobilize screening infrastructure or containment strategies at a local level. |
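The probabilistic choice model described in this abstract follows the standard Huff formulation: the probability of choosing an airport is its attractiveness divided by distance raised to a decay exponent, normalized over all airports within the distance cut-off. A minimal sketch with illustrative attractiveness values and distances (not the study's actual inputs):

```python
from typing import Dict

def huff_probabilities(attractiveness: Dict[str, float],
                       distances_km: Dict[str, float],
                       beta: float = 2.0,
                       cutoff_km: float = 500.0) -> Dict[str, float]:
    """Probability that a traveler at a given origin chooses each airport.

    attractiveness: airport -> attractiveness measure (e.g., travel volume)
    distances_km:   airport -> distance from the origin location
    beta:           distance decay exponent (the study found 2 most realistic)
    cutoff_km:      airports beyond this distance are excluded (study: 500 km)
    """
    # Utility of each airport within the cut-off: attractiveness / distance^beta
    utilities = {
        airport: attractiveness[airport] / (distances_km[airport] ** beta)
        for airport in attractiveness
        if distances_km[airport] <= cutoff_km
    }
    total = sum(utilities.values())
    # Normalize utilities into choice probabilities
    return {airport: u / total for airport, u in utilities.items()}
```

With these parameters, a nearby high-volume airport dominates the choice probabilities, which is why the estimated catchments are so sensitive to the decay exponent and cut-off.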
Identification and evaluation of epidemic prediction and forecasting reporting guidelines: A systematic review and a call for action
Pollett S , Johansson M , Biggerstaff M , Morton LC , Bazaco SL , Brett Major DM , Stewart-Ibarra AM , Pavlin JA , Mate S , Sippy R , Hartman LJ , Reich NG , Maljkovic Berry I , Chretien JP , Althouse BM , Myer D , Viboud C , Rivers C . Epidemics 2020 33 100400 INTRODUCTION: High quality epidemic forecasting and prediction are critical to support response to local, regional and global infectious disease threats. Other fields of biomedical research use consensus reporting guidelines to ensure standardization and quality of research practice among researchers, and to provide a framework for end-users to interpret the validity of study results. The purpose of this study was to determine whether guidelines exist specifically for epidemic forecast and prediction publications. METHODS: We undertook a formal systematic review to identify and evaluate any published infectious disease epidemic forecasting and prediction reporting guidelines. This review leveraged a team of 18 investigators from US Government and academic sectors. RESULTS: A literature database search through May 26, 2019, identified 1467 publications (MEDLINE n = 584, EMBASE n = 883), and a grey-literature review identified a further 407 publications, yielding a total 1777 unique publications. A paired-reviewer system screened in 25 potentially eligible publications, of which two were ultimately deemed eligible. A qualitative review of these two published reporting guidelines indicated that neither was specific for epidemic forecasting and prediction, although they described reporting items that may be relevant to epidemic forecasting and prediction studies. CONCLUSIONS: This systematic review confirms that no specific guidelines have been published to standardize the reporting of epidemic forecasting and prediction studies. 
These findings underscore the need to develop such reporting guidelines in order to improve the transparency, quality and implementation of epidemic forecasting and prediction research in operational public health. |
Lessons learned from a decade of investigations of shiga toxin-producing Escherichia coli outbreaks linked to leafy greens, United States and Canada
Marshall KE , Hexemer A , Seelman SL , Fatica MK , Blessington T , Hajmeer M , Kisselburgh H , Atkinson R , Hill K , Sharma D , Needham M , Peralta V , Higa J , Blickenstaff K , Williams IT , Jhung MA , Wise M , Gieraltowski L . Emerg Infect Dis 2020 26 (10) 2319-2328 Shiga toxin-producing Escherichia coli (STEC) cause substantial and costly illnesses. Leafy greens are the second most common source of foodborne STEC O157 outbreaks. We examined STEC outbreaks linked to leafy greens during 2009-2018 in the United States and Canada. We identified 40 outbreaks, 1,212 illnesses, 77 cases of hemolytic uremic syndrome, and 8 deaths. More outbreaks were linked to romaine lettuce (54%) than to any other type of leafy green. More outbreaks occurred in the fall (45%) and spring (28%) than in other seasons. Barriers in epidemiologic and traceback investigations complicated identification of the ultimate outbreak source. Research on the seasonality of leafy green outbreaks, vulnerability to STEC contamination, and bacterial survival dynamics by leafy green type is warranted. Improvements in traceability of leafy greens are also needed. Federal and state health partners, researchers, the leafy green industry, and retailers can work together on interventions to reduce STEC contamination. |
Investigations of possible multistate outbreaks of Salmonella, Shiga toxin-producing Escherichia coli, and Listeria monocytogenes infections - United States, 2016
Marshall KE , Nguyen TA , Ablan M , Nichols MC , Robyn MP , Sundararaman P , Whitlock L , Wise ME , Jhung MA . MMWR Surveill Summ 2020 69 (6) 1-14 PROBLEM/CONDITION: Salmonella, Shiga toxin-producing Escherichia coli (STEC), and Listeria monocytogenes are the leading causes of multistate foodborne disease outbreaks in the United States. Responding to multistate outbreaks quickly and effectively and applying lessons learned about outbreak sources, modes of transmission, and risk factors for infection can prevent additional outbreak-associated illnesses and save lives. This report summarizes the investigations of multistate outbreaks and possible outbreaks of Salmonella, STEC, and L. monocytogenes infections coordinated by CDC during the 2016 reporting period. PERIOD COVERED: 2016. An investigation was considered to have occurred in 2016 if it began during 2016 and ended on or before March 31, 2017, or if it began before January 1, 2016, and ended during March 31, 2016-March 31, 2017. DESCRIPTION OF SYSTEM: CDC maintains a database of investigations of possible multistate foodborne and animal-contact outbreaks caused by Salmonella, STEC, and L. monocytogenes. Data were collected by local, state, and federal investigators during the detection, investigation and response, and control phases of the outbreak investigations. Additional data sources used for this report included PulseNet, the national molecular subtyping network based on isolates uploaded by local, state, and federal laboratories, and the Foodborne Disease Outbreak Surveillance System (FDOSS), which collects information from state, local, and territorial health departments and federal agencies about single-state and multistate foodborne disease outbreaks in the United States. Multistate outbreaks reported to FDOSS were linked using a unique outbreak identifier to obtain food category information when a confirmed or suspected food source was identified. 
Food categories were determined and assigned in FDOSS according to a classification scheme developed by CDC, the Food and Drug Administration (FDA), and the U.S. Department of Agriculture Food Safety and Inspection Service (FSIS) in the Interagency Food Safety Analytics Collaboration. A possible multistate outbreak was determined by expert judgment to be an outbreak if supporting data (e.g., temporal, geographic, demographic, dietary, travel, or food history) suggested a common source. A solved outbreak was an outbreak for which a specific kind of food or animal was implicated (i.e., confirmed or suspected) as the source. Outbreak-level variables included number of illnesses, hospitalizations, cases of hemolytic uremic syndrome (HUS), and deaths; the number of states with illnesses; date of isolation for the earliest and last cases; demographic data describing patients associated with a possible outbreak (e.g., age, sex, and state of residence); the types of data collected (i.e., epidemiologic, traceback, or laboratory); the outbreak source, mode of transmission, and exposure location; the name or brand of the source; whether the source was suspected or confirmed; whether a food was imported into the United States; the types of regulatory agencies involved; whether regulatory action was taken (and what type of action); whether an outbreak was publicly announced by CDC via website posting; beginning and end date of the investigation; and general comments about the investigation. The number of illnesses, hospitalizations, cases of HUS, and deaths were characterized by transmission mode, pathogen, outcome (i.e., unsolved, solved with suspected source, or solved with confirmed source), source, and food or animal category. RESULTS: During the 2016 reporting period, 230 possible multistate outbreaks were detected and 174 were investigated. A median of 24 possible outbreaks was under investigation per week, and investigations were open for a median of 37 days. 
Of these 174 possible outbreaks investigated, 56 were excluded from this analysis because they occurred in a single state, were linked to international travel, or were pseudo-outbreaks (e.g., a group of similar isolates resulting from laboratory media contamination rather than infection in patients). Of the remaining 118 possible multistate outbreaks, 50 were determined to be outbreaks and 39 were solved (18 with a confirmed food source, 10 with a suspected food source, 10 with a confirmed animal source, and one with a suspected animal source). Sprouts were the most commonly implicated food category in solved multistate foodborne outbreaks (five). Chicken was the source of the most foodborne outbreak-related illnesses (134). Three outbreaks involved novel food-pathogen pairs: flour and STEC, frozen vegetables and L. monocytogenes, and bagged salad and L. monocytogenes. Eleven outbreaks were attributed to contact with animals (10 attributed to contact with backyard poultry and one to small turtles). Thirteen of 18 multistate foodborne disease outbreaks with confirmed sources resulted in product action, including 10 outbreaks with recalls, two with market withdrawals, and one with an FSIS public health alert. Twenty outbreaks, including 11 foodborne and nine animal-contact outbreaks, were announced to the public by CDC via its website, Facebook, and Twitter. These announcements resulted in approximately 910,000 webpage views, 55,000 likes, 66,000 shares, and 5,800 retweets. INTERPRETATION: During the 2016 reporting period, investigations of possible multistate outbreaks occurred frequently, were resource intensive, and required a median of 37 days of investigation. Fewer than half (42%) of the 118 possible outbreaks investigated were determined to have sufficient data to meet the definition of a multistate outbreak. Moreover, of the 50 outbreaks with sufficient data, approximately three fourths were solved. 
PUBLIC HEALTH ACTION: Close collaboration among CDC, FDA, FSIS and state and local health and agriculture partners is central to successful outbreak investigations. Identification of novel outbreak sources and trends in sources provides insights into gaps in food safety and safe handling of animals, which helps focus prevention strategies. Summarizing investigations of possible multistate outbreaks can provide insights into the investigative process, improve future investigations, and help prevent illnesses. Although identifying and investigating possible multistate outbreaks require substantial resources and investment in public health infrastructure, they are important in determining outbreak sources and implementing prevention and control measures. |
Adolescent mental health, COVID-19, and the value of school-community partnerships.
Hertz MF , Barrios LC . Inj Prev 2020 27 (1) 85-86 Newly released 2019 Youth Risk Behavior Surveillance System data and the Centers for Disease Control and Prevention's (CDC) 2019 Youth Risk Behavior Survey Data Summary and Trends Report show that US adolescents continue to suffer from poor mental health and suicidality at alarming rates. These data alone would be cause for concern, but the COVID-19 pandemic has the potential to further erode adolescent mental health, particularly for those whose mental health was poor prior to the pandemic. Given the status of adolescent mental health prior to COVID-19 and the impact of COVID-19, health professionals and schools must partner together now to mitigate potentially deleterious health, mental health, and education impacts for children and adolescents. |
A cluster randomized trial of the impact of education through listening (a novel behavior change technique) on household water treatment with chlorine in Vihiga District, Kenya, 2010-2011
Stauber CE , Person B , Otieno R , Oremo J , Schilling K , Hayat MJ , Ayers T , Quick R . Am J Trop Med Hyg 2020 104 (1) 382-390 Despite multiple studies demonstrating the effectiveness of household water treatment with chlorine in disinfecting water and preventing diarrhea, social marketing of this intervention in low- and middle-income countries has resulted in only modest uptake. In a cluster randomized trial in Vihiga district, western Kenya, we compared uptake of household water treatment with chlorine among six villages served by community vendors trained in standard social marketing plus education through listening (ETL), an innovative behavior change method, and six villages served by community vendors trained in standard social marketing only. Water treatment uptake, water quality, and childhood diarrhea were measured over 6 months and compared between the two groups of villages. During the 6-month period, we found no association between ETL exposure and reported and confirmed household water treatment with chlorine. In both groups (ETL and comparison), reported use of water treatment was low and did not change during our 6-month follow-up. However, persons confirmed to have chlorinated water had improved bacteriologic water quality. Study findings suggest that ETL implementation was suboptimal, which, along with unexpected changes in the supply and price of chlorine, may have prevented an accurate assessment of the potential impact of ETL on water treatment behavior. Taken together, these observations exemplify the complexities of habits, practices, attitudes, and external factors that can create challenging conditions for implementing behavioral interventions. As a consequence, in this trial, ETL had no measurable impact on water treatment behavior. |
Cost-effectiveness of HPV vaccination for adults through age 45 years in the United States: Estimates from a simplified transmission model
Chesson HW , Meites E , Ekwueme DU , Saraiya M , Markowitz LE . Vaccine 2020 38 (50) 8032-8039 INTRODUCTION: The objective of this study was to assess incremental costs and benefits of a human papillomavirus (HPV) vaccination program expanded to include "mid-adults" (adults aged 27 through 45 years) in the United States. METHODS: We adapted a previously published, dynamic mathematical model of HPV transmission and HPV-associated disease to estimate the incremental costs and benefits of a 9-valent HPV vaccine (9vHPV) program for people aged 12 through 45 years compared to a 9vHPV program for females aged 12 through 26 years and males aged 12 through 21 years. RESULTS: A 9vHPV program for females aged 12 through 26 years and males aged 12 through 21 years was estimated to cost < $10,000 per quality-adjusted life-year (QALY) gained, compared to no vaccination. Expanding the 9vHPV program to include mid-adults was estimated to cost $587,600 per additional QALY gained when including adults through age 30 years, and $653,300 per additional QALY gained when including adults through age 45 years. Results were most sensitive to assumptions about HPV incidence among mid-adults, current and historical vaccination coverage, vaccine price, and the impact of HPV diseases on quality of life. CONCLUSIONS: Mid-adult vaccination is much less cost-effective than the comparison strategy of routine vaccination for all adolescents at ages 11 to 12 years and catch-up vaccination for women through age 26 years and men through age 21 years. |
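The cost-per-QALY figures in this abstract are incremental cost-effectiveness ratios (ICERs): the difference in total cost between two strategies divided by the difference in QALYs gained. A minimal sketch with purely illustrative numbers (not outputs of the study's transmission model):

```python
def icer(cost_expanded: float, qaly_expanded: float,
         cost_base: float, qaly_base: float) -> float:
    """Incremental cost-effectiveness ratio: extra dollars spent per
    additional QALY gained by the expanded strategy over the base strategy."""
    return (cost_expanded - cost_base) / (qaly_expanded - qaly_base)
```

For example, an expanded program that costs $600,000 more than the base program while gaining one additional QALY has an ICER of $600,000 per QALY gained, the order of magnitude reported here for mid-adult vaccination.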
The economic burden of opioid use disorder and fatal opioid overdose in the United States, 2017
Florence C , Luo F , Rice K . Drug Alcohol Depend 2020 218 108350 BACKGROUND: The United States (U.S.) is experiencing an ongoing opioid crisis. Economic burden estimates that describe the impact of the crisis are needed when considering federal and state resources devoted to addressing overdoses. In this study, we estimate the societal costs for opioid use disorder and fatal overdose from all opioids in 2017. METHODS: We estimated costs of fatal overdose from all opioids and opioid use disorder based on the incidence of overdose deaths and the prevalence of past-year opioid use disorder for 2017. Incidence of fatal opioid overdose was obtained from the National Vital Statistics System; prevalence of past-year opioid use disorder was estimated from the National Survey of Drug Use and Health. Costs were estimated for health care, criminal justice and lost productivity. Costs for the reduced quality of life for opioid use disorder and life lost due to fatal opioid overdose were valued using U.S. Department of Health and Human Services guidelines for valuing reductions in morbidity and mortality. RESULTS: Costs for opioid use disorder and fatal opioid overdose in 2017 were estimated to be $1.02 trillion. The majority of the economic burden is due to reduced quality of life from opioid use disorder and the value of life lost due to fatal opioid overdose. CONCLUSIONS: These estimates can assist decision makers in understanding the magnitude of opioid use disorder and fatal overdose. Knowing the magnitude and distribution of the economic burden can inform public policy, clinical practice, research, and prevention and response activities. |
Fecal Microbiota Transplantations: Where Are We, Where Are We Going, and What Is the Role of the Clinical Laboratory?
Ransom EM , Burnham CD , Jones L , Kraft CS , McDonald LC , Reinink AR , Young VB . Clin Chem 2020 66 (4) 512-517 Fecal microbiota transplantation (FMT) is a medical procedure by which intestinal microorganisms are transferred to a patient as a therapeutic. FMTs can use microbiota from donors or from an autologous supply; these are referred to as allo- and auto-FMTs, respectively. FMTs are most commonly used for medically refractory or recurrent infections of Clostridioides (formerly Clostridium) difficile. C. difficile infections (CDIs) usually develop after broad-spectrum antibiotic usage that disrupts the normal intestinal microbiota, creating a niche permissive for C. difficile to flourish and cause a toxin-mediated illness. CDIs are routinely treated with oral antimicrobial therapy; however, relapse, and even multiple relapses, can occur after antimicrobials are stopped. An adjunct treatment option is FMT. While FMTs have been used to treat CDIs for over a decade, questions remain regarding their effectiveness, safety, regulatory oversight, and best practices. We have asked 5 experts with different roles in the field (including infectious diseases, laboratory medicine, industry, and public health) to share their thoughts on this important topic. |
Mitochondrial Genome Sequences of the Emerging Fungal Pathogen Candida auris.
Misas E , Chow NA , Gómez OM , Muñoz JF , McEwen JG , Litvintseva AP , Clay OK . Front Microbiol 2020 11 560332 Candida auris is an emerging fungal pathogen capable of causing invasive infections in humans. Since its first appearance around 1996, it has been isolated in countries spanning five continents. C. auris is a yeast that has the potential to cause outbreaks in hospitals, can survive in adverse conditions, including dry surfaces and high temperatures, and has been frequently misidentified by traditional methods. Furthermore, strains have been identified that are resistant to two and even all three of the main classes of antifungals currently in use. Several nuclear genome assemblies of C. auris have been published representing different clades and continents, yet until recently, the mitochondrial genomes (mtDNA chromosomes) of this species and the closely related species of C. haemulonii, C. duobushaemulonii, and C. pseudohaemulonii had not been analyzed in depth. We used reads from PacBio and Illumina sequencing to obtain a de novo reference assembly of the mitochondrial genome of the C. auris clade I isolate B8441 from Pakistan. This assembly has a total size of 28.2 kb and contains 13 core protein-coding genes, 25 tRNAs and the 12S and 16S ribosomal subunits. We then performed a comparative analysis by aligning Illumina reads of 129 other isolates from South Asia, Japan, South Africa, and South America with the B8441 reference. The clades of the phylogenetic tree we obtained from the aligned mtDNA sequences were consistent with those derived from the nuclear genome. The mitochondrial genome revealed a generally low genetic variation within clades, although the South Asian clade displayed two sub-branches including strains from both Pakistan and India. 
In particular, the 86 isolates from Colombia and Venezuela had mtDNA sequences that were all identical at the base level, i.e., a single conserved haplotype or mitochondrial background that exhibited characteristic differences from the Pakistan reference isolate B8441, such as a unique 25-nt insert that may affect function. |
Updated Characterization of Outbreak Response Strategies for 2019-2029: Impacts of Using a Novel Type 2 Oral Poliovirus Vaccine Strain.
Kalkowska DA , Pallansch MA , Wilkinson A , Bandyopadhyay AS , Konopka-Anstadt JL , Burns CC , Oberste MS , Wassilak SGF , Badizadegan K , Thompson KM . Risk Anal 2020 41 (2) 329-348 Delays in achieving the global eradication of wild poliovirus transmission continue to postpone subsequent cessation of all oral poliovirus vaccine (OPV) use. Countries must stop OPV use to end all cases of poliomyelitis, including vaccine-associated paralytic polio (VAPP) and cases caused by vaccine-derived polioviruses (VDPVs). The Global Polio Eradication Initiative (GPEI) coordinated global cessation of all type 2 OPV (OPV2) use in routine immunization in 2016 but did not successfully end the transmission of type 2 VDPVs (VDPV2s), and consequently continues to use OPV2 for outbreak response activities. Using an updated global poliovirus transmission and OPV evolution model, we characterize outbreak response options for 2019-2029 related to responding to VDPV2 outbreaks with a genetically stabilized novel OPV (nOPV2) strain or with the currently licensed monovalent OPV2 (mOPV2). Given uncertainties about the properties of nOPV2, we model different assumptions that appear consistent with the evidence on nOPV2 to date. Using nOPV2 to respond to detected cases may reduce the expected VDPV and VAPP cases and the risk of needing to restart OPV2 use in routine immunization compared to mOPV2 use for outbreak response. The actual properties, availability, and use of nOPV2 will determine its effects on type 2 poliovirus transmission in populations. Even with optimal nOPV2 performance, countries and the GPEI would still likely need to restart OPV2 use in routine immunization in OPV-using countries if operational improvements in outbreak response to stop the transmission of cVDPV2s are not implemented effectively. |
Planning for a gonococcal vaccine: A narrative review of vaccine development and public health implications
Abara WE , Jerse AE , Hariri S , Kirkcaldy RD . Sex Transm Dis 2020 48 (7) 453-457 Declining gonococcal susceptibility to ceftriaxone and azithromycin has raised the possibility of untreatable gonorrhea in the future and re-ignited interest in gonococcal vaccine development. Despite decades of research, previous gonococcal vaccine candidates have been ineffective. A growing body of data suggest that meningococcal group B outer membrane vaccines (MenB OMV) may be cross-protective against Neisseria gonorrhoeae. Clinical trials of a licensed vaccine against N. meningitidis serogroup B containing an OMV component are underway to determine its efficacy against N. gonorrhoeae. Other experimental gonococcal vaccine candidates are in the preclinical phases. Population impact of future gonococcal vaccines with different levels of efficacy and duration of protection in various populations are being evaluated using modeling studies. Despite recent progress, gaps in gonococcal vaccine research remain. Research is needed to evaluate vaccine efficacy in preventing gonococcal infections acquired via various anatomic routes and among patients co-infected with other sexually transmitted infections. Studies that model the impact of a future vaccine on high-burden populations such as men who have sex with men and estimate both vaccine cost-effectiveness and the incremental cost-effectiveness ratio of vaccination to antimicrobial resistance and treatment costs are warranted. This narrative review examines the current state of gonococcal vaccine research, the possible impact of a gonococcal vaccine on gonorrhea rates based on modeling studies, gaps in the gonococcal vaccine literature, and public health implications of a future gonococcal vaccine on reducing the gonorrhea burden in the United States. |
Safety and immunogenicity of inactivated poliovirus vaccine schedules for the post-eradication era: a randomised open-label, multicentre, phase 3, non-inferiority trial
Bandyopadhyay AS , Gast C , Rivera L , Saez-Llorens X , Oberste MS , Weldon WC , Modlin J , Clemens R , Costa Clemens SA , Jimeno J , Ruttimann R . Lancet Infect Dis 2020 21 (4) 559-568 Background: Following the global eradication of wild poliovirus, countries using live attenuated oral poliovirus vaccines will transition to exclusive use of inactivated poliovirus vaccine (IPV) or fractional doses of IPV (f-IPV; a f-IPV dose is one-fifth of a normal IPV dose), but IPV supply and cost constraints will necessitate dose-sparing strategies. We compared immunisation schedules of f-IPV and IPV to inform the choice of optimal post-eradication schedule. Method(s): This randomised open-label, multicentre, phase 3, non-inferiority trial was done at two centres in Panama and one in the Dominican Republic. Eligible participants were healthy 6-week-old infants with no signs of febrile illness or known allergy to vaccine components. Infants were randomly assigned (1:1:1:1, 1:1:1:2, 2:1:1:1), using computer-generated blocks of four or five until the groups were full, to one of four groups and received: two doses of intradermal f-IPV (administered at 14 and 36 weeks; two f-IPV group); or three doses of intradermal f-IPV (administered at 10, 14, and 36 weeks; three f-IPV group); or two doses of intramuscular IPV (administered at 14 and 36 weeks; two IPV group); or three doses of intramuscular IPV (administered at 10, 14, and 36 weeks; three IPV group). The primary outcome was seroconversion rates based on neutralising antibodies for poliovirus type 1 and type 2 at baseline and at 40 weeks (4 weeks after the second or third vaccinations) in the per-protocol population to allow non-inferiority and eventually superiority comparisons between vaccines and regimens. 
Three co-primary outcomes concerning poliovirus types 1 and 2 were to determine if seroconversion rates at 40 weeks of age after a two-dose regimen (administered at weeks 14 and 36) of intradermally administered f-IPV were non-inferior to a corresponding two-dose regimen of intramuscular IPV; if seroconversion rates at 40 weeks of age after a two-dose IPV regimen (weeks 14 and 36) were non-inferior to those after a three-dose IPV regimen (weeks 10, 14, and 36); and if seroconversion rates after a two-dose f-IPV regimen (weeks 14 and 36) were non-inferior to those after a three-dose f-IPV regimen (weeks 10, 14, and 36). The non-inferiority boundary was set at -10% for the lower bound of the two-sided 95% CI for the seroconversion rate difference. Safety was assessed as serious adverse events and important medical events. This study is registered on ClinicalTrials.gov, NCT03239496. Finding(s): From Oct 23, 2017, to Nov 13, 2018, we enrolled 773 infants (372 [48%] girls) in Panama and the Dominican Republic (two f-IPV group n=217, three f-IPV group n=178, two IPV group n=178, and three IPV group n=200). 686 infants received all scheduled vaccine doses and were included in the per-protocol analysis. We observed non-inferiority for poliovirus type 1 seroconversion rate at 40 weeks for the two f-IPV dose schedule (95.9% [95% CI 92.0-98.2]) versus the two IPV dose schedule (98.7% [95.4-99.8]), and for the three f-IPV dose schedule (98.8% [95.6-99.8]) versus the three IPV dose schedule (100% [97.9-100]). Similarly, poliovirus type 2 seroconversion rate at 40 weeks for the two f-IPV dose schedule (97.9% [94.8-99.4]) versus the two IPV dose schedule (99.4% [96.4-100]), and for the three f-IPV dose schedule (100% [97.7-100]) versus the three IPV dose schedule (100% [97.9-100]) were non-inferior. 
Seroconversion rate for the two f-IPV regimen was statistically superior 4 weeks after the last vaccine dose in the 14 and 36 week schedule (95.9% [92.0-98.2]) compared with the 10 and 14 week schedule (83.2% [76.5-88.6]; p=0.0062) for poliovirus type 1. Statistical superiority of the 14 and 36 week schedule was also found for poliovirus type 2 (14 and 36 week schedule 97.9% [94.8-99.4] vs 10 and 14 week schedule 83.9% [77.2-89.2]; p=0.0062), and poliovirus type 3 (14 and 36 week schedule 84.5% [78.7-89.3] vs 10 and 14 week schedule 73.3% [65.8-79.9]; p=0.0062). For IPV, a two-dose regimen administered at 14 and 36 weeks (99.4% [96.4-100]) was superior to a 10 and 14 week schedule (88.9% [83.4-93.1]; p<0.0001) for poliovirus type 2, but not for type 1 (14 and 36 week schedule 98.7% [95.4-99.8] vs 10 and 14 week schedule 95.6% [91.4-98.1]), or type 3 (14 and 36 week schedule 97.4% [93.5-99.3] vs 10 and 14 week schedule 93.9% [89.3-96.9]). There were no related serious adverse events or important medical events reported in any group, showing that safety was unaffected by administration route or schedule. Interpretation(s): Our observations suggest that adequate immunity against poliovirus type 1 and type 2 is provided by two doses of either IPV or f-IPV at 14 and 36 weeks of age, and broad immunity is provided with three doses of f-IPV, enabling substantial savings in cost and supply. These novel clinical data will inform global polio immunisation policy for the post-eradication era. Funding(s): Bill & Melinda Gates Foundation. |
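The trial above declares non-inferiority when the lower bound of the two-sided 95% CI for the seroconversion-rate difference exceeds -10 percentage points. A minimal sketch of that check, using a Wald interval (the trial's exact CI method is not stated in the abstract, and the counts below are hypothetical, not the trial's data):

```python
from math import sqrt

def noninferior(x_test, n_test, x_ref, n_ref, margin=-0.10):
    """Wald 95% CI for the seroconversion-rate difference (test - reference).
    Non-inferiority is declared when the lower CI bound exceeds the margin
    (-10 percentage points in the trial described above)."""
    p_t, p_r = x_test / n_test, x_ref / n_ref
    diff = p_t - p_r
    se = sqrt(p_t * (1 - p_t) / n_test + p_r * (1 - p_r) / n_ref)
    lower, upper = diff - 1.96 * se, diff + 1.96 * se
    return diff, (lower, upper), lower > margin

# Hypothetical counts for illustration only:
diff, ci, ok = noninferior(95, 100, 98, 100)
```

With these illustrative counts the difference is -3 percentage points and the lower bound stays above -10%, so non-inferiority would be declared.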
Durability of humoral immune responses to rubella following MMR vaccination
Crooke SN , Riggenbach MM , Ovsyannikova IG , Warner ND , Chen MH , Hao L , Icenogle JP , Poland GA , Kennedy RB . Vaccine 2020 38 (51) 8185-8193 BACKGROUND: While administration of the measles-mumps-rubella (MMR-II®) vaccine has been effective at preventing rubella infection in the United States, the durability of humoral immunity to the rubella component of MMR vaccine has not been widely studied among older adolescents and adults. METHODS: In this longitudinal study, we sought to assess the durability of rubella virus (RV)-specific humoral immunity in a healthy population (n = 98) of adolescents and young adults at two timepoints: ~7 and ~17 years after two doses of MMR-II® vaccination. Levels of circulating antibodies specific to RV were measured by ELISA and an immune-colorimetric neutralization assay. RV-specific memory B cell responses were also measured by ELISpot. RESULTS: Rubella-specific IgG antibody titers, neutralizing antibody titers, and memory B cell responses declined with increasing time since vaccination; however, these decreases were relatively moderate. Memory B cell responses exhibited a greater decline in men compared to women. CONCLUSIONS: Collectively, rubella-specific humoral immunity declines following vaccination, although subjects' antibody titers remain well above the currently recognized threshold for protective immunity. Clinical correlates of protection based on neutralizing antibody titer and memory B cell ELISpot response should be defined. |
Immunogenicity of reduced-dose monovalent type 2 oral poliovirus vaccine in Mocuba, Mozambique
de Deus N , Capitine IPU , Bauhofer AFL , Marques S , Cassocera M , Chissaque A , Bero DM , Langa JP , Padama FM , Jeyaseelan V , Oberste MS , Estivariz CF , Verma H , Jani I , Mach O , Sutter RW . J Infect Dis 2020 226 (2) 292-298 BACKGROUND: The monovalent type 2 oral poliovirus vaccine (mOPV2) stockpile is low. One potential strategy to stretch the existing mOPV2 supply is to administer a reduced dose: one drop instead of two. METHODS: We conducted a randomized, controlled, open-label, non-inferiority trial (10% margin) to compare immunogenicity following administration of one versus two drops of mOPV2. We enrolled infants aged 9-22 months from Mocuba district of Mozambique. Poliovirus neutralizing antibodies were measured in sera collected before and one month after mOPV2 administration. Immune response was defined as seroconversion from seronegative (<1:8) at baseline to seropositive (>1:8) after vaccination, or boosting titers by >4-fold for those with titers between 1:8 and 1:362 at baseline. The trial was registered at anzctr.org.au (number ACTRN12619000184178p). RESULTS: We enrolled 378 children and 262 (69%) completed per-protocol requirements. Immune response to mOPV2 was 53.6% (95% confidence interval [CI]: 44.9%-62.1%) and 60.6% (95% CI: 52.2%-68.4%) in 1-drop and 2-drop recipients, respectively. The 10% non-inferiority margin was not met (difference = 7.0%; 95% CI: -5.0% to 19.0%). CONCLUSION: A small loss of immunogenicity with reduced-dose mOPV2 was observed. Although the non-inferiority target was not achieved, the Strategic Advisory Group of Experts on Immunization recommended the 1-drop strategy as a dose-sparing measure if mOPV2 supplies deteriorate further. |
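The immune-response definition in the Mozambique trial combines two criteria (seroconversion, or a 4-fold boost below an assay bound). A sketch of that classifier on reciprocal titers; the exact ≥ versus > cut-offs at each boundary are assumptions here, not confirmed by the abstract:

```python
def immune_response(pre, post, ceiling=362):
    """Classify a poliovirus immune response per the trial definition
    (titers given as reciprocals, e.g. 8 means 1:8)."""
    if pre < 8:                  # seronegative at baseline
        return post >= 8         # responds if seropositive after vaccination
    if pre <= ceiling:           # seropositive, below the 1:362 bound
        return post >= 4 * pre   # responds on a >=4-fold boost (assumed >=)
    return False                 # at/above the bound: boost not measurable
```

For example, a rise from 1:8 to 1:32 counts as a response, while 1:8 to 1:16 does not.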
Planning and implementing a targeted polio vaccination campaign for Somali mobile populations in Northeastern Kenya based on migration and settlement patterns
Harvey B , Dalal W , Amin F , McIntyre E , Ward S , Merrill RD , Mohamed A , Hsu CH . Ethn Health 2020 27 (4) 1-16 Supporting the global eradication of wild poliovirus (WPV), this project aimed to provide polio and measles vaccines to a population frequently missed by immunization services and campaigns: ethnic Somali children living among mobile populations within Kenya's Northeastern Region. Additionally, nutritional support, albendazole (for treatment of intestinal parasites) and vitamin A were provided to improve children's health and in accordance with regional vaccination campaign practices. To better understand movement patterns and healthcare-seeking behaviors within this population, we trained community-based data collectors in qualitative and geospatial data collection methods. Data collectors conducted focus group and participatory mapping discussions with ethnic Somalis living in the region. Qualitative and geospatial data indicated movement patterns that followed partially definable routes and temporary settlement patterns with an influx of ethnic Somali migrants into Kenya at the start of the long rainy season (April-June). Community members also reported concerns about receiving healthcare services in regional health facilities. Using these data, an 8-week vaccination campaign was planned and implemented: 2196 children aged 0-59 months received polio vaccine (9% had not previously received polio vaccine), 2524 children aged 9-59 months received measles vaccine (27% had not previously received measles vaccine), 113 were referred for the treatment of severe acute malnourishment, 150 were referred to a supplementary feeding program due to moderate acute malnourishment, 1636 children aged 12-59 months were provided albendazole and 2008 children aged 6-59 months were provided with vitamin A. 
This project serves as an example for how community-based data collectors and local knowledge can help adapt public health programming to the local context and could aid disease eradication in at-risk populations. |
Low influenza vaccine effectiveness against A(H3N2) associated hospitalizations in the 2016-2017 and 2017-2018 seasons of the Hospitalized Adult Influenza Vaccine Effectiveness Network (HAIVEN)
Martin ET , Cheng C , Petrie JG , Alyanak E , Gaglani M , Middleton DB , Ghamande S , Silveira FP , Murthy K , Zimmerman RK , Monto AS , Trabue C , Talbot HK , Ferdinands JM . J Infect Dis 2020 223 (12) 2062-2071 INTRODUCTION: The 2016-2017 and 2017-2018 influenza seasons were notable for a high number of hospitalizations for influenza A(H3N2) despite vaccine and circulating strain match. METHODS: We evaluated vaccine effectiveness (VE) against hospitalization in the test-negative HAIVEN study. Nasal-throat swabs were tested by RT-PCR for influenza, and VE was determined based on odds of vaccination by generalized estimating equations. Vaccine-specific antibody was measured in a subset of enrollees. RESULTS: A total of 6,129 adults were enrolled from ten hospitals. Adjusted VE against A(H3N2) was 22.8% (95% C.I. 8.3%, 35.0%), pooled across both years, and 49.4% (95% C.I. 34.3%, 61.1%) against B/Yamagata. In 2017-2018, the A(H3N2) VE point estimate for the cell-based vaccine was 43.0% (95% C.I. -36.3%, 76.1%; 56 vaccine recipients) compared to 24.0% (95% C.I. 3.9%, 39.9%) for egg-based vaccines. Among 643 with serology data, hemagglutinin antibodies against the egg-based A(H3N2) vaccine strain were increased in influenza-negative individuals. CONCLUSIONS: Low VE for the A/Hong Kong/4801/2014 vaccine virus in both A(H3N2) seasons emphasizes concerns for continued changes in H3N2 antigenic epitopes, including changes that may impact glycosylation and ultimately reduce VE. |
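In a test-negative design like HAIVEN, VE is derived from the odds ratio of vaccination among test-positive versus test-negative patients. The standard conversion, shown as a minimal sketch (the 0.772 input below is back-calculated from the reported 22.8% estimate, not taken from the paper):

```python
def vaccine_effectiveness(odds_ratio):
    """Test-negative design: VE (%) = (1 - OR) * 100, where OR is the
    adjusted odds of vaccination in test-positive vs test-negative patients."""
    return (1 - odds_ratio) * 100

# An adjusted OR of about 0.772 corresponds to the ~22.8% VE reported above;
# OR = 1 means no measurable protection.
```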
Estimates of inactivated influenza vaccine effectiveness among children in Senegal: results from two consecutive cluster-randomized controlled trials in 2010 and 2011
Niang MN , Sugimoto JD , Diallo A , Diarra B , Ortiz JR , Lewis KDC , Lafond KE , Halloran ME , Widdowson MA , Neuzil KM , Victor JC . Clin Infect Dis 2020 72 (12) e959-e969 BACKGROUND: We report results of Years 2 and 3 of consecutive cluster-randomized controlled trials of trivalent inactivated influenza vaccine (IIV3) in Senegal. METHODS: We cluster-randomized (1:1) 20 villages to annual vaccination with IIV3 or inactivated poliovirus vaccine (IPV) of age-eligible residents (6 months - 10 years). The primary outcome was total vaccine effectiveness against laboratory-confirmed influenza illness (LCI) among age-eligible children (modified-intention-to-treat population [mITT]). Secondary outcomes were indirect (herd protection) and population (overall community) vaccine effectiveness. RESULTS: We vaccinated 74% of 12,408 age-eligible children in Year 2 (June 2010-April 2011) and 74% of 11,988 age-eligible children in Year 3 (April 2011-December 2011) with study vaccines. Annual cumulative incidence of LCI was 4.7 (Year 2) and 4.2 (Year 3) per 100 mITT child vaccinees in IPV villages. In Year 2, IIV3 matched circulating influenza strains. The total effectiveness was 52.8% (95% CI: 32.3%-67.0%), and the population effectiveness was 36.0% (95% CI: 10.2%-54.4%) against LCI caused by any influenza strain. The indirect effectiveness against LCI by A/H3N2 was 56.4% (95% CI: 39.0%-68.9%). In Year 3, 74% of influenza detections were vaccine-mismatched to circulating B/Yamagata and 24% were vaccine-matched to circulating A/H3N2. The Year 3 total effectiveness against LCI was -14.5% (95% CI: -81.2%-27.6%). Vaccine effectiveness varied by type/subtype of influenza in both years. CONCLUSION: IIV3 was variably effective against influenza illness in Senegalese children, with total and indirect vaccine effectiveness present during the year when all circulating strains matched the IIV3 formulation. |
Evaluating the association of stillbirths after maternal vaccination in the Vaccine Safety Datalink
Panagiotakopoulos L , McCarthy NL , Tepper NK , Kharbanda EO , Lipkind HS , Vazquez-Benitez G , McClure DL , Greenberg V , Getahun D , Glanz JM , Naleway AL , Klein NP , Nelson JC , Weintraub ES . Obstet Gynecol 2020 136 (6) 1086-1094 OBJECTIVE: To evaluate whether the Centers for Disease Control and Prevention's Advisory Committee on Immunization Practices recommended influenza and tetanus toxoid, reduced diphtheria toxoid, and acellular pertussis (Tdap) vaccinations in pregnancy are associated with increased risk of stillbirth. METHODS: We performed a case-control study in the Vaccine Safety Datalink that was matched 1:4 on site, month, and year of last menstrual period, comparing the odds of vaccination in pregnancies that ended in stillbirth (defined as fetal loss at or after 20 weeks of gestation) compared with those that ended in live birth from January 1, 2012, to September 30, 2015. We included patients with singleton pregnancies that ended in stillbirth or live birth who had at least one prenatal care visit, pregnancy dating information, and continuous health plan enrollment for the duration of pregnancy. Medical records for all stillbirths were reviewed. We were statistically powered to detect an odds ratio (OR) of 1.37 when evaluating the association between influenza or Tdap vaccination and stillbirth. We also examined stillbirth rates in pregnant patients aged 14-49 years in the Vaccine Safety Datalink between 2007 and 2015. 
RESULTS: In our matched analysis of 795 confirmed stillbirths in the case group and 3,180 live births in the control group, there was no significant association between influenza vaccination during pregnancy and stillbirth (343/795 [43.1%] stillbirths in the case group vs 1,407/3,180 [44.3%] live births in the control group, OR 0.94, adjusted OR 0.95, 95% CI 0.79-1.14, P=.54) and no significant association between Tdap vaccination during pregnancy and stillbirth (184/795 [23.1%] stillbirths in the case group vs 746/3,180 [23.5%] live births in the control group, OR 0.97, aOR 0.96, 95% CI 0.76-1.28, P=.91). From 2007 to 2015, the stillbirth rate in the Vaccine Safety Datalink was 5.2 per 1,000 live births and stillbirths. CONCLUSION: No association was found between vaccination during pregnancy and the odds of stillbirth. These findings support the safety of ACIP recommendations for vaccination during pregnancy. |
Maternal anti-rotavirus IgG antibodies persist in the post-rotavirus vaccine era
Payne DC , McNeal M , Staat MA , Piasecki AM , Cline A , DeFranco E , Goveia MG , Parashar UD , Burke RM , Morrow AL . J Infect Dis 2020 224 (1) 133-136 To assess whether titers of anti-rotavirus IgG persist during the post-rotavirus vaccine era, the PREVAIL Cohort analyzed sera collected from Cincinnati-area mothers and young infants in 2017-18. Rotavirus-specific antibodies continue to be transferred from US mothers to offspring in the post-rotavirus vaccine era, despite dramatic decreases in childhood rotavirus gastroenteritis. |
Guillain-Barré syndrome after high-dose influenza vaccine administration in the United States, 2018-2019 season
Perez-Vilar S , Hu M , Weintraub E , Arya D , Lufkin B , Myers T , Woo EJ , Lo AC , Chu S , Swarr M , Liao J , Wernecke M , MaCurdy T , Kelman J , Anderson S , Duffy J , Forshee RA . J Infect Dis 2020 223 (3) 416-425 BACKGROUND: The Vaccine Safety Datalink (VSD) identified a statistical signal for an increased risk of Guillain-Barré syndrome (GBS) in days 1-42 after 2018-2019 high-dose influenza vaccine (IIV3-HD) administration. We evaluated the signal using Medicare. METHODS: We conducted early- and end-of-season claims-based self-controlled risk interval analyses among Medicare beneficiaries ages ≥65 years, using days 8-21 and 1-42 postvaccination as risk windows and days 43-84 as control window. The VSD conducted chart-confirmed analyses. RESULTS: Among 7 453 690 IIV3-HD vaccinations, we did not detect a statistically significant increased GBS risk for either the 8- to 21-day (odds ratio [OR], 1.85; 95% confidence interval [CI], 0.99-3.44) or 1- to 42-day (OR, 1.31; 95% CI, 0.78-2.18) risk windows. The findings from the end-of-season analyses were fully consistent with the early-season analyses for both the 8- to 21-day (OR, 1.64; 95% CI, 0.92-2.91) and 1- to 42-day (OR, 1.12; 95% CI, 0.70-1.79) risk windows. The VSD's chart-confirmed analysis, involving 646 996 IIV3-HD vaccinations, with 1 case each in the risk and control windows, yielded a relative risk of 1.00 (95% CI, 0.06-15.99). CONCLUSIONS: The Medicare analyses did not exclude an association between IIV3-HD and GBS, but it determined that, if such a risk existed, it was similar in magnitude to prior seasons. Chart-confirmed VSD results did not confirm an increased risk of GBS. |
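The self-controlled risk interval design above compares case counts in post-vaccination risk and control windows within the same vaccinees. A sketch of the crude (unadjusted) relative incidence; the published analyses additionally adjust for seasonality and other factors:

```python
def relative_incidence(cases_risk, cases_control, days_risk, days_control):
    """Crude relative incidence in a self-controlled risk interval design:
    the ratio of case counts in the risk vs control windows, scaled by
    window length so unequal windows compare fairly."""
    return (cases_risk / days_risk) / (cases_control / days_control)

# With 1 case in each 42-day window, as in the chart-confirmed VSD analysis
# above, the point estimate is 1.0.
```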
Intussusception after rotavirus vaccine introduction in India
Reddy SN , Nair NP , Tate JE , Thiyagarajan V , Giri S , Praharaj I , Mohan VR , Babji S , Gupte MD , Arora R , Bidari S , Senthamizh S , Mekala S , Goru KB , Reddy B , Pamu P , Gorthi RP , Badur M , Mohan V , Sathpathy S , Mohanty H , Dash M , Mohakud NK , Ray RK , Mohanty P , Gathwala G , Chawla S , Gupta M , Gupta R , Goyal S , Sharma P , Mathew MA , Jacob TJK , Sundaram B , Purushothaman GKC , Dorairaj P , Jagannatham M , Murugiah K , Boopathy H , Maniam R , Gurusamy R , Kumaravel S , Shenoy A , Jain H , Goswami JK , Wakhlu A , Gupta V , Vinayagamurthy G , Parashar UD , Kang G . N Engl J Med 2020 383 (20) 1932-1940 BACKGROUND: A three-dose, oral rotavirus vaccine (Rotavac) was introduced in the universal immunization program in India in 2016. A prelicensure trial involving 6799 infants was not large enough to detect a small increased risk of intussusception. Postmarketing surveillance data would be useful in assessing whether the risk of intussusception would be similar to the risk seen with different rotavirus vaccines used in other countries. METHODS: We conducted a multicenter, hospital-based, active surveillance study at 27 hospitals in India. Infants meeting the Brighton level 1 criteria of radiologic or surgical confirmation of intussusception were enrolled, and rotavirus vaccination was ascertained by means of vaccination records. The relative incidence (incidence during the risk window vs. all other times) of intussusception among infants 28 to 365 days of age within risk windows of 1 to 7 days, 8 to 21 days, and 1 to 21 days after vaccination was evaluated by means of a self-controlled case-series analysis. For a subgroup of patients, a matched case-control analysis was performed, with matching for age, sex, and location. RESULTS: From April 2016 through June 2019, a total of 970 infants with intussusception were enrolled, and 589 infants who were 28 to 365 days of age were included in the self-controlled case-series analysis. 
The relative incidence of intussusception after the first dose was 0.83 (95% confidence interval [CI], 0.00 to 3.00) in the 1-to-7-day risk window and 0.35 (95% CI, 0.00 to 1.09) in the 8-to-21-day risk window. Similar results were observed after the second dose (relative incidence, 0.86 [95% CI, 0.20 to 2.15] and 1.23 [95% CI, 0.60 to 2.10] in the respective risk windows) and after the third dose (relative incidence, 1.65 [95% CI, 0.82 to 2.64] and 1.08 [95% CI, 0.69 to 1.73], respectively). No increase in intussusception risk was found in the case-control analysis. CONCLUSIONS: The rotavirus vaccine produced in India that we evaluated was not associated with intussusception in Indian infants. (Funded by the Bill and Melinda Gates Foundation and others.). |
Assessing the cost-utility of preferentially administering Heplisav-B vaccine to certain populations
Rosenthal EM , Hall EW , Rosenberg ES , Harris A , Nelson NP , Schillie S . Vaccine 2020 38 (51) 8206-8215 Vaccination is the primary strategy to prevent hepatitis B virus (HBV) infection in the United States. Prior to 2017, most standard hepatitis B vaccine schedules required 3 doses over 6 months. Heplisav-B, approved in 2017, is administered in 2 doses over a 1-month period but has a higher per-dose cost ($115.75 per dose compared to $57.25 per Engerix-B dose, costs as of June 1, 2019). We aimed to assess the cost-utility of providing the two-dose Heplisav-B vaccine compared to a three-dose Engerix-B vaccine among adult populations currently recommended for vaccination against hepatitis B. We used a decision-tree model with microsimulation and a Markov disease progression process to assess the cost-utility separately for the following populations: adults with diabetes, obesity, chronic kidney disease, HIV; non-responders to previous hepatitis B vaccination; older adults; and persons who inject drugs (PWID). We modeled epidemiologic outcomes (incident HBV infections, sequelae and related deaths), costs (2019 USD) and benefits (quality-adjusted life years, QALYs) and compared them across strategies. Sensitivity analyses assessed the cost-utility at varying estimates of Heplisav-B efficacy. In the base case scenario for each population, vaccination with Heplisav-B resulted in fewer HBV infections (37.5-59.8% averted), sequelae, and HBV-related deaths (36.3-71.4% averted). Heplisav-B resulted in decreased costs and increased benefits compared to Engerix-B for all populations except non-responders. Incremental costs from the baseline strategy ranged from $4746.78 saved (PWID) to $14.15 added cost (non-responders). Incremental benefits per person ranged from 0.00005 QALYs (older adults) to 0.7 QALYs (PWID). 
For persons with HIV and PWID, Heplisav-B resulted in lower costs and increased benefits in all scenarios in which Heplisav-B series efficacy was at least 80%. Vaccination using Heplisav-B is a cost-saving strategy compared to Engerix-B for adults with diabetes, chronic kidney disease, obesity, and HIV; older adults; and PWID. |
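The cost-utility comparison above hinges on incremental costs and incremental QALYs between the two vaccine strategies. A sketch of that comparison logic (the function name and illustrative numbers are hypothetical, not the study's model):

```python
def incremental(cost_new, qaly_new, cost_old, qaly_old):
    """Compare a new strategy against a baseline: a negative incremental
    cost with positive incremental QALYs means the new strategy dominates
    (is cost-saving); otherwise report the ICER in dollars per QALY gained."""
    d_cost = cost_new - cost_old
    d_qaly = qaly_new - qaly_old
    if d_cost <= 0 and d_qaly > 0:
        return d_cost, d_qaly, "dominant"
    return d_cost, d_qaly, d_cost / d_qaly
```

"Heplisav-B resulted in decreased costs and increased benefits" for most populations corresponds to the dominant (cost-saving) branch, in which no ICER needs to be reported.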
Parental vaccine hesitancy and childhood influenza vaccination
Santibanez TA , Nguyen KH , Greby SM , Fisher A , Scanlon P , Bhatt A , Srivastav A , Singleton JA . Pediatrics 2020 146 (6) OBJECTIVES: To quantify the prevalence of parental vaccine hesitancy (VH) in the United States and examine the association of VH with sociodemographics and childhood influenza vaccination coverage. METHODS: A 6-question VH module was included in the 2018 and 2019 National Immunization Survey-Flu, a telephone survey of households with children age 6 months to 17 years. RESULTS: The percentage of children having a parent reporting they were "hesitant about childhood shots" was 25.8% in 2018 and 19.5% in 2019. The prevalence of concern about the number of vaccines a child gets at one time impacting the decision to get their child vaccinated was 22.8% in 2018 and 19.1% in 2019; the prevalence of concern about serious, long-term side effects impacting the parent's decision to get their child vaccinated was 27.3% in 2018 and 21.7% in 2019. Only small differences in VH by sociodemographic variables were found, except for an 11.9 percentage point higher prevalence of "hesitant about childhood shots" and 9.9 percentage point higher prevalence of concerns about serious, long-term side effects among parents of Black compared with white children. In both seasons studied, children of parents reporting they were "hesitant about childhood shots" had 26 percentage points lower influenza vaccination coverage compared with children of parents not reporting hesitancy. CONCLUSIONS: One in 5 children in the United States have a parent who is vaccine hesitant, and hesitancy is negatively associated with childhood influenza vaccination. Monitoring VH could help inform immunization programs as they develop and target methods to increase vaccine confidence and vaccination coverage. |
Estimated influenza illnesses and hospitalizations averted by influenza vaccination among children aged 6-59 months in Suzhou, China, 2011/12 to 2015/16 influenza seasons
Zhang W , Gao J , Chen L , Tian J , Biggerstaff M , Zhou S , Situ S , Wang Y , Zhang J , Millman AJ , Greene CM , Zhang T , Zhao G . Vaccine 2020 38 (51) 8200-8205 BACKGROUND: There are few estimates of vaccination-averted influenza-associated illnesses in China. METHODS: We used a mathematical model and Monte Carlo algorithm to estimate numbers and 95% confidence intervals (CI) of influenza-associated outcomes (hospitalization, illness, and medically-attended (MA) illness) averted by vaccination among children aged 6-59 months in Suzhou from October 2011-September 2016. Influenza illnesses included non-hospitalized MA influenza illnesses and non-MA influenza illnesses. The numbers of influenza-associated outcomes averted by vaccination were the difference between the expected burden if there were no vaccination given and the observed burden with vaccination. The model incorporated the disease burden estimated based on surveillance data from Suzhou University Affiliated Children's Hospital (SCH) and data from health utilization surveys conducted in the catchment area of SCH, age-specific estimates of influenza vaccination coverage in Suzhou from the Expanded Program on Immunization database, and influenza vaccine effectiveness estimates from previous publications. Averted influenza estimations were presented as absolute numbers and in terms of the prevented fraction (PF). A hypothetical scenario with 50% coverage (but identical vaccine effectiveness) over the study period was also modeled. RESULTS: In ~250,000 children, influenza vaccination prevented an estimated 731 (CI: 549-960) influenza hospitalizations (PF: 6.2% of expected, CI: 5.8-6.6%) and 10,024 (7593-12,937) influenza illnesses (PF: 6.5%, 6.4-6.7%), of which 8342 (6338-10,768) were MA (PF: 6.6%, 6.4-6.7%) from 2011 to 2016. The PFs declined each year along with decreasing influenza vaccination coverage. 
If 50% of the study population had been vaccinated over time, the estimated numbers of averted cases during the study period would have been 4059 (3120-5762) influenza hospitalizations (PF: 27.2%, 26.4-27.9%) and 56,215 (42,925-78,849) influenza illnesses (PF: 28.5%, 28.3-28.7%), of which 46,596 (35,662-65,234) would have been MA (PF: 28.5%, 28.3-28.7%). CONCLUSION: Influenza vaccination is estimated to have averted influenza-associated illness outcomes even with low coverage in children aged 6-59 months in Suzhou. Increasing influenza vaccination coverage in this population could further reduce illnesses and hospitalizations. |
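The core accounting the model describes — averted cases as the difference between the burden expected with no vaccination and the observed burden — can be sketched in a few lines. This is a minimal static illustration, not the authors' actual model (which was age-stratified and propagated uncertainty with a Monte Carlo algorithm); the coverage and vaccine-effectiveness numbers below are hypothetical, not the study's inputs.

```python
def averted_burden(observed, coverage, effectiveness):
    """Estimate influenza cases averted by vaccination (static model).

    The observed burden occurred while a fraction coverage * effectiveness
    of the population was effectively protected, so the burden expected
    in the absence of vaccination is the observed burden inflated by
    that protected fraction.
    """
    expected = observed / (1.0 - coverage * effectiveness)
    averted = expected - observed
    # In this static sketch the prevented fraction reduces to
    # coverage * effectiveness.
    prevented_fraction = averted / expected
    return averted, prevented_fraction

# Hypothetical inputs: 10,000 observed illnesses, 12% coverage, 55% VE
averted, pf = averted_burden(10_000, 0.12, 0.55)
```

A full implementation would compute this per season and age group and resample the surveillance-based burden, coverage, and effectiveness inputs to obtain the confidence intervals reported above.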
Advancing environmental health practice through environmental health informatics activities
Coleman EW , Jagne AF . J Environ Health 2020 83 (4) 30-31 Environmental health (EH) programs collect data (e.g., inspection results) that might not be routinely analyzed for trends or used to inform timely public health decision-making. State, tribal, local, and territorial health departments and EH programs, however, can lack the resources, time, or experience to collect, analyze, and visualize EH data. Leveraging the use of informatics by standardizing data collection, sharing, and utilization can support innovative approaches to improving EH practice. | | The Centers for Disease Control and Prevention (CDC) Water, Food, and Environmental Health Services Branch supports the work of EH informatics through collaborative activities that | | promote timely public data and information sharing to detect and address existing or potential exposures to EH hazards, | support best practices for the innovative use of existing data and electronic information to design interventions to protect public health, and | identify environmental and health outcome indicators to assess the need for and impact of EH services. | This month's column explores designing an open data standard to improve health and safety in aquatic facilities, as well as leveraging informatics to improve environmental health practice and innovation. |
Population health informatics can advance interoperability: National Program of Cancer Registries Electronic Pathology Reporting Project
Pollack LA , Jones SF , Blumenthal W , Alimi TO , Jones DE , Rogers JD , Benard VB , Richardson LC . JCO Clin Cancer Inform 2020 4 985-992 PURPOSE: Given the reach, breadth, and volume of data collected from multiple clinical settings and systems, US central cancer registries (CCRs) are uniquely positioned to test and advance cancer health information exchange. This article describes a current Centers for Disease Control and Prevention (CDC) National Program of Cancer Registries (NPCR) cancer informatics data exchange initiative. METHODS: CDC is using an established cloud-based platform developed by the Association of Public Health Laboratories (APHL) for national notifiable disease reporting to enable direct transmission of standardized electronic pathology (ePath) data from laboratories to CCRs in multiple states. RESULTS: The APHL Informatics Messaging Services (AIMS) Platform provides an infrastructure to enable a large national laboratory to submit data to a single platform. State health departments receive data from the AIMS Platform through a secure portal, eliminating separate data exchange routes with each CCR. CONCLUSION: Key factors enabling ePath data exchange from laboratories to CCRs are having established cancer registry data standards and using a single platform/portal to reduce data streams. NPCR plans to expand this approach in alignment with ongoing cancer informatics efforts in clinical settings. The 50 CCRs supported by NPCR provide a variety of scenarios to develop and disseminate cancer data informatics initiatives and have tremendous potential to increase the implementation of cancer data exchange. |
Taking Action to Prevent Violence Against Adolescents in the Time of COVID-19.
Chiang L , Howard A , Butchart A . J Adolesc Health 2020 68 (1) 11-12 Adolescents are often considered the forgotten demographic in public health and social policy [1]. They may be particularly vulnerable to certain types of violence owing to simultaneous risks of violence from caregivers and intimate partners and their unique physiology, particularly the rapid brain development that is a hallmark of adolescence [1]. Chronic exposure to the toxic stress of violence during youth can have severe consequences across the lifespan, such as poor mental health, sexual and reproductive health problems, and chronic disease [2]. While the global burden of violence in youth is high, impacting 1 billion children and adolescents each year [3], countries are beginning to prioritize prevention, and many government officials acknowledge the need for scale-up. Of 155 countries assessed, 56% reported some national support for implementing evidence-based violence prevention and response for children and adolescents, but just 25% considered their support adequate to reach all in need [4]. |
Texting while driving: A discrete choice experiment
Foreman AM , Friedel JE , Hayashi Y , Wirth O . Accid Anal Prev 2020 149 105823 Texting while driving is one of the most dangerous types of distracted driving and contributes to a large number of transportation incidents and fatalities each year. Drivers text while driving despite being aware of the risks. Although some factors related to the decision to text while driving have been elucidated, more remains to be investigated in order to better predict and prevent texting while driving. To study decision making involved in reading a text message while driving, we conducted a discrete choice experiment with 345 adult participants recruited from Amazon's Mechanical Turk. Participants were presented with multiple choice sets, each involving two different scenarios, and asked to choose the scenario in which they would be more likely to text while driving. The attributes of the scenarios were the relationship to the text-message sender, the road conditions, and the importance of the message. The attributes varied systematically across the choice sets. Participants were more likely to read a text message while driving if the sender of the message was a significant other, the message was perceived to be very important, and the participant was driving on rural roads. Discrete choice experiments offer a promising approach to studying decision making in drivers and other populations because they allow for an analysis of multiple factors simultaneously and the trade-offs among different choices. |
Introduction to a special section on the effects of the Dating Matters model on secondary outcomes: Results from a comparative effectiveness cluster randomized controlled trial
Niolon PH . Prev Sci 2020 22 (2) 145-149 Teen dating violence (TDV) affects millions of young people in the USA each year (Basile et al. 2020) and is associated with a myriad of negative consequences across the lifespan, including placing individuals at greater risk for experiencing intimate partner violence (IPV) in their more permanent relationships in adulthood (Exner-Cortens et al., Pediatrics 131(1):71-78, 2013; Exner-Cortens et al., Journal of Adolescent Health 60(2):176-183, 2017). The CDC developed the Dating Matters®: Strategies to Promote Healthy Teen Relationships comprehensive prevention model to prevent TDV and its consequences among young people, and it was found to be effective at reducing TDV perpetration and victimization compared with another evidence-based program (Niolon et al., American Journal of Preventive Medicine 57(1):13-23, 2019). Dating Matters addresses multiple risk and protective factors for TDV through its multiple components, many of which are shared risk and protective factors for other forms of violence and risk behaviors among adolescents. This article introduces this special section, which includes three papers examining these secondary outcomes of the Dating Matters comparative effectiveness, multi-site, longitudinal cluster randomized controlled trial and concludes with an invited commentary by Debnam and Temple (in press). This introduction briefly discusses the Dating Matters comprehensive prevention model, the comparative effectiveness trial used to evaluate its effectiveness, the outcomes examined by the three papers included in this special section, and the commentary from external reviewers. This special section makes an important contribution to the field of violence prevention, highlighting a preventive intervention for TDV that addresses a constellation of risk and protective factors and demonstrating its effects on multiple adolescent risk and violence outcomes. |
Suffering whether you tell or don't tell: Perceived re-victimization as a barrier to disclosing child sexual abuse in Zimbabwe
Obong'o CO , Patel SN , Cain M , Kasese C , Mupambireyi Z , Bangani Z , Pichon LC , Miller KS . J Child Sex Abus 2020 29 (8) 1-21 Disclosing child sexual abuse (CSA) is a necessary first step to access the legal, health, and psycho-social services that survivors and their families need. However, disclosure rates are low: of young women who experienced CSA in Zimbabwe, only 9% disclosed the first incident. The purpose of this qualitative study was to explore and describe perceived barriers to disclosing CSA in Zimbabwe. We conducted focus group discussions with children aged 10-14 years (n = 40) and their parents/caregivers aged 20-62 years (n = 40), participating in an intervention trial in Chitungwiza, Zimbabwe. We found that potential retaliation against survivors and their families is a major barrier to disclosing CSA. These retaliatory acts, which we refer to as "re-victimization," arise from stigma or the victim feeling blamed or doubted and manifest through physical violence, emotional violence, and deprivation of family life and education. Our findings suggest that addressing social and cultural norms related to sex and strengthening legal protection for CSA survivors and their families could encourage CSA disclosure and could help end this violence. Our findings also highlight a need to increase children's awareness of their rights and to create safe systems for disclosure of sexual abuse. |
Atypical Mutation in Neisseria gonorrhoeae 23S rRNA Associated with High-Level Azithromycin Resistance.
Pham CD , Nash E , Liu H , Schmerer MW , Sharpe S , Woods G , Roland B , Schlanger K , St Cyr SB , Carlson J , Sellers K , Olsen A , Sanon R , Hardin H , Soge OO , Raphael BH , Kersh EN . Antimicrob Agents Chemother 2020 65 (2) The A2059G mutation in the 23S rRNA gene is the only reported mechanism conferring high-level azithromycin resistance (HL-AZMR) in Neisseria gonorrhoeae. Through U.S. gonococcal antimicrobial resistance surveillance projects, we identified four HL-AZMR gonococcal isolates lacking this mutational genotype. Genetic analysis revealed an A2058G mutation of 23S rRNA alleles in all four isolates. In vitro-selected gonococcal strains with homozygous A2058G recapitulated the HL-AZMR phenotype. Taken together, we postulate that the A2058G mutation confers HL-AZMR in N. gonorrhoeae. |
Identifying septic pollution exposure routes during a waterborne norovirus outbreak - A new application for human-associated microbial source tracking qPCR.
Mattioli MC , Benedict KM , Murphy J , Kahler A , Kline KE , Longenberger A , Mitchell PK , Watkins S , Berger P , Shanks OC , Barrett CE , Barclay L , Hall AJ , Hill V , Weltman A . J Microbiol Methods 2020 180 106091 In June 2017, the Pennsylvania Department of Health (PADOH) was notified of multiple norovirus outbreaks associated with 179 ill individuals who attended separate events held at an outdoor venue and campground over a one-month period. Epidemiologic investigations were unable to identify a single exposure route and therefore unable to determine whether there was a persistent contamination source to target for exposure mitigation. Norovirus was detected in a designated freshwater recreational swimming area and a drinking water well. A hydrogeological site evaluation suggested a nearby septic leach field as a potential contamination source via ground water infiltration. Geological characterization revealed a steep dip of the bedrock beneath the septic leach field toward the well, providing a viral transport pathway in a geologic medium not previously documented as high risk for viral ground water contamination. The human-associated microbial source tracking (MST) genetic marker, HF183, was used as a microbial tracer to demonstrate the hydrogeological connection between the malfunctioning septic system, drinking water well, and recreational water area. Based on environmental investigation findings, venue management and local public health officials implemented a series of outbreak prevention strategies, including discontinuing the use of the contaminated well, issuing a permit for a new drinking water well, increasing portable toilet and handwashing station availability, and promoting proper hand hygiene. Despite the outbreaks at the venue and evidence of ground water contamination impacting nearby recreational water and the drinking water well, no new norovirus cases were reported during a large event one week after implementing prevention practices. 
This investigation highlights a new application for human-associated MST methods to trace hydrological connections between multiple fecal pollutant exposure routes in an outbreak scenario. In turn, pollutant source information can be used to develop effective intervention practices to mitigate exposure and prevent future outbreaks associated with waters contaminated by human feces. |
Identification of Candida auris and related species by multiplex PCR based on unique GPI protein-encoding genes.
Alvarado M , Bartolomé Álvarez J , Lockhart SR , Valentín E , Ruiz-Gaitán AC , Eraso E , Groot PWJ . Mycoses 2020 64 (2) 194-202 BACKGROUND: The pathogen Candida auris is rapidly gaining clinical importance because of its resistance to antifungal treatments and its persistence in hospital environments. Early and accurate diagnosis of C. auris infections is crucial; however, the fungus has often been misidentified by commercial systems. OBJECTIVES: To develop conventional and real-time PCR methods for accurate and rapid identification of C. auris and its discrimination from closely related species by exploiting the uniqueness of certain glycosylphosphatidylinositol (GPI)-modified protein-encoding genes. METHODS: Species-specific primers for two unique putative GPI protein-encoding genes per species were designed for C. auris, C. haemulonii, C. pseudohaemulonii, C. duobushaemulonii, C. lusitaniae, and C. albicans. Primers were blind-tested for their specificity and efficiency in conventional and real-time multiplex PCR set-ups. RESULTS: All primer combinations showed excellent species specificity. In multiplex mode, correct identification was aided by different-sized amplicons for each species. Efficiency of the C. auris primers was validated using a panel of 155 C. auris isolates, including all known genetically diverse clades. In real-time multiplex PCR, different melting points of the amplicons allowed the distinction of C. auris from four related species. The C. auris limit of detection was 5 CFU/reaction with a threshold value of 32. The method was also able to detect C. auris in spiked blood and serum. CONCLUSIONS: PCR identification based on unique GPI protein-encoding genes allows for accurate and rapid species identification of C. auris and related species without the need for expensive equipment when applied in a conventional PCR set-up. |
Complete and Circularized Genome Assemblies of the Kroppenstedtia eburnea Genus Type Strain and the Kroppenstedtia pulmonis Species Type Strain with MiSeq and MinION Sequence Data.
Gulvik CA , Batra D , Rowe LA , Sheth M , Humrighouse BW , Howard DT , Lee J , McQuiston JR , Lasker BA . Microbiol Resour Announc 2020 9 (44) Kroppenstedtia eburnea DSM 45196(T) and Kroppenstedtia pulmonis W9323(T) are aerobic, Gram-positive, filamentous, chemoorganotrophic thermoactinomycetes. Here, we report on the complete and circular genome assemblies generated using Illumina MiSeq and Oxford Nanopore Technologies MinION reads. Putative gene clusters predicted to be involved in the production of secondary metabolites were also identified. |
Portable Rabies Virus Sequencing in Canine Rabies Endemic Countries Using the Oxford Nanopore MinION.
Gigante CM , Yale G , Condori RE , Costa NC , Long NV , Minh PQ , Chuong VD , Tho ND , Thanh NT , Thin NX , Hanh NTH , Wambura G , Ade F , Mito O , Chuchu V , Muturi M , Mwatondo A , Hampson K , Thumbi SM , Thomae BG , de Paz VH , Meneses S , Munyua P , Moran D , Cadena L , Gibson A , Wallace RM , Pieracci EG , Li Y . Viruses 2020 12 (11) As countries with endemic canine rabies progress towards elimination by 2030, it will become necessary to employ techniques to help plan, monitor, and confirm canine rabies elimination. Sequencing can provide critical information to inform control and vaccination strategies by identifying genetically distinct virus variants that may have different host reservoir species or geographic distributions. However, many rabies testing laboratories lack the resources or expertise for sequencing, especially in remote or rural areas where human rabies deaths are highest. We developed a low-cost, high throughput rabies virus sequencing method using the Oxford Nanopore MinION portable sequencer. A total of 259 sequences were generated from diverse rabies virus isolates in public health laboratories lacking rabies virus sequencing capacity in Guatemala, India, Kenya, and Vietnam. Phylogenetic analysis provided valuable insight into rabies virus diversity and distribution in these countries and identified a new rabies virus lineage in Kenya, the first published canine rabies virus sequence from Guatemala, evidence of rabies spread across an international border in Vietnam, and importation of a rabid dog into a state working to become rabies-free in India. Taken together, our evaluation highlights the MinION's potential for low-cost, high volume sequencing of pathogens in locations with limited resources. |
Differential responses of murine alveolar macrophages to elongate mineral particles of asbestiform and non-asbestiform varieties: Cytotoxicity, cytokine secretion and transcriptional changes.
Khaliullin TO , Kisin ER , Guppi S , Yanamala N , Zhernovkov V , Shvedova AA . Toxicol Appl Pharmacol 2020 409 115302 Human exposures to asbestiform elongate mineral particles (EMP) may lead to diffuse fibrosis, lung cancer, malignant mesothelioma, and autoimmune diseases. Cleavage fragments (CF) are chemically identical to asbestiform varieties (or habits) of the parent mineral, but no consensus exists on whether to treat them as asbestos from toxicological and regulatory standpoints. Alveolar macrophages (AM) are the first responders to inhaled particulates, participating in clearance and activating other resident and recruited immunocompetent cells, impacting the long-term outcomes. In this study we address how EMP of asbestiform versus non-asbestiform habit affect AM responses. Max Planck Institute (MPI) cells, a non-transformed mouse line that has an AM phenotype and genotype, were treated with mass-, surface area- (s.a.), and particle number- (p.n.) equivalent concentrations of respirable asbestiform and non-asbestiform riebeckite/tremolite EMP for 24 h. Cytotoxicity, cytokine secretion, and transcriptional changes were evaluated. At equal mass, asbestiform EMP were more cytotoxic; however, EMP of both habits induced similar LDH leakage and decreases in viability at s.a.- and p.n.-equivalent doses. DNA damage assessment and cell cycle analysis revealed differences in the modes of cell death between asbestos and the respective CF. There was an increase in chemokines, but not pro-inflammatory cytokines, after all EMP treatments. Principal component analysis of the cytokine secretion showed close clustering for the s.a.- and p.n.-equivalent treatments. There were mineral- and habit-specific patterns of gene expression dysregulation at s.a.-equivalent doses. Our study reveals the critical nature of EMP morphometric parameters for exposure assessment and dosing approaches used in toxicity studies. |
Characterization of Reference Materials for Spinal Muscular Atrophy Genetic Testing: A GeT-RM Collaborative Project.
Prior TW , Bayrak-Toydemir P , Lynnes TC , Mao R , Metcalf JD , Muralidharan K , Iwata-Otsubo A , Pham HT , Pratt VM , Qureshi S , Requesens D , Shen J , Vetrini F , Kalman L . J Mol Diagn 2020 23 (1) 103-110 Spinal muscular atrophy (SMA) is an autosomal recessive disorder predominately caused by bi-allelic loss of the SMN1 gene. Increased copies of SMN2, a low-functioning, nearly identical paralog, are associated with a less severe phenotype. SMA was recently recommended for inclusion in newborn screening. Clinical laboratories must accurately measure SMN1 and SMN2 copy number to identify SMA patients and carriers and to identify individuals likely to benefit from therapeutic interventions. Having publicly available and appropriately characterized reference materials with various combinations of SMN1 and SMN2 copy number variants is critical to assure accurate SMA clinical testing. To address this need, the Centers for Disease Control and Prevention-based Genetic Testing Reference Material Coordination Program (GeT-RM), in collaboration with members of the genetic testing community and the Coriell Institute for Medical Research, has characterized 15 SMA reference materials derived from publicly available cell lines. DNA samples were distributed to four volunteer testing laboratories for genotyping using three different methods. The characterized samples had 0-4 copies of SMN1 and 0-5 copies of SMN2. The samples also contained clinically important allele combinations (e.g., 0 copies of SMN1, 3 copies of SMN2), and several had markers indicative of an SMA carrier. These and other reference materials characterized by the GeT-RM will support the quality of clinical laboratory testing and are available from the Coriell Institute. |
Do I Have HIV or Not? Lack of RNA Detection and the Case for Sensitive DNA Testing.
Springer SA , Masciotra S , Johnson JA , Campbell S . Open Forum Infect Dis 2020 7 (11) ofaa478 We present a case of a 20-year-old male who had ambiguous HIV test results after entering new provider care and whose status was later complicated by undetectable viral RNA off antiretroviral therapy (ART). Verifying HIV infection status may occasionally require sensitive DNA testing that might need to be considered in diagnostic guidelines to resolve diagnosis and ensure appropriate ART management. |
The Role of the Gut Microbiome in Resisting Norovirus Infection as Revealed by a Human Challenge Study.
Patin NV , Peña-Gonzalez A , Hatt JK , Moe C , Kirby A , Konstantinidis KT . mBio 2020 11 (6) Norovirus infections take a heavy toll on worldwide public health. While progress has been made toward understanding host responses to infection, the role of the gut microbiome in determining infection outcome is unknown. Moreover, data are lacking on the nature and duration of the microbiome response to norovirus infection, which has important implications for diagnostics and host recovery. Here, we characterized the gut microbiomes of subjects enrolled in a norovirus challenge study. We analyzed microbiome features of asymptomatic and symptomatic individuals at the genome (population) and gene levels and assessed their response over time in symptomatic individuals. We show that the preinfection microbiomes of subjects with asymptomatic infections were enriched in Bacteroidetes and depleted in Clostridia relative to the microbiomes of symptomatic subjects. These compositional differences were accompanied by differences in genes involved in the metabolism of glycans and sphingolipids that may aid in host resilience to infection. We further show that microbiomes shifted in composition following infection and that recovery times were variable among human hosts. In particular, Firmicutes increased immediately following the challenge, while Bacteroidetes and Proteobacteria decreased over the same time. Genes enriched in the microbiomes of symptomatic subjects, including the adenylyltransferase glgC, were linked to glycan metabolism and cell-cell signaling, suggesting as-yet unknown roles for these processes in determining infection outcome. These results provide important context for understanding the gut microbiome's role in host susceptibility to symptomatic norovirus infection and long-term health outcomes. IMPORTANCE: The role of the human gut microbiome in determining whether an individual infected with norovirus will be symptomatic is poorly understood. 
This study provides important data on microbes that distinguish asymptomatic from symptomatic microbiomes and links these features to infection responses in a human challenge study. The results have implications for understanding resistance to and treatment of norovirus infections. |
Functional Characterization of Circulating Mumps Viruses with Stop Codon Mutations in the Small Hydrophobic Protein.
Stinnett RC , Beck AS , Lopareva EN , McNall RJ , Latner DR , Hickman CJ , Rota PA , Bankamp B . mSphere 2020 5 (6) Between 2015 and 2017, routine molecular surveillance in the United States detected multiple mumps viruses (MuVs) with mutations in the small hydrophobic (SH) gene compared to a reference virus of the same genotype. These mutations include an unusual pattern of uracil-to-cytosine hypermutations and other mutations resulting in the generation of premature stop codons or disruption of the canonical stop codon. The mumps virus SH protein may serve as a virulence factor, based on evidence that it inhibits apoptosis and innate immune signaling in vitro and that recombinant viruses that do not express the SH protein are attenuated in an animal model. In this study, mumps viruses bearing variant SH sequences were isolated from contemporary outbreak samples to evaluate the impact of the observed mutations on SH protein function. All isolates with variant SH sequences replicated in interferon-competent cells with no evidence of attenuation. Furthermore, all SH-variant viruses retained the ability to abrogate induction of NF-κB-mediated innate immune signaling in infected cells. Ectopic expression of variant mumps SH genes is consistent with findings from infection experiments, indicating that the observed abrogation of signaling was not mediated by other viral factors that may modulate innate immune signaling. Molecular surveillance is an important public health tool for monitoring the diversity of circulating mumps viruses and can provide insights into determinants of disease. These findings, in turn, will inform studies employing reverse genetics to elucidate the specific mechanisms of MuV pathogenesis and potential impacts of observed sequence variants on infectivity, fitness, and virulence. IMPORTANCE: Mumps virus (MuV) outbreaks occur in the United States despite high coverage with the measles, mumps, and rubella (MMR) vaccine. 
Routine genotyping of laboratory-confirmed mumps cases has been practiced in the United States since 2006 to enhance mumps surveillance. This study reports the detection of unusual mutations in the small hydrophobic (SH) protein of contemporary laboratory-confirmed mumps cases and is the first to describe the impact of such mutations on SH protein function. These mutations are predicted to profoundly alter the amino acid sequence of the SH protein, which has been shown to antagonize host innate immune responses; however, they were neither associated with defects in virus replication nor with attenuated protein function in vitro, consistent with their detection in clinical specimens. A better understanding of the forces governing mumps virus sequence diversity and of the functional consequences of mutations in viral proteins is important for maintaining robust capacity for mumps detection and disease control. |
Trials of the automated particle counter for laboratory rearing of mosquito larvae
Benedict MQ , Bascuñán P , Hunt CM , Aviles EI , Rotenberry RD , Dotson EM . PLoS One 2020 15 (11) e0241492 As a means of obtaining reproducible and accurate numbers of larvae for laboratory rearing, we tested a large-particle flow-cytometer-type device called the 'Automated Particle Counter' (APC). The APC is a gravity-fed, self-contained unit that detects changes in light intensity caused by larvae passing the detector in a water stream and controls dispensing by stopping the flow when the desired number has been reached. We determined the accuracy (number dispensed compared to the target value) and precision (distribution of numbers dispensed) of dispensing at a variety of counting sensitivity thresholds and larva throughput rates (larvae per second) using <1-day-old Anopheles gambiae and Aedes aegypti larvae. All measures were made using an APC algorithm called the 'Smoothed Z-Score', which allows the user to define how many standard deviations (Z scores) from the baseline light intensity a particle's absorbance must exceed to register a count. We dispensed a target number of 100 An. gambiae larvae using Z scores from 2.5-8 and observed no difference in the numbers dispensed for scores from 2.5-6; however, scores of 7 and 8 under-counted (over-dispensed) larvae. Using a Z score ≤ 6, we determined the effect of throughput rate on the accuracy of the device in dispensing An. gambiae larvae. For rates ≤ 98 larvae per second, the accuracy of dispensing a target of 100 larvae was -2.29% ± 0.72 (95% CI of the mean) with a mode of 99 (49 of 348 samples). When using a Z score of 3.5 and rates ≤ 100 larvae per second, the accuracy of dispensing a target of 100 Ae. aegypti larvae was -2.43% ± 1.26 (95% CI of the mean) with a mode of 100 (6 of 42 samples). Dispensing had no effect on the survival to adulthood of An. gambiae first-stage larvae. |
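The counting rule the abstract describes — register a particle when a sample's deviation from the baseline light intensity exceeds a user-chosen number of standard deviations (the Z score) — can be sketched as follows. This is a simplified illustration with a naive rolling baseline and a made-up signal trace, not the APC's actual firmware; in particular, the published smoothed z-score algorithm damps the influence of peak samples on the baseline, which this sketch omits.

```python
import statistics

def count_particles(signal, lag=20, z=3.5):
    """Count particles as threshold crossings: a sample whose deviation
    from the mean of the previous `lag` samples exceeds z standard
    deviations starts a peak; one count is registered per peak."""
    count = 0
    in_peak = False
    for i in range(lag, len(signal)):
        window = signal[i - lag:i]
        mean = statistics.mean(window)
        sd = statistics.pstdev(window) or 1e-9  # floor for a flat baseline
        if abs(signal[i] - mean) > z * sd:
            if not in_peak:  # rising edge: count each particle once
                count += 1
            in_peak = True
        else:
            in_peak = False
    return count

# Made-up trace: flat baseline with two brief absorbance spikes
trace = [100.0] * 50
trace[30], trace[40] = 180.0, 175.0
```

The edge detection (counting only on the transition into a peak) mirrors why throughput rate matters: at high larvae-per-second rates, overlapping particles merge into a single excursion and are under-counted, which is consistent with the over-dispensing the authors observed at high thresholds and rates.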
Invited review: human air-liquid-interface organotypic airway tissue models derived from primary tracheobronchial epithelial cells-overview and perspectives
Cao X , Coyle JP , Xiong R , Wang Y , Heflich RH , Ren B , Gwinn WM , Hayden P , Rojanasakul L . In Vitro Cell Dev Biol Anim 2020 57 (2) 1-29 The lung is an organ that is directly exposed to the external environment. Given the large surface area and extensive ventilation of the lung, it is prone to exposure to airborne substances, such as pathogens, allergens, chemicals, and particulate matter. Highly elaborate and effective mechanisms have evolved to protect and maintain homeostasis in the lung. Despite these sophisticated defense mechanisms, the respiratory system remains highly susceptible to environmental challenges. Because of the impact of respiratory exposure on human health and disease, there has been considerable interest in developing reliable and predictive in vitro model systems for respiratory toxicology and basic research. Human air-liquid-interface (ALI) organotypic airway tissue models derived from primary tracheobronchial epithelial cells have in vivo-like structure and functions when they are fully differentiated. The presence of the air-facing surface allows conducting in vitro exposures that mimic human respiratory exposures. Exposures can be conducted using particulates, aerosols, gases, vapors generated from volatile and semi-volatile substances, and respiratory pathogens. Toxicity data have been generated using nanomaterials, cigarette smoke, e-cigarette vapors, environmental airborne chemicals, drugs given by inhalation, and respiratory viruses and bacteria. Although toxicity evaluations using human airway ALI models require further standardization and validation, this approach shows promise in supplementing or replacing in vivo animal models for conducting research on respiratory toxicants and pathogens. |
Development and assessment of a pooled serum as candidate standard to measure influenza a virus group 1 hemagglutinin stalk-reactive antibodies
Carreño JM , McDonald JU , Hurst T , Rigsby P , Atkinson E , Charles L , Nachbagauer R , Behzadi MA , Strohmeier S , Coughlan L , Aydillo T , Brandenburg B , García-Sastre A , Kaszas K , Levine MZ , Manenti A , McDermott AB , Montomoli E , Muchene L , Narpala SR , Perera RAPM , Salisch NC , Valkenburg SA , Zhou F , Engelhardt OG , Krammer F . Vaccines (Basel) 2020 8 (4) The stalk domain of the hemagglutinin has been identified as a target for induction of protective antibody responses due to its high degree of conservation among numerous influenza subtypes and strains. However, current assays to measure stalk-based immunity are not standardized. Hence, harmonization of assay readouts would help to compare experiments conducted in different laboratories and increase confidence in results. Here, serum samples from healthy individuals (n = 110) were screened using a chimeric cH6/1 hemagglutinin enzyme-linked immunosorbent assay (ELISA) that measures stalk-reactive antibodies. We identified samples with moderate to high IgG anti-stalk antibody levels. Likewise, screening of the samples using the mini-hemagglutinin (HA) headless construct #4900 and analysis of the correlation between the two assays confirmed the presence and specificity of anti-stalk antibodies. Additionally, samples were characterized by a cH6/1N5 virus-based neutralization assay, an antibody-dependent cell-mediated cytotoxicity (ADCC) assay, and competition ELISAs, using the stalk-reactive monoclonal antibodies KB2 (mouse) and CR9114 (human). A "pooled serum" (PS) consisting of a mixture of selected serum samples was generated. The PS exhibited high levels of stalk-reactive antibodies, had a cH6/1N5-based neutralization titer of 320, and contained high levels of stalk-specific antibodies with ADCC activity. The PS, along with blinded samples of varying anti-stalk antibody titers, was distributed to multiple collaborators worldwide in a pilot collaborative study.
The samples were subjected to different assays available in the different laboratories, to measure either binding or functional properties of the stalk-reactive antibodies contained in the serum. Results from binding and neutralization assays were analyzed to determine whether use of the PS as a standard could lead to better agreement between laboratories. The work presented here points the way towards the development of a serum standard for antibodies to the HA stalk domain of phylogenetic group 1. |
Meeting an urgent public health workforce need: Development of the CDC laboratory Leadership Service Fellowship program
Glynn MK , Liu X , Ned-Sykes R , Dauphin LA , Simone PM . Health Secur 2020 18 (5) 418-423 Laboratory scientists of the US Centers for Disease Control and Prevention (CDC) and other public health laboratories play a fundamental and increasingly complex role in implementing public health programs while ensuring laboratory safety and quality. In 2014, a series of laboratory safety incidents highlighted the need for improvement in federal and other government laboratories. One component of the CDC's response to these incidents was a new career-entry fellowship program, the Laboratory Leadership Service (LLS). Offering laboratory safety and quality training and leadership development for laboratory scientists, LLS is intended to create a pipeline of future laboratory leaders who prioritize quality and safety as a core part of their laboratory science and practice throughout their careers. LLS incorporates evidence-based practices such as the service-learning model, a competency-based curriculum, ongoing stakeholder engagement, and program evaluation to maximize the program's success. This article describes how the CDC created LLS as a workforce development measure to respond to an urgent public health need-to improve laboratory safety and quality-and presents key factors for success in quickly establishing the program. |
A role for neuroimmune signaling in a rat model of Gulf War Illness-related pain
Lacagnina MJ , Li J , Lorca S , Rice KC , Sullivan K , O'Callaghan JP , Grace PM . Brain Behav Immun 2020 91 418-428 More than a quarter of veterans of the 1990-1991 Persian Gulf War suffer from Gulf War Illness (GWI), a chronic, multi-symptom illness that commonly includes musculoskeletal pain. Exposure to a range of toxic chemicals, including sarin nerve agent, is a suspected root cause of GWI. Moreover, such chemical exposures induce a neuroinflammatory response in rodents, which has been linked to several GWI symptoms in rodents and veterans with GWI. To date, a neuroinflammatory basis for pain associated with GWI has not been investigated. Here, we evaluated development of nociceptive hypersensitivity in a model of GWI. Male Sprague Dawley rats were treated with corticosterone in the drinking water for 7 days, to mimic high physiological stress, followed by a single injection of the sarin nerve agent surrogate, diisopropyl fluorophosphate. These exposures alone were insufficient to induce allodynia. However, an additional sub-threshold challenge (a single intramuscular injection of pH 4 saline) induced long-lasting, bilateral allodynia. Such allodynia was associated with elevation of markers for activated microglia/macrophages (CD11b) and astrocytes/satellite glia (GFAP) in the lumbar dorsal spinal cord and dorsal root ganglia (DRG). Additionally, Toll-like receptor 4 (TLR4) mRNA was elevated in the lumbar dorsal spinal cord, while IL-1β and IL-6 were elevated in the lumbar dorsal spinal cord, DRG, and gastrocnemius muscle. Demonstrating a causal role for such neuroinflammatory signaling, allodynia was reversed by treatment with either minocycline, the TLR4 inhibitor (+)-naltrexone, or IL-10 plasmid DNA. Together, these results point to a role for neuroinflammation in this male rat model of musculoskeletal pain related to GWI. Therapies that alleviate persistent immune dysregulation may be a strategy to treat pain and other symptoms of GWI. |
Validation of the chromogenic Bethesda assay for factor VIII inhibitors in hemophilia a patients receiving emicizumab
Miller CH , Boylan B , Payne AB , Driggers J , Bean CJ . Int J Lab Hematol 2020 43 (2) e84-e86 Development of antibodies interfering with the function of factor VIII (FVIII) replacement products is one of the most significant complications in the treatment of hemophilia A (HA). Laboratory testing for such antibodies, called inhibitors, is an important part of hemophilia care and is conducted both to identify the cause of treatment failure and as routine screening to detect early antibody appearance. Treatment of HA patients who develop inhibitors is often carried out by giving repeated doses of FVIII to induce immune tolerance and allow use of FVIII or by use of by-passing agents that act by facilitating coagulation without the need for FVIII. The newest by-passing product, emicizumab (Hemlibra®), is a bispecific antibody that mimics the function of FVIII by bringing factor IXa and factor X (FX) together to produce the Xa complex. Emicizumab, which is long acting and given subcutaneously, is now widely available for use in patients both with and without inhibitors to avoid frequent use of intravenous FVIII replacement. |
Harmonizing newborn screening laboratory proficiency test results using the CDC NSQAP Reference Materials
Pickens CA , Sternberg M , Seeterlin M , De Jesús VR , Morrissey M , Manning A , Bhakta S , Held PK , Mei J , Cuthbert C , Petritis K . Int J Neonatal Screen 2020 6 (3) 75 Newborn screening (NBS) laboratories cannot accurately compare mass spectrometry-derived results and cutoff values due to differences in testing methodologies. The objective of this study was to assess harmonization of laboratory proficiency test (PT) results using quality control (QC) data. Newborn Screening Quality Assurance Program (NSQAP) QC and PT data reported from 302 laboratories in 2019 were used to compare results among laboratories. QC materials were provided as dried blood spot cards which included a base pool and the base pool enriched with specific concentrations of metabolites in a linear range. QC data reported by laboratories were regressed on QC data reported by the Centers for Disease Control and Prevention (CDC), and each laboratory's regression parameters were used to harmonize its PT results. In general, harmonization tended to reduce overall variation in PT data across laboratories. The metabolites glutarylcarnitine (C5DC), tyrosine, and phenylalanine are presented to highlight inter- and intra-method variability in NBS results. Several limitations were identified using retrospective data for harmonization, and future studies will address these limitations to further assess the feasibility of using NSQAP QC data to harmonize PT data. Harmonizing NBS data using common QC materials appears promising as an aid to result comparison between laboratories. |
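The harmonization step described above, regressing each laboratory's QC results on the CDC-reported QC values and then using the fitted parameters to rescale that laboratory's PT result, can be sketched as below. This is an illustrative sketch assuming a simple per-laboratory ordinary-least-squares fit; the function names are hypothetical and the study's actual regression model may differ.

```python
def fit_line(x, y):
    """Ordinary least squares for y = a + b*x (two-parameter fit)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

def harmonize(pt_result, lab_qc, cdc_qc):
    """Fit lab_value = a + b * cdc_value from paired QC measurements,
    then invert that line to express a PT result on the CDC scale."""
    a, b = fit_line(cdc_qc, lab_qc)
    return (pt_result - a) / b
```

For example, a laboratory whose QC results run consistently high would get `b > 1`, and its PT results would be scaled down toward the common reference scale, reducing between-laboratory spread.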
Development of a multiplex bead assay for the detection of canine IgG(4) antibody responses to guinea worm
Priest JW , Ngandolo BNR , Lechenne M , Cleveland CA , Yabsley MJ , Weiss AJ , Roy SL , Cama V . Am J Trop Med Hyg 2020 104 (1) 303-312 Increased levels of guinea worm (GW) disease transmission among dogs in villages along the Chari River in Chad threaten the gains made by the GW Eradication Program. Infected dogs with preemergent worm blisters are difficult to proactively identify. If these dogs are not contained, blisters can burst upon submersion in water, leading to the contamination of the water supply with L1 larvae. Guinea worm antigens previously identified using sera from human dracunculiasis patients were coupled to polystyrene beads for multiplex bead assay analysis of 41 non-endemic (presumed negative) dog sera and 39 sera from GW-positive dogs from Chad. Because commercially available anti-dog IgG secondary antibodies did not perform well in the multiplex assay, dog IgGs were partially purified, and a new anti-dog IgG monoclonal antibody was developed. Using the new 4E3D9 monoclonal secondary antibody, the thioredoxin-like protein 1-glutathione-S-transferase (GST), heat shock protein (HSP1)-GST, and HSP2-GST antigen multiplex assays had sensitivities of 69-74% and specificities of 73-83%. The domain of unknown function protein 148 (DUF148)-GST antigen multiplex assay had a sensitivity of 89.7% and a specificity of 85.4%. When testing samples collected within 1 year of GW emergence (n = 20), the DUF148-GST assay had a sensitivity of 90.0% and a specificity of 97.6% with a receiver-operating characteristic area under the curve of 0.94. Using sera from two experimentally infected dogs, antibodies to GW antigens were detected within 6 months of exposure. Our results suggest that, when used to analyze paired, longitudinal samples collected 1-2 months apart, the DUF148/GST multiplex assay could identify infected dogs 4-8 months before GW emergence. |
Quantification of seven terpenes in human serum by headspace solid-phase microextraction-gas chromatography-tandem mass spectrometry
Silva LK , Espenship MF , Newman CA , Blount BC , De Jesús VR . Environ Sci Technol 2020 54 (21) 13861-13867 Terpenes are a class of volatile organic hydrocarbons commonly produced by vegetation and released into the atmosphere. These compounds are responsible for the scents of pine forests, citrus fruits, and some flowers. Human terpene exposure can come from inhalation, diet, smoking, and more recently, using e-cigarettes. Terpenes are present in tobacco smoke and are used as flavor chemicals in e-liquids. The health effects of terpenes are not widely known, though several studies have suggested that they may prove useful in future medical applications. We have developed a novel, high-throughput method of quantifying seven terpenes (α-pinene, β-pinene, β-myrcene, 3-carene, limonene, β-caryophyllene, and α-humulene) in human serum to aid human-exposure investigations. This method employs headspace sampling using solid-phase microextraction (SPME) coupled to gas chromatography-tandem mass spectrometry to detect and quantify five monoterpenes and two sesquiterpenes in the low parts-per-trillion to low parts-per-billion range. The intraday and interday variability (percent error) of the method are ≤2 and ≤11%, respectively. In addition, this method showed excellent recovery in human serum (between 80 and 120% for all analytes). The assay precision ranges between 4.0 and 11%. Limits of detection ranged between 0.032 and 0.162 μg/L. Using serum cotinine values to classify tobacco use showed that smokers have higher serum concentrations of six terpenes compared to nonusers. Terpene concentrations were 14-78% higher in smokers than nonusers. Our method can provide essential biomonitoring data to establish baseline exposure levels for terpenes in humans. |
Determination of 226Ra in urine using triple quadrupole inductively coupled plasma mass spectrometry
Xiao G , Liu Y , Jones RL . Radiat Prot Dosimetry 2020 Measuring 226Ra in urine at low levels is critical for both biomonitoring and radiological emergency response. Here we report a new analytical method to quantify 226Ra, developed and validated using a simple dilute-and-shoot procedure followed by inductively coupled plasma triple-quadrupole mass spectrometry detection in 'No Gas MS-MS' mode. The method provides rapid and accurate results for 226Ra with a limit of detection (LOD) down to 0.007 ng/l (0.26 Bq/l). This LOD is well below the recommended action levels for 226Ra detection in children and pregnant women (C/P) set by the Clinical Decision Guide (NCRP Report No. 161). Results for 226Ra obtained by this method are within ±7.0% of the target values of standard reference materials spiked into urine. |
Preventive care and medical homes among US children with heart conditions
Broughton A , Riehle-Colarusso T , Nehl E , Riser AP , Farr SL . Cardiol Young 2020 31 (1) 1-7 Within a medical home, primary care providers can identify needs, provide services, and coordinate care for children with heart conditions. Using parent-reported data from the 2016-2017 National Survey of Children's Health, we examined receipt of preventive care in the last 12 months and having a medical home (care that is accessible, continuous, comprehensive, family-centred, coordinated, compassionate, and culturally effective) among US children aged 0-17 years with and without heart conditions. Using the marginal predictions approach to multivariable logistic regression, we examined associations between presence of a heart condition and receipt of preventive care and having a medical home. Among children with heart conditions, we evaluated associations between sociodemographic and health characteristics and receipt of preventive care and having a medical home. Of the 66,971 children included, 2.2% had heart conditions. Receipt of preventive care was reported for more children with heart conditions (91.0%) than without (82.7%) (adjusted prevalence ratio = 1.09, 95% confidence interval: 1.05-1.13). Less than half of children with heart conditions (48.2%) and without (49.5%) had a medical home (adjusted prevalence ratio = 1.02, 95% confidence interval: 0.91-1.14). For children with heart conditions, preventive care was slightly more common among younger children and less common among those with family incomes 200-399% of the federal poverty level. Having a medical home was less common among younger children, children of non-Hispanic "other" race, and those with ≥2 other health conditions. Most children with heart conditions received preventive care, but less than half had a medical home, with disparities by age, socioeconomic status, race, and concurrent health conditions. These findings highlight opportunities to improve care for children with heart conditions. |
Periconceptional surveillance for prevention of anaemia and birth defects in Southern India: protocol for a biomarker survey in women of reproductive age
Finkelstein JL , Fothergill A , Johnson CB , Guetterman HM , Bose B , Jabbar S , Zhang M , Pfeiffer CM , Qi YP , Rose CE , Krisher JT , Ruth CJ , Mehta R , Williams JL , Bonam W , Crider KS . BMJ Open 2020 10 (10) e038305 INTRODUCTION: Women of reproductive age (WRA) are a high-risk population for anaemia and micronutrient deficiencies. Evidence supports the role of periconceptional nutrition in the development of adverse pregnancy complications. However, in India, there are limited population-based data to guide evidence-based recommendations and priority setting. The objective of this study is to conduct a population-based biomarker survey of anaemia and vitamin B12 and folate status in WRA as part of a periconceptional surveillance programme in Southern India. METHODS: WRA (15-40 years) who are not pregnant or lactating and reside within 50 km² of our community research site in Southern India will be screened and invited to participate in the biomarker survey at our research facility at Arogyavaram Medical Centre. After informed consent/assent, structured interviews will be conducted by trained nurse enumerators to collect sociodemographic, dietary, anthropometry, health and reproductive history data. Venous blood samples will be collected at enrolment; whole blood will be analysed for haemoglobin. Plasma, serum and red blood cells (RBCs) will be processed and stored at -80°C or below until batch analysis. Vitamin B12 concentrations will be measured via chemiluminescence, and RBC and serum folate concentrations will be evaluated using the World Health Organisation (WHO)-recommended microbiological assay at our laboratory in Bangalore. A WHO surveillance system will also be established to determine the baseline prevalence of birth defects in this setting. ETHICS AND DISSEMINATION: This study has obtained clearance from the Health Ministry Screening Committee of the Indian Council of Medical Research. 
The study protocol was reviewed and approved by the Institutional Review Board at Cornell University and the Institutional Ethics Committees at Arogyavaram Medical Centre and St. John's Research Institute. Findings from this biomarker survey will establish the burden of anaemia and micronutrient deficiencies in WRA and directly inform a randomised trial for anaemia and birth defects prevention in Southern India. The results of this study will be disseminated at international research conferences and as published articles in peer-reviewed journals. TRIAL REGISTRATION NUMBERS: Clinical trials registration number NCT04048330, NCT03853304 and Clinical Trials Registry of India (CTRI) registration number REF/2019/03/024479. |
Association of genetic mutations and loss of ambulation in childhood-onset dystrophinopathy
Haber G , Conway KM , Paramsothy P , Roy A , Rogers H , Ling X , Kozauer N , Street N , Romitti PA , Fox DJ , Phan HC , Matthews D , Ciafaloni E , Oleszek J , James KA , Galindo M , Whitehead N , Johnson N , Butterfield RJ , Pandya S , Venkatesh S , Bhattaram VA . Muscle Nerve 2020 63 (2) 181-191 INTRODUCTION: Quantifying associations between genetic mutations and loss of ambulation (LoA) among males diagnosed with childhood-onset dystrophinopathy is important for understanding variation in disease progression and may be useful in clinical trial design. METHODS: Genetic and clinical data from the Muscular Dystrophy Surveillance, Tracking, and Research Network for 358 males born and diagnosed from 1982-2011 were analyzed. LoA was defined as the age at which independent ambulation ceased. Genetic mutations were defined by overall type (deletion/duplication/point mutation) and, among deletions, by whether they were amenable to exon-skipping therapy (exons 8, 20, 44-46, 51-53) or not. Cox proportional hazards regression modeling was used to estimate hazard ratios (HRs) and 95% confidence intervals (CIs). RESULTS: Mutation type did not predict time to LoA. Controlling for corticosteroid use, deletions amenable to skipping of exon 8 (HR=0.22; 95% CI=0.08-0.63) or exon 44 (HR=0.30; 95% CI=0.12-0.78) were associated with delayed LoA compared with other exon deletions. DISCUSSION: Delayed LoA in males with mutations amenable to exon-skipping therapy is consistent with previous studies. These findings suggest that clinical trials enrolling males with exon 8- or exon 44-skippable mutations should consider mutation information prior to randomization. |
Attention-deficit/hyperactivity disorder among US children and adolescents with congenital adrenal hyperplasia
Harasymiw LA , Grosse SD , Sarafoglou K . J Endocr Soc 2020 4 (12) bvaa152 BACKGROUND: Little is known regarding risk for co-occurring mental health conditions among pediatric patients with congenital adrenal hyperplasia (CAH). The objective of the current study was to investigate the prevalence of medically managed attention-deficit/hyperactivity disorder (ADHD) in 2 large administrative samples of insured children and adolescents with and without CAH in the United States. METHODS: We assessed the prevalence of CAH and of medically managed ADHD using algorithms defined from diagnosis codes and filled prescriptions data using the IBM MarketScan Commercial and Multi-State Medicaid claims databases. We evaluated subjects who were continuously enrolled for ≥ 12 months with a first claim during October 2015 through December 2017 when they were 5 to 18 years old. RESULTS: The administrative prevalence of CAH in the Commercial (N = 3 685 127) and Medicaid (N = 3 434 472) samples was 10.1 per 100 000 (n = 372) and 7.2 per 100 000 (n = 247), respectively. The prevalence of medically managed ADHD in the non-CAH population was 8.4% in the Commercial sample and 15.1% in the Medicaid sample. Among children with CAH, there was no increased prevalence of ADHD in the Commercial (9.2%, prevalence ratio [PR] = 1.1; 95% confidence interval [CI], 0.82-1.54; P = 0.48) or Medicaid (13.8%; PR = 0.91; 95% CI, 0.67-1.24; P = 0.55) samples compared with the general population. CONCLUSIONS: Using 2 large samples of insured children and adolescents in the United States, we found similar prevalence of medically managed ADHD among those with CAH and the general population. Future research to assess the validity of our claims algorithm for identifying pediatric CAH cases is warranted. |
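As a worked illustration of the prevalence-ratio comparison used in the abstract above: a crude (unadjusted) PR is simply the outcome prevalence in the exposed (CAH) group divided by that in the comparison group, and a Wald confidence interval on the log scale is one common choice for its CI. The counts below are made up for illustration; the study itself reports adjusted PRs, which require regression modeling rather than this direct calculation.

```python
import math

def prevalence_ratio(a, n1, b, n0, z=1.96):
    """Crude prevalence ratio with a Wald CI on the log scale.
    a of n1 subjects have the outcome in the exposed group;
    b of n0 in the unexposed group."""
    pr = (a / n1) / (b / n0)
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n0)  # SE of log(PR)
    lo = math.exp(math.log(pr) - z * se)
    hi = math.exp(math.log(pr) + z * se)
    return pr, lo, hi
```

With small case counts in the exposed group (as in a rare condition like CAH), the CI is wide and easily spans 1.0, consistent with the null findings reported.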
Early intervention, parent talk, and pragmatic language in children with hearing loss
Yoshinaga-Itano C , Sedey AL , Mason CA , Wiggin M , Chung W . Pediatrics 2020 146 S270-s277 BACKGROUND AND OBJECTIVES: Pragmatic language skills form the foundation for conversational competence, whereas deficits in this area are associated with behavioral problems and low literacy skills. Children who are deaf or hard of hearing demonstrate significant delays in this critical area of language. Our purpose with this research was to identify variables associated with pragmatic language ability in children who are deaf or hard of hearing. METHODS: This was a longitudinal study of 124 children with bilateral hearing loss between 4 and 7 years of age living in Colorado. As part of a comprehensive speech and language assessment, pragmatic language skills were evaluated annually by using the Pragmatics Checklist. RESULTS: The children's pragmatic skills increased significantly with age. Higher levels of pragmatic language ability at 7 years of age were predicted by (1) meeting Early Hearing Detection and Intervention 1-3-6 guidelines (hearing screening by 1 month, identification of hearing loss by 3 months, and receiving intervention by 6 months of age), (2) greater quantity of parent talk, (3) higher nonverbal intelligence, (4) lesser degrees of hearing loss, and (5) higher maternal education. CONCLUSIONS: With the findings of this study, we underscore the importance of pediatricians and other health care professionals counseling parents about the value of adherence to the Early Hearing Detection and Intervention 1-3-6 guidelines with regard to intervention outcomes. The strong association between amount of child-directed parent talk in the first 4 years of life and pragmatic language outcomes at 7 years of age emphasizes the need for professionals to encourage parents to talk to their children as much as possible. |
Comparison of anthropometric data quality in children aged 6-23 and 24-59 months: lessons from population-representative surveys from humanitarian settings
Bilukha O , Couture A , McCain K , Leidman E . BMC Nutr 2020 6 (1) 60 Background: Ensuring the quality of anthropometry data is paramount for getting accurate estimates of malnutrition prevalence among children aged 6–59 months in humanitarian and refugee settings. Previous reports based on data from Demographic and Health Surveys suggested systematic differences in anthropometric data quality between the younger and older groups of preschool children. Methods: We analyzed 712 anthropometric population-representative field surveys from humanitarian and refugee settings conducted during 2011–2018. We examined and compared the quality of five anthropometric indicators in children aged 6–23 months and children aged 24–59 months: weight for height, weight for age, height for age, body mass index for age and mid-upper arm circumference (MUAC) for age. Using the z-score distribution of each indicator, we calculated the following parameters: standard deviation (SD), percentage of outliers, and measures of distribution normality. We also examined and compared the quality of height, weight, MUAC and age measurements using missing data and rounding criteria. Results: Both SD and percentage of flags were significantly smaller on average in older than in younger age group for all five anthropometric indicators. Differences in SD between age groups did not change meaningfully depending on overall survey quality or on the quality of age ascertainment. Over 50% of surveys overall did not deviate significantly from normality. The percentage of non-normal surveys was higher in older than in the younger age groups. Digit preference score for weight, height and MUAC was slightly higher in younger age group, and for age slightly higher in the older age group. Children with reported exact date of birth (DOB) had much lower digit preference for age than those without exact DOB. 
SD, percentage flags and digit preference scores were positively correlated between the two age groups at the survey level, such that surveys showing higher anthropometry data quality in the younger age group also tended to show higher quality in the older age group. Conclusions: Training of survey measurers should place increased emphasis on taking anthropometric measurements in the youngest children. The standardization test of 10 children, a mandatory component of pre-survey measurer training and evaluation, should include at least 4–5 children below 2 years of age. |
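The digit preference score used in the abstract above quantifies terminal-digit heaping (e.g. heights rounded to .0 or .5). One common formulation, used in SMART-style plausibility checks, is 100·sqrt(χ²/(N·9)) over the 10 terminal digits, so 0 means perfectly uniform terminal digits and 100 means every value shares a single digit. A minimal sketch under that assumption:

```python
from collections import Counter
import math

def digit_preference_score(values):
    """Digit preference score on the first decimal place:
    100 * sqrt(chi2 / (N * 9)), where chi2 compares observed terminal-digit
    counts to a uniform distribution over the digits 0-9."""
    digits = [int(round(v * 10)) % 10 for v in values]
    n = len(digits)
    expected = n / 10
    counts = Counter(digits)
    chi2 = sum((counts.get(d, 0) - expected) ** 2 / expected
               for d in range(10))
    return 100 * math.sqrt(chi2 / (n * 9))
```

Uniform terminal digits give a score of 0; total heaping on one digit gives 100, which is why the abstract can compare rounding behavior across age groups on a single scale.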
Most national, mandatory flour fortification standards do not align with international recommendations for iron, zinc, and vitamin B12 levels
Bobrek KS , Broersen B , Aburto NJ , Garg A , Serdula M , Beltrán Velázquez F , Wong EC , Pachón H . Food Policy 2020 99 As national flour fortification standards are one of the policy documents developed to guide food fortification, the objective was to compare national, mandatory wheat and maize flour fortification standards to World Health Organization (WHO) fortification guidelines. For each nutrient in 72 countries’ standards, the type of compound was noted as ‘yes’ if it was included in international guidelines or ‘no’ if it was not. Nutrient levels in standards were classified as lower than, equal to, or higher than those suggested by WHO. If another food (i.e. rice, oil, milk) was mass fortified with a nutrient categorized as “lower than,” the classification was changed to “less than recommendation and included in other mass fortified food”. At least 61% of standards included one or more recommended compounds for all nutrients in standards for wheat flour alone (iron, folic acid, vitamin A, zinc, vitamin B12), wheat and maize flour together (iron, folic acid, vitamin A, zinc, vitamin B12) and maize flour alone (thiamin, riboflavin, niacin, pyridoxine); no country included pantothenic acid in its maize flour standard. For folic acid, vitamin A, thiamin, riboflavin, niacin and pyridoxine, at least 50% of standards (1) met or exceeded WHO suggested levels, or (2) were lower than suggested levels and another food was mass fortified with the specific nutrient in the country. For iron, zinc and vitamin B12, less than 50% of standards met (1) or (2). In conclusion, iron, zinc and vitamin B12 may require the most attention in national fortification standards. |
The mPINC survey: Impacting US maternity care practices
Nelson JM , Grossniklaus DA , Galuska DA , Perrine CG . Matern Child Nutr 2020 17 (1) e13092 The Centers for Disease Control and Prevention administered the original Maternity Practices in Infant Nutrition and Care (mPINC) survey, a census of all US birth facilities, from 2007 to 2015 to monitor infant feeding-related maternity care practices and policies. The purpose of this paper is to describe the many uses of mPINC data. Hospitals, organizations and governments (federal, state and local) have used the mPINC survey as a tool for improving care among the populations they serve. Nationally, the mPINC survey has been used to document marked improvements in infant feeding-related maternity care. Researchers have used the mPINC data to examine a variety of questions related to maternity care practices and policies. The newly revised mPINC survey (2018) has been designed to capture changes that have occurred over the past decade in infant feeding-related US maternity care. Hospitals, organizations, governments and researchers will be able to continue using this important tool in their efforts to ensure US maternity care practices and policies are fully supportive of breastfeeding. |
Telomeres in toxicology: Occupational health
Shoeb M , Meier HCS , Antonini JM . Pharmacol Ther 2020 220 107742 The ends of chromosomes shorten at each round of cell division, and this process is thought to be affected by occupational exposures. Occupational hazards may alter telomere length homeostasis resulting in DNA damage, chromosome aberration, mutations, epigenetic alterations and inflammation. Therefore, for the protection of genetic material, nature has provided a unique nucleoprotein structure known as a telomere. Telomeres provide protection by averting an inappropriate activation of the DNA damage response (DDR) at chromosomal ends and preventing recognition of single and double strand DNA (ssDNA and dsDNA) breaks or chromosomal end-to-end fusion. Telomeres and their interacting six shelterin complex proteins in coordination act as inhibitors of DNA damage machinery by blocking DDR activation at chromosomes, thereby preventing the occurrence of genome instability, perturbed cell cycle, cellular senescence and apoptosis. However, inappropriate DNA repair may result in the inadequate distribution of genetic material during cell division, resulting in the eventual development of tumorigenesis and other pathologies. This article reviews the current literature on the association of changes in telomere length and its interacting proteins with different occupational exposures and the potential application of telomere length or changes in the regulatory proteins as potential biomarkers for exposure and health response, including recent findings and future perspectives. |
Toward an automation of functional analysis interpretation: A proof of concept
Cox A , Friedel JE . Behav Modif 2020 46 (1) 147-177 The advent of functional analysis (FA) methodology paved the way for improved function-based behavioral interventions and ultimately client outcomes. Behavior analysts primarily rely on visual inspection to interpret FA results. However, the literature suggests interpretations may vary across raters resulting in poor interobserver agreement (IOA). To increase interpretation objectivity and address IOA issues, Hagopian et al. created visual-inspection criteria. They reported improved IOA, alongside criteria limitations. Following this, Roane et al. modified these criteria. The current project describes the first steps toward developing a decision support system to assist in FA interpretation. Specifically, we created a computer script, written in R, designed to evaluate FA data and produce an outcome (assign function) based on the Roane et al. criteria. Average agreement between experienced human raters and the computer script outcomes was 81%. We discuss criteria limitations (e.g., vague rules), study implications, and the significance of further research on this topic. |
Occupational respiratory infections
de Perio MA , Kobayashi M , Wortham JM . Clin Chest Med 2020 41 (4) 739-751 Occupational respiratory infections can be caused by bacterial, viral, and fungal pathogens. Transmission in occupational settings can occur from other humans, animals, or the environment, and can occur in a variety of occupations and industries. In this article, we describe 4 occupationally acquired respiratory infections that were the focus of NIOSH investigations over the last decade: tuberculosis (TB), influenza, psittacosis, and coccidioidomycosis. We highlight their epidemiology, clinical manifestations, occupational risk factors, and prevention measures. |
Occupational allergies to cannabis
Decuyper II , Green BJ , Sussman GL , Ebo DG , Silvers WS , Pacheco K , King BS , Cohn JR , Zeiger RS , Zeiger JS , Naimi DR , Beezhold DH , Nayak AP . J Allergy Clin Immunol Pract 2020 8 (10) 3331-3338 Within the last decade there has been a significant expansion in access to cannabis for medicinal and adult nonmedical use in the United States and abroad. This has resulted in a rapidly growing and diverse workforce that is involved with the growth, cultivation, handling, and dispensing of the cannabis plant and its products. The objective of this review was to educate physicians on the complexities associated with the health effects of cannabis exposure, the nature of these exposures, and the future practical challenges of managing these in the context of allergic disease. We will detail the biological hazards related to typical modern cannabis industry operations that may potentially drive allergic sensitization in workers. We will highlight the limitations that have hindered the development of objective diagnostic measures that are essential in separating “true” cannabis allergies from nonspecific reactions/irritations that “mimic” allergy-like symptoms. Finally, we will discuss recent advances in the basic and translational scientific research that will aid the development of diagnostic tools and therapeutic standards to serve optimal management of cannabis allergies across the occupational spectrum. |
Safe patient handling and mobility (SPHM) for increasingly bariatric patient populations: Factors related to caregivers' self-reported pain and injury
Galinsky T , Deter L , Krieg E , Feng HA , Battaglia C , Bell R , Haddock KS , Hilton T , Lynch C , Matz M , Moscatel S , Riley FD , Sampsel D , Shaw S . Appl Ergon 2020 91 103300 This study was conducted at 5 Veterans Administration Medical Centers (VAMCs). A cross-sectional survey was administered to 134 workers who routinely lift and mobilize patients within their workplaces' safe patient handling and mobility (SPHM) programs, which are mandated in all VAMCs. The survey was used to examine a comprehensive list of SPHM and non-SPHM variables and their associations with self-reported musculoskeletal injury and pain. Previously unstudied variables distinguished between "bariatric" (≥300 lb or 136 kg) and "non-bariatric" (<300 lb or 136 kg) patient handling. Significant findings from stepwise and logistic regression provide targets for workplace improvements, predicting: lower injury odds when workers more frequently had sufficient time to use equipment, higher back pain odds with more frequent bariatric handling, lower back pain odds with greater ease in following SPHM policies, and lower odds of upper extremity pain with more bariatric equipment and with higher safety climate ratings. |
An investigation of resurgence of reinforced behavioral variability in humans
Galizio A , Friedel JE , Odum AL . J Exp Anal Behav 2020 114 (3) 381-393 The present study examined resurgence of reinforced variability in college students, who completed a 3-phase computer-based variability task. In the first phase, baseline, points were delivered for drawing rectangles that sufficiently differed from previous rectangles in terms of a target dimension (size or location, counterbalanced) but were sufficiently similar in terms of the alternative dimension. In the second phase, alternative, points were only delivered for rectangles that were sufficiently different in terms of the alternative dimension, but repetitive in terms of the target dimension. In the third phase, extinction, no points were delivered. In baseline, participants made rectangles that were highly varied in terms of the target dimension and less varied in terms of the alternative dimension, and vice versa in the alternative phase. During extinction, levels of variability increased for the target dimension, providing evidence for resurgence of reinforced variability of a specific dimension of behavior. However, levels of variability also remained high for the alternative dimension, indicating that extinction-induced response variability may also have impacted the results. Although future research is needed to explore other explanations, the results of this study replicate prior research with pigeons and provide some support for the notion of variability as an operant. |
Radiographic screening reveals high burden of silicosis among workers at an engineered stone countertop fabrication facility in California
Heinzerling A , Cummings KJ , Flattery J , Weinberg JL , Materna B , Harrison R . Am J Respir Crit Care Med 2020 203 (6) 764-766 Silicosis is a progressive and incurable, but preventable, occupational respiratory disease caused by respirable crystalline silica exposure. Over the past decade, outbreaks of silicosis have been reported in several countries among workers who cut and finish engineered stone slabs for countertops (1–3). Many affected workers have been young, with rapidly progressive disease (4, 5). Engineered stone is a composite material made of crushed quartz bound together with polyester resins and pigments, with significantly higher silica content than natural stone materials; engineered stone typically contains >90% silica, compared with <45% in granite and <5% in marble (3, 6). Workers can be exposed to markedly elevated levels of respirable crystalline silica when cutting and finishing engineered stone materials (7, 8). |
Working in smoke: Wildfire impacts on the health of firefighters and outdoor workers and mitigation strategies
Navarro K . Clin Chest Med 2020 41 (4) 763-769 Wildland firefighters work on wildfire incidents all over the United States and perform arduous work under extreme conditions, including exposure to smoke. Wildland fire smoke is a mixture of hazardous air pollutants. To assess wildland firefighter exposure to smoke, most studies have measured carbon monoxide (CO) and particulate matter, and have reported changes in lung health as measured by lung function, airway responsiveness, and respiratory symptoms across individual work shifts and single fire seasons. All fire personnel should understand the hazards of smoke and develop ways to mitigate exposure to it. |
Occupational bronchiolitis: An update
Nett RJ , Harvey RR , Cummings KJ . Clin Chest Med 2020 41 (4) 661-686 Occupational bronchiolitis is characterized by inflammation of the small airways, and represents a heterogeneous set of lung conditions that can occur following a range of inhalation exposures related to work. The most common clinical presentation includes insidious onset of exertional dyspnea and cough. Multiple reports in recent years have drawn attention to previously unrecognized risk factors for occupational bronchiolitis following exposures in several settings. Both current and past occupational exposures, including prior military deployment-related exposures, should be considered in patients undergoing evaluation for unexplained dyspnea. Diagnostic testing for potential bronchiolitis should include a thorough assessment of the small airways. |
An approach to characterize the impact absorption performance of construction helmets in top impact
Pan CS , Wimer BM , Welcome DE , Wu JZ . J Test Eval 2020 49 (3) The helmets used by construction site workers are mainly designed for head protection when objects are dropped from heights. Construction helmets are also casually called hard hats in industry. Common construction helmets are mostly categorized as type 1 according to various standards. All type 1 helmets have to pass type 1 standard impact tests, which are top impact tests: the helmet is fixed and is impacted by a free-falling impactor on the top crown of the helmet shell. The purpose of this study was to develop an approach for characterizing the performance of a helmet. A total of 31 drop impact tests using a representative type 1 helmet model were performed at drop heights from 0.30 to 2.23 m, estimated to produce impact speeds from 2.4 to 6.6 m/s. Based on our results, we identified a critical drop height that was used to evaluate helmet performance. The peak impact forces and peak accelerations varied nonproportionally with the drop height. When the drop height is less than the critical height, the peak force and peak acceleration increase gradually and slowly with increasing drop height. When the drop height is greater than the critical height, the peak force and peak acceleration increase steeply with even a slight increase in drop height. Based on the critical drop height, we proposed an approach to determine the safety margin of a helmet. The proposed approach would make it possible to determine the performance characteristics of a helmet and to estimate the safety margin it affords, provided the helmet first passes the existing standardized tests. The proposed test approach would provide supplementary information for consumers making informed decisions when selecting construction helmets. |
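The reported impact speeds follow directly from the drop heights under free-fall kinematics. A quick check, assuming negligible air resistance:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def impact_speed(drop_height_m: float) -> float:
    """Free-fall impact speed (m/s) for a drop from the given height,
    neglecting air resistance: v = sqrt(2 * g * h)."""
    return math.sqrt(2 * G * drop_height_m)

# The study's drop heights of 0.30 m and 2.23 m correspond to roughly
# 2.4 m/s and 6.6 m/s, matching the range quoted in the abstract.
print(round(impact_speed(0.30), 1))  # 2.4
print(round(impact_speed(2.23), 1))  # 6.6
```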
Incident chronic obstructive pulmonary disease associated with occupation, industry, and workplace exposures in the Health and Retirement Study
Silver SR , Alarcon WA , Li J . Am J Ind Med 2020 64 (1) 26-38 BACKGROUND: Chronic health effects from accumulated occupational exposures manifest as the workforce ages. The Health and Retirement Study (HRS), a panel survey of U.S. adults nearing/in retirement, allows assessment of associations among industry and occupation (I/O), workplace exposures, and incident chronic obstructive pulmonary disease (COPD). METHODS: The study population comprised respondents from the 1992 HRS cohort employed in 1972 or later and not diagnosed with COPD as of initial interview. We examined associations with incident COPD through 2016 and: (1) broad and selected detailed I/O, (2) workplace exposures, and (3) exposures within I/O. Given the cohort's baseline age (50-62), we calculated subhazard ratios (SHRs) for COPD accounting for competing risk of death. RESULTS: SHRs for COPD were significantly elevated for several industries: mining; blast furnaces, steelworks, rolling and finishing mills; groceries and related products; and automotive repair shops. Occupations with significantly elevated SHRs were maids and housemen; farmworkers; vehicle/mobile equipment mechanics and repair workers; material moving equipment operators; and nonconstruction laborers. Significantly elevated COPD SHRs were observed for specific I/O-exposure pairs: blast furnace/steelworks/rolling/finishing mills and asbestos; automotive repair shops and aerosol paints; farmworkers and pesticide exposures; and both material moving equipment operators and nonconstruction laborers exposed to dust and ash. CONCLUSIONS: Certain jobs and occupational exposures are associated with increased risk for developing COPD in late preretirement and during retirement. Given the disability and economic costs of COPD, these findings support focusing exposure prevention and medical monitoring resources on groups of workers at increased risk. |
Linking datasets to characterize injury and illness in Alaska's fishing industry
Syron LN , Case SL , Lee JR , Lucas DL . J Agromedicine 2020 26 (1) 31-44 Objectives: Limited research has characterized nonfatal injury/illness in Alaska's hazardous fishing industry. This study aimed to determine (a) the utility of linking datasets to conduct surveillance, and (b) injury/illness patterns during 2012-2016. Methods: Data were obtained from the Alaska Trauma Registry (ATR), Fishermen's Fund (FF), and US Coast Guard (USCG). Datasets were coded to identify patterns in injury/illness characteristics and circumstances. Probabilistic linkage methods were used to identify unique incidents that appeared in more than one dataset. Results: After linking datasets, 3,014 unique injury/illness cases were identified. By dataset, 2,365 cases appeared only in FF, 486 only in USCG, 110 only in ATR, 25 in ATR and FF, 15 in ATR and USCG, 10 in USCG and FF, and 3 in all datasets. FF mainly captured claims submitted by small, independently owned vessels in Southcentral and Southeastern Alaska. In contrast, USCG mainly captured reports from large, company-owned vessels in Western Alaska. By nature, cases were most frequently sprains, strains, and tears (27%), cuts (15%), and fractures (11%). Across fleets, injuries/illnesses most frequently resulted from contact with objects and equipment (41%), overexertion and bodily reaction (27%), and slips, trips, and falls (20%). Work processes associated with traumatic injuries were most frequently hauling gear (18%) and walking, climbing, and descending (18%). Half of all injuries were of moderate severity (53%). Conclusion: Linking datasets that capture different segments of Alaska's fishing industry provides the most comprehensive understanding of nonfatal injury/illness to date. These results, stratified by fleet and severity, will inform prevention strategies. |
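Probabilistic linkage of this kind typically scores candidate record pairs with Fellegi-Sunter-style match weights: each comparison field contributes a log-likelihood ratio depending on whether the records agree on it. A minimal sketch (the field names and m/u probabilities below are illustrative assumptions, not values from the study):

```python
import math

# Illustrative per-field probabilities: m = P(field agrees | true match),
# u = P(field agrees | non-match). These values are hypothetical.
FIELDS = {
    "date_of_incident": (0.95, 0.01),
    "vessel_name": (0.90, 0.05),
    "injury_nature": (0.80, 0.20),
}

def match_weight(agreements: dict) -> float:
    """Sum of log2 likelihood ratios over fields; higher totals indicate
    the pair is more likely the same incident recorded in two datasets."""
    total = 0.0
    for field, (m, u) in FIELDS.items():
        if agreements[field]:
            total += math.log2(m / u)
        else:
            total += math.log2((1 - m) / (1 - u))
    return total

# A pair agreeing on all three fields scores far above one agreeing on none;
# in practice a threshold on this weight decides which pairs are merged.
w_all = match_weight({f: True for f in FIELDS})
w_none = match_weight({f: False for f in FIELDS})
```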
COVID-19 and worker fatigue lessons learned and mitigation strategies
Wong I . Synergist 2020 31 (11) 20-25 The declaration of coronavirus disease 2019 (COVID-19) as a public health emergency in the United States immediately changed the way we work and live, and intensified feelings of stress and uncertainty about the future. New routines and behaviors such as adhering to stay-at-home orders, wearing masks, and social distancing, along with frequent messaging about handwashing, cleaning, and not touching our faces, are constant reminders of our new normal and have been shown to increase anxiety. Exposure to an abundance of news coverage, some of which is conflicting or has changed over time, has fueled feelings of uncertainty and fear. The closure of many businesses has led to job loss and financial instability for millions for an undetermined period. Essential service occupations, some of which might not previously have been recognized for exposure to infectious diseases, are now perceived as more dangerous due to the increased infection risk associated with working among the general public. Worries about the health and wellbeing of ourselves and our loved ones have further affected our emotional health and added to fatigue. New terms such as caution fatigue and quarantine fatigue have emerged to describe the weariness we feel about our new restrictive circumstances as a result of the COVID-19 pandemic. ... The only certain thing during these uncertain times is the need to work together to navigate uncharted territory. Recognizing that worker fatigue can be attributed to a variety of sources stemming from individual concerns, changes in work routines and schedules, and varying degrees of stress due to adversities related to COVID-19 can aid in the development of targeted, efficient mitigation strategies. A holistic approach with shared responsibility and open communication among employers and employees is needed to ensure worker safety, health, and well-being and a successful return to regular operations.
As the economy reopens, addressing changes in work hours and routines, organizational practices, and the physical and psychosocial work environment due to COVID-19 will help mitigate worker fatigue and can support healthy behaviors and practices as workers adjust to the new normal. |
Biological effects of inhaled hydraulic fracturing sand dust. IX. Summary and significance
Anderson SE , Barger M , Batchelor TP , Bowers LN , Coyle J , Cumpston A , Cumpston JL , Cumpston JB , Dey RD , Dozier AK , Fedan JS , Friend S , Hubbs AF , Jackson M , Jefferson A , Joseph P , Kan H , Kashon ML , Knepp AK , Kodali V , Krajnak K , Leonard SS , Lin G , Long C , Lukomska E , Marrocco A , Marshall N , Mc Kinney W , Morris AM , Olgun NS , Park JH , Reynolds JS , Roberts JR , Russ KA , Sager TM , Shane H , Snawder JE , Sriram K , Thompson JA , Umbright CM , Waugh S , Zheng W . Toxicol Appl Pharmacol 2020 409 115330 An investigation into the potential toxicological effects of fracking sand dusts (FSDs), collected from unconventional gas drilling sites, has been undertaken, along with characterization of their chemical and biophysical properties. Using intratracheal instillation of nine FSDs in rats, a whole-body 4-day inhalation model for one of the FSDs (FSD 8), and related in vivo and in vitro experiments, the effects of the nine FSDs on the respiratory, cardiovascular, and immune systems, brain, and blood were reported in the preceding eight tandem papers. Here, a summary is given of the key observations made in the organ systems reported in the individual studies. The major finding that inhaled FSD 8 elicits responses in extra-pulmonary organ systems is unexpected, as is the observation that the pulmonary effects of inhaled FSD 8 are attenuated relative to forms of crystalline silica more frequently used in animal studies, i.e., MIN-U-SIL®. An attempt is made to understand the basis for the extra-pulmonary toxicity and comparatively attenuated pulmonary toxicity of FSD 8. |
Biological effects of inhaled hydraulic fracturing sand dust. I. Scope of the investigation
Fedan JS . Toxicol Appl Pharmacol 2020 409 115329 Hydraulic fracturing ("fracking") is a process in which subterranean natural gas-laden rock is fractured under pressure to enhance retrieval of gas. Sand (a "proppant") is present in the fracking fluid pumped down the well bore to stabilize the fissures and facilitate gas flow. The manipulation of sand at the well site creates respirable dust (fracking sand dust, FSD) to which workers are exposed. Because workplace exposures to FSD have exceeded exposure limits set by OSHA, a physico-chemical characterization of FSD, along with a comprehensive investigation of the potential early adverse effects of FSDs on organ function and biomarkers, has been conducted using a rat model and related in vivo and in vitro experiments involving the respiratory, cardiovascular, and immune systems, kidney, and brain. An underlying theme of the overall study was: to what degree do the health effects of inhaled FSD resemble those previously observed after crystalline silica dust inhalation? In short-term studies, FSD was found to be less bioactive than MIN-U-SIL® in the lungs. A second theme was: are the biological effects of FSD restricted to the lungs? Bioactivity of FSD was observed in all examined organ systems. Our findings indicate that, in many respects, the physical and chemical properties, and the short-term biological effects, of the FSDs share many similarities with one another but have little in common with crystalline silica dust. |
A laboratory investigation of underside shield sprays to improve dust control of longwall water spray systems
Klima SS , Reed WR , Driscoll JS , Mazzella AL . Min Metall Explor 2020 38 (1) 593-602 Researchers at the National Institute for Occupational Safety and Health (NIOSH) performed laboratory testing to improve longwall dust control by examining the use of underside shield sprays in conjunction with the longwall directional spray system. In a field survey of longwall operations, NIOSH researchers observed dust clouds created by the fracturing and spalling of coal immediately upwind of the headgate drum that migrated into the walkway, exposing mining personnel to respirable coal dust. The goal of this research was to create an effective traveling water curtain to prevent this dust from reaching the personnel walkway by redirecting it toward the longwall face. The location, orientation, and pressure of the water sprays were the primary testing parameters examined for minimizing dust exposure in the walkway. Laboratory testing indicates that the use of underside shield sprays on the longwall face may be beneficial toward reducing respirable dust exposure for mining personnel. |
Evaluation of carbon monoxide and smoke sensors at a low ventilation velocity
Rowland JH III , Yuan L , Thomas RA . Min Metall Explor 2020 38 (1) 603-608 This paper presents the results of large-scale fire experiments evaluating the performance of carbon monoxide (CO) and smoke sensors at low ventilation velocities. Experiments using three different combustibles—conveyor belt, coal, and diesel fuel—were conducted in the Experimental Mine at the National Institute for Occupational Safety and Health (NIOSH) Bruceton Research Facility. A total of eight sensor stations were located downstream of the fire, with each station containing CO, smoke, carbon dioxide, oxygen, humidity, barometric pressure, and temperature sensors, plus two airflow sensors. The airflow velocity ranged from 0.22 to 0.26 m/s (44 to 51 fpm) in the tests. The response times were recorded for the CO and smoke sensors at each sensor station when smoke and gaseous products of combustion of each burning combustible reached the station. The response times of the CO sensors were used to determine the appropriate sensor spacing in the belt entry with a low air velocity. The performance of the smoke sensor was found to be affected by the high humidity in the experiments. These results on proper sensor selection and the determination of sensor spacing at low ventilation velocities can help ensure sufficient early fire warning for underground workers, thereby improving the health and safety of miners. |
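Sensor spacing at low ventilation velocity trades off directly against warning time, since combustion products reach a downstream sensor no faster than the airflow carries them. A back-of-the-envelope estimate, assuming products are transported at the ventilation velocity (the 30 m spacing below is hypothetical, not a value from the study):

```python
def transit_time_s(sensor_spacing_m: float, air_velocity_m_s: float) -> float:
    """Time for combustion products to drift from a fire to the next
    downstream sensor, assuming transport at the ventilation velocity."""
    return sensor_spacing_m / air_velocity_m_s

# At the low velocities tested (0.22-0.26 m/s), a hypothetical 30 m
# spacing implies roughly two minutes before products reach the sensor.
print(transit_time_s(30.0, 0.25))  # 120.0
```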
Biological effects of inhaled hydraulic fracturing sand dust VII. Neuroinflammation and altered synaptic protein expression
Sriram K , Lin GX , Jefferson AM , McKinney W , Jackson MC , Cumpston A , Cumpston JL , Cumpston JB , Leonard HD , Kashon M , Fedan JS . Toxicol Appl Pharmacol 2020 409 115300 Hydraulic fracturing (fracking) is a process used to recover oil and gas from shale rock formations during unconventional drilling. Pressurized liquids containing water and sand (proppant) are used to fracture the oil- and natural gas-laden rock. The transportation and handling of proppant at well sites generate dust aerosols; thus, there is concern about worker exposure to such fracking sand dusts (FSD) by inhalation. FSD are generally composed of respirable crystalline silica and other minerals native to the geological source of the proppant material. Field investigations by NIOSH suggest that the levels of respirable crystalline silica at well sites can exceed the permissible exposure limits. Thus, from an occupational safety perspective, it is important to evaluate the potential toxicological effects of FSD, including any neurological risks. Here, we report that acute inhalation exposure of rats to one FSD, i.e., FSD 8, elicited neuroinflammation, altered the expression of blood-brain barrier-related markers, and caused glial changes in the olfactory bulb, hippocampus, and cerebellum. An intriguing observation was the persistent reduction of synaptophysin 1 and synaptotagmin 1 proteins in the cerebellum, indicative of synaptic disruption and/or injury. While our initial hazard identification studies suggest a likely neural risk, more research is necessary to determine whether such molecular aberrations will progressively culminate in neuropathology/neurodegeneration leading to behavioral and/or functional deficits. |
Impact of Plasmodium falciparum gene deletions on malaria rapid diagnostic test performance.
Gatton ML , Chaudhry A , Glenn J , Wilson S , Ah Y , Kong A , Ord RL , Rees-Channer RR , Chiodini P , Incardona S , Cheng Q , Aidoo M , Cunningham J . Malar J 2020 19 (1) 392 BACKGROUND: Malaria rapid diagnostic tests (RDTs) have greatly improved access to diagnosis in endemic countries. Most RDTs detect Plasmodium falciparum histidine-rich protein 2 (HRP2), but their sensitivity is seriously threatened by the emergence of pfhrp2-deleted parasites. RDTs detecting P. falciparum or pan-lactate dehydrogenase (Pf- or pan-LDH) provide alternatives. The objective of this study was to systematically assess the performance of malaria RDTs against well-characterized pfhrp2-deleted P. falciparum parasites. METHODS: Thirty-two RDTs were tested against 100 wild-type clinical isolates (200 parasites/µL), and 40 samples from 10 culture-adapted and clinical isolates of pfhrp2-deleted parasites. Wild-type and pfhrp2-deleted parasites had comparable Pf-LDH concentrations. Pf-LDH-detecting RDTs were also tested against 18 clinical isolates at higher density (2,000 parasites/µL) lacking both pfhrp2 and pfhrp3. RESULTS: RDT positivity against pfhrp2-deleted parasites was highest (> 94%) for the two pan-LDH-only RDTs. The positivity rate for the nine Pf-LDH-detecting RDTs varied widely, with similar median positivity between double-deleted (pfhrp2/3 negative; 63.9%) and single-deleted (pfhrp2-negative/pfhrp3-positive; 59.1%) parasites, both lower than against wild-type P. falciparum (93.8%). Median positivity for HRP2-detecting RDTs against 22 single-deleted parasites was 69.9 and 35.2% for HRP2-only and HRP2-combination RDTs, respectively, compared to 96.0 and 92.5% for wild-type parasites. Eight of nine Pf-LDH RDTs detected all clinical, double-deleted samples at 2,000 parasites/µL. CONCLUSIONS: The pan-LDH-only RDTs evaluated performed well. Performance of Pf-LDH-detecting RDTs against wild-type P. falciparum does not necessarily predict performance against pfhrp2-deleted parasites. 
Furthermore, many, but not all, HRP2-based RDTs detect pfhrp2-negative/pfhrp3-positive samples, with implications for the HRP2-based RDT screening approach for detection and surveillance of HRP2-negative parasites. |
Community-based surveys for Plasmodium falciparum pfhrp2 and pfhrp3 gene deletions in selected regions of mainland Tanzania.
Bakari C , Jones S , Subramaniam G , Mandara CI , Chiduo MG , Rumisha S , Chacky F , Molteni F , Mandike R , Mkude S , Njau R , Herman C , Nace DP , Mohamed A , Udhayakumar V , Kibet CK , Nyanjom SG , Rogier E , Ishengoma DS . Malar J 2020 19 (1) 391 BACKGROUND: Histidine-rich protein 2 (HRP2)-based malaria rapid diagnostic tests (RDTs) are effective and widely used for the detection of wild-type Plasmodium falciparum infections. Although recent studies have reported false negative HRP2 RDT results due to pfhrp2 and pfhrp3 gene deletions in different countries, there is a paucity of data on the deletions of these genes in Tanzania. METHODS: A community-based cross-sectional survey was conducted between July and November 2017 in four regions: Geita, Kigoma, Mtwara and Ruvuma. All participants had microscopy and RDT performed in the field and provided a blood sample for laboratory multiplex antigen detection (for Plasmodium lactate dehydrogenase, aldolase, and P. falciparum HRP2). Samples showing RDT false negativity or aberrant relationship of HRP2 to pan-Plasmodium antigens were genotyped to detect the presence/absence of pfhrp2/3 genes. RESULTS: Of all samples screened by the multiplex antigen assay (n = 7543), 2417 (32.0%) were positive for any Plasmodium antigens while 5126 (68.0%) were negative for all antigens. The vast majority of the antigen positive samples contained HRP2 (2411, 99.8%), but 6 (0.2%) had only pLDH and/or aldolase without HRP2. Overall, 13 samples had an atypical relationship between a pan-Plasmodium antigen and HRP2, but were positive by PCR. An additional 16 samples with negative HRP2 RDT results but P. falciparum positive by microscopy were also chosen for pfhrp2/3 genotyping. The summation of false negative RDT results and laboratory antigen results provided 35 total samples with confirmed P. falciparum DNA for pfhrp2/3 genotyping. 
Of the 35 samples, 4 (11.4%) failed to consistently amplify the positive control genes pfmsp1 and pfmsp2 and were excluded from the analysis. The pfhrp2 and pfhrp3 genes were successfully amplified in the remaining 31 (88.6%) samples, confirming an absence of deletions in these genes. CONCLUSIONS: This study provides evidence that P. falciparum parasites in the study area have no deletions of either the pfhrp2 or pfhrp3 gene. Although single-gene deletions could have been missed by the multiplex antigen assay, the findings support the continued use of HRP2-based RDTs in Tanzania for routine malaria diagnosis. Surveillance is needed to monitor the status of pfhrp2 and/or pfhrp3 deletions in the future. |
Continued low efficacy of artemether-lumefantrine in Angola, 2019.
Dimbu PR , Horth R , Cândido ALM , Ferreira CM , Caquece F , Garcia LEA , André K , Pembele G , Jandondo D , Bondo BJ , Nieto Andrade B , Labuda S , Ponce de León G , Kelley J , Patel D , Svigel SS , Talundzic E , Lucchi N , Morais JFM , Fortes F , Martins JF , Pluciński MM . Antimicrob Agents Chemother 2020 65 (2) BACKGROUND: Biennial therapeutic efficacy monitoring is a crucial activity for ensuring the efficacy of currently used artemisinin-based combination therapy in Angola. METHODS: Children with acute uncomplicated P. falciparum infection in sentinel sites in Benguela, Zaire, and Lunda Sul Provinces were treated with artemether-lumefantrine (AL) or artesunate-amodiaquine (ASAQ) and followed for 28 days to assess clinical and parasitological response. Molecular correction was performed using seven microsatellite markers. Samples from treatment failures were genotyped for the pfk13, pfcrt, and pfmdr1 genes. RESULTS: Day 3 clearance rates were ≥95% in all arms. Uncorrected Day-28 Kaplan-Meier efficacy estimates ranged from 84.2 to 90.1% for the AL arms and 84.7 to 100% for the ASAQ arms. Corrected Day-28 estimates were 87.6% (95% confidence interval [CI]: 81-95%) for the AL arm in Lunda Sul, 92.2% (95% CI: 87-98%) for AL in Zaire, 95.6% (95% CI: 91-100%) for ASAQ in Zaire, 98.4% (95% CI: 96-100%) for AL in Benguela, and 100% for ASAQ in Benguela and Lunda Sul. All 103 analyzed samples had wild-type pfk13 sequences. The 76T pfcrt allele was found in most (92%, 11/12) ASAQ late-failure samples but only 16% (4/25) of AL failure samples. The N86 pfmdr1 allele was found in 97% (34/35) of treatment failures. CONCLUSION: AL efficacy in Lunda Sul was below the 90% World Health Organization threshold, the third time in four rounds that this threshold was crossed for an AL arm in Angola. In contrast, observed ASAQ efficacy has not fallen below 95% to date in Angola, including in this latest round. |
Reduced long-lasting insecticidal net efficacy and pyrethroid insecticide resistance are associated with over-expression of CYP6P4, CYP6P3 and CYP6Z1 in populations of Anopheles coluzzii from South-East Côte d'Ivoire.
Meiwald A , Clark E , Kristan M , Edi C , Jeffries CL , Pelloquin B , Irish SR , Walker T , Messenger LA . J Infect Dis 2020 225 (8) 1424-1434 BACKGROUND: Resistance to major public health insecticides in Côte d'Ivoire has intensified and now threatens the long-term effectiveness of malaria vector control interventions. METHODS: This study evaluated the bioefficacy of conventional and next-generation long-lasting insecticidal nets (LLINs), determined resistance profiles, and characterized molecular and metabolic mechanisms in wild Anopheles coluzzii from South-East Côte d'Ivoire in 2019. RESULTS: Phenotypic resistance was intense: more than 25% of mosquitoes survived exposure to ten times the doses of pyrethroids required to kill susceptible populations. Similarly, 24-hour mortality with deltamethrin-only LLINs was very low and not significantly different from that with an untreated net. Sub-lethal pyrethroid exposure did not induce significant delayed vector mortality 72 hours later. In contrast, LLINs containing the synergist piperonyl butoxide (PBO), or the new insecticides clothianidin and chlorfenapyr, were highly toxic to An. coluzzii. Pyrethroid-susceptible An. coluzzii were significantly more likely to be infected with malaria than those that survived insecticidal exposure. Pyrethroid resistance was associated with significant over-expression of CYP6P4, CYP6Z1, and CYP6P3. CONCLUSIONS: Study findings raise concerns regarding the operational failure of standard LLINs and support the urgent deployment of vector control interventions incorporating PBO, chlorfenapyr, or clothianidin in areas of high resistance intensity in Côte d'Ivoire. |
Association of malnutrition with subsequent malaria parasitemia among children younger than three years in Kenya: A secondary data analysis of the Asembo Bay Cohort Study
Donovan CV , McElroy P , Adair L , Pence BW , Oloo AJ , Lal A , Bloland P , Nahlen B , Juliano JJ , Meshnick S . Am J Trop Med Hyg 2020 104 (1) 243-254 Malaria and malnutrition remain primary causes of morbidity and mortality among children younger than 5 years in Africa. Studies investigating the association between malnutrition and subsequent malaria outcomes are inconsistent. We studied the effects of malnutrition on incidence and prevalence of malaria parasitemia in data from a cohort studied in the 1990s. Data came from the Asembo Bay cohort study, which collected malaria and health information on children from 1992 to 1996 in western Kenya. Infants were enrolled at birth and followed up until loss to follow-up, death, end of study, or 5 years old. Anthropometric measures and blood specimens were obtained monthly. Nutritional exposures included categorized Z-scores for height-for-age, weight-for-age, and weight-for-height. Febrile parasitemia and afebrile parasitemia were assessed with thick and thin blood films. Multiply imputed and weighted multinomial generalized estimating equation models estimated odds ratios (OR) for the association between exposures and outcomes. The sample included 1,182 children aged 0-30 months who contributed 18,028 follow-up visits. There was no significant association between malnutrition and either incident febrile parasitemia or prevalent febrile parasitemia. Prevalence ORs for afebrile parasitemia increased from 1.07 (95% CI: 0.89, 1.29) to 1.35 (1.03, 1.76) as stunting severity increased from mild to severe, and from 1.16 (1.02, 1.33) to 1.35 (1.09, 1.66) as underweight increased from mild to moderate. Stunting and underweight did not show a significant association with subsequent febrile parasitemia infections, but they did show a modest association with subsequent afebrile parasitemia. Consideration should be given to testing malnourished children for malaria, even if they present without fever. |
Cost-effectiveness of intermittent preventive treatment with dihydroartemisinin-piperaquine for malaria during pregnancy: an analysis using efficacy results from Uganda and Kenya, and pooled data
Fernandes S , Were V , Gutman J , Dorsey G , Kakuru A , Desai M , Kariuki S , Kamya MR , Ter Kuile FO , Hanson K . Lancet Glob Health 2020 8 (12) e1512-e1523 BACKGROUND: Prevention of malaria infection during pregnancy in HIV-negative women currently relies on the use of long-lasting insecticidal nets together with intermittent preventive treatment in pregnancy with sulfadoxine-pyrimethamine (IPTp-SP). Increasing sulfadoxine-pyrimethamine resistance in Africa threatens current prevention of malaria during pregnancy. Thus, a replacement for IPTp-SP is urgently needed, especially for locations with high sulfadoxine-pyrimethamine resistance. Dihydroartemisinin-piperaquine is a promising candidate. We aimed to estimate the cost-effectiveness of intermittent preventive treatment in pregnancy with dihydroartemisinin-piperaquine (IPTp-DP) versus IPTp-SP to prevent clinical malaria infection (and its sequelae) during pregnancy. METHODS: We did a cost-effectiveness analysis using meta-analysis and individual trial results from three clinical trials done in Kenya and Uganda. We calculated disability-adjusted life-years (DALYs) arising from stillbirths, neonatal death, low birthweight, mild and moderate maternal anaemia, and clinical malaria infection, associated with malaria during pregnancy. Cost estimates were obtained from data collected in observational studies, health-facility costings, and from international drug procurement databases. The cost-effectiveness analyses were done from a health-care provider perspective using a decision tree model with a lifetime horizon. Deterministic and probabilistic sensitivity analyses using appropriate parameter ranges and distributions were also done. Results are presented as the incremental cost per DALY averted and the likelihood that an intervention is cost-effective for different cost-effectiveness thresholds. 
FINDINGS: Compared with three doses of sulfadoxine-pyrimethamine, three doses of dihydroartemisinin-piperaquine, delivered to a hypothetical cohort of 1000 pregnant women, averted 892 DALYs (95% credibility interval 274 to 1517) at an incremental cost of US$7051 (2653 to 13 038), generating an incremental cost-effectiveness ratio (ICER) of $8 (2 to 29) per DALY averted. Compared with monthly doses of sulfadoxine-pyrimethamine, monthly doses of dihydroartemisinin-piperaquine averted 534 DALYs (-141 to 1233) at a cost of $13 427 (4994 to 22 895), resulting in an ICER of $25 (-151 to 224) per DALY averted. Both results were highly robust to most or all variations in the deterministic sensitivity analysis. INTERPRETATION: Our findings suggest that among HIV-negative pregnant women with high uptake of long-lasting insecticidal nets, IPTp-DP is cost-effective in areas with high malaria transmission and high sulfadoxine-pyrimethamine resistance. These data provide a comprehensive overview of the current evidence on the cost-effectiveness of IPTp-DP. Nevertheless, before a policy change is advocated, we recommend further research into the effectiveness and costs of different regimens of IPTp-DP in settings with different underlying sulfadoxine-pyrimethamine resistance. FUNDING: Malaria in Pregnancy Consortium, which is funded through a grant from the Bill & Melinda Gates Foundation to the Liverpool School of Tropical Medicine. |
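The headline ICERs above follow directly from the quoted point estimates (incremental cost divided by DALYs averted). A minimal sketch using only the figures in the abstract, not the authors' full decision-tree model:

```python
def icer(incremental_cost, dalys_averted):
    """Incremental cost-effectiveness ratio: extra cost per DALY averted."""
    return incremental_cost / dalys_averted

# Three doses of IPTp-DP vs three doses of IPTp-SP, per cohort of 1000 women
print(round(icer(7051, 892)))    # 8 ($ per DALY averted)

# Monthly IPTp-DP vs monthly IPTp-SP
print(round(icer(13427, 534)))   # 25
```

Note that the credibility intervals reported around these ratios come from probabilistic sensitivity analysis and cannot be reproduced from the point estimates alone.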
The risk of Plasmodium vivax parasitaemia after P. falciparum malaria: An individual patient data meta-analysis from the WorldWide Antimalarial Resistance Network
Hossain MS , Commons RJ , Douglas NM , Thriemer K , Alemayehu BH , Amaratunga C , Anvikar AR , Ashley EA , Asih PBS , Carrara VI , Lon C , D'Alessandro U , Davis TME , Dondorp AM , Edstein MD , Fairhurst RM , Ferreira MU , Hwang J , Janssens B , Karunajeewa H , Kiechel JR , Ladeia-Andrade S , Laman M , Mayxay M , McGready R , Moore BR , Mueller I , Newton PN , Thuy-Nhien NT , Noedl H , Nosten F , Phyo AP , Poespoprodjo JR , Saunders DL , Smithuis F , Spring MD , Stepniewska K , Suon S , Suputtamongkol Y , Syafruddin D , Tran HT , Valecha N , Van Herp M , Van Vugt M , White NJ , Guerin PJ , Simpson JA , Price RN . PLoS Med 2020 17 (11) e1003393 BACKGROUND: There is a high risk of Plasmodium vivax parasitaemia following treatment of falciparum malaria. Our study aimed to quantify this risk and the associated determinants using an individual patient data meta-analysis in order to identify populations in which a policy of universal radical cure, combining artemisinin-based combination therapy (ACT) with a hypnozoitocidal antimalarial drug, would be beneficial. METHODS AND FINDINGS: A systematic review of Medline, Embase, Web of Science, and the Cochrane Database of Systematic Reviews identified efficacy studies of uncomplicated falciparum malaria treated with ACT that were undertaken in regions coendemic for P. vivax between 1 January 1960 and 5 January 2018. Data from eligible studies were pooled using standardised methodology. The risk of P. vivax parasitaemia at days 42 and 63 and associated risk factors were investigated by multivariable Cox regression analyses. Study quality was assessed using a tool developed by the Joanna Briggs Institute. The study was registered in the International Prospective Register of Systematic Reviews (PROSPERO: CRD42018097400). In total, 42 studies enrolling 15,341 patients were included in the analysis, including 30 randomised controlled trials and 12 cohort studies. Overall, 14,146 (92.2%) patients had P. 
falciparum monoinfection and 1,195 (7.8%) mixed infection with P. falciparum and P. vivax. The median age was 17.0 years (interquartile range [IQR] = 9.0-29.0 years; range = 0-80 years), with 1,584 (10.3%) patients younger than 5 years. 2,711 (17.7%) patients were treated with artemether-lumefantrine (AL, 13 studies), 651 (4.2%) with artesunate-amodiaquine (AA, 6 studies), 7,340 (47.8%) with artesunate-mefloquine (AM, 25 studies), and 4,639 (30.2%) with dihydroartemisinin-piperaquine (DP, 16 studies). 14,537 patients (94.8%) were enrolled from the Asia-Pacific region, 684 (4.5%) from the Americas, and 120 (0.8%) from Africa. At day 42, the cumulative risk of vivax parasitaemia following treatment of P. falciparum was 31.1% (95% CI 28.9-33.4) after AL, 14.1% (95% CI 10.8-18.3) after AA, 7.4% (95% CI 6.7-8.1) after AM, and 4.5% (95% CI 3.9-5.3) after DP. By day 63, the risks had risen to 39.9% (95% CI 36.6-43.3), 42.4% (95% CI 34.7-51.2), 22.8% (95% CI 21.2-24.4), and 12.8% (95% CI 11.4-14.5), respectively. In multivariable analyses, the highest rate of P. vivax parasitaemia over 42 days of follow-up was in patients residing in areas of short relapse periodicity (adjusted hazard ratio [AHR] = 6.2, 95% CI 2.0-19.5; p = 0.002); patients treated with AL (AHR = 6.2, 95% CI 4.6-8.5; p < 0.001), AA (AHR = 2.3, 95% CI 1.4-3.7; p = 0.001), or AM (AHR = 1.4, 95% CI 1.0-1.9; p = 0.028) compared with DP; and patients who did not clear their initial parasitaemia within 2 days (AHR = 1.8, 95% CI 1.4-2.3; p < 0.001). The analysis was limited by heterogeneity between study populations and lack of data from very low transmission settings. Study quality was high. CONCLUSIONS: In this meta-analysis, we found a high risk of P. vivax parasitaemia after treatment of P. falciparum malaria that varied significantly between studies. These P. 
vivax infections are likely attributable to relapses that could be prevented with radical cure including a hypnozoitocidal agent; however, the benefits of such a novel strategy will vary considerably between geographical areas. |
Population-based prevalence of Chlamydia trachomatis infection and antibodies in four districts with varying levels of trachoma endemicity in Amhara, Ethiopia
Nash SD , Astale T , Nute AW , Bethea D , Chernet A , Sata E , Zerihun M , Gessese D , Ayenew G , Ayele Z , Melak B , Haile M , Zeru T , Tadesse Z , Arnold BF , Callahan EK , Martin DL . Am J Trop Med Hyg 2020 104 (1) 207-215 The Trachoma Control Program in Amhara region, Ethiopia, scaled up the surgery, antibiotics, facial cleanliness, and environmental improvement (SAFE) strategy in all districts starting in 2007. Despite these efforts, many districts still require additional years of SAFE. In 2017, four districts were selected for the assessment of antibody responses against Chlamydia trachomatis antigens and C. trachomatis infection to better understand transmission. Districts with differing endemicity were chosen: one with a previous trachomatous inflammation-follicular (TF) prevalence of >30% (Andabet), one with a prevalence between 10% and 29.9% (Dera), one with a prevalence between 5% and 10% (Woreta town), and one with a previous TF prevalence of <5% (Alefa) that had not received antibiotic intervention for 2 years. Survey teams assessed trachoma clinical signs and took conjunctival swabs and dried blood spots (DBS) to measure infection and antibody responses. TF prevalence among children aged 1-9 years was 37.0% (95% CI: 31.1-43.3) for Andabet, 14.7% (95% CI: 10.0-20.5) for Dera, and <5% for Woreta town and Alefa. Chlamydia trachomatis infection was detected only in Andabet (11.3%). Within these districts, 2,195 children provided DBS. The prevalence of antibody responses to the antigen Pgp3 was 36.9% (95% CI: 29.0-45.6%) for Andabet, 11.3% (95% CI: 5.9-20.6%) for Dera, and <5% for Woreta town and Alefa. The seroconversion rate for Pgp3 in Andabet was 0.094 (95% CI: 0.069-0.128) events per year. In Andabet district, where SAFE implementation has occurred for 11 years, the antibody data support the finding of persistently high levels of trachoma transmission. |
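The reported seroconversion rate and seroprevalence are internally consistent under a simple catalytic model, the standard way serosurveys link an annual seroconversion rate (SCR) to age-specific seroprevalence. The sketch below assumes no seroreversion and a midpoint age of about 5 years for the 1-9-year-olds; both are illustrative assumptions, not details stated in the abstract (the authors fit SCR to the full age-seroprevalence curve):

```python
import math

def seroprevalence(scr, age_years):
    """Simple catalytic model without seroreversion: p(a) = 1 - exp(-SCR * a)."""
    return 1 - math.exp(-scr * age_years)

# SCR of 0.094 events/year at an assumed midpoint age of ~5 years gives
# roughly the 36.9% Pgp3 seroprevalence reported for Andabet.
print(round(seroprevalence(0.094, 5), 3))  # ~0.375
```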
Developing the Active Communities Tool to implement the Community Guide's Built Environment Recommendation for Increasing Physical Activity
Evenson KR , Porter AK , Day KL , McPhillips-Tangum C , Harris KE , Kochtitzky CS , Bors P . Prev Chronic Dis 2020 17 E142 Physical activity is higher in communities that include supportive features for walking and bicycling. In 2016, the Community Preventive Services Task Force released a systematic review of built environment approaches to increase physical activity. The results of the review recommended approaches that combine interventions to improve pedestrian and bicycle transportation systems with land use and environmental design strategies. Because the recommendation was multifaceted, the Centers for Disease Control and Prevention determined that communities could benefit from an assessment tool to address the breadth of the Task Force recommendations. The purpose of this article is to describe the systematic approach used to develop the Active Communities Tool. First, we created and refined a logic model and community theory of change for tool development. Second, we reviewed existing community-based tools and abstracted key elements (item domains, advantages, disadvantages, updates, costs, permissions to use, and psychometrics) from 42 tools. The review indicated that no tool encompassed the breadth of the Community Guide recommendations for communities. Third, we developed a new tool and pilot tested its use with 9 diverse teams with public health and planning expertise. Final revisions followed from pilot team and expert input. The Active Communities Tool comprises 6 modules addressing all 8 interventions recommended by the Task Force. The tool is designed to help cross-sector teams create an action plan for improving community built environments that promote physical activity and may help to monitor progress toward achieving community conditions known to promote physical activity. |
Knowledge of the Adult and Youth 2008 Physical Activity Guidelines for Americans
Hyde ET , Omura JD , Watson KB , Fulton JE , Carlson SA . J Phys Act Health 2019 16 (8) 618-622 BACKGROUND: To estimate the proportion of adults and parents in the United States with knowledge of the adult aerobic and youth physical activity guidelines, respectively. METHODS: Data were analyzed from a national sample of adults in the 2017 ConsumerStyles survey. Prevalence of knowledge of the adult aerobic guideline (ie, 150 min/wk of moderate-intensity activity) was estimated among all respondents (n = 3910) and of the youth guideline (ie, 60 min/d of physical activity on 7 d/wk) among parents (n = 1288). Odds ratios were estimated using logistic regression models adjusting for demographic characteristics. RESULTS: Overall, 2.5% (95% confidence interval, 2.0-3.1) of adults and 23.0% (95% confidence interval, 20.5-25.7) of parents were knowledgeable of the adult aerobic and youth guidelines, respectively. After adjustment, odds of knowledge of the adult guideline differed significantly by sex and physical activity level, whereas knowledge of the youth guideline differed by parental education level. CONCLUSIONS: Despite the release of the 2008 Physical Activity Guidelines for Americans nearly a decade ago, most US adults and parents lack knowledge of the adult aerobic and youth physical activity guidelines. Effective communication strategies may help raise awareness of current and future editions of national guidelines for physical activity. |
Reading between the lines: A qualitative case study of national public health institute functions and attributes in the Joint External Evaluation
Clemente J , Rhee S , Miller B , Bronner E , Whitney E , Bratton S , Carnevale C . J Public Health Afr 2020 11 (1) 1329 National Public Health Institutes (NPHIs) are national-level institutions that can lead and coordinate a country's public health system. The Africa Centres for Disease Control and Prevention (Africa CDC) considers NPHI development critical to strengthening public health systems in Africa. This paper describes how Joint External Evaluation (JEE) reports demonstrate the role NPHIs can play in supporting the goals of International Health Regulations (IHR) compliance and global health security. This study is a secondary document-based qualitative analysis of JEE reports from 11 countries in the WHO AFRO region (Botswana, Ethiopia, Liberia, Mozambique, Namibia, Nigeria, Rwanda, Sierra Leone, South Africa, Uganda, and Zambia). Researchers found three distinct thematic areas: i) core public health functions; ii) governance; and iii) coordination, collaboration, and communication. These themes and their interlinkages, both in pairs and across all three, were important in demonstrating the roles that NPHIs could play in strengthening health systems. The data suggest that NPHIs, though not always explicitly mentioned in the reports, may have a vital role in strengthening health systems across Africa and in achieving their governments' goals of IHR compliance. |
Zambia field epidemiology training program: strengthening health security through workforce development
Kumar R , Kateule E , Sinyange N , Malambo W , Kayeye S , Chizema E , Chongwe G , Minor P , Kapina M , Baggett HC , Yard E , Mukonka V . Pan Afr Med J 2020 36 323 The Zambia Field Epidemiology Training Program (ZFETP) was established by the Ministry of Health (MoH) during 2014, in order to increase the number of trained field epidemiologists who can investigate outbreaks, strengthen disease surveillance, and support data-driven decision making. We describe the ZFETP's approach to public health workforce development and health security strengthening, key milestones five years after program launch, and recommendations to ensure program sustainability. Program description: ZFETP was established as a tripartite arrangement between the Zambia MoH, the University of Zambia School of Public Health, and the U.S. Centers for Disease Control and Prevention. The program runs two tiers: Advanced and Frontline. To date, ZFETP has enrolled three FETP-Advanced cohorts (training 24 residents) and four Frontline cohorts (training 71 trainees). In 2016, ZFETP moved organizationally to the newly established Zambia National Public Health Institute (ZNPHI). This re-positioning raised the program's profile by providing residents with increased opportunities to lead high-profile outbreak investigations and analyze national surveillance data, achievements that were recognized on a national stage. These successes attracted investment from the Government of the Republic of Zambia (GRZ) and donors, thus accelerating field epidemiology workforce capacity development in Zambia. In its first five years, ZFETP achieved early success due in part to commitment from the GRZ and to its organizational positioning within the newly formed ZNPHI, which together have catalyzed ZFETP's institutionalization. During the next five years, ZFETP seeks to sustain this momentum by expanding training in both tiers, in order to accelerate the professional development of field epidemiologists at all levels of the public health system. |
Patient and pharmacist perspectives on pharmacist-prescribed contraception: a systematic review
Eckhaus LM , Ti AJ , Curtis KM , Stewart-Lynch AL , Whiteman MK . Contraception 2020 103 (2) 66-74 OBJECTIVE: Increasingly, states authorize pharmacists to prescribe hormonal contraception to patients without a prescription from another healthcare provider. The purpose of this review is to investigate pharmacist and patient perspectives on pharmacist-prescribed contraception in the United States. STUDY DESIGN: We searched Medline, Embase, PsycInfo, CINAHL, Scopus, and the Cochrane Library from inception through July 10, 2019. We included qualitative and mixed-methods studies, quantitative surveys, observational studies, and randomized trials in the United States. Risk of bias was assessed using tools for quantitative and qualitative studies. RESULTS: Fifteen studies met inclusion criteria, including studies on pharmacists and student pharmacists (n=9), patients (n=5), and both (n=1). Study samples ranged from local to national. Studies had moderate to high risk of bias, primarily due to low response rates and lack of validated instruments. Most pharmacists (57-96%) across four studies were interested in participating in pharmacist-prescribed contraception services. Among patients, 63-97% across three studies supported pharmacist-prescribed contraception, and 38-68% across four studies intended to participate in these services. At least half of pharmacists across four studies felt comfortable prescribing contraception, though pharmacists identified additional training needs. Pharmacists and patients identified several reasons for interest in pharmacist-prescribed contraception services, including increasing patient access, reducing unintended pregnancies, and offering professional development for pharmacists. They also identified barriers, including payment, time and resource constraints, liability, and patient health concerns. 
CONCLUSIONS: Most pharmacists and patients across 15 studies were interested in expanded access to contraception through pharmacist-prescribed contraception. Findings on facilitators and barriers may inform implementation efforts. IMPLICATIONS: Pharmacist-prescribed contraception is a strategy to expand patient access to contraception. Reducing barriers to implementation could improve participation among pharmacists and patients. |
Prevalence of home births and associated risk profile and maternal characteristics, 2016-2018
Goyal S , Kortsmit K , Cox S , D'Angelo DV , Romero L , Henderson ZT , Barfield WD . Obstet Gynecol 2020 136 (6) 1195-1203 OBJECTIVE: To estimate the prevalence of pregnancies that meet the low-risk criteria for planned home births and describe geographic and maternal characteristics of home births compared with hospital births. METHODS: Data from the 2016-2018 Pregnancy Risk Assessment Monitoring System (PRAMS), a survey among women with recent live births, and linked birth certificate variables were used to calculate the prevalence of home births that were considered low-risk. We defined low-risk pregnancy as a term (between 37 and 42 weeks of gestation), singleton gestation with a birth weight within the 10th-90th percentile for gestational age (as a proxy for estimated fetal size appropriate for gestational age), without prepregnancy or gestational diabetes or hypertension, and no vaginal birth after cesarean (VBAC). We also calculated the prevalence of home and hospital births by site and maternal characteristics. Weighted prevalence estimates are presented with 95% CIs to identify differences. RESULTS: The prevalence of home births was 1.1% (unweighted n=1,034), ranging from 0.1% (Alabama) to 2.6% (Montana); 64.9% of the pregnancies were low-risk. Among the 35.1% high-risk home births, 39.5% of neonates were large for gestational age, 20.5% of neonates were small for gestational age, 17.1% of the women had diabetes, 16.9% of the women had hypertension, 10.6% of the deliveries were VBACs, and 10.1% of the deliveries were preterm. A significantly higher percentage of women with home births than hospital births were non-Hispanic White (83.9% vs 56.5%), aged 35 years or older (24.0% vs 18.1%), with less than a high school-level of education (24.6% vs 12.2%), and reported no health insurance (27.0% vs 1.9%). 
A significantly lower percentage of women with home births than hospital births initiated prenatal visits in the first trimester (66.9% vs 87.1%), attended a postpartum visit (80.1% vs 90.0%), and most often laid their infants on their backs for sleep (59.3% vs 79.5%). CONCLUSIONS: Understanding the risk profile, geographic distribution, and characteristics of women with home births can guide efforts around safe birthing practices. |
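The study's low-risk definition reads naturally as a boolean predicate over birth-certificate variables. A sketch with illustrative (hypothetical) field names, assuming birth weight percentile has already been computed for gestational age:

```python
def is_low_risk(gestational_weeks, singleton, birthweight_percentile,
                diabetes, hypertension, vbac):
    """Low-risk pregnancy per the study definition; field names are illustrative."""
    return (37 <= gestational_weeks <= 42           # term delivery
            and singleton                           # singleton gestation
            and 10 <= birthweight_percentile <= 90  # proxy for size appropriate for age
            and not diabetes                        # no prepregnancy/gestational diabetes
            and not hypertension                    # no prepregnancy/gestational hypertension
            and not vbac)                           # no vaginal birth after cesarean

print(is_low_risk(39, True, 50, False, False, False))  # True
print(is_low_risk(39, True, 95, False, False, False))  # False: large for gestational age
```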
Sexual and gender minority youth and sexual health education: A systematic mapping review of the literature
Pampati S , Johns MM , Szucs LE , Bishop MD , Mallory AB , Barrios LC , Russell ST . J Adolesc Health 2020 68 (6) 1040-1052 PURPOSE: To synthesize the diverse body of literature on sexual and gender minority youth (SGMY) and sexual health education. METHODS: We conducted a systematic search of the literature on SGMY and sexual health education, including SGMY perspectives on sexual health education, the acceptability or effectiveness of programs designed for SGMY, and SGMY-specific results of sexual health education programs delivered to general youth populations. RESULTS: A total of 32 articles were included. Sixteen qualitative studies with SGMY highlight key perspectives underscoring how youth gained inadequate knowledge from sexual health education experiences and received content that excluded their identities and behaviors. Thirteen studies examined the acceptability or effectiveness of sexual health interventions designed for SGMY, from which key characteristics of inclusive sexual health education relating to development, content, and delivery emerged. One study found a sexual health education program delivered to a general population of youth was also acceptable for a subsample of sexual minority girls. CONCLUSIONS: Future research on SGMY experiences should incorporate understudied populations, including younger adolescents, sexual minority girls, and transgender persons. In addition, the effectiveness of inclusive sexual health education in general population settings requires further study. |
The role of public-private partnerships to increase access to contraception in an emergency response setting: The Zika Contraception Access Network Program
Romero L , Mendoza ZV , Croft L , Bhakta R , Sidibe T , Bracero N , Malave C , Suarez A , Sanchez L , Cordero D , Lathrop E , Monroe J . J Womens Health (Larchmt) 2020 29 (11) 1372-1380 The Zika Contraception Access Network (Z-CAN) program was a short-term emergency response intervention that used contraception to prevent unintended pregnancies to reduce Zika-related adverse birth outcomes during the 2016-2017 Zika virus outbreak in Puerto Rico. The Centers for Disease Control and Prevention (CDC) reported that a collaborative and coordinated response was needed from governments and private-sector partners to improve access to contraception during the Zika outbreak in Puerto Rico. In response, the National Foundation for the CDC, with technical assistance from CDC, established the Z-CAN program, a network of 153 trained physicians that provided client-centered contraceptive counseling and same-day access to the full range of Food and Drug Administration-approved reversible contraceptive methods at no cost for women who chose to prevent pregnancy. From May 2016 to September 2017, 29,221 women received Z-CAN services. Through Z-CAN, public-private partnerships provided a broad range of opportunities for partners to come together to leverage technical expertise, experience, and resources to remove barriers to accessing contraception that neither the public nor the private sector could address alone. 
Public-private partnerships focused on three areas: (1) the coordination of efforts among federal and territorial agencies to align strategies, leverage resources, and address sustainability; (2) the mobilization of private partnerships to secure resources from private corporations, domestic philanthropic organizations, and nonprofit organizations for contraceptive methods, physician reimbursement, training and proctoring resources, infrastructure costs, and a health communications campaign; and (3) the engagement of key stakeholders to understand context and need, and to identify strategies to reach the target population. Public-private partnerships provided expertise, support, and awareness, and could be used to help guide programs to other settings for which access to contraception could improve health outcomes. |
Microbial communities and gene contributions in smokeless tobacco products.
Rivera AJ , Tyx RE , Keong LM , Stanfill SB , Watson CH . Appl Microbiol Biotechnol 2020 104 (24) 10613-10629 Smokeless tobacco products (STP) contain bacteria, mold, and fungi due to exposure from surrounding environments and tobacco processing. This has been a cause for concern since the presence of microorganisms has been linked to the formation of highly carcinogenic tobacco-specific nitrosamines. These communities have also been reported to produce toxins and other pro-inflammatory molecules that can cause mouth lesions and elicit inflammatory responses in STP users. Moreover, microbial species in these products could transfer to the mouth and gastrointestinal tract, potentially altering the established respective microbiotas of the consumer. Here, we present the first metagenomic analysis of select smokeless tobacco products, specifically US domestic moist and dry snuff. Bacterial, eukaryotic, and viral species were found in all tobacco products: bacteria accounted for 68% of total species (dominated by 3 phyla) and Eukarya for 32%, with Archaea and viruses together making up the remaining 1%. Furthermore, 693,318 genes were found to be present, including nitrate and nitrite reduction and transport enzymes; antibiotic resistance genes associated with resistance to vancomycin and other antibiotics, among them β-lactamases and their derivatives; and genes encoding multi-drug transporters and efflux pumps. Additional analyses showed the presence of endo- and exotoxin genes in addition to other molecules associated with inflammatory responses. Our results present a novel aspect of the smokeless tobacco microbiome and provide a better understanding of these products' microbiology. KEY POINTS: The findings presented will help explain microbial contributions to overall STP chemistries. Gene function categorization reveals harmful constituents outside canonical forms. Pathway genes for TSNA precursor activity may occur at early stages of production. 
Bacteria in STPs carry antibiotic resistance genes and gene transfer mechanisms. |
Estimating costs of hospitalizations associated with opioid use disorder or opioid misuse at a large, urban safety-net hospital - Denver, Colorado, 2017
Arifkhanova A , McCormick Kraus E , Al-Tayyib A , Taub J , Encinias A , McEwen D , Davidson A , Shlay JC . Drug Alcohol Depend 2020 218 108306 INTRODUCTION: The national and state economic burden of the opioid crisis is substantial. This study estimated the number of hospitalizations associated with opioid use disorder (OUD) or opioid misuse (OM) and the cost of those hospitalizations at Denver Health (DH) Medical Center, a large, urban safety-net hospital. METHODS: For 2017, direct inpatient medical costs for hospitalizations associated with OUD or OM at DH Medical Center were estimated and categorized by group and insurance type. Data were from the DH electronic health records database that included charge data. Hospitalizations associated with OUD or OM were identified using diagnostic codes and an expanded set of inclusion criteria including diagnostic codes, opioid withdrawal assessments, opioid-related admission notes, and medication prescriptions to treat OUD. Costs were estimated using cost-to-charge ratios specific to DH. RESULTS: During 2017, 220 hospitalizations, $9,834,979 in total charges, $3,690,724 in estimated total costs, and $2,115,990 in total reimbursements were identified using diagnostic codes. Using the most expansive set of inclusion criteria, 739 hospitalizations, $35,033,157 in total charges, $13,346,099 in estimated total costs, and $7,020,877 in total reimbursements were identified. Of the 739 hospitalizations, Medicaid covered 546 hospitalizations (74%), the largest proportion of total reimbursement (65%), with estimated total costs of $10,135,048 (77%). CONCLUSIONS: Our study identified considerable costs for hospitalizations associated with OUD or OM for DH. Estimating costs for hospitalizations associated with OUD or OM through use of expanded inclusion methodology can guide future program planning to allocate resources efficiently for hospitals such as DH Medical Center. |
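The costing method described above is the standard cost-to-charge ratio (CCR) approach: estimated cost = billed charges x hospital-specific CCR. A sketch that also backs out the overall ratios implied by the abstract's totals (the study applied DH-specific ratios, which are not reported here):

```python
def estimated_cost(charges, ccr):
    """Cost-to-charge method: convert billed charges to estimated costs."""
    return charges * ccr

# Overall CCRs implied by the abstract's totals (estimated costs / charges)
ccr_diagnostic = 3_690_724 / 9_834_979     # ~0.375, diagnostic-code cohort
ccr_expanded = 13_346_099 / 35_033_157     # ~0.381, expanded inclusion criteria
print(round(ccr_diagnostic, 3), round(ccr_expanded, 3))

# A $10,000 charge at the diagnostic-cohort ratio
print(round(estimated_cost(10_000, ccr_diagnostic)))
```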
Tobacco product use among adults - United States, 2019
Cornelius ME , Wang TW , Jamal A , Loretan CG , Neff LJ . MMWR Morb Mortal Wkly Rep 2020 69 (46) 1736-1742 Cigarette smoking remains the leading cause of preventable disease and death in the United States (1). The prevalence of current cigarette smoking among U.S. adults has declined over the past several decades, with a prevalence of 13.7% in 2018 (2). However, a variety of combustible, noncombustible, and electronic tobacco products are available in the United States (1,3). To assess recent national estimates of tobacco product use among U.S. adults aged ≥18 years, CDC analyzed data from the 2019 National Health Interview Survey (NHIS). In 2019, an estimated 50.6 million U.S. adults (20.8%) reported currently using any tobacco product, including cigarettes (14.0%), e-cigarettes (4.5%), cigars (3.6%), smokeless tobacco (2.4%), and pipes (1.0%). Most current tobacco product users (80.5%) reported using combustible products (cigarettes, cigars, or pipes), and 18.6% reported using two or more tobacco products. The prevalence of any current tobacco product use was higher among males; adults aged ≤65 years; non-Hispanic American Indian/Alaska Native (AI/AN) adults; those whose highest level of educational attainment was a General Educational Development (GED) certificate; those with an annual household income <$35,000; lesbian, gay, or bisexual (LGB) adults; uninsured adults and those with Medicaid; those with a disability; and those with mild, moderate, or severe generalized anxiety disorder. E-cigarette use was highest among adults aged 18-24 years (9.3%), with over half (56.0%) of these young adults reporting that they had never smoked cigarettes.
Implementing comprehensive, evidence-based, population level interventions (e.g., tobacco price increases, comprehensive smoke-free policies, high-impact antitobacco media campaigns, and barrier-free cessation coverage), in coordination with regulation of the manufacturing, marketing, and sale of all tobacco products, can reduce tobacco-related disease and death in the United States (1,4). As part of a comprehensive approach, targeted interventions are also warranted to reach subpopulations with the highest prevalence of use, which might vary by tobacco product type. |
Children prenatally exposed to alcohol and other drugs: what the literature tells us about child welfare information sources, policies, and practices to identify and care for children
Richards T , Bertrand J , Newburg-Rinn S , McCann H , Morehouse E , Ingoldsby E . J Public Child Welf 2020 1 (24) Many parents who interact with the child welfare system present with substance use issues, which means their children are at risk for prenatal exposure to alcohol and other drugs. Because child welfare agencies play an important role in identifying and providing services to mitigate negative impacts of prenatal exposures, we conducted a search for literature addressing child welfare information sources, policies, and practices related to this population. The search yielded 16 research/evaluation and 16 policy/practice papers, with most addressing exposures to both alcohol and other drugs. The literature most commonly reports that children identified as exposed are referred to child protection agencies during the newborn period. This practice may lead to underidentification, especially of children with prenatal exposure to alcohol. Research suggests that this population is at risk for poorer child welfare outcomes and that there are specific service needs for these children. This review indicates that there is an overall lack of research literature regarding identification of prenatally exposed children involved in the child welfare system that could best inform child welfare policies and practices. Studies investigating how the child welfare system identifies and cares for children with prenatal exposures are needed. |
Entamoeba sp. infection in a bearded dragon (Pogona vitticeps)
Diana S , Karim AMI , Shantanu R , Lisa PM , Brandy K , David E . Vet Glas 2019 74 (1) 77-84 A 3-year-old, intact male pet inland bearded dragon (Pogona vitticeps) presented with a history of diarrhea, progressive inappetence, and weight loss. A palpable cranial celomic mass was identified on physical examination and confirmed to be hepatic in origin by celomic ultrasonography. Hematologic and biochemical abnormalities were mild and consistent with inflammation, regenerative anemia, and hepatocellular injury. Fine needle aspiration of the liver masses was suggestive of amoebiasis, and the patient was humanely euthanized. PCR and Sanger DNA sequencing of liver aspirates were supportive of Entamoeba infection, although definitive speciation was not possible. Pathogenic amoebiasis due to infection by E. invadens has been reported in a wide range of reptiles and is an important cause of morbidity and mortality in these species. |
Epidemiology and clinical features of Rocky Mountain spotted fever from enhanced surveillance, Sonora, Mexico: 2015-2018
Alvarez D , Ochoa E , Nichols Heitman K , Binder AM , Alvarez G , Armstrong PA . Am J Trop Med Hyg 2020 104 (1) 190-197 Rocky Mountain spotted fever (RMSF), caused by Rickettsia rickettsii, is a severe and potentially fatal tick-borne disease. In 2015, Mexico issued a declaration of epidemiologic emergency in response to ongoing outbreaks of RMSF in northern Mexico. Sonora state is one of the most heavily impacted states in Mexico, with historic case fatality rates (CFRs) of 18%. We summarized data from enhanced surveillance to understand demographic, clinical, and treatment factors associated with the high mortality. We conducted a retrospective review of confirmed and probable RMSF cases reported to the General Directorate of Health Promotion and Disease Prevention in Sonora. A case of RMSF is defined as fever (> 38.5°C), plus two symptoms, and epidemiologic criteria; a confirmed case requires laboratory evidence. During 2015-2018, a total of 510 cases of RMSF were reported; 252 (49%) were in persons aged ≤ 18 years. The case fatality rate was 44% (n = 222). Older age and confirmation by PCR were associated with fatal outcome (P-value < 0.01). The mean time from onset of symptoms to treatment with doxycycline was 7.9 days (SD ± 5.5). Hot spot analysis revealed no areas of inordinately high or low incidence, but rather clusters of disease in population centers. The CFR for RMSF in Sonora remains high, and a large proportion of cases occur in persons aged ≤ 18 years. Whereas children previously experienced a disproportionately high CFR, interventions have reversed this trend. Disease clusters in urban nuclei, and location remains a predictor of fatal outcome. |
2018 Zika Health Brigade: Delivering critical health screening in the U.S. Virgin Islands
Godfred-Cato S , Fehrenbach SN , Reynolds MR , Galang RR , Schoelles D , Brown-Shuler L , Hillman B , DeWilde L , Prosper A , Hudson A , Moore CA , Ellis EM . Trop Med Infect Dis 2020 5 (4) In 2017, Hurricanes Irma and Maria caused significant damage to the United States Virgin Islands (USVI), heightening the challenges many residents faced in accessing adequate healthcare and receiving recommended Zika virus screening services. To address this challenge, the USVI Department of Health (DOH) requested technical assistance from the Centers for Disease Control and Prevention (CDC), the Health Resources and Services Administration (HRSA), and the American Academy of Pediatrics (AAP) to organize a health brigade to bring needed medical care to an underserved population. The brigade also fostered important partnerships between federal and private partners, as well as between local clinicians and public health entities within the DOH, such as the Epidemiology & Disease Reporting, Maternal Child Health (MCH), and Infant and Toddlers Programs. This health brigade model could be replicated to ensure recommended evaluations are delivered to populations that may have unmet medical needs due to the complexity of the conditions and/or rural location. |
Human exposures to by-products from animals suspected to have died of anthrax in Bangladesh: An exploratory study
Islam MS , Hasan SM , Salzer JS , Kadzik M , Haque F , Haider N , Hossain MB , Islam MA , Rahman M , Kennedy E , Gurley ES . Transbound Emerg Dis 2020 68 (4) 2514-2520 Anthrax is a zoonotic disease caused by the bacterium Bacillus anthracis that is considered endemic in Bangladesh, where cases among animals and people have been reported almost annually since 2009. Contaminated by-products from animals are suspected to play a role in transmission to people, but minimal information is known about the supply chain of these potentially contaminated products. Between April 2013 and May 2016, we conducted a qualitative study in 17 villages located in 5 districts in Bangladesh that had experienced suspected anthrax outbreaks. The study explored how by-products from suspected animal cases were collected, discarded, processed, distributed, and used by people. We conducted open-ended interviews, group discussions, and unstructured observations of people's exposure to animal by-products. The practice of slaughtering acutely ill domestic ruminants before they died was common. Respondents reported that moribund animals were typically butchered and the waste products discarded in nearby rivers, ditches, bamboo bushes, or on privately owned land. Regardless of health status before death, very few carcasses were buried, and none were incinerated or burned. The hides were reportedly used to make wallets, belts, shoes, balls, and clothing. Discarded bones were often ground into granular and powder forms to produce bone meal and fertilizer. Given that anthrax is endemic in the study region, livestock with acute onset of fatal disease, or found dead with no known cause of death, may represent anthrax cases and subsequently pose a health risk to those involved in the collection and processing of the carcass, as well as to the end-users of these products.
Improved bio-security practices and safe carcass disposal measures could reduce the risk of human exposure, but resource and other constraints make implementation a challenge. Therefore, targeting at-risk animal populations for vaccination may be the most effective strategy to reduce anthrax outbreaks, protect the supply chain, and reduce the risk of exposure to B. anthracis. |
Preventing vector-borne transmission of Zika virus infection during pregnancy, Puerto Rico, USA, 2016-2017(1)
Kortsmit K , Salvesen von Essen B , Warner L , D'Angelo DV , Smith RA , Shapiro-Mendoza CK , Shulman HB , Virella WH , Taraporewalla A , Harrison L , Ellington S , Barfield WD , Jamieson DJ , Cox S , Pazol K , Garcia Díaz P , Herrera BR , Bernal MV . Emerg Infect Dis 2020 26 (11) 2717-2720 We examined pregnant women's use of personal protective measures to prevent mosquito bites during the 2016-2017 Zika outbreak in Puerto Rico. Healthcare provider counseling on recommended measures was associated with increased use of insect repellent among pregnant women but not with wearing protective clothing. |
Association between serological responses to two zoonotic ruminant pathogens and esophageal squamous cell carcinoma
Miller HK , Stoddard RA , Dawsey SM , Nasrollahzadeh D , Abnet CC , Etemadi A , Kamangar F , Murphy G , Sotoudeh M , Kersh GJ , Malekzadeh R , Camargo MC . Vector Borne Zoonotic Dis 2020 21 (2) 125-127 Questionnaire data have linked contact with ruminants to the risk of esophageal squamous cell carcinoma (ESCC) in high-risk Asian populations. To better understand this observed association, we investigated exposure to two major zoonotic ruminant pathogens relative to ESCC risk. Using enzyme-linked immunosorbent assay, immunofluorescence assay, and Brucella microagglutination test assays, we measured immunoglobulin G anti-Coxiella burnetii and anti-Brucella spp. antibodies in patients with ESCC (n = 177) and population-based controls (n = 177) matched by age, gender, and residence area from the Golestan case-control study in Iran. We found a similarly high seroprevalence of C. burnetii in ESCC cases and controls (75% and 80%, respectively), and a similarly low seroprevalence of Brucella spp. (0% and 0.6%, respectively). While documenting a high exposure to one of two zoonotic ruminant infections, this exposure failed to explain the observed association of ruminant contact and ESCC risk in this high-risk population. |
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Disease Reservoirs and Vectors
- Environmental Health
- Epidemiology and Surveillance
- Food Safety
- Health Behavior and Risk
- Health Communication and Education
- Health Economics
- Healthcare Associated Infections
- Immunity and Immunization
- Informatics
- Injury and Violence
- Laboratory Sciences
- Maternal and Child Health
- Nutritional Sciences
- Occupational Safety and Health
- Occupational Safety and Health - Mining
- Parasitic Diseases
- Physical Activity
- Public Health Leadership and Management
- Reproductive Health
- Substance Use and Abuse
- Veterinary Medicine
- Zoonotic and Vectorborne Diseases
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed: Feb 1, 2024
- Page last updated: Sep 03, 2024
- Powered by CDC PHGKB Infrastructure