Improving Screening Uptake among Breast Cancer Survivors and Their First-Degree Relatives at Elevated Risk for Breast Cancer: Results and Implications of a Randomized Study in the State of Georgia.
Lipscomb J , Escoffery C , Gillespie TW , Henley SJ , Smith RA , Chociemski T , Almon L , Jiang R , Sheng X , Goodman M , Ward KC . Int J Environ Res Public Health 2020 17 (3) Women diagnosed with breast cancer at a relatively early age (≤45 years) or with bilateral disease at any age are at elevated risk for additional breast cancer, as are their female first-degree relatives (FDRs). We report on a randomized trial to increase adherence to mammography screening guidelines among survivors and FDRs. From the Georgia Cancer Registry, breast cancer survivors diagnosed during 2000-2009 at six Georgia cancer centers underwent phone interviews about their breast cancer screening behaviors and their FDRs. Nonadherent survivors and FDRs meeting all inclusion criteria were randomized to high-intensity (evidence-based brochure, phone counseling, mailed reminders, and communications with primary care providers) or low-intensity interventions (brochure only). Three- and 12-month follow-up questionnaires were completed. Data analyses used standard statistical approaches. Among 1055 survivors and 287 FDRs who were located, contacted, and agreed to participate, 59.5% and 62.7%, respectively, reported breast cancer screening in the past 12 months and were thus ineligible. For survivors enrolled at baseline (N = 95), the proportion reporting adherence to guideline screening by 12 months post-enrollment did not differ significantly between the high- and low-intensity arms (66.7% vs. 79.2%, p = 0.31). Among FDRs enrolled at baseline (N = 83), screening was significantly higher in the high-intensity arm at 12 months (60.9% vs. 32.4%, p = 0.03). Overall, about 72% of study-eligible survivors (all of whom were screening nonadherent at baseline) reported screening within 12 months of study enrollment. For enrolled FDRs receiving the high-intensity intervention, over 60% reported guideline screening by 12 months.
A major conclusion is that using high-quality central cancer registries to identify high-risk breast cancer survivors and then working closely with these survivors to identify their FDRs represents a feasible and effective strategy to promote guideline cancer screening. |
Trends in incidence of type 1 and type 2 diabetes among youths - selected counties and Indian reservations, United States, 2002-2015
Divers J , Mayer-Davis EJ , Lawrence JM , Isom S , Dabelea D , Dolan L , Imperatore G , Marcovina S , Pettitt DJ , Pihoker C , Hamman RF , Saydah S , Wagenknecht LE . MMWR Morb Mortal Wkly Rep 2020 69 (6) 161-165 Diabetes is one of the most common chronic diseases among persons aged <20 years (1). Onset of diabetes in childhood and adolescence is associated with numerous complications, including diabetic kidney disease, retinopathy, and peripheral neuropathy, and has a substantial impact on public health resources (2,3). From 2002 to 2012, type 1 and type 2 diabetes incidence increased 1.4% and 7.1%, respectively, among U.S. youths (4). To assess recent trends in incidence of diabetes in youths (defined for this report as persons aged <20 years), researchers analyzed 2002-2015 data from the SEARCH for Diabetes in Youth Study (SEARCH), a U.S. population-based registry study with clinical sites located in five states. The incidence of both type 1 and type 2 diabetes in U.S. youths continued to rise at constant rates throughout this period. Among all youths, the incidence of type 1 diabetes increased from 19.5 per 100,000 in 2002-2003 to 22.3 in 2014-2015 (annual percent change [APC] = 1.9%). Among persons aged 10-19 years, type 2 diabetes incidence increased from 9.0 per 100,000 in 2002-2003 to 13.8 in 2014-2015 (APC = 4.8%). For both type 1 and type 2 diabetes, the rates of increase were generally higher among racial/ethnic minority populations than those among whites. These findings highlight the need for continued surveillance for diabetes among youths to monitor overall and group-specific trends, identify factors driving these trends, and inform health care planning. |
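Trend estimates like the APCs above are typically obtained by fitting a log-linear model to the incidence series (the slope of ln(rate) on calendar year). A minimal sketch in Python, using synthetic rates constructed to grow exactly 1.9% per year rather than the actual SEARCH data:

```python
import math

def annual_percent_change(years, rates):
    """APC from an ordinary least-squares fit of ln(rate) on calendar year:
    APC = (exp(slope) - 1) * 100."""
    logs = [math.log(r) for r in rates]
    n = len(years)
    mx = sum(years) / n
    my = sum(logs) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(years, logs))
             / sum((x - mx) ** 2 for x in years))
    return (math.exp(slope) - 1) * 100

# Synthetic series built to grow exactly 1.9% per year (illustrative only)
years = list(range(2002, 2016))
rates = [19.5 * 1.019 ** (y - 2002) for y in years]
apc = annual_percent_change(years, rates)
```

In practice such models are fit with a Poisson or joinpoint regression that also yields a confidence interval for the APC; the least-squares version above captures the core calculation.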
A replicable approach to promoting best practices: Translating cardiovascular disease prevention research
Hawkins NA , Bhuiya AR , Shantharam S , Chapel JM , Taylor LN , Thigpen S , Decker A , Moeti R , Bernard S , Jones CD , Schooley M . J Public Health Manag Pract 2020 27 (2) 109-116 OBJECTIVE: Significant delays in translating health care-related research into public health programs and medical practice mean that people may not get the best care when they need it. Regarding cardiovascular disease, translation delays can mean lives may be unnecessarily lost each year. To facilitate the translation of knowledge to action, we created a Best Practices Guide for Cardiovascular Disease Prevention Programs. DESIGN: Using the Rapid Synthesis Translation Process and the Best Practices Framework as guiding frameworks, we collected and rated research evidence for hypertension control and cholesterol management strategies. After identifying best practices, we gathered information about programs that were implementing the practices and about resources useful for implementation. Research evidence and supplementary information were consolidated in an informational resource and published online. Web metrics were collected and analyzed to measure use and reach of the guide. RESULTS: The Best Practices Guide was released in January 2018 and included background information and resources on 8 best practice strategies. It was published as an online resource, publicly accessible from the Centers for Disease Control and Prevention Web site in 2 different formats. Web metrics show that in the first year after publication, there were 25 589 Web page views and 2467 downloads. A query of partner use of the guide indicated that it was often shared in partners' own resources, newsletters, and online material. CONCLUSION: In following a systematic approach to creating the Best Practices Guide and documenting the steps taken in its development, we offer a replicable approach for translating research on health care practices into a resource to facilitate implementation. 
The success of this approach is attributed to 3 key factors: using a prescribed and documented approach to evidence translation, working closely with stakeholders throughout the process, and prioritizing the content design and accessibility of the final product. |
Self-reported short sleep duration among US adults by disability status and functional disability type: Results from the 2016 Behavioral Risk Factor Surveillance System
Okoro CA , Courtney-Long E , Cyrus AC , Zhao G , Wheaton AG . Disabil Health J 2020 13 (3) 100887 BACKGROUND: Short sleep duration is associated with an increased risk of chronic disease and all-cause death. A better understanding of sleep disparities between people with and without disabilities can help inform interventions designed to improve sleep duration among people with disabilities. OBJECTIVE: To examine population-based prevalence estimates of short sleep duration by disability status and disability type among noninstitutionalized adults aged ≥18 years. METHODS: Data from the 2016 Behavioral Risk Factor Surveillance System were used to assess prevalence of short sleep duration among adults without and with disabilities (serious difficulty with cognition, hearing, mobility, or vision; any difficulty with self-care or independent living). Short sleep duration was defined as <7 h per 24-h period. We used log-binomial regression to estimate prevalence ratios (PRs) and 95% confidence intervals (CIs) while adjusting for socioeconomic and health-related characteristics. RESULTS: Adults with any disability had a higher prevalence of short sleep duration than those without disability (43.8% vs. 31.6%; p < .001). After controlling for selected covariates, short sleep was most prevalent among adults with multiple disabilities (PR 1.40, 95% CI: 1.36-1.43), followed by adults with a single disability type (range: PR 1.13, 95% CI: 1.03-1.24 [for independent living disability] to PR 1.25, 95% CI: 1.21-1.30 [for mobility disability]) compared to adults without disability. CONCLUSIONS: People with disabilities had a higher likelihood of reporting short sleep duration than those without disabilities. Assessment of sleep duration may be an important component in the provision of medical care to people with disabilities. |
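The adjusted prevalence ratios above come from log-binomial regression, but the crude comparison (43.8% vs. 31.6%) reduces to simple arithmetic: a ratio of prevalences with a Wald confidence interval computed on the log scale. A sketch with illustrative counts, not the BRFSS microdata:

```python
import math

def prevalence_ratio(a, n1, b, n0, z=1.96):
    """Crude prevalence ratio (a/n1) / (b/n0) with a Wald 95% CI on the
    log scale: SE[ln PR] = sqrt(1/a - 1/n1 + 1/b - 1/n0)."""
    pr = (a / n1) / (b / n0)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n0)
    lower = math.exp(math.log(pr) - z * se)
    upper = math.exp(math.log(pr) + z * se)
    return pr, lower, upper

# Illustrative counts matching the abstract's crude prevalences
# (438/1000 = 43.8% with disability, 316/1000 = 31.6% without)
pr, lower, upper = prevalence_ratio(438, 1000, 316, 1000)
```

The regression version generalizes this by modeling ln(prevalence) as a linear function of disability status plus covariates, which is what yields the adjusted PRs quoted in the abstract.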
A landscape of genomic alterations at the root of a near-untreatable tuberculosis epidemic.
Klopper M , Heupink TH , Hill-Cawthorne G , Streicher EM , Dippenaar A , de Vos M , Abdallah AM , Limberis J , Merker M , Burns S , Niemann S , Dheda K , Posey J , Pain A , Warren RM . BMC Med 2020 18 (1) 24 BACKGROUND: Atypical Beijing genotype Mycobacterium tuberculosis strains are widespread in South Africa and have acquired resistance to up to 13 drugs on multiple occasions. It is puzzling that these strains have retained fitness and transmissibility despite the potential fitness cost associated with drug resistance mutations. METHODS: We conducted Illumina sequencing of 211 Beijing genotype M. tuberculosis isolates to facilitate the detection of genomic features that may promote acquisition of drug resistance and restore fitness in highly resistant atypical Beijing forms. Phylogenetic and comparative genomic analysis was done to determine changes that are unique to the resistant strains that also transmit well. Minimum inhibitory concentration (MIC) determination for streptomycin and bedaquiline was done for a limited number of isolates to demonstrate a difference in MIC between isolates with and without certain variants. RESULTS: Phylogenetic analysis confirmed that two clades of atypical Beijing strains have independently developed resistance to virtually all the potent drugs included in standard (pre-bedaquiline) drug-resistant TB treatment regimens. We show that undetected drug resistance in a progenitor strain was likely instrumental in this resistance acquisition. In this cohort, ethionamide (ethA A381P) resistance would be missed in first-line drug-susceptible isolates, and streptomycin (gidB L79S) resistance may be missed due to an MIC close to the critical concentration. Subsequent inadequate treatment historically led to amplification of resistance and facilitated spread of the strains. Bedaquiline resistance was found in a small number of isolates, despite lack of exposure to the drug. 
The highly resistant clades also carry inhA promoter mutations, which arose after ethA and katG mutations. In these isolates, inhA promoter mutations do not alter drug resistance, suggesting a possible alternative role. CONCLUSION: The presence of the ethA mutation in otherwise susceptible isolates from ethionamide-naive patients demonstrates that known exposure is not an adequate indicator of drug susceptibility. Similarly, bedaquiline resistance can occur without exposure to the drug. Inappropriate treatment regimens resulting from missed resistance lead to amplification of resistance and onward transmission. We put these results into the context of current WHO treatment regimens, underscoring the risks of treatment without knowledge of the full drug resistance profile. |
Phylodynamic Analysis Complements Partner Services by Identifying Acute and Unreported HIV Transmission.
Campbell EM , Patala A , Shankar A , Li JF , Johnson JA , Westheimer E , Gay CL , Cohen SE , Switzer WM , Peters PJ . Viruses 2020 12 (2) Tailoring public health responses to growing HIV transmission clusters depends on accurately mapping the risk network through which infection spreads and identifying acute infections that represent the leading edge of cluster growth. HIV transmission links, especially those involving persons with acute HIV infection (AHI), can be difficult to uncover or confirm during partner services investigations. We integrated molecular, epidemiologic, serologic, and behavioral data to infer and evaluate transmission linkages between participants of a prospective study of AHI conducted in North Carolina, New York City, and San Francisco from 2011 to 2013. Among the 547 participants with newly diagnosed HIV infection and available polymerase sequences, 465 sex partners were reported, of whom only 35 (7.5%) had HIV sequences. Among these 35 contacts, 23 (65.7%) links were genetically supported and 12 (34.3%) were not. Only five links were reported between participants with AHI, but none were genetically supported. In contrast, phylodynamic inference identified 102 unreported transmission links, including 12 between persons with AHI. Importantly, all putative transmission links between persons with AHI were found among large clusters with more than five members. Taken together, the fact that putative links between participants with AHI who did not name each other as contacts were found only among large clusters underscores the potential for unobserved or undiagnosed intermediaries. Phylodynamics identified many more links than partner services alone and, if routinely and rapidly integrated, can illuminate transmission patterns not readily captured by partner services investigations. |
Combining genomics and epidemiology to track mumps virus transmission in the United States.
Wohl S , Metsky HC , Schaffner SF , Piantadosi A , Burns M , Lewnard JA , Chak B , Krasilnikova LA , Siddle KJ , Matranga CB , Bankamp B , Hennigan S , Sabina B , Byrne EH , McNall RJ , Shah RR , Qu J , Park DJ , Gharib S , Fitzgerald S , Barreira P , Fleming S , Lett S , Rota PA , Madoff LC , Yozwiak NL , MacInnis BL , Smole S , Grad YH , Sabeti PC . PLoS Biol 2020 18 (2) e3000611 Unusually large outbreaks of mumps across the United States in 2016 and 2017 raised questions about the extent of mumps circulation and the relationship between these and prior outbreaks. We paired epidemiological data from public health investigations with analysis of mumps virus whole genome sequences from 201 infected individuals, focusing on Massachusetts university communities. Our analysis suggests continuous, undetected circulation of mumps locally and nationally, including multiple independent introductions into Massachusetts and into individual communities. Despite the presence of these multiple mumps virus lineages, the genomic data show that one lineage has dominated in the US since at least 2006. Widespread transmission was surprising given high vaccination rates, but we found no genetic evidence that variants arising during this outbreak contributed to vaccine escape. Viral genomic data allowed us to reconstruct mumps transmission links not evident from epidemiological data or standard single-gene surveillance efforts and also revealed connections between apparently unrelated mumps outbreaks. |
Phylogenetic diversity of Mycobacterium tuberculosis in two geographically distinct locations in Botswana - The Kopanyo Study.
Click ES , Finlay A , Oeltmann JE , Basotli J , Modongo C , Boyd R , Wen XJ , Shepard J , Moonan PK , Zetola N . Infect Genet Evol 2020 81 104232 Mycobacterium tuberculosis complex (MTBC) is divided into several major phylogenetic lineages with differential distribution globally. Using population-based data collected over a three-year period, we performed 24-locus Mycobacterial Interspersed Repeat Unit - Variable Number Tandem Repeat (MIRU-VNTR) genotyping on all culture isolates from two districts of the country that differ in tuberculosis (TB) incidence (Gaborone, the capital, and Ghanzi in the Western Kalahari). The study objective was to characterize the molecular epidemiology of TB in these districts. Overall phylogenetic diversity mirrored that reported from the neighboring Republic of South Africa, but differences between the two districts were marked. All four major lineages of M. tuberculosis were found in Gaborone, but only three of the four major lineages were found in Ghanzi. Strain diversity was lower in Ghanzi, with a large proportion (38%) of all isolates having an identical MIRU-VNTR result, compared to 6% of all isolates in Gaborone sharing the same MIRU-VNTR result. This study demonstrates localized differences in strain diversity between two districts in Botswana and contributes to a growing characterization of MTBC diversity globally. |
Moderate-to-High Levels of Pretreatment HIV Drug Resistance in KwaZulu-Natal Province, South Africa.
Chimukangara B , Kharsany ABM , Lessells RJ , Naidoo K , Rhee SY , Manasa J , Graf T , Lewis L , Cawood C , Khanyile D , Diallo K , Ayalew KA , Shafer RW , Hunt G , Pillay D , Abdool SK , de Oliveira T . AIDS Res Hum Retroviruses 2019 35 (2) 129-138 There is evidence of increasing levels of pretreatment HIV drug resistance (PDR) in Southern Africa. We used data from two large population-based HIV surveillance studies to estimate prevalence of PDR in KwaZulu-Natal, the province with the highest HIV prevalence in South Africa. Sanger sequencing was performed on samples obtained from a longitudinal HIV surveillance program (study A, 2013-2014) and the HIV Incidence Provincial Surveillance System (study B, 2014-2015). Sequences were included for adult HIV-positive participants (age ≥15 years for study A, age 15-49 years for study B) with no documented prior exposure to antiretroviral therapy (ART). Overall and drug class-specific PDR was estimated using the World Health Organization 2009 surveillance drug resistance mutation (SDRM) list, and phylogenetic analysis was performed to establish evidence of drug resistance transmission linkage. A total of 1,845 sequences were analyzed (611 study A; 1,234 study B). An overall PDR prevalence of 9.2% [95% confidence interval (CI) 7.0-11.7] was observed for study A and 11.0% (95% CI 8.9-13.2) for study B. In study B, the prevalence of non-nucleoside reverse-transcriptase inhibitor (NNRTI) PDR exceeded 10% for sequences collected in 2014 (10.2%, 95% CI 7.5-12.9). The most prevalent SDRMs were K103NS (7.5%), M184VI (2.4%), and V106AM (1.4%). There was no evidence of large transmission chains of drug-resistant virus. High-level NNRTI PDR (>10%) suggests a need to modify the standard first-line ART regimen and to focus attention on improving the quality of HIV prevention, treatment, and care. |
Re-drawing the maps for endemic mycoses
Ashraf N , Kubat RC , Poplin V , Adenis AA , Denning DW , Wright L , McCotter O , Schwartz IS , Jackson BR , Chiller T , Bahr NC . Mycopathologia 2020 185 (5) 843-865 Endemic mycoses such as histoplasmosis, coccidioidomycosis, blastomycosis, paracoccidioidomycosis, and talaromycosis are well-known causes of focal and systemic disease within specific geographic areas of known endemicity. However, over the past few decades, there have been increasingly frequent reports of infections due to endemic fungi in areas previously thought to be "non-endemic." There are numerous potential reasons for this shift, such as increased use of immune-suppressive medications, improved diagnostic tests, increased disease recognition, and global factors such as migration, increased travel, and climate change. Regardless of the causes, it has become evident that our previous understanding of endemic regions for these fungal diseases needs to evolve. The epidemiology of the newly described Emergomyces is incomplete; our understanding of it continues to evolve. This review will focus on the evidence underlying the established areas of endemicity for these mycoses as well as new data and reports from the medical literature that support re-thinking these geographic boundaries. Updating the endemic fungi maps would inform clinical practice and global surveillance of these diseases. |
Effect of tuberculosis screening and retention interventions on early antiretroviral therapy mortality in Botswana: a stepped-wedge cluster randomized trial
Auld AF , Agizew T , Mathoma A , Boyd R , Date A , Pals SL , Serumola C , Mathebula U , Alexander H , Ellerbrock TV , Rankgoane-Pono G , Pono P , Shepherd JC , Fielding K , Grant AD , Finlay A . BMC Med 2020 18 (1) 19 BACKGROUND: Undiagnosed tuberculosis (TB) remains the most common cause of HIV-related mortality. Xpert MTB/RIF (Xpert) is being rolled out globally to improve TB diagnostic capacity. However, previous Xpert impact trials have reported that health system weaknesses blunted the impact of this improved diagnostic tool. During phased Xpert rollout in Botswana, we evaluated the impact on early (6-month) antiretroviral therapy (ART) mortality of a package of interventions comprising (1) additional support for intensified TB case finding (ICF), (2) active tracing for patients missing clinic appointments to support retention, and (3) Xpert replacing sputum-smear microscopy. METHODS: At 22 clinics, ART enrollees aged >12 years were eligible for inclusion in three phases: a retrospective standard of care (SOC), prospective enhanced care (EC), and prospective EC plus Xpert (EC+X) phase. EC and EC+X phases were implemented as a stepped-wedge trial. Participants in the EC phase received SOC plus components 1 (strengthened ICF) and 2 (active tracing) of the intervention package, and participants in the EC+X phase received SOC plus all three intervention package components. Primary and secondary objectives were to compare all-cause 6-month ART mortality between SOC and EC+X and between EC and EC+X phases, respectively. We used adjusted analyses, appropriate for study design, to control for baseline differences in individual-level factors and intra-facility correlation. RESULTS: We enrolled 14,963 eligible patients: 8980 in SOC, 1768 in EC, and 4215 in EC+X phases. Median age of ART enrollees was 35 years, and 64% were female. Median CD4 cell count was lower in SOC than in subsequent phases (184/μL in SOC, 246/μL in EC, and 241/μL in EC+X).
By 6 months of ART, 461 (5.3%) of SOC, 54 (3.2%) of EC, and 121 (3.0%) of EC+X enrollees had died. Compared with SOC, 6-month mortality was lower in the EC+X phase (adjusted hazard ratio, 0.77; 95% confidence interval, 0.61-0.97; p = 0.029). Compared with EC enrollees, 6-month mortality was similar among EC+X enrollees. CONCLUSIONS: Interventions to strengthen ICF and retention were associated with lower early ART mortality. This new evidence highlights the need to strengthen ICF and retention in many similar settings. As in other trials, no additional mortality benefit of replacing sputum-smear microscopy with Xpert was observed. TRIAL REGISTRATION: Retrospectively registered: ClinicalTrials.gov (NCT02538952). |
Temporal and geographic variability in time from HIV diagnosis to viral suppression in Alabama, 2012-2014
Batey DS , Dong X , Rogers RP , Merriweather A , Elopre L , Rana AI , Hall HI , Mugavero MJ . JMIR Public Health Surveill 2020 6 (2) e17217 BACKGROUND: Evaluation of the time from HIV diagnosis to viral suppression (VS) captures the collective effectiveness of HIV prevention and treatment activities in a given locale and provides a more global estimate of how effectively the larger HIV care system is working in a given geographic area or jurisdiction. OBJECTIVE: To evaluate temporal and geographic variability in VS among persons with newly diagnosed HIV infection in Alabama in 2012-2014. METHODS: With data from the National HIV Surveillance System, we evaluated median time from HIV diagnosis to VS (<200 copies/mL) overall and stratified by Alabama public health area (PHA) among persons with HIV diagnosed during 2012-2014 using the Kaplan-Meier approach. RESULTS: Among 1,979 newly diagnosed persons, 1,181 (59.7%) achieved VS within 12 months of diagnosis; 52.6% in 2012, 59.5% in 2013, and 66.9% in 2014. Median time from HIV diagnosis to VS was 8 months; 10 months in 2012, 8 months in 2013, and 6 months in 2014. Across the 11 PHAs in Alabama, 12-month VS ranged from 45.8% to 83.9%, and median time from diagnosis to VS ranged from 5 to 13 months. CONCLUSIONS: Temporal improvement in persons achieving VS following HIV diagnosis statewide in Alabama is encouraging. However, considerable geographic variability warrants further evaluation to inform public health action. Time from HIV diagnosis to VS represents a meaningful indicator that can be incorporated into public health surveillance and programming. |
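Median time to VS in analyses like this comes from the Kaplan-Meier estimator, which accounts for persons still unsuppressed (censored) at the end of follow-up rather than simply averaging observed times. A self-contained sketch with a toy ten-person cohort, not the Alabama surveillance data:

```python
def km_median(times, events):
    """Kaplan-Meier median: the earliest time at which the estimated
    survival curve S(t) falls to 0.5 or below.

    times  -- follow-up time for each person (e.g., months to VS)
    events -- 1 if the event occurred at that time, 0 if censored
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s = 1.0
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)        # events at time t
        removed = sum(1 for tt, _ in data if tt == t)  # events + censored at t
        if d:
            s *= 1 - d / n_at_risk
            if s <= 0.5:
                return t
        n_at_risk -= removed
        i += removed
    return None  # survival never dropped to 0.5: median not reached

# Toy cohort: months from HIV diagnosis to viral suppression (0 = censored)
times = [2, 4, 5, 6, 6, 8, 9, 10, 12, 12]
events = [1, 1, 0, 1, 1, 1, 0, 1, 0, 0]
median_months = km_median(times, events)  # 8 for this toy cohort
```

Because "suppression" here is a favorable event, the survival curve is read as the proportion not yet suppressed; the median is the month by which half the cohort has achieved VS.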
HIV and hepatitis C virus infection testing among commercially insured persons who inject drugs, United States, 2010-2017
Bull-Otterson L , Huang YA , Zhu W , King H , Edlin BR , Hoover KW . J Infect Dis 2020 222 (6) 940-947 BACKGROUND: We assessed prevalence of testing for HIV and hepatitis C virus (HCV) infection among persons who inject drugs (PWID). METHODS: Using a nationwide health insurance database for claims paid during 2010-2017, we identified PWID by using codes from the International Classification of Diseases, Current Procedural Terminology, and National Drug Codes directory. We then estimated the percentage of PWIDs tested for HIV or HCV within 1 year of an index encounter, and used multivariate logistic regression models to assess demographic and clinical factors associated with testing. RESULTS: Of 844 242 PWIDs, 71 938 (8.5%) were tested for HIV and 65 188 (7.7%) for HCV infections. Missed opportunities were independently associated with being male (ORs: HIV, 0.50 [95% CI, 0.49-0.50]; P < .001; HCV, 0.66 [95% CI, 0.65-0.72]; P < .001), rural residence (ORs: HIV, 0.67 [95% CI, 0.65-0.69]; P < .001; HCV, 0.75 [95% CI, 0.73-0.77]), and receiving services for skin infections or endocarditis (aORs: HIV, 0.91 [95% CI, 0.87-0.95]; P <.001; HCV, 0.90 [95% CI, 0.86-0.95]; P <.001). CONCLUSION: Approximately 90% of presumed PWIDs missed opportunities for HIV or HCV testing, especially male rural residents with claims for skin infections or endocarditis, commonly associated with injection drug use. |
Over the limit: tuberculosis and excessive alcohol use
Chaulk CP , Moonan PK . Int J Tuberc Lung Dis 2020 24 (1) 3-4 THE RELATIONSHIP BETWEEN alcohol consumption and tuberculosis (TB) is well documented.1 In 2017, 17% of all newly reported TB cases and 15% of all deaths during anti-tuberculosis treatment in the world were attributed to excessive alcohol use, second only to smoking (23%) as the top modifiable risk factor for TB.2 In this issue of the IJTLD, Ragan et al. conducted a comprehensive meta-analysis and found an association between excessive alcohol use and unsuccessful TB treatment outcomes.3 Among 111 studies analyzed, excessive alcohol use resulted in more loss to follow-up, more treatment failure, and more death than non-excessive alcohol use in both drug-susceptible and multidrug-resistant TB patients. |
Multidrug-resistant tuberculosis in the United States, 2011-2016: patient characteristics and risk factors
Chen MP , Miramontes R , Kammerer JS . Int J Tuberc Lung Dis 2020 24 (1) 92-99 OBJECTIVE: To determine risk factors for multidrug-resistant tuberculosis (MDR-TB) and describe MDR-TB according to three characteristics: previous TB disease, recent transmission of MDR-TB, and reactivation of latent MDR-TB infection. SETTING AND DESIGN: We used 2011-2016 surveillance data from the US National Tuberculosis Surveillance System and National Tuberculosis Genotyping Service and used logistic regression models to estimate risk factors associated with MDR-TB. RESULTS: A total of 615/45 209 (1.4%) cases were confirmed as MDR-TB; 111/615 (18%) reported previous TB disease; 41/615 (6.7%) were attributed to recent MDR-TB transmission; and 449/615 (73%) to reactivation. Only 12/41 (29%) patients with TB attributed to recent transmission were known to be contacts of someone with MDR-TB. For non-US-born patients, the adjusted odds ratios of having MDR-TB were 32.6 (95%CI 14.6-72.6) among those who were known to be contacts of someone with MDR-TB and 6.5 (95%CI 5.1-8.3) among those who had had previous TB disease. CONCLUSION: The majority of MDR-TB cases in the United States were associated with previous TB disease or reactivation of latent MDR-TB infection; only a small proportion of MDR-TB cases were associated with recent transmission. |
Evolving epidemiology of reported giardiasis cases in the United States, 1995-2016
Coffey CM , Collier SA , Gleason ME , Yoder JS , Kirk MD , Richardson AM , Fullerton KE , Benedict KM . Clin Infect Dis 2020 72 (5) 764-770 BACKGROUND: Giardiasis is the most common intestinal parasitic disease of humans identified in the United States and an important waterborne disease. In the United States, giardiasis has been variably reportable since 1992 and was made a nationally notifiable disease in 2002. Our objective was to describe the epidemiology of US giardiasis cases from 1995 to 2016 using National Notifiable Disease Surveillance System data. METHODS: Negative binomial regression models were used to compare incidence rates by age group (0-4, 5-9, 10-19, 20-29, 30-39, 40-49, 50-64, and ≥65 years) during three time periods (1995-2001, 2002-2010, and 2011-2016). RESULTS: From 1995 to 2016, the average number of reported cases was 19 781 per year (range 14 623-27 778 cases). The annual incidence of reported giardiasis in the US decreased across all age groups. This decrease differs by age group and sex and may reflect either changes in surveillance methods (for example, changes to case definitions or reporting practices) or changes in exposure. Incidence rates in males and older age groups did not decrease to the same extent as rates in females and children. CONCLUSIONS: Trends suggest that differences in exposures by sex and age group are important to the epidemiology of giardiasis. Further investigation into the risk factors of populations with higher rates of giardiasis will support prevention and control efforts. |
A ten-year retrospective evaluation of acute flaccid myelitis at 5 pediatric centers in the United States, 2005-2014
Cortese MM , Kambhampati AK , Schuster JE , Alhinai Z , Nelson GR , Guzman Perez-Carrillo GJ , Vossough A , Smit MA , McKinstry RC , Zinkus T , Moore KR , Rogg JM , Candee MS , Sejvar JJ , Hopkins SE . PLoS One 2020 15 (2) e0228671 BACKGROUND: Acute flaccid myelitis (AFM) is a severe illness similar to paralytic poliomyelitis. It is unclear how frequently AFM occurred in U.S. children after poliovirus elimination. In 2014, an AFM cluster was identified in Colorado, prompting passive US surveillance that yielded 120 AFM cases of unconfirmed etiology. Subsequently, increased reports were received in 2016 and 2018. To help inform investigations on causality of the recent AFM outbreaks, our objective was to determine how frequently AFM had occurred before 2014, and whether 2014 cases had different characteristics. METHODS: We conducted a retrospective study covering 2005-2014 at 5 pediatric centers in 3 U.S. regions. Possible AFM cases aged ≤18 years were identified by searching discharge ICD-9 codes and spinal cord MRI reports (>37,000). Neuroradiologists assessed MR images, and medical charts were reviewed; possible cases were classified as AFM, not AFM, or indeterminate. RESULTS: At the 5 sites combined, 26 AFM cases were identified from 2005-2013 (average annual number, 3 [2.4 cases/100,000 pediatric hospitalizations]) and 18 from 2014 (12.6 cases/100,000 hospitalizations; Poisson exact p<0.0001). A cluster of 13 cases was identified in September-October 2014 (temporal scan p = 0.0001). No other temporal or seasonal trend was observed. Compared with cases from January 2005-July 2014 (n = 29), cases from August-December 2014 (n = 15) were younger (p = 0.002), more frequently had a preceding respiratory/febrile illness (p = 0.03), more often had only upper extremities involved (p = 0.008), and more often had upper extremity monoplegia (p = 0.03). Cases from August-December 2014 also had higher WBC counts in cerebrospinal fluid (p = 0.013).
CONCLUSION: Our data support emergence of AFM in 2014 in the United States, and those cases demonstrated distinctive features compared with preceding sporadic cases. |
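A Poisson exact comparison of two rates, as used above for the 2005-2013 versus 2014 contrast, can be computed by conditioning on the total case count, under which the count in one group is binomial. The sketch below uses denominators back-calculated roughly from the reported rates (26 cases at 2.4/100,000 and 18 at 12.6/100,000); the study's true denominators may differ:

```python
from math import comb

def poisson_rate_pvalue(k1, n1, k2, n2):
    """One-sided exact test that the group-2 rate exceeds the group-1 rate.

    Conditional on the total count k = k1 + k2, the group-2 count is
    Binomial(k, n2 / (n1 + n2)) under equal rates; the p-value is the
    upper-tail probability P(X >= k2)."""
    k = k1 + k2
    p = n2 / (n1 + n2)
    return sum(comb(k, x) * p ** x * (1 - p) ** (k - x)
               for x in range(k2, k + 1))

# Approximate hospitalization denominators, illustration only
p_value = poisson_rate_pvalue(26, 1_083_000, 18, 143_000)
```

With 18 events where roughly 5 would be expected under equal rates, the upper-tail probability is far below 0.0001, consistent with the p-value reported in the abstract.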
Identification of United States counties at elevated risk for congenital syphilis using predictive modeling and a risk scoring system
Cuffe KM , Kang JDY , Dorji T , Bowen VB , Leichliter JS , Torrone E , Bernstein KT . Sex Transm Dis 2020 47 (5) 290-295 BACKGROUND: Although congenital syphilis (CS) is preventable through timely screening and treatment, CS rates are increasing in the United States (US), with cases reported in 5% of counties in 2015. Although individual-level factors are important predictors of CS, given its geographic concentration, it is also imperative to understand which county-level factors are associated with CS. METHODS: This is a secondary analysis of county CS cases reported to the National Notifiable Disease Surveillance System (NNDSS) during 2014-2015 and 2016-2017. We developed a predictive model to identify county-level factors associated with CS and used these factors to predict counties at elevated risk for future CS. RESULTS: Our final model identified 973 counties (31.0% of all US counties) at elevated risk for CS (sensitivity: 88.1%; specificity: 74.0%). County factors that were predictive of CS included metropolitan area, income inequality, P&S syphilis rates among women and MSM, and population proportions of those who are non-Hispanic Black, Hispanic, living in urban areas, and uninsured. The predictive model using 2014-2015 CS outcome data was predictive of 2016-2017 CS cases (area under the curve = 89.2%). CONCLUSIONS: Given the dire consequences of CS, increasing prevention efforts remains important. The ability to predict counties at the highest risk for CS based on county factors may help target CS resources where they are needed most. |
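The sensitivity and specificity reported for the county risk model follow the standard confusion-matrix definitions. A minimal sketch of that arithmetic; the county counts below are hypothetical and chosen only to mirror the reported 88.1%/74.0% scale, not taken from the study:

```python
# Sensitivity/specificity as used to evaluate a county-level risk classifier.
# Confusion-matrix counts here are hypothetical, for illustration only.

def sensitivity(tp, fn):
    # Proportion of counties with CS cases that the model flagged as elevated risk
    return tp / (tp + fn)

def specificity(tn, fp):
    # Proportion of counties without CS cases that the model left unflagged
    return tn / (tn + fp)

tp, fn = 88, 12   # hypothetical: counties with CS, flagged vs. missed
tn, fp = 74, 26   # hypothetical: counties without CS, unflagged vs. flagged
print(sensitivity(tp, fn))  # 0.88
print(specificity(tn, fp))  # 0.74
```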
Rates of suicidal ideation among HIV-infected patients in care in the HIV Outpatient Study 2000-2017, USA
Durham MD , Armon C , Mahnken JD , Novak RM , Palella F , Tedaldi E , Buchacz K . Prev Med 2020 134 106011 BACKGROUND: Suicidal ideation (SI) refers to an individual thinking about, considering or planning suicide. Identifying and characterizing persons with HIV (PWH) at greater risk for SI may lead to better suicide prevention strategies and quality of life improvement. METHODS: Using clinical data gathered from medical chart abstraction for HIV Outpatient Study (HOPS) participants from 2000 to 2017, we assessed SI frequency among PWH in care and explored factors associated with the presence of SI diagnoses using linear mixed models analyses of case-matched participants. RESULTS: Among 6706 participants, 224 (3.3%) had a charted diagnosis of SI. Among those with SI, median age was 43.4 years (interquartile range [IQR]: 38.7-50.3), median CD4 count was 439 cells/mm³ (IQR: 237-686), 71.4% were male, 54% were men who have sex with men (MSM), 25.4% heterosexual, and 13.4% persons who inject drugs. In multivariable analysis, persons at increased risk for SI were more likely to be <50 years old (adjusted rate ratio [aRR] 1.86, 95% confidence interval [95%CI] 1.36-2.53), to be non-Hispanic/Latino black (aRR 1.75; 95%CI 1.29-2.38), to have a CD4+ cell count <350 cells/mm³ (aRR 1.32; 95%CI 1.05-1.65), to have a viral load ≥50 copies/mL (aRR 1.49; 95%CI 1.12-1.98), to have stopped antiretroviral therapy (aRR 1.46; 95%CI 1.10-1.95), and to have a history of alcohol dependence (aRR 2.75; 95%CI 1.67-4.52) or drug overdose (aRR 4.09; 95%CI 2.16-7.71). CONCLUSION: Routine mental health assessment and monitoring are needed in HIV clinical practice to better understand factors associated with SI and to inform the development of preventive interventions. |
Assessment of routine screening of pediatric contacts of adults with tuberculosis disease in Tanzania
Emerson C , Ng'eno B , Ngowi B , Pals S , Kohi W , Godwin M , Date A , Modi S . Public Health Action 2019 9 (4) 148-152 Setting: Ten selected healthcare facilities in Tanzania, March-April 2016. Objective: To assess the implementation of screening among pediatric contacts of adults with tuberculosis (TB) disease. Design: Using a mixed-methods approach, we conducted a questionnaire study among sputum smear-positive adult TB patients and abstracted data from their patient cards to assess the implementation of a child contact management (CCM) intervention. We also conducted in-depth interviews with healthcare workers (HCWs) to solicit their views on clinical practices and challenges in CCM. Results: A total of 141 adult smear-positive TB patients reported 396 children living in their households; detailed information was available for 346 (87.4%). Only 37 (10.7%) children were clinically assessed for TB, 5 (13.5%) were diagnosed with TB, and 22 (59.0%) were started on isoniazid preventive therapy (IPT). Of the 320 children whose caregivers responded to whether their children had undergone human immunodeficiency virus (HIV) testing, 55 (17.2%) had been tested and one (1.8%) was HIV-positive. Forty-one HCWs described passive CCM without use of contact or IPT registers. Conclusions: We identified gaps in the implementation of TB screening, IPT provision, and HIV testing in pediatric contacts of adults with sputum smear-positive TB. Systematic efforts, including increasing HCW training and educating the community, may improve implementation. |
Evaluation of algorithms used for PrEP surveillance using a reference population from New York City, July 2016-June 2018
Furukawa NW , Smith DK , Gonzalez CJ , Huang YA , Hanna DB , Felsen UR , Zhu W , Arnsten JH , Patel VV . Public Health Rep 2020 135 (2) 33354920904085 OBJECTIVE: Daily tenofovir disoproxil fumarate/emtricitabine (TDF/FTC) use as HIV preexposure prophylaxis (PrEP) is monitored by identifying TDF/FTC prescriptions from pharmacy databases and applying diagnosis codes and antiretroviral data to algorithms that exclude TDF/FTC prescribed for HIV postexposure prophylaxis (PEP), HIV treatment, and hepatitis B virus (HBV) treatment. We evaluated the accuracy of 3 algorithms used by the Centers for Disease Control and Prevention (CDC), Gilead Sciences, and the New York State Department of Health (NYSDOH) using a reference population in Bronx, New York. METHODS: We extracted diagnosis codes and data on all antiretroviral prescriptions other than TDF/FTC from an electronic health record database for persons aged ≥16 years prescribed TDF/FTC during July 2016-June 2018 at Montefiore Medical Center. We reviewed medical records to classify the true indication of first TDF/FTC use as PrEP, PEP, HIV treatment, or HBV treatment. We applied each algorithm to the reference population and compared the results with the medical record review. RESULTS: Of 2862 patients included in the analysis, 694 used PrEP, 748 used PEP, 1407 received HIV treatment, and 13 received HBV treatment. The algorithms had high specificity (range: 98.4%-99.0%), but the sensitivity of the CDC algorithm using a PEP definition of TDF/FTC prescriptions ≤30 days was lower (80.3%) than the sensitivity of the algorithms developed by Gilead Sciences (94.7%) or NYSDOH (96.1%). Defining PEP as TDF/FTC prescriptions ≤28 days improved CDC algorithm performance (sensitivity, 95.8%; specificity, 98.8%). CONCLUSIONS: Adopting the definition of PEP as ≤28 days of TDF/FTC in the CDC algorithm should improve the accuracy of national PrEP surveillance. |
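The day-supply threshold at the heart of the algorithm comparison can be sketched as a simple classification rule. This is an illustrative simplification with hypothetical function and parameter names, not the actual CDC, Gilead Sciences, or NYSDOH implementation:

```python
# Simplified sketch of the kind of prescription-classification rule the
# surveillance algorithms apply. Names and structure are illustrative only.

def classify_tdf_ftc(days_supplied, other_arvs, hiv_dx, hbv_dx, pep_max_days=28):
    """Classify a first TDF/FTC prescription as PEP, PrEP, HIV or HBV treatment."""
    if hiv_dx or other_arvs:        # HIV diagnosis codes or other antiretrovirals
        return "HIV treatment"
    if hbv_dx:                      # hepatitis B diagnosis codes
        return "HBV treatment"
    if days_supplied <= pep_max_days:
        return "PEP"                # short courses presumed postexposure
    return "PrEP"

# A 30-day supply flips category between the <=30-day and <=28-day PEP cutoffs,
# which is why tightening the cutoff changed the CDC algorithm's sensitivity.
print(classify_tdf_ftc(30, False, False, False, pep_max_days=30))  # PEP
print(classify_tdf_ftc(30, False, False, False, pep_max_days=28))  # PrEP
```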
The changing landscape of pediatric viral enteropathogens in the post-rotavirus vaccine era
Halasa N , Piya B , Stewart LS , Rahman H , Payne DC , Woron A , Thomas L , Constantine-Renna L , Garman K , McHenry R , Chappell J , Spieker AJ , Fonnesbeck C , Batarseh E , Hamdan L , Wikswo ME , Parashar U , Bowen MD , Vinje J , Hall AJ , Dunn JR . Clin Infect Dis 2020 72 (4) 576-585 BACKGROUND: Acute gastroenteritis (AGE) is a common reason for children to seek medical care. However, the viral etiology of AGE illness is not well described in the post-rotavirus vaccine era, particularly in the outpatient (OP) setting. METHODS: Between 2012 and 2015, children 15 days through 17 years old presenting with AGE to Vanderbilt Children's Hospital, Nashville, TN, were enrolled prospectively from the inpatient, emergency department, and OP settings, and stool specimens were collected. Healthy controls (HCs) were enrolled and frequency-matched for period, age group, race, and ethnicity. Stool specimens were tested by reverse-transcription real-time quantitative polymerase chain reaction for norovirus, sapovirus, and astrovirus RNA and by Rotaclone enzyme immunoassay for rotavirus antigen, followed by PCR verification of antigen detection. RESULTS: A total of 3705 AGE cases and 1563 HCs were enrolled, among whom 2885 cases (78%) and 1110 HCs (71%) provided stool specimens that were tested. All four viruses were detected more frequently in AGE cases than in HCs: norovirus, 22% vs. 8%; rotavirus, 10% vs. 1%; sapovirus, 10% vs. 5%; and astrovirus, 5% vs. 2% (p<0.001 for each virus). AGE rates in the OP setting were highest for norovirus compared with the other three viruses. Children under five years old had higher OP AGE rates than older children for all viruses. CONCLUSIONS: Norovirus remains the most common virus detected in all settings, occurring nearly twice as frequently as the next most common pathogens, sapovirus and rotavirus. 
Combined, these four viruses were associated with almost half of all AGE visits and therefore are an important reason for children to seek medical care. |
Clinical features of human metapneumovirus-associated community-acquired pneumonia hospitalizations
Howard LM , Edwards KM , Zhu Y , Grijalva CG , Self WH , Jain S , Ampofo K , Pavia AT , Arnold SR , McCullers JA , Anderson EJ , Wunderink RG , Williams DJ . Clin Infect Dis 2020 72 (1) 108-117 BACKGROUND: Human metapneumovirus (HMPV) is a leading cause of respiratory tract infections. Few studies have compared the clinical characteristics and severity of HMPV-associated pneumonia with other pathogens. METHODS: Active population-based surveillance was previously conducted for radiographically-confirmed community-acquired pneumonia hospitalizations among children and adults in eight United States hospitals. Clinical data and specimens for pathogen detection were systematically collected. We described clinical features of all HMPV-associated pneumonia, and after excluding co-detections with other pathogen types, we compared features of HMPV-associated pneumonia with other viral, atypical, and bacterial pneumonia and modeled severity (mild, moderate, severe) and length of stay using multivariable proportional odds regression. RESULTS: HMPV was detected in 298/2358 (12.6%) children and 88/2320 (3.8%) adults hospitalized with pneumonia and was commonly co-detected with other pathogens (125/298 [42%] children and 21/88 [24%] adults). Fever and cough were the most common presenting symptoms of HMPV-associated pneumonia and were also common symptoms of other pathogens. After excluding co-detections, in children (n=1778), compared to HMPV (reference), bacterial pneumonia exhibited increased severity (OR 3.66 [95% CI 1.43-9.40]), RSV (0.76 [0.59-0.99]) and atypical (0.39 [0.19-0.81]) infections exhibited decreased severity, and other viral pneumonia exhibited similar severity (0.88 [0.55-1.39]). In adults (n=2145), bacterial (3.74 [1.87-7.47]) and RSV pneumonia (1.82 [1.32-2.50]) were more severe than HMPV (reference), but all other pathogens had similar severity. CONCLUSIONS: Clinical features did not reliably distinguish HMPV-associated pneumonia from other pathogens. 
HMPV-associated pneumonia was less severe than bacterial and adult RSV pneumonia but otherwise as or more severe than other common pathogens. |
Norovirus seroprevalence among adults in the United States: Analysis of NHANES serum specimens from 1999-2000 and 2003-2004
Kirby AE , Kienast Y , Zhu W , Barton J , Anderson E , Sizemore M , Vinje J , Moe CL . Viruses 2020 12 (2) Norovirus is the most common cause of epidemic and endemic acute gastroenteritis. However, national estimates of the infection burden are challenging. This study used a nationally representative serum bank to estimate seroprevalence to five norovirus genotypes, including three GII.4 variants, in the US population aged 16 to 49 years: GI.1 Norwalk, GI.4, GII.3, GII.4 US95/96, GII.4 Farmington Hills, GII.4 New Orleans, and GIV.1. Changes in seroprevalence to the three norovirus GII.4 variants between the 1999-2000 and 2003-2004 study cycles were measured to examine the role of population immunity in the emergence of pandemic GII.4 noroviruses. The overall population-adjusted seroprevalence to any norovirus was 90.0% (1999-2000) and 95.9% (2003-2004). Seroprevalence was highest to GI.1 Norwalk, GII.3, and the three GII.4 noroviruses. Seroprevalence to GII.4 Farmington Hills increased significantly between the 1999-2000 and 2003-2004 study cycles, consistent with the emergence of this pandemic strain. Seroprevalence to GII.4 New Orleans also increased over time, but to a lesser degree. Antibodies against the GIV.1 norovirus were consistently detected (population-adjusted seroprevalence 19.1% to 25.9%), with rates increasing with age. This study confirms the high burden of norovirus infection in US adults, with most adults having multiple norovirus infections over their lifetime. |
Availability of safety-net sexually transmitted disease clinical services in the U.S., 2018
Leichliter JS , O'Donnell K , Kelley K , Cuffe KM , Weiss G , Gift TL . Am J Prev Med 2020 58 (4) 555-561 INTRODUCTION: Safety-net sexually transmitted disease services can prevent transmission of sexually transmitted disease. This study assesses the availability of safety-net sexually transmitted disease clinical services across the U.S. METHODS: A 2018 survey of U.S. local health departments examined the availability of safety-net providers and the availability of specific sexually transmitted disease clinical services, including point-of-care testing and treatment. In 2019, Rao-Scott chi-square tests were used to compare service availability by clinic type (sexually transmitted disease clinic versus other clinics). RESULTS: A total of 326 local health departments completed the survey (49% response rate). Of respondents, 64.4% reported that a clinic in their jurisdiction provided safety-net sexually transmitted disease services. Having a safety-net clinic that provided sexually transmitted disease services was more common in medium and large jurisdictions. Sexually transmitted disease clinics were the primary provider in 40.5% of jurisdictions. A wide range of specific sexually transmitted disease services was offered at the primary safety-net clinic for sexually transmitted diseases. Most clinics offered human papillomavirus vaccination and appropriate point-of-care treatment for gonorrhea and syphilis. Fewer than one-quarter of clinics offered point-of-care rapid plasma reagin or darkfield microscopy syphilis testing. Compared with other clinics, services more commonly offered at sexually transmitted disease clinics included same-day services, hepatitis B vaccination, rapid plasma reagin testing (syphilis), any point-of-care testing for gonorrhea, point-of-care trichomonas testing, and extragenital chlamydia or gonorrhea testing. 
CONCLUSIONS: One-third of local health departments reported no safety-net sexually transmitted disease services or were not aware of the services, and availability of specific services varied. Without an expansion of resources, local health departments might explore collaborations with healthcare systems and innovations in testing to expand sexually transmitted disease services. |
Food insecurity and risk indicators for sexually transmitted infection among sexually active persons aged 15-44, National Survey of Family Growth, 2011-2017
Loosier PS , Haderxhanaj L , Beltran O , Hogben M . Public Health Rep 2020 135 (2) 33354920904063 OBJECTIVES: Food insecurity is linked to poor sexual health outcomes, especially among persons engaged in sexual behaviors that are associated with the risk of acquiring sexually transmitted infections (STIs). We examined this link using nationally representative data. METHODS: We used data on adolescents and adults aged 15-44 who reported sexual activity in the past year from 6 years (September 2011-September 2017) of cross-sectional, weighted public-use data from the National Survey of Family Growth. We compared data on persons who did and did not report food insecurity, accounting for demographic characteristics, markers of poverty, and past-year STI risk indicators (ie, engaged in 1 of 4 high-risk activities or diagnosed with chlamydia or gonorrhea). RESULTS: Respondents who reported at least 1 past-year STI risk indicator were significantly more likely to report food insecurity (females: adjusted risk ratio [ARR] = 1.63; 95% confidence interval [CI], 1.35-1.97; P < .001; males: ARR = 1.46; 95% CI, 1.16-1.85) than respondents who did not report an STI risk indicator. This finding was independent of the association between food insecurity and markers of poverty (≤100% federal poverty level [females: ARR = 1.46; 95% CI, 1.23-1.72; P < .001; males: ARR = 1.81; 95% CI, 1.49-2.20; P < .001]; if the respondent or someone in the household had received Special Supplemental Nutrition Program for Women, Infants, and Children or Supplemental Nutrition Assistance Program benefits in the past year [females: ARR = 3.37; 95% CI, 2.81-4.02; P < .001; males: ARR = 3.27; 95% CI, 2.76-3.87; P < .001]). Sex with opposite- and same-sex partners in the past year was significantly associated with food insecurity (females: ARR = 1.44; 95% CI, 1.11-1.85; P = .01; males: ARR = 1.99; 95% CI, 1.15-3.42; P = .02). 
CONCLUSIONS: Food insecurity should be considered a social determinant of health independent of poverty, and its effect on persons at highest risk for STIs, including HIV, should be considered when planning interventions designed to decrease engagement in higher-risk sexual behaviors. |
Social and structural factors associated with sustained viral suppression among heterosexual black men with diagnosed HIV in the United States, 2015-2017
McCree DH , Beer L , Fugerson AG , Tie Y , Bradley ELP . AIDS Behav 2020 24 (8) 2451-2460 This paper describes sociodemographic, sexual risk behavior, and clinical care factors associated with sustained viral suppression (SVS) among heterosexual Black men with diagnosed HIV in the US. The sample comprised 968 men from the 2015-2017 cycles of the Medical Monitoring Project. We used prevalence ratios and a multivariable logistic regression model to identify independent predictors of SVS. About 9% of sexually active men had sex that carries a risk of HIV transmission. Nearly two-thirds lived at or below the poverty level, 13% were underinsured or uninsured, one-quarter experienced food insecurity, and 15% reported recent homelessness. About 26% were not engaged in HIV care, 8% were not currently taking antiretroviral therapy (ART), and 59% had SVS. Among men taking ART, care engagement and adherence were the only significant independent predictors of SVS. Efforts to increase SVS should focus on increasing ART use, care engagement, and ART adherence, and include strategies that address the social and structural factors that influence them. |
Characteristics of patients with acute flaccid myelitis, United States, 2015-2018
McLaren N , Lopez A , Kidd S , Zhang JX , Nix WA , Link-Gelles R , Lee A , Routh JA . Emerg Infect Dis 2020 26 (2) 212-219 Observed peaks of acute flaccid myelitis (AFM) cases have occurred biennially since 2014 in the United States. We aimed to determine if AFM etiology differed between peak and nonpeak years, considering that clinical features of AFM differ by virus etiology. We compared clinical and laboratory characteristics of AFM cases that occurred during peak (2016 and 2018, n = 366) and nonpeak (2015 and 2017, n = 50) years. AFM patients in peak years were younger (5.2 years) than those in nonpeak years (8.3 years). A higher percentage of patients in peak years than nonpeak years had pleocytosis (86% vs. 60%), upper extremity involvement (33% vs. 16%), and an illness preceding limb weakness (90% vs. 62%) and were positive for enterovirus or rhinovirus RNA (38% vs. 16%). Enterovirus D68 infection was associated with AFM only in peak years. Our findings suggest AFM etiology differs between peak and nonpeak years. |
Lack of tularemia among health care providers with close contact with infected patients-a case series
Nelson CA , Brown J , Riley L , Dennis A , Oyer R , Brown C . Open Forum Infect Dis 2020 7 (1) ofz499 Francisella tularensis has a low infectious dose and can infect laboratory staff handling clinical specimens. The risk to health care providers exposed during patient care is poorly defined. We describe 9 examples of health care providers who did not develop tularemia after significant exposures to infected patients. |
Self-reported prevalence of HIV testing among those reporting having been diagnosed with selected sexually transmitted infections or hepatitis C, United States, 2005-2016
Patel SN , Delaney KP , Pitasi MA , Oraka E , Tao G , Van Handel M , Kilmer G , DiNenno EA . Sex Transm Dis 2020 47 S53-S60 BACKGROUND: Persons with sexually transmitted infections (STIs) or hepatitis C virus (HCV) infection often have indicators of HIV risk. We used weighted data from six cycles of the National Health and Nutrition Examination Survey (NHANES) to assess the proportion of persons who reported ever being diagnosed with a selected STI or HCV infection and who reported that they were ever tested for HIV. METHODS: Persons aged 20-59 years with prior knowledge of HCV infection before receiving NHANES HCV RNA positive results (2005-2012) or reporting ever being told by a doctor that they had HCV infection (2013-2016), or ever had genital herpes, or had chlamydia or gonorrhea in the past 12 months, were categorized as having had a selected STI or HCV infection. Weighted proportions and 95% confidence intervals were estimated for reporting ever being tested for HIV for those who did and did not report a selected STI or HCV infection. RESULTS: A total of 19,102 respondents had non-missing data for STI and HCV diagnoses and HIV testing history; 44.4% reported ever having been tested for HIV and 5.2% reported being diagnosed with a selected STI or HCV infection. The proportion reporting an HIV test was higher for the group that reported a STI or HCV infection than the group that did not. CONCLUSION: Self-reported HIV testing remains low in the United States, even among those who reported a previous selected STI or HCV infection. Ensuring HIV tests are conducted routinely for those with overlapping risk factors can help facilitate diagnosis of HIV infections. |
Updated assessment of risks and benefits of dolutegravir versus efavirenz in new antiretroviral treatment initiators in sub-Saharan Africa: modelling to inform treatment guidelines
Phillips AN , Bansi-Matharu L , Venter F , Havlir D , Pozniak A , Kuritzkes DR , Wensing A , Lundgren JD , Pillay D , Mellors J , Cambiano V , Jahn A , Apollo T , Mugurungi O , Ripin D , Da Silva J , Raizes E , Ford N , Siberry GK , Gupta RK , Barnabas R , Revill P , Cohn J , Calmy A , Bertagnolio S . Lancet HIV 2020 7 (3) e193-e200 BACKGROUND: The integrase inhibitor dolutegravir is being considered in several countries in sub-Saharan Africa instead of efavirenz for people initiating antiretroviral therapy (ART) because of superior tolerability and a lower risk of resistance emergence. WHO requested updated modelling results for its 2019 Antiretroviral Guidelines update, which was restricted to the choice of dolutegravir or efavirenz in new ART initiators. In response to this request, we modelled the risks and benefits of alternative policies for initial first-line ART regimens. METHODS: We updated an existing individual-based model of HIV transmission and progression in adults to incorporate information on the risk of neural tube defects in women taking dolutegravir at the time of conception, as well as the effects of dolutegravir on weight gain. The model accounted for drug resistance in determining viral suppression, with consequences for clinical outcomes and mother-to-child transmission. We sampled distributions of parameters to create various epidemic setting scenarios, which reflected the diversity of epidemic and programmatic situations in sub-Saharan Africa. For each setting scenario, we considered the situation in 2018 and compared two ART initiation policies: an efavirenz-based regimen for women intending pregnancy with a dolutegravir-based regimen for all others, versus a dolutegravir-based regimen for all initiators, including women intending pregnancy. We considered predicted outcomes over a 20-year period from 2019 to 2039, used a 3% discount rate, and applied a cost-effectiveness threshold of US$500 per disability-adjusted life-year (DALY) averted. 
FINDINGS: Considering updated information on risks and benefits, a policy of ART initiation with a dolutegravir-based regimen rather than an efavirenz-based regimen, including in women intending pregnancy, is predicted to bring population health benefits (10 990 DALYs averted per year) and to be cost-saving (by $2.9 million per year), leading to a reduction in the overall population burden of disease of 16 735 net DALYs per year for a country with an adult population size of 10 million. The policy of ART initiation with a dolutegravir-based regimen, including in women intending pregnancy, was cost-effective in 87% of our setting scenarios, and this finding was robust in various sensitivity analyses, including around the potential negative effects of weight gain. INTERPRETATION: In the context of a range of modelled setting scenarios in sub-Saharan Africa, we found that a policy of ART initiation with a dolutegravir-based regimen, including in women intending pregnancy, was predicted to bring population health benefits and be cost-effective, supporting WHO's strong recommendation for dolutegravir as a preferred drug for ART initiators. FUNDING: Bill & Melinda Gates Foundation. |
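The 3% discount rate mentioned in the METHODS applies standard exponential discounting to each future year's health benefits and costs. A minimal sketch of that calculation, reusing the annual DALY figure from the abstract; the helper function is illustrative and in no way the authors' individual-based model:

```python
# Standard 3% annual discounting of a constant yearly health benefit.
# Illustrative only; the study's individual-based model is far more detailed.

def discounted_total(annual_value, years, rate=0.03):
    """Sum an annual stream over `years`, discounting year t by (1 + rate)^t."""
    return sum(annual_value / (1 + rate) ** t for t in range(years))

# 10,990 DALYs averted per year over the 20-year horizon (2019-2039);
# discounting shrinks the undiscounted total of 219,800 DALYs noticeably.
total_dalys = discounted_total(10_990, 20)
```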
HIV testing and linkage to care among transgender women who have sex with men: 23 U.S. cities
Pitasi MA , Clark HA , Chavez PR , DiNenno EA , Delaney KP . AIDS Behav 2020 24 (8) 2442-2450 Transgender women face unique barriers to HIV testing and linkage to care. This article describes the results of a national testing initiative conducted by 36 community-based and other organizations using a variety of recruitment and linkage-to-care strategies. A total of 2191 HIV tests were conducted with an estimated 1877 unique transgender women, and 4.6% of the transgender women had confirmed positive results. Two thirds (66.3%) were linked to care within approximately three months of follow-up, and the median time to linkage was 7 days. Transgender women tested at clinical sites were linked to care faster than those tested at non-clinical sites (median: 0 vs. 12 days; P = .003). Despite the use of a variety of linkage-to-care strategies, the proportion of transgender women successfully linked to care was below national goals. Tailored programs and interventions are needed to increase HIV testing and improve timely linkage to care in this population. |
Duration of exposure among close contacts of patients with infectious tuberculosis and risk of latent tuberculosis infection
Reichler MR , Khan A , Yuan Y , Chen B , McAuley J , Mangura B , Sterling TR . Clin Infect Dis 2020 71 (7) 1627-1634 BACKGROUND: Predictors of latent tuberculosis infection (LTBI) among close contacts of persons with infectious tuberculosis (TB) are incompletely understood, particularly the number of exposure hours. METHODS: We prospectively enrolled adult patients with culture-confirmed pulmonary TB and their close contacts at 9 health departments in the United States and Canada. Patients with TB were interviewed and close contacts were interviewed and screened for TB and LTBI during contact investigations. RESULTS: LTBI was diagnosed in 1390 (46%) of 3040 contacts, including 624 (31%) of 2027 US/Canadian-born and 766 (76%) of 1013 non-US/Canadian-born contacts. In multivariable analysis, age ≥5 years, male sex, non-US/Canadian birth, smear-positive index patient, and shared bedroom with an index patient (P < .001 for each), as well as exposure to >1 index patient (P < .05), were associated with LTBI diagnosis. LTBI prevalence increased with increasing exposure duration, with an incremental prevalence increase of 8.2% per 250 exposure hours (P < .0001). For contacts with <250 exposure hours, no difference in prevalence was observed per 50 exposure hours (P = .63). CONCLUSIONS: Hours of exposure to a patient with infectious TB is an important LTBI predictor, with a possible risk threshold of 250 hours. More exposures, closer exposure proximity, and more extensive index patient disease were additional LTBI predictors. |
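The reported dose-response trend, an 8.2 percentage-point rise in LTBI prevalence per 250 exposure hours, is a linear slope that can be sketched directly. This back-of-envelope helper is illustrative only, not the study's regression model:

```python
# Linear trend reported above: +8.2 percentage points of LTBI prevalence
# per 250 hours of exposure (applicable above the ~250-hour threshold).
# Illustrative helper only; the study fit this from contact-level data.

def prevalence_increase(exposure_hours, slope_per_250h=8.2):
    """Predicted percentage-point increase in LTBI prevalence."""
    return slope_per_250h * exposure_hours / 250

# e.g., 500 exposure hours corresponds to twice the per-250-hour increment
increase = prevalence_increase(500)
```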
Prevalence of viral load suppression, predictors of virological failure and patterns of HIV drug resistance after 12 and 48 months on first-line antiretroviral therapy: a national cross-sectional survey in Uganda
Ssemwanga D , Asio J , Watera C , Nannyonjo M , Nassolo F , Lunkuse S , Salazar-Gonzalez JF , Salazar MG , Sanyu G , Lutalo T , Kabuga U , Ssewanyana I , Namatovu F , Namayanja G , Namale A , Raizes E , Kaggwa M , Namuwenge N , Kirungi W , Katongole-Mbidde E , Kaleebu P . J Antimicrob Chemother 2020 75 (5) 1280-1289 OBJECTIVES: We implemented the WHO cross-sectional survey protocol to determine rates of HIV viral load (VL) suppression (VLS) and the weighted prevalence, predictors and patterns of acquired drug resistance (ADR) in individuals with virological failure (VF), defined as VL ≥1000 copies/mL. METHODS: We enrolled 547 and 1064 adult participants on first-line ART for 12 (±3) months (ADR12) and ≥48 months (ADR48), respectively. Dried blood spots and plasma specimens were collected for VL testing and genotyping among the VFs. RESULTS: VLS was 95.0% (95% CI 93.4%-96.5%) in the ADR12 group and 87.9% (95% CI 85.0%-90.9%) in the ADR48 group. The weighted prevalence of ADR was 96.1% (95% CI 72.9%-99.6%) in the ADR12 and 90.4% (95% CI 73.6%-96.8%) in the ADR48 group, out of the 30 and 95 successful genotypes in the respective groups. Initiation on a zidovudine-based regimen compared with a tenofovir-based regimen was significantly associated with VF in the ADR48 group; adjusted OR (AOR) 1.96 (95% CI 1.13-3.39). Independent predictors of ADR in the ADR48 group were initiation on a zidovudine-based regimen compared with tenofovir-based regimens, AOR 3.16 (95% CI 1.34-7.46) and ART duration of ≥82 months compared with <82 months, AOR 1.92 (95% CI 1.03-3.59). CONCLUSIONS: While good VLS was observed, the high prevalence of ADR among the VFs before they underwent the recommended three intensive adherence counselling (IAC) sessions followed by repeat VL testing implies that IAC prior to treatment switching may be of limited benefit in improving VLS. |
Prospective evaluation of HIV testing technologies in a clinical setting: Protocol for Project DETECT
Stekler JD , Violette LR , Clark HA , McDougal SJ , Niemann LA , Katz DA , Chavez PR , Wesolowski LG , Ethridge SF , McMahan VM , Cornelius-Hudson A , Delaney KP . JMIR Res Protoc 2020 9 (1) e16332 BACKGROUND: HIV testing guidelines provided by the Centers for Disease Control and Prevention (CDC) are continually changing to reflect advancements in new testing technology. Evaluation of existing and new point-of-care (POC) HIV tests is crucial to inform testing guidelines and provide information to clinicians and other HIV test providers. Characterizing the performance of POC HIV tests using unprocessed specimens can provide estimates for the window period of detection, or the time from HIV acquisition to test positivity, which allows clinicians and other HIV providers to select the appropriate POC HIV tests for persons who may be recently infected with HIV. OBJECTIVE: This paper describes the protocols and procedures used to evaluate the performance of the newest POC tests and determine their sensitivity during early HIV infection. METHODS: Project DETECT is a CDC-funded study that is evaluating POC HIV test performance. Part 1 is a cross-sectional, retrospective study comparing behavioral characteristics and HIV prevalence of the overall population of the Public Health-Seattle & King County (PHSKC) Sexually Transmitted Disease (STD) Clinic to Project DETECT participants enrolled in part 2. Part 2 is a cross-sectional, prospective study evaluating POC HIV tests in real time using unprocessed whole blood and oral fluid specimens. A POC nucleic acid test (NAT) was added to the panel of HIV tests in June 2018. Part 3 is a longitudinal, prospective study evaluating seroconversion sensitivity of POC HIV tests through serial follow-up testing. For comparison, HIV-1 RNA and HIV-1/HIV-2 antigen/antibody tests are also performed for participants enrolled in part 2 or 3. 
A behavioral survey that collects information about demographics, history of HIV testing, STD history, symptoms of acute HIV infection, substance use, sexual behaviors in the aggregate and with recent partners, and use of pre-exposure prophylaxis and antiretroviral therapy is completed at each part 2 or 3 visit. RESULTS: Between September 2015 and March 2019, there were 14,990 Project DETECT-eligible visits (part 1) to the PHSKC STD Clinic resulting in 1819 part 2 Project DETECT study visits. The longitudinal study within Project DETECT (part 3) enrolled 27 participants with discordant POC test results from their part 2 visit, and 10 (37%) were followed until they had fully seroconverted with concordant positive POC test results. Behavioral survey data and HIV test results, sensitivity, and specificity will be presented elsewhere. CONCLUSIONS: Studies such as Project DETECT are critical for evaluating POC HIV test devices as well as describing characteristics of persons at risk for HIV acquisition in the United States. HIV tests in development, including POC NATs, will provide new opportunities for HIV testing programs. INTERNATIONAL REGISTERED REPORT IDENTIFIER (IRRID): RR1-10.2196/16332. |
Guidelines for the treatment of latent tuberculosis infection: Recommendations from the National Tuberculosis Controllers Association and CDC, 2020
Sterling TR , Njie G , Zenner D , Cohn DL , Reves R , Ahmed A , Menzies D , Horsburgh CR Jr , Crane CM , Burgos M , LoBue P , Winston CA , Belknap R . MMWR Recomm Rep 2020 69 (1) 1-11 Comprehensive guidelines for treatment of latent tuberculosis infection (LTBI) among persons living in the United States were last published in 2000 (American Thoracic Society, CDC. Targeted tuberculin testing and treatment of latent tuberculosis infection. Am J Respir Crit Care Med 2000;161:S221-47). Since then, several new regimens have been evaluated in clinical trials. To update previous guidelines, the National Tuberculosis Controllers Association (NTCA) and CDC convened a committee to conduct a systematic literature review and make new recommendations for the most effective and least toxic regimens for treatment of LTBI among persons who live in the United States. The systematic literature review included clinical trials of regimens to treat LTBI. Quality of evidence (high, moderate, low, or very low) from clinical trial comparisons was appraised using the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) criteria. In addition, a network meta-analysis evaluated regimens that had not been compared directly in clinical trials. The effectiveness outcome was tuberculosis disease; the toxicity outcome was hepatotoxicity. Strong GRADE recommendations required at least moderate evidence of effectiveness and that the desirable consequences outweighed the undesirable consequences in the majority of patients. Conditional GRADE recommendations were made when determination of whether desirable consequences outweighed undesirable consequences was uncertain (e.g., with low-quality evidence). These updated 2020 LTBI treatment guidelines include the NTCA- and CDC-recommended treatment regimens that comprise three preferred rifamycin-based regimens and two alternative monotherapy regimens with daily isoniazid.
All recommended treatment regimens are intended for persons infected with Mycobacterium tuberculosis that is presumed to be susceptible to isoniazid or rifampin. These updated guidelines do not apply when evidence is available that the infecting M. tuberculosis strain is resistant to both isoniazid and rifampin; recommendations for treating contacts exposed to multidrug-resistant tuberculosis were published in 2019 (Nahid P, Mase SR, Migliori GB, et al. Treatment of drug-resistant tuberculosis. An official ATS/CDC/ERS/IDSA clinical practice guideline. Am J Respir Crit Care Med 2019;200:e93-e142). The three rifamycin-based preferred regimens are 3 months of once-weekly isoniazid plus rifapentine, 4 months of daily rifampin, or 3 months of daily isoniazid plus rifampin. Prescribing providers or pharmacists who are unfamiliar with rifampin and rifapentine might confuse the two drugs. They are not interchangeable, and caution should be taken to ensure that patients receive the correct medication for the intended regimen. Preference for these rifamycin-based regimens was made on the basis of effectiveness, safety, and high treatment completion rates. The two alternative treatment regimens are daily isoniazid for 6 or 9 months; isoniazid monotherapy is efficacious but has higher toxicity risk and lower treatment completion rates than shorter rifamycin-based regimens. In summary, short-course (3- to 4-month) rifamycin-based treatment regimens are preferred over longer-course (6- to 9-month) isoniazid monotherapy for treatment of LTBI. These updated guidelines can be used by clinicians, public health officials, policymakers, health care organizations, and other state and local stakeholders who might need to adapt them to fit individual clinical circumstances. |
Testing the effectiveness and cost-effectiveness of a combination HIV prevention intervention among young cisgender men who have sex with men and transgender women who sell or exchange sex in Thailand: Protocol for the Combination Prevention Effectiveness Study
Wirtz AL , Weir BW , Mon SHH , Sirivongrangson P , Chemnasiri T , Dunne EF , Varangrat A , Hickey AC , Decker MR , Baral S , Okanurak K , Sullivan P , Valencia R , Thigpen MC , Holtz TH , Mock PA , Cadwell B , Adeyeye A , Rooney JF , Beyrer C . JMIR Res Protoc 2020 9 (1) e15354 BACKGROUND: Pre-exposure prophylaxis (PrEP) is highly effective in the prevention of HIV acquisition, particularly for men who have sex with men (MSM). Questions remain on the benefits of PrEP and implementation strategies for those at occupational risk of HIV acquisition in sex work, as well as on methods to support adherence among young people who initiate PrEP. OBJECTIVE: The Combination Prevention Effectiveness study for young cisgender MSM and transgender women (TGW) aims to assess the effectiveness and cost-effectiveness of a combination intervention among HIV-uninfected young MSM and TGW engaged in sex work in Thailand. METHODS: This open-label, nonrandomized assessment compares the relative effectiveness of a combination prevention intervention with and without daily oral emtricitabine and tenofovir disoproxil fumarate (Truvada) PrEP with SMS-based adherence support. HIV-uninfected young MSM and TGW aged 18 to 26 years in Bangkok and Pattaya who self-report selling/exchanging sex at least once in the previous 12 months are recruited by convenience sampling and peer referral and are eligible regardless of their intent to initiate PrEP. At baseline, participants complete a standard assessment for PrEP eligibility and may initiate PrEP then or at any time during study participation. All participants complete a survey and HIV testing at baseline and every 3 months. Participants who initiate PrEP complete monthly pill pickups and may opt in to SMS reminders. All participants are sent brief weekly SMS surveys to assess behavior with additional adherence questions for those who initiated PrEP. Adherence is defined as use of 4 or more pills within the last 7 days.
The analytic plan uses a person-time approach to assess HIV incidence, comparing participant time on oral PrEP to participant time off oral PrEP for 12 to 24 months of follow-up, using a propensity score to control for confounders. Enrollment is based on the goal of observing 620 person-years (PY) on PrEP and 620 PY off PrEP. RESULTS: As of February 2019, 445 participants (417 MSM and 28 TGW) have contributed approximately 168 PY with 95% (73/77) retention at 12 months. Of the enrolled participants, 74.2% (330/445) initiated PrEP at baseline, contributing 134 PY of PrEP adherence, 1 PY of nonadherence, and 33 PY of PrEP nonuse/noninitiation. Some social harms, predominantly related to unintentional participant disclosure of PrEP use and peer stigmatization of PrEP and HIV, have been identified. CONCLUSIONS: The majority of cisgender MSM and TGW who exchange sex and participate in this study are interested in PrEP, report taking sufficient PrEP, and stay on PrEP, though additional efforts are needed to address community misinformation and stigma. This novel multilevel, open-label study design and person-time approach will allow evaluation of the effectiveness and cost-effectiveness of combination prevention intervention in the contexts of both organized sex work and exchanged sex. INTERNATIONAL REGISTERED REPORT IDENTIFIER (IRRID): RR1-10.2196/15354. |
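The person-time comparison described above can be sketched as a crude incidence calculation. The event counts below are hypothetical (only the 620 PY per exposure state comes from the protocol's design target), and the propensity-score adjustment is omitted.

```python
def incidence_per_100py(events, person_years):
    """Crude incidence rate per 100 person-years (PY)."""
    return 100.0 * events / person_years

# Hypothetical event counts at the design target of 620 PY per exposure state;
# these are NOT study results.
rate_on = incidence_per_100py(3, 620)    # HIV diagnoses during time on PrEP
rate_off = incidence_per_100py(12, 620)  # HIV diagnoses during time off PrEP
rate_ratio = rate_on / rate_off          # crude (unadjusted) rate ratio
print(round(rate_on, 2), round(rate_off, 2), round(rate_ratio, 2))
```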
"Submergence" of Western equine encephalitis virus: Evidence of positive selection argues against genetic drift and fitness reductions.
Bergren NA , Haller S , Rossi SL , Seymour RL , Huang J , Miller AL , Bowen RA , Hartman DA , Brault AC , Weaver SC . PLoS Pathog 2020 16 (2) e1008102 Understanding the circumstances under which arboviruses emerge is critical for the development of targeted control and prevention strategies. This is highlighted by the emergence of chikungunya and Zika viruses in the New World. However, to comprehensively understand the ways in which viruses emerge and persist, factors influencing reductions in virus activity must also be understood. Western equine encephalitis virus (WEEV), which declined during the late 20th century in apparent enzootic circulation as well as equine and human disease incidence, provides a unique case study on how reductions in virus activity can be understood by studying evolutionary trends and mechanisms. Previously, we showed using phylogenetics that during this period of decline, six amino acid residues appeared to be positively selected. To assess more directly the effect of these mutations, we utilized reverse genetics and competition fitness assays in the enzootic host and vector (house sparrows and Culex tarsalis mosquitoes). We observed that the mutations contemporary with reductions in WEEV circulation and disease that were non-conserved with respect to amino acid properties had a positive effect on enzootic fitness. We also assessed the effects of these mutations on virulence in the Syrian golden hamster model in relation to a general trend of increased virulence in older isolates. However, no effect on virulence was observed for these mutations. Thus, while WEEV apparently underwent positive selection for infection of enzootic hosts, residues associated with mammalian virulence were likely eliminated from the population by genetic drift or negative selection.
These findings suggest that ecologic factors rather than fitness for natural transmission likely caused decreased levels of enzootic WEEV circulation during the late 20th century. |
LYMESIM 2.0: An updated simulation of blacklegged tick (Acari: Ixodidae) population dynamics and enzootic transmission of Borrelia burgdorferi (Spirochaetales: Spirochaetaceae)
Gaff H , Eisen RJ , Eisen L , Nadolny R , Bjork J , Monaghan AJ . J Med Entomol 2020 57 (3) 715-727 Lyme disease is the most commonly reported vector-borne disease in the United States, and the number of cases reported each year continues to rise. The complex nature of the relationships between the pathogen (Borrelia burgdorferi sensu stricto), the tick vector (Ixodes scapularis Say), multiple vertebrate hosts, and numerous environmental factors creates challenges for understanding and predicting tick population and pathogen transmission dynamics. LYMESIM is a mechanistic model developed in the late 1990s to simulate the life history of I. scapularis and transmission dynamics of B. burgdorferi s.s. Here we present LYMESIM 2.0, a modernized version of LYMESIM, that includes several modifications to enhance the biological realism of the model and to generate outcomes that are more readily measured under field conditions. The model is tested for three geographically distinct locations in New York, Minnesota, and Virginia. Model-simulated timing and densities of questing nymphs, infected nymphs, and abundances of nymphs feeding on hosts are consistent with field observations and reports for these locations. Sensitivity analysis highlighted the importance of temperature in host finding for the density of nymphs, the importance of transmission from small mammals to ticks for the density of infected nymphs, and the importance of temperature-related tick survival for both the density of nymphs and the density of infected nymphs. A key challenge for accurate modeling of these metrics is the need for regionally representative inputs for host populations and their fluctuations. LYMESIM 2.0 is a useful public health tool that can be used downstream to evaluate tick control interventions and can be adapted for other ticks and pathogens. |
Entomological investigation detects dengue virus type 1 in Aedes (Stegomyia) albopictus (Skuse) during the 2015-16 outbreak in Hawaii
Hasty JM , Felix GE , Amador M , Barrera R , Santiago GS , Nakasone L , Park SY , Okoji S , Honda E , Asuncion B , Save M , Munoz-Jordan JL , Martinez-Conde S , Medina FA , Waterman SH , Petersen LR , Johnston DI , Hemme RR . Am J Trop Med Hyg 2020 102 (4) 869-875 A dengue outbreak occurred on Hawaii Island between September 2015 and March 2016. Entomological investigations were undertaken between December 2015 and February 2016 to determine which Aedes mosquito species were responsible for the outbreak. A total of 3,259 mosquitoes were collected using a combination of CDC autocidal gravid ovitraps, Biogents BG-Sentinel traps, and hand-nets; immature mosquitoes were collected during environmental surveys. The composition of species was Aedes albopictus (58%), Aedes aegypti (25%), Wyeomyia mitchelli (7%), Aedes vexans (5%), Culex quinquefasciatus (4%), and Aedes japonicus (1%). Adult mosquitoes were analyzed by real-time reverse transcription polymerase chain reaction (PCR) for the presence of dengue virus (DENV) RNA. Of the 185 pools of female mosquitoes tested, 15 containing Ae. albopictus were positive for the presence of DENV type 1 RNA. No virus was detected in pools of the remaining species. Phylogenetic analysis showed the virus strain belonged to genotype I and was closely related to strains that were circulating in the Pacific between 2008 and 2014. This is the first report of detection of DENV in Ae. albopictus from Hawaii. |
Transmission of Coxiella burnetii by ingestion in mice
Miller HK , Priestley RA , Kersh GJ . Epidemiol Infect 2020 148 e21 Coxiella burnetii, the causative agent of Q fever, is widely present in dairy products around the world. It has been isolated from unpasteurised milk and cheese and can survive for extended periods of time under typical storage conditions for these products. Although consumption of contaminated dairy products has been suggested as a potential route for transmission, it remains controversial. Given the high prevalence of C. burnetii in dairy products, we sought to examine the feasibility of transmission by ingestion of the major sequence types (ST16, ST8 and ST20) of C. burnetii circulating in the United States. We delivered three strains of C. burnetii, comprising each sequence type, directly into the stomachs of immunocompetent BALB/c mice via oral gavage (OG) and assessed them for clinical symptoms, serological response and bacterial dissemination. We found that mice receiving C. burnetii by OG had notable splenomegaly only after infection with ST16. A robust immune response and persistence in the stomach and mesenteric lymph nodes were observed in mice receiving ST16 and ST20 by OG, and dissemination of C. burnetii to peripheral tissues was observed in all OG-infected mice. These findings support the oral route as a mode of transmission for C. burnetii. |
The serological prevalence of rabies virus-neutralizing antibodies in the bat population on the Caribbean island of Trinidad
Seetahal JFR , Greenberg L , Satheshkumar PS , Sanchez-Vazquez MJ , Legall G , Singh S , Ramkissoon V , Schountz T , Munster V , Oura CAL , Carrington CVF . Viruses 2020 12 (2) Rabies virus (RABV) is the only lyssavirus known to be present within the Caribbean. The island of Trinidad is richly diverse in chiropteran fauna and endemic for bat-transmitted rabies, with low RABV isolation rates observed in this population. We aimed to determine the seroprevalence of rabies virus neutralizing antibodies (RVNA) in light of spatio-temporal and bat demographic factors to infer the extent of natural exposure to RABV in the Trinidadian bat population. RVNA titers were determined by the RABV micro-neutralization test on 383 bat samples representing 21 species, comprising 30.9% of local bat diversity, from 31 locations across the island over 5 years. RVNA was detected in 33 samples (8.6%) representing 6 bat species (mainly frugivorous) with titers ranging from 0.1 to 19 IU/mL (mean 1.66 IU/mL). The analyses, based on a multivariable binomial generalised linear mixed-effects model, showed that bat age and year of capture were significant predictors of seropositivity. Thus, juvenile bats were more likely to be seropositive when compared to adults (estimate 1.13; p = 0.04), which may suggest early exposure to RABV with possible implications for viral amplification in this population. Temporal variation in rabies seropositivity, 2012-2014 versus 2015-2017 (estimate 1.07; p = 0.03), may have been related to the prevailing rabies epizootic situation. Regarding other factors investigated, RVNA was found in bats from both rural and non-rural areas, as well as in both hematophagous and non-hematophagous bat species. The most common seropositive species, Artibeus jamaicensis planirostris, is ubiquitous throughout the island, which may potentially facilitate human exposure.
The findings of this study should be factored into public health assessments on the potential for rabies transmission by non-hematophagous bats in Trinidad. |
Circumstances involved in unsupervised solid dose medication exposures among young children
Agarwal M , Lovegrove MC , Geller RJ , Pomerleau AC , Sapiano MRP , Weidle NJ , Morgan BW , Budnitz DS . J Pediatr 2020 219 188-195 e6 OBJECTIVE: To identify types of containers from which young children accessed solid dose medications (SDMs) during unsupervised medication exposures and the intended recipients of the medications to advance prevention. STUDY DESIGN: From February to September 2017, 5 US poison centers enrolled individuals calling about unsupervised solid dose medication exposures by children </=5 years. Study participants answered contextually directed questions about exposure circumstances. RESULTS: Sixty-two percent of eligible callers participated. Among 4496 participants, 71.6% of SDM exposures involved children aged </=2 years; 33.8% involved only prescription medications, 32.8% involved only over-the-counter (OTC) products that require child-resistant packaging, and 29.9% involved >/=1 OTC product that does not require child-resistant packaging. More than one-half of exposures (51.5%) involving prescription medications involved children accessing medications that had previously been removed from original packaging, compared with 20.8% of exposures involving OTC products (aOR, 3.39; 95% CI, 2.87-4.00). Attention deficit hyperactivity disorder medications (49.3%) and opioids (42.6%) were often not in any container when accessed; anticonvulsants (41.1%), hypoglycemic agents (33.8%), and cardiovascular/antithrombotic agents (30.8%) were often transferred to alternate containers. Grandparents' medications were involved in 30.7% of prescription medication exposures, but only 7.8% of OTC product exposures (aOR, 3.99; 95% CI, 3.26-4.87). CONCLUSIONS: Efforts to reduce pediatric SDM exposures should also address exposures in which adults, rather than children, remove medications from child-resistant packaging. 
Packaging/storage innovations designed to encourage adults to keep products within child-resistant packaging and specific educational messages could be targeted based on common exposure circumstances, medication classes, and medication intended recipients. |
Changes in emergency department visits for zolpidem-attributed adverse drug reactions after FDA Drug Safety Communications
Geller AI , Zhou EH , Budnitz DS , Lovegrove MC , Dal Pan GJ . Pharmacoepidemiol Drug Saf 2020 29 (3) 352-356 Purpose: To identify possible changes in U.S. emergency department (ED) visits for zolpidem-attributed adverse drug reactions (ADRs) after 2013 Food and Drug Administration (FDA) Drug Safety Communications (DSCs), which notified the public about FDA's new dosing recommendations for zolpidem. Methods: We estimated the occurrence of ED visits for zolpidem-attributed ADRs using nationally representative, public health surveillance of medication harms (National Electronic Injury Surveillance System-Cooperative Adverse Drug Event Surveillance project, 2010-2017). We estimated the number of zolpidem prescriptions using IQVIA National Prescription Audit, 2010-2017. We calculated rates of ED visits for zolpidem-attributed ADRs per 10 000 dispensed zolpidem prescriptions and identified time trends and potential inflection points using joinpoint regression. For comparison, we repeated these analyses for sedating antidepressants commonly used to treat disordered sleep (trazodone, doxepin, and mirtazapine). Results: The best-fit regression model for rates of ED visits for zolpidem-attributed ADRs by 6-month intervals identified a single inflection point in the second half of 2014 (P =.024) with a 6.7% biannual decrease from 2010 to 2014 ([-13.1%, 0.3%], P =.059) and a 13.9% biannual increase from the second half of 2014 through 2017 ([-1.1%, 31.3%], P =.068). No change or inflection points were identified for rates of ED visits for sedating antidepressant-attributed ADRs. Conclusions: While there was a nominal decline in the rate of ED visits for ADRs in the time period before and for 18 months after FDA's 2013 zolpidem DSCs, the decrease was not sustained, and thus questions remain concerning the long-term impact of the zolpidem DSCs on ADRs. |
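A minimal sketch of the rate construction and trend estimate described above: rates are ED visits per 10,000 dispensed prescriptions, and within a segment joinpoint regression fits a log-linear trend whose slope converts to a percent change per interval. The visit and prescription counts below are hypothetical, not the study's data.

```python
import math

def rate_per_10k(visits, prescriptions):
    """ED visits per 10,000 dispensed prescriptions."""
    return 10_000 * visits / prescriptions

def biannual_pct_change(rates):
    """Percent change per 6-month interval from an OLS fit of ln(rate) on time;
    joinpoint regression fits this same log-linear form within each segment."""
    n = len(rates)
    xbar = (n - 1) / 2
    ys = [math.log(r) for r in rates]
    ybar = sum(ys) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in enumerate(ys)) / \
            sum((x - xbar) ** 2 for x in range(n))
    return 100.0 * (math.exp(slope) - 1)

# Hypothetical visit counts against 1,000,000 prescriptions per half-year,
# declining roughly 5% per interval:
rates = [rate_per_10k(v, 1_000_000) for v in (200, 190, 181, 172, 163)]
pct = biannual_pct_change(rates)
print(round(pct, 1))
```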
Serum elimination half-lives adjusted for ongoing exposure of tri- to hexabrominated diphenyl ethers: Determined in persons moving from North America to Australia.
Sjodin A , Mueller JF , Jones R , Schutze A , Wong LY , Caudill SP , Harden FA , Webster TF , Toms LM . Chemosphere 2020 248 125905 The objective of the study was to determine the human serum elimination half-life of polybrominated diphenyl ethers (PBDEs) adjusted for ongoing exposure in subjects moving from a higher exposure region (North America) to a lower exposure region (Australia). The study population comprised exchange students and long-term visitors from North America moving to Brisbane, Australia (N = 27) and local residents (N = 23) who were followed by repeated serum sampling every other month. The local residents were sampled to adjust for ongoing exposure in Australia. Only one visitor remained in Australia for a period of time similar to the elimination half-life and had a sufficiently high initial concentration of PBDEs to derive a half-life. This visitor arrived in Australia in March of 2011 and remained in the country for 1.5 years. Since the magnitude of PBDE exposure is lower in Australia than in North America, we observed an apparent 1st order elimination curve over time from which we have estimated the serum elimination half-lives for BDE28, BDE47, BDE99, BDE100, and BDE153 to be 0.942, 1.19, 1.03, 2.16, and 4.12 years, respectively. Uncertainty in these estimates was quantified using a Monte Carlo simulation. The human serum elimination half-life adjusted for ongoing exposure can allow us to assess the effectiveness of the 2004 United States phase-out of commercial penta- and octaBDE in reducing exposure in the general population. |
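The first-order elimination model underlying these half-lives can be written as C(t) = C0*exp(-k*t), giving t1/2 = ln(2)/k. A minimal two-point sketch, using hypothetical concentrations and omitting the paper's adjustment for ongoing Australian background exposure:

```python
import math

def half_life_from_two_points(c0, ct, years):
    """First-order elimination: C(t) = C0 * exp(-k*t), so t1/2 = ln(2)/k.
    Two-point sketch only; the study also subtracts ongoing background
    exposure, which is omitted here."""
    k = math.log(c0 / ct) / years
    return math.log(2) / k

# Hypothetical serum concentrations (ng/g lipid) measured 1.5 years apart:
half_life = half_life_from_two_points(10.0, 4.2, 1.5)
print(round(half_life, 2))
```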
Factors associated with exposure to trihalomethanes, NHANES 2001-2012
Ashley DL , Smith MM , Silva LK , Yoo YM , De Jesús VR , Blount BC . Environ Sci Technol 2020 54 (2) 1066-1074 Disinfection is critical for maintaining a safe water supply, but the use of chlorine or chloramine leads to exposure to disinfection byproducts (DBPs), including trihalomethanes (THMs), which have been associated with adverse reproductive outcomes and bladder cancer. The U.S. Environmental Protection Agency revised the DBP regulations starting in 1998 to further limit levels of THMs in household water. We analyzed data from the National Health and Nutrition Examination Survey (NHANES) collected between 2001 and 2012 (with 2 years per cycle) using models with and without water-related predictors to examine the utility of including these measures. Median blood chloroform levels (25th-75th percentiles) were 16.2 (9.13-31.2) ng/L in 2001-2002 and 5.97 (2.92-12.3) ng/L in 2011-2012. Median blood bromodichloromethane (BDCM) levels (25th-75th percentiles) were 2.22 (1.06-4.61) ng/L in 2001-2002 and 1.18 (<limit of detection-2.92) ng/L in 2011-2012. THM water concentrations and measures of the recency of time spent in water-use activities were associated with blood THM levels. Being in a pool/hot tub/sauna within 24 h or taking a shower/bath within 6 h of blood collection was associated with elevated blood levels of chloroform and BDCM. When possible, it is important to include recency and external dose when assessing associations with internal dose levels for nonpersistent compounds. |
The association between wildfire smoke exposure and asthma-specific medical care utilization in Oregon during the 2013 wildfire season
Gan RW , Liu J , Ford B , O'Dell K , Vaidyanathan A , Wilson A , Volckens J , Pfister G , Fischer EV , Pierce JR , Magzamen S . J Expo Sci Environ Epidemiol 2020 30 (4) 618-628 Wildfire smoke (WFS) increases the risk of respiratory hospitalizations. We evaluated the association between WFS and asthma healthcare utilization (AHCU) during the 2013 wildfire season in Oregon. WFS particulate matter </=2.5 microm in diameter (PM2.5) was estimated using a blended model of in situ monitoring, chemical transport models, and satellite-based data. Asthma claims and place of service were identified from Oregon All Payer All Claims data from 1 May 2013 to 30 September 2013. The association with WFS PM2.5 was evaluated using time-stratified case-crossover designs. The maximum WFS PM2.5 concentration during the study period was 172 microg/m(3). A 10 microg/m(3) increase in WFS PM2.5 increased the risk of asthma diagnosis at emergency departments (odds ratio [OR]: 1.089, 95% confidence interval [CI]: 1.043-1.136), office visits (OR: 1.050, 95% CI: 1.038-1.063), and outpatient visits (OR: 1.065, 95% CI: 1.029-1.103); an association was also observed with asthma rescue inhaler medication fills (OR: 1.077, 95% CI: 1.065-1.088). WFS increased the risk of asthma morbidity during the 2013 wildfire season in Oregon. Communities impacted by WFS could see increases in AHCU for tertiary, secondary, and primary care. |
Air Quality Index and air quality awareness among adults in the United States
Mirabelli MC , Ebelt S , Damon SA . Environ Res 2020 183 109185 BACKGROUND: Information about local air quality is reported across the United States using air quality alerts such as the Environmental Protection Agency's Air Quality Index. However, the role of such alerts in raising awareness of air quality is unknown. We conducted this study to evaluate associations between days with Air Quality Index >/=101, corresponding to a categorization of air quality as unhealthy for sensitive groups, unhealthy, very unhealthy, or hazardous, and air quality awareness among adults in the United States. METHODS: Data from 12,396 respondents to the 2016-2018 ConsumerStyles surveys were linked by geographic location and survey year to daily Air Quality Index data. We evaluated associations between the number of days in the past year with Air Quality Index >/=101 and responses to survey questions about awareness of air quality alerts, perception of air quality, and changes in behavior to reduce air pollution exposure using logistic regression. RESULTS: Awareness of air quality alerts (prevalence ratio [PR] = 1.23; 95% confidence interval [CI] = 1.15, 1.31), thinking/being informed air quality was bad (PR = 2.02; 95% CI = 1.81, 2.24), and changing behavior (PR = 2.27; 95% CI = 1.94, 2.67) were higher among respondents living in counties with >/=15 days with Air Quality Index >/=101 than those in counties with zero days in the past year with Air Quality Index >/=101. Each aspect of air quality awareness was higher among adults with than without asthma, but no differences were observed by heart disease status. Across quintiles of the number of days with Air Quality Index >/=101, air quality awareness increased among those with and without selected respiratory and cardiovascular diseases. CONCLUSIONS: Among U.S. adults, air quality awareness increases with increasing days with alerts of unhealthy air. 
These findings improve our understanding of the extent to which air quality alerts prompt people to take actions to protect their health amidst poor air quality. |
A comparison of individual-level vs. hypothetically pooled mercury biomonitoring data from the Maternal Organics Monitoring Study (MOMS), Alaska, 1999-2012
Mosites E , Rodriguez E , Caudill SP , Hennessy TW , Berner J . Int J Circumpolar Health 2020 79 (1) 1726256 Biomonitoring for heavy metals is important to assess health risks, especially in Arctic communities where rural residents rely on locally harvested foods. However, laboratory testing for blood contaminants is expensive and might not be sustainable for long-term monitoring. We assessed whether pooled specimen biomonitoring could be a part of a plan for blood contaminant surveillance among pregnant women in rural Alaska using existing blood mercury level data from three cross-sectional studies of pregnant women. We applied a hypothetical pooled specimen template stratified into 8 demographic groups based on age, coastal or inland residence, and pre-pregnancy weight. The hypothetical pooled geometric mean blood mercury levels were similar to the individual-level geometric means. However, the 95% confidence intervals were much wider for the hypothetical pooled geometric means than for the individual-level geometric means. Although the variability that resulted from pooling specimens using a small sample made it difficult to compare demographic groups to each other, pooled specimen results could be an accurate reflection of the population burden of mercury contamination in the Arctic in the context of large numbers of biomonitoring samples. |
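A minimal sketch of why pooled and individual-level geometric means must be handled with care: a physical pool measures the arithmetic mean of its member specimens, so a geometric mean taken over pool-level results is not the same estimator as the individual-level geometric mean. All values below are hypothetical simulated levels, not study data.

```python
import math
import random

def geometric_mean(values):
    """Geometric mean via the mean of the logs."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

random.seed(1)
# Hypothetical individual blood mercury levels (ug/L), roughly lognormal:
individuals = [math.exp(random.gauss(0.5, 0.6)) for _ in range(40)]

# A physical pool of specimens yields the arithmetic mean of its members,
# so pool-level results differ from individual-level values.
pools = [individuals[i:i + 8] for i in range(0, 40, 8)]
pool_results = [sum(pool) / len(pool) for pool in pools]

gm_individual = geometric_mean(individuals)
gm_from_pools = geometric_mean(pool_results)
# AM >= GM within each equal-size pool guarantees the pooled estimate
# is at least as large as the individual-level geometric mean:
print(gm_from_pools >= gm_individual)
```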
Correlates of plasma concentrations of brominated flame retardants in a cohort of U.S. Black women residing in the Detroit, Michigan metropolitan area
Orta OR , Wesselink AK , Bethea TN , Claus Henn B , McClean MD , Sjodin A , Baird DD , Wise LA . Sci Total Environ 2020 714 136777 BACKGROUND: Polybrominated diphenyl ethers (PBDEs) and polybrominated biphenyls (PBBs) are brominated flame retardant chemicals detectable in the environment and U.S. population, and are associated with adverse health outcomes over the life course. Correlates of these organic pollutants are understudied among U.S. Black women. METHODS: Using baseline data from a prospective cohort study of U.S. Black women aged 23-35 years from the Detroit area of Michigan (2010-2012), we examined correlates of PBDEs and PBB-153. Non-fasting blood samples were collected from 742 participants at enrollment, a subset of women selected for a case-cohort study of environmental chemicals. Data on socio-demographics, behaviors, diet, medical history, and early-life exposures were collected via self-administered questionnaires, telephone interviews, and in-person clinic visits. We fit linear regression models to calculate percent differences and 95% confidence intervals in lipid-adjusted plasma concentrations of 11 individual PBDE congeners and PBB-153 for each baseline predictor. RESULTS: In models adjusted for all other correlates, a 5-year increase in age was inversely associated with most PBDE congeners (percent differences ranged from 6% to 15% lower) and positively associated with PBB-153 (52% higher). A 5-kg/m(2) increase in BMI was inversely associated with PBDE-153 and PBB-153 (16% lower for both) and positively associated with PBDE-28 (6% higher). Compared with having never been breastfed in infancy, >/=3 months of breastfeeding in infancy was associated with 69% higher PBB-153 concentrations. Lower education, current smoking, and heavy alcohol use were associated with higher plasma concentrations of most flame retardants. Diet was not an important predictor.
CONCLUSION: Important correlates for elevated body burdens of PBB-153 were increasing age and a history of having been breastfed in infancy. Education, smoking, and heavy alcohol use were important predictors of elevated body burdens of most flame retardants. This study fills an important gap in the environmental health literature by focusing on an understudied population. |
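The percent differences reported in the entry above are the standard back-transformation of coefficients from a linear regression fit to natural-log-transformed, lipid-adjusted concentrations. A minimal sketch of that back-transformation follows; the coefficient and standard error are illustrative values, not estimates from the study:

```python
import math

def percent_difference(beta, se, z=1.96):
    """Back-transform a regression coefficient estimated on the
    natural-log concentration scale into a percent difference in
    geometric mean concentration, with a Wald-type 95% CI."""
    point = (math.exp(beta) - 1) * 100
    lower = (math.exp(beta - z * se) - 1) * 100
    upper = (math.exp(beta + z * se) - 1) * 100
    return point, lower, upper

# Illustrative only: beta = -0.12 per 5-year age increase, SE = 0.05
point, lower, upper = percent_difference(-0.12, 0.05)
# point is about -11.3, i.e., roughly an 11% lower concentration
```

A coefficient of zero back-transforms to a 0% difference, and negative coefficients yield the "% lower" figures quoted in the RESULTS.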
Outbreak of Listeriosis in South Africa Associated with Processed Meat.
Thomas J , Govender N , McCarthy KM , Erasmus LK , Doyle TJ , Allam M , Ismail A , Ramalwa N , Sekwadi P , Ntshoe G , Shonhiwa A , Essel V , Tau N , Smouse S , Ngomane HM , Disenyeng B , Page NA , Govender NP , Duse AG , Stewart R , Thomas T , Mahoney D , Tourdjman M , Disson O , Thouvenot P , Maury MM , Leclercq A , Lecuit M , Smith AM , Blumberg LH . N Engl J Med 2020 382 (7) 632-643 BACKGROUND: An outbreak of listeriosis was identified in South Africa in 2017. The source was unknown. METHODS: We conducted epidemiologic, trace-back, and environmental investigations and used whole-genome sequencing to type Listeria monocytogenes isolates. A case was defined as laboratory-confirmed L. monocytogenes infection during the period from June 11, 2017, to April 7, 2018. RESULTS: A total of 937 cases were identified, of which 465 (50%) were associated with pregnancy; 406 of the pregnancy-associated cases (87%) occurred in neonates. Of the 937 cases, 229 (24%) occurred in patients 15 to 49 years of age (excluding those who were pregnant). Among the patients in whom human immunodeficiency virus (HIV) status was known, 38% of those with pregnancy-associated cases (77 of 204) and 46% of the remaining patients (97 of 211) were infected with HIV. Among 728 patients with a known outcome, 193 (27%) died. Clinical isolates from 609 patients were sequenced, and 567 (93%) were identified as sequence type 6 (ST6). In a case-control analysis, patients with ST6 infections were more likely to have eaten polony (a ready-to-eat processed meat) than those with non-ST6 infections (odds ratio, 8.55; 95% confidence interval, 1.66 to 43.35). Polony and environmental samples also yielded ST6 isolates, which, together with the isolates from the patients, belonged to the same core-genome multilocus sequence typing cluster with no more than 4 allelic differences; these findings showed that polony produced at a single facility was the outbreak source. 
A recall of ready-to-eat processed meat products from this facility was associated with a rapid decline in the incidence of L. monocytogenes ST6 infections. CONCLUSIONS: This investigation showed that in a middle-income country with a high prevalence of HIV infection, L. monocytogenes caused disproportionate illness among pregnant girls and women and HIV-infected persons. Whole-genome sequencing facilitated the detection of the outbreak and guided the trace-back investigations that led to the identification of the source. |
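The case-control comparison in the entry above (odds ratio 8.55; 95% CI, 1.66 to 43.35) is a standard 2x2-table odds ratio with a Wald interval computed on the log scale. A minimal sketch of that calculation; the table counts below are invented for illustration and are not the outbreak data:

```python
import math

def odds_ratio(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 case-control table with a Wald 95% CI.
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_point = (a * d) / (b * c)
    log_se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_point) - z * log_se)
    upper = math.exp(math.log(or_point) + z * log_se)
    return or_point, lower, upper

# Illustrative counts: 30/5 exposed/unexposed cases, 10/15 controls
or_point, lower, upper = odds_ratio(30, 5, 10, 15)  # OR = 9.0
```

Small cell counts, as in the outbreak analysis, produce the characteristically wide intervals seen above.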
Deaths, hospitalizations, and emergency department visits from food-related anaphylaxis, New York City, 2000-2014: Implications for fatality prevention
Poirot E , He F , Gould LH , Hadler JL . J Public Health Manag Pract 2020 26 (6) 548-556 CONTEXT: Food-induced anaphylaxis is potentially fatal but preventable by allergen avoidance and manageable through immediate treatment. Considerable effort has been invested in preventing fatalities from nut exposure among school-aged children, but few population-based studies exist to guide additional prevention efforts. OBJECTIVES: To describe the epidemiology and trends of food-related anaphylaxis requiring emergency treatment during a 15-year span in New York City when public health initiatives to prevent deaths were implemented and to understand the situational circumstances of food-related deaths. DESIGN/SETTING/PARTICIPANTS: Retrospective death record review and analysis of inpatient hospital discharges and emergency department (ED) visits in New York City residents, 2000-2014. MAIN OUTCOME: Vital statistics data, medical examiner reports, ED, and hospital discharge data were used to examine risk for death and incidence trends in medically attended food-related anaphylaxis. Potentially preventable deaths were those among persons with a known allergy to the implicated food or occurring in public settings. RESULTS: There were 24 deaths (1.6 deaths/year; range: 0-5), 3049 hospitalizations, and 4014 ED visits, including 7 deaths from crustacean exposure, 4 from peanut, and 2 each from tree nut or seed and fish exposures. Risk for death among those hospitalized or treated in the ED was highest for persons older than 65 years and for those treated for crustacean reactions (relative risk 6.5 compared with those treated for peanut reactions, 95% confidence interval = 1.9-22.1). Eleven of 16 deaths with medical examiner data were potentially preventable. Hospitalization (2000-2014) and ED visit rates (2005-2014) were highest for children and those with peanut exposure and increased across periods. 
CONCLUSIONS: Deaths from food-related anaphylaxis were rare; however, rates of hospitalization and ED visits increased. Prevention efforts related to peanut allergies among children should continue, and additional attention is needed to prevent and treat anaphylaxis among adults, particularly those with known crustacean allergies where case fatality is highest. |
State Medicaid coverage for tobacco cessation treatments and barriers to accessing treatments - United States, 2008-2018
DiGiulio A , Jump Z , Babb S , Schecter A , Williams KS , Yembra D , Armour BS . MMWR Morb Mortal Wkly Rep 2020 69 (6) 155-160 The prevalence of current cigarette smoking is approximately twice as high among adults enrolled in Medicaid (23.9%) as among privately insured adults (10.5%), placing Medicaid enrollees at increased risk for smoking-related disease and death (1). Medicaid spends approximately $39 billion annually on treating smoking-related diseases (2). Individual, group, and telephone counseling and seven Food and Drug Administration (FDA)-approved medications* are effective in helping tobacco users quit (3). Comprehensive, barrier-free, widely promoted coverage of these treatments increases use of cessation treatments and quit rates and is cost-effective (3). To monitor changes in state Medicaid cessation coverage for traditional Medicaid enrollees† over the past decade, the American Lung Association collected data on coverage of nine cessation treatments by state Medicaid programs during December 31, 2008-December 31, 2018: individual counseling, group counseling, and the seven FDA-approved cessation medications§; states that cover all nine of these treatments are considered to have comprehensive coverage. The American Lung Association also collected data on seven barriers to accessing covered treatments.¶ As of December 31, 2018, 15 states covered all nine cessation treatments for all enrollees, up from six states as of December 31, 2008. Of these 15 states, Kentucky and Missouri were the only ones to have removed all seven barriers to accessing these cessation treatments. State Medicaid programs that cover all evidence-based cessation treatments, remove barriers to accessing these treatments, and promote covered treatments to Medicaid enrollees and health care providers could reduce smoking, smoking-related disease, and smoking-attributable federal and state health care expenditures (3-7). |
Introducing the PLOS special collection of economic cases for NCD prevention and control: A global perspective
Nugent RA , Husain MJ , Kostova D , Chaloupka F . PLoS One 2020 15 (2) e0228564 Noncommunicable diseases (NCDs), such as heart disease, cancer, diabetes, and chronic respiratory disease, are responsible for seven out of every 10 deaths worldwide. While NCDs are associated with aging in high-income countries, this representation is often misleading. Over one-third of the 41 million annual deaths from NCDs occur prematurely, defined as under 70 years of age. Most of those deaths occur in low- and middle-income countries (LMICs) where surveillance, treatment, and care of NCDs are often inadequate. In addition to high health and social costs, the economic costs imposed by such high numbers of excess early deaths impede economic development and contribute to global and national inequity. In higher-income countries, NCDs and their risks continue to push health care costs higher. The burden of NCDs is strongly intertwined with economic conditions for good and for harm. Understanding the multiple ways they are connected-through risk factor exposures, access to quality health care, and financial protection among others-will determine which countries are able to improve the healthy longevity of their populations and slow growth in health expenditure particularly in the face of aging populations. The aim of this Special Collection is to provide new evidence to spur those actions. |
Trends over time and jurisdiction variability in supplemental security income and state supplementary payment programs for children with disabilities
Robinson LR , McCord RF , Cloud LK , Kaminski JW , Cook A , Amoroso J , Watts MH , Kotzky K , Barry CM , Johnson R , Kelleher KJ . J Public Health Manag Pract 2020 26 S45-S53 CONTEXT: Nearly 1.2 million children with disabilities received federally administered Supplemental Security Income (SSI) payments in 2017. Based on a robust review of research and evaluation evidence and microsimulations, The National Academies of Sciences, Engineering, and Medicine committee identified modifications to SSI (ie, increasing the federal SSI benefit maximum by one-third or two-thirds) as 1 of 10 strategies that could reduce the US child poverty rate, improving child health and well-being on a population level. OBJECTIVE: Describing the availability and amount of SSI and State Supplementary Payment (SSP) program benefits to support families of children with disabilities may be a first step toward evaluating The National Academies of Sciences, Engineering, and Medicine-proposed modification to SSI as a potential poverty alleviation and health improvement tool for children with disabilities and their families. DESIGN: We used public health law research methods to characterize the laws (statutes and state agency regulations) governing the federal SSI program and SSP programs in the 50 states and District of Columbia from January 1, 1996, through November 1, 2018. RESULTS: The number of jurisdictions offering supplementary payments (SSP) was relatively stable between 1996 and 2018. In 2018, 23 US jurisdictions legally mandated that SSP programs were available for children. Among the states with SSP payment amounts in their codified laws, SSP monthly benefit amounts ranged from $8 to $64.35 in 1996 and $3.13 to $60.43 in 2018. 
CONCLUSION: Our initial exploration of SSI-related policies as a tool for improving the economic stability of children with disabilities and their families suggests that current SSPs, in combination with SSI, would not rise to the level of SSI increases proposed by The National Academies of Sciences, Engineering, and Medicine. Understanding more about how SSI and SSP reach children and work in combination with other federal and state income security programs may help identify policies and strategies that better support children with disabilities in low-income households. |
Notes from the Field: Carbapenem-resistant Klebsiella pneumoniae with mcr-1 Gene Identified in a Hospitalized Patient - Wyoming, January 2019.
Rhodes H , Loveland C , Van Houten C , Hull N , Harrist A . MMWR Morb Mortal Wkly Rep 2020 69 (6) 171-172 In mid-December 2018, an adult with a history of recurrent urinary tract infections was admitted to a Wyoming hospital with acute confusion. Because of a history of methicillin-resistant Staphylococcus aureus, the patient was placed on contact precautions in a private room. Admission urine culture and antimicrobial susceptibility testing identified carbapenem-resistant Klebsiella pneumoniae with extended-spectrum beta-lactamase production. Susceptibility to colistin, an antibiotic of last resort, was not tested. Carbapenem-resistant Enterobacteriaceae (CRE) infections are reportable to the Wyoming Department of Health (WDH), and the isolate was sent to the Wyoming Public Health Laboratory (WPHL), where ertapenem resistance was confirmed. Further testing identified resistance to 16 antibiotics* and susceptibility to amikacin, imipenem, meropenem, and tigecycline. Carbapenemase production was not detected with the Carba NP assay (1). WPHL sent the isolate to the Texas Antibiotic Resistance (AR) Laboratory Network regional laboratory for further characterization. Because of known sensitivity issues with the Carba NP assay (2), repeat testing used the modified carbapenem inactivation method. The Texas AR Laboratory Network confirmed the WPHL results. Colistin susceptibility testing by broth microdilution found that the minimum inhibitory concentration was >4 μg/mL, which was above the Clinical and Laboratory Standards Institute's epidemiologic cutoff value for wild-type Enterobacteriaceae (≤2 μg/mL) (3). The plasmid-mediated mcr-1 colistin resistance gene was detected using a CDC-developed multiplex real-time polymerase chain reaction assay (4). In early January 2019, the Texas AR Laboratory Network alerted WDH that the isolate carried the mcr-1 gene. |
Donor-derived transmission through lung transplantation of carbapenem-resistant Acinetobacter baumannii producing the OXA-23 carbapenemase during an ongoing healthcare facility outbreak
Bardossy AC , Snavely EA , Nazarian E , Annambhotla P , Basavaraju SV , Pepe D , Maloney M , Musser KA , Haas W , Barros N , Pierce VM , Walters M , Epstein L . Transpl Infect Dis 2020 22 (2) e13256 We describe a rare instance of donor-derived OXA-23-producing carbapenem-resistant Acinetobacter baumannii transmission during lung transplantation and the subsequent public health response. This investigation highlights how transplantation can introduce rare multidrug-resistant organisms into different healthcare facilities and regions. |
Unique clindamycin-resistant Clostridioides difficile strain related to fluoroquinolone-resistant epidemic BI/RT027 strain
Skinner AM , Petrella L , Siddiqui F , Sambol SP , Gulvik CA , Gerding DN , Donskey CJ , Johnson SC . Emerg Infect Dis 2020 26 (2) 247-254 During a surveillance study of patients in a long-term care facility and the affiliated acute care hospital in the United States, we identified a Clostridioides difficile strain related to the epidemic PCR ribotype (RT) 027 strain associated with hospital outbreaks of severe disease. Fifteen patients were infected with this strain, characterized as restriction endonuclease analysis group DQ and RT591. Like RT027, DQ/RT591 contained genes for toxin B and binary toxin CDT and a tcdC gene of identical sequence. Whole-genome sequencing and multilocus sequence typing showed that DQ/RT591 is a member of the same multilocus sequence typing clade 2 as RT027 but in a separate cluster. DQ/RT591 produced a similar cytopathic effect as RT027 but showed delayed toxin production in vitro. DQ/RT591 was susceptible to moxifloxacin but highly resistant to clindamycin. Continued surveillance is warranted for this clindamycin-resistant strain that is related to the fluoroquinolone-resistant epidemic RT027 strain. |
Trivalent inactivated influenza vaccine (IIV3) during pregnancy and six-month infant development
Avalos LA , Ferber J , Zerbo O , Naleway AL , Bulkley J , Thompson M , Cragan J , Williams J , Odouli R , Kauffman TL , Ball S , Shifflett P , Li DK . Vaccine 2020 38 (10) 2326-2332 OBJECTIVE: Despite recommendations by professional organizations that all pregnant women receive inactivated influenza vaccine, safety concerns remain a barrier. Our objective was to assess the effect of trivalent influenza vaccines (IIV3) during pregnancy on parent-reported 6-month infant development. METHODS: We conducted a multi-site prospective birth cohort study during the 2010-2011 influenza season and followed pregnant women and their newborns through 6 months of age. Information on IIV3 during pregnancy was ascertained from the electronic health record (EHR) and self-report. The Ages and Stages Questionnaire-3 (ASQ-3) was completed by the mother to assess 6-month infant neurodevelopment in five domains (communication, gross motor, fine motor, problem-solving, and personal-social skills). Scores for each domain above the cut-off point indicating typical development were categorized as "on schedule" while scores in the zones indicating the need for either monitoring or further assessment were categorized as "not on schedule". Multivariable logistic regression was conducted. RESULTS: Of the 1225 infant-mother pairs, 65% received IIV3 during pregnancy. In bivariate analysis, infants of women who received IIV3 during pregnancy were moderately less likely to need monitoring or further assessment in the personal-social domain compared with infants of unvaccinated women (10.0% vs. 14.1%, p = 0.033; crude OR = 0.68; 95% CI: 0.48, 0.97). However, after controlling for potential confounders, the findings were no longer statistically significant (aOR = 0.72; 95% CI: 0.49, 1.06; p = 0.46). No significant unadjusted or adjusted associations emerged in any other ASQ-3 domain. CONCLUSION: There was no significant association between IIV3 exposure during pregnancy and 6-month infant development. 
Studies of IIV3 during pregnancy to assess longer-term developmental outcomes are indicated. |
Influence of demographically-realistic mortality schedules on vaccination strategies in age-structured models
Feng Z , Feng Y , Glasser JW . Theor Popul Biol 2020 132 24-32 Because demographic realism complicates analysis, mathematical modelers either ignore demography or make simplifying assumptions (e.g., births and deaths equal). But human populations differ demographically, perhaps most notably in their mortality schedules. We developed an age-stratified population model with births, deaths, aging, and mixing between age groups. The model includes types I and II mortality as special cases. We used the gradient approach of Feng et al. (2015, 2017) to explore the impact of mortality patterns on optimal strategies for mitigating vaccine-preventable diseases such as measles and rubella, which the international community has targeted for eradication. Optimal vaccine allocations that reduce the effective reproduction number Rv are identified under various scenarios. Numerical simulations of the model with various types of mortality are carried out to ascertain the long-term effects of vaccination on disease incidence. We conclude that optimal vaccination strategies and long-term effects of vaccination may depend on demographic assumptions. |
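For age-structured models of the kind described above, the effective reproduction number Rv is typically the dominant eigenvalue of the next-generation matrix after vaccination scales each group's susceptible fraction. A minimal two-age-group sketch of that idea; the matrix entries and coverage fractions are illustrative and are not the paper's parameterization:

```python
def dominant_eigenvalue_2x2(m):
    """Largest eigenvalue of a 2x2 matrix, in closed form."""
    trace = m[0][0] + m[1][1]
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    return (trace + (trace * trace - 4 * det) ** 0.5) / 2

def effective_R(k, coverage):
    """Scale row i of next-generation matrix k by the unvaccinated
    fraction of group i, then return the dominant eigenvalue."""
    scaled = [[(1 - coverage[i]) * k[i][j] for j in range(2)]
              for i in range(2)]
    return dominant_eigenvalue_2x2(scaled)

K = [[3.0, 1.0],
     [1.0, 2.0]]                 # illustrative next-generation matrix
R0 = effective_R(K, [0.0, 0.0])  # no vaccination: the basic R0
Rv = effective_R(K, [0.6, 0.2])  # 60% / 20% coverage by age group
```

Comparing Rv across different allocations of a fixed vaccine supply is the kind of optimization the gradient approach formalizes; elimination requires driving Rv below 1.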
Advisory Committee on Immunization Practices recommended immunization schedule for adults aged 19 years or older - United States, 2020
Freedman MS , Hunter P , Ault K , Kroger A . MMWR Morb Mortal Wkly Rep 2020 69 (5) 133-135 At its October 2019 meeting, the Advisory Committee on Immunization Practices (ACIP) voted to recommend approval of the 2020 Recommended U.S. Adult Immunization Schedule for Persons Aged 19 Years and Older. The 2020 adult immunization schedule, available at https://www.cdc.gov/vaccines/schedules/index.html, summarizes ACIP recommendations in two tables and accompanying notes. This 2020 adult immunization schedule has been approved by the CDC Director, the American College of Physicians, the American Academy of Family Physicians, the American College of Obstetricians and Gynecologists, and the American College of Nurse-Midwives. Health care providers are advised to use the tables and the notes together. |
Effectiveness of trivalent and quadrivalent inactivated vaccines against influenza B in the United States, 2011-2012 to 2016-2017
Gaglani M , Vasudevan A , Raiyani C , Murthy K , Chen W , Reis M , Belongia EA , McLean HQ , Jackson ML , Jackson LA , Zimmerman RK , Nowalk MP , Monto AS , Martin ET , Chung JR , Spencer S , Fry AM , Flannery B . Clin Infect Dis 2020 72 (7) 1147-1157 BACKGROUND: Since 2013, quadrivalent influenza vaccines containing two B viruses gradually replaced trivalent vaccines in the United States. We compared the effectiveness of quadrivalent and trivalent inactivated vaccines (IIV4 and IIV3) against illness due to influenza B during the transition, when IIV4 use increased rapidly. METHODS: The US Influenza Vaccine Effectiveness (Flu VE) Network analyzed 25,019 of 42,600 outpatients aged ≥6 months enrolled within 7 days of illness onset during six seasons beginning in 2011-2012. Upper respiratory specimens were tested for influenza virus type and B lineage. Using logistic regression, we estimated IIV4 or IIV3 effectiveness by comparing the odds of influenza B infection overall, and by B lineage, among vaccinated versus unvaccinated participants. Over four seasons beginning in 2013-2014, we compared relative odds of influenza B infection among IIV4 versus IIV3 recipients. RESULTS: Trivalent vaccines included the predominantly circulating B lineage in four of six seasons. During the four influenza seasons when both IIV4 and IIV3 were widely used, overall effectiveness against any influenza B was 53% (95% confidence interval [CI], 45 to 59) for IIV4 versus 45% (95% CI, 34 to 54) for IIV3. IIV4 was more effective than IIV3 against the B lineage not included in IIV3, but comparative effectiveness against illness related to any influenza B favored neither vaccine valency. CONCLUSIONS: Uptake of quadrivalent inactivated influenza vaccines was not associated with increased protection against any influenza B illness, despite higher effectiveness of quadrivalent vaccines against the added B virus lineage. Public health impact and cost-benefit analyses are needed globally. |
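In test-negative designs like the one above, vaccine effectiveness follows from the fitted odds ratio as VE = (1 − OR) × 100, with the CI bounds swapping because VE falls as OR rises. A small sketch; the odds ratios below are simply back-calculated from the VE figures quoted in the abstract (53%; 45 to 59) for illustration:

```python
def ve_from_or(or_point, or_lower, or_upper):
    """Convert an odds ratio and its 95% CI into vaccine
    effectiveness (%). The bounds swap: the upper OR bound gives
    the lower VE bound, and vice versa."""
    return ((1 - or_point) * 100,
            (1 - or_upper) * 100,
            (1 - or_lower) * 100)

# OR = 0.47 (0.41, 0.55) corresponds to VE = 53% (45, 59)
ve, ve_lower, ve_upper = ve_from_or(0.47, 0.41, 0.55)
```

An OR of 1 maps to 0% effectiveness, and ORs above 1 map to negative VE, which is why overlapping OR intervals translate directly into overlapping VE intervals.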
Order of live and inactivated vaccines and risk of non-vaccine-targeted infections in US children 11-23 months of age
Newcomer SR , Daley MF , Narwaney KJ , Xu S , DeStefano F , Groom HC , Jackson ML , Lewin BJ , McLean HQ , Nordin JD , Zerbo O , Glanz JM . Pediatr Infect Dis J 2020 39 (3) 247-253 BACKGROUND: Some findings from observational studies have suggested that recent receipt of live vaccines may be associated with decreased non-vaccine-targeted infection risk and mortality. Our objective was to estimate risk of non-vaccine-targeted infections based on most recent vaccine type (live vaccines only, inactivated vaccines only or both concurrently) received in US children 11-23 months of age. METHODS: We conducted a retrospective cohort study within the Vaccine Safety Datalink. We examined electronic health record and immunization data from children born in 2003-2013 who received 3 doses of diphtheria-tetanus-acellular pertussis vaccine before their first birthday. We modeled vaccine type as a time-varying exposure and estimated risk of non-vaccine-targeted infections identified in emergency department and inpatient settings, adjusting for multiple confounders. RESULTS: Among 428,608 children, 48.9% were female, 4.9% had ≥1 immunization visit with live vaccines only and 10.3% had a non-vaccine-targeted infection. In males, lower risk of non-vaccine-targeted infections was observed following last receipt of live vaccines only or live and inactivated vaccines concurrently as compared with last receipt of inactivated vaccines only [live vaccines-only adjusted hazard ratio (aHR) = 0.83, 95% confidence interval (CI): 0.72-0.94; live and inactivated vaccines concurrently aHR: 0.91, 95% CI: 0.88-0.94]. Among females, last receipt of live and inactivated vaccines concurrently was significantly associated with non-vaccine-targeted infection risk (aHR = 0.94, 95% CI: 0.91-0.97 vs. last receipt of inactivated vaccines only). CONCLUSIONS: We observed modest associations between live vaccine receipt and non-vaccine-targeted infections. 
In this observational study, multiple factors, including healthcare-seeking behavior, may have influenced results. |
Acceptability of seasonal influenza vaccines among health care workers in Vietnam in 2017
Nguyen TTM , Lafond KE , Nguyen TX , Tran PD , Nguyen HM , Ha VTC , Do TT , Ha NT , Seward JF , McFarland JW . Vaccine 2020 38 (8) 2045-2050 INTRODUCTION: A demonstration project in Vietnam provided 11,000 doses of human seasonal influenza vaccine free of charge to healthcare workers (HCWs) in 4 provinces of Vietnam. Through this project, we conducted an acceptability survey to identify the main reasons that individuals chose to be vaccinated or not, to inform and improve future immunization activities. METHODS: We conducted a descriptive cross-sectional survey from May to August 2017 among HCWs at 13 selected health facilities. We employed logistic regression to determine the association between demographic and professional factors and the decision to receive seasonal influenza vaccine. We performed post-hoc pairwise comparisons among reasons for and against vaccination using chi-square and Fisher's exact tests (for cell sizes <5). RESULTS: A total of 1,450 HCWs participated in the survey, with a higher proportion of females than males (74% versus 26%). The median age of the participating HCWs was 35 years (interquartile range: 25.8-44.2). Among those surveyed, 700 (48%) HCWs were vaccinated against seasonal influenza during the first half of 2017. HCWs aged <30 years and 30-39 years were less likely to get vaccinated against seasonal influenza than HCWs aged ≥50 years (OR = 0.5; 95% CI: 0.4-0.8 and OR = 0.6; 95% CI: 0.4-0.8, respectively). Nurses and other employees were more likely to get seasonal influenza vaccination than physicians (OR = 1.5; 95% CI: 1.0-2.4 and OR = 2.0; 95% CI: 1.2-3.2, respectively). The most common reason for accepting vaccination was fear of getting influenza (66%), and the most common reason for not getting vaccinated was concern about vaccine side effects (23%). CONCLUSION: Acceptability of seasonal influenza vaccines in this setting varied among HCWs by age group and job category. 
Interventions to increase acceptance of influenza vaccine among HCWs in this setting, where influenza vaccine is being introduced free of charge for the first time, should include targeted risk communication on vaccine safety and efficacy. |
Licensure of a diphtheria and tetanus toxoids and acellular pertussis, inactivated poliovirus, Haemophilus influenzae type b conjugate, and hepatitis B vaccine, and guidance for use in infants
Oliver SE , Moore KL . MMWR Morb Mortal Wkly Rep 2020 69 (5) 136-139 On December 21, 2018, the Food and Drug Administration (FDA) licensed a hexavalent combined diphtheria and tetanus toxoids and acellular pertussis (DTaP) adsorbed, inactivated poliovirus (IPV), Haemophilus influenzae type b (Hib) conjugate (meningococcal protein conjugate), and hepatitis B (HepB) (recombinant) vaccine, DTaP-IPV-Hib-HepB (Vaxelis; MCM Vaccine Company),* for use as a 3-dose series in infants at ages 2, 4, and 6 months (1). On June 26, 2019, after reviewing data on safety and immunogenicity, the Advisory Committee on Immunization Practices (ACIP)† voted to include DTaP-IPV-Hib-HepB in the federal Vaccines for Children (VFC) program.§ This report summarizes the indications for DTaP-IPV-Hib-HepB and provides guidance for its use. |
Advisory Committee on Immunization Practices recommended immunization schedule for children and adolescents aged 18 years or younger - United States, 2020
Robinson CL , Bernstein H , Poehling K , Romero JR , Szilagyi P . MMWR Morb Mortal Wkly Rep 2020 69 (5) 130-132 At its October 2019 meeting, the Advisory Committee on Immunization Practices (ACIP)* approved the 2020 Recommended Child and Adolescent Immunization Schedule for Ages 18 Years or Younger. The 2020 child and adolescent immunization schedule summarizes ACIP recommendations, including several changes from the 2019 immunization schedule† on the cover page, three tables, and notes found on the CDC immunization schedule website (https://www.cdc.gov/vaccines/schedules/index.html). Health care providers are advised to use the tables and the notes together. This immunization schedule is recommended by ACIP (https://www.cdc.gov/vaccines/acip/index.html) and approved by the CDC Director, the American Academy of Pediatrics, the American Academy of Family Physicians, the American College of Obstetricians and Gynecologists, and, for the first time, the American College of Nurse-Midwives. |
Fever after influenza, diphtheria-tetanus-acellular pertussis, and pneumococcal vaccinations
Walter EB , Klein NP , Wodi AP , Rountree W , Todd CA , Wiesner A , Duffy J , Marquez PL , Broder KR . Pediatrics 2020 145 (3) BACKGROUND: Administering inactivated influenza vaccine (IIV), 13-valent pneumococcal conjugate vaccine (PCV13), and diphtheria-tetanus-acellular pertussis (DTaP) vaccine together has been associated with increased risk for febrile seizure after vaccination. We assessed the effect of administering IIV at a separate visit from PCV13 and DTaP on postvaccination fever. METHODS: In 2017-2018, children aged 12 to 16 months were randomly assigned to receive study vaccines simultaneously or sequentially. They had 2 study visits 2 weeks apart; nonstudy vaccines were permitted at visit 1. The simultaneous group received PCV13, DTaP, and quadrivalent IIV (IIV4) at visit 1 and no vaccines at visit 2. The sequential group received PCV13 and DTaP at visit 1 and IIV4 at visit 2. Participants were monitored for fever (≥38°C) and antipyretic use during the 8 days after visits. RESULTS: There were 110 children randomly assigned to the simultaneous group and 111 children to the sequential group; 90% received ≥1 nonstudy vaccine at visit 1. Similar proportions of children experienced fever on days 1 to 2 after visits 1 and 2 combined (simultaneous [8.1%] versus sequential [9.3%]; adjusted relative risk = 0.87 [95% confidence interval 0.36-2.10]). During days 1 to 2 after visit 1, more children in the simultaneous group received antipyretics (37.4% vs 22.4%; P = .020). CONCLUSIONS: In our study, delaying IIV4 administration by 2 weeks in children receiving DTaP and PCV13 did not reduce fever occurrence after vaccination. Reevaluating this strategy to prevent fever using an IIV4 with a different composition in a future influenza season may be considered. |
To share is human! Advancing evidence into practice through a national repository of interoperable clinical decision support
Lomotan EA , Meadows G , Michaels M , Michel JJ , Miller K . Appl Clin Inform 2020 11 (1) 112-121 BACKGROUND: Healthcare systems devote substantial resources to the development of clinical decision support (CDS) largely independently. The process of translating evidence-based practice into useful and effective CDS may be more efficient and less duplicative if healthcare systems shared knowledge about the translation, including workflow considerations, key assumptions made during the translation process, and technical details. OBJECTIVE: Describe how a national repository of CDS can serve as a public resource for healthcare systems, academic researchers, and informaticists seeking to share and reuse CDS knowledge resources or "artifacts." METHODS: In 2016, the Agency for Healthcare Research and Quality (AHRQ) launched CDS Connect as a public, web-based platform for authoring and sharing CDS knowledge artifacts. Researchers evaluated early use and impact of the platform by collecting user experiences of AHRQ-sponsored and community-led dissemination efforts and through quantitative/qualitative analysis of site metrics. Efforts are ongoing to quantify efficiencies gained by healthcare systems that leverage shared, interoperable CDS artifacts rather than developing similar CDS de novo and in isolation. RESULTS: Federal agencies, academic institutions, and others have contributed over 50 entries to CDS Connect for sharing and dissemination. Analysis indicates shareable CDS resources reduce team sizes and the number of tasks and time required to design, develop, and deploy CDS. However, the platform needs further optimization to address sociotechnical challenges. Benefits of sharing include inspiring others to undertake similar CDS projects, identifying external collaborators, and improving CDS artifacts as a result of feedback. Organizations are adapting content available through the platform for continued research, innovation, and local implementations. 
CONCLUSION: CDS Connect has provided a functional platform where CDS developers are actively sharing their work. CDS sharing may lead to improved implementation efficiency through numerous pathways, and further research is ongoing to quantify efficiencies gained. |
Sexual violence victimization of youth and health risk behaviors
Basile KC , Clayton HB , Rostad WL , Leemis RW . Am J Prev Med 2020 58 (4) 570-579 INTRODUCTION: This study assesses associations between past-12-month sexual violence victimization and recent health risk behaviors using a nationally representative sample of male and female high school students. It is hypothesized that sexual violence victimization will be associated with most of the negative health behaviors for both sexes. METHODS: Data from the 2017 National Youth Risk Behavior Survey, a school-based cross-sectional survey of students in Grades 9-12, were used to assess associations between sexual violence victimization and 29 health risk behaviors in sex-stratified logistic regression models. Effect modification was also examined through sex × sexual violence victimization interactions within unstratified models. All models controlled for race/ethnicity, grade, and sexual identity. Data were analyzed in 2018. RESULTS: Students who experienced sexual violence victimization were significantly more likely to report many health risk behaviors and experiences, such as substance use, injury, negative sexual health behaviors, feelings of sadness or hopelessness, suicidality, poor academic performance, and cognitive difficulties, and these associations were often stronger among male students (significant adjusted prevalence ratios ranged from 1.63 to 14.40 for male and 1.24 to 6.67 for female students). CONCLUSIONS: Past-year sexual violence victimization was significantly related to various health risk behaviors, suggesting that efforts to prevent sexual violence may also be associated with decreases in poor health. Integrating violence, substance use, sexual, and other health risk prevention efforts is warranted. |
Examination of sports and recreation-related concussion among youth ages 12-17: results from the 2018 YouthStyles survey
Sarmiento K , Daugherty J , DePadilla L , Breiding MJ . Brain Inj 2020 34 (3) 1-6 Background: This paper sought to examine the frequency of self-reported sports- and recreation-related (SRR) concussion, as well as care-seeking behaviors and potential activity restrictions after concussions, in a sample of youth. Methods: A sample of 845 youth ages 12-17 years responded to the web-based YouthStyles survey in 2018. The survey measured the frequency of self-reported lifetime SRR concussion, the setting of their most recent SRR concussion, whether a doctor or nurse evaluated them, and the types of activity restrictions they experienced. Results: Forty-three percent of youth surveyed sustained their most recent concussion while playing on a sports team, 21.1% while playing on a community-based team, and 36.0% while engaged in a sport or recreational activity. Nearly half (45.3%) reported having to miss playing sports or participating in physical activity for at least one day; about two in ten (19.7%) reported having to miss time on their phone or computer for at least one day. Conclusion: Despite widespread efforts to promote protocols for SRR concussion among youth, a third of participants in this study did not seek medical care and more than half did not miss at least one day of sports or physical activity participation following a concussion. |
Global update on the susceptibilities of human influenza viruses to neuraminidase inhibitors and the cap-dependent endonuclease inhibitor baloxavir, 2017-2018.
Takashita E , Daniels RS , Fujisaki S , Gregory , Gubareva LV , Huang W , Hurt AC , Lackenby A , Nguyen HT , Pereyaslov D , Roe M , Samaan M , Subbarao K , Tse H , Wang D , Yen H-L , Zhang W , Meijer A . Antiviral Res 2020 175 104718-104718 The global analysis of neuraminidase inhibitor (NAI) susceptibility of influenza viruses has been conducted since the 2012-13 period. In 2018 a novel cap-dependent endonuclease inhibitor, baloxavir, that targets polymerase acidic subunit (PA) was approved for the treatment of influenza virus infection in Japan and the United States. For this annual report, the susceptibilities of influenza viruses to NAIs and baloxavir were analyzed. A total of 15409 viruses, collected by World Health Organization (WHO) recognized National Influenza Centers and other laboratories between May 2017 and May 2018, were assessed for phenotypic NAI susceptibility by five WHO Collaborating Centers (CCs). The 50% inhibitory concentration (IC50) was determined for oseltamivir, zanamivir, peramivir and laninamivir. Reduced inhibition (RI) or highly reduced inhibition (HRI) by one or more NAIs was exhibited by 0.8% of viruses tested (n = 122). The frequency of viruses with RI or HRI has remained low since this global analysis began (2012-13: 0.6%; 2013-14: 1.9%; 2014-15: 0.5%; 2015-16: 0.8%; 2016-17: 0.2%). PA gene sequence data, available from public databases (n = 13523), were screened for amino acid substitutions associated with reduced susceptibility to baloxavir (PA E23G/K/R, PA A36V, PA A37T, PA I38F/M/T/L, PA E119D, PA E199G): 11 (0.08%) viruses possessed such substitutions. Five of them were included in phenotypic baloxavir susceptibility analysis by two WHO CCs and IC50 values were determined. The PA variant viruses showed 6-17-fold reduced susceptibility to baloxavir. Overall, in the 2017-18 period the frequency of circulating influenza viruses with reduced susceptibility to NAIs or baloxavir was low, but continued monitoring is important. |
Isoniazid- and Rifampin-Resistance Mutations Associated with Resistance to Second-line Drugs and with Sputum Culture Conversion.
Click ES , Kurbatova E , Alexander H , Dalton TL , Chen MP , Posey JE , Ershova JJ , Cegielski P . J Infect Dis 2020 221 (12) 2072-2082 BACKGROUND: Mutations in the genes inhA, katG and rpoB confer resistance to anti-tuberculosis (TB) drugs isoniazid and rifampin. We questioned whether specific mutations in these genes were associated with different clinical and microbiological characteristics. METHODS: In a multi-country prospective cohort study of MDR-TB, we identified inhA, katG and rpoB mutations in sputum isolates using the Hain MTBDRplus line probe assay. For specific mutations, we performed bivariate analysis to determine relative risk of baseline or acquired resistance to other TB drugs. We compared time-to-sputum-culture-conversion (TSCC) using Kaplan-Meier curves and stratified Cox regression. RESULTS: In total, 447 participants enrolled January 2005-December 2008 from seven countries were included. Relative to rpoB S531L, isolates with rpoB D516V had less cross-resistance to rifabutin, increased baseline resistance to other drugs, and increased acquired fluoroquinolone resistance. Relative to mutation of katG only, mutation of inhA promoter and katG was associated with increased acquired fluoroquinolone resistance and slower TSCC (125.5 vs. 89.0 days). CONCLUSIONS: Specific mutations in inhA and katG are associated with differences in resistance to other drugs and TSCC. Molecular testing may make it possible to tailor treatment and assess additional drug resistance risk according to specific mutation profile. |
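As an illustrative aside (not the authors' code): the time-to-sputum-culture-conversion comparison above rests on Kaplan-Meier estimation, in which participants who never convert are treated as censored. A minimal sketch over hypothetical TSCC data, assuming an event indicator of 1 for conversion and 0 for censoring:

```python
# Minimal Kaplan-Meier estimator (illustrative sketch; example data
# below are hypothetical, not taken from the study).
def kaplan_meier(times, events):
    """times: follow-up days; events: 1 = culture converted, 0 = censored.
    Returns [(time, survival_probability)] at each event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)   # conversions at time t
        c = sum(1 for tt, _ in data if tt == t)   # all leaving the risk set at t
        if d > 0:
            surv *= 1 - d / n_at_risk             # product-limit update
            curve.append((t, surv))
        n_at_risk -= c
        i += c
    return curve

# Hypothetical group of five participants, one censored at day 120
katg_only = kaplan_meier([30, 60, 89, 89, 120], [1, 1, 1, 1, 0])
```

A formal between-group comparison would then use a log-rank test or stratified Cox regression, as the study did.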
Expanding US Laboratory Capacity for Neisseria gonorrhoeae Antimicrobial Susceptibility Testing and Whole Genome Sequencing through CDC's Antibiotic Resistance Laboratory Network.
Kersh EN , Pham CD , Papp JR , Myers R , Steece R , Kubin G , Gautom R , Nash EE , Sharpe S , Gernert KM , Schmerer M , Raphael BH , Henning T , Gaynor AM , Soge O , Schlanger K , Kirkcaldy RD , St Cyr SB , Torrone EA , Bernstein K , Weinstock H . J Clin Microbiol 2020 58 (4) US gonorrhea rates are rising, and antibiotic-resistant Neisseria gonorrhoeae (AR-Ng) is an urgent public health threat. Since implementation of nucleic acid amplification tests for Ng identification, capacity for culturing Ng in the US has declined, along with the ability to perform culture-based antimicrobial susceptibility testing (AST). Yet, AST is critical for detecting and monitoring AR-Ng. In 2016, CDC established the Antibiotic Resistance Laboratory Network (AR Lab Network) to shore up national capacity for detecting several resistance threats including Ng. AR-Ng testing, a sub-activity of CDC's AR Lab Network, is performed in a tiered network of approximately 35 local laboratories, four regional laboratories (state public health laboratories in MD, TN, TX, WA), and CDC's national reference laboratory. Local laboratories receive specimens from approximately 60 clinics associated with the Gonococcal Isolate Surveillance Project (GISP), enhanced GISP (eGISP), and Strengthening the U.S. Response to Resistant Gonorrhea (SURRG). They isolate and ship up to 20,000 isolates to regional laboratories for culture-based agar dilution AST with seven antibiotics and for whole genome sequencing of up to 5,000 isolates. The CDC further examines concerning isolates and monitors genetic AR markers. During 2017 and 2018, the network tested 8,214 and 8,628 Ng isolates, and CDC received 531 and 646 concerning isolates, and 605 and 3,159 sequences, respectively. In summary, the AR Lab Network supported laboratory capacity for Ng-AST and associated genetic marker detection, expanding pre-existing notification and analysis systems for resistance detection. 
Continued, robust AST and genomic capacity can help inform national public health monitoring and intervention. |
Presence of cagPAI genes and characterization of vacA s, i and m regions in Helicobacter pylori isolated from Alaskans and their association with clinical pathologies.
Miernyk KM , Bruden D , Rudolph KM , Hurlburt DA , Sacco F , McMahon BJ , Bruce MG . J Med Microbiol 2020 69 (2) 218-227 Introduction. Gastric cancer is a health disparity among Alaska Native people. The incidence of Helicobacter pylori infection, a risk factor for non-cardia gastric adenocarcinoma, is also high. Gastric cancer is partially associated with the virulence of the infecting strain. Aim. To genotype the vacA s, m and i and cag pathogenicity island (cagPAI) genes in H. pylori from Alaskans and investigate associations with gastropathy. Methodology. We enrolled patients with gastritis, peptic ulcer disease (PUD) and intestinal metaplasia (IM) in 1998-2005 and patients with gastric cancer in 2011-2013. Gastric biopsies were collected and cultured and PCR was performed to detect the presence of the right and left ends of the cagPAI, the cagA, cagE, cagT and virD4 genes and to genotype the vacA s, m and i regions. Results. We recruited 263 people; 22 (8%) had no/mild gastritis, 121 (46%) had moderate gastritis, 40 (15%) had severe gastritis, 38 (14%) had PUD, 30 (11%) had IM and 12 (5%) had gastric cancer. H. pylori isolates from 150 (57%) people had an intact cagPAI; these were associated with a more severe gastropathy (P ≤ 0.02 for all comparisons). H. pylori isolates from 77% of people had either the vacA s1/i1/m1 (40%; 94/234) or s2/i2/m2 (37%; 86/234) genotype. vacA s1/i1/m1 was associated with a more severe gastropathy (P ≤ 0.03 for all comparisons). Conclusions. In this population with high rates of gastric cancer, we found that just over half of the H. pylori isolates contained an intact cagPAI and 40% had the vacA s1/i1/m1 genotype. Infection with these strains was associated with a more severe gastropathy. |
Microglial activation and responses to vasculature that result from an acute LPS exposure
Bowyer JF , Sarkar S , Burks SM , Hess JN , Tolani S , O'Callaghan JP , Hanig JP . Neurotoxicology 2020 77 181-192 Bacterial cell wall endotoxins, i.e. lipopolysaccharides (LPS), are some of the original compounds shown to evoke the classic signs of systemic inflammation/innate immune response and neuroinflammation. The term neuroinflammation often is used to infer the elaboration of proinflammatory mediators by microglia elicited by neuronal targeted activity. However, it also is possible that the microglia are responding to vasculature through several signaling mechanisms. Microglial activation relative to the vasculature in the hippocampus and parietal cortex was determined after an acute exposure of a single subcutaneous injection of 2 mg/kg LPS. Antibodies to allograft inflammatory factor 1 (Aif1, a.k.a. Iba1) were used to track and quantify morphological changes in microglia. Immunostaining of platelet/endothelial cell adhesion molecule 1 (Pecam1, a.k.a. Cd31) was used to visualize vasculature in the forebrain and glial fibrillary acidic protein (GFAP) to visualize astrocytes. Neuroinflammation and other aspects of neurotoxicity were evaluated histologically at 3 h, 6 h, 12 h, 24 h, 3 d and 14 d following LPS exposure. LPS did not cause neurodegeneration as determined by Fluoro-Jade C labeling. Also, there were no signs of mouse IgG leakage from brain vasculature due to LPS. Some changes in microglia size occurred at 6 h, but by 12 h microglial activation had begun with the combined soma and proximal processes size increasing significantly (1.5-fold). At 24 h, almost all the microglia soma and proximal processes in the hippocampus, parietal cortex, and thalamus were closely associated with the vasculature and had increased almost 2.0-fold in size. In many areas where microglia were juxtaposed to vasculature, astrocytic endfeet appeared to be displaced. The microglial activation had subsided slightly by 3 d with microglial size 1.6-fold that of control. 
We hypothesize that acute LPS activation can result in vascular mediated microglial responses through several mechanisms: 1) binding to Cd14 and Tlr4 receptors on microglia processes residing on vasculature; 2) damaging vasculature and causing the release of cytokines; and 3) possibly astrocytic endfeet damage resulting in cytokine release. These acute responses may serve as an adaptive mechanism to exposure to circulating LPS where the microglia surround the vasculature. This could further prevent the pathogen(s) circulating in blood from entering the brain. However, diverting microglial interactions away from synaptic remodeling and other types of microglial interactions with neurons may have adverse effects on neuronal function. |
Actigraphy-based assessment of sleep parameters
Fekedulegn D , Andrew ME , Shi M , Violanti JM , Knox S , Innes KE . Ann Work Expo Health 2020 64 (4) 350-367 Actigraphy, a method for inferring sleep/wake patterns based on movement data gathered using actigraphs, is increasingly used in population-based epidemiologic studies because of its ability to monitor activity in natural settings. Using special software, actigraphic data are analyzed to estimate a range of sleep parameters. To date, despite extensive application of actigraphs in sleep research, published literature specifically detailing the methodology for derivation of sleep parameters is lacking; such information is critical for the appropriate analysis and interpretation of actigraphy data. Reporting of sleep parameters has also been inconsistent across studies, likely reflecting the lack of consensus regarding the definition of sleep onset and offset. In addition, actigraphy data are generally underutilized, with only a fraction of the sleep parameters generated through actigraphy routinely used in current sleep research. The objectives of this paper are to review existing algorithms used to estimate sleep/wake cycles from movement data, demonstrate the rules/methods used for estimating sleep parameters, provide clear technical definitions of the parameters, and suggest potential new measures that reflect intraindividual variability. Utilizing original data collected using Motionlogger Sleep Watch (Ambulatory Monitoring Inc., Ardsley, NY), we detail the methodology and derivation of 29 nocturnal sleep parameters, including those both widely and rarely utilized in research. By improving understanding of the actigraphy process, the information provided in this paper may help: ensure appropriate use and interpretation of sleep parameters in future studies; enable the recalibration of sleep parameters to address specific goals; inform the development of new measures; and increase the breadth of sleep parameters used. |
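To make the kind of epoch-scoring algorithm reviewed above concrete, the sketch below scores each one-minute epoch as sleep or wake from a weighted sum of activity counts in a centered window, in the general style of Cole-Kripke-type equations. The weights and the threshold of 1.0 are hypothetical, invented for illustration only; they are not taken from the paper or from any validated scoring equation.

```python
# Hypothetical epoch-scoring sketch: weighted activity over a 7-epoch
# window centered on the scored epoch; below-threshold => sleep ('S').
WEIGHTS = [0.04, 0.04, 0.20, 0.40, 0.20, 0.04, 0.04]  # center weight on epoch i

def score_epochs(activity):
    """activity: per-epoch movement counts. Returns 'S'/'W' per epoch."""
    n = len(activity)
    scores = []
    for i in range(n):
        total = 0.0
        for k, w in enumerate(WEIGHTS):
            j = i + k - 3                 # offset within the centered window
            if 0 <= j < n:                # edges use a truncated window
                total += w * activity[j]
        scores.append('S' if total < 1.0 else 'W')
    return scores
```

Sleep parameters are then derived from the scored series; for example, sleep onset can be defined as the start of the first run of N consecutive 'S' epochs, which is one of the definitional choices the paper notes varies across studies.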
Griffithsin inhibits Nipah virus entry and fusion and can protect Syrian golden hamsters from lethal Nipah virus challenge
Lo MK , Spengler JR , Krumpe LRH , Welch SR , Chattopadhyay A , Harmon JR , Coleman-McCray JD , Scholte FEM , Hotard AL , Fuqua JL , Rose JK , Nichol ST , Palmer KE , O'Keefe BR , Spiropoulou CF . J Infect Dis 2020 221 S480-S492 Nipah virus (NiV) is a highly pathogenic zoonotic paramyxovirus that causes fatal encephalitis and respiratory disease in humans. There is currently no approved therapeutic for human use against NiV infection. Griffithsin (GRFT) is a high-mannose oligosaccharide-binding lectin that has shown in vivo broad-spectrum activity against viruses including severe acute respiratory syndrome coronavirus, human immunodeficiency virus 1, hepatitis C virus, and Japanese encephalitis virus. In this study, we evaluated the in vitro antiviral activities of GRFT and its synthetic trimeric tandemer (3mG) against NiV and other viruses from across 4 virus families. The 3mG had comparatively greater potency than GRFT against NiV due to its enhanced ability to block NiV glycoprotein-induced syncytia formation. Our initial in vivo prophylactic evaluation of an oxidation-resistant GRFT (Q-GRFT) showed significant protection against lethal NiV challenge in Syrian golden hamsters. Our results warrant further development of Q-GRFT and 3mG as potential NiV therapeutics. |
Mouse pulmonary dose- and time course-responses induced by exposure to nitrogen-doped multi-walled carbon nanotubes
Porter DW , Orandle M , Zheng P , Wu N , Hamilton RF Jr , Holian A , Chen BT , Andrew M , Wolfarth MG , Battelli L , Tsuruoka S , Terrones M , Castranova V . Inhal Toxicol 2020 32 (1) 1-15 Objective: In this study, we compared in vitro and in vivo bioactivity of nitrogen-doped multi-walled carbon nanotubes (NDMWCNT) to MWCNT to test the hypothesis that nitrogen doping would alter bioactivity. Materials and Methods: High-resolution transmission electron microscopy (TEM) confirmed the multilayer structure of MWCNT with an average layer distance of 0.36 nm, which was not altered by nitrogen doping: the nanomaterials had similar widths and lengths. In vitro studies with THP-1 cells and alveolar macrophages from C57BL/6 mice demonstrated that NDMWCNT were less cytotoxic and stimulated less IL-1beta release compared to MWCNT. For in vivo studies, male C57BL/6J mice received a single dose of dispersion medium (DM), 2.5, 10 or 40 microg/mouse of NDMWCNT, or 40 microg/mouse of MWCNT by oropharyngeal aspiration. Animals were euthanized between 1 and 7 days post-exposure for whole lung lavage (WLL) studies. Results and Discussion: NDMWCNT caused time- and dose-dependent pulmonary inflammation. However, it was less than that caused by MWCNT. Activation of the NLRP3 inflammasome was assessed in particle-exposed mice by determining cytokine production in WLL fluid at 1 day post-exposure. Compared to DM-exposed mice, IL-1beta and IL-18 were significantly increased in MWCNT- and NDMWCNT-exposed mice, but the increase caused by NDMWCNT was less than MWCNT. At 56 days post-exposure, histopathology determined lung fibrosis in MWCNT-exposed mice was greater than in NDMWCNT-exposed mice. Conclusions: These data indicate nitrogen doping of MWCNT decreases their bioactivity, as reflected by lower in vitro toxicity and reduced in vivo inflammation and lung disease. The lower activation of the NLRP3 inflammasome may be responsible. 
Abbreviations: NDMWCNT: nitrogen-doped multi-walled carbon nanotubes; MWCNT: multi-walled carbon nanotubes; TEM: transmission electron microscopy; HRTEM: high resolution transmission electron microscopy; IL-1β: interleukin-1β; DM: dispersion medium; WLL: whole lung lavage; IL-18: interleukin-18; GSD: geometric standard deviation; XPS: X-ray photoelectron spectroscopy; SEM: standard error of the mean; PMA: phorbol 12-myristate 13-acetate; LPS: lipopolysaccharide; LDH: lactate dehydrogenase; AM: alveolar macrophage; PMN: polymorphonuclear leukocyte. |
Validation of aztreonam-avibactam susceptibility testing using digitally dispensed custom panels
Ransom E , Bhatnagar A , Patel JB , Machado M , Boyd S , Reese N , Lutgring JD , Lonsway D , Anderson K , Brown AC , Elkins CA , Rasheed JK , Karlsson M . J Clin Microbiol 2020 58 (4) Aztreonam-avibactam is a combination antimicrobial agent with activity against carbapenemase-producing Enterobacteriaceae (CPE) with metallo-beta-lactamases (MbetaLs). Although aztreonam-avibactam is not yet approved by the U.S. Food and Drug Administration (FDA), clinicians can administer this combination by using two FDA-approved drugs: aztreonam and ceftazidime-avibactam. This combination of drugs is recommended by multiple experts for treatment of serious infections caused by MbetaL-producing CPE. At present, in vitro antimicrobial susceptibility testing (AST) of aztreonam-avibactam is not commercially available; thus, most clinicians receive no laboratory-based guidance that can support consideration of aztreonam-avibactam for serious CPE infections. Here, we report our internal validation for aztreonam-avibactam AST by reference broth microdilution (BMD) according to Clinical and Laboratory Standards Institute (CLSI) guidelines. The validation was performed using custom, frozen reference BMD panels prepared in-house at the Centers for Disease Control and Prevention (CDC). In addition, we took this opportunity to evaluate a new panel-making method using a digital dispenser, the Hewlett Packard (HP) D300e. Our studies demonstrate that the performance characteristics of digitally dispensed panels were equivalent to conventionally prepared frozen reference BMD panels for a number of drugs, including aztreonam-avibactam. We found the HP D300e liquid handler to be easy to use and to provide the capacity to prepare complex drug panels. Our findings will assist other clinical and public health laboratories in implementing susceptibility testing for aztreonam-avibactam. |
Using the collaborative requirements development methodology to build laboratory capacity for timely diagnosis during the Zika epidemic in Puerto Rico
Rembert JH , Zometa CS , O'Carroll PW , Licier AL , McPhillips-Tangum C , Hale PM . J Public Health Manag Pract 2020 27 (3) E143-E150 INTRODUCTION: In 2016, Puerto Rico became the focal point of the Zika epidemic, with more than 36 000 laboratory-confirmed cases before August. The Puerto Rico Department of Health (PRDH) responded by providing tests to symptomatic and asymptomatic pregnant women. The increased demand for Zika testing placed unprecedented strain on the laboratory capacity and information management processes used within the PRDH. The PRDH recognized the need to have an updated informatics system that securely manages, stores, and transmits digital data. The Centers for Disease Control and Prevention funded the Public Health Informatics Institute to collaborate with the PRDH to assess and improve the informatics capability to respond to the ongoing Zika virus transmission in Puerto Rico. APPROACH: The team employed a 4-component approach to assess the informatics system and improve the information management processes for laboratory testing and reporting of arboviral diseases (Zika, chikungunya, and dengue). The method consisted of (1) a needs assessment, (2) a business process analysis and requirements definition, (3) a vendor analysis, and (4) solution implementation. RESULTS: The needs assessment determined that the PRDH's procedures for arbovirus testing and reporting were highly complex and paper-based and thus did not maximize the use of existing technology. The solution was to build a Web portal. The business process analysis yielded information to create a map of the flow of specimens, an arbovirus context diagram, and more than 200 requirements. The requirements identified in this process guided the design and creation of the Web portal. DISCUSSION: This report describes the process to build a Web portal to enhance laboratory testing and electronic reporting of Zika cases during the 2016 epidemic in Puerto Rico. 
We demonstrate the utility of applying the Collaborative Requirements Development Methodology, a proven informatics method, to the development of a Web portal for managing arboviruses in a health department. |
Antiviral ranpirnase TMR-001 inhibits rabies virus release and cell-to-cell infection in vitro
Smith TG , Jackson FR , Morgan CN , Carson WC , Martin BE , Gallardo-Romero N , Ellison JA , Greenberg L , Hodge T , Squiquera L , Sulley J , Olson VA , Hutson CL . Viruses 2020 12 (2) Currently, no rabies virus-specific antiviral drugs are available. Ranpirnase has strong antitumor and antiviral properties associated with its ribonuclease activity. TMR-001, a proprietary bulk drug substance solution of ranpirnase, was evaluated against rabies virus in three cell types: mouse neuroblastoma, BSR (baby hamster kidney cells), and bat primary fibroblast cells. When TMR-001 was added to cell monolayers 24 h preinfection, rabies virus release was inhibited for all cell types at three time points postinfection. TMR-001 treatment simultaneous with infection and 24 h postinfection effectively inhibited rabies virus release in the supernatant and cell-to-cell spread with 50% inhibitory concentrations of 0.2-2 nM and 20-600 nM, respectively. TMR-001 was administered at 0.1 mg/kg via intraperitoneal, intramuscular, or intravenous routes to Syrian hamsters beginning 24 h before a lethal rabies virus challenge and continuing once per day for up to 10 days. TMR-001 at this dose, formulation, and route of delivery did not prevent rabies virus transit from the periphery to the central nervous system in this model (n = 32). Further aspects of local controlled delivery of other active formulations or dose concentrations of TMR-001 or ribonuclease analogues should be investigated for this class of drugs as a rabies antiviral therapeutic. |
Harmonization of commercial assays for PINP: the way forward
Vasikaran SD , Bhattoa HP , Eastell R , Heijboer AC , Jørgensen NR , Makris K , Ulmer C , Kanis JA , Cooper C , Silverman S , Cavalier E . Osteoporos Int 2020 31 (3) 10.1007/s00198-020-05310-6 International Federation of Clinical Chemistry and Laboratory Medicine and The International Osteoporosis Foundation Joint Committee on Bone Metabolism believes that the harmonization of PINP assays is an achievable and practical goal. INTRODUCTION: In order to examine the agreement between current commercial assays, a multi-center study was performed for PINP in serum and plasma. METHODS: The automated methods for PINP (Roche Cobas and IDS iSYS) gave similar results. A significant proportional bias was observed between the two automated assays and the Orion radioimmunoassay (RIA) for PINP. RESULTS: Results from other published studies comparing PINP values among these three assays broadly support our findings. Taken together, these results confirm that harmonized PINP measurements exist between the two automated assays (Roche Cobas and IDS iSYS) when the eGFR is > 30 mL/min/1.73m(2), but a significant bias exists between the Orion RIA and the two automated assays. CONCLUSION: Therefore, in subjects with normal renal function, PINP results reported by the Roche Cobas and IDS iSYS assays are similar and may be used interchangeably, and similar reference intervals and treatment targets could be applied for the two automated assays. Harmonization between the automated assays and the RIA is potentially possible with the use of common calibrators and the development of a reference method for PINP. This should also help ensure that any new commercial assay developed in the future will attain similar results. IOF and IFCC are committed to working together towards this goal with the cooperation of the reagent manufacturing industry. |
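As a simplified sketch of how a proportional bias between two assays can be characterized and removed by recalibration against a comparator: the code below fits a linear relationship and maps test-assay values onto the comparator scale. Ordinary least squares is used here only to keep the example self-contained; method-comparison practice typically favors Deming or Passing-Bablok regression, and the data are hypothetical, not from the study.

```python
# Hypothetical recalibration sketch: fit ref ≈ slope * test + intercept
# by ordinary least squares, then map test-assay readings onto the
# reference scale (illustration only, not a validated harmonization method).
def fit_recalibration(ref, test):
    n = len(ref)
    mx = sum(test) / n
    my = sum(ref) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(test, ref))
    sxx = sum((x - mx) ** 2 for x in test)
    slope = sxy / sxx
    intercept = my - slope * mx
    return slope, intercept

def recalibrate(value, slope, intercept):
    """Convert a test-assay result to the reference-assay scale."""
    return slope * value + intercept

# Example: a test assay reading ~20% high with no constant offset
ref = [10.0, 20.0, 30.0, 40.0]
test = [12.0, 24.0, 36.0, 48.0]
slope, intercept = fit_recalibration(ref, test)
```

In this purely proportional example the fitted slope is below 1 and the intercept is near zero, so recalibration amounts to rescaling; a common calibrator set plus a reference method, as the committee proposes, serves the same purpose across manufacturers.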
Numerical simulation of roof cavings in several Kuzbass mines using finite-difference continuum damage mechanics approach
Eremin M , Esterhuizen G , Smolin I . Int J Min Sci Technol 2020 30 (2) 157-166 An essential stage of mine design is an estimation of the steps of the first and periodic roof caving in longwall mines. Generally, this is carried out using field experience and can be much enhanced by numerical simulation. In this work, the finite-difference method was applied coupled with the continuum damage mechanics (CDM) approach to simulate the stress-strain evolution of the rock mass with the underground opening during coal extraction. The steps and stages of roof caving were estimated relying on the numerical simulation data, and they were compared with the field data from several operating mines in the south of the Kuznetsk Basin, Russia. The first roof caving step obtained in simulation correlates linearly with the field data. The results correspond to actual roof behavior in longwall panels of flat-dipping coal seams with an average face advance rate of approximately 5 m/day. |
A case study of the stability of a non-typical bleeder entry system at a U.S. longwall mine
Klemetti TM , Van Dyke MA , Tulu IB , Tuncay D . Int J Min Sci Technol 2020 30 (1) 25-31 Longwall abutment loads are influenced by several factors, including depth of cover, pillar sizes, panel dimensions, geological setting, mining height, proximity to gob, intersection type, and size of the gob. How does proximity to the gob affect pillar loading and entry condition? Does the gob influence depend on whether the abutment load is a forward, side, or rear loading? Do non-typical bleeder entry systems follow the traditional front and side abutment loading and extent concepts? If not, will an improved understanding of the combined abutment extent warrant a change in pillar design or standing support in bleeder entries? This paper details observations made in the non-typical bleeder entries of a moderate depth longwall panel: specifically, data collected from borehole pressure cells and roof extensometers, observations of the conditions of the entries, and numerical modeling of the bleeder entries during longwall extraction. The primary focus was on the extent and magnitude of the abutment loading experienced due to the extraction of the longwall panels. Due to the layout of the longwall panels and bleeder entries, the borehole pressure cells (BPCs) and roof extensometers did not show much change due to the advance of the first longwall. However, they did show a noticeable increase due to the second longwall advancement, with a maximum of about 4 MPa of pressure increase and 5 mm of roof deformation. The observations of the conditions showed little to no change from before the first longwall panel extraction began to when the second longwall panel had been advanced more than 915 m. Localized pillar spalling was observed on the corners of the pillars closest to the longwall gob as well as an increase in water in the entries. 
In addition to the observations and instrumentation, numerical modeling was performed to validate modeling procedures against the monitoring results and evaluate the bleeder design. ITASCA Consulting Group's FLAC3D numerical modeling software was used to evaluate the bleeder entries. The results of the models indicated only a minor increase in load during the extraction of the longwall panels. These models attributed a much greater share of the stress increase to the development of the gateroad and bleeder entries: about 80% from development and 20% from longwall extraction. The FLAC3D model showed very good correlation between modeled and expected gateroad loading during panel extraction. The front and side abutment extent modeled was very similar to observations from this and previous panels. |
Loading characteristics of mechanical rib bolts determined through testing and numerical modeling
Mohamed K , Rashed G , Radakovic-Guzina Z . Int J Min Sci Technol 2020 30 (1) 17-24 Underground coal mines use mechanical bolts in addition to other types of bolts to control rib deformation and stabilize yielded coal ribs. Limited research has been conducted to understand the performance of mechanical bolts in coal ribs. Researchers from the National Institute for Occupational Safety and Health (NIOSH) conducted this work to understand the loading characteristics of mechanical bolts (stiffness and capacity) installed in coal ribs at five underground coal mines. Standard pull-out tests were performed in this study to define the loading characteristics of mechanical rib bolts. Different installation torques were applied to the tested bolts based on the strength of the coal seam. A typical tri-linear load-deformation response for mechanical bolts was obtained from these tests. It was found that the anchorage capacity depended mainly on the coal strength. Guidelines for modeling mechanical bolts have been developed using the tri-linear load-deformation response. The outcome of this research provides essential data for rib support design. |
Maternal race trends in early infant feeding patterns in Hawai'i using newborn metabolic screening-birth certificate linked data 2008-2015
Hayes DK , Boundy EO , Hansen-Smith H , Melcher CL . Hawaii J Health Soc Welf 2020 79 (2) 42-50 Breastfeeding provides optimal nutrition for infants, including short- and long-term health benefits for baby and mother. Maternity care practices supporting breastfeeding after delivery increase the likelihood of exclusive breastfeeding. This study explores trends in early infant feeding practices by maternal race and other characteristics in Hawai'i. Data from a linked 2008-2015 Hawai'i Newborn Metabolic Screening and Birth Certificate file for 128 399 singleton term infants were analyzed. Early infant feeding, occurring 24-48 hours after delivery and before discharge, was categorized as early formula feeding, early mixed feeding, or early exclusive breastfeeding. Differences were assessed over time by maternal race and other socio-demographic characteristics. Further assessment of maternal race included a generalized logit model adjusting for maternal age, marital status, county of residence, type of birth attendant, and birth year. Statewide, early exclusive breastfeeding increased from 58.8% in 2008 to 79.1% in 2015 (relative increase=+35%); early mixed feeding declined from 31.1% to 16.0% (relative decrease=-49%) and early formula feeding declined from 10.1% to 4.9% (relative decrease=-51%). Most maternal race subgroups experienced increases in early exclusive breastfeeding and decreases in early mixed and early formula feeding. Japanese mothers were 2.15 (95%CI=1.90-2.42) and Korean mothers were 1.73 (95%CI=1.37-2.18) times more likely to practice early exclusive breastfeeding compared with white mothers. Several subgroups were less likely to practice early exclusive breastfeeding compared with white mothers. Substantial increases in early exclusive breastfeeding in Hawai'i occurred across all subgroups. Development of culturally appropriate hospital practices, particularly for subgroups with persistently lower estimates, could help improve early exclusive breastfeeding. |
Review of NIOSH cannabis-related health hazard evaluations and research
Couch JR , Grimes GR , Green BJ , Wiegand DM , King B , Methner MM . Ann Work Expo Health 2020 64 (7) 693-704 Since 2004, the National Institute for Occupational Safety and Health (NIOSH) has received 10 cannabis-related health hazard evaluation (HHE) investigation requests from law enforcement agencies (n = 5), state-approved cannabis grow operations (n = 4), and a coroner's office (n = 1). Earlier requests concerned potential illicit drug exposures (including cannabis) during law enforcement activities and criminal investigations. More recently, HHE requests have involved state-approved grow operations with potential occupational exposures during commercial cannabis production for medicinal and non-medical (recreational) use. As of 2019, cannabis remains a Schedule I substance at the federal level under the United States Drug Enforcement Administration. However, cannabis legalization at the state level has become more common in the USA. In two completed cannabis grow operation HHE investigations (two investigations were still ongoing as of 2019), potential dermal exposures were evaluated using two distinct surface wipe sample analytical methods. The first analyzed for delta-9-tetrahydrocannabinol (Delta9-THC) using a liquid chromatography and tandem mass spectrometry (LC-MS-MS) method with a limit of detection (LOD) of 4 nanograms (ng) per sample. The second method utilized high performance liquid chromatography with diode-array detection to analyze for four phytocannabinoids (Delta9-THC, Delta9-THC acid, cannabidiol, and cannabinol) with an LOD of 2000 ng per sample which, comparing Delta9-THC limits, was orders of magnitude higher than that of the LC-MS-MS method. 
Surface wipe sampling results for both methods illustrated widespread contamination of all phytocannabinoids throughout the tested occupational environments, highlighting the need to consider THC form (Delta9-THC or Delta9-THC acid) as well as other biologically active phytocannabinoids in exposure assessments. In addition to potential cannabis-related dermal exposures, ergonomic stressors, and psychosocial issues, the studies found that employees in cultivation, harvesting, and processing facilities could potentially be exposed to allergens and respiratory hazards through inhalation of organic dusts (including fungus, bacteria, and endotoxin) and volatile organic compounds (VOCs) such as diacetyl and 2,3-pentanedione. These hazards were most evident during the decarboxylation and grinding of dried cannabis material, where elevated job-specific concentrations of VOCs and endotoxin were generated. Additionally, utilization of contemporary gene sequencing methods in NIOSH HHEs provided a more comprehensive characterization of the microbial communities encountered during cannabis cultivation and processing. Internal Transcribed Spacer region sequencing revealed over 200 fungal operational taxonomic units, and breathing zone air samples were predominantly composed of Botrytis cinerea, a cannabis plant pathogen. B. cinerea, commonly known as gray mold within the industry, has been previously associated with hypersensitivity pneumonitis. This work elucidates new occupational hazards related to cannabis production and the evolving occupational safety and health landscape of an emerging industry, provides a summary of cannabis-related HHEs, and discusses critical lessons learned from these previous HHEs. |
Using behavioral theory to enhance occupational safety and health: Applications to health care workers
Guerin RJ , Sleet DA . Am J Lifestyle Med 2020 15 (3) 269-278 Work-related morbidity and mortality are persistent public health problems across all US industrial sectors, including health care. People employed in health care and social services are at high risk for experiencing injuries and illnesses related to their work. Social and behavioral science theories can be useful tools for designing interventions to prevent workplace injuries and illnesses and can provide a roadmap for investigating the multilevel factors that may hinder or promote worker safety and health. Specifically, individual-level behavioral change theories can be useful in evaluating the proximal, person-related antecedents (such as perceived behavioral control) that influence work safety outcomes. This article (1) provides a brief overview of widely used, individual-level behavior change theories and examples of their application to occupational safety and health (OSH)-related interventions that involve the health care community; (2) introduces an integrated theory of behavior change and its application to promoting the OSH of health care workers; and (3) discusses opportunities for application of individual-level behavior change theory to OSH research and practice activities involving health care workers. The use of behavioral science to consider the role of individual behaviors in promoting health and preventing disease and injury provides a necessary complement to structural approaches to protecting workers in the health care industry. |
An application of a modified theory of planned behavior model to investigate adolescents' job safety knowledge, norms, attitude and intention to enact workplace safety and health skills
Guerin RJ , Toland MD . J Safety Res 2020 72 189-198 Introduction: For many reasons, including a lack of adequate safety training and education, U.S. adolescents experience a higher rate of job-related injury compared to adult workers. Widely used social-psychological theories in public health research and practice, such as the theory of planned behavior, may provide guidance for developing and evaluating school-based interventions to prepare adolescents for workplace hazards and risks. Method: Using a structural equation modeling approach, the current study explores whether a modified theory of planned behavior model provides insight on 1,748 eighth graders’ occupational safety and health (OSH) attitude, subjective norm, self-efficacy and behavioral intention, before and after receiving instruction on a free, national young worker safety and health curriculum. Reliability estimates for the measures were produced and direct and indirect associations between knowledge and other model constructs assessed. Results: Overall, the findings align with the theory of planned behavior. The structural equation model adequately fit the data; most path coefficients are statistically significant and knowledge has indirect effects on behavioral intention. Confirmatory factor analyses suggest that the knowledge, attitude, self-efficacy, and behavioral intention measures each reflect a unique dimension (reliability estimates ≥0.86), while the subjective norm measure did not perform adequately. Conclusion: The findings presented provide support for using behavioral theory (specifically a modified theory of planned behavior) to investigate adolescents’ knowledge, perceptions, and behavioral intention to engage in safe and healthful activities at work, an understanding of which may contribute to reducing the downstream burden of injury on this vulnerable population—the future workforce. 
Practical application: Health behavior theories, commonly used in the social and behavioral sciences, have utility and provide guidance for developing and evaluating OSH interventions, including those aimed at preventing injuries and promoting the health and safety of adolescent workers in the U.S., who are injured at higher rates than are adults. |
Review of construction employer case studies of safety and health equipment interventions
Lowe BD , Albers J , Hayden M , Lampl M , Naber S , Wurzelbacher S . J Constr Eng Manag 2020 146 (4) This paper presents a review of 153 case studies of equipment interventions to improve the safety and health of construction businesses in Ohio in 2003-2016. These represent $6.46 million (2016 USD) in purchases incentivized through the Ohio Bureau of Workers' Compensation (OHBWC) Safety Intervention Grant (SIG) program. The source data in the review were extracted from employer grant applications and final reports of the case studies. Results were aggregated by type of construction equipment and included the reduction in safety and ergonomic hazards (risk factors for work-related musculoskeletal disorders) and an assessment of the quality of the case studies as determined through criteria established by the authors. The equipment associated with the greatest reduction in risk factors, and with case studies of higher quality, included electrical cable feeding/pulling systems, concrete sawing equipment, skid steer attachments for concrete breaking, and manlifts (boom lifts). This review illustrates the challenges in demonstrating the efficacy of equipment interventions to improve construction safety and health, even from case studies within a structured health/safety program. The authors are aware of no other systematic review of case studies reporting on experiences with health/safety intervention equipment specific to the construction industry. |
Associations of objectively measured sleep characteristics and incident hypertension among police officers: The role of obesity
Ma CC , Gu JK , Bhandari R , Charles LE , Violanti JM , Fekedulegn D , Andrew ME . J Sleep Res 2020 29 (6) e12988 This study investigated the associations of baseline sleep onset latency, wake after sleep onset, longest wake episode, number of awakenings, sleep efficiency and sleep duration with incident hypertension during a 7-year follow-up (n = 161, 68% men) and the joint effect of insufficient sleep and obesity on incident hypertension. Sleep parameters were derived from 15-day actigraphy data. Relative risks and 95% confidence intervals were estimated using a robust Poisson regression model. Each 10-min increase in sleep onset latency was associated with an 89% higher risk of hypertension (95% confidence interval [CI] = 1.12-3.20). Each 10-min increase in longest wake episode was associated with a 23% higher risk of hypertension (95% CI = 1.01-1.50) and each 10% decrease in sleep efficiency was associated with a 50% higher risk of hypertension (95% CI = 1.02-2.22). These associations were independent of demographic and lifestyle characteristics, depressive symptoms, shift work, sleep duration and body mass index. Having <6 hr of sleep and a body mass index >/=30 kg/m(2) increased the risk of hypertension (relative risk = 2.81; 95% CI = 1.26-6.25) compared with having >/=6 hr of sleep and a body mass index <30 kg/m(2), after controlling for confounders. The relative excess risk due to interaction was 3.49 (95% CI = -1.69 to 8.68) and the ratio of relative risks was 3.21 (95% CI = 0.72-14.26). These results suggest that poor sleep quality is a risk factor for hypertension. Longitudinal studies with larger sample sizes are warranted to examine the joint effect of insufficient sleep and obesity on the development of hypertension. |
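The additive and multiplicative interaction measures reported in the abstract above follow standard formulas. As a minimal sketch in Python, using the joint relative risk of 2.81 from the abstract but hypothetical single-exposure relative risks (1.20 and 1.10 are illustrative placeholders; the abstract does not report them):

```python
def reri(rr_both, rr_a_only, rr_b_only):
    """Relative excess risk due to interaction (additive scale):
    RERI = RR11 - RR10 - RR01 + 1."""
    return rr_both - rr_a_only - rr_b_only + 1

def ratio_of_rr(rr_both, rr_a_only, rr_b_only):
    """Multiplicative interaction: RR11 / (RR10 * RR01)."""
    return rr_both / (rr_a_only * rr_b_only)

# Joint RR of 2.81 is from the abstract; the single-exposure RRs
# (1.20 for short sleep alone, 1.10 for obesity alone) are hypothetical.
print(reri(2.81, 1.20, 1.10))          # additive interaction
print(ratio_of_rr(2.81, 1.20, 1.10))   # multiplicative interaction
```

A positive RERI indicates risk beyond additivity; a ratio of relative risks above 1 indicates risk beyond multiplicativity.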
Whole-body vibration biodynamics - a critical review: I. Experimental biodynamics
Rakheja S , Dewangan KN , Dong RG , Marcotte P . Int J Veh Perform 2020 6 (1) 1-51 In the framework of whole-body vibration (WBV), biodynamics refers to biomechanical responses of the human body to impressed oscillatory forces or motions. The biodynamic responses of the human body to WBV form an essential basis for understanding the mechanical-equivalent properties of the body and potential injury mechanisms, and for developments in frequency weightings and design tools for systems coupled with the human operator. In this first part, the biodynamic responses obtained experimentally in terms of 'to-the-body' and 'through-the-body' functions are critically reviewed and discussed to highlight the influences of various contributory factors, such as those related to posture, body support, anthropometry and nature of vibration, together with the range of experimental conditions. The reported data invariably show highly complex, nonlinear and coupled effects of the majority of the contributory factors. It is shown that the reported studies often draw conflicting conclusions about the effects of many factors, such as posture, gender, vibration and support conditions. |
Whole-body vibration biodynamics - a critical review: II. biodynamic modelling
Rakheja S , Dewangan KN , Dong RG , Marcotte P , Pranesh A . Int J Veh Perform 2020 6 (1) 52-84 Biodynamic models of seated body exposed to whole-body vibration are considered important for design of vibration control devices and anthropodynamic surrogates for efficient performance assessments of vibration isolators. In this second part, the reported biodynamic models of the seated body are briefly reviewed together with the different modelling approaches. The models are identified from target functions derived from the measured biodynamic responses, reviewed in the first part of this paper. Relationships between different target functions are discussed together with the merits and limitations of different modelling approaches. Further efforts are needed for developing representative target functions for deriving reliable models for designing engineering interventions and for predicting potential health and comfort effects. |
TREXMO plus: an advanced self-learning model for occupational exposure assessment
Savic N , Lee EG , Gasic B , Vernez D . J Expo Sci Environ Epidemiol 2020 30 (3) 554-566 In Europe, several occupational exposure models have been developed and are recommended for regulatory exposure assessment. Only some information on the substance of interest (e.g., vapor pressure) and the workplace conditions (e.g., ventilation rate) is required in these models to predict an exposure value that will be later used to characterize the risk. However, it has been shown that models may differ in their predictions and that, usually, one of the models best fits a given set of exposure conditions. Unfortunately, there are no clear rules on how to select the best model. In this study, we developed a new modeling approach that jointly uses the three most popular models, Advanced REACH Tool, Stoffenmanager, and ECETOC TRAv3, to obtain a unique exposure prediction. This approach is an extension of the TREXMO tool and is called TREXMO+. TREXMO+ applies a machine-learning technique to a set of exposure data with measured values to split them into smaller subsets corresponding to exposure conditions sharing similar characteristics. For each subset, TREXMO+ then establishes a regression model with the three REACH tools used as the exposure predictors. The performance of the new model was tested and a comparison was made between the results obtained by TREXMO+ and those obtained by conventional tools. The TREXMO+ model was found to be less biased and more accurate than the REACH models. Its predictions generally differ from measurements by a factor of 2-3, whereas conventional models were found to differ by a factor of 2-14. However, as the available test dataset is limited, these results will need to be confirmed by larger-scale tests. |
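The core TREXMO+ idea described above (partition exposure situations into subsets of similar conditions, then regress measured values on the three tools' predictions within each subset) can be sketched as follows. This is a minimal illustration under stated assumptions, not the published implementation: the subset key, the synthetic data, and the use of ordinary least squares without an intercept are all placeholders, and TREXMO+ itself derives the subsets with a machine-learning split.

```python
def ols3(X, y):
    """Least-squares coefficients for 3 predictors (no intercept),
    solved from the normal equations A*beta = b via Cramer's rule."""
    A = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(3)]

    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    D = det(A)
    beta = []
    for k in range(3):
        Ak = [row[:] for row in A]   # replace column k with b
        for i in range(3):
            Ak[i][k] = b[i]
        beta.append(det(Ak) / D)
    return beta

def fit_per_subset(records):
    """records: list of (subset_key, [art, stoffenmanager, tra], measured).
    Returns one coefficient triple per subset of similar exposure conditions."""
    subsets = {}
    for key, preds, measured in records:
        subsets.setdefault(key, ([], []))
        subsets[key][0].append(preds)
        subsets[key][1].append(measured)
    return {key: ols3(X, y) for key, (X, y) in subsets.items()}
```

A combined prediction for a new situation would then be the dot product of its three tool predictions with the coefficients of the subset it falls into.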
Association of occupational exposures with ex vivo functional immune response in workers handling carbon nanotubes and nanofibers
Schubauer-Berigan MK , Dahm MM , Toennis CA , Sammons DL , Eye T , Kodali V , Zeidler-Erdely PC , Erdely A . Nanotoxicology 2020 14 (3) 1-16 The objective of this study was to evaluate the association between carbon nanotube and nanofiber (CNT/F) exposure and ex vivo responses of whole blood challenged with secondary stimulants, adjusting for potential confounders, in a cross-sectional study of 102 workers. Multi-day exposure was measured by CNT/F structure count (SC) and elemental carbon (EC) air concentrations. Demographic, lifestyle and other occupational covariate data were obtained via questionnaire. Whole blood collected from each participant was incubated for 18 hours with and without two microbial stimulants (lipopolysaccharide/LPS and staphylococcal enterotoxin type B/SEB) using TruCulture technology to evaluate immune cell activity. Following incubation, supernatants were preserved and analyzed for protein concentrations. The stimulant:null response ratio for each individual protein was analyzed using multiple linear regression, followed by principal component (PC) analysis to determine whether patterns of protein response were related to CNT/F exposure. Adjusting for confounders, CNT/F metrics (most strongly, the SC-based) were significantly (p < 0.05) inversely associated with stimulant:null ratios of several individual biomarkers: GM-CSF, IFN-gamma, interleukin (IL)-2, IL-4, IL-5, IL-10, IL-17, and IL-23. CNT/F metrics were significantly inversely associated with PC1 (a weighted mean of most biomarkers, explaining 25% of the variance in the protein ratios) and PC2 (a biomarker contrast, explaining 14%). Among other occupational exposures, only solvent exposure was significant (inversely related to PC2). CNT/F exposure metrics were uniquely related to stimulant responses in challenged whole blood, illustrating reduced responsiveness to a secondary stimulus. 
This approach, if replicated in other exposed populations, may present a relatively sensitive method to evaluate human response to CNT/F or other occupational exposures. |
Contribution of various types and categories of diesel-powered vehicles to aerosols in an underground mine
Bugarski AD , Hummer JA . J Occup Environ Hyg 2020 17 (4) 1-14 A study was conducted in an underground mine with the objective to assess relative contributions of different types and categories of diesel-powered vehicles to submicron aerosol concentrations and to assess the effectiveness of selected diesel particulate matter control strategies and technologies. The net contributions of each of six heavy-duty (HD) vehicles, five light-duty (LD) vehicles, and the effects of disposable filter elements (DFEs), a sintered metal filter (SMF) system, and repowering were assessed using isolated zone methodology. On average, the HD vehicles powered by engines that were not retrofitted with filtration systems contributed approximately three times more to the number of aerosols and six times more to elemental carbon (EC) mass concentrations than LD vehicles powered by engines that were not retrofitted with filtration systems. Replacing an Environmental Protection Agency (EPA) pre-Tier engine in the non-permissible HD vehicle with an EPA Tier 3 engine resulted in 63% lower EC concentrations and 41% lower aerosol number concentrations. The evaluated filtration system with DFEs reduced the contribution of diesel-powered vehicles to number concentrations of aerosols by 77 to 92% and the average EC concentrations by 95%. The SMF reduced the contribution of diesel-powered vehicles to number concentrations of aerosols and EC concentrations by 93 and 95%, respectively. When compared with older units, one of the newer model personnel carriers contributed noticeably less to EC mass concentrations but almost equally to the number concentrations of diesel aerosols in the mine air. The second newer type of alternative personnel carrier vehicle contributed more to number and EC mass concentrations than the old-style personnel carrier. 
The LD vehicle powered by an EPA Tier 4f engine equipped with a DPF system contributed least of all tested vehicles to aerosol number and EC mass concentrations. This information is critical to the efforts of the underground mining industry to reduce exposures of workers to diesel aerosols. |
Analysis of fall-related imminent danger orders in the metal/nonmetal mining sector
Hrica JK , Eiter BM , Pollard JP , Kocher LM , Nasarwanji M . Min Metall Explor 2020 37 (2) 619-630 Within the metal/nonmetal mining sector, fall-related incidents account for a large proportion of fatal and non-fatal injuries. However, the events and contributing factors leading up to these incidents have not been fully investigated. To help provide a clearer picture of these factors, an analysis of imminent danger orders issued by the Mine Safety and Health Administration (MSHA) between 2010 and 2017 at both surface and underground metal/nonmetal mine sites revealed that most orders are associated with fall risks. Of these cases, 84% involved the workers not using fall protection, fall protection not being provided, or the improper use of fall protection. Fall risks for workers most frequently occurred when standing on mobile equipment, performing maintenance and repairs on plant equipment, or working near highwalls. In most cases, a single, basic, corrective action (e.g., using fall protection) would have allowed workers to perform the task safely. Overall, these findings suggest that a systematic approach is needed to identify, eliminate, and prevent imminent danger situations. Furthermore, to protect mineworkers from falls from height, frequently performed tasks requiring fall protection should be redesigned to eliminate the reliance on personal fall protection. |
Prevalence of spirometry-defined airflow obstruction in never-smoking working US coal miners by pneumoconiosis status
Kurth L , Laney AS , Blackley DJ , Halldin CN . Occup Environ Med 2020 77 (4) [Epub ahead of print] Introduction: This study estimated the prevalence of spirometry-defined airflow obstruction and coal workers' pneumoconiosis (CWP) among never-smoking coal miners participating in the National Institute for Occupational Safety and Health (NIOSH) Coal Workers' Health Surveillance Program (CWHSP). Methods: Data were from working miners screened by a CWHSP mobile unit who had valid spirometry and chest radiography results. Spirometry-defined airflow obstruction was determined when the ratio of forced expiratory volume in the first second to forced vital capacity was less than the lower limit of normal. Chest radiographs were classified according to the International Labour Office system to identify pneumoconiosis, including the most severe form of pneumoconiosis, progressive massive fibrosis (PMF). Results: Prevalence of airflow obstruction among never-smoking coal miners in this sample was 7.7% overall, 16.4% among miners with CWP and 32.3% among miners with PMF. Airflow obstruction was significantly associated with CWP and PMF. Conclusions: There was a higher prevalence of airflow obstruction among never-smoking coal miners with pneumoconiosis compared with those without pneumoconiosis. These findings support prior research on airflow obstruction and smoking and show that pneumoconiosis might present with an obstructive pattern regardless of smoking status. |
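The case definition above (airflow obstruction when FEV1/FVC falls below the lower limit of normal, LLN) reduces to a single comparison. A minimal sketch with illustrative numbers only; in practice the LLN ratio comes from population reference equations specific to age, sex, height, and other factors:

```python
def airflow_obstruction(fev1_l, fvc_l, lln_ratio):
    """Spirometry-defined airflow obstruction: FEV1/FVC below the
    lower limit of normal for the person's reference group."""
    return (fev1_l / fvc_l) < lln_ratio

# Illustrative values (volumes in litres; LLN ratio is hypothetical).
print(airflow_obstruction(2.1, 3.5, 0.65))  # 2.1/3.5 = 0.60 < 0.65 -> True
print(airflow_obstruction(3.2, 4.0, 0.70))  # 3.2/4.0 = 0.80      -> False
```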
Helmet-CAM: strategically minimizing exposures to respirable dust through video exposure monitoring
Patts JR , Cecala AB , Haas EJ . Min Metall Explor 2020 37 (2) [Epub ahead of print] Exposure to respirable crystalline silica (RCS) remains a serious health hazard to the US mining workforce, who are potentially exposed as various ore bodies are drilled, blasted, hauled by truck, crushed, screened, and transported to their destinations. The current Mine Safety and Health Administration (MSHA) permissible exposure limit (PEL) for RCS remains at approximately 100 µg/m3, but it is noteworthy that the Occupational Safety and Health Administration (OSHA) has lowered its PEL to 50 µg/m3 (with enforcement dates staggered through 2022 for various sectors), and the National Institute for Occupational Safety and Health (NIOSH) has held a 50 µg/m3 recommended standard since 1976. To examine a method for reducing RCS exposure using a NIOSH-developed video exposure monitoring (VEM) technology (referred to as Helmet-CAM), video and respirable dust concentration data were collected on 80 miners across seven unique mining sites. The data were then collated and partitioned using a thresholding scheme to identify exposures in excess of ten times the mean exposure for that worker. Focusing on these short-duration, high-magnitude exposures can provide the insight needed to implement controls and interventions that dramatically lower the employee's overall average exposure. In 19 of the 80 cases analyzed, overall exposure could be lowered by 20% or more by controlling exposures that occur during just 10 min of work per 8-hour shift. This approach provides a method to quickly analyze and determine which activities are creating the greatest health concerns. In most cases, once identified, focused control technologies or behavioral modifications can be applied to those tasks. |
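The thresholding scheme described above (flag readings exceeding ten times the worker's mean, then estimate the payoff from controlling them) can be sketched as follows. This is an illustration only: the synthetic readings and the assumption that controls bring flagged readings down to the background mean are hypothetical, not the NIOSH Helmet-CAM analysis itself.

```python
def high_exposure_impact(conc, factor=10.0):
    """Flag short-duration readings above `factor` x the worker's mean and
    estimate how much the shift average would drop if they were controlled.
    Assumes controls reduce flagged readings to the mean of the rest."""
    mean = sum(conc) / len(conc)
    threshold = factor * mean
    flagged = [c for c in conc if c > threshold]
    rest = [c for c in conc if c <= threshold]
    rest_mean = sum(rest) / len(rest) if rest else 0.0
    new_mean = (sum(rest) + rest_mean * len(flagged)) / len(conc)
    return flagged, 1 - new_mean / mean

# Hypothetical shift: 470 background readings plus 10 brief spikes.
readings = [50.0] * 470 + [5000.0] * 10
spikes, reduction = high_exposure_impact(readings)
print(len(spikes), round(reduction, 2))
```

Here a handful of brief spikes dominates the shift average, mirroring the paper's finding that controlling roughly 10 min of work can cut the average by 20% or more.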
PUMA - pooled uranium miners analysis: cohort profile
Rage E , Richardson DB , Demers PA , Do M , Fenske N , Kreuzer M , Samet J , Wiggins C , Schubauer-Berigan MK , Kelly-Reif K , Tomasek L , Zablotska LB , Laurier D . Occup Environ Med 2020 77 (3) 194-200 OBJECTIVES: Epidemiological studies of underground miners have provided clear evidence that inhalation of radon decay products causes lung cancer. Moreover, these studies have served as a quantitative basis for estimation of radon-associated excess lung cancer risk. However, questions remain regarding the effects of exposure to the low levels of radon decay products typically encountered in contemporary occupational and environmental settings on the risk of lung cancer and other diseases, and on the modifiers of these associations. These issues are of central importance for estimation of risks associated with residential and occupational radon exposures. METHODS: The Pooled Uranium Miner Analysis (PUMA) assembles information on cohorts of uranium miners in North America and Europe. Data available include individual annual estimates of exposure to radon decay products, demographic and employment history information on each worker and information on vital status, date of death and cause of death. Some, but not all, cohorts also have individual information on cigarette smoking, external gamma radiation exposure and non-radiological occupational exposures. RESULTS: The PUMA study represents the largest study of uranium miners conducted to date, encompassing 124 507 miners, 4.51 million person-years at risk and 54 462 deaths, including 7825 deaths due to lung cancer. Planned research topics include analyses of associations between radon exposure and mortality due to lung cancer, cancers other than lung, non-malignant disease, modifiers of these associations and characterisation of overall relative mortality excesses and lifetime risks. 
CONCLUSION: PUMA provides opportunities to evaluate new research questions and to conduct analyses to assess potential health risks associated with uranium mining that have greater statistical power than can be achieved with any single cohort. |
Analysis of ARMPS2010 database with LaModel and an updated abutment angle equation
Tuncay D , Tulu IB , Klemetti T . Int J Min Sci Technol 2020 30 (1) 111-118 The Analysis of Retreat Mining Pillar Stability (ARMPS) program was developed by the National Institute for Occupational Safety and Health (NIOSH) to help the United States coal mining industry design safe retreat room-and-pillar panels. ARMPS calculates the magnitude of the in-situ and mining-induced loads by using geometrical computations and empirical rules. In particular, the program uses the “abutment angle” concept in calculating the magnitude of the abutment load on pillars adjacent to a gob. In this paper, stress measurements from United States and Australian mines with different overburden geologies and varying hard rock percentages were back-analyzed. The results of the analyses indicated that for depths less than 200 m, the ARMPS empirical derivation of a 21° abutment angle was supported by the case histories; however, at depths greater than 200 m, the abutment angle was found to be significantly less than 21°. A new equation employing the panel width to overburden depth ratio is therefore constructed to calculate accurate abutment angles for deeper mining cases. The new abutment angle equation was tested using both ARMPS2010 and LaModel for the entire case history database of ARMPS2010. Used together with the LaModel program, the new abutment angle equation was found to give good classification accuracy relative to ARMPS2010 for deep cover cases. |
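The abutment angle concept referenced above has a standard geometric form in the ground control literature: the side abutment load per unit length of gateroad depends on depth H, panel width P, and abutment angle beta, with supercritical panels (P >= 2H tan beta) carrying the full abutment wedge. The sketch below is a hedged textbook form under an assumed uniform overburden unit weight; it is not the new equation proposed in this paper, which replaces the fixed 21° angle with a function of the panel-width-to-depth ratio.

```python
import math

def side_abutment_load(depth_m, panel_width_m, abutment_angle_deg,
                       unit_weight=25.0):
    """Side abutment load per metre of gateroad (kN/m) from the abutment
    angle concept. unit_weight is an assumed overburden unit weight in
    kN/m^3; the 25.0 default is illustrative."""
    t = math.tan(math.radians(abutment_angle_deg))
    if panel_width_m >= 2 * depth_m * t:
        # Supercritical panel: full abutment wedge is mobilized.
        return unit_weight * depth_m ** 2 * t / 2
    # Subcritical panel: only part of the wedge loads the gateroad side.
    return unit_weight * (depth_m * panel_width_m / 2
                          - panel_width_m ** 2 / (8 * t))
```

With the 21° angle cited in the abstract, a narrow (subcritical) panel transfers less side abutment load than a supercritical one at the same depth, which is the behavior the abutment angle concept is meant to capture.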
Geologic data collection and assessment techniques in coal mining for ground control
Van Dyke M , Klemetti T , Wickline J . Int J Min Sci Technol 2020 30 (1) 131-139 The identification and mitigation of adverse geologic conditions are critical to the safety and productivity of underground coal mining operations. To anticipate and mitigate adverse geologic conditions, a formal method to evaluate geotechnical factors must be established. Each mine is unique and has its own approach to defining what constitutes an adverse geologic condition. The collection of geologic data is a critical first step in creating a geologic database to map these hazards efficiently and effectively. Considerations include the lithology of the immediate roof and floor strata, seam height, gas and oil wells, faults, depressions in the mine floor (water) and increases in floor elevation (gas), overburden, streams, and horizontal stress directions, among other factors. Once geologic data are collected, they can be refined and integrated into a database used to develop maps showing the trend, orientation, and extent of adverse geologic conditions. This information, delivered in a timely manner, allows mining personnel to be proactive in mine planning and support implementation, ultimately reducing the impacts of these features. This paper covers geologic exploratory methods, data organization, and the value of collecting and interpreting geologic information in coal mines to enhance safety and production. The implementation of the methods described above has proven effective in predicting and mitigating adverse geologic conditions in underground coal mining. Consistent re-evaluation of data collection methods, geologic interpretations, mapping procedures, and communication techniques ensures continuous improvement in the accuracy of predictions and mitigation of adverse geologic conditions. 
Providing a concise record of the work previously done to track geologic conditions at a mine will allow for a smooth transition during employee turnover. With refinements and standardization of data collection methods, such as those described in this paper, along with improvements in technology, the evaluation of adverse geologic conditions will evolve and continue to improve the safety and productivity of underground coal mining. |
Influence of longwall mining on the stability of gas wells in chain pillars
Zhang P , Dougherty H , Su D , Trackemas J , Tulu B . Int J Min Sci Technol 2020 30 (1) 3-9 Longwall mining has a significant influence on gas wells located within longwall chain pillars. Subsurface subsidence and abutment pressure induced by longwall mining can cause excessive stresses and deformations in gas well casings. If the gas well casings are compromised or ruptured, natural gas could migrate into the mine workings, potentially causing a fire or explosion. Under current safety regulations, the gas wells in the chain pillars have to be either plugged or protected by adequate coal pillars. The current regulations for gas well pillar design are based on the 1957 Pennsylvania gas well pillar study. The study provided guidelines for gas well pillars by considering their support area and overburden depth as well as the location of the gas wells within the pillars. As the guidelines were developed for room-and-pillar mining under shallow cover, they are no longer applicable to modern longwall coal mining, particularly under deep cover. Gas well casing failures have occurred even though the chain pillars for the gas wells met the requirements of the 1957 study. This study, conducted by the National Institute for Occupational Safety and Health (NIOSH), presents seven cases of conventional gas wells penetrating through longwall chain pillars in the Pittsburgh Coal Seam. The study results indicate that overburden depth and pillar size are not the only determining factors for gas well stability. Other important factors include subsurface ground movement, overburden geology, weak floor, and the type of gas well construction. Numerical modeling was used to model abutment pressure, subsurface deformations, and the response of gas well casings. 
The study demonstrated that numerical models are able to predict with reasonable accuracy the subsurface deformations in the overburden above, within, and below the chain pillars, and the potential location and modes of gas well failures, thereby providing a more quantifiable approach to assess the stability of the gas wells in longwall chain pillars. |
Survey of schistosomiasis in Saint Lucia: Evidence for interruption of transmission
Gaspard J , Usey MM , Fredericks-James M , Sanchez MJ , Atkins L , Campbell CH , Corstjens Plam , van Dam GJ , Colley DG , Secor WE . Am J Trop Med Hyg 2020 102 (4) 827-831 Saint Lucia at one time had levels of schistosomiasis prevalence and morbidity as high as many countries in Africa. However, as a result of control efforts and economic development, including more widespread access to sanitation and safe water, schistosomiasis on the island has practically disappeared. To evaluate the current status of schistosomiasis in Saint Lucia, we conducted a nationally representative school-based survey of 8-11-year-old children for prevalence of Schistosoma mansoni infections using circulating antigen and specific antibody detection methods. We also conducted a questionnaire about available water sources, sanitation, and contact with fresh water. The total population of 8-11-year-old children on Saint Lucia was 8,985; of these, 1,487 (16.5%) provided urine for antigen testing, 1,455 (16.2%) provided fingerstick blood for antibody testing, and 1,536 (17.1%) answered the questionnaire. Although a few children were initially low positives by antigen or antibody detection methods, none could be confirmed positive by follow-up testing. Most children reported access to clean water and sanitary facilities in or near their homes and 48% of the children reported contact with fresh water. Together, these data suggest that schistosomiasis transmission has been interrupted on Saint Lucia. Additional surveys of adults, snails, and a repeat survey among school-age children will be necessary to verify these findings. However, in the same way that research on Saint Lucia generated the data leading to use of mass drug administration for schistosomiasis control, the island may also provide the information needed for guidelines to verify interruption of schistosomiasis transmission. |
Association between malaria infection and early childhood development mediated by anemia in rural Kenya
Milner EM , Kariger P , Pickering AJ , Stewart CP , Byrd K , Lin A , Rao G , Achando B , Dentz HN , Null C , Fernald LCH . Int J Environ Res Public Health 2020 17 (3) Malaria is a leading cause of morbidity and mortality among children under five years of age, with most cases occurring in Sub-Saharan Africa. Children in this age group in Africa are at greatest risk worldwide for developmental deficits. There are research gaps in quantifying the risks of mild malaria cases, understanding the pathways linking malaria infection and poor child development, and evaluating the impact of malaria on the development of children under five years. We analyzed the association between malaria infection and gross motor, communication, and personal social development in 592 children age 24 months in rural, western Kenya as part of the WASH Benefits environmental enteric dysfunction sub-study. Eighteen percent of children had malaria, 20% were at risk for gross motor delay, 21% were at risk for communication delay, and 23% were at risk for personal social delay. Having a positive malaria test was associated with increased risk for gross motor, communication, and personal social delay while adjusting for child characteristics, household demographics, study cluster, and intervention treatment arm. Mediation analyses suggested that anemia was a significant mediator in the pathway between malaria infection and risk for gross motor, communication, and personal social development delays. The proportion of the total effect of malaria on the risk of developmental delay that is mediated by anemia across the subscales was small (ranging from 9% of the effect on gross motor development to 16% of the effect on communication development mediated by anemia). Overall, malaria may be associated with short-term developmental delays during a vulnerable period of early life. 
Therefore, preventative malaria measures and immediate treatment are imperative for children's optimal development, particularly in light of projections of continued high malaria transmission in Kenya and Africa. |
The forgotten exotic tapeworms: a review of uncommon zoonotic Cyclophyllidea
Sapp SGH , Bradbury RS . Parasitology 2020 147 (5) 1-26 As training in helminthology has declined in the medical microbiology curriculum, many rare species of zoonotic cestodes have fallen into obscurity. Even among specialist practitioners, knowledge of human intestinal cestode infections is often limited to three genera, Taenia, Hymenolepis and Dibothriocephalus. However, five genera of uncommonly encountered zoonotic Cyclophyllidea (Bertiella, Dipylidium, Raillietina, Inermicapsifer and Mesocestoides) may also cause patent intestinal infections in humans worldwide. Due to the limited availability of summarized and taxonomically accurate data, such cases may present a diagnostic dilemma to clinicians and laboratories alike. In this review, historical literature on these cestodes is synthesized and knowledge gaps are highlighted. Clinically relevant taxonomy, nomenclature, life cycles, morphology of human-infecting species are discussed and clarified, along with the clinical presentation, diagnostic features and molecular advances, where available. Due to the limited awareness of these agents and identifying features, it is difficult to assess the true incidence of these 'forgotten' cestodiases as clinical misidentifications are likely to occur. Also, the taxonomic status of many of the human-infecting species of these tapeworms is unclear, hampering accurate species identification. Further studies combining molecular data and morphological observations are necessary to resolve these long-standing taxonomic issues and to elucidate other unknown aspects of transmission and ecology. |
The growing field of legal epidemiology
Burris S , Cloud LK , Penn M . J Public Health Manag Pract 2020 26 Suppl 2 S4-s9 “Legal epidemiology” is the scientific study and deployment of law as a factor in the cause, distribution, and prevention of disease and injury in a population.1 Its emergence as a distinct field reflects the indispensability of law to modern public health practice.2–4 Proponents of the field aim to remove 2 persistent barriers to the effective use of legal action for public health: the limited extent of rigorous and timely evaluation of the impact of law and legal practices on health5,6; and the inattention in training and practice to the important legal functions played by nonlawyers in the health system.7,8 In the authors' assessment, the research necessary to identify and spread best legal practices is too often never carried out. Legal interventions affecting millions of Americans are often not evaluated for years, if at all. Innovations that show promise in research or practice are sometimes not scaled, so they either do not spread or spread too slowly. The unintended (or incidental) effects of laws on population health often remain unidentified and unexplored.2,9 Limited professional training in law, disciplinary boundaries, and, arguably, a cultural tension between law and other health disciplines continue to limit the full integration of law into public health.5–8,10 The publication of this special supplement of JPHMP is an opportune time to take stock. The purpose of this commentary is to describe the emergence of legal epidemiology, its key methods and tools, and the challenges it faces going forward. |
Translating workforce development policy interventions for community health workers: Application of a policy research continuum
Fulmer EB , Barbero C , Gilchrist S , Shantharam SS , Bhuiya AR , Taylor LN , Jones CD . J Public Health Manag Pract 2020 26 Suppl 2 S10-s18 CONTEXT: There is a need for knowledge translation to advance health equity in the prevention and control of cardiovascular disease and type 2 diabetes. One recommended strategy is engaging community health workers (CHWs) to have a central role in related interventions. Despite strong evidence of effectiveness for CHWs, there is limited information examining the impact of state CHW policy interventions. This article describes the application of a policy research continuum to enhance knowledge translation of CHW workforce development policy in the United States. METHODS: During 2016-2019, a team of public health researchers and practitioners applied the policy research continuum, a multiphased systematic assessment approach that incorporates legal epidemiology to enhance knowledge translation of CHW workforce development policy interventions in the United States. The continuum consists of 5 discrete, yet interconnected, phases including early evidence assessments, policy surveillance, implementation studies, policy ratings, and impact studies. RESULTS: Application of the first 3 phases of the continuum demonstrated (1) how CHW workforce development policy interventions are linked to strong evidence bases, (2) whether existing state CHW laws are evidence-informed, and (3) how different state approaches were implemented. DISCUSSION: As a knowledge translation tool, the continuum enhances dissemination of timely, useful information to inform decision making and supports the effective implementation and scale-up of science-based policy interventions. When fully implemented, it assists public health practitioners in examining the utility of different policy intervention approaches, the effects of adaptation, and the linkages between policy interventions and more distal public health outcomes. |
Establishing a baseline: Evidence-supported state laws to advance stroke care
Gilchrist S , Sloan AA , Bhuiya AR , Taylor LN , Shantharam SS , Barbero C , Fulmer EB . J Public Health Manag Pract 2020 26 Suppl 2 S19-s28 OBJECTIVE: Approximately 800,000 strokes occur annually in the United States. Stroke systems of care policies addressing prehospital and in-hospital care have been proposed to improve access to time-sensitive, lifesaving treatments for stroke. Policy surveillance of stroke systems of care laws supported by best available evidence could reveal potential strengths and weaknesses in how stroke care delivery is regulated across the nation. DESIGN: This study linked the results of an early evidence assessment of 15 stroke systems of care policy interventions supported by best available evidence to a legal data set of the body of law in effect on January 1, 2018, for the 50 states and Washington, District of Columbia. RESULTS: As of January 1, 2018, 39 states addressed 1 or more aspects of prehospital or in-hospital stroke care in law; 36 recognized at least 1 type of stroke center. Thirty states recognizing stroke centers also had evidence-supported prehospital policy interventions authorized in law. Four states authorized 10 or more of 15 evidence-supported policy interventions. Some combinations of prehospital and in-hospital policy interventions were more prevalent than other combinations. CONCLUSION: The analysis revealed that many states had a stroke regulatory infrastructure for in-hospital care that is supported by best available evidence. However, there are gaps in how state law integrates evidence-supported prehospital and in-hospital care that warrant further study. This study provides a baseline for ongoing policy surveillance and serves as a basis for subsequent stroke systems of care policy implementation and policy impact studies. |
Effect of state policy changes in Florida on opioid-related overdoses
Guy GPJr , Zhang K . Am J Prev Med 2020 58 (5) 703-706 INTRODUCTION: With a rapid increase in prescription opioid overdose deaths and a proliferation of pain clinics in the mid-2000s, Florida emerged as an epicenter of the opioid overdose epidemic. In response, Florida implemented pain clinic laws and operationalized its Prescription Drug Monitoring Program. This study examines the effect of these policies on rates of inpatient stays and emergency department visits for opioid-related overdoses. METHODS: Using data from the 2008-2015 State Emergency Department Databases and State Inpatient Databases, quarterly rates of inpatient stays and emergency department visits for prescription opioid-related overdoses and heroin-related overdoses were computed. A comparative interrupted time series analysis examined the effect of these policies on opioid overdose rates. North Carolina served as a control state because it did not implement similar policies during the study period. The data were analyzed in 2019. RESULTS: Compared with North Carolina, Florida's policies were associated with reductions in the rates of prescription opioid-related overdose inpatient stays and emergency department visits, a level reduction of 2.31 per 100,000 and a reduction in the trend of 0.16 per 100,000 population each quarter. The policies were associated with a reduction of 13,532 inpatient stays and emergency department visits for prescription opioid-related overdoses during the study period. No statistically significant association was found between the policies and heroin-related overdose inpatient stays and emergency department visits. CONCLUSIONS: To address the opioid overdose epidemic, states have implemented policies such as Prescription Drug Monitoring Programs and pain clinic laws designed to reduce inappropriate opioid prescribing. Such laws may be effective in reducing prescription opioid-related overdoses. |
State preemption: Impacts on advances in tobacco control
Kang JY , Kenemer B , Mahoney M , Tynan MA . J Public Health Manag Pract 2020 26 Suppl 2 S54-s61 CONTEXT: Policy is an effective tool for reducing the health harms caused by tobacco use. State laws can establish baseline public health protections. Preemptive legislation at the state level, however, can prohibit localities from enacting laws that further protect their citizens from public health threats. APPROACH: Preemptive state tobacco control laws were assessed using the Centers for Disease Control and Prevention's State Tobacco Activities Tracking and Evaluation System. Based on the assessments, the Centers for Disease Control and Prevention quantified the number of states with certain types of preemptive tobacco control laws in place. In addition, 4 different case examples were presented to highlight the experiences of 4 states with respect to preemption. DISCUSSION: Tracking and reporting on preemptive state tobacco control laws through the Centers for Disease Control and Prevention's State Tobacco Activities Tracking and Evaluation System provide an understanding of the number and scope of preemptive laws. Case examples from Hawaii, North Carolina, South Carolina, and Washington provide a detailed account of how preemption affects tobacco control governance at state and local levels within these 4 states. |
Mapping and analysis of US state and urban local sodium reduction laws
Sloan AA , Keane T , Pettie JR , Bhuiya AR , Taylor LN , Bates M , Bernard S , Akinleye F , Gilchrist S . J Public Health Manag Pract 2020 26 Suppl 2 S62-s70 CONTEXT: Excessive sodium consumption contributes to high blood pressure, which is a risk factor for cardiovascular disease. OBJECTIVES: To (1) identify state and urban local laws addressing adult or general population sodium consumption in foods and beverages and (2) align findings to a previously published evidence classification review, the Centers for Disease Control and Prevention Sodium Quality and Impact of Component (QuIC) evidence assessment. DESIGN: Systematic collection of sodium reduction laws from all 50 states, the 20 most populous counties in the United States, and the 20 most populous cities in the United States, including Washington, District of Columbia, effective on January 1, 2019. Relevant laws were assigned to 1 or more of 6 interventions: (1) provision of sodium information in restaurants or at point of purchase; (2) consumer incentives to purchase lower sodium foods; and provision of lower sodium offerings in (3) workplaces, (4) vending machines, (5) institutional meal services, and (6) grocery, corner, and convenience stores. The researchers used Westlaw, local policy databases or city Web sites, and general nutrition policy databases to identify relevant laws. RESULTS: Thirty-nine sodium reduction laws and 10 state laws preempting localities from enacting sodium reduction laws were identified. Sodium reduction laws were more common in local jurisdictions and in the Western United States. Sodium reduction laws addressing meal services (n = 17), workplaces (n = 12), labeling (n = 13), and vending machines (n = 11) were more common, while those addressing grocery stores (n = 2) or consumer incentives (n = 6) were less common. Laws with high QuIC evidence classifications were generally more common than laws with low QuIC evidence classifications. 
CONCLUSIONS: The distribution of sodium laws in the US differed by region, QuIC classification, and jurisdiction type, indicating influence from public health and nonpublic health factors. Ongoing research is warranted to determine how the strength of public health evidence evolves over time and how those changes correlate with uptake of sodium reduction law. |
Advancing legal epidemiology: An introduction
Thompson BL , Cloud LK , Gable L . J Public Health Manag Pract 2020 26 Suppl 2 S1-s3 Chronic and noncommunicable health conditions, including heart disease, stroke, diabetes, hypertension, cancer, and asthma, are leading causes of death and disability in the United States, with 2 in 5 adults afflicted with multiple conditions.1,2 Deaths from heart disease are increasing in the majority of counties.3 While deaths from stroke had been declining for decades, decreases in stroke deaths have stalled in the majority of states since 2013.4 In addition, individuals with lower socioeconomic status and those who identify as members of racial and ethnic minority groups experience higher rates of chronic diseases, less access to quality health care, and worse health outcomes.5–7 The effects of chronic and noncommunicable conditions also extend far beyond health outcomes. Direct costs of treating chronic conditions and indirect costs including loss of productivity amount to an estimated $3.7 trillion.8 |
Contraceptive methods of privately insured US women with congenital heart defects
Anderson KN , Tepper NK , Downing K , Ailes EC , Abarbanell G , Farr SL . Am Heart J 2020 222 38-45 BACKGROUND: The American Heart Association recommends women with congenital heart defects (CHD) receive contraceptive counseling early in their reproductive years, but little is known about contraceptive method use among women with CHD. We describe recent female sterilization and reversible prescription contraceptive method use by presence of CHD and CHD severity in 2014. METHODS: Using IBM MarketScan Commercial Databases, we included women aged 15 to 44 years with prescription drug coverage in 2014 who were enrolled >/=11 months annually in employer-sponsored health plans between 2011 and 2014. CHD, CHD severity, contraceptive methods, and obstetrics-gynecology and cardiology provider encounters were identified using billing codes. We used log-binomial regression to calculate adjusted prevalence ratios (aPRs) and 95% confidence intervals (CIs) to compare contraceptive method use overall and by effectiveness tier by CHD presence and, for women with CHD, severity. RESULTS: Recent sterilization or current reversible prescription contraceptive method use varied slightly among women with (39.2%) and without (37.3%) CHD, aPR=1.04, 95% CI [1.01-1.07]. Women with CHD were more likely to use any Tier I method (12.9%) than women without CHD (9.3%), aPR=1.41, 95% CI [1.33-1.50]. Women with severe, compared to non-severe, CHD were less likely to use any method, aPR=0.85, 95% CI [0.78-0.92], or Tier I method, aPR=0.84, 95% CI [0.70-0.99]. Approximately 60% of women with obstetrics-gynecology and <40% with cardiology encounters used any included method. CONCLUSIONS: There may be missed opportunities for providers to improve uptake of safe, effective contraceptive methods for women with CHD who wish to avoid pregnancy. |
Assessment of contraceptive needs and improving access in the U.S.-affiliated Pacific islands in the context of Zika
Green C , Ntansah C , Frey MT , Krashin JW , Lathrop E , Romero L . J Womens Health (Larchmt) 2020 29 (2) 139-147 Scientific evidence demonstrated a causal relationship between Zika virus infection during pregnancy and neurologic abnormalities and other congenital defects. The U.S. government's Zika Virus Disease Contingency Response Plan recognized the importance of preventing unintended pregnancy through access to high-quality family planning services as a primary strategy to reduce adverse Zika-related birth outcomes during the 2016-2017 Zika virus outbreak. The U.S.-affiliated Pacific Islands (USAPI) include three U.S. territories: American Samoa, the Commonwealth of the Northern Mariana Islands, and Guam, and three independent countries in free association with the United States: the Federated States of Micronesia, the Republic of the Marshall Islands, and the Republic of Palau. Aedes spp. mosquitoes, the primary vector that transmits Zika virus, are common across the Pacific Islands, and in 2016, laboratory-confirmed cases of Zika virus infection in USAPI were reported. CDC conducted a rapid assessment by reviewing available reproductive health data and discussing access to contraception with family planning providers and program staff in all six USAPI jurisdictions between January and May 2017. In this report, we summarize findings from the assessment; discuss strategies developed by jurisdictions to respond to identified needs; and describe a training that was convened to provide technical assistance to USAPI. Similar rapid assessments may be used to identify training and technical assistance needs in other emergency preparedness and response efforts that pose a risk to pregnant women and their infants. |
Paternal involvement and maternal perinatal behaviors: Pregnancy Risk Assessment Monitoring System, 2012-2015
Kortsmit K , Garfield C , Smith RA , Boulet S , Simon C , Pazol K , Kapaya M , Harrison L , Barfield W , Warner L . Public Health Rep 2020 135 (2) 253-261 OBJECTIVES: Paternal involvement is associated with improved infant and maternal outcomes. We compared maternal behaviors associated with infant morbidity and mortality among married women, unmarried women with an acknowledgment of paternity (AOP; a proxy for paternal involvement) signed in the hospital, and unmarried women without an AOP in a representative sample of mothers in the United States from 32 sites. METHODS: We analyzed 2012-2015 data from the Pregnancy Risk Assessment Monitoring System, which collects site-specific, population-based data on preconception, prenatal and postpartum behaviors, and experiences from women with a recent live birth. We calculated adjusted prevalence ratios (aPRs) and 95% confidence intervals (CIs) to examine associations between level of paternal involvement and maternal perinatal behaviors. RESULTS: Of 113,020 respondents (weighted N = 6,159,027), 61.5% were married, 27.4% were unmarried with an AOP, and 11.1% were unmarried without an AOP. Compared with married women and unmarried women with an AOP, unmarried women without an AOP were less likely to initiate prenatal care during the first trimester (married, aPR [95% CI], 0.94 [0.92-0.95]; unmarried with AOP, 0.97 [0.95-0.98]), ever breastfeed (married, 0.89 [0.87-0.90]; unmarried with AOP, 0.95 [0.94-0.97]), and breastfeed at least 8 weeks (married, 0.76 [0.74-0.79]; unmarried with AOP, 0.93 [0.90-0.96]) and were more likely to use alcohol during pregnancy (married, 1.20 [1.05-1.37]; unmarried with AOP, 1.21 [1.06-1.39]) and smoke during pregnancy (married, 3.18 [2.90-3.49]; unmarried with AOP, 1.23 [1.15-1.32]) and after pregnancy (married, 2.93 [2.72-3.15]; unmarried with AOP, 1.17 [1.10-1.23]). 
CONCLUSIONS: Use of information on the AOP in addition to marital status provides a better understanding of factors that affect maternal behaviors. |
Carfentanil outbreak - Florida, 2016-2017
Delcher C , Wang Y , Vega RS , Halpin J , Gladden RM , O'Donnell JK , Hvozdovich JA , Goldberger BA . MMWR Morb Mortal Wkly Rep 2020 69 (5) 125-129 Increased prevalence of illicitly manufactured fentanyl and fentanyl analogs has contributed substantially to overdose deaths in the United States (1-3). On October 26, 2015, CDC issued a Health Advisory regarding rapid increases in deaths involving fentanyl. This CDC Health Advisory has been updated twice to address increases in fentanyl and fentanyl analog overdoses and their co-occurrence with nonopioids (4). Deaths involving carfentanil, an analog reportedly 10,000 times more potent than morphine and 100 times more potent than fentanyl, were first reported in Florida, Michigan, and Ohio in 2016 and described in an August 2016 CDC Health Advisory (1,5). Carfentanil is used to rapidly immobilize large animals in veterinary medicine and has no U.S. approved therapeutic use in humans. Carfentanil's street price per dose is likely lower than that of heroin. During 2016 and 2017, an outbreak of carfentanil-involved fatal overdoses in Florida emerged, and the Medical Examiner jurisdiction serving Sarasota, Manatee, and DeSoto counties (the Sarasota area) was the outbreak epicenter. This report describes toxicology profiles, sociodemographic information, and geographic distributions of carfentanil-involved fatal overdoses (carfentanil deaths) in the Sarasota area compared with those in the rest of Florida (i.e., all Florida counties excluding Sarasota area) from January 2016 to December 2017. The Sarasota area accounted for 19.0% of 1,181 statewide carfentanil deaths that occurred during this time and experienced a peak in carfentanil deaths preceding the larger Florida outbreak. The report of a single carfentanil death from August to December 2017 (compared with 73 reported deaths during the same period in 2016) appeared to mark the end of the outbreak in the area. 
The threat of such rapid, intense fatal overdose outbreaks highlights the need for accelerated reporting, reliable data sharing systems, and novel proactive surveillance to support targeted prevention and response efforts by public health and safety organizations (6). |
Adults' attitudes toward raising the minimum age of sale for tobacco products to 21 years, United States, 2014-2017
Gentzke AS , Glover-Kudon R , Tynan M , Jamal A . Prev Med 2020 133 106012 Raising the minimum age of sale for tobacco products to 21 years (Tobacco 21) could help prevent and delay tobacco product initiation among youth. This study examined changes in U.S. adults' attitudes toward Tobacco 21 policies during 2014-2017. Data came from the 2014-2017 annual Summer Styles surveys, an Internet-based, cross-sectional survey of U.S. adults aged >/=18 years, drawn from GfK's KnowledgePanel(R). Sample sizes ranged from 4107 in 2017 to 4269 in 2014. Each year, respondents were asked if they "strongly favor," "somewhat favor," "somewhat oppose," or "strongly oppose" Tobacco 21 policies. Weighted prevalence estimates of favorability (strongly or somewhat favor) were assessed each year; differences in favorability between years were assessed by chi-square tests. Adjusted odds ratios (aOR) of favorability with 95% confidence intervals (CI) were calculated using logistic regression for the year 2017. Tobacco 21 policy favorability was reported by 75.0% in 2014; 72.3% in 2015; 78.4% in 2016; and 75.2% in 2017; the difference in favorability between 2014 and 2017 was not statistically significant. In 2017, lower odds of favorability toward Tobacco 21 policies were observed for current (aOR=0.49, CI=0.37-0.64) and former (aOR=0.54, CI=0.44-0.66) cigarette smokers, and current other tobacco product users (aOR=0.54, CI=0.49-0.64) than respective nonusers. Among U.S. adults, Tobacco 21 favorability has remained high since 2014, coinciding with a period of rapid state and local-level policy adoption. These results could be helpful for states and localities as they work to understand the feasibility of Tobacco 21 policies in their jurisdiction. |
Utility of using cancer registry data to identify patients for tobacco treatment trials
Krebs P , Rogers E , Greenspan A , Goldfeld K , Lei L , Ostroff JS , Garrett BE , Momin B , Henley SJ . J Registry Manag 2019 46 (2) 30-36 Background: Many tobacco-dependent cancer survivors continue to smoke after diagnosis and treatment. This study investigated the extent to which hospital-based cancer registries could be used to identify smokers in order to offer them assistance in quitting. The concordance of tobacco use coded in the registry was compared with tobacco use as coded in the accompanying Electronic Health Records (EHRs). Methods: We gathered data from three hospital-based cancer registries in New York City during June 2014 to December 2016. For each patient identified as a current combustible tobacco user in the cancer registries, we abstracted tobacco use data from their EHR to independently code and corroborate smoking status. We calculated the proportion of current smokers, former smokers, and never smokers as indicated in the EHR for the hospitals, cancer site, cancer stage, and sex. We used a logistic regression model to estimate the log odds of the registry-based smoking status correctly predicting the EHR-based smoking status. Results: Agreement in current smoking status between the registry-based smoking status and the EHR-based smoking status was 65%, 71%, and 90% at the three participating hospitals. Logistic regression results indicated that agreement in smoking status between the registry and the EHRs varied by hospital, cancer type, and stage, but not by age and sex. Conclusions: The utility of using tobacco use data in cancer registries for population-based tobacco treatment interventions is dependent on multiple factors including accurate entry into EHR systems, updated data, and consistent smoking status definitions and registry coding protocols. Our study found that accuracy varied across the three hospitals and may not be able to inform interventions at these hospitals at this time. 
Several changes may be needed to improve the coding of tobacco use status in EHRs and registries.
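The registry-vs-EHR agreement reported above is a simple proportion of patients whose coded smoking status matches across the two sources. A minimal sketch, using hypothetical records rather than the study data (the function name and example values are illustrative assumptions):

```python
# Hedged sketch: percent agreement between registry-coded and EHR-coded
# smoking status. The records below are hypothetical, not study data.

def percent_agreement(registry_status, ehr_status):
    """Proportion of patients whose registry status matches the EHR status."""
    if len(registry_status) != len(ehr_status):
        raise ValueError("status lists must be the same length")
    matches = sum(r == e for r, e in zip(registry_status, ehr_status))
    return matches / len(registry_status)

# Illustrative only: every patient was registry-coded as a current smoker,
# and the EHR is abstracted to corroborate that status.
registry = ["current"] * 10
ehr = ["current"] * 7 + ["former", "former", "never"]
print(round(percent_agreement(registry, ehr), 2))  # 0.7
```

The study's per-hospital figures (65%, 71%, 90%) are this same proportion computed within each hospital's patient set.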
ICD-10-CM-based definitions for emergency department opioid poisoning surveillance: Electronic health record case confirmation study
Slavova S , Quesinberry D , Costich JF , Pasalic E , Martinez P , Martin J , Eustice S , Akpunonu P , Bunn TL . Public Health Rep 2020 135 (2) 33354920904087 OBJECTIVES: Valid opioid poisoning morbidity definitions are essential to the accuracy of national surveillance. The goal of our study was to estimate the positive predictive value (PPV) of case definitions identifying emergency department (ED) visits for heroin or other opioid poisonings, using billing records with International Classification of Diseases, Tenth Revision, Clinical Modification (ICD-10-CM) codes. METHODS: We examined billing records for ED visits from 4 health care networks (12 EDs) from October 2015 through December 2016. We conducted medical record reviews of representative samples to estimate the PPVs and 95% confidence intervals (CIs) of (1) first-listed heroin poisoning diagnoses (n = 398), (2) secondary heroin poisoning diagnoses (n = 102), (3) first-listed other opioid poisoning diagnoses (n = 452), and (4) secondary other opioid poisoning diagnoses (n = 103). RESULTS: First-listed heroin poisoning diagnoses had an estimated PPV of 93.2% (95% CI, 90.0%-96.3%), higher than secondary heroin poisoning diagnoses (76.5%; 95% CI, 68.1%-84.8%). Among other opioid poisoning diagnoses, the estimated PPV was 79.4% (95% CI, 75.7%-83.1%) for first-listed diagnoses and 67.0% (95% CI, 57.8%-76.2%) for secondary diagnoses. Naloxone was administered in 867 of 1055 (82.2%) cases; 254 patients received multiple doses. One-third of all patients had a previous drug poisoning. Drug testing was ordered in only 354 cases. CONCLUSIONS: The study findings suggest that heroin or other opioid poisoning surveillance definitions that include multiple diagnoses (first-listed and secondary) would identify a high percentage of true-positive cases.
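The PPVs above are proportions of chart-confirmed cases among reviewed billing diagnoses. A minimal sketch using a normal-approximation (Wald) interval; the count of 371 confirmed cases out of 398 reviewed is back-calculated from the reported 93.2% and is an assumption, and the study may have used a different interval method or weighting, so only the point estimate should be expected to match:

```python
import math

def ppv_with_ci(true_positives, n_reviewed, z=1.96):
    """PPV = TP / n with a normal-approximation (Wald) 95% CI, clipped to [0, 1]."""
    p = true_positives / n_reviewed
    half_width = z * math.sqrt(p * (1 - p) / n_reviewed)
    return p, max(0.0, p - half_width), min(1.0, p + half_width)

# 398 first-listed heroin poisoning diagnoses were reviewed; the reported
# 93.2% PPV implies roughly 371 chart-confirmed cases (371/398 ~ 0.932).
p, lo, hi = ppv_with_ci(371, 398)
print(f"PPV {p:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```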
Absence of Evidence of Zika Virus Infection in Cord Blood and Urine from Newborns with Congenital Abnormalities, Indonesia.
Putri ND , Dhenni R , Handryastuti S , Johar E , Ma'roef CN , Fadhilah A , Perma Iskandar AT , Prayitno A , Karyanti MR , Satari HI , Jumiyanti N , Aprilia YY , Sriyani IY , Dewi YP , Yudhaputri FA , Safari D , Hadinegoro SR , Rosenberg R , Powers AM , Aye Myint KS . Am J Trop Med Hyg 2020 102 (4) 876-879 Zika virus (ZIKV) has recently been confirmed as endemic in Indonesia, but no congenital anomalies (CA) related to ZIKV infection have been reported. We performed molecular and serological testing for ZIKV and other flaviviruses on cord serum and urine samples collected from October 2016 to April 2017 during a prospective, cross-sectional study of neonates in Jakarta, Indonesia. Of a total of 429 neonates, 53 had CA, including 14 with microcephaly. These 53, and 113 neonate controls without evidence of CA, were tested by ZIKV-specific real-time reverse transcription polymerase chain reaction (RT-PCR), pan-flavivirus RT-PCR, anti-ZIKV and anti-DENV IgM ELISA, and plaque reduction neutralization test. There was no evidence of ZIKV infection among neonates in either the CA or non-CA cohorts, except in three cases with low titers of anti-ZIKV neutralizing antibodies. Further routine evaluation throughout Indonesia of pregnant women and their newborns for exposure to ZIKV should be a high priority for determining risk.
Initial Public Health Response and Interim Clinical Guidance for the 2019 Novel Coronavirus Outbreak - United States, December 31, 2019-February 4, 2020.
Patel A , Jernigan DB . MMWR Morb Mortal Wkly Rep 2020 69 (5) 140-146 On December 31, 2019, Chinese health officials reported a cluster of cases of acute respiratory illness in persons associated with the Hunan seafood and animal market in the city of Wuhan, Hubei Province, in central China. On January 7, 2020, Chinese health officials confirmed that a novel coronavirus (2019-nCoV) was associated with this initial cluster (1). As of February 4, 2020, a total of 20,471 confirmed cases, including 2,788 (13.6%) with severe illness,* and 425 deaths (2.1%) had been reported by the National Health Commission of China (2). Cases have also been reported in 26 locations outside of mainland China, including documentation of some person-to-person transmission and one death (2). As of February 4, 11 cases had been reported in the United States. On January 30, the World Health Organization (WHO) Director-General declared that the 2019-nCoV outbreak constitutes a Public Health Emergency of International Concern.† On January 31, the U.S. Department of Health and Human Services (HHS) Secretary declared a U.S. public health emergency to respond to 2019-nCoV.§ Also on January 31, the president of the United States signed a "Proclamation on Suspension of Entry as Immigrants and Nonimmigrants of Persons who Pose a Risk of Transmitting 2019 Novel Coronavirus," which limits entry into the United States of persons who traveled to mainland China to U.S. citizens and lawful permanent residents and their families (3). CDC, multiple other federal agencies, state and local health departments, and other partners are implementing aggressive measures to slow transmission of 2019-nCoV in the United States (4,5). These measures require the identification of cases and their contacts in the United States and the appropriate assessment and care of travelers arriving from mainland China to the United States.
These measures are being implemented in anticipation of additional 2019-nCoV cases in the United States. Although these measures might not prevent the eventual establishment of ongoing, widespread transmission of the virus in the United States, they are being implemented to 1) slow the spread of illness; 2) provide time to better prepare health care systems and the general public to be ready if widespread transmission with substantial associated illness occurs; and 3) better characterize 2019-nCoV infection to guide public health recommendations and the development of medical countermeasures including diagnostics, therapeutics, and vaccines. Public health authorities are monitoring the situation closely. As more is learned about this novel virus and this outbreak, CDC will rapidly incorporate new knowledge into guidance for action by CDC and state and local health departments.
Persons Evaluated for 2019 Novel Coronavirus - United States, January 2020.
Bajema KL , Oster AM , McGovern OL , Lindstrom S , Stenger MR , Anderson TC , Isenhour C , Clarke KR , Evans ME , Chu VT , Biggs HM , Kirking HL , Gerber SI , Hall AJ , Fry AM , Oliver SE . MMWR Morb Mortal Wkly Rep 2020 69 (6) 166-170 In December 2019, a cluster of cases of pneumonia emerged in Wuhan City in central China's Hubei Province. Genetic sequencing of isolates obtained from patients with pneumonia identified a novel coronavirus (2019-nCoV) as the etiology (1). As of February 4, 2020, approximately 20,000 confirmed cases had been identified in China and an additional 159 confirmed cases in 23 other countries, including 11 in the United States (2,3). On January 17, CDC and the U.S. Department of Homeland Security's Customs and Border Protection began health screenings at U.S. airports to identify ill travelers returning from Wuhan City (4). CDC activated its Emergency Operations Center on January 21 and formalized a process for inquiries regarding persons suspected of having 2019-nCoV infection (2). As of January 31, 2020, CDC had responded to clinical inquiries from public health officials and health care providers to assist in evaluating approximately 650 persons thought to be at risk for 2019-nCoV infection. Guided by CDC criteria for the evaluation of persons under investigation (PUIs) (5), 210 symptomatic persons were tested for 2019-nCoV; among these persons, 148 (70%) had travel-related risk only, 42 (20%) had close contact with an ill laboratory-confirmed 2019-nCoV patient or PUI, and 18 (9%) had both travel- and contact-related risks. Eleven of these persons had laboratory-confirmed 2019-nCoV infection. Recognizing persons at risk for 2019-nCoV is critical to identifying cases and preventing further transmission. Health care providers should remain vigilant and adhere to recommended infection prevention and control practices when evaluating patients for possible 2019-nCoV infection (6). 
Providers should consult with their local and state health departments when assessing not only ill travelers from 2019-nCoV-affected countries but also ill persons who have been in close contact with patients with laboratory-confirmed 2019-nCoV infection in the United States.
First Case of 2019 Novel Coronavirus in the United States.
Holshue ML , DeBolt C , Lindquist S , Lofy KH , Wiesman J , Bruce H , Spitters C , Ericson K , Wilkerson S , Tural A , Diaz G , Cohn A , Fox L , Patel A , Gerber SI , Kim L , Tong S , Lu X , Lindstrom S , Pallansch MA , Weldon WC , Biggs HM , Uyeki TM , Pillai SK . N Engl J Med 2020 382 (10) 929-936 An outbreak of novel coronavirus (2019-nCoV) that began in Wuhan, China, has spread rapidly, with cases now confirmed in multiple countries. We report the first case of 2019-nCoV infection confirmed in the United States and describe the identification, diagnosis, clinical course, and management of the case, including the patient's initial mild symptoms at presentation with progression to pneumonia on day 9 of illness. This case highlights the importance of close coordination between clinicians and public health authorities at the local, state, and federal levels, as well as the need for rapid dissemination of clinical information related to the care of patients with this emerging infection.
Host metabolic response in early Lyme disease
Fitzgerald BL , Molins CR , Islam MN , Graham B , Hove PR , Wormser GP , Hu L , Ashton LV , Belisle JT . J Proteome Res 2020 19 (2) 610-623 Lyme disease is a tick-borne bacterial illness that occurs in areas of North America, Europe, and Asia. Early infection typically presents as generalized symptoms with an erythema migrans (EM) skin lesion. Dissemination of the pathogen Borrelia burgdorferi can result in multiple EM skin lesions or in extracutaneous manifestations such as Lyme neuroborreliosis. Metabolic biosignatures of patients with early Lyme disease can potentially provide diagnostic targets as well as highlight metabolic pathways that contribute to pathogenesis. Sera from well-characterized patients diagnosed with either early localized Lyme disease (ELL) or early disseminated Lyme disease (EDL), plus healthy controls (HC), from the United States were analyzed by liquid chromatography-mass spectrometry (LC-MS). Comparative analyses were performed between ELL, or EDL, or ELL combined with EDL, and the HC to develop biosignatures present in early Lyme disease. A direct comparison between ELL and EDL was also performed to develop a biosignature for stages of early Lyme disease. Metabolic pathway analysis and chemical identification of metabolites with LC-tandem mass spectrometry (LC-MS/MS) demonstrated alterations of eicosanoid, bile acid, sphingolipid, glycerophospholipid, and acylcarnitine metabolic pathways during early Lyme disease. These metabolic alterations were confirmed using a separate set of serum samples for validation. The findings demonstrated that infection of humans with B. burgdorferi alters defined metabolic pathways that are associated with inflammatory responses, liver function, lipid metabolism, and mitochondrial function. Additionally, the data provide evidence that metabolic pathways can be used to mark the progression of early Lyme disease.
Middle East respiratory syndrome coronavirus transmission
Killerby ME , Biggs HM , Midgley CM , Gerber SI , Watson JT . Emerg Infect Dis 2020 26 (2) 191-198 Middle East respiratory syndrome coronavirus (MERS-CoV) infection causes a spectrum of respiratory illness, from asymptomatic to mild to fatal. MERS-CoV is transmitted sporadically from dromedary camels to humans and occasionally through human-to-human contact. Current epidemiologic evidence supports a major role in transmission for direct contact with live camels or humans with symptomatic MERS, but little evidence suggests the possibility of transmission from camel products or asymptomatic MERS cases. Because a proportion of case-patients do not report direct contact with camels or with persons who have symptomatic MERS, further research is needed to conclusively determine additional mechanisms of transmission, to inform public health practice, and to refine current precautionary recommendations.
Zoonotic disease awareness survey of backyard poultry and swine owners in southcentral Pennsylvania
Nicholson CW , Campagnolo ER , Boktor SW , Butler CL . Zoonoses Public Health 2020 67 (3) 280-290 Owners of small backyard poultry and swine operations may be at higher risk of zoonotic diseases due to husbandry inexperience and/or a lack of knowledge. Backyard poultry and swine owners in southcentral Pennsylvania were surveyed regarding their knowledge and attitudes towards zoonotic disease prevention. One hundred and six backyard poultry and/or swine owners completed the survey (74 poultry, 15 swine, 17 both), which included questions on demographics, flock/herd characteristics, recognition of selected zoonotic diseases and clinical signs in animals, and biosecurity practices for visitors and owners. Most responded that they were aware of avian (92.2%) and swine (84.4%) influenza, and were less aware of other zoonotic diseases such as salmonellosis and brucellosis. The majority of backyard poultry and swine owners combined (62.9%) reported allowing visitors freely around their animals and did not require any special precautions. Backyard poultry and swine owners most commonly reported rarely (32.7%) or never (28.9%) wearing work gloves and never (57.1%) wearing nose/mouth coverings, such as a respirator mask, while handling animals or manure. The study findings indicated that veterinarians (61.5%) and the Internet (50.0%) are the main sources where small-scale farm producers seek animal disease information. Approximately one-third (34.9%) of the respondents reported receiving seasonal influenza vaccine. The findings of this study will be utilized to provide targeted veterinary and public health education for the prevention of zoonotic diseases in backyard farm animal settings in Pennsylvania.
Human rabies - Utah, 2018
Peterson D , Barbeau B , McCaffrey K , Gruninger R , Eason J , Burnett C , Dunn A , Dimond M , Harbour J , Rossi A , Lopansri B , Dascomb K , Scribellito T , Moosman T , Saw L , Jones C , Belenky M , Marsden L , Niezgoda M , Gigante CM , Condori RE , Ellison JA , Orciari LA , Yager P , Bonwitt J , Whitehouse ER , Wallace RM . MMWR Morb Mortal Wkly Rep 2020 69 (5) 121-124 On November 3, 2018, the Utah Department of Health (UDOH) was notified of a suspected human rabies case in a man aged 55 years. The patient's symptoms had begun 18 days earlier, and he was hospitalized for 15 days before rabies was suspected. As his symptoms worsened, he received supportive care, but he died on November 4. On November 7, a diagnosis of rabies was confirmed by CDC. This was the first documented rabies death in a Utah resident since 1944. This report summarizes the patient's clinical course and the subsequent public health investigation, which determined that the patient had handled several bats in the weeks preceding symptom onset. Public health agencies, in partnership with affected health care facilities, identified and assessed the risk to potentially exposed persons, facilitated receipt of postexposure prophylaxis (PEP), and provided education to health care providers and the community about the risk for rabies associated with bats. Human rabies is rare and almost always fatal. The findings from this investigation highlight the importance of early recognition of rabies, improved public awareness of rabies in bats, and the use of innovative tools after mass rabies exposure events to ensure rapid and recommended risk assessment and provision of PEP. |
Influenza A virus field surveillance at a swine-human interface
Rambo-Martin BL , Keller MW , Wilson MM , Nolting JM , Anderson TK , Vincent AL , Bagal UR , Jang Y , Neuhaus EB , Davis CT , Bowman AS , Wentworth DE , Barnes JR . mSphere 2020 5 (1) While working overnight at a swine exhibition, we identified an influenza A virus (IAV) outbreak in swine, Nanopore sequenced 13 IAV genomes from samples we collected, and predicted in real time that these viruses posed a novel risk to humans due to genetic mismatches between the viruses and current prepandemic candidate vaccine viruses (CVVs). We developed and used a portable IAV sequencing and analysis platform called Mia (Mobile Influenza Analysis) to complete and characterize full-length consensus genomes approximately 18 h after unpacking the mobile lab. Exhibition swine are a known source for zoonotic transmission of IAV to humans and pose a potential pandemic risk. Genomic analyses of IAV in swine are critical to understanding this risk, the types of viruses circulating in swine, and whether current vaccines developed for use in humans would be predicted to provide immune protection. Nanopore sequencing technology has enabled genome sequencing in the field at the source of viral outbreaks or at the bedside or pen-side of infected humans and animals. The acquired data, however, have not yet demonstrated real-time, actionable public health responses. The Mia system rapidly identified three genetically distinct swine IAV lineages from three subtypes, A(H1N1), A(H3N2), and A(H1N2). Analysis of the hemagglutinin (HA) sequences of the A(H1N2) viruses identified >30 amino acid differences between the HA1 of these viruses and the most closely related CVV. As an exercise in pandemic preparedness, all sequences were emailed to CDC collaborators who initiated the development of a synthetically derived CVV.IMPORTANCE Swine are influenza virus reservoirs that have caused outbreaks and pandemics. 
Genomic characterization of these viruses enables pandemic risk assessment and vaccine comparisons, though this typically occurs after a novel swine virus jumps into humans. The greatest risk occurs where large groups of swine and humans commingle. At a large swine exhibition, we used Nanopore sequencing and on-site analytics to interpret 13 swine influenza virus genomes and identified an influenza virus cluster that was genetically divergent from currently available vaccines. As part of the National Strategy for Pandemic Preparedness exercises, the sequences were emailed to colleagues at the CDC who initiated the development of a synthetically derived vaccine designed to match the viruses at the exhibition. Subsequently, this virus caused 14 infections in humans and was the dominant U.S. variant virus in 2018.
Travel-associated and locally acquired dengue cases - United States, 2010-2017
Rivera A , Adams LE , Sharp TM , Lehman JA , Waterman SH , Paz-Bailey G . MMWR Morb Mortal Wkly Rep 2020 69 (6) 149-154 Dengue is a potentially fatal acute febrile illness caused by any of four mosquito-transmitted dengue viruses (DENV-1 to DENV-4) belonging to the family Flaviviridae and endemic throughout the tropics. Competent mosquito vectors of DENV are present in approximately one half of all U.S. counties. To describe epidemiologic trends in travel-associated and locally acquired dengue cases in the United States, CDC analyzed cases reported from the 50 states and District of Columbia to the national arboviral surveillance system (ArboNET). Cases are confirmed by detection of 1) virus RNA by reverse transcription-polymerase chain reaction (RT-PCR) in any body fluid or tissue, 2) DENV antigen in tissue by a validated assay, 3) DENV nonstructural protein 1 (NS1) antigen, or 4) immunoglobulin M (IgM) anti-DENV antibody if the patient did not report travel to an area with other circulating flaviviruses. When travel to an area with other flaviviruses was reported, IgM-positive cases were defined as probable. During 2010-2017, totals of 5,009 (93%) travel-associated and 378 (7%) locally acquired confirmed or probable dengue cases were reported to ArboNET. Cases were equally distributed between males and females, and median age was 41 years. Eighteen (three per 1,000) fatal cases were reported, all among travelers. Travelers should review country-specific recommendations (https://wwwnc.cdc.gov/travel/notices/watch/dengue-asia) for reducing their risk for DENV infection, including using insect repellent and staying in residences with air conditioning or screens on windows and doors.
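The reported fatality figure of "three per 1,000" is straightforward arithmetic over all confirmed or probable cases (5,009 travel-associated plus 378 locally acquired). A minimal sketch (the function name is an illustrative assumption):

```python
def rate_per_1000(events, population):
    """Crude rate of events per 1,000 members of a population."""
    return 1000 * events / population

# 18 fatal cases among 5,009 + 378 = 5,387 dengue cases reported
# to ArboNET during 2010-2017.
total_cases = 5009 + 378
print(round(rate_per_1000(18, total_cases), 1))  # 3.3
```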
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Disease Reservoirs and Vectors
- Drug Safety
- Environmental Health
- Food Safety
- Health Economics
- Healthcare Associated Infections
- Immunity and Immunization
- Informatics
- Injury and Violence
- Laboratory Sciences
- Mining
- Nutritional Sciences
- Occupational Safety and Health
- Occupational Safety and Health - Mining
- Parasitic Diseases
- Public Health Law
- Reproductive Health
- Substance Use and Abuse
- Zoonotic and Vectorborne Diseases
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed:Feb 1, 2024
- Page last updated:Apr 22, 2024
- Powered by CDC PHGKB Infrastructure