The role of stress in breast cancer incidence: risk factors, interventions, and directions for the future
Bowen DJ , Poole SF , White M , Lyn R , Flores DA , Haile HG , Williams DR . Int J Environ Res Public Health 2021 18 (4) 1-15 Stress is commonly cited by breast cancer patients and the public as an explanation for variation in breast cancer incidence. Epidemiological studies interrogating the relationship between stress and cancer have reported mixed results. The importance of the topic and the lack of consensus sparked this review of the literature to investigate gaps in knowledge and identify areas for further research. We first present a brief summary of the biopsychosocial model generally used to conduct research on stress. We then divide the overview of the literature into areas of research focus: the role of distressing life events in breast cancer incidence, the role of adverse childhood events in later breast cancer incidence, the importance of race and socioeconomic status (SES) as social determinants of breast cancer incidence, and the specific role of chronic stress in relation to breast cancer. For each topic, we discuss the potential of stress as a risk factor and possible intervention strategies that could reduce the effects of stress. We then identify further research questions to be probed to fill the gaps in knowledge. We conclude with a discussion of future directions for stress research as it relates to breast cancer incidence. |
Associations of progression to diabetes and regression to normal glucose tolerance with development of cardiovascular and microvascular disease among people with impaired glucose tolerance: a secondary analysis of the 30-year Da Qing Diabetes Prevention Outcome Study
Chen Y , Zhang P , Wang J , Gong Q , An Y , Qian X , Zhang B , Li H , Gregg EW , Bennett PH , Li G . Diabetologia 2021 64 (6) 1279-1287 AIMS/HYPOTHESIS: We aimed to determine associations of regression to normal glucose tolerance (NGT), maintaining impaired glucose tolerance (IGT) or progression to diabetes with subsequent risks of CVD and microvascular disease among Chinese adults with IGT. METHODS: We conducted an observational study among 540 participants in the Da Qing Diabetes Prevention Study, a 6 year lifestyle intervention trial in people with IGT, defined by 1985 WHO criteria as fasting plasma glucose <7.8 mmol/l and 2 h post-load plasma glucose ≥7.8 and <11.1 mmol/l. At the end of the trial, the groups that had regressed to NGT, remained with IGT or progressed to diabetes were identified. Participants were then followed for 24 years after completion of the trial, during which we compared the incidence and hazard ratios for CVD and microvascular disease in each group and estimated the differences in their median time to onset from parametric Weibull distribution models. RESULTS: At the end of the 6 year trial, 252 (46.7%) participants had developed diabetes, 114 (21.1%) had remained with IGT and 174 (32.2%) had regressed to NGT. Compared with those who developed diabetes during the trial, the median time to onset of diabetes was delayed by 14.86 years (95% CI 12.49, 17.25) in the NGT and 9.87 years (95% CI 8.12, 11.68) in the IGT groups. After completion of the trial, among those with diabetes, IGT and NGT, the 24 year cumulative incidence of CVD was 64.5%, 48.5% and 45.1%, respectively, and 36.8%, 21.7% and 16.5% for microvascular diseases. 
Compared with participants who had progressed to diabetes during the trial, those who regressed to NGT had a 37% (HR 0.63; 95% CI 0.47, 0.85) reduction in CVD incidence and a median delay of 7.45 years (95% CI 1.91, 12.99) in onset, and those who remained with IGT had a 34% (HR 0.66; 95% CI 0.47, 0.91) lower CVD incidence with a median delay in onset of 5.69 years (95% CI 1.0, 10.38). Participants with NGT had a 66% (HR 0.34; 95% CI 0.20, 0.56) lower incidence of microvascular diseases and a median delay in the onset of 18.66 years (95% CI 6.08, 31.24), and those remaining with IGT had a 52% (HR 0.48; 95% CI 0.29, 0.81) lower incidence with a median delay of 12.56 years (95% CI 2.49, 22.63). CONCLUSIONS/INTERPRETATION: People with IGT who reverted to NGT or remained with IGT at the end of the 6 year trial subsequently had significantly lower incidences of CVD and microvascular disease than those who had developed diabetes. |
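The delays in median time to onset reported above come from parametric Weibull models. A Weibull time-to-event distribution has a closed-form median, scale × (ln 2)^(1/shape), so the underlying arithmetic is easy to sketch; the scale and shape values below are invented for illustration and are not the study's fitted parameters.

```python
from math import log

def weibull_median(scale, shape):
    """Median of a Weibull(scale, shape) time-to-event distribution."""
    return scale * log(2) ** (1 / shape)

# Invented parameters for two hypothetical groups; the delay in median
# onset is the difference of the model-implied medians.
delay = weibull_median(30.0, 1.5) - weibull_median(20.0, 1.5)
```

The study's reported delays (e.g., 14.86 years for NGT vs diabetes) would be obtained the same way from the fitted group-specific Weibull parameters.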
Public health aspects of periodontitis: Recent advances and contributions by Dr. Robert J. Genco
Eke PI , Borgnakke WS , Thornton-Evans G . Curr Oral Health Rep 2021 8 (1) 1-8 Purpose of Review: This review provides an overview of the objectives, activities, and accomplishments of the CDC-AAP collaboration on public health aspects of periodontitis, focusing mostly on surveillance. Dr. Robert Genco was co-chair of this effort. Recent Findings: This initiative developed new standard periodontitis case definitions for surveillance and implemented for the first time a full-mouth periodontal examination protocol for NHANES 2009–2014. Measurements from this survey resulted in a significantly greater estimate of the national prevalence of periodontitis in US adults and improved our understanding of the associations of population-level risk factors with periodontitis. Notably, this initiative also developed, and validated by field-testing, a battery of eight questions for multivariable modeling of self-report measures for predicting periodontitis in populations. Summary: This initiative resulted in significant improvements in the surveillance of periodontitis and produced unique findings with important implications for advancing our understanding of population aspects of periodontitis in US adults at the national, state, and local levels. For the first time, the world had a set of periodontitis case definitions that, applied globally, would enable valid comparisons between populations in different geographic settings and at different times. |
Stomach cancer incidence and mortality trends among circumpolar nations
Simkin J , Nash SH , Barchuk A , O'Brien DK , Erickson AC , Hanley B , Hannah H , Corriveau A , Larsen IK , Skovlund CW , Larønningen S , Dummer TJ , Bruce MG , Ogilvie G . Cancer Epidemiol Biomarkers Prev 2021 30 (5) 845-856 BACKGROUND: Stomach cancer incidence and mortality rates are declining across circumpolar nations, but the burden may not be distributed equally across sub-populations, including Indigenous peoples. Our objective was to examine stomach cancer incidence and mortality trends across circumpolar populations. METHODS: Cancer incidence and mortality data from 1999-2016 were obtained from the Canadian Cancer Registry, Canadian Vital Statistics, CDC WONDER, NORDCAN, Northwestern Russian cancer registries and National Cancer Reports. The direct method was used to calculate ten-year rolling age-standardized incidence and mortality rates to the World (WHO 2000-2025) and 2011 Canadian standard populations. Standardized incidence rate ratios (SRRs) were calculated. Data were stratified by sex, year and region. US data were broken down by race (White; American Indian/Alaska Native (AIAN)). Race data were not available from non-US cancer registries. RESULTS: Most populations showed declining incidence and mortality rates over time. Incidence rates among Greenland males and females, Alaska AIAN males and females, and both sexes in Northern Canada were elevated compared with regional counterparts and remained stable. The largest male SRR was observed among Alaska AIAN versus Alaska Whites (SRR=3.82, 95% CI=2.71-5.37). The largest female SRR was observed among Alaska AIAN versus Alaska Whites (SRR=4.10, 95% CI=2.62-6.43). CONCLUSIONS: Despite stomach cancer incidence and mortality rates declining overall, some northern and Indigenous populations experience elevated and stable incidence and mortality rates. IMPACT: There is a need to address disparities observed among circumpolar sub-populations. 
Given similarities in incidence, mortality and risk factor prevalence across circumpolar regions, addressing disparities could benefit from coordinated international action. |
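The "direct method" of age standardization and the standardized rate ratios (SRRs) described in this abstract are weighted-average computations; the sketch below illustrates them with invented age bands, case counts, person-years, and standard-population weights (none of these numbers come from the study).

```python
def age_standardized_rate(cases, person_years, std_weights):
    """Directly age-standardized rate per 100,000 person-years:
    age-specific rates weighted by the standard population's age mix."""
    assert len(cases) == len(person_years) == len(std_weights)
    total_w = sum(std_weights)
    return sum(w / total_w * (c / py) * 100_000
               for c, py, w in zip(cases, person_years, std_weights))

# Invented age-specific data for two hypothetical populations (three age bands).
std = [40_000, 35_000, 25_000]                       # standard population weights
pop_a = ([12, 30, 55], [90_000, 70_000, 40_000])     # cases, person-years
pop_b = ([10, 18, 20], [200_000, 150_000, 90_000])

asr_a = age_standardized_rate(*pop_a, std)
asr_b = age_standardized_rate(*pop_b, std)
srr = asr_a / asr_b   # standardized incidence rate ratio, population A vs B
```

Standardizing both populations to the same age distribution is what makes the SRR comparable across regions with different age structures, the core of the comparisons reported above.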
Clusters of SARS-CoV-2 Infection Among Elementary School Educators and Students in One School District - Georgia, December 2020-January 2021.
Gold JAW , Gettings JR , Kimball A , Franklin R , Rivera G , Morris E , Scott C , Marcet PL , Hast M , Swanson M , McCloud J , Mehari L , Thomas ES , Kirking HL , Tate JE , Memark J , Drenzek C , Vallabhaneni S . MMWR Morb Mortal Wkly Rep 2021 70 (8) 289-292 In-person learning benefits children and communities (1). Understanding the context in which transmission of SARS-CoV-2, the virus that causes coronavirus disease 2019 (COVID-19), occurs in schools is critical to improving the safety of in-person learning. During December 1, 2020-January 22, 2021, Cobb and Douglas Public Health (CDPH), the Georgia Department of Public Health (GDPH), and CDC investigated SARS-CoV-2 transmission in eight public elementary schools in a single school district. COVID-19 cases* among educators and students were either self-reported or identified by local public health officials. Close contacts (contacts)(†) of persons with a COVID-19 case received testing. Among contacts who received positive test results, public health investigators assessed epidemiologic links, probable transmission directionality, and the likelihood of in-school transmission.(§) Nine clusters of three or more epidemiologically linked COVID-19 cases were identified involving 13 educators and 32 students at six of the eight elementary schools. Two clusters involved probable educator-to-educator transmission that was followed by educator-to-student transmission and resulted in approximately one half (15 of 31) of school-associated cases. Sixty-nine household members of persons with school-associated cases were tested, and 18 (26%) received positive results. All nine transmission clusters involved less than ideal physical distancing, and five involved inadequate mask use by students. Educators were central to in-school transmission networks. 
Multifaceted mitigation measures in schools, including promotion of COVID-19 precautions outside of school, minimizing in-person adult interactions at school, and ensuring universal and correct mask use and physical distancing among educators and students when in-person interaction is unavoidable, are important in preventing in-school transmission of SARS-CoV-2. Although not required for reopening schools, COVID-19 vaccination should be considered as an additional mitigation measure to be added when available. |
Impact of coronavirus disease 2019 (COVID-19) on US Hospitals and Patients, April-July 2020.
Sapiano MRP , Dudeck MA , Soe M , Edwards JR , O'Leary EN , Wu H , Allen-Bridson K , Amor A , Arcement R , Chernetsky Tejedor S , Dantes R , Gross C , Haass K , Konnor R , Kroop SR , Leaptrot D , Lemoine K , Nkwata A , Peterson K , Wattenmaker L , Weiner-Lastinger LM , Pollock D , Benin AL . Infect Control Hosp Epidemiol 2021 43 (1) 1-28 OBJECTIVE: The rapid spread of SARS-CoV-2 throughout key regions of the United States (U.S.) in early 2020 placed a premium on timely, national surveillance of hospital patient censuses. To meet that need, the Centers for Disease Control and Prevention's National Healthcare Safety Network (NHSN), the nation's largest hospital surveillance system, launched a module for collecting hospital COVID-19 data. This paper presents time series estimates of critical hospital capacity indicators during April 1-July 14, 2020. DESIGN: From March 27-July 14, 2020, NHSN collected daily data on hospital bed occupancy, number of hospitalized patients with COVID-19, and availability/use of mechanical ventilators. Time series were constructed using multiple imputation and survey weighting to allow near-real-time daily national and state estimates to be computed. RESULTS: During the pandemic's April peak in the United States, among an estimated 431,000 total inpatients, 84,000 (19%) had COVID-19. Although the number of inpatients with COVID-19 decreased during April to July, the proportion of occupied inpatient beds increased steadily. COVID-19 hospitalizations increased from mid-June in the South and Southwest after stay-at-home restrictions were eased. The proportion of inpatients with COVID-19 on ventilators decreased from April to July. CONCLUSIONS: The NHSN hospital capacity estimates served as important, near-real-time indicators of the pandemic's magnitude, spread, and impact, providing quantitative guidance for the public health response. 
Use of the estimates detected the rise of hospitalizations in specific geographic regions in June after declining from a peak in April. Patient outcomes appeared to improve from early April to mid-July. |
Identifying COVID-19 Risk Through Observational Studies to Inform Control Measures.
Tenforde MW , Fisher KA , Patel MM . JAMA 2021 325 (14) 1464-1465 A year into the coronavirus disease 2019 (COVID-19) pandemic, there remains an urgent need to limit severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) spread and to curb the pandemic in the US through nonpharmaceutical interventions. Clear evidence supports the effectiveness of simple strategies in identifying risks and mitigating the spread of infection, with much of this evidence coming from observational studies. Community risk factors for infection can be identified by comparing recent behaviors and exposures of infected persons with those of uninfected persons using a traditional case-control approach. High-risk environments identified from these investigations need to be clearly communicated to the public to support public health measures and motivate individual behavior change to reduce the risk of infection. |
Impact of non-pharmaceutical interventions (NPIs) for SARS-CoV-2 on norovirus outbreaks: an analysis of outbreaks reported by 9 US States.
Kraay ANM , Han P , Kambhampati AK , Wikswo ME , Mirza SA , Lopman BA . J Infect Dis 2021 224 (1) 9-13 In April 2020, the incidence of norovirus outbreaks reported to the National Outbreak Reporting System (NORS) dramatically declined. We used regression models to determine whether this decline was best explained by underreporting, seasonal trends, or reduced exposure due to non-pharmaceutical interventions (NPIs) implemented for SARS-CoV-2, using data from 9 states from July 2012-July 2020. The decline in norovirus outbreaks was significant for all 9 states, and neither underreporting nor seasonality is likely to be the primary explanation for these findings. These patterns were similar across a variety of settings. NPIs appear to have reduced the incidence of norovirus, a non-respiratory pathogen. |
Detection of B.1.351 SARS-CoV-2 Variant Strain - Zambia, December 2020.
Mwenda M , Saasa N , Sinyange N , Busby G , Chipimo PJ , Hendry J , Kapona O , Yingst S , Hines JZ , Minchella P , Simulundu E , Changula K , Nalubamba KS , Sawa H , Kajihara M , Yamagishi J , Kapin'a M , Kapata N , Fwoloshi S , Zulu P , Mulenga LB , Agolory S , Mukonka V , Bridges DJ . MMWR Morb Mortal Wkly Rep 2021 70 (8) 280-282 The first laboratory-confirmed cases of coronavirus disease 2019 (COVID-19), the illness caused by SARS-CoV-2, in Zambia were detected in March 2020 (1). Beginning in July, the number of confirmed cases began to increase rapidly, first peaking during July-August, and then declining in September and October (Figure). After 3 months of relatively low case counts, COVID-19 cases began rapidly rising throughout the country in mid-December. On December 18, 2020, South Africa published the genome of a SARS-CoV-2 variant strain with several mutations that affect the spike protein (2). The variant included a mutation (N501Y) associated with increased transmissibility.(†,§) SARS-CoV-2 lineages with this mutation have rapidly expanded geographically.(¶,**) The variant strain (PANGO [Phylogenetic Assignment of Named Global Outbreak] lineage B.1.351(††)) was first detected in the Eastern Cape Province of South Africa from specimens collected in early August, spread within South Africa, and appears to have displaced the majority of other SARS-CoV-2 lineages circulating in that country (2). As of January 10, 2021, eight countries had reported cases with the B.1.351 variant. In Zambia, the average number of daily confirmed COVID-19 cases increased 16-fold, from 44 cases during December 1-10 to 700 during January 1-10, after detection of the B.1.351 variant in specimens collected during December 16-23. Zambia is a southern African country that shares substantial commerce and tourism linkages with South Africa, which might have contributed to the transmission of the B.1.351 variant between the two countries. |
Suspected Recurrent SARS-CoV-2 Infections Among Residents of a Skilled Nursing Facility During a Second COVID-19 Outbreak - Kentucky, July-November 2020.
Cavanaugh AM , Thoroughman D , Miranda H , Spicer K . MMWR Morb Mortal Wkly Rep 2021 70 (8) 273-277 Reinfection with SARS-CoV-2, the virus that causes coronavirus disease 2019 (COVID-19), is believed to be rare (1). Some level of immunity after SARS-CoV-2 infection is expected; however, the evidence regarding duration and level of protection is still emerging (2). The Kentucky Department for Public Health (KDPH) and a local health department conducted an investigation at a skilled nursing facility (SNF) that experienced a second COVID-19 outbreak in October 2020, 3 months after a first outbreak in July. Five residents received positive SARS-CoV-2 reverse transcription-polymerase chain reaction (RT-PCR) test results during both outbreaks. During the first outbreak, three of the five patients were asymptomatic and two had mild symptoms that resolved before the second outbreak. Disease severity in the five residents during the second outbreak was worse than that during the first outbreak and included one death. Because test samples were not retained, phylogenetic strain comparison was not possible. However, interim period symptom resolution in the two symptomatic patients, at least four consecutive negative RT-PCR tests for all five patients before receiving a positive test result during the second outbreak, and the 3-month interval between the first and the second outbreaks, suggest the possibility that reinfection occurred. Maintaining physical distance, wearing face coverings or masks, and frequent hand hygiene are critical mitigation strategies necessary to prevent transmission of SARS-CoV-2 to SNF residents, a particularly vulnerable population at risk for poor COVID-19-associated outcomes.* Testing, containment strategies (isolation and quarantine), and vaccination of residents and health care personnel (HCP) are also essential components to protecting vulnerable residents. 
The findings of this study highlight the importance of maintaining public health mitigation and protection strategies that reduce transmission risk, even among persons with a history of COVID-19 infection. |
Characteristics and Outcomes of US Children and Adolescents With Multisystem Inflammatory Syndrome in Children (MIS-C) Compared With Severe Acute COVID-19.
Feldstein LR , Tenforde MW , Friedman KG , Newhams M , Rose EB , Dapul H , Soma VL , Maddux AB , Mourani PM , Bowens C , Maamari M , Hall MW , Riggs BJ , Giuliano JSJr , Singh AR , Li S , Kong M , Schuster JE , McLaughlin GE , Schwartz SP , Walker TC , Loftis LL , Hobbs CV , Halasa NB , Doymaz S , Babbitt CJ , Hume JR , Gertz SJ , Irby K , Clouser KN , Cvijanovich NZ , Bradford TT , Smith LS , Heidemann SM , Zackai SP , Wellnitz K , Nofziger RA , Horwitz SM , Carroll RW , Rowan CM , Tarquinio KM , Mack EH , Fitzgerald JC , Coates BM , Jackson AM , Young CC , Son MBF , Patel MM , Newburger JW , Randolph AG . JAMA 2021 325 (11) 1074-1087 IMPORTANCE: Refinement of criteria for multisystem inflammatory syndrome in children (MIS-C) may inform efforts to improve health outcomes. OBJECTIVE: To compare clinical characteristics and outcomes of children and adolescents with MIS-C vs those with severe coronavirus disease 2019 (COVID-19). SETTING, DESIGN, AND PARTICIPANTS: Case series of 1116 patients aged younger than 21 years hospitalized between March 15 and October 31, 2020, at 66 US hospitals in 31 states. Final date of follow-up was January 5, 2021. Patients with MIS-C had fever, inflammation, multisystem involvement, and positive severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) reverse transcriptase-polymerase chain reaction (RT-PCR) or antibody test results or recent exposure with no alternate diagnosis. Patients with COVID-19 had positive RT-PCR test results and severe organ system involvement. EXPOSURE: SARS-CoV-2. MAIN OUTCOMES AND MEASURES: Presenting symptoms, organ system complications, laboratory biomarkers, interventions, and clinical outcomes. Multivariable regression was used to compute adjusted risk ratios (aRRs) of factors associated with MIS-C vs COVID-19. RESULTS: Of 1116 patients (median age, 9.7 years; 45% female), 539 (48%) were diagnosed with MIS-C and 577 (52%) with COVID-19. 
Compared with patients with COVID-19, patients with MIS-C were more likely to be 6 to 12 years old (40.8% vs 19.4%; absolute risk difference [RD], 21.4% [95% CI, 16.1%-26.7%]; aRR, 1.51 [95% CI, 1.33-1.72] vs 0-5 years) and non-Hispanic Black (32.3% vs 21.5%; RD, 10.8% [95% CI, 5.6%-16.0%]; aRR, 1.43 [95% CI, 1.17-1.76] vs White). Compared with patients with COVID-19, patients with MIS-C were more likely to have cardiorespiratory involvement (56.0% vs 8.8%; RD, 47.2% [95% CI, 42.4%-52.0%]; aRR, 2.99 [95% CI, 2.55-3.50] vs respiratory involvement), cardiovascular without respiratory involvement (10.6% vs 2.9%; RD, 7.7% [95% CI, 4.7%-10.6%]; aRR, 2.49 [95% CI, 2.05-3.02] vs respiratory involvement), and mucocutaneous without cardiorespiratory involvement (7.1% vs 2.3%; RD, 4.8% [95% CI, 2.3%-7.3%]; aRR, 2.29 [95% CI, 1.84-2.85] vs respiratory involvement). Patients with MIS-C had higher neutrophil to lymphocyte ratio (median, 6.4 vs 2.7, P < .001), higher C-reactive protein level (median, 152 mg/L vs 33 mg/L; P < .001), and lower platelet count (<150 ×10(3) cells/μL [212/523 {41%} vs 84/486 {17%}, P < .001]). A total of 398 patients (73.8%) with MIS-C and 253 (43.8%) with COVID-19 were admitted to the intensive care unit, and 10 (1.9%) with MIS-C and 8 (1.4%) with COVID-19 died during hospitalization. Among patients with MIS-C with reduced left ventricular systolic function (172/503, 34.2%) and coronary artery aneurysm (57/424, 13.4%), an estimated 91.0% (95% CI, 86.0%-94.7%) and 79.1% (95% CI, 67.1%-89.1%), respectively, normalized within 30 days. CONCLUSIONS AND RELEVANCE: This case series of patients with MIS-C and with COVID-19 identified patterns of clinical presentation and organ system involvement. These patterns may help differentiate between MIS-C and COVID-19. |
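The absolute risk differences and their confidence intervals in this abstract follow directly from the group counts. A minimal sketch using a Wald normal approximation for the difference of two independent proportions, with the age-band counts back-calculated from the reported percentages (so treat them as approximate, not the study's exact data):

```python
from math import sqrt

def risk_difference(x1, n1, x2, n2, z=1.96):
    """Risk difference p1 - p2 with a Wald 95% CI for two
    independent binomial proportions."""
    p1, p2 = x1 / n1, x2 / n2
    rd = p1 - p2
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return rd, rd - z * se, rd + z * se

# Approximately 220/539 MIS-C vs 112/577 COVID-19 patients aged 6-12 years,
# inferred from the reported 40.8% and 19.4%.
rd, lo, hi = risk_difference(220, 539, 112, 577)
```

With these inputs the function reproduces the reported RD of 21.4% (95% CI, 16.1%-26.7%); the adjusted risk ratios additionally require the multivariable regression model described in the methods.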
Media Reports as a Tool for Timely Monitoring of COVID-19-Related Deaths Among First Responders - United States, April 2020.
Kelly-Reif K , Rinsky JL , Chiu SK , Burrer S , de Perio MA , Trotter AG , Miura SS , Seo JY , Hong R , Friedman L , Hand J , Richardson G , Sokol T , Sparer-Fine EH , Laing J , Oliveri A , McGreevy K , Borjan M , Harduar-Morano L , Luckhaupt SE . Public Health Rep 2021 136 (3) 315-319 We aimed to describe coronavirus disease 2019 (COVID-19) deaths among first responders early in the COVID-19 pandemic. We used media reports to gather timely information about COVID-19-related deaths among first responders during March 30-April 30, 2020, and evaluated the sensitivity of media scanning compared with traditional surveillance. We abstracted information about demographic characteristics, occupation, underlying conditions, and exposure source. Twelve of 19 US public health jurisdictions with data on reported deaths provided verification, and 7 jurisdictions reported whether additional deaths had occurred; we calculated the sensitivity of media scanning among these 7 jurisdictions. We identified 97 COVID-19-related first-responder deaths during the study period through media and jurisdiction reports. Participating jurisdictions reported 5 deaths not reported by the media. Sixty-six decedents worked in law enforcement, and 31 decedents worked in fire/emergency medical services. Media reports rarely noted underlying conditions. The media scan sensitivity was 88% (95% CI, 73%-96%) in the subset of 7 jurisdictions. Media reports demonstrated high sensitivity in documenting COVID-19-related deaths among first responders; however, information on risk factors was scarce. Routine collection of data on industry and occupation could improve understanding of COVID-19 morbidity and mortality among all workers. |
Nonmetropolitan COVID-19 incidence and mortality rates surpassed metropolitan rates within the first 24 weeks of the pandemic declaration: United States, March 1-October 18, 2020.
Matthews KA , Ullrich F , Gaglioti AH , Dugan S , Chen MS , Hall DM . J Rural Health 2021 37 (2) 272-277 PURPOSE: This report compares COVID-19 incidence and mortality rates in the nonmetropolitan areas of the United States with the metropolitan areas across three 11-week periods from March 1 to October 18, 2020. METHODS: County-level COVID-19 case, death, and population counts were downloaded from USAFacts.org. The 2013 NCHS Urban-Rural Classification Scheme was collapsed into two categories called metropolitan (large central, large fringe, medium, and small metropolitans) and nonmetropolitan (micropolitan/noncore). Daily COVID-19 incidence and mortality rates were computed to show temporal trends for each of these two categories. Maps showing the ratio of nonmetropolitan to metropolitan COVID-19 incidence and mortality rates by state identify states with higher rates in nonmetropolitan areas than in metropolitan areas in each of the three 11-week periods. FINDINGS: In the period between March 1 and October 18, 2020, 13.8% of the 8,085,214 confirmed COVID-19 cases and 10.7% of the 217,510 deaths occurred among people residing in nonmetropolitan counties. The nonmetropolitan incidence and mortality trends steadily increased and surpassed those in metropolitan areas, beginning in early August. CONCLUSIONS: Despite the relatively small size of the US population living in nonmetropolitan areas, these areas have an equal need for testing, health care personnel, and mitigation resources. Having state-specific rural data allows the development of prevention messages that are tailored to the sociocultural context of rural locations. |
First Identified Cases of SARS-CoV-2 Variant B.1.1.7 in Minnesota - December 2020-January 2021.
Firestone MJ , Lorentz AJ , Wang X , Como-Sabetti K , Vetter S , Smith K , Holzbauer S , Meyer S , Ehresmann K , Danila R , Lynfield R . MMWR Morb Mortal Wkly Rep 2021 70 (8) 278-279 On January 9, 2021, the Minnesota Department of Health (MDH) announced the identification of the SARS-CoV-2 variant of concern (VOC) B.1.1.7, also referred to as 20I/501Y.V1 and VOC 202012/01, in specimens from five persons; on January 25, MDH announced the identification of this variant in specimens from three additional persons. The B.1.1.7 variant, which is reported to be more transmissible than certain other SARS-CoV-2 lineages(*,†) (1), was first reported in the United Kingdom in December 2020 (1). As of February 14, 2021, a total of 1,173 COVID-19 cases of the B.1.1.7 variant had been identified in 39 U.S. states and the District of Columbia (2). Modeling data suggest that B.1.1.7 could become the predominant variant in the United States in March 2021 (3). |
Developing National Genotype-Independent Indicators for Recent Mycobacterium tuberculosis Transmission Using Pediatric Cases - United States, 2011-2017
Harrist AV , McDaniel CJ , Wortham JM , Althomsons SP . Public Health Rep 2021 137 (1) 81-86 INTRODUCTION: Pediatric tuberculosis (TB) cases are sentinel events for Mycobacterium tuberculosis transmission in communities because children, by definition, must have been infected relatively recently. However, these events are not consistently identified by genotype-dependent surveillance alerting methods because many pediatric TB cases are not culture-positive, a prerequisite for genotyping. METHODS: We developed 3 potential indicators of ongoing TB transmission based on identifying counties in the United States with relatively high pediatric (aged <15 years) TB incidence: (1) a case proportion indicator: an above-average proportion of pediatric TB cases among all TB cases; (2) a case rate indicator: an above-average pediatric TB case rate; and (3) a statistical model indicator: a statistical model based on a significant increase in pediatric TB cases from the previous 8-quarter moving average. RESULTS: Of the 249 US counties reporting ≥2 pediatric TB cases during 2009-2017, 240 and 249 counties were identified by the case proportion and case rate indicators, respectively. The statistical model indicator identified 40 counties with a significant increase in the number of pediatric TB cases. We compared results from the 3 indicators with an independently generated list of 91 likely transmission events involving ≥2 pediatric cases (ie, known TB outbreaks or case clusters with reported epidemiologic links). All counties with likely transmission events involving multiple pediatric cases were identified by ≥1 indicator; 23 were identified by all 3 indicators. PRACTICE IMPLICATIONS: This retrospective analysis demonstrates the feasibility of using routine TB surveillance data to identify counties where ongoing TB transmission might be occurring, even in the absence of available genotyping data. |
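The abstract does not specify the statistical model behind the third indicator beyond "a significant increase in pediatric TB cases from the previous 8-quarter moving average." One plausible sketch, assuming an exact Poisson tail test against the moving-average baseline (an assumption for illustration, not the paper's stated method):

```python
from math import exp

def poisson_sf(k, mu):
    """P(X >= k) for X ~ Poisson(mu), via the exact pmf recurrence."""
    pmf, cdf = exp(-mu), 0.0
    for i in range(k):
        cdf += pmf              # accumulate P(X = i)
        pmf *= mu / (i + 1)     # pmf(i) -> pmf(i + 1)
    return max(0.0, 1.0 - cdf)

def flag_increase(quarterly_counts, alpha=0.05):
    """Flag a county if its latest quarterly pediatric TB count exceeds
    the previous 8-quarter moving average more than chance predicts."""
    assert len(quarterly_counts) >= 9
    *history, current = quarterly_counts[-9:]
    baseline = sum(history) / 8          # 8-quarter moving average
    return poisson_sf(current, baseline) < alpha

# Invented county series: a stable baseline, then a spike in the latest quarter.
spike = flag_increase([1, 0, 2, 1, 0, 1, 2, 1, 7])
stable = flag_increase([1, 0, 2, 1, 0, 1, 2, 1, 2])
```

Here the spike series is flagged and the stable series is not; a production indicator would also need rules for sparse counties, which is why the authors compared this model against the simpler proportion- and rate-based indicators.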
A cross-sectional study of latent tuberculosis infection, insurance coverage, and usual sources of health care among non-US-born persons in the United States
Annan E , Stockbridge EL , Katz D , Mun EY , Miller TL . Medicine (Baltimore) 2021 100 (7) e24838 More than 70% of tuberculosis (TB) cases diagnosed in the United States (US) occur in non-US-born persons, and this population has experienced less than half the recent incidence rate declines of US-born persons (1.5% vs 4.2%, respectively). The great majority of TB cases in non-US-born persons are attributable to reactivation of latent tuberculosis infection (LTBI). Strategies to expand LTBI-focused TB prevention may depend on LTBI-positive non-US-born persons' access to, and ability to pay for, health care. To examine patterns of health insurance coverage and usual sources of health care among non-US-born persons with LTBI, and to estimate LTBI prevalence by insurance status and usual sources of health care, self-reported health insurance and usual sources of care for non-US-born persons were analyzed in combination with markers for LTBI using 2011-2012 National Health and Nutrition Examination Survey (NHANES) data for 1793 sampled persons. A positive result on an interferon gamma release assay (IGRA), a blood test which measures immunological reactivity to Mycobacterium tuberculosis infection, was used as a proxy for LTBI. We calculated demographic category percentages by IGRA status, IGRA percentages by demographic category, and 95% confidence intervals for each percentage. Overall, 15.9% [95% confidence interval (CI) = 13.5, 18.7] of non-US-born persons were IGRA-positive. Of IGRA-positive non-US-born persons, 63.0% (95% CI = 55.4, 69.9) had insurance and 74.1% (95% CI = 69.2, 78.5) had a usual source of care. IGRA positivity was highest in persons with Medicare (29.1%; 95% CI: 20.9, 38.9). Our results suggest that targeted LTBI testing and treatment within the US private healthcare sector could reach a large majority of non-US-born individuals with LTBI. 
With non-US-born Medicare beneficiaries' high prevalence of LTBI and the high proportion of LTBI-positive non-US-born persons with private insurance, future TB prevention initiatives focused on these payer types are warranted. |
Cryptococcal antigen screening among antiretroviral therapy-experienced people with HIV with viral load nonsuppression in rural Uganda
Baluku JB , Mugabe P , Mwebaza S , Nakaweesi J , Senyimba C , Opio JP , Mukasa B . Open Forum Infect Dis 2021 8 (2) ofab010 BACKGROUND: The World Health Organization recommends screening for the cryptococcal antigen (CrAg), a predictor of cryptococcal meningitis, among antiretroviral therapy (ART)-naïve people with HIV (PWH) with CD4 <100 cells/mm(3). CrAg positivity among ART-experienced PWH with viral load (VL) nonsuppression is not well established, yet high VLs are associated with cryptococcal meningitis independent of CD4 count. We compared the frequency and positivity yield of CrAg screening among ART-experienced PWH with VL nonsuppression and ART-naïve PWH with CD4 <100 cells/mm(3) attending rural public health facilities in Uganda. METHODS: We reviewed routinely generated programmatic reports on cryptococcal disease screening from 104 health facilities in 8 rural districts of Uganda from January 2018 to July 2019. A lateral flow assay (IMMY CrAg) was used to screen for cryptococcal disease. PWH were eligible for CrAg screening if they were ART-naïve with CD4 <100 cells/mm(3) or ART-experienced with an HIV VL >1000 copies/mL after at least 6 months of ART. We used Pearson's chi-square test to compare the frequency and yield of CrAg screening. RESULTS: Of 71,860 ART-experienced PWH, 7210 (10.0%) were eligible for CrAg screening. Among 15,417 ART-naïve PWH, 5719 (37.1%) had a CD4 count measurement, of whom 937 (16.4%) were eligible for CrAg screening. The frequency of CrAg screening was 11.5% (830/7210) among eligible ART-experienced PWH compared with 95.1% (891/937) of eligible ART-naïve PWH (P < .001). The CrAg positivity yield was 10.5% among eligible ART-experienced PWH compared with 13.8% among eligible ART-naïve PWH (P = .035). CONCLUSIONS: The low frequency and high positivity yield of CrAg screening among ART-experienced PWH with VL nonsuppression suggest a need for VL-directed CrAg screening in this population. 
Studies are needed to evaluate the cost-effectiveness and impact of CrAg screening and fluconazole prophylaxis on the outcomes of ART-experienced PWH with VL nonsuppression. |
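The screening-frequency comparison in the CrAg abstract above can be reproduced from the counts it reports (830/7210 screened among eligible ART-experienced PWH vs 891/937 among eligible ART-naïve PWH). A minimal stdlib-only Python sketch of a Pearson chi-square test on that 2x2 table; the `chi2_2x2` helper is illustrative and is not the study's actual analysis code:

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square (no continuity correction) for the 2x2 table
    [[a, b], [c, d]], with the p-value from the chi-square(1 df) upper tail."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(chi2 / 2))  # exact 1-df tail via the error function
    return chi2, p

# Counts from the abstract: screened / not screened among eligible PWH
chi2, p = chi2_2x2(830, 7210 - 830,   # ART-experienced
                   891, 937 - 891)    # ART-naive
print(f"chi2 = {chi2:.1f}, p = {p:.3g}")  # p is far below .001, as reported
```

The very large chi-square statistic reflects the gap between 11.5% and 95.1% screening coverage; the abstract's P < .001 follows immediately.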
Drug susceptibility patterns of Mycobacterium tuberculosis from adults with multidrug-resistant tuberculosis and implications for a household contact preventive therapy trial
Demers AM , Kim S , McCallum S , Eisenach K , Hughes M , Naini L , Mendoza-Ticona A , Pradhan N , Narunsky K , Poongulali S , Badal-Faesen S , Upton C , Smith E , Shah NS , Churchyard G , Gupta A , Hesseling A , Swindells S . BMC Infect Dis 2021 21 (1) 205 BACKGROUND: Drug susceptibility testing (DST) patterns of Mycobacterium tuberculosis (MTB) from patients with rifampicin-resistant tuberculosis (RR-TB) or multidrug-resistant TB (MDR-TB; resistant to rifampicin and isoniazid (INH)) are important to guide preventive therapy for their household contacts (HHCs). METHODS: As part of a feasibility study done in preparation for an MDR-TB preventive therapy trial in HHCs, smear, Xpert MTB/RIF, Hain MTBDRplus, culture, and DST results of index MDR-TB patients were obtained from routine TB programs. A sputum sample was collected at study entry and evaluated by the same tests. Not all tests were performed on all specimens due to variations in test availability. RESULTS: Three hundred eight adults with reported RR/MDR-TB were enrolled from 16 participating sites in 8 countries. Their median age was 36 years, and 36% were HIV-infected. Routine testing confirmed all 308 as having RR-TB, but only 75% were documented as having MDR-TB. Most of those not classified as having MDR-TB lacked that classification because only rifampicin resistance had been tested. At study entry (median 59 days after MDR-TB treatment initiation), 280 participants (91%) were able to produce sputum for the study, of whom 147 (53%) still had detectable MTB. All but 2 of these 147 had rifampicin DST done, with resistance detected in 89%. Almost half (47%) of the 147 specimens had INH DST done, with resistance detected in 83%. Therefore, 20% of the 280 study specimens had MDR-TB confirmed. Overall, DST for second-line drugs was available in only 35% of the 308 routine specimens and 15% of 280 study specimens. 
CONCLUSIONS: RR-TB was detected in all routine specimens, but only 75% had documented MDR-TB, illustrating the need for expanded DST beyond Xpert MTB/RIF to target preventive therapy for HHCs. |
County-level variation in hepatitis C virus mortality and trends in the United States, 2005-2017
Hall EW , Schillie S , Vaughan AS , Jones J , Bradley H , Lopman B , Rosenberg E , Sullivan P . Hepatology 2021 74 (2) 582-590 Since 2013, the national HCV death rate has steadily declined, but this decline has not been quantified or described on a local level. We estimated county-level HCV death rates and assessed trends in HCV mortality from 2005 to 2013 and 2013 to 2017. We used mortality data from the National Vital Statistics System and a Bayesian multivariate space-time conditional autoregressive model to estimate age-standardized HCV death rates from 2005 through 2017 for 3115 U.S. counties. Additionally, we estimated county-level age-standardized rates for persons <40 and 40+ years of age. We used log-linear regression models to estimate average annual percent change in HCV mortality during periods of interest and compared county-level trends to national trends. Nationally, the age-adjusted HCV death rate peaked in 2013 at 5.20 HCV deaths per 100,000 (95% credible interval, CI: 5.12, 5.26) before decreasing to 4.34 per 100,000 persons (95% CI: 4.28, 4.41) in 2017 (average annual percent change -4.69, 95% CI: -5.01, -4.33). County-level rates revealed heterogeneity in HCV mortality (2017 median rate = 3.66, interdecile range: 2.19, 6.77), with the highest rates concentrated in the West, Southwest, Appalachia, and northern Florida. Between 2013 and 2017, HCV mortality decreased in 80.0% (n=2274) of all U.S. counties with a reliable trend estimate, with 25.8% (n=803) of all counties experiencing a decrease larger than the national decline. Conclusion: Although many counties have experienced a shift in HCV mortality trends since 2013, the magnitude and composition of that shift have varied by place. These data provide a better understanding of geographic differences in HCV mortality and can be used by local jurisdictions to evaluate HCV mortality in their areas relative to surrounding areas and the nation. |
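The average annual percent change (AAPC) in the HCV mortality abstract above comes from a log-linear fit of log(rate) on calendar year. A small stdlib-only Python sketch of that calculation; note that only the 2013 and 2017 endpoint rates are given in the abstract, so this two-point fit approximates (about -4.4%) but will not exactly reproduce the reported -4.69%, which was fitted over all annual estimates:

```python
import math

def aapc(years, rates):
    """Average annual percent change from a least-squares log-linear fit:
    log(rate) = a + b * year, with AAPC = (exp(b) - 1) * 100."""
    n = len(years)
    xbar = sum(years) / n
    ybar = sum(math.log(r) for r in rates) / n
    b = sum((x - xbar) * (math.log(r) - ybar) for x, r in zip(years, rates)) / \
        sum((x - xbar) ** 2 for x in years)
    return (math.exp(b) - 1) * 100

# Endpoint rates per 100,000 from the abstract (2013 peak, 2017 value)
print(f"AAPC 2013-2017 ~ {aapc([2013, 2017], [5.20, 4.34]):.2f}%")
```

With more annual data points the same function would recover the fitted national estimate; the endpoint approximation is shown here only because those are the figures reported.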
Prevalence of condomless anal intercourse and associated risk factors among men who have sex with men in Bamako, Mali
Knox J , Patnaik P , Hakim AJ , Telly N , Ballo T , Traore B , Doumbia S , Lahuerta M . Int J STD AIDS 2021 32 (3) 218-227 BACKGROUND: Men who have sex with men (MSM) are disparately impacted by HIV in sub-Saharan Africa and condomless anal intercourse (CAI) is a major driver of HIV transmission. The objective of the current study was to identify factors associated with CAI among MSM in Bamako, Mali, among whom HIV prevalence was 13.7%. METHODS: A bio-behavioral survey was conducted between October 2014 and February 2015 using respondent-driven sampling to recruit 552 adult MSM. Weighted statistical analyses were conducted to determine the prevalence of CAI with one's most recent male partner and survey logistic procedures were used to identify associated factors. RESULTS: The prevalence of CAI with one's most recent male partner was 40.7%. Associated factors included: inability to get a condom when needed (aOR = 5.8, 95%CI: 2.7-12.3) and believing CAI is acceptable under some circumstances (aOR = 8.4, 95%CI: 4.4-16.2). CONCLUSIONS: Programs addressing HIV among MSM in Mali should aim to increase access to condoms and education about HIV prevention through consistent condom use during anal intercourse. |
Key population hotspots in Nigeria for targeted HIV program planning: Mapping, validation, and reconciliation
Lo J , Nwafor SU , Schwitters AM , Mitchell A , Sebastian V , Stafford KA , Ezirim I , Charurat M , McIntyre AF . JMIR Public Health Surveill 2021 7 (2) e25623 BACKGROUND: With the fourth highest HIV burden globally, Nigeria is characterized as having a mixed HIV epidemic with high HIV prevalence among key populations, including female sex workers, men who have sex with men, and people who inject drugs. Reliable and accurate mapping of key population hotspots is necessary for strategic placement of services and allocation of limited resources for targeted interventions. OBJECTIVE: We aimed to map and develop a profile for the hotspots of female sex workers, men who have sex with men, and people who inject drugs in 7 states of Nigeria to inform HIV prevention and service programs and in preparation for a multiple-source capture-recapture population size estimation effort. METHODS: In August 2018, 261 trained data collectors from 36 key population-led community-based organizations mapped, validated, and profiled hotspots identified during the formative assessment in 7 priority states in Nigeria designated by the United States President's Emergency Plan for AIDS Relief. Hotspots were defined as physical venues that key population members frequent to socialize, seek clients, or engage in key population-defining behaviors. Data collectors visited each hotspot and recorded its name, local government area, address, type, geographic coordinates, peak times of activity, and estimated number of key population members. The number of key population hotspots per local government area was tabulated from the final list of hotspots. RESULTS: A total of 13,899 key population hotspots were identified and mapped in the 7 states, that is, 1297 in Akwa Ibom, 1714 in Benue, 2666 in Cross River, 2974 in Lagos, 1550 in Nasarawa, 2494 in Rivers, and 1204 in Federal Capital Territory. 
The most common hotspots were those frequented by female sex workers (9593/13,899, 69.0%), followed by people who inject drugs (2729/13,899, 19.6%) and men who have sex with men (1577/13,899, 11.3%). Although hotspots were identified in all local government areas visited, more hotspots were found in metropolitan local government areas and state capitals. CONCLUSIONS: The number of key population hotspots identified in this study is more than that previously reported in similar studies in Nigeria. Close collaboration with key population-led community-based organizations facilitated identification of many new and previously undocumented key population hotspots in the 7 states. The smaller number of hotspots of men who have sex with men than that of female sex workers and that of people who inject drugs may reflect the social pressure and stigma faced by this population since the enforcement of the 2014 Same Sex Marriage (Prohibition) Act, which prohibits engaging in intimate same-sex relationships, organizing meetings of gays, or patronizing gay businesses. |
Viral etiology and seasonal trends of pediatric acute febrile illness in southern Puerto Rico: a seven-year review
Sánchez-González L , Quandelacy TM , Johansson M , Torres-Velásquez B , Lorenzi O , Tavarez M , Torres S , Alvarado LI , Paz-Bailey G . PLoS One 2021 16 (2) e0247481 BACKGROUND: Acute febrile illness (AFI) is an important cause for seeking health care among children. Knowledge of the most common etiologic agents of AFI and its seasonality is limited in most tropical regions. METHODOLOGY/PRINCIPAL FINDINGS: To describe the viral etiology of AFI in pediatric patients (≤18 years) recruited through a sentinel enhanced dengue surveillance system (SEDSS) in southern Puerto Rico, we analyzed data for patients enrolled from 2012 to May 2018. To identify seasonal patterns, we applied time-series analyses to monthly arboviral and respiratory infection case data. We calculated coherence and phase differences for paired time series to quantify the association between each pair of time series. A viral pathogen was found in 47% of the 14,738 patients. Influenza A virus was the most common pathogen detected (26%). The incidence of Zika and dengue virus etiologies increased with age. Arboviral infections peaked between June and September throughout the time series. Respiratory infections showed seasonal peaks in the fall and winter months of each year, though patterns varied by individual respiratory pathogen. CONCLUSIONS/SIGNIFICANCE: The distinct seasonal patterns and differences in relative frequency by age group seen in this study can guide clinical and laboratory assessment in pediatric patients with AFI in Puerto Rico. |
Addressing the STI epidemic through the Medicaid program: A roadmap for states and managed care organizations
Seiler N , Horton K , Pearson WS , Cramer R , Adil M , Bishop D , Heyison C . Public Health Rep 2021 137 (1) 5-10 Chlamydia, gonorrhea, and syphilis are all detectable and treatable, yet rates of these 3 bacterial sexually transmitted infections (STIs) are soaring in the United States.1 If left untreated, both chlamydia and gonorrhea can lead to costly and burdensome complications, including pelvic inflammatory disease and infertility.2,3 Untreated primary syphilis can lead to severe sequelae including death, and congenital syphilis can lead to miscarriage, stillbirth, prematurity, low birthweight, and death.4,5 People who develop these complications because of untreated STIs have high medical costs throughout their lifetime.6,7 Although rates of chlamydia, gonorrhea, and syphilis have been rising among all racial/ethnic groups, African American and Latinx people have persistently higher burdens of infection than White people.8 |
Epidemiology of HIV in the USA: epidemic burden, inequities, contexts, and responses
Sullivan PS , Satcher Johnson A , Pembleton ES , Stephenson R , Justice AC , Althoff KN , Bradley H , Castel AD , Oster AM , Rosenberg ES , Mayer KH , Beyrer C . Lancet 2021 397 (10279) 1095-1106 The HIV epidemic in the USA began as a bicoastal epidemic focused in large cities but, over nearly four decades, the epidemiology of HIV has changed. Public health surveillance data can inform an understanding of the evolution of the HIV epidemic in terms of the populations and geographical areas most affected. We analysed publicly available HIV surveillance data and census data to describe: current HIV prevalence and new HIV diagnoses by region, race or ethnicity, and age; trends in HIV diagnoses over time by HIV acquisition risk and age; and the distribution of HIV prevalence by geographical area. We reviewed published literature to explore the reasons for the current distribution of HIV cases and important disparities in HIV prevalence. We identified opportunities to improve public health surveillance systems and uses of data for planning and monitoring public health responses. The current US HIV epidemic is marked by geographical concentration in the US South and profound disparities between regions and by race or ethnicity. Rural areas vary in HIV prevalence; rural areas in the South are more likely to have a high HIV prevalence than rural areas in other US Census regions. Ongoing disparities in HIV in the South are probably driven by the restricted expansion of Medicaid, health-care provider shortages, low health literacy, and HIV stigma. HIV diagnoses overall declined in 2009-18, but HIV diagnoses among individuals aged 25-34 years increased during the same period. HIV diagnoses decreased for all risk groups in 2009-18; among men who have sex with men (MSM), new diagnoses decreased overall and for White MSM, remained stable for Black MSM, and increased for Hispanic or Latino MSM. 
Surveillance data indicate profound and ongoing disparities in HIV cases, with disproportionate impact among people in the South, racial or ethnic minorities, and MSM. |
Antiretroviral treatment initiation among HIV-positive participants in the Bangkok men who have sex with men cohort study, 2006-2016
Wimonsate W , Sriporn A , Pattanasin S , Varangrat A , Promda N , Sukwicha W , Holtz TH , Ungsedhapand C , Chitwarakorn A , Hickey AC , Dunne EF . Int J STD AIDS 2021 32 (8) 687-693 INTRODUCTION: Data on HIV antiretroviral therapy (ART) initiation among key-affected populations will support reaching the UNAIDS goal to end AIDS by 2030. METHODS: We assessed ART initiation among HIV-positive participants of the Bangkok Men Who Have Sex with Men (MSM) Cohort Study, which enrolled sexually experienced MSM aged ≥ 18 years and included visits every four months for a period of 3-5 years, from 2006-2016. At each visit, participants had HIV testing and completed computer-assisted self-interviewing on demographics and HIV risk behaviors. If they acquired HIV infection during the study, they received active referral for HIV treatment, continued in the cohort, and were asked about ART initiation. We used logistic regression to determine factors associated with ART initiation. RESULTS: Overall, 632 (36.2%) participants were diagnosed with HIV infection; 463 (73%) had a follow-up visit reporting information about ART, of whom 346 (74%) reported ART initiation, with 323 (93%) initiating ART through their registered national health benefit program. Only 70 (11%) were eligible for ART at the time of diagnosis, and 52 (74%) of them initiated ART, on average, within six months of diagnosis. Multivariable analysis demonstrated that low CD4 cell count at the time of diagnosis was the only independent factor associated with ART initiation. CONCLUSIONS: Most HIV-positive participants in the cohort reported ART initiation through the national health benefit program, but limited data suggest there could be improvements in the length of time to ART initiation. Efforts should focus on starting ART in MSM and transgender women soon after HIV diagnosis. |
Using soil survey data to model potential Coccidioides soil habitat and inform Valley fever epidemiology
Dobos RR , Benedict K , Jackson BR , McCotter OZ . PLoS One 2021 16 (2) e0247263 Coccidioidomycosis, also known as Valley fever, is a disease that can result in substantial illness and death. It is most common in the southwestern United States and areas of Latin America with arid climates, though reports increasingly suggest its range is wider than previously recognized. The natural habitat of the causative organisms, Coccidioides spp., has been associated with certain soil properties and climatic conditions. Current understanding of their geographic range is primarily defined by skin test studies and outbreak locations. We developed a fuzzy system model to predict suitable soil habitats for Coccidioides across the western United States based on parameters (electrical conductivity, organic matter content, pH, water holding capacity, temperature, and precipitation) from sites where soil sampling has confirmed the presence of Coccidioides. The model identified high coccidioidomycosis incidence areas as having high suitability and identified pockets of elevated suitability corresponding with outbreak locations outside the traditional range. By providing high-resolution estimates of Coccidioides suitability, including in areas without public health surveillance for coccidioidomycosis, this model may aid public health and clinical provider decision making. Awareness of possible Coccidioides soil habitats could help mitigate risk during soil-disturbing activities and help providers improve coccidioidomycosis diagnosis and treatment. |
Spatial clustering of livestock Anthrax events associated with agro-ecological zones in Kenya, 1957-2017
Nderitu LM , Gachohi J , Otieno F , Mogoa EG , Muturi M , Mwatondo A , Osoro EM , Ngere I , Munyua PM , Oyas H , Njagi O , Lofgren E , Marsh T , Widdowson MA , Bett B , Njenga MK . BMC Infect Dis 2021 21 (1) 191 BACKGROUND: Developing disease risk maps for priority endemic and episodic diseases is becoming increasingly important for more effective disease management, particularly in resource limited countries. For endemic and easily diagnosed diseases such as anthrax, using historical data to identify hotspots and start to define ecological risk factors of its occurrence is a plausible approach. Using 666 livestock anthrax events reported in Kenya over 60 years (1957-2017), we determined the temporal and spatial patterns of the disease as a step towards identifying and characterizing anthrax hotspots in the region. METHODS: Data were initially aggregated by administrative unit and later analyzed by agro-ecological zones (AEZ) to reveal anthrax spatio-temporal trends and patterns. Variations in the occurrence of anthrax events were estimated by fitting Poisson generalized linear mixed-effects models to the data with AEZs and calendar months as fixed effects and sub-counties as random effects. RESULTS: The country reported approximately 10 anthrax events annually, with the number increasing to as many as 50 annually by the year 2005. Spatial classification of the events in eight counties that reported the highest numbers revealed spatial clustering in certain administrative sub-counties, with 12% of the sub-counties responsible for over 30% of anthrax events, whereas 36% did not report any anthrax disease over the 60-year period. When segregated by AEZs, there was significantly greater risk of anthrax disease occurring in agro-alpine, high, and medium potential AEZs when compared to the agriculturally low potential arid and semi-arid AEZs of the country (p < 0.05). Interestingly, cattle were > 10 times more likely to be infected by B. 
anthracis than sheep, goats, or camels. There was lower risk of anthrax events in August (P = 0.034) and December (P = 0.061), months that follow the long and short rain periods, respectively. CONCLUSION: Taken together, these findings suggest the existence of certain geographic, ecological, and demographic risk factors that promote B. anthracis persistence and transmission in the disease hotspots. |
Outpatient insulin-related adverse events due to mix-up errors: Findings from two national surveillance systems, United States, 2012-2017
Geller AI , Conrad AO , Weidle NJ , Mehta H , Budnitz DS , Shehab N . Pharmacoepidemiol Drug Saf 2021 30 (5) 573-581 PURPOSE: We used data from two public health surveillance systems for national estimates and detailed descriptions of insulin mix-up errors resulting in emergency department (ED) visits and other serious adverse events to help inform prevention efforts. METHODS: ED visits involving patients seeking care for insulin medication errors collected by the NEISS-CADES project in 2012-2017 and voluntary reports of serious insulin medication errors submitted to the U.S. Food and Drug Administration (FDA) in 2016-2017 were analyzed. National estimates of insulin product prescriptions dispensed from retail pharmacies were obtained from the IQVIA National Prescription Audit. RESULTS: Between 2012 and 2017, based on 514 NEISS-CADES cases, there were an estimated 5,636 (95% CI, 4,143-7,128) ED visits annually for insulin mix-up errors; overall, over three-quarters (77.5%; 95% CI, 71.6%-83.3%) involved taking rapid-acting instead of long-acting insulin. Between 2012 and 2017, the proportion of mix-up errors among all estimated ED visits for all insulin errors decreased by 60%; concurrently, the proportion of pens among all insulin package types dispensed increased by 50%. Among 58 voluntary reports submitted to the FDA Adverse Event Reporting System (FAERS), over one-half (56.9%) of cases involved taking rapid- instead of long-acting insulin. Among 27 cases with documented contributing factors, approximately one-half involved patients having difficulty differentiating products. CONCLUSIONS: Among all ED visits for insulin errors collected by NEISS-CADES in 2012-2017, the proportion involving mix-up errors has declined. Continued reductions may require additional prevention strategies, including improving insulin distinctiveness, particularly for rapid- vs. long-acting insulins. Ongoing national surveillance is important for identifying the impact of interventions. |
Status of state cyanoHAB outreach and monitoring efforts, United States
Hardy FJ , Preece E , Backer L . Lake Reserv Manage 2020 37 (3) 246-260 A widespread effort is underway to improve awareness of cyanobacteria harmful algal blooms (cyanoHABs) across the United States using a variety of monitoring programs and public health outreach measures to protect people, pets, and livestock. To determine the status of cyanoHAB outreach and monitoring efforts, 2 questionnaires were distributed to health/environmental departments in 50 states and the District of Columbia (DC). One questionnaire focused on human exposure to cyanoHABs through drinking water and the second targeted exposure through recreational activities. All states plus DC responded to the recreational survey; 46 states plus DC responded to the drinking water survey. All states except Alaska answered that microcystins were the cyanotoxins of greatest concern for recreational exposure; microcystins were also of greatest concern for drinking water, with the exception of Utah (anatoxin-a in reservoirs was the greatest concern) and Rhode Island (microcystins and anatoxin-a in reservoirs/ponds were the greatest concern). Regional comparisons disclosed a lack of cyanoHAB programs in southern states relative to northern states that may be related to the higher percentage of water surface area in northern states. Interestingly, recreational outreach is more extensive than drinking water outreach (only 16 states reported having some type of drinking water outreach program, compared with 35 states with recreational outreach), and the preferred outreach methods are websites and press releases. Additionally, respondents reported very limited funding for outreach and monitoring programs. Our results establish baseline information to help determine what future direction cyanoHAB outreach and monitoring programs can take at local, regional, and national levels. |
Cyanobacteria growth in nitrogen- & phosphorus-spiked water from a hypereutrophic reservoir in Kentucky, USA
Hughes SE , Marion JW . J Environ Prot 2021 12 (2) 75-89 Cyanobacteria may adversely impact aquatic ecosystems through oxygen depletion and cyanotoxin production. These cyanotoxins can also harm human health and livestock. In recent years, cyanobacterial blooms have been observed in several drinking water reservoirs in Kentucky, United States. In Kentucky, the paradigm is that phosphorus is the limiting nutrient for cyanobacteria growth. To explore this paradigm, an indoor microcosm study was conducted using hypereutrophic Guist Creek Lake water. Samples were collected and spiked with various combinations of locally used agricultural grade fertilizers, including ammonium nitrate, urea, and triple phosphate (calcium dihydrogen phosphate). Samples were incubated indoors for the photoperiod specific to the time of year. Cyanobacteria growth, measured by phycocyanin, did not increase with the addition of phosphate fertilizer alone. Cyanobacteria growth was enhanced under these conditions by the combined addition of ammonium nitrate, urea, and phosphorus fertilizer. Growth also occurred when using either ammonium nitrate or urea fertilizer with no additional phosphorus input, suggesting that phosphorus was not limiting the cyanobacteria at the time of sample collection. The addition of both nitrogen fertilizers (ammonium nitrate and urea) at the concentrations used in this study, in the absence of phosphorus, was deleterious to both the Chlorophyta and cyanobacteria. The results suggest further studies using more robust experimental designs are needed to explore lake-specific dual nutrient management strategies for preventing cyanobacterial blooms in this phosphorus-rich hypereutrophic lake and possibly other hypereutrophic lakes. |
Communicating effectively to overcome misinformation
Khan A , Dove T , Segerlind S . J Environ Health 2021 83 (6) 44-46 Editor's Note: The National Environmental Health Association strives to provide up-to-date and relevant information on environmental health and to build partnerships in the profession. In pursuit of these goals, we feature a column on environmental health services from the Centers for Disease Control and Prevention (CDC) in every issue of the Journal. | | In these columns, authors from CDC's Water, Food, and Environmental Health Services Branch, as well as guest authors, will share insights and information about environmental health programs, trends, issues, and resources. The conclusions of these columns are those of the author(s) and do not necessarily represent the official position of CDC. | | Water Management Programs Are Key to Managing Legionella Growth and Spread | Elaine Curtiss, MEd, National Center for Environmental Health, Centers for Disease Control and Prevention | Janie Hils, MPH, National Center for Environmental Health, Centers for Disease Control and Prevention | CDR Jasen Kunz, MPH, REHS/RS, National Center for Environmental Health, Centers for Disease Control and Prevention | In summer 2021, several U.S. public health jurisdictions reported increases in Legionnaires' disease cases above their respective 5-year baseline averages. While the Centers for Disease Control and Prevention (CDC) does not know to what extent building water systems might have contributed to these increases, periods of reduced building occupancy or building closure and low water usage can create hazards for occupants. Reopening schools, workplaces, and businesses—and more people traveling and staying in hotels—can elevate the risk of exposure to Legionella bacteria if appropriate steps are not taken. Environmental health professionals have an important role in reminding building owners, building operators, and cooling tower operators of ways to safely reopen buildings to prevent the growth of Legionella. 
| | Water management programs help people identify hazardous conditions and take steps to minimize the growth and spread of Legionella and other waterborne pathogens in building water systems. Developing and maintaining a water management program is a multistep process that requires continuous review. This month's column provides several different resources from CDC to aid in the development of water management programs and prevent the spread and growth of Legionella. |
Per- and polyfluoroalkyl substances and calcifications of the coronary and aortic arteries in adults with prediabetes: Results from the Diabetes Prevention Program Outcomes Study
Osorio-Yáñez C , Sanchez-Guerra M , Cardenas A , Lin PD , Hauser R , Gold DR , Kleinman KP , Hivert MF , Fleisch AF , Calafat AM , Webster TF , Horton ES , Oken E . Environ Int 2021 151 106446 BACKGROUND: Per- and polyfluoroalkyl substances (PFAS) are endocrine disrupting chemicals that have been associated with cardiovascular risk factors including elevated body weight and hypercholesterolemia. Therefore, PFAS may contribute to the development of atherosclerosis and cardiovascular disease (CVD). However, no previous study has evaluated associations between PFAS exposure and arterial calcification. METHODS AND RESULTS: This study used data from 666 prediabetic adults enrolled in the Diabetes Prevention Program trial who had six PFAS quantified in plasma at baseline and two years after randomization, as well as measurements of coronary artery calcium (CAC) and ascending (AsAC) and descending (DAC) thoracic aortic calcification 13-14 years after baseline. We performed multinomial regression to test associations between PFAS and CAC categorized according to Agatston score [low (<10), moderate (11-400), and severe (>400)]. We used logistic regression to assess associations between PFAS and presence of AsAC and DAC. We adjusted models for baseline sex, age, BMI, race/ethnicity, cigarette smoking, education, treatment assignment (placebo or lifestyle intervention), and statin use. PFAS concentrations were similar to national means; 53.9% of participants had CAC > 11, 7.7% had AsAC, and 42.6% had DAC. Each doubling of the mean sum of plasma concentrations of linear and branched isomers of perfluorooctane sulfonic acid (PFOS) was associated with 1.49-fold greater odds (95% CI: 1.01, 2.21) of severe versus low CAC. This association was driven mainly by the linear isomer (n-PFOS) [1.54-fold greater odds (95% CI: 1.05, 2.25) of severe versus low CAC]. 
Each doubling of mean plasma N-ethyl-perfluorooctane sulfonamido acetic acid concentration was associated with greater odds of CAC in a dose-dependent manner [OR = 1.26 (95% CI: 1.08, 1.47) for moderate CAC and OR = 1.37 (95% CI: 1.07, 1.74) for severe CAC, compared to low CAC]. Mean plasma PFOS and n-PFOS were also associated with greater odds of AsAC [OR = 1.67 (95% CI: 1.10, 2.54) and OR = 1.70 (95% CI: 1.13, 2.56), respectively], but not DAC. Other PFAS were not associated with the outcomes. CONCLUSIONS: Prediabetic adults with higher plasma concentrations of select PFAS had a higher risk of coronary and thoracic aortic calcification. PFAS exposure may be a risk factor for adverse cardiovascular health among high-risk populations. |
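The "per doubling" odds ratios above are typically obtained by fitting the model to log-transformed concentrations and rescaling the coefficient. A minimal sketch of that arithmetic, together with the abstract's Agatston categories; the coefficient value is a hypothetical chosen for illustration, not the study's estimate:

```python
import math

# Cut points mirror the abstract's CAC categories:
# low (<10), moderate (11-400), severe (>400)
def cac_category(agatston_score):
    if agatston_score < 10:
        return "low"
    if agatston_score <= 400:
        return "moderate"
    return "severe"

# A coefficient estimated per unit of ln(concentration) is rescaled by
# ln(2) to express the odds ratio per doubling of exposure.
beta_per_ln_unit = 0.575  # hypothetical coefficient, for illustration only
or_per_doubling = math.exp(beta_per_ln_unit * math.log(2))  # ~1.49
```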
Chronic environmental contamination: A systematic review of psychological health consequences
Schmitt HJ , Calloway EE , Sullivan D , Clausen WH , Tucker PG , Rayman J , Gerhardstein B . Sci Total Environ 2021 772 145025 We sought to undertake a systematic review to assess the current research and to provide a platform for future research on the psychological health impact of chronic environmental contamination (CEC). CEC is the experience of living in an area where hazardous substances are known or perceived to be present in air, water, or soil at elevated levels for a prolonged and unknown period of time. We employed a systematic review approach to assess the psychological health impact of CEC in literature from 1995 to 2019, and conducted a meta-analysis of available findings (k = 60, N = 25,858) on the impact of CEC on anxiety, general stress, depression, and PTSD. We also present a narrative synthesis of findings that suggest risk factors for the experience of psychological health impacts in the wake of CEC. Likely factors increasing risk for elevated psychological health impact from CEC experience are institutional delegitimization of community concerns and the real or perceived presence of health effects from CEC. The meta-analysis found small-to-medium effects of experiencing CEC on anxiety, general stress, depression, and PTSD. However, there was also evident risk of bias in the data. Our review suggests that psychological health in the context of CEC is an important potential public health burden and a key area for improved future research. |
Follow-up Survey of US Adult Reports of Mental Health, Substance Use, and Suicidal Ideation During the COVID-19 Pandemic, September 2020.
Czeisler MÉ , Lane RI , Wiley JF , Czeisler CA , Howard ME , Rajaratnam SMW . JAMA Netw Open 2021 4 (2) e2037665 This survey study compared patterns of mental health concerns, substance use, and suicidal ideation during June and September 2020 of the COVID-19 pandemic and examined at-risk demographic groups. |
Estimated Medicaid costs associated with hepatitis A during an outbreak - West Virginia, 2018-2019
Batdorf SJ , Hofmeister MG , Surtees TC , Thomasson ED , McBee SM , Pauly NJ . MMWR Morb Mortal Wkly Rep 2021 70 (8) 269-272 Hepatitis A is a vaccine-preventable disease caused by the hepatitis A virus (HAV). Transmission of the virus most commonly occurs through the fecal-oral route after close contact with an infected person. Widespread outbreaks of hepatitis A among persons who use illicit drugs (injection and noninjection drugs) have increased in recent years (1). The Advisory Committee on Immunization Practices (ACIP) recommends routine hepatitis A vaccination for children and persons at increased risk for infection or severe disease, and, since 1996, has recommended hepatitis A vaccination for persons who use illicit drugs (2). Vaccinating persons who are at risk for HAV infection is a mainstay of the public health response for stopping ongoing person-to-person transmission and preventing future outbreaks (1). In response to a large hepatitis A outbreak in West Virginia, an analysis was conducted to assess total hepatitis A-related medical costs during January 1, 2018-July 31, 2019, among West Virginia Medicaid beneficiaries with a confirmed diagnosis of HAV infection. Among the analysis population, direct clinical costs ranged from an estimated $1.4 million to $5.6 million. Direct clinical costs among a subset of the Medicaid population with a diagnosis of a comorbid substance use disorder ranged from an estimated $1.0 million to $4.4 million during the study period. In addition to providing insight into preventing illness, hospitalization, and death, the results from this study highlight the potential financial cost jurisdictions might incur when ACIP recommendations for hepatitis A vaccination, especially among persons who use illicit drugs, are not followed (2). |
Systematic review of violence prevention economic evaluations, 2000-2019
Peterson C , Kearns MC . Am J Prev Med 2021 60 (4) 552-562 CONTEXT: Health economic evaluations (e.g., cost-effectiveness analysis) can guide the efficient use of resources to improve health outcomes. This study aims to summarize the content and quality of interpersonal violence prevention economic evaluations. EVIDENCE ACQUISITION: In 2020, peer-reviewed journal articles published during 2000-2019 focusing on high-income countries were identified using index terms in multiple databases. Study content, including violence type prevented (e.g., child abuse and neglect), outcome measure (e.g., abusive head trauma clinical diagnosis), intervention type (e.g., education program), study methods, and results were summarized. Studies reporting on selected key methods elements essential for study comparison and public health decision making (e.g., economic perspective, time horizon, discounting, currency year) were assessed. EVIDENCE SYNTHESIS: A total of 26 economic evaluation studies were assessed, most of which reported that assessed interventions yielded good value for money. Physical assault in the community and child abuse and neglect were the most common violence types examined. Studies applied a wide variety of cost estimates to value avoided violence. Less than two thirds of the studies reported all the key methods elements. CONCLUSIONS: Comprehensive data collection on violence averted and intervention costs in experimental settings can increase opportunities to identify interventions that generate long-term value. More comprehensive estimates of the cost of violence can improve opportunities to demonstrate how prevention investment can be offset through avoided future costs. Better adherence to health economic evaluation reporting standards can enhance comparability across studies and may increase the likelihood that economic evidence is included in violence prevention decision making. |
Receipt of and spending on cessation medication among US adults with employer-sponsored health insurance, 2010 and 2017
Shrestha SS , Xu X , Wang X , Babb SD , Armour BS , King BA , Trivers KF . Public Health Rep 2021 136 (6) 736-744 OBJECTIVE: Studies examining the use of smoking cessation treatment and related spending among enrollees with employer-sponsored health insurance are dated and limited in scope. We assessed changes in annual receipt of and spending on cessation medications approved by the US Food and Drug Administration (FDA) among tobacco users with employer-sponsored health insurance from 2010 to 2017. METHODS: We analyzed data on 439 865 adult tobacco users in 2010 and 344 567 adult tobacco users in 2017 from the IBM MarketScan Commercial Database. We used a negative binomial regression to estimate changes in receipt of cessation medication (number of fills and refills and days of supply). We used a generalized linear model to estimate spending (total, employers', and out of pocket). In both models, covariates included year, age, sex, residence, and type of health insurance plan. RESULTS: From 2010 to 2017, the percentage of adult tobacco users with employer-sponsored health insurance who received any cessation medication increased by 2.4%, from 15.7% to 16.1% (P < .001). Annual average number of fills and refills per user increased by 15.1%, from 2.5 to 2.9 (P < .001) and days of supply increased by 26.4%, from 81.9 to 103.5 (P < .001). The total annual average spending per user increased by 53.6%, from $286.40 to $440.00 (P < .001). Annual average out-of-pocket spending per user decreased by 70.9%, from $70.80 to $20.60 (P < .001). CONCLUSIONS: Use of smoking cessation medications is low among smokers covered by employer-sponsored health insurance. Opportunities exist to further increase the use of cessation medications by promoting the use of evidence-based cessation treatments and reducing barriers to coverage, including out-of-pocket costs. |
MALDI-TOF MS: an alternative approach for ribotyping Clostridioides difficile isolates in Brazil.
Gouveia CL , Abreu PTC , Hercules M , John B , Cavalcanti Pilotto DRM , de Oliveira FE . Anaerobe 2021 69 102351 Clostridioides difficile is an important organism causing healthcare-associated infections. It has been documented that specific strains caused multiple outbreaks globally, and patients infected with those strains are more likely to develop severe C. difficile infection (CDI). With the appearance of a variant strain, BI/NAP1 ribotype 027, responsible for several outbreaks and high mortality rates worldwide, the epidemiology of CDI changed drastically in the United States, Europe, and some Latin American countries. Although the epidemic strain 027 has not yet been detected in Brazil, there are ribotypes exclusively found in the country, such as 131, 132, 133, 135, 142 and 143, which are responsible for outbreaks in Brazilian hospitals and nursing homes. Although PCR ribotyping is the most widely used method in epidemiological studies of C. difficile, it is not available in Brazil. This study aimed to develop and validate an in-house database for detecting the C. difficile ribotypes usually involved in CDI in Brazilian hospitals by using MALDI-TOF MS. A database with 19 different ribotypes, 13 with worldwide circulation and 6 Brazilian-restricted, was created based on 27 spectral readings of each ribotype. After BioNumerics analysis, neighbor-joining trees revealed that spectra were distributed in clusters according to ribotypes, showing that MALDI-TOF MS could discriminate all 19 ribotypes. Moreover, each ribotype showed a different profile, with 42 biomarkers detected in total. Based on their intensity and occurrence, 13 biomarkers were chosen to compose ribotype-specific profiles, and in silico analysis showed that most of these biomarkers were uncharacterized proteins or well-conserved peptides, such as ribosomal proteins. 
A double-blind assessment using the 13 biomarkers correctly assigned the ribotype in 73% of the spectra analyzed, with 94% to 100% of correct hits for 027 and for Brazilian ribotypes. Although further analyses are required, our results show that MALDI-TOF MS might be a reliable, fast and feasible alternative for epidemiological surveillance of C. difficile in Brazil. |
Evaluation of methods for detection of β-lactamase production in MSSA.
Skov R , Lonsway DR , Larsen J , Larsen AR , Samulioniené J , Limbago BM . J Antimicrob Chemother 2021 76 (6) 1487-1494 OBJECTIVES: Correct determination of penicillin susceptibility is pivotal for using penicillin in the treatment of Staphylococcus aureus infections. This study examines the performance of MIC determination, disc diffusion and a range of confirmatory tests for detection of penicillin susceptibility in S. aureus. METHODS: A total of 286 consecutive penicillin-susceptible S. aureus blood culture isolates as well as a challenge set of 62 MSSA isolates were investigated for the presence of the blaZ gene by PCR and subjected to penicillin-susceptibility testing using broth microdilution MIC determination, disc diffusion including reading of the zone edge, two nitrocefin tests and the cloverleaf test. RESULTS: Using PCR-based detection of blaZ as the gold standard, both broth microdilution MIC testing and disc diffusion testing resulted in relatively low accuracy (82%-93%), with sensitivity ranging from 49% to 93%. Among the confirmatory tests, the cloverleaf test performed with 100% accuracy, while zone edge interpretation and nitrocefin-based tests increased the sensitivity of β-lactamase detection to 96%-98% and 82%-96% when using MIC determination or disc diffusion as the primary test, respectively. CONCLUSIONS: This investigation showed that reliable and accurate detection of β-lactamase production in S. aureus can be obtained by MIC determination or penicillin disc diffusion followed by interpretation of the zone edge as a confirmatory test for apparently penicillin-susceptible isolates. The more cumbersome cloverleaf test can also be used. Nitrocefin-based tests should not be used as the only test for confirmation of a presumptive β-lactamase-negative isolate. |
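The sensitivity and accuracy figures above follow from standard 2x2 contingency arithmetic against the blaZ PCR gold standard. A minimal sketch; the counts used are hypothetical, not the study's data:

```python
# Diagnostic-test metrics against a gold standard (here, blaZ PCR).
# The counts passed in below are hypothetical, for illustration only.
def diagnostic_metrics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)               # share of true positives detected
    specificity = tn / (tn + fp)               # share of true negatives detected
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, specificity, accuracy

sens, spec, acc = diagnostic_metrics(tp=90, fp=20, fn=10, tn=80)
# sens = 0.90, spec = 0.80, acc = 0.85
```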
Rates and causative pathogens of surgical site infections attributed to liver transplant procedures and other hepatic, biliary or pancreatic procedures, 2015-2018
Chea N , Sapiano MRP , Zhou L , Epstein L , Guh A , Edwards JR , Allen-Bridson K , Russo V , Watkins J , Pouch SM , Magill SS . Transpl Infect Dis 2021 23 (4) e13589 Liver transplant recipients are at high risk for surgical site infections (SSIs). Limited data are available on SSI epidemiology following liver transplant procedures (LTPs). We analyzed data on SSIs from 2015-2018 reported to CDC's National Healthcare Safety Network to determine rates, pathogen distribution, and antimicrobial resistance after LTPs and other hepatic, biliary, or pancreatic procedures (BILIs). LTP and BILI SSI rates were 5.7% and 5.9%, respectively. The odds of SSI after LTP were lower than after BILI (adjusted odds ratio = 0.70, 95% confidence interval 0.57-0.85). Among LTP SSIs, 43.1% were caused by Enterococcus spp., 17.2% by Candida spp., and 15.0% by coagulase-negative Staphylococcus spp. (CNS). Percentages of SSIs caused by Enterococcus faecium or CNS were higher after LTPs than BILIs, whereas percentages of SSIs caused by Enterobacteriaceae, Enterococcus faecalis, or viridans streptococci were higher after BILIs. Antimicrobial resistance was common in LTP SSI pathogens, including E. faecium (69.4% vancomycin-resistant); E. coli (68.8% fluoroquinolone-non-susceptible, 44.7% extended spectrum cephalosporin [ESC]-non-susceptible); and K. pneumoniae and K. oxytoca (39.4% fluoroquinolone-non-susceptible, 54.5% ESC-non-susceptible). National LTP SSI pathogen and resistance data can help prioritize studies to determine effective interventions to prevent SSIs and reduce antimicrobial resistance in liver transplant recipients. |
Core components of infection prevention and control programs at the facility level in Georgia: key challenges and opportunities
Deryabina A , Lyman M , Yee D , Gelieshvilli M , Sanodze L , Madzgarashvili L , Weiss J , Kilpatrick C , Rabkin M , Skaggs B , Kolwaite A . Antimicrob Resist Infect Control 2021 10 (1) 39 BACKGROUND: The Georgia Ministry of Labor, Health, and Social Affairs is working to strengthen its Infection Prevention and Control (IPC) Program, but until recently has lacked an assessment of performance gaps and implementation challenges faced by hospital staff. METHODS: In 2018, hospitals were assessed using a World Health Organization (WHO) adapted tool aimed at implementing the WHO's IPC Core Components. The study included site assessments at 41 of Georgia's 273 hospitals, followed by structured interviews with 109 hospital staff, validation observations of IPC practices, and follow-up document reviews. RESULTS: No hospital had a comprehensive IPC program; many lacked defined objectives, workplans, targets, and budgets. All hospitals had at least one dedicated IPC staff member; 66% of hospitals had IPC staff with some formal IPC training; 78% of hospitals had IPC guidelines; and 55% had facility-specific standard operating procedures. None of the hospitals conducted structured monitoring of IPC compliance, and only 44% of hospitals used IPC monitoring results to make unit- or facility-specific IPC improvement plans. 54% of hospitals had clearly defined priority healthcare-associated infections (HAIs), standard case definitions, and data collection methods in their HAI surveillance systems. 85% of hospitals had access to a microbiology laboratory. All reported having posters or other tools to promote hand hygiene; 29% had them for injection safety. 68% of hospitals had functioning hand-hygiene stations available at all points of care. 88% had single-patient isolation rooms; 15% also had rooms for cohorting patients. 71% reported having an appropriate waste management system. 
CONCLUSIONS: Among the recommended WHO IPC core components, the existing programs, infrastructure, IPC staffing, workload and supplies present within Georgian hospitals should allow for implementation of effective IPC. Development and dissemination of IPC Guidelines, implementation of an effective IPC training system and systematic monitoring of IPC practices will be an important first step towards implementing targeted IPC improvement plans in hospitals. |
First-Dose COVID-19 Vaccination Coverage Among Skilled Nursing Facility Residents and Staff.
Gharpure R , Patel A , Link-Gelles R . JAMA 2021 325 (16) 1670-1671 Residents and staff of long-term care facilities (LTCFs) have been prioritized by the Advisory Committee on Immunization Practices for vaccination in the initial COVID-19 vaccine allocation phase in the US.1 Residents and staff of LTCFs, who live and work in congregate settings, are at increased risk for infection with SARS-CoV-2,2 and residents, given their advanced age and/or underlying chronic medical conditions, are at increased risk for severe outcomes.3 |
First Month of COVID-19 Vaccine Safety Monitoring - United States, December 14, 2020-January 13, 2021.
Gee J , Marquez P , Su J , Calvert GM , Liu R , Myers T , Nair N , Martin S , Clark T , Markowitz L , Lindsey N , Zhang B , Licata C , Jazwa A , Sotir M , Shimabukuro T . MMWR Morb Mortal Wkly Rep 2021 70 (8) 283-288 Two coronavirus disease 2019 (COVID-19) vaccines are currently authorized for use in the United States. The Food and Drug Administration (FDA) issued Emergency Use Authorization (EUA) for the Pfizer-BioNTech COVID-19 vaccine on December 11, 2020, and for the Moderna COVID-19 vaccine on December 18, 2020; each is administered as a 2-dose series. The Advisory Committee on Immunization Practices issued interim recommendations for Pfizer-BioNTech and Moderna COVID-19 vaccines on December 12, 2020 (1), and December 19, 2020 (2), respectively; initial doses were recommended for health care personnel and long-term care facility (LTCF) residents (3). Safety monitoring for these vaccines has been the most intense and comprehensive in U.S. history, using the Vaccine Adverse Event Reporting System (VAERS), a spontaneous reporting system, and v-safe,* an active surveillance system, during the initial implementation phases of the COVID-19 national vaccination program (4). CDC conducted descriptive analyses of safety data from the first month of vaccination (December 14, 2020-January 13, 2021). During this period, 13,794,904 vaccine doses were administered, and VAERS received and processed(†) 6,994 reports of adverse events after vaccination, including 6,354 (90.8%) that were classified as nonserious and 640 (9.2%) as serious.(§) The symptoms most frequently reported to VAERS were headache (22.4%), fatigue (16.5%), and dizziness (16.5%). A total of 113 deaths were reported to VAERS, including 78 (65%) among LTCF residents; available information from death certificates, autopsy reports, medical records, and clinical descriptions from VAERS reports and health care providers did not suggest any causal relationship between COVID-19 vaccination and death. 
Rare cases of anaphylaxis after receipt of either vaccine were reported (4.5 reported cases per million doses administered). Among persons who received the Pfizer-BioNTech vaccine, reactions reported to the v-safe system were more frequent after receipt of the second dose than after the first. The initial postauthorization safety profiles of the two COVID-19 vaccines in current use did not indicate evidence of unexpected serious adverse events. These data provide reassurance and helpful information regarding what health care providers and vaccine recipients might expect after vaccination. |
Significant declines in juvenile onset recurrent respiratory papillomatosis following HPV vaccine introduction in the United States
Meites E , Stone L , Amiling R , Singh V , Unger ER , Derkay C , Markowitz LE . Clin Infect Dis 2021 73 (5) 885-890 BACKGROUND: Juvenile onset recurrent respiratory papillomatosis (JORRP) is a rare and serious disease caused by human papillomavirus (HPV) presumably acquired during vaginal delivery. HPV vaccination of females through age 26 years, recommended in the United States since 2006, can prevent HPV transmission. We assessed trends in JORRP cases before and after HPV vaccine introduction in the United States. METHODS: Case-patients were identified from 26 pediatric otolaryngology centers in 23 U.S. states. Demographics and clinical history were abstracted from medical records. Case-patients were grouped by year of birth, and birth-cohort incidences were calculated using number of births from either national or state-level natality data from the 23 states. We calculated incidence rate ratios (IRR) and 95% confidence intervals (CI) in 2-year intervals. RESULTS: We identified 576 U.S. JORRP case-patients born in 2004-2013. Median age at diagnosis was 3.4 years (interquartile range: 1.9, 5.5). Number of identified JORRP case-patients declined from a baseline of 165 born in 2004-2005 to 36 born in 2012-2013. Incidence of JORRP per 100,000 births using national data declined from 2.0 cases in 2004-2005 to 0.5 cases in 2012-2013 (IRR=0.2, CI=0.1-0.4); incidence using state-level data declined from 2.9 cases in 2004-2005 to 0.7 cases in 2012-2013 (IRR=0.2, CI=0.1-0.4). CONCLUSIONS: Over a decade, numbers of JORRP case-patients and incidences declined significantly. Incidences calculated using national denominator data are likely underestimates; those calculated using state-level denominator data could be overestimates. These declines are most likely due to HPV vaccination. Increasing vaccination uptake could lead to elimination of this HPV-related disease. |
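The incidence rate ratios above compare birth-cohort incidences (cases per 100,000 births). A minimal sketch of the usual calculation, with a Wald confidence interval on the log scale; the case counts come from the abstract, but the birth denominators below are assumed round numbers, not the study's natality data:

```python
import math

def incidence_per_100k(cases, births):
    # Birth-cohort incidence per 100,000 births
    return 1e5 * cases / births

def irr_with_ci(cases1, denom1, cases0, denom0, z=1.96):
    # Ratio of two Poisson rates with a Wald CI on the log scale
    irr = (cases1 / denom1) / (cases0 / denom0)
    se_log = math.sqrt(1 / cases1 + 1 / cases0)
    lower = math.exp(math.log(irr) - z * se_log)
    upper = math.exp(math.log(irr) + z * se_log)
    return irr, lower, upper

# 36 cases born 2012-2013 vs 165 cases born 2004-2005 (from the abstract);
# birth denominators are hypothetical round numbers for illustration.
irr, lo, hi = irr_with_ci(36, 7_200_000, 165, 8_250_000)
```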
Chainchecker: An application to visualise and explore transmission chains for Ebola virus disease.
Gaythorpe K , Morris A , Imai N , Stewart M , Freeman J , Choi M . PLoS One 2021 16 (2) e0247002 2020 saw the continuation of the second largest outbreak of Ebola virus disease (EVD) in history. Determining epidemiological links between cases is a key part of outbreak control. However, due to the large quantity of data and subsequent data entry errors, inconsistencies in potential epidemiological links are difficult to identify. We present chainchecker, an online and offline Shiny application which visualises, curates and verifies transmission chain data. The application includes the calculation of exposure windows for individual cases of EVD based on user-defined incubation periods and user-specified symptom profiles. It has an upload function for viral hemorrhagic fever data and utility for additional entries. These data may then be visualised as a transmission tree with inconsistent links highlighted. Finally, there is utility for cluster analysis and the ability to highlight nosocomial transmission. chainchecker is an R Shiny application which has an offline version for use with VHF (viral hemorrhagic fever) databases or linelists. The software is available at https://shiny.dide.imperial.ac.uk/chainchecker, a web-based application that links to the desktop application available for download and the github repository, https://github.com/imperialebola2018/chainchecker. |
Evaluating a mobile phone-delivered text message reminder intervention to reduce infant vaccination dropout in Arua, Uganda: Protocol for a randomized controlled trial
Ehlman DC , Magoola J , Tanifum P , Wallace AS , Behumbiize P , Mayanja R , Luzze H , Yukich J , Daniels D , Mugenyi K , Baryarama F , Ayebazibwe N , Conklin L . JMIR Res Protoc 2021 10 (2) e17262 BACKGROUND: Globally, suboptimal vaccine coverage is a public health concern. According to Uganda's 2016 Demographic and Health Survey, only 49% of 12- to 23-month-old children received all recommended vaccinations by 12 months of age. Innovative ways are needed to increase coverage, reduce dropout, and increase awareness among caregivers to bring children for timely vaccination. OBJECTIVE: This study evaluates a personalized, automated caregiver mobile phone-delivered text message reminder intervention to reduce the proportion of children who start but do not complete the vaccination series for children aged 12 months and younger in select health facilities in Arua district. METHODS: A two-arm, multicenter, parallel group randomized controlled trial was conducted in four health facilities providing vaccination services in and around the town of Arua. Caregivers of children between 6 weeks and 6 months of age at the time of their first dose of pentavalent vaccine (Penta1; containing diphtheria, tetanus, pertussis, hepatitis B, and Haemophilus influenzae type b antigens) were recruited and interviewed. All participants received the standard of care, defined as the health worker providing child vaccination home-based records to caregivers as available and providing verbal instruction of when to return for the next visit. At the end of each day, caregivers and their children were randomized by computer either to receive or not receive personalized, automated text message reminders for their subsequent vaccination visits according to the national schedule. Text message reminders for Penta2 were sent 2 days before, on the day of, and 2 days after the scheduled vaccination visit. 
Reminders for Penta3 and the measles-containing vaccine were sent on the scheduled day of vaccination and 5 and 7 days after the scheduled day. Study personnel conducted postintervention follow-up interviews with participants at the health facilities during the children's measles-containing vaccine visit. In addition, focus group discussions were conducted to assess caregiver acceptability of the intervention, economic data were collected to evaluate the incremental costs and cost-effectiveness of the intervention, and health facility record review forms were completed to capture service delivery process indicators. RESULTS: Of the 3485 screened participants, 1961 were enrolled from a sample size of 1962. Enrollment concluded in August 2016. Follow-up interviews of study participants, including data extraction from the children's vaccination cards, data extraction from the health facility immunization registers, completion of the health facility record review forms, and focus group discussions were completed by December 2017. The results are expected to be released in 2021. CONCLUSIONS: Prompting health-seeking behavior with reminders has been shown to improve health intervention uptake. Mobile phone ownership continues to grow in Uganda, so their use in vaccination interventions such as this study is logical and should be evaluated with scientifically rigorous study designs. TRIAL REGISTRATION: ClinicalTrials.gov NCT04177485; https://clinicaltrials.gov/ct2/show/NCT04177485. INTERNATIONAL REGISTERED REPORT IDENTIFIER (IRRID): DERR1-10.2196/17262. |
Association of self-reported depression and anger with law enforcement officers' physical abuse of Black men in 4 Georgia counties, 2011
Iqbal SA , Truman BI , Crosby AE . J Natl Med Assoc 2021 113 (4) 371-381 INTRODUCTION: The association between the behavioral affect of black men and law enforcement officers' physical abuse of those men is not well understood. This analysis measures the association between self-reported negative affect behavior (anger or depression) by the men and physical abuse by law enforcement officers, controlling for demographic and behavioral attributes. METHODS: A single point-in-time cross-sectional survey was conducted in 2011 through random-digit telephone dialing among a sample of English-speaking black men aged 18-65 years in 4 Georgia (USA) counties. Associations among the outcome, self-reported history of physical abuse by law enforcement officers, and the predictor variables of interest (self-reported anger or depression) were assessed through multivariable logistic regression. Other independent variables of interest measured were age; country of origin; parental country of origin; education; income; employment status; previous residency in a juvenile, jail, or prison facility; coping styles; and self-reported gender role and racism stress levels. RESULTS: Of the 633 survey participants who had interacted with law enforcement officers within the past 5 years, 129 (20.4%) reported physical abuse by law enforcement officers. Three factors had statistically significant, independent associations with reported law enforcement officer physical abuse: high levels of depression stratified by often or sometimes coping with stress through anger (adjusted odds ratio [aOR] = 4.9; 95% confidence interval [CI]: 1.4-16.9), previous residency in a jail or prison (aOR = 2.3; 95% CI: 1.8-3.1), and higher levels of exposure to racism (aOR [high levels of racism] = 15.0; 95% CI: 6.7-33.7 and aOR [medium levels of racism] = 6.5; 95% CI: 3.4-12.3). 
CONCLUSION: Cohort studies are needed to determine if a black man's negative coping style, history of incarceration or exposure to racism is causally related to his history of physical abuse by a law enforcement officer. |
Prevalence of adverse childhood experiences (ACEs) and associated health risks and risk behaviors among young women and men in Honduras
Kappel RH , Livingston MD , Patel SN , Villaveces A , Massetti GM . Child Abuse Negl 2021 115 104993 BACKGROUND: Adverse Childhood Experiences (ACEs) are potentially traumatic childhood events associated with negative health outcomes. Limited data on ACEs exist from low- and middle-income countries (LMICs), and no ACEs studies have been done in Honduras. OBJECTIVE: This study assessed the prevalence of ACEs in Honduras and associated health risks and risk behaviors among young adults. PARTICIPANTS AND SETTING: Data from the 2017 Honduras Violence Against Children and Youth Survey (VACS) were used. Analyses were restricted to participants ages 18-24 years (n = 2701). METHODS: This study uses nationally representative VACS data to estimate the weighted prevalence of ACEs (physical, emotional, and sexual violence; witnessing violence; parental migration). Logistic regression analyses assessed the relationship between individual ACEs, cumulative ACEs, and health risks and risk behaviors (psychological distress; suicide ideation or self-harm; binge drinking; smoking; drug use; STIs; early pregnancy). Chi-square tests examined differences by sex. RESULTS: An estimated 77% of 18-24 year olds in Honduras experienced at least 1 ACE and 39% experienced 3+ ACEs. Women experienced significantly more sexual, emotional, and physical violence compared to men. Compared to youth with no ACEs, those with 1-2 ACEs and 3+ ACEs had 1.8 and 2.8 times the odds of psychological distress, 2.3 and 6.4 times the odds of suicidal ideation and self-harm, and 1.7 and 1.9 times the odds of smoking, respectively, adjusting for age, education, and food insecurity. Physical violence victimization and witnessing violence in the community were associated with increased odds of all health risks and risk behaviors. CONCLUSIONS: The high prevalence of ACEs and associated negative health risks and risk behaviors in this population supports the need for prevention and early intervention for ACEs. |
Changes in suicide rates - United States, 2018-2019
Stone DM , Jones CM , Mack KA . MMWR Morb Mortal Wkly Rep 2021 70 (8) 261-268 Suicide is the 10th leading cause of death in the United States overall, and the second and fourth leading cause among persons aged 10-34 and 35-44 years, respectively (1). In just over 2 decades (1999-2019), approximately 800,000 deaths were attributed to suicide, with a 33% increase in the suicide rate over the period (1). In 2019, a total of 12 million adults reported serious thoughts of suicide during the past year, 3.5 million planned a suicide, and 1.4 million attempted suicide (2). Suicides and suicide attempts in 2019 led to a lifetime combined medical and work-loss cost (i.e., the costs that accrue from the time of the injury through the course of a person's expected lifetime) of approximately $70 billion (https://wisqars.cdc.gov:8443/costT/). From 2018 to 2019, the overall suicide rate declined for the first time in over a decade (1). To understand how the decline varied among different subpopulations by demographic and other characteristics, CDC analyzed changes in counts and age-adjusted suicide rates from 2018 to 2019 by demographic characteristics, county urbanicity, mechanism of injury, and state. Z-tests and 95% confidence intervals were used to assess statistical significance. Suicide rates declined by 2.1% overall, by 3.2% among females, and by 1.8% among males. Significant declines occurred, overall, in five states. Other significant declines were noted among subgroups defined by race/ethnicity, age, urbanicity, and suicide mechanism. These declines, although encouraging, were not uniform, and several states experienced significant rate increases. 
A comprehensive approach to prevention that uses data to drive decision-making, implements prevention strategies from CDC's Preventing Suicide: A Technical Package of Policy, Programs, and Practices with the best available evidence, and targets the multiple risk factors associated with suicide, especially in populations disproportionately affected, is needed to build on initial progress from 2018 to 2019 (3). |
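The subgroup comparisons in this report rest on z-tests for differences between age-adjusted rates. A minimal sketch of that test, assuming independent rate estimates; the rates and standard errors below are hypothetical placeholders, not the report's published figures:

```python
import math

def rate_change_z(rate1, se1, rate2, se2):
    """Z-statistic for the difference between two age-adjusted rates,
    assuming the two estimates are independent."""
    return (rate2 - rate1) / math.sqrt(se1 ** 2 + se2 ** 2)

def is_significant(z, critical=1.96):
    """Two-sided test at the 5% level (95% confidence)."""
    return abs(z) >= critical

# Hypothetical age-adjusted suicide rates per 100,000 with standard errors
z = rate_change_z(rate1=14.2, se1=0.06, rate2=13.9, se2=0.06)
```

Here a drop from 14.2 to 13.9 per 100,000 with small standard errors yields a significant negative z-statistic; with larger standard errors (as in small states), the same absolute change would not reach significance.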
Performance and Implementation Evaluation of the Abbott BinaxNOW Rapid Antigen Test in a High-throughput Drive-through Community Testing Site in Massachusetts.
Pollock NR , Jacobs JR , Tran K , Cranston AE , Smith S , O'Kane CY , Roady TJ , Moran A , Scarry A , Carroll M , Volinsky L , Perez G , Patel P , Gabriel S , Lennon NJ , Madoff LC , Brown C , Smole SC . J Clin Microbiol 2021 59 (5) Background: Rapid diagnostic tests (RDTs) for SARS-CoV-2 antigens (Ag) that can be performed at point-of-care (POC) can supplement molecular testing and help mitigate the COVID-19 pandemic. Deployment of an Ag RDT requires an understanding of its operational and performance characteristics under real-world conditions and in relevant subpopulations. We evaluated the Abbott BinaxNOW™ COVID-19 Ag Card in a high-throughput, drive-through, free community testing site in Massachusetts (MA) using anterior nasal (AN) swab RT-PCR for clinical testing. Methods: Individuals presenting for molecular testing in two of seven lanes were offered the opportunity to also receive BinaxNOW testing. Dual AN swabs were collected from symptomatic and asymptomatic children (≤ 18 years) and adults. BinaxNOW testing was performed in a testing pod with temperature/humidity monitoring. One individual performed testing and official result reporting for each test, but most tests had a second independent reading to assess inter-operator agreement. Positive BinaxNOW results were scored as faint, medium, or strong. Positive BinaxNOW results were reported to patients by phone and they were instructed to isolate pending RT-PCR results. The paired RT-PCR result was the reference for sensitivity and specificity calculations. Results: Of 2482 participants, 1380 adults and 928 children had paired RT-PCR/BinaxNOW results and complete symptom data. 974/1380 (71%) adults and 829/928 (89%) children were asymptomatic. BinaxNOW had 96.5% (95% confidence interval [CI] 90.0-99.3) sensitivity and 100% (98.6-100.0) specificity in adults within 7 days of symptoms, and 84.6% (65.1-95.6) sensitivity and 100% (94.5-100.0) specificity in children within 7 days of symptoms. 
Sensitivity and specificity in asymptomatic adults were 70.2% (56.6-81.6) and 99.6% (98.9-99.9), respectively, and in asymptomatic children were 65.4% (55.6-74.4) and 99.0% (98.0-99.6), respectively. By cycle threshold (Ct) value cutoff, sensitivity in all subgroups combined (n=292 RT-PCR-positive individuals) was 99.3% with Ct ≤25, 95.8% with ≤30, and 81.2% with ≤35. Twelve false positive BinaxNOW results (out of 2308 tests) were observed; in all twelve, the test bands were faint but otherwise normal, and were noted by both readers. One invalid BinaxNOW result was identified. Inter-operator agreement (positive versus negative BinaxNOW result) was 100% (n = 2230/2230 double reads). Each operator was able to process 20 RDTs per hour. In a separate set of 30 specimens (from individuals with symptoms ≤7 days) run at temperatures below the manufacturer's recommended range (46-58.5°F), sensitivity was 66.7% and specificity 95.2%. Conclusions: BinaxNOW had very high specificity in both adults and children and very high sensitivity in newly symptomatic adults. Overall, 95.8% sensitivity was observed with Ct ≤ 30. These data support public health recommendations for use of the BinaxNOW test in adults with symptoms for ≤7 days without RT-PCR confirmation. Excellent inter-operator agreement indicates that an individual can perform and read the BinaxNOW test alone. A skilled laboratorian can perform and read 20 tests per hour. Careful attention to temperature is critical. |
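The sensitivity, specificity, and binomial confidence intervals reported throughout this evaluation can be computed as in the sketch below. The 2x2 counts are illustrative, not the study's exact tabulation, and a Wilson score interval stands in for whatever exact method the authors used:

```python
import math

def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity against the RT-PCR reference."""
    return tp / (tp + fn), tn / (tn + fp)

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% CI for a binomial proportion."""
    p = successes / n
    denom = 1 + z ** 2 / n
    center = (p + z ** 2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return center - half, center + half

# Illustrative counts for a small symptomatic subgroup
sensitivity, specificity = sens_spec(tp=28, fn=1, tn=970, fp=0)
lo, hi = wilson_ci(28, 29)
```

The wide interval around 28/29 illustrates why the pediatric symptomatic estimate (based on few positives) carries a much broader CI than the adult one.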
A real-time multiplex PCR assay for detection of the causative agents of rat bite fever, Streptobacillus moniliformis and zoonotic Streptobacillus species.
Kelly AJ , Ivey ML , Gulvik CA , Humrighouse BW , McQuiston JR . Diagn Microbiol Infect Dis 2021 100 (2) 115335 Rat bite fever (RBF) caused by Streptobacillus moniliformis has been described as a diagnostic challenge. While it has a favorable prognosis with treatment, timely diagnosis is hindered by the lack of culture-free identification methods. Here we present a multiplex real-time PCR assay that detects the zoonotic Streptobacillus spp. as well as differentiate the primary causative agent of RBF, Streptobacillus moniliformis. The performance of this assay was evaluated using mock clinical specimens for blood, serum, and urine. Analytical sensitivity was determined to be 3-4 genome equivalents (GE)/µl for the zoonotic Streptobacillus spp. target, and 1-2 GE/µl for the S. moniliformis specific target. The assay correctly detected only the intended targets with no cross-reactivity identified. The pathogen was detected in all spiked matrices and not detected in the negative non-spiked specimens. This rapid diagnostic assay may permit quicker diagnosis of RBF patients. |
Lung toxicity and gene expression changes in response to whole-body inhalation exposure to cellulose nanocrystal in rats.
Joseph P , Umbright CM , Roberts JR , Cumpston JL , Orandle MS , McKinney WG , Sager TM . Inhal Toxicol 2021 33 (2) 1-15 OBJECTIVE: Human exposure to cellulose nanocrystal (CNC) is possible during the production and/or use of products containing CNC. The objectives of the current study were to determine the lung toxicity of CNC and the underlying molecular mechanisms of the toxicity. METHODS: Rats were exposed to air or CNC (20 mg/m³, 6 hours/day, 14 days) by whole-body inhalation and lung toxicity and global gene expression profile were determined. RESULTS: Significant increases in lactate dehydrogenase activity, pro-inflammatory cytokine levels, phagocyte oxidant production, and macrophage and neutrophil counts were detected in the bronchoalveolar lavage cells or fluid from the CNC exposed rats. Mild lung histological changes, such as the accumulation of macrophages and neutrophils, were detected in the CNC exposed rats. Gene expression profiling by next generation sequencing identified 531 genes whose expressions were significantly different in the lungs of the CNC exposed rats, compared with the controls. Bioinformatic analysis of the lung gene expression data identified significant enrichment in several biological functions and canonical pathways including those related to inflammation (cellular movement, immune cell trafficking, inflammatory diseases and response, respiratory disease, complement system, acute phase response, leukocyte extravasation signaling, granulocyte and agranulocyte adhesion and diapedesis, IL-10 signaling, and phagosome formation and maturation) and oxidative stress (NRF2-mediated oxidative stress response, production of nitric oxide and reactive oxygen species in macrophages, and free radical scavenging). CONCLUSION: Our data demonstrated that inhalation exposure of rats to CNC resulted in lung toxicity mediated mainly through the induction of inflammation and oxidative stress. |
Detection of Tick-borne Bacteria from Whole Blood Using 16S Ribosomal RNA Gene PCR Followed by Next-Generation Sequencing.
Rodino KG , Wolf MJ , Sheldon S , Kingry LC , Petersen JM , Patel R , Pritt BS . J Clin Microbiol 2021 59 (5) Reported cases of tick-borne diseases have steadily increased for more than a decade. In the United States, a majority of tick-borne infections are caused by bacteria. Clinical diagnosis may be challenging as tick-borne diseases can present with similar symptoms. Laboratory diagnosis has historically relied on serologic methods, which have limited utility during the acute phase of disease. Pathogen-specific molecular methods have improved early diagnosis, but can be expensive when bundled together and miss unexpected or novel pathogens. To address these shortcomings, we developed a 16S ribosomal RNA (rRNA) gene PCR with next-generation sequencing approach to detect tick-borne bacteria in whole blood. A workflow was optimized by comparing combinations of two extractions platforms and two primer sets, ultimately pursuing DNA extraction from blood with the MagNA Pure 96 and PCR amplification using dual-priming oligonucleotide primers specific to the V1-V3 region of the 16S rRNA gene. The amplified product underwent modified Illumina 16S metagenomics sequencing library preparation and sequencing on a MiSeq V2 Nano flow cell, with data analysis using Pathogenomix RipSeq NGS software. Results with the developed method were compared to those from a V1-V2 16S rRNA gene primer set described by the Centers for Disease Control and Prevention (CDC). The V1-V3 assay demonstrated equivalent performance to the CDC assay, with each method showing concordance with targeted PCR results in 31 of 32 samples, and detecting 22 of 23 expected organisms. These data demonstrate the potential for using a broad-range bacterial detection approach for diagnosis of tick-borne bacterial infection from blood. |
Genomic surveillance and improved molecular typing of Bordetella pertussis using wgMLST
Weigand MR , Peng Y , Pouseele H , Kania D , Bowden KE , Williams MM , Tondella ML . J Clin Microbiol 2021 59 (5) Multi-Locus Sequence Typing (MLST) provides allele-based characterization of bacterial pathogens in a standardized framework. However, classical MLST schemes for Bordetella pertussis, the causative agent of whooping cough, seldom reveal diversity among the small number of gene targets and thereby fail to delineate population structure. To improve discriminatory power of allele-based molecular typing of B. pertussis, we have developed a whole-genome MLST (wgMLST) scheme from 225 reference-quality genome assemblies. Iterative refinement and allele curation resulted in a scheme of 3,506 coding sequences and covering 81.4% of the B. pertussis genome. This wgMLST scheme was further evaluated with data from a convenience sample of 2,389 B. pertussis isolates sequenced on Illumina instruments, including isolates from known outbreaks and epidemics previously characterized by existing molecular assays, as well as replicates collected from individual patients. wgMLST demonstrated concordance with whole-genome single nucleotide polymorphisms (SNP) profiles, accurately resolved outbreak and sporadic cases in a retrospective comparison, and clustered replicate isolates collected from individual patients during diagnostic confirmation. Additionally, a re-analysis of isolates from two statewide epidemics using wgMLST reconstructed the population structures of circulating strains with increased resolution, revealing new clusters of related cases. Comparison with an existing core-genome (cgMLST) scheme highlights the stable gene content of this bacterium and forms the initial foundation for necessary standardization. These results demonstrate the utility of wgMLST for improving B. pertussis characterization and genomic surveillance during the current pertussis disease resurgence. |
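Allele-based comparison of the kind wgMLST performs reduces, at its core, to counting loci whose allele calls differ between two profiles. A minimal sketch; the missing-locus handling shown is a common convention, not necessarily this scheme's exact rule:

```python
def allele_distance(profile_a, profile_b):
    """Number of loci with differing allele calls between two wgMLST
    profiles; loci uncalled (None) in either profile are ignored."""
    return sum(a != b
               for a, b in zip(profile_a, profile_b)
               if a is not None and b is not None)

# Toy 6-locus profiles: allele IDs per locus, None = locus not called
isolate1 = [1, 4, 2, 9, None, 3]
isolate2 = [1, 4, 7, 9, 5, 3]
```

With 3,506 coding-sequence targets rather than the handful in classical MLST, even closely related isolates accumulate enough allele differences to resolve outbreak clusters, which is the discriminatory-power gain the abstract describes.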
Development and validation of a biomonitoring method to measure As, Cr, and Ni in human urine samples by ICP-UCT-MS
Jones DR , Jarrett JM , Stukes D , Baer A , McMichael M , Wallon K , Xiao G , Jones RL . Int J Hyg Environ Health 2021 234 113713 We developed an inductively coupled plasma mass spectrometry (ICP-MS) method using Universal Cell Technology (UCT) with a PerkinElmer NexION ICP-MS, to measure arsenic (As), chromium (Cr), and nickel (Ni) in human urine samples. The advancements of the UCT allowed us to expand the calibration range to make the method applicable for both low concentrations of biomonitoring applications and high concentrations that may be observed from acute exposures and emergency response. Our method analyzes As and Ni in kinetic energy discrimination (KED) mode with helium (He) gas, and Cr in dynamic reaction cell (DRC) mode with ammonia (NH₃) gas. The combination of these elements is challenging because a carbon source, ethanol (EtOH), is required for normalization of As ionization in urine samples, which creates a spectral overlap (⁴⁰Ar¹²C⁺) on ⁵²Cr. This method additionally improved lab efficiency by combining elements from two of our previously published methods (Jarrett et al., 2007; Quarles et al., 2014), allowing us to measure Cr and Ni concentrations in urine samples collected as part of the National Health and Nutrition Examination Survey (NHANES) beginning with the 2017-2018 survey cycle. We present our rigorous validation of the method selectivity and accuracy using National Institute of Standards and Technology (NIST) Standard Reference Materials (SRM), precision using in-house prepared quality control materials, and a discussion of the use of a modified UCT, a BioUCell, to address an ion transmission phenomenon we observed on the NexION 300 platform when using higher elemental concentrations and high cell gas pressures. The rugged method detection limits, calculated from measurements in more than 60 runs, for As, Cr, and Ni are 0.23 μg L⁻¹, 0.19 μg L⁻¹, and 0.31 μg L⁻¹, respectively. |
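Method detection limits of the kind quoted above are conventionally derived from the spread of repeated low-level measurements. The sketch below uses the EPA-style t × s formulation with hypothetical replicate data; the paper's "rugged" MDLs pooled over 60+ runs may use a different estimator:

```python
import statistics

def method_detection_limit(replicates, t_value):
    """EPA-style MDL: Student's t (n-1 df, 99% one-sided confidence)
    times the standard deviation of low-level replicate measurements."""
    return t_value * statistics.stdev(replicates)

# Hypothetical low-level replicates in ug/L; t for 6 df at 99% ≈ 3.143
mdl = method_detection_limit(
    [0.10, 0.12, 0.09, 0.11, 0.10, 0.13, 0.08], t_value=3.143)
```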
Surface dosimetry of ultraviolet germicidal irradiation using a colorimetric technique
Neu DT , Mead KR , McClelland TL , Lindsley WG , Martin SB , Heil G , See M , Feng HA . Ann Work Expo Health 2021 65 (5) 605-611 Ultraviolet germicidal irradiation uses ultraviolet C (UV-C) energy to disinfect surfaces in clinical settings. Verifying that the doses of UV-C energy received by surfaces are adequate for proper disinfection levels can be difficult and expensive. Our study aimed to test commercially available colorimetric labels, sensitive to UV-C energy, and compare their precision with an accepted radiometric technique. The color-changing labels were found to predictably change color in a dose-dependent manner that would allow them to act as a qualitative alternative to radiometry when determining the minimum UV-C energy dosage received at surfaces. If deployed using careful protective techniques to avoid unintentional exposure to sunlight or other light sources, the use of colorimetric labels could provide inexpensive, easy, and accurate verification of effective UV-C dosing in clinical spaces. |
Longitudinal investigation of pubertal milestones and hormones as a function of body fat in girls
Ortega MT , McGrath JA , Carlson L , Flores Poccia V , Larson G , Douglas C , Sun BZ , Zhao S , Beery B , Vesper HW , Duke L , Botelho JC , Filie AC , Shaw ND . J Clin Endocrinol Metab 2021 106 (6) 1668-1683 BACKGROUND: Epidemiologic studies demonstrated that overweight/obese girls (OW/OB) undergo thelarche and menarche earlier than normal weight girls (NW). There have been no longitudinal studies to specifically investigate how body weight/fat affects both clinical and biochemical pubertal markers in girls. METHODS: 90 girls (36 OW/OB, 54 NW), aged 8.2-14.7 years, completed 2.8 ± 1.7 study visits over the course of four years. Visits included dual-energy x-ray absorptiometry to calculate total body fat (TBF), Tanner staging, breast ultrasound for morphological staging (BMORPH; A-E), pelvic ultrasound, hormone tests, and assessment of menarcheal status. The effect of TBF on pubertal markers was determined using a mixed, multi-state, or Cox proportional hazards model, controlling for baseline BMORPH. RESULTS: NW were older than OW/OB (11.3 vs. 10.2 yrs, p<0.01) at baseline and had more advanced BMORPH (p<0.01). LH, estradiol, and ovarian and uterine volumes increased with time with no effect of TBF. There was a time x TBF interaction for FSH, inhibin B, estrone, total and free testosterone, and androstenedione: levels were initially similar, but after 1 yr, levels increased in girls with higher TBF, plateaued in girls with mid-range TBF, and decreased in girls with lower TBF. Girls with higher TBF progressed through BMORPH stage D more slowly but achieved menarche earlier than girls with lower TBF. CONCLUSIONS: In late puberty, girls with higher TBF demonstrate differences in standard hormonal and clinical markers of puberty. Investigation of the underlying causes and clinical consequences of these differences in girls with higher TBF deserves further study. |
Sensitive and feasible specimen collection and testing strategies for diagnosing tuberculosis in young children
Song R , Click ES , McCarthy KD , Heilig CM , McHembere W , Smith JP , Fajans M , Musau SK , Okeyo E , Okumu A , Orwa J , Gethi D , Odeny L , Lee SH , Perez-Velez CM , Wright CA , Cain KP . JAMA Pediatr 2021 175 (5) e206069 IMPORTANCE: Criterion-standard specimens for tuberculosis diagnosis in young children, gastric aspirate (GA) and induced sputum, are invasive and rarely collected in resource-limited settings. It is important to identify a far less invasive approach to tuberculosis diagnostic testing in children younger than 5 years that is as sensitive as current reference standards. OBJECTIVE: To characterize the sensitivity of preferably minimally invasive specimen and assay combinations relative to maximum observed yield from all specimens and assays combined. DESIGN, SETTING, AND PARTICIPANTS: In this prospective cross-sectional diagnostic study, the reference standard was a panel of up to 2 samples of each of 6 specimen types tested for Mycobacterium tuberculosis complex by Xpert MTB/RIF assay and mycobacteria growth indicator tube culture. Multiple different combinations of specimens and tests were evaluated as index tests. A consecutive series of children was recruited from inpatient and outpatient settings in Kisumu County, Kenya, between October 2013 and August 2015. Participants were children younger than 5 years who had symptoms of tuberculosis (unexplained cough, fever, malnutrition) and parenchymal abnormality on chest radiography or who had cervical lymphadenopathy. Children with 1 or more evaluable specimens for 4 or more primary study specimen types were included in the analysis. Data were analyzed from February 2015 to October 2020. MAIN OUTCOMES AND MEASURES: Cumulative and incremental diagnostic yield of combinations of specimen types and tests relative to the maximum observed yield. RESULTS: Of the 300 enrolled children, the median (interquartile range) age was 2.0 (1.0-3.6) years, and 151 (50.3%) were female. A total of 294 met criteria for analysis. 
Of 31 participants with confirmed tuberculosis (maximum observed yield), 24 (sensitivity, 77%; interdecile range, 68%-87%) had positive results on up to 2 GA samples and 20 (sensitivity, 64%; interdecile range, 53%-76%) had positive test results on up to 2 induced sputum samples. The yields of 2 nasopharyngeal aspirate (NPA) samples (23 of 31 [sensitivity, 74%; interdecile range, 64%-84%]), of 1 NPA sample and 1 stool sample (22 of 31 [sensitivity, 71%; interdecile range, 60%-81%]), or of 1 NPA sample and 1 urine sample (21.5 of 31 [sensitivity, 69%; interdecile range, 58%-80%]) were similar to reference-standard specimens. Combining up to 2 each of GA and NPA samples had an average yield of 90% (28 of 31). CONCLUSIONS AND RELEVANCE: NPA, in duplicate or in combination with stool or urine specimens, was readily obtainable and had diagnostic yield comparable with reference-standard specimens. This combination could improve tuberculosis diagnosis among children in resource-limited settings. Combining GA and NPA had greater yield than that of the current reference standards and may be useful in certain clinical and research settings. |
Prepregnancy body mass index and spina bifida: Potential contributions of bias
Johnson CY , Honein MA , Rasmussen SA , Howards PP , Strickland MJ , Flanders WD . Birth Defects Res 2021 113 (8) 633-643 BACKGROUND: Epidemiologists have consistently observed associations between prepregnancy obesity and spina bifida in offspring. Most studies, however, used self-reported body mass index (potential for exposure misclassification) and incompletely ascertained cases of spina bifida among terminations of pregnancy (potential for selection bias). We conducted a quantitative bias analysis to explore the potential effects of these biases on study results. METHODS: We included 808 mothers of fetuses or infants with spina bifida (case mothers) and 7,685 mothers of infants without birth defects (control mothers) from a population-based case-control study, the National Birth Defects Prevention Study (1997-2011). First, we performed a conventional epidemiologic analysis, adjusting for potential confounders using logistic regression. Then, we used 5,000 iterations of probabilistic bias analysis to adjust for the combination of confounding, exposure misclassification, and selection bias. RESULTS: In the conventional confounding-adjusted analysis, prepregnancy obesity was associated with spina bifida (odds ratio 1.4, 95% confidence interval: 1.2, 1.7). In the probabilistic bias analysis, we tested nine different models for the combined effects of confounding, exposure misclassification, and selection bias. Results were consistent with a weak to moderate association between prepregnancy obesity and spina bifida, with the median odds ratios across the nine models ranging from 1.1 to 1.4. CONCLUSIONS: Given our assumptions about the occurrence of bias in the study, our results suggest that exposure misclassification, selection bias, and confounding do not completely explain the association between prepregnancy obesity and spina bifida. |
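The probabilistic bias analysis described above can be sketched for the exposure-misclassification component alone: sample sensitivity and specificity of self-reported BMI from prior distributions, back-correct the 2x2 table, and summarize the adjusted odds ratios. The counts and priors below are illustrative, not the study's:

```python
import random

def corrected_exposed(observed_exposed, total, se, sp):
    """Back-calculate the true exposed count from an observed count,
    assuming nondifferential sensitivity (se) and specificity (sp)."""
    return (observed_exposed - (1 - sp) * total) / (se + sp - 1)

def bias_adjusted_or(a, b, c, d, n_iter=5000, seed=1):
    """a/b = exposed/unexposed cases, c/d = exposed/unexposed controls.
    Returns the median misclassification-adjusted odds ratio."""
    rng = random.Random(seed)
    ors = []
    for _ in range(n_iter):
        se = rng.uniform(0.85, 0.99)  # illustrative priors for self-report
        sp = rng.uniform(0.90, 0.99)
        ca = corrected_exposed(a, a + b, se, sp)
        cc = corrected_exposed(c, c + d, se, sp)
        cb, cd = (a + b) - ca, (c + d) - cc
        if min(ca, cb, cc, cd) > 0:  # keep only plausible corrections
            ors.append((ca * cd) / (cb * cc))
    ors.sort()
    return ors[len(ors) // 2]

adjusted_or = bias_adjusted_or(a=200, b=608, c=1500, d=6185)
```

Reporting the spread of the adjusted odds ratios across iterations, as the study does across its nine bias models, conveys how much of the observed association could plausibly be explained by the assumed biases.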
Evaluation of sex differences in preschool children with and without autism spectrum disorder enrolled in the Study to Explore Early Development
Wiggins LD , Rubenstein E , Windham G , Barger B , Croen L , Dowling N , Giarelli E , Levy S , Moody E , Soke G , Fields V , Schieve L . Res Dev Disabil 2021 112 103897 BACKGROUND AND AIMS: Research in school-aged children, adolescents, and adults with autism spectrum disorder (ASD) has found sex-based differences in behavioral, developmental, and diagnostic outcomes. These findings have not been consistently replicated in preschool-aged children. We examined sex-based differences in a large sample of 2-5-year-old children with ASD symptoms in a multi-site community-based study. METHODS AND PROCEDURES: Based on a comprehensive evaluation, children were classified as having ASD (n = 1480, 81.55% male) or subthreshold ASD characteristics (n = 593, 70.15% male). Outcomes were behavior problems, developmental abilities, performance on ASD screening and diagnostic tests, and parent-reported developmental conditions diagnosed before study enrollment. OUTCOMES AND RESULTS: We found no statistically significant sex differences in behavioral functioning, developmental functioning, performance on an ASD screening test, and developmental conditions diagnosed before study enrollment among children with ASD or subthreshold ASD characteristics. Males in both study groups had more parent reported restricted interests and repetitive behaviors than females, but these differences were small in magnitude and not clinically meaningful. CONCLUSIONS AND IMPLICATIONS: Preschool males and females who showed risk for ASD were more similar than different in the outcomes assessed in our study. Future research could examine sex-based differences in ASD phenotypes as children age. |
Why surveillance informatics is an integral part of a safe patient handling program: occupational injuries due to patient handling and movement in 116 US hospitals, Occupational Health Safety Network, 2012-2016
Gomaa A , Groenewold MR , Vanoli K , Nowlin S , Marovich S . J Assoc Occup Health Pro Healthc 2020 40 (3) 16-25 Workplace musculoskeletal injuries due to patient handling and movement (PHM) are a significant occupational hazard for healthcare workers in the United States. Study authors Ahmed Gomaa, MD, ScD; Matthew R. Groenewold, PhD, MSPH; Kelly Vanoli; Susan Nowlin; and Stacey Marovich, MHI, MS, PMP, MCTS analyzed workplace musculoskeletal injury surveillance data submitted by 116 hospitals participating in the Occupational Health Safety Network (OHSN) from 2012 to 2016. The detailed analysis of patient-handling injury data showed nursing assistants, radiology technicians, and nurses are at the highest risk for injury. Improved data collection is needed to improve safe patient handling programs (SPHPs), and surveillance information is key for providing evidence on all aspects of an SPHP. |
Development and application of an innovative instrument to assess work environment factors for injury prevention in the food service industry
Markkanen P , Peters SE , Grant M , Dennerlein JT , Wagner GR , Burke L , Wallace L , Sorensen G . Work 2021 68 (3) 641-651 BACKGROUND: With the growth of the food service industry and associated high injury and illness rates, there is a need to assess workplace factors that contribute to injury prevention. OBJECTIVE: The objective of this report is to describe the development, application, and utility of a new instrument to evaluate ergonomics and safety for food service workers. METHODS: Starting with a similar tool developed for use in healthcare, a new tool was designed through a collaborative, participatory process with the stakeholders from a collaborating food service company. The new instrument enables the identification and assessment of key safety and health factors through a focused walkthrough of the physical work environment, and structured interviews exploring the organizational work environment. The researchers applied the instrument at 10 of the partnering company's worksites. RESULTS: The instrument identified factors related to both the physical work environment and organizational and contextual environment (e.g., vendor-client relationships) impacting worker safety and health. CONCLUSIONS: Modern assessment approaches should address both the physical and organizational aspects of the work environment, and consider the context complexities in which the worksites and the industry operate. |
Kurtosis: a new tool for noise analysis
Qiu W , Murphy WJ , Suter A . Acoust Today 2020 16 (4) 39-47 Hearing loss due to high-level noise exposure remains a significant occupational health hazard that continues to increase in prevalence in industrial and military work environments despite government-mandated hearing conservation programs. The underlying assumption in current noise standards is that hearing loss over an 8-hour A-weighted equivalent continuous exposure level (often abbreviated as LAeq,8h) can be predicted by the equal energy hypothesis. This method assumes equivalent effects on hearing for a 3 dB increase or decrease in exposure intensity with a halving or doubling of exposure duration, respectively. In other words, equal amounts of hearing loss are expected regardless of how the noise exposure levels have occurred over time. The equal energy hypothesis is the basis of most noise standards and guidelines in the United States and internationally. Although this approach is generally considered appropriate for steady-state noise, it is not adequate for complex noise (Hamernik and Qiu, 2001). Some Background: Consensus has been lacking on the use of simple energy averaging to predict the effects of noise on hearing. In the United States, some government agencies use a modification consisting of a 5 dB trading relationship, whereas others use the internationally accepted 3 dB rule. Use of the 3 dB rule has been recommended by the National Institute for Occupational Safety and Health (NIOSH) since 1998, a recommendation validated by additional, more recent research (Suter, 2017). Another issue with using a simple energy metric is the inability of sound energy averaging to account for the increased hazard of noise with impulsive components. Although intermittences in noise exposures may have been considered helpful to hearing in the past, this no longer seems to be the case with complex noise exposures, which are found frequently in manufacturing industries. 
Because the additional hazards from impulsive noise were already recognized, the earliest version of the International Standards Organization (ISO) 1999 standard (1971) suggested a 10 dB adjustment to the average exposure level when impulsive noise is superimposed on a background of continuous noise. At a 1981 meeting of noise experts in Southampton, UK, some participants proposed keeping the 10 dB adjustment, with others wanting to change it to 5 dB, and a third group proposing just using simple energy averaging (Personal Observation, Suter, 1981). The resulting report concluded that hearing conservation programs should be initiated at a 5 dB lower level as a precautionary measure whenever there are impulsive noise conditions (von Gierke et al., 1981). Consequently, the 1990 version of the standard contained a note suggesting a 5 dB correction but even that disappeared without explanation in later iterations of the ISO 1999 standard (2013). Since then, more evidence has emerged regarding the hazard to hearing from complex noise environments relative to continuous noise environments. Complex or Non-Gaussian Noise: A steady-state, continuous noise exposure typically has a normal or Gaussian amplitude distribution (see background in Figure 1). However, the temporal pattern of noise exposures often varies significantly in work environments. A complex noise environment may be described as Gaussian background noise punctuated by a series of high-level transient noises resulting in a non-Gaussian distribution (as shown in Figure 1). These transients can be brief, high-level noise bursts, impulses, or impacts with varying interpeak intervals, peak levels, and peak durations. Industrial workers are often exposed to complex noise environments. Examples include jobs involving maintenance work, metalworking, and power tools, such as impact wrenches and nail guns. 
Over the past several decades, a number of studies using animal models have shown that exposure to complex noise produces more hearing damage than an equivalent energy exposure to continuous Gaussian noise, in terms of both behavioral hearing loss and sensory cell loss (e.g., Lei et al., 1994). These results, along with similar findings from human data in industrial settings, have demonstrated that although acoustic energy and exposure duration are necessary metrics, they are not sufficient to evaluate the hearing loss from complex non-Gaussian noise exposure. Because many noise environments can be characterized by the same equivalent energy and spectra, a metric that describes the temporal structure of an exposure would be a useful adjunct to the equivalent sound pressure level metric. The kurtosis of a sample distribution is such a metric. |
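The kurtosis metric the authors advocate is the normalized fourth central moment of the amplitude distribution: approximately 3 for Gaussian noise and larger when high-level transients are present. A minimal sketch with synthetic signals:

```python
import random

def kurtosis(samples):
    """Normalized 4th central moment; a Gaussian signal gives ~3."""
    n = len(samples)
    mean = sum(samples) / n
    m2 = sum((x - mean) ** 2 for x in samples) / n
    m4 = sum((x - mean) ** 4 for x in samples) / n
    return m4 / m2 ** 2

rng = random.Random(0)
steady = [rng.gauss(0.0, 1.0) for _ in range(20000)]
# Gaussian background punctuated by occasional high-level transients,
# mimicking the complex (non-Gaussian) exposures described above
complex_noise = [x if rng.random() > 0.01 else 8.0 * x for x in steady]
```

The two signals can share similar long-term energy metrics while their kurtosis values differ sharply, which is exactly why kurtosis is proposed as an adjunct to the equivalent sound pressure level.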
A deep learning approach for lower back-pain risk prediction during manual lifting
Snyder K , Thomas B , Lu ML , Jha R , Barim MS , Hayden M , Werren D . PLoS One 2021 16 (2) e0247162 Occupationally-induced back pain is a leading cause of reduced productivity in industry. Detecting when a worker is lifting incorrectly and at increased risk of back injury offers significant potential benefits. These include increased quality of life for the worker due to lower rates of back injury and fewer workers' compensation claims and missed time for the employer. However, recognizing lifting risk poses a challenge due to typically small datasets and subtle underlying features in accelerometer and gyroscope data. A novel method to classify a lifting dataset using a 2D convolutional neural network (CNN) and no manual feature extraction is proposed in this paper; the dataset consisted of 10 subjects lifting at various relative distances from the body with 720 total trials. The proposed deep CNN displayed greater accuracy (90.6%) compared to an alternative CNN and multilayer perceptron (MLP). A deep CNN could be adapted to classify many other activities that traditionally pose greater challenges in industrial environments due to their size and complexity. |
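The core idea of applying a 2D CNN to raw sensor data is to treat a time window as an image, with rows as time steps and columns as accelerometer/gyroscope channels, and slide a learned kernel over it. The paper's architecture is not specified in this abstract, so the window size and kernel below are purely illustrative:

```python
def conv2d(window, kernel):
    """Valid-mode 2D convolution (cross-correlation, as in most DL libraries)."""
    rows = len(window) - len(kernel) + 1
    cols = len(window[0]) - len(kernel[0]) + 1
    out = []
    for i in range(rows):
        out.append([
            sum(kernel[a][b] * window[i + a][j + b]
                for a in range(len(kernel))
                for b in range(len(kernel[0])))
            for j in range(cols)
        ])
    return out

# Hypothetical 5-step window of 6 sensor channels (3 accel + 3 gyro axes)
window = [[float(t + c) for c in range(6)] for t in range(5)]
# A 2x2 kernel acting as a temporal-difference filter across adjacent channels
kernel = [[1.0, 1.0], [-1.0, -1.0]]
feature_map = conv2d(window, kernel)
print(len(feature_map), len(feature_map[0]))  # 4 5
```

In a real CNN these kernels are learned during training, which is what removes the need for manual feature extraction from the accelerometer and gyroscope signals.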
Correlates of variation in Guinea worm burden among infected domestic dogs
Guagliardo SAJ , Wiegand R , Roy SL , Cleveland CA , Zirimwabagabo H , Chop E , Tchindebet Ouakou P , Ruiz-Tiben E , Hopkins D , Weiss A . Am J Trop Med Hyg 2021 104 (4) 1418-1424 The Guinea Worm Eradication Program has been extraordinarily successful: in 2019, there were 53 human cases reported, down from the estimated 3.5 million in 1986. Yet the occurrence of Guinea worm in dogs is a challenge to eradication efforts, and underlying questions about transmission dynamics remain. We used routine surveillance data to run negative binomial regressions predicting worm burden among infected dogs in Chad. Of 3,371 infected dogs reported during 2015-2018, 38.5% had multiple worms. A multivariable model showed that the number of dogs in the household was negatively associated with worm burden (adjusted incidence rate ratio [AIRR] = 0.95, 95% CI: 0.93-0.97, P < 0.0001) after adjusting for dog age (AIRR = 0.99, 95% CI: 0.96-1.01, P > 0.1). This could relate to the amount of infective inocula (e.g., contaminated food or water) shared by multiple dogs in a household. Other significant univariable associations with worm burden included dog history of Guinea worm infection (IRR = 1.30, 95% CI: 1.18-1.45) and dog owners who were hunters (IRR = 0.78, 95% CI: 0.62-0.99, P < 0.05) or farmers (IRR = 0.83, 95% CI: 0.77-0.90, P < 0.0001). Further analysis showed that the number of dogs in the household was significantly and positively correlated with nearly all other independent variables (e.g., owner occupation: farmer, fisherman, or hunter; dog age, gender, and history of Guinea worm). The associations we identified between worm burden and dogs per household, and between dogs per household and owner characteristics, should be further investigated with more targeted studies. |
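An incidence rate ratio from a negative binomial model acts multiplicatively per unit of the predictor. A short sketch of how the reported AIRR of 0.95 per additional household dog scales expected worm burden (the baseline count of 2 worms is a hypothetical value for illustration, not from the study):

```python
def expected_burden(baseline, airr, extra_dogs):
    """Expected worm count under a multiplicative (log-link) model:
    baseline * airr ** extra_dogs."""
    return baseline * airr ** extra_dogs

# AIRR = 0.95 per additional dog, as reported in the multivariable model
for extra in (0, 4, 9):
    print(extra, round(expected_burden(2.0, 0.95, extra), 2))
```

So under this model, a household with many dogs is expected to show a modestly lower burden per infected dog, consistent with the shared-inoculum interpretation offered in the abstract.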
Evaluation of the durability and use of long-lasting insecticidal nets in Nicaragua
Villalta EL , Soto Bravo AM , Vizcaino L , Dzuris N , Delgado M , Green M , Smith SC , Lenhart A , Macedo de Oliveira A . Malar J 2021 20 (1) 106 BACKGROUND: Vector control for malaria prevention relies most often on the use of insecticide-treated bed nets (ITNs) and indoor residual spraying. Little is known about the longevity of long-lasting insecticidal nets (LLINs) in the Americas. The physical integrity and insecticide retention of LLINs were monitored over time after a bed net distribution campaign, and community practices around LLIN care and use were assessed, in Waspam, northeastern Nicaragua. METHODS: At least 30 nets were collected at 6, 12, 24, and 36 months post distribution. Physical integrity was measured by counting holes and classifying nets into categories (good, damaged, and too torn) depending on a proportionate hole index (pHI). Insecticide bioefficacy was assessed using cone bioassays, and insecticide content measured using a cyanopyrethroid field test (CFT). RESULTS: At 6 months, 87.3 % of LLINs were in good physical condition, while by 36 months this decreased to 20.6 %, with 38.2 % considered 'too torn.' The median pHI increased from 7 at the 6-month time point to 480.5 by 36 months. After 36 months of use, median mortality in cone bioassays was 2 % (range: 0-6 %) compared to 16 % (range: 2-70 %) at 6 months. There was a decrease in the level of deltamethrin detected on the surface of the LLINs, with 100 % of LLINs tested at 12 and 24 months crossing the threshold for being considered a failed net by CFT. CONCLUSIONS: This first comprehensive analysis of LLIN durability in Central America revealed rapid loss of chemical bioefficacy and progressive physical damage over a 36-month period. Use of these findings to guide future LLIN interventions in malaria elimination settings in Nicaragua, and potentially elsewhere in the Americas, could help optimize the successful implementation of vector control strategies. |
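A proportionate hole index is a size-weighted sum of hole counts. The abstract does not state the exact parameters used, so the sketch below assumes the size weights and category cut-offs commonly used in WHO LLIN durability guidance (weights 1, 23, 196, 578 for hole size categories 1-4; pHI ≤ 64 good, 65-642 damaged, ≥ 643 too torn); treat these as illustrative assumptions:

```python
# Assumed WHO-style size weights, smallest to largest hole categories
WEIGHTS = (1, 23, 196, 578)

def phi(hole_counts):
    """hole_counts: (n_size1, n_size2, n_size3, n_size4) -> weighted index."""
    return sum(w * n for w, n in zip(WEIGHTS, hole_counts))

def category(index):
    """Assumed cut-offs: good <= 64, damaged 65-642, too torn >= 643."""
    if index <= 64:
        return "good"
    if index <= 642:
        return "damaged"
    return "too torn"

# Seven small holes gives pHI = 7, the median reported at 6 months
print(category(phi((7, 0, 0, 0))))
# A mix of hole sizes: pHI = 16 + 230 + 196 = 442
print(category(phi((16, 10, 1, 0))))
```

This weighting is why the median pHI can jump from 7 to 480.5: a single large hole contributes far more to the index than dozens of small ones.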
Legal literacy for public health practitioners
Yassine BB , Menon AN , Ramanathan Holiday T , Penn M . Public Health Rep 2021 137 (2) 370-374 Public health and law are inextricably intertwined. Law is the foundation of governmental public health practice, delineating the duties and authority to protect and promote conditions necessary for population health. 1 Law is also a social and structural determinant of health, because laws shape the physical, social, and economic environments that directly impact population health. 2 Public health laws at all levels of government enshrine public health strategies, are critical to addressing emerging issues, and are the means through which interventions are implemented and enforced. |
Using ICD-10-CM codes to detect illicit substance use: A comparison with retrospective self-report
Rowe CL , Santos GM , Kornbluh W , Bhardwaj S , Faul M , Coffin PO . Drug Alcohol Depend 2021 221 108537 BACKGROUND: Understanding whether International Classification of Disease, 10th Revision, Clinical Modification (ICD-10-CM) codes can be used to accurately detect substance use can inform their use in future surveillance and research efforts. METHODS: Using 2015-2018 data from a retrospective cohort study of 602 safety-net patients prescribed opioids for chronic non-cancer pain, we calculated the sensitivity and specificity of using ICD-10-CM codes to detect illicit substance use compared to retrospective self-report by substance (methamphetamine, cocaine, opioids [heroin or non-prescribed opioid analgesics]), self-reported use frequency, and type of healthcare encounter. RESULTS: Sensitivity of ICD-10-CM codes for detecting self-reported substance use was highest for methamphetamine (49.5 % [95 % confidence interval: 39.6-59.5 %]), followed by cocaine (44.4 % [35.8-53.2 %]) and opioids (36.3 % [28.8-44.2 %]); higher for participants who reported more frequent methamphetamine (intermittent use: 27.7 % [14.6-42.6 %]; ≥weekly use: 67.2 % [53.7-79.0 %]) and opioid use (intermittent use: 21.4 % [13.2-31.7 %]; ≥weekly use: 52.6 % [40.8-64.2 %]); highest for outpatient visits (methamphetamine: 43.8 % [34.1-53.8 %]; cocaine: 36.8 % [28.6-45.6 %]; opioids: 33.1 % [25.9-41.0 %]) and lowest for emergency department visits (methamphetamine: 8.6 % [4.0-15.6 %]; cocaine: 5.3 % [2.1-10.5 %]; opioids: 6.3 % [3.0-11.2 %]). Specificity was highest for methamphetamine (96.4 % [94.3-97.8 %]), followed by cocaine (94.0 % [91.5-96.0 %]) and opioids (85.0 % [81.3-88.2 %]). CONCLUSIONS: ICD-10-CM codes had high specificity and low sensitivity for detecting self-reported substance use but were substantially more sensitive in detecting frequent use. 
ICD-10-CM codes to detect substance use, particularly those from emergency department visits, should be used with caution, but may be useful as a lower-bound population measure of substance use or for capturing frequent use among certain patient populations. |
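Sensitivity and specificity here treat retrospective self-report as the reference standard. A minimal sketch of the underlying 2x2 calculation; the cell counts below are hypothetical, chosen only so the result approximates the reported methamphetamine figures (49.5 % sensitivity, 96.4 % specificity), since the actual counts are not given in the abstract:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: rows = ICD-10-CM code present/absent,
# reference = self-reported methamphetamine use
sens, spec = sens_spec(tp=50, fn=51, tn=480, fp=18)
print(f"sensitivity={sens:.1%} specificity={spec:.1%}")
```

The asymmetry is the abstract's main point: codes rarely miss-label non-users (high specificity) but capture only about half of self-reported users (low sensitivity), and even less for intermittent use.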
"Stories of starting": Understanding the complex contexts of opioid misuse initiation
Spencer NE , Taubenberger SP , Roberto R , Krishnamurti LS , Chang JC , Hacker K . Subst Abus 2021 42 (4) 1-16 Background: The impacts of opioid use disorder and opioid-involved overdose are known, but less is known about the contexts in which people first misuse opioids, and the motivations for continued misuse. Methods: In-depth interviews with 26 individuals in Allegheny County, Pennsylvania with current or past histories of opioid misuse were conducted. Narratives were analyzed to understand the circumstances and influences contributing to initial and continued misuse of opioids. Results: Participants described social and familial contexts that normalized or accepted opioid misuse; this often included their own use of other illicit substances prior to initiating opioids. Participants also described initial use of opioids as related to efforts to cope with physical pain. They also described recognizing and then seeking psychological/emotional benefits from opioids. All three of these themes often overlapped and intersected in these stories of starting opioid misuse. Conclusions: Opioid misuse stemmed from complex interacting influences involving coping with physical and psychological pain, perception that opioids are needed to feel "normal", and acceptance or normalization of opioid use. This suggests a multi-pronged approach to both prevention and treatment is needed. |
Reassortant Cache Valley virus associated with acute febrile, non-neurologic illness, Missouri.
Baker M , Hughes HR , Naqvi SH , Yates K , Velez JO , McGuirk S , Schroder B , Lambert AJ , Kosoy OI , Pue H , Turabelidze G , Staples JE . Clin Infect Dis 2021 73 (9) 1700-1702 An adult male from Missouri sought care for fever, fatigue, and gastrointestinal symptoms. He had leukopenia and thrombocytopenia and was treated for a presumed tickborne illness. His condition deteriorated with respiratory and renal failure, lactic acidosis, and hypotension. Next-generation sequencing and phylogenetic analysis identified a reassortant Cache Valley virus. |
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Disease Reservoirs and Vectors
- Drug Safety
- Environmental Health
- Health Behavior and Risk
- Health Economics
- Healthcare Associated Infections
- Immunity and Immunization
- Informatics
- Injury and Violence
- Laboratory Sciences
- Maternal and Child Health
- Occupational Safety and Health
- Parasitic Diseases
- Public Health Law
- Substance Use and Abuse
- Zoonotic and Vectorborne Diseases
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.