Last data update: Apr 22, 2024. (Total: 46599 publications since 2009)
Records 1-30 (of 310 Records) |
Query Trace: Kelly M [original query] |
High HIV diversity, recombination, and superinfection revealed in a large outbreak among persons who inject drugs in Kentucky and Ohio, USA
Switzer WM , Shankar A , Jia H , Knyazev S , Ambrosio F , Kelly R , Zheng H , Campbell EM , Cintron R , Pan Y , Saduvala N , Panneer N , Richman R , Singh MB , Thoroughman DA , Blau EF , Khalil GM , Lyss S , Heneine W . Virus Evol 2024 10 (1) veae015 We investigated transmission dynamics of a large human immunodeficiency virus (HIV) outbreak among persons who inject drugs (PWID) in KY and OH during 2017-20 by using detailed phylogenetic, network, recombination, and cluster dating analyses. Using polymerase (pol) sequences from 193 people associated with the investigation, we document high HIV-1 diversity, including Subtype B (44.6 per cent); numerous circulating recombinant forms (CRFs) including CRF02_AG (2.5 per cent) and CRF02_AG-like (21.8 per cent); and many unique recombinant forms composed of CRFs with major subtypes and sub-subtypes [CRF02_AG/B (24.3 per cent), B/CRF02_AG/B (0.5 per cent), and A6/D/B (6.4 per cent)]. Cluster analysis of sequences using a 1.5 per cent genetic distance identified thirteen clusters, including a seventy-five-member cluster composed of CRF02_AG-like and CRF02_AG/B, an eighteen-member CRF02_AG/B cluster, Subtype B clusters of sizes ranging from two to twenty-three, and a nine-member A6/D and A6/D/B cluster. Recombination and phylogenetic analyses identified CRF02_AG/B variants with ten unique breakpoints likely originating from Subtype B and CRF02_AG-like viruses in the largest clusters. The addition of contact tracing results from OH to the genetic networks identified linkage between persons with Subtype B, CRF02_AG, and CRF02_AG/B sequences in the clusters supporting de novo recombinant generation. Superinfection prevalence was 13.3 per cent (8/60) in persons with multiple specimens and included infection with B and CRF02_AG; B and CRF02_AG/B; or B and A6/D/B. 
In addition to the presence of multiple, distinct molecular clusters associated with this outbreak, cluster dating inferred that transmission associated with the largest molecular cluster occurred as early as 2006, with high transmission rates during 2017-18 in certain other molecular clusters. This outbreak among PWID in KY and OH was likely driven by rapid transmission of multiple HIV-1 variants, including de novo viral recombinants generated from circulating viruses within the community. Our findings documenting the high HIV-1 transmission rate and clustering through partner services and molecular clusters emphasize the importance of leveraging multiple different data sources and analyses, including those from disease intervention specialist investigations, to better understand outbreak dynamics and interrupt HIV spread. |
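The 1.5 per cent genetic-distance clustering described above can be sketched as single-linkage clustering over pairwise distances: two sequences share a cluster if they are linked, directly or transitively, by a distance at or below the threshold. The toy illustration below uses made-up short sequences and a simple p-distance (fraction of differing sites) rather than the TN93 model that molecular-surveillance tools typically use.

```python
from itertools import combinations

def p_distance(a: str, b: str) -> float:
    """Simple p-distance: fraction of aligned sites that differ."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b)) / len(a)

def cluster_sequences(seqs: dict, threshold: float = 0.015) -> list:
    """Single-linkage clustering via union-find: merge any pair whose
    pairwise distance is <= threshold (1.5 per cent by default)."""
    parent = {name: name for name in seqs}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    for a, b in combinations(seqs, 2):
        if p_distance(seqs[a], seqs[b]) <= threshold:
            parent[find(a)] = find(b)       # union the two clusters
    clusters = {}
    for name in seqs:
        clusters.setdefault(find(name), []).append(name)
    return sorted(clusters.values(), key=len, reverse=True)

# Toy inputs (real pol sequences are roughly 1,000 nt, not 200).
seqs = {
    "P1": "ACGTACGTACGTACGTACGT" * 10,
    "P2": "ACGTACGTACGTACGTACGT" * 10,
    "P3": "TTTTACGTACGTACGTACGT" * 10,   # differs at 3/20 sites: distance 0.15
}
print(cluster_sequences(seqs))   # P1 and P2 cluster; P3 stays apart
```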
Comparing binary & ordinal definitions of urinary & stool continence outcomes: Data from the National Spina Bifida Patient Registry
Kelly MS , Liu T , Routh JC , Castillo H , Tanaka ST , Smith K , Krach LE , Zhang A , Sherburne E , Castillo J , David J , Wiener JS . J Pediatr Urol 2024 INTRODUCTION: The National Spina Bifida Patient Registry (NSBPR) assesses bladder and bowel incontinence using ordinal categories, but prior NSBPR analyses employed binary classification. Our aims were to 1) perform the first NSBPR analysis of bladder and bowel incontinence as ordinal outcomes to compare to the binary definition and subject variables; 2) explore the correlation of incontinence with undergarment usage, and 3) assess incontinence status following continence surgeries. METHODS: Data from NSBPR participants' most recent clinic visit from 2013 to 2020 were analyzed. Ordinal categories of incontinence were compared to previously used binary definitions. Incontinence surgical outcomes were analyzed for those with data at least three months post-operatively. Chi-square tests evaluated associations among categorical variables. Univariate and ordinal logistic regression models were used to test associations of ordinal incontinence status with patient and condition factors. Statistical tests were 2-sided; p values < 0.05 were considered significant. RESULTS: Analysis of 7217 individuals using ordinal incontinence outcomes showed little difference from previously used binary outcomes. The final multivariable logistic regression models with ordinal multinomial outcomes showed that associations of incontinence with age, sex, race/ethnicity, health insurance, level of lesion, and continence management technique were similar to prior studies. Among those reporting never being incontinent of both bladder and bowel, 14% reported using protective undergarments. Of the 500 individuals who had bladder outlet surgery, 38% reported never being incontinent of urine. Of 1416 individuals who had appendicostomy (ACE) bowel surgery, 48% reported never being incontinent of stool. 
DISCUSSION: Our current analysis showed that ordinal continence outcome classification yielded continence findings similar to those of previous studies using the binary definition of continence. Expanding the binary definition of continence to include monthly episodes of incontinence did not greatly increase the proportion of continent individuals and, therefore, would likely not have made meaningful differences in continence outcomes in prior NSBPR analyses. However, it is known that even mild incontinence can affect quality of life; therefore, capturing any level of incontinence is of clinical importance. Confirmation of the association of continence outcomes with sociodemographic, condition-related, and interventional factors with both approaches further validates previous analyses using the binary definition of continence. CONCLUSION: The previously used binary definition of bladder and bowel continence appears robust. Undergarment choice was a poor surrogate for reported incontinence. After bladder and bowel continence surgeries, 38% and 48%, respectively, reported never being incontinent. |
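The comparison of ordinal versus binary continence outcomes amounts to collapsing an ordinal response into a binary "continent" flag under one or another cutoff. A minimal sketch of that collapse is below; the category labels and cutoffs are hypothetical stand-ins, not the NSBPR's actual response options.

```python
# Hypothetical ordinal incontinence-frequency categories (not NSBPR wording).
ORDINAL = ["never", "less than monthly", "monthly", "weekly", "daily"]

def binary_continent(response: str, include_monthly: bool = False) -> bool:
    """Collapse an ordinal response into a binary 'continent' flag.
    The expanded definition additionally counts monthly episodes as continent."""
    continent_levels = {"never", "less than monthly"}
    if include_monthly:
        continent_levels.add("monthly")
    return response in continent_levels

responses = ["never", "monthly", "daily", "less than monthly", "weekly"]
strict = sum(binary_continent(r) for r in responses)
expanded = sum(binary_continent(r, include_monthly=True) for r in responses)
# Expanding the cutoff adds only the 'monthly' respondents to the continent group.
print(strict, expanded)
```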
Ethnic and racial differences in self-reported symptoms, health status, activity level, and missed work at 3 and 6 months following SARS-CoV-2 infection
O'Laughlin KN , Klabbers RE , Ebna Mannan I , Gentile NL , Geyer RE , Zheng Z , Yu H , Li SX , Chan KCG , Spatz ES , Wang RC , L'Hommedieu M , Weinstein RA , Plumb ID , Gottlieb M , Huebinger RM , Hagen M , Elmore JG , Hill MJ , Kelly M , McDonald S , Rising KL , Rodriguez RM , Venkatesh A , Idris AH , Santangelo M , Koo K , Saydah S , Nichol G , Stephens KA . Front Public Health 2023 11 1324636 INTRODUCTION: Data on ethnic and racial differences in symptoms and health-related impacts following SARS-CoV-2 infection are limited. We aimed to estimate the ethnic and racial differences in symptoms and health-related impacts 3 and 6 months after the first SARS-CoV-2 infection. METHODS: Participants included adults with SARS-CoV-2 infection enrolled in a prospective multicenter US study between 12/11/2020 and 7/4/2022 as the primary cohort of interest, as well as a SARS-CoV-2-negative cohort to account for non-SARS-CoV-2-infection impacts, who completed enrollment and 3-month surveys (N = 3,161; 2,402 SARS-CoV-2-positive, 759 SARS-CoV-2-negative). Marginal odds ratios were estimated using GEE logistic regression for individual symptoms, health status, activity level, and missed work 3 and 6 months after COVID-19 illness, comparing each ethnicity or race to the referent group (non-Hispanic or white), adjusting for demographic factors, social determinants of health, substance use, pre-existing health conditions, SARS-CoV-2 infection status, COVID-19 vaccination status, and survey time point, with interactions between ethnicity or race and time point, ethnicity or race and SARS-CoV-2 infection status, and SARS-CoV-2 infection status and time point. RESULTS: Following SARS-CoV-2 infection, the majority of symptoms were similar over time between ethnic and racial groups. 
At 3 months, Hispanic participants were more likely than non-Hispanic participants to report fair/poor health (OR: 1.94; 95% CI: 1.36-2.78) and reduced activity (somewhat less, OR: 1.47; 95% CI: 1.06-2.02; much less, OR: 2.23; 95% CI: 1.38-3.61). At 6 months, differences by ethnicity were not present. At 3 months, Other/Multiple race participants were more likely than white participants to report fair/poor health (OR: 1.90; 95% CI: 1.25-2.88) and reduced activity (somewhat less, OR: 1.72; 95% CI: 1.21-2.46; much less, OR: 2.08; 95% CI: 1.18-3.65). At 6 months, Asian participants were more likely than white participants to report fair/poor health (OR: 1.88; 95% CI: 1.13-3.12); Black participants reported more missed work (OR: 2.83; 95% CI: 1.60-5.00); and Other/Multiple race participants reported more fair/poor health (OR: 1.83; 95% CI: 1.10-3.05), reduced activity (somewhat less, OR: 1.60; 95% CI: 1.02-2.51; much less, OR: 2.49; 95% CI: 1.40-4.44), and more missed work (OR: 2.25; 95% CI: 1.27-3.98). DISCUSSION: Awareness of ethnic and racial differences in outcomes following SARS-CoV-2 infection may inform clinical and public health efforts to advance health equity in long-term outcomes. |
What's new in ototoxicity management?
Fernandez Katharine A , Garinis Angela , Knight Kristin , Konrad-Martin Dawn , Morata Thais , Poling Gayla L , Reavis Kelly M , Sanchez Victoria A , Dreisbach Laura . Perspect ASHA Spec Interest Groups 2024 9 (1) 113-123 Purpose: Ototoxic medications and chemical agents in the workplace can put individuals' hearing and vestibular health at risk for permanent injury. Proactive ototoxicity management (OtoM) strategies aim to minimize exposure, avoid onset of symptoms, provide ongoing monitoring, and manage auditory and vestibular changes as the clinical needs of the patient evolve. During a 2021 American Speech-Language-Hearing Association Special Interest Groups Open House, members of the International Ototoxicity Management Group discussed how best to integrate OtoM into routine clinical practice, what tools to use, and what special considerations need to be understood to best support patients and their families. Here, we have summarized their viewpoints to encourage widespread adoption of improved OtoM services for at-risk individuals. Conclusions: The field of audiology needs to move to a place where we better understand the full extent of ototoxicity and can agree on expanding minimum guidelines that can be implemented more universally to mitigate, detect, and manage the damage from ototoxic exposures. Only recently has our field seen a therapeutic drug that can protect against ototoxicity; however, the population served is restricted to children receiving treatment for nonmetastatic carcinoma. This is hopefully just the beginning of future therapeutic interventions to come, but, in the meantime, ototoxicity resulting from other medications in different patient populations and chemical agents persists. |
Lifetime excess absolute risk for lung cancer due to exposure to radon: results of the pooled uranium miners cohort study PUMA
Kreuzer M , Sommer M , Deffner V , Bertke S , Demers PA , Kelly-Reif K , Laurier D , Rage E , Richardson DB , Samet JM , Schubauer-Berigan MK , Tomasek L , Wiggins C , Zablotska LB , Fenske N . Radiat Environ Biophys 2024 The Pooled Uranium Miners Analysis (PUMA) study is the largest uranium miners cohort with 119,709 miners, 4.3 million person-years at risk and 7754 lung cancer deaths. Excess relative rate (ERR) estimates for lung cancer mortality per unit of cumulative exposure to radon progeny in working level months (WLM) based on the PUMA study have been reported. The ERR/WLM was modified by attained age, time since exposure or age at exposure, and exposure rate. This pattern was found for the full PUMA cohort and the 1960+ sub-cohort, i.e., miners hired in 1960 or later with chronic low radon exposures and exposure rates. The aim of the present paper is to calculate the lifetime excess absolute risk (LEAR) of lung cancer mortality per WLM using the PUMA risk models, as well as risk models derived in previously published smaller uranium miner studies, some of which are included in PUMA. The same methods were applied for all risk models, i.e., relative risk projection up to <95 years of age, an exposure scenario of 2 WLM per year from age 18-64 years, and baseline mortality rates representing a mixed Euro-American-Asian population. Depending upon the choice of model, the estimated LEAR per WLM is 5.38 × 10^-4 or 5.57 × 10^-4 in the full PUMA cohort and 7.50 × 10^-4 or 7.66 × 10^-4 in the PUMA 1960+ sub-cohort. The LEAR per WLM estimates derived from risk models reported for previously published uranium miners studies range from 2.5 × 10^-4 to 9.2 × 10^-4. PUMA strengthens knowledge on the radon-related lung cancer LEAR, a useful way to translate models for policy purposes. |
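As a rough illustration of how a LEAR per WLM is assembled from a risk model, the sketch below sums baseline rate × excess relative rate × survival over attained ages for the stated exposure scenario (2 WLM/year at ages 18-64, projection to age <95). The constant ERR/WLM, 5-year lag, baseline mortality rates, and survival curve are all made-up simplifications for illustration, not the PUMA models.

```python
# Illustrative LEAR sketch. Assumed, not from PUMA: a constant ERR/WLM with
# no effect modifiers, a 5-year exposure lag, and toy baseline/survival curves.
ERR_PER_WLM = 0.005
EXPOSURE_AGES = range(18, 65)   # 2 WLM/year accrued at ages 18-64
LAG = 5

def cumulative_wlm(age: int) -> float:
    """Cumulative lagged exposure: WLM accrued at least LAG years before `age`."""
    return sum(2.0 for a in EXPOSURE_AGES if a + LAG <= age)

def lear(baseline_rate, survival) -> float:
    """LEAR = sum over attained ages of
    baseline mortality rate x excess relative rate x survival probability."""
    return sum(
        baseline_rate(a) * ERR_PER_WLM * cumulative_wlm(a) * survival(a)
        for a in range(18, 95)   # relative-risk projection up to age <95
    )

# Toy baseline lung cancer mortality rate and survival curve (not real data).
rate = lambda a: 3e-4 if a >= 50 else 1e-5
surv = lambda a: max(0.0, 1.0 - 0.01 * max(0, a - 40))
print(f"toy LEAR = {lear(rate, surv):.5f}")
```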
Infection precaution adherence varies by potential exposure risks to SARS-CoV-2 and job role: Findings from a US medical center
Haas EJ , Kelly-Reif K , Edirisooriya M , Reynolds L , Beatty Parker CN , Zhu D , Weber DJ , Sickbert-Bennett E , Boyce RM , Ciccone EJ , Aiello AE . Am J Infect Control 2023 BACKGROUND: Infection precautions (IP) facilitate standardized and safe patient care. Research has demonstrated several barriers to IP adherence among health care personnel (HCP), but potential exposure risk to SARS-CoV-2 and job role have not been considered. METHODS: Researchers used self-reported baseline surveys with 191 HCP at a university medical center to examine factors that may have affected IP adherence (eg, personal protective equipment [PPE] and hand hygiene errors) over the 2 weeks prior to the survey. Chi-square tests were used to determine whether differences existed, first, between job role and IP adherence and, second, between the potential risk of exposure to SARS-CoV-2 and IP adherence. A binary logistic regression estimated whether PPE nonadherence was associated with COVID-19 stress, job role, and potential exposure risk to SARS-CoV-2. RESULTS: PPE nonadherence varied by job role. Those in the Other group (ie, nonphysician/non-nursing HCP) reported significantly fewer errors (9.6%) compared to Physicians (26.5%) and Registered Nurses (33.3%). Hand/glove hygiene errors between COVID-19 patient rooms varied by job role. Respondents who had higher risks of exposure to SARS-CoV-2 were 5.74 times more likely to experience errors. CONCLUSIONS: The results provide implications for adopting systems-level approaches to support worker knowledge and engagement across job roles to improve IP adherence. |
High prevalence of trachomatous inflammation-follicular with no trachomatous trichiasis: can alternative indicators explain the epidemiology of trachoma in Côte d'Ivoire?
Atekem K , Harding-Esch EM , Martin DL , Downs P , Palmer SL , Kaboré A , Kelly M , Bovary A , Sarr A , Nguessan K , James F , Gwyn S , Wickens K , Bakhtiari A , Boyd S , Aba A , Senyonjo L , Courtright P , Meite A . Int Health 2023 15 ii3-ii11 Baseline trachoma surveys in Côte d'Ivoire (2019) identified seven evaluation units (EUs) with a trachomatous inflammation-follicular (TF) prevalence ≥10%, but a trachomatous trichiasis (TT) prevalence in individuals ≥15 y of age below the elimination threshold (0.2%). Two of these EUs, Bondoukou 1 and Bangolo 2, were selected for a follow-up survey to understand the epidemiology of trachoma using additional indicators of Chlamydia trachomatis infection (DNA from conjunctival swabs) and exposure (anti-Pgp3 and anti-CT694 antibodies from dried blood spots [DBSs]). A two-stage cluster sampling methodology was used to select villages and households. All individuals 1-9 y of age from each selected household were recruited, graded for trachoma and had a conjunctival swab and DBS collected. Conjunctival swabs and DBSs were tested using Cepheid GeneXpert and a multiplex bead assay, respectively. The age-adjusted TF and infection prevalence in 1- to 9-year-olds were <1% and <0.3%, respectively, in both EUs. Age-adjusted seroprevalence was 5.3% (95% confidence interval [CI] 1.5 to 15.6) in Bondoukou 1 and 8.2% (95% CI 4.3 to 13.7) in Bangolo 2. The seroconversion rate for Pgp3 was low, at 1.23 seroconversions/100 children/year (95% CI 0.78 to 1.75) in Bondoukou 1 and 1.91 (95% CI 1.58 to 2.24) in Bangolo 2. Similar results were seen for CT694. These infection, antibody and clinical data provide strong evidence that trachoma is not a public health problem in either EU. |
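Seroconversion rates like those reported above are typically estimated by fitting age-stratified antibody data to a catalytic model. A minimal sketch, assuming a simple irreversible model P(seropositive by age a) = 1 − e^(−λa) fitted by maximum likelihood over a grid, with invented counts:

```python
import math

def seroprev(lam: float, age: float) -> float:
    """Simple irreversible catalytic model: P(seropositive by age) = 1 - e^(-lam*age)."""
    return 1.0 - math.exp(-lam * age)

def fit_scr(ages, positives, totals) -> float:
    """Grid-search maximum-likelihood estimate of the seroconversion rate lambda
    from binomial counts of seropositives per age stratum."""
    def loglik(lam):
        ll = 0.0
        for a, k, n in zip(ages, positives, totals):
            p = min(max(seroprev(lam, a), 1e-9), 1 - 1e-9)  # guard log(0)
            ll += k * math.log(p) + (n - k) * math.log(1 - p)
        return ll
    grid = [i / 10000 for i in range(1, 2000)]   # lambda from 0.0001 to 0.1999
    return max(grid, key=loglik)

# Invented age-stratified serology counts (seropositives out of 100 tested).
ages      = [1, 3, 5, 7, 9]
positives = [1, 2, 4, 6, 8]
totals    = [100, 100, 100, 100, 100]
lam = fit_scr(ages, positives, totals)
print(f"SCR estimate: {lam * 100:.2f} seroconversions/100 children/year")
```

A fuller analysis would also report a confidence interval (e.g. from profile likelihood or bootstrap) and could allow seroreversion; this sketch shows only the point estimate.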
Principles of health equity science for public health action
Burton DC , Kelly A , Cardo D , Daskalakis D , Huang DT , Penman-Aguilar A , Raghunathan PL , Zhu BP , Bunnell R . Public Health Rep 2023 333549231213162 Health equity is the state in which everyone has a fair and just opportunity to attain their highest level of health, and no one is disadvantaged from achieving this potential because of social position or other socially determined circumstances.1 Science is a cornerstone of public health and central to efforts to achieve health equity. Science designed to generate knowledge to advance equity can improve population health and promote health for all members of society.2 In contrast, science and interventions not designed and implemented with equity in mind may inadvertently perpetuate or widen disparities, even while fostering overall improvements in population health.3

Health equity science provides a conceptual framework for scientific endeavors that are designed and conducted to advance health equity.4 Health equity science investigates patterns and underlying contributors to health inequities and builds an evidence base that can guide action across the domains of the public health program, surveillance, policy, communication, and scientific inquiry to move toward eliminating, rather than simply documenting, inequities.

Building on extensive work in developing the importance and application of equity concepts in public health practice,5-7 we describe an equity-focused scientific framework and set of principles to guide public health efforts to fulfill the health equity mission of the Centers for Disease Control and Prevention (CDC).8 |
HantaNet: A new microbetrace application for hantavirus classification, genomic surveillance, epidemiology and outbreak investigations
Cintron R , Whitmer SLM , Moscoso E , Campbell EM , Kelly R , Talundzic E , Mobley M , Chiu KW , Shedroff E , Shankar A , Montgomery JM , Klena JD , Switzer WM . Viruses 2023 15 (11) Hantaviruses zoonotically infect humans worldwide with pathogenic consequences and are mainly spread by rodents that shed aerosolized virus particles in urine and feces. Bioinformatics methods for hantavirus diagnostics, genomic surveillance and epidemiology are currently lacking a comprehensive approach for data sharing, integration, visualization, analytics and reporting. With the possibility of hantavirus cases going undetected and spreading over international borders, a significant reporting delay can miss linked transmission events and impedes timely, targeted public health interventions. To overcome these challenges, we built HantaNet, a standalone visualization engine for hantavirus genomes that facilitates viral surveillance and classification for early outbreak detection and response. HantaNet is powered by MicrobeTrace, a browser-based multitool originally developed at the Centers for Disease Control and Prevention (CDC) to visualize HIV clusters and transmission networks. HantaNet integrates coding gene sequences and standardized metadata from hantavirus reference genomes into three separate gene modules for dashboard visualization of phylogenetic trees, viral strain clusters for classification, epidemiological networks and spatiotemporal analysis. We used 85 hantavirus reference datasets from GenBank to validate HantaNet as a classification and enhanced visualization tool, and as a public repository to download standardized sequence data and metadata for building analytic datasets. HantaNet is a model for how to deploy MicrobeTrace-specific tools to advance pathogen surveillance, epidemiology and public health globally. |
Long-term mediation of a sexual risk-reduction intervention for South African adolescents
Kim S , Jemmott LS , Icard L , Teitelman AM , Kelly TA , O'Leary A , Ngwane Z , Bellamy S , Jemmott JB . Health Psychol 2023 42 (11) 810-821 OBJECTIVE: Black adolescents in South Africa are disproportionately affected by HIV. A cluster-randomized controlled experiment examining the effects of a sexual risk-reduction intervention successfully reduced self-reported intercourse and unprotected intercourse. Based on long-term follow-up assessments, the present research examines theoretical constructs that could potentially mediate the intervention effects and how time and gender, respectively, moderated the mediation. METHOD: The behavioral outcome was measured by asking whether participants had had any vaginal sex in the past 3 months. Mediation and moderated mediation were tested based on the 3-, 6-, 12-, 42-, and 54-month postintervention outcomes. RESULTS: Three variables through which the sexual risk-reduction intervention had a significant mediated effect on the behavioral outcome were identified: abstinence career opportunities outcome expectancy (α × β product = -0.086, 95% asymmetric confidence interval [ACI] [-0.126, -0.047]), expected parental approval of sexual intercourse (α × β product = -0.061, [-0.102, -0.025]), and self-efficacy to avoid sexual-risk situations (α × β product = -0.022, [-0.049, -0.001]). The moderated mediation analysis showed that gender moderated the intervention's effects on abstinence prevention outcome expectancy (B = -0.186, SEB = 0.079, p = .019), expected parental approval of sexual intercourse (B = 0.143, SEB = 0.058, p = .013), and self-efficacy to avoid sexual-risk situations (B = -0.293, SEB = 0.112, p = .009). 
The moderated mediation analysis also revealed that time moderated the effects of the intervention on abstinence career opportunities outcome expectancy (B = -0.293, SEB = 0.106, p = .006), self-efficacy to avoid sexual-risk situations (B = 0.335, SEB = 0.060, p < .001), and cultural myths regarding HIV transmission (B = 0.138, SEB = 0.042, p = .001); and the association between four theoretical constructs and the behavioral outcome: abstinence career opportunities outcome expectancy (B = -0.267, SEB = 0.104, p = .001), self-efficacy to refuse sex (B = -0.132, SEB = 0.043, p = .002), self-efficacy to avoid sexual-risk situations (B = -0.093, SEB = 0.055, p = .009), and HIV risk-reduction knowledge (B = -0.286, SEB = 0.134, p = .003). CONCLUSIONS: The present study identifies theoretical constructs that mediated the intervention effects on the sexual behavior among South African adolescents for an extended period of time. The findings also reveal gender differences in psychological mechanisms initiated by a sexual risk-reduction intervention and the long-term temporal dynamics of the intervention. |
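The α × β product-of-coefficients approach used to quantify mediation can be sketched on toy data. Everything below is invented for illustration: the variables and effect sizes are not the study's, the outcome is continuous with ordinary least squares for simplicity (the study modeled a binary behavioral outcome), and a faithful analysis would adjust the β path for treatment, which is omitted here for brevity.

```python
import random

def ols_slope(x, y):
    """Slope from a simple least-squares regression of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    return cov / var

def indirect_effect(treat, mediator, outcome):
    """Product-of-coefficients mediation: alpha (treatment -> mediator)
    times beta (mediator -> outcome)."""
    alpha = ols_slope(treat, mediator)
    beta = ols_slope(mediator, outcome)
    return alpha * beta

# Invented data: a binary intervention shifts a mediator (e.g. an outcome
# expectancy), which in turn lowers a continuous risk-behavior score.
random.seed(1)
treat = [random.randint(0, 1) for _ in range(500)]
mediator = [0.5 * t + random.gauss(0, 1) for t in treat]      # true alpha = 0.5
outcome = [-0.4 * m + random.gauss(0, 1) for m in mediator]   # true beta = -0.4
ab = indirect_effect(treat, mediator, outcome)
print(f"alpha x beta estimate: {ab:.3f}")   # near 0.5 * -0.4 = -0.2
```

In practice the asymmetric confidence intervals reported in the abstract come from methods such as the distribution-of-the-product or bootstrap, since the sampling distribution of α × β is not symmetric.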
Persons 'never treated' in mass drug administration for lymphatic filariasis: identifying programmatic and research needs from a series of research review meetings 2020-2021
Brady MA , Toubali E , Baker M , Long E , Worrell C , Ramaiah K , Graves P , Hollingsworth TD , Kelly-Hope L , Stukel D , Tripathi B , Means AR , Matendechero SH , Krentel A . Int Health 2023 As neglected tropical disease programs rely on participation in rounds of mass drug administration (MDA), there is concern that individuals who have never been treated could contribute to ongoing transmission, posing a barrier to elimination. Previous research has suggested that the size and characteristics of the never-treated population may be important but have not been sufficiently explored. To address this critical knowledge gap, four meetings were held from December 2020 to May 2021 to compile expert knowledge on never treatment in lymphatic filariasis (LF) MDA programs. The meetings explored four questions: the number and proportion of people never treated, their sociodemographic characteristics, their infection status and the reasons why they were not treated. Meeting discussions noted key issues requiring further exploration, including how to standardize measurement of the never treated, adapt and use existing tools to capture never-treated data and ensure representation of never-treated people in data collection. Recognizing that patterns of never treatment are situation specific, participants noted measurement should be quick, inexpensive and focused on local solutions. Furthermore, programs should use existing data to generate mathematical models to understand what levels of never treatment may compromise LF elimination goals or trigger programmatic action. |
A narrative review of literature examining studies researching the impact of law on health and economic outcomes
Pepin DA , St Clair Sims R , Khushalani J , Tonti L , Kelly MA , Song S , Arifkhanova A , Hulkower R , Calhoun BH , Puddy RW , Kaminski JW . J Public Health Manag Pract 2023 30 (1) 12-35 CONTEXT: Public health policy can play an important role in improving public health outcomes. Accordingly, there has been an increasing emphasis by policy makers on identifying and implementing evidence-informed public health policy interventions. PROGRAM OR POLICY: Growth and refinement of the field of research assessing the impact of legal interventions on health outcomes, known as legal epidemiology, prompted this review of studies on the relationship between laws and health or economic outcomes. IMPLEMENTATION: Authors systematically searched 8 major literature databases for all English-language journal articles that assessed the effect of a law on health and economic outcomes published between January 1, 2009, and September 18, 2019. This search generated 12 570 unique articles, 177 of which met inclusion criteria. The systematic review was conducted by a multidisciplinary team that included health economists and public health policy researchers, as well as public health lawyers with expertise in legal epidemiological research methods. The authors identified and assessed the types of methods used to measure the laws' health impact. EVALUATION: In this review, the authors examine how legal epidemiological research methods have been described in the literature as well as trends among the studies. Overall, 3 major themes emerged from this study: (1) limited variability in the sources of the health data across the studies, (2) limited differences in the methodological approaches used to connect law to health outcomes, and (3) lack of transparency surrounding the source and quality of the legal data relied upon. 
DISCUSSION: Through highlighting public health law research methodologies, this systematic review may inform researchers, practitioners, and lawmakers on how to better examine and understand the impacts of legal interventions on health and economic outcomes. Findings may serve as a source of suggested practices in conducting legal epidemiological outcomes research and identifying conceptual and method-related gaps in the literature. |
Domains of Excellence: A CDC framework for developing high-quality, impact-driven public health science publications
Parker EM , Zhu BP , Li Z , Puddy RW , Kelly MA , Scott C , Penman-Aguilar A , Mekonnen MA , Stephens JW . J Public Health Manag Pract 2023 30 (1) 72-78 CONTEXT: The Centers for Disease Control and Prevention (CDC) has a long history of using high-quality science to drive public health action that has improved the health, safety, and well-being of people in the United States and globally. To ensure scientific quality, manuscripts authored by CDC staff are required to undergo an internal review and approval process known as clearance. During 2022, CDC launched a scientific clearance transformation initiative to improve the efficiency of the clearance process while ensuring scientific quality. PROGRAM: As part of the scientific clearance transformation initiative, a group of senior scientists across CDC developed a framework called the Domains of Excellence for High-Quality Publications (DOE framework). The framework includes 7 areas ("domains") that authors can consider for developing high-quality and impactful scientific manuscripts: Clarity, Scientific Rigor, Public Health Relevance, Policy Content, Ethical Standards, Collaboration, and Health Equity. Each domain includes multiple quality elements, highlighting specific key considerations within. IMPLEMENTATION: CDC scientists are expected to use the DOE framework when conceptualizing, developing, revising, and reviewing scientific products to support collaboration and to ensure the quality and impact of their scientific manuscripts. DISCUSSION: The DOE framework sets expectations for a consistent standard for scientific manuscripts across CDC and promotes collaboration among authors, partners, and other subject matter experts. Many aspects have broad applicability to the public health field at large and might be relevant for others developing high-quality manuscripts in public health science. 
The framework can serve as a useful reference document for CDC authors and others in the public health community as they prepare scientific manuscripts for publication and dissemination. |
Gut microbiome perturbation, antibiotic resistance, and Escherichia coli strain dynamics associated with international travel: a metagenomic analysis
Worby CJ , Sridhar S , Turbett SE , Becker MV , Kogut L , Sanchez V , Bronson RA , Rao SR , Oliver E , Walker AT , Walters MS , Kelly P , Leung DT , Knouse MC , Hagmann SHF , Harris JB , Ryan ET , Earl AM , LaRocque RC . Lancet Microbe 2023 4 (10) e790-e799 BACKGROUND: Culture-based studies have shown that acquisition of extended-spectrum β-lactamase-producing Enterobacterales is common during international travel; however, little is known about the role of the gut microbiome before and during travel, nor about acquisition of other antimicrobial-resistant organisms. We aimed to identify (1) whether the gut microbiome provided colonisation resistance against antimicrobial-resistant organism acquisition, (2) the effect of travel and travel behaviours on the gut microbiome, and (3) the scale and global heterogeneity of antimicrobial-resistant organism acquisition. METHODS: In this metagenomic analysis, participants were recruited at three US travel clinics (Boston, MA; New York, NY; and Salt Lake City, UT) before international travel. Participants had to travel internationally between Dec 8, 2017, and April 30, 2019, and have DNA extractions for stool samples both before and after travel for inclusion. Participants were excluded if they had at least one low coverage sample (<1 million read pairs). Stool samples were collected at home before and after travel, sent to a clinical microbiology laboratory to be screened for three target antimicrobial-resistant organisms (extended-spectrum β-lactamase-producing Enterobacterales, carbapenem-resistant Enterobacterales, and mcr-mediated colistin-resistant Enterobacterales), and underwent DNA extraction and shotgun metagenomic sequencing. We profiled metagenomes for taxonomic composition, antibiotic-resistant gene content, and characterised the Escherichia coli population at the strain level. 
We analysed pre-travel samples to identify the gut microbiome risk factors associated with acquisition of the three targeted antimicrobial-resistant organisms. Pre-travel and post-travel samples were compared to identify microbiome and resistome perturbation and E coli strain acquisition associated with travel. FINDINGS: A total of 368 individuals travelled between the required dates, and 296 had DNA extractions available for both before and after travel. 29 travellers were excluded as they had at least one low coverage sample, leaving a final group of 267 participants. We observed a perturbation of the gut microbiota, characterised by a significant depletion of microbial diversity and enrichment of the Enterobacteriaceae family. Metagenomic strain tracking confirmed that 67% of travellers acquired new strains of E coli during travel that were phylogenetically distinct from their pre-travel strains. We observed widespread enrichment of antibiotic-resistant genes in the gut, with a median 15% (95% CI 10-20, p<1 × 10^-10) increase in burden (reads per kilobase per million reads). This increase included antibiotic-resistant genes previously classified as threats to public health, which were 56% (95% CI 36-91, p=2 × 10^-11) higher in abundance after travel than before. Fluoroquinolone antibiotic-resistant genes were acquired by 97 (54%) of 181 travellers with no detected pre-travel carriage. Although we found that visiting friends or relatives, travel to south Asia, and eating uncooked vegetables were risk factors for acquisition of the three targeted antimicrobial-resistant organisms, we did not observe an association between the pre-travel microbiome structure and travel-related antimicrobial-resistant organism acquisition. 
INTERPRETATION: This work highlights a scale of E coli and antimicrobial-resistant organism acquisition by US travellers not apparent from previous culture-based studies, and suggests that strategies to control antimicrobial-resistant organisms addressing international traveller behaviour, rather than modulating the gut microbiome, could be worthwhile. FUNDING: US Centers for Disease Control and Prevention and National Institute of Allergy and Infectious Diseases. |
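The abstract above reports antibiotic-resistant gene burden in reads per kilobase per million reads (RPKM). A minimal sketch of that normalization; the read counts and gene length below are hypothetical, not values from the study:

```python
def rpkm(gene_reads: int, gene_length_bp: int, total_reads: int) -> float:
    """Reads per kilobase of gene per million mapped reads (RPKM)."""
    return gene_reads / (gene_length_bp / 1_000) / (total_reads / 1_000_000)

# Hypothetical example: 300 reads mapped to a 1.2 kb resistance gene in a
# metagenome of 5 million reads.
burden = rpkm(gene_reads=300, gene_length_bp=1200, total_reads=5_000_000)
print(round(burden, 1))  # 50.0
```

Normalizing by both gene length and sequencing depth is what makes the pre- and post-travel burdens comparable across samples.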
Viral determinants of acute COVID-19 symptoms in a nonhospitalized adult population in the pre-Omicron era
Goldberg SA , Lu S , Garcia-Knight M , Davidson MC , Tassetto M , Anglin K , Pineda-Ramirez J , Chen JY , Rugart PR , Mathur S , Forman CA , Donohue KC , Abedi GR , Saydah S , Briggs-Hagen M , Midgley CM , Andino R , Peluso MJ , Glidden DV , Martin JN , Kelly JD . Open Forum Infect Dis 2023 10 (8) ofad396 BACKGROUND: The influence of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) RNA level and presence of infectious virus on symptom occurrence is poorly understood, particularly among nonhospitalized individuals. METHODS: The study included 85 nonhospitalized, symptomatic adults, who were enrolled from September 2020 to November 2021. Data from a longitudinal cohort studied over 28 days were used to analyze the association of individual symptoms with SARS-CoV-2 viral RNA load, or the presence or level of infectious (culturable) virus. Presence of infectious virus and viral RNA load were assessed daily, depending on specimen availability, and amount of infectious virus was assessed on the day of maximum RNA load. Participants were surveyed for the start and end dates of 31 symptoms at enrollment and at days 9, 14, 21, and 28; daily symptom presence was determined analytically. We describe symptoms and investigate their possible association with viral determinants through a series of single or pooled (multiple days across acute period) cross-sectional analyses. RESULTS: There was an association between viral RNA load and the same-day presence of many individual symptoms. Additionally, individuals with infectious virus were more than three times as likely to have a concurrent fever as individuals without infectious virus, and more than two times as likely to have concurrent myalgia, chills, headache, or sore throat. CONCLUSIONS: We found evidence to support the association of viral RNA load and infectious virus with some, but not all, symptoms. 
Fever was most strongly associated with the presence of infectious virus; this may support the potential for symptom-based isolation guidance for COVID-19. |
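The associations above (e.g., more than three times the odds of concurrent fever when infectious virus is present) reduce to odds ratios from 2x2 tables. A minimal sketch; the counts below are hypothetical and not the study's data:

```python
def odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """Odds ratio for a 2x2 table:
    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    return (a * d) / (b * c)

# Hypothetical counts: fever on 30 of 50 person-days with infectious virus
# detected, versus 15 of 50 person-days without.
or_fever = odds_ratio(30, 20, 15, 35)
print(round(or_fever, 2))  # 3.5
```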
Cancer mortality after low dose exposure to ionising radiation in workers in France, the United Kingdom, and the United States (INWORKS): cohort study
Richardson DB , Leuraud K , Laurier D , Gillies M , Haylock R , Kelly-Reif K , Bertke S , Daniels RD , Thierry-Chef I , Moissonnier M , Kesminiene A , Schubauer-Berigan MK . BMJ 2023 382 e074520 OBJECTIVE: To evaluate the effect of protracted low dose, low dose rate exposure to ionising radiation on the risk of cancer. DESIGN: Multinational cohort study. SETTING: Cohorts of workers in the nuclear industry in France, the UK, and the US included in a major update to the International Nuclear Workers Study (INWORKS). PARTICIPANTS: 309 932 workers with individual monitoring data for external exposure to ionising radiation and a total follow-up of 10.7 million person years. MAIN OUTCOME MEASURES: Estimates of excess relative rate per gray (Gy) of radiation dose for mortality from cancer. RESULTS: The study included 103 553 deaths, of which 28 089 were due to solid cancers. The estimated rate of mortality due to solid cancer increased with cumulative dose by 52% (90% confidence interval 27% to 77%) per Gy, lagged by 10 years. Restricting the analysis to the low cumulative dose range (0-100 mGy) approximately doubled the estimate of association (and increased the width of its confidence interval), as did restricting the analysis to workers hired in the more recent years of operations when estimates of occupational external penetrating radiation dose were recorded more accurately. Exclusion of deaths from lung cancer and pleural cancer had a modest effect on the estimated magnitude of association, providing indirect evidence that the association was not substantially confounded by smoking or occupational exposure to asbestos. CONCLUSIONS: This major update to INWORKS provides a direct estimate of the association between protracted low dose exposure to ionising radiation and solid cancer mortality based on some of the world's most informative cohorts of radiation workers. 
The summary estimate of excess relative rate of solid cancer mortality per Gy is larger than estimates currently informing radiation protection, and some evidence suggests a steeper slope for the dose-response association in the low dose range than over the full dose range. These results can help to strengthen radiation protection, especially for low dose exposures that are of primary interest in contemporary medical, occupational, and environmental settings. |
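The 52% per Gy figure above corresponds to the standard linear excess relative rate (ERR) model, RR(D) = 1 + beta·D. A minimal sketch under that assumed model form, using the abstract's central estimate and ignoring the 10-year lag and any dose-range-specific slope:

```python
def relative_rate(dose_gy: float, err_per_gy: float = 0.52) -> float:
    """Linear excess relative rate model: RR(D) = 1 + beta * D,
    with beta = 0.52 per Gy taken from the abstract's central estimate."""
    return 1.0 + err_per_gy * dose_gy

# Relative rate of solid cancer mortality at a cumulative dose of 100 mGy:
print(round(relative_rate(0.1), 3))  # 1.052
```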
Evaluation of commercially available high-throughput SARS-CoV-2 serological assays for serosurveillance and related applications (preprint)
Stone M , Grebe E , Sulaeman H , Di Germanio C , Dave H , Kelly K , Biggerstaff BJ , Crews BO , Tran N , Jerome KR , Denny TN , Hogema B , Destree M , Jones JM , Thornburg N , Simmons G , Krajden M , Kleinman S , Dumont LJ , Busch MP . medRxiv 2021 2021.09.04.21262414 SARS-CoV-2 serosurveys can estimate cumulative incidence for monitoring epidemics but require characterization of the performance of the serological assays employed to inform testing algorithm development and interpretation of results. We conducted a multi-laboratory evaluation of 21 commercial high-throughput SARS-CoV-2 serological assays using blinded panels of 1,000 highly characterized blood-donor specimens. Assays demonstrated a range of sensitivities (63%-96%), specificities (96%-99%), and precision (ICC 0.55-0.99). Durability of antibody detection in longitudinal samples was dependent on assay format and immunoglobulin target, with anti-spike, direct, or total Ig assays demonstrating more stable or increasing reactivity over time than anti-nucleocapsid, indirect, or IgG assays. Assays with high sensitivity, specificity, and durable antibody detection are ideal for serosurveillance. Less sensitive assays demonstrating waning reactivity are appropriate for other applications, including characterizing antibody responses after infection and vaccination, and detection of anamnestic boosting by reinfections and vaccine breakthrough infections. 
Assay performance must be evaluated in the context of the intended use. Funding: This work was supported by research contracts from the Centers for Disease Control and Prevention (CDC Contract 75D30120C08170). The analytic data set is available upon request. |
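Evaluating assays against a blinded, characterized panel, as above, reduces to simple proportions of true calls. A minimal sketch; the panel counts below are hypothetical, not the study's results:

```python
def sens_spec(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical panel: 400 confirmed-positive and 600 confirmed-negative donors,
# with 384 true positives and 588 true negatives called by the assay.
sens, spec = sens_spec(tp=384, fn=16, tn=588, fp=12)
print(f"{sens:.0%} / {spec:.0%}")  # 96% / 98%
```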
Description of a University COVID-19 Outbreak and Interventions to Disrupt Transmission, Wisconsin, August – October 2020 (preprint)
Currie DW , Moreno GK , Delahoy MJ , Pray IW , Jovaag A , Braun KM , Cole D , Shechter T , Fajardo GC , Griggs C , Yandell BS , Goldstein S , Bushman D , Segaloff HE , Kelly GP , Pitts C , Lee C , Grande KM , Kita-Yarbro A , Grogan B , Mader S , Baggott J , Bateman AC , Westergaard RP , Tate JE , Friedrich TC , Kirking HL , O'Connor DH , Killerby ME . medRxiv 2021 2021.05.07.21256834 University settings have demonstrated potential for COVID-19 outbreaks, as they can combine congregate living, substantial social activity, and a young population predisposed to mild illness. Using genomic and epidemiologic data, we describe a COVID-19 outbreak at the University of Wisconsin (UW)–Madison. During August – October 2020, 3,485 students tested positive, including 856/6,162 students living in residence halls. Case counts began rising during move-in week for on-campus students (August 25-31, 2020), then rose rapidly during September 1-11, 2020. UW-Madison initiated multiple prevention efforts, including quarantining two residence halls; a subsequent decline in cases was observed. Genomic surveillance of cases from Dane County, where UW-Madison is located, did not find evidence of transmission from a large cluster of cases in the two residence halls quarantined during the outbreak. Coordinated implementation of prevention measures can effectively reduce SARS-CoV-2 spread in university settings and may limit spillover to the community surrounding the university. Funding: G.K.M. is supported by an NLM training grant to the Computation and Informatics in Biology and Medicine Training Program (NLM 5T15LM007359). This work was funded in part by the U.S. Centers for Disease Control and Prevention Contract #75D30120C09870: Defining the Role of College Students in SARS-CoV-2 Spread in the Upper Midwest. All sequencing data are available on www.gisaid.org; scripts for sequence data analysis are available at https://github.com/gagekmoreno/SARS-CoV-2-at-UW_Madison. |
Mutations in TAC1B: a novel genetic determinant of clinical fluconazole resistance in C. auris (preprint)
Rybak JM , Munoz JF , Barker KS , Parker JE , Esquivel BD , Berkow EL , Lockhart SR , Gade L , Palmer GE , White TC , Kelly SL , Cuomo CA , Rogers PD . bioRxiv 2020 2020.02.18.955534 Candida auris has emerged as a multidrug-resistant pathogen of great clinical concern. Approximately 90% of clinical C. auris isolates are resistant to fluconazole, the most commonly prescribed antifungal agent, yet it remains unknown what mechanisms underpin this fluconazole resistance. To identify novel mechanisms contributing to fluconazole resistance in C. auris, the fluconazole-susceptible C. auris clinical isolate AR0387 was passaged in media supplemented with fluconazole to generate derivative strains which had acquired increased fluconazole resistance in vitro. Comparative analysis of comprehensive sterol profiles, [3H]-fluconazole uptake, sequencing of C. auris genes homologous to genes known to contribute to fluconazole resistance in other species of Candida, and the relative expression of C. auris ERG11, CDR1, and MDR1 were performed. All fluconazole-evolved derivative strains were found to have acquired mutations in the zinc-cluster transcription factor-encoding gene, TAC1B, and a corresponding increase in CDR1 expression relative to the parental clinical isolate, AR0387. Mutations in TAC1B were also identified in a set of 304 globally distributed C. auris clinical isolates representing each of the four major clades. Introduction of the most common mutation found among fluconazole-resistant clinical isolates of C. auris into the fluconazole-susceptible isolate AR0387, was confirmed to increase fluconazole resistance by 8-fold, and the correction of the same mutation in a fluconazole-resistant isolate, AR0390, decreased fluconazole MIC by 16-fold. Taken together, these data demonstrate that C. 
auris can rapidly acquire resistance to fluconazole in vitro, and that mutations in TAC1B significantly contribute to clinical fluconazole resistance. IMPORTANCE: Candida auris is an emerging multidrug-resistant pathogen of global concern, known to be responsible for outbreaks on six continents and commonly resistant to antifungals. While the vast majority of clinical C. auris isolates are highly resistant to fluconazole, an essential part of the available antifungal arsenal, very little is known about the mechanisms contributing to resistance. In this work, we show that mutations in the transcription factor TAC1B significantly contribute to clinical fluconazole resistance. These studies demonstrate that mutations in TAC1B can arise rapidly in vitro upon exposure to fluconazole, and that a multitude of resistance-associated TAC1B mutations are present among the majority of fluconazole-resistant C. auris isolates from a global collection and appear specific to a subset of lineages or clades. Thus, identification of this novel genetic determinant of resistance significantly adds to the understanding of clinical antifungal resistance in C. auris. |
Minimum Information for Reusable Arthropod Abundance Data (MIReAAD) (preprint)
Rund SSC , Braak K , Cator L , Copas K , Emrich SJ , Giraldo-Calderon GI , Johansson MA , Heydari N , Hobern D , Kelly SA , Lawson D , Lord C , MacCallum RM , Roche DG , Ryan SJ , Schigel D , Vandegrift K , Watts M , Zaspel JM , Pawar S . bioRxiv 2018 429142 Arthropods play a dominant role in natural and human-modified terrestrial ecosystem dynamics. Spatially explicit population time series are crucial for statistical or mathematical models of these dynamics and assessment of their veterinary, medical, agricultural, and ecological impacts. Arthropod data have been collected worldwide for over a century, but remain scattered and largely inaccessible. With the ever-present and growing threat of arthropod vectors of infectious diseases and pest species, there are enormous amounts of historical and ongoing surveillance. These data are currently reported in a wide variety of formats, typically lacking sufficient metadata to make reuse and re-analysis possible. We present the first minimum information standard for arthropod abundance. Developed with broad stakeholder collaboration, it balances sufficiency for reuse with the practicality of preparing the data for submission. It is designed to optimize data (re-)usability in line with the “FAIR” (Findable, Accessible, Interoperable, and Reusable) principles of public data archiving (PDA). This standard will facilitate data unification across research initiatives and communities dedicated to surveillance for detection and control of vector-borne diseases and pests. |
Infectious viral shedding of SARS-CoV-2 Delta following vaccination: a longitudinal cohort study (preprint)
Garcia-Knight M , Anglin K , Tassetto M , Lu S , Zhang A , Goldberg SA , Catching A , Davidson MC , Shak JR , Romero M , Pineda-Ramirez J , Sanchez RD , Rugart P , Donohue K , Massachi J , Sans HM , Djomaleu M , Mathur S , Servellita V , McIlwain D , Gaudiliere B , Chen J , Martinez EO , Tavs JM , Bronstone G , Weiss J , Watson JT , Briggs-Hagen M , Abedi GR , Rutherford GW , Deeks SG , Chiu C , Saydah S , Peluso MJ , Midgley CM , Martin JN , Andino R , Kelly JD . medRxiv 2022 19 (9) e1010802 The impact of vaccination on SARS-CoV-2 infectiousness is not well understood. We compared longitudinal viral shedding dynamics in unvaccinated and fully vaccinated adults. SARS-CoV-2-infected adults were enrolled within 5 days of symptom onset and nasal specimens were self-collected daily for two weeks and intermittently for an additional two weeks. SARS-CoV-2 RNA load and infectious virus were analyzed relative to symptom onset stratified by vaccination status. We tested 1080 nasal specimens from 52 unvaccinated adults enrolled in the pre-Delta period and 32 fully vaccinated adults with predominantly Delta infections. While we observed no differences by vaccination status in maximum RNA levels, maximum infectious titers, or the median duration of viral RNA shedding, the rate of decay from the maximum RNA load was faster among vaccinated participants; maximum infectious titers and maximum RNA levels were highly correlated. Furthermore, among participants with infectious virus, the median duration of infectious virus detection was reduced from 7.5 days (IQR: 6.0-9.0) in unvaccinated participants to 6 days (IQR: 5.0-8.0) in those vaccinated (P=0.02). Accordingly, the odds of shedding infectious virus from days 6 to 12 post-onset were lower among vaccinated participants than unvaccinated participants (OR 0.42, 95% CI 0.19-0.89). These results indicate that vaccination reduced the probability of shedding infectious virus after 5 days from symptom onset. 
Copyright The copyright holder for this preprint is the author/funder, who has granted medRxiv a license to display the preprint in perpetuity. It is made available under a CC-BY-NC-ND 4.0 International license. |
SARS-CoV-2 antibody prevalence in Sierra Leone, March 2021: a cross-sectional, nationally representative, age-stratified serosurvey (preprint)
Barrie MB , Lakoh S , Kelly JD , Kanu JS , Squire J , Koroma Z , Bah S , Sankoh O , Brima A , Ansumana R , Goldberg SA , Chitre S , Osuagwu C , Maeda J , Barekye B , Numbere TW , Abdulaziz M , Mounts A , Blanton C , Singh T , Samai M , Vandi MA , Richardson ET . medRxiv 2021 Background: As of 26 March 2021, the Africa CDC had reported 4,159,055 cases of COVID-19 and 111,357 deaths among the 55 African Union Member States; however, no country has published a nationally representative serosurvey as of May 2021. Such data are vital for understanding the pandemic's progression on the continent, evaluating containment measures, and policy planning. Methods: We conducted a cross-sectional, nationally representative, age-stratified serosurvey in Sierra Leone in March 2021 by randomly selecting 120 Enumeration Areas throughout the country and 10 randomly selected households in each of these. One to two persons per selected household were interviewed to collect information on socio-demographics, symptoms suggestive of COVID-19, exposure history to laboratory-confirmed COVID-19 cases, and history of COVID-19 illness. Capillary blood was collected by fingerstick, and blood samples were tested using the Hangzhou Biotest Biotech RightSign COVID-19 IgG/IgM Rapid Test Cassette. Total seroprevalence was estimated after applying sampling weights. Findings: The overall weighted seroprevalence was 2.6% (95% CI 1.9-3.4). This is 43 times higher than the reported number of cases. Rural seropositivity was 1.8% (95% CI 1.0-2.5), and urban seropositivity was 4.2% (95% CI 2.6-5.7). Interpretation: Although overall seroprevalence was low compared to countries in Europe and the Americas (suggesting relatively successful containment in Sierra Leone), our findings indicate enormous underreporting of active cases. 
This has ramifications for the country's third wave (which started in June 2021), where the average number of daily reported cases was 87 by the end of the month: this could potentially be on the order of 3,700 actual infections, calling for stronger containment measures in a country with only 0.2% of people fully vaccinated. It may also reflect significant underreporting of incidence and mortality across the continent. |
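The projection above is direct arithmetic on the abstract's reported figures (a 43-fold underreporting factor applied to 87 reported daily cases):

```python
# Arithmetic behind the abstract's projection: serology implies about
# 43 true infections per reported case, so 87 reported daily cases scale to:
underreporting_factor = 43
reported_daily_cases = 87
estimated_daily_infections = underreporting_factor * reported_daily_cases
print(estimated_daily_infections)  # 3741, i.e. "on the order of 3,700"
```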
Early introductions and community transmission of SARS-CoV-2 variant B.1.1.7 in the United States (preprint)
Alpert T , Brito AF , Lasek-Nesselquist E , Rothman J , Valesano AL , MacKay MJ , Petrone ME , Breban MI , Watkins AE , Vogels CBF , Kalinich CC , Dellicour S , Russell A , Kelly JP , Shudt M , Plitnick J , Schneider E , Fitzsimmons WJ , Khullar G , Metti J , Dudley JT , Nash M , Beaubier N , Wang J , Liu C , Hui P , Muyombwe A , Downing R , Razeq J , Bart SM , Grills A , Morrison SM , Murphy S , Neal C , Laszlo E , Rennert H , Cushing M , Westblade L , Velu P , Craney A , Fauntleroy KA , Peaper DR , Landry ML , Cook PW , Fauver JR , Mason CE , Lauring AS , George KS , MacCannell DR , Grubaugh ND . medRxiv 2021 The emergence and spread of SARS-CoV-2 lineage B.1.1.7, first detected in the United Kingdom, has become a global public health concern because of its increased transmissibility. Over 2500 COVID-19 cases associated with this variant have been detected in the US since December 2020, but the extent of establishment is relatively unknown. Using travel, genomic, and diagnostic data, we highlight the primary ports of entry for B.1.1.7 in the US and locations of possible underreporting of B.1.1.7 cases. Furthermore, we found evidence for many independent B.1.1.7 establishments starting in early December 2020, followed by interstate spread by the end of the month. Finally, we project that B.1.1.7 will be the dominant lineage in many states by mid to late March. Thus, genomic surveillance for B.1.1.7 and other variants urgently needs to be enhanced to better inform the public health response. |
Longitudinal and Quantitative Fecal Shedding Dynamics of SARS-CoV-2, Pepper Mild Mottle Virus and CrAssphage (preprint)
Arts PJ , Kelly JD , Midgley CM , Anglin K , Lu S , Abedi GR , Andino R , Bakker KM , Banman B , Boehm AB , Briggs-Hagen M , Brouwer AF , Davidson MC , Eisenberg MC , Garcia-Knight M , Knight S , Peluso MJ , Pineda-Ramirez J , Sanchez RD , Saydah S , Tassetto M , Martin JN , Wigginton KR . medRxiv 2023 07 e0013223 Wastewater-based epidemiology (WBE) emerged during the COVID-19 pandemic as a scalable and broadly applicable method for community-level monitoring of infectious disease burden, though the lack of high-quality, longitudinal fecal shedding data of SARS-CoV-2 and other viruses limits the interpretation and applicability of wastewater measurements. In this study, we present longitudinal, quantitative fecal shedding data for SARS-CoV-2 RNA, as well as the commonly used fecal indicators Pepper Mild Mottle Virus (PMMoV) RNA and crAss-like phage (crAssphage) DNA. The shedding trajectories from 48 SARS-CoV-2 infected individuals suggest a highly individualized, dynamic course of SARS-CoV-2 RNA fecal shedding, with individual measurements varying from below the limit of detection to 2.79 × 10⁶ gene copies per mg dry mass of stool (gc/mg-dw). Of individuals that contributed at least 3 samples covering a range of at least 15 of the first 30 days after initial acute symptom onset, 77.4% had at least one positive SARS-CoV-2 RNA stool sample measurement. We detected PMMoV RNA in at least one sample from all individuals and in 96% (352/367) of samples overall; and measured crAssphage DNA above detection limits in 80% (38/48) of individuals and 48% (179/371) of samples. Median shedding values for PMMoV and crAssphage nucleic acids were 1 × 10⁵ gc/mg-dw and 1.86 × 10³ gc/mg-dw, respectively. These results can be used to inform and build mechanistic models to significantly broaden the potential of WBE modeling and to provide more accurate insight into SARS-CoV-2 prevalence estimates. 
|
Longitudinal and quantitative fecal shedding dynamics of SARS-CoV-2, pepper mild mottle virus, and crAssphage
Arts PJ , Kelly JD , Midgley CM , Anglin K , Lu S , Abedi GR , Andino R , Bakker KM , Banman B , Boehm AB , Briggs-Hagen M , Brouwer AF , Davidson MC , Eisenberg MC , Garcia-Knight M , Knight S , Peluso MJ , Pineda-Ramirez J , Diaz Sanchez R , Saydah S , Tassetto M , Martin JN , Wigginton KR . mSphere 2023 8 (4) e0013223 Wastewater-based epidemiology (WBE) emerged during the coronavirus disease 2019 (COVID-19) pandemic as a scalable and broadly applicable method for community-level monitoring of infectious disease burden. The lack of high-resolution fecal shedding data for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) limits our ability to link WBE measurements to disease burden. In this study, we present longitudinal, quantitative fecal shedding data for SARS-CoV-2 RNA, as well as for the commonly used fecal indicators pepper mild mottle virus (PMMoV) RNA and crAss-like phage (crAssphage) DNA. The shedding trajectories from 48 SARS-CoV-2-infected individuals suggest a highly individualized, dynamic course of SARS-CoV-2 RNA fecal shedding. Of the individuals that provided at least three stool samples spanning more than 14 days, 77% had one or more samples that tested positive for SARS-CoV-2 RNA. We detected PMMoV RNA in at least one sample from all individuals and in 96% (352/367) of samples overall. CrAssphage DNA was detected in at least one sample from 80% (38/48) of individuals and was detected in 48% (179/371) of all samples. The geometric mean concentrations of PMMoV and crAssphage in stool across all individuals were 8.7 × 10⁴ and 1.4 × 10⁴ gene copies per milligram dry weight, respectively, and crAssphage shedding was more consistent for individuals than PMMoV shedding. These results provide us with a missing link needed to connect laboratory WBE results with mechanistic models, and this will aid in more accurate estimates of COVID-19 burden in sewersheds. 
Additionally, the PMMoV and crAssphage data are critical for evaluating their utility as fecal strength normalizing measures and for source-tracking applications. IMPORTANCE: This research represents a critical step in the advancement of wastewater monitoring for public health. To date, mechanistic materials balance modeling of wastewater-based epidemiology has relied on SARS-CoV-2 fecal shedding estimates from small-scale clinical reports or meta-analyses of research using a wide range of analytical methodologies. Additionally, previous SARS-CoV-2 fecal shedding data have not contained sufficient methodological information for building accurate materials balance models. Like SARS-CoV-2, fecal shedding of PMMoV and crAssphage has been understudied to date. The data presented here provide externally valid and longitudinal fecal shedding data for SARS-CoV-2, PMMoV, and crAssphage which can be directly applied to WBE models and ultimately increase the utility of WBE. |
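The geometric mean concentrations reported above are the usual summary for right-skewed shedding data. A minimal sketch; the concentrations below are hypothetical, not the study's measurements:

```python
import math

def geometric_mean(values):
    """Geometric mean: exp of the arithmetic mean of the logs.
    Appropriate for right-skewed data such as shedding concentrations."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Hypothetical stool concentrations in gene copies per mg dry weight:
concentrations = [1e3, 1e4, 1e5, 1e6]
print(f"{geometric_mean(concentrations):.2e}")  # 3.16e+04
```

Because the geometric mean averages on the log scale, a single very high shedder does not dominate the summary the way it would with an arithmetic mean.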
Clinical characteristics and outcomes among travelers with severe Dengue : A geosentinel analysis
Huits R , Angelo KM , Amatya B , Barkati S , Barnett ED , Bottieau E , Emetulu H , Epelboin L , Eperon G , Medebb L , Gobbi F , Grobusch MP , Itani O , Jordan S , Kelly P , Leder K , Díaz-Menéndez M , Okumura N , Rizwan A , Rothe C , Saio M , Waggoner J , Yoshimura Y , Libman M , Hamer DH , Schwartz E . Ann Intern Med 2023 176 (7) 940-948 BACKGROUND: Dengue virus is a flavivirus transmitted by Aedes mosquitoes and is an important cause of illness worldwide. Data on the severity of travel-associated dengue illness are limited. OBJECTIVE: To describe the epidemiology, clinical characteristics, and outcomes among international travelers with severe dengue or dengue with warning signs as defined by the 2009 World Health Organization classification (that is, complicated dengue). DESIGN: Retrospective chart review and analysis of travelers with complicated dengue reported to GeoSentinel from January 2007 through July 2022. SETTING: 20 of 71 international GeoSentinel sites. PATIENTS: Returning travelers with complicated dengue. MEASUREMENTS: Routinely collected surveillance data plus chart review with abstraction of clinical information using predefined grading criteria to characterize the manifestations of complicated dengue. RESULTS: Of 5958 patients with dengue, 95 (2%) had complicated dengue. Eighty-six (91%) patients had a supplemental questionnaire completed. Eighty-five of 86 (99%) patients had warning signs, and 27 (31%) were classified as severe. Median age was 34 years (range, 8 to 91 years); 48 (56%) were female. Patients acquired dengue most frequently in the Caribbean (n = 27 [31%]) and Southeast Asia (n = 21 [24%]). Frequent reasons for travel were tourism (46%) and visiting friends and relatives (32%). Twenty-one of 84 (25%) patients had comorbidities. Seventy-eight (91%) patients were hospitalized. One patient died of nondengue-related illnesses. 
Common laboratory findings and signs were thrombocytopenia (78%), elevated aminotransferase (62%), bleeding (52%), and plasma leakage (20%). Among severe cases, ophthalmologic pathology (n = 3), severe liver disease (n = 3), myocarditis (n = 2), and neurologic symptoms (n = 2) were reported. Of 44 patients with serologic data, 32 confirmed cases were classified as primary dengue (IgM+/IgG-) and 12 as secondary (IgM-/IgG+) dengue. LIMITATIONS: Data for some variables could not be retrieved by chart review for some patients. The generalizability of our observations may be limited. CONCLUSION: Complicated dengue is relatively rare in travelers. Clinicians should monitor patients with dengue closely for warning signs that may indicate progression to severe disease. Risk factors for developing complications of dengue in travelers need further prospective study. PRIMARY FUNDING SOURCE: Centers for Disease Control and Prevention, International Society of Travel Medicine, Public Health Agency of Canada, and GeoSentinel Foundation. |
The impact of climate change on asthma and allergic-immunologic disease
Kelly G , Idubor OI , Binney S , Schramm PJ , Mirabelli MC , Hsu J . Curr Allergy Asthma Rep 2023 23 (8) 453-461 PURPOSE OF REVIEW: This review discusses climate change-related impacts on asthma and allergic-immunologic disease, relevant US public health efforts, and healthcare professional resources. RECENT FINDINGS: Climate change can impact people with asthma and allergic-immunologic disease through various pathways, including increased exposure to asthma triggers (e.g., aeroallergens, ground-level ozone). Climate change-related disasters (e.g., wildfires, floods) disrupting healthcare access can complicate management of any allergic-immunologic disease. Climate change disproportionately affects some communities, which can exacerbate disparities in climate-sensitive diseases like asthma. Public health efforts include implementing a national strategic framework to help communities track, prevent, and respond to climate change-related health threats. Healthcare professionals can use resources or tools to help patients with asthma and allergic-immunologic disease prevent climate change-related health impacts. Climate change can affect people with asthma and allergic-immunologic disease and exacerbate health disparities. Resources and tools are available to help prevent climate change-related health impacts at the community and individual level. |
Assessment of anti-SARS-CoV-2 antibody levels among university students vaccinated with different COVID-19 primary and booster doses - fall 2021, Wisconsin
DeJonge PM , Lambrou AS , Segaloff HE , Bateman A , Sterkel A , Griggs C , Baggott J , Kelly P , Thornburg N , Epperson M , Desamu-Thorpe R , Abedi G , Hsu CH , Nakayama JY , Ruffin J , Turner-Harper D , Matanock A , Almendares O , Whaley M , Chakrabarti A , DeGruy K , Daly M , Westergaard R , Tate JE , Kirking HL . BMC Infect Dis 2023 23 (1) 374 BACKGROUND: University students commonly received COVID-19 vaccinations before returning to U.S. campuses in the Fall of 2021. Given likely immunologic variation among students based on differences in type of primary series and/or booster dose vaccine received, we conducted serologic investigations in September and December 2021 on a large university campus in Wisconsin to assess anti-SARS-CoV-2 antibody levels. METHODS: We collected blood samples, demographic information, and COVID-19 illness and vaccination history from a convenience sample of students. Sera were analyzed for both anti-spike (anti-S) and anti-nucleocapsid (anti-N) antibody levels using World Health Organization standardized binding antibody units per milliliter (BAU/mL). Levels were compared across categorical primary COVID-19 vaccine series received and binary COVID-19 mRNA booster status. The association between anti-S levels and time since most recent vaccination dose was estimated by mixed-effects linear regression. RESULTS: In total, 356 students participated, of whom 219 (61.5%) had received a primary vaccine series of Pfizer-BioNTech or Moderna mRNA vaccines and 85 (23.9%) had received vaccines from Sinovac or Sinopharm. Median anti-S levels were significantly higher for mRNA primary vaccine series recipients (2.90 and 2.86 log [BAU/mL], respectively), compared with those who received Sinopharm or Sinovac vaccines (1.63 and 1.95 log [BAU/mL], respectively). Receipt of Sinopharm or Sinovac vaccines was also associated with significantly faster anti-S decline over time, compared with receipt of mRNA vaccines (P < .001).
By December, 48/172 (27.9%) participants reported receiving an mRNA COVID-19 vaccine booster, which reduced the anti-S antibody discrepancies between primary series vaccine types. CONCLUSIONS: Our work supports the benefit of heterologous boosting against COVID-19. COVID-19 mRNA vaccine booster doses were associated with increases in anti-SARS-CoV-2 antibody levels; following an mRNA booster dose, students who had received either an mRNA or a non-mRNA primary series showed comparable anti-S IgG levels. |
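The anti-S levels above are reported on a log scale. Assuming the common log10 [BAU/mL] convention (an assumption; the abstract does not state the base), a small illustrative sketch with hypothetical helper names converts between the log scale and absolute binding antibody units:

```python
import math

def log_to_bau(log_level: float) -> float:
    """Convert a log10-scale antibody level to absolute BAU/mL (assumes base 10)."""
    return 10 ** log_level

def bau_to_log(bau_per_ml: float) -> float:
    """Convert an absolute BAU/mL value to the log10 scale used in the abstract."""
    return math.log10(bau_per_ml)

# Median anti-S levels from the abstract: 2.90 log [BAU/mL] for mRNA
# primary-series recipients vs. 1.63 log [BAU/mL] for Sinopharm recipients.
mrna_bau = log_to_bau(2.90)       # roughly 794 BAU/mL
sinopharm_bau = log_to_bau(1.63)  # roughly 43 BAU/mL
```

On this reading, the roughly 1.3-log gap between the two primary-series groups corresponds to about a 20-fold difference in absolute anti-S antibody concentration.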
Ionizing radiation and solid cancer mortality among US nuclear facility workers
Kelly-Reif K , Bertke SJ , Daniels RD , Richardson DB , Schubauer-Berigan MK . Int J Epidemiol 2023 52 (4) 1015-1024 BACKGROUND: The risk of solid cancers from low-level protracted ionizing radiation is not well characterized. Nuclear workers provide valuable information on the effects of ionizing radiation in contemporary exposure scenarios relevant to workers and the public. METHODS: We evaluated the association between penetrating ionizing radiation exposure and solid cancer mortality among a pooled cohort of nuclear workers in the USA, with extended follow-up to examine cancers with long latencies. This analysis includes 101 363 workers from five nuclear facilities, with 12 069 solid cancer deaths between 1944 and 2016. The association between cumulative equivalent dose measured in sieverts (Sv) and solid cancer subtypes was modelled as the excess relative rate per Sv (ERR Sv-1) using Cox regression. RESULTS: For the association between ionizing radiation exposure and all solid cancer mortality we observed an elevated rate (ERR Sv-1=0.19; 95% CI: -0.10, 0.52), which was higher among a contemporary sub-cohort of workers first hired in 1960 or later (ERR Sv-1= 2.23; 95% CI: 1.13, 3.49). Similarly, we observed an elevated rate for lung cancer mortality (ERR Sv-1= 0.65; 95% CI: 0.09, 1.30) that was higher among contemporary hires (ERR Sv-1= 2.90; 95% CI: 1.00, 5.26). CONCLUSIONS: Although concerns remain about confounding, measurement error, and precision, this analysis strengthens the evidence base indicating there are radiogenic risks for several solid cancer types. |
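The ERR Sv-1 figures above parameterize a linear excess relative rate model, in which the mortality rate ratio at cumulative dose d is 1 + ERR·d. A minimal illustrative sketch using the point estimates from the abstract (the function is a hypothetical illustration of the standard linear ERR form, not the study's Cox regression code):

```python
def rate_ratio(err_per_sv: float, dose_sv: float) -> float:
    """Mortality rate ratio vs. an unexposed worker under a linear ERR model."""
    return 1.0 + err_per_sv * dose_sv

# All solid cancers, full cohort: ERR/Sv = 0.19
# Workers first hired in 1960 or later: ERR/Sv = 2.23
# Example cumulative equivalent dose: 100 mSv (0.1 Sv)
rr_full = rate_ratio(0.19, 0.1)          # ~1.02, i.e., a ~2% higher rate
rr_recent_hires = rate_ratio(2.23, 0.1)  # ~1.22, i.e., a ~22% higher rate
```

The contrast makes the cohort difference concrete: at the same 100 mSv cumulative dose, the point estimate for contemporary hires implies roughly a tenfold larger excess in the relative rate than the full-cohort estimate.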
The association of forced expiratory volume in one second with occupational exposures in a longitudinal study of adults in a rural community in Iowa
Henneberger PK , Rollins SM , Humann MJ , Liang X , Doney BC , Kelly KM , Cox-Ganser JM . Int Arch Occup Environ Health 2023 96 (6) 919-930 PURPOSE: The Keokuk County Rural Health Study (KCRHS) is a longitudinal population-based study conducted in rural Iowa. A prior analysis of enrollment data identified an association of airflow obstruction with occupational exposures only among cigarette smokers. The current study used spirometry data from all three rounds to investigate whether level of forced expiratory volume in one second (FEV(1)) and longitudinal change in FEV(1) were associated with occupational vapor-gas, dust, and fumes (VGDF) exposures, and whether these associations were modified by smoking. METHODS: This study sample comprised 1071 adult KCRHS participants with longitudinal data. A job-exposure matrix (JEM) was applied to participants' lifetime work histories to assign exposures to occupational VGDF. Mixed regression models of pre-bronchodilator FEV(1) (milliliters, ml) were fit to test for associations with occupational exposures while adjusting for potential confounders. RESULTS: Mineral dust had the most consistent association with change in FEV(1), including ever/never (-6.3 ml/year) and nearly every level of duration, intensity, and cumulative exposure. Because 92% of participants with mineral dust also had organic dust exposure, the results for mineral dust may be due to a combination of the two. An association of FEV(1) level with fumes was observed for high-intensity exposure (-91.4 ml) among all participants; further associations were limited to cigarette smokers, at -104.6 ml for ever/never exposure, -170.3 ml for high duration, and -172.4 ml for high cumulative exposure. CONCLUSION: The current findings suggest that mineral dust, possibly in combination with organic dust, and fumes exposure, especially among cigarette smokers, were risk factors for adverse FEV(1) results. |
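The ever/never mineral dust coefficient (-6.3 ml/year) can be read as an excess annual FEV(1) decline beyond that of unexposed participants. A brief illustrative sketch (a hypothetical helper, not the study's mixed regression code) shows the cumulative loss such a constant annual excess would imply:

```python
def excess_fev1_loss(years_exposed: float, excess_ml_per_year: float = -6.3) -> float:
    """Cumulative excess FEV1 change (ml) implied by a constant annual
    exposure-associated decline (default: ever/never mineral dust estimate)."""
    return years_exposed * excess_ml_per_year

# Over a 20-year working lifetime with mineral dust exposure, the estimate
# implies roughly 126 ml of additional FEV1 loss beyond normal aging.
loss_20y = excess_fev1_loss(20)  # about -126 ml
```

This back-of-envelope projection assumes the annual excess is constant over the exposure period, which is a simplification of the longitudinal model described in the abstract.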
- Page last reviewed: Feb 1, 2024
- Page last updated: Apr 22, 2024