Last data update: Mar 17, 2025. (Total: 48,910 publications since 2009)
Records 1-30 of 628
Query Trace: Xu M
A single-camera method for estimating lift asymmetry angles using deep learning computer vision algorithms
Lou Z , Zhan Z , Xu H , Li Y , Hu YH , Lu ML , Werren DM , Radwin RG . IEEE Trans Human Mach Syst 2025 A computer vision (CV) method to automatically measure the revised NIOSH lifting equation asymmetry angle (A) from a single camera is described and tested. A laboratory study involving ten participants performing various lifts was used to estimate A in comparison to ground-truth joint coordinates obtained using 3-D motion capture (MoCap). To address challenges such as obstructed views and limitations in camera placement in real-world scenarios, the CV method utilized video-derived coordinates from a selected set of landmarks. A 2-D pose estimator (HR-Net) detected landmark coordinates in each video frame, and a 3-D algorithm (VideoPose3D) estimated the depth of each 2-D landmark by analyzing its trajectories. The mean absolute precision error of the CV method, compared to MoCap measurements using the same subset of landmarks for estimating A, was 6.25° (SD = 10.19°, N = 360). The mean absolute accuracy error of the CV method, compared against conventional MoCap markers, was 9.45° (SD = 14.01°, N = 360). |
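The geometry behind the asymmetry angle A reduces to measuring, in the horizontal plane, the angle between a sagittal reference direction and the line from the body to the load. A minimal sketch of that calculation (the coordinate conventions and landmark choices here are illustrative assumptions, not the authors' exact pipeline):

```python
import math

def asymmetry_angle(sagittal_dir, mid_hands, origin):
    """Horizontal-plane angle (degrees) between a sagittal reference
    direction and the line from the body origin to the load (mid-hands).
    All inputs are (x, y, z) tuples; z is vertical and is ignored."""
    # Project both vectors onto the horizontal plane.
    vx, vy = mid_hands[0] - origin[0], mid_hands[1] - origin[1]
    sx, sy = sagittal_dir[0], sagittal_dir[1]
    dot = vx * sx + vy * sy
    cross = sx * vy - sy * vx  # z-component of the planar cross product
    return abs(math.degrees(math.atan2(cross, dot)))

# Load held 45 degrees off the sagittal plane:
angle = asymmetry_angle((1.0, 0.0, 0.0), (1.0, 1.0, 0.8), (0.0, 0.0, 0.9))
```

With real data, `sagittal_dir` would be derived from landmarks such as the ankles or shoulders, and `mid_hands` from the wrist coordinates returned by the pose estimator.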
Incidence of leading causes of pediatric chronic kidney disease using electronic health record-driven computable phenotype
Beus JM , Liu K , Westbrook A , Harding JL , Orenstein EW , Shin HS , Kandaswamy S , Wekon-Kemeni C , Pavkov ME , Xu F , Smith EA , Rouster-Stevens KA , Prahalad S , Greenbaum LA , Wang CS . Kidney360 2025 BACKGROUND: Incidence data on pediatric chronic kidney disease (CKD) are incomplete. We developed electronic health record (EHR)-based algorithms (e-phenotypes) to identify cases and provide incidence estimates of 5 leading causes of pediatric CKD. METHODS: E-phenotypes using common standardized clinical terminology were built and contained utilization, diagnostic, procedural, age, and time-period inclusion and exclusion criteria for autosomal dominant polycystic kidney disease (ADPKD), Alport syndrome (AS), congenital anomalies of the kidney and urinary tract (CAKUT), lupus nephritis (LN), and primary childhood nephrotic syndrome (NS). Cases diagnosed between 2014 and 2023 were identified from a pediatric healthcare system that is the sole pediatric nephrology provider serving the Atlanta Metropolitan Statistical Area (MSA). The performance of the e-phenotypes was tested using a cohort of 1,000 pediatric patients. Cases identified were used to estimate incidences using population information from the Georgia Department of Health. RESULTS: The e-phenotypes demonstrated sensitivity ranging from 0.83 to 0.95, specificity from 0.96 to 1.00, PPV from 0.81 to 1.00, and NPV from 0.98 to 1.00. All positive likelihood ratios (LR) were >20 and all negative LRs were <0.20. The 6,814 combined cases of ADPKD (n=107), AS (n=31), CAKUT (n=6,120), LN (n=161), and NS (n=395) had an annual incidence of 47.07 (95% CI 45.96-48.20) per 100,000 children. Annual incidence per 100,000 children (95% CI) for each condition was: ADPKD 0.74 (0.61-0.89), AS 0.21 (0.15-0.30), CAKUT 42.28 (41.22-43.35), LN 1.11 (0.95-1.30), and NS 2.73 (2.47-3.01). CONCLUSIONS: Our incidence estimates suggest CKD conditions are common among children.
The e-phenotypes require validation for use at other institutions but offer opportunities to examine determinants of CKD detection, management, and outcomes. |
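The validation statistics reported above follow directly from a 2x2 table of e-phenotype classifications against chart review; the counts below are hypothetical and only illustrate the arithmetic:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, NPV, and likelihood ratios
    from a 2x2 validation table (e-phenotype vs. chart review)."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    lr_pos = sens / (1 - spec)  # positive likelihood ratio
    lr_neg = (1 - sens) / spec  # negative likelihood ratio
    return sens, spec, ppv, npv, lr_pos, lr_neg

def annual_incidence_per_100k(cases, person_years):
    """Crude annual incidence per 100,000 from case counts and
    person-years at risk."""
    return cases / person_years * 100_000

# Hypothetical validation counts, not the study's actual 2x2 tables:
sens, spec, ppv, npv, lr_pos, lr_neg = diagnostic_metrics(90, 5, 10, 895)
inc = annual_incidence_per_100k(47, 100_000)
```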
Gestational PBDE concentrations and executive function in adolescents with self- and caregiver-report: The HOME study
Cecil KM , Xu Y , Chen A , Braun JM , Sjodin A , Lanphear BP , Vuong AM , Yolton K . Environ Res 2025 273 121256 BACKGROUND: Polybrominated diphenyl ethers (PBDEs), synthetic chemicals previously used as flame retardants in commercial products, impact human behaviors, mood symptoms, and cognitive abilities. OBJECTIVE: We estimated the association of gestational PBDE serum concentrations with early adolescent self- and caregiver-reported ratings of executive function in a prospective pregnancy and birth cohort. METHODS: We measured gestational serum concentrations of five PBDE congeners and created a summary exposure variable (∑(5)BDE: -28, -47, -99, -100, and -153). At age 12 years, we assessed executive function for 237 adolescents using self- and caregiver-reports with the Behavior Rating Inventory of Executive Functioning (BRIEF-2). We used multivariable linear regression models to estimate covariate-adjusted associations of lipid-standardized, log(10)-transformed gestational PBDE concentrations with BRIEF-2 T-scores. We evaluated potential effect measure modification (EMM) by sex by examining sex-stratified regression models. RESULTS: As higher scores indicate greater deficits in executive function, gestational PBDE concentrations were positively associated with adolescent-reported BRIEF-2 T-scores for the Global Executive Composite (BDE-28: β = 6.31, 95% CI: 2.59, 10.03; BDE-47: β = 3.32, 95% CI: 0.10, 6.54; ∑(5)BDE: β = 3.70, 95% CI: 0.37, 7.03), the Behavior Regulation Index (BDE-28: β = 5.36, 95% CI: 1.56, 9.15; BDE-99: β = 3.53, 95% CI: 0.33, 6.74; ∑(5)BDE: β = 3.93, 95% CI: 0.57, 7.30), the Emotion Regulation Index (BDE-28: β = 4.76, 95% CI: 0.88, 8.64), and the Cognitive Regulation Index (BDE-28: β = 6.69, 95% CI: 3.08, 10.31; BDE-47: β = 3.45, 95% CI: 0.30, 6.59; ∑(5)BDE: β = 3.57, 95% CI: 0.32, 6.82), as well as several other scales.
We observed stronger associations with gestational PBDE concentrations for all congeners among males, especially for the caregiver-rated scales (all EMM p-values <0.1). DISCUSSION: This study provides evidence that gestational PBDE serum concentrations may adversely influence offspring executive function during adolescence. |
Salmonella serotypes in the genomic era: simplified Salmonella serotype interpretation from DNA sequence data
Deng X , Li S , Xu T , Zhou Z , Moore MM , Timme R , Zhao S , Lane C , Dinsmore BA , Weill F , Fields PI . Appl Environ Microbiol 2025 e0260024 In the era of genomic characterization of strains for public health microbiology, whole genome sequencing (WGS)-enabled subtyping of Salmonella provides superior discrimination of strains compared to traditional methods such as serotyping. Nonetheless, serotypes are still very useful; they maintain historical continuity and facilitate clear communication. Genetic determination of serotypes from WGS data is now routine. Genetic determination of rarer serotypes can be problematic due to a lack of sequences for rare antigen types and alleles, a lack of understanding of the genetic basis for some antigens, or inconsistencies in the White-Kauffmann-Le Minor (WKL) Scheme for Salmonella serotype designation. Here, we present a simplified interpretation of serotypes to address the shortcomings of genetic methods, which will allow the streamlined integration of serotype determination into the WGS workflow. The simplification represents a consensus perspective among major U.S. public health agencies and serves as a WGS-oriented interpretation of the WKL Scheme. We also present SeqSero2S, a bioinformatics tool for WGS-based serotype prediction using the simplified interpretation. IMPORTANCE: The utility of Salmonella serotyping has evolved from a primary subtyping method, where the need for strain discrimination justified its complexity, to a supplemental subtyping scheme and nomenclature convention, where clarity and simplicity in communication have become important for its continued use. Compared to phenotypic methods like serotyping, whole genome sequencing (WGS)-based subtyping methods excel in recognizing natural populations, which avoids grouping together strains from different genetic backgrounds or splitting genetically related strains into different groups.
This simplified interpretation of serotypes addresses a shortcoming of the original scheme by combining some serotypes that are known to be genetically related. Our simplified interpretation of the White-Kauffmann-Le Minor (WKL) Scheme facilitates a complete and smooth transition of serotyping's role, especially from the public health perspective that has been shaped by the routine use of WGS. |
Incidence of metabolic and bariatric surgery among US adults with obesity by diabetes status: 2016-2020
Cheng YJ , Bullard KM , Hora I , Belay B , Xu F , Holliday CS , Simons-Linares R , Benoit SR . BMJ Open Diabetes Res Care 2025 13 (1) INTRODUCTION: Metabolic and bariatric surgery (MBS) is an effective intervention to manage diabetes and obesity. The population-based incidence of MBS is unknown. OBJECTIVE: To estimate the incidence of MBS among US adults with obesity by diabetes status and selected sociodemographic characteristics. RESEARCH DESIGN AND METHODS: This cross-sectional study used data from the 2016-2020 Nationwide Inpatient Sample and Nationwide Ambulatory Surgery Sample to capture MBS procedures. The National Health Interview Survey was used to establish the denominator for incidence calculations. Participants included US non-pregnant adults aged ≥18 years with obesity. The main outcome was incident MBS without previous MBS, defined by International Classification of Diseases, Tenth Revision Procedure Codes, Diagnosis Related Group system codes, and Current Procedural Terminology codes. Adjusted incidence and annual percentage change (2016-2019) were estimated using logistic regression. RESULTS: Among US adults with obesity, over 900 000 MBS procedures were performed in inpatient and hospital-owned ambulatory surgical centers in the USA during 2016-2020. The age- and sex-adjusted incidence of MBS per 1000 adults was 5.9 (95% CI 5.4 to 6.4) for adults with diabetes and 2.0 (95% CI 1.9 to 2.1) for adults without diabetes. MBS incidence was significantly higher for women and adults with class III obesity regardless of diabetes status. The highest incidence of MBS occurred in the Northeast region. Sleeve gastrectomy was the most common MBS surgical approach. CONCLUSIONS: Incident MBS procedures were nearly threefold higher among adults with obesity and diabetes than those with obesity but without diabetes. 
Continued monitoring of the trends of MBS and other treatment modalities can inform our understanding of treatment accessibility to guide prevention efforts aimed at reducing obesity and diabetes. |
Health and health care utilization outcomes for individuals with traumatic brain injury: A 1-year longitudinal study
Waltzman D , Miller GF , Xu L , Haarbauer-Krupa J , Hammond FM . J Head Trauma Rehabil 2025 OBJECTIVE: Traumatic brain injury (TBI) can result in new onset of comorbidities, and limited studies suggest health care utilization following TBI may be high. Setting, Participants, Main Measures, and Design: This study used 2018 and 2019 MarketScan Commercial Claims and Encounters data to examine differences in longitudinal health outcomes (health care utilization and new diagnoses) by various demographic factors (age, sex, U.S. region, intent/mechanism of injury, urbanicity, and insurance status) among individuals with and without a TBI in the year following an index health care encounter. RESULTS: Results show that within 1 year of the initial encounter, a higher percentage of patients with TBI versus without TBI had at least one outpatient visit (96.7% vs 86.1%), emergency department (ED) visit (28.5% vs 13.1%), or hospital admission (6.4% vs 2.6%). Both children (33.8% vs 23.4%) and adults (43.8% vs 31.4%) who sustained a TBI had a higher percentage of new diagnoses within 1 year compared to the non-TBI group. Additionally, individuals with a TBI had greater health care utilization across all types of health care settings (outpatient and inpatient), visits (ED visits and hospital admissions), and across all demographic factors (P < .001). CONCLUSION: These results may inform future research around the development of systems of care to improve longer-term outcomes in individuals with TBI. |
Substructure-specific antibodies against fentanyl derivatives
Chapman A , Xu M , Schroeder M , Goldstein JM , Chida A , Lee JR , Tang X , Wharton RE , Finn MG . ACS Nano 2025 Structural variants of the synthetic opioid fentanyl are a major threat to public health. Following an investigation showing that many derivatives are poorly detected by commercial lateral flow and related assays, we created hapten conjugate vaccines using an immunogenic virus-like particle carrier and eight synthetic fentanyl derivatives designed to mimic the structural features of several of the more dangerous analogues. Immunization of mice elicited strong antihapten humoral responses, allowing the screening of hundreds of hapten-specific hybridomas for binding strength and specificity. A panel of 13 monoclonal IgG antibodies was selected, each showing a different pattern of recognition of fentanyl structural variations, and all proving to be highly efficient at capturing parent fentanyl compounds in competition ELISA experiments. These results provide antibody reagents for assay development as well as a demonstration of the power of the immune system to create binding agents capable of both broad and specific recognition of small-molecule targets. |
Pain management and social functioning limitations among adults with chronic pain by diabetes status: National Health Interview Survey, United States, 2019-2020
Zaganjor I , Saelee R , Miyamoto Y , Xu F , Pavkov ME . Prim Care Diabetes 2024 AIMS: This study aims to describe pain management technique usage and social functioning limitations among adults with chronic pain by diabetes status. METHODS: The 2019 and 2020 National Health Interview Survey data were pooled to complete this analysis. Use of the following techniques in the past 3 months were measured: 1) prescription opioids; 2) physical, rehabilitative, or occupational therapy; 3) talk therapies; 4) chiropractic care; 5) yoga, Tai Chi, or Qi Gong; 6) massage; and 7) relaxation techniques. The social functioning limitations assessed were: 1) doing errands alone; 2) participating in social activities; and 3) work limitations. Weighted prevalence and 95 % confidence intervals (CIs) were estimated for each outcome by diabetes status. Logistic regression was used to estimate age- and sex-adjusted odds ratios (aORs) to assess differences by diabetes status. RESULTS: Adults with diabetes and chronic pain were more likely to use prescription opioids (aOR: 1.4; 95 % CI: 1.2, 1.6) but less likely to use various nonpharmacological techniques than those without diabetes. Additionally, adults with diabetes and chronic pain were more likely to report each social functioning limitation than those without diabetes. CONCLUSIONS: Results suggest adults with diabetes and chronic pain may be missing beneficial opportunities to manage pain. |
Indicator-based tuberculosis infection control assessments with knowledge, attitudes, and practices evaluations among health facilities in China, 2017-2019
Zhang C , O'Connor S , Chen H , Rodriguez DF , Hao L , Wang Y , Li Y , Xu J , Chen Y , Xia L , Yang X , Zhao Y , Cheng J . Am J Infect Control 2024 BACKGROUND: Tuberculosis (TB) Building and Strengthening Infection Control Strategies (TB BASICS) aimed to achieve improvements in TB infection prevention and control (IPC) through structured training and mentorship. METHODS: TB BASICS was implemented in six Chinese provinces from 2017-2019. Standardized, facility-based risk assessments tailored to inpatient, laboratory, and outpatient departments were conducted quarterly for 18 months. Knowledge, attitudes, and practices surveys were administered to healthcare workers (HCW) at nine participating facilities during the first and last assessments. The Kruskal-Wallis rank sum test assessed score differences between departments (alpha = 0.05). RESULTS: Fifty-seven departments received risk assessments. IPC policies and practices improved substantially during follow-up. Facility-based assessment scores were significantly lower in outpatient departments than in other departments (p < 0.05). All indicators achieved at least partial implementation by the final assessment. Low scores persisted for implementing isolation protocols, while personal protective equipment use among staff was consistent among all departments. Overall, we observed minimal change in IPC knowledge among HCW. In general, HCW had favorable views of their own IPC capabilities, but reported limited agency to improve institutional IPC. CONCLUSIONS: TB BASICS demonstrated improvements in TB IPC implementation. Structured training and mentorship engaged HCW to maintain confidence and competency for TB prevention. |
Genomic perspective on the bacillus causing paratyphoid B fever
Hawkey J , Frézal L , Tran Dien A , Zhukova A , Brown D , Chattaway MA , Simon S , Izumiya H , Fields PI , De Lappe N , Kaftyreva L , Xu X , Isobe J , Clermont D , Njamkepo E , Akeda Y , Issenhuth-Jeanjean S , Makarova M , Wang Y , Hunt M , Jenkins BM , Ravel M , Guibert V , Serre E , Matveeva Z , Fabre L , Cormican M , Yue M , Zhu B , Morita M , Iqbal Z , Silva Nodari C , Pardos de la Gandara M , Weill FX . Nat Commun 2024 15 (1) 10143 Paratyphoid B fever (PTB) is caused by an invasive lineage (phylogroup 1, PG1) of Salmonella enterica serotype Paratyphi B (SPB). However, little was known about the global population structure, geographic distribution, and evolution of this pathogen. Here, we report a whole-genome analysis of 568 historical and contemporary SPB PG1 isolates obtained globally between 1898 and 2021. We show that this pathogen existed in the 13th century, subsequently diversifying into 11 lineages and 38 genotypes with strong phylogeographic patterns. Following its discovery in 1896, it circulated across Europe until the 1970s, after which it was mostly reimported into Europe from South America, the Middle East, South Asia, and North Africa. Antimicrobial resistance recently emerged in various genotypes of SPB PG1, mostly through mutations of the quinolone-resistance-determining regions of gyrA and gyrB. This study provides an unprecedented insight into SPB PG1 and essential genomic tools for identifying and tracking this pathogen, thereby facilitating the global genomic surveillance of PTB. |
Three-dimensional heat and moisture transfer analysis for thermal protection of firefighters' gloves with phase change materials
Xu SS , Pollard J , Zhao W . Int J Occup Saf Ergon 2024 1-17 Transient three-dimensional (3D) heat and moisture transfer simulations were conducted to analyze the thermal performance of phase change material (PCM) fully integrated into firefighters' gloves. The PCM was divided into several segments covering the back and palm of the hand while avoiding the finger joints to preserve hand function. Parametric studies were performed to explore the effects of PCM melting temperatures, PCM locations in the glove, and PCM layer thicknesses on the overall thermal performance of firefighters' gloves. The study found that PCM segments could extend the time for hand skin surfaces (areas covered or not covered by PCM) to reach the second-degree burn injury threshold (60 °C) by 1.5-2 times compared to conventional firefighters' gloves without PCM. Moreover, PCM segments could help mitigate the temperature increase on the hand skin and glove surface after fire exposure. |
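The protective mechanism the simulations quantify, latent heat absorbed at a constant melting temperature delaying the 60 °C burn threshold, can be illustrated with a lumped-capacitance estimate (all material properties and flux values below are assumptions chosen for illustration; the paper's results come from full 3-D simulations):

```python
def time_to_threshold(q_flux, area, mass, cp, t0, t_threshold,
                      latent_heat=0.0, t_melt=None):
    """Lumped-capacitance estimate of the time (s) for a layer under a
    constant heat flux to reach a threshold temperature. If a melting
    point lies between t0 and t_threshold, heating pauses there while
    the latent heat is absorbed."""
    power = q_flux * area                        # absorbed power, W
    t = mass * cp * (t_threshold - t0) / power   # sensible heating time
    if latent_heat and t_melt is not None and t0 < t_melt < t_threshold:
        t += mass * latent_heat / power          # melting plateau
    return t

# Assumed values: 5 kW/m^2 flux on a 0.01 m^2 patch, 20 g layer,
# cp = 2000 J/(kg*K), latent heat 200 kJ/kg, melting point 55 C.
base = time_to_threshold(5000, 0.01, 0.020, 2000, 30, 60)
with_pcm = time_to_threshold(5000, 0.01, 0.020, 2000, 30, 60,
                             latent_heat=200_000, t_melt=55)
```

Even in this crude model the melting plateau multiplies the time to threshold severalfold, which is the qualitative effect the 3-D simulations resolve in detail.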
Force-induced tissue compression alters circulating hormone levels and biomarkers of peripheral vascular and sensorineural dysfunction in an animal model of hand-arm vibration syndrome
Krajnak K , Waugh S , Warren C , Chapman P , Xu X , Welcome D , Hammer M , Richardson D , Dong R . J Toxicol Environ Health A 2024 1-21 Workers regularly using vibrating hand tools may develop a disorder referred to as hand-arm vibration syndrome (HAVS). HAVS is characterized by cold-induced vasospasms in the hands and fingers that result in blanching of the skin, loss of sensory function, pain, and reductions in manual dexterity. Exposure to vibration induces some of these symptoms. However, the soft tissues of the hands and fingers of workers are also compressed by the force generated when a worker grips a tool, and this compression might likewise contribute to the development of HAVS. The goal of this study was to use an established rat-tail model to determine the mechanisms by which compression of the tail tissues affects (1) the ventral tail artery (VTA) and ventral tail nerves (VTN), (2) nerves and sensory receptors in the skin, (3) dorsal root ganglia (DRG), and (4) the spinal cord. Tissue compression resulted in changes in (1) circulating pituitary and steroid hormone concentrations, (2) expression of factors that modulate vascular function in the skin and tail artery, and (3) factors associated with nerve damage in the DRG and spinal cord. Some of these effects differed from those previously observed with vibration exposure, indicating that the effects of applied force and vibration are different. Studies examining the combination of these factors might provide data that could be used to improve risk assessment and support revision of exposure standards. |
Telemedicine use among adults with and without diagnosed prediabetes or diabetes, National Health Interview Survey, United States, 2021 and 2022
Zaganjor I , Saelee R , Onufrak S , Miyamoto Y , Koyama AK , Xu F , Bullard KM , Pavkov ME . Prev Chronic Dis 2024 21 E90 We analyzed 2021 and 2022 National Health Interview Survey data to describe the prevalence of past 12-month telemedicine use among US adults with no prediabetes or diabetes diagnosis, diagnosed prediabetes, and diagnosed diabetes. In 2021 and 2022, telemedicine use prevalence was 34.1% and 28.2% among adults without diagnosed diabetes or prediabetes, 47.6% and 37.6% among adults with prediabetes, and 52.8% and 39.4% among adults with diabetes, respectively. Differences in telemedicine use were identified by region, urbanicity, insurance status, and education among adults with prediabetes or diabetes. Findings suggest that telemedicine use can be improved among select populations with prediabetes or diabetes. |
Prevalence of self-reported diagnosed diabetes among adults, by county metropolitan status and region, United States, 2019-2022
Onufrak S , Saelee R , Zaganjor I , Miyamoto Y , Koyama AK , Xu F , Pavkov ME , Bullard KM , Imperatore G . Prev Chronic Dis 2024 21 E81 INTRODUCTION: Previous research suggests that rural-urban disparities in diabetes mortality, hospitalization, and incidence rates may manifest differently across US regions. However, no studies have examined disparities in diabetes prevalence by metropolitan residence and region. METHODS: We used data from the 2019-2022 National Health Interview Survey to compare diabetes status, socioeconomic characteristics, and weight status among adults in each census region (Northeast, Midwest, South, West) according to county metropolitan status of residence (large central metro, large fringe metro, small/medium metro, and nonmetro). We used χ² tests and logistic regression models to assess the association of metropolitan residence with diabetes prevalence in each region. RESULTS: Diabetes prevalence ranged from 7.0% in large fringe metro counties in the Northeast to 14.8% in nonmetro counties in the South. Compared with adults from large central metro counties, those from small/medium metro counties had significantly higher odds of diabetes in the Midwest (age-, sex-, and race and ethnicity-adjusted odds ratio [OR] = 1.24; 95% CI, 1.06-1.45) and South (OR = 1.15; 95% CI, 1.02-1.30). Nonmetro residence was also associated with diabetes in the South (OR = 1.62 vs large central metro; 95% CI, 1.43-1.84). After further adjustment for socioeconomic and body weight status, small/medium metro associations with diabetes became nonsignificant, but nonmetro residence in the South remained significantly associated with diabetes (OR = 1.22; 95% CI, 1.07-1.39). CONCLUSION: The association of metropolitan residence with diabetes prevalence differs across US regions. These findings can help to guide efforts in areas where diabetes prevention and care resources may be better directed. |
Rat-tail models for studying hand-arm vibration syndrome: A comparison between living and cadaver rat tails
Warren CM , Xu XS , Jackson M , McKinney WG , Wu JZ , Welcome DE , Waugh S , Chapman P , Sinsel EW , Service S , Krajnak K , Dong RG . Vib 2024 7 (3) 722-737 Over-exposure of the hand-arm system to intense vibration and force over time may cause degeneration of the vascular, neurological, and musculoskeletal systems in the fingers. A novel animal model using rat tails has been developed to understand the health effects on human fingers exposed to vibration and force when operating powered hand tools or workpieces. The biodynamic responses, such as vibration stress, strain, and power absorption density, of the rat tails can be used to help evaluate the health effects related to vibration and force and to establish a dose-effect relationship. While the biodynamic responses of cadaver rat tails have been investigated, the objective of the current study was to determine whether the biodynamic responses of living rat tails are different from those of cadaver rat tails, and whether the biodynamic responses of both living and cadaver tails change with exposure duration. To make direct comparisons, the responses of both cadaver and living rat tails were examined on four different testing stations. The transfer function of each tail under a given contact force (2 N) was measured at each frequency in the one-third octave bands from 20 to 1000 Hz, and used to calculate the mechanical system parameters of the tails. The transfer functions were also measured at different exposure durations to determine the time dependency of the response. Differences were observed in the vibration biodynamic responses between living and cadaver tails, but the general trends were similar. The biodynamic responses of both cadaver and living rat tails varied with exposure duration. © 2024 by the authors. |
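The one-third octave test frequencies used in the rat-tail measurements follow the standard geometric spacing, each centre frequency 2^(1/3) times the previous one, referenced to 1000 Hz. A sketch of generating the exact (unrounded) centres covering the study's 20-1000 Hz range:

```python
def third_octave_centers(f_start, f_stop):
    """Exact base-2 one-third-octave band centre frequencies covering
    [f_start, f_stop], referenced to 1000 Hz. A small tolerance admits
    the band whose nominal label equals f_start (e.g. 20 Hz ~ 19.7 Hz
    exact)."""
    centers = []
    n = -30  # start well below the range of interest
    while True:
        f = 1000.0 * 2 ** (n / 3)  # exact centre for band index n
        if f > f_stop * 1.01:
            break
        if f >= f_start * 0.98:
            centers.append(round(f, 1))
        n += 1
    return centers

bands = third_octave_centers(20, 1000)
```

The familiar nominal labels (20, 25, 31.5, ..., 800, 1000 Hz) are the rounded preferred values of these exact centres; 18 bands span 20-1000 Hz.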
Impact of self-contained breathing apparatus (SCBA) weights on firefighter's kinematics during simulated firefighter tasks
Xu S , Jones R , Ratnakumar N , Akbas K , Powell J , Zhuang Z , Zhou X . Appl Hum Factors Ergon Conf 2024 131 142-149 Firefighters face a multitude of hazards in their line of duty, with overexertion being one of the foremost causes of injuries or fatalities. This high risk is often exacerbated by the burden of carrying a heavy self-contained breathing apparatus (SCBA). This study aims to explore the impact of SCBA weight on firefighters' musculoskeletal joint movements. Six firefighters participated in this study, performing four simulated firefighting tasks under three different SCBA weight conditions. A hybrid inverse kinematics approach was employed to analyze the kinematic data from two participants. The results revealed a notable decrease in lumbar range of motion (ROM) as the weight increased, particularly noticeable during hose advancement and stair climbing tasks. Conversely, an increase in hip ROM during stair climbing was observed, suggesting a compensatory response to reduced spinal flexibility. These findings underscore the critical need to understand the implications of turnout gear and SCBA weight to enhance firefighter performance and reduce the risk of injury. |
3D numerical simulation for thermal protection of phase change material-integrated firefighters' turnout gear
Xu SS , Pollard J , Zhao W . Appl Hum Factors Ergon Conf 2024 131 133-141 This work aims to investigate and develop a novel phase change material (PCM)-integrated firefighters' turnout gear technology that would significantly enhance the thermal protection of firefighters' bodies from thermal burn injuries under high-heat conditions (such as in fire scenes). This work established a 3D human thermal simulation to explore the thermal protection improvements of firefighters' turnout gear by using PCM segments under flashover and hazardous conditions. This simulation study will guide future experimental design and testing effectively and save time and effort. The study found that the 3.0-mm-thick PCM segments with a melting temperature of 60°C could extend the thermal protection time for skin surface to reach second-degree burn injury (60°C) by one to three times under flashover conditions compared to the turnout gear without PCM. Moreover, thinner PCM segments, i.e., 1.0-3.0 mm thickness, could also significantly mitigate the skin surface temperature increase while avoiding the added weight on the turnout gear. The 3D modelling results can be used to develop a next-generation firefighter turnout gear technology. |
Antigenic drift and subtype interference shape A(H3N2) epidemic dynamics in the United States
Perofsky AC , Huddleston J , Hansen CL , Barnes JR , Rowe T , Xu X , Kondor R , Wentworth DE , Lewis N , Whittaker L , Ermetal B , Harvey R , Galiano M , Daniels RS , McCauley JW , Fujisaki S , Nakamura K , Kishida N , Watanabe S , Hasegawa H , Sullivan SG , Barr IG , Subbarao K , Krammer F , Bedford T , Viboud C . Elife 2024 13 Influenza viruses continually evolve new antigenic variants through mutations in epitopes of their major surface proteins, hemagglutinin (HA) and neuraminidase (NA). Antigenic drift potentiates the reinfection of previously infected individuals, but the contribution of this process to variability in annual epidemics is not well understood. Here, we link influenza A(H3N2) virus evolution to regional epidemic dynamics in the United States during 1997-2019. We integrate phenotypic measures of HA antigenic drift and sequence-based measures of HA and NA fitness to infer antigenic and genetic distances between viruses circulating in successive seasons. We estimate the magnitude, severity, timing, transmission rate, age-specific patterns, and subtype dominance of each regional outbreak and find that genetic distance based on broad sets of epitope sites is the strongest evolutionary predictor of A(H3N2) virus epidemiology. Increased HA and NA epitope distance between seasons correlates with larger, more intense epidemics, higher transmission, greater A(H3N2) subtype dominance, and a greater proportion of cases in adults relative to children, consistent with increased population susceptibility. Based on random forest models, A(H1N1) incidence impacts A(H3N2) epidemics to a greater extent than viral evolution, suggesting that subtype interference is a major driver of influenza A virus infection dynamics, presumably via heterosubtypic cross-immunity. | Seasonal influenza (flu) viruses cause outbreaks every winter. People infected with influenza typically develop mild respiratory symptoms.
But flu infections can cause serious illness in young children, older adults and people with chronic medical conditions. Infected or vaccinated individuals develop some immunity, but the viruses evolve quickly to evade these defenses in a process called antigenic drift. As the viruses change, they can re-infect previously immune people. Scientists update the flu vaccine yearly to keep up with this antigenic drift. The immune system fights flu infections by recognizing two proteins, known as antigens, on the virus’s surface, called hemagglutinin (HA) and neuraminidase (NA). However, mutations in the genes encoding these proteins can make them unrecognizable, letting the virus slip past the immune system. Scientists would like to know how these changes affect the size, severity and timing of annual influenza outbreaks. Perofsky et al. show that tracking genetic changes in HA and NA may help improve flu season predictions. The experiments compared the severity of 22 flu seasons caused by the A(H3N2) subtype in the United States with how much HA and NA had evolved since the previous year. The A(H3N2) subtype experiences the fastest rates of antigenic drift and causes more cases and deaths than other seasonal flu viruses. Genetic changes in HA and NA were a better predictor of A(H3N2) outbreak severity than the blood tests for protective antibodies that epidemiologists traditionally use to track flu evolution. However, the prevalence of another subtype of influenza A circulating in the population, called A(H1N1), was an even better predictor of how severe A(H3N2) outbreaks would be. Perofsky et al. are the first to show that genetic changes in NA contribute to the severity of flu seasons. Previous studies suggested a link between genetic changes in HA and flu season severity, and flu vaccines include the HA protein to help the body recognize new influenza strains. The results suggest that adding the NA protein to flu vaccines may improve their effectiveness. 
In the future, flu forecasters may want to analyze genetic changes in both NA and HA to make their outbreak predictions. Tracking how much of the A(H1N1) subtype is circulating may also be useful for predicting the severity of A(H3N2) outbreaks. |
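As a minimal, hypothetical illustration of the season-level association analysis described above — relating antigenic or genetic distance between seasons to epidemic size — a Pearson correlation can be computed directly; the function name and the toy data here are assumptions for illustration, not values from the study:

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# Toy season-level data: HA epitope distance vs. epidemic size (illustrative only)
epitope_distance = [0.5, 1.2, 2.0, 2.8, 3.5]
epidemic_size = [1100, 1800, 2600, 3300, 4100]
r = pearson_r(epitope_distance, epidemic_size)
```

The study itself uses richer machinery (random forest models, multiple epidemic metrics); this sketch only shows the simplest version of a distance-versus-severity association.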
Antivirals for treatment of severe influenza: a systematic review and network meta-analysis of randomised controlled trials
Gao Y , Guyatt G , Uyeki TM , Liu M , Chen Y , Zhao Y , Shen Y , Xu J , Zheng Q , Li Z , Zhao W , Luo S , Chen X , Tian J , Hao Q . Lancet 2024 404 (10454) 753-763 BACKGROUND: The optimal antiviral drug for treatment of severe influenza remains unclear. To support updated WHO influenza clinical guidelines, this systematic review and network meta-analysis evaluated antivirals for treatment of patients with severe influenza. METHODS: We systematically searched MEDLINE, Embase, Cochrane Central Register of Controlled Trials, Cumulative Index to Nursing and Allied Health Literature, Global Health, Epistemonikos, and ClinicalTrials.gov for randomised controlled trials published up to Sept 20, 2023, that enrolled hospitalised patients with suspected or laboratory-confirmed influenza and compared direct-acting influenza antivirals against placebo, standard care, or another antiviral. Pairs of coauthors independently extracted data on study characteristics, patient characteristics, antiviral characteristics, and outcomes, with discrepancies resolved by discussion or by a third coauthor. Key outcomes of interest were time to alleviation of symptoms, duration of hospitalisation, admission to intensive care unit, progression to invasive mechanical ventilation, duration of mechanical ventilation, mortality, hospital discharge destination, emergence of antiviral resistance, adverse events, adverse events related to treatments, and serious adverse events. We conducted frequentist network meta-analyses to summarise the evidence and evaluated the certainty of evidence using the GRADE (Grading of Recommendations Assessment, Development and Evaluation) approach. This study is registered with PROSPERO, CRD42023456650. 
FINDINGS: Of 11 878 records identified by our search, eight trials with 1424 participants (mean age 36-60 years for trials that reported mean or median age; 43-78% male patients) were included in this systematic review, of which six were included in the network meta-analysis. The effects of oseltamivir, peramivir, or zanamivir on mortality compared with placebo or standard care without placebo for seasonal and zoonotic influenza were of very low certainty. Compared with placebo or standard care, we found low certainty evidence that duration of hospitalisation for seasonal influenza was reduced with oseltamivir (mean difference -1·63 days, 95% CI -2·81 to -0·45) and peramivir (-1·73 days, -3·33 to -0·13). Compared with standard care, there was little or no difference in time to alleviation of symptoms with oseltamivir (0·34 days, -0·86 to 1·54; low certainty evidence) or peramivir (-0·05 days, -0·69 to 0·59; low certainty evidence). There were no differences in adverse events or serious adverse events with oseltamivir, peramivir, and zanamivir (very low certainty evidence). Uncertainty remains about the effects of antivirals on other outcomes for patients with severe influenza. Due to the small number of eligible trials, we could not test for publication bias. INTERPRETATION: In hospitalised patients with severe influenza, oseltamivir and peramivir might reduce duration of hospitalisation compared with standard care or placebo, although the certainty of evidence is low. The effects of all antivirals on mortality and other important patient outcomes are very uncertain due to scarce data from randomised controlled trials. FUNDING: World Health Organization. |
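The mean differences with 95% CIs reported above are typically combined by inverse-variance weighting, the basic building block of pairwise and network meta-analysis. A minimal fixed-effect sketch — the helper name and the back-calculation of standard errors from the reported CIs are illustrative assumptions, not the authors' frequentist network model:

```python
import math

def pool_fixed_effect(estimates):
    """Fixed-effect inverse-variance pooling of (mean_difference, standard_error) pairs."""
    weights = [1.0 / se ** 2 for _, se in estimates]
    pooled = sum(w * md for (md, _), w in zip(estimates, weights)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Standard errors back-calculated from the reported 95% CIs: SE = CI width / 3.92
oseltamivir = (-1.63, (2.81 - 0.45) / 3.92)   # hospitalisation duration, days
peramivir = (-1.73, (3.33 - 0.13) / 3.92)
pooled, ci = pool_fixed_effect([oseltamivir, peramivir])
```

A full network meta-analysis additionally links indirect comparisons through common comparators; this shows only the weighting step.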
Urban-rural differences in acute kidney injury mortality in the United States
Xu F , Miyamoto Y , Zaganjor I , Onufrak S , Saelee R , Koyama AK , Pavkov ME . Am J Prev Med 2024 INTRODUCTION: Acute kidney injury (AKI) is associated with increased mortality. AKI-related mortality trends by US urban and rural counties were assessed. METHODS: In the cross-sectional study, based on the Centers for Disease Control and Prevention WONDER (Wide-ranging ONline Data for Epidemiologic Research) Multiple Cause of Death data, age-standardized mortality with AKI as the multiple cause was obtained among adults aged ≥25 years from 2001-2020, by age, sex, race and ethnicity, stratified by urban-rural counties. Joinpoint regressions were used to assess trends from 2001-2019 in AKI-related mortality rate. Pairwise comparison was used to compare mean differences in mortality between urban and rural counties from 2001-2019. RESULTS: From 2001-2020, age-standardized AKI-related mortality was consistently higher in rural than urban counties. AKI-related mortality (per 100,000 population) increased from 18.95 in 2001 to 29.46 in 2020 in urban counties and from 20.10 in 2001 to 38.24 in 2020 in rural counties. In urban counties, AKI-related mortality increased annually by 4.6% during 2001-2009 and decreased annually by 1.8% until 2019 (p<0.001). In rural counties, AKI-related mortality increased annually by 5.0% during 2001-2011 and decreased by 1.2% until 2019 (p<0.01). The overall urban-rural difference in AKI-related mortality was greater after 2009-2011. AKI-related mortality was significantly higher among older adults, men, and non-Hispanic Black adults than their counterparts in both urban and rural counties. Higher mortality was concentrated in rural counties in the Southern United States. CONCLUSIONS: Multidisciplinary efforts are needed to increase AKI awareness and implement strategies to reduce AKI-related mortality in rural and high-risk populations. |
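The age-standardized rates above come from direct standardization: each age stratum's crude rate is weighted by a standard population's age distribution. A minimal sketch with made-up strata (the numbers are illustrative, not the study's data):

```python
def age_standardized_rate(stratum_deaths, stratum_pop, std_pop):
    """Directly standardized mortality rate per 100,000.

    Each stratum's crude rate (deaths / population) is weighted by the
    standard population's share in that stratum.
    """
    total_std = sum(std_pop)
    rate = 0.0
    for d, p, s in zip(stratum_deaths, stratum_pop, std_pop):
        rate += (d / p) * (s / total_std)
    return rate * 100_000

# Two hypothetical age strata: (deaths, county population, standard population)
deaths = [10, 20]
population = [100_000, 50_000]
standard = [600_000, 400_000]
rate = age_standardized_rate(deaths, population, standard)
```

Standardizing this way is what makes urban and rural counties — which have different age structures — comparable over time.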
Tinnitus after COVID-19 vaccination: Findings from the vaccine adverse event reporting system and the vaccine safety datalink
Yih WK , Duffy J , Su JR , Bazel S , Fireman B , Hurley L , Maro JC , Marquez P , Moro P , Nair N , Nelson J , Smith N , Sundaram M , Vasquez-Benitez G , Weintraub E , Xu S , Shimabukuro T . Am J Otolaryngol 2024 45 (6) 104448 ![]() ![]() PURPOSE: To assess the occurrence of tinnitus following COVID-19 vaccination using data mining and descriptive analyses in two U.S. vaccine safety surveillance systems. METHODS: Reports of tinnitus after COVID-19 vaccination to the Vaccine Adverse Event Reporting System (VAERS) from 2020 through 2024 were examined using empirical Bayesian data mining and by calculating reporting rates. In the Vaccine Safety Datalink (VSD) population, ICD-10 coded post-vaccination medical visits were examined using tree-based data mining, and tinnitus visit incidence rates during post-vaccination days 1-140 were calculated by age group for COVID-19 vaccines and for comparison, influenza vaccine. RESULTS: VAERS data mining did not find disproportionate reporting of tinnitus for any COVID-19 vaccine. VAERS received up to 84.82 tinnitus reports per million COVID-19 vaccine doses administered. VSD tree-based data mining found no signals for tinnitus. VSD tinnitus visit incidence rates after COVID-19 vaccines were similar to those after influenza vaccine except for the group aged ≥65 years (Moderna COVID-19 vaccine, 165 per 10,000 person-years; Pfizer-BioNTech COVID-19 vaccine, 154; influenza vaccine, 135). CONCLUSIONS: Overall, these findings do not support an increased risk of tinnitus following COVID-19 vaccination but cannot definitively exclude the possibility. Descriptive comparisons between COVID-19 and influenza vaccines were limited by lack of adjustment for potential confounding factors. |
Medical and work loss costs of violence, self-harm, unintentional and traumatic brain injuries per injured person in the USA
Peterson C , Xu L , Zhu S , Dunphy C , Florence C . Inj Prev 2024 OBJECTIVE: Injuries and poisoning are leading causes of US morbidity and mortality. This study aimed to update medical and work loss cost estimates per injured person. METHODS: Injuries treated in emergency departments (ED) during 2019-2020 were analysed in terms of mechanism (eg, fall) and intent (eg, unintentional), as well as traumatic brain injury (TBI) (multiple mechanisms and intents). Fatal injury medical spending was based on the Nationwide Emergency Department Sample and National Inpatient Sample. Non-fatal injury medical spending and workplace absences (general, short-term disability and workers' compensation) were analysed among injury patients with commercial insurance or Medicaid and matched controls during the year following an injury ED visit using MarketScan databases. RESULTS: Medical spending for injury deaths in hospital EDs and inpatient settings averaged US$4777 (n=57 296) and US$45 678 per fatality (n=89 175) (2020 USD). Estimates for fatal TBI were US$5052 (n=5363) and US$47 952 (n=37 184). People with ED treat and release visits for non-fatal injuries had on average US$5798 (n=895 918) in attributable medical spending and US$1686 (11 missed days) (n=116 836) in work loss costs during the following year, while people with non-fatal injuries who required hospitalisation after an ED injury visit had US$52 246 (n=32 976) in medical spending and US$7815 (51 days) (n=4473) in work loss costs. Estimates for non-fatal TBI were US$4529 (n=25 792), US$1503 (10 days) (n=1631), US$51 241 (n=3030) and US$6110 (40 days) (n=246). CONCLUSIONS AND RELEVANCE: Per person costs of injuries and violence are important to monitor the economic burden of injuries and assess the value of prevention strategies. |
Pyrazinamide safety, efficacy, and dosing for treating drug-susceptible pulmonary tuberculosis: A phase 3, randomized, controlled clinical trial
Xu AY , Velásquez GE , Zhang N , Chang VK , Phillips PP , Nahid P , Dorman SE , Kurbatova EV , Whitworth WC , Sizemore E , Bryant K , Carr W , Brown NE , Engle ML , Nhung NV , Nsubuga P , Diacon A , Dooley KE , Chaisson RE , Swindells S , Savic RM . Am J Respir Crit Care Med 2024 RATIONALE: Optimizing pyrazinamide dosing is critical to improve treatment efficacy while minimizing toxicity during tuberculosis treatment. Study 31/ACTG A5349 represents the largest Phase 3 randomized controlled therapeutic trial to date for such investigation. OBJECTIVES: We sought to report pyrazinamide pharmacokinetic parameters, risk factors for lower pyrazinamide exposure, and relationships between pyrazinamide exposure with efficacy and safety outcomes. We aimed to determine pyrazinamide dosing strategies that optimize risks and benefits. METHODS: We analyzed pyrazinamide steady-state pharmacokinetic data using population nonlinear mixed-effects models. We evaluated the contribution of pyrazinamide exposure to long-term efficacy using parametric time-to-event models and safety outcomes using logistic regression. We evaluated optimal dosing with therapeutic windows targeting ≥95% durable cure and safety within the observed proportion of the primary safety outcome. MEASUREMENTS AND MAIN RESULTS: Among 2255 participants with 6978 plasma samples, pyrazinamide displayed 7-fold exposure variability (151-1053 mg·h/L). Body weight was not a clinically relevant predictor of drug clearance and thus did not justify the need for weight-banded dosing. Both clinical and safety outcomes were associated with pyrazinamide exposure, resulting in a therapeutic window of 231-355 mg·h/L for the control and 226-349 mg·h/L for the rifapentine-moxifloxacin regimen. 
Flat dosing of pyrazinamide at 1000 mg would have permitted an additional 13.1% (n=96) of participants allocated to the control regimen and 9.2% (n=70) allocated to the rifapentine-moxifloxacin regimen to be dosed within the therapeutic window, compared with the current weight-banded dosing. CONCLUSIONS: Flat dosing of pyrazinamide at 1000 mg daily would be readily implementable and could optimize treatment outcomes in drug-susceptible tuberculosis. Clinical trial registered at www.clinicaltrials.gov, ID: NCT02410772. |
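Given a therapeutic window like the one reported for the control regimen (231-355 mg·h/L), the share of participants dosed within it can be tallied directly from individual exposure estimates. A minimal sketch with hypothetical AUC values spanning the 7-fold range noted above:

```python
def pct_in_window(auc_values, lo=231.0, hi=355.0):
    """Percentage of steady-state AUC values (mg·h/L) inside a therapeutic window."""
    n_in = sum(1 for a in auc_values if lo <= a <= hi)
    return 100.0 * n_in / len(auc_values)

# Hypothetical pyrazinamide AUCs, covering the observed 151-1053 mg·h/L spread
aucs = [151, 200, 240, 300, 350, 420, 600, 1053]
share = pct_in_window(aucs)
```

Comparing this percentage under flat versus weight-banded dosing is the kind of calculation behind the reported 13.1% and 9.2% gains, though the study derives exposures from a population pharmacokinetic model rather than raw values.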
Evaluation of self-contained breathing apparatus (SCBA) weight on firefighter stamina, comfort, and postural stability
Kesler RM , Powell J , Nguyen D , Massey KA , Joshi S , Xu S , Zhuang Z , Horn GP , Burd NA , Masoud F . Ergonomics 2024 1-14 Firefighters wear personal protective equipment to protect them from the thermal and chemical environment in which they operate. The self-contained breathing apparatus (SCBA) provides isolation of the airway from the hazardous fireground. National standards limit SCBA weight; however, integration of additional features could result in an SCBA exceeding the current limit. The purpose of this study was to examine the effects of increased SCBA weight on firefighters' physiological responses, work output, dynamic stability, and comfort. Completion of simulated firefighting activities induced a strong physiological response. Peak oxygen consumption was higher with the lightest SCBA than the heaviest SCBA. Few other physiological differences were noted as SCBA weight increased. Importantly, increased SCBA weight resulted in significantly more negative perceptions by the firefighters and a trend towards significance for the duration of work time prior to reaching volitional fatigue. These results should be considered when assessing changes to existing SCBA weight limits. | Increased SCBA weight above existing national standards resulted in negative perceptions by the firefighters, but not significant physiological changes after two simulated bouts of firefighting activity. SCBA weight had a nearly significant impact on the time firefighters worked before reaching volitional fatigue, with heavier SCBA trending towards decreased working time. |
Near-universal resistance to macrolides of Treponema pallidum in North America
Lieberman NAP , Reid TB , Cannon CA , Nunley BE , Berzkalns A , Cohen SE , Newman LM , Aldrete S , Xu LH , Thornlund CP , Pettus K , Lundy S , Kron M , Soge OO , Workowski K , Perlowski C , Hook EW 3rd , Dionne JA , Golden MR , Lieberman JA , Lee MK , Morshed M , Naidu P , Cao W , Pillay A , Giacani L , Greninger AL . N Engl J Med 2024 390 (22) 2127-2128 |
COVID-19 vaccination coverage and factors associated with vaccine uptake among people with HIV
Hechter RC , Qian L , Liu IA , Sy LS , Ryan DS , Xu S , Williams JTB , Klein NP , Kaiser RM , Liles EG , Glanz JM , Jackson LA , Sundaram ME , Weintraub ES , Tseng HF . JAMA Netw Open 2024 7 (6) e2415220 IMPORTANCE: People with HIV (PWH) may be at increased risk for severe outcomes with COVID-19 illness compared with people without HIV. Little is known about COVID-19 vaccination coverage and factors associated with primary series completion among PWH. OBJECTIVES: To evaluate COVID-19 vaccination coverage among PWH and examine sociodemographic, clinical, and community-level factors associated with completion of the primary series and an additional primary dose. DESIGN, SETTING, AND PARTICIPANTS: This retrospective cohort study used electronic health record data to assess COVID-19 vaccination information from December 14, 2020, through April 30, 2022, from 8 health care organizations of the Vaccine Safety Datalink project in the US. Participants were adults diagnosed with HIV on or before December 14, 2020, enrolled in a participating site. MAIN OUTCOMES AND MEASURES: The percentage of PWH with at least 1 dose of COVID-19 vaccine and PWH who completed the COVID-19 vaccine primary series by December 31, 2021, and an additional primary dose by April 30, 2022. Rate ratios (RR) and 95% CIs were estimated using Poisson regression models for factors associated with completing the COVID-19 vaccine primary series and receiving an additional primary dose. RESULTS: Among 22 058 adult PWH (mean [SD] age, 52.1 [13.3] years; 88.8% male), 90.5% completed the primary series by December 31, 2021. Among 18 374 eligible PWH who completed the primary series by August 12, 2021, 15 982 (87.0%) received an additional primary dose, and 4318 (23.5%) received a booster dose by April 30, 2022. Receipt of influenza vaccines in the last 2 years was associated with completion of the primary series (RR, 1.17; 95% CI, 1.15-1.20) and an additional primary dose (RR, 1.61; 95% CI, 1.54-1.69). 
PWH with uncontrolled viremia (HIV viral load ≥200 copies/mL) (eg, RR, 0.90 [95% CI, 0.85-0.95] for viral load 200-10 000 copies/mL vs undetected or <200 copies/mL for completing the primary series) and Medicaid insurance (eg, RR, 0.89 [95% CI, 0.87-0.90] for completing the primary series) were less likely to be fully vaccinated. By contrast, greater outpatient utilization (eg, RR, 1.07 [95% CI, 1.05-1.09] for ≥7 vs 0 visits for primary series completion) and residence in counties with higher COVID-19 vaccine coverage (eg, RR, 1.06 [95% CI, 1.03-1.08] for fourth vs first quartiles for primary series completion) were associated with primary series and additional dose completion (RRs ranging from 1.01 to 1.21). CONCLUSIONS AND RELEVANCE: Findings from this cohort study suggest that, while COVID-19 vaccination coverage was high among PWH, outreach efforts should focus on those who did not complete vaccine series and those who have uncontrolled viremia. |
Contributions of the community-based organization program funded by the Centers For Disease Control and Prevention to linkage to HIV medical care
Marano-Lee M , Williams W , Xu S , Andia J , Shapatava E . Public Health Rep 2024 333549241252579 OBJECTIVE: Linkage to HIV medical care is important in the continuum of HIV care and health outcomes for people with HIV. The objective of this analysis was to identify how the community-based organization (CBO) program contributes to linkage to HIV medical care among people with newly diagnosed HIV in the Centers for Disease Control and Prevention's (CDC's) HIV testing program. METHODS: We analyzed HIV linkage-to-care data submitted to CDC from 2019 through 2021. Linkage was defined as confirmation that an individual attended their first HIV medical care appointment within 30 days of their HIV test date. We included in the analysis data submitted from the health department (HD) program that included 61 state and local HDs in the United States, Puerto Rico, and the US Virgin Islands and the CBO program that included 150 CBOs. RESULTS: The CBO program linked a higher proportion of people to HIV medical care within 30 days of diagnosis (86.7%) than the HD program (73.7%). By population group, the proportion linked in the CBO program was higher than the proportion linked in the HD program among men who have sex with men (prevalence ratio [PR] = 1.13; P < .001), men who have sex with men/people who inject drugs (PR = 1.29; P < .001), transgender people (PR = 1.28; P < .001), and those reporting no sexual contact or injection drug use (PR = 1.34; P < .001). In the Cox proportional hazards model, time to linkage in the CBO program was significantly shorter than in the HD program (hazard ratio = 0.63; P < .001). CONCLUSION: This analysis shows that the CBO program fills a vital need in linking newly diagnosed HIV-positive people to HIV medical care, which is important in the HIV care continuum and for viral suppression. |
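The prevalence ratios above compare the proportion linked to care in the CBO program with the proportion in the HD program; a 95% CI for such a ratio is commonly obtained by the log (Katz) method from the four underlying counts. A minimal sketch — the counts below are illustrative stand-ins echoing the reported 86.7% versus 73.7%, not the study's actual denominators:

```python
import math

def prevalence_ratio(a, n1, b, n2):
    """Prevalence ratio (a/n1) / (b/n2) with a 95% CI by the log (Katz) method."""
    pr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(pr) - 1.96 * se)
    hi = math.exp(math.log(pr) + 1.96 * se)
    return pr, lo, hi

# Illustrative counts: linked / total in the CBO vs. HD programs
pr, lo, hi = prevalence_ratio(867, 1000, 737, 1000)
```

The study's time-to-linkage comparison additionally uses a Cox proportional hazards model, which accounts for when linkage occurred rather than just whether it occurred within 30 days.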
Juvenile hormone as a contributing factor in establishing midgut microbiota for fecundity and fitness enhancement in adult female Aedes aegypti
Taracena-Agarwal ML , Walter-Nuno AB , Bottino-Rojas V , Mejia APG , Xu K , Segal S , Dotson EM , Oliveira PL , Paiva-Silva GO . Commun Biol 2024 7 (1) 687 Understanding the factors influencing mosquitoes' fecundity and longevity is important for designing better and more sustainable vector control strategies, as these parameters can impact their vectorial capacity. Here, we address how mating affects midgut growth in Aedes aegypti, what role Juvenile Hormone (JH) plays in this process, and how it impacts the mosquito's immune response and microbiota. Our findings reveal that mating and JH induce midgut growth. Additionally, the establishment of a native bacterial population in the midgut due to JH-dependent suppression of the immune response has important reproductive outcomes. Specific downregulation of AMPs with an increase in bacteria abundance in the gut results in increased egg counts and longer lifespans. Overall, these findings provide evidence of a cross-talk between JH response, gut epithelial tissue, cell cycle regulation, and the mechanisms governing the trade-offs between nutrition, immunity, and reproduction at the cellular level in the mosquito gut. |
Understanding forms of childhood adversities and associations with adult health outcomes: A regression tree analysis
Perrins SP , Vermes E , Cincotta K , Xu Y , Godoy-Garraza L , Chen MS , Addison R , Douglas B , Yatco A , Idaikkadar N , Willis LA . Child Abuse Negl 2024 153 106844 BACKGROUND: Empirical studies have demonstrated associations between ten original adverse childhood experiences (ACEs) and multiple health outcomes. Identifying expanded ACEs can capture the burden of other childhood adversities that may have important health implications. OBJECTIVE: We sought to identify childhood adversities that warrant consideration as expanded ACEs. We hypothesized that experiencing expanded and original ACEs would be associated with poorer adult health outcomes compared to experiencing original ACEs alone. PARTICIPANTS: The 11,545 respondents of the National Longitudinal Surveys (NLS) and Child and Young Adult Survey were 48.9 % female, 22.7 % Black, 15.8 % Hispanic, 36.1 % White, 1.7 % Asian/Native Hawaiian/Pacific Islander/Native American/Native Alaskan, and 7.5 % Other. METHODS: This study used regression trees and generalized linear models to identify if/which expanded ACEs interacted with original ACEs in association with six health outcomes. RESULTS: Four expanded ACEs (basic needs instability, lack of parental love and affection, community stressors, and mother's experience with physical abuse during childhood) significantly interacted with general health, depressive symptom severity, anxiety symptom severity, and violent crime victimization in adulthood (all p-values <0.005). Basic needs instability and/or lack of parental love and affection emerged as correlates across multiple outcomes. Experiencing lack of parental love and affection and original ACEs was associated with greater anxiety symptoms (p = 0.022). CONCLUSIONS: This is the first study to use supervised machine learning to investigate interaction effects among original ACEs and expanded ACEs. 
Two expanded ACEs emerged as predictors for three adult health outcomes and warrant further consideration in ACEs assessments. |
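Regression trees of the kind used above recursively search for covariate splits that maximize variance reduction in the outcome. A minimal one-feature CART-style sketch — the function and toy data are illustrative, not the study's models:

```python
def best_split(x, y):
    """Return the threshold on x minimizing within-node sum of squared errors of y."""
    def sse(vals):
        if not vals:
            return 0.0
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals)

    best_t, best_cost = None, float("inf")
    for t in sorted(set(x))[:-1]:  # a split above the max value would be empty
        left = [yi for xi, yi in zip(x, y) if xi <= t]
        right = [yi for xi, yi in zip(x, y) if xi > t]
        cost = sse(left) + sse(right)
        if cost < best_cost:
            best_t, best_cost = t, cost
    return best_t

# Toy data: ACE count vs. a symptom-severity score, with a clear break above 2 ACEs
ace_count = [0, 1, 2, 3, 4, 5]
severity = [1.0, 1.1, 1.2, 4.0, 4.2, 4.1]
threshold = best_split(ace_count, severity)
```

Interaction detection in the study comes from splits on one ACE appearing only within branches defined by another; this sketch shows just the single-split criterion.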
Challenges of COVID-19 case forecasting in the US, 2020-2021
Lopez VK , Cramer EY , Pagano R , Drake JM , O'Dea EB , Adee M , Ayer T , Chhatwal J , Dalgic OO , Ladd MA , Linas BP , Mueller PP , Xiao J , Bracher J , Castro Rivadeneira AJ , Gerding A , Gneiting T , Huang Y , Jayawardena D , Kanji AH , Le K , Mühlemann A , Niemi J , Ray EL , Stark A , Wang Y , Wattanachit N , Zorn MW , Pei S , Shaman J , Yamana TK , Tarasewicz SR , Wilson DJ , Baccam S , Gurung H , Stage S , Suchoski B , Gao L , Gu Z , Kim M , Li X , Wang G , Wang L , Wang Y , Yu S , Gardner L , Jindal S , Marshall M , Nixon K , Dent J , Hill AL , Kaminsky J , Lee EC , Lemaitre JC , Lessler J , Smith CP , Truelove S , Kinsey M , Mullany LC , Rainwater-Lovett K , Shin L , Tallaksen K , Wilson S , Karlen D , Castro L , Fairchild G , Michaud I , Osthus D , Bian J , Cao W , Gao Z , Lavista Ferres J , Li C , Liu TY , Xie X , Zhang S , Zheng S , Chinazzi M , Davis JT , Mu K , Pastore YPiontti A , Vespignani A , Xiong X , Walraven R , Chen J , Gu Q , Wang L , Xu P , Zhang W , Zou D , Gibson GC , Sheldon D , Srivastava A , Adiga A , Hurt B , Kaur G , Lewis B , Marathe M , Peddireddy AS , Porebski P , Venkatramanan S , Wang L , Prasad PV , Walker JW , Webber AE , Slayton RB , Biggerstaff M , Reich NG , Johansson MA . PLoS Comput Biol 2024 20 (5) e1011200 During the COVID-19 pandemic, forecasting COVID-19 trends to support planning and response was a priority for scientists and decision makers alike. In the United States, COVID-19 forecasting was coordinated by a large group of universities, companies, and government entities led by the Centers for Disease Control and Prevention and the US COVID-19 Forecast Hub (https://covid19forecasthub.org). We evaluated approximately 9.7 million forecasts of weekly state-level COVID-19 cases for predictions 1-4 weeks into the future submitted by 24 teams from August 2020 to December 2021. 
We assessed coverage of central prediction intervals and weighted interval scores (WIS), adjusting for missing forecasts relative to a baseline forecast, and used a Gaussian generalized estimating equation (GEE) model to evaluate differences in skill across epidemic phases that were defined by the effective reproduction number. Overall, we found high variation in skill across individual models, with ensemble-based forecasts outperforming other approaches. Forecast skill relative to the baseline was generally higher for larger jurisdictions (e.g., states compared to counties). Over time, forecasts generally performed worst in periods of rapid changes in reported cases (either in increasing or decreasing epidemic phases) with 95% prediction interval coverage dropping below 50% during the growth phases of the winter 2020, Delta, and Omicron waves. Ideally, case forecasts could serve as a leading indicator of changes in transmission dynamics. However, while most COVID-19 case forecasts outperformed a naïve baseline model, even the most accurate case forecasts were unreliable in key phases. Further research could improve forecasts of leading indicators, like COVID-19 cases, by leveraging additional real-time data, addressing performance across phases, improving the characterization of forecast confidence, and ensuring that forecasts were coherent across spatial scales. In the meantime, it is critical for forecast users to appreciate current limitations and use a broad set of indicators to inform pandemic-related decision making. |
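The weighted interval score (WIS) used above penalizes both wide prediction intervals and observations that fall outside them. A minimal sketch of the standard definition (the helper names and toy forecast are ours):

```python
def interval_score(lower, upper, alpha, y):
    """Interval score for a central (1 - alpha) prediction interval [lower, upper]."""
    score = upper - lower                       # width penalty
    if y < lower:
        score += (2.0 / alpha) * (lower - y)    # penalty for undershooting
    if y > upper:
        score += (2.0 / alpha) * (y - upper)    # penalty for overshooting
    return score

def weighted_interval_score(median, intervals, y):
    """WIS: weighted absolute median error plus K weighted interval scores.

    intervals: list of (alpha, lower, upper) for each central (1 - alpha) interval.
    """
    k = len(intervals)
    total = 0.5 * abs(y - median)
    for alpha, lower, upper in intervals:
        total += (alpha / 2.0) * interval_score(lower, upper, alpha, y)
    return total / (k + 0.5)

# A forecast: median 100 cases, central 50% and 90% intervals; 120 cases observed
wis = weighted_interval_score(100, [(0.5, 90, 110), (0.1, 70, 130)], 120)
```

Lower WIS is better; because the observation (120) falls outside the 50% interval, that interval contributes a penalty on top of its width.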
- Page last reviewed: Feb 1, 2024
- Page last updated: Mar 17, 2025