Last data update: Jun 20, 2025. (Total: 49421 publications since 2009)
Records 1-30 (of 341 Records) |
Query Trace: Stone N[original query] |
Wrangling Real-World Data: Optimizing Clinical Research Through Factor Selection with LASSO Regression
Howard KA , Anderson W , Podichetty JT , Gould R , Boyce D , Dasher P , Evans L , Kao C , Kumar VK , Hamilton C , Mathé E , Guerin PJ , Dodd K , Mehta AK , Ortman C , Patil N , Rhodes J , Robinson M , Stone H , Heavner SF . Int J Environ Res Public Health 2025 22 (4) Data-driven approaches to clinical research are necessary for understanding and effectively treating infectious diseases. However, challenges such as issues with data validity, lack of collaboration, and difficult-to-treat infectious diseases (e.g., those that are rare or newly emerging) hinder research. Prioritizing innovative methods to facilitate the continued use of data generated during routine clinical care for research, but in an organized, accelerated, and shared manner, is crucial. This study investigates the potential of CURE ID, an open-source platform to accelerate drug-repurposing research for difficult-to-treat diseases, with COVID-19 as a use case. Data from eight US health systems were analyzed using least absolute shrinkage and selection operator (LASSO) regression to identify key predictors of 28-day all-cause mortality in COVID-19 patients, including demographics, comorbidities, treatments, and laboratory measurements captured during the first two days of hospitalization. Key findings indicate that age, laboratory measures, severity of illness indicators, oxygen support administration, and comorbidities significantly influenced all-cause 28-day mortality, aligning with previous studies. This work underscores the value of collaborative repositories like CURE ID in providing robust datasets for prognostic research and the importance of factor selection in identifying key variables, helping to streamline future research and drug-repurposing efforts. |
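The factor-selection step described above can be illustrated with a brief, hedged sketch: an L1-penalized (LASSO) logistic regression for a binary 28-day mortality outcome. The data, feature names, and settings below are illustrative assumptions, not the CURE ID cohort or the study's actual variables.

```python
# Minimal sketch of LASSO-style factor selection for a binary outcome
# (28-day mortality). Synthetic data stands in for the real cohort;
# feature names are illustrative only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegressionCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=2000, n_features=30, n_informative=8,
                           random_state=0)
feature_names = [f"factor_{i}" for i in range(X.shape[1])]

# The L1 (LASSO) penalty shrinks uninformative coefficients to exactly zero,
# performing factor selection; the penalty strength is chosen by
# cross-validated AUC.
model = make_pipeline(
    StandardScaler(),
    LogisticRegressionCV(penalty="l1", solver="liblinear",
                         Cs=20, cv=5, scoring="roc_auc"),
)
model.fit(X, y)

coefs = model.named_steps["logisticregressioncv"].coef_.ravel()
selected = [(name, round(c, 3)) for name, c in zip(feature_names, coefs) if c != 0]
print("Selected factors:", selected)
```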
Performance of a whole blood immunoassay for tenofovir detection and correlation with self-reported pre-exposure prophylaxis use in HIV-negative men who have sex with men interested in blood donation
Buccheri R , Whitaker B , Pollack LM , Bhaskar JR , Di Germanio C , Guillon G , Haaland R , Stramer SL , Reik R , Pandey S , Stone M , Anderson SA , Marks P , Custer B . Transfusion 2025 BACKGROUND: In 2023, the United States Food and Drug Administration revised its blood donor eligibility policy for men who have sex with men (MSM) from a 3-month deferral to individual assessment. Human Immunodeficiency Virus (HIV) pre-exposure prophylaxis (PrEP) use remains a reason for deferral, and nondisclosure is a concern. STUDY DESIGN AND METHODS: In a cross-sectional study of sexually active MSM from 8 U.S. cities who were interested in future blood donation, we assessed the performance of an enzyme-linked immunosorbent assay for detecting tenofovir (TFV) in whole blood (WB) and plasma and the correlation with self-reported PrEP use. RESULTS: Of 1548 individuals, 48% reported oral PrEP use. The WB assay identified 95% of PrEP users, while the plasma assay detected 88%. The WB assay performed well up to 14 days after the last reported dose. Receiver operating characteristics curve analysis showed an area under the curve of 0.96 (95% confidence interval [CI]: 0.95-0.97) using WB and 0.88 (95% CI: 0.86-0.90) using plasma. Specificity was 80% for WB and 66% for plasma. Detection rates for TFV disoproxil fumarate/emtricitabine (FTC) formulations were 99% in WB and 98% in plasma, compared to 93% and 86% for the TFV alafenamide/FTC formulation. DISCUSSION: High concordance between self-reported oral PrEP use and TFV detection was observed among PrEP users, suggesting the potential utility of WB as a biomatrix for TFV detection to support screening strategies. Given the expanded eligibility for MSM, who may be PrEP users, to donate blood, further examination of undisclosed PrEP use is important. |
Ecologic Risk Factors for Infestation of Rhipicephalus sanguineus s.l. in a Rocky Mountain Spotted Fever-Endemic Area of Eastern Arizona
Brophy MK , Drexler NA , Stone NE , Busch JD , Ballard R , Bourgeois RM , Pemberton GL , Paddock CD , Horiuchi K , Biggerstaff BJ , Blocher BH , Kersh GJ , Bendle H , Wagner DM , Nicholson WL , Salzer JS . Am J Trop Med Hyg 2025 Rocky Mountain spotted fever (RMSF) is a deadly tick-borne disease caused by the bacterium Rickettsia rickettsii. An ongoing epidemic of RMSF is affecting tribal communities in Arizona, with nearly 500 cases and 28 deaths since 2003. For nearly a decade, the San Carlos Apache Tribe has been working consistently to prevent RMSF through tick collars on dogs, pesticide treatments around homes, and expanded education. Beyond monitoring human disease levels and tick burden on dogs, we have little understanding of the long-term impact of prevention practices on tick abundance and infection rates in the peridomestic environment. We evaluated risk factors associated with tick infestation at home sites across the San Carlos Indian Reservation, as well as R. rickettsii and Rickettsia massiliae prevalence in off-host ticks. Although the presence of fencing appears protective, the number of nearby structures is the most important risk factor associated with increased adult and nymphal tick abundance, highlighting the impact of a free-roaming dog population. |
Detection of SARS-CoV-2 Reinfections Using Nucleocapsid Antibody Boosting
Grebe E , Chacreton D , Stone M , Spencer BR , Haynes J , Akinseye A , Lanteri MC , Green V , Sulaeman H , Bruhn R , Avelino-Silva VI , Contestable P , Biggerstaff BJ , Coughlin MM , Custer B , Jones JM , Wright D , Busch MP . Emerg Infect Dis 2025 31 (5) 958-966 More than 85% of US adults had been infected with SARS-CoV-2 by the end of 2023. Continued serosurveillance of transmission and assessments of correlates of protection require robust detection of reinfections. We developed a serologic method for identifying reinfections in longitudinal blood donor data by assessing nucleocapsid (N) antibody boosting using a total immunoglobulin assay. Receiver operating characteristic curve analysis yielded an optimal ratio of >1.43 (sensitivity 87.1%, specificity 96.0%). When prioritizing specificity, a ratio of >2.33 was optimal (sensitivity 75.3%, specificity 99.3%). In donors with higher anti-N reactivity levels before reinfection, sensitivity was reduced. Sensitivity could be improved by expanding the dynamic range of the assay through dilutional testing, from 38.8% to 66.7% in the highest reactivity group (signal-to-cutoff ratio before reinfection >150). This study demonstrated that longitudinal testing for N antibodies can be used to identify reinfections and estimate total infection incidence in a blood donor cohort. |
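A hedged sketch of the ROC-based cutoff derivation described above, using simulated nucleocapsid boosting ratios: the distributions, AUC, and operating points below are illustrative, not the study's results.

```python
# Minimal sketch of deriving an optimal N-antibody boosting ratio cutoff
# from labeled data via ROC analysis. Data are simulated for illustration.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
# boosting ratio = post/pre anti-N signal; 1 = reinfected, 0 = not reinfected
ratio = np.concatenate([rng.lognormal(1.0, 0.6, 300),   # reinfections
                        rng.lognormal(0.0, 0.3, 700)])  # no reinfection
truth = np.concatenate([np.ones(300), np.zeros(700)])

fpr, tpr, thresholds = roc_curve(truth, ratio)
print("AUC:", round(roc_auc_score(truth, ratio), 3))

# Youden's J maximizes sensitivity + specificity - 1
j = tpr - fpr
i_best = np.argmax(j)
print(f"Youden-optimal ratio cutoff: {thresholds[i_best]:.2f} "
      f"(sens {tpr[i_best]:.1%}, spec {1 - fpr[i_best]:.1%})")

# Specificity-prioritized cutoff: highest sensitivity with specificity >= 99%
ok = np.where(1 - fpr >= 0.99)[0]
i_spec = ok[np.argmax(tpr[ok])]
print(f"High-specificity cutoff: {thresholds[i_spec]:.2f} "
      f"(sens {tpr[i_spec]:.1%}, spec {1 - fpr[i_spec]:.1%})")
```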
Proportions of US Blood Donors With Serological Evidence of Severe Acute Respiratory Syndrome Coronavirus 2 Infections Who Reported Survey-Based Diagnosed Infections During July 2020-December 2022
Akinseye A , Wright DJ , Grebe E , Stone M , Hathaway CA , Fink RV , Spencer BR , Saa P , Lanteri MC , Busch M , Jones JM . Open Forum Infect Dis 2025 12 (5) ofaf210 The proportion of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infections diagnosed by coronavirus disease 2019 (COVID-19) tests, including home antigen tests, is unknown. We detected infections among blood donors in the United States (US) by testing for nucleocapsid antibody (anti-N) seroconversion and administered a questionnaire to determine the proportion of those infections that were associated with a self-reported positive COVID-19 test. Among US blood donors with serologic evidence of SARS-CoV-2 infection who completed a survey, 47.7% reported an associated self-reported positive COVID-19 test. This proportion changed from July-December 2020 (44.9%) to July-December 2022 (54.8%). This study suggests many SARS-CoV-2 infections in adults are not diagnosed with a test. |
Host population dynamics influence Leptospira spp. transmission patterns among Rattus norvegicus in Boston, Massachusetts, US
Stone NE , Hamond C , Clegg JR , McDonough RF , Bourgeois RM , Ballard R , Thornton NB , Nuttall M , Hertzel H , Anderson T , Whealy RN , Timm S , Roberts AK , Barragán V , Phipatanakul W , Leibler JH , Benson H , Specht A , White R , LeCount K , Furstenau TN , Galloway RL , Hill NJ , Madison JD , Fofanov VY , Pearson T , Sahl JW , Busch JD , Weiner Z , Nally JE , Wagner DM , Rosenbaum MH . PLoS Negl Trop Dis 2025 19 (4) e0012966 Leptospirosis (caused by pathogenic bacteria in the genus Leptospira) is prevalent worldwide but more common in tropical and subtropical regions. Transmission can occur following direct exposure to infected urine from reservoir hosts, or a urine-contaminated environment, which then can serve as an infection source for additional rats and other mammals, including humans. The brown rat, Rattus norvegicus, is an important reservoir of Leptospira spp. in urban settings. We investigated the presence of Leptospira spp. among brown rats in Boston, Massachusetts and hypothesized that rat population dynamics in this urban setting influence the transportation, persistence, and diversity of Leptospira spp. We analyzed DNA from 328 rat kidney samples collected from 17 sites in Boston over a seven-year period (2016-2022); 59 rats representing 12 of 17 sites were positive for Leptospira spp. We used 21 neutral microsatellite loci to genotype 311 rats and utilized the resulting data to investigate genetic connectivity among sampling sites. We generated whole genome sequences for 28 Leptospira spp. isolates obtained from frozen and fresh tissue from some of the 59 positive rat kidneys. When isolates were not obtained, we attempted genomic DNA capture and enrichment, which yielded 14 additional Leptospira spp. genomes from rats. We also generated an enriched Leptospira spp. genome from a 2018 human case in Boston. We found evidence of high genetic structure among rat populations that is likely influenced by major roads and/or other dispersal barriers, resulting in distinct rat population groups within the city; at certain sites these groups persisted for multiple years. We identified multiple distinct phylogenetic clades of L. interrogans among rats that were tightly linked to distinct rat populations. This pattern suggests L. interrogans persists in local rat populations and its transportation is influenced by rat population dynamics. Finally, our genomic analyses of the Leptospira spp. detected in the 2018 human leptospirosis case in Boston suggest a link to rats as the source. These findings will be useful for guiding rat control and human leptospirosis mitigation efforts in this and other similar urban settings. |
Assessment of a Continuing Education Course about Wildfire Smoke and Patient Health
Dowling TC , Stone SL , Cascio WE , Damon SA , Hutson MR , Sacks JD , Mirabelli MC . ATS Sch 2025 |
U.S. Emergency Department Visits Attributed by Clinicians to Semaglutide Adverse Events, 2022-2023
Lovegrove MC , Stone ND , Geller AI , Weidle NJ , Lind JN , Cohen PA . Ann Intern Med 2025 |
Characteristics of nursing homes with high rates of invasive methicillin-resistant Staphylococcus aureus infections
See I , Jackson KA , Hatfield KM , Paul P , Li R , Nadle J , Petit S , Ray SM , Harrison LH , Jeffrey L , Lynfield R , Bernu C , Dumyati G , Gellert A , Schaffner W , Markus T , Gokhale RH , Stone ND , Jacobs Slifka K . J Am Geriatr Soc 2025 73 (3) 849-858 BACKGROUND: Nursing home residents experience a large burden of invasive methicillin-resistant Staphylococcus aureus (MRSA) infections. Data are limited regarding nursing home characteristics associated with differences in facility-level invasive MRSA rates. METHODS: We analyzed 2011-2015 data from CDC's Emerging Infections Program (EIP) active population- and laboratory-based surveillance for invasive MRSA cases within seven states. A nursing home-onset case was defined as MRSA cultured from a normally sterile site in a person living in a nursing home 3 days before culture collection. Facility rates were calculated as nursing home-onset cases per 100,000 resident-days. Nursing home resident-day denominators and facility characteristics were obtained from four Centers for Medicare & Medicaid Services (CMS) datasets. A generalized estimating equations model with a logit link assessed characteristics of the facilities with the highest rates, which together accounted for 50% of nursing home-onset MRSA cases ("high rates"). RESULTS: The 626 nursing homes in the surveillance area had 2824 invasive MRSA cases; 82% of facilities had ≥1 case. The 20% of facilities with the highest rates (≥3.84 cases/100,000 resident-days) accounted for 50% of nursing home-onset cases. In multivariable regression, facilities with high rates were more likely, based on CMS-derived characteristics, to have a resident with a multidrug-resistant organism; to have greater proportions of residents who were male, were short stay (in the facility <100 days), had a nasogastric or percutaneous gastrostomy tube, or required extensive assistance with bed repositioning; and to be in an EIP area with higher hospital-onset MRSA rates. Higher registered nurse staffing levels (hours/resident/day) and higher proportions of White residents were associated with lower rates. CONCLUSIONS: Facilities with higher invasive MRSA rates served residents with more clinical and functional care needs. Increasing registered nurse staffing in high-risk facilities might assist with reduction of invasive MRSA rates. These findings could help prioritize nursing homes for future MRSA prevention work. |
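A hedged sketch of a generalized estimating equations (GEE) model with a logit link, of the kind described in the METHODS above. The data frame, covariate names, and use of the EIP site as the clustering variable are assumptions for illustration only, not the study's dataset or model specification.

```python
# Minimal sketch of a facility-level GEE logit model on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 626  # number of nursing homes in the surveillance area (from the abstract)
df = pd.DataFrame({
    "high_mrsa_rate": rng.integers(0, 2, n),      # 1 = highest-rate group
    "pct_short_stay": rng.uniform(0, 60, n),
    "rn_hours_per_resident_day": rng.uniform(0.2, 1.5, n),
    "eip_site": rng.choice([f"site_{i}" for i in range(7)], n),
})

model = smf.gee(
    "high_mrsa_rate ~ pct_short_stay + rn_hours_per_resident_day",
    groups="eip_site",                      # cluster facilities by EIP site
    data=df,
    family=sm.families.Binomial(),          # logit link by default
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = model.fit()
print(np.exp(result.params))  # coefficients exponentiated to odds ratios
```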
Risk period for transmission of SARS-CoV-2 and seasonal influenza: a rapid review
Stone EC , Okasako-Schmucker DL , Taliano J , Schaefer M , Kuhar DT . Infect Control Hosp Epidemiol 2025 1-9 BACKGROUND: Restricting infectious healthcare workers (HCWs) from the workplace is an important infection prevention strategy. The durations of viral shedding and symptoms are often used as proxies for the infectious period in adults but may not accurately estimate it. OBJECTIVE: To determine the risk period for transmission among previously healthy adults infected with the SARS-CoV-2 omicron variant (omicron) or influenza A (influenza) by examining the duration of shedding and symptoms, and day of symptom onset in secondary cases of transmission pairs. DESIGN: Rapid review. METHODS: This rapid review adhered to PRISMA-ScR; five databases were searched. The cumulative daily proportion of participants with an outcome of interest was calculated for each study and summarized. RESULTS: Forty-three studies were included. Shedding resolved among ≥ 70% of participants by the end of day nine post symptom onset for omicron, and day seven for influenza; and for ≥ 90% of participants, by the end of day 10 for omicron and day nine for influenza. Two studies suggested shedding continues > 24 hours post-fever resolution for both viruses. Symptom onset occurred in ≥ 80% of secondary cases by the end of day seven post-primary case symptom onset for omicron and day six for influenza. CONCLUSIONS: Omicron shedding is consistent with previous recommendations to exclude infected HCWs from work for 10 days; and influenza follows a similar trend. Earlier symptom onset in most secondary cases for both pathogens indicates that, despite persistent viral shedding, most transmission occurs earlier; and the cumulative serial interval might better approximate the duration of infectiousness. |
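The summary measure used in this rapid review, the cumulative daily proportion of participants with an outcome of interest, can be sketched as follows; the resolution days below are invented for illustration, not data from the included studies.

```python
# Minimal sketch: cumulative proportion of participants whose shedding has
# resolved by each day post symptom onset. Values are illustrative.
import pandas as pd

# day post symptom onset on which shedding resolved, one value per participant
resolution_day = pd.Series([5, 6, 7, 7, 8, 9, 9, 10, 10, 12])

counts = resolution_day.value_counts().sort_index()
cumulative_proportion = counts.cumsum() / len(resolution_day)
print(cumulative_proportion)
# e.g., proportion resolved by the end of day 9 is cumulative_proportion.loc[9]
```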
The association between underlying conditions, risk factors, risk markers, and post-COVID conditions ≥6 months after COVID-19: A systematic review
Hill A , Morford M , Saydah S , Logan P , Raso D , Stone EC , Taliano J , Koumans EH , Varechtchouk O . J Family Med Prim Care 2024 13 (12) 5868-5884 INTRODUCTION: While various demographic factors and underlying medical conditions are associated with the development of post-COVID conditions within a month after SARS-CoV-2 infection, less is known about factors associated with post-COVID symptoms that persist for 6 months or more. The aim of this review was to determine the association between underlying conditions, other risk factors, health behaviors, and the presence of symptoms ≥6 months after COVID-19. METHODS: Studies reporting on post-COVID symptoms were searched in databases, including Medline, EMBASE, Global Health, PsycInfo, Scopus, CINAHL, Proquest, and WHO COVID-19 literature, from the beginning of the pandemic until November 2022. Studies were included if they reported on symptoms ≥6 months after COVID-19 and a relevant measure of association (adjusted or unadjusted odds or risk ratio). RESULTS: A total of 17 studies with 109,293 participants met the inclusion criteria; they were conducted in China (3), Italy (3), Spain (3), Russia (2), France (1), Germany (1), Sweden (1), Scotland (1), United Kingdom (1), and the United States (1). When compared to males, female participants were at an increased risk of post-COVID-19 symptoms (risk ratio (RR): 1.24; adjusted odds ratio (aOR): 3.08). Underlying conditions, including COPD/lung disease, overweight status or obesity, hypertension, cardiovascular disease, and asthma, were identified as possibly being associated with an increased risk of post-COVID symptoms. CONCLUSION: Female gender and certain underlying medical conditions were associated with an increased risk of post-COVID symptoms ≥6 months after COVID-19. Further research is needed to better understand some of these associations and identify groups that are at increased risk for persistent post-COVID conditions. |
An insight into limestone pillar stability in dipping environments using actual mine geometries
Rashed G , Slaker B , Evanek N . Min Metall Explor 2025 As stone mine operations continue to develop in more challenging conditions, including inclined seams, more complex loading conditions and pillar geometries are generated. The main objective of this study is to better understand the effect of seam inclination on the strength, loading path, sidewall deformation, and yield patterns of a stone pillar using numerical models. The modeled width-to-height (W/H) ratio of the pillars, the unconfined compressive strength of the limestone material, the in situ stress field, and the roof interface were varied to consider their potential distribution across underground limestone mines in the United States. Two actual mine geometries, referred to as a-type and b-type, were modeled. In a-type mine geometry, the roof is dipping while the floor is flat, making one side of the pillar shorter than the other side. In b-type mine geometry, the roof and floor lines of pillars are dipping while the headings/crosscuts are flat. The intention is not to compare pillar stability between these mine geometries, but to show pillar response in different dipping environments, because these environments differ in pillar size, shape, and extraction ratio. Numerical modeling results indicate that dipping pillars have reduced strength compared to flat pillars. The shear strength between the pillar and the surrounding rock has an impact on dipping pillar response. Dipping pillars experience high shear stresses, highly non-uniform stress distributions, and asymmetric yield patterns with more yielding compared to flat pillars. All these factors place dipping pillars, particularly those with a small width-to-height ratio (<1), at an elevated risk of instability. The yield pattern for a flat pillar is simple, whereas for a dipping pillar it is complex and depends on numerous parameters such as the width-to-height ratio of the pillar and the seam inclination. The down-dip side of a dipping pillar experiences more outward normal displacement than the up-dip side, while it experiences less vertical displacement. The results of this study improve the understanding of pillar stability in dipping environments and advance the ultimate goal of reducing the risk of dipping pillar instability in underground stone mines. |
Release of crystalline silica nanoparticles during engineered stone fabrication
Rishi K , Ku BK , Qi C , Thompson D , Wang C , Dozier A , Vogiazi V , Zervaki O , Kulkarni P . ACS Omega 2024 9 (51) 50308-50317 Inhalation exposure to respirable crystalline silica (RCS) during the fabrication of engineered stone-based kitchen countertops has been on the rise in recent years and has become a significant occupational health problem in the United States and globally. Little is known about the presence of nanocrystalline silica (NCS), i.e., particles below 100 nm. We present a methodology to quantify the crystalline silica content in the sub-100 nm size fraction of the aerosol released during engineered stone fabrication using X-ray diffraction (XRD) and Fourier transform infrared (FTIR) spectroscopy. Aerosol was generated in a test chamber designed per EN 1093-3 and sampled using cascade impactors. XRD and FTIR analysis showed the presence of both α-quartz (15-60%) and cristobalite (10-50%) polymorphs in all size fractions. The cristobalite content increased with increasing particle size. Seventy percent of the total aerosol mass in the sub-100 nm fraction was found to be crystalline silica, qualitatively confirmed by electron diffraction and electron energy loss spectroscopy. The presence of other minerals was detected in all size fractions; no polymeric resin binder was detected in the sub-100 nm fraction. Although the sub-100 nm fraction was about 1% of the aerosol mass, it accounted for 4-24% of the aerosol surface area based on the total lung deposition. If surface area is the more relevant exposure metric, assessing the efficacy of current engineering control systems using mass as the exposure metric may not ensure adequate protection. |
Spike and nucleocapsid antibody dynamics following SARS-CoV-2 infection and vaccination: Implications for sourcing COVID-19 convalescent plasma from routinely collected blood donations
Di Germanio C , Deng X , Balasko BG , Simmons G , Martinelli R , Grebe E , Stone M , Spencer BR , Saa P , Yu EA , Lanteri MC , Green V , Wright D , Lartey I , Kleinman S , Jones J , Biggerstaff BJ , Contestable P , Busch MP . Transfusion 2024 BACKGROUND: COVID-19 convalescent plasma (CCP) remains a treatment option for immunocompromised patients; however, the current FDA qualification threshold of ≥200 BAU/mL of spike antibody appears to be relatively low. We evaluated the levels of binding (bAb) and neutralizing antibodies (nAb) in serial samples from repeat blood donors who were vaccinated and/or infected to inform criteria for qualifying CCP from routinely collected plasma components. METHODS: Donors were categorized into four groups: (1) infected, then vaccinated; (2) vaccinated, then infected during the delta wave; (3) vaccinated, then infected during the omicron wave; and (4) vaccinated without infection. IgG spike (S) and total nucleocapsid bAb were measured, along with nAb titers against S variants using reporter viral particle neutralization. RESULTS: Mean S IgG bAb peaks after infection alone were lower than after primary and booster vaccinations, and higher after delta and omicron infection in previously vaccinated donors. Half-lives for S IgG ranged from 34 to 66 days after first infection/vaccination events and up to 108 days after second events. The levels of S IgG bAb and nAb were similar across different variants, except for omicron, which were lower. Better correlations of nAb with bAb were observed at higher levels (hybrid immunity) than at the current FDA CCP qualifying threshold. DISCUSSION: Routine plasma donations from donors with hybrid immunity had high S bAb and potent neutralizing activity for 3-6 months after infection. In donations with high (>4000 BAU/mL) S IgG, >95% had high nAb titers (>500) against ancestral and variant S, regardless of COVID-19 symptoms. These findings provide the basis for test-based criteria for qualifying CCP from routine blood donations. |
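A hedged sketch of how an antibody half-life can be estimated from serial titers via a log-linear decay fit, consistent in spirit with the half-lives reported above; the titer values and sampling days below are invented, and this is not the study's actual estimation procedure.

```python
# Minimal sketch: estimate an antibody half-life from serial titers by
# fitting exponential decay on the log scale. Values are illustrative.
import numpy as np

days = np.array([0, 30, 60, 90, 120])
titer = np.array([4000, 2800, 2000, 1400, 1000])  # S IgG BAU/mL (example)

# log(titer) = intercept + slope * day  ->  half-life = ln(2) / (-slope)
slope, intercept = np.polyfit(days, np.log(titer), 1)
half_life = np.log(2) / -slope
print(f"Estimated half-life: {half_life:.0f} days")
```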
Vital signs: Suicide rates and selected county-level factors - United States, 2022
Cammack AL , Stevens MR , Naumann RB , Wang J , Kaczkowski W , Valderrama J , Stone DM , Lee R . MMWR Morb Mortal Wkly Rep 2024 73 (37) 810-818 INTRODUCTION: Approximately 49,000 persons died by suicide in the United States in 2022, and provisional data indicate that a similar number died by suicide in 2023. A comprehensive approach that addresses upstream community risk and protective factors is an important component of suicide prevention. A better understanding of the role of these factors is needed, particularly among disproportionately affected populations. METHODS: Suicide deaths were identified in the 2022 National Vital Statistics System. County-level factors, identified from federal data sources, included health insurance coverage, household broadband Internet access, and household income. Rates and levels of factors categorized by tertiles were calculated and presented by race and ethnicity, sex, age, and urbanicity. RESULTS: In 2022, the overall suicide rate was 14.2 per 100,000 population; rates were highest among non-Hispanic American Indian or Alaska Native (AI/AN) persons (27.1), males (23.0), and rural residents (20.0). On average, suicide rates were lowest in counties in the top one third of percentage of persons or households with health insurance coverage (13.0), access to broadband Internet (13.3), and income >100% of the federal poverty level (13.5). These factors were more strongly associated with lower suicide rates in some disproportionately affected populations; among AI/AN persons, suicide rates in counties in the highest tertile of these factors were approximately one half the rates of counties in the lowest tertile. CONCLUSIONS AND IMPLICATIONS FOR PUBLIC HEALTH PRACTICE: Higher levels of health insurance coverage, household broadband Internet access, and household income in communities might play a role in reducing suicide rates. Upstream programs, practices, and policies detailed in CDC's Suicide Prevention Resource for Action can be implemented by decision-makers, government agencies, and communities as they work together to address community-specific needs and save lives. |
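A minimal sketch of the county-level calculation described above: suicide rates per 100,000 population, with counties grouped into tertiles of a community factor (here, health insurance coverage). All values and column names are synthetic illustrations, not NVSS or other federal data.

```python
# Minimal sketch: county suicide rates per 100,000 and tertile comparison.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
counties = pd.DataFrame({
    "suicide_deaths": rng.poisson(30, 300),
    "population": rng.integers(50_000, 500_000, 300),
    "pct_insured": rng.uniform(70, 98, 300),
})

# rate per 100,000 population for each county
counties["rate_per_100k"] = counties["suicide_deaths"] / counties["population"] * 100_000

# group counties into tertiles of insurance coverage
counties["insurance_tertile"] = pd.qcut(counties["pct_insured"], 3,
                                        labels=["low", "middle", "high"])

# population-weighted rate within each tertile
summary = (counties.groupby("insurance_tertile", observed=True)
           .agg(deaths=("suicide_deaths", "sum"), pop=("population", "sum")))
summary["rate_per_100k"] = summary["deaths"] / summary["pop"] * 100_000
print(summary["rate_per_100k"].round(1))
```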
Trends in drug overdose deaths by intent and drug categories, United States, 1999‒2022
Nguyen A , Wang J , Holland KM , Ehlman DC , Welder LE , Miller KD , Stone DM . Am J Public Health 2024 e1-e5 Objectives. To examine trends in overdose deaths by intent and drug category to better understand the recent decrease in overdose suicides amid the overdose epidemic. Methods. We examined trends in rates of overdose deaths by intent (unintentional, suicide, or undetermined) across 9 drug categories from 1999 to 2022 using US National Vital Statistics System mortality data. Results. Unintentional overdoses involving synthetic opioids, polydrug toxicity involving synthetic opioids, psychostimulants, and cocaine increased exponentially with annual percentage changes ranging from 15.0% to 104.9% during 2010 to 2022. The death rates also increased for suicides involving these drugs, especially for psychostimulants (annual percentage change = 12.9% for 2010-2022; P < .001). However, these drugs accounted for relatively small percentages of overdose suicides. The leading drug categories among suicides were antidepressants, prescription opioids, and benzodiazepines, though these deaths have decreased or leveled off in recent years. Conclusions. Different drugs commonly involved in suicides and unintentional overdoses may contribute to their divergent trends. Public Health Implications. Amid the overdose epidemic, safe storage of medications remains a crucial strategy to prevent overdose suicides. The large increases in suicides involving psychostimulants warrant monitoring. (Am J Public Health. Published online ahead of print August 8, 2024:e1-e5. https://doi.org/10.2105/AJPH.2024.307745). |
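A minimal sketch of estimating an annual percentage change (APC) from a log-linear trend in death rates, the summary measure reported above; the rates below are simulated, not NVSS mortality data.

```python
# Minimal sketch: annual percentage change (APC) from a log-linear trend.
import numpy as np

years = np.arange(2010, 2023)
rate = 2.0 * 1.15 ** (years - 2010)  # example rate growing ~15% per year

slope, _ = np.polyfit(years, np.log(rate), 1)
apc = (np.exp(slope) - 1) * 100
print(f"APC: {apc:.1f}% per year")  # ~15.0% for this example
```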
Intersection of adverse childhood experiences, suicide and overdose prevention
Austin AE , DePadilla L , Niolon P , Stone D , Bacon S . Inj Prev 2024 Adverse childhood experiences (ACEs), suicide and overdose are linked across the life course and across generations and share common individual-, interpersonal-, community- and societal-level risk factors. The purpose of this review is to summarise the shared aetiology of these public health issues, synthesise evidence regarding potential community- and societal-level prevention strategies and discuss future research and practice directions. Growing evidence shows the potential for community- and societal-level programmes and policies, including higher minimum wage; expanded Medicaid eligibility; increased earned income tax credits, child tax credits and temporary assistance for needy families benefits; Paid Family Leave; greater availability of affordable housing and rental assistance; and increased participation in the Supplemental Nutrition Assistance Program (SNAP), to contribute to ACEs, suicide and overdose prevention. Considerations for future prevention efforts include (1) expanding the evidence base through rigorous research and evaluation; (2) assessing the implications of prevention strategies for equity; (3) incorporating a relational health perspective; (4) enhancing community capacity to implement, scale and sustain evidence-informed prevention strategies; and (5) acknowledging that community- and societal-level prevention strategies are longer-term strategies. |
Absence of lung tumor promotion with reduced tumor size in mice after inhalation of copper welding fumes
Zeidler-Erdely PC , Kodali V , Falcone LM , Mercer R , Leonard SS , Stefaniak AB , Grose L , Salmen R , Trainor-DeArmitt T , Battelli LA , McKinney W , Stone S , Meighan TG , Betler E , Friend S , Hobbie KR , Service S , Kashon M , Antonini JM , Erdely A . Carcinogenesis 2024 Welding fumes are classified as a Group 1 (carcinogenic to humans) carcinogen by the International Agency for Research on Cancer. The process of welding creates inhalable fumes rich in iron (Fe) that may also contain known carcinogenic metals such as chromium (Cr) and nickel (Ni). Epidemiological evidence has shown that both mild-steel (Fe-rich) and stainless steel (Fe-rich + Cr + Ni) welding fume exposures increase lung cancer risk, and experimental animal data support these findings. Copper-nickel (CuNi) welding processes have not been investigated in the context of lung cancer. Cu is intriguing, however, given its role in carcinogenesis and cancer therapeutics. This study examines the potential for a CuNi fume to induce mechanistic key characteristics of carcinogenesis in vitro and to promote lung tumorigenesis in vivo, using a two-stage mouse bioassay. Male A/J mice, initiated with 3-methylcholanthrene (MCA; 10 µg/g), were exposed to CuNi fumes or air by whole-body inhalation for nine weeks (low deposition [LD] and high deposition [HD]), then sacrificed at 30 weeks. In BEAS-2B cells, the CuNi fume induced micronuclei and caused DNA damage as measured by γ-H2AX. The fume exhibited high reactivity and a dose response in cytotoxicity and oxidative stress. In vivo, MCA/CuNi HD and LD exposures significantly decreased lung tumor size and the number of adenomas. MCA/CuNi HD exposure significantly decreased the gross-evaluated tumor number. In summary, the CuNi fume exhibited characteristics of a carcinogen in vitro, but in vivo the exposure resulted in smaller tumors, fewer adenomas, less severe hyperplasia, and, with HD exposure, fewer lung lesions/tumors overall. |
Detection of nucleocapsid antibodies associated with primary SARS-CoV-2 infection in unvaccinated and vaccinated blood donors
Grebe E , Stone M , Spencer BR , Akinseye A , Wright D , Di Germanio C , Bruhn R , Zurita KG , Contestable P , Green V , Lanteri MC , Saa P , Biggerstaff BJ , Coughlin MM , Kleinman S , Custer B , Jones JM , Busch MP . Emerg Infect Dis 2024 30 (8) Nucleocapsid antibody assays can be used to estimate SARS-CoV-2 infection prevalence in regions implementing spike-based COVID-19 vaccines. However, poor sensitivity of nucleocapsid antibody assays in detecting infection after vaccination has been reported. We derived a lower cutoff for identifying previous infections in a large blood donor cohort (N = 142,599) by using the Ortho VITROS Anti-SARS-CoV-2 Total-N Antibody assay, improving sensitivity while maintaining specificity >98%. We validated sensitivity in samples donated after self-reported, swab-confirmed infections. Sensitivity for first infections in unvaccinated donors was 98.1% (95% CI 98.0-98.2) and for infection after vaccination was 95.6% (95% CI 95.6-95.7) based on the standard cutoff. Regression analysis showed sensitivity was reduced in the Delta period compared with the Omicron period, in older donors, in asymptomatic infections, <30 days after infection, and for infection after vaccination. The standard Ortho N antibody threshold demonstrated good sensitivity, which was modestly improved with the revised cutoff. |
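A hedged sketch of deriving a lower assay cutoff that maximizes sensitivity while holding specificity above 98%, the approach described above; the signal-to-cutoff (S/CO) values and infection labels are simulated, and this is not the study's actual derivation.

```python
# Minimal sketch: scan candidate cutoffs and keep the one with the highest
# sensitivity among those with specificity > 98%. Data are simulated.
import numpy as np

rng = np.random.default_rng(1)
sco = np.concatenate([rng.lognormal(1.5, 1.0, 500),    # previously infected
                      rng.lognormal(-2.0, 0.7, 500)])  # never infected
infected = np.concatenate([np.ones(500), np.zeros(500)])

best = None
for cutoff in np.unique(sco):
    positive = sco >= cutoff
    sens = positive[infected == 1].mean()
    spec = (~positive[infected == 0]).mean()
    if spec > 0.98 and (best is None or sens > best[1]):
        best = (cutoff, sens, spec)

cutoff, sens, spec = best
print(f"Revised cutoff: {cutoff:.3f} (sensitivity {sens:.1%}, specificity {spec:.1%})")
```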
Detection of Leptospira kirschneri in a short-beaked common dolphin (Delphinus delphis delphis) stranded off the coast of southern California, USA
Prager KC , Danil K , Wurster E , Colegrove KM , Galloway R , Kettler N , Mani R , McDonough RF , Sahl JW , Stone NE , Wagner DM , Lloyd-Smith JO . BMC Vet Res 2024 20 (1) 266 BACKGROUND: Pathogenic Leptospira species are globally important zoonotic pathogens capable of infecting a wide range of host species. In marine mammals, reports of Leptospira have predominantly been in pinnipeds, with isolated reports of infections in cetaceans. CASE PRESENTATION: On 28 June 2021, a 150.5-cm-long female short-beaked common dolphin (Delphinus delphis delphis) stranded alive on the coast of southern California and subsequently died. Gross necropsy revealed multifocal cortical pallor within the reniculi of the kidney, and lymphoplasmacytic tubulointerstitial nephritis was observed histologically. Immunohistochemistry confirmed Leptospira infection, and PCR followed by lfb1 gene amplicon sequencing suggested that the infecting organism was L. kirschneri. Leptospira DNA capture and enrichment allowed for whole-genome sequencing to be conducted. Phylogenetic analyses confirmed the causative agent was a previously undescribed, divergent lineage of L. kirschneri. CONCLUSIONS: We report the first detection of pathogenic Leptospira in a short-beaked common dolphin, and the first detection in any cetacean in the northeastern Pacific Ocean. Renal lesions were consistent with leptospirosis in other host species, including marine mammals, and were the most significant lesions detected overall, suggesting leptospirosis as the likely cause of death. We identified the cause of the infection as L. kirschneri, a species detected only once before in a marine mammal - a northern elephant seal (Mirounga angustirostris) of the northeastern Pacific. These findings raise questions about the mechanism of transmission, given the obligate marine lifestyle of cetaceans (in contrast to pinnipeds, which spend time on land) and the commonly accepted view that Leptospira are quickly killed by salt water. They also raise important questions regarding the source of infection, and whether it arose from transmission among marine mammals or from terrestrial-to-marine spillover. Moving forward, surveillance and sampling must be expanded to better understand the extent to which Leptospira infections occur in the marine ecosystem and possible epidemiological linkages between and among marine and terrestrial host species. Generating Leptospira genomes from different host species will yield crucial information about possible transmission links, and our study highlights the power of new techniques such as DNA enrichment to illuminate the complex ecology of this important zoonotic pathogen. |
A mutation associated with resistance to synthetic pyrethroids is widespread in US populations of the tropical lineage of Rhipicephalus sanguineus s.l
Stone NE , Ballard R , Bourgeois RM , Pemberton GL , McDonough RF , Ruby MC , Backus LH , López-Pérez AM , Lemmer D , Koch Z , Brophy M , Paddock CD , Kersh GJ , Nicholson WL , Sahl JW , Busch JD , Salzer JS , Foley JE , Wagner DM . Ticks Tick Borne Dis 2024 15 (4) 102344 The brown dog tick, Rhipicephalus sanguineus sensu lato (s.l.), is an important vector for Rickettsia rickettsii, causative agent of Rocky Mountain spotted fever. Current public health prevention and control efforts to protect people involve preventing tick infestations on domestic animals and in and around houses. Primary prevention tools rely on acaricides, often synthetic pyrethroids (SPs); resistance to this chemical class is widespread in ticks and other arthropods. Rhipicephalus sanguineus s.l. is a complex that likely contains multiple unique species, and although the distribution of this complex is global, there are differences in morphology, ecology, and perhaps vector competence among these major lineages. Two major lineages within Rh. sanguineus s.l., commonly referred to as temperate and tropical, have been documented from multiple locations in North America, but are thought to occupy different ecological niches. To evaluate potential acaricide resistance and better define the distributions of the tropical and temperate lineages throughout the US and in northern Mexico, we employed a highly multiplexed amplicon sequencing approach to characterize sequence diversity at: 1) three loci within the voltage-gated sodium channel (VGSC) gene, which contains numerous genetic mutations associated with resistance to SPs; 2) a region of the gamma-aminobutyric acid-gated chloride channel gene (GABA-Cl) containing several mutations associated with dieldrin/fipronil resistance in other species; and 3) three mitochondrial genes (COI, 12S, and 16S). We utilized a geographically diverse set of Rh. sanguineus s.l. ticks collected from domestic pets in the US in 2013 and a smaller set of ticks collected from canines in Baja California, Mexico in 2021. We determined that a single nucleotide polymorphism (T2134C) in domain III segment 6 of the VGSC, which has previously been associated with SP resistance in Rh. sanguineus s.l., was widespread and abundant in tropical lineage ticks (>50 %) but absent from the temperate lineage, suggesting that resistance to SPs may be common in the tropical lineage. We found evidence of multiple copies of GABA-Cl in ticks from both lineages, with some copies containing mutations associated with fipronil resistance in other species, but the effects of these patterns on fipronil resistance in Rh. sanguineus s.l. are currently unknown. The tropical lineage was abundant and geographically widespread, accounting for 79 % of analyzed ticks and present at 13/14 collection sites. The temperate and tropical lineages co-occurred in four US states, and as far north as New York. None of the ticks we examined were positive for Rickettsia rickettsii or Rickettsia massiliae. |
Reducing hospitalizations and multidrug-resistant organisms via regional decolonization in hospitals and nursing homes
Gussin GM , McKinnell JA , Singh RD , Miller LG , Kleinman K , Saavedra R , Tjoa T , Gohil SK , Catuna TD , Heim LT , Chang J , Estevez M , He J , O'Donnell K , Zahn M , Lee E , Berman C , Nguyen J , Agrawal S , Ashbaugh I , Nedelcu C , Robinson PA , Tam S , Park S , Evans KD , Shimabukuro JA , Lee BY , Fonda E , Jernigan JA , Slayton RB , Stone ND , Janssen L , Weinstein RA , Hayden MK , Lin MY , Peterson EM , Bittencourt CE , Huang SS . JAMA 2024 IMPORTANCE: Infections due to multidrug-resistant organisms (MDROs) are associated with increased morbidity, mortality, length of hospitalization, and health care costs. Regional interventions may be advantageous in mitigating MDROs and associated infections. OBJECTIVE: To evaluate whether implementation of a decolonization collaborative is associated with reduced regional MDRO prevalence, incident clinical cultures, infection-related hospitalizations, costs, and deaths. DESIGN, SETTING, AND PARTICIPANTS: This quality improvement study was conducted from July 1, 2017, to July 31, 2019, across 35 health care facilities in Orange County, California. EXPOSURES: Chlorhexidine bathing and nasal iodophor antisepsis for residents in long-term care and hospitalized patients in contact precautions (CP). MAIN OUTCOMES AND MEASURES: Baseline and end of intervention MDRO point prevalence among participating facilities; incident MDRO (nonscreening) clinical cultures among participating and nonparticipating facilities; and infection-related hospitalizations and associated costs and deaths among residents in participating and nonparticipating nursing homes (NHs). RESULTS: Thirty-five facilities (16 hospitals, 16 NHs, 3 long-term acute care hospitals [LTACHs]) adopted the intervention. Comparing decolonization with baseline periods among participating facilities, the mean (SD) MDRO prevalence decreased from 63.9% (12.2%) to 49.9% (11.3%) among NHs, from 80.0% (7.2%) to 53.3% (13.3%) among LTACHs (odds ratio [OR] for NHs and LTACHs, 0.48; 95% CI, 0.40-0.57), and from 64.1% (8.5%) to 55.4% (13.8%) (OR, 0.75; 95% CI, 0.60-0.93) among hospitalized patients in CP. When comparing decolonization with baseline among NHs, the mean (SD) monthly incident MDRO clinical cultures changed from 2.7 (1.9) to 1.7 (1.1) among participating NHs, from 1.7 (1.4) to 1.5 (1.1) among nonparticipating NHs (group × period interaction reduction, 30.4%; 95% CI, 16.4%-42.1%), from 25.5 (18.6) to 25.0 (15.9) among participating hospitals, from 12.5 (10.1) to 14.3 (10.2) among nonparticipating hospitals (group × period interaction reduction, 12.9%; 95% CI, 3.3%-21.5%), and from 14.8 (8.6) to 8.2 (6.1) among LTACHs (all facilities participating; 22.5% reduction; 95% CI, 4.4%-37.1%). For NHs, the rate of infection-related hospitalizations per 1000 resident-days changed from 2.31 during baseline to 1.94 during intervention among participating NHs, and from 1.90 to 2.03 among nonparticipating NHs (group × period interaction reduction, 26.7%; 95% CI, 19.0%-34.5%). Associated hospitalization costs per 1000 resident-days changed from $64 651 to $55 149 among participating NHs and from $55 151 to $59 327 among nonparticipating NHs (group × period interaction reduction, 26.8%; 95% CI, 26.7%-26.9%). Associated hospitalization deaths per 1000 resident-days changed from 0.29 to 0.25 among participating NHs and from 0.23 to 0.24 among nonparticipating NHs (group × period interaction reduction, 23.7%; 95% CI, 4.5%-43.0%). CONCLUSIONS AND RELEVANCE: A regional collaborative involving universal decolonization in long-term care facilities and targeted decolonization among hospital patients in CP was associated with lower MDRO carriage, infections, hospitalizations, costs, and deaths. |
Description of antibiotic use variability among US nursing homes using electronic health record data
Kabbani S , Wang SW , Ditz LL , Gouin KA , Palms D , Rowe TA , Hyun DY , Chi NW , Stone ND , Hicks LA . Antimicrob Steward Healthc Epidemiol 2021 1 (1) e58 BACKGROUND: Antibiotics are frequently prescribed in nursing homes; national data describing facility-level antibiotic use are lacking. The objective of this analysis was to describe variability in antibiotic use in nursing homes across the United States using electronic health record orders. METHODS: Antibiotic orders for 309,884 residents in 1,664 US nursing homes in 2016 were included in this retrospective cohort study. Antibiotic use rates were calculated as antibiotic days of therapy (DOT) per 1,000 resident days and were compared by type of stay (short stay ≤100 days vs long stay >100 days). Prescribing indications and the duration of nursing home-initiated antibiotic orders were described. Facility-level correlations of antibiotic use, adjusting for resident health and facility characteristics, were assessed using multivariate linear regression models. RESULTS: In 2016, 54% of residents received at least 1 systemic antibiotic. The overall rate of antibiotic use was 88 DOT per 1,000 resident days. The 3 most common antibiotic classes prescribed were fluoroquinolones (18%), cephalosporins (18%), and urinary anti-infectives (9%). Antibiotics were most frequently prescribed for urinary tract infections, and the median duration of an antibiotic course was 7 days (interquartile range, 5-10). Higher facility antibiotic use rates correlated positively with higher proportions of short-stay residents, for-profit ownership, residents with low cognitive performance, and having at least 1 resident on a ventilator. Available facility-level characteristics predicted only a small proportion of the observed variability (model R(2) = 0.24). CONCLUSIONS: Using electronic health record orders, variability was found among US nursing-home antibiotic prescribing practices, highlighting potential opportunities for targeted improvement of prescribing practices. |
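The antibiotic use rate defined in the METHODS above, days of therapy (DOT) per 1,000 resident days, can be computed as in the brief sketch below; the order-level columns and counts are illustrative assumptions, not the study's data.

```python
# Minimal sketch: antibiotic days of therapy (DOT) per 1,000 resident days,
# computed from order-level data. Values are illustrative.
import pandas as pd

orders = pd.DataFrame({
    "facility_id": ["A", "A", "B", "B", "B"],
    "days_of_therapy": [7, 10, 5, 7, 14],
})
facility_resident_days = pd.Series({"A": 21_900, "B": 36_500})

dot = orders.groupby("facility_id")["days_of_therapy"].sum()
rate = dot / facility_resident_days * 1_000
print(rate.round(1))  # DOT per 1,000 resident days, by facility
```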
Comparative diagnostic utility of SARS-CoV-2 rapid antigen and molecular testing in a community setting
Kim AE , Bennett JC , Luiten K , O'Hanlon JA , Wolf CR , Magedson A , Han PD , Acker Z , Regelbrugge L , McCaffrey KM , Stone J , Reinhart D , Capodanno BJ , Morse SS , Bedford T , Englund JA , Boeckh M , Starita LM , Uyeki TM , Carone M , Weil A , Chu HY . J Infect Dis 2024 BACKGROUND: SARS-CoV-2 antigen-detection rapid diagnostic tests (Ag-RDTs) have become widely utilized, but longitudinal characterization of their community-based performance remains incompletely understood. METHODS: This prospective longitudinal study at a large public university in Seattle, WA, utilized remote enrollment, online surveys, and self-collected nasal swab specimens to evaluate Ag-RDT performance against real-time reverse transcription polymerase chain reaction (rRT-PCR) in the context of SARS-CoV-2 Omicron. Ag-RDT sensitivity and specificity within 1 day of rRT-PCR were evaluated by symptom status throughout the illness episode and by Orf1b cycle threshold (Ct). RESULTS: From February to December 2022, 5,757 participants reported 17,572 Ag-RDT results and completed 12,674 rRT-PCR tests, of which 995 (7.9%) were rRT-PCR-positive. Overall sensitivity and specificity were 53.0% (95% CI: 49.6-56.4%) and 98.8% (98.5-99.0%), respectively. Sensitivity was comparatively higher for Ag-RDTs used 1 day after rRT-PCR (69.0%), 4 to 7 days post-symptom onset (70.1%), and at Orf1b Ct ≤20 (82.7%). Serial Ag-RDT sensitivity increased with repeat testing ≥2 (68.5%) and ≥4 (75.8%) days after an initial Ag-RDT-negative result. CONCLUSION: Ag-RDT performance varied by clinical characteristics and temporal testing patterns. Our findings support recommendations for serial testing following an initial Ag-RDT-negative result, especially among recently symptomatic persons or those at high risk for SARS-CoV-2 infection. |
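A minimal sketch of computing Ag-RDT sensitivity and specificity against rRT-PCR, with 95% confidence intervals, from 2x2 counts. The counts below are illustrative approximations chosen to roughly match the proportions reported above, not the study's actual tabulation.

```python
# Minimal sketch: sensitivity/specificity with Wilson 95% CIs from 2x2 counts.
from statsmodels.stats.proportion import proportion_confint

tp, fn = 527, 468     # Ag-RDT+/rRT-PCR+ and Ag-RDT-/rRT-PCR+ (illustrative)
tn, fp = 11539, 140   # Ag-RDT-/rRT-PCR- and Ag-RDT+/rRT-PCR- (illustrative)

sens = tp / (tp + fn)
spec = tn / (tn + fp)
sens_ci = proportion_confint(tp, tp + fn, alpha=0.05, method="wilson")
spec_ci = proportion_confint(tn, tn + fp, alpha=0.05, method="wilson")

print(f"Sensitivity: {sens:.1%} (95% CI {sens_ci[0]:.1%}-{sens_ci[1]:.1%})")
print(f"Specificity: {spec:.1%} (95% CI {spec_ci[0]:.1%}-{spec_ci[1]:.1%})")
```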
SARS-CoV-2 infection, inflammation and birth outcomes in a prospective NYC pregnancy cohort
Gigase FAJ , Jessel RH , Kaplowitz E , Boychuk N , Ohrn S , Ibroci Est , Castro J , Lynch J , Tubassum R , Balbierz A , Molenaar NM , Graziani M , Missall R , Flores T , Stern T , Carreno JM , Krammer F , Adler A , Brody RI , Lesseur C , Chen J , Ellington S , Galang RR , Snead MC , Howell E , Stone J , Bergink V , Dolan S , Lieb W , Rommel AS , de Witte LD , Janevic T . J Reprod Immunol 2024 163 104243 Associations between antenatal SARS-CoV-2 infection and pregnancy outcomes have been conflicting and the role of the immune system is currently unclear. This prospective cohort study investigated the interaction of antenatal SARS-CoV-2 infection, changes in cytokine and HS-CRP levels, birthweight and gestational age at birth. 2352 pregnant participants from New York City (2020-2022) were included. Plasma levels of interleukin (IL)-1β, IL-6, IL-17A and high-sensitivity C-reactive protein (HS-CRP) were quantified in blood specimens obtained across pregnancy. Quantile and linear regression models were conducted to 1) assess the impact of antenatal SARS-CoV-2 infection, overall and by timing of detection of SARS-CoV-2 positivity (< 20 weeks versus ≥ 20 weeks), on birthweight and gestational age at delivery; 2) examine the relationship between SARS-CoV-2 infection and maternal immune changes during pregnancy. All models were adjusted for maternal demographic and obstetric factors and pandemic timing. Birthweight models were additionally adjusted for gestational age at delivery and fetal sex. Immune marker models were also adjusted for gestational age at specimen collection and multiplex assay batch. 371 (15.8%) participants were infected with SARS-CoV-2 during pregnancy, of which 98 (26.4%) were infected at < 20 weeks gestation. Neither SARS-CoV-2 infection in general nor in early or late pregnancy was associated with lower birthweight nor earlier gestational age at delivery. Further, we did not observe cytokine or HS-CRP changes in response to SARS-CoV-2 infection and thus found no evidence to support a potential association between immune dysregulation and the diversity in pregnancy outcomes following infection. |
Notes from the field: Emergency department visits for unsupervised pediatric melatonin ingestion - United States, 2019-2022
Freeman DI , Lind JN , Weidle NJ , Geller AI , Stone ND , Lovegrove MC . MMWR Morb Mortal Wkly Rep 2024 73 (9) 215-217 |
Economic cost of US suicide and nonfatal self-harm
Peterson C , Haileyesus T , Stone DM . Am J Prev Med 2024 INTRODUCTION: The US age-adjusted suicide rate is 35% higher than two decades ago and the COVID-19 pandemic era highlighted the urgent need to address nonfatal self-harm, particularly among youth. This study aimed to report the estimated annual economic cost of US suicide and nonfatal self-harm. METHODS: In 2023 CDC's WISQARS Cost of Injury provided the retrospective number of suicides and nonfatal self-harm injury emergency department (ED) visits from national surveillance sources by sex and age group, as well as the estimated annual economic cost of associated medical spending, lost work productivity, reduced quality of life from injury morbidity, and avoidable mortality based on the value of statistical life during 2015-2020. RESULTS: The economic cost of suicide and nonfatal self-harm averaged $510 billion (2020 USD) annually, the majority from life years lost to suicide. Working-aged adults (aged 25-64 years) comprised nearly 75% of the average annual economic cost of suicide ($356B of $484B) and children and younger adults (aged 10-44 years) comprised nearly 75% of the average annual economic cost of nonfatal self-harm injuries ($19B of $26B). CONCLUSIONS: Suicide and self-harm have substantial societal costs. Measuring the consequences in terms of comprehensive economic cost can inform investments in suicide prevention strategies. |
CDC guidance for community response to suicide clusters, United States, 2024
Ivey-Stephenson AZ , Ballesteros MF , Trinh E , Stone DM , Crosby AE . MMWR Suppl 2024 73 (2) 17-26 This is the third of three reports in the MMWR supplement that updates and expands CDC's guidance for assessing, investigating, and responding to suicide clusters based on current science and public health practice. The first report, Background and Rationale - CDC Guidance for Communities Assessing, Investigating, and Responding to Suicide Clusters, United States, 2024, describes an overview of suicide clusters, methods used to develop the supplement guidance, and intended use of the supplement reports. The second report, CDC Guidance for Community Assessment and Investigation of Suspected Suicide Clusters, United States, 2024, describes the potential methods, data sources, and analysis that communities can use to identify and confirm suspected suicide clusters and better understand the relevant issues. This report describes how local public health and community leaders can develop a response plan for suicide clusters. Specifically, the steps for responding to a suicide cluster include preparation, direct response, and action for prevention. These steps are not intended to be explicitly adopted but rather adapted into the local context, culture, capacity, circumstances, and needs for each suicide cluster. |
Suicidal ideation and behaviors among high school students - Youth Risk Behavior Survey, United States, 2019
Ivey-Stephenson AZ , Demissie Z , Crosby AE , Stone DM , Gaylor E , Wilkins N , Lowry R , Brown M . MMWR Suppl 2020 69 (1) 47-55 Suicide is the second leading cause of death, after unintentional injuries, among high school-aged youths 14-18 years. This report summarizes data regarding suicidal ideation (i.e., seriously considered suicide) and behaviors (i.e., made a suicide plan, attempted suicide, and made a suicide attempt requiring medical treatment) from CDC's 2019 Youth Risk Behavior Survey. Results are reported overall and by sex, grade, race/ethnicity, sexual identity, and sex of sexual contacts, overall and within sex groups. Trends in suicide attempts during 2009-2019 are also reported by sex, race/ethnicity, and grade. During 2009-2019, prevalence of suicide attempts increased overall and among female, non-Hispanic white, non-Hispanic black, and 12th-grade students. Data from 2019 reflect substantial differences by demographics regarding suicidal ideation and behaviors. For example, during 2019, a total of 18.8% of students reported having seriously considered suicide, with prevalence estimates highest among females (24.1%); white non-Hispanic students (19.1%); students who reported having sex with persons of the same sex or with both sexes (54.2%); and students who identified as lesbian, gay, or bisexual (46.8%). Among all students, 8.9% reported having attempted suicide, with prevalence estimates highest among females (11.0%); black non-Hispanic students (11.8%); students who reported having sex with persons of the same sex or with both sexes (30.3%); and students who identified as lesbian, gay, or bisexual (23.4%). Comprehensive suicide prevention can address these differences and reduce prevalence of suicidal ideation and behaviors by implementing programs, practices, and policies that prevent suicide (e.g., parenting programs), supporting persons currently at risk (e.g., psychotherapy), preventing reattempts (e.g., emergency department follow-up), and attending to persons who have lost a friend or loved one to suicide. |
Trends in violence victimization and suicide risk by sexual identity among high school students - Youth Risk Behavior Survey, United States, 2015-2019
Johns MM , Lowry R , Haderxhanaj LT , Rasberry CN , Robin L , Scales L , Stone D , Suarez NA . MMWR Suppl 2020 69 (1) 19-27 Lesbian, gay, and bisexual (LGB) youths continue to experience more violence victimization and suicide risk than heterosexual youths; however, few studies have examined whether the proportion of LGB youths affected by these outcomes has varied over time, and no studies have assessed such trends in a nationally representative sample. This report analyzes national trends in violence victimization and suicide risk among high school students by self-reported sexual identity (LGB or heterosexual) and evaluates differences in these trends among LGB students by sex (male or female) and race/ethnicity (non-Hispanic black, non-Hispanic white, or Hispanic). Data for this analysis were derived from the 2015, 2017, and 2019 cycles of CDC's Youth Risk Behavior Survey (YRBS), a cross-sectional, school-based survey conducted biennially since 1991. Logistic regression models assessed linear trends in prevalence of violence victimization and indicators of suicide risk among LGB and heterosexual students during 2015-2019; in subsequent models, sex-stratified (controlling for race/ethnicity and grade) and race/ethnicity-stratified (controlling for sex and grade) linear trends were examined for students self-identifying as LGB during 2015-2019. Results demonstrated that LGB students experienced more violence victimization and reported more suicide risk behaviors than heterosexual youths. Among LGB youths, differences in the proportion reporting violence victimization and suicide risk by sex and race/ethnicity were found. Across analyses, very few linear trends in these outcomes were observed among LGB students. Results highlight the continued need for comprehensive intervention strategies within schools and communities with the express goal of reducing violence victimization and preventing suicide risk behaviors among LGB students. |