Last data update: Jan 21, 2025. (Total: 48615 publications since 2009)
Records 1-30 (of 12936 Records)
Household transmission of SARS-CoV-2 in five US jurisdictions: Comparison of Delta and Omicron variants
Baker JM , Nakayama JY , O'Hegarty M , McGowan A , Teran RA , Bart SM , Sosa LE , Brockmeyer J , English K , Mosack K , Bhattacharyya S , Khubbar M , Yerkes NR , Campos B , Paegle A , McGee J , Herrera R , Pearlowitz M , Williams TW , Kirking HL , Tate JE . PLoS One 2025 20 (1) e0313680 Households are a significant source of SARS-CoV-2 transmission, even during periods of low community-level spread. Comparing household transmission rates by SARS-CoV-2 variant may provide relevant information about current risks and prevention strategies. This investigation aimed to estimate differences in household transmission risk comparing the SARS-CoV-2 Delta and Omicron variants using data from contact tracing and interviews conducted from November 2021 through February 2022 in five U.S. public health jurisdictions (City of Chicago, Illinois; State of Connecticut; City of Milwaukee, Wisconsin; State of Maryland; and State of Utah). Generalized estimating equations were used to estimate attack rates and relative risks for index case and household contact characteristics. Data from 848 households, including 2,622 individuals (median household size = 3), were analyzed. Overall transmission risk was similar in households during Omicron variant circulation (attack rate = 47.0%) and Delta variant circulation (attack rate = 48.0%). In the multivariable model, a pattern of increased transmission risk was observed with increased time since a household contact's last COVID-19 vaccine dose in Delta households, although confidence intervals overlapped (0-3 months relative risk = 0.8, confidence interval: 0.5-1.2; 4-7 months relative risk = 1.3, 0.9-1.8; ≥8 months relative risk = 1.2, 0.7-1.8); no pattern was observed in Omicron households. Risk for household contacts of symptomatic index cases was twice that of household contacts of asymptomatic index cases (relative risk = 2.0, 95% confidence interval: 1.4-2.9), emphasizing the importance of symptom status, regardless of variant.
Uniquely, this study adjusted risk estimates for several index case and household contact characteristics and demonstrated that few characteristics strongly dictate risk, likely reflecting the complexity of the biological and social factors that combine to impact SARS-CoV-2 transmission. |
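As an illustrative sketch of the quantities reported above, a household secondary attack rate is simply the proportion of contacts infected, and the relative risk is the ratio of two such rates. The counts below are hypothetical (chosen only to reproduce the RR = 2.0 pattern for symptomatic vs. asymptomatic index cases); the study itself used generalized estimating equations to account for household clustering.

```python
# Hypothetical illustration of a household secondary attack rate (SAR)
# and a relative risk (RR). Counts are invented, not the study's data.

def attack_rate(infected: int, contacts: int) -> float:
    """Proportion of household contacts who became infected."""
    return infected / contacts

# Contacts of symptomatic vs. asymptomatic index cases (hypothetical counts)
ar_symptomatic = attack_rate(48, 100)   # 0.48
ar_asymptomatic = attack_rate(24, 100)  # 0.24

# Relative risk: infection risk for contacts of symptomatic index cases
# relative to contacts of asymptomatic index cases
rr = ar_symptomatic / ar_asymptomatic

print(f"SAR (symptomatic index):  {ar_symptomatic:.2f}")
print(f"SAR (asymptomatic index): {ar_asymptomatic:.2f}")
print(f"Relative risk: {rr:.1f}")  # 2.0
```

Crude ratios like these ignore within-household correlation, which is why the study fit GEE models rather than simple proportions.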
Effectiveness of a tailored forest package of interventions, including topical repellents, in reducing malaria incidence in Myanmar
Win KM , Gimnig JE , Linn NYY , Monti F , Khin NN , Hawley WA , Hwang J , Wiegand RE , Topcuoglu E , Moran A , Lin K , Thadar H , Myint AA , Tun KM . Malar J 2025 24 (1) 7 BACKGROUND: In Myanmar, progress towards malaria elimination has stagnated in some areas, requiring the deployment of new tools and approaches to accelerate malaria elimination. While there is evidence that networks of community-based malaria workers and insecticide-treated nets (ITNs) can reduce malaria transmission in a variety of settings, evidence for the effectiveness of other interventions, such as topical repellents, is limited. Since malaria transmission in Myanmar occurs outdoors, mainly among forest-goers, this study tested the effectiveness of topical repellents in combination with supplemental ITN distribution and strengthened networks of malaria workers. METHODS: Thirty-eight villages in the Tanintharyi Region and Rakhine State were initially selected for the study based on malaria incidence in previous years. An additional 31 villages were included as comparison areas. The implementation of interventions began in March 2020 and continued through June 2021. Malaria cases were detected in all villages through surveillance at health facilities and a network of malaria workers. Data were analysed by interrupted time series. A nested case-control study was also conducted in which forest-goers who tested positive for malaria by rapid diagnostic test (RDT) were matched to up to three forest-goers who tested negative. RESULTS: A decrease in mean monthly incidence was observed in the intervention villages from 6.0 (95% CI 4.9-7.1) to 3.7 (95% CI 2.4-4.9) cases per 1000 people at risk before and after the interventions. For the comparison villages, the mean monthly incidence increased from 1.1 (95% CI 0.8-1.5) to 5.7 (95% CI 2.1-9.3) cases per 1000 people at risk.
Malaria incidence was significantly lower following the implementation of the interventions (RR = 0.117; 95% CI 0.061-0.223; p < 0.001) in the intervention villages, whereas that of comparison villages was higher after the implementation of the interventions (RR = 3.558; 95% CI 0.311-40.750; p = 0.308). However, a significant trend for increasing malaria incidence after implementation was observed in the intervention villages (RR = 1.113; 95% CI 1.021-1.214, p = 0.015), suggesting a waning effect. The nested case-control analysis showed that the odds of topical repellent use were significantly lower among cases than controls (aOR: 0.063, 95% CI 0.013-0.313, p < 0.001). CONCLUSION: The tailored intervention package for forest-goers helped reduce malaria incidence in Myanmar. Topical repellents may help to further reduce malaria transmission in elimination settings where high-risk populations such as forest-goers do not have easy access to routine health services or are less likely to use ITNs for malaria prevention. |
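As a sketch of the pre/post comparison underlying rate ratios like those above, a crude rate ratio with a normal-approximation confidence interval can be computed from case counts and person-time. The counts and person-time below are hypothetical, chosen only to match the reported pre/post incidences (6.0 to 3.7 per 1000 per month) in intervention villages; the study's RR of 0.117 came from interrupted-time-series models adjusting for trend, not from this crude ratio.

```python
import math

# Crude pre/post rate-ratio sketch with a log-scale 95% CI.
# Counts and person-time are hypothetical illustrations.

def rate_ratio_ci(cases_pre, pt_pre, cases_post, pt_post, z=1.96):
    """Rate ratio (post vs. pre) with a normal-approximation 95% CI."""
    rr = (cases_post / pt_post) / (cases_pre / pt_pre)
    se = math.sqrt(1 / cases_pre + 1 / cases_post)  # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# 20,000 person-months in each period (hypothetical), giving
# 6.0 and 3.7 cases per 1000 person-months pre and post
rr, lo, hi = rate_ratio_ci(cases_pre=120, pt_pre=20_000,
                           cases_post=74, pt_post=20_000)
print(f"Crude RR = {rr:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```

A full interrupted-time-series analysis would additionally model the pre-intervention trend and the post-intervention change in slope, which is how the waning effect (RR = 1.113 per month) was detected.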
Associations of maternal per- and polyfluoroalkyl substance plasma concentrations during pregnancy with offspring polycystic ovary syndrome and related characteristics in Project Viva
Wang Z , Fleisch A , Rifas-Shiman SL , Calafat AM , James-Todd T , Coull BA , Chavarro JE , Hivert MF , Whooten RC , Perng W , Oken E , Mahalingaiah S . Environ Res 2025 120786 BACKGROUND: Per- and polyfluoroalkyl substances (PFAS) may impact ovarian folliculogenesis and steroidogenesis, but whether prenatal exposure may impact offspring reproductive health is unknown. This study examines the extent to which maternal PFAS plasma concentrations during pregnancy are associated with polycystic ovary syndrome (PCOS) and related characteristics in female offspring. METHODS: We studied 322 mother-daughter pairs in Project Viva, a Boston-area longitudinal pre-birth cohort enrolled in 1999-2002. We examined associations of maternal prenatal (median: 9.6 weeks gestation) plasma concentrations of six PFAS (log2 transformed) with PCOS and related characteristics among daughters during mid-to-late adolescence. We estimated the associations of single PFAS and PFAS as a mixture with each outcome, using logistic regression and quantile g-computation, respectively, adjusting for parity, and maternal sociodemographic and other lifestyle/health factors. RESULTS: Among the 322 mother-daughter pairs, the majority of mothers identified as non-Hispanic White and had a college degree, and 13% of daughters had either self-reported PCOS or probable PCOS based on irregular menstrual cycles and clinical or biochemical markers of hyperandrogenism. Among all daughters, 27% had irregular menstrual cycles, 34% had hirsutism, and 6% had moderate-to-severe acne.
When fully adjusted for confounders, per doubling of maternal 2-(N-ethyl-perfluorooctane sulfonamido) acetate (EtFOSAA) concentration was associated with higher odds of self-reported PCOS [OR (95% CI) = 2.66 (1.18, 5.99)], and per doubling of maternal perfluorononanoate (PFNA) concentration was associated with higher odds of moderate-to-severe acne [OR (95% CI) = 2.33 (1.09, 4.99)] in daughters with or without irregular menstrual cycles. We found no associations of the mixture of six PFAS with PCOS or related traits. CONCLUSION: Our findings suggest a positive association between maternal concentrations of EtFOSAA and PCOS in their daughters during mid-to-late adolescence, although future studies with larger sample size and extended follow-up across the reproductive life-course are needed. |
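The "per doubling" odds ratios above follow directly from the log2 transformation of exposure: if the logistic model is logit(P) = a + b·log2(concentration), then doubling the concentration adds 1 to log2(concentration) and multiplies the odds by e^b. A minimal numeric sketch (the coefficient below is back-derived from the reported EtFOSAA OR for illustration, not a fitted value):

```python
import math

# Why log2-transforming an exposure yields a "per doubling" odds ratio.
# beta is a hypothetical logistic coefficient on log2(concentration),
# chosen so that exp(beta) equals the reported OR of 2.66.

beta = math.log(2.66)                 # implies OR = 2.66 per doubling
or_per_doubling = math.exp(beta)

# A 4-fold (two-doubling) increase multiplies the odds twice over:
or_per_quadrupling = math.exp(2 * beta)  # = 2.66 ** 2

print(f"OR per doubling:    {or_per_doubling:.2f}")
print(f"OR per quadrupling: {or_per_quadrupling:.4f}")
```

This is why log2 (rather than natural-log) transformation is common for exposure biomarkers: the coefficient has an immediate "per doubling" interpretation.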
Supplemental operational guidance for minimizing potential inhalation doses to workers and volunteers at community reception centers and public shelters
Mauro J , Porrovecchio J , Amann W , Marschke S , Brightwell MS , Davison R , DeMore D , Mangel A , Anspaugh L , Salame-Alfie A , Ansari A . Health Phys 2025 In the event of a nuclear explosion in an urban environment, contaminated persons may be directed to Community Reception Centers (CRC) and/or public shelters. This paper is a companion document to a previous paper that addresses the inhalation hazard to workers at a CRC from resuspension of fallout from the evacuees. To limit the inhalation hazard, evacuees must be screened to prevent severely contaminated persons from entering a CRC. The suggested screening level is 10,000 dpm cm(-2), and rapid methods of screening arriving evacuees are presented. Practical advice is provided on methods that can be used to limit contamination within a CRC. These methods include alterations to heating and cooling systems and the implementation of monitoring strategies to guard against unexpected increases in airborne activity levels. |
Costs of influenza illness and acute respiratory infections by household income level: Catastrophic health expenditures and implications for health equity
Wodniak N , Gharpure R , Feng L , Lai X , Fang H , Tian J , Zhang T , Zhao G , Salcedo-Mejía F , Alvis-Zakzuk NJ , Jara J , Dawood F , Emukule GO , Ndegwa LK , Sam IC , Mend T , Jantsansengee B , Tempia S , Cohen C , Walaza S , Kittikraisak W , Riewpaiboon A , Lafond KE , Mejia N , Davis WW . Influenza Other Respir Viruses 2025 19 (1) e70059 BACKGROUND: Seasonal influenza illness and acute respiratory infections can impose a substantial economic burden in low- and middle-income countries (LMICs). We assessed the cost of influenza illness and acute respiratory infections across household income strata. METHODS: We conducted a secondary analysis of data from a prior systematic review of costs of influenza and other respiratory illnesses in LMICs and contacted authors to obtain data on cost of illness (COI) for laboratory-confirmed influenza-like illness and acute respiratory infection. We calculated the COI by household income strata and calculated the out-of-pocket (OOP) cost as a proportion of household income. RESULTS: We included 11 studies representing 11 LMICs. OOP expenses, as a proportion of annual household income, were highest among the lowest income quintile in 10 of 11 studies: in 4/4 studies among the general population, in 6/7 studies among children, in 2/2 studies among older adults, and in the sole study among adults with chronic medical conditions. COI was generally higher for hospitalizations compared with outpatient illnesses; median OOP costs for hospitalizations exceeded 10% of annual household income among the general population and children in Kenya, as well as for older adults and adults with chronic medical conditions in China. CONCLUSIONS: The findings indicate that influenza and acute respiratory infections pose a considerable economic burden, particularly from hospitalizations, on the lowest income households in LMICs.
Future evaluations could investigate specific drivers of COI in low-income households and identify interventions that may address these, including exploring household coping mechanisms. Cost-effectiveness analyses could incorporate health-inequity analyses in pursuit of health equity. |
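The core calculation above, out-of-pocket cost as a share of annual household income, flagged as "catastrophic" when it exceeds a threshold, can be sketched in a few lines. The 10% threshold mirrors the abstract's benchmark; the income and cost values below are hypothetical.

```python
# Catastrophic-health-expenditure sketch: out-of-pocket (OOP) cost as a
# share of annual household income, flagged above a 10% threshold.
# All values are hypothetical illustrations.

CATASTROPHIC_THRESHOLD = 0.10  # 10% of annual household income

def oop_share(oop_cost: float, annual_income: float) -> float:
    """OOP cost as a proportion of annual household income."""
    return oop_cost / annual_income

def is_catastrophic(oop_cost: float, annual_income: float) -> bool:
    return oop_share(oop_cost, annual_income) > CATASTROPHIC_THRESHOLD

# Hypothetical hospitalization for a low-income household
share = oop_share(oop_cost=250.0, annual_income=2_000.0)
print(f"OOP share of income: {share:.1%}")                  # 12.5%
print(f"Catastrophic: {is_catastrophic(250.0, 2_000.0)}")   # True
```

Because the denominator is household income, the same absolute OOP cost is far more likely to be catastrophic in the lowest income quintile, which is the pattern the review reports.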
Social determinants of health and unmet needs for services among young adults with HIV: Medical Monitoring Project, 2018-2021
Marcus R , Dasgupta S , Taussig J , Tie Y , Nair P , Prejean J . J Acquir Immune Defic Syndr 2025 BACKGROUND: Persons aged 13-24 years are a priority population in the National HIV/AIDS Strategy. Young adults with HIV have poorer health outcomes-including not being retained in care, antiretroviral nonadherence, and not being virally suppressed-than other persons with HIV. SETTING: Centers for Disease Control and Prevention's Medical Monitoring Project data collected June 2018 through May 2022. METHODS: We compared demographic characteristics, social determinants of health (SDOH), and mental health between persons aged 18-24 years with HIV versus persons aged ≥25 years with HIV. Among those aged 18-24 years, we analyzed total and unmet needs for ancillary services, defined as those that support care engagement, viral suppression, and overall health and well-being among people with HIV. RESULTS: Persons aged 18-24 years were more likely to have a household income <100% of the federal poverty level (48% vs. 39%) and to experience unstable housing or homelessness (37% vs. 18%) or hunger/food insecurity (29% vs. 18%) than those aged ≥25 years. Persons aged 18-24 years had higher median HIV stigma scores (40 vs. 29) and were more likely to experience symptoms of generalized anxiety disorder (21% vs. 15%) than those aged ≥25 years. Of persons aged 18-24 years, 96% had a need for ≥1 ancillary service, of whom 56% had ≥1 unmet need; unmet needs were highest for subsistence services (53%) and non-HIV medical services (41%). CONCLUSIONS: Addressing unmet needs for subsistence and non-HIV medical services could help reduce disparities in SDOH and mental health that drive inequities in health outcomes among persons with HIV aged 18-24 years. |
Association between county-level social vulnerability and CDC-funded HIV testing program outcomes in the United States, 2020-2022
Song W , Mulatu MS , Crepaz N , Wang G , Patel D , Xia M , Essuon A . J Acquir Immune Defic Syndr 2025 BACKGROUND: Community-level social vulnerabilities may affect HIV outcomes. This analysis assessed the association between county-level social vulnerability and CDC-funded HIV testing program outcomes. SETTING: HIV testing data from 60 state and local health departments and 119 community-based organizations were submitted to CDC during 2020-2022. METHODS: HIV testing data were combined with county-level Minority Health Social Vulnerability Index, which measures economic, medical, and social vulnerability. We calculated absolute and relative disparity measures for HIV testing program outcomes (i.e., HIV positivity, linkage to medical care, interview for partner services, referral to PrEP providers) between high and low social vulnerability counties. We compared differences in HIV testing program outcomes by demographic factors and test site type. RESULTS: The majority (85.8%) of the 4.9 million tests were conducted in high social vulnerability counties. HIV positivity (1.1%) and linkage to medical care after a new diagnosis (77.5%) were higher in high social vulnerability counties. However, interview for partner services after a new diagnosis (72.1%) and referrals to PrEP providers among eligible HIV-negative persons (48.1%) were lower in high social vulnerability counties. Additionally, the relative disparity in HIV testing program outcomes varied by demographic factors and test site type. CONCLUSION: CDC-funded HIV testing programs reach the most vulnerable communities. However, testing outcomes vary by community vulnerability, demographic factors, and test site type. Continued monitoring of the relationship between county-level social vulnerability and HIV testing program outcomes would guide HIV testing efforts and allocate resources effectively to achieve the national goal of ending the HIV epidemic. |
Health equity and viral hepatitis in the United States
Lewis KC , Heslin KC , McCree DH . Public Health Rep 2024 Disparities are evident in viral hepatitis morbidity, mortality, and outcomes. Disparities are considered an outcome of social determinants of health (SDoH), as systemic differences in the conditions in which people are born, grow, live, work, and age can lead to differences in health outcomes and access to health care services among population groups.(1,2) Disparities in viral hepatitis incidence and mortality are described in surveillance reports(3) and the literature(4,5); however, an examination of the influence of SDoH on disparities in viral hepatitis incidence, mortality, and outcomes is missing from the literature. This gap in the literature could be a direct result of limitations in viral hepatitis surveillance data in capturing relevant measures. However, examining data on social, economic, physical, and political environments of people affected by viral hepatitis is important for understanding the incidence and outcomes of the disease, developing interventions, and assessing progress toward achieving health equity.(1) This commentary discusses existing disparities in viral hepatitis, explores how SDoH may contribute to these disparities, and highlights opportunities to examine the influence of SDoH on viral hepatitis outcomes. |
Costs of typhoid vaccination for international travelers from the United States
Joo H , Maskery BA , Francois Watkins LK , Park J , Angelo KM , Halsey ES . Travel Med Infect Dis 2025 102798 In the United States, typhoid vaccination is recommended for international travelers to areas with a recognized risk of typhoid exposure. Using the MarketScan® Commercial Database from 2016 through 2022, we estimated typhoid vaccination costs by route (injectable vs. oral) and provider setting (clinic vs. pharmacy). Of 165,930 vaccinated individuals, 99,471 received injectable and 66,459 received oral typhoid vaccines, with 88% and 17%, respectively, administered at clinics. Average costs for injectable vaccination were $132.91 per person [95% confidence interval (CI): $132.68-$133.13], with clinic and pharmacy costs at $136.38 [95% CI: $136.14-$136.63] and $107.45 [95% CI: $107.13-$107.77], respectively. Oral vaccination costs averaged $81.23 per person [95% CI: $81.14-$81.33], encompassing $86.61 [95% CI: $86.13-$87.10] at clinics and $80.14 [95% CI: $80.09-$80.19] at pharmacies. Out-of-pocket costs comprised 21% and 33% of total costs for injectable and oral vaccinations, respectively. These findings may inform clinical decision-making to protect international travelers' health. |
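The overall per-person averages above are consistent with a setting-mix weighted mean of the clinic and pharmacy costs. As a quick consistency sketch using the rounded shares and setting-specific averages from the abstract (small discrepancies are expected from rounding):

```python
# Consistency check: overall average cost per person should equal the
# clinic/pharmacy mix-weighted mean of the setting-specific averages.
# Inputs are the rounded shares and costs reported in the abstract.

def weighted_avg(clinic_share: float, clinic_cost: float,
                 pharmacy_cost: float) -> float:
    """Mix-weighted average cost across clinic and pharmacy settings."""
    return clinic_share * clinic_cost + (1 - clinic_share) * pharmacy_cost

injectable = weighted_avg(0.88, 136.38, 107.45)
oral = weighted_avg(0.17, 86.61, 80.14)

print(f"Injectable: ${injectable:.2f}")  # ~$132.91, matching the abstract
print(f"Oral:       ${oral:.2f}")        # ~$81.24 (abstract: $81.23)
```

The oral estimate differs from the reported $81.23 by a cent, as expected when recombining values rounded to two decimals and whole percentages.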
Nasopharyngeal carriage of Streptococcus pneumoniae among children and their household members in southern Mozambique five years after PCV10 introduction
Kahn R , Moiane B , Lessa FC , Massora S , Mabombo V , Chauque A , Tembe N , Mucavele H , Whitney CG , Sacoor C , Matsinhe G , Pimenta FC , da Gloria Carvalho M , Sigauque B , Verani J . Vaccine 2025 47 126691 BACKGROUND: Streptococcus pneumoniae is an important cause of pneumonia, sepsis, and meningitis, which are leading causes of child mortality. Pneumococcal conjugate vaccines (PCVs) protect against disease and nasopharyngeal colonization with vaccine serotypes, reducing transmission to and among unvaccinated individuals. Mozambique introduced 10-valent PCV (PCV10) in 2013. In 2017-2019, 13-valent PCV (PCV13) replaced PCV10, and in September 2019 the schedule changed from three primary doses to two primary doses and a booster; the booster-containing schedule may increase indirect effects. We examined pneumococcal carriage in Mozambique to establish a baseline for estimating the impact of policy changes and to estimate the long-term impact of PCV10 in children aged <5 years. METHODS: We calculated prevalence of carriage of PCV10 serotypes and the 3 additional PCV13 serotypes ('PCV13-unique') among children aged <5 years and their household members in southern Mozambique, between October 2018 and July 2019. Nasopharyngeal swabs were cultured, and isolates underwent Quellung serotyping. For children, we compared these "long-term post-PCV10" data with prior surveys ("pre-PCV" (2012-2013) and "post-PCV10" (2015-2016)) that used the same methods. RESULTS: In 2018-2019, among 1319 children aged under five years, 1064 (80.7%) were colonized with pneumococcus; among 614 children aged 5-<18 years, 355 (57.8%) were colonized; and among 804 adults (aged ≥18 years), 285 (35.4%) were colonized. The most frequently observed serotypes were 19A (n = 154, 8.5% of isolates) and 6A (n = 107, 5.9%), both PCV13-unique serotypes.
Overall carriage prevalence among children under five years remained stable at approximately 80% across the carriage studies conducted between 2012 and 2019; between the 2015-2016 and 2018-2019 surveys, the prevalence of PCV10-type carriage declined from 17.7% to 10.1%. CONCLUSIONS: Despite substantial declines in PCV10-type carriage initially following vaccine introduction, the continued circulation of PCV10 serotypes and the relatively high prevalence of PCV13-unique serotypes underscore the need to understand the impact of policy changes on pneumococcus transmission. |
Homelessness and birth outcomes in the Pregnancy Risk Assessment Monitoring System, 2016-2020
Meehan AA , Steele-Baser M , Machefsky AM , Cassell CH , Montgomery MP , Mosites E . Matern Child Health J 2025 OBJECTIVES: This study aimed to estimate the prevalence of homelessness shortly before or during pregnancy and describe differences in maternal characteristics and adverse birth outcomes between people reporting homelessness and not reporting homelessness. METHODS: We used 2016-2020 Pregnancy Risk Assessment Monitoring System (PRAMS) data from 31 sites to estimate the prevalence of self-reported homelessness during the 12 months before giving birth. We used logistic regression models to evaluate the association between homelessness and adverse birth outcomes, specifically small for gestational age (SGA), low birth weight (LBW), and preterm birth (PTB). RESULTS: Of 138,603 respondents, 4,045 reported homelessness, representing 2.4% of weighted respondents. Respondents reporting homelessness differed from respondents who did not report homelessness in maternal demographic characteristics, health conditions, behavioral and environmental risk factors, and adequacy of prenatal care. In unadjusted models, homelessness was associated with higher prevalences of SGA, LBW, and PTB (PR 1.38, 95% CI 1.21-1.57; PR 1.73, 95% CI 1.56-1.91; PR 1.42, 95% CI 1.25-1.61; respectively). After adjusting for maternal age, race and ethnicity, education, BMI, and cigarette smoking, prevalence ratios were attenuated and no longer significant. CONCLUSIONS FOR PRACTICE: Although homelessness was not independently associated with adverse birth outcomes in adjusted models, people reporting homelessness before or during pregnancy represent a group at increased risk of inadequate health care utilization and adverse birth outcomes due to other underlying demographic and social factors. Health care providers can play a critical role in identifying if patients may be experiencing homelessness and facilitating connections to social support. |
Food policy councils and healthy food access policies: A 2021 National Survey of Community Policy Supports
Oza-Frank R , Warnock AL , Calancie L , Bassarab K , Palmer A , Cooksey Stowers K , Harris D . Prev Chronic Dis 2025 22 E03 INTRODUCTION: Food policy councils (FPCs) are frequently used to facilitate change in food systems at the local, state, and regional levels, or in tribal nations. The objective of this study was to describe the prevalence of food policy councils and similar coalitions among US municipalities and their associations with healthy food access policies. METHODS: We used data from the 2021 National Survey of Community-Based Policy and Environmental Supports for Healthy Eating and Active Living, administered to municipal officials from May through September 2021. We used logistic regression models to examine associations between 1) having an FPC and 2) FPC membership composition and healthy food access policies. We grouped policies into 4 categories based on topic modules in the survey instrument: supporting new or existing food stores to sell healthy foods, financial or electronic benefits transfer (EBT) supports, transportation-related supports for accessing locations to purchase food, and consideration of local food supports in community planning. RESULTS: Municipalities with FPCs (27.6%) had significantly higher odds than municipalities without FPCs of having policies supporting access to food retail stores (adjusted odds ratio [AOR] = 1.5; 95% CI, 1.2-1.9), access to farmers markets (AOR = 2.2; 95% CI, 1.7-2.7), access to transportation supports (AOR = 2.2; 95% CI, 1.8-2.8), and objectives in community planning documents (AOR = 2.0; 95% CI, 1.6-2.5). Among municipalities with FPCs, those with a health/public health representative (42.1%) or a community representative (65.1%) were more likely to report having any healthy food access policies. CONCLUSION: This study emphasized the positive association between FPCs and healthy food access policies. 
This study also highlights the potential importance of FPC membership composition, including health/public health and community representatives. |
An insight into limestone pillar stability in dipping environments using actual mine geometries
Rashed G , Slaker B , Evanek N . Min Metall Explor 2025 As stone mine operations continue to develop in more challenging conditions, including inclined seams, more complex loading conditions and pillar geometries are generated. The main objective of this study is to gain a better understanding of the effect of seam inclination on the strength, loading path, sidewall deformation, and yield patterns of a stone pillar using numerical models. The modeled width-to-height (W/H) ratio of the pillars, the unconfined compressive strength of the limestone material, the in situ stress field, and the roof interface were varied to represent their potential distribution across underground limestone mines in the United States. Two actual mine geometries, referred to as a-type and b-type, were modeled. In the a-type mine geometry, the roof is dipping while the floor is flat, making one side of the pillar shorter than the other side. In the b-type mine geometry, the roof and floor lines of pillars are dipping while the headings/crosscuts are flat. The intention is not to compare pillar stability between these mine geometries, but to show pillar response in different dipping environments, because these environments differ in pillar size, shape, and extraction ratio. Numerical modeling results indicate that dipping pillars have reduced strength compared to flat pillars. The shear strength between the pillar and the surrounding rock has an impact on dipping pillar response. Dipping pillars experience high shear stresses, highly non-uniform stress distributions, and asymmetric yield patterns with more yielding compared to flat pillars. All these factors place dipping pillars, particularly those with a small width-to-height ratio (<1), at an elevated risk of instability. The yield pattern for a flat pillar is simple, while that of a dipping pillar is complex and depends on numerous parameters such as the width-to-height ratio of the pillar and the seam inclination.
The down-dip side of dipping pillars experiences more outward normal displacement but less vertical displacement than the up-dip side. The results of this study improve the understanding of pillar stability in dipping environments and advance the ultimate goal of reducing the risk of dipping pillar instability in underground stone mines. |
Prevalence of diagnosed diabetes among U.S. adults aged ≥18 years with disabilities, 2021-2022
Bardenheier BH , Omura JD , Saaddine JB , Hora I , McKeever Bullard K . Diabetes Care 2025 OBJECTIVE: To compare the prevalence of diagnosed diabetes among U.S. adults with and without disabilities, overall and by subgroups. RESEARCH DESIGN AND METHODS: We used data on adults aged ≥18 years from the cross-sectional 2021-2022 National Health Interview Survey to report the prevalence of diagnosed diabetes by functional disability status and for each disability type (hearing, seeing, mobility, cognition, self-care, and communication) separately. With use of the Washington Group Short Set on Functioning indicator, disability was defined according to the categories of milder (reporting some difficulty), moderate (reporting a lot of difficulty), and severe (cannot do at all) by disability type. Crude prevalence and age-standardized prevalence of diabetes were also calculated for adults with any difficulty with any disability by age, sex, race/ethnicity, education, insurance, and poverty-to-income ratio. RESULTS: Diabetes prevalence increased with number of disability types, was lower among adults with no disability (5.8%) than among those with milder (9.5%) or moderate to more severe (18.3%) disability, and was 4.0-10.3 percentage points higher among those with moderate to more severe disability than among those with milder disability for vision, hearing, mobility, and cognitive disabilities. Diabetes prevalence was similar for adults with milder and moderate to more severe self-care and communication disabilities. CONCLUSIONS: Prevalence of diabetes was higher among adults with any functional disability than without and increased with increasing number of disability types. Adults with multiple disability types, or those who have difficulty with self-care or communication or other moderate to more severe disabilities, may benefit from diabetes prevention programs. |
Voice of the patient: people with myalgic encephalomyelitis/chronic fatigue syndrome (ME/CFS) share in their own words
Brimmer DJ , Lin JMS , Unger ER . Fatigue Biomed Health Behav 2025 Background: Myalgic encephalomyelitis/chronic fatigue syndrome (ME/CFS) is a serious, debilitating illness affecting millions of people worldwide. Patients with ME/CFS often feel misunderstood and report facing barriers to healthcare utilization. Objective: We report on a Voice of the Patient (VOP) series that used tenets from photovoice and hermeneutic phenomenology methods. The approach prioritized respecting and engaging patients as they share individual experiences of living with ME/CFS. Methods: We developed a 5-step process that could be replicated for interviewing patients in their own words. The process prioritized respecting patients while developing, documenting, and sharing individual accounts of living with ME/CFS. The standardized process for gathering each VOP story enabled individuals to share and participate on their own terms. Results: Over four years, eight VOP stories were completed and posted on CDC's ME/CFS website. The stories received over 196,000 page views. Each story was completed in approximately six months. Participants expressed gratitude for the opportunity to share experiences and were appreciative of the process that involved them in the development of stories. Conclusions: Qualitative methods guided the process for participants taking a central role in sharing stories, which in turn may help educate about patient experiences with ME/CFS. Standardization of steps enabled consistency and transparency. Building flexibility into the process allowed interviewing a range of people with ME/CFS (i.e., bed-bound to working) and enabled patients to give narratives in their own voice. This process may help to share experiences of people with other chronic diseases or infection-associated chronic conditions. |
SARS-CoV-2 diversity and transmission on a university campus across two academic years during the pandemic
Casto AM , Paredes MI , Bennett JC , Luiten KG , Han PD , Gamboa LS , McDermot E , Gottlieb GS , Acker Z , Lo NK , McDonald D , McCaffrey KM , Figgins MD , Lockwood CM , Shendure J , Uyeki TM , Starita LM , Bedford T , Chu HY , Weil AA . Clin Chem 2025 71 (1) 192-202 BACKGROUND: Institutions of higher education (IHE) have been a focus of SARS-CoV-2 transmission studies but there is limited information on how viral diversity and transmission at IHE changed as the pandemic progressed. METHODS: Here we analyze 3606 viral genomes from unique COVID-19 episodes collected at a public university in Seattle, Washington from September 2020 to September 2022. RESULTS: Across the study period, we found evidence of frequent viral transmission among university affiliates with 60% (n = 2153) of viral genomes from campus specimens genetically identical to at least one other campus specimen. Moreover, viruses from students were observed in transmission clusters at a higher frequency than in the overall dataset while viruses from symptomatic infections were observed in transmission clusters at a lower frequency. Although only a small percentage of community viruses were identified as possible descendants of viruses isolated in university study specimens, phylodynamic modeling suggested a high rate of transmission events from campus into the local community, particularly during the 2021-2022 academic year. CONCLUSIONS: We conclude that viral transmission was common within the university population throughout the study period but that not all university affiliates were equally likely to be involved. In addition, the transmission rate from campus into the surrounding community may have increased during the second year of the study, possibly due to return to in-person instruction. |
Development of a large-volume concentration method to recover infectious avian influenza virus from the aquatic environment
Hubbard LE , Stelzer EA , Poulson RL , Kolpin DW , Szablewski CM , Givens CE . Viruses 2024 16 (12) Since late 2021, outbreaks of highly pathogenic avian influenza virus have caused a record number of mortalities in wild birds, domestic poultry, and mammals in North America. Wetlands are plausible environmental reservoirs of avian influenza virus; however, the transmission and persistence of the virus in the aquatic environment are poorly understood. To explore environmental contamination with the avian influenza virus, a large-volume concentration method for detecting infectious avian influenza virus in waterbodies was developed. A variety of filtering, elution, and concentration methods were explored, in addition to testing filtering speeds using artificially amended 20 L water matrices (deionized water with sterile dust, autoclaved wetland water, and wetland water). The optimal protocol was dead-end ultrafiltration coupled with salt solution elution and centrifugation concentration. Using this method, infectious virus was recovered at 1 × 10⁻¹ 50% egg infectious dose per milliliter (EID₅₀/mL), whereas viral RNA was detected inconsistently down to 1 × 10⁰ EID₅₀/mL. This method will aid in furthering our understanding of the avian influenza virus in the environment and may be applicable to the environmental detection of other enveloped viruses. |
Convenience sampling for pandemic surveillance of severe acute respiratory syndrome coronavirus 2 in children in Jackson, Mississippi
Inagaki K , Penny A , Gwyn S , Malloch L , Martin L , Hankins E , Ray C , Byers P , Harrison A , Handali S , Martin D , Hobbs CV . Pediatr Infect Dis J 2024 We assessed severe acute respiratory syndrome coronavirus 2 seroprevalence on residual blood samples for pediatric COVID-19 surveillance: 2263 samples were collected during routine outpatient visits (<18 years, April 2020-August 2021). Seroprevalence increased over time, coinciding with or preceding virus circulation in the community and with or preceding pediatric severe COVID-19 hospitalization peaks. Residual blood sample seroprevalence may be a useful surveillance tool in future outbreaks. |
Accounting for local incidence when estimating rotavirus vaccine efficacy among countries: a pooled analysis of monovalent rotavirus vaccine trials
Amin AB , Waller LA , Tate JE , Lash TL , Lopman BA . Am J Epidemiol 2024 Rotavirus vaccine appears to perform sub-optimally in countries with higher rotavirus burden. We hypothesized that differences in the magnitude of rotavirus exposure may bias vaccine efficacy (VE) estimates, exaggerating apparent differences in country-specific rotavirus VE unless differences in exposure are accommodated. We estimated VE against any-severity and severe rotavirus gastroenteritis (RVGE) using Poisson regression models fit to pooled individual-level data from Phase II and III monovalent rotavirus vaccine trials conducted between 2000 and 2012. The standard approach model included terms for vaccination, country, and a vaccination-country interaction. Other models used proxies for exposure magnitude like severe RVGE rate or age at severe RVGE instead of country. Country-specific proxies were calculated from placebo group data or extracted from an external meta-analysis. Analyses included 83,592 infants from 23 countries in the Americas, Europe, Africa, and Asia. Using the standard approach, VE against severe RVGE substantially varied (10-100%). Using the severe RVGE rate proxy brought VE from all but two countries between 80% and 86%. Heterogeneity for VE against any-severity RVGE was similarly attenuated. Adjusting for exposure proxies reduced heterogeneity in country-specific rotavirus VE estimates. This phenomenon may extend to other vaccines against partially immunizing pathogens with global disparities in burden. |
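For a single country stratum, the quantity the Poisson models above estimate reduces to one minus the incidence-rate ratio of RVGE in vaccinated versus placebo infants. A minimal sketch with invented counts (not trial data):

```python
def vaccine_efficacy(cases_vacc, py_vacc, cases_plac, py_plac):
    """VE = 1 - IRR, where IRR compares RVGE incidence rates
    (cases per person-year) in the vaccinated and placebo arms."""
    irr = (cases_vacc / py_vacc) / (cases_plac / py_plac)
    return 1.0 - irr

# Hypothetical stratum: 10 cases over 1000 person-years vaccinated,
# 50 cases over 1000 person-years placebo -> VE = 80%.
ve = vaccine_efficacy(10, 1000.0, 50, 1000.0)
```

The full regression additionally adjusts for country (or an exposure proxy) and its interaction with vaccination, which is what lets exposure magnitude shift the estimate.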
Immunodeficiency-related vaccine-derived poliovirus (iVDPV) infections: A review of epidemiology and progress in detection and management
Estivariz CF , Krow-Lucal ER , Mach O . Pathogens 2024 13 (12) Individuals with certain primary immunodeficiency disorders (PID) may be unable to clear poliovirus infection after exposure to oral poliovirus vaccine (OPV). Over time, vaccine-related strains can revert to immunodeficiency-associated vaccine-derived poliovirus (iVDPVs) that can cause paralysis in the patient and potentially spread in communities with low immunity. We reviewed the efforts for detection and management of PID patients with iVDPV infections and the epidemiology through an analysis of 184 cases reported to the World Health Organization (WHO) during 1962-2024 and a review of polio program and literature reports. Most iVDPV patients (79%) reported in the WHO Registry were residents in middle-income countries and almost half (48%) in the Eastern Mediterranean Region. Type 2 iVDPV was most frequently isolated (53%), but a sharp decline was observed after the switch to bivalent OPV in 2016, with only six cases reported during 2017-2024 compared to 63 during 2009-2016. Patients with common variable immunodeficiency excrete iVDPV longer than patients with other PID types. Implementation of sensitive sentinel surveillance to detect cases of iVDPV infection in high-risk countries and offer antiviral treatment to patients is challenged by competition with other health priorities and regulatory hurdles to the compassionate use of investigational antiviral drugs. |
Use of additional doses of 2024-2025 COVID-19 vaccine for adults aged ≤65 years and persons aged ≤6 months with moderate or severe immunocompromise: Recommendations of the Advisory Committee on Immunization Practices - United States, 2024
Roper LE , Godfrey M , Link-Gelles R , Moulia DL , Taylor CA , Peacock G , Brewer N , Brooks O , Kuchel G , Talbot HK , Schechter R , Fleming-Dutra KE , Panagiotakopoulos L . Morb Mortal Wkly Rep 2024 73 (49) 1118-1123 COVID-19 remains an important cause of morbidity and mortality, especially among adults aged ≥65 years and persons with moderate or severe immunocompromise; these persons are among those at highest risk for severe disease from COVID-19. On June 27, 2024, the Advisory Committee on Immunization Practices (ACIP) recommended 2024-2025 COVID-19 vaccination for all persons aged ≥6 months to target currently circulating strains of SARS-CoV-2 and provide additional protection against severe COVID-19. Because SARS-CoV-2 circulates year-round and immunity from vaccination wanes, on October 23, 2024, ACIP recommended a second 2024-2025 COVID-19 vaccine dose for all adults aged ≥65 years and for persons aged 6 months-64 years with moderate or severe immunocompromise, 6 months after their last dose of 2024-2025 COVID-19 vaccine (minimum interval = 2 months). Further, ACIP recommended that persons aged ≥6 months who are moderately or severely immunocompromised may receive additional doses of 2024-2025 COVID-19 vaccine (i.e., a total of ≥3 doses of 2024-2025 COVID-19 vaccine) based on shared clinical decision-making. Staying up to date with COVID-19 vaccination is recommended to decrease the risk for severe COVID-19, especially among adults aged ≥65 years and persons with moderate or severe immunocompromise. © 2024 Department of Health and Human Services. All rights reserved. |
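The interval logic in the recommendation (second dose 6 months after the last 2024-2025 dose, minimum interval 2 months) can be sketched as a date calculation. The 30-day month approximation below is an assumption of this sketch, not part of the ACIP language; a real scheduler would use calendar-month arithmetic.

```python
from datetime import date, timedelta

def second_dose_window(last_dose: date):
    """Return (earliest allowed, recommended) dates for a second
    2024-2025 dose: minimum interval 2 months, recommended 6 months.
    Months are approximated as 30-day blocks for illustration."""
    earliest = last_dose + timedelta(days=2 * 30)
    recommended = last_dose + timedelta(days=6 * 30)
    return earliest, recommended
```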
Clinical diagnosis groups developed to bridge the ICD-9-CM to ICD-10-CM coding transition and monitor trends in workers’ compensation claims — Ohio, 2011–2018
Meyers AR , Schrader TN , Krieg E , Naber SJ , Tseng CY , Lampl MP , Chin B , Wurzelbacher SJ . J Saf Res 2025 92 408-419 Introduction: This study aimed to develop a set of broad clinical diagnosis (ClinDx) groups relevant to occupational safety and health. The ClinDx groups are necessary for analysis and interpretation of longitudinal health data that include injury and disease codes from the Ninth and Tenth Revision of the International Classification of Disease, Clinical Modification (ICD-9-CM, ICD-10-CM). Methods: Claims data were analyzed for Ohio Bureau of Workers’ Compensation insured employers from 2011 to 2018. We used interrupted time series regression models to estimate level (frequency) and slope (trend) changes to the percentage of each ClinDx group in October 2015. We created ClinDx groups aligned with ICD-10-CM structure and coding principles. Each ClinDx group was counted once per claim (distinct groups). Monthly percentages were calculated based on the injury date. When present, seasonality was assessed separately for each outcome using an autoregressive-moving average model. Results: The final set of ClinDx groups included 57 mutually exclusive and exhaustive groups. The study population included 661,684 claims, with 959,322 distinct ClinDx groups. Among all claims, 96.27% included injury code(s) and 11.77% included disease(s) codes. At the transition to ICD-10-CM, 33 ClinDx groups lacked any statistically significant (P < 0.05) changes between periods. We observed level changes for 17 ClinDx groups and slope changes for nine groups. Eight ClinDx groups had ≥ 20% (+/-) level changes. Conclusion: While the transition to ICD-10-CM is a break in series, about two-thirds of disease groups and half of injury groups were relatively stable across the transition. These findings also underscore the need for characterizing both injury and disease outcomes when analyzing workers’ compensation data. 
Practical Applications: The 57 ClinDx groups created in this study may be a practical starting point for other occupational epidemiologic analyses that include a mixture of ICD-9-CM and ICD-10-CM data. © 2024 |
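The level-change and slope-change terms of an interrupted time-series model like the one above can be encoded in a design matrix as follows. This is a generic sketch (variable names are illustrative); the regression itself would then be fit by OLS or, where seasonality is present, an autoregressive-moving average model as the study describes.

```python
def its_design(months, transition):
    """Build segmented-regression design rows: intercept, time,
    post-transition indicator (level change), and time since
    transition (slope change).  `months` are integer time indices;
    `transition` is the index of the break (e.g., October 2015)."""
    rows = []
    for t in months:
        post = 1.0 if t >= transition else 0.0
        rows.append([1.0, float(t), post, post * (t - transition)])
    return rows
```

A significant coefficient on the third column is a level change at the ICD-10-CM switch; on the fourth, a trend (slope) change.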
Review of correlations between telomere length and metal exposure across distinct populations
Beddingfield Z , Ji C , Zarus GM , Ruiz P , Faroon O , Abadin H , Alman B , Antonini JM , Shoeb M . Environ - MDPI 2024 11 (12) Telomere length (TL) predicts the onset of replicative senescence, and its shortening is a limiter on the number of divisions individual somatic cells can perform. Metal-induced genotoxic events are discussed in Agency for Toxic Substances and Disease Registry’s (ATSDR) toxicological profiles. In vivo and in vitro toxicological studies suggest the correlation between toxic metals and TL. However, the correlation between TL and exposure to toxic metals in human populations is unclear despite decades of observational research. We conducted a literature search within the ATSDR toxicological profiles and PubMed database for peer-reviewed articles as of 04/2023 discussing TL and metal exposure in human populations. Through review of the 272 publications meeting these criteria, we identified 25 observational studies that considered the correlation between TL and exposure to some or all of six metals: cadmium (Cd), arsenic (As), nickel (Ni), selenium (Se), lead (Pb), and cesium (Cs). Because reported effect sizes were often not comparable across studies, we performed a sign test based on the reported significance for each metal–TL correlation. We found that Cd was consistently significantly correlated with shorter telomeres (p = 0.016). However, no consistent linear relationship was observed between TL and any of the other metals considered. Exploring this association can enhance our understanding of how metal exposure may influence TL dysfunction. Our findings suggest that Cd exposure contributes to shorter TL, which may affect the DNA damage response (DDR) resulting in numerous chronic health conditions. Further, we highlight inconsistencies in findings on the correlation between metal exposure and TL across different populations and exposure levels. 
This suggests that correlations between some metals and TL may vary across populations, and that correlations may change at different exposure levels. Also, our findings suggest the need for further research on the potential for nonlinear relationships and non-additive effects of co-exposure to multiple hazardous metals, which could explain the inconsistencies observed across studies. The inconsistent incidences of metal–TL correlations justify additional exploration into the complex interaction between metals and TL. © 2024 by the authors. |
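An exact two-sided sign test of the kind used for the metal-TL correlations can be sketched as below. The 7-of-7 example is purely illustrative (the review does not state the study count per metal), chosen because 2 × 0.5⁷ ≈ 0.016 matches the magnitude of the Cd result reported above.

```python
from math import comb

def sign_test_p(n_negative, n_total):
    """Exact two-sided sign test: probability of a split at least this
    extreme if negative and positive correlations were equally likely
    under the null (binomial with p = 0.5)."""
    k = max(n_negative, n_total - n_negative)
    p_one = sum(comb(n_total, i) for i in range(k, n_total + 1)) * 0.5 ** n_total
    return min(1.0, 2 * p_one)
```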
Delay of innate immune responses following influenza B virus infection affects the development of a robust antibody response in ferrets
Rowe T , Fletcher A , Lange M , Hatta Y , Jasso G , Wentworth DE , Ross TM . mBio 2025 e0236124 Due to its natural influenza susceptibility, clinical signs, transmission, and similar sialic acid residue distribution, the ferret is the primary animal model for human influenza research. Antibodies generated following infection of ferrets with human influenza viruses are used in surveillance to detect antigenic drift and cross-reactivity with vaccine viruses and circulating strains. Inoculation of ferrets with over 1,500 human clinical influenza isolates (1998-2019) resulted in lower antibody responses (HI <1:160) to 86% (387 out of 448) influenza B viruses (IBVs) compared to 2.7% (30 out of 1,094) influenza A viruses (IAVs). Here, we show that the immune responses in ferrets inoculated with IBV were delayed and reduced compared to IAV. Innate gene expression in the upper respiratory tract and blood indicated that IAV generated a strong inflammatory response, including early activation of the interferon (IFN) response, whereas IBV elicited a delayed and reduced response. Serum levels of cytokines and IFNs were all much higher following IAV infection than IBV infection. Pro-inflammatory, IFN, TH1/TH2, and T-effector proteins were significantly higher in sera of IAV-infected than IBV-infected ferrets over 28 days following the challenge. Type-I/II/III IFNs were detected in serum throughout this period following IAV infection, whereas Type-III IFN appeared only late following IBV infection. An early increase in IFN-lambda corresponded to gene expression following IAV infection. The reduced innate immune responses following IBV infection were reflected in the subsequently delayed and reduced serum antibody responses. These findings may help in understanding antibody responses in humans following influenza vaccination or infection and support consideration of adding innate immunomodulators to overcome low responses. IMPORTANCE: The ferret is the primary animal model for human influenza research. 
Using a ferret model, we studied the differences in both innate and adaptive immune responses following infection with influenza A and B viruses (IAV and IBV). Antibodies generated following infection of ferrets are used in surveillance assays to detect antigenic drift and cross-reactivity with vaccine viruses and circulating influenza strains. IAV infection of ferrets to generate these reagents resulted in a strong antibody response, but IBV infection generated weak antibody responses. In this study of influenza-infected ferrets, we found that IAV resulted in early activation of the interferon (IFN) and pro-inflammatory responses, whereas IBV showed delayed and reduced responses. Serum levels of IFNs and other cytokines or chemokines were much higher in ferrets following IAV infection. These reduced innate responses were reflected in the subsequently delayed and reduced antibody responses to IBV in the sera. These findings may help in understanding low antibody responses in humans following influenza B vaccination and infection and may warrant the use of innate immunomodulators to overcome these weak responses. |
Associations between knowledge of health conditions and sugar-sweetened beverage intake among US adults, 2021
Hunter JR , Oza-Frank R , Park S , Sauer AG , Gunn JP . Nutrients 2024 16 (24) BACKGROUND: Frequent consumption of sugar-sweetened beverages (SSB) is associated with an increased risk of some health outcomes. OBJECTIVE: We investigated the relationships between knowledge of health risks related to SSB and SSB intake among adults. METHODS: This cross-sectional study utilized data from the 2021 SummerStyles survey. There were 4022 US adult participants (≥18 years). The outcome variable was SSB intake (none, >0 to <1, 1 to <2, or ≥2 times/day). The exposure variables were knowledge of the association between SSB and seven health conditions. Statistical analyses included seven multinomial regressions to estimate adjusted odds ratios (AOR) for the consumption of SSB according to knowledge of SSB-related health risks after controlling for sociodemographics. RESULTS: Overall, about 30% of adults consumed SSB ≥ 2 times/day. While most adults identified SSB-related conditions such as weight gain (84.0%), diabetes (78.4%), and cavities (74.2%) as being related to drinking SSB, fewer adults recognized related conditions, such as some cancers (23.9%), high cholesterol (28.4%), heart disease (33.5%), and high blood pressure (37.8%). Knowledge of any of the health conditions was not significantly associated with consuming SSB ≥ 2 times/day compared to non-SSB consumers. CONCLUSIONS: Knowledge of SSB-related health conditions varied by sociodemographics but was not associated with high SSB intake. Future studies could explore other factors beyond knowledge that may influence adults' high SSB intake. |
Hazardous exposures and engineering controls in the landscaping services industry
Alexander BM , Graydon PS , Pena M , Feng HA , Beamer BR . J Occup Environ Hyg 2025 1-14 Landscapers are exposed to noise, carbon monoxide (CO), respirable dust, and respirable crystalline silica (RCS) generated from the tools they use. Although engineering controls are available to reduce these exposures, no previous study has evaluated chronic exposures to landscapers in different work settings and compared exposures from landscaping tools with and without engineering controls. This field study of workers in the landscaping services industry documented the occupational exposures of 80 participants at 11 varied worksites to noise, CO, respirable dust, and RCS using personal breathing zone sampling. Results were analyzed using SAS/STAT 14.1. Analysis of variance was used for normally distributed data; otherwise, nonparametric methods were used. Most workers were overexposed to noise, with 94 of the 119 8-hr time-weighted average (TWA) noise exposures at or above the National Institute for Occupational Safety and Health (NIOSH) recommended exposure limit (REL) of 85 dBA. There were no statistically significant differences among different locations or occupations. No 8-hr TWA exposures to CO above the NIOSH REL were measured. Overexposures to RCS were measured at all locations where hardscaping (installing or maintaining non-living aspects of the landscape) was taking place. This is the first known field study of this type to include hardscapers. The use of engineering controls such as dust capture or wet methods would reduce RCS exposures, but respiratory protection may still be needed. Task-based analysis of noise and CO exposure revealed that the loudest landscaping tools used in this study were hardscaping table saws, gas chainsaws, gas leaf blowers, chipper/shredders, gas string trimmers, and fuel mowers. 
Workers were exposed to significantly more noise and CO when using fuel-powered versions compared to battery-powered versions of leaf blowers, string trimmers, and chainsaws. |
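The 8-hr TWA comparisons against the NIOSH REL above follow the standard dose formula with an 85 dBA criterion level and 3-dB exchange rate. A sketch (the formula is the published NIOSH one; the example exposures are invented):

```python
from math import log10

def niosh_dose_and_twa(exposures):
    """NIOSH noise dose and 8-hr TWA (85 dBA REL, 3-dB exchange rate).
    `exposures` is a list of (level_dBA, minutes) pairs.  Allowed time
    at level L is 480 / 2**((L - 85) / 3) minutes; dose is the summed
    percentage of allowed time used; TWA = 10*log10(dose/100) + 85."""
    dose = sum(minutes / (480.0 / 2 ** ((level - 85.0) / 3.0)) * 100.0
               for level, minutes in exposures)
    twa = 10.0 * log10(dose / 100.0) + 85.0
    return dose, twa
```

A full shift at exactly 85 dBA yields a 100% dose and a TWA of 85 dBA, the REL; every 3 dB above that halves the allowed time.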
Seasonal activity patterns of Ixodes scapularis and Ixodes pacificus in the United States
Eisen L . Ticks Tick Borne Dis 2025 16 (1) 102433 Knowledge of seasonal activity patterns of human-biting life stages of tick species serving as vectors of human disease agents provides basic information on when during the year humans are most at risk for tick bites and tick-borne diseases. Although there is a wealth of published information on seasonal activity patterns of Ixodes scapularis and Ixodes pacificus in the United States, a critical review of the literature for these important tick vectors is lacking. The aims of this paper were to: (i) review what is known about the seasonal activity patterns of I. scapularis and I. pacificus in different parts of their geographic ranges in the US, (ii) provide a synthesis of the main findings, and (iii) outline key knowledge gaps and methodological pitfalls that limit our understanding of variability in seasonal activity patterns. Based on ticks collected while questing or from wild animals, the seasonal activity patterns were found to be similar for I. pacificus in the Far West and I. scapularis in the Southeast, with synchronous activity of larvae and nymphs, peaking in spring (April to June) in the Far West and from spring to early summer (April to July) in the Southeast, and continuous activity of adults from fall through winter and spring with peak activity from fall through winter (November/December to March). In the colder climates of the Upper Midwest and Northeast, I. scapularis adults have a bimodal seasonal pattern, with activity peaks in fall (October to November) and spring (April to May). The seasonal activity patterns for immatures differ between the Upper Midwest, synchronous for larvae and nymphs with peak activity in spring and summer (May to August), and the Northeast, where the peak activity of nymphs in spring and early summer (May to July) precedes that of larvae in summer (July to September). 
Seasonality of human tick encounters also is influenced by changes over the year in the level of outdoor activities in tick habitat. Studies on the seasonality of ticks infesting humans have primarily focused on the coastal Northeast and the Pacific Coast states, with fewer studies in the Southeast, inland parts of the Northeast, and the Upper Midwest. Discrepancies between seasonal patterns for peak tick questing activity and peak human infestation appear to occur primarily for the adult stages of I. scapularis and I. pacificus. Study design and data presentation limitations of the published literature are discussed. Scarcity of data for seasonal activity patterns of I. pacificus outside of California and for I. scapularis from parts of the Southeast, Northeast, and Upper Midwest is a key knowledge gap. In addition to informing the public of when during the year the risk for tick bites is greatest, high-quality studies describing current seasonal activity patterns also will generate the data needed for robust model-based projections of future climate-driven change in the seasonal activity patterns and provide the baseline needed to empirically determine in the future if the projections were accurate. |
Cervical cancer incidence and trends among women aged 15-29 years by county-level economic status and rurality - United States, 2007-2020
Agarwal R , King JB , Gopalani SV , Senkomago V . Cancer Epidemiol 2024 94 102730 INTRODUCTION: Variations in cervical cancer incidence rates and trends have been reported by sociodemographic characteristics. However, research on economic characteristics is limited especially among younger women in the United States. METHODS: We analyzed United States Cancer Statistics data to examine age-standardized cervical cancer incidence rates among women aged 15-29 years during 2007-2020. We used an index-based county-level economic classification to rank counties in the top 25 %, middle 25 %-75 %, and bottom 25 %. We assessed differences in incidence using rate ratios and trends using annual percent changes (APCs) from joinpoint regression. Due to impact from the COVID-19 pandemic, trend analysis excluded 2020 data. Analyses were conducted during August-October 2023. RESULTS: During 2007-2020, incidence rates were lower in the top 25 % counties economically than the bottom 25 % or middle 25 %-75 % (1.6 vs 2.1 vs 1.9 per 100,000, respectively). Rates were higher in nonmetropolitan than metropolitan counties across economic groups. Overall, rates declined in all county-level economic strata, especially in the bottom 25 % during 2015-2019 (APC -10.6 %). Rates appeared to decrease in metropolitan counties and among women of all races across economic categories. Decreases were most evident in the top 25 % counties among non-Hispanic White women during 2016-2019 and in nonmetropolitan counties during 2017-2019. CONCLUSIONS: In women aged 15-29 years, declining rates of cervical cancer during 2007-2019 across county-level economic strata may partly reflect effects of human papillomavirus vaccination and cervical cancer screening. Further observed differences by race and rurality may help inform efforts to increase implementation of preventive measures in populations with the highest burden. |
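The annual percent change (APC) statistic above comes from a log-linear trend fit within each joinpoint segment. A minimal single-segment sketch with invented rates:

```python
from math import exp, log

def annual_percent_change(rates):
    """Fit log(rate) = a + b*year by ordinary least squares, then
    APC = (exp(b) - 1) * 100 -- the within-segment statistic that
    joinpoint regression reports."""
    n = len(rates)
    xs = list(range(n))
    ys = [log(r) for r in rates]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
    return (exp(b) - 1.0) * 100.0
```

A rate series falling by exactly 10% per year returns an APC of -10, the order of the bottom-25% decline reported above.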
A vicious cycle of frailty and acute lower respiratory infection among community-dwelling adults (≥ 60 years): Findings from a multi-site INSPIRE cohort study, India
Saha S , Amarchand R , Kumar R , OPrabhakaran A , Rajkumar P , Dutt Bhardwaj S , Kanungo S , Gharpure R , Lafond KE , Azziz-Baumgartner E , Krishnan A . PLOS Glob Public Health 2024 4 (12) e0003903 We studied the relationship of frailty and acute lower respiratory infection (ALRI) among a multi-site cohort of community-dwelling older adults aged ≥60 years in India. During January 2019‒January 2020, participants completed the Edmonton Frail Scale (EFS) at baseline and every 3 months at four sites in India, with each participant completing a maximum of four surveys. Participants were categorized as non-frail (0-5 points), vulnerable (6-7 points), and frail (≥8 points) based on EFS score. Project nurses made weekly home visits to identify ALRI episodes with onset during past 7 days. We estimated adjusted hazard ratios (aHR) for having an ALRI episode within 90 days after EFS by frailty category. We also assessed risk of deterioration of frailty during 7-100 days after ALRI episode onset in terms of an increased EFS score by ≥1 point and change of frailty category. Among 5801 participants (median age 65 years, 41% males), 3568 (61.5%) were non-frail, 1507 (26%) vulnerable, and 726 (12.5%) frail at enrolment. Compared with non-frail participants, the hazard of an ALRI episode was higher among vulnerable (aHR: 1.6, 95% CI: 1.3-2.0) and frail participants (aHR: 1.7, 95% CI: 1.3-2.2). Participants having ALRI within the past 7-100 days were at increased risk of worsening frailty category (aOR: 1.9, 95% CI: 1.3-2.8) compared to participants without an ALRI episode during the same period. The association between ALRIs and worsened frailty suggests prevention of ALRIs through vaccination and other strategies may have broad reaching health benefits for older adults. |
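The Edmonton Frail Scale cut-points used in the cohort above map scores to categories as in this small sketch:

```python
def efs_category(score: int) -> str:
    """Edmonton Frail Scale categories as defined in the study:
    0-5 non-frail, 6-7 vulnerable, >=8 frail."""
    if score <= 5:
        return "non-frail"
    if score <= 7:
        return "vulnerable"
    return "frail"
```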
Persistent organic pollutants and endogenous sex-related hormones in Hispanic/Latino adults: The Hispanic Community Health Study/Study of Latinos (HCHS/SOL)
Abasilim C , Persky V , Sargis RM , Day T , Tsintsifas K , Daviglus M , Cai J , Freels S , Grieco A , Peters BA , Isasi CR , Talavera GA , Thyagarajan B , Davis M , Jones R , Sjodin A , Turyk ME . Environ Res 2024 120742 BACKGROUND: Previous studies have demonstrated associations of persistent organic pollutants (POPs) with sex-related hormones; however, findings were inconsistent. Sex-specific impacts and pathways through which adiposity influences associations are not completely understood. We sought to evaluate sex-specific associations of POPs serum concentration with sex-related hormones and to explore pathways through which adiposity may modify associations. METHODS: We studied 1,073 men and 716 postmenopausal women participating in the "Persistent Organic Pollutants, Endogenous Hormones, and Diabetes in Latinos" ancillary study which is a subcohort of the "Hispanic Community Health Study/Study of Latinos." We use baseline examination data collected from 2008-2011 to investigate associations between eight organochlorine pesticides (OCPs), five polychlorinated biphenyls (PCB) groups, sum of polybrominated diphenyl ethers and polybrominated biphenyl 153 on sex hormone binding globulin (SHBG) and various sex-related hormone levels. We examined associations cross-sectionally using linear and logistic regression models adjusted for complex survey design and confounders. RESULTS: PCBs and select OCPs were associated with increased SHBG in women and decreased estradiol (E2) and/or bioavailable E2 in men. For instance, per quartile increase in serum concentrations of ∑PCBs and oxychlordane were associated with decreased levels of E2 (β=-6.36 pmol/L; 95% CI:-10.7,-2.02 and β=-5.08 pmol/L; 95% CI:-8.11,-2.05) and bioavailable E2 (β=-4.48 pmol/L; 95% CI:-7.22,-1.73 and β=-4.23 pmol/L; 95% CI:-6.17,-2.28), respectively, in men, and increased levels of SHBG (β=7.25 nmol/L; 95% CI:2.02,12.8 and β=9.42 nmol/L; 95% CI:4.08,15.0), respectively, in women. 
p,p'-DDT, β-HCCH, and o,p'-DDT were also associated with decreased testosterone (T) and bioavailable T (ng/dL) levels in men. Adiposity modified associations in men, revealing stronger inverse associations of PCBs, PBDEs, and several OCPs with LH, SHBG, E2, bioavailable E2, T, and the ratios of LH to FSH and E2 to T in those with below-median body mass index and waist-to-hip ratio. CONCLUSION: Distinct patterns of hormone dysregulation with increasing POPs serum concentration were identified in men and postmenopausal women. In men, but less so in postmenopausal women, adiposity modified associations of POPs serum concentration with sex-related hormones. |