Will invasive fungal infections be the Last of Us? The importance of surveillance, public-health interventions, and antifungal stewardship
Rodriguez Stewart RM , Gold JAW , Chiller T , Sexton DJ , Lockhart SR . Expert Rev Anti Infect Ther 2023 21 (8) 1-4 The video game-turned-HBO show ‘The Last of Us’ is a fanciful representation of a zombie apocalypse caused by a fungal infection. Although Ophiocordyceps, the ‘zombie fungi’ featured in the show, do not infect vertebrates, the show serves as a reminder that many fungi can cause life-threatening invasive fungal infections (IFIs). Candida and Aspergillus species are the most common and well-known causes of IFIs, but at least 300 species of opportunistic human pathogenic yeasts and molds exist. Each year, IFIs are responsible for over 1.5 million deaths globally and, in the United States alone, impose health-care costs ranging from five to seven billion dollars [Citation1,Citation2]. During the COVID-19 pandemic, rates of death from fungal infections have increased [Citation3], and the burden of IFIs is poised to grow given the expanding population of patients living with immunosuppressive conditions (e.g. solid organ and stem cell transplantation), increasing antifungal resistance, and potential climate-change-related expansion of the geographic ranges in which pathogenic fungi live. Despite the morbidity and mortality associated with fungal infections and their growing public health importance, we still have much to learn about their diagnosis and management. In this review, we discuss gaps and global disparities in fungal laboratory capacity including antifungal susceptibility testing, the paucity of fungal surveillance, and the importance of antifungal stewardship, all against the backdrop of increasing antifungal resistance and a limited armamentarium of antifungal therapies.
Variation in hysterectomy prevalence and trends among U.S. States and Territories-Behavioral Risk Factor Surveillance System, 2012-2020
Gopalani SV , Dasari SR , Adam EE , Thompson TD , White MC , Saraiya M . Cancer Causes Control 2023 34 (10) 829-835 PURPOSE: We estimated up-to-date state- and territory-level hysterectomy prevalence and trends, which can help correct the population-at-risk denominator and calculate more accurate uterine and cervical cancer rates. METHODS: We analyzed self-reported data for a population-based sample of 1,267,013 U.S. women aged ≥ 18 years who participated in the Behavioral Risk Factor Surveillance System surveys from 2012 to 2020. Estimates were age-standardized and stratified by sociodemographic characteristics and geography. Trends were assessed by testing for any differences in hysterectomy prevalence across years. RESULTS: Hysterectomy prevalence was highest among women aged 70-79 years (46.7%) and ≥ 80 years (48.8%). Prevalence was also higher among women who were non-Hispanic (NH) Black (21.3%), NH American Indian and Alaska Native (21.1%), and from the South (21.1%). Hysterectomy prevalence declined by 1.9 percentage points from 18.9% in 2012 to 17.0% in 2020. CONCLUSIONS: Approximately one in five U.S. women overall and half of U.S. women aged ≥ 70 years reported undergoing a hysterectomy. Our findings reveal large variations in hysterectomy prevalence within and between each of the four census regions and by race and other sociodemographic characteristics, underscoring the importance of adjusting epidemiologic measures of uterine and cervical cancers for hysterectomy status.
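The age standardization used throughout these prevalence estimates is, at its core, a weighted average of stratum-specific prevalences. A minimal sketch of direct standardization, with hypothetical age groups, prevalences, and standard-population weights (not the study's data):

```python
# Direct age standardization: weight each stratum's crude prevalence
# by a standard population's age distribution (hypothetical numbers).

# (age group, crude prevalence in sample, standard-population weight)
strata = [
    ("18-44", 0.05, 0.45),
    ("45-69", 0.25, 0.40),
    ("70+",   0.47, 0.15),
]

def age_standardized_prevalence(strata):
    """Weighted average of stratum prevalences, normalized by total weight."""
    total_weight = sum(w for _, _, w in strata)
    return sum(p * w for _, p, w in strata) / total_weight

print(round(age_standardized_prevalence(strata), 3))
```

The same machinery underlies the age-standardized depression and mortality figures elsewhere in this digest; only the strata and standard weights differ.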
National, state-level, and county-level prevalence estimates of adults aged ≥18 years self-reporting a lifetime diagnosis of depression - United States, 2020
Lee B , Wang Y , Carlson SA , Greenlund KJ , Lu H , Liu Y , Croft JB , Eke PI , Town M , Thomas CW . MMWR Morb Mortal Wkly Rep 2023 72 (24) 644-650 Depression is a major contributor to mortality, morbidity, disability, and economic costs in the United States (1). Examining the geographic distribution of depression at the state and county levels can help guide state- and local-level efforts to prevent, treat, and manage depression. CDC analyzed 2020 Behavioral Risk Factor Surveillance System (BRFSS) data to estimate the national, state-level, and county-level prevalence of U.S. adults aged ≥18 years self-reporting a lifetime diagnosis of depression (referred to as depression). During 2020, the age-standardized prevalence of depression among adults was 18.5%. Among states, the age-standardized prevalence of depression ranged from 12.7% to 27.5% (median = 19.9%); most of the states with the highest prevalence were in the Appalachian* and southern Mississippi Valley(†) regions. Among 3,143 counties, the model-based age-standardized prevalence of depression ranged from 10.7% to 31.9% (median = 21.8%); most of the counties with the highest prevalence were in the Appalachian region, the southern Mississippi Valley region, and Missouri, Oklahoma, and Washington. These data can help decision-makers prioritize health planning and interventions in areas with the largest gaps or inequities, which could include implementation of evidence-based interventions and practices such as those recommended by The Guide to Community Preventive Services Task Force (CPSTF) and the Substance Abuse and Mental Health Services Administration (SAMHSA).
Prevalence of diabetic retinopathy in the US in 2021
Lundeen EA , Burke-Conte Z , Rein DB , Wittenborn JS , Saaddine J , Lee AY , Flaxman AD . JAMA Ophthalmol 2023 IMPORTANCE: Diabetic retinopathy (DR) is a common microvascular complication of diabetes and a leading cause of blindness among working-age adults in the US. OBJECTIVE: To update estimates of DR and vision-threatening diabetic retinopathy (VTDR) prevalence by demographic factors and US county and state. DATA SOURCES: The study team included data from the National Health and Nutrition Examination Survey (2005 to 2008 and 2017 to March 2020), Medicare fee-for-service claims (2018), IBM MarketScan commercial insurance claims (2016), population-based studies of adult eye disease (2001 to 2016), 2 studies of diabetes in youth (2021 and 2023), and a previously published analysis of diabetes by county (2012). The study team used population estimates from the US Census Bureau. STUDY SELECTION: The study team included relevant data from the US Centers for Disease Control and Prevention's Vision and Eye Health Surveillance System. DATA EXTRACTION AND SYNTHESIS: Using bayesian meta-regression methods, the study team estimated the prevalence of DR and VTDR stratified by age, a nondifferentiated sex and gender measure, race, ethnicity, and US county and state. MAIN OUTCOMES AND MEASURES: The study team defined individuals with diabetes as those who had a hemoglobin A1c level at 6.5% or more, took insulin, or reported ever having been told by a physician or health care professional that they have diabetes. The study team defined DR as any retinopathy in the presence of diabetes, including nonproliferative retinopathy (mild, moderate, or severe), proliferative retinopathy, or macular edema. The study team defined VTDR as having, in the presence of diabetes, severe nonproliferative retinopathy, proliferative retinopathy, panretinal photocoagulation scars, or macular edema. 
RESULTS: This study used data from nationally representative and local population-based studies that represent the populations in which they were conducted. For 2021, the study team estimated 9.60 million people (95% uncertainty interval [UI], 7.90-11.55) living with DR, corresponding to a prevalence rate of 26.43% (95% UI, 21.95-31.60) among people with diabetes. The study team estimated 1.84 million people (95% UI, 1.41-2.40) living with VTDR, corresponding to a prevalence rate of 5.06% (95% UI, 3.90-6.57) among people with diabetes. Prevalence of DR and VTDR varied by demographic characteristics and geography. CONCLUSIONS AND RELEVANCE: US prevalence of diabetes-related eye disease remains high. These updated estimates on the burden and geographic distribution of diabetes-related eye disease can be used to inform the allocation of public health resources and interventions to communities and populations at highest risk.
Number needed to vaccinate with a COVID-19 booster to prevent a COVID-19-associated hospitalization during SARS-CoV-2 Omicron BA.1 variant predominance, December 2021-February 2022, VISION Network: a retrospective cohort study
Adams K , Riddles JJ , Rowley EAK , Grannis SJ , Gaglani M , Fireman B , Hartmann E , Naleway AL , Stenehjem E , Hughes A , Dalton AF , Natarajan K , Dascomb K , Raiyani C , Irving SA , Sloan-Aagard C , Kharbanda AB , DeSilva MB , Dixon BE , Ong TC , Keller J , Dickerson M , Grisel N , Murthy K , Nanez J , Fadel WF , Ball SW , Patel P , Arndorfer J , Mamawala M , Valvi NR , Dunne MM , Griggs EP , Embi PJ , Thompson MG , Link-Gelles R , Tenforde MW . Lancet Reg Health Am 2023 23 100530 BACKGROUND: Understanding the usefulness of additional COVID-19 vaccine doses-particularly given varying disease incidence-is needed to support public health policy. We characterize the benefits of COVID-19 booster doses using number needed to vaccinate (NNV) to prevent one COVID-19-associated hospitalization or emergency department encounter. METHODS: We conducted a retrospective cohort study of immunocompetent adults at five health systems in four U.S. states during SARS-CoV-2 Omicron BA.1 predominance (December 2021-February 2022). Included patients completed a primary mRNA COVID-19 vaccine series and were either eligible to or received a booster dose. NNV were estimated using hazard ratios for each outcome (hospitalization and emergency department encounters), with results stratified by three 25-day periods and site. FINDINGS: 1,285,032 patients contributed 938 hospitalizations and 2076 emergency department encounters. 555,729 (43.2%) patients were aged 18-49 years, 363,299 (28.3%) 50-64 years, and 366,004 (28.5%) ≥65 years. Most patients were female (n = 765,728, 59.6%), White (n = 990,224, 77.1%), and non-Hispanic (n = 1,063,964, 82.8%). 37.2% of patients received a booster and 62.8% received only two doses. Median estimated NNV to prevent one hospitalization was 205 (range 44-615) and NNV was lower across study periods for adults aged ≥65 years (110, 46, and 88, respectively) and those with underlying medical conditions (163, 69, and 131, respectively). 
Median estimated NNV to prevent one emergency department encounter was 156 (range 75-592). INTERPRETATION: The number of patients needed to receive a booster dose was highly dependent on local disease incidence, outcome severity, and patient risk factors for moderate-to-severe disease. FUNDING: Funding was provided by the Centers for Disease Control and Prevention through contract 75D30120C07986 to Westat, Inc. and contract 75D30120C07765 to Kaiser Foundation Hospitals.
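The NNV metric in this study was estimated from hazard ratios; a common back-of-the-envelope form is the reciprocal of the absolute risk reduction over a fixed period, approximating the boosted group's risk as the baseline risk times the hazard ratio. A sketch under that assumption, with illustrative numbers (not the study's estimates):

```python
def nnv_from_hazard_ratio(baseline_risk, hazard_ratio):
    """Number needed to vaccinate = 1 / absolute risk reduction,
    approximating the boosted group's risk as baseline_risk * hazard_ratio."""
    risk_reduction = baseline_risk * (1.0 - hazard_ratio)
    return 1.0 / risk_reduction

# Illustrative: 1% baseline hospitalization risk over the study period
# and a booster hazard ratio of 0.5 yield an NNV of 200
nnv = nnv_from_hazard_ratio(0.01, 0.5)
print(round(nnv))
```

This form makes the paper's interpretation visible in the arithmetic: NNV falls as baseline incidence rises (high-incidence periods, older or higher-risk patients) and as the hazard ratio moves further below 1.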
Identification of tecovirimat resistance-associated mutations in human monkeypox virus - Los Angeles County
Garrigues JM , Hemarajata P , Karan A , Shah NK , Alarcón J , Marutani AN , Finn L , Smith TG , Gigante CM , Davidson W , Wynn NT , Hutson CL , Kim M , Terashita D , Balter SE , Green NM . Antimicrob Agents Chemother 2023 67 (7) e0056823 Tecovirimat (also known as TPOXX or ST-246) is a drug available for the treatment of mpox through the Centers for Disease Control and Prevention’s Expanded Access Investigational New Drug “compassionate use” protocol (https://www.cdc.gov/poxvirus/monkeypox/clinicians/Tecovirimat.html). In Los Angeles County, a fatal case of mpox with tecovirimat resistance was previously reported (1). Epidemiologic surveillance in Los Angeles County has since identified additional cases of severe mpox that did not improve after multiple rounds of tecovirimat treatment, including one involving a person who succumbed to infection (Table 1). Consistent with reports describing severe manifestations of mpox within the current global outbreak (1, 2), the identified cases involved host immunodeficiency due to advanced HIV infection.
Notes from the field: Comparison of COVID-19 mortality rates among adults aged ≥65 years who were unvaccinated and those who received a bivalent booster dose within the preceding 6 months - 20 U.S. Jurisdictions, September 18, 2022-April 1, 2023
Johnson AG , Linde L , Payne AB , Ali AR , Aden V , Armstrong B , Armstrong B , Auche S , Bayoumi NS , Bennett S , Boulton R , Chang C , Collingwood A , Cueto K , Davidson SL , Du Y , Fleischauer A , Force V , Frank D , Hamilton R , Harame K , Harrington P , Hicks L , Hodis JD , Hoskins M , Jones A , Kanishka F , Kaur R , Kirkendall S , Khan SI , Klioueva A , Link-Gelles R , Lyons S , Mansfield J , Markelz A , Masarik J 3rd , Mendoza E , Morris K , Omoike E , Paritala S , Patel K , Pike M , Pompa XP , Praetorius K , Rammouni N , Razzaghi H , Riggs A , Shi M , Sigalo N , Stanislawski E , Tilakaratne BP , Turner KA , Wiedeman C , Silk BJ , Scobie HM . MMWR Morb Mortal Wkly Rep 2023 72 (24) 667-669 Updated (bivalent) COVID-19 vaccines were first recommended by CDC on September 1, 2022.* An analysis of case and death rates by vaccination status shortly after authorization of bivalent COVID-19 vaccines showed that receipt of a bivalent booster dose provided additional protection against SARS-CoV-2 infection and associated death (1). In this follow-up report on the durability of bivalent booster protection against death among adults aged ≥65 years, mortality rate ratios (RRs) were estimated among unvaccinated persons and those who received a bivalent booster dose by time since vaccination during three periods of Omicron lineage predominance (BA.5 [September 18–November 5, 2022], BQ.1/BQ.1.1 [November 6, 2022–January 21, 2023], and XBB.1.5 [January 22–April 1, 2023]).† During September 18, 2022–April 1, 2023, weekly counts of COVID-19–associated deaths§ among unvaccinated persons and those who received a bivalent booster dose¶ were reported from 20 U.S. jurisdictions** that routinely link case surveillance data to immunization registries and vital registration databases (1). Vaccinated persons who did not receive a bivalent COVID-19 booster dose were excluded. Rate denominators were calculated from vaccine administration data and 2019 U.S.
intercensal population estimates,†† with numbers of unvaccinated persons estimated by subtracting numbers of vaccinated persons from the 2019 intercensal population estimates, as previously described§§ (1). Average weekly mortality rates were estimated based on date of specimen collection¶¶ during each variant period by vaccination status and time since bivalent booster dose receipt. RRs were calculated by dividing rates among unvaccinated persons by rates among bivalent booster dose recipients; after detrending the underlying linear changes in weekly rates, 95% CIs were estimated from the remaining variation in rates observed*** (1). SAS (version 9.4; SAS Institute) and R (version 4.1.2; R Foundation) software were used to conduct all analyses. This activity was reviewed by CDC and was conducted consistent with applicable federal law and CDC policy.†††
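The core rate ratio calculation described above (unvaccinated mortality rate divided by the rate among bivalent booster recipients) reduces to a simple computation once rates are on a common denominator. A sketch with hypothetical death counts and population denominators (not surveillance data):

```python
def mortality_rate(deaths, population, per=100_000):
    """Average weekly mortality rate per `per` persons."""
    return deaths / population * per

def rate_ratio(rate_unvaccinated, rate_boosted):
    """RR comparing unvaccinated persons with bivalent booster recipients."""
    return rate_unvaccinated / rate_boosted

# Hypothetical: 300 deaths among 2,000,000 unvaccinated persons vs
# 40 deaths among 1,600,000 bivalent booster recipients in a period
rr = rate_ratio(mortality_rate(300, 2_000_000), mortality_rate(40, 1_600_000))
print(round(rr, 1))
```

An RR above 1 indicates higher mortality among the unvaccinated group; the report's detrending and CI estimation are separate steps layered on top of this ratio.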
The testing imperative: Why the US Ending the Human Immunodeficiency Virus (HIV) Epidemic program needs to renew efforts to expand HIV testing in clinical and community-based settings
Nosyk B , Fojo AT , Kasaie P , Enns B , Trigg L , Piske M , Hutchinson AB , DiNenno EA , Zang X , Del Rio C . Clin Infect Dis 2023 76 (12) 2206-2208 Data from several modeling studies demonstrate that large-scale increases in human immunodeficiency virus (HIV) testing across settings with a high burden of HIV may produce the largest incidence reductions to support the US Ending the HIV Epidemic (EHE) initiative's goal of reducing new HIV infections 90% by 2030. Despite US Centers for Disease Control and Prevention's recommendations for routine HIV screening within clinical settings and at least yearly screening for individuals most at risk of acquiring HIV, fewer than half of US adults report ever receiving an HIV test. Furthermore, total domestic funding for HIV prevention has remained unchanged between 2013 and 2019. The authors describe the evidence supporting the value of expanded HIV testing, identify challenges in implementation, and present recommendations to address these barriers through approaches at local and federal levels to reach EHE targets.
SARS-CoV-2 cases reported on international arriving and domestic flights: United States, January 2020-December 2021
Preston LE , Rey A , Dumas S , Rodriguez A , Gertz AM , Delea KC , Alvarado-Ramy F , Christensen DL , Brown C , Chen TH . Am J Public Health 2023 113 (8) e1-e5 Objectives. To describe trends in the number of air travelers categorized as infectious with SARS-CoV-2 (severe acute respiratory syndrome coronavirus 2; the virus that causes COVID-19) in the context of total US COVID-19 vaccinations administered, and overall case counts of SARS-CoV-2 in the United States. Methods. We searched the Quarantine Activity Reporting System (QARS) database for travelers with inbound international or domestic air travel, a positive SARS-CoV-2 lab result, and a surveillance categorization of SARS-CoV-2 infection reported during January 2020 to December 2021. Travelers were categorized as infectious during travel if they had arrival dates from 2 days before to 10 days after symptom onset or a positive viral test. Results. We identified 80 715 persons meeting our inclusion criteria; 67 445 persons (83.6%) had at least 1 symptom reported. Of 67 445 symptomatic passengers, 43 884 (65.1%) reported an initial symptom onset date after their flight arrival date. The number of infectious travelers mirrored the overall number of US SARS-CoV-2 cases. Conclusions. Most travelers in the study were asymptomatic during travel, and therefore unknowingly traveled while infectious. During periods of high community transmission, it is important for travelers to stay up to date with COVID-19 vaccinations and consider wearing a high-quality mask to decrease the risk of transmission. (Am J Public Health. Published online ahead of print June 15, 2023:e1-e5. https://doi.org/10.2105/AJPH.2023.307325).
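The categorization rule above (infectious during travel if arrival fell from 2 days before to 10 days after symptom onset or a positive viral test) is a date-window check. A sketch with hypothetical traveler dates, not QARS records:

```python
from datetime import date, timedelta

def infectious_during_travel(arrival, onset_or_positive_test):
    """True if the arrival date falls within 2 days before through
    10 days after the symptom onset (or positive test) date."""
    window_start = onset_or_positive_test - timedelta(days=2)
    window_end = onset_or_positive_test + timedelta(days=10)
    return window_start <= arrival <= window_end

# Hypothetical travelers: onset one day before arrival -> in window;
# onset three days after arrival -> outside the window
print(infectious_during_travel(date(2021, 6, 5), date(2021, 6, 4)))
print(infectious_during_travel(date(2021, 6, 1), date(2021, 6, 4)))
```

The two-day lead on the window is what captures presymptomatic travelers, which is how someone can be categorized as infectious in flight despite symptom onset after arrival.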
Notes from the field: Increase in meningococcal disease among persons with HIV - United States, 2022
Rubis AB , Howie RL , Marasini D , Sharma S , Marjuki H , McNamara LA . MMWR Morb Mortal Wkly Rep 2023 72 (24) 663-664 Meningococcal disease, caused by the bacterium Neisseria meningitidis, is a sudden-onset, life-threatening illness that typically occurs as meningitis or meningococcemia. The most common signs and symptoms of meningitis include fever, headache, and stiff neck; the most common signs and symptoms of meningococcemia are fever, chills, fatigue, vomiting, diarrhea, cold hands and feet, and severe aches or pain.* Quadrivalent meningococcal conjugate vaccination (MenACWY) is routinely recommended for adolescents and persons at increased risk for meningococcal disease (1), including those with HIV. In 2016, a 2-dose series of MenACWY was recommended by the Advisory Committee on Immunization Practices (ACIP) for persons with HIV and incorporated into the U.S. immunization schedule. Coverage among persons with HIV, however, remains low: in a study of administrative claims data during January 2016–March 2018, only 16.3% of persons with HIV received ≥1 dose of MenACWY vaccine within 2 years after their diagnosis (2). This report describes an increase in meningococcal disease among persons with HIV in the United States in 2022. Data are typically finalized in the fall of the next year; therefore, this report is based on preliminary data for 2022.
Occurrence of tuberculosis among people exposed to cattle in Bangladesh
Sarkar S , Haider N , Islam A , Hossain MB , Hossain K , Mafij Uddin MK , Rahman A , Ahmed SSU , Banu S , Rahim Z , Heffelfinger JD , Zeidner N . Vet Med Sci 2023 9 (4) 1923-1933 BACKGROUND: Tuberculosis (TB) has been an important public health concern in Bangladesh. The most common cause of human TB is Mycobacterium tuberculosis, while bovine TB is caused by Mycobacterium bovis. OBJECTIVE: The objective of this study was to determine the frequency of TB in individuals with occupational exposure to cattle and to detect Mycobacterium bovis among cattle in slaughterhouses in Bangladesh. METHODS: Between August and September 2015, an observational study was conducted in two government chest disease hospitals, one cattle market, and two slaughterhouses. Sputum samples were collected from individuals who met the criteria for suspected TB and had been exposed to cattle. Tissue samples were collected from cattle that had low body condition score(s). Both human and cattle samples were screened for acid-fast bacilli (AFB) by Ziehl-Neelsen (Z-N) staining and cultured for Mycobacterium tuberculosis complex (MTC). Region of difference (RD) 9-based polymerase chain reaction (PCR) was also performed to identify Mycobacterium spp. We also conducted spoligotyping to identify the specific strain of Mycobacterium spp. RESULTS: Sputum was collected from a total of 412 humans. The median age of human participants was 35 (IQR: 25-50) years. Twenty-five (6%) human sputum specimens were positive for AFB, and 44 (11%) were positive for MTC by subsequent culture. All (N = 44) culture-positive isolates were confirmed as Mycobacterium tuberculosis by RD9 PCR. In addition, 10% of cattle workers in the cattle market were infected with Mycobacterium tuberculosis. Of all individuals with TB (caused by Mycobacterium tuberculosis), 6.8% were resistant to one or two anti-TB drugs. The majority of the sampled cattle (67%) were indigenous breeds.
No Mycobacterium bovis was detected in cattle. CONCLUSIONS: We did not detect any TB cases caused by Mycobacterium bovis in humans during the study. However, we detected TB cases caused by Mycobacterium tuberculosis in humans across the groups sampled, including cattle market workers.
Adolescent COVID-19 cases during the SARS-CoV-2 Delta and Omicron variant surges in Kentucky: Association with vaccination and prior infection
Spicer KB , Glick C , Thoroughman DA . J Adolesc Health 2023 PURPOSE: Effectiveness of COVID-19 mRNA vaccines is influenced by SARS-CoV-2 variant and history of prior infection. Data regarding protection against SARS-CoV-2 infection among adolescents, accounting for prior infection and time since vaccination, are limited. METHODS: SARS-CoV-2 testing and immunization data from the Kentucky Electronic Disease Surveillance System and the Kentucky Immunization Registry, August-September 2021 (Delta predominance) and January 2022 (Omicron predominance), among adolescents aged 12-17 years were used to assess association of SARS-CoV-2 infection with mRNA vaccination and prior SARS-CoV-2 infection. Estimated protection was derived from prevalence ratios ([1-PR] × 100%). RESULTS: During Delta predominance, 89,736 tested adolescents were evaluated. Completion of primary series (second dose of mRNA vaccine ≥ 14 days prior to testing) and history of prior infection (> 90 days prior to testing) were both protective against SARS-CoV-2 infection (primary series: 81%, 95% confidence interval [CI] 79.7-82.3; prior infection: 66%, 95% CI 62.0-69.6). Prior infection plus primary series provided the greatest protection (92.3%, 95% CI 88.0-95.1). During Omicron predominance, 67,331 tested adolescents were evaluated. Primary series alone provided no benefit against SARS-CoV-2 infection after 90 days; prior infection was protective for up to one year (24.2%, 95% CI 17.2-30.7). Prior infection plus booster vaccination provided the greatest protection against infection (82.4%, 95% CI 62.1-91.8). DISCUSSION: Strength and duration of protection against infection provided by COVID-19 vaccination and prior SARS-CoV-2 infection differed by variant. Vaccination provided additional benefit to the protection offered by prior infection alone. Remaining up to date with vaccination is recommended for all adolescents regardless of infection history.
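The study's protection estimate, ([1 - PR] × 100%), is a direct transformation of the prevalence ratio between groups. A minimal sketch with hypothetical test-positivity prevalences (not the Kentucky data):

```python
def estimated_protection(prevalence_exposed, prevalence_reference):
    """Protection (%) derived from a prevalence ratio: (1 - PR) x 100,
    where PR = prevalence in the exposed (e.g., vaccinated) group
    divided by prevalence in the reference (e.g., unvaccinated) group."""
    pr = prevalence_exposed / prevalence_reference
    return (1.0 - pr) * 100.0

# Hypothetical: 3% positivity among vaccinated adolescents vs 15%
# among the unvaccinated reference group
print(round(estimated_protection(0.03, 0.15)))
```

A PR of 1 yields 0% protection, and a PR above 1 yields a negative value, which is how "no benefit" findings such as the post-90-day primary-series result would appear under this measure.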
Application of multi-criteria decision analysis techniques and decision support framework for informing select agent designation for agricultural animal pathogens
Pillai SP , West T , Anderson K , Fruetel JA , McNeil C , Hernandez P , Ball C , Beck N , Morse SA . Front Bioeng Biotechnol 2023 11 1185743 The United States Department of Agriculture (USDA), Division of Agricultural Select Agents and Toxins (DASAT) established a list of biological agents and toxins (Select Agent List) that potentially threaten agricultural health and safety, the procedures governing the transfer of those agents, and training requirements for entities working with them. Every 2 years the USDA DASAT reviews the Select Agent List, using subject matter experts (SMEs) to perform an assessment and rank the agents. To assist the USDA DASAT biennial review process, we explored the applicability of multi-criteria decision analysis (MCDA) techniques and a Decision Support Framework (DSF) in a logic tree format to identify pathogens for consideration as select agents, applying the approach broadly to include non-select agents to evaluate its robustness and generality. We conducted a literature review of 41 pathogens against 21 criteria for assessing agricultural threat, economic impact, and bioterrorism risk and documented the findings to support this assessment. The most prominent data gaps were those for aerosol stability and animal infectious dose by inhalation and ingestion routes. Technical review of published data and associated scoring recommendations by pathogen-specific SMEs was found to be critical for accuracy, particularly for pathogens with very few known cases, or where proxy data (e.g., from animal models or similar organisms) were used to address data gaps. The MCDA analysis supported the intuitive sense that select agents should rank high on the relative risk scale when considering agricultural health consequences of a bioterrorism attack. 
However, comparing select agents with non-select agents indicated that there was not a clean break in scores to suggest thresholds for designating select agents, requiring collective subject matter expertise to establish which analytical results were in good agreement to support the intended purpose in designating select agents. The DSF utilized a logic tree approach to identify pathogens that are of sufficiently low concern that they can be ruled out from consideration as a select agent. In contrast to the MCDA approach, the DSF rules out a pathogen if it fails to meet even one criterion threshold. Both the MCDA and DSF approaches arrived at similar conclusions, suggesting the value of employing the two analytical approaches to add robustness for decision making.
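The contrast between the two approaches can be made concrete: MCDA produces a weighted score across criteria, while the DSF logic tree rules a pathogen out if any single criterion falls below its threshold. The criteria names, weights, thresholds, and scores below are hypothetical, for illustration only:

```python
# Hypothetical criteria scored 0-10, with illustrative weights/thresholds
criteria = ["infectious_dose", "aerosol_stability", "economic_impact"]
thresholds = {"infectious_dose": 3, "aerosol_stability": 2, "economic_impact": 4}
weights = {"infectious_dose": 0.5, "aerosol_stability": 0.2, "economic_impact": 0.3}

def mcda_score(scores):
    """Multi-criteria decision analysis: weighted sum across criteria."""
    return sum(weights[c] * scores[c] for c in criteria)

def dsf_ruled_out(scores):
    """Decision Support Framework logic tree: a pathogen is ruled out
    if ANY single criterion falls below its threshold."""
    return any(scores[c] < thresholds[c] for c in criteria)

pathogen = {"infectious_dose": 8, "aerosol_stability": 1, "economic_impact": 6}
print(mcda_score(pathogen), dsf_ruled_out(pathogen))
```

Note how the two methods can disagree on a single pathogen: a high weighted score does not prevent a DSF rule-out if one criterion (here, aerosol stability) misses its threshold, which is why the abstract treats the approaches as complementary.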
Volunteer-contributed observations of flowering often correlate with airborne pollen concentrations
Crimmins TM , Vogt E , Brown CL , Dalan D , Manangan A , Robinson G , Song Y , Zhu K , Katz DSW . Int J Biometeorol 2023 67 (8) 1363-1372 Characterizing airborne pollen concentrations is crucial for supporting allergy and asthma management; however, pollen monitoring is labor intensive and, in the USA, geographically limited. The USA National Phenology Network (USA-NPN) engages thousands of volunteer observers in regularly documenting the developmental and reproductive status of plants. The reports of flower and pollen cone status contributed to the USA-NPN's platform, Nature's Notebook, have the potential to help address gaps in pollen monitoring by providing real-time, spatially explicit information from across the country. In this study, we assessed whether observations of flower and pollen cone status contributed to Nature's Notebook can serve as effective proxies for airborne pollen concentrations. We compared daily pollen concentrations from 36 National Allergy Bureau (NAB) stations in the USA with flowering and pollen cone status observations collected within 200 km of each NAB station in each year, 2009-2021, for 15 common tree taxa using Spearman's correlations. Of 350 comparisons, 58% of correlations were significant (p < 0.05). Comparisons could be made at the largest numbers of sites for Acer and Quercus. Quercus demonstrated a comparatively high proportion of tests with significant agreement (median ρ = 0.49). Juglans demonstrated the strongest overall coherence between the two datasets (median ρ = 0.79), though comparisons were made at only a small number of sites. For particular taxa, volunteer-contributed flowering status observations demonstrate promise to indicate seasonal patterns in airborne pollen concentrations. The quantity of observations, and therefore their utility for supporting pollen alerts, could be substantially increased through a formal observation campaign.
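Spearman's correlation, as used for the comparisons above, is the Pearson correlation computed on ranks, which makes it sensitive to monotone agreement rather than linear fit. A self-contained sketch (no tie handling) with hypothetical flowering-report and pollen series:

```python
def rank(values):
    """Rank values from 1..n (no tie handling; for illustration only)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    for r, i in enumerate(order, start=1):
        ranks[i] = float(r)
    return ranks

def spearman_rho(x, y):
    """Spearman correlation: Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical daily series: count of open-flower reports vs pollen grains/m^3
flowering = [0, 2, 5, 9, 12]
pollen = [10, 15, 40, 80, 200]
print(spearman_rho(flowering, pollen))
```

In practice a library routine (e.g., scipy.stats.spearmanr) would be used, since it handles ties and reports p-values like those summarized in the abstract.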
An analysis of prescribed fire activities and emissions in the Southeastern United States from 2013 to 2020
Li Z , Maji KJ , Hu Y , Vaidyanathan A , O’Neill SM , Odman MT , Russell AG . Remote Sens 2023 15 (11) Prescribed burning is a major source of fine particulate matter, especially in the southeastern United States, and quantifying emissions from burning operations accurately is an integral part of ascertaining air quality impacts. A critical factor in calculating fire emissions is fire activity information (e.g., location, date/time, fire type, and area burned), and prior estimates of prescribed fire activity used for calculating emissions have relied either on burn permit records or on satellite-based remote sensing products. While burn permit records kept by state agencies are a reliable source, they are not always available or readily accessible. Satellite-based remote sensing products are currently used to fill the data gaps, especially in regional studies; however, they cannot differentiate prescribed burns from the other types of fires. In this study, we developed novel algorithms to distinguish prescribed burns from wildfires and agricultural burns in a satellite-derived product, Fire INventory from NCAR (FINN). We matched and compared the burned areas from permit records and FINN at various spatial scales: individual fire level, 4 km grid level, and state level. The methods developed in this study are readily usable for differentiating burn type, matching and comparing the burned area between two datasets at various resolutions, and estimating prescribed burn emissions. The results showed that burned areas from permits and FINN have a weak correlation at the individual fire level, while the correlation is much higher for the 4 km grid and state levels. Since matching at the 4 km grid level showed a relatively higher correlation and chemical transport models typically use grid-based emissions, we used the linear regression relationship between FINN and permit burned areas at the grid level to adjust FINN burned areas.
This adjustment resulted in a reduction in FINN burned areas by 34%. The adjusted burned area was then used as input to the BlueSky Smoke Modeling Framework to provide long-term, three-dimensional prescribed burning emissions for the southeastern United States. In this study, we also compared emissions from different methods (FINN or BlueSky) and different data sources (adjusted FINN or permits) to evaluate uncertainties of our emission estimation. The comparison results showed the impacts of the burned area, method, and data source on prescribed burning emission estimations.
Longitudinal and quantitative fecal shedding dynamics of SARS-CoV-2, pepper mild mottle virus, and crAssphage
Arts PJ , Kelly JD , Midgley CM , Anglin K , Lu S , Abedi GR , Andino R , Bakker KM , Banman B , Boehm AB , Briggs-Hagen M , Brouwer AF , Davidson MC , Eisenberg MC , Garcia-Knight M , Knight S , Peluso MJ , Pineda-Ramirez J , Diaz Sanchez R , Saydah S , Tassetto M , Martin JN , Wigginton KR . mSphere 2023 8 (4) e0013223 Wastewater-based epidemiology (WBE) emerged during the coronavirus disease 2019 (COVID-19) pandemic as a scalable and broadly applicable method for community-level monitoring of infectious disease burden. The lack of high-resolution fecal shedding data for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) limits our ability to link WBE measurements to disease burden. In this study, we present longitudinal, quantitative fecal shedding data for SARS-CoV-2 RNA, as well as for the commonly used fecal indicators pepper mild mottle virus (PMMoV) RNA and crAss-like phage (crAssphage) DNA. The shedding trajectories from 48 SARS-CoV-2-infected individuals suggest a highly individualized, dynamic course of SARS-CoV-2 RNA fecal shedding. Of the individuals that provided at least three stool samples spanning more than 14 days, 77% had one or more samples that tested positive for SARS-CoV-2 RNA. We detected PMMoV RNA in at least one sample from all individuals and in 96% (352/367) of samples overall. CrAssphage DNA was detected in at least one sample from 80% (38/48) of individuals and was detected in 48% (179/371) of all samples. The geometric mean concentrations of PMMoV and crAssphage in stool across all individuals were 8.7 × 10(4) and 1.4 × 10(4) gene copies/milligram-dry weight, respectively, and crAssphage shedding was more consistent for individuals than PMMoV shedding. These results provide us with a missing link needed to connect laboratory WBE results with mechanistic models, and this will aid in more accurate estimates of COVID-19 burden in sewersheds. 
Additionally, the PMMoV and crAssphage data are critical for evaluating their utility as fecal strength normalizing measures and for source-tracking applications. IMPORTANCE This research represents a critical step in the advancement of wastewater monitoring for public health. To date, mechanistic materials balance modeling of wastewater-based epidemiology has relied on SARS-CoV-2 fecal shedding estimates from small-scale clinical reports or meta-analyses of research using a wide range of analytical methodologies. Additionally, previous SARS-CoV-2 fecal shedding data have not contained sufficient methodological information for building accurate materials balance models. Like SARS-CoV-2, fecal shedding of PMMoV and crAssphage has been understudied to date. The data presented here provide externally valid and longitudinal fecal shedding data for SARS-CoV-2, PMMoV, and crAssphage which can be directly applied to WBE models and ultimately increase the utility of WBE. |
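The pooled shedding concentrations above are reported as geometric means, the usual summary for log-normally distributed viral concentrations. A minimal sketch of that calculation, using invented per-sample values (the study's individual-level data are not reproduced here):

```python
import math

def geometric_mean(values):
    """Geometric mean of positive concentrations
    (e.g., gene copies/milligram-dry weight)."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Hypothetical per-sample PMMoV concentrations; the study's pooled
# geometric mean across individuals was 8.7 x 10^4 gene copies/mg-dry weight.
samples = [2.1e4, 9.5e4, 3.3e5, 6.0e4]
print(f"{geometric_mean(samples):.2e}")
```

Averaging on the log scale keeps a single very high-shedding sample from dominating the summary, which is why WBE studies prefer it over the arithmetic mean.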
Investigation of a multistate outbreak of Listeria monocytogenes infections linked to frozen vegetables produced at individually quick-frozen vegetable manufacturing facilities
Madad A , Heiman Marshall K , Blessington T , Hardy C , Salter M , Basler C , Conrad A , Stroika S , Luo Y , Dwarka A , Gerhardt T , Rosa Y , Cibulskas K , Rosen HE , Adcock B , Kiang D , Hutton S , Parish M , Podoski B , Patel B , Viazis S . J Food Prot 2023 86 (8) 100117 In 2016, the U.S. Food and Drug Administration (FDA), the Centers for Disease Control and Prevention (CDC), and state partners investigated nine Listeria monocytogenes infections linked to frozen vegetables. The investigation began with two environmental L. monocytogenes isolates recovered from Manufacturer A, primarily a processor of frozen onions, that were a match by whole genome sequencing (WGS) to eight clinical isolates and historical onion isolates with limited collection details. Epidemiologic information, product distribution, and laboratory evidence linked suspect food items, including products sourced from Manufacturer B, also a manufacturer of frozen vegetable/fruit products, with an additional illness. The environmental isolates were obtained during investigations at Manufacturers A and B. State and federal partners interviewed ill people, analyzed shopper card data, and collected household and retail samples. Nine ill persons between 2013 and 2016 were reported in four states. Of four ill people with information available, frozen vegetable consumption was reported by three, with shopper cards confirming purchases of Manufacturer B brands. Two identified outbreak strains of L. monocytogenes (Outbreak Strain 1 and Outbreak Strain 2) were a match to environmental isolates from Manufacturer A and/or isolates from frozen vegetables recovered from open and unopened product samples sourced from Manufacturer B; the investigation resulted in extensive voluntary recalls. The close genetic relationship between isolates helped investigators determine the source of the outbreak and take steps to protect public health. 
This is the first known multistate outbreak of listeriosis in the United States linked to frozen vegetables and highlights the significance of sampling and WGS analyses when there is limited epidemiologic information. Additionally, this investigation emphasizes the need for further research regarding food safety risks associated with frozen foods. |
Risk factors for non-O157 shiga toxin-producing Escherichia coli infections, United States
Marder EP , Cui Z , Bruce BB , Richardson LC , Boyle MM , Cieslak PR , Comstock N , Lathrop S , Garman K , McGuire S , Olson D , Vugia DJ , Wilson S , Griffin PM , Medus C . Emerg Infect Dis 2023 29 (6) 1183-1190 Shiga toxin-producing Escherichia coli (STEC) causes acute diarrheal illness. To determine risk factors for non-O157 STEC infection, we enrolled 939 patients and 2,464 healthy controls in a case-control study conducted in 10 US sites. The highest population-attributable fractions for domestically acquired infections were for eating lettuce (39%), tomatoes (21%), or at a fast-food restaurant (23%). Exposures with 10%-19% population attributable fractions included eating at a table service restaurant, eating watermelon, eating chicken, pork, beef, or iceberg lettuce prepared in a restaurant, eating exotic fruit, taking acid-reducing medication, and living or working on or visiting a farm. Significant exposures with high individual-level risk (odds ratio >10) among those >1 year of age who did not travel internationally were all from farm animal environments. To markedly decrease the number of STEC-related illnesses, prevention measures should focus on decreasing contamination of produce and improving the safety of foods prepared in restaurants. |
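Population-attributable fractions like those above can be estimated from case-control data with Miettinen's case-based formula, PAF = p_c(OR - 1)/OR, where p_c is the exposure prevalence among cases and OR the odds ratio. A sketch with hypothetical inputs (the abstract does not report the underlying prevalences or odds ratios):

```python
def attributable_fraction(p_exposed_cases, odds_ratio):
    """Miettinen's case-based population-attributable fraction:
    PAF = p_c * (OR - 1) / OR, with p_c the exposure prevalence among cases."""
    return p_exposed_cases * (odds_ratio - 1.0) / odds_ratio

# Hypothetical: 60% of cases report the exposure, odds ratio 2.8
paf = attributable_fraction(0.60, 2.8)
print(f"{paf:.0%}")  # share of cases attributable to the exposure
```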
Genomic epidemiology of a severe acute respiratory syndrome coronavirus 2 outbreak in a US major league soccer club: Was it travel related
Carmola LR , Turcinovic J , Draper G , Webner D , Putukian M , Silvers-Granelli H , Bombin A , Connor BA , Angelo KM , Kozarsky P , Libman M , Huits R , Hamer DH , Fairley JK , Connor JH , Piantadosi A , Bourque DL . Open Forum Infect Dis 2023 10 (6) ofad235 BACKGROUND: Professional soccer athletes are at risk of acquiring severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). United States Major League Soccer (MLS) uses protocol-based SARS-CoV-2 testing for identification of individuals with coronavirus disease 2019. METHODS: Per MLS protocol, fully vaccinated players underwent SARS-CoV-2 real-time polymerase chain reaction testing weekly; unvaccinated players were tested every other day. Demographic and epidemiologic data were collected from individuals who tested positive, and contact tracing was performed. Whole genome sequencing (WGS) was performed on positive specimens, and phylogenetic analyses were used to identify potential transmission patterns. RESULTS: In the fall of 2021, all 30 players from 1 MLS team underwent SARS-CoV-2 testing per protocol; 27 (90%) were vaccinated. One player who had recently traveled to Africa tested positive for SARS-CoV-2; within the following 2 weeks, 10 additional players and 1 staff member tested positive. WGS yielded full genome sequences for 10 samples, including 1 from the traveler. The traveler's sample was Delta sublineage AY.36 and was closely related to a sequence from Africa. Nine samples yielded other Delta sublineages including AY.4 (n = 7), AY.39 (n = 1), and B.1.617.2 (n = 1). The 7 AY.4 sequences clustered together, suggesting a common source of infection. Transmission from a family member visiting from England to an MLS player was identified as the potential index case. The other 2 AY.4 sequences differed from this group by 1-3 nucleotides, as did a partial genome sequence from an additional team member. 
CONCLUSIONS: WGS is a useful tool for understanding SARS-CoV-2 transmission dynamics in professional sports teams. |
Genomic surveillance for SARS-CoV-2 variants: Circulation of Omicron lineages - United States, January 2022-May 2023
Ma KC , Shirk P , Lambrou AS , Hassell N , Zheng XY , Payne AB , Ali AR , Batra D , Caravas J , Chau R , Cook PW , Howard D , Kovacs NA , Lacek KA , Lee JS , MacCannell DR , Malapati L , Mathew S , Mittal N , Nagilla RR , Parikh R , Paul P , Rambo-Martin BL , Shepard SS , Sheth M , Wentworth DE , Winn A , Hall AJ , Silk BJ , Thornburg N , Kondor R , Scobie HM , Paden CR . MMWR Morb Mortal Wkly Rep 2023 72 (24) 651-656 CDC has used national genomic surveillance since December 2020 to monitor SARS-CoV-2 variants that have emerged throughout the COVID-19 pandemic, including the Omicron variant. This report summarizes U.S. trends in variant proportions from national genomic surveillance during January 2022-May 2023. During this period, the Omicron variant remained predominant, with various descendant lineages reaching national predominance (>50% prevalence). During the first half of 2022, BA.1.1 reached predominance by the week ending January 8, 2022, followed by BA.2 (March 26), BA.2.12.1 (May 14), and BA.5 (July 2); the predominance of each variant coincided with surges in COVID-19 cases. The latter half of 2022 was characterized by the circulation of sublineages of BA.2, BA.4, and BA.5 (e.g., BQ.1 and BQ.1.1), some of which independently acquired similar spike protein substitutions associated with immune evasion. By the end of January 2023, XBB.1.5 became predominant. As of May 13, 2023, the most common circulating lineages were XBB.1.5 (61.5%), XBB.1.9.1 (10.0%), and XBB.1.16 (9.4%); XBB.1.16 and XBB.1.16.1 (2.4%), containing the K478R substitution, and XBB.2.3 (3.2%), containing the P521S substitution, had the fastest doubling times at that point. Analytic methods for estimating variant proportions have been updated as the availability of sequencing specimens has declined. The continued evolution of Omicron lineages highlights the importance of genomic surveillance to monitor emerging variants and help guide vaccine development and use of therapeutics. |
Genomic characterization of the rotavirus G3P[8] strain in vaccinated children, reveals possible reassortment events between human and animal strains in Manhia District, Mozambique
Manjate F , João ED , Mwangi P , Chirinda P , Mogotsi M , Messa A Jr , Garrine M , Vubil D , Nobela N , Nhampossa T , Acácio S , Tate JE , Parashar U , Weldegebriel G , Mwenda JM , Alonso PL , Cunha C , Nyaga M , Mandomando I . Front Microbiol 2023 14 1193094 Mozambique introduced the rotavirus vaccine (Rotarix®; GlaxoSmithKline Biologicals, Rixensart, Belgium) in 2015, and since then, the Centro de Investigação em Saúde de Manhiça has been monitoring its impact on rotavirus-associated diarrhea and the trend of circulating strains, where G3P[8] was reported as the predominant strain after the vaccine introduction. Genotype G3 is among the most commonly detected rotavirus strains in humans and animals, and herein, we report on the whole genome constellation of G3P[8] detected in two children (aged 18 months) hospitalized with moderate-to-severe diarrhea at the Manhiça District Hospital. The two strains had a typical Wa-like genome constellation (I1-R1-C1-M1-A1-N1-T1-E1-H1) and shared 100% nucleotide (nt) and amino acid (aa) identities in 10 gene segments, except for VP6. Phylogenetic analysis demonstrated that genome segments encoding VP7, VP6, VP1, NSP3, and NSP4 of the two strains clustered most closely with porcine, bovine, and equine strains with identities ranging from 86.9-99.9% nt and 97.2-100% aa. Moreover, they consistently formed distinct clusters with some G1P[8], G3P[8], G9P[8], G12P[6], and G12P[8] strains circulating from 2012 to 2019 in Africa (Mozambique, Kenya, Rwanda, and Malawi) and Asia (Japan, China, and India) in genome segments encoding six proteins (VP2, VP3, NSP1-NSP2, NSP5/6). The identification of segments exhibiting the closest relationships with animal strains shows significant diversity of rotavirus and suggests the possible occurrence of reassortment events between human and animal strains. 
This demonstrates the importance of applying next-generation sequencing to monitor and understand the evolutionary changes of strains and evaluate the impact of vaccines on strain diversity. |
The cost burden of metastatic prostate cancer in the US populations covered by employer-sponsored health insurance
Horný M , Yabroff KR , Filson CP , Zheng Z , Ekwueme DU , Richards TB , Howard DH . Cancer 2023 129 (20) 3252-3262 BACKGROUND: Recent advancements in the clinical management of metastatic prostate cancer include several costly therapies and diagnostic tests. The objective of this study was to provide updated information on the cost to payers attributable to metastatic prostate cancer among men aged 18 to 64 years with employer-sponsored health plans and men aged 18 years or older covered by employer-sponsored Medicare supplement insurance. METHODS: By using Merative MarketScan commercial and Medicare supplemental data for 2009-2019, the authors calculated differences in spending between men with metastatic prostate cancer and their matched, prostate cancer-free controls, adjusting for age, enrollment length, comorbidities, and inflation to 2019 US dollars. RESULTS: The authors compared 9011 patients who had metastatic prostate cancer and were covered by commercial insurance plans with a group of 44,934 matched controls and also compared 17,899 patients who had metastatic prostate cancer and were covered by employer-sponsored Medicare supplement plans with a group of 87,884 matched controls. The mean age of patients with metastatic prostate cancer was 58.5 years in the commercial samples and 77.8 years in the Medicare supplement samples. Annual spending attributable to metastatic prostate cancer was $55,949 per person-year (95% confidence interval [CI], $54,074-$57,825 per person-year) in the commercial population and $43,682 per person-year (95% CI, $42,022-$45,342 per person-year) in the population covered by Medicare supplement plans, both in 2019 US dollars. CONCLUSIONS: The cost burden attributable to metastatic prostate cancer exceeds $55,000 per person-year among men with employer-sponsored health insurance and $43,000 among those covered by employer-sponsored Medicare supplement plans. 
These estimates can improve the precision of value assessments of clinical and policy approaches to the prevention, screening, and treatment of prostate cancer in the United States. |
Medication cost concerns and disparities in patient-reported outcomes among a multiethnic cohort of patients with lupus
Aguirre A , DeQuattro K , Shiboski S , Katz P , Greenlund KJ , Barbour KE , Gordon C , Lanata C , Criswel L , Dall'Era M , Yazdany J . J Rheumatol 2023 50 (10) 1302-1309 OBJECTIVE: Concerns about the affordability of medications are common in systemic lupus erythematosus (SLE), but the relationship between medication cost concerns and health outcomes is poorly understood. We assessed the association of self-reported medication cost concerns and patient-reported outcomes (PROs) in a multiethnic lupus cohort. METHODS: The California Lupus Epidemiology Study is a cohort of individuals with physician-confirmed SLE. Medication cost concerns were defined as having difficulties affording lupus medications, skipping doses, delaying refills, requesting lower cost alternatives, purchasing medications outside the US, or applying for patient assistance programs. Linear regression and mixed effects models assessed the cross-sectional and longitudinal association of medication cost concerns and PROs, respectively, adjusting for age, sex, race and ethnicity, income, principal insurance, immunomodulatory medications, and organ damage. RESULTS: Of 334 participants, medication cost concerns were reported by 91 (27%). Medication cost concerns were associated with worse Systemic Lupus Erythematosus Activity Questionnaire (SLAQ, beta coefficient 5.9, 95% CI 4.3 to 7.6, P<0.001), Patient Health Questionnaire Depression Scale (PHQ-8, beta coefficient 2.7, 95% CI 1.4 to 4.0, P<0.001), and Patient-Reported Outcomes Measurement Information System (PROMIS, beta coefficient for physical function -4.6, 95% CI -6.7 to -2.4, P<0.001) scores after adjusting for covariates. Medication cost concerns were not associated with significant changes in PROs over the two-year follow-up. CONCLUSION: More than a quarter of participants reported at least one medication cost concern, which was associated with worse patient-reported outcomes. 
Our results reveal a potentially modifiable risk factor for poor outcomes rooted in the unaffordability of lupus care. |
Coordinated global cessation of oral poliovirus vaccine use: Options and potential consequences
Kalkowska DA , Wassilak SGF , Wiesen E , Burns CC , Pallansch MA , Badizadegan K , Thompson KM . Risk Anal 2023 Due to the very low, but nonzero, paralysis risks associated with the use of oral poliovirus vaccine (OPV), eradicating poliomyelitis requires ending all OPV use globally. The Global Polio Eradication Initiative (GPEI) coordinated cessation of Sabin type 2 OPV (OPV2 cessation) in 2016, except for emergency outbreak response. However, as of early 2023, plans for cessation of bivalent OPV (bOPV, containing types 1 and 3 OPV) remain undefined, and OPV2 use for outbreak response continues due to ongoing transmission of type 2 polioviruses and reported type 2 cases. Recent development and use of a genetically stabilized novel type 2 OPV (nOPV2) leads to additional potential vaccine options and increasing complexity in strategies for the polio endgame. Prior applications of integrated global risk, economic, and poliovirus transmission modeling consistent with GPEI strategic plans that preceded OPV2 cessation explored OPV cessation dynamics and the evaluation of options to support globally coordinated risk management efforts. The 2022-2026 GPEI strategic plan highlighted the need for early bOPV cessation planning. We review the published modeling and explore bOPV cessation immunization options as of 2022, assuming that the GPEI partners will not support restart of the use of any OPV type in routine immunization after a globally coordinated cessation of such use. We model the potential consequences of globally coordinating bOPV cessation in 2027, as anticipated in the 2022-2026 GPEI strategic plan. We do not find any options for bOPV cessation likely to succeed without a strategy of bOPV intensification to increase population immunity prior to cessation. |
Worst-case scenarios: Modeling uncontrolled type 2 polio transmission
Kalkowska DA , Wiesen E , Wassilak SGF , Burns CC , Pallansch MA , Badizadegan K , Thompson KM . Risk Anal 2023 In May 2016, the Global Polio Eradication Initiative (GPEI) coordinated the cessation of all use of type 2 oral poliovirus vaccine (OPV2), except for emergency outbreak response. Since then, more than 3,000 paralytic polio cases caused by type 2 vaccine-derived polioviruses have been reported by 39 countries. In 2022 (as of April 25, 2023), 20 countries reported detection of cases and nine other countries reported environmental surveillance detection, but no reported cases. Recent development of a genetically modified novel type 2 OPV (nOPV2) may help curb the generation of neurovirulent vaccine-derived strains; its use since 2021 under Emergency Use Listing is limited to outbreak response activities. Prior modeling studies showed that the expected trajectory for global type 2 viruses does not appear headed toward eradication, even with the best possible properties of nOPV2 assuming current outbreak response performance. Continued persistence of type 2 poliovirus transmission exposes the world to the risks of potentially high-consequence events such as the importation of virus into high-transmission areas of India or Bangladesh. Building on prior polio endgame modeling and assuming current national and GPEI outbreak response performance, we show no probability of successfully eradicating type 2 polioviruses in the near term regardless of vaccine choice. We also demonstrate that possible worst-case scenarios could result in rapid expansion of paralytic cases and preclude the goal of permanently ending all cases of poliomyelitis in the foreseeable future. Avoiding such catastrophic scenarios will depend on the development of strategies that raise population immunity to type 2 polioviruses. |
Safety of simultaneous vaccination with COVID-19 vaccines in the Vaccine Safety Datalink
Kenigsberg TA , Hanson KE , Klein NP , Zerbo O , Goddard K , Xu S , Yih WK , Irving SA , Hurley LP , Glanz JM , Kaiser R , Jackson LA , Weintraub ES . Vaccine 2023 INTRODUCTION: Safety data on simultaneous vaccination (SV) with primary series monovalent COVID-19 vaccines and other vaccines are limited. We describe SV with primary series COVID-19 vaccines and assess 23 pre-specified health outcomes following SV among persons aged ≥5 years in the Vaccine Safety Datalink (VSD). METHODS: We utilized VSD's COVID-19 vaccine surveillance data from December 11, 2020-May 21, 2022. Analyses assessed frequency of SV. Rate ratios (RRs) were estimated by Poisson regression when the number of outcomes was ≥5 across both doses, comparing outcome rates between COVID-19 vaccinees receiving SV and COVID-19 vaccinees receiving no SV in the 1-21 days following COVID-19 vaccine dose 1 and 1-42 days following dose 2 by SV type received ("All SV", "Influenza SV", "Non-influenza SV"). RESULTS: SV with COVID-19 vaccines was not common practice (dose 1: 0.7 % of 8,455,037 persons, dose 2: 0.3 % of 7,787,013 persons). The most frequent simultaneous vaccines were influenza, HPV, Tdap, and meningococcal. Outcomes following SV with COVID-19 vaccines were rare (total of 56 outcomes observed after dose 1 and dose 2). Overall rate of outcomes among COVID-19 vaccinees who received SV was not statistically significantly different than the rate among those who did not receive SV (6.5 vs. 6.8 per 10,000 persons). Statistically significant elevated RRs were observed for appendicitis (2.09; 95 % CI, 1.06-4.13) and convulsions/seizures (2.78; 95 % CI, 1.10-7.06) in the "All SV" group following dose 1, and for Bell's palsy (2.82; 95 % CI, 1.14-6.97) in the "Influenza SV" group following dose 2. 
CONCLUSION: Combined pre-specified health outcomes observed among persons who received SV with COVID-19 vaccine were rare and not statistically significantly different compared to persons who did not receive SV with COVID-19 vaccine. Statistically significant adjusted rate ratios were observed for some individual outcomes, but the number of outcomes was small and there was no adjustment for multiple testing. |
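The comparisons above rest on rate ratios estimated by Poisson regression. As a simplified illustration only, a crude (unadjusted) rate ratio with a Wald-type confidence interval on the log scale can be computed from invented counts like so:

```python
import math

def rate_ratio_ci(a, pt_a, b, pt_b, z=1.96):
    """Crude rate ratio (a/pt_a) / (b/pt_b) with an approximate 95% CI,
    using SE(log RR) = sqrt(1/a + 1/b)."""
    rr = (a / pt_a) / (b / pt_b)
    se = math.sqrt(1.0 / a + 1.0 / b)
    return rr, rr * math.exp(-z * se), rr * math.exp(z * se)

# Hypothetical: 8 outcomes in 12,000 person-intervals with simultaneous
# vaccination vs. 40 outcomes in 120,000 person-intervals without it
rr, lo, hi = rate_ratio_ci(8, 12_000, 40, 120_000)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

With so few events, as in the study, the interval is wide, which is why the authors caution that small outcome counts and the absence of multiplicity adjustment limit interpretation.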
Interim recommendations for use of bivalent mRNA COVID-19 vaccines for persons aged 6 months - United States, April 2023
Moulia DL , Wallace M , Roper LE , Godfrey M , Rosenblum HG , Link-Gelles R , Britton A , Daley MF , Meyer S , Fleming-Dutra KE , Oliver SE , Twentyman E . MMWR Morb Mortal Wkly Rep 2023 72 (24) 657-662 Throughout the national public health emergency declared in response to the COVID-19 pandemic, CDC, guided by the Advisory Committee on Immunization Practices (ACIP), has offered evidence-based recommendations for the use of COVID-19 vaccines in U.S. populations after each regulatory action by the Food and Drug Administration (FDA). During August 2022-April 2023, FDA amended its Emergency Use Authorizations (EUAs) to authorize the use of a single, age-appropriate, bivalent COVID-19 vaccine dose (i.e., containing components from the ancestral and Omicron BA.4/BA.5 strains in equal amounts) for all persons aged ≥6 years, use of bivalent COVID-19 vaccine doses for children aged 6 months-5 years, and additional bivalent doses for immunocompromised persons and adults aged ≥65 years (1). ACIP voted in September 2022 on the use of the bivalent vaccine, and CDC made recommendations after the September vote and subsequently, through April 2023, with input from ACIP. This transition to a single bivalent COVID-19 vaccine dose for most persons, with additional doses for persons at increased risk for severe disease, facilitates implementation of simpler, more flexible recommendations. Three COVID-19 vaccines are currently available for use in the United States and recommended by ACIP: 1) the bivalent mRNA Pfizer-BioNTech COVID-19 vaccine, 2) the bivalent mRNA Moderna COVID-19 vaccine, and 3) the monovalent adjuvanted, protein subunit-based Novavax COVID-19 vaccine.* As of August 31, 2022, monovalent mRNA vaccines based on the ancestral SARS-CoV-2 strain are no longer authorized for use in the United States (1). |
Seasonal influenza vaccination in the Americas: Progress and challenges during the COVID-19 pandemic
Nogareda F , Gharpure R , Contreras M , Velandia M , Lucia Pacis C , Elena Chevez A , Azziz-Baumgartner E , Salas D . Vaccine 2023 BACKGROUND: Vaccination is one of the most effective measures to prevent influenza illness and its complications; influenza vaccination remained important during the COVID-19 pandemic to prevent additional burden on health systems strained by COVID-19 demand. OBJECTIVES: We describe policies, coverage, and progress of seasonal influenza vaccination programs in the Americas during 2019-2021 and discuss challenges in monitoring and maintaining influenza vaccination coverage among target groups during the COVID-19 pandemic. METHODS: We used data on influenza vaccination policies and vaccination coverage reported by countries/territories via the electronic Joint Reporting Form on Immunization (eJRF) for 2019-2021. We also summarized country vaccination strategies shared with PAHO. RESULTS: As of 2021, 39 (89 %) out of 44 reporting countries/territories in the Americas had policies for seasonal influenza vaccination. Countries/territories adapted health services and immunization delivery strategies using innovative approaches, such as new vaccination sites and expanded schedules, to ensure continuation of influenza vaccination during the COVID-19 pandemic. However, among countries/territories that reported data to eJRF in both 2019 and 2021, median coverage decreased; the median percentage-point decrease was 21 % (IQR = 0-38 %; n = 13) for healthcare workers, 10 % (IQR = -1.5-38 %; n = 12) for older adults, 21 % (IQR = 5-31 %; n = 13) for pregnant women, 13 % (IQR = 4.8-20.8 %; n = 8) for persons with chronic diseases, and 9 % (IQR = 3-27 %; n = 15) for children. CONCLUSIONS: Countries/territories in the Americas successfully adapted influenza vaccination delivery to continue vaccination services during the COVID-19 pandemic; however, reported influenza vaccination coverage decreased from 2019 to 2021. 
Reversing declines in vaccination will necessitate strategic approaches that prioritize sustainable vaccination programs across the life course. Efforts should be made to improve the completeness and quality of administrative coverage data. Lessons learned from COVID-19 vaccination, such as the rapid development of electronic vaccination registries and digital certificates, might facilitate advances in coverage estimation. |
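The coverage declines above are summarized as medians with interquartile ranges (IQRs) across reporting countries. A minimal sketch of that summary, using invented coverage values rather than the study's country-level data:

```python
from statistics import median, quantiles

# Hypothetical 2019 and 2021 coverage (%) for the same five countries
cov_2019 = [85, 70, 90, 60, 75]
cov_2021 = [60, 65, 55, 58, 70]

# Percentage-point decrease per country, then median and quartiles
decreases = [y19 - y21 for y19, y21 in zip(cov_2019, cov_2021)]
q1, _, q3 = quantiles(decreases, n=4)  # three cut points; middle one is the median
print(f"median decrease = {median(decreases)} points (IQR {q1}-{q3})")
```

A negative decrease (as in the older-adult IQR above, which dips to -1.5 %) simply means that country's coverage rose between the two years.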
Uninsured and not immune - closing the vaccine-coverage gap for adults
Wallender E , Peacock G , Wharton M , Walensky RP . N Engl J Med 2023 389 (3) 193-195 The U.S. Covid-19 vaccination strategy was simple: get safe and effective vaccines into arms as quickly as possible by making them free and accessible. This strategy worked: more than 670 million Covid-19 vaccine doses had been administered to more than 270 million Americans by the end of the national public health emergency. As we look toward the fall, Covid-19 vaccines — the most effective tool for preventing severe disease — will largely be moving to the commercial market, where access to vaccines is often limited for adults without health insurance. | | In keeping with its pandemic-long goal of promoting equitable access to Covid-19 vaccines, the Biden administration in April 2023 introduced the Health and Human Services Bridge Access Program for Covid-19 Vaccines and Treatments to cover Covid-19 vaccines for uninsured people through 2024. Conceived of as a temporary solution until a more comprehensive vaccine-access plan can be authorized and funded by Congress, the program would utilize more than $1 billion of Covid-19 funds to distribute Covid-19 vaccines to state and local health departments and associated providers, clinics supported by the Health Resources and Services Administration (HRSA), and pharmacies. The fact that this program had to be created from scratch — with funding identified, contracts modified, and timelines and end points designated — speaks to the need for a sustainable, comprehensive vaccine program for uninsured adults, to provide protection against vaccine-preventable diseases for both eligible participants and the general public. |
Post-authorization safety surveillance of Ad.26.COV2.S vaccine: Reports to the Vaccine Adverse Event Reporting System and v-safe, February 2021-February 2022
Woo EJ , Gee J , Marquez P , Baggs J , Abara WE , McNeil MM , Dimova RB , Su JR . Vaccine 2023 41 (30) 4422-4430 BACKGROUND: On 2/27/2021, FDA authorized Janssen COVID-19 Vaccine (Ad.26.COV2.S) for use in individuals 18 years of age and older. Vaccine safety was monitored using the Vaccine Adverse Event Reporting System (VAERS), a national passive surveillance system, and v-safe, a smartphone-based surveillance system. METHODS: VAERS and v-safe data from 2/27/2021 to 2/28/2022 were analyzed. Descriptive analyses included sex, age, race/ethnicity, seriousness, AEs of special interest (AESIs), and cause of death. For prespecified AESIs, reporting rates were calculated using the total number of doses of Ad26.COV2.S administered. For myopericarditis, observed-to-expected (O/E) analysis was performed based on the number of verified cases, vaccine administration data, and published background rates. Proportions of v-safe participants reporting local and systemic reactions, as well as health impacts, were calculated. RESULTS: During the analytic period, 17,018,042 doses of Ad26.COV2.S were administered in the United States, and VAERS received 67,995 reports of AEs after Ad26.COV2.S vaccination. Most AEs (59,750; 87.9 %) were non-serious and were similar to those observed during clinical trials. Serious AEs included COVID-19 disease, coagulopathy (including thrombosis with thrombocytopenia syndrome; TTS), myocardial infarction, Bell's palsy, and Guillain-Barré syndrome (GBS). Among AESIs, reporting rates per million doses of Ad26.COV2.S administered ranged from 0.06 for multisystem inflammatory syndrome in children to 263.43 for COVID-19 disease. O/E analysis revealed elevated reporting rate ratios (RRs) for myopericarditis; among adults ages 18-64 years, the RR was 3.19 (95 % CI 2.00, 4.83) within 7 days and 1.79 (95 % CI 1.26, 2.46) within 21 days of vaccination. Of 416,384 Ad26.COV2.S recipients enrolled into v-safe, 60.9 % reported local symptoms (e.g. 
injection site pain) and 75.9 % reported systemic symptoms (e.g., fatigue, headache). One-third of participants (141,334; 33.9 %) reported a health impact, but only 1.4 % sought medical care. CONCLUSION: Our review confirmed previously established safety risks for TTS and GBS and identified a potential safety concern for myocarditis. |
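The observed-to-expected (O/E) analysis above compares verified case reports against the count expected from published background incidence over a defined risk window. A simplified sketch with illustrative numbers (the abstract does not give the underlying case counts or background rates, so every input below is hypothetical):

```python
def observed_to_expected(observed, doses, bg_rate_per_100k_py, window_days):
    """O/E reporting ratio: expected cases = doses x background incidence x
    (risk window expressed as a fraction of a person-year)."""
    expected = doses * (bg_rate_per_100k_py / 100_000) * (window_days / 365.25)
    return observed / expected

# Hypothetical: 60 verified cases after 5,000,000 doses, against a
# background rate of 20 per 100,000 person-years and a 21-day risk window
print(round(observed_to_expected(60, 5_000_000, 20, 21), 2))
```

Shortening the risk window shrinks the expected count, which is one reason the study's 7-day RR for myopericarditis (3.19) exceeds its 21-day RR (1.79).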
Firearm-related traumatic brain injury homicides in the United States, 2000-2019
Waltzman D , Sarmiento K , Daugherty J , Lumba-Brown A , Klevens J , Miller GF . Neurosurgery 2023 93 (1) 43-49 BACKGROUND: Traumatic brain injury (TBI) is a leading cause of homicide-related death in the United States. Penetrating TBI associated with firearms is a unique injury with an exceptionally high mortality rate that requires specialized neurocritical trauma care. OBJECTIVE: To report incidence patterns of firearm-related and nonfirearm-related TBI homicides in the United States between 2000 and 2019 by demographic characteristics to provide foundational data for prevention and treatment strategies. METHODS: Data were obtained from multiple cause of death records from the National Vital Statistics System using Centers for Disease Control and Prevention's Wide-Ranging Online Data for Epidemiologic Research database for the years 2000 to 2019. Number, age-adjusted rates, and percent of firearm and nonfirearm-related TBI homicides by demographic characteristics were calculated. Temporal trends were also evaluated. RESULTS: During the study period, there were 77,602 firearm-related TBI homicides. Firearms were involved in the majority (68%) of all TBI homicides. Overall, men, people living in metro areas, and non-Hispanic Black persons had higher rates of firearm-related TBI homicides. The rate of nonfirearm-related TBI homicides declined by 40%, whereas the rate of firearm-related TBI homicides only declined by 3% during the study period. There was a notable increase in the rate of firearm-related TBI homicides from 2012/2013 through 2019 for women (20%) and nonmetro residents (39%). CONCLUSION: Firearm-related violence is an important public health problem and is associated with the majority of TBI homicide deaths in the United States. The findings from this study may be used to inform prevention and guide further research to improve treatment strategies directed at reducing TBI homicides involving firearms. |
The biosafety research road map: The search for evidence to support practices in the laboratory-Shigella spp
Blacksell SD , Dhawan S , Kusumoto M , Le KK , Davis BJ , Summermatter K , O'Keefe J , Kozlovac J , Almuhairi SS , Sendow I , Scheel CM , Ahumibe A , Masuku ZM , Bennett AM , Kojima K , Harper DR , Hamilton K . Appl Biosaf 2023 28 (2) 96-101 INTRODUCTION: Shigella bacteria cause shigellosis, a gastrointestinal infection most often acquired from contaminated food or water. METHODS: In this review, the general characteristics of Shigella bacteria are described, cases of laboratory-acquired infections (LAIs) are discussed, and evidence gaps in current biosafety practices are identified. RESULTS: LAIs are undoubtedly under-reported. Owing to the low infectious dose, rigorous biosafety level 2 practices are required to prevent LAIs resulting from sample manipulation or contact with infected surfaces. CONCLUSIONS: It is recommended that, before laboratory work with Shigella, an evidence-based risk assessment be conducted. Particular emphasis should be placed on personal protective equipment, handwashing, and containment practices for procedures that generate aerosols or droplets. |
The biosafety research road map: The search for evidence to support practices in the laboratory-SARS-CoV-2
Blacksell SD , Dhawan S , Kusumoto M , Le KK , Summermatter K , O'Keefe J , Kozlovac J , Almuhairi SS , Sendow I , Scheel CM , Ahumibe A , Masuku ZM , Kojima K , Harper DR , Hamilton K . Appl Biosaf 2023 28 (2) 87-95 INTRODUCTION: The SARS-CoV-2 virus emerged as a novel virus and is the causative agent of the COVID-19 pandemic. It spreads readily human-to-human through droplets and aerosols. The Biosafety Research Roadmap aims to support the application of laboratory biological risk management by providing an evidence base for biosafety measures. This involves assessing the current biorisk management evidence base, identifying research and capability gaps, and providing recommendations on how an evidence-based approach can support biosafety and biosecurity, including in low-resource settings. METHODS: A literature search was conducted to identify potential gaps in biosafety and focused on five main sections, including the route of inoculation/modes of transmission, infectious dose, laboratory-acquired infections, containment releases, and disinfection and decontamination strategies. RESULTS: There are many knowledge gaps related to biosafety and biosecurity due to the SARS-CoV-2 virus's novelty, including infectious dose between variants, personal protective equipment for personnel handling samples while performing rapid diagnostic tests, and laboratory-acquired infections. Detecting vulnerabilities in the biorisk assessment for each agent is essential to contribute to the improvement and development of laboratory biosafety in local and national systems. |
A novel rat-tail model for studying human finger vibration health effects
Dong RG , Warren C , Xu XS , Wu JZ , Welcome DE , Waugh S , Krajnak K . Proc Inst Mech Eng H 2023 237 (7) 9544119231181246 It has been hypothesized that the biodynamic responses of the human finger tissues to vibration are among the major stimuli that cause vibration health effects. Furthermore, the finger contact pressure can alter these effects. It is difficult to test these hypotheses using human subjects or existing animal models. The objective of this study was to develop a new rat-tail vibration model to investigate the combined effects of vibration and contact pressure and to identify their relationships with the biodynamic responses. Physically, the new exposure system was developed by adding a loading device to an existing rat-tail model. An analytical model of the rat-tail exposure system was proposed and used to formulate the methods for quantifying the biodynamic responses. A series of tests with six tails dissected from rat cadavers was conducted to test and evaluate the new model. The experimental and modeling results demonstrate that the new model behaves as predicted. Unlike the previous model, the vibration strain and stress of the rat tail do not depend primarily on the vibration response of the tail itself but on that of the loading device. This makes it possible to quantify and control the biodynamic responses conveniently and reliably by measuring the loading device response. This study also identified the basic characteristics of the tail biodynamic responses in the exposure system, which can be used to help design experiments for studying the biological effects of vibration. |
Biologically synthesized zinc and copper oxide nanoparticles using Cannabis sativa L. enhance soybean (Glycine max) defense against Fusarium virguliforme
Karmous I , Vaidya S , Dimkpa C , Zuverza-Mena N , da Silva W , Barroso KA , Milagres J , Bharadwaj A , Abdelraheem W , White JC , Elmer WH . Pestic Biochem Physiol 2023 194 105486 In this study, zinc and copper oxide nanoparticles (NPs) were synthesized using hemp (Cannabis sativa L.) leaves (ZnONP-HL and CuONP-HL), and their antifungal potential was assessed against Fusarium virguliforme in soybean (Glycine max L.). Hemp was selected because it is known to contain large quantities of secondary metabolites that can potentially enhance the reactivity of NPs through surface property modification. Synthesizing NPs with biologically derived materials avoids the use of harsh and expensive synthetic reducing and capping agents. The ZnONP-HL and CuONP-HL showed average grain/crystallite sizes of 13.51 nm and 7.36 nm, respectively. The biologically synthesized NPs compared well with their chemically synthesized counterparts (ZnONP chem and CuONP chem; 18.75 nm and 10.05 nm, respectively), confirming the stabilizing role of hemp-derived biomolecules. Analysis of the hemp leaf extract and of the functional groups associated with ZnONP-HL and CuONP-HL confirmed the presence of terpenes, flavonoids, and phenolic compounds. Biosynthesized NPs were applied to soybeans as bio-nano-fungicides against F. virguliforme via foliar treatments. ZnONP-HL and CuONP-HL at 200 μg/mL significantly (p < 0.05) increased (∼50%) soybean growth compared to diseased controls. The NPs improved the nutrient (e.g., K, Ca, P) content and enhanced photosynthetic indicators of the plants by 100–200%. A 300% increase in the expression of soybean pathogenesis-related GmPR genes encoding antifungal and defense proteins confirmed that the biosynthesized NPs enhanced disease resistance against the fungal phytopathogen. The findings from this study provide novel evidence of systemic suppression of fungal disease by nanobiopesticides via promotion of plant defense mechanisms. |
Hush little baby - promise of the eat, sleep, console approach
Barfield WD . N Engl J Med 2023 388 (25) 2391-2392 “Hush little baby, don’t say a word …” This traditional lullaby is symbolic of our attempts to offer an appropriate intervention for infants with neonatal opioid withdrawal syndrome. The incidence of this condition, which affects newborns after maternal opioid exposure during pregnancy, has increased substantially in recent years,1 and more holistic approaches are being sought to support the care of mother, infant, family, and community.2 | | In this issue of the Journal, Young et al.3 report the results of a large trial assessing a nonpharmacologic strategy — the Eat, Sleep, Console approach — for the treatment of neonatal opioid withdrawal syndrome. The study compares Eat, Sleep, Console with the more traditional approach of neonatal scoring for severity of withdrawal symptoms (typically, by means of the Finnegan or Modified Finnegan Neonatal Abstinence Scoring Tool4), which may overestimate the need for medications, typically morphine. The authors tested the hypothesis that the Eat, Sleep, Console approach can reduce the time until infants are ready for hospital discharge, without introducing harm. |
Individualized education programs and transition planning for adolescents with autism
Hughes MM , Kirby AV , Davis J , Bilder DA , Patrick M , Lopez M , DaWalt LS , Pas ET , Bakian AV , Shaw KA , DiRienzo M , Hudson A , Schwenk YD , Baroud TM , Washington A , Maenner MJ . Pediatrics 2023 152 (1) OBJECTIVES: The study objectives were to examine the contents of individualized education programs (IEPs) of adolescents with autism spectrum disorder (ASD), including postsecondary transition goals, services, and changes in special education classification over time. METHODS: This study involved a longitudinal population-based surveillance cohort from the Autism Developmental Disabilities Monitoring Network from 2002 to 2018 in 3 catchment areas in the United States. The sample included 322 adolescents who were born in 2002, identified with ASD, and had an IEP available for review at ages 15-16 years. RESULTS: We found that 297 (92%) adolescents with ASD had an IEP including a transition plan. Those without intellectual disability (ID) were more likely to have postsecondary education and employment goals and have those goals be to pursue higher education or competitive employment compared with those with ID. Forty-one percent of adolescents with ASD had a postsecondary living arrangement goal. Although 28% of adolescents with ASD received school-based mental health services, none of these adolescents were Black; additionally, 15% of those with ID received mental health services compared with 34% without ID. The percentage of adolescents with ASD served under an autism classification increased from 44% at age 8 years to 62% by age 16. CONCLUSIONS: We identified gaps and disparities in school-based postsecondary transition planning. Working with education partners, families, and adolescents will be important to identify what challenges contribute to these findings and what supports are needed to improve the equity and quality of the transition planning process for adolescents with ASD so they are prepared for adulthood. |
Baylisascaris procyonis roundworm infection in child with autism spectrum disorder, Washington, USA, 2022
Lipton BA , Oltean HN , Capron RB , Hamlet A , Montgomery SP , Chancey RJ , Konold VJL , Steffl KE . Emerg Infect Dis 2023 29 (6) 1232-1235 We describe a case of Baylisascaris procyonis roundworm infection in a child in Washington, USA, with autism spectrum disorder. Environmental assessment confirmed nearby raccoon habitation and B. procyonis eggs. B. procyonis infections should be considered a potential cause of human eosinophilic meningitis, particularly among young children and persons with developmental delays. |
Fathers, breastfeeding, and infant sleep practices: Findings from a state-representative survey
Parker JJ , Simon C , Bendelow A , Bryan M , Smith RA , Kortsmit K , Salvesen von Essen B , Williams L , Dieke A , Warner L , Garfield CF . Pediatrics 2023 152 (2) OBJECTIVES: To assess infant breastfeeding initiation and any breastfeeding at 8 weeks and safe sleep practices (back sleep position, approved sleep surface, and no soft objects or loose bedding ["soft bedding"]) by select paternal characteristics among a state-representative sample of fathers with new infants. METHODS: Pregnancy Risk Assessment Monitoring System (PRAMS) for Dads, a novel population-based cross-sectional study, surveyed fathers in Georgia 2-6 months after their infant's birth. Fathers were eligible if the infant's mother was sampled for maternal PRAMS from October 2018 to July 2019. RESULTS: Of 250 respondents, 86.1% reported their infants ever breastfed and 63.4% reported breastfeeding at 8 weeks. Initiation and breastfeeding at 8 weeks were more likely to be reported by fathers who reported wanting their infant's mother to breastfeed than those who did not want her to breastfeed or had no opinion (adjusted prevalence ratio [aPR] = 1.39; 95% confidence interval [CI], 1.15-1.68; aPR = 2.33; 95% CI, 1.59-3.42, respectively) and fathers who were college graduates than those with ≤high school diploma (aPR = 1.25; 95% CI, 1.06-1.46; aPR = 1.44; 95% CI, 1.08-1.91, respectively). Although about four-fifths (81.1%) of fathers reported usually placing their infants to sleep on their back, fewer fathers reported avoiding soft bedding (44.1%) or using an approved sleep surface (31.9%). Non-Hispanic Black fathers were less likely to report back sleep position (aPR = 0.70; 95% CI, 0.54-0.90) and no soft bedding (aPR = 0.52; 95% CI, 0.30-0.89) than non-Hispanic White fathers. CONCLUSIONS: Fathers reported suboptimal infant breastfeeding rates and safe sleep practices overall and by paternal characteristics, suggesting opportunities to include fathers in promotion of breastfeeding and infant safe sleep. |
Using near-miss events to create training videos
Bellanca JL , Macdonald B , Navoyski J , Hrica JK , Orr TJ , Demich B , Hoebbel CL . Min Metall Explor 2023 [Epub ahead of print] Haul truck fatal accidents and injuries continue to be a significant concern for the mining industry. However, the availability of high-quality training materials continues to be limited. Near-miss incident accounts, if packaged well, could help fill this gap, because for every fatality, there are hundreds of reportable accidents and thousands of undocumented near misses. Researchers from the National Institute for Occupational Safety and Health (NIOSH) collected detailed accounts of 21 near-miss incidents in virtual interviews with mineworkers at surface mining operations across the country. From these interviews, researchers created four simulation videos using the Unity game engine. The simulation videos bring these events to life through first-person retelling and various visual perspectives of actual events. Each video exemplifies a critical safety message and a common haul truck hazard. This paper describes the process of taking narratives and turning them into impactful visual stories using graphic simulation. NIOSH plans to co-release these simulation videos with the Mine Safety and Health Administration (MSHA) to the mining industry to raise awareness and ultimately help reduce haul truck-related accidents and fatalities in mining. |
Fit evaluation of NIOSH Approved N95 filtering facepiece respirators with various skin protectants: a pilot study
Bergman MS , Grinshpun SA , Yermakov MV , Zhuang Z , Vollmer BE , Yoon KN . J Occup Environ Hyg 2023 20 (9) 1-10 Widespread disease outbreaks can result in prolonged wear times of National Institute for Occupational Safety and Health Approved N95 filtering facepiece respirators by healthcare personnel. Prolonged wear times of these devices can cause the development of various adverse facial skin conditions. Healthcare personnel have been reported to apply "skin protectants" to the face to reduce pressure and friction of respirators. Because tight-fitting respirators rely on a good face seal to protect the wearer, it is important to understand if fit is affected when skin protectants are used. This laboratory pilot study included 10 volunteers who performed quantitative fit tests to evaluate respirator fit while wearing skin protectants. Three N95 filtering facepiece respirator models and three skin protectants were evaluated. Three replicate fit tests were performed for each combination of subject, skin protectant (including a control condition of no protectant), and respirator model. Fit Factor (FF) was affected differently by the combination of protectant type and respirator model. The main effects of protectant type and respirator model were both significant (p < 0.001); additionally, their interaction was significant (p = 0.02), indicating FF is affected by the combined effects of protectant type and respirator model. Compared to the control condition, using a bandage-type or surgical tape skin protectant decreased the odds of passing the fit test. Using a barrier cream skin protectant also decreased the odds of passing the fit test across all models compared to the control condition; however, the probability of passing a fit test was not statistically significantly different from the control condition (p = 0.174). These results imply that all three skin protectants reduced mean fit factors for all N95 filtering facepiece respirator models tested. The bandage-type and surgical tape skin protectants both reduced fit factors and passing rates to a greater degree than the barrier cream. Respirator users should follow respirator manufacturers' guidance on the use of skin protectants. If a skin protectant is to be worn with a tight-fitting respirator, the fit of the respirator should be evaluated with the skin protectant applied before use in the workplace. |
Time series, seasonality and trend evaluation of 7 years (2015–2021) of OSHA severe injury data
Gomes H , Parasram V , Collins J , Socias-Morales C . J Safety Res 2023 86 [Epub ahead of print] Problem: Employers are required to report severe work-related injuries (e.g., amputation, inpatient hospitalization, or loss of an eye) to the Occupational Safety and Health Administration (OSHA). This study examined the OSHA severe injury reports (SIRs) public microdata to understand time-related trends and patterns. Methods: This study included all SIRs from January 2015 to December 2021 (84 months). We employed time series decomposition models (classical additive and multiplicative, X-11, and X-13ARIMA-SEATS) to evaluate the monthly seasonal effect and the seasonally adjusted trend of SIRs. We developed data visuals to display trends from the different models against the original data series. We compared the number of daily SIRs by day of the week, and yearly trends by 2-digit NAICS industry and separately by 1-digit OIICS injury event. Results: There were a total of 70,241 SIRs in this 7-year period, ranging from 8,704 to 11,156 per year and from 600 to 1,100 per month. The seasonally adjusted trend indicated a gradual increase of SIRs over time until October 2018, then a steeper decrease until August 2020, staying somewhat flat for the remaining months. Seasonality indicated more SIRs were reported in the summer months (June, July, August). Daily SIRs showed a weekday average of 34 (SD = 9) and a weekend average of 11 (SD = 5). The Manufacturing and Construction industries reported the highest yearly SIRs. Contact with objects and equipment, and falls, slips, and trips were the most numerous injury events associated with SIRs. Discussion: Although Federal OSHA SIR data do not include SIRs from state-plan jurisdictions, the data provide a timely national trend of SIRs. This is the first known time series analysis of SIRs. Practical Applications: The findings of this study highlight the ability of researchers to use the SIRs as a timely indicator to understand occupational injury trends by specific industries and injury events. |
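As a rough sketch of the classical additive decomposition the OSHA study describes (trend via centered moving average, monthly seasonal indices, seasonally adjusted series), the snippet below runs on a synthetic monthly series — the values, summer-peak shape, and 2×12 moving-average choice are illustrative assumptions, not the SIR data.

```python
import math
import random

# Synthetic 84-month series (illustrative only): declining trend + seasonal peak + noise
random.seed(1)
n, period = 84, 12
series = [900 - 2 * t + 60 * math.sin(2 * math.pi * t / 12) + random.gauss(0, 5)
          for t in range(n)]

def centered_ma(x, p):
    # 2x12 centered moving average: half-weight on the two window endpoints
    half = p // 2
    out = [None] * len(x)
    for t in range(half, len(x) - half):
        w = x[t - half:t + half + 1]
        out[t] = (0.5 * w[0] + sum(w[1:-1]) + 0.5 * w[-1]) / p
    return out

# 1) Trend component
trend = centered_ma(series, period)

# 2) Seasonal component: mean detrended value per calendar month, centered to sum to 0
detrended = [s - tr if tr is not None else None for s, tr in zip(series, trend)]
monthly = [[d for i, d in enumerate(detrended) if d is not None and i % period == m]
           for m in range(period)]
seasonal_idx = [sum(v) / len(v) for v in monthly]
mean_idx = sum(seasonal_idx) / period
seasonal_idx = [s - mean_idx for s in seasonal_idx]

# 3) Seasonally adjusted series = original minus seasonal component
adjusted = [s - seasonal_idx[i % period] for i, s in enumerate(series)]
```

The X-11 and X-13ARIMA-SEATS procedures named in the abstract refine this same idea with iterative filters and ARIMA extension of the series ends.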
Biological effects of inhaled crude oil vapor. III. Pulmonary inflammation, cytotoxicity, and gene expression profile
Sager TM , Joseph P , Umbright CM , Hubbs AF , Barger M , Kashon ML , Fedan JS , Roberts JR . Inhal Toxicol 2023 35 1-13 OBJECTIVE: Workers may be exposed to vapors emitted from crude oil in upstream operations in the oil and gas industry. Although the toxicity of crude oil constituents has been studied, there are very few in vivo investigations designed to mimic crude oil vapor (COV) exposures that occur in these operations. The goal of the current investigation was to examine lung injury, inflammation, oxidant generation, and effects on the lung global gene expression profile following a whole-body acute or sub-chronic inhalation exposure to COV. MATERIALS AND METHODS: To conduct this investigation, rats were subjected to either a whole-body acute (6 hr) or a sub-chronic (28 d) inhalation exposure (6 hr/d × 4 d/wk × 4 wk) to COV (300 ppm; Macondo well surrogate oil). Control rats were exposed to filtered air. One and 28 d after acute exposure, and 1, 28, and 90 d following sub-chronic exposure, bronchoalveolar lavage was performed on the left lung to collect cells and fluid for analyses, the apical right lobe was preserved for histopathology, and the right cardiac and diaphragmatic lobes were processed for gene expression analyses. RESULTS: No exposure-related changes were identified in histopathology, cytotoxicity, or lavage cell profiles. Changes in lavage fluid cytokines indicative of inflammation, immune function, and endothelial function after sub-chronic exposure were limited and varied over time. Minimal gene expression changes were detected only at the 28 d post-exposure time interval in both the exposure groups. CONCLUSION: Taken together, the results from this exposure paradigm, including concentration, duration, and exposure chamber parameters, did not indicate significant and toxicologically relevant changes in markers of injury, oxidant generation, inflammation, and gene expression profile in the lung. |
Emerging technology in agriculture: opportunities and considerations for occupational safety and health researchers
Lincoln JM , Elliott KC . J Safety Res 2023 86 [Epub ahead of print] Introduction: A variety of factors are driving the development of robotics and automation in the agriculture industry, including the nature of the work, workforce shortages, and a range of economic, climatic, technologic, political, and social factors. While some new robotic and automated machines are available commercially, most are still being developed. This provides occupational safety and health researchers an unprecedented opportunity to mitigate risks and enhance benefits to the health and safety of agriculture workers. Method: The NIOSH Office of Agriculture Safety and Health (OASH) is working to better understand how advancements in automation and robotics are affecting workers. OASH is coordinating with the NIOSH Center for Occupational Robotics Research (CORR) to help increase the understanding of human/machine interactions; improve the ability to identify injuries and fatalities involving automation/robotics; and provide guidance on working safely with automation/robotics. OASH also joined a small team of academic and industry partners to organize the SAfety For Emerging Robotics and Autonomous aGriculture (SAFER AG) Workshop to identify gaps in knowledge and research needs related to risks and regulations/standards, occupational safety research, and impacts on the workforce and society. This workshop was sponsored by USDA NIFA. Practical Applications: Occupational safety and health experts need to engage and collaborate with developers of technology. It is also increasingly important for occupational safety and health researchers and practitioners to become familiar not only with existing manufacturing safety standards but also with the lengthy standards development process. Joining consensus standards groups to help shape new standards for emerging technologies may help mitigate adverse worker impacts. NIOSH's Office of Agriculture Safety and Health will continue to identify research gaps and support new research projects, education, outreach efforts, and the development of best practices with our partners. |
Public health impact of the spread of Anopheles stephensi in the WHO Eastern Mediterranean Region countries in Horn of Africa and Yemen: need for integrated vector surveillance and control
Al-Eryani SM , Irish SR , Carter TE , Lenhart A , Aljasari A , Montoya LF , Awash AA , Mohammed E , Ali S , Esmail MA , Hussain A , Amran JG , Kayad S , Nouredayem M , Adam MA , Azkoul L , Assada M , Baheshm YA , Eltahir W , Hutin YJ . Malar J 2023 22 (1) 187 BACKGROUND: Anopheles stephensi is an efficient vector of both Plasmodium falciparum and Plasmodium vivax in South Asia and the Middle East. The spread of An. stephensi to countries within the Horn of Africa threatens progress in malaria control in this region as well as in the rest of sub-Saharan Africa. METHODS: The available malaria data and the timeline for the detection of An. stephensi were reviewed to analyse the role of An. stephensi in malaria transmission in the Horn of Africa countries of the Eastern Mediterranean Region (EMR): Djibouti, Somalia, Sudan, and Yemen. RESULTS: Malaria incidence in the Horn of Africa countries of the EMR and Yemen increased from 41.6 cases per 1000 in 2015 to 61.5 cases per 1000 in 2020. All four countries from this region (Djibouti, Somalia, Sudan, and Yemen) had reported the detection of An. stephensi as of 2021. In Djibouti City, following its detection in 2012, the estimated incidence increased from 2.5 cases per 1000 in 2013 to 97.6 cases per 1000 in 2020. However, its contribution to malaria transmission in other major cities and in other countries is unclear because of other factors, including the quality of the urban malaria data, human mobility, uncertainty about the actual arrival time of An. stephensi, and poor entomological surveillance. CONCLUSIONS: While An. stephensi may explain a resurgence of malaria in Djibouti, further investigations are needed to understand its contribution to urban malaria trends across the greater region. Greater investment in a multisectoral approach and in integrated surveillance and control should target all vectors, particularly malaria and dengue vectors, to guide interventions in urban areas. |
Estimating malaria transmission risk through surveillance of human-vector interactions in northern Ghana
Coleman S , Yihdego Y , Gyamfi F , Kolyada L , Tongren JE , Zigirumugabe S , Dery DB , Badu K , Obiri-Danso K , Boakye D , Szumlas D , Armistead JS , Dadzie SK . Parasit Vectors 2023 16 (1) 205 BACKGROUND: Vector bionomics are important aspects of vector-borne disease control programs. Mosquito-biting risks are affected by environmental factors, mosquito behavior, and human factors, which are important for assessing exposure risk and intervention impacts. This study estimated malaria transmission risk based on vector-human interactions in northern Ghana, where indoor residual spraying (IRS) and insecticide-treated nets (ITNs) have been deployed. METHODS: Indoor and outdoor human biting rates (HBRs) were measured using monthly human landing catches (HLCs) from June 2017 to April 2019. Mosquitoes collected were identified to species level, and Anopheles gambiae sensu lato (An. gambiae s.l.) samples were examined for parity and infectivity. The HBRs were adjusted using mosquito parity and human behavioral observations. RESULTS: Anopheles gambiae was the main vector species in both the IRS (81%) and control (83%) communities. Indoor and outdoor HBRs were similar in both the IRS intervention (10.6 vs. 11.3 bites per person per night [b/p/n]; z = -0.33, P = 0.745) and control communities (18.8 vs. 16.4 b/p/n; z = 1.57, P = 0.115). The mean proportion of parous An. gambiae s.l. was lower in IRS communities (44.6%) than in control communities (71.7%). After adjusting for human behavior observations and parity, the combined effect of IRS and ITN utilization (IRS: 37.8%; control: 57.3%) on reducing malaria transmission risk was 58% in IRS + ITN communities and 27% in control communities with ITNs alone (z = -4.07, P < 0.001). However, this also revealed that about 41% and 31% of outdoor adjusted bites in IRS and control communities, respectively, occurred before bedtime (10:00 pm). The mean directly measured annual entomologic inoculation rates (EIRs) during the study were 6.1 infective bites per person per year (ib/p/yr) for IRS communities and 16.3 ib/p/yr for control communities. After considering vector survival and observed human behavior, the estimated EIR for IRS communities was 1.8 ib/p/yr, which represents about a 70% overestimation of risk by the directly measured EIR; for control communities, it was 13.6 ib/p/yr (16% overestimation). CONCLUSION: Indoor residual spraying significantly impacted entomological indicators of malaria transmission. The results of this study indicate that vector bionomics alone do not provide an accurate assessment of malaria transmission exposure risk. By accounting for human behavior parameters, we found that high coverage of ITNs alone had less impact on malaria transmission indices than combining ITNs with IRS, likely due to observed low net use. Reinforcing effective communication for behavioral change in net use and IRS could further reduce malaria transmission. |
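The kind of adjustment the Ghana study describes can be sketched as simple arithmetic: EIR is the human biting rate times the fraction of infective bites, annualized, and is then scaled down by exposure and vector-survival factors. The biting rate below is taken from the abstract, but the sporozoite rate, exposure fraction, and parity factor are invented for illustration and are not the study's estimates.

```python
# Hypothetical EIR adjustment sketch; only the biting rate comes from the abstract.
hbr = 10.6                 # indoor bites/person/night (IRS arm, from the abstract)
sporozoite_rate = 0.0016   # assumed fraction of bites that are infective
raw_eir = hbr * sporozoite_rate * 365          # infective bites/person/year

exposure_fraction = 0.5    # assumed share of biting hours people are actually exposed
parity_factor = 0.6        # assumed survival (parity) adjustment
adjusted_eir = raw_eir * exposure_fraction * parity_factor  # behavior-adjusted EIR
```

With these made-up factors the adjusted EIR is a fraction of the raw estimate, mirroring the abstract's point that directly measured EIRs can substantially overestimate actual exposure risk.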
Similar prevalence of Plasmodium falciparum and non-P. falciparum malaria infections among schoolchildren, Tanzania(1)
Sendor R , Mitchell CL , Chacky F , Mohamed A , Mhamilawa LE , Molteni F , Nyinondi S , Kabula B , Mkali H , Reaves EJ , Serbantez N , Kitojo C , Makene T , Kyaw T , Muller M , Mwanza A , Eckert EL , Parr JB , Lin JT , Juliano JJ , Ngasala B . Emerg Infect Dis 2023 29 (6) 1143-1153 Achieving malaria elimination requires considering both Plasmodium falciparum and non-P. falciparum infections. We determined prevalence and geographic distribution of 4 Plasmodium spp. by performing PCR on dried blood spots collected within 8 regions of Tanzania during 2017. Among 3,456 schoolchildren, 22% had P. falciparum, 24% had P. ovale spp., 4% had P. malariae, and 0.3% had P. vivax infections. Most (91%) schoolchildren with P. ovale infections had low parasite densities; 64% of P. ovale infections were single-species infections, and 35% of those were detected in low malaria endemic regions. P. malariae infections were predominantly (73%) co-infections with P. falciparum. P. vivax was detected mostly in northern and eastern regions. Co-infections with >1 non-P. falciparum species occurred in 43% of P. falciparum infections. A high prevalence of P. ovale infections exists among schoolchildren in Tanzania, underscoring the need for detection and treatment strategies that target non-P. falciparum species. |
Essential public health functions are not enough: fostering linkages between functions through National Public Health Institutes improves public health impact
Zuber A , Pearson J , Sebeh Y , Jarvis D , Bratton S . BMJ Glob Health 2023 8 (6) COVID-19 has highlighted the importance of essential public health functions (EPHFs) and the coordination between them. The US Centers for Disease Control and Prevention defines EPHFs as 'the public health activities that all communities should undertake'. According to multiple functional frameworks published in literature, the functions typically include workforce development, surveillance, public health research, laboratory services, health promotion, outbreak response and emergency management. National Public Health Institutes (NPHIs) are often the lead government agency responsible for execution of these functions. This paper describes how NPHIs or other health authorities can improve public health impact by enhancing the coordination of public health functions and public health actors through functional and organisational linkages. We define public health linkages as practical, replicable activities that facilitate collaboration between public health functions or organisations to improve public health. In this paper, we propose a novel typology to categorise important public health linkages and describe enablers of linkages identified through our research. Based on our research, investments in health systems should move beyond vertical approaches to developing public health capacity and place greater emphasis on strengthening the interactions between public health functions and institutions. Development of linkages and their enablers requires a purposeful, proactive focus that establishes and strengthens linkages over time and cannot be developed during an outbreak or other public health emergency.
Validating Wave 1 (2014) urinary cotinine and TNE-2 cut-points for differentiating Wave 4 (2017) cigarette use from non-use in the US using data from the PATH Study
Edwards KC , Khan A , Sharma E , Wang L , Feng J , Blount BC , Sosnoff CS , Smith DM , Goniewicz ML , Pearson J , Villanti AC , Delnevo CD , Bover Manderski MT , Hatsukami DK , Niaura R , Everard C , Kimmel HL , Duffy K , Rostron BL , Del Valle-Pinero AY , van Bemmel DM , Stanton CA , Hyland A . Cancer Epidemiol Biomarkers Prev 2023 32 (9) 1233-1241 BACKGROUND: Sex- and racial/ethnic identity-specific cut-points for validating tobacco use using Wave 1 (W1) of the PATH Study were published in 2020. The current study establishes predictive validity of the W1 (2014) urinary cotinine and Total Nicotine Equivalents-2 (TNE-2) cut-points on estimating Wave 4 (W4; 2017) tobacco use. METHODS: For exclusive and polytobacco cigarette use, weighted prevalence estimates based on W4 self-report alone and with exceeding the W1 cut-point were calculated to identify the percentage missed without biochemical verification. Sensitivity and specificity of W1 cut-points on W4 self-reported tobacco use status were examined. Receiver operating characteristic (ROC) curves were used to determine the optimal W4 cut-points to distinguish past-30-day (P30D) users from non-users, and to evaluate whether the cut-points significantly differed from W1. RESULTS: Agreement between W4 self-reported use and exceeding the W1 cut-points was high overall and when stratified by demographic subgroups (0.7%-4.4% of use was missed when relying on self-report alone). The predictive validity of using the W1 cut-points to classify exclusive cigarette and polytobacco cigarette use at W4 was high (>90% sensitivity and specificity, except among polytobacco Hispanic smokers). Cut-points derived using W4 data did not significantly differ from the W1-derived cut-points (e.g., W1 exclusive = 40.5 ng/mL cotinine [95% CI: 26.1-62.8], W4 exclusive = 29.9 ng/mL cotinine [95% CI: 13.5-66.4]) among most demographic subgroups. CONCLUSION: The W1 cut-points remain valid for biochemical verification of self-reported tobacco use in W4.
IMPACT: Findings from this study can be used in clinical and epidemiological studies to reduce misclassification of cigarette smoking status.
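The validation approach above reduces to standard sensitivity/specificity arithmetic: classify each participant by whether their W1 biomarker level exceeds the cut-point, then compare against W4 self-reported status. A minimal sketch of that comparison; the 40.5 ng/mL cut-point is from the abstract, but the participant records below are hypothetical, for illustration only:

```python
COTININE_CUTPOINT_NG_ML = 40.5  # W1 exclusive-use cut-point from the abstract

def classify(cotinine_ng_ml: float) -> bool:
    """True if the biomarker level indicates current use."""
    return cotinine_ng_ml > COTININE_CUTPOINT_NG_ML

def sensitivity_specificity(records):
    """records: iterable of (cotinine_ng_ml, self_reported_user) pairs.
    Returns (sensitivity, specificity) of the cut-point against self-report."""
    tp = fn = tn = fp = 0
    for cotinine, is_user in records:
        predicted = classify(cotinine)
        if is_user:
            tp += predicted
            fn += not predicted
        else:
            tn += not predicted
            fp += predicted
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical participants: (cotinine ng/mL, self-reported P30D use)
sample = [(350.0, True), (85.0, True), (12.0, True),
          (2.1, False), (0.4, False), (55.0, False)]
sens, spec = sensitivity_specificity(sample)  # 2/3 and 2/3 for this toy sample
```

The study's ROC analysis then sweeps the cut-point to find the value that best trades sensitivity against specificity in each demographic subgroup.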
Clinical characteristics and outcomes among travelers with severe dengue: a GeoSentinel analysis
Huits R , Angelo KM , Amatya B , Barkati S , Barnett ED , Bottieau E , Emetulu H , Epelboin L , Eperon G , Medebb L , Gobbi F , Grobusch MP , Itani O , Jordan S , Kelly P , Leder K , Díaz-Menéndez M , Okumura N , Rizwan A , Rothe C , Saio M , Waggoner J , Yoshimura Y , Libman M , Hamer DH , Schwartz E . Ann Intern Med 2023 176 (7) 940-948 BACKGROUND: Dengue virus is a flavivirus transmitted by Aedes mosquitoes and is an important cause of illness worldwide. Data on the severity of travel-associated dengue illness are limited. OBJECTIVE: To describe the epidemiology, clinical characteristics, and outcomes among international travelers with severe dengue or dengue with warning signs as defined by the 2009 World Health Organization classification (that is, complicated dengue). DESIGN: Retrospective chart review and analysis of travelers with complicated dengue reported to GeoSentinel from January 2007 through July 2022. SETTING: 20 of 71 international GeoSentinel sites. PATIENTS: Returning travelers with complicated dengue. MEASUREMENTS: Routinely collected surveillance data plus chart review with abstraction of clinical information using predefined grading criteria to characterize the manifestations of complicated dengue. RESULTS: Of 5958 patients with dengue, 95 (2%) had complicated dengue. Eighty-six (91%) patients had a supplemental questionnaire completed. Eighty-five of 86 (99%) patients had warning signs, and 27 (31%) were classified as severe. Median age was 34 years (range, 8 to 91 years); 48 (56%) were female. Patients acquired dengue most frequently in the Caribbean (n = 27 [31%]) and Southeast Asia (n = 21 [24%]). Frequent reasons for travel were tourism (46%) and visiting friends and relatives (32%). Twenty-one of 84 (25%) patients had comorbidities. Seventy-eight (91%) patients were hospitalized. One patient died of a non-dengue-related illness.
Common laboratory findings and signs were thrombocytopenia (78%), elevated aminotransferases (62%), bleeding (52%), and plasma leakage (20%). Among severe cases, ophthalmologic pathology (n = 3), severe liver disease (n = 3), myocarditis (n = 2), and neurologic symptoms (n = 2) were reported. Of 44 patients with serologic data, 32 confirmed cases were classified as primary dengue (IgM+/IgG-) and 12 as secondary (IgM-/IgG+) dengue. LIMITATIONS: Data for some variables could not be retrieved by chart review for some patients. The generalizability of our observations may be limited. CONCLUSION: Complicated dengue is relatively rare in travelers. Clinicians should monitor patients with dengue closely for warning signs that may indicate progression to severe disease. Risk factors for developing complications of dengue in travelers need further prospective study. PRIMARY FUNDING SOURCE: Centers for Disease Control and Prevention, International Society of Travel Medicine, Public Health Agency of Canada, and GeoSentinel Foundation.
Investigating the etiology of acute febrile illness: a prospective clinic-based study in Uganda
Kigozi BK , Kharod GA , Bukenya H , Shadomy SV , Haberling DL , Stoddard RA , Galloway RL , Tushabe P , Nankya A , Nsibambi T , Mbidde EK , Lutwama JJ , Perniciaro JL , Nicholson WL , Bower WA , Bwogi J , Blaney DD . BMC Infect Dis 2023 23 (1) 411 BACKGROUND: Historically, malaria has been the predominant cause of acute febrile illness (AFI) in sub-Saharan Africa. However, during the last two decades, malaria incidence has declined due to concerted public health control efforts, including the widespread use of rapid diagnostic tests, leading to increased recognition of non-malarial AFI etiologies. Our understanding of non-malarial AFI is limited due to lack of laboratory diagnostic capacity. We aimed to determine the etiology of AFI in three distinct regions of Uganda. METHODS: We conducted a prospective clinic-based study, enrolling participants from April 2011 to January 2013 and using standard diagnostic tests. Participant recruitment was from St. Paul's Health Centre (HC) IV, Ndejje HC IV, and Adumi HC IV in the western, central and northern regions, which differ by climate, environment, and population density. A Pearson's chi-square test was used to evaluate categorical variables, while a two-sample t-test and Kruskal-Wallis test were used for continuous variables. RESULTS: Of the 1281 participants, 450 (35.1%), 382 (29.8%), and 449 (35.1%) were recruited from the western, central, and northern regions, respectively. The median age (range) was 18 (2-93) years; 717 (56%) of the participants were female. At least one AFI pathogen was identified in 1054 (82.3%) participants; one or more non-malarial AFI pathogens were identified in 894 (69.8%) participants. The non-malarial AFI pathogens identified were chikungunya virus, 716 (55.9%); Spotted Fever Group rickettsia (SFGR), 336 (26.2%) and Typhus Group rickettsia (TGR), 97 (7.6%); typhoid fever (TF), 74 (5.8%); West Nile virus, 7 (0.5%); dengue virus, 10 (0.8%) and leptospirosis, 2 (0.2%) cases.
No cases of brucellosis were identified. Malaria was diagnosed either concurrently or alone in 404 (31.5%) and 160 (12.5%) participants, respectively. In 227 (17.7%) participants, no cause of infection was identified. There were statistically significant differences in the occurrence and distribution of TF, TGR and SFGR, with TF and TGR observed more frequently in the western region (p = 0.001; p < 0.001), while SFGR was observed more frequently in the northern region (p < 0.001). CONCLUSION: Malaria, arboviral infections, and rickettsioses are major causes of AFI in Uganda. Development of a multiplexed point-of-care test would help identify the etiology of non-malarial AFI in regions with high AFI rates.
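The regional comparisons above rest on Pearson's chi-square test applied to contingency tables of pathogen counts by region. A minimal pure-Python sketch for a 2x2 table using the shortcut formula; the counts below are hypothetical, not the study's data:

```python
def chi2_2x2(a: int, b: int, c: int, d: int) -> float:
    """Pearson chi-square statistic for the 2x2 contingency table
    [[a, b], [c, d]], via n*(ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d))."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: pathogen-positive vs. -negative participants,
# western region vs. all other regions
stat = chi2_2x2(40, 410, 34, 797)
# Compare against the chi-square critical value for df = 1 (3.84 at p = 0.05)
significant = stat > 3.84
```

For larger tables (e.g., three regions) the general form sums (observed - expected)^2 / expected over all cells; in practice a library routine such as SciPy's `chi2_contingency` would be used rather than hand-rolled code.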
Risk for infection in humans after exposure to birds infected with highly pathogenic avian influenza A(H5N1) virus, United States, 2022
Kniss K , Sumner KM , Tastad KJ , Lewis NM , Jansen L , Julian D , Reh M , Carlson E , Williams R , Koirala S , Buss B , Donahue M , Palm J , Kollmann L , Holzbauer S , Levine MZ , Davis T , Barnes JR , Flannery B , Brammer L , Fry A . Emerg Infect Dis 2023 29 (6) 1215-1219 During February 7-September 3, 2022, a total of 39 US states experienced outbreaks of highly pathogenic avian influenza A(H5N1) virus in birds from commercial poultry farms and backyard flocks. Among persons exposed to infected birds, highly pathogenic avian influenza A(H5) viral RNA was detected in 1 respiratory specimen from 1 person.
Probable transmission of SARS-CoV-2 from African lion to zoo employees, Indiana, USA, 2021
Siegrist AA , Richardson KL , Ghai RR , Pope B , Yeadon J , Culp B , Behravesh CB , Liu L , Brown JA , Boyer LV . Emerg Infect Dis 2023 29 (6) 1102-1108 We describe animal-to-human transmission of SARS-CoV-2 in a zoo setting in Indiana, USA. A vaccinated African lion with physical limitations requiring hand feeding tested positive for SARS-CoV-2 after onset of respiratory signs. Zoo employees were screened, monitored prospectively for onset of symptoms, then rescreened as indicated; results were confirmed by using reverse transcription PCR and whole-genome virus sequencing when possible. Traceback investigation narrowed the source of infection to 1 of 6 persons. Three exposed employees subsequently had onset of symptoms, 2 with viral genomes identical to the lion's. Forward contact tracing investigation confirmed probable lion-to-human transmission. Close contact with large cats is a risk factor for bidirectional zoonotic SARS-CoV-2 transmission that should be considered when occupational health and biosecurity practices at zoos are designed and implemented. SARS-CoV-2 rapid testing and detection methods for big cats and other susceptible animals should be developed and validated to enable timely implementation of One Health investigations.
Content Index (Archived Edition)
- Antimicrobial Resistance and Antibiotic Stewardship
- Chronic Diseases and Conditions
- Communicable Diseases
- Disaster Preparedness and Emergency Services
- Environmental Health
- Epidemiology and Surveillance
- Food Safety
- Genetics and Genomics
- Health Economics
- Health Equity and Health Disparities
- Immunity and Immunization
- Injury and Violence
- Laboratory Sciences
- Maternal and Child Health
- Occupational Safety and Health
- Parasitic Diseases
- Public Health Leadership and Management
- Substance Use and Abuse
- Zoonotic and Vectorborne Diseases
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed:Feb 1, 2024
- Page last updated:Apr 22, 2024
- Powered by CDC PHGKB Infrastructure