Last data update: Jan 13, 2025. (Total: 48,570 publications since 2009)
Records 1-30 (of 36 Records)
Query Trace: Bates M [original query]
Elevating professional well-being in healthcare
Carson W, Bates M, Howard J. Nurs Manage 2024 55 (8) 7-12. A crosswalk of the NIOSH Impact Wellbeing™ campaign and the ANCC Pathway to Excellence® Framework.
Assessing the impact of COVID-19 on HIV outcomes in the United States: A modeling study
Viguerie A, Jacobson EU, Hicks KA, Bates L, Carrico J, Honeycutt A, Lyles C, Farnham PG. Sex Transm Dis 2024. BACKGROUND: The COVID-19 pandemic impacted sexual behaviors and the HIV continuum of care in the United States, reducing HIV testing and diagnosis and use of pre-exposure prophylaxis (PrEP) and antiretroviral therapy (ART). We aim to understand the future implications of these effects through a modeling study. METHODS: We first ran our compartmental model of HIV transmission in the US, accounting for pandemic-related short-term changes in transmission behavior and HIV prevention and care provision in 2020-2021 only. We then ran a comparison scenario that did not apply pandemic effects but assumed a continuation of past HIV prevention and care trends. We compared results from the two scenarios through 2024. RESULTS: HIV incidence was 4.4% lower in 2020-21 in the pandemic scenario than in the no-pandemic scenario because of reduced levels of transmission behavior, despite the pandemic's reductions in HIV prevention and care. However, reduced care led to less viral load suppression among people with HIV (PWH) in 2020, and, in turn, incidence in our model was 2.0% greater from 2022-24 in the pandemic scenario than in the no-pandemic scenario. DISCUSSION: Disruptions in HIV prevention and care services during COVID-19 may lead to somewhat higher post-pandemic HIV incidence than if pre-pandemic trends in HIV care and prevention had continued. These results underscore the importance of continuing to increase HIV prevention and care efforts in the coming years.
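The scenario comparison above reduces to a percent difference in cumulative incidence between two model runs. A minimal sketch of that calculation follows; the incidence numbers are invented placeholders, not the study's outputs, and are chosen only to illustrate the arithmetic.

```python
# Hypothetical sketch: comparing annual HIV incidence between two model
# scenarios, as in the pandemic vs. no-pandemic comparison above.
# All numbers are illustrative placeholders, not the study's results.

pandemic = {2020: 32_000, 2021: 33_000, 2022: 35_500, 2023: 35_000, 2024: 34_500}
no_pandemic = {2020: 34_000, 2021: 34_000, 2022: 34_800, 2023: 34_300, 2024: 33_800}

def pct_diff(scenario, baseline, years):
    """Percent difference in cumulative incidence over a span of years."""
    s = sum(scenario[y] for y in years)
    b = sum(baseline[y] for y in years)
    return 100 * (s - b) / b

print(f"2020-21: {pct_diff(pandemic, no_pandemic, range(2020, 2022)):+.1f}%")
print(f"2022-24: {pct_diff(pandemic, no_pandemic, range(2022, 2025)):+.1f}%")
```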
Laboratory Analysis of an Outbreak of Candida auris in New York from 2016 to 2018: Impact and Lessons Learned (preprint)
Zhu Y, O'Brien B, Leach L, Clark A, Bates M, Adams E, Ostrowsky B, Quinn M, Dufort E, Southwick K, Erazo R, Haley VB, Bucher C, Chaturvedi V, Limberger RJ, Blog D, Lutterloh E, Chaturvedi S. bioRxiv 2019 760090. Candida auris is a multidrug-resistant yeast which has emerged in healthcare facilities worldwide; however, little is known about identification methods, patient colonization, spread, environmental survival, and drug resistance. Colonization of both biotic and abiotic surfaces and travel appear to be the major factors for the spread of this pathogen across the globe. In this investigation, we present laboratory findings from an ongoing C. auris outbreak in NY from August 2016 through 2018. A total of 540 clinical isolates, 11,035 patient surveillance specimens, and 3,672 environmental surveillance samples were analyzed. Laboratory methods included matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) for yeast isolate identification, real-time PCR for rapid surveillance sample screening, culture on selective/non-selective media for recovery of C. auris and other yeasts from surveillance samples, antifungal susceptibility testing to determine the C. auris resistance profile, and Sanger sequencing of ribosomal genes for C. auris genotyping. Results included: a) identification and confirmation of C. auris in 413 clinical isolates and 931 patient surveillance isolates, as well as identification of 277 clinical cases and 350 colonized cases from 151 healthcare facilities, including 59 hospitals, 92 nursing homes, 1 long-term acute care hospital (LTACH), and 2 hospices; b) successful utilization of an in-house-developed C. auris real-time PCR assay for the rapid screening of patient and environmental surveillance samples; c) demonstration of relatively heavier colonization of C. auris in the nares compared with the axilla/groin; and d) predominance of the South Asia clade I with intrinsic resistance to fluconazole and elevated minimum inhibitory concentrations (MICs) for voriconazole (81%), amphotericin B (61%), 5-FC (3%), and echinocandins (1%). These findings reflect greater regional prevalence and incidence of C. auris and the deployment of better detection tools in an unprecedented outbreak.
Effectiveness of the COVID-19 vaccines on preventing symptomatic SARS-CoV-2 infections and hospitalizations in Southwestern Alaska, January-December 2021
Lefferts B, Bruden D, Plumb ID, Hodges E, Bates E, January G, Bruce MG. Vaccine 2023. The population in rural southwest Alaska has been disproportionately affected by COVID-19. To assess the benefit of COVID-19 vaccines, we analyzed data from the regional health system. We estimated vaccine effectiveness (VE) during January 16-December 3, 2021, against symptomatic SARS-CoV-2 infection after a primary series or booster dose, and overall VE against hospitalization. VE of a primary series against symptomatic infection among adult residents was 91.3% (95% CI: 85.7-95.2) during January 16-May 7, 2021; 50.3% (95% CI: 41.1-58.8) during July 17-September 24, 2021; and 37.0% (95% CI: 27.8-45.0) during September 25-December 3, 2021. VE of a booster dose during September 25-December 3, 2021, was 92.1% (95% CI: 87.2-95.2). During the overall study period, VE against hospitalization was 91.9% (95% CI: 85.4-95.5). COVID-19 vaccination offered strong protection against hospitalization, and a booster dose restored protection against symptomatic infection.
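For readers unfamiliar with the VE arithmetic, a cohort-style estimate is VE = (1 - RR) x 100. The sketch below uses invented counts and a crude Wald interval on the log risk ratio; the study's estimates come from the regional health system's data and adjusted analyses, not from this calculation.

```python
# Illustrative cohort-style vaccine effectiveness (VE) estimate.
# Counts are made up; real VE studies adjust for confounding.
import math

def vaccine_effectiveness(cases_vax, n_vax, cases_unvax, n_unvax):
    risk_vax = cases_vax / n_vax
    risk_unvax = cases_unvax / n_unvax
    rr = risk_vax / risk_unvax
    # 95% CI for the risk ratio on the log scale (Wald interval)
    se = math.sqrt(1/cases_vax - 1/n_vax + 1/cases_unvax - 1/n_unvax)
    lo, hi = (math.exp(math.log(rr) + z * se) for z in (-1.96, 1.96))
    return 100 * (1 - rr), 100 * (1 - hi), 100 * (1 - lo)  # VE, CI low, CI high

ve, ci_lo, ci_hi = vaccine_effectiveness(cases_vax=30, n_vax=5000,
                                         cases_unvax=70, n_unvax=1000)
print(f"VE = {ve:.1f}% (95% CI: {ci_lo:.1f}-{ci_hi:.1f})")
```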
Patient flow time data of COVID-19 vaccination clinics in 23 sites, United States, April and May 2021
Cho BH, Athar HM, Bates LG, Yarnoff BO, Harris LQ, Washington ML, Jones-Jack NH, Pike JJ. Vaccine 2022 41 (3) 750-755. INTRODUCTION: Public health department (PHD)-led COVID-19 vaccination clinics can be a critical component of pandemic response because they facilitate high volumes of vaccination. However, few patient-time analyses examining patient throughput at mass vaccination clinics with unique COVID-19 vaccination challenges have been published. METHODS: During April and May of 2021, 521 patients at 23 COVID-19 vaccination sites in counties of 6 states were followed to measure the time spent from entry to vaccination. The total time was summarized and tabulated by clinic characteristics. A multivariate linear regression analysis was conducted to evaluate the association between vaccination clinic settings and patient waiting times in the clinic. RESULTS: The average time a patient spent in the clinic from entry to vaccination was 9 min 5 s (range: 2:00-23:39). Longer patient flow times were observed in clinics with higher numbers of doses administered, 6 or fewer vaccinators, walk-in patients accepted, dedicated services for people with disabilities, and drive-through clinics. The multivariate linear regression showed that longer patient waiting times were significantly associated with the number of vaccine doses administered, dedicated services for people with disabilities, the availability of more than one brand of vaccine, and rurality. CONCLUSIONS: Given the standardized procedures outlined by immunization guidelines, reducing wait time is critical to lowering patient flow time by relieving bottlenecks in the clinic. Our study suggests that the efficiency of PHD-led vaccination clinics can be enhanced by giving vaccinators proper and timely support, such as training or delivery of necessary supplies and paperwork. In addition, patient wait time can be spent answering questions about vaccination or reviewing educational materials on other public health services.
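A minimal sketch of the underlying patient-flow measurement, elapsed time from clinic entry to vaccination, summarized across observed patients. The timestamps are invented for illustration; the study followed 521 patients across 23 sites.

```python
# Sketch: per-patient time from clinic entry to vaccination.
from datetime import datetime
from statistics import mean

observations = [          # (entry, vaccinated), hypothetical timestamps
    ("09:01:10", "09:08:45"),
    ("09:03:30", "09:16:02"),
    ("09:05:00", "09:09:20"),
]

def elapsed_seconds(entry, vaccinated, fmt="%H:%M:%S"):
    return (datetime.strptime(vaccinated, fmt)
            - datetime.strptime(entry, fmt)).total_seconds()

times = [elapsed_seconds(e, v) for e, v in observations]
avg = mean(times)
print(f"mean flow time: {int(avg // 60)} min {int(avg % 60)} s "
      f"(range: {min(times)/60:.1f}-{max(times)/60:.1f} min)")
```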
The Martinsburg Initiative: A collaboration between public safety, public health, and schools to address trauma and substance use
Wisdom AC, Villamil V, Govindu M, Kursey M, Peppard L, Bates RA, Myrick A, Snyder C, Noonan RK. J Public Health Manag Pract 2022 28 S355-S358. The Martinsburg Initiative (TMI) is a community-based model developed in Martinsburg, West Virginia, that implements a comprehensive approach to adverse childhood experiences and substance use prevention and mitigation by leveraging partnerships in public health and health care, public safety, and education. TMI receives coordinated federal funding and technical assistance from the Centers for Disease Control and Prevention, the Washington-Baltimore High Intensity Drug Trafficking Area, and the National Association of County and City Health Officials to integrate evidence-based and promising strategies. It advances such strategies by translating them for implementation within the community, evaluating the reach and potential impact of the model, and engaging key stakeholders. Preliminary results describing program reach and short-term outcomes collected for a subset of the interventions during implementation are presented. The model uses touchpoints across multiple community sectors in the city of Martinsburg to break the cycle of trauma and substance use across the life span.
Assessment of the Costs of Implementing COVID-19 Vaccination Clinics in 34 Sites, United States, March 2021
Yarnoff BO, Pike JJ, Athar HM, Bates LG, Tayebali ZA, Harris LQ, Jones-Jack NH, Washington ML, Cho BH. J Public Health Manag Pract 2022 28 (6) 624-630. OBJECTIVES: To estimate the costs to implement public health department (PHD)-run COVID-19 vaccination clinics. DESIGN: Retrospective collection of reported data on COVID-19 vaccination clinic characteristics and resources used during a high-demand day in March 2021. These resources were combined with national average wages, supply costs, and facility costs to estimate the operational cost and start-up cost of clinics. SETTING: Thirty-four PHD-run COVID-19 vaccination clinics across 8 states and 1 metropolitan statistical area. PARTICIPANTS: Clinic managers at 34 PHD-run COVID-19 vaccination clinics. INTERVENTION: Large-scale COVID-19 vaccination clinics were implemented by public health agencies as part of the pandemic response. MAIN OUTCOMES MEASURED: Operational cost per day, operational cost per vaccination, and start-up cost per clinic. RESULTS: Median operational cost per day for a clinic was $10,314 (range: $637-$95,163) and median cost per vaccination was $38 (range: $9-$206). There was a large range of operational costs across clinics. Clinics used an average of 99 total staff hours per 100 patients vaccinated. Median start-up cost per clinic was $15,348 (range: $1,409-$165,190). CONCLUSIONS: Results show that clinics require a large range of resources to meet the high-throughput needs of the COVID-19 pandemic response. Estimating the costs of PHD-run vaccination clinics for the pandemic response is essential for ensuring that resources are available for clinic success. If clinics are not adequately supported, they may stop functioning, which would slow the pandemic response if no other setting or approach is possible.
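A hedged sketch of the cost build-up this study describes: staff time valued at average wages, plus supply and facility costs, divided by doses administered. All input values below are hypothetical, not the paper's data.

```python
# Hypothetical clinic-day cost build-up; every number is illustrative.

staff_hours = {"nurse": 80.0, "admin": 40.0, "manager": 10.0}   # hours/day
hourly_wage = {"nurse": 38.0, "admin": 22.0, "manager": 45.0}   # USD/hour
supply_cost_per_day = 900.0
facility_cost_per_day = 600.0
doses_per_day = 400

labor = sum(staff_hours[role] * hourly_wage[role] for role in staff_hours)
operational_cost_per_day = labor + supply_cost_per_day + facility_cost_per_day
print(f"operational cost/day: ${operational_cost_per_day:,.0f}")
print(f"cost per vaccination: ${operational_cost_per_day / doses_per_day:.2f}")
```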
Antigen Test Positivity After COVID-19 Isolation - Yukon-Kuskokwim Delta Region, Alaska, January-February 2022
Lefferts B, Blake I, Bruden D, Hagen MB, Hodges E, Kirking HL, Bates E, Hoeldt A, Lamont B, Saydah S, MacNeil A, Bruce MG, Plumb ID. MMWR Morb Mortal Wkly Rep 2022 71 (8) 293-298. Isolation is recommended during acute infection with SARS-CoV-2, the virus that causes COVID-19, but the duration of infectiousness varies among individual persons. Rapid antigen test results have been correlated with detection of viable virus (1-3) and might inform isolation guidance, but data are limited for the recently emerged SARS-CoV-2 B.1.1.529 (Omicron) variant. On January 5, 2022, the Yukon-Kuskokwim Health Corporation (YKHC) recommended that persons with SARS-CoV-2 infection isolate for 10 days after symptom onset (or, for asymptomatic persons, 10 days after a positive nucleic acid amplification or antigen test result). However, isolation could end after 5-9 days if symptoms were resolving or absent, fever was absent for 24 hours without fever-reducing medications, and an Abbott BinaxNOW COVID-19 Ag (BinaxNOW) rapid antigen test result was negative. Antigen test results and associated individual characteristics were analyzed among 3,502 infections reported to YKHC during January 1-February 9, 2022. After 5-9 days, 396 of 729 persons evaluated (54.3%) had a positive antigen test result, with a declining percentage positive over time. In a multivariable model, a positive antigen test result was more likely after 5 days than after 9 days (adjusted odds ratio [aOR] = 6.39) or after symptomatic infection (aOR = 9.63), and less likely after previous infection (aOR = 0.30), receipt of a primary COVID-19 vaccination series (aOR = 0.60), or both previous infection and receipt of a primary COVID-19 vaccination series (aOR = 0.17). Antigen tests might be a useful tool to guide recommendations for isolation after SARS-CoV-2 infection. During the 10 days after infection, persons might be infectious to others and are recommended to wear a well-fitting mask when around others, even if ending isolation after 5 days.
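As a worked illustration of the odds-ratio scale used above, the sketch below computes a crude 2x2 odds ratio with a Woolf confidence interval. The counts are invented, and the report's aORs come from a multivariable model rather than this unadjusted calculation.

```python
# Illustrative 2x2 odds ratio: "positive antigen test at day 5 vs. day 9".
import math

def odds_ratio(a, b, c, d):
    """OR for exposed (a positive, b negative) vs. unexposed (c positive,
    d negative), with a Woolf 95% confidence interval."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    return (or_,
            math.exp(math.log(or_) - 1.96 * se),
            math.exp(math.log(or_) + 1.96 * se))

# day 5: 120 positive / 60 negative; day 9: 25 positive / 80 negative (made up)
or_, lo, hi = odds_ratio(120, 60, 25, 80)
print(f"OR = {or_:.2f} (95% CI: {lo:.2f}-{hi:.2f})")
```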
Updated Estimates of the Number of Men Who Have Sex With Men (MSM) With Indications for HIV Pre-exposure Prophylaxis
Bates L, Honeycutt A, Bass S, Green TA, Farnham PG. J Acquir Immune Defic Syndr 2021 88 (4) e28-e30. In 2018, the U.S. Public Health Service (USPHS) published updated clinical guidelines for the use of preexposure prophylaxis (PrEP) to reduce the risk of HIV infection among men who have sex with men (MSM), heterosexual women and men, and persons who inject drugs [1]. PrEP is one of the main tools being used to achieve the Ending the HIV Epidemic in the U.S. incidence-reduction goals [2]. Thus, policy makers need accurate estimates of the number of U.S. adults with indications for PrEP.
Frequency of early intervention sessions and vocabulary skills in children with hearing loss
Wiggin M, Sedey AL, Yoshinaga-Itano C, Mason CA, Gaffney M, Chung W. J Clin Med 2021 10 (21). Background: A primary goal of early intervention is to assist children in achieving age-appropriate language skills. The amount of intervention a child receives is ideally based on his or her individual needs, yet it is unclear if language ability impacts the amount of intervention and/or if an increased frequency of intervention sessions results in better outcomes. The purpose of this study was to determine the relationship between the frequency of early intervention sessions and vocabulary outcomes in young children with hearing loss. Methods: This was a longitudinal study of 210 children 9 to 36 months of age with bilateral hearing loss living in 12 different states. Expressive vocabulary skills were evaluated using the MacArthur-Bates Communicative Development Inventories. Results: A higher number of intervention sessions reported at the first assessment predicted better vocabulary scores at the second assessment, and more sessions reported at the second assessment predicted better scores at the third assessment. For each increase in the number of sessions reported, there was a corresponding positive increase in vocabulary quotient. In contrast, children's vocabulary ability at an earlier time point did not predict intervention session frequency at a later point in time. Conclusions: A significant prospective effect was apparent, with more therapy sessions resulting in improved vocabulary scores 9 months later. These findings underscore the importance of early intervention. Pediatricians and other health care professionals can help apply these findings by counseling parents regarding the value of frequent and consistent participation in early intervention.
Temporal Trends in Dietary Sodium Intake Among Adults Aged ≥19 Years - United States, 2003-2016
Clarke LS, Overwyk K, Bates M, Park S, Gillespie C, Cogswell ME. MMWR Morb Mortal Wkly Rep 2021 70 (42) 1478-1482. Hypertension, which can be brought on by excess sodium intake, affects nearly one half of U.S. adults and is a major risk factor for heart disease, the leading cause of death in the United States (1). In 2019, the National Academies of Sciences, Engineering, and Medicine (NASEM) established the Chronic Disease Risk Reduction (CDRR) intake, a chronic-disease-specific recommendation for dietary sodium of 2,300 mg/day. Reducing daily sodium to the CDRR intake is expected to reduce chronic disease risk among healthy persons, primarily by lowering blood pressure (2). Although the 2019 sodium CDRR intake is equivalent in number to the 2005 Tolerable Upper Intake Level (UL) released by NASEM (then known as the Institute of Medicine), the UL was intended to provide guidance on safe intake levels, not to serve as an intake goal (2). To describe excess sodium intake in the context of the CDRR intake goal, this report analyzed National Health and Nutrition Examination Survey (NHANES) data from 2003 to 2016 to yield temporal trends in usual sodium intake >2,300 mg/day and in mean sodium intake, unadjusted and adjusted for total energy intake, among U.S. adults aged ≥19 years. The percentage of U.S. adults with sodium intake above the CDRR intake was 87.0% during 2003-2004 and 86.7% during 2015-2016. Among U.S. adults overall, no significant linear trend was noted from 2003 to 2016 in unadjusted or energy intake-adjusted mean sodium intake. Small, significant declines were observed in mean usual sodium intake among some groups (adults aged 19-50 years, non-Hispanic White adults, adults experiencing obesity, and adults without hypertension). However, after energy adjustment, only adults aged ≥71 years and Mexican American adults demonstrated significant change in usual sodium intake. Many U.S. adults might be at risk for chronic disease associated with sodium intake above the CDRR intake, and efforts to lower sodium intake could improve population cardiovascular health. The results of this report support enhanced efforts to reduce population sodium intake and cardiovascular disease risk, including the Food and Drug Administration's (FDA's) recently released guidance for the reduction of sodium in the commercially processed, packaged, and prepared food supply.
Estimating the Cost-Effectiveness of the Sodium Reduction in Communities Program
Yarnoff B, Teachout E, MacLeod KE, Whitehill J, Jordan J, Tayebali Z, Bates L. Public Health Nutr 2021 25 (4) 1-29. OBJECTIVE: This study assessed the cost-effectiveness of the Centers for Disease Control and Prevention's (CDC's) Sodium Reduction in Communities Program (SRCP). DESIGN: We collected implementation costs and performance measure indicators from SRCP recipients and their partner food service organizations. We estimated the cost per person and per food service organization reached and the cost per menu item impacted. We estimated the short-term effectiveness of SRCP in reducing sodium consumption and used it as an input in the Prevention Impacts Simulation Model to project the long-term impact on medical cost savings and quality-adjusted life years gained due to a reduction in cardiovascular disease, and to estimate the cost-effectiveness of SRCP if sustained through 2025 and 2040. SETTING: CDC funded eight recipients as part of the 2016-2021 round of SRCP to work with food service organizations in eight settings to increase the availability and purchase of lower-sodium food options. PARTICIPANTS: Eight SRCP recipients and 20 of their partners. RESULTS: At the recipient level, the average cost per person reached was $10, and the average cost per food service organization reached was $42,917. At the food service organization level, the median monthly cost per food item impacted by recipe modification or product substitution was $684. Cost-effectiveness analyses showed that, if sustained, the program is cost saving (i.e., the reduction in medical costs is greater than the implementation costs) in the target population by $1.82 through 2025 and $2.09 through 2040. CONCLUSIONS: By providing evidence of the cost-effectiveness of a real-world sodium reduction initiative, this study can help inform decisions by public health organizations about related cardiovascular disease prevention interventions.
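The cost-saving criterion in this abstract (averted medical costs exceeding implementation costs) can be expressed as simple arithmetic. In the sketch below, all figures are invented, and reading "$1.82" as net savings per person in the target population is an assumption made purely for illustration.

```python
# Hypothetical cost-saving check; every figure is made up.
implementation_cost = 4_000_000.0     # total program cost through horizon
averted_medical_costs = 7_640_000.0   # projected medical cost savings
target_population = 2_000_000         # persons reached

net = averted_medical_costs - implementation_cost
print(f"cost saving: {net > 0}")                       # True if savings exceed costs
print(f"net savings per person: ${net / target_population:.2f}")
```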
State-level health care expenditures associated with disability
Khavjou OA, Anderson WL, Honeycutt AA, Bates LG, Hollis ND, Grosse SD, Razzaghi H. Public Health Rep 2021 136 (4) 33354920979807. OBJECTIVE: Given the growth in national disability-associated health care expenditures (DAHE) and the changes in health insurance-specific DAHE distribution, updated estimates of state-level DAHE are needed. The objective of this study was to update state-level estimates of DAHE. METHODS: We combined data from the 2013-2015 Medical Expenditure Panel Survey, the 2013-2015 Behavioral Risk Factor Surveillance System, and the 2014 National Health Expenditure Accounts to calculate state-level DAHE for US adults in total, per adult, and per adult with disability (PWD). We adjusted expenditures to 2017 prices and assessed changes in DAHE from 2003 to 2015. RESULTS: In 2015, DAHE were $868 billion nationally (range, $1.4 billion in Wyoming to $102.8 billion in California), accounting for 36% of total health care expenditures (range, 29%-41%). From 2003 to 2015, total DAHE increased by 65% (range, 35%-125%). In 2015, DAHE per PWD were highest in the District of Columbia ($27,839) and lowest in Alabama ($12,603). From 2003 to 2015, per-PWD DAHE increased by 13% (range, -20% to 61%) and per-capita DAHE increased by 28% (range, 7%-84%). In 2015, Medicare DAHE per PWD ranged from $10,067 in Alaska to $18,768 in New Jersey. Medicaid DAHE per PWD ranged from $9,825 in Nevada to $43,365 in the District of Columbia. Nonpublic health insurer per-PWD DAHE ranged from $7,641 in Arkansas to $18,796 in Alaska. CONCLUSION: DAHE are substantial and vary by state. The public sector largely supports the health care costs of people with disabilities. State policy makers and other stakeholders can use these results to inform the development of public health programs that support and provide ongoing health care to people with disabilities.
Validation of the prevention impacts simulation model (PRISM)
Yarnoff B, Honeycutt A, Bradley C, Khavjou O, Bates L, Bass S, Kaufmann R, Barker L, Briss P. Prev Chronic Dis 2021 18 E09. INTRODUCTION: Demonstrating the validity of a public health simulation model helps to establish confidence in the accuracy and usefulness of a model's results. In this study we evaluated the validity of the Prevention Impacts Simulation Model (PRISM), a system dynamics model that simulates health, mortality, and economic outcomes for the US population. PRISM primarily simulates outcomes related to cardiovascular disease but also includes outcomes related to other chronic diseases that share risk factors. PRISM is openly available through a web application. METHODS: We applied the model validation framework developed jointly by the International Society for Pharmacoeconomics and Outcomes Research and the Society for Medical Decision Making modeling task force to validate PRISM. This framework included model review by external experts and quantitative data comparison by the study team. RESULTS: External expert review determined that PRISM is based on up-to-date science. One-way sensitivity analysis showed that no parameter affected results by more than 5%. Comparison with other published models, such as ModelHealth, showed that PRISM produces lower estimates of effects and cost savings. Comparison with surveillance data showed that projected model trends in risk factors and outcomes align closely with secular trends. Four measures did not align with surveillance data, and those were recalibrated. CONCLUSION: PRISM is a useful tool to simulate the potential effects and costs of public health interventions. Results of this validation should help assure health policy leaders that PRISM can help support community health program planning and evaluation efforts.
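A minimal example of the one-way sensitivity analysis mentioned above: vary one input at a time and record the percent change in a model output. The toy outcome function below stands in for PRISM, which is far more complex; all parameter values are invented.

```python
# One-way sensitivity analysis sketch against a toy stand-in model.

def toy_model(params):
    """Stand-in outcome: events averted by an intervention (arbitrary units)."""
    return params["population"] * params["risk"] * params["effectiveness"]

base = {"population": 100_000, "risk": 0.02, "effectiveness": 0.3}
base_out = toy_model(base)

for name in base:
    for factor in (0.8, 1.2):               # vary each parameter by +/- 20%
        perturbed = dict(base, **{name: base[name] * factor})
        change = 100 * (toy_model(perturbed) - base_out) / base_out
        print(f"{name} x{factor}: {change:+.1f}% change in output")
```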
Iron content of commercially available infant and toddler foods in the United States, 2015
Bates M, Gupta PM, Cogswell ME, Hamner HC, Perrine CG. Nutrients 2020 12 (8). OBJECTIVES: To describe the iron content of commercially available infant and toddler foods. METHODS: Nutrition Facts label data were used from a 2015 database of 1037 commercial infant and toddler food and drink products. Products were grouped into food categories on the basis of name, ingredients, target age, and reference amounts customarily consumed (RACC). Mean and median iron content per 100 g and per RACC were calculated. The proportion of products considered good and excellent sources of iron was determined on the basis of percent daily value (% DV) thresholds. RESULTS: Among products marketed for infants (aged 4-12 months), infant cereals had the highest mean iron content (6.19 mg iron per RACC; 41.25 mg iron per 100 g). Among products marketed for toddlers (aged 12-36 months), vegetable-based mixtures or meals contained the highest mean iron per RACC (mean: 2.97 mg) and dry, grain-based desserts had the highest mean iron per 100 g (mean: 6.45 mg). Juice and drink products had the lowest mean iron content among both infant and toddler products. CONCLUSIONS: Most commercially available infant cereals are considered an excellent source of iron, likely from fortification, but wide variability was observed in iron content by food category. Labeling of products as good or excellent sources of iron (≥10% DV) can help consumers identify products with higher iron content, such as infant cereals or toddler vegetable-based mixtures/meals.
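The good/excellent-source screen is a percent-daily-value threshold test (good source ≥10% DV, excellent source ≥20% DV). The sketch below assumes an 18-mg labeling DV for iron purely for illustration; the study applied the labeling DVs in force in 2015, which differ for infants and toddlers.

```python
# %DV screen for iron; the DV constant is an assumption for illustration.
IRON_DV_MG = 18.0  # assumed adult labeling DV; infant/toddler DVs differ

def iron_source_class(iron_mg_per_racc):
    pct_dv = 100 * iron_mg_per_racc / IRON_DV_MG
    if pct_dv >= 20:
        return "excellent source"
    if pct_dv >= 10:
        return "good source"
    return "not a significant source"

# category means reported in the abstract, fed through the screen
for product, mg in [("infant cereal", 6.19), ("toddler vegetable meal", 2.97)]:
    print(product, "->", iron_source_class(mg))
```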
National health care expenditures associated with disability
Khavjou OA, Anderson WL, Honeycutt AA, Bates LG, Razzaghi H, Hollis ND, Grosse SD. Med Care 2020 58 (9) 826-832. BACKGROUND: In 2003, national disability-associated health care expenditures (DAHE) were $398 billion. Updated estimates will improve our understanding of current DAHE. OBJECTIVE: The objective of this study was to estimate national DAHE for the US adult population, analyze spending by insurance and service categories, and assess changes in spending over the past decade. RESEARCH DESIGN: Data from the 2013-2015 Medical Expenditure Panel Survey were used to estimate DAHE for noninstitutionalized adults. These estimates were reconciled with National Health Expenditure Accounts (NHEA) data and adjusted to 2017 medical prices. Expenditures for institutionalized adults were added from NHEA data. MEASURES: National DAHE in total, by insurance and service categories, and the percentage of total expenditures associated with disability. RESULTS: DAHE in 2015 were $868 billion (at 2017 prices), representing 36% of total national health care spending (up from 27% in 2003). DAHE per person with disability increased from $13,395 in 2003 to $17,431 in 2015, whereas nondisability per-person spending remained constant (about $6,700). Public insurers paid 69% of DAHE. Medicare paid the largest portion ($324.7 billion), and Medicaid DAHE were $277.2 billion. More than half (54%) of all Medicare expenditures and 72% of all Medicaid expenditures were associated with disability. CONCLUSIONS: The share of health care expenditures associated with disability has increased substantially over the past decade. The high proportion of DAHE paid by public insurers reinforces the importance of public programs designed to improve health care for people with disabilities and emphasizes the need for evaluating programs and health services available to this vulnerable population.
Dietary sodium intake and health indicators: A systematic review of published literature between January 2015 and December 2019
Overwyk KJ, Quader ZS, Maalouf J, Bates M, Webster J, George MG, Merritt RK, Cogswell ME. Adv Nutr 2020 11 (5) 1174-1200. As the science surrounding population sodium reduction evolves, monitoring and evaluating new studies on intake and health can help increase our understanding of the associated benefits and risks. Here we describe a systematic review of recent studies on sodium intake and health, examine the risk of bias (ROB) of selected studies, and provide direction for future research. Seven online databases were searched monthly from January 2015 to December 2019. We selected human studies that met specified population, intervention, comparison, outcome, time, and setting/study design (PICOTS) criteria; abstracted attributes related to the study population, design, intervention, exposure, and outcomes; and evaluated ROB for the subset of studies on sodium intake and cardiovascular disease risks or indicators. Of 41,601 abstracts reviewed, 231 studies were identified that met the PICOTS criteria, and ROB was assessed for 54 studies. One hundred and fifty-seven (68%) studies were observational and 161 (70%) focused on the general population. Five types of sodium interventions and a variety of urinary and dietary measurement methods were used to establish and quantify sodium intake. Five observational studies used multiple 24-h urine collections to assess sodium intake. Evidence mainly focused on cardiovascular-related indicators (48%) but encompassed an assortment of outcomes. Studies varied in ROB domains, and 87% of studies evaluated were missing information on ≥1 domain. Two or more studies on each of 12 outcomes (e.g., cognition) not previously included in systematic reviews, and 9 new studies at low ROB, suggest the need for ongoing or updated systematic reviews of evidence on sodium intake and health. Summarizing evidence from assessments of sodium and health outcomes was limited by the various methods used to measure sodium intake and outcomes, as well as a lack of details related to study design and conduct. In line with research recommendations identified by the National Academies of Sciences, future research is needed to identify and standardize methods for measuring sodium intake.
Mapping and analysis of US state and urban local sodium reduction laws
Sloan AA, Keane T, Pettie JR, Bhuiya AR, Taylor LN, Bates M, Bernard S, Akinleye F, Gilchrist S. J Public Health Manag Pract 2020 26 Suppl 2 S62-S70. CONTEXT: Excessive sodium consumption contributes to high blood pressure, which is a risk factor for cardiovascular disease. OBJECTIVES: To (1) identify state and urban local laws addressing adult or general-population sodium consumption in foods and beverages and (2) align findings to a previously published evidence classification review, the Centers for Disease Control and Prevention Sodium Quality and Impact of Component (QuIC) evidence assessment. DESIGN: Systematic collection of sodium reduction laws from all 50 states, the 20 most populous counties in the United States, and the 20 most populous cities in the United States, including Washington, District of Columbia, effective on January 1, 2019. Relevant laws were assigned to 1 or more of 6 interventions: (1) provision of sodium information in restaurants or at point of purchase; (2) consumer incentives to purchase lower-sodium foods; and provision of lower-sodium offerings in (3) workplaces, (4) vending machines, (5) institutional meal services, and (6) grocery, corner, and convenience stores. The researchers used Westlaw, local policy databases or city Web sites, and general nutrition policy databases to identify relevant laws. RESULTS: Thirty-nine sodium reduction laws and 10 state laws preempting localities from enacting sodium reduction laws were identified. Sodium reduction laws were more common in local jurisdictions and in the Western United States. Sodium reduction laws addressing meal services (n = 17), workplaces (n = 12), labeling (n = 13), and vending machines (n = 11) were more common, while those addressing grocery stores (n = 2) or consumer incentives (n = 6) were less common. Laws with high QuIC evidence classifications were generally more common than laws with low QuIC evidence classifications. CONCLUSIONS: The distribution of sodium laws in the US differed by region, QuIC classification, and jurisdiction type, indicating influence from public health and non-public-health factors. Ongoing research is warranted to determine how the strength of public health evidence evolves over time and how those changes correlate with uptake of sodium reduction laws.
Laboratory Analysis of an Outbreak of Candida auris in New York from 2016 to 2018: Impact and Lessons Learned
Zhu Y, O'Brien B, Leach L, Clark A, Bates M, Adams E, Ostrowsky B, Quinn M, Dufort E, Southwick K, Erazo R, Haley VB, Bucher C, Chaturvedi V, Limberger RJ, Blog D, Lutterloh E, Chaturvedi S. J Clin Microbiol 2019 58 (4). Candida auris is a multidrug-resistant yeast which has emerged in healthcare facilities worldwide; however, little is known about identification methods, patient colonization, environmental survival, spread, and drug resistance. Colonization of both biotic (patients) and abiotic (healthcare objects) surfaces and travel appear to be the major factors for the spread of this pathogen across the globe. In this investigation, we present laboratory findings from an ongoing C. auris outbreak in New York (NY) from August 2016 through 2018. A total of 540 clinical isolates, 11,035 patient surveillance specimens, and 3,672 environmental surveillance samples were analyzed. Laboratory methods included matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) for yeast isolate identification, real-time PCR for rapid surveillance sample screening, culture on selective/non-selective media for recovery of C. auris and other yeasts from surveillance samples, antifungal susceptibility testing to determine the C. auris resistance profile, and Sanger sequencing of the internal transcribed spacer (ITS) and D1/D2 regions of the ribosomal gene for C. auris genotyping. Results included: a) identification and confirmation of C. auris in 413 clinical isolates and 931 patient surveillance isolates, as well as identification of 277 clinical cases and 350 colonized cases from 151 healthcare facilities, including 59 hospitals, 92 nursing homes, 1 long-term acute care hospital (LTACH), and 2 hospices; b) successful utilization of an in-house-developed C. auris real-time PCR assay for the rapid screening of patient and environmental surveillance samples; c) demonstration of relatively heavier colonization of C. auris in the nares compared with the axilla/groin; and d) predominance of the South Asia clade I with intrinsic resistance to fluconazole and elevated minimum inhibitory concentrations (MICs) for voriconazole (81%), amphotericin B (61%), 5-FC (3%), and echinocandins (1%). These findings reflect greater regional prevalence and incidence of C. auris and the deployment of better detection tools in an unprecedented outbreak.
Analysis of the profitability of adult vaccination in 13 private provider practices in the United States
Yarnoff B, Khavjou O, King G, Bates L, Zhou F, Leidner AJ, Shen AK. Vaccine 2019 37 (42) 6180-6185. Vaccination coverage among adults remains low in the United States. Understanding the barriers to provision of adult vaccination is an important step to increasing vaccination coverage and improving public health. To better understand financial factors that may affect practice decisions about adult vaccination, this study sought to understand how costs compared with payments for adult vaccinations in a sample of U.S. physician practices. We recruited a convenience sample of 19 practices in nine states in 2017. We conducted a time-motion study to assess the time costs of vaccination activities and a survey of practice managers to assess materials, management, and dose costs and payments for vaccination. We received complete cost and payment data from 13 of the 19 practices. We calculated annual income from vaccination services by comparing estimated costs with payments received for vaccine doses and vaccine administration. Median annual total income from vaccination services was $90,343 at family medicine practices (range: $3,968-$249,628), $28,267 at internal medicine practices (range: -$32,659 to $141,034), and $2,886 at obstetrics and gynecology practices (range: -$73,451 to $23,820). Adult vaccination was profitable at the median of our sample, but there is wide variation in profitability due to differences in costs and payment rates across practices. This study provides evidence on the financial viability of adult vaccination and supports actions for improving it. These results can help inform practices' decisions about whether to provide adult vaccines and contribute to keeping adults up to date with the recommended vaccination schedule.
Nutrient content of squeeze pouch foods for infants and toddlers sold in the United States in 2015
Beauregard JL, Bates M, Cogswell ME, Nelson JM, Hamner HC. Nutrients 2019 11 (7). BACKGROUND: To describe the availability and nutrient composition of U.S. commercially available squeeze pouch infant and toddler foods in 2015. MATERIALS AND METHODS: Data were from information presented on nutrition labels for 703 ready-to-serve, pureed food products from 24 major U.S. infant and toddler food brands. We described nutritional components (e.g., calories, fat) and compared them between packaging types (squeeze pouch versus other packaging types) within food categories. RESULTS: 397 (56%) of the analyzed food products were packaged as squeeze pouches. Differences in 13 nutritional components between squeeze pouches and other packaging types were generally small and varied by food category. Squeeze pouches in the fruits and vegetables, fruit-based, and vegetable-based categories were more likely to contain added sugars than other package types. CONCLUSION: In 2015, squeeze pouches were prevalent in the U.S. commercial infant and toddler food market. Nutrient composition differed between squeeze pouches and other packaging types for some macro- and micronutrients. Although it is recommended that infants and toddlers under two years old not consume any added sugars, a specific area of concern may be the inclusion of sources of added sugar in squeeze pouches. Linking this information with children's dietary intake would facilitate understanding of how these differences affect overall diet quality.
Review of acellular assays of ambient particulate matter oxidative potential: Methods and relationships with composition, sources, and health effects
Bates JT, Fang T, Verma V, Zeng L, Weber RJ, Tolbert PE, Abrams JY, Sarnat SE, Klein M, Mulholland JA, Russell AG. Environ Sci Technol 2019 53 (8) 4003-4019. Oxidative stress is a potential mechanism of action for particulate matter (PM) toxicity and can occur when the body's antioxidant capacity cannot counteract or detoxify harmful effects of reactive oxygen species (ROS) because of an excess presence of ROS. ROS are introduced to the body via inhalation of PM with these species present on and/or within the particles (particle-bound ROS) and/or through catalytic generation of ROS in vivo after inhalation of redox-active PM species (oxidative potential, OP). The recent development of acellular OP measurement techniques has led to a surge in research across the globe. In this review, particle-bound ROS techniques are discussed briefly, while OP measurements are the focus because of an increasing number of epidemiologic studies in which OP measurements show associations with adverse health effects. The most common OP measurement techniques, including the dithiothreitol assay, glutathione assay, and ascorbic acid assay, are discussed, along with evidence for the utility of OP measurements in epidemiologic studies and the PM characteristics that drive different responses between assay types (such as species composition, emission source, and photochemistry). Overall, most OP assays respond to metals like copper that can be found in emission sources like vehicles. Some OP assays respond to organics, especially photochemically aged organics, from sources like biomass burning. Select OP measurements have significant associations with certain cardiorespiratory end points, such as asthma, congestive heart disease, and lung cancer. In fact, multiple studies have found that exposure to OP measured using the dithiothreitol and glutathione assays drives higher risk ratios for certain cardiorespiratory outcomes than PM mass, suggesting that OP measurements may integrate the health-relevant fraction of PM and will be useful tools for future health analyses. The compositional impacts, including species and emission sources, on OP could have serious implications for health-relevant PM exposure. Though more work is needed, OP assays show promise for health studies because they integrate the impacts of PM species and properties on catalytic redox reactions into one measurement, and current work highlights the importance of metals, organic carbon, vehicles, and biomass burning emissions to PM exposures that could impact health.
Estimating the costs and income of providing vaccination to adults and children
Yarnoff B, Kim D, Zhou F, Leidner AJ, Khavjou O, Bates L, Bridges CB. Med Care 2019 57 (6) 410-416. INTRODUCTION: Vaccinations are recommended to prevent serious morbidity and mortality. However, providers' concerns regarding costs and payments for providing vaccination services are commonly reported barriers to adult vaccination. Information on the costs of providing vaccination is limited, especially for adults. METHODS: We recruited 4 internal medicine practices, 4 family medicine practices, 2 pediatric practices, 2 obstetrics and gynecology (OBGYN) practices, and 2 community health clinics in North Carolina to participate in a study to assess the economic costs and benefits of providing vaccination services for adults and children. We conducted a time-motion assessment of vaccination-related activities in the provider office and a survey of providers on vaccine management costs. We estimated the mean cost per vaccination, the minimum and maximum payments received, and income. RESULTS: Across all provider settings, the mean cost per vaccine administration was $14, with substantial variation by practice setting (pediatric: $10; community health clinics: $15; family medicine: $17; OBGYN: $23; internal medicine: $23). When receiving the maximum payment, all provider settings had positive income from vaccination services. When receiving the minimum reported payments for vaccination services, pediatric and family medicine practices had positive income; internal medicine and OBGYN practices had approximately equal costs and payments; and community health clinics had negative income. CONCLUSIONS: Overall, vaccination service providers appeared to earn small positive income from vaccination services. In some cases, providers experienced negative income, which underscores the need for providers and policymakers to design interventions and system improvements to make vaccination services financially sustainable for all provider types.
Provider time and costs to vaccinate adult patients: Impact of time counseling without vaccination
Shen A, Khavjou O, King G, Bates L, Zhou F, Leidner AJ, Yarnoff B. Vaccine 2019 37 (6) 792-797. Amid provider reports of financial barriers as an impediment to adult immunization, this study explores the time and costs of vaccination in adult provider practices. A vaccination time-motion study and a vaccine practice management survey were conducted (March-October 2017) in a convenience sample of 19 family medicine (FM), internal medicine (IM), and obstetrics-gynecology (OBGYN) practices in nine states. Practices were directly observed during a one-week period; estimates were collected of time spent on activities that could not be directly observed. Cost estimates were calculated by converting staff time for performed activities. In the time-motion study, FM and IM practices spent similar time conducting vaccination activities (median = 5 min per vaccination), while OBGYN practices spent more time (median = 29 min per vaccination). Combining results from the time-motion study and the practice management survey, the median cost of vaccination was similar for FM practices ($7 per vaccination) and IM practices ($8 per vaccination) but substantially higher for OBGYN practices ($43 per vaccination). Factors that contributed to higher costs among OBGYN practices were the increased time needed to counsel patients, administer vaccines, and plan and manage vaccine supplies. In addition, 68% of OBGYN patients who were offered and counseled to receive vaccines declined to receive them. Counseling patients who ultimately do not go on to receive a vaccine may be an important cost factor. Lower costs of vaccination services may be achieved by increasing efficiencies in workflow or the volume of vaccinations.
Prenatal exposure to organochlorine pesticides and early childhood communication development in British girls
Jeddy Z, Kordas K, Allen K, Taylor EV, Northstone K, Dana Flanders W, Namulanda G, Sjodin A, Hartman TJ. Neurotoxicology 2018 69 121-129. BACKGROUND: The developing brain is susceptible to exposure to neurodevelopmental toxicants such as pesticides. AIMS: We explored associations of prenatal serum concentrations of hexachlorobenzene (HCB), beta-hexachlorocyclohexane (beta-HCH), 2,2-bis(4-chlorophenyl)-1,1-dichloroethene (p,p'-DDE), and 2,2-bis(4-chlorophenyl)-1,1,1-trichloroethane (p,p'-DDT) with maternal-reported measures of verbal and non-verbal communication in young girls. STUDY DESIGN AND METHODS: We studied a sample of 400 singleton girls and their mothers participating in the Avon Longitudinal Study of Parents and Children (ALSPAC), using multivariable linear regression models adjusting for parity, Home Observation for Measurement of the Environment (HOME) score, maternal age and education status, and maternal tobacco use during the first trimester of pregnancy. EXPOSURE AND OUTCOME MEASURES: Maternal serum samples (collected at a median of 15 weeks' gestation [IQR: 10, 28]) were assessed for selected organochlorine pesticide levels. Communication was assessed at 15 and 38 months using adapted versions of the MacArthur-Bates Communicative Development Inventories for Infants and Toddlers (MCDI). RESULTS: At 15 months, girls born to mothers with prenatal concentrations of HCB in the highest tertile had vocabulary comprehension and production scores approximately 16% (p = 0.007) lower than girls born to mothers with concentrations in the lowest tertile. This association varied by maternal parity, in that the evidence was stronger for daughters of nulliparous mothers. At 38 months, girls born to mothers with prenatal concentrations of HCB in the highest tertile had mean adjusted intelligibility scores that were 3% (p = 0.03) lower than those born to mothers with concentrations in the lowest tertile; however, results did not vary significantly by parity. Maternal concentrations of beta-HCH and p,p'-DDE were not significantly associated with MCDI scores at 15 or 38 months. p,p'-DDT had an inconsistent pattern of association: a significant positive association was observed between p,p'-DDT and verbal comprehension scores at 15 months; however, at 38 months a significant inverse association was observed for p,p'-DDT with communicative scores. This inverse association for p,p'-DDT among older girls tended to be stronger among daughters of mothers who had lower depression scores. CONCLUSIONS: Organochlorine pesticide exposure in utero may affect communication development.
Sodium, sugar, and fat content of complementary infant and toddler foods sold in the United States, 2015
Maalouf J, Cogswell ME, Bates M, Yuan K, Scanlon KS, Pehrsson P, Gunn JP, Merritt RK. Am J Clin Nutr 2017 105 (6) 1443-1452. Background: As part of a healthy diet, limiting intakes of excess sodium, added sugars, saturated fat, and trans fat has been recommended. The American Heart Association recommends that children aged <2 y should avoid added sugars. Objective: We sought to determine commercial complementary infant-toddler food categories that were of potential concern because of their sodium, added sugar, saturated fat, or trans fat content. Design: Nutrition label information (e.g., serving size, sodium, saturated fat, trans fat) for 1032 infant and toddler foods was collected from manufacturers' websites and stores from May to July 2015 for 24 brands, which accounted for >95% of infant-toddler food sales. The presence of added sugars was determined from the ingredient list. Reference amount customarily consumed (RACC) categories were used to group foods and standardize serving sizes. A high sodium content was evaluated on the basis of the Upper Intake Level for children aged 1-3 y and the number of potential servings per day (i.e., 1500 mg/7 servings, or >210 mg/RACC), a sodium amount >200 mg/100 g, or a mean sodium density >1000 mg/1000 kcal. Results: In 2015, most commercial infant-only vegetables, fruit, dinners, and cereals were low in sodium, contained no saturated fat, and did not contain added sugars. On average, toddler meals contained 2233 mg Na/1000 kcal, and 84% of the meals had >210 mg Na/RACC (170 g), whereas 69% of infant-toddler savory snacks had >200 mg Na/100 g. More than 70% of toddler meals, cereal bars and breakfast pastries, and infant-toddler grain- or dairy-based desserts contained ≥1 source of added sugar. Approximately 70% of toddler meals contained saturated fat (mean: 1.9 g/RACC), and no commercial infant-toddler foods contained trans fats. Conclusion: Most commercial toddler meals, cereal bars and breakfast pastries, and infant-toddler snacks and desserts have high sodium contents or contain added sugars, suggesting a need for continued public health efforts to support parents in choosing complementary foods for their infants and toddlers.
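The abstract's three high-sodium screens translate directly into threshold checks, as in this sketch; the product values fed in are invented for illustration.

```python
# High-sodium screens from the abstract: >210 mg/RACC (1,500 mg / 7 servings),
# >200 mg/100 g, or sodium density >1,000 mg/1,000 kcal.

def high_sodium_flags(mg_per_racc, mg_per_100g, mg, kcal):
    density = 1000 * mg / kcal if kcal else 0.0
    return {
        "per_racc": mg_per_racc > 1500 / 7,   # ~214 mg/RACC, reported as >210
        "per_100g": mg_per_100g > 200,
        "density": density > 1000,
    }

# hypothetical toddler meal: 390 mg Na per 170-g RACC, 229 mg/100 g, 180 kcal
print(high_sodium_flags(mg_per_racc=390, mg_per_100g=229, mg=390, kcal=180))
```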
Community laboratory testing for cryptosporidium: Multicenter study retesting public health surveillance stool samples positive for cryptosporidium by rapid cartridge assay with direct fluorescent antibody testing
Roellig DM, Yoder JS, Madison-Antenucci S, Robinson TJ, Van TT, Collier SA, Boxrud D, Monson T, Bates LA, Blackstock AJ, Shea S, Larson K, Xiao L, Beach M. PLoS One 2017 12 (1) e0169915. Cryptosporidium is a common cause of sporadic diarrheal disease and outbreaks in the United States. Increasingly, immunochromatography-based rapid cartridge assays (RCAs) are providing community laboratories with a quick cryptosporidiosis diagnostic method. In the current study, the Centers for Disease Control and Prevention (CDC), the Association of Public Health Laboratories (APHL), and four state health departments evaluated RCA-positive samples obtained during routine Cryptosporidium testing. All samples underwent "head-to-head" retesting using both RCA and direct fluorescent antibody (DFA) testing. Community-level results from three sites indicated that 54.4% (166/305) of Meridian ImmunoCard STAT! positives and 87.0% (67/77) of Remel Xpect positives were confirmed by DFA. When samples were retested by RCA at state laboratories and compared with DFA, 83.3% (155/186) of Meridian ImmunoCard STAT! positives and 95.2% (60/63) of Remel Xpect positives were confirmed. The percentage of confirmed community results varied by site: Minnesota, 39.0%; New York, 63.9%; and Wisconsin, 72.1%. The percentage of confirmed community results decreased with patient age; 12.5% of community positive tests could be confirmed by DFA for patients 60 years of age or older. The percentage of confirmed results did not differ significantly by sex, storage temperature, time between sample collection and testing, or season. Findings from this study demonstrate a lower confirmation rate of community RCA positives when compared with RCA positives identified at state laboratories. Elucidating the causes of decreased test performance in order to improve overall community laboratory performance of these tests is critical for understanding the epidemiology of cryptosporidiosis in the United States.
The road to tuberculosis (Mycobacterium tuberculosis) elimination in Arkansas: a re-examination of risk groups
Berzkalns A, Bates J, Ye W, Mukasa L, France AM, Patil N, Yang Z. PLoS One 2014 9 (3) e90664. OBJECTIVES: This study was conducted to generate knowledge useful for developing public health interventions for more effective tuberculosis control in Arkansas. METHODS: The study population included 429 culture-confirmed reported cases (January 1, 2004-December 31, 2010). Mycobacterium tuberculosis genotyping data were used to identify cases likely due to recent transmission (clustered) versus reactivation (non-clustered). Poisson regression models estimated the average rate of decline in incidence over time and assessed the significance of differences between subpopulations. A multinomial logistic model examined differences between clustered and non-clustered incidence. RESULTS: A significant average annual percent decline was found for the overall incidence of culture-confirmed (9%; 95% CI: 5.5%, 16.9%), clustered (6%; 95% CI: 0.5%, 11.6%), and non-clustered tuberculosis cases (12%; 95% CI: 7.6%, 15.9%). However, declines varied among demographic groups. Significant declines in clustered incidence were observed only in males, non-Hispanic blacks, persons aged 65 years and older, and the rural population. CONCLUSIONS: These findings suggest that the Arkansas tuberculosis control program must target both traditional and non-traditional risk groups for successful tuberculosis elimination. The present study also demonstrates that a thorough analysis of TB trends in different population subgroups of a given geographic region or state can lead to the identification of non-traditional risk factors for TB transmission. Similar studies in other low-incidence populations would provide beneficial data on how to control and eventually eliminate TB in the U.S.
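For context on the "average annual percent decline" estimates above, a Poisson model with a log link implies that incidence scales by exp(b1) per one-year increase, so the annual decline is 1 - exp(b1). The coefficient below is invented for illustration, not taken from the study.

```python
# Converting a (hypothetical) Poisson regression year coefficient into an
# average annual percent decline, as reported in the abstract.
import math

b1 = -0.094                          # hypothetical fitted coefficient on year
annual_change = math.exp(b1) - 1     # multiplicative change per year, minus 1
print(f"average annual decline: {-100 * annual_change:.0f}%")  # ~9%
```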
Advances in tuberculosis diagnostics: the Xpert MTB/RIF assay and future prospects for a point-of-care test
Lawn SD, Mwaba P, Bates M, Piatek A, Alexander H, Marais BJ, Cuevas LE, McHugh TD, Zijenah L, Kapata N, Abubakar I, McNerney R, Hoelscher M, Memish ZA, Migliori GB, Kim P, Maeurer M, Schito M, Zumla A. Lancet Infect Dis 2013 13 (4) 349-61. Rapid progress has been made in the development of new diagnostic assays for tuberculosis in recent years. New technologies have been developed and assessed, and are now being implemented. The Xpert MTB/RIF assay, which enables simultaneous detection of Mycobacterium tuberculosis (MTB) and rifampicin (RIF) resistance, was endorsed by WHO in December, 2010. This assay was specifically recommended for use as the initial diagnostic test for suspected drug-resistant or HIV-associated pulmonary tuberculosis. By June, 2012, two-thirds of countries with a high tuberculosis burden and half of countries with a high multidrug-resistant tuberculosis burden had incorporated the assay into their national tuberculosis programme guidelines. Although the development of the Xpert MTB/RIF assay is undoubtedly a landmark event, clinical and programmatic effects and cost-effectiveness remain to be defined. We review the rapidly growing body of scientific literature and discuss the advantages and challenges of using the Xpert MTB/RIF assay in areas where tuberculosis is endemic. We also review other prospects within the developmental pipeline. A rapid, accurate point-of-care diagnostic test that is affordable and can be readily implemented is urgently needed. Investment in the tuberculosis diagnostics pipeline should remain a major priority for funders and researchers.
Using major outer membrane protein typing as an epidemiological tool to investigate outbreaks caused by milk-borne Campylobacter jejuni isolates in California
Jay-Russell MT, Mandrell RE, Yuan J, Bates A, Manalac R, Mohle-Boetani J, Kimura A, Lidgard J, Miller WG. J Clin Microbiol 2013 51 (1) 195-201. We describe the use of major outer membrane protein (MOMP) typing as a screen to compare the Campylobacter jejuni porA gene sequences of clinical outbreak strains from human stool with the porA sequences of dairy farm strains isolated during two milk-borne campylobacteriosis outbreak investigations in California. The genetic relatedness of clinical and environmental strains with identical or closely related porA sequences was confirmed by multilocus sequence typing and pulsed-field gel electrophoresis analysis. The first outbreak involved 1,644 C. jejuni infections at 11 state correctional facilities and was associated with consumption of pasteurized milk supplied by an on-site dairy (dairy A) at a prison in the Central Valley. The second outbreak involved eight confirmed and three suspected C. jejuni cases linked to consumption of commercial raw milk and raw chocolate colostrum at another Central Valley dairy (dairy B). Both dairies bottled fluid milk on the farm and distributed the finished product to off-site locations. Altogether, C. jejuni was isolated from 7 of 15 (46.7%) bovine fecal samples, 12 of 20 (60%) flush alley water samples, and 1 of 20 (5%) lagoon samples collected on dairy A. At dairy B, C. jejuni was cultured from 9 of 26 (34.6%) bovine fecal samples. Environmental strains indistinguishable from the clinical outbreak strains were found in five flush alley water samples (dairy A) and four bovine fecal samples (dairy B). The findings demonstrate that MOMP typing is a useful tool for triaging environmental isolates prior to conducting more labor-intensive molecular typing methods.