Interrelationships among age at adiposity rebound, BMI during childhood, and BMI after age 14 years in an electronic health record database
Freedman DS , Goodwin-Davies AJ , Kompaniyets L , Lange SJ , Goodman AB , Phan TT , Rao S , Eneli I , Forrest CB . Obesity (Silver Spring) 2022 30 (1) 201-208 OBJECTIVE: This study compared the importance of age at adiposity rebound versus childhood BMI to subsequent BMI levels in a longitudinal analysis. METHODS: From the electronic health records of 4.35 million children, a total of 12,228 children were selected who were examined at least once each year between ages 2 and 7 years and reexamined after age 14 years. The minimum number of examinations per child was six. Each child's rebound age was estimated using locally weighted regression (lowess), a smoothing technique. RESULTS: Children who had a rebound age < 3 years were, on average, 7 kg/m² heavier after age 14 years than were children with a rebound age ≥ 7 years. However, BMI after age 14 years was more strongly associated with BMI at the rebound than with rebound age (r = 0.57 vs. -0.44). Furthermore, a child's BMI at age 3 years provided more information on BMI after age 14 years than did rebound age. In addition, rebound age provided no information on subsequent BMI if a child's BMI at age 6 years was known. CONCLUSIONS: Although rebound age is related to BMI after age 14 years, a child's BMI at age 3 years provides more information and is easier to obtain. |
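The rebound-age estimation described above is easy to sketch: smooth a child's BMI-for-age measurements and take the age at which the smoothed curve reaches its minimum. The toy example below uses hypothetical measurements and a quadratic least-squares fit as a simple stand-in for the lowess smoother the study actually used; it is not the paper's code or data.

```python
# Sketch (hypothetical data): adiposity rebound age = age at the minimum of a
# smoothed BMI curve. A quadratic fit stands in for the study's lowess smoother.

def quadratic_fit(xs, ys):
    """Least-squares coefficients (a, b, c) for y = a*x^2 + b*x + c."""
    n = len(xs)
    s1 = sum(xs); s2 = sum(x * x for x in xs)
    s3 = sum(x ** 3 for x in xs); s4 = sum(x ** 4 for x in xs)
    t0 = sum(ys); t1 = sum(x * y for x, y in zip(xs, ys))
    t2 = sum(x * x * y for x, y in zip(xs, ys))
    # Solve the 3x3 normal equations by Cramer's rule.
    m = [[s4, s3, s2], [s3, s2, s1], [s2, s1, n]]
    rhs = [t2, t1, t0]

    def det(mat):
        return (mat[0][0] * (mat[1][1] * mat[2][2] - mat[1][2] * mat[2][1])
                - mat[0][1] * (mat[1][0] * mat[2][2] - mat[1][2] * mat[2][0])
                + mat[0][2] * (mat[1][0] * mat[2][1] - mat[1][1] * mat[2][0]))

    d = det(m)
    coeffs = []
    for j in range(3):
        mj = [row[:] for row in m]
        for i in range(3):
            mj[i][j] = rhs[i]
        coeffs.append(det(mj) / d)
    return tuple(coeffs)  # (a, b, c)

# Six annual visits between ages 2 and 7 (the study's minimum), BMI in kg/m².
ages = [2.0, 3.0, 4.0, 5.0, 6.0, 7.0]
bmis = [16.8, 16.1, 15.6, 15.4, 15.7, 16.3]
a, b, c = quadratic_fit(ages, bmis)
rebound_age = -b / (2 * a)  # vertex of the fitted parabola
print(round(rebound_age, 2))  # ≈ 4.8 for this illustrative child
```

With real visit data a proper lowess (e.g., repeated tricube-weighted local fits) would replace the single quadratic, but the rebound-age readout, the minimum of the smoothed curve, is the same.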
Incidence rates of systemic lupus erythematosus in the USA: estimates from a meta-analysis of the Centers for Disease Control and Prevention national lupus registries
Izmirly PM , Ferucci ED , Somers EC , Wang L , Lim SS , Drenkard C , Dall'Era M , McCune WJ , Gordon C , Helmick C , Parton H . Lupus Sci Med 2021 8 (1) OBJECTIVE: To estimate the annual incidence rate of SLE in the USA. METHODS: A meta-analysis used sex/race/ethnicity-specific data spanning 2002-2009 from the Centers for Disease Control and Prevention network of four population-based state registries to estimate the incidence rates. SLE was defined as fulfilling the 1997 revised American College of Rheumatology classification criteria. Given heterogeneity across sites, a random effects model was employed. Applying sex/race/ethnicity-stratified rates, including data from the Indian Health Service registry, to the 2018 US Census population generated estimates of newly diagnosed SLE cases. RESULTS: The pooled incidence rate per 100 000 person-years was 5.1 (95% CI 4.6 to 5.6), higher in females than in males (8.7 vs 1.2), and highest among black females (15.9), followed by Asian/Pacific Islander (7.6), Hispanic (6.8) and white (5.7) females. Male incidence was highest in black males (2.4), followed by Hispanic (0.9), white (0.8) and Asian/Pacific Islander (0.4) males. The American Indian/Alaska Native population had the second highest race-specific SLE estimates for females (10.4 per 100 000) and highest for males (3.8 per 100 000). In 2018, an estimated 14 263 persons (95% CI 11 563 to 17 735) were newly diagnosed with SLE in the USA. CONCLUSIONS: A network of population-based SLE registries provided estimates of SLE incidence rates and numbers diagnosed in the USA. |
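The random effects model named in the abstract above can be illustrated with a DerSimonian-Laird estimator, the standard moment-based random-effects pooling method. The site-level rates and standard errors below are invented for illustration; they are not the registries' data.

```python
# Sketch of DerSimonian-Laird random-effects pooling across sites, with
# made-up site-level incidence rates (per 100,000 person-years) and SEs.

def dersimonian_laird(estimates, ses):
    """Pool site estimates allowing between-site heterogeneity (DL tau^2)."""
    w = [1 / se ** 2 for se in ses]                    # fixed-effect weights
    sum_w = sum(w)
    fixed = sum(wi * yi for wi, yi in zip(w, estimates)) / sum_w
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, estimates))
    df = len(estimates) - 1
    c = sum_w - sum(wi ** 2 for wi in w) / sum_w
    tau2 = max(0.0, (q - df) / c)                      # between-site variance
    w_star = [1 / (se ** 2 + tau2) for se in ses]      # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, estimates)) / sum(w_star)
    se_pooled = (1 / sum(w_star)) ** 0.5
    return pooled, se_pooled, tau2

rates = [4.6, 5.5, 4.9, 5.6]   # hypothetical per-registry incidence rates
ses = [0.3, 0.4, 0.35, 0.5]    # hypothetical standard errors
pooled, se, tau2 = dersimonian_laird(rates, ses)
ci = (pooled - 1.96 * se, pooled + 1.96 * se)
```

When the heterogeneity statistic Q is below its degrees of freedom, tau² is truncated at zero and the estimate collapses to the fixed-effect pooled rate, which is why a random-effects model is the safer default when sites are known to differ.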
Estimating the contribution of HIV-infected adults to household pneumococcal transmission in South Africa, 2016-2018: A hidden Markov modelling study.
Thindwa D , Wolter N , Pinsent A , Carrim M , Ojal J , Tempia S , Moyes J , McMorrow M , Kleynhans J , Gottberg AV , French N , Cohen C , Flasche S . PLoS Comput Biol 2021 17 (12) e1009680 Human immunodeficiency virus (HIV) infected adults are at a higher risk of pneumococcal colonisation and disease, even while receiving antiretroviral therapy (ART). To help evaluate potential indirect effects of vaccination of HIV-infected adults, we assessed whether HIV-infected adults disproportionately contribute to household transmission of pneumococci. We constructed a hidden Markov model to capture the dynamics of pneumococcal carriage acquisition and clearance observed during a longitudinal household-based nasopharyngeal swabbing study, while accounting for sample misclassifications. Households were followed up twice weekly for approximately 10 months each year during a three-year study period for nasopharyngeal carriage detection via real-time PCR. We estimated the effect of participants' age, HIV status, presence of an HIV-infected adult within the household, and other covariates on pneumococcal acquisition and clearance probabilities. Of 1,684 individuals enrolled, 279 (16.6%) were younger children (<5 years old), of whom 4 (1.5%) were HIV-infected, and 726 (43.1%) were adults (≥18 years old), of whom 214 (30.4%) were HIV-infected, most (173, 81.2%) with high CD4+ count. The observed range of pneumococcal carriage prevalence across visits was substantially higher in younger children (56.9-80.5%) than in older children (5-17 years old) (31.7-50.0%) or adults (11.5-23.5%). We estimate that 14.4% (95% Confidence Interval [CI]: 13.7-15.0) of pneumococcal-negative swabs were false negatives. Daily carriage acquisition probabilities among HIV-uninfected younger children were similar in households with and without HIV-infected adults (hazard ratio: 0.95, 95% CI: 0.91-1.01). 
Longer average carriage duration (11.4 days, 95% CI: 10.2-12.8 vs 6.0 days, 95% CI: 5.6-6.3) and higher median carriage density (622 genome equivalents per millilitre, 95% CI: 507-714 vs 389, 95% CI: 311.1-435.5) were estimated in HIV-infected vs HIV-uninfected adults. The use of ART and antibiotics substantially reduced carriage duration in all age groups, and acquisition rates increased with household size. Although South African HIV-infected adults on ART have longer carriage duration and higher carriage density than their HIV-uninfected counterparts, they show similar patterns of pneumococcal acquisition and onward transmission. |
Contact tracing outcomes among household contacts of fully vaccinated COVID-19 patients - San Francisco, California, January 29-July 2, 2021.
Sachdev DD , Chew Ng R , Sankaran M , Ernst A , Hernandez KT , Servellita V , Sotomayor-Gonzalez A , Stoltey J , Cohen SE , Nguyen TQ , Chiu C , Philip S . Clin Infect Dis 2021 75 (1) e267-e275 BACKGROUND: The extent to which vaccinated persons diagnosed with COVID-19 can transmit to other vaccinated and unvaccinated persons is unclear. METHODS: Using data from the San Francisco Department of Public Health (SFDPH), this report describes outcomes of household contact tracing during January 29-July 2, 2021, where fully vaccinated COVID-19 patients were the index case in the household. RESULTS: Among 248 fully vaccinated patients with breakthrough infections, 203 (82%) were symptomatic and 105 were identified as the index patient within their household. Among 179 named household contacts, 71 (40%) were tested; over half (56%) were fully vaccinated, and the secondary attack rate was 28%. Overall transmission from a symptomatic fully vaccinated patient with breakthrough infection to household contacts was suspected in 14 of 105 (13%) households. Viral genomic sequencing of samples from 44% of fully vaccinated patients showed that 82% of those sequenced were infected by a variant of concern or interest, and 77% by a variant carrying mutation(s) associated with resistance to neutralizing antibodies. CONCLUSIONS: Transmission from fully vaccinated symptomatic index patients to vaccinated and unvaccinated household contacts can occur. Indoor face masking and timely testing of all household contacts should be considered when a household member receives a positive test result in order to identify and interrupt transmission chains. |
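As a minimal worked check of the secondary attack rate quoted above: 71 household contacts were tested, and a 28% secondary attack rate implies roughly 20 positives. The numerator is an inference, not a figure stated in the abstract.

```python
# Secondary attack rate = positive contacts / tested contacts.
# 20 positives is inferred from the reported 28%; the abstract does not
# state the numerator explicitly.
tested = 71
positive = 20
sar = positive / tested
print(f"{sar:.0%}")  # 28%
```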
Face mask fit modifications that improve source control performance.
Blachere FM , Lemons AR , Coyle JP , Derk RC , Lindsley WG , Beezhold DH , Woodfork K , Duling MG , Boutin B , Boots T , Harris JR , Nurkiewicz T , Noti JD . Am J Infect Control 2021 50 (2) 133-140 BACKGROUND: During the COVID-19 pandemic, face masks are used as source control devices to reduce the expulsion of respiratory aerosols from infected people. Modifications such as mask braces, earloop straps, knotting and tucking, and double masking have been proposed to improve mask fit; however, data on their source control performance are limited. METHODS: The effectiveness of mask fit modifications was determined by conducting fit tests on human subjects and simulator manikins and by performing simulated coughs and exhalations using a source control measurement system. RESULTS: Medical masks without modification blocked ≥56% of cough aerosols and ≥42% of exhaled aerosols. Modifying fit by crossing the earloops or placing a bracket under the mask did not increase performance, while using earloop toggles, an earloop strap, and knotting and tucking the mask increased performance. The most effective modifications for improving source control performance were double masking and using a mask brace. Placing a cloth mask over a medical mask blocked ≥85% of cough aerosols and ≥91% of exhaled aerosols. Placing a brace over a medical mask blocked ≥95% of cough aerosols and ≥99% of exhaled aerosols. CONCLUSIONS: Fit modifications can greatly improve the performance of face masks as source control devices for respiratory aerosols. |
Use of public data to describe COVID-19 contact tracing in Hubei Province and non-Hubei provinces in China between 20 January and 29 February 2020.
Dirlikov E , Zhou S , Han L , Li Z , Hao L , Millman AJ , Marston B . Western Pac Surveill Response J 2021 12 (3) 82-87 OBJECTIVE: Contact tracing has been used in China and several other countries in the WHO Western Pacific Region as part of the COVID-19 response. We describe COVID-19 cases and the number of contacts traced and quarantined per case as part of COVID-19 emergency public health response activities in China. METHODS: We abstracted publicly available, online aggregated data published in daily COVID-19 situational reports by China's National Health Commission and provincial health commissions between 20 January and 29 February 2020. The number of new contacts traced by report date was computed as the difference between total contacts traced in consecutive reports. A proxy for the number of contacts traced per case was computed as the number of new contacts traced divided by the number of new cases. RESULTS: During the study period, China reported 80 968 new COVID-19 cases and 659 899 contacts. In Hubei Province, there were 67 608 cases and 264 878 contacts, representing 83% and 40% of the total, respectively. Non-Hubei provinces reported tracing 1.5 times more contacts than Hubei Province; the weekly number of contacts traced per case was also higher in non-Hubei provinces than in Hubei Province and increased from 17.2 in epidemiological week 4 to 115.7 in epidemiological week 9. DISCUSSION: More contacts per case were reported from areas and periods with lower COVID-19 case counts. With other non-pharmaceutical interventions used in China, contact tracing and quarantining large numbers of potentially infected contacts probably contributed to reducing SARS-CoV-2 transmission. |
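The proxy described in the methods above, new contacts traced as the difference between consecutive cumulative totals, divided by the day's new cases, can be sketched with illustrative numbers (not the published situation-report values):

```python
# Sketch of the contacts-per-case proxy: daily new contacts are differences
# of consecutive cumulative report totals; divide by new cases that day.
# All numbers below are illustrative.

cumulative_contacts = [100_000, 112_000, 126_500]  # three consecutive reports
new_cases = [800, 750]                             # new cases on days 2 and 3

new_contacts = [b - a for a, b in zip(cumulative_contacts, cumulative_contacts[1:])]
contacts_per_case = [nc / cases for nc, cases in zip(new_contacts, new_cases)]
print(new_contacts)                              # [12000, 14500]
print([round(x, 1) for x in contacts_per_case])  # [15.0, 19.3]
```

Because the proxy differences cumulative totals, reporting corrections or reclassifications in a single situation report propagate directly into that day's estimate, which is one reason weekly aggregation (as in the paper) is more stable than daily values.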
Notes from the Field: COVID-19-Associated Mucormycosis - Arkansas, July-September 2021.
Dulski TM , DeLong M , Garner K , Patil N , Cima MJ , Rothfeldt L , Gulley T , Porter A , Vyas KS , Liverett HK , Toda M , Gold JAW , Kothari A . MMWR Morb Mortal Wkly Rep 2021 70 (50) 1750-1751 During September 17–24, 2021, three clinicians independently notified the Arkansas Department of Health (ADH) of multiple patients with mucormycosis after a recent diagnosis of COVID-19. To provide data to guide clinical and public health practice, ADH coordinated a statewide call with infection preventionists on October 11, 2021, to identify COVID-19–associated mucormycosis cases. |
Accuracy of Case-Based Seroprevalence of SARS-CoV-2 Antibodies in Maricopa County, Arizona.
Jehn M , Pandit U , Sabin S , Tompkins C , White J , Kaleta E , Dale AP , Ross HM , MacMcCullough J , Pepin S , Kenny K , Sanborn H , Heywood N , Schnall AH , Lant T , Sunenshine R . Am J Public Health 2022 112 (1) 38-42 We conducted a community seroprevalence survey in Arizona, from September 12 to October 1, 2020, to determine the presence of antibodies to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). We used the seroprevalence estimate to predict SARS-CoV-2 infections in the jurisdiction by applying the adjusted seroprevalence to the county's population. The estimated number of SARS-CoV-2 infections based on community seroprevalence was 4.3 times greater (95% confidence interval = 2.2, 7.5) than the number of reported cases. Field surveys with representative sampling provide data that may help fill in gaps in traditional public health reporting. |
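The extrapolation described above, applying an adjusted seroprevalence to the county population and comparing the result with reported case counts, can be sketched with hypothetical inputs (none of the values below are from the study):

```python
# Sketch of a seroprevalence-based under-ascertainment estimate.
# All inputs are hypothetical placeholders, not the study's data.

population = 4_500_000      # hypothetical county population
seroprevalence = 0.105      # hypothetical adjusted seroprevalence (10.5%)
reported_cases = 110_000    # hypothetical cumulative reported cases

estimated_infections = seroprevalence * population
underascertainment_ratio = estimated_infections / reported_cases
print(round(estimated_infections))         # 472500
print(round(underascertainment_ratio, 1))  # 4.3
```

In practice the raw seroprevalence is first adjusted for assay sensitivity and specificity and for the sampling design before being multiplied through; the confidence interval on the ratio carries both sampling and assay uncertainty.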
COVID-19, Influenza and RSV: Surveillance-informed prevention and treatment - Meeting report from an isirv-WHO virtual conference.
McKimm-Breschkin JL , Hay AJ , Cao B , Cox RJ , Dunning J , Moen AC , Olsen D , Pizzorno A , Hayden FG . Antiviral Res 2021 197 105227 The International Society for Influenza and other Respiratory Virus Diseases (isirv) and the WHO held a joint virtual conference from 19 to 21 October 2021. While there was a major focus on the global response to the SARS-CoV-2 pandemic, including antivirals, vaccines and surveillance strategies, papers were also presented on treatment and prevention of influenza and respiratory syncytial virus (RSV). Potential therapeutics for SARS-CoV-2 included the host-targeted therapies baricitinib (a JAK inhibitor), tocilizumab (an IL-6R inhibitor), and verdinexor; the direct-acting antivirals ensovibep, S-217622, and AT-527; and the monoclonal antibodies casirivimab and imdevimab, directed against the spike protein. Data from trials of nirsevimab, a monoclonal antibody with a prolonged half-life which binds to the RSV F-protein, and an Ad26.RSV pre-F vaccine were also presented. The expanded role of the WHO Global Influenza Surveillance and Response System to address the SARS-CoV-2 pandemic was also discussed. This report summarizes the oral presentations given at this meeting for the benefit of the broader medical and scientific community involved in surveillance, treatment and prevention of respiratory virus diseases. |
Notes from the Field: Mucormycosis Cases During the COVID-19 Pandemic - Honduras, May-September 2021.
Mejía-Santos H , Montoya S , Chacón-Fuentes R , Zielinski-Gutierrez E , Lopez B , Ning MF , Farach N , García-Coto F , Rodríguez-Araujo DS , Rosales-Pavón K , Urbina G , Rivera AC , Peña R , Tovar A , Paz MC , Lopez R , Pardo-Cruz F , Mendez C , Flores A , Varela M , Chiller T , Jackson BR , Jordan A , Lyman M , Toda M , Caceres DH , Gold JAW . MMWR Morb Mortal Wkly Rep 2021 70 (50) 1747-1749 On July 15, 2021, the Secretary of Health of Honduras (SHH) was notified of an unexpected number of mucormycosis cases among COVID-19 patients. SHH partnered with the Honduras Field Epidemiology Training Program, the Executive Secretariat of the Council of Ministers of Health of Central America and the Dominican Republic (SE-COMISCA), Pan American Health Organization (PAHO), and CDC to investigate mucormycosis cases at four geographically distinct hospitals in Honduras. Mucormycosis is a severe, often fatal disease caused by infection with angioinvasive molds belonging to the order Mucorales. Risk factors for mucormycosis include certain underlying medical conditions (e.g., hematologic malignancy, stem cell or solid organ transplantation, or uncontrolled diabetes) and the use of certain immunosuppressive medications (1). COVID-19 might increase mucormycosis risk because of COVID-19–induced immune dysregulation or associated medical treatments, such as systemic corticosteroids and other immunomodulatory drugs (e.g., tocilizumab), which impair the immune response against mold infections (2). In India, an apparent increase in mucormycosis cases (which was referred to by the misnomer “black fungus”) was attributed to COVID-19 (3). |
Extensively drug-resistant typhoid fever in the United States
Hughes MJ , Birhane MG , Dorough L , Reynolds JL , Caidi H , Tagg KA , Snyder CM , Yu AT , Altman SM , Boyle MM , Thomas D , Robbins AE , Waechter HA , Cody I , Mintz ED , Gutelius B , Langley G , Francois Watkins LK . Open Forum Infect Dis 2021 8 (12) ofab572 Cases of extensively drug-resistant (XDR) typhoid fever have been reported in the United States among patients who did not travel internationally. Clinicians should consider if and where the patient traveled when selecting empiric treatment for typhoid fever. XDR typhoid fever should be treated with a carbapenem, azithromycin, or both. |
Multidrug-resistant tuberculosis in U.S.-bound immigrants and refugees
Liu Y , Posey DL , Yang Q , Weinberg MS , Maloney SA , Lambert LA , Ortega LS , Marano N , Cetron MS , Phares CR . Ann Am Thorac Soc 2021 19 (6) 943-951 RATIONALE: Approximately two-thirds of new cases of tuberculosis (TB) in the United States are among non-U.S.-born persons. Culture-based overseas TB screening in U.S.-bound immigrants and refugees has substantially reduced the importation of TB into the United States, but it is unclear to what extent this program prevents the importation of multidrug-resistant TB (MDR-TB). OBJECTIVES: To study the epidemiology of MDR-TB in U.S.-bound immigrants and refugees, and to evaluate the effect of culture-based overseas TB screening in U.S.-bound immigrants and refugees on reducing the importation of MDR-TB into the United States. METHODS: We analyzed data from immigrants and refugees who completed overseas treatment for culture-positive TB during 2015-2019. We also compared the mean annual number of MDR-TB cases in non-U.S.-born persons within 1 year of arrival in the United States between 1996-2006 (when overseas screening followed a smear-based algorithm) and 2014-2019 (after full implementation of a culture-based algorithm). RESULTS: Of 3,300 culture-positive TB cases prevented by culture-based overseas TB screening in immigrants and refugees during 2015-2019, 122 (3.7%, 95% confidence interval [CI] 3.1-4.1) had MDR-TB, 20 (0.6%, 95% CI 0.3-0.9) had rifampicin-resistant TB, 382 (11.6%, 95% CI 10.5-12.7) had isoniazid-resistant TB, and 2,776 (84.1%, 95% CI 82.9-85.4) had rifampicin- and isoniazid-susceptible TB. None were diagnosed with extensively drug-resistant TB (XDR-TB). Culture-based overseas TB screening in U.S.-bound immigrants and refugees prevented 24.4 MDR-TB cases per year from arriving in the United States, 18.2 cases more than smear-based overseas TB screening. 
Mean annual number of MDR-TB cases among non-U.S.-born persons within 1 year of arrival in the United States decreased from 34.6 cases in 1996-2006 to 19.5 cases in 2014-2019 (difference of 15.1, p<0.001). CONCLUSIONS: Culture-based overseas TB screening in U.S.-bound immigrants and refugees substantially reduced the importation of MDR-TB into the United States. |
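A quick arithmetic check of the headline figures in the entry above: 122 MDR-TB cases prevented over the five years 2015-2019 averages 24.4 per year, and the decline between the smear-based and culture-based screening eras is 34.6 - 19.5 = 15.1 cases per year.

```python
# Arithmetic behind the abstract's headline figures (values as reported).
mdr_prevented = 122          # MDR-TB cases prevented, 2015-2019
years = 5
per_year = mdr_prevented / years
print(per_year)               # 24.4

era_difference = round(34.6 - 19.5, 1)  # smear-based era minus culture-based era
print(era_difference)         # 15.1
```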
Virologic outcomes among adults with HIV using integrase inhibitor-based antiretroviral therapy
Lu H , Cole SR , Westreich D , Hudgens MG , Adimora AA , Althoff KN , Silverberg MJ , Buchacz K , Li J , Edwards JK , Rebeiro PF , Lima VD , Marconi VC , Sterling TR , Horberg MA , Gill MJ , Kitahata MM , Eron JJ , Moore RD . AIDS 2022 36 (2) 277-286 BACKGROUND: Integrase strand transfer inhibitor (InSTI)-based regimens have been recommended as first-line antiretroviral therapy (ART) for adults with HIV. But data on long-term effects of InSTI-based regimens on virologic outcomes remain limited. Here we examined whether InSTI improved long-term virologic outcomes compared with efavirenz (EFV). METHODS: We included adults from the North American AIDS Cohort Collaboration on Research and Design who initiated their first ART regimen containing either InSTI or EFV between 2009 and 2016. We estimated differences in the proportion virologically suppressed up to 7 years of follow-up in observational intention-to-treat and per-protocol analyses. RESULTS: Of 15 318 participants, 5519 (36%) initiated an InSTI-based regimen and 9799 (64%) initiated the EFV-based regimen. In observational intention-to-treat analysis, 81.3% of patients in the InSTI group and 67.3% in the EFV group experienced virologic suppression at 3 months after ART initiation, corresponding to a difference of 14.0% (95% CI 12.4-15.6). At 1 year after ART initiation, the proportion virologically suppressed was 89.5% in the InSTI group and 90.2% in the EFV group, corresponding to a difference of -0.7% (95% CI -2.1 to 0.8). At 7 years, the proportion virologically suppressed was 94.5% in the InSTI group and 92.5% in the EFV group, corresponding to a difference of 2.0% (95% CI -7.3 to 11.3). The observational per-protocol results were similar to intention-to-treat analyses. CONCLUSIONS: Although InSTI-based initial ART regimens had more rapid virologic response than EFV-based regimens, the long-term virologic effect was similar. 
Our findings may inform guidelines regarding preferred initial regimens for HIV treatment. |
Efavirenz pharmacokinetics and HIV-1 viral suppression among patients receiving TB treatment containing daily high-dose rifapentine
Podany AT , Pham M , Sizemore E , Martinson N , Samaneka W , Mohapi L , Badal-Faesen S , Dawson R , Johnson JL , Mayanja H , Lalloo U , Whitworth WC , Pettit A , Campbell K , Phillips P , Bryant K , Scott N , Vernon A , Kurbatova E , Chaisson RE , Dorman S , Nahid P , Swindells S , Dooley KE , Fletcher CV . Clin Infect Dis 2021 75 (4) 560-566 BACKGROUND: A four-month regimen containing rifapentine and moxifloxacin has non-inferior efficacy compared to the standard 6-month regimen for drug-sensitive tuberculosis. We evaluated the effect of regimens containing daily, high-dose rifapentine on efavirenz pharmacokinetics and viral suppression in patients with HIV-associated TB. METHODS: In the context of a Phase 3 randomized controlled trial, HIV-positive individuals already virally suppressed on efavirenz-containing ART (EFV1), or newly initiating efavirenz (EFV2), received TB treatment containing rifapentine (1200 mg), isoniazid, pyrazinamide, and either ethambutol or moxifloxacin. Mid-interval efavirenz concentrations were measured (a) during ART and TB co-treatment (Weeks 4, 8, 12 and 17, differing by EFV group) and (b) when ART was taken alone (pre- or post-TB treatment, Weeks 0 and 22). Apparent oral clearance (CL/F) was estimated and compared. Target mid-interval efavirenz concentrations were >1 mg/L. Co-treatment was considered acceptable if >80% of participants had mid-interval efavirenz concentrations meeting this target. RESULTS: EFV1 and EFV2 included 70 and 41 evaluable participants, respectively. The geometric mean ratio comparing efavirenz CL/F with vs. without TB drugs was 0.79 [90% CI 0.72-0.85] in EFV1 and 0.84 [90% CI 0.69-0.97] in EFV2. The percentage of participants with mid-interval efavirenz concentrations >1 mg/L in EFV1 at Weeks 0, 4, 8, and 17 was 96%, 96%, 88%, and 89%, respectively. In EFV2, at approximately 4 and 8 weeks post efavirenz initiation, the value was 98%. 
CONCLUSIONS: TB treatment containing high-dose daily rifapentine modestly decreased (rather than increased) efavirenz clearance, and therapeutic targets were met, supporting the use of efavirenz with these regimens without dose adjustment. |
HIV testing and ART initiation among partners, family members, and high-risk associates of index clients participating in the CommLink linkage case management program, Eswatini, 2016-2018
Williams D , MacKellar D , Dlamini M , Byrd J , Dube L , Mndzebele P , Mazibuko S , Ao T , Pathmanathan I , Beyer A , Ryan C . PLoS One 2021 16 (12) e0261605 To help diagnose and initiate antiretroviral therapy (ART) for ≥95% of all persons living with HIV (PLHIV), the World Health Organization (WHO) recommends offering HIV testing to biological children, and sexual and needle-sharing partners of all PLHIV (index-client testing, ICT). Many index clients, however, do not identify or have contactable partners, and often substantially fewer than 95% of HIV-positive partners initiate ART soon after index testing. To help improve early HIV diagnosis and ART initiation in Eswatini (formerly Swaziland), we implemented a community-based HIV testing and peer-delivered, linkage case management program (CommLink) that provided ICT as part of a comprehensive package of WHO recommended linkage services. CommLink was implemented from June 2015 to March 2017 (Phase I) and from April 2017 to September 2018 (Phase II). In addition to biological children and partners, HIV testing was offered to adult family members (Phases I and II) and high-risk associates including friends and acquaintances (Phase II) of CommLink index clients. Compared with Phase I, in Phase II proportionally more CommLink clients disclosed their HIV-infection status to a partner or family member [94% (562/598) vs. 75% (486/652)], and had ≥1 partners, family members, or high-risk associates (contacts) tested through CommLink [41% (245/598) vs. 18% (117/652)]. Of 537 contacts tested, 253 (47%) were HIV-positive and not currently in HIV care, including 17% (17/100) of family members aged <15 years, 42% (78/187) of non-partner family members aged ≥15 years, 60% (73/121) of sexual partners, and 66% (85/129) of high-risk associates. 
Among 210 HIV-positive contacts aged ≥15 years who participated in CommLink, nearly all received recommended linkage services including treatment navigation (95%), weekly telephone follow-up (93%), and ≥3 counseling sessions (94%); peer counselors resolved 76% (306/404) of identified barriers to care (e.g., perceived wellness); and 200 (95%) initiated ART at a healthcare facility, of whom 196 (98%) received at least one antiretroviral refill before case-management services ended. To help countries achieve ≥90% ART coverage among all PLHIV, expanding ICT for adult family members and high-risk associates of index clients, and providing peer-delivered linkage case management for all identified PLHIV, should be considered. |
Community health workers' experiences in strengthening the uptake of childhood immunization and malaria prevention services in urban Sierra Leone
Ishizumi A , Sutton R , Mansaray A , Parmley L , Eleeza O , Kulkarni S , Sesay T , Conklin L , Wallace AS , Akinjeji A , Toure M , Lahuerta M , Jalloh MF . Front Public Health 2021 9 767200 Introduction: Community health workers (CHWs) play an integral role in Sierra Leone's health systems strengthening efforts. Our goal was to understand CHWs' experiences of providing immunization and malaria prevention services in urban settings and explore opportunities to optimize their contributions to these services. Methods: In 2018, we conducted an exploratory qualitative assessment in the Western Area Urban district, which covers most of the capital city of Freetown. We purposively selected diverse health facilities (i.e., type, ownership, setting) and recruited CHWs through their supervisors. We conducted eight focus group discussions (FGD) with CHWs, which were audio-recorded. The topics explored included participants' background, responsibilities and priorities of urban CHWs, sources of motivation at work, barriers to CHWs' immunization and malaria prevention activities, and strategies used to address these barriers. The local research team transcribed and translated FGDs into English; then we used qualitative content analysis to identify themes. Results: Four themes emerged from the qualitative content analysis: (1) pride, compassion, recognition, and personal benefits are important motivating factors to keep working as CHWs; (2) diverse health responsibilities and competing priorities result in overburdening of CHWs; (3) health system- and community-level barriers negatively affect CHWs' activities and motivation; (4) CHWs use context-specific strategies to address challenges in their work but require further support. Conclusion: Focused support for CHWs is needed to optimize their contributions to immunization and malaria prevention activities. 
Such interventions should be coupled with systems-level efforts to address the structural barriers that negatively affect CHWs' overall work and motivation, such as the shortage of work supplies and the lack of promised financial support. |
Inter-annual variation in prevalence of Borrelia burgdorferi sensu stricto and Anaplasma phagocytophilum in host-seeking Ixodes scapularis (Acari: Ixodidae) at long-term surveillance sites in the upper midwestern United States: Implications for public health practice
Foster E , Burtis J , Sidge JL , Tsao JI , Bjork J , Liu G , Neitzel DF , Lee X , Paskewitz S , Caporale D , Eisen RJ . Ticks Tick Borne Dis 2021 13 (2) 101886 The geographic range of the blacklegged tick, Ixodes scapularis, and its associated human pathogens have expanded substantially over the past 20 years putting an increasing number of persons at risk for tick-borne diseases, particularly in the upper midwestern and northeastern United States. Prevention and diagnosis of tick-borne diseases rely on an accurate understanding by the public and health care providers of when and where persons may be exposed to infected ticks. While tracking changes in the distribution of ticks and tick-borne pathogens provides fundamental information on risk for tick-borne diseases, metrics that incorporate prevalence of infection in ticks better characterize acarological risk. However, assessments of infection prevalence are more labor intensive and costly than simple measurements of tick or pathogen presence. Our objective was to examine whether data derived from repeated sampling at longitudinal sites substantially influences public health recommendations for Lyme disease and anaplasmosis prevention, or if more constrained sampling is sufficient. Here, we summarize inter-annual variability in prevalence of the agents of Lyme disease (Borrelia burgdorferi s.s.) and anaplasmosis (Anaplasma phagocytophilum) in host-seeking I. scapularis nymphs and adults at 28 longitudinal sampling sites in the Upper Midwestern US (Michigan, Minnesota, and Wisconsin). Infection prevalence was highly variable among sites and among years within sites. We conclude that monitoring infection prevalence in ticks aids in describing coarse acarological risk trends, but setting a fixed prevalence threshold for prevention or diagnostic decisions is not feasible given the observed variability and lack of temporal trends. 
Reducing repeated sampling of the same sites had minimal impact on regional (Upper Midwest) estimates of average infection prevalence; this information should be useful in allocating scarce public health resources for tick and tick-borne pathogen surveillance, prevention, and control activities. |
Entomological investigation following a Zika outbreak in Brownsville, Texas
Mutebi JP , Godsey M , Rose D , Barnes F , Rodriguez J , Presas YE , Qualls W , Bolling B , Rodriguez A . J Am Mosq Control Assoc 2021 37 (4) 286-290 In November and December 2016, an outbreak of locally transmitted Zika occurred in Brownsville, TX. The Texas Department of State Health Services requested a Centers for Disease Control and Prevention (CDC) Epi Aid, and as part of that Epi Aid a team of CDC entomologists was deployed in January 2017. The mission was to improve mosquito-based arbovirus surveillance and evaluate the possibility of ongoing local Zika virus (ZIKV) transmission in the city. The mosquito-based arbovirus surveillance program was expanded from 4 to 40 BG-Sentinel traps evenly distributed throughout the city. Over a 2-wk period, 15 mosquito species were detected; the most abundant species were Culex quinquefasciatus, Aedes aegypti, and Ae. albopictus, which accounted for 66.7%, 16.2%, and 5.7% of the total mosquito collection, respectively. The relative abundance of Ae. aegypti (1.0 mosquitoes/trap/day) and Ae. albopictus (0.4 mosquitoes/trap/day) was very low and unlikely to initiate and/or sustain ZIKV transmission. Zika virus was not detected in the mosquitoes collected, suggesting no or extremely low ZIKV transmission at that time. |
First record of the Asian longhorned tick Haemaphysalis longicornis in Missouri
Roberts L , Brauer B , Nicholson WL , Ayres BN , Thompson KR , Claborn DM . J Am Mosq Control Assoc 2021 37 (4) 296-297 The Asian longhorned tick, Haemaphysalis longicornis, is an invasive species, originally from eastern Asia, and was first reported in the USA in New Jersey. It is now reported in several eastern states. In 2018, researchers reported H. longicornis in northwest Arkansas (Benton County). This tick species is a proven vector of livestock and human diseases, which prompted the current survey of ticks in southwest Missouri. A tick drag in Greene County, Missouri, produced 2 H. longicornis nymphs on June 9, 2021. This is the first report of this species for both the state and county. |
Heavy metal pollution of soils and risk assessment in Houston, Texas following Hurricane Harvey
Han I , Whitworth KW , Christensen B , Afshar M , An Han H , Rammah A , Oluwadairo T , Symanski E . Environ Pollut 2021 296 118717 In August 2017, after Hurricane Harvey made landfall, almost 52 inches of rain fell during a three-day period along the Gulf Coast Region of Texas, including Harris County, where Houston is located. Harris County was heavily impacted, with over 177,000 homes and buildings (approximately 12 percent of all buildings in the county) experiencing flooding. The objective of this study was to measure 13 heavy metals in soil in residential areas and to assess cancer and non-cancer risk for children and adults after floodwaters receded. Between September and November 2017, we collected 174 surface soil samples in 10 communities, which were classified as "High Environmental Impact" or "Low Environmental Impact" communities, based on a composite metric of six environmental parameters. A second campaign was conducted between May 2019 and July 2019, when an additional 204 soil samples were collected. Concentrations of metals at both sampling campaigns were higher in High Environmental Impact communities than in Low Environmental Impact communities, and there was little change in metal levels between the two sampling periods. The Pollution Indices of lead (Pb), zinc, copper, nickel, and manganese in High Environmental Impact communities were significantly higher than those in Low Environmental Impact communities. Further, cancer risk estimates in three communities for arsenic through soil ingestion were greater than 1 in 1,000,000. Although average soil Pb was lower than the benchmark of the United States Environmental Protection Agency, the hazard indices for non-cancer outcomes in three communities, mostly attributed to Pb, were greater than 1. Health risk estimates for children living in these communities were greater than those for adults. |
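The ingestion-route cancer risk and hazard quotient screening referenced in the abstract above follow the generic EPA-style intake equations. A minimal sketch: the exposure factors, soil concentration, and toxicity values below are illustrative defaults (not the study's actual parameters), though the slope factor and reference dose shown are the commonly cited IRIS values for arsenic.

```python
import math  # not strictly needed here; kept for clarity of the formula family


def average_daily_dose(conc_mg_kg, soil_intake_mg_day, ef_days_yr, ed_yr,
                       bw_kg, at_days):
    """Chronic average daily dose (mg/kg-day) from incidental soil ingestion:
    ADD = C * IR * CF * EF * ED / (BW * AT), with CF = 1e-6 kg soil per mg."""
    return (conc_mg_kg * soil_intake_mg_day * 1e-6 * ef_days_yr * ed_yr) / (bw_kg * at_days)


# Illustrative child receptor: 10 mg/kg arsenic in soil (hypothetical value),
# 200 mg soil/day, 350 days/yr exposure for 6 years, 15 kg body weight.
LADD = average_daily_dose(10.0, 200, 350, 6, 15, 70 * 365)   # lifetime-averaged
ADD = average_daily_dose(10.0, 200, 350, 6, 15, 6 * 365)     # exposure-period avg

cancer_risk = LADD * 1.5          # x oral slope factor for arsenic, (mg/kg-day)^-1
hazard_quotient = ADD / 3.0e-4    # / oral reference dose for arsenic, mg/kg-day
```

With these illustrative inputs the incremental cancer risk exceeds the 1-in-1,000,000 benchmark while the single-metal hazard quotient stays below 1; the study's hazard indices above 1 reflect quotients summed across metals.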
Assessing exposures to per- and polyfluoroalkyl substances in two populations of Great Lakes Basin fish consumers in Western New York State
Liu M , Nordstrom M , Forand S , Lewis-Michl E , Wattigney WA , Kannan K , Wang W , Irvin-Barnwell E , Hwang SA . Int J Hyg Environ Health 2021 240 113902 BACKGROUND: Fish and other seafood are an important dietary source of per- and polyfluoroalkyl substances (PFAS) exposure in many areas of the world, and PFAS were found to be pervasive in fish from the Great Lakes area. Few studies, however, have examined the associations between Great Lakes Basin fish consumption and PFAS exposure. Many licensed anglers and Burmese refugees and immigrants residing in western New York State consume fish caught from the Great Lakes and surrounding waters, raising their risk of exposure to environmental contaminants including PFAS. The aims of this study were to: 1) present the PFAS exposure profile of the licensed anglers and Burmese refugees and 2) examine the associations between serum PFAS levels and local fish consumption. METHODS: Licensed anglers (n = 397) and Burmese participants (n = 199) provided blood samples and completed a detailed questionnaire in 2013. We measured 12 PFAS in serum. Multiple linear regression was used to assess associations between serum PFAS concentrations and self-reported consumption of fish from Great Lakes waters. RESULTS: Licensed anglers and Burmese participants reported consuming a median of 16 (IQR: 6-36) and 88 (IQR: 44-132) meals of locally caught fish in the year before sample collection, respectively (data for Burmese group restricted to 10 months of the year). Five PFAS were detected in almost all study participants (PFOS, PFOA, PFHxS, PFNA and PFDA; 97.5-100%). PFOS had the highest median serum concentration in licensed anglers (11.6 ng/mL) and the Burmese (35.6 ng/mL), approximately two and six times that of the U.S. general population, respectively. Serum levels of other PFAS in both groups were generally low and comparable to those in the general U.S. 
population. Among licensed anglers, Great Lakes Basin fish meals over the past year were positively associated with serum PFOS (P < 0.0001), PFDA (P < 0.0001), PFHxS (P = 0.01), and PFNA (P = 0.02), and the number of years consuming locally caught fish was positively associated with serum PFOS (P = 0.01) and PFDA (P = 0.01) levels. In the Burmese group, consuming Great Lakes Basin fish more than three times a week in the past summer was positively associated with serum PFOS (P = 0.004) and PFDA (P = 0.02) among the Burmese of non-Karen ethnicity, but not among those of Karen ethnicity, suggesting potential ethnic differences in PFAS exposure. CONCLUSIONS: Great Lakes Basin fish consumption was associated with an increase in blood concentrations of some PFAS, and especially of PFOS, among licensed anglers and Burmese refugees and immigrants in western New York State. In the Burmese population, there may be other important PFAS exposure routes related to residential history and ethnicity. Continued outreach efforts to increase fish advisory awareness and reduce exposure to contaminants are needed among these populations. |
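The fish-meal/serum-PFAS association above can be sketched as a single-predictor least-squares fit on log-transformed concentrations, a common choice because serum PFAS distributions are right-skewed. The data points below are hypothetical, and the paper's actual model was multiple linear regression with covariates; this only illustrates the shape of the analysis.

```python
import math


def ols_slope(x, y):
    """Least-squares slope and intercept for a single predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    return b, my - b * mx

# Hypothetical data: annual locally caught fish meals vs. serum PFOS (ng/mL)
meals = [0, 6, 16, 36, 88, 132]
pfos = [4.2, 6.5, 11.6, 14.0, 30.1, 38.5]

# Regress log(serum PFOS) on fish meals; on the log scale the slope
# converts to a percent change in PFOS per additional meal.
slope, intercept = ols_slope(meals, [math.log(c) for c in pfos])
pct_per_meal = (math.exp(slope) - 1) * 100
```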
Maternal phthalates exposure and blood pressure during and after pregnancy in the PROGRESS Study
Wu H , Kupsco A , Just A , Calafat AM , Oken E , Braun JM , Sanders AP , Mercado-Garcia A , Cantoral A , Pantic I , Téllez-Rojo MM , Wright RO , Baccarelli AA , Deierlein AL . Environ Health Perspect 2021 129 (12) 127007 BACKGROUND: Phthalate exposure is ubiquitous and may affect biological pathways related to regulators of blood pressure. Given the profound changes in vasculature during pregnancy, pregnant women may be particularly susceptible to the potential effects of phthalates on blood pressure. OBJECTIVES: We examined associations of phthalate exposure during pregnancy with maternal blood pressure trajectories from mid-pregnancy through 72 months postpartum. METHODS: Women with singleton pregnancies delivering a live birth in Mexico City were enrolled during the second trimester (n = 892). Spot urine samples from the second and third trimesters were analyzed for 15 phthalate metabolites. Blood pressure and covariate data were collected over nine visits through 72 months postpartum. We used linear, logistic, and linear mixed models; latent class growth models (LCGMs); and Bayesian kernel machine regression to estimate the relationship of urinary phthalate biomarkers with maternal blood pressure. RESULTS: As a joint mixture, phthalate biomarker concentrations during pregnancy were associated with higher blood pressure rise during mid-to-late gestation. With respect to individual biomarkers, second trimester concentrations of monobenzyl phthalate (MBzP) and di(2-ethylhexyl) phthalate biomarkers (ΣDEHP) were associated with higher third trimester blood pressure. Two trajectory classes were identified by LCGM, characterized by increasing blood pressure through 72 months postpartum ("increase-increase") or decreased blood pressure through 18 months postpartum with a gradual increase thereafter ("decrease-increase"). Increasing exposure to phthalate mixtures during pregnancy was associated with higher odds of being in the increase-increase class. 
Similar associations were observed for mono-2-ethyl-5-carboxypentyl terephthalate (MECPTP) and dibutyl phthalate (ΣDBP) biomarkers. When specific time periods were examined, distinct temporal relationships were observed for ΣDEHP, MECPTP, MBzP, and ΣDBP. DISCUSSION: In our cohort of pregnant women from Mexico City, exposure to phthalates and phthalate biomarkers was associated with higher blood pressure during late pregnancy, as well as with long-term changes in blood pressure trajectories. https://doi.org/10.1289/EHP8562. |
Estimating the number of illnesses caused by agents transmitted commonly through food: A scoping review
Scallan Walter EJ , Griffin PM , Bruce BB , Hoekstra RM . Foodborne Pathog Dis 2021 18 (12) 841-858 Estimates of the overall human health impact of agents transmitted commonly through food complement surveillance and help guide food safety interventions and regulatory initiatives. The purpose of this scoping review was to summarize the methods and reporting practices used in studies that estimate the total number of illnesses caused by these agents. We identified and included 43 studies published from January 1, 1995, to December 31, 2019, by searching PubMed and screening selected articles for other relevant publications. Selected articles presented original estimates of the number of illnesses caused by ≥1 agent transmitted commonly through food. The number of agents (species or subspecies for pathogens) included in each study ranged from 1 to 31 (median: 4.5; mean: 9.2). Of the 40 agents assessed across the 43 studies, the most common agent was Salmonella (36; 84% of studies), followed by Campylobacter (33; 77%), Shiga toxin-producing Escherichia coli (25; 58%), and norovirus (20; 47%). Investigators used a variety of data sources and methods that could be grouped into four distinct estimation approaches-direct, surveillance data scaled-up, syndrome or population scaled-down, and inferred. Based on our review, we propose four recommendations to improve the interpretability, comparability, and reproducibility of studies that estimate the number of illnesses caused by agents transmitted commonly through food. These include: providing an assessment of statistical and nonstatistical uncertainty; providing a ranking of estimates by agent, including uncertainties; describing the rationale used to select agents and data sources; and publishing raw data and models, along with clear, detailed methods. These recommendations could lead to better decision-making about food safety policies.
Although these recommendations have been made in the context of illness estimation for agents transmitted commonly through food, they also apply to estimates of other health outcomes and conditions. |
Enhancing response to foodborne disease outbreaks: Findings of the Foodborne Diseases Centers for Outbreak Response Enhancement (FoodCORE), 2010-2019
Tilashalski FP , Sillence EM , Newton AE , Biggerstaff GK . J Public Health Manag Pract 2021 28 (4) E702-E710 CONTEXT: Each year, foodborne diseases cause an estimated 48 million illnesses, resulting in 128,000 hospitalizations and 3,000 deaths in the United States. Fast and effective outbreak investigations are needed to identify and remove contaminated food from the market to reduce the number of additional illnesses that occur. Many state and local health departments have insufficient resources to identify, respond to, and control the increasing burden of foodborne illnesses. PROGRAM: The Centers for Disease Control and Prevention (CDC) Foodborne Diseases Centers for Outbreak Response Enhancement (FoodCORE) program provides targeted resources to state and local health departments to improve completeness and timeliness of laboratory, epidemiology, and environmental health activities for foodborne disease surveillance and outbreak response. IMPLEMENTATION: In 2009, pilot FoodCORE centers were selected through a competitive application process and then implemented work plans to achieve faster and more complete surveillance and outbreak response activities in their jurisdiction. By 2019, 10 centers participated in FoodCORE: Colorado, Connecticut, Minnesota, New York City, Ohio, Oregon, South Carolina, Tennessee, Utah, and Wisconsin. EVALUATION: CDC and FoodCORE centers collaboratively developed performance metrics to evaluate the impact and effectiveness of FoodCORE activities. Centers used performance metrics to document successes, identify gaps, and set goals for their jurisdiction. CDC used performance metrics to evaluate the implementation of FoodCORE priorities and identify successful strategies to develop replicable model practices. This report provides a description of implementing the FoodCORE program during year 1 (October 2010 to September 2011) through year 9 (January 2019 to December 2019).
DISCUSSION: FoodCORE centers address gaps in foodborne disease response through enhanced capacity to improve timeliness and completeness of surveillance and outbreak response activities. Strategies resulting in faster, more complete surveillance and response are documented as model practices and are shared with state and local foodborne disease programs across the country. |
Two multistate outbreaks of a reoccurring Shiga toxin-producing Escherichia coli strain associated with romaine lettuce - United States, 2018-2019
Waltenburg MA , Schwensohn C , Madad A , Seelman SL , Peralta V , Koske SE , Boyle MM , Arends K , Patel K , Mattioli M , Gieraltowski L , Neil KP . Epidemiol Infect 2021 150 e16 Leafy green vegetables are a common source of Shiga toxin-producing Escherichia coli O157:H7 (STEC O157) foodborne illness outbreaks. Ruminant animals, primarily cattle, are the major reservoir of STEC O157. Epidemiological, traceback, and field investigations were conducted to identify potential outbreak sources. Product and environmental samples were tested for STEC. A reoccurring strain of STEC O157 caused two multistate outbreaks linked to romaine lettuce in 2018 and 2019, resulting in 234 illnesses in 33 states. Over 80% of patients interviewed consumed romaine lettuce before illness. The romaine lettuce was sourced from two California growing regions: Santa Maria and Salinas Valley in 2018 and Salinas Valley in 2019. The outbreak strain was isolated from environmental samples collected at sites >90 miles apart across growing regions, as well as from romaine-containing products in 2019. Although the definitive route of romaine contamination was undetermined, use of a contaminated agricultural water reservoir in 2018 and contamination from cattle grazing on adjacent land in 2019 were suspected as possible factors. Preventing lettuce contamination from growth to consumption is imperative to preventing illness. These outbreaks highlight the need to further understand mechanisms of romaine contamination, including the role of environmental or animal reservoirs for STEC O157. |
Editorial: Codon Usage and Dinucleotide Composition of Virus Genomes: From the Virus-Host Interaction to the Development of Vaccines.
Pintó RM , Burns CC , Moratorio G . Front Microbiol 2021 12 791750 The codon usage and dinucleotide frequencies of an organism depend mostly on its nucleotide composition, but also on the evolutionary forces that have acted on its genome, including mutation, genetic drift, and selection. Different selection pressures may contribute to shaping the genome composition and codon usage of viruses:
- Codon usage and dinucleotide biases may result from the need to maintain RNA secondary structures involved in splicing and gene expression (Takata et al., 2018).
- Dinucleotide bias may result from the need to evade cell defense mechanisms. UpA, and particularly CpG, dinucleotides may be perceived as pathogen-associated molecular patterns by host cells, and consequently their frequencies in viral genomes tend to decrease (Atkinson et al., 2014).
- Codon usage may also result from translational selection. Abundant codons, pairing with abundant tRNAs, would be selected over rare codons, pairing with non-abundant tRNAs, to improve the efficiency and accuracy of translation. Conversely, rare codons would persist through mutational pressure and genetic drift (Duret, 2002; Hershberg and Petrov, 2008). Viruses do not encode tRNAs; instead they use the host tRNA pool for their own translation, making it even harder to discern the role of selection in shaping their codon usage. Although a similar codon usage between viruses and their hosts may be anticipated, excessive similarity may impede host translation, with the associated deleterious effects; consequently, viruses have evolved toward an optimal range of codon usage bias (Moratorio et al., 2013; Chen et al., 2020).
- Codon usage may also be shaped by selection for control of the translation rate. The scarcity of tRNAs pairing with rare codons may result in ribosome stalling, slowing the speed of mRNA translation. In doing so, rare codons may play a role in controlling co-translational folding (Chaney and Clark, 2015; Yu et al., 2015; Zhao et al., 2017; D'Andrea et al., 2019; Pintó and Bosch, 2021).
- Additionally, codon usage may be selected to maintain mutational robustness. Codons one mutation away from a stop codon may tend to be avoided in genomes (Moratorio et al., 2017; Carrau et al., 2019). |
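The CpG and UpA suppression discussed in the editorial is commonly quantified as an observed/expected dinucleotide ratio, f(XY) / (f(X) * f(Y)), with values well below 1.0 indicating suppression. A minimal sketch on toy sequences (not real viral genomes):

```python
def dinucleotide_odds_ratio(seq, dinuc):
    """Observed/expected ratio for a dinucleotide: f(XY) / (f(X) * f(Y)).
    Ratios well below 1.0 indicate suppression (e.g., CpG in many RNA viruses)."""
    seq = seq.upper()
    n = len(seq)
    mono = {b: seq.count(b) / n for b in set(seq)}          # mononucleotide freqs
    pairs = [seq[i:i + 2] for i in range(n - 1)]            # overlapping pairs
    f_xy = pairs.count(dinuc) / len(pairs)
    return f_xy / (mono[dinuc[0]] * mono[dinuc[1]])

# Same base composition, different CpG content:
suppressed = dinucleotide_odds_ratio("CCCCGGGG", "CG")   # CG occurs once
enriched = dinucleotide_odds_ratio("CGCGCGCG", "CG")     # CG-rich
```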
Change in self-reported health: A signal for early intervention in a Medicare population
Antol DD , Hagan A , Nguyen H , Li Y , Haugh GS , Radmacher M , Greenlund KJ , Thomas CW , Renda A , Hacker K , Shrank WH . Healthc (Amst) 2021 10 (1) 100610 BACKGROUND: Health plans and risk-bearing provider organizations seek information sources to inform proactive interventions for patients at risk of adverse health events. Interventions should take into account the strong relationship between social context and health. This retrospective cohort study of a Medicare Advantage population examined whether a change in self-reported health-related quality of life (HRQOL) signals a subsequent change in healthcare needs. METHODS: A retrospective longitudinal analysis of administrative claims data was conducted for participants in a Medicare Advantage plan with prescription drug coverage (MAPD) who responded to 2 administrations of the Centers for Disease Control and Prevention 4-item Healthy Days survey within 6-18 months during 2015-2018. Changes in HRQOL, as measured by the Healthy Days instrument, were compared with changes in utilization and costs, which were considered to be a reflection of change in healthcare needs. RESULTS: A total of 48,841 individuals met inclusion criteria. Declining HRQOL was followed by increases in utilization and costs. An adjusted analysis showed that every additional unhealthy day reported one year after baseline was accompanied by an $8 increase in monthly healthcare costs in the subsequent six months for the average patient. CONCLUSIONS: Declining HRQOL signaled subsequent increases in healthcare needs and utilization. IMPLICATIONS: Findings suggest that HRQOL assessments in general, and the Healthy Days instrument in particular, could serve as a leading indicator of the need for interventions designed to mitigate poor health outcomes and rising healthcare costs. LEVEL OF EVIDENCE: III. |
Characterizing financial sustainability of sexually transmitted disease clinics through insurance billing practices
Pearson WS , Chan PA , Cramer R , Gift TL . J Public Health Manag Pract 2021 28 (4) 358-365 CONTEXT: Sexually transmitted infections (STIs) continue to increase in the United States. Publicly funded sexually transmitted disease (STD) clinics provide important safety net services for communities at greater risk for STIs. However, creating financially sustainable models of STI care remains a challenge. OBJECTIVE: Characterization of clinic insurance billing practices and patient willingness to use insurance. DESIGN: Cross-sectional survey assessment of clinic administrators and patients. SETTING: Twenty-six STD clinics and 4138 patients attending these clinics in high STD morbidity metropolitan statistical areas in the United States. PARTICIPANTS: Clinic administrators and patients of these clinics. INTERVENTION: Survey assessment. MAIN OUTCOME MEASURE: Insurance billing practices of STD clinics and patient insurance status and willingness to use their insurance. RESULTS: Fifteen percent of clinics (4/26) indicated that they billed only Medicaid, 58% (15/26) billed both Medicaid and private insurance, 27% (7/26) did not bill for any health insurance, and none (0%) billed only private health insurance companies. Of 4138 patients surveyed, just more than one-half of patients (52.6%) were covered by some form of health insurance. More than one-half (57.2%) of all patients covered by health insurance indicated that they would be willing to use their health insurance for that visit. After adjusting for patient demographics and clinic characteristics, patients covered by government insurance were 3 times as likely (odds ratio: 3.16; 95% confidence interval, 2.44-4.10) as patients covered by private insurance to be willing to use their insurance for their visit. CONCLUSION: Opportunities exist for sustainable STI services through the enhancement of billing practices in STD clinics.
STD clinics provide care to large numbers of individuals who are insured and willing to use their insurance for their care. As Medicaid expansion continues across the country, efforts focused on improving Medicaid reimbursement rates may improve the financial sustainability of STD clinics. |
Virus decay rates should not be used to reduce recommended room air clearance times
Lindsley WG , Martin SB , Mead KR , Hammond DR . Infect Control Hosp Epidemiol 2021 43 (12) 1-2 We read with concern the letter by Hurlburt et al proposing revisions to the recommended room air clearance times for infectious aerosols in healthcare facilities. We believe that the calculations performed to justify the changes are based on flawed assumptions and an erroneous calculation. Experimental data on the survival of airborne SARS-CoV-2 virus and the dynamics of room ventilation do not support their conclusions. |
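For context, the recommended clearance times the letter defends come from the standard well-mixed dilution model, in which contaminant concentration decays exponentially with air changes per hour (ACH). A sketch under the usual assumptions (perfect mixing, no ongoing aerosol generation, ventilation as the only removal mechanism, with no added virus-decay term):

```python
import math


def clearance_time_min(ach, removal_efficiency):
    """Minutes for room air contaminants to fall to (1 - removal_efficiency)
    of the initial concentration: t = 60 * ln(C0/C) / ACH, assuming a
    well-mixed room and removal by ventilation alone."""
    return 60.0 * math.log(1.0 / (1.0 - removal_efficiency)) / ach

# At 6 air changes per hour: ~46 min for 99% clearance and ~69 min for 99.9%,
# consistent with published airborne-contaminant removal guidance tables.
t99 = clearance_time_min(6, 0.99)
t999 = clearance_time_min(6, 0.999)
```

The letter's core objection is to modifying this calculation with a virus decay rate; the model above is the unmodified baseline.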
Safety surveillance of meningococcal group B vaccine (Bexsero®), Vaccine Adverse Event Reporting System, 2015-2018.
Perez-Vilar S , Dores GM , Marquez PL , Ng CS , Cano MV , Rastogi A , Lee L , Su JR , Duffy J . Vaccine 2021 40 (2) 247-254 BACKGROUND: Bexsero® (GlaxoSmithKline) is a four-component Neisseria meningitidis serogroup B vaccine (MenB-4C). It was licensed in the United States in 2015 for use among individuals ages 10-25 years. We aimed to assess the post-licensure safety profile of MenB-4C by examining reports received in the Vaccine Adverse Event Reporting System (VAERS). METHODS: VAERS is a national passive surveillance system for adverse events (AEs) following immunization that uses the Medical Dictionary for Regulatory Activities to code reported AEs and the Code of Federal Regulations to classify reports by seriousness. In this case series, we analyzed U.S. reports involving MenB-4C received from January 23, 2015, through December 31, 2018. We used Empirical Bayesian data mining to identify MenB-4C/AE combinations reported at least twice as often as expected. RESULTS: VAERS received 1,867 reports following MenB-4C administration, representing 332 reports per million doses distributed. Most reports were for females (59%), with a median age of 17 years (interquartile range: 16-18 years); 40% of reports described simultaneous administration of other vaccines. The majority of reports were classified as non-serious (96%). The most commonly reported AEs were injection site pain (22%), pyrexia (16%), and headache (16%). Data mining identified disproportionate reporting for "injected limb mobility decreased" secondary to injection site reactions, including extensive swelling of the vaccinated limb and injection site pain. CONCLUSIONS: Analysis of passive surveillance data from over 5.6 million doses of MenB-4C distributed in the United States did not reveal new safety concerns.
The large majority of reports were classified as non-serious and the reported AEs were generally consistent with the safety experience described in clinical studies and the product's package insert. While our results are reassuring, continued post-marketing surveillance is warranted. |
Booster and Additional Primary Dose COVID-19 Vaccinations Among Adults Aged ≥65 Years - United States, August 13, 2021-November 19, 2021.
Fast HE , Zell E , Murthy BP , Murthy N , Meng L , Scharf LG , Black CL , Shaw L , Chorba T , Harris LQ . MMWR Morb Mortal Wkly Rep 2021 70 (50) 1735-1739 Vaccination against SARS-CoV-2 (the virus that causes COVID-19) is highly effective at preventing hospitalization due to SARS-CoV-2 infection, and booster and additional primary dose COVID-19 vaccinations increase protection (1-3). During August-November 2021, a series of Emergency Use Authorizations and recommendations, including those for an additional primary dose for immunocompromised persons and a booster dose for persons aged ≥18 years, were approved because of reduced immunogenicity in immunocompromised persons, waning vaccine effectiveness over time, and the introduction of the highly transmissible B.1.617.2 (Delta) variant (4,5). Adults aged ≥65 years are at increased risk for COVID-19-associated hospitalization and death and were among the first populations recommended to receive a booster dose in the U.S. (5,6). Data on COVID-19 vaccinations reported to CDC from 50 states, the District of Columbia (DC), and eight territories and freely associated states were analyzed to ascertain coverage with booster or additional primary doses among adults aged ≥65 years. During August 13-November 19, 2021, 18.7 million persons aged ≥65 years received a booster or additional primary dose of COVID-19 vaccine, constituting 44.1% of 42.5 million eligible persons in this age group who previously completed a primary vaccination series. Coverage was similar by sex and age group, but varied by primary series vaccine product and race and ethnicity, ranging from 30.3% among non-Hispanic American Indian or Alaska Native persons to 50.5% among non-Hispanic multiple/other race persons. Strategic efforts are needed to encourage eligible persons aged ≥18 years, especially those aged ≥65 years and those who are immunocompromised, to receive a booster and/or additional primary dose to ensure maximal protection against COVID-19. |
Neutralizing Antibody Response to Pseudotype SARS-CoV-2 Differs between mRNA-1273 and BNT162b2 COVID-19 Vaccines and by History of SARS-CoV-2 Infection.
Tyner HL , Burgess JL , Grant L , Gaglani M , Kuntz JL , Naleway AL , Thornburg NJ , Caban-Martinez AJ , Yoon SK , Herring MK , Beitel SC , Blanton L , Nikolich-Zugich J , Thiese MS , Pleasants JF , Fowlkes AL , Lutrick K , Dunnigan K , Yoo YM , Rose S , Groom H , Meece J , Wesley MG , Schaefer-Solle N , Louzado-Feliciano P , Edwards LJ , Olsho LEW , Thompson MG . Clin Infect Dis 2021 75 (1) e827-e837 BACKGROUND: Data on the development of neutralizing antibodies against SARS-CoV-2 after SARS-CoV-2 infection and after vaccination with messenger RNA (mRNA) COVID-19 vaccines are limited. METHODS: From a prospective cohort of 3,975 adult essential and frontline workers tested weekly from August 2020 to March 2021 for SARS-CoV-2 infection by reverse transcription-polymerase chain reaction (RT-PCR) assay irrespective of symptoms, 497 participants had sera drawn after infection (170), vaccination (327), and after both infection and vaccination (50 from the infection population). Serum was collected after infection and each vaccine dose. Serum-neutralizing antibody titers against USA-WA1/2020-spike pseudotype virus were determined by the 50% inhibitory dilution. Geometric mean titers (GMTs) and corresponding fold increases were calculated using t-tests and linear mixed effects models. RESULTS: Among 170 unvaccinated participants with SARS-CoV-2 infection, 158 (93%) developed neutralizing antibodies (nAb) with a GMT of 1,003 (95% CI = 766-1,315). Among 139 previously uninfected participants, 138 (99%) developed nAb after mRNA vaccine dose-2 with a GMT of 3,257 (95% CI = 2,596-4,052). GMT was higher among those receiving mRNA-1273 vaccine (GMT = 4,698, 95% CI = 3,186-6,926) compared to BNT162b2 vaccine (GMT = 2,309, 95% CI = 1,825-2,919). Among 32 participants with prior SARS-CoV-2 infection, GMT was 21,655 (95% CI = 14,766-31,756) after mRNA vaccine dose-1, without further increase after dose-2.
CONCLUSIONS: A single dose of mRNA vaccine after SARS-CoV-2 infection resulted in the highest observed nAb response. Two doses of mRNA vaccine in previously uninfected participants resulted in higher nAb to SARS-CoV-2 than after one dose of vaccine or SARS-CoV-2 infection alone. Neutralizing antibody response also differed by mRNA vaccine product. |
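Geometric mean titers like those reported above are computed on the log scale and back-transformed. A minimal sketch using a normal-approximation interval (the study used t-tests and linear mixed effects models, and the titer values below are hypothetical):

```python
import math
from statistics import NormalDist


def geometric_mean_titer(titers, conf=0.95):
    """GMT with an approximate CI: mean +/- z * SE on the log scale,
    then exponentiated (normal approximation to the t-interval)."""
    logs = [math.log(t) for t in titers]
    n = len(logs)
    mean = sum(logs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in logs) / (n - 1))
    z = NormalDist().inv_cdf(0.5 + conf / 2)
    half = z * sd / math.sqrt(n)
    return math.exp(mean), math.exp(mean - half), math.exp(mean + half)

# Hypothetical post-dose-2 neutralizing titers
gmt, lo, hi = geometric_mean_titer([1200, 3100, 2400, 5200, 4100, 2800])
```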
One Year of COVID-19 Vaccines: A Shot of Hope, a Dose of Reality.
Cohn AC , Mahon BE , Walensky RP . JAMA 2021 327 (2) 119-120 One year after the first authorized COVID-19 vaccine was administered in the US, the nation celebrates a historic achievement in vaccine development and delivery. In 12 months, more than an estimated 200 million people in the US have completed their primary vaccine series. Vaccination has prevented millions of COVID-19 cases and hospitalizations and saved hundreds of thousands of lives. A report released in December 2021 by the Office of the Assistant Secretary of Planning and Evaluation highlights the large effects of vaccine on health, and also the estimated social value of avoiding COVID-19 hospitalizations and deaths.1 |
Effectiveness of mRNA vaccines in preventing COVID-19 hospitalization by age and burden of chronic medical conditions among immunocompetent US adults, March-August 2021.
Lewis NM , Naioti EA , Self WH , Ginde AA , Douin DJ , Talbot HK , Casey JD , Mohr NM , Zepeski A , Gaglani M , Ghamande SA , McNeal TA , Shapiro NI , Gibbs KW , Files DC , Hager DN , Shehu A , Prekker ME , Erickson HL , Gong MN , Mohamed A , Henning DJ , Steingrub JS , Peltan ID , Brown SM , Martin ET , Hubel K , Hough CL , Busse LW , Ten Lohuis CC , Duggal A , Wilson JG , Gordon AJ , Qadir N , Chang SY , Mallow C , Rivas C , Babcock HM , Kwon JH , Exline MC , Halasa N , Chappell JD , Lauring AS , Grijalva CG , Rice TW , Rhoads JP , Stubblefield WB , Baughman A , Womack KN , Lindsell CJ , Hart KW , Zhu Y , Schrag SJ , Kobayashi M , Verani JR , Patel MM , Tenforde MW . J Infect Dis 2021 225 (10) 1694-1700 In a multi-state network, vaccine effectiveness (VE) against COVID-19 hospitalizations was evaluated among immunocompetent adults (≥18-years) during March-August 2021 using a case-control design. Among 1669 hospitalized COVID-19 cases (11% fully vaccinated) and 1950 RT-PCR-negative controls (54% fully vaccinated), VE was higher at 96% (95% CI: 93-98%) among patients with no chronic medical conditions than patients with ≥3 categories of conditions (83% [95% CI: 76-88%]). VE was similar between those aged 18-64 years vs ≥65 years (p>0.05). Vaccine effectiveness against severe COVID-19 was very high among adults without chronic conditions and lessened with increasing burden of comorbidities. |
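In the test-negative case-control design used above, vaccine effectiveness is derived from the odds ratio of vaccination among cases versus controls: VE = (1 - OR) x 100. A sketch with approximate counts back-calculated from the reported percentages (1,669 cases, 11% fully vaccinated; 1,950 controls, 54% fully vaccinated); note this is an unadjusted estimate, whereas the published VE figures were adjusted for covariates.

```python
def vaccine_effectiveness(vax_cases, unvax_cases, vax_controls, unvax_controls):
    """Unadjusted VE (%) from a test-negative case-control design:
    VE = (1 - odds ratio of vaccination, cases vs. controls) * 100."""
    odds_ratio = (vax_cases * unvax_controls) / (unvax_cases * vax_controls)
    return (1.0 - odds_ratio) * 100.0

# Approximate 2x2 counts reconstructed from the reported proportions
ve = vaccine_effectiveness(vax_cases=184, unvax_cases=1485,
                           vax_controls=1053, unvax_controls=897)
```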
Report of Health Care Provider Recommendation for COVID-19 Vaccination Among Adults, by Recipient COVID-19 Vaccination Status and Attitudes - United States, April-September 2021.
Nguyen KH , Yankey D , Lu PJ , Kriss JL , Brewer NT , Razzaghi H , Meghani M , Manns BJ , Lee JT , Singleton JA . MMWR Morb Mortal Wkly Rep 2021 70 (50) 1723-1730 Vaccination is critical to controlling the COVID-19 pandemic, and health care providers play an important role in achieving high vaccination coverage (1). To examine the prevalence of report of a provider recommendation for COVID-19 vaccination and its association with COVID-19 vaccination coverage and attitudes, CDC analyzed data among adults aged ≥18 years from the National Immunization Survey-Adult COVID Module (NIS-ACM), a nationally representative cellular telephone survey. Prevalence of report of a provider recommendation for COVID-19 vaccination among adults increased from 34.6%, during April 22-May 29, to 40.5%, during August 29-September 25, 2021. Adults who reported a provider recommendation for COVID-19 vaccination were more likely to have received ≥1 dose of a COVID-19 vaccine (77.6%) than were those who did not receive a recommendation (61.9%) (adjusted prevalence ratio [aPR] = 1.12). Report of a provider recommendation was associated with concern about COVID-19 (aPR = 1.31), belief that COVID-19 vaccines are important to protect oneself (aPR = 1.15), belief that COVID-19 vaccination was very or completely safe (aPR = 1.17), and perception that many or all of their family and friends had received COVID-19 vaccination (aPR = 1.19). Empowering health care providers to recommend vaccination to their patients could help reinforce confidence in, and increase coverage with, COVID-19 vaccines, particularly among groups known to have lower COVID-19 vaccination coverage, including younger adults, racial/ethnic minorities, and rural residents. |
Characterizing and Identifying the Prevalence of Web-Based Misinformation Relating to Medication for Opioid Use Disorder: Machine Learning Approach.
ElSherief M , Sumner SA , Jones CM , Law RK , Kacha-Ochana A , Shieber L , Cordier L , Holton K , De Choudhury M . J Med Internet Res 2021 23 (12) e30753 BACKGROUND: Expanding access to and use of medication for opioid use disorder (MOUD) is a key component of overdose prevention. An important barrier to the uptake of MOUD is exposure to inaccurate and potentially harmful health misinformation on social media or web-based forums where individuals commonly seek information. There is a significant need to devise computational techniques to describe the prevalence of web-based health misinformation related to MOUD to facilitate mitigation efforts. OBJECTIVE: By adopting a multidisciplinary, mixed methods strategy, this paper aims to present machine learning and natural language analysis approaches to identify the characteristics and prevalence of web-based misinformation related to MOUD to inform future prevention, treatment, and response efforts. METHODS: The team harnessed public social media posts and comments in the English language from Twitter (6,365,245 posts), YouTube (99,386 posts), Reddit (13,483,419 posts), and Drugs-Forum (5549 posts). Leveraging public health expert annotations on a sample of 2400 of these social media posts that were found to be semantically most similar to a variety of prevailing opioid use disorder-related myths based on representational learning, the team developed a supervised machine learning classifier. This classifier identified whether a post's language promoted one of the leading myths challenging addiction treatment: that the use of agonist therapy for MOUD is simply replacing one drug with another. Platform-level prevalence was calculated thereafter by machine labeling all unannotated posts with the classifier and noting the proportion of myth-indicative posts over all posts. 
RESULTS: Our results demonstrate promise in identifying social media postings that center on treatment myths about opioid use disorder with an accuracy of 91% and an area under the curve of 0.9, including how these discussions vary across platforms in terms of prevalence and linguistic characteristics, with the lowest prevalence on web-based health communities such as Reddit and Drugs-Forum and the highest on Twitter. Specifically, the prevalence of the stated MOUD myth ranged from 0.4% on web-based health communities to 0.9% on Twitter. CONCLUSIONS: This work provides one of the first large-scale assessments of a key MOUD-related myth across multiple social media platforms and highlights the feasibility and importance of ongoing assessment of health misinformation related to addiction treatment. |
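The prevalence-estimation step described above, machine-labeling every unannotated post and then reporting the proportion of myth-indicative posts per platform, can be sketched as follows. This is a minimal illustration only: the trivial keyword rule is a hypothetical stand-in for the paper's trained supervised classifier, and the function names and example posts are assumptions, not the authors' code.

```python
# Minimal sketch of platform-level prevalence estimation: label every post
# with a classifier, then report the proportion of myth-indicative posts.
# NOTE: the keyword rule below is a hypothetical stand-in for the paper's
# supervised classifier, which was trained on expert-annotated posts.

def myth_classifier(post: str) -> bool:
    """Hypothetical stand-in classifier: flag posts echoing the
    'replacing one drug with another' myth about agonist therapy."""
    return "replacing one drug with another" in post.lower()

def platform_prevalence(posts: list[str]) -> float:
    """Proportion of posts the classifier labels as myth-indicative."""
    if not posts:
        return 0.0
    flagged = sum(myth_classifier(p) for p in posts)
    return flagged / len(posts)

# Illustrative posts only (invented for this sketch).
posts = [
    "Methadone is just replacing one drug with another.",
    "MOUD saved my life.",
    "Buprenorphine helped me stay in recovery.",
    "Suboxone? That's replacing one drug with another drug.",
]
print(platform_prevalence(posts))  # 2 of 4 posts flagged
```

In the study itself this proportion was computed separately per platform (Twitter, YouTube, Reddit, Drugs-Forum), which is how the 0.4% versus 0.9% prevalence contrast above was obtained.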
Preferences for using a mobile app in sickle cell disease self-management: Descriptive qualitative study
Mayo-Gamble TL , Quasie-Woode D , Cunningham-Erves J , Rollins M , Schlundt D , Bonnet K , Murry VM . JMIR Form Res 2021 5 (11) e28678 BACKGROUND: Individuals with sickle cell disease (SCD) and their caregivers may benefit from technology-based resources to improve disease self-management. OBJECTIVE: This study explores the preferences regarding a mobile health (mHealth) app to facilitate self-management in adults with SCD and their caregivers living in urban and rural communities. METHODS: Five community listening sessions were conducted in 2 urban and rural communities among adults with SCD and their caregivers (N=43). Each session comprised 4 to 15 participants. Participants were asked questions on methods of finding information about SCD self-care, satisfaction with current methods for finding SCD management information, support for SCD management, important features for development of an mHealth app, and areas of benefit for using an mHealth app for SCD self-management. An inductive-deductive content analysis approach was implemented to identify the critical themes. RESULTS: Seven critical themes emerged, including the current methods for receiving self-management information, desired information, recommendations for communicating sickle cell self-management information, challenges of disease management, types of support received for disease management, barriers to and facilitators of using an mHealth app, and feature preferences for an mHealth app. In addition, we found that the participants were receptive to using mHealth apps in SCD self-management. CONCLUSIONS: This study expands our knowledge on the use of mHealth technology to reduce information access barriers pertaining to SCD. The findings can be used to develop a patient-centered, user-friendly mHealth app to facilitate disease self-management, thus increasing access to resources for families of patients with SCD residing in rural communities. |
A clinical decision support system is associated with reduced loss to follow-up among patients receiving HIV treatment in Kenya: a cluster randomized trial
Oluoch T , Cornet R , Muthusi J , Katana A , Kimanga D , Kwaro D , Okeyo N , Abu-Hanna A , de Keizer N . BMC Med Inform Decis Mak 2021 21 (1) 357 BACKGROUND: Loss to follow-up (LTFU) among HIV patients remains a major obstacle to achieving treatment goals with the risk of failure to achieve viral suppression and thereby increased HIV transmission. Although use of clinical decision support systems (CDSS) has been shown to improve adherence to HIV clinical guidance, to our knowledge, this is among the first studies conducted to show its effect on LTFU in low-resource settings. METHODS: We analyzed data from a cluster randomized controlled trial in adults and children (aged ≥ 18 months) who were receiving antiretroviral therapy at 20 HIV clinics in western Kenya between Sept 1, 2012 and Jan 31, 2014. Participating clinics were randomly assigned via block randomization to the intervention or control arm. Clinics in the control arm had electronic health records (EHR) only while the intervention arm had an EHR with CDSS. The study objectives were to assess the effects of a CDSS, implemented as alerts on an EHR system, on: (1) the proportion of patients that were LTFU, (2) LTFU patients traced and successfully linked back to treatment, and (3) time from enrollment on the study to documentation of LTFU. RESULTS: Among 5901 eligible patients receiving ART, 40.6% (n = 2396) were LTFU during the study period. CDSS was associated with lower LTFU among the patients (Adjusted Odds Ratio-aOR 0.70 (95% CI 0.65-0.77)). The proportions of patients linked back to treatment were 25.8% (95% CI 21.5-25.0) and 30.6% (95% CI 27.9-33.4) in EHR-only and EHR-with-CDSS sites, respectively. CDSS was marginally associated with reduced time from enrollment on the study to first documentation of LTFU (adjusted Hazard Ratio-aHR 0.85 (95% CI 0.78-0.92)). 
CONCLUSION: A CDSS can potentially improve quality of care through reduction and early detection of defaulting and LTFU among HIV patients and their re-engagement in care in a resource-limited country. Future research is needed on how CDSS can best be combined with other interventions to reduce LTFU. Trial registration NCT01634802. Registered at www.clinicaltrials.gov on 12-Jul-2012. Registered prospectively. |
A comparison of two population-based household surveys in Uganda for assessment of violence against youth
Currie DW , Apondi R , West CA , Biraro S , Wasula LN , Patel P , Hegle J , Howard A , Benevides de Barros R , Durant T , Chiang LF , Voetsch AC , Massetti GM . PLoS One 2021 16 (12) e0260986 Violence is associated with health-risk behaviors, potentially contributing to gender-related HIV incidence disparities in sub-Saharan Africa. Previous research has demonstrated that violence, gender, and HIV are linked via complex mechanisms that may be direct, such as through forced sex, or indirect, such as an inability to negotiate safe sex. Accurately estimating violence prevalence and its association with HIV is critical in monitoring programmatic efforts to reduce both violence and HIV. We compared prevalence estimates of violence in youth aged 15-24 years from two Ugandan population-based cross-sectional household surveys (Uganda Violence Against Children Survey 2015 [VACS] and Uganda Population-based HIV Impact Assessment 2016-2017 [UPHIA]), stratified by gender. UPHIA violence estimates were consistently lower than VACS estimates, including lifetime physical violence, recent intimate partner physical violence, and lifetime sexual violence, likely reflecting underestimation of violence in UPHIA. Multiple factors likely contributed to these differences, including the survey objectives, interviewer training, and questionnaire structure. VACS may be better suited to estimate distal determinants of HIV acquisition for youth (including experience of violence) than UPHIA, which is crucial for monitoring progress toward HIV epidemic control. |
Associations among age of first experience of violence, type of victimization, polyvictimization, and mental distress in Nigerian females
Lee N , Osborne M , Massetti G , Watson A , Self-Brown S . Violence Against Women 2021 28 2992-3012 This study explored associations of age of first victimization, sexual violence (SV), physical violence (PV), polyvictimization, and mental distress among females in Nigeria (n = 1,766, 13-24 years old) using the nationally representative 2014 Nigeria Violence Against Children Survey. Multinomial logistic regressions were performed. Nigerian females reporting SV victimization and polyvictimization were more likely to experience higher mental distress. The older the female was at the time of PV victimization, the greater the risk for mental distress. Violence is prevalent in Nigeria and its impact on youth's health is severe. However, evidence-based and data-driven policies and programs can reduce and prevent violence. |
The relationship between commercial sexual exploitation of children (CSEC) and childhood sexual abuse (CSA) among boys and girls in Haiti
Silverman JG , Boyce SC , Fonseka RW , Triplett D , Chiang LF , Caslin SS , Raj A . Int J Inj Contr Saf Promot 2021 29 (1) 1-7 To test the hypothesis that childhood sexual abuse (CSA) is a risk factor for commercial sexual exploitation of children (CSEC), we analysed data from the Haiti Violence Against Children Survey (VACS), a population-based sample of adolescents and young adults ages 13-24 (1459 males and 1457 females). Twenty-one percent of males and 25% of females reported CSA; 6% of males and 4% of females reported CSEC. The adjusted odds ratios (AORs) for CSEC based on exposure to CSA were 5.6 (95% confidence interval/CI: 3.1-10.2) for males and 5.9 (CI: 2.6-13.0) for females. For each year earlier that males first experienced CSA, the odds of CSEC increased 60% (AOR 1.6, CI 1.2-2.0). In this first nationally-representative study of lifetime CSEC, both boys and girls victimised by CSA in Haiti were more likely to have also experienced CSEC than other youth, with children who experienced CSA at younger ages at the greatest risk. |
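The adjusted odds ratios (AORs) reported above come from multivariable models, but the crude odds ratio underlying such estimates is computed from a simple 2x2 exposure-outcome table. A minimal sketch with made-up counts (not the Haiti VACS data); the Wald confidence interval shown is the standard textbook approximation:

```python
import math

def odds_ratio(a: int, b: int, c: int, d: int) -> tuple[float, tuple[float, float]]:
    """Crude odds ratio with a 95% Wald confidence interval for a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Illustrative counts only (invented, not the VACS data):
# CSEC among CSA-exposed vs. unexposed youth.
or_, (lo, hi) = odds_ratio(30, 270, 20, 1080)
print(f"OR = {or_:.1f}, 95% CI {lo:.1f}-{hi:.1f}")  # OR = 6.0
```

In the study, the reported AORs additionally adjust for covariates via logistic regression, so they differ from the crude table-based estimate sketched here.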
Prevalence of suspected concussions among K-12 students in Utah: Findings from Utah's Student Injury Reporting System
Waltzman D , Daugherty J , Sarmiento K , Haarbauer-Krupa J , Campbell H , Ferrell D . J Sch Health 2021 92 (3) 241-251 BACKGROUND: To inform prevention strategies, this study provides incidence, factors, and actions taken when a suspected concussion occurred in K-12 schools in Utah. METHODS: Data were collected using Utah's Student Injury Reporting System (SIRS) from the academic years 2011-2012 to 2018-2019. SIRS is a unique online system that tracks injuries that occur in the school setting among K-12 students in Utah. Descriptive statistics were computed to characterize students with a suspected concussion. Chi-square (χ²) analysis looking at characteristics by school level was also conducted. RESULTS: Over 63,000 K-12 students in Utah sustained an injury at school during the study period. Suspected concussions comprised 10% of all injuries. The prevalence of concussions was highest among males (60.6%) and elementary school students (42.6%) and most often occurred outdoors (57.6%) or on a playground/playfield (33.9%), and in sports- and recreation-related activities (75.1%) (specifically contact sports, 24.0%). Most students with a suspected concussion were absent 1 day or less from school (71.4%) but about 68% were seen by a medical professional. Further, there were differences by school level. Females and students playing contact sports had a higher percentage of suspected concussions as school level increased, whereas males and concussions sustained during school hours had a lower percentage of suspected concussions as school level increased. CONCLUSIONS: SIRS enables schools in Utah to identify groups at risk for concussion, as well as activities most commonly associated with these injuries, within the school environment. Using this information, schools may implement targeted prevention strategies to protect students. |
Single dose of chimeric dengue-2/Zika vaccine candidate protects mice and non-human primates against Zika virus.
Baldwin WR , Giebler HA , Stovall JL , Young G , Bohning KJ , Dean HJ , Livengood JA , Huang CY . Nat Commun 2021 12 (1) 7320 The development of a safe and effective Zika virus (ZIKV) vaccine has become a global health priority since the widespread epidemic in 2015-2016. Based on previous experience in using the well-characterized and clinically proven dengue virus serotype-2 (DENV-2) PDK-53 vaccine backbone for live-attenuated chimeric flavivirus vaccine development, we developed chimeric DENV-2/ZIKV vaccine candidates optimized for growth and genetic stability in Vero cells. These vaccine candidates retain all previously characterized attenuation phenotypes of the PDK-53 vaccine virus, including attenuation of neurovirulence for 1-day-old CD-1 mice, absence of virulence in interferon receptor-deficient mice, and lack of transmissibility in the main mosquito vectors. A single DENV-2/ZIKV dose provides protection against ZIKV challenge in mice and rhesus macaques. Overall, these data indicate that the ZIKV live-attenuated vaccine candidates are safe, immunogenic and effective at preventing ZIKV infection in multiple animal models, warranting continued development. |
Enhanced fitness of SARS-CoV-2 variant of concern Alpha but not Beta.
Ulrich L , Halwe NJ , Taddeo A , Ebert N , Schön J , Devisme C , Trüeb BS , Hoffmann B , Wider M , Fan X , Bekliz M , Essaidi-Laziosi M , Schmidt ML , Niemeyer D , Corman VM , Kraft A , Godel A , Laloli L , Kelly JN , Calderon BM , Breithaupt A , Wylezich C , Veiga IB , Gultom M , Osman S , Zhou B , Adea K , Meyer B , Eberhardt C , Thomann L , Gsell M , Labroussaa F , Jores J , Summerfield A , Drosten C , Eckerle IA , Wentworth DE , Dijkman R , Hoffmann D , Thiel V , Beer M , Benarafa C . Nature 2021 602 (7896) 307-313 Emerging variants of concern (VOC) drive the SARS-CoV-2 pandemic(1,2). Experimental assessment of replication and transmission of major VOC compared with progenitors is needed to understand the mechanisms of successful VOC emergence(3). Here, we show that Alpha and Beta spike (S) proteins have a greater affinity for the human angiotensin converting enzyme 2 (hACE2) receptor than the progenitor variant (wt-S(614G)) in vitro. Yet Alpha and wt-S(614G) had similar replication kinetics in human nasal airway epithelial cultures, whereas Beta was outcompeted by both. In vivo, competition experiments showed a clear fitness advantage of Alpha over the progenitor variant (wt-S(614G)) in ferrets and two mouse models, where the substitutions in S were major drivers for fitness advantage. In hamsters, which support high replication levels, Alpha and wt-S(614G) had comparable fitness. In contrast, Beta was outcompeted by Alpha and wt-S(614G) in hamsters and hACE2-expressing mice. Our study highlights the importance of using multiple models for complete fitness characterization of VOC and demonstrates adaptation of Alpha towards increased upper respiratory tract replication and enhanced transmission in vivo in restrictive models, whereas Beta fails to overcome contemporary strains in naïve animals. |
Laboratory and field evaluations of a commercially available real-time loop-mediated isothermal amplification assay for the detection of West Nile virus in mosquito pools
Burkhalter KL , O'Keefe M , Holbert-Watson Z , Green T , Savage HM , Markowski DM . J Am Mosq Control Assoc 2021 37 (4) 256-262 Although the specific cDNA amplification mechanisms of reverse-transcriptase polymerase chain reaction (RT-PCR) and RT loop-mediated isothermal amplification (RT-LAMP) are very different, both molecular assays serve as options to detect arboviral RNA in mosquito pools. Like RT-PCR, RT-LAMP uses a reverse transcription step to synthesize complementary DNA (cDNA) from an RNA template and then uses target-specific primers to amplify cDNA to detectable levels in a single-tube reaction. Using laboratory-generated West Nile virus (WNV) samples and field-collected mosquito pools, we evaluated the sensitivity and specificity of a commercially available WNV real-time RT-LAMP assay (Pro-AmpRT™ WNV; Pro-Lab Diagnostics, Inc., Round Rock, Texas) and compared the results to a validated real-time RT-PCR assay. Laboratory-generated virus stock samples containing ≥ 2.3 log10 plaque-forming units (PFU)/ml and intrathoracically inoculated mosquitoes containing ≥ 2.4 log10 PFU/ml produced positive results in the Pro-AmpRT WNV assay. Of field-collected pools that were WNV positive by real-time RT-PCR, 74.5% (70 of 94) were also positive by the Pro-AmpRT WNV assay, resulting in an overall Cohen's kappa agreement of 79.4% between the 2 tests. The Pro-AmpRT WNV assay shows promise as a suitable virus screening tool for vector surveillance programs provided agencies are aware of its characteristics and limitations. |
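Cohen's kappa, used above to summarize agreement between the Pro-AmpRT WNV and real-time RT-PCR results, adjusts raw percent agreement for the agreement expected by chance. A minimal sketch from a 2x2 concordance table follows; the counts are illustrative only (the study reports that 70 of 94 RT-PCR-positive pools were also RT-LAMP positive, but the discordant and negative-pool counts used here are hypothetical):

```python
def cohens_kappa(both_pos: int, pcr_only: int, lamp_only: int, both_neg: int) -> float:
    """Cohen's kappa for two binary assays from a 2x2 concordance table."""
    n = both_pos + pcr_only + lamp_only + both_neg
    observed = (both_pos + both_neg) / n  # raw percent agreement
    # Chance agreement from each assay's marginal positive/negative rates.
    pcr_pos = (both_pos + pcr_only) / n
    lamp_pos = (both_pos + lamp_only) / n
    expected = pcr_pos * lamp_pos + (1 - pcr_pos) * (1 - lamp_pos)
    return (observed - expected) / (1 - expected)

# Illustrative table: 70 pools positive by both assays, 24 by RT-PCR only,
# 0 by RT-LAMP only, plus a hypothetical 100 pools negative by both.
print(round(cohens_kappa(70, 24, 0, 100), 3))
```

Kappa near 1 indicates near-perfect agreement beyond chance; the study's reported value of 79.4% sits in the range conventionally read as substantial agreement.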
Histopathology of the broad class of carbon nanotubes and nanofibers used or produced in U.S. facilities in a murine model
Fraser K , Hubbs A , Yanamala N , Mercer RR , Stueckle TA , Jensen J , Eye T , Battelli L , Clingerman S , Fluharty K , Dodd T , Casuccio G , Bunker K , Lersch TL , Kashon ML , Orandle M , Dahm M , Schubauer-Berigan MK , Kodali V , Erdely A . Part Fibre Toxicol 2021 18 (1) 47 BACKGROUND: Multi-walled carbon nanotubes and nanofibers (CNT/F) have been previously investigated for their potential toxicities; however, comparative studies of the broad material class are lacking, especially for materials with larger diameters. Additionally, computational modeling correlating physicochemical characteristics and toxicity outcomes has been infrequently employed, and it is unclear if all CNT/F confer similar toxicity, including histopathology changes such as pulmonary fibrosis. Male C57BL/6 mice were exposed to 40 µg of one of nine CNT/F (MW #1-7 and CNF #1-2) commonly found in exposure assessment studies of U.S. facilities with diameters ranging from 6 to 150 nm. Human fibroblasts (0-20 µg/ml) were used to assess the predictive value of in vitro to in vivo modeling systems. RESULTS: All materials induced histopathology changes, although the types and magnitude of the changes varied. In general, the larger diameter MWs (MW #5-7, including Mitsui-7) and CNF #1 induced greater histopathology changes compared to MW #1 and #3 while MW #4 and CNF #2 were intermediate in effect. Differences in individual alveolar or bronchiolar outcomes and severity correlated with physical dimensions and how the materials agglomerated. Human fibroblast monocultures were found to be insufficient to fully replicate in vivo fibrosis outcomes suggesting in vitro predictive potential depends upon more advanced cell culture in vitro models. Pleural penetrations were observed more consistently in CNT/F with larger lengths and diameters. 
CONCLUSION: Physicochemical characteristics, notably nominal CNT/F dimension and agglomerate size, predicted histopathologic changes and enabled grouping of materials by their toxicity profiles. Particles of greater nominal tube length were generally associated with increased severity of histopathology outcomes. Larger particle lengths and agglomerates were associated with more severe bronchi/bronchiolar outcomes. Spherical agglomerated particles of smaller nominal tube dimension were linked to granulomatous inflammation while a mixture of smaller and larger dimensional CNT/F resulted in more severe alveolar injury. |
High-fat western diet-consumption alters crystalline silica-induced serum adipokines, inflammatory cytokines and arterial blood flow in the F344 rat
Thompson JA , Krajnak K , Johnston RA , Kashon ML , McKinney W , Fedan JS . Toxicol Rep 2022 9 12-21 Adipose tissue (AT) plays a central role in the maintenance of whole-body energy homeostasis through release of adipokines. High-fat Western diet (HFWD)-consumption contributes to obesity, disruption of adipocyte metabolism, chronic systemic inflammation, and metabolic dysfunction (MetDys). MetDys is associated with impaired lung function, pulmonary hypertension, and asthma. Thirty-five percent of adults in the U.S. have MetDys, yet the impact of MetDys on susceptibility to occupational hazards is unknown. The aim of this study was to determine the potential of HFWD-consumption to alter inhaled crystalline silica dust-induced metabolic responses. Six-wk-old male F344 rats were fed a HFWD (45 kcal % fat, sucrose 22.2 % by weight) or standard rat chow (STD, controls), and exposed to silica inhalation (6 h/d, 5 d/wk, 39 d; Min-U-Sil 5®, 15 mg/m3) or filtered air. Indices of MetDys and systemic inflammation were measured at 0, 4, and 8 wk following cessation of silica exposure. At 8 wk post-exposure, silica reduced serum leptin and adiponectin levels, and increased arterial pulse frequency. HFWD-consumption induced weight gain, altered adipokines, liver, kidney, and pancreatic function, and increased tail artery blood flow. At 8 wk in HFWD + SIL-treated animals, the levels of serum pro-inflammatory cytokines (IFN-γ, CXCL-1, TNF-α, IL-1β, IL-4, IL-5, IL-6, IL-10 and IL-13) were increased compared to STD + SIL but were less than HFWD + AIR-induced levels. In conclusion, consumption of a HFWD altered silica-induced metabolic responses and silica exposure disrupted AT endocrine function. These findings demonstrate previously unknown interactions between HFWD-consumption and occupational silica exposure. |
Prevention and awareness of birth defects across the lifespan using examples from congenital heart defects and spina bifida
Farr SL , Riley C , Van Zutphen AR , Brei TJ , Leedom VO , Kirby RS , Pabst LJ . Birth Defects Res 2021 114 (2) 35-44 The emergence of birth defects programs in the United States accelerated in the 1970s and 1980s due to recognition that the use of the drug thalidomide during pregnancy resulted in fetal abnormalities (McBride, 1961; Smithells, 1962) and concerns around environmental exposures, such as Agent Orange exposure during the Vietnam War (Erickson et al., 1984). These experiences shaped the mission of many birth defect programs to focus on the surveillance of fetuses/infants affected by birth defects to monitor prevalence, identify and respond to clusters, and explore the epidemiology of birth defects as early warning systems to identify potential teratogens. This work helped identify additional risk factors for birth defects, support primary prevention opportunities, such as folic acid fortification and supplementation for neural tube defect prevention, and enabled evaluations of the success of those efforts (Harris et al., 2017). |
Risk perception and psychological state of healthcare workers in referral hospitals during the early phase of the COVID-19 pandemic, Uganda.
Migisha R , Ario AR , Kwesiga B , Bulage L , Kadobera D , Kabwama SN , Katana E , Ndyabakira A , Wadunde I , Byaruhanga A , Amanya G , Harris JR , Fitzmaurice AG . BMC Psychol 2021 9 (1) 195 BACKGROUND: Safeguarding the psychological well-being of healthcare workers (HCWs) is crucial to ensuring sustainability and quality of healthcare services. During the COVID-19 pandemic, HCWs may be subject to excessive mental stress. We assessed the risk perception and immediate psychological state of HCWs early in the pandemic in referral hospitals involved in the management of COVID-19 patients in Uganda. METHODS: We conducted a cross-sectional survey in five referral hospitals from April 20-May 22, 2020. During this time, we distributed paper-based, self-administered questionnaires to all consenting HCWs on day shifts. The questionnaire included questions on socio-demographics, occupational behaviors, potential perceived risks, and psychological distress. We assessed risk perception towards COVID-19 using 27 concern statements with a four-point Likert scale. We defined psychological distress as a total score > 12 from the 12-item Goldberg's General Health Questionnaire (GHQ-12). We used modified Poisson regression to identify factors associated with psychological distress. RESULTS: Among 335 HCWs who received questionnaires, 328 (98%) responded. Respondents' mean age was 36 (range 18-59) years; 172 (52%) were male. The median duration of professional experience was eight (range 1-35) years; 208 (63%) worked more than 40 h per week; 116 (35%) were nurses, 52 (14%) doctors, 30 (9%) clinical officers, and 86 (26%) support staff. One hundred and forty-four (44%) had a GHQ-12 score > 12. The most common concerns reported included fear of infection at the workplace (81%), stigma from colleagues (79%), lack of workplace support (63%), and inadequate availability of personal protective equipment (PPE) (56%). 
In multivariable analysis, moderate (adjusted prevalence ratio, [aPR] = 2.2, 95% confidence interval [CI] 1.2-4.0) and high (aPR = 3.8, 95% CI 2.0-7.0) risk perception towards COVID-19 (compared with low-risk perception) were associated with psychological distress. CONCLUSIONS: Forty-four percent of HCWs surveyed in hospitals treating COVID-19 patients during the early COVID-19 epidemic in Uganda reported psychological distress related to fear of infection, stigma, and inadequate PPE. Higher perceived personal risk towards COVID-19 was associated with increased psychological distress. To optimize patient care during the pandemic and future outbreaks, workplace management may consider identifying and addressing HCW concerns, ensuring sufficient PPE and training, and reducing infection-associated stigma. |
The Wildland Firefighter Exposure and Health Effect (WFFEHE) Study: Rationale, design, and methods of a repeated-measures study
Navarro KM , Butler CR , Fent K , Toennis C , Sammons D , Ramirez-Cardenas A , Clark KA , Byrne DC , Graydon PS , Hale CR , Wilkinson AF , Smith DL , Alexander-Scott MC , Pinkerton LE , Eisenberg J , Domitrovich JW . Ann Work Expo Health 2021 66 (6) 714-727 The wildland firefighter exposure and health effect (WFFEHE) study was a 2-year repeated-measures study to investigate occupational exposures and acute and subacute health effects among wildland firefighters. This manuscript describes the study rationale, design, methods, limitations, challenges, and lessons learned. The WFFEHE cohort included fire personnel ages 18-57 from six federal wildland firefighting crews in Colorado and Idaho during the 2018 and 2019 fire seasons. All wildland firefighters employed by the recruited crews were invited to participate in the study at preseason and postseason study intervals. In 2019, one of the crews also participated in a 3-day midseason study interval where workplace exposures and pre/postshift measurements were collected while at a wildland fire incident. Study components assessed cardiovascular health, pulmonary function and inflammation, kidney function, workplace exposures, and noise-induced hearing loss. Measurements included self-reported risk factors and symptoms collected through questionnaires; serum and urine biomarkers of exposure, effect, and inflammation; pulmonary function; platelet function and arterial stiffness; and audiometric testing. Throughout the study, 154 wildland firefighters participated in at least one study interval, while 144 participated in two or more study intervals. This study was completed by the Centers for Disease Control and Prevention's National Institute for Occupational Safety and Health through a collaborative effort with the U.S. Department of Agriculture Forest Service, Department of the Interior National Park Service, and Skidmore College. 
Conducting research in the wildfire environment came with many challenges, including collecting study data from participants with changing work schedules, conducting study protocols safely, and operating laboratory equipment in remote field locations. Forthcoming WFFEHE study results will contribute to the scientific evidence regarding occupational risk factors and exposures that can impact wildland firefighter health over a season and across two wildland fire seasons. This research is anticipated to lead to the development of preventive measures and policies aimed at reducing risk for wildland firefighters and aid in identifying future research needs for the wildland fire community. |
Physical activity in the workplace: Does just working meet activity recommendations?
Quinn TD , Kline CE , Nagle E , Radonovich LJ , Barone Gibbs B . Workplace Health Saf 2021 70 (2) 81-89 Background: The physical activity (PA) health paradox hypothesizes that occupational physical activity (OPA) and leisure time PA have differential cardiovascular health effects due to increased cardiovascular load without adequate recovery; however, research describing worker PA lacks high-quality objective OPA measurement. This study aimed to objectively describe PA profiles of men reporting high OPA and make comparisons to aerobic PA and OPA recommendations. Methods: Male food service, material moving, health care, or maintenance workers wore activity (ActiGraph(®) and activPAL(®)) and heart rate monitors for 7 days. Participants recorded work, non-work, and sleep times in a diary. PA was operationalized as time spent in sedentary behavior, upright time, light, moderate, vigorous, and moderate-to-vigorous PA during work and non-work hours. PA profiles were described and compared with Centers for Disease Control and Prevention aerobic PA guidelines (≥21.4 minute/day) and OPA recommendations (<30 minute/hour upright and intensity of <30% heart rate reserve). Findings: Nineteen male workers (68% White, age = 46.6±7.9 years) were more active on workdays than non-workdays (sedentary: 492.3 vs. 629.7 minute/day; upright: 462.4 vs. 325.2 minute/day; moderate-to-vigorous PA: 72.4 vs. 41.5 minute/day, respectively; all p < .05). Most participants (17/19) achieved aerobic PA guidelines across all days with more achieving on workdays (19/19) than non-workdays (13/19). OPA often exceeded recommended limits with participants accumulating 39.6±12.2 minutes/work hour upright and 30.3±25.9% of working time >30% heart rate reserve. Conclusions/Application to Practice: Male workers reporting high OPA typically met aerobic PA guidelines but exceeded recommended OPA limits. The long-term health implications of such activity profiles should be investigated. |
Water, sanitation, and hygiene for control of trachoma in Ethiopia (WUHA): a two-arm, parallel-group, cluster-randomised trial
Aragie S , Wittberg DM , Tadesse W , Dagnew A , Hailu D , Chernet A , Melo JS , Aiemjoy K , Haile M , Zeru T , Tadesse Z , Gwyn S , Martin DL , Arnold BF , Freeman MC , Nash SD , Callahan EK , Porco TC , Lietman TM , Keenan JD . Lancet Glob Health 2022 10 (1) e87-e95 BACKGROUND: WHO promotes the SAFE strategy for the elimination of trachoma as a public health programme, which promotes surgery for trichiasis (ie, the S component), antibiotics to clear the ocular strains of chlamydia that cause trachoma (the A component), facial cleanliness to prevent transmission of secretions (the F component), and environmental improvements to provide water for washing and sanitation facilities (the E component). However, little evidence is available from randomised trials to support the efficacy of interventions targeting the F and E components of the strategy. We aimed to determine whether an integrated water, sanitation, and hygiene (WASH) intervention prevents the transmission of trachoma. METHODS: The WASH Upgrades for Health in Amhara (WUHA) was a two-arm, parallel-group, cluster-randomised trial in 40 rural communities in Wag Hemra Zone (Amhara Region, Ethiopia) that had been treated with 7 years of annual mass azithromycin distributions. The randomisation unit was the school catchment area. All households within a 1·5 km radius of a potential water point within the catchment area (as determined by the investigators) were eligible for inclusion. Clusters were randomly assigned (at a 1:1 ratio) to receive a WASH intervention either immediately (intervention) or delayed until the conclusion of the trial (control), in the absence of concurrent antibiotic distributions. Given the nature of the intervention, participants and field workers could not be masked, but laboratory personnel were masked to treatment allocation. 
The WASH intervention consisted of both hygiene infrastructure improvements (namely, construction of a community water point) and hygiene promotion by government, school, and community leaders, which were implemented at the household, school, and community levels. Hygiene promotion focused on two simple messages: to use soap and water to wash your or your child's face, and to always use a latrine for defecation. The primary outcome was the cluster-level prevalence of ocular chlamydia, measured annually using conjunctival swabs in a random sample of children aged 0-5 years from each cluster at 12, 24, and 36 month timepoints. Analyses were done in an intention-to-treat manner. This trial is ongoing and is registered at ClinicalTrials.gov, NCT02754583. FINDINGS: Between Nov 9, 2015, and March 5, 2019, 40 of 44 clusters assessed for eligibility were enrolled and randomly allocated to the trial groups (20 clusters each, with 7636 people from 1751 households in the intervention group and 9821 people from 2211 households in the control group at baseline). At baseline, ocular chlamydia prevalence among children aged 0-5 years was 11% (95% CI 6 to 16) in the WASH group and 11% (5 to 18) in the control group. At month 36, ocular chlamydia prevalence had increased in both groups, to 32% (24 to 41) in the WASH group and 31% (21 to 41) in the control group (risk difference across three annual monitoring visits, after adjustment for prevalence at baseline: 3·7 percentage points; 95% CI -4·9 to 12·4; p=0·40). No adverse events were reported in either group. INTERPRETATION: An integrated WASH intervention addressing the F and E components of the SAFE strategy did not prevent an increase in prevalence of ocular chlamydia following cessation of antibiotics in an area with hyperendemic trachoma. The impact of WASH in the presence of annual mass azithromycin distributions is currently being studied in a follow-up trial of the 40 study clusters. 
Continued antibiotic distributions will probably be important in areas with persistent trachoma. FUNDING: National Institutes of Health-National Eye Institute. TRANSLATION: For the Amharic translation of the abstract see Supplementary Materials section. |
Bioefficacy and durability of Olyset(®) Plus, a permethrin and piperonyl butoxide-treated insecticidal net in a 3-year long trial in Kenya
Gichuki PM , Kamau L , Njagi K , Karoki S , Muigai N , Matoke-Muhia D , Bayoh N , Mathenge E , Yadav RS . Infect Dis Poverty 2021 10 (1) 135 BACKGROUND: Long-lasting insecticidal nets (LLINs) are a core malaria intervention. LLINs should retain efficacy against mosquito vectors for a minimum of three years. The efficacy and durability of Olyset(®) Plus, a permethrin and piperonyl butoxide (PBO) treated LLIN, were evaluated versus permethrin-treated Olyset(®) Net. In the absence of WHO guidelines on how to evaluate PBO nets, and considering the manufacturer's product claim, Olyset(®) Plus was evaluated as a pyrethroid LLIN. METHODS: This was a household randomized controlled trial in a malaria-endemic rice cultivation zone of Kirinyaga County, Kenya between 2014 and 2017. Cone bioassays and tunnel tests were done against Anopheles gambiae Kisumu. The chemical content, fabric integrity and LLIN survivorship were monitored. Comparisons between nets were tested for significance using the Chi-square test. Exact binomial distribution with 95% confidence intervals (95% CI) was used for percentages. The WHO efficacy criteria used were ≥ 95% knockdown and/or ≥ 80% mortality rate in cone bioassays and ≥ 80% mortality and/or ≥ 90% blood-feeding inhibition in tunnel tests. RESULTS: At 36 months, Olyset(®) Plus lost 52% permethrin and 87% PBO content; Olyset(®) Net lost 24% permethrin. Over 80% of Olyset(®) Plus and Olyset(®) Net passed the WHO efficacy criteria for LLINs up to 18 and 12 months, respectively. At month 36, 91.2% of Olyset(®) Plus and 86.4% of Olyset(®) Net nets survived, while 72% and 63% had developed at least one hole. The proportionate Hole Index (pHI) values representing nets in good, serviceable and torn condition were 49.6%, 27.1% and 23.2%, respectively, for Olyset(®) Plus, and 44.9%, 32.8% and 22.2%, respectively, for Olyset(®) Net, but were not significantly different. 
CONCLUSIONS: Olyset(®) Plus retained efficacy above or close to the WHO efficacy criteria for about 2 years, longer than Olyset(®) Net (1-1.5 years). Neither net met the 3-year WHO efficacy criteria; both showed little attrition and comparable physical durability and survivorship, with about 50% of Olyset(®) Plus nets in good or serviceable condition after 3 years. Better community education on appropriate use and upkeep of LLINs is essential to ensure the effectiveness of LLIN-based malaria interventions. |
The effect of long-lasting insecticidal nets (LLINs) physical integrity on utilization
Hiruy HN , Zewde A , Irish SR , Abdelmenan S , Woyessa A , Wuletaw Y , Solomon H , Haile M , Sisay A , Chibsa S , Worku A , Yukich J , Berhane Y , Keating J . Malar J 2021 20 (1) 468 BACKGROUND: In Ethiopia, despite improvements in coverage and access, utilization of long-lasting insecticidal nets (LLINs) remains a challenge. Different household-level factors have been identified as associated with LLIN use. However, the contribution of LLIN physical integrity to their utilization is not well investigated and documented. This study aimed to assess the association between the physical integrity of LLINs and their use. METHODS: This study employed a nested case-control design using secondary data from the Ethiopian LLIN durability monitoring study conducted from May 2015 to June 2018. LLINs not used the night before the survey were identified as cases, while those used the previous night were categorized as controls. The physical integrity of LLINs was classified as no holes, good, acceptable, or torn using the proportionate hole index (pHI). A generalized estimating equation (GEE) model was used to assess and quantify the association between LLIN physical integrity and use. The model specifications included a binomial probability distribution, logit link, exchangeable correlation matrix structure, and robust standard errors. The factors included in the model were selected first by fitting binary regressions, and then by including all factors that showed statistical significance at a P-value less than 0.25, along with conceptually relevant variables, in the multivariable regression model. RESULTS: A total of 5277 observations fulfilled the inclusion criteria. Of these, 1767 observations were cases, while the remaining 3510 were controls. LLINs that were in torn physical condition had higher odds (AOR [95% CI] = 1.76 [1.41, 2.19]) of not being used compared to LLINs with no holes. 
Other factors that showed significant association included the age of the LLIN, sleeping place type, washing status of LLINs, perceptions towards net care and repair, LLIN to people ratio, economic status, and study site. CONCLUSION AND RECOMMENDATION: LLINs that have some level of physical damage have a relatively higher likelihood of not being used. Community members need to be educated about proper care and prevention of LLIN damage to delay the development of holes as long as possible and use available LLINs regularly. |
Incidence and consequences of damage to insecticide-treated mosquito nets in Kenya
Smith T , Denz A , Ombok M , Bayoh N , Koenker H , Chitnis N , Briet O , Yukich J , Gimnig JE . Malar J 2021 20 (1) 476 BACKGROUND: Efforts to improve the impact of long-lasting insecticidal nets (LLINs) should be informed by understanding of the causes of decay in effect. Holes in LLINs have been estimated to account for 7-11% of loss in effect on vectorial capacity for Plasmodium falciparum malaria in an analysis of repeated cross-sectional surveys of LLINs in Kenya. This does not account for the effect of holes as a cause of net attrition or non-use, which cannot be measured using only cross-sectional data. There is a need for estimates of how much these indirect effects of physical damage on use and attrition contribute to decay in effectiveness of LLINs. METHODS: Use, physical integrity, and survival were assessed in a cohort of 4514 LLINs followed for up to 4 years in Kenya. Flow diagrams were used to illustrate how the status of nets, in terms of categories of use, physical integrity, and attrition, changed between surveys carried out at 6-month intervals. A compartment model defined in terms of ordinary differential equations (ODEs) was used to estimate the transition rates between the categories. Effects of physical damage to LLINs on use and attrition were quantified by simulating counterfactuals in which there was no damage. RESULTS: Allowing for the direct effect of holes, the effect on use, and the effect on attrition, 18% of the impact on vectorial capacity was estimated to be lost because of damage. The estimated median lifetime of the LLINs was 2.9 years, but this was extended to 5.7 years in the counterfactual without physical damage. Nets that were in use were more likely to be in a damaged state than unused nets but use made little direct difference to LLIN lifetimes. Damage was reported as the reason for attrition for almost half of attrited nets, but the model estimated that almost all attrited nets had suffered some damage before attrition. 
CONCLUSIONS: Full quantification of the effects of damage will require measurement of the supply of new nets and of household stocks of unused nets, and also of their impacts on both net use and retention. The timing of mass distribution campaigns is less important than ensuring sufficient supply. In the Kenyan setting, nets acquired damage rapidly once use began and the damage led to rapid attrition. Increasing the robustness of nets could substantially increase their lifetime and impact but the impact of LLIN programmes on malaria transmission is ultimately limited by levels of use. Longitudinal analyses of net integrity data from different settings are needed to determine the importance of physical damage to nets as a driver of attrition and non-use, and the importance of frequent use as a cause of physical damage in different contexts. |
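The counterfactual logic above — compare net survival under fitted transition rates against a scenario with the damage rate set to zero — can be sketched with a toy compartment model. The three states and all rate values below are illustrative assumptions, not the study's fitted estimates:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative three-state model: intact -> damaged -> attrited, plus
# direct attrition from the intact state. Rates are per year and made up.
r_attrit_damaged = 0.5   # attrition rate of damaged nets
r_attrit_intact = 0.05   # attrition rate of intact nets

def rates(t, y, damage_rate):
    intact, damaged, attrited = y
    d_intact = -(damage_rate + r_attrit_intact) * intact
    d_damaged = damage_rate * intact - r_attrit_damaged * damaged
    d_attrited = r_attrit_intact * intact + r_attrit_damaged * damaged
    return [d_intact, d_damaged, d_attrited]

def median_lifetime(damage_rate, horizon=20.0):
    """Time at which half of a starting cohort of nets has attrited."""
    sol = solve_ivp(rates, (0.0, horizon), [1.0, 0.0, 0.0],
                    args=(damage_rate,), dense_output=True)
    t = np.linspace(0.0, horizon, 4001)
    surviving = sol.sol(t)[0] + sol.sol(t)[1]   # intact + damaged
    return float(t[np.searchsorted(-surviving, -0.5)])

observed = median_lifetime(1.2)        # with an assumed damage rate
counterfactual = median_lifetime(0.0)  # counterfactual: no physical damage
print(f"median lifetime: {observed:.1f} y vs {counterfactual:.1f} y without damage")
```

Removing damage extends the simulated median lifetime, mirroring the study's 2.9-year vs. 5.7-year contrast in direction (though not in magnitude, since the rates here are arbitrary).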
A memorandum of understanding has facilitated guideline development involving collaborating groups
Alam M , Getchius TS , Schünemann H , Amer YS , Bak A , Fatheree LA , Ginex P , Jakhmola P , Marsden GL , McFarlane E , Meremikwu M , Taske N , Temple-Smolkin RL , Ventura C , Burgers J , Bradfield L , O'Brien MD , Einhaus K , Kopp IB , Munn Z , Scudeller L , Schaefer C , Ibrahim SA , Kang BY , Ogunremi T , Morgan RL . J Clin Epidemiol 2021 144 8-15 OBJECTIVE: Collaboration between groups can facilitate the development of high-quality guidelines. While collaboration is often desirable, misunderstandings can occur. One method to minimize misunderstandings is the pre-specification of terms of engagement in a memorandum of understanding (MOU). This study considered when an MOU may be most helpful, and which key elements should be included. STUDY DESIGN AND SETTING: An international panel of representatives from guideline groups was convened. A literature review to identify publications and other documents relevant to the establishment of MOUs between two or more guideline groups, supplemented by available source documents, was used to inform development of a draft MOU resource. This was iteratively refined until consensus was achieved. RESULTS: The level of detail in an MOU may vary based on institutional preferences and the particular collaboration. Elements within an MOU include those pertaining to: (1) scope and purpose; (2) leadership and team; (3) methods and commitment; (4) review and endorsement; and (5) publication and dissemination. CONCLUSION: Since groups may have different expectations regarding how a collaboration will unfold, an MOU may mitigate preventable misunderstandings. The result may be a higher likelihood of producing a guideline without disruption and delay. |
Post-traumatic stress disorder, anxiety, and depression among adults with congenital heart defects
Simeone RM , Downing KF , Bobo WV , Grosse SD , Khanna AD , Farr SL . Birth Defects Res 2021 114 124-135 BACKGROUND: Due to invasive treatments and stressors related to heart health, adults with congenital heart defects (CHDs) may have an increased risk of post-traumatic stress disorder (PTSD), anxiety, and/or depressive disorders. Our objectives were to estimate the prevalence of these disorders among individuals with CHDs. METHODS: Using IBM® MarketScan® Databases, we identified adults age 18-49 years with ≥2 outpatient anxiety/depressive disorder claims on separate dates or ≥1 inpatient anxiety/depressive disorder claim in 2017. CHDs were defined as ≥2 outpatient CHD claims ≥30 days apart or ≥1 inpatient CHD claim documented in 2007-2017. We used log-binomial regression to estimate adjusted prevalence ratios (aPR) and 95% confidence intervals (CI) for associations between CHDs and anxiety/depressive disorders. RESULTS: Of 13,807 adults with CHDs, 12.4% were diagnosed with an anxiety or depressive disorder. Adults with CHDs, compared to the 5,408,094 without CHDs, had higher prevalence of PTSD (0.8% vs. 0.5%; aPR: 1.5 [CI: 1.2-1.8]), anxiety disorders (9.9% vs. 7.5%; aPR: 1.3 [CI: 1.3-1.4]), and depressive disorders (6.3% vs. 4.9%; aPR: 1.3 [CI: 1.2-1.4]). Among individuals with CHDs, female sex (aPR range: 1.6-3.3) and inpatient admission (aPR range 1.1-1.9) were associated with anxiety/depressive disorders. CONCLUSION: Over 1 in 8 adults with CHDs had diagnosed PTSD and/or other anxiety/depressive disorders, 30-50% higher than adults without CHDs. PTSD was rare, but three times more common in women with CHDs than men. Screening and referral for services for these conditions in people with CHDs may be beneficial. |
Trends in and characteristics of drug overdose deaths involving illicitly manufactured fentanyls - United States, 2019-2020
O'Donnell J , Tanz LJ , Gladden RM , Davis NL , Bitting J . MMWR Morb Mortal Wkly Rep 2021 70 (50) 1740-1746 During May 2020-April 2021, the estimated number of drug overdose deaths in the United States exceeded 100,000 over a 12-month period for the first time, with 64.0% of deaths involving synthetic opioids other than methadone (mainly illicitly manufactured fentanyls [IMFs], which include both fentanyl and illicit fentanyl analogs).* Introduced primarily as adulterants in or replacements for white powder heroin east of the Mississippi River (1), IMFs are now widespread in white powder heroin markets, increasingly pressed into counterfeit pills resembling oxycodone, alprazolam, or other prescription drugs, and are expanding into new markets, including in the western United States(†) (2). This report describes trends in overdose deaths involving IMFs (IMF-involved deaths) during July 2019-December 2020 (29 states and the District of Columbia [DC]), and characteristics of IMF-involved deaths during 2020 (39 states and DC) using data from CDC's State Unintentional Drug Overdose Reporting System (SUDORS). During July 2019-December 2020, IMF-involved deaths increased sharply in midwestern (33.1%), southern (64.7%), and western (93.9%) jurisdictions participating in SUDORS. Approximately four in 10 IMF-involved deaths also involved a stimulant. Highlighting the need for timely overdose response, 56.1% of decedents had no pulse when first responders arrived. Injection was the most frequently reported individual route of drug use (24.5%), but evidence of snorting, smoking, or ingestion without injection was found among 27.1% of decedents. Adapting and expanding overdose prevention, harm reduction, and response efforts is urgently needed to address the high potency (3) and various routes of use of IMFs. 
Enhanced treatment for substance use disorders is also needed to address the increased risk for overdose (4) and treatment complications (5) associated with using IMFs with stimulants. |
Effect of puffing behavior on particle size distributions and respiratory depositions from pod-style electronic cigarette, or vaping, products
Ranpara A , Stefaniak AB , Fernandez E , LeBouf RF . Front Public Health 2021 9 750402 Current fourth-generation ("pod-style") electronic cigarette, or vaping, products (EVPs) heat a liquid ("e-liquid") contained in a reservoir ("pod") using a battery-powered coil to deliver aerosol into the lungs. A portion of inhaled EVP aerosol is estimated to be exhaled, which can present a potential secondhand exposure risk to bystanders. The effects of modifiable factors on aerosol particle size distribution (PSD) and its respiratory deposition, for either prefilled disposable or refillable pod-style EVPs, are poorly understood. In this study, the influence of up to six puff profiles (55-, 65-, and 75-ml puff volumes at 6.5 and 7.5 W EVP power settings) on PSD was evaluated using a popular pod-style EVP (JUUL(®) brand) and a cascade impactor. JUUL(®) brand EVPs were used to aerosolize the manufacturers' e-liquids in their disposable pods and a laboratory-prepared "reference e-liquid" (without flavorings or nicotine) in refillable pods. The modeled dosimetry and calculated aerosol mass median aerodynamic diameters (MMADs) were used to estimate regional respiratory deposition. From these results, the exhaled fraction of EVP aerosols was calculated as a surrogate for the secondhand exposure potential. Overall, MMADs did not differ among puff profiles, except between the 55- and 75-ml volumes at 7.5 W (p < 0.05). For the reference e-liquid, MMADs ranged from 1.02 to 1.23 μm and dosimetry calculations predicted that particles would deposit in the head region (36-41%), in the trachea-bronchial (TB) region (19-21%), and in the pulmonary region (40-43%). For commercial JUUL(®) e-liquids, MMADs ranged from 0.92 to 1.67 μm and modeling predicted that more particles would deposit in the head region (35-52%) and in the pulmonary region (30-42%). 
Overall, 30-40% of the particles aerosolized by a pod-style EVP were estimated to deposit in the pulmonary region and 50-70% of the inhaled EVP aerosols could be exhaled; the latter could present an inhalational hazard to bystanders in indoor occupational settings. More research is needed to understand the influence of other modifiable factors on PSD and exposure potential. |
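MMAD is conventionally derived from cascade-impactor stage masses by fitting the cumulative mass fraction against log cutoff diameter on a probit scale, then reading off the 50% crossing. A minimal sketch of that calculation, using entirely hypothetical stage data (not the study's measurements):

```python
import numpy as np
from scipy.stats import norm, linregress

# Hypothetical cascade-impactor stages: cutoff diameters (um) and the mass
# collected below each cutoff (arbitrary units).
cutoffs_um = np.array([0.25, 0.5, 1.0, 2.5, 4.0, 10.0])
stage_mass = np.array([5.0, 12.0, 30.0, 28.0, 15.0, 10.0])

# Cumulative fraction of total mass below each stage's cutoff diameter.
cum_frac = np.cumsum(stage_mass) / stage_mass.sum()

# Drop the final point (probit of 1.0 is infinite), then fit probit vs log d.
mask = cum_frac < 1.0
z = norm.ppf(cum_frac[mask])            # probit transform
logd = np.log(cutoffs_um[mask])
fit = linregress(logd, z)

# MMAD: diameter where the fitted line crosses 50% cumulative mass (z = 0).
mmad = float(np.exp(-fit.intercept / fit.slope))
# Geometric standard deviation: ratio of d at 84.1% (z = 1) to d at 50%.
gsd = float(np.exp(1.0 / fit.slope))
print(f"MMAD = {mmad:.2f} um, GSD = {gsd:.2f}")
```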
Capacity of a multiplex IgM antibody capture ELISA to differentiate Zika and dengue virus infections in areas of concurrent endemic transmission
Medina FA , Vila F , Premkumar L , Lorenzi O , Paz-Bailey G , Alvarado LI , Rivera-Amill V , de Silva A , Waterman S , Muñoz-Jordán J . Am J Trop Med Hyg 2021 106 (2) 585-592 Serological cross-reactivity has proved to be a challenge to diagnose Zika virus (ZIKV) infections in dengue virus (DENV) endemic countries. Confirmatory testing of ZIKV IgM positive results by plaque reduction neutralization tests (PRNTs) provides clarification in only a minority of cases because most individuals infected with ZIKV were previously exposed to DENV. The goal of this study was to evaluate the performance of a ZIKV/DENV DUO IgM antibody capture ELISA (MAC-ELISA) for discriminating between DENV and ZIKV infections in endemic regions. Our performance evaluation included acute and convalescent specimens from patients with real-time reverse transcription polymerase chain reaction (RT-PCR)-confirmed DENV or ZIKV from the Sentinel Enhanced Dengue Surveillance System in Ponce, Puerto Rico. The ZIKV/DENV DUO MAC-ELISA specificity was 100% for DENV (N = 127) and 98.4% for ZIKV (N = 275) when specimens were tested during the optimal testing window (days post-onset of illness [DPO] 6-120). The ZIKV/DENV DUO MAC-ELISA sensitivity of RT-PCR confirmed specimens reached 100% for DENV by DPO 6 and for ZIKV by DPO 9. Our new ZIKV/DENV DUO MAC-ELISA was also able to distinguish ZIKV and DENV regardless of previous DENV exposure. We conclude this novel serologic diagnostic assay can accurately discriminate ZIKV and DENV infections. This can potentially be useful considering that the more labor-intensive and expensive PRNT assay may not be an option for confirmatory diagnosis in areas that lack PRNT capacity, but experience circulation of both DENV and ZIKV. |
Incidence Estimates of Acute Q Fever and Spotted Fever Group Rickettsioses, Kilimanjaro, Tanzania, from 2007 to 2008 and from 2012 to 2014
Pisharody S , Rubach MP , Carugati M , Nicholson WL , Perniciaro JL , Biggs HM , Maze MJ , Hertz JT , Halliday JEB , Allan KJ , Mmbaga BT , Saganda W , Lwezaula BF , Kazwala RR , Cleaveland S , Maro VP , Crump JA . Am J Trop Med Hyg 2021 106 (2) 494-503 Q fever and spotted fever group rickettsioses (SFGR) are common causes of severe febrile illness in northern Tanzania. Incidence estimates are needed to characterize the disease burden. Using hybrid surveillance, coupling case-finding at two referral hospitals with healthcare utilization data, we estimated the incidences of acute Q fever and SFGR in Moshi, Kilimanjaro, Tanzania, from 2007 to 2008 and from 2012 to 2014. Cases were defined as fever and a four-fold or greater increase in antibody titers of acute and convalescent paired sera according to the indirect immunofluorescence assay of Coxiella burnetii phase II antigen for acute Q fever and Rickettsia conorii (2007-2008) or Rickettsia africae (2012-2014) antigens for SFGR. Healthcare utilization data were used to adjust for underascertainment of cases by sentinel surveillance. For 2007 to 2008, among 589 febrile participants, 16 (4.7%) of 344 and 27 (8.8%) of 307 participants with paired serology had Q fever and SFGR, respectively. Adjusted annual incidence estimates of Q fever and SFGR were 80 (uncertainty range, 20-454) and 147 (uncertainty range, 52-645) per 100,000 persons, respectively. For 2012 to 2014, among 1,114 febrile participants, 52 (8.1%) and 57 (8.9%) of 641 participants with paired serology had Q fever and SFGR, respectively. Adjusted annual incidence estimates of Q fever and SFGR were 56 (uncertainty range, 24-163) and 75 (uncertainty range, 34-176) per 100,000 persons, respectively. We found substantial incidences of acute Q fever and SFGR in northern Tanzania during both study periods. To our knowledge, these are the first incidence estimates of either disease in sub-Saharan Africa. 
Our findings suggest that control measures for these infections warrant consideration. |
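The underascertainment adjustment described above amounts to scaling the crude sentinel-detected incidence by the inverse of the probability that a case in the catchment population is captured at a sentinel site. A minimal arithmetic sketch, with every number below an illustrative assumption rather than a study input:

```python
# Hypothetical hybrid-surveillance adjustment: cases detected at sentinel
# hospitals are scaled up by the estimated fraction of febrile illness in
# the catchment population that presents to, and is tested at, those sites.
cases_detected = 16              # e.g. confirmed cases found at sentinel sites
catchment_population = 400_000   # assumed catchment size
years_of_surveillance = 2.0

# Multiplier chain from healthcare-utilization data (both values assumed):
p_seek_care_at_sentinel = 0.25   # febrile patients attending sentinel hospitals
p_enrolled_and_tested = 0.40     # of those, enrolled with paired serology

adjustment = 1.0 / (p_seek_care_at_sentinel * p_enrolled_and_tested)
person_years = catchment_population * years_of_surveillance
crude = cases_detected / person_years * 100_000
adjusted = crude * adjustment
print(f"crude: {crude:.1f}, adjusted: {adjusted:.1f} per 100,000 person-years")
# -> crude: 2.0, adjusted: 20.0 per 100,000 person-years
```

In practice the published estimates also propagate uncertainty in each multiplier, which is why the abstract reports wide uncertainty ranges around the point estimates.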
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Community Health Services
- Disease Reservoirs and Vectors
- Environmental Health
- Food Safety
- Genetics and Genomics
- Health Economics
- Healthcare Associated Infections
- Immunity and Immunization
- Informatics
- Injury and Violence
- Laboratory Sciences
- Maternal and Child Health
- Occupational Safety and Health
- Parasitic Diseases
- Public Health Leadership and Management
- Social and Behavioral Sciences
- Substance Use and Abuse
- Zoonotic and Vectorborne Diseases
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.