Serum immune profiling for early detection of cervical disease
Ewaisha R , Panicker G , Maranian P , Unger ER , Anderson KS . Theranostics 2017 7 (16) 3814-3823 Background: The most recent (2012) worldwide estimates from the International Agency for Research on Cancer indicate that approximately 528,000 new cases and 270,000 deaths per year are attributed to cervical cancer worldwide. The disease is preventable with HPV vaccination and with early detection and treatment of pre-invasive cervical intraepithelial neoplasia (CIN). Antibodies (Abs) to HPV proteins are under investigation as potential biomarkers for early detection. Methods: To detect circulating HPV-specific IgG Abs, we developed programmable protein arrays (NAPPA) that display the proteomes of two low-risk HPV types (HPV6 and 11) and ten oncogenic high-risk HPV types (HPV16, 18, 31, 33, 35, 39, 45, 51, 52 and 58). Arrays were probed with sera from women with CIN 0/I (n=78), CIN II/III (n=84), or invasive cervical cancer (ICC, n=83). Results: Abs to any early (E) HPV protein were detected less frequently in women with CIN 0/I (23.7%) than in women with CIN II/III (39.0%) or ICC (46.1%, p<0.04). Of the E Abs, anti-E7 Abs were the most frequently detected (6.6%, 19.5%, and 30.3%, respectively). The least frequently detected Abs were E1- and E2-Abs in CIN 0/I (1.3%) and E1-Abs in CIN II/III (1.2%) and ICC (7.9%). HPV16-specific Abs correlated with HPV16 DNA detected in the cervix in 0% of CIN 0/I, 21.2% of CIN II/III, and 45.5% of ICC. A substantial proportion (29%-73%) of E4, E7, L1, and L2 Abs were cross-reactive between HPV types. Conclusion: HPV protein arrays provide a valuable high-throughput tool for measuring the breadth, specificity, and heterogeneity of the serologic response to HPV in cervical disease. |
Prevention of noise-induced hearing loss from recreational firearms
Meinke DK , Finan DS , Flamme GA , Murphy WJ , Stewart M , Lankford JE , Tasko S . Semin Hear 2017 38 (4) 267-281 In the United States and other parts of the world, recreational firearm shooting is a popular sport that puts the hearing of the shooter at risk. Peak sound pressure levels (SPLs) from firearms range from approximately 140 to 175 dB. The majority of recreational firearms (excluding small-caliber 0.17 and 0.22 rifles and air rifles) generate between 150 and 165 dB peak SPLs. High-intensity impulse sounds will permanently damage delicate cochlear structures, and thus individuals who shoot firearms are at a higher risk of bilateral, high-frequency, noise-induced hearing loss (NIHL) than peer groups who do not shoot. In this article, we describe several factors that influence the risk of NIHL including the use of a muzzle brake, the number of shots fired, the distance between shooters, the shooting environment, the choice of ammunition, the use of a suppressor, and hearing protection fit and use. Prevention strategies that address these factors and recommendations for specialized hearing protectors designed for shooting sports are offered. Partnerships are needed between the hearing health community, shooting sport groups, and wildlife conservation organizations to develop and disseminate accurate information and promote organizational resources that support hearing loss prevention efforts. |
Challenges and opportunities to scale up cardiovascular disease secondary prevention in Latin America and the Caribbean
Avezum A , Perel P , Oliveira GBF , Lopez-Jaramillo P , Restrepo G , Loustalot F , Srur A , de La Noval R , Connell KI , Cruz-Flores S , de Moura L , Castellac G , Mattos AC , Ordunez P . Glob Heart 2017 13 (2) 83-91 Cardiovascular disease (CVD) is the leading cause of death throughout the world; however, a reduction of 21% (age-standardized cardiovascular mortality rates per 100,000 inhabitants) was observed between 1990 and 2010, with more substantial reductions in CVD mortality evident in high-income countries (~42% reduction in CVD deaths) (Table 1) [1,2]. |
Community burden and prognostic impact of reduced kidney function among patients hospitalized with acute decompensated heart failure: The Atherosclerosis Risk in Communities (ARIC) Study Community Surveillance
Matsushita K , Kwak L , Hyun N , Bessel M , Agarwal SK , Loehr LR , Ni H , Chang PP , Coresh J , Wruck LM , Rosamond W . PLoS One 2017 12 (8) e0181373 BACKGROUND: Kidney dysfunction is prevalent and impacts prognosis in patients with acute decompensated heart failure (ADHF). However, most previous reports were from a single hospital, limiting their generalizability. Also, contemporary data using the new equation for estimated glomerular filtration rate (eGFR) are needed. METHODS AND RESULTS: We analyzed data from the ARIC Community Surveillance for ADHF conducted for residents aged ≥55 years in four US communities between 2005-2011. All ADHF cases (n = 5,391) were adjudicated and weighted to represent those communities (24,932 weighted cases). The association of kidney function (creatinine-based eGFR by the CKD-EPI equation and blood urea nitrogen [BUN]) during hospitalization with 1-year mortality was assessed using logistic regression. Based on worst and last serum creatinine, there were 82.5% and 70.6% with reduced eGFR (<60 ml/min/1.73m2) and 37.4% and 26.6% with severely reduced eGFR (<30 ml/min/1.73m2), respectively. Lower eGFR (regardless of last or worst eGFR), particularly eGFR <30 ml/min/1.73m2, was significantly associated with higher 1-year mortality independently of potential confounders (odds ratio 1.60 [95% CI 1.26-2.04] for last eGFR 15-29 ml/min/1.73m2 and 2.30 [1.76-3.00] for <15 compared to eGFR ≥60). The association was largely consistent across demographic subgroups. Of interest, when both eGFR and BUN were modeled together, only BUN remained significant. CONCLUSIONS: Severely reduced eGFR (<30 ml/min/1.73m2) was observed in ~30% of ADHF cases and was an independent predictor of 1-year mortality in the community. For prediction, BUN appeared to be superior to eGFR. These findings suggest the need for close attention to kidney dysfunction among ADHF patients. |
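The eGFR thresholds above come from the creatinine-based CKD-EPI equation. As a point of reference, the 2009 CKD-EPI creatinine formula can be sketched as follows; this is a minimal illustration with assumed variable names, not the ARIC study code.

```python
def ckd_epi_egfr(scr_mg_dl, age_years, female, black=False):
    """2009 CKD-EPI creatinine equation; returns eGFR in mL/min/1.73 m^2.

    scr_mg_dl: serum creatinine in mg/dL. Parameter names are illustrative.
    """
    kappa = 0.7 if female else 0.9       # sex-specific creatinine cutoff
    alpha = -0.329 if female else -0.411  # sex-specific low-creatinine exponent
    ratio = scr_mg_dl / kappa
    return (141
            * min(ratio, 1.0) ** alpha
            * max(ratio, 1.0) ** -1.209
            * 0.993 ** age_years
            * (1.018 if female else 1.0)
            * (1.159 if black else 1.0))
```

For example, a 70-year-old man with a worst creatinine of 4.0 mg/dL falls well below the <30 mL/min/1.73 m2 "severely reduced" cutoff used in the study.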
Comparison of methods for estimating prevalence of chronic diseases and health behaviors for small geographic areas: Boston validation study, 2013
Wang Y , Holt JB , Zhang X , Lu H , Shah SN , Dooley DP , Matthews KA , Croft JB . Prev Chronic Dis 2017 14 E99 INTRODUCTION: Local health authorities need small-area estimates for prevalence of chronic diseases and health behaviors for multiple purposes. We generated city-level and census-tract-level prevalence estimates of 27 measures for the 500 largest US cities. METHODS: To validate the methodology, we constructed multilevel logistic regressions to predict 10 selected health indicators among adults aged 18 years or older by using 2013 Behavioral Risk Factor Surveillance System (BRFSS) data; we applied their predicted probabilities to census population data to generate city-level, neighborhood-level, and zip-code-level estimates for the city of Boston, Massachusetts. RESULTS: By comparing the predicted estimates with their corresponding direct estimates from a locally administered survey (Boston BRFSS 2010 and 2013), we found that our model-based estimates for most of the selected health indicators at the city level were close to the direct estimates from the local survey. We also found strong correlation between the model-based estimates and direct survey estimates at neighborhood and zip code levels for most indicators. CONCLUSION: Findings suggest that our model-based estimates are reliable and valid at the city level for certain health outcomes. Local health authorities can use the neighborhood-level estimates if high quality local health survey data are not otherwise available. |
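The method above applies model-predicted probabilities to census population data to produce small-area estimates. The final aggregation step can be sketched as follows; the cell counts and probabilities are illustrative assumptions, not the authors' code or data.

```python
def small_area_prevalence(cells):
    """Model-based prevalence for one small area.

    cells: list of (census_population_count, model_predicted_probability)
    pairs, one per demographic cell in the area (illustrative layout).
    """
    expected_cases = sum(n * p for n, p in cells)
    population = sum(n for n, _ in cells)
    return expected_cases / population

# e.g. a tract with two demographic cells of 1,200 and 800 residents
estimate = small_area_prevalence([(1200, 0.10), (800, 0.25)])
```

Summing the cell-level expected cases before dividing by total population is what lets one set of regression coefficients yield estimates at city, neighborhood, and zip-code levels.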
The Female Genital Tract Microbiome Is Associated With Vaginal Antiretroviral Drug Concentrations in Human Immunodeficiency Virus-Infected Women on Antiretroviral Therapy.
Donahue Carlson R , Sheth AN , Read TD , Frisch MB , Mehta CC , Martin A , Haaland RE , Patel AS , Pau CP , Kraft CS , Ofotokun I . J Infect Dis 2017 216 (8) 990-999 Background: The female genital tract (FGT) microbiome may affect vaginal pH and other factors that influence drug movement into the vagina. We examined the relationship between the microbiome and antiretroviral concentrations in the FGT. Methods: Over one menstrual cycle, 20 human immunodeficiency virus (HIV)-infected women virologically suppressed on tenofovir (TFV) disoproxil fumarate/emtricitabine and ritonavir-boosted atazanavir (ATV) underwent serial paired cervicovaginal and plasma sampling for antiretroviral concentrations using high-performance liquid chromatography-tandem mass spectrometry. Analysis of 16S ribosomal RNA gene sequencing of cervicovaginal lavage clustered each participant visit into a unique microbiome community type (mCT). Results: Participants were predominantly African American (95%), with a median age of 38 years. Cervicovaginal lavage sequencing (n = 109) resulted in a low-diversity mCT dominated by Lactobacillus (n = 40), and intermediate-diversity (n = 28) and high-diversity (n = 41) mCTs with abundance of anaerobic taxa. In multivariable models, geometric mean FGT:plasma ratios varied significantly by mCT for all 3 drugs. For both ATV and TFV, FGT:plasma was significantly lower in participant visits with high- and low-diversity mCT groups (all P < .02). For emtricitabine, FGT:plasma was significantly lower in participant visits with low- vs intermediate-diversity mCT groups (P = .002). Conclusions: Certain FGT mCTs are associated with decreased FGT antiretroviral concentrations. These findings are relevant for optimizing antiretrovirals used for biomedical HIV prevention in women. |
Time to treatment for rifampicin-resistant tuberculosis: systematic review and meta-analysis.
Boyd R , Ford N , Padgen P , Cox H . Int J Tuberc Lung Dis 2017 21 (11) 1173-1180 BACKGROUND: To reduce transmission and improve patient outcomes, rapid diagnosis and treatment of rifampicin-resistant tuberculosis (RR-TB) is required. OBJECTIVE: To conduct a systematic review and meta-analysis assessing time to treatment for RR-TB and its variability by diagnostic testing method and treatment delivery approach. DESIGN: Studies from 2000 to 2015 reporting time to second-line treatment initiation were selected from PubMed and published conference abstracts. RESULTS: From 53 studies, 83 cohorts (13,034 patients) were included. Overall weighted mean time to treatment from specimen collection was 81 days (95%CI 70-91), and was shorter with ambulatory (57 days, 95%CI 40-74) than with hospital-based treatment (86 days, 95%CI 71-102). Time to treatment was shorter with genotypic susceptibility testing (38 days, 95%CI 27-49) than with phenotypic testing (108 days, 95%CI 98-117). The mean percentage of diagnosed patients initiating treatment was 76% (95%CI 70-83, range 25-100). CONCLUSION: Time to second-line anti-tuberculosis treatment initiation is extremely variable across studies, and often unnecessarily long. Reduced delays are associated with genotypic testing and ambulatory treatment settings. Routine monitoring of the proportion of diagnosed patients initiating treatment and of time to treatment is necessary to identify areas for intervention. |
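The abstract reports weighted mean times to treatment without specifying the weighting scheme; a common choice in meta-analysis is inverse-variance weighting. A minimal sketch under that assumption (not necessarily the authors' method):

```python
def pooled_mean(means, variances):
    """Inverse-variance weighted mean across study cohorts.

    means: per-cohort mean time to treatment (days);
    variances: per-cohort variance of that mean. Illustrative only.
    """
    weights = [1.0 / v for v in variances]
    return sum(w * m for w, m in zip(weights, means)) / sum(weights)
```

With equal variances this reduces to the simple mean; more precise cohorts (smaller variance) pull the pooled estimate toward themselves.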
Detailed Transmission Network Analysis of a Large Opiate-Driven Outbreak of HIV Infection in the United States.
Campbell EM , Jia H , Shankar A , Hanson D , Luo W , Masciotra S , Owen SM , Oster AM , Galang RR , Spiller MW , Blosser SJ , Chapman E , Roseberry JC , Gentry J , Pontones P , Duwve J , Peyrani P , Kagan RM , Whitcomb JM , Peters PJ , Heneine W , Brooks JT , Switzer WM . J Infect Dis 2017 216 (9) 1053-1062 In January 2015, an outbreak of undiagnosed human immunodeficiency virus (HIV) infections among persons who inject drugs (PWID) was recognized in rural Indiana. By September 2016, 205 persons in this community of approximately 4400 had received a diagnosis of HIV infection. We report results of new approaches to analyzing epidemiologic and laboratory data to understand transmission during this outbreak. HIV genetic distances were calculated using the polymerase region. Networks were generated using data about reported high-risk contacts, viral genetic similarity, and their most parsimonious combinations. Sample collection dates and recency assay results were used to infer dates of infection. Epidemiologic and laboratory data each generated large and dense networks. Integration of these data revealed subgroups with epidemiologic and genetic commonalities, one of which appeared to contain the earliest infections. Predicted infection dates suggest that transmission began in 2011, underwent explosive growth in mid-2014, and slowed after the declaration of a public health emergency. Results from this phylodynamic analysis suggest that the majority of infections had likely already occurred when the investigation began and that early transmission may have been associated with sexual activity and injection drug use. Early and sustained efforts are needed to detect infections and prevent or interrupt rapid transmission within networks of uninfected PWID. |
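Networks built from viral genetic similarity, as described above, typically link specimens whose pairwise genetic distance falls below a threshold (1.5% substitutions per site is a common convention in tools such as HIV-TRACE). The threshold and data layout below are assumptions for illustration, not details reported by the paper.

```python
def genetic_links(pairwise_distances, threshold=0.015):
    """Return specimen pairs linked in a putative transmission network.

    pairwise_distances: dict mapping (id_a, id_b) -> genetic distance in
    substitutions/site (e.g. computed over the polymerase region).
    """
    return sorted(pair for pair, d in pairwise_distances.items() if d <= threshold)

links = genetic_links({("A", "B"): 0.004, ("A", "C"): 0.031, ("B", "C"): 0.012})
```

Pairs A-B and B-C are linked here while A-C is not, so A and C still fall in the same connected component through B, which is how dense clusters emerge in outbreaks like this one.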
Reporting deaths among children aged <5 years after the Ebola virus disease epidemic - Bombali District, Sierra Leone, 2015-2016
Wilkinson AL , Kaiser R , Jalloh MF , Kamara M , Blau DM , Raghunathan PL , Kamara A , Kamara U , Houston-Suluku N , Clarke K , Jambai A , Redd JT , Hersey S , Osaio-Kamara B . MMWR Morb Mortal Wkly Rep 2017 66 (41) 1116-1118 Mortality surveillance and vital registration are limited in Sierra Leone, a country with one of the highest mortality rates among children aged <5 years worldwide, approximately 120 deaths per 1,000 live births (1,2). To inform efforts to strengthen surveillance, stillbirths and deaths in children aged <5 years from multiple surveillance streams in Bombali Sebora chiefdom were retrospectively reviewed. In total, during January 2015-November 2016, 930 deaths in children aged <5 years were identified, representing 73.3% of the 1,269 deaths that were expected based on modeled estimates. The "117" telephone alert system established during the Ebola virus disease (Ebola) epidemic captured 683 (73.4%) of all reported deaths in children aged <5 years, and was the predominant reporting source for stillbirths (n = 172). In the absence of complete vital events registration, 117 call alerts markedly improved the completeness of reporting of stillbirths and deaths in children aged <5 years. |
Spatial distribution of extensively drug-resistant tuberculosis (XDR TB) patients in KwaZulu-Natal, South Africa
Kapwata T , Morris N , Campbell A , Mthiyane T , Mpangase P , Nelson KN , Allana S , Brust JCM , Moodley P , Mlisana K , Gandhi NR , Shah NS . PLoS One 2017 12 (10) e0181797 BACKGROUND: KwaZulu-Natal province, South Africa, has among the highest burden of XDR TB worldwide, with the majority of cases occurring due to transmission. Poor access to health facilities can be a barrier to timely diagnosis and treatment of TB, which can contribute to ongoing transmission. We sought to determine the geographic distribution of XDR TB patients and their proximity to health facilities in KwaZulu-Natal. METHODS: We recruited adults and children with XDR TB diagnosed in KwaZulu-Natal. We calculated distance and time from participants' homes to the closest hospital or clinic, as well as to the actual facility that diagnosed XDR TB, using tools within ArcGIS Network Analyst. Speed of travel was assigned to road classes based on Department of Transport regulations. Results were compared to guidelines for the provision of social facilities in South Africa: 5 km to a clinic and 30 km to a hospital. RESULTS: During 2011-2014, 1027 new XDR TB cases were diagnosed throughout all 11 districts of KwaZulu-Natal, of whom 404 (39%) were enrolled and had geospatial data collected. Participants would have had to travel a mean distance of 2.9 km (95% CI: 1.8-4.1) to the nearest clinic and 17.6 km (95% CI: 11.4-23.8) to the nearest hospital. Actual distances that participants travelled to the health facility that diagnosed XDR TB ranged from <10 km (n = 143, 36%) to >50 km (n = 109, 27%), with a mean of 69 km. The majority (77%) of participants travelled farther than the recommended distance to a clinic (5 km), and 39% travelled farther than the recommended distance to a hospital (30 km). Nearly half (46%) of participants were diagnosed at a health facility in eThekwini district, of whom 36% resided outside the Durban metropolitan area. 
CONCLUSIONS: XDR TB cases are widely distributed throughout KwaZulu-Natal province with a denser focus in eThekwini district. Patients travelled long distances to the health facility where they were diagnosed with XDR TB, suggesting a potential role for migration or transportation in the XDR TB epidemic. |
Trends in pelvic inflammatory disease emergency department visits, United States, 2006-2013
Kreisel K , Flagg EW , Torrone E . Am J Obstet Gynecol 2017 218 (1) 117.e1-117.e10 BACKGROUND: Pelvic inflammatory disease is a female genital tract disorder with severe reproductive sequelae. Because of difficulties in diagnosing pelvic inflammatory disease, it is not a reportable condition in many states. Females seeking care in emergency departments are a sentinel population for pelvic inflammatory disease surveillance. OBJECTIVE: To determine trends in diagnoses of acute pelvic inflammatory disease in a nationally representative sample of emergency departments. STUDY DESIGN: All emergency department visits among females aged 15-44 years with an International Classification of Diseases, Ninth Revision, Clinical Modification diagnosis code indicating pelvic inflammatory disease during 2006-2013 were assessed from the Healthcare Cost and Utilization Project Nationwide Emergency Department Sample. Total and annual percent changes in the proportion of pelvic inflammatory disease emergency department visits were estimated using trend analyses. RESULTS: While the number of emergency department visits among females aged 15-44 years during 2006-2013 increased (6.5 to 7.4 million), the percent of visits due to pelvic inflammatory disease decreased from 0.57% in 2006 to 0.41% in 2013 (total percent change: -28.1%; annual percent change: -4.3%; 95% CI: -5.7%, -2.9%). The largest decreases were among those aged 15-19 years (total percent change: -40.6%; annual percent change: -6.6%; 95% CI: -8.6%, -4.4%) and those living in the South (total percent change: -38.0%; annual percent change: -6.2%; 95% CI: -7.8%, -4.6%). Females aged 15-19 years who lived in the South had a 47.9% decrease in visits due to pelvic inflammatory disease (annual percent change: -8.4%; 95% CI: -10.4%, -6.5%). 
Patients living in zip codes with the lowest median income (<$38,000) had the highest percent of visits with a pelvic inflammatory disease diagnosis; the smallest declines over time were in patients living in zip codes with the highest median income (i.e., >$64,000, total percent change: -24.4%; annual percent change: -3.8%; 95% CI: -5.2%, -2.4%). The percent of emergency department visits due to pelvic inflammatory disease was highest among patients not charged for their visit, self-paying, or those covered by Medicaid, with total percent changes in these three groups of -27.8%, -30.7%, and -35.1%, respectively. Patients with Medicaid coverage had the largest decrease in visits with a diagnosis of pelvic inflammatory disease (total percent change: -35.1%; annual percent change: -5.8%; 95% CI: -7.2%, -4.3%). CONCLUSIONS: Nationally representative data indicate the percent of emergency department visits with a pelvic inflammatory disease diagnosis decreased during 2006-2013 among females aged 15-44 years, primarily driven by decreased diagnoses of pelvic inflammatory disease among females aged 15-19 years and among women living in the Southern United States. Despite declines, a large number of females of reproductive age are receiving care for pelvic inflammatory disease in emergency departments. Patients with lower median income and no or public health insurance status, which may decrease access to and use of healthcare services, consistently had the highest percent of emergency department visits due to pelvic inflammatory disease. Future research should focus on obtaining a better understanding of factors influencing trends in pelvic inflammatory disease diagnoses and ways to address the challenges surrounding surveillance for this condition. |
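The annual percent change (APC) figures above are typically derived from a log-linear trend model, log(rate) = a + b*year, with APC = 100*(e^b - 1). A minimal sketch under that assumption (illustrative, not the authors' code):

```python
import math

def annual_percent_change(years, rates):
    """APC from an ordinary least-squares log-linear fit of rate on year."""
    xs = [y - years[0] for y in years]        # center on first year
    ys = [math.log(r) for r in rates]
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return 100.0 * (math.exp(slope) - 1.0)
```

A series declining by exactly 5% per year (1.0, 0.95, 0.9025) recovers an APC of -5%, which is why a constant APC corresponds to a straight line on the log scale.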
Maritime illness and death reporting and public health response, United States, 2010-2014
Stamatakis C , Rice M , Washburn F , Krohn K , Bannerman M , Regan JJ . Travel Med Infect Dis 2017 19 16-21 BACKGROUND: Deaths and certain illnesses onboard ships arriving at US ports are required to be reported to the US Centers for Disease Control and Prevention (CDC), and notifications of certain illnesses are requested. METHODS: We performed a descriptive analysis of required maritime illness and death reports of presumptive diagnoses and requested notifications to CDC's Division of Global Migration and Quarantine, which manages CDC's Quarantine Stations, from January 2010 to December 2014. RESULTS: CDC Quarantine Stations received 2891 individual maritime case reports: 76.8% (2221/2891) illness reports, and 23.2% (670/2891) death reports. The most frequent individual illness reported was varicella (35.9%, 797/2221) and the most frequently reported causes of death were cardiovascular- or pulmonary-related conditions (79.6%, 533/670). There were 7695 cases of influenza-like illness received within aggregate notifications. CDC coordinated 63 contact investigations with partners to identify 972 contacts; 88.0% (855/972) were notified. There was documentation of 6.5% (19/293) receiving post-exposure prophylaxis. Three pertussis contacts were identified as secondary cases; and one tuberculosis contact was diagnosed with active tuberculosis. CONCLUSION: These data provide a picture of US maritime illness and death reporting and response. Varicella reports are the most frequent individual disease reports received. Contact investigations identified few cases of disease transmission. |
Prevalence and direct costs of emergency department visits and hospitalizations for selected diseases that can be transmitted by water, United States
Adam EA , Collier SA , Fullerton KE , Gargano JW , Beach MJ . J Water Health 2017 15 (5) 673-683 National emergency department (ED) visit prevalence and costs for selected diseases that can be transmitted by water were estimated using large healthcare databases (acute otitis externa, campylobacteriosis, cryptosporidiosis, Escherichia coli infection, free-living ameba infection, giardiasis, hepatitis A virus (HAV) infection, Legionnaires' disease, nontuberculous mycobacterial (NTM) infection, Pseudomonas-related pneumonia or septicemia, salmonellosis, shigellosis, and vibriosis or cholera). An estimated 477,000 annual ED visits (95% CI: 459,000-494,000) were documented, with 21% (n = 101,000, 95% CI: 97,000-105,000) resulting in immediate hospital admission. The remaining 376,000 annual treat-and-release ED visits (95% CI: 361,000-390,000) resulted in $194 million in annual direct costs. Most treat-and-release ED visits (97%) and costs ($178 million/year) were associated with acute otitis externa. HAV ($5.5 million), NTM ($2.3 million), and salmonellosis ($2.2 million) were associated with next highest total costs. Cryptosporidiosis ($2,035), campylobacteriosis ($1,783), and NTM ($1,709) had the highest mean costs per treat-and-release ED visit. Overall, the annual hospitalization and treat-and-release ED visit costs associated with the selected diseases totaled $3.8 billion. As most of these diseases are not solely transmitted by water, an attribution process is needed as a next step to determine the proportion of these visits and costs attributable to waterborne transmission. |
Provision of ART to individuals infected with HIV: impact on the epidemiology and control of tuberculosis
Borgdorff MW , De Cock KM . Int J Tuberc Lung Dis 2017 21 (11) 1091-1092 The provision of antiretroviral therapy (ART) has changed the face of the human immunodeficiency virus (HIV) epidemic. In high HIV burden settings it has improved survival and contributed to declining HIV incidence.1,2 Declining HIV incidence has been followed by declining tuberculosis (TB) incidence in some high TB-HIV burden countries, presumably through declining HIV prevalence among young adults, and through a declining prevalence of advanced immunodeficiency attributable to ART.3-6 While the incidence of TB among HIV-infected individuals on ART remains higher than among HIV-negative individuals, including from an increased risk of recurrent TB, it is much lower than among HIV-infected individuals not on ART.7-9 Survival of TB patients with advanced HIV co-infection is improved with early start of ART.10 Thus, ART has also had major benefits for TB treatment and control. There are two potential areas of concern from the increased use of ART that need further clarification to better predict the full impact of ART on TB epidemiology. One is the lifetime risk of TB, the other is the infectiousness of HIV-infected TB patients on ART. ART reduces the risk per year among HIV-infected individuals of developing TB,11 but at the same time it increases their life expectancy.1 As a result, the impact of ART on the lifetime risk of developing TB is unclear.12 While postponing the development of TB is likely to represent a net benefit not only for the individual concerned, but also for the general population, further information on the lifetime risk of TB would be helpful to determine the net direct benefit of ART on TB incidence. Obviously, as ART is expected to reduce HIV transmission and thus HIV incidence, this by itself is expected to reduce TB incidence. |
Expansion of viral load testing and the potential impact on human immunodeficiency virus drug resistance
Chun HM , Obeng-Aduasare YF , Broyles LN , Ellenberger D . J Infect Dis 2017 216 S808-S811 Increasing the volume, strengthening the quality, and proactively using the data from human immunodeficiency virus (HIV) load testing are pivotal to limiting the threat of HIV drug resistance (HIVDR) accumulation and to enabling optimal case-based HIVDR surveillance. Triangulation of viral load (VL) and HIVDR testing data could be pursued to answer key questions and to translate data and results into program and public policy. Identification of virologic failure and early management mitigate the greater risk of HIVDR. Routine VL monitoring and evaluation systems are necessary, and countries should consider reviewing system requirements, structural needs, and procedural and technical factors for the entire VL cascade, with special emphasis on post-test result use. |
HIV testing, linkage to HIV medical care, and interviews for partner services among women - 61 health department jurisdictions, United States, Puerto Rico, and the U.S. Virgin Islands, 2015
Stein R , Xu S , Marano M , Williams W , Cheng Q , Eke A , Moore A , Wang G . MMWR Morb Mortal Wkly Rep 2017 66 (41) 1100-1104 Diagnoses of human immunodeficiency virus (HIV) infection among women declined 17% during 2011-2015, and a total of 7,498 women received a diagnosis of HIV infection in 2015 (1). Although black or African American (black) women accounted for only 12% of the U.S. female population, 60% of women with newly diagnosed HIV infection were black (1,2). By the end of 2014, an estimated 255,900 women were living with HIV infection (3), including approximately 12% who did not know they were infected; in addition, approximately 45% of women who had received a diagnosis had not achieved viral suppression (3). HIV testing is an important public health strategy for identifying women with HIV infection and linking them to HIV medical care. Analysis of CDC-funded program data submitted by 61 health departments in 2015 indicated that among 4,749 women tested who received a diagnosis of HIV infection, 2,951 (62%) had received a diagnosis in the past (previous diagnosis), and 1,798 (38%) were receiving a diagnosis for the first time (new diagnosis). Of those who had received a previous diagnosis, 87% were not in HIV medical care at the time of the current test. Testing and identifying women who are living with HIV infection but who are not in care (regardless of when they received their first diagnosis) and rapidly linking them to care so they can receive antiretroviral therapy and become virally suppressed are essential for reducing HIV infection among all women. |
Human parainfluenza virus surveillance in pediatric patients with lower respiratory tract infections: a special view of parainfluenza type 4
Thomazelli LM , Oliveira DBL , Durigon GS , Whitaker B , Kamili S , Berezin EN , Durigon EL . J Pediatr (Rio J) 2017 94 (5) 554-558 OBJECTIVE: Characterize the role of human parainfluenza virus and its clinical features in Brazilian children under 2 years of age presenting with acute lower respiratory tract infections. METHODS: Real-time assays were used to identify strains of human parainfluenza virus and other common respiratory viruses in nasopharyngeal aspirates. One thousand and two children presenting with acute lower respiratory tract illnesses were enrolled from February 2008 to August 2010. RESULTS: One hundred and four (10.4%) patients were human parainfluenza virus positive, of whom 60 (57.7%) were positive for human parainfluenza virus-3, 30 (28.8%) for human parainfluenza virus-4, 12 (11.5%) for human parainfluenza virus-1, and two (1.9%) for human parainfluenza virus-2. Seven (6.7%) patients had more than one strain of human parainfluenza virus detected. The most frequent symptoms were tachypnea and cough, similar to other viral respiratory infections. Clinical manifestations did not differ significantly between human parainfluenza virus-1, -2, -3, and -4 infections. Human parainfluenza virus-1, -3, and -4 were present in the population studied throughout the three years of surveillance, with human parainfluenza virus-3 being the predominant type identified in the first two years. CONCLUSION: Human parainfluenza viruses contribute substantially to pediatric acute respiratory illness (ARI) in Brazil, with nearly 30% of this contribution attributable to human parainfluenza virus-4. |
Impact of human immunodeficiency virus drug resistance on treatment of human immunodeficiency virus infection in children in low- and middle-income countries
Siberry GK , Amzel A , Ramos A , Rivadeneira ED . J Infect Dis 2017 216 S838-S842 Children living with human immunodeficiency virus (HIV) in low- and middle-income countries (LMICs) experience higher rates of virologic failure than adults. Human immunodeficiency virus drug resistance (HIVDR) plays a major role in pediatric HIV treatment failure because nonsuppressive maternal antiretroviral therapy (ART) during pregnancy and breastfeeding as well as infant antiretroviral prophylaxis lead to high rates of pretreatment drug resistance to regimens most commonly used in children living with HIV. Lack of availability of durable, potent drugs in child-friendly formulations in LMICs and adherence difficulties contribute to acquired drug resistance during treatment. Optimizing drugs available for treating children living with HIV in LMICs, providing robust adherence support, and ensuring virologic monitoring for children receiving ART are essential for reducing HIVDR and improving treatment outcomes for children living with HIV in LMICs. |
In- and out-of-hospital mortality associated with seasonal and pandemic influenza and respiratory syncytial virus in South Africa, 2009-2013
Cohen C , Walaza S , Treurnicht FK , McMorrow M , Madhi SA , McAnerney JM , Tempia S . Clin Infect Dis 2017 66 (1) 95-103 Background: Estimates of influenza- and respiratory syncytial virus (RSV)-associated mortality burden are important to guide policy for control. Data are limited on the contribution of out-of-hospital deaths to this mortality. Methods: We modeled excess mortality attributable to influenza and RSV infection by applying regression models to weekly deaths from national vital statistics from 2009 through 2013, using influenza and RSV laboratory surveillance data as covariates. We fitted separate models for in- and out-of-hospital deaths. Results: There were 509,791 average annual deaths in South Africa, of which 44% (95% confidence interval [CI] 43%-45%) occurred out-of-hospital. Seasonal influenza and RSV all-cause mortality rates were 23.0 (95% CI 11.0-30.6) and 13.2 (95% CI 6.4-33.8) per 100,000 population annually (2.3% [95% CI 2.3%-2.4%] and 1.3% [95% CI 1.2%-1.4%] of all deaths, respectively). The peak mortality rate was in individuals aged ≥75 years (386.0; 95% CI 176.5-466.3) for influenza and in infants (143.4; 95% CI 0-194.8) for RSV. Overall, 63% (95% CI 62%-65%) of seasonal influenza and 48% (95% CI 47%-49%) of RSV-associated deaths occurred out-of-hospital. Among children aged <5 years, RSV-associated deaths were more likely to occur in-hospital, whereas influenza-associated deaths were more likely to occur out-of-hospital. The mortality rate was 6.7 (95% CI 6.4-33.8) in the first influenza A(H1N1)pdm09 wave in 2009 and 20.9 (95% CI 6.4-33.8) in the second wave in 2011, with 30% (95% CI 29%-32%) of A(H1N1)pdm09-associated deaths in 2009 occurring out-of-hospital. Discussion: More than 45% of seasonal influenza- and RSV-associated deaths occur out-of-hospital in South Africa. These data suggest that hospital-based studies may substantially underestimate mortality burden. |
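Excess-mortality models of this kind regress weekly all-cause deaths on a laboratory surveillance covariate, then attribute to the pathogen the fitted coefficient times the covariate series. A single-covariate least-squares sketch is below; the real models add seasonal, trend, and second-pathogen terms, and all names here are illustrative.

```python
def attributable_deaths(weekly_deaths, viral_proxy):
    """Fit deaths_t = a + b * proxy_t by ordinary least squares and
    return b * sum(proxy_t), the deaths attributed to the pathogen.

    viral_proxy: weekly laboratory-surveillance covariate (e.g. the
    number or proportion of specimens positive for influenza).
    """
    n = len(weekly_deaths)
    xbar = sum(viral_proxy) / n
    ybar = sum(weekly_deaths) / n
    slope = (sum((x - xbar) * (y - ybar)
                 for x, y in zip(viral_proxy, weekly_deaths))
             / sum((x - xbar) ** 2 for x in viral_proxy))
    return slope * sum(viral_proxy)
```

Fitting the in-hospital and out-of-hospital death series separately, as the authors did, is what allows the out-of-hospital share of the attributable burden to be estimated.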
Infrequent testing of women for rectal chlamydia and gonorrhea in the United States
Tao G , Hoover KW , Nye MB , Peters PJ , Gift TL , Body BA . Clin Infect Dis 2017 66 (4) 570-575 Background: Anal sex is a common sexual behavior among women that increases their risk of acquiring rectal infection with Chlamydia trachomatis (CT) and Neisseria gonorrhoeae (GC). Methods: We estimated the frequency and positivity of rectal CT and GC tests for women aged 15-60 years performed by a large U.S. commercial laboratory between November 2012 and September 2015. We also estimated the frequency and positivity of pharyngeal and genital tests performed on the same date. Among women with a positive CT or GC result, we estimated the frequency and positivity of recommended repeat testing within 12 months. Results: Of 5,499 women who had rectal CT and GC tests, positivity was 10.8%. On the same date, approximately 80% also had genital CT tests, genital GC tests, and pharyngeal GC tests, while 40% had pharyngeal CT tests. Rectal CT or GC infection was associated with genital CT or GC infection, but 46.5% of rectal CT and GC infections would not have been identified with genital testing alone. Among women with a rectal CT or GC infection, only 20.0% had a recommended repeat rectal test. Of those who had a repeat test, 17.7% were positive. Conclusions: Testing women for rectal CT and GC was infrequent, but positive tests were often found in women with negative genital tests. Most women with positive rectal tests were not retested. Interventions are needed to increase extragenital CT and GC testing of at-risk women. |
Knowledge, attitudes, and practices related to Ebola virus disease at the end of a national epidemic - Guinea, August 2015
Jalloh MF , Robinson SJ , Corker J , Li W , Irwin K , Barry AM , Ntuba PN , Diallo AA , Jalloh MB , Nyuma J , Sellu M , VanSteelandt A , Ramsden M , Tracy L , Raghunathan PL , Redd JT , Martel L , Marston B , Bunnell R . MMWR Morb Mortal Wkly Rep 2017 66 (41) 1109-1115 Health communication and social mobilization efforts to improve the public's knowledge, attitudes, and practices (KAP) regarding Ebola virus disease (Ebola) were important in controlling the 2014-2016 Ebola epidemic in Guinea (1), which resulted in 3,814 reported Ebola cases and 2,544 deaths.* Most Ebola cases in Guinea resulted from the washing and touching of persons and corpses infected with Ebola without adequate infection control precautions at home, at funerals, and in health facilities (2,3). As the 18-month epidemic waned in August 2015, Ebola KAP were assessed in a survey among residents of Guinea recruited through multistage cluster sampling procedures in the nation's eight administrative regions (Boke, Conakry, Faranah, Kankan, Kindia, Labe, Mamou, and Nzerekore). Nearly all participants (92%) were aware of Ebola prevention measures, but 27% believed that Ebola could be transmitted by ambient air, and 49% believed they could protect themselves from Ebola by avoiding mosquito bites. Of the participants, 95% reported taking actions to avoid getting Ebola, especially more frequent handwashing (93%). Nearly all participants (91%) indicated they would send relatives with suspected Ebola to Ebola treatment centers, and 89% said they would engage special Ebola burial teams to remove corpses with suspected Ebola from homes. Of the participants, 66% said they would prefer to observe an Ebola-affected corpse from a safe distance at burials rather than practice traditional funeral rites involving corpse contact. 
The findings were used to guide the ongoing epidemic response and recovery efforts, including health communication, social mobilization, and planning, to prevent and respond to future outbreaks or sporadic cases of Ebola. |
Association between male circumcision and women's biomedical health outcomes: a systematic review
Grund JM , Bryant TS , Jackson I , Curran K , Bock N , Toledo C , Taliano J , Zhou S , Del Campo JM , Yang L , Kivumbi A , Li P , Pals S , Davis SM . Lancet Glob Health 2017 5 (11) e1113-e1122 BACKGROUND: Male circumcision reduces men's risk of acquiring HIV and some sexually transmitted infections from heterosexual exposure, and is essential for HIV prevention in sub-Saharan Africa. Studies have also investigated associations between male circumcision and risk of acquisition of HIV and sexually transmitted infections in women. We aimed to review all evidence on associations between male circumcision and women's health outcomes to benefit women's health programmes. METHODS: In this systematic review we searched for peer-reviewed and grey literature publications reporting associations between male circumcision and women's health outcomes up to April 11, 2016. All biomedical (not psychological or social) outcomes in all study types were included. Searches were not restricted by year of publication, or to sub-Saharan Africa. Publications without primary data and not in English were excluded. We extracted data and assessed evidence on each outcome as high, medium, or low consistency on the basis of agreement between publications; outcomes found in fewer than three publications were of indeterminate consistency. FINDINGS: 60 publications were included in our assessment. High-consistency evidence was found for five outcomes, with male circumcision protecting against cervical cancer, cervical dysplasia, herpes simplex virus type 2, chlamydia, and syphilis. Medium-consistency evidence was found for male circumcision protecting against human papillomavirus and low-risk human papillomavirus. Although the evidence shows a protective association with HIV, it was categorised as low consistency, because one trial showed an increased risk to female partners of HIV-infected men resuming sex early after male circumcision.
Seven outcomes including HIV had low-consistency evidence and six were indeterminate. INTERPRETATION: Scale-up of male circumcision in sub-Saharan Africa has public health implications for several outcomes in women. Evidence that female partners are at decreased risk of several diseases is highly consistent. Synergies between male circumcision and women's health programmes should be explored. FUNDING: US Centers for Disease Control and Prevention and Jhpiego. |
Changes in viral suppression status among HIV patients receiving care: United States, 2014
Crepaz N , Tang T , Marks G , Hall HI . AIDS 2017 31 (17) 2421-2425 OBJECTIVE: To examine changes in viral suppression status among HIV patients receiving care in 2014 and the extent of viral suppression among persons with infrequent care visits. METHODS: Using data reported to the National HIV Surveillance System from 33 jurisdictions with complete reporting of CD4 and viral load tests, we created four viral suppression status groups based on their first and last viral loads in 2014: both suppressed, first unsuppressed and last suppressed (improved), first suppressed and last unsuppressed (worsened), and both unsuppressed. We also calculated the number and percentage of persons whose sole viral load in 2014 was suppressed and had a suppressed viral load at their last test in 2013. RESULTS: Among 339,515 persons with at least two viral load tests in 2014, 72.6% had all viral loads suppressed (durably suppressed); 75.5% had the first and last tests suppressed, 10.5% improved, 4.2% worsened, and 9.9% had both unsuppressed. Among 92,309 persons who had only one viral load test in 2014, 69,960 (75.8%) were suppressed and, of those, 53,834 (76.9%) also had a suppressed viral load at their last test in 2013. CONCLUSIONS: National surveillance data show that the majority of patients in HIV care during 2014 were durably suppressed. More patients showed improved than worsened viral suppression status. Some patients who have less frequent care visits have sustained viral suppression. Yet one in ten who were in regular care did not have a suppressed viral load in 2014, indicating missed opportunities for clinical interventions to help patients achieve and sustain viral suppression. |
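The four-group classification described in the Methods is a simple first-vs-last comparison. A minimal sketch, assuming a suppression threshold of <200 copies/mL (a common surveillance definition, not stated in the abstract; the threshold and example values here are illustrative only):

```python
# Hypothetical suppression threshold (copies/mL); an assumption for
# illustration, not taken from the abstract.
SUPPRESSED = 200

def classify(first_vl: float, last_vl: float) -> str:
    """Assign one of the four viral suppression status groups based on
    the first and last viral load tests in the year."""
    first_ok = first_vl < SUPPRESSED
    last_ok = last_vl < SUPPRESSED
    if first_ok and last_ok:
        return "both suppressed"
    if not first_ok and last_ok:
        return "improved"
    if first_ok and not last_ok:
        return "worsened"
    return "both unsuppressed"

# A patient whose first test was unsuppressed and last test suppressed
print(classify(5000, 40))  # improved
```

The "durably suppressed" group in the Results is stricter: it requires that all tests in the year, not just the first and last, fall below the threshold.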
Clinical characteristics of hospitalized infants with laboratory-confirmed pertussis in Guatemala
Phadke VK , McCracken JP , Kriss JL , Lopez MR , Lindblade KA , Bryan JP , Garcia ME , Funes CE , Omer SB . J Pediatric Infect Dis Soc 2017 7 (4) 310-316 Background: Pertussis is an important cause of hospitalization and death in infants too young to be vaccinated (aged <2 months). Limited data on infant pertussis have been reported from Central America. The aim of this study was to characterize acute respiratory illnesses (ARIs) attributable to Bordetella pertussis among infants enrolled in an ongoing surveillance study in Guatemala. Methods: As part of a population-based surveillance study in Guatemala, infants aged <2 months who presented with ARI and required hospitalization were enrolled, and nasopharyngeal and oropharyngeal swab specimens were obtained. For this study, these specimens were tested for B pertussis using real-time polymerase chain reaction (PCR). Results: Among 301 infants hospitalized with ARI, we found 11 with pertussis confirmed by PCR (pertussis-positive infants). Compared to pertussis-negative infants, pertussis-positive infants had a higher mean admission white blood cell count (20,900 vs 12,579 cells/µL, respectively; P = .024), absolute lymphocyte count (11,517 vs 5,591 cells/µL, respectively; P < .001), rate of admission to the intensive care unit (64% vs 35%, respectively; P = .054), and case fatality rate (18% vs 3%, respectively; P = .014). Ten of the 11 pertussis-positive infants had cough at presentation; the majority (80%) of them had a cough duration of <7 days, and only 1 had a cough duration of >14 days. Fever (temperature ≥38°C) was documented in nearly half (45%) of the pertussis-positive infants (range, 38.0-38.4°C). Conclusions: In this study of infants <2 months of age hospitalized with ARI in Guatemala, pertussis-positive infants had a high rate of intensive care unit admission and a higher case fatality rate than pertussis-negative infants. |
Combinations of interventions to achieve a national HIV incidence reduction goal: insights from the agent-based PATH 2.0 model
Gopalappa C , Sansom SL , Farnham PG , Chen YH . AIDS 2017 31 (18) 2533-2539 OBJECTIVE: To analyze HIV care service targets for achieving a national goal of a 25% reduction in annual HIV incidence and to evaluate the use of annual HIV diagnoses to measure progress in incidence reduction. DESIGN: Because there are considerable interactions among HIV care services, we model the dynamics of combinations of increases in HIV care continuum targets to identify those that would achieve 25% reductions in annual incidence and diagnoses. METHODS: We used Progression and Transmission of HIV/AIDS (PATH 2.0), an agent-based dynamic stochastic simulation of HIV in the United States. RESULTS: A 25% reduction in annual incidence could be achieved by multiple alternative combinations of percentages of persons with diagnosed infection and persons with viral suppression, including 85% and 68%, respectively, and 90% and 59%, respectively. The first combination corresponded to an 18% reduction in annual diagnoses, with infections diagnosed at a median CD4 count of 372 cells/µL, or approximately 3.8 years from time of infection. The corresponding values for the second combination were 4%, 462 cells/µL, and 2.0 years, respectively. CONCLUSIONS: Our analysis provides policy makers with specific targets and alternative choices to achieve the goal of a 25% reduction in HIV incidence. Reducing annual diagnoses does not equate to reducing annual incidence. Instead, progress toward reducing incidence can be measured by monitoring HIV surveillance data trends in CD4 count at diagnosis along with the proportion who have achieved viral suppression to determine where to focus local programmatic efforts. |
Conveyance contact investigation for imported Middle East Respiratory Syndrome cases, United States, May 2014
Lippold SA , Objio T , Vonnahme L , Washburn F , Cohen NJ , Chen TH , Edelson PJ , Gulati R , Hale C , Harcourt J , Haynes L , Jewett A , Jungerman R , Kohl KS , Miao C , Pesik N , Regan JJ , Roland E , Schembri C , Schneider E , Tamin A , Tatti K , Alvarado-Ramy F . Emerg Infect Dis 2017 23 (9) 1585-1589 In 2014, the Centers for Disease Control and Prevention conducted conveyance contact investigations for 2 Middle East respiratory syndrome cases imported into the United States, comprising all passengers and crew on 4 international and domestic flights and 1 bus. Of 655 contacts, 78% were interviewed; 33% had serologic testing. No secondary cases were identified. |
Cost-effectiveness of testing and treatment for latent tuberculosis infection in residents born outside the United States with and without medical comorbidities in a simulation model
Tasillo A , Salomon JA , Trikalinos TA , Horsburgh CR Jr , Marks SM , Linas BP . JAMA Intern Med 2017 177 (12) 1755-1764 Importance: Testing for and treating latent tuberculosis infection (LTBI) is among the main strategies to achieve TB elimination in the United States. The best approach to testing among non-US born residents, particularly those with comorbid conditions, is uncertain. Objective: To estimate health outcomes, costs, and cost-effectiveness of LTBI testing and treatment among non-US born residents with and without medical comorbidities. Design, Setting, and Participants: Decision analytic tree and Markov cohort simulation model among non-US born residents with no comorbidities, with diabetes, with HIV infection, or with end-stage renal disease (ESRD) using a health care sector perspective with 3% annual discounting. Strategies compared included no testing, tuberculin skin test (TST), interferon gamma release assay (IGRA), confirm positive (initial TST, IGRA only for TST-positive results; both tests positive indicates LTBI), and confirm negative (initial IGRA, then TST for IGRA-negative; any test positive indicates LTBI). All strategies were coupled to treatment with 3 months of self-administered rifapentine and isoniazid. Main Outcomes and Measures: Number needed to test and treat to prevent 1 case of TB reactivation, discounted quality-adjusted life-years (QALYs), discounted lifetime medical costs, and incremental cost-effectiveness ratios (ICERs). Results: Improving health outcomes increased costs, with choice of test dependent on willingness to pay. Strategies ranked by ascending costs and benefits: no testing, confirm positive, TST, IGRA, and confirm negative. 
The ICERs varied by non-US born patient risk group: for patients with no comorbidities, IGRA was likely cost-effective at $83,000/QALY; for patients with diabetes, both confirm positive ($53,000/QALY) and IGRA ($120,000/QALY) were likely cost-effective; for patients with HIV, confirm negative was clearly preferred ($63,000/QALY); and for patients with ESRD, no testing was cost-effective. Increased LTBI prevalence and reduced return for TST reading improved IGRA's relative performance. In 10,000 probabilistic simulations among non-US born patients with no comorbidities, with diabetes, and with HIV, some form of testing was virtually always cost-effective. These simulations highlight the uncertainty of test choice for non-US born patients with no comorbidities and non-US born patients with diabetes, but strategies including IGRA were preferred in over 60% of simulations for all non-US born populations except those with ESRD. Conclusions and Relevance: Testing for and treating LTBI among non-US born residents with and without selected comorbidities is likely cost-effective except among those with ESRD, in whom competing risks of death limit benefits. Strategies including IGRA fell below a $100,000/QALY willingness-to-pay threshold for non-US born patients with no comorbidities, patients with diabetes, and patients with HIV. |
Country of birth of children with diagnosed HIV infection in the United States, 2008-2014
Nesheim SR , Linley L , Gray KM , Zhang T , Shi J , Lampe MA , FitzHarris LF . J Acquir Immune Defic Syndr 2017 77 (1) 23-30 BACKGROUND: Diagnoses of HIV infection among children in the United States (US) have been declining; however, a notable percentage of diagnoses are among those born outside the United States. The impact of foreign birth among children with diagnosed infections has not been examined in the United States. METHOD: Using the CDC National HIV Surveillance System, we analyzed data for children aged <13 years with diagnosed HIV infection ("children") in the United States (reported from 50 states and the District of Columbia) during 2008-2014, by place of birth and selected characteristics. RESULTS: There were 1,516 children (726 US-born [47.9%] and 676 foreign-born [44.6%]). US-born children accounted for 70.0% in 2008, declining to 32.3% in 2013 and 40.9% in 2014. Foreign-born children have exceeded US-born children in number since 2011. Age at diagnosis was younger for US-born than foreign-born children (0-18 months: 72.6% vs. 9.8%; 5-12 years: 16.9% vs. 60.3%). HIV diagnoses in mothers of US-born children were made more often before pregnancy (49.7% vs 21.4%) or during pregnancy (16.6% vs 13.9%), and less often after birth (23.7% vs 41%). Custodians of US-born children were more often biological parents (71.9% vs 43.2%) and less likely to be foster or non-related adoptive parents (10.4% vs 55.1%). Of 676 foreign-born children with known place of birth, 65.5% were born in sub-Saharan Africa and 14.3% in Eastern Europe. The top countries of birth were Ethiopia, Ukraine, Uganda, Haiti, and Russia. CONCLUSION: The increasing number of foreign-born children with diagnosed HIV infection in the United States requires specific considerations for care and treatment. |
Current status of point-of-care testing for human immunodeficiency virus drug resistance
Duarte HA , Panpradist N , Beck IA , Lutz B , Lai J , Kanthula RM , Kantor R , Tripathi A , Saravanan S , MacLeod IJ , Chung MH , Zhang G , Yang C , Frenkel LM . J Infect Dis 2017 216 S824-S828 Healthcare delivery has advanced due to the implementation of point-of-care testing, which is often performed within minutes to hours in minimally equipped laboratories or at home. Technologic advances are leading to point-of-care kits that incorporate nucleic acid-based assays, including polymerase chain reaction, isothermal amplification, ligation, and hybridization reactions. As a limited number of single-nucleotide polymorphisms are associated with clinically significant human immunodeficiency virus (HIV) drug resistance, assays to detect these mutations have been developed. Early versions of these assays have been used in research. This review summarizes the principles underlying each assay and discusses strategic needs for their incorporation into the management of HIV infection. |
Diagnosing antimicrobial resistance
Burnham CD , Leeds J , Nordmann P , O'Grady J , Patel J . Nat Rev Microbiol 2017 15 (11) 697-703 Antimicrobial resistance constitutes a global burden and is one of the major threats to public health. Although the emergence of resistant microorganisms is a natural phenomenon, the overuse or inappropriate use of antimicrobials has had a great effect on resistance evolution. Rapid diagnostic tests that identify drug-resistant bacteria, determine antimicrobial susceptibility and distinguish viral from bacterial infections can guide effective treatment strategies. Moreover, rapid diagnostic tests could facilitate epidemiological surveillance, as emerging resistant infectious agents and transmission can be monitored. In this Viewpoint article, several experts in the field discuss the drawbacks of current diagnostic methods that are used to identify antimicrobial resistance, novel diagnostic strategies and how such rapid tests can inform drug development and the surveillance of resistance evolution. |
Lack of evidence for Zika virus transmission by Culex mosquitoes
Roundy CM , Azar SR , Brault AC , Ebel GD , Failloux AB , Fernandez-Salas I , Kitron U , Kramer LD , Lourenco-de-Oliveira R , Osorio JE , Paploski ID , Vazquez-Prokopec GM , Ribeiro GS , Ritchie SA , Tauro LB , Vasilakis N , Weaver SC . Emerg Microbes Infect 2017 6 (10) e90 Since Zika virus (ZIKV) emerged in the Americas, major research efforts have been focused on identifying the mosquito species responsible for transmission. While almost all published results support Aedes aegypti and potentially Ae. albopictus as urban vectors, a recent article (1) suggests that Culex quinquefasciatus may serve as a ZIKV vector in Recife, Brazil, a region that has experienced a high incidence of infection. Accurately identifying the vector of a pathogen enables public health agencies to implement appropriate control strategies and inform citizens of proper prevention measures. Additionally, establishing the vector for an emerging pathogen paves the way for researchers to advance our understanding of virus–vector interactions and pursue novel methods of control. In contrast, erroneously incriminating a vector could lead to misdirected use of limited government funds, diversion of research efforts, and misinforming the public through misdirected media and educational programs. | Traditional criteria for arthropod vector incrimination include: (i) demonstration of feeding or other effective contact with the pathogen's host; (ii) association in time and space of the vector and the pathogen-infected host; (iii) repeated demonstration of natural infection of the vector; and (iv) experimental transmission of the pathogen by the vector (2). |
Modeling the environmental suitability for Aedes (Stegomyia) aegypti and Aedes (Stegomyia) albopictus (Diptera: Culicidae) in the contiguous United States
Johnson TL , Haque U , Monaghan AJ , Eisen L , Hahn MB , Hayden MH , Savage HM , McAllister J , Mutebi JP , Eisen RJ . J Med Entomol 2017 54 (6) 1605-1614 The mosquitoes Aedes (Stegomyia) aegypti (L.) (Diptera: Culicidae) and Ae. (Stegomyia) albopictus (Skuse) (Diptera: Culicidae) transmit dengue, chikungunya, and Zika viruses and represent a growing public health threat in parts of the United States where they are established. To complement existing mosquito presence records based on discontinuous, non-systematic surveillance efforts, we developed county-scale environmental suitability maps for both species using maximum entropy modeling to fit climatic variables to county presence records from 1960-2016 in the contiguous United States. The predictive models for Ae. aegypti and Ae. albopictus had an overall accuracy of 0.84 and 0.85, respectively. Cumulative growing degree days (GDDs) during the winter months, an indicator of overall warmth, was the most important predictive variable for both species and was positively associated with environmental suitability. The number (percentage) of counties classified as environmentally suitable, based on models with 90 or 99% sensitivity, ranged from 1,443 (46%) to 2,209 (71%) for Ae. aegypti and from 1,726 (55%) to 2,329 (75%) for Ae. albopictus. Increasing model sensitivity results in more counties classified as suitable, at least for summer survival, from which there are no mosquito records. We anticipate that Ae. aegypti and Ae. albopictus will be found more commonly in counties classified as suitable based on the lower 90% sensitivity threshold compared with the higher 99% threshold. Counties predicted suitable with 90% sensitivity should therefore be a top priority for expanded mosquito surveillance efforts, while still keeping in mind that Ae. aegypti and Ae. albopictus may be introduced, via accidental transport of eggs or immatures, and potentially proliferate during the warmest part of the year anywhere within the geographic areas delineated by the 99% sensitivity model. |
Estimating the high-arsenic domestic-well population in the conterminous United States
Ayotte JD , Medalie L , Qi SL , Backer LC , Nolan BT . Environ Sci Technol 2017 51 (21) 12443-12454 Arsenic concentrations from 20,450 domestic wells in the U.S. were used to develop a logistic regression model of the probability of having arsenic >10 µg/L ("high arsenic"), which is presented at the county, state, and national scales. Variables representing geologic sources, geochemical, hydrologic, and physical features were among the significant predictors of high arsenic. For U.S. Census blocks, the mean probability of arsenic >10 µg/L was multiplied by the population using domestic wells to estimate the potential high-arsenic domestic-well population. Approximately 44.1 M people in the U.S. use water from domestic wells. The population in the conterminous U.S. using water from domestic wells with predicted arsenic concentration >10 µg/L is 2.1 M people (95% CI, 1.5-2.9 M). Although areas of the U.S. were underrepresented with arsenic data, predictive variables available in national data sets were used to estimate high arsenic in unsampled areas. Additionally, by predicting to all of the conterminous U.S., we identify areas of high and low potential exposure in areas of limited arsenic data. These areas may be viewed as potential areas to investigate further or to compare to more detailed local information. Linking predictive modeling to private well use information nationally, despite the uncertainty, is beneficial for broad screening of the population at risk from elevated arsenic in drinking water from private wells. |
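The population estimate above combines two steps: a logistic regression gives each area a probability of high arsenic, and that probability is multiplied by the domestic-well population. A minimal sketch of the arithmetic, with entirely hypothetical coefficients, features, and block data (the published model's predictors and values are not reproduced here):

```python
import math

def prob_high_arsenic(intercept, coefs, features):
    """Logistic-regression probability of arsenic >10 µg/L from a fitted
    linear predictor (all coefficient values here are hypothetical)."""
    z = intercept + sum(b * x for b, x in zip(coefs, features))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical census blocks: (geology score, aridity index, scaled well
# depth) plus the block's domestic-well population.
blocks = [
    {"features": (0.8, 1.2, 0.5), "well_pop": 12000},
    {"features": (0.1, 0.3, 0.9), "well_pop": 30000},
]
intercept, coefs = -2.0, (1.5, 0.8, -0.4)

# Expected high-arsenic domestic-well population = sum over blocks of
# probability x well-using population.
exposed = sum(
    prob_high_arsenic(intercept, coefs, b["features"]) * b["well_pop"]
    for b in blocks
)
print(round(exposed))
```

Summing probability-weighted populations rather than thresholding each block preserves the expected count, which is the quantity the 2.1 M national estimate represents.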
Interpreting mobile and handheld air sensor readings in relation to air quality standards and health effect reference values: Tackling the challenges
Woodall GM , Hoover MD , Williams R , Benedict K , Harper M , Soo JC , Jarabek AM , Stewart MJ , Brown JS , Hulla JS , Caudill M , Clements AL , Kaufman A , Parker AJ , Keating M , Balshaw D , Garrahan K , Burton L , Batka S , Limaye VS , Hakkinen PJ , Thompson B . Atmosphere (Basel) 2017 8 (10) 182 The US Environmental Protection Agency (EPA) and other federal agencies face a number of challenges in interpreting and reconciling short-duration (seconds to minutes) readings from mobile and handheld air sensors with the longer duration averages (hours to days) associated with the National Ambient Air Quality Standards (NAAQS) for the criteria pollutants: particulate matter (PM), ozone, carbon monoxide, lead, nitrogen oxides, and sulfur oxides. Similar issues are equally relevant to the hazardous air pollutants (HAPs), for which chemical-specific health effect reference values are the best indicators of exposure limits; these values are often based on a lifetime of continuous exposure. A multi-agency, staff-level Air Sensors Health Group (ASHG) was convened in 2013. ASHG represents a multi-institutional collaboration of Federal agencies devoted to the discovery and discussion of sensor technologies, the interpretation of sensor data, and the definition of the state of sensor-related science across each institution; the group also provides consultation on how sensors might effectively be used to meet a wide range of research and decision support needs. ASHG focuses on several fronts: improving the understanding of what hand-held sensor technologies may be able to deliver; communicating what hand-held sensor readings can provide to a number of audiences; addressing the challenges of integrating data generated by multiple entities using new and unproven technologies; and defining best practices in communicating health-related messages to various audiences.
This review summarizes the challenges, successes, and promising tools of those initial ASHG efforts and Federal agency progress on crafting similar products for use with other NAAQS pollutants and the HAPs. NOTE: The opinions expressed are those of the authors and do not necessarily represent the opinions of their Federal Agencies or the US Government. Mention of product names does not constitute endorsement. |
Benchmark datasets for phylogenomic pipeline validation, applications for foodborne pathogen surveillance.
Timme RE , Rand H , Shumway M , Trees EK , Simmons M , Agarwala R , Davis S , Tillman GE , Defibaugh-Chavez S , Carleton HA , Klimke WA , Katz LS . PeerJ 2017 2017 (10) e3893 Background. As next generation sequence technology has advanced, there have been parallel advances in genome-scale analysis programs for determining evolutionary relationships as proxies for epidemiological relationship in public health. Most new programs skip traditional steps of ortholog determination and multi-gene alignment, instead identifying variants across a set of genomes, then summarizing results in a matrix of single-nucleotide polymorphisms or alleles for standard phylogenetic analysis. However, public health authorities need to document the performance of these methods with appropriate and comprehensive datasets so they can be validated for specific purposes, e.g., outbreak surveillance. Here we propose a set of benchmark datasets to be used for comparison and validation of phylogenomic pipelines. Methods. We identified four well-documented foodborne pathogen events in which the epidemiology was concordant with routine phylogenomic analyses (reference-based SNP and wgMLST approaches). These are ideal benchmark datasets, as the trees, WGS data, and epidemiological data for each are all in agreement. We have placed these sequence data, sample metadata, and "known" phylogenetic trees in publicly accessible databases and developed a standard descriptive spreadsheet format describing each dataset. To facilitate easy downloading of these benchmarks, we developed an automated script that uses the standard descriptive spreadsheet format. Results. Our "outbreak" benchmark datasets represent the four major foodborne bacterial pathogens (Listeria monocytogenes, Salmonella enterica, Escherichia coli, and Campylobacter jejuni) and one simulated dataset where the "known tree" can be accurately called the "true tree".
The downloading script and associated table files are available on GitHub: https://github.com/WGS-standards-and-analysis/datasets. Discussion. These five benchmark datasets will help standardize comparison of current and future phylogenomic pipelines, and facilitate important cross-institutional collaborations. Our work is part of a global effort to provide collaborative infrastructure for sequence data and analytic tools; we welcome additional benchmark datasets in our recommended format, and, if relevant, we will add these on our GitHub site. Together, these datasets, dataset format, and the underlying GitHub infrastructure present a recommended path for worldwide standardization of phylogenomic pipelines. |
Editorial: Emergence of Gene-Environment Interaction Analysis in Epidemiologic Research.
Khoury MJ . Am J Epidemiol 2017 186 (7) 751-752 In this issue of the Journal, we publish 4 review articles (1–4) on gene-environment interaction (G×E) analysis in epidemiologic research. The papers resulted from a 2014 workshop held by the National Institute of Environmental Health Sciences and the National Cancer Institute to explore new approaches for discovery and characterization of G×E in epidemiologic research. The 4 papers provide an update on: 1) the state of the science in analytical methods of G×E (1); 2) opportunities for incorporation of biological knowledge into G×E analyses (2); 3) lessons learned from past G×E successes (3); and 4) overarching themes on current challenges and opportunities for gene-environment interaction studies of complex diseases (4). Topics include improved data analytical methods, environmental exposure assessment (5), and incorporation of functional information. | Together, these papers provide an overview of the G×E field at a time of explosive growth in the quantity and types of data that are being collected at the individual and population levels. Obviously, genome-sequencing data are a major emerging source of information on individual genetic susceptibility, but we are increasingly able to use other “-omic” data (6)—such as metabolomics, proteomics, epigenetics, and others—to measure and characterize biological processes that result from G×E. We are also increasingly able to join diverse biological data with various environmental, social, financial, geographic, and transactional data in a “big data” (7) health-impact framework. |
Stakeholder education for community-wide health initiatives: A focus on teen pregnancy prevention
Finley C , Suellentrop K , Griesse R , House LD , Brittain A . Health Promot Pract 2017 19 (1) 1524839917734521 Teen pregnancies and births continue to decline due in part to implementation of evidence-based interventions and clinical strategies. While local stakeholder education is also thought to be critical to this success, little is known about what types of strategies work best to engage stakeholders. With the goal of identifying and describing evidence-based or best practice strategies for stakeholder education in community-based public health initiatives, we conducted a systematic literature review of strategies used for effective stakeholder education. Over 400 articles were initially retrieved; 59 articles met inclusion criteria. Strategies were grouped into four steps that communities can use to support stakeholder education efforts: identify stakeholder needs and resources, develop a plan, develop tailored and compelling messaging, and use implementation strategies. These strategies lay a framework for high-quality stakeholder education. In future research, it is important to prioritize evaluating specific activities taken to raise awareness, educate, and engage a community in community-wide public health efforts. |
Antifungal susceptibility testing practices at acute care hospitals enrolled in the National Healthcare Safety Network, United States, 2011-2015
Vallabhaneni S , Sapiano M , Weiner LM , Lockhart SR , Magill S . Open Forum Infect Dis 2017 4 (4) ofx175 We assessed availability of antifungal susceptibility testing (AFST) at nearly 4000 acute care hospitals enrolled in the National Healthcare Safety Network. In 2015, 95% offered any AFST, 28% offered AFST at their own laboratory or at an affiliated medical center, and 33% offered reflexive AFST. Availability of AFST improved from 2011 to 2015, but substantial gaps exist in the availability of AFST. |
Retrospective proteomic analysis of serum after Akhmeta virus infection: new suspect case identification and insights into poxvirus humoral immunity
Townsend MB , Gallardo-Romero NF , Khmaladze E , Vora NM , Maghlakelidze G , Geleishvili M , Carroll DS , Emerson GL , Reynolds MG , Satheshkumar PS . J Infect Dis 2017 216 (12) 1505-1512 Serologic cross-reactivity, a hallmark of orthopoxvirus (OPXV) infection, makes species-specific diagnosis of infection difficult. In this study, we used a Variola virus (VARV) proteome microarray to characterize and differentiate antibody responses to non-vaccinia OPXV infections from smallpox vaccination. The profiles of two case-patients infected with the newly discovered OPXV, Akhmeta virus (AKMV), exhibited antibody responses of greater intensity and broader recognition of viral proteins, including the B21/22 family glycoproteins not encoded by the vaccinia virus (VACV) strains used as vaccines. An additional case of AKMV, or non-vaccinia OPXV infection, was identified through community surveillance of individuals with no or uncertain history of vaccination and no recent infection. The results demonstrate the utility of microarrays for high-resolution mapping of antibody responses to determine the nature of OPXV exposure. |
Risk factors for measles virus infection among adults during a large outbreak in postelimination era in Mongolia, 2015
Hagan JE , Takashima Y , Sarankhuu A , Dashpagma O , Jantsansengee B , Pastore R , Nyamaa G , Yadamsuren B , Mulders MN , Wannemuehler KA , Anderson R , Bankamp B , Rota P , Goodson JL . J Infect Dis 2017 216 (10) 1187-1195 Background: In 2015, a large nationwide measles outbreak occurred in Mongolia, with very high incidence in the capital city of Ulaanbaatar and among young adults. Methods: We conducted an outbreak investigation including a matched case-control study of risk factors for laboratory-confirmed measles among young adults living in Ulaanbaatar. Young adults with laboratory-confirmed measles living in Ulaanbaatar were matched with 2-3 neighborhood controls. Conditional logistic regression was used to estimate adjusted matched odds ratios (aMORs) for risk factors, with 95% confidence intervals. Results: During March 1-September 30, 2015, 20,077 suspected measles cases were reported; 14,010 cases were confirmed. Independent risk factors for measles included being unvaccinated (adjusted matched odds ratio [aMOR] 2.0, P < .01), being a high school graduate without college education (aMOR 2.6, P < .01), remaining in Ulaanbaatar during the outbreak (aMOR 2.5, P < .01), exposure to an inpatient healthcare facility (aMOR 4.5, P < .01), and being born outside of Ulaanbaatar (aMOR 1.8, P = .02). Conclusions: This large nationwide outbreak shortly after verification of elimination had high incidence among young adults, particularly those born outside the national capital. In addition, findings indicated that nosocomial transmission within health facilities helped amplify the outbreak. |
Rotavirus vaccine response correlates with the infant gut microbiota composition in Pakistan
Harris V , Ali A , Fuentes S , Korpela K , Kazi M , Tate J , Parashar U , Wiersinga WJ , Giaquinto C , de Weerth C , de Vos WM . Gut Microbes 2017 9 (2) 1-9 Rotavirus (RV) is the leading cause of diarrhea-related death in children worldwide, and ninety-five percent of rotavirus deaths occur in Africa and Asia. Rotavirus vaccines (RVV) can dramatically reduce RV deaths, but have low efficacy in the low-income settings where they are most needed. The intestinal microbiome may contribute to this decreased RVV efficacy. This pilot study hypothesized that infants' intestinal microbiota composition correlates with RVV immune responses and that RVV responders have different gut microbiota than non-responders. We conducted a nested, matched case-control study comparing the pre-vaccination intestinal microbiota composition of ten 6-week-old Pakistani RVV responders, ten 6-week-old Pakistani RVV non-responders, and ten healthy Dutch infants. RVV response was defined as an immunoglobulin A (IgA) level of ≥20 IU/mL following Rotarix (RV1) vaccination in an infant with a pre-vaccination IgA of <20 IU/mL. Infants were matched in a 1:1 ratio using ranked variables: RV1 dosing schedule (6/10/14, 6/10, or 10/14 weeks), RV season, delivery mode, delivery place, breastfeeding practices, age, and gender. Fecal microbiota analysis was performed using a highly reproducible phylogenetic microarray. RV1 response correlated with a higher relative abundance of bacteria belonging to Clostridium cluster XI and Proteobacteria, including bacteria related to Serratia and Escherichia coli. Remarkably, the abundance of these Proteobacteria was also significantly higher in Dutch infants when compared to RV1 non-responders in Pakistan. This small but carefully matched study showed the intestinal microbiota composition to correlate with RV1 seroconversion in Pakistani infants, identifying signatures shared with healthy Dutch infants. |
Tdap vaccination coverage during pregnancy - selected sites, United States, 2006-2015
Kerr S , Van Bennekom CM , Liang JL , Mitchell AA . MMWR Morb Mortal Wkly Rep 2017 66 (41) 1105-1108 Tetanus toxoid, reduced diphtheria toxoid, and acellular pertussis (Tdap) vaccine is recommended during the third trimester of each pregnancy to provide protection to newborns, who are at risk for pertussis-related morbidity and mortality (1). As part of its case-control surveillance study of medications and birth defects, the Birth Defects Study of the Slone Epidemiology Center at Boston University (the Birth Defects Study) has recorded data on vaccinations received during pregnancy since 2006. Among 5,606 mothers of infants without structural birth defects in this population (control group), <1% had received Tdap vaccine before 2009. By 2012, the percentage of mothers of infants in the control group (control infants) who had received Tdap increased to approximately 9%, and then in 2013 and continuing through 2015, increased markedly, to 28% and 54%, respectively. As the prevalence of maternal Tdap vaccination increased, so did the proportion of pregnant women who received Tdap in the third trimester, as recommended (94%-100% from 2010 to 2015). The vast majority of Tdap vaccinations (96%) were received in a traditional health care setting (e.g., the office of the woman's obstetrician or primary care physician or her prenatal clinic). Increasing vaccination coverage during pregnancy could help reduce the impact of pertussis on infant morbidity and mortality. |
Timeliness of childhood vaccination in the Federated States of Micronesia
Tippins A , Leidner AJ , Meghani M , Griffin A , Helgenberger L , Nyaku M , Underwood JM . Vaccine 2017 35 (47) 6404-6411 BACKGROUND: Vaccination coverage is typically measured as the proportion of individuals who have received recommended vaccine doses by the date of assessment. This approach does not provide information about receipt of vaccines by the recommended age, which is critical for ensuring optimal protection from vaccine-preventable diseases (VPDs). OBJECTIVE: To assess vaccination timeliness in the Federated States of Micronesia (FSM), and the projected impact of suboptimal vaccination in the event of an outbreak. METHODS: Timeliness of the 4th dose of diphtheria, tetanus, and acellular pertussis vaccine (DTaP) and 1st dose of measles, mumps, and rubella vaccine (MMR) among children 24-35 months was assessed in FSM. Both doses are defined as on time if administered from 361 through 395 days in age. Timeliness was calculated by one-way frequency analysis, and dose delays, measured in months after recommended age, were described using inverse Kaplan-Meier analysis. A time-series susceptible-exposed-infected-recovery (TSEIR) model simulated measles outbreaks in populations with on time and late vaccination. RESULTS: Total coverage for the 4th dose of DTaP ranged from 36.6% to 98.8%, and for the 1st dose of MMR ranged from 80.9% to 100.0% across FSM states. On time coverage for the 4th dose of DTaP ranged from 3.2% to 52.3%, and for the 1st dose of MMR ranged from 21.1% to 66.9%. Maximum and median dose delays beyond the recommended age varied by state. TSEIR models predicted 10.8-13.7% increases in measles cases during an outbreak based on these delays. CONCLUSIONS: In each of the FSM states, a substantial proportion of children received DTaP and MMR doses outside the recommended timeframe. 
Children who receive vaccinations later than recommended remain susceptible to VPDs during the period they remain unvaccinated, which may have a substantial impact on health systems during an outbreak. Immunization programs should consider vaccination timeliness in addition to coverage as a measure of susceptibility to VPDs in young children. |
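The TSEIR projection described above can be illustrated, in spirit, with a far simpler deterministic SEIR sketch. Everything here is hypothetical: the population size, immunity fractions, and rate parameters are illustrative choices, not the study's model or data; the point is only that lower effective immunity at outbreak onset yields more cumulative cases.

```python
def seir_outbreak_size(pop, immune_frac, r0=15.0, latent=8, infectious=8, days=365):
    """Deterministic daily-step SEIR; returns cumulative infections
    after one case is introduced into a partially immune population."""
    s = pop * (1.0 - immune_frac)  # susceptible
    e = 0.0                        # exposed (latent, not yet infectious)
    i = 1.0                        # one introduced infectious case
    cum = 0.0
    beta = r0 / infectious         # transmissions per infectious person-day
    for _ in range(days):
        new_e = min(beta * s * i / pop, s)  # new exposures, capped at s
        new_i = e / latent                  # latent cases becoming infectious
        s -= new_e
        e += new_e - new_i
        i += new_i - i / infectious         # recovery drains i
        cum += new_i
    return cum

on_time = seir_outbreak_size(10_000, immune_frac=0.67)  # timely coverage
late = seir_outbreak_size(10_000, immune_frac=0.55)     # delayed coverage
print(f"extra cases attributable to delays: {late - on_time:.0f}")
```

In a model like this, a modest drop in effective on-time coverage produces a disproportionate increase in outbreak size, which is the qualitative pattern behind the 10.8-13.7% increases the TSEIR simulations projected.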
Measles and rubella elimination: Learning from polio eradication and moving forward with a diagonal approach
Goodson JL , Alexander JP , Linkins RW , Orenstein WA . Expert Rev Vaccines 2017 16 (12) 1203-1216 INTRODUCTION: In 1988, when an estimated 350,000 children were paralyzed by polio and 125 countries reported polio cases, the World Health Assembly passed a resolution to achieve polio eradication by 2000, and the Global Polio Eradication Initiative (GPEI) was established as a partnership focused on eradication. Today, following eradication efforts, polio cases have decreased by >99%, and eradication of all three types of wild polioviruses is approaching. However, because polio resources substantially support disease surveillance and other health programs, losing polio assets could reverse progress toward achieving Global Vaccine Action Plan goals. Areas covered: As the end of polio approaches and GPEI funds and capacity decrease, we document knowledge, experience, and lessons learned from 30 years of polio eradication. Expert commentary: Transitioning polio assets to measles and rubella (MR) elimination efforts would accelerate progress toward global vaccination coverage and equity. The feasibility and benefits of MR elimination have long been established. Focusing efforts on MR elimination after achieving polio eradication would make a permanent impact on reducing child mortality, but should be done through a 'diagonal approach': using measles disease transmission to identify areas possibly susceptible to other vaccine-preventable diseases and strengthening the overall immunization and health systems to achieve disease-specific goals. |
Effectiveness of measles vaccination and immune globulin post-exposure prophylaxis in an outbreak setting - New York City, 2013
Arciuolo RJ , Jablonski RR , Zucker JR , Rosen JB . Clin Infect Dis 2017 65 (11) 1843-1847 Background: Measles, mumps, and rubella vaccine (MMR) or immune globulin (IG) are routinely used for measles post-exposure prophylaxis (PEP). However, current literature on the effectiveness of measles PEP is limited and variable. Here, we examined the effectiveness of MMR and IG PEP among children exposed to measles during an outbreak in New York City (NYC) in 2013. Methods: Contacts were identified by the NYC Department of Health and Mental Hygiene between 13 March 2013 and 30 June 2013. Immunity to measles and receipt of PEP was determined for contacts. PEP effectiveness [(1 - relative risk of developing measles) x 100] was calculated for MMR, IG, and any PEP (MMR or IG) for nonimmune contacts aged <19 years. Results: A total of 3409 contacts were identified, of which 208 (6.1%), 274 (8.0%), and 318 (9.3%) met the inclusion criteria for analysis of MMR, IG, and any PEP effectiveness, respectively. Of the contacts included, 44 received MMR PEP and 77 received IG PEP. Effectiveness of MMR PEP was 83.4% (95% confidence interval [CI], 34.4%, 95.8%). No contact who received IG PEP developed measles; effectiveness of IG PEP was 100% (approximated 95% CI, 56.2%, 99.8%). Effectiveness of receiving any PEP (MMR or IG) was 92.9% (95% CI, 56.2%, 99.8%). Conclusions: Contacts who received PEP were less likely to develop disease. Our findings support current recommendations for administration of PEP following exposure to measles. These results highlight the importance of a rapid public health outbreak response to limit measles transmission following case identification. |
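The bracketed formula in the abstract, effectiveness = (1 - relative risk) x 100, is simple enough to sketch directly. The counts below are illustrative stand-ins (not the study's raw data), chosen only to show the arithmetic:

```python
def pep_effectiveness(cases_pep, n_pep, cases_no_pep, n_no_pep):
    """Effectiveness = (1 - relative risk) x 100, comparing attack rates
    in PEP recipients vs. nonimmune contacts who received no PEP."""
    rr = (cases_pep / n_pep) / (cases_no_pep / n_no_pep)
    return (1.0 - rr) * 100.0

# Illustrative: 1 case among 44 MMR PEP recipients vs. 30 cases among
# 220 contacts without PEP.
print(round(pep_effectiveness(1, 44, 30, 220), 1))  # -> 83.3

# When no PEP recipient develops measles (as with IG here), the relative
# risk is 0 and effectiveness is 100%.
print(pep_effectiveness(0, 77, 30, 220))  # -> 100.0
```

The second case shows why the abstract reports IG effectiveness as 100% with only an approximated confidence interval: a zero numerator makes the point estimate exact but the interval estimation nonstandard.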
Estimating the full public health value of vaccination
Gessner BD , Kaslow D , Louis J , Neuzil K , O'Brien KL , Picot V , Pang T , Parashar UD , Saadatian-Elahi M , Nelson CB . Vaccine 2017 35 (46) 6255-6263 There is an enhanced focus on considering the full public health value (FPHV) of vaccination when setting priorities, making regulatory decisions and establishing implementation policy for public health activities. Historically, a therapeutic paradigm has been applied to the evaluation of prophylactic vaccines; it focuses on an individual benefit-risk assessment in prospective and individually randomized phase III trials to assess safety and efficacy against etiologically confirmed clinical outcomes. By contrast, a public health paradigm considers the population impact and encompasses measures of community benefits against a range of outcomes. For example, measurement of the FPHV of vaccination may incorporate health inequity, social and political disruption, disruption of household integrity, school absenteeism and work loss, health care utilization, long-term/ongoing disability, the development of antibiotic resistance, and a range of non-etiologically and etiologically defined clinical outcomes. Following an initial conference at the Fondation Merieux in mid-2015, a second conference (December 2016) was held to further describe the assessment of the FPHV of a variety of prophylactic vaccines. The wider scope of vaccine benefits, improvement in risk assessment, and the need for partnership and coalition building across interventions have also been discussed during the 2014 and 2016 Global Vaccine and Immunization Research Forums and the 2016 Geneva Health Forum, as well as in numerous publications including a special issue of Health Affairs in February 2016. 
The December 2016 expert panel concluded that, while progress has been made, additional efforts will be necessary before a more fully formulated assessment of the FPHV of vaccines is incorporated into the evidence base for the value proposition and the analysis of unmet medical need used to prioritize vaccine development, vaccine licensure, implementation policies, and financing decisions. The desired outcomes of these efforts to establish an alternative framework for vaccine evaluation are a more robust vaccine pipeline, improved appreciation of vaccine value and hence of its relative affordability, and greater public access to and acceptance of vaccines. |
Hospital-based collaboration for epidemiological investigation of vaccine safety: A potential solution for low and middle-income countries?
Izurieta HS , Moro PL , Chen RT . Vaccine 2017 36 (3) 345-346 As each national immunization program matures with increasing uptake of vaccines in the population and control of their targeted vaccine-preventable diseases (VPDs), vaccine safety concerns become more prominent. Such concerns are frequently a mix of coincidental adverse events falsely attributed to the memorable immunization event and real vaccine-induced reactions.(1,2) Sorting out the two types of concerns requires implementation of appropriate surveillance systems for adverse events following immunization (AEFIs), and timely and rigorous scientific assessment (and occasionally good media skills) to maintain public confidence in immunization programs.(3) Failures to do so have tragically resulted in resurgence of VPDs in multiple countries.(4,5) | Traditionally, in high-income countries, large populations with computerized databases that link vaccination history exposures and medical visits have been used for rigorous testing of hypothesized vaccine safety concerns raised by passive surveillance systems for AEFIs.(6,7) Pilot projects for similar large linked databases (LLDBs) in low- and middle-income countries (LMICs) have begun, but are not without problems.(8) Given the major challenges in obtaining the substantial resources needed to develop and sustain such LLDBs,(9) affordable, timely and reliable alternative solutions for LMICs (even if imperfect) are needed. This need is highlighted by the accelerated introduction of new vaccines for diseases endemic in LMICs (e.g., meningitis A, rotavirus, cholera, and dengue) without prior use in countries with strong pharmacovigilance systems.(10) |
Impact of the US maternal tetanus, diphtheria, and acellular pertussis vaccination program on preventing pertussis in infants <2 months of age: A case-control evaluation
Skoff TH , Blain AE , Watt J , Scherzinger K , McMahon M , Zansky SM , Kudish K , Cieslak PR , Lewis M , Shang N , Martin SW . Clin Infect Dis 2017 65 (12) 1977-1983 Background: Infants aged <1 year are at highest risk for pertussis-related morbidity and mortality. In 2012, Tdap (tetanus toxoid, reduced diphtheria toxoid, and acellular pertussis) vaccine was recommended for women during each pregnancy to protect infants in the first months of life; data on effectiveness of this strategy are currently limited. Methods: We conducted a case-control evaluation among pertussis cases <2 months old with cough onset between 1 January 2011 and 31 December 2014 from 6 US Emerging Infection Program Network states. Controls were hospital-matched and selected by birth certificate. Mothers were interviewed to collect information on demographics, household characteristics, and healthcare providers. Provider-verified immunization history was obtained on mothers and infants. Mothers were considered vaccinated during pregnancy if Tdap was received ≥14 days before delivery; trimester was calculated using Tdap date, infant's date of birth, and gestational age. Odds ratios were calculated using multivariable conditional logistic regression; vaccine effectiveness (VE) was estimated as (1 - odds ratio) x 100%. Results: A total of 240 cases and 535 controls were included; 17 (7.1%) case mothers and 90 (16.8%) control mothers received Tdap during the third trimester of pregnancy. The multivariable VE estimate for Tdap administered during the third trimester of pregnancy was 77.7% (95% confidence interval [CI], 48.3%-90.4%); VE increased to 90.5% (95% CI, 65.2%-97.4%) against hospitalized cases. Conclusions: Vaccination during pregnancy is an effective way to protect infants during the early months of life. With a continuing resurgence in pertussis, efforts should focus on maximizing Tdap uptake among pregnant women. |
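The abstract's estimator, VE = (1 - odds ratio) x 100%, also implies that a confidence interval for the OR maps to a VE interval with the bounds swapped. The sketch below back-calculates an OR and CI from the reported VE of 77.7% (95% CI, 48.3%-90.4%) purely to illustrate the transformation; these are not the fitted regression outputs.

```python
def ve_from_or(or_point, or_lo, or_hi):
    """VE = (1 - OR) x 100%; the OR CI's upper bound gives the VE lower
    bound and vice versa."""
    ve = (1.0 - or_point) * 100.0
    ve_lo = (1.0 - or_hi) * 100.0
    ve_hi = (1.0 - or_lo) * 100.0
    return ve, ve_lo, ve_hi

ve, lo, hi = ve_from_or(0.223, 0.096, 0.517)
print(f"VE = {ve:.1f}% (95% CI, {lo:.1f}%-{hi:.1f}%)")
```

The bound swap matters in practice: a narrow OR interval near zero translates into a high VE with a wide lower tail, which is why VE confidence intervals are often asymmetric.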
Antibody response to human papillomavirus vaccination and natural exposure in individuals with Fanconi Anemia
Mehta PA , Sauter S , Zhang X , Davies SM , Wells SI , Myers KC , Panicker G , Unger ER , Butsch Kovacic M . Vaccine 2017 35 6712-6719 Fanconi anemia (FA) is a rare genetic disorder associated with predisposition to head and neck and gynecological squamous cell cancers. In the general population, these cancers are commonly linked to human papillomavirus (HPV) infection. Antibodies to natural HPV infection and HPV vaccination were evaluated in 63 individuals with FA while considering host immune factors. Approximately 30% of reportedly unvaccinated participants were seropositive (HPV6-38%, HPV11-25%, HPV16-26%, and HPV18-26%). Seropositivity was significantly associated with having had sex, regardless of age (p=.007). Most participants showed seropositivity after HPV vaccination (HPV6-100%, HPV11-100%, HPV16-100%, and HPV18-92%). Interestingly, titers for all 4 subtypes were significantly lower in post-hematopoietic stem cell transplant (HSCT) participants compared to those who received the vaccine but had not undergone HSCT (HPV6-p=.030, HPV11-p=.003, HPV16-p=.018, HPV18-p<.001). It is unclear whether these titers sufficiently protect against new infection, since protective serologic cutoffs have not yet been defined for the HPV vaccine. Individual immune functions were not associated with HPV seropositivity; however, underlying heterogeneous immune deficiency may explain the higher rates of seropositivity in our younger unvaccinated participants (age 4-13 years). To better measure the efficacy of HPV vaccination in those with FA and other immune-compromised or cancer-prone disorders, future well-controlled vaccine studies are required. |
Experience and compliance with scanning vaccines' two-dimensional barcodes to record data
Evanson HV , Rodgers L , Reed J , Daily A , Gerlach K , Greene M , Koeppl P , Cox R , Williams W . Comput Inform Nurs 2017 36 (1) 8-17 Automated population of data into health information system fields offers the potential to increase efficiencies and save time. Increasingly, as two-dimensional barcoded vaccine products and barcode scanning technology become more widely available, manual recording of vaccine data can be reduced. This evaluation explores how often two-dimensional barcodes on vaccine vials and syringes were scanned and the perceived benefits and challenges reported by vaccine providers. Eighty-two facilities that administer vaccines completed the evaluation. Twenty-seven of those facilities provided records from vaccines administered between July 2014 and January 2015. Among the 63 179 two-dimensional barcoded vaccine administrations recorded, 12 408 (19%) were scanned. We received 116 user surveys from 63 facilities; using content analysis, we identified perceived benefits of scanning, workflow challenges, scanning challenges, and other challenges. The findings of this evaluation can guide health information system developers, vaccine manufacturers, and vaccine providers on how to remove potential barriers to using two-dimensional barcode scanning. |
STEADI: CDC's approach to make older adult fall prevention part of every primary care practice
Sarmiento K , Lee R . J Safety Res 2017 63 105-109 Introduction: Primary care providers play a critical role in protecting older adult patients from one of the biggest threats to their health and independence: falls. A fall among an older adult patient can not only be fatal or cause a devastating injury, but can also lead to problems that affect the patient's overall quality of life. Methods: In response, the Centers for Disease Control and Prevention (CDC) developed the STEADI initiative to give health care providers the tools they need to help reduce their older adult patients' risk of a fall. Results: CDC's STEADI resources have been distributed widely and include practical materials and tools for health care providers and their patients that are designed to be integrated into every primary care practice. Conclusion: As the population ages, the need for fall prevention efforts, such as CDC's STEADI, will become increasingly critical to safeguard the health of Americans. Practical applications: STEADI's electronic health record (EHR) tools, online trainings, assessment tools, and patient education materials are available at no cost and can be downloaded online at www.cdc.gov/STEADI. Health care providers should look for opportunities to integrate STEADI materials into their practice, using a team-based approach, to help protect their older patients. |
Recommendations for Laboratory Containment and Management of Gene Drive Systems in Arthropods.
Benedict MQ , Burt A , Capurro ML , De Barro P , Handler AM , Hayes KR , Marshall JM , Tabachnick WJ , Adelman ZN . Vector Borne Zoonotic Dis 2017 18 (1) 2-13 Versatile molecular tools for creating driving transgenes and other invasive genetic factors present regulatory, ethical, and environmental challenges that should be addressed to ensure their safe use. In this article, we discuss driving transgenes and invasive genetic factors that can potentially spread after their introduction into a small proportion of individuals in a population. The potential of invasive genetic factors to increase their number in natural populations presents challenges that require additional safety measures not provided by previous recommendations regarding accidental release of arthropods. In addition to providing physical containment, invasive genetic factors require greater attention to strain management, including their distribution and identity confirmation. In this study, we focus on insects containing such factors with recommendations for investigators who are creating them, institutional biosafety committees charged with ensuring safety, funding agencies providing support, those managing insectaries handling these materials who are responsible for containment, and other persons who will be receiving insects-transgenic or not-from these facilities. We give specific examples of efforts to modify mosquitoes for mosquito-borne disease control, but similar considerations are relevant to other arthropods that are important to human health, the environment, and agriculture. |
Rapid determination of ebolavirus infectivity in clinical samples using a novel reporter cell line
Kainulainen MH , Nichol ST , Albarino CG , Spiropoulou CF . J Infect Dis 2017 216 (11) 1380-1385 Modern ebolavirus diagnostics rely primarily on qRT-PCR, a sensitive method to detect viral genetic material in the acute phase of the disease. However, qRT-PCR does not confirm presence of infectious virus, presenting limitations in patient and outbreak management. Attempts to isolate infectious virus rely on in vivo or basic cell culture approaches, which prohibit rapid results and screening. Here we present a novel reporter cell line capable of detecting live ebolaviruses. These cells permit sensitive large-scale screening and titration of infectious virus in experimental and clinical samples, independent of ebolavirus species and variant. |
Screen collection efficiency of airborne fibers with monodisperse length
Ku BK , Deye G , Turkevich LA . J Aerosol Sci 2017 114 250-262 Fiber length is believed to be an important variable in determining various toxicological responses to asbestos and other elongate mineral particles. In this study we investigated screen collection characteristics using monodisperse-length glass fibers (i.e., 11, 15, 25, and 53 µm in length), to better understand the collection of fibers with different lengths on screens with different mesh sizes. A well-dispersed aerosol of glass fibers (geometric mean length ~ 20 µm), generated by vortex shaking, was fed directly into the Baron Fiber Length Classifier, in order to produce monodisperse length fibers. With nylon mesh screens (10, 20, 30, 41 and 60 µm mesh sizes), the screen collection efficiency was measured using an aerodynamic particle sizer. As the screen mesh size decreases from 60 µm to 10 µm, the screen collection efficiency for 53 µm fibers increases (from 0.3 to 0.9) while 11 µm fibers exhibited a collection efficiency independent of screen mesh size. The collection efficiency for the longest fibers was found to be nearly constant for aerodynamic diameters 1–4 µm for screens 20 and 30 µm, but to rise significantly at aerodynamic diameters larger than 4 µm. For the 20 µm screen, the collection efficiency for fibers with lengths > 20 µm is a factor of two to five larger than that for spherical particles with the same aerodynamic diameter. We believe that fibers are collected on the screen primarily by interception below 4 µm in aerodynamic diameter, and by impaction above 4 µm. This study represents a fundamental advance in the understanding of the interaction of screens with a fibrous aerosol. |
Comparison of classical multi-locus sequence typing software for next-generation sequencing data
Page AJ , Alikhan NF , Carleton HA , Seemann T , Keane JA , Katz LS . Microb Genom 2017 3 (8) e000124 Multi-locus sequence typing (MLST) is a widely used method for categorizing bacteria. Increasingly, MLST is being performed using next-generation sequencing (NGS) data by reference laboratories and for clinical diagnostics. Many software applications have been developed to calculate sequence types from NGS data; however, there has been no comprehensive review to date on these methods. We have compared eight of these applications against real and simulated data, and present results on: (1) the accuracy of each method against traditional typing methods, (2) the performance on real outbreak datasets, (3) the impact of contamination and varying depth of coverage, and (4) the computational resource requirements. |
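For context, the core computation all such tools share, once alleles have been called from the reads, reduces to a profile-table lookup: seven housekeeping-gene allele numbers jointly map to a sequence type (ST). The loci, profiles, and STs below are hypothetical toy values; this is not one of the eight applications evaluated.

```python
# Toy MLST profile table: a 7-allele combination maps to a sequence type.
PROFILES = {
    (1, 1, 1, 1, 1, 1, 1): 1,  # ST1
    (2, 1, 1, 1, 1, 1, 1): 2,  # ST2, a single-locus variant of ST1
}

def sequence_type(alleles):
    """Return the ST for a 7-allele combination, or None if the profile
    is novel (i.e., not yet assigned an ST)."""
    return PROFILES.get(tuple(alleles))

print(sequence_type([1, 1, 1, 1, 1, 1, 1]))  # -> 1
print(sequence_type([3, 1, 1, 1, 1, 1, 1]))  # -> None (novel profile)
```

Real implementations differ mainly in how they derive the allele numbers from NGS data (assembly-based vs. read-mapping approaches), which is precisely what drives the accuracy, contamination-sensitivity, and resource differences the comparison measures.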
Correlation of treponemal immunoassay signal strength values with reactivity of confirmatory treponemal testing
Fakile YF , Jost H , Hoover KW , Gustafson KJ , Novak-Weekley SM , Schapiro JM , Tran A , Chow JM , Park IU . J Clin Microbiol 2017 56 (1) Automated treponemal immunoassays are used for syphilis screening with the reverse sequence algorithm; discordant results (e.g., enzyme immunoassay [EIA]-reactive, rapid plasma reagin [RPR]-non-reactive) are resolved with a second treponemal test. We conducted a study to determine automated immunoassay signal strength values that consistently correlate with reactive confirmatory treponemal testing. We conducted a cross-sectional analysis of four automated immunoassays: the BioPlex 2200 microbead immunoassay (MBIA), LIAISON chemiluminescence immunoassay (CIA), ADVIA-Centaur CIA, and TrepSure EIA; and three manual assays: the Treponema pallidum particle agglutination (TP-PA) test, the fluorescent treponemal antibody-absorption (FTA-ABS) test, and the INNO-LIA line immunoassay. We compared signal strength values of the automated immunoassays with positive and negative agreement. Among 1995 specimens, 908 (45.5%) were true positives (≥4/7 tests reactive) and 1087 (54.5%) were true negatives (≥4/7 tests non-reactive). Positive agreement ranged from 86.1% (83.7-88.2%) for FTA-ABS to 99.7% (99.0-99.9%) for ADVIA-Centaur CIA; negative agreement ranged from 86.3% (84.1-88.2%) for TrepSure EIA to 100% (99.6-100%) for TP-PA. Increasing signal strength values correlated with increasing reactivity of confirmatory testing (ptrend <0.0001 for all automated immunoassays). All automated immunoassays had signal strength cutoffs corresponding to ≥4/7 reactive treponemal tests. BioPlex MBIA and LIAISON CIA had signal strength cutoffs correlating with ≥99% and 100% TP-PA reactivity, respectively. ADVIA-Centaur CIA and TrepSure EIA had signal strength cutoffs correlating with at least 95% TP-PA reactivity. All automated immunoassays had signal strength cutoffs correlating with at least 95% FTA-ABS reactivity. 
Assuming that a 95% level of confirmation is adequate, these signal strength values can be used in lieu of confirmatory testing with TP-PA and FTA-ABS. |
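The ≥4/7 consensus rule above can be made concrete with a short sketch. This is a hypothetical illustration, not the study's code; the test names are placeholders, and the only logic taken from the abstract is that a specimen is a true positive when at least 4 of the 7 treponemal tests are reactive.

```python
# Hypothetical sketch of the study's consensus classification: a specimen is a
# "true positive" when >=4 of the 7 treponemal tests are reactive, otherwise a
# "true negative" (>=4 of 7 non-reactive). With an odd number of tests (7),
# the two conditions are mutually exclusive, so a simple cutoff suffices.
def classify_specimen(results):
    """results: dict mapping test name -> True (reactive) / False (non-reactive)."""
    if len(results) != 7:
        raise ValueError("expected results from all 7 treponemal tests")
    reactive = sum(results.values())  # True counts as 1
    return "true positive" if reactive >= 4 else "true negative"
```

For example, a specimen reactive on 5 of 7 tests classifies as a true positive, and one reactive on only 2 of 7 as a true negative.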
Development and characterization of novel chimeric monoclonal antibodies for broad spectrum neutralization of rabies virus
Kim PK , Keum SJ , Osinubi MOV , Franka R , Shin JY , Park ST , Kim MS , Park MJ , Lee SY , Carson W , Greenberg L , Yu P , Tao X , Lihua W , Tang Q , Liang G , Shampur M , Rupprecht CE , Chang SJ . PLoS One 2017 12 (10) e0186380 Current post-exposure prophylaxis for rabies virus infection has several limitations in terms of supply, cost, safety, and efficacy. Attempts to replace human or equine rabies immune globulins (HRIG or ERIG) have been made by several companies and institutes. We developed potent monoclonal antibodies to neutralize a broad spectrum of rabies viruses by screening hybridomas received from the U.S. Centers for Disease Control and Prevention (CDC). Two chimeric human antibodies (chimeric #7 and #17) were constructed by cloning the variable regions from selected hybridomas onto the constant region of a human antibody. The two antibodies bound antigenic sites III and I/IV, respectively, and neutralized 51 field isolates of rabies virus collected at different times and places in Asia, Africa, North America, South America, and Australia. These two antibodies neutralized rabies viruses with high efficacy in in vivo tests using Syrian hamster and mouse models and showed a low risk of adverse immunogenicity. |
Government information systems to monitor complementary feeding programs for young children
Jefferds MED . Matern Child Nutr 2017 13 Suppl 2 Accelerating progress to improve complementary feeding of young children is a global priority. Strengthening monitoring through government information systems may increase the quality and implementation of infant and young child feeding (IYCF) programs. Monitoring is necessary for the effective implementation of programs because it allows program managers to assess program performance, identify problems, and take corrective action. Program descriptions and conceptual models explain how program inputs and activities should lead to outputs, outcomes, and ultimately public health impact; they are critical tools when designing effective IYCF programs and monitoring systems because they form the basis for the program and are key for developing the monitoring system, its indicators, and its tools. Despite their importance, many programs have documented neither these descriptions and models nor their monitoring plans, limiting their ability to design effective programs and monitoring systems. Once a system is in place, it is important to review it periodically to confirm that it still meets stakeholder needs and that the data are being used to inform decision-making and program adjustments, because the monitoring focus, resources, or capacity may change during the program lifecycle. Including priority indicators of IYCF practices and counseling in government information systems may strengthen IYCF programs when the indicators are contextualized to the government IYCF program, capacity, and setting, and are used for decision-making and program improvement. |
Comparison of methods to assess consumption of micronutrient powders among young children in Nepal
Ng'eno BN , Perrine CG , Subedi GR , Mebrahtu S , Dahal P , Jefferds MED . Food Nutr Bull 2017 38 (3) 441-446 BACKGROUND: Assessing micronutrient powder (MNP) consumption is key to monitoring program performance, yet no gold standard exists for assessing consumption in nutrition programs. OBJECTIVE: To compare estimates of MNP consumption assessed by maternal report versus observed unopened MNP sachets in the household. METHODS: Cross-sectional household surveys of children aged 6 to 23 months were conducted to assess an MNP project in Nepal; eligible children received 60 sachets per distribution. Mothers reported the number of sachets consumed and showed unused sachets. The directly observed difference (DOD) of MNP consumption was calculated by subtracting the number of observed unopened sachets from 60. The Spearman correlation coefficient, categories of MNP consumption, and end-digit preference were assessed. RESULTS: A total of 205 mothers did not show remaining unopened sachets despite reporting that not all had been consumed. For the remaining 605 children, median consumption was 60.0 sachets by both DOD and maternal report; the correlation coefficient was 0.91. With consumption grouped into categories of 0 to 14, 15 to 29, 30 to 44, and 45 to 60 sachets, the percentages categorized into the same groupings by DOD and maternal report were 100%, 80.6%, 80.7%, and 91.2%, respectively. Excluding those who consumed all 60 sachets, reported consumption ended in 0 or 5 for 16.9% and 8.0% of children, respectively, versus 14.2% and 6.1% by DOD. CONCLUSION: Had observation of unused sachets been used alone to assess MNP consumption, 205 children would not have been assessed. Estimates of MNP consumption by DOD and maternal report were similar in this population with high intake adherence. |
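The DOD measure and the consumption bands described above reduce to simple arithmetic, sketched below. This is an illustrative reconstruction under the abstract's stated assumptions (60 sachets distributed; bands of 0-14, 15-29, 30-44, and 45-60), not the study's analysis code.

```python
# Sketch of the directly observed difference (DOD): each eligible child
# received 60 MNP sachets, and DOD = 60 minus the unopened sachets observed.
SACHETS_DISTRIBUTED = 60

def directly_observed_difference(unopened_observed):
    if not 0 <= unopened_observed <= SACHETS_DISTRIBUTED:
        raise ValueError("unopened sachets must be between 0 and 60")
    return SACHETS_DISTRIBUTED - unopened_observed

def consumption_category(sachets):
    # The survey's four analysis bands: 0-14, 15-29, 30-44, 45-60.
    for low, high in ((0, 14), (15, 29), (30, 44), (45, 60)):
        if low <= sachets <= high:
            return f"{low}-{high}"
    raise ValueError("sachet count out of range")
```

For instance, observing 12 unopened sachets gives a DOD of 48, which falls in the 45-60 band.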
Unmanned aerial vehicles in construction and worker safety
Howard J , Murashov V , Branche CM . Am J Ind Med 2017 61 (1) 3-10 Applications of unmanned aerial vehicles (UAVs) for military, recreational, public, and commercial uses have expanded significantly in recent years. In the construction industry, UAVs are used primarily for monitoring of construction workflow and job site logistics, inspecting construction sites to assess structural integrity, and for maintenance assessments. As is the case with other emerging technologies, occupational safety assessments of UAVs lag behind technological advancements. UAVs may create new workplace hazards that need to be evaluated and managed to ensure their safe operation around human workers. At the same time, UAVs can perform dangerous tasks, thereby improving workplace safety. This paper describes the four major uses of UAVs, including their use in construction, the potential risks of their use to workers, approaches for risk mitigation, and the important role that safety and health professionals can play in ensuring safe approaches to their use in the workplace. |
Leveraging the domain of work to improve migrant health
Flynn MA , Wickramage K . Int J Environ Res Public Health 2017 14 (10) Work is a principal driver of current international migration, a primary social determinant of health, and a fundamental point of articulation between migrants and their host society. Efforts by international organizations to promote migrant health have traditionally focused on infectious diseases and access to healthcare, while international labor organizations have largely focused on issues of occupational health. The underutilization of the domain of work in addressing the health of migrants is truly a missed opportunity for influencing worker well-being and reducing societal economic burden. Understanding of the relationships among migration, work, and health would facilitate further integration of migrant health concerns into the policy agenda of governments and international agencies that work at the nexus of labor, health and development. The domain of work offers an opportunity to capitalize on the existing health and development infrastructure and leverage technical resources, programs and research to promote migrant health. It also provides the opportunity to advance migrant health through new and innovative approaches and partnerships. |
Prediction of WBGT-based clothing adjustment values from evaporative resistance
Bernard TE , Ashley CD , Garzon XP , Kim JH , Coca A . Ind Health 2017 The wet bulb globe temperature (WBGT) index is used by many professionals, in combination with metabolic rate and clothing adjustments, to assess whether a heat stress exposure is sustainable. The progressive heat stress protocol is a systematic method to prescribe a clothing adjustment value (CAV) from human wear trials, and it also provides an estimate of apparent total evaporative resistance (Re,T,a). There is a direct relationship between the two descriptors of clothing thermal effects, with diminishing increases in CAV at high Re,T,a. Data also suggest an interaction of CAV and Re,T,a with relative humidity at high evaporative resistance. Because human trials are expensive, manikin data can reduce the cost by considering the static total evaporative resistance (Re,T,s). As the static evaporative resistance increases, the CAV increases in a similar fashion to Re,T,a. While the results suggest that Re,T,s can predict CAV, further validation is needed, especially at high evaporative resistance, and the data support only air velocities near 0.5 m/s. |
Hospital security director background, opinions, and the implementation of security programs
Blando JD , Ridenour ML , Hartley D , Nocera M . J Appl Secur Res 2017 12 (4) 497-511 Effective security is crucial to the functioning of a hospital because it impacts patient care, employee satisfaction and turnover, and patient confidence in the healthcare facility to provide a safe environment for medical care. A survey was conducted of NJ hospital security directors to describe their security programs, assess compliance with statewide workplace violence prevention regulations, and evaluate the influence of their experience and opinions on the comprehensiveness of their security program. The surveyed security programs (n = 52) had partial compliance with the regulations, security directors (n = 35) viewed the regulations positively but also had suggestions for improvements, and having a director with law enforcement experience did not improve regulatory compliance. |
Malaria prevalence, prevention and treatment seeking practices among nomadic pastoralists in northern Senegal
Seck MC , Thwing J , Fall FB , Gomis JF , Deme A , Ndiaye YD , Daniels R , Volkman SK , Ndiop M , Ba M , Ndiaye D . Malar J 2017 16 (1) 413 BACKGROUND: Malaria transmission in Senegal is highly stratified, from low in the dry north to moderately high in the moist south. In northern Senegal, along the Senegal River Valley and in the Ferlo semi-desert region, annual incidence is less than five cases per 1000 inhabitants. Many nomadic pastoralists have permanent dwellings in the Ferlo Desert and Senegal River Valley but spend the dry season in the south with their herds, returning north when the rains start, leading to a concern that this population could contribute to ongoing transmission in the north. METHODS: A modified snowball sampling survey was conducted at six sites in northern Senegal to determine the malaria prevention and treatment seeking practices and parasite prevalence among nomadic pastoralists in the Senegal River Valley and the Ferlo Desert. Nomadic pastoralists aged 6 months and older were surveyed during September and October 2014, and data regarding demographics, access to care, and preventive measures were collected. Parasite infection was detected using rapid diagnostic tests (RDTs), microscopy (thin and thick smears), and polymerase chain reaction (PCR). Molecular barcodes were determined by high resolution melting (HRM). RESULTS: Of 1800 participants, 61% were male. Sixty-four percent had at least one bed net in the household, and 53% reported using a net the night before. Only 29% had received a net from a mass distribution campaign. Of the 8% (142) who reported having had fever in the last month, 55% sought care, 20% of whom received a diagnostic test, one-third of which (n = 5) were reported to be positive. Parasite prevalence was 0.44% by thick smear and 0.50% by PCR. None of the molecular barcodes identified among the nomadic pastoralists had been previously identified in Senegal. 
CONCLUSIONS: While access to and utilization of malaria control interventions among nomadic pastoralists was lower than the general population, parasite prevalence was lower than expected and sheds doubt on the perception that they are a source of ongoing transmission in the north. The National Malaria Control Program is making efforts to improve access to malaria prevention and case management for nomadic populations. |
Possible role of fish as transport hosts for Dracunculus spp. larvae
Cleveland CA , Eberhard ML , Thompson AT , Smith SJ , Zirimwabagabo H , Bringolf R , Yabsley MJ . Emerg Infect Dis 2017 23 (9) 1590-1592 To inform Dracunculus medinensis (Guinea worm) eradication efforts, we evaluated the role of fish as transport hosts for Dracunculus worms. Ferrets fed fish that had ingested infected copepods became infected, highlighting the importance of recommendations to cook fish, bury entrails, and prevent dogs from consuming raw fish and entrails. |
Promoting regulatory reform: The African Health Profession Regulatory Collaborative (ARC) for Nursing and Midwifery year 4 evaluation
Kelley MA , Spangler SA , Tison LI , Johnson CM , Callahan TL , Iliffe J , Hepburn KW , Gross JM . J Nurs Regul 2017 8 (3) 41-52 As countries across sub-Saharan Africa work towards universal health coverage and HIV epidemic control, investments seek to bolster the quality and relevance of the health workforce. The African Health Profession Regulatory Collaborative (ARC) partnered with 17 countries across East, Central, and Southern Africa to ensure nurses and midwives were authorized and equipped to provide essential HIV services to pregnant women and children with HIV. Through ARC, nursing leadership teams representing each country identify a priority regulatory function and develop a proposal to strengthen that regulation over a 1-year period. Each year culminates with a summative congress meeting, involving all ARC countries, where teams present their projects and share lessons learned with their colleagues. During a recent ARC Summative Congress, a group survey was administered to 11 country teams that received ARC Year 4 grants to measure advancements in regulatory function using the five-stage Regulatory Function Framework, and a group questionnaire was administered to 16 country teams to measure improvements in national nursing capacity (February 2011–2016). In ARC Year 4, eight countries implemented continuing professional development projects, Botswana revised their scope of practice, Mozambique piloted a licensing examination to assess HIV-related competencies, and South Africa developed accreditation standards for HIV/tuberculosis specialty nurses. Countries reported improvements in national nursing leaders’ teamwork, collaborations with national organizations, regional networking with nursing leaders, and the ability to garner additional resources. 
ARC provides an effective, collaborative model to rapidly strengthen national regulatory frameworks, which other health professional cadres or regions may consider using to ensure a relevant health workforce, authorized and equipped to meet the emerging demand for health services. |
Uptake and correlates of contraception among postpartum women in Kenya: results from a national cross-sectional survey
Achwoka D , Pintye J , McGrath CJ , Kinuthia J , Unger JA , Obudho N , Langat A , John-Stewart G , Drake AL . Contraception 2017 97 (3) 227-235 OBJECTIVES: To characterize uptake and correlates of effective contraceptive use postpartum. STUDY DESIGN: We analyzed data from a national, cross-sectional evaluation of prevention of mother-to-child HIV transmission programs that enrolled women attending 6-week or 9-month infant immunization visits at 120 Kenyan maternal and child health clinics. We classified women who resumed sexual activity postpartum and did not desire a child within 2 years as having a need for family planning (FP). RESULTS: We included 955 (94%) of 1012 women 8-10 months postpartum in the analysis. Mean age was 25.8 years and 36% were primigravidas. By 9 months postpartum, 62% of all women used contraception and 59% used effective contraception (injectables, implants, intrauterine devices [IUDs], oral contraceptives [OCs], and tubal ligations). Most contraceptive users (61%) used injectables, followed by implants (10%), OCs (6%), IUDs (4%), and condoms alone (2%). The majority (n=733, 77%) had a need for FP, and 67% of these 733 women used effective contraception. Among women with a need for FP, effective contraception use was higher among those who discussed FP in postnatal care (PNC) than among those who did not (prevalence ratio [PR] for PNC alone: 1.35, 95% confidence interval [CI]: 1.16-1.58; PR for PNC and antenatal care [ANC]: 1.42, 95% CI: 1.21-1.67; p=.001 for both). CONCLUSIONS: Two-thirds of postpartum women with a need for FP used effective contraception at 9 months postpartum, and use was associated with discussing FP during PNC. IMPLICATIONS: Integrating FP counseling in ANC/PNC could be an effective strategy to increase effective contraception use. |
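The prevalence ratio (PR) comparison above is the standard cross-sectional measure: the prevalence of the outcome in the exposed group divided by the prevalence in the unexposed group. The sketch below is illustrative; the counts are hypothetical round numbers chosen to reproduce a PR of 1.35, since the abstract reports only the ratios.

```python
# Hedged sketch of a prevalence ratio: prevalence of effective contraception
# use among women who discussed FP in PNC ("exposed") divided by prevalence
# among those who did not ("unexposed"). Counts are hypothetical.
def prevalence_ratio(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    return (cases_exposed / n_exposed) / (cases_unexposed / n_unexposed)

# e.g., 81 of 100 users among the exposed vs 60 of 100 among the unexposed
example_pr = prevalence_ratio(81, 100, 60, 100)  # -> 1.35
```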
Association between biomarkers of ovarian reserve and infertility among older women of reproductive age
Steiner AZ , Pritchard D , Stanczyk FZ , Kesner JS , Meadows JW , Herring AH , Baird DD . JAMA 2017 318 (14) 1367-1376 Importance: Despite lack of evidence of their utility, biomarkers of ovarian reserve are being promoted as potential markers of reproductive potential. Objective: To determine the associations between biomarkers of ovarian reserve and reproductive potential among women of late reproductive age. Design, Setting, and Participants: Prospective time-to-pregnancy cohort study (2008 to date of last follow-up in March 2016) of women (N = 981) aged 30 to 44 years without a history of infertility who had been trying to conceive for 3 months or less, recruited from the community in the Raleigh-Durham, North Carolina, area. Exposures: Early-follicular-phase serum level of antimullerian hormone (AMH), follicle-stimulating hormone (FSH), and inhibin B and urinary level of FSH. Main Outcomes and Measures: The primary outcomes were the cumulative probability of conception by 6 and 12 cycles of attempt and relative fecundability (probability of conception in a given menstrual cycle). Conception was defined as a positive pregnancy test result. Results: A total of 750 women (mean age, 33.3 [SD, 3.2] years; 77% white; 36% overweight or obese) provided a blood and urine sample and were included in the analysis. After adjusting for age, body mass index, race, current smoking status, and recent hormonal contraceptive use, women with low AMH values (<0.7 ng/mL [n = 84]) did not have a significantly different predicted probability of conceiving by 6 cycles of attempt (65%; 95% CI, 50%-75%) compared with women (n = 579) with normal values (62%; 95% CI, 57%-66%) or by 12 cycles of attempt (84% [95% CI, 70%-91%] vs 75% [95% CI, 70%-79%], respectively). 
Women with high serum FSH values (>10 mIU/mL [n = 83]) did not have a significantly different predicted probability of conceiving after 6 cycles of attempt (63%; 95% CI, 50%-73%) compared with women (n = 654) with normal values (62%; 95% CI, 57%-66%) or after 12 cycles of attempt (82% [95% CI, 70%-89%] vs 75% [95% CI, 70%-78%], respectively). Women with high urinary FSH values (>11.5 mIU/mg creatinine [n = 69]) did not have a significantly different predicted probability of conceiving after 6 cycles of attempt (61%; 95% CI, 46%-74%) compared with women (n = 660) with normal values (62%; 95% CI, 58%-66%) or after 12 cycles of attempt (70% [95% CI, 54%-80%] vs 76% [95% CI, 72%-80%], respectively). Inhibin B levels (n = 737) were not associated with the probability of conceiving in a given cycle (hazard ratio per 1-pg/mL increase, 0.999; 95% CI, 0.997-1.001). Conclusions and Relevance: Among women aged 30 to 44 years without a history of infertility who had been trying to conceive for 3 months or less, biomarkers indicating diminished ovarian reserve compared with normal ovarian reserve were not associated with reduced fertility. These findings do not support the use of urinary or blood follicle-stimulating hormone tests or antimullerian hormone levels to assess natural fertility for women with these characteristics. |
Underlying factors in drug overdose deaths
Dowell D , Noonan RK , Houry D . JAMA 2017 318 (23) 2295-2296 Drug overdose accounted for 52 404 deaths in the United States in 2015,1 more deaths than AIDS caused at its peak in 1995. Provisional data from the US Centers for Disease Control and Prevention (CDC) indicate drug overdose deaths increased again from 2015 to 2016 by more than 20% (from 52 898 deaths in the year ending in January 2016 to 64 070 deaths in the year ending in January 2017).2 Increases are greatest for overdoses related to the category including illicitly manufactured fentanyl (ie, synthetic opioids excluding methadone), which more than doubled, accounting for more than 20 000 overdose deaths in 2016 vs less than 10 000 deaths in 2015. This difference is enough to account for nearly all the increase in drug overdose deaths from 2015 to 2016.2 | Since 2010, overdose deaths involving predominantly illicit opioids (heroin, synthetic nonmethadone opioids, or both) have increased by more than 200% (Figure). Why have overdose deaths related to illicit opioids increased so substantially? Data from the National Survey on Drug Use and Health reveal moderate increases in people reporting past-year heroin use from 2010 to 2015 (Figure). Increasingly, individuals who use heroin are younger, might be less experienced, and might use heroin in riskier ways that are difficult to measure (eg, using it alone, using more heroin, using it more often, or combining drugs). |
Notes from the field: Counterfeit Percocet-related overdose cluster - Georgia, June 2017
Edison L , Erickson A , Smith S , Lopez G , Hon S , King A , Nydam N , O'Neal JP , Drenzek C . MMWR Morb Mortal Wkly Rep 2017 66 (41) 1119-1120 On June 5, 2017, a Georgia North-Central Health District emergency department (ED) notified the Georgia Poison Center of six opioid overdoses and one death during the previous day. All patients had severe respiratory depression, loss of consciousness, or both, and some required high naloxone doses and mechanical ventilation. Two patients reported taking one or two pills that they believed to be Percocet, purchased without a prescription, on the street. | The Georgia Poison Center notified area hospitals and a Georgia Department of Public Health (GDPH) epidemiologist, who informed partners, including 1) health district epidemiologists, who worked with hospitals; 2) the Georgia Bureau of Investigation, which performed drug testing; 3) the High Intensity Drug Trafficking Area office, which notified law enforcement; 4) local coroners, who reported related deaths to GDPH; and 5) the GDPH Office of Emergency Medical Services (EMS), which notified EMS providers and the medical community. A coordinated communication effort led to two multiagency press conferences on June 6 to notify the public about the presence of the dangerous counterfeit pills. |
Illicit drug use, illicit drug use disorders, and drug overdose deaths in metropolitan and nonmetropolitan areas - United States
Mack KA , Jones CM , Ballesteros MF . MMWR Surveill Summ 2017 66 (19) 1-12 PROBLEM/CONDITION: Drug overdoses are a leading cause of injury death in the United States, resulting in approximately 52,000 deaths in 2015. Understanding differences in illicit drug use, illicit drug use disorders, and overall drug overdose deaths in metropolitan and nonmetropolitan areas is important for informing public health programs, interventions, and policies. REPORTING PERIOD: Illicit drug use and drug use disorders during 2003-2014, and drug overdose deaths during 1999-2015. DESCRIPTION OF DATA: The National Survey on Drug Use and Health (NSDUH) collects information through face-to-face household interviews about the use of illicit drugs, alcohol, and tobacco among the U.S. noninstitutionalized civilian population aged ≥12 years. Respondents include residents of households and noninstitutional group quarters (e.g., shelters, rooming houses, dormitories, migratory workers' camps, and halfway houses) and civilians living on military bases. NSDUH variables include sex, age, race/ethnicity, residence (metropolitan/nonmetropolitan), annual household income, self-reported drug use, and drug use disorders. National Vital Statistics System Mortality (NVSS-M) data for U.S. residents include information from death certificates filed in the 50 states and the District of Columbia. Cases were selected with an underlying cause of death based on the ICD-10 codes for drug overdoses (X40-X44, X60-X64, X85, and Y10-Y14). NVSS-M variables include decedent characteristics (sex, age, and race/ethnicity) and information on intent (unintentional, suicide, homicide, or undetermined), location of death (medical facility, in a home, or other [including nursing homes, hospices, unknown, and other locations]), and county of residence (metropolitan/nonmetropolitan). Metropolitan/nonmetropolitan status is assigned independently in each data system. 
NSDUH uses a three-category system: Core Based Statistical Area (CBSA) of ≥1 million persons; CBSA of <1 million persons; and not a CBSA, labeled for simplicity as large metropolitan, small metropolitan, and nonmetropolitan. Deaths from NVSS-M are categorized by the decedent's county of residence using CDC's National Center for Health Statistics 2013 Urban-Rural Classification Scheme, collapsed into two categories (metropolitan and nonmetropolitan). RESULTS: Although both metropolitan and nonmetropolitan areas experienced significant increases from 2003-2005 to 2012-2014 in self-reported past-month use of illicit drugs, prevalence was highest in large metropolitan areas compared with small metropolitan or nonmetropolitan areas throughout the study period. Notably, past-month use of illicit drugs declined over the study period for the youngest respondents (aged 12-17 years). The prevalence of past-year illicit drug use disorders among persons using illicit drugs in the past year varied by metropolitan/nonmetropolitan status and changed over time. Across both metropolitan and nonmetropolitan areas, the prevalence of past-year illicit drug use disorders declined during 2003-2014. In 2015, approximately six times as many drug overdose deaths occurred in metropolitan areas as in nonmetropolitan areas (metropolitan: 45,059; nonmetropolitan: 7,345). Drug overdose death rates (per 100,000 population) were higher in metropolitan areas than in nonmetropolitan areas in 1999 (6.4 versus 4.0); however, the rates converged in 2004, and by 2015, the nonmetropolitan rate (17.0) was slightly higher than the metropolitan rate (16.2). INTERPRETATION: Drug use and subsequent overdoses continue to be a critical and complicated public health challenge across metropolitan and nonmetropolitan areas. The decline in illicit drug use by youth and the lower prevalence of illicit drug use disorders in rural areas during 2012-2014 are encouraging signs. 
However, the increasing rate of drug overdose deaths in rural areas, which has surpassed the rate in urban areas, is cause for concern. PUBLIC HEALTH ACTIONS: Understanding the differences between metropolitan and nonmetropolitan areas in drug use, drug use disorders, and drug overdose deaths can help public health professionals identify, monitor, and prioritize responses. Consideration of where persons live and where they die from overdose could enhance specific overdose prevention interventions, such as training on naloxone administration or rescue breathing. Educating prescribers on CDC's guideline for prescribing opioids for chronic pain (Dowell D, Haegerich TM, Chou R. CDC guideline for prescribing opioids for chronic pain - United States, 2016. MMWR Recomm Rep 2016;65[No. RR-1]) and facilitating better access to medication-assisted treatment with methadone, buprenorphine, or naltrexone could benefit communities with high rates of opioid use disorder. |
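The metropolitan and nonmetropolitan rates quoted above are crude rates per 100,000 population: deaths divided by population, scaled by 100,000. The sketch below illustrates the arithmetic; the 278 million population figure is a hypothetical round number for the example, not a value from the report.

```python
# Back-of-the-envelope sketch of a crude death rate per 100,000 population.
def rate_per_100k(deaths, population):
    return deaths / population * 100_000

# e.g., the report's 45,059 metropolitan overdose deaths against a
# hypothetical population of 278 million gives roughly 16.2 per 100,000.
metro_rate = rate_per_100k(45_059, 278_000_000)
```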
Multilocus genotyping of Giardia duodenalis in Tibetan sheep and yaks in Qinghai, China
Jin Y , Fei J , Cai J , Wang X , Li N , Guo Y , Feng Y , Xiao L . Vet Parasitol 2017 247 70-76 Giardia duodenalis is a common gastrointestinal protozoon in mammals. Although many studies have been reported on the distribution of G. duodenalis genotypes in sheep and cattle raised under intensive farming, few studies are available on the distribution of G. duodenalis in Tibetan sheep and yaks, which are raised free ranging in a continental plateau climate. In this study, 495 fecal specimens from Tibetan sheep and 605 from yaks were collected from eight counties in Qinghai, China and analyzed for G. duodenalis by PCR targeting the β-giardin (bg), glutamate dehydrogenase (gdh), and triosephosphate isomerase (tpi) genes. Based on PCR positivity at the bg locus, G. duodenalis occurrence rates were 13.1% (65/495) in Tibetan sheep and 10.4% (63/605) in yaks. DNA sequence analysis identified the presence of G. duodenalis Assemblages A (in 10 Tibetan sheep and 2 yaks) and E (in 51 Tibetan sheep and 60 yaks). In addition, mixed infections of the two were identified in four Tibetan sheep and one yak. Among the sequences obtained in this study, 1, 10, and 2 new subtypes of Assemblage E were detected at the bg, gdh and tpi loci, respectively. Based on sequences from the three loci, 28 multilocus genotypes (MLGs) were obtained, including 27 MLGs in Assemblage E and one MLG in Assemblage A. Each MLG was found in no more than seven animals, with most MLGs forming host-specific clusters in phylogenetic analysis except for one cluster including MLGs from both Tibetan sheep and yaks. Only two MLGs were found in both sheep and yaks. The above results demonstrate a high subtype diversity of G. duodenalis Assemblage E in Tibetan sheep and yaks raised in a traditional animal husbandry system and suggest that only limited cross-species transmission of G. duodenalis occurs between yaks and sheep sharing pastures. |
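The multilocus genotypes (MLGs) above arise from combining the subtype calls at the three sequenced loci; distinct combinations count as distinct MLGs. The sketch below is an illustration of that bookkeeping with hypothetical subtype labels, not the study's pipeline.

```python
# Illustrative sketch: an MLG is the tuple of subtype calls at the bg, gdh,
# and tpi loci; counting distinct tuples gives the number of MLGs observed.
from collections import Counter

def multilocus_genotypes(specimens):
    """specimens: iterable of dicts with subtype calls at 'bg', 'gdh', 'tpi'."""
    return Counter((s["bg"], s["gdh"], s["tpi"]) for s in specimens)

# Hypothetical subtype labels for three specimens (two share one MLG):
specimens = [
    {"bg": "E1", "gdh": "E3", "tpi": "E2"},
    {"bg": "E1", "gdh": "E3", "tpi": "E2"},
    {"bg": "A1", "gdh": "A1", "tpi": "A1"},
]
mlgs = multilocus_genotypes(specimens)  # 2 distinct MLGs
```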
Chikungunya fever outbreak identified in North Bali, Indonesia.
Sari K , Myint KSA , Andayani AR , Adi PD , Dhenni R , Perkasa A , Ma'roef CN , Witari NPD , Megawati D , Powers AM , Jaya UA . Trans R Soc Trop Med Hyg 2017 111 (7) 1-3 Background: Chikungunya virus (CHIKV) infections have been reported sporadically within the last 5 years in several areas of Indonesia, including Bali. Most of the reports, however, have lacked laboratory confirmation. Methods: A recent fever outbreak in a village in the North Bali area was investigated using extensive viral diagnostic testing, including both molecular and serological approaches. Results and conclusions: Ten of 15 acute febrile illness samples were confirmed as CHIKV infections by real-time PCR or CHIKV-specific IgM enzyme-linked immunosorbent assay (ELISA). The outbreak strain belonged to the Asian genotype, with the highest homology to other CHIKV strains currently circulating in Indonesia. The results are of public health concern, particularly because Bali is a popular tourist destination in Indonesia and thus the potential to spread the virus to non-endemic areas is high. GenBank accession numbers: KY885022, KY885023, KY885024, KY885025, KY885026, KY885027. |
Influenza A(H3N2) Virus in Swine at Agricultural Fairs and Transmission to Humans, Michigan and Ohio, USA, 2016.
Bowman AS , Walia RR , Nolting JM , Vincent AL , Killian ML , Zentkovich MM , Lorbach JN , Lauterbach SE , Anderson TK , Davis CT , Zanders N , Jones J , Jang Y , Lynch B , Rodriguez MR , Blanton L , Lindstrom SE , Wentworth DE , Schiltz J , Averill JJ , Forshey T . Emerg Infect Dis 2017 23 (9) 1551-1555 In 2016, a total of 18 human infections with influenza A(H3N2) virus occurred after exposure to influenza-infected swine at 7 agricultural fairs. Sixteen of these cases were the result of infection by a reassorted virus with increasing prevalence among US swine containing a hemagglutinin gene from 2010-11 human seasonal H3N2 strains. |
Role of food insecurity in outbreak of anthrax infections among humans and hippopotamuses living in a game reserve area, Rural Zambia
Lehman MW , Craig AS , Malama C , Kapina-Kany'anga M , Malenga P , Munsaka F , Muwowo S , Shadomy S , Marx MA . Emerg Infect Dis 2017 23 (9) 1471-1477 In September 2011, a total of 511 human cases of anthrax (Bacillus anthracis) infection and 5 deaths were reported in a game management area in the district of Chama, Zambia, near where 85 hippopotamuses (Hippopotamus amphibious) had recently died of suspected anthrax. The human infections generally responded to antibiotics. To clarify transmission, we conducted a cross-sectional, interviewer-administered household survey in villages where human anthrax cases and hippopotamus deaths were reported. Among 284 respondents, 84% ate hippopotamus meat before the outbreak. Eating, carrying, and preparing meat were associated with anthrax infection. Despite the risk, 23% of respondents reported they would again eat meat from hippopotamuses found dead, citing food shortage (73%), lack of meat (12%), hunger (7%), and protein shortage (5%). Chronic food insecurity can lead to consumption of unsafe foods, leaving communities susceptible to zoonotic infection. Interagency cooperation is necessary to prevent outbreaks by addressing the root causes of exposure, such as food insecurity. |
Update: Interim guidance for the diagnosis, evaluation, and management of infants with possible congenital Zika virus infection - United States, October 2017
Adebanjo T , Godfred-Cato S , Viens L , Fischer M , Staples JE , Kuhnert-Tallman W , Walke H , Oduyebo T , Polen K , Peacock G , Meaney-Delman D , Honein MA , Rasmussen SA , Moore CA . MMWR Morb Mortal Wkly Rep 2017 66 (41) 1089-1099 CDC has updated its interim guidance for U.S. health care providers caring for infants with possible congenital Zika virus infection (1) in response to recently published updated guidance for health care providers caring for pregnant women with possible Zika virus exposure (2), unknown sensitivity and specificity of currently available diagnostic tests for congenital Zika virus infection, and recognition of additional clinical findings associated with congenital Zika virus infection. All infants born to mothers with possible Zika virus exposure* during pregnancy should receive a standard evaluation at birth and at each subsequent well-child visit, including a comprehensive physical examination, age-appropriate vision screening, developmental monitoring and screening using validated tools (3-5), and a newborn hearing screen at birth, preferably using auditory brainstem response (ABR) methodology (6). Specific guidance for laboratory testing and clinical evaluation is provided for three clinical scenarios in the setting of possible maternal Zika virus exposure: 1) infants with clinical findings consistent with congenital Zika syndrome regardless of maternal testing results, 2) infants without clinical findings consistent with congenital Zika syndrome who were born to mothers with laboratory evidence of possible Zika virus infection,† and 3) infants without clinical findings consistent with congenital Zika syndrome who were born to mothers without laboratory evidence of possible Zika virus infection. Infants in the first two scenarios should receive further testing and evaluation for Zika virus, whereas for the third group, further testing and clinical evaluation for Zika virus are not recommended.
Health care providers should remain alert for abnormal findings (e.g., postnatal-onset microcephaly and eye abnormalities without microcephaly) in infants with possible congenital Zika virus exposure without apparent abnormalities at birth. |
Modeling the environmental suitability of anthrax in Ghana and estimating populations at risk: Implications for vaccination and control
Kracalik IT , Kenu E , Ayamdooh EN , Allegye-Cudjoe E , Polkuu PN , Frimpong JA , Nyarko KM , Bower WA , Traxler R , Blackburn JK . PLoS Negl Trop Dis 2017 11 (10) e0005885 Anthrax is hyper-endemic in West Africa. Despite the effectiveness of livestock vaccines in controlling anthrax, underreporting, logistics, and limited resources make implementing vaccination campaigns difficult. To better understand the geographic limits of anthrax, elucidate environmental factors related to its occurrence, and identify human and livestock populations at risk, we developed predictive models of the environmental suitability of anthrax in Ghana. We obtained data on the location and date of livestock anthrax from veterinary and outbreak response records in Ghana during 2005-2016, as well as livestock vaccination registers and population estimates of characteristically high-risk groups. To predict the environmental suitability of anthrax, we used an ensemble of random forest (RF) models built using a combination of climatic and environmental factors. From 2005 through the first six months of 2016, there were 67 anthrax outbreaks (851 cases) in livestock; outbreaks showed a seasonal peak during February through April and primarily involved cattle. There was a median of 19,709 vaccine doses (range: 0-175,000) administered annually. Results from the RF model suggest a marked ecological divide separating the broad areas of environmental suitability in northern Ghana from the southern part of the country. Increasingly alkaline soil pH was associated with a higher probability of anthrax occurrence. We estimated 2.2 (95% CI: 2.0, 2.5) million livestock and 805 (95% CI: 519, 890) thousand low-income rural livestock keepers were located in anthrax risk areas. Based on our estimates, the current anthrax vaccination efforts in Ghana cover only a fraction of the livestock potentially at risk; thus, control efforts should be focused on improving vaccine coverage among high-risk groups. |
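The random-forest suitability modeling described in the abstract above can be illustrated with a minimal sketch. This is not the authors' pipeline: the study used an ensemble of RF models over real climatic and environmental raster layers, while the covariate names here (soil pH, rainfall, temperature) and all data are synthetic placeholders. The synthetic outbreak signal is tied to alkaline soil pH, mirroring the association reported in the abstract.

```python
# Illustrative sketch only: a single random-forest classifier scoring the
# environmental suitability of synthetic "grid cells" for anthrax occurrence.
# All covariates and labels are simulated; they stand in for the real
# climatic/environmental layers used in the study.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 500  # number of synthetic grid cells

# Hypothetical environmental covariates for each cell
soil_ph = rng.uniform(4.5, 9.0, n)        # soil pH
rainfall = rng.uniform(600.0, 1400.0, n)  # mm/year
temperature = rng.uniform(22.0, 32.0, n)  # degrees C
X = np.column_stack([soil_ph, rainfall, temperature])

# Simulated outbreak presence: more likely where soil is alkaline,
# echoing the soil-pH association the abstract reports
p = 1.0 / (1.0 + np.exp(-(soil_ph - 7.0) * 2.0))
y = rng.random(n) < p

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Per-cell suitability score in [0, 1]: the predicted probability of presence
suitability = clf.predict_proba(X)[:, 1]
print(round(float(suitability.max()), 2))
```

In the study's setting, scores like `suitability` would be mapped back onto grid cells to delineate risk areas and overlay livestock and human population estimates; averaging scores across an ensemble of such models reduces the variance of any single fit.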
Patterns of human plague in Uganda, 2008-2016
Forrester JD , Apangu T , Griffith K , Acayo S , Yockey B , Kaggwa J , Kugeler KJ , Schriefer M , Sexton C , Beard CB , Candini G , Abaru J , Candia B , Okoth JF , Apio H , Nolex L , Ezama G , Okello R , Atiku L , Mpanga J , Mead PS . Emerg Infect Dis 2017 23 (9) 1517-1521 Plague is a highly virulent fleaborne zoonosis that occurs throughout many parts of the world; most suspected human cases are reported from resource-poor settings in sub-Saharan Africa. During 2008-2016, a combination of active surveillance and laboratory testing in the plague-endemic West Nile region of Uganda yielded 255 suspected human plague cases; approximately one third were laboratory confirmed by bacterial culture or serology. Although the mortality rate was 7% among suspected cases, it was 26% among persons with laboratory-confirmed plague. Reports of an unusual number of dead rats in a patient's village around the time of illness onset were significantly associated with laboratory confirmation of plague. This descriptive summary of human plague in Uganda highlights the episodic nature of the disease, as well as the potential that, even in endemic areas, illnesses of other etiologies might be mistaken for plague. |
Epidemiology of Salmonella enterica serotype Dublin infections among humans, United States, 1968-2013
Harvey RR , Friedman CR , Crim SM , Judd M , Barrett KA , Tolar B , Folster JP , Griffin PM , Brown AC . Emerg Infect Dis 2017 23 (9) 1493-1501 Salmonella enterica serotype Dublin is a cattle-adapted bacterium that typically causes bloodstream infections in humans. To summarize demographic, clinical, and antimicrobial drug resistance characteristics of human infections with this organism in the United States, we analyzed data for 1968-2013 from 5 US surveillance systems. During this period, the incidence rate for infection with Salmonella Dublin increased more than that for infection with other Salmonella. Data from 1 system (FoodNet) showed that a higher percentage of persons with Salmonella Dublin infection were hospitalized and died during 2005-2013 (78% hospitalized, 4.2% died) than during 1996-2004 (68% hospitalized, 2.7% died). Susceptibility data showed that a higher percentage of isolates were resistant to >7 classes of antimicrobial drugs during 2005-2013 (50.8%) than during 1996-2004 (2.4%). |
Acute Zika virus infection as a risk factor for Guillain-Barre syndrome in Puerto Rico
Dirlikov E , Medina NA , Major CG , Munoz-Jordan JL , Luciano CA , Rivera-Garcia B , Sharp TM . JAMA 2017 318 (15) 1498-1500 This case-control study conducted during the Zika virus epidemic in Puerto Rico estimates the association between preceding Zika virus infection and subsequent Guillain-Barre syndrome. |
Clusters of human infections with avian influenza A(H7N9) virus in China, March 2013 to June 2015
Liu B , Havers FP , Zhou L , Zhong H , Wang X , Mao S , Li H , Ren R , Xiang N , Shu Y , Zhou S , Liu F , Chen E , Zhang Y , Widdowson MA , Li Q , Feng Z . J Infect Dis 2017 216 S548-s554 Multiple clusters of human infections with novel avian influenza A(H7N9) virus have occurred since the virus was first identified in spring 2013. However, in many situations it is unclear whether these clusters result from person-to-person transmission or exposure to a common infectious source. We analyzed the possibility of person-to-person transmission in each cluster and developed a framework to assess the likelihood that person-to-person transmission had occurred. We described 21 clusters with 22 infected contact cases that were identified by the Chinese Center for Disease Control and Prevention from March 2013 through June 2015. Based on detailed epidemiological information and the timing of the contact case patients' exposures to infected persons and to poultry during their potential incubation period, we graded the likelihood of person-to-person transmission as probable, possible, or unlikely. We found that person-to-person transmission probably occurred 12 times and possibly occurred 4 times; it was unlikely in 6 clusters. Probable nosocomial transmission is likely to have occurred in 2 clusters. Limited person-to-person transmission is likely to have occurred on multiple occasions since the H7N9 virus was first identified. However, these transmission events represented a small fraction of all identified cases of H7N9 human infection, and sustained person-to-person transmission was not documented. |
Convergence of humans, bats, trees, and culture in Nipah virus transmission, Bangladesh
Gurley ES , Hegde ST , Hossain K , Sazzad HMS , Hossain MJ , Rahman M , Sharker MAY , Salje H , Islam MS , Epstein JH , Khan SU , Kilpatrick AM , Daszak P , Luby SP . Emerg Infect Dis 2017 23 (9) 1446-1453 Preventing emergence of new zoonotic viruses depends on understanding determinants for human risk. Nipah virus (NiV) is a lethal zoonotic pathogen that has spilled over from bats into human populations, with limited person-to-person transmission. We examined ecologic and human behavioral drivers of geographic variation for risk of NiV infection in Bangladesh. We visited 60 villages during 2011-2013 where cases of infection with NiV were identified and 147 control villages. We compared case villages with control villages for most likely drivers for risk of infection, including number of bats, persons, and date palm sap trees, and human date palm sap consumption behavior. Case villages were similar to control villages in many ways, including number of bats, persons, and date palm sap trees, but had a higher proportion of households in which someone drank sap. Reducing human consumption of sap could reduce virus transmission and risk for emergence of a more highly transmissible NiV strain. |
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.