The gut microbiota in conventional and serrated precursors of colorectal cancer.
Peters BA , Dominianni C , Shapiro JA , Church TR , Wu J , Miller G , Yuen E , Freiman H , Lustbader I , Salik J , Friedlander C , Hayes RB , Ahn J . Microbiome 2016 4 (1) 69 BACKGROUND: Colorectal cancer is a heterogeneous disease arising from at least two precursors: the conventional adenoma (CA) and the serrated polyp. We and others have previously shown a relationship between the human gut microbiota and colorectal cancer; however, its relationship to the different early precursors of colorectal cancer is understudied. We tested, for the first time, the relationship of the gut microbiota to specific colorectal polyp types. RESULTS: Gut microbiota were assessed in 540 colonoscopy-screened adults by 16S rRNA gene sequencing of stool samples. Participants were categorized as CA cases (n = 144), serrated polyp cases (n = 73), or polyp-free controls (n = 323). CA cases were further classified as proximal (n = 87) or distal (n = 55) and as non-advanced (n = 121) or advanced (n = 22). Serrated polyp cases were further classified as hyperplastic polyp (HP; n = 40) or sessile serrated adenoma (SSA; n = 33). We compared gut microbiota diversity, overall composition, and normalized taxon abundance among these groups. CA cases had lower species richness in stool than controls (p = 0.03); this association was strongest for advanced CA cases (p = 0.004). In relation to overall microbiota composition, only distal and advanced CA cases differed significantly from controls (p = 0.02 and p = 0.002, respectively). In taxon-based analysis, stool of CA cases was depleted in a network of Clostridia operational taxonomic units from the families Ruminococcaceae, Clostridiaceae, and Lachnospiraceae, and enriched in the classes Bacilli and Gammaproteobacteria, the order Enterobacteriales, and the genera Actinomyces and Streptococcus (all q < 0.10). SSA and HP cases did not differ in diversity or composition from controls, though the sample sizes for these groups were small. 
Few taxa were differentially abundant between HP cases or SSA cases and controls; among them, class Erysipelotrichi was depleted in SSA cases. CONCLUSIONS: Our results indicate that gut microbes may play a role in the early stages of colorectal carcinogenesis through the development of CAs. Findings may have implications for developing colorectal cancer prevention therapies targeting early microbial drivers of colorectal carcinogenesis. |
Use of medications for treating anxiety and depression in cancer survivors in the United States
Hawkins NA , Soman A , Buchanan Lunsford N , Leadbetter S , Rodriguez JL . J Clin Oncol 2017 35 (1) 78-85 Purpose This study used population-based data to estimate the percentage of cancer survivors in the United States reporting current medication use for anxiety and depression and to characterize the survivors taking this type of medication. Rates of medication use in cancer survivors were compared with rates in the general population. Methods We analyzed data from the National Health Interview Survey, years 2010 to 2013, identifying cancer survivors (n = 3,184) and adults with no history of cancer (n = 44,997) who completed both the Sample Adult Core Questionnaire and the Adult Functioning and Disability Supplement. Results Compared with adults with no history of cancer, cancer survivors were significantly more likely to report taking medication for anxiety (16.8% v 8.6%, P < .001), depression (14.1% v 7.8%, P < .001), and one or both of these conditions combined (19.1% v 10.4%, P < .001), indicating that an estimated 2.5 million cancer survivors were taking medication for anxiety or depression in the United States at that time. Survivor characteristics associated with higher rates of medication use for anxiety included being younger than 65 years old, female, and non-Hispanic white, and having public insurance, a usual source of medical care, and multiple chronic health conditions. Survivor characteristics associated with medication use for depression were largely consistent with those for anxiety, with the exceptions that insurance status was not significant, whereas being widowed/divorced/separated was associated with more use. Conclusion Cancer survivors in the United States reported medication use for anxiety and depression at rates nearly two times those reported by the general public, likely a reflection of greater emotional and physical burdens from cancer or its treatment. |
High school start times and the impact on high school students: What we know, and what we hope to learn
Morgenthaler TI , Hashmi S , Croft JB , Dort L , Heald JL , Mullington J . J Clin Sleep Med 2016 12 (12) 1681-1689 STUDY OBJECTIVES: Several organizations have recommended that high school start times be no earlier than 08:30. However, although there are plausible biological reasons to support such recommendations, published recommendations have been based largely on expert opinion and a few observational studies. We sought to critically review the published evidence regarding the effect of high school start times on sleep and other relevant outcomes. METHODS: We performed a broad literature search that identified 287 candidate publications for inclusion in our review, which focused on studies offering direct comparison of sleep time, academic or physical performance, behavioral health measures, or motor vehicle accidents in high school students. Where possible, outcomes were combined for meta-analysis. RESULTS: After application of study criteria, only 18 studies were suitable for review. Eight studies were amenable to meta-analysis for some outcomes. We found that later school start times, particularly when compared with start times more than 60 min earlier, are associated with longer weekday sleep durations, smaller weekday-weekend sleep duration differences, reduced vehicular accident rates, and reduced subjective daytime sleepiness. Improvement in academic performance and behavioral issues is less well established. CONCLUSIONS: The literature on the effects of school start time delays on important aspects of high school life suggests some salutary effects, but the evidence is often indirect, imprecise, or derived from cohorts of convenience, making the overall quality of evidence weak or very weak. This review highlights a need for higher-quality data upon which to base important and complex public health decisions. |
Norovirus and Sapovirus Epidemiology and Strain Characteristics among Navajo and Apache Infants.
Grant LR , O'Brien KL , Weatherholtz RC , Reid R , Goklish N , Santosham M , Parashar U , Vinje J . PLoS One 2017 12 (1) e0169491 Norovirus and sapovirus are important causes of acute gastroenteritis (AGE) among American Indian infants. We investigated the prevalence and molecular epidemiology of norovirus and sapovirus in American Indian infants, who have historically experienced a high burden of AGE compared with other US populations. Stool samples were collected from 241 infants with AGE (cases) and 343 infants without AGE (controls), all ≤9 months of age, from 2002 to 2004. Cases experienced forceful vomiting and/or 3 or more watery or looser-than-normal stools in 24 hours. Stools were tested by real-time RT-PCR for norovirus GI, GII, and GIV and sapovirus GI, GII, GIV, and GV. Positive samples were genotyped after sequencing conventional RT-PCR products. Norovirus was identified in 76 (31.5%) of the cases and 70 (20.4%) of the controls (p<0.001). GII.3 and GII.4 Farmington Hills were the most frequently identified genotypes, found in 14.5% and 30.3% of cases and 17.1% and 27.1% of controls, respectively. Sapovirus GI and GII genotypes were identified in 8 (3.3%) of cases and 8 (2.3%) of controls, and a single GIV virus was detected in a control. The same norovirus and sapovirus genotypes were circulating in the general U.S. population during the same period. The high detection rate of norovirus in healthy controls suggests significant asymptomatic transmission among young infants in these communities. |
Life expectancy after initiation of combination antiretroviral therapy in Thailand
Teeraananchai S , Chaivooth S , Kerr SJ , Bhakeecheep S , Avihingsanon A , Teeraratkul A , Sirinirund P , Law MG , Ruxrungtham K . Antivir Ther 2017 22 (5) 393-402 BACKGROUND: Access to combination antiretroviral therapy (cART) has decreased mortality in HIV-positive people. We aimed to estimate the expected additional years of life in HIV-positive Thai people after starting cART through the National AIDS Program (NAP), administered by the Thai National Health Security Office (NHSO). METHODS: The NHSO database collects characteristics of all Thai HIV-infected patients in the National AIDS Program, including linkage with the National Death Registry for vital status. This study included patients aged ≥15 years at cART initiation between 2008 and 2014. The abridged life table method was used to construct life tables stratified by sex and baseline CD4 cell count. Life expectancy was defined as the additional years of life from age at starting cART. RESULTS: 201,688 eligible patients were included in the analyses, contributing 618,837 person-years of follow-up. Median CD4 count was 109 cells/mm3 and median age was 37 years. Overall life expectancy after cART initiation was 25.4 (95% CI 25.3-25.6) years at age 20 and 20.6 (95% CI 20.5-20.7) years at age 35. For patients with a baseline CD4 cell count ≥350 cells/mm3, life expectancy was 51.9 (95% CI 51.0-52.9) years at age 20 and 43.2 (95% CI 42.4-44.1) years at age 35, close to life expectancy in the general Thai population. CONCLUSIONS: Increasing life expectancy with higher baseline CD4 cell counts supports the guideline recommendation to start cART irrespective of CD4 cell count. These results are useful for forecasting treatment costs and developing health policies for people living with HIV in Thailand and Asia. |
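The abridged life table method named above can be sketched in a few lines. This is a minimal illustration, not the study's implementation: the age intervals and mortality rates below are hypothetical placeholders, and the conversion from interval death rates (nMx) to interval death probabilities (nqx) assumes deaths occur, on average, mid-interval.

```python
def life_expectancy(widths, rates):
    """Expected additional years of life from the start of the first interval.

    widths: interval lengths in years (None marks the open-ended last interval)
    rates:  death rates per person-year (nMx) for each interval
    """
    survivors = 1.0        # proportion still alive entering the interval
    person_years = 0.0
    for n, m in zip(widths, rates):
        if n is None:                       # open-ended final interval: l / M
            person_years += survivors / m
            break
        q = (n * m) / (1.0 + (n / 2.0) * m)  # nqx from nMx, mid-interval deaths
        deaths = survivors * q
        # survivors live the full interval; decedents live half of it on average
        person_years += n * (survivors - deaths) + (n / 2.0) * deaths
        survivors -= deaths
    return person_years

# Hypothetical five-year intervals from age 20, with an open interval at 65+
widths = [5] * 9 + [None]
rates = [0.02, 0.025, 0.03, 0.035, 0.04, 0.05, 0.06, 0.08, 0.10, 0.15]
print(round(life_expectancy(widths, rates), 1))
```

Stratifying such tables by sex and baseline CD4 count, as the study does, amounts to running this calculation on each stratum's own rate schedule.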
Official American Thoracic Society/Infectious Diseases Society of America/Centers for Disease Control and Prevention Clinical Practice Guidelines: Diagnosis of tuberculosis in adults and children
Lewinsohn DM , Leonard MK , LoBue PA , Cohn DL , Daley CL , Desmond E , Keane J , Lewinsohn DA , Loeffler AM , Mazurek GH , O'Brien RJ , Pai M , Richeldi L , Salfinger M , Shinnick TM , Sterling TR , Warshauer DM , Woods GL . Clin Infect Dis 2017 64 (2) 111-115 BACKGROUND: Individuals infected with Mycobacterium tuberculosis (Mtb) may develop symptoms and signs of disease (tuberculosis disease) or may have no clinical evidence of disease (latent tuberculosis infection [LTBI]). Tuberculosis disease is a leading cause of infectious disease morbidity and mortality worldwide, yet many questions related to its diagnosis remain. METHODS: A task force supported by the American Thoracic Society, Centers for Disease Control and Prevention, and Infectious Diseases Society of America searched, selected, and synthesized relevant evidence. The evidence was then used as the basis for recommendations about the diagnosis of tuberculosis disease and LTBI in adults and children. The recommendations were formulated, written, and graded using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach. RESULTS: Twenty-three evidence-based recommendations about diagnostic testing for latent tuberculosis infection, pulmonary tuberculosis, and extrapulmonary tuberculosis are provided. Six of the recommendations are strong, whereas the remaining 17 are conditional. CONCLUSIONS: These guidelines are not intended to impose a standard of care. They provide the basis for rational decisions in the diagnosis of tuberculosis in the context of the existing evidence. No guidelines can take into account all of the often compelling unique individual clinical circumstances. |
Feasibility study of HIV sentinel surveillance using PMTCT data in Cameroon: From scientific success to programmatic failure
Billong SC , Dee J , Fokam J , Nguefack-Tsague G , Ekali GL , Fodjo R , Temgoua ES , Billong EJ , Sosso SM , Mosoko JJ , Monebenimp F , Ndjolo A , Bissek AZ , Bolu O , Elat JN . BMC Infect Dis 2017 17 (1) 3 BACKGROUND: In low-income countries (LICs), HIV sentinel surveillance surveys (HIV-SSS) are recommended in between two demographic and health surveys because they cost less than the latter. Using classical unlinked anonymous testing (UAT), HIV-SSS among pregnant women raised certain ethical and financial challenges. We therefore aimed to evaluate the use of routine data from prevention of mother-to-child transmission of HIV (PMTCT) programmes as an alternative approach for HIV-SSS in LICs. METHODS: A survey was conducted in 2012 among first antenatal-care attendees (ANC1) in the ten regions of Cameroon. HIV testing was performed at PMTCT clinics as per the national serial algorithm (rapid test), and PMTCT site laboratory (PMTCT-SL) performance was evaluated by comparison with results from the national reference laboratory (NRL), used as the reference standard. RESULTS: The acceptance rate for HIV testing was 99%, for a total of 6521 ANC1 (49.3% aged 15-24 years) enrolled nationwide. Among 6103 eligible ANC1, sensitivity (using NRL testing as the reference standard) was 81.2%, ranging from 58.8% (South region) to 100% (West region); thus, 18.8% of HIV-infected ANC1 declared HIV-negative at the PMTCT-SL were positive by NRL testing. Specificity was 99.3%, without significant disparity across sites. At the population level, this implies that every year in Cameroon, ~2,500 HIV-infected women are wrongly declared seronegative, while ~1,000 are wrongly declared seropositive. Only 44.4% (16/36) of the evaluated laboratories reached the quality target of 80%. CONCLUSIONS: The study identified weaknesses in routine PMTCT HIV testing. As Cameroon transitions to using routine PMTCT data for HIV-SSS among pregnant women, the quality system needs to be optimized to ensure robust routine HIV testing for programmatic and surveillance purposes. |
Addressing tuberculosis in differentiated care provision for people living with HIV
Pathmanathan I , Pevzner E , Cavanaugh J , Nelson L . Bull World Health Organ 2017 95 (1) 3 Despite advances in prevention, diagnosis and treatment of tuberculosis and human immunodeficiency virus (HIV), tuberculosis remains the leading cause of death and illness among people living with HIV. In 2015, an estimated 1.2 million of the people who developed tuberculosis disease worldwide were HIV positive, and tuberculosis was the direct cause of at least one third of HIV-related deaths.1 The 2015 “Treat All” strategy requires that everyone with HIV is offered antiretroviral therapy (ART) as soon as they are diagnosed. By treating HIV infections earlier, this strategy should mitigate the HIV-associated tuberculosis epidemic, but it alone is not sufficient to eliminate preventable tuberculosis suffering and deaths among people living with HIV.2 The 2016 World Health Organization (WHO) guidelines recommend differentiated HIV service delivery, which is intended to facilitate the “Treat All” strategy by tailoring services to the differing needs of individuals.3 As HIV programmes adopt these WHO guidelines, tuberculosis also needs to be addressed.3 |
Disparities in retention in HIV care among HIV-infected young men who have sex with men in the District of Columbia, 2013
Morales-Aleman MM , Opoku J , Murray A , Lanier Y , Kharfen M , Sutton MY . LGBT Health 2017 4 (1) 34-41 PURPOSE: Among young men who have sex with men (YMSM) aged 13-24 years, blacks/African Americans and Hispanics/Latinos are disproportionately affected by HIV, accounting for 58% and 21%, respectively, of diagnoses of HIV infection in the United States. In the District of Columbia (DC), YMSM of color are also disproportionately affected by HIV. National goals call for 80% of HIV-infected persons to be retained in HIV care. We analyzed DC surveillance data to examine retention among YMSM living with HIV infection in DC. METHODS: We characterized correlates of retention in HIV care (≥2 clinical visits, ≥3 months apart, within 12 months of diagnosis) among YMSM in DC to inform and strengthen local HIV care efforts. We analyzed data from the DC HIV surveillance system for YMSM aged 13-29 years diagnosed between 2005 and 2012 and alive in 2013. We also combined demographic and clinical variables with census-tract sociodemographic data from the American Community Survey (ACS). RESULTS: From 2005 to 2012, 1034 YMSM were diagnosed and living with HIV infection in DC; 83% were black or Latino. Of the 1034 YMSM, 910 (88%) had census tract data available and were included in analyses (72% black, 10% Latino, and 17% white); among the 854 (94%) linked to care, 376 (44%) were retained in continuous care. In multivariate analyses, retention in care was less likely among 19-24-year-old YMSM compared with 13-18-year-old YMSM (adjusted prevalence ratio [aPR] = 0.89; 95% confidence interval [CI] 0.80-0.99). CONCLUSION: Retention in HIV care was suboptimal for YMSM. Increased retention efforts are warranted to improve outcomes and reduce age and racial/ethnic disparities. |
Relationship between nutritional support and tuberculosis treatment outcomes in West Bengal, India
Samuel B , Volkmann T , Cornelius S , Mukhopadhay S , MejoJose , Mitra K , Kumar AM , Oeltmann JE , Parija S , Prabhakaran AO , Moonan PK , Chadha VK . J Tuberc Res 2016 4 (4) 213-219 INTRODUCTION: Poverty and poor nutrition are associated with the risk of developing tuberculosis (TB). Socioeconomic factors may interfere with anti-tuberculosis treatment compliance and its outcome. We examined whether providing nutritional support (a monthly supply of rice and lentil beans) to TB patients living below the poverty line was associated with TB treatment outcome. METHODS: This was a retrospective cohort study of sputum smear-positive pulmonary TB patients living below the poverty line (income of <$1.25 per day) registered for anti-tuberculosis treatment in two rural districts of West Bengal, India, during 2012-2013. We compared treatment outcomes among patients who received nutritional support with those who did not. A log-binomial regression model was used to assess the relation between nutritional support and unsuccessful treatment outcome (loss to follow-up, treatment failure, or death). RESULTS: Of 173 TB patients provided nutritional support, 15 (9%) had unsuccessful treatment outcomes, while 84 (21%) of the 400 not provided nutritional support had unsuccessful treatment outcomes (p < 0.001). After adjusting for age, sex, and previous treatment, those who received nutritional support had a roughly 50% lower risk of an unsuccessful treatment outcome than those who did not (relative risk: 0.51; 95% confidence interval: 0.30-0.86). CONCLUSION: Under programmatic conditions, monthly rations of rice and lentils were associated with a lower risk of unsuccessful treatment outcome among impoverished TB patients. Given the relatively small financial commitment needed per patient ($10 per patient per month), the national TB programme should consider scaling up nutritional support among TB patients living below the poverty line. |
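The crude relative risk underlying the comparison above (15/173 vs 84/400 unsuccessful outcomes) can be checked directly. Note that the abstract's 0.51 is adjusted for age, sex, and previous treatment via log-binomial regression, so the unadjusted figure below differs; the Katz log-scale confidence interval used here is a standard textbook method, not necessarily the authors' exact procedure.

```python
import math

def relative_risk(a, n1, b, n2):
    """Crude relative risk of an outcome: (a/n1) / (b/n2), with a 95% CI
    computed on the log scale (Katz method)."""
    r1, r2 = a / n1, b / n2
    rr = r1 / r2
    # standard error of log(RR): sqrt(1/a - 1/n1 + 1/b - 1/n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# 15/173 unsuccessful outcomes with support vs 84/400 without
rr, lo, hi = relative_risk(15, 173, 84, 400)
print(f"crude RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

That the crude estimate (about 0.41) is below the adjusted 0.51 suggests the covariates absorbed part of the apparent benefit.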
Diagnosed HIV infection in transgender adults and adolescents: Results from the National HIV Surveillance System, 2009-2014
Clark H , Babu AS , Wiewel EW , Opoku J , Crepaz N . AIDS Behav 2016 21 (9) 2774-2783 Publications on diagnosed HIV infection among transgender people have been limited to state- or local-level data. We analyzed data from the National HIV Surveillance System and present results from the first national-level analysis of transgender people with diagnosed HIV infection. From 2009 to 2014, HIV surveillance jurisdictions in 45 states plus the District of Columbia identified and reported at least one case of newly diagnosed HIV infection among transgender people; jurisdictions in 5 states reported no cases among transgender people. Of 2351 transgender people with newly diagnosed HIV infection during 2009-2014, 84.0% were transgender women (male-to-female), 15.4% were transgender men (female-to-male), and 0.7% had an additional gender identity (e.g., genderqueer, bigender). Over half of both transgender women (50.8%; 1002/1974) and transgender men (58.4%; 211/361) with newly diagnosed HIV infection were non-Hispanic black/African American. Improvements in data collection methods and quality are needed to gain a better understanding of the HIV burden among transgender people. |
Prevalence and antimicrobial resistance in Salmonella enterica isolated from broiler chickens, pigs and meat products in the Thailand-Cambodia border provinces
Trongjit S , Angkititrakul S , Tuttle RE , Poungseree J , Padungtod P , Chuanchuen R . Microbiol Immunol 2017 61 (1) 23-33 This study examined the prevalence and antimicrobial resistance (AMR) of Salmonella isolates from broilers, pigs, and their associated meat products in the Thailand-Cambodia border provinces. A total of 941 samples were collected from pigs and broilers at slaughterhouses and from carcasses at local fresh markets in Sa Kaeo, Thailand (n = 554) and Banteay Meanchey, Cambodia (n = 387) in 2014 and 2015. Three hundred forty-five Salmonella isolates were collected from Sa Kaeo (n = 145; 23%) and Banteay Meanchey (n = 200; 47%) and assayed for antimicrobial susceptibility, class 1 integrons, and extended-spectrum beta-lactamase (ESBL) genes. Serovars Typhimurium (29%) and Rissen (29%) were the most common serotypes in the Thai and Cambodian isolates, respectively. Multidrug resistance was detected in 34% and 52% of the isolates from Sa Kaeo and Banteay Meanchey, respectively. The majority of the Thai isolates were resistant to ampicillin (72.4%), while most Cambodian isolates were resistant to sulfamethoxazole (71%). Eleven isolates from Sa Kaeo and 44 isolates from Banteay Meanchey carried class 1 integrons comprising resistance gene cassettes. The most common gene cassette array was dfrA12-aadA2 (61.1%). Six isolates were ESBL producers. The beta-lactamase genes found included blaTEM-1, blaCTX-M-55 and blaCMY-2. Some of these class 1 integrons and ESBL genes were located on conjugative plasmids. In conclusion, multidrug-resistant Salmonella are common in pigs, chickens, and their meat products in the Thailand-Cambodia border provinces. Class 1 integrons play a role in the spread of AMR among the strains in this study. |
Exposure to multiple chemicals in a cohort of reproductive-aged Danish women
Rosofsky A , Janulewicz P , Thayer KA , McClean M , Wise LA , Calafat AM , Mikkelsen EM , Taylor KW , Hatch EE . Environ Res 2016 154 73-85 BACKGROUND: Current exposure assessment research does not sufficiently address multi-pollutant exposures and their correlations in human media. Understanding the extent of chemical exposure in reproductive-aged women is of particular concern due to the potential for in utero exposure and fetal susceptibility. OBJECTIVES: The objectives of this study were to characterize concentrations of chemical biomarkers during preconception and to examine correlations between and within chemical classes. METHODS: We examined concentrations of 135 biomarkers from 16 chemical classes in blood and urine from 73 women aged 18-40 enrolled in Snart Foraeldre/Milieu, a prospective cohort study of pregnancy planners in Denmark (2011-2014). We compared biomarker concentrations with those of similarly aged, non-pregnant US women who participated in the National Health and Nutrition Examination Survey (NHANES) and with other international biomonitoring studies. We performed principal component analysis to examine biomarker correlations. RESULTS: The mean number of biomarkers detected per participant was 92 (range: 60-108). The most commonly detected chemical classes were phthalates, metals, phytoestrogens, and polycyclic aromatic hydrocarbons. Except for blood mercury, urinary barium, and enterolactone, geometric means were higher in women from NHANES. Chemical classes measured in urine generally did not load on a single component, suggesting high between-class correlation among urinary biomarkers, whereas biomarkers measured in serum and blood showed high within-class correlation. CONCLUSIONS: We identified ubiquitous exposure to multiple chemical classes in reproductive-aged Danish women, supporting the need for more research on chemical mixtures during preconception and early pregnancy. Inter- and intra-class correlations between measured biomarkers may reflect common exposure sources, specific lifestyle factors, or shared metabolic pathways. |
Applied epidemiology and public health: are we training the future generations appropriately?
Brownson RC , Samet JM , Bensyl DM . Ann Epidemiol 2016 27 (2) 77-82 To extend the reach and relevance of epidemiology for public health practice, the science needs to be broadened beyond etiologic research, to link more strongly with emerging technologies, and to acknowledge key societal transformations. This new focus for epidemiology and its implications for epidemiologic training can be considered in the context of macro trends affecting society, including a greater focus on upstream causes of disease, shifting demographics, the Affordable Care Act and health care system reform, globalization, the changing health communication environment, the growing centrality of team and transdisciplinary science, the emergence of translational sciences, a greater focus on accountability, big data, informatics, high-throughput technologies ("omics"), privacy changes, and the evolving funding environment. This commentary describes existing approaches to and competencies for training in epidemiology, maps macro trends to competencies, highlights an example of competency-based education in the Epidemic Intelligence Service of the Centers for Disease Control and Prevention, and suggests expanded and more dynamic training approaches. A reexamination of current approaches to epidemiologic training is needed. |
Global health security: Building capacities for early event detection, epidemiologic workforce, and laboratory response
Balajee SA , Arthur R , Mounts AW . Health Secur 2016 14 (6) 424-432 The Global Health Security Agenda (GHSA) was launched in February 2014 to bring countries with limited capacity into compliance with the International Health Regulations (IHR) (2005). Recent international public health events, such as the appearance of Middle East respiratory syndrome coronavirus and the reappearance of Ebola in West Africa, have highlighted the importance of early detection of disease events and the interconnectedness of countries. Surveillance systems that allow early detection and recognition of signal events, a public health infrastructure that allows rapid notification and information sharing within countries and across borders, a trained epidemiologic workforce, and a laboratory network that can respond appropriately and rapidly are emerging as critical components of an early warning and response system. This article focuses on 3 aspects of the GHSA that will lead to improved capacities for the detection of and response to outbreaks as required by the IHR: (1) early detection and reporting of events, (2) laboratory capacity, and (3) a trained epidemiologic workforce. |
Human capital on the move: Education as a determinant of internal migration in selected INDEPTH surveillance populations in Africa
Ginsburg C , Beguy D , Augusto O , Odhiambo F , Soura A , White MJ , Bocquier P , Afolabi S , Derra K , Otiende M , Zabre P , Collinson MA . Demogr Res 2016 34 (1) 845-884 BACKGROUND Education, as a key indicator of human capital, is considered one of the major determinants of internal migration, with previous studies suggesting that human capital accumulates in urban areas at the expense of rural areas. However, there is only fragmentary evidence concerning the educational correlates of internal migration in sub-Saharan Africa. OBJECTIVES The study asks whether more precise measures of migration in Health and Demographic Surveillance System (HDSS) populations support the hypothesis that migrants are self-selected on human capital, with more educated people more likely to leave rural areas or enter urban areas within a geographical region. METHODS Using unique longitudinal data representing approximately 900,000 people living in eight sub-Saharan African HDSS sites that are members of the INDEPTH Network, the paper uses event history analysis techniques to examine the relationship between formal educational attainment and in- and out-migration over the period 2009 to 2011. RESULTS Between 7% and 27% of these local populations moved into or out of the HDSS areas over this period. Education is positively associated with both in- and out-migration in the Kenyan HDSS areas; however, the education effect shows no clear pattern in the HDSS sites in Burkina Faso, Mozambique, and South Africa. CONCLUSIONS The empirical results presented in this paper confirm a strong age profile of migration consistent with human capital expectations, yet they point to variability in the association between education and the propensity to migrate. In particular, the hypothesis of a shift of human capital from rural to urban areas is not universally valid. |
Notes from the field: Botulism outbreak from drinking prison-made illicit alcohol in a federal correctional facility - Mississippi, June 2016
McCrickard L , Marlow M , Self JL , Watkins LF , Chatham-Stephens K , Anderson J , Hand S , Taylor K , Hanson J , Patrick K , Luquez C , Dykes J , Kalb SR , Hoyt K , Barr JR , Crawford T , Chambers A , Douthit B , Cox R , Craig M , Spurzem J , Doherty J , Allswede M , Byers P , Dobbs T . MMWR Morb Mortal Wkly Rep 2017 65 (52) 1491-1492 On June 9, 2016, the Mississippi Poison Control Center and the Mississippi State Department of Health (MSDH) notified CDC of five suspected cases of botulism, a potentially fatal neuroparalytic illness (1), in inmates at a medium-security federal correctional institution (prison A). By June 10, a total of 13 inmates were hospitalized, including 12 in Mississippi and one in Oklahoma (the inmate in Oklahoma had been transferred there after his exposure for reasons unrelated to his illness). MSDH, Oklahoma State Department of Health, Bureau of Prisons, and CDC conducted an investigation to identify the source and scope of the outbreak, and to develop recommendations. | Prison A staff members suspected that an alcoholic beverage, illicitly made by inmates and known as “hooch” or “pruno,” was the source of the outbreak. Among 33 inmates who reported consuming hooch during June 1–19, 2016, a total of 31 (94%) had signs or symptoms suggesting botulism. The median interval from first exposure to symptom onset was 3 days (range = 0–11 days) (Figure). Cases were categorized using modified Council of State and Territorial Epidemiologists definitions. 
A confirmed case was defined as an illness in an inmate consistent with botulism that began on or after June 1, with botulinum toxin type A detected in a serum or stool specimen or Clostridium botulinum cultured from a stool specimen; a probable case was defined as an illness in an inmate with signs or symptoms of any cranial nerve palsy and extremity weakness that began on or after June 1; and a suspected case was an illness in an inmate with signs or symptoms of any cranial nerve palsy without extremity weakness that began on or after June 1. |
Quantifying the risk of human Toxoplasma gondii infection due to consumption of fresh pork in the United States
Guo M , Lambertini E , Buchanan RL , Dubey JP , Hill DE , Gamble HR , Jones JL , Pradhan AK . Food Control 2017 73 1210-1222 Toxoplasma gondii is one of the leading foodborne pathogens in the United States. The Centers for Disease Control and Prevention (CDC) reported that T. gondii accounts for 24% of deaths due to foodborne illness in the United States. Consumption of undercooked pork products in which T. gondii has encysted has been identified as an important route of human exposure. However, little quantitative evaluation of risk due to different pork products as a function of microbial quality at the abattoir, during the production process, and due to consumer handling practices is available to inform risk management actions. The goal of this study was to develop a farm-to-table quantitative microbial risk assessment (QMRA) model to predict the public health risk associated with consumption of fresh pork in the United States. T. gondii prevalence in pigs was derived through a meta-analysis of existing data, and the concentration of the infectious life stage (bradyzoites) was calculated for each pork cut from an infected pig. Logistic regression and log-linear regression models were developed to predict the reduction of T. gondii during further processing and consumer preparation, respectively. A mouse-derived exponential dose-response model was used to predict infection risk in humans. The estimated mean probability of infection per serving of fresh pork products ranges from 3.2 × 10−7 to 9.5 × 10−6, corresponding to a predicted approximately 94,600 new infections annually in the U.S. population due to fresh pork ingestion. Approximately 957 new infections per year were estimated to occur in pregnant women, corresponding to 277 cases of congenital toxoplasmosis per year due to fresh pork ingestion. In the context of available data, sensitivity analysis suggested that cooking is the most important parameter impacting human health risk. 
This study provides a scientific basis for risk management and also could serve as a baseline model to quantify infection risk from T. gondii and other parasites associated with meat products. |
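The exponential dose-response step described in the abstract above can be sketched as follows; the per-bradyzoite infectivity parameter `r` and the per-serving dose below are hypothetical placeholders for illustration, not values from the paper.

```python
import math

def exponential_dose_response(dose, r):
    """Exponential dose-response model: P(infection) = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

# With a tiny mean ingested dose, the risk is approximately r * dose (the
# linear regime of the model), which is how per-serving probabilities on the
# order of 1e-7 to 1e-6 can arise. Both inputs here are illustrative only.
risk = exponential_dose_response(dose=10, r=1e-7)
```

Scaling a per-serving risk of this magnitude by the number of servings consumed nationally per year is what yields population-level estimates such as the ~94,600 annual infections reported.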
Estimated incidence of antimicrobial drug-resistant nontyphoidal Salmonella infections, United States, 2004-2012
Medalla F , Gu W , Mahon BE , Judd M , Folster J , Griffin PM , Hoekstra RM . Emerg Infect Dis 2016 23 (1) 29-37 Salmonella infections are a major cause of illness in the United States. The antimicrobial agents used to treat severe infections include ceftriaxone, ciprofloxacin, and ampicillin. Antimicrobial drug resistance has been associated with adverse clinical outcomes. To estimate the incidence of resistant culture-confirmed nontyphoidal Salmonella infections, we used Bayesian hierarchical models of 2004-2012 data from the Centers for Disease Control and Prevention National Antimicrobial Resistance Monitoring System and Laboratory-based Enteric Disease Surveillance. We based 3 mutually exclusive resistance categories on susceptibility testing: ceftriaxone and ampicillin resistant, ciprofloxacin nonsusceptible but ceftriaxone susceptible, and ampicillin resistant but ceftriaxone and ciprofloxacin susceptible. We estimated the overall incidence of resistant infections as 1.07/100,000 person-years for ampicillin-only resistance, 0.51/100,000 person-years for ceftriaxone and ampicillin resistance, and 0.35/100,000 person-years for ciprofloxacin nonsusceptibility, or approximately 6,200 resistant culture-confirmed infections annually. These national estimates help define the magnitude of the resistance problem so that control measures can be appropriately targeted. |
Use of whole-genome sequencing data to analyze 23S rRNA-mediated azithromycin resistance.
Johnson SR , Grad Y , Abrams AJ , Pettus K , Trees DL . Int J Antimicrob Agents 2016 49 (2) 252-254 The whole-genome sequences of 24 isolates of Neisseria gonorrhoeae with elevated minimum inhibitory concentrations (MICs) to azithromycin (≥2.0 microg/mL) were analyzed against a modified sequence derived from the whole-genome sequence of N. gonorrhoeae FA1090 to determine, by signal ratio, the number of mutant copies of the 23S rRNA gene and the copy number effect on 50S ribosome-mediated azithromycin resistance. Isolates that were predicted to contain four mutated copies were accurately identified compared with the results of direct sequencing. Fewer than four mutated copies gave less accurate results but were consistent with elevated MICs. |
Cost and economic benefit of clinical decision support systems for cardiovascular disease prevention: a Community Guide systematic review
Jacob V , Thota AB , Chattopadhyay SK , Njie GJ , Proia KK , Hopkins DP , Ross MN , Pronk NP , Clymer JM . J Am Med Inform Assoc 2017 24 (3) 669-676 OBJECTIVE: This review evaluates costs and benefits associated with acquiring, implementing, and operating clinical decision support systems (CDSSs) to prevent cardiovascular disease (CVD). MATERIALS AND METHODS: Methods developed for the Community Guide were used to review CDSS literature covering the period from January 1976 to October 2015. Twenty-one studies were identified for inclusion. RESULTS: It was difficult to draw a meaningful estimate for the cost of acquiring and operating CDSSs to prevent CVD from the available studies (n = 12) due to considerable heterogeneity. Several studies (n = 11) indicated that health care costs were averted by using CDSSs, but many were partial assessments that did not consider all components of health care. Four cost-benefit studies reached conflicting conclusions about the net benefit of CDSSs based on incomplete assessments of costs and benefits. Three cost-utility studies reached inconsistent conclusions regarding cost-effectiveness based on a conservative $50,000 threshold. DISCUSSION: Intervention costs were not negligible, but specific estimates were not derived because of the heterogeneity of implementation and reporting metrics. Expected economic benefits from averted health care costs could not be determined with confidence because many studies did not fully account for all components of health care. CONCLUSION: We were unable to conclude whether CDSSs for CVD prevention are cost-beneficial or cost-effective. Several evidence gaps were identified, most prominently a lack of information about major drivers of cost and benefit, a lack of standard metrics for the cost of CDSSs, and a failure to allow for the useful life of a CDSS, which generally extends beyond one accounting period. |
Cost-effectiveness of increasing access to contraception during the Zika virus outbreak, Puerto Rico, 2016
Li R , Simmons KB , Bertolli J , Rivera-Garcia B , Cox S , Romero L , Koonin LM , Valencia-Prado M , Bracero N , Jamieson DJ , Barfield W , Moore CA , Mai CT , Korhonen LC , Frey MT , Perez-Padilla J , Torres-Munoz R , Grosse SD . Emerg Infect Dis 2017 23 (1) 74-82 We modeled the potential cost-effectiveness of increasing access to contraception in Puerto Rico during a Zika virus outbreak. The intervention is projected to cost an additional $33.5 million in family planning services and is likely to be cost-saving for the healthcare system overall. It could reduce Zika virus-related costs by $65.2 million ($2.8 million from less Zika virus testing and monitoring and $62.3 million from avoided costs of Zika virus-associated microcephaly [ZAM]). The estimates are influenced by the contraception methods used, the frequency of ZAM, and the lifetime incremental cost of ZAM. Accounting for unwanted pregnancies that are prevented, irrespective of Zika virus infection, an additional $40.4 million in medical costs would be avoided through the intervention. Increasing contraceptive access for women who want to delay or avoid pregnancy in Puerto Rico during a Zika virus outbreak can substantially reduce the number of cases of ZAM and healthcare costs. |
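A back-of-envelope recombination of the dollar figures quoted in the preceding abstract (all in millions of USD); this simply aggregates the published numbers for illustration and is not the authors' model.

```python
# Figures as quoted in the abstract, in millions of USD.
intervention_cost = 33.5       # additional family planning services
zika_costs_averted = 65.2      # reduced testing/monitoring + avoided ZAM costs
other_medical_averted = 40.4   # avoided costs of unwanted pregnancies, non-Zika

# A positive net figure is what makes the intervention cost-saving for the
# healthcare system overall.
net_savings = zika_costs_averted + other_medical_averted - intervention_cost
```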
Notes from the field: Detection of Sabin-like type 2 poliovirus from sewage after global cessation of trivalent oral poliovirus vaccine - Hyderabad and Ahmedabad, India, August-September 2016
Bahl S , Hampton LM , Bhatnagar P , Rao GS , Haldar P , Sangal L , Jetty PA , Nalavade UP . MMWR Morb Mortal Wkly Rep 2017 65 (52) 1493-1494 During September 2–October 4, 2016, four sewage samples collected during August 3–September 19 (Hyderabad, Telangana State, India) and one sewage sample collected on August 30 (Ahmedabad, Gujarat State, India) tested positive for Sabin-like type 2 polioviruses. These polioviruses were detected approximately 4 months after April 25, 2016, when India officially ceased use of trivalent oral poliovirus vaccine (tOPV), containing Sabin attenuated types 1, 2, and 3 polioviruses, and switched to bivalent OPV (bOPV), containing Sabin attenuated types 1 and 3 polioviruses (1). | Detection of Sabin-like type 2 poliovirus approximately 4 months after the switch from tOPV to bOPV suggested that tOPV use might have continued after it was supposed to stop globally, creating a risk for emergence of new type 2 vaccine-derived polioviruses (VDPV2s), which can cause paralysis. Genetic sequencing of the 903-nucleotide VP1 region of the isolated viruses showed zero, one, two, and four nucleotide changes in the four Hyderabad isolates and one nucleotide change in the Ahmedabad isolate, compared with the type 2 polioviruses in tOPV. These findings indicated that the isolated polioviruses had not replicated sufficiently to accumulate more than a few mutations on a potential pathway to becoming VDPV2s, and that the tOPV they originated from had likely been used during the preceding 4 months. | In accordance with global guidelines for responding to poliovirus events (2), detailed investigations were initiated within 48 hours of detection of the type 2 poliovirus in Hyderabad and the neighboring Rangareddy district, and in Ahmedabad (Box). As part of global poliovirus containment efforts (3), laboratories in those areas potentially storing type 2 polioviruses had previously been found to not have such polioviruses, so they were not searched. 
Telangana and Gujarat state officials met with immunization program stakeholders in the affected districts and other districts in their states regarding the need to reconfirm withdrawal of all tOPV. |
Hospitalizations within 14 days of vaccination among pediatric recipients of the live attenuated influenza vaccine, United States 2010-2012
Millman AJ , Reynolds S , Duffy J , Chen J , Gargiullo P , Fry AM . Vaccine 2016 35 (4) 529-535 BACKGROUND: Live attenuated influenza vaccine (LAIV) is safe in healthy children aged ≥2 years. The original clinical trials excluded individuals with underlying conditions; however, post-marketing data suggest LAIV may be safe for these populations. METHODS: We analyzed MarketScan Commercial Claims Databases from 2010 to 2012 to describe hospitalizations within 14 days of vaccination among LAIV recipients. We evaluated LAIV recipients aged 2-18 years and defined underlying conditions by the presence of an inpatient or outpatient ICD-9 code during the previous calendar year. We excluded asthma and immunocompromising conditions. We defined risk windows as 1-7 days and 8-14 days after vaccination; the control period was 12-4 days prior to and 15-23 days after vaccination. We conducted a self-controlled case series analysis using a conditional Poisson regression model to estimate incidence-rate ratios (IRR). RESULTS: 1,216,123 children aged 2-18 years received LAIV from 2010 to 2012. 634 children met our inclusion criteria and were hospitalized during the observation period (12 days prior to vaccination to 23 days after vaccination). Of those hospitalized, 72 (11.4%) had non-asthma, non-immunocompromising underlying conditions. Children with non-asthma, non-immunocompromising underlying conditions had an all-cause hospitalization IRR of 1.1 (95% CI 0.6-2.0, p=0.83) in the 1-7 day risk period and 0.9 (95% CI 0.4-1.7, p=0.67) in the 8-14 day risk period. Children with no underlying conditions had an all-cause hospitalization IRR of 0.9 (0.8-1.2, p=0.60) in the 1-7 day risk period and 1.1 (95% CI 0.9-1.3, p=0.53) in the 8-14 day risk period. There were no differences in all-cause hospitalization risk in individuals with non-asthma, non-immunocompromising underlying conditions compared to those without underlying conditions in the 1-7 day (p=0.88) or 8-14 day (p=0.24) risk period.
CONCLUSIONS: We found no evidence of differences in post-LAIV hospitalization risk among children with non-asthma, non-immunocompromising underlying conditions compared to healthy children. |
Intraseason waning of influenza vaccine protection: Evidence from the US Influenza Vaccine Effectiveness Network, 2011-12 through 2014-15
Ferdinands JM , Fry AM , Reynolds S , Petrie J , Flannery B , Jackson ML , Belongia EA . Clin Infect Dis 2016 64 (5) 544-550 BACKGROUND: Recent studies suggest that influenza vaccine effectiveness (VE) may wane over the course of an influenza season, leading to suboptimal VE during late influenza seasons. METHODS: We examined the association between influenza VE and time since vaccination among patients ≥9 years old with medically-attended acute respiratory illness in the US Influenza Vaccine Effectiveness Network using data pooled from the 2011-12 through 2014-15 influenza seasons. We used multivariate logistic regression with PCR-confirmed influenza infection as the outcome and vaccination status defined by days between vaccination and symptom onset as the predictor. Models were adjusted for calendar time and other potential confounding factors. RESULTS: We observed decreasing VE with increasing time since vaccination for influenza A(H3N2) (p=0.004), influenza A(H1N1)pdm09 (p=0.01), and influenza B viruses (p=0.04). Maximum VE was observed shortly after vaccination, followed by a decline in VE of about 7% (absolute) per month for influenza A(H3N2) and influenza B and 6% - 11% per month for influenza A(H1N1)pdm09 viruses. VE remained greater than zero for at least six months for influenza A(H1N1)pdm09 and influenza B and at least five months for influenza A(H3N2) viruses. Decline in VE was more pronounced among patients with prior season influenza vaccination. A similar pattern of increasing influenza risk with increasing time since vaccination was seen in analyses limited to vaccinees. CONCLUSIONS: We observed decreasing influenza vaccine protection with increasing time since vaccination across influenza types/subtypes. This association is consistent with intraseason waning of host immunity, but bias or residual confounding could explain these findings. |
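The waning pattern reported in the preceding abstract can be illustrated with a simple linear projection; the starting VE of 50% below is an assumed placeholder, since the abstract reports only the monthly decline (~7 absolute percentage points per month for A(H3N2) and B), not a baseline VE.

```python
def projected_ve(months_since_vaccination, ve_at_vaccination=50.0,
                 decline_per_month=7.0):
    """Linear sketch of intraseason waning: VE (in absolute percentage
    points) falling by a fixed amount each month after vaccination.
    Both default parameters are illustrative assumptions."""
    return ve_at_vaccination - decline_per_month * months_since_vaccination

# Under these assumed numbers, projected VE is still positive at 5 months,
# consistent with the finding that VE remained greater than zero for at
# least five months for A(H3N2) viruses.
ve_5mo = projected_ve(5)
```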
Potential Inhibitory Influence of miRNA 210 on Regulatory T Cells during Epicutaneous Chemical Sensitization.
Long CM , Lukomska E , Marshall NB , Nayak A , Anderson SE . Genes (Basel) 2016 8 (1) Toluene diisocyanate (TDI) is a potent low molecular weight chemical sensitizer and a leading cause of chemical-induced occupational asthma. The regulatory potential of microRNAs (miRNAs) has been recognized in a variety of disease states, including allergic disease; however, the roles of miRNAs in chemical sensitization are largely unknown. In a previous work, increased expression of multiple miRNAs during TDI sensitization was observed and several putative mRNA targets identified for these miRNAs were directly related to regulatory T-cell (Treg) differentiation and function including Foxp3 and Runx3. In this work, we show that miR-210 expression is increased in the mouse draining lymph node (dLN) and Treg subsets following dermal TDI sensitization. Alterations in dLN mRNA and protein expression of Treg related genes/putative miR-210 targets (foxp3, runx3, ctla4, and cd25) were observed at multiple time points following TDI exposure and in ex vivo systems. A Treg suppression assay, including a miR-210 mimic, was utilized to investigate the suppressive ability of Tregs. Cells derived from TDI sensitized mice treated with miR-210 mimic had less expression of miR-210 compared to the acetone control suggesting other factors, such as additional miRNAs, might be involved in the regulation of the functional capabilities of these cells. These novel findings indicate that miR-210 may have an inhibitory role in Treg function during TDI sensitization. Because the functional roles of miRNAs have not been previously elucidated in a model of chemical sensitization, these data contribute to the understanding of the potential immunologic mechanisms of chemical induced allergic disease. |
Optical Screening for Rapid Antimicrobial Susceptibility Testing and for Observation of Phenotypic Diversity among Strains of the Genetically Clonal Species Bacillus anthracis.
McLaughlin HP , Gargis AS , Michel P , Sue D , Weigel LM . J Clin Microbiol 2017 55 (3) 959-970 During high-impact events involving Bacillus anthracis, such as the Amerithrax incident of 2001 or the anthrax outbreaks in Russia and Sweden in 2016, critical decisions to reduce morbidity and mortality include rapid selection and distribution of effective antimicrobial agents for treatment and post-exposure prophylaxis. Detection of antimicrobial resistance currently relies on a conventional broth microdilution (BMD) method that requires a 16-20 hour incubation time for B. anthracis. Advances in high-resolution optical screening offer a new technology to more rapidly evaluate antimicrobial susceptibility and to simultaneously assess growth characteristics of an isolate. Herein, we describe a new method developed and evaluated as a rapid antimicrobial susceptibility test for B. anthracis. This method is based on automated, digital, time-lapse microscopy to observe growth and morphological effects of relevant antibiotics using an optical screening instrument, the oCelloScope™. B. anthracis strains were monitored over time in the presence and absence of penicillin, ciprofloxacin, and doxycycline. Susceptibility to each antibiotic was determined in ≤4 hours, a 75-80% decrease in the time required for conventional methods. Time-lapse video imaging compiled from the optical screening images revealed unexpected differences in growth characteristics among strains of B. anthracis, which is considered to be a clonal organism. This technology provides a new approach for rapidly detecting phenotypic antimicrobial resistance and for documenting growth attributes that may be beneficial in further characterization of individual strains. IMPORTANCE: Early treatment of bacterial infections such as anthrax can dramatically improve survival rates and outcomes for affected populations.
Conventional, gold-standard methods to detect drug resistance, such as broth microdilution, functionally assess the ability of bacteria to grow in the presence of drug, but are time intensive and rely on the subjective interpretation of results. Here, we describe the application of automated, time-lapse optical imaging to rapidly and accurately detect single- and multi-drug resistant strains of Bacillus anthracis based on growth in microtiter cultures. Drug resistance was determined up to 16 hours faster than with the conventional BMD method, and the ability to visualize growth of B. anthracis in real time revealed novel growth morphologies. Detailed growth characteristics of strains and rapid time-to-susceptibility results can assist in critical decision-making about clinical treatment or post-exposure prophylaxis regimens during public health emergency events involving infections such as anthrax. |
Inhalation of gas metal arc-stainless steel welding fume promotes lung tumorigenesis in A/J mice
Falcone LM , Erdely A , Meighan TG , Battelli LA , Salmen R , McKinney W , Stone S , Cumpston A , Cumpston J , Andrews RN , Kashon M , Antonini JM , Zeidler-Erdely PC . Arch Toxicol 2017 91 (8) 2953-2962 Epidemiologic studies suggest an increased risk of lung cancer with exposure to welding fumes, but controlled animal studies are needed to support this association. Oropharyngeal aspiration of collected "aged" gas metal arc-stainless steel (GMA-SS) welding fume has been shown by our laboratory to promote lung tumor formation in vivo using a two-stage initiation-promotion model. Our objective in this study was to determine whether inhalation of freshly generated GMA-SS welding fume also acts as a lung tumor promoter in lung tumor-susceptible mice. Male A/J mice received intraperitoneal (IP) injections of corn oil or the chemical initiator 3-methylcholanthrene (MCA; 10 microg/g) and 1 week later were exposed by whole-body inhalation to air or GMA-SS welding aerosols for 4 h/d x 4 d/w x 9 w at a target concentration of 40 mg/m3. Lung nodules were enumerated at 30 weeks post-initiation. GMA-SS fume significantly promoted lung tumor multiplicity in A/J mice initiated with MCA (16.11 +/- 1.18) compared to MCA/air-exposed mice (7.93 +/- 0.82). Histopathological analysis found that the additional lung nodules in the MCA/GMA-SS group were hyperplasias and adenomas, consistent with developing lung tumorigenesis. Metal deposition analysis in the lung revealed that a deposited dose approximately fivefold lower than in our previous aspiration study still elicited a significant lung tumorigenic response. In conclusion, this study demonstrates that inhaling GMA-SS welding fume promotes lung tumorigenesis in vivo, which is consistent with the epidemiologic studies showing that welders may be at an increased risk for lung cancer. |
Toward the development of the next generation of a rapid diagnostic test: Synthesis of glycophosphatidylinositol (GPI) analogues of Plasmodium falciparum and immunological characterization
Gurale BP , He Y , Cui X , Dinh H , Dhawane AN , Lucchi NW , Udhayakumar V , Iyer SS . Bioconjug Chem 2016 27 (12) 2886-2899 A large number of proteins in malaria parasites are anchored using glycophosphatidylinositols (GPIs) with lipid tails. These GPIs are structurally distinct from human GPIs. Plasmodium falciparum GPIs have been considered as potential vaccine candidates because these molecules are involved in inducing inflammatory responses in human hosts, and natural anti-GPI antibody responses have been shown to be associated with protection against severe disease. GPIs can also be considered as targets for rapid diagnostic tests. Because isolation of native GPIs in large quantities is challenging, development of synthetic GPI molecules can facilitate further exploration of GPI molecules for diagnostics. Here, we report synthesis and immunological characterization of a panel of malaria-specific GPI analogues. A total of three GPI analogues were chemically synthesized and conjugated to a carrier protein to immunize and generate antibodies in rabbits. The rabbit immune sera showed reactivity with synthetic GPIs and native GPIs extracted from P. falciparum parasite, as determined by Luminex and ELISA methods. |
Enhanced virulence of clade 2.3.2.1 highly pathogenic avian influenza A H5N1 viruses in ferrets
Pearce MB , Pappas C , Gustin KM , Davis CT , Pantin-Jackwood MJ , Swayne DE , Maines TR , Belser JA , Tumpey TM . Virology 2016 502 114-122 Sporadic avian-to-human transmission of highly pathogenic avian influenza (HPAI) A(H5N1) viruses necessitates the analysis of currently circulating and evolving clades to assess their potential risk. Following the spread and sustained circulation of clade 2 viruses across multiple continents, numerous subclades and genotypes have been described. To better understand the pathogenesis associated with the continued diversification of clade 2 A(H5N1) influenza viruses, we investigated the relative virulence of eleven human and poultry isolates collected from 2006 to 2013 by determining their ability to cause disease in the ferret model. Numerous clade 2 viruses, including a clade 2.2 avian isolate, a 2.2.2.1 human isolate, and two 2.2.1 human isolates, were found to be of low virulence in the ferret model, though lethality was detected following infection with one 2.2.1 human isolate. In contrast, three of six clade 2.3.2.1 avian isolates tested led to severe disease and death among infected ferrets. Clade 2.3.2.1b and 2.3.2.1c isolates, but not 2.3.2.1a isolates, were associated with ferret lethality. All A(H5N1) viruses replicated efficiently in the respiratory tract of ferrets regardless of their virulence and lethality. However, lethal isolates were characterized by systemic viral dissemination, including detection in the brain and enhanced histopathology in lung tissues. The finding of disparate virulence phenotypes between clade 2 A(H5N1) viruses, notably differences between subclades of 2.3.2.1 viruses, suggests there are distinct molecular determinants present within the established subclades, the identification of which will assist in molecular-based surveillance and public health efforts against A(H5N1) viruses. |
Parental refusal of vitamin K and neonatal preventive services: A need for surveillance
Marcewicz LH , Clayton J , Maenner M , Odom E , Okoroh E , Christensen D , Goodman A , Warren MD , Traylor J , Miller A , Jones T , Dunn J , Schaffner W , Grant A . Matern Child Health J 2017 21 (5) 1079-1084 Objectives Vitamin K deficiency bleeding (VKDB) in infants is a coagulopathy preventable with a single dose of injectable vitamin K at birth. The Tennessee Department of Health (TDH) and Centers for Disease Control and Prevention (CDC) investigated vitamin K refusal among parents in 2013 after learning of four cases of VKDB associated with prophylaxis refusal. Methods Chart reviews were conducted at Nashville-area hospitals for 2011-2013 and Tennessee birthing centers for 2013 to identify parents who had refused injectable vitamin K for their infants. Contact information was obtained for parents, and they were surveyed regarding their reasons for refusing. Results At hospitals, 3.0% of infants did not receive injectable vitamin K due to parental refusal in 2013, a frequency higher than in 2011 and 2012. This percentage was much higher at birthing centers, where 31% of infants did not receive injectable vitamin K. The most common responses for refusal were a belief that the injection was unnecessary (53%) and a desire for a natural birthing process (36%). Refusal of other preventive services was common, with 66% of families refusing vitamin K, newborn eye care with erythromycin, and the neonatal dose of hepatitis B vaccine. Conclusions for Practice Refusal of injectable vitamin K was more common among families choosing to give birth at birthing centers than at hospitals, and was related to refusal of other preventive services in our study. Surveillance of vitamin K refusal rates could assist in further understanding this occurrence and tailoring effective strategies for mitigation. |
Identifying birth defects in automated data sources in the Vaccine Safety Datalink
Kharbanda EO , Vazquez-Benitez G , Romitti PA , Naleway AL , Cheetham TC , Lipkind HS , Sivanandam S , Klein NP , Lee GM , Jackson ML , Hambidge SJ , Olsen A , McCarthy N , DeStefano F , Nordin JD . Pharmacoepidemiol Drug Saf 2017 26 (4) 412-420 PURPOSE: The Vaccine Safety Datalink (VSD), a collaboration between the Centers for Disease Control and Prevention and several large healthcare organizations, aims to monitor safety of vaccines administered in the USA. We present definitions and prevalence estimates for major structural birth defects to be used in studies of maternal vaccine safety. METHODS: In this observational study, we created and refined algorithms for identifying major structural birth defects from electronic healthcare data, conducted formal chart reviews for severe cardiac defects, and conducted limited chart validation for other defects. We estimated prevalence for selected defects by VSD site and birth year and compared these estimates to those in a US and European surveillance system. RESULTS: We developed algorithms to enumerate >50 major structural birth defects from standardized administrative and healthcare data based on utilization patterns and expert opinion, applying criteria for number, timing, and setting of diagnoses. Our birth cohort included 497 894 infants across seven sites. The period prevalence for all selected major birth defects in the VSD from 2004 to 2013 was 1.7 per 100 live births. Cardiac defects were most common (65.4 per 10 000 live births), with one-fourth classified as severe, requiring emergent intervention. For most major structural birth defects, prevalence estimates were stable over time and across sites and similar to those reported in other population-based surveillance systems. CONCLUSIONS: Our algorithms can efficiently identify many major structural birth defects in large healthcare datasets and can be used in studies evaluating the safety of vaccines administered to pregnant women. |
Notes from the field: Compliance with postexposure prophylaxis for exposure to Bacillus anthracis among U.S. military personnel - South Korea, May 2015
Allen KC , Hendricks K , Sergienko E , Mirza R , Chitale RA . MMWR Morb Mortal Wkly Rep 2017 65 (52) 1489-1490 In the United States, Bacillus anthracis is a select agent and is subject to select agent requirements under the U.S. Code of Federal Regulations.* On April 20, 2015, samples of B. anthracis spores considered inactivated were shipped from a U.S. Department of Defense (DoD) laboratory at Dugway Proving Ground, Utah, to various laboratories for routine collaborative diagnostics research. On May 22, 2015, CDC was notified of live B. anthracis in one sample received by a private company and initiated a response. On May 29, 2015, DoD began reviewing safety practices for generating and handling inactivated B. anthracis spores. By June 1, 2015, the Office of the Assistant Secretary of Defense for Nuclear, Chemical, and Biological Defense Programs had established a task force to coordinate the DoD response (1). | The DoD Comprehensive Anthrax Laboratory Review (2) was completed within 30 days and addressed five main objectives: 1) conduct root cause analysis for incomplete inactivation of B. anthracis; 2) investigate the lack of effective postinactivation sterility testing for detection of live B. anthracis; 3) review DoD laboratory biohazard safety procedures/protocols; 4) determine laboratory adherence to established procedures/protocols; and 5) identify systemic problems and corresponding solutions. The DoD investigation identified 194 commercial companies, academic institutions, and federal laboratories that had received potentially live B. anthracis samples across 50 states, the District of Columbia, three U.S. territories, and nine foreign countries. | In South Korea, the Joint U.S. Forces Korea Portal and Integrated Threat Recognition program works on detection of biologic agents in the environment. A sample of B. anthracis was sent to Osan Air Base from the Dugway Proving Ground shipment for research, and 22 DoD personnel were exposed to the sample. 
Immediately after the event was discovered, these personnel were assessed for the need for emergency postexposure prophylaxis (PEP). On May 27, 2015, all 22 potentially exposed personnel began a PEP regimen tailored to their individual vaccination history. Persons lacking prior anthrax vaccination or with expired vaccination history received the standard emergency use protocol for PEP: 3 anthrax vaccine doses over 4 weeks plus 60 days of oral ciprofloxacin (500 mg twice a day) or doxycycline (100 mg twice a day) (3,4). Persons current for B. anthracis vaccination received emergency PEP: 30 days of oral ciprofloxacin or doxycycline (3,4) (Table). |
Identifying acceptability and price points for purchasing micronutrient powders for children 2 to 5 years old in Nepal
Gunnala R , Perrine CG , Subedi G , Mebrahtu S , Dahal P , Jefferds ME . Asia Pac J Clin Nutr 2017 26 (1) 110-117 BACKGROUND AND OBJECTIVE: Little is known about purchasing micronutrient powders (MNP) for children 2-5 years. We describe acceptability for purchasing and price points for MNP for children 2-5 years among caregivers living in districts where free MNP are distributed for children 6-23 months. METHODS AND STUDY DESIGN: Cross-sectional surveys were conducted 3 months after MNP program implementation in 2 districts and 15 months after implementation in 2 different districts. Chi-square tests and logistic regression were used to describe associations of sociodemographic and program exposure factors with acceptability of purchasing MNP among 1,261 mothers of children 6-23 months who had heard of MNP. RESULTS: Overall, 77.5% and 86.1% of mothers reported acceptability for purchasing MNP in the 3- and 15-month surveys, respectively. A positive pricing attitude (PPA) about paying 150 Nepali rupees for 60 sachets of MNP was reported by 66.3% and 73.4% of mothers. Acceptability for purchasing MNP in both time periods increased with higher wealth quintile and higher maternal education; PPA increased with higher maternal education. Controlling for sociodemographics, program exposure factors associated with acceptability for purchasing MNP included lack of perceived barriers to MNP intake and health worker counselling (3-month surveys), and knowledge of benefits of MNP intake and lack of perceived barriers to MNP intake (15-month surveys). CONCLUSIONS: Mothers reported acceptability for purchasing MNP and PPA for older children in Nepal. Differences in acceptability were found across sociodemographics and program exposures. Use of these results and further exploration into actual purchasing behaviour can inform future MNP distribution methods in Nepal. |
Communications and tracking research supports MINER act
Reyes M . Coal Age 2016 121 (11) 42-44 Driven by a series of tragic mining accidents, the Mine Improvement and New Emergency Response (MINER) Act was passed in June 2006. One of the clauses of the MINER Act mandated mine operators to adopt two-way wireless underground communications and electronic tracking systems that would allow personnel on the surface to communicate with and know the location of workers underground. The act prescribed action items and special focus areas related to mine safety and health, requiring mines to include detailed plans for two-way communications and tracking in their Emergency Response Plans (ERPs) and to have the systems installed by June 2009. | When the act was passed, limitations existed relative to the commercially available products and the technologies that could potentially be applied to underground mining environments. As a result, the National Institute for Occupational Safety and Health (NIOSH) formed an interagency working group to investigate technology and research developments that could be applied to mining. The group assessed the status of communications and tracking (CT) technologies and their transferability potential to mining. The discussions ultimately revealed that viable new technologies were not yet available for mining or would not be fully developed by the 2009 deadline. |
Evidence of non-Plasmodium falciparum malaria infection in Kedougou, Senegal
Daniels RF , Deme AB , Gomis JF , Dieye B , Durfee K , Thwing JI , Fall FB , Ba M , Ndiop M , Badiane AS , Ndiaye YD , Wirth DF , Volkman SK , Ndiaye D . Malar J 2017 16 (1) 9 BACKGROUND: Expanded malaria control efforts in Senegal have resulted in increased use of rapid diagnostic tests (RDT) to identify the primary disease-causing Plasmodium species, Plasmodium falciparum. However, the type of RDT utilized in Senegal does not detect other malaria-causing species such as Plasmodium ovale spp., Plasmodium malariae, or Plasmodium vivax. Consequently, there is a lack of information about the frequency and types of malaria infections occurring in Senegal. This study set out to better determine whether species other than P. falciparum were evident among patients evaluated for possible malaria infection in Kedougou, Senegal. METHODS: Real-time polymerase chain reaction speciation assays for P. vivax, P. ovale spp., and P. malariae were developed and validated by sequencing. DNA extracted from 475 P. falciparum-specific HRP2-based RDTs, collected between 2013 and 2014 from a facility-based sample of symptomatic patients at two health clinics in Kedougou, a hyper-endemic region in southeastern Senegal, was analysed. RESULTS: Plasmodium malariae (n = 3) and P. ovale wallikeri (n = 2) were observed as co-infections with P. falciparum among patients with positive RDT results (n = 187), including one patient positive for all three species. Among 288 negative RDT samples, samples positive for P. falciparum (n = 24), P. ovale curtisi (n = 3), P. ovale wallikeri (n = 1), and P. malariae (n = 3) were identified, corresponding to a non-falciparum positivity rate of 2.5%. CONCLUSIONS: These findings emphasize the limitations of the RDT used for malaria diagnosis and demonstrate that non-P. falciparum malaria infections occur in Senegal.
Current RDTs used for routine clinical diagnosis do not necessarily provide an accurate reflection of malaria transmission in Kedougou, Senegal, and more sensitive and specific methods are required for diagnosis and patient care, as well as for surveillance and elimination activities. These findings have implications for other malaria-endemic settings where species besides P. falciparum may be transmitted and overlooked by control or elimination activities. |
Normative values for cardiorespiratory fitness testing among US children aged 6-11 years
Gahche JJ , Kit BK , Fulton JE , Carroll DD , Rowland T . Pediatr Exerc Sci 2017 29 (2) 1-23 BACKGROUND: Nationally representative normative values for cardiorespiratory fitness (CRF) have not been described for US children since the mid-1980s. OBJECTIVE: To provide sex- and age-specific normative values for CRF of US children aged 6-11 years. METHODS: Data from 624 children aged 6-11 years who participated in CRF testing as part of the 2012 National Health and Nutrition Examination Survey National Youth Fitness Survey, a cross-sectional survey, were analyzed. Participants were assigned to one of three age-specific protocols and asked to exercise to volitional fatigue. The difficulty of the protocols increased with successive age groups. CRF was assessed as maximal endurance time (min:sec). Data analysis was conducted in 2016. RESULTS: For 6-7, 8-9, and 10-11 year olds, corresponding to the age-specific protocols, mean endurance time was 12:10 min:sec (95% CI: 11:49-12:31), 11:16 min:sec (95% CI: 11:00-11:31), and 10:01 min:sec (95% CI: 9:37-10:25), respectively. Youth in the lowest 20th percentile for endurance time were more likely to be obese, to report less favorable health, and to report more than two hours of screen time per day. CONCLUSION: These data may serve as baseline estimates to monitor trends over time in CRF among US children aged 6-11 years. |
Prevalence of Complete Streets policies in U.S. municipalities
Carlson SA , Paul P , Kumar G , Watson KB , Atherton E , Fulton JE . J Transp Health 2016 5 142-150 Communities can adopt Complete Streets policies to support physical activity through the routine design and operation of streets and communities that are safe for all people, regardless of age, ability, or mode of transport. Our aim was two-fold: (1) to estimate the prevalence of Complete Streets policies in the United States overall and by select municipality characteristics using data from the National Survey of Community-Based Policy and Environmental Supports for Healthy Eating and Active Living (CBS HEAL) and (2) to examine the agreement between local policies reported in the CBS HEAL and those recorded in the National Complete Streets Coalition's database. Data from a representative sample of incorporated U.S. municipalities with a population of at least 1000 people (n = 2029) were analyzed using survey weights to create national estimates. In 2014, 25.2% of municipalities had a Complete Streets policy reported by a local official. Prevalence of local policies decreased with decreasing population size and was lower among those with a lower median education level and those in the South, with and without adjustment for other municipality characteristics. Agreement between local Complete Streets policies reported in CBS HEAL and the Coalition's database was moderate with 72.5% agreement (kappa = 0.21); however, agreement was lower for municipalities with smaller populations, those located in rural areas, and those with a lower median education level. About 16.8% of local officials reported they did not know if their municipality had such a policy. There is room for improvement in the awareness and adoption of Complete Streets policies in the United States, especially among smaller municipalities and those with lower median education levels.
Helping communities address issues related to the awareness, adoption, and implementation of Complete Streets policies can be an important step toward creating more walkable communities. |
Planning a national-level data collection protocol to measure outcomes for the Colorectal Cancer Control Program
Satsangi A , DeGroff A . J Ga Public Health Assoc 2016 6 292-297 BACKGROUND: The Colorectal Cancer Control Program (CRCCP) of the Centers for Disease Control and Prevention (CDC) funded 30 grantees to partner with health systems with the goal of increasing screening for colorectal cancer (CRC). METHODS: Evaluators applied CDC's Framework for Program Evaluation to design a national-level outcome evaluation for measuring changes in CRC screening rates in partner health systems. RESULTS: The resulting evaluation design involves the collection and reporting of clinic-level CRC screening rates supplemented by various tools to support the reporting of high-quality, reliable data. CONCLUSIONS: The CRCCP evaluation represents a strong design to measure the primary outcome of interest, CRC screening rates, and public health practitioners can benefit from lessons learned about stakeholder involvement, data quality, and the role of evaluators in data dissemination. |
Importance of implementation economics for program planning-evaluation of CDC's colorectal cancer control program
Tangka FK , Subramanian S . Eval Program Plann 2016 62 64-66 Understanding the cost of initiating and operationalizing colorectal cancer (CRC) control programs is essential for planning successful implementation of evidence-based recommendations to reduce disparities in the use and quality of CRC screening services. Currently, only about 58% of adults ages 50–75 years in the United States are up-to-date with CRC screening recommendations; adults without health insurance have a much lower uptake of about 24% (Sabatino, White, Thompson, & Klabunde, 2015). Targeted interventions and programs, especially those focused on uninsured and underinsured populations, are required to meet the population-wide target of 80% by 2018 set by the National Colorectal Cancer Roundtable (NCCRT, n.d.). The Community Guide contains several evidence-based recommendations for screening promotion interventions, but there are very few studies on the economics of screening program implementation (Baron et al., 2010; Sabatino et al., 2012). There is an urgent need to increase the number of ‘implementation economics’ studies to develop the evidence base to guide funding decision making, design cost-effective programs, and ensure optimal use of limited resources. We define ‘implementation economics’ as a sub-discipline within implementation science that focuses on economic evaluation related to cost (cost-of-illness analysis, program cost analysis), cost-effectiveness, cost-benefit, cost-utility, budget impact, and cost minimization. | For more than a decade, CDC has funded and provided technical support to a range of grantee programs to implement CRC screening, and implementation economics has been a cornerstone of the evaluations of these programs. Between 2005 and 2009, CDC administered the Colorectal Cancer Screening Demonstration Program (CRCSDP) at five sites [Baltimore, Maryland; St. Louis, Missouri; the entire state of Nebraska; Suffolk County, New York; and King, Clallam, and Jefferson counties in Washington] (Centers for Disease Control and Prevention, 2013a). These programs provided CRC screening for low-income, underinsured, or uninsured men and women between the ages of 50 and 64 years. In 2009, successes and lessons learned (Centers for Disease Control and Prevention, 2013b; Joseph, DeGroff, Hayes, Wong, & Plescia, 2011) from the CRCSDP informed planning and funding of the first round of CDC’s Colorectal Cancer Control Program (CRCCP) (2009–2015). Through the CRCCP, CDC provided funding to 22 states and 4 tribal organizations to implement CRC programs starting in 2009, and another 3 states were added to the program in 2010. Fig. 1 provides a map of the United States highlighting the CRCCP grantees. |
Updated estimates of ectopic pregnancy among commercially and Medicaid-insured women in the United States, 2002-2013
Tao G , Patel C , Hoover KW . South Med J 2017 110 (1) 18-24 OBJECTIVES: To update trends in the rates of ectopic pregnancy, to compare rates of ectopic pregnancy between commercially insured and Medicaid-insured women, and to assess the differences in rates of ectopic pregnancy by different measures of ectopic pregnancy. METHODS: We analyzed data from 2002 to 2013 using the Truven Health MarketScan Commercial and Medicaid Claims Database. We limited the study population to women aged 15 to 44 years with any pregnancy in each year. Pregnancy and ectopic pregnancy were identified by clinical services with diagnostic or procedural codes. Ectopic pregnancy was measured in two ways: diagnosed and treated compared with diagnosed only; pregnancy was measured in two ways: any pregnancy compared with pregnancy with delivery. RESULTS: We did not observe a substantial trend in the rate of ectopic pregnancy from 2002 to 2013. The rate of diagnosed and treated ectopic pregnancy increased substantially with age: from 0.29% in women aged 15 to 19 years to 0.89% in women aged 40 to 44 years among the commercially insured population, and from 0.23% to 0.85%, respectively, among the Medicaid-insured population. The rate of ectopic pregnancy also varied by the different methodologies used to estimate rates. CONCLUSIONS: The rate of ectopic pregnancy is relatively low and stable for women of reproductive age in the United States. Our findings highlight that it is important to clearly define the numerator and denominator in the measure of ectopic pregnancy rates. |
Preventing unintended pregnancy among young sexually active women: Recognizing the role of violence, self-esteem, and depressive symptoms on use of contraception
Nelson DB , Zhao H , Corrado R , Mastrogiannis DM , Lepore SJ . J Womens Health (Larchmt) 2017 26 (4) 352-360 OBJECTIVES: Ineffective contraceptive use among young sexually active women is extremely prevalent and poses a significant risk for unintended pregnancy (UP). Ineffective contraception involves the use of the withdrawal method or the inconsistent use of other types of contraception (i.e., condoms and birth control pills). This investigation examined violence exposure and psychological factors related to ineffective contraceptive use among young sexually active women. MATERIALS AND METHODS: Young, nonpregnant sexually active women (n = 315) were recruited from an urban family planning clinic in 2013 to participate in a longitudinal study. Tablet-based surveys measured childhood violence, community-level violence, intimate partner violence, depressive symptoms, and self-esteem. Follow-up surveys measured type and consistency of contraception used 9 months later. Multivariate logistic regression models assessed violence and psychological risk factors as main effects and moderators related to ineffective compared with effective use of contraception. RESULTS: The multivariate logistic regression model showed that childhood sexual violence and low self-esteem were significantly related to ineffective use of contraception (adjusted odds ratio [aOR] = 2.69, 95% confidence interval [CI]: 1.18-6.17, and aOR = 0.51, 95% CI: 0.28-0.93, respectively), although self-esteem did not moderate the relationship between childhood sexual violence and ineffective use of contraception (aOR = 0.38, 95% CI: 0.08-1.84). Depressive symptoms were not related to ineffective use of contraception in the multivariate model. CONCLUSIONS: Interventions to reduce UP should recognize the long-term effects of childhood sexual violence and address the role of low self-esteem on the ability of young sexually active women to effectively and consistently use contraception to prevent UP. |
Use of combined hormonal contraceptives among women with migraines and risk of ischemic stroke
Champaloux SW , Tepper NK , Monsour M , Curtis KM , Whiteman MK , Marchbanks PA , Jamieson DJ . Am J Obstet Gynecol 2016 216 (5) 489.e1-489.e7 BACKGROUND: Migraine with aura and combined hormonal contraceptives are independently associated with an increased risk of ischemic stroke. However, little is known about whether there are any joint effects of migraine and hormonal contraceptives on risk of stroke. OBJECTIVE: To estimate the incidence of stroke in women of reproductive age and examine the association between combined hormonal contraceptive use, migraine type (with or without aura), and ischemic stroke. STUDY DESIGN: This study used a nationwide health care claims database and employed a nested case-control study design. Women ages 15-49 years with first-ever stroke during 2006-2012 were identified using International Classification of Diseases, 9th Revision, Clinical Modification inpatient services diagnosis codes. Four controls were matched to each case based on age. Migraine headache with and without aura was identified using inpatient or outpatient diagnosis codes. Current combined hormonal contraceptive use was identified using the National Drug Code from the pharmacy database. Conditional logistic regression was used to estimate adjusted odds ratios and 95% confidence intervals of ischemic stroke by migraine type and combined hormonal contraceptive use. RESULTS: Between 2006-2012, there were 25,887 ischemic strokes among women ages 15-49, for a cumulative incidence of 11 strokes per 100,000 women.
Compared to women with neither migraine nor combined hormonal contraceptive use, the odds ratio of ischemic stroke was highest among women with migraine with aura using combined hormonal contraceptives (odds ratio 6.1, 95% confidence interval 3.1-12.1); odds ratios were also elevated for migraine with aura without combined hormonal contraceptive use (odds ratio 2.7, 95% confidence interval 1.9-3.7), migraine without aura and combined hormonal contraceptive use (odds ratio 1.8, 95% confidence interval 1.1-2.9), and migraine without aura without combined hormonal contraceptive use (odds ratio 2.2, 95% confidence interval 1.9-2.7). CONCLUSION: The joint effect of combined hormonal contraceptives and migraine with aura was associated with a 6-fold increased risk of ischemic stroke compared with neither risk factor. Use of combined hormonal contraceptives did not substantially further increase risk of ischemic stroke among women with migraine without aura. Determining migraine type is critical in assessing safety of combined hormonal contraceptives among women with migraine. |
Quitting smoking among adults - United States, 2000-2015
Babb S , Malarcher A , Schauer G , Asman K , Jamal A . MMWR Morb Mortal Wkly Rep 2017 65 (52) 1457-1464 Quitting cigarette smoking benefits smokers at any age (1). Individual, group, and telephone counseling and seven Food and Drug Administration-approved medications increase quit rates. To assess progress toward the Healthy People 2020 objectives of increasing the proportion of U.S. adults who attempt to quit smoking cigarettes to ≥80.0% (TU-4.1), and increasing recent smoking cessation success to ≥8.0% (TU-5.1), CDC assessed national estimates of cessation behaviors among adults aged ≥18 years using data from the 2000, 2005, 2010, and 2015 National Health Interview Surveys (NHIS). During 2015, 68.0% of adult smokers wanted to stop smoking, 55.4% made a past-year quit attempt, 7.4% recently quit smoking, 57.2% had been advised by a health professional to quit, and 31.2% used cessation counseling and/or medication when trying to quit. During 2000-2015, increases occurred in the proportion of smokers who reported a past-year quit attempt, recently quit smoking, were advised to quit by a health professional, and used cessation counseling and/or medication (p<0.05). Throughout this period, fewer than one third of persons used evidence-based cessation methods when trying to quit smoking. As of 2015, 59.1% of adults who had ever smoked had quit. To further increase cessation, health care providers can consistently identify smokers, advise them to quit, and offer them cessation treatments. In addition, health insurers can increase cessation by covering and promoting evidence-based cessation treatments and removing barriers to treatment access. |
Analysis of anthrax immune globulin intravenous with antimicrobial treatment in injection drug users, Scotland, 2009-2010
Cui X , Nolen LD , Sun J , Booth M , Donaldson L , Quinn CP , Boyer AE , Hendricks K , Shadomy S , Bothma P , Judd O , McConnell P , Bower WA , Eichacker PQ . Emerg Infect Dis 2017 23 (1) 56-65 We studied anthrax immune globulin intravenous (AIG-IV) use from a 2009-2010 outbreak of Bacillus anthracis soft tissue infection in injection drug users in Scotland, UK, and we compared findings from 15 AIG-IV recipients with findings from 28 nonrecipients. Death rates did not differ significantly between recipients and nonrecipients (33% vs. 21%). However, whereas only 8 (27%) of 30 patients at low risk for death (admission sequential organ failure assessment score of 0-5) received AIG-IV, 7 (54%) of the 13 patients at high risk for death (sequential organ failure assessment score of 6-11) received treatment. AIG-IV recipients had surgery more often and, among survivors, had longer hospital stays than did nonrecipients. AIG-IV recipients were sicker than nonrecipients. This difference and the small number of higher risk patients confound assessment of AIG-IV effectiveness in this outbreak. |
Zika virus -10 public health achievements in 2016 and future priorities
Oussayef NL , Pillai SK , Honein MA , Ben Beard C , Bell B , Boyle CA , Eisen LM , Kohl K , Kuehnert MJ , Lathrop E , Martin SW , Martin R , McAllister JC , McClune EP , Mead P , Meaney-Delman D , Petersen B , Petersen LR , Polen KN , Powers AM , Redd SC , Sejvar JJ , Sharp T , Villanueva J , Jamieson DJ . MMWR Morb Mortal Wkly Rep 2017 65 (52) 1482-1488 The introduction of Zika virus into the Region of the Americas (Americas) and the subsequent increase in cases of congenital microcephaly resulted in activation of CDC's Emergency Operations Center on January 22, 2016, to ensure a coordinated response and timely dissemination of information, and led the World Health Organization to declare a Public Health Emergency of International Concern on February 1, 2016. During the past year, public health agencies and researchers worldwide have collaborated to protect pregnant women, inform clinicians and the public, and advance knowledge about Zika virus (Figure 1). This report summarizes 10 important contributions toward addressing the threat posed by Zika virus in 2016. To protect pregnant women and their fetuses and infants from the effects of Zika virus infection during pregnancy, public health activities must focus on preventing mosquito-borne transmission through vector control and personal protective practices, preventing sexual transmission by advising abstention from sex or consistent and correct use of condoms, and preventing unintended pregnancies by reducing barriers to access to highly effective reversible contraception. |
Zika virus knowledge among pregnant women who were in areas with active transmission
Whittemore K , Tate A , Illescas A , Saffa A , Collins A , Varma JK , Vora NM . Emerg Infect Dis 2017 23 (1) 164-166 We surveyed women in New York, New York, USA, who were in areas with active Zika virus transmission while pregnant. Of 99 women who were US residents, 30 were unaware of the government travel advisory to areas with active Zika virus transmission while pregnant, and 37 were unaware of their pregnancies during travel. |
Human rabies - Puerto Rico, 2015
Styczynski A , Tran C , Dirlikov E , Zapata MR , Ryff K , Petersen B , Sanchez AC , Mayshack M , Martinez LC , Condori R , Ellison J , Orciari L , Yager P , Pena RG , Sanabria D , Velazquez JC , Thomas D , Garcia BR . MMWR Morb Mortal Wkly Rep 2017 65 (52) 1474-1476 On December 1, 2015, the Puerto Rico Department of Health (PRDH) was notified by a local hospital of a suspected human rabies case. The previous evening, a Puerto Rican man aged 54 years arrived at the emergency department with fever, difficulty swallowing, hand paresthesia, cough, and chest tightness. The next morning the patient left against medical advice but returned to the emergency department in the afternoon with worsening symptoms. The patient's wife reported that he had been bitten by a mongoose during the first week of October, but had not sought care for the bite. While being transferred to the intensive care unit, the patient went into cardiac arrest and died. On December 3, rabies was confirmed from specimens collected during autopsy. PRDH conducted an initial rapid risk assessment, and five family members were started on rabies postexposure prophylaxis (PEP). |
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Disease Reservoirs and Vectors
- Environmental Health
- Epidemiology and Surveillance
- Food Safety
- Genetics and Genomics
- Health Economics
- Immunity and Immunization
- Laboratory Sciences
- Maternal and Child Health
- Military Medicine and Health
- Nutritional Sciences
- Occupational Safety and Health - Mining
- Parasitic Diseases
- Physical Activity
- Program Evaluation
- Reproductive Health
- Substance Use and Abuse
- Zoonotic and Vectorborne Diseases
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed: Feb 1, 2024
- Page last updated: Apr 29, 2024