Nephrology care prior to end-stage renal disease and outcomes among new ESRD patients in the USA
Gillespie BW , Morgenstern H , Hedgeman E , Tilea A , Scholz N , Shearon T , Burrows NR , Shahinian VB , Yee J , Plantinga L , Powe NR , McClellan W , Robinson B , Williams DE , Saran R . Clin Kidney J 2015 8 (6) 772-780 BACKGROUND: Longer nephrology care before end-stage renal disease (ESRD) has been linked with better outcomes. METHODS: We investigated whether longer pre-ESRD nephrology care was associated with lower mortality at both the patient and state levels among 443 761 incident ESRD patients identified in the USA between 2006 and 2010. RESULTS: Overall, 33% of new ESRD patients had received no prior nephrology care, while 28% had received care for >12 months. At the patient level, predictors of >12 months of nephrology care included having health insurance, white race, younger age, diabetes, hypertension and US region. Longer pre-ESRD nephrology care was associated with lower first-year mortality (adjusted hazard ratio = 0.58 for >12 months versus no care; 95% confidence interval 0.57-0.59), higher albumin and hemoglobin, choice of peritoneal dialysis and native fistula and discussion of transplantation options. Living in a state with a 10% higher proportion of patients receiving >12 months of pre-ESRD care was associated with a 9.3% lower relative mortality rate, standardized for case mix (R² = 0.47; P < 0.001). CONCLUSIONS: This study represents the largest cohort of incident ESRD patients to date. Although we did not follow patients before ESRD onset, our findings, both at the individual patient and state levels, reflect the importance of early nephrology care among those with chronic kidney disease. |
Prevalence of cholesterol treatment eligibility and medication use among adults - United States, 2005-2012
Mercado C , DeSimone AK , Odom E , Gillespie C , Ayala C , Loustalot F . MMWR Morb Mortal Wkly Rep 2015 64 (47) 1305-11 A high blood level of low-density lipoprotein cholesterol (LDL-C) remains a major risk factor for atherosclerotic cardiovascular disease (ASCVD) (1), although data from 2005 through 2012 have shown a decline in high cholesterol (total and LDL cholesterol) along with an increase in the use of cholesterol-lowering medications (2-4). The most recent national guidelines (published in 2013) from the American College of Cardiology and the American Heart Association (ACC/AHA) expand previous recommendations for reducing cholesterol to include lifestyle modifications and medication use as part of complete cholesterol management and to lower risk for ASCVD (5-8). Because changes in cholesterol treatment guidelines might magnify existing disparities in care and medication use, it is important to describe persons currently eligible for treatment and medication use, particularly as more providers implement the 2013 ACC/AHA guidelines. To understand baseline estimates of U.S. adults on or eligible for cholesterol treatment, as well as to identify sex and racial/ethnic disparities, CDC analyzed data from the 2005-2012 National Health and Nutrition Examination Surveys (NHANES). Because the 2013 ACC/AHA guidelines focus on initiation or continuation of cholesterol treatment, adults meeting the guidelines' eligibility criteria as well as adults who were currently taking cholesterol-lowering medication were assessed as a group. Overall, 36.7% of U.S. adults, or 78.1 million persons aged ≥21 years, were on or eligible for cholesterol treatment. Within this group, 55.5% were currently taking cholesterol-lowering medication, and 46.6% reported making lifestyle modifications, such as exercising, changing their diet, or controlling their weight, to lower cholesterol; 37.1% reported making lifestyle modifications and taking medication, and 35.5% reported doing neither. 
Among adults on or eligible for cholesterol-lowering medication, the proportion taking cholesterol-lowering medication was higher for women than men and for non-Hispanic whites (whites) than Mexican-Americans and non-Hispanic blacks (blacks). Further efforts by clinicians and public health practitioners are needed to implement complementary and targeted patient education and disease management programs to reduce sex and racial/ethnic disparities among adults eligible for treatment of cholesterol. |
Health-care access among adults with epilepsy: The U.S. National Health Interview Survey, 2010 and 2013
Thurman DJ , Kobau R , Luo YH , Helmers SL , Zack MM . Epilepsy Behav 2015 55 184-8 INTRODUCTION: Community-based and other epidemiologic studies within the United States have identified substantial disparities in health care among adults with epilepsy. However, few data analyses addressing their health-care access are representative of the entire United States. This study aimed to examine national survey data about adults with epilepsy and to identify barriers to their health care. MATERIALS AND METHODS: We analyzed data from U.S. adults in the 2010 and the 2013 National Health Interview Surveys, multistage probability samples with supplemental questions on epilepsy. We defined active epilepsy as a history of physician-diagnosed epilepsy either currently under treatment or accompanied by seizures during the preceding year. We employed SAS-callable SUDAAN software to obtain weighted estimates of population proportions and rate ratios (RRs) adjusted for sex, age, and race/ethnicity. RESULTS: Compared to adults reporting no history of epilepsy, adults reporting active epilepsy were significantly more likely to be insured under Medicaid (RR=3.58) and less likely to have private health insurance (RR=0.58). Adults with active epilepsy were also less likely to be employed (RR=0.53) and much more likely to report being disabled (RR=6.14). They experienced greater barriers to health-care access, including an inability to afford medication (RR=2.40), mental health care (RR=3.23), eyeglasses (RR=2.36), or dental care (RR=1.98), and were more likely to report transportation as a barrier to health care (RR=5.28). CONCLUSIONS: These reported substantial disparities in, and barriers to, access to health care for adults with active epilepsy are amenable to intervention. |
The 1994-1995 National Health Interview Survey on Disability (NHIS-D): A bibliography of 20 years of research
Ward BW , Ridolfo H , Creamer L , Gray C . Rev Disabil Stud 2015 11 (2) 1-22 The 1994-1995 National Health Interview Survey on Disability (NHIS-D) has been one of the most unique and important data sources for studying disability, impairment, and health in the United States. In celebration of the NHIS-D's twenty-year anniversary, we created an extensive bibliography (n=212) of research that has used these data. |
CDC Grand Rounds: Prevention and control of skin cancer
Watson M , Thomas CC , Massetti GM , McKenna S , Gershenwald JE , Laird S , Iskander J , Lushniak B . MMWR Morb Mortal Wkly Rep 2015 64 (47) 1312-4 Skin cancer is the most common cancer in the United States, and most cases are preventable. Persons with certain genetic risk factors, including having a lighter natural skin color; blue or green eyes; red or blonde hair; dysplastic nevi or a large number of common moles; and skin that burns, freckles, or reddens easily or becomes painful after time in the sun, have increased risk for skin cancer. Persons with a family or personal history of skin cancer, especially melanoma, are also at increased risk. Although these genetic factors contribute to individual risk, most skin cancers are also strongly associated with ultraviolet (UV) radiation exposure. Most UV exposure comes from the sun, although some persons use UV-emitting indoor tanning devices (e.g., beds, booths, and lamps). |
Risk factors for transmission of tuberculosis among United States-born African Americans and Whites
Pagaoa MA , Royce RA , Chen MP , Golub JE , Davidow AL , Hirsch-Moverman Y , Marks SM , Teeter LD , Thickstun PM , Katz DJ . Int J Tuberc Lung Dis 2015 19 (12) 1485-92 SETTING: Tuberculosis (TB) patients and their contacts enrolled in nine states and the District of Columbia from 16 December 2009 to 31 March 2011. OBJECTIVE: To evaluate characteristics of TB patients that are predictive of tuberculous infection in their close contacts. DESIGN: The study population was enrolled from a list of eligible African-American and White TB patients from the TB registry at each site. Information about close contacts was abstracted from the standard reports of each site. RESULTS: Close contacts of African-American TB patients had twice the risk of infection of contacts of White patients (adjusted risk ratio [aRR] 2.1, 95%CI 1.3-3.4). Close contacts of patients whose sputum was positive for acid-fast bacilli on sputum smear microscopy had 1.6 times the risk of tuberculous infection compared to contacts of smear-negative patients (95%CI 1.1-2.3). TB patients with longer (>3 months) estimated times to diagnosis did not have higher proportions of infected contacts (aRR 1.2, 95%CI 0.9-1.6). CONCLUSION: African-American race and sputum smear positivity were predictive of tuberculous infection in close contacts. This study did not support previous findings that longer estimated time to diagnosis predicted tuberculous infection in contacts. |
Surveillance for Ebola virus in wildlife, Thailand
Wacharapluesadee S , Olival KJ , Kanchanasaka B , Duengkae P , Kaewchot S , Srongmongkol P , Ieamsaard G , Maneeorn P , Sittidetboripat N , Kaewpom T , Petcharat S , Yingsakmongkon S , Rollin PE , Towner JS , Hemachudha T . Emerg Infect Dis 2015 21 (12) 2271-3 Active surveillance for zoonotic pathogens in wildlife is particularly critical when the pathogen has the potential to cause a large-scale outbreak. The recent outbreak of Ebola virus (EBOV) disease in West Africa in 2014 was initiated by a single spillover event, followed by human-to-human transmission (1). Projection of filovirus ecologic niches suggests possible areas of distribution in Southeast Asia (2). Reston virus was discovered in macaques exported from the Philippines to the United States in 1989 and in sick domestic pigs in the Philippines in 2008 (with asymptomatic infection in humans) (3). Dead insectivorous bats in Europe were found to be infected by a filovirus, similar to other members of the genus Ebolavirus (4). | Although EBOV has historically been viewed as a virus from Africa, recent studies found that bat populations in Bangladesh and China contain antibodies against EBOV and Reston virus recombinant proteins, which suggests that EBOVs are widely distributed throughout Asia (5,6). Thus, an outbreak in Asian countries free of EBOV diseases may not only be caused by importation of infected humans and/or wildlife from Africa but may arise from in-country filovirus–infected wildlife. Serologic and molecular evidence for filoviruses suggests that members of the order Chiroptera (bats) may be their natural reservoir (7). |
Sustainable HIV treatment in Africa through viral-load-informed differentiated care
Phillips A , Shroufi A , Vojnov L , Cohn J , Roberts T , Ellman T , Bonner K , Rousseau C , Garnett G , Cambiano V , Nakagawa F , Ford D , Bansi-Matharu L , Miners A , Lundgren JD , Eaton JW , Parkes-Ratanshi R , Katz Z , Maman D , Ford N , Vitoria M , Doherty M , Dowdy D , Nichols B , Murtagh M , Wareham M , Palamountain KM , Chakanyuka Musanhu C , Stevens W , Katzenstein D , Ciaranello A , Barnabas R , Braithwaite RS , Bendavid E , Nathoo KJ , van de Vijver D , Wilson DP , Holmes C , Bershteyn A , Walker S , Raizes E , Jani I , Nelson LJ , Peeling R , Terris-Prestholt F , Murungu J , Mutasa-Apollo T , Hallett TB , Revill P . Nature 2015 528 (7580) S68-76 There are inefficiencies in current approaches to monitoring patients on antiretroviral therapy in sub-Saharan Africa. Patients typically attend clinics every 1 to 3 months for clinical assessment. The clinic costs are comparable with the costs of the drugs themselves and CD4 counts are measured every 6 months, but patients are rarely switched to second-line therapies. To ensure sustainability of treatment programmes, a transition to more cost-effective delivery of antiretroviral therapy is needed. In contrast to the CD4 count, measurement of the level of HIV RNA in plasma (the viral load) provides a direct measure of the current treatment effect. Viral-load-informed differentiated care is a means of tailoring care so that those with suppressed viral load visit the clinic less frequently and attention is focussed on those with unsuppressed viral load to promote adherence and timely switching to a second-line regimen. The most feasible approach to measuring viral load in many countries is to collect dried blood spot samples for testing in regional laboratories; however, there have been concerns over the sensitivity and specificity of this approach to define treatment failure and the delay in returning results to the clinic. 
We use modelling to synthesize evidence and evaluate the cost-effectiveness of viral-load-informed differentiated care, accounting for limitations of dried blood sample testing. We find that viral-load-informed differentiated care using dried blood sample testing is cost-effective and is a recommended strategy for patient monitoring, although further empirical evidence as the approach is rolled out would be of value. We also explore the potential benefits of point-of-care viral load tests that may become available in the future. |
Trachoma and yaws: Common ground?
Solomon AW , Marks M , Martin DL , Mikhailov A , Flueckiger RM , Mitja O , Asiedu K , Jannin J , Engels D , Mabey DC . PLoS Negl Trop Dis 2015 9 (12) e0004071 Trachoma is an important cause of blindness. The causative organism is an intracellular bacterium, Chlamydia trachomatis, which is susceptible to single-dose azithromycin [1]. A World Health Organization (WHO)-led program aims to eliminate trachoma as a public health problem globally by 2020 [2]. Yaws is a cause of skin, bone, and cartilage disease. The causative organism is a spirochaete bacterium, Treponema pallidum ssp. pertenue, which is susceptible to single-dose azithromycin [3]. A WHO-led program aims to eradicate yaws globally by 2020 [4]. | These diseases are both found in hard-to-reach populations—they affect the poorest people living in the most remote areas of the countries where they’re found—and have some apparent similarity in the methods recommended to counter them. Maximum synergy between programs is possible only if the two diseases affect the same communities, and if program goals permit alignment of work. Trachoma’s elimination as a public health problem means “the reduction of disease incidence, prevalence, morbidity or mortality to a locally acceptable level as a result of deliberate efforts” [5], whereas yaws eradication requires “permanent reduction to zero of the worldwide incidence of infection caused by a specific agent as a result of deliberate efforts” [5]—a quite different goal. This symposium reviews the extent to which the epidemiologies of and management strategies for these diseases actually overlap, to determine areas for mutually beneficial collaboration. |
Use of capture-recapture to estimate underreporting of Ebola virus disease, Montserrado County, Liberia
Gignoux E , Idowu R , Bawo L , Hurum L , Sprecher A , Bastard M , Porten K . Emerg Infect Dis 2015 21 (12) 2265-7 Underreporting of cases during a large outbreak of disease is not without precedent (1–5). Health systems in West Africa were ill-prepared for the arrival of Ebola virus disease (Ebola) (6). The Ebola outbreak in Liberia was declared on March 31, 2014, and peaked in September 2014. However, by mid-June, the outbreak had reached Montserrado County, where the capital, Monrovia, is located. In response, the Liberia Ministry of Health and Social Welfare (MOHSW) created a National Ebola Hotline: upon receipt of a call, a MOHSW case investigation team was dispatched to the site of the possible case. Additionally, persons could seek care at an Ebola Treatment Unit (ETU) or be referred to an ETU by another health care facility. During June 1–August 14, 2014, MOHSW, Médecins Sans Frontières, and the US nongovernment organization Samaritan’s Purse managed 3 ETUs in Montserrado County, including 2 in Monrovia operated by Eternal Love Winning Africa (ELWA). | In August 2014, to assess the extent of underreporting in the midst of the Ebola outbreak, we analyzed 2 sources of data collected during June 1–August 14. The first comprised data collected by MOHSW case investigation teams. These data were collected on MOHSW case forms and entered into a database emulating these forms using Epi Info version 7 software (Centers for Disease Control and Prevention, Atlanta, GA, USA). The second data source (designed on Excel 2003; Microsoft, Redmond, WA, USA) comprised data on all patients admitted to the 2 ELWA ETUs (ELWA1 and ELWA2). We used a capture–recapture (CRC) approach. |
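The two-source capture–recapture approach described above is conventionally implemented with the Lincoln–Petersen estimator (Chapman's bias-corrected variant). The sketch below shows the core calculation; all counts are illustrative placeholders, not the study's actual data.

```python
def chapman_estimate(n1, n2, m):
    """Estimate total cases from two overlapping surveillance lists.

    n1: cases on list 1 (e.g., case investigation records)
    n2: cases on list 2 (e.g., treatment unit admissions)
    m:  matched cases appearing on both lists
    """
    # Chapman's nearly unbiased variant of Lincoln-Petersen
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Illustrative counts: 400 investigated cases, 300 ETU admissions, 150 matched
total = chapman_estimate(400, 300, 150)
reported = 400 + 300 - 150              # unique cases actually observed
underreporting = 1 - reported / total   # share of cases never reported
print(round(total), round(underreporting, 2))  # → 798 0.31
```

The estimator assumes the two lists are independent and the population is closed over the study period; violations of either assumption bias the estimate, which is one reason CRC results are usually reported with wide uncertainty intervals.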
National studies as a component of the World Health Organization initiative to estimate the global and regional burden of foodborne disease
Lake RJ , Devleesschauwer B , Nasinyama G , Havelaar AH , Kuchenmuller T , Haagsma JA , Jensen HH , Jessani N , Maertens de Noordhout C , Angulo FJ , Ehiri JE , Molla L , Agaba F , Aungkulanon S , Kumagai Y , Speybroeck N . PLoS One 2015 10 (12) e0140319 BACKGROUND: The World Health Organization (WHO) initiative to estimate the global burden of foodborne diseases established the Foodborne Diseases Burden Epidemiology Reference Group (FERG) in 2007. In addition to global and regional estimates, the initiative sought to promote actions at a national level. This involved capacity building through national foodborne disease burden studies, and encouragement of the use of burden information in setting evidence-informed policies. To address these objectives a FERG Country Studies Task Force was established and has developed a suite of tools and resources to facilitate national burden of foodborne disease studies. This paper describes the process and lessons learned during the conduct of pilot country studies under the WHO FERG initiative. FINDINGS: Pilot country studies were initiated in Albania, Japan and Thailand in 2011 and in Uganda in 2012. A brief description of each study is provided. The major scientific issue is a lack of data, particularly in relation to disease etiology, and attribution of disease burden to foodborne transmission. Situation analysis, knowledge translation, and risk communication to achieve evidence-informed policies require specialist expertise and resources. CONCLUSIONS: The FERG global and regional burden estimates will greatly enhance the ability of individual countries to fill data gaps and generate national estimates to support efforts to reduce the burden of foodborne disease. |
A population-based acute meningitis and encephalitis syndromes surveillance in Guangxi, China, May 2007- June 2012
Xie Y , Tan Y , Chongsuvivatwong V , Wu X , Bi F , Hadler SC , Jiraphongsa C , Sornsrivichai V , Lin M , Quan Y . PLoS One 2015 10 (12) e0144366 OBJECTIVES: Acute meningitis and encephalitis (AME) are common diseases with the main pathogens being viruses and bacteria. As specific treatments differ, it is important to develop clinical prediction rules to distinguish aseptic from bacterial or fungal infection. In this study we evaluated the incidence rates, seasonal variation and the main etiologic agents of AME, and identified factors that could be used to predict the etiologic agents. METHODS: A population-based AME syndrome surveillance system was set up in Guigang City, Guangxi, involving 12 hospitals serving the study communities. All patients meeting the case definition were investigated. Blood and/or cerebrospinal fluid were tested for bacterial pathogens using culture or RT-PCR and serological tests for viruses using enzyme-linked immunosorbent assays. Laboratory testing variables were grouped using factor analysis. Multinomial logistic regression was used to predict the etiology of AME. RESULTS: From May 2007 to June 2012, the annual incidence rates of AME syndrome overall, and of disease specifically caused by Japanese encephalitis (JE), other viruses, bacteria and fungi, were 12.55, 0.58, 4.57, 0.45 and 0.14 per 100,000 population, respectively. The top three identified viral etiologic agents were enterovirus, mumps virus, and JE virus, and for bacteria/fungi were Streptococcus sp., Cryptococcus neoformans and Staphylococcus sp. JE and other viral infections affected younger populations, and incidence peaked from April to August. Alteration of consciousness and leukocytosis were more likely to be caused by JE, bacteria and fungi, whereas CSF inflammation was associated with bacterial/fungal infection. 
CONCLUSIONS: With limited predictive validity of symptoms and signs and routine laboratory tests, specific tests for JE virus, mumps virus and enteroviruses are required to evaluate the immunization impact and plan for further intervention. CSF bacterial culture cannot be omitted in guiding clinical decisions regarding patient treatment. |
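Annual rates such as "12.55 per 100,000 population" come from a straightforward person-time calculation; a minimal sketch, where the surveillance population size and case count below are assumed figures chosen only to illustrate the arithmetic:

```python
def incidence_per_100k(cases, population, years):
    """Annualized incidence rate per 100,000 population."""
    person_years = population * years
    return cases / person_years * 100_000

# Illustrative: 3,200 AME cases over 5.1 years of surveillance
# in an assumed catchment population of 5,000,000
rate = incidence_per_100k(3200, 5_000_000, 5.1)
print(round(rate, 2))  # → 12.55
```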
Environmental transmission of typhoid fever in an urban slum
Akullian A , Ng'eno E , Matheson AI , Cosmas L , Macharia D , Fields B , Bigogo G , Mugoh M , John-Stewart G , Walson JL , Wakefield J , Montgomery JM . PLoS Negl Trop Dis 2015 9 (12) e0004212 BACKGROUND: Enteric fever due to Salmonella Typhi (typhoid fever) occurs in urban areas with poor sanitation. While direct fecal-oral transmission is thought to be the predominant mode of transmission, recent evidence suggests that indirect environmental transmission may also contribute to disease spread. METHODS: Data from a population-based infectious disease surveillance system (28,000 individuals followed biweekly) were used to map the spatial pattern of typhoid fever in Kibera, an urban informal settlement in Nairobi, Kenya, between 2010-2011. Spatial modeling was used to test whether variations in topography and accumulation of surface water explain the geographic patterns of risk. RESULTS: Among children less than ten years of age, risk of typhoid fever was geographically heterogeneous across the study area (p = 0.016) and was positively associated with lower elevation, OR = 1.87, 95% CI (1.36-2.57), p <0.001. In contrast, the risk of typhoid fever did not vary geographically or with elevation among individuals ten years of age and older. CONCLUSIONS: Our results provide evidence of indirect, environmental transmission of typhoid fever among children, a group with high exposure to fecal pathogens in the environment. Spatially targeting sanitation interventions may decrease enteric fever transmission. |
Expanding human immunodeficiency virus testing and counseling to reach tuberculosis clients' partners and families
Courtenay-Quirk C , Date A , Bachanas P , Baggaley R , Getahun H , Nelson L , Granich R . Int J Tuberc Lung Dis 2015 19 (12) 1414-6 Recent years have shown important increases in human immunodeficiency virus (HIV) testing and counseling (HTC), diagnosis, and coverage of antiretroviral therapy (ART) among HIV-infected tuberculosis (TB) patients. Expansion of HTC for partners and families is a critical next step to increase earlier HIV diagnoses and access to ART, and to achieve international goals for reduced TB and HIV-related morbidity, mortality, transmission and costs. TB and HIV programs should develop and evaluate feasible and effective strategies to increase access to HTC among the partners and families of TB patients, and ensure that newly diagnosed people living with HIV and HIV-infected TB patients who complete anti-tuberculosis treatment are successfully linked to ongoing HIV clinical care. |
Incidence and characteristics of early childhood wheezing, Dhaka, Bangladesh, 2004-2010
Dawood FS , Fry AM , Goswami D , Sharmeen A , Nahar K , Anjali BA , Rahman M , Brooks WA . Pediatr Pulmonol 2015 51 (6) 588-95 BACKGROUND: Early childhood wheezing substantially impacts quality of life in high-income countries, but data are sparse on early childhood wheezing in low-income countries. We estimate wheezing incidence, describe wheezing phenotypes, and explore the contribution of respiratory viral illnesses among children aged <5 years in urban Bangladesh. METHODS: During 2004-2010, respiratory illness surveillance was conducted through weekly home visits. Children with fever or respiratory illness were referred for examination by study physicians including lung auscultation. During 2005-2007, every fifth referred child had nasal washes tested for human metapneumovirus, respiratory syncytial viruses, and influenza and parainfluenza viruses. RESULTS: During April 2004-July 2010, 23,609 children were enrolled in surveillance. Of these, 11,912 (50%) were male, median age at enrollment was 20 months (IQR 5-38), and 4,711 (20%) had ≥1 wheezing episode accounting for 8,901 episodes (733 [8%] associated with hospitalization); 25% wheezed at <1 year of age. Among children aged <5 years, incidences of wheezing and wheezing hospitalizations were 2,335/10,000 and 192/10,000 child-years. Twenty-eight percent had recurrent wheezing. Recurrent versus non-recurrent wheezing episodes were more likely to be associated with oxygen saturation <93% (OR 6.9, 95%CI 2.8-17.3), increased work of breathing (OR 1.6, 95%CI 1.4-1.8), and hospitalization (OR 2.0, 95%CI 1.6-2.4). Respiratory viruses were detected in 66% (578/873) of episodes with testing. CONCLUSION: In urban Bangladesh, early childhood wheezing is common and largely associated with respiratory virus infections. Recurrent wheezing is associated with more severe illness and may predict children who would benefit most from closer follow-up and targeted interventions. |
Increased number of human cases of influenza virus A(H5N1) infection, Egypt, 2014-15
Refaey S , Azziz-Baumgartner E , Amin MM , Fahim M , Roguski K , Elaziz HA , Iuliano AD , Salah N , Uyeki TM , Lindstrom S , Davis CT , Eid A , Genedy M , Kandeel A . Emerg Infect Dis 2015 21 (12) 2171-3 During November 2014-April 2015, a total of 165 case-patients with influenza virus A(H5N1) infection, including 6 clusters and 51 deaths, were identified in Egypt. Among infected persons, 99% reported poultry exposure: 19% to ill poultry and 35% to dead poultry. Only 1 person reported wearing personal protective equipment while working with poultry. |
Kenyan MSM: no longer a hidden population
Sanders EJ , Jaffe H , Musyoki H , Muraguri N , Graham SM . AIDS 2015 29 Suppl 3 S195-9 In 2005, almost 25 years after the emergence of the HIV pandemic among MSM in the United States, the first substantial report of HIV and sexually transmitted infections (STIs) among a large group of MSM from Senegal was published in AIDS [1]. Although MSM received late recognition in the African HIV epidemic [2,3], Kenya was at the forefront in recognizing the vulnerabilities of this highly stigmatized population that feared legal authorities and had virtually no access to health services [4]. Numerous studies have since documented the elevated HIV/STI infection risks of African MSM, and donor responses have begun to focus on inclusion of MSM and their emerging organizations in HIV prevention and care programming in Africa [5]. Despite legal challenges and largely negative public debates [6], the Kenyan Ministry of Health and National AIDS and STI Control Programme has recognized that MSM are one of the key populations in need of urgent attention and have demonstrated their willingness to work with them [7]. | This relatively supportive environment set the stage for recruitment of MSM into a cohort study investigating the feasibility of HIV-1 vaccine research on the Kenyan coast [8]. The Key Populations Cohort studies at the Kenya Medical Research Institute–Wellcome Trust Research Program in Kilifi, now in existence for 10 years, have reported a much higher HIV-1 incidence among MSM who had sex exclusively with men than among MSM who had sex with both men and women [9]. In addition, numerous operational research studies based in this cohort have informed HIV prevention and care programming for MSM in Kenya and beyond [10–13]. In the past few years, HIV research with MSM in Kenya has expanded to several major cities. 
In Nairobi, over 1000 male sex workers, most of them MSM, have been engaged in research and provided with HIV care and prevention services [14,15], whereas counselling services targeting MSM have also been provided by the Liverpool Voluntary Counselling and Testing Programme [16]. In Kisumu, a Centers for Disease Control and Prevention-funded study of a combination prevention and care programme for up to 700 MSM started enrolment in 2015, and an additional 100 MSM will be targeted for enrolment into an HIV Prevention Trials Network study of the feasibility of engaging and retaining MSM in research at sites in South Africa, Malawi and Kenya. As a result of this increased activity, researchers in Kenya have formed an MSM health research consortium, with the aim of improving healthcare for MSM and sharing findings with the Ministry of Health. Increasingly, research with MSM is informed by the views and planned with the support of Kenyan lesbian, gay, bisexual, and transgender (LGBT) groups. In addition to tackling health challenges, these LGBT groups and their leaders aim to address human rights challenges. Clearly, MSM in Kenya are no longer a hidden population. |
A birth-cohort testing intervention identified hepatitis C virus infection among patients with few identified risks: a cross-sectional study
Southern WN , Norton B , Steinman M , DeLuca J , Drainoni ML , Smith BD , Litwin AH . BMC Infect Dis 2015 15 (1) 553 BACKGROUND: International guidelines and U.S. guidelines prior to 2012 only recommended testing for hepatitis C virus (HCV) infection among patients at risk, but adherence to guidelines is poor, and the majority of those infected remain undiagnosed. A strategy to perform one-time testing of all patients born during 1945-1965, birth cohort testing, may diagnose HCV infection among patients whose risk remains unknown. We sought to determine if a birth-cohort testing intervention for HCV antibody positivity helped identify patients with fewer documented risk factors or medical indications than a pre-intervention, risk-based testing strategy. METHODS: We used a cross-sectional design with retrospective electronic medical record review to examine patients identified with HCV antibody positivity (Ab+) during a pre-intervention (risk-based) phase, the standard of care at the time, vs. a birth-cohort testing intervention phase. We compared demographic and clinical characteristics and HCV risk-associated factors among patients whose HCV Ab+ was identified during the pre-intervention (risk-based testing) vs. post birth-cohort intervention phases. Study subjects were patients identified as HCV Ab+ in the baseline (risk-based) and birth-cohort testing phases of the Hepatitis C Assessment and Testing (HepCAT) Project. RESULTS: Compared to the risk-based phase, patients newly diagnosed with HCV Ab+ after the birth-cohort intervention were significantly less likely to have a history of any substance abuse (30.5% vs. 49.5%, p = 0.02), elevated alanine transaminase levels of > 40 U/L (22.0% vs. 46.7%, p = 0.002), or the composite any risk-associated factor (55.9% vs. 79.0%, p = 0.002). 
CONCLUSIONS: Birth-cohort testing is a useful strategy for identifying previously undiagnosed HCV Ab+ because it does not require providers to ask risk-based questions or patients to disclose risk behaviors, and it appears to identify HCV Ab+ in patients who would not have been identified using a risk-based testing strategy. |
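Group comparisons like those reported in this study (e.g., 30.5% vs. 49.5% with a substance-abuse history, p = 0.02) are consistent with a standard two-proportion z-test. A minimal sketch follows; the group sizes (59 and 105) and counts are assumptions chosen only so the proportions match the reported percentages, not the study's actual denominators.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test using the pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# Illustrative counts: 18/59 (30.5%) vs. 52/105 (49.5%)
z, p = two_proportion_z(18, 59, 52, 105)
print(round(z, 2), round(p, 3))
```

With these assumed denominators the test returns p ≈ 0.02, in line with the significance level quoted in the abstract; the actual analysis may have used a chi-square or exact test, which give similar results at these sample sizes.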
The burden of influenza-associated hospitalizations in Oman, January 2008-June 2013
Al-Awaidy S , Hamid S , Al Obaidani I , Al Baqlani S , Al Busaidi S , Bawikar S , El-Shoubary W , Dueger EL , Said MM , Elamin E , Shah P , Talaat M . PLoS One 2015 10 (12) e0144186 INTRODUCTION: Acute respiratory infections (ARI), including influenza, comprise a leading cause of morbidity and mortality worldwide. Influenza surveillance provides important information to inform policy on influenza control and vaccination. While the epidemiology of influenza has been well characterized in western countries, few data exist on influenza epidemiology in the Eastern Mediterranean Region. We describe the epidemiology of influenza virus in Oman. METHODS: Using syndromic case definitions and protocols, patients from four regional hospitals in Oman were enrolled in a descriptive prospective study to characterize the burden of severe acute respiratory infections (SARI) and influenza. Eligible patients provided demographic information as well as oropharyngeal (OP) and nasopharyngeal (NP) swabs. Specimens were tested for influenza A and influenza B; influenza A viruses were subtyped using RT-PCR. RESULTS: From January 2008 through June 2013, a total of 5,147 cases were enrolled and tested for influenza. Influenza strains were detected in 8% of cases for whom samples were available. Annual incidence rates ranged from 0.5 to 15.4 cases of influenza-associated SARI per 100,000 population. The median age of influenza patients was 6 years with children 0-2 years accounting for 34% of all influenza-associated hospitalizations. By contrast, the median age of non-influenza SARI cases was 1 year with children 0-2 years comprising 59% of SARI. Compared to non-influenza SARI cases, a greater proportion of influenza cases had pre-existing chronic conditions and underwent ventilation during hospitalization. CONCLUSIONS: Influenza virus is associated with a substantial proportion of SARI in Oman. 
Influenza in Oman approximately follows northern hemisphere seasonality, with major peaks in October to December and a lesser peak around April. The burden of influenza was greatest in children and the elderly. Future efforts should examine the burden of influenza in other potential risk groups such as pregnant women to inform interventions including targeted vaccination. |
Combination of pegylated interferon and tenofovir for hepatitis B treatment: Screening and counseling of patients are warranted
McMahon BJ . Gastroenterology 2015 150 (1) 32-4 Chronic hepatitis B virus (HBV) infection is a significant cause of global morbidity and mortality, primarily owing to liver cirrhosis and hepatocellular carcinoma (HCC). There are an estimated 240 million persons worldwide with chronic HBV.1 Universal hepatitis B vaccination is recommended for all newborns and infants, and already the incidence of HCC in children has significantly decreased in Taiwan and dropped to zero in Alaska, where universal infant and catch-up vaccination was begun in the 1980s.2, 3 However, it will take several decades for childhood vaccination to decrease the incidence of these adverse liver outcomes in older adults, in whom most cases occur. Currently there are 2 classes of medications for the treatment of HBV: the nucleoside/nucleotide analogues (reverse transcriptase inhibitors) and the interferons (IFN), neither of which results in a “cure” of HBV infection. In this edition of Gastroenterology, a consortium of investigators from 139 sites in 19 countries present the results of an open-label, active-controlled study of 740 patients with chronic HBV who were randomly assigned to 1 of 4 treatment regimens: tenofovir disoproxil fumarate (TDF) plus pegylated alpha-2 IFN (PEG-IFN) for 48 weeks, TDF plus PEG-IFN for 16 weeks followed by TDF for 32 weeks, TDF alone for 120 weeks, or PEG-IFN alone for 48 weeks.2, 4 |
Coexistence of Bartonella henselae and B. clarridgeiae in populations of cats and their fleas in Guatemala.
Bai Y , Rizzo MF , Alvarez D , Moran D , Peruski LF , Kosoy M . J Vector Ecol 2015 40 (2) 327-32 Cats and their fleas collected in Guatemala were investigated for the presence of Bartonella infections. Bartonella bacteria were cultured from 8.2% (13/159) of cats, and all cultures were identified as B. henselae. Molecular analysis detected Bartonella DNA in 33.8% (48/142) of cats and in 22.4% (34/152) of cat fleas using gltA, nuoG, and 16S-23S internal transcribed spacer targets. Two Bartonella species, B. henselae and B. clarridgeiae, were identified in cats and cat fleas by molecular analysis, with B. henselae more common than B. clarridgeiae in the cats (68.1% [32/47] vs 31.9% [15/47]). The nuoG target was less sensitive for detecting B. clarridgeiae than the other molecular targets, detecting only two of the 15 B. clarridgeiae-infected cats. No significant differences in prevalence were observed between male and female cats or between age groups. No evident association was observed between the presence of Bartonella species in cats and in their fleas. |
Marine harmful algal blooms, human health and wellbeing: challenges and opportunities in the 21st century
Berdalet E , Fleming LE , Gowen R , Davidson K , Hess P , Backer LC , Moore SK , Hoagland P , Enevoldsen H . J Mar Biol Assoc U K 2015 2015 Microalgal blooms are a natural part of the seasonal cycle of photosynthetic organisms in marine ecosystems. They are key components of the structure and dynamics of the oceans and thus sustain the benefits that humans obtain from these aquatic environments. However, some microalgal blooms can cause harm to humans and other organisms. These harmful algal blooms (HABs) have direct impacts on human health and negative influences on human wellbeing, mainly through their consequences to coastal ecosystem services (fisheries, tourism and recreation) and other marine organisms and environments. HABs are natural phenomena, but these events can be favoured by anthropogenic pressures in coastal areas. Global warming and associated changes in the oceans could affect HAB occurrences and toxicity as well, although forecasting the possible trends is still speculative and requires intensive multidisciplinary research. At the beginning of the 21st century, with expanding human populations, particularly in coastal and developing countries, mitigating HAB impacts on human health and wellbeing is becoming a more pressing public health need. The available tools to address this global challenge include maintaining intensive, multidisciplinary and collaborative scientific research, and strengthening coordination with stakeholders, policymakers and the general public. Here we provide an overview of different aspects of HAB phenomena, an important element of the intrinsic links between oceans and human health and wellbeing. |
Melting barriers to faunal exchange across ocean basins
McKeon CS , Weber MX , Alter SE , Seavy NE , Crandall ED , Barshis DJ , Fechter-Leggett ED , Oleson KL . Glob Chang Biol 2015 22 (2) 465-73 Accelerated loss of sea ice in the Arctic is opening routes connecting the Atlantic and Pacific Oceans for longer periods each year. These changes may increase the ease and frequency with which marine birds and mammals move between the Pacific and Atlantic Ocean basins. Indeed, recent observations of birds and mammals suggest these movements have intensified in recent decades. Reconnection of the Pacific and Atlantic Ocean basins will present both challenges to marine ecosystem conservation and an unprecedented opportunity to examine the ecological and evolutionary consequences of interoceanic faunal exchange in real time. To understand these changes and implement effective conservation of marine ecosystems, we need to further develop modeling efforts to predict the rate of dispersal and consequences of faunal exchange. These predictions can be tested by closely monitoring wildlife dispersal through the Arctic Ocean and using modern methods to explore the ecological and evolutionary consequences of these movements. |
Environmental predictors of US county mortality patterns on a national basis
Chan MP , Weinhold RS , Thomas R , Gohlke JM , Portier CJ . PLoS One 2015 10 (12) e0137832 A growing body of evidence has found that mortality rates are positively correlated with social inequalities, air pollution, elevated ambient temperature, availability of medical care and other factors. This study develops a model to predict mortality rates for different diseases by county across the US. The model is applied to predict changes in mortality caused by changing environmental factors. A total of 3,110 counties in the US, excluding Alaska and Hawaii, were studied. A subset of 519 of the 3,110 counties was chosen by systematic random sampling, and these samples were used to validate the model. Step-wise and linear regression analyses were used to estimate the ability of environmental pollutants, socio-economic factors and other factors to explain variations in county-specific mortality rates for cardiovascular diseases, cancers, chronic obstructive pulmonary disease (COPD), all causes combined and lifespan across five population density groups. The estimated models fit adequately for all mortality outcomes in all population density groups and adequately predicted risks for the 519 validation counties. This study suggests that, at the county level, average ozone (0.07 ppm) is the most important environmental predictor of mortality. The analysis also illustrates the complex inter-relationships of multiple factors that influence mortality and lifespan, and suggests the need for a better understanding of the pathways through which these factors, mortality, and lifespan are related at the community level. |
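The validation approach described in the abstract above (fit a regression on county-level data, then check predictions against a systematically sampled subset of counties) can be sketched as follows. The synthetic data, the sampling interval, and the single-predictor ordinary least squares fit are illustrative assumptions only, not the study's actual variables or model.

```python
import random

# Illustrative county-level data: (predictor value, mortality rate).
# Synthetic values with a known linear relationship plus noise.
random.seed(0)
counties = [(x, 2.0 * x + 1.0 + random.gauss(0, 0.1))
            for x in [random.uniform(0, 1) for _ in range(300)]]

# Systematic random sampling: random start, then every k-th county
# becomes a validation county; the rest are used for fitting.
k = 6
start = random.randrange(k)
validation = counties[start::k]
training = [c for i, c in enumerate(counties) if (i - start) % k != 0]

# Ordinary least squares for a single predictor (closed form).
n = len(training)
mx = sum(x for x, _ in training) / n
my = sum(y for _, y in training) / n
slope = (sum((x - mx) * (y - my) for x, y in training)
         / sum((x - mx) ** 2 for x, _ in training))
intercept = my - slope * mx

# Check predictive adequacy on the held-out counties.
errors = [abs((intercept + slope * x) - y) for x, y in validation]
print(slope, intercept, max(errors))
```

Because the held-out counties were never used in the fit, a small maximum prediction error is evidence the model generalizes, which is the role the 519 validation counties play in the study.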
Frequency of extreme heat event as a surrogate exposure metric for examining the human health effects of climate change
Romeo Upperman C , Parker J , Jiang C , He X , Murtugudde R , Sapkota A . PLoS One 2015 10 (12) e0144202 Epidemiological investigation of the impact of climate change on human health, particularly chronic diseases, is hindered by the lack of exposure metrics that can serve as markers of climate change and are compatible with health data. Here, we present a surrogate exposure metric created using a 30-year baseline (1960-1989) that allows users to quantify long-term changes in exposure to the frequency of extreme heat events, with nearly unabridged spatial coverage, at a scale that is compatible with national/state health outcome data. We evaluate the exposure metric by decade, seasonality, and area of the country, and assess its ability to capture long-term changes in weather (climate), including natural climate modes. Our findings show that this generic exposure metric is potentially useful for monitoring trends in the frequency of extreme heat events across varying regions because it captures long-term changes; is sensitive to natural climate modes (ENSO events); responds well to spatial variability; and is amenable to spatial/temporal aggregation, making it useful for epidemiological studies. |
World Health Organization estimates of the global and regional disease burden of 11 foodborne parasitic diseases, 2010: a data synthesis
Torgerson PR , Devleesschauwer B , Praet N , Speybroeck N , Willingham AL , Kasuga F , Rokni MB , Zhou XN , Fevre EM , Sripa B , Gargouri N , Furst T , Budke CM , Carabin H , Kirk MD , Angulo FJ , Havelaar A , de Silva N . PLoS Med 2015 12 (12) e1001920 BACKGROUND: Foodborne diseases are globally important, resulting in considerable morbidity and mortality. Parasitic diseases often result in high burdens of disease in low and middle income countries and are frequently transmitted to humans via contaminated food. This study presents the first estimates of the global and regional human disease burden of 10 helminth diseases and toxoplasmosis that may be attributed to contaminated food. METHODS AND FINDINGS: Data were abstracted from 16 systematic reviews or similar studies published between 2010 and 2015; from 5 disease databases accessed in 2015; and from 79 reports, 73 of which have been published since 2000, 4 published between 1995 and 2000, and 2 published in 1986 and 1981. These included reports from national surveillance systems, journal articles, and national estimates of foodborne diseases. These data were used to estimate the number of infections, sequelae, deaths, and Disability Adjusted Life Years (DALYs), by age and region for 2010. These parasitic diseases resulted in 48.4 million cases (95% uncertainty interval [UI] 43.4-79.0 million) and 59,724 (95% UI 48,017-83,616) deaths annually, resulting in 8.78 million (95% UI 7.62-12.51 million) DALYs. We estimated that 48% (95% UI 38%-56%) of cases of these parasitic diseases were foodborne, resulting in 76% (95% UI 65%-81%) of the DALYs attributable to these diseases. Overall, foodborne parasitic disease, excluding enteric protozoa, caused an estimated 23.2 million (95% UI 18.2-38.1 million) cases and 45,927 (95% UI 34,763-59,933) deaths annually, resulting in an estimated 6.64 million (95% UI 5.61-8.41 million) DALYs.
Foodborne Ascaris infection (12.3 million cases, 95% UI 8.29-22.0 million) and foodborne toxoplasmosis (10.3 million cases, 95% UI 7.40-14.9 million) were the most common foodborne parasitic diseases. Human cysticercosis with 2.78 million DALYs (95% UI 2.14-3.61 million), foodborne trematodosis with 2.02 million DALYs (95% UI 1.65-2.48 million) and foodborne toxoplasmosis with 825,000 DALYs (95% UI 561,000-1.26 million) resulted in the highest burdens in terms of DALYs, mainly due to years lived with disability. Foodborne enteric protozoa, reported elsewhere, resulted in an additional 67.2 million illnesses and 492,000 DALYs. Major limitations of our study include often substantial data gaps that had to be filled by imputation, with the attendant uncertainties of such models. Due to resource limitations it was also not possible to consider all potentially foodborne parasites (for example Trypanosoma cruzi). CONCLUSIONS: Parasites are frequently transmitted to humans through contaminated food. These estimates represent an important step forward in understanding the impact of foodborne diseases globally and regionally. The disease burden due to most foodborne parasites is highly focal and results in significant morbidity and mortality among vulnerable populations. |
World Health Organization estimates of the global and regional disease burden of 22 foodborne bacterial, protozoal, and viral diseases, 2010: a data synthesis
Kirk MD , Pires SM , Black RE , Caipo M , Crump JA , Devleesschauwer B , Dopfer D , Fazil A , Fischer-Walker CL , Hald T , Hall AJ , Keddy KH , Lake RJ , Lanata CF , Torgerson PR , Havelaar AH , Angulo FJ . PLoS Med 2015 12 (12) e1001921 BACKGROUND: Foodborne diseases are important worldwide, resulting in considerable morbidity and mortality. To our knowledge, we present the first global and regional estimates of the disease burden of the most important foodborne bacterial, protozoal, and viral diseases. METHODS AND FINDINGS: We synthesized data on the number of foodborne illnesses, sequelae, deaths, and Disability Adjusted Life Years (DALYs), for all diseases with sufficient data to support global and regional estimates, by age and region. The data sources varied by pathogen and included systematic reviews, cohort studies, surveillance studies and other burden of disease assessments. We sought relevant data circa 2010, and included sources from 1990-2012. The number of studies per pathogen ranged from 5 studies for bacterial intoxications to 494 studies for diarrheal pathogens. To estimate mortality for Mycobacterium bovis infections and morbidity and mortality for invasive non-typhoidal Salmonella enterica infections, we excluded cases attributed to HIV infection. We excluded stillbirths in our estimates. We estimate that the 22 diseases included in our study resulted in two billion (95% uncertainty interval [UI] 1.5-2.9 billion) cases, over one million (95% UI 0.89-1.4 million) deaths, and 78.7 million (95% UI 65.0-97.7 million) DALYs in 2010. To estimate the burden due to contaminated food, we then applied proportions of infections that were estimated to be foodborne from a global expert elicitation. Waterborne transmission of disease was not included.
We estimate that 29% (95% UI 23-36%) of cases caused by the diseases in our study, or 582 million (95% UI 401-922 million), were transmitted by contaminated food, resulting in 25.2 million (95% UI 17.5-37.0 million) DALYs. Norovirus was the leading cause of foodborne illness, causing 125 million (95% UI 70-251 million) cases, while Campylobacter spp. caused 96 million (95% UI 52-177 million) foodborne illnesses. Of all foodborne diseases, diarrheal and invasive infections due to non-typhoidal S. enterica resulted in the highest burden, causing 4.07 million (95% UI 2.49-6.27 million) DALYs. Regionally, DALYs per 100,000 population were highest in the African region, followed by the South East Asian region. A considerable burden of foodborne disease is borne by children less than five years of age. Major limitations of our study include data gaps, particularly in middle- and high-mortality countries, and uncertainty around the proportion of diseases that were foodborne. CONCLUSIONS: Foodborne diseases result in a large disease burden, particularly in children. Although it is known that diarrheal diseases are a major burden in children, we have demonstrated for the first time the importance of contaminated food as a cause. There is a need to focus food safety interventions on preventing foodborne diseases, particularly in low- and middle-income settings. |
World Health Organization global estimates and regional comparisons of the burden of foodborne disease in 2010
Havelaar AH , Kirk MD , Torgerson PR , Gibb HJ , Hald T , Lake RJ , Praet N , Bellinger DC , de Silva NR , Gargouri N , Speybroeck N , Cawthorne A , Mathers C , Stein C , Angulo FJ , Devleesschauwer B . PLoS Med 2015 12 (12) e1001923 Illness and death from diseases caused by contaminated food are a constant threat to public health and a significant impediment to socio-economic development worldwide. To measure the global and regional burden of foodborne disease (FBD), the World Health Organization (WHO) established the Foodborne Disease Burden Epidemiology Reference Group (FERG), which here reports its first estimates of the incidence, mortality, and disease burden due to 31 foodborne hazards. We find that the global burden of FBD is comparable to that of the major infectious diseases HIV/AIDS, malaria and tuberculosis. The most frequent causes of foodborne illness were diarrheal disease agents, particularly norovirus and Campylobacter spp. Diarrheal disease agents, especially non-typhoidal Salmonella enterica, were also responsible for the majority of deaths due to FBD. Other major causes of FBD deaths were Salmonella Typhi, Taenia solium and hepatitis A virus. The global burden of FBD caused by the 31 hazards in 2010 was 33 million Disability Adjusted Life Years (DALYs); children under five years old bore 40% of this burden. The 14 subregions, defined on the basis of child and adult mortality, had considerably different burdens of FBD, with the greatest falling on the subregions in Africa, followed by the subregions in South-East Asia and the Eastern Mediterranean D subregion. Some hazards, such as non-typhoidal S. enterica, were important causes of FBD in all regions of the world, whereas others, such as certain parasitic helminths, were highly localised. Thus, the burden of FBD is borne particularly by children under five years old (although they represent only 9% of the global population) and by people living in low-income regions of the world.
These estimates are conservative, i.e., underestimates rather than overestimates; further studies are needed to address the data gaps and limitations of the study. Nevertheless, all stakeholders can contribute to improvements in food safety throughout the food chain by incorporating these estimates into policy development at national and international levels. |
Methodological framework for World Health Organization estimates of the global burden of foodborne disease
Devleesschauwer B , Haagsma JA , Angulo FJ , Bellinger DC , Cole D , Dopfer D , Fazil A , Fevre EM , Gibb HJ , Hald T , Kirk MD , Lake RJ , Maertens de Noordhout C , Mathers CD , McDonald SA , Pires SM , Speybroeck N , Thomas MK , Torgerson PR , Wu F , Havelaar AH , Praet N . PLoS One 2015 10 (12) e0142498 BACKGROUND: The Foodborne Disease Burden Epidemiology Reference Group (FERG) was established in 2007 by the World Health Organization to estimate the global burden of foodborne diseases (FBDs). This paper describes the methodological framework developed by FERG's Computational Task Force to transform epidemiological information into FBD burden estimates. METHODS AND FINDINGS: The global and regional burden of 31 FBDs was quantified, along with limited estimates for 5 other FBDs, using Disability-Adjusted Life Years in a hazard- and incidence-based approach. To accomplish this task, the following workflow was defined: outline of disease models and collection of epidemiological data; design and completion of a database template; development of an imputation model; identification of disability weights; probabilistic burden assessment; and estimating the proportion of the disease burden by each hazard that is attributable to exposure by food (i.e., source attribution). All computations were performed in R and the different functions were compiled in the R package 'FERG'. Traceability and transparency were ensured by sharing results and methods in an interactive way with all FERG members throughout the process. CONCLUSIONS: We developed a comprehensive framework for estimating the global burden of FBDs, in which methodological simplicity and transparency were key elements. All the tools developed have been made available and can be translated into a user-friendly national toolkit for studying and monitoring food safety at the local level. |
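The hazard- and incidence-based DALY approach described in the framework above combines years of life lost (YLL) and years lived with disability (YLD), then attributes a proportion of each hazard's burden to food exposure (source attribution). The sketch below shows only that arithmetic skeleton; every number is an illustrative assumption, and it bears no relation to the actual FERG disease models, imputation model, or 'FERG' R package.

```python
def dalys(cases, deaths, life_expectancy_at_death, disability_weight, duration_years):
    """Disability-Adjusted Life Years = Years of Life Lost + Years Lived with Disability."""
    yll = deaths * life_expectancy_at_death            # fatal burden
    yld = cases * disability_weight * duration_years   # non-fatal burden
    return yll + yld

def foodborne_burden(total_dalys, foodborne_proportion):
    """Source attribution: the share of the burden attributable to food exposure."""
    return total_dalys * foodborne_proportion

# Hypothetical hazard: 100,000 cases, 50 deaths, illustrative parameters.
total = dalys(cases=100_000, deaths=50, life_expectancy_at_death=30,
              disability_weight=0.2, duration_years=0.05)
print(foodborne_burden(total, foodborne_proportion=0.4))
```

In the actual framework these inputs are distributions rather than point values, and the probabilistic burden assessment propagates their uncertainty into the reported 95% uncertainty intervals.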
Qualitative assessment for Toxoplasma gondii exposure risk associated with meat products in the United States
Guo M , Buchanan RL , Dubey JP , Hill DE , Lambertini E , Ying Y , Gamble HR , Jones JL , Pradhan AK . J Food Prot 2015 78 (12) 2207-19 Toxoplasma gondii is a global protozoan parasite capable of infecting most warm-blooded animals. Although healthy adult humans generally have no symptoms, severe illness does occur in certain groups, including congenitally infected fetuses and newborns and immunocompromised individuals such as transplant patients. Epidemiological studies have demonstrated that consumption of raw or undercooked meat products is one of the major sources of infection with T. gondii. The goal of this study was to develop a framework to qualitatively estimate the exposure risk to T. gondii from various meat products consumed in the United States. Risk estimates of various meats were analyzed by a farm-to-retail qualitative assessment that included evaluation of farm, abattoir, storage and transportation, meat processing, packaging, and retail modules. It was found that exposure risks associated with meats from free-range chickens, nonconfinement-raised pigs, goats, and lamb are higher than those from confinement-raised pigs, cattle, and caged chickens. For fresh meat products, risk at the retail level was similar to that at the farm level unless meats had been frozen or moisture enhanced. Our results showed that meat processing steps such as salting, freezing, commercial hot air drying, long fermentation times, hot smoking, and cooking are able to reduce T. gondii levels in meat products, whereas nitrite and/or nitrate, spice, low pH, and cold storage have no effect on the viability of T. gondii tissue cysts. Raw-fermented sausage, cured raw meat, meat that is not hot-air dried, and fresh processed meat were associated with higher exposure risks compared with cooked meat and frozen meat.
This study provides a reference for meat management control programs to determine critical control points and serves as the foundation for future quantitative risk assessments. |
Aetiology-specific estimates of the global and regional incidence and mortality of diarrhoeal diseases commonly transmitted through food
Pires SM , Fischer-Walker CL , Lanata CF , Devleesschauwer B , Hall AJ , Kirk MD , Duarte AS , Black RE , Angulo FJ . PLoS One 2015 10 (12) e0142927 BACKGROUND: Diarrhoeal diseases are major contributors to the global burden of disease, particularly in children. However, comprehensive estimates of the incidence and mortality due to specific aetiologies of diarrhoeal diseases are not available. The objective of this study is to provide estimates of the global and regional incidence and mortality of diarrhoeal diseases caused by nine pathogens that are commonly transmitted through foods. METHODS AND FINDINGS: We abstracted data from systematic reviews and, depending on the overall mortality rates of the country, applied either a national incidence estimate approach or a modified Child Health Epidemiology Reference Group (CHERG) approach to estimate the aetiology-specific incidence and mortality of diarrhoeal diseases, by age and region. The nine diarrhoeal diseases assessed caused an estimated 1.8 billion (95% uncertainty interval [UI] 1.1-3.3 billion) cases and 599,000 (95% UI 472,000-802,000) deaths worldwide in 2010. The largest numbers of cases were caused by norovirus (677 million; 95% UI 468-1,153 million), enterotoxigenic Escherichia coli (ETEC) (233 million; 95% UI 154-380 million), Shigella spp. (188 million; 95% UI 94-379 million) and Giardia lamblia (179 million; 95% UI 125-263 million); the largest numbers of deaths were caused by norovirus (213,515; 95% UI 171,783-266,561), enteropathogenic E. coli (121,455; 95% UI 103,657-143,348), ETEC (73,041; 95% UI 55,474-96,984) and Shigella (64,993; 95% UI 48,966-92,357). There were marked regional differences in incidence and mortality for these nine diseases. Nearly 40% of cases and 43% of deaths caused by these nine diarrhoeal diseases occurred in children under five years of age. CONCLUSIONS: Diarrhoeal diseases caused by these nine pathogens are responsible for a large disease burden, particularly in children.
These aetiology-specific burden estimates can inform efforts to reduce diarrhoeal diseases caused by these nine pathogens commonly transmitted through foods. |
Genome of Rhodnius prolixus, an insect vector of Chagas disease, reveals unique adaptations to hematophagy and parasite infection.
Mesquita RD , Vionette-Amaral RJ , Lowenberger C , Rivera-Pomar R , Monteiro FA , Minx P , Spieth J , Carvalho AB , Panzera F , Lawson D , Torres AQ , Ribeiro JM , Sorgine MH , Waterhouse RM , Montague MJ , Abad-Franch F , Alves-Bezerra M , Amaral LR , Araujo HM , Araujo RN , Aravind L , Atella GC , Azambuja P , Berni M , Bittencourt-Cunha PR , Braz GR , Calderon-Fernandez G , Carareto CM , Christensen MB , Costa IR , Costa SG , Dansa M , Daumas-Filho CR , De-Paula IF , Dias FA , Dimopoulos G , Emrich SJ , Esponda-Behrens N , Fampa P , Fernandez-Medina RD , da Fonseca RN , Fontenele M , Fronick C , Fulton LA , Gandara AC , Garcia ES , Genta FA , Giraldo-Calderon GI , Gomes B , Gondim KC , Granzotto A , Guarneri AA , Guigo R , Harry M , Hughes DS , Jablonka W , Jacquin-Joly E , Juarez MP , Koerich LB , Latorre-Estivalis JM , Lavore A , Lawrence GG , Lazoski C , Lazzari CR , Lopes RR , Lorenzo MG , Lugon MD , Majerowicz D , Marcet PL , Mariotti M , Masuda H , Megy K , Melo AC , Missirlis F , Mota T , Noriega FG , Nouzova M , Nunes RD , Oliveira RL , Oliveira-Silveira G , Ons S , Pagola L , Paiva-Silva GO , Pascual A , Pavan MG , Pedrini N , Peixoto AA , Pereira MH , Pike A , Polycarpo C , Prosdocimi F , Ribeiro-Rodrigues R , Robertson HM , Salerno AP , Salmon D , Santesmasses D , Schama R , Seabra-Junior ES , Silva-Cardoso L , Silva-Neto MA , Souza-Gomes M , Sterkel M , Taracena ML , Tojo M , Tu ZJ , Tubio JM , Ursic-Bedoya R , Venancio TM , Walter-Nuno AB , Wilson D , Warren WC , Wilson RK , Huebner E , Dotson EM , Oliveira PL . Proc Natl Acad Sci U S A 2015 112 (48) 14936-14941 Rhodnius prolixus not only has served as a model organism for the study of insect physiology, but also is a major vector of Chagas disease, an illness that affects approximately seven million people worldwide. We sequenced the genome of R. 
prolixus, generated assembled sequences covering 95% of the genome ( approximately 702 Mb), including 15,456 putative protein-coding genes, and completed comprehensive genomic analyses of this obligate blood-feeding insect. Although immune-deficiency (IMD)-mediated immune responses were observed, R. prolixus putatively lacks key components of the IMD pathway, suggesting a reorganization of the canonical immune signaling network. Although both Toll and IMD effectors controlled intestinal microbiota, neither affected Trypanosoma cruzi, the causal agent of Chagas disease, implying the existence of evasion or tolerance mechanisms. R. prolixus has experienced an extensive loss of selenoprotein genes, with its repertoire reduced to only two proteins, one of which is a selenocysteine-based glutathione peroxidase, the first found in insects. The genome contained actively transcribed, horizontally transferred genes from Wolbachia sp., which showed evidence of codon use evolution toward the insect use pattern. Comparative protein analyses revealed many lineage-specific expansions and putative gene absences in R. prolixus, including tandem expansions of genes related to chemoreception, feeding, and digestion that possibly contributed to the evolution of a blood-feeding lifestyle. The genome assembly and these associated analyses provide critical information on the physiology and evolution of this important vector species and should be instrumental for the development of innovative disease control methods. |
Presence of the knockdown resistance mutation, Vgsc-1014F in Anopheles gambiae and An. arabiensis in western Kenya.
Ochomo E , Subramaniam K , Kemei B , Rippon E , Bayoh NM , Kamau L , Atieli F , Vulule JM , Ouma C , Gimnig J , Donnelly MJ , Mbogo C . Parasit Vectors 2015 8 (1) 616 INTRODUCTION: The voltage-gated sodium channel mutation Vgsc-1014S (kdr-east) was first reported in Kenya in 2000 and has since been observed at high frequencies in the local Anopheles gambiae s.s. population. The mutation Vgsc-1014F has never been reported from An. gambiae complex mosquitoes in Kenya. FINDINGS: Molecularly confirmed An. gambiae s.s. (hereafter An. gambiae) and An. arabiensis collected from 4 different parts of western Kenya were genotyped for kdr from 2011 to 2013. Vgsc-1014F was observed to have emerged, apparently simultaneously, in both An. gambiae and An. arabiensis in 2012. A portion of the samples were submitted for sequencing to confirm the Vgsc-1014F genotyping results. The resulting sequence data were deposited in GenBank (accession numbers KR867642-KR867651, KT758295-KT758303). A single Vgsc-1014F haplotype was observed, suggesting a common origin in both species. CONCLUSION: This is the first report of Vgsc-1014F in Kenya. Based on our samples, the mutation is present at low frequencies in both An. gambiae and An. arabiensis. It is important to begin monitoring the relative frequencies of the two kdr alleles so that their relative importance can be determined in an area of high insecticide-treated net ownership. |
Finished Annotated Genome Sequence of Burkholderia pseudomallei Strain Bp1651, a Multidrug-Resistant Clinical Isolate.
Bugrysheva JV , Sue D , Hakovirta J , Loparev VN , Knipe K , Sammons SA , Ranganathan-Ganakammal S , Changayil S , Srinivasamoorthy G , Weil MR , Tatusov RL , Gee JE , Elrod MG , Hoffmaster AR , Weigel LM . Genome Announc 2015 3 (6) Burkholderia pseudomallei strain Bp1651, a human isolate, is resistant to all clinically relevant antibiotics. We report here on the finished genome sequence assembly and annotation of the two chromosomes of this strain. This genome sequence may assist in understanding the mechanisms of antimicrobial resistance for this pathogenic species. |
Characterization of 137 Genomic DNA Reference Materials for 28 Pharmacogenetic Genes: A GeT-RM Collaborative Project.
Pratt VM , Everts RE , Aggarwal P , Beyer BN , Broeckel U , Epstein-Baak R , Hujsak P , Kornreich R , Liao J , Lorier R , Scott SA , Smith CH , Toji LH , Turner A , Kalman LV . J Mol Diagn 2015 18 (1) 109-23 Pharmacogenetic testing is increasingly available from clinical laboratories. However, only a limited number of quality control and other reference materials are currently available to support clinical testing. To address this need, the Centers for Disease Control and Prevention-based Genetic Testing Reference Material Coordination Program, in collaboration with members of the pharmacogenetic testing community and the Coriell Cell Repositories, has characterized 137 genomic DNA samples for 28 genes commonly genotyped by pharmacogenetic testing assays (CYP1A1, CYP1A2, CYP2A6, CYP2B6, CYP2C8, CYP2C9, CYP2C19, CYP2D6, CYP2E1, CYP3A4, CYP3A5, CYP4F2, DPYD, GSTM1, GSTP1, GSTT1, NAT1, NAT2, SLC15A2, SLC22A2, SLCO1B1, SLCO2B1, TPMT, UGT1A1, UGT2B7, UGT2B15, UGT2B17, and VKORC1). One hundred thirty-seven Coriell cell lines were selected based on ethnic diversity and partial genotype characterization from earlier testing. DNA samples were coded and distributed to volunteer testing laboratories for targeted genotyping using a number of commercially available and laboratory developed tests. Through consensus verification, we confirmed the presence of at least 108 variant pharmacogenetic alleles. These samples are also being characterized by other pharmacogenetic assays, including next-generation sequencing, which will be reported separately. Genotyping results were consistent among laboratories, with most differences in allele assignments attributed to assay design and variability in reported allele nomenclature, particularly for CYP2D6, UGT1A1, and VKORC1. These publicly available samples will help ensure the accuracy of pharmacogenetic testing. |
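The consensus-verification step described in the GeT-RM abstract above, in which genotype calls for the same DNA sample are compared across volunteer laboratories and a variant allele is confirmed when enough labs agree, might look like this in outline. The sample calls, the gene, the star-allele names, and the 80% agreement threshold are all hypothetical illustrations, not GeT-RM data or procedure.

```python
from collections import Counter

def consensus_call(calls, min_agreement=0.8):
    """Return the majority genotype for one sample/gene if at least
    min_agreement of the laboratories concur; otherwise return None,
    flagging the sample for further review."""
    counts = Counter(calls)
    genotype, n = counts.most_common(1)[0]
    return genotype if n / len(calls) >= min_agreement else None

# Hypothetical CYP2D6 calls for one DNA sample from five laboratories.
calls = ["*1/*4", "*1/*4", "*1/*4", "*1/*4", "*1/*41"]
print(consensus_call(calls))  # prints *1/*4
```

Discordant calls like the `*1/*41` outlier here mirror the abstract's observation that most inter-laboratory differences trace to assay design and allele nomenclature rather than to the underlying sequence.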
Genomics in Public Health: Perspective from the Office of Public Health Genomics at the Centers for Disease Control and Prevention (CDC).
Green RF , Dotson WD , Bowen S , Kolor K , Khoury MJ . Healthcare (Basel) 2015 3 (3) 830-837 The national effort to use genomic knowledge to save lives is gaining momentum, as illustrated by the inclusion of genomics in key public health initiatives, including Healthy People 2020, and the recent launch of the precision medicine initiative. The Office of Public Health Genomics (OPHG) at the Centers for Disease Control and Prevention (CDC) partners with state public health departments and others to advance the translation of genome-based discoveries into disease prevention and population health. To do this, OPHG has adopted an "identify, inform, and integrate" model: identify evidence-based genomic applications ready for implementation, inform stakeholders about these applications, and integrate these applications into public health at the local, state, and national level. This paper addresses current and future work at OPHG for integrating genomics into public health programs. |
'He is the one who is providing you with everything so whatever he says is what you do': a qualitative study on factors affecting secondary schoolgirls' dropout in rural western Kenya
Oruko K , Nyothach E , Zielinski-Gutierrez E , Mason L , Alexander K , Vulule J , Laserson KF , Phillips-Howard PA . PLoS One 2015 10 (12) e0144321 Education is an effective way to improve girls' self-worth, health, and productivity; however, there remains a gender gap between girls' and boys' completion of school. The literature on factors influencing girls' decision to stay in school is limited. Seven focus group discussions took place among 79 girls in forms 2 to 4 at secondary schools in rural western Kenya to examine their views on why girls absent themselves or drop out of school. Data were analysed thematically. Lack of resources, sexual relationships with boyfriends, and menstrual care problems were reported to lead directly to dropout or school absence. These were tied to girls' increased vulnerability to pregnancy, poor performance in school, and punishments, which further increase school absence and risk of dropout. Poverty, unmet essential needs, coercive sexual relationships, and an inequitable school environment collude to counter girls' resolve to complete their schooling. Lack of resources drives girls to have sex with boyfriends or men who provide them with essentials their families cannot afford, such as sanitary pads and transport to school. While these provisions improve the quality of their school life, this dynamic increases their exposure to sexual risk, pregnancy, punishment, and dropout. Evaluation of interventions to ameliorate these challenges is warranted, including provision of pocket money to address their needs. |
Parental monitoring and its associations with adolescent sexual risk behavior: a meta-analysis
Dittus PJ , Michael SL , Becasen JS , Gloppen KM , McCarthy K , Guilamo-Ramos V . Pediatrics 2015 136 (6) e1587-99 CONTEXT: Increasingly, health care providers are using approaches targeting parents in an effort to improve adolescent sexual and reproductive health. Research is needed to elucidate areas in which providers can target adolescents and parents effectively. Parental monitoring offers one such opportunity, given consistent protective associations with adolescent sexual risk behavior. However, less is known about which components of monitoring are most effective and most suitable for provider-initiated family-based interventions. OBJECTIVE: We performed a meta-analysis to assess the magnitude of association between parental monitoring and adolescent sexual intercourse, condom use, and contraceptive use. DATA SOURCES: We conducted searches of Medline, the Cumulative Index to Nursing and Allied Health Literature, PsycInfo, Cochrane, the Education Resources Information Center, Social Services Abstracts, Sociological Abstracts, Proquest, and Google Scholar. STUDY SELECTION: We selected studies published from 1984 to 2014 that were written in English, included adolescents, and examined relationships between parental monitoring and sexual behavior. DATA EXTRACTION: We extracted effect size data to calculate pooled odds ratios (ORs) by using a mixed-effects model. RESULTS: Higher overall monitoring (pooled OR, 0.74; 95% confidence interval [CI], 0.69-0.80), monitoring knowledge (pooled OR, 0.81; 95% CI, 0.73-0.90), and rule enforcement (pooled OR, 0.67; 95% CI, 0.59-0.75) were associated with delayed sexual intercourse. Higher overall monitoring (pooled OR, 1.12; 95% CI, 1.01-1.24) and monitoring knowledge (pooled OR, 1.14; 95% CI, 1.01-1.31) were associated with greater condom use. 
Finally, higher overall monitoring was associated with increased contraceptive use (pooled OR, 1.42; 95% CI, 1.09-1.86), as was monitoring knowledge (pooled OR, 2.27; 95% CI, 1.42-3.63). LIMITATIONS: Effect sizes were not uniform across studies, and most studies were cross-sectional. CONCLUSIONS: Provider-initiated family-based interventions focused on parental monitoring represent a novel mechanism for enhancing adolescent sexual and reproductive health. |
Business models, vaccination services, and public health relationships of retail clinics: a qualitative study
Arthur BC , Fisher AK , Shoemaker SJ , Pozniak A , Stokley S . J Healthc Manag 2015 60 (6) 429-441 Despite the rapid growth of retail clinics (RCs), literature is limited in terms of how these facilities offer preventive services, particularly vaccination services. The purpose of this study was to obtain an in-depth understanding of the RC business model pertaining to vaccine offerings, profitability, and decision making. From March to June 2009, we conducted 15 interviews with key individuals from three types of organizations: 12 representatives of RC corporations, 2 representatives of retail hosts (i.e., stores in which the RCs are located), and 1 representative of an industry association. We analyzed interview transcripts qualitatively. Our results indicate that consumer demand and profitability were the main drivers in offering vaccinations. RCs in this sample primarily offered vaccinations to adults and adolescents, and they were not well integrated with local public health and immunization registries. Our findings demonstrate the potential for stronger linkages with public health in these settings. The findings also may help inform future research to increase patient access to vaccination services at RCs. |
Cost of operating central cancer registries and factors that affect cost: Findings from an economic evaluation of Centers for Disease Control and Prevention National Program of Cancer Registries
Tangka FK , Subramanian S , Beebe MC , Weir HK , Trebino D , Babcock F , Ewing J . J Public Health Manag Pract 2015 22 (5) 452-60 CONTEXT: The Centers for Disease Control and Prevention evaluated the economics of the National Program of Cancer Registries to provide the agency, the registries, and policy makers with the economic evidence base needed to make optimal decisions about resource allocation. Cancer registry budgets are under increasing threat; systematic assessment of costs can therefore identify approaches to improve the efficiency of this vital data collection operation and also justify the funding required to sustain registry operations. OBJECTIVES: To estimate the cost of cancer registry operations and to assess the factors affecting the cost per case reported by National Program of Cancer Registries-funded central cancer registries. METHODS: We developed a Web-based cost assessment tool to collect 3 years of data (2009-2011) from each National Program of Cancer Registries-funded registry for all actual expenditures for registry activities (including those funded by other sources) and factors affecting registry operations. We used a random-effects regression model to estimate the impact of various factors on the cost per cancer case reported. RESULTS: The cost of reporting a cancer case varied across the registries. Central cancer registries that receive high-quality data from reporting sources (as measured by the percentage of records passing automatic edits) and electronic data submissions, and those that collect and report on a large volume of cases, had a significantly lower cost per case. The volume of cases reported had a large effect, with low-volume registries experiencing much higher cost per case than medium- or high-volume registries. CONCLUSIONS: Our results suggest that registries operate with substantial fixed or semivariable costs. 
Therefore, sharing fixed costs among low-volume contiguous state registries, whenever possible, and centralization of certain processes can result in economies of scale. Approaches to improve quality of data submitted and increasing electronic reporting can also reduce cost. |
A cost-benefit analysis of a proposed overseas refugee latent tuberculosis infection screening and treatment program
Wingate LT , Coleman MS , de la Motte Hurst C , Semple M , Zhou W , Cetron MS , Painter JA . BMC Public Health 2015 15 (1) 1201 BACKGROUND: This study explored the effect of screening and treatment of refugees for latent tuberculosis infection (LTBI) before entrance to the United States as a strategy for reducing active tuberculosis (TB). The purpose of this study was to estimate the costs and benefits of LTBI screening and treatment in United States-bound refugees prior to arrival. METHODS: Costs were included for foreign and domestic LTBI screening and treatment and the domestic treatment of active TB. A decision tree with multiple Markov nodes was developed to determine the total costs and number of active TB cases that occurred in refugee populations that tested 55%, 35%, and 20% tuberculin skin test positive under two models: no overseas LTBI screening, and overseas LTBI screening and treatment. For this analysis, refugees who tested 55%, 35%, and 20% tuberculin skin test positive were divided into high, moderate, and low LTBI prevalence categories to denote their prevalence of LTBI relative to other refugee populations. RESULTS: For a hypothetical 1-year cohort of 100,000 refugees arriving in the United States from regions with high, moderate, and low LTBI prevalence, implementation of overseas screening would be expected to prevent 440, 220, and 57 active TB cases in the United States during the first 20 years after arrival. The cost savings associated with treatment of these averted cases would offset the cost of LTBI screening and treatment for refugees from countries with high (net cost saving: $4.9 million) and moderate (net cost saving: $1.6 million) LTBI prevalence. For low LTBI prevalence populations, the costs of LTBI screening and treatment exceed expected future TB treatment cost savings (net cost of $780,000). 
CONCLUSIONS: Implementing LTBI screening and treatment for United States-bound refugees from countries with high or moderate LTBI prevalence would potentially save millions of dollars and contribute to United States TB elimination goals. These estimates are conservative because secondary transmission from tuberculosis cases in the United States was not considered in the model. |
Notes from the field: Carbapenem-resistant Enterobacteriaceae producing OXA-48-like carbapenemases - United States, 2010-2015
Lyman M , Walters M , Lonsway D , Rasheed K , Limbago B , Kallen A . MMWR Morb Mortal Wkly Rep 2015 64 (47) 1315-6 Carbapenem-resistant Enterobacteriaceae (CRE) are bacteria that are often resistant to most classes of antibiotics and cause health care-associated infections with high mortality rates. Among CRE, strains that carry plasmid-encoded carbapenemase enzymes that inactivate carbapenem antibiotics are of greatest public health concern because of their potential for rapid global dissemination, as evidenced by the increasing distribution of CRE that produce the Klebsiella pneumoniae carbapenemase and the New Delhi metallo-beta-lactamase. Newly described resistance in Enterobacteriaceae, such as plasmid-mediated resistance to the last-line antimicrobial colistin, recently detected in China, and resistance to the newly approved antimicrobial ceftazidime-avibactam, identified in a U.S. K. pneumoniae carbapenemase-producing isolate, highlights the continued urgency of delaying the spread of CRE. Monitoring the emergence of carbapenemases is crucial to limiting their spread; identification of patients carrying carbapenemase-producing CRE should prompt the institution of transmission-based precautions and enhanced environmental cleaning to prevent transmission. The OXA-48 carbapenemase was first identified in Enterobacteriaceae in Turkey in 2001, and OXA-48-like variants have subsequently been reported around the world. The first U.S. reports of OXA-48-like carbapenemases were published in 2013 and included retrospectively identified isolates from 2009 and two isolates collected in 2012 from patients in Virginia who had recently been hospitalized outside the United States. Although there are few additional published reports from the United States, CDC continues to receive reports of these organisms. This report describes patients identified as carrying CRE producing OXA-48-like carbapenemases in the United States during June 2010-August 2015. |
Effect of body surface decolonisation on bacteriuria and candiduria in intensive care units: an analysis of a cluster-randomised trial
Huang SS , Septimus E , Hayden MK , Kleinman K , Sturtevant J , Avery TR , Moody J , Hickok J , Lankiewicz J , Gombosev A , Kaganov RE , Haffenreffer K , Jernigan JA , Perlin JB , Platt R , Weinstein RA . Lancet Infect Dis 2015 16 (1) 70-79 BACKGROUND: Urinary tract infections (UTIs) are common health-care-associated infections. Bacteriuria commonly precedes UTI and is often treated with antibiotics, particularly in hospital intensive care units (ICUs). In 2013, a cluster-randomised trial (REDUCE MRSA Trial [Randomized Evaluation of Decolonization vs Universal Clearance to Eradicate MRSA]) showed that body surface decolonisation reduced all-pathogen bloodstream infections. We aimed to further assess the effect of decolonisation on bacteriuria and candiduria in patients admitted to ICUs. METHODS: We did a secondary analysis of a three-group, cluster-randomised trial of 43 hospitals (clusters) with patients in 74 adult ICUs. The three groups were: meticillin-resistant Staphylococcus aureus (MRSA) screening and isolation; targeted decolonisation (screening, isolation, and decolonisation of MRSA carriers) with chlorhexidine and mupirocin; and universal decolonisation (no screening, all patients decolonised) with chlorhexidine and mupirocin. The protocol included chlorhexidine cleansing of the perineum and the proximal 6 inches (15.24 cm) of urinary catheters. ICUs within the same hospital were assigned the same strategy. Outcomes included high-level bacteriuria (≥50 000 colony forming units [CFU]/mL) with any uropathogen, high-level candiduria (≥50 000 CFU/mL), and any bacteriuria with uropathogens. Sex-specific analyses were specified a priori. Proportional hazards models assessed differences in outcome reductions across groups, comparing an 18-month intervention period with a 12-month baseline period. FINDINGS: 122 646 patients (48 390 baseline, 74 256 intervention) were enrolled. 
Intervention versus baseline hazard ratios (HRs) for high-level bacteriuria were 1.02 (95% CI 0.88-1.18) for screening or isolation, 0.88 (0.76-1.02) for targeted decolonisation, and 0.87 (0.77-1.00) for universal decolonisation (no difference between groups, p=0.26), with no sex-specific reductions (HRs for men: 1.09 [95% CI 0.85-1.40] for screening or isolation, 1.01 [0.79-1.29] for targeted decolonisation, and 0.78 [0.63-0.98] for universal decolonisation, p=0.12; HRs for women: 0.97 [0.80-1.17] for screening and isolation, 0.83 [0.70-1.00] for targeted decolonisation, and 0.93 [0.79-1.09] for universal decolonisation, p=0.49). HRs for high-level candiduria were 1.14 (0.95-1.37) for screening and isolation, 0.99 (0.83-1.18) for targeted decolonisation, and 0.83 (0.70-0.99) for universal decolonisation (p=0.05). Differences between sexes were due to reductions in men in the universal decolonisation group (HRs: 1.21 [95% CI 0.88-1.68] for screening or isolation, 1.01 [0.73-1.39] for targeted decolonisation, and 0.63 [0.45-0.89] for universal decolonisation, p=0.02). Bacteriuria with any CFU/mL was also reduced in men in the universal decolonisation group (HRs 1.01 [0.81-1.25] for screening or isolation, 1.04 [0.83-1.30] for targeted decolonisation, and 0.74 [0.61-0.90] for universal decolonisation, p=0.04). INTERPRETATION: Universal decolonisation of patients in the ICU with once a day chlorhexidine baths and short-course nasal mupirocin could be a potential preventive strategy in male patients because it significantly decreases candiduria and any bacteriuria, but not for women. FUNDING: HAI Program from AHRQ, US Department of Health and Human Services as part of the Developing Evidence to Inform Decisions about Effectiveness (DEcIDE) program, CDC Prevention Epicenters Program. |
Nondaily preexposure prophylaxis for HIV prevention
Anderson PL , Garcia-Lerma JG , Heneine W . Curr Opin HIV AIDS 2016 11 (1) 94-101 PURPOSE OF REVIEW: To discuss nondaily preexposure prophylaxis (PrEP) modalities that may provide advantages compared with daily PrEP in cost and cumulative toxicity, but may have lower adherence forgiveness. RECENT FINDINGS: Animal models have informed our understanding of early viral transmission events, which help guide event-driven PrEP dosing strategies. These models indicate early establishment of viral replication in rectal or cervicovaginal tissues, so event-driven PrEP should rapidly deliver high mucosal drug concentrations within hours of the potential exposure event. Macaque models have demonstrated the high biological efficacy for event-driven dosing of oral tenofovir disoproxil fumarate (TDF) and emtricitabine (FTC) against both vaginal and rectal virus transmission. In humans, the IPERGAY study demonstrated 86% efficacy for event-driven oral TDF/FTC dosing among men who have sex with men (MSM), while no similar efficacy data are available on women or heterosexual men. The HPTN 067 study showed that certain MSM populations adhere well to nondaily PrEP, whereas other populations of women adhere more poorly to nondaily versus daily regimens. Pharmacokinetic studies following oral TDF/FTC dosing in humans indicate that TFV-diphosphate (the active form of TFV) accumulates to higher concentrations in rectal versus cervicovaginal tissue, but nonadherence in trials complicates the interpretation of differential mucosal drug concentrations. SUMMARY: Event-driven dosing for TFV-based PrEP has promise for HIV prevention in MSM. Future research of event-driven PrEP in women and heterosexual men should be guided by a better understanding of the importance of mucosal drug concentrations for PrEP efficacy and its sensitivity to adherence. |
Measles 50 years after use of measles vaccine
Goodson JL , Seward JF . Infect Dis Clin North Am 2015 29 (4) 725-43 In response to severe measles, the first measles vaccine was licensed in the United States in 1963. Widespread use of measles vaccines for more than 50 years has significantly reduced global measles morbidity and mortality. However, measles virus continues to circulate, causing infection, illness, and an estimated 400 deaths worldwide each day. Measles is preventable by vaccine, and humans are the only reservoir. Clinicians should promote and provide on-time vaccination for all patients and keep measles in their differential diagnosis of febrile rash illness for rapid case detection, confirmation of measles infection, isolation, treatment, and appropriate public health response. |
Meeting the challenges of immunizing adults
Bridges CB , Hurley LP , Williams WW , Ramakrishnan A , Dean AK , Groom AV . Vaccine 2015 33 Suppl 4 D114-20 The overall burden of illness from diseases for which vaccines are available falls disproportionately on adults. Adults are recommended to receive vaccinations based on their age, underlying medical conditions, lifestyle, prior vaccinations, and other considerations. Updated vaccine recommendations from CDC are published annually in the U.S. Adult Immunization Schedule. Vaccine use among U.S. adults is low. Although receipt of a provider (physician or other vaccinating healthcare provider) recommendation is a key predictor of vaccination, consumers more often report not receiving vaccine recommendations at healthcare provider visits. Although providers support the benefits of vaccination, they also report several barriers to vaccinating adults, including the cost of providing vaccination services, inadequate or inconsistent payment for vaccines and vaccine administration, and acute medical care taking precedence over preventive services. Despite these challenges, a number of strategies have been demonstrated to substantially improve adult vaccine coverage, including patient and provider reminders and standing orders for vaccination. Providers are encouraged to incorporate routine assessment of their adult patients' vaccination needs during all clinical encounters to ensure patients receive recommendations for needed vaccines and are either offered needed vaccines or referred for vaccination. |
National and state-specific Td and Tdap vaccination of adult populations
Lu PJ , O'Halloran A , Ding H , Liang JL , Williams WW . Am J Prev Med 2015 50 (5) 616-626 INTRODUCTION: The Advisory Committee on Immunization Practices recommends a single dose of tetanus, diphtheria, and acellular pertussis vaccine (Tdap) for adults followed by tetanus and diphtheria toxoids (Td) booster doses every 10 years thereafter. This study assessed recent Td and Tdap vaccination among adult populations. METHODS: The 2013 Behavioral Risk Factor Surveillance System data were analyzed in 2015 to assess Td and Tdap vaccination coverage among adults at national and state levels. Multivariable logistic regression and predictive marginal models were performed to identify factors independently associated with vaccination. RESULTS: Overall, national vaccination coverage among adults aged ≥18 years for Td was 57.5% and for Tdap was 28.9%. Among states, Td vaccination coverage ranged from 47.8% in Nevada to 73.1% in Minnesota, and Tdap coverage ranged from 17.7% in Mississippi to 47.6% in Minnesota. Characteristics independently associated with an increased likelihood of Tdap vaccination among adults aged ≥18 years were younger age; being female; American Indian/Alaska Native race; being never married; higher education; not being in the workforce; reporting a household income ≥$75,000; living in the West or Midwest U.S.; reporting excellent, very good, good, or fair health; having health insurance; having a healthcare provider; having a routine checkup in the previous year; receipt of influenza vaccination in the previous year; and having ever received pneumococcal vaccination. CONCLUSIONS: By 2013, Td and Tdap vaccination coverage were 57.5% and 28.9%, respectively. Coverage varied by state. Implementation of evidence-based programs is needed to improve Td and Tdap vaccination levels among adult populations. |
Persistence of seropositivity among persons vaccinated for hepatitis A during infancy by maternal antibody status: 15-year follow-up
Spradling PR , Bulkow LR , Negus SE , Homan C , Bruce MG , McMahon BJ . Hepatology 2015 63 (3) 703-11 The effect of passively transferred maternal antibody to hepatitis A virus (anti-HAV) on the duration of seropositivity after hepatitis A vaccination during infancy and early childhood is unclear. We obtained levels of anti-HAV at intervals through ages 15-16 years among three groups of Alaskan Native children who initiated a two-dose inactivated hepatitis A vaccination series at ages 6 (Group 1), 12 (Group 2), and 15 months (Group 3), each group randomized according to maternal anti-HAV status. Seropositivity (anti-HAV ≥20 mIU/mL) 30 years after the second vaccine dose among the three groups was predicted using a random effects model. One hundred eighty-three children participated in the study; follow-up did not differ significantly by vaccine group or maternal anti-HAV status. Although the frequency of seropositivity among all participants through age 10 years was high (100% among Groups 2/3 and >90% among Group 1 children), there was a decrease thereafter through ages 15-16 years among Group 1 children, who initiated vaccination at age 6 months (50%-75%), and among maternal anti-HAV-positive children in Groups 2 and 3 (67%-87%), who initiated vaccination at ages 12 and 15 months, respectively. Nonetheless, the model indicated that anti-HAV seropositivity should persist for ≥30 years after vaccination in 64% of all participants; among those seropositive at ages 15-16 years, 84% were predicted to remain so for ≥30 years. CONCLUSIONS: Most children vaccinated during early childhood available for sampling maintained seropositivity through ages 15-16 years; however, seropositivity was less frequent among those starting vaccination at age 6 months and among maternal antibody-positive participants who started vaccination at age 12 or 15 months. Overall, our findings support current vaccine recommendations and continued follow-up of this cohort. |
Assessment of influenza vaccine effectiveness in a sentinel surveillance network 2010-13, United States
Cowling BJ , Feng S , Finelli L , Steffens A , Fowlkes A . Vaccine 2015 34 (1) 61-6 BACKGROUND: Influenza vaccines are now widely used to reduce the burden of annual epidemics of influenza virus infections. Influenza vaccine effectiveness (VE) is monitored annually to determine VE against each season's circulating influenza strains in different groups such as children, adults and the elderly. Few prospective surveillance programs are available to evaluate influenza VE against medically attended illness for patients of all ages in the United States. METHODS: We conducted surveillance of patients with acute respiratory illnesses in 101 clinics across the US during three consecutive influenza seasons. We analyzed laboratory testing results for influenza virus, self-reported vaccine history, and patient characteristics, defining cases as patients who tested positive for influenza virus and controls as patients who tested negative for influenza virus. Comparison of influenza vaccination coverage among cases versus controls, adjusted for potential confounders, was used to estimate VE as one minus the adjusted odds ratio multiplied by 100%. RESULTS: We included 10,650 patients during three influenza seasons from August 2010 through December 2013, and estimated influenza VE in children 6m-5y of age (58%; 95% CI: 49%-66%), children 6-17y (45%; 95% CI: 34%-53%), adults 18-49y (36%; 95% CI: 24%, 46%), and adults ≥50y (34%, 95% CI: 13%, 51%). VE was higher against influenza A(H1N1) compared to A(H3N2) and B. CONCLUSIONS: Our estimates of moderate influenza VE confirm the important role of vaccination in protecting against medically attended influenza virus infection. |
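The VE formula the authors describe (one minus the adjusted odds ratio, multiplied by 100%) can be illustrated with a minimal sketch of the crude, unadjusted version of the test-negative calculation. The function name and 2×2 counts below are invented for illustration and are not taken from the study, which additionally adjusted for potential confounders:

```python
def vaccine_effectiveness(vacc_cases, unvacc_cases, vacc_controls, unvacc_controls):
    """Crude (unadjusted) VE from a 2x2 case-control table.

    Cases tested positive for influenza virus; controls tested negative.
    VE = (1 - OR) * 100, where OR compares the odds of vaccination
    among cases with the odds of vaccination among controls.
    """
    odds_ratio = (vacc_cases / unvacc_cases) / (vacc_controls / unvacc_controls)
    return (1 - odds_ratio) * 100

# Hypothetical example: 200 of 1,000 cases vaccinated,
# 400 of 1,000 test-negative controls vaccinated.
ve = vaccine_effectiveness(200, 800, 400, 600)
print(round(ve, 1))  # 62.5 -> crude VE of 62.5%
```

In the study itself the odds ratio was adjusted (e.g. via logistic regression) before applying this formula, so the crude arithmetic above is only the conceptual skeleton.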
The challenge of global poliomyelitis eradication
Garon JR , Cochi SL , Orenstein WA . Infect Dis Clin North Am 2015 29 (4) 651-65 In the United States during the 1950s, polio was at the forefront of every provider's and caregiver's mind. Today, most providers in the United States have never seen a case. The Global Polio Eradication Initiative (GPEI), which began in 1988, has reduced the number of cases by over 99%. The world is closer to achieving global eradication of polio than ever before, but as long as poliovirus circulates anywhere in the world, every country is vulnerable. The global community can support the polio eradication effort through continued vaccination, surveillance, enforcement of travel regulations, and contributions of financial support, partnerships, and advocacy. |
The changing epidemiology of meningococcal disease
Cohn A , MacNeil J . Infect Dis Clin North Am 2015 29 (4) 667-77 The incidence of meningococcal disease is at an historic low in the United States, but prevention remains a priority because of the devastating outcomes and risk for outbreaks. Available vaccines are recommended routinely for persons at increased risk for disease to protect against all major serogroups of Neisseria meningitidis circulating in the United States. Although vaccination has virtually eliminated serogroup A meningococcal outbreaks from the Meningitis Belt of Africa and reduced the incidence of serogroup C disease worldwide, eradication of N meningitidis will unlikely be achieved by currently available vaccines because of the continued carriage and transmission of nonencapsulated organisms. |
An assessment of data quality in a multi-site electronic medical record system in Haiti
Puttkammer N , Baseman JG , Devine EB , Valles JS , Hyppolite N , Garilus F , Honore JG , Matheson AI , Zeliadt S , Yuhas K , Sherr K , Cadet JR , Zamor G , Pierre E , Barnhart S . Int J Med Inform 2015 86 104-16 OBJECTIVES: Strong data quality (DQ) is a precursor to strong data use. In resource limited settings, routine DQ assessment (DQA) within electronic medical record (EMR) systems can be resource-intensive using manual methods such as audit and chart review; automated queries offer an efficient alternative. This DQA focused on Haiti's national EMR - iSante - and included longitudinal data for over 100,000 persons living with HIV (PLHIV) enrolled in HIV care and treatment services at 95 health care facilities (HCF). METHODS: This mixed-methods evaluation used a qualitative Delphi process to identify DQ priorities among local stakeholders, followed by a quantitative DQA on these priority areas. The quantitative DQA examined 13 indicators of completeness, accuracy, and timeliness of retrospective data collected from 2005 to 2013. We described levels of DQ for each indicator over time, and examined the consistency of within-HCF performance and associations between DQ and HCF and EMR system characteristics. RESULTS: Over all iSante data, age was incomplete in <1% of cases, while height, pregnancy status, TB status, and ART eligibility were more incomplete (approximately 20-40%). Suspicious data flags were present for <3% of cases of male sex, ART dispenses, CD4 values, and visit dates, but for 26% of cases of age. Discontinuation forms were available for about half of all patients without visits for 180 or more days, and >60% of encounter forms were entered late. For most indicators, DQ tended to improve over time. DQ was highly variable across HCF, and within HCFs DQ was variable across indicators. 
In adjusted analyses, HCF and system factors with generally favorable and statistically significant associations with DQ were university hospital category, private sector governance, presence of a local iSante server, greater HCF experience with the EMR, greater maturity of the EMR itself, and having more system users but fewer new users. In qualitative feedback, local stakeholders emphasized the lack of a stable power supply as a key challenge to data quality and use of the iSante EMR. CONCLUSIONS: Variable performance on key DQ indicators across HCF suggests that excellent DQ is achievable in Haiti, but further effort is needed to systematize and routinize DQ approaches within HCFs. A dynamic, interactive "DQ dashboard" within iSante could bring transparency and motivate improvement. While the results of the study are specific to Haiti's iSante data system, the study's methods and thematic lessons learned hold generalized relevance for other large-scale EMR systems in resource-limited countries. |
Prevalence of physical violence against children in Haiti: a national population-based cross-sectional survey
Flynn-O'Brien KT , Rivara FP , Weiss NS , Lea VA , Marcelin LH , Vertefeuille J , Mercy JA . Child Abuse Negl 2015 51 154-62 Although physical violence against children is common worldwide, there are no national estimates in Haiti. To establish baseline national estimates, a three-stage clustered sampling design was utilized to administer a population-based household survey about victimization due to physical violence to 13-24 year old Haitians (n=2,916), including those residing in camps or settlements. Descriptive statistics and weighted analysis techniques were used to estimate national lifetime prevalence and characteristics of physical violence against children. About two-thirds of respondents reported having experienced physical violence during childhood (67.0%; 95% CI 63.4-70.4), the percentage being similar in males and females. More than one-third of 13-17 year old respondents were victimized in the 12 months prior to survey administration (37.8%; 95% CI 33.6-42.1). The majority of violence was committed by parents and teachers, and the perceived intent was often punishment or discipline. While virtually all (98.8%; 95% CI 98.0-99.3) victims of childhood physical violence were punched, kicked, whipped or beaten, 11.0% (95% CI 9.2-13.2) were subjected to abuse with a knife or other weapon. Injuries sustained from violence varied by victim gender and perpetrator, with twice as many females (9.6%; 95% CI 7.1-12.7) as males (4.0%; 95% CI 2.6-6.1) sustaining permanent injury or disfigurement by a family member or caregiver (p-value<.001). Our findings suggest that physical violence against children in Haiti is common, and may lead to severe injury. Characterization of the frequency and nature of this violence provides baseline estimates to inform interventions. |
Gang membership and marijuana use among African American female adolescents in North Carolina
Wechsberg WM , Doherty IA , Browne FA , Kline TL , Carry MG , Raiford JL , Herbst JH . Subst Abuse Rehabil 2015 6 141-150 The southeastern US sustains the highest high school dropout rates, and gangs persist in underserved communities. African American female adolescents who drop out of school and are gang members are at substantial risk of exposure to severe violence, physical abuse, and sexual exploitation. In this study of 237 female African American adolescents 16-19 years of age from North Carolina who dropped out or considered dropping out, 11% were current or past gang members. Adolescents who reported gang membership began smoking marijuana at a mean age of 13, whereas those who reported no gang membership began at a mean age of 15 years (P<0.001). The mean ages of first alcohol use were 14 years and 15 years for gang members and non-gang members, respectively (P=0.04). Problem alcohol use was high in both groups: 40% and 65% for non-gang and gang members, respectively (P=0.02). Controlling for frequent marijuana use and problem alcohol use, adolescents who reported gang membership were more likely than non-gang members to experience sexual abuse (odds ratio [OR] =2.60, 95% confidence interval [CI] [1.06, 6.40]), experience physical abuse (OR =7.33, 95% CI [2.90, 18.5]), report emotional abuse from their main partner (OR =3.55, 95% CI [1.44, 8.72]), run away from home (OR =4.65, 95% CI [1.90, 11.4]), get arrested (OR =2.61, 95% CI [1.05, 6.47]), and report violence in their neighborhood including murder (OR =3.27, 95% CI [1.35, 7.96]) and fights with weapons (OR =3.06, 95% CI [1.15, 8.11]). Gang members were less likely to receive emotional support (OR =0.89, 95% CI [0.81, 0.97]). These findings reinforce the urgent need to reach young African American women in disadvantaged communities affiliated with gangs to address the complexity of context and interconnected risk behaviors. |
The global burden of injury: incidence, mortality, disability-adjusted life years and time trends from the Global Burden of Disease study 2013
Haagsma JA , Graetz N , Bolliger I , Naghavi M , Higashi H , Mullany EC , Abera SF , Abraham JP , Adofo K , Alsharif U , Ameh EA , Ammar W , Antonio CA , Barrero LH , Bekele T , Bose D , Brazinova A , Catala-Lopez F , Dandona L , Dandona R , Dargan PI , De Leo D , Degenhardt L , Derrett S , Dharmaratne SD , Driscoll TR , Duan L , Petrovich Ermakov S , Farzadfar F , Feigin VL , Franklin RC , Gabbe B , Gosselin RA , Hafezi-Nejad N , Hamadeh RR , Hijar M , Hu G , Jayaraman SP , Jiang G , Khader YS , Khan EA , Krishnaswami S , Kulkarni C , Lecky FE , Leung R , Lunevicius R , Lyons RA , Majdan M , Mason-Jones AJ , Matzopoulos R , Meaney PA , Mekonnen W , Miller TR , Mock CN , Norman RE , Orozco R , Polinder S , Pourmalek F , Rahimi-Movaghar V , Refaat A , Rojas-Rueda D , Roy N , Schwebel DC , Shaheen A , Shahraz S , Skirbekk V , Soreide K , Soshnikov S , Stein DJ , Sykes BL , Tabb KM , Temesgen AM , Tenkorang EY , Theadom AM , Tran BX , Vasankari TJ , Vavilala MS , Vlassov VV , Woldeyohannes SM , Yip P , Yonemoto N , Younis MZ , Yu C , Murray CJ , Vos T . Inj Prev 2015 22 (1) 3-18 BACKGROUND: The Global Burden of Diseases (GBD), Injuries, and Risk Factors study used the disability-adjusted life year (DALY) to quantify the burden of diseases, injuries, and risk factors. This paper provides an overview of injury estimates from the 2013 update of GBD, with detailed information on incidence, mortality, DALYs and rates of change from 1990 to 2013 for 26 causes of injury, globally, by region and by country. METHODS: Injury mortality was estimated using the extensive GBD mortality database, corrections for ill-defined cause of death and the cause of death ensemble modelling tool. Morbidity estimation was based on inpatient and outpatient data sets, 26 cause-of-injury and 47 nature-of-injury categories, and seven follow-up studies with patient-reported long-term outcome measures. 
RESULTS: In 2013, 973 million (uncertainty interval (UI) 942 to 993) people sustained injuries that warranted some type of healthcare and 4.8 million (UI 4.5 to 5.1) people died from injuries. Between 1990 and 2013 the global age-standardised injury DALY rate decreased by 31% (UI 26% to 35%). The rate of decline in DALY rates was significant for 22 cause-of-injury categories, including all the major injuries. CONCLUSIONS: Injuries continue to be an important cause of morbidity and mortality in the developed and developing world. The decline in rates for almost all injuries is so prominent that it warrants a general statement that the world is becoming a safer place to live in. However, the patterns vary widely by cause, age, sex, region and time and there are still large improvements that need to be made. |
Comparative Analytical Evaluation of the Respiratory TaqMan Array Card with Real-Time PCR and Commercial Multi-Pathogen Assays.
Harvey JJ , Chester S , Burke SA , Ansbro M , Aden T , Gose R , Sciulli R , Bai J , DesJardin L , Benfer JL , Hall J , Smole S , Doan K , Popowich MD , St George K , Quinlan T , Halse TA , Li Z , Perez-Osorio AC , Glover WA , Russell D , Reisdorf E , Whyte T Jr , Whitaker B , Hatcher C , Srinivasan V , Tatti K , Tondella ML , Wang X , Winchell JM , Mayer LW , Jernigan D , Mawle AC . J Virol Methods 2015 228 151-7 In this study, a multicenter evaluation of the Life Technologies TaqMan(R) Array Card (TAC) with 21 custom viral and bacterial respiratory assays was performed on the Applied Biosystems ViiA 7 Real-Time PCR System. The goal of the study was to demonstrate the analytical performance of this platform when compared to identical individual pathogen specific laboratory developed tests (LDTs) designed at the Centers for Disease Control and Prevention (CDC), equivalent LDTs provided by state public health laboratories, or to three different commercial multi-respiratory panels. CDC and Association of Public Health Laboratories (APHL) LDTs had similar analytical sensitivities for viral pathogens, while several of the bacterial pathogen APHL LDTs demonstrated sensitivities one log higher than the corresponding CDC LDT. When compared to CDC LDTs, TAC assays were generally one to two logs less sensitive depending on the site performing the analysis. Finally, TAC assays were generally more sensitive than their counterparts in three different commercial multi-respiratory panels. TAC technology allows users to spot customized assays and design TAC layout, simplify assay setup, conserve specimen, dramatically reduce contamination potential, and as demonstrated in this study, analyze multiple samples in parallel with good reproducibility between instruments and operators. |
Reactivity measurement in estimation of benzoquinone and benzoquinone derivatives' allergenicity
Mbiya W , Chipinda I , Simoyi RH , Siegel PD . Toxicology 2015 339 34-39 Benzoquinone (BQ) and benzoquinone derivatives (BQD) are used in the production of dyes and cosmetics. BQ, an extreme skin sensitizer, is an electrophile known to covalently modify proteins via the Michael addition (MA) reaction, whereas halogen-substituted BQD react with amine and thiol moieties on proteins through a nucleophilic vinylic substitution (SNV) mechanism; however, the allergenic effects of adding substituents to BQ have not been reported. The effects of inserting substituents on the BQ ring have not been studied in animal assays, and the mandated reduction/elimination of animals used in cosmetics testing in Europe has led to an increased need for alternatives for the prediction of skin sensitization potential. Electron-withdrawing groups (EWG) and electron-donating groups (EDG) on BQ were assessed for effects on BQ reactivity toward nitrobenzene thiol (NBT). The NBT binding studies demonstrated that addition of an EWG to BQ, as exemplified by the chlorine-substituted BQDs, increased reactivity, while addition of an EDG, as in the methyl-substituted BQDs, reduced reactivity. BQDs with electron-withdrawing groups had the highest chemical potency, followed by unsubstituted BQ; the least potent were the BQDs with electron-donating groups. BQ and BQD skin allergenicity were evaluated in the murine local lymph node assay (LLNA). The BQD results demonstrate the impact of inductive effects on both BQ reactivity and allergenicity, and suggest the potential utility of chemical reactivity data for electrophilic allergen identification and potency ranking. |
Recommended mass spectrometry-based strategies to identify ricin-containing samples
Kalb SR , Schieltz DM , Becher F , Astot C , Fredriksson SA , Barr JR . Toxins (Basel) 2015 7 (12) 4881-94 Ricin is a protein toxin produced by the castor bean plant (Ricinus communis) together with a related protein known as R. communis agglutinin (RCA120). Mass spectrometric (MS) assays have the capacity to unambiguously identify ricin and to detect ricin's activity in samples with complex matrices. These qualitative and quantitative assays enable detection and differentiation of ricin from the less toxic RCA120 through determination of the amino acid sequence of the protein in question, and active ricin can be monitored by MS as the release of adenine from the depurination of a nucleic acid substrate. In this work, we describe the application of MS-based methods to detect, differentiate and quantify ricin and RCA120 in nine blinded samples supplied as part of the EQuATox proficiency test. Overall, MS-based assays successfully identified all samples containing ricin or RCA120 with the exception of the sample spiked with the lowest concentration (0.414 ng/mL). In fact, mass spectrometry was the most successful method for differentiation of ricin and RCA120 based on amino acid determination. Mass spectrometric methods were also successful at ranking the functional activities of the samples, successfully yielding semi-quantitative results. These results indicate that MS-based assays are excellent techniques to detect, differentiate, and quantify ricin and RCA120 in complex matrices. |
A novel eight amino acid insertion contributes to the hemagglutinin cleavability and the virulence of a highly pathogenic avian influenza A (H7N3) virus in mice
Sun X , Belser JA , Tumpey TM . Virology 2015 488 120-128 In 2012, an avian influenza A H7N3 (A/Mexico/InDRE7218/2012; Mx/7218) virus was responsible for two confirmed cases of human infection and led to the death or culling of more than 22 million chickens in Jalisco, Mexico. Interestingly, this virus acquired an 8-amino acid (aa)-insertion (..PENPK-DRKSRHRR-TR/GLF) near the hemagglutinin (HA) cleavage site by nonhomologous recombination with host rRNA. It remains unclear which specific residues at the cleavage site contribute to the virulence of H7N3 viruses in mammals. Using loss-of-function approaches, we generated a series of cleavage site mutant viruses by reverse genetics and characterized the viruses in vitro and in vivo. We found that the 8-aa insertion and the arginine at position P4 of the Mx/7218 HA cleavage site are essential for intracellular HA cleavage in 293T cells, but have no effect on the pH of membrane fusion. However, we identified a role for the histidine residue at P5 position in viral fusion pH. In mice, the 8-aa insertion is required for Mx/7218 virus virulence; however, the basic residues upstream of the P4 position are dispensable for virulence. Overall, our study provides the first line of evidence that the insertion in the Mx/7218 virus HA cleavage site confers its intracellular cleavability, and consequently contributes to enhanced virulence in mice. |
The effect of a mechanical arm system on portable grinder vibration emissions
McDowell TW , Welcome DE , Warren C , Xu XS , Dong RG . Ann Occup Hyg 2015 60 (3) 371-86 Mechanical arm systems are commonly used to support powered hand tools to alleviate ergonomic stressors related to the development of workplace musculoskeletal disorders. However, the use of these systems can increase exposure times to other potentially harmful agents such as hand-transmitted vibration. To examine how these tool support systems affect tool vibration, the primary objectives of this study were to characterize the vibration emissions of typical portable pneumatic grinders used for surface grinding with and without a mechanical arm support system at a workplace and to estimate the potential risk of the increased vibration exposure time afforded by the use of these mechanical arm systems. This study also developed a laboratory-based simulated grinding task based on the ISO 28927-1 (2009) standard for assessing grinder vibrations; the simulated grinding vibrations were compared with those measured during actual workplace grinder operations. The results of this study demonstrate that use of the mechanical arm may provide a health benefit by reducing the forces required to lift and maneuver the tools and by decreasing hand-transmitted vibration exposure. However, the arm does not substantially change the basic characteristics of grinder vibration spectra. The mechanical arm reduced the average frequency-weighted acceleration by about 24% in the workplace and by about 7% in the laboratory. Because use of the mechanical arm system can increase daily time-on-task by 50% or more, the use of such systems may actually increase daily time-weighted hand-transmitted vibration exposures in some cases. The laboratory acceleration measurements were substantially lower than the workplace measurements, and the laboratory tool rankings based on acceleration were considerably different than those from the workplace. 
Thus, it is doubtful that ISO 28927-1 is useful for estimating workplace grinder vibration exposures or for predicting workplace grinder acceleration rank orders. |
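The time-weighted exposure trade-off described in the abstract above follows directly from the ISO 5349-1 daily vibration exposure value, A(8) = a_hv * sqrt(T/8), where a_hv is the frequency-weighted acceleration and T the daily exposure time in hours. A minimal sketch; the acceleration and time values below are illustrative assumptions, not measurements from the study, though the 7% and 24% reductions and the 50% time increase are taken from the abstract:

```python
import math

def daily_exposure_a8(a_hv: float, hours: float) -> float:
    """ISO 5349-1 daily vibration exposure A(8), normalized to an 8-h day."""
    return a_hv * math.sqrt(hours / 8.0)

# Illustrative scenario: a 7% acceleration reduction (the laboratory result)
# combined with 50% more time-on-task (4 h -> 6 h) raises the daily exposure.
baseline = daily_exposure_a8(5.0, 4.0)          # 5.0 m/s^2 for 4 h
with_arm = daily_exposure_a8(5.0 * 0.93, 6.0)   # 7% lower acceleration, 6 h
print(baseline < with_arm)  # True: A(8) increases despite lower acceleration
```

With the larger 24% workplace reduction instead (a_hv of 5.0 * 0.76 over 6 h), A(8) falls below the baseline, which is consistent with the abstract's hedged conclusion that exposures "may actually increase ... in some cases".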
A case-study of implementation of improved strategies for prevention of laboratory-acquired Brucellosis
Castrodale LJ , Raczniak GA , Rudolph KM , Chikoyak L , Cox RS , Franklin TL , Traxler RM , Guerra M . Saf Health Work 2015 6 (4) 353-6 BACKGROUND: In 2012, the Alaska Section of Epidemiology investigated personnel potentially exposed to a Brucella suis isolate as it transited through three laboratories. METHODS: We summarize the first implementation of the United States Centers for Disease Control and Prevention 2013 revised recommendations for monitoring such exposures: (1) risk classification; (2) antimicrobial postexposure prophylaxis; (3) serologic monitoring; and (4) symptom surveillance. RESULTS: Over 30 people were assessed for exposure and subsequently monitored for development of illness. No cases of laboratory-associated brucellosis occurred. Changes were made to gaps in laboratory biosafety practices that had been identified in the investigation. CONCLUSION: Achieving full compliance for the precise schedule of serologic monitoring was challenging and resource intensive for the laboratory performing testing. More refined exposure assessments could inform decision making for follow-up to maximize likelihood of detecting persons at risk while not overtaxing resources. |
Observed and expected frequencies of structural hemoglobin variants in newborn screening surveys in Africa and the Middle East: deviations from Hardy-Weinberg equilibrium.
Piel FB , Adamkiewicz TV , Amendah D , Williams TN , Gupta S , Grosse SD . Genet Med 2015 18 (3) 265-74 PURPOSE: Our objective was to compare observed and expected genotype proportions from newborn screening surveys of structural hemoglobin variants. METHODS: We conducted a systematic review of newborn screening surveys of hemoglobins S and C in Africa and the Middle East. We compared observed frequencies to those expected assuming Hardy-Weinberg equilibrium (HWE). Significant deviations were identified by an exact test. The fixation index FIS was calculated to assess excess homozygosity. We compared newborn estimates corrected and uncorrected for HWE deviations using demographic data. RESULTS: Sixty samples reported genotype counts for hemoglobin variants in Africa and the Middle East. Observed and expected counts matched in 27% of samples. The observed number of sickle cell anemia (SCA) individuals was higher than expected in 42 samples, reaching significance (P < 0.05) in 24. High FIS values were common across the study regions. The estimated total number of newborns with SCA, corrected based on FIS, was 33,261 annual births instead of 24,958 for the 38 samples across sub-Saharan Africa and 1,109 annual births instead of 578 for 12 samples from the Middle East. CONCLUSION: Differences between observed and expected genotype frequencies are common in surveys of hemoglobin variants in the study regions. Further research is required to identify and quantify factors responsible for such deviations. Estimates based on HWE might substantially underestimate the annual number of SCA-affected newborns (up to one-third in sub-Saharan Africa and one-half in the Middle East). |
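The Hardy-Weinberg comparison and excess-homozygosity index used in the abstract above can be sketched from genotype counts; the counts below are hypothetical, chosen only to illustrate the calculation (FIS = 1 - H_obs/H_exp, positive when homozygotes are over-represented):

```python
def hwe_check(n_aa: int, n_as: int, n_ss: int):
    """Compare observed genotype counts to Hardy-Weinberg expectations.

    Returns the expected (AA, AS, SS) counts under HWE and the fixation
    index FIS = 1 - H_obs / H_exp; FIS > 0 indicates excess homozygosity.
    """
    n = n_aa + n_as + n_ss
    p = (2 * n_aa + n_as) / (2 * n)   # frequency of the A allele
    q = 1 - p                          # frequency of the S allele
    expected = (p * p * n, 2 * p * q * n, q * q * n)
    f_is = 1 - (n_as / n) / (2 * p * q)
    return expected, f_is

# Hypothetical survey of 1,000 newborns with more SS (sickle cell anemia)
# infants observed than HWE predicts:
expected, f_is = hwe_check(820, 160, 20)
print([round(x) for x in expected])  # [810, 180, 10]: HWE predicts 10 SS
print(round(f_is, 3))                # 0.111: positive, excess homozygosity
```

This mirrors the abstract's finding: where FIS is positive, an HWE-based projection (10 SS births here) undercounts the observed SCA burden (20 SS births).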
ADHD and psychiatric comorbidity: Functional outcomes in a school-based sample of children
Cuffe SP , Visser SN , Holbrook JR , Danielson ML , Geryk LL , Wolraich ML , McKeown RE . J Atten Disord 2015 24 (9) 1345-1354 OBJECTIVE: Investigate the prevalence and impact of psychiatric comorbidities in community-based samples of schoolchildren with/without ADHD. METHOD: Teachers and parents screened children in South Carolina (SC; n = 4,604) and Oklahoma (OK; n = 12,626) for ADHD. Parents of high-screen and selected low-screen children received diagnostic interviews (SC: n = 479; OK: n = 577). RESULTS: Psychiatric disorders were more prevalent among children with ADHD and were associated with low academic performance. Conduct disorder/oppositional defiant disorder (CD/ODD) were associated with grade retention (ODD/CD + ADHD: odds ratio [OR] = 3.0; confidence interval [CI] = [1.5, 5.9]; ODD/CD without ADHD: OR = 4.0; CI = [1.7, 9.7]). School discipline/police involvement was associated with ADHD alone (OR = 3.2; CI = [1.5, 6.8]), ADHD + CD/ODD (OR = 14.1, CI = [7.3, 27.1]), ADHD + anxiety/depression (OR = 4.8, CI = [1.6, 14.8]), and CD/ODD alone (OR = 2.8, CI = [1.2, 6.4]). Children with ADHD + anxiety/depression had a tenfold risk for poor academic performance (OR = 10.8; CI = [2.4, 49.1]) compared to children with ADHD alone. This should be interpreted with caution due to the wide confidence interval. CONCLUSION: Most children with ADHD have psychiatric comorbidities, which worsen functional outcomes. The pattern of outcomes varies by type of comorbidity. |
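The odds ratios and 95% confidence intervals reported in the abstract above come from the study's own models; the usual unadjusted construction on the log scale (the Woolf method) can be sketched as follows, with hypothetical 2x2 counts, not the study's data:

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Unadjusted odds ratio with a 95% CI (Woolf / log method).

    a, b: exposed with / without the outcome; c, d: unexposed with / without.
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: 20/100 exposed and 10/100 unexposed with the outcome.
or_, lo, hi = odds_ratio_ci(20, 80, 10, 90)
print(round(or_, 2))  # 2.25
print(lo < 1.0 < hi)  # True: this CI crosses 1, so it is not significant
```

The width of the interval is driven by the smallest cell count, which is why sparse strata such as the abstract's OR = 10.8 with CI [2.4, 49.1] warrant the cautious interpretation the authors give.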
Performance of an early infant diagnostic test, AmpliSens DNA-HIV-FRT, using dried blood spots collected from children born to human immunodeficiency virus-infected mothers in Ukraine
Chang J , Tarasova T , Shanmugam V , Azarskova M , Nguyen S , Hurlston M , Sabatier J , Zhang G , Osmanov S , Ellenberger D , Yang C , Vitek C , Liulchuk M , Nizova N . J Clin Microbiol 2015 53 (12) 3853-8 An accurate, accessible test for early infant diagnosis (EID) is crucial for identifying HIV-infected infants and linking them to treatment. To improve EID services in Ukraine, dried blood spot (DBS) samples obtained from 237 HIV-exposed children (≤18 months of age) in six regions in Ukraine in 2012 to 2013 were tested with the AmpliSens DNA-HIV-FRT assay, the Roche COBAS AmpliPrep/COBAS TaqMan (CAP/CTM) HIV-1 Qual test, and the Abbott RealTime HIV-1 Qualitative assay. In comparison with the paired whole-blood results generated from AmpliSens testing at the oblast HIV reference laboratories in Ukraine, the sensitivity was 0.99 (95% confidence interval [CI], 0.95 to 1.00) for the AmpliSens and Roche CAP/CTM Qual assays and 0.96 (95% CI, 0.90 to 0.98) for the Abbott Qualitative assay. The specificity was 1.00 (95% CI, 0.97 to 1.00) for the AmpliSens and Abbott Qualitative assays and 0.99 (95% CI, 0.96 to 1.00) for the Roche CAP/CTM Qual assay. McNemar analysis indicated that the proportions of positive results for the tests were not significantly different (P > 0.05). Cohen's kappa (0.97 to 0.99) indicated almost perfect agreement among the three tests. These results indicated that the AmpliSens DBS and whole-blood tests performed equally well and were comparable to the two commercially available EID tests. More importantly, the performance characteristics of the AmpliSens DBS test meet the World Health Organization EID test requirements; implementing AmpliSens DBS testing might improve EID services in resource-limited settings. |
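The agreement statistics in the abstract above (sensitivity and specificity against the whole-blood reference, and Cohen's kappa between assays) are standard 2x2-table quantities; a minimal sketch with hypothetical counts, not the study's data:

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int):
    """Sensitivity and specificity against a reference standard."""
    return tp / (tp + fn), tn / (tn + fp)

def cohens_kappa(both_pos: int, only1: int, only2: int, both_neg: int):
    """Cohen's kappa for agreement between two binary tests."""
    n = both_pos + only1 + only2 + both_neg
    p_obs = (both_pos + both_neg) / n
    # Chance agreement from each test's marginal positive/negative rates.
    p_exp = ((both_pos + only1) * (both_pos + only2)
             + (both_neg + only1) * (both_neg + only2)) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical results for 200 DBS samples:
sens, spec = sensitivity_specificity(tp=95, fn=1, tn=103, fp=1)
kappa = cohens_kappa(both_pos=95, only1=1, only2=1, both_neg=103)
print(round(sens, 2), round(spec, 2))  # 0.99 0.99
print(kappa > 0.97)  # True: "almost perfect" agreement, as in the study
```

Kappa discounts the agreement expected by chance alone, which is why it, rather than raw percent agreement, is the conventional statistic for comparing two diagnostic assays head to head.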
Population-based birth defects data in the United States, 2008 to 2012: Presentation of state-specific data and descriptive brief on variability of prevalence
Mai CT , Isenburg J , Langlois PH , Alverson CJ , Gilboa SM , Rickard R , Canfield MA , Anjohrin SB , Lupo PJ , Jackson DR , Stallings EB , Scheuerle AE , Kirby RS . Birth Defects Res A Clin Mol Teratol 2015 103 (11) 972-93 Major structural birth defects collectively affect 3 to 5% of births in the United States and contribute substantially to mortality and morbidity (CDC, 2008; TDSHS, 2015). Since 2000, the National Birth Defects Prevention Network (NBDPN) has annually published state-specific data for selected major birth defects affecting a range of organ systems, including central nervous, eye, ear, cardiovascular, orofacial, gastrointestinal, genitourinary, and musculoskeletal, as well as chromosomal and other conditions, such as amniotic bands. While the NBDPN list of birth defects had remained relatively unchanged for two decades, it was recently revised and released with the 2014 NBDPN Annual Report (Mai et al., 2014). Several factors necessitated an in-depth examination of the list of conditions: (1) development of national data quality standards for birth defects surveillance in the United States; (2) transition of the diagnostic coding system from the International Classification of Diseases, 9th Revision, Clinical Modification (ICD-9-CM) to ICD-10-CM; and (3) inclusion of newborn screening for critical congenital heart defects (CCHD), with 12 primary and secondary CCHD targets, on the national Recommended Uniform Screening Panel. The revision process included a review of each condition in relation to its public health importance, state of current knowledge, and clinical factors, such as accuracy of diagnosis within a child’s first year of life. Table 1 presents the revised list of birth defects and their diagnostic codes [ICD-9-CM and Centers for Disease Control and Prevention/British Pediatric Association Classification of Diseases (CDC/BPA)]. |
State-of-the-science: the evolution of occupational exposure limit derivation and application
Maier A , Lentz TJ , MacMahon KL , McKernan LT , Whittaker C , Schulte PA . J Occup Environ Hyg 2015 12 Suppl 1 S4-6 Occupational exposure limits (OELs) are a critical component of the risk assessment and risk management process and their use remains a staple of occupational hygiene practice. There are dozens of organizations and agencies that derive OELs worldwide. Yet, while most of these groups describe their administrative procedures as well as the rationale for the derivation of OELs for individual substances, few provide equally complete documentation of the underlying scientific methodology for conducting the quantitative risk assessment employed in OEL development. The paucity of written descriptions of OEL development methodology has resulted in a lack of transparency related to implementation of important scientific principles for OEL development and inconsistent practices for OEL development within and among organizations. The absence of such transparency limits the opportunities for international harmonization of existing values and OEL setting practices among organizations. Given these and other challenges, the National Institute for Occupational Safety and Health (NIOSH) began an effort to identify and characterize leading issues pertaining to OELs and their development through research which culminated in a collection of articles focused on each key issue. Those articles and the key issues they explore comprise this supplement of the Journal of Occupational and Environmental Hygiene. Utilizing subject matter expertise from researchers and thought leaders in the occupational hygiene profession and affiliated fields of environmental public health, the goal of this effort is to describe the issues related to education and communication of science principles and to understand how they can be incorporated into (and thereby impact) the practices of OEL development and interpretation. 
Focusing specifically on the state-of-the-science in the fields of exposure science, occupational hygiene, risk assessment, and toxicology this effort sought to provide a clear description of how advances in these research areas can contribute to the practice of OEL setting—by reviewing the methods used for most OELs that are currently available as well as new methods that are actively being incorporated in the OEL process. An essential topic included within the set of complementary and interrelated articles dedicated to this pursuit is the consideration and interpretation of OELs in the context of evolving risk management practices. The articles are intended to serve as a current critical review of occupational risk assessment methods that will enable occupational hygiene professionals to have a clear understanding of the science methods incorporated in the OELs they develop or use. A brief introduction to each article in this collection is provided in the following paragraphs. |
Exploring the state of health and safety management system performance measurement in mining organizations
Haas EJ , Yorio P . Saf Sci 2016 83 48-58 Complex arguments continue to be articulated regarding the theoretical foundation of health and safety management system (HSMS) performance measurement. The culmination of these efforts has begun to enhance a collective understanding. Despite this enhanced theoretical understanding, however, there are still continuing debates and little consensus. The goal of the current research effort was to empirically explore common methods of HSMS performance measurement in mining organizations. The purpose was to determine whether value and insight could be added to the ongoing discussion of the best ways to engage in health and safety performance measurement. Nine site-level health and safety management professionals were provided with 133 practices corresponding to 20 HSMS elements, each fitting into the plan, do, check, act phases common to most HSMS. Participants were asked to supply detailed information as to how they (1) assess the performance of each practice in their organization, or (2) would assess each practice if it were an identified strategic imperative. Qualitative content analysis indicated that the approximately 1200 responses provided could be described and categorized into interventions, organizational performance, and worker performance. A discussion of how these categories relate to existing indicator frameworks is provided. The analysis also revealed divergence in two important measurement issues: (1) quantitative vs. qualitative measurement and reporting; and (2) the primary use of objective or subjective metrics. In light of these findings we ultimately recommend a balanced measurement and reporting approach within the three metric categories and conclude with suggestions for future research. |
Design and development of a dust dispersion chamber to quantify the dispersibility of rock dust
Perera IE , Sapko MJ , Harris ML , Zlochower IA , Weiss ES . J Loss Prev Process Ind 2016 39 7-16 Dispersible rock dust must be applied to the surfaces of entries in underground coal mines in order to inert the coal dust entrained or made airborne during an explosion and prevent propagating explosions. 30 CFR 75.2 states that ". . . [rock dust particles] when wetted and dried will not cohere to form a cake which will not be dispersed into separate particles by a light blast of air . . ." However, a proper definition or quantification of "light blast of air" is not provided. The National Institute for Occupational Safety and Health (NIOSH) has, consequently, designed a dust dispersion chamber to conduct quantitative laboratory-scale dispersibility experiments as a screening tool for candidate rock dusts. A reproducible pulse of air is injected into the chamber and across a shallow tray of rock dust. The dust dispersed and carried downwind is monitored. The mass loss of the dust tray and the airborne dust measurements determine the relative dispersibility of the dust with respect to a reference rock dust. This report describes the design and the methodology to evaluate the relative dispersibility of rock dusts with and without anti-caking agents. Further, the results of this study indicate that the dispersibility of rock dusts varies with particle size, type of anti-caking agent used, and with the untapped bulk density. Untreated rock dusts, when wetted and dried, formed a cake that was much less dispersible than the reference rock dust used in supporting the 80% total incombustible content rule. |
Sampling and analysis method for measuring airborne coal dust mass in mixtures with limestone (rock) dust
Barone TL , Patts JR , Janisko SJ , Colinet JF , Patts LD , Beck TW , Mischler SE . J Occup Environ Hyg 2015 13 (4) 0 Airborne coal dust mass measurements in underground bituminous coal mines can be challenged by the presence of airborne limestone dust, which is an incombustible dust applied to prevent the propagation of dust explosions. To accurately measure the coal portion of this mixed airborne dust, the National Institute for Occupational Safety and Health (NIOSH) developed a sampling and analysis protocol that used a stainless steel cassette adapted with an isokinetic inlet and the low temperature ashing (LTA) analytical method. The Mine Safety and Health Administration (MSHA) routinely utilizes this LTA method to quantify the incombustible content of bulk dust samples collected from the roof, floor, and ribs of mining entries. The use of the stainless steel cassette with isokinetic inlet allowed NIOSH to adopt the LTA method for the analysis of airborne dust samples. Mixtures of known coal and limestone dust masses were prepared in the laboratory, loaded into the stainless steel cassettes, and analyzed to assess the accuracy of this method. Coal dust mass measurements differed from predicted values by an average of 0.5%, 0.2%, and 0.1% for samples containing 20%, 91%, and 95% limestone dust, respectively. The ability of this method to accurately quantify the laboratory samples confirmed the validity of this method and allowed NIOSH to successfully measure the coal fraction of airborne dust samples collected in an underground coal mine. |
An unusual presentation of neurocysticercosis: a space-occupying lesion in the fourth ventricle associated with progressive cognitive decline
Kurz C , Schmidt V , Poppert H , Wilkins P , Noh J , Poppert S , Schlegel J , Ertelt-Delbridge C , da Costa CP , Winkler AS . Am J Trop Med Hyg 2015 94 (1) 172-5 We report a case of a middle-aged Brazilian patient with an unusual presentation of fourth ventricular neurocysticercosis: the occurrence of two intraventricular cysts at different locations in the brain within 2 years, with cognitive decline as the only neurological symptom. Neurocysticercosis was confirmed by magnetic resonance imaging, serology, histology, and genetic analysis. Neurocysticercosis should be considered as a differential diagnosis in cases with atypical neurologic or psychiatric symptoms, atypical neuroimaging, and travel history. Fourth ventricular cysts in particular carry the risk of obstructive hydrocephalus and brainstem compression and should therefore be extirpated completely. If complete removal of the cystic structures cannot be proven in surgically treated cases of neurocysticercosis, anthelminthic therapy and thorough follow-up examinations should be conducted. |
Assessment of the safety of antimalarial drug use during early pregnancy (ASAP): protocol for a multicenter prospective cohort study in Burkina Faso, Kenya and Mozambique
Tinto H , Sevene E , Dellicour S , Calip GS , d'Alessandro U , Macete E , Nakanabo-Diallo S , Kazienga A , Valea I , Sorgho H , Vala A , Augusto O , Ruperez M , Menendez C , Ouma P , Desai M , Ter Kuile F , Stergachis A . Reprod Health 2015 12 (1) 112 BACKGROUND: A major unresolved safety concern for malaria case management is the use of artemisinin combination therapies (ACTs) in the first trimester of pregnancy. There is a need for human data to inform policy makers and treatment guidelines on the safety of ACTs when used during early pregnancy. METHODS: The overall goal of this paper is to describe the methods and implementation of a study aimed at developing surveillance systems for identifying exposures to antimalarials during early pregnancy and for monitoring pregnancy outcomes using health and demographic surveillance platforms. This was a multi-center prospective observational cohort study involving women at health and demographic surveillance sites in three countries in Africa: Burkina Faso, Kenya and Mozambique (ClinicalTrials.gov Identifier: NCT01232530). The study was designed to identify pregnant women with artemisinin exposure in the first trimester and compare them to: 1) pregnant women without malaria, 2) pregnant women treated for malaria but exposed to other antimalarials, and 3) pregnant women with malaria and treated with artemisinins in the 2nd or 3rd trimesters from the same settings. Pregnant women were recruited through community-based surveys and attendance at health facilities, including antenatal care clinics, and followed until delivery. Data from the three sites will be pooled for analysis at the end of the study. Results are forthcoming.
DISCUSSION: Despite some limitations, the methods described here are relevant to the development of sustainable pharmacovigilance systems for drugs used by pregnant women in the tropics, using health and demographic surveillance sites to prospectively ascertain drug safety in early pregnancy. TRIAL REGISTRATION: NCT01232530. |
Size and characteristics of the biomedical research workforce associated with U.S. National Institutes of Health extramural grants
Pool LR , Wagner RM , Scott LL , RoyChowdhury D , Berhane R , Wu C , Pearson K , Sutton JA , Schaffer WT . FASEB J 2015 30 (3) 1023-36 The U.S. National Institutes of Health (NIH) annually invests approximately $22 billion in biomedical research through its extramural grant programs. Since fiscal year (FY) 2010, all persons involved in research during the previous project year have been required to be listed on the annual grant progress report. These new data have enabled the production of the first-ever census of the NIH-funded extramural research workforce. Data were extracted from All Personnel Reports submitted for NIH grants funded in FY 2009, including position title, months of effort, academic degrees obtained, and personal identifiers. Data were de-duplicated to determine a unique person count. Person-years of effort (PYE) on NIH grants were computed. In FY 2009, NIH funded 50,885 grant projects, which created 313,049 full- and part-time positions spanning all job functions involved in biomedical research. These positions were staffed by 247,457 people at 2,604 institutions. These persons devoted 121,465 PYE to NIH grant-supported research. Research project grants each supported 6 full- or part-time positions, on average. Over 20% of positions were occupied by postdoctoral researchers and graduate and undergraduate students. These baseline data were used to project workforce estimates for FYs 2010-2014 and will serve as a foundation for future research. |
First trimester pregnancy loss after fresh and frozen in vitro fertilization cycles
Hipp H , Crawford S , Kawwass JF , Chang J , Kissin DM , Jamieson DJ . Fertil Steril 2015 105 (3) 722-728 OBJECTIVE: To characterize risks for early pregnancy loss after fresh and frozen IVF cycles and to investigate whether risk is modified by infertility diagnoses or transfer of embryos in fresh versus frozen cycles. DESIGN: Retrospective cohort study using data from the National Assisted Reproductive Technology (ART) Surveillance System. SETTING: U.S. fertility centers. PATIENT(S): Clinical pregnancies achieved with fresh and frozen IVF cycles between 2007 and 2012 (N = 249,630). INTERVENTION(S): None. MAIN OUTCOME MEASURE(S): First trimester pregnancy loss. RESULT(S): A diagnosis of uterine factor was associated with an increased risk of loss in women aged 40 years and younger (<30 years: adjusted risk ratio (aRR) = 1.24, 95% confidence interval (CI) 1.04-1.48; 30-34 years: aRR = 1.27, 95% CI 1.17-1.38; 35-37 years: aRR = 1.12, 95% CI 1.03-1.21; 38-40 years: aRR = 1.08, 95% CI 1.01-1.17). There was an increased risk of loss in women with diminished ovarian reserve aged 30-34 years (aRR = 1.08, 95% CI 1.01-1.15) and in women with ovulatory dysfunction younger than 35 years (<30 years: aRR = 1.12, 95% CI 1.05-1.19; 30-34 years: aRR = 1.07, 95% CI 1.02-1.13). There was an increased risk of loss after frozen embryo transfers versus fresh among women younger than 38 years, but in the subanalysis restricted to embryos of similar quality, this risk remained significant only in women younger than 30 years (aRR = 1.16, 95% CI 1.04-1.32). CONCLUSION(S): Uterine factor carried the largest increased risk of loss among infertility diagnoses, although the magnitudes of all risks were small. When transferring embryos of similar quality, the risks of loss were similar between fresh and frozen cycles. |
Abortion surveillance - United States, 2012
Pazol K , Creanga AA , Jamieson DJ . MMWR Surveill Summ 2015 64 (10) 1-40 PROBLEM/CONDITION: Since 1969, CDC has conducted abortion surveillance to document the number and characteristics of women obtaining legal induced abortions in the United States. REPORTING PERIOD COVERED: 2012. DESCRIPTION OF SYSTEM: Each year, CDC requests abortion data from the central health agencies of 52 reporting areas (the 50 states, the District of Columbia, and New York City). The reporting areas provide this information voluntarily. For 2012, data were received from 49 reporting areas. For trend analysis, abortion data were evaluated from 47 areas that reported data every year during 2003-2012. Census and natality data, respectively, were used to calculate abortion rates (number of abortions per 1,000 women) and ratios (number of abortions per 1,000 live births). RESULTS: A total of 699,202 abortions were reported to CDC for 2012. Of these abortions, 98.4% were from the 47 reporting areas that provided data every year during 2003-2012. Among these same 47 reporting areas, the abortion rate for 2012 was 13.2 abortions per 1,000 women aged 15-44 years, and the abortion ratio was 210 abortions per 1,000 live births. From 2011 to 2012, the total number and ratio of reported abortions decreased 4% and the abortion rate decreased 5%. From 2003 to 2012, the total number, rate, and ratio of reported abortions decreased 17%, 18%, and 14%, respectively, and reached their lowest level in 2012 for the entire period of analysis (2003-2012). In 2012 and throughout the period of analysis, women in their 20s accounted for the majority of abortions and had the highest abortion rates; women in their 30s and older accounted for a much smaller percentage of abortions and had lower abortion rates. 
In 2012, women aged 20-24 and 25-29 years accounted for 32.8% and 25.4% of all abortions, respectively, and had abortion rates of 23.3 and 18.9 abortions per 1,000 women aged 20-24 and 25-29 years, respectively. In contrast, women aged 30-34, 35-39, and ≥40 years accounted for 16.4%, 9.1%, and 3.7% of all abortions, respectively, and had abortion rates of 12.4, 7.3, and 2.8 abortions per 1,000 women aged 30-34 years, 35-39 years, and ≥40 years, respectively. Throughout the period of analysis, abortion rates decreased among women aged 20-24, 25-29, and 30-34 years by 24%, 18%, and 10%, respectively, whereas they increased among women aged ≥40 years by 8%. In 2012, adolescents aged <15 and 15-19 years accounted for 0.4% and 12.2% of all abortions, respectively, and had abortion rates of 0.8 and 9.2 abortions per 1,000 adolescents aged <15 and 15-19 years, respectively. From 2003 to 2012, the percentage of abortions accounted for by adolescents aged 15-19 years decreased 27% and their abortion rate decreased 40%. These decreases were greater than the decreases for women in any older age group. In contrast to the percentage distribution of abortions and abortion rates by age, abortion ratios in 2012 and throughout the entire period of analysis were highest among adolescents aged ≤19 years and lowest among women aged 30-39 years. Abortion ratios decreased from 2003 to 2012 for women in all age groups. In 2012, the majority (65.8%) of abortions were performed by ≤8 weeks' gestation, and nearly all (91.4%) were performed by ≤13 weeks' gestation. Few abortions were performed at 14-20 weeks' gestation (7.2%) or at ≥21 weeks' gestation (1.3%). From 2003 to 2012, the percentage of all abortions performed at ≤8 weeks' gestation increased 7%; the percentage performed at >13 weeks' gestation remained consistently low (≤9.0%). 
In 2012, among the 40 reporting areas that included medical (nonsurgical) abortion on their reporting form, a total of 69.4% of abortions were performed by curettage at ≤13 weeks' gestation, 20.8% were performed by early medical abortion (a nonsurgical abortion at ≤8 weeks' gestation), and 8.7% were performed by curettage at >13 weeks' gestation; all other methods were uncommon. Among abortions performed at ≤8 weeks' gestation that were eligible on the basis of gestational age for early medical abortion, 30.8% were completed by this method. The percentage of abortions reported as early medical abortions increased 10% from 2011 to 2012. Deaths of women associated with complications from abortions for 2012 are being investigated as part of CDC's Pregnancy Mortality Surveillance System. In 2011, the most recent year for which data were available, two women were identified to have died as a result of complications from known legal induced abortions. No reported deaths were associated with known illegal induced abortions. INTERPRETATION: Among the 47 areas that reported data every year during 2003-2012, the notable decreases that occurred during 2008-2011 in the total number, rate, and ratio of reported abortions continued from 2011 to 2012 and resulted in historic lows for all three measures of abortion. PUBLIC HEALTH ACTIONS: The data in this report can help to identify groups of women at greatest risk for abortion and can be used to guide and evaluate prevention efforts. Because unintended pregnancy is the major contributor to abortion, and unintended pregnancies are rare among women who use the most effective methods of contraception, increasing access to and use of these methods can help further reduce the number of unintended pregnancies, and therefore abortions, performed in the United States. |
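The rate and ratio measures used throughout the surveillance summaries above are simple per-1,000 normalizations of census and natality denominators. As a minimal sketch of the arithmetic (the counts below are hypothetical, chosen only to reproduce values of the same magnitude as the 2012 figures, not actual CDC denominators):

```python
def abortion_rate(abortions: int, women_15_44: int) -> float:
    """Abortions per 1,000 women aged 15-44 years (denominator from census data)."""
    return 1000 * abortions / women_15_44

def abortion_ratio(abortions: int, live_births: int) -> float:
    """Abortions per 1,000 live births (denominator from natality data)."""
    return 1000 * abortions / live_births

# Hypothetical counts for a single reporting area (illustrative only):
abortions, women, births = 13_200, 1_000_000, 62_857
print(round(abortion_rate(abortions, women), 1))  # → 13.2
print(round(abortion_ratio(abortions, births)))   # → 210
```

Note that the two measures can move independently: a falling birth rate raises the ratio even when the rate is unchanged, which is why the report tracks number, rate, and ratio separately.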
Assisted reproductive technology surveillance - United States, 2013
Sunderam S , Kissin DM , Crawford SB , Folger SG , Jamieson DJ , Warner L , Barfield WD . MMWR Surveill Summ 2015 64 (11) 1-25 PROBLEM/CONDITION: Since the first U.S. infant conceived with assisted reproductive technology (ART) was born in 1981, both the use of ART and the number of fertility clinics providing ART services have increased steadily in the United States. ART includes fertility treatments in which eggs or embryos are handled in the laboratory (i.e., in vitro fertilization [IVF] and related procedures). Women who undergo ART procedures are more likely than women who conceive naturally to deliver multiple-birth infants. Multiple births pose substantial risks to both mothers and infants, including obstetric complications, preterm delivery, and low birthweight infants. This report provides state-specific information for the United States (including Puerto Rico) on ART procedures performed in 2013 and compares infant outcomes that occurred in 2013 (resulting from ART procedures performed in 2012 and 2013) with outcomes for all infants born in the United States in 2013. REPORTING PERIOD COVERED: 2013. DESCRIPTION OF SYSTEM: In 1996, CDC began collecting data on ART procedures performed in fertility clinics in the United States as mandated by the Fertility Clinic Success Rate and Certification Act of 1992 (FCSRCA) (Public Law 102-493). Data are collected through the National ART Surveillance System (NASS), a web-based data collection system developed by CDC. This report includes data from 52 reporting areas (the 50 states, the District of Columbia [DC], and Puerto Rico). RESULTS: In 2013, a total of 160,521 ART procedures (range: 109 in Wyoming to 20,299 in California) with the intent to transfer at least one embryo were performed in 467 U.S. fertility clinics and were reported to CDC. These procedures resulted in 53,252 live-birth deliveries (range: 47 in Alaska to 6,979 in California) and 66,691 infants (range: 61 in Alaska to 8,649 in California). 
Nationally, the total number of ART procedures performed per million women of reproductive age (15-44 years), a proxy measure of the ART usage rate, was 2,521 (range: 352 in Puerto Rico to 7,688 in DC). ART use exceeded the national rate in 13 reporting areas (California, Connecticut, Delaware, Hawaii, Illinois, Maryland, Massachusetts, New Hampshire, New Jersey, New York, Rhode Island, Virginia, and DC). Nationally, among ART transfer procedures in patients using fresh embryos from their own eggs, the average number of embryos transferred increased with increasing age of the woman (1.8 among women aged <35 years, 2.0 among women aged 35-37 years, and 2.5 among women aged >37 years). Among women aged <35 years, who typically are considered to be good candidates for elective single embryo transfer (eSET) procedures, the national eSET rate was 21.4% (range: 4.0% in Idaho to 77.5% in Delaware). In 2013, ART contributed to 1.6% of all infants born in the United States (range: 0.2% in Puerto Rico to 4.8% in Massachusetts) and 18.7% of all multiple-birth infants (range: 4.5% in Puerto Rico to 35.7% in Massachusetts), including 18.5% of all twin infants (range: 4.5% in Mississippi to 35.3% in Massachusetts) and 25.2% of all triplet and higher-order infants (range: 0% in several reporting areas to 51.5% in New Jersey). Multiple-birth deliveries were higher among infants conceived with ART (41.1%; range: 20.4% in Delaware to 61.6% in Wyoming) than among all infants born in the total birth population (only 3.5%; range: 1.8% in Puerto Rico to 4.5% in Massachusetts and New Jersey). Approximately 39% of ART-conceived infants were twin infants, and 2% were triplet and higher-order infants. ART-conceived twins accounted for approximately 95.4% of all ART-conceived infants born in multiple deliveries. Nationally, infants conceived with ART contributed to 5.8% of all low birthweight (<2,500 grams) infants (range: 0.9% in Puerto Rico to 15.1% in Massachusetts). 
Among ART-conceived infants, 29.1% were low birthweight (range: 18.3% in Delaware to 42.6% in Louisiana), compared with 8.0% among all infants (range: 5.8% in Alaska to 11.5% in Mississippi). ART-conceived infants contributed to 4.6% of all preterm (<37 weeks) infants (range: 0.6% in Puerto Rico to 13.3% in Massachusetts). Preterm birth rates were higher among infants conceived with ART (33.6%; range: 22.3% in DC to 50.7% in Louisiana) than among all infants born in the total birth population (11.4%; range: 8.8% in California to 16.6% in Mississippi). The percentage of ART-conceived infants who were low birthweight was 9.0% (range: 5.1% in Mississippi to 19.7% in Puerto Rico) among singletons and 56.3% (range: 48.3% in Maine to 72.4% in Puerto Rico) among twins; the corresponding percentages among all infants born were 6.3% for singletons (range: 4.6% in Alaska to 9.6% in Mississippi and Puerto Rico) and 55.3% for twins (range: 43.6% in Alaska to 65.6% in Mississippi). The percentage of ART-conceived infants who were preterm varied from 13.3% (range: 8.7% in Rhode Island to 26.9% in West Virginia) among singletons to 61.0% (range: 47.8% in DC to 78.8% in Oklahoma) among twins; the corresponding percentages among all infants were 10.1% for singletons (range: 6.8% in Vermont to 14.8% in Mississippi) and 56.6% for twins (range: 44.7% in New Hampshire to 68.9% in Louisiana). INTERPRETATION: The percentage of infants conceived with ART varied considerably by reporting area. In most reporting areas, multiple births from ART contributed to a substantial proportion of all twins, triplets, and higher-order infants born, and the low birthweight and preterm infant birth rates were disproportionately higher among ART-conceived infants than among the overall birth population. 
Although women aged <35 years are typically considered good candidates for eSET, on average two embryos were transferred per ART procedure among women in this group, increasing the overall multiple-birth rates in the United States. Compared with ART-conceived singletons, ART-conceived twins were approximately four-and-a-half times more likely to be born preterm, and approximately six times more likely to be born with low birthweight. Singleton infants conceived with ART had slightly higher rates of preterm delivery and low birthweight than all singleton infants born in the United States. ART use per population unit was geographically variable, with 13 reporting areas showing ART use above the national rate. Of the four states (Illinois, Massachusetts, New Jersey, and Rhode Island) with comprehensive statewide-mandated health insurance coverage for ART procedures (i.e., coverage for at least four cycles of IVF), two states (Massachusetts and New Jersey) had rates of ART use exceeding twice the national level. This type of mandated insurance has been associated with greater use of ART and likely accounts for some of the difference in per capita ART use observed among states. PUBLIC HEALTH ACTIONS: Reducing the number of embryos transferred per ART procedure and increasing use of eSET, when clinically appropriate (typically for women aged <35 years), could help reduce multiple births, particularly ART-conceived twin infants, and related adverse consequences of ART. Because twins account for the majority of ART-conceived multiple births, improved patient education and counseling on the maternal and infant health risks of having twins are needed. Although ART contributes to high rates of multiple births, other factors not investigated in this report (e.g., delayed childbearing and non-ART fertility treatments) also contribute to multiple births and warrant further study. |
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Disease Reservoirs and Vectors
- Environmental Health
- Food Safety
- Genetics and Genomics
- Global Health
- Health Behavior and Risk
- Health Economics
- Healthcare Associated Infections
- Immunity and Immunization
- Informatics
- Injury and Violence
- Laboratory Sciences
- Maternal and Child Health
- Occupational Safety and Health
- Occupational Safety and Health - Mining
- Parasitic Diseases
- Public Health Leadership and Management
- Reproductive Health
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed: Feb 1, 2024
- Page last updated: Apr 29, 2024