Trends in mortality from COPD among adults in the United States
Ford ES . Chest 2015 148 (4) 962-70 BACKGROUND: COPD imposes a large public health burden internationally and in the United States. The objective of this study was to examine trends in mortality from COPD among US adults from 1968 to 2011. METHODS: Data from the National Vital Statistics System from 1968 to 2011 for adults aged ≥ 25 years were accessed, and trends in mortality rates were examined with Joinpoint analysis. RESULTS: Among all adults, the age-adjusted mortality rate rose from 29.4 per 100,000 population in 1968 to 67.0 per 100,000 population in 1999 and then declined to 63.7 per 100,000 population in 2011 (annual percentage change [APC] 2000-2011, -0.2%; 95% CI, -0.6 to 0.2). The age-adjusted mortality rate among men peaked in 1999 and then declined (APC 1999-2011, -1.1%; 95% CI, -1.4 to -0.7), whereas the age-adjusted mortality rate among women increased from 2000 to 2011, peaking in 2008 (APC 2000-2011, 0.4%; 95% CI, 0.0-0.9). Despite a narrowing of the sex gap, mortality rates in men continued to exceed those in women. Evidence of a decline in the APC was noted for black men (1999-2011, -1.5%; 95% CI, -2.1 to -1.0) and white men (1999-2011, -0.9%; 95% CI, -1.3 to -0.6), adults aged 55 to 64 years (1989-2011, -1.0%; 95% CI, -1.2 to -0.8), and adults aged 65 to 74 years (1999-2011, -1.2%; 95% CI, -1.6 to -0.9). CONCLUSIONS: In the United States, the mortality rate from COPD has declined since 1999 in men and some age groups but appears to be still rising in women, albeit at a reduced pace. |
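The annual percentage change (APC) reported above comes from a log-linear model of rate versus year. A minimal sketch of that calculation, using a hypothetical rate series (the paper itself used NCI's Joinpoint software, which additionally locates trend change-points):

```python
import math

def annual_percent_change(rates):
    """Estimate APC from yearly rates via an ordinary least-squares fit of
    log(rate) on year: APC = (exp(slope) - 1) * 100.
    Illustrative only; Joinpoint also searches for change-points."""
    years = list(range(len(rates)))
    logs = [math.log(r) for r in rates]
    n = len(rates)
    mean_x = sum(years) / n
    mean_y = sum(logs) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, logs)) / \
            sum((x - mean_x) ** 2 for x in years)
    return (math.exp(slope) - 1) * 100

# A hypothetical series declining 1.1% per year recovers that APC exactly:
rates = [67.0 * 0.989 ** t for t in range(12)]
print(round(annual_percent_change(rates), 1))  # -1.1
```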
Are the recent secular increases in waist circumference among children and adolescents independent of changes in BMI?
Freedman DS , Kit BK , Ford ES . PLoS One 2015 10 (10) e0141056 BACKGROUND: Several studies have shown that the waist circumference of children and adolescents has increased over the last 25 years. However, given the strong correlation between waist circumference and BMI, it is uncertain if the secular trends in waist circumference are independent of those in BMI. METHODS: We analyzed data from 6- to 19-year-olds who participated in the 1988-1994 through 2011-2012 cycles of the National Health and Nutrition Examination Survey to assess whether the trends in waist circumference were independent of changes in BMI, race-ethnicity and age. RESULTS: Mean, unadjusted levels of waist circumference increased by 3.7 cm (boys) and 6.0 cm (girls) from 1988-94 through 2011-12, while mean BMI levels increased by 1.1 kg/m2 (boys) and 1.6 kg/m2 (girls). Overall, the proportional changes in mean levels of both waist circumference and BMI were fairly similar among boys (5.3%, waist vs. 5.6%, BMI) and girls (8.7%, waist vs. 7.7%, BMI). As assessed by the area under the curve, adjustment for BMI reduced the secular increases in waist circumference by about 75% (boys) and 50% (girls) beyond that attributable to age and race-ethnicity. There was also a race-ethnicity interaction (p < 0.001). Adjustment for BMI reduced the secular trend in waist circumference among non-Hispanic (NH) black children (boys and girls) to a greater extent (about 90%) than among other children. CONCLUSIONS: Our results indicate that among children in the U.S., about 75% (boys) and 50% (girls) of the secular increases in waist circumference since 1988-94 can be accounted for by changes in BMI. The reasons for the larger independent effects among girls and among NH blacks are uncertain. |
Epidemiologic, Virologic, and Host Genetic Factors of Norovirus Outbreaks in Long-term Care Facilities.
Costantini VP , Cooper EM , Hardaker HL , Lee LE , Bierhoff M , Biggs C , Cieslak PR , Hall AJ , Vinje J . Clin Infect Dis 2015 62 (1) 1-10 BACKGROUND: In the United States, long-term care facilities (LTCFs) are the most common setting for norovirus outbreaks. These outbreaks provide a unique opportunity to better characterize the viral and host characteristics of norovirus disease. METHODS: We enrolled 43 LTCFs prospectively to study the epidemiology, virology, and genetic host factors of naturally occurring norovirus outbreaks. Acute and convalescent stool, serum, and saliva samples were collected from cases and from exposed and nonexposed controls. Norovirus infection was confirmed using quantitative polymerase chain reaction testing of stool samples or a 4-fold increase in serum antibody titers. The presence of histo-blood group antigens (secretor, ABO, and Lewis type) was determined in saliva. RESULTS: Sixty-two cases, 34 exposed controls, and 18 nonexposed controls from 10 norovirus outbreaks were enrolled. Forty-six percent of acute case, 27% of convalescent case, and 11% of control stool samples tested norovirus positive. Outbreak genotypes were GII.4 (Den Haag, n = 3; New Orleans, n = 4; and Sydney, n = 2) and GI.1 (n = 1). Viral load in GII.4 Sydney outbreaks was significantly higher than in outbreaks caused by other genotypes; cases and controls shed similar amounts of virus. Forty-seven percent of cases shed virus for ≥21 days. Symptomatic infections with GII.4 Den Haag and GII.4 New Orleans were detected among nonsecretor individuals. CONCLUSIONS: Almost half of all symptomatic individuals shed virus for at least 21 days. Viral load was highest in GII.4 viruses that most recently emerged; these viruses also infect the nonsecretor population. These findings will help to guide development of targeted prevention and control measures in the elderly. |
Modelling and estimation of HIV prevalence and number of people living with HIV in India, 2010-2011
Raj Y , Sahu D , Pandey A , Venkatesh S , Reddy D , Bakkali T , Das C , Singh KJ , Kant S , Bhattacharya M , Stover J , Jha UM , Kumar P , Mishra RM , Chandra N , Gulati BK , Mathur S , Joshi D , Chavan L . Int J STD AIDS 2015 27 (14) 1257-1266 This paper provides the HIV estimation methodology used in India and key HIV estimates for 2010-2011. We used a modified version of the Spectrum tool that included the Estimation and Projection Package as part of its AIDS Impact Module. Inputs related to population size, age-specific pattern of fertility, sex ratio at birth, age- and sex-specific pattern of mortality, and volume and age-sex distribution of net migration were derived from census records, the Sample Registration System, and large-scale demographic health surveys. Epidemiological and programmatic data were derived from HIV sentinel surveillance, large-scale epidemiological surveys, and the programme management information system. Estimated adult HIV prevalence continued its declining trend in India following its peak in 2002 at 0.41% (bounds 0.35-0.47%). By 2010 and 2011, it levelled at estimates of 0.28% (0.24-0.34%) and 0.27% (0.22-0.33%), respectively. The estimated number of people living with HIV (PLHIV) declined by 8% between 2007 and 2011. While children accounted for approximately 6.3% of total HIV infections in 2007, this proportion increased to about 7% in 2011. With changing priorities and epidemic patterns, the programme has to customise its strategies to effectively address the emerging vulnerabilities and adapt them to suit the requirements of different geographical regions. |
Multidrug-resistant tuberculosis treatment outcomes in relation to treatment, initial and acquired second-line drug resistance
Cegielski JP , Kurbatova E , van der Walt M , Brand J , Ershova J , Tupasi T , Campos Caoili J , Dalton T , Contreras C , Yagui M , Bayona J , Kvasnovsky C , Leimane V , Kuksa L , Chen MP , Via LE , Hwang SH , Wolfgang M , Volchenkov GV , Somova T , Smith SE , Akksilp S , Wattanaamornkiet W , Kim HJ , Kim CK , Kazennyy BY , Khorosheva T , Kliiman K , Viiklepp P , Jou R , Huang AS , Vasilyeva IA , Demikhova OV . Clin Infect Dis 2015 62 (4) 418-430 BACKGROUND: Resistance to second-line drugs (SLD) develops during treatment of multidrug-resistant (MDR) tuberculosis (TB), but the impact on treatment outcome has not been determined. OBJECTIVES: To determine the relationship with treatment outcomes of (1) initial versus acquired drug resistance and (2) treatment regimens. METHODS: MDR-TB patients starting SLD treatment were enrolled in a prospective cohort study. Sputum cultures were analyzed at a central reference laboratory. We compared subjects with successful and poor treatment outcomes in terms of (1) initial and acquired resistance to fluoroquinolones and second-line injectables (SLI), and (2) treatment regimens. RESULTS: Of 1,244 MDR-TB patients, 973 (78.2%) had known outcomes and 232 (18.6%) were lost to follow-up. Among those with known outcomes, treatment succeeded in 85.8% with plain MDR-TB, 69.7% with initial resistance to either a fluoroquinolone or SLI, 37.5% with acquired resistance to a fluoroquinolone or SLI, 29.3% with initial XDR-TB, and 13.0% with acquired XDR-TB (P<0.0001 for trend). In contrast, among those with known outcomes, treatment success increased stepwise from 41.6% to 92.3% as the number of proven effective drugs increased from ≤1 to ≥5 (P<0.0001 for trend), while acquired drug resistance decreased from the 12%-16% range, depending on the drug, down to 0%-2% (P<0.0001 for trend). 
In multivariable analysis, the adjusted odds of treatment success decreased 0.62-fold (95% CI, 0.56-0.69) for each increment in drug resistance and increased 2.1-fold (95% CI, 1.40-3.18) for each additional effective drug, controlling for differences between programs and patients. Specific treatment, patient, and program variables were also associated with treatment outcome. CONCLUSION: Increasing drug resistance was associated in a logical stepwise manner with poor treatment outcomes. Acquired resistance was worse than initial resistance to the same drugs. Increasing numbers of effective drugs, specific drugs, and specific program characteristics were associated with better outcomes and less acquired resistance. |
Mycotic infections acquired outside areas of known endemicity, United States
Benedict K , Thompson GR 3rd , Deresinski S , Chiller T . Emerg Infect Dis 2015 21 (11) 1935-41 In the United States, endemic mycoses-blastomycosis, coccidioidomycosis, and histoplasmosis-pose considerable clinical and public health challenges. Although the causative fungi typically exist within broadly defined geographic areas or ecologic niches, some evidence suggests that cases have occurred in humans and animals not exposed to these areas. We describe cases acquired outside regions of traditionally defined endemicity. These patients often have severe disease, but diagnosis may be delayed because of a low index of suspicion for mycotic disease, and many more cases probably go entirely undetected. Increased awareness of these diseases, with a specific focus on their potential occurrence in unusual areas, is needed. Continued interdisciplinary efforts to reevaluate and better describe areas of true endemicity are warranted, along with a more nuanced view of the notion of endemicity. The term "nonendemic" should be used with care; mycoses in such regions might more accurately be considered "not known to be endemic." |
Potential exposure to Ebola virus from body fluids due to ambulance compartment permeability in Sierra Leone
Casey ML , Nguyen DT , Idriss B , Bennett S , Dunn A , Martin S . Prehosp Disaster Med 2015 30 (6) 1-3 INTRODUCTION: Prehospital care, including patient transport, is integral in the patient care process during the Ebola response. Transporting ill persons from the community to Ebola care facilities can stop community spread. Vehicles used for patient transport in infectious disease outbreaks should be evaluated for adequate infection prevention and control. PROBLEM: An ambulance driver in Sierra Leone attributed his Ebola infection to exposure to body fluids that leaked from the patient compartment to the driver cabin of the ambulance. METHODS: A convenience sample of 14 vehicles used to transport patients with suspected or confirmed Ebola in Sierra Leone was assessed. The walls separating the patient compartment and driver cabin in these vehicles were evaluated for structural integrity and potential pathways for body fluid leakage. Ambulance drivers and other staff were asked to describe their cleaning and decontamination practices. Ambulance construction and design standards from the National Fire Protection Association, US General Services Administration, and European Committee for Standardization (CEN) were reviewed. RESULTS: Many vehicles used by ambulance staff in Sierra Leone were not traditional ambulances but were pick-up trucks or sport-utility vehicles that had been assembled or modified for patient transport. The wall separating the patient compartment and driver cabin in many vehicles did not have a waterproof seal around the edges. Staff responsible for cleaning and disinfection did not thoroughly clean bulk body fluids with disposable towels before disinfection of the patient compartment. Pressure from chlorine sprayers used in the decontamination process may have pushed body fluids from the patient compartment into the driver cabin through gaps around the wall. 
Ambulance design standards do not require a waterproof seal between the patient compartment and driver cabin. Sealing the wall by tightening or replacing existing bolts is recommended, followed by caulking of all seams with a sealant. CONCLUSION: Waterproof separation between the patient compartment and driver cabin may be essential for patient transport vehicles in infectious disease outbreaks, especially when chlorine sprayers are used for decontamination or in resource-limited settings where cleaning supplies may be limited. |
Ebola in West Africa - CDC's role in epidemic detection, control, and prevention
Frieden TR , Damon IK . Emerg Infect Dis 2015 21 (11) 1897-905 Since Ebola virus disease was identified in West Africa on March 23, 2014, the Centers for Disease Control and Prevention (CDC) has undertaken the most intensive response in the agency's history; >3,000 staff have been involved, including >1,200 deployed to West Africa for >50,000 person workdays. Efforts have included supporting incident management systems in affected countries; mobilizing partners; and strengthening laboratory, epidemiology, contact investigation, health care infection control, communication, and border screening in West Africa, Nigeria, Mali, Senegal, and the United States. All efforts were undertaken as part of national and global response activities with many partner organizations. CDC was able to support community, national, and international health and public health staff to prevent an even worse event. The Ebola virus disease epidemic highlights the need to strengthen national and international systems to detect, respond to, and prevent the spread of future health threats. |
Epidemiology of acute respiratory infections in children - preliminary results of a cohort in a rural north Indian community
Krishnan A , Amarchand R , Gupta V , Lafond KE , Suliankatchi RA , Saha S , Rai S , Misra P , Purakayastha DR , Wahi A , Sreenivas V , Kapil A , Dawood F , Pandav CS , Broor S , Kapoor SK , Lal R , Widdowson MA . BMC Infect Dis 2015 15 462 BACKGROUND: Despite acute respiratory infections being a major cause of death among children in developing countries including India, there is a lack of community-based studies that document their burden and aetiology. METHODS: A dynamic cohort of children aged 0-10 years was established in four villages in the north Indian state of Haryana from August 2012 onwards. Trained health workers conducted weekly home visits to screen children for acute respiratory infection (ARI), defined as one of the following: cough, sore throat, nasal congestion, earache/discharge, or breathing difficulty. Nurses clinically assessed these children to grade disease severity based on standard age-specific guidelines into acute upper or lower respiratory infection (AURI or ALRI) and collected nasal/throat swabs for pathogen testing. RESULTS: Our first-year results show that ARI incidence among children aged 0-10 years was 5.9 (5.8-6.0) episodes per child-year, with minimal gender difference. The ALRI incidence in the under-five age group was higher among boys (0.43; 0.39-0.49) than among girls (0.31; 0.26-0.35) per child-year. Boys had a 2.4 times higher ARI-related hospitalization rate than girls. CONCLUSION: ARIs impose a significant burden on the children of this cohort. This study platform aims to provide better evidence for prevention and control of pneumonia in developing countries. |
Epidemiology of primary multidrug-resistant tuberculosis, Vladimir Region, Russia
Ershova JV , Volchenkov GV , Kaminski DA , Somova TR , Kuznetsova TA , Kaunetis NV , Cegielski JP , Kurbatova EV . Emerg Infect Dis 2015 21 (11) 2048-51 We studied the epidemiology of drug-resistant tuberculosis (TB) in Vladimir Region, Russia, in 2012. Most cases of multidrug-resistant TB (MDR TB) were caused by transmission of drug-resistant strains, and >33% were in patients referred for testing after mass radiographic screening. Early diagnosis of drug resistance is essential for preventing transmission of MDR TB. |
Incidence of pneumococcal pneumonia among adults in rural Thailand, 2006-2011: implications for pneumococcal vaccine considerations
Piralam B , Tomczyk SM , Rhodes JC , Thamthitiwat S , Gregory CJ , Olsen SJ , Praphasiri P , Sawatwong P , Naorat S , Chantra S , Areerat P , Hurst CP , Moore MR , Muangchana C , Baggett HC . Am J Trop Med Hyg 2015 93 (6) 1140-1147 The incidence of pneumococcal pneumonia among adults is a key driver for the cost-effectiveness of pneumococcal conjugate vaccine used among children. We sought to obtain more accurate incidence estimates among adults by including results of pneumococcal urine antigen testing (UAT) from population-based pneumonia surveillance in two Thai provinces. Active surveillance from 2006 to 2011 identified acute lower respiratory infection (ALRI)-related hospital admissions. Adult cases of pneumococcal pneumonia were defined as hospitalized ALRI patients aged ≥ 18 years with isolation of Streptococcus pneumoniae from blood or with positive UAT. Among 39,525 adult ALRI patients, we identified 481 pneumococcal pneumonia cases (105 by blood culture, 376 by UAT only). Estimated incidence of pneumococcal pneumonia hospitalizations was 30.5 cases per 100,000 person-years (2.2 and 28.3 cases per 100,000 person-years by blood culture and UAT, respectively). Incidence varied between 22.7 (2007) and 43.5 (2010) cases per 100,000 person-years and increased with age to over 150 per 100,000 person-years among persons aged ≥ 70 years. Viral coinfections including influenza A/B, respiratory syncytial virus (RSV), and adenovirus occurred in 11% (44/405) of pneumococcal pneumonia cases tested. Use of UAT to identify cases of pneumococcal pneumonia among adults in rural Thailand substantially increases estimates of pneumococcal pneumonia burden, thereby informing cost-effectiveness analyses and vaccine policy decisions. |
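The incidence figures above follow the standard person-time calculation. A minimal sketch, where the person-years denominator is a hypothetical value back-calculated from the abstract's 481 cases and 30.5 per 100,000 (the paper's actual census denominators are not given here):

```python
def incidence_per_100k(cases, person_years):
    """Crude incidence rate per 100,000 person-years of observation."""
    return cases / person_years * 100_000

# Hypothetical denominator consistent with 481 cases at 30.5/100,000:
person_years = 481 / 30.5 * 100_000  # ~1.58 million person-years
print(round(incidence_per_100k(481, person_years), 1))  # 30.5
```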
Incidence of viral respiratory infections in a prospective cohort of outpatient and hospitalized children aged ≤5 years and its associated cost in Buenos Aires, Argentina
Marcone DN , Durand LO , Azziz-Baumgartner E , Vidaurreta S , Ekstrom J , Carballal G , Echavarria M . BMC Infect Dis 2015 15 447 BACKGROUND: Although information about the incidence of viral respiratory illnesses and their associated cost can help health officials explore the value of interventions, data are limited from middle-income countries. METHODS: During 2008-2010, we conducted a prospective cohort study and followed ~1,800 Argentinian children aged ≤5 years to identify those children who were hospitalized or who sought care at an emergency room with any acute respiratory infection sign or symptom (e.g., rhinorrhea, cough, wheezing, tachypnea, retractions, or cyanosis). Respiratory samples were obtained for respiratory syncytial virus, influenza, parainfluenza, adenovirus, and metapneumovirus testing by immunofluorescence and for rhinovirus by real-time reverse transcription polymerase chain reaction. RESULTS: The incidence of respiratory syncytial virus (24/1,000 child-years), human metapneumovirus (8/1,000 child-years), and influenza (8/1,000 child-years) illnesses was highest among hospitalized children aged <6 months and decreased among older children. In contrast, the incidence of rhinovirus was highest (12/1,000 child-years) among those aged 6-23 months. In the emergency room, the incidence of rhinovirus was 459; respiratory syncytial virus, 352; influenza, 185; parainfluenza, 177; metapneumovirus, 130; and adenovirus, 73 per 1,000 child-years. The total cost of hospitalization was a median of US$529 (interquartile range, US$362-789). CONCLUSIONS: Our findings indicate that respiratory viruses, in particular rhinovirus, respiratory syncytial virus, metapneumovirus, and influenza, may be associated with severe illness causing substantial economic burden. |
Influenza hospitalization epidemiology from a severe acute respiratory infection surveillance system in Jordan, January 2008-February 2014
Al-Abdallat M , Dawson P , Haddadin AJ , El-Shoubary W , Dueger E , Al-Sanouri T , Said MM , Talaat M . Influenza Other Respir Viruses 2015 10 (2) 91-7 BACKGROUND: Acute respiratory infections (ARIs) are a major cause of morbidity and mortality worldwide. Influenza typically contributes substantially to the burden of ARI, but only limited data are available on influenza activity and seasonality in Jordan. METHODS: Syndromic case definitions were used to identify individuals with severe acute respiratory infections (SARI) admitted to four sentinel hospitals in Jordan. Demographic and clinical data were collected. Nasopharyngeal and oropharyngeal swabs were tested for influenza using real-time reverse transcription polymerase chain reaction and typed as influenza A or B, with influenza A further subtyped. RESULTS: From January 2008-February 2014, 2,891 SARI cases were tested for influenza, and 257 (9%) were positive. While 73% of all SARI cases were under five years old, only 57% of influenza-positive cases were under five years old. Eight (3%) influenza-positive cases died. An annual seasonal pattern of influenza activity was observed. The proportion of influenza-positive cases peaked during November-January (14-42%) in the non-pandemic years. CONCLUSIONS: Influenza is associated with substantial morbidity and mortality in Jordan. The seasonal pattern of influenza aligns with known Northern Hemisphere seasonality. Further characterization of the clinical and financial burden of influenza in Jordan will be critical in supporting decisions regarding disease control activities. |
Integrating antiretroviral strategies for human immunodeficiency virus prevention: post- and pre-exposure prophylaxis and early treatment
Grant RM , Smith DK . Open Forum Infect Dis 2015 2 (4) ofv126 Best practices for integrating human immunodeficiency virus (HIV) testing and antiretroviral interventions for prevention and treatment are suggested based on research evidence and existing normative guidance. The goal is to provide high-impact prevention services during periods of substantial risk. Antiretroviral medications are recommended for postexposure prophylaxis (PEP), pre-exposure prophylaxis (PrEP), and treatment of HIV infection. We reviewed research evidence and current normative guidelines to identify best practices for integrating these high-impact prevention strategies. More sensitive HIV tests used for screening enable earlier diagnosis and treatment of HIV infection, more appropriate counseling, and help limit drug resistance. A fully suppressive PEP regimen should be initiated based on exposure history or physical findings when sensitive diagnostic testing is delayed or not available and antibody tests are negative. Transitions from PEP to PrEP are often warranted because HIV exposure events may continue to occur. This algorithmic approach to integrating PEP, PrEP, and early treatment decisions may increase the uptake of these interventions by a greater number and diversity of knowledgeable healthcare providers. |
The kynurenine pathway of tryptophan catabolism and AIDS-associated Kaposi sarcoma in Africa
Byakwaga H , Hunt PW , Laker-Oketta M , Glidden DV , Huang Y , Bwana BM , Mocello AR , Bennett J , Walusansa V , Dollard SC , Bangsberg DR , Mbidde EK , Martin JN . J Acquir Immune Defic Syndr 2015 70 (3) 296-303 BACKGROUND: Other than Kaposi sarcoma (KS)-associated herpesvirus and CD4 T-cell lymphopenia, the mechanisms responsible for KS in the context of HIV are poorly understood. One recently explored pathway of HIV pathogenesis involves induction of the enzyme indoleamine 2,3-dioxygenase-1 (IDO), which catabolizes tryptophan into kynurenine and several other immunologically active metabolites that suppress T-cell proliferation. We investigated the role of IDO in the development of KS in HIV disease. METHODS: In a case-control study among untreated HIV-infected Ugandans, cases were adults with KS and controls were without KS. IDO activity was assessed by the ratio of plasma kynurenine to tryptophan levels (KT ratio), measured by liquid chromatography-tandem mass spectrometry. RESULTS: We studied 631 HIV-infected subjects: 222 KS cases and 409 controls. Non-KS controls had a higher median plasma KT ratio (130, interquartile range: 90 to 190 nM/μM) than KS cases (110, interquartile range: 90 to 150 nM/μM) (P = 0.004). After adjustment for age, sex, CD4 count, and plasma HIV RNA level, subjects with the highest (fourth quartile) plasma KT ratios had a 59% reduction (95% confidence interval: 27% to 77%) in the odds of KS compared with those with the lowest (first quartile) levels. KS was also independently associated with lower CD4 count, higher plasma HIV RNA, and male sex. CONCLUSIONS: Among HIV-infected individuals, greater activity of the kynurenine pathway of tryptophan catabolism, as evidenced by higher levels of plasma KT ratio, was associated with lower occurrence of KS. Some consequences of immune activation in HIV infection might actually suppress certain cancers. |
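The "59% reduction in the odds of KS" above is the usual translation of an adjusted odds ratio below 1 into a percent reduction. A minimal sketch; the implied odds ratio of about 0.41 is inferred here from the stated 59%, not quoted from the paper:

```python
def percent_odds_reduction(odds_ratio):
    """Convert an odds ratio below 1 into the percent reduction in odds:
    e.g., OR = 0.41 means the odds are reduced by (1 - 0.41) = 59%."""
    return (1 - odds_ratio) * 100

# An adjusted OR of ~0.41 corresponds to the abstract's 59% reduction:
print(round(percent_odds_reduction(0.41)))  # 59
```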
Association of higher MERS-CoV virus load with severe disease and death, Saudi Arabia, 2014
Feikin DR , Alraddadi B , Qutub M , Shabouni O , Curns A , Oboho IK , Tomczyk SM , Wolff B , Watson JT , Madani TA . Emerg Infect Dis 2015 21 (11) 2029-35 Middle East respiratory syndrome coronavirus (MERS-CoV) causes a spectrum of illness. We evaluated whether cycle threshold (Ct) values (which are inversely related to virus load) were associated with clinical severity in patients from Saudi Arabia whose nasopharyngeal specimens tested positive for this virus by real-time reverse transcription PCR. Among 102 patients, the median Ct of 31.0 for the upstream of the E gene target for the 41 (40%) patients who died was significantly lower than the median of 33.0 for the 61 survivors (p = 0.0087). In multivariable regression analyses, risk factors for death were age >60 years, underlying illness, and decreasing Ct (modeled per 1-point decrease). Results were similar for a composite severe outcome (death and/or intensive care unit admission). More data are needed to determine whether modulation of virus load by therapeutic agents affects clinical outcomes. |
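Because Ct is inversely and exponentially related to template amount, a 2-point difference in median Ct implies roughly a 4-fold difference in virus load. A minimal sketch of that conversion, assuming ~100% PCR efficiency (an idealization; real assay efficiency varies):

```python
def fold_difference(ct_low, ct_high):
    """Approximate fold-difference in starting template between two Ct
    values, assuming the target doubles each PCR cycle (100% efficiency)."""
    return 2 ** (ct_high - ct_low)

# Median Ct 31.0 (fatal cases) vs 33.0 (survivors) implies roughly
# 4-fold more virus in fatal cases under the doubling assumption:
print(fold_difference(31.0, 33.0))  # 4.0
```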
Contact tracing activities during the Ebola virus disease epidemic in Kindia and Faranah, Guinea, 2014
Dixon MG , Taylor MM , Dee J , Hakim A , Cantey P , Lim T , Bah H , Camara SM , Ndongmo CB , Togba M , Toure LY , Bilivogui P , Sylla M , Kinzer M , Coronado F , Tongren JE , Swaminathan M , Mandigny L , Diallo B , Seyler T , Rondy M , Rodier G , Perea WA , Dahl B . Emerg Infect Dis 2015 21 (11) 2022-8 The largest recorded Ebola virus disease epidemic began in March 2014; as of July 2015, it continued in 3 principally affected countries: Guinea, Liberia, and Sierra Leone. Control efforts include contact tracing to expedite identification of the virus in suspect case-patients. We examined contact tracing activities during September 20-December 31, 2014, in 2 prefectures of Guinea using national and local data about case-patients and their contacts. Results show that less than one-third of case-patients (28.3% and 31.1%) were registered as contacts before case identification; approximately two-thirds (61.1% and 67.7%) had no registered contacts. Time to isolation of suspected case-patients was not immediate (median 5 and 3 days for Kindia and Faranah, respectively), and secondary attack rates varied by relationships of persons who had contact with the source case-patient and the type of case-patient to which a contact was exposed. More complete contact tracing efforts are needed to augment control of this epidemic. |
Delayed entry into HIV medical care in a nationally representative sample of HIV-infected adults receiving medical care in the USA
Robertson M , Wei SC , Beer L , Adedinsewo D , Stockwell S , Dombrowski JC , Johnson C , Skarbinski J . AIDS Care 2015 28 (3) 1-9 Before widespread antiretroviral therapy (ART), an estimated 17% of people delayed HIV care. We report national estimates of the prevalence and factors associated with delayed care entry in the contemporary ART era. We used Medical Monitoring Project data collected from June 2009 through May 2011 for 1425 persons diagnosed with HIV from May 2004 to April 2009 who initiated care within 12 months. We defined delayed care as entry >3 months from diagnosis. Adjusted prevalence ratios (aPRs) were calculated to identify risk factors associated with delayed care. In this nationally representative sample of HIV-infected adults receiving medical care, 7.0% (95% confidence interval [CI]: 5.3-8.8) delayed care after diagnosis. Black race was associated with a lower likelihood of delay than white race (aPR 0.38). Men who have sex with women versus women who have sex with men (aPR 1.86) and persons required to take an HIV test versus those recommended testing by a provider (aPR 2.52) were more likely to delay. Among those who delayed, 48% reported a personal factor as the primary reason. Among persons initially diagnosed with HIV (non-AIDS), those who delayed care were twice as likely (aPR 2.08) to develop AIDS as of May 2011. Compared to the pre-ART era, there was a nearly 60% reduction in delayed care entry. Although relatively few HIV patients delayed care entry, certain groups may have an increased risk. Focus on linkage to care among persons who are required to take an HIV test may further reduce delayed care entry. |
Delivery of HIV transmission risk-reduction services by HIV care providers in the United States-2013
Beer L , Weiser J , West BT , Duke C , Gremel G , Skarbinski J . J Int Assoc Provid AIDS Care 2015 15 (6) 494-504 OBJECTIVES: Evidence-based guidelines have long recommended that HIV care providers deliver HIV transmission risk-reduction (RR) services, but recent data are needed to assess their adoption. METHODS: The authors surveyed a probability sample of 1234 US HIV care providers on delivery of 9 sexual behavior- and 7 substance use-related HIV transmission RR services and created an indicator of "adequate" delivery of services in each area, defined as performing approximately 70% or more of applicable services. RESULTS: Providers were most likely to encourage patients to disclose HIV status to all partners since HIV diagnosis (81%) and least likely to ask about disclosure to new sex and drug injection partners at follow-up visits (both 41%). Adequate delivery of sexual behavior- and substance use-related RR services was low (37% and 43%, respectively). CONCLUSION: The majority of US HIV care providers may need additional support to improve delivery of comprehensive HIV transmission RR services. |
Surveillance for Borrelia burgdorferi in Ixodes ticks and small rodents in British Columbia
Morshed MG , Lee MK , Man S , Fernando K , Wong Q , Hojgaard A , Tang P , Mak S , Henry B , Patrick DM . Vector Borne Zoonotic Dis 2015 15 (11) 701-5 To determine the prevalence of Borrelia burgdorferi in British Columbian ticks, fieldwork was conducted over a 2-year period. In all, 893 ticks (Ixodes pacificus, I. angustus, I. soricis, Ixodes spp., and Dermacentor andersoni) of different life stages were retrieved from 483 small rodents (Peromyscus maniculatus, Perognathus parvus, and Reithrodontomys megalotis). B. burgdorferi DNA was detected in 5 out of 359 tick pools, and 41 out of 483 mice were serologically confirmed to have antibodies against B. burgdorferi. These results were consistent with previous studies, data from passive surveillance in British Columbia, and data from neighboring states in the Pacific Northwest, suggesting a continually low prevalence of B. burgdorferi in British Columbia ticks. |
Effect of Rickettsia rickettsii (Rickettsiales: Rickettsiaceae) infection on the biological parameters and survival of its tick vector, Dermacentor variabilis (Acari: Ixodidae)
Schumacher L , Snellgrove A , Levin ML . J Med Entomol 2015 53 (1) 172-6 Rocky Mountain spotted fever, caused by Rickettsia rickettsii, is a potentially fatal tick-borne disease distributed from North America to Argentina. The major vectors of R. rickettsii in the United States are Dermacentor andersoni Stiles and Dermacentor variabilis (Say). It is generally believed that vector ticks serve as major reservoirs of R. rickettsii in nature; however, the ability of ticks to support the indefinite perpetuation of R. rickettsii has been challenged by reports of deleterious effects of rickettsial infection on D. andersoni. To better elucidate the relationship of the pathogen with D. variabilis, we assessed the effects of R. rickettsii on the survival, fertility, and fecundity of D. variabilis. We used an isolate of R. rickettsii (Di-6), originally acquired from an opossum caught in Virginia, and ticks from a laboratory colony established from adult D. variabilis also collected in Virginia. Overall, infection with R. rickettsii protracted the feeding periods of all life stages of ticks. Infected nymphal and adult ticks experienced a slight decrease in feeding success compared with the uninfected colony, but neither larval nor nymphal molting success was affected. Infected females reached smaller engorgement weights, were less efficient in conversion of bloodmeal into eggs, and produced smaller egg clutches with a lower proportion of eggs hatching. However, no sudden die-off was observed among infected ticks, and longevity was not decreased by R. rickettsii infection in any stage. Although infection with the studied isolate of R. rickettsii caused a slight decrease in fecundity in sympatric vector ticks, no obvious deleterious effects were observed. |
Climatic influences on Cryptococcus gattii populations, Vancouver Island, Canada, 2002-2004
Uejio CK , Mak S , Manangan A , Luber G , Bartlett KH . Emerg Infect Dis 2015 21 (11) 1989-96 Vancouver Island, Canada, reports the world's highest incidence of Cryptococcus gattii infection among humans and animals. To identify key biophysical factors modulating environmental concentrations, we evaluated monthly concentrations of C. gattii in air, soil, and trees over a 3-year period. The two study datasets comprised repeatedly measured plots and newly sampled plots. We used hierarchical generalized linear and mixed effect models to determine associations. Climate systematically influenced C. gattii concentrations in all environmental media tested; in soil and on trees, concentrations decreased when temperatures were warmer. Wind may be a key process that transferred C. gattii from soil into air and onto trees. C. gattii results for tree and air samples were more likely to be positive during periods of higher solar radiation. These results improve the understanding of the places and periods with the greatest C. gattii colonization. Refined risk projections may help susceptible persons avoid activities that disturb the topsoil during relatively cool summer days. |
Update on multistate outbreak of fungal infections associated with contaminated methylprednisolone injections, 2012-2014
McCotter OZ , Smith RM , Westercamp M , Kerkering TM , Malani AN , Latham R , Peglow SL , Mody RK , Pappas PG , Chiller TM . MMWR Morb Mortal Wkly Rep 2015 64 (42) 1200-1 During September 2012, CDC, in collaboration with state and local health departments and the Food and Drug Administration (FDA), investigated a multistate outbreak of fungal meningitis and other infections caused by injections of contaminated methylprednisolone acetate solution (MPA). After this unprecedented outbreak, scientists in the CDC Mycotic Diseases Branch, along with infectious diseases specialists who cared for patients from the outbreak, clinical experts, and public health officials from affected states, have continued to monitor the recovery of affected patients. A long-term follow-up study involving these patients was initiated and is being conducted by the Mycoses Study Group Education and Research Consortium (MSGERC). This update summarizes subsequent information about the current state of the outbreak. |
Association between outpatient antibiotic prescribing practices and community-associated Clostridium difficile infection
Dantes R , Mu Y , Hicks LA , Cohen J , Bamberg W , Beldavs ZG , Dumyati G , Farley MM , Holzbauer S , Meek J , Phipps E , Wilson L , Winston LG , McDonald LC , Lessa FC . Open Forum Infect Dis 2015 2 (3) ofv113 BACKGROUND: Antibiotic use predisposes patients to Clostridium difficile infections (CDI), and approximately 32% of these infections are community-associated (CA) CDI. The population-level impact of antibiotic use on adult CA-CDI rates is not well described. METHODS: We used 2011 active population- and laboratory-based surveillance data from 9 US geographic locations to identify adult CA-CDI cases, defined as C difficile-positive stool specimens (by toxin or molecular assay) collected from outpatients or from patients ≤3 days after hospital admission. All patients were surveillance area residents and aged ≥20 years with no positive test ≤8 weeks prior and no overnight stay in a healthcare facility ≤12 weeks prior. Outpatient oral antibiotic prescriptions dispensed in 2010 were obtained from the IMS Health Xponent database. Regression models examined the association between outpatient antibiotic prescribing and adult CA-CDI rates. RESULTS: Healthcare providers prescribed 5.2 million courses of antibiotics among adults in the surveillance population in 2010, for an average of 0.73 per person. Across surveillance sites, antibiotic prescription rates (0.50-0.88 prescriptions per capita) and unadjusted CA-CDI rates (40.7-139.3 cases per 100 000 persons) varied. In regression modeling, reducing antibiotic prescribing rates by 10% among persons ≥20 years old was associated with a 17% (95% confidence interval, 6.0%-26.3%; P = .032) decrease in CA-CDI rates after adjusting for age, gender, race, and type of diagnostic assay. Reductions in prescribing penicillins and amoxicillin/clavulanic acid were associated with the greatest decreases in CA-CDI rates. 
CONCLUSIONS AND RELEVANCE: Community-associated CDI prevention should include reducing unnecessary outpatient antibiotic use. A modest reduction of 10% in outpatient antibiotic prescribing can have a disproportionate impact on reducing CA-CDI rates. |
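A note on the arithmetic in the conclusion above: the abstract does not state the exact regression specification, but under a constant-elasticity (log-log) model, which is our assumption here, the reported 17% CA-CDI decrease per 10% prescribing reduction implies an elasticity of roughly 1.8:

```python
import math

# Hypothetical back-calculation: if CA-CDI rate ∝ (prescribing rate)**b,
# a 10% prescribing reduction multiplies the CDI rate by 0.9**b.
# Solving 0.9**b = 1 - 0.17 recovers the implied elasticity b.
cdi_multiplier = 1 - 0.17   # 17% decrease in CA-CDI rate (point estimate)
rx_multiplier = 1 - 0.10    # 10% reduction in prescribing
b = math.log(cdi_multiplier) / math.log(rx_multiplier)   # ~1.77
```

An elasticity above 1 is what the authors mean by a "disproportionate impact": the relative drop in infections exceeds the relative drop in prescribing.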
Associations between different sedatives and ventilator-associated events, length-of-stay, and mortality in mechanically ventilated patients
Klompas M , Li L , Szumita P , Kleinman K , Murphy MV . Chest 2015 149 (6) 1373-9 BACKGROUND: Current sedation guidelines recommend avoiding benzodiazepines but express no preference for propofol versus dexmedetomidine. In addition, there are limited data on how well randomized controlled trials on sedatives generalize to routine practice, where conditions tend to be more varied and complex. METHODS: We gathered daily sedative exposure data from all patients on mechanical ventilation for ≥3 days over a 7-year period in a large academic medical center. We compared hazard ratios for ventilator-associated events (VAEs), extubation, hospital discharge, and hospital death among benzodiazepines, propofol, and dexmedetomidine using proportional subdistribution hazard models with competing risks. We adjusted all analyses for ICU type, demographics, comorbidities, procedures, severity of illness, hypotension, oxygenation, renal function, opioids, neuroleptics, neuromuscular blockers, awakening and breathing trials, and calendar year. RESULTS: We evaluated 9,603 consecutive episodes of mechanical ventilation. Benzodiazepines and propofol were associated with increased VAE risk, whereas dexmedetomidine was not. Propofol was associated with less time to extubation compared to benzodiazepines (HR for extubation 1.4, 95% CI 1.3-1.5). Dexmedetomidine was associated with less time to extubation compared to both benzodiazepines (HR 2.3, 95% CI 2.0-2.7) and propofol (HR 1.7, 95% CI 1.4-2.0), but there were relatively few dexmedetomidine exposures available for analysis. There were no differences between any two agents in hazards for hospital discharge or mortality. CONCLUSIONS: In this large, real-world cohort, propofol and dexmedetomidine were associated with less time to extubation compared to benzodiazepines, and dexmedetomidine was also associated with less time to extubation compared to propofol. These possible differences merit further study. |
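For readers unfamiliar with the competing-risks framing used in this study (extubation and in-hospital death compete, so a naive Kaplan-Meier curve for either outcome alone is biased upward), a minimal nonparametric cumulative-incidence (Aalen-Johansen) estimator on toy data can be sketched as follows. This is an illustration of the general framework, not the authors' proportional subdistribution hazard model, and the data below are invented:

```python
def cuminc(times, events, cause):
    """Nonparametric cumulative incidence (Aalen-Johansen) for one cause,
    evaluated at the last observed time.
    times:  follow-up times
    events: 0 = censored, otherwise an integer cause code (e.g., 1, 2)
    cause:  which cause's cumulative incidence to return
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0   # overall event-free survival just before the current time
    cif = 0.0
    i = 0
    while i < len(data):
        t = data[i][0]
        d_cause = d_all = censored = 0
        while i < len(data) and data[i][0] == t:   # group ties at time t
            ev = data[i][1]
            if ev == cause:
                d_cause += 1
            if ev != 0:
                d_all += 1
            else:
                censored += 1
            i += 1
        cif += surv * d_cause / at_risk            # incidence mass added at t
        surv *= 1 - d_all / at_risk                # update overall survival
        at_risk -= d_all + censored
    return cif

# Toy data: days on the ventilator; 1 = extubated alive, 2 = died (competing).
cif_extubated = cuminc([2, 3, 5, 7], [1, 1, 2, 2], cause=1)  # 0.5
cif_died = cuminc([2, 3, 5, 7], [1, 1, 2, 2], cause=2)       # 0.5
```

By construction the cause-specific cumulative incidences sum to at most 1, which a pair of separate Kaplan-Meier estimates does not guarantee.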
Windows of sensitivity to toxic chemicals in the development of cleft palates
Buser MC , Pohl HR . J Toxicol Environ Health B Crit Rev 2015 18 (5) 242-57 Cleft lip and cleft palate are among the most common birth defects worldwide. There is a genetic component to the development of these malformations, as well as evidence that environmental exposures and prescription drug use may exacerbate or even produce these manifestations. Thus, it is important to understand the underlying mechanisms and when these exposures affect development of the growing fetus. The purpose of this investigation was to critically review the available literature related to orofacial cleft formation following chemical exposure and to identify specific time frames for windows of sensitivity. A further aim was to evaluate the potential for predicting effects in humans based on animal studies. Evidence indicates that chemically induced cleft palate development depends on the dose and timing of exposure, the susceptibility of the species (i.e., its genetic makeup), and the mechanism of action. Several studies demonstrated that dose is a crucial factor; however, some investigators argued that timing of exposure is even more important than dose. Data show that the window of sensitivity to environmental teratogens in the development of cleft palates is quite narrow and closely follows the window of palatogenesis in the fetus of any given species. |
Recent research on Gulf War illness and other health problems in veterans of the 1991 Gulf War: Effects of toxicant exposures during deployment
White RF , Steele L , O'Callaghan JP , Sullivan K , Binns JH , Golomb BA , Bloom FE , Bunker JA , Crawford F , Graves JC , Hardie A , Klimas N , Knox M , Meggs WJ , Melling J , Philbert MA , Grashow R . Cortex 2015 74 449-75 Veterans of Operation Desert Storm/Desert Shield - the 1991 Gulf War (GW) - are a unique population who returned from theater with multiple health complaints and disorders. Studies in the U.S. and elsewhere have consistently concluded that approximately 25-32% of this population suffers from a disorder characterized by symptoms that vary somewhat among individuals and include fatigue, headaches, cognitive dysfunction, musculoskeletal pain, and respiratory, gastrointestinal, and dermatologic complaints. Gulf War illness (GWI) is the term used to describe this disorder. In addition, brain cancer occurs at increased rates in subgroups of GW veterans, as do neuropsychological and brain imaging abnormalities. Chemical exposures have become the focus of etiologic GWI research because nervous system symptoms are prominent and many neurotoxicants were present in theater, including organophosphates (OPs), carbamates, and other pesticides; sarin/cyclosarin nerve agents; and pyridostigmine bromide (PB) medications used as prophylaxis against chemical warfare attacks. Psychiatric etiologies have been ruled out. This paper reviews the recent literature on the health of 1991 GW veterans, focusing particularly on the central nervous system and on effects of toxicant exposures. In particular, it emphasizes research published since 2008, following an exhaustive review published that year that summarized the prior literature (RACGWI, 2008). We conclude that exposure to pesticides and/or to PB is causally associated with GWI and the neurological dysfunction in GW veterans. 
Exposure to sarin and cyclosarin and to oil well fire emissions are also associated with neurologically based health effects, though their contribution to development of the disorder known as GWI is less clear. Gene-environment interactions are likely to have contributed to development of GWI in deployed veterans. The health consequences of chemical exposures in the GW and other conflicts have been called "toxic wounds" by veterans. This type of injury requires further study and concentrated treatment research efforts that may also benefit other occupational groups with similar exposure-related illnesses. |
Summary of notifiable noninfectious conditions and disease outbreaks: childhood blood lead levels - United States, 2007-2012
Raymond J , Brown MJ . MMWR Morb Mortal Wkly Rep 2015 62 (54) 76-80 This report provides data concerning childhood blood lead levels (BLLs) in the United States during 2007–2012. These data were collected and compiled from extracts sent by state and local health departments to CDC's Childhood Blood Lead Surveillance (CBLS) system. The numbers of children aged <5 years reported to CDC for 2007–2012 with BLLs ≥10 µg/dL are provided by month, geographic location, and age group in tabular form (Tables 1–3). The number of children who received a new diagnosis of BLLs ≥70 µg/dL during the same time period is summarized (Figure). This report is a part of the first-ever Summary of Notifiable Noninfectious Conditions and Disease Outbreaks, which encompasses various surveillance years but is being published in 2015 (1). The Summary of Notifiable Noninfectious Conditions and Disease Outbreaks appears in the same volume of MMWR as the annual Summary of Notifiable Infectious Diseases (2). | Background | In 1991, CDC recommended that identification of children with BLLs ≥10 µg/dL should prompt public health action by state or local health departments with follow-up testing (3). In 1995, in collaboration with CDC, the Council of State and Territorial Epidemiologists designated elevated blood lead levels as the first noninfectious condition to be added to the list of conditions designated as reportable at the national level (4). | In May 2012, the Advisory Committee on Childhood Lead Poisoning Prevention (ACCLPP) recommended the use of a reference range for blood lead. ACCLPP recommended that clinical and public health care providers use the upper value of the reference range to identify children with elevated BLLs, on the basis of the 97.5th percentile of the National Health and Nutrition Examination Survey (NHANES)–generated BLL distribution in children aged 1–5 years (currently 5 µg/dL) (5). 
| Permanent neurological damage and behavioral disorders have been found to be associated with lead exposure at blood levels at or below 5 µg/dL (6–9). Previous studies have shown that high BLLs (≥70 µg/dL) can cause severe neurologic problems such as seizures, coma, and even death (10). | In 2007, a total of 38 states identified and reported 37,289 children aged <6 years with BLLs ≥10 µg/dL (11). In 2012, approximately 122,000 children aged <6 years were reported with BLLs ≥5 µg/dL (11). For the period 2007–2012, CDC examined reported BLLs of children aged <5 years in three categories: children with BLLs ≥10 µg/dL, children with new reports of BLLs ≥10 µg/dL, and children with new reports of BLLs ≥70 µg/dL. |
VOC emissions from multiple wood pellet types and concentrations in indoor air
Soto-Garcia L , Ashley WJ , Bregg S , Walier D , Lebouf R , Hopke PK , Rossner A . Energy Fuels 2015 29 (10) 6485-6493 Wood pellet storage safety is an important aspect of implementing woody biomass as a renewable energy source. When wood pellets are stored indoors in large quantities (tons) in poorly ventilated spaces in buildings, such as basements, off-gassing of volatile organic compounds (VOCs) can significantly affect indoor air quality. To determine the emission rates and potential impact of VOC emissions, a series of laboratory and field measurements were conducted using softwood, hardwood, and blended wood pellets manufactured in New York. Evacuated canisters were used to collect air samples, first from the headspace of drums containing pellets and then in basements and pellet storage areas of homes and small businesses. Multiple peaks were identified during GC/MS and GC/FID analysis, and four primary VOCs were characterized and quantified: methanol, pentane, pentanal, and hexanal. Laboratory results show that total VOC (TVOC) concentrations for softwood (SW) were statistically significantly higher (p < 0.02) than those for blended or hardwood (HW) pellets (SW: 412 ± 25; blended: 203 ± 4; HW: 99 ± 8 ppb). The emission rate from HW was the fastest, followed by blended and SW, respectively. Emission rates were found to range from 10⁻¹ to 10⁻⁵ units, depending upon environmental factors. Field measurements resulted in airborne concentrations ranging from 67 ± 8 to 5000 ± 3000 ppb of TVOCs and 12 to 1500 ppb of aldehydes, with higher concentrations found in a basement with a large fabric bag storage unit after fresh pellet delivery and lower concentrations for aged pellets. These results suggest that large fabric bag storage units resulted in a substantial release of VOCs into the building air. Occupants of the buildings tested raised concerns about odor and sensory irritation when new pellets were delivered; this sensory response was likely due to the aldehydes. 
|
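The concentrations above are volume mixing ratios (ppb); comparing them with mass-based exposure limits requires converting with the molar volume of air at 25 °C and 1 atm (24.45 L/mol). In the sketch below, treating the aldehyde fraction as hexanal is our simplifying assumption for illustration:

```python
def ppb_to_ugm3(ppb, molar_mass_g):
    """Convert a gas-phase mixing ratio (ppb v/v) to µg/m³ at 25 °C and 1 atm.
    1 ppb of a gas with molar mass M corresponds to M/24.45 µg per m³ of air."""
    return ppb * molar_mass_g / 24.45

HEXANAL_MW = 100.16  # g/mol (illustrative choice for the aldehyde fraction)

# The 1500 ppb aldehyde peak above, if it were entirely hexanal:
peak_aldehyde_ugm3 = ppb_to_ugm3(1500, HEXANAL_MW)  # ~6,145 µg/m³
```

The conversion is linear in both concentration and molar mass, so per-compound totals can be summed once each species is converted separately.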
Serum Total Testosterone Concentrations in the US Household Population from the NHANES 2011-2012 Study Population
Vesper HW , Wang Y , Vidal M , Cook Botelho J , Caudill SP . Clin Chem 2015 BACKGROUND: Limited information is available about testosterone concentrations representative of the general US population, especially children, women, and non-Hispanic Asians. METHODS: We obtained nationally representative data for total testosterone (totalT), measured with standardized LC-MS/MS, for the US population age 6 years and older from the 2011-2012 National Health and Nutrition Examination Survey (NHANES). We analyzed 6746 serum samples and calculated the geometric means, distribution percentiles, and covariate-adjusted geometric means by age, sex, and race/ethnicity. RESULTS: The 10th-90th percentiles of totalT values in adults (≥20 years) were 150-698 ng/dL (5.20-24.2 nmol/L) in men and 7.1-49.8 ng/dL (0.25-1.73 nmol/L) in women, and 1.0-9.5 ng/dL (0.04-0.33 nmol/L) in children (6-10 years old). Differences among race/ethnic groups existed in children and men: covariate-adjusted totalT values in non-Hispanic Asians were highest among children (58% compared to non-Hispanic black children) and lowest among men (12% compared to Mexican-American men). Covariate-adjusted totalT values in men were higher at age 55-60 years compared to ages 35 and 80 years, a pattern different from that observed in previous NHANES cycles. CONCLUSIONS: TotalT patterns were different among age groups in men compared with previous NHANES cycles. Covariate-adjusted totalT values peaked at age 55-60 years in men, which appeared to be consistent with the increased use of exogenous testosterone. Differences among race/ethnic groups existed and appeared more pronounced in children than adults. |
Summary of notifiable noninfectious conditions and disease outbreaks: foodborne and waterborne disease outbreaks - United States, 1971-2012
Dewey-Mattia D , Roberts V , Yoder J , Gould LH . MMWR Morb Mortal Wkly Rep 2015 62 (54) 86-9 CDC collects data on foodborne and waterborne disease outbreaks reported by all U.S. states and territories through the Foodborne Disease Outbreak Surveillance System (FDOSS) and the Waterborne Disease and Outbreak Surveillance System (WBDOSS), respectively. These two systems are the primary source of national data describing the number of illnesses, hospitalizations, and deaths; etiologic agents; water source or implicated foods; settings of exposure; and other factors associated with recognized foodborne and waterborne disease outbreaks in the United States. This report summarizes data on foodborne disease outbreaks reported during 1973–2012 and waterborne disease outbreaks reported during 1971–2012. This report is a part of the first-ever Summary of Notifiable Noninfectious Conditions and Disease Outbreaks, which encompasses various surveillance years but is being published in 2015 (1). The Summary of Notifiable Noninfectious Conditions and Disease Outbreaks appears in the same volume of MMWR as the annual Summary of Notifiable Infectious Diseases (2). | Background | Foodborne Disease Outbreak Surveillance | Foodborne diseases cause an estimated 48 million illnesses each year in the United States, including 9.4 million caused by known pathogens (3,4). Only a minority of foodborne illnesses, hospitalizations, and deaths occur as part of recognized outbreaks (5). However, information gathered from foodborne disease outbreak surveillance provides valuable insights into the agents that cause foodborne illness, types of implicated foods and ingredients, and settings in which transmission occurs. | Foodborne disease outbreaks have been nationally notifiable since 2010; however, reports of foodborne disease outbreaks have been collected by CDC through FDOSS since 1973. Initially a paper-based system, FDOSS became web-based in 1998. 
In 2009, the system was transitioned to an enhanced reporting platform, the National Outbreak Reporting System (NORS), which also collects information on waterborne disease outbreaks and enteric disease outbreaks with modes of transmission other than food, including person-to-person contact, animal contact, and environmental contamination. Information about NORS is available at http://www.cdc.gov/nors. | Foodborne disease outbreak surveillance data highlight the etiologic agents, foods, and settings involved most often in outbreaks and can help to identify food commodities and preparation settings in which interventions might be most effective. Surveillance for foodborne disease outbreaks provides insight into the effectiveness of regulations and control measures, helps identify new and emerging pathogens, provides information regarding the food preparation and consumption settings where outbreaks occur, informs prevention and control measures in the food industry by identifying points of contamination, and can be used to describe trends in foodborne disease outbreaks over time. |
Summary of notifiable noninfectious conditions and disease outbreaks: introduction to the summary of notifiable noninfectious conditions and disease outbreaks - United States
Coates RJ , Jajosky RA , Stanbury M , Macdonald SC . MMWR Morb Mortal Wkly Rep 2015 62 (54) 1-4 With this 2015 Summary of Notifiable Noninfectious Conditions and Disease Outbreaks — United States, CDC is publishing official statistics for the occurrence of nationally notifiable noninfectious conditions and disease outbreaks for the first time in the same volume of MMWR as the annual Summary of Notifiable Infectious Diseases (1). | This two-part publication provides the opportunity for readers to review information on all of the nationally notifiable conditions identified by the Council of State and Territorial Epidemiologists (CSTE) in collaboration with CDC. This combined publication is the result of a February 2013 request by CSTE for CDC to present surveillance data on all nationally notifiable conditions and disease outbreaks in the same publication. In recent years, CSTE formalized and expanded the list of nationally notifiable conditions to include foodborne and waterborne disease outbreaks and four noninfectious conditions: acute pesticide-related illness and injury, cancer, silicosis, and elevated blood lead levels.* After discussion within the organization and with subject matter experts at CDC, CSTE concluded that inclusion of information on all nationally notifiable conditions in the same MMWR annual surveillance summary of nationally notifiable conditions would be useful and important for the public and public health professionals. | This Summary of Notifiable Noninfectious Conditions and Disease Outbreaks includes six chapters treating the following subjects: acute pesticide-related illness and injury arising from occupational exposure (2), cancer (3), elevated blood lead levels among employed adults (4), elevated blood lead levels among children (5), silicosis (6), and foodborne and waterborne disease outbreaks (7). 
Information about nonoccupational acute pesticide-related illness could not be included this year because the data were not ready for publication. However, the CDC programs involved in pesticide-related illness surveillance activities plan to include these data in the 2016 MMWR publication of the annual Summary of Notifiable Noninfectious Conditions and Disease Outbreaks. | Information on elevated lead exposure is provided in two separate chapters because the sources of lead exposure differ between children and adults. Lead exposure among children is caused principally by deteriorated lead paint found in homes, whereas lead exposure among adults occurs principally in the workplace. CDC's National Center for Environmental Health (NCEH) has primary responsibility for preventing disease from environmental (principally nonoccupational) hazards, and CDC's National Institute for Occupational Safety and Health (NIOSH) is responsible for preventing disease from workplace hazards. Because of the separate delegation of responsibilities and differences in sources of lead exposure, CDC has a linked surveillance system for lead exposure with NCEH responsible for the Childhood Blood Lead Surveillance (CBLS) system (5) and with NIOSH responsible for the Adult Blood Lead Epidemiology and Surveillance system (ABLES) (4). | Each of the six chapters in this Summary (Noninfectious) presents the most recent statistics available to the CDC program. Local, state, and territorial public health departments and other agencies within those jurisdictions (e.g., departments of labor, environmental protection agencies, cancer registries, and their agents) submit data on these conditions and outbreaks to CDC programs at the National Center for Chronic Disease Prevention and Health Promotion, the National Center for Emerging and Zoonotic Infectious Diseases, NCEH, and NIOSH. 
Previously, the programs compiled and published surveillance data on these noninfectious conditions and disease outbreaks periodically in multiple venues with variable timeframes and formats. | The Center for Surveillance, Epidemiology, and Laboratory Services (CSELS) coordinated the development and publication of this summary. Comments and suggestions from readers on this new combined publication are encouraged, including ones about whether the information presented could be made more useful. Comments should be sent to NNDSSweb@cdc.gov. |
Summary of notifiable noninfectious conditions and disease outbreaks: surveillance for cancer incidence and mortality - United States, 2011
Singh SD , Henley SJ , Ryerson AB . MMWR Morb Mortal Wkly Rep 2015 62 (54) 11-51 This report provides, in tabular and graphic form, official federal statistics on the occurrence of cancer for 2011 and trends for 1999–2011 as reported by CDC and the National Cancer Institute (NCI) (1). Cancer incidence data are from population-based cancer registries that participate in CDC's National Program of Cancer Registries (NPCR) and NCI's Surveillance, Epidemiology, and End Results (SEER) program reported as of November 2013. Cancer mortality data are from death certificate information reported to state vital statistics offices through 2011 and compiled into a national file for the entire United States by CDC's National Center for Health Statistics' (NCHS) National Vital Statistics System (NVSS). This report is a part of the first-ever Summary of Notifiable Noninfectious Conditions and Disease Outbreaks, which encompasses various surveillance years but is being published in 2015 (2). The Summary of Notifiable Noninfectious Conditions and Disease Outbreaks appears in the same volume of MMWR as the annual Summary of Notifiable Infectious Diseases (3). | This report presents information on new cancer cases and deaths for 2011. The number and rate of cancer cases and deaths are stratified by the primary cancer sites as reported for 2011; information is provided by demographic characteristic (e.g., sex, age, race, and ethnicity) and primary cancer site (68 selected sites among men and 72 selected sites among women) (Tables 1–12). Age-adjusted cancer incidence and death rates for the most common sites are shown by race, sex, and ethnicity for 2011, the most recent diagnosis year (Figure 1). Maps of the United States display age-adjusted cancer incidence and death rates, presented by quartiles, for 2011, the most recent diagnosis year (Figures 2 and 3). 
Time trends in age-adjusted cancer incidence and death rates during 1999–2011 are shown for all sites combined by race, sex, and ethnicity (Figures 4–7). Age-adjusted cancer incidence and death rates are shown by primary site and year for the period 1999–2011 (Tables 13–16). | Background | Cancer comprises a diverse mix of diseases occurring in every part of the body and is a leading cause of death in the United States, second only to heart disease (4). More than half of cancer cases could be prevented (5). Surveillance of cancer incidence and mortality can help public health officials target areas for control efforts (6) and track progress toward meeting the national health objectives set forth in Healthy People 2020 (7). Because cancer is a reportable disease in every state, hospitals, physicians' offices, pathology laboratories, and other medical facilities are required to submit data on all cancer diagnoses to a central cancer registry at the state or territorial level. A cancer registry is a database that contains individual records of all cancer cases in a defined population and includes patient demographics, tumor characteristics (e.g., cancer site and pathology), and information about the notifying health provider or facility. In 1992, Congress established NPCR by enacting the Cancer Registries Amendment Act, Public Law 102-515 (8). Administered by CDC, NPCR collects data on the occurrence of cancer, and the type, extent, and location of the cancer. Before NPCR was established, 10 states had no registry, and most states with registries lacked the resources and state legislation needed to gather complete data (9). Presently, NPCR supports central cancer registries in 45 states, the District of Columbia, Puerto Rico, and the U.S. Pacific Island Jurisdictions. NPCR data represent 96% of the overall U.S. population. Together, NPCR and NCI's SEER Program collect data for the entire U.S. population. 
Cancer control planners and others can identify variations in cancer rates by population subgroups and monitor trends over time to guide the planning and evaluation of cancer prevention and control programs and allocation of health resources. |
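The "age-adjusted" rates reported throughout these surveillance summaries are direct standardizations: stratum-specific rates weighted by a fixed standard population (NPCR/SEER use the 2000 US standard). A minimal sketch with invented counts and only two age strata, purely for illustration:

```python
# Direct age standardization with hypothetical counts (two strata for brevity).
# In practice the weights come from the 2000 US standard population and there
# are many more age strata.
strata = [
    # (cases, person-years, standard-population weight)
    (20, 10_000, 0.6),   # younger stratum: low rate, large weight
    (100, 5_000, 0.4),   # older stratum: high rate, smaller weight
]

crude = sum(c for c, p, w in strata) / sum(p for c, p, w in strata)
adjusted = sum(w * c / p for c, p, w in strata)   # weights sum to 1

crude_per_100k = crude * 100_000       # 800.0
adjusted_per_100k = adjusted * 100_000 # 920.0
```

Because the weights are fixed, adjusted rates from populations with different age structures (or from different years) are directly comparable, which the crude rates are not.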
Technology and data collection in chronic disease epidemiology
Holt JB . Prev Chronic Dis 2015 12 E187 In this issue of Preventing Chronic Disease, Moodley et al (1) present the results of a spatial analysis of the locations of advertisements for sugar-sweetened beverages (SSBs) and vendors who sell SSBs in relation to the location of schools in 5 neighborhoods in South Africa. In their article, “Obesogenic Environments: Access to and Advertising of Sugar-Sweetened Beverages in Soweto, South Africa,” the authors used a global positioning system (GPS) and a digital camera to gather data on the locations of SSB advertisements and vendors. Their innovative and low-cost approach could be replicated in any setting, including the United States, where time-sensitive point-location data on environmental exposure are needed but are unavailable through more traditional data-collection sources. In this sense, their approach to gathering data is situated within the broader technological developments of volunteered geographic information, crowdsourced data, and GPS-enabled mobile technology for public health (2–6). | Although the main objective of Moodley et al was to provide a descriptive analysis of the intensity of SSB advertising, their approach to using technology deserves to be highlighted because it may be of great value to public health practitioners. To this end, Preventing Chronic Disease readers may find valuable some additional examples of the use of handheld GPS devices or smartphones for data collection for chronic disease epidemiology. Smartphones are GPS-enabled, and photographs taken with smartphone cameras are encoded with a GPS location. Software applications for smartphones that allow photographs to be exported and their location information to be stored on a convenient database include commercial applications such as Collector for ArcGIS (Esri, http://doc.arcgis.com/en/collector/) and open-source free applications such as Ushahidi (www.ushahidi.com/product/ushahidi/). |
Active bacterial core surveillance for Legionellosis - United States, 2011-2013
Dooling KL , Toews KA , Hicks LA , Garrison LE , Bachaus B , Zansky S , Carpenter LR , Schaffner B , Parker E , Petit S , Thomas A , Thomas S , Mansmann R , Morin C , White B , Langley GE . MMWR Morb Mortal Wkly Rep 2015 64 (42) 1190-3 During 2000-2011, passive surveillance for legionellosis in the United States demonstrated a 249% increase in crude incidence, although little was known about the clinical course and method of diagnosis. In 2011, a system of active, population-based surveillance for legionellosis was instituted through CDC's Active Bacterial Core surveillance (ABCs) program. Overall disease rates were similar in both the passive and active systems, but more complete demographic information and additional clinical and laboratory data were only available from ABCs. ABCs data during 2011-2013 showed that approximately 44% of patients with legionellosis required intensive care, and 9% died. Disease incidence was higher among blacks than whites and was 10 times higher in New York than California. Laboratory data indicated a reliance on urinary antigen testing, which only detects Legionella pneumophila serogroup 1 (Lp1). ABCs data highlight the severity of the disease, the need to better understand racial and regional differences, and the need for better diagnostic testing to detect infections. |
Outbreak of Escherichia coli O157:H7 infections associated with dairy education event attendance - Whatcom County, Washington, 2015
Curran K , Heiman KE , Singh T , Doobovsky Z , Hensley J , Melius B , Burnworth L , Williams I , Nichols M . MMWR Morb Mortal Wkly Rep 2015 64 (42) 1202-3 On April 27, 2015, the Whatcom County Health Department (WCHD) in Bellingham, Washington, was notified by a local laboratory regarding three children with presumptive Escherichia coli O157 infection. WCHD interviewed the parents, who indicated that all three children had attended a dairy education event held in a barn April 20-24, 2015, during a school field trip. WCHD, the Washington State Department of Health, and CDC investigated to determine the magnitude of the outbreak, identify risk factors and potential environmental sources of infection, and develop recommendations. A total of 60 cases (25 confirmed and 35 probable) were identified, and 11 patients were hospitalized. |
Characterization of the Proteins Associated with Caulobacter crescentus Bacteriophage CbK Particles.
Callahan CT , Wilson KM , Ely B . Curr Microbiol 2015 72 (1) 75-80 Bacteriophage genomes contain an abundance of genes that code for hypothetical proteins with either a conserved domain or no predicted function. The Caulobacter phage CbK has an unusual shape, designated morphotype B3, which consists of an elongated cylindrical head and a long flexible tail. To identify CbK proteins associated with the phage particle, intact phage particles were subjected to SDS-PAGE, and the resulting protein bands were digested with trypsin and analyzed using MALDI mass spectrometry to provide peptide molecular weights. These peptide molecular weights were then compared with the peptides that would be generated from the predicted amino acid sequences coded by the CbK genome, and the comparison of the actual and predicted peptide masses resulted in the identification of single genes that could code for the set of peptides derived from each of the 20 phage proteins. We also found that CsCl density gradient centrifugation resulted in the separation of empty phage heads, phage heads containing material organized in a spiral, isolated phage tails, and other particulate material from the intact phage particles. This additional material proved to be a good source of additional phage proteins, and preliminary results suggest that it may include a CbK DNA replication complex. |
Host Genetic Susceptibility to Enteric Viruses: A Systematic Review and Metaanalysis.
Kambhampati A , Payne DC , Costantini V , Lopman BA . Clin Infect Dis 2015 62 (1) 11-18 BACKGROUND: Norovirus and rotavirus are prominent enteric viruses responsible for severe acute gastroenteritis disease burden around the world. Both viruses recognize and bind to histo-blood group antigens, which are expressed by the fucosyltransferase 2 (FUT2) gene. Individuals with a functional FUT2 gene are termed "secretors." FUT2 polymorphisms may influence viral binding patterns and, therefore, may influence host susceptibility to infection by these viruses. METHODS: We performed a systematic review of the published literature on this topic. Data were abstracted and compiled for descriptive analyses and metaanalyses. We estimated pooled odds ratios (ORs) for infection using random-effects models. RESULTS: We found that secretors were 9.9 times (95% confidence interval [CI], 3.9-24.8) as likely to be infected with genogroup II.4 noroviruses and 2.2 times as likely to be infected with genogroup II non-4 noroviruses (95% CI, 1.2-4.2) compared with nonsecretors. Secretors were also 26.6 times more susceptible to infections from P[8]-type rotaviruses compared with nonsecretors (95% CI, 8.3-85.0). CONCLUSIONS: Our analyses indicate that host genetic susceptibility to norovirus and rotavirus infection may be strain specific. As strain distribution and the proportion of genetic phenotypes vary in different countries, future studies should focus on differences in susceptibility among various ethnicities. Knowledge of innate susceptibility to rotavirus and norovirus can lead to improved understanding of both vaccine performance and individual risk of disease. |
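The random-effects pooling used in metaanalyses like the one above can be sketched in a few lines. Below is a minimal DerSimonian-Laird implementation over per-study log odds ratios and their variances; the example inputs are hypothetical, not the study's data.

```python
import math

def pool_random_effects(log_ors, variances, z=1.96):
    """DerSimonian-Laird random-effects pooling of per-study log odds ratios.
    Returns the pooled OR with an approximate 95% confidence interval."""
    w = [1.0 / v for v in variances]                       # inverse-variance weights
    fixed = sum(wi * y for wi, y in zip(w, log_ors)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_ors))  # Cochran's Q
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(log_ors) - 1)) / c)          # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]         # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, log_ors)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return math.exp(pooled), math.exp(pooled - z * se), math.exp(pooled + z * se)
```

For instance, three hypothetical studies reporting ORs of 8, 12, and 10 (log-OR variances of 0.2 to 0.3) pool to an OR between those extremes, with a CI that widens as between-study heterogeneity grows.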
G2P[4]-RotaTeq Reassortant Rotavirus in Vaccinated Child, United States.
Roy S , Rungsrisuriyachai K , Esona MD , Boom JA , Sahni LC , Rench MA , Baker CJ , Wikswo ME , Payne DC , Parashar UD , Bowen MD . Emerg Infect Dis 2015 21 (11) 2103-4 Group A rotaviruses (RVAs) are a leading cause of acute gastroenteritis-associated deaths among children <5 years of age in developing countries (1). The genome of RVA consists of 11 double-stranded RNA segments that code for 11 or 12 viral proteins (VP1–VP4, VP6, VP7, nonstructural protein 1 [NSP1]–NSP5/6) (2). In 2008, the Rotavirus Classification Working Group established a system of extended classification that was based on the sequences of all 11 gene segments and used the notations Gx-P[x]-Ix-Rx-Cx-Mx-Ax-Nx-Tx-Ex-Hx for the genes VP7, VP4, VP6, VP1–VP3, NSP1–NSP5, respectively (3). Similar to other RNA viruses, RVAs show high genomic diversity, which is generated primarily through point mutations, reassortment, rearrangement, and recombination events. | In 2006 and 2008, two live-attenuated vaccines, RotaTeq (Merck, Whitehouse Station, NJ, USA) and Rotarix (GlaxoSmithKline, Rixensart, Belgium), respectively, were introduced in the United States (4). RotaTeq is a pentavalent human bovine reassortant vaccine that contains 4 G types (G1, G2, G3, and G4; VP7 gene) plus the P[8] VP4 type on a bovine WC3 (G6P[5]) backbone (5). In 2012, Bucardo et al. reported finding a vaccine-derived nonstructural protein 2 (NSP2) gene in 2 wild-type RVA strains with a G1P[8] genogroup 1 backbone (6). Each of these strains had been found during routine surveillance in Nicaragua, where RotaTeq was introduced in 2006, suggesting reassortment of the vaccine strain with circulating wild-type strains. The authors also examined alignments of the NSP2 gene and found no differences at functional domains between the vaccine-derived NSP2 and the circulating wild-type NSP2 (6). This finding could explain why a vaccine-derived NSP2 reassortant was viable. |
State medicaid coverage for tobacco cessation treatments and barriers to coverage - United States, 2014-2015
Singleterry J , Jump Z , DiGiulio A , Babb S , Sneegas K , MacNeil A , Zhang L , Williams KA . MMWR Morb Mortal Wkly Rep 2015 64 (42) 1194-9 Medicaid enrollees have a cigarette smoking prevalence (30.4%) twice as high as that of privately insured Americans (14.7%), placing them at increased risk for smoking-related disease and death. Individual, group, and telephone counseling and seven Food and Drug Administration (FDA)-approved medications are evidence-based, effective treatments for helping tobacco users quit. A Healthy People 2020 objective (TU-8) calls for all state Medicaid programs to adopt comprehensive coverage of these treatments. However, a previous MMWR report indicated that, although state Medicaid coverage of cessation treatments had improved during 2008-2014, this coverage was still limited in most states. To monitor the most recent trends in state Medicaid cessation coverage, the American Lung Association collected data on coverage of, and barriers to, accessing all evidence-based cessation treatments except telephone counseling in state Medicaid programs (for a total of nine treatments) during January 31, 2014-June 30, 2015. As of June 30, 2015, all 50 states covered certain cessation treatments for at least some Medicaid enrollees. During 2014-2015, increases were observed in the number of states covering individual counseling, group counseling, and all seven FDA-approved cessation medications for all Medicaid enrollees; however, only nine states covered all nine treatments for all enrollees. Common barriers to accessing covered treatments included prior authorization requirements, limits on duration, annual limits on quit attempts, and required copayments. 
Previous research in both Medicaid and other populations indicates that state Medicaid programs could reduce smoking prevalence, smoking-related morbidity, and smoking-related health care costs among Medicaid enrollees by covering all evidence-based cessation treatments, removing all barriers to accessing these treatments, promoting coverage to Medicaid enrollees and health care providers, and monitoring use of covered treatments. |
Human papillomavirus vaccination coverage among female adolescents in managed care plans - United States, 2013
Ng J , Ye F , Roth L , Sobel K , Byron S , Barton M , Lindley M , Stokley S . MMWR Morb Mortal Wkly Rep 2015 64 (42) 1185-9 Human papillomavirus (HPV) is the most common sexually transmitted infection, with a reported 79 million persons aged 15-59 years in the United States currently infected with HPV, and approximately 14 million new cases diagnosed each year. Although most HPV infections are asymptomatic, transient, and do not cause disease, persistent HPV infection can lead to cervical, vulvar, vaginal, anal, penile, and oropharyngeal cancer. In the United States, approximately 27,000 HPV-attributable cancers occur each year. HPV vaccination is an effective primary prevention strategy that can reduce many of the HPV infections that lead to cancer, and is routinely recommended for adolescents aged 11-12 years. To determine whether the recommended HPV vaccination series is currently being administered to adolescents with health insurance, CDC and the National Committee for Quality Assurance (NCQA) assessed 2013 data from the Healthcare Effectiveness Data and Information Set (HEDIS). The HEDIS HPV Vaccine for Female Adolescents performance measure evaluates the proportion of female adolescent members in commercial and Medicaid health plans who receive the recommended 3-dose HPV vaccination series by age 13 years. In 2013, in the United States, the median HPV vaccination coverage levels for female adolescents among commercial and Medicaid plans were 12% and 19%, respectively (ranges = 0%-34% for commercial plans; 5%-52% for Medicaid plans). Improving HPV vaccination coverage and understanding of what health plans might do to support HPV vaccination are needed, including understanding the barriers to, and facilitators for, vaccination coverage. |
Cost-effectiveness of active-passive prophylaxis and antiviral prophylaxis during pregnancy to prevent perinatal hepatitis B virus infection
Fan L , Owusu-Edusei K , Schillie SF , Murphy TV . Hepatology 2015 63 (5) 1471-80 In an era of antiviral treatment, reexamination of the cost-effectiveness of strategies to prevent perinatal hepatitis B virus (HBV) transmission in the United States is needed. We used a decision tree and Markov model to estimate the cost-effectiveness of the current U.S. strategy and two alternatives: 1. Universal hepatitis B vaccination (HepB) strategy: No pregnant women are screened for hepatitis B surface antigen (HBsAg). All infants receive HepB before hospital discharge; no infants receive hepatitis B immune globulin (HBIG). 2. Current strategy: All pregnant women are screened for HBsAg. Infants of HBsAg-positive women receive HepB and HBIG within 12 hours of birth. All other infants receive HepB before hospital discharge. 3. Antiviral prophylaxis strategy: All pregnant women are screened for HBsAg. HBsAg-positive women have HBV DNA load measured. Antiviral prophylaxis is offered for 4 months starting in the third trimester to women with DNA load ≥10^6 copies/mL. HepB and HBIG are administered at birth to infants of HBsAg-positive women, and HepB is administered before hospital discharge to infants of HBsAg-negative women. Effects were measured in quality-adjusted life years (QALYs) and incremental cost-effectiveness ratios (ICERs). Compared to the 'Universal HepB strategy', the 'Current strategy' prevented 1,006 chronic HBV infections and saved 13,600 QALYs (ICER: $6,957/QALY saved). 'Antiviral prophylaxis' dominated the 'Current strategy,' preventing an additional 489 chronic infections and saving 800 QALYs and $2.8 million. The results remained robust over a wide range of assumptions. CONCLUSION: The current U.S. strategy for preventing perinatal HBV remains cost-effective compared to the 'Universal HepB strategy'. 
An 'Antiviral prophylaxis strategy' was cost-saving compared to the 'Current strategy' and should be considered to further decrease the burden of perinatal hepatitis B in the United States. |
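The comparisons above reduce to a simple incremental cost-effectiveness calculation: divide the difference in cost between two strategies by the difference in QALYs, and declare one strategy dominant when it both costs less and saves more QALYs. A hedged sketch; the function name and the example cost figure are illustrative, not from the paper:

```python
def icer(cost_new, qalys_new, cost_ref, qalys_ref):
    """Incremental cost-effectiveness ratio of a new strategy versus a
    reference. Returns the string 'dominant' when the new strategy is at
    least as cheap and more effective, in which case no ratio is reported."""
    d_cost = cost_new - cost_ref
    d_qalys = qalys_new - qalys_ref
    if d_cost <= 0 and d_qalys > 0:
        return "dominant"
    if d_qalys == 0:
        raise ValueError("equal effectiveness: ICER is undefined")
    return d_cost / d_qalys

# Illustrative: a strategy costing $94.6M more while saving 13,600 QALYs
# yields an ICER near the $6,957/QALY reported above.
print(round(icer(94_615_200, 13_600, 0, 0)))  # prints 6957
```

The dominance branch is why the abstract reports no ICER for 'Antiviral prophylaxis': a strategy that is cheaper and more effective needs no ratio.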
Healthcare-Associated Transmission of Plasmodium falciparum in New York City.
Lee EH , Adams EH , Madison-Antenucci S , Lee L , Barnwell JW , Whitehouse J , Clement E , Bajwa W , Jones LE , Lutterloh E , Weiss D , Ackelsberg J . Infect Control Hosp Epidemiol 2015 37 (1) 1-3 A patient with no risk factors for malaria was hospitalized in New York City with Plasmodium falciparum infection. After investigating all potential sources of infection, we concluded the patient had been exposed to malaria while hospitalized less than 3 weeks earlier. Molecular genotyping implicated patient-to-patient transmission in a hospital setting. |
Risk of injection-site abscess among infants receiving a preservative-free, two-dose vial formulation of pneumococcal conjugate vaccine in Kenya
Burton DC , Bigogo GM , Audi AO , Williamson J , Munge K , Wafula J , Ouma D , Khagayi S , Mugoya I , Mburu J , Muema S , Bauni E , Bwanaali T , Feikin DR , Ochieng PM , Mogeni OD , Otieno GA , Olack B , Kamau T , Van Dyke MK , Chen R , Farrington P , Montgomery JM , Breiman RF , Scott JA , Laserson KF . PLoS One 2015 10 (10) e0141896 There is a theoretical risk of adverse events following immunization with a preservative-free, 2-dose vial formulation of 10-valent-pneumococcal conjugate vaccine (PCV10). We set out to measure this risk. Four population-based surveillance sites in Kenya (total annual birth cohort of 11,500 infants) were used to conduct a 2-year post-introduction vaccine safety study of PCV10. Injection-site abscesses occurring within 7 days following vaccine administration were clinically diagnosed in all study sites (passive facility-based surveillance) and, also, detected by caregiver-reported symptoms of swelling plus discharge in two sites (active household-based surveillance). Abscess risk was expressed as the number of abscesses per 100,000 injections and was compared for the second vs first vial dose of PCV10 and for PCV10 vs pentavalent vaccine (comparator). A total of 58,288 PCV10 injections were recorded, including 24,054 and 19,702 identified as first and second vial doses, respectively (14,532 unknown vial dose). The risk ratio for abscess following injection with the second (41 per 100,000) vs first (33 per 100,000) vial dose of PCV10 was 1.22 (95% confidence interval [CI] 0.37-4.06). The comparator vaccine was changed from a 2-dose to 10-dose presentation midway through the study. The matched odds ratios for abscess following PCV10 were 1.00 (95% CI 0.12-8.56) and 0.27 (95% CI 0.14-0.54) when compared to the 2-dose and 10-dose pentavalent vaccine presentations, respectively. 
In Kenya, immunization with PCV10 was not associated with an increased risk of injection-site abscess, providing confidence that the vaccine may be safely used in Africa. The relatively higher risk of abscess following the 10-dose presentation of pentavalent vaccine merits further study. |
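The headline comparison above is a rate-and-ratio calculation. The sketch below computes abscess rates per 100,000 injections and a crude risk ratio with a Wald CI on the log scale; the event counts (roughly 8 abscesses per vial-dose group) are back-calculated assumptions consistent with the reported rates, not figures taken from the paper, and the published CI reflects additional adjustment.

```python
import math

def rate_per_100k(events, injections):
    """Events per 100,000 injections."""
    return 100_000 * events / injections

def risk_ratio(e1, n1, e0, n0, z=1.96):
    """Crude risk ratio (group 1 vs group 0) with a Wald CI on the log scale."""
    rr = (e1 / n1) / (e0 / n0)
    se = math.sqrt(1 / e1 - 1 / n1 + 1 / e0 - 1 / n0)
    return rr, rr * math.exp(-z * se), rr * math.exp(z * se)

# Assumed counts: ~8 abscesses after each vial dose
second = rate_per_100k(8, 19_702)              # ~41 per 100,000 (second dose)
first = rate_per_100k(8, 24_054)               # ~33 per 100,000 (first dose)
rr, lo, hi = risk_ratio(8, 19_702, 8, 24_054)  # rr ~ 1.22
```

With such small event counts the CI is wide, which is why the study could not rule out either a protective or a harmful effect of the second vial dose.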
Pneumococcal prevention gets older and wiser
Schuchat A . JAMA Intern Med 2015 175 (12) 1-2 Pneumococcus, or Streptococcus pneumoniae, the “captain of the men of death” in the parlance of Sir William Osler, has killed millions of people while repeatedly frustrating clinicians, vaccine experts, and epidemiologists. The advent of effective antibiotics did not eliminate deaths from pneumococcal disease. Pneumococcal morbidity has remained substantial among the elderly population even though most have received the 23-valent pneumococcal polysaccharide vaccine (PPSV-23). Diagnostic tests for pneumonia are relatively insensitive and nonspecific (1). Thus, it is difficult to evaluate the efficacy of pneumococcal vaccines against pneumonias that do not lead to detectable bloodstream infection. |
HPV vaccination coverage of male adolescents in the United States
Lu PJ , Yankey D , Jeyarajah J , O'Halloran A , Elam-Evans LD , Smith PJ , Stokley S , Singleton JA , Dunne EF . Pediatrics 2015 136 (5) 839-49 BACKGROUND: In 2011, the Advisory Committee on Immunization Practices (ACIP) recommended routine use of human papillomavirus (HPV) vaccine for male adolescents. METHODS: We used the 2013 National Immunization Survey-Teen data to assess HPV vaccine uptake (≥1 dose) and series completion (≥3 doses). Multivariable logistic regression analysis and a predictive marginal model were used to identify independent predictors of vaccination among adolescent males aged 13 to 17 years. RESULTS: HPV vaccination coverage with ≥1 dose was 34.6%, and series completion (≥3 doses) was 13.9%. Coverage was significantly higher among non-Hispanic black and Hispanic male adolescents than among non-Hispanic white male adolescents. Multivariable logistic regression showed that characteristics independently associated with a higher likelihood of HPV vaccination (≥1 dose) included non-Hispanic black race or Hispanic ethnicity; having a mother who was widowed, divorced, or separated; having 1 to 3 physician contacts in the past 12 months; a well-child visit at age 11 to 12 years; having 1 or 2 vaccination providers; living in urban or suburban areas; and receiving vaccinations from >1 type of facility (P < .05). Having a mother with some college or college education, having a higher family income-to-poverty ratio, living in the South or Midwest, and receiving all vaccinations from sexually transmitted disease/school/teen clinics or other facilities were independently associated with a lower likelihood of HPV vaccination (P < .05). CONCLUSIONS: Following the recommendation for routine HPV vaccination of male adolescents, uptake in 2013 was low in this population. Increased efforts are needed to improve vaccination coverage, especially for those who are least likely to be vaccinated. |
In-hospital pneumococcal polysaccharide vaccination is associated with detection of pneumococcal vaccine serotypes in adults hospitalized for community-acquired pneumonia
Grijalva CG , Wunderink RG , Zhu Y , Williams DJ , Balk R , Fakhran S , Courtney DM , Anderson EJ , Qi C , Trabue C , Pavia AT , Moore MR , Jain S , Edwards KM , Self WH . Open Forum Infect Dis 2015 2 (4) ofv135 During an etiology study of adults hospitalized for pneumonia, in which urine specimens were examined for serotype-specific pneumococcal antigen detection, we observed that some patients received 23-valent pneumococcal polysaccharide vaccine before urine collection. Some urine samples became positive for specific vaccine pneumococcal serotypes shortly after vaccination, suggesting false-positive test results. |
An anti-influenza virus antibody inhibits viral infection by reducing nucleus entry of influenza nucleoprotein
Yoon A , Yi KS , Chang SY , Kim SH , Song M , Choi JA , Bourgeois M , Hossain MJ , Chen LM , Donis RO , Kim H , Lee Y , Hwang do B , Min JY , Chang SJ , Chung J . PLoS One 2015 10 (10) e0141312 To date, four main mechanisms mediating inhibition of influenza infection by anti-hemagglutinin antibodies have been reported. Anti-globular-head-domain antibodies block either influenza virus receptor binding to the host cell or progeny virion release from the host cell. Anti-stem-region antibodies hinder the membrane fusion process or induce antibody-dependent cytotoxicity against infected cells. In this study we identified a human monoclonal IgG1 antibody (CT302) that inhibits neither receptor binding nor the membrane fusion process but efficiently reduces nucleus entry of the viral nucleoprotein, suggesting a novel antibody-mediated mechanism of inhibiting viral infection. This antibody binds to the subtype-H3 hemagglutinin globular head domain of group-2 influenza viruses that circulated in the population between 1997 and 2007. |
Association of Tdap vaccination with acute events and adverse birth outcomes among pregnant women with prior tetanus-containing immunizations
Sukumaran L , McCarthy NL , Kharbanda EO , McNeil MM , Naleway AL , Klein NP , Jackson ML , Hambidge SJ , Lugg MM , Li R , Weintraub ES , Bednarczyk RA , King JP , DeStefano F , Orenstein WA , Omer SB . JAMA 2015 314 (15) 1581-7 IMPORTANCE: The Advisory Committee on Immunization Practices (ACIP) recommends the tetanus, diphtheria, and acellular pertussis (Tdap) vaccine for pregnant women during each pregnancy, regardless of prior immunization status. However, safety data on repeated Tdap vaccination in pregnancy are lacking. OBJECTIVE: To determine whether receipt of Tdap vaccine during pregnancy at close intervals from prior tetanus-containing vaccinations is associated with acute adverse events in mothers and adverse birth outcomes in neonates. DESIGN, SETTING, AND PARTICIPANTS: A retrospective cohort study of 29,155 pregnant women aged 14 through 49 years from January 1, 2007, through November 15, 2013, using data from 7 Vaccine Safety Datalink sites in California, Colorado, Minnesota, Oregon, Washington, and Wisconsin. EXPOSURES: Receipt of Tdap in pregnancy following a prior tetanus-containing vaccine less than 2 years before, 2 to 5 years before, or more than 5 years before. MAIN OUTCOMES AND MEASURES: Acute adverse events (fever, allergy, and local reactions) and adverse birth outcomes (small for gestational age, preterm delivery, and low birth weight) were evaluated. Women who were vaccinated with Tdap in pregnancy and had a prior tetanus-containing vaccine more than 5 years before served as controls. RESULTS: There were no statistically significant differences in rates of medically attended acute adverse events or adverse birth outcomes related to timing since prior tetanus-containing vaccination. [table: see text]. 
CONCLUSIONS AND RELEVANCE: Among women who received Tdap vaccination during pregnancy, there was no increased risk of acute adverse events or adverse birth outcomes for those who had been previously vaccinated less than 2 years before or 2 to 5 years before compared with those who had been vaccinated more than 5 years before. These findings suggest that relatively recent receipt of a prior tetanus-containing vaccination does not increase risk after Tdap vaccination in pregnancy. |
Effectiveness of a statewide abusive head trauma prevention program in North Carolina
Zolotor AJ , Runyan DK , Shanahan M , Durrance CP , Nocera M , Sullivan K , Klevens J , Murphy R , Barr M , Barr RG . JAMA Pediatr 2015 169 (12) 1-6 IMPORTANCE: Abusive head trauma (AHT) is a serious condition, with an incidence of approximately 30 cases per 100,000 person-years in the first year of life. OBJECTIVE: To assess the effectiveness of a statewide universal AHT prevention program. DESIGN, SETTING, AND PARTICIPANTS: In total, 88.29% of parents of newborns (n = 405,060) in North Carolina received the intervention (June 1, 2009, to September 30, 2012). A comparison of preintervention and postintervention periods was performed using nurse advice line telephone calls regarding infant crying (January 1, 2005, to December 31, 2010). A difference-in-difference analysis compared AHT rates in the prevention program state with those of other states before and after the implementation of the program (January 1, 2000, to December 31, 2011). INTERVENTION: The Period of PURPLE Crying intervention, developed by the National Center on Shaken Baby Syndrome, was delivered through nurse-provided education, a DVD, and a booklet, with reinforcement by primary care practices and a media campaign. MAIN OUTCOMES AND MEASURES: Changes in the proportion of telephone calls for crying concerns to a nurse advice line and in AHT rates per 100,000 infants in the first year of life after the intervention (June 1, 2009, to September 30, 2011), using hospital discharge data for January 1, 2000, to December 31, 2011. RESULTS: In the 2 years after implementation of the intervention, parental telephone calls to the nurse advice line for crying declined by 20% for children younger than 3 months (rate ratio, 0.80; 95% CI, 0.73-0.87; P < .001) and by 12% for children 3 to 12 months old (rate ratio, 0.88; 95% CI, 0.78-0.99; P = .03). No reduction in state-level AHT rates was observed, with mean rates of 34.01 per 100,000 person-years before the intervention and 36.04 per 100,000 person-years after the intervention. 
A difference-in-difference analysis from January 1, 2000, to December 31, 2011, controlling for economic indicators, indicated that the intervention did not have a statistically significant effect on AHT rates (beta coefficient, -1.42; 95% CI, -13.31 to 10.45). CONCLUSIONS AND RELEVANCE: The Period of PURPLE Crying intervention was associated with a reduction in telephone calls to a nurse advice line, but the study found no reduction in AHT rates over time in North Carolina relative to other states. Consequently, although this observational study was feasible and partly supported the program's effectiveness, further programmatic efforts and evaluation are needed to demonstrate an effect on AHT rates. |
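The two-period logic of such a design (the full study uses a regression with economic controls) can be sketched as a plain difference-in-differences estimate: the pre-to-post change in the intervention state minus the same change in comparison states. The North Carolina rates below come from the abstract; the control-state rates are invented for illustration.

```python
def diff_in_diff(pre_treat, post_treat, pre_ctrl, post_ctrl):
    """Two-period difference-in-differences estimate: the outcome change in
    the treated group net of the change in the control group."""
    return (post_treat - pre_treat) - (post_ctrl - pre_ctrl)

# North Carolina AHT rates (per 100,000 person-years) from the abstract;
# control-state rates are hypothetical.
effect = diff_in_diff(34.01, 36.04, 33.0, 35.5)  # ~ -0.47
```

The point of the subtraction is that any secular trend common to all states cancels out, isolating the intervention's incremental effect.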
Assessing the impact of pneumococcal conjugate vaccines on invasive pneumococcal disease using polymerase chain reaction-based surveillance: an experience from South Africa.
Tempia S , Wolter N , Cohen C , Walaza S , von Mollendorf C , Cohen AL , Moyes J , de Gouveia L , Nzenze S , Treurnicht F , Venter M , Groome MJ , Madhi SA , von Gottberg A . BMC Infect Dis 2015 15 450 BACKGROUND: The use of molecular diagnostic techniques for the evaluation of the impact of pneumococcal conjugate vaccines (PCVs) has not been documented. We aimed to evaluate the impact of PCVs on invasive pneumococcal disease (IPD) using polymerase chain reaction (PCR)-based techniques and compare with results obtained from culture-based methods. METHODS: We implemented two independent surveillance programs for IPD among individuals hospitalized at one large surveillance site in Soweto, South Africa during 2009-2012: (i) PCR-based (targeting the lytA gene) syndromic pneumonia surveillance; and (ii) culture-based laboratory surveillance. Positive samples were serotyped. The molecular serotyping assay included targets for 42 serotypes including all serotypes/serogroups included in the 7-valent (PCV-7) and 13-valent (PCV-13) PCV. The Quellung reaction was used for serotyping of culture-positive cases. We calculated the change in rates of IPD (lytA- or culture-positive) among HIV-uninfected children aged <2 years from the year of PCV-7 introduction (2009) to the post-vaccine years (2011 or 2012). RESULTS: During the study period there were 607 lytA-positive and 1,197 culture-positive cases that were serotyped. Samples with lytA cycle threshold (Ct)-values ≥35 (30.2 %; 123/407) were significantly less likely to have a serotype/serogroup detected for serotypes included in the molecular serotyping assay than those with Ct-values <35 (78.0 %; 156/200) (p < 0.001). From 2009 to 2012 rates of PCV-7 serotypes/serogroups decreased -63.8 % (95 % CI: -79.3 % to -39.1 %) among lytA-positive cases and -91.7 % (95 % CI: -98.8 % to -73.6 %) among culture-positive cases. 
Rates of lytA-positive non-vaccine serotypes/serogroups also decreased significantly (-71.7 %; 95 % CI: -81.1 % to -58.5 %) over the same period. Such a decline was not observed among the culture-positive non-vaccine serotypes (1.2 %; 95 % CI: -96.7 % to 58.4 %). CONCLUSIONS: Significant downward trends in rates of PCV-7 serotype-associated IPD were observed among patients tested by PCR or culture methods; however, trends for non-vaccine serotypes/serogroups differed between the two methods. Misclassification of serotypes/serogroups, affecting the use of non-vaccine serotypes as a control group, may have occurred because of the low performance of the serotyping assay among lytA-positive cases with high Ct-values. Until PCR methods improve further, culture methods should continue to be used to monitor the effects of PCV vaccination programs on IPD incidence. |
A molecular sensor to characterize arenavirus envelope glycoprotein cleavage by subtilisin kexin isozyme-1 (SKI-1)/site-1 protease (S1P).
Oppliger J , da Palma JR , Burri DJ , Bergeron E , Khatib AM , Spiropoulou CF , Pasquato A , Kunz S . J Virol 2015 90 (2) 705-14 Arenaviruses are emerging viruses including several causative agents of severe hemorrhagic fevers in humans. The advent of next-generation sequencing technology has greatly accelerated the discovery of novel arenavirus species. However, for many of these viruses only genetic information is available and their zoonotic disease potential remains unknown. During the arenavirus life cycle, processing of the viral envelope glycoprotein precursor (GPC) by the cellular subtilisin kexin isozyme-1 (SKI-1)/site-1 protease (S1P) is crucial for productive infection. The ability of newly emerging arenaviruses to hijack human SKI-1/S1P appears therefore as a requirement for efficient zoonotic transmission and human disease potential. Here we implement a newly developed cell-based molecular sensor for SKI-1/S1P to characterize the processing of arenavirus GPC-derived target sequences by human SKI-1/S1P in a quantitative manner. We show that only nine amino acids flanking the putative cleavage site are necessary and sufficient to accurately recapitulate efficiency and subcellular location of arenavirus GPC processing. In proof-of-concept, our sensor correctly predicts efficient processing of the GPC of the newly emerged pathogenic Lujo virus by human SKI-1/S1P and defines the exact cleavage site. Lastly, we employed our sensor to show efficient GPC processing of a panel of pathogenic and non-pathogenic New World arenaviruses, suggesting that GPC cleavage represents no barrier for zoonotic transmission of these pathogens. Our SKI-1/S1P sensor thus represents a rapid and robust test system to assess processing of putative cleavage sites derived from newly discovered arenavirus GPC by SKI-1/S1P of humans or any other species, based solely on sequence information. 
IMPORTANCE: Arenaviruses are important emerging human pathogens that can cause severe hemorrhagic fevers with high mortality in humans. A crucial step in productive infection of human cells by arenaviruses is processing of the viral envelope glycoprotein by the cellular subtilisin kexin isozyme-1 (SKI-1)/site-1 protease (S1P). In order to break the species barrier during zoonotic transmission and to cause severe disease in humans, newly emerging arenaviruses must be able to efficiently hijack human SKI-1/S1P. Here we implement a newly developed cell-based molecular sensor for human SKI-1/S1P to characterize the processing of arenavirus glycoproteins in a quantitative manner. We further use our sensor to correctly predict efficient processing of the glycoprotein of the newly emerged pathogenic Lujo virus by human SKI-1/S1P. Our sensor thus represents a rapid and robust test system to assess, based solely on sequence information, whether the glycoprotein of any newly emerging arenavirus can be efficiently processed by human SKI-1/S1P. |
Survey of influenza and other respiratory viruses diagnostic testing in US hospitals, 2012-2013.
Su S , Fry AM , Kirley PD , Aragon D , Yousey-Hindes K , Meek J , Openo K , Oni O , Sharangpani R , Morin C , Hollick G , Lung K , Laidler M , Lindegren ML , Schaffner W , Atkinson A , Chaves SS . Influenza Other Respir Viruses 2015 10 (2) 86-90 We sought to assess diagnostic practices for influenza and other respiratory viruses in a survey of hospitals and laboratories participating in the US Influenza Hospitalization Surveillance Network in 2012-13. Of the 240 participating laboratories, 67% relied only on commercially available rapid influenza diagnostic tests to diagnose influenza. Few reported the availability of molecular diagnostic assays for detection of influenza (26%) and other viral pathogens (≤ 20%) in hospitals and commercial laboratories. Reliance on insensitive assays to detect influenza may detract from optimal clinical management of influenza infections in hospitals. |
A real-time PCR assay for detection of the Ehrlichia muris-like agent, a newly recognized pathogen of humans in the upper Midwestern United States.
Allerdice ME , Pritt BS , Sloan LM , Paddock CD , Karpathy SE . Ticks Tick Borne Dis 2015 7 (1) 146-149 The Ehrlichia muris-like agent (EMLA) is an emerging, tick-transmitted human pathogen that occurs in the upper Midwestern United States. Here, we describe the development and validation of a p13-based quantitative real-time TaqMan PCR assay to detect EMLA in blood or tissues of ticks, humans, and rodents. The primer and probe specificities of the assay were ascertained using a large panel of various Ehrlichia species and other members of Rickettsiales. In addition to control DNA, both non-infected and EMLA-infected human blood, Mus musculus blood, and M. musculus tissue extracts were evaluated, as were non-infected and EMLA-infected Ixodes scapularis and uninfected Dermacentor variabilis DNA lysates. The specificity of the probe was determined via real-time PCR. An EMLA p13 control plasmid was constructed, and serial dilutions were used to determine the analytical sensitivity, which was found to be 1 copy per 4 μl of template DNA. The sensitivity and specificity of this assay provide a powerful tool for ecological studies involving arthropod vectors and their mammalian hosts. |
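The analytical sensitivity quoted above is determined from a plasmid dilution series, which is conventionally read off a standard curve of Ct versus log10 copy number. A minimal sketch of that calculation follows; the Ct values are hypothetical illustrations, not data from the paper.

```python
import math

def fit_standard_curve(log10_copies, ct_values):
    """Least-squares fit of Ct = slope * log10(copies) + intercept."""
    n = len(log10_copies)
    mx = sum(log10_copies) / n
    my = sum(ct_values) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_copies, ct_values))
    sxx = sum((x - mx) ** 2 for x in log10_copies)
    slope = sxy / sxx
    return slope, my - slope * mx

# Ten-fold plasmid dilution series (copies/reaction) with hypothetical Ct values.
dilutions = [1e6, 1e5, 1e4, 1e3, 1e2, 1e1, 1e0]
cts = [15.1, 18.5, 21.8, 25.2, 28.6, 31.9, 35.3]

slope, intercept = fit_standard_curve([math.log10(d) for d in dilutions], cts)
efficiency = 10 ** (-1 / slope) - 1  # ~1.0 means near-perfect doubling per cycle

def copies_from_ct(ct):
    """Interpolate an unknown sample's copy number from its Ct."""
    return 10 ** ((ct - intercept) / slope)
```

The lowest dilution at which the assay still amplifies reproducibly defines the limit of detection; here, the curve maps the highest Ct back to roughly one copy per reaction.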
Gas-phase reaction products and yields of terpinolene with ozone and nitric oxide using a new derivatization agent
Ham JE , Jackson SR , Harrison JC , Wells JR . Atmos Environ (1994) 2015 122 513-520 A new derivatization agent, O-tert-butylhydroxylamine hydrochloride (TBOX), was used to investigate the carbonyl reaction products from terpinolene ozonolysis. With ozone (O3) as the limiting reagent, four carbonyl compounds were detected: methylglyoxal (MG), 4-methylcyclohex-3-en-1-one (4MCH), 6-oxo-3-(propan-2-ylidene)heptanal (6OPH), and 3,6-dioxoheptanal (36DOH). The tricarbonyl 36DOH has not been previously observed. Using cyclohexane as a hydroxyl radical (OH) scavenger, the yields of 6OPH and 36DOH were reduced, indicating the influence that secondary OH radicals have on terpinolene ozonolysis products. However, the MG yield increased and the 4MCH yield was unchanged when OH radicals were scavenged, suggesting these products are made only by the terpinolene + O3 reaction. The detection of 36DOH using TBOX highlights the advantages of a smaller molecular weight derivatization agent for the detection of multi-carbonyl compounds. The product yields from terpinolene ozonolysis experiments conducted in the presence of 20 ppb nitric oxide (NO) remained unchanged except for MG, which decreased. However, in experiments where O3 was kept constant at 50 ppb and NO was varied (20, 50, 100 ppb), MG, 6OPH, and 36DOH decreased with increasing NO, while 4MCH increased with increasing NO. The use of TBOX derivatization, if combined with other derivatization agents, may address a recurring need to simply and accurately detect multi-functional oxygenated species in air. |
Human norovirus culture in B cells
Jones MK , Grau KR , Costantini V , Kolawole AO , de Graaf M , Freiden P , Graves CL , Koopmans M , Wallet SM , Tibbetts SA , Schultz-Cherry S , Wobus CE , Vinje J , Karst SM . Nat Protoc 2015 10 (12) 1939-47 Human noroviruses (HuNoVs) are a leading cause of foodborne disease and severe childhood diarrhea, and they cause a majority of the gastroenteritis outbreaks worldwide. However, the development of effective and long-lasting HuNoV vaccines and therapeutics has been greatly hindered by their uncultivability. We recently demonstrated that a HuNoV replicates in human B cells, and that commensal bacteria serve as a cofactor for this infection. In this protocol, we provide detailed methods for culturing the GII.4-Sydney HuNoV strain directly in human B cells, and in a coculture system in which the virus must cross a confluent epithelial barrier to access underlying B cells. We also describe methods for bacterial stimulation of HuNoV B cell infection and for measuring viral attachment to the surface of B cells. Finally, we highlight variables that contribute to the efficiency of viral replication in this system. Infection assays require 3 d and attachment assays require 3 h. Analysis of infection or attachment samples, including RNA extraction and RT-qPCR, requires approximately 6 h. |
ICAM-1 regulates the survival of influenza virus in lung epithelial cells during the early stages of infection
Othumpangat S , Noti JD , McMillen CM , Beezhold DH . Virology 2015 487 85-94 Intercellular cell adhesion molecule-1 (ICAM-1) is an inducible cell surface glycoprotein that is expressed on many cell types. Influenza virus infection enhanced ICAM-1 expression and messenger RNA levels. Human bronchial epithelial cells (HBEpC) and nasal epithelial cells exposed to different strains of influenza virus (H1N1, H3N2, and H9N1) showed a significant increase in ICAM-1 gene expression (p<0.001) along with increased ICAM-1 protein levels (surface and secreted). Depleting ICAM-1 in HBEpC with ICAM-1 siRNA and subsequently infecting with H1N1 resulted in increased viral copy numbers. Influenza virus infection in HBEpC resulted in up-regulation of NF-kB protein, and the lack of ICAM-1 decreased NF-kB activity in an NF-kB luciferase reporter assay. Addition of exogenous IL-1beta to HBEpC induced ICAM-1 expression and decreased matrix gene copy number. Taken together, these findings suggest that HBEpC-induced ICAM-1 plays a key role in modulating influenza virus survival, possibly through the NF-kB pathway. |
Detection of the HA-33 protein in botulinum neurotoxin type G complex by mass spectrometry
Kalb SR , Baudys J , Barr JR . BMC Microbiol 2015 15 (1) 227 BACKGROUND: The disease botulism is caused by intoxication with botulinum neurotoxins (BoNTs), extremely toxic proteins which cause paralysis. This neurotoxin is produced by some members of the Clostridium botulinum and closely related species, and is produced as a protein complex consisting of the neurotoxin and neurotoxin-associated proteins (NAPs). There are seven known serotypes of BoNT, A-G, and the composition of the NAPs can differ between these serotypes. It was previously published that the BoNT/G complex consisted of BoNT/G, nontoxic-nonhemagglutinin (NTNH), Hemagglutinin 70 (HA-70), and HA-17, but that HA-33, a component of the protein complex of other serotypes of BoNT, was not found. METHODS: Components of the BoNT/G complex were first separated by SDS-PAGE, and bands corresponding to components of the complex were digested and analyzed by LC-MS/MS. RESULTS: Gel bands were identified with sequence coverages of 91 % for BoNT/G, 91 % for NTNH, 89 % for HA-70, and 88 % for HA-17. Notably, one gel band was also clearly identified as HA-33 with 93 % sequence coverage. CONCLUSIONS: The BoNT/G complex consists of BoNT/G, NTNH, HA-70, HA-17, and HA-33. These proteins form the progenitor form of BoNT/G, similar to all other HA positive progenitor toxin complexes. |
Growth charts for children with Down syndrome in the United States
Zemel BS , Pipan M , Stallings VA , Hall W , Schadt K , Freedman DS , Thorpe P . Pediatrics 2015 136 (5) e1204-11 BACKGROUND AND OBJECTIVES: Children with Down syndrome (DS) have lower birth weights and grow more slowly than children without DS. Advances in and increased access to medical care have improved the health and well-being of individuals with DS; however, it is unknown whether their growth has also improved. Our objective was to develop new growth charts for children with DS and compare them to older charts from the United States and more contemporary charts from the United Kingdom. METHODS: The Down Syndrome Growing Up Study (DSGS) enrolled a convenience sample of children with DS up to 20 years of age and followed them longitudinally. Growth parameters were measured by research anthropometrists. Sex-specific growth charts were generated for the age ranges birth to 36 months and 2 to 20 years using the LMS method. Weight-for-length and BMI charts were also generated. Comparisons with other curves were presented graphically. RESULTS: New DSGS growth charts were developed by using 1520 measurements on 637 participants. DSGS growth charts for children <36 months of age showed marked improvements in weight compared with older US charts. DSGS charts for 2- to 20-year-olds showed that contemporary males are taller than previous charts showed. Generally, the DSGS growth charts are similar to the UK charts. CONCLUSIONS: The DSGS growth charts can be used as screening tools to assess growth and nutritional status and to provide indications of how growth of an individual child compares with peers of the same age and sex with DS. |
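The LMS method named in the growth-chart abstract above summarizes each age- and sex-specific distribution with three parameters (L, the Box-Cox power; M, the median; S, the coefficient of variation), from which any child's measurement converts to a z-score. A minimal sketch of that conversion; the parameter values below are hypothetical placeholders, not values from the published DSGS charts.

```python
import math

def lms_zscore(x, L, M, S):
    """Box-Cox (LMS) z-score: L is the power, M the median,
    and S the coefficient of variation at a given age and sex."""
    if abs(L) < 1e-9:
        # Limiting case L -> 0 reduces to a log transform.
        return math.log(x / M) / S
    return ((x / M) ** L - 1.0) / (L * S)

# Hypothetical LMS parameters for illustration only (not from the DSGS charts):
L, M, S = -1.6, 16.0, 0.08
z_at_median = lms_zscore(16.0, L, M, S)   # 0.0 by construction at x = M
z_above = lms_zscore(20.0, L, M, S)       # positive: above the median
```

Screening then amounts to comparing the resulting z-score (or its percentile) against the chart for peers of the same age and sex.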
The state of evaluation research on food policies to reduce obesity and diabetes among adults in the United States, 2000-2011
Freudenberg N , Franzosa E , Sohler N , Li R , Devlin H , Albu J . Prev Chronic Dis 2015 12 E182 INTRODUCTION: Improvements in diet can prevent obesity and type 2 diabetes. Although policy changes provide a foundation for improvement at the population level, evidence for the effectiveness of such changes is slim. This study summarizes the literature on recent efforts in the United States to change food-related policies to prevent obesity and diabetes among adults. METHODS: We conducted a systematic review of evidence of the impact of food policies. Websites of government, academic, and nonprofit organizations were scanned to generate a typology of food-related policies, which we classified into 18 categories. A key-word search and a search of policy reports identified empirical evaluation studies of these categories. Analyses were limited to strategies with 10 or more reports. Of 422 articles identified, 94 met these criteria. Using publication date, study design, study quality, and dietary outcomes assessed, we evaluated the strength of evidence for each strategy in 3 assessment categories: time period, quality, and study design. RESULTS: Five strategies yielded 10 or more reports. Only 2 of the 5 strategies, menu labeling and taxes on unhealthy foods, had 50% or more studies with positive findings in at least 2 of 3 assessment categories. Most studies used methods that were rated medium quality. Although the number of published studies increased over 11 years, study quality did not show any clear trend nor did it vary by strategy. CONCLUSION: Researchers and policy makers can improve the quality and rigor of policy evaluations to synthesize existing evidence and develop better methods for gleaning policy guidance from the ample but imperfect data available. |
Calcium plus vitamin D supplementation and risk of fractures: an updated meta-analysis from the National Osteoporosis Foundation
Weaver CM , Alexander DD , Boushey CJ , Dawson-Hughes B , Lappe JM , LeBoff MS , Liu S , Looker AC , Wallace TC , Wang DD . Osteoporos Int 2015 27 (1) 367-76 The aim was to meta-analyze randomized controlled trials of calcium plus vitamin D supplementation and fracture prevention. Meta-analysis showed a significant 15% reduced risk of total fractures (summary relative risk estimate [SRRE], 0.85; 95% confidence interval [CI], 0.73-0.98) and a 30% reduced risk of hip fractures (SRRE, 0.70; 95% CI, 0.56-0.87). INTRODUCTION: Calcium plus vitamin D supplementation has been widely recommended to prevent osteoporosis and subsequent fractures; however, considerable controversy exists regarding the association of such supplementation and fracture risk. The aim was to conduct a meta-analysis of randomized controlled trials (RCTs) of calcium plus vitamin D supplementation and fracture prevention in adults. METHODS: A PubMed literature search was conducted for the period from July 1, 2011 through July 31, 2015. RCTs reporting the effect of calcium plus vitamin D supplementation on fracture incidence were selected from English-language studies. Qualitative and quantitative information was extracted; random-effects meta-analyses were conducted to generate summary relative risk estimates (SRREs) for total and hip fractures. Statistical heterogeneity was assessed using Cochran's Q test and the I² statistic, and potential for publication bias was assessed. RESULTS: Of the citations retrieved, eight studies including 30,970 participants met criteria for inclusion in the primary analysis, reporting 195 hip fractures and 2231 total fractures. Meta-analysis of all studies showed that calcium plus vitamin D supplementation produced a statistically significant 15% reduced risk of total fractures (SRRE, 0.85; 95% CI, 0.73-0.98) and a 30% reduced risk of hip fractures (SRRE, 0.70; 95% CI, 0.56-0.87). 
Numerous sensitivity and subgroup analyses produced similar summary associations. A limitation is that this study utilized data from subgroup analysis of the Women's Health Initiative. CONCLUSIONS: This meta-analysis of RCTs supports the use of calcium plus vitamin D supplements as an intervention for fracture risk reduction in both community-dwelling and institutionalized middle-aged to older adults. |
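The random-effects pooling that produces an SRRE like those above is most commonly implemented with the DerSimonian-Laird estimator: study-level log relative risks are weighted by the inverse of their within-study variance plus a between-study variance tau² derived from Cochran's Q. A sketch on hypothetical study inputs (not the eight trials pooled in this paper):

```python
import math

def dersimonian_laird(log_rr, variances):
    """Random-effects pooled relative risk (DerSimonian-Laird method).
    log_rr: per-study log relative risks; variances: their variances."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)
    # Cochran's Q drives the between-study variance estimate tau^2.
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rr))
    df = len(log_rr) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # Re-weight with tau^2 added to each study's variance.
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_star, log_rr)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    rr = math.exp(pooled)
    ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
    return rr, ci

# Three hypothetical studies with protective point estimates.
rr, ci = dersimonian_laird(
    [math.log(0.8), math.log(0.9), math.log(0.7)],
    [0.01, 0.02, 0.015],
)
```

When Q does not exceed its degrees of freedom, tau² is truncated to zero and the result coincides with the fixed-effect estimate.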
A source-based measurement database for occupational exposure assessment of electromagnetic fields in the INTEROCC study: a literature review approach
Vila J , Bowman JD , Richardson L , Kincl L , Conover DL , McLean D , Mann S , Vecchia P , van Tongeren M , Cardis E . Ann Occup Hyg 2015 60 (2) 184-204 INTRODUCTION: To date, occupational exposure assessment of electromagnetic fields (EMF) has relied on occupation-based measurements and exposure estimates. However, misclassification due to between-worker variability remains an unsolved challenge. A source-based approach, supported by detailed subject data on determinants of exposure, may allow for a more individualized exposure assessment. Detailed information on the use of occupational sources of exposure to EMF was collected as part of the INTERPHONE-INTEROCC study. To support a source-based exposure assessment effort within this study, this work aimed to construct a measurement database for the occupational sources of EMF exposure identified, assembling available measurements from the scientific literature. METHODS: First, a comprehensive literature search was performed for published and unpublished documents containing exposure measurements for the EMF sources identified, both a priori and from study subjects' answers. Then, the measurements identified were assessed for quality and relevance to the study objectives. Finally, the selected measurements and complementary information were compiled into an Occupational Exposure Measurement Database (OEMD). RESULTS: Currently, the OEMD contains 1624 sets of measurements (>3000 entries) for 285 sources of EMF exposure, organized by frequency band (0 Hz to 300 GHz) and dosimetry type. Ninety-five documents were selected from the literature (almost 35% of them unpublished technical reports), containing measurements considered informative and valid for our purpose. Measurement data and complementary information collected from these documents came from 16 different countries and covered the period between 1974 and 2013. 
CONCLUSION: We have constructed a database with measurements and complementary information for the most common sources of exposure to EMF in the workplace, based on the responses to the INTERPHONE-INTEROCC study questionnaire. This database covers the entire EMF frequency range and represents the most comprehensive resource of information on occupational EMF exposure. It is available at www.crealradiation.com/index.php/en/databases. |
Summary of notifiable noninfectious conditions and disease outbreaks: acute occupational pesticide-related illness and injury - United States, 2007-2010
Calvert GM , Beckman J , Prado JB , Bojes H , Mulay P , Lackovic M , Waltz J , Schwartz A , Mitchell Y , Moraga-McHaley S , Leinenkugel K , Higgins S . MMWR Morb Mortal Wkly Rep 2015 62 (54) 5-10 CDC's National Institute for Occupational Safety and Health (NIOSH) collects data on acute pesticide-related illnesses and injuries reported by 11 states (California, Florida, Iowa, Louisiana, Michigan, North Carolina, New Mexico [2007–2008 only], New York, Oregon, Texas, and Washington). This report summarizes data on illnesses and injuries arising from occupational exposure to conventional pesticides during 2007–2010. This report is a part of the first-ever Summary of Notifiable Noninfectious Conditions and Disease Outbreaks, which encompasses various surveillance years but is being published in 2015 (1). The Summary of Notifiable Noninfectious Conditions and Disease Outbreaks appears in the same volume of MMWR as the annual Summary of Notifiable Infectious Diseases (2). | Background | Pesticides are substances or mixtures of substances intended to prevent, destroy, repel, or mitigate pests (e.g., insects, rodents, fungi, and weeds). In 2007, the most recent year for which data were available, an estimated 2.1 billion pounds of conventional pesticides were used in the United States (3), representing approximately 22% of worldwide use of these pesticides. Conventional pesticides include insecticides, herbicides, fungicides, and fumigants and exclude chlorine, hypochlorites, and biocides. | The toxicity of pesticides continues to raise public concern and is the focus of much media attention. The benefits of pesticides are well recognized and primarily include their role in protecting the food supply and in controlling disease vectors (4). However, no form of pest control is perfectly safe. Tracking the associated health effects of pesticides can help ensure that no pesticides pose an unreasonable burden (5). 
As such, public health surveillance of acute pesticide-related illness and injury serves a vital societal role by assessing the magnitude and characteristics of this condition. Surveillance of acute pesticide-related illness and injury has been endorsed by several professional organizations and federal agencies, including the American Medical Association (6), the Council of State and Territorial Epidemiologists (7), NIOSH (8), and the U.S. Government Accountability Office (9). To address the need for public health surveillance of acute pesticide-related illness and injury, NIOSH established such a surveillance program in 1987. | Pesticide products must pass an extensive battery of testing prior to being registered by the U.S. Environmental Protection Agency (EPA). This testing forms the basis for the human health and environmental risk assessments conducted by EPA that guide identification of the conditions under which a pesticide can be used. These conditions of use are reflected in pesticide product labeling. Compliance with these use conditions is expected to prevent unreasonable adverse effects to human health and the environment. To verify the real-world effectiveness of pesticide product labeling in preventing adverse human health effects, findings from acute pesticide-related illness and injury surveillance systems are reviewed. These surveillance data assist EPA in determining whether labeling is effective or whether labeling improvements are needed. When adverse health effects occur despite adherence to label instructions, and if EPA determines the magnitude to be unreasonable, EPA requires that interventions be instituted that involve changing pesticide use practices and/or modifying regulatory measures (10). Acute pesticide-related illness and injury also can occur because of a lack of compliance with existing pesticide regulations. The appropriate interventions for these cases include enhanced education and enforcement. |
Summary of notifiable noninfectious conditions and disease outbreaks: surveillance for silicosis - Michigan and New Jersey, 2003-2010
Filios MS , Mazurek JM , Schleiff PL , Reilly MJ , Rosenman KD , Lumia ME , Worthington K . MMWR Morb Mortal Wkly Rep 2015 62 (54) 81-5 CDC's National Institute for Occupational Safety and Health (NIOSH), state health departments, and other state entities maintain a state-based surveillance program of confirmed silicosis cases. Data on confirmed cases are collected and compiled by state entities and submitted to CDC. This report summarizes information for cases of silicosis that were reported to CDC for 2003–2010. The data for this report were final as of December 31, 2010. Data are presented in tabular form on the prevalence of silicosis, the number of cases and the distribution of cases by year, industry, occupation, and the duration of occupational exposure to dust containing respirable crystalline silica (Tables 1–4). The number of cases by year is presented graphically (Figure). This report is a part of the first-ever Summary of Notifiable Noninfectious Conditions and Disease Outbreaks, which encompasses various surveillance years but is being published in 2015 (1). The Summary of Notifiable Noninfectious Conditions and Disease Outbreaks appears in the same volume of MMWR as the annual Summary of Notifiable Infectious Diseases (2). | Background | Silicosis, a form of pneumoconiosis, is a progressive occupational lung disease caused by the inhalation, deposition, and retention of respirable dust containing crystalline silica. There is no effective specific treatment, and patients with silicosis can be offered only supportive care. Silicosis is preventable by using non-silica substitution materials, effective dust control measures, and personal protective equipment.* Occupational exposure to respirable dust containing crystalline silica occurs in mining, quarrying, sandblasting, rock drilling, construction, pottery making, stone masonry, and tunneling operations (3). 
The Occupational Safety and Health Administration (OSHA) estimates that approximately 2.2 million workers are currently exposed† to respirable crystalline silica in industries where exposure might occur: 1.85 million workers in the construction industry and 320,000 workers in general industry and maritime workplaces (4,5). Typically a disease of long latency, silicosis usually is diagnosed through a chest radiograph after ≥10 years of exposure to respirable crystalline silica dust. Nodular silicosis can also develop within 5–10 years of exposure to higher concentrations of crystalline silica. A clinical continuum exists between the accelerated and the chronic forms of silicosis. Acute silicosis has a different pathophysiology than accelerated or chronic silicosis. It might develop within weeks of initial exposure and is associated with exposures to extremely high concentrations† of crystalline silica. Respiratory impairment is severe, and the disease is usually fatal within a year of diagnosis. In addition, occupational exposure to respirable crystalline silica puts workers at increased risk for other serious health conditions including chronic obstructive lung disease, kidney and connective tissue disease, tuberculosis and other mycobacterial-related diseases, and lung cancer (6). In 1997, the International Agency for Research on Cancer classified crystalline silica as carcinogenic to humans (7), and this classification was reconfirmed in 2012 (8). | During 1968–2010, the number of deaths in the United States for which silicosis was listed on the death certificate declined from 1,065 (age-adjusted death rate: 8.21 per million persons aged ≥15 years) in 1968 to 101 (rate: 0.39) in 2010 (9). Analysis of 1968–2005 data indicated that silicosis-attributable years of potential life lost before age 65 years decreased substantially during 1968–2005, but the decline slowed during the last 10 years of that period (10). 
However, no decline occurred in the number of hospitalizations for which silicosis was listed as one of the discharge diagnoses during 1993–2011.§ Cases of silicosis continue to occur despite the existence of legally enforceable exposure limits.† Silicosis in any of its clinical forms is consistently undercounted by the Survey of Occupational Injuries and Illnesses (SOII), an employer-based surveillance system maintained by the Bureau of Labor Statistics (11). Estimates indicate that 3,600–7,300 new cases of silicosis might be occurring each year (11). In 2008, the National Academy of Sciences recommended that surveillance efforts to prevent silicosis and other interstitial lung diseases be continued and expanded (12). | Cases of silicosis are sentinel events that indicate the need for intervention (13). Silicosis was first designated as a notifiable condition at the national level in 1999¶ and reconfirmed in 2009.** In 2010, silicosis was a reportable condition in 25 states.†† | NIOSH has supported efforts by states to conduct surveillance for silicosis under several cooperative agreements, including the Sentinel Event Notification system for Occupational Risks (SENSOR) and the State-Based Occupational Safety and Health Surveillance agreements. In 1987, states initiated active silicosis surveillance under SENSOR and began providing data voluntarily to NIOSH (14,15). Since 1992, data summaries have been published in a series of reports.§§ The number of states¶¶ that conduct silicosis surveillance varies by year based on funding support by NIOSH. Currently, Michigan and New Jersey continue to maintain their sentinel case-based silicosis surveillance systems and intervention programs. These two states are the only states that continue to provide data voluntarily to NIOSH. | This report summarizes data for silicosis cases that met the surveillance case definition for a confirmed silicosis case for the period 2003–2010 as reported by Michigan and New Jersey. 
Data from state programs are updated annually and are available through the CDC's Work-Related Lung Disease Surveillance System (eWoRLD).*** |
Coccidioidomycosis among workers constructing solar power farms, California, USA, 2011-2014
Wilken JA , Sondermeyer G , Shusterman D , McNary J , Vugia DJ , McDowell A , Borenstein P , Gilliss D , Ancock B , Prudhomme J , Gold D , Windham GC , Lee L , Materna BL . Emerg Infect Dis 2015 21 (11) 1997-2005 Coccidioidomycosis is associated with soil-disruptive work in Coccidioides-endemic areas of the southwestern United States. Among 3,572 workers constructing 2 solar power-generating facilities in San Luis Obispo County, California, USA, we identified 44 patients with symptom onset during October 2011-April 2014 (attack rate 1.2 cases/100 workers). Of these 44 patients, 20 resided in California outside San Luis Obispo County and 10 resided in another state; 9 were hospitalized (median 3 days), 34 missed work (median 22 days), and 2 had disseminated disease. Of the 25 patients who frequently performed soil-disruptive work, 6 reported frequent use of respiratory protection. As solar farm construction in Coccidioides-endemic areas increases, additional workers will probably be exposed and infected unless awareness is emphasized and effective exposure reduction measures implemented, including limiting dust generation and providing respiratory protection. Medical providers, including those in non-Coccidioides-endemic areas, should suspect coccidioidomycosis in workers with compatible illness and report cases to their local health department. |
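The attack rate quoted in the coccidioidomycosis abstract above is simple cohort arithmetic (cases per 100 members of the exposed workforce); a one-line check against the abstract's own figures:

```python
def attack_rate_per_100(cases, cohort_size):
    """Crude attack rate expressed as cases per 100 exposed workers."""
    return 100.0 * cases / cohort_size

# Figures from the abstract: 44 cases among 3,572 construction workers.
rate = attack_rate_per_100(44, 3572)  # rounds to 1.2 cases/100 workers
```

The same ratio form extends to subgroup rates (e.g., among the soil-disruptive workers) when the subgroup denominators are known.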
Shale failure mechanics and intervention measures in underground coal mines: results from 50 years of ground control safety research
Murphy MM . Rock Mech Rock Eng 2015 49 (2) 661-671 Ground control research in underground coal mines has been ongoing for over 50 years. One of the most problematic issues in underground coal mines is roof failures associated with weak shale. This paper will present a historical narrative on the research the National Institute for Occupational Safety and Health has conducted in relation to rock mechanics and shale. This paper begins by first discussing how shale is classified in relation to coal mining. Characterizing and planning for weak roof sequences is an important step in developing an engineering solution to prevent roof failures. Next, the failure mechanics associated with the weak characteristics of shale will be discussed. Understanding these failure mechanics also aids in applying the correct engineering solutions. The various solutions that have been implemented in the underground coal mining industry to control the different modes of failure will be summarized. Finally, a discussion on current and future research relating to rock mechanics and shale is presented. The overall goal of the paper is to share the collective ground control experience of controlling roof structures dominated by shale rock in underground coal mining. |
Lung pathology in U.S. coal workers with rapidly progressive pneumoconiosis implicates silica and silicates
Cohen RA , Petsonk EL , Rose C , Young B , Regier M , Najmuddin A , Abraham JL , Churg A , Green FH . Am J Respir Crit Care Med 2015 193 (6) 673-80 RATIONALE: Recent reports of progressive massive fibrosis and rapidly progressive pneumoconiosis among US coal miners have raised concerns over excessive exposures to coal mine dust, despite reports of declining dust levels. OBJECTIVES: To evaluate the histologic abnormalities and retained dust particles in available coal miner lung pathology specimens and compare findings to those from corresponding chest radiographs. METHODS: Miners with severe disease and available lung tissue were identified through investigator outreach. Demographics, smoking, and work history were obtained. Chest radiographs were interpreted according to the International Labour Organization classification to determine whether criteria for rapidly progressive pneumoconiosis were met. Pathology slides were scored by three expert pulmonary pathologists, using a standardized nomenclature and scoring system. MEASUREMENTS AND MAIN RESULTS: Of the 13 cases reviewed, 12 had progressive massive fibrosis and 11 had silicosis, many with features of accelerated silicosis and mixed dust lesions. Only four had classic lesions of simple coal workers' pneumoconiosis. Four had diffuse interstitial fibrosis with chronic inflammation, and two had focal alveolar proteinosis. Polarized light microscopy revealed large amounts of birefringent mineral dust particles consistent with silica and silicates; carbonaceous coal dust was less prominent. Specimens with features of silicosis were significantly associated (p=0.047) with rounded (p,q,r) opacities on chest imaging, while grade 3 interstitial fibrosis was associated (p=0.02) with the presence of irregular (s,t,u) opacities on chest imaging. 
CONCLUSIONS: Our findings suggest that rapidly progressive pneumoconiosis in these miners was caused by exposures to coal mine dusts containing high concentrations of respirable silica and silicates. |
Cardiac and mitochondrial dysfunction following acute pulmonary exposure to mountaintop removal mining particulate matter
Nichols CE , Shepherd DL , Knuckles TL , Thapa D , Stricker JC , Stapleton PA , Minarchick VC , Erdely A , Zeidler-Erdely PC , Alway SE , Nurkiewicz TR , Hollander JM . Am J Physiol Heart Circ Physiol 2015 309 (12) ajpheart.00353.2015 Throughout the United States, air pollution correlates with adverse health outcomes, and cardiovascular disease incidence is commonly increased following environmental exposure. In areas surrounding active mountaintop removal mines (MTM), a further increase in cardiovascular morbidity is observed and may be attributed in part to particulate matter (PM) released from the mine. The mitochondrion has been shown to be central in the etiology of many cardiovascular diseases, yet its role in PM-related cardiovascular effects is not fully understood. In this study, we sought to elucidate the cardiac processes that are disrupted following exposure to mountaintop removal mining particulate matter (PMMTM). To address this question, we exposed male Sprague-Dawley rats to PMMTM, collected within one mile of an active MTM site, using intratracheal instillation. Twenty-four hours following exposure, we evaluated cardiac function, apoptotic indices, and mitochondrial function. PMMTM exposure elicited a significant decrease in ejection fraction and fractional shortening compared to controls. Investigation into the cellular impacts of PMMTM exposure identified a significant increase in mitochondrial-induced apoptosis, as reflected by an increase in TUNEL-positive nuclei and increased caspase-3 and -9 activities. Finally, a significant increase in mitochondrial transition pore opening leading to decreased mitochondrial function was identified following exposure. In conclusion, our data suggest that pulmonary exposure to PMMTM increases cardiac mitochondrial-associated apoptosis and decreases mitochondrial function concomitant with decreased cardiac function. 
These results suggest that increased cardiovascular disease incidence in populations surrounding MTM mines may be associated with increased cardiac cell apoptosis and decreased mitochondrial function. |
One hundred years after its discovery in Guatemala by Rodolfo Robles, Onchocerca volvulus transmission has been eliminated from the Central Endemic Zone
Richards F Jr , Rizzo N , Diaz Espinoza CE , Morales Monroy Z , Crovella Valdez CG , de Cabrera RM , de Leon O , Zea-Flores G , Sauerbrey M , Morales AL , Rios D , Unnasch TR , Hassan HK , Klein R , Eberhard M , Cupp E , Dominguez A . Am J Trop Med Hyg 2015 93 (6) 1295-304 We report the elimination of Onchocerca volvulus transmission from the Central Endemic Zone (CEZ) of onchocerciasis in Guatemala, the largest focus of this disease in the Americas and the first to be discovered in this hemisphere by Rodolfo Robles Valverde in 1915. Mass drug administration (MDA) with ivermectin was launched in 1988, with semiannual MDA coverage reaching at least 85% of the eligible population in > 95% of treatment rounds during the 12-year period, 2000-2011. Serial parasitological testing to monitor MDA impact in sentinel villages showed a decrease in microfilaria skin prevalence from 70% to 0%, and polymerase chain reaction (PCR)-based entomological assessments of the principal vector Simulium ochraceum s.l. showed transmission interruption by 2007. These assessments, together with a 2010 serological survey in children 9-69 months of age that showed Ov16 IgG4 antibody prevalence to be < 0.1%, met World Health Organization (WHO) guidelines for stopping MDA, and treatment was halted after 2011. After 3 years, an entomological assessment showed no evidence of vector infection or recrudescence of transmission. In 2015, 100 years after the discovery of its presence, the Ministry of Health of Guatemala declared onchocerciasis as having been eliminated from the CEZ. |
Shattuck lecture: The future of public health
Frieden TR . N Engl J Med 2015 373 (18) 1748-54 The field of public health aims to improve the health of as many people as possible as rapidly as possible. Since 1900, the average life span in the United States has increased by more than 30 years; 25 years of this gain have been attributed to public health advances.1,2 Globally, life expectancy doubled during the 20th century,3 largely as a result of reductions in child mortality attributable to expanded immunization coverage, clean water, sanitation, and other child-survival programs.4 | Public health focuses on denominators — what proportion of all people who can benefit from an intervention actually benefit. Maximizing health requires contributions from many sectors of society, including broad social, economic, environmental, transportation, and other policies in which government plays key roles; involvement of civil society; innovation by the public and private sectors; and health care and public health action. Although there has sometimes been distrust and disrespect between the health care and public health fields,5 they are inevitably and increasingly interdependent; maximizing potential health gains is a defining challenge for both fields. |
A simple approach for sample size calculation for comparing two concordance correlation coefficients estimated on the same subjects
Lin HM , Williamson JM . J Biopharm Stat 2015 25 (6) 1145-60 Some studies are designed to assess the agreement between different raters and/or different instruments in the medical sciences and pharmaceutical research. In practice, the same sample will be used to compare the agreement of two or more assessment methods for simplicity and to take advantage of the positive correlation of the ratings. The concordance correlation coefficient (CCC) is often used as a measure of agreement when the rating is a continuous variable. We present an approach for calculating the sample size required for testing the equality of two CCCs, H0: CCC1 = CCC2 vs. HA: CCC1 ≠ CCC2, where two assessment methods are used on the same sample, with two raters resulting in correlated CCC estimates. Our approach is to simulate one large "exemplary" dataset based on the specification of the joint distribution of the pairwise ratings for the two methods. We then create two new random variables from the simulated data that have the same variance-covariance matrix as the two dependent CCC estimates using the Taylor series linearization method. The method requires minimal computing time and can be easily extended to comparing more than two CCCs, or Kappa statistics. |
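The first step of the approach described above can be sketched in a few lines: compute Lin's CCC from paired ratings and simulate one large "exemplary" dataset from a specified joint distribution. This is a minimal illustration only, not the authors' implementation; the bivariate-normal parameters and the function name `ccc` are assumptions chosen for the example, and the Taylor series linearization step is not shown.

```python
import numpy as np

def ccc(x, y):
    """Lin's concordance correlation coefficient for paired continuous ratings."""
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()            # population (biased) variances
    sxy = ((x - mx) * (y - my)).mean()   # covariance of the paired ratings
    return 2 * sxy / (vx + vy + (mx - my) ** 2)

# Simulate one large "exemplary" dataset of positively correlated ratings
# (illustrative joint distribution: bivariate normal with a small location shift).
rng = np.random.default_rng(0)
n = 100_000
mean = [0.0, 0.1]                        # slight systematic shift between raters
cov = [[1.0, 0.9], [0.9, 1.0]]           # strong positive correlation
rater1, rater2 = rng.multivariate_normal(mean, cov, size=n).T

estimate = ccc(rater1, rater2)           # close to 2(0.9)/(1 + 1 + 0.01) ≈ 0.896
```

With agreement between two method-specific CCC estimates quantified this way on the same simulated sample, the variance-covariance matrix of the dependent estimates can then be approximated and plugged into a standard two-sided sample-size formula.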
Freshwater harmful algal blooms and cyanotoxin poisoning in domestic dogs
Cherry C , Buttke D , Wong D . J Am Vet Med Assoc 2015 247 (9) 1004-1005 Freshwater harmful algal blooms (HABs) of cyanobacteria (blue-green algae) are occurring with increasing frequency and wider geographic distribution in lakes, ponds, and rivers worldwide. Factors influencing these increases include nutrient overenrichment, climate change, food web changes, and altered hydrology. The US Environmental Protection Agency recently reported that 17 states issued toxic algae and health advisories for 81 freshwater bodies during July 2015. Harmful algal blooms were detected for the first time at Lake Mead National Recreation Area in 2015, where staff were contacted by two persons who reported illnesses in pet dogs that swam in the lake. |
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Disease Reservoirs and Vectors
- Drug Safety
- Environmental Health
- Epidemiology and Surveillance
- Food Safety
- Genetics and Genomics
- Health Economics
- Healthcare Associated Infections
- Immunity and Immunization
- Injury and Violence
- Laboratory Sciences
- Maternal and Child Health
- Nutritional Sciences
- Occupational Safety and Health
- Occupational Safety and Health - Mining
- Parasitic Diseases
- Public Health, General
- Statistics as Topic
- Veterinary Medicine
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed: Feb 1, 2024
- Page last updated: Apr 22, 2024
- Powered by CDC PHGKB Infrastructure