Trends and characteristics of self-reported case presentation of diabetes diagnosis among youth from 2002 to 2010: findings from the SEARCH for Diabetes in Youth Study
Saydah SH , Imperatore G , Henkin L , D'Agostino R Jr , Divers J , Mayer-Davis EJ , Dabelea D , Klingensmith G , Pihoker C , Lawrence JM . Diabetes Care 2015 38 (6) e84-5 Diagnosis of diabetes in youth is increasing in the U.S. (1,2). It is not known how much of this change is due to an increase in diabetes and how much is due to improved case detection, especially for type 2 diabetes. Some researchers have hypothesized that part of the explanation for the increase in diabetes diagnosis in youth is increased screening, resulting in a higher percentage of cases being identified. The objective of this study was to assess whether the change in diabetes could be explained by changes in case identification by examining trends from 2002 to 2010 in self-reported case presentation of diabetes. | Briefly, there were 9,054 youth aged <20 years with newly diagnosed diabetes between 2002 and 2010 in the SEARCH for Diabetes in Youth study (3). Participants were asked, “How did you find out you had diabetes?” Responses were grouped into symptoms, checkup, community screening, or other. Self-reported case presentation patterns were examined in 3-year blocks to assess change over time, reported by diabetes type. We explored trends in self-reported modes of diabetes diagnosis (i.e., symptoms, checkup, screening, and other method) and reported results unadjusted and then adjusted for age-group, sex, and race/ethnicity. |
Neurobehavioral concerns among males with dystrophinopathy using population-based surveillance data from the Muscular Dystrophy Surveillance, Tracking, and Research Network
Caspers Conway K , Mathews KD , Paramsothy P , Oleszek J , Trout C , Zhang Y , Romitti PA . J Dev Behav Pediatr 2015 36 (6) 455-63 OBJECTIVE: To describe the occurrence of selected neurobehavioral concerns among males with a dystrophinopathy and to explore the associations with corticosteroid or supportive device use. METHODS: Medical record abstraction of neurobehavioral concerns was conducted for 857 affected males from 765 families, born since 1982, followed through 2011, and enrolled in the population-based Muscular Dystrophy Surveillance, Tracking, and Research Network. Cumulative probabilities for attention-deficit hyperactivity disorder (ADHD), behavior problems, and depressed mood were calculated from Kaplan-Meier estimates for the subsample of oldest affected males (n = 765). Hazard ratios (HRs) and 95% confidence intervals (95% CIs) for corticosteroid and supportive device use were estimated from Cox regression models with time-dependent covariates. RESULTS: Of the 857 affected males, 375 (44%) had at least 1 of the 3 selected neurobehavioral concerns; a similar percentage (45%) was found among the 765 oldest affected males. The estimated cumulative probabilities among these oldest affected males were 23% for ADHD, 43% for behavior problems, and 51% for depressed mood. Corticosteroid (HR = 2.35, 95% CI = 1.75-3.16) and mobility device (HR = 1.53, 95% CI = 1.06-2.21) use were associated with behavior problems. Use of a mobility device (HR = 3.53, 95% CI = 2.13-5.85), but not corticosteroids, was associated with depressed mood. ADHD was not significantly associated with corticosteroid or mobility device use. Respiratory assist device use was not examined due to low numbers of users before onset of neurobehavioral concerns. CONCLUSION: Selected neurobehavioral concerns were common among males with a dystrophinopathy. 
Reported associations highlight the importance of increased monitoring of neurobehavioral concerns as interventions are implemented and disease progresses. |
Outcomes after multivessel or culprit-vessel intervention for ST-elevation myocardial infarction in patients with multivessel coronary disease: a Bayesian cross-design meta-analysis
Bittl JA , Tamis-Holland JE , Lang CD , He Y . Catheter Cardiovasc Interv 2015 86 Suppl 1 S15-22 INTRODUCTION: During primary percutaneous coronary intervention (PCI), patients with ST-elevation myocardial infarction (STEMI) and multivessel coronary disease can undergo either multivessel intervention (MVI) or culprit-vessel intervention (CVI) only. BACKGROUND: Randomized controlled trials (RCTs) support the use of MVI, but cohort studies support the use of CVI. METHODS: We developed Bayesian models that incorporated parameters for study type and study outcome after MVI or CVI. RESULTS: A total of 18 studies (4 RCTs, 3 matched cohort studies, and 11 unmatched observational studies) enrolled 48,398 patients with STEMI and multivessel CAD and reported outcomes after MVI or CVI-only at the time of primary PCI. Using a Bayesian hierarchical model, we found that the point estimates replicated previously reported trends, but the wide Bayesian credible intervals (BCI) could not establish any plausible mortality difference between MVI and CVI in any of the three study types: RCTs (odds ratio [OR] 0.60, 95% BCI 0.31-1.20), matched cohort studies (OR 1.37, 95% BCI 0.86-2.24), or unmatched cohort studies (OR 1.16, 95% BCI 0.70-1.89). Both the global summary (OR 1.10, 95% BCI 0.74-1.51) and a sensitivity analysis that weighted the RCTs 1-5 times as much as observational studies revealed no credible advantage of one PCI strategy over the other (OR 1.05, 95% BCI 0.64-1.48). CONCLUSIONS: Bayesian approaches contextualize the comparison of different strategies by study type and suggest that neither MVI nor CVI emerges as a preferred strategy in an analysis that accounts for mortality differences. |
Patterns of sunscreen use on the face and other exposed skin among US adults
Holman DM , Berkowitz Z , Guy GP Jr , Hawkins NA , Saraiya M , Watson M . J Am Acad Dermatol 2015 73 (1) 83-92 e1 BACKGROUND: Sunscreen is a common form of sun protection, but little is known about patterns of use. OBJECTIVE: We sought to assess patterns of sunscreen use on the face and other exposed skin among US adults. METHODS: Using cross-sectional data from the 2013 Summer ConsumerStyles survey (N = 4033), we calculated descriptive statistics and adjusted risk ratios to identify characteristics associated with regular sunscreen use (always/most of the time when outside on a warm sunny day for ≥1 hour). RESULTS: Few adults regularly used sunscreen on the face (men: 18.1%, 95% confidence interval [CI] 15.8-20.6; women: 42.6%, 95% CI 39.5-46.7), other exposed skin (men: 19.9%, 95% CI 17.5-22.6; women: 34.4%, 95% CI 31.5-37.5), or both the face and other exposed skin (men: 14.3%, 95% CI 12.3-16.6; women: 29.9%, 95% CI 27.2-32.8). Regular use was associated with sun-sensitive skin, an annual household income ≥$60,000, and meeting aerobic activity guidelines (Ps < .05). Nearly 40% of users were unsure if their sunscreen provided broad-spectrum protection. LIMITATIONS: Reliance on self-report and lack of information on sunscreen reapplication or other sun-safety practices are limitations. CONCLUSION: Sunscreen use is low, especially among certain demographic groups. These findings can inform sun-safety interventions and the interpretation of surveillance data on sunscreen use. |
Pre-screening discussions and prostate-specific antigen testing for prostate cancer screening
Li J , Zhao G , Hall IJ . Am J Prev Med 2015 49 (2) 259-63 INTRODUCTION: For many men, the net benefit of prostate cancer screening with prostate-specific antigen (PSA) tests may be small. Many major medical organizations have issued recommendations for prostate cancer screening, stressing the need for shared decision making before ordering a test. The purpose of this study is to better understand associations between discussions about benefits and harms of PSA testing and uptake of the test among men aged ≥40 years. METHODS: Associations between pre-screening discussions and PSA testing were examined using self-reported data from the 2012 Behavioral Risk Factor Surveillance System. Unadjusted prevalence of PSA testing was estimated and AORs were calculated using logistic regression in 2014. RESULTS: The multivariate analysis showed that men who had ever discussed advantages of PSA testing only or discussed both advantages and disadvantages were more likely, respectively, to report having had a test within the past year than men who had no discussions (p<0.001). In addition, men who had only discussed the disadvantages of PSA testing with their healthcare providers were more likely (AOR=2.75, 95% CI=2.00, 3.79) to report getting tested than men who had no discussions. CONCLUSIONS: Discussions of the benefits or harms of PSA testing are positively associated with increased uptake of the test. Given the conflicting recommendations for prostate cancer screening and increasing importance of shared decision making, this study points to the need for understanding how pre-screening discussions are being conducted in clinical practice and the role played by patients' values and preferences in decisions about PSA testing. |
Evaluating diabetes health policies using natural experiments: the Natural Experiments for Translation in Diabetes study
Ackermann RT , Kenrik Duru O , Albu JB , Schmittdiel JA , Soumerai SB , Wharam JF , Ali MK , Mangione CM , Gregg EW . Am J Prev Med 2015 48 (6) 747-54 The high prevalence and costs of type 2 diabetes makes it a rapidly evolving focus of policy action. Health systems, employers, community organizations, and public agencies have increasingly looked to translate the benefits of promising research interventions into innovative polices intended to prevent or control diabetes. Though guided by research, these health policies provide no guarantee of effectiveness and may have opportunity costs or unintended consequences. Natural experiments use pragmatic and available data sources to compare specific policies to other policy alternatives or predictions of what would likely have happened in the absence of any intervention. The Natural Experiments for Translation in Diabetes (NEXT-D) Study is a network of academic, community, industry, and policy partners, collaborating to advance the methods and practice of natural experimental research, with a shared aim of identifying and prioritizing the best policies to prevent and control diabetes. This manuscript describes the NEXT-D Study group's multi-sector natural experiments in areas of diabetes prevention or control as case examples to illustrate the selection, design, analysis, and challenges inherent to natural experimental study approaches to inform development or evaluation of health policies. |
Independent and joint associations of race/ethnicity and educational attainment with sleep-related symptoms in a population-based US sample
Cunningham TJ , Ford ES , Chapman DP , Liu Y , Croft JB . Prev Med 2015 77 99-105 OBJECTIVE: Prior studies have documented disparities in short and long sleep duration, excessive daytime sleepiness, and insomnia by educational attainment and race/ethnicity separately. We examined both independent and interactive effects of these factors with a broader range of sleep indicators in a racially/ethnically diverse sample. METHODS: We analyzed 2012 National Health Interview Survey data from 33,865 adults aged ≥ 18 years. Sleep-related symptomatology included short sleep duration (≤6 hours), long sleep duration (≥9 hours), fatigue > 3 days, excessive daytime sleepiness, and insomnia. Bivariate analyses with chi-square tests and log-linear regression were performed. RESULTS: The overall age-adjusted prevalence was 29.1% for short sleep duration, 8.5% for long sleep duration, 15.1% for fatigue, 12.6% for excessive daytime sleepiness, and 18.8% for insomnia. Educational attainment and race/ethnicity were independently related to the five sleep-related symptoms. Among Whites, the likelihood of most sleep indicators increased as educational attainment decreased; relationships varied for the other racial/ethnic groups. For short sleep duration, the educational attainment-by-race/ethnicity interaction effect was significant for African Americans (p<0.0001), Hispanics (p<0.0001), and Asians (p=0.0233) compared to Whites. For long sleep duration, the interaction was significant for Hispanics only (p=0.0003). CONCLUSIONS: Our results demonstrate the importance of examining both educational attainment and race/ethnicity simultaneously to more fully understand disparities in sleep health. Increased understanding of the mechanisms linking sociodemographic factors to sleep health is needed to determine whether policies and programs to increase educational attainment may also reduce these disparities within an increasingly diverse population. |
Body mass index, respiratory conditions, asthma, and chronic obstructive pulmonary disease
Liu Y , Pleasants RA , Croft JB , Lugogo N , Ohar J , Heidari K , Strange C , Wheaton AG , Mannino DM , Kraft M . Respir Med 2015 109 (7) 851-9 BACKGROUND: This study aims to assess the relationship of body mass index (BMI) status with respiratory conditions, asthma, and chronic obstructive pulmonary disease (COPD) in a state population. METHODS: Self-reported data from 11,868 adults aged ≥18 years in the 2012 South Carolina Behavioral Risk Factor Surveillance System telephone survey were analyzed using multivariable logistic regression that accounted for the complex sampling design and adjusted for sex, age, race/ethnicity, education, smoking status, physical inactivity, and cancer history. RESULTS: The distribution of BMI (kg/m2) was 1.5% for underweight (<18.5), 32.3% for normal weight (18.5-24.9), 34.6% for overweight (25.0-29.9), 26.5% for obese (30.0-39.9), and 5.1% for morbidly obese (≥40.0). Among respondents, 10.0% had frequent productive cough, 4.3% had frequent shortness of breath (SOB), 7.3% strongly agreed that SOB affected physical activity, 8.4% had current asthma, and 7.4% had COPD. Adults at extremes of body weight were more likely to report having asthma or COPD, and to report respiratory conditions. After controlling for the covariates, the age-adjusted U-shaped relationships of BMI with current asthma and with strongly agreeing that SOB affected physical activity persisted (p < 0.001); the relationship with COPD was not U-shaped. Morbidly obese but not underweight or obese respondents were significantly more likely to have frequent productive cough and frequent SOB than normal weight adults after adjustment. CONCLUSION: Our data confirm that both underweight and obesity are associated with current asthma and obesity with COPD. Increased emphasis on exercise and nutrition may improve respiratory conditions. |
Risk factors for acquisition of drug resistance during multidrug-resistant tuberculosis treatment, Arkhangelsk Oblast, Russia, 2005-2010
Smith SE , Ershova J , Vlasova N , Nikishova E , Tarasova I , Eliseev P , Maryandyshev AO , Shemyakin IG , Kurbatova E , Cegielski JP . Emerg Infect Dis 2015 21 (6) 1002-11 Acquired resistance to antituberculosis drugs decreases effective treatment options and the likelihood of treatment success. We identified risk factors for acquisition of drug resistance during treatment for multidrug-resistant tuberculosis (MDR TB) and evaluated the effect on treatment outcomes. Data were collected prospectively from adults from Arkhangelsk Oblast, Russia, who had pulmonary MDR TB during 2005-2008. Acquisition of resistance to capreomycin and of extensively drug-resistant TB were more likely among patients who received <3 effective drugs than among patients who received ≥3 effective drugs (9.4% vs. 0% and 8.6% vs. 0.8%, respectively). Poor outcomes were more likely among patients with acquired capreomycin resistance (100% vs. 25.9%), acquired ofloxacin resistance (83.6% vs. 22.7%), or acquired extensive drug resistance (100% vs. 24.4%). To prevent acquired drug resistance and poor outcomes, baseline susceptibility to first- and second-line drugs should be determined quickly, and treatment should be adjusted to contain ≥3 effective drugs. |
Salmonella enterica Paratyphi A infections in travelers returning from Cambodia, United States
Judd MC , Grass JE , Mintz ED , Bicknese A , Mahon BE . Emerg Infect Dis 2015 21 (6) 1089-91 Health authorities from Cambodia and European Union member states recently described a pronounced increase in Salmonella enterica serotype Paratyphi A infections in Cambodia resulting from an ongoing outbreak (1,2). To further characterize this outbreak, we analyzed 2013–2014 data on Paratyphi A infections associated with travel to Southeast Asia that were reported to the Centers for Disease Control and Prevention (CDC) National Typhoid and Paratyphoid Fever Surveillance (NTPFS) system and the CDC National Antimicrobial Monitoring System (NARMS). | NTPFS began tracking Salmonella Paratyphi A infections in 2008. During 2008–2012, ten cases were reported in patients who had traveled to Southeast Asia within 30 days before illness onset; only 1, who also reported travel to Sri Lanka, Nepal, and Nigeria, reported travel to Cambodia. During January 1, 2013–August 22, 2014, however, NTPFS received 19 reports of laboratory-confirmed Paratyphi A infection in travelers returning from Southeast Asia; 13 traveled to Cambodia, and 8 of them reported travel only to Cambodia (Table). Of the 7 patients who traveled only to Cambodia and reported reason for travel, all cited “visiting friends and relatives.” Six (75%) of the 8 patients who traveled only to Cambodia were hospitalized (median duration 7 days, range 2–10 days), and all recovered. Cases occurring in 2014, especially later in the year, might not yet have been reported, so the 2014 data most likely are an underestimate. Although many cases reported to health authorities in Cambodia and the European Union clustered in the Phnom Penh region (1,2), we lack information about destinations within Cambodia for US patients. |
Notes from the field: outbreak of skin lesions among high school wrestlers - Arizona, 2014
Williams C , Wells J , Klein R , Sylvester T , Sunenshine R . MMWR Morb Mortal Wkly Rep 2015 64 (20) 559-560 Skin infections are a common problem among athletes at all levels of competition; among wrestlers, 8.5% of all adverse events are caused by skin infections. Wrestlers are at risk because of the constant skin-to-skin contact required during practice and competition. The most common infections transmitted among high school wrestlers include fungal infections (e.g., ringworm), the viral infection herpes gladiatorum caused by herpes simplex virus-1 (HSV-1), and bacterial infections (e.g., impetigo) caused by Staphylococcus or Streptococcus species, including methicillin-resistant Staphylococcus aureus (MRSA). On February 7, 2014, the Maricopa County Department of Public Health was notified of multiple wrestlers who reported skin lesions 2 weeks after participating in a wrestling tournament at school A. The tournament was held on January 24-25 and included 168 wrestlers representing 24 schools. The county health department initiated an investigation to identify cases of skin lesions, determine lesion etiology, identify risks associated with lesion development, and provide guidance for preventing additional cases. |
On the relative role of different age groups in influenza epidemics
Worby CJ , Chaves SS , Wallinga J , Lipsitch M , Finelli L , Goldstein E . Epidemics 2015 13 10-16 The identification of key "driver" groups in influenza epidemics is of much interest for the implementation of effective public health response strategies, including vaccination programs. However, the relative importance of different age groups in propagating epidemics is uncertain. During a communicable disease outbreak, some groups may be disproportionately represented during the outbreak's ascent due to increased susceptibility and/or contact rates. Such groups or subpopulations can be identified by considering the proportion of cases within the subpopulation occurring before (Bp) and after the epidemic peak (Ap) to calculate the subpopulation's relative risk, RR=Bp/Ap. We estimated RR for several subpopulations (age groups) using data on laboratory-confirmed US influenza hospitalizations during epidemics between 2009 and 2014. Additionally, we simulated various influenza outbreaks in an age-stratified population, relating the RR to the impact of vaccination in each subpopulation on the epidemic's initial effective reproductive number Re(0). We found that children aged 5-17 had the highest estimates of RR during the five largest influenza A outbreaks, though the relative magnitude of RR in this age group compared to other age groups varied, being highest for the 2009 A/H1N1 pandemic. For the 2010-2011 and 2012-2013 influenza B epidemics, adults aged 18-49, and 0-4 year-olds had the highest estimates of RR, respectively. For 83% of simulated epidemics, the group with the highest RR was also the group for which initial distribution of a given quantity of vaccine would result in the largest reduction of Re(0). In the largest 40% of simulated outbreaks, the group with the highest RR and the largest vaccination impact was children 5-17. 
While the relative importance of different age groups in propagating influenza outbreaks varies, children aged 5-17 play the leading role during the largest influenza A epidemics. Extra vaccination efforts for this group may contribute to reducing the epidemic's impact in the whole community. |
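The before/after-peak relative risk defined above reduces to simple arithmetic on case counts. A minimal Python sketch of the calculation (the function name and case counts are illustrative assumptions, not the study's surveillance data):

```python
def relative_risk(cases_before_peak, cases_after_peak):
    """RR = Bp / Ap, where Bp and Ap are the proportions of a subpopulation's
    cases occurring before and after the epidemic peak, respectively."""
    total = cases_before_peak + cases_after_peak
    bp = cases_before_peak / total  # proportion of the group's cases before the peak
    ap = cases_after_peak / total   # proportion of the group's cases after the peak
    return bp / ap

# Hypothetical age group: 120 hospitalizations before the peak, 80 after.
print(round(relative_risk(120, 80), 3))  # 1.5 -> overrepresented during the ascent
```

Note that within a single group the totals cancel, so RR is simply the ratio of before-peak to after-peak case counts; values above 1 flag groups disproportionately represented early in the outbreak.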
Evaluating HIV prevention programs: herpes simplex virus type 2 antibodies as biomarker for sexual risk behavior in young adults in resource-poor countries
Behling J , Chan AK , Zeh C , Nekesa C , Heinzerling L . PLoS One 2015 10 (5) e0128370 BACKGROUND: Measuring effectiveness of HIV prevention interventions is challenged by bias when using self-reported knowledge, attitude or behavior change. HIV incidence is an objective marker to measure effectiveness of HIV prevention interventions, however, because new infection rates are relatively low, prevention studies require large sample sizes. Herpes simplex virus type 2 (HSV-2) is similarly transmitted and more prevalent and could thus serve as a proxy marker for sexual risk behavior and therefore HIV infection. METHODS: HSV-2 antibodies were assessed in a sub-study of 70,000 students participating in an education intervention in Western Province, Kenya. Feasibility of testing for HSV-2 antibodies was assessed comparing two methods using Fisher's exact test. Three hundred and ninety four students (aged 18 to 22 years) were randomly chosen from the cohort and tested for HIV, Chlamydia trachomatis, Neisseria gonorrhoeae, and Trichomonas vaginalis. Out of these, 139 students were tested for HSV-2 with ELISA and surveyed for sexual risk behavior and 89 students were additionally tested for HSV-2 with a point-of-care (POC) test. RESULTS: Prevalence rates were 0.5%, 1.8%, 0.3% and 2.3% for HIV, Chlamydia trachomatis, Neisseria gonorrhoeae, and Trichomonas vaginalis, respectively. Prevalence of HSV-2 antibodies was 3.4% as measured by POC test (n=89) and 14.4% by ELISA (n=139). Specificity of the POC test compared with ELISA was 100%, and the sensitivity only 23.1%. Associations between self-reported sexual behavior and HSV-2 serostatus could not be shown. CONCLUSIONS: Associations between self-reported sexual risk behavior and HSV-2 serostatus could not be shown, probably because of social desirability bias in interviews, given that HSV-2 transmission is clearly linked to sexual behavior. 
HSV-2 antibody testing is feasible in resource-poor settings and shows higher prevalence rates than other sexually transmitted diseases thus representing a potential biomarker for evaluation of HIV prevention interventions. |
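The POC-versus-ELISA comparison above is a standard sensitivity/specificity calculation against a reference test. A minimal Python sketch; the 2x2 counts below are assumptions chosen to be consistent with the reported 23.1% sensitivity, 100% specificity, and 89 dually tested students, not the study's raw data:

```python
def sensitivity(true_pos, false_neg):
    """Proportion of reference-positive (ELISA+) samples detected by the POC test."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Proportion of reference-negative (ELISA-) samples correctly negative on POC."""
    return true_neg / (true_neg + false_pos)

# Illustrative counts: 13 ELISA-positive (3 POC-positive, 10 POC-negative)
# and 76 ELISA-negative (all POC-negative), totaling 89 students.
print(round(sensitivity(3, 10), 3))   # 0.231
print(round(specificity(76, 0), 3))   # 1.0
```

A perfect specificity with low sensitivity, as here, means POC positives can be trusted but POC negatives cannot rule out infection, which is why the ELISA prevalence estimate is the higher and more credible one.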
Histoplasmosis in Idaho and Montana, USA, 2012-2013
Nett RJ , Skillman D , Riek L , Davis B , Blue SR , Sundberg EE , Merriman JR , Hahn CG , Park BJ . Emerg Infect Dis 2015 21 (6) 1071-2 Histoplasmosis occurs after infection with the dimorphic fungus Histoplasma capsulatum (1–6). Patients become ill after they inhale soil contaminated with H. capsulatum (1,2). Most infections are asymptomatic or result in mild illness not determined to be histoplasmosis (1,2). Symptoms usually develop 3–14 days after exposure and range from self-limited pneumonia to severe disseminated disease requiring antifungal therapy (2,7). | In the United States, H. capsulatum is endemic to the Mississippi and Ohio River Valleys (1,2,5,8) but is not known to be endemic to the Rocky Mountain region (8). During June 2012–November 2013, a total of 6 unrelated cases of histoplasmosis were reported in Idaho (n = 1) and Montana (n = 5) in patients who had no recent travel to recognized H. capsulatum–endemic regions. Public health authorities investigated the illnesses by reviewing medical records and collecting exposure and travel histories. | The median age of the patients (3 male, 3 female) was 68 (range 17–79) years (Table). Each case was diagnosed by a different physician; no known epidemiologic links existed among the patients. Five patients had ≥1 immunocompromising conditions (Table), and 2 had acute pneumonia; 1 each had left parotid gland enlargement, anterior cervical lymphadenopathy, tricuspid valve mass, and acute changes in mental status. Three patients were hospitalized: 2 required intensive care, and 1 died. |
Acute rheumatic fever and rheumatic heart disease among children - American Samoa, 2011-2012
Beaudoin A , Edison L , Introcaso CE , Goh L , Marrone J , Mejia A , Beneden CV . MMWR Morb Mortal Wkly Rep 2015 64 (20) 555-558 Acute rheumatic fever is a nonsuppurative, immune-mediated consequence of group A streptococcal pharyngitis (strep throat). Recurrent or severe acute rheumatic fever can cause permanent cardiac valve damage and rheumatic heart disease, which increases the risk for cardiac conditions (e.g., infective endocarditis, stroke, and congestive heart failure). Antibiotics can prevent acute rheumatic fever if administered no more than 9 days after symptom onset. Long-term benzathine penicillin G (BPG) injections are effective in preventing recurrent acute rheumatic fever attacks and are recommended to be administered every 3-4 weeks for 10 years or until age 21 years to children who receive a diagnosis of acute rheumatic fever. During August 2013, in response to anecdotal reports of increasing rates of acute rheumatic fever and rheumatic heart disease, CDC collaborated with the American Samoa Department of Health and the Lyndon B. Johnson Tropical Medical Center (the only hospital in American Samoa) to quantify the number of cases of pediatric acute rheumatic fever and rheumatic heart disease in American Samoa and to assess the potential roles of missed pharyngitis diagnosis, lack of timely prophylaxis prescription, and compliance with prescribed BPG prophylaxis. Using data from medical records, acute rheumatic fever incidence was calculated as 1.1 and 1.5 cases per 1,000 children aged ≤18 years in 2011 and 2012, respectively; 49% of those with acute rheumatic fever subsequently received a diagnosis of rheumatic heart disease. Noncompliance with recommended prophylaxis with BPG after physician-diagnosed acute rheumatic fever was noted for 22 (34%) of 65 patients. Rheumatic heart disease point prevalence was 3.2 cases per 1,000 children in August 2013. 
Establishment of a coordinated acute rheumatic fever and rheumatic heart disease control program in American Samoa likely would improve diagnosis, treatment, and patient compliance with BPG prophylaxis. |
Additional drug resistance of multidrug-resistant tuberculosis in patients in 9 countries
Kurbatova EV , Dalton T , Ershova J , Tupasi T , Caoili JC , Van Der Walt M , Kvasnovsky C , Yagui M , Bayona J , Contreras C , Leimane V , Via LE , Kim H , Akksilp S , Kazennyy BY , Volchenkov GV , Jou R , Kliiman K , Demikhova OV , Cegielski JP . Emerg Infect Dis 2015 21 (6) 977-83 Data from a large multicenter observational study of patients with multidrug-resistant tuberculosis (MDR TB) were analyzed to simulate the possible use of 2 new approaches to treatment of MDR TB: a short (9-month) regimen and a bedaquiline-containing regimen. Of 1,254 patients, 952 (75.9%) had no resistance to fluoroquinolones and second-line injectable drugs and thus would qualify as candidates for the 9-month regimen; 302 (24.1%) patients with resistance to a fluoroquinolone or second-line injectable drug would qualify as candidates for a bedaquiline-containing regimen in accordance with published guidelines. Among candidates for the 9-month regimen, standardized drug-susceptibility tests demonstrated susceptibility to a median of 5 (interquartile range 5-6) drugs. Among candidates for bedaquiline, drug-susceptibility tests demonstrated susceptibility to a median of 3 (interquartile range 2-4) drugs; 26% retained susceptibility to ≤2 drugs. These data may assist national TB programs in planning to implement new drugs and drug regimens. |
Coccidioides exposure and coccidioidomycosis among prison employees, California, United States
de Perio MA , Niemeier RT , Burr GA . Emerg Infect Dis 2015 21 (6) 1031-3 Responding to a request by corrections agency management, we investigated coccidioidomycosis in prison employees in central California, a coccidioidomycosis-endemic area. We identified 103 cases of coccidioidomycosis that occurred over 4.5 years. As a result, we recommended training and other steps to reduce dust exposure among employees and thus potential exposure to Coccidioides. |
Prevalence and transmission of Trypanosoma cruzi in people of rural communities of the high jungle of northern Peru
Alroy KA , Huang C , Gilman RH , Quispe-Machaca VR , Marks MA , Ancca-Juarez J , Hillyard M , Verastegui M , Sanchez G , Cabrera L , Vidal E , Billig EM , Cama VA , Naquira C , Bern C , Levy MZ . PLoS Negl Trop Dis 2015 9 (5) e0003779 BACKGROUND: Vector-borne transmission of Trypanosoma cruzi is seen exclusively in the Americas where an estimated 8 million people are infected with the parasite. Significant research in southern Peru has been conducted to understand T. cruzi infection and vector control; however, much less is known about the burden of infection and epidemiology in northern Peru. METHODOLOGY: A cross-sectional study was conducted to estimate the seroprevalence of T. cruzi infection in humans (n=611) and domestic animals [dogs (n=106) and guinea pigs (n=206)] in communities of Cutervo Province, Peru. Sampling and diagnostic strategies differed according to species. An entomological household study (n=208) was conducted to identify the triatomine burden and species composition, as well as the prevalence of T. cruzi in vectors. Electrocardiograms (EKG) were performed on a subset of participants (n=90 T. cruzi infected participants and 170 age and sex-matched controls). The seroprevalence of T. cruzi among humans, dogs, and guinea pigs was 14.9% (95% CI: 12.2 - 18.0%), 19.8% (95% CI: 12.7- 28.7%) and 3.3% (95% CI: 1.4 - 6.9%) respectively. In one community, the prevalence of T. cruzi infection was 17.2% (95% CI: 9.6 - 24.7%) among participants < 15 years, suggesting recent transmission. Increasing age, positive triatomines in a participant's house, and ownership of a T. cruzi positive guinea pig were independent correlates of T. cruzi infection. Only one species of triatomine was found, Panstrongylus lignarius, formerly P. herreri. Approximately forty percent (39.9%, 95% CI: 33.2 - 46.9%) of surveyed households were infested with this vector and 14.9% (95% CI: 10.4 - 20.5%) had at least one triatomine positive for T. cruzi. 
The cardiac abnormality of right bundle branch block was rare but was identified only in seropositive individuals. CONCLUSIONS: Our research documents a substantial prevalence of T. cruzi infection in Cutervo and highlights a need for greater attention and vector control efforts in northern Peru. |
Introduction of monkeypox into a community and household: risk factors and zoonotic reservoirs in the Democratic Republic of the Congo
Nolen LD , Osadebe L , Katomba J , Likofata J , Mukadi D , Monroe B , Doty J , Kalemba L , Malekani J , Kabamba J , Bomponda PL , Lokota JI , Balilo MP , Likafi T , Lushima RS , Tamfum JJ , Okitolonda EW , McCollum AM , Reynolds MG . Am J Trop Med Hyg 2015 93 (2) 410-5 An increased incidence of monkeypox (MPX) infections in the Democratic Republic of the Congo was noted by the regional surveillance system in October 2013. Little information exists regarding how MPX is introduced into the community and the factors associated with transmission within the household. Sixty-eight wild animals were collected and tested for Orthopoxvirus. Two of three rope squirrels (Funisciurus sp.) were positive for antibodies to Orthopoxviruses; however, no increased risk was associated with the consumption or preparation of rope squirrels. A retrospective cohort investigation and a case-control investigation were performed to identify risk factors affecting the introduction of monkeypox virus (MPXV) into the community and transmission within the home. School-age males were the individuals most frequently identified as the first person infected in the household and were the group most frequently affected overall. Risk factors for acquiring MPXV in a household included sleeping in the same room or bed, or using the same plate or cup, as the primary case. There was no significant risk associated with eating or processing of wild animals. All of the activities associated with an increased risk of MPXV transmission carried the potential for mucosal exposure to the virus. |
U.S. compounding pharmacy-related outbreaks, 2001-2013: public health and patient safety lessons learned
Shehab N , Brown MN , Kallen AJ , Perz JF . J Patient Saf 2015 14 (3) 164-173 OBJECTIVES: Pharmacy-compounded sterile preparations (P-CSPs) are frequently relied upon in U.S. health care but are increasingly being linked to outbreaks of infections. We provide an updated overview of outbreak burden and characteristics, identify drivers of P-CSP demand, and discuss public health and patient safety lessons learned to help inform prevention. METHODS: Outbreaks of infections linked to contaminated P-CSPs that occurred between January 1, 2001, and December 31, 2013, were identified from internal Centers for Disease Control and Prevention reports, Food and Drug Administration drug safety communications, and published literature. RESULTS: We identified 19 outbreaks linked to P-CSPs, resulting in at least 1,000 cases, including deaths. Outbreaks were reported across two-thirds of states, with almost one-half (8/19) involving cases in more than one state. Almost one-half of outbreaks were linked to injectable steroids (5/19) and intraocular bevacizumab (3/19). Non-patient-specific compounding originating from nonsterile ingredients and repackaging of already sterile products were the most common practices associated with P-CSP contamination. Breaches in aseptic processing and deficiencies in sterilization procedures or in sterility/endotoxin testing were consistent findings. Hospital outsourcing, preference for variations of commercially available products, commercial drug shortages, and lower prices were drivers of P-CSP demand. CONCLUSIONS: Recognized outbreaks linked to P-CSPs have been most commonly associated with non-patient-specific repackaging and nonsterile to sterile compounding and linked to lack of adherence to sterile compounding standards. Recently enhanced regulatory oversight of compounding may improve adherence to such standards. 
Additional measures to limit and control these outbreaks include vigilance when outsourcing P-CSPs, scrutiny of drivers for P-CSP demand, as well as early recognition and notification of possible outbreaks. |
Quality of artemisinin-based combination formulations for malaria treatment: prevalence and risk factors for poor quality medicines in public facilities and private sector drug outlets in Enugu, Nigeria
Kaur H , Allan EL , Mamadu I , Hall Z , Ibe O , El Sherbiny M , Wyk AV , Yeung S , Swamidoss I , Green MD , Dwivedi P , Culzoni MJ , Clarke S , Schellenberg D , Fernandez FM , Onwujekwe O . PLoS One 2015 10 (5) e0125577 BACKGROUND: Artemisinin-based combination therapies are recommended by the World Health Organisation (WHO) as first-line treatment for Plasmodium falciparum malaria, yet medication must be of good quality for efficacious treatment. A recent meta-analysis reported that 35% (796/2,296) of antimalarial drug samples from 21 Sub-Saharan African countries, purchased from outlets predominantly using convenience sampling, failed chemical content analysis. We used three sampling strategies to purchase artemisinin-containing antimalarials (ACAs) in Enugu metropolis, Nigeria, and compared the resulting quality estimates. METHODS: ACAs were purchased using three sampling approaches - convenience, mystery clients and overt - within a defined area and sampling frame in Enugu metropolis. The active pharmaceutical ingredient (API) content was assessed using high-performance liquid chromatography and confirmed by mass spectrometry at three independent laboratories. Results were expressed as a percentage of the API content stated on the packaging and used to categorise each sample as acceptable quality, substandard, degraded, or falsified. RESULTS: Content analysis of 3,024 samples purchased from 421 outlets using convenience (n=200), mystery (n=1,919) and overt (n=905) approaches showed that, overall, 90.8% of ACAs were of acceptable quality, 6.8% substandard, 1.3% degraded and 1.2% falsified. Convenience sampling yielded a significantly higher prevalence of poor-quality ACAs, a difference not evident with the mystery and overt sampling strategies, which yielded comparable results. Artesunate (n=135; 4 falsified) and dihydroartemisinin (n=14) monotherapy tablets, not recommended by WHO, were also identified. 
CONCLUSION: Randomised sampling identified fewer falsified ACAs than previously reported by convenience approaches. Our findings emphasise the need for specific consideration to be given to sampling frame and sampling approach if representative information on drug quality is to be obtained. |
Estimated deaths and illnesses averted during fungal meningitis outbreak associated with contaminated steroid injections, United States, 2012-2013
Smith RM , Derado G , Wise M , Harris JR , Chiller T , Meltzer MI , Park BJ . Emerg Infect Dis 2015 21 (6) 933-40 During 2012-2013, the US Centers for Disease Control and Prevention and partners responded to a multistate outbreak of fungal infections linked to methylprednisolone acetate (MPA) injections produced by a compounding pharmacy. We evaluated the effects of public health actions on the scope of this outbreak, estimating that 3,150 MPA injections, 153 cases of meningitis or stroke, and 124 deaths were averted. We also compared 60-day case-fatality rates and clinical characteristics of patients given a diagnosis on or before October 4, the date the outbreak was widely publicized, with those of patients given a diagnosis after October 4. Compared with diagnosis after October 4, diagnosis on or before October 4 was significantly associated with a higher 60-day case-fatality rate (28% vs. 5%; p<0.0001). Aggressive public health action resulted in a substantially reduced estimated number of persons affected by this outbreak and improved survival of affected patients. |
Ozone, fine particulate matter, and chronic lower respiratory disease mortality in the United States
Hao Y , Balluz L , Strosnider H , Wen XJ , Li C , Qualters JR . Am J Respir Crit Care Med 2015 192 (3) 337-41 RATIONALE: Short-term effects of air pollution exposure on respiratory disease mortality are well established. However, few studies have examined the effects of long-term exposure and, among those that have, results are inconsistent. OBJECTIVE: To evaluate the long-term association between ambient ozone and fine particulate matter (PM2.5, particles with aerodynamic diameter of 2.5 micrometers or less) and chronic lower respiratory disease (CLRD) mortality in the contiguous United States. METHODS: We fit Bayesian hierarchical spatial Poisson models, adjusting for five county-level covariates (percent adults aged ≥65 years, poverty, lifetime smoking, obesity, and temperature), with random effects at state and county levels to account for spatial heterogeneity and spatial dependence. MEASUREMENTS AND MAIN RESULTS: We derived county-level average daily exposure levels for ambient ozone and PM2.5 for 2001-2008 from the U.S. Environmental Protection Agency's down-scaled estimates and obtained 2007-2008 CLRD deaths from the National Center for Health Statistics. Exposure to ambient ozone was associated with an increased rate of CLRD deaths, with a rate ratio of 1.05 (95% credible interval, 1.01-1.09) per 5-ppb increase in ozone; the association between ambient PM2.5 and CLRD mortality was positive but not statistically significant (rate ratio 1.068; 95% credible interval, 0.995-1.146). CONCLUSIONS: This is the first national study to link air pollution exposure data with CLRD mortality for 3,109 contiguous U.S. counties. Ambient ozone may be associated with an increased rate of death from CLRD in the contiguous United States. |
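A per-5-ppb rate ratio like the one above is obtained by scaling the fitted log-rate coefficient of the Poisson model. A minimal sketch of that conversion, using a hypothetical per-ppb coefficient chosen only to illustrate the arithmetic (the abstract does not report the raw coefficient):

```python
import math

# Hypothetical per-ppb log-rate coefficient from a fitted Poisson model
# (illustrative value only; not reported in the abstract).
beta_per_ppb = 0.00976

# Rate ratio for a 5-ppb increase in ozone: RR = exp(beta * delta).
rr_per_5ppb = math.exp(beta_per_ppb * 5)
print(round(rr_per_5ppb, 2))  # rounds to 1.05
```

The same exponentiation applies to the credible-interval endpoints, which is why intervals for rate ratios are asymmetric around the point estimate.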
Investigating fungal outbreaks in the 21st century.
Litvintseva AP , Brandt ME , Mody RK , Lockhart SR . PLoS Pathog 2015 11 (5) e1004804 Public attention has been drawn to recent high-profile outbreaks of mycotic diseases, such as those of fungal meningitis and other infections linked to contaminated steroids [1] and an outbreak of necrotizing cutaneous mucormycosis linked to a tornado [2]. However, fungal outbreaks are more common than most people appreciate, and reports of outbreaks caused by unusual fungal pathogens are increasing. The Mycotic Diseases Branch at the Centers for Disease Control and Prevention (CDC) investigates 3–6 fungal outbreaks per year, many of which are caused by rare fungi with limited diagnostic and treatment options. This is a considerable increase from the 1990s, when the Branch investigated 1–2 hospital-based outbreaks per year, generally caused by yeast and traced to a single source. Although the exact reasons for this increase are unknown, the increasing number of patients with impaired immune systems may have contributed to this trend. | The majority of fungal outbreaks can be attributed to either environmental exposure or a contaminated product (Table 1). For example, two recent outbreaks were linked to contaminated medications. In 2012, two medications produced by a single compounding pharmacy in Florida were contaminated with Fusarium sp. and Bipolaris sp., respectively, shipped to 15 states, and injected into the eyes of patients undergoing vitrectomies. As a result, 47 patients developed endophthalmitis, and most lost vision [3]. In the 2012 fungal meningitis outbreak, methylprednisolone acetate (MPA) contaminated with Exserohilum rostratum and several other microorganisms was shipped to 23 states, potentially exposing nearly 14,000 individuals to this contaminated medication. As a result, 752 people developed meningitis, arachnoiditis, or spinal/paraspinal abscesses, and 64 patients died, making this the deadliest fungal outbreak to date. |
Haemagglutinin mutations and glycosylation changes shaped the 2012/13 influenza A(H3N2) epidemic, Houston, Texas.
Stucker KM , Schobel SA , Olsen RJ , Hodges HL , Lin X , Halpin RA , Fedorova N , Stockwell TB , Tovchigrechko A , Das SR , Wentworth DE , Musser JM . Euro Surveill 2015 20 (18) While the early start and higher intensity of the 2012/13 influenza A virus (IAV) epidemic was not unprecedented, it was the first IAV epidemic season since the 2009 H1N1 influenza pandemic where the H3N2 subtype predominated. We directly sequenced the genomes of 154 H3N2 clinical specimens collected throughout the epidemic to better understand the evolution of H3N2 strains and to inform the H3N2 vaccine selection process. Phylogenetic analyses indicated that multiple co-circulating clades and continual antigenic drift in the haemagglutinin (HA) of clades 5, 3A, and 3C, with the evolution of a new 3C subgroup (3C-2012/13), were the driving causes of the epidemic. Drift variants contained HA substitutions and alterations in the potential N-linked glycosylation sites of HA. Antigenic analysis demonstrated that viruses in the emerging subclade 3C.3 and subgroup 3C-2012/13 were not well inhibited by antisera generated against the 3C.1 vaccine strains used for the 2012/13 (A/Victoria/361/2011) or 2013/14 (A/Texas/50/2012) seasons. Our data support updating the H3N2 vaccine strain to a clade 3C.2 or 3C.3-like strain or a subclade that has drifted further. They also underscore the challenges in vaccine strain selection, particularly regarding HA and neuraminidase substitutions derived during laboratory passage that may alter antigenic testing accuracy. |
Further Confirmation of Germline Glioma Risk Variant rs78378222 in TP53 and Its Implication in Tumor Tissues via Integrative Analysis of TCGA Data.
Wang Z , Rajaraman P , Melin BS , Chung CC , Zhang W , McKean-Cowdin R , Michaud D , Yeager M , Ahlbom A , Albanes D , Andersson U , Freeman LE , Buring JE , Butler MA , Carreon T , Feychting M , Gapstur SM , Gaziano JM , Giles GG , Hallmans G , Henriksson R , Hoffman-Bolton J , Inskip PD , Kitahara CM , Marchand LL , Linet MS , Li S , Peters U , Purdue MP , Rothman N , Ruder AM , Sesso HD , Severi G , Stampfer M , Stevens VL , Visvanathan K , Wang SS , White E , Zeleniuch-Jacquotte A , Hoover R , Fraumeni JF , Chatterjee N , Hartge P , Chanock SJ . Hum Mutat 2015 36 (7) 684-8 We confirmed a strong association of rs78378222:A>C (per allele odds ratio [OR] = 3.14; P = 6.48 x 10(-11)), a rare germline single-nucleotide polymorphism (SNP) in TP53, via imputation of a genome-wide association study of glioma (1,856 cases and 4,955 controls). We subsequently performed integrative analyses on the Cancer Genome Atlas (TCGA) data for GBM (glioblastoma multiforme) and LUAD (lung adenocarcinoma). Based on SNP data, we imputed genotypes for rs78378222 and selected individuals carrying the rare risk allele (C). Using RNA sequencing data, we observed aberrant transcripts approximately 3 kb longer than normal for those individuals. Using exome sequencing data, we further showed that loss of the haplotype carrying the common protective allele (A) occurred somatically in GBM but not in LUAD. Our bioinformatic analysis suggests that the rare risk allele (C) disrupts mRNA termination and that an allelic loss of the genomic region harboring the common protective allele (A) occurs during tumor initiation or progression in glioma. |
Molecular and morphologic data reveal multiple species in Peromyscus pectoralis
Bradley RD , Schmidly DJ , Amman BR , Platt RN , Neumann KM , Huynh HM , Muñiz-Martínez R , López-González C , Ordóñez-Garza N . J Mammal 2015 96 (2) 446-459 DNA sequence and morphometric data were used to re-evaluate the taxonomy and systematics of Peromyscus pectoralis. Phylogenetic analyses (maximum likelihood and Bayesian inference) of DNA sequences from the mitochondrial cytochrome-b gene in 44 samples of P. pectoralis indicated 2 well-supported monophyletic clades. The 1st clade contained specimens from Texas historically assigned to P. p. laceianus; the 2nd comprised specimens previously referable to P. p. collinus, P. p. laceianus, and P. p. pectoralis obtained from northern and eastern Mexico. Levels of genetic variation (~7%) between these 2 clades indicated that the genetic divergence typically exceeded that reported for other species of Peromyscus. Samples of P. p. laceianus north and south of the Río Grande were not monophyletic. In addition, samples representing P. p. collinus and P. p. pectoralis formed 2 clades that differed genetically by 7.14%. Multivariate analyses of external and cranial measurements from 63 populations of P. pectoralis revealed 4 morpho-groups consistent with clades in the DNA sequence analysis: 1 from Texas and New Mexico assignable to P. p. laceianus; a 2nd from western and southern Mexico assignable to P. p. pectoralis; a 3rd from northern and central Mexico previously assigned to P. p. pectoralis but herein shown to represent an undescribed taxon; and a 4th from southeastern Mexico assignable to P. p. collinus. Based on the concordance of these results, populations from the United States are referred to as P. laceianus, whereas populations from Mexico are referred to as P. pectoralis (including some samples historically assigned to P. p. collinus, P. p. laceianus, and P. p. pectoralis). A new subspecies is described to represent populations south of the Río Grande in northern and central Mexico. 
Additional research is needed to discern if P. p. collinus warrants species recognition. |
Correlation of biomarker expression in colonic mucosa with disease phenotype in Crohn's disease and ulcerative colitis
Bruno ME , Rogier EW , Arsenescu RI , Flomenhoft DR , Kurkjian CJ , Ellis GI , Kaetzel CS . Dig Dis Sci 2015 60 (10) 2976-84 BACKGROUND: Inflammatory bowel diseases (IBD), including Crohn's disease (CD) and ulcerative colitis (UC), are characterized by chronic intestinal inflammation due to immunological, microbial, and environmental factors in genetically predisposed individuals. Advances in the diagnosis, prognosis, and treatment of IBD require the identification of robust biomarkers that can be used for molecular classification of diverse disease presentations. We previously identified five genes, RELA, TNFAIP3 (A20), PIGR, TNF, and IL8, whose mRNA levels in colonic mucosal biopsies could be used in a multivariate analysis to classify patients with CD based on disease behavior and responses to therapy. AIM: We compared expression of these five biomarkers in IBD patients classified as having CD or UC, and in healthy controls. RESULTS: Patients with CD were characterized as having decreased median expression of TNFAIP3, PIGR, and TNF in non-inflamed colonic mucosa as compared to healthy controls. By contrast, UC patients exhibited decreased expression of PIGR and elevated expression of IL8 in colonic mucosa compared to healthy controls. A multivariate analysis combining mRNA levels for all five genes resulted in segregation of individuals based on disease presentation (CD vs. UC) as well as severity, i.e., patients in remission versus those with acute colitis at the time of biopsy. CONCLUSION: We propose that this approach could be used as a model for molecular classification of IBD patients, which could further be enhanced by the inclusion of additional genes that are identified by functional studies, global gene expression analyses, and genome-wide association studies. |
The prevalence and incidence of latent tuberculosis infection and its associated factors among village doctors in China
He G , Li Y , Zhao F , Wang L , Cheng S , Guo H , Klena JD , Fan H , Gao F , Gao F , Han G , Ren L , Song Y , Xiong Y , Geng M , Hou Y , He G , Li J , Guo S , Yang J , Yan D , Wang Y , Gao H , An J , Duan X , Wu C , Duan F , Hu D , Lu K , Zhao Y , Rao CY , Wang Y . PLoS One 2015 10 (5) e0124097 BACKGROUND: China is a high tuberculosis (TB) burden country. More than half of acute TB cases first seek medical care in village doctors' clinics or community health centers. Despite being responsible for patient referral and management, village doctors are not systematically evaluated for TB infection or disease. We assessed the prevalence and incidence of latent TB infection (LTBI) among village doctors in China. METHODS AND FINDINGS: A longitudinal study was conducted in Inner Mongolia Autonomous Region. We administered a questionnaire on demographics and risk factors for TB exposure and disease; tuberculin skin testing (TST) and the QuantiFERON-TB Gold In-Tube assay (QFT-GIT) were conducted at baseline and repeated 12 months later. We used a logistic regression model to calculate adjusted odds ratios (ORs) for risk factors for TST and QFT-GIT prevalence and incidence. At the time of follow-up, 19.5% of the 880 participating village doctors had a positive TST and 46.0% had a positive QFT-GIT result. Factors associated with TST prevalence included having a BCG scar (OR = 1.45, 95%CI 1.03-2.04) and smoking (OR = 1.69, 95%CI 1.17-2.44). Risk factors associated with QFT-GIT prevalence included being male (OR = 2.17, 95%CI 1.63-2.89), having below college education (OR = 1.42, 95%CI 1.01-1.97), and working for ≥25 years as a village doctor (OR = 1.64, 95%CI 1.12-2.39). The annual incidence of LTBI was 11.4% by TST and 19.1% by QFT-GIT. QFT-GIT conversion was associated with spending 15 minutes or more per patient on average (OR = 2.62, 95%CI 1.39-4.97) and having a BCG scar (OR = 0.53, 95%CI 0.28-1.00). CONCLUSIONS: The prevalence and incidence of LTBI among Chinese village doctors are high. 
TB infection control measures should be strengthened among village doctors and at village healthcare settings. |
Estimating central line-associated bloodstream infection incidence rates by sampling of denominator data: a prospective, multicenter evaluation
Thompson ND , Edwards JR , Bamberg W , Beldavs ZG , Dumyati G , Godine D , Maloney M , Kainer M , Ray S , Thompson D , Wilson L , Magill SS . Am J Infect Control 2015 43 (8) 853-6 BACKGROUND: Large-scale, prospective evaluation of sampling for central line-associated bloodstream infection (CLABSI) denominator data was necessary prior to National Healthcare Safety Network (NHSN) implementation. METHODS: In a sample of volunteer hospitals from states in the Emerging Infections Program, prospective collection of CLABSI denominators (patient days, central line days [CLDs]) was performed in eligible locations for ≥6 and ≤12 consecutive months using the current NHSN method (daily collection) and also by a second data collector who sampled the denominator data 1 d/wk. The quality of the sampled data was evaluated and used to calculate estimated CLDs and CLABSI rates, which were compared with actual CLDs and CLABSI rates (daily counts). RESULTS: In total, 89 locations in 66 acute care hospitals participated. Sampled data were collected as intended 88% of the time; the quality of the data was comparable with the data collected daily. In locations with higher CLDs per month (≥75), estimated CLDs and CLABSI rates were similar to actual CLDs and CLABSI rates; however, there were significant differences in actual and estimated values among locations with lower (≤74) CLDs per month. Sampling was successfully implemented, but significant differences in the accuracy of estimated CLDs and CLABSI rates, based on the actual number of CLDs per month, were noted. CONCLUSION: For locations with a higher number of CLDs per month, sampling 1 d/wk is a valid and accurate alternative to daily collection of CLABSI denominator data. |
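The sampling scheme evaluated above replaces daily denominator counts with a count taken 1 d/wk, from which monthly central line days are extrapolated. A minimal sketch of that estimation, with hypothetical counts and illustrative function names (not NHSN's own tooling):

```python
def estimate_monthly_cld(sampled_counts, days_in_month):
    """Extrapolate monthly central line days (CLDs) from counts
    taken on one sampled day per week."""
    mean_daily = sum(sampled_counts) / len(sampled_counts)
    return mean_daily * days_in_month

def clabsi_rate(infections, central_line_days):
    """CLABSI rate expressed per 1,000 central line days."""
    return infections / central_line_days * 1000

# Hypothetical location: counts from 4 sampled days in a 30-day month.
est_cld = estimate_monthly_cld([12, 14, 13, 11], 30)  # 12.5 * 30 = 375.0
rate = clabsi_rate(2, est_cld)  # 2 infections -> about 5.3 per 1,000 CLDs
```

Consistent with the findings above, such an extrapolation is reliable only for locations with a high number of CLDs per month (≥75), where day-to-day variation in the sampled counts averages out.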
Incidence of medically-attended norovirus-associated acute gastroenteritis in four Veterans Affairs Medical Center populations in the United States, 2011-2012
Grytdal SP , Rimland D , Shirley SH , Rodriguez-Barradas MC , Goetz MB , Brown ST , Lucero-Obusan C , Holodniy M , Graber C , Parashar U , Vinje J , Lopman B . PLoS One 2015 10 (5) e0126733 An estimated 179 million acute gastroenteritis (AGE) illnesses occur annually in the United States. The role of noroviruses in hospital-related AGE has not been well-documented in the U.S. We estimated the population incidence of community-acquired outpatient and inpatient norovirus AGE encounters, as well as hospital-acquired inpatient norovirus AGE among inpatients, at four Veterans Affairs (VA) Medical Centers (VAMCs). Fifty (4%) of 1,160 stool specimens collected ≤7 days from symptom onset tested positive for norovirus. During a one-year period, the estimated incidence of outpatient, community-acquired inpatient, and hospital-acquired inpatient norovirus AGE was 188 cases, 11 cases, and 54 cases/100,000 patients, respectively. This study demonstrates the incidence of outpatient and community- and hospital-acquired inpatient norovirus AGE among the VA population seeking care at these four VAMCs. |
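The incidence estimates above are simple rates scaled to 100,000 patients per year. A minimal sketch of the arithmetic, with invented counts (the study's raw numerators and denominators are not given in the abstract):

```python
def incidence_per_100k(cases, population):
    """Annual incidence per 100,000 patients."""
    return cases / population * 100_000

# Hypothetical: 94 outpatient norovirus AGE cases in a catchment
# population of 50,000 patients over one year -> 188 per 100,000.
outpatient_rate = incidence_per_100k(94, 50_000)
```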
Statistical power and validity of Ebola vaccine trials in Sierra Leone: a simulation study of trial design and analysis
Bellan SE , Pulliam JR , Pearson CA , Champredon D , Fox SJ , Skrip L , Galvani AP , Gambhir M , Lopman BA , Porco TC , Meyers LA , Dushoff J . Lancet Infect Dis 2015 15 (6) 703-10 BACKGROUND: Safe and effective vaccines could help to end the ongoing Ebola virus disease epidemic in parts of west Africa, and mitigate future outbreaks of the virus. We assess the statistical validity and power of randomised controlled trial (RCT) and stepped-wedge cluster trial (SWCT) designs in Sierra Leone, where the incidence of Ebola virus disease is spatiotemporally heterogeneous, and is decreasing rapidly. METHODS: We projected district-level Ebola virus disease incidence for the next 6 months, using a stochastic model fitted to data from Sierra Leone. We then simulated RCT and SWCT designs in trial populations comprising geographically distinct clusters at high risk, taking into account realistic logistical constraints, and both individual-level and cluster-level variations in risk. We assessed false-positive rates and power for parametric and non-parametric analyses of simulated trial data, across a range of vaccine efficacies and trial start dates. FINDINGS: For an SWCT, regional variation in Ebola virus disease incidence trends produced increased false-positive rates (up to 0.15 at alpha=0.05) under standard statistical models, but not when analysed by a permutation test, whereas analyses of RCTs remained statistically valid under all models. With the assumption of a 6-month trial starting on Feb 18, 2015, we estimate the power to detect a 90% effective vaccine to be between 49% and 89% for an RCT, and between 6% and 26% for an SWCT, depending on the Ebola virus disease incidence within the trial population. We estimate that a 1-month delay in trial initiation will reduce the power of the RCT by 20% and that of the SWCT by 49%. INTERPRETATION: Spatiotemporal variation in infection risk undermines the statistical power of the SWCT. 
This variation also undercuts the SWCT's expected ethical advantages over the RCT, because an RCT, but not an SWCT, can prioritise vaccination of high-risk clusters. FUNDING: US National Institutes of Health, US National Science Foundation, and Canadian Institutes of Health Research. |
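The permutation analysis that kept the SWCT valid works by re-randomising which clusters count as vaccinated and recomputing the test statistic, so the reference distribution comes from the trial's own randomisation rather than a parametric model. A minimal sketch on cluster-level attack rates (names and data are illustrative, not the authors' code):

```python
import random

def permutation_test(attack_rates, vaccinated, n_perm=2000, seed=1):
    """Permutation p-value for the absolute difference in mean
    cluster-level attack rates between vaccinated and control clusters."""
    def stat(labels):
        treated = [r for r, v in zip(attack_rates, labels) if v]
        control = [r for r, v in zip(attack_rates, labels) if not v]
        return abs(sum(treated) / len(treated) - sum(control) / len(control))

    observed = stat(vaccinated)
    rng = random.Random(seed)
    labels = list(vaccinated)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(labels)  # re-randomise cluster assignments
        if stat(labels) >= observed:
            extreme += 1
    return (extreme + 1) / (n_perm + 1)

# Eight hypothetical clusters; vaccinated clusters show lower attack rates.
p = permutation_test([0.10, 0.12, 0.09, 0.11, 0.30, 0.28, 0.31, 0.29],
                     [True, True, True, True, False, False, False, False])
```

Because only the labels are shuffled, spatiotemporal structure in the underlying incidence cannot inflate the false-positive rate, which is the property the authors exploit.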
Factors associated with seasonal influenza vaccination in pregnant women
Henninger ML , Irving SA , Thompson M , Avalos LA , Ball SW , Shifflett P , Naleway AL . J Womens Health (Larchmt) 2015 24 (5) 394-402 BACKGROUND: This observational study followed a cohort of pregnant women during the 2010-2011 influenza season to determine factors associated with vaccination. METHODS: Participants were 1105 pregnant women who completed a survey assessing health beliefs related to vaccination upon enrollment and were then followed to determine vaccination status by the end of the 2010-2011 influenza season. We conducted univariate and multivariate analyses to explore factors associated with vaccination status and a factor analysis of survey items to identify health beliefs associated with vaccination. RESULTS: Sixty-three percent (n=701) of the participants were vaccinated. In the univariate analyses, multiple factors were associated with vaccination status, including maternal age, race, marital status, educational level, and gravidity. Factor analysis identified two health belief factors associated with vaccination: participant's positive views (factor 1) and negative views (factor 2) of influenza vaccination. In a multivariate logistic regression model, factor 1 was associated with increased likelihood of vaccination (adjusted odds ratio [aOR]=2.18; 95% confidence interval [CI]=1.72-2.78), whereas factor 2 was associated with decreased likelihood of vaccination (aOR=0.36; 95% CI=0.28-0.46). After controlling for the two health belief factors in multivariate analyses, demographic factors significant in univariate analyses were no longer significant. Women who received a provider recommendation were about three times more likely to be vaccinated (aOR=3.14; 95% CI=1.99-4.96). CONCLUSION: Pregnant women's health beliefs about vaccination appear to be more important than demographic and maternal factors previously associated with vaccination status. 
Provider recommendation remains one of the most critical factors influencing vaccination during pregnancy. |
Cluster survey evaluation of a measles vaccination campaign in Jharkhand, India, 2012
Scobie HM , Ray A , Routray S , Bose A , Bahl S , Sosler S , Wannemuehler K , Kumar R , Haldar P , Anand A . PLoS One 2015 10 (5) e0127105 INTRODUCTION: India was the last country in the world to implement a two-dose strategy for measles-containing vaccine (MCV) in 2010. As part of measles second-dose introduction, phased measles vaccination campaigns were conducted during 2010-2013, targeting 131 million children 9 months to <10 years of age. We performed a post-campaign coverage survey to estimate measles vaccination coverage in Jharkhand state. METHODS: A multi-stage cluster survey was conducted 2 months after the phase 2 measles campaign occurred in 19 of 24 districts of Jharkhand during November 2011-March 2012. Vaccination status of children 9 months to <10 years of age was documented based on vaccination card or mother's recall. Coverage estimates and 95% confidence intervals (95% CI) for 1,018 children were calculated using survey methods. RESULTS: In the Jharkhand phase 2 campaign, MCV coverage among children aged 9 months to <10 years was 61.0% (95% CI: 54.4-67.7%). Significant differences in coverage were observed between rural (65.0%; 95% CI: 56.8-73.2%) and urban areas (45.6%; 95% CI: 37.3-53.9%). Campaign awareness among mothers was low (51.5%), and the most commonly reported reason for non-vaccination was being unaware of the campaign (69.4%). At the end of the campaign, 53.7% (95% CI: 46.5-60.9%) of children 12 months to <10 years of age received ≥2 MCV doses, while a large proportion of children remained under-vaccinated (34.0%, 95% CI: 28.0-40.0%) or unvaccinated (12.3%, 95% CI: 9.3-16.2%). CONCLUSIONS: Implementation of the national measles campaign was a significant achievement towards measles elimination in India. In Jharkhand, campaign performance was below the target coverage of ≥90% set by the Government of India, and challenges in disseminating campaign messages were identified. 
Efforts towards increasing two-dose MCV coverage are needed to achieve the recently adopted measles elimination goal in India and the South-East Asia region. |
Deaths following vaccination: what does the evidence show?
Miller ER , Moro PL , Cano M , Shimabukuro T . Vaccine 2015 33 (29) 3288-92 Vaccines are rigorously tested and monitored and are among the safest medical products we use. Millions of vaccinations are given to children and adults in the United States each year. Serious adverse reactions are rare. However, because of the high volume of use, coincidental adverse events, including deaths, that are temporally associated with vaccination do occur. When death occurs shortly following vaccination, loved ones and others might naturally question whether it was related to vaccination. A large body of evidence supports the safety of vaccines, and multiple studies and scientific reviews have found no association between vaccination and deaths except in rare cases. During the US multi-state measles outbreak of 2014-2015, unsubstantiated claims of deaths caused by measles, mumps, and rubella (MMR) vaccine began circulating on the Internet, prompting responses by public health officials to address common misinterpretations and misuses of vaccine safety surveillance data, particularly around spontaneous reports submitted to the US Vaccine Adverse Event Reporting System (VAERS). We summarize epidemiologic data on deaths following vaccination, including examples where reasonable scientific evidence exists to support that vaccination caused or contributed to deaths. 
Rare cases where a known or plausible theoretical risk of death following vaccination exists include: anaphylaxis; vaccine-strain systemic infection after administration of live vaccines to severely immunocompromised persons; intussusception after rotavirus vaccine; Guillain-Barre syndrome after inactivated influenza vaccine; fall-related injuries associated with syncope after vaccination; yellow fever vaccine-associated viscerotropic disease or associated neurologic disease; serious complications from smallpox vaccine including eczema vaccinatum, progressive vaccinia, postvaccinal encephalitis, myocarditis, and dilated cardiomyopathy; and vaccine-associated paralytic poliomyelitis from oral poliovirus vaccine. However, making general assumptions and drawing conclusions about vaccinations causing deaths based on spontaneous reports to VAERS - some of which might be anecdotal or second-hand - or case reports in the media, is not a scientifically valid practice. |
A review of data quality of an electronic tuberculosis surveillance system for case-based reporting in Kenya
Sharma A , Ndisha M , Ngari F , Kipruto H , Cain KP , Sitienei J , Bloss E . Eur J Public Health 2015 25 (6) 1095-7 BACKGROUND: Kenya recently transitioned from a paper to an electronic system for recording and reporting of tuberculosis (TB) data. METHODS: During September-October 2013, the data quality of the new system was evaluated through an audit of data in paper source documents and in the national electronic system, and an analysis of all 99,281 cases reported in 2012. RESULTS: While the new electronic system overall is robust, this assessment demonstrated limitations in the concordance and completeness of data reaching the national level. CONCLUSIONS: Additional oversight and training in data entry are needed to strengthen TB surveillance data quality in Kenya. |
A new source of data for public health surveillance: Facebook likes
Gittelman S , Lange V , Gotway Crawford CA , Okoro CA , Lieb E , Dhingra SS , Trimarchi E . J Med Internet Res 2015 17 (4) e98 BACKGROUND: Investigation into personal health has become focused on conditions at an increasingly local level, while response rates have declined and complicated the process of collecting data at an individual level. Simultaneously, social media data have exploded in availability and have been shown to correlate with the prevalence of certain health conditions. OBJECTIVE: Facebook likes may be a source of digital data that can complement traditional public health surveillance systems and provide data at a local level. We explored the use of Facebook likes as potential predictors of health outcomes and their behavioral determinants. METHODS: We performed principal components and regression analyses to examine the predictive qualities of Facebook likes with regard to mortality, diseases, and lifestyle behaviors in 214 counties across the United States and 61 of 67 counties in Florida. These results were compared with those obtainable from a demographic model. Health data were obtained from both the 2010 and 2011 Behavioral Risk Factor Surveillance System (BRFSS) and mortality data were obtained from the National Vital Statistics System. RESULTS: Facebook likes added significant value in predicting most examined health outcomes and behaviors even when controlling for age, race, and socioeconomic status, with model fit improvements (adjusted R²) averaging 58% across models for 13 different health-related metrics over basic sociodemographic models. Small area data were not available in sufficient abundance to test the accuracy of the model in estimating health conditions in less populated markets, but initial analysis using data from Florida showed a strong model fit for obesity data (adjusted R² = 0.77). 
CONCLUSIONS: Facebook likes provide estimates for examined health outcomes and health behaviors that are comparable to those obtained from the BRFSS. Online sources may provide more reliable, timely, and cost-effective county-level data than that obtainable from traditional public health surveillance systems as well as serve as an adjunct to those systems. |
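The approach described above - principal components of the like matrix entered into a regression alongside demographic covariates, with models compared by adjusted R² - can be sketched on synthetic data. Everything below is simulated for illustration; it is not the authors' pipeline nor BRFSS or Facebook data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated county-level data: a latent "lifestyle" factor drives both
# the like profile and the health outcome (all values synthetic)
n = 200
demo = rng.normal(size=(n, 3))            # stand-ins for age/race/SES
f = rng.normal(size=n)                    # latent factor
likes = np.outer(f, rng.normal(size=50)) + rng.normal(scale=0.5, size=(n, 50))
health = demo @ np.array([0.5, -0.3, 0.2]) + 0.8 * f + rng.normal(scale=0.5, size=n)

def adj_r2(X, y):
    """Fit OLS with an intercept; return adjusted R^2."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    r2 = 1 - ((y - X1 @ beta) ** 2).sum() / ((y - y.mean()) ** 2).sum()
    p = X1.shape[1] - 1
    return 1 - (1 - r2) * (len(y) - 1) / (len(y) - p - 1)

# First 10 principal components of the centered like matrix (via SVD)
centered = likes - likes.mean(axis=0)
pcs = centered @ np.linalg.svd(centered, full_matrices=False)[2][:10].T

base = adj_r2(demo, health)                           # demographic model
full = adj_r2(np.column_stack([demo, pcs]), health)   # + like components
print(f"demographics only: adj R2 = {base:.2f}; + like PCs: adj R2 = {full:.2f}")
```

Because the simulated like matrix carries a factor correlated with the outcome, the augmented model improves on the demographic baseline, which is the qualitative pattern the study reports.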
Global status report on violence prevention 2014
Butchart A , Mikton C , Dahlberg LL , Krug EG . Inj Prev 2015 21 (3) 213 The Global status report on violence prevention 2014 (1) describes what countries are doing to address interpersonal violence, and is a joint publication of WHO, the United Nations Office on Drugs and Crime, and the United Nations Development Programme. Interpersonal violence includes child maltreatment, youth violence, intimate partner violence, sexual violence and elder abuse (2). It is a leading cause of death among young people and results in millions of non-fatal injuries that receive emergency medical care. Furthermore, exposure to interpersonal violence is associated with increased health risk behaviours, mental health problems, physical health problems and reproductive health problems (2). | Epidemiological studies of interpersonal violence are increasing, as are outcome evaluation studies of what works to prevent it (3). By contrast, few efforts have documented the extent to which countries are making use of scientific knowledge to design and monitor policies, programmes and laws to prevent such violence and provide services for victims (4–6). |
Changes in density of on-premises alcohol outlets and impact on violent crime, Atlanta, Georgia, 1997-2007
Zhang X , Hatcher B , Clarkson L , Holt J , Bagchi S , Kanny D , Brewer RD . Prev Chronic Dis 2015 12 E84 INTRODUCTION: Regulating alcohol outlet density is an evidence-based strategy for reducing excessive drinking. However, the effect of this strategy on violent crime has not been well characterized. A reduction in alcohol outlet density in the Buckhead neighborhood of Atlanta from 2003 through 2007 provided an opportunity to evaluate this effect. METHODS: We conducted a community-based longitudinal study to evaluate the impact of changes in alcohol outlet density on violent crime in Buckhead compared with 2 other cluster areas in Atlanta (Midtown and Downtown) with high densities of alcohol outlets, from 1997 through 2002 (preintervention) to 2003 through 2007 (postintervention). The relationship between exposure to on-premises retail alcohol outlets and violent crime was assessed by using annual spatially defined indices at the census block level. Multilevel regression models were used to evaluate the relationship between changes in exposure to on-premises alcohol outlets and violent crime while controlling for potential census block-level confounders. RESULTS: A 3% relative reduction in alcohol outlet density in Buckhead from 1997-2002 to 2003-2007 was associated with a 2-fold greater reduction in exposure to violent crime than occurred in Midtown or Downtown, where exposure to on-premises retail alcohol outlets increased. The magnitude of the association between exposure to alcohol outlets and violent crime was 2 to 5 times greater in Buckhead than in either Midtown or Downtown during the postintervention period. CONCLUSIONS: A modest reduction in alcohol outlet density can substantially reduce exposure to violent crime in neighborhoods with a high density of alcohol outlets. 
Routine monitoring of community exposure to alcohol outlets could also inform the regulation of alcohol outlet density, consistent with Guide to Community Preventive Services recommendations. |
Intestinal Amebae.
Ali IK . Clin Lab Med 2015 35 (2) 393-422 Among the Entamoeba species that infect humans, Entamoeba histolytica causes disease, Entamoeba dispar is a harmless commensal, Entamoeba moshkovskii seems to be a pathogen, and the pathogenicity of Entamoeba bangladeshi remains to be investigated. Species-specific detection is needed for treatment decisions and for understanding the epidemiology and pathogenicity of these amebae. Antigen-based detection methods are needed for E dispar, E moshkovskii, and E bangladeshi, as is a molecular diagnostic test capable of detecting E histolytica, E dispar, E moshkovskii, and E bangladeshi simultaneously in clinical samples. Next-generation sequencing of DNA from stool is needed to identify novel species of Entamoeba. |
Rotavirus.
Esona MD , Gautam R . Clin Lab Med 2015 35 (2) 363-391 Group A rotavirus (RVA) is the major cause of acute gastroenteritis (AGE) in young children worldwide. Introduction of two live, attenuated rotavirus vaccines, Rotarix(R) and RotaTeq(R), has dramatically reduced RVA-associated AGE and mortality. High-throughput, sensitive and specific techniques are required to rapidly diagnose and characterize rotavirus strains in stool samples for proper patient treatment and to monitor circulating vaccine and wild-type rotavirus strains. New molecular assays are being developed rapidly that are more sensitive and specific than conventional assays for the detection, genotyping and full-genome characterization of circulating rotavirus wild-type and vaccine (Rotarix(R) and RotaTeq(R)) strains causing AGE. |
Quantitative analysis of the relative mutagenicity of five chemical constituents of tobacco smoke in the mouse lymphoma assay.
Guo X , Heflich RH , Dial SL , Richter PA , Moore MM , Mei N . Mutagenesis 2015 31 (3) 287-96 Quantifying health-related biological effects, like genotoxicity, could provide a way of distinguishing between tobacco products. In order to develop tools for using genotoxicity data to quantitatively evaluate the risk of tobacco products, we tested five carcinogens found in cigarette smoke, 4-aminobiphenyl (4-ABP), benzo[a]pyrene (BaP), cadmium (in the form of CdCl2), 2-amino-3,4-dimethyl-3H-imidazo[4,5-f]quinoline (MeIQ) and 4-(methylnitrosamino)-1-(3-pyridyl)-1-butanone (NNK), in the mouse lymphoma assay (MLA). The resulting mutagenicity dose responses were analyzed by various quantitative approaches, and their strengths and weaknesses for distinguishing responses in the MLA were evaluated. L5178Y/Tk +/- 3.7.2C mouse lymphoma cells were treated with four to seven concentrations of each chemical for 4 h. Only CdCl2 produced a positive response without metabolic activation (S9); all five chemicals produced dose-dependent increases in cytotoxicity and mutagenicity with S9. The lowest dose exceeding the global evaluation factor, the benchmark dose producing a 10%, 50%, 100% or 200% increase in the background frequency (BMD10, BMD50, BMD100 and BMD200), the no observed genotoxic effect level (NOGEL), the lowest observed genotoxic effect level (LOGEL) and the mutagenic potency expressed as a mutant frequency per micromole of chemical were calculated for all the positive responses. All the quantitative metrics had similar rank orders for the agents' ability to induce mutation, from the most to least potent: CdCl2(-S9) > BaP(+S9) > CdCl2(+S9) > MeIQ(+S9) > 4-ABP(+S9) > NNK(+S9). However, the spread in metric values across the different chemicals (i.e., the ratio of the greatest value to the least) ranged from 16-fold (BMD10) to 572-fold (mutagenic potency). 
These results suggest that data from the MLA are capable of discriminating the mutagenicity of various constituents of cigarette smoke, and that quantitative analyses are available that can be useful in distinguishing between the exposure responses. |
Development of a nucleic acid extraction procedure for simultaneous recovery of DNA and RNA from diverse microbes in water.
Hill VR , Narayanan J , Gallen RR , Ferdinand KL , Cromeans T , Vinje J . Pathogens 2015 4 (2) 335-54 Drinking and environmental water samples contain a diverse array of constituents that can interfere with molecular testing techniques, especially when large volumes of water are concentrated to the small volumes needed for effective molecular analysis. In this study, a suite of enteric viruses, bacteria, and protozoan parasites were seeded into concentrated source water and finished drinking water samples, in order to investigate the relative performance of nucleic acid extraction techniques for molecular testing. Real-time PCR and reverse transcription-PCR crossing threshold (CT) values were used as the metrics for evaluating relative performance. Experimental results were used to develop a guanidinium isothiocyanate-based lysis buffer (UNEX buffer) that enabled effective simultaneous extraction and recovery of DNA and RNA from the suite of study microbes. Procedures for bead beating, nucleic acid purification, and PCR facilitation were also developed and integrated in the protocol. The final lysis buffer and sample preparation procedure was found to be effective for a panel of drinking water and source water concentrates when compared to commercial nucleic acid extraction kits. The UNEX buffer-based extraction protocol enabled PCR detection of six study microbes, in 100 L finished water samples from four drinking water treatment facilities, within three CT values (i.e., within 90% difference) of the reagent-grade water control. The results from this study indicate that this newly formulated lysis buffer and sample preparation procedure can be useful for standardized molecular testing of drinking and environmental waters. |
Recommended mass spectrometry-based strategies to identify botulinum neurotoxin-containing samples
Kalb SR , Baudys J , Wang D , Barr JR . Toxins (Basel) 2015 7 (5) 1765-78 Botulinum neurotoxins (BoNTs) cause the disease called botulism, which can be lethal. BoNTs are proteins secreted by some species of clostridia and are known to cause paralysis by interfering with nerve impulse transmission. Although the human lethal dose of BoNT is not accurately known, it is estimated to be between 0.1 µg and 70 µg, so it is important to enable detection of small amounts of these toxins. Our laboratory previously reported on the development of Endopep-MS, a mass spectrometry-based endopeptidase method to detect, differentiate, and quantify BoNT immunoaffinity-purified from complex matrices. In this work, we describe the application of Endopep-MS for the analysis of thirteen blinded samples supplied as part of the EQuATox proficiency test. This method successfully identified the presence or absence of BoNT in all thirteen samples and was able to successfully differentiate the serotype of BoNT present in the samples, which included matrices such as buffer, milk, meat extract, and serum. Furthermore, the method yielded quantitative results which had z-scores in the range of -3 to +3 for quantification of BoNT/A-containing samples. These results indicate that Endopep-MS is an excellent technique for detection, differentiation, and quantification of BoNT in complex matrices. |
Effect of pregnancy upon facial anthropometrics and respirator fit testing
Roberge RJ , Kim JH , Palmiero A , Powell JB . J Occup Environ Hyg 2015 12 (11) 0 Workers required to wear respirators must undergo additional respirator fit testing if a significant change in body weight occurs. Approximately 10% of working women of reproductive age will be pregnant and experience a significant change in weight, yet the effect of pregnancy-associated weight gain on respirator fit is unknown. Cephalo-facial anthropometric measurements and quantitative fit testing of N95 filtering facepiece respirators (N95 FFR) of 15 pregnant women and 15 matched, non-pregnant women were undertaken for comparisons between the groups. There were no significant differences between pregnant and non-pregnant women with respect to cephalo-facial anthropometric measurements or N95 FFR quantitative fit tests. Healthy pregnant workers, who adhere to the recommended weight gain limits of pregnancy, are unlikely to experience an increase in cephalo-facial dimensions that would mandate additional N95 FFR fit testing above that which is normally required on an annual basis. |
Evaluation of PIMA point-of-care CD4 analyzer in Yunnan, China
Liang J , Duan S , Ma YL , Wang JB , Su YZ , Zhang H , Ou CY , Hao L , Qi MS , Bulterys M , Westerman L , Jiang Y , Xiao Y . Chin Med J (Engl) 2015 128 (7) 890-5 BACKGROUND: CD4 count is used to determine antiretroviral therapy (ART) eligibility. In China, flow cytometers are mostly located in urban areas with limited access by patients residing in remote areas. In an attempt to address this issue, we conducted a study to validate the performance of the Alere PIMA point-of-care CD4 analyzer. METHODS: Venous and finger-prick blood specimens were collected from HIV-positive participants from two voluntary counseling and testing sites in Yunnan Province. Both venous and finger-prick blood specimens were tested with the PIMA analyzer. Venous blood specimens tested with the Becton Dickinson FACSCalibur were used as a reference. RESULTS: Venous specimens from 396 and finger-prick specimens from 387 persons were available for analysis. CD4 counts by PIMA correlated well with those from FACSCalibur, with an R² of 0.91 for venous blood and 0.81 for finger-prick blood. Compared to FACSCalibur, the PIMA analyzer yielded lower counts, with a mean bias of -47.0 cells/µL (limits of agreement [LOA]: -204 to 110 cells/µL) for venous blood and -71.0 cells/µL (LOA: -295 to 153 cells/µL) for finger-prick blood. For a CD4 threshold of 350 cells/µL, the positive predictive value (PPV) of PIMA was 84.2% and 75.7% and the negative predictive value (NPV) was 97.6% and 95.8% for venous and finger-prick blood, respectively. For an ART threshold of 500 cells/µL, the corresponding PPV was 90.3% and 84.0% and NPV was 94.3% and 93.4%, respectively. CONCLUSIONS: CD4 counting using venous blood with PIMA analyzers is a feasible alternative to a large flow cytometer to determine ART eligibility. |
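Two of the metrics reported in the PIMA evaluation - the Bland-Altman mean bias with 95% limits of agreement, and PPV/NPV at an ART-eligibility threshold - can be computed as sketched below. The function names and the example counts are illustrative, not the study's data:

```python
import math

def bland_altman(device, reference):
    """Mean bias and 95% limits of agreement (bias +/- 1.96 SD of differences)."""
    diffs = [d - r for d, r in zip(device, reference)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((x - bias) ** 2 for x in diffs) / (n - 1))
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

def ppv_npv(device, reference, threshold):
    """PPV/NPV for 'eligible' (count < threshold), judging device vs reference."""
    pairs = list(zip(device, reference))
    tp = sum(d < threshold and r < threshold for d, r in pairs)
    fp = sum(d < threshold and r >= threshold for d, r in pairs)
    tn = sum(d >= threshold and r >= threshold for d, r in pairs)
    fn = sum(d >= threshold and r < threshold for d, r in pairs)
    return tp / (tp + fp), tn / (tn + fn)

# Illustrative CD4 counts (cells/uL), not study measurements
pima = [300, 480, 520, 600, 210, 350]
facs = [350, 510, 490, 560, 260, 400]
bias, (loa_lo, loa_hi) = bland_altman(pima, facs)
ppv, npv = ppv_npv(pima, facs, threshold=500)
```

A negative mean bias, as in the abstract, indicates the point-of-care device reads systematically lower than the flow-cytometry reference.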
Inward leakage variability between respirator fit test panels - part I. deterministic approach
Zhuang Z , Liu Y , Coffey CC , Miller C , Szalajda J . J Occup Environ Hyg 2015 12 (11) 0 Inter-panel variability has never been investigated. The objective of this study was to determine the variability between different anthropometric panels used to determine the inward leakage (IL) of N95 filtering facepiece respirators (FFRs) and elastomeric half-mask respirators (EHRs). A total of 144 subjects, who were both experienced and non-experienced N95 FFR users, were recruited. Five N95 FFRs and five N95 EHRs were randomly selected from among those models tested previously in our laboratory. The PortaCount Pro+ (without the N95-Companion) was used to measure IL of ambient particles with a detectable size range of 0.02 to 1 µm. The Occupational Safety and Health Administration (OSHA) standard fit test exercises were used for this study. IL tests were performed for each subject using each of the 10 respirators. Each respirator/subject combination was tested in duplicate, resulting in a total of 20 IL tests for each subject. Three 35-member panels were randomly selected without replacement from the 144 study subjects, stratified by the National Institute for Occupational Safety and Health (NIOSH) bivariate panel cell, for conducting statistical analyses. The geometric mean (GM) IL values for all 10 studied respirators were not significantly different among the three randomly selected 35-member panels. The passing rate was not significantly different among the three panels for all respirators combined or by each model. This was true for all IL pass/fail levels of 1%, 2% and 5%. Using a criterion of 26 or more subjects required to pass the IL test, all three panels had consistent passing/failing results at pass/fail levels of 1% and 5%. Some disagreement was observed for the 2% pass/fail level. Inter-panel variability exists, but it is small relative to the other sources of variation in fit testing data. 
The concern about inter-panel variability and other types of variability can be alleviated by properly selecting: pass/fail level (IL 1% to 5%); panel size (e.g., 25 or 35); and minimum number of subjects required to pass (e.g., 26 of 35 or 23 of 35). |
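Geometric means like the GM IL values above (and the GSDs reported elsewhere in this digest) are computed on the log scale; a minimal sketch, with made-up values rather than panel data:

```python
import math

def gm_gsd(values):
    """Geometric mean and geometric standard deviation of positive values.

    Both are antilogs of the arithmetic mean and SD of the logged data,
    which is why they suit right-skewed exposure and leakage measurements.
    """
    logs = [math.log(v) for v in values]
    n = len(logs)
    mu = sum(logs) / n
    sigma = math.sqrt(sum((x - mu) ** 2 for x in logs) / (n - 1))
    return math.exp(mu), math.exp(sigma)

gm, gsd = gm_gsd([0.5, 1.2, 2.0, 4.5, 8.0])  # hypothetical IL percentages
```

Note that the GSD is a multiplicative spread (dimensionless, always >= 1), not an additive one.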
Campylobacter
Fitzgerald C . Clin Lab Med 2015 35 (2) 289-298 Campylobacter continues to be one of the most common bacterial causes of diarrheal illness in the United States and worldwide. Infection with Campylobacter causes a spectrum of diseases including acute enteritis, extraintestinal infections, and postinfectious complications. The most common species of Campylobacter associated with human illness is Campylobacter jejuni, but other Campylobacter species can also cause human infections. This comprehensive review includes discussion of the taxonomy, clinical manifestations of infection, epidemiology and the different methods of laboratory detection of Campylobacter. |
Risks associated with smallpox vaccination in pregnancy: a systematic review and meta-analysis
Badell ML , Meaney-Delman D , Tuuli MG , Rasmussen SA , Petersen BW , Sheffield JS , Beigi RH , Damon IK , Jamieson DJ . Obstet Gynecol 2015 125 (6) 1439-51 OBJECTIVE: To estimate the maternal and fetal risks of smallpox vaccination during pregnancy. DATA SOURCES: MEDLINE, Web of Science, EMBASE, Global Health, ClinicalTrials.gov, and CINAHL from inception to September 2014. METHODS OF STUDY SELECTION: We included published articles containing primary data regarding smallpox vaccination during pregnancy that reported maternal or fetal outcomes (spontaneous abortion, congenital defect, stillbirth, preterm birth, or fetal vaccinia). TABULATIONS, INTEGRATION, AND RESULTS: The primary search yielded 887 articles. After hand-searching, 37 articles were included: 18 articles with fetal outcome data and 19 case reports of fetal vaccinia. Outcomes of smallpox vaccination in 12,201 pregnant women were included. Smallpox vaccination was not associated with an increased risk of spontaneous abortion (pooled relative risk [RR] 1.03, confidence interval [CI] 0.76-1.41), stillbirth (pooled RR 1.03, CI 0.75-1.40), or preterm birth (pooled RR 0.84, CI 0.62-1.15). When vaccination in any trimester was considered, smallpox vaccination was not associated with an increased risk of congenital defects (pooled RR 1.25, CI 0.99-1.56); however, first-trimester exposure was associated with an increased risk of congenital defects (2.4% compared with 1.5%, pooled RR 1.34, CI 1.02-1.77). No cases of fetal vaccinia were reported in the studies examining fetal outcomes; 21 cases of fetal vaccinia were identified in the literature, of which three neonates survived. CONCLUSION: The overall risk associated with maternal smallpox vaccination appears low. No association between smallpox vaccination and spontaneous abortion, preterm birth, or stillbirth was identified. 
First-trimester vaccination was associated with a small increase in congenital defects, but the effect size was small and based on limited data. Fetal vaccinia appears to be a rare consequence of maternal smallpox vaccination but is associated with a high rate of fetal loss. |
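The pooled RRs above come from a meta-analysis. A minimal inverse-variance, fixed-effect pooling on the log scale can be sketched as follows; the review may well have used a different model (e.g., random effects), and the input tuples here are hypothetical:

```python
import math

def pooled_rr(studies, z=1.96):
    """Fixed-effect pooled relative risk from (rr, ci_lo, ci_hi) tuples.

    The SE of each log RR is recovered from the width of its 95% CI,
    then studies are weighted by inverse variance on the log scale.
    """
    wsum = wlogsum = 0.0
    for rr, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * z)
        w = 1 / se ** 2
        wsum += w
        wlogsum += w * math.log(rr)
    log_rr = wlogsum / wsum
    se_pooled = math.sqrt(1 / wsum)
    return (math.exp(log_rr),
            math.exp(log_rr - z * se_pooled),
            math.exp(log_rr + z * se_pooled))

# Hypothetical example: two studies with identical estimates
rr, lo, hi = pooled_rr([(2.0, 1.0, 4.0), (2.0, 1.0, 4.0)])
```

Pooling identical studies leaves the point estimate unchanged but narrows the CI, which is the basic payoff of meta-analysis.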
Trends of US hospitals distributing infant formula packs to breastfeeding mothers, 2007 to 2013
Nelson JM , Li R , Perrine CG . Pediatrics 2015 135 (6) 1051-6 OBJECTIVE: To examine trends in the prevalence of hospitals and birth centers (hereafter, hospitals) distributing infant formula discharge packs to breastfeeding mothers in the United States from 2007 to 2013. METHODS: The Maternity Practices in Infant Nutrition and Care survey is administered every 2 years to all hospitals with registered maternity beds in the United States. A Web- or paper-based questionnaire was distributed and completed by the people most knowledgeable about breastfeeding-related hospital practices. We examined the distribution of infant formula discharge packs to breastfeeding mothers from 2007 to 2013 by state and hospital characteristics. RESULTS: The percentage of hospitals distributing infant formula discharge packs to breastfeeding mothers was 72.6% in 2007 and 31.6% in 2013, a decrease of 41 percentage points. In 2007, there was only 1 state (Rhode Island) in which <25% of hospitals distributed infant formula discharge packs to breastfeeding mothers, whereas in 2013 there were 24 such states and territories. Distribution declined across all hospital characteristics examined, including facility type, teaching versus nonteaching, and size (annual number of births). CONCLUSIONS: The distribution of infant formula discharge packs to breastfeeding mothers declined markedly from 2007 to 2013. Discontinuing the practice of distributing infant formula discharge packs is a part of optimal, evidence-based maternity care to support mothers who want to breastfeed. |
Folate and vitamin B12 deficiency among non-pregnant women of childbearing-age in Guatemala 2009-2010: prevalence and identification of vulnerable populations
Rosenthal J , Lopez-Pazos E , Dowling NF , Pfeiffer CM , Mulinare J , Vellozzi C , Zhang M , Lavoie DJ , Molina R , Ramirez N , Reeve ME . Matern Child Health J 2015 19 (10) 2272-85 INTRODUCTION: Information on folate and vitamin B12 deficiency rates in Guatemala is essential to evaluate the current fortification program. The objectives of this study were to describe the prevalence of folate and vitamin B12 deficiencies among women of childbearing age (WCBA) in Guatemala and to identify vulnerable populations at greater risk for nutrient deficiency. METHODS: A multistage cluster probability study was designed with national and regional representation of nonpregnant WCBA (15-49 years of age). Primary data collection was carried out in 2009-2010. Demographic and health information was collected through face-to-face interviews. Blood samples were collected from 1473 WCBA for serum and red blood cell (RBC) folate and serum vitamin B12. Biochemical concentrations were normalized using geometric means. Prevalence rate ratios were estimated to assess relative differences among different socioeconomic and cultural groups including ethnicity, age, education level, wealth index and rural versus urban locality. RESULTS: National prevalence estimates for deficient serum [<10 nmol per liter (nmol/L)] and RBC folate (<340 nmol/L) concentrations were 5.1 % (95 % CI 3.8, 6.4) and 8.9 % (95 % CI 6.7, 11.7), respectively; for vitamin B12 deficiency (<148 pmol/L) 18.5 % (95 % CI 15.6, 21.3). Serum and RBC folate deficiency prevalences were higher for rural areas than for urban areas (8.0 vs. 2.0 % and 13.5 vs. 3.9 %, respectively). The prevalence of RBC folate deficiency showed wide variation by geographic region (3.2-24.9 %) and by wealth index (4.1-15.1 %). The prevalence of vitamin B12 deficiency also varied among regions (12.3-26.1 %). CONCLUSIONS: In Guatemala, folate deficiency was more prevalent among indigenous rural and urban poor populations. 
Vitamin B12 deficiency was widespread among WCBA. Our results suggest the ongoing need to monitor existing fortification programs, in particular regarding their reach to vulnerable populations. |
Introduction of rapid syphilis testing in antenatal care: a systematic review of the impact on HIV and syphilis testing uptake and coverage
Swartzendruber A , Steiner RJ , Adler MR , Kamb ML , Newman LM . Int J Gynaecol Obstet 2015 130 Suppl 1 S15-21 BACKGROUND: Global guidelines recommend universal syphilis and HIV screening for pregnant women. Rapid syphilis testing (RST) may contribute toward achievement of universal screening. OBJECTIVES: To examine the impact of RST on syphilis and HIV screening among pregnant women. SEARCH STRATEGY: We searched MEDLINE for English- and non-English-language articles published through November 2014. SELECTION CRITERIA: We included studies that used a comparative design and reported on syphilis and HIV test uptake among pregnant women in low- and middle-income countries (LMICs) following introduction of RST. DATA COLLECTION AND ANALYSIS: Data were extracted from six eligible articles presenting findings from Asia, Africa, and Latin America. MAIN RESULTS: All studies reported substantial increases in antenatal syphilis testing following introduction of RST; the latter did not appear to adversely impact antenatal HIV screening levels at sites already offering rapid HIV testing and may increase HIV screening among pregnant women in some settings. Qualitative data revealed that women were highly satisfied with RST. Nevertheless, ensuring adequate training for healthcare workers and supplies of commodities were cited as key implementation barriers. CONCLUSIONS: RST may increase antenatal syphilis and HIV screening and contribute to the improvement of antenatal care in LMICs. |
Cancer among children with perinatal exposure to HIV and antiretroviral medications - New Jersey, 1995 - 2010
Ivy W 3rd , Nesheim SR , Paul S , Ibrahim A , Chan M , Niu X , Lampe MA . J Acquir Immune Defic Syndr 2015 70 (1) 62-6 BACKGROUND: Concerns remain regarding the cancer risk associated with perinatal ARV exposure among infants. No excessive cancer risk has been found in short-term studies. METHOD: Children born to HIV-infected women (HIV-exposed) in New Jersey from 1995 to 2008 were identified through the Enhanced HIV/AIDS Reporting System (eHARS) and cross-referenced with data from the New Jersey State Cancer Registry to identify new cases of cancer among children who were perinatally exposed to ARV. Matching of individuals in eHARS to the New Jersey State cancer registry was conducted based on name, birthdate, Social Security number, residential address, and sex using AUTOMATCH. Age- and sex-standardized incidence ratio (SIR) and exact 95% confidence intervals (CI) were calculated using New Jersey (1979-2005) and United States (1999-2009) cancer rates. RESULTS: Among 3,087 children (29,099 person-years; 9.8 years median follow-up), four were diagnosed with cancer. Cancer incidence among HIV-exposed children who were not exposed to ARV prophylaxis (22.5 per 100,000 person-years) did not differ significantly from the incidence among children who were exposed to any perinatal ARV prophylaxis (14.3 per 100,000 person-years). Furthermore, the number of cases observed among individuals exposed to ARV did not differ significantly from cases expected based on state (SIR = 1.21; 95% CI, 0.25-3.54) and national (SIR = 1.27; 95% CI, 0.26-3.70) reference rates. CONCLUSION: Our findings are reassuring that current use of ARV for perinatal HIV prophylaxis does not increase cancer risk. We found no evidence to alter the current federal guidelines of 2014 that recommend ARV prophylaxis of HIV-exposed infants. |
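The SIRs above are observed/expected case counts, with exact 95% CIs derived from the Poisson distribution of the observed count. A sketch using bisection on the Poisson CDF (the counts below are illustrative, not the New Jersey data):

```python
import math

def poisson_cdf(k, lam):
    """P(X <= k) for X ~ Poisson(lam)."""
    return sum(math.exp(-lam) * lam ** i / math.factorial(i) for i in range(k + 1))

def exact_sir(observed, expected, alpha=0.05):
    """Standardized incidence ratio with an exact Poisson 95% CI."""
    def solve(target_cdf, k, lo=1e-9, hi=1e3):
        # poisson_cdf is decreasing in lam, so bisect until cdf == target
        for _ in range(200):
            mid = (lo + hi) / 2
            if poisson_cdf(k, mid) > target_cdf:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2
    # lower limit: P(X >= obs) = alpha/2  <=>  P(X <= obs-1) = 1 - alpha/2
    lam_lo = 0.0 if observed == 0 else solve(1 - alpha / 2, observed - 1)
    lam_hi = solve(alpha / 2, observed)
    return observed / expected, lam_lo / expected, lam_hi / expected

sir, lo, hi = exact_sir(observed=4, expected=3.3)  # hypothetical counts
```

With only a handful of observed cases, the exact CI is very wide, which is why the study's SIRs near 1 are compatible with both modest protective and modest harmful effects.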
Does autism diagnosis age or symptom severity differ among children according to whether assisted reproductive technology was used to achieve pregnancy?
Schieve LA , Fountain C , Boulet SL , Yeargin-Allsopp M , Kissin DM , Jamieson DJ , Rice C , Bearman P . J Autism Dev Disord 2015 45 (9) 2991-3003 Previous studies report associations between conception with assisted reproductive technology (ART) and autism. Whether these associations reflect an ascertainment or biologic effect is undetermined. We assessed diagnosis age and initial autism symptom severity among >30,000 children with autism from a linkage study of California Department of Developmental Services records, birth records, and the National ART Surveillance System. Median diagnosis age and symptom severity levels were significantly lower for ART-conceived than non-ART-conceived children. After adjustment for differences in the socio-demographic profiles of the two groups, the diagnosis age differentials were greatly attenuated and there were no differences in autism symptomatology. Thus, ascertainment issues related to SES, not ART per se, are likely the driving influence of the differences we initially observed. |
Manganese fractionation using a sequential extraction method to evaluate welders' shielded metal arc welding exposures during construction projects in oil refineries
Hanley KW , Andrews R , Bertke S , Ashley K . J Occup Environ Hyg 2015 12 (11) 0 The National Institute for Occupational Safety and Health (NIOSH) has conducted an occupational exposure assessment study of manganese (Mn) in welding fume of construction workers rebuilding tanks, piping, and process equipment at two oil refineries. The objective of this study was to evaluate exposures to different Mn fractions using a sequential extraction procedure. Seventy-two worker-days were monitored for either total or respirable Mn during stick welding and associated activities both within and outside of confined spaces. The samples were analyzed using an experimental method to separate different Mn fractions by valence states based on selective chemical solubility. The full-shift total particulate Mn time-weighted average (TWA) breathing zone concentrations ranged from 0.013 - 29 for soluble Mn in a mild ammonium acetate solution; from 0.26 - 250 for Mn0,2+ in acetic acid; from non-detectable (ND) - 350 for Mn3+,4+ in hydroxylamine-hydrochloride; and from ND - 39 micrograms per cubic meter (microg/m3) for insoluble Mn fractions in hydrochloric and nitric acid. The summation of all Mn fractions in total particulate TWA ranged from 0.52 to 470 microg/m3. The range of respirable particulate Mn TWA concentrations were from 0.20 - 28 for soluble Mn; from 1.4 - 270 for Mn0,2+; from 0.49 - 150 for Mn3+,4+; from ND - 100 for insoluble Mn; and from 2.0 - 490 microg/m3 for Mn (sum of fractions). For all jobs combined, total particulate TWA GM concentrations of the Mn(sum) were 99 (GSD=3.35) and 8.7 (GSD=3.54) microg/m3 for workers inside and outside of confined spaces; respirable Mn also showed much higher levels for welders within confined spaces. Regardless of particle size and confined space work status, Mn0,2+ fraction was the most abundant followed by Mn3+,4+ fraction, typically >50% and 30-40% of Mn(sum), respectively. 
Eighteen welders' exposures exceeded the ACGIH Threshold Limit Value (TLV) for total Mn (100 microg/m3), and 25 exceeded the recently adopted respirable Mn TLV (20 microg/m3). This study shows that a welding fume exposure control and management program is warranted, especially for welding jobs in confined spaces. |
Occupational fatalities during the oil and gas boom - United States, 2003-2013
Mason KL , Retzer KD , Hill R , Lincoln JM . MMWR Morb Mortal Wkly Rep 2015 64 (20) 551-554 During 2003-2013, the U.S. oil and gas extraction industry experienced unprecedented growth, doubling the size of its workforce and increasing the number of drilling rigs by 71%. To describe fatal events among oil and gas workers during this period, CDC analyzed data from the Bureau of Labor Statistics (BLS) Census of Fatal Occupational Injuries (CFOI), a comprehensive database of fatal work injuries. During 2003-2013, the number of work-related fatalities in the oil and gas extraction industry increased 27.6%, with a total of 1,189 deaths; however, the annual occupational fatality rate significantly decreased 36.3% (p<0.05) during this 11-year period. Two-thirds of all worker fatalities were attributed to transportation incidents (479 [40.3%]) and contact with objects/equipment (308 [25.9%]). More than 50% of persons fatally injured were employed by companies that service wells (615 [51.7%]). It is important for employers to consider measures such as land transportation safety policies and engineering controls (e.g., automated technologies) that would address these leading causes of death and reduce workers' exposure to hazards. |
Cancer mortality through 2005 among a pooled cohort of U.S. nuclear workers exposed to external ionizing radiation
Schubauer-Berigan MK , Daniels RD , Bertke SJ , Tseng CY , Richardson DB . Radiat Res 2015 183 (6) 620-31 Nuclear workers worldwide have been studied for decades to estimate associations between their exposure to ionizing radiation and cancer. The low-level exposure of these workers requires pooling of large cohorts studied over many years to obtain risk estimates with appropriate latency and good precision. We assembled a pooled cohort of 119,195 U.S. nuclear workers at four Department of Energy nuclear weapons facilities (Hanford site, Idaho National Laboratory, Oak Ridge National Laboratory and Savannah River site) and at the Portsmouth Naval Shipyard. The cohort was followed from the start of the workers' radiation work (at earliest, between 1944 and 1952) through 2005, and we compared its mortality to that of the U.S. population. We also conducted regression-modeling analyses to evaluate dose-response associations between external radiation exposure and several outcomes: all cancers, smoking- and nonsmoking-related cancers, all lymphatic and hematopoietic cancers, leukemia (excluding chronic lymphocytic), multiple myeloma, cardiovascular disease and others. The mean dose observed among the cohort was 20 mSv. For most outcomes, mortality was below expectation compared to the general population, but mesothelioma and pleura cancers were highly elevated. We found an excess relative risk (ERR) per 10 mSv of 0.14% [95% confidence interval (CI): -0.17%, 0.48%] for all cancers excluding leukemia. Estimates were higher for nonsmoking-related cancers (0.70%, 95% CI: 0.058%, 1.5%) and lower for smoking-related cancers (-0.079%, 95% CI: -0.43%, 0.32%). The ERR per 10 mSv was 1.7% (95% CI: -0.22%, 4.7%) for leukemia, which was similar to the estimate of 1.8% (95% CI: 0.027%, 4.4%) for all lymphatic and hematopoietic cancers. The ERR per 10 mSv for multiple myeloma was 3.9% (95% CI: 0.60%, 9.5%). The ERR per 10 mSv for cardiovascular disease was 0.026% (-0.25%, 0.32%). 
Little evidence of heterogeneity was seen by facility, birth cohort or sex for most outcomes. The estimates observed here are similar to those found in previous large pooled nuclear worker studies and also (with the exception of multiple myeloma) to those conducted in the Life Span Study of Japanese atomic bomb survivors. The tendency of observed risks to persist many years after exposure for most outcomes illustrates the importance of continued follow-up of nuclear worker cohorts. |
Review of mass drug administration for malaria and its operational challenges
Newby G , Hwang J , Koita K , Chen I , Greenwood B , von Seidlein L , Shanks GD , Slutsker LM , Kachur SP , Wegbreit J , Ippolito MM , Poirot E , Gosling R . Am J Trop Med Hyg 2015 93 (1) 125-134 Mass drug administration (MDA) was a component of many malaria programs during the eradication era, but it was later seldom deployed due to concerns regarding efficacy and feasibility, and fear of accelerating drug resistance. Recently, however, there has been renewed interest in the role of MDA as an elimination tool. Following a 2013 Cochrane Review that focused on the quantitative effects of malaria MDA, we have conducted a systematic, qualitative review of published, unpublished, and gray literature documenting past MDA experiences. We have also consulted with field experts, using their historical experience to provide an informed, contextual perspective on the role of MDA in malaria elimination. Substantial knowledge gaps remain and more research is necessary, particularly on optimal target population size, methods to improve coverage, and primaquine safety. Despite these gaps, MDA has been used successfully to control and eliminate Plasmodium falciparum and P. vivax malaria in the past, and should be considered as part of a comprehensive malaria elimination strategy in specific settings. |
Something old, something new: is praziquantel enough for schistosomiasis control?
Secor WE , Montgomery SP . Future Med Chem 2015 7 (6) 681-4 In recent years, more people living in schistosomiasis endemic areas are receiving treatment with praziquantel. The combination of the World Health Assembly’s resolution 54.19, the WHO preventive chemotherapy strategy for neglected tropical diseases, and the recognition of the considerable health impact of schistosomiasis even in persons who do not have severe fibrotic disease led financial donors and drug companies to provide the resources that have allowed availability of a much higher number of treatments for at risk individuals [1]. However, the available resources to provide treatment still are insufficient for the vast number of people who need it. The WHO treatment guidelines for schistosomiasis were drafted before sufficient quantities of praziquantel were available to make mass drug administration (MDA) a realistic possibility and were therefore primarily focused on treatment strategies to reduce morbidity. Now that praziquantel MDA has been implemented more widely, there is also a push for elimination of schistosomiasis [2]. Efforts to evaluate the efficacy of a single intervention, praziquantel MDA, are underway; however, there is growing evidence that once yearly MDA, the highest frequency of treatment in the current guidelines, will not by itself be sufficient to lower prevalence in high risk communities to rates where elimination of transmission is feasible [3–5]. There are many critical questions concerning the current and future treatment of schistosomiasis that need to be addressed. |
Spatial distribution of schistosomiasis and treatment needs in sub-Saharan Africa: a systematic review and geostatistical analysis
Lai YS , Biedermann P , Ekpo UF , Garba A , Mathieu E , Midzi N , Mwinzi P , N'Goran EK , Raso G , Assare RK , Sacko M , Schur N , Talla I , Tchuente LA , Toure S , Winkler MS , Utzinger J , Vounatsou P . Lancet Infect Dis 2015 15 (8) 927-40 BACKGROUND: Schistosomiasis affects more than 200 million individuals, mostly in sub-Saharan Africa, but empirical estimates of the disease burden in this region are unavailable. We used geostatistical modelling to produce high-resolution risk estimates of infection with Schistosoma spp and of the number of doses of praziquantel treatment needed to prevent morbidity at different administrative levels in 44 countries. METHODS: We did a systematic review to identify surveys including schistosomiasis prevalence data in sub-Saharan Africa via PubMed, ISI Web of Science, and African Journals Online, from inception to May 2, 2014, with no restriction of language, survey date, or study design. We used Bayesian geostatistical meta-analysis and rigorous variable selection to predict infection risk over a grid of 1 155 818 pixels at 5 x 5 km, on the basis of environmental and socioeconomic predictors and to calculate the number of doses of praziquantel needed for prevention of morbidity. FINDINGS: The literature search identified Schistosoma haematobium and Schistosoma mansoni surveys done in, respectively, 9318 and 9140 unique locations. Infection risk decreased from 2000 onwards, yet estimates suggest that 163 million (95% Bayesian credible interval [CrI] 155 million to 172 million; 18.5%, 17.6-19.5) of the sub-Saharan African population was infected in 2012. Mozambique had the highest prevalence of schistosomiasis in school-aged children (52.8%, 95% CrI 48.7-57.8). Low-risk countries (prevalence among school-aged children lower than 10%) included Burundi, Equatorial Guinea, Eritrea, and Rwanda. 
The numbers of doses of praziquantel needed per year were estimated to be 123 million (95% CrI 121 million to 125 million) for school-aged children and 247 million (239 million to 256 million) for the entire population. INTERPRETATION: Our results will inform policy makers about the number of treatments needed at different levels and will guide the spatial targeting of schistosomiasis control interventions. FUNDING: European Research Council, China Scholarship Council, UBS Optimus Foundation, and Swiss National Science Foundation. |
Enhanced surveillance and data feedback loop associated with improved malaria data in Lusaka, Zambia
Chisha Z , Larsen DA , Burns M , Miller JM , Chirwa J , Mbwili C , Bridges DJ , Kamuliwo M , Hawela M , Tan KR , Craig AS , Winters AM . Malar J 2015 14 (1) 222 BACKGROUND: Accurate and timely malaria data are crucial to monitor the progress towards and attainment of elimination. Lusaka, the capital city of Zambia, has reported very low malaria prevalence in Malaria Indicator Surveys. Issues of low malaria testing rates, high numbers of unconfirmed malaria cases and overconsumption of anti-malarials were common at clinics within Lusaka, however. The Government of Zambia (GRZ) and its partners sought to address these issues through an enhanced surveillance and feedback programme at clinic level. METHODS: The enhanced malaria surveillance programme began in 2011 to verify trends in reported malaria, as well as to implement a data feedback loop to improve data uptake, use, and quality. A process of monthly data collection and provision of feedback was implemented within all GRZ health clinics in Lusaka District. During clinic visits, clinic registers were accessed to record the number of reported malaria cases, malaria test positivity rate, malaria testing rate, and proportion of total suspected malaria that was confirmed with a diagnostic test. RESULTS AND DISCUSSION: Following the enhanced surveillance programme, the odds of receiving a diagnostic test for a suspected malaria case increased (OR = 1.54, 95 % CI = 0.96-2.49), followed by an upward monthly trend (OR = 1.05, 95 % CI = 1.01-1.09). The odds of a reported malaria case being diagnostically confirmed also increased monthly (OR = 1.09, 95 % CI = 1.04-1.15). After an initial 140 % increase (95 % CI = 91-183 %), costs fell by 11 % each month (95 % CI = 5.7-10.9 %). Although the mean testing rate increased from 18.9 to 64.4 % over the time period, the proportion of reported malaria unconfirmed by a diagnostic test remained high at 76 %. 
CONCLUSIONS: Enhanced surveillance and implementation of a data feedback loop have substantially increased malaria testing rates and decreased the number of unconfirmed malaria cases and courses of ACT consumed in Lusaka District within just two years. Continued support of enhanced surveillance in Lusaka as well as national scale-up of the system is recommended to reinforce good case management and to ensure timely, reliable data are available to guide targeting of limited malaria prevention and control resources in Zambia. |
What strategies are used to build practitioners' capacity to implement community-based interventions and are they effective?: a systematic review
Leeman J , Calancie L , Hartman MA , Escoffery CT , Herrmann AK , Tague LE , Moore AA , Wilson KM , Schreiner M , Samuel-Hodge C . Implement Sci 2015 10 (1) 80 BACKGROUND: Numerous agencies are providing training, technical assistance, and other support to build community-based practitioners' capacity to adopt and implement evidence-based prevention interventions. Yet, little is known about how best to design capacity-building interventions to optimize their effectiveness. Wandersman et al. (Am J Community Psychol. 50:445-59, 2012) proposed the Evidence-Based System of Innovation Support (EBSIS) as a framework to guide research and thereby strengthen the evidence base for building practitioners' capacity. The purpose of this review was to contribute to further development of the EBSIS by systematically reviewing empirical studies of capacity-building interventions to identify (1) the range of strategies used, (2) variations in the way they were structured, and (3) evidence for their effectiveness at increasing practitioners' capacity to use evidence-based prevention interventions. METHODS: PubMed, EMBASE, and CINAHL were searched for English-language articles reporting findings of empirical studies of capacity-building interventions that were published between January 2000 and January 2014 and were intended to increase use of evidence-based prevention interventions in non-clinical settings. To maximize review data, studies were not excluded a priori based on design or methodological quality. Using the EBSIS as a guide, two researchers independently extracted data from included studies. Vote counting and meta-summary methods were used to summarize findings. RESULTS: The review included 42 publications reporting findings from 29 studies. In addition to confirming the strategies and structures described in the EBSIS, the review identified two new strategies and two variations in structure. 
Capacity-building interventions were found to be effective at increasing practitioners' adoption (n = 10 of 12 studies) and implementation (n = 9 of 10 studies) of evidence-based interventions. Findings were mixed for interventions' effects on practitioners' capacity or intervention planning behaviors. Both the type and structure of capacity-building strategies may have influenced effectiveness. The review also identified contextual factors that may require variations in the ways capacity-building interventions are designed. CONCLUSIONS: Based on review findings, refinements are suggested to the EBSIS. The refined framework moves the field towards a more comprehensive and standardized approach to conceptualizing the types and structures of capacity-building strategies. This standardization will assist with synthesizing findings across studies and guide capacity-building practice and research. |
Effects of mental health benefits legislation: a Community Guide systematic review
Sipe TA , Finnie RK , Knopf JA , Qu S , Reynolds JA , Thota AB , Hahn RA , Goetzel RZ , Hennessy KD , McKnight-Eily LR , Chapman DP , Anderson CW , Azrin S , Abraido-Lanza AF , Gelenberg AJ , Vernon-Smiley ME , Nease DE Jr . Am J Prev Med 2015 48 (6) 755-766 CONTEXT: Health insurance benefits for mental health services typically have paid less than benefits for physical health services, resulting in potential underutilization or financial burden for people with mental health conditions. Mental health benefits legislation was introduced to improve financial protection (i.e., decrease financial burden) and to increase access to, and use of, mental health services. This systematic review was conducted to determine the effectiveness of mental health benefits legislation, including executive orders, in improving mental health. EVIDENCE ACQUISITION: Methods developed for the Guide to Community Preventive Services were used to identify, evaluate, and analyze available evidence. The evidence included studies published or reported from 1965 to March 2011 with at least one of the following outcomes: access to care, financial protection, appropriate utilization, quality of care, diagnosis of mental illness, morbidity and mortality, and quality of life. Analyses were conducted in 2012. EVIDENCE SYNTHESIS: Thirty eligible studies were identified in 37 papers. Implementation of mental health benefits legislation was associated with financial protection (decreased out-of-pocket costs) and appropriate utilization of services. Among studies examining the impact of legislation strength, most found larger positive effects for comprehensive parity legislation or policies than for less-comprehensive ones. Few studies assessed other mental health outcomes. 
CONCLUSIONS: Evidence indicates that mental health benefits legislation, particularly comprehensive parity legislation, is effective in improving financial protection and increasing appropriate utilization of mental health services for people with mental health conditions. Evidence was limited for other mental health outcomes. |
Factors involved in the collaboration between the National Comprehensive Cancer Control Programs and Tobacco Control Programs: a qualitative study of 6 states, United States, 2012
Momin B , Neri A , Goode SA , Sarris Esquivel N , Schmitt CL , Kahende J , Zhang L , Stewart SL . Prev Chronic Dis 2015 12 E83 INTRODUCTION: Historically, federal funding streams to address cancer and tobacco use have been provided separately to state health departments. This study aims to document the impact of a recent focus on coordinating chronic disease efforts through collaboration between the 2 programs. METHODS: Through a case-study approach using semistructured interviews, we collected information on the organizational context, infrastructure, and interaction between cancer and tobacco control programs in 6 states from March through July 2012. Data were analyzed with NVivo software, using a grounded-theory approach. RESULTS: We found between-program activities in the state health department and coordinated implementation of interventions in the community. Factors identified as facilitating integrated interventions in the community included collaboration between programs in the strategic planning process, incorporation of one another's priorities into state strategic plans, co-location, and leadership support for collaboration. Coalitions were used to deliver integrated interventions to the community. Five states perceived high staff turnover as a barrier to collaboration, and all 5 states felt that federal funding requirements were a barrier. CONCLUSIONS: Cancer and tobacco programs are beginning to implement integrated interventions to address chronic disease. Findings can inform the development of future efforts to integrate program activities across chronic disease prevention efforts. |
Disability within US public health school and program curricula
Sinclair LB , Tanenhaus RH , Courtney-Long E , Eaton DK . J Public Health Manag Pract 2015 21 (4) 400-5 OBJECTIVE: To describe the percentage of US public health schools and programs offering graduate-level courses with disability content as a potential baseline measurement for Healthy People 2020 objective DH-3 and compare the percentage of public health schools that offered disability coursework in 1999 with those in 2011. DESIGN: In 2011, using SurveyMonkey.com, cross-sectional information was collected from the deans, associate deans, directors, or chairpersons of master of public health-granting public health schools and programs that were accredited and listed with the Council on Education for Public Health. Two rounds of follow-up were conducted at 4-month intervals by e-mails and phone calls to program contacts who had not responded. The responses from schools and programs were calculated and compared. RESULTS: There were 78 responses (34 schools and 44 programs) for a response rate of 63%. Fifty percent of public health schools and programs offered some disability content within their graduate-level courses. A greater percentage of schools than programs (71% vs 34%; P = .003) offered some graduate-level disability coursework within their curricula. The percentage of schools that offered disability coursework was similar in 1999 and 2011. CONCLUSION: This assessment provides a potential baseline measurement for Healthy People 2020 objective DH-3. Future assessments should focus on clarifying disability content within courses and identifying capacity to offer disability training within public health schools and programs. |
Nonuse of contraception among women at risk of unintended pregnancy in the United States
Mosher W , Jones J , Abma J . Contraception 2015 92 (2) 170-6 OBJECTIVE: This paper seeks to determine factors associated with nonuse of contraception by women at risk of unintended pregnancy in the United States. This nonuse may be associated with about 900,000 unintended births in the US each year. STUDY DESIGN: The 2002 and 2006-2010 National Surveys of Family Growth were combined to yield a nationally representative sample of 9,675 women at risk of unintended pregnancy. Logistic regression analyses identified factors associated with nonuse of contraception. RESULTS: This analysis reveals previously undocumented patterns of nonuse: controlling for confounding variables, cohabiting women (adjusted odds ratio [AOR]=2.3, 95% confidence interval [CI]=1.45-3.52) had higher odds of nonuse than married women; women who reported difficulty getting pregnant (AOR=2.5, 95% CI=2.01-3.01) had higher odds of nonuse than those who did not. Nonuse was also more common among women with a master's degree or more (AOR=1.5, 95% CI=1.11-2.08) compared with those with some college or a bachelor's degree, and it was more common among women in their first year after first intercourse than after the first year (AOR=1.6, 95% CI=1.12-2.22). Among women who had a recent unintended birth, the most common reason for not using contraception prior to conception was that the woman did not think she could get pregnant. CONCLUSIONS: This study establishes national estimates of reasons for nonuse of contraception and identifies some new subgroups at risk of nonuse. IMPLICATIONS: These results may help to better explain factors affecting nonuse of contraception and to inform strategies for preventing unintended pregnancy in the United States. |
Postpartum contraceptive use among women with a recent preterm birth
Robbins CL , Farr SL , Zapata LB , D'Angelo DV , Callaghan WM . Am J Obstet Gynecol 2015 213 (4) 508 e1-9 OBJECTIVE: To evaluate the associations between postpartum contraception and having a recent preterm birth. STUDY DESIGN: Population-based data from the Pregnancy Risk Assessment Monitoring System in nine states were used to estimate postpartum use of highly or moderately effective contraception (sterilization, intrauterine device, implants, shots, pills, patch, and ring) and user-independent contraception (sterilization, implants, and intrauterine device) among women with recent live births (2009-2011). We assessed differences in contraception by gestational age (≤27, 28-33, or 34-36 weeks versus term [≥37 weeks]) and modeled the associations using multivariable logistic regression with weighted data. RESULTS: A higher percentage of women with recent extreme preterm birth (≤27 weeks) reported using no postpartum method (31%) compared with all other women (15%-16%). Women delivering extreme preterm infants had decreased odds of using highly or moderately effective methods (adjusted odds ratio [aOR]=0.5, 95% confidence interval [CI]: 0.4 - 0.6) and user-independent methods (aOR=0.5, 95% CI: 0.4 - 0.7) compared with women having term births. Wanting to get pregnant was more frequently reported as a reason for contraceptive non-use by women with an extreme preterm birth overall (45%) compared with all other women (15%-18%, p<.0001). Infant death occurred in 41% of extreme preterm births and over half (54%) of these mothers reported wanting to become pregnant as the reason for contraceptive non-use. CONCLUSIONS: During contraceptive counseling with women who had recent preterm births, providers should address optimal pregnancy interval, and consider that women with recent extreme preterm birth, particularly those whose infants died, may not use contraception because they want to get pregnant. |
Adolescent schoolgirls' experiences of menstrual cups and pads in rural western Kenya: a qualitative study
Mason L , Laserson KF , Oruko K , Nyothach E , Alexander KT , Odhiambo FO , Eleveld A , Isiye E , Ngere I , Omoto J , Mohammed A , Vulule J , Phillips-Howard PA . Waterlines 2015 34 (1) 15-30 Poor menstrual hygiene management (MHM) among schoolgirls in low-income countries affects girls' dignity, self-esteem, and schooling. Hygienic, effective, and sustainable menstrual products are required. A randomized controlled feasibility study was conducted among 14-16-year-old girls, in 30 primary schools in rural western Kenya, to examine acceptability, use, and safety of menstrual cups or sanitary pads. Focus group discussions (FGDs) were conducted to evaluate girls' perceptions and experiences six months after product introduction. Narratives from 10 girls' and 6 parents' FGDs were analysed thematically. Comparison, fear, and confidence were emergent themes. Initial use of cups was slow. Once comfortable, girls using cups or pads reported being free of embarrassing leakage, odour, and dislodged items compared with girls using traditional materials. School absenteeism and impaired concentration were only reported by girls using traditional materials. Girls using cups preferred them to pads. Advantages of cups and pads over traditional items provide optimism for MHM programmes. |
Combined hormonal contraceptive use among breastfeeding women: an updated systematic review
Tepper NK , Phillips SJ , Kapp N , Gaffield ME , Curtis KM . Contraception 2015 94 (3) 262-74 BACKGROUND: Contraception is important for women who are postpartum, including those who are breastfeeding. Use of combined hormonal contraceptives (CHCs) may affect breastfeeding performance and infant health outcomes. OBJECTIVES: To identify evidence examining clinical outcomes for breastfeeding and infant health among breastfeeding women using combined hormonal contraceptives compared to non-users. SEARCH STRATEGY: We searched the PubMed database for all articles published from database inception through September 30, 2014. SELECTION CRITERIA: We included primary research studies that compared breastfeeding women using CHCs with breastfeeding women using non-hormonal or no contraception, or compared breastfeeding women initiating combined hormonal contraception at early versus later times postpartum. Breastfeeding outcomes of interest included duration, rate of exclusive breastfeeding and timing of supplementation. Infant outcomes of interest included growth, health, and development. RESULTS: Fifteen articles describing 13 studies met inclusion criteria for this review. Studies ranged from poor to fair methodological quality and demonstrated inconsistent effects of combined oral contraceptives (COCs) on breastfeeding performance with COC initiation before or after 6 weeks postpartum; some studies demonstrated greater supplementation and decreased breastfeeding continuation among COC users compared with non-users and others demonstrated no effect. For infant outcomes, some studies found decreases in infant weight gain for COC users compared with non-users when COCs were initiated at < 6 weeks postpartum while other studies found no effect. None of the studies found an effect on infant weight gain when COCs were started after 6 weeks postpartum, and no studies found an effect on other infant health outcomes regardless of time of COC initiation. 
CONCLUSION: Limited evidence of poor to fair quality demonstrates an inconsistent impact of COCs on breastfeeding duration and success. The evidence also demonstrated conflicting results on whether early initiation of COCs affects infant outcomes, but generally found no negative impact on infant outcomes with later initiation of COCs. The body of evidence is limited by older studies using different formulations and doses of estrogen and poor methodologic quality. Given the significant limitations of this body of evidence, the importance of contraception for postpartum women, and the theoretical concerns that have been raised about the use of combined hormonal contraception by women who are breastfeeding, rigorous studies examining these issues are needed. In addition, postpartum women should be counseled about the full range of safe alternative contraceptive methods, particularly during the first 6 weeks postpartum when the risk of venous thromboembolism is highest and use of estrogen may exacerbate this risk. |
Use of Tobacco Tax Stamps to Prevent and Reduce Illicit Tobacco Trade - United States, 2014
Chriqui J , DeLong H , Chaloupka F , Edwards SM , Xu X , Promoff G . MMWR Morb Mortal Wkly Rep 2015 64 (20) 541-546 Tobacco use is the leading cause of preventable disease and death in the United States. Increasing the unit price on tobacco products is the most effective tobacco prevention and control measure. Illicit tobacco trade (illicit trade) undermines high tobacco prices by providing tobacco users with cheaper-priced alternatives. In the United States, illicit trade primarily occurs when cigarettes are bought from states, jurisdictions, and federal reservation land with lower or no excise taxes, and sold in jurisdictions with higher taxes. Applying tax stamps to tobacco products, which provides documentation that taxes have been paid, is an important tool to combat illicit trade. Comprehensive tax stamping policy, which includes using digital, encrypted ("high-tech") stamps, applying stamps to all tobacco products, and working with tribes on stamping agreements, can further prevent and reduce illicit trade. This report describes state laws governing tax stamps on cigarettes, little cigars (cigarette-sized cigars), roll-your-own tobacco (RYOT), and tribal tobacco sales across the United States as of January 1, 2014, and assesses the extent of comprehensive tobacco tax stamping in the United States. Forty-four states (including the District of Columbia [DC]) applied traditional paper ("low-tech") tax stamps to cigarettes, whereas four authorized more effective high-tech stamps. Six states explicitly required stamps on other tobacco products (i.e., tobacco products other than cigarettes), and in approximately one third of states with tribal lands, tribes required tax stamping to address illicit purchases by nonmembers. No U.S. state had a comprehensive approach to tobacco tax stamping. Enhancing tobacco tax stamping across the country might further prevent and reduce illicit trade in the United States. |
Approaches for controlling illicit tobacco trade - nine countries and the European Union
Ross H , Husain MJ , Kostova D , Xu X , Edwards SM , Chaloupka FJ , Ahluwalia IB . MMWR Morb Mortal Wkly Rep 2015 64 (20) 547-550 An estimated 11.6% of the world cigarette market is illicit, representing more than 650 billion cigarettes a year and $40.5 billion in lost revenue. Illicit tobacco trade refers to any practice related to distributing, selling, or buying tobacco products that is prohibited by law, including tax evasion (sale of tobacco products without payment of applicable taxes), counterfeiting, disguising the origin of products, and smuggling. Illicit trade undermines tobacco prevention and control initiatives by increasing the accessibility and affordability of tobacco products, and reduces government tax revenue streams. The World Health Organization (WHO) Protocol to Eliminate Illicit Trade in Tobacco Products, signed by 54 countries, provides tools for addressing illicit trade through a package of regulatory and governing principles. As of May 2015, only eight countries had ratified or acceded to the illicit trade protocol, with an additional 32 needed for it to become international law (i.e., legally binding). Data from multiple international sources were analyzed to evaluate the 10 most commonly used approaches for addressing illicit trade and to summarize differences in implementation across select countries and the European Union (EU). Although the WHO illicit trade protocol defines shared global standards for addressing illicit trade, countries are guided by their own legal and enforcement frameworks, leading to a diversity of approaches employed across countries. Continued adoption of the methods outlined in the WHO illicit trade protocol might improve the global capacity to reduce illicit trade in tobacco products. |
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors.
- Page last reviewed:Feb 1, 2024
- Page last updated:Sep 03, 2024