Risk indicators for periodontitis in US adults: National Health and Nutrition Examination Survey (NHANES) 2009-2012
Eke PI , Wei L , Thornton-Evans GO , Borrell LN , Borgnakke WS , Dye B , Genco RJ . J Periodontol 2016 87 (10) 1-18 OBJECTIVE: To determine population-average risk profiles for severe and non-severe periodontitis in US adults (30 years and older) using optimal surveillance measures and standard case definitions. METHODS: We used data from the 2009-2012 National Health and Nutrition Examination Survey (NHANES), which for the first time used the "gold standard" full-mouth periodontitis surveillance protocol to classify severity of periodontitis following the suggested CDC/AAP case definitions. The probabilities of periodontitis by socio-demographics, behavioral factors, and co-morbid conditions were assessed using prevalence ratios (PR) estimated by the predicted marginal probability from multivariable generalized logistic regression models. The analyses were further stratified by gender for each classification of periodontitis. RESULTS: The likelihood of both overall and non-severe periodontitis, relative to no periodontitis, increased with age. Compared to non-Hispanic whites, periodontitis was more likely among Hispanics (aPR=1.38; 1.26-1.52) and non-Hispanic blacks (aPR=1.35; 1.22-1.50), whereas severe periodontitis was most likely among non-Hispanic blacks (aPR=1.82; 1.44-2.31). There was at least a 50% greater likelihood of periodontitis among current smokers compared to non-smokers. Among males, the likelihood of periodontitis was greater among adults 65 years and older (aPR=2.07; 1.76-2.43) than among adults 30-44 years old. This probability was even greater among women (aPR=3.15; 95% CI 2.63-3.77). The likelihood of periodontitis was higher among current smokers relative to non-smokers regardless of gender and periodontitis classification. Among men only, periodontitis was more likely among those with uncontrolled diabetes than among persons without diabetes.
CONCLUSIONS: An assessment of risk profiles for periodontitis in US adults based on gold standard periodontal measures shows important differences by severity of disease and gender. Cigarette smoking, specifically current smoking, remains an important modifiable risk factor for all levels of periodontitis severity. The higher likelihood of periodontitis in older adults and in males with uncontrolled diabetes is noteworthy. These findings could improve the identification of target populations for effective public health interventions to improve periodontal health of US adults. |
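The adjusted prevalence ratios (aPRs) above come from predicted marginal probabilities of multivariable models; the underlying quantity can be illustrated with a crude (unadjusted) prevalence ratio and a log-scale 95% CI. A minimal sketch with hypothetical counts, not the study's data:

```python
import math

def prevalence_ratio(a, n1, b, n0):
    """Crude prevalence ratio comparing exposed (a/n1) with unexposed
    (b/n0), with a 95% CI computed on the log scale."""
    pr = (a / n1) / (b / n0)
    # Standard error of ln(PR) for two independent proportions
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n0)
    lo = math.exp(math.log(pr) - 1.96 * se)
    hi = math.exp(math.log(pr) + 1.96 * se)
    return pr, lo, hi

# Hypothetical counts: 300 of 600 exposed vs 200 of 600 unexposed with disease
pr, lo, hi = prevalence_ratio(300, 600, 200, 600)
print(f"PR = {pr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # → PR = 1.50 (95% CI 1.31-1.72)
```

Adjusted PRs as reported in the abstract additionally require model-based standardization over covariates; this sketch shows only the crude ratio and its interval.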
United States National Pain Strategy for Population Research: Concepts, definitions and pilot data
Von Korff M , Scher AI , Helmick C , Carter-Pokras O , Dodick D , Goulet J , Hamill-Ruth R , LeResche L , Porter L , Tait R , Terman G , Veasley C , Mackey S . J Pain 2016 17 (10) 1068-1080 National Pain Strategy (NPS) population research objectives include estimating chronic pain prevalence, studying pain treatment with electronic health care data, and developing metrics to assess progress in reducing chronic pain impact. In this paper, the NPS Population Research Workgroup reviews concepts relevant to achieving these aims. High-impact chronic pain was defined as persistent pain with substantial restriction of life activities lasting six months or more. In pilot work, we tested a brief assessment of high-impact chronic pain and employed electronic health records data to describe pain-related health care. A mail survey of adult health plan enrollees (N=770) found that 14% had high-impact chronic pain. Relative to persons with lower-impact chronic pain, those with high-impact chronic pain were more often frequent users of health care for pain, reported lower quality of life and greater pain-related interference with activities, and more often reported pain at multiple anatomic locations. Analyses of health care data (N=289,464) found that pain patients had higher health care costs compared to others and that pain services were typically delivered in primary care. These results support the feasibility of developing data on chronic pain through national health interview surveys and large electronic health care databases. PERSPECTIVE: Pilot analyses supported the feasibility of brief chronic pain assessments suitable for national health surveys and the use of electronic health care databases to develop data regarding trends in the delivery, costs, and effectiveness of pain treatments. These methods are relevant to achieving the aims of the U.S. National Pain Strategy. |
Improved blood pressure control to reduce cardiovascular disease morbidity and mortality: The Standardized Hypertension Treatment and Prevention Project
Patel P , Ordunez P , DiPette D , Escobar MC , Hassell T , Wyss F , Hennis A , Asma S , Angell S . J Clin Hypertens (Greenwich) 2016 18 (12) 1284-1294 Hypertension is the leading remediable risk factor for cardiovascular disease, affecting more than 1 billion people worldwide, and is responsible for more than 10 million preventable deaths globally each year. While hypertension can be successfully diagnosed and treated, only one in seven persons with hypertension has controlled blood pressure. To meet the challenge of improving the control of hypertension, particularly in low- and middle-income countries, the authors developed the Standardized Hypertension Treatment and Prevention Project, which involves a health systems-strengthening approach that advocates for standardized hypertension management using evidence-based interventions. These interventions include the use of standardized treatment protocols, a core set of medications along with improved procurement mechanisms to increase the availability and affordability of these medications, registries for cohort monitoring and evaluation, patient empowerment, team-based care (task shifting), and community engagement. With political will and strong partnerships, this approach provides the groundwork to reduce high blood pressure and cardiovascular disease-related morbidity and mortality. |
Inflammatory markers and the risk of hip and vertebral fractures in men: the Osteoporotic Fractures in Men (MrOS) Study
Cauley JA , Barbour KE , Harrison SL , Cloonan YK , Danielson ME , Ensrud KE , Fink HA , Orwoll ES , Boudreau R . J Bone Miner Res 2016 31 (12) 2129-2138 Cytokines play major roles in regulating bone remodeling, but their relationship to incident fractures in older men is uncertain. We tested the hypothesis that men with higher concentrations of pro-inflammatory markers have a higher risk of fracture. We used a case-cohort design and measured inflammatory markers in a random sample of 961 men and in men with incident fractures, including 120 clinical vertebral, 117 hip, and 577 non-spine fractures; average follow-up was 6.13 years (7.88 years for vertebral fractures). We measured interleukin (IL)-6, C-reactive protein (CRP), tumor necrosis factor alpha (TNF-α), soluble receptors (SR) of IL-6 (IL-6SR) and TNF (TNF-αSR1 and TNF-αSR2), and IL-10. The risk of non-spine, hip, and clinical vertebral fracture was compared across quartiles (Q) of inflammatory markers using Cox proportional hazards models with tests for linear trend. In multivariable adjusted models, men with the highest (Q4) concentrations of TNF and its soluble receptors had a 2.9-fold higher risk of hip and clinical vertebral fracture than men with the lowest (Q1). Results were similar for all non-spine fractures, but associations were smaller. There was no association between either CRP or IL-6SR and fracture. Men in the highest Q of IL-10 had a 49% lower risk of vertebral fracture compared to men in Q1. Among men with >3 inflammatory markers in the highest Q, the hazard ratio (HR) for hip fractures was 2.03 (95% confidence interval (CI), 1.11-3.71) and for vertebral fracture, 3.06 (1.66-5.63). The HRs for hip fracture were attenuated by 27%, 27%, and 15%, respectively, after adjusting for appendicular lean mass (ALM), disability, and bone density, suggesting mediating roles. ALM also attenuated the HR for vertebral fractures by 10%. There was no association between inflammation and rate of hip BMD loss.
We conclude that inflammation may play an important role in the etiology of fractures in older men. |
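The attenuation percentages in the abstract above quantify how much of the excess hazard is explained by a potential mediator; the arithmetic is conventionally the attenuated share of the hazard ratio's distance from the null. A minimal sketch with hypothetical hazard ratios (not the study's exact adjusted estimates):

```python
def percent_attenuation(hr_unadjusted, hr_adjusted):
    """Percent attenuation of a hazard ratio toward the null (HR = 1)
    after adjustment for a potential mediator."""
    return 100 * (hr_unadjusted - hr_adjusted) / (hr_unadjusted - 1)

# Hypothetical HRs: 2.90 before and 2.39 after adjusting for lean mass
print(round(percent_attenuation(2.90, 2.39), 1))  # → 26.8
```

An attenuation near 27%, as in the abstract, means roughly a quarter of the excess hazard is accounted for by the adjustment variable.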
Association of higher consumption of foods derived from subsidized commodities with adverse cardiometabolic risk among US adults
Siegel KR , McKeever Bullard K , Imperatore G , Kahn HS , Stein AD , Ali MK , Narayan KM . JAMA Intern Med 2016 176 (8) 1124-32 Importance: Food subsidies are designed to enhance food availability, but whether they promote cardiometabolic health is unclear. Objective: To investigate whether higher consumption of foods derived from subsidized food commodities is associated with adverse cardiometabolic risk among US adults. Design, Setting, and Participants: Cross-sectional analysis of the National Health and Nutrition Examination Survey data from 2001 to 2006. Our final analysis was performed in January 2016. Participants were 10308 nonpregnant adults 18 to 64 years old in the general community. Exposure: From a single day of 24-hour dietary recall in the National Health and Nutrition Examination Survey, we calculated an individual-level subsidy score that estimated an individual's consumption of subsidized food commodities as a percentage of total caloric intake. Main Outcomes and Measures: The main outcomes were body mass index (calculated as weight in kilograms divided by height in meters squared), abdominal adiposity, C-reactive protein level, blood pressure, non-high-density lipoprotein cholesterol level, and glycemia. Results: Among 10308 participants, the mean (SD) age was 40.2 (0.3) years, and a mean (SD) of 50.5% (0.5%) were male. Overall, 56.2% of calories consumed were from the major subsidized food commodities. 
United States adults in the highest quartile of the subsidy score (compared with the lowest) had increased probabilities of having a body mass index of at least 30 (prevalence ratio, 1.37; 95% CI, 1.23-1.52), a ratio of waist circumference to height of at least 0.60 (prevalence ratio, 1.41; 95% CI, 1.25-1.59), a C-reactive protein level of at least 0.32 mg/dL (prevalence ratio, 1.34; 95% CI, 1.19-1.51), an elevated non-high-density lipoprotein cholesterol level (prevalence ratio, 1.14; 95% CI, 1.05-1.25), and dysglycemia (prevalence ratio, 1.21; 95% CI, 1.04-1.40). There was no statistically significant association between the subsidy score and blood pressure. Conclusions and Relevance: Among US adults, higher consumption of calories from subsidized food commodities was associated with a greater probability of some cardiometabolic risks. Better alignment of agricultural and nutritional policies may potentially improve population health. |
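The outcome cutoffs above (BMI ≥30, waist-to-height ratio ≥0.60) are simple anthropometric quantities; as a quick illustration with a hypothetical participant, not study data:

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight in kilograms divided by height in meters squared."""
    return weight_kg / height_m ** 2

def waist_height_ratio(waist_cm, height_cm):
    """Ratio of waist circumference to height (same units for both)."""
    return waist_cm / height_cm

# Hypothetical participant: 95 kg, 1.75 m tall, 108 cm waist
b = bmi(95, 1.75)                  # meets the BMI >= 30 cutoff
r = waist_height_ratio(108, 175)   # meets the WHtR >= 0.60 cutoff
print(round(b, 1), round(r, 2))    # → 31.0 0.62
```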
A comprehensive capacity assessment tool for non-communicable diseases in low- to middle-income countries: development and results of pilot testing
Garcia de Quevedo I , Lobelo F , Cadena L , Soares M , Pratt M . Glob Health Promot 2016 25 (1) 43-53 Non-communicable diseases (NCDs) are the leading causes of death worldwide, with higher rates of premature mortality in low- and middle-income countries (LMICs). This places a high economic burden on these countries, which usually have limited capacity to address this public health problem. We developed a guided self-assessment tool for describing national capacity for NCD prevention and control. The purpose of this tool was to assist countries in identifying key opportunities and gaps in NCD capacity. It was piloted in three countries between 2012 and 2013: Mozambique, Colombia, and the Dominican Republic. The tool includes details about NCD burden; health system infrastructure and primary care services; workforce capacity; surveillance; planning, policy, and program management; and partnerships. In the three pilot countries, the tool helped to identify differences in capacity needs pertaining to staff, training, and surveillance, but similarities were also found related to NCD challenges and opportunities. The NCD tool increased our understanding of needs and critical capacity elements for addressing NCDs in the three pilot countries. This tool can be used by other LMICs to map their efforts toward addressing NCD goals and defining priorities. |
Dietary nitrate and the epidemiology of cardiovascular disease: report from a National Heart, Lung, and Blood Institute Workshop
Ahluwalia A , Gladwin M , Coleman GD , Hord N , Howard G , Kim-Shapiro DB , Lajous M , Larsen FJ , Lefer DJ , McClure LA , Nolan BT , Pluta R , Schechter A , Wang CY , Ward MH , Harman JL . J Am Heart Assoc 2016 5 (7) In view of continuing unanswered questions regarding the geographical and demographic distribution of cardiovascular disease, and recent discoveries about the effects of dietary nitrate on cardiovascular physiology, the National Heart, Lung, and Blood Institute (NHLBI) convened a workshop to identify approaches to address how best to incorporate the study of nitrate exposures into ongoing studies of cardiovascular epidemiology. The NHLBI invited speakers who had made recent contributions to the study of the functions of nitrate on the cardiovascular system, on the occurrence of nitrate in foods and drinking water, or who had expert knowledge of cardiovascular surveys with wide geographical variability and therefore the greatest potential variability in dietary and drinking water nitrate. Because of the history of research on the possible carcinogenicity of nitrite, an expert in this field was also invited. The following document is a synthesis of the material presented and discussed and of literature cited at the workshop. The workshop from which this article is derived was funded and convened by the NHLBI. |
Molecular and Growth-Based Drug Susceptibility Testing of Mycobacterium tuberculosis Complex for Ethambutol Resistance in the United States.
Yakrus MA , Driscoll J , McAlister A , Sikes D , Hartline D , Metchock B , Starks AM . Tuberc Res Treat 2016 2016 3404860 Ethambutol (EMB) is used as a part of drug regimens for treatment of tuberculosis (TB). Susceptibility of Mycobacterium tuberculosis complex (MTBC) isolates to EMB can be discerned by DNA sequencing to detect mutations in the embB gene associated with resistance. US Public Health Laboratories (PHL) primarily use growth-based drug susceptibility test (DST) methods to determine EMB resistance. The Centers for Disease Control and Prevention (CDC) provides a service for molecular detection of drug resistance (MDDR) by DNA sequencing and concurrent growth-based DST using agar proportion. PHL and CDC test results were compared for 211 MTBC samples submitted to CDC from September 2009 through February 2011. Concordance between growth-based DST results from PHL and CDC was 88.2%. A growth-based comparison of 39 samples, where an embB mutation associated with EMB resistance was detected, revealed a higher percentage of EMB resistance by CDC (84.6%) than by PHL (59.0%) which was significant (P value = 0.002). Discordance between all growth-based test results from PHL and CDC was also significant (P value = 0.003). Most discordance was linked to false susceptibility using the BACTEC MGIT 960 (MGIT) growth-based system. Our analysis supports coalescing growth-based and molecular results for an informed interpretation of potential EMB resistance. |
Reduced evolutionary rate in reemerged Ebola virus transmission chains.
Blackley DJ , Wiley MR , Ladner JT , Fallah M , Lo T , Gilbert ML , Gregory C , D'Ambrozio J , Coulter S , Mate S , Balogun Z , Kugelman J , Nwachukwu W , Prieto K , Yeiah A , Amegashie F , Kearney B , Wisniewski M , Saindon J , Schroth G , Fakoli L , Diclaro JW 2nd , Kuhn JH , Hensley LE , Jahrling PB , Stroher U , Nichol ST , Massaquoi M , Kateh F , Clement P , Gasasira A , Bolay F , Monroe SS , Rambaut A , Sanchez-Lockhart M , Scott Laney A , Nyenswah T , Christie A , Palacios G . Sci Adv 2016 2 (4) e1600378 On 29 June 2015, Liberia's respite from Ebola virus disease (EVD) was interrupted for the second time by a renewed outbreak ("flare-up") of seven confirmed cases. We demonstrate that, similar to the March 2015 flare-up associated with sexual transmission, this new flare-up was a reemergence of a Liberian transmission chain originating from a persistently infected source rather than a reintroduction from a reservoir or a neighboring country with active transmission. Although distinct, Ebola virus (EBOV) genomes from both flare-ups exhibit significantly low genetic divergence, indicating a reduced rate of EBOV evolution during persistent infection. Using this rate of change as a signature, we identified two additional EVD clusters that possibly arose from persistently infected sources. These findings highlight the risk of EVD flare-ups even after an outbreak is declared over. |
Risk factors for herpes zoster among adults
Marin M , Harpaz R , Zhang J , Wollan PC , Bialek SR , Yawn BP . Open Forum Infect Dis 2016 3 (3) ofw119 Background: The causes of varicella-zoster virus reactivation and herpes zoster (HZ) are largely unknown. We assessed potential risk factors for HZ, the data for which cannot be obtained from the medical sector. Methods: We conducted a matched case-control study. We established active surveillance in Olmsted County, Minnesota, to identify HZ occurring among persons age ≥50 years during 2010-2011. Cases were confirmed by medical record review. Herpes zoster-free controls were age- and sex-matched to cases. Risk factor data were obtained by telephone interview. Results: We enrolled 389 HZ case patients and 511 matched controls; the median age was 65 and 66 years, respectively. Herpes zoster was associated with family history of HZ (adjusted odds ratio [aOR] = 1.65); the association was strongest with first-degree or multiple relatives (aOR = 1.87 and 3.08, respectively). Herpes zoster was also associated with prior HZ episodes (aOR = 1.82), sleep disturbance (aOR = 2.52), depression (aOR = 3.81), and recent weight loss (aOR = 1.95). Stress was a risk factor for HZ (aOR = 2.80), although a dose-response relationship was not noted. All associations indicated were statistically significant (P < .05). Herpes zoster was not associated with trauma, smoking, tonsillectomy, diet, or reported exposure to pesticides or herbicides (P > .1). Conclusions: We identified several important risk factors for HZ; however, the key attributable causes of HZ remain unknown. |
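In a matched case-control design like the one above, adjusted aORs come from conditional logistic regression, but the basic matched-pairs odds ratio can be estimated from discordant pairs alone (the McNemar approach). A minimal sketch with hypothetical pair counts, not the study's data:

```python
import math

def matched_pairs_or(case_only_exposed, control_only_exposed):
    """Conditional (matched-pairs) odds ratio from discordant pairs:
    pairs where only the case was exposed over pairs where only the
    control was exposed, with a log-scale 95% CI."""
    or_ = case_only_exposed / control_only_exposed
    se = math.sqrt(1/case_only_exposed + 1/control_only_exposed)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical: 60 pairs with only the case exposed, 30 with only the control
or_, lo, hi = matched_pairs_or(60, 30)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # → OR = 2.00 (95% CI 1.29-3.10)
```

Concordant pairs (both or neither exposed) carry no information about the exposure under matching, which is why they drop out of this estimator.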
Three months of weekly rifapentine and isoniazid for treatment of Mycobacterium tuberculosis infection in HIV-coinfected persons
Sterling TR , Scott NA , Miro JM , Calvet G , La Rosa A , Infante R , Chen MP , Benator DA , Gordin F , Benson CA , Chaisson RE , Villarino ME . AIDS 2016 30 (10) 1607-15 OBJECTIVE: To compare the effectiveness, tolerability, and safety of 3 months of weekly rifapentine and isoniazid under direct observation (3HP) versus 9 months of daily isoniazid (9H) in HIV-infected persons. DESIGN: Prospective, randomized, open-label noninferiority trial. SETTING: The United States, Brazil, Spain, Peru, Canada, and Hong Kong. PARTICIPANTS: HIV-infected persons who were tuberculin skin test positive or close contacts of tuberculosis cases. INTERVENTION: 3HP versus 9H. MAIN OUTCOME MEASURES: The effectiveness endpoint was tuberculosis; the noninferiority margin was 0.75%. The tolerability endpoint was treatment completion; the safety endpoint was drug discontinuation because of adverse drug reaction. RESULTS: Median baseline CD4 cell counts were 495 (IQR 389-675) and 538 (IQR 418-729) cells/μL in the 3HP and 9H arms, respectively (P = 0.09). In the modified intention-to-treat analysis, there were two tuberculosis cases among 206 persons [517 person-years (p-y) of follow-up] in the 3HP arm (0.39 per 100 p-y) and six tuberculosis cases among 193 persons (481 p-y of follow-up) in the 9H arm (1.25 per 100 p-y). Cumulative tuberculosis rates were 1.01% versus 3.50% in the 3HP and 9H arms, respectively (rate difference: -2.49%; upper bound of the 95% confidence interval of the difference: 0.60%). Treatment completion was higher with 3HP (89%) than 9H (64%) (P < 0.001), and drug discontinuation because of an adverse drug reaction was similar (3% vs. 4%; P = 0.79) in the 3HP and 9H arms, respectively. CONCLUSION: Among HIV-infected persons with a median CD4 cell count of approximately 500 cells/μL, 3HP was as effective and safe for treatment of latent Mycobacterium tuberculosis infection as 9H, and better tolerated. |
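The noninferiority conclusion above follows a simple decision rule on the risk-difference scale: the new regimen passes if the upper confidence bound of the event-rate difference stays below the prespecified margin. A sketch using the figures quoted in the abstract:

```python
def noninferior(upper_ci_of_diff, margin):
    """Noninferiority test on the risk-difference scale: the new regimen
    is noninferior if the upper confidence bound of (new - reference)
    stays below the prespecified margin."""
    return upper_ci_of_diff < margin

# From the abstract: rate difference -2.49%, upper 95% bound 0.60%, margin 0.75%
print(noninferior(0.60, 0.75))  # → True: 3HP noninferior to 9H
```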
Measles outbreak reveals measles susceptibility among adults in Namibia, 2009-2011
Ogbuanu IU , Muroua C , Allies M , Chitala K , Gerber S , Shilunga P , Mhata P , Kriss JL , Caparos L , Smit SB , De Wee RJ , Goodson JL . S Afr Med J 2016 106 (7) 715-720 BACKGROUND: The World Health Organization, African Region, set the goal of achieving measles elimination by 2020. Namibia was one of seven African countries to implement an accelerated measles control strategy beginning in 1996. Following implementation of this strategy, measles incidence decreased; however, between 2009 and 2011 a major outbreak occurred in Namibia. METHODS: Measles vaccination coverage data were analysed and a descriptive epidemiological analysis of the measles outbreak was conducted using measles case-based surveillance and laboratory data. RESULTS: During 1989-2008, MCV1 (the first routine dose of measles vaccine) coverage increased from 56% to 73%, and five supplementary immunisation activities were implemented. During the outbreak (August 2009 - February 2011), 4,605 suspected measles cases were reported; of these, 3,256 were confirmed by laboratory testing or epidemiological linkage. Opuwo, a largely rural district in north-western Namibia with nomadic populations, had the highest confirmed measles incidence (16,427 cases per million). Infants aged ≤11 months had the highest cumulative age-specific incidence (9,252 cases per million) and comprised 22% of all confirmed cases; however, cases occurred across a wide age range, including adults aged ≥30 years. Among confirmed cases, 85% were unvaccinated or had unknown vaccination history. The measles virus genotypes detected were predominantly B3, which circulated in concurrent outbreaks in southern Africa, and B2, previously detected in Angola. CONCLUSION: A large-scale measles outbreak with sustained transmission over 18 months occurred in Namibia, probably caused by importation. The wide age distribution of cases indicated that measles-susceptible individuals had accumulated over several decades prior to the start of the outbreak. |
Population health or individualized care in the global AIDS response: synergy or conflict?
El-Sadr WM , Rabkin M , DeCock KM . AIDS 2016 30 (14) 2145-8 Extraordinary progress has been achieved in confronting the global HIV epidemic. The number of people living with HIV (PLWH) accessing antiretroviral treatment (ART) in low- and middle-income countries rose from 400,000 in 2003 to 17 million in 2015 [1], and an estimated 7.8 million deaths have been averted by the scale-up of ART services [2]. Increased access to prevention and treatment has also led to a 35% drop in new HIV infections since 2000, including a 58% decrease amongst children [3]. | The majority of PLWH accessing ART in low-resource settings live in sub-Saharan Africa, a region with some of the world's weakest health systems. Despite austere settings, health worker shortages, dysfunctional supply chains and laboratories, and absent continuity-of-care systems, the HIV response has succeeded beyond expectations [4]. Although this success was built on the use of simple, standardized, and evidence-based approaches to HIV prevention and treatment, new global guidelines support the use of more individualized services [5]. While such a differentiated care strategy has the potential to improve both the quality and efficiency of HIV programs, this can only be accomplished if key elements of the public health approach that has been so successful over the past 20 years are retained. |
Positive predictive value of the WHO clinical and immunologic criteria to predict viral load failure among adults on first- or second-line antiretroviral therapy in Kenya
Waruru A , Muttai H , Ng'ang'a L , Ackers M , Kim A , Miruka F , Erick O , Okonji J , Ayuaya T , Schwarcz S . PLoS One 2016 11 (7) e0158881 Routine HIV viral load (VL) monitoring is the standard of care for persons receiving antiretroviral therapy (ART) in developed countries. Although the World Health Organization (WHO) recommends annual VL monitoring of patients on ART, it recognizes the difficulties of conducting routine VL testing and continues to recommend targeted VL testing to confirm treatment failure for persons who meet selected immunologic and clinical criteria. Studies have measured the positive predictive value (PPV), negative predictive value, sensitivity, and specificity of these criteria among patients receiving first-line ART, but not specifically among those on second-line or subsequent regimens. Between 2008 and 2011, adult ART patients in Nyanza, Kenya, who met national clinical or immunologic criteria for treatment failure received targeted VL testing. We calculated the PPV and 95% confidence intervals (CI) of these criteria to detect virologic treatment failure among patients receiving a) first-line ART, b) second-line/subsequent ART, and c) any regimen. Of 12,134 patient specimens tested, 2,874 (23.7%) were virologically confirmed as treatment failures. The PPV was 34.4% (95% CI 33.2-35.7) for the 2,834 first-line ART patients who met either the clinical or immunologic criteria for treatment failure, 33.1% (95% CI 24.7-42.3) for the 40 patients on second-line/subsequent regimens, and 33.4% (95% CI 33.1-35.6) for any ART. PPV, regardless of criteria, for first-line ART patients was lowest among patients over 44 years old and highest for patients aged 15 to 34 years. The PPV of immunological and clinical criteria for correctly identifying treatment failure was similarly low for adult patients receiving either first-line or second-line/subsequent ART regimens.
Our data confirm the inadequacy of clinical and immunologic criteria to correctly identify treatment failure and support the implementation of routine VL testing. |
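PPV in the study above is simply the proportion of criteria-flagged patients whose failure was virologically confirmed. A minimal sketch with hypothetical counts and a normal-approximation interval (the study's CIs may come from survey-appropriate methods):

```python
import math

def ppv_with_ci(confirmed_failures, flagged, z=1.96):
    """Positive predictive value (confirmed / flagged) with a
    normal-approximation 95% confidence interval."""
    p = confirmed_failures / flagged
    se = math.sqrt(p * (1 - p) / flagged)
    return p, p - z * se, p + z * se

# Hypothetical: 975 of 2,834 flagged patients virologically confirmed as failing
p, lo, hi = ppv_with_ci(975, 2834)
print(f"PPV = {100*p:.1f}% (95% CI {100*lo:.1f}-{100*hi:.1f})")  # → PPV = 34.4% (95% CI 32.7-36.2)
```

A PPV near one third, as reported, means roughly two of every three patients flagged by the clinical/immunologic criteria would be misclassified as failing without confirmatory VL testing.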
HIV infection care and viral suppression among people who inject drugs, 28 U.S. jurisdictions, 2012-2013
Karch DL , Gray KM , Shi J , Hall HI . Open AIDS J 2016 10 127-35 OBJECTIVES: To assess outcomes along the care continuum for HIV-infected people who inject drugs (PWID), by type of facility and stage of infection at diagnosis. METHODS: Data reported by 28 jurisdictions to the National HIV Surveillance System by December 2014 were used to identify PWID aged ≥13 years diagnosed with HIV infection before December 31, 2013. Analyses used the CDC definitions of linkage to care (LTC), retention in care (RIC), and viral suppression (VS), and are stratified by age, sex, race/ethnicity, and type of facility and stage of HIV infection at diagnosis. RESULTS: Of 1,409 PWID diagnosed with HIV in 2013, 1,116 (79.2%) were LTC, with the lowest percentages among males (78.4%); blacks (77.5%); those aged 13-24 years (69.0%); those diagnosed in early-stage infection (71.6%); and those diagnosed at screening, diagnostic, or referral agencies (60.0%). Of 80,958 PWID living with HIV in 2012, 40,234 (49.7%) were RIC and 34,665 (42.8%) achieved VS. The lowest percentages for RIC and VS were among males (47.1% and 41.3%, respectively); those diagnosed with late-stage disease (47.1% and 42.4%); and young people. Whites had the lowest RIC (47.0%), while blacks had the lowest VS (41.1%). CONCLUSION: Enhanced LTC activities are needed for PWID diagnosed at screening, diagnostic, or referral agencies compared with those diagnosed in inpatient or outpatient settings, especially among young people and blacks diagnosed in early-stage infection. Less than half of PWID are retained in care or reach viral suppression, indicating the need for continued engagement and return-to-care activities over the long term. |
Hepatotoxicity associated with weight loss or sports dietary supplements, including OxyELITE Pro - United States, 2013
Chatham-Stephens K , Taylor E , Chang A , Peterson A , Daniel J , Martin C , Deuster P , Noe R , Kieszak S , Schier J , Klontz K , Lewis L . Drug Test Anal 2016 9 (1) 68-74 In September 2013, the Hawaii Department of Health (HDOH) was notified of seven adults who developed acute hepatitis after taking OxyELITE Pro, a weight loss and sports dietary supplement. CDC assisted HDOH with their investigation, then conducted case-finding outside of Hawaii with FDA and the Department of Defense (DoD). We defined cases as acute hepatitis of unknown etiology that occurred from April 1, 2013, through December 5, 2013, following exposure to a weight loss or muscle-building dietary supplement, such as OxyELITE Pro. We conducted case-finding through multiple sources, including data from poison centers (National Poison Data System [NPDS]) and FDA MedWatch. We identified 40 case-patients in 23 states and two military bases with acute hepatitis of unknown etiology and exposure to a weight loss or muscle-building dietary supplement. Of 35 case-patients who reported their race, 15 (42.9%) reported white and 9 (25.7%) reported Asian. Commonly reported symptoms included jaundice, fatigue, and dark urine. Twenty-five (62.5%) case-patients reported taking OxyELITE Pro. Of these 25 patients, 17 of 22 (77.3%) with available data were hospitalized and 1 received a liver transplant. NPDS and FDA MedWatch each captured seven (17.5%) case-patients. Improving the ability to search surveillance systems like NPDS and FDA MedWatch for individual and grouped dietary supplements, as well as coordinating case-finding with DoD, may benefit ongoing surveillance efforts and future outbreak responses involving adverse health effects from dietary supplements. This investigation highlights opportunities and challenges in using multiple sources to identify cases of suspected supplement-associated adverse events. |
Serum testosterone concentrations and urinary bisphenol A, benzophenone-3, triclosan, and paraben levels in male and female children and adolescents: NHANES 2011-2012
Scinicariello F , Buser MC . Environ Health Perspect 2016 124 (12) 1898-1904 BACKGROUND: Exposure to environmental phenols (e.g., bisphenol A, benzophenone-3, and triclosan) and parabens is widespread in the population. Many of these chemicals have been shown to have anti-androgenic effects both in vitro and in vivo. OBJECTIVE: To examine the association of bisphenol A (BPA), benzophenone-3 (BP-3), triclosan (TCS), and parabens with serum total testosterone (TT) levels in child and adolescent participants (ages 6-19 years) in the National Health and Nutrition Examination Survey (NHANES) 2011-2012. METHODS: We performed multivariable linear regression to estimate associations between natural log-transformed serum TT and quartiles of urinary BPA, BP-3, TCS, and parabens in male and female children (ages 6-11 years) and adolescents (ages 12-19 years). RESULTS: BP-3 and BPA were associated with significantly lower TT in male adolescents, and BPA was associated with significantly higher TT in female adolescents. TT was not consistently associated with TCS or total parabens in children or adolescents of either sex. CONCLUSIONS: To our knowledge, this is the first study to report an association of BP-3 and BPA with serum TT in adolescents. Associations between BPA and TT differed by sex in adolescents, with inverse associations in boys and positive associations in girls. BP-3 was associated with significantly lower TT in adolescent boys only. However, because of the limitations inherent to the cross-sectional study design, further studies are needed to confirm and elucidate our findings. |
Medical toxicology and public health-update on research and activities at the Centers for Disease Control and Prevention and the Agency for Toxic Substances and Disease Registry: environmental exposures among Arctic populations: the Maternal Organics Monitoring Study in Alaska
Anwar M , Ridpath A , Berner J , Schier JG . J Med Toxicol 2016 12 (3) 315-7 Evidence suggests that in-utero exposure to environmental chemicals that can bioaccumulate in the mother, such as persistent organic pollutants (POPs), heavy metals, and radionuclides, may increase a newborn's risk of adverse developmental, neurological, and immunologic effects. Chemical contamination of bodies of water and strong ocean currents worldwide can drive these chemicals from lower latitudes to Arctic waters, where they accumulate in common traditional subsistence foods. In response to concerns of Alaskans about the effects of bioaccumulated chemicals on their children, the Maternal Organics Monitoring Study (MOMS) was developed. The objective of the study was to assess the risks and benefits associated with the population's subsistence diet. Analysis of biological samples at CDC's NCEH laboratory and of maternal questionnaire data is ongoing. Results will be provided to Alaska Native communities to help support public health actions and inform future interventions and research. |
Outdoor PM2.5, ambient air temperature, and asthma symptoms in the past 14 days among adults with active asthma
Mirabelli MC , Vaidyanathan A , Flanders WD , Qin X , Garbe P . Environ Health Perspect 2016 124 (12) 1882-1890 BACKGROUND: Relationships between air quality and health are well-described, but little information is available about the joint associations between particulate air pollution, ambient temperature, and respiratory morbidity. OBJECTIVES: To evaluate associations between concentrations of particulate matter ≤2.5 microns in diameter (PM2.5) and exacerbation of existing asthma and modification of the associations by ambient air temperature. METHODS: Data from 50,356 adult 2006-2010 Asthma Call-back Survey respondents were linked by interview date and county of residence to estimates of daily averages of PM2.5 and maximum air temperature. Associations between 14-day average PM2.5 and the presence of any asthma symptoms during the 14 days leading up to and including the interview date were evaluated using binomial regression. We explored variation by air temperature using similar models, stratified into quintiles of the 14-day average maximum temperature. RESULTS: Among adults with active asthma, 57.1% reported asthma symptoms within the past 14 days, and 14-day average PM2.5 ≥7.07 µg/m3 was associated with an estimated 4 to 5% higher asthma symptom prevalence. In the range of 4.00 to 7.06 µg/m3 of PM2.5, each µg/m3 increase was associated with a 3.4% (95% confidence interval: 1.1, 5.7) increase in symptom prevalence; across categories of temperature from 1.1 to 80.5 degrees F, each µg/m3 increase was associated with increased symptom prevalence (1.1-44.4 degrees F: 7.9%; 44.5-58.6 degrees F: 6.9%; 58.7-70.1 degrees F: 2.9%; 70.2-80.5 degrees F: 7.3%). CONCLUSIONS: These results suggest that each unit increase in PM2.5 may be associated with an increase in the prevalence of asthma symptoms, even at levels as low as 4.00 to 7.06 µg/m3. |
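The per-unit percent increases above come from a binomial regression with a log link, where each coefficient is a log prevalence ratio. A minimal sketch of the back-transformation, using a hypothetical coefficient chosen to reproduce the reported 3.4% (not a value from the study):

```python
import math

def percent_change_per_unit(beta: float) -> float:
    """Convert a log-link binomial regression coefficient (per 1-unit
    increase in PM2.5) into a percent change in symptom prevalence."""
    return (math.exp(beta) - 1.0) * 100.0

# Hypothetical coefficient, chosen to match the reported 3.4% increase
beta = 0.0334
print(round(percent_change_per_unit(beta), 1))  # 3.4
```

The same back-transformation applies to each of the temperature-stratified estimates, since all are reported as percent changes per unit PM2.5.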
Prenatal PBDE and PCB exposures and reading, cognition, and externalizing behavior in children
Zhang H , Yolton K , Webster GM , Sjodin A , Calafat AM , Dietrich KN , Xu Y , Xie C , Braun JM , Lanphear BP , Chen A . Environ Health Perspect 2016 125 (4) 746-752 BACKGROUND: Prenatal polybrominated diphenyl ethers (PBDEs) and polychlorinated biphenyls (PCBs) exposures may influence children's neurodevelopment. OBJECTIVE: To examine the association of prenatal PBDE and PCB exposures with children's reading skills at ages 5 and 8 years, and with Full-Scale Intelligence Quotient (FSIQ) and Externalizing Problems at age 8 years. METHODS: From 239 mother-child pairs recruited (2003-2006) in Cincinnati, OH, we measured maternal serum PBDE and PCB concentrations, assessed children's reading skills using the Woodcock-Johnson Tests of Achievement III (WJ-III) at age 5 years and the Wide Range Achievement Test-4 (WRAT-4) at age 8 years, tested FSIQ using the Wechsler Intelligence Scale for Children-IV (WISC-IV), and assessed Externalizing Problems using the Behavioral Assessment System for Children-2 (BASC-2) at age 8 years. We used multiple linear regression to examine the associations of prenatal PBDE and PCB concentrations with reading, FSIQ, and Externalizing Problems after adjustment for covariates. RESULTS: A 10-fold increase in Sum4PBDEs (BDE-47, -99, -100 and -153) was not significantly associated with reading scores at age 5 years at the p=0.05 level, but was inversely associated with Reading Composite scores (beta: -6.2, 95% CI: -11.7, -0.6) and FSIQ (beta: -5.3, 95% CI: -10.6, -0.02) at age 8 years; it was positively associated with Externalizing Problems score (beta: 3.5, 95% CI: -0.1, 7.2) at age 8 years. Prenatal Sum4PCBs (PCB-118, -153, -138-158, and -180) was not significantly associated with children's reading skills, FSIQ, or Externalizing Problems. CONCLUSION: Prenatal PBDE concentration was inversely associated with reading skills and FSIQ, and positively associated with Externalizing Problems, at age 8 years. No significant associations were found for prenatal PCB concentration. |
Quantification of toxins in soapberry (Sapindaceae) arils: hypoglycin A and methylenecyclopropylglycine
Isenberg SL , Carter MD , Hayes SR , Graham LA , Johnson D , Mathews TP , Harden LA , Takeoka GR , Thomas JD , Pirkle JL , Johnson RC . J Agric Food Chem 2016 64 (27) 5607-13 Methylenecyclopropylglycine (MCPG) and hypoglycin A (HGA) are naturally occurring amino acids found in some soapberry fruits. Fatalities have been reported worldwide as a result of HGA ingestion, and exposure to MCPG has been implicated recently in the Asian outbreaks of hypoglycemic encephalopathy. In response to an outbreak linked to soapberry ingestion, the authors developed the first method to simultaneously quantify MCPG and HGA in soapberry fruits from 1 to 10000 ppm of both toxins in dried fruit aril. Further, this is the first report of HGA in litchi, longan, and mamoncillo arils. This method is presented to specifically address the laboratory needs of public-health investigators in the hypoglycemic encephalopathy outbreaks linked to soapberry fruit ingestion. |
Assessment of the quality, effectiveness, and acceptability of ceramic water filters in Tanzania
Lemons A , Branz A , Kimirei M , Hawkins T , Lantagne D . J Water Sanit Hyg Dev 2016 6 (2) 195-204 Globally, approximately two billion people drink contaminated water. Use of household water treatment (HWT) methods, such as locally manufactured ceramic filters, reduces the diarrheal disease burden associated with unclean water. We evaluated the quality, effectiveness, and acceptability of ceramic filters in two communities in Arusha, Tanzania, by conducting: 1) baseline household surveys with 50 families; 2) filter flow rate testing; 3) filter distribution with training sessions; 4) follow-up surveys at 2, 4, and 6 weeks after distribution; and 5) project-end focus group discussions. We tested Escherichia coli (E. coli) and turbidity at baseline and the first two follow-ups. We found: 1) filter quality was low, as only 46% of filters met recommended flow rate guidelines and 18% of filters broke during the 6-week study; 2) filter effectiveness was moderate, with 8% and 35% of filters effectively reducing E. coli to <1 CFU/100 mL and <10 CFU/100 mL, respectively, at follow-ups; and 3) filter acceptability was high, with 94% overall satisfaction and 96–100% reported use in the previous day. These results highlight the importance of mixed methods research, as HWT product quality, effectiveness, and acceptability all impact product efficacy, and the need for quality assurance/quality control and certification schemes for locally manufactured HWT products. © IWA Publishing 2016. |
High-Quality Draft Genome Sequences for Five Non-O157 Shiga Toxin-Producing Escherichia coli Strains Generated with PacBio Sequencing and Optical Maps.
Lindsey RL , Rowe L , Garcia-Toledo L , Loparev V , Knipe K , Stripling D , Martin H , Trees E , Juieng P , Batra D , Strockbine N . Genome Announc 2016 4 (3) Shiga toxin-producing Escherichia coli (STEC) is a foodborne pathogen. We report here the high-quality draft whole-genome sequences of five STEC strains isolated from clinical cases in the United States. This report is for STEC of serotypes O55:H7, O79:H7, O91:H14, O153:H2, and O156:H25. |
Virus fitness differences observed between two naturally occurring isolates of Ebola virus Makona variant using a reverse genetics approach.
Albarino CG , Guerrero LW , Chakrabarti AK , Kainulainen MH , Whitmer SL , Welch SR , Nichol ST . Virology 2016 496 237-243 During the large outbreak of Ebola virus disease that occurred in Western Africa from late 2013 to early 2016, several hundred Ebola virus (EBOV) genomes have been sequenced and the virus genetic drift analyzed. In a previous report, we described an efficient reverse genetics system designed to generate recombinant EBOV based on a Makona variant isolate obtained in 2014. Using this system, we characterized the replication and fitness of 2 isolates of the Makona variant. These virus isolates are nearly identical at the genetic level, but have single amino acid differences in the VP30 and L proteins. The potential effects of these differences were tested using minigenomes and recombinant viruses. The results obtained with this approach are consistent with the role of VP30 and L as components of the EBOV RNA replication machinery. Moreover, the 2 isolates exhibited clear fitness differences in competitive growth assays. |
The use of whole genome sequencing for detecting antimicrobial resistance in nontyphoidal Salmonella.
McDermott PF , Tyson GH , Kabera C , Chen Y , Li C , Folster JP , Ayers SL , Lam C , Tate HP , Zhao S . Antimicrob Agents Chemother 2016 60 (9) 5515-20 Laboratory-based in vitro antimicrobial susceptibility testing is the foundation for guiding anti-infective therapy and monitoring antimicrobial resistance trends. We used whole-genome sequencing (WGS) technology to identify known antimicrobial resistance determinants among strains of nontyphoidal Salmonella and correlated these with susceptibility phenotypes to evaluate the utility of WGS for antimicrobial resistance surveillance. Six hundred forty Salmonella of 43 different serotypes were selected from among retail meat and human clinical isolates that were tested for susceptibility to 14 antimicrobials using broth microdilution. The minimum inhibitory concentration (MIC) for each drug was used to categorize isolates as susceptible or resistant based on Clinical and Laboratory Standards Institute clinical breakpoints or NARMS consensus interpretive criteria. Each isolate was subjected to whole-genome shotgun sequencing, and resistance genes were identified from assembled sequences. A total of 65 unique resistance genes, plus mutations in two structural resistance loci, were identified. There were more unique resistance genes (n=59) in the 104 human isolates than in the 536 retail meat isolates (n=36). Overall, resistance genotypes and phenotypes correlated in 99.0% of cases. Correlations approached 100% for most classes of antibiotics, but were lower for aminoglycosides and beta-lactams. We report the first finding of ESBLs (blaCTX-M1 and blaSHV2a) in retail meat isolates of Salmonella in the U.S. Whole-genome sequencing is an effective tool for predicting antibiotic resistance in nontyphoidal Salmonella, although the use of more appropriate surveillance breakpoints and increased knowledge of new resistance alleles will further improve correlations. |
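The 99.0% genotype-phenotype correlation reported above is, at its simplest, the fraction of isolate-drug pairs where the WGS-based resistance prediction matches the broth-microdilution category. A minimal sketch with made-up calls (the data below are illustrative, not the study's):

```python
def concordance(genotype_calls, phenotype_calls):
    """Percent of isolate-drug pairs where the WGS-predicted category
    ('R' resistant / 'S' susceptible) matches the phenotypic call."""
    assert len(genotype_calls) == len(phenotype_calls)
    matches = sum(g == p for g, p in zip(genotype_calls, phenotype_calls))
    return 100.0 * matches / len(genotype_calls)

# Illustrative calls for 10 isolate-drug pairs (one discordant pair)
geno = ['R', 'S', 'S', 'R', 'S', 'S', 'R', 'S', 'S', 'S']
pheno = ['R', 'S', 'S', 'R', 'S', 'R', 'R', 'S', 'S', 'S']
print(concordance(geno, pheno))  # 90.0
```

In practice such comparisons are tabulated per antimicrobial class, which is how the lower aminoglycoside and beta-lactam correlations in the abstract would surface.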
Human prion diseases: surgical lessons learned from iatrogenic prion transmission
Bonda DJ , Manjila S , Mehndiratta P , Khan F , Miller BR , Onwuzulike K , Puoti G , Cohen ML , Schonberger LB , Cali I . Neurosurg Focus 2016 41 (1) E10 The human prion diseases, or transmissible spongiform encephalopathies, have captivated our imaginations since their discovery in the Fore linguistic group in Papua New Guinea in the 1950s. The mysterious and poorly understood "infectious protein" has become somewhat of a household name in many regions across the globe. From bovine spongiform encephalopathy (BSE), commonly identified as mad cow disease, to endocannibalism, media outlets have capitalized on these devastatingly fatal neurological conditions. Interestingly, since their discovery, there have been more than 492 incidents of iatrogenic transmission of prion diseases, largely resulting from prion-contaminated growth hormone and dura mater grafts. Although fewer than 9 probable iatrogenic neurosurgical cases of Creutzfeldt-Jakob disease (CJD) have been reported worldwide, the likelihood of some missed cases and the potential for prion transmission by neurosurgery create considerable concern. Laboratory studies indicate that standard decontamination and sterilization procedures may be insufficient to completely remove infectivity from prion-contaminated instruments. If decontamination fails, such instruments may transmit prion disease to subsequent patients. Much caution therefore should be taken in the absence of strong evidence against the presence of a prion disease in a neurosurgical patient. While the Centers for Disease Control and Prevention (CDC) and World Health Organization (WHO) have devised risk assessment and decontamination protocols for the prevention of iatrogenic transmission of the prion diseases, incidents of possible exposure to prions have unfortunately occurred in the United States. 
In this article, the authors outline the historical discoveries that led from kuru to the identification and isolation of the pathological prion proteins in addition to providing a brief description of human prion diseases and iatrogenic forms of CJD, a brief history of prion disease nosocomial transmission, and a summary of the CDC and WHO guidelines for prevention of prion disease transmission and decontamination of prion-contaminated neurosurgical instruments. |
Clinical correlates of surveillance events detected by National Healthcare Safety Network Pneumonia and Lower Respiratory Infection Definitions - Pennsylvania, 2011-2012
See I , Chang J , Gualandi N , Buser GL , Rohrbach P , Smeltz DA , Bellush MJ , Coffin SE , Gould JM , Hess D , Hennessey P , Hubbard S , Kiernan A , O'Donnell J , Pegues DA , Miller JR , Magill SS . Infect Control Hosp Epidemiol 2016 37 (7) 818-24 OBJECTIVE: To determine the clinical diagnoses associated with National Healthcare Safety Network (NHSN) pneumonia (PNEU) or lower respiratory infection (LRI) surveillance events. DESIGN: Retrospective chart review. SETTING: A convenience sample of 8 acute-care hospitals in Pennsylvania. PATIENTS: All patients hospitalized during 2011-2012. METHODS: Medical records were reviewed from a random sample of patients reported to the NHSN to have PNEU or LRI, excluding adults with ventilator-associated PNEU. Documented clinical diagnoses corresponding temporally to the PNEU and LRI events were recorded. RESULTS: We reviewed 250 (30%) of 838 eligible PNEU and LRI events reported to the NHSN; 29 reported events (12%) fulfilled neither PNEU nor LRI case criteria. Differences in interpreting radiology reports accounted for most misclassifications. Of 81 PNEU events in adults not on mechanical ventilation, 84% had clinician-diagnosed pneumonia; of these, 25% were attributed to aspiration. Of 43 adult LRI, 88% were in mechanically ventilated patients and 35% had no corresponding clinical diagnosis (infectious or noninfectious) documented at the time of LRI. Of 36 pediatric PNEU events, 72% were ventilator associated, and 70% corresponded to a clinical pneumonia diagnosis. Of 61 pediatric LRI patients, 84% were mechanically ventilated and 21% had no corresponding clinical diagnosis documented. CONCLUSIONS: In adults not on mechanical ventilation and in children, most NHSN-defined PNEU events corresponded with compatible clinical conditions documented in the medical record. In contrast, NHSN LRI events often did not. As a result, substantial modifications to the LRI definitions were implemented in 2015. |
Tetanus, diphtheria, and acellular pertussis vaccination among women of childbearing age - United States, 2013
O'Halloran AC , Lu PJ , Williams WW , Ding H , Meyer SA . Am J Infect Control 2016 44 (7) 786-93 The incidence of pertussis in the United States has increased since the 1990s. Tetanus, diphtheria, and acellular pertussis (Tdap) vaccination of pregnant women provides passive protection to infants. Tdap vaccination is currently recommended for pregnant women during each pregnancy, but coverage among pregnant women and women of childbearing age has been suboptimal. Data from the 2013 Behavioral Risk Factor Surveillance System (BRFSS) and 2013 National Health Interview Survey (NHIS) were used to determine national and state-specific Tdap vaccination coverage among women of childbearing age by self-reported pregnancy status at the time of the survey. Although this study could not assess coverage of Tdap vaccination received during pregnancy because questions on whether Tdap vaccination was received during pregnancy were not asked in BRFSS and NHIS, demographic and access-to-care factors associated with Tdap vaccination coverage in this population were assessed. Tdap vaccination coverage among all women 18-44 years old was 38.4% based on the BRFSS and 23.3% based on the NHIS. Overall, coverage did not differ by pregnancy status at the time of the survey. Coverage among all women 18-44 years old varied widely by state. Age, race and ethnicity, education, number of children in the household, and access-to-care characteristics were independently associated with Tdap vaccination in both surveys. We identified associations of demographic and access-to-care characteristics with Tdap vaccination that can guide strategies to improve vaccination rates in women during pregnancy. |
Preventing vaccine-derived poliovirus emergence during the polio endgame
Pons-Salort M , Burns CC , Lyons H , Blake IM , Jafari H , Oberste MS , Kew OM , Grassly NC . PLoS Pathog 2016 12 (7) e1005728 Reversion and spread of vaccine-derived poliovirus (VDPV) to cause outbreaks of poliomyelitis is a rare outcome resulting from immunisation with the live-attenuated oral poliovirus vaccines (OPVs). Global withdrawal of all three OPV serotypes is therefore a key objective of the polio endgame strategic plan, starting with serotype 2 (OPV2) in April 2016. Supplementary immunisation activities (SIAs) with trivalent OPV (tOPV) in advance of this date could mitigate the risks of OPV2 withdrawal by increasing serotype-2 immunity, but may also create new serotype-2 VDPV (VDPV2). Here, we examine the risk factors for VDPV2 emergence and implications for the strategy of tOPV SIAs prior to OPV2 withdrawal. We first developed mathematical models of VDPV2 emergence and spread. We found that in settings with low routine immunisation coverage, the implementation of a single SIA increases the risk of VDPV2 emergence. If routine coverage is 20%, at least 3 SIAs are needed to bring that risk close to zero, and if SIA coverage is low or there are persistently "missed" groups, the risk remains high despite the implementation of multiple SIAs. We then analysed data from Nigeria on the 29 VDPV2 emergences that occurred during 2004-2014. Districts reporting the first case of poliomyelitis associated with a VDPV2 emergence were compared to districts with no VDPV2 emergence in the same 6-month period using conditional logistic regression. In agreement with the model results, the odds of VDPV2 emergence decreased with higher routine immunisation coverage (odds ratio 0.67 for a 10% absolute increase in coverage [95% confidence interval 0.55-0.82]). We also found that the probability of a VDPV2 emergence resulting in poliomyelitis in >1 child was significantly higher in districts with low serotype-2 population immunity. 
Our results support a strategy of focused tOPV SIAs before OPV2 withdrawal in areas at risk of VDPV2 emergence and in sufficient number to raise population immunity above the threshold permitting VDPV2 circulation. A failure to implement this risk-based approach could mean these SIAs actually increase the risk of VDPV2 emergence and spread. |
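Because conditional logistic regression works on the log-odds scale, the odds ratio of 0.67 per 10% absolute increase in routine coverage reported above scales multiplicatively for larger increases. A minimal sketch of that rescaling (the 20% increment is illustrative, and the log-linearity is an assumption of the model):

```python
def scaled_odds_ratio(or_per_10pct: float, coverage_increase_pct: float) -> float:
    """Rescale an odds ratio reported per 10% absolute coverage increase
    to an arbitrary increase, assuming log-linearity in coverage."""
    return or_per_10pct ** (coverage_increase_pct / 10.0)

# OR = 0.67 per 10% increase, as reported for VDPV2 emergence
print(round(scaled_odds_ratio(0.67, 20.0), 3))  # 0.449
```

So, under the model, a district raising routine coverage by 20 percentage points would have less than half the odds of a VDPV2 emergence.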
Experience of integrating vitamin A supplementation into polio campaigns in the African Region
Chehab ET , Anya BM , Onyango AW , Tevi-Benissan MC , Okeibunor J , Mkanda P , Mihigo R . Vaccine 2016 34 (43) 5199-5202 INTRODUCTION: Vitamin A deficiency is a public health problem that affects children across the WHO African Region. Countries have integrated vitamin A supplementation into different child health interventions, most notably polio campaigns. The integration of vitamin A into polio campaigns was documented as a best practice in Angola, Chad, Cote d'Ivoire, Tanzania, and Togo. There are potential risks to vitamin A supplementation associated with the polio endgame and certification in the African Region. METHODS: We reviewed the findings from the documentation of best practices assessment that was conducted by the WHO Regional Office for Africa in 2014 and 2015 in the five countries that noted integration of vitamin A with polio as a best practice. In addition, we reviewed the coverage rates for oral poliovirus vaccine and vitamin A supplementation in Angola, Chad, Cote d'Ivoire, Tanzania, and Togo in 2014 and 2015. RESULTS: Vitamin A deficiency in 2004 ranged from 35% in Togo to as high as 55% in Angola. All five countries integrated vitamin A supplementation into at least one campaign in 2013-2014, and all achieved over 80% coverage for vitamin A supplementation when it was integrated with polio. DISCUSSION: Given the progress of the polio program and the decreasing number of campaigns, there is a risk that fewer children will be reached each year with vitamin A supplementation. We recommend that countries strengthen the integration of vitamin A supplementation with routine immunization services. |
Influenza vaccine effectiveness against antigenically drifted influenza higher than expected in hospitalized adults: 2014-2015
Petrie JG , Ohmit SE , Cheng CK , Martin ET , Malosh RE , Lauring AS , Lamerato LE , Reyes KC , Flannery B , Ferdinands JM , Monto AS . Clin Infect Dis 2016 63 (8) 1017-25 BACKGROUND: The 2014-2015 influenza season was severe, with widespread circulation of influenza A (H3N2) viruses that were antigenically drifted from the vaccine virus. Reported vaccine effectiveness (VE) estimates from ambulatory care settings were markedly decreased. METHODS: Adults hospitalized at two hospitals in southeast Michigan for acute respiratory illnesses of ≤10 days' duration, defined by admission diagnoses, were prospectively enrolled. Throat and nasal swab specimens were collected, combined, and tested for influenza by RT-PCR. VE was estimated by comparing the vaccination status of those who tested positive for influenza with those who tested negative in logistic regression models adjusted for age, sex, hospital, calendar time, time from illness onset to specimen collection, frailty score, and Charlson Comorbidity Index (CCI). RESULTS: Among 624 patients included in the analysis, 421 (68%) were considered vaccinated, 337 (54%) were female, 220 (35%) were age ≥65 years, and 92% had CCI >0, indicating ≥1 comorbid condition. In total, 98 (16%) patients tested positive for influenza A (H3N2); among 60 (61%) A (H3N2) viruses tested by pyrosequencing, 53 (88%) belonged to the drifted 3C.2a genetic group. Adjusted VE was 43% (95% CI: 4 to 67) against influenza A (H3N2); 40% (95% CI: -13 to 68) for those <65 years of age and 48% (95% CI: -33 to 80) for those ≥65. Sensitivity analyses largely supported these estimates. CONCLUSIONS: VE estimates appeared higher than reports from similar studies in ambulatory care settings, suggesting that the 2014-15 vaccine may have been more effective in preventing severe illness requiring hospitalization. |
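In a test-negative design like this, VE is estimated as (1 - OR) x 100, where the OR compares vaccination odds among influenza-positive versus influenza-negative patients. A minimal crude (unadjusted) sketch with hypothetical 2x2 cell counts; the study's actual estimates were adjusted in logistic regression models, and the cells below are not the study's data:

```python
def vaccine_effectiveness(vacc_pos, unvacc_pos, vacc_neg, unvacc_neg):
    """Crude test-negative-design VE: (1 - odds ratio) * 100, where the
    OR compares vaccination odds in test-positives vs test-negatives."""
    odds_ratio = (vacc_pos * unvacc_neg) / (unvacc_pos * vacc_neg)
    return (1.0 - odds_ratio) * 100.0

# Hypothetical 2x2 counts (flu-positive vs flu-negative, by vaccination)
ve = vaccine_effectiveness(vacc_pos=50, unvacc_pos=48, vacc_neg=371, unvacc_neg=155)
print(round(ve, 1))  # 56.5
```

Covariate adjustment (age, frailty, comorbidity, etc.) replaces the crude OR with a model-based one, which is why the reported 43% cannot be reproduced from marginal counts alone.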
Adverse event reports following yellow fever vaccination, 2007-13
Lindsey NP , Rabe IB , Miller ER , Fischer M , Staples JE . J Travel Med 2016 23 (5) BACKGROUND: Yellow fever (YF) vaccines have been available since the 1930s and are generally considered safe and effective. However, rare reports of serious adverse events (SAE) following vaccination have prompted the Advisory Committee on Immunization Practices to periodically expand the list of conditions considered contraindications and precautions to vaccination. METHODS: We describe adverse events following YF vaccination reported to the U.S. Vaccine Adverse Event Reporting System (VAERS) from 2007 through 2013 and calculate age- and sex-specific reporting rates of all SAE, anaphylaxis, YF vaccine-associated neurologic disease (YEL-AND), and YF vaccine-associated viscerotropic disease (YEL-AVD). RESULTS: There were 938 adverse events following YF vaccination reported to VAERS from 2007 through 2013. Of these, 84 (9%) were classified as SAEs, for a rate of 3.8 per 100 000 doses distributed. Reporting rates of SAEs increased with increasing age, with a rate of 6.5 per 100 000 in persons aged 60-69 years and 10.3 per 100 000 in persons aged ≥70 years. The reporting rate for anaphylaxis was 1.3 per 100 000 doses distributed and was highest in persons aged ≤18 years (2.7 per 100 000). Reporting rates of YEL-AND and YEL-AVD were 0.8 and 0.3 per 100 000 doses distributed, respectively; both rates increased with increasing age. CONCLUSIONS: These findings reinforce the generally acceptable safety profile of YF vaccine but highlight the importance of continued physician and traveller education regarding the risks and benefits of YF vaccination, particularly for older travellers. |
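Reporting rates like those above are computed from passive-surveillance counts over doses distributed. A minimal sketch; the dose denominator below is hypothetical, back-calculated to be roughly consistent with the reported 3.8 per 100 000, not a figure from the study:

```python
def reporting_rate_per_100k(events: int, doses_distributed: int) -> float:
    """Passive-surveillance reporting rate per 100,000 doses distributed."""
    return events / doses_distributed * 100_000

# 84 SAEs over a hypothetical ~2.21 million doses distributed
print(round(reporting_rate_per_100k(84, 2_210_000), 1))  # 3.8
```

Because VAERS is a passive system, these are reporting rates, not incidence rates: under-reporting means the true event rates may differ.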
Suicide among people with epilepsy: A population-based analysis of data from the U.S. National Violent Death Reporting System, 17 states, 2003-2011
Tian N , Cui W , Zack M , Kobau R , Fowler KA , Hesdorffer DC . Epilepsy Behav 2016 61 210-217 OBJECTIVE: This study analyzed suicide data in the general population from the U.S. National Violent Death Reporting System (NVDRS) to investigate the suicide burden among those with epilepsy and risk factors associated with suicide, and to suggest measures to prevent suicide among people with epilepsy. METHODS: The NVDRS is a multiple-state, population-based, active surveillance system that collects information on violent deaths, including suicide. Among people 10 years old and older, we identified 972 suicide cases with epilepsy and 81,529 suicide cases without epilepsy in 17 states from 2003 through 2011. We estimated their suicide rates, evaluated suicide risk among people with epilepsy, and investigated suicide risk factors specific to epilepsy by comparing those with and without epilepsy. In 16 of the 17 states providing continual data from 2005 through 2011, we also compared suicide trends in people with epilepsy (n=833) and without epilepsy (n=68,662). RESULTS: From 2003 through 2011, the estimated annual suicide mortality rate among people with epilepsy was 16.89 per 100,000 persons, 22% higher than that in the general population. Compared with those without epilepsy, those with epilepsy were more likely to have died from suicide in houses, apartments, or residential institutions (81% vs. 76%, respectively) and were twice as likely to poison themselves (38% vs. 17%) (P<0.01). More of those with epilepsy aged 40-49 died from suicide than comparably aged persons without epilepsy (29% vs. 22%) (P<0.01). The proportion of suicides among those with epilepsy increased steadily from 2005 through 2010, peaking significantly in 2010 before falling. SIGNIFICANCE: For the first time, the suicide rate among people with epilepsy in a large U.S. general population was estimated, and the suicide risk exceeded that in the general population. 
Suicide prevention efforts should target people with epilepsy 40-49 years old. Additional preventive efforts include reducing the availability of, and exposure to, poisons, especially at home, and supporting other evidence-based programs to reduce the mental illness comorbidity associated with suicide. |
Associations of teen dating violence victimization with school violence and bullying among US high school students
Vivolo-Kantor AM , Olsen EO , Bacon S . J Sch Health 2016 86 (8) 620-7 BACKGROUND: Teen dating violence (TDV) negatively impacts health, mental and physical well-being, and school performance. METHODS: Data from a nationally representative sample of high school students participating in the Centers for Disease Control and Prevention (CDC)'s 2013 Youth Risk Behavior Survey (YRBS) are used to demonstrate associations of physical and sexual TDV with school violence-related experiences and behaviors, including bullying victimization. Bivariate and adjusted sex-stratified regressions assessed relationships between TDV and school violence-related experiences and behaviors. RESULTS: Compared to students not reporting TDV, those experiencing both physical and sexual TDV were more likely to report carrying a weapon at school, missing school because they felt unsafe, being threatened or injured with a weapon on school property, being in a physical fight at school, and being bullied on school property. CONCLUSIONS: School-based prevention efforts should target multiple forms of violence. |
Multiplex assay for subtyping avian influenza A viruses by cDNA hybridization and adapter-mediated amplification.
Yang G , Jones J , Jang Y , Davis CT . Appl Microbiol Biotechnol 2016 100 (20) 8809-18 Multiple subtypes of influenza A viruses circulating in animals must be closely monitored to understand their risk to humans and animal populations. Many molecular-based subtyping methods require constant monitoring of viral genomes for primer and/or probe mismatches and are prone to primer-primer interactions. This report presents a new approach that involves target enrichment through cDNA hybridization followed by adapter-mediated amplification for subtyping influenza virus (AmASIV). As a proof of concept, the AmASIV assay was multiplexed to specifically detect and differentiate influenza A virus subtypes (H5, N5, N7, and N9) in a single reaction without cross-recognition of nontarget subtypes or influenza B virus. The limit of detection (LOD) of AmASIV, as measured by 50% egg-infective doses per reaction (EID50/reaction), was comparable to that of singleplex TaqMan® qPCR assays, with LODs of 10^-0.6 (H5), 10^2 (N5), 10^-0.3 (N7), and 10^-0.5 (N9) EID50/reaction. The AmASIV assay will strengthen animal influenza virus surveillance and laboratory capacity to improve prevention and control of influenza. |
Quantitative Real-time PCR Assays for Detection and Type-Specific Identification of the Endemic Species C Human Adenoviruses.
Lu X , Erdman DD . J Virol Methods 2016 237 174-178 Human adenoviruses (HAdVs) are medically important respiratory pathogens. Among the 7 recognized species (A-G), species C HAdVs (serotypes 1, 2, 5 and 6) are globally endemic and infect most people early in life. Species C HAdV infections are most often subclinical or mild and can lead to persistent shedding from the gastrointestinal and upper respiratory tracts. They can also cause severe disseminated disease in newborn and immunocompromised persons, where rapid and quantitative detection and identification of the virus would help guide therapeutic intervention. To this end, we developed quantitative type-specific real-time PCR (qPCR) assays for HAdV-1, -2, -5 and -6 targeting the HAdV hexon gene. All type-specific qPCR assays reproducibly detected as few as 5 copies/reaction of quantified hexon recombinant plasmids with a linear dynamic range of 8 log units (5 to 5×10^7 copies). No non-specific amplifications were observed with concentrated nucleic acid from other HAdV types or other common respiratory pathogens. Of 199 previously typed HAdV field isolates and positive clinical specimens, all were detected and correctly identified to type by the qPCR assays; 10 samples had 2 HAdV types and 1 sample had 3 types identified, which were confirmed by amplicon sequencing. The species C HAdV qPCR assays permit rapid, sensitive, specific, and quantitative detection and identification of four recognized endemic HAdVs. Together with our previously developed qPCR assays for the epidemic respiratory HAdVs, these assays provide a convenient alternative to classical typing methods. |
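Quantification across a linear dynamic range like the 8 log units described relies on a standard curve relating Ct to log10 copy number. A minimal sketch, where the slope of -3.32 (100% amplification efficiency) and the intercept of 40 are illustrative assumptions, not the assay's fitted values:

```python
def copies_from_ct(ct: float, intercept: float = 40.0, slope: float = -3.32) -> float:
    """Back-calculate template copies/reaction from a qPCR Ct value using
    a linear standard curve: Ct = intercept + slope * log10(copies)."""
    return 10 ** ((ct - intercept) / slope)

# On this hypothetical curve, a Ct of 30.04 corresponds to 10^3 copies
print(round(copies_from_ct(30.04)))  # 1000
```

In practice the slope and intercept come from fitting a dilution series of the quantified plasmid standards, and a slope near -3.32 confirms near-100% PCR efficiency.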
Murine precision-cut lung slices exhibit acute responses following exposure to gasoline direct injection engine emissions
Maikawa CL , Zimmerman N , Rais K , Shah M , Hawley B , Pant P , Jeong CH , Delgado-Saborit JM , Volckens J , Evans G , Wallace JS , Godri Pollitt KJ . Sci Total Environ 2016 568 1102-1109 Gasoline direct injection (GDI) engines are increasingly prevalent in the global vehicle fleet. Particulate matter emissions from GDI engines are elevated compared to conventional gasoline engines. The pulmonary effects of these higher particulate emissions are unclear. This study investigated the pulmonary responses induced by GDI engine exhaust using an ex vivo model. The physicochemical properties of GDI engine exhaust were assessed. Precision-cut lung slices were prepared using Balb/c mice to evaluate the pulmonary response induced by one-hour exposure to engine-out exhaust from a laboratory GDI engine operated at conditions equivalent to vehicle highway cruise conditions. Lung slices were exposed at an air-liquid interface using an electrostatic aerosol in vitro exposure system. Particulate and gaseous exhaust was fractionated to contrast mRNA production related to polycyclic aromatic hydrocarbon (PAH) metabolism and oxidative stress. Exposure to GDI engine exhaust upregulated genes involved in PAH metabolism, including Cyp1a1 (2.71, SE=0.22) and Cyp1b1 (3.24, SE=0.12), compared to HEPA-filtered air (p<0.05). GDI engine exhaust further increased Cyp1b1 expression compared to filtered GDI engine exhaust (i.e., gas fraction only), suggesting this response was associated with the particulate fraction. Exhaust particulate was dominated by high molecular weight PAHs. Hmox1, an oxidative stress marker, exhibited increased expression after exposure to GDI (1.63, SE=0.03) and filtered GDI (1.55, SE=0.04) engine exhaust compared to HEPA-filtered air (p<0.05), likely attributable to a combination of the gas and particulate fractions. Exposure to GDI engine exhaust contributes to upregulation of genes related to the metabolism of PAHs and oxidative stress. |
Enhancement of skeletal muscle in aged rats following high-intensity stretch-shortening contraction training
Rader EP , Naimo MA , Layner KN , Triscuit AM , Chetlin RD , Ensey J , Baker BA . Rejuvenation Res 2016 20 (2) 93-102 Exercise is the most accessible, efficacious, and multifactorial intervention to improve health and treat chronic disease. High-intensity resistance exercise, in particular, also maximizes skeletal muscle size and strength - outcomes crucial at advanced age. However, such training is capable of inducing muscle maladaptation when misapplied at old age. Therefore, characterization of parameters (e.g., mode and frequency) that foster adaptation is an active research area. To address this issue, we utilized a rodent model that allowed training at maximal intensity in terms of muscle activation and tested the hypothesis that muscles of old rats adapt to stretch-shortening contraction training, provided the training frequency is sufficiently low. At termination of training, normalized muscle mass (i.e., muscle mass divided by tibia length) and muscle quality (isometric force divided by normalized muscle mass) were determined. For young rats, normalized muscle mass increased by ~20% regardless of training frequency. Similarly, no difference was observed in muscle quality after training 2 vs. 3 days per week (0.65 ± 0.09 N/mg/mm vs. 0.59 ± 0.05 N/mg/mm, respectively). For old rats following 3 days per week training, normalized muscle mass was unaltered and muscle quality was 30% lower than young levels. Following 2 days per week training at old age, normalized muscle mass increased by 17% and muscle quality was restored to young levels. To investigate this enhanced response, oxidative stress was assessed by lipid peroxidation quantification. For young rats, lipid peroxidation levels were unaltered by training. With aging, baseline levels of lipid peroxidation increased by 1.5-fold. For old rats, only 2 days per week training decreased lipid peroxidation to levels indistinguishable from young values. 
These results imply that appropriately scheduled high-intensity stretch-shortening contraction training at old age is capable of restoring muscle to a younger phenotype in terms of lipid peroxidation levels and muscle quality. |
Evaluation of amount of blood in dry blood spots: ring-disk electrode conductometry
Kadjo AF , Stamos BN , Shelor CP , Berg JM , Blount BC , Dasgupta PK . Anal Chem 2016 88 (12) 6531-7 A fixed-area punch in dried blood spot (DBS) analysis is assumed to contain a fixed amount of blood, but the amount actually depends on a number of factors. The presently preferred approach is to normalize the measurement with respect to the sodium level, measured by atomic spectrometry. Instead of sodium levels, we propose electrical conductivity of the extract as an equivalent nondestructive measure. A dip-type small-diameter ring-disk electrode (RDE) is ideal for very small volumes. However, the conductance (G) measured by an RDE depends on the depth (D) of the liquid below the probe. There is no established way of computing the specific conductance (σ) of the solution from G. Using a COMSOL Multiphysics model, we were able to obtain excellent agreement between the measured and the model-predicted conductance as a function of D. Using simulations over a large range of dimensions, we provide a spreadsheet-based calculator where the RDE dimensions are the input parameters and the procedure determines 99% of the infinite-depth conductance (G99) and the depth D99 at which this is reached. For typical small-diameter probes (outer electrode diameter less than approximately 2 mm), D99 is small enough for dip-type measurements in extract volumes of approximately 100 µL. We demonstrate the use of such probes with DBS extracts. In a small group of 12 volunteers (age 20-66), the specific conductance of 100 µL aqueous extracts of 2 µL of spotted blood showed a variance of 17.9%. For a given subject, methanol extracts of DBS spots nominally containing 8 and 4 µL of blood differed by a factor of 1.8-1.9 in the chromatographically determined values of sulfate and chloride (a minor and major constituent, respectively). The values normalized with respect to the conductance of the extracts differed by approximately 1%. 
For serum associated analytes, normalization of the analyte value by the extract conductance can thus greatly reduce errors from variations in the spotted blood volume/unit area. |
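The normalization described above can be illustrated with a minimal sketch: dividing each analyte value by the extract conductance makes the result largely independent of the actual blood volume in the punch. All numbers below are hypothetical illustrations, not data from the study.

```python
# Sketch of conductance-based volume normalization for DBS extracts.
# All values are hypothetical, chosen only to mimic the ~1.9-fold raw
# difference between nominal 8 uL and 4 uL spots noted in the abstract.

def normalized_analyte(raw_conc, extract_conductance):
    """Divide the measured analyte signal by the extract conductance so the
    result becomes insensitive to the spotted blood volume."""
    return raw_conc / extract_conductance

# Hypothetical analyte signal and extract conductance for each spot size
raw_8, g_8 = 18.0, 2.0    # nominal 8 uL spot
raw_4, g_4 = 9.5, 1.05    # nominal 4 uL spot

print(raw_8 / raw_4)                    # raw signals differ ~1.9-fold
print(normalized_analyte(raw_8, g_8))   # 9.0
print(normalized_analyte(raw_4, g_4))   # ~9.05: near-identical after normalization
```

After normalization the two hypothetical extracts agree to within about 1%, mirroring the behavior the authors report.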
Congenital Heart Defects in the United States: Estimating the Magnitude of the Affected Population in 2010.
Gilboa SM , Devine OJ , Kucik JE , Oster ME , Riehle-Colarusso T , Nembhard WN , Xu P , Correa A , Jenkins K , Marelli AJ . Circulation 2016 134 (2) 101-9 BACKGROUND: Because of advancements in care, there has been a decline in mortality from congenital heart defects (CHD) over the last several decades. However, there are no current empirical data documenting the number of people living with CHD in the United States (US). Our aim was to estimate the CHD prevalence across all age groups in the US in the year 2010. METHODS: The age-, sex-, and severity-specific observed prevalence of CHD in Quebec, Canada in the year 2010 was assumed to equal the CHD prevalence in the non-Hispanic white population in the US in 2010. A race-ethnicity adjustment factor, reflecting differential survival between racial-ethnic groups through age 5 for persons with a CHD and that in the general US population, was applied to the estimated non-Hispanic white rates to derive CHD prevalence estimates among US non-Hispanic blacks and Hispanics. Confidence intervals for the estimated CHD prevalence rates and case counts were derived using a combination of Taylor series approximations and Monte Carlo simulation. RESULTS: We estimated that approximately 2.4 million people (1.4 million adults, 1 million children) were living with CHD in the US in 2010. Nearly 300,000 of these individuals had severe CHD. CONCLUSIONS: Our estimates highlight the need for two important efforts: (1) planning for health services delivery to meet the needs of the growing population of adults with CHD; and (2) the development of surveillance data across the lifespan to provide empirical estimates of the prevalence of CHD across all age groups in the US. |
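The Monte Carlo step mentioned in the methods can be sketched in a few lines: draw many plausible prevalence rates, convert each to a case count, and take empirical percentiles as the interval. The population size, rate, and uncertainty below are hypothetical placeholders, not the study's inputs.

```python
# Hedged sketch of a Monte Carlo confidence interval on a prevalence-based
# case count. All inputs are hypothetical illustrations.
import random

random.seed(42)

pop = 200_000_000     # hypothetical population size
rate_mean = 0.006     # hypothetical prevalence rate (0.6%)
rate_sd = 0.0004      # hypothetical uncertainty in that rate

# Simulate case counts from 10,000 plausible rates (clipped at zero)
draws = sorted(
    max(random.gauss(rate_mean, rate_sd), 0.0) * pop
    for _ in range(10_000)
)
# Empirical 95% interval: 2.5th and 97.5th percentiles of the draws
lo, hi = draws[249], draws[9749]
print(f"estimated cases: {rate_mean * pop:,.0f} (95% CI {lo:,.0f}-{hi:,.0f})")
```

The same idea extends directly to combining several uncertain inputs (e.g., rate and adjustment factor), which is where simulation beats closed-form approximations.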
Trends in breastfeeding initiation and duration by birth weight among US children, 1999-2012
Herrick KA , Rossen LM , Kit BK , Wang CY , Ogden CL . JAMA Pediatr 2016 170 (8) 805-7 In the United States, breastfeeding initiation rates have risen to 80%. We report secular trends of breastfeeding initiation and duration by birth weight using nationally representative data from the National Health and Nutrition Examination Survey (NHANES). |
New improvements to MFIRE to enhance fire modeling capabilities
Zhou L , Smith AC , Yuan L . Min Eng 2016 68 (6) 45-50 NIOSH's mine fire simulation program, MFIRE, is widely accepted as a standard for assessing and predicting the impact of a fire on the mine ventilation system and the spread of fire contaminants in coal and metal/nonmetal mines. It has been used by U.S. and international companies to simulate fires for planning and response purposes. MFIRE is a dynamic, transient-state, mine ventilation network simulation program that performs normal planning calculations. It can also be used to analyze ventilation networks under thermal and mechanical influence, including changes in ventilation parameters, external influences such as changes in temperature, and internal influences such as a fire. The program output can be used to analyze the effects of these influences on the ventilation system. Since its original development by Michigan Technological University for the Bureau of Mines in the 1970s, several updates have been released over the years. In 2012, NIOSH completed a major redesign and restructuring of the program with the release of MFIRE 3.0. MFIRE's outdated FORTRAN programming language was replaced with an object-oriented C++ language and packaged into a dynamic link library (DLL). However, the MFIRE 3.0 release made no attempt to change or improve the fire modeling algorithms inherited from its previous version, MFIRE 2.20. This paper reports on improvements that have been made to the fire modeling capabilities of MFIRE 3.0 since its release. These improvements include the addition of fire source models of the t-squared fire and heat release rate curve data file, the addition of a moving fire source for conveyor belt fire simulations, improvement of the fire location algorithm, and the identification and prediction of smoke rollback phenomena. The improvements discussed in this paper will be designated MFIRE 3.1 and released by NIOSH in the near future. |
The vitamin D status of the US population from 1988 to 2010 using standardized serum concentrations of 25-hydroxyvitamin D shows recent modest increases
Schleicher RL , Sternberg MR , Lacher DA , Sempos CT , Looker AC , Durazo-Arvizu RA , Yetley EA , Chaudhary-Webb M , Maw KL , Pfeiffer CM , Johnson CL . Am J Clin Nutr 2016 104 (2) 454-61 BACKGROUND: Temporal trends in the US population's vitamin D status have been uncertain because of nonstandardized serum 25-hydroxyvitamin D [25(OH)D] measurements. OBJECTIVE: To accurately assess vitamin D status trends among those aged ≥12 y, we used data from the cross-sectional NHANES surveys. DESIGN: A liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for measuring 25(OH)D (sum of 25-hydroxyvitamin D2 and 25-hydroxyvitamin D3), calibrated to standard reference materials, was used to predict LC-MS/MS-equivalent concentrations from radioimmunoassay data (1988-2006 surveys; n = 38,700) and to measure LC-MS/MS concentrations (2007-2010 surveys; n = 12,446). Weighted arithmetic means and the prevalence of 25(OH)D above or below cutoff concentrations were calculated to evaluate long-term trends. RESULTS: Overall, mean predicted 25(OH)D showed no time trend from 1988 to 2006, but during 2007-2010 the mean measured 25(OH)D was 5-6 nmol/L higher. The groups showing the largest 25(OH)D increases (7-11 nmol/L) were older persons, females, non-Hispanic whites, and vitamin D supplement users. During 1988-2010, the proportions of persons with 25(OH)D <40 nmol/L were 14-18% (overall), 46-60% (non-Hispanic blacks), 21-28% (Mexican Americans), and 6-10% (non-Hispanic whites). CONCLUSIONS: An accurate method for measuring 25(OH)D showed stable mean concentrations in the US population (1988-2006) and recent modest increases (2007-2010). Although it is unclear to what extent supplement usage, as opposed to differences in laboratory methods, explains the increases in 25(OH)D, the use of higher vitamin D supplement dosages coincided with the increase. Marked race-ethnic differences in 25(OH)D concentrations were apparent. 
These data provide the first standardized information about temporal trends in the vitamin D status of the US population. |
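The bridging step in the design above, predicting LC-MS/MS-equivalent concentrations from an older assay, is in essence a regression calibration. A minimal sketch, with entirely hypothetical paired values (not the study's calibration data):

```python
# Hedged sketch of method-bridging by ordinary least squares: fit the
# reference-method value (y) on the older-assay value (x), then map old
# results onto the reference scale. All paired values are hypothetical.

# Hypothetical paired measurements: (older assay, reference method) in nmol/L
pairs = [(30, 34), (45, 50), (60, 64), (75, 81), (90, 95)]

# Closed-form OLS fit (no external libraries needed)
n = len(pairs)
mx = sum(x for x, _ in pairs) / n
my = sum(y for _, y in pairs) / n
slope = sum((x - mx) * (y - my) for x, y in pairs) / sum((x - mx) ** 2 for x, _ in pairs)
intercept = my - slope * mx

def predict_equivalent(old_value):
    """Map an older-assay result onto the reference-method scale."""
    return intercept + slope * old_value

print(round(predict_equivalent(50), 1))  # -> 54.6 for these hypothetical pairs
```

The published calibration would of course use certified reference materials and a far larger paired dataset; this only shows the shape of the computation.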
Taking stock of the occupational safety and health challenges of nanotechnology: 2000–2015
Schulte PA , Roth G , Hodson LL , Murashov V , Hoover MD , Zumwalde R , Kuempel ED , Geraci CL , Stefaniak AB , Castranova V , Howard J . J Nanopart Res 2016 18 159 Engineered nanomaterials significantly entered commerce at the beginning of the 21st century. Concerns about serious potential health effects of nanomaterials were widespread. Now, approximately 15 years later, it is worthwhile to take stock of research and efforts to protect nanomaterial workers from potential risks of adverse health effects. This article provides and examines timelines for major functional areas (toxicology, metrology, exposure assessment, engineering controls and personal protective equipment, risk assessment, risk management, medical surveillance, and epidemiology) to identify significant contributions to worker safety and health. The occupational safety and health field has responded effectively to identify gaps in knowledge and practice, but further research is warranted and is described. There is now a greater, if imperfect, understanding of the mechanisms underlying nanoparticle toxicology, hazards to workers, and appropriate controls for nanomaterials, but unified analytical standards and exposure characterization methods are still lacking. The development of control-banding and similar strategies has compensated for incomplete data on exposure and risk, but it is unknown how widely such approaches are being adopted. Although the importance of epidemiologic studies and medical surveillance is recognized, implementation has been slowed by logistical issues. Responsible development of nanotechnology requires protection of workers at all stages of the technological life cycle. In each of the functional areas assessed, progress has been made, but more is required. |
Occupational exposure to pesticides and the incidence of lung cancer in the Agricultural Health Study
Bonner MR , Beane Freeman LE , Hoppin JA , Koutros S , Sandler DP , Lynch CF , Hines CJ , Thomas K , Blair A , Alavanja MC . Environ Health Perspect 2016 125 (4) 544-551 BACKGROUND: Occupational pesticide use is associated with lung cancer in some, but not all, epidemiologic studies. In the Agricultural Health Study (AHS), we previously reported positive associations between several pesticides and lung cancer incidence. OBJECTIVE: We evaluated use of 43 pesticides and 654 lung cancer cases after ten years of additional follow-up in the AHS, a prospective cohort study comprising 57,310 pesticide applicators from Iowa and North Carolina. METHODS: Information about lifetime pesticide use and other factors was ascertained at enrollment (1993-1997) and updated with a follow-up questionnaire (1999-2005). Cox proportional hazards models were used to calculate hazard ratios (HR) and 95% confidence intervals (CI), adjusting for smoking (smoking status and pack-years), gender, and lifetime days of use of any pesticides. RESULTS: Hazard ratios were elevated in the highest exposure category of lifetime days of use for pendimethalin (1.50; 95% CI = 0.98-2.31), dieldrin (1.93; 95% CI = 0.70-5.30), and chlorimuron-ethyl (1.74; 95% CI = 1.02-2.96), although monotonic exposure-response gradients were not evident. The HRs for intensity-weighted lifetime days of use of these pesticides were similar. For parathion, the trend was statistically significant for intensity-weighted lifetime days (p=0.049) and borderline for lifetime days (p=0.073). None of the remaining pesticides evaluated were associated with lung cancer incidence. CONCLUSIONS: These analyses provide additional evidence for an association between pendimethalin, dieldrin, and parathion use and lung cancer risk. We found an association between chlorimuron-ethyl, a herbicide introduced in 1986, and lung cancer that has not been previously reported. Continued follow-up is warranted. |
Pesticides are associated with allergic and non-allergic wheeze among male farmers
Hoppin JA , Umbach DM , Long S , London SJ , Henneberger PK , Blair A , Alavanja M , Beane Freeman LE , Sandler DP . Environ Health Perspect 2016 125 (4) 535-543 BACKGROUND: Growing evidence suggests that pesticide use may contribute to respiratory symptoms. OBJECTIVE: To evaluate the association of currently used pesticides with allergic and non-allergic wheeze among male farmers. METHODS: Using the 2005-2010 interview data of the Agricultural Health Study, a prospective study of farmers in North Carolina and Iowa, we evaluated the association between allergic and non-allergic wheeze and self-reported use of 78 specific pesticides, reported by ≥ 1% of the 22,134 men interviewed. We used polytomous regression models adjusted for age, BMI, state, smoking, and current asthma, as well as for days applying pesticides and days driving diesel tractors. We defined allergic wheeze as reporting both wheeze and doctor-diagnosed hay fever (n=1,310, 6%) and non-allergic wheeze as reporting wheeze but not hay fever (n=3,939, 18%); men without wheeze were the referent. RESULTS: In models evaluating current use of specific pesticides, 19 pesticides were significantly associated (p<0.05) with allergic wheeze (18 positive, 1 negative) and 21 pesticides with non-allergic wheeze (19 positive, 2 negative); 11 pesticides with both. Seven pesticides (herbicides: 2,4-D and simazine; insecticides: carbaryl, dimethoate, disulfoton, and zeta-cypermethrin; and fungicide pyraclostrobin) had significantly different associations for allergic and non-allergic wheeze. In exposure-response models with up to five exposure categories, we saw evidence of an exposure-response relationship for several pesticides including the commonly used herbicides 2,4-D and glyphosate, the insecticides permethrin and carbaryl and the rodenticide warfarin. 
CONCLUSIONS: These findings among farmers associate several pesticides that are commonly used in agricultural and residential settings with adverse respiratory effects. |
Preface to the special section on the impact of Thomas Waters on the field of ergonomics
Davis KG , Hudock SD . Hum Factors 2016 58 (5) 665-6 Thomas R. Waters had a distinguished career in the field of occupational ergonomics for 24 years while working at the National Institute for Occupational Safety and Health (NIOSH). Although his work focused on musculoskeletal disorders (MSDs) across many industries, including manufacturing, retail trade, warehousing, agriculture, and health care, he is most known for leading the development and validation of the Revised NIOSH Lifting Equation (RNLE) starting in 1993. The RNLE has become the most widely used ergonomic assessment tool in the world. Researchers across the world are now revising and expanding this equation to ensure wide applicability of the RNLE. Waters published 27 articles on the RNLE, with one of the last articles being awarded the 2000 Alice Hamilton Award for Excellence in the Human Studies category (Waters et al., 1999) and another expanding the RNLE to be used by pregnant workers lifting at work (MacDonald et al., 2013). The reach of this tool has been phenomenal, with almost 65,000 downloads of the RNLE documentation from the NIOSH Web page between 2007 and 2012, more than 72,000 page views from 2009 to 2012, and more than 25,000 copies of the RNLE distributed by NIOSH. More than 130 articles have been published that employ the RNLE as an assessment tool, providing one indication of the impact that this tool has had on the field. |
Prevalence of work-site injuries and relationship between obesity and injury among U.S. workers: NHIS 2004-2012
Gu JK , Charles LE , Andrew ME , Ma CC , Hartley TA , Violanti JM , Burchfiel CM . J Safety Res 2016 58 21-30 Introduction: Studies have reported associations between obesity and injury in a single occupation or industry. Our study estimated the prevalence of work-site injuries and investigated the association between obesity and work-site injury in a nationally representative sample of U.S. workers. Methods: Self-reported weight, height, and injuries within the previous three months were collected annually for U.S. workers in the National Health Interview Survey (NHIS) from 2004 to 2012. Participants were categorized as normal weight (BMI: 18.5-24.9 kg/m2), overweight (BMI: 25.0-29.9), obese I (BMI: 30.0-34.9), and obese II (BMI: 35+). The prevalence of injury and prevalence ratios from fitted logistic regression models were used to assess relationships between obesity and injury after adjusting for covariates. Sampling weights were incorporated using SUDAAN software. Results: During the 9-year study period from 2004 to 2012, 1120 workers (78 workers per 10,000) experienced a work-related injury during the previous three months. The anatomical sites with the highest prevalence of injury were the back (14.3 ± 1.2 per 10,000), fingers (11.5 ± 1.3), and knees (7.1 ± 0.8). The most common types of injuries were sprains/strains/twists (41.5% of all injuries), cuts (20.0%), and fractures (11.8%). Compared to normal weight workers, overweight and obese workers were more likely to experience work-site injuries [overweight: PR = 1.25 (95% CI = 1.04-1.52); obese I: 1.41 (1.14-1.74); obese II: 1.68 (1.32-2.14)]. These injuries were more likely to affect the lower extremities [overweight: PR = 1.48 (95% CI = 1.03-2.13); obese I: 1.70 (1.13-2.55); obese II: 2.91 (1.91-4.41)] and were more likely to be due to sprains/strains/twists [overweight: PR = 1.73 (95% CI = 1.29-2.31); obese I: PR = 2.24 (1.64-3.06); obese II: PR = 2.95 (2.04-4.26)]. 
Conclusions: Among NHIS participants, overweight and obese workers were 25% to 68% more likely to experience injuries than normal weight workers. Practical applications: Weight reduction policies and management programs may be effectively targeted towards overweight and obese groups to prevent or reduce work-site injuries. |
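The adjusted PRs above come from fitted logistic regression models, but the crude (unadjusted) analogue can be sketched directly from 2x2 counts, which clarifies what a prevalence ratio and its log-scale confidence interval are. The counts below are hypothetical, not the NHIS data.

```python
# Sketch of a crude prevalence ratio (PR) with a 95% CI from 2x2 counts;
# the unadjusted analogue of the adjusted PRs reported in the abstract.
# All counts are hypothetical illustrations.
import math

# Hypothetical counts: injured workers / total workers, by BMI group
inj_obese, n_obese = 130, 10_000
inj_normal, n_normal = 78, 10_000

p1, p0 = inj_obese / n_obese, inj_normal / n_normal
pr = p1 / p0  # ratio of injury prevalences

# Standard error of log(PR) for two independent proportions
se = math.sqrt((1 - p1) / inj_obese + (1 - p0) / inj_normal)
lo = math.exp(math.log(pr) - 1.96 * se)
hi = math.exp(math.log(pr) + 1.96 * se)
print(f"PR = {pr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The study's models additionally adjust for covariates and survey weights (via SUDAAN), which this crude sketch deliberately omits.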
Breath-taking jobs: a case-control study of respiratory work disability by occupation in Norway
Fell A , Abrahamsen R , Henneberger PK , Svendsen MV , Andersson E , Toren K , Kongerud J . Occup Environ Med 2016 73 (9) 600-6 BACKGROUND: The current knowledge on respiratory work disability is based on studies that used crude categories of exposure. This may lead to a loss of power, and does not provide sufficient information to allow targeted workplace interventions and follow-up of patients with respiratory symptoms. OBJECTIVES: The aim of this study was to identify occupations and specific exposures associated with respiratory work disability. METHODS: In 2013, a self-administered questionnaire was mailed to a random sample of the general population, aged 16-50, in Telemark County, Norway. We defined respiratory work disability as a positive response to the survey question: 'Have you ever had to change or leave your job because it affected your breathing?' Occupational exposures were assessed using an asthma-specific job-exposure matrix, and comparison of risks was made for cases and a median of 50 controls per case. RESULTS: 247 workers had changed their work because of respiratory symptoms, accounting for 1.7% of the respondents ever employed. The 'breath-taking jobs' were cooks/chefs: adjusted OR 3.6 (95% CI 1.6 to 8.0); welders: 5.2 (2.0 to 14); gardeners: 4.5 (1.3 to 15); sheet metal workers: 5.4 (2.0 to 14); cleaners: 5.0 (2.2 to 11); hairdressers: 6.4 (2.5 to 17); and agricultural labourers: 7.4 (2.5 to 22). Job changes were also associated with a variety of occupational exposures, with some differences between men and women. CONCLUSIONS: Self-report and job-exposure matrix data showed similar findings. For the occupations and exposures associated with job change, preventive measures should be implemented. |
Can control banding be useful for the safe handling of nanomaterials? A systematic review
Eastlake Adrienne , Zumwalde Ralph , Geraci Charles . J Nanopart Res 2016 18 169 Control banding (CB) is a risk management strategy that has been used to identify and recommend exposure control measures for potentially hazardous substances for which toxicological information is limited. The application of CB and level of expertise required for implementation and management can differ depending on knowledge of the hazard potential, the likelihood of exposure, and the ability to verify the effectiveness of exposure control measures. A number of different strategies have been proposed for using CB in workplaces where exposure to engineered nanomaterials (ENMs) can occur. However, it is unclear if the use of CB can effectively reduce worker exposure to nanomaterials. A systematic review of studies was conducted to answer the question: can control banding be useful to ensure adequate controls for the safe handling of nanomaterials? A variety of databases were searched to identify relevant studies pertaining to CB. Database search terms included control, hazard, exposure, and risk banding, as well as the use of these terms in the context of nanotechnology or nanomaterials. Other potentially relevant studies were identified during the review of articles obtained in the systematic review process. Identification of studies and the extraction of data were independently conducted by the reviewers. Quality of the studies was assessed using the Methodological Index for Non-Randomized Studies. The quality of the evidence was evaluated using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach. A total of 235 records were identified in the database search, of which 70 records were determined to be eligible for full-text review. Only two studies were identified that met the inclusion criteria. These studies evaluated the application of the CB Nanotool in workplaces where ENMs were being handled. 
A total of 32 different nanomaterial handling activities were evaluated in these studies by comparing the exposure controls recommended using CB to existing exposure controls previously recommended by an industrial hygienist. The selection of exposure controls using CB was consistent with those recommended by an industrial hygienist for 19 out of 32 (59.4%) job activities. A higher level of exposure control was recommended for nine out of 32 (28.1%) job activities using CB, while four out of 32 (12.5%) job activities had in-place exposure controls that were more stringent than those recommended using CB. After evaluation using GRADE, the evidence indicated that the CB Nanotool can recommend exposure controls for many ENM job activities that are consistent with those recommended by an experienced industrial hygienist. The use of CB for reducing exposures to ENMs has the potential to be an effective risk management strategy when information on the health risk of a nanomaterial is limited and/or an occupational exposure limit is absent. However, there remains a lack of evidence to conclude that the use of CB can provide adequate exposure control in all work environments. Additional validation work is needed to provide more data to support the use of CB for the safe handling of ENMs. |
Cryptosporidium species and C. parvum subtypes in dairy calves and goat kids reared under traditional farming systems in Turkey.
Taylan-Ozkan A , Yasa-Duru S , Usluca S , Lysen C , Ye J , Roellig DM , Feng Y , Xiao L . Exp Parasitol 2016 170 16-20 Molecular characterizations of Cryptosporidium spp. in ruminants reared under traditional animal management systems are scarce, and studies conducted thus far have revealed largely an absence of the pathogenic and zoonotic species C. parvum in pre-weaned animals. In this study, we examined Cryptosporidium species and subtype distribution in free-range pre-weaned dairy calves and goat kids with diarrhea. Cryptosporidium-positive specimens from pre-weaned calves on 10 farms and goat kids on 4 farms in Ankara, Balikesir, Corum, Kirikkale, and Kirsehir Provinces, Turkey were genotyped by PCR-restriction fragment length polymorphism analysis of the small subunit rRNA gene, which identified C. parvum in 27 calves and 9 goat kids and C. ryanae in 1 calf. Among the C. parvum isolates successfully subtyped by DNA sequence analysis of the 60 kDa glycoprotein gene, three subtypes were detected in calves, including IIaA13G2R1 (20/23), IIdA18G1 (2/23), and IIdA20G1b (1/23), and four subtypes were detected in goat kids, including IIaA13G2R1 (3/8), IIaA15G1R1 (2/8), IIdA22G1 (2/8), and IIdA18G1 (1/8). Data from this study suggest that dairy calves reared in a traditional cow-calf system in Turkey are mainly infected with a C. parvum subtype rarely seen elsewhere, whereas goat kids are infected with diverse subtypes. As all five C. parvum subtypes found in this study are known human pathogens, pre-weaned farm animals could play a potential role in the transmission of human cryptosporidiosis. |
Safety of hormonal contraception and intrauterine devices among women with depressive and bipolar disorders: a systematic review
Pagano HP , Zapata LB , Berry-Bibee EN , Nanda K , Curtis KM . Contraception 2016 94 (6) 641-649 BACKGROUND: Women with depressive or bipolar disorders are at an increased risk for unintended pregnancy. OBJECTIVE: To examine the safety of hormonal contraception among women with depressive and bipolar disorders. METHODS: We searched for articles published through January 2016 on the safety of using any hormonal contraceptive method among women with depressive or bipolar disorders, including those who had been diagnosed clinically or scored above threshold levels on a validated screening instrument. Outcomes included changes in symptoms, hospitalization, suicide, and modifications in medication regimens such as increase or decrease in dosage or changes in type of drug. RESULTS: Of 2376 articles, six met inclusion criteria. Of three studies that examined women clinically diagnosed with depressive or bipolar disorder, one found that oral contraceptives (OCs) did not significantly change mood across the menstrual cycle among women with bipolar disorder, whereas mood did significantly change across the menstrual cycle among women not using OCs; one found no significant differences in the frequency of psychiatric hospitalizations among women with bipolar disorder who used depot medroxyprogesterone acetate (DMPA), intrauterine devices (IUDs), or sterilization; and one found no increase in depression scale scores among women with depression using and not using OCs, for both those treated with fluoxetine and those receiving placebo. 
Of three studies that examined women who met a threshold for depression on a screening instrument, one found that adolescent girls using combined OCs (COCs) had significantly improved depression scores after 3 months compared with placebo, one found that OC users had similar odds of no longer being depressed at follow-up compared with non-users, and one found that COC users were less frequently classified as depressed over 11 months than IUD users. CONCLUSIONS: Limited evidence from six studies found that OC, levonorgestrel-releasing (LNG)-IUD, and DMPA use among women with depressive or bipolar disorders was not associated with worse clinical course of disease compared with no hormonal method use. |
Medications to ease intrauterine device insertion: A systematic review
Zapata LB , Jatlaoui TC , Marchbanks PA , Curtis KM . Contraception 2016 94 (6) 739-759 BACKGROUND: Potential barriers to intrauterine device (IUD) use include provider concern about difficult insertion, particularly for nulliparous women. OBJECTIVE: To evaluate the evidence on the effectiveness of medications to ease IUD insertion on provider outcomes (i.e., ease of insertion, need for adjunctive insertion measures, insertion success). SEARCH STRATEGY: We searched the PubMed database for peer-reviewed articles published in any language from database inception through February 2016. SELECTION CRITERIA: We included randomized controlled trials (RCTs) that examined medications to ease interval insertion of levonorgestrel (LNG) IUDs and copper-T IUDs. RESULTS: From 1855 articles, we identified 15 RCTs that met our inclusion criteria. Most evidence suggested that misoprostol did not improve provider ease of insertion, reduce the need for adjunctive insertion measures, or improve insertion success among general samples of women seeking an IUD (evidence Level I, good to fair). However, one RCT found significantly higher insertion success among women receiving misoprostol versus placebo prior to a second IUD insertion attempt after a failed attempt (evidence Level I, good). Two RCTs on 2% intracervical lidocaine as a topical gel or injection suggested no positive effect on provider ease of insertion (evidence Level I, good to poor), and one RCT on diclofenac plus 2% intracervical lidocaine as a topical gel suggested no positive effect on provider ease of insertion (evidence Level I, good). Limited evidence from two RCTs on nitric oxide donors, specifically nitroprusside or nitroglycerin gel, suggested no positive effect on provider ease of insertion or need for adjunctive insertion measures (evidence Level I, fair). CONCLUSIONS: Overall, most studies found no significant differences between women receiving interventions to ease IUD insertion versus controls. 
Among women with a recent failed insertion who underwent a second insertion attempt, one RCT found improved insertion success among women using misoprostol versus placebo. |
Effectiveness of an adaptation of the Project Connect Health Systems Intervention: youth and clinic-level findings
Loosier PS , Doll S , Lepar D , Ward K , Gamble G , Dittus PJ . J Sch Health 2016 86 (8) 595-603 BACKGROUND: The Project Connect Health Systems Intervention (Project Connect) uses a systematic process of collecting community and healthcare infrastructure information to craft a referral guide highlighting local healthcare providers who provide high-quality sexual and reproductive healthcare. Previous self-report data on healthcare usage indicated Project Connect was successful with sexually experienced female youth, for whom it increased rates of human immunodeficiency virus (HIV) and sexually transmitted disease (STD) testing and receipt of contraception. This adaptation of Project Connect examined its effectiveness in a new context and via collection of clinic encounter-level data. METHODS: Project Connect was implemented in 3 high schools (only 2 remained open throughout the entire project period). Participant recruitment and data collection occurred in 5 of 8 participating health clinics. Students completed Youth Surveys (N = 608) and a Clinic Survey (paired with medical data abstraction in 2 clinics [N = 305]). RESULTS: Students were more likely than nonstudents to report having reached a clinic via Project Connect. Nearly 40% of students attended a Project Connect school, with 32.7% using Project Connect to reach the clinic. Students were most likely to have been referred by a school nurse or coach. CONCLUSIONS: Project Connect is a low-cost, sustainable structural intervention with multiple applications within schools, either as a standalone intervention or in combination with ongoing efforts. |
Elective single embryo transfer in women less than age 38 years reduces multiple birth rates, but not live birth rates, in United States fertility clinics
Mancuso A , Boulet SL , Duran E , Munch E , Kissin DM , Van Voorhis BJ . Fertil Steril 2016 106 (5) 1107-1114 OBJECTIVE: To determine the effect of elective single embryo transfer (eSET) on live birth and multiple birth rates by a cycle-level and clinic-level analysis. DESIGN: Retrospective cohort study. SETTING: Not applicable. PATIENT(S): Patients aged <35 and 35-37 years. INTERVENTION(S): None. MAIN OUTCOME MEASURE(S): Clinics were divided into groups based on eSET rate for each age group, and aggregate rates of live birth per embryo transfer (ET) and multiple birth per delivery were calculated. A cycle-level analysis comparing eSET and double ET (DET) live birth and multiple birth rates was also performed, stratified by the total number (2, 3, or 4+) of embryos available, embryo stage, and patient age. RESULT(S): There was a linear decrease in multiple birth rate with increasing eSET rate and no significant difference in clinic-level live birth rates for each age group. Cycle-level analysis found slightly higher live birth rates with DET, but this was mainly observed in women aged 35-37 years or with four or more embryos available for transfer, and it confirmed the marked reduction in multiple births with eSET. CONCLUSION(S): Our study showed a marked and linear reduction in multiple birth rates and, importantly, little to no effect on clinic-level live birth rates with increasing rates of eSET. These findings support the growing evidence that eSET is effective in decreasing the high multiple birth rates associated with IVF and suggest that eSET should be used more frequently than it currently is. |
The role of screening, brief intervention and referral to treatment (SBIRT) in the perinatal period
Wright TE , Terplan M , Ondersma SJ , Boyce C , Yonkers K , Chang G , Creanga AA . Am J Obstet Gynecol 2016 215 (5) 539-547 Substance use during pregnancy is at least as common as many of the medical conditions screened for and managed during pregnancy. While harmful and costly, it is often ignored or managed poorly. Screening, Brief Intervention and Referral to Treatment (SBIRT) is an evidence-based approach to managing substance use. In September 2012, the US Centers for Disease Control and Prevention convened an Expert Meeting on Perinatal Illicit Drug Abuse to help address key issues around drug use in pregnancy in the United States. This manuscript reflects the formal conclusions of the expert panel that discussed the use of SBIRT during pregnancy. Screening for substance use during pregnancy should be universal; it allows stratification of women into zones of risk given their pattern of use. Low-risk women should receive brief advice, those classified as moderate-risk should receive a brief intervention, and those who are high-risk need referral to specialty care. A brief intervention is a patient-centered form of counseling that uses the principles of motivational interviewing. SBIRT has the potential to reduce the burden of substance use in pregnancy and should be integrated into prenatal care. |
Tobacco control in Africa
Ahluwalia IB , Arrazola RA , Ogwell Ouma AE . Prev Med 2016 Tobacco use is a leading cause of preventable morbidity and mortality worldwide, with nearly 6 million tobacco-attributable deaths every year (World Health Organization, 2012). If current trends continue, tobacco use is expected to result in an estimated 1 billion deaths by the end of the century, most of them in low- and middle-income countries (Mathers and Loncar, 2006). | Cigarette smoking is the most common form of tobacco use in most countries, and the majority of adult smokers first try cigarettes before age 18 (CDC Foundation, 2015; Anon., 2012). Limiting access to cigarettes among youth is an effective strategy to curb the tobacco epidemic by preventing smoking initiation and reducing the number of new smokers (CDC Foundation, 2015; Anon., 2012; DiFranza, 2012). To reduce the threat posed by tobacco to public health, the World Health Organization (WHO) has promoted the ratification of the WHO Framework Convention on Tobacco Control (FCTC) and has developed demand reduction tools (http://www.who.int/fctc/reporting/en/). The “MPOWER” demand reduction package includes the following strategies to assist countries in addressing the tobacco epidemic: Monitor tobacco use; Protect people from secondhand smoke; Offer help to quit tobacco use; Warn about the dangers of tobacco; Enforce bans on tobacco advertising and promotion; Raise taxes on tobacco products. |
Tracking MPOWER in 14 countries: results from the Global Adult Tobacco Survey, 2008-2010
Song Y , Zhao L , Palipudi KM , Asma S , Morton J , Talley B , Hsia J , Ramanandraibe N , Caixeta R , Fouad H , Khoury R , Sinha D , Rarick J , Bettcher D , Peruga A , Deland K , D'Espaignet ET . Glob Health Promot 2016 23 24-37 BACKGROUND: The World Health Organization (WHO) MPOWER is a technical package of six tobacco control measures that assist countries in meeting their obligations under the WHO Framework Convention on Tobacco Control and are proven to reduce tobacco use. The Global Adult Tobacco Survey (GATS) systematically monitors adult tobacco use and tracks key tobacco control indicators. METHODS: GATS is a nationally representative household survey of adults aged 15 and older, using a standard and consistent protocol across countries; it includes information on the six WHO MPOWER measures. GATS Phase I was conducted from 2008-2010 in 14 high-burden low- and middle-income countries. We selected one key indicator from each of the six MPOWER measures and compared results across the 14 countries. RESULTS: Current tobacco use prevalence ranged from 16.1% in Mexico to 43.3% in Bangladesh. The highest rate of exposure to secondhand smoke in the workplace was in China (63.3%). The highest 'smoking quit attempt' rate in the past 12 months among cigarette smokers was in Viet Nam (55.3%), and the lowest was in the Russian Federation (32.1%). In five of the 14 countries, more than one-half of current smokers said they had thought of quitting because of health warning labels on cigarette packages. The Philippines (74.3%) and the Russian Federation (68.0%) had the highest percentages of respondents noticing any cigarette advertising, promotion and sponsorship. Manufactured cigarette affordability ranged from 0.6% in Russia to 8.0% in India. CONCLUSIONS: Monitoring tobacco use and tobacco control policy achievements is crucial to managing and implementing measures to reverse the epidemic. 
GATS provides internationally-comparable data that systematically monitors and tracks the progress of the other five MPOWER measures. |
Methodology of the Global Adult Tobacco Survey - 2008-2010
Palipudi KM , Morton J , Hsia J , Andes L , Asma S , Talley B , Caixeta RD , Fouad H , Khoury RN , Ramanandraibe N , Rarick J , Sinha DN , Pujari S , Tursan d'Espaignet E . Glob Health Promot 2016 23 3-23 In 2008, the Centers for Disease Control and Prevention (CDC) and the World Health Organization developed the Global Adult Tobacco Survey (GATS), an instrument to monitor global tobacco use and measure indicators of tobacco control. GATS, a nationally representative household survey of persons aged 15 years or older, was conducted for the first time during 2008-2010 in 14 low- and middle-income countries. In each country, GATS used a standard core questionnaire, sample design, and procedures for data collection and management and, as needed, added country-specific questions that were reviewed and approved by international experts. The core questionnaire included questions about various characteristics of the respondents, their tobacco use (smoking and smokeless), and a wide range of tobacco-related topics (cessation; secondhand smoke; economics; media; and knowledge, attitudes, and perceptions). In each country, a multistage cluster sample design was used, with households selected proportionate to the size of the population. Households were chosen randomly within a primary or secondary sampling unit, and one respondent was selected at random from each household to participate in the survey. Interviewers administered the survey in the country's local language(s) using handheld electronic data collection devices. Interviews were conducted privately, and same-sex interviewers were used in countries where mixed-sex interviews would be culturally inappropriate. All 14 countries completed the survey during 2008-2010. In each country, the ministry of health was the lead coordinating agency for GATS, and the survey was implemented by national statistical organizations or surveillance institutes. 
This article describes the background and rationale for GATS and includes a comprehensive description of the survey methods and protocol. |
Prevalence of tobacco use among adults in Egypt, 2009
Fouad H , Awa FE , Naga RA , Emam AH , Labib S , Palipudi KM , Andes LJ , Asma S , Talley B . Glob Health Promot 2016 23 38-47 INTRODUCTION: We assessed the differences in overall use of tobacco and in the use of various tobacco products, by sex and by frequency of use across various demographic groups. METHODS: We used data from the Global Adult Tobacco Survey (GATS), conducted in 2009 in Egypt. The data consist of answers to GATS by 20,924 respondents from a nationally representative, multistage probability sample of adults aged 15 years or older from all regions of Egypt. Current tobacco use was defined as current smoking or use of smokeless tobacco products, either daily or occasionally. We analyzed the differences in current cigarette, shisha, and smokeless tobacco use by sex and frequency of use (daily or occasional), and by demographic characteristics that included age, region, education level and employment status. RESULTS: Overall, 19.7% of the Egyptian population currently use some form of tobacco. Men (38.1% [95% confidence interval (CI) 36.8-39.4]) are much more likely than women (0.6% [95% CI 0.4-0.9]) to use tobacco. Almost 96% of men who use tobacco do so daily. Men are more likely to use manufactured cigarettes (31.8% [95% CI 30.6-33.1]) than shisha (6.2% [95% CI 5.6-6.9]) or smokeless tobacco (4.1% [95% CI 3.4-4.8]). Few women use tobacco (cigarettes, 0.2%; shisha, 0.3%; smokeless tobacco, 0.3%); however, all women who currently smoke shisha do so daily. Lower educational status, being aged 25-64 years, and being employed predicted higher tobacco use. CONCLUSION: Egypt has implemented several initiatives to reduce tobacco use. The World Health Organization (WHO) MPOWER technical package, which aims to reverse the tobacco epidemic, is implemented at various levels throughout the country. Our findings show that there is significant variation in the prevalence of tobacco use and in the types of tobacco used by adult men and women in Egypt. 
GATS data can be used to better understand comparative patterns of tobacco use by adults, which in turn can be used to develop interventions. |
Exposure to anti- and pro-tobacco advertising, promotions or sponsorships: Turkey, 2008
Erguder T , Bilir N , Ozcebe H , Irmak H , Tasti E , Ilter H , Palipudi KM , Andes LJ , Asma S , Khoury RN , Talley B . Glob Health Promot 2016 23 58-67 INTRODUCTION: In 2008, Turkey became one of 26 countries with a complete ban on all forms of direct and indirect tobacco marketing. We assessed the level of exposure to anti- and pro-cigarette advertising and to cigarette promotions and sponsorships among various demographic groups in Turkey. METHODS: We used data from the Global Adult Tobacco Survey (GATS), conducted in November 2008 in Turkey. The data consist of answers to GATS questions by 9030 respondents from a nationally representative, multistage probability sample of adults 15 years of age or older. To find differences in exposure to advertising by sex, age, education level and smoking status, we analyzed responses to GATS questions about cigarette advertisements and anti-cigarette smoking information in various forms and through various advertising channels, during the 30 days before the survey, using bivariate analysis. RESULTS: Overall, 13.3% of respondents aged 15 years or older noticed some type of cigarette marketing during the 30 days before the survey: 7.1% saw advertisements, 5.3% saw promotions and 3.3% saw sports sponsorships. Men were more likely than women to have seen cigarette promotions (7.8% versus 3.0%) and sports sponsorships (5.3% versus 1.4%). Respondents aged 15-24 years were more likely than those aged 25 years or older to have seen cigarette advertisements (10.2% versus 6.2%), promotions (8.7% versus 4.4%) and sponsorships (6.6% versus 2.3%). Respondents were most likely to have seen cigarette advertisements on television (3.4%) or in shops (2.7%). In addition, 2.8% of respondents reported seeing a clothing item with a brand name or logo, 2.5% reported that they received free samples of cigarettes and 0.3% received gifts along with the purchase of cigarettes. 
Almost 9 in 10 survey respondents (88.8%) reported having noticed some anti-cigarette information during the 30 days before the survey. Most anti-cigarette information was seen on television (85.5%). The anti-cigarette information was seen by slightly more cigarette smokers (91.6%) than nonsmokers (87.6%). Persons with less than a primary education were less likely to notice anti-cigarette information than those with a higher level of education, across all examined media channels. CONCLUSIONS: Our findings showed a low prevalence of noticing cigarette marketing, which indicates high compliance with the Turkish law banning such marketing. GATS data provide an in-depth understanding of the level of exposure to pro- and anti-cigarette information in 2008, and they are of practical assistance to those who implement policies to reduce the demand for tobacco. The challenge now is to maintain rigorous enforcement, which requires ongoing surveillance to produce data on the effectiveness of enforcement efforts. |
Exposure to secondhand smoke among adults - Philippines, 2009
Baquilod MM , Segarra AB , Barcenas G , Mercado SP , Rarick J , Palipudi KM , Asma S , Andes LJ , Talley B . Glob Health Promot 2016 23 48-57 INTRODUCTION: We assessed the differences in exposure to secondhand smoke (SHS) among adults at home, in indoor workplaces, and in various public places in the Philippines across various socio-demographic groups. METHODS: Data from the Global Adult Tobacco Survey conducted in 2009 in the Philippines were used. The data consist of survey answers from 9705 respondents from a nationally representative, multistage probability sample of adults aged 15 years or older. We considered that respondents were exposed to SHS if during the previous 30 days they reported that they lived in a home, worked in a building, or visited a public place where people smoked. The public places included in our analysis were indoor workplaces, public transportation vehicles, restaurants, government buildings or offices, and healthcare facilities. The differences in various socioeconomic and demographic groups' exposure to SHS in these places were also examined. RESULTS: Of respondents who reported working indoors, 36.8% were exposed to SHS. Men (43.3% [95% CI 39.7-46.9]) were more likely than women (28.8% [95% CI 25.4-32.4]) to be exposed to SHS (p < 0.001). Of those working in sites where smoking was not allowed, 13.9% were exposed to SHS, whereas 66.5% were exposed where smoking was allowed in some enclosed areas, and 90.7% were exposed where smoking was allowed everywhere. During the 30 days preceding the survey, more than 50% of those who took public transportation were exposed to SHS; among those who visited public places, exposure was 33.6% in restaurants, 25.5% in government buildings or offices, and 7.6% in healthcare facilities. 
CONCLUSION: Despite a national law and several local government ordinances promulgating smoke-free workplaces, schools, government offices, and healthcare facilities, our findings show that a large proportion of adults were exposed to SHS at work and in public places. These findings point to opportunities to strengthen and improve enforcement of smoke-free initiatives and ordinances in the Philippines. |
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors.