The role of clinical trial participation in cancer research: barriers, evidence, and strategies
Unger JM, Cook E, Tai E, Bleyer A. Am Soc Clin Oncol Educ Book 2016;35:185-98. Fewer than one in 20 adult patients with cancer enroll in cancer clinical trials. Although barriers to trial participation have been the subject of frequent study, the rate of trial participation has not changed substantially over time. Barriers to trial participation are structural, clinical, and attitudinal, and they differ according to demographic and socioeconomic factors. In this article, we characterize the nature of cancer clinical trial barriers, and we consider global and local strategies for reducing barriers. We also consider the specific case of adolescents with cancer and show that the low rate of trial enrollment in this age group strongly correlates with limited improvements in cancer population outcomes compared with other age groups. Our analysis suggests that a clinical trial system that enrolls patients at a higher rate produces treatment advances at a faster rate and corresponding improvements in cancer population outcomes. Viewed in this light, the issue of clinical trial enrollment is foundational, lying at the heart of the cancer clinical trial endeavor. Fewer barriers to trial participation would enable trials to be completed more quickly and would improve the generalizability of trial results. Moreover, increased accrual to trials is important for patients, because trials provide patients the opportunity to receive the newest treatments. In an era of increasing emphasis on a treatment decision-making process that incorporates the patient perspective, the opportunity for patients to choose trial participation for their care is vital.
Sodium reduction - saving lives by putting choice into consumers' hands
Frieden TR. JAMA 2016;316(6):579-80. Although sodium reduction has been proposed as a public health strategy in the United States for more than 4 decades, there has been no progress in reducing consumption. One reason for this lack of progress is the continued ubiquity of dietary sodium in the US food supply. The Food and Drug Administration (FDA) has released draft proposed voluntary guidelines to encourage companies to steadily reduce sodium in processed and restaurant foods, a change that would increase consumers' control over their sodium intake. The proposed guidelines set targets for the gradual reduction of sodium across a range of food categories for both manufactured and restaurant products and would lead to a sustained reduction in the amount of sodium added to the food supply before foods reach consumers' hands. This Viewpoint provides answers, based on the best available science, to important questions about why this action is needed.
State of the science on prevention and screening to reduce melanoma incidence and mortality: the time is now
Tripp MK, Watson M, Balk SJ, Swetter SM, Gershenwald JE. CA Cancer J Clin 2016;66(6):460-480. Although overall cancer incidence rates are decreasing, melanoma incidence rates continue to increase by about 3% annually. Melanoma is a significant public health problem that exacts a substantial financial burden. Years of potential life lost from melanoma deaths contribute to the social, economic, and human toll of this disease. However, most cases are potentially preventable. Research has clearly established that exposure to ultraviolet radiation increases melanoma risk. Unprecedented antitumor activity and evolving survival benefit from novel targeted therapies and immunotherapies are now available for patients with unresectable and/or metastatic melanoma. Still, prevention (minimizing sun exposure that may result in tanned or sunburned skin and avoiding indoor tanning) and early detection (identifying lesions before they become invasive or at an earlier stage) have significant potential to reduce melanoma incidence and melanoma-associated deaths. This article reviews the state of the science on prevention and early detection of melanoma and current areas of scientific uncertainty and ongoing debate. The US Surgeon General's Call to Action to Prevent Skin Cancer and US Preventive Services Task Force reviews on skin cancer have propelled a national discussion on melanoma prevention and screening that makes this an extraordinary and exciting time for diverse disciplines in multiple sectors (health care, government, education, business, advocacy, and community) to coordinate efforts and leverage existing knowledge to make major strides in reducing the public health burden of melanoma in the United States.
Toward consensus on self-management support: the international chronic condition self-management support framework
Mills SL, Brady TJ, Jayanthan J, Ziabakhsh S, Sargious PM. Health Promot Int 2016;32(6):942-952. Self-management support (SMS) initiatives have been hampered by insufficient attention to underserved and disadvantaged populations, a lack of integration between health, personal, and social domains, overemphasis on individual responsibility, and insufficient attention to ethical issues. This paper describes an SMS framework that provides guidance for developing comprehensive and coordinated approaches to SMS that may address these gaps, and it provides direction for decision makers developing and implementing SMS initiatives in key areas at local levels. The framework was developed by researchers, policy makers, practitioners, and consumers from 5 English-speaking countries and reviewed by 203 individuals in 16 countries using an e-survey process. While developments in SMS will inevitably reflect local and regional contexts and needs, the strategic framework provides an emerging consensus on how we need to move SMS conceptualization, planning, and development forward. The framework provides definitions of self-management (SM) and SMS, a collective vision, eight guiding principles, and seven strategic directions. It combines important and relevant SM issues into a strategic document that offers potential value to the SMS field by helping decision makers plan SMS initiatives that reflect local and regional needs and by catalyzing and expanding our thinking about the SMS field in relation to systems thinking, shared responsibility, health equity, and ethical issues. The framework was developed with the understanding that our knowledge and experience of SMS are continually evolving and that it should be modified and adapted as more evidence becomes available and SMS approaches advance.
Urolithiasis, urinary cancer, and home drinking water source in the United States Territory of Guam, 2006-2010
Haddock RL, Olson DR, Backer L, Malilay J. Int J Environ Res Public Health 2016;13(6). We reviewed patient records with a first-listed diagnosis of urolithiasis (also known as urinary tract or kidney stone disease, or nephrolithiasis) upon discharge from Guam's sole civilian hospital during 2006 to 2010, and urinary cancer mortality records from the Guam Cancer Registry for 1970 to 2009, to determine the source of municipal water supplied to the patients' residences. The objective was to investigate a possible relationship between the sources of municipal water supplied to Guam villages and the incidence of urolithiasis and urinary cancer. We analyzed hospital discharge diagnoses of urolithiasis or renal calculi by calculating the incidence of first-mentioned discharge for urolithiasis or renal calculi and comparing rates across demographic or geographic categories while adjusting for age, sex, and ethnicity/race. We reviewed cancer registry records of urinary cancer deaths by patient residence. The annual incidence of hospitalization for urolithiasis was 5.22 per 10,000. Rates adjusted for sex or age exhibited almost no change. The rate of 9.83 per 10,000 among Chamorros was significantly higher (p < 0.05) than the rate among any other ethnic group or race. When villages were grouped by water source, rates of patients discharged with a first-listed diagnosis of urolithiasis, adjusted for ethnicity/race, were similar for villages using either well water (5.44 per 10,000) or mixed-source water (5.39 per 10,000), and significantly greater than the rate for villages using exclusively reservoir water (1.35 per 10,000). No statistically significant differences were found between water source or village of residence and urinary cancer mortality. Some Guam residents living in villages served completely or partly by deep well water high in calcium carbonate may be at increased risk for urolithiasis compared with residents living in villages served by surface waters. Although the risk appears to be highest in villagers of Chamorro ethnicity, residents should be aware of other contributing risk factors and of steps to take to avoid developing this health problem.
Accuracy of ICD-9-CM codes by hospital characteristics and stroke severity: Paul Coverdell National Acute Stroke Program
Chang TE, Lichtman JH, Goldstein LB, George MG. J Am Heart Assoc 2016;5(6). BACKGROUND: Epidemiological and health services research often uses International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes to identify patients with clinical conditions in administrative databases. We determined whether there are systematic variations between stroke patients' clinical diagnoses and ICD-9-CM codes, stratified by hospital characteristics and stroke severity. METHODS AND RESULTS: We used the records of patients discharged from hospitals participating in the Paul Coverdell National Acute Stroke Program in 2013. Within this stroke-enriched cohort, we compared agreement between the attending physician's clinical diagnosis and the principal ICD-9-CM code and determined whether disagreements varied by hospital characteristics (presence of a stroke unit, stroke team, number of hospital beds, and hospital location). For patients with a documented National Institutes of Health Stroke Scale score at admission, we assessed whether diagnostic agreement varied by stroke severity. Agreement was generally high (>89%); differences between the physician diagnosis and ICD-9-CM codes were primarily attributed to discordance between ischemic stroke and transient ischemic attack (TIA), and between subarachnoid and intracerebral hemorrhage. Agreement was higher for patients in metropolitan hospitals with stroke units, stroke teams, and >200 beds (all P<0.001). Agreement was lowest (60.3%) for rural hospitals with ≤200 beds and without stroke units or teams. Agreement was also lower for milder (94.9%) versus more severe (96.4%) ischemic strokes (P<0.001). CONCLUSIONS: We identified disagreements in stroke/TIA coding by hospital characteristics and stroke severity, particularly for milder ischemic strokes. Such systematic variations in ICD-9-CM coding practices can affect stroke case identification in epidemiological studies and may have implications for hospital-level quality metrics.
Clustering of five health-related behaviors for chronic disease prevention among adults, United States, 2013
Liu Y, Croft JB, Wheaton AG, Kanny D, Cunningham TJ, Lu H, Onufrak S, Malarcher AM, Greenlund KJ, Giles WH. Prev Chronic Dis 2016;13:E70. INTRODUCTION: Five key health-related behaviors for chronic disease prevention are never smoking, getting regular physical activity, consuming no alcohol or only moderate amounts, maintaining a normal body weight, and obtaining sufficient sleep daily. The objective of this study was to estimate the clustering of these 5 health-related behaviors among adults aged 21 years or older in each state and the District of Columbia and to assess geographic variation in clustering. METHODS: We used data from the 2013 Behavioral Risk Factor Surveillance System (BRFSS) to assess the clustering of the 5 behaviors among 395,343 BRFSS respondents aged 21 years or older. The 5 behaviors were defined as currently not smoking cigarettes, meeting the aerobic physical activity recommendation, consuming no alcohol or only moderate amounts, maintaining a normal body mass index (BMI), and sleeping at least 7 hours per 24-hour period. Prevalence of having 4 or 5 of these behaviors, by state, was also examined. RESULTS: Among US adults, 81.6% were current nonsmokers, 63.9% obtained 7 hours or more of sleep per day, 63.1% reported moderate or no alcohol consumption, 50.4% met physical activity recommendations, and 32.5% had a normal BMI. Only 1.4% of respondents engaged in none of the 5 behaviors; 8.4% engaged in 1 behavior; 24.3%, 2 behaviors; 35.4%, 3 behaviors; and 24.3%, 4 behaviors; only 6.3% reported engaging in all 5 behaviors. The highest prevalence of engaging in 4 or 5 behaviors was clustered in the Pacific and Rocky Mountain states; the lowest prevalence was in the southern states and along the Ohio River. CONCLUSION: Additional efforts are needed to increase the proportion of the population that engages in all 5 health-related behaviors and to eliminate geographic variation. Collaborative efforts in health care systems, communities, work sites, and schools can promote all 5 behaviors and produce population-wide changes, especially among the socioeconomically disadvantaged.
Dietary sodium and cardiovascular disease risk - measurement matters
Cogswell ME, Mugavero K, Bowman BA, Frieden TR. N Engl J Med 2016;375(6):580-6. Hypertension is a common and major risk factor for the leading U.S. killer, cardiovascular disease. Reducing excess dietary sodium can lower blood pressure, with a greater response among persons with hypertension. Nine of 10 Americans consume excess dietary sodium, defined as more than 2300 mg per day. Many leading medical and public health organizations recommend reducing dietary sodium to a maximum of 2300 mg per day on the basis of evidence indicating a public health benefit. Yet this benefit has been questioned, mainly on the basis of studies suggesting that low sodium intake is also associated with an increased risk of cardiovascular disease. In science, conflicting evidence from studies with methods of different strengths is not uncommon. Studies that measure sodium intake vary widely in their methods and should be judged accordingly. Accurate measurement matters. Paradoxical findings based on inaccurate sodium measurements should not stall efforts to improve the food environment in ways that enable consumers to reduce excess sodium intake. Gradual, stepwise sodium reduction, as recommended by the Institute of Medicine, remains an achievable, effective, and important public health strategy to prevent tens of thousands of heart attacks and strokes and save billions of dollars in health care costs annually.
Sentinel surveillance for influenza among severe acute respiratory infection and acute febrile illness inpatients at three hospitals in Ghana
Jones AH, Ampofo W, Akuffo R, Doman B, Duplessis C, Amankwa JA, Sarpong C, Sagoe K, Agbenohevi P, Puplampu N, Armah G, Koram KA, Nyarko EO, Bel-Nono S, Dueger E. Influenza Other Respir Viruses 2016;10(5):367-74. BACKGROUND: Influenza epidemiology in Africa is generally not well understood. Using syndrome definitions to screen patients for laboratory confirmation of infection is an established means of conducting influenza surveillance effectively. METHODS: To compare influenza-related epidemiologic data, from October 2010 through March 2013 we enrolled hospitalized severe acute respiratory infection (SARI; fever with respiratory symptoms) and acute febrile illness (AFI; fever without respiratory or other localizing symptoms) patients from three referral hospitals in Ghana. Demographic and epidemiologic data were obtained from enrolled patients, after which nasopharyngeal and oropharyngeal swabs were collected and tested by molecular methods for the presence of influenza viruses. RESULTS: Of 730 SARI patients, 59 (8%) were influenza positive; of 543 AFI patients, 34 (6%) were positive for influenza. Both SARI and AFI surveillance yielded influenza A(H3N2) (3% versus 1%), A(H1N1)pdm09 (2% versus 1%), and influenza B (3% versus 4%) in similar proportions. Data from both syndromes show year-round influenza transmission, with increased caseloads associated with the rainy seasons. CONCLUSIONS: Because an appreciable percentage of influenza cases (37%) presented without defined respiratory symptoms, and thus met the AFI but not the SARI definition, it is important to consider broader screening criteria (i.e., AFI) to identify all laboratory-confirmed influenza. The identified seasonality of influenza transmission has important implications for the timing of related public health interventions.
Social support, sexual violence, and transactional sex among female transnational migrants to South Africa
Giorgio M, Townsend L, Zembe Y, Guttmacher S, Kapadia F, Cheyip M, Mathews C. Am J Public Health 2016;106(6):1123-9. OBJECTIVES: To examine the relationship between sexual violence and transactional sex and assess the impact of social support on this relationship among female transnational migrants in Cape Town, South Africa. METHODS: In 2012 we administered a behavioral risk factor survey using respondent-driven sampling to transnational migrant women aged between 16 and 39 years, born outside South Africa, living in Cape Town, and speaking English, Shona, Swahili, Lingala, Kirundi, Kinyarwanda, French, or Somali. RESULTS: Controlling for study covariates, travel-phase sexual violence was positively associated with engagement in transactional sex (adjusted prevalence ratio [APR] = 1.38; 95% confidence interval [CI] = 1.07, 1.77), and social support was shown to be a protective factor (APR = 0.84; 95% CI = 0.75, 0.95). The interaction of experienced sexual violence during migration and social support score was APR = 0.85 (95% CI = 0.66, 1.10). In the stratified analysis, we found an increased risk of transactional sex among the low social support group (APR = 1.56; 95% CI = 1.22, 2.00). This relationship was not statistically significant among the moderate or high social support group (APR = 1.04; 95% CI = 0.58, 1.87). CONCLUSIONS: Programs designed to strengthen social support may reduce transactional sex among migrant women after they have settled in their receiving communities.
Notes from the field: Increase in Neisseria meningitidis-associated urethritis among men at two sentinel clinics - Columbus, Ohio, and Oakland County, Michigan, 2015
Bazan JA, Peterson AS, Kirkcaldy RD, Briere EC, Maierhofer C, Turner AN, Licon DB, Parker N, Dennison A, Ervin M, Johnson L, Weberman B, Hackert P, Wang X, Kretz CB, Abrams AJ, Trees DL, Del Rio C, Stephens DS, Tzeng YL, DiOrio M, Roberts MW. MMWR Morb Mortal Wkly Rep 2016;65(21):550-2. Neisseria meningitidis (Nm) urogenital infections, although less common than infections caused by Neisseria gonorrhoeae (Ng), have been associated with urethritis, cervicitis, proctitis, and pelvic inflammatory disease. Nm can appear similar to Ng on Gram stain analysis (gram-negative intracellular diplococci). Because Nm colonizes the nasopharynx, men who receive oral sex (fellatio) can acquire urethral Nm infections. This report describes an increase in Nm-associated urethritis in men attending sexual health clinics in Columbus, Ohio, and Oakland County, Michigan.
Prevalence of HIV testing and counseling and associated factors among secondary school students in Botswana
Bodika SM, Lekone PE, Loeto P, Alwano MG, Zulu TC, Kim E, Machao G, Voetsch AC. Int J Adolesc Med Health 2016;28(2):149-54. BACKGROUND: The World Health Organization recommends HIV testing and counseling (HTC) for all adolescents living in countries with generalized HIV epidemics. In Botswana, HIV prevalence is 3.7% among adolescents aged 15-19 years and 10% among pregnant adolescents. We describe the proportion and characteristics of secondary school students who have accessed HTC. METHODS: A multistage sample survey was conducted among students in Botswana's public secondary schools in 2010. The survey was self-administered using a personal digital assistant device. The HTC rate was estimated using self-reported history of HIV testing. RESULTS: Of 1,632 participants, 52% were girls, 43% were younger than 16 years, and 27% had ever had sexual intercourse. Most (81%) students knew where to get tested for HIV. Overall, 2.2% of students were HIV positive by self-report. The HTC rate was 23% overall, 34% among students who had ever had sexual intercourse, and 45% among students who had had sexual intercourse in the past 12 months. Among students who had ever had sexual intercourse, being pregnant or having made someone pregnant and having had sexual intercourse in the past 12 months were associated with having been tested for HIV. DISCUSSION: Overall, the HTC rate was low, and the self-reported HIV prevalence was high among secondary students in Botswana. Most sexually active students had never been tested for HIV. Health communications efforts for adolescents that increase demand for HTC, routine opt-out HIV testing in healthcare facilities, and school-based HIV testing are needed as part of a national HIV prevention strategy.
Progesterone levels associate with a novel population of CCR5+CD38+ CD4 T cells resident in the genital mucosa with lymphoid trafficking potential
Swaims-Kohlmeier A, Haaland RE, Haddad LB, Sheth AN, Evans-Strickfaden T, Lupo LD, Cordes S, Aguirre AJ, Lupoli KA, Chen CY, Ofotukun I, Hart CE, Kohlmeier JE. J Immunol 2016;197(1):368-76. The female genital tract (FGT) provides a means of entry to pathogens, including HIV, yet immune cell populations at this barrier between host and environment are not well defined. We initiated a study of healthy women to characterize resident T cell populations in the lower FGT from lavage and patient-matched peripheral blood to investigate potential mechanisms of HIV sexual transmission. Surprisingly, we observed that FGT CD4 T cell populations were primarily CCR7hi, consistent with a central memory or recirculating memory T cell phenotype. In addition, roughly half of these CCR7hi CD4 T cells expressed CD69, consistent with resident memory T cells, whereas the remaining CCR7hi CD4 T cells lacked CD69 expression, consistent with recirculating memory CD4 T cells that traffic between peripheral tissues and lymphoid sites. The HIV susceptibility markers CCR5 and CD38 were increased on FGT CCR7hi CD4 T cells compared with blood, yet migration to the lymphoid-homing chemokines CCL19 and CCL21 was maintained. Infection with GFP-HIV showed that FGT CCR7hi memory CD4 T cells are susceptible HIV targets, and productive infection of CCR7hi memory T cells did not alter chemotaxis to CCL19 and CCL21. Variations in resident CCR7hi FGT CD4 T cell populations were detected during the luteal phase of the menstrual cycle, and longitudinal analysis showed that the frequency of this population correlated positively with progesterone levels. These data provide evidence that women may acquire HIV through local infection of migratory CCR7hi CD4 T cells and that progesterone levels predict opportunities for HIV to access these novel target cells.
Public confidence in the health care system 1 year after the start of the Ebola virus disease outbreak - Sierra Leone, July 2015
Li W, Jalloh MF, Bunnell R, Aki-Sawyerr Y, Conteh L, Sengeh P, Redd JT, Hersey S, Morgan O, Jalloh MB, O'Leary A, Burdette E, Hageman K. MMWR Morb Mortal Wkly Rep 2016;65(21):538-42. Ensuring confidence in the health care system has been a challenge to Ebola virus disease (Ebola) response and recovery efforts in Sierra Leone (1). A national multistage cluster-sampled household survey to assess knowledge, attitudes, and practices (KAP) related to Sierra Leone's health care system was conducted in July 2015. Among 3,564 respondents, 93% were confident that a health care facility could treat suspected Ebola cases, and approximately 90% had confidence in the health system's ability to provide non-Ebola services, including immunizations, antenatal care, and maternity care. Respondents in districts with ongoing Ebola transmission ("active districts") and respondents with higher educational levels reported more confidence in the health care system than did respondents in nonactive districts and respondents with less education. Active districts were the focus of the Ebola response; these districts implemented intensified social mobilization and communication efforts, and established district response centers, Ebola-specific health care facilities, and ambulances. Greater infrastructure and response capacity might have resulted in higher confidence in the health care system in these areas. Respondents ranked Ebola and malaria as the country's most important health issues. Health system recovery efforts in Sierra Leone can build on existing public confidence in the health system.
Hospitalizations with lower respiratory tract infections among American Indian and Alaska Native children under age 5 years: the use of non-federal hospital discharge data to analyze rates
Singleton R, Holman RC. J Pediatr 2016;175:10-2. American Indian and Alaska Native (AI/AN) children aged <5 years who live on or near reservation communities and receive their health care through the Indian Health Service (IHS)/tribal healthcare system (IHS, tribal, and contract healthcare facilities) are known to have an extremely high rate of hospitalization associated with lower respiratory tract infections (LRTIs).1,2,3,4 Less is known about the rate of LRTI-associated hospitalization for AI/AN children who do not use the IHS/tribal healthcare system and receive hospital care only in nonfederal hospitals. The occurrence of LRTIs among these AI/AN children is critical to evaluate, because the majority of the AI/AN population (60%) lives outside of reservation communities4 and indigenous children in other industrialized countries experience disproportionately higher morbidity and mortality due to LRTIs.5 Three previous publications have compared the LRTI hospitalization rate for AI/AN children aged <5 years receiving care within the IHS/tribal healthcare system with that for the corresponding general US childhood population.1,2,3 Although the LRTI hospitalization rate declined during the study periods for both AI/AN children and the general US child population aged <5 years, the rate remained greater for AI/AN children. In the most recent study of these populations, Foote et al1 reported that the 2009-2011 average annual LRTI-associated hospitalization rate was 1.5-fold higher for AI/AN children (20 per 1000) than for the US child population aged <5 years (13.7 per 1000).1 Significant rate disparities were found among AI/AN children, with higher rates among infants and among children in the Alaska and Southwest IHS regions.
Human metapneumovirus circulation in the United States, 2008 to 2014
Haynes AK, Fowlkes AL, Schneider E, Mutuc JD, Armstrong GL, Gerber SI. Pediatrics 2016;137(5). BACKGROUND: Human metapneumovirus (HMPV) infection causes respiratory illness, including bronchiolitis and pneumonia. However, national HMPV seasonality, as it compares with respiratory syncytial virus (RSV) and influenza seasonality patterns, has not been well described. METHODS: Hospital and clinical laboratories reported weekly aggregates of specimens tested and positive detections for HMPV, RSV, and influenza to the National Respiratory and Enteric Virus Surveillance System from 2008 to 2014. A season was defined as consecutive weeks with ≥3% positivity for HMPV and ≥10% positivity for RSV and influenza during a surveillance year (July through June). For each virus, the season onset, offset, duration, peak, and 6-season medians were calculated. RESULTS: Among consistently reporting laboratories, 33,583 (3.6%) specimens were positive for HMPV, 281,581 (15.3%) for RSV, and 401,342 (18.2%) for influenza. Six distinct annual HMPV seasons occurred from 2008 to 2014, with onsets ranging from November to February and offsets from April to July. Based on the 6-season medians, RSV, influenza, and HMPV onsets occurred sequentially, and season durations were similar at 21 to 22 weeks. HMPV demonstrated a unique biennial pattern of alternating early and late seasonal onsets. RSV seasons (onset, offset, peak) were the most consistent and occurred before HMPV seasons. There were no consistent patterns between HMPV and influenza circulation. CONCLUSIONS: HMPV circulation begins in winter, lasts until spring, and demonstrates distinct seasons each year, with onset beginning after that of RSV. HMPV, RSV, and influenza can circulate simultaneously during the respiratory season.
The importance of population denominators for high-impact public health for marginalized populations
Purcell DW, Hall HI, Bernstein KL, Gift TL, McCray E, Mermin J. JMIR Public Health Surveill 2016;2(1):e26. The lack of consistent methods to enumerate population-level denominators for hidden populations has made it difficult for public health to articulate some of the most pressing disparities in America. For example, since the first cases of AIDS in the United States struck gay and bisexual men, injection drug users, and transgender persons, calculating rates of disease to compare impact across populations and geographic areas to highlight disparities and target resources has been challenging. While routine census data have allowed the Centers for Disease Control and Prevention (CDC) to calculate disease rates by sex, age, race/ethnicity, and geographic area [1], the census does not collect information on sexual orientation or same-sex sexual behavior, persons who inject drugs or injection behaviors, heterosexuals who are at higher risk of HIV infection, or transgender persons. This lack of information is nowhere more evident than among gay, bisexual, and other men who have sex with men (MSM), who comprise 67% of the estimated number of persons with HIV diagnosed in 2014 (70% when MSM who also inject drugs are included) [1]. Among youth ages 13 to 24, 80% of diagnoses in 2014 were among MSM or MSM who also inject drugs [1]. The impact of HIV on MSM has made them a key focus of the National HIV/AIDS Strategy (NHAS) [2,3]; yet proportions alone cannot accurately describe disparities, because the sizes of population denominators vary. Over the past 5 years, CDC has tried to fill the gap in national, population-wide denominators by using various analytic techniques to estimate the US population size of MSM [4], persons who inject drugs [5], and high-risk heterosexuals [6], and to estimate the population size of MSM and persons who inject drugs by urbanicity and region [7]. Other groups have tried to estimate the size of the population of transgender adults [8] and youth [9]. These national estimates have allowed for the calculation of disease rates for these populations for HIV and other sexually transmitted diseases, which in turn has allowed national disparities to be highlighted and federal resources to be better targeted to maximize health impact and increase equity. MSM, who constitute 4% of men in the United States [4], have HIV prevalence and diagnosis rates at least 40 times as great, and syphilis rates at least 60 times as great, as those of women and other men [4]. However, national estimates may not be applicable to state or local areas because the proportion of the population that is MSM may differ greatly between and within states. Therefore, more refined local estimates are necessary to help plan local programs and allocate resources.
Addressing the medical and support service needs of people living with HIV (PLWH) through Program Collaboration and Service Integration (PCSI)
Bernard S, Tailor A, Jones P, Alexander DE. Calif J Health Promot 2016;14(1):1-14. Background: Approximately 1.2 million Americans are living with HIV, and about 50,000 new infections occur each year. People living with HIV (PLWH) have numerous medical and psychosocial needs that impact HIV disease progression and challenge treatment outcomes. Purpose: Using CDC's Program Collaboration and Service Integration (PCSI) framework, we examined strategies, challenges, and lessons learned from a local health department's efforts to institute PCSI to address the diverse needs of their patients with HIV. Methods: We captured case study data through: (1) semi-structured interviews with key program administrators, (2) analysis of program documents, and (3) site observations and review of clinic procedures. Results: Findings highlight the importance of co-locating services, partnering to leverage resources, and conducting cross-training of staff. Providing co-located services reduced wait times and enhanced coordination of care. Partnering to leverage resources increased patient referrals and enhanced access to comprehensive services. Staff cross-training resulted in more coordinated care and efficient service delivery. Conclusion: The results show that PCSI is essential for optimal care for PLWH. Incorporating PCSI was a vital component of the health department's comprehensive approach to addressing the multiple medical and support service needs of its HIV-infected clients.
Centrofacial balamuthiasis: Case report of a rare cutaneous amebic infection
Chang OH , Liu F , Knopp E , Muehlenbachs A , Cope JR , Ali I , Thompson R , George E . J Cutan Pathol 2016 43 (10) 892-7 Free-living amebae are ubiquitous in our environment, but rarely cause cutaneous infection. Balamuthia mandrillaris has a predilection for infecting skin of the central face. Infection may be restricted to the skin or associated with life-threatening central nervous system (CNS) involvement. We report a case of a 91-year-old woman, who presented with a non-healing red plaque over her right cheek. Several punch biopsies exhibited non-specific granulomatous inflammation without demonstrable fungi or mycobacteria in histochemical stains. She was treated empirically for granulomatous rosacea, but the lesion continued to progress. A larger incisional biopsy was performed in which amebae were observed in hematoxylin-eosin stained sections. These were retrospectively apparent in the prior punch biopsy specimens. Immunohistochemistry and polymerase chain reaction studies identified the organisms as Balamuthia mandrillaris. Cutaneous infection by Balamuthia mandrillaris is a rare condition that is sometimes complicated by life-threatening CNS involvement and which often evades timely diagnosis due to its rarity and nonspecific clinical manifestations. Moreover, these amebae are easily overlooked in histopathologic sections because of their small number and their resemblance to histiocytes. Dermatopathologists should be familiar with the histopathologic appearance of these organisms and include balamuthiasis and other amebic infections in the differential diagnosis of granulomatous dermatitis. |
Culex tarsalis Mosquitoes as Vectors of Highlands J Virus.
Borland EM , Ledermann JP , Powers AM . Vector Borne Zoonotic Dis 2016 16 (8) 558-65 Highlands J virus (HJV) is an alphavirus closely related to western equine encephalitis virus (WEEV) and eastern equine encephalitis virus (EEEV). HJV is an avian pathogen with the potential for disruption of poultry operations, but is not known to cause human or equine disease. HJV has only been identified in the eastern United States and is thought to have a transmission cycle similar to that of EEEV involving Culiseta melanura mosquitoes and birds. However, HJV is genetically more similar to WEEV, and it remains unclear whether it, like WEEV, can be transmitted by Culex species mosquitoes. Seven strains of HJV were characterized to assess this potential. Phylogenetic analysis of whole genome sequences revealed four distinct HJV lineages (lineages 1-4), and vector competence studies in Cx. tarsalis with four of the HJV strains from different lineages yielded two distinct infection patterns. Lineage 1 strains had low infection rates, while lineages 2 and 4 had significantly higher infection rates similar to those previously published for WEEV. The average mosquito body viral titer was highest at 8 dpi (6.60-7.26 log10 pfu equivalents/body), and head titers at all time points ranged between 6.01 and 6.80 log10 pfu equivalents/head. Nearly 45% of mosquitoes infected with strain AB-80-9 were able to transmit virus in saliva with an average titer of 5.02 log10 pfu equivalents/saliva. A single amino acid difference between high and low infectivity phenotypes was identified at genome position 8605, in the E2 gene. A nonpolar glycine was present in the low infectivity lineage 1 strains, while an acidic glutamic acid was present in the higher infectivity lineage 2 and 4 strains. This study demonstrates HJV transmission by Cx. tarsalis mosquitoes and clearly identifies the potential for transmission in the western United States. 
Two infection phenotypes were exhibited, indicating the need for further studies to understand Culex species transmission patterns. |
Global distribution and environmental suitability for chikungunya virus, 1952 to 2015.
Nsoesie EO , Kraemer MU , Golding N , Pigott DM , Brady OJ , Moyes CL , Johansson MA , Gething PW , Velayudhan R , Khan K , Hay SI , Brownstein JS . Euro Surveill 2016 21 (20) Chikungunya fever is an acute febrile illness caused by the chikungunya virus (CHIKV), which is transmitted to humans by Aedes mosquitoes. Although chikungunya fever is rarely fatal, patients can experience debilitating symptoms that last from months to years. Here we comprehensively assess the global distribution of chikungunya and produce high-resolution maps, using an established modelling framework that combines a comprehensive occurrence database with bespoke environmental correlates, including up-to-date Aedes distribution maps. This enables estimation of the current total population-at-risk of CHIKV transmission and identification of areas where the virus may spread to in the future. We identified 94 countries with good evidence for current CHIKV presence and a set of countries in the New and Old World with potential for future CHIKV establishment, demonstrated by high environmental suitability for transmission and in some cases previous sporadic reports. Aedes aegypti presence was identified as one of the major contributing factors to CHIKV transmission but significant geographical heterogeneity exists. We estimated 1.3 billion people are living in areas at-risk of CHIKV transmission. These maps provide a baseline for identifying areas where prevention and control efforts should be prioritised and can be used to guide estimation of the global burden of CHIKV. |
Prenatal phthalate exposure and infant size at birth and gestational duration
Shoaff JR , Romano ME , Yolton K , Lanphear BP , Calafat AM , Braun JM . Environ Res 2016 150 52-58 BACKGROUND: Phthalate exposure is widespread. Prior research suggests that prenatal phthalate exposure may influence birth size and gestational duration, but published results have been inconsistent. OBJECTIVE: We quantified the relationship between maternal urinary phthalate concentrations and infant birth weight z-scores, length, head circumference, and gestational duration. METHODS: In a cohort of 368 women from the HOME Study, based in Cincinnati, OH, we measured nine phthalate metabolites representing exposure to six parent phthalate diesters in urine collected at approximately 16 and 26 weeks gestation. Infant birth size and gestational duration were abstracted from medical records. We used multivariable linear regression to estimate covariate adjusted associations between urinary phthalate metabolite concentrations and infant outcomes. RESULTS: In unadjusted models, we observed a negative association between monoethyl phthalate (MEP) and birth weight z-scores, while mono-3-carboxypropyl phthalate (MCPP) was positively associated with gestational duration. After covariate adjustment, phthalate metabolite concentrations were no longer associated with birth size or gestational duration. CONCLUSIONS: In this cohort, urinary phthalate metabolite concentrations during pregnancy were not associated with infant birth size or gestational duration. Additional research is needed to determine if exposures during earlier periods of fetal development are associated with infant health. |
Identifying heat-related deaths by using medical examiner and vital statistics data: Surveillance analysis and descriptive epidemiology - Oklahoma, 1990-2011
Johnson MG , Brown S , Archer P , Wendelboe A , Magzamen S , Bradley KK . Environ Res 2016 150 30-37 OBJECTIVES: Approximately 660 deaths occur annually in the United States associated with excess natural heat. A record heat wave in Oklahoma during 2011 generated increased interest concerning heat-related mortality among public health preparedness partners. We aimed to improve surveillance for heat-related mortality and better characterize heat-related deaths in Oklahoma during 1990-2011, and to enhance public health messaging during future heat emergencies. METHODS: Heat-related deaths were identified by querying vital statistics (VS) and medical examiner (ME) data during 1990-2011. Case inclusion criteria were developed by using heat-related International Classification of Diseases codes, cause-of-death nomenclature, and ME investigation narrative. We calculated sensitivity and predictive value positive (PVP) for heat-related mortality surveillance by using VS and ME data and performed a descriptive analysis. RESULTS: During the study period, 364 confirmed and probable heat-related deaths were identified when utilizing both data sets. ME reports had 87% sensitivity and 74% PVP; VS reports had 80% sensitivity and 52% PVP. Compared to Oklahoma's general population, decedents were disproportionately male (67% vs. 49%), aged ≥65 years (46% vs. 14%), and unmarried (78% vs. 47%). Higher rates of heat-related mortality were observed among Blacks. Of 95 decedents with available information, 91 (96%) did not use air conditioning. CONCLUSIONS: Linking ME and VS data sources together and using narrative description for case classification allows for improved case ascertainment and surveillance data quality. Males, Blacks, persons aged ≥65 years, unmarried persons, and those without air conditioning carry a disproportionate burden of the heat-related deaths in Oklahoma. |
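The surveillance metrics named in the abstract above (sensitivity and predictive value positive) reduce to simple ratios of confirmed versus flagged deaths. A minimal sketch, using hypothetical counts chosen only to mirror the reported ME figures (87% sensitivity, 74% PVP), not the actual Oklahoma case data:

```python
# Illustrative calculation of surveillance sensitivity and predictive value
# positive (PVP). The counts below are hypothetical, not the study's data.

def sensitivity(true_pos: int, false_neg: int) -> float:
    """Fraction of all confirmed heat-related deaths the data source captured."""
    return true_pos / (true_pos + false_neg)

def predictive_value_positive(true_pos: int, false_pos: int) -> float:
    """Fraction of deaths flagged by the source that were truly heat-related."""
    return true_pos / (true_pos + false_pos)

# Hypothetical source: flags 100 deaths, of which 74 are truly heat-related,
# while missing 11 confirmed heat-related deaths.
sens = sensitivity(true_pos=74, false_neg=11)
pvp = predictive_value_positive(true_pos=74, false_pos=26)
print(f"Sensitivity: {sens:.0%}, PVP: {pvp:.0%}")
```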
Use of Whole Genome Sequencing and Patient Interviews To Link a Case of Sporadic Listeriosis to Consumption of Prepackaged Lettuce.
Jackson KA , Stroika S , Katz LS , Beal J , Brandt E , Nadon C , Reimer A , Major B , Conrad A , Tarr C , Jackson BR , Mody RK . J Food Prot 2016 79 (5) 806-809 We report on a case of listeriosis in a patient who probably consumed a prepackaged romaine lettuce-containing product recalled for Listeria monocytogenes contamination. Although definitive epidemiological information demonstrating exposure to the specific recalled product was lacking, the patient reported consumption of a prepackaged romaine lettuce-containing product of either the recalled brand or a different brand. A multinational investigation found that patient and food isolates from the recalled product were indistinguishable by pulsed-field gel electrophoresis and were highly related by whole genome sequencing, differing by four alleles by whole genome multilocus sequence typing and by five high-quality single nucleotide polymorphisms, suggesting a common source. To our knowledge, this is the first time prepackaged lettuce has been identified as a likely source for listeriosis. This investigation highlights the power of whole genome sequencing, as well as the continued need for timely and thorough epidemiological exposure data to identify sources of foodborne infections. |
Molecular characterization of the first G24P[14] rotavirus strain detected in humans.
Ward ML , Mijatovic-Rustempasic S , Roy S , Rungsrisuriyachai K , Boom JA , Sahni LC , Baker CJ , Rench MA , Wikswo ME , Payne DC , Parashar UD , Bowen MD . Infect Genet Evol 2016 43 338-42 Here we report the genome of a novel rotavirus A (RVA) strain detected in a stool sample collected during routine surveillance by the Centers for Disease Control and Prevention's New Vaccine Surveillance Network. The strain, RVA/human-wt/USA/2012741499/2012/G24P[14], has a genomic constellation of G24-P[14]-I2-R2-C2-M2-A3-N2-T9-E2-H3. The VP2, VP3, VP7 and NSP3 genes cluster phylogenetically with bovine strains. The other genes occupy mixed clades containing animal and human strains. Strain RVA/human-wt/USA/2012741499/2012/G24P[14] most likely is the product of interspecies transmission and reassortment events. This is the second report of the G24 genotype and the first report of the G24P[14] genotype combination in humans. |
Implementation of Whole Genome Sequencing (WGS) for Identification and Characterization of Shiga Toxin-Producing Escherichia coli (STEC) in the United States.
Lindsey RL , Pouseele H , Chen JC , Strockbine NA , Carleton HA . Front Microbiol 2016 7 766 Shiga toxin-producing Escherichia coli (STEC) is an important foodborne pathogen capable of causing severe disease in humans. Rapid and accurate identification and characterization techniques are essential during outbreak investigations. Current methods for characterization of STEC are expensive and time-consuming. With the advent of rapid and cheap whole genome sequencing (WGS) benchtop sequencers, the potential exists to replace traditional workflows with WGS. The aim of this study was to validate tools to perform reference identification and characterization from WGS for STEC in a single workflow within an easy-to-use, commercially available software platform. Publicly available serotype, virulence, and antimicrobial resistance databases were downloaded from the Center for Genomic Epidemiology (CGE) (www.genomicepidemiology.org) and integrated into a genotyping plug-in with in silico PCR tools to confirm some of the virulence genes detected from WGS data. Additionally, downsampling experiments on the WGS sequence data were performed to determine a threshold for sequence coverage needed to accurately predict serotype and virulence genes using the established workflow. The serotype database was tested on a total of 228 genomes and correctly predicted 96.1% of O serogroups and 96.5% of H serogroups identified by conventional testing techniques. A total of 59 genomes were evaluated to determine the threshold of coverage to detect the different WGS targets: 40 were evaluated for serotype and virulence gene detection, and 19 for the stx gene subtypes. For serotype, 95% of the O and 100% of the H serogroups were detected at > 40x and ≥ 30x coverage, respectively. For virulence targets and stx gene subtypes, nearly all genes were detected at > 40x, though some targets were 100% detectable from genomes with coverage ≥20x. 
The resistance detection tool was 97% concordant with phenotypic testing results. With isolates sequenced to > 40x coverage, the different databases accurately predicted serotype, virulence, and resistance from WGS data, providing a fast and cheaper alternative to conventional typing techniques. |
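The coverage thresholds above (> 40x for reliable target detection) relate directly to how much sequence data must be generated per isolate. A back-of-the-envelope sketch using the standard mean-coverage formula (total sequenced bases divided by genome size); the read length and genome size below are illustrative assumptions, not parameters from the study:

```python
# Rough sequencing-coverage arithmetic. The formula is the standard
# definition of mean fold-coverage; the read counts, read length, and
# genome size here are hypothetical examples.

def mean_coverage(n_reads: int, read_length: int, genome_size: int) -> float:
    """Average fold-coverage across the genome."""
    return n_reads * read_length / genome_size

# Hypothetical STEC-sized genome (~5.5 Mb) sequenced with 250 bp reads:
reads_for_40x = 40 * 5_500_000 / 250   # reads needed to reach ~40x
print(f"~{reads_for_40x:,.0f} reads for 40x coverage")
print(f"{mean_coverage(1_000_000, 250, 5_500_000):.1f}x from 1M reads")
```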
Detailed phylogenetic analysis of primate T-lymphotropic virus type 1 (PTLV-1) sequences from orangutans (Pongo pygmaeus) reveals new insights into the evolutionary history of PTLV-1 in Asia.
Reid MJ , Switzer WM , Schillaci MA , Ragonnet M , Joanisse I , Caminiti K , Lowenberger CA , Galdikas BM , Sandstrom PA , Brooks JI . Infect Genet Evol 2016 43 434-50 While human T-lymphotropic virus type 1 (HTLV-1) originates from ancient cross-species transmission of simian T-lymphotropic virus type 1 (STLV-1) from infected nonhuman primates, much debate exists on whether the first HTLV-1 occurred in Africa, or in Asia during early human evolution and migration. This topic is complicated by a lack of representative Asian STLV-1 to infer PTLV-1 evolutionary histories. In this study we obtained new STLV-1 LTR and tax sequences from a wild-born Bornean orangutan (Pongo pygmaeus) and performed detailed phylogenetic analyses using both maximum likelihood and Bayesian inference of available Asian PTLV-1 and African STLV-1 sequences. Phylogenies, divergence dates and nucleotide substitution rates were co-inferred and compared using six different molecular clock calibrations in a Bayesian framework, including both archaeological and/or nucleotide substitution rate calibrations. We then combined our molecular results with paleobiogeographical and ecological data to infer the most likely evolutionary history of PTLV-1. Our analyses robustly inferred an Asian source for PTLV-1 with cross-species transmission of STLV-1 likely from a macaque (Macaca sp.) to an orangutan about 207.5-17.2 kya, and to humans between 125.9-10.4 kya. An orangutan diversification of STLV-1 commenced approximately 3.2 - 37.5 kya. Our analyses also inferred that HTLV-1 was first introduced into Australia ~3.1-3.7 kya, corresponding to both genetic and archaeological changes occurring in Australia at that time. Finally, HTLV-1 appears in Melanesia at ~2.3-2.7 kya corresponding to the migration of the Lapita peoples into the region. Our results also provide an important future reference for calibrating information essential for PTLV evolutionary timescale inference. 
Longer sequence data, or full genomes from a greater representation of Asian primates, including gibbons, leaf monkeys, and Sumatran orangutans are needed to fully elucidate these evolutionary dates and relationships using the model criteria suggested herein. |
Genetic Basis of Irritant Susceptibility in Health Care Workers.
Yucesoy B , Talzhanov Y , Barmada MM , Johnson VJ , Kashon ML , Baron E , Wilson NW , Frye B , Wang W , Fluharty K , Gharib R , Meade J , Germolec D , Luster MI , Nedorost S . J Occup Environ Med 2016 58 (8) 753-9 OBJECTIVE: The aim of this study was to investigate the association of single nucleotide polymorphisms (SNPs) within genes involved in inflammation, skin barrier integrity, signaling/pattern recognition, and antioxidant defense with irritant susceptibility in a group of health care workers. METHODS: The 536 volunteer subjects were genotyped for selected SNPs and patch tested with three model irritants: sodium lauryl sulfate (SLS), sodium hydroxide (NaOH), and benzalkonium chloride (BKC). Genotyping was performed on genomic DNA using Illumina Goldengate custom panels. RESULTS: The ACACB (rs2268387, rs16934132, rs2284685), NTRK2 (rs10868231), NTRK3 (rs1347424), IL22 (rs1179251), PLAU (rs2227564), EGFR (rs6593202), and FGF2 (rs308439) SNPs showed an association with skin response to tested irritants in different genetic models (all at P < 0.001). Functional annotations identified two SNPs in PLAU (rs2227564) and ACACB (rs2284685) genes with a potential impact on gene regulation. In addition, EGF (rs10029654), EGFR (rs12718939), CXCL12 (rs197452), and VCAM1 (rs3917018) genes showed an association with hand dermatitis (P < 0.005). CONCLUSIONS: The results demonstrate that genetic variations in genes related to inflammation and skin homeostasis can influence responses to irritants and may explain inter-individual variation in the development of subsequent contact dermatitis. |
IDF Diabetes Atlas estimates of 2014 global health expenditures on diabetes
da Rocha Fernandes J , Ogurtsova K , Linnenkamp U , Guariguata L , Seuring T , Zhang P , Cavan D , Makaroff LE . Diabetes Res Clin Pract 2016 117 48-54 Aims: To estimate health expenditures due to diabetes in 2014 for the world and its regions. Methods: Diabetes-attributable health expenditures were estimated using an attributable fraction method. Data were sourced from International Diabetes Federation (IDF) estimates of diabetes prevalence, UN population projections, WHO annual health expenditure reports, and estimates of the cost ratio of people with and without diabetes. Health expenditures were calculated in both US dollars (USD) and international dollars (ID). Results: The average health expenditure per person with diabetes worldwide in 2014 was estimated to range from USD 1583 (ID 1742) to USD 2842 (ID 3110). The estimated annual global health expenditure attributable to diabetes ranged from USD 612 billion (ID 673 billion) to USD 1099 billion (ID 1202 billion). Together, the North America and Caribbean Region and the Europe Region were responsible for over 69% of the costs, and less than 10% of the costs were from the Africa Region, South East Asia Region, and Middle East and North Africa Region combined. The North America and Caribbean Region had the highest annual spending per person with diabetes (USD 7984 [ID 8040.39]), while the South East Asia Region had the lowest annual spending per person with diabetes (USD 92 [ID 234]). Conclusions: Diabetes imposes a large economic burden on health care systems across the world, though this burden varies across world regions. Diabetes prevention and effective management of diabetes should be a public health priority to reduce the financial burden. |
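The abstract above describes an attributable-fraction method built on prevalence and a cost ratio. A minimal sketch of one common textbook formulation of that calculation; this is not necessarily the exact IDF Diabetes Atlas method, and the prevalence, cost ratio, and spending figures are invented for illustration:

```python
# Hedged sketch of an attributable-fraction cost calculation, assuming the
# generic formulation AF = P*(R-1) / (P*(R-1) + 1), where P is prevalence
# and R is the per-person cost ratio (with vs. without the condition).
# All input numbers below are hypothetical.

def attributable_fraction(prevalence: float, cost_ratio: float) -> float:
    """Share of total health spending attributable to the condition."""
    excess = prevalence * (cost_ratio - 1)
    return excess / (excess + 1)

# Hypothetical country: 9% adult diabetes prevalence, people with diabetes
# costing 2.5x as much, USD 500 billion total health expenditure.
af = attributable_fraction(prevalence=0.09, cost_ratio=2.5)
diabetes_cost = af * 500e9
print(f"AF = {af:.1%}, attributable spending = USD {diabetes_cost / 1e9:.0f} billion")
```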
Characteristics of Medicare Advantage and fee-for-service beneficiaries upon enrollment in Medicare at age 65
Miller EA , Decker SL , Parker JD . J Ambul Care Manage 2016 39 (3) 231-41 Previous research has found differences in characteristics of beneficiaries enrolled in Medicare fee-for-service versus Medicare Advantage (MA), but there has been limited research using more recent MA enrollment data. We used 1997-2005 National Health Interview Survey data linked to 2000-2009 Medicare enrollment data to compare characteristics of Medicare beneficiaries before their initial enrollment into Medicare fee-for-service or MA at age 65 and whether the characteristics of beneficiaries changed from 2006 to 2009 compared with 2000 to 2005. During this period of MA growth, the greatest increase in enrollment appears to have come from those with no chronic conditions and men. |
Comparing 2- and 3-dose 9-valent HPV vaccine schedules in the U.S.: A cost-effectiveness analysis
Laprise JF , Markowitz LE , Chesson HW , Drolet M , Brisson M . J Infect Dis 2016 214 (5) 685-8 A recent clinical trial using the 9-valent HPV vaccine has shown that antibody responses after 2 doses are non-inferior to 3 doses, suggesting that 2 and 3 doses may have comparable vaccine efficacy. We used an individual-based transmission-dynamic model to compare the population-level effectiveness and cost-effectiveness of 2- and 3-dose schedules of 9-valent HPV vaccine in the United States. Our model predicts that if 2 doses of 9-valent vaccine protect for ≥20 years: 1) the additional benefits of a 3-dose schedule are small compared to 2-dose schedules, and 2) 2-dose schedules are likely much more cost-efficient than 3-dose schedules. |
Notes from the field: Investigation of hepatitis C virus transmission associated with injection therapy for chronic pain - California, 2015
Foster MA , Grigg C , Hagon J , Batson PA , Kim J , Choi M , Moorman A , Dean C . MMWR Morb Mortal Wkly Rep 2016 65 (21) 547-9 On November 26, 2014, the California Department of Public Health (CDPH) contacted CDC concerning a report from the Santa Barbara County Public Health Department (SBPHD) regarding acute hepatitis C virus (HCV) infection in a repeat blood donor. The patient, who was asymptomatic, was first alerted to the infection by the blood bank and had no traditional risk factors for HCV infection. The donor had a negative HCV nucleic acid test (NAT) 56 days before the first positive NAT, and an investigation into the donor's health care exposures and other potential risk factors, including injection drug use, incarceration, and long-term hemodialysis within this narrow exposure window, was conducted by SBPHD. |
A program to prevent catheter-associated urinary tract infection in acute care
Saint S , Greene MT , Krein SL , Rogers MA , Ratz D , Fowler KE , Edson BS , Watson SR , Meyer-Lucas B , Masuga M , Faulkner K , Gould CV , Battles J , Fakih MG . N Engl J Med 2016 374 (22) 2111-9 BACKGROUND: Catheter-associated urinary tract infection (UTI) is a common device-associated infection in hospitals. Both technical factors--appropriate catheter use, aseptic insertion, and proper maintenance--and socioadaptive factors, such as cultural and behavioral changes in hospital units, are important in preventing catheter-associated UTI. METHODS: The national Comprehensive Unit-based Safety Program, funded by the Agency for Healthcare Research and Quality, aimed to reduce catheter-associated UTI in intensive care units (ICUs) and non-ICUs. The main program features were dissemination of information to sponsor organizations and hospitals, data collection, and guidance on key technical and socioadaptive factors in the prevention of catheter-associated UTI. Data on catheter use and catheter-associated UTI rates were collected during three phases: baseline (3 months), implementation (2 months), and sustainability (12 months). Multilevel negative binomial models were used to assess changes in catheter use and catheter-associated UTI rates. RESULTS: Data were obtained from 926 units (59.7% were non-ICUs, and 40.3% were ICUs) in 603 hospitals in 32 states, the District of Columbia, and Puerto Rico. The unadjusted catheter-associated UTI rate decreased overall from 2.82 to 2.19 infections per 1000 catheter-days. In an adjusted analysis, catheter-associated UTI rates decreased from 2.40 to 2.05 infections per 1000 catheter-days (incidence rate ratio, 0.86; 95% confidence interval [CI], 0.76 to 0.96; P=0.009). 
Among non-ICUs, catheter use decreased from 20.1% to 18.8% (incidence rate ratio, 0.93; 95% CI, 0.90 to 0.96; P<0.001) and catheter-associated UTI rates decreased from 2.28 to 1.54 infections per 1000 catheter-days (incidence rate ratio, 0.68; 95% CI, 0.56 to 0.82; P<0.001). Catheter use and catheter-associated UTI rates were largely unchanged in ICUs. Tests for heterogeneity (ICU vs. non-ICU) were significant for catheter use (P=0.004) and catheter-associated UTI rates (P=0.001). CONCLUSIONS: A national prevention program appears to reduce catheter use and catheter-associated UTI rates in non-ICUs. (Funded by the Agency for Healthcare Research and Quality.). |
Measles immunity among pregnant women aged 15-44 years in Namibia, 2008 & 2010
Cardemil CV , Jonas A , Beukes A , Anderson R , Rota PA , Bankamp B , Jr HE , Sawadogo S , Patel SV , Zeko S , Muroua C , Gaeb E , Wannemuehler K , Gerber S , Goodson JL . Int J Infect Dis 2016 49 189-95 BACKGROUND: Namibia experienced a large measles outbreak starting in 2009, with 38% of reported cases in adults, including women of reproductive age. We assessed population immunity among pregnant women, to determine if immunization activities were needed in adults to achieve measles elimination in Namibia. METHODS: We tested 1,708 and 2,040 specimens for measles immunoglobulin G antibody from Namibian pregnant women aged 15-44 years sampled from the 2008 and 2010 National HIV Sentinel Survey, respectively. We determined the proportion of women seropositive overall and by 5-year age strata, and analyzed factors associated with seropositivity by logistic regression, including age, facility type, gravidity, HIV status, and urban/rural status. We tested for any difference in seropositivity between 2008 and 2010. RESULTS: In both analysis years, measles seropositivity was lower in 15-19 year olds (77%) and 20-24 year olds (85-87%) and higher in 25-44 year olds (90%-94%) (p<0.001, 2008; p<0.001, 2010). Overall measles seropositivity did not differ between 2008 (87%) and 2010 (87%) (p=0.7). HIV status did not affect seropositivity. CONCLUSIONS: Late in a large measles outbreak, 13% of pregnant women in Namibia, and almost one in four 15-19 year old pregnant women, remained measles-susceptible. In Namibia, immunization campaigns with measles-containing vaccine should be considered for adults. |
Pertussis vaccine effectiveness in the setting of pertactin-deficient pertussis
Breakwell L , Kelso P , Finley C , Schoenfeld S , Goode B , Misegades LK , Martin SW , Acosta AM . Pediatrics 2016 137 (5) BACKGROUND: In the United States, the proportion of Bordetella pertussis isolates lacking pertactin, a component of acellular pertussis vaccines, increased from 14% in 2010 to 85% in 2012. The impact on vaccine effectiveness (VE) is unknown. METHODS: We conducted 2 matched case-control evaluations in Vermont to assess VE of the 5-dose diphtheria, tetanus, and acellular pertussis vaccine (DTaP) series among 4- to 10-year-olds, and tetanus, diphtheria, and acellular pertussis vaccine (Tdap) among 11- to 19-year-olds. Cases reported during 2011 to 2013 were included. Three controls were matched to each case by medical home, and additionally by birth year for the Tdap evaluation. Vaccination history was obtained from medical records and parent interviews. Odds ratios (OR) were calculated by using conditional logistic regression; VE was estimated as (1-OR) x 100%. Pertactin status was determined for cases with available isolates. RESULTS: Overall DTaP VE was 84% (95% confidence interval [CI] 58%-94%). VE within 12 months of dose 5 was 90% (95% CI 71%-97%), declining to 68% (95% CI 10%-88%) by 5-7 years post-vaccination. Overall Tdap VE was 70% (95% CI 54%-81%). Within 12 months of Tdap vaccination, VE was 76% (95% CI 60%-85%), declining to 56% (95% CI 16%-77%) by 2-4 years post-vaccination. Of cases with available isolates, >90% were pertactin-deficient. CONCLUSIONS: Our DTaP and Tdap VE estimates remain similar to those found in other settings, despite high prevalence of pertactin deficiency in Vermont, suggesting these vaccines continue to be protective against reported pertussis disease. |
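The abstract above states the estimator explicitly: VE = (1 - OR) x 100%. A minimal sketch applying it; the odds ratios below are back-calculated from the reported DTaP point estimate and 95% CI (84%, 58%-94%) rather than taken from the study's data:

```python
# Vaccine effectiveness from a case-control odds ratio, per the formula
# VE = (1 - OR) x 100% given in the abstract. The OR values are
# back-calculated from the reported DTaP VE of 84% (95% CI 58%-94%).

def vaccine_effectiveness(odds_ratio: float) -> float:
    """Percent vaccine effectiveness from a case-control odds ratio."""
    return (1 - odds_ratio) * 100

or_point, or_upper, or_lower = 0.16, 0.42, 0.06  # OR and its 95% CI bounds
ve = vaccine_effectiveness(or_point)
ve_low = vaccine_effectiveness(or_upper)   # upper OR bound -> lower VE bound
ve_high = vaccine_effectiveness(or_lower)  # lower OR bound -> upper VE bound
print(f"VE = {ve:.0f}% (95% CI {ve_low:.0f}%-{ve_high:.0f}%)")
```

Note that the confidence bounds swap: the upper limit of the OR yields the lower limit of VE, and vice versa.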
Evaluation of effectiveness of mixed rotavirus vaccine course for rotavirus gastroenteritis
Payne DC , Sulemana I , Parashar UD . JAMA Pediatr 2016 170 (7) 708-10 Two rotavirus vaccines—RotaTeq (RV5; Merck and Company), a 3-dose series, and Rotarix (RV1; GlaxoSmithKline Biologicals), a 2-dose series—are licensed for use in US children. The US Advisory Committee for Immunization Practices (ACIP) recommends that a rotavirus vaccine series be completed with the same product whenever possible but allows for administering mixed vaccine types if a previous dose type is not available or is unknown.1 In such situations, the ACIP recommends, “If any dose in the series was RV5 or the vaccine product is unknown for any dose in the series, a total of 3 doses of rotavirus vaccine should be administered.”1 However, the effectiveness of a mixed rotavirus vaccine series remains unclear. | We evaluated the postlicensure vaccine effectiveness (VE) of a complete 3-dose course of mixed rotavirus vaccine types according to the ACIP definition and compared these results with published VE results for the same population and time. |
Facial nerve palsy including Bell's palsy: case definitions and guidelines for collection, analysis, and presentation of immunisation safety data
Rath B , Gidudu JF , Anyoti H , Bollweg B , Caubel P , Chen YH , Cornblath D , Fernandopulle R , Fries L , Galama J , Gibbs N , Grilli G , Grogan P , Hartmann K , Heininger U , Hudson MJ , Izurieta HS , Jevaji I , Johnson WM , Jones J , Keller-Stanislawski B , Klein J , Kohl K , Kokotis P , Li Y , Linder T , Oleske J , Richard G , Shafshak T , Vajdy M , Wong V , Sejvar J . Vaccine 2016 35 (15) 1972-1983 Facial nerve palsy is classified based on the location of its lesion. Central facial nerve palsy is the consequence of an upper motor neuron (UMN) lesion of the 7th cranial nerve, while peripheral palsy is due to a lesion of a lower motor neuron (LMN). Peripheral facial nerve palsy is the partial (i.e., paresis) or complete (i.e., paralysis) loss of function of some or all of the structures innervated by the facial nerve (i.e. cranial nerve VII). Facial nerve palsy is also classified by the time course of its development, depending on whether onset is acute (minutes to days), subacute (days to weeks), or chronic (longer than weeks). Acute onset facial palsies are common. The most common cause of acute onset, central facial palsy is stroke. However, of the acute onset, peripheral facial palsies, the most common syndrome is that of idiopathic, acute onset, peripheral facial palsy, better known as Bell's palsy. Henceforth in this document, it will be understood that, when discussing Bell's palsy, we are referring to peripheral facial palsy that is ‘acute-onset’. | Clinical signs of peripheral facial nerve palsy include loss of facial tone with obliteration of the naso-labial fold, inability to raise the eyebrows and wrinkle the forehead, smile, open or draw the corner of the mouth, and completely close the eye on the affected side [1], [2], [3], [4]. They may further include hyperacusis, dryness of eye and decreased salivation. Peripheral facial nerve palsy most commonly presents on one side of the face, leading to facial asymmetry, or “facial droop” [1], [5]. 
Simultaneous bilateral acute-onset cases have also been described and are now recognised as an uncommon clinical feature [6], [7], [8], [9], [10], [11], [12], [13]. |
Implementation of coordinated global serotype 2 oral poliovirus vaccine cessation: risks of inadvertent trivalent oral poliovirus vaccine use
Duintjer Tebbens RJ , Hampton LM , Thompson KM . BMC Infect Dis 2016 16 (1) 237 BACKGROUND: The endgame for polio eradication includes coordinated global cessation of oral poliovirus vaccine (OPV), starting with the cessation of vaccine containing OPV serotype 2 (OPV2) by switching all trivalent OPV (tOPV) to bivalent OPV (bOPV). The logistics associated with this global switch represent a significant undertaking, with some possibility of inadvertent tOPV use after the switch. METHODS: We used a previously developed poliovirus transmission and OPV evolution model to explore the relationships between the extent of inadvertent tOPV use, the time after the switch at which the inadvertent tOPV use occurs and the corresponding population immunity to serotype 2 poliovirus transmission, and the ability of the inadvertently introduced viruses to cause a serotype 2 circulating vaccine-derived poliovirus (cVDPV2) outbreak in a hypothetical population. We then estimated the minimum time until inadvertent tOPV use in a supplemental immunization activity (SIA) or in routine immunization (RI) can lead to a cVDPV2 outbreak in realistic populations with properties like those of northern India, northern Pakistan and Afghanistan, northern Nigeria, and Ukraine. RESULTS: At low levels of inadvertent tOPV use, the minimum time after the switch for the inadvertent use to cause a cVDPV2 outbreak decreases sharply with increasing proportions of children inadvertently receiving tOPV. The minimum times until inadvertent tOPV use in an SIA or in RI can lead to a cVDPV2 outbreak vary widely among populations, with higher basic reproduction numbers, lower tOPV-induced population immunity to serotype 2 poliovirus transmission prior to the switch, and a lower proportion of transmission occurring via the oropharyngeal route all resulting in shorter times. 
In populations with the lowest expected immunity to serotype 2 poliovirus transmission after the switch, inadvertent tOPV use in an SIA leads to a cVDPV2 outbreak if it occurs as soon as 9 months after the switch with 0.5 % of children aged 0-4 years inadvertently receiving tOPV, and as early as 6 months after the switch with 10-20 % of children aged 0-1 years inadvertently receiving tOPV. In the same populations, inadvertent tOPV use in RI leads to a cVDPV2 outbreak if 0.5 % of OPV RI doses use tOPV instead of bOPV for at least 20 months after the switch, with the minimum duration of use dropping to 9 months if inadvertent tOPV use occurs in 50 % of OPV RI doses. CONCLUSIONS: Efforts to ensure timely and complete tOPV withdrawal at all levels, particularly from locations storing large amounts of tOPV, will help minimize risks associated with the tOPV-bOPV switch. Under-vaccinated populations with poor hygiene become at risk of a cVDPV2 outbreak soonest after the tOPV-bOPV switch in the event of inadvertent tOPV use and therefore should represent priority areas for ensuring tOPV withdrawal from all OPV stocks. |
Implementation of coordinated global serotype 2 oral poliovirus vaccine cessation: risks of potential non-synchronous cessation
Duintjer Tebbens RJ , Hampton LM , Thompson KM . BMC Infect Dis 2016 16 (1) 231 BACKGROUND: The endgame for polio eradication involves coordinated global cessation of oral poliovirus vaccine (OPV) with cessation of serotype 2 OPV (OPV2 cessation) implemented in late April and early May 2016 and cessation of serotypes 1 and 3 OPV (OPV13 cessation) currently planned for after 2018. The logistics associated with globally switching all use of trivalent OPV (tOPV) to bivalent OPV (bOPV) represent a significant undertaking, which may cause some complications, including delays that lead to different timing of the switch across shared borders. METHODS: Building on an integrated global model for long-term poliovirus risk management, we consider the expected vulnerability of different populations to transmission of OPV2-related polioviruses as a function of time following the switch. We explore the relationship between the net reproduction number (Rn) of OPV2 at the time of the switch and the time until OPV2-related viruses imported from countries still using OPV2 can establish transmission. We also analyze some specific situations modeled after populations at high potential risk of circulating serotype 2 vaccine-derived poliovirus (cVDPV2) outbreaks in the event of a non-synchronous switch. RESULTS: Well-implemented tOPV immunization activities prior to the tOPV to bOPV switch (i.e., tOPV intensification sufficient to prevent the creation of indigenous cVDPV2 outbreaks) lead to sufficient population immunity to transmission to cause die-out of any imported OPV2-related viruses for over 6 months after the switch in all populations in the global model. Higher Rn of OPV2 at the time of the switch reduces the time until imported OPV2-related viruses can establish transmission and increases the time during which indigenous OPV2-related viruses circulate. 
Modeling specific connected populations suggests a relatively low vulnerability to importations of OPV2-related viruses that could establish transmission in the context of a non-synchronous switch from tOPV to bOPV, unless the gap between switch times becomes very long (>6 months) or a high risk of indigenous cVDPV2s already exists in the importing and/or the exporting population. CONCLUSIONS: Brief national discrepancies in the timing of the tOPV to bOPV switch will likely not significantly increase cVDPV2 risks, given the insurance provided by tOPV intensification efforts, and the coordination of national switches within the globally agreed April 17-May 1, 2016 time window minimized the risks associated with cross-border importations. |
Suicide in Illinois, 2005-2010: a reflection of patterns and risks by age groups and opportunities for targeted prevention
McLone SG , Loharikar A , Sheehan K , Mason M . J Trauma Acute Care Surg 2016 81 S30-5 BACKGROUND: Suicide accounts for two-thirds of all deaths from intentional, or violence-related, injury and is a leading cause of death in the United States. Patterns of suicide have been well described among high-risk groups, but few studies have compared the circumstances related to suicides across all age groups. We sought to understand the epidemiology of suicide cases in Illinois and to characterize the risks and patterns for suicide among different age groups. METHODS: We utilized suicide data collected in the Illinois Violent Death Reporting System (IVDRS) to assess demographics, method of suicide, circumstances, and mental health status among different age groups. RESULTS: Between 2005 and 2010, 3016 suicides were reported; 692 (23%) were female, and the median age (n=3013) was 45 years (range, 10-98 years). The most common method/weapon types were hanging/strangulation (33%), firearm (32%), and poisoning (21%). Hanging was more common (74%) among young people aged 10-19 years, while firearm use was more common among elderly persons aged 65 years and over (55%). The percentage of victims within an age group experiencing a crisis within two weeks prior to suicide was highest among 10- to 14-year-olds, while the risk factor of having a family member or friend die in the past five years was highest among older victims. CONCLUSIONS: The final analysis demonstrated age-related trends in suicide in Illinois, suggesting prevention programs should tailor services by age. LEVEL OF EVIDENCE: Level IV, epidemiological study. |
Nonfatal playground-related traumatic brain injuries among children, 2001-2013
Cheng TA , Bell JM , Haileyesus T , Gilchrist J , Sugerman DE , Coronado VG . Pediatrics 2016 137 (6) OBJECTIVE: To describe the circumstances, characteristics, and trends of emergency department (ED) visits for nonfatal, playground-related traumatic brain injury (TBI) among persons aged ≤14 years. METHODS: The National Electronic Injury Surveillance System-All Injury Program from January 1, 2001, through December 31, 2013, was examined. US Census bridged-race population estimates were used as the denominator to compute rates per 100 000 population. SAS and Joinpoint linear weighted regression analyses were used to identify the best-fitting joinpoints and the annual modeled rate change. These models were used to indicate the magnitude and direction of rate trends for each segment or period. RESULTS: During the study period, an annual average of 21 101 persons aged ≤14 years were treated in EDs for playground-related TBI. The ED visit rate was 39.7 per 100 000 for boys and 53.5 per 100 000 for persons aged 5-9 years. Overall, 95.6% were treated and released, 33.5% of injuries occurred at places of recreation or sports, and 32.5% occurred at school. Monkey bars or playground gyms (28.3%) and swings (28.1%) were the equipment most frequently associated with TBI, but equipment involvement varied by age group. The annual rate of TBI ED visits increased significantly from 2005 to 2013 (P < .05). CONCLUSIONS: Playgrounds remain an important location of injury risk to children. Strategies to reduce the incidence and severity of playground-related TBIs are needed. These may include improved adult supervision, methods to reduce child risk behavior, regular equipment maintenance, and improvements in playground surfaces and environments. |
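As the Methods note, visit rates were computed by dividing annual ED visit counts by US Census bridged-race population estimates. A minimal sketch of that arithmetic (the population denominator below is hypothetical, chosen only to illustrate the calculation; it is not the study's actual denominator):

```python
def rate_per_100k(visits: float, population: float) -> float:
    """Crude rate per 100,000 population: count / denominator x 100,000."""
    return visits / population * 100_000

# 21,101 annual playground-related TBI visits (from the abstract) against a
# hypothetical denominator of 61 million children aged <=14 years
print(round(rate_per_100k(21_101, 61_000_000), 1))  # -> 34.6
```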
Point of health care entry for youth with concussion within a large pediatric care network
Arbogast KB , Curry AE , Pfeiffer MR , Zonfrillo MR , Haarbauer-Krupa J , Breiding MJ , Coronado VG , Master CL . JAMA Pediatr 2016 170 (7) e160294 Importance: Previous epidemiologic research on concussions has primarily been limited to patient populations presenting to sport concussion clinics or to emergency departments (EDs) and to those high school age or older. By examining concussion visits across an entire pediatric health care network, a better estimate of the scope of the problem can be obtained. Objective: To comprehensively describe point of entry for children with concussion, overall and by relevant factors including age, sex, race/ethnicity, and payor, to quantify where children initially seek care for this injury. Design, Setting, and Participants: In this descriptive epidemiologic study, data were collected from primary care, specialty care, ED, urgent care, and inpatient settings. The initial concussion-related visit was selected and variation in the initial health care location (primary care, specialty care, ED, or hospital) was examined in relation to relevant variables. All patients aged 0 to 17 years who received their primary care from The Children's Hospital of Philadelphia's (CHOP) network and had 1 or more in-person clinical visits for concussion in the CHOP unified electronic health record (EHR) system (July 1, 2010, to June 30, 2014) were selected. Main Outcomes and Measures: Frequency of initial concussion visits at each type of health care location. Concussion visits in the EHR were defined based on International Classification of Diseases, Ninth Revision, Clinical Modification diagnosis codes indicative of concussion. Results: A total of 8083 patients were included (median age, 13 years; interquartile range, 10-15 years). Overall, 81.9% (95% CI, 81.1%-82.8%; n = 6624) had their first visit at CHOP within primary care, 5.2% (95% CI, 4.7%-5.7%; n = 418) within specialty care, and 11.7% (95% CI, 11.0%-12.4%; n = 947) within the ED. 
Health care entry varied by age: 52% (191/368) of children aged 0 to 4 years entered CHOP via the ED, whereas more than three-quarters of those aged 5 to 17 years entered via primary care (5-11 years: 1995/2492; 12-14 years: 2415/2820; and 15-17 years: 2056/2403). Insurance status also influenced the pattern of health care use, with more Medicaid patients using the ED for concussion care (478/1290 Medicaid patients [37%] used the ED vs 435/6652 private patients [7%] and 34/141 self-pay patients [24%]). Conclusions and Relevance: The findings suggest estimates of concussion incidence based solely on ED visits underestimate the burden of injury, highlight the importance of the primary care setting in concussion care management, and demonstrate the potential for EHR systems to advance research in this area. |
Characteristics of youth with combined histories of violent behavior, suicidal ideation or behavior, and gun-carrying
Logan JE , Vagi KJ , Gorman-Smith D . Crisis 2016 37 (6) 1-13 BACKGROUND: Youth reporting combined histories of nonfatal violence, suicidal ideation/behavior, and gun-carrying (VSG) are at risk for perpetrating fatal interpersonal violence and self-harm. AIMS: We characterized these youth to inform prevention efforts. METHOD: We analyzed 2004 data from 3,931 seventh-, ninth-, and 11th-12th-grade youth and compared VSG youth (n = 66) with non-gun-carrying youth who either had no histories of violence or suicidal thoughts/behavior (n = 1,839), histories of violence (n = 884), histories of suicidal thoughts/behaviors (n = 552), or both (n = 590). We compared groups based on demographic factors, risk factors (i.e., friends who engage in delinquency, peer-violence victimization, depressive symptoms, illicit substance use), and protective factors (i.e., school connectedness, parental care and supervision). Regression models identified factors associated with VSG youth. RESULTS: Illicit substance use and having friends who engage in delinquency were more common among VSG youth in all comparisons; almost all VSG youth had high levels of these factors. Depressive symptoms were positively associated with VSG youth versus youth without either violent or suicide-related histories and youth with violent histories alone. School connectedness and parental supervision were negatively associated with VSG youth in most comparisons. CONCLUSION: Family-focused and school-based interventions that increase connectedness while reducing delinquency and substance use might prevent these violent tendencies. |
Childhood sexual violence against boys: a study in 3 countries
Sumner SA , Mercy JA , Buluma R , Mwangi MW , Marcelin LH , Kheam T , Lea V , Brookmeyer K , Kress H , Hillis SD . Pediatrics 2016 137 (5) BACKGROUND AND OBJECTIVE: Globally, little evidence exists on sexual violence against boys. We sought to produce the first internationally comparable estimates of the magnitude, characteristics, risk factors, and consequences of sexual violence against boys in 3 diverse countries. METHODS: We conducted nationally representative, multistage cluster Violence Against Children Surveys in Haiti, Kenya, and Cambodia among males aged 13 to 24 years. Differences between countries for boys experiencing sexual violence (including sexual touching, attempted sex, and forced/coerced sex) before age 18 years were examined by using chi-square and logistic regression analyses. RESULTS: In Haiti, Kenya, and Cambodia, respectively, 1459, 1456, and 1255 males completed surveys. The prevalence of experiencing any form of sexual violence ranged from 23.1% (95% confidence interval [CI]: 20.0-26.2) in Haiti to 14.8% (95% CI: 12.0-17.7) in Kenya and 5.6% (95% CI: 4.0-7.2) in Cambodia. The largest share of perpetrators in Haiti, Kenya, and Cambodia, respectively, were friends/neighbors (64.7%), romantic partners (37.2%), and relatives (37.0%). Most episodes occurred inside perpetrators' or victims' homes in Haiti (60.4%), contrasted with outside the home in Kenya (65.3%) and Cambodia (52.1%). The most common time period for violence in Haiti, Kenya, and Cambodia was the afternoon (55.0%), evening (41.3%), and morning (38.2%), respectively. Adverse health effects associated with violence were common, including increased odds of transactional sex, alcohol abuse, sexually transmitted infections, anxiety/depression, suicidal ideation/attempts, and violent gender attitudes. CONCLUSIONS: Differences were noted between countries in the prevalence, characteristics, and risk factors of sexual violence, yet associations with adverse health effects were pervasive. 
Prevention strategies tailored to individual locales are needed. |
PCR-based serotyping of Streptococcus pneumoniae from culture-negative specimens: novel primers for detection of serotypes within serogroup 18
Tanmoy AM , Saha S , Darmstadt GL , Whitney CG , Saha SK . J Clin Microbiol 2016 Six multiplex-compatible PCR primers were designed to distinguish Streptococcus pneumoniae serotypes within serogroup 18 from culture-negative pneumococcal specimens, with no cross-reactivity with other serotypes or respiratory organisms. These primers will help generate better data on nonvaccine serotypes in invasive and carriage pneumococcal surveillance and contribute to future vaccine formulation and impact studies. |
Sampling and analytical method for alpha-dicarbonyl flavoring compounds via derivatization with o-phenylenediamine and analysis using GC-NPD
Pendergrass SM , Cooper JA . Scientifica (Cairo) 2016 2016 9059678 A novel methodology is described for the sampling and analysis of diacetyl, 2,3-pentanedione, 2,3-hexanedione, and 2,3-heptanedione. These analytes were collected on o-phenylenediamine-treated silica gel tubes and quantitatively recovered as the corresponding quinoxaline derivatives. After derivatization, the sorbent was desorbed in 3 mL of ethanol solvent and analyzed using gas chromatography/nitrogen-phosphorus detection (GC/NPD). The limits of detection (LOD) achieved for each analyte were determined to be in the range of 5-10 nanograms/sample. Evaluation of the on-tube derivatization procedure indicated that it is unaffected by humidities ranging from 20% to 80% and that the derivatization procedure was quantitative for analyte concentrations ranging from 0.1 μg to approximately 500 μg per sample. Storage stability studies indicated that the derivatives were stable for 30 days when stored at both ambient and refrigerated temperatures. Additional studies showed that the quinoxaline derivatives were quantitatively recovered when sampling up to a total volume of 72 L at a sampling rate of 50 cc/min. This method will be important to evaluate and monitor worker exposures in the food and flavoring industry. Samples can be collected over an 8-hour shift with up to 288 L total volume collected regardless of time, sampling rate, and/or the effects of humidity. |
Note: a portable laser-induced breakdown spectroscopy instrument for rapid sampling and analysis of silicon-containing aerosols
McLaughlin RP , Mason GS , Miller AL , Stipe CB , Kearns JD , Prier MW , Rarick JD . Rev Sci Instrum 2016 87 (5) 056103 A portable instrument has been developed for measuring silicon-containing aerosols in near real-time using laser-induced breakdown spectroscopy (LIBS). The instrument uses a vacuum system to collect and deposit airborne particulate matter onto a translatable reel of filter tape. LIBS is used to analyze the deposited material, determining the amount of silicon-containing compounds present. In laboratory testing with pure silica (SiO2), the correlation between LIBS intensity for a characteristic silicon emission and the concentration of silica in a model aerosol was determined for a range of concentrations, demonstrating the instrument's plausibility for identifying hazardous levels of silicon-containing compounds. |
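The calibration step described above amounts to fitting a line relating LIBS emission intensity to known silica concentration, then inverting it for unknown samples. A minimal sketch, with made-up numbers (the instrument's actual calibration data, units, and model are not given in the abstract):

```python
# Hypothetical sketch: ordinary least-squares calibration of LIBS emission
# intensity (arbitrary units) against known silica concentrations.
def fit_line(xs, ys):
    """Least-squares slope and intercept for y = slope*x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

# Calibration standards: silica concentration (illustrative) vs. line intensity
conc = [0.0, 50.0, 100.0, 200.0, 400.0]
intensity = [0.1, 5.2, 10.1, 19.8, 40.3]
slope, intercept = fit_line(conc, intensity)

def concentration_from_intensity(i):
    """Invert the calibration line to estimate silica concentration."""
    return (i - intercept) / slope
```

An unknown aerosol's deposited material would then be assigned a concentration by applying `concentration_from_intensity` to its measured silicon-line intensity.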
Effect of Vandetanib on Andes virus survival in the hamster model of Hantavirus pulmonary syndrome
Bird BH , Shrivastava-Ranjan P , Dodd KA , Erickson BR , Spiropoulou CF . Antiviral Res 2016 132 66-69 Hantavirus pulmonary syndrome (HPS) is a severe disease caused by hantavirus infection of pulmonary microvascular endothelial cells leading to microvascular leakage, pulmonary edema, pleural effusion, and high case fatality. Previously, we demonstrated that Andes virus (ANDV) infection caused up-regulation of vascular endothelial growth factor (VEGF) and concomitant downregulation of the cellular adhesion molecule VE-cadherin, leading to increased permeability. Analyses of human HPS-patient sera have further demonstrated increased circulating levels of VEGF. Here we investigate the impact of a small-molecule antagonist of VEGF receptor 2 (VEGFR-2) on receptor activation in vitro and on overall survival in the Syrian hamster model of HPS. |
Evaluation of alternative killing agents for Aedes aegypti (Diptera: Culicidae) in the Gravid Aedes Trap (GAT)
Heringer L , Johnson BJ , Fikrig K , Oliveira BA , Silva RD , Townsend M , Barrera R , Eiras AE , Ritchie SA . J Med Entomol 2016 53 (4) 873-879 The Gravid Aedes Trap (GAT) uses visual and olfactory cues to attract gravid Aedes aegypti (L.) that are then captured when knocked down by a residual pyrethroid surface spray. However, the use of surface sprays can be compromised by poor availability of the spray and pesticide resistance in the target mosquito. We investigated several "alternative" insecticide and insecticide-free killing agents for use in the GAT. These included long-lasting insecticide-impregnated nets (LLINs), vapor-active synthetic pyrethroids (metofluthrin), canola oil, and two types of dry adhesive sticky card. During bench top assays LLINs, metofluthrin, and dry sticky cards had 24-h knockdown (KD) percentages >80% (91.2 +/- 7.2%, 84.2 +/- 6.8%, and 83.4 +/- 6.1%, respectively), whereas the 24-h KD for canola oil was 70 +/- 7.7%, which improved to 90.0 +/- 3.7% over 48 h. Importantly, there were no significant differences in the number of Ae. aegypti collected per week or the number of traps positive for Ae. aegypti between the sticky card and canola oil treatments compared with the surface spray and LLIN treatments in semifield and field trials. These results demonstrate that inexpensive and widely available insecticide-free agents such as those described in this study are effective alternatives to pyrethroids in regions with insecticide-resistant populations. The use of such environmentally friendly insecticide-free alternatives will also be attractive in areas where there is substantial resistance to insecticide use due to environmental and public health concerns. |
International circumpolar surveillance interlaboratory quality control program for emm typing of Streptococcus pyogenes, 2011-2015
Rudolph K , Martin I , Demczuk W , Kakulphimp J , Bruden D , Zulz T , Bruce M . Diagn Microbiol Infect Dis 2016 85 (4) 398-400 In 2011, an interlaboratory quality control (QC) program for emm typing of group A streptococci (GAS) was incorporated into existing international circumpolar surveillance QC programs. From 2011 to 2015, 35 GAS isolates were distributed to three laboratories; emm type-level concordance was 100%, while the overall subtype-level concordance was 83%. |
Acute infections, cost and time to reporting of HIV test results in three U.S. State Public Health Laboratories
Nasrullah M , Wesolowski LG , Ethridge SF , Cranston K , Pentella M , Myers RA , Rudrik JT , Hutchinson AB , Bennett SB , Werner BG . J Infect 2016 73 (2) 164-72 OBJECTIVE: In three U.S. State Public Health Laboratories (PHLs) using a fourth-generation immunoassay (IA), an HIV-1/HIV-2 differentiation antibody IA and a nucleic acid test (NAT), we characterized the yield and time to reporting of acute infections, and cost per positive specimen. METHODS: Routine HIV testing data were collected from July 1, 2012-June 30, 2013 for Massachusetts and Maryland PHLs, and from November 27, 2012-June 30, 2013 for Michigan PHL. Massachusetts and Michigan used fourth-generation and differentiation IAs with NAT conducted by a referral laboratory. In Maryland, fourth-generation IA repeatedly reactive specimens were followed by a Western blot (WB), and those with negative or indeterminate results were tested with a differentiation IA and HIV-1 NAT, and if positive by NAT, confirmed by a different HIV-1 NAT. Specimens from WB-positive persons at risk for HIV-2 were tested with a differentiation IA and, if positive, with an HIV-2 WB and/or differential HIV-1/HIV-2 proviral DNA polymerase chain reaction. RESULTS: Among 7914 specimens from Massachusetts PHL, 6069 from Michigan PHL, and 36,266 from Maryland PHL, acute infections were identified in 0.10%, 0.02%, and 0.05% of specimens, respectively. Massachusetts and Maryland PHLs each had 1 HIV-2 positive specimen. The median time from specimen receipt to laboratory reporting of results for acute infections at Massachusetts, Michigan, and Maryland PHLs was 8, 11, and 7 days, respectively. The laboratory cost per HIV positive specimen was $336 (Massachusetts), $263 (Michigan), and $210 (Maryland). CONCLUSIONS: Acute and established infections were found by PHLs using fourth-generation IA in conjunction with antibody tests and NAT. 
Time to reporting of acute HIV test results to clients was suboptimal, and needs to be streamlined to expedite treatment and interrupt transmission. |
Analysis of CHIKV in mosquitoes infected via artificial blood meal
Ledermann JP , Powers AM . Methods Mol Biol 2016 1426 129-42 Having a mechanism to assess the transmission dynamics of a vector-borne virus is one critical component of understanding the life cycle of these viruses. Laboratory infection systems using artificial blood meals are one valuable approach for monitoring the progress of virus in its mosquito host and evaluating potential points for interruption of the cycle for control purposes. Here, we describe an artificial blood meal system with Chikungunya virus (CHIKV) and the processing of mosquito tissues and saliva to understand the movement and time course of virus infection in the invertebrate host. |
Comparison of cell counting methods in rodent pulmonary toxicity studies: automated and manual protocols and considerations for experimental design
Zeidler-Erdely PC , Antonini JM , Meighan TG , Young SH , Eye TJ , Hammer MA , Erdely A . Inhal Toxicol 2016 28 (9) 1-11 Pulmonary toxicity studies often use bronchoalveolar lavage (BAL) to investigate potential adverse lung responses to a particulate exposure. The BAL cellular fraction is counted using automated (i.e., Coulter Counter®), flow cytometry, or manual (i.e., hemocytometer) methods to determine inflammatory cell influx. The goal of the study was to compare the different counting methods to determine which is optimal for examining BAL cell influx after exposure by inhalation or intratracheal instillation (ITI) to different particles with varying inherent pulmonary toxicities in both rat and mouse models. General findings indicate that total BAL cell counts using the automated and manual methods tended to agree after inhalation or ITI exposure to particle samples that are relatively nontoxic or at later time points after exposure to a pneumotoxic particle when the response resolves. However, when the initial lung inflammation and cytotoxicity were high after exposure to a pneumotoxic particle, significant differences were observed when comparing cell counts from the automated, flow cytometry, and manual methods. When using total BAL cell count for differential calculations from the automated method, depending on the cell diameter size range cutoff, the data suggest that the number of lung polymorphonuclear leukocytes (PMN) varies. Importantly, the automated counts, regardless of the size cutoff, still indicated a greater number of total lung PMN when compared with the manual method, which agreed more closely with flow cytometry. The results suggest that either the manual method or flow cytometry would be better suited for BAL studies where cytotoxicity is an unknown variable. |
Developing a reference system for the IFCC standardization of HbA2
Paleari R , Caruso D , Kaiser P , Arsene CG , Schaeffer-Reiss C , Van Dorsselaer A , Bisse E , Ospina M , De Jesus VR , Wild B , Mosca A . Clin Chim Acta 2016 467 21-26 The importance of hemoglobin A2 (HbA2) as an indicator of the presence of beta-thalassemia was established many years ago. However, clinical application of recommended HbA2 cutoff values is often hampered due to poor equivalence of HbA2 results among methods and laboratories. Thus, the IFCC Standardization program for HbA2 was initiated in 2004 with the goal of achieving a complete reference system for this measurand. HbA2 standardization efforts are still in progress, including the development of a higher-order HbA2 reference measurement procedure and the preparation of a certified reference material in collaboration with the IRMM. Here, we review the past, present, and future of HbA2 standardization and describe the current status of HbA2 testing. |
Lessons learned from newborn screening for critical congenital heart defects
Oster ME , Aucott SW , Glidewell J , Hackell J , Kochilas L , Martin GR , Phillippi J , Pinto NM , Saarinen A , Sontag M , Kemper AR . Pediatrics 2016 137 (5) Newborn screening for critical congenital heart defects (CCHD) was added to the US Recommended Uniform Screening Panel in 2011. Within 4 years, 46 states and the District of Columbia had adopted it into their newborn screening program, leading to CCHD screening being nearly universal in the United States. This rapid adoption occurred while there were still questions about the effectiveness of the recommended screening protocol and barriers to follow-up for infants with a positive screen. In response, the Centers for Disease Control and Prevention partnered with the American Academy of Pediatrics to convene an expert panel between January and September 2015 representing a broad array of primary care, neonatology, pediatric cardiology, nursing, midwifery, public health, and advocacy communities. The panel's goal was to review current practices in newborn screening for CCHD and to identify opportunities for improvement. In this article, we describe the experience of CCHD screening in the United States with regard to: (1) identifying the target lesions for CCHD screening; (2) optimizing the algorithm for screening; (3) determining state-level challenges to implementation and surveillance of CCHD; (4) educating all stakeholders; (5) performing screening using the proper equipment and in a cost-effective manner; and (6) implementing screening in special settings such as the NICU, out-of-hospital settings, and areas of high altitude. |
Iron, anemia, and iron deficiency anemia among young children in the United States
Gupta PM , Perrine CG , Mei Z , Scanlon KS . Nutrients 2016 8 (6) Iron deficiency and anemia are associated with impaired neurocognitive development and immune function in young children. Total body iron, calculated from serum ferritin and soluble transferrin receptor concentrations, and hemoglobin allow for monitoring of the iron and anemia status of children in the United States. The purpose of this analysis is to describe the prevalence of iron deficiency (ID), anemia, and iron deficiency anemia (IDA) among children aged 1-5 years using data from the 2007-2010 National Health and Nutrition Examination Survey (NHANES). The prevalence of ID, anemia, and IDA among children aged 1-5 years was 7.1% (95% CI: 5.5, 8.7), 3.2% (95% CI: 2.0, 4.3), and 1.1% (95% CI: 0.6, 1.7), respectively. The prevalences of both ID and anemia were higher among children aged 1-2 years (p < 0.05). In addition, 50% of anemic children aged 1-2 years were iron deficient. This analysis provides an update on the prevalence of ID, anemia, and IDA for a representative sample of US children. Our results suggest little change in these indicators over the past decade. Monitoring of ID and anemia is critical, and prevention of ID in early childhood should remain a public health priority. |
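The "total body iron" calculation mentioned in the abstract is conventionally done with Cook's body-iron formula in NHANES-based analyses. The sketch below assumes that formula (with sTfR in mg/L converted to μg/L, and ferritin in μg/L) and the conventional cutoff of total body iron < 0 mg/kg for iron deficiency; the abstract itself does not spell out these details:

```python
import math

def total_body_iron_mg_per_kg(stfr_mg_l, ferritin_ug_l):
    """Cook's body-iron index from the sTfR (ug/L) to ferritin (ug/L) ratio.

    sTfR is supplied in mg/L and converted to ug/L. Negative values
    indicate a tissue iron deficit (iron deficiency).
    """
    ratio = (stfr_mg_l * 1000.0) / ferritin_ug_l
    return -(math.log10(ratio) - 2.8229) / 0.1207

def is_iron_deficient(stfr_mg_l, ferritin_ug_l):
    # ID conventionally defined as total body iron < 0 mg/kg
    return total_body_iron_mg_per_kg(stfr_mg_l, ferritin_ug_l) < 0
```

For example, a child with sTfR of 8 mg/L and ferritin of 5 μg/L would have a negative body-iron value and be classified as iron deficient, whereas sTfR of 5 mg/L with ferritin of 30 μg/L would not.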
Down syndrome: changing cardiac phenotype?
Riehle-Colarusso T , Oster ME . Pediatrics 2016 138 (1) Down syndrome (DS) is the most common chromosomal abnormality, affecting 1 in 700 infants born yearly in the United States.1 The birth prevalence of DS varies internationally among populations, likely due to variations in maternal age, race/ethnicity, use of prenatal screening, and terminations of affected pregnancies.2, 3 Approximately half of all infants born with DS also have a congenital heart defect (CHD), the most common type being atrioventricular septal defect (AVSD), with rates from 30% to 60%.4 In the era of increasing maternal age and pregnancy terminations, 2 it is interesting to investigate if and how the pattern of associated CHDs among infants with DS has changed. | In this issue of Pediatrics, Bergstrom et al5 describe the pattern of CHDs among Swedish liveborn infants with DS from 1992 to 2012. In this nationwide, population-based cohort, over half had a CHD; the 3 most common being AVSD, ventricular septal defect, and atrial septal defect. Although the overall rate and risk for CHDs remained constant, differences were noted among CHD phenotypes over time. When adjusting for several factors, the risk of complex CHDs decreased by 40% from birth period 1992–1994 to birth period 2010–2012, with a concurrent rise in less severe CHDs. Among livebirths with DS and CHDs, the rate of AVSD decreased from 46% to 30%, whereas the rate of ventricular septal defect doubled from 14% to 31%. The authors hypothesize this phenotypic shift could be the result of improved prenatal detection, especially among older mothers, leading to increased pregnancy termination.5 |
Shiftwork and decline in endothelial function among police officers
Charles LE , Zhao S , Fekedulegn D , Violanti JM , Andrew ME , Burchfiel CM . Am J Ind Med 2016 59 (11) 1001-1008 BACKGROUND: Our objective was to assess the influence of shiftwork on change in endothelial function. METHODS: This longitudinal study was conducted in 188 police officers (78.2% men) in Buffalo, NY. Shiftwork status (day, afternoon, night) was assessed objectively using daily payroll work history records. Brachial artery flow-mediated dilation (FMD) was assessed using ultrasound. Mean change in FMD% between 2004-2009 and 2010-2015 was compared across shiftwork categories using analysis of variance/covariance. RESULTS: Overall, mean FMD% decreased from 5.74 +/- 2.83 to 3.88 +/- 2.11 over an average of 7 years among all officers (P < 0.0001). Effect modification by gender was significant. Among men (but not women), those who worked day shifts had a smaller mean (+/-SE) decrease in FMD% (-0.89 +/- 0.35) compared with those who worked the afternoon (-2.69 +/- 0.39; P = 0.001) or night shifts (-2.31 +/- 0.45; P = 0.020) after risk factor adjustment. CONCLUSIONS: Larger declines in endothelial function were observed among men who worked afternoon or night shifts. Further investigation is warranted. |
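The comparison of mean FMD% change across the three shift groups rests on a one-way analysis of variance. A minimal stdlib-only sketch of the unadjusted F statistic (the study's covariate-adjusted analysis is more involved); the group values below are hypothetical, not the study's data:

```python
from statistics import mean

def one_way_anova_f(groups):
    """Unadjusted one-way ANOVA F statistic across k groups.

    F = (between-group mean square) / (within-group mean square);
    large F suggests group means differ more than chance would allow.
    """
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical change-in-FMD% values per officer, by shift
day       = [-0.5, -1.2, -0.9, -1.0]
afternoon = [-2.4, -3.0, -2.7, -2.6]
night     = [-2.0, -2.5, -2.3, -2.4]
print(one_way_anova_f([day, afternoon, night]))  # large F: shift groups differ
```

The p-value would then come from the F distribution with (k-1, n-k) degrees of freedom; the analysis of covariance used in the study additionally adjusts the group means for risk factors before this comparison.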
Why do fire ground duties trigger sudden cardiac events in firefighters?
Hales T . Exerc Sport Sci Rev 2016 44 (3) 89 Fire suppression involves physically demanding work in hot, dangerous environments with heavy encapsulating protective gear while being exposed to toxic chemicals and particulate matter in fire smoke. Thus, it is not surprising that firefighters have high rates of injuries and illness. Approximately 85–100 firefighters die each year on duty, with approximately 35–45 of these deaths caused by sudden cardiac events. But these on-duty sudden cardiac events do not occur randomly. Rather, they occur in a much higher proportion on the fire ground when firefighters are performing fire suppression operations (2). For more than two decades, Smith et al. (4) have meticulously documented the physiologic effects of fire suppression on the cardiovascular system. In the current issue of the Journal, Smith et al. (4) summarize this research and propose flow diagrams showing how these physiologic changes, in combination with other risk factors, could trigger a sudden cardiac event in susceptible firefighters. This commentary reviews the association between firefighting and cardiovascular disease and suggests additional research to direct prevention efforts. | A number of mortality studies have examined the relation between cardiovascular disease and firefighting, yet few have found elevated standardized mortality ratios. This lack of an association is probably caused by the healthy worker effect, an inherent bias of occupational cohort mortality studies (1). Mortality studies that examined cardiovascular disease risk by duration of employment found markedly lower standardized mortality ratios at the beginning of a firefighter's career that catch up with, or even surpass, the risk of the general population by the end of the career. This rising standardized mortality ratio with increasing duration of employment suggests occupational involvement. |
Workplace psychosocial and organizational factors for neck pain in workers in the United States
Yang H , Hitchcock E , Haldeman S , Swanson N , Lu ML , Choi B , Nakata A , Baker D . Am J Ind Med 2016 59 (7) 549-60 BACKGROUND: Neck pain is a prevalent musculoskeletal condition among workers in the United States. This study explores a set of workplace psychosocial and organization-related factors for neck pain. METHODS: Data used for this study come from the 2010 National Health Interview Survey which provides a representative sample of the US population. To account for the complex sampling design, the Taylor linearized variance estimation method was used. Logistic regression models were constructed to measure the associations. RESULTS: This study demonstrated significant associations between neck pain and a set of workplace risk factors, including work-family imbalance, exposure to a hostile work environment and job insecurity, non-standard work arrangements, multiple jobs, and long work hours. CONCLUSION: Workers with neck pain may benefit from intervention programs that address issues related to these workplace risk factors. Future studies exploring both psychosocial risk factors and physical risk factors with a longitudinal design will be important. |
NIOSH Field Studies Team Assessment: Worker exposure to aerosolized metal oxide nanoparticles in a semiconductor fabrication facility
Brenner SA , Neu-Baker NM , Eastlake AC , Beaucham CC , Geraci CL . J Occup Environ Hyg 2016 13 (11) 1-31 The ubiquitous use of engineered nanomaterials - particulate materials measuring approximately 1-100 nanometers (nm) on their smallest axis, intentionally engineered to express novel properties - in semiconductor fabrication poses unique issues for protecting worker health and safety. Use of new substances, or of substances in a new form, may present hazards that have yet to be characterized for their acute or chronic health effects. Uncharacterized or emerging occupational health hazards may exist when there is insufficient validated hazard data available to make a decision on potential hazard and risk to exposed workers under conditions of use. To advance the knowledge of potential worker exposure to engineered nanomaterials, the National Institute for Occupational Safety and Health Nanotechnology Field Studies Team conducted an on-site field evaluation in collaboration with on-site researchers at a semiconductor research and development facility on April 18-21, 2011. The Nanomaterial Exposure Assessment Technique (2.0) was used to perform a complete exposure assessment. A combination of filter-based sampling and direct-reading instruments was used to identify, characterize, and quantify the potential for worker inhalation exposure to airborne alumina and amorphous silica nanoparticles associated with the chemical mechanical planarization wafer polishing process. Engineering controls and work practices were evaluated to characterize tasks that might contribute to potential exposures and to assess existing engineering controls. Metal oxide structures were identified in all sampling areas, as individual nanoparticles and agglomerates ranging in size from 60 nm to >1,000 nm, with varying structure morphology, from long and narrow to compact. Filter-based samples indicated very little aerosolized material in task areas or worker breathing zones. Direct-reading instrument data indicated increased particle counts relative to background in the wastewater treatment area; however, particle counts were very low overall, indicating a well-controlled working environment. Recommendations for employees handling or potentially exposed to engineered nanomaterials include hazard communication, standard operating procedures, conservative ventilation systems, prevention through design in locations where engineered nanomaterials are used or stored, and routine air sampling for occupational exposure assessment and analysis. |
Gaining and sustaining schistosomiasis control: study protocol and baseline data prior to different treatment strategies in five African countries
Ezeamama AE , He CL , Shen Y , Yin XP , Binder SC , Campbell CH Jr , Rathbun S , Whalen CC , N'Goran EK , Utzinger J , Olsen A , Magnussen P , Kinung'hi S , Fenwick A , Phillips A , Ferro J , Karanja DM , Mwinzi PN , Montgomery S , Secor WE , Hamidou A , Garba A , King CH , Colley DG . BMC Infect Dis 2016 16 (1) 229 BACKGROUND: The Schistosomiasis Consortium for Operational Research and Evaluation (SCORE) was established in 2008 to answer strategic questions about schistosomiasis control. For programme managers, a high-priority question is: what are the most cost-effective strategies for delivering preventive chemotherapy (PCT) with praziquantel (PZQ)? This paper describes the process SCORE used to transform this question into a harmonized research protocol, the study design for answering this question, the village eligibility assessments and data resulting from the first year of the study. METHODS: Beginning in 2009, SCORE held a series of meetings to specify empirical questions and design studies related to different schedules of PCT for schistosomiasis control in communities with high (gaining control studies) and moderate (sustaining control studies) prevalence of Schistosoma infection among school-aged children. Seven studies are currently being implemented in five African countries. During the first year, villages were screened for eligibility, and data were collected on prevalence and intensity of infection prior to randomisation and the implementation of different schemes of PZQ intervention strategies. RESULTS: These studies of different treatment schedules with PZQ will provide the most comprehensive data thus far on the optimal frequency and continuity of PCT for schistosomiasis infection and morbidity control. CONCLUSIONS: We expect that the study outcomes will provide data for decision-making for country programme managers and a rich resource of information to the schistosomiasis research community. 
TRIAL REGISTRATION: The trials are registered with the International Standard Randomised Controlled Trial Number (ISRCTN) registry (identifiers: ISRCTN99401114, ISRCTN14849830, ISRCTN16755535, ISRCTN14117624, ISRCTN95819193, and ISRCTN32045736). |
Safety and effectiveness data for emergency contraceptive pills among women with obesity: a systematic review
Jatlaoui TC , Curtis KM . Contraception 2016 94 (6) 605-611 OBJECTIVE: To determine whether emergency contraceptive pills (ECPs) are less safe and effective for women with obesity compared with those without obesity. STUDY DESIGN: We searched PubMed for articles through November 2015 regarding the safety and effectiveness of ECPs (ulipristal acetate [UPA], levonorgestrel [LNG], and combined estrogen and progestin) among obese users. We assessed study quality using the United States Preventive Services Task Force evidence grading system. RESULTS: We identified four pooled secondary analyses (quality poor to fair), two of which examined UPA and three of which examined LNG formulations. Three analyses pooled overlapping data from a total of three primary studies and demonstrated significant associations between obesity and risk of pregnancy after ECP use. One analysis reported a four-fold increased risk of pregnancy among women with obesity (BMI ≥30 kg/m2) compared with women in the normal-weight/underweight categories (BMI <25 kg/m2) after use of LNG ECPs [odds ratio (OR) 4.4; 95% confidence interval (CI) 2.0-9.4]. Further analysis of the same LNG data found that at an approximate weight of 80 kg, the rate of pregnancy rose above 6%, which is the estimated pregnancy probability without contraception; at weights less than 75 kg, the rate of pregnancy was less than 2%. Two analyses examining UPA suggested an approximate two-fold increased risk of pregnancy among women with obesity compared with either normal-weight/underweight women or non-obese (BMI <30 kg/m2) women (OR 2.6; 95% CI 0.9-7.0 and OR 2.1; 95% CI 1.0-4.3, respectively), but confidence intervals were wide. Finally, the fourth secondary analysis pooled data from three separate randomized controlled trials on LNG ECPs, found no increase in pregnancy risk with increasing weight or BMI, and found no consistent association between pregnancy and either factor when adjusted for other covariates. CONCLUSION: While data are limited and of poor to fair quality, findings suggest that women with obesity experience an increased risk of pregnancy after use of LNG ECPs compared with normal-weight/underweight women. Women with obesity may also experience an increased risk of pregnancy compared with women without obesity after use of UPA ECPs, though differences did not reach statistical significance. Providers should counsel all women at risk for unintended pregnancy, including those with obesity, about the effectiveness of the full range of emergency contraception options, so that they understand their options, can receive advance supplies of emergency contraception as needed, and know how to access an emergency copper IUD if desired. |
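The pooled estimates quoted in this entry (e.g., OR 4.4; 95% CI 2.0-9.4) are odds ratios with confidence intervals. A minimal sketch of how an unadjusted odds ratio and its Wald 95% CI are computed from a 2x2 table; the counts below are purely hypothetical, not data from the review:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table.

    a = exposed with outcome, b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome.
    The CI is built on the log-odds scale, then exponentiated.
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: pregnancy vs. no pregnancy after ECP use,
# among women with obesity (exposed) vs. without (unexposed)
print(odds_ratio_ci(11, 89, 10, 390))
```

When the lower bound of the interval stays above 1.0 the association is conventionally called statistically significant, which is why the UPA estimate with CI 0.9-7.0 in the abstract is described as not reaching significance.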
Maternal smoking among women with and without use of assisted reproductive technologies
Tong VT , Kissin DM , Bernson D , Copeland G , Boulet SL , Zhang Y , Jamieson DJ , England LJ . J Womens Health (Larchmt) 2016 25 (10) 1066-1072 OBJECTIVE: To estimate smoking prevalence during the year before pregnancy and during pregnancy, and adverse outcomes, among women who delivered infants with and without assisted reproductive technology (ART), using linked birth certificates (BC) and National ART Surveillance System (NASS) data. METHODS: Data were analyzed for 384,390 women and 392,248 infants born in Massachusetts and Michigan during 2008-2009. Maternal smoking prevalence was estimated by ART status using smoking as indicated on the BC. For ART users, to evaluate underreporting, prepregnancy smoking was also estimated from BC, NASS, or both sources. Effects of prenatal smoking on preterm delivery and mean birthweight (term infants only) among singleton infants were examined by ART status. RESULTS: Maternal smoking prevalence estimates were significantly lower for ART users than nonusers (prepregnancy = 3.2% vs. 16.7%; prenatal = 1.0% vs. 11.1%, p < 0.05). When combining smoking information from BC and NASS, prepregnancy smoking prevalence estimates for ART users could be as high as 4.4% to 6.1%. Adverse effects of smoking on infant outcomes in ART pregnancies were consistent with the effects seen in non-ART pregnancies, specifically a decline in infant birthweight and an increase in preterm delivery, although the association between smoking and preterm delivery was not statistically significant. CONCLUSION: A low but substantial proportion of ART users smoked before and during pregnancy. As ART users are highly motivated to get pregnant, it should be clearly communicated that smoking can decrease fertility and adversely affect pregnancy outcomes. Continued efforts are needed to encourage smoking cessation and maintain tobacco abstinence among all women of reproductive age. |
Exposure to advertisements and electronic cigarette use among US middle and high school students
Singh T , Agaku IT , Arrazola RA , Marynak KL , Neff LJ , Rolle IT , King BA . Pediatrics 2016 137 (5) BACKGROUND: Electronic cigarette (e-cigarette) use among US students increased significantly during 2011 to 2014. We examined the association between e-cigarette advertisement exposure and current e-cigarette use among US middle school and high school students. METHODS: Data came from the 2014 National Youth Tobacco Survey (n = 22 007), a survey of students in grades 6 through 12. The association between current e-cigarette use and exposure to e-cigarette advertisements via 4 sources (Internet, newspapers/magazines, retail stores, and TV/movies) was assessed. Three advertising exposure categories were assessed: never/rarely, sometimes, and most of the time/always. Separate logistic regression models were used to measure the association, adjusting for gender, race/ethnicity, grade, and other tobacco use. RESULTS: Compared with students who reported exposure to e-cigarette advertisements never/rarely, the odds of current e-cigarette use were significantly (P < .05) greater among those reporting exposure sometimes and most of the time/always, respectively, as follows: Internet (adjusted odds ratio: middle school, 1.44 and 2.91; high school, 1.49 and 2.02); newspapers/magazines (middle school, 0.93 [not significant] and 1.87; high school, 1.26 and 1.71); retail stores (middle school, 1.78 and 2.34; high school, 1.37 and 1.91); and TV/movies (middle school, 1.25 [not significant] and 1.80; high school, 1.24 and 1.54). CONCLUSIONS: E-cigarette advertisement exposure is associated with current e-cigarette use among students; greater exposure is associated with higher odds of use. Given that youth use of tobacco in any form is unsafe, comprehensive tobacco prevention and control strategies, including efforts to reduce youth exposure to advertising, are critical to prevent all forms of tobacco use among youth. |
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Disease Reservoirs and Vectors
- Environmental Health
- Food Safety
- Genetics and Genomics
- Health Economics
- Healthcare Associated Infections
- Immunity and Immunization
- Injury and Violence
- Laboratory Sciences
- Maternal and Child Health
- Occupational Safety and Health
- Parasitic Diseases
- Reproductive Health
- Substance Use and Abuse
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed: Feb 1, 2024
- Page last updated: Apr 29, 2024