Achievements, challenges and unmet needs for haemophilia patients with inhibitors: Report from a symposium in Paris, France on 20 November 2014.
Dargaud Y , Pavlova A , Lacroix-Desmazes S , Fischer K , Soucie M , Claeyssens S , Scott DW , d'Oiron R , Lavigne-Lissalde G , Kenet G , Escuriola Ettingshausen C , Borel-Derlon A , Lambert T , Pasta G , Negrier C . Haemophilia 2016 22 Suppl 1 1-24 Over the past 20 years, there have been many advances in haemophilia treatment that have allowed patients to take greater control of their disease. However, the development of factor VIII (FVIII) inhibitors is the greatest complication of the disease and a major challenge in the treatment of haemophilia, making management of bleeding episodes difficult and surgical procedures very challenging. A meeting to discuss the unmet needs of haemophilia patients with inhibitors was held in Paris on 20 November 2014. Topics discussed were genetic and non-genetic risk factors for the development of inhibitors, immunological aspects of inhibitor development, FVIII products and inhibitor development, generation and functional properties of engineered antigen-specific T regulatory cells, suppression of immune responses to FVIII, prophylaxis in haemophilia patients with inhibitors, epitope mapping of FVIII inhibitors, current controversies in immune tolerance induction therapy, surgery in haemophilia patients with inhibitors and future perspectives for the treatment of haemophilia patients with inhibitors. A summary of the key points discussed is presented in this paper. |
Natriuretic peptide and high-sensitivity troponin for cardiovascular risk prediction in diabetes: The Atherosclerosis Risk in Communities (ARIC) Study
Gori M , Gupta DK , Claggett B , Selvin E , Folsom AR , Matsushita K , Bello NA , Cheng S , Shah A , Skali H , Vardeny O , Ni H , Ballantyne CM , Astor BC , Klein BE , Aguilar D , Solomon SD . Diabetes Care 2016 39 (5) 677-85 OBJECTIVE: Cardiovascular disease (CVD) is the major cause of morbidity and mortality in diabetes; yet, heterogeneity in CVD risk has been suggested in diabetes, providing a compelling rationale for improving diabetes risk stratification. We hypothesized that N-terminal prohormone brain natriuretic peptide (NTproBNP) and high-sensitivity troponin T may enhance CVD risk stratification beyond commonly used markers of risk and that CVD risk is heterogeneous in diabetes. RESEARCH DESIGN AND METHODS: Among 8,402 participants without prevalent CVD at visit 4 (1996-1998) of the Atherosclerosis Risk in Communities (ARIC) study, there were 1,510 subjects with diabetes (mean age 63 years, 52% women, 31% African American, and 60% hypertensive). RESULTS: Over a median follow-up of 13.1 years, there were 540 incident fatal/nonfatal CVD events (coronary heart disease, heart failure, and stroke). Both troponin T ≥14 ng/L (hazard ratio [HR] 1.96 [95% CI 1.57-2.46]) and NTproBNP >125 pg/mL (1.61 [1.29-1.99]) were independent predictors of incident CVD events in multivariable Cox proportional hazards models. Addition of circulating cardiac biomarkers to traditional risk factors, abnormal electrocardiogram (ECG), and conventional markers of diabetes complications including retinopathy, nephropathy, and peripheral arterial disease significantly improved CVD risk prediction (net reclassification index 0.16 [95% CI 0.07-0.22]). Compared with individuals without diabetes, subjects with diabetes had 1.6-fold higher adjusted risk of incident CVD. 
However, participants with diabetes with normal cardiac biomarkers and no conventional complications/abnormal ECG (n = 725 [48%]) were at low risk (HR 1.12 [95% CI 0.92-1.31]), while those with abnormal cardiac biomarkers, alone (n = 186 [12%]) or in combination with conventional complications/abnormal ECG (n = 243 [16%]), were at greater risk (1.99 [1.59-2.50] and 2.80 [2.34-3.35], respectively). CONCLUSIONS: Abnormal levels of NTproBNP and troponin T may help to distinguish individuals with high diabetes risk from those with low diabetes risk, providing incremental risk prediction beyond commonly used markers of risk. |
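The net reclassification index reported in the ARIC abstract above summarizes how adding the cardiac biomarkers moves subjects between risk categories. A minimal sketch of the categorical NRI calculation, using a tiny made-up dataset (not the ARIC data):

```python
def net_reclassification_index(old_cat, new_cat, event):
    """Categorical NRI: (P(up|event) - P(down|event))
    + (P(down|nonevent) - P(up|nonevent)).
    old_cat/new_cat are risk-category indices before/after adding biomarkers."""
    up_e = down_e = up_ne = down_ne = n_e = n_ne = 0
    for o, n, e in zip(old_cat, new_cat, event):
        if e:
            n_e += 1
            up_e += n > o      # event moved to a higher risk category (good)
            down_e += n < o    # event moved lower (bad)
        else:
            n_ne += 1
            up_ne += n > o     # non-event moved higher (bad)
            down_ne += n < o   # non-event moved lower (good)
    return (up_e - down_e) / n_e + (down_ne - up_ne) / n_ne

# Hypothetical example: 4 events and 4 non-events, two risk categories (0/1)
events = [1, 1, 1, 1, 0, 0, 0, 0]
before = [0, 0, 1, 1, 1, 1, 0, 0]
after  = [1, 0, 1, 0, 0, 1, 0, 0]
print(round(net_reclassification_index(before, after, events), 2))
```

In this toy example one non-event is correctly reclassified downward while the event reclassifications cancel out, so the NRI is 0.25; the published value of 0.16 was estimated from the full cohort with its own category cutoffs.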
Non-response in a cross-sectional study of respiratory health in Norway
Abrahamsen R , Svendsen MV , Henneberger PK , Gundersen GF , Toren K , Kongerud J , Fell AK . BMJ Open 2016 6 (1) e009912 OBJECTIVES: Declining participation in epidemiological studies has been reported in recent decades and may lead to biased prevalence estimates and selection bias. The aim of the study was to identify possible causes and effects of non-response in a population-based study of respiratory health in Norway. DESIGN: The Telemark study is a longitudinal study that began with a cross-sectional survey in 2013. SETTING: In 2013, a random sample of 50 000 inhabitants aged 16-50 years, living in Telemark county, received a validated postal questionnaire. The response rate was 33%. In this study, a random sample of 700 non-responders was contacted first by telephone and then by mail. OUTCOME MEASURES: Response rates, prevalence and OR of asthma and respiratory symptoms based on exposure to vapours, gas, dust or fumes (VGDF) and smoking. Causes of non-response. RESULTS: A total of 260 non-responders (37%) participated. Non-response was associated with younger age, male sex, living in a rural area and past smoking. The prevalence was similar for responders and non-responders for physician-diagnosed asthma and several respiratory symptoms. The prevalence of chronic cough and use of asthma medication was overestimated in the Telemark study, and adjusted prevalence estimates were 17.4% and 5%, respectively. Current smoking was identified as a risk factor for respiratory symptoms among responders and non-responders, while occupational VGDF exposure was a risk factor only among responders. The Breslow-Day test detected heterogeneity between responders and non-responders in the association between productive cough and occupational VGDF exposure. CONCLUSIONS: The Telemark study provided valid estimates for physician-diagnosed asthma and several respiratory symptoms, while it was necessary to adjust prevalence estimates for chronic cough and use of asthma medication. 
Reminder letters had little effect on risk factor associations. Selection bias should be considered in future investigations of the relationship between respiratory outcomes and exposures. |
Population-based estimates of decreases in quality-adjusted life expectancy associated with unhealthy body mass index
Jia H , Zack M M , Thompson W W . Public Health Rep 2016 131 (1) 177-184 OBJECTIVE: Being classified as outside the normal range for body mass index (BMI) has been associated with increased risk for chronic health conditions, poor health-related quality of life (HRQOL), and premature death. To assess the impact of BMI on HRQOL and mortality, we compared quality-adjusted life expectancy (QALE) by BMI levels. METHODS: We obtained HRQOL data from the 1993–2010 Behavioral Risk Factor Surveillance System and life table estimates from the National Center for Health Statistics national mortality files to estimate QALE among U.S. adults by BMI categories: underweight (BMI <18.5 kg/m2), normal weight (BMI 18.5–24.9 kg/m2), overweight (BMI 25.0–29.9 kg/m2), obese (BMI 30.0–34.9 kg/m2), and severely obese (BMI ≥35.0 kg/m2). RESULTS: In 2010 in the United States, the highest estimated QALE for adults at 18 years of age was 54.1 years for individuals classified as normal weight. The two lowest QALE estimates were for those classified as either underweight (48.9 years) or severely obese (48.2 years). For individuals who were overweight or obese, the QALE estimates fell between those classified as either normal weight (54.1 years) or severely obese (48.2 years). The difference in QALE between adults classified as normal weight and those classified as either overweight or obese was significantly higher among women than among men, irrespective of race/ethnicity. CONCLUSIONS: Using population-based data, we found significant differences in QALE loss by BMI category. These findings are valuable for setting national and state targets to reduce health risks associated with severe obesity, and could be used for cost-effectiveness evaluations of weight-reduction interventions. |
Prevalence of excess sodium intake in the United States - NHANES, 2009-2012
Jackson SL , King SM , Zhao L , Cogswell ME . MMWR Morb Mortal Wkly Rep 2016 64 (52) 1393-7 Hypertension, a major risk factor for cardiovascular diseases, occurs among 29% of U.S. adults, and lowering excess sodium intake can reduce blood pressure (1-3). The 2015-2020 Dietary Guidelines for Americans recommend consuming less than 2,300 mg dietary sodium per day for persons aged ≥14 years and less for persons aged 2-13 years.* To examine the current prevalence of excess sodium intake among Americans overall, and among hypertensive adults, CDC analyzed data from 14,728 participants aged ≥2 years in the 2009-2012 National Health and Nutrition Examination Survey (NHANES). Eighty-nine percent of adults and over 90% of children exceeded recommendations for sodium intake. Among hypertensive adults, 86% exceeded 2,300 mg dietary sodium per day. To address the high prevalence of excess sodium consumption in the U.S. population, the Institute of Medicine (IOM) recommended reducing sodium in the food supply, as excess sodium added to foods during commercial processing and preparation represents the main source of sodium intake in U.S. diets (4). |
Evaluating early case capture of pediatric cancers in seven central cancer registries in the United States, 2013
Puckett M , Neri A , Rohan E , Clerkin C , Underwood J M , Ryerson A B , Stewart S L . Public Health Rep 2016 131 (1) 126-136 OBJECTIVE: Cancer is the second-leading cause of death in children, but incidence data are not available until two years after diagnosis, thereby delaying data dissemination and research. An early case capture (ECC) surveillance program was piloted in seven state cancer registries to register pediatric cancer cases within 30 days of diagnosis. We sought to determine the quality of ECC data and understand pilot implementation. METHODS: We used quantitative and qualitative methods to evaluate ECC. We assessed data quality by comparing demographic and clinical characteristics from the initial ECC submission to a resubmission of ECC pilot data and to the most recent year of routinely collected cancer data for each state individually and in aggregate. We conducted telephone focus groups with registry staff to determine ECC practices and difficulties in August and September 2013. Interviews were recorded, transcribed, and coded to identify themes. RESULTS: Comparing ECC initial submissions with submissions for all states, ECC data were nationally representative for age (9.7 vs. 9.9 years) and sex (673 of 1,324 [50.9%] vs. 42,609 of 80,547 [52.9%] male cases), but not for primary site (472 of 1,324 [35.7%] vs. 27,547 of 80,547 [34.2%] leukemia/lymphoma cases), behavior (1,219 of 1,324 [92.1%] vs. 71,525 of 80,547 [88.8%] malignant cases), race/ethnicity (781 of 1,324 [59.0%] vs. 64,518 of 80,547 [80.1%] white cases), or diagnostic confirmation (1,233 of 1,324 [93.2%] vs. 73,217 of 80,547 [90.9%] microscopically confirmed cases). When comparing initial ECC data with resubmission data, differences were seen in race/ethnicity (808 of 1,324 [61.1%] vs. 1,425 of 1,921 [74.2%] white cases), primary site (475 of 1,324 [35.9%] vs. 670 of 1,921 [34.9%] leukemia/lymphoma cases), and behavior (1,215 of 1,324 [91.8%] vs. 
1,717 of 1,921 [89.4%] malignant cases). Common themes from focus group analysis included implementation challenges and facilitators, benefits of ECC, and utility of ECC data. CONCLUSIONS: ECC provided data rapidly and reflected national data overall with differences in several data elements. ECC also expanded cancer reporting infrastructure and increased data completeness and timeliness. Although challenges related to timeliness and increased work burden remain, indications suggest that researchers may reliably use these data for pediatric cancer studies. |
Association of chronic obstructive pulmonary disease with increased confusion or memory loss and functional limitations among adults in 21 states, 2011 Behavioral Risk Factor Surveillance System
Greenlund KJ , Liu Y , Deokar AJ , Wheaton AG , Croft JB . Prev Chronic Dis 2016 13 E02 INTRODUCTION: Chronic obstructive pulmonary disease (COPD) is associated with cognitive impairment, but consequences of this association on a person's functional limitations are unclear. We examined the association between COPD and increased confusion and memory loss (ICML) and functional limitations among adults with COPD. METHODS: We studied adults aged 45 years or older in 21 states who participated in the 2011 Behavioral Risk Factor Surveillance System (n = 102,739). Presence of COPD was based on self-reported physician diagnosis. ICML was based on self-report that confusion or memory loss occurred more often or worsened during the prior year. ICML-associated difficulties were defined as giving up household chores and former activities, decreased ability to work or engage in social activities, or needing help from family or friends during the prior year due to ICML. General limitations were defined as needing special equipment as a result of a health condition, having had activity limitations for 2 weeks or more in the prior month, or being unable to work. Multivariable models were adjusted for demographics, health behaviors or conditions, and frequent mental distress. RESULTS: COPD was reported by 9.3% of adults. ICML was greater among those with COPD than among those without COPD (25.8% vs 11%; adjusted prevalence ratio [aPR], 1.48; 95% confidence interval [CI], 1.32-1.66). People with COPD, either with or without ICML, were more likely than those without COPD to report general functional limitations. Among people reporting ICML, those with COPD were more likely to report interference with work or social activities than those without COPD (aPR, 1.17; 95% CI, 1.01-1.36). CONCLUSION: Functional limitations were greater among those with COPD than among those without, and ICML may further affect these limitations. 
Results from our study can inform future studies of self-management and functional limitations for people with COPD. |
Blood lead and other metal biomarkers as risk factors for cardiovascular disease mortality
Aoki Y , Brody DJ , Flegal KM , Fakhouri TH , Parker JD , Axelrad DA . Medicine (Baltimore) 2016 95 (1) e2223 Analyses of the Third National Health and Nutrition Examination Survey (NHANES III) in 1988 to 1994 found an association of increasing blood lead levels <10 μg/dL with a higher risk of cardiovascular disease (CVD) mortality. The potential need to correct blood lead for hematocrit/hemoglobin and adjust for biomarkers for other metals, for example, cadmium and iron, had not been addressed in the previous NHANES III-based studies on the blood lead-CVD mortality association. We analyzed 1999 to 2010 NHANES data for 18,602 participants who had a blood lead measurement, were ≥40 years of age at the baseline examination and were followed for mortality through 2011. We calculated the relative risk for CVD mortality as a function of hemoglobin- or hematocrit-corrected log-transformed blood lead through Cox proportional hazards regression analysis with adjustment for serum iron, blood cadmium, serum C-reactive protein, serum calcium, smoking, alcohol intake, race/Hispanic origin, and sex. The adjusted relative risk for CVD mortality was 1.44 (95% confidence interval = 1.05, 1.98) per 10-fold increase in hematocrit-corrected blood lead, with little evidence of nonlinearity. Similar results were obtained with hemoglobin-corrected blood lead. Not correcting blood lead for hematocrit/hemoglobin resulted in underestimation of the lead-CVD mortality association, while not adjusting for iron status and blood cadmium resulted in overestimation of the lead-CVD mortality association. In a nationally representative sample of U.S. adults, log-transformed blood lead was linearly associated with increased CVD mortality. Correcting blood lead for hematocrit/hemoglobin and adjustments for some biomarkers affected the association. |
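The "per 10-fold increase" hazard ratio in the abstract above follows directly from modeling log10-transformed blood lead: a one-unit change in log10(lead) is a 10-fold change in lead, so the hazard ratio is exp(beta). A small arithmetic sketch (the coefficient below is back-calculated from the published HR of 1.44, not taken from the fitted model):

```python
import math

# Hypothetical Cox coefficient for log10(hematocrit-corrected blood lead);
# the published estimate (HR 1.44 per 10-fold increase) implies
# beta = ln(1.44) ≈ 0.365.
beta = math.log(1.44)

# HR per 10-fold increase in blood lead: Δlog10(lead) = 1
hr_tenfold = math.exp(beta * 1.0)

# HR for a doubling of blood lead: Δlog10(lead) = log10(2) ≈ 0.301
hr_doubling = math.exp(beta * math.log10(2.0))

print(round(hr_tenfold, 2), round(hr_doubling, 2))
```

This is only a unit-conversion illustration; the actual analysis adjusted for iron, cadmium, C-reactive protein, and the other covariates listed above.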
Corticosteroid use in a prospective, community-based cohort of newly diagnosed inflammatory bowel disease patients
Shapiro JM , Hagin SE , Shah SA , Bright R , Law M , Moniz H , Giacalone J , Jackvony T , Taleban S , Samad Z , Merrick M , Sands BE , LeLeiko NS . Dig Dis Sci 2016 61 (6) 1635-40 BACKGROUND: Systemic corticosteroids (CS) are a mainstay of treatment for patients with newly diagnosed inflammatory bowel disease (IBD). Previous population-based studies report CS exposure rates range from 39 to 75% within the first year of diagnosis with surgical resection rates as high as 13-18% in the same time frame. These reports represent an older cohort of patients enrolled over prolonged periods of time and do not necessarily reflect current treatment approaches. We examine CS use during the first year of IBD diagnosis in a community-based, inception cohort. METHODS: Data were derived from the Ocean State Crohn's and Colitis Area Registry (OSCCAR), a prospective inception cohort of IBD patients who are residents of Rhode Island. RESULTS: A total of 272 patients were included in the current analyses. Overall, 60% of Crohn's disease and 57% of ulcerative colitis patients were exposed to at least one course of CS during year 1 of study enrollment. Most notably, only 2% of patients (n = 5) required a surgical resection. CONCLUSIONS: In this community-based cohort, 59% of patients were exposed to at least one course of CS during their first year of enrollment. In contrast to previous studies, OSCCAR represents a more modern cohort of patients. While steroid exposure rates were similar or slightly higher than those in previous reports, we observed a low rate of surgical resection. As our cohort ages, future analysis will focus on the role more contemporary agents may play on the low rates of surgery we observed. |
Development and validation of a hypertension prevalence estimator tool for use in clinical settings
Ritchey M , Yuan K , Gillespie C , Zhang G , Ostchega Y . J Clin Hypertens (Greenwich) 2016 18 (8) 750-61 Health systems are well positioned to identify and control hypertension among their patients. However, almost one third of US adults with uncontrolled hypertension are currently receiving medical care and are unaware of being hypertensive. This study describes the development and validation of a tool that health systems can use to compare their reported hypertension prevalence with their expected prevalence. Tool users provide the number of patients aged 18 to 85 years treated annually, stratified by sex, age group, race/ethnicity, and comorbidity status. Each stratum is multiplied by stratum-specific national prevalence estimates and the amounts are summed to calculate the number of expected hypertensive patients. The tool's validity was assessed by applying it to samples from cohorts with known hypertension prevalence; small differences in expected vs actual prevalence were identified (range, -3.3% to 0.6%). This tool provides clinically useful hypertension prevalence estimates that health systems can use to help inform hypertension management quality improvement efforts. |
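The tool's arithmetic as described above is a stratum-weighted sum. A minimal sketch with made-up strata and prevalence values (the real tool uses stratum-specific national estimates derived from NHANES, not these placeholder numbers):

```python
# Each stratum: (number of patients treated annually, expected hypertension
# prevalence for that sex/age/race-ethnicity/comorbidity stratum).
# All numbers below are hypothetical placeholders.
strata = [
    (1200, 0.08),  # e.g., women aged 18-44, no comorbidity
    (800, 0.25),   # e.g., men aged 45-64, no comorbidity
    (500, 0.55),   # e.g., adults aged 65-85 with diabetes
]

# Multiply each stratum count by its prevalence estimate and sum
expected_cases = sum(n * p for n, p in strata)
total_patients = sum(n for n, _ in strata)
expected_prevalence = expected_cases / total_patients

print(round(expected_cases), round(expected_prevalence, 3))
```

A health system would then compare this expected count against the number of patients it has actually diagnosed with hypertension to gauge potential under-identification.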
Disability-free life-years lost among adults aged ≥50 years, with and without diabetes
Bardenheier BH , Lin J , Zhuo X , Ali MK , Thompson TJ , Cheng YJ , Gregg EW . Diabetes Care 2015 39 (7) 1222-9 OBJECTIVE: Quantify the impact of diabetes status on healthy and disabled years of life for older adults in the U.S. and provide a baseline from which to evaluate ongoing national public health efforts to prevent and control diabetes and disability. RESEARCH DESIGN AND METHODS: Adults (n = 20,008) aged 50 years and older were followed from 1998 to 2012 in the Health and Retirement Study, a prospective biennial survey of a nationally representative sample of adults. Diabetes and disability status (defined by mobility loss, difficulty with instrumental activities of daily living [IADL], and/or difficulty with activities of daily living [ADL]) were self-reported. We estimated incidence of disability, remission to nondisability, and mortality. We developed a discrete-time Markov simulation model with a 1-year transition cycle to predict and compare lifetime disability-related outcomes between people with and without diabetes. Data represent the U.S. population in 1998. RESULTS: From age 50, adults with diabetes died 4.6 years earlier, developed disability 6-7 years earlier, and spent about 1-2 more years in a disabled state than adults without diabetes. With increasing baseline age, diabetes was associated with significant (P < 0.05) reductions in the number of total and disability-free life-years, but the absolute difference in years between those with and without diabetes was less than at younger baseline age. Men with diabetes spent about twice as much of their remaining years disabled (20-24% of remaining life across the three disability definitions) as men without diabetes (12-16% of remaining life across the three disability definitions). Similar associations between diabetes status and disability-free and disabled years were observed among women. 
CONCLUSIONS: Diabetes is associated with a substantial reduction in nondisabled years, to a greater extent than the reduction of longevity. |
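The discrete-time Markov simulation described in the entry above can be sketched as a cohort run forward through yearly transition cycles. The three states and the cohort-forwarding logic mirror the study design, but the transition probabilities below are hypothetical placeholders, not the HRS-derived estimates:

```python
# States: 0 = nondisabled, 1 = disabled, 2 = dead (absorbing).
# Hypothetical 1-year transition probabilities (each row sums to 1).
P = [
    [0.90, 0.07, 0.03],  # from nondisabled: stay, become disabled, die
    [0.10, 0.80, 0.10],  # from disabled: remit to nondisability, stay, die
    [0.00, 0.00, 1.00],  # dead is absorbing
]

def life_years(p, cycles=200):
    """Run the cohort forward; return (disability-free years, disabled years)."""
    state = [1.0, 0.0, 0.0]  # whole cohort starts nondisabled
    free = disabled = 0.0
    for _ in range(cycles):
        free += state[0]     # person-years contributed in this cycle
        disabled += state[1]
        # one Markov transition: new_state[j] = sum_i state[i] * p[i][j]
        state = [sum(state[i] * p[i][j] for i in range(3)) for j in range(3)]
    return free, disabled

free, dis = life_years(P)
print(round(free, 1), round(dis, 1))
```

Running separate models with diabetes-specific and nondiabetes-specific transition probabilities, then differencing the resulting disability-free years, is the comparison the study reports.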
Multifacility Outbreak of Middle East Respiratory Syndrome in Taif, Saudi Arabia.
Assiri A , Abedi GR , Saeed AA , Abdalla MA , Al-Masry M , Choudhry AJ , Lu X , Erdman DD , Tatti K , Binder AM , Rudd J , Tokars J , Miao C , Alarbash H , Nooh R , Pallansch M , Gerber SI , Watson JT . Emerg Infect Dis 2016 22 (1) 32-40 Middle East respiratory syndrome (MERS) coronavirus (MERS-CoV) is a novel respiratory pathogen first reported in 2012. During September 2014-January 2015, an outbreak of 38 cases of MERS was reported from 4 healthcare facilities in Taif, Saudi Arabia; 21 of the 38 case-patients died. Clinical and public health records showed that 13 patients were healthcare personnel (HCP). Fifteen patients, including 4 HCP, were associated with 1 dialysis unit. Three additional HCP in this dialysis unit had serologic evidence of MERS-CoV infection. Viral RNA was amplified from acute-phase serum specimens of 15 patients, and full spike gene-coding sequencing was obtained from 10 patients who formed a discrete cluster; sequences from specimens of 9 patients were closely related. Similar gene sequences among patients unlinked by time or location suggest unrecognized viral transmission. Circulation persisted in multiple healthcare settings over an extended period, underscoring the importance of strengthening MERS-CoV surveillance and infection-control practices. |
The shift to high-impact HIV prevention by health departments in the United States
Purcell D W , McCray E , Mermin J . Public Health Rep 2016 131 (1) 7-10 The National HIV/AIDS Strategy (NHAS) for 2010–2015, released in July 2010, focused on decreasing new human immunodeficiency virus (HIV) infections, improving access to and outcomes from HIV care, reducing disparities, and increasing coordination across all levels of government. In concert with the release of NHAS, the Centers for Disease Control and Prevention (CDC) launched an initiative to fund 12 health departments to redesign programmatic efforts to help achieve the goals of NHAS by implementing CDC’s new High Impact Prevention (HIP) approach to HIV. The Enhanced Comprehensive HIV Prevention Planning (ECHPP) project involved implementing a wide range of high-impact HIV prevention activities in the 12 cities with the highest prevalence of AIDS, which represented 44% of total cases nationwide reported through 2007. An underlying assumption in CDC’s HIP approach, and in NHAS, is that HIV prevention resources needed to be better targeted to have a greater population impact. The task for ECHPP projects was to show that health departments could successfully shift their resources to different areas, populations, and interventions and, in doing so, have a greater impact on HIV in their jurisdictions. |
Shifting resources and focus to meet the goals of the National HIV/AIDS Strategy: The Enhanced Comprehensive HIV Prevention Planning Project, 2010-2013
Flores S A , Purcell D W , Fisher H H , Belcher L , Carey J W , Courtenay-Quirk C , Dunbar E , Eke A N , Galindo C , Glassman M , Margolis A D , Newman M S , Prather C , Stratford D , Taylor R D , Mermin J . Public Health Rep 2016 131 (1) 52-58 In September 2010, CDC launched the Enhanced Comprehensive HIV Prevention Planning (ECHPP) project to shift HIV-related activities to meet goals of the 2010 National HIV/AIDS Strategy (NHAS). Twelve health departments in cities with high AIDS burden participated. These 12 grantees submitted plans detailing jurisdiction-level goals, strategies, and objectives for HIV prevention and care activities. We reviewed plans to identify themes in the planning process and initial implementation. Planning themes included data integration, broad engagement of partners, and resource allocation modeling. Implementation themes included organizational change, building partnerships, enhancing data use, developing protocols and policies, and providing training and technical assistance for new and expanded activities. Pilot programs also allowed grantees to assess the feasibility of large-scale implementation. These findings indicate that health departments in areas hardest hit by HIV are shifting their HIV prevention and care programs to increase local impact. Examples from ECHPP will be of interest to other health departments as they work toward meeting the NHAS goals. |
US college and university student health screening requirements for tuberculosis and vaccine-preventable diseases, 2012
Jewett A , Bell T , Cohen NJ , Buckley K , Leino EV , Even S , Beavers S , Brown C , Marano N . J Am Coll Health 2016 64 (5) 0 OBJECTIVE: Colleges are at risk for communicable disease outbreaks because of the high degree of person-to-person interactions and relatively crowded dormitory settings. This report describes the U.S. college student health screening requirements among U.S. resident and international students for tuberculosis (TB) and vaccine-preventable diseases (VPD) as it relates to the American College Health Association (ACHA) Guidelines. METHODS/PARTICIPANTS: In April 2012, U.S. college health administrators (N = 2858) were sent online surveys to assess their respective school's TB screening and immunization requirements. RESULTS: Surveys were completed by 308 (11%) schools. Most schools were aware of the ACHA immunization (78%) and TB screening (76%) guidelines. Schools reported having policies related to immunization screening (80.4%), immunization compliance (93%), TB screening (55%), and TB compliance (87%). CONCLUSION: Most colleges were following ACHA guidelines. However, there are opportunities for improvement to fully utilize the recommendations and prevent outbreaks of communicable diseases among students in colleges. |
Legionnaires' disease in South Africa, 2012-2014
Wolter N , Carrim M , Cohen C , Tempia S , Walaza S , Sahr P , de Gouveia L , Treurnicht F , Hellferscee O , Cohen AL , Benitez AJ , Dawood H , Variava E , Winchell JM , von Gottberg A . Emerg Infect Dis 2016 22 (1) 131-3 During June 2012-September 2014, we tested patients with severe respiratory illness for Legionella spp. infection and conducted a retrospective epidemiologic investigation. Of 1,805 patients tested, Legionella was detected in samples of 21 (1.2%); most were adults who had HIV or tuberculosis infections and were inappropriately treated for Legionella. |
Evaluation framework for HIV prevention and care activities in the Enhanced Comprehensive HIV Prevention Planning Project, 2010-2013
Fisher H H , Hoyte T , Flores S A , Purcell D W , Dunbar E , Stratford D . Public Health Rep 2016 131 (1) 67-75 OBJECTIVE: The Enhanced Comprehensive HIV Prevention Planning (ECHPP) project was a demonstration project implemented by 12 U.S. health departments (2010–2013) to enhance HIV program planning in cities with high AIDS prevalence, in support of National HIV/AIDS Strategy goals. Grantees were required to improve their planning and implementation of HIV prevention and care programs to increase their impact on local HIV epidemics. A multilevel evaluation using multiple data sources, spanning multiple years (2008–2015), will be conducted to assess the effect of ECHPP on client outcomes (e.g., HIV risk behaviors) and impact indicators (e.g., new HIV diagnoses). METHODS: We designed an evaluation approach that includes a broad assessment of program planning and implementation, a detailed examination of HIV prevention and care activities across funding sources, and an analysis of environmental and contextual factors that may affect services. A data triangulation approach was incorporated to integrate findings across all indicators and data sources to determine the extent to which ECHPP contributed to trends in indicators. RESULTS: To date, data have been collected for 2008–2009 (pre-ECHPP implementation) and 2010–2013 (ECHPP period). Initial analyses of process data indicate that ECHPP grantees increased their provision of HIV testing, condom distribution, and partner services programs and expanded their delivery of prevention programs for people diagnosed with HIV. CONCLUSION: The ECHPP evaluation (2008–2015) will assess whether ECHPP programmatic activities in 12 areas with high AIDS prevalence contributed to changes in client outcomes, and whether these changes were associated with changes in longer-term, community-level impact. |
The evolving contribution of emergency department testing studies: from risk to care
Oster AM . AIDS 2016 30 (1) 151-2 This issue of AIDS includes data from HIV serosurveys conducted in the Johns Hopkins Emergency Department (ED) over nearly 3 decades, including data from 2013 that have not previously been published (ref Kelen 2015). The earliest of the serosurveys, conducted in 1986 and 1987, aimed to document prevalence of HIV infection, with an eye toward quantifying occupational risk for health care providers [1, 2]. These early studies demonstrated the need for universal infection control precautions. A follow-up study in 1988 aimed to identify trends in HIV prevalence, document adherence to universal precautions implemented in the interim, and assess the burden of HIV-related service use in an emergency setting [3]. Together, these early serosurveys painted a picture of the epidemiology of HIV in an inner-city ED, examining the associations of demographic, risk, and geographic factors with HIV infection. They have also been important in understanding clinical presentations and courses for those patients with both known and unrecognized HIV infection in emergency settings. | Over time, the Johns Hopkins ED-based surveys also documented the emergence of HIV among populations not previously documented to be at risk, such as persons with heterosexual risk [4]. Additionally, the focus of the studies began to shift toward assessing feasibility of ED testing and identifying factors that could be used to prioritize groups for testing in ED settings [4]. Soon after, in 1993–1995, the Johns Hopkins ED instituted a routine HIV testing program. Since that time, many other EDs have followed suit, contributing to our understanding of the picture of HIV in diverse settings across the country [5]. |
Exploring chlamydia positivity among females on college campuses, 2008-2010
Habel MA , Leichliter JS , Torrone E . J Am Coll Health 2016 64 (6) 0 OBJECTIVE: Describe chlamydia positivity among young women tested at college health centers by student characteristics: age, race/ethnicity, and institution type. PARTICIPANTS: During 2008-2010, colleges participating in a national infertility prevention program provided chlamydia testing data from females aged 18-24. METHODS: We determined chlamydia positivity (# of positive tests divided by the # tested) among females stratified by college type (4-year versus 2-year; minority serving institutes (MSIs)). RESULTS: Chlamydia testing data were provided by 148 colleges: 37 (26%) MSIs and 21 (15%) 2-year colleges. Of the 118,946 chlamydia tests, 6.5% were positive. Chlamydia positivity in females at 4-year colleges was 6.6% versus 5.3% at 2-year colleges (p = 0.0001). Positivity at MSIs was almost double that at non-MSIs (10.0% vs. 5.4%; p = 0.0001). CONCLUSIONS: Chlamydia positivity may be higher among college females than previously thought. Higher positivity at MSIs suggests targeted STI prevention efforts may be useful for high-risk college populations. |
From START to finish: implications of the START study
De Cock KM , El-Sadr WM . Lancet Infect Dis 2016 16 (1) 13-4 How best to use antiretroviral therapy (ART) has been a topic of debate for almost three decades. The landmark START trial1 settled one question that should have been resolved long ago—when to initiate ART in people with HIV. Findings of START showed a 57% reduction in AIDS, severe non-AIDS events, or deaths in people with a CD4 count higher than 500 cells/μL who were randomly assigned to immediate versus deferred treatment.2 This finding supports conclusions from observational studies and the recently completed west African TEMPRANO trial3 (which also favoured early treatment), and has profound implications for public health. | START should also be regarded in the context of the HPTN 052 trial,4 which showed a 96% reduction in HIV transmission in serodiscordant couples when the HIV-infected partner was taking ART. HPTN 052 complemented findings from modelling exercises5 and ecological studies that noted a decrease in new HIV infections in association with treatment scale-up at the population level.6 Synthesising these findings, early treatment of HIV is beneficial for individual as well as population health, ART is the most potent HIV prevention intervention,7 and universal access to ART is absolutely central to the global HIV response. |
HIV testing in publicly funded settings, National Health Interview Survey, 2003-2010
Tan C , Van Handel M , Johnson C , Dietz P . Public Health Rep 2016 131 (1) 137-144 OBJECTIVE: We determined whether or not HIV testing in publicly funded settings in the United States increased after 2006, when CDC recommended expanded HIV screening in health-care settings for all people aged 13–64 years. METHODS: We analyzed 2003–2010 National Health Interview Survey data to estimate annual national percentages of people aged 18–64 years who were tested for HIV in the previous 12 months. Estimates were calculated by setting (publicly funded, yes/other) and stratified by sex. Test settings were categorized as publicly funded based on the contribution of public funds for HIV testing. We used logistic regression modeling to assess statistical significance in linear trends for 2003–2006 and 2006–2010, adjusting for age, race/ethnicity, and health insurance coverage. Using model parameters for survey year, we calculated the estimated annual percentage change (EAPC) in HIV testing as the difference in the model-predicted testing prevalence between baseline and first post-baseline years, divided by baseline prevalence. RESULTS: During 2006–2010, the percentage of women tested for HIV in publicly funded settings increased significantly from 1.9% in 2006 to 2.4% in 2010 (EAPC=6.9%, p=0.008) and the percentage tested in other settings remained fairly stable, from 9.7% in 2006 to 9.6% in 2010 (EAPC=-0.5%, p=0.708). During the same period, the percentage of men tested for HIV in publicly funded settings increased, but not significantly, from 1.5% in 2006 to 1.9% in 2010 (EAPC=5.3%, p=0.110) and the percentage tested in other settings decreased significantly from 7.5% in 2006 to 6.2% in 2010 (EAPC=-4.4%, p=0.001). CONCLUSION: Although HIV testing in publicly funded settings increased among women during 2006–2010, testing rates remained low, and no similar increase occurred among men. As such, all test settings should increase HIV screening, particularly for men. |
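The EAPC definition in the METHODS section can be written directly as a formula. A minimal sketch with hypothetical predicted prevalences; the paper derives its predictions from a logistic regression model, which is not reproduced here:

```python
def eapc(pred_baseline: float, pred_first_post: float) -> float:
    """Estimated annual percentage change (EAPC) as defined in the
    abstract: the difference between the model-predicted testing
    prevalence in the first post-baseline year and the baseline
    prediction, divided by the baseline prevalence, as a percent."""
    return 100.0 * (pred_first_post - pred_baseline) / pred_baseline

# Hypothetical predictions: baseline 2.0%, first post-baseline year 2.1%.
change = eapc(2.0, 2.1)  # 5.0 (percent per year)
```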
Infection prevention and control of the Ebola outbreak in Liberia, 2014-2015: key challenges and successes
Cooper C , Fisher D , Gupta N , MaCauley R , Pessoa-Silva CL . BMC Med 2016 14 (1) 2 Prior to the 2014-2015 Ebola outbreak, infection prevention and control (IPC) activities in Liberian healthcare facilities were basic. There was no national IPC guidance, nor dedicated staff at any level of government or healthcare facility (HCF) to ensure the implementation of best practices. Efforts to improve IPC early in the outbreak were ad hoc and messaging was inconsistent. In September 2014, at the height of the outbreak, the national IPC Task Force was established with a Ministry of Health (MoH) mandate to coordinate IPC response activities. A steering group of the Task Force, including representatives of the World Health Organization (WHO) and the United States Centers for Disease Control and Prevention (CDC), supported MoH leadership in implementing standardized messaging and IPC training for the health workforce. This structure, and the activities implemented under this structure, played a crucial role in the implementation of IPC practices and successful containment of the outbreak. Moving forward, a nationwide culture of IPC needs to be maintained through this governance structure in Liberia's health system to prevent and respond to future outbreaks. |
The role of influenza, RSV and other common respiratory viruses in severe acute respiratory infections and influenza-like illness in a population with a high HIV sero-prevalence, South Africa 2012-2015
Pretorius MA , Tempia S , Walaza S , Cohen AL , Moyes J , Variava E , Dawood H , Seleka M , Hellferscee O , Treurnicht F , Cohen C , Venter M . J Clin Virol 2015 75 21-26 BACKGROUND: Viruses detected in patients with acute respiratory infections may be the cause of illness or asymptomatic shedding. OBJECTIVE: To estimate the attributable fraction (AF) and the detection rate attributable to illness for each of the different respiratory viruses. STUDY DESIGN: We compared the prevalence of 10 common respiratory viruses (influenza A and B viruses; parainfluenza viruses 1-3; respiratory syncytial virus (RSV); adenovirus; rhinovirus; human metapneumovirus (hMPV); and enterovirus) in both HIV-positive and HIV-negative patients hospitalized with severe acute respiratory illness (SARI), outpatients with influenza-like illness (ILI), and control subjects who did not report any febrile, respiratory or gastrointestinal illness during 2012-2015 in South Africa. RESULTS: We enrolled 1959 SARI cases, 3784 ILI cases and 1793 controls, with an HIV sero-prevalence of 26%, 30% and 43%, respectively. Influenza virus (AF: 86.3%; 95%CI: 77.7-91.6%), hMPV (AF: 85.6%; 95%CI: 72.0-92.6%), and RSV (AF: 83.7%; 95%CI: 77.5-88.2%) infections were associated with severe disease, while rhinovirus (AF: 46.9%; 95%CI: 37.6-56.5%) and adenovirus (AF: 36.4%; 95%CI: 20.6-49.0%) were only moderately associated. CONCLUSIONS: Influenza, RSV and hMPV can be considered pathogens if detected in ILI and SARI, while rhinovirus and adenovirus were commonly identified in controls, suggesting that they may cause only a proportion of the clinical disease observed in positive patients. Nonetheless, they may be important contributors to disease. |
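The abstract does not spell out how the attributable fraction was estimated. As an assumed illustration only, a common case-control estimator derives the AF among virus-positive cases from the odds ratio as AF = 1 - 1/OR; the counts below are hypothetical:

```python
def attributable_fraction(cases_pos: int, cases_neg: int,
                          controls_pos: int, controls_neg: int) -> float:
    """Attributable fraction among virus-positive cases via the
    case-control odds ratio, AF = 1 - 1/OR. This is a standard
    estimator assumed for illustration; the paper's exact method
    is not given in the abstract."""
    odds_ratio = (cases_pos * controls_neg) / (cases_neg * controls_pos)
    return 1.0 - 1.0 / odds_ratio

# Hypothetical counts: 80/100 cases and 10/100 controls virus-positive.
af = attributable_fraction(80, 20, 10, 90)  # OR = 36, AF ~ 0.97
```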
Screening difficult-to-reach populations for tuberculosis using a mobile medical unit, Punjab India
Binepal G , Agarwal P , Kaur N , Singh B , Bhagat V , Verma RP , Satyanarayana S , Oeltmann JE , Moonan PK . Public Health Action 2015 5 (4) 241-245 BACKGROUND: In India, the National Health Mission has provided one mobile medical unit (MMU) per district in the state of Punjab to provide primary health care services for difficult-to-reach populations. OBJECTIVES: To determine the number of patients with presumptive tuberculosis (TB) and the number of TB cases detected and treated among patients who used the MMU services from May to December 2012 in Mohali district, Punjab, India. METHODS: A cross-sectional study was conducted and registers of the out-patient, laboratory, radiology, and TB departments of the MMU were reviewed to determine the number of persons presumed to have TB and the number of persons diagnosed with TB. RESULTS: Of 8346 patients who attended the MMUs, 663 (8%) had symptoms suggestive of TB. Among those with TB symptoms, 540 (81%) were evaluated for pulmonary TB using sputum examination or chest X-ray. In total, 58 (11%) patients had clinical or laboratory evidence of pulmonary TB, of whom 21 (36%) started anti-tuberculosis treatment. CONCLUSION: As MMUs are an integral part of the general public health system, these units have the potential to detect TB cases among difficult-to-reach populations. Additional research is required to optimise the diagnosis of TB at MMUs and to increase rates of TB treatment initiation. |
Measuring the potential role of frailty in apparent declining efficacy of HIV interventions
Hardnett FP , Rose CE . HIV Clin Trials 2015 16 (6) 219-227 OBJECTIVE: In recent HIV intervention trials, intervention efficacies appear to decline over time. Researchers have attributed this to "waning," or a loss of intervention efficacy. Another possible reason is heterogeneity in infection risk, or "frailty." We propose an approach to assessing the impact of frailty and waning on measures of intervention efficacy and statistical power in randomized controlled trials. METHODS: Using multiplicative risk reduction, we developed a mathematical formulation for computing disease incidence and the incidence rate ratio (IRR) as a function of frailty and waning. We designed study scenarios that held study-related factors constant, varied waning and frailty parameters, and measured the change in disease incidence, IRR, and statistical power. RESULTS: We found that frailty alone can impact disease incidence over time. However, frailty has minimal impact on the IRR. The factor that has the greatest influence on the IRR is intervention efficacy and the degree to which it is projected to wane. We also found that even moderate waning can cause an unacceptable decrease in statistical power, while the impact of frailty on statistical power is minimal. DISCUSSION: We conclude that frailty has minimal impact on trial results relative to intervention efficacy. Study resources would, therefore, be better spent on efforts to keep the intervention efficacy constant throughout the trial (e.g., enhancing the vaccine schedule or promoting treatment adherence). |
The global burden of fungal diseases
Vallabhaneni S , Mody RK , Walker T , Chiller T . Infect Dis Clin North Am 2015 30 (1) 1-11 Fungal diseases require greater attention today than ever before, given the expanding population of immunosuppressed patients who are at higher risk for these diseases. This article reports on distribution, incidence, and prevalence of various fungal diseases and points out gaps in knowledge where such data are not available. Fungal diseases that contribute substantially to global morbidity and mortality are highlighted. Long-term, sustainable surveillance programs for fungal diseases and better noninvasive and reliable diagnostic tools are needed to estimate the burden of these diseases more accurately. |
HIV/STI risk-reduction intervention efficacy with South African adolescents over 54 months
Jemmott JB 3rd , Jemmott LS , O'Leary A , Ngwane Z , Lewis DA , Bellamy SL , Icard LD , Carty C , Heeren GA , Tyler JC , Makiwane MB , Teitelman A . Health Psychol 2015 34 (6) 610-21 OBJECTIVE: Little research has tested HIV/sexually transmitted infection (STI) risk-reduction interventions' effects on early adolescents as they age into middle and late adolescence. This study tested whether intervention-induced reductions in unprotected intercourse during a 12-month period endured over a 54-month period and whether the intervention reduced the prevalence of STIs, which increase risk for HIV. METHOD: Grade 6 learners (mean age = 12.4 years) participated in a 12-month trial in Eastern Cape Province, South Africa, in which 9 matched pairs of schools were randomly selected and within pairs randomized to a theory-based HIV/STI risk-reduction intervention or an attention-control intervention. They completed 42- and 54-month postintervention measures of unprotected intercourse (the primary outcome), other sexual behaviors, theoretical constructs, and, at 42- and 54-month follow-up only, biologically confirmed curable STIs (chlamydial infection, gonorrhea, and trichomoniasis) and herpes simplex virus 2. RESULTS: The HIV/STI risk-reduction intervention reduced unprotected intercourse averaged over the entire follow-up period (OR = 0.42, 95% CI [0.22, 0.84]), an effect not significantly reduced at 42- and 54-month follow-up compared with 3-, 6-, and 12-month follow-ups. The intervention caused positive changes on theoretical constructs averaged over the 5 follow-ups, although most effects weakened at long-term follow-up. Although the intervention's main effect on STIs was nonsignificant, an Intervention Condition x Time interaction revealed that it significantly reduced curable STIs at 42-month follow-up in adolescents who reported sexual experience. 
CONCLUSION: These results suggest that theory-based behavioral interventions with early adolescents can have long-lived effects in the context of a generalized severe HIV epidemic. |
Seroepidemiological studies of Crimean-Congo hemorrhagic fever virus in domestic and wild animals
Spengler JR , Bergeron E , Rollin PE . PLoS Negl Trop Dis 2016 10 (1) e0004210 Crimean-Congo hemorrhagic fever (CCHF) is a widely distributed, tick-borne viral disease. Humans are the only species known to develop illness after CCHF virus (CCHFV) infection, characterized by a nonspecific febrile illness that can progress to severe, often fatal, hemorrhagic disease. A variety of animals may serve as asymptomatic reservoirs of CCHFV in an endemic cycle of transmission. Seroepidemiological studies have been instrumental in elucidating CCHFV reservoirs and in determining endemic foci of viral transmission. Herein, we review over 50 years of CCHFV seroepidemiological studies in domestic and wild animals. This review highlights the role of livestock in the maintenance and transmission of CCHFV, and provides a detailed summary of seroepidemiological studies of wild animal species, reflecting their relative roles in CCHFV ecology. |
Respiratory health effects of ultrafine particles in children: A literature review
Heinzerling A , Hsu J , Yip F . Water Air Soil Pollut 2016 227 32 By convention, airborne particles ≤0.1 μm (100 nm) are defined as ultrafine particles (UFPs). UFPs can account for a large proportion of the particle number in particulate matter with aerodynamic diameters ≤2.5 μm (PM2.5). Despite the documented respiratory health effects of PM2.5 and concerns that UFPs might be more toxic than larger particulate matter, the effects of UFPs on the respiratory system are not well described. Even less is known about the respiratory health effects of UFPs among particularly vulnerable populations, including children. We reviewed studies examining respiratory health effects of UFPs in children and identified 12 relevant articles. Most (8/12) studies measured UFP exposure using central ambient monitors, and we found substantial heterogeneity in UFP definitions and study designs. No long-term studies were identified. In single-pollutant models, UFPs were associated with incident wheezing, current asthma, lower spirometric values, and asthma-related emergency department visits among children. Also, higher exhaled nitric oxide levels were positively correlated with UFP dose among children with asthma or allergy to house dust mites in one study. Multivariate models accounting for potential copollutant confounding yielded no statistically significant results. Although evidence for a relationship between UFPs and children's respiratory health is accumulating, the literature remains inconclusive. Interpretation of existing data is constrained by study heterogeneity, limited accounting for UFP spatial variation, and lack of significant findings from multipollutant models. |
Agricultural pesticide exposure and chronic kidney disease: new findings and more questions
Calvert GM . Occup Environ Med 2016 73 (1) 1-2 The vital importance of agriculture is well-recognised, as is the usefulness of pesticides in increasing agricultural yields and reducing spoilage rates. The usefulness of pesticides in mitigating disease-carrying pests (eg, mosquitoes) is also well known. However, there are also risks associated with pesticide use. In addition to causing acute poisoning,1 pesticides are associated with increased risks of cancer2 and other diseases. A paper by Lebov and colleagues3 provides evidence for another potential risk associated with pesticides, namely end-stage renal disease (ESRD). To our knowledge, this is the first report using the United States Renal Data System (USRDS) to assess the association between pesticide exposure and ESRD. | Currently, there is little literature on the nephrotoxic effects of pesticides; the research that does exist comes from animal studies and case reports of pesticide-poisoned individuals. Fortunately, our understanding of the role of occupational exposures, including pesticides, in ESRD development in humans is growing. An important tool supporting this understanding is the USRDS.4 Because the US government provides healthcare coverage under Medicare for all patients with ESRD, and because these ESRD claims data are comprehensively captured, the USRDS represents a nearly complete national disease registry in the USA. No other medical condition is covered by the US government in this way, and no other disease or injury in the USA has so nearly complete a national registry.
From the first use of USRDS to identify occupational exposures associated with ESRD (initially an exploration of silica exposure5), it has been used to identify ESRD associations with several other occupational exposures, including perchloroethylene,6 lead7 and 1,1,1-trichloroethane.8 With the paper in this issue of OEM, USRDS has now been used to assess the association between pesticide exposure and ESRD.3 Lebov et al matched data from the Agricultural Health Study (AHS), a very large prospective study of licensed pesticide applicators, with data from USRDS to identify cohort members with ESRD and to determine whether the observed cases exceeded population rates. Although the authors provided reassuring findings of no increased ESRD risk in the overall cohort, they did find significantly increased ESRD risks and positive exposure-response trends among pesticide applicators who mixed or applied one or more of six specific pesticides: five herbicides (alachlor, atrazine, metolachlor, paraquat and pendimethalin) and the insecticide permethrin. |
Identification of Source of Brucella suis Infection in Human by Using Whole-Genome Sequencing, United States and Tonga.
Quance C , Robbe-Austerman S , Stuber T , Brignole T , DeBess EE , Boyd L , LeaMaster B , Tiller R , Draper J , Humphrey S , Erdman MM . Emerg Infect Dis 2016 22 (1) 79-82 Brucella suis infection was diagnosed in a man from Tonga, Polynesia, who had butchered swine in Oregon, USA. Although the US commercial swine herd is designated brucellosis-free, exposure history suggested infection from commercial pigs. We used whole-genome sequencing to determine that the man was infected in Tonga, averting a field investigation. |
For working-age cancer survivors, medical debt and bankruptcy create financial hardships
Banegas MP , Guy GP Jr , de Moor JS , Ekwueme DU , Virgo KS , Kent EE , Nutt S , Zheng Z , Rechis R , Yabroff KR . Health Aff (Millwood) 2016 35 (1) 54-61 The rising medical costs associated with cancer have led to considerable financial hardship for patients and their families in the United States. Using data from the LIVESTRONG 2012 survey of 4,719 cancer survivors ages 18-64, we examined the proportions of survivors who reported going into debt or filing for bankruptcy as a result of cancer, as well as the amount of debt incurred. Approximately one-third of the survivors had gone into debt, and 3 percent had filed for bankruptcy. Of those who had gone into debt, 55 percent incurred obligations of $10,000 or more. Cancer survivors who were younger, had lower incomes, and had public health insurance were more likely to go into debt or file for bankruptcy, compared to those who were older, had higher incomes, and had private insurance, respectively. Future longitudinal population-based studies are needed to improve understanding of financial hardship among US working-age cancer survivors throughout the cancer care trajectory and, ultimately, to help stakeholders develop evidence-based interventions and policies to reduce the financial hardship of cancer. |
Hospital utilization and costs among preterm infants by payer: Nationwide inpatient sample, 2009
Barradas DT , Wasserman MP , Daniel-Robinson L , Bruce MA , DiSantis KI , Navarro FH , Jones WA , Manzi NM , Smith MW , Goodness BM . Matern Child Health J 2016 20 (4) 808-18 OBJECTIVES: To describe hospital utilization and costs associated with preterm or low birth weight births (preterm/LBW) by payer prior to implementation of the Affordable Care Act and to identify areas for improvement in the quality of care received among preterm/LBW infants. METHODS: Hospital utilization (mean length of stay [LOS, days], secondary diagnoses for birth hospitalizations, primary diagnoses for rehospitalizations, and transfer status) and costs were described among preterm/LBW infants using the 2009 Nationwide Inpatient Sample. RESULTS: Approximately 9.1 % of included hospitalizations (n = 4,167,900) were births among preterm/LBW infants; however, these birth hospitalizations accounted for 43.4 % of total costs. Rehospitalizations of all infants occurred at a rate of 5.9 % overall, but accounted for 22.6 % of total costs. This pattern was observed across all payer types. The prevalence of rehospitalizations was nearly twice as high among preterm/LBW infants covered by Medicaid (7.6 %) compared to commercially-insured infants (4.3 %). Neonatal transfers were more common among preterm/LBW infants whose deliveries and hospitalizations were covered by Medicaid (7.3 %) versus commercial insurance (6.5 %). Uninsured/self-pay preterm and LBW infants died in-hospital during the first year of life at a rate of 91 per 1000 discharges-nearly three times higher than preterm and LBW infants covered by either Medicaid (37 per 1000) or commercial insurance (32 per 1000). CONCLUSIONS: When comparing preterm/LBW infants whose births were covered by Medicaid and commercial insurance, there were few differences in length of hospital stays and costs. However, opportunities for improvement within Medicaid and CHIP exist with regard to reducing rehospitalizations and neonatal transfers. |
Seasonal effectiveness of live attenuated and inactivated influenza vaccine
Chung JR , Flannery B , Thompson MG , Gaglani M , Jackson ML , Monto AS , Nowalk MP , Talbot HK , Treanor JJ , Belongia EA , Murthy K , Jackson LA , Petrie JG , Zimmerman RK , Griffin MR , McLean HQ , Fry AM . Pediatrics 2016 137 (2) e20153279 BACKGROUND: Few observational studies have evaluated the relative effectiveness of live attenuated (LAIV) and inactivated (IIV) influenza vaccines against medically attended laboratory-confirmed influenza. METHODS: We analyzed US Influenza Vaccine Effectiveness Network data from participants aged 2 to 17 years during 4 seasons (2010-2011 through 2013-2014) to compare relative effectiveness of LAIV and IIV against influenza-associated illness. Vaccine receipt was confirmed via provider/electronic medical records or immunization registry. We calculated the ratio (odds) of influenza-positive to influenza-negative participants among those age-appropriately vaccinated with either LAIV or IIV for the corresponding season. We examined relative effectiveness of LAIV and IIV by using adjusted odds ratios (ORs) and 95% confidence intervals (CIs) from logistic regression. RESULTS: Of 6819 participants aged 2 to 17 years, 2703 were age-appropriately vaccinated with LAIV (n = 637) or IIV (n = 2066). Odds of influenza were similar for LAIV and IIV recipients during 3 seasons (2010-2011 through 2012-2013). In 2013-2014, odds of influenza were significantly higher among LAIV recipients compared with IIV recipients 2 to 8 years old (OR 5.36; 95% CI, 2.37 to 12.13). Participants vaccinated with LAIV or IIV had similar odds of illness associated with influenza A/H3N2 or B. LAIV recipients had greater odds of illness due to influenza A/H1N1pdm09 in 2010-2011 and 2013-2014. CONCLUSIONS: We observed lower effectiveness of LAIV compared with IIV against influenza A/H1N1pdm09 but not A(H3N2) or B among children and adolescents, suggesting poor performance related to the LAIV A/H1N1pdm09 viral construct. |
Primary care physicians' perspectives about HPV vaccine
Allison MA , Hurley LP , Markowitz L , Crane LA , Brtnikova M , Beaty BL , Snow M , Cory J , Stokley S , Roark J , Kempe A . Pediatrics 2016 137 (2) e20152488 BACKGROUND AND OBJECTIVES: Because physicians' practices could be modified to reduce missed opportunities for human papillomavirus (HPV) vaccination, our goals were to: (1) describe self-reported practices regarding recommending the HPV vaccine; (2) estimate the frequency of parental deferral of HPV vaccination; and (3) identify characteristics associated with not discussing it. METHODS: A national survey among pediatricians and family physicians (FP) was conducted between October 2013 and January 2014. Using multivariable analysis, characteristics associated with not discussing HPV vaccination were examined. RESULTS: Response rates were 82% for pediatricians (364 of 442) and 56% for FP (218 of 387). For 11-12-year-old girls, 60% of pediatricians and 59% of FP strongly recommend HPV vaccine; for boys, 52% and 41%, respectively, strongly recommend it. More than one-half reported ≥25% of parents deferred HPV vaccination. At the 11-12-year well visit, 84% of pediatricians and 75% of FP frequently/always discuss HPV vaccination. Compared with physicians who frequently/always discuss it, those who occasionally/rarely discuss it (18%) were more likely to be FP (adjusted odds ratio [aOR]: 2.0 [95% confidence interval (CI): 1.1-3.5]), be male (aOR: 1.8 [95% CI: 1.1-3.1]), disagree that parents will accept HPV vaccine if discussed with other vaccines (aOR: 2.3 [95% CI: 1.3-4.2]), report that 25% to 49% (aOR: 2.8 [95% CI: 1.1-6.8]) or ≥50% (aOR: 7.8 [95% CI: 3.4-17.6]) of parents defer, and express concern about waning immunity (aOR: 3.4 [95% CI: 1.8-6.4]). CONCLUSIONS: Addressing physicians' perceptions about parental acceptance of HPV vaccine, the possible advantages of discussing HPV vaccination with other recommended vaccines, and concerns about waning immunity could lead to increased vaccination rates. |
Data visualisation in surveillance for injury prevention and control: conceptual bases and case studies
Martinez R , Ordunez P , Soliz PN , Ballesteros MF . Inj Prev 2016 22 Suppl 1 i27-33 BACKGROUND: The complexity of current injury-related health issues demands the use of diverse and massive data sets for comprehensive analyses, and the application of novel methods to communicate data effectively to the public health community, decision-makers and the public. Recent advances in information visualisation, the availability of new visual analytic methods and tools, and progress in information technology provide an opportunity for shaping the next generation of injury surveillance. OBJECTIVE: To introduce the conceptual bases of data visualisation and to propose a visual analytic and visualisation platform for public health surveillance for injury prevention and control. METHODS: The paper introduces the conceptual bases of data visualisation, describes a visual analytic and visualisation platform, and presents two real-world case studies illustrating their application in public health surveillance for injury prevention and control. RESULTS: The visual analytic and visualisation platform is presented as a solution for improving access to heterogeneous data sources, enhancing data exploration and analysis, communicating data effectively, and supporting decision-making. CONCLUSIONS: Application of data visualisation concepts and a visual analytic platform could play a key role in shaping the next generation of injury surveillance. Such a platform could improve data use, analytic capacity, and the ability to communicate findings and key messages effectively. The public health surveillance community is encouraged to identify opportunities to develop and expand its use in injury prevention and control. |
Does screening or providing information on resources for intimate partner violence increase women's knowledge? Findings from a randomized controlled trial
Klevens J , Sadowski LS , Kee R , Garcia D . J Womens Health Issues Care 2015 4 (2) BACKGROUND: Screening for intimate partner violence (IPV) in health care settings might increase women's knowledge or awareness of its frequency and its impact on health. When IPV is disclosed, assuring women it is not their fault should improve their knowledge that IPV is the perpetrator's responsibility. Providing information about IPV resources may also increase women's knowledge about the availability of solutions. METHODS: Women (n=2708) were randomly assigned to one of three groups: (1) partner violence screen plus video referral and list of local partner violence resources if screening was positive (n=909); (2) partner violence resource list only without screen (n=893); and (3) a no-screen, no-partner violence resource list control group (n=898). One year later, 2364 women (87%) were re-contacted and asked questions assessing their knowledge of the frequency of partner violence, its impact on physical and mental health, and the availability of resources to help women experiencing partner violence, as well as their knowledge that it is the perpetrator's fault. RESULTS: Women who were screened and provided with a partner violence resource list did not differ from the control group in knowledge of the frequency of IPV, its impact on physical or mental health, or the availability of IPV services in their community. However, among women who experienced IPV in the year before or the year after enrolling in the trial, those who were provided a list of IPV resources without screening were significantly less likely to know that IPV is not the victim's fault than those in the control or list-plus-screening conditions. CONCLUSIONS: The results of this study suggest that providing information on partner violence resources, with or without asking questions about partner violence, did not result in improved knowledge. |
The problem of carbapenemase producing carbapenem-resistant Enterobacteriaceae detection
Lutgring JD , Limbago BM . J Clin Microbiol 2016 54 (3) 529-34 The emergence and spread of carbapenemase-producing carbapenem-resistant Enterobacteriaceae (CP-CRE) is a significant clinical and public health concern. Reliable detection of CP-CRE is the first step in combating this problem. There are both phenotypic and molecular methods available for CP-CRE detection. There is no single detection method that is ideal for all situations. |
Human cathelicidin, LL-37, inhibits respiratory syncytial virus infection in polarized airway epithelial cells
Harcourt JL , McDonald M , Svoboda P , Pohl J , Tatti K , Haynes LM . BMC Res Notes 2016 9 (1) 11 BACKGROUND: Respiratory syncytial virus (RSV) is a major cause of severe lower respiratory tract illness in young children worldwide. Treatment options for severe RSV disease remain limited and the development of therapeutic treatment strategies remains a priority. LL-37, a small cationic host defense peptide involved in anti-inflammatory and anti-bacterial responses, reduces replication of or infection by multiple viruses, including influenza virus, in vitro, and protects against lethal challenge with influenza virus in vivo. LL-37 also protects against RSV infection of HEp-2 cells in vitro; however, HEp-2 cells are not reflective of polarized airway epithelial cells and respond differently to RSV infection. An air-liquid interface (ALI) Calu-3 model that more closely mimics the human airway epithelium was established. Using this in vitro model, the effectiveness of LL-37 in preventing RSV infection and replication was examined. RESULTS: LL-37, when pre-incubated with virus prior to RSV infection (prophylactic treatment), significantly reduced the level of viral genome detected in infected Calu-3 cells and decreased chemokine expression associated with RSV infection in vitro. In contrast, therapeutic treatment of RSV-infected ALI Calu-3 cultures at 24 h and 3 days post-infection had minimal impact on RSV infection. CONCLUSIONS: Differences in the efficacy of LL-37 at reducing RSV infection under prophylactic and therapeutic conditions may in part be ascribed to differences in the method of peptide exposure. However, the efficacy of LL-37 at reducing RSV infection under prophylactic conditions indicates that further studies examining the efficacy of LL-37 as a small peptide inhibitor of RSV are warranted. |
Accuracy and usefulness of select methods for assessing complete collection of 24-hour urine: A systematic review
John KA , Cogswell ME , Campbell NR , Nowson CA , Legetic B , Hennis AJ , Patel SM . J Clin Hypertens (Greenwich) 2016 18 (5) 456-67 Twenty-four-hour urine collection is the recommended method for estimating sodium intake. To investigate the strengths and limitations of methods used to assess completion of 24-hour urine collection, the authors systematically reviewed the literature on the accuracy and usefulness of methods vs para-aminobenzoic acid (PABA) recovery (referent). The percentage of incomplete collections, based on PABA, was 6% to 47% (n=8 studies). The sensitivity and specificity for identifying incomplete collection using creatinine criteria (n=4 studies) were 6% to 63% and 57% to 99.7%, respectively. The most sensitive method for removing incomplete collections was a creatinine index <0.7. In pooled analysis (≥2 studies), mean urine creatinine excretion and volume were higher among participants with complete collection (P<.05), whereas self-reported collection time did not differ by completion status. Compared with participants with incomplete collection, mean 24-hour sodium excretion was 19.6 mmol higher (n=1781 specimens, 5 studies) in participants with complete collection. Sodium excretion may be underestimated by inclusion of incomplete 24-hour urine collections. None of the current approaches reliably assess completion of 24-hour urine collection. |
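The creatinine-criteria performance figures above are standard two-by-two computations against the PABA referent. A minimal sketch in Python; the counts below are hypothetical, chosen for illustration rather than taken from the review:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity and specificity of a completeness criterion
    judged against the PABA-recovery referent."""
    sensitivity = tp / (tp + fn)  # truly incomplete collections correctly flagged
    specificity = tn / (tn + fp)  # truly complete collections correctly passed
    return sensitivity, specificity

# Hypothetical counts: of 40 collections PABA marks incomplete, a creatinine
# criterion flags 25; of 160 complete collections, it passes 152.
sens, spec = sensitivity_specificity(tp=25, fn=15, tn=152, fp=8)
print(sens, spec)  # 0.625 0.95
```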
Biomarkers of susceptibility: State of the art and implications for occupational exposure to engineered nanomaterials
Iavicoli I , Leso V , Schulte PA . Toxicol Appl Pharmacol 2015 299 112-24 Rapid advances and applications in nanotechnology are expected to result in increasing occupational exposure to nano-sized materials whose health impacts are still not completely understood. Scientific efforts are required to identify hazards from nanomaterials and define risks and precautionary management strategies for exposed workers. In this scenario, the definition of susceptible populations, which may be at increased risk of adverse effects, may be important for risk assessment and management. The aim of this review is to critically examine the available literature to provide a comprehensive overview of susceptibility aspects potentially underlying heterogeneous responses to workplace nanomaterial exposure. Genetic, genotoxic and epigenetic alterations induced by nanomaterials in experimental studies were assessed with respect to their possible function as determinants of susceptibility. Additionally, the role of host factors, i.e. age, gender, and pathological conditions, potentially affecting nanomaterial toxicokinetics and health impacts, was also analysed. Overall, this review provides useful information on the nanomaterial mode of action in order to identify potentially sensitive, specific susceptibility biomarkers to be validated in occupational settings and addressed in risk assessment processes. The findings of this review are also important to guide future research toward a deeper characterization of nanomaterial susceptibility in order to define adequate risk communication strategies. Ultimately, the identification and use of susceptibility factors in workplace settings raises both scientific and ethical issues that need to be addressed. |
Current understanding of interactions between nanoparticles and the immune system
Dobrovolskaia MA , Shurin M , Shvedova AA . Toxicol Appl Pharmacol 2015 299 78-89 The delivery of drugs, antigens, and imaging agents benefits from using nanotechnology-based carriers. The successful translation of nanoformulations to the clinic involves thorough assessment of their safety profiles, which, among other end-points, includes evaluation of immunotoxicity. The past decade of research focusing on nanoparticle interaction with the immune system has been fruitful in terms of understanding the basics of nanoparticle immunocompatibility, developing a bioanalytical infrastructure to screen for nanoparticle-mediated immune reactions, beginning to uncover the mechanisms of nanoparticle immunotoxicity, and utilizing current knowledge about the structure-activity relationship between nanoparticles' physicochemical properties and their effects on the immune system to guide safe drug delivery. In the present review, we focus on the most prominent pieces of the nanoparticle-immune system puzzle and discuss the achievements, disappointments, and lessons learned over the past 15 years of research on the immunotoxicity of engineered nanomaterials. |
Recognizing excellence in maternal and child health (MCH) epidemiology: The 2014 national MCH epidemiology awards
Kroelinger CD , Vladutiu CJ , Jones JR . Matern Child Health J 2016 20 (4) 760-8 PURPOSE: The impact of programs, policies, and practices developed by professionals in the field of maternal and child health (MCH) epidemiology is highlighted biennially by 16 national MCH agencies and organizations, collectively known as the Coalition for Excellence in MCH Epidemiology. DESCRIPTION: In September 2014, multiple leading agencies in the field of MCH partnered to host the national CityMatCH Leadership and MCH Epidemiology Conference in Phoenix, Arizona. The conference offered opportunities for peer exchange; presentation of new scientific methodologies, programs, and policies; dialogue on changes in the MCH field; and discussion of emerging MCH issues relevant to the work of local, state, and national MCH professionals. During the conference, the National MCH Epidemiology Awards were presented to individuals, teams, institutions, and leaders for significantly contributing to the improved health of women, children, and families. ASSESSMENT: During the conference, the Coalition presented seven deserving health researchers and research groups with national awards in the areas of advancing knowledge, effective practice, outstanding leadership, young professional achievement, and lifetime achievement. This article highlights the accomplishments of these national-level awardees. CONCLUSION: Recognition of deserving professionals strengthens the field of MCH epidemiology and sets the standard for exceptional research, mentoring, and practice. |
Volumetric measurement of rock movement using photogrammetry
Benton DJ , Iverson SR , Martin LA , Johnson JC , Raffaldi MJ . Int J Min Sci Technol 2015 26 (1) 123-130 The NIOSH ground control safety research program in Spokane, Washington, is exploring applications of photogrammetry to rock mass and support monitoring. This paper describes two ways photogrammetric techniques are being used. First, photogrammetric data from laboratory testing are being used to correlate energy input and support deformation. This information can be used to infer remaining support toughness after ground deformation events. This technique is also demonstrated in a field application. Second, field photogrammetric data are compared to crackmeter data from a deep underground mine. Accuracies were found to average 8 mm, but the technique has produced results within 0.2 mm of true displacement, as measured by crackmeters. Application of these techniques consists of monitoring overall fault activity by tracking multiple points around the crackmeter. A case study is provided in which a crackmeter is clearly shown to have provided insufficient information regarding overall fault ground deformation. Photogrammetry is proving to be a useful ground monitoring tool due to its unobtrusiveness and ease of use. |
Sodium content of popular commercially processed and restaurant foods in the United States
Ahuja JKC , Wasswa-Kintu S , Haytowitz DB , Daniel M , Thomas R , Showell B , Nickle M , Roseland JM , Gunn J , Cogswell M , Pehrsson PR . Prev Med Rep 2015 2 962-967 PURPOSE: The purpose of this study was to provide baseline estimates of sodium levels in 125 popular, sodium-contributing, commercially processed and restaurant foods in the U.S., to assess future changes as manufacturers reformulate foods. METHODS: In 2010-2013, we obtained ~ 5200 sample units from up to 12 locations and analyzed 1654 composites for sodium and related nutrients (potassium, total dietary fiber, total and saturated fat, and total sugar), as part of the U.S. Department of Agriculture-led sodium-monitoring program. We determined sodium content as mg/100 g, mg/serving, and mg/kcal and compared them against U.S. Food and Drug Administration's (FDA) sodium limits for "low" and "healthy" claims and to the optimal sodium level of < 1.1 mg/kcal, extrapolating from the Healthy Eating Index-2010. RESULTS: Results from this study represent the baseline nutrient values to use in assessing future changes as foods are reformulated for sodium reduction. Sodium levels in over half (69 of 125) of the foods, including all main dishes and most Sentinel Foods from fast-food outlets or restaurants (29 of 33 foods), exceeded the FDA sodium limit for using the claim "healthy". Only 13 of 125 foods had sodium values below 1.1 mg/kcal. We observed a wide range of sodium content among similar food types and brands. CONCLUSIONS: Current sodium levels in commercially processed and restaurant foods in the U.S. are high and variable. Targeted benchmarks and increased awareness of high sodium content and variability in foods would support reduction of sodium intakes in the U.S. |
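The 1.1 mg/kcal benchmark above is a simple sodium-density ratio per serving. A minimal sketch; the nutrient values in the example are hypothetical, not figures from the study:

```python
def sodium_density(sodium_mg_per_serving, kcal_per_serving):
    """Sodium density in mg per kcal, the benchmark used above
    (optimal < 1.1 mg/kcal, extrapolated from the Healthy Eating Index-2010)."""
    return sodium_mg_per_serving / kcal_per_serving

# Hypothetical food: 480 mg sodium in a 300 kcal serving.
density = sodium_density(480, 300)
print(density, density < 1.1)  # 1.6 False -> exceeds the optimal level
```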
Serious injury and fatality investigations involving pneumatic nail guns, 1985-2012
Lowe BD , Albers JT , Hudock SD , Krieg EF . Am J Ind Med 2016 59 (2) 164-74 BACKGROUND: This article examines serious and fatal pneumatic nail gun (PNG) injury investigations for workplace, tool design, and human factors relevant to causation, and the resulting OS&H authorities' responses in terms of citations and penalties. METHODS: The U.S. Occupational Safety and Health Administration (OSHA) database of Fatality and Catastrophe Investigation Summaries (F&CIS) was reviewed (1985-2012) to identify n = 258 PNG accidents. RESULTS: 79.8% of investigations, and 100% of fatalities, occurred in the construction industry. Between 53% and 71% of injuries appear to have been preventable had a safer sequential-trigger tool been used. Citations and monetary penalties were related to injury severity, body part injured, disabling of safety devices, and insufficient personal protective equipment (PPE). CONCLUSIONS: Differences may exist between construction and other industries in investigators' interpretations of PNG injury causation and the resulting citations/penalties. Violations of PPE standards were penalized most severely, yet the preventive effect of PPE would likely have been less than that of a safer sequential trigger. |
Bronchiolitis by any other name: Describing bronchiolar disorders from inhalational exposures
Cummings KJ , Kreiss K , Roggli VL . Ann Am Thorac Soc 2016 13 (1) 143-4 We read with interest the recent article by Ryerson and colleagues about the potential reversibility of a bronchiolar disorder they termed “fibrosing bronchiolitis” (1). The three cases described by the authors are unrelated and of unclear etiology, but may have infectious or inhalational causes. The patients had centrilobular nodularity on chest computed tomography and bronchiolar luminal narrowing by granulation tissue on biopsy. They were treated with immunosuppressive agents, including corticosteroids, and subsequently had clinical and functional improvement. The authors speculated that without immunosuppressive therapy, the disease may have progressed to constrictive bronchiolitis. | The finding of polypoid granulation tissue within bronchioles has been described historically under the nonspecific term “bronchiolitis obliterans” (2). Lowry and Schuman’s classic report of an inhalational bronchiolar disease, silo-fillers’ disease, from exposure to oxides of nitrogen, was notable for a diffuse nodular infiltrate on chest imaging “that cannot be distinguished from acute miliary tuberculosis,” and a microscopic pattern of bronchiolar filling by “a rather cellular fibrinous exudate” (3). Lowry and Schuman noted that “organization of this adherent plug of fibrin by ingrowth of fibroblasts from the bronchial walls tends eventually to occlude the lumen” (3). |
Re: Bias in the proportionate mortality ratio analysis of small study populations: A case on analyses of radiation and mesothelioma
Hein MJ , Schubauer-Berigan MK . Int J Radiat Biol 2015 91 (11) 908-10 Studies of mortality in occupational groups frequently use proportionate mortality ratio (PMR) analyses (Steenland et al. 1990). In the simplest case, the PMR is the ratio of the proportion of deaths due to a specific cause in the study population to the proportion of deaths due to the same cause in a referent population. Software packages such as the NIOSH Life-Table Analysis System (LTAS.NET) can estimate PMRs that are adjusted for gender, race, age, and calendar year (Schubauer-Berigan et al. 2011). Zhou (2014) recently claimed that the PMR reported by the NIOSH LTAS.NET program is biased under certain conditions - namely, when the number of deaths available for analysis is small relative to the number of cause-of-death categories, resulting in categories with zero observed deaths. To remove the bias, Zhou proposed an adjustment that involved excluding deaths from the referent population for categories with zero deaths in the sample when estimating the PMR. The objective of this letter is to point out the faulty logic in Zhou's proposed adjustment and assure users of LTAS.NET that the PMR estimates produced by LTAS.NET are not biased. |
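In the simplest case described above, the PMR is a ratio of two cause-specific death proportions. A minimal sketch with hypothetical counts; this is the crude form only, not the gender-, race-, age-, and calendar-year-adjusted estimate that LTAS.NET produces:

```python
def pmr(study_cause_deaths, study_total_deaths,
        ref_cause_deaths, ref_total_deaths):
    """Proportionate mortality ratio: the proportion of deaths from a given
    cause in the study population divided by the same proportion in the
    referent population."""
    study_prop = study_cause_deaths / study_total_deaths
    ref_prop = ref_cause_deaths / ref_total_deaths
    return study_prop / ref_prop

# Hypothetical counts: 6 of 120 study deaths from the cause of interest,
# versus 2,500 of 100,000 referent deaths.
print(pmr(6, 120, 2500, 100000))  # 2.0 -> twice the expected proportion
```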
Reported traumatic injuries among West Coast Dungeness crab fishermen, 2002-2014
Case S , Bovbjerg V , Lucas D , Syron L , Kincl L . Int Marit Health 2015 66 (4) 207-10 BACKGROUND: Commercial fishing is a high-risk occupation. The West Coast Dungeness crab fishery has a high fatality rate; however, nonfatal injuries have not been previously studied. The purpose of this report was to describe the characteristics of fatal and nonfatal traumatic occupational injuries and associated hazards in this fleet during 2002-2014. MATERIALS AND METHODS: Data on fatal injuries were obtained from a surveillance system managed by the National Institute for Occupational Safety and Health. Data on nonfatal injuries were manually abstracted from Coast Guard investigation reports and entered into a study database. Descriptive statistics were used to characterise demographics, injury characteristics, and work processes performed. RESULTS: Twenty-eight fatal and 45 nonfatal injuries were reported between 2002 and 2014 in the Dungeness crab fleet. Most fatalities were due to vessel disasters, and many nonfatal injuries occurred on deck when fishermen were working with gear, particularly when hauling the gear (47%). The most frequently reported injuries affected the upper extremities (48%), and fractures were the most commonly reported injury type (40%). The overall fatality rate during this time period was 209 per 100,000 full-time equivalent workers and the rate of nonfatal injury was 3.4 per 1,000 full-time equivalent workers. CONCLUSIONS: Dungeness crab fishermen are at relatively high risk for fatal injuries. Nonfatal injury data were limited to what was reported, which hampers efforts to accurately estimate nonfatal injury risk and understand fishing hazards. Further research is needed to identify work tasks and other hazards that cause nonfatal injuries in this fleet. Engaging fishermen directly may help develop approaches for injury prevention. |
Hexavalent chromium and isocyanate exposures during military aircraft painting under crossflow ventilation
Bennett JS , Marlow DA , Nourian F , Breay J , Hammond D . J Occup Environ Hyg 2015 13 (5) 1-50 The performance of exposure control systems was investigated in an aircraft painting hangar. The ability of the ventilation system and respiratory protection program to limit worker exposures was examined through air sampling during painting of F/A-18C/D strike fighter aircraft, in four field surveys. Air velocities were measured across the supply filter, exhaust filter, and hangar midplane under crossflow ventilation. Air sampling conducted during painting process phases (wipe-down, primer spraying, and topcoat spraying) encompassed volatile organic compounds, total particulate matter, Cr[VI], metals, nitroethane, and hexamethylene diisocyanate, for two worker groups: sprayers and sprayer helpers ("hosemen"). One of six methyl ethyl ketone samples and two of six methyl isobutyl ketone samples exceeded the short-term exposure limits of 300 and 75 ppm, with means of 57 ppm and 63 ppm, respectively. All 12 Cr[VI] 8-hr time-weighted averages exceeded the recommended exposure limit of 1 microg/m3, 11 of 12 exceeded the permissible exposure limit of 5 microg/m3, and 7 of 12 exceeded the threshold limit value of 10 microg/m3, with means of 38 microg/m3 for sprayers and 8.3 microg/m3 for hosemen. Hexamethylene diisocyanate means were 5.95 microg/m3 for sprayers and 0.645 microg/m3 for hosemen. For total reactive isocyanate group (the total of monomer and oligomer as NCO group mass), six of 15 personal samples exceeded the United Kingdom Health and Safety Executive workplace exposure limit of 20 microg/m3, with means of 50.9 microg/m3 for sprayers and 7.29 microg/m3 for hosemen. Several exposure limits were exceeded, reinforcing continued use of personal protective equipment. The supply rate, 94.4 m3/s (200,000 cfm), produced a velocity of 8.58 m/s (157 fpm) at the supply filter, while the exhaust rate, 68.7 m3/s (146,000 cfm), drew 1.34 m/s (264 fpm) at the exhaust filter.
Midway between the supply and exhaust locations, the velocity was 0.528 m/s (104 fpm). The supply rate exceeding the exhaust rate created recirculation, turbulence, and fugitive emissions while wasting energy. Smoke releases showed more effective ventilation here than in other aircraft painting facilities, a finding relevant to the technical feasibility of exposure control. |
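The paired SI/imperial figures in this entry follow the standard flow and velocity conversions (1 m3/s = 2118.88 cfm; 1 m/s = 196.85 fpm). A quick consistency check on the reported values; the helper names are illustrative:

```python
def m3s_to_cfm(q_m3s):
    """Volumetric flow: cubic meters per second to cubic feet per minute."""
    return q_m3s * 35.3147 * 60  # 1 m3 = 35.3147 ft3; 60 s per minute

def ms_to_fpm(v_ms):
    """Velocity: meters per second to feet per minute."""
    return v_ms * 196.85  # 1 m = 3.28084 ft; 60 s per minute

print(round(m3s_to_cfm(94.4)))  # 200022, reported as 200,000 cfm
print(round(ms_to_fpm(1.34)))   # 264, matching the exhaust-filter figure
print(round(ms_to_fpm(0.528)))  # 104, matching the midplane figure
```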
Lung transplantation is increasingly common among patients with coal workers' pneumoconiosis
Blackley DJ , Halldin CN , Cummings KJ , Laney AS . Am J Ind Med 2016 59 (3) 175-7 BACKGROUND: The prevalence of coal workers' pneumoconiosis (CWP) in U.S. coal miners has increased, and severe presentations are increasingly common. METHODS: We describe trends in lung transplantation during 1996-2014 for recipients with a primary diagnosis of CWP or pneumoconiosis unspecified, and we summarize recipient characteristics and estimate survival. RESULTS: A total of 47 transplants were included; nearly three-quarters were performed during 2008-2014. All recipients were male, 96% were white, and the mean age was 56 years. Mean FEV1% was 35%; mean FVC% was 53%. Mean time on a waitlist was 155 days, and 60% of transplants were bilateral. Median survival was 3.7 years. CONCLUSIONS: These transplants reflect the use of a scarce resource for an entirely preventable disease and highlight the need for enhanced efforts to reduce coal mine dust exposures. |
Dynamic failure in coal seams: Implications of coal composition for bump susceptibility
Lawson H , Weakley A , Miller A . Int J Min Sci Technol 2015 26 (1) 3-8 As a contributing factor in the dynamic failure (bumping) of coal pillars, a bump-prone coal seam has been described as one that is "uncleated or poorly cleated, strong...that sustains high stresses." Despite extensive research regarding engineering controls to help reduce the risk for coal bumps, there is a paucity of research related to the properties of coal itself and how those properties might contribute to the mechanics of failures. Geographic distribution of reportable dynamic failure events reveals a highly localized clustering of incidents despite widespread mining activities. This suggests that unique, contributing geologic characteristics exist within these regions that are less prevalent elsewhere. To investigate a new approach for identifying coal characteristics that might lead to bumping, a principal component analysis (PCA) was performed on 306 coal records from the Pennsylvania State Coal Sample database to determine which characteristics were most closely linked with a positive history of reportable bumping. Selected material properties from the data records for coal samples were chosen as variables for the PCA and included petrographic, elemental, and molecular properties. Results of the PCA suggest a clear correlation between low organic sulfur content and the occurrence of dynamic failure, and a secondary correlation between volatile matter and dynamic failure phenomena. The ratio of volatile matter to sulfur in the samples shows strong correlation with bump-prone regions, with a minimum threshold value of approximately 20, while correlations determined for other petrographic and elemental variables were more ambiguous. Results suggest that the composition of the coal itself is directly linked to how likely a coal is to have experienced a reportable dynamic failure event. 
These compositional controls are distinct from other previously established engineering and geologic criteria and represent a missing piece to the bump prediction puzzle. |
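A PCA of the kind described above can be sketched in a few lines. The data below are simulated stand-ins for the 306 Pennsylvania State Coal Sample records, and the number of variables is an assumption for illustration, not the study's variable set:

```python
import numpy as np

# Simulated stand-in for 306 coal records with 5 material properties
# (e.g., organic sulfur, volatile matter); values are random, not study data.
rng = np.random.default_rng(0)
X = rng.normal(size=(306, 5))

# PCA via eigendecomposition of the covariance of standardized variables,
# the conventional choice when variables are on different scales.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
cov = np.cov(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]          # sort components by variance explained
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

scores = Z @ eigvecs[:, :2]                # project samples onto first two PCs
explained = eigvals / eigvals.sum()        # fraction of variance per component
print(scores.shape)                        # (306, 2)
```

In the study, sample scores and loadings on the leading components would then be compared against each record's bump history to identify which properties separate bump-prone from non-bump-prone coals.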
Strategies and approaches to vector control in nine malaria-eliminating countries: a cross-case study analysis
Smith Gueye C , Newby G , Gosling RD , Whittaker MA , Chandramohan D , Slutsker L , Tanner M . Malar J 2016 15 (1) 2 BACKGROUND: There has been progress towards malaria elimination in the last decade. In response, WHO launched the Global Technical Strategy (GTS), in which vector surveillance and control play important roles. Country experiences in the Eliminating Malaria Case Study Series were reviewed to identify success factors on the road to elimination using a cross-case study analytic approach. METHODS: Reports were included in the analysis if final English-language draft reports or publications were available at the time of analysis (Bhutan, Cape Verde, Malaysia, Mauritius, Namibia, Philippines, Sri Lanka, Turkey, Turkmenistan). A conceptual framework for vector control in malaria elimination was developed, reviewed, and formatted as a matrix, and case study data were extracted and entered into the matrix. A workshop was convened during which participants reviewed the case studies and matrices and arrived at a consensus on the evidence and lessons. The framework was revised, and a second round of data extraction, synthesis and summary of the case study reports was conducted. RESULTS: Countries implemented a range of vector control interventions. Most programmes aligned with integrated vector management; however, its impact was not well articulated. All programmes conducted entomological surveillance, but the response (i.e., stratification and targeting of interventions, outbreak forecasting and strategy) was limited or not described. Indoor residual spraying (IRS) was commonly used. There were several examples of severe reductions or halting of IRS coverage and subsequent resurgence of malaria, in which funding and operational constraints and poor implementation played a role. Bed nets were commonly used by most programmes; coverage and effectiveness were either not measured or not articulated.
Larval control was an important intervention for several countries, preventing re-introduction; however, coverage and impact on incidence were not described. Across all interventions, coverage indicators were incomparable, and the rationale for which tools were used and which were not appeared to be a function of funding availability, operational issues and cost rather than evidence of effectiveness in reducing incidence. CONCLUSIONS: More work is required to fill gaps in programme guidance, clarify the best methods for choosing and targeting vector control interventions, and support measurement of the cost, cost-effectiveness and cost-benefit of vector surveillance and control interventions. |
Targeting indoor residual spraying for malaria using epidemiological data: a case study of the Zambia experience
Pinchoff J , Larsen DA , Renn S , Pollard D , Fornadel C , Maire M , Sikaala C , Sinyangwe C , Winters B , Bridges DJ , Winters AM . Malar J 2016 15 (1) 11 BACKGROUND: In Zambia and other sub-Saharan African countries affected by ongoing malaria transmission, indoor residual spraying (IRS) for malaria prevention has typically been implemented over large areas, e.g., district-wide, and targeted to peri-urban areas. However, there is a recent shift in some countries, including Zambia, towards the adoption of a more strategic and targeted IRS approach, in coordination with increased emphasis on universal coverage of long-lasting insecticidal nets (LLINs) and effective insecticide resistance management. A true targeted approach would deliver IRS to sub-district areas identified as high-risk, with the goal of maximizing the prevention of malaria cases and deaths. RESULTS: Together with the Government of the Republic of Zambia, a new methodology was developed applying geographic information systems and satellite imagery to support a targeted IRS campaign during the 2014 spray season using health management information system data. DISCUSSION/CONCLUSION: This case study focuses on the developed methodology while also highlighting the significant research gaps which must be filled to guide countries on the most effective strategy for IRS targeting in the context of universal LLIN coverage and evolving insecticide resistance. |
Pharmacokinetics of mefloquine and its effect on sulfamethoxazole and trimethoprim steady-state blood levels in intermittent preventive treatment (IPTp) of pregnant HIV-infected women in Kenya
Green M , Otieno K , Katana A , Slutsker L , Kariuki S , Ouma P , Gonzalez R , Menendez C , Ter Kuile F , Desai M . Malar J 2016 15 (1) 7 BACKGROUND: Intermittent preventive treatment in pregnancy with sulfadoxine/pyrimethamine is contra-indicated in HIV-positive pregnant women receiving sulfamethoxazole/trimethoprim prophylaxis. Since mefloquine is being considered as a replacement for sulfadoxine/pyrimethamine in this vulnerable population, an investigation of the pharmacokinetic interactions of mefloquine, sulfamethoxazole and trimethoprim in pregnant, HIV-infected women was performed. METHODS: A double-blinded, placebo-controlled study was conducted with 124 HIV-infected, pregnant women on a standard regimen of sulfamethoxazole/trimethoprim prophylaxis. Seventy-two subjects received three doses of mefloquine (15 mg/kg) at monthly intervals. Dried blood spots were collected from both the placebo and mefloquine arms 4 to 672 h post-administration and on day 7 following a second monthly dose of mefloquine. A novel high-performance liquid chromatographic method was developed to simultaneously measure mefloquine, sulfamethoxazole and trimethoprim from each blood spot. Non-compartmental methods using a naive-pooled data approach were used to determine mefloquine pharmacokinetic parameters. RESULTS: Sulfamethoxazole/trimethoprim prophylaxis did not noticeably influence mefloquine pharmacokinetics relative to reported values. The mefloquine half-life, observed clearance (CL/f), and area-under-the-curve (AUC0-->infinity) were 12.0 days, 0.035 l/h/kg and 431 microg-h/ml, respectively. Although trimethoprim steady-state levels were not significantly different between arms, sulfamethoxazole levels showed a significant 53% decrease after mefloquine administration relative to the placebo group, returning to pre-dose levels at 28 days.
CONCLUSIONS: Although a transient decrease in sulfamethoxazole levels was observed, there was no change in hospital admissions due to secondary bacterial infections, implying that mefloquine may have provided antimicrobial protection. |
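The reported non-compartmental parameters are internally consistent via the standard relationship CL/f = Dose/AUC. A minimal check in Python; unit handling is the only subtlety, and the function name is illustrative:

```python
def clearance_l_per_h_per_kg(dose_mg_per_kg, auc_ug_h_per_ml):
    """Apparent oral clearance CL/f = Dose / AUC (non-compartmental relationship)."""
    dose_ug_per_kg = dose_mg_per_kg * 1000   # mg -> micrograms
    auc_ug_h_per_l = auc_ug_h_per_ml * 1000  # per ml -> per liter
    return dose_ug_per_kg / auc_ug_h_per_l   # L/h/kg

# Reported: 15 mg/kg dose, AUC = 431 microg-h/ml -> CL/f of about 0.035 L/h/kg
print(round(clearance_l_per_h_per_kg(15, 431), 3))  # 0.035
```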
The effect of indoor residual spraying on the prevalence of malaria parasite infection, clinical malaria and anemia in an area of perennial transmission and moderate coverage of insecticide treated nets in Western Kenya
Gimnig JE , Otieno P , Were V , Marwanga D , Abong'o D , Wiegand R , Williamson J , Wolkon A , Zhou Y , Bayoh MN , Lobo NF , Laserson K , Kariuki S , Hamel MJ . PLoS One 2016 11 (1) e0145282 BACKGROUND: Insecticide treated nets (ITNs) and indoor residual spraying (IRS) have been scaled up for malaria prevention in sub-Saharan Africa. However, there are few studies on the benefit of implementing IRS in areas with moderate to high coverage of ITNs. We evaluated the impact of an IRS program on malaria-related outcomes in western Kenya, an area of intense perennial malaria transmission and moderate ITN coverage (55-65% use of any net the previous night). METHODS: The Kenya Division of Malaria Control, with support from the US President's Malaria Initiative, conducted IRS in one lowland endemic district with moderate coverage of ITNs. Surveys were conducted in the IRS district and a neighboring district before IRS, after one round of IRS in July-Sept 2008, and after a second round of IRS in April-May 2009. IRS was conducted with pyrethroid insecticides. At each survey, 30 clusters were selected for sampling and, within each cluster, 12 compounds were randomly selected. The primary outcomes, measured in all residents of the randomly selected compounds, were malaria parasitemia, clinical malaria (P. falciparum infection plus a history of fever) and anemia (Hb < 8 g/dL). At each survey round, individuals from the IRS district were matched to those from the non-IRS district using propensity scores, and multivariate logistic regression models were constructed based on the matched dataset. RESULTS: At baseline and after one round of IRS, there were no differences between the two districts in the prevalence of malaria parasitemia, clinical malaria or anemia. After two rounds of IRS, the prevalence of malaria parasitemia was 6.4% in the IRS district compared to 16.7% in the comparison district (OR = 0.36, 95% CI = 0.22-0.59, p<0.001).
The prevalence of clinical malaria was also lower in the IRS district (1.8% vs. 4.9%, OR = 0.37, 95% CI = 0.20-0.68, p = 0.001). The prevalence of anemia was lower in the IRS district but only in children under 5 years of age (2.8% vs. 9.3%, OR = 0.30, 95% CI = 0.13-0.71, p = 0.006). Multivariate models incorporating both IRS and ITNs indicated that both had an impact on malaria parasitemia and clinical malaria but the independent effect of ITNs was reduced in the district that had received two rounds of IRS. There was no statistically significant independent effect of ITNs on the prevalence of anemia in any age group. CONCLUSIONS: Both IRS and ITNs are effective tools for reducing malaria burden and when implemented in an area of moderate to high transmission with moderate ITN coverage, there may be an added benefit of IRS. The value of adding ITNs to IRS is less clear as their benefits may be masked by IRS. Additional monitoring of malaria control programs that implement ITNs and IRS concurrently is encouraged to better understand how to maximize the benefits of both interventions, particularly in the context of increasing pyrethroid resistance. |
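The odds ratios above come from multivariate models fit to propensity-matched data, but a crude odds ratio computed directly from the reported prevalences lands close to them. A minimal sketch:

```python
def odds_ratio(p_exposed, p_unexposed):
    """Crude (unadjusted) odds ratio from two prevalence proportions."""
    odds_exposed = p_exposed / (1 - p_exposed)
    odds_unexposed = p_unexposed / (1 - p_unexposed)
    return odds_exposed / odds_unexposed

# Parasitemia prevalence after two IRS rounds: 6.4% (IRS district)
# vs 16.7% (comparison district).
print(round(odds_ratio(0.064, 0.167), 2))  # 0.34, near the adjusted OR of 0.36
```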
Responsive leadership in social services: A practical approach for optimizing engagement and performance
Lewis S . Health Promot Pract 2015 17 (2) 169-171 Responsive Leadership in Social Services: A Practical Approach for Optimizing Engagement and Performance emphasizes the importance of effective supervision as a key component of quality leadership. The Responsive Leadership Approach considers employee needs, values, goals, and strengths to optimize worker performance. It is posited that when leaders integrate and operationalize the meaning embedded in the "employee story," they improve employee engagement and work performance as well as advance their own leadership ability. Discovery tools such as the Key Performance Motivators Scale, Preferred Leadership Profile, and Strengths Index are provided. The impact of operationalizing important values and using a strengths-based approach on organizational climate and employee morale is explored. Active listening and empathic response are discussed as practical methods to discover employee meaning. Techniques for dealing with "difficult" employees and undesirable attitudes and behaviors are described. This book is a valuable resource for developing the leadership capacities of first-time and experienced health and social services supervisors. |
The 3 buckets of prevention
Auerbach J . J Public Health Manag Pract 2015 22 (3) 215-8 The US health care system is in a time of unprecedented change. The expansion of insurance coverage, redesign of the reimbursement systems, and growing influence of patient-centered medical homes and accountable care organizations all bring opportunities for those interested in the prevention of disease, injury, and premature death for entire communities as well as individual patients [1,2]. It is, in short, a time when public health can come to the fore. | Public health practitioners can assist clinical providers in assuring that newly insured people receive services that promote health and do not simply treat illness. They can help insurers identify the quality measures and incentives that yield better health outcomes and control costs. They can provide evidence of effective interventions that were previously funded by public health grants but can now be brought to scale if paid for by the health care sector. And they can even point to ways to complement traditional health care treatment with community-oriented population health measures. | It is obvious that none of this will come easily. Nonetheless, at this moment—unprecedented in the careers of most public health practitioners, and of uncertain duration—it is critically important to try. |
Research gaps identified during the 2014 update of the WHO Medical Eligibility Criteria for Contraceptive Use and Selected Practice Recommendations for Contraceptive Use
Dragoman M , Jatlaoui T , Nanda K , Curtis KM , Gaffield ME . Contraception 2015 94 (3) 195-201 Universal access to safe and effective contraception is an important public health goal. Family planning and prevention of unintended pregnancy are essential to securing the well-being and autonomy of individuals, while supporting the health and development of communities [1]. The World Health Organization (WHO) recently undertook a process to update its global guidance on “who” can use contraception safely and “how” to use contraception safely and effectively to generate the fifth edition of the WHO Medical Eligibility Criteria for Contraceptive Use (MEC) and the third edition of the WHO Selected Practice Recommendations for Contraceptive Use (SPR). Overall, the MEC demonstrates that contraception is remarkably safe for most people; at least one highly effective contraceptive method is assigned a category “1” or “2” across the majority of conditions in the guidance, indicating no restrictions on use or that the advantages of using a particular method generally outweigh the theoretical or proven risks of use. Once a medically appropriate method is identified, the SPR offers critical guidance on safe and effective use, important for contraceptive management and service delivery. The major goal for producing these evidence-based recommendations is to help improve access to and strengthen the quality of family planning services worldwide. | While these recommendations reflect a rigorous synthesis and interpretation of the best evidence to date and contribute significantly to medical and public health knowledge around the world, a number of recommendations in both the MEC and SPR are grounded in limited or no direct evidence. With respect to safety, it would be unethical to conduct research on whether exposure to a contraceptive method worsens a disease when there are significant theoretical concerns (e.g., combined hormonal contraceptive use among women with current breast cancer).
In other instances, no or very limited published literature directly reports whether or not use of a given method is associated with an important health risk or how best to offer a contraceptive service. In the absence of direct evidence, indirect evidence and expert opinion inform assessments. For example, extrapolating what is known about the safety of contraceptive methods in healthy women to women with medical conditions can be helpful, while taking into account relevant disease processes and how they intersect with what is known generally about the characteristics of methods, associated side effects and potential complications. Even when direct evidence is available, methodological flaws can limit interpretation. The body and certainty of the evidence underpinning the recommendations have increased and improved over time in response to expanded and improved contraceptive research. |
Reproducible Research Practices and Transparency across the Biomedical Literature.
Iqbal SA , Wallach JD , Khoury MJ , Schully SD , Ioannidis JP . PLoS Biol 2016 14 (1) e1002333 There is a growing movement to encourage reproducibility and transparency practices in the scientific community, including public access to raw data and protocols, the conduct of replication studies, systematic integration of evidence in systematic reviews, and the documentation of funding and potential conflicts of interest. In this survey, we assessed the current status of reproducibility and transparency addressing these indicators in a random sample of 441 biomedical journal articles published in 2000-2014. Only one study provided a full protocol and none made all raw data directly available. Replication studies were rare (n = 4), and only 16 studies had their data included in a subsequent systematic review or meta-analysis. The majority of studies did not mention anything about funding or conflicts of interest. The percentage of articles with no statement of conflict decreased substantially between 2000 and 2014 (from 94.4% in 2000 to 34.6% in 2014); the percentage of articles reporting statements of conflicts (0% in 2000, 15.4% in 2014) or no conflicts (5.6% in 2000, 50.0% in 2014) increased. Articles published in journals in the clinical medicine category versus other fields were almost twice as likely not to include any information on funding and to have private funding. This study provides baseline data to compare future progress in improving these indicators in the scientific literature. |
The Bayes factor for case-control studies with misclassified data
Lee T . J Mod Appl Stat Methods 2015 14 (2) 201-218 The question of how to test if collected data for a case-control study are misclassified was investigated. A mixed approach was employed to calculate the Bayes factor to assess the validity of the null hypothesis of no misclassification. A real-world data set on the association between lung cancer and smoking status was used as an example to illustrate the proposed method. |
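The abstract above does not reproduce the paper's mixed approach. As a generic illustration only of how a Bayes factor weighs competing hypotheses on 2x2 case-control data, the sketch below compares "different exposure rates" (H1) against "a common exposure rate" (H0), assuming Beta(1, 1) priors on the exposure probabilities; the function names, priors, and example counts are illustrative assumptions, not taken from Lee (2015).

```python
from math import lgamma, exp

def log_marginal(k, n, a=1.0, b=1.0):
    # Log marginal likelihood of k successes in n trials under a
    # Binomial likelihood with a Beta(a, b) prior on the success
    # probability: log[B(k + a, n - k + b) / B(a, b)].
    # (The binomial coefficient is omitted; it cancels in the ratio.)
    return (lgamma(a + b) - lgamma(a) - lgamma(b)
            + lgamma(k + a) + lgamma(n - k + b) - lgamma(n + a + b))

def bayes_factor_2x2(k_case, n_case, k_ctrl, n_ctrl):
    """BF10: evidence for different exposure rates in cases vs. controls
    (H1) relative to a single common exposure rate (H0)."""
    log_h1 = log_marginal(k_case, n_case) + log_marginal(k_ctrl, n_ctrl)
    log_h0 = log_marginal(k_case + k_ctrl, n_case + n_ctrl)
    return exp(log_h1 - log_h0)

# Hypothetical counts: 70/100 cases exposed vs. 30/100 controls exposed
# yields BF10 >> 1 (strong evidence for H1); identical exposure rates
# (50/100 in both groups) yield BF10 < 1 (data favor H0).
bf_assoc = bayes_factor_2x2(70, 100, 30, 100)
bf_null = bayes_factor_2x2(50, 100, 50, 100)
```

Working in log space with `lgamma` keeps the computation stable for large counts, where the beta functions themselves would underflow.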
Vital Signs: Exposure to electronic cigarette advertising among middle school and high school students - United States, 2014
Singh T , Marynak K , Arrazola RA , Cox S , Rolle IV , King BA . MMWR Morb Mortal Wkly Rep 2016 64 (52) 1403-8 INTRODUCTION: Electronic cigarette (e-cigarette) use has increased considerably among U.S. youths since 2011. Tobacco use among youths in any form, including e-cigarettes, is unsafe. Tobacco product advertising can persuade youths to start using tobacco. CDC analyzed data from the 2014 National Youth Tobacco Survey to estimate the prevalence of e-cigarette advertisement exposure among U.S. middle school and high school students. METHODS: The 2014 National Youth Tobacco Survey, a school-based survey of middle school and high school students in grades 6-12, included 22,007 participants. Exposure to e-cigarette advertisements (categorized as "sometimes," "most of the time," or "always") was assessed for four sources: retail stores, Internet, TV and movies, and newspapers and magazines. Weighted exposure estimates were assessed overall and by school type, sex, race/ethnicity, and grade. RESULTS: In 2014, 68.9% of middle and high school students (18.3 million) were exposed to e-cigarette advertisements from at least one source. Among middle school students, exposure was highest for retail stores (52.8%), followed by Internet (35.8%), TV and movies (34.1%), and newspapers and magazines (25.0%). Among high school students, exposure was highest for retail stores (56.3%), followed by Internet (42.9%), TV and movies (38.4%), and newspapers and magazines (34.6%). Among middle school students, 23.4% reported exposure to e-cigarette advertising from one source, 17.4% from two sources, 13.7% from three sources, and 11.9% from four sources. Among high school students, 21.1% reported exposure to e-cigarette advertising from one source, 17.0% from two sources, 14.5% from three sources, and 18.2% from four sources. CONCLUSIONS AND IMPLICATIONS FOR PUBLIC HEALTH PRACTICE: Approximately seven in 10 U.S. 
middle and high school students were exposed to e-cigarette advertisements in 2014. Exposure to e-cigarette advertisements might contribute to increased use of e-cigarettes among youths. Multiple approaches are warranted to reduce youth e-cigarette use and exposure to e-cigarette advertisements, including efforts to reduce youth access to settings where tobacco products, such as e-cigarettes, are sold, and regulation of youth-oriented e-cigarette marketing. |
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Disease Reservoirs and Vectors
- Environmental Health
- Genetics and Genomics
- Health Economics
- Immunity and Immunization
- Injury and Violence
- Laboratory Sciences
- Maternal and Child Health
- Mining
- Nutritional Sciences
- Occupational Safety and Health
- Occupational Safety and Health - Mining
- Parasitic Diseases
- Public Health Leadership and Management
- Public Health, General
- Reproductive Health
- Sciences, General
- Statistics as Topic
- Substance Use and Abuse
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed:Feb 1, 2024
- Page last updated:Sep 03, 2024
- Powered by CDC PHGKB Infrastructure