Relationships between blood pressure and 24-hour urinary excretion of sodium and potassium by body mass index status in Chinese adults
Yan L , Bi Z , Tang J , Wang L , Yang Q , Guo X , Cogswell ME , Zhang X , Hong Y , Engelgau M , Zhang J , Elliott P , Angell SY , Ma J . J Clin Hypertens (Greenwich) 2015 17 (12) 916-25 This study examined the impact of overweight/obesity on the associations among sodium, potassium, and blood pressure using baseline survey data from the Shandong-Ministry of Health Action on Salt Reduction and Hypertension (SMASH) project. Twenty-four-hour urine samples were collected from 1948 Chinese adults aged 18 to 69 years. The observed associations of sodium, potassium, and the sodium-potassium ratio with systolic blood pressure (SBP) were stronger in the overweight/obese population than among those of normal weight. Among overweight/obese respondents, each 1-standard deviation (SD) higher urinary sodium excretion (SD=85 mmol) and potassium excretion (SD=19 mmol) was associated with a 1.31 mm Hg (95% confidence interval, 0.37-2.26) and a -1.43 mm Hg (95% confidence interval, -2.23 to -0.63) difference in SBP, respectively, and each 1-unit higher sodium-potassium ratio was associated with a 0.54 mm Hg (95% confidence interval, 0.34-0.75) increase in SBP. The associations of sodium, potassium, and the sodium-potassium ratio with prevalence of hypertension among overweight/obese respondents were similar to those with SBP. Our study indicated that the relationships between BP and both urinary sodium and potassium might be modified by BMI status in Chinese adults. |
Rising obesity prevalence and weight gain among adults starting antiretroviral therapy in the United States and Canada
Koethe JR , Jenkins CA , Lau B , Shepherd BE , Justice AC , Tate JP , Buchacz K , Napravnik S , Mayor AM , Horberg MA , Blashill AJ , Willig A , Wester CW , Silverberg MJ , Gill J , Thorne JE , Klein M , Eron JJ , Kitahata MM , Sterling TR , Moore RD . AIDS Res Hum Retroviruses 2015 32 (1) 50-8 The proportion of overweight and obese adults in the United States and Canada has increased over the past decade, but temporal trends in body mass index (BMI) and weight gain on antiretroviral therapy (ART) among HIV-infected adults have not been well characterized. We conducted a cohort study comparing HIV-infected adults in the North America AIDS Cohort Collaboration on Research and Design (NA-ACCORD) to United States National Health and Nutrition Examination Survey (NHANES) controls matched by sex, race, and age over the period 1998 to 2010. Multivariable linear regression assessed the relationship between BMI and year of ART initiation, adjusting for sex, race, age, and baseline CD4+ count. Temporal trends in weight on ART were assessed using a generalized least-squares model further adjusted for HIV-1 RNA and first ART regimen class. A total of 14,084 patients from 17 cohorts contributed data; 83% were male, 57% were nonwhite, and the median age was 40 years. Median BMI at ART initiation increased from 23.8 to 24.8 kg/m2 between 1998 and 2010 in NA-ACCORD, but the percentage of those obese (BMI ≥30 kg/m2) at ART initiation increased from 9% to 18%. After 3 years of ART, 22% of individuals with a normal BMI (18.5-24.9 kg/m2) at baseline had become overweight (BMI 25.0-29.9 kg/m2), and 18% of those overweight at baseline had become obese. HIV-infected white women had a higher BMI after 3 years of ART as compared to age-matched white women in NHANES (p = 0.02), while no difference in BMI after 3 years of ART was observed for HIV-infected men or non-white women compared to controls. 
The high prevalence of obesity we observed among ART-exposed HIV-infected adults in North America may contribute to health complications in the future. |
Self-reported current or prior periodontal disease performs moderately well in characterizing periodontitis status in postmenopausal women who receive regular dental checkups
Eke PI . J Evid Based Dent Pract 2015 15 (3) 121-3 The study examines the accuracy of self-reported periodontal disease in a cohort of postmenopausal women. |
Vital Signs: Predicted heart age and racial disparities in heart age among U.S. adults at the state level
Yang Q , Zhong Y , Ritchey M , Cobain M , Gillespie C , Merritt R , Hong Y , George MG , Bowman BA . MMWR Morb Mortal Wkly Rep 2015 64 (34) 950-8 INTRODUCTION: Cardiovascular disease is a leading cause of morbidity and mortality in the United States. Heart age (the predicted age of a person's vascular system based on their cardiovascular risk factor profile) and its comparison with chronological age represent a new way to express risk for developing cardiovascular disease. This study estimates heart age and differences between heart age and chronological age (excess heart age) and examines racial, sociodemographic, and regional disparities in heart age among U.S. adults aged 30-74 years. METHODS: Weighted 2011 and 2013 Behavioral Risk Factor Surveillance System data were applied to the sex-specific non-laboratory-based Framingham risk score models, stratifying the results by age and race/ethnic group, educational and income level, and state. These results were then translated into age-standardized heart age values, mean excess heart age was calculated, and the findings were compared across groups. RESULTS: Overall, average predicted heart age for adult men and women was 7.8 and 5.4 years older than their chronological age, respectively. Statistically significant (p<0.05) racial/ethnic, sociodemographic, and regional differences in heart age were observed: heart age among non-Hispanic black men (58.7 years) and women (58.9 years) was greater than that among other racial/ethnic groups, including non-Hispanic white men (55.3 years) and women (52.5 years). Excess heart age was lowest for men and women in Utah (5.8 and 2.8 years, respectively) and highest in Mississippi (10.1 and 9.1 years, respectively). CONCLUSIONS AND IMPLICATIONS FOR PUBLIC HEALTH PRACTICE: The predicted heart age among U.S. adults aged 30-74 years was significantly higher than their chronological age. 
Use of predicted heart age might 1) simplify risk communication and motivate more persons to live heart-healthy lifestyles and better comply with recommended therapeutic interventions, and 2) motivate communities to implement programs and policies that support cardiovascular health. |
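The heart age construct above lends itself to a simple illustration: predicted heart age is the age at which a person with otherwise ideal risk factors would have the same predicted risk as the individual in question, and excess heart age is that value minus chronological age. The sketch below illustrates only this inversion step; the risk function and its coefficients are invented stand-ins, not the published non-laboratory-based Framingham model.

```python
# Hypothetical sketch of the excess-heart-age calculation. The coefficients
# below are invented for illustration; they are NOT the published
# non-laboratory-based Framingham risk score.

def risk_score(age, sbp, bmi, smoker, diabetic):
    """Toy CVD risk score (higher = worse); linear so it inverts easily."""
    return (0.065 * age + 0.020 * (sbp - 120) + 0.030 * (bmi - 22)
            + 0.40 * smoker + 0.50 * diabetic)

def heart_age(age, sbp, bmi, smoker, diabetic):
    """Age at which an ideal-risk person (SBP 120, BMI 22, nonsmoker,
    nondiabetic) would match this person's score: 0.065 * heart_age = score."""
    return risk_score(age, sbp, bmi, smoker, diabetic) / 0.065

chronological_age = 50
ha = heart_age(chronological_age, sbp=140, bmi=30, smoker=1, diabetic=0)
print(f"heart age {ha:.1f} years, "
      f"excess heart age {ha - chronological_age:.1f} years")
```

With a nonlinear published score the inversion would be done numerically rather than algebraically, but the heart age and excess heart age definitions are the same.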
Prevalence of and trends in diabetes among adults in the United States, 1988-2012
Menke A , Casagrande S , Geiss L , Cowie CC . JAMA 2015 314 (10) 1021-9 IMPORTANCE: Previous studies have shown increasing prevalence of diabetes in the United States. New US data are available to estimate prevalence of and trends in diabetes. OBJECTIVE: To estimate the recent prevalence and update US trends in total diabetes, diagnosed diabetes, and undiagnosed diabetes using National Health and Nutrition Examination Survey (NHANES) data. DESIGN, SETTING, AND PARTICIPANTS: Cross-sectional surveys conducted between 1988-1994 and 1999-2012 of nationally representative samples of the civilian, noninstitutionalized US population; 2781 adults from 2011-2012 were used to estimate recent prevalence and an additional 23,634 adults from 1988-2010 were used to estimate trends. MAIN OUTCOMES AND MEASURES: The prevalence of diabetes was defined using a previous diagnosis of diabetes or, if diabetes was not previously diagnosed, by (1) a hemoglobin A1c level of 6.5% or greater or a fasting plasma glucose (FPG) level of 126 mg/dL or greater (hemoglobin A1c or FPG definition) or (2) additionally including 2-hour plasma glucose (2-hour PG) level of 200 mg/dL or greater (hemoglobin A1c, FPG, or 2-hour PG definition). Prediabetes was defined as a hemoglobin A1c level of 5.7% to 6.4%, an FPG level of 100 mg/dL to 125 mg/dL, or a 2-hour PG level of 140 mg/dL to 199 mg/dL. RESULTS: In the overall 2011-2012 population, the unadjusted prevalence (using the hemoglobin A1c, FPG, or 2-hour PG definitions for diabetes and prediabetes) was 14.3% (95% CI, 12.2%-16.8%) for total diabetes, 9.1% (95% CI, 7.8%-10.6%) for diagnosed diabetes, 5.2% (95% CI, 4.0%-6.9%) for undiagnosed diabetes, and 38.0% (95% CI, 34.7%-41.3%) for prediabetes; among those with diabetes, 36.4% (95% CI, 30.5%-42.7%) were undiagnosed. 
The unadjusted prevalence of total diabetes (using the hemoglobin A1c or FPG definition) was 12.3% (95% CI, 10.8%-14.1%); among those with diabetes, 25.2% (95% CI, 21.1%-29.8%) were undiagnosed. Compared with non-Hispanic white participants (11.3% [95% CI, 9.0%-14.1%]), the age-standardized prevalence of total diabetes (using the hemoglobin A1c, FPG, or 2-hour PG definition) was higher among non-Hispanic black participants (21.8% [95% CI, 17.7%-26.7%]; P < .001), non-Hispanic Asian participants (20.6% [95% CI, 15.0%-27.6%]; P = .007), and Hispanic participants (22.6% [95% CI, 18.4%-27.5%]; P < .001). The age-standardized percentage of cases that were undiagnosed was higher among non-Hispanic Asian participants (50.9% [95% CI, 38.3%-63.4%]; P = .004) and Hispanic participants (49.0% [95% CI, 40.8%-57.2%]; P = .02) than all other racial/ethnic groups. The age-standardized prevalence of total diabetes (using the hemoglobin A1c or FPG definition) increased from 9.8% (95% CI, 8.9%-10.6%) in 1988-1994 to 10.8% (95% CI, 9.5%-12.0%) in 2001-2002 to 12.4% (95% CI, 10.8%-14.2%) in 2011-2012 (P < .001 for trend) and increased significantly in every age group, in both sexes, in every racial/ethnic group, by all education levels, and in all poverty income ratio tertiles. CONCLUSIONS AND RELEVANCE: In 2011-2012, the estimated prevalence of diabetes was 12% to 14% among US adults, depending on the criteria used, with a higher prevalence among participants who were non-Hispanic black, non-Hispanic Asian, and Hispanic. Between 1988-1994 and 2011-2012, the prevalence of diabetes increased in the overall population and in all subgroups evaluated. |
Eating patterns, body mass index, and food deserts: Does it matter where we live?
Posner SF . Prev Chronic Dis 2015 12 E144 One of the great pleasures of being the Editor in Chief of Preventing Chronic Disease: Public Health Research, Practice, and Policy (PCD) is to read the papers submitted by the next generation of public health professionals for the annual PCD Student Contest. This year was no exception. We received 59 papers on a range of critical public health topics that used novel analytic methods. In collaboration with members of the Editorial Board, it is my pleasure to announce that Nelly Mejia at the Pardee RAND Graduate School has won the 2015 PCD Student Contest. In this paper, Mejia and colleagues describe their analysis of the association between living in a food desert and eating fruits and vegetables (1). Understanding the influence of food deserts on public health is critical to designing, implementing, and evaluating the impact of policy and environmental changes to improve access to nutritious foods. | Much of the published literature has documented both the prevalence of food deserts and disparities in access to nutritious foods (2–5). In the past several years, there has been a substantial effort to place farmers markets in areas considered to be food deserts (4–6). These interventions have documented some success; however, there are a number of remaining challenges. Access to healthier foods is necessary but not sufficient to improve nutrition and mitigate the short-term and long-term health effects of suboptimal nutritional intake. Interventions must also address issues of preparation of these healthier foods for consumption. Access and purchasing behavior are first steps in changing dietary intake. |
Hypertension control cascade: a framework to improve hypertension awareness, treatment, and control
Wozniak G , Khan T , Gillespie C , Sifuentes L , Hasan O , Ritchey M , Kmetik K , Wynia M . J Clin Hypertens (Greenwich) 2015 18 (3) 232-9 Evidence-based interventions differ for increasing hypertension awareness, treatment, and control and should be targeted for specific patient panels. This study developed a hypertension control cascade to identify patients with a usual source of care represented at each level of the cascade using the 2007-2012 National Health and Nutrition Examination Survey. Overall, 10.7 million adults in the United States were unaware of their condition, 3.8 million were aware but untreated, and 15.8 million were treated but uncontrolled. The results also suggest that failure to attain hypertension control because of lack of awareness or lack of treatment despite awareness occurs mainly among younger individuals and those with no annual healthcare visits, while the elderly and minorities are more likely to remain uncontrolled when aware and treated. Opportunities to leverage population health management functions in electronic health information systems to align the specific patient subgroups facing barriers to hypertension control at each level of the cascade with targeted hypertension management interventions are discussed. |
Alternative methods for defining osteoarthritis and the impact on estimating prevalence in a US population-based survey
Cisternas MG , Murphy L , Sacks JJ , Solomon DH , Pasta DJ , Helmick CG . Arthritis Care Res (Hoboken) 2015 68 (5) 574-80 OBJECTIVE: To provide a contemporary estimate of osteoarthritis (OA) prevalence by comparing the accuracy and prevalence of alternative definitions of OA. METHODS: The Medical Expenditure Panel Survey (MEPS) household component (HC) records respondent-reported medical conditions as open-ended responses; professional coders translate these responses into ICD-9-CM codes for the medical conditions files. Using these codes and other data from the MEPS-HC medical conditions files, we constructed three case definitions of OA and assessed them against medical provider diagnoses of ICD-9-CM 715 [osteoarthrosis and allied disorders] in a MEPS subsample. The three definitions were: 1) strict = ICD-9-CM 715; 2) expanded = ICD-9-CM 715, 716 [other and unspecified arthropathies], OR 719 [other and unspecified disorders of joint]; and 3) probable = strict OR (expanded plus respondent-reported prior diagnosis of OA or other arthritis excluding rheumatoid arthritis [RA]). RESULTS: Sensitivity and specificity of the three definitions were: strict, 34.6% and 97.5%; expanded, 73.8% and 90.5%; and probable, 62.9% and 93.5%. CONCLUSION: The strict definition for OA (ICD-9-CM 715) excludes many individuals with OA. The probable definition of OA has the optimal combination of sensitivity and specificity relative to the two other MEPS-based definitions and yields a national annual estimate of 30.8 million adults with OA (13.4% of the US adult population) for 2008-2011. |
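The sensitivity/specificity comparison reported above reduces to a standard 2x2 calculation against the provider-diagnosis gold standard. A minimal sketch, using invented toy records rather than MEPS data:

```python
# Toy 2x2 computation of sensitivity and specificity for a case definition
# evaluated against a gold standard. The records below are invented for
# demonstration only; they are not MEPS data.

def sens_spec(records):
    """records: iterable of (meets_definition, gold_standard) boolean pairs."""
    tp = sum(1 for d, g in records if d and g)          # true positives
    fn = sum(1 for d, g in records if not d and g)      # false negatives
    tn = sum(1 for d, g in records if not d and not g)  # true negatives
    fp = sum(1 for d, g in records if d and not g)      # false positives
    return tp / (tp + fn), tn / (tn + fp)

# 4 gold-standard cases (the definition catches 3), 6 non-cases (1 false positive)
toy = [(True, True)] * 3 + [(False, True)] + [(False, False)] * 5 + [(True, False)]
sensitivity, specificity = sens_spec(toy)
print(f"sensitivity {sensitivity:.0%}, specificity {specificity:.0%}")
```

Broadening a definition (as with the expanded ICD-9-CM code set) moves records from the false-negative cell to the true-positive cell at the cost of new false positives, which is exactly the sensitivity/specificity trade-off the study quantifies.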
Clinically unsuspected prion disease among patients with dementia diagnoses in an Alzheimer's disease database
Maddox RA , Blase JL , Mercaldo ND , Harvey AR , Schonberger LB , Kukull WA , Belay ED . Am J Alzheimers Dis Other Demen 2015 30 (8) 752-5 BACKGROUND: Brain tissue analysis is necessary to confirm prion diseases. Clinically unsuspected cases may be identified through neuropathologic testing. METHODS: National Alzheimer's Coordinating Center (NACC) Minimum and Neuropathologic Data Set for 1984 to 2005 were reviewed. Eligible patients had dementia, underwent autopsy, had available neuropathologic data, belonged to a currently funded Alzheimer's Disease Center (ADC), and were coded as having an Alzheimer's disease clinical diagnosis or a nonprion disease etiology. For the eligible patients with neuropathology indicating prion disease, further clinical information, collected from the reporting ADC, determined whether prion disease was considered before autopsy. RESULTS: Of 6000 eligible patients in the NACC database, 7 (0.12%) were clinically unsuspected but autopsy-confirmed prion disease cases. CONCLUSION: The proportion of patients with dementia with clinically unrecognized but autopsy-confirmed prion disease was small. Besides confirming clinically suspected cases, neuropathology is useful to identify unsuspected clinically atypical cases of prion disease. |
Comparing methods for identifying biologically implausible values in height, weight, and body mass index among youth
Lawman HG , Ogden CL , Hassink S , Mallya G , Vander Veur S , Foster GD . Am J Epidemiol 2015 182 (4) 359-65 As more epidemiologic data on childhood obesity become available, researchers are faced with decisions regarding how to determine biologically implausible values (BIVs) in height, weight, and body mass index. The purpose of the current study was 1) to track how often large, epidemiologic studies address BIVs, 2) to review BIV identification methods, and 3) to apply those methods to a large data set of youth to determine the effects on obesity and BIV prevalence estimates. Studies with large samples of anthropometric data (n > 1,000) were reviewed to track whether and how BIVs were defined. Identified methods were then applied to a longitudinal sample of 13,662 students (65% African American, 52% male) in 55 urban, low-income schools that enroll students from kindergarten through eighth grade (ages 5-13 years) in Philadelphia, Pennsylvania, during 2011-2012. Using measured weight and height at baseline and 1-year follow-up, we compared descriptive statistics, weight status prevalence, and BIV prevalence estimates. Eleven different BIV methods were identified. When these methods were applied to a large data set, severe obesity and BIV prevalence ranged from 7.2% to 8.6% and from 0.04% to 1.68%, respectively. Approximately 41% of large epidemiologic studies did not address BIV identification, and existing identification methods varied considerably. Increased standardization of the identification and treatment of BIVs may aid in the comparability of study results and accurate monitoring of obesity trends. |
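The BIV methods compared above generally combine cross-sectional plausibility ranges with longitudinal change checks. A minimal sketch of that style of rule follows; the thresholds are arbitrary illustrative values, not those of any of the 11 published methods.

```python
# One illustrative BIV rule of the type compared in the review: fixed
# plausibility ranges plus a longitudinal check. Thresholds are arbitrary
# examples for demonstration only.

def flag_biv(height_cm, weight_kg, height_cm_y1=None):
    """Return True if a measurement looks biologically implausible."""
    if not 80 <= height_cm <= 210:   # illustrative stature range, ages 5-13
        return True
    if not 10 <= weight_kg <= 160:   # illustrative weight range, ages 5-13
        return True
    # Longitudinal check: children should not "shrink" >2 cm year over year
    if height_cm_y1 is not None and height_cm_y1 < height_cm - 2.0:
        return True
    return False

records = [(140, 35, 146), (140, 35, 120), (300, 35, None)]
print([flag_biv(*r) for r in records])  # → [False, True, True]
```

Because different published methods set these ranges differently (and some omit the longitudinal check entirely), the flagged share of records, and hence the severe-obesity estimate, shifts with the choice of rule, which is the sensitivity the study documents.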
Developing the evidence base to inform best practice: a scoping study of breast and cervical cancer reviews in low- and middle-income countries
Demment MM , Peters K , Dykens JA , Dozier A , Nawaz H , McIntosh S , Smith JS , Sy A , Irwin T , Fogg TT , Khaliq M , Blumenfeld R , Massoudi M , De Ver Dye T . PLoS One 2015 10 (9) e0134618 BACKGROUND: Breast and cervical cancers have emerged as major global health challenges and disproportionately lead to excess morbidity and mortality in low- and middle-income countries (LMICs) when compared to high-income countries. The objective of this paper was to highlight key findings, recommendations, and gaps in research and practice identified through a scoping study of recent reviews in breast and cervical cancer in LMICs. METHODS: We conducted a scoping study based on the six-stage framework of Arksey and O'Malley. We searched PubMed, Cochrane Reviews, and CINAHL with the following inclusion criteria: 1) published between 2005 and February 2015, 2) focused on breast or cervical cancer, 3) focused on LMICs, 4) review article, and 5) published in English. RESULTS: Through our systematic search, 63 of the 94 identified cervical cancer reviews met our selection criteria, as did 36 of the 54 breast cancer reviews. Cervical cancer reviews were more likely to focus upon prevention and screening, while breast cancer reviews were more likely to focus upon treatment and survivorship. Few of the breast cancer reviews referenced research and data from LMICs themselves; cervical cancer reviews were more likely to do so. Most reviews did not include elements of the PRISMA checklist. CONCLUSION: Overall, a limited evidence base supports breast and cervical cancer control in LMICs. Further breast and cervical cancer prevention and control studies are necessary in LMICs. |
Transnational record linkage for tuberculosis surveillance and program evaluation
Aiona K , Lowenthal P , Painter JA , Reves R , Flood J , Parker M , Fu Y , Wall K , Walter ND . Public Health Rep 2015 130 (5) 475-84 OBJECTIVE: Pre-immigration tuberculosis (TB) screening, followed by post-arrival rescreening during the first year, is critical to reducing TB among foreign-born people in the United States. However, existing U.S. public health surveillance is inadequate to monitor TB among immigrants during subsequent years. We developed and tested a novel method for ascertaining post-U.S.-arrival TB outcomes among high-TB-risk immigrant cohorts to improve surveillance. METHODS: We used a probabilistic record linkage program to link pre-immigration screening records from U.S.-bound immigrants from the Philippines (n=422,593) and Vietnam (n=214,401) with the California TB registry during 2000-2010. We estimated sensitivity using Monte Carlo simulations to account for uncertainty in key inputs. Specificity was evaluated by using a time-stratified approach, which defined false-positives as TB records linked to pre-immigration screening records dated after the person had arrived in the United States. RESULTS: TB was reported in 4,382 and 2,830 people born in the Philippines and Vietnam, respectively, in California during the study period. Of these TB cases, records for 973 and 452 cases of people born in the Philippines and Vietnam, respectively, were linked to pre-immigration screening records. Sensitivity and specificity of linkage were 89% (90% credible interval [CrI] 83, 97) and 100%, respectively, for the Philippines, and 90% (90% CrI 83, 98) and 99.9%, respectively, for Vietnam. CONCLUSION: Electronic linkage of pre-immigration screening records to a domestic TB registry was feasible, sensitive, and highly specific in two high-priority immigrant cohorts. Transnational record linkage can be used for program evaluation and routine monitoring of post-U.S.-arrival TB risk among immigrants, but requires interagency data sharing and collaboration. |
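The probabilistic linkage approach described above typically scores candidate record pairs by summing per-field agreement and disagreement weights (the Fellegi-Sunter framework) and classifying pairs above a threshold as links. A minimal sketch follows; the fields and m/u probabilities are invented for illustration and are not those of the linkage program the authors used.

```python
import math

# Fellegi-Sunter-style pair scoring sketch. Fields and m/u probabilities
# are hypothetical examples, not the study's actual linkage parameters.

FIELDS = {             # field: (m, u) = P(agree | match), P(agree | non-match)
    "surname":    (0.95, 0.01),
    "birth_year": (0.98, 0.05),
    "sex":        (0.99, 0.50),
}

def match_weight(rec_a, rec_b):
    """Sum of log2 agreement/disagreement weights across fields."""
    weight = 0.0
    for field, (m, u) in FIELDS.items():
        if rec_a[field] == rec_b[field]:
            weight += math.log2(m / u)              # agreement adds evidence
        else:
            weight += math.log2((1 - m) / (1 - u))  # disagreement subtracts it
    return weight

screening = {"surname": "Santos", "birth_year": 1975, "sex": "M"}
registry_hit = {"surname": "Santos", "birth_year": 1975, "sex": "M"}
registry_miss = {"surname": "Reyes", "birth_year": 1980, "sex": "M"}
print(round(match_weight(screening, registry_hit), 1),
      round(match_weight(screening, registry_miss), 1))
```

Discriminating fields like surname (high m, low u) carry large weights, while weakly discriminating fields like sex contribute little, which is why the approach tolerates minor data-entry discrepancies and still achieves the high sensitivity and specificity the study reports.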
Ryan White HIV/AIDS Program assistance and HIV treatment outcomes
Bradley H , Viall AH , Wortley PM , Dempsey A , Hauck H , Skarbinski J . Clin Infect Dis 2015 62 (1) 90-98 BACKGROUND: The Ryan White HIV/AIDS Program (RWHAP) provides HIV-infected persons with services not covered by other healthcare payer types. Limited data exist to inform policy decisions about the most appropriate role for RWHAP under the Patient Protection and Affordable Care Act (ACA). METHODS: We assessed associations between RWHAP assistance and antiretroviral therapy (ART) prescription and viral suppression. We used data from the Medical Monitoring Project (MMP), a surveillance system assessing characteristics of HIV-infected adults receiving medical care in the United States. Interview and medical record data were collected in 2009-2013 from 18,095 patients. RESULTS: Nearly 41% of patients had RWHAP assistance; 15% relied solely on RWHAP assistance for HIV care. Overall, 91% were prescribed ART, and 75% were virally suppressed. Uninsured patients receiving RWHAP assistance were significantly more likely to be prescribed ART (94% versus 52%; P<0.01) and virally suppressed (77% versus 39%; P<0.01) than uninsured patients without RWHAP assistance. Patients with private insurance and Medicaid were 6% and 7% less likely, respectively, to be prescribed ART than those with RWHAP only (P<0.01). Those with private insurance and Medicaid were 5% and 12% less likely, respectively, to be virally suppressed (P≤0.02) than those with RWHAP only. Patients whose private or Medicaid coverage was supplemented by RWHAP were more likely to be prescribed ART and virally suppressed than those without RWHAP supplementation (P≤0.01). CONCLUSIONS: Uninsured and underinsured HIV-infected persons receiving RWHAP assistance were more likely to be prescribed ART and virally suppressed than those with other types of healthcare coverage. |
Service delivery and patient outcomes in Ryan White HIV/AIDS Program-funded and -nonfunded health care facilities in the United States
Weiser J , Beer L , Frazier EL , Patel R , Dempsey A , Hauck H , Skarbinski J . JAMA Intern Med 2015 175 (10) 1650-9 IMPORTANCE: Outpatient human immunodeficiency virus (HIV) health care facilities receive funding from the Ryan White HIV/AIDS Program (RWHAP) to provide medical care and essential support services that help patients remain in care and adhere to treatment. Increased access to Medicaid and private insurance for HIV-infected persons may provide coverage for medical care but not all needed support services and may not supplant the need for RWHAP funding. OBJECTIVE: To examine differences between RWHAP-funded and non-RWHAP-funded facilities and in patient outcomes between the 2 systems. DESIGN, SETTING, AND PARTICIPANTS: The study was conducted from June 1, 2009, to May 31, 2012, using data from the 2009 and 2011 cycles of the Medical Monitoring Project, a national probability sample of 8038 HIV-infected adults receiving medical care at 989 outpatient health care facilities providing HIV medical care. MAIN OUTCOMES AND MEASURES: Data were used to compare patient characteristics, service needs, and access to services at RWHAP-funded vs non-RWHAP-funded facilities. Differences in prescribed antiretroviral treatment and viral suppression were assessed. Data analysis was performed between February 2012 and June 2015. RESULTS: Overall, 34.4% of facilities received RWHAP funding and 72.8% of patients received care at RWHAP-funded facilities. 
With results reported as percentage (95% CI), patients attending RWHAP-funded facilities were more likely to be aged 18 to 29 years (8.5% [7.4%-9.5%] vs 5.0% [3.9%-6.2%]), female (29.2% [27.2%-31.2%] vs 20.1% [17.0%-23.1%]), black (47.5% [41.5%-53.5%] vs 25.8% [20.6%-31.0%]) or Hispanic (22.5% [16.4%-28.6%] vs 12.9% [10.6%-15.2%]), to have less than a high school education (26.1% [24.0%-28.3%] vs 10.9% [8.7%-13.1%]), to have income at or below the poverty level (53.6% [50.3%-56.9%] vs 23.9% [19.7%-28.0%]), and to lack health care coverage (25.0% [21.9%-28.1%] vs 6.1% [4.1%-8.0%]). The RWHAP-funded facilities were more likely to provide case management (76.1% [69.9%-82.2%] vs 15.4% [10.4%-20.4%]) as well as mental health (64.0% [57.0%-71.0%] vs 18.0% [14.0%-21.9%]), substance abuse (33.6% [27.0%-40.2%] vs 12.0% [8.0%-16.0%]), and other support services; patients attending RWHAP-funded facilities were more likely to receive these services. After adjusting for patient characteristics, the percentage prescribed antiretroviral therapy (ART), reported as adjusted prevalence ratio (95% CI), was similar between RWHAP-funded and non-RWHAP-funded facilities (1.01 [0.99-1.03]), but among poor patients, those attending RWHAP-funded facilities were more likely to be virally suppressed (1.09 [1.02-1.16]). CONCLUSIONS AND RELEVANCE: A total of 72.8% of HIV-positive patients received care at RWHAP-funded facilities. Many had multiple social determinants of poor health and used services at RWHAP-funded facilities associated with improved outcomes. Without facilities supported by the RWHAP, these patients may have had reduced access to services elsewhere. Poor patients were more likely to achieve viral suppression if they received care at a RWHAP-funded facility. |
Sources of infant pertussis infection in the United States
Skoff TH , Kenyon C , Cocoros N , Liko J , Miller L , Kudish K , Baumbach J , Zansky S , Faulkner A , Martin SW . Pediatrics 2015 136 (4) 635-41 BACKGROUND: Pertussis is poorly controlled, with the highest rates of morbidity and mortality among infants. Although the source of infant pertussis is often unknown, when identified, mothers have historically been the most common reservoir of transmission. Despite high vaccination coverage, disease incidence has been increasing. We examined whether infant source of infection (SOI) has changed in the United States in light of the changing epidemiology. METHODS: Cases <1 year old were identified at Enhanced Pertussis Surveillance sites from January 1, 2006, through December 31, 2013. SOI was collected during patient interview and was defined as a suspected pertussis case in contact with the infant case 7 to 20 days before infant cough onset. RESULTS: A total of 1306 infant cases were identified; 24.2% were <2 months old. An SOI was identified for 569 cases. Infants 0 to 1 months old were more likely to have an SOI identified than 2- to 11-month-olds (54.1% vs 40.2%, respectively; P < .0001). More than 66% of SOIs were immediate family members, most commonly siblings (35.5%), mothers (20.6%), and fathers (10.0%); mothers predominated until the transition to siblings beginning in 2008. Overall, the SOI median age was 14 years (range: 0-74 years); median age for sibling SOIs was 8 years. CONCLUSIONS: In contrast to previous studies, our data suggest that the most common source of transmission to infants is now siblings. While continued monitoring of SOIs will optimize pertussis prevention strategies, recommendations for vaccination during pregnancy should directly increase protection of infants, regardless of SOI. |
Trends and differences among three new indicators of HIV infection progression
An Q , Song R , Hernandez A , Hall HI . Public Health Rep 2015 130 (5) 468-74 OBJECTIVE: This study proposes three indicators of, and assesses the disparities and trends in, the risk of HIV infection progression among people living with diagnosed HIV infection in the United States. METHODS: Using data reported to national HIV surveillance through June 2012, we calculated the AIDS diagnosis hazard, HIV (including AIDS) death hazard, and AIDS death hazard for people living with diagnosed HIV infection for each calendar year from 1997 to 2010. We also calculated a stratified hazard in 2010 by age, race/ethnicity, mode of transmission, region of residence at diagnosis, and year of diagnosis. RESULTS: The risk of HIV infection progression among people living with diagnosed HIV infection decreased significantly from 1997 to 2010. The risks of progression to AIDS and death in 2010 were higher among African Americans and people of multiple races, males exposed through injection drug use (IDU) or heterosexual contact, females exposed through IDU, people residing in the South at diagnosis, and people diagnosed in 2009 compared with white individuals, men who have sex with men, females with infection attributed to heterosexual contact, those residing in the Northeast, and those diagnosed in previous years, respectively. People aged 15-29 years had the highest AIDS diagnosis hazard in 2010. CONCLUSION: Continued efforts are needed to ensure early HIV diagnosis as well as initial linkage to and continued engagement in HIV medical care among all people living with HIV. Targeted interventions are needed to improve health-care and supportive services for those with worse health outcomes. |
Trends in HIV testing among U.S. older adults prior to and since release of CDC's routine HIV testing recommendations: national findings from the BRFSS
Ford CL , Mulatu MS , Godette DC , Gaines TL . Public Health Rep 2015 130 (5) 514-25 OBJECTIVE: This study examined temporal trends in HIV testing among U.S. older adults (50-64 years of age) before and after the release of CDC's routine HIV testing recommendations in 2006. METHODS: The sample (n=872,797; 51.4% female) comprised 2003-2010 Behavioral Risk Factor Surveillance System respondents in the oldest categories to which the recommendations apply: 50-54 years (34.5%, n=301,519), 55-59 years (34.1%, n=297,865), and 60-64 years (31.3%, n=273,413). We calculated (1) four-year pooled prevalences of past-year HIV testing before and after 2006, when the recommendations were released; and (2) annual prevalences of HIV testing overall and by age category from 2003-2010. Using weighted, multivariable logistic regression analyses, we examined binary (pre- vs. post-recommendations) and annual changes in testing, controlling for covariates. We stratified the data by recent doctor visits, examined racial/ethnic differences, and tested for linear and quadratic temporal trends. RESULTS: Overall and within age categories, the pooled prevalence of past-year HIV testing decreased following release of the recommendations (p<0.001). The annual prevalence decreased monotonically from 2003 (5.5%) to 2006 (3.6%) (b=-0.16, p<0.001) and then increased immediately after release of the recommendations, but decreased to 3.7% after 2009 (b=0.01, p<0.001). By race/ethnicity, testing increased over time among non-Hispanic black people only. Annual prevalence also increased among respondents with recent doctor visits. CONCLUSION: CDC's HIV testing recommendations were associated with a reversal in the downward trend in past-year HIV testing among older adults; however, the gains were neither universal nor sustained over time. |
Letter to the editor in response to the editorial commentary by Dr Kenrad E. Nelson entitled, "The changing epidemiology of hepatitis A virus infections in the United States"
Ly K N , Klevens R M , Jiles R B . J Infect Dis 2015 212 (6) 1009-10 We appreciate the Editorial Commentary by Nelson [1] and agree that the data in our study show significant declines over time in the incidence of hepatitis A, most likely as a result of successful implementation of childhood hepatitis A vaccination in the United States. At the same time, our study of over 80 000 reported cases of hepatitis A, identified from state health departments from 1999 to 2011, showed a significant increase over time in hepatitis A–associated hospitalizations and in the mean age of persons with hepatitis A who were hospitalized or died [2]. | We wish to clarify a statement by Nelson that we recommended immunization of all adults. Rather, because of the hepatitis A vaccine's highly immunogenic properties [3, 4], we suggested that future studies explore a 1-dose hepatitis A vaccine strategy among adults as a way to possibly prevent complications of future outbreaks. |
Lubricant use among men who have sex with men reporting anal intercourse in Bangkok, Thailand: impact of HIV status and implications for prevention
Thienkrua W , Todd CS , Chaikummao S , Sukwicha W , Yafant S , Tippanont N , Varangrat A , Khlaimanee P , Sirivongrangson P , Holtz TH . J Homosex 2015 63 (4) 507-21 This analysis measures prevalence and correlates of consistent lubricant use among a cohort of Thai men who have sex with men (MSM). Lubricant use was queried at the 12-month follow-up visit. Consistent lubricant use was evaluated with logistic regression. Consistent lubricant use was reported by 77.0% of men and was associated with consistent condom use with casual partners, while binge drinking, payment for sex, and inconsistent condom use with casual and steady partners were negatively associated. Though consistent lubricant use is common among this Thai MSM cohort, further promotion is needed with MSM engaging in risky sexual practices. |
Management of a pet dog after exposure to a human patient with Ebola virus disease
Spengler JR , Stonecipher S , McManus C , Hughes-Garza H , Dow M , Zoran DL , Bissett W , Beckham T , Alves DA , Wolcott M , Tostenson S , Dorman B , Jones J , Sidwa TJ , Knust B , Behravesh CB . J Am Vet Med Assoc 2015 247 (5) 531-8 In October 2014, a health-care worker who had been part of the treatment team for the first laboratory-confirmed case of Ebola virus disease imported to the United States developed symptoms of Ebola virus disease. A presumptive positive reverse transcription PCR assay result for Ebola virus RNA in a blood sample from the worker was confirmed by the CDC, making this the first documented occurrence of domestic transmission of Ebola virus in the United States. The Texas Department of State Health Services commissioner issued a control order requiring disinfection and decontamination of the health-care worker's residence. This process was delayed until the patient's pet dog (which, having been exposed to a human with Ebola virus disease, potentially posed a public health risk) was removed from the residence. This report describes the movement, quarantine, care, testing, and release of the pet dog, highlighting the interdisciplinary One Health approach and extensive collaboration and communication across local, county, state, and federal agencies involved in the response. |
Migration, multiple sexual partnerships, and sexual concurrency in the Garifuna population of Honduras
Gandhi AD , Pettifor A , Barrington C , Marshall SW , Behets F , Guardado ME , Farach N , Ardon E , Paz-Bailey G . AIDS Behav 2015 19 (9) 1559-70 The Garifuna, an ethnic minority group in Honduras, have been disproportionately affected by HIV. Previous research suggests that migration and high rates of multiple sexual partnerships are major drivers of the epidemic. Using data from a 2012 population-based survey, we assessed whether temporary migration was associated with (1) multiple sexual partnerships and (2) sexual concurrency among Garifuna men and women in Honduras. Among both men and women, temporary migration in the last year was associated with an increased likelihood of multiple sexual partnerships and with concurrency, though only the association between migration and multiple sexual partnerships among men was statistically significant (Adjusted Prevalence Ratio 1.7, 95% CI 1.2-2.4). Migration may contribute to HIV/STI vulnerability among Garifuna men and women via increases in these sexual risk behaviors. Research conducted among men and women at elevated risk of HIV should continue to incorporate measures of mobility, including history of internal migration. |
Pneumonia associated with an influenza A H3 outbreak at a skilled nursing facility - Florida, 2014
Jordan J G , Pritchard S , Nicholson G , Winston T , Gumke M , Rubino H , Watkins S , Heberlein-Larson LA , Likos A . MMWR Morb Mortal Wkly Rep 2015 64 (35) 985-6 In December 2014, the Florida Department of Health, Bureau of Epidemiology, was notified that 18 of 95 (19%) residents at a skilled nursing facility had radiographic evidence of pneumonia and were being treated with antibiotics. Two residents were hospitalized, one of whom died. A second resident died at the facility. The Florida Department of Health conducted an outbreak investigation to ascertain all cases through active case finding, identify the etiology, provide infection control guidance, and recommend treatment or prophylaxis, if indicated. |
Ebola virus disease - Sierra Leone and Guinea, August 2015
Hersey S , Martel LD , Jambai A , Keita S , Yoti Z , Meyer E , Seeman S , Bennett S , Ratto J , Morgan O , Akyeampong MA , Sainvil S , Worrell MC , Fitter D , Arnold KE . MMWR Morb Mortal Wkly Rep 2015 64 (35) 981-4 The Ebola virus disease (Ebola) outbreak in West Africa began in late 2013 in Guinea (1) and spread unchecked during early 2014. By mid-2014, it had become the largest Ebola epidemic ever documented. Transmission was occurring in multiple districts of Guinea, Liberia, and Sierra Leone, and for the first time, in capital cities (2). On August 8, 2014, the World Health Organization (WHO) declared the outbreak to be a Public Health Emergency of International Concern (3). Ministries of Health, with assistance from multinational collaborators, have reduced Ebola transmission, and the number of cases is now declining. While Liberia has not reported a case since July 12, 2015, transmission has continued in Guinea and Sierra Leone, although the numbers of cases reported are at the lowest point in a year. In August 2015, Guinea and Sierra Leone reported 10 and four confirmed cases, respectively, compared with a peak of 526 (Guinea) and 1,997 (Sierra Leone) in November 2014. This report details the current situation in Guinea and Sierra Leone, outlines strategies to interrupt transmission, and highlights the need to maintain public health response capacity and vigilance for new cases at this critical time to end the outbreak. |
Elimination of Ebola virus transmission in Liberia - September 3, 2015
Bawo L , Fallah M , Kateh F , Nagbe T , Clement P , Gasasira A , Mahmoud N , Musa E , Lo TQ , Pillai SK , Seeman S , Sunshine BJ , Weidle PJ , Nyensweh T , Liberia Ministry of Health , World Health Organization , CDC Ebola Response Teams . MMWR Morb Mortal Wkly Rep 2015 64 (35) 979-80 With 42 days having passed since the last Ebola virus disease (Ebola) patient was discharged from a Liberian Ebola treatment unit (ETU), September 3, 2015, marks the second time in a 4-month period that the World Health Organization (WHO) has declared Liberia free of Ebola virus transmission (1). The first confirmed Ebola cases in West Africa were identified in southeastern Guinea on March 23, 2014, and within 1 week, cases were identified and confirmed in Liberia (1). Since then, Liberia has reported 5,036 confirmed and probable Ebola cases and 4,808 Ebola-related deaths. The epidemic in Liberia peaked in late summer and early fall of 2014, when more than 200 confirmed and probable cases were reported each week. |
Enterovirus and human parechovirus surveillance - United States, 2009-2013
Abedi GR , Watson JT , Pham H , Nix WA , Oberste MS , Gerber SI . MMWR Morb Mortal Wkly Rep 2015 64 (34) 940-3 Enteroviruses (EVs) and human parechoviruses (HPeVs) are small, non-enveloped RNA viruses in the Picornaviridae family, which are known or suspected to cause a spectrum of clinical manifestations in humans. Although most infected persons are asymptomatic, mild presentations can include respiratory infections, herpangina, and hand, foot, and mouth disease. Among the more severe syndromes associated with EV and HPeV infection are acute flaccid paralysis, meningitis, encephalitis, myocarditis, and sepsis. Neonates and infants are at higher risk for infection and for severe clinical outcomes than older children or adults (1-3). As of August 2015, a total of 16 HPeV types and 118 EV types (within four EV species known to infect humans: A, B, C, and D) had been identified, and the spectrum of illness caused differed among virus types (4). To describe trends in EV and HPeV circulating in the United States during 2009-2013, CDC summarized detections reported through two surveillance systems. The most commonly reported types of EV and HPeV during this period were coxsackievirus (CV) A6 and HPeV3. The large number of CVA6 detections likely reflected an increase in testing in response to an outbreak of severe hand, foot, and mouth disease in late 2011 and 2012 (5). Most HPeV3 detections originated from a single hospital that routinely tested for HPeV (6). Clinicians and public health practitioners should consider the EV and HPeV types recently circulating in the United States to inform diagnostic and surveillance activities. When EV and HPeV typing is performed, clinical and public health laboratories should routinely report their results to improve the reliability and generalizability of surveillance data. |
The epidemiology of human immunodeficiency virus infection and care among adult and adolescent females in the United States, 2008-2012
Nwangwu-Ike N , Hernandez AL , An Q , Huang T , Hall HI . Womens Health Issues 2015 25 (6) 711-9 OBJECTIVE: We sought to determine epidemiological patterns in diagnoses of human immunodeficiency virus (HIV) infection and prevalence among females by age, race/ethnicity and transmission category, and essential steps in the continuum of HIV care. METHODS: Using data from the National HIV Surveillance System, we estimated the number of females aged 13 years or older diagnosed with HIV infection in 2008 through 2012 and living with HIV at the end of 2011 in the United States. We determined percentages of females linked to care, retained in care, and virally suppressed in 18 jurisdictions with complete reporting of CD4 and viral load test results. RESULTS: From 2008 to 2012, the estimated rate of HIV diagnoses among females decreased from 9.3 to 6.9 per 100,000 (-7.1% per year; 95% confidence interval [CI], -7.9, -6.3). In 2012, the diagnosis rate was highest among Blacks/African Americans (35.7), followed by Hispanics/Latinos (6.4) and Native Hawaiians/Other Pacific Islanders (5.1), and lowest among Whites (1.8). Most females diagnosed in 2012 were linked to care within 3 months of diagnosis (82.5%). About one-half (52.4%) of females living with HIV in 2011 received ongoing care in 2011, and 44.3% had a suppressed viral load. Viral suppression was lower among American Indian/Alaska Native (29.7%) and Black/African American (41.6%) females compared with White females (46.5%). The percentage in care and with viral suppression was lower among younger compared with older females. CONCLUSION: HIV diagnoses continue to decrease among females; however, disparities exist in HIV burden and viral suppression. Improvements in care and treatment outcomes are needed for all women, with particular emphasis on younger women. |
Exogenous Reinfection as a Cause of Late Recurrent Tuberculosis in the United States
Interrante JD , Haddad MB , Kim L , Gandhi NR . Ann Am Thorac Soc 2015 12 (11) 1619-26 RATIONALE: The etiology of recurrent tuberculosis is typically presumed to be reactivation of residual Mycobacterium tuberculosis infection, but reinfection may account for a greater proportion of recurrent tuberculosis than previously recognized. OBJECTIVE: To use M. tuberculosis genotyping to characterize the etiology of recurrent tuberculosis ≥12 months after treatment completion. METHODS: The study population for this national population-based cohort was drawn from the estimated 3,039 persons reported to the National Tuberculosis Surveillance System with 2 episodes of tuberculosis in the United States during 1993-2011; 194 had genotyping results from both the initial and subsequent episode. We analyzed the proportion of recurrent tuberculosis attributable to and the risk factors associated with reinfection. MEASUREMENTS AND MAIN RESULTS: Among 136 recurrences meeting inclusion criteria, genotypes between episodes were the same for 116 (85%) recurrences during 1996-2011; the 20 (15%) with differing genotypes were categorized as reinfection. Using exact logistic regression, factors associated with reinfection included Mexican birth with both TB episodes diagnosed in the United States within 12 years of immigration (adjusted odds ratio, 10.7; 95% confidence interval, 1.7-86.3) and exclusive use of directly observed therapy for treatment of the first episode (adjusted odds ratio, 4.5; 95% confidence interval, 1.0-29.2). CONCLUSIONS: Reinfection was the cause for 15% of late recurrent tuberculosis cases in this US cohort. The proportion caused by reinfection increased to 60% in certain subpopulations, such as recent immigrants from Mexico, suggesting that despite successful treatment for TB during their first episode, they remain in a social environment where they are reexposed to M. tuberculosis. 
Public health interventions to prevent novel reinfection might require a broader focus on these communities. |
From theory to practice: Implementation of a resource allocation model in health departments
Yaylali E , Farnham PG , Schneider KL , Landers SJ , Kouzouian O , Lasry A , Purcell DW , Green TA , Sansom SL . J Public Health Manag Pract 2015 22 (6) 567-75 OBJECTIVE: To develop a resource allocation model to optimize health departments' Centers for Disease Control and Prevention (CDC)-funded HIV prevention budgets to prevent the most new cases of HIV infection and to evaluate the model's implementation in 4 health departments. DESIGN, SETTINGS, AND PARTICIPANTS: We developed a linear programming model combined with a Bernoulli process model that allocated a fixed budget among HIV prevention interventions and risk subpopulations to maximize the number of new infections prevented. The model, which required epidemiologic, behavioral, budgetary, and programmatic data, was implemented in health departments in Philadelphia, Chicago, Alabama, and Nebraska. MAIN OUTCOME MEASURES: The optimal allocation of funds, the site-specific cost per case of HIV infection prevented rankings by intervention, and the expected number of HIV cases prevented. RESULTS: The model suggested allocating funds to HIV testing and continuum-of-care interventions in all 4 health departments. The most cost-effective intervention for all sites was HIV testing in nonclinical settings for men who have sex with men, and the least cost-effective interventions were behavioral interventions for HIV-negative persons. The pilot sites required 3 to 4 months of technical assistance to develop data inputs and generate and interpret the results. Although the sites found the model easy to use in providing quantitative evidence for allocating HIV prevention resources, they criticized the exclusion of structural interventions and the use of the model to allocate only CDC funds. CONCLUSIONS: Resource allocation models have the potential to improve the allocation of limited HIV prevention resources and can be used as a decision-making guide for state and local health departments. 
Using such models may require substantial staff time and technical assistance. These model results emphasize the allocation of CDC funds toward testing and continuum-of-care interventions and populations at highest risk of HIV transmission. |
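The allocation model summarized above combines linear programming with a Bernoulli process model of transmission. With a single budget constraint, linear effectiveness, and per-intervention spending caps, the optimum of such a linear program is simply a greedy fill in order of infections prevented per dollar. A toy sketch of that reduced problem; the intervention names, effectiveness values, and caps are all hypothetical, not figures from the study:

```python
# Toy budget allocation in the spirit of the model described above:
# spend a fixed budget across interventions to maximize infections
# prevented, funding interventions in order of cost-effectiveness.
# All numbers are hypothetical placeholders.

def allocate(budget, interventions):
    """interventions: (name, infections_prevented_per_dollar, max_spend)."""
    plan, remaining = {}, budget
    # Fund the most cost-effective interventions first, up to their caps.
    for name, eff, cap in sorted(interventions, key=lambda t: -t[1]):
        spend = min(cap, remaining)
        plan[name] = spend
        remaining -= spend
    return plan

interventions = [
    ("testing_msm_nonclinical", 9e-6, 400_000),   # most cost-effective
    ("continuum_of_care",       6e-6, 500_000),
    ("behavioral_hiv_negative", 1e-6, 300_000),   # least cost-effective
]

plan = allocate(1_000_000, interventions)
prevented = sum(eff * plan[name] for name, eff, _ in interventions)
```

The full model in the study additionally allocates across risk subpopulations and uses a Bernoulli process model to estimate infections prevented, which this sketch replaces with fixed per-dollar effectiveness values.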
Geographical targeting to improve progression through the sexually transmitted infection/HIV treatment continua in different populations
Aral SO , Torrone E , Bernstein K . Curr Opin HIV AIDS 2015 10 (6) 477-482 PURPOSE OF REVIEW: The purpose of this study is to review and synthesize the recent literature on the use of geographical targeting to improve progression through HIV and sexually transmitted disease (STD) prevention and treatment continua in different populations. RECENT FINDINGS: Geographical targeting can help identify obstacles to progression through prevention and treatment continua for each stage and in specific geographic locations. Macro-level geographical targeting can help maximize allocative efficiency, while micro-level targeting of hot spots increases effectiveness of interventions. Migration into and out of geographical areas of interest constitutes a challenge to geographical targeting in that stage-specific monitoring strategies tend to yield inaccurate results when people leave the area. Despite these issues, it is possible to identify failures in each stage of the continuum by specific spatial location such as census tracts and focus improvement efforts accordingly. SUMMARY: Vulnerabilities, risk behaviours and infections all cluster across age, race-ethnicity, socioeconomic status, key populations, risk networks and geographic space. Spatial concentration may be the most important in this context, as it allows prevention programmes to identify and reach target populations more easily. Geographical targeting can be employed at both macro and micro levels and in combination with targeting of key populations and high-risk networks. |
Human coronaviruses-associated influenza-like illness in the community setting in Peru
Razuri H , Malecki M , Tinoco Y , Ortiz E , Guezala C , Romero C , Estela A , Brena P , Morales ML , Reaves EJ , Gomez J , Uyeki TM , Widdowson MA , Azziz-Baumgartner E , Bausch DG , Schildgen V , Schildgen O , Montgomery JM . Am J Trop Med Hyg 2015 93 (5) 1038-40 We present findings describing the epidemiology of non-severe acute respiratory syndrome human coronavirus-associated influenza-like illness from a population-based active follow-up study in four different regions of Peru. In 2010, the prevalence of infections by human coronaviruses 229E, OC43, NL63, or HKU1 was 6.4% in participants with influenza-like illness who tested negative for influenza viruses. Ten of 11 human coronavirus infections were identified in the fall-winter season. Human coronaviruses are present in different regions of Peru and are relatively frequently associated with influenza-like illness. |
Impact of prompt influenza antiviral treatment on extended care needs after influenza hospitalization among community-dwelling older adults
Chaves SS , Perez A , Miller L , Bennett NM , Bandyopadhyay A , Farley MM , Fowler B , Hancock EB , Kirley PD , Lynfield R , Ryan P , Morin C , Schaffner W , Sharangpani R , Lindegren ML , Tengelsen L , Thomas A , Hill MB , Bradley KK , Oni O , Meek J , Zansky S , Widdowson MA , Finelli L . Clin Infect Dis 2015 61 (12) 1807-14 BACKGROUND: Patients hospitalized with influenza may require extended care upon discharge. We aimed to explore predictors for extended care needs and the potential mitigating effect of antiviral treatment among community-dwelling adults aged ≥65 years hospitalized with influenza. METHODS: We used laboratory-confirmed influenza hospitalizations from 3 influenza seasons. Extended care was defined as new placement in a skilled nursing home/long-term/rehabilitation facility upon hospital discharge. We focused on those treated with antiviral agents to explore the effect of early treatment on extended care and hospital length of stay (LOS) using logistic regression and competing risk survival analysis, accounting for time from illness onset to hospitalization. Treatment was categorized as early (≤4 days) and late (>4 days) in reference to date of illness onset. RESULTS: Among 6,593 community-dwelling adults aged ≥65 years hospitalized for influenza, 18% required extended care at discharge. Need for care increased with age; neurologic disorders, ICU admission, and pneumonia were predictors of extended care needs. Early treatment reduced the odds of extended care after hospital discharge for those hospitalized ≤2 or >2 days from illness onset (adjusted odds ratio [aOR] 0.38; 95% confidence interval [CI] 0.17, 0.85, and aOR 0.75; 95% CI 0.56, 0.97 respectively). Early treatment was also independently associated with reduction in LOS for those hospitalized ≤2 days from illness onset (adjusted hazard ratio [aHR] 1.81; 95% CI 1.43, 2.30) or >2 days (aHR 1.30; 95% CI 1.20, 1.40).
CONCLUSIONS: Prompt antiviral treatment lessens the impact of influenza on older adults by shortening hospitalization and reducing extended care needs. |
Bodies don't sleep, neither do babies: experiences at the only maternity hospital isolation unit in Sierra Leone during the 2014 Ebola epidemic
Johnson JL . Am J Obstet Gynecol 2015 213 (5) 739-40 I arrived in Sierra Leone during the 2014 Ebola epidemic on Oct. 22, 2014, as part of a traveling group of 5 US Public Health responders working on behalf of the Centers for Disease Control and Prevention and joining a larger field team spread throughout the country. This was the first time I had been called to an outbreak, and I knew 2 things: I would be assigned to the infection prevention control team covering the maternity hospital isolation unit, and I should be ready for “rough conditions.” The short commute from the hotel took an hour during afternoon traffic. We pulled into the hospital gates to a courtyard crowded with people. Because we had not been screened on entry to the hospital grounds, I suspected they hadn’t been either. I stood the recommended arm and a half length away from people in the crowded hospital courtyard, knowing this would be the first of many long hours spent at the maternity hospital. My colleague gave me a tour of the facility: the isolation unit for women with suspected or confirmed Ebola virus disease (EVD), hospital wards, screening areas, morgue, laboratory, pharmacy, patient waiting areas, and the incinerator. Little did I know how accustomed I would grow to the smell, smoke, and eye-burning effects of the incinerator. I walked awkwardly behind her trying not to get too close to anything or anyone and followed her lead as we stood to the side while men in full personal protection equipment carried bodies on stretchers past us to the morgue. No one can quite prepare for these “rough conditions.” I was shocked at the large size of the hospital compound and the throngs of people who seemed unfazed to spend an afternoon in a hospital courtyard during an EVD epidemic. There were no weekends. Staff at the isolation unit worked nonstop, 7 days a week. The head nurse of the isolation unit would become a friend. 
During my 35-day tenure at the hospital, I would see her mood fluctuate from an ever-present exhaustion to hope and optimism with rare bouts of sadness and hopelessness. I wished I could give her the reprieve she undoubtedly needed and worried for her health and stamina. |
Characteristics of African American women and their partners with perceived concurrent partnerships in 4 rural counties in the Southeastern US
Ludema C , Doherty IA , White BL , Villar-Loubet O , McLellan-Lemal E , O'Daniels CM , Adimora AA . Sex Transm Dis 2015 42 (9) 498-504 BACKGROUND: To the individual with concurrent partners, it is thought that having concurrent partnerships confers no greater risk of acquiring HIV than having multiple consecutive partnerships. However, an individual whose partner has concurrent partnerships (partner's concurrency) is at increased risk for incident HIV infection. We sought to better understand relationships characterized by partner's concurrency among African American women. METHODS: A total of 1013 African American women participated in a cross-sectional survey from 4 rural Southeastern counties. RESULTS: Older age at first sex was associated with lower prevalence of partner's concurrency (prevalence ratio, 0.70; 95% confidence interval, 0.57-0.87), but the participant's age was not associated with partner's concurrency. After adjusting for covariates, ever having experienced intimate partner violence (IPV) and forced sex were most strongly associated with partner's concurrency (prevalence ratios, 1.61 [95% confidence intervals, 1.23-2.11] and 1.65 [1.20-2.26], respectively). Women in mutually monogamous partnerships were the most likely to receive economic support from their partners; women whose partners had concurrent partnerships did not report more economic benefit than did those whose partners were monogamous. CONCLUSIONS: Associations between history of IPV and forced sex with partner's concurrency suggest that women with these experiences may particularly benefit from interventions to reduce partner's concurrency in addition to support for reducing IPV and other sexual risks. To inform these interventions, further research to understand partnerships characterized by partner's concurrency is warranted. |
A county-level analysis of persons living with HIV in the southern United States
Gray SC , Massaro T , Chen I , Edholm CJ , Grotheer R , Zheng Y , Chang HH . AIDS Care 2015 28 (2) 1-7 This study uses county-level surveillance data to systematically analyze geographic variation and clustering of persons living with diagnosed HIV (PLWH) in the southern United States in 2011. Clusters corresponding to large metropolitan areas - including Miami, Atlanta, and Baltimore - had HIV prevalence rates higher (p < .001) than the regional rate. Regression analysis within the counties included in these clusters determined that race was a significant indicator for PLWH. These results provide a general picture of the distribution of PLWH in the southern United States at the county level and provide insights for identifying local geographic areas with a high number of PLWH, as well as subpopulations that may have an increased risk of infection. |
Assessment of immigrant certified nursing assistants' communication when responding to standardized care challenges
Massey M , Roter DL . Patient Educ Couns 2015 99 (1) 44-50 OBJECTIVE: Certified nursing assistants (CNAs) provide 80% of the hands-on care in US nursing homes; a significant portion of this work is performed by immigrants with limited English fluency. This study is designed to assess immigrant CNAs' communication behavior in response to a series of virtual simulated care challenges. METHODS: A convenience sample of 31 immigrant CNAs verbally responded to 9 care challenges embedded in an interactive computer platform. Responses were coded with the Roter Interaction Analysis System (RIAS), CNA instructors rated response quality, and spoken English fluency was rated. RESULTS: CNA communication behaviors varied across care challenges and a broad repertoire of communication was used; 69% of response content was characterized as psychosocial. Communication elements (both instrumental and psychosocial) were significant predictors of response quality for 5 of 9 scenarios. Overall these variables explained between 13% and 36% of the adjusted variance in quality ratings. CONCLUSION: Immigrant CNAs responded to common care challenges using a variety of communication strategies despite fluency deficits. PRACTICE IMPLICATIONS: Virtual simulation-based observation is a feasible, acceptable and low cost method of communication assessment with implications for supervision, training and evaluation of a para-professional workforce. |
Impact of human mobility on the emergence of dengue epidemics in Pakistan
Wesolowski A , Qureshi T , Boni MF , Sundsoy PR , Johansson MA , Rasheed SB , Engo-Monsen K , Buckee CO . Proc Natl Acad Sci U S A 2015 112 (38) 11887-92 The recent emergence of dengue viruses into new susceptible human populations throughout Asia and the Middle East, driven in part by human travel on both local and global scales, represents a significant global health risk, particularly in areas with changing climatic suitability for the mosquito vector. In Pakistan, dengue has been endemic for decades in the southern port city of Karachi, but large epidemics in the northeast have emerged only since 2011. Pakistan is therefore representative of many countries on the verge of countrywide endemic dengue transmission, where prevention, surveillance, and preparedness are key priorities in previously dengue-free regions. We analyze spatially explicit dengue case data from a large outbreak in Pakistan in 2013 and compare the dynamics of the epidemic to an epidemiological model of dengue virus transmission based on climate and mobility data from approximately 40 million mobile phone subscribers. We find that mobile phone-based mobility estimates predict the geographic spread and timing of epidemics in both recently epidemic and emerging locations. We combine transmission suitability maps with estimates of seasonal dengue virus importation to generate fine-scale dynamic risk maps with direct application to dengue containment and epidemic preparedness. |
Comparison of methods for xenomonitoring in vectors of lymphatic filariasis in northeastern Tanzania
Irish S R , Derua YA , Walker T , Cameron MM . Am J Trop Med Hyg 2015 93 (5) 983-9 Monitoring Wuchereria bancrofti infection in mosquitoes (xenomonitoring) can play an important role in determining when lymphatic filariasis has been eliminated, or in focusing control efforts. As mosquito infection rates can be low, a method for collecting large numbers of mosquitoes is necessary; for example, gravid traps have collected large numbers of Culex quinquefasciatus in Tanzania. A collection method that targets mosquitoes that have already fed could also increase sensitivity in detecting W. bancrofti-infected mosquitoes. The aim of this experiment was to test this hypothesis by comparing U.S. Centers for Disease Control and Prevention (CDC) light traps with CDC gravid traps in northeastern Tanzania, where Cx. quinquefasciatus is a vector of lymphatic filariasis. After an initial study where small numbers of mosquitoes were collected, a second study collected 16,316 Cx. quinquefasciatus in 60 gravid trap-nights and 240 light trap-nights. Mosquitoes were pooled and tested for presence of W. bancrofti DNA. Light and gravid traps collected similar numbers of mosquitoes per trap-night, but the physiological status of the mosquitoes was different. The estimated infection rate in mosquitoes collected in light traps was considerably higher than in mosquitoes collected in gravid traps, so light traps can be a useful tool for xenomonitoring work in Tanzania. |
Studies on the species composition and relative abundance of mosquitoes of Mpigi District, Central Uganda
Mayanja M , Mutebi JP , Crabtree MB , Ssenfuka F , Muwawu T , Lutwama JJ . J Entomol Zool Stud 2014 2 (5) 317-322 Prediction of arboviral disease outbreaks and planning for appropriate control interventions require knowledge of the mosquito vectors involved. Although mosquito surveys have been conducted in different regions of Uganda since the mid-1930s, such studies have not been carried out in Mpigi District. In October 2011, we conducted mosquito collections in Mpigi District to determine species composition and relative abundance of the different species. The survey was conducted in four villages, Njeru, Ddela, Kiwumu, and Nsumba in Kammengo sub-county, Mpigi District, Uganda. CDC light traps baited with dry ice (carbon dioxide) were used to capture adult mosquitoes. A total of 54,878 mosquitoes comprising 46 species from eight genera were collected. The dominant species at all sites was Coquillettidia (Coquillettidia) fuscopennata Theobald (n=38,059, 69%), followed by Coquillettidia (Coquillettidia) metallica Theobald (n=4,265, 7.8%). The number of species collected varied from 17 in the genus Culex to 1 in the genus Lutzia. Of the 46 species identified, arboviruses had previously been isolated from 28 (60.9%), suggesting a high potential for arboviral transmission and/or maintenance in Mpigi District. |
Trends in emergency department visits for unsupervised pediatric medication exposures, 2004-2013
Lovegrove M C , Weidle NJ , Budnitz DS . Pediatrics 2015 136 (4) e821-9 BACKGROUND: After reports of increasing emergency department (ED) visits for unsupervised pediatric medication exposures in the 2000s, renewed efforts to improve safety packaging and education were initiated. National data on current trends can help further target interventions. METHODS: We used nationally representative data from the National Electronic Injury Surveillance System-Cooperative Adverse Drug Event Surveillance project (2004-2013) to assess trends in ED visits for unsupervised medication exposures in children aged <6 years. For 2010 through 2013, the dosage form and prescription status of implicated medications were identified. RESULTS: Based on 13 268 cases, there were an estimated 640 161 ED visits (95% confidence interval: 512 885 to 767 436) for unsupervised medication exposures from 2004 through 2013. From 2004 through 2010, ED visits for unsupervised exposures increased by an average of 5.7% annually, peaking at 75 842. After 2010, this trend reversed, and visits decreased by an average of 6.7% annually to 59 092 in 2013. From 2010 through 2013, 91.0% of unsupervised exposure visits involved 1 medication, most commonly an oral prescription solid (45.9%), oral over-the-counter (OTC) solid (22.3%), or oral OTC liquid (12.4%). More than 260 different prescription solids were implicated; opioids (13.8%) and benzodiazepines (12.7%) were the most common classes. Four medications were implicated in 91.2% of OTC liquid exposure visits: acetaminophen (32.9%), cough and cold remedies (27.5%), ibuprofen (15.7%), and diphenhydramine (15.6%). CONCLUSIONS: Targeting prevention efforts based on harm frequency and intervention feasibility can lead to continued reductions in ED visits for pediatric medication exposures. |
A rapid assessment of drinking water quality in informal settlements after a cholera outbreak in Nairobi, Kenya
Blanton E , Wilhelm N , O'Reilly C , Muhonja E , Karoki S , Ope M , Langat D , Omolo J , Wamola N , Oundo J , Hoekstra R , Ayers T , De Cock K , Breiman R , Mintz E , Lantagne D . J Water Health 2015 13 (3) 714-25 Populations living in informal settlements with inadequate water and sanitation infrastructure are at risk of epidemic disease. In 2010, we conducted 398 household surveys in two informal settlements in Nairobi, Kenya with isolated cholera cases. We tested source and household water for free chlorine residual (FCR) and Escherichia coli in approximately 200 households. International guidelines are ≥0.5 mg/L FCR at source, ≥0.2 mg/L at household, and <1 E. coli/100 mL. In these two settlements, 82% and 38% of water sources met FCR guidelines, and 7% and 8% were contaminated with E. coli, respectively. In household stored water, 82% and 35% met FCR guidelines and 11% and 32% were contaminated with E. coli, respectively. Source water FCR ≥0.5 mg/L (p = 0.003) and reported purchase of a household water treatment product (p = 0.002) were associated with an increased likelihood that household stored water had ≥0.2 mg/L FCR, which was in turn associated with a lower likelihood of E. coli contamination (p < 0.001). These results challenge assumptions about universally poor water quality in informal settlements and about the route of disease transmission, and highlight that providing centralized water with ≥0.5 mg/L FCR or, where that is not feasible, household water treatment technologies reduces the risk of waterborne cholera transmission in informal settlements. |
Trends in exposure to chemicals in personal care and consumer products
Calafat AM , Valentin-Blasini L , Ye X . Curr Environ Health Rep 2015 2 (4) 348-55 Synthetic organic chemicals can be used in personal care and consumer products. Data on potential human health effects of these chemicals are limited-sometimes even contradictory-but because several of these chemicals are toxic in experimental animals, alternative compounds are entering consumer markets. Nevertheless, limited information exists on consequent exposure trends to both the original chemicals and their replacements. Biomonitoring (measuring concentrations of chemicals or their metabolites in people) provides invaluable information for exposure assessment. We use phthalates and bisphenol A-known industrial chemicals-and organophosphate insecticides as case studies to show exposure trends to these chemicals and their replacements (e.g., other phthalates, non-phthalate plasticizers, various bisphenols, pyrethroid insecticides) among the US general population. We compare US trends to national trends from Canada and Germany. Exposure to the original compounds is still prevalent among these general populations, but exposures to alternative chemicals may be increasing. |
One Health - a strategy for resilience in a changing Arctic
Ruscio BA , Brubaker M , Glasser J , Hueston W , Hennessy TW . Int J Circumpolar Health 2015 74 27913 The circumpolar north is uniquely vulnerable to the health impacts of climate change. While international Arctic collaboration on health has enhanced partnerships and advanced the health of inhabitants, significant challenges lie ahead. One Health is an approach that considers the connections between the environment and plant, animal and human health. Understanding these connections is increasingly critical in assessing the impact of global climate change on the health of Arctic inhabitants. The effects of climate change are complex and difficult to predict with certainty. Health risks include changes in the distribution of infectious disease, expansion of zoonotic diseases and vectors, changing migration patterns, impacts on food security and changes in water availability and quality, among others. A regional network of diverse stakeholders and transdisciplinary specialists from circumpolar nations and Indigenous groups can advance the understanding of complex climate-driven health risks and provide community-based strategies for early identification and prevention of, and adaptation to, health risks in humans, animals and the environment. We propose a regional One Health approach for assessing interactions at the Arctic human-animal-environment interface to enhance the understanding of, and response to, the complexities of climate change on the health of Arctic inhabitants. |
Environmental phenols and pubertal development in girls
Wolff MS , Teitelbaum SL , McGovern K , Pinney SM , Windham GC , Galvez M , Pajak A , Rybak M , Calafat AM , Kushi LH , Biro FM . Environ Int 2015 84 174-180 Environmental exposures to many phenols are documented worldwide and exposures can be quite high (>1 microM of urine metabolites). Phenols have a range of hormonal activity, but knowledge of effects on child reproductive development is limited, coming mostly from cross-sectional studies. We undertook a prospective study of pubertal development among 1239 girls recruited at three U.S. sites when they were 6-8 years old and were followed annually for 7 years to determine age at first breast or pubic hair development. Ten phenols were measured in urine collected at enrollment (benzophenone-3, enterolactone, bisphenol A, three parabens (methyl-, ethyl-, propyl-), 2,5-dichlorophenol, triclosan, genistein, daidzein). We used multivariable adjusted Cox proportional hazards ratios (HR (95% confidence intervals)) and Kaplan-Meier survival analyses to estimate relative risk of earlier or later age at puberty associated with phenol exposures. For enterolactone and benzophenone-3, girls experienced breast development 5-6 months later, adjusted HR 0.79 (0.64-0.98) and HR 0.80 (0.65-0.98) respectively for the 5th vs 1st quintiles of urinary biomarkers (microg/g-creatinine). Earlier breast development was seen for triclosan and 2,5-dichlorophenol: 4-9 months sooner for 5th vs 1st quintiles of urinary concentrations (HR 1.17 (0.96-1.43) and HR 1.37 (1.09-1.72), respectively). Association of breast development with enterolactone, but not the other three phenols, was mediated by body size. 
These phenols may be antiadipogens (benzophenone-3 and enterolactone) or thyroid agonists (triclosan and 2,5-dichlorophenol); given their ubiquity and the relatively high levels measured in children, further investigation is warranted to confirm these findings and to establish whether there are certain windows of susceptibility during which exposure can affect pubertal development. |
Complete Genome Sequences of Two Bordetella hinzii Strains Isolated from Humans.
Weigand MR , Changayil S , Kulasekarapandian Y , Tondella ML . Genome Announc 2015 3 (4) Bordetella hinzii is primarily recovered from poultry but can also colonize mammalian hosts and immunocompromised humans. Here, we report the first complete genome sequences of B. hinzii in two isolates recovered from humans. The availability of these sequences will hopefully aid in identifying host-specific determinants variably present within this species. |
Analytical Validation of Quantitative Real-Time PCR Methods for Quantification of Trypanosoma cruzi DNA in Blood Samples from Chagas Disease Patients.
Ramirez JC , Cura CI , da Cruz Moreira O , Lages-Silva E , Juiz N , Velazquez E , Ramirez JD , Alberti A , Pavia P , Flores-Chavez MD , Munoz-Calderon A , Perez-Morales D , Santalla J , Marcos da Matta Guedes P , Peneau J , Marcet P , Padilla C , Cruz-Robles D , Valencia E , Crisante GE , Greif G , Zulantay I , Costales JA , Alvarez-Martínez M , Martínez NE , Villarroel R , Villarroel S , Sánchez Z , Bisio M , Parrado R , Maria da Cunha Galvão L , Jácome da Câmara AC , Espinoza B , Alarcón de Noya B , Puerta C , Riarte A , Diosque P , Sosa-Estani S , Guhl F , Ribeiro I , Aznar C , Britto C , Yadón ZE , Schijman AG . J Mol Diagn 2015 17 (5) 605-15 An international study was performed by 26 experienced PCR laboratories from 14 countries to assess the performance of duplex quantitative real-time PCR (qPCR) strategies on the basis of TaqMan probes for detection and quantification of parasitic loads in peripheral blood samples from Chagas disease patients. Two methods were studied: Satellite DNA (SatDNA) qPCR and kinetoplastid DNA (kDNA) qPCR. Both methods included an internal amplification control. Reportable range, analytical sensitivity, limits of detection and quantification, and precision were estimated according to international guidelines. In addition, inclusivity and exclusivity were estimated with DNA from stocks representing the different Trypanosoma cruzi discrete typing units and Trypanosoma rangeli and Leishmania spp. Both methods were challenged against 156 blood samples provided by the participant laboratories, including samples from acute and chronic patients with varied clinical findings, infected by oral route or vectorial transmission. kDNA qPCR showed better analytical sensitivity than SatDNA qPCR with limits of detection of 0.23 and 0.70 parasite equivalents/mL, respectively. Analyses of clinical samples revealed a high concordance in terms of sensitivity and parasitic loads determined by both SatDNA and kDNA qPCRs. 
This effort is a major step toward international validation of qPCR methods for the quantification of T. cruzi DNA in human blood samples, aiming to provide an accurate surrogate biomarker for diagnosis and treatment monitoring for patients with Chagas disease. |
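Parasitic loads reported in parasite equivalents/mL are read off a log-linear qPCR standard curve (Ct versus log10 quantity). The sketch below fits such a curve by least squares and inverts it to quantify an unknown sample; the calibration points are synthetic (a perfectly efficient assay with a hypothetical intercept of 38), not values from the validation study:

```python
import math

def fit_standard_curve(quantities, cts):
    """Least-squares fit of Ct = slope * log10(quantity) + intercept."""
    xs = [math.log10(q) for q in quantities]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(cts) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, cts))
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x


def quantify(ct, slope, intercept):
    """Invert the curve to estimate quantity (e.g. parasite eq./mL)."""
    return 10 ** ((ct - intercept) / slope)


# Synthetic standards spanning 1 to 1,000 parasite equivalents/mL.
slope, intercept = fit_standard_curve([1, 10, 100, 1000],
                                      [38.0, 34.68, 31.36, 28.04])
```

A slope of -3.32 corresponds to 100% amplification efficiency; with this curve, an observed Ct of 34.68 maps back to 10 parasite equivalents/mL.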
Complete Genome Sequence of Strain H5989 of a Novel Devosia Species.
Nicholson AC , Whitney AM , Humrighouse B , Emery B , Loparev V , McQuiston JR . Genome Announc 2015 3 (5) The CDC Special Bacteriology Reference Laboratory (SBRL) collection of human clinical pathogens contains several strains from the genus Devosia, usually found environmentally. We provide here the complete genome of strain H5989, which was isolated from a human cerebrospinal fluid (CSF) specimen and represents a putative novel species in the genus Devosia. |
Factors associated with real time RT-PCR cycle threshold values among medically attended influenza episodes.
Spencer S , Clippard J , Thompson M , Piedra PA , Jewell A , Avadhanula V , Mei M , Jackson ML , Meece J , Sundaram M , Belongia EA , Cross R , Johnson E , Bullotta A , Rinaldo C , Gaglani M , Murthy K , Clipper L , Berman L , Flannery B . J Med Virol 2015 88 (4) 719-23 We evaluated the cycle threshold (CT) values of 1160 influenza A positive and 806 influenza B positive specimens from two seasons of the US Flu VE Network to identify factors associated with CT values. Low CT values (high genomic load) were associated with shorter intervals between illness onset and specimen collection, young age (ages 3-8 years old), and self-rated illness severity for both influenza A and B. Low CT values were also associated with reported fever/feverishness and age ≥65 years for influenza A. |
Noncanonical role of transferrin receptor 1 is essential for intestinal homeostasis.
Chen AC , Donovan A , Ned-Sykes R , Andrews NC . Proc Natl Acad Sci U S A 2015 112 (37) 11714-9 Transferrin receptor 1 (Tfr1) facilitates cellular iron uptake through receptor-mediated endocytosis of iron-loaded transferrin. It is expressed in the intestinal epithelium but not involved in dietary iron absorption. To investigate its role, we inactivated the Tfr1 gene selectively in murine intestinal epithelial cells. The mutant mice had severe disruption of the epithelial barrier and early death. There was impaired proliferation of intestinal epithelial cell progenitors, aberrant lipid handling, increased mRNA expression of stem cell markers, and striking induction of many genes associated with epithelial-to-mesenchymal transition. Administration of parenteral iron did not improve the phenotype. Surprisingly, however, enforced expression of a mutant allele of Tfr1 that is unable to serve as a receptor for iron-loaded transferrin appeared to fully rescue most animals. Our results implicate Tfr1 in homeostatic maintenance of the intestinal epithelium, acting through a role that is independent of its iron-uptake function. |
Diverse antigenic site targeting of influenza hemagglutinin in the murine antibody recall response to A(H1N1)pdm09 virus.
Wilson JR , Guo Z , Tzeng WP , Garten RJ , Xiyan X , Blanchard EG , Blanchfield K , Stevens J , Katz JM , York IA . Virology 2015 485 252-262 Here we define the epitopes on HA that are targeted by a group of 9 recombinant monoclonal antibodies (rmAbs) isolated from memory B cells of mice, immunized by infection with A(H1N1)pdm09 virus followed by a seasonal TIV boost. These rmAbs were all reactive against the HA1 region of HA, but display 7 distinct binding footprints, targeting each of the 4 known antigenic sites. Although the rmAbs were not broadly cross-reactive, a group showed subtype-specific cross-reactivity with the HA of A/South Carolina/1/18. Screening these rmAbs with a panel of human A(H1N1)pdm09 virus isolates indicated that naturally-occurring changes in HA could reduce rmAb binding, HI activity, and/or virus neutralization activity by rmAb, without showing changes in recognition by polyclonal antiserum. In some instances, virus neutralization was lost while both ELISA binding and HI activity were retained, demonstrating a discordance between the two serological assays traditionally used to detect antigenic drift. |
Effects of PCB126 and PCB153 on telomerase activity and telomere length in undifferentiated and differentiated HL-60 cells.
Xin X , Senthilkumar PK , Schnoor JL , Ludewig G . Environ Sci Pollut Res Int 2015 23 (3) 2173-85 PCBs are persistent organic pollutants that are carcinogenic and immunotoxic and have developmental toxicity. This suggests that they may interfere with normal cell maturation. Cancer and stem/progenitor cells have telomerase activity to maintain and protect the chromosome ends, but lose this activity during differentiation. We hypothesized that PCBs interfere with telomerase activity and the telomere complex, thereby disturbing cell differentiation and stem/progenitor cell function. HL-60 cells are cancer cells that can be differentiated into granulocytes and monocytes. We exposed HL-60 cells to PCB126 (dioxin-like) and PCB153 (nondioxin-like) for 6 days before and during 3 days of differentiation. The differentiated cells showed G0/G1 phase arrest and very low telomerase activity. hTERT and hTR, two telomerase-related genes, were downregulated. The telomere shelterins TRF1, TRF2, and POT1 were upregulated in granulocytes, and TRF2 was upregulated and POT1 downregulated in monocytes. Both PCBs further reduced telomerase activity in differentiated cells, but had only small effects on the differentiation and telomere-related genes. Treatment of undifferentiated HL-60 cells for 30 days with PCB126 produced a downregulation of telomerase activity and a decrease of hTERT, hTR, TRF1, and POT1 gene expression. With PCB153, the effects were less pronounced and some shelterin genes were increased after 30 days of exposure. With each PCB, no differentiation of cells was observed and cells continued to proliferate despite reduced telomerase activity, resulting in shortened telomeres after 30 days of exposure. These results indicate cell-type and PCB congener-specific effects on telomere/telomerase-related genes. 
Although PCBs do not seem to strongly affect differentiation, they may influence stem or progenitor cells through telomere attrition with potential long-term consequences for health. |
Risk factors for invasive methicillin-resistant Staphylococcus aureus infection after recent discharge from an acute care hospitalization, 2011-2013
Epstein L , Mu Y , Belflower R , Scott J , Ray S , Dumyati G , Felsen C , Petit S , Yousey-Hindes K , Nadle J , Pasutti L , Lynfield R , Warnke L , Schaffner W , Leib K , Kallen AJ , Fridkin SK , Lessa FC . Clin Infect Dis 2015 62 (1) 45-52 BACKGROUND: Significant progress has been made in reducing methicillin-resistant Staphylococcus aureus (MRSA) infections among hospitalized patients. However, the decreases in invasive MRSA infections among recently discharged patients have been less substantial. We assessed risk factors for developing invasive MRSA infections following acute care hospitalizations to inform prevention strategies. METHODS: We conducted a prospective, matched case-control study. A case was defined as MRSA cultured from a normally sterile body site in a patient discharged from a hospital within the prior 12 weeks. Eligible cases were identified from 15 hospitals across 6 U.S. states. For each case, two controls were matched on hospital, month of discharge, and age group. Medical record reviews and telephone interviews were performed. Conditional logistic regression was used to identify independent risk factors for post-discharge invasive MRSA. RESULTS: From February 1, 2011 through March 31, 2013, 194 cases and 388 matched controls were enrolled. The median time between hospital discharge and positive culture was 23 days (range: 1-83 days). Factors independently associated with post-discharge MRSA infection included MRSA colonization (mOR 7.71, 95%CI 3.60-16.51), discharge to a nursing home (mOR 2.65, 95%CI 1.41-4.99), presence of a chronic wound during the post-discharge period (mOR 4.41, 95%CI 2.14-9.09), and discharge with a central venous catheter (CVC) (mOR 2.16, 95%CI 1.13-4.99) or a non-CVC invasive device (mOR 3.03, 95%CI 1.24-7.39) in place. CONCLUSION: Prevention efforts should target patients with MRSA colonization or those with invasive devices or chronic wounds at hospital discharge. 
In addition, MRSA prevention efforts in nursing homes are warranted. |
Seroprevalence of 9 human papillomavirus types in the United States, 2005-2006
Liu G , Markowitz LE , Hariri S , Panicker G , Unger ER . J Infect Dis 2015 213 (2) 191-8 BACKGROUND: A 9-valent human papillomavirus (HPV) vaccine, licensed in 2014, prevents 4 HPV types targeted by the quadrivalent vaccine (6/11/16/18) and 5 additional high-risk (HR) types (31/33/45/52/58). Measuring seropositivity before vaccine introduction provides baseline data on exposure to types targeted by vaccines. METHODS: We determined seroprevalence of HPV 6/11/16/18/31/33/45/52/58 among 4943 persons aged 14-59 years who participated in the National Health and Nutrition Examination Survey, 2005-2006. RESULTS: Among females, seroprevalence was 40.5% for any of the 9 vaccine types, 30.0% for any 7 HR types (16/18/31/33/45/52/58), 18.3% for any 5 additional types (31/33/45/52/58), and 19.0% for 16/18. Compared with non-Hispanic whites, non-Hispanic blacks had higher seroprevalence of 31/33/45/52/58 (36.8% vs 15.9%) and 16/18 (30.1% vs 17.8%), while Mexican Americans had higher seroprevalence of 31/33/45/52/58 (23.6% vs 15.9%) (P < .05 for all). In multivariable analyses of data from females, race/ethnicity, number of sex partners, and age were associated with 16/18 and 31/33/45/52/58 seropositivity. Seropositivity was lower among males than among females (P < .001 for all type categories). CONCLUSIONS: In 2005-2006, about 40% of females and 20% of males had serological evidence of exposure to ≥1 of 9 HPV types. Seroprevalence of all type categories, especially HPV 31/33/45/52/58 among females, varied by race/ethnicity. |
Licensure of a diphtheria and tetanus toxoids and acellular pertussis adsorbed and inactivated poliovirus vaccine and guidance for use as a booster dose
Liang J , Wallace G , Mootrey G . MMWR Morb Mortal Wkly Rep 2015 64 (34) 948-9 On March 24, 2015, the Food and Drug Administration licensed an additional combined diphtheria and tetanus toxoids and acellular pertussis adsorbed (DTaP) and inactivated poliovirus (IPV) vaccine (DTaP-IPV) (Quadracel, Sanofi Pasteur Inc.). Quadracel is the second DTaP-IPV vaccine to be licensed for use among children aged 4 through 6 years in the United States (1). Quadracel is approved for administration as a fifth dose in the DTaP series and as a fourth or fifth dose in the IPV series in children aged 4 through 6 years who have received 4 doses of DTaP-IPV-Hib (Pentacel, Sanofi Pasteur) and/or DTaP (Daptacel, Sanofi Pasteur) vaccine (2,3). This report summarizes the indications for Quadracel vaccine and provides guidance from the Advisory Committee on Immunization Practices (ACIP) for its use. |
Monitoring effect of human papillomavirus vaccines in US population, Emerging Infections Program, 2008-2012
Hariri S , Markowitz LE , Bennett NM , Niccolai LM , Schafer S , Bloch K , Park IU , Scahill MW , Julian P , Abdullah N , Levine D , Whitney E , Unger ER , Steinau M , Bauer HM , Meek J , Hadler J , Sosa L , Powell SE , Johnson ML , Hpv-Impact Working Group . Emerg Infect Dis 2015 21 (9) 1557-61 In 2007, five Emerging Infections Program (EIP) sites were funded to determine the feasibility of establishing a population-based surveillance system for monitoring the effect of human papillomavirus (HPV) vaccine on pre-invasive cervical lesions. The project involved active population-based surveillance of cervical intraepithelial neoplasia grades 2 and 3 and adenocarcinoma in situ as well as associated HPV types in women >18 years of age residing in defined catchment areas; collecting relevant clinical information and detailed HPV vaccination histories for women 18-39 years of age; and estimating the annual rate of cervical cancer screening among the catchment area population. The first few years of the project provided key information, including data on HPV type distribution, before expected effect of vaccine introduction. The project's success exemplifies the flexibility of EIP's network to expand core activities to include emerging surveillance needs beyond acute infectious diseases. Project results contribute key information regarding the impact of HPV vaccination in the United States. |
Quantifying and explaining accessibility with application to the 2009 H1N1 vaccination campaign
Heier Stamm JL , Serban N , Swann J , Wortley P . Health Care Manag Sci 2015 20 (1) 76-93 Accessibility and equity across populations are important measures in public health. This paper is specifically concerned with potential spatial accessibility, or the opportunity to receive care as moderated by geographic factors, and with horizontal equity, or fairness across populations regardless of need. Both accessibility and equity were goals of the 2009 vaccination campaign for the novel H1N1 influenza virus, including during the period when demand for vaccine exceeded supply. Distribution system design can influence equity and accessibility at the local level. We develop a general methodology that integrates optimization, game theory, and spatial statistics to measure potential spatial accessibility across a network, where we quantify spatial accessibility by travel distance and scarcity. We estimate and make inference on local (census-tract level) associations between accessibility and geographic, socioeconomic, and health care infrastructure factors to identify potential inequities in vaccine accessibility during the 2009 H1N1 vaccination campaign in the U.S. We find that there were inequities in access to vaccine at the local level and that these were associated with factors including population density and health care infrastructure. Our methodology for measuring and explaining accessibility leads to policy recommendations for federal, state, and local public health officials. The spatial-specific results inform the development of equitable distribution plans for future public health efforts. |
Immunogenicity of the RTS,S/AS01 malaria vaccine and implications for duration of vaccine efficacy: secondary analysis of data from a phase 3 randomised controlled trial
White MT , Verity R , Griffin JT , Asante KP , Owusu-Agyei S , Greenwood B , Drakeley C , Gesase S , Lusingu J , Ansong D , Adjei S , Agbenyega T , Ogutu B , Otieno L , Otieno W , Agnandji ST , Lell B , Kremsner P , Hoffman I , Martinson F , Kamthunzu P , Tinto H , Valea I , Sorgho H , Oneko M , Otieno K , Hamel MJ , Salim N , Mtoro A , Abdulla S , Aide P , Sacarlal J , Aponte JJ , Njuguna P , Marsh K , Bejon P , Riley EM , Ghani AC . Lancet Infect Dis 2015 15 (12) 1450-8 BACKGROUND: The RTS,S/AS01 malaria vaccine targets the circumsporozoite protein, inducing antibodies associated with the prevention of Plasmodium falciparum infection. We assessed the association between anti-circumsporozoite antibody titres and the magnitude and duration of vaccine efficacy using data from a phase 3 trial done between 2009 and 2014. METHODS: Using data from 8922 African children aged 5-17 months and 6537 African infants aged 6-12 weeks at first vaccination, we analysed the determinants of immunogenicity after RTS,S/AS01 vaccination with or without a booster dose. We assessed the association between the incidence of clinical malaria and anti-circumsporozoite antibody titres using a model of anti-circumsporozoite antibody dynamics and the natural acquisition of protective immunity over time. FINDINGS: RTS,S/AS01-induced anti-circumsporozoite antibody titres were greater in children aged 5-17 months than in those aged 6-12 weeks. Pre-vaccination anti-circumsporozoite titres were associated with lower immunogenicity in children aged 6-12 weeks and higher immunogenicity in those aged 5-17 months. The immunogenicity of the booster dose was strongly associated with immunogenicity after primary vaccination. Anti-circumsporozoite titres wane according to a biphasic exponential distribution. 
In participants aged 5-17 months, the half-life of the short-lived component of the antibody response was 45 days (95% credible interval 42-48) and that of the long-lived component was 591 days (557-632). After primary vaccination 12% (11-13) of the response was estimated to be long-lived, rising to 30% (28-32%) after a booster dose. An anti-circumsporozoite antibody titre of 121 EU/mL (98-153) was estimated to prevent 50% of infections. Waning anti-circumsporozoite antibody titres predict the duration of efficacy against clinical malaria across different age categories and transmission intensities, and efficacy wanes more rapidly at higher transmission intensity. INTERPRETATION: Anti-circumsporozoite antibody titres are a surrogate of protection for the magnitude and duration of RTS,S/AS01 efficacy, with or without a booster dose, providing a valuable surrogate of effectiveness for new RTS,S formulations in the age groups considered. FUNDING: UK Medical Research Council. |
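The biphasic exponential waning described above can be written as a weighted sum of two exponential decays. The sketch below uses the abstract's point estimates for children aged 5-17 months after primary vaccination (12% long-lived fraction, half-lives of 45 and 591 days); the peak titre in the example is an assumed, illustrative value, not a trial result:

```python
import math

def csp_titre(t_days, peak_titre, rho_long=0.12,
              half_short=45.0, half_long=591.0):
    """Anti-circumsporozoite titre t days after peak: a fraction
    rho_long of the response decays with the long half-life and the
    remainder with the short half-life."""
    decay = lambda t, half_life: math.exp(-math.log(2.0) * t / half_life)
    return peak_titre * (rho_long * decay(t_days, half_long)
                         + (1.0 - rho_long) * decay(t_days, half_short))
```

With an assumed peak of 600 EU/mL, the modelled titre falls below the reported 50%-protective level of 121 EU/mL well within the first year, consistent with the waning efficacy the authors describe.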
Inactivated poliovirus vaccine given alone or in a sequential schedule with bivalent oral poliovirus vaccine in Chilean infants: a randomised, controlled, open-label, phase 4, non-inferiority study
O'Ryan M , Bandyopadhyay AS , Villena R , Espinoza M , Novoa J , Weldon WC , Oberste MS , Self S , Borate BR , Asturias EJ , Clemens R , Orenstein W , Jimeno J , Ruttimann R , Costa Clemens SA . Lancet Infect Dis 2015 15 (11) 1273-82 BACKGROUND: Bivalent oral poliovirus vaccine (bOPV; types 1 and 3) is expected to replace trivalent OPV (tOPV) globally by April, 2016, preceded by the introduction of at least one dose of inactivated poliovirus vaccine (IPV) in routine immunisation programmes to eliminate vaccine-associated or vaccine-derived poliomyelitis from serotype 2 poliovirus. Because data are needed on sequential IPV-bOPV schedules, we assessed the immunogenicity of two different IPV-bOPV schedules compared with an all-IPV schedule in infants. METHODS: We did a randomised, controlled, open-label, non-inferiority trial with healthy, full-term (>2.5 kg birthweight) infants aged 8 weeks (+/- 7 days) at six well-child clinics in Santiago, Chile. We used supplied lists to randomly assign infants (1:1:1) to receive three polio vaccinations (IPV by injection or bOPV as oral drops) at age 8, 16, and 24 weeks in one of three sequential schedules: IPV-bOPV-bOPV, IPV-IPV-bOPV, or IPV-IPV-IPV. We did the randomisation with blocks of 12 stratified by study site. All analyses were done in a masked manner. Co-primary outcomes were non-inferiority of the bOPV-containing schedules compared with the all-IPV schedule for seroconversion (within a 10% margin) and antibody titres (within two-thirds log2 titres) to poliovirus serotypes 1 and 3 at age 28 weeks, analysed in the per-protocol population. Secondary outcomes were seroconversion and titres to serotype 2 and faecal shedding for 4 weeks after a monovalent OPV type 2 challenge at age 28 weeks. Safety analyses were done in the intention-to-treat population. This trial is registered with ClinicalTrials.gov, number NCT01841671, and is closed to new participants. 
FINDINGS: Between April 25 and August 1, 2013, we assigned 570 infants to treatment: 190 to IPV-bOPV-bOPV, 192 to IPV-IPV-bOPV, and 188 to IPV-IPV-IPV. 564 (99%) were vaccinated and included in the intention-to-treat cohort, and 537 (94%) in the per-protocol analyses. In the IPV-bOPV-bOPV, IPV-IPV-bOPV, and IPV-IPV-IPV groups, respectively, the proportions of children with seroconversion to type 1 poliovirus were 166 (98.8%) of 168, 95% CI 95.8-99.7; 178 (100%), 97.9-100.0; and 175 (100%), 97.9-100.0. Proportions with seroconversion to type 3 poliovirus were 163 (98.2%) of 166, 94.8-99.4; 177 (100%), 97.9-100.0; and 172 (98.9%) of 174, 95.9-99.7. Non-inferiority was thus shown for the bOPV-containing schedules compared with the all-IPV schedule, with no significant differences between groups. In the IPV-bOPV-bOPV, IPV-IPV-bOPV, and IPV-IPV-IPV groups, respectively, the proportions of children with seroprotective antibody titres to type 1 poliovirus were 168 (98.8%) of 170, 95% CI 95.8-99.7; 181 (100%), 97.9-100.0; and 177 (100%), 97.9-100.0. Proportions to type 3 poliovirus were 166 (98.2%) of 169, 94.9-99.4; 180 (100%), 97.9-100.0; and 174 (98.9%) of 176, 96.0-99.7. Non-inferiority comparisons could not be done for this outcome because median titres for the groups receiving OPV were greater than the assay's upper limit of detection (log2 titres >10.5). The proportions of children seroconverting to type 2 poliovirus in the IPV-bOPV-bOPV, IPV-IPV-bOPV, and IPV-IPV-IPV groups, respectively, were 130 (77.4%) of 168, 95% CI 70.5-83.0; 169 (96.0%) of 176, 92.0-98.0; and 175 (100%), 97.8-100. IPV-bOPV schedules resulted in almost a 0.3 log reduction of type 2 faecal shedding compared with the IPV-only schedule. No participants died during the trial; 81 serious adverse events were reported, of which one was thought to be possibly vaccine-related (intestinal intussusception). 
INTERPRETATION: Seroconversion rates against polioviruses types 1 and 3 were non-inferior in sequential schedules containing IPV and bOPV, compared with an all-IPV schedule, and proportions of infants with protective antibodies were high after all three schedules. One or two doses of bOPV after IPV boosted intestinal immunity for poliovirus type 2, suggesting possible cross protection. Additionally, there was evidence of humoral priming for type 2 from one dose of IPV. Our findings could give policy makers flexibility when choosing a vaccination schedule, especially when trying to eliminate vaccine-associated and vaccine-derived poliomyelitis. FUNDING: Bill & Melinda Gates Foundation. |
Intervals between PCV13 and PPSV23 vaccines: recommendations of the Advisory Committee on Immunization Practices (ACIP)
Kobayashi M , Bennett NM , Gierke R , Almendares O , Moore MR , Whitney CG , Pilishvili T . MMWR Morb Mortal Wkly Rep 2015 64 (34) 944-7 Two pneumococcal vaccines are currently licensed for use in the United States: the 13-valent pneumococcal conjugate vaccine (PCV13 [Prevnar 13, Wyeth Pharmaceuticals, Inc., a subsidiary of Pfizer Inc.]) and the 23-valent pneumococcal polysaccharide vaccine (PPSV23 [Pneumovax 23, Merck and Co., Inc.]). The Advisory Committee on Immunization Practices (ACIP) currently recommends that a dose of PCV13 be followed by a dose of PPSV23 in all adults aged ≥65 years who have not previously received pneumococcal vaccine and in persons aged ≥2 years who are at high risk for pneumococcal disease because of underlying medical conditions (Table) (1-4). The recommended intervals between PCV13 and PPSV23 given in series differ by age and risk group and the order in which the two vaccines are given (1-4). |
Antecedent causes of a measles resurgence in the Democratic Republic of the Congo
Scobie HM , Ilunga BK , Mulumba A , Shidi C , Coulibaly T , Obama R , Tamfum JJM , Simbu EP , Smit SB , Masresha B , Perry RT , Alleman MM , Kretsinger K , Goodson J . Pan Afr Med J 2015 21 (30) 30 INTRODUCTION: Despite accelerated measles control efforts, a massive measles resurgence occurred in the Democratic Republic of the Congo (DRC) starting in mid-2010, prompting an investigation into likely causes. METHODS: We conducted a descriptive epidemiological analysis using measles immunization and surveillance data to understand the causes of the measles resurgence and to develop recommendations for elimination efforts in DRC. RESULTS: During 2004-2012, performance indicator targets for case-based surveillance and routine measles vaccination were not met. Estimated coverage with the routine first dose of measles-containing vaccine (MCV1) increased from 57% to 73%. Phased supplementary immunization activities (SIAs) were conducted starting in 2002, in some cases with sub-optimal coverage (≤95%). In 2010, SIAs in five of 11 provinces were not implemented as planned, resulting in a prolonged interval between SIAs, and a missed birth cohort in one province. During July 1, 2010-December 30, 2012, high measles attack rates (>100 cases per 100,000 population) occurred in provinces that had estimated MCV1 coverage lower than the national estimate and did not implement planned 2010 SIAs. The majority of confirmed case-patients were aged <10 years (87%) and unvaccinated or with unknown vaccination status (75%). Surveillance detected two genotype B3 and one genotype B2 measles virus strains that were previously identified in the region. CONCLUSION: The resurgence was likely caused by an accumulation of unvaccinated, measles-susceptible children due to low MCV1 coverage and suboptimal SIA implementation. 
To achieve the regional goal of measles elimination by 2020, efforts are needed in DRC to improve case-based surveillance and increase two-dose measles vaccination coverage through routine services and SIAs. |
Children and adolescents unvaccinated against measles: geographic clustering, parents' beliefs, and missed opportunities
Smith PJ , Marcuse EK , Seward JF , Zhao Z , Orenstein WA . Public Health Rep 2015 130 (5) 485-504 OBJECTIVE: We evaluated the extent to which children and adolescents were not vaccinated against measles ("unvaccinated"), clustering within U.S. counties, and factors associated with unvaccination, including parents' vaccine-related beliefs and missed opportunities. METHODS: We analyzed data from the 2010-2013 National Immunization Survey (NIS) and NIS-Teen Survey of households with 19- to 35-month-old children and 13- to 17-year-old adolescents, respectively. We used provider-reported vaccination histories to assess measles vaccination status. RESULTS: In 2013, 7.5% of children and 4.5% of adolescents were unvaccinated against measles. Four-fifths (80.0%) of unvaccinated children lived in counties containing 41.9% of the nation's children, and 80.0% of unvaccinated adolescents lived in counties containing 30.4% of the nation's adolescents. Multivariable statistical analyses found that 74.6% of children who were unvaccinated against measles missed being vaccinated for reasons other than parents' negative vaccine-related beliefs, and 89.6% could be deemed as having at least one missed opportunity for being vaccinated against measles because they were administered at least one dose of other recommended vaccines after 12 months of age. Among adolescents, multivariable analyses found that only demographic factors, not vaccine-related parental beliefs, were independently associated with being unvaccinated. CONCLUSIONS: Reasons other than negative vaccine-related beliefs, including missed opportunities, accounted for the vast majority of unvaccinated children and adolescents. |
Practical comparison of aberration detection algorithms for biosurveillance systems
Zhou H , Burkom H , Winston C , Dey A , Ajani U . J Biomed Inform 2015 57 446-55 National syndromic surveillance systems require optimal anomaly detection methods. For method performance comparison, we injected multi-day signals stochastically drawn from lognormal distributions into time series of aggregated daily visit counts from the U.S. Centers for Disease Control and Prevention's BioSense syndromic surveillance system. The time series corresponded to three different syndrome groups: rash, upper respiratory infection, and gastrointestinal illness. We included a sample of facilities with data reported every day and with median daily syndromic counts ≥1 over the entire study period. We compared anomaly detection methods of five control chart adaptations, a linear regression model, and a Poisson regression model. We assessed the sensitivity and timeliness of these methods for detection of multi-day signals. At daily background alert rates of 1% and 2%, the sensitivities and timeliness ranged from 24%-77% and 3.3-6.1 days, respectively. The overall sensitivity and timeliness increased substantially after stratification by weekday versus weekend and holiday. Adjusting the baseline syndromic count by the total number of facility visits gave consistently improved sensitivity and timeliness without stratification, and it provided better performance when combined with stratification. The daily syndrome/total-visit proportion method did not improve the performance. In general, alerting based on linear regression outperformed control chart-based methods. A Poisson regression model obtained the best sensitivity in the series with high-count data. |
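The abstract above does not specify which control chart adaptations were compared. As one illustration of the general technique, the following is a minimal sketch of an EARS-C2-style detector, a control-chart method commonly used as a baseline in syndromic surveillance comparisons; the window length, guard band, and threshold here are illustrative assumptions, not the study's settings.

```python
import statistics

def c2_alerts(counts, baseline=7, guard=2, threshold=3.0):
    """EARS-C2-style detector: flag days whose count exceeds the
    baseline-window mean by more than `threshold` standard deviations.
    A guard band separates the baseline window from the test day so a
    growing outbreak does not inflate its own baseline."""
    alerts = []
    for t in range(baseline + guard, len(counts)):
        window = counts[t - guard - baseline : t - guard]
        mu = statistics.mean(window)
        sd = statistics.pstdev(window) or 1.0  # avoid division by zero
        if (counts[t] - mu) / sd > threshold:
            alerts.append(t)
    return alerts

# Flat background of ~10 daily visits with an injected spike on day 9
series = [10, 12, 9, 11, 10, 13, 11, 10, 12, 40, 11, 10]
print(c2_alerts(series))  # → [9]
```

Stratifying by weekday versus weekend/holiday, as the study did, would amount to running such a detector on day-of-week-specific baselines rather than one pooled window.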
A qualitative evaluation of the 2005-2011 National Academic Centers of Excellence in Youth Violence Prevention Program
Holland KM , Vivolo-Kantor AM , Cruz JD , Massetti GM , Mahendra R . Eval Program Plann 2015 53 80-90 The Centers for Disease Control and Prevention's Division of Violence Prevention (DVP) funded eight National Academic Centers of Excellence (ACEs) in Youth Violence Prevention from 2005 to 2010 and two Urban Partnership Academic Centers of Excellence (UPACEs) in Youth Violence Prevention from 2006 to 2011. The ACEs and UPACEs constitute DVP's 2005-2011 ACE Program. ACE Program goals include partnering with communities to promote youth violence (YV) prevention and fostering connections between research and community practice. This article describes a qualitative evaluation of the 2005-2011 ACE Program using an innovative approach for collecting and analyzing data from multiple large research centers via a web-based Information System (ACE-IS). The ACE-IS was established as an efficient mechanism to collect and document ACE research and programmatic activities. Performance indicators for the ACE Program were established in an ACE Program logic model. Data on performance indicators were collected through the ACE-IS biannually. Data assessed Centers' ability to develop, implement, and evaluate YV prevention activities. Performance indicator data demonstrate substantial progress on Centers' research in YV risk and protective factors, community partnerships, and other accomplishments. Findings provide important lessons learned, illustrate progress made by the Centers, and point to new directions for YV prevention research and programmatic efforts. |
Compliance to two city convenience store ordinance requirements
Chaumont Menendez CK , Amandus HE , Wu N , Hendricks SA . Inj Prev 2015 22 (2) 117-22 BACKGROUND: Robbery-related homicides and assaults are the leading cause of death in retail businesses. Robbery reduction approaches focus on compliance with Crime Prevention Through Environmental Design (CPTED) guidelines. PURPOSE: We evaluated the level of compliance with CPTED guidelines specified by convenience store safety ordinances effective in 2010 in Dallas and Houston, Texas, USA. METHODS: Convenience stores were defined as businesses less than 10 000 square feet that sell grocery items. Store managers were interviewed about store ordinance requirements from August to November 2011 in a random sample of 594 (289 in Dallas, 305 in Houston) convenience stores that were open before and after the effective dates of their city's ordinance. Data were collected in 2011 and analysed in 2012-2014. RESULTS: Overall, 9% of stores were in full compliance, although 79% reported being registered with the police departments as compliant. Compliance was consistently significantly higher in Dallas than in Houston for many requirements and by store type. Compliance was lower among single owner-operator stores compared with corporate/franchise stores. Compliance with individual requirements was lowest for signage and visibility. CONCLUSIONS: Full compliance with the required safety measures is consistent with industry 'best practices' and evidence-based workplace violence prevention research findings. In Houston and Dallas, compliance was higher for some CPTED requirements but not for the less costly approaches that are also the most straightforward to adopt. |
Common and distinct mechanisms of induced pulmonary fibrosis by particulate and soluble chemical fibrogenic agents.
Dong J , Yu X , Porter DW , Battelli LA , Kashon ML , Ma Q . Arch Toxicol 2015 90 (2) 385-402 Pulmonary fibrosis results from the excessive deposition of collagen fibers and scarring in the lungs with or without an identifiable cause. The mechanism(s) underlying lung fibrosis development is poorly understood, and effective treatment is lacking. Here we compared mouse lung fibrosis induced by pulmonary exposure to prototypical particulate (crystalline silica) or soluble chemical (bleomycin or paraquat) fibrogenic agents to identify the underlying mechanisms. Young male C57BL/6J mice were given silica (2 mg), bleomycin (0.07 mg), or paraquat (0.02 mg) by pharyngeal aspiration. All treatments induced significant inflammatory infiltration and collagen deposition, manifesting as fibrotic foci in silica-exposed lungs or diffuse fibrosis in bleomycin- or paraquat-exposed lungs on day 7 post-exposure, at which time the lesions reached their peaks and represented a junction of transition from an acute response to chronic fibrosis. Lung genome-wide gene expression was analyzed, and differential gene expression was confirmed by quantitative RT-PCR, immunohistochemistry, and immunoblotting for representative genes to demonstrate their induced expression and localization in fibrotic lungs. Canonical signaling pathways, gene ontology, and upstream transcription networks modified by each agent were identified. In particular, these inducers elicited marked proliferative responses; at the same time, silica preferentially activated innate immune functions and the defense against foreign bodies, whereas bleomycin and paraquat boosted responses related to cell adhesion, platelet activation, extracellular matrix remodeling, and wound healing. 
This study identified, for the first time, the shared and unique genes, signaling pathways, and biological functions regulated by particulate and soluble chemical fibrogenic agents during lung fibrosis, providing insights into the mechanisms underlying human lung fibrotic diseases. |
Remote ischemic conditioning temporarily improves antioxidant defense
Costa FL , Teixeira RK , Yamaki VN , Valente AL , Silva AM , Brito MV , Percario S . J Surg Res 2015 200 (1) 105-9 BACKGROUND: Remote ischemic conditioning (RIC) is the most promising surgical approach to mitigate ischemia and reperfusion (IR) injury. It consists of performing brief cycles of IR in tissues other than those exposed to ischemia. The underlying mechanisms of the induced protection are poorly understood, so we evaluated whether RIC works by enhancing the antioxidant defense of the liver and kidney before IR injury. MATERIALS AND METHODS: Twenty-one Wistar rats were assigned to three groups: sham, in which the same surgical procedure as in the remaining groups was performed but no RIC was carried out; RIC 10, in which RIC was performed without inducing abdominal organ ischemia, and the liver and kidney were harvested 10 min after the end of the RIC protocol; and RIC 60, the same procedure as in RIC 10, but with the liver and kidney harvested 60 min after the end of the protocol. RIC consisted of three cycles of 5-min left hind limb ischemia followed by 5-min left hind limb perfusion, lasting 30 min in total. Samples were used to measure tissue total antioxidant capacity. RESULTS: The RIC protocol increased both liver (1.064 ± 0.26 mM/L) and kidney (1.310 ± 0.17 mM/L) antioxidant capacity after 10 min when compared with sham (liver, 0.759 ± 0.10 mM/L and kidney, 1.08 ± 0.15 mM/L). Sixty minutes after the RIC protocol, no enhancement of liver (0.687 ± 0.13 mM/L) or kidney (1.09 ± 0.15 mM/L) antioxidant capacity was detected. CONCLUSIONS: RIC works through a temporary, short-term enhancement of liver and kidney cells' antioxidant defenses to avoid the deleterious consequences of a future IR injury. |
Review of telemicrobiology
Rhoads DD , Mathison BA , Bishop HS , da Silva AJ , Pantanowitz L . Arch Pathol Lab Med 2015 140 (4) 362-70 CONTEXT: Microbiology laboratories are continually pursuing means to improve quality, rapidity, and efficiency of specimen analysis in the face of limited resources. One means by which to achieve these improvements is through the remote analysis of digital images. Telemicrobiology enables the remote interpretation of images of microbiology specimens. To date, the practice of clinical telemicrobiology has not been thoroughly reviewed. OBJECTIVE: To identify the various methods that can be employed for telemicrobiology, including emerging technologies that may provide value to the clinical laboratory. DATA SOURCES: Peer-reviewed literature, conference proceedings, meeting presentations, and expert opinions pertaining to telemicrobiology have been evaluated. CONCLUSIONS: A number of modalities have been employed for telemicroscopy, including static capture techniques, whole slide imaging, video telemicroscopy, mobile devices, and hybrid systems. Telemicrobiology has been successfully implemented for several applications, including routine primary diagnosis, expert teleconsultation, and proficiency testing. Emerging areas of telemicrobiology include digital plate reading of bacterial cultures, mobile health applications, and computer-augmented analysis of digital images. To date, static image capture techniques have been the most widely used modality for telemicrobiology, despite newer technologies being available that may produce better quality interpretations. Telemicrobiology adds value, quality, and efficiency to the clinical microbiology laboratory, and increased adoption of telemicrobiology is anticipated. |
Mapping influenza transmission in the ferret model to transmission in humans
Buhnerkempe MG , Gostic K , Park M , Ahsan P , Belser JA , Lloyd-Smith JO . Elife 2015 4 The controversy surrounding 'gain-of-function' experiments on high-consequence avian influenza viruses has highlighted the role of ferret transmission experiments in studying the transmission potential of novel influenza strains. However, the mapping between influenza transmission in ferrets and in humans is unsubstantiated. We address this gap by compiling and analyzing 240 estimates of influenza transmission in ferrets and humans. We demonstrate that estimates of ferret secondary attack rate (SAR) explain 66% of the variation in human SAR estimates at the subtype level. Further analysis shows that ferret transmission experiments have potential to identify influenza viruses of concern for epidemic spread in humans, though small sample sizes and biological uncertainties prevent definitive classification of human transmissibility. Thus, ferret transmission experiments provide valid predictions of pandemic potential of novel influenza strains, though results should continue to be corroborated by targeted virological and epidemiological research. |
Quantification of metabolites for assessing human exposure to soapberry toxins hypoglycin A and methylenecyclopropylglycine
Isenberg SL , Carter MD , Graham LA , Mathews TP , Johnson D , Thomas JD , Pirkle JL , Johnson RC . Chem Res Toxicol 2015 28 (9) 1753-9 Ingestion of the soapberry fruit toxins hypoglycin A and methylenecyclopropylglycine has been linked to public health challenges worldwide. In 1976, over 100 years after Jamaican vomiting sickness (JVS) was first reported, the cause of JVS was linked to the ingestion of the toxin hypoglycin A produced by ackee fruit. A structural analogue of hypoglycin A, methylenecyclopropylglycine (MCPG), was implicated as the cause of an acute encephalitis syndrome (AES). Much of the evidence linking hypoglycin A and MCPG to these diseases has been largely circumstantial due to the lack of an analytical method for specific metabolites. This study presents an analytical approach to identify and quantify specific urine metabolites of exposure to hypoglycin A and MCPG. The metabolites are excreted in urine as the glycine adducts methylenecyclopropylacetyl-glycine (MCPA-Gly) and methylenecyclopropylformyl-glycine (MCPF-Gly). These metabolites were processed by isotope dilution, separated by reverse-phase liquid chromatography, and monitored by electrospray ionization tandem mass spectrometry. The analytical response ratio was linearly proportional to the concentration of MCPF-Gly and MCPA-Gly in urine from 0.10 to 20 μg/mL with a correlation coefficient of r > 0.99. The assay demonstrated accuracy ≥80% and precision ≤20% RSD across the calibration range. This method has been applied to assess exposure to hypoglycin A and MCPG as part of a larger public health initiative and was used to provide the first reported identification of MCPF-Gly and MCPA-Gly in human urine. |
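The quantification described above rests on a linear calibration of the analyte/internal-standard response ratio against concentration (r > 0.99 across the calibration range). A minimal back-calculation sketch of that general workflow follows; the calibrator levels and area ratios are hypothetical, and the study's actual isotope-dilution sample processing is not reproduced here.

```python
def linear_fit(x, y):
    """Ordinary least-squares fit y = slope*x + intercept,
    returning the Pearson correlation coefficient as well."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / (sxx * syy) ** 0.5
    return slope, intercept, r

# Hypothetical calibrators (µg/mL) and analyte/internal-standard area ratios
conc = [0.10, 0.5, 1, 2, 5, 10, 20]
ratio = [0.021, 0.10, 0.20, 0.41, 1.0, 2.0, 4.1]

slope, intercept, r = linear_fit(conc, ratio)

# Back-calculate an unknown's concentration from its response ratio
unknown_ratio = 1.5
print((unknown_ratio - intercept) / slope)
```

With near-linear data like this, r exceeds 0.99, mirroring the acceptance criterion reported for the assay; the internal-standard ratio is what makes the response robust to recovery and injection variability.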
Evaluating protection against infectious bronchitis virus by clinical signs, ciliostasis, challenge virus detection, and histopathology
Jackwood MW , Jordan BJ , Roh H J , Hilt DA , Williams SM . Avian Dis 2015 59 (3) 368-374 In this study, we examined the association among clinical signs, ciliostasis, virus detection, and histopathology for evaluating protection of vaccinated chickens against homologous and heterologous infectious bronchitis virus (IBV) challenge. At 5 days following challenge with IBV, we found a good correlation among clinical signs, ciliostasis in the trachea, challenge virus detection, and microscopic lesions in the trachea, with all four criteria being negative in fully protected birds and positive in fully susceptible birds. In partially protected birds we observed clinical signs and detected challenge virus; however, the ciliated epithelium was intact. In a second experiment, we challenged fully protected, partially protected, and fully susceptible birds with IBV, and then at 5 days postchallenge we gave the birds an opportunistic bacterium intranasally. Twenty Bordetella avium colonies were recovered from one of five fully protected birds, and only five colonies were isolated from two of five partially protected birds without ciliostasis, whereas in birds with ciliostasis, numerous colonies were isolated. Obviously, decreasing IBV infection and replication in the upper respiratory tract will decrease transmission and mutations, leading to variant viruses, and herein we demonstrate that protection of the cilia will decrease secondary bacterial infections, which have been shown to lead to condemnations and increased mortality. Thus, it appears that examining both criteria would be important when evaluating IBV vaccine efficacy. |
Association between Shigella infection and diarrhea varies based on location and age of children
Lindsay B , Saha D , Sanogo D , Das SK , Farag TH , Nasrin D , Li S , Panchalingam S , Levine MM , Kotloff K , Nataro JP , Magder L , Hungerford L , Oundo J , Hossain MA , Adeyemi M , Stine OC , Faruque AS . Am J Trop Med Hyg 2015 93 (5) 918-24 Molecular identification of the invasion plasmid antigen-H (ipaH) gene has been established as a useful detection mechanism for Shigella spp. The Global Enteric Multicenter Study (GEMS) identified the etiology and burden of moderate-to-severe diarrhea (MSD) in sub-Saharan Africa and south Asia using a case-control study and traditional culture techniques. Here, we used quantitative polymerase chain reaction (qPCR) to identify Shigella spp. in 2,611 stool specimens from GEMS and compared these results to those using culture. Demographic and nutritional characteristics were assessed as possible risk factors. The qPCR identified more cases of shigellosis than culture; however, the distribution of demographic characteristics was similar by both methods. In regression models adjusting for Shigella quantity, age, and site, children who were exclusively breast-fed had significantly lower odds of MSD compared with children who were not breast-fed (odds ratio [OR] = 0.47, 95% confidence interval (CI) = 0.28-0.81). The association between Shigella quantity and MSD increased with age, with a peak in children of 24-35 months of age (OR = 8.2, 95% CI = 4.3-15.7) and the relationship between Shigella quantity and disease was greatest in Bangladesh (OR = 13.2, 95% CI = 7.3-23.8). This study found that qPCR identified more cases of Shigella and age, site, and breast-feeding status were significant risk factors for MSD. |
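The odds ratios above come from regression models adjusting for Shigella quantity, age, and site; in the unadjusted case, an odds ratio and its log-based (Woolf) 95% confidence interval can be computed directly from a 2x2 table. A minimal sketch, using illustrative counts that are not the GEMS data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf (log-based) confidence interval from a
    2x2 table: a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative only: breast-fed (exposed) vs. not, MSD cases vs. controls
print(odds_ratio_ci(20, 80, 50, 100))  # OR = 0.5, protective association
```

An OR below 1 with a CI excluding 1, as in the breast-feeding result above (OR = 0.47, CI 0.28-0.81), indicates lower odds of the outcome in the exposed group.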
CD4+ T cells are not required for suppression of hepatitis B virus replication in the liver of vaccinated chimpanzees
Rybczynska J , Campbell K , Kamili S , Locarnini S , Krawczynski K , Walker CM . J Infect Dis 2015 213 (1) 49-56 Humans vaccinated with hepatitis B virus (HBV) surface antigen (HBsAg) sometimes develop humoral and cellular immunity to HBV proteins such as core and polymerase that are not vaccine components, providing indirect evidence that vaccine-induced immunity is not sterilizing. We previously described CD4+ T-cell immunity against HBsAg and polymerase in chimpanzees after vaccination and HBV challenge. Here, vaccinated chimpanzees with protective levels of anti-HBsAg antibodies were rechallenged with HBV after antibody-mediated CD4+ T-cell depletion. HBV DNA was detected in liver for at least 3 months after rechallenge, but virus replication was suppressed, as revealed by the absence of HBV DNA and HBsAg in serum. These observations provide direct virological evidence for nonsterilizing immunity in individuals with anti-HBsAg antibodies and are consistent with translation of HBV proteins to prime immune responses. They also indicate that CD4+ T cells were not required for suppression of HBV replication in previously vaccinated individuals. |
Comparison of 5 monoclonal antibodies for immunopurification of human butyrylcholinesterase on Dynabeads: KD values, binding pairs, and amino acid sequences
Peng H , Brimijoin S , Hrabovska A , Targosova K , Krejci E , Blake TA , Johnson RC , Masson P , Lockridge O . Chem Biol Interact 2015 240 336-45 Human butyrylcholinesterase (HuBChE) is a stoichiometric bioscavenger of nerve agents and organophosphorus pesticides. Mass spectrometry methods detect stable nerve agent adducts on the active site serine of HuBChE. The first step in sample preparation is immunopurification of HuBChE from plasma. Our goal was to identify monoclonal antibodies that could be used to immunopurify HuBChE on Dynabeads Protein G. Mouse anti-HuBChE monoclonal antibodies were obtained in the form of ascites fluid, dead hybridoma cells stored frozen at -80 degrees C for 30 years, or recently frozen hybridoma cells. RNA from 4 hybridoma cell lines was amplified by PCR for determination of their nucleotide and amino acid sequences. Full-length light and heavy chains were expressed, and the antibodies purified from culture medium. A fifth monoclonal was purchased. The 5 monoclonal antibodies were compared for ability to capture HuBChE from human plasma on Dynabeads Protein G. In addition, they were evaluated for binding affinity by Biacore and ELISA. Epitope mapping by pairing analysis was performed on the Octet Red96 instrument. The 5 monoclonal antibodies, B2 12-1, B2 18-5, 3E8, mAb2, and 11D8, had similar KD values of 10(-9) M for HuBChE. Monoclonal B2 18-5 outperformed the others in the Dynabeads Protein G assay where it captured 97% of the HuBChE in 0.5 ml plasma. Pairing analysis showed that 3E8 and B2 12-1 share the same epitope, 11D8 and B2 18-5 share the same epitope, but mAb2 and B2 12-1 or mAb2 and 3E8 bind to different epitopes on HuBChE. B2 18-5 was selected for establishment of a stable CHO cell line for production of mouse anti-HuBChE monoclonal. |
Development and validation of an ELISA kit (YF MAC-HD) to detect IgM to yellow fever virus
Basile AJ , Goodman C , Horiuchi K , Laven J , Panella AJ , Kosoy O , Lanciotti RS , Johnson BW . J Virol Methods 2015 225 41-8 Yellow fever virus (YFV) is endemic in tropical and sub-tropical regions of the world, with around 180,000 human infections a year occurring in Africa. Serologic testing is the chief laboratory diagnostic means of identifying an outbreak and to inform the decision to commence a vaccination campaign. The World Health Organization disseminates the reagents for YFV testing to African reference laboratories, and the US Centers for Disease Control and Prevention (CDC) is charged with producing and providing these reagents. The CDC M-antibody capture ELISA is a 2-day test, requiring titration of reagents when new lots are received, which leads to inconsistency in testing and wastage of material. Here we describe the development of a kit-based assay (YF MAC-HD) based upon the CDC method, that is completed in approximately 3.5h, with equivocal samples being reflexed to an overnight protocol. The kit exhibits >90% accuracy when compared to the 2-day test. The kits were designed for use with a minimum of equipment and are stored at 4 degrees C, removing the need for freezing capacity. This kit is capable of tolerating temporary sub-optimal storage conditions which will ease shipping or power outage concerns, and a shelf life of >6 months was demonstrated with no deterioration in accuracy. All reagents necessary to run the YF MAC-HD are included in the kit and are single-use, with 8 or 24 sample options per kit. Field trials are envisioned for the near future, which will enable refinement of the method. The use of the YF MAC-HD is anticipated to reduce materials wastage, and improve the quality and consistency of YFV serologic testing in endemic areas. |
Differentiation of chemical reaction activity of various carbon nanotubes using redox potential: classification by physical and chemical structures
Tsuruoka S , Matsumoto H , Castranova V , Porter DW , Yanagisawa T , Saito N , Kobayashi S , Endo M . Carbon N Y 2015 95 302-308 The present study systematically examined the kinetics of a hydroxyl radical scavenging reaction of various carbon nanotubes (CNTs) including double-walled and multi-walled carbon nanotubes (DWCNTs and MWCNTs), and carbon nano peapods (AuCl3@DWCNT). The theoretical model that we recently proposed based on the redox potential of CNTs was used to analyze the experimental results. The reaction kinetics for DWCNTs and thin MWCNTs agreed well with the theoretical model and was consistent with each other. On the other hand, thin and thick MWCNTs behaved differently, which was consistent with the theory. Additionally, surface morphology of CNTs substantially influenced the reaction kinetics, while the doped particles in the center hollow parts of CNTs (AuCl3@DWCNT) shifted the redox potential in a different direction. These findings make it possible to predict the chemical and biological reactivity of CNTs based on the structural and chemical nature and their influence on the redox potential. |
United States and territory policies supporting maternal and neonatal transfer: review of transport and reimbursement
Okoroh EM , Kroelinger CD , Lasswell SM , Goodman DA , Williams AM , Barfield WD . J Perinatol 2015 36 (1) 30-4 OBJECTIVE: Summarize policies that support maternal and neonatal transport among states and territories. STUDY DESIGN: Systematic review of publicly available, web-based information on maternal and neonatal transport for each state and territory in 2014. Information was abstracted from published rules, statutes, regulations, planning documents and program descriptions. Abstracted information was summarized within two categories: transport and reimbursement. RESULTS: Sixty-eight percent of states and 25% of territories had a policy for neonatal transport; 60% of states and one territory had a policy for maternal transport. Sixty-two percent of states had a reimbursement policy for neonatal transport, whereas 20% reimbursed for maternal transport. Thirty-two percent of states had an infant back-transport policy, while 16% had back-transport policies for both mothers and infants. No territories had reimbursement or back-transport policies. CONCLUSION: The lack of development of maternal transport reimbursement and neonatal back-transport policies negatively impacts the achievement of risk-appropriate care, a strategy focused on improving perinatal outcomes. |
Predictors of micronutrient powder sachet coverage in Nepal
Jefferds ME , Mirkovic KR , Subedi GR , Mebrahtu S , Dahal P , Perrine CG . Matern Child Nutr 2015 11 Suppl 4 77-89 Many countries implement micronutrient powder (MNP) programmes to improve the nutritional status of young children. Little is known about the predictors of MNP coverage for different delivery models. We describe MNP coverage of an infant and young child feeding and MNP intervention for children aged 6-23 months comparing two delivery models piloted in rural Nepal: distributing MNPs either by female community health volunteers (FCHVs) or at health facilities (HFs). Cross-sectional household cluster surveys were conducted in four pilot districts among mothers of children 6-23 months after starting MNP distribution. FCHVs in each cluster were also surveyed. We used logistic regression to describe predictors of initial coverage (obtaining a batch of 60 MNP sachets) at 3 months and repeat coverage (≥2 times coverage among eligible children) at 15 months after project launch. At 15 months, initial and repeat coverage were higher in the FCHV model, although no differences were observed at 3 months. Attending an FCHV-led mothers' group meeting where MNP was discussed increased odds of any coverage in both models at 3 months and of repeat coverage in the HF model at 15 months. Perceiving ≥1 positive effects in the child increased odds of repeat coverage in both delivery models. A greater portion of FCHV volunteers from the FCHV model vs. the HF model reported increased burden at 3 and 15 months (not statistically significant). Designing MNP programmes that maximise coverage without overburdening the system can be challenging and more than one delivery model may be needed. |
Enhancing maternal and child health using a combined mother & child health booklet in Kenya
Mudany MA , Sirengo M , Rutherford GW , Mwangi M , Nganga LW , Gichangi A . J Trop Pediatr 2015 61 (6) 442-7 Under Kenyan guidelines, HIV-exposed infants should be tested for HIV DNA at 6 weeks or at first clinical contact thereafter, as infants come for immunization. Following the introduction of early infant diagnosis programmes, however, many infants were not being tested and linked to care and treatment. We developed the Mother & Child Health Booklet to help communicate mothers' obstetric history to infants' healthcare providers and thereby facilitate follow-up and timely management. The booklet contains information on the mother's pregnancy, delivery and postpartum course and her child's growth and development, immunization, nutrition and other data needed to monitor the child to 5 years of age. It replaced three separate clinical record cards. In a 1-year pilot evaluation of the booklet in Nyanza province in 2007-08, the number of HIV DNA tests on infants increased by 34%, from 9,966 to 13,379. The booklet was subsequently distributed nationwide in 2009. Overall, the number of infants tested for HIV DNA rose from 27,000 in 2007 to 60,000 in 2012, which represents approximately 60% of the estimated HIV-exposed infants in Kenya. We believe that the booklet is an important strategy for identifying and treating infected infants and, thus, for progress toward Millennium Development Goal 4. |
Agent Orange exposure and monoclonal gammopathy of undetermined significance: an Operation Ranch Hand veteran cohort study
Landgren O , Shim YK , Michalek J , Costello R , Burton D , Ketchum N , Calvo KR , Caporaso N , Raveche E , Middleton D , Marti G , Vogt RF Jr . JAMA Oncol 2015 1 (8) 1061-8 IMPORTANCE: Multiple myeloma has been classified as exhibiting "limited or suggestive evidence" of an association with exposure to herbicides in Vietnam War veterans. Occupational studies have shown that other pesticides (ie, insecticides, herbicides, fungicides) are associated with excess risk of multiple myeloma and its precursor state, monoclonal gammopathy of undetermined significance (MGUS); however, to our knowledge, no studies have uncovered such an association in Vietnam War veterans. OBJECTIVE: To examine the relationship between MGUS and exposure to Agent Orange, including its contaminant 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD), in Vietnam War veterans. DESIGN, SETTING, AND PARTICIPANTS: This was a prospective cohort study conducted in 2013 to 2014, testing for MGUS in serum specimens collected and stored in 2002 by the Air Force Health Study (AFHS). The relevant exposure data collected by the AFHS was also used. We tested all specimens in 2013 without knowledge of the exposure status. The AFHS included former US Air Force personnel who participated in Operation Ranch Hand (Ranch Hand veterans) and other US Air Force personnel who had similar duties in Southeast Asia during the same time period (1962 to 1971) but were not involved in herbicide spray missions (comparison veterans). Agent Orange was used by the US Air Force personnel who conducted aerial spray missions of herbicides (Operation Ranch Hand) in Vietnam from 1962 to 1971. We included 479 Ranch Hand veterans and 479 comparison veterans who participated in the 2002 follow-up examination of AFHS. EXPOSURES: Agent Orange and TCDD. Serum TCDD levels were measured in 1987, 1992, 1997, and 2002. MAIN OUTCOMES AND MEASURES: Risk of MGUS measured by prevalence, odds ratios (ORs), and 95% CIs. 
RESULTS: The 479 Ranch Hand veterans and 479 comparison veterans had similar demographic and lifestyle characteristics and medical histories. The crude prevalence of overall MGUS was 7.1% (34 of 479) in Ranch Hand veterans and 3.1% (15 of 479) in comparison veterans. This translated into a 2.4-fold increased risk for MGUS in Ranch Hand veterans compared with comparison veterans after adjusting for age, race, BMI in 2002, and the change in BMI between 2002 and the time of blood draw for TCDD measurement (adjusted OR, 2.37; 95% CI, 1.27-4.44; P = .007). CONCLUSIONS AND RELEVANCE: Operation Ranch Hand veterans have a significantly increased risk of MGUS, supporting an association between Agent Orange exposure and multiple myeloma. |
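The crude odds ratio implied by the counts reported above can be reproduced directly. A minimal sketch (note that the 2.37 reported in the paper is the covariate-adjusted estimate, so only the crude ratio is checked here):

```python
# Crude odds ratio for MGUS: Ranch Hand vs. comparison veterans,
# using the counts reported in the abstract (34/479 vs. 15/479).
def odds_ratio(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    """Cross-product ratio (a*d)/(b*c) for a 2x2 table of cases vs. non-cases."""
    a, b = cases_exposed, cases_unexposed
    c, d = n_exposed - cases_exposed, n_unexposed - cases_unexposed
    return (a * d) / (b * c)

crude_or = odds_ratio(34, 479, 15, 479)
print(round(crude_or, 2))  # 2.36, close to the adjusted OR of 2.37
```

The closeness of the crude and adjusted estimates is consistent with the two groups having similar demographic characteristics, as the abstract notes.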
The contribution of subsidized food commodities to total energy intake among US adults
Siegel KR , McKeever Bullard K , Ali MK , Stein AD , Kahn HS , Mehta NK , Webb Girard A , Venkat Narayan KM , Imperatore G . Public Health Nutr 2015 19 (8) 1-10 OBJECTIVE: The contribution of subsidized food commodities to total food consumption is unknown. We estimated the proportion of individual energy intake from food commodities receiving the largest subsidies from 1995 to 2010 (corn, soyabeans, wheat, rice, sorghum, dairy and livestock). DESIGN: Integrating information from three federal databases (MyPyramid Equivalents, Food Intakes Converted to Retail Commodities, and What We Eat in America) with data from the 2001-2006 National Health and Nutrition Examination Surveys, we computed a Subsidy Score representing the percentage of total energy intake from subsidized commodities. We examined the score's distribution and the probability of having a high (≥70th percentile) v. low (≤30th percentile) score, across the population and subgroups, using multivariate logistic regression. SETTING: Community-dwelling adults in the USA. SUBJECTS: Participants (n 11,811) aged 18-64 years. RESULTS: Median Subsidy Score was 56.7% (interquartile range 47.2-65.4%). Younger, less educated, poorer, and Mexican American individuals had higher scores. After controlling for covariates, age, education and income remained independently associated with the score: compared with individuals aged 55-64 years, individuals aged 18-24 years had a 50% higher probability of having a high score (P<0.0001). Individuals reporting less than high-school education had a 21% higher probability of having a high score than individuals reporting college completion or higher (P=0.003); individuals in the lowest tertile of income had an 11% higher probability of having a high score compared with individuals in the highest tertile (P=0.02). CONCLUSIONS: Over 50% of energy in US diets is derived from federally subsidized commodities. |
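The Subsidy Score itself is simple arithmetic: the share of a respondent's total energy intake that comes from the subsidized commodity groups. A minimal sketch, where the per-commodity intake values are illustrative rather than survey data:

```python
# Commodity groups named in the study as receiving the largest federal subsidies.
SUBSIDIZED = {"corn", "soybeans", "wheat", "rice", "sorghum", "dairy", "livestock"}

def subsidy_score(intake_kcal):
    """Percentage of total energy intake (kcal) derived from subsidized commodities."""
    total = sum(intake_kcal.values())
    subsidized = sum(kcal for food, kcal in intake_kcal.items() if food in SUBSIDIZED)
    return 100.0 * subsidized / total

# Hypothetical one-day intake for a single respondent (kcal per commodity group).
intake = {"corn": 500, "wheat": 400, "dairy": 250,
          "vegetables": 300, "fruit": 150, "fish": 400}
print(round(subsidy_score(intake), 1))  # 57.5
```

In the study, scores computed this way were then dichotomized at population percentiles before the logistic regression step.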
Reducing risks to women linked to shift work, long work hours, and related workplace sleep and fatigue issues
Caruso CC . J Womens Health (Larchmt) 2015 24 (10) 789-94 In the United States, an estimated 12% to 28% of working women are on shift work schedules, and 12% work more than 48 hours per week. Shift work and long work hours are associated with many health and safety risks, including obesity, injuries, and negative reproductive outcomes. Over time, the worker is at risk for developing a wide range of chronic diseases. These work schedules can also strain personal relationships, owing to fatigue and poor mood from sleep deprivation and reduced quality time to spend with family and friends. Worker errors from fatigue can lead to reduced quality of goods and services, negatively impacting the employer. In addition, mistakes by fatigued workers can have far-reaching negative effects on the community, ranging from medical care errors to motor vehicle crashes and industrial disasters that endanger others. To reduce the many risks that are linked to these demanding work hours, the National Institute for Occupational Safety and Health (NIOSH) conducts research, develops guidance and authoritative recommendations, and translates and disseminates scientific information to protect workers, their families, employers, and the community. The key message to reduce these risks is making sleep a priority in the employer's systems for organizing work and in the worker's personal life. The NIOSH website has freely available online training programs with suggestions for workers and their managers to help them better cope with this workplace hazard. |
Validation of a new metric for assessing the integration of health protection and health promotion in a sample of small- and medium-sized employer groups
Williams JA , Nelson CC , Caban-Martinez AJ , Katz JN , Wagner GR , Pronk NP , Sorensen G , McLellan DL . J Occup Environ Med 2015 57 (9) 1017-21 OBJECTIVE: To conduct validation analyses for a new measure of the integration of worksite health protection and health promotion approaches developed in earlier research. METHODS: A survey of small- to medium-sized employers located in the United States was conducted between October 2013 and March 2014 (n = 111). Cronbach alpha coefficient was used to assess reliability, and Pearson correlation coefficients were used to assess convergent validity. RESULTS: The integration score was positively associated with the measures of occupational safety and health and health promotion activities/policies-supporting its convergent validity (Pearson correlation coefficients of 0.32 to 0.47). Cronbach alpha coefficient was 0.94, indicating excellent reliability. CONCLUSIONS: The integration score seems to be a promising tool for assessing integration of health promotion and health protection. Further work is needed to test its dimensionality and validate its use in other samples. |
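The Cronbach alpha coefficient used above to assess reliability is defined as alpha = k/(k-1) * (1 - sum of item variances / variance of the summed score) over k items. A minimal sketch with made-up item scores (not data from the study):

```python
def cronbach_alpha(items):
    """items: list of k lists, each holding one item's scores across respondents."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var_sum = sum(var(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]  # summed scale score
    return k / (k - 1) * (1 - item_var_sum / var(totals))

# Three highly consistent items across four respondents -> alpha near 1,
# in the same range as the 0.94 reported for the integration score.
items = [[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 5]]
print(round(cronbach_alpha(items), 2))  # 0.99
```

Values above roughly 0.9, as reported here, are conventionally read as excellent internal consistency, though very high values can also signal redundant items.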
Musculoskeletal disorders and associated healthcare costs among family members of injured workers
Asfaw A , Pana-Cryan R , Bushnell T , Sauter S . Am J Ind Med 2015 58 (11) 1205-16 BACKGROUND: Research has infrequently looked beyond the injured worker when gauging the burden of occupational injury. OBJECTIVES: We explored the relationship between occupational injury and musculoskeletal disorders (MSDs) among family members of injured workers. DATA AND METHODS: We used 2005 and 2006 Truven Health Analytics databases, which contain information on workers' compensation and family healthcare claims. We used descriptive analyses, and negative binomial and two-part models. RESULTS: Family members of severely injured workers had a 15% increase in the total number of MSD outpatient claims and a 34% increase in the mean cost of MSD claims compared to family members of non-severely injured workers within 3 months after injury. Extrapolating cost results to the national level implies that severe occupational injury would be associated with between $29 and $33 million additional cost of family member outpatient MSD claims. CONCLUSION: Occupational injury can impose a formerly unrecognized health burden on family members of injured workers. |
Organizational characteristics influence implementation of worksite health protection and promotion programs: evidence from smaller businesses
McLellan DL , Caban-Martinez AJ , Nelson CC , Pronk NP , Katz JN , Allen JD , Davis KL , Wagner GR , Sorensen G . J Occup Environ Med 2015 57 (9) 1009-16 OBJECTIVE: We explored associations between organizational factors (size, sector, leadership support, and organizational capacity) and implementation of occupational safety and health (OSH) and worksite health promotion (WHP) programs in smaller businesses. METHODS: We conducted a web-based survey of human resource managers of 117 smaller businesses (<750 employees) and analyzed factors associated with implementation of OSH and WHP among these sites using multivariate analyses. RESULTS: Implementation of OSH, but not WHP activities, was related to industry sector (P = 0.003). Leadership support was positively associated with OSH activities (P < 0.001), but negatively associated with WHP implementation. Organizational capacity (budgets, staffing, and committee involvement) was associated with implementation of both OSH and WHP. Size was related to neither. CONCLUSIONS: Leadership support and specifically allocated resources reflecting that support are important factors for implementing OSH and WHP in smaller organizations. |
Elevated blood lead levels among fire assay workers and their children in Alaska, 2010-2011
Porter KA , Kirk C , Fearey D , Castrodale LJ , Verbrugge D , McLaughlin J . Public Health Rep 2015 130 (5) 440-6 In October 2010, an employee at Facility A in Alaska, which performs fire assay analysis, an industrial technique that uses lead-containing flux to obtain metals from pulverized rocks, was reported to the Alaska Section of Epidemiology (SOE) with an elevated blood lead level (BLL) ≥10 micrograms per deciliter (μg/dL). The SOE initiated an investigation; investigators interviewed employees, offered blood lead screening to employees and their families, and observed a visit to the industrial facility by the Alaska Occupational Safety and Health Section (AKOSH). Among the 15 employees with known work responsibilities, 12 had an elevated BLL at least once from October 2010 through February 2011. Of these 12 employees, 10 reported working in the fire assay room. Four children of employees had BLLs ≥5 μg/dL. Employees working in Facility A's fire assay room were likely exposed to lead at work and could have brought lead home. AKOSH inspectors reported that they could not share their consultative report with SOE investigators because of the confidentiality requirements of a federal regulation, which prevented Alaska SOE investigators from fully characterizing the lead exposures. |
Ethylene oxide and hydrogen peroxide gas plasma sterilization: precautionary practices in U.S. hospitals
Boiano JM , Steege AL . Zentralsterilisation (Wiesb) 2015 23 (4) 255-268 OBJECTIVE: Evaluate precautionary practices and extent of use of ethylene oxide (EtO) and hydrogen peroxide gas plasma (HPGP) sterilization systems, including use of single-chamber EtO units. DESIGN: Modular, web-based survey. PARTICIPANTS: Members of professional practice organizations who reported using EtO or HPGP in the past week to sterilize medical instruments and supplies. Participating organizations invited members via email, which included a hyperlink to the survey. METHODS: Descriptive analyses were conducted, including simple frequencies and prevalences. RESULTS: A total of 428 respondents completed the module on chemical sterilants. Because most respondents worked in hospitals (87%, n = 373), analysis focused on these workers. Most used HPGP sterilizers (84%, n = 373), 38% used EtO sterilizers, and 22% used both. Nearly all respondents using EtO operated single-chamber units (94%, n = 120); most of them reported that the units employed single-use cartridges (83%, n = 115). Examples of where engineering and administrative controls were lacking for EtO include: operational local exhaust ventilation (7%; n = 114); continuous air monitoring (6%; n = 113); safe handling training (6%; n = 142); and standard operating procedures (4%; n = 142). Examples of practices which may increase HPGP exposure risk included lack of standard operating procedures (9%; n = 311) and safe handling training (8%; n = 312). CONCLUSIONS: Use of precautionary practices was good but not universal. EtO use appears to have diminished in favor of HPGP, which affords higher throughput and minimal regulatory constraints. Separate EtO sterilization and aeration units were still being used nearly one year after the U.S. EPA prohibited their use. |
Increased decline in pulmonary function among employees in Norwegian smelters reporting work-related asthma-like symptoms
Soyseth V , Johnsen HL , Henneberger PK , Kongerud J . J Occup Environ Med 2015 57 (9) 1004-8 OBJECTIVE: To investigate associations between work-related asthma-like symptoms (WASTH) and annual pulmonary function decline among employees of 18 Norwegian smelters. METHODS: A 5-year longitudinal study in which WASTH was defined as a combination of dyspnea and wheezing that improved on rest days and vacation. RESULTS: A total of 12,966 spirometry examinations were performed in 3084 employees. Crude annual decline in forced expiratory volume in 1 second (FEV1) (dFEV1) was 32.9 mL/yr (95% confidence interval, 30.5 to 35.3), and crude annual decline in forced vital capacity (FVC) (dFVC) was 40.9 mL/yr (37.8 to 43.9). After adjustment for relevant covariates, employees reporting WASTH showed higher dFEV1 by 16.0 mL/yr (3.4 to 28.6) and higher dFVC by 20.5 mL/yr (6.0 to 35.0) compared with employees not reporting WASTH. CONCLUSION: Work-related asthma-like symptoms were associated with greater annual declines in FEV1 and FVC, indicating a restrictive pattern. |
Isocyanates and work-related asthma: findings from California, Massachusetts, Michigan, and New Jersey, 1993-2008
Lefkowitz D , Pechter E , Fitzsimmons K , Lumia M , Stephens AC , Davis L , Flattery J , Weinberg J , Harrison RJ , Reilly MJ , Filios M S , White G E , Rosenman KD . Am J Ind Med 2015 58 (11) 1138-49 BACKGROUND: Isocyanates remain a leading cause of work-related asthma (WRA). METHODS: Two independent data systems were analyzed for the period 1993-2008: (1) State-based WRA case surveillance data on persons with isocyanate-induced WRA from four states, and (2) Occupational Safety and Health Administration (OSHA) Integrated Management Information System (IMIS) isocyanate air sampling results. RESULTS: We identified 368 cases of isocyanate-induced WRA from 32 industries and 678 OSHA isocyanate air samples with detectable levels from 31 industries. Seventeen industries were unique to one or the other dataset. CONCLUSION: Isocyanate-induced WRA continues to occur in a wide variety of industries. Two data systems uncovered industries with isocyanate exposures and/or illness. Improved control measures and standards, including medical surveillance, are needed. More emphasis is needed on task-specific guidance, spill clean-up procedures, skin and respiratory protection, and targeted medical monitoring to mitigate the hazards of isocyanate use. |
Baseline evaluation with a sweating thermal manikin of personal protective ensembles recommended for use in West Africa
Coca A , DiLeo T , Kim JH , Roberge R , Shaffer R . Disaster Med Public Health Prep 2015 9 (5) 536-42 OBJECTIVE: Experience with the use of personal protective equipment (PPE) ensembles by health care workers responding to the Ebola outbreak in the hot, humid conditions of West Africa has prompted reports of significant issues with heat stress that has resulted in shortened work periods. METHODS: A sweating thermal manikin was used to ascertain the time to achievement of a critical core temperature of 39 degrees C while wearing 4 different PPE ensembles similar to those recommended by the World Health Organization and Medecins Sans Frontieres (Doctors Without Borders) at 2 different ambient conditions (32 degrees C/92% relative humidity and 26 degrees C/80% relative humidity) compared with a control ensemble. RESULTS: PPE ensembles that utilized coveralls with moderate to high degrees of impermeability attained the critical core temperature in significantly shorter times than did other ensembles. Encapsulation of the head and neck region resulted in higher model-predicted subjective impressions of heat sensation. CONCLUSIONS: To maximize work capacity and to protect health care workers in the challenging ambient conditions of West Africa, consideration should be given to adjustment of work and rest schedules, improvement of PPE (e.g., using less impermeable and more breathable fabrics that provide the same protection), and the possible use of cooling devices worn simultaneously with PPE. |
Sleeping arrangements and mass distribution of bed nets in six districts in central and northern Mozambique
Plucinski MM , Chicuecue S , Macete E , Chambe GA , Muguande O , Matsinhe G , Colborn J , Yoon SS , Doyle TJ , Kachur SP , Aide P , Alonso PL , Guinovart C , Morgan J . Trop Med Int Health 2015 20 (12) 1685-95 OBJECTIVE: Universal coverage with insecticide-treated bed nets is a cornerstone of modern malaria control. Mozambique has developed a novel bed net allocation strategy, in which the number of bed nets allocated per household is calculated on the basis of household composition and assumptions about who sleeps with whom. We set out to evaluate the performance of the novel allocation strategy. METHODS: 1,994 households were visited during household surveys following two universal coverage bed net distribution campaigns in Sofala and Nampula Provinces in 2010-2013. Each sleeping space was observed for the presence of a bed net, and the sleeping patterns for each household were recorded. The observed coverage and efficiency were compared with the simulated coverage and efficiency that conventional allocation strategies would have achieved. A composite indicator, the product of coverage and efficiency, was calculated. Observed sleeping patterns were compared with the sleeping pattern assumptions. RESULTS: In households reached by the campaign, 93% (95% CI: 93-94%) of sleeping spaces in Sofala and 84% (82-86%) in Nampula were covered by campaign bed nets. The achieved efficiency was high, with 92% (91-93%) of distributed bed nets in Sofala and 93% (91-95%) in Nampula covering a sleeping space. Using the composite indicator, the novel allocation strategy outperformed all conventional strategies in Sofala and was tied for best in Nampula. The sleeping pattern assumptions were completely satisfied in 66% of households in Sofala and 56% of households in Nampula. The most common violation of the sleeping pattern assumptions was that male children 3-10 years of age tended not to share sleeping spaces with female children 3-10 or 10-16 years of age. 
CONCLUSIONS: The sleeping pattern assumptions underlying the novel bed net allocation strategy are generally valid, and net allocation using these assumptions can achieve high coverage and compare favorably with conventional allocation strategies. |
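The composite indicator described above is simply the product of coverage and efficiency, so a strategy is penalized both for leaving sleeping spaces uncovered and for distributing nets that end up unused. A minimal sketch using the Sofala point estimates from the abstract:

```python
def composite_indicator(spaces_covered, total_spaces, nets_on_spaces, nets_distributed):
    """Coverage (covered spaces / all spaces) times efficiency (used nets / all nets)."""
    coverage = spaces_covered / total_spaces
    efficiency = nets_on_spaces / nets_distributed
    return coverage * efficiency

# Sofala: 93% of sleeping spaces covered; 92% of distributed nets on a sleeping space.
# Counts are illustrative (per 100); only the proportions come from the abstract.
print(round(composite_indicator(93, 100, 92, 100), 3))  # 0.856
```

Because both factors lie between 0 and 1, the composite is bounded by the weaker of the two, which is why over-allocating nets cannot compensate for poor coverage.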
Assessment of blood-brain barrier penetration of miltefosine used to treat a fatal case of granulomatous amebic encephalitis possibly caused by an unusual Balamuthia mandrillaris strain
Roy SL , Atkins JT , Gennuso R , Kofos D , Sriram RR , Dorlo TP , Hayes T , Qvarnstrom Y , Kucerova Z , Guglielmo BJ , Visvesvara GS . Parasitol Res 2015 114 (12) 4431-9 Balamuthia mandrillaris, a free-living ameba, causes rare but frequently fatal granulomatous amebic encephalitis (GAE). Few patients have survived after receiving experimental drug combinations, with or without brain lesion excisions. Some GAE survivors have been treated with a multi-drug regimen including miltefosine, an investigational anti-leishmanial agent with in vitro amebicidal activity. Miltefosine dosing for GAE has been based on leishmaniasis dosing because no data exist in humans concerning its pharmacologic distribution in the central nervous system. We describe results of limited cerebrospinal fluid (CSF) and serum drug level testing performed during clinical management of a child with fatal GAE who was treated with a multi-drug regimen including miltefosine. Brain biopsy specimens, CSF, and sera were tested for B. mandrillaris using multiple techniques, including culture, real-time polymerase chain reaction, immunohistochemical techniques, and serology. CSF and serum miltefosine levels were determined using a liquid chromatography method coupled to tandem mass spectrometry. The CSF miltefosine concentration on hospital admission day 12 was 0.4 μg/mL. The serum miltefosine concentration on day 37, about 80 h post-miltefosine treatment, was 15.3 μg/mL. These are the first results confirming some blood-brain barrier penetration by miltefosine in a human, although with low-level CSF accumulation. Further evaluation of brain parenchyma penetration is required to determine optimal miltefosine dosing for Balamuthia GAE, balanced with the drug's toxicity profile. 
Additionally, the Balamuthia isolate was evaluated by real-time polymerase chain reaction (PCR), demonstrating genetic variability in 18S ribosomal RNA (18S rRNA) sequences and possibly signaling the first identification of multiple Balamuthia strains with varying pathogenicities. |
Trends in injection drug use among high school students, U.S., 1995-2013
Klevens RM , Jones SE , Ward JW , Holtzman D , Kann L . Am J Prev Med 2015 50 (1) 40-46 INTRODUCTION: Injection drug use is the most frequently reported risk behavior among new cases of hepatitis C virus infection, and recent reports of increases in infection are of great concern in many communities. This study assessed the prevalence and trends in injection drug use among U.S. high school students. METHODS: Data were from CDC's Youth Risk Behavior Surveillance System, which collects information on health risk behaviors at the national, state, and large urban school district levels. Analyses were conducted in 2014. RESULTS: In 2013, 1.7% of high school students nationwide had ever injected any illegal drug. Nationwide, ever injecting any illegal drug did not change significantly from 1995 to 2013, except among black non-Hispanic students. For this subgroup, both a significant linear increase from 1995 to 2013 and a significant quadratic trend were observed, with injection drug use increasing from 1995 to 2009 and decreasing from 2009 to 2013. Significant linear increases in injection drug use occurred in five states (Arkansas, Hawaii, Maine, Maryland, and New York) and six large urban school districts (Baltimore, Memphis, Miami-Dade County, New York City, Philadelphia, and Seattle). Significant linear decreases occurred in three states (Massachusetts, South Dakota, and West Virginia). Both a significant linear increase and quadratic trend were observed in Maine; quadratic trends were observed in Tennessee, Utah, and Palm Beach County, Florida. CONCLUSIONS: In some geographic areas and population groups, an increasing or high frequency of injection drug use was found among high school students, who should be targeted for prevention. |
Cohort study of the impact of high-dose opioid analgesics on overdose mortality
Dasgupta N , Funk MJ , Proescholdbell S , Hirsch A , Ribisl KM , Marshall S . Pain Med 2015 17 (1) 85-98 OBJECTIVE: Previous studies examining opioid dose and overdose risk provide limited granularity by milligram strength and instead rely on thresholds. We quantify dose-dependent overdose mortality over a large spectrum of clinically common doses. We also examine the contributions of benzodiazepines and extended-release opioid formulations to mortality. DESIGN: Prospective observational cohort with one-year follow-up. SETTING: One year in one state (NC) using a controlled substances prescription monitoring program, with name-linked mortality data. SUBJECTS: Residential population of North Carolina (n = 9,560,234), with 2,182,374 opioid analgesic patients. METHODS: Exposure was dispensed prescriptions of solid oral and transdermal opioid analgesics; person-years were calculated using intent-to-treat principles. Outcome was overdose deaths involving opioid analgesics in a primary or additive role. Poisson models were created and implemented using generalized estimating equations. RESULTS: Opioid analgesics were dispensed to 22.8% of residents. Among licensed clinicians, 89.6% prescribed opioid analgesics, and 40.0% prescribed ER formulations. There were 629 overdose deaths, half of which involved an opioid analgesic prescription active on the day of death. Of 2,182,374 patients prescribed opioids, 478 overdose deaths were reported (0.022% per year). Mortality rates increased gradually across the range of average daily milligrams of morphine equivalents. 80.0% of opioid analgesic patients also received benzodiazepines. Rates of overdose death among those co-dispensed benzodiazepines and opioid analgesics were ten times higher (7.0 per 10,000 person-years, 95% CI: 6.3, 7.8) than among those dispensed opioid analgesics alone (0.7 per 10,000 person-years, 95% CI: 0.6, 0.9). 
CONCLUSIONS: Dose-dependent opioid overdose risk among patients increased gradually and did not show evidence of a distinct risk threshold. There is urgent need for guidance about combined classes of medicines to facilitate a better balance between pain relief and overdose risk. |
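The ten-fold difference quoted above is an incidence rate ratio, and its confidence interval can be approximated on the log scale with a Wald standard error of sqrt(1/d1 + 1/d2), where d1 and d2 are the death counts in the two groups. A sketch with illustrative counts (the abstract reports only rates, so the counts and person-years below are assumptions chosen to reproduce 7.0 and 0.7 per 10,000 person-years):

```python
import math

def rate_ratio_ci(d1, py1, d2, py2, z=1.96):
    """Incidence rate ratio (group 1 vs. group 2) with a log-scale Wald 95% CI."""
    rr = (d1 / py1) / (d2 / py2)
    se = math.sqrt(1 / d1 + 1 / d2)  # SE of log(rate ratio)
    lo, hi = rr * math.exp(-z * se), rr * math.exp(z * se)
    return rr, lo, hi

# Hypothetical counts: 350 deaths over 500,000 PY (co-dispensed) vs.
# 70 deaths over 1,000,000 PY (opioids alone).
rr, lo, hi = rate_ratio_ci(d1=350, py1=500_000, d2=70, py2=1_000_000)
print(round(rr, 1))  # 10.0
```

The study itself used Poisson regression fit with generalized estimating equations rather than this crude two-group comparison, so the sketch only illustrates the underlying rate arithmetic.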
Combustible and smokeless tobacco use among high school athletes - United States, 2001-2013
Agaku IT , Singh T , Jones SE , King BA , Jamal A , Neff L , Caraballo RS . MMWR Morb Mortal Wkly Rep 2015 64 (34) 935-9 Athletes are not a typical at-risk group for smoking combustible tobacco products, because they are generally health conscious and desire to remain fit and optimize athletic performance (1). In contrast, smokeless tobacco use historically has been associated with certain sports, such as baseball (2). Athletes might be more likely to use certain tobacco products, such as smokeless tobacco, if they perceive them to be harmless (3); however, smokeless tobacco use is not safe and is associated with increased risk for pancreatic, esophageal, and oral cancers (4). Tobacco use among youth athletes is of particular concern, because most adult tobacco users first try tobacco before age 18 years (5). To examine prevalence and trends in current (>/=1 day during the past 30 days) use of combustible tobacco (cigarettes, cigars) and smokeless tobacco (chewing tobacco, snuff, or dip [moist snuff]) products among athlete and nonathlete high school students, CDC analyzed data from the 2001-2013 National Youth Risk Behavior Surveys. Current use of any tobacco (combustible or smokeless tobacco) significantly declined from 33.9% in 2001 to 22.4% in 2013; however, current smokeless tobacco use significantly increased from 10.0% to 11.1% among athletes, and did not change (5.9%) among nonathletes. Furthermore, in 2013, compared with nonathletes, athletes had significantly higher odds of being current smokeless tobacco users (adjusted odds ratio [AOR] = 1.77, p<0.05), but significantly lower odds of being current combustible tobacco users (AOR = 0.80, p<0.05). These findings suggest that opportunities exist for development of stronger tobacco control and prevention measures targeting youth athletes regarding the health risks associated with all forms of tobacco use. |
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Community Health Services
- Disease Reservoirs and Vectors
- Drug Safety
- Environmental Health
- Genetics and Genomics
- Healthcare Associated Infections
- Immunity and Immunization
- Informatics
- Injury and Violence
- Laboratory Sciences
- Maternal and Child Health
- Military Medicine and Health
- Nutritional Sciences
- Occupational Safety and Health
- Parasitic Diseases
- Substance Use and Abuse
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions.
- Page last reviewed:Feb 1, 2024
- Page last updated:Apr 29, 2024