An action plan for translating cancer survivorship research into care.
Alfano CM , Smith T , de Moor JS , Glasgow RE , Khoury MJ , Hawkins NA , Stein KD , Rechis R , Parry C , Leach CR , Padgett L , Rowland JH . J Natl Cancer Inst 2014 106 (11) To meet the complex needs of a growing number of cancer survivors, it is essential to accelerate the translation of survivorship research into evidence-based interventions and, as appropriate, recommendations for care that may be implemented in a wide variety of settings. Current progress in translating research into care is stymied, with the results of many studies un- or underutilized. To better understand this problem and identify strategies to encourage the translation of survivorship research findings into practice, four agencies (American Cancer Society, Centers for Disease Control and Prevention, LIVESTRONG Foundation, National Cancer Institute) hosted a meeting in June 2012 titled "Biennial Cancer Survivorship Research Conference: Translating Science to Care." Meeting participants concluded that accelerating science into care will require a coordinated, collaborative effort by individuals from diverse settings, including researchers and clinicians, survivors and families, public health professionals, and policy makers. This commentary describes an approach stemming from that meeting to facilitate translating research into care by changing the process of conducting research: improving communication, collaboration, evaluation, and feedback through true and ongoing partnerships. We apply the T0-T4 translational process model to survivorship research and provide illustrations of its use. The resultant framework is intended to orient stakeholders to the role of their work in the translational process and to facilitate the transdisciplinary collaboration needed to translate basic discoveries into best practices regarding clinical care, self-care/management, and community programs for cancer survivors.
Finally, we discuss barriers to implementing translational survivorship science identified at the meeting, along with future directions to accelerate this process. |
Treatment patterns for prostate cancer: comparison of Medicare claims data to medical record review
Fleming ST , Hamilton AS , Sabatino SA , Kimmick GG , Wu XC , Owen JB , Huang B , Hwang W . Med Care 2014 52 (9) e58-64 BACKGROUND: As evidence-based guidelines increasingly define standards of care, accurate reporting of patterns of treatment becomes critical to determining whether appropriate care has been provided. We explored the level of agreement between claims and record abstraction for treatment regimens for prostate cancer. METHODS: Medicare claims data were linked to medical records abstraction using data from the Centers for Disease Control and Prevention's National Program of Cancer Registries-funded Breast and Prostate Patterns of Care study. The first course of therapy included surgery, radiation therapy (RT), and hormonal therapy with luteinizing hormone-releasing hormone agonists. RESULTS: The linked sample included 2765 men, most (84.7%) of whom had stage II prostate cancer. Agreement was excellent for surgery (kappa=0.92) and RT (kappa=0.92) and lower for hormonal therapy (kappa=0.71); however, most of the discrepancies were due to a greater number of patients reported as receiving hormonal therapy in the claims database than in the medical records database. For some standard multicomponent management strategies, sensitivities were high, for example, hormonal therapy with either combination RT (86.9%) or cryosurgery (96.6%). CONCLUSIONS: Medicare claims are sensitive for determining patterns of multicomponent care for prostate cancer and for detecting use of hormonal therapy when it is not reported in the medical records abstracts. |
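The kappa values reported above are chance-corrected agreement statistics. As a minimal illustration of how Cohen's kappa is computed for a binary treatment indicator (the yes/no vectors below are hypothetical, not the study's data):

```python
def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two data sources (e.g., claims vs. charts)."""
    labels = sorted(set(rater_a) | set(rater_b))
    n = len(rater_a)
    # observed agreement: proportion of exact matches
    p_o = sum(x == y for x, y in zip(rater_a, rater_b)) / n
    # expected agreement if the two sources were independent
    p_e = sum((rater_a.count(l) / n) * (rater_b.count(l) / n) for l in labels)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical yes/no surgery indicators from claims vs. medical records
claims  = [1, 1, 1, 0, 0, 0, 1, 0]
records = [1, 1, 0, 0, 0, 0, 1, 0]
print(round(cohens_kappa(claims, records), 2))  # 0.75
```

Kappa near 1 indicates near-perfect agreement beyond chance; the study's values of 0.92 (surgery, RT) versus 0.71 (hormonal therapy) reflect exactly this scale.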
Leptin, adiponectin, and heart rate variability among police officers
Charles LE , Burchfiel CM , Sarkisian K , Li S , Miller DB , Gu JK , Fekedulegn D , Violanti JM , Andrew ME . Am J Hum Biol 2014 27 (2) 184-91 OBJECTIVES: Police officers have a high prevalence of cardiovascular disease (CVD). Reduced heart rate variability (HRV) is known to increase CVD risk, and leptin and adiponectin may also be related to cardiovascular health. Therefore, our objective was to investigate the relationship between these variables and HRV. METHODS: Leptin and adiponectin levels were measured in 388 officers from the Buffalo Cardio-Metabolic Occupational Police Stress study. HRV was assessed according to methods published by the Task Force of the European Society of Cardiology and the North American Society of Pacing Electrophysiology for measurement and analysis of HRV. Mean values of high-frequency (HF) and low-frequency (LF) HRV were compared across tertiles of leptin and adiponectin using analysis of variance and analysis of covariance; trends were assessed using linear regression models. RESULTS: Leptin, but not adiponectin, was significantly and inversely associated with HRV. Body mass index (BMI) and percent body fat significantly modified the association between leptin and LF (but not HF) HRV. Among officers with BMI < 25 kg/m2, leptin was not significantly associated with HRV. However, among officers with BMI ≥ 25 kg/m2, leptin was inversely associated with HRV after adjustment for age, gender, and race/ethnicity (HF HRV, P = 0.019; LF HRV, P < 0.0001). Similarly, among officers with percent body fat ≥ 25.5%, leptin and LF HRV showed significant, inverse associations (adjusted P = 0.001). CONCLUSIONS: Leptin levels were inversely associated with LF HRV, especially among officers with increased adiposity. Increased leptin levels may be associated with CVD-related health problems. |
Meeting the emerging public health needs of persons with blood disorders
Parker CS , Tsai J , Siddiqi AE , Atrash HK , Richardson LC . Am J Prev Med 2014 47 (5) 658-63 In its decades-long history, the Division of Blood Disorders (DBD) at CDC has evolved from a patient-focused, services-supporting entity at inception, to one of the world leaders in the practice of public health to improve the lives of people at risk for or affected by nonmalignant blood disorders. The DBD's earliest public health activities consisted of working with care providers in a network of hemophilia treatment centers to provide AIDS risk reduction services to people with hemophilia. Because this infectious disease threat has been reduced over time as a result of the development of safer treatment products, the DBD-under the auspices of congressional appropriations guidance-has expanded its core activities to encompass blood disorders other than hemophilia, including hemoglobinopathies such as thalassemia and sickle cell disease, and Diamond Blackfan anemia. Simultaneously, in transitioning to a greater public health role, the DBD has expanded its network of partners to new consumer and professional organizations, as well as state and other federal health agencies. The DBD has also developed and maintains many surveillance and registry activities beyond the Universal Data Collection system aimed at providing a better understanding of the health status, health needs, and health-related quality of life of people with nonmalignant blood disorders. The DBD has integrated applicable components of the Essential Services of Public Health successfully to promote and advance the agenda of blood disorders in public health. |
Population distribution of the sagittal abdominal diameter (SAD) from a representative sample of US adults: comparison of SAD, waist circumference and body mass index for identifying dysglycemia
Kahn HS , Gu Q , Bullard KM , Freedman DS , Ahluwalia N , Ogden CL . PLoS One 2014 9 (10) e108707 BACKGROUND: The sagittal abdominal diameter (SAD) measured in supine position is an alternative adiposity indicator that estimates the quantity of dysfunctional adipose tissue in the visceral depot. However, supine SAD's distribution and its association with health risk at the population level are unknown. Here we describe standardized measurements of SAD, provide the first national estimates of the SAD distribution among US adults, and test associations of SAD and other adiposity indicators with prevalent dysglycemia. METHODS AND FINDINGS: In the 2011-2012 National Health and Nutrition Examination Survey, supine SAD ("abdominal height") was measured between the arms of a sliding-beam caliper at the level of the iliac crests. From 4817 non-pregnant adults (age ≥20; response rate 88%) we used sample weights to estimate SAD's population distribution by sex and age groups. SAD's population mean was 22.5 cm [95% confidence interval 22.2-22.8]; median was 21.9 cm [21.6-22.4]. The mean and median values of SAD were greater for men than women. For the subpopulation without diagnosed diabetes, we compared the abilities of SAD, waist circumference (WC), and body mass index (BMI, kg/m2) to identify prevalent dysglycemia (HbA1c ≥5.7%). For age-adjusted, logistic-regression models in which sex-specific quartiles of SAD were considered simultaneously with quartiles of either WC or BMI, only SAD quartiles 3 (p<0.05 vs quartile 1) and 4 (p<0.001 vs quartile 1) remained associated with increased dysglycemia. Based on continuous adiposity indicators, analyses of the area under the receiver operating characteristic curve (AUC) indicated that the dysglycemia model fit for SAD (age-adjusted) was 0.734 for men (greater than the AUC for WC, p<0.001) and 0.764 for women (greater than the AUC for WC or BMI, p<0.001).
CONCLUSIONS: Measured inexpensively by bedside caliper, SAD was associated with dysglycemia independently of WC or BMI. Standardized SAD measurements may enhance assessment of dysfunctional adiposity. |
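The AUC comparison above has a simple rank-based interpretation: it is the probability that a randomly chosen case has a larger indicator value than a randomly chosen non-case. A minimal sketch of this Mann-Whitney formulation, using hypothetical SAD values rather than NHANES data:

```python
def rank_auc(cases, controls):
    """AUC as the probability that a randomly chosen case outranks a
    randomly chosen control (Mann-Whitney formulation; ties count as half)."""
    wins = sum((c > k) + 0.5 * (c == k) for c in cases for k in controls)
    return wins / (len(cases) * len(controls))

# Hypothetical SAD values (cm) for adults with vs. without dysglycemia
sad_dysglycemia = [24.0, 26.5, 23.1, 27.8]
sad_normal      = [20.2, 22.5, 21.0, 23.1, 19.8]
print(rank_auc(sad_dysglycemia, sad_normal))  # 0.975
```

An AUC of 0.5 means the indicator discriminates no better than chance; values such as the study's 0.734 (men) and 0.764 (women) indicate moderate discrimination.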
Prevalence and incidence trends for diagnosed diabetes among adults aged 20 to 79 years, United States, 1980-2012
Geiss LS , Wang J , Cheng YJ , Thompson TJ , Barker L , Li Y , Albright AL , Gregg EW . JAMA 2014 312 (12) 1218-26 IMPORTANCE: Although the prevalence and incidence of diabetes have increased in the United States in recent decades, no studies have systematically examined long-term, national trends in the prevalence and incidence of diagnosed diabetes. OBJECTIVE: To examine long-term trends in the prevalence and incidence of diagnosed diabetes to determine whether there have been periods of acceleration or deceleration in rates. DESIGN, SETTING, AND PARTICIPANTS: We analyzed 1980-2012 data for 664,969 adults aged 20 to 79 years from the National Health Interview Survey (NHIS) to estimate incidence and prevalence rates for the overall civilian, noninstitutionalized, US population and by demographic subgroups (age group, sex, race/ethnicity, and educational level). MAIN OUTCOMES AND MEASURES: The annual percentage change (APC) in rates of the prevalence and incidence of diagnosed diabetes (type 1 and type 2 combined). RESULTS: The APC for age-adjusted prevalence and incidence of diagnosed diabetes did not change significantly during the 1980s (for prevalence, 0.2% [95% CI, -0.9% to 1.4%], P = .69; for incidence, -0.1% [95% CI, -2.5% to 2.4%], P = .93), but each increased sharply during 1990-2008 (for prevalence, 4.5% [95% CI, 4.1% to 4.9%], P < .001; for incidence, 4.7% [95% CI, 3.8% to 5.6%], P < .001) before leveling off with no significant change during 2008-2012 (for prevalence, 0.6% [95% CI, -1.9% to 3.0%], P = .64; for incidence, -5.4% [95% CI, -11.3% to 0.9%], P = .09). The prevalence per 100 persons was 3.5 (95% CI, 3.2 to 3.9) in 1990, 7.9 (95% CI, 7.4 to 8.3) in 2008, and 8.3 (95% CI, 7.9 to 8.7) in 2012. The incidence per 1000 persons was 3.2 (95% CI, 2.2 to 4.1) in 1990, 8.8 (95% CI, 7.4 to 10.3) in 2008, and 7.1 (95% CI, 6.1 to 8.2) in 2012. Trends in many demographic subpopulations were similar to these overall trends. 
However, incidence rates among non-Hispanic black and Hispanic adults continued to increase (for interaction, P = .03 for non-Hispanic black adults and P = .01 for Hispanic adults) at rates significantly greater than for non-Hispanic white adults. In addition, the rate of increase in prevalence was higher for adults who had a high school education or less compared with those who had more than a high school education (for interaction, P = .006 for <high school and P < .001 for high school). CONCLUSIONS AND RELEVANCE: Analyses of nationally representative data from 1980 to 2012 suggest a doubling of the incidence and prevalence of diabetes during 1990-2008, and a plateauing between 2008 and 2012. However, there appear to be continued increases in the prevalence or incidence of diabetes among subgroups, including non-Hispanic black and Hispanic subpopulations and those with a high school education or less. |
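The annual percentage change (APC) reported above is conventionally estimated by fitting a linear regression to log-transformed rates over calendar years; the slope converts to a constant percent change per year. A minimal sketch using a hypothetical rate series, not the NHIS estimates:

```python
import math

def annual_percent_change(years, rates):
    """APC from a log-linear fit log(rate) = a + b*year; APC = 100*(e**b - 1)."""
    logs = [math.log(r) for r in rates]
    n = len(years)
    xbar, ybar = sum(years) / n, sum(logs) / n
    # ordinary least-squares slope of log(rate) on year
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(years, logs))
             / sum((x - xbar) ** 2 for x in years))
    return 100 * (math.exp(slope) - 1)

# Hypothetical prevalence series growing at exactly 4.5% per year
years = list(range(1990, 1996))
rates = [3.5 * 1.045 ** (y - 1990) for y in years]
print(round(annual_percent_change(years, rates), 3))  # 4.5
```

Segmented (joinpoint-style) analyses like the one described in the abstract fit separate such slopes to each period (1980s, 1990-2008, 2008-2012).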
Prevalence of airflow obstruction in U.S. adults aged 40-79 years: NHANES data 1988-1994 and 2007-2010
Doney B , Hnizdo E , Dillon CF , Paulose-Ram R , Tilert T , Wolz M , Beeckman-Wagner LA . COPD 2014 12 (4) 355-65 BACKGROUND: This study evaluated the change in the prevalence of airflow obstruction in the U.S. population aged 40-79 years from 1988-1994 to 2007-2010. METHODS: Spirometry data from two representative samples of the U.S. population, the National Health and Nutrition Examination Surveys (NHANES) conducted in 1988-1994 and 2007-2010, were used. The American Thoracic Society/European Respiratory Society (ATS/ERS) criteria were used to define airflow obstruction. RESULTS: Based on ATS/ERS criteria, the overall age-adjusted prevalence of airflow obstruction among adults aged 40-79 years decreased from 16.6% to 14.5% (p < 0.05). Significant decreases were observed for the older 60-69 years age category (20.2% vs. 15.4%; p < 0.01), for males (19.0% vs. 15.4%; p < 0.01), and for Mexican American adults (12.7% vs. 8.4%; p < 0.001). The prevalence of moderate or more severe airflow obstruction also decreased (6.4% vs. 4.4%; p < 0.01). Based on ATS/ERS criteria, during 2007-2010 an estimated 18.3 million U.S. adults aged 40-79 years had airflow obstruction, 5.6 million had moderate or severe airflow obstruction, and 1.4 million had severe airflow obstruction. CONCLUSIONS: The overall age-adjusted prevalence of airflow obstruction among U.S. adults aged 40-79 years decreased from 1988-1994 to 2007-2010, especially among older adults, Mexican Americans, and males. |
Promotion and provision of colorectal cancer screening: a comparison of Colorectal Cancer Control Program grantees and nongrantees, 2011-2012
Maxwell AE , Hannon PA , Escoffery C , Vu T , Kohn M , Vernon SW , DeGroff A . Prev Chronic Dis 2014 11 E170 INTRODUCTION: Since 2009, the Centers for Disease Control and Prevention (CDC) has awarded nearly $95 million to 29 states and tribes through the Colorectal Cancer Control Program (CRCCP) to fund 2 program components: 1) providing colorectal cancer (CRC) screening to uninsured and underinsured low-income adults and 2) promoting population-wide CRC screening through evidence-based interventions identified in the Guide to Community Preventive Services (Community Guide). CRCCP is a new model for disseminating and promoting use of evidence-based interventions. If the program proves successful, CDC may adopt the model for future cancer control programs. The objective of our study was to compare the colorectal cancer screening practices of recipients of CRCCP funding (grantees) with those of nonrecipients (nongrantees). METHODS: We conducted parallel Web-based surveys in 2012 with CRCCP grantees (N = 29) and nongrantees (N = 24) to assess promotion and provision of CRC screening, including the use of evidence-based interventions. RESULTS: CRCCP grantees were significantly more likely than nongrantees to use Community Guide-recommended evidence-based interventions (mean, 3.14 interventions vs 1.25 interventions, P < .001) and to use patient navigation services (eg, transportation or language translation services) (72% vs 17%, P < .001) for promoting CRC screening. Both groups were equally likely to use other strategies. CRCCP grantees were significantly more likely to provide CRC screening than were nongrantees (100% vs 50%, P < .001). CONCLUSION: Results suggest that CRCCP funding and support increase use of evidence-based interventions to promote CRC screening, indicating the program's potential to increase population-wide CRC screening rates. |
Providing young women with credible health information about bleeding disorders
Rhynders PA , Sayers CA , Presley RJ , Thierry JM . Am J Prev Med 2014 47 (5) 674-80 BACKGROUND: Approximately 1% of U.S. women may have an undiagnosed bleeding disorder, which can diminish quality of life and lead to life-threatening complications during menstruation, childbirth, and surgery. PURPOSE: To understand young women's knowledge, attitudes, and perceptions about bleeding disorders and determine the preferred messaging strategy (e.g., gain- versus loss-framed messages) for presenting information. METHODS: In September 2010, a web-assisted personal interview of women aged 18-25 years was conducted. Preliminary analyses were conducted in 2011, with final analyses in 2013. In total, 1,243 women participated. Knowledge of bleeding disorders was tabulated for these respondents. Menstrual experiences of women at risk for a bleeding disorder were compared with those of women not at risk using chi-square analyses. The perceived influence of gain- versus loss-framed messages also was compared. RESULTS: Participants knew that a bleeding disorder is a condition in which bleeding takes a long time to stop (77%) or blood does not clot (66%). Of the women, 57% incorrectly thought that a bleeding disorder is characterized by thin blood; many were unsure whether bleeding disorders involve blood types, not getting a period, or mother and fetus having a different blood type. Women at risk for a bleeding disorder were significantly more likely than women not at risk to report that menstruation interfered with daily activities (36% vs 9%); physical or sports activities (46% vs 21%); social activities (29% vs 7%); and school or work activities (20% vs 9%). Gain-framed messages were significantly more likely than parallel loss-framed messages to influence women's decisions to seek medical care.
Findings suggest that the most influential messages focus on knowing effective treatment is available (86% gain-framed vs 77% loss-framed); preventing pregnancy complications (79% gain- vs 71% loss-framed); and maintaining typical daily activities during menstrual periods. CONCLUSIONS: Lack of information about bleeding disorders is a serious public health concern. Health communications focused on gain-framed statements might encourage symptomatic young women to seek diagnosis and treatment. These findings and corresponding recommendations align with Healthy People 2020 and with the CDC's goal of working to promote the health, safety, and quality of life of women at every life stage. |
A public health approach to the prevention of inhibitors in hemophilia
Soucie JM , Miller CH , Kelly FM , Oakley M , Brown DL , Kucab P . Am J Prev Med 2014 47 (5) 669-73 For people with hemophilia, the development of an antibody, also referred to as an inhibitor, to the products used to treat and prevent bleeding is the most serious complication of hemophilia care today. The CDC, together with healthcare providers, consumer organizations, hemophilia organizations, and federal partners, has developed a public health agenda to prevent the development of inhibitors. This paper describes a public health approach that combines a national surveillance program with epidemiologic, laboratory, and prevention research to address knowledge gaps in rates of and risk factors for inhibitor development; in the knowledge and behaviors of patients and providers; and in screening and treatment practices. |
Public health surveillance of nonmalignant blood disorders
Beckman MG , Hulihan MM , Byams VR , Oakley MA , Reyes N , Trimble S , Grant AM . Am J Prev Med 2014 47 (5) 664-8 Nonmalignant blood disorders currently affect millions of Americans, and their prevalence is expected to grow over the next several decades, owing both to improvements in treatment that have increased the life expectancy of people with hereditary conditions such as sickle cell disease and hemophilia, and to the rising occurrence of risk factors for venous thromboembolism. The lack of adequate surveillance systems to monitor these conditions and their associated health indicators is a significant barrier to successfully assessing, informing, and measuring prevention efforts and progress toward national health goals. CDC is strengthening surveillance activities for blood disorders by improving existing methods and developing new ones tailored to best capture and monitor the epidemiologic characteristics unique to each disorder. These activities will provide a robust evidence base for public health action to improve the health of patients affected by or at risk for these disorders. |
Elevation of circulating TNF receptors 1 and 2 increases the risk of end-stage renal disease in American Indians with type 2 diabetes
Pavkov ME , Nelson RG , Knowler WC , Cheng Y , Krolewski AS , Niewczas MA . Kidney Int 2014 87 (4) 812-9 In Caucasians with type 2 diabetes, circulating TNF receptors 1 (TNFR1) and 2 (TNFR2) predict end-stage renal disease (ESRD). Here we examined this relationship in a longitudinal cohort study of American Indians with type 2 diabetes with measured glomerular filtration rate (mGFR, iothalamate) and urinary albumin-to-creatinine ratio (ACR). ESRD was defined as dialysis, kidney transplant, or death attributed to diabetic kidney disease. Age-gender-adjusted incidence rates and incidence rate ratios of ESRD were computed by Mantel-Haenszel stratification. The hazard ratio of ESRD was assessed per interquartile range increase in the distribution of each TNFR after adjusting for baseline age, gender, mean blood pressure, HbA1c, ACR, and mGFR. Among the 193 participants, 62 developed ESRD and 25 died without ESRD during a median follow-up of 9.5 years. The age-gender-adjusted incidence rate ratio of ESRD was higher among participants in the highest versus lowest quartile for TNFR1 (6.6, 95% confidence interval (CI) 3.3-13.3) or TNFR2 (8.8, 95% CI 4.3-18.0). In the fully adjusted model, the risk of ESRD per interquartile range increase was 1.6 times (95% CI 1.1-2.2) as high for TNFR1 and 1.7 times (95% CI 1.2-2.3) as high for TNFR2. Thus, elevated serum concentrations of TNFR1 or TNFR2 are associated with increased risk of ESRD in American Indians with type 2 diabetes after accounting for traditional risk factors including ACR and mGFR. |
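The Mantel-Haenszel stratification described above pools stratum-specific incidence rate ratios for person-time data into one adjusted summary. A minimal sketch with hypothetical strata (not the study's data):

```python
def mh_rate_ratio(strata):
    """Mantel-Haenszel incidence rate ratio for person-time data.
    Each stratum is (cases_exposed, persontime_exposed,
                     cases_unexposed, persontime_unexposed)."""
    # each stratum contributes cases weighted by the opposite group's person-time
    num = sum(a * pt0 / (pt1 + pt0) for a, pt1, b, pt0 in strata)
    den = sum(b * pt1 / (pt1 + pt0) for a, pt1, b, pt0 in strata)
    return num / den

# Hypothetical age-sex strata: ESRD cases and person-years,
# highest vs. lowest TNFR1 quartile (illustrative numbers only)
strata = [
    (12, 400.0, 3, 450.0),   # younger stratum
    (20, 350.0, 5, 500.0),   # older stratum
]
print(round(mh_rate_ratio(strata), 2))  # 5.22
```

Pooling this way keeps each comparison within a stratum, so the summary ratio is not confounded by the stratification variables (here, age and gender).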
Fostering engagement and independence: opportunities and challenges for an aging society
Anderson LA , Prohaska TR . Health Educ Behav 2014 41 5s-9s Older adults, persons aged 65 or older, are growing in number faster than any other age group, both nationally and internationally. In 2011, there were 41.4 million older adults in the United States, meaning that one in eight people was an older adult (Administration on Aging, 2012). Globally, by 2015, there are expected to be more people over age 65 than young people aged 15 and younger (United Nations Population Fund & HelpAge International, 2012). This transformative demographic shift presents numerous challenges that are well documented and many opportunities that are less well examined. Concurrently, research in the field of aging and health has grown exponentially over the past several decades, increasing awareness of how to promote healthy aging and of how much we can influence the health, functioning, and well-being of individuals through behavioral and biopsychosocial approaches. Furthermore, within the field of public health, the Society for Public Health Education (SOPHE) in 2006 adopted a "Promoting Healthy Aging Resolution"; among the specified actions was the development of a special issue of Health Education & Behavior. This supplement issue was developed to highlight the scope and potential of behavioral research and health education in contributing to the optimal health of older adults. |
An inventory of healthy weight practices in federally funded haemophilia treatment centres in the United States
Adams E , Deutsche J , Okoroh E , Owens-McAlister S , Majumdar S , Ullman M , Damiano ML , Recht M . Haemophilia 2014 20 (5) 639-43 In the haemophilia population, obesity has an adverse effect on health care cost, chronic complications, and joint disease. Although staff of federally funded Hemophilia Treatment Centers (HTCs) in the United States anecdotally recognize these outcomes, practices to promote healthy weights have not been reported. This evaluation identifies routine practices among HTCs in body mass index (BMI) assessment, perceptions about the need to address obesity, and roles in offering evidence-based strategies to promote healthy weights. A telephone survey was developed to assess HTC practices, including patient BMI assessment and counselling, perceptions about the importance of healthy patient weights, and HTCs' roles in weight management. Ninety of the 130 federally funded HTCs contacted elected to participate and completed the telephone survey. Of these, 67% routinely calculated BMI and 48% provided results to patients. Approximately one-third classified obesity correctly for children (30%) and adults (32%) using the Centers for Disease Control and Prevention's BMI cut-offs. Most HTCs (87%) reported obesity as an issue of 'big' or 'moderate' concern, and 98% indicated that HTCs have a responsibility to address this issue. Most centres (64%) address patient weight during comprehensive visits. One-third (33%) of centres include a nutritionist; of those without, 61% offer nutrition referrals when needed. Most (89%) HTCs do not have a protocol in place to address healthy weights; 53% indicated that guidelines are needed. HTCs offer services to help improve weight outcomes. Training programmes for calculating and interpreting BMI, as well as guidelines appropriate to the HTC patient population, are needed. |
Associations between trends in race/ethnicity, aging, and body mass index with diabetes prevalence in the United States: a series of cross-sectional studies
Menke A , Rust KF , Fradkin J , Cheng YJ , Cowie CC . Ann Intern Med 2014 161 (5) 328-35 BACKGROUND: The increase in the prevalence of diabetes over the past few decades has coincided with an increase in certain risk factors for diabetes, such as a changing race/ethnicity distribution, an aging population, and a rising obesity prevalence. OBJECTIVE: To determine the extent to which the increase in diabetes prevalence is explained by changing distributions of race/ethnicity, age, and obesity prevalence in U.S. adults. DESIGN: Cross-sectional, using data from 5 NHANES (National Health and Nutrition Examination Surveys): NHANES II (1976-1980), NHANES III (1988-1994), and the continuous NHANES 1999-2002, 2003-2006, and 2007-2010. SETTING: Nationally representative samples of the U.S. noninstitutionalized civilian population. PATIENTS: 23 932 participants aged 20 to 74 years. MEASUREMENTS: Diabetes was defined as a self-reported diagnosis or fasting plasma glucose level of 7.0 mmol/L (126 mg/dL) or more. RESULTS: Between 1976 to 1980 and 2007 to 2010, diabetes prevalence increased from 4.7% to 11.2% in men and from 5.7% to 8.7% in women (P for trends for both groups < 0.001). After adjustment for age, race/ethnicity, and body mass index, diabetes prevalence increased in men (6.2% to 9.6%; P for trend < 0.001) but not women (7.6% to 7.5%; P for trend = 0.69). Body mass index was the greatest contributor among the 3 covariates to the change in prevalence estimates after adjustment. LIMITATION: Some possible risk factors, such as physical activity, waist circumference, and mortality, could not be studied because data on these variables were not collected in all surveys. CONCLUSION: The increase in the prevalence of diabetes was greater in men than in women in the U.S. population between 1976 to 1980 and 2007 to 2010. 
After changes in age, race/ethnicity, and body mass index were controlled for, the increase in diabetes prevalence over time was approximately halved in men and diabetes prevalence was no longer increased in women. PRIMARY FUNDING SOURCE: Centers for Disease Control and Prevention and National Institutes of Diabetes and Digestive and Kidney Diseases. |
Asthma education: different viewpoints elicited by qualitative and quantitative methods
Damon SA , Tardif RR . J Asthma 2014 52 (3) 1-16 OBJECTIVE: This project began as a qualitative examination of how asthma education provided by health professionals could be improved. Unexpected qualitative findings regarding the use of Asthma Action Plans and the importance of insurance reimbursement for asthma education prompted further quantitative examination. METHODS: Qualitative individual interviews were conducted with primary care physicians in private practice who routinely provide initial diagnoses of asthma, and focus groups were conducted with other clinicians in private primary care practices who routinely provide asthma education. Using the DocStyles quantitative tool, two questions regarding Asthma Action Plans and insurance reimbursement were asked of a representative sample of physicians and other clinicians. RESULTS: The utility of Asthma Action Plans was questioned in the 2012 qualitative study. Qualitative findings also raised questions about whether reimbursement is the barrier it is thought to be to asthma education performed by medical professionals. The 2013 quantitative findings show that the majority of clinicians see Asthma Action Plans as useful. The question of whether reimbursement is a barrier to providing asthma education to patients was not resolved by the quantitative data. CONCLUSIONS: The majority of clinicians see Asthma Action Plans as a useful tool for patient education. Clinicians had less clear opinions on whether the lack of defined reimbursement codes acts as a barrier to asthma education. The study also provided useful audience data for the design of new asthma educational tools developed by CDC. |
Blood disorders and public health
Richardson LC , Parker CS , Tsai J . Am J Prev Med 2014 47 (5) 656-7 Millions of people in the U.S. are affected by blood disorders.1 The accumulating epidemiologic evidence for non-malignant blood disorders continues to strengthen the case for considering these disorders a national public health priority. Although there is enormous potential for public health practice to reduce the disease burden and associated healthcare costs, the fiscal resources with which to do so are decreasing and may continue to decrease in the near future. Thus, the Division of Blood Disorders (DBD) at the Centers for Disease Control and Prevention (CDC) has embraced a new currency: developing and implementing a comprehensive set of public health approaches to effectively promote and improve the health of people with blood disorders. | In spite of the availability of effective regimens to prevent blood clots, it is estimated that thousands of patients will die of a blood clot associated with a hospital stay.2 Hydroxyurea has been proven to result in fewer painful crises, fewer episodes of acute chest syndrome, fewer blood transfusions,3 and lower risk of death in adult sickle cell disease (SCD) patients,4 yet less than one third of the patients who might benefit from it actually receive it.5 Prophylactic treatment in hemophilia patients significantly reduces the number of bleeds compared with on-demand treatment; however, it has not become a standard of care.6 The public health activities undertaken by DBD seek to bring proven interventions to bear in areas where research and surveillance findings suggest that potential benefits might be realized and adverse effects might be mitigated. Additionally, these approaches attempt to develop and implement interventions that benefit the entire community of people affected by blood disorders. |
A cancer center's approach to engaging African American men about cancer: the Men's Fellowship Breakfast, southeastern Michigan, 2008-2014
Langford AT , Griffith DM , Beasley DD , Braxton EI . Prev Chronic Dis 2014 11 E164 BACKGROUND: Despite disproportionate rates of cancer morbidity and mortality among African American men, few community-based efforts have been developed and sustained to educate African American men about cancer. The University of Michigan Comprehensive Cancer Center implemented a series of breakfasts to improve cancer awareness, screening, and education among African American men. This article describes the rationale for and history of the community intervention. COMMUNITY CONTEXT: The 21 breakfasts were held from 2008 through mid-2014 in Ypsilanti and Ann Arbor, Michigan. Ypsilanti ranks below Michigan and the nation on most socioeconomic indicators, although most residents are high school graduates (88% in Ypsilanti and 96.5% in Ann Arbor). African American men in Ypsilanti have higher death rates for diseases associated with poor diet and inadequate physical activity compared with Ypsilanti whites and general populations in Michigan and the nation. METHODS: We conducted a multicomponent qualitative process evaluation including staff meetings, conversations with participants, and focus groups. We collected 425 post-event surveys to evaluate the breakfasts quantitatively. OUTCOMES: Participants were predominantly African American (85%); 54% were aged 51 to 70 years, 89% had health insurance, and 38% had some college education. Fifty-three percent of participants reported interest in nutrition; 46%, prostate cancer; 34%, colorectal cancer; and 32%, pain management; 62% reported willingness to participate in a clinical trial. INTERPRETATION: African American men are interested in learning about health and are willing to attend a health-focused breakfast series. The Men's Fellowship Breakfast is a promising strategy for bringing men together to discuss cancer screening and risk reduction. |
Characteristics and outcomes of patients with acute decompensated heart failure developing after hospital admission
Patel MD , Kalbaugh CA , Chang PP , Matsushita K , Agarwal SK , Caughey MC , Ni H , Rosamond WD , Wruck LM , Loehr LR . Am J Cardiol 2014 114 (10) 1530-6 There are limited data on acute decompensated heart failure (ADHF) that develops after hospital admission. This study sought to compare patient characteristics, co-morbidities, mortality, and length of stay by timing of ADHF onset. The surveillance component of the Atherosclerosis Risk in Communities study (2005 to 2011) sampled, abstracted, and adjudicated hospitalizations with select International Classification of Diseases, Ninth Revision, Clinical Modification discharge codes from 4 United States communities among those aged ≥55 years. We included 5,602 validated ADHF hospitalizations further classified as preadmission or postadmission onset. Vital status was assessed up to 1 year after admission. We estimated multivariate-adjusted associations of in-hospital mortality and 28- and 365-day case fatalities with timing of ADHF onset (postadmission vs preadmission). All analyses were weighted to account for the stratified sampling design. Of 25,862 weighted ADHF hospitalizations, 7% had postadmission onset of ADHF. Patients with postadmission ADHF were more likely to be older, white, and women. The most common primary discharge diagnosis codes for those with postadmission ADHF included diseases of the circulatory or digestive systems or infectious diseases. Short-term mortality among postadmission ADHF was almost 3 times that of preadmission ADHF (in-hospital mortality: odds ratio 2.7, 95% confidence interval 1.9 to 3.9; 28-day case fatality: odds ratio 2.6, 95% confidence interval 1.8 to 3.7). The average hospital stay was almost twice as long among postadmission as preadmission ADHF (9.6 vs 5.0 days).
In conclusion, postadmission onset of ADHF is characterized by differences in co-morbidities and worse short-term prognosis, and opportunities for reducing postadmission ADHF occurrence and associated risks need to be studied. |
Developing a framework and priorities to promote mobility among older adults
Anderson LA , Slonim A , Yen IH , Jones DL , Allen P , Hunter RH , Goins RT , Leith KH , Rosenberg D , Satariano WA , McPhillips-Tangum C . Health Educ Behav 2014 41 10S-18S Mobility, broadly defined as movement in all of its forms from ambulation to transportation, is critical to supporting optimal aging. This article describes two projects to develop a framework and a set of priority actions designed to promote mobility among community-dwelling older adults. Project 1 involved a concept-mapping process to solicit and organize action items into domains from a broad group of stakeholders to create the framework. Concept mapping uses qualitative group processes with multivariate statistical analysis to represent the ideas visually through maps. A snowball technique was used to identify stakeholders (n = 211). A 12-member steering committee developed a focus prompt, "One specific action that can lead to positive change in mobility for older adults in the United States is . . ." Project 2 included a Delphi technique (n = 43) with three iterations to prioritize four to six items using results from the concept mapping rating process. Project 1 resulted in 102 items across nine domains (Research to Practice, Independence and Engagement, Built Environment and Safety, Transportation, Policy, Housing and Accessibility, Community Supports, Training, and Coordinated Action). The number of items ranged from 6 to 18 per domain. Project 2 resulted in agreement on four items that reflect the importance of promoting environmental strategies through collaborative initiatives aimed at planning and best practices focusing on environmental enhancements or transit, training of professionals, and integration of mobility into state and local public health plans. These findings can be applied to support coordinated, multidisciplinary research and practice to promote mobility among older adults. |
Recruitment by a geospatial networking application for research and practice: the New York City experience
Usher D , Frye V , Shinnick J , Greene E , Baez E , Benitez J , Solomon L , Shouse RL , Sobieszczyk ME , Koblin BA . J Acquir Immune Defic Syndr 2014 Social networking using mobile phone-based applications (“apps”) has become widespread, with 89% of Americans ages 18-29 reporting that they use social networking sites.1 Men who have sex with men (MSM) utilize social networking sites at high rates, in part because they are able to form private, anonymous, and relatively safe communities on these sites.2,3 A variety of niche sites like Grindr, Manhunt, Adam4Adam, and Scruff have web pages and mobile phone-based applications for use by MSM, with a large proportion of MSM using such applications to find sex partners.2-8 Several studies have documented the success of using social networking for HIV prevention9-11 and recruitment for HIV prevention research.12-14 Recently, researchers have reported the use of mobile phone applications for education15 and as a tool to recruit MSM for research studies.16 | Here we describe our experience using a geospatial social networking application (Grindr) for recruitment into three different HIV prevention projects, including an HIV testing program, a social epidemiological survey (NYCM2M), and an HIV vaccine trial (HIV Vaccine Trials Network 505 (HVTN 505)). |
Risk for hepatitis B and C virus transmission in nail salons and barbershops and state regulatory requirements to prevent such transmission in the United States
Yang J , Hall K , Nuriddin A , Woolard D . J Public Health Manag Pract 2014 20 (6) E20-E30 CONTEXT: The potential for hepatitis B and C virus (HBV/HCV) transmission in nail salons and barbershops has been reported, but a systematic review has not been conducted. These businesses are regulated by state cosmetology or barbering boards, but the adequacy of sanitary requirements has not been evaluated. OBJECTIVES: To conduct a literature review to assess risk for HBV/HCV transmission in nail salons and barbershops and to evaluate sanitary requirements for HBV/HCV prevention in these businesses in 50 states and the District of Columbia. DESIGN: Several search engines were used for the literature search. Studies that quantified risks associated with manicuring, pedicuring, or barbering were included. State requirements for disinfection and sterilization were reviewed and evaluated. MAIN OUTCOME MEASURE: For the literature review, odds ratios, 95% confidence intervals, and confounding adjustment were extracted and evaluated. For the regulation review, requirements for disinfection or sterilization of multiuse items in nail salons and barbershops were assessed according to US federal guidelines. RESULTS: Forty-six studies were identified and 36 were included in this study. Overall, the results were not consistent regarding risk for HBV/HCV transmission in nail salons and barbershops. For sanitary requirements, disinfection with an Environmental Protection Agency-registered disinfectant is required in 39 states for nail salons and in 26 states for barbershops. Sterilization was described in 15 states for nail salons and in 11 states for barbershops, but the majority of these states listed it as an optional approach. Sanitary requirements are consistent in states where 1 board regulates both businesses but are substantially discrepant in states with separate boards. CONCLUSIONS: Current literature cannot confirm or exclude the risk for HBV/HCV transmission in nail salons and barbershops.
Existing sanitary requirements are adequate in the majority of states, but compliance is needed to prevent HBV/HCV transmission in these businesses. |
Slowing of the HIV epidemic in Ukraine: evidence from case reporting and key population surveys, 2005-2012
Vitek CR , Cakalo JI , Kruglov YV , Dumchev KV , Salyuk TO , Bozicevic I , Baughman AL , Spindler HH , Martsynovska VA , Kobyshcha YV , Abdul-Quader AS , Rutherford GW . PLoS One 2014 9 (9) e103657 BACKGROUND: Ukraine developed Europe's most severe HIV epidemic due to widespread transmission among persons who inject drugs (PWID). Since 2004, prevention has focused on key populations; antiretroviral therapy (ART) coverage has increased. Recent data show increases in reported HIV cases through 2011, especially attributed to sexual transmission, but also signs of potential epidemic slowing. We conducted a data triangulation exercise to better analyze available data and inform program implementation. METHODS AND FINDINGS: We reviewed data for 2005 to 2012 from multiple sources, primarily national HIV case reporting and integrated biobehavioral surveillance (IBBS) studies among key populations. Annually reported HIV cases increased at a progressively slower rate through 2011 with recent increases only among older, more immunosuppressed individuals; cases decreased 2.7% in 2012. Among women <25 years of age, cases attributed to heterosexual transmission and HIV prevalence in antenatal screening declined after 2008. Reported cases among young PWID declined by three-fourths. In 2011, integrated biobehavioral surveillance demonstrated decreased HIV prevalence among young members of key populations compared with 2009. HIV infection among female sex workers (FSW) remains strongly associated with a personal history of injecting drug use (IDU). CONCLUSIONS: This analysis suggests that Ukraine's HIV epidemic has slowed, with decreasing reported cases and older cases predominating among those diagnosed. Recent decreases in cases and in prevalence support decreased incidence among young PWID and women. Trends among heterosexual men and men who have sex with men (MSM) are less clear; further study and enhanced MSM prevention are needed. 
FSW appear to have stable prevalence with risk strongly associated with IDU. Current trends suggest the Ukrainian epidemic can be contained with enhanced prevention among key populations and increased treatment access. |
Syndemic vulnerability, sexual and injection risk behaviors, and HIV continuum of care outcomes in HIV-positive injection drug users
Mizuno Y , Purcell DW , Knowlton AR , Wilkinson JD , Gourevitch MN , Knight KR . AIDS Behav 2014 19 (4) 684-93 Limited investigations have been conducted on syndemics and HIV continuum of care outcomes. Using baseline data from a multi-site, randomized controlled study of HIV-positive injection drug users (n = 1,052), we examined whether psychosocial factors co-occurred, and whether these factors were additively associated with behavioral and HIV continuum of care outcomes. Experiencing one type of psychosocial problem was significantly (p < 0.05) associated with an increased odds of experiencing another type of problem. Persons with 3 or more psychosocial problems were significantly more likely to report sexual and injection risk behaviors and were less likely to be adherent to HIV medications. Persons with 4 or more problems were less likely to be virally suppressed. Reporting any problems was associated with not currently taking HIV medications. Our findings highlight the association of syndemics not only with risk behaviors, but also with outcomes related to the continuum of care for HIV-positive persons. |
Tenosynovitis caused by a novel nontuberculous Mycobacterium species initially misidentified as Mycobacterium tuberculosis complex
Simner PJ , Hyle EP , Buckwalter SP , Branda JA , Brown-Elliott BA , Franklin J , Toney NC , de Man TJ , Wallace RJ Jr , Vasireddy R , Gandhi RT , Wengenack NL . J Clin Microbiol 2014 52 (12) 4414-8 We present a case of tenosynovitis caused by a novel, slowly-growing, non-chromogenic, nontuberculous mycobacterium (NTM). Originally misidentified as Mycobacterium tuberculosis complex, the NTM cross-reacts with the M. tuberculosis complex nucleic acid hybridization probe and an M. tuberculosis gamma-interferon release assay, and is closely related to M. tuberculosis by 16S rRNA gene sequencing. |
Typhoid fever surveillance and vaccine use - South-East Asia and Western Pacific regions, 2009-2013
Date KA , Bentsi-Enchill AD , Fox KK , Abeysinghe N , Mintz ED , Khan MI , Sahastrabuddhe S , Hyde TB . MMWR Morb Mortal Wkly Rep 2014 63 (39) 855-860 Typhoid fever is a serious, systemic infection resulting in nearly 22 million cases and 216,500 deaths annually, primarily in Asia. Safe water, adequate sanitation, appropriate personal and food hygiene, and vaccination are the most effective strategies for prevention and control. In 2008, the World Health Organization (WHO) recommended use of available typhoid vaccines to control endemic disease and outbreaks and strengthening of typhoid surveillance to improve disease estimates and identify high-risk populations (e.g., persons without access to potable water and adequate sanitation). This report summarizes the status of typhoid surveillance and vaccination programs in the WHO South-East Asia (SEAR) and Western Pacific regions (WPR) during 2009-2013, after the revised WHO recommendations. Data were obtained from the WHO/United Nations Children's Fund (UNICEF) Joint Reporting Form on Immunization, a supplemental survey of surveillance and immunization program managers, and published literature. During 2009-2013, 23 (48%) of 48 countries and areas of SEAR (11) and WPR (37) collected surveillance or notifiable disease data on typhoid cases, with most surveillance activities established before 2008. Nine (19%) countries reported implementation of typhoid vaccination programs or recommended vaccine use during 2009-2013. Despite the high incidence, typhoid surveillance is weak in these two regions, and vaccination efforts have been limited. Further progress toward typhoid fever prevention and control in SEAR and WPR will require country commitment and international support for enhanced surveillance, targeted use of existing vaccines and availability of newer vaccines integrated within routine immunization programs, and integration of vaccination with safe water, sanitation, and hygiene measures. |
Update: influenza activity - United States and worldwide, May 18-September 20, 2014
Blanton L , Brammer L , Smith S , Mustaquim D , Steffens C , Abd Elal AI , Gubareva L , Hall H , Wallis T , Villanueva J , Xu X , Bresee J , Cox N , Finelli L . MMWR Morb Mortal Wkly Rep 2014 63 (39) 861-864 During May 18-September 20, 2014, the United States experienced low levels of seasonal influenza activity overall. Influenza A (H1N1)pdm09 (pH1N1), influenza A (H3N2), and influenza B viruses were detected worldwide and were identified sporadically in the United States. In August, two influenza A (H3N2) variant viruses (H3N2v) were detected in Ohio. This report summarizes influenza activity in the United States and worldwide during May 18-September 20, 2014. |
Updated preparedness and response framework for influenza pandemics
Holloway R , Rasmussen SA , Zaza S , Cox NJ , Jernigan DB . MMWR Recomm Rep 2014 63 1-9 The complexities of planning for and responding to the emergence of novel influenza viruses emphasize the need for systematic frameworks to describe the progression of the event; weigh the risk of emergence and potential public health impact; evaluate transmissibility, antiviral resistance, and severity; and make decisions about interventions. On the basis of experience from recent influenza responses, CDC has updated its framework to describe influenza pandemic progression using six intervals (two prepandemic and four pandemic intervals) and eight domains. This updated framework can be used for influenza pandemic planning and serves as recommendations for risk assessment, decision-making, and action in the United States. The updated framework replaces the U.S. federal government stages from the 2006 implementation plan for the National Strategy for Pandemic Influenza (US Homeland Security Council. National strategy for pandemic influenza: implementation plan. Washington, DC: US Homeland Security Council; 2006. Available at http://www.flu.gov/planning-preparedness/federal/pandemic-influenza-implementation.pdf). The six intervals of the updated framework are as follows: 1) investigation of cases of novel influenza, 2) recognition of increased potential for ongoing transmission, 3) initiation of a pandemic wave, 4) acceleration of a pandemic wave, 5) deceleration of a pandemic wave, and 6) preparation for future pandemic waves. The following eight domains are used to organize response efforts within each interval: incident management, surveillance and epidemiology, laboratory, community mitigation, medical care and countermeasures, vaccine, risk communications, and state/local coordination. Compared with the previous U.S.
government stages, this updated framework provides greater detail and clarity regarding the potential timing of key decisions and actions aimed at slowing the spread and mitigating the impact of an emerging pandemic. Use of this updated framework is anticipated to improve pandemic preparedness and response in the United States. Activities and decisions during a response are event-specific. These intervals serve as a reference for public health decision-making by federal, state, and local health authorities in the United States during an influenza pandemic and are not meant to be prescriptive or comprehensive. This framework incorporates information from newly developed tools for pandemic planning and response, including the Influenza Risk Assessment Tool and the Pandemic Severity Assessment Framework, and has been aligned with the pandemic phases restructured in 2013 by the World Health Organization. |
Men living with diagnosed HIV who have sex with men: progress along the continuum of HIV care - United States, 2010
Singh S , Bradley H , Hu X , Skarbinski J , Hall HI , Lansky A . MMWR Morb Mortal Wkly Rep 2014 63 (38) 829-33 Gay, bisexual, and other men who have sex with men (MSM) represent approximately 2% of the United States population, yet are the risk group most affected by human immunodeficiency virus (HIV). In 2010, among persons newly infected with HIV, 63% were MSM; among persons living with HIV, 52% were MSM. The three goals of the National HIV/AIDS Strategy are to reduce new HIV infections, to increase access to care and improve health outcomes for persons living with HIV, and to reduce HIV-related health disparities. In July 2013, the HIV Care Continuum Initiative was established by executive order to mobilize and accelerate federal efforts to increase HIV testing, services, and treatment along the continuum. To meet the 2015 targets of the National HIV/AIDS Strategy, 85% of MSM diagnosed with HIV should be linked to care, 80% should be retained in care, and the proportion with an undetectable viral load (VL) should be increased by 20%. To assess progress toward meeting these targets, CDC assessed the level at each step of the continuum of care for MSM by age and race/ethnicity. CDC analyzed data from the National HIV Surveillance System (NHSS) and the Medical Monitoring Project (MMP) for MSM with diagnosed HIV infection. The results indicated that 77.5% were linked to care, 50.9% were retained in care, 49.5% were prescribed antiretroviral therapy (ART), and 42.0% had achieved viral suppression. Younger MSM and black/African American MSM had lower levels of care compared with older MSM and those of all other races/ethnicities. Interventions aimed at MSM are needed that increase linkage to care, retention in care, and ART use, particularly among MSM aged <25 years and black/African American MSM. |
Modes of transmission of influenza B virus in households
Cowling BJ , Ip DK , Fang VJ , Suntarattiwong P , Olsen SJ , Levy J , Uyeki TM , Leung GM , Peiris JS , Chotpitayasunondh T , Nishiura H , Simmerman JM . PLoS One 2014 9 (9) e108850 INTRODUCTION: While influenza A and B viruses can be transmitted via respiratory droplets, the importance of small droplet nuclei "aerosols" in transmission is controversial. METHODS AND FINDINGS: In Hong Kong and Bangkok, in 2008-11, subjects were recruited from outpatient clinics if they had recent onset of acute respiratory illness and none of their household contacts were ill. Following a positive rapid influenza diagnostic test result, subjects were randomly allocated to one of three household-based interventions: hand hygiene, hand hygiene plus face masks, and a control group. Index cases plus their household contacts were followed for 7-10 days to identify secondary infections by reverse transcription polymerase chain reaction (RT-PCR) testing of respiratory specimens. Index cases with RT-PCR-confirmed influenza B were included in the present analyses. We used a mathematical model to make inferences on the modes of transmission, facilitated by apparent differences in clinical presentation of secondary infections resulting from aerosol transmission. We estimated that approximately 37% and 26% of influenza B virus transmission was via the aerosol mode in households in Hong Kong and Bangkok, respectively. In the fitted model, influenza B virus infections were associated with a 56%-72% risk of fever plus cough if infected via aerosol route, and a 23%-31% risk of fever plus cough if infected via the other two modes of transmission. CONCLUSIONS: Aerosol transmission may be an important mode of spread of influenza B virus. The point estimates of aerosol transmission were slightly lower for influenza B virus compared to previously published estimates for influenza A virus in both Hong Kong and Bangkok. 
Caution should be taken in interpreting these findings because of the multiple assumptions inherent in the model, including that there is limited biological evidence to date supporting a difference in the clinical features of influenza B virus infection by different modes. |
Prevalence of Chlamydia trachomatis genital infection among persons aged 14-39 years - United States, 2007-2012
Torrone E , Papp J , Weinstock H . MMWR Morb Mortal Wkly Rep 2014 63 (38) 834-8 Infection with the bacterium Chlamydia trachomatis (often termed "chlamydia") is the most frequently reported sexually transmitted infection in the United States. The urethra is the most common site of infection in males, and the urethra and cervix are most commonly infected in females. Ascending infection in females can cause pelvic inflammatory disease, which can lead to infertility and ectopic pregnancy. Genital chlamydial infections are usually asymptomatic, and screening is necessary to identify most infections. Currently, chlamydia screening for sexually active women aged <25 years is recommended by the U.S. Preventive Services Task Force (grade B recommendation). Chlamydia is nationally notifiable; however, if females do not access care or clinicians do not screen, many infections go undiagnosed, unreported, and untreated. CDC monitors population prevalence of genital chlamydial infection through the National Health and Nutrition Examination Survey (NHANES), which tests a sample of the U.S. population aged 14-39 years for genital C. trachomatis; analyses of these data found that the overall chlamydia burden in the United States decreased during 1999-2008. Using data from the most recent cycles of NHANES (2007-2012), CDC estimated chlamydia prevalence among persons aged 14-39 years overall and by demographic characteristics and sexual behaviors. The prevalence of chlamydia among persons aged 14-39 years was 1.7% (95% confidence interval [CI] = 1.4%-2.0%). Chlamydia prevalence varied by age and race/ethnicity, with prevalence highest among non-Hispanic blacks (5.2%). Among sexually active females aged 14-24 years, the population targeted for routine screening, chlamydia prevalence was 4.7% overall and 13.5% among non-Hispanic black females.
As chlamydia is common and infections are usually asymptomatic, health care providers should routinely screen sexually active young women aged <25 years for chlamydial infection, provide prompt treatment for infected persons, and ensure that infected patients' sex partners receive timely treatment to prevent reinfection. |
Estimates of the reproduction number for seasonal, pandemic, and zoonotic influenza: a systematic review of the literature
Biggerstaff M , Cauchemez S , Reed C , Gambhir M , Finelli L . BMC Infect Dis 2014 14 480 BACKGROUND: The potential impact of an influenza pandemic can be assessed by calculating a set of transmissibility parameters, the most important being the reproduction number (R), which is defined as the average number of secondary cases generated per typical infectious case. METHODS: We conducted a systematic review to summarize published estimates of R for pandemic or seasonal influenza and for novel influenza viruses (e.g. H5N1). We retained and summarized papers that estimated R for pandemic or seasonal influenza or for human infections with novel influenza viruses. RESULTS: The search yielded 567 papers. Ninety-one papers were retained, and an additional twenty papers were identified from the references of the retained papers. Twenty-four studies reported 51 R values for the 1918 pandemic. The median R value for 1918 was 1.80 (interquartile range [IQR]: 1.47-2.27). Six studies reported seven 1957 pandemic R values. The median R value for 1957 was 1.65 (IQR: 1.53-1.70). Four studies reported seven 1968 pandemic R values. The median R value for 1968 was 1.80 (IQR: 1.56-1.85). Fifty-seven studies reported 78 2009 pandemic R values. The median R value for 2009 was 1.46 (IQR: 1.30-1.70) and was similar across the two waves of illness: 1.46 for the first wave and 1.48 for the second wave. Twenty-four studies reported 47 seasonal epidemic R values. The median R value for seasonal influenza was 1.28 (IQR: 1.19-1.37). Four studies reported six novel influenza R values. Four out of six R values were <1. CONCLUSIONS: These R values represent the difference between epidemics that are controllable and cause moderate illness and those causing a significant number of illnesses and requiring intensive mitigation strategies to control. Continued monitoring of R during seasonal and novel influenza outbreaks is needed to document its variation before the next pandemic. |
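Summaries like the medians and interquartile ranges reported above are straightforward to recompute from any list of published R estimates. A minimal Python sketch; the R values here are illustrative placeholders, not the review's actual estimates:

```python
# Summarize a set of reproduction number (R) estimates by median and
# interquartile range (IQR), as the review does for each pandemic.
# The estimates below are illustrative placeholders.
from statistics import quantiles

r_estimates = [1.2, 1.5, 1.5, 1.7, 1.8, 1.9, 2.1, 2.3, 2.5]

# quantiles(..., n=4) returns the three quartile cut points
q1, med, q3 = quantiles(sorted(r_estimates), n=4)
print(f"median R = {med:.2f}, IQR: {q1:.2f}-{q3:.2f}")
```

Note that `statistics.quantiles` defaults to the exclusive method, so quartiles from small samples may differ slightly from values produced by other statistical packages.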
Estimating the future number of cases in the Ebola epidemic - Liberia and Sierra Leone, 2014-2015
Meltzer MI , Atkins CY , Santibanez S , Knust B , Petersen BW , Ervin ED , Nichol ST , Damon IK , Washington ML . MMWR Suppl 2014 63 (3) 1-14 The first cases of the current West African epidemic of Ebola virus disease (hereafter referred to as Ebola) were reported on March 22, 2014, with a report of 49 cases in Guinea. By August 31, 2014, a total of 3,685 probable, confirmed, and suspected cases in West Africa had been reported. To aid in planning for additional disease-control efforts, CDC constructed a modeling tool called EbolaResponse to provide estimates of the potential number of future cases. If trends continue without scale-up of effective interventions, by September 30, 2014, Sierra Leone and Liberia will have a total of approximately 8,000 Ebola cases. A potential underreporting correction factor of 2.5 also was calculated. Using this correction factor, the model estimates that approximately 21,000 total cases will have occurred in Liberia and Sierra Leone by September 30, 2014. Reported cases in Liberia are doubling every 15-20 days, and those in Sierra Leone are doubling every 30-40 days. The EbolaResponse modeling tool also was used to estimate how control and prevention interventions can slow and eventually stop the epidemic. In a hypothetical scenario, the epidemic begins to decrease and eventually end if approximately 70% of persons with Ebola are in medical care facilities or Ebola treatment units (ETUs) or, when these settings are at capacity, in a non-ETU setting such that there is a reduced risk for disease transmission (including safe burial when needed). In another hypothetical scenario, every 30-day delay in increasing the percentage of patients in ETUs to 70% was associated with an approximate tripling in the number of daily cases that occur at the peak of the epidemic (however, the epidemic still eventually ends). 
Officials have developed a plan to rapidly increase ETU capacities and also are developing innovative methods that can be quickly scaled up to isolate patients in non-ETU settings in a way that can help disrupt Ebola transmission in communities. The U.S. government and international organizations recently announced commitments to support these measures. As these measures are rapidly implemented and sustained, the higher projections presented in this report become very unlikely. |
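The headline arithmetic behind the projections above is simple: exponential growth at a fixed doubling time, scaled by the report's underreporting correction factor of 2.5. A minimal Python sketch, where the starting case count and projection horizon are illustrative assumptions rather than the report's actual inputs:

```python
# Project cumulative Ebola cases under exponential growth with a fixed
# doubling time, then apply the report's underreporting correction factor
# of 2.5. The starting count and horizon below are illustrative, not the
# report's actual inputs.

def projected_cases(current_cases, doubling_time_days, horizon_days,
                    correction_factor=1.0):
    """Cases after horizon_days, assuming cases double every doubling_time_days."""
    growth = 2 ** (horizon_days / doubling_time_days)
    return current_cases * growth * correction_factor

# e.g., 2,000 reported cases doubling every 20 days, projected 30 days out
reported = projected_cases(2000, doubling_time_days=20, horizon_days=30)
corrected = projected_cases(2000, 20, 30, correction_factor=2.5)
print(f"projected reported cases: ~{reported:.0f}, corrected: ~{corrected:.0f}")
```

This is only the growth-projection piece; the EbolaResponse tool itself also models intervention scale-up, which a fixed doubling time cannot capture.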
Feasibility and validity of telephone triage for adverse events during a voluntary medical male circumcision campaign in Swaziland
Ashengo TA , Grund J , Mhlanga M , Hlophe T , Mirira M , Bock N , Njeuhmeli E , Curran K , Mallas E , Fitzgerald L , Shoshore R , Moyo K , Bicego G . BMC Public Health 2014 14 (1) 858 BACKGROUND: Voluntary medical male circumcision (VMMC) reduces HIV acquisition among heterosexual men by approximately 60%. VMMC is a surgical procedure and some adverse events (AEs) are expected. Swaziland's Ministry of Health established a toll-free hotline to provide general information about VMMC and to manage post-operative clinical AEs through telephone triage. METHODS: We retrospectively analyzed a dataset of telephone calls logged by the VMMC hotline during a VMMC campaign. The objectives were to determine reasons clients called the VMMC hotline and to ascertain the accuracy of telephone-based triage for VMMC AEs. We then analyzed VMMC service delivery data that included date of surgery, AE type and severity, as diagnosed by a VMMC clinician as part of routine post-operative follow-up. Both datasets were de-identified and did not contain any personal identifiers. Proportions of AEs were calculated from the call data and from VMMC service delivery data recorded by health facilities. Sensitivity analyses were performed to assess the accuracy of phone-based triage compared to clinically confirmed AEs. RESULTS: A total of 17,059 calls were registered by the triage nurses from April to December 2011. Calls requesting VMMC education and counseling totaled 12,492 (73.2%) and were most common. Triage nurses diagnosed 384 clients with 420 (2.5%) AEs. According to the predefined clinical algorithms, all moderate and severe AEs (153) diagnosed through telephone-triage were referred for clinical management at a health facility. Clinicians at the VMMC sites diagnosed 341 (4.1%) total clients as having a mild (46.0%), moderate (47.8%), or severe (6.2%) AE. Eighty-nine (26%) of the 341 clients who were diagnosed with AEs by clinicians at a VMMC site had initially called the VMMC hotline. 
The telephone-based triage system had a sensitivity of 69%, a positive predictive value of 83%, and a negative predictive value of 48% for identifying moderate or severe AEs among all AEs. CONCLUSIONS: The use of a telephone-based triage system may be an appropriate first step to identify life-threatening and urgent complications following VMMC surgery. |
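The sensitivity, positive predictive value, and negative predictive value quoted above follow from a standard 2×2 comparison of triage results against clinician diagnoses. A minimal sketch; the cell counts below are hypothetical, chosen only to illustrate the arithmetic, not taken from the study:

```python
def screening_metrics(tp, fp, fn, tn):
    """Standard 2x2 screening-test metrics, comparing telephone-triage
    diagnoses (test) against clinician diagnoses (reference).

    tp: triage flagged an AE and the clinician confirmed it
    fp: triage flagged an AE the clinician did not confirm
    fn: triage missed an AE the clinician later confirmed
    tn: triage negative and clinician negative
    """
    sensitivity = tp / (tp + fn)  # fraction of true AEs caught by triage
    ppv = tp / (tp + fp)          # fraction of triage-flagged AEs confirmed
    npv = tn / (tn + fn)          # fraction of triage negatives truly negative
    return sensitivity, ppv, npv

# Hypothetical cell counts for illustration only
sens, ppv, npv = screening_metrics(tp=69, fp=14, fn=31, tn=29)
print(round(sens, 2), round(ppv, 2), round(npv, 2))  # → 0.69 0.83 0.48
```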
Hepatitis A hospitalizations in the United States, 2002-2011
Collier MG , Tong X , Xu F . Hepatology 2014 61 (2) 481-5 BACKGROUND: Hepatitis A illness severity increases with age. One indicator of hepatitis A illness severity is whether persons are hospitalized. We describe changes in primary hepatitis A hospitalization rates in the United States from 2002-2011, including changes in demographics, secondary discharge diagnoses, and factors affecting hospitalization duration. METHODS: We describe changes from 2002-2011 among U.S. residents hospitalized with a principal hepatitis A diagnosis and accompanying secondary diagnoses using ICD-9 codes from the National Inpatient Survey discharge data. We calculated rates of hospitalizations with hepatitis A as the principal discharge diagnosis and rates of secondary discharge diagnoses. Using multiple regression, we assessed the effect of secondary diagnoses on hospitalization length of stay for five time intervals: 2002-2003, 2004-2005, 2006-2007, 2008-2009 and 2010-2011. RESULTS: Rates of hospitalization for hepatitis A as a principal diagnosis decreased from 0.72/100,000 to 0.29/100,000 (p <0.0001) and mean age of those hospitalized increased from 37.6 years to 45.5 years (p <0.0001) during 2002-2011. The percentage of hepatitis A hospitalizations covered by Medicare increased from 12.4% to 22.7% (p <0.0001). Secondary comorbid discharge diagnoses increased, including liver disease, hypertension, ischemic heart disease, disorders of lipid metabolism, and chronic kidney disease. No changes over time in length of stay or in-hospital deaths from hepatitis A were found, but persons with liver disease were hospitalized longer. DISCUSSION: Hospitalization rates for hepatitis A illness declined significantly from 2002-2011, but the characteristics of the hospitalized population also changed. Persons hospitalized for hepatitis A in recent years are older and more likely to have liver diseases and other comorbid medical conditions. 
Hepatitis A disease and resulting hospitalizations could be prevented through adult vaccination. |
How much management is necessary? Sustaining the benefit of achieving a sustained virologic response to hepatitis C
Morgan RL . J Gastroenterol 2014 49 (11) 1514-5 Described as a silent killer by the media, hepatitis C virus (HCV) infection can be present in the body for 20 years or more before causing serious complications. An estimated 130–150 million persons globally are infected with HCV, of whom 350,000–500,000 die each year from HCV-related complications, including liver disease, cirrhosis, and hepatocellular carcinoma (HCC) [1]. With appropriate screening and treatment, many of these deaths can be prevented. | With the approval of direct-acting antivirals for clinical use in 2011, HCV treatment regimens have demonstrated increased efficacy, fewer side effects, and wider application due to fewer contraindications and greater availability of treatment options [2, 3]. Treatment success is more likely, with 50–90% of persons treated eradicating the virus (as determined by undetectable RNA) and achieving a sustained virologic response (SVR). |
HPV vaccine coverage among men who have sex with men - National HIV Behavioral Surveillance System, United States, 2011
Meites E , Markowitz LE , Paz-Bailey G , Oster AM . Vaccine 2014 32 (48) 6356-9 Men who have sex with men (MSM) are at high risk for disease associated with human papillomavirus (HPV). In late 2011, HPV vaccine was recommended for males through age 21 and MSM through age 26. Using data from the 2011 National HIV Behavioral Surveillance System, we assessed self-reported HPV vaccine uptake among MSM, using multivariate analysis to calculate adjusted prevalence ratios (aPRs) and 95% confidence intervals (CIs). Among 3221 MSM aged 18-26, 157 (4.9%) reported ≥1 vaccine dose. Uptake was higher among men who visited a healthcare provider (aPR 2.3, CI: 1.2-4.2), disclosed same-sex sexual attraction/behavior to a provider (aPR 2.1, CI: 1.3-3.3), reported a positive HIV test (aPR 2.2, CI: 1.5-3.2), or received hepatitis vaccine (aPR 3.9, CI: 2.4-6.4). Of 3064 unvaccinated MSM, 2326 (75.9%) had visited a healthcare provider within 1 year. These national data on HPV vaccine uptake among MSM provide a baseline as vaccination recommendations are implemented. |
Importation and containment of Ebola virus disease - Senegal, August-September 2014
Mirkovic K , Thwing J , Diack PA . MMWR Morb Mortal Wkly Rep 2014 63 (39) 873-874 On August 29, 2014, Senegal confirmed its first case of Ebola virus disease (Ebola) in a Guinean man, aged 21 years, who had traveled from Guinea to Dakar, Senegal, in mid-August to visit family. Senegalese medical and public health personnel were alerted about this patient after public health staff in Guinea contacted his family in Senegal on August 27. The patient had been admitted to a referral hospital in Senegal on August 26. He was promptly isolated, and a blood sample was sent for laboratory confirmation; Ebola was confirmed by reverse transcriptase-polymerase chain reaction at Institut Pasteur Dakar on August 29. The patient's mother and sister had been admitted to an Ebola treatment unit in Guinea on August 26, where they had named the patient as a contact and reported his recent travel to Senegal. Ebola was likely transmitted to the family from the brother of the patient, who had traveled by land from Sierra Leone to Guinea in early August seeking treatment from a traditional healer. The brother died in Guinea on August 10; family members, including the patient, participated in preparing the body for burial. |
Contact investigation of melioidosis cases reveals regional endemicity in Puerto Rico
Doker TJ , Sharp TM , Rivera-Garcia B , Perez-Padilla J , Benoit TJ , Ellis EM , Elrod MG , Gee JE , Shieh WJ , Beesley CA , Ryff KR , Traxler RM , Galloway RL , Haberling DL , Waller LA , Shadomy SV , Bower WA , Hoffmaster AR , Walke HT , Blaney DD . Clin Infect Dis 2014 60 (2) 243-50 BACKGROUND: Melioidosis results from infection with Burkholderia pseudomallei and is associated with case-fatality rates up to 40%. Early diagnosis and treatment with appropriate antimicrobials can improve survival rates. Fatal and non-fatal melioidosis cases were identified in Puerto Rico in 2010 and 2012, respectively, which prompted contact investigations to identify risk factors for infection and evaluate endemicity. METHODS: Questionnaires were administered and serum specimens were collected from co-workers, neighborhood contacts within 250 meters of both patients' residences, and injection drug use (IDU) contacts of the 2012 patient. Serum specimens were tested for evidence of prior exposure to B. pseudomallei by indirect hemagglutination assay. Neighborhood seropositivity results guided soil sampling to isolate B. pseudomallei. RESULTS: Serum specimens were collected from contacts of the 2010 (n=51) and 2012 (n=60) patients, respectively. No co-workers had detectable anti-B. pseudomallei antibody, whereas seropositivity among neighborhood contacts was 5% (n=2) for the 2010 patient and 23% (n=12) for the 2012 patient, as well as 2 of 3 IDU contacts for the 2012 patient. Factors significantly associated with seropositivity were having skin wounds, sores, or ulcers (OR=4.6; 95% CI: 1.2-17.8) and IDU (OR=18.0; 95% CI: 1.6-194.0). B. pseudomallei was isolated from soil collected in the neighborhood of the 2012 patient. CONCLUSIONS: Taken together, isolation of B. pseudomallei from a soil sample and high seropositivity among patient contacts suggest at least regional endemicity of melioidosis in Puerto Rico. 
Increased awareness of melioidosis is needed to enable early case identification and early initiation of appropriate antimicrobial therapy. |
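Odds ratios with 95% confidence intervals like those reported above are commonly computed from a 2×2 exposure-by-serostatus table. A minimal sketch using the Woolf (log-based) interval; the counts are hypothetical, and the study may have used a different interval method:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Woolf (log-based) 95% CI for the 2x2 table:
    a = exposed, seropositive    b = exposed, seronegative
    c = unexposed, seropositive  d = unexposed, seronegative
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Hypothetical counts for a skin-wounds-vs-seropositivity table
or_, lower, upper = odds_ratio_ci(a=8, b=20, c=6, d=69)  # or_ == 4.6
```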
Declining trends in the proportion of non-viral sexually transmissible infections reported by STD clinics in the US, 2000-10
Owusu-Edusei K , Sayegh BJ , Harvey AJ , Nelson RJ . Sex Health 2014 11 (4) 340-4 BACKGROUND: Recent budget shortfalls may have resulted in decreases in the number of sexually transmissible infections (STIs) reported from sexually transmitted disease clinics (STDCs) in the United States (US). The objective of this study was to examine the proportion of cases reported from STDCs for three non-viral STIs in the last decade. METHODS: Data from the national surveillance database on primary and secondary (P&S) syphilis, gonorrhoea and chlamydia cases for 2000-10 were extracted. The percentage of cases reported by STDCs for the nation and for each of the 48 contiguous states were then computed. Finally, the χ² trend test for proportions was used to determine the annual average decrease/increase in the percentage of cases reported by STDCs for the nation and for each state. RESULTS: Results demonstrate that the average annual declines in the proportion of P&S syphilis, gonorrhoea, and chlamydia cases reported from STDCs were 1.43% (P<0.01), 1.31% (P<0.01), and 0.31% (P<0.01), respectively. Additionally, most of the states with statistically significant trends (P<0.05) in the proportion of cases reported by STDCs had negative slopes: 86% (25/29) for P&S syphilis, 89% (34/38) for gonorrhoea, and 63% (27/43) for chlamydia. CONCLUSION: These results document the declining role of STDCs in STI prevention and control efforts in the US. Further studies are needed to assess the direct or indirect impact of the decline in the proportion of cases from STDCs on the overall STI control and prevention efforts in the US and its implications for the future. |
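The trend test for proportions named in the methods can be sketched as a Cochran-Armitage-style test. A minimal pure-Python version with invented yearly counts; this is a generic implementation, not the authors' code:

```python
def chi2_trend(cases, totals, scores=None):
    """Cochran-Armitage-style chi-square test for a linear trend in
    proportions across ordered groups (e.g., surveillance years).
    Returns the chi-square statistic (1 df under the null of no trend).
    """
    if scores is None:
        scores = list(range(len(cases)))  # equally spaced years
    n = sum(totals)
    p_bar = sum(cases) / n                # pooled proportion
    t = sum(x * (r - m * p_bar) for x, r, m in zip(scores, cases, totals))
    var_t = p_bar * (1 - p_bar) * (
        sum(m * x * x for x, m in zip(scores, totals))
        - sum(m * x for x, m in zip(scores, totals)) ** 2 / n
    )
    return t * t / var_t

# Invented yearly counts: STDC share of cases falling 30% -> 15% over 4 years
stat = chi2_trend(cases=[300, 250, 200, 150], totals=[1000, 1000, 1000, 1000])
# stat lands well above 3.84, the 1-df critical value at P = 0.05
```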
Travel-associated disease among US residents visiting US GeoSentinel clinics after return from international travel
Hagmann SH , Han PV , Stauffer WM , Miller AO , Connor BA , Hale DC , Coyle CM , Cahill JD , Marano C , Esposito DH , Kozarsky PE . Fam Pract 2014 31 (6) 678-87 BACKGROUND: US residents make 60 million international trips annually. Family practice providers need to be aware of travel-associated diseases affecting this growing mobile population. OBJECTIVE: To describe demographics, travel characteristics and clinical diagnoses of US residents who present ill after international travel. METHODS: Descriptive analysis of travel-associated morbidity and mortality among US travellers seeking care at 1 of the 22 US practices and clinics participating in the GeoSentinel Global Surveillance Network from January 2000 to December 2012. RESULTS: Of the 9624 ill US travellers included in the analysis, 3656 (38%) were tourist travellers, 2379 (25%) missionary/volunteer/research/aid workers (MVRA), 1580 (16%) travellers visiting friends and relatives (VFRs), 1394 (15%) business travellers and 593 (6%) student travellers. Median (interquartile range) travel duration was 20 days (10-60 days). Pre-travel advice was sought by 45%. Hospitalization was required by 7%. Compared with other groups of travellers, ill MVRA travellers returned from longer trips (median duration 61 days), while VFR travellers disproportionately required higher rates of inpatient care (24%) and less frequently had received pre-travel medical advice (20%). Illnesses of the gastrointestinal tract were the most common (58%), followed by systemic febrile illnesses (18%) and dermatologic disorders (17%). Three deaths were reported. Diagnoses varied according to the purpose of travel and region of exposure. CONCLUSIONS: Returning ill US international travellers present with a broad spectrum of travel-associated diseases. Destination and reason for travel may help primary health care providers to generate an accurate differential diagnosis for the most common disorders and for those that may be life-threatening. |
Challenges in and strategies for the surveillance of school health policies and practices: a commentary
Brener ND , Wechsler H , Kann L . J Sch Health 2014 84 (11) 687-9 Since 1994, the Centers for Disease Control and Prevention (CDC) has been monitoring policies and practices across multiple components of school health through 2 surveillance systems: the School Health Policies and Practices Study (SHPPS), a national survey periodically conducted at the state, district, school, and classroom levels, and the School Health Profiles (Profiles), a system of surveys assessing school health policies and practices in states, large urban school districts, territories, and tribal governments. CDC has encountered several challenges in implementing these systems. In this commentary, we describe the most common challenges encountered and the strategies that CDC has identified to address them. We hope our experiences will be helpful to others interested in monitoring school health policies and practices. |
Genotypic distribution and phylogenetic characterization of Enterocytozoon bieneusi in diarrheic chickens and pigs in multiple cities, China: potential zoonotic transmission.
Li W , Tao W , Jiang Y , Diao R , Yang J , Xiao L . PLoS One 2014 9 (9) e108279 This study investigated diarrheic broiler and layer chickens (<50 days; n = 14) and pigs of three age groups (preweaned <30 days, weaned approximately 30 to 60 days, and growing >60 days; n = 64) for E. bieneusi genotypes in northeast China and evaluated the potential roles of chickens and pigs in zoonotic transmission of microsporidiosis. Two 45-day-old layer chickens in the city of Jixi, Heilongjiang province, and one 23-day-old broiler chicken in the city of Songyuan, Jilin province, were identified by nested PCR and sequence analysis of the ribosomal internal transcribed spacer (ITS) as harboring the human-pathogenic E. bieneusi genotype Henan-IV and a new genotype named CC-1, respectively. Eleven of 64 (17.2%) duodenal mucosal specimens from pigs in Tianjin, Tongliao (Inner Mongolia), Jilin and Songyuan (Jilin province), and Daqing, Harbin, and Suihua (Heilongjiang province) were positive for E. bieneusi, with the infection rate of weaned pigs (35%, 7/20) significantly higher than that of preweaned pigs (3.6%, 1/28; P<0.05). Nucleotide sequences of the ITS were obtained from 6 pig specimens, belonging to 3 known genotypes: CHN7, EbpC, and Henan-IV. Previous reports have described genotypes EbpC and Henan-IV in humans and EbpC in wastewater in central China; this, together with the clustering of genotypes CC-1 and CHN7 into a major phylogenetic group of E. bieneusi genotypes with zoonotic potential, indicates that chickens and pigs could be potential sources of human microsporidiosis. To our knowledge, this is the first report describing the existence of zoonotic E. bieneusi genotypes in diarrheic chickens. |
Trypanosoma cruzi transmission in a Colombian Caribbean region suggests that secondary vectors play an important epidemiological role
Cantillo-Barraza O , Chaverra D , Marcet P , Arboleda-Sanchez S , Triana-Chavez O . Parasit Vectors 2014 7 (1) 381 BACKGROUND: Colombia, as part of The Andean Countries Initiative has given priority to triatomine control programs to eliminate primary (domiciliated) vector species such as Rhodnius prolixus and Triatoma dimidiata. However, recent events of Trypanosoma cruzi transmission in localities where R. prolixus and T. dimidiata are not present suggest that other species are involved in the T. cruzi transmission cycle. METHODS: We studied T. cruzi transmission on Margarita Island, located on the Magdalena River in the Colombian Caribbean region, where a high number of non-domiciliated triatomines infected with T. cruzi inside human dwellings have been observed. A cross-sectional survey including serological studies in humans and parasitological and molecular methods in vectors and reservoirs was conducted. We investigated risk factors for human infection and house infestation, and evaluated the association between abundance of wild triatomines in palm trees (Attalea butyracea) across municipalities, seasons and anthropogenic land use. RESULTS: The T. cruzi seroprevalence rate in humans was 1.7% (13/743) and autochthonous active T. cruzi transmission was detected. The infection risk was associated with the capture of triatomines in human dwellings. Five wild mammal species were infected with T. cruzi, where Didelphis marsupialis was the main reservoir host with an 86.3% (19/22) infection rate. TcIb was the only genotype present among vectors. Triatomine abundance was significantly higher in Ecosystem 2, as well as in the dry season. Despite the absence of triatomine domiciliation in this area, T. cruzi active transmission was registered with a human seroprevalence rate similar to that reported in areas with domesticated R. prolixus. 
CONCLUSIONS: This study illustrates the importance of secondary and household invading triatomines in Chagas disease epidemiology in the Caribbean lowlands of Colombia. |
Persistently high estimates of late night, indoor exposure to malaria vectors despite high coverage of insecticide treated nets
Bayoh MN , Walker ED , Kosgei J , Ombok M , Olang GB , Githeko AK , Killeen GF , Otieno P , Desai M , Lobo NF , Vulule JM , Hamel MJ , Kariuki S , Gimnig JE . Parasit Vectors 2014 7 (1) 380 BACKGROUND: It has been speculated that widespread and sustained use of insecticide treated bed nets (ITNs) for over 10 years in Asembo, western Kenya, may have selected for changes in the location (indoor versus outdoor) and time (from late night to earlier in the evening) of biting of the predominant species of human malaria vectors (Anopheles funestus, Anopheles gambiae sensu stricto, and Anopheles arabiensis). METHODS: Mosquitoes were collected by human landing catches over a six week period in June and July, 2011, indoors and outdoors from 17 h to 07 h, in 75 villages in Asembo, western Kenya. Collections were separated by hour of the night, and mosquitoes were identified to species and tested for sporozoite infection with Plasmodium falciparum. A subset was dissected to determine parity. Human behavior (time going to bed and rising, time spent indoors and outdoors) was quantified by cross-sectional survey. Data from past studies of a similar design and in nearby settings, but conducted before the ITN scale up commenced in the early 2000s, were compared with those from the present study. RESULTS: Of 1,960 Anopheles mosquitoes collected in 2011, 1,267 (64.6%) were morphologically identified as An. funestus, 663 (33.8%) as An. gambiae sensu lato (An. gambiae s.s. and An. arabiensis combined), and 30 (1.5%) as other anophelines. Of the 663 An. gambiae s.l. collected, 385 were successfully tested by PCR among which 235 (61.0%) were identified as An. gambiae s.s. while 150 (39.0%) were identified as An. arabiensis. Compared with data collected before the scale-up of ITNs, daily entomological inoculation rates (EIRs) were consistently lower for An. gambiae s.l. (indoor EIR = 0.432 in 1985-1988, 0.458 in 1989-1990, 0.023 in 2011), and An. 
arabiensis specifically (indoor EIR = 0.532 in 1989-1990, 0.039 in 2009, 0.006 in 2011) but not An. funestus (indoor EIR = 0.029 in 1985-1988, 0.147 in 1989-1990, 0.010 in 2009 and 0.103 in 2011). Sporozoite rates were lowest in 2009 but rose again in 2011. Compared with data collected before the scale-up of ITNs, An. arabiensis and An. funestus were more likely to bite outdoors and/or early in the evening (p < 0.001 for all comparisons). However, when estimates of human exposure that would occur indoors (pii) or while asleep (pis) in the absence of an ITN were generated based on human behavioral patterns, the changes were modest with >90% of exposure of non-ITN users to mosquito bites occurring while people were indoors in all years. The proportion of bites occurring among non-ITN users while they were asleep was ≥90% for all species except for An. arabiensis. For this species, 97% of bites occurred while people were asleep in 1989-1990 while in 2009 and 2011, 80% and 84% of bites occurred while people were asleep for those not using ITNs. Assuming ITNs prevent a theoretical maximum of 93.7% of bites, it was estimated that 64-77% of bites would have occurred among persons using nets while they were asleep in 1989-1990, while 20-52% of bites would have occurred among persons using nets while they were asleep in 2009 and 2011. CONCLUSIONS: This study found no evidence to support the contention that populations of Anopheles vectors of malaria in Asembo, western Kenya, are exhibiting departures from the well-known pattern of late night, indoor biting characteristic of these typically highly anthropophilic species. While outdoor, early evening transmission likely does occur in western Kenya, the majority of transmission still occurs indoors, late at night. Therefore, malaria control interventions such as ITNs that aim to reduce indoor biting by mosquitoes should continue to be prioritized. |
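The estimates of exposure occurring indoors (pii) described above weight hourly biting rates by human location patterns. A minimal sketch of that weighting for non-net-users, with invented hourly values rather than the study's data or its exact formula:

```python
def indoor_exposure_fraction(indoor_bites, outdoor_bites, frac_indoors):
    """Estimate the proportion of human exposure to bites occurring
    indoors (pii) for non-net-users: weight each hour's indoor biting
    rate by the fraction of people indoors at that hour, and each
    hour's outdoor rate by the fraction outdoors.
    """
    indoors = sum(b * f for b, f in zip(indoor_bites, frac_indoors))
    outdoors = sum(b * (1 - f) for b, f in zip(outdoor_bites, frac_indoors))
    return indoors / (indoors + outdoors)

# Invented hourly values (evening through dawn): biting peaks late at
# night, when nearly everyone is indoors
indoor = [2, 5, 12, 20, 18, 8]                # indoor bites/person/hour
outdoor = [4, 6, 8, 6, 3, 2]                  # outdoor bites/person/hour
people_in = [0.3, 0.7, 0.95, 1.0, 1.0, 0.9]   # fraction of people indoors
pii = indoor_exposure_fraction(indoor, outdoor, people_in)  # > 0.9
```

Even with nontrivial outdoor biting early in the evening, late-night indoor peaks dominate the weighted sum, which is why the abstract reports >90% of non-ITN-user exposure occurring indoors.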
Estimating the geographic distribution of human Tanapox and potential reservoirs using ecological niche modeling
Monroe BP , Nakazawa YJ , Reynolds MG , Carroll DS . Int J Health Geogr 2014 13 (1) 34 BACKGROUND: Tanapox virus causes a zoonotic infection characterized by mild febrile illness and one to several nodular skin lesions. The disease is endemic in parts of Africa. The principal reservoir for the virus that causes Tanapox is unknown, but has been hypothesized to be a non-human primate. This study employs ecological niche modeling (ENM) to determine areas of tropical Africa suitable for the occurrence of human Tanapox and to identify hypothetical reservoirs. The resultant niche model will be a useful tool to guide medical surveillance activities in the region. METHODS: This study uses the Desktop GARP software to predict regions where human Tanapox might be expected to occur, based on historical human case locations and environmental data. Additional modeling of primate species, using occurrence data from museum records, was performed to identify suitable disease reservoirs. RESULTS: The final ENM predicts a potential distribution of Tanapox over much of equatorial Africa, exceeding the borders of Kenya and Democratic Republic of Congo (DRC), where it has been historically reported. Five genera of non-human primates were found to be potential reservoir taxa. CONCLUSIONS: Validity testing suggests the model created here is robust (p < 0.04). Several genera of primates were identified as having ENMs overlapping with that of Tanapox and are suggested as potential reservoirs, mainly members of the genus Cercopithecus. The ENM technique has several limitations, and results should be interpreted with caution. This study may increase knowledge of and encourage further research into this neglected disease. |
The yin: an adverse health perspective of nanoceria: uptake, distribution, accumulation, and mechanisms of its toxicity
Yokel RA , Hussain S , Garantziotis S , Demokritou P , Castranova V , Cassee FR . Environ Sci Nano 2014 1 (5) 406-428 This critical review evolved from a SNO Special Workshop on Nanoceria panel presentation addressing the toxicological risks of nanoceria: accumulation, target organs, and issues of clearance; how exposure dose/concentration, exposure route, and experimental preparation/model influence the different reported effects of nanoceria; and how safer-by-design concepts can be applied to nanoceria. It focuses on the most relevant routes of human nanoceria exposure and uptake, disposition, persistence, and resultant adverse effects. The pulmonary, oral, dermal, and topical ocular exposure routes are addressed as well as the intravenous route, as the latter provides a reference for the pharmacokinetic fate of nanoceria once introduced into blood. Nanoceria reaching the blood is primarily distributed to mononuclear phagocytic system organs. Available data suggest nanoceria's distribution is not greatly affected by dose, shape, or dosing schedule. Significant attention has been paid to the inhalation exposure route. Nanoceria distribution from the lung to the rest of the body is less than 1% of the deposited dose, and from the gastrointestinal tract even less. Intracellular nanoceria and organ burdens persist for at least months, suggesting very slow clearance rates. The acute toxicity of nanoceria is very low. However, large/accumulated doses produce granulomas in the lung and liver, and fibrosis in the lung. Toxicity, including genotoxicity, increases with exposure time; the effects disappear slowly, possibly due to nanoceria's biopersistence. Nanoceria may exert toxicity through oxidative stress. Adverse effects seen at sites distal to exposure may be due to nanoceria translocation or released biomolecules. An example is elevated oxidative stress indicators in the brain, in the absence of appreciable brain nanoceria. 
Nanoceria may change its nature in biological environments and cause changes in biological molecules. Increased toxicity has been related to greater surface Ce3+, which becomes more relevant as particle size decreases and the ratio of surface area to volume increases. Given its biopersistence and resulting increased toxicity with time, there is a risk that long-term exposure to low nanoceria levels may eventually lead to adverse health effects. This critical review provides recommendations for research to resolve some of the many unknowns of nanoceria's fate and adverse effects. |
Large outbreak of Legionnaires' disease and Pontiac fever at a military base
Ambrose J , Hampton LM , Fleming-Dutra KE , Marten C , McClusky C , Perry C , Clemmons NA , McCormic Z , Peik S , Mancuso J , Brown E , Kozak N , Travis T , Lucas C , Fields B , Hicks L , Cersovsky SB . Epidemiol Infect 2014 142 (11) 2336-46 We investigated a mixed outbreak of Legionnaires' disease (LD) and Pontiac fever (PF) at a military base to identify the outbreak's environmental source as well as known legionellosis risk factors. Base workers with possible legionellosis were interviewed and, if consenting, underwent testing for legionellosis. A retrospective cohort study collected information on occupants of the buildings closest to the outbreak source. We identified 29 confirmed and probable LD and 38 PF cases. All cases were exposed to airborne pathogens from a cooling tower. Occupants of the building closest to the cooling tower were 6.9 [95% confidence interval (CI) 2.2-22.0] and 5.5 (95% CI 2.1-14.5) times more likely to develop LD and PF, respectively, than occupants of the next closest building. Thorough preventive measures and aggressive responses to outbreaks, including searching for PF cases in mixed legionellosis outbreaks, are essential for legionellosis control. |
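Cohort-study relative risks like the 6.9 and 5.5 reported above come from comparing attack rates between buildings. A minimal sketch with a log-based confidence interval; the occupant counts are hypothetical, not the study's data:

```python
import math

def risk_ratio_ci(cases_exp, n_exp, cases_unexp, n_unexp, z=1.96):
    """Cohort-study risk ratio with a log-based 95% CI: attack rate in
    the exposed cohort (closest building) divided by the attack rate
    in the comparison cohort (next closest building).
    """
    r1 = cases_exp / n_exp
    r0 = cases_unexp / n_unexp
    rr = r1 / r0
    se_log = math.sqrt((1 - r1) / cases_exp + (1 - r0) / cases_unexp)
    lower = math.exp(math.log(rr) - z * se_log)
    upper = math.exp(math.log(rr) + z * se_log)
    return rr, lower, upper

# Hypothetical occupant and case counts for the two buildings
rr, lower, upper = risk_ratio_ci(cases_exp=23, n_exp=200,
                                 cases_unexp=5, n_unexp=300)  # rr == 6.9
```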
Linking exposure and health in environmental public health tracking
Zhou Y , Jerrett M . Environ Res 2014 134c 453 The mission of the National Environmental Public Health Tracking Program (Tracking) at Centers for Disease Control and Prevention (CDC) is to provide information from a nationwide network of integrated health, environmental hazard, and exposure data that drives actions to improve the health of communities. This special issue contains a series of articles that either analyze the association between environmental exposure and health or address different issues encountered in conducting these linkage studies. | This issue begins with an overview paper by Strosnider et al., which summarizes the mission and history of the Tracking Program at CDC. It reviews the challenges currently faced by the Tracking Program and provides an overview of the recent collaborations with academic partners to address them, some of which are featured in this special issue. | Three articles explore the linkage between environmental pollution and health outcomes—two of them focus on air pollution; the other on agricultural land use as a proxy for pesticides exposure. Talbott et al. examine the impact of fine particulate matter (PM2.5) on cardiovascular disease hospitalizations for seven states within the CDC Tracking Program (Florida, Massachusetts, New Hampshire, New Jersey, New Mexico, New York, and Washington). Harris et al. analyze the association of PM2.5 with full-term births with low birth weight also using data from seven Tracking states (Connecticut, Maine, Minnesota, New Jersey, New York, Utah, and Wisconsin). Almberg et al. study the potential associations between county level data on the densities of particular crops and low birth weight and preterm births, using data from Missouri. |
At your fingertips
Eggers C . J Environ Health 2014 77 (1) 34-6 A group of blind men surrounds an elephant, and in an attempt to discern what the object is, each man reaches out and begins touching it. As each man feels a different part, he shares his observation. One man feels a leg and proclaims the pillar-like object to be a stout tree. Another one feels the trunk and declares the curved subject a snake. Still another touches an ear and pronounces the thing a fan. Although their observations are singularly plausible, it is only when all viewpoints are combined that the whole object becomes known as an elephant. | Although the tale may vary in its retelling, the message is clear. When separate information is shared and brought together, we are able to see a more complete picture. | Bringing varied and disparate information together to effect knowledge is a cornerstone of informatics. Public health informatics is the systematic application of knowledge about systems that capture, manage, analyze, and use information to improve population health. We use informatics to move from our current understanding to a more knowledgeable comprehension through acquiring and using information. |
CDC’s National Environmental Public Health Tracking Network adds pesticide exposure and prospective climate data
Outin YR . J Environ Health 2014 77 (3) 34-36 The Centers for Disease Control and Prevention’s (CDC’s) National Environmental Public Health Tracking Network (Tracking Network) expands content and functionality every year. This year, two new datasets were added: pesticide exposure and 70 years of prospective climate data. These represent two important environmental public health concerns. In 2012, pesticides were the 10th leading cause of poisoning exposure reported to poison control centers in the U.S. (Mowry, Spyker, Cantilena, Bailey, & Ford, 2013). Understanding how and where pesticide exposures are happening can inform public health interventions and public education on the dangers of using these chemicals inappropriately. Extreme heat events, or heat waves, are one of the leading causes of weather-related deaths in the U.S. Climate experts are particularly confident that climate change will bring increasingly frequent and severe heat waves and extreme weather events, as well as a rise in sea levels. These changes have the potential to affect human health in several direct and indirect ways, some of them severe. |
Dietary sources of methylated arsenic species in urine of the United States population, NHANES 2003-2010
deCastro BR , Caldwell KL , Jones RL , Blount BC , Pan Y , Ward C , Mortensen ME . PLoS One 2014 9 (9) e108098 BACKGROUND: Arsenic is a ubiquitous element linked to carcinogenicity and neurotoxicity, as well as adverse respiratory, gastrointestinal, hepatic, and dermal health effects. OBJECTIVE: Identify dietary sources of speciated arsenic: monomethylarsonic acid (MMA) and dimethylarsinic acid (DMA). METHODS: Age-stratified, sample-weighted regression of NHANES (National Health and Nutrition Examination Survey) 2003-2010 data (approximately 8,300 participants ≥6 years old) characterized the association between urinary arsenic species and the additional mass consumed of USDA-standardized food groups (24-hour dietary recall data), controlling for potential confounders. RESULTS: For all arsenic species, the rank-order of age strata for median urinary molar concentration was children 6-11 years > adults 20-84 years > adolescents 12-19 years, and for all age strata, the rank-order was DMA > MMA. Median urinary molar concentrations of methylated arsenic species ranged from 0.56 to 3.52 μmol/mol creatinine. Statistically significant increases in urinary arsenic species were associated with increased consumption of: fish (DMA); fruits (DMA, MMA); grain products (DMA, MMA); legumes, nuts, seeds (DMA); meat, poultry (DMA); rice (DMA, MMA); rice cakes/crackers (DMA, MMA); sugars, sweets, beverages (MMA); and, for adults, rice beverage/milk (DMA, MMA). In addition, based on US (United States) median and 90th percentile consumption rates of each food group, exposure from the following food groups was highlighted: fish; fruits; grain products; legumes, nuts, seeds; meat, poultry; and sugars, sweets, beverages. CONCLUSIONS: In a nationally representative sample of the US civilian, noninstitutionalized population, fish (adults), rice (children), and rice cakes/crackers (adolescents) had the largest associations with urinary DMA. 
For MMA, rice beverage/milk (adults) and rice cakes/crackers (children, adolescents) had the largest associations. |
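The sample-weighted regression underlying this analysis can be illustrated with a minimal sketch. The function name and data below are hypothetical; the actual NHANES analysis uses full survey-design weights and covariate adjustment, which this omits:

```python
def weighted_slope(x, y, w):
    """Closed-form slope of a weighted simple linear regression,
    the kind of fit used to relate a urinary biomarker to the
    additional mass of a food group consumed."""
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, x)) / sw
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw
    num = sum(wi * (xi - xbar) * (yi - ybar) for wi, xi, yi in zip(w, x, y))
    den = sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, x))
    return num / den

# Hypothetical: grams of rice consumed vs. urinary DMA, with survey weights
rice_g = [0.0, 50.0, 100.0, 150.0]
dma = [1.0, 1.6, 2.2, 2.8]
weights = [1.0, 2.0, 1.5, 0.5]
slope = weighted_slope(rice_g, dma, weights)   # ~0.012 per gram
```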
The global burden of listeriosis: a systematic review and meta-analysis.
de Noordhout CM , Devleesschauwer B , Angulo FJ , Verbeke G , Haagsma J , Kirk M , Havelaar A , Speybroeck N . Lancet Infect Dis 2014 14 (11) 1073-1082 BACKGROUND: Listeriosis, caused by Listeria monocytogenes, is an important foodborne disease that can be difficult to control and commonly results in severe clinical outcomes. We aimed to provide the first estimates of global numbers of illnesses, deaths, and disability-adjusted life-years (DALYs) due to listeriosis, by synthesising information and knowledge through a systematic review. METHODS: We retrieved data on listeriosis through a systematic review of peer-reviewed and grey literature (published in 1990-2012). We excluded incidence data from before 1990 from the analysis. We reviewed national surveillance data where available. We did a multilevel meta-analysis to impute missing country-specific listeriosis incidence rates. We used a meta-regression to calculate the proportions of health states, and a Monte Carlo simulation to generate DALYs by WHO subregion. FINDINGS: We screened 11 722 references and identified 87 eligible studies containing listeriosis data for inclusion in the meta-analyses. We estimated that, in 2010, listeriosis resulted in 23 150 illnesses (95% credible interval 6061-91 247), 5463 deaths (1401-21 497), and 172 823 DALYs (44 079-676 465). The proportion of perinatal cases was 20.7% (SD 1.7). INTERPRETATION: Our quantification of the global burden of listeriosis will enable international prioritisation exercises. The number of DALYs due to listeriosis was lower than that due to congenital toxoplasmosis but comparable to that due to echinococcosis. Urgent efforts are needed to fill the missing data in developing countries. We were unable to identify incidence data for the AFRO, EMRO, and SEARO WHO regions. FUNDING: WHO Foodborne Diseases Epidemiology Reference Group and the Université catholique de Louvain. |
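The Monte Carlo step used to generate DALYs can be illustrated with a toy sketch. Everything below (function name, uniform sampling, input ranges, years-lost parameters) is illustrative only; the paper's actual model used subregion-specific inputs and meta-regression-derived health-state proportions:

```python
import random

def dalys_monte_carlo(n_sims, cases_range, cfr_range,
                      yll_per_death, yld_per_case, seed=0):
    """Toy Monte Carlo propagation of input uncertainty to DALYs
    (DALY = years of life lost + years lived with disability)."""
    rng = random.Random(seed)
    sims = []
    for _ in range(n_sims):
        cases = rng.uniform(*cases_range)
        deaths = cases * rng.uniform(*cfr_range)   # case-fatality draw
        yll = deaths * yll_per_death               # years of life lost
        yld = (cases - deaths) * yld_per_case      # years lived with disability
        sims.append(yll + yld)
    sims.sort()
    # median and a 95% uncertainty interval, as in burden estimates
    return (sims[len(sims) // 2],
            sims[int(0.025 * n_sims)],
            sims[int(0.975 * n_sims)])

median, lo, hi = dalys_monte_carlo(
    10000, cases_range=(6000, 91000), cfr_range=(0.1, 0.3),
    yll_per_death=30.0, yld_per_case=0.05)
```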
Norovirus outbreak at a wildland fire base camp ignites investigation of restaurant inspection policies
Britton CL , Guzzle PL , Hahn CG , Carter KK . J Environ Health 2014 77 (1) 8-14; quiz 44 Norovirus outbreaks occur worldwide and have been associated with congregate settings (e.g., military and recreational camps). Investigation of a norovirus outbreak at a wildland fire base camp identified 49 (27%) illnesses among approximately 180 responders. Epidemiologic evidence implicated a restaurant as the infection source. Eight (89%) of nine wildland fire responder groups who ate at the restaurant had ill members; no groups who ate elsewhere reported ill members. A restaurant inspection by an environmental health specialist, conducted one year after the restaurant's opening, identified a lack of the managerial knowledge needed to protect against foodborne disease; earlier inspection after opening might have led to earlier intervention. States were surveyed to determine the existence of any policy or rule for food establishment inspection after opening, and the timing of such inspections. Among 18 states, five had no state rule or policy; nine had a policy in place; and four required post-opening inspection by rule. Further research is needed to evaluate post-opening inspection efficacy and timing. |
Filovirus RefSeq entries: evaluation and selection of filovirus type variants, type sequences, and names.
Kuhn JH , Andersen KG , Bao Y , Bavari S , Becker S , Bennett RS , Bergman NH , Blinkova O , Bradfute S , Brister JR , Bukreyev A , Chandran K , Chepurnov AA , Davey RA , Dietzgen RG , Doggett NA , Dolnik O , Dye JM , Enterlein S , Fenimore PW , Formenty P , Freiberg AN , Garry RF , Garza NL , Gire SK , Gonzalez JP , Griffiths A , Happi CT , Hensley LE , Herbert AS , Hevey MC , Hoenen T , Honko AN , Ignatyev GM , Jahrling PB , Johnson JC , Johnson KM , Kindrachuk J , Klenk HD , Kobinger G , Kochel TJ , Lackemeyer MG , Leroy EM , Lever MS , Muhlberger E , Netesov SV , Olinger GG , Omilabu SA , Palacios G , Panchal RG , Park DJ , Patterson JL , Paweska JT , Peters CJ , Pettitt J , Pitt L , Radoshitzky SR , Ryabchikova EI , Saphire EO , Sabeti PC , Sealfon R , Smither SJ , Sullivan NJ , Swanepoel R , Takada A , Towner JS , van der Groen G , Volchkov VE , Volchkova VA , Wahl-Jensen V , Warren TK , Warfield KL , Weidmann M , Nichol ST . Viruses 2014 6 (9) 3663-82 Sequence determination of complete or coding-complete genomes of viruses is becoming common practice for supporting the work of epidemiologists, ecologists, virologists, and taxonomists. Sequencing duration and costs are rapidly decreasing, sequencing hardware is under modification for use by non-experts, and software is constantly being improved to simplify sequence data management and analysis. Thus, analysis of virus disease outbreaks on the molecular level is now feasible, including characterization of the evolution of individual virus populations in single patients over time. The increasing accumulation of sequencing data creates a management problem for the curators of commonly used sequence databases and an entry retrieval problem for end users. Therefore, utilizing the data to their fullest potential will require setting nomenclature and annotation standards for virus isolates and associated genomic sequences. 
The National Center for Biotechnology Information's (NCBI's) RefSeq is a non-redundant, curated database for reference (or type) nucleotide sequence records that supplies source data to numerous other databases. Building on recently proposed templates for filovirus variant naming [<virus name> (<strain>)/<isolation host-suffix>/<country of sampling>/<year of sampling>/<genetic variant designation>-<isolate designation>], we report consensus decisions from a majority of past and currently active filovirus experts on the eight filovirus type variants and isolates to be represented in RefSeq, their final designations, and their associated sequences. |
Correlation between CYP1A1 transcript, protein level, enzyme activity and DNA adduct formation in normal human mammary epithelial cell strains exposed to benzo[a]pyrene.
Divi RL , Einem Lindeman TL , Shockley ME , Keshava C , Weston A , Poirier MC . Mutagenesis 2014 29 (6) 409-17 The polycyclic aromatic hydrocarbon (PAH) benzo(a)pyrene (BP) is thought to bind covalently to DNA, through metabolism by cytochrome P450 1A1 (CYP1A1) and CYP1B1, and other enzymes, to form r7,t8,t9-trihydroxy-c-10-(N^2-deoxyguanosyl)-7,8,9,10-tetrahydrobenzo[a]pyrene (BPdG). Evaluation of RNA expression data, to understand the contribution of different metabolic enzymes to BPdG formation, is typically presented as fold-change observed upon BP exposure, leaving the actual number of RNA transcripts unknown. Here, we have quantified RNA copies/ng cDNA (RNA cpn) for CYP1A1 and CYP1B1, as well as NAD(P)H:quinone oxidoreductase 1 (NQO1), which may reduce formation of BPdG adducts, using primary normal human mammary epithelial cell (NHMEC) strains, and the MCF-7 breast cancer cell line. In unexposed NHMECs, basal RNA cpn values were 58-836 for CYP1A1, 336-5587 for CYP1B1 and 5943-40112 for NQO1. In cells exposed to 4.0 microM BP for 12 h, RNA cpn values were 251-13234 for CYP1A1, 4133-57078 for CYP1B1 and 4456-55887 for NQO1. There were 3.5 (mean; range 0.2-15.8) BPdG adducts/10^8 nucleotides in the NHMECs (n = 16), and 790 in the MCF-7 cells. In the NHMECs, BP-induced CYP1A1 RNA cpn was highly associated with BPdG (P = 0.002), but CYP1B1 and NQO1 were not. Western blots of four NHMEC strains, chosen for different levels of BPdG adducts, showed a linear correlation between BPdG and CYP1A1, but not CYP1B1 or NQO1. Ethoxyresorufin-O-deethylase (EROD) activity, which measures CYP1A1 and CYP1B1 together, correlated with BPdG, but NQO1 activity did not. Despite higher levels of CYP1B1 and NQO1 RNA cpn in unexposed and BP-exposed NHMECs and MCF-7 cells, BPdG formation was correlated only with induction of CYP1A1 RNA cpn. The higher level of BPdG in MCF-7 cells, compared to NHMECs, may have been due to much greater induction of CYP1A1 and EROD. 
Overall, BPdG correlation was observed with CYP1A1 protein and CYP1A1/1B1 enzyme activity, but not with CYP1B1 or NQO1 protein, or NQO1 enzyme activity. |
pncA and bptA are not sufficient to complement Ixodes scapularis colonization and persistence by Borrelia burgdorferi in a linear plasmid lp25-deficient background.
Gilmore RD , Brandt KS , Hyde JA . Infect Immun 2014 82 (12) 5110-6 The complex segmented genome of Borrelia burgdorferi comprises a linear chromosome along with numerous linear and circular plasmids essential for tick and/or mammalian infectivity. The pathogenic necessity for specific borrelial plasmids has been identified; most notably, infection of the tick vector and mammalian host both require linear plasmid 25 (lp25). Genes encoded on lp25, specifically bptA and pncA, are postulated to play a role in the ability of B. burgdorferi to infect and persist in Ixodes ticks. In this study, we complemented an lp25-deficient borrelial strain with pncA or pncA accompanied by bptA to evaluate their ability to restore larval colonization and persistence through transstadial transmission relative to wild-type B. burgdorferi. The acquisition and/or survival of the complemented strains by larvae from infected mice was significantly decreased when assayed by cultivation and qPCR. Only 10% of the pncA-complemented strain was found by culture to survive 17 days following larval feeding, while 45% of the pncA/bptA-complemented strain survived, with similar results by PCR. However, neither of the complemented B. burgdorferi strains was capable of persisting through the molt to the nymphal stage as analyzed by culture. qPCR analyses of unfed nymphs detected B. burgdorferi genomes in several nymphs at low copy numbers, likely indicating the presence of DNA from dead or dying cells. Overall, the data indicate that pncA and bptA cannot independently support infection, suggesting that lp25 encodes additional gene(s) or regulatory elements critical for B. burgdorferi survival and pathogenesis in the Ixodes vector. |
Discovery of prosimian and afrotherian foamy viruses and potential cross species transmissions amidst stable and ancient mammalian co-evolution.
Katzourakis A , Aiewsakun P , Jia H , Wolfe ND , LeBreton M , Yoder AD , Switzer WM . Retrovirology 2014 11 61 BACKGROUND: Foamy viruses (FVs) are a unique subfamily of retroviruses that are widely distributed in mammals. Owing to the availability of sequences from diverse mammals coupled with their pattern of codivergence with their hosts, FVs have one of the best-understood viral evolutionary histories ever documented, estimated to have an ancient origin. Nonetheless, our knowledge of some parts of FV evolution, notably that of prosimian and afrotherian FVs, is far from complete due to the lack of sequence data. RESULTS: Here, we report the complete genome of the first extant prosimian FV (PSFV) isolated from a lorisiforme galago (PSFVgal), and a novel partial endogenous viral element with high sequence similarity to FVs, present in the afrotherian Cape golden mole genome (ChrEFV). We also further characterize a previously discovered endogenous PSFV present in the aye-aye genome (PSFVaye). Using phylogenetic methods and available FV sequence data, we show a deep divergence and stable co-evolution of FVs in eutherian mammals over 100 million years. Nonetheless, we found that the evolutionary histories of bat, aye-aye, and New World monkey FVs conflict with the evolutionary histories of their hosts. By combining sequence analysis and biogeographical knowledge, we propose explanations for these mismatches in FV-host evolutionary history. CONCLUSION: Our discovery of ChrEFV has expanded the FV host range to cover the whole eutherian clade, and our evolutionary analyses suggest a stable mammalian FV-host co-speciation pattern which extends as deep as the exafroplacentalian basal diversification. Nonetheless, two possible cases of host switching were observed. One was among New World monkey FVs, and the other involves PSFVaye and a bat FV which may involve cross-species transmission at the level of mammalian orders. 
Our results highlight the value of integrating multiple sources of information to elucidate the evolutionary history of viruses, including continental and geographical histories and ancestral host locations, in addition to the natural history of host and virus. |
Full genomic characterization of a novel genotype combination, G4P[14], of a human rotavirus strain from Barbados.
Tam KI , Roy S , Esona MD , Jones S , Sobers S , Morris-Glasgow V , Rey-Benito G , Gentsch JR , Bowen MD . Infect Genet Evol 2014 28 524-9 Since 2004, the Pan American Health Organization (PAHO) has carried out rotavirus surveillance in Latin America and the Caribbean. Here we report the characterization of human rotavirus with the novel G-P combination of G4P[14], detected through PAHO surveillance in Barbados. Full genome sequencing of strain RVA/Human-wt/BRB/CDC1133/2012/G4P[14] revealed that its genotype is G4-P[14]-I1-R1-C1-M1-A8-N1-T1-E1-H1. The possession of a Genogroup 1 (Wa-like) backbone distinguishes this strain from other P[14] rotavirus strains. Phylogenetic analyses suggested that this strain was likely generated by genetic reassortment between human, porcine and possibly other animal rotavirus strains and identified 7 lineages within the P[14] genotype. The results of this study reinforce the potential role of interspecies transmission in generating human rotavirus diversity through reassortment. Continued surveillance is important to determine if rotavirus vaccines will protect against strains that express the P[14] rotavirus genotype. |
RNA populations in immunocompromised patients as reservoirs for novel norovirus variants.
Vega E , Donaldson E , Huynh J , Barclay L , Lopman B , Baric R , Chen LF , Vinje J . J Virol 2014 88 (24) 14184-96 Noroviruses are the leading cause of acute gastroenteritis outbreaks worldwide. The majority of norovirus outbreaks are caused by genogroup II.4 (GII.4) noroviruses. Novel GII.4 noroviruses emerge every 2-4 years and replace older variants as the dominant norovirus. The process of the emergence of novel variants is believed to be caused by a combination of recombination, genetic drift, and selection driven by population immunity, but how or where these novel variants emerge is not known. We detected two previously unknown novel GII.4 variants, termed GII.4 UNK1 and GII.4 UNK2, and a diverse norovirus population in fecal specimens from immunocompromised individuals with diarrhea after they had undergone bone-marrow transplantation. We hypothesized that immunocompromised individuals can serve as reservoirs for novel norovirus variants. To test our hypothesis, metagenomic analysis of viral RNA populations was combined with a full-genome bioinformatic analysis of publicly available GII.4 norovirus sequences from 1974-2014 to identify converging sites. Localization analysis indicated that variable sites were more likely to be within two amino acids (P < 0.05) of positively selected sites. Further analysis indicated that polymorphic site distribution was random and its proximity to positively selected sites was dependent on the size of the norovirus genome and the number of positively selected sites. The results indicate that random mutations can have a positive impact on driving norovirus evolution and that immunocompromised individuals have the ability to serve as potential reservoirs for novel GII.4 strains. IMPORTANCE: Norovirus is the most common cause of viral gastroenteritis in the US. Every two to three years, novel norovirus variants emerge and rapidly disseminate throughout the world. 
The continual emergence of novel noroviruses is believed to be caused by a combination of genetic drift, population immunity, and recombination, but exactly how this emergence occurs remains unknown. In this study we identified two novel GII.4 variants in immunocompromised bone marrow transplant patients. Using metagenomic and bioinformatics analyses, we show that most genetic polymorphisms in the novel variants occur near (within 0-2 amino acids of) positively selected sites, but that the distribution of mutations was random; clustering of polymorphisms with positively selected sites was a result of genome size and the number of mutations and positively selected sites. This study shows that immunocompromised patients can harbor infectious novel norovirus variants and that, although mutations in viruses are random, they can have a positive effect on viral evolution. |
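The localization analysis described above, asking whether variable sites cluster near positively selected sites, reduces to a nearest-site distance computation. A minimal sketch, with hypothetical residue positions (the function name and coordinates are illustrative, not the study's data):

```python
def nearest_site_distances(variable_sites, selected_sites):
    """Distance (in amino acid positions) from each variable site
    to its nearest positively selected site."""
    return [min(abs(v - s) for s in selected_sites) for v in variable_sites]

# Hypothetical capsid positions
variable = [10, 50, 298, 303]
selected = [12, 300]
dists = nearest_site_distances(variable, selected)        # [2, 38, 2, 3]
share_within_2 = sum(d <= 2 for d in dists) / len(dists)  # 0.5
```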
HGTector: an automated method facilitating genome-wide discovery of putative horizontal gene transfers
Zhu Q , Kosoy M , Dittmar K . BMC Genomics 2014 15 (1) 717 BACKGROUND: First-pass methods based on BLAST matches are commonly used as an initial step to separate the different phylogenetic histories of genes in microbial genomes, and target putative horizontal gene transfer (HGT) events. This will continue to be necessary given the rapid growth of genomic data and the technical difficulties in conducting large-scale explicit phylogenetic analyses. However, these methods often produce misleading results due to their inability to resolve indirect phylogenetic links and their vulnerability to stochastic events. RESULTS: A new computational method of rapid, exhaustive and genome-wide detection of HGT was developed, featuring the systematic analysis of BLAST hit distribution patterns in the context of a priori defined hierarchical evolutionary categories. Genes that fall beyond a series of statistically determined thresholds are identified as not adhering to the typical vertical history of the organisms in question, but instead having a putative horizontal origin. Tests on simulated genomic data suggest that this approach effectively targets atypically distributed genes that are highly likely to be HGT-derived, and exhibits robust performance compared to conventional BLAST-based approaches. This method was further tested on real genomic datasets, including Rickettsia genomes, and was compared to previous studies. Results show consistency with currently employed categories of HGT prediction methods. In-depth analysis of both simulated and real genomic data suggests that the method is notably insensitive to stochastic events such as gene loss, rate variation and database error, which are common challenges to the current methodology. An automated pipeline was created to implement this approach and was made publicly available at: https://github.com/DittmarLab/HGTector. The program is versatile, easily deployed, and has low computational resource requirements. 
CONCLUSIONS: HGTector is an effective tool for initial or standalone large-scale discovery of candidate HGT-derived genes. |
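The hit-distribution idea can be illustrated with a rough sketch. This is not HGTector's actual statistics, only the core intuition: a gene whose aggregate BLAST support from the "close" (expected vertical) taxonomic group falls far below the genome-wide norm is a candidate for horizontal origin. All names and data below are hypothetical:

```python
from statistics import mean, stdev

def flag_hgt_candidates(close_scores, z_cutoff=-2.0):
    """Flag genes whose summed BLAST bit-score support from the
    close (expected vertical) taxonomic group is far below the
    genome-wide norm, i.e., an atypically weak vertical signal."""
    values = list(close_scores.values())
    mu, sigma = mean(values), stdev(values)
    return [gene for gene, score in close_scores.items()
            if (score - mu) / sigma < z_cutoff]

# Hypothetical per-gene close-group bit-score sums
scores = {f"gene{i}": 500.0 + i for i in range(50)}
scores["geneX"] = 20.0   # outlier: almost no close-group hits
candidates = flag_hgt_candidates(scores)   # ["geneX"]
```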
Transmission of methicillin-resistant Staphylococcus aureus infection through solid organ transplantation: confirmation via whole genome sequencing.
Wendt JM , Kaul D , Limbago BM , Ramesh M , Cohle S , Denison AM , Driebe EM , Rasheed JK , Zaki SR , Blau DM , Paddock CD , McDougal LK , Engelthaler DM , Keim PS , Roe CC , Akselrod H , Kuehnert MJ , Basavaraju SV . Am J Transplant 2014 14 (11) 2633-9 We describe two cases of donor-derived methicillin-resistant Staphylococcus aureus (MRSA) bacteremia that developed after transplantation of organs from a common donor who died from acute MRSA endocarditis. Both recipients developed recurrent MRSA infection despite appropriate antibiotic therapy, and required prolonged hospitalization and hospital readmission. Comparison of S. aureus whole genome sequence of DNA extracted from fixed donor tissue and recipients' isolates confirmed donor-derived transmission. Current guidelines emphasize the risk posed by donors with bacteremia from multidrug-resistant organisms. This investigation suggests that, particularly in the setting of donor endocarditis, even a standard course of prophylactic antibiotics may not be sufficient to prevent donor-derived infection. |
The role of public health in antimicrobial stewardship in healthcare
Trivedi KK , Pollack LA . Clin Infect Dis 2014 59 Suppl 3 S101-3 Education, surveillance, and promotion of antimicrobial stewardship align with the goals of public health to prevent disease, promote health, and prolong life. Many US federal and state public health organizations are already engaged in antimicrobial stewardship activities. Healthcare providers are encouraged to work with public health officials on appropriate local antimicrobial stewardship strategies to attain the common goal of reducing antimicrobial resistance and preserving antimicrobials for future generations. |
Antimicrobial stewardship: importance for patient and public health
File TM Jr , Srinivasan A , Bartlett JG . Clin Infect Dis 2014 59 Suppl 3 S93-6 The discovery of potent antimicrobial agents was one of the greatest contributions to medicine in the 20th century. When introduced, they had an immediate and dramatic impact on the outcomes of infectious diseases, making once-lethal infections readily curable. Unfortunately, the emergence of antimicrobial-resistant pathogens now threatens these advances. Resistance is a serious health threat that affects patients' clinical outcomes and results in higher rates of adverse events and higher healthcare costs. | The seriousness of the health impact of antibiotic resistance and the limited pipeline of new antibiotics have combined to make antibiotic resistance a major public health crisis. Unfortunately, every day patients already contract infections that cannot be treated with currently available antibiotics. The crisis of antibiotic resistance has been highlighted by academicians, practicing clinicians, professional societies, and public health agencies [1–9]. What can be done to address this crisis? There is no question that antibiotic use is the most important modifiable factor in tackling the problem of antibiotic resistance. Although principles of appropriate use have been encouraged since the introduction of antimicrobials, abiding by them is now more urgent than ever. The discouraging fact is that, for decades now, a large percentage of antibiotic use in both inpatient and outpatient settings has been either totally unnecessary or incorrectly prescribed [5,10]. The good news is that we do have a solution to this problem. Since their inception, antimicrobial stewardship programs have proven highly successful in improving antibiotic use. Published studies demonstrate that these programs can improve patient outcomes, reduce adverse events (including Clostridium difficile), reduce readmission rates, and even reduce antibiotic resistance [11–16]. 
The proven benefits of antimicrobial stewardship programs have led to increasing calls for their implementation in all hospitals. |
Core elements of hospital antibiotic stewardship programs from the Centers for Disease Control and Prevention
Pollack LA , Srinivasan A . Clin Infect Dis 2014 59 Suppl 3 S97-s100 The proven benefits of antibiotic stewardship programs (ASPs) for optimizing antibiotic use and minimizing adverse events, such as Clostridium difficile and antibiotic resistance, have prompted the Centers for Disease Control and Prevention (CDC) to recommend that all hospitals have an ASP. This article summarizes Core Elements of Hospital Antibiotic Stewardship Programs, a recently released CDC document focused on defining the infrastructure and practices of coordinated multidisciplinary programs to improve antibiotic use and patient care in US hospitals. |
Does chlorhexidine bathing in adult intensive care units reduce blood culture contamination? A pragmatic cluster-randomized trial
Septimus EJ , Hayden MK , Kleinman K , Avery TR , Moody J , Weinstein RA , Hickok J , Lankiewicz J , Gombosev A , Haffenreffer K , Kaganov RE , Jernigan JA , Perlin JB , Platt R , Huang SS . Infect Control Hosp Epidemiol 2014 35 Suppl 3 S17-22 OBJECTIVE: To determine rates of blood culture contamination comparing 3 strategies to prevent intensive care unit (ICU) infections: screening and isolation, targeted decolonization, and universal decolonization. DESIGN: Pragmatic cluster-randomized trial. SETTING: Forty-three hospitals with 74 ICUs; 42 of 43 were community hospitals. PATIENTS: Patients admitted to adult ICUs from July 1, 2009, to September 30, 2011. METHODS: After a 6-month baseline period, hospitals were randomly assigned to 1 of 3 strategies, with all participating adult ICUs in a given hospital assigned to the same strategy. Arm 1 implemented methicillin-resistant Staphylococcus aureus (MRSA) nares screening and isolation, arm 2 targeted decolonization (screening, isolation, and decolonization of MRSA carriers), and arm 3 conducted no screening but universal decolonization of all patients with mupirocin and chlorhexidine (CHG) bathing. Blood culture contamination rates in the intervention period were compared to the baseline period across all 3 arms. RESULTS: During the 6-month baseline period, 7,926 blood cultures were collected from 3,399 unique patients: 1,099 sets in arm 1, 928 in arm 2, and 1,372 in arm 3. During the 18-month intervention period, 22,761 blood cultures were collected from 9,878 unique patients: 3,055 sets in arm 1, 3,213 in arm 2, and 3,610 in arm 3. Among all individual draws, for arms 1, 2, and 3, the contamination rates were 4.1%, 3.9%, and 3.8% for the baseline period and 3.3%, 3.2%, and 2.4% for the intervention period, respectively. 
When we evaluated sets of blood cultures rather than individual draws, the contamination rate in arm 1 (screening and isolation) was 9.8% (N = 108 sets) in the baseline period and 7.5% (N = 228) in the intervention period. For arm 2 (targeted decolonization), the baseline rate was 8.4% (N = 78) compared to 7.5% (N = 241) in the intervention period. Arm 3 (universal decolonization) had the greatest decrease in contamination rate, with a decrease from 8.7% (N = 119) contaminated blood cultures during the baseline period to 5.1% (N = 184) during the intervention period. Logistic regression models demonstrated a significant difference across the arms when comparing the reduction in contamination between baseline and intervention periods in both unadjusted (P = .02) and adjusted (P = .02) analyses. Arm 3 resulted in the greatest reduction in blood culture contamination rates, with an unadjusted odds ratio (OR) of 0.56 (95% confidence interval [CI], 0.44-0.71) and an adjusted OR of 0.55 (95% CI, 0.43-0.71). CONCLUSION: In this large cluster-randomized trial, we demonstrated that universal decolonization with CHG bathing resulted in a significant reduction in blood culture contamination. |
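The arm 3 result can be roughly reproduced from the published counts, reading each N as the number of contaminated sets (an interpretation of the abstract). The sketch below uses a simple 2×2 odds ratio with a Wald confidence interval; the helper name is ours, and the small gap from the reported unadjusted OR of 0.56 presumably reflects the paper's regression-based estimate:

```python
from math import exp, log, sqrt

def odds_ratio_wald_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI for a 2x2 table:
    a/b = contaminated/clean in intervention, c/d in baseline."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Arm 3 (universal decolonization): 184/3,610 contaminated sets in the
# intervention period vs 119/1,372 at baseline (counts from the abstract).
or_, lo, hi = odds_ratio_wald_ci(184, 3610 - 184, 119, 1372 - 119)
# or_ is approximately 0.57 with CI (0.45, 0.72), close to the reported 0.56
```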
Post-licensure surveillance of trivalent live attenuated influenza vaccine in adults, United States, Vaccine Adverse Event Reporting System (VAERS), July 2005-June 2013.
Haber P , Moro P , McNeil MM , Lewis P , Woo EJ , Hughes H , Shimabukuro TT . Vaccine 2014 32 (48) 6499-504 BACKGROUND: Trivalent live attenuated influenza vaccine (LAIV3) was licensed and recommended for use in 2003 in children and adults 2-49 years of age. Post-licensure safety data have been limited, particularly in adults. METHODS: We searched the Vaccine Adverse Event Reporting System (VAERS) for US reports after LAIV3 from July 1, 2005-June 30, 2013 (eight influenza seasons) in adults aged ≥18 years. We conducted descriptive analyses and clinically reviewed serious reports (i.e., death, hospitalization, prolonged hospitalization, life-threatening illness, and disability) and reports of selected conditions of interest. We used empirical Bayesian data mining to identify adverse events (AEs) that were reported more frequently than expected. We calculated crude AE reporting rates to VAERS by influenza season. RESULTS: During the study period, VAERS received 1207 LAIV3 reports in adults aged 18-49 years; 107 (8.9%) were serious, including four death reports. The most commonly reported events were expired drug administered (n=207, 17%), headache (n=192, 16%), and fever (n=133, 11%). The most common diagnostic categories for non-fatal serious reports were neurological (n=40, 39%), cardiovascular (n=14, 14%), and other non-infectious conditions (n=20, 19%). We noted a higher proportion of Guillain-Barre syndrome (GBS) and cardiovascular reports in the Department of Defense (DoD) population compared to the civilian population. Data mining detected disproportional reporting of ataxia (n=15); clinical review revealed that ataxia was a component of diverse clinical entities including GBS. CONCLUSIONS: Review of VAERS reports is reassuring; the only unexpected safety concern for LAIV3 identified was a higher than expected number of GBS reports in the DoD population, which is being investigated. 
Reports of administration of expired LAIV3 represent administration errors and indicate the need for education, training and screening regarding the approved indications. |
Vaccination with tetanus, diphtheria, and acellular pertussis vaccine of pregnant women enrolled in Medicaid - Michigan, 2011-2013
Housey M , Zhang F , Miller C , Lyon-Callo S , McFadden J , Garcia E , Potter R . MMWR Morb Mortal Wkly Rep 2014 63 (38) 839-42 In October 2011, the Advisory Committee on Immunization Practices (ACIP) first recommended the routine administration of a tetanus, diphtheria, and acellular pertussis vaccine (Tdap) during pregnancy as a strategy to protect infants from pertussis (also known as whooping cough). This recommendation applied to women previously unvaccinated with Tdap and specified the optimal vaccination time as late second or third trimester (after 20 weeks' gestation). By vaccinating pregnant women, infants, who are at highest risk for mortality and morbidity from pertussis, gain passive immunity from maternal antibodies transferred to them in utero (2-4). Since this recommendation was made, little has been published on the percentage of women receiving Tdap during pregnancy. In Michigan, Medicaid pays for costs of pregnancy for approximately 40% of births. Infants enrolled in Medicaid are a particularly vulnerable population; in Michigan, their all-cause mortality is higher than that of privately insured infants. To assess vaccination coverage among pregnant women enrolled in a publicly funded insurance program in Michigan, Medicaid administrative claims data and statewide immunization information system data for mothers of infants born during November 2011-February 2013 were analyzed. This report describes the results of that analysis, which indicated that only 14.3% of these women received Tdap during pregnancy, with rates highest (17.6%) among non-Hispanic, non-Arab whites and lowest (6.8%) among Arab women. Vaccination was related to maternal age and gestational age at birth, but not to adequacy of prenatal care. In 2013, recognizing the importance of Tdap for every pregnancy, ACIP revised its guidelines to include a Tdap dose during every pregnancy. 
Ensuring that all infants receive the protection against pertussis afforded by maternal vaccination will require enhanced efforts to vaccinate pregnant women. |
Lot-to-lot consistency of live attenuated SA 14-14-2 Japanese encephalitis vaccine manufactured in a good manufacturing practice facility and non-inferiority with respect to an earlier product
Zaman K , Naser AM , Power M , Yaich M , Zhang L , Ginsburg AS , Luby SP , Rahman M , Hills S , Bhardwaj M , Flores J . Vaccine 2014 32 (46) 6061-6 We conducted a four-arm, double-blind, randomized controlled trial among 818 Bangladeshi infants between 10 and 12 months of age to establish equivalence among three lots of live attenuated SA 14-14-2 JE vaccine manufactured by the China National Biotec Group's Chengdu Institute of Biological Products (CDIBP) in a new Good Manufacturing Practice (GMP) facility and to evaluate non-inferiority of the product with a lot of the same vaccine manufactured in CDIBP's original facility. The study took place in two sites in Bangladesh, rural Matlab and Mirpur in urban Dhaka. We collected pre-vaccination (Day 0) and post-vaccination Day 28 (-4 to +14 days) blood samples to assess neutralizing anti-JE virus antibody titers in serum by plaque reduction neutralization tests (PRNT). Seroprotection following vaccination was defined as a PRNT titer ≥1:10 at Day 28 in participants non-immune at baseline. Follow-up for reactogenicity and safety was conducted through home visits at Day 7 and monitoring for serious adverse events through Day 28. Seroprotection rates ranged from 80.2% to 86.3% for all four lots of vaccine. Equivalence of the seroprotection rates between pairs of vaccine lots produced in the new GMP facility was satisfied at the pre-specified 10% margin of the 95% confidence interval (CI) for two of the three pairwise comparisons, but not for the third (-4.3% observed difference with 95% CI of -11.9 to 3.3%). Nevertheless, the aggregate seroprotection rate for all three vaccine lots manufactured in the GMP facility was calculated and found to be within the non-inferiority margin (within 10%) to the vaccine lot produced in the original facility. All four lots of vaccine were safe and well tolerated. 
These study results should facilitate the use of SA 14-14-2 JE vaccine as a routine component of immunization programs in Asian countries. |
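The lot-to-lot equivalence criterion described in this abstract — declaring equivalence only when the entire 95% CI for the difference in seroprotection rates falls within a ±10% margin — can be sketched as follows (a minimal illustration with hypothetical counts, not the trial's data):

```python
from math import sqrt

def diff_ci(x1, n1, x2, n2, z=1.96):
    """Wald 95% CI for the difference between two seroprotection rates."""
    p1, p2 = x1 / n1, x2 / n2
    d = p1 - p2
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return d, d - z * se, d + z * se

def equivalent(lo, hi, margin=0.10):
    """Equivalence is declared only when the whole CI lies within ±margin."""
    return -margin < lo and hi < margin

# illustrative pairwise lot comparison (hypothetical seroprotection counts)
d, lo, hi = diff_ci(160, 195, 168, 195)
print(round(d, 3), round(lo, 3), round(hi, 3), equivalent(lo, hi))
```

As in the trial's third pairwise comparison, a modest observed difference can still fail the criterion when the lower CI bound crosses the -10% margin.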
Polio eradication. Efficacy of inactivated poliovirus vaccine in India
Jafari H , Deshpande JM , Sutter RW , Bahl S , Verma H , Ahmad M , Kunwar A , Vishwakarma R , Agarwal A , Jain S , Estivariz C , Sethi R , Molodecky NA , Grassly NC , Pallansch MA , Chatterjee A , Aylward RB . Science 2014 345 (6199) 922-5 Inactivated poliovirus vaccine (IPV) is efficacious against paralytic disease, but its effect on mucosal immunity is debated. We assessed the efficacy of IPV in boosting mucosal immunity. Participants received IPV, bivalent types 1 and 3 oral poliovirus vaccine (bOPV), or no vaccine. A bOPV challenge was administered 4 weeks later, and excretion was assessed 3, 7, and 14 days later. Nine hundred and fifty-four participants completed the study. Any fecal shedding of poliovirus type 1 was 8.8, 9.1, and 13.5% in the IPV group and 14.4, 24.1, and 52.4% in the control group in the 6- to 11-month, 5-year, and 10-year age groups, respectively (IPV versus control: Fisher's exact test P < 0.001). IPV reduced excretion of poliovirus types 1 and 3 by 38.9-74.2% and 52.8-75.7%, respectively. Thus, IPV in OPV-vaccinated individuals boosts intestinal mucosal immunity. |
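The reported shedding reductions can be reproduced directly from the type 1 shedding percentages given in the abstract; the 38.9% and 74.2% bounds correspond to the youngest and oldest age groups:

```python
def pct_reduction(control_shedding, ipv_shedding):
    """Percent reduction in fecal shedding in the IPV arm relative to control."""
    return (control_shedding - ipv_shedding) / control_shedding * 100

# poliovirus type 1 shedding (%) from the abstract: (IPV, control) by age group
for ipv, ctrl in [(8.8, 14.4), (9.1, 24.1), (13.5, 52.4)]:
    print(round(pct_reduction(ctrl, ipv), 1))  # 38.9, 62.2, 74.2
```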
Protection against gastroenteritis in US households with children who received rotavirus vaccine
Cortese MM , Dahl RM , Curns AT , Parashar UD . J Infect Dis 2014 211 (4) 558-62 We used the Truven Health MarketScan claims database (2008-2011) to compare gastroenteritis rates during January-June among households whose child had received rotavirus vaccine with rates among households whose child had not. Statistically significantly lower rates of hospitalization with a rotavirus gastroenteritis or unspecified-gastroenteritis discharge code occurred in vaccinated households among persons 20-29 years and females 20-29 years (2008/09), and among males 30-39 years (2009/10). Lower emergency department gastroenteritis rates occurred in vaccinated households among females 20-29 years (2009/2010) and individuals 5-19 years (2010/2011). These data suggest that rotavirus vaccination of infants provides indirect protection against moderate-to-severe rotavirus disease in young parents and older siblings. |
Effectiveness of seasonal trivalent inactivated influenza vaccine in preventing influenza hospitalisations and primary care visits in Auckland, New Zealand, in 2013
Turner N , Pierse N , Bissielo A , Huang Q , Radke S , Baker M , Widdowson M , Kelly H . Euro Surveill 2014 19 (34) This study reports the first vaccine effectiveness (VE) estimates for the prevention of general practice visits and hospitalisations for laboratory-confirmed influenza from an urban population in Auckland, New Zealand, in the same influenza season (2013). A case test-negative design was used to estimate propensity-adjusted VE in both hospital and community settings. Patients with a severe acute respiratory infection (SARI) or influenza-like illness (ILI) were defined as requiring hospitalisation (SARI) or attending a general practice (ILI) with a history of fever or measured temperature ≥38 °C, cough, and onset within the past 10 days. Those who tested positive for influenza virus were cases while those who tested negative were controls. Results were analysed to 7 days post symptom onset and adjusted for the propensity to be vaccinated and the timing during the influenza season. Influenza vaccination provided 52% (95% CI: 32 to 66) protection against laboratory-confirmed influenza hospitalisation and 56% (95% CI: 34 to 70) against presenting to general practice with influenza. VE estimates were similar for all types and subtypes. This study found moderate effectiveness of influenza vaccine against medically attended and hospitalised influenza in New Zealand, a temperate, southern hemisphere country, during the 2013 winter season. |
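Under the test-negative design used here (and in the repeated-vaccination study that follows), VE is estimated as (1 − odds ratio) × 100, where the odds ratio compares the odds of vaccination between influenza-positive cases and test-negative controls. The study additionally adjusted for vaccination propensity and calendar time; this unadjusted sketch uses hypothetical counts:

```python
def vaccine_effectiveness(case_vax, case_unvax, ctrl_vax, ctrl_unvax):
    """VE (%) = (1 - OR) * 100, where OR is the ratio of vaccination odds
    among test-positive cases to vaccination odds among test-negative controls."""
    odds_ratio = (case_vax / case_unvax) / (ctrl_vax / ctrl_unvax)
    return (1 - odds_ratio) * 100

# hypothetical 2x2 counts, not the study's data
print(round(vaccine_effectiveness(25, 100, 50, 100), 1))  # 50.0
```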
Impact of repeated vaccination on vaccine effectiveness against influenza A(H3N2) and B during 8 seasons
McLean HQ , Thompson MG , Sundaram ME , Meece JK , McClure DL , Friedrich TC , Belongia EA . Clin Infect Dis 2014 59 (10) 1375-85 BACKGROUND: Recent studies suggest that influenza vaccination in the previous season may influence the effectiveness of current-season vaccination, but this has not been assessed in a single population over multiple years. METHODS: Patients presenting with acute respiratory illness were prospectively enrolled during the 2004-2005 through 2012-2013 influenza seasons. Respiratory swabs were tested for influenza and vaccination dates obtained from a validated registry. Vaccination status was determined for the current, previous, and prior 5 seasons. Vaccine effectiveness (VE) was calculated for participants aged ≥9 years using logistic regression models with an interaction term for vaccination history. RESULTS: There were 7315 enrollments during 8 seasons; 1056 (14%) and 650 (9%) were positive for influenza A(H3N2) and B, respectively. Vaccination during current only, previous only, or both seasons yielded similar protection against H3N2 (adjusted VE range, 31%-36%) and B (52%-66%). In the analysis using 5 years of historical vaccination data, current season VE against H3N2 was significantly higher among vaccinated individuals with no prior vaccination history (65%; 95% confidence interval [CI], 36%-80%) compared with vaccinated individuals with a frequent vaccination history (24%; 95% CI, 3%-41%; P = .01). VE against B was 75% (95% CI, 50%-87%) and 48% (95% CI, 29%-62%), respectively (P = .05). Similar findings were observed when analysis was restricted to adults 18-49 years. CONCLUSIONS: Current- and previous-season vaccination generated similar levels of protection, and vaccine-induced protection was greatest for individuals not vaccinated during the prior 5 years. Additional studies are needed to understand the long-term effects of annual vaccination. |
Does preventing rotavirus infections through vaccination also protect against naturally-occurring intussusception over time?
Payne DC , Baggs J , Klein NP , Parashar UD . Clin Infect Dis 2014 60 (1) 163-4 Intestinal intussusception is an uncommon event (incidence approximately 30 per 100 000 per year in US infants) in which one part of the intestine folds into another. The condition is usually considered idiopathic, but numerous case reports link intussusception to enteric pathogens, notably adenovirus. Several case reports have linked wild-type rotavirus to intussusception, although evidence for this link is inconclusive [1]. | In 1999, a previous rotavirus vaccine (Rotashield) was found to be associated with intussusception [2] and was withdrawn from the US market. Currently, 2 rotavirus vaccines are routinely administered to US infants: RotaTeq and Rotarix. Large clinical trials (>60 000 children each) found no statistical association between vaccination and intussusception within 42 days following any dose [3, 4]. Nonetheless, with the accumulation of study power over time, recent postlicensure rotavirus vaccine safety assessments in the United States [5, 6] report a modest but significant risk of intussusception. | One could postulate that if vaccines based on 3 different live-attenuated rotavirus vaccine strains (rhesus in Rotashield, bovine-human reassortants in RotaTeq, and human rotavirus in Rotarix) are statistically associated with intussusception, then the wild-type rotavirus (naturally nonattenuated and most virulent) could also plausibly be associated with intussusception. We studied a vaccine probe hypothesis: If wild-type rotavirus infection is causally associated with intussusception, then the prevention of such infections through vaccination could conceivably protect against intussusception during time periods after vaccination. |
Using the National Death Index to identify duplicate cancer incident cases in Florida and New York, 1996-2005
Wohler B , Qiao B , Weir HK , MacKinnon JA , Schymura MJ . Prev Chronic Dis 2014 11 E167 INTRODUCTION: Cancer registries link incidence data to state death certificates to update vital status and identify missing cases; they also link these data to the National Death Index (NDI) to update vital status among patients who leave the state after their diagnosis. This study explored the use of information from NDI linkages to identify potential duplicate cancer cases registered in both Florida and New York. METHODS: The Florida Cancer Data System (FCDS) and the New York State Cancer Registry (NYSCR) linked incidence data with state and NDI death records from 1996 through 2005. Information for patients whose death occurred in the reciprocal state (the death state) was exchanged. Potential duplicate cases were those that had the same diagnosis and the same or similar diagnosis date. RESULTS: NDI identified 4,657 FCDS cancer patients who died in New York and 2,740 NYSCR cancer patients who died in Florida. Matching identified 5,030 cases registered in both states; 508 were death certificate-only (DCO) cases in the death state's registry, and 3,760 (74.8%) were potential duplicates. Among FCDS and NYSCR patients who died and were registered in the registry of the reciprocal state, more than 50% were registered with the same cancer diagnosis, and approximately 80% had similar diagnosis dates (within 1 year). CONCLUSION: NDI identified DCO cases in the death state's cancer registry and a large proportion of potential duplicate cases. Standards are needed for assigning primary residence when multiple registries report the same case. The registry initiating the NDI linkage should consider sharing relevant information with death state registries so that these registries can remove erroneous DCO cases from their databases. |
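The duplicate-detection rule described above — same diagnosis plus the same or similar diagnosis date (within one year) — can be sketched as a simple record comparison (field names and the ICD-style codes are hypothetical):

```python
from datetime import date

def potential_duplicate(rec_a, rec_b, max_days=365):
    """Flag two registry records as potential duplicates: same diagnosis
    and diagnosis dates within one year of each other."""
    same_dx = rec_a["diagnosis"] == rec_b["diagnosis"]
    close_dates = abs((rec_a["dx_date"] - rec_b["dx_date"]).days) <= max_days
    return same_dx and close_dates

# hypothetical records registered in two state registries
a = {"diagnosis": "C50.9", "dx_date": date(2001, 3, 10)}
b = {"diagnosis": "C50.9", "dx_date": date(2001, 9, 2)}
print(potential_duplicate(a, b))  # True
```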
Antimycobacterial activity of DNA intercalator inhibitors of Mycobacterium tuberculosis primase DnaG.
Gajadeera C , Willby MJ , Green KD , Shaul P , Fridman M , Garneau-Tsodikova S , Posey JE , Tsodikov OV . J Antibiot (Tokyo) 2014 68 (3) 153-7 Owing to the rise in drug resistance in tuberculosis combined with the global spread of its causative pathogen, Mycobacterium tuberculosis (Mtb), innovative antimycobacterial agents are urgently needed. Recently, we developed a novel primase-pyrophosphatase assay and used it to discover inhibitors of an essential Mtb enzyme, primase DnaG (Mtb DnaG), a promising and unexplored potential target for novel antituberculosis chemotherapeutics. Doxorubicin, an anthracycline antibiotic used as an anticancer drug, was found to be a potent inhibitor of Mtb DnaG. In this study, we investigated both inhibition of Mtb DnaG and inhibitory activity against in vitro growth of Mtb and M. smegmatis (Msm) by other anthracyclines, daunorubicin and idarubicin, as well as by less cytotoxic DNA intercalators: aloe-emodin, rhein, and a mitoxantrone derivative. Generally, low-µM inhibition of Mtb DnaG by the anthracyclines was correlated with their low-µM minimum inhibitory concentrations. Aloe-emodin displayed threefold weaker potency than doxorubicin against Mtb DnaG and similar inhibition of Msm (but not Mtb) in the mid-µM range, whereas rhein (a close analog of aloe-emodin) and a di-glucosylated mitoxantrone derivative did not show significant inhibition of Mtb DnaG or antimycobacterial activity. Taken together, these observations strongly suggest that several clinically used anthracyclines and aloe-emodin target mycobacterial primase, setting the stage for a more extensive exploration of this enzyme as an antibacterial target. |
Activation of the RIG-I pathway during influenza vaccination enhances the germinal center reaction, promotes T follicular helper cell induction, and provides a dose-sparing effect and protective immunity.
Kulkarni RR , Rasheed MA , Bhaumik SK , Ranjan P , Cao W , Davis C , Marisetti K , Thomas S , Gangappa S , Sambhara S , Kaja MK . J Virol 2014 88 (24) 13990-4001 Pattern recognition receptors (PRRs) sense certain molecular patterns uniquely expressed by pathogens. Retinoic-acid-inducible gene I (RIG-I) is a cytosolic PRR that senses viral nucleic acids and induces innate immune activation and secretion of type-I IFNs. Here, using influenza vaccine antigens, we investigated the consequences of activating the RIG-I pathway on antigen-specific adaptive immune responses. We found that mice immunized with influenza vaccine antigens co-administered with 5' ppp-dsRNA, a RIG-I ligand, developed robust levels of hemagglutination-inhibiting antibodies, an enhanced germinal center reaction, and T follicular helper cell responses. In addition, RIG-I activation enhanced antibody affinity maturation and plasma cell responses in draining lymph nodes, spleen, and bone marrow and conferred protective immunity against virus challenge. Importantly, activation of the RIG-I pathway reduced the antigen requirement by 10- to 100-fold for inducing optimal influenza-specific cellular and humoral responses, including protective immunity. The effects induced by 5' ppp-dsRNA were significantly dependent on type-I IFN and IPS-1 (an adapter protein downstream of RIG-I) signaling, but were independent of MyD88- or TLR3-mediated pathways. Our results show that activation of the RIG-I-like receptor pathway programs innate immunity to achieve qualitatively and quantitatively enhanced protective cellular adaptive immune responses even at low antigen doses and thus indicate the potential utility of RIG-I ligands as molecular adjuvants for viral vaccines. 
STUDY IMPORTANCE STATEMENT: The recently discovered RNA helicase family of RIG-I-like receptors (RLRs) is a critical component of host defense mechanisms responsible for detecting viruses and triggering innate antiviral cytokines that help control viral replication and dissemination. In this study, we show that the RLR pathway can be effectively exploited to enhance adaptive immunity and protective immune memory against viral infection. Our results show that activation of the RIG-I pathway along with influenza vaccination programs innate immunity to induce qualitatively and quantitatively superior protective adaptive immunity against pandemic influenza viruses. More importantly, RIG-I activation at the time of vaccination allows induction of robust adaptive responses even at reduced vaccine antigen doses. These results highlight the potential utility of exploiting the RIG-I pathway to enhance viral vaccine-specific immunity and have broader implications for designing better vaccines in general. |
Clinical laboratory response to a mock outbreak of invasive bacterial infections: a preparedness study.
Olsen RJ , Fittipaldi N , Kachroo P , Sanson MA , Long SW , Como-Sabetti KJ , Valson C , Cantu C , Lynfield R , Van Beneden C , Beres SB , Musser JM . J Clin Microbiol 2014 52 (12) 4210-6 Large, hospital-based clinical laboratories must be prepared to rapidly investigate potential infectious disease outbreaks. To challenge the ability of our molecular diagnostics laboratory to use whole genome sequencing in a potential outbreak scenario and identify impediments, we studied 84 invasive serotype emm59 group A Streptococcus (GAS) strains collected in the United States. We performed a rapid-response exercise to the mock outbreak scenario using whole genome sequencing, genome-wide transcript analysis and mouse virulence studies. Protocol changes installed in response to lessons learned were tested in a second iteration. The initial investigation was completed in 9 days. Whole genome sequencing showed that the invasive infections were caused by multiple subclones of epidemic emm59 GAS likely spread to the United States from Canada. The phylogenetic tree showed a strong temporal-spatial structure with diversity in mobile genetic element content, features useful for identifying closely related strains and possible transmission events. The genome data informed the epidemiology, identifying multiple patients who likely acquired the organisms through direct person-to-person transmission. Transcriptome analysis unexpectedly revealed significantly altered expression of genes encoding a two-component regulator and the hyaluronic acid capsule virulence factor. Mouse infection studies confirmed a high-virulence capacity of these emm59 organisms. Whole genome sequencing, coupled with transcriptome analysis and animal virulence studies, can be rapidly performed in a clinical environment to effectively contribute to patient care decisions and public health maneuvers. |
Synthesis, characterization, and bioactivity of carboxylic acid-functionalized titanium dioxide nanobelts
Hamilton RF , Wu N , Xiang C , Li M , Yang F , Wolfarth M , Porter DW , Holian A . Part Fibre Toxicol 2014 11 (1) 43 BACKGROUND: Surface modification strategies to reduce engineered nanomaterial (ENM) bioactivity have been used successfully in carbon nanotubes. This study examined the toxicity and inflammatory potential of two surface modifications (humic acid and carboxylation) on titanium nanobelts (TNB). METHODS: The in vitro exposure models included C57BL/6 alveolar macrophages (AM) and transformed human THP-1 cells exposed to TNB for 24 h in culture. Cell death and NLRP3 inflammasome activation (IL-1β release) were monitored. Short-term (4 and 24 h) in vivo studies in C57BL/6, BALB/c, and IL-1R null mice evaluated inflammation and cytokine release, and cytokine release from ex vivo cultured AM. RESULTS: Both in vitro cell models suggest that the humic acid modification does not significantly affect TNB bioactivity, while carboxylation reduced both toxicity and NLRP3 inflammasome activation. In addition, short-term in vivo exposures in both C57BL/6 and IL-1R null mouse strains demonstrated decreased markers of inflammation, supporting the in vitro finding that carboxylation is effective in reducing bioactivity. TNB instillations in IL-1R null mice demonstrated the critical role of IL-1β in the initiation of TNB-induced lung inflammation. Neutrophils were completely absent in the lungs of IL-1R null mice instilled with TNB for 24 h. However, the cytokine content of the IL-1R null mouse lung lavage samples indicated that other inflammatory agents, IL-6 and TNF-α, were constitutively elevated, indicating a potential compensatory inflammatory mechanism in the absence of IL-1 receptors. CONCLUSIONS: Taken together, these data suggest that carboxylation, but not humic acid modification, reduces but does not totally eliminate the bioactivity of TNB, which is consistent with previous studies of other long-aspect-ratio nanomaterials such as carbon nanotubes. |
An LC-MS/MS method for serum methylmalonic acid suitable for monitoring vitamin B12 status in population surveys
Mineva EM , Zhang M , Rabinowitz DJ , Phinney KW , Pfeiffer CM . Anal Bioanal Chem 2014 407 (11) 2955-64 Methylmalonic acid (MMA), a functional indicator of vitamin B12 insufficiency, was measured in the US population in the National Health and Nutrition Examination Survey (NHANES) from 1999 to 2004 using a GC/MS procedure that required 275 µL of sample and had a low throughput (36 samples/run). Our objective was to introduce a more efficient yet highly accurate LC-MS/MS method for NHANES 2011-2014. We adapted the sample preparation with some modifications from a published isotope-dilution LC-MS/MS procedure. The procedure utilized liquid-liquid extraction and generation of MMA dibutyl ester. Reversed-phase chromatography with isocratic elution allowed baseline resolution of MMA from its naturally occurring structural isomer succinic acid within 4.5 min. Our new method afforded an increased throughput (≤160 samples/run) and measured serum MMA with high sensitivity (LOD = 22.1 nmol/L) in only 75 µL of sample. Mean (±SD) recovery of MMA spiked into serum (2 d, 4 levels, 2 replicates each) was 94% ± 5.5%. Total imprecision (41 d, 2 replicates each) for three serum quality control pools was 4.9%-7.9% (97.1-548 nmol/L). The LC-MS/MS method showed excellent correlation (n = 326, r = 0.99) and no bias (Deming regression, Bland-Altman analysis) compared to the previous GC/MS method. Both methods produced virtually identical mean (±SD) MMA concentrations on a future plasma reference material [LC-MS/MS: 18.47 ± 0.71 ng/mL (n = 17), GC/MS: 18.18 ± 0.67 ng/mL (n = 11)] compared with a GC/MS procedure from the National Institute of Standards and Technology [18.41 ± 0.70 ng/mL (n = 15)]. No adjustment will be necessary to compare previous (1999-2004) to future (2011-2014) NHANES MMA data. |
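The validation statistics reported in this abstract — spike recovery and total imprecision — follow standard formulas; a minimal sketch with hypothetical QC values (not the study's data):

```python
from statistics import mean, pstdev

def spike_recovery(measured, baseline, spiked_amount):
    """Percent recovery of an analyte spike: (measured - baseline) / spike."""
    return (measured - baseline) / spiked_amount * 100

def cv_percent(values):
    """Coefficient of variation (%), a common summary of assay imprecision."""
    return pstdev(values) / mean(values) * 100

# hypothetical serum MMA results (nmol/L): baseline 100, spiked with 170
print(round(spike_recovery(260.0, 100.0, 170.0), 1))  # 94.1

# hypothetical repeated QC pool measurements
qc = [97.0, 102.0, 99.0, 101.0, 95.0]
print(round(cv_percent(qc), 1))
```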
Nowcasting the spread of chikungunya virus in the Americas
Johansson MA , Powers AM , Pesik N , Cohen NJ , Staples JE . PLoS One 2014 9 (8) e104915 BACKGROUND: In December 2013, the first locally-acquired chikungunya virus (CHIKV) infections in the Americas were reported in the Caribbean. As of May 16, 55,992 cases had been reported and the outbreak was still spreading. Identification of newly affected locations is paramount to intervention activities, but challenging due to limitations of current data on the outbreak and on CHIKV transmission. We developed models to make probabilistic predictions of spread based on current data considering these limitations. METHODS AND FINDINGS: Branching process models capturing travel patterns, local infection prevalence, climate dependent transmission factors, and associated uncertainty estimates were developed to predict probable locations for the arrival of CHIKV-infected travelers and for the initiation of local transmission. Many international cities and areas close to where transmission has already occurred were likely to have received infected travelers. Of the ten locations predicted to be the most likely locations for introduced CHIKV transmission in the first four months of the outbreak, eight had reported local cases by the end of April. Eight additional locations were likely to have had introduction leading to local transmission in April, but with substantial uncertainty. CONCLUSIONS: Branching process models can characterize the risk of CHIKV introduction and spread during the ongoing outbreak. Local transmission of CHIKV is currently likely in several Caribbean locations and possible, though uncertain, for other locations in the continental United States, Central America, and South America. This modeling framework may also be useful for other outbreaks where the risk of pathogen spread over heterogeneous transportation networks must be rapidly assessed on the basis of limited information. |
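A toy version of the importation component of such a model — the probability that at least one infected traveler arrives, given traveler volume and origin prevalence — can be sketched as follows. The parameter values are hypothetical, and the paper's branching process additionally models local transmission, climate-dependent factors, and uncertainty:

```python
def p_introduction(prevalence, travelers_per_month, months):
    """P(at least one infected traveler arrives), assuming each traveler is
    independently infected with probability `prevalence`."""
    n = travelers_per_month * months
    return 1 - (1 - prevalence) ** n

# hypothetical: 0.1% origin prevalence, 500 travelers/month, 4 months
print(round(p_introduction(0.001, 500, 4), 3))
```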
Effects of developmental methylphenidate (MPH) treatment on monoamine neurochemistry of male and female rats
Panos JJ , O'Callaghan JP , Miller DB , Ferguson SA . Neurotoxicol Teratol 2014 45c 70-74 Attention Deficit Hyperactivity Disorder (ADHD) is estimated to affect 4-5% of the adult human population (Kessler et al., 2006; Willcutt, 2012). Often prescribed to attenuate ADHD symptoms (Nair and Moss, 2009), methylphenidate hydrochloride (MPH) can have substantial positive effects. However, there is a paucity of literature regarding its use during pregnancy. Thus, adult women with ADHD face a difficult decision when contemplating pregnancy. In this study, pregnant Sprague-Dawley rats were orally treated a total of 0 (water), 6 (low), 18 (medium), or 42 (high) mgMPH/kgbodyweight/day (divided into three doses) on gestational days 6-21 (i.e., the low dose received 2mgMPH/kgbodyweight3x/day). Offspring were orally treated with the same daily dose as their dam (divided into two doses) on postnatal days (PNDs) 1-21. One offspring/sex/litter was sacrificed at PND 22 or PND 104 (n=6-7/age/sex/treatment group) and the striatum was quickly dissected and frozen. High Performance Liquid Chromatography (HPLC) coupled to a Photo Diode Array detector (PDA) was used to analyze monoamine content in the striatum of one side while a sandwich ELISA was used to analyze tyrosine hydroxylase (TH) from the other side. Age significantly affected monoamine and metabolite content as well as turnover ratios (i.e., DA, DOPAC, HVA, DOPAC/DA, HVA/DA, 5-HT and 5-HIAA); however, there were no significant effects of sex. Adult rats of the low MPH group had higher DA levels than control adults (p<0.05). At both ages, subjects of the low MPH group had higher TH levels than controls (p<0.05), although neither effect (i.e., higher DA or TH levels) exhibited an apparent dose-response. PND 22 subjects of the high MPH treatment group had higher ratios of HVA/DA and DOPAC/DA than same-age control subjects (p<0.05). 
The increased TH levels of the low MPH group may be related to the increased DA levels of adult rats. While developmental MPH treatment appears to have some effects on monoamine system development, further studies are required to determine if these alterations manifest as functional changes in behavior. |
Elevated Staphylococcus ceftriaxone MICs are an Etest artifact
Limbago BM , Pierce VM , Lonsway DR , Ferraro MJ . Clin Infect Dis 2014 60 (1) 162-3 The recent publication by Pickering et al [1] described a collection of methicillin-susceptible Staphylococcus aureus (MSSA) that displayed elevated ceftriaxone minimum inhibitory concentrations (MICs) when tested by Etest (bioMerieux, Durham, North Carolina) gradient diffusion and would have been called “Resistant” to ceftriaxone based on previous Clinical and Laboratory Standards Institute (CLSI) interpretive guidance. The authors reported that approximately 60% of MSSA tested at their institution would have been misclassified based on the current CLSI guidance, which recommends testing staphylococci only against penicillin and oxacillin or cefoxitin in order to infer susceptibility or resistance to other β-lactam agents. This article was available electronically ahead of print for several months. Although it was subsequently retracted as “an honest error in interpretation,” we believe a fuller explanation of the findings could improve understanding among Clinical Infectious Diseases readership. | We investigated the accuracy of the initial report by performing reference broth microdilution (BMD), disk diffusion, and Etest [both low (0.002–32 µg/mL) and high (0.016–256 µg/mL) range ceftriaxone Etest products] antimicrobial susceptibility testing on 8 pulsed field gel electrophoresis (PFGE)-matched pairs of MSSA from the Pickering study [1] reported to have mismatched ceftriaxone susceptibility. All 16 isolates were confirmed as oxacillin, cefoxitin, and ceftriaxone susceptible [2, 3] with BMD and disk methods. Ceftriaxone MICs obtained by both Etest products were typically higher than those obtained with BMD but were still in the susceptible range for 100% of isolates using the high concentration ceftriaxone Etest, and for 93.8% of isolates using the low concentration ceftriaxone Etest (1 isolate tested as intermediate). 
In addition, 30 consecutive, unique MSSA isolated from blood cultures during 2 months at a single hospital were tested against ceftriaxone by BMD, disk diffusion, and Etest using a single 0.5 McFarland inoculum. All isolates tested ceftriaxone susceptible by disk diffusion and BMD; 13 (43%) isolates tested nonsusceptible with Etest (Table 1). We also note that the Etest ceftriaxone package inserts do not list staphylococci as an organism group for which testing has been cleared [4, 5]. |
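The MIC categorization at issue in this exchange reduces to comparing a measured MIC against interpretive breakpoints; a sketch with hypothetical breakpoint defaults (actual CLSI breakpoints vary by drug-organism combination and standard edition):

```python
def interpret_mic(mic, s_max=8.0, r_min=64.0):
    """Categorize an MIC (µg/mL) as S/I/R against interpretive breakpoints.
    The breakpoint defaults here are hypothetical placeholders."""
    if mic <= s_max:
        return "S"
    if mic >= r_min:
        return "R"
    return "I"

print(interpret_mic(2.0))    # "S"
print(interpret_mic(128.0))  # "R"
```

An Etest that reads even one or two doubling dilutions high, as described above, can push an isolate across such a breakpoint and flip its reported category.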
Endothelium-derived hyperpolarizing factor mediates bradykinin-stimulated tissue plasminogen activator release in humans
Rahman AM , Murrow JR , Ozkor MA , Kavtaradze N , Lin J , De Staercke C , Hooper WC , Manatunga A , Hayek S , Quyyumi AA . J Vasc Res 2014 51 (3) 200-8 AIMS: Bradykinin (BK) stimulates tissue plasminogen activator (t-PA) release from human endothelium. Although BK stimulates both nitric oxide and endothelium-derived hyperpolarizing factor (EDHF) release, the role of EDHF in t-PA release remains unexplored. This study sought to determine the mechanisms of BK-stimulated t-PA release in the forearm vasculature of healthy human subjects. METHODS: In 33 healthy subjects (age 40.3 ± 1.9 years), forearm blood flow (FBF) and t-PA release were measured at rest and after intra-arterial infusions of BK (400 ng/min) and sodium nitroprusside (3.2 mg/min). Measurements were repeated after intra-arterial infusion of tetraethylammonium chloride (TEA; 1 µmol/min), fluconazole (0.4 µmol·min(-1)·l(-1)), and N(G)-monomethyl-L-arginine (L-NMMA; 8 µmol/min) to block nitric oxide, and their combination in separate studies. RESULTS: BK significantly increased net t-PA release across the forearm (p < 0.0001). Fluconazole attenuated both BK-mediated vasodilation (-23.3 ± 2.7% FBF, p < 0.0001) and t-PA release (from 50.9 ± 9.0 to 21.3 ± 8.9 ng/min/100 ml, p = 0.02). TEA attenuated FBF (-14.7 ± 3.2%, p = 0.002) and abolished BK-stimulated t-PA release (from 22.9 ± 5.7 to -0.8 ± 3.6 ng/min/100 ml, p = 0.0002). L-NMMA attenuated FBF (p < 0.0001) but did not inhibit BK-induced t-PA release (nonsignificant). CONCLUSION: BK-stimulated t-PA release is partly due to cytochrome P450-derived epoxides and is inhibited by K(Ca) channel blockade. Thus, BK stimulates both EDHF-dependent vasodilation and t-PA release. |
Improved specificity and reduced subtype cross-reactivity for antibody detection by ELISA using globular head domain recombinant hemagglutinin
Li ZN , Carney PJ , Lin SC , Li J , Chang JC , Veguilla V , Stevens J , Miller JD , Levine M , Katz JM , Hancock K . J Virol Methods 2014 209 121-5 The relative performance of ELISA using globular head domain (GH) and ectodomain hemagglutinins (HAs) as antigens to detect influenza A virus IgG antibody responses was assessed. Assay sensitivity and subtype cross-reactivity were evaluated using sera collected from recipients of monovalent H5N1 vaccine and A(H1N1)pdm09 virus-infected persons. Assay specificity was determined using collections of sera from either individuals unexposed to either H5N1 or A(H1N1)pdm09 viruses or exposed to H5N1 or A(H1N1)pdm09 viruses through vaccination or infection, respectively. ELISA using GH HA showed a similar degree of sensitivity, significantly higher specificity, and significantly lower subtype cross-reactivity compared to ELISA using ectodomain HA. |
The Bristol stool scale and its relationship to Clostridium difficile infection
Caroff DA , Edelstein PH , Hamilton K , Pegues DA . J Clin Microbiol 2014 52 (9) 3437-9 The Bristol stool form scale classifies the relative density of stool samples. In a prospective cohort study, we investigated the associations between stool density, C. difficile assay positivity, hospital-onset C. difficile infection, complications, and severity of C. difficile infection. We describe associations between the Bristol score, assay positivity, and clinical C. difficile infection. |
Characterization of Lone Pine, California, tremolite asbestos and preparation of research material
Harper M , Van Gosen B , Crankshaw OS , Doorn SS , Ennis TJ , Harrison SE . Ann Occup Hyg 2014 59 (1) 91-103 Well-characterized amphibole asbestos mineral samples are required for use as analytical standards and in future research projects. Currently, the National Institute for Standards and Technology Standard Reference Material samples of asbestos are listed as 'Discontinued'. The National Institute for Occupational Safety and Health (NIOSH) has a goal under the Asbestos Roadmap of locating and characterizing research materials for future use. Where an initial characterization analysis determines that a collected material is appropriate for use as a research material in terms of composition and asbestiform habit, sufficient amounts of the material will be collected to make it publicly available. An abandoned mine near Lone Pine, California, contains a vein of tremolite asbestos, which was the probable source of a reference material that has been available for the past 17 years from the Health and Safety Laboratory (HSL) in the UK. Newly collected fibrous vein material from this mine was analyzed at Research Triangle Institute (RTI International) with some additional analysis by the US Geological Survey's Denver Microbeam Laboratory. The analysis at RTI International included: (i) polarized light microscopy (PLM) with a determination of principal optical properties; (ii) X-ray diffraction; (iii) transmission electron microscopy, including energy dispersive X-ray spectroscopy and selected-area electron diffraction; and (iv) spindle stage analysis using PLM to determine whether individual fibers and bundles of the samples were polycrystalline or single-crystal cleavage fragments. The overall findings of the study indicated that the material is tremolite asbestos with characteristics substantially similar to the earlier distributed HSL reference material. 
A larger quantity of material was prepared by sorting, acid-washing, and mixing for sub-division into vials of ~10 g each. These vials have been transferred from NIOSH to RTI International, from where they can be obtained on request. |
Mortality of New York children with sickle cell disease identified through newborn screening.
Wang Y , Liu G , Caggana M , Kennedy J , Zimmerman R , Oyeku SO , Werner EM , Grant AM , Green NS , Grosse SD . Genet Med 2014 17 (6) 452-9 PURPOSE: Long-term follow-up of newborn screening for conditions such as sickle cell disease can be conducted using linkages to population-based data. We sought to estimate childhood sickle cell disease mortality and risk factors among a statewide birth cohort with sickle cell disease identified through newborn screening. METHODS: Children with sickle cell disease identified by newborn screening and born to New York residents in 2000-2008 were matched to birth and death certificates. Mortality rates were calculated (using numbers of deaths and observed person-years at risk) and compared with mortality rates for all New York children by maternal race/ethnicity. Stratified analyses were conducted to examine associations between selected factors and mortality. RESULTS: Among 1,911 infants with sickle cell disease matched to birth certificates, 21 deaths were identified. All-cause mortality following diagnosis was 3.8 per 1,000 person-years in the first 2 years of life and 1.0 per 1,000 person-years at ages 2-9 years. The mortality rate was significantly lower among children of foreign-born mothers and significantly higher among preterm infants with low birth weight. Mortality rates for infants with sickle cell disease beyond 28 days of life were not significantly higher than for all New York births, but they were 2.7-8.4 times higher for children 1 through 9 years old with homozygous sickle cell disease than for all non-Hispanic black or Hispanic children born to New York residents. CONCLUSION: Estimated mortality risk in children with homozygous sickle cell disease remains elevated even after adjustment for maternal race/ethnicity. These results provide evidence regarding the current burden of child mortality among children with sickle cell disease despite newborn screening. |
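The person-time rate arithmetic described in the METHODS above is straightforward; a minimal sketch (the death and person-year counts below are hypothetical, chosen only to echo the reported 3.8 per 1,000 person-years):

```python
def mortality_rate_per_1000_py(deaths: int, person_years: float) -> float:
    """All-cause mortality rate per 1,000 person-years at risk."""
    return 1000 * deaths / person_years

# Hypothetical cohort slice: 14 deaths over 3,684 person-years of follow-up
rate = mortality_rate_per_1000_py(14, 3684)
```

The same function applies to any age stratum; only the deaths and observed person-years change.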
The use of continuous surveys to generate and continuously report high quality timely maternal and newborn health data at the district level in Tanzania and Uganda
Marchant T , Schellenberg J , Peterson S , Manzi F , Waiswa P , Hanson C , Temu S , Darious K , Sedekia Y , Akuze J , Rowe AK . Implement Sci 2014 9 112 BACKGROUND: The lack of high quality timely data for evidence-informed decision making at the district level presents a challenge to improving maternal and newborn survival in low income settings. To address this problem, the EQUIP project (Expanded Quality Management using Information Power) implemented a continuous household and health facility survey for continuous feedback of data in two districts each in Tanzania and Uganda as part of a quality improvement innovation for mothers and newborns. METHODS: Within EQUIP, continuous survey data were used for quality improvement (intervention districts) and for effect evaluation (intervention and comparison districts). Over 30 months of intervention (November 2011 to April 2014), EQUIP conducted continuous cross-sectional household and health facility surveys using 24 independent probability samples of household clusters to represent each district each month, and repeat censuses of all government health facilities. Using repeat samples in this way allowed data to be aggregated at six four-monthly intervals to track progress over time for evaluation, and for continuous feedback to quality improvement teams in intervention districts. In both countries, one continuous survey team of eight people was employed to complete approximately 7,200 household and 200 facility interviews in year one. Data were collected using personal digital assistants. After every four months, routine tabulations of indicators were produced and synthesized into report cards for use by the quality improvement teams. RESULTS: The first 12 months were implemented as planned. Completion of household interviews was 96% in Tanzania and 91% in Uganda. 
Indicators across the continuum of care were tabulated every four months, results discussed by quality improvement teams, and report cards generated to support their work. CONCLUSIONS: The EQUIP continuous surveys were feasible to implement as a method to continuously generate and report on demand and supply side indicators for maternal and newborn health; they have potential to be expanded to include other health topics. Documenting the design and implementation of a continuous data collection and feedback mechanism for prospective description, quality improvement, and evaluation in a low-income setting potentially represents a new paradigm that places equal weight on data systems for course correction, as well as evaluation. |
News from CDC: the Legacy for Children parenting model, partnering to translate research to practice for children in poverty
Robinson LR , Perou R , Leeb RT . Transl Behav Med 2014 4 (3) 232-3 Approximately 16 million US children currently live in poverty [1]. Children living in poverty experience significant disparities on indicators of physical and mental health and academic success [2–6]. The importance of positive early experiences and the benefits of early intervention to mitigate the life-long effects of poverty have been confirmed in biologic [7], economic [8, 9], and social models [10]. Unfortunately, early childhood interventions have historically been limited in producing impacts when taken to scale. This has been attributed in part to a lack of quality assurance when moving from research to practice [11] and poor attention to scalability and dissemination when developing programs [12]. | To promote optimal development for children in poverty, the Centers for Disease Control and Prevention’s (CDC’s) National Center on Birth Defects and Developmental Disabilities (NCBDDD) developed the Legacy for Children™ (Legacy) model. NCBDDD seeks to promote the health of babies, children, and adults and enhance the potential for full, productive living through public health partnership, research, prevention, and education programs (http://www.cdc.gov/NCBDDD/AboutUs/index.html). Legacy is an evidence-based public health prevention approach to promote child health and development among families in poverty via a group-based parenting program [13]. The Legacy parenting curricula are developmentally sequenced sessions that cover themes including children’s physical health, safety and nutrition, responsive and sensitive parenting, fostering children’s development, and maternal self-care. Intervention components include mother and mother/child session time, one-on-one time to reinforce content, and participation in community events. |
Overprescribing and inappropriate antibiotic selection for children with pharyngitis in the United States, 1997-2010
Dooling KL , Shapiro DJ , Van Beneden C , Hersh AL , Hicks LA . JAMA Pediatr 2014 168 (11) 1073-4 Pharyngitis is a common reason for pediatric health care visits.[1] While viral infections account for the majority of pharyngitis episodes, group A Streptococcus (GAS) is implicated in approximately 37% of episodes among children.[1] Antimicrobial treatment of GAS pharyngitis can shorten illness duration, prevent complications, and minimize transmission to others.[2] Evidence-based guidelines for GAS pharyngitis recommend narrow-spectrum penicillins (amoxicillin or penicillin) as first-line therapy; they are effective and GAS is universally susceptible to these agents.[2] | In a recent study in adults with sore throat, most patients received broader-spectrum antibiotics, commonly macrolides, instead of first-line therapy.[3] We characterized the frequency and appropriateness of antibiotic prescribing for pharyngitis in children. |
PCV7-induced changes in pneumococcal carriage and invasive disease burden in Alaskan children
Keck JW , Wenger JD , Bruden DL , Rudolph KM , Hurlburt DA , Hennessy TW , Bruce MG . Vaccine 2014 32 (48) 6478-84 BACKGROUND: Changes in pneumococcal serotype-specific carriage and invasive pneumococcal disease (IPD) after the introduction of pneumococcal conjugate vaccine (PCV7) could inform serotype epidemiology patterns following the introduction of newer conjugate vaccines. METHODS: We used data from statewide IPD surveillance and annual pneumococcal carriage studies in four regions of Alaska to calculate serotype-specific invasiveness ratios (IR; odds ratio of a carried serotype's likelihood to cause invasive disease compared to other serotypes) in children <5 years of age. We describe changes in carriage, disease burden, and invasiveness between two time periods, the pre-PCV7 period (1996-2000) and the late post-PCV7 period (2006-2009). RESULTS: Incidence of IPD decreased from the pre- to post-vaccine period (95.7 vs. 57.2 cases per 100,000 children, P<0.001), with a 99% reduction in PCV7 disease. Carriage prevalence did not change between the two periods (49% vs. 50%), although PCV7 serotype carriage declined by 97%, and non-vaccine serotypes increased in prevalence. Alaska pre-vaccine IRs corresponded to pooled results from eight pre-vaccine comparator studies (Spearman's rho=0.44, P=0.002) and to the Alaska post-vaccine period (Spearman's rho=0.28, P=0.029). Relatively invasive serotypes (IR>1) caused 66% of IPD in both periods, although fewer serotypes with IR>1 remained in the post-vaccine (n=9) than the pre-vaccine period (n=13). CONCLUSIONS: After PCV7 introduction, serotype IRs changed little, and four of the most invasive serotypes were nearly eliminated. If PCV13 use leads to a reduction of carriage and IPD for the 13 vaccine serotypes, the overall IPD rate should further decline. |
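The serotype-specific invasiveness ratio used in the study above is an odds ratio contrasting a serotype's share of invasive disease isolates with its share of carriage isolates; a minimal sketch of that 2x2 calculation (the isolate counts below are hypothetical, not the authors' data):

```python
def invasiveness_ratio(ipd_s: int, ipd_rest: int,
                       carriage_s: int, carriage_rest: int) -> float:
    """Odds that a serotype appears among IPD isolates divided by its odds
    among carriage isolates, each relative to all other serotypes."""
    return (ipd_s / ipd_rest) / (carriage_s / carriage_rest)

# Hypothetical serotype: over-represented in disease relative to carriage
ir = invasiveness_ratio(ipd_s=20, ipd_rest=80, carriage_s=10, carriage_rest=190)
```

A ratio above 1 marks a serotype as relatively invasive (IR>1 in the abstract's terminology); a ratio near 1 means the serotype causes disease roughly in proportion to how often it is carried.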
Intermittent preventive treatment of malaria in pregnancy with mefloquine in HIV-infected women receiving cotrimoxazole prophylaxis: a multicenter randomized placebo-controlled trial
Gonzalez R , Desai M , Macete E , Ouma P , Kakolwa MA , Abdulla S , Aponte JJ , Bulo H , Kabanywanyi AM , Katana A , Maculuve S , Mayor A , Nhacolo A , Otieno K , Pahlavan G , Ruperez M , Sevene E , Slutsker L , Vala A , Williamsom J , Menendez C . PLoS Med 2014 11 (9) e1001735 BACKGROUND: Intermittent preventive treatment in pregnancy (IPTp) with sulfadoxine-pyrimethamine (SP) is recommended for malaria prevention in HIV-negative pregnant women, but it is contraindicated in HIV-infected women taking daily cotrimoxazole prophylaxis (CTXp) because of potential added risk of adverse effects associated with taking two antifolate drugs simultaneously. We studied the safety and efficacy of mefloquine (MQ) in women receiving CTXp and long-lasting insecticide treated nets (LLITNs). METHODS AND FINDINGS: A total of 1,071 HIV-infected women from Kenya, Mozambique, and Tanzania were randomized to receive either three doses of IPTp-MQ (15 mg/kg) or placebo given at least one month apart; all received CTXp and an LLITN. IPTp-MQ was associated with reduced rates of maternal parasitemia (risk ratio [RR], 0.47 [95% CI 0.27-0.82]; p = 0.008), placental malaria (RR, 0.52 [95% CI 0.29-0.90]; p = 0.021), and reduced incidence of non-obstetric hospital admissions (RR, 0.59 [95% CI 0.37-0.95]; p = 0.031) in the intention to treat (ITT) analysis. There were no differences in the prevalence of adverse pregnancy outcomes between groups. Drug tolerability was poorer in the MQ group compared to the control group (29.6% reported dizziness and 23.9% reported vomiting after the first IPTp-MQ administration). HIV viral load at delivery was higher in the MQ group compared to the control group (p = 0.048) in the according-to-protocol (ATP) analysis. The frequency of perinatal mother to child transmission of HIV was increased in women who received MQ (RR, 1.95 [95% CI 1.14-3.33]; p = 0.015). The main limitation of the latter finding relates to the exploratory nature of this part of the analysis. 
CONCLUSIONS: An effective antimalarial added to CTXp and LLITNs in HIV-infected pregnant women can improve malaria prevention, as well as maternal health through reduction in hospital admissions. However, MQ was not well tolerated, limiting its potential for IPTp and indicating the need to find alternatives with better tolerability to reduce malaria in this particularly vulnerable group. MQ was associated with an increased risk of mother to child transmission of HIV, which warrants a better understanding of the pharmacological interactions between antimalarials and antiretroviral drugs. TRIAL REGISTRATION: ClinicalTrials.gov NCT 00811421; Pan African Clinical Trials Registry PACTR 2010020001813440 |
Brief report: are autistic-behaviors in children related to prenatal vitamin use and maternal whole blood folate concentrations?
Braun JM , Froehlich T , Kalkbrenner A , Pfeiffer CM , Fazili Z , Yolton K , Lanphear BP . J Autism Dev Disord 2014 44 (10) 2602-7 Prenatal multivitamin/folic acid supplement use may reduce the risk of autism spectrum disorders. We investigated whether 2nd trimester prenatal vitamin use and maternal whole blood folate (WBF) concentrations were associated with Social Responsiveness Scale (SRS) scores at 4-5 years of age in a prospective cohort of 209 mother-child pairs. After confounder adjustment, children born to women taking prenatal vitamins weekly/daily (n = 179) had lower odds of clinically elevated SRS scores (odds ratio 0.26; 95 % confidence interval 0.08, 0.89) than those who rarely/never took them (n = 30). WBF concentrations were not associated with SRS scores. The lack of association between WBF and autistic-behaviors may be due to the timing of biomarker measures relative to critical periods of brain development, confounding, or other modifying factors. |
Caffeine intake in children in the United States and 10-y trends: 2001-2010
Ahluwalia N , Herrick K , Moshfegh A , Rybak M . Am J Clin Nutr 2014 100 (4) 1124-32 BACKGROUND: Because of increasing concern about the potential adverse effects of caffeine intake in children, recent estimates of caffeine consumption in a representative sample of children are needed. OBJECTIVES: We provide estimates of caffeine intake in children in absolute amounts (mg) and in relation to body weight (mg/kg) to examine the association of caffeine consumption with sociodemographic factors and describe trends in caffeine intake in children in the United States. DESIGN: We analyzed caffeine intake in 3280 children aged 2-19 y who participated in a 24-h dietary recall as part of the NHANES, which is a nationally representative survey of the US population with a cross-sectional design, in 2009-2010. Trends over time between 2001 and 2010 were examined in 2-19-y-old children (n = 18,530). Analyses were conducted for all children and repeated for caffeine consumers. RESULTS: In 2009-2010, 71% of US children consumed caffeine on a given day. Median caffeine intakes for 2-5-, 6-11-, and 12-19-y-olds were 1.3, 4.5, and 13.6 mg, respectively, and 4.7, 9.1, and 40.6 mg, respectively, in caffeine consumers. Non-Hispanic black children had lower caffeine intake than their non-Hispanic white counterparts. Caffeine intake correlated positively with age; this association was independent of body weight. On a given day, 10% of 12-19-y-olds exceeded Health Canada's suggested maximum caffeine intake of 2.5 mg/kg. A significant linear trend of decline in caffeine intake (in mg or mg/kg) was noted overall for children aged 2-19 y during 2001-2010. Specifically, caffeine intake declined by 3.0 and 4.6 mg in 2-5- and 6-11-y-old caffeine consumers, respectively; no change was noted in 12-19-y-olds. CONCLUSION: A majority of US children including preschoolers consumed caffeine. 
Caffeine intake was highest in 12-19-y-olds and remained stable over the 10-y study period in this age group. |
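The Health Canada threshold applied in the study above is a simple weight-normalized dose; a minimal sketch (the intake and body-weight values below are hypothetical):

```python
HEALTH_CANADA_MAX_MG_PER_KG = 2.5  # suggested maximum daily caffeine intake

def exceeds_limit(intake_mg: float, weight_kg: float) -> bool:
    """True if weight-normalized caffeine intake exceeds 2.5 mg/kg."""
    return intake_mg / weight_kg > HEALTH_CANADA_MAX_MG_PER_KG

# Hypothetical adolescent: 120 mg of caffeine on a given day at 45 kg
flag = exceeds_limit(120, 45)
```

Normalizing by body weight is what lets a single threshold be applied across the 2-19-y age range, since absolute intakes in mg rise steeply with age.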
Lung biodurability and free radical production of cellulose nanomaterials
Stefaniak AB , Seehra MS , Fix NR , Leonard SS . Inhal Toxicol 2014 26 (12) 733-49 The potential applications of cellulose nanomaterials in advanced composites and biomedicine make it imperative to understand the implications of pulmonary exposure for human health. Here, we report the results on the biodurability of three cellulose nanocrystal (CNC) samples, two cellulose nanofibril (CNF) samples, and a benchmark cellulose microcrystal (CMC) when exposed to artificial lung airway lining fluid (SUF, pH 7.3) for up to 7 days and alveolar macrophage phagolysosomal fluid (PSF, pH 4.5) for up to 9 months. X-ray diffraction analysis was used to monitor biodurability, while thermogravimetry, surface area, hydrodynamic diameter, zeta potential, and free radical generation capacity of the samples were determined (in vitro cell-free and RAW 264.7 cell line models). The CMC showed no measurable changes in crystallinity (xCR) or crystallite size D in either SUF or PSF. For one CNC, a slight decrease in xCR and D in SUF was observed. In acidic PSF, a slight increase in xCR with exposure time was observed, possibly due to dissolution of the amorphous component. In a cell-free reaction with H2O2, radicals were observed; the CNCs and a CNF generated significantly more hydroxyl (•OH) radicals than the CMC (p < 0.05). The •OH radical production correlates with particle decomposition temperature and is explained by the higher surface area to volume ratio of the CNCs. Based on their biodurability, mechanical clearance would be the primary mechanism for lung clearance of cellulose materials. The production of •OH radicals indicates the need for additional studies to characterize the potential inhalation hazards of cellulose. |
Neurotoxicity following acute inhalation of aerosols generated during resistance spot weld-bonding of carbon steel
Sriram K , Jefferson AM , Lin GX , Afshari A , Zeidler-Erdely PC , Meighan TG , McKinney W , Jackson M , Cumpston A , Cumpston JL , Leonard HD , Frazer DG , Antonini JM . Inhal Toxicol 2014 26 (12) 720-32 Welding generates complex metal aerosols, inhalation of which is linked to adverse health effects among welders. An important health concern of welding fume (WF) exposure is neurological dysfunction akin to Parkinson's disease (PD). Some applications in manufacturing industry employ a variant welding technology known as "weld-bonding" that utilizes resistance spot welding, in combination with adhesives, for metal-to-metal welding. The presence of adhesives raises additional concerns about worker exposure to potentially toxic components like methyl methacrylate, bisphenol A, and volatile organic compounds (VOCs). Here, we investigated the potential neurotoxicological effects of exposure to welding aerosols generated during weld-bonding. Male Sprague-Dawley rats were exposed (25 mg/m(3) targeted concentration; 4 h/day x 13 days) by whole-body inhalation to filtered air or aerosols generated by either weld-bonding with sparking (high metal, low VOCs; HM) or without sparking (low metal; high VOCs; LM). Fumes generated under these conditions exhibited complex aerosols that contained both metal oxide particulates and VOCs. LM aerosols contained a greater fraction of VOCs than HM, which comprised largely metal particulates of ultrafine morphology. Short-term exposure to LM aerosols caused distinct changes in the levels of the neurotransmitters, dopamine (DA) and serotonin (5-HT), in various brain areas examined. LM aerosols also specifically decreased the mRNA expression of the olfactory marker protein (Omp) and tyrosine hydroxylase (Th) in the olfactory bulb. Consistent with the decrease in Th, LM also reduced the expression of dopamine transporter (Slc6a3; Dat), as well as, dopamine D2 receptor (Drd2) in the olfactory bulb. 
In contrast, HM aerosols induced the expression of Th and dopamine D5 receptor (Drd5) mRNAs, elicited neuroinflammation and blood-brain barrier-related changes in the olfactory bulb, but did not alter the expression of Omp. Our findings divulge the differential effects of LM and HM aerosols in the brain and suggest that exposure to weld-bonding aerosols can potentially elicit neurotoxicity following a short-term exposure. However, further investigations are warranted to determine if the aerosols generated by weld-bonding can contribute to persistent long-term neurological deficits and/or neurodegeneration. |
Exploratory breath analyses for assessing toxic dermal exposures of firefighters during suppression of structural burns
Pleil JD , Stiegel MA , Fent KW . J Breath Res 2014 8 (3) 037107 Firefighters wear fireproof clothing and self-contained breathing apparatus (SCBA) during rescue and fire suppression activities to protect against acute effects from heat and toxic chemicals. Fire services are also concerned about long-term health outcomes from chemical exposures over a working lifetime, in particular about low-level exposures that might serve as initiating events for adverse outcome pathways (AOP) leading to cancer. As part of a larger US National Institute for Occupational Safety and Health (NIOSH) study of dermal exposure protection from safety gear used by the City of Chicago firefighters, we collected pre- and post-fire fighting breath samples and analyzed for single-ring and polycyclic aromatic hydrocarbons as bioindicators of occupational exposure to gas-phase toxicants. Under the assumption that SCBA protects completely against inhalation exposures, any changes in the exhaled profile of combustion products were attributed to dermal exposures from gas and particle penetration through the protective clothing. Two separate rounds of firefighting activity were performed, each with 15 firefighters. Exhaled breath samples were collected onto adsorbent tubes and analyzed with gas chromatography-mass spectrometry (GC-MS) with a targeted approach using selective ion monitoring. We found that single ring aromatics and some PAHs were statistically elevated in post-firefighting samples of some individuals, suggesting that fire protective gear may allow for dermal exposures to airborne contaminants. However, in comparison to a previous occupational study of Air Force maintenance personnel where similar compounds were measured, these exposures are much lower, suggesting that firefighters' gear is very effective. 
This study suggests that exhaled breath sampling and analysis for specific targeted compounds is a suitable method for assessing systemic dermal exposure in a simple and non-invasive manner. |
Correlation of respirator fit measured on human subjects and a static advanced headform
Bergman MS , He X , Joseph ME , Zhuang Z , Heimbuch BK , Shaffer RE , Choe M , Wander JD . J Occup Environ Hyg 2014 12 (3) 0 This study assessed the correlation of N95 filtering facepiece respirator (FFR) fit between a Static Advanced Headform (StAH) and 10 human test subjects. Quantitative fit evaluations were performed on test subjects who made three visits to the laboratory. On each visit, one fit evaluation was performed on eight different FFRs of various model/size variations. Additionally, subject breathing patterns were recorded. Each fit evaluation comprised three two-minute exercises: "Normal Breathing," "Deep Breathing," and again "Normal Breathing." The overall test fit factors (FF) for human tests were recorded. The same respirator samples were later mounted on the StAH and the overall test manikin fit factors (MFF) were assessed utilizing the recorded human breathing patterns. Linear regression was performed on the mean log10-transformed FF and MFF values to assess the relationship between the values obtained from humans and the StAH. This is the first study to report a positive correlation of respirator fit between a headform and test subjects. The linear regression by respirator resulted in R2 = 0.95, indicating a strong linear correlation between FF and MFF. For all respirators the geometric mean (GM) FF values were consistently higher than those of the GM MFF. For 50% of respirators, GM FF and GM MFF values were significantly different between humans and the StAH. For data grouped by subject/respirator combinations, the linear regression resulted in R2 = 0.49. A weaker correlation (R2 = 0.11) was found using only data paired by subject/respirator combination where both the test subject and StAH had passed a real-time leak check before performing the fit evaluation. For six respirators, the difference in passing rates between the StAH and humans was < 20%, while the other two showed differences of 29% and 43%. 
For data by test subject, GM FF and GM MFF values were significantly different for 40% of the subjects. Overall, the advanced headform system has potential for assessing fit for some N95 FFR model/sizes. |
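The headform comparison above rests on ordinary least-squares regression of log10-transformed fit factors; a self-contained sketch with hypothetical per-respirator geometric means (not the study's data):

```python
import math

def linreg_r2(x, y):
    """Ordinary least-squares fit of y on x; returns (slope, intercept, R^2)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1 - ss_res / ss_tot

# Hypothetical geometric-mean fit factors per respirator (human FF, headform MFF)
ff  = [200, 150, 120, 90, 60, 45, 30, 20]
mff = [150, 110,  95, 70, 50, 35, 22, 15]
log_ff  = [math.log10(v) for v in ff]
log_mff = [math.log10(v) for v in mff]
slope, intercept, r2 = linreg_r2(log_ff, log_mff)
```

Fit factors are conventionally analyzed on the log scale because they are ratio quantities summarized by geometric means; the regression then asks how well headform values track human values across respirators.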
Water-based interventions for schistosomiasis control
Secor WE . Pathog Glob Health 2014 108 (5) 246-54 Mass drug administration with praziquantel is the mainstay of programs for the control of schistosomiasis morbidity. However, there is a growing recognition that treatment alone will not be sufficient for eventually effecting elimination and that additional measures will be required to interrupt transmission. In the absence of a safe and an effective vaccine for human schistosomiasis, the strategies to reduce infection levels will necessarily involve some interventions that affect the water-related stages of the schistosome life cycle: by reducing exposure to infectious water, by moderating availability of the intermediate snail host, or by decreasing contamination of water with egg-containing excreta. While much research on the importance of water on schistosomiasis has been performed, advances in these areas have perhaps languished with the ready availability of a cost-effective treatment. As some endemic areas near a shift to an elimination goal, a better understanding of water-based interventions that can be used alone or in concert with treatment will be needed. Reinvigoration of laboratory, field, and human behavioral aspects of this research now will ensure that the appropriate strategies are available by the time their implementation becomes necessary. |
Monitoring long-lasting insecticidal net (LLIN) durability to validate net serviceable life assumptions, in Rwanda
Hakizimana E , Cyubahiro B , Rukundo A , Kabayiza A , Mutabazi A , Beach R , Patel R , Tongren JE , Karema C . Malar J 2014 13 344 BACKGROUND: To validate assumptions about the length of the distribution-replacement cycle for long-lasting insecticidal nets (LLINs) in Rwanda, the Malaria and other Parasitic Diseases Division, Rwanda Ministry of Health, used World Health Organization methods to independently confirm the three-year LLIN serviceable life span recommendation of WHO. METHODS: Approximately 3,000 coded LLINs, distributed as part of a national campaign, were monitored in six sites, by means of six-monthly visits to selected houses. Two indicators, survivorship/attrition, a measure of the number of nets remaining, and fabric integrity, the proportion of remaining nets in either 'good', 'serviceable' or 'needs replacement' condition, based on holes in the net material, were tracked. To validate the assumption that the intervention would remain effective for three years, LLIN coverage, calculated using either survivorship, or integrity, by removing nets in the 'needs replacement' category from the survivorship total, was compared with the predicted proportion of nets remaining, derived from a net loss model, that assumes an LLIN serviceable life of three years. RESULTS: After two years, there was close agreement between estimated LLIN survivorship at all sites, 75% (range 64-84%), and the predicted proportion of nets remaining, 75%. However, when integrity was considered, observed survivorship at all sites, declined to 42% (range 10-54%). CONCLUSIONS: More than half, 58%, of the LLINs fell into the 'needs replacement' category after two years. While these nets were counted for survivorship, they were judged to be of little-to-no benefit to a user. 
Therefore, when integrity was taken into account, survivorship was significantly lower than predicted, suggesting that net serviceable life was actually closer to two, rather than three years, and, by extension, that the impact of the intervention during year three of the LLIN distribution-replacement cycle could be well below that seen in years one and two. |
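The integrity adjustment described above, removing 'needs replacement' nets from the survivorship total, is a product of two proportions; a minimal sketch with hypothetical figures (not the study's site data):

```python
def integrity_adjusted_coverage(survivorship: float,
                                frac_needs_replacement: float) -> float:
    """Coverage counting only surviving nets not judged 'needs replacement'."""
    return survivorship * (1 - frac_needs_replacement)

# Hypothetical site: 80% of nets still present, 40% of those too holed to protect
effective = integrity_adjusted_coverage(0.80, 0.40)
```

Tracking this adjusted figure alongside raw survivorship is what reveals the gap the abstract reports: nets can be counted as surviving while providing little-to-no protection.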
The effect of a health communication campaign on compliance with mass drug administration for schistosomiasis control in western Kenya - the SCORE Project
Omedo M , Ogutu M , Awiti A , Musuva R , Muchiri G , Montgomery SP , Secor WE , Mwinzi P . Am J Trop Med Hyg 2014 91 (5) 982-8 Compliance with mass drug administration (MDA) can be affected by rumors and mistrust about the drug. Communication campaigns are an effective way to influence attitudes and health behaviors in diverse public health contexts, but there is very little documentation about experiences using health communications in schistosomiasis control programs. A qualitative study was conducted with community health workers (CHWs) as informants to explore the effect of a health communication campaign on their experiences during subsequent praziquantel MDA for schistosomiasis. Discussions were audio-recorded, transcribed verbatim, translated into English where applicable, and analyzed thematically using ATLAS.ti software. According to the CHWs, exposure to mass media messages improved awareness of the MDA, which in turn, led to better treatment compliance. Our findings suggest that communication campaigns influence health behaviors and create awareness of schistosomiasis control interventions, which may ultimately improve praziquantel MDA. |
Comparative assessment of diverse strategies for malaria vector population control based on measured rates at which mosquitoes utilize targeted resource subsets
Killeen GF , Kiware SS , Seyoum A , Gimnig JE , Corliss GF , Stevenson J , Drakeley CJ , Chitnis N . Malar J 2014 13 338 BACKGROUND: Eliminating malaria requires vector control interventions that dramatically reduce adult mosquito population densities and survival rates. Indoor applications of insecticidal nets and sprays are effective against an important minority of mosquito species that rely heavily upon human blood and habitations for survival. However, complementary approaches are needed to tackle a broader diversity of less human-specialized vectors by killing them at other resource targets. METHODS: Impacts of strategies that target insecticides to humans or animals can be rationalized in terms of biological coverage of blood resources, quantified as proportional coverage of all blood resources mosquito vectors utilize. Here, this concept is adapted to enable impact prediction for diverse vector control strategies based on measurements of utilization rates for any definable, targetable resource subset, even if that overall resource is not quantifiable. RESULTS: The usefulness of this approach is illustrated by deriving utilization rate estimates for various blood, resting site, and sugar resource subsets from existing entomological survey data. Reported impacts of insecticidal nets upon human-feeding vectors, and insecticide-treated livestock upon animal-feeding vectors, are approximately consistent with model predictions based on measured utilization rates for those human and animal blood resource subsets. Utilization rates for artificial sugar baits compare well with blood resources, and are consistent with observed impact when insecticide is added. 
While existing data was used to indirectly measure utilization rates for a variety of resting site subsets, by comparison with measured rates of blood resource utilization in the same settings, current techniques for capturing resting mosquitoes underestimate this quantity, and reliance upon complex models with numerous input parameters may limit the applicability of this approach. CONCLUSIONS: While blood and sugar consumption can be readily quantified using existing methods for detecting natural markers or artificial tracers, improved techniques for labelling mosquitoes, or other arthropod pathogen vectors, will be required to assess vector control measures which target them when they utilize non-nutritional resources such as resting, oviposition, and mating sites. |
Informing the scale-up of Kenya's nursing workforce: a mixed methods study of factors affecting pre-service training capacity and production
Appiagyei AA , Kiriinya RN , Gross JM , Wambua DN , Oywer EO , Kamenju AK , Higgins MK , Riley PL , Rogers MF . Hum Resour Health 2014 12 (1) 47 BACKGROUND: Given the global nursing shortage and investments to scale-up the workforce, this study evaluated trends in annual student nurse enrolment, pre-service attrition between enrolment and registration, and factors that influence nurse production in Kenya. METHODS: This study used a mixed methods approach with data from the Regulatory Human Resources Information System (tracks initial student enrolment through registration) and the Kenya Health Workforce Information System (tracks deployment and demographic information on licensed nurses) for the quantitative analyses and qualitative data from key informant interviews with nurse training institution educators and/or administrators. Trends in annual student nurse enrolment from 1999 to 2010 were analyzed using regulatory and demographic data. To assess pre-service attrition between training enrolment and registration with the nursing council, data for a cohort that enrolled in training from 1999 to 2004 and completed training by 2010 were analyzed. Multivariate logistic regression was used to test for factors that significantly affected attrition. To assess the capacity of nurse training institutions for scale-up, qualitative data were obtained through key informant interviews. RESULTS: From 1999 to 2010, 23,350 students enrolled in nurse training in Kenya. While annual new student enrolment doubled between 1999 (1,493) and 2010 (3,030), training institutions reported challenges in their capacity to accommodate the increased numbers. Key factors identified by the nursing faculty included congestion at clinical placement sites, limited clinical mentorship by qualified nurses, challenges with faculty recruitment and retention, and inadequate student housing, transportation and classroom space. 
Pre-service attrition among the cohort that enrolled between 1999 and 2004 and completed training by 2010 was found to be low (6%). CONCLUSION: To scale-up the nursing workforce in Kenya, concurrent investments in expanding the number of student nurse clinical placement sites, utilizing alternate forms of skills training, hiring more faculty and clinical instructors, and expanding the dormitory and classroom space to accommodate new students are needed to ensure that increases in student enrolment are not at the cost of quality nursing education. Student attrition does not appear to be a concern in Kenya relative to rates reported in other African countries (10% to 40%). |
Association of employee attributes and exceptional performance rating at a National Center of the US Centers for Disease Control and Prevention, 2011
Roberts H , Myles RL , Truman BI , Dean HD . J Public Health Manag Pract 2014 21 (4) E10-7 CONTEXT: Employee performance evaluation motivates and rewards exceptional individual performance that advances the achievement of organizational goals. The Centers for Disease Control and Prevention (CDC) and its operating units evaluate employee performance annually and reward exceptional performance with a cash award or quality step increase in pay. A summary performance rating (SPR) of "exceptional" indicated personal achievements in 2011 that were beyond expectations described in the employee's performance plan. OBJECTIVE: To determine whether personal attributes and job setting of civil service employees were associated with an exceptional SPR in the National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention (NCHHSTP) in 2011. DESIGN: Data from the CDC 2011 performance management database collected in 2012 were analyzed in 2013 to identify the SPR, personal attributes, and job settings of full-time civil service employees. Multivariate logistic regression controlled for confounding, and stratified analysis detected effect modifiers of the association between receiving an exceptional SPR in 2011 and gender, race/ethnicity, education, job location, job series, grade level, years in grade, years of federal service, supervisory role, and NCHHSTP division. RESULTS: Among the 1037 employees, an exceptional SPR was independently associated with: female gender (adjusted odds ratio: 1.7 [1.3, 2.3]), advanced degrees (doctorate: 1.7 [1.1, 2.5]; master's: [1.1, 2.0]), headquarters location (2.8 [1.9, 4.1]), higher pay grade (3.3 [2.4, 4.5]) and years in grade (0-1 years: 1.7 [1.3, 2.4]; 2-4 years: 1.5 [1.1, 2.0]), division level (Division A: 5.0 [2.5, 9.9]; Division B: 5.5 [3.5, 8.8]), and supervisory status (at a lower-pay grade) (odds ratio: 3.7 [1.1, 11.3]). 
CONCLUSIONS: Exceptional SPR is independently associated with personal employee attributes and job settings that are not modifiable by interventions designed to improve employee performance based on accomplishments. |
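The adjusted odds ratios above come from multivariate logistic regression; for intuition, the unadjusted version of the same measure can be computed directly from a 2x2 table. A minimal sketch with hypothetical counts (not data from the study):

```python
def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """Unadjusted (crude) odds ratio from a 2x2 table."""
    return (exposed_cases / exposed_noncases) / (unexposed_cases / unexposed_noncases)

# Hypothetical example: 20 of 100 exposed employees rated exceptional,
# versus 10 of 100 unexposed.
print(odds_ratio(20, 80, 10, 90))  # 2.25
```

An adjusted odds ratio from a multivariate model will generally differ from this crude value whenever the covariates are associated with both the attribute and the rating.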
The use of biomarkers of semen exposure in sexual and reproductive health studies
Snead MC , Black CM , Kourtis AP . J Womens Health (Larchmt) 2014 23 (10) 787-91 Biomarkers of semen exposure have been used in studies investigating the safety and efficacy of barrier methods of contraception. They have been used as objective indicators of semen exposure when studying sexual behaviors and in human immunodeficiency virus/sexually transmitted infection research interventions where participants are advised to avoid unprotected sex. Semen biomarkers have also been used to assess or validate self-reported sexual behaviors or condom use in reproductive health settings. Prostate-specific antigen (PSA) and Y chromosome DNA (Yc-DNA) have each been evaluated in the past as semen biomarkers and are the most widely used in the field. While both are considered reliable for evaluating exposure to semen, each has unique characteristics. In this report, we summarize the literature and provide some considerations for reproductive health researchers who are interested in using PSA or Yc-DNA as semen biomarkers. We also synthesize our previous published work on the optimal conditions of collecting and storing specimens and assay performance in the presence of other vaginal products that may influence various assays. Semen biomarkers are innovative and promising tools to further study and better understand women's reproductive and sexual health and behavior. More research is needed to better understand the strengths, limitations, and optimal performance conditions of specific assays in vivo. |
Psychological and physical functioning difficulties associated with complex activity limitations among U.S. adults
Loeb M , Jonas BS . Disabil Health J 2014 8 (1) 70-9 BACKGROUND: There is limited research that assesses psychological functioning categorically as a predictor of complex activity limitations either alone or in conjunction with physical functioning. OBJECTIVES: This paper assesses the impact of psychological and/or physical functioning difficulties as predictors of complex activity limitations among U.S. adults, using data from a national survey. METHODS: Data come from the 2006-2010 National Health Interview Survey among U.S. adults 18 or older (n = 124,337). We developed a combined physical/psychological exposure variable with six categories: 1) no/low psychological distress (LPD) and absence of physical functioning difficulties, 2) moderate psychological distress (MPD) only, 3) serious psychological distress (SPD) only, 4) physical functioning difficulty only, 5) MPD and physical functioning difficulties, and 6) SPD and physical functioning difficulties. Selected complex activity limitations include daily living, social and work limitations. RESULTS: Compared to adults with LPD and absence of physical functioning difficulties, the results demonstrated a clear and significant gradient of increasing risk of complex activity limitations beginning with MPD only, SPD only, physical functioning difficulty only, both MPD and physical functioning difficulties, and SPD and physical functioning difficulties. CONCLUSIONS: The data suggest a stronger risk of complex activity limitations when increasing psychological functioning difficulties coexist with physical functioning difficulties, leading to potential interference with a person's ability to accomplish major life activities measured in this study. The sizeable contribution of psychological distress to the prevalence of basic actions difficulty implies that the mental health component of functional limitations is important in the overall assessment of health and well-being. |
Methods and results for small area estimation using smoking data from the 2008 National Health Interview Survey
Ha NS , Lahiri P , Parsons V . Stat Med 2014 33 (22) 3932-45 The National Health Interview Survey, conducted by the National Center for Health Statistics, is designed to provide reliable design-based estimates for a wide range of health-related variables at the national level and for four major geographical regions of the USA. However, state-level or substate-level estimates are likely to be unreliable because they are based on small sample sizes. In this paper, we compare the efficiency of different area-level models in estimating smoking prevalence for the 50 US states and the District of Columbia. Our study is based on survey data from the 2008 National Health Interview Survey in conjunction with a number of potentially related auxiliary variables obtained from the American Community Survey, an ongoing large complex survey conducted by the US Census Bureau. A major portion of this study is devoted to the investigation of several methods for estimating survey sampling variances needed to implement an area-level hierarchical model. Based on our findings, a hierarchical Bayesian method that uses a survey-adjusted random sampling variance model to capture the complex survey sampling variability appears to be somewhat superior to the other considered area-level models in accounting for small sample behavior of estimated survey sampling variances. Several diagnostic procedures are presented to compare the proposed methods. |
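Area-level models of this kind typically combine a direct survey estimate with a model-based (synthetic) estimate built from auxiliary variables, weighting each by its relative precision. A minimal sketch of that composite (Fay-Herriot-style) form — an illustration of the general idea, not the authors' specific hierarchical Bayesian model, with all inputs hypothetical:

```python
def composite_estimate(direct, sampling_var, synthetic, model_var):
    # Shrinkage weight: trust the direct survey estimate more when its
    # sampling variance is small relative to the model (between-area) variance.
    gamma = model_var / (model_var + sampling_var)
    return gamma * direct + (1 - gamma) * synthetic
```

When the sampling variance is zero the composite equals the direct estimate; when the model variance is zero it collapses to the synthetic estimate. This is why good estimates of the survey sampling variances — the focus of much of the paper — drive how far each small-area estimate is shrunk.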
Trends in awareness and use of electronic cigarettes among U.S. adults, 2010-2013
King BA , Patel R , Nguyen K , Dube SR . Nicotine Tob Res 2014 17 (2) 219-27 INTRODUCTION: Electronic cigarette (e-cigarette) marketing has increased considerably since the product entered the U.S. market in 2007, thereby warranting additional surveillance to monitor recent trends in population-level awareness and utilization. We assessed the prevalence, characteristics, and trends in e-cigarette awareness and use among nationally representative samples of U.S. adults during 2010-2013. METHODS: Data came from the 2010-2013 HealthStyles survey, an annual consumer-based web survey of U.S. adults aged ≥18 years. Sample size ranged from 2,505 (2010) to 4,170 (2012). Descriptive statistics were used to assess e-cigarette awareness, ever use, and current use (use within the past 30 days), overall and by sex, age, race/ethnicity, education, income, U.S. region, and cigarette smoking status. Trends were assessed using logistic regression. RESULTS: During 2010-2013, increases (p<0.05) were observed for e-cigarette awareness (40.9% to 79.7%), ever use (3.3% to 8.5%), and current use (1.0% to 2.6%). Awareness increased among all sociodemographic subpopulations during 2010-2013 (p<0.05); an increase in ever use of e-cigarettes occurred among all sociodemographic groups except those aged 18-24 years, Hispanics and non-Hispanic 'other' adults, and those living in the Midwest (p<0.05). During 2010-2013, ever use increased among current (9.8% to 36.5%) and former (2.5% to 9.6%) cigarette smokers (p<0.05), but remained unchanged among never smokers (1.3% to 1.2%). CONCLUSIONS: Awareness and use of e-cigarettes increased considerably among U.S. adults during 2010-2013. In 2013, over one-third of current cigarette smokers reported having ever used e-cigarettes. Given the uncertain public health impact of e-cigarettes, continued surveillance of emerging utilization patterns is critical for public health planning. |
Menthol cigarette smoking among lesbian, gay, bisexual, and transgender adults
Fallin A , Goodin AJ , King BA . Am J Prev Med 2014 48 (1) 93-7 BACKGROUND: Menthol can mask the harshness and taste of tobacco, making menthol cigarettes easier to use and increasing their appeal among vulnerable populations. The tobacco industry has targeted youth, women, and racial minorities with menthol cigarettes, and these groups smoke menthol cigarettes at higher rates. The tobacco industry has also targeted the lesbian, gay, bisexual, and transgender (LGBT) communities with tobacco product marketing. PURPOSE: To assess current menthol cigarette smoking by sexual orientation among a nationally representative sample of U.S. adults. METHODS: Data were obtained from the 2009-2010 National Adult Tobacco Survey, a national landline and cellular telephone survey of non-institutionalized U.S. adults aged ≥18 years, to compare current menthol cigarette smoking between LGBT (n=2,431) and heterosexual/straight (n=110,841) adults. Data were analyzed during January-April 2014 using descriptive statistics and logistic regression adjusted for sex, age, race, and educational attainment. RESULTS: Among all current cigarette smokers, 29.6% reported usually smoking menthol cigarettes in the past 30 days. Menthol use was significantly higher among LGBT smokers, with 36.3% reporting that the cigarettes they usually smoked were menthol compared to 29.3% of heterosexual/straight smokers (p<0.05); this difference was particularly prominent among LGBT females (42.9%) compared to heterosexual/straight women (32.4%) (p<0.05). Following adjustment, LGBT smokers had greater odds of usually smoking menthol cigarettes than heterosexual/straight smokers (odds ratio=1.31, 95% confidence interval=1.09, 1.57). CONCLUSIONS: These findings suggest that efforts to reduce menthol cigarette use may have the potential to reduce tobacco use and tobacco-related disease and death among LGBT adults. |
National and state cost savings associated with prohibiting smoking in subsidized and public housing in the United States
King BA , Peck RM , Babb SD . Prev Chronic Dis 2014 11 E171 INTRODUCTION: Despite progress in implementing smoke-free laws in indoor public places and workplaces, millions of Americans remain exposed to secondhand smoke at home. The nation's 80 million multiunit housing residents, including the nearly 7 million who live in subsidized or public housing, are especially susceptible to secondhand smoke infiltration between units. METHODS: We calculated national and state costs that could have been averted in 2012 if smoking were prohibited in all US subsidized housing, including public housing: 1) secondhand smoke-related direct health care, 2) renovation of smoking-permitted units, and 3) smoking-attributable fires. Annual cost savings were calculated by using residency estimates from the Department of Housing and Urban Development and cost data reported elsewhere. Data were adjusted for inflation and variations in state costs. National and state estimates (excluding Alaska and the District of Columbia) were calculated by cost type. RESULTS: Prohibiting smoking in subsidized housing would yield annual cost savings of $496.82 million (range, $258.96-$843.50 million), including $310.48 million ($154.14-$552.34 million) in secondhand smoke-related health care, $133.77 million ($75.24-$209.01 million) in renovation expenses, and $52.57 million ($29.57-$82.15 million) in smoking-attributable fire losses. By state, cost savings ranged from $0.58 million ($0.31-$0.94 million) in Wyoming to $124.68 million ($63.45-$216.71 million) in New York. Prohibiting smoking in public housing alone would yield cost savings of $152.91 million ($79.81-$259.28 million); by state, total cost savings ranged from $0.13 million ($0.07-$0.22 million) in Wyoming to $57.77 million ($29.41-$100.36 million) in New York. CONCLUSION: Prohibiting smoking in all US subsidized housing, including public housing, would protect health and could generate substantial societal cost savings. |
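The national total reported above is the sum of the three cost components; a minimal sketch checking that arithmetic (figures in millions of US dollars, taken directly from the reported results):

```python
# Reported annual savings components for all US subsidized housing, $ millions
components = {
    "secondhand smoke-related health care": 310.48,
    "renovation of smoking-permitted units": 133.77,
    "smoking-attributable fire losses": 52.57,
}
total = round(sum(components.values()), 2)
print(total)  # 496.82, matching the reported $496.82 million total
```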
Prevalence and correlates of switching to another tobacco product to quit smoking cigarettes
Schauer GL , Malarcher AM , Babb SD . Nicotine Tob Res 2014 17 (5) 622-7 INTRODUCTION: Using nationally representative data, we assessed the prevalence and correlates of cigarette smokers who tried switching to smokeless tobacco (SLT) or to other combusted tobacco (OCT) products to quit. METHODS: Data came from 12,400 current or former adult smokers who made a quit attempt in the past year and responded to the 2010-2011 Tobacco Use Supplement to the Current Population Survey. Demographics and smoking characteristics were computed among those switching to SLT, switching to OCT, or trying to quit without using either strategy. Bivariate and multinomial logistic regression models identified correlates of using each strategy. RESULTS: Overall, 3.1% of smokers tried switching to SLT to quit, 2.2% tried switching to OCT, and 0.6% tried both strategies. Compared to those not using either switching strategy to try to quit, males were more likely than females to try switching to SLT or OCT; Blacks were less likely than Whites to try switching to SLT, but more likely to try switching to OCT; younger age groups were more likely to try switching to SLT or OCT; current someday smokers were more likely to have tried switching to SLT (vs. everyday smokers), while recent former smokers were more likely to have tried switching to OCT. Both switching groups were more likely to have used cessation medication versus those not using switching strategies. CONCLUSION: Data suggest that switching to other tobacco products is a prevalent cessation approach; messages are needed to help clinicians encourage smokers who try to quit by switching to use evidence-based cessation approaches. |
The impact of smoking on women's health
McAfee T , Burnette D . J Womens Health (Larchmt) 2014 23 (11) 881-5 Despite half a century of public health efforts, smoking remains the single largest cause of preventable disease and death in the United States, killing 480,000 people a year and inflicting chronic disease on 16 million. Since the early part of the 20th century, tobacco companies' success in aggressively marketing their products to women has resulted in steady increases in smoking-related disease risk for women. Today, women smokers have caught up with their male counterparts and are just as likely to die from lung cancer, heart disease, and chronic obstructive pulmonary disease (COPD) as are men who smoke. Women's risk for developing smoking-related heart disease or dying from COPD now exceeds men's risk. |
Increases in heroin overdose deaths - 28 states, 2010 to 2012
Rudd RA , Paulozzi LJ , Bauer MJ , Burleson RW , Carlson RE , Dao D , Davis JW , Dudek J , Eichler BA , Fernandes JC , Fondario A , Gabella B , Hume B , Huntamer T , Kariisa M , Largo TW , Miles J , Newmyer A , Nitcheva D , Perez BE , Proescholdbell SK , Sabel JC , Skiba J , Slavova S , Stone K , Tharp JM , Wendling T , Wright D , Zehner AM . MMWR Morb Mortal Wkly Rep 2014 63 (39) 849-854 Nationally, death rates from prescription opioid pain reliever (OPR) overdoses quadrupled during 1999-2010, whereas rates from heroin overdoses increased by <50%. Individual states and cities have reported substantial increases in deaths from heroin overdose since 2010. CDC analyzed recent mortality data from 28 states to determine the scope of the heroin overdose death increase and to determine whether increases were associated with changes in OPR overdose death rates since 2010. This report summarizes the results of that analysis, which found that, from 2010 to 2012, the death rate from heroin overdose for the 28 states increased from 1.0 to 2.1 per 100,000, whereas the death rate from OPR overdose declined from 6.0 per 100,000 in 2010 to 5.6 per 100,000 in 2012. Heroin overdose death rates increased significantly for both sexes, all age groups, all census regions, and all racial/ethnic groups other than American Indians/Alaska Natives. OPR overdose mortality declined significantly among males, persons aged <45 years, persons in the South, and non-Hispanic whites. Five states had increases in the OPR death rate, seven states had decreases, and 16 states had no change. Of the 18 states with statistically reliable heroin overdose death rates (i.e., rates based on at least 20 deaths), 15 states reported increases. Decreases in OPR death rates were not associated with increases in heroin death rates. 
The findings indicate a need for intensified prevention efforts aimed at reducing overdose deaths from all types of opioids while recognizing the demographic differences between the heroin and OPR-using populations. Efforts to prevent expansion of the number of OPR users who might use heroin when it is available should continue. |
Cigarette smoking trends among U.S. working adults by industry and occupation: findings from the 2004-2012 National Health Interview Survey
Syamlal G , Mazurek JM , Hendricks SA , Ahmed J . Nicotine Tob Res 2014 17 (5) 599-606 OBJECTIVE: To examine trends in age-adjusted cigarette smoking prevalence among working adults by industry and occupation during 2004-2012, and to project those prevalences and compare them to the 2020 Healthy People objective (TU-1) to reduce cigarette smoking prevalence to ≤12%. METHODS: We analyzed the 2004-2012 National Health Interview Survey data. Respondents were adults aged ≥18 years who had worked in the week prior to the interview. Temporal changes in cigarette smoking prevalence were assessed using logistic regression. We used the regression model to extrapolate to the period 2013-2020. RESULTS: Overall, an estimated 19.0% of working adults smoked cigarettes: 22.4% in 2004 to 18.1% in 2012. The largest declines were among workers in the education services (6.5%) industry and in the life, physical, and social science (9.7%) occupations. The smallest declines were among workers in the real estate and rental and leasing (0.9%) industry and the legal (0.4%) occupations. The 2020 projected smoking prevalences in 15 of 21 industry groups and 13 of the 23 occupation groups were greater than the 2020 Healthy People goal. CONCLUSIONS: During 2004-2012, smoking prevalence declined in the majority of industry and occupation groups. The decline rate varied by industry and occupation groups. Projections suggest that certain groups may not reach the 2020 Healthy People goal. Consequently, smoking cessation, prevention and intervention efforts may need to be revised and strengthened, particularly in specific occupational groups. |
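One simple way to project a prevalence trend — an illustration of the extrapolation idea, not the authors' exact model — is to fit the log-odds of smoking linearly in calendar year by least squares and extrapolate to a target year. A minimal sketch with hypothetical inputs:

```python
import math

def project_prevalence(years, prevalences, target_year):
    """Fit log-odds of prevalence linearly in year; extrapolate to target_year."""
    logits = [math.log(p / (1 - p)) for p in prevalences]
    n = len(years)
    xbar = sum(years) / n
    ybar = sum(logits) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(years, logits)) / \
            sum((x - xbar) ** 2 for x in years)
    intercept = ybar - slope * xbar
    z = intercept + slope * target_year
    return 1 / (1 + math.exp(-z))  # back-transform to a proportion

# Illustrative inputs echoing the overall figures in the abstract
print(project_prevalence([2004, 2012], [0.224, 0.181], 2020))
```

A declining trend on the logit scale projects a 2020 prevalence below the 2012 value; comparing such projections to the ≤12% Healthy People target is the comparison the paper makes group by group.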
Encephalitozoonosis in 2 South American fur seal (Arctocephalus australis) pups
Seguel M , Howerth EW , Ritter J , Paredes E , Colegrove K , Gottdenker N . Vet Pathol 2014 52 (4) 720-3 Cerebral and disseminated encephalitozoonosis was diagnosed by histopathology, electron microscopy, and immunohistochemistry in 2 free-ranging South American fur seal pups found dead at Guafo Island (43°33′S, 74°49′W) in southern Chile. In the brain, lesions were characterized by random foci of necrosis with large numbers of macrophages containing numerous microsporidial organisms within parasitophorous vacuoles. In addition, occasional histiocytes loaded with numerous mature and immature microsporidia spores consistent with Encephalitozoon sp. were observed in pulmonary alveolar septa, splenic red pulp, glomerular capillaries, and proximal renal tubules by Gram and immunohistochemical stains. To our knowledge, microsporidial infection in a marine mammal species has not been previously reported. |
Feasibility of creating a national ALS registry using administrative data in the United States
Kaye WE , Sanchez M , Wu J . Amyotroph Lateral Scler Frontotemporal Degener 2014 15 433-9 Uncertainty about the incidence and prevalence of amyotrophic lateral sclerosis (ALS), as well as the role of the environment in the etiology of ALS, supports the need for a surveillance system/registry for this disease. Our aim was to evaluate the feasibility of using existing administrative data to identify cases of ALS. The Agency for Toxic Substances and Disease Registry (ATSDR) funded four pilot projects at tertiary care facilities for ALS, HMOs, and state-based organizations. Data from Medicare, Medicaid, the Veterans Health Administration, and Veterans Benefits Administration were matched to data available from site-specific administrative and clinical databases for a five-year period (1 January 2001 to 31 December 2005). Review of information in the medical records by a neurologist was considered the gold standard for determining an ALS case. We developed an algorithm using variables from the administrative data that identified true cases of ALS (verified by a neurologist). Individuals could be categorized into ALS, possible ALS, and not ALS. The best algorithm had sensitivity of 87% and specificity of 85%. We concluded that administrative data can be used to develop a surveillance system/registry for ALS. These methods can be explored for creating surveillance systems for other neurodegenerative diseases. |
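The sensitivity and specificity reported above follow their standard definitions against the neurologist gold standard. A minimal sketch with hypothetical confusion-matrix counts (chosen only to reproduce 87%/85% for illustration):

```python
def sensitivity(true_pos, false_neg):
    # proportion of gold-standard (neurologist-verified) cases the algorithm flags
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    # proportion of non-cases the algorithm correctly rules out
    return true_neg / (true_neg + false_pos)

# Hypothetical counts: 100 verified cases, 100 verified non-cases
print(sensitivity(87, 13), specificity(85, 15))
```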
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Community Health Services
- Disease Reservoirs and Vectors
- Environmental Health
- Food Safety
- Genetics and Genomics
- Healthcare Associated Infections
- Immunity and Immunization
- Informatics
- Laboratory Sciences
- Maternal and Child Health
- Nutritional Sciences
- Occupational Safety and Health
- Parasitic Diseases
- Public Health Leadership and Management
- Reproductive Health
- Social and Behavioral Sciences
- Statistics as Topic
- Substance Use and Abuse
- Veterinary Medicine
- Vital Statistics
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed: Feb 1, 2024
- Page last updated: Apr 29, 2024