Viral hepatitis C gets personal--the value of human genomics to public health.
Zhang L , Gwinn M , Hu DJ . Public Health Genomics 2013 16 (4) 192-7 About 180 million people worldwide are chronically infected with hepatitis C virus (HCV), with 3-4 million newly infected each year. Only 15-25% of acute HCV infections clear spontaneously, and the remainder persists as chronic HCV infection. More than 350,000 people die every year from hepatitis C-related liver failure and cancer. There is currently no vaccine, and the standard-of-care therapies - peg-interferon alpha (pegIFN) plus ribavirin (RBV) - are expensive and have serious side effects. Moreover, they may be effective in only 40-50% of patients infected with HCV genotype 1, the most common HCV genotype in the US. Interleukin 28B (IL28B) genotype was recently and convincingly associated with response to pegIFN and RBV therapy. It has emerged as a robust pretreatment predictor of sustained virological response (SVR, i.e. virologic clearance) to pegIFN and RBV, as well as to new triple-therapy regimens that combine a direct-acting antiviral agent with pegIFN and RBV and increase SVR rates to as much as 75% in patients infected with HCV genotype 1. Testing for IL28B genotype may contribute to clinical decision-making and could inform clinical guidelines and public health policies.
Recent economic evaluations of interventions to prevent cardiovascular disease by reducing sodium intake
Wang G , Bowman BA . Curr Atheroscler Rep 2013 15 (9) 349 Excess intake of sodium, a common problem worldwide, is associated with hypertension and cardiovascular disease (CVD), and hypertension is a major risk factor for CVD. Population-wide efforts to reduce sodium intake have been identified as a promising strategy for preventing hypertension and CVD, and such initiatives are currently recommended by a variety of scientific and public health organizations. By reviewing the literature published from January 2011 to March 2013, we summarized recent economic analyses of interventions to reduce sodium intake. The evidence, derived from estimates of resultant blood pressure decreases and thus decreases in the incidence of CVD events, supports population-wide interventions for reducing sodium intake. Both lowering the salt content in manufactured foods and conducting mass media campaigns at the national level are estimated to be cost-effective in preventing CVD. Although better data on the cost of interventions are needed for rigorous economic evaluations, population-wide sodium intake reduction can be a promising approach for containing the growing health and economic burden associated with hypertension and its sequelae.
Self-reported sleep duration and weight-control strategies among US high school students
Wheaton AG , Perry GS , Chapman DP , Croft JB . Sleep 2013 36 (8) 1139-45 STUDY OBJECTIVE: To determine if self-reported sleep duration was associated with weight-control behaviors among US high school students. DESIGN: National Youth Risk Behavior Survey. SETTING: United States, 2007. PARTICIPANTS: US high school students (N = 12,087). MEASUREMENTS: Students were asked if they had engaged in several weight-control behaviors during the 30 days before the survey to lose or maintain weight. Self-reported sleep duration categories included very short (≤ 5 h), short (6 or 7 h), referent moderate (8 or 9 h), and long (≥ 10 h). Sex-specific logistic regression analyses with race/ethnicity, grade, and body mass index category as covariates were conducted using SUDAAN to account for the complex study design. RESULTS: Approximately half the students reported short sleep duration (51.8% of males and 54.3% of females), and very short sleep durations were reported by another 14.8% of males and 16.9% of females. Among males, very short sleepers were significantly (P < 0.05) more likely than moderate sleepers to report dieting (36.3% versus 26.1%), fasting (14.2% versus 4.3%), and purging (4.3% versus 1.1%) to lose or maintain weight during the 30 days before the survey. Among females, very short, short, and moderate sleepers differed (P < 0.05) in prevalence of dieting (59.9%, 55.0%, and 47.5%, respectively), fasting (28.3%, 15.2%, and 10.3%, respectively), and taking diet pills (13.3%, 6.8%, and 4.3%, respectively). Prevalence of purging was significantly higher only for very short sleepers (12.3%, 6.0%, and 3.9%, respectively). CONCLUSION: Self-reported short sleep duration was associated with dieting and three unhealthy weight-control behaviors in this population. If these findings are confirmed, intervention studies should be conducted to examine the effects of educational interventions.
Serum fatty acids and incidence of ischemic stroke among postmenopausal women
Yaemsiri S , Sen S , Tinker LF , Robinson WR , Evans RW , Rosamond W , Wassertheil-Smoller S , He K . Stroke 2013 44 (10) 2710-7 BACKGROUND AND PURPOSE: Although studies have linked types of fatty acids with coronary heart disease, data on individual fatty acids and risk of ischemic stroke are limited. We aimed to examine the associations between serum fatty acid concentrations and incidence of ischemic stroke and its subtypes. METHODS: We conducted a prospective case-control study nested in the Women's Health Initiative Observational Study cohort of postmenopausal US women aged 50 to 79 years. Between 1993 and 2003, incident cases of ischemic stroke were matched 1:1 to controls on age, race, and length of follow-up (964 matched pairs). Conditional logistic regression was used to estimate odds ratios and 99.9% confidence intervals (CI) for ischemic stroke and its subtypes. RESULTS: The multivariable-adjusted odds ratios and 99.9% CI of ischemic stroke associated with a 1-SD increment in serum fatty acid concentration were 1.38 (99.9% CI, 1.05-1.83) for linoelaidic acid (18:2tt, SD=0.04%), 1.27 (99.9% CI, 1.06-1.51) for palmitic acid (16:0, SD=2.74%), 1.20 (99.9% CI, 1.01-1.43) for oleic acid (18:1n9, SD=2.32%), 0.72 (99.9% CI, 0.59-0.87) for docosapentaenoic acid (22:5n3, SD=0.18%), 0.72 (99.9% CI, 0.59-0.87) for docosahexaenoic acid (22:6n3, SD=0.91%), and 0.81 (99.9% CI, 0.67-0.98) for arachidonic acid (20:4n6, SD=2.02%). These associations were generally consistent for atherothrombotic and lacunar stroke but not cardioembolic stroke. CONCLUSIONS: These findings suggest that individual serum trans, saturated, and monounsaturated fatty acids are positively associated with particular ischemic stroke subtypes, whereas individual n3 and n6 polyunsaturated fatty acids are inversely associated.
Transfusion complications in thalassemia patients: a report from the Centers for Disease Control and Prevention
Vichinsky E , Neumayr L , Trimble S , Giardina PJ , Cohen AR , Coates T , Boudreaux J , Neufeld EJ , Kenney K , Grant A , Thompson AA . Transfusion 2013 54 (4) 972-81; quiz 971 BACKGROUND: Transfusions are the primary therapy for thalassemia but have significant cumulative risks. In 2004, the Centers for Disease Control and Prevention (CDC) established a national blood safety monitoring program for thalassemia. This report summarizes the population and their previous nonimmune and immune transfusion complications. STUDY DESIGN AND METHODS: The CDC Thalassemia Blood Safety Network is a consortium of centers longitudinally following patients. Enrollment occurred from 2004 through 2012. Demographics, transfusion history, infectious exposures, and transfusion and nontransfusion complications were summarized. Logistic regression analyses of factors associated with allo- and autoimmunization were employed. RESULTS: The race/ethnicity of these 407 thalassemia patients was predominantly Asian or Caucasian. The mean ± SD age was 22.3 ± 13.2 years, and patients had received a mean ± SD of 149 ± 103.4 units of red blood cells (RBCs). Multiorgan dysfunction was common despite chelation. Twenty-four percent of transfused patients had previous exposure to possible transfusion-associated pathogens, including one case of Babesia. As 27% were immigrants, the infection source cannot be unequivocally linked to transfusion. Transfusion reactions occurred in 48% of patients, including allergic, febrile, and hemolytic reactions; 19% were alloimmunized. Common antigens were E, Kell, and C. Years of transfusion was the strongest predictor of alloimmunization. Autoantibodies occurred in 6.5% and were associated with alloimmunization (p < 0.0001). Local institutional policies, not patient characteristics, were the major determinants of blood preparation and transfusion practices. CONCLUSION: Hemosiderosis, transfusion reactions, and infections continue to be major problems in thalassemia. New pathogens were noted. National guidelines for RBC phenotyping and preparation are needed to decrease transfusion-related morbidity.
Link between cardiovascular disease and spinal cord injury: new evidence and update
Kuklina EV , Hagen EM . Neurology 2013 81 (8) 700-1 According to the most recent report by the National Spinal Cord Injury Statistical Center, hypertensive disorders and the resulting ischemic heart disease constitute the third leading cause of mortality in patients with spinal cord injuries (SCI). However, the risk factors and mechanisms underlying development of cardiovascular disease (CVD) in these patients are not fully understood. Elevated vascular and inflammatory markers increase cardiovascular risk, and abnormal cardiovascular control is related to the level and severity of injury to descending autonomic (sympathetic) pathways. The results of a systematic review covering studies published in English from 1990 to 2007 indicate that the quality of evidence regarding SCI status as an independent predictor of cardiovascular morbidity and mortality was suboptimal. The limited number of studies that investigated a link between CVD and SCI had small sample sizes, lacked appropriate control groups or adjustment for key confounders, and varied widely in reported outcomes.
Prevalence of nonalcoholic fatty liver disease in the United States: the Third National Health and Nutrition Examination Survey, 1988-1994
Lazo M , Hernaez R , Eberhardt MS , Bonekamp S , Kamel I , Guallar E , Koteish A , Brancati FL , Clark JM . Am J Epidemiol 2013 178 (1) 38-45 Previous estimates of the prevalence of nonalcoholic fatty liver disease (NAFLD) in the US population relied on measures of liver enzymes, potentially underestimating the burden of this disease. We used ultrasonography data from 12,454 adults who participated in the Third National Health and Nutrition Examination Survey, conducted in the United States from 1988 to 1994. We defined NAFLD as the presence of hepatic steatosis on ultrasonography in the absence of elevated alcohol consumption. In the US population, the prevalences of hepatic steatosis and NAFLD were 21.4% and 19.0%, respectively, corresponding to estimates of 32.5 (95% confidence interval: 29.9, 35.0) million adults with hepatic steatosis and 28.8 (95% confidence interval: 26.6, 31.2) million adults with NAFLD nationwide. After adjustment for age, income, education, body mass index (weight (kg)/height (m)²), and diabetes status, NAFLD was more common in Mexican Americans (24.1%) than in non-Hispanic whites (17.8%) and non-Hispanic blacks (13.5%) (P = 0.001) and in men (20.2%) than in women (15.8%) (P < 0.001). Hepatic steatosis and NAFLD were also independently associated with diabetes, with insulin resistance among people without diabetes, with dyslipidemia, and with obesity. Our results extend previous national estimates of the prevalence of NAFLD in the US population and highlight the burden of this disease. Men, Mexican Americans, and people with diabetes and obesity are the most affected groups.
Preventable hospitalizations and emergency department visits for angina, United States, 1995-2010
Will JC , Valderrama AL , Yoon PW . Prev Chronic Dis 2013 10 E126 INTRODUCTION: Preventable hospitalizations for angina have been decreasing since the late 1980s - most likely because of changes in guidance, physician coding practices, and reimbursement. We asked whether this national decline has continued and whether preventable emergency department visits for angina show a similar decline. METHODS: We used National Hospital Discharge Survey data from 1995 through 2010 and National Hospital Ambulatory Medical Care Survey data from 1995 through 2009 to study preventable hospitalizations and emergency department visits, respectively. We calculated both crude and standardized rates for these visits according to technical specifications published by the Agency for Healthcare Research and Quality, which uses population estimates from the US Census Bureau as the denominator for the rates. RESULTS: Crude hospitalization rates for angina declined from 1995-1998 to 2007-2010 for men and women in all 3 age groups (18-44, 45-64, and ≥65 years), and age- and sex-standardized rates declined in a linear fashion (P = .02). Crude rates for preventable emergency department visits for angina declined for men and women aged 65 or older from 1995-1998 to 2007-2009. Age- and sex-standardized rates for these visits showed a linear decline (P = .05). CONCLUSION: We extend previous research by showing that preventable hospitalization rates for angina have continued to decline beyond the period studied previously. We also show that emergency department visits for the same condition declined during the past 15 years. Although these declines are probably due to changes in diagnostic practices in hospitals and emergency departments, more studies are needed to fully understand the reasons behind this phenomenon.
Epidemiology and outcomes of adults with asthma who were hospitalized or died with 2009 pandemic influenza A (H1N1) - California, 2009
Mortensen E , Louie J , Pertowski C , Cadwell BL , Weiss E , Acosta M , Matyas BT . Influenza Other Respir Viruses 2013 7 (6) 1343-9 BACKGROUND: Asthma was the most common chronic condition among adults hospitalized for 2009 pandemic influenza A (H1N1) (pH1N1). OBJECTIVES: We describe the epidemiology of and risk factors for severe outcomes among adults with asthma who were hospitalized with or died of pH1N1 in California. METHODS: We reviewed California Department of Public Health pH1N1 reports from April 23, 2009 through August 11, 2009. Reports were included if the patient was an adult (age ≥ 18 years) with asthma who was hospitalized or died and had pH1N1 (or non-subtypeable influenza A) infection confirmed by polymerase chain reaction. Patients were classified as having intermittent or persistent asthma on the basis of their regular medications. Risk factors for severe outcomes (i.e., intensive care unit admission or death) were assessed by comparing patients with and without severe outcomes, using chi-square tests and logistic regression. RESULTS: Among 744 identified patients, 170 (23%) had asthma (61% intermittent, 39% persistent). 132 of 142 (93%) patients had other chronic medical conditions. Severe outcomes occurred in 54 of 162 (33%) and were more common among those with renal disease (64% versus 31%; P = 0.04) and chest radiograph infiltrates (54% versus 11%; P < 0.01), and less common among those who received antivirals within 48 hours of symptom onset (22% versus 44%; P = 0.02). In multivariable analysis, chest radiograph infiltrates were associated with severe outcomes (adjusted odds ratio 9.38, 95% confidence interval 3.05-28.90). CONCLUSIONS: One third of adults with asthma who were hospitalized with or died of pH1N1 experienced severe outcomes. Early empiric antiviral therapy should be encouraged, especially among asthma patients.
Femur neck bone mineral density and fracture risk by age, sex, and race or Hispanic origin in older US adults from NHANES III
Looker AC . Arch Osteoporos 2013 8 141 Differences in the relationship between femur neck bone mineral density (FNBMD) and fracture risk were examined by age, sex, and race/ethnicity in the third National Health and Nutrition Examination Survey (NHANES III) cohort. FNBMD had similar, significant predictive utility for fracture in the different subgroups, but it did not completely account for subgroup differences in risk. PURPOSE: Few previous studies of FNBMD and fracture risk examined the relationship by age, sex, and race within the same cohort. The present study examined the relationship between FNBMD and risk of incident major osteoporotic fracture (hip, spine, radius, and humerus) in older US adults from NHANES III (1988-1994). METHODS: Incident fractures were identified using linked mortality and Medicare records obtained through 2007 for 2,743 men and women ages 65 years and older. FNBMD was measured by dual-energy X-ray absorptiometry. Cox proportional hazards models were used to estimate hazard ratios (HRs) for major osteoporotic fracture by FNBMD and by femur neck T score. RESULTS: The sample included 380 incident major osteoporotic fractures. Fracture risk approximately doubled for each SD decrease in FNBMD. HRs for FNBMD were similar within age, sex, and race/Hispanic origin subgroups, and also for T scores calculated with either white female or sex- and race/ethnic-specific reference data. Adding FNBMD to Cox models slightly attenuated the HRs for age, sex, and race/Hispanic origin, but all three variables remained significant predictors of fracture risk. CONCLUSIONS: FNBMD had similar, significant predictive utility within age, sex, and race/Hispanic origin subgroups. However, FNBMD did not appear to completely account for fracture risk differences in these subgroups. The similarity of HRs for T scores calculated with two different reference databases supports use of a uniform reference database to calculate these scores.
Issues of ovarian cancer survivors in the USA: a literature review
Trivers KF , Patterson JR , Roland KB , Rodriguez JL . Support Care Cancer 2013 21 (10) 2889-98 PURPOSE: As the number of ovarian cancer survivors increases, so does the need for appropriate intervention and care. A literature review was conducted to assess the issues affecting ovarian cancer survivors in the USA, including the needs of younger survivors. METHODS: Articles on six topics (finances/employment, reproductive and sexual health, treatment effects, information needs, genomics, and end-of-life/palliative care) among ovarian cancer survivors were identified through comprehensive database searches. Abstracts for all citations were reviewed to determine relevance. Data were abstracted from relevant articles, defined as those with a sample size of ≥20 that were published in English between 2000 and 2010 and involved human subjects in the USA. RESULTS: Thirty-four articles were relevant. Common, but often unaddressed, treatment side effects included infertility and issues with sexual health. Survivors reported not receiving adequate information about their disease. Hereditary cancer can lead to concern for family members. End-of-life/palliative care was often not addressed by physicians. Most of the studies used a cross-sectional design and lacked control groups. Participants were primarily recruited from academic medical centers or clinical trials and tended to be White. Few studies specifically addressed young survivors; however, reproductive health issues are common among them. CONCLUSIONS: Ovarian cancer has wide-ranging impacts. This review emphasizes the need for more research among ovarian cancer survivors, particularly related to finances, reproductive and sexual health, information needs, genomics, and end-of-life care. Issues specific to young survivors also deserve more attention. Directions for future research and clinical implications are discussed.
Disparate distribution of hepatitis B virus genotypes in four sub-Saharan African countries.
Forbi JC , Ben-Ayed Y , Xia GL , Vaughan G , Drobeniuc J , Switzer WM , Khudyakov YE . J Clin Virol 2013 58 (1) 59-66 BACKGROUND: Hepatitis B virus (HBV) places a substantial health burden on Africa. Here, we investigated genetic diversity of HBV variants circulating in 4 countries of sub-Saharan Africa using archived samples. In total, 1492 plasma samples were tested from HIV-infected individuals and pregnant women, among which 143 (9.6%) were PCR-positive for HBV DNA (Cote d'Ivoire, 70/608 [11.5%]; Ghana, 13/444 [2.9%]; Cameroon, 33/303 [10.9%]; and Uganda, 27/137 [19.7%]). STUDY DESIGN/RESULTS: Phylogenetic analysis of the S-gene sequences identified HBV genotypes E (HBV/E, n=96) and A (HBV/A, n=47) distributed as follows: 87% of HBV/E and 13% of HBV/A in Cote d'Ivoire; 100% of HBV/E in Ghana; 67% of HBV/E and 33% of HBV/A in Cameroon; and 100% of HBV/A in Uganda. The average and maximal nucleotide distances among HBV/E sequences were 1.9% and 6.4%, respectively, suggesting a greater genetic diversity for this genotype than previously reported (p<0.001). HBV/A strains were classified into subgenotypes HBV/A1, HBV/A2 and HBV/A3. In Uganda, 93% of HBV/A strains belonged to HBV/A1 whereas HBV/A3 was the only subgenotype of HBV/A found in Cameroon. In Cote d'Ivoire, HBV/A strains were classified as HBV/A1 (11.1%), HBV/A2 (33.3%) and HBV/A3 (55.6%). Phylogeographic analysis of the sequences available from Africa supported earlier suggestions on the origin of HBV/A1, HBV/A2 and HBV/A3 in East, South and West/Central Africa, respectively. Using predicted amino acid sequences, hepatitis B surface antigen (HBsAg) was classified into serotype ayw4 in 93% of HBV/E strains and adw2 in 68% of HBV/A strains. Also, 7.7% of the sequences carried substitutions in HBsAg associated with immune escape. CONCLUSIONS: The observations of pan-African and global dissemination of HBV/A1 and HBV/A2, and the circulation of HBV/E and HBV/A3 almost exclusively in West and Central Africa suggest a more recent increase in prevalence in Africa of HBV/E and HBV/A3 compared to HBV/A1 and HBV/A2. The broad genetic heterogeneity of HBsAg detected here may impact the efficacy of prevention and control efforts in sub-Saharan Africa.
Genotype GI.6 norovirus, United States, 2010-2012.
Leshem E , Barclay L , Wikswo M , Vega E , Gregoricus N , Parashar UD , Vinje J , Hall AJ . Emerg Infect Dis 2013 19 (8) 1317-20 We report an increase in the proportion of genotype GI.6 norovirus outbreaks in the United States from 1.4% in 2010 to 7.7% in 2012 (p<0.001). Compared with non-GI.6 outbreaks, GI.6 outbreaks were characterized by summer seasonality, foodborne transmission, and non-health care settings.
Trends in mortality from respiratory disease in Latin America since 1998 and the impact of the 2009 influenza pandemic
de Souza MdFM , Widdowson MA , Alencar AP , Gawryszewski VP , Aziz-Baumgartner E , Palekar R , Bresee J , Cheng PY , Barbosa J , Cabrera AM , Olea A , Flores AB , Shay DK , Mounts A , Oliva OP . Bull World Health Organ 2013 91 (7) 525-32 OBJECTIVE: To determine trends in mortality from respiratory disease in several areas of Latin America between 1998 and 2009. METHODS: The numbers of deaths attributed to respiratory disease between 1998 and 2009 were extracted from mortality data from Argentina, southern Brazil, Chile, Costa Rica, Ecuador, Mexico and Paraguay. Robust linear models were then fitted to the rates of mortality from respiratory disease recorded between 2003 and 2009. FINDINGS: Between 1998 and 2008, rates of mortality from respiratory disease gradually decreased in all age groups in most of the study areas. Among children younger than 5 years, for example, the annual rates of such mortality - across all seven study areas - fell from 56.9 deaths per 100 000 in 1998 to 26.6 deaths per 100 000 in 2008. Over this period, rates of mortality from respiratory disease were generally highest among adults older than 65 years and lowest among individuals aged 5 to 49 years. In 2009, mortality from respiratory disease was either similar to that recorded in 2008 or showed an increase; significant increases were seen among children younger than 5 years in Paraguay, among those aged 5 to 49 years in southern Brazil, Mexico and Paraguay, and among adults aged 50 to 64 years in Mexico and Paraguay. CONCLUSION: In much of Latin America, mortality from respiratory disease gradually fell between 1998 and 2008. However, this downward trend came to a halt in 2009, probably as a result of the influenza A(H1N1) 2009 pandemic.
The legal aspects of expedited partner therapy practice: do state laws and policies really matter?
Cramer R , Leichliter JS , Stenger MR , Loosier PS , Slive L . Sex Transm Dis 2013 40 (8) 657-62 BACKGROUND: Expedited partner therapy (EPT) is a potential partner treatment strategy. Significant efforts have been devoted to policies intended to facilitate its practice. However, few studies have attempted to evaluate these policies. METHODS: We used data on interviewed gonorrhea cases from 12 sites in the STD Surveillance Network in 2010 (n = 3404). Patients reported whether they had received EPT. We coded state laws relevant to EPT for gonorrhea using the Westlaw legal research database and the general legal status of EPT in STD Surveillance Network sites from the Centers for Disease Control and Prevention's Web site in 2010. We also coded policy statements by medical and other boards. We used chi-square tests to compare receipt of EPT by legal/policy variables, patient characteristics, and provider type. Variables significant at P < 0.10 in bivariate analyses were included in a logistic regression model. RESULTS: Overall, 9.5% of 2564 interviewed patients with gonorrhea reported receiving EPT for their partners. Receipt of EPT was significantly higher where laws and policies authorizing EPT existed. Where EPT laws for gonorrhea existed and EPT was permissible, 13.3% of patients reported receiving EPT, as compared with 5.4% where there were no EPT laws but EPT was permissible and 1.0% where there were no EPT laws and EPT was only potentially allowable (P < 0.01). Receipt of EPT was also higher where professional boards had policy statements supporting EPT (P < 0.01). Receipt of EPT did not differ by most patient characteristics or provider type. Policy-related findings were similar in adjusted analyses. CONCLUSIONS: EPT laws and policies were associated with higher reports of receipt of EPT among interviewed gonorrhea cases.
Looking to the future: vertical vs. horizontal prevention of Clostridium difficile infections
McDonald LC . Clin Infect Dis 2013 57 (8) 1103-5 Multidrug-resistant organisms (MDROs) such as methicillin-resistant Staphylococcus aureus, vancomycin-resistant enterococci, carbapenem-resistant Enterobacteriaceae, and Clostridium difficile all share certain epidemiologic characteristics: transmission via direct and indirect contact, colonization preceding infection by days to months, and a greater number of asymptomatically colonized than infected patients. For each of these MDROs, colonized patients may serve as an important source for healthcare transmission. Active surveillance (AS) to identify colonized patients has been used to prevent the transmission of MDROs by focusing isolation and/or decolonization efforts. In the case of C. difficile infection (CDI), AS has not been attempted, largely because there has not been a feasible method for detecting colonized patients and the role of colonized patients in overall transmission has not been well defined. In this issue, Curry et al. cast additional light on the role of asymptomatic colonization in C. difficile transmission leading to hospital-associated CDI (HA-CDI: defined as hospital-onset cases plus community-onset cases within 12 weeks of a previous discharge and no intervening hospital stay).[1, 2]
Patients hospitalized with laboratory-confirmed influenza during the 2010-2011 influenza season: exploring disease severity by virus type and subtype
Chaves SS , Aragon D , Bennett N , Cooper T , D'Mello T , Farley M , Fowler B , Hancock E , Kirley PD , Lynfield R , Ryan P , Schaffner W , Sharangpani R , Tengelsen L , Thomas A , Thurston D , Williams J , Yousey-Hindes K , Zansky S , Finelli L . J Infect Dis 2013 208 (8) 1305-14 BACKGROUND: The 2010-11 influenza season was dominated by influenza A(H3N2) virus, but influenza A(H1N1)pdm09 (pH1N1) and B viruses co-circulated. This provided an opportunity to explore within-season predictors of severity among hospitalized patients, avoiding the biases associated with season-to-season differences in strain virulence, population immunity, and healthcare seeking. METHODS: Population-based, laboratory-confirmed influenza hospitalization surveillance data were used to examine the association between virus type/subtype and outcomes in children and adults. Multivariable analysis explored virus type/subtype, prompt antiviral treatment, medical conditions, and age as predictors of severity (ICU admission or death). RESULTS: In children, pH1N1 (adjusted odds ratio [aOR] 2.19; 95% CI 1.11-4.3), chronic metabolic disease (aOR 5.23; 95% CI 1.74-15.69), and neuromuscular disorder (aOR 4.84; 95% CI 2.02-11.58) were independently associated with severity. In adults, independent predictors were pH1N1 (aOR 2.21; 95% CI 1.66-2.94), chronic lung disease (aOR 1.46; 95% CI 1.12-1.89), and neuromuscular disorder (aOR 1.68; 95% CI 1.11-2.52). Antiviral treatment reduced the odds of severity among adults (aOR 0.47; 95% CI 0.33-0.68). CONCLUSIONS: During the 2010-11 season, pH1N1 caused more severe disease than H3N2 or B in hospitalized patients. Underlying medical conditions increased severity regardless of virus strain. Antiviral treatment reduced severity among adults. Our findings underscore the importance of influenza prevention.
Preventing HIV infection in women
Adimora AA , Ramirez C , Auerbach JD , Aral SO , Hodder S , Wingood G , El-Sadr W , Bukusi EA . J Acquir Immune Defic Syndr 2013 63 S168-S173 Although the number of new infections has declined recently, women still constitute almost half of the world's 34 million people with HIV infection, and HIV remains the leading cause of death among women of reproductive age. Prevention research has made considerable progress during the past few years in addressing the biological, behavioral, and social factors that influence women's vulnerability to HIV infection. Nevertheless, substantial work still must be performed to implement scientific advancements and to resolve many questions that remain. This article highlights some of the recent advances and persistent gaps in HIV prevention research for women and outlines key research and policy priorities.
Effects and clinical significance of GII.4 Sydney norovirus, United States, 2012-2013
Leshem E , Wikswo M , Barclay L , Brandt E , Storm W , Salehi E , Desalvo T , Davis T , Saupe A , Dobbins G , Booth HA , Biggs C , Garman K , Woron AM , Parashar UD , Vinje J , Hall AJ . Emerg Infect Dis 2013 19 (8) 1231-8 During 2012, global detection of a new norovirus (NoV) strain, GII.4 Sydney, raised concerns about its potential effect in the United States. We analyzed data from NoV outbreaks in 5 states and emergency department visits for gastrointestinal illness in 1 state during the 2012-13 season and compared the data with those of previous seasons. During August 2012-April 2013, a total of 637 NoV outbreaks were reported, compared with 536 and 432 during the same period in the 2011-12 and 2010-11 seasons, respectively. The proportion of outbreaks attributed to GII.4 Sydney increased from 8% in September 2012 to 82% in March 2013. The increase in emergency department visits for gastrointestinal illness during the 2012-13 season was similar to that of previous seasons. GII.4 Sydney became the predominant US NoV outbreak strain during the 2012-13 season, but its emergence did not cause outbreak activity to increase substantially over that of previous seasons.
Emergence of multiple clade 2.3.2.1 influenza A (H5N1) virus subgroups in Vietnam and detection of novel reassortants
Creanga A , Thi Nguyen D , Gerloff N , Thi Do H , Balish A , Dang Nguyen H , Jang Y , Thi Dam V , Thor S , Jones J , Simpson N , Shu B , Emery S , Berman L , Nguyen HT , Bryant JE , Lindstrom S , Klimov A , Donis RO , Davis CT , Nguyen T . Virology 2013 444 12-20 Phylogenetic analyses of 169 influenza A(H5N1) virus genomes were conducted for samples collected through active surveillance and outbreak responses in Vietnam between September 2010 and September 2012. While clade 1.1 viruses persisted in southern regions, three genetically distinct subgroups of clade 2.3.2.1 were found in northern and central Vietnam. The identification of each subgroup corresponded with detection of novel reassortants, likely due to their overlapping circulation throughout the country. While the previously identified clade 1.1 and A/Hubei/1/2010-like 2.3.2.1 genotypes remained the predominant viruses detected, four viruses were found to be reassortants between A/Hubei/1/2010-like (HA, NA, PB2, PB1, PA, NP) and A/duck/Vietnam/NCVD-885/2010-like (M, NS) viruses, and one virus was identified as having A/duck/Vietnam/NCVD-885/2010-like HA, NA, PB1, and NP with A/Hubei/1/2010-like PB2 and PA genes. Additionally, clade 2.3.2.1 A/Hong Kong/6841/2010-like viruses, first detected in mid-2012, were identified as reassortants composed of A/Hubei/1/2010-like PB2 and PA and A/duck/Vietnam/NCVD-885/2010-like PB1, NP, NA, M, and NS genes.
Emergency department visit data for rapid detection and monitoring of norovirus activity, United States
Rha B , Burrer S , Park S , Trivedi T , Parashar UD , Lopman BA . Emerg Infect Dis 2013 19 (8) 1214-21 Noroviruses are the leading cause of gastroenteritis in the United States, but timely measures of disease are lacking. BioSense, a national-level electronic surveillance system, assigns data on chief complaints (patient symptoms) collected during emergency department (ED) visits to 78 subsyndromes in near real-time. In a series of linear regression models, the monthly proportion of all BioSense ED visits assigned to the diarrhea and nausea/vomiting subsyndromes correlated strongly with norovirus outbreaks reported by 6 states during 2007-2010. Correlations were higher for diarrhea (R = 0.828-0.926) than for nausea/vomiting (R = 0.729-0.866) across multiple age groups. Diarrhea ED visit proportions exhibited winter seasonality attributable to norovirus; rotavirus contributed substantially for children <5 years of age. Diarrhea ED visit data estimated the onset, peak, and end of norovirus season within 4 weeks of observed dates and could be reliable, timely indicators of norovirus activity.
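The correlation behind these R values is straightforward to reproduce: pair each month's subsyndrome visit proportion with the number of reported norovirus outbreaks and fit a simple linear regression. A minimal sketch in Python, using invented monthly values (neither the numbers nor the variable names come from BioSense):

```python
import numpy as np

# Hypothetical monthly series for one state (illustrative values only):
# proportion of all ED visits mapped to the diarrhea subsyndrome, and the
# number of norovirus outbreaks reported in the same month.
diarrhea_prop = np.array([0.021, 0.025, 0.032, 0.038, 0.035, 0.027,
                          0.022, 0.019, 0.018, 0.020, 0.026, 0.034])
outbreaks = np.array([5, 8, 14, 19, 16, 10, 6, 4, 3, 5, 9, 15])

# Pearson correlation coefficient (the abstract reports R = 0.828-0.926
# for the diarrhea subsyndrome across age groups).
r = np.corrcoef(diarrhea_prop, outbreaks)[0, 1]

# Equivalent simple linear model: outbreaks ~ intercept + slope * proportion.
slope, intercept = np.polyfit(diarrhea_prop, outbreaks, 1)
print(f"R = {r:.3f}, slope = {slope:.1f}, intercept = {intercept:.1f}")
```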
Indications for testing among reported cases of HCV infection from enhanced hepatitis surveillance sites in the United States, 2004-2010
Mahajan R , Liu SJ , Klevens RM , Holmberg SD . Am J Public Health 2013 103 (8) 1445-9 OBJECTIVES: The Centers for Disease Control and Prevention has recommended a 1-time HCV test for persons born from 1945 through 1965 to supplement current risk-based screening. We examined indications for testing by birth cohort (before 1945, 1945-1965, and after 1965) among persons with past or current HCV infection. METHODS: Cases had positive HCV laboratory markers reported by 4 surveillance sites (Colorado, Connecticut, Minnesota, and New York) to health departments from 2004 to 2010. Health department staff abstracted demographics and indications for testing from cases' medical records and compiled this information into a surveillance database. RESULTS: Of 110,223 cases of past or current HCV infection reported during 2004-2010, 74,578 (68%) were among persons born during 1945-1965. Testing indications were abstracted for 45,034 (41%) cases; of these, 29,544 (66%) identified at least 1 Centers for Disease Control and Prevention-recommended risk factor as a testing indication. Overall, 74% of reported cases were born from 1945 to 1965 or had an injection drug use history. CONCLUSIONS: These data support augmenting the current HCV risk-based screening recommendations by screening adults born from 1945 to 1965.
Acute hepatitis B outbreaks in 2 skilled nursing facilities and possible sources of transmission: North Carolina, 2009-2010
Sena AC , Moorman A , Njord L , Williams RE , Colborn J , Khudyakov Y , Drobeniuc J , Xia GL , Wood H , Moore Z . Infect Control Hosp Epidemiol 2013 34 (7) 709-16 OBJECTIVE: Acute hepatitis B virus (HBV) infections have been reported in long-term care facilities (LTCFs), primarily associated with breaches in infection control during assisted blood glucose monitoring. We investigated HBV outbreaks that occurred in separate skilled nursing facilities (SNFs) to determine factors associated with transmission. DESIGN: Outbreak investigation with case-control studies. SETTING: Two SNFs (facilities A and B) in Durham, North Carolina, during 2009-2010. PATIENTS: Residents with acute HBV infection and controls randomly selected from HBV-susceptible residents during the outbreak period. METHODS: After initial cases were identified, screening was offered to all residents, with repeat testing 3 months later for HBV-susceptible residents. Molecular testing was performed to assess viral relatedness. Infection control practices were observed. Case-control studies were conducted to evaluate associations between exposures and acute HBV infection in each facility. RESULTS: Six acute HBV cases were identified in each SNF. Viral phylogenetic analysis revealed a high degree of HBV relatedness within, but not between, facilities. No evaluated exposures were significantly associated with acute HBV infection in facility A; those associated with infection in facility B (all odds ratios >20) included injections, hospital or emergency room visits, and daily blood glucose monitoring. Observations revealed an absence of trained infection control staff at facility A and suboptimal hand hygiene practices during blood glucose monitoring and insulin injections at facility B. CONCLUSIONS: These outbreaks underscore the vulnerability of LTCF residents to acute HBV infection, the importance of surveillance and prompt investigation of incident cases, and the need for improved infection control education to prevent transmission.
Antiretroviral prophylaxis for HIV infection in injecting drug users in Bangkok, Thailand (the Bangkok Tenofovir Study): a randomised, double-blind, placebo-controlled phase 3 trial
Choopanya K , Martin M , Suntharasamai P , Sangkum U , Mock PA , Leethochawalit M , Chiamwongpaet S , Kitisin P , Natrujirote P , Kittimunkong S , Chuachoowong R , Gvetadze RJ , McNicholl JM , Paxton LA , Curlin ME , Hendrix CW , Vanichseni S . Lancet 2013 381 (9883) 2083-90 BACKGROUND: Antiretroviral pre-exposure prophylaxis reduces sexual transmission of HIV. We assessed whether daily oral use of tenofovir disoproxil fumarate (tenofovir), an antiretroviral, can reduce HIV transmission in injecting drug users. METHODS: In this randomised, double-blind, placebo-controlled trial, we enrolled volunteers from 17 drug-treatment clinics in Bangkok, Thailand. Participants were eligible if they were aged 20-60 years, were HIV-negative, and reported injecting drugs during the previous year. We randomly assigned participants (1:1; blocks of four) to either tenofovir or placebo using a computer-generated randomisation sequence. Participants chose either daily directly observed treatment or monthly visits and could switch at monthly visits. Participants received monthly HIV testing and individualised risk-reduction and adherence counselling, blood safety assessments every 3 months, and were offered condoms and methadone treatment. The primary efficacy endpoint was HIV infection, analysed by modified intention-to-treat analysis. This trial is registered with ClinicalTrials.gov, number NCT00119106. FINDINGS: Between June 9, 2005, and July 22, 2010, we enrolled 2413 participants, assigning 1204 to tenofovir and 1209 to placebo. Two participants had HIV at enrolment and 50 became infected during follow-up: 17 in the tenofovir group (an incidence of 0.35 per 100 person-years) and 33 in the placebo group (0.68 per 100 person-years), indicating a 48.9% reduction in HIV incidence (95% CI 9.6-72.2; p=0.01). The occurrence of serious adverse events was much the same between the two groups (p=0.35). Nausea was more common in participants in the tenofovir group than in the placebo group (p=0.002). INTERPRETATION: In this study, daily oral tenofovir reduced the risk of HIV infection in people who inject drugs. Pre-exposure prophylaxis with tenofovir can now be considered for use as part of an HIV prevention package for people who inject drugs. FUNDING: US Centers for Disease Control and Prevention and the Bangkok Metropolitan Administration.
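The reported efficacy is one minus the ratio of the two incidence rates (infections per 100 person-years). A quick sketch of that arithmetic, with person-years back-calculated from the rounded published rates, so the result lands near, not exactly on, the published 48.9%:

```python
# Reproducing the headline arithmetic from the trial report (a sketch;
# the person-time denominators below are back-calculated approximations,
# not the exact published person-years).
tenofovir_infections, placebo_infections = 17, 33
tenofovir_rate, placebo_rate = 0.35, 0.68   # per 100 person-years, as reported

# Implied person-years of follow-up in each group:
tenofovir_py = tenofovir_infections / (tenofovir_rate / 100)  # ~4,857 py
placebo_py = placebo_infections / (placebo_rate / 100)        # ~4,853 py

# Efficacy = 1 - rate ratio. Using the rounded published rates gives ~48.5%;
# the paper reports 48.9% because it uses exact (unrounded) person-time.
efficacy = 1 - (tenofovir_rate / placebo_rate)
print(f"estimated reduction in HIV incidence: {efficacy:.1%}")
```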
Diagnoses of human immunodeficiency virus (HIV) infection among foreign-born persons living in the District of Columbia
Willis LA , Opoku J , Murray A , West T , Johnson AS , Pappas G , Sutton MY . J Immigr Minor Health 2013 17 (1) 37-46 This study characterizes available surveillance data for HIV-infected foreign-born residents of the District of Columbia (DC) to inform local HIV prevention and care efforts. HIV surveillance data were reviewed for adults and adolescents (ages ≥13 years) living with HIV in 2008. Variables analyzed included demographics, region of origin (for persons born outside of the U.S.), insurance coverage, and linkage to and continuity of HIV care. Of the 16,513 DC residents living with HIV diagnoses, 1,391 (8.4%) were foreign-born. Of the foreign-born persons with HIV, 71.9% were male; 33.3% were from Africa and 20.8% from Central America; 80.6% were exposed through sex; and 36.3% had health coverage at diagnosis. While 100% of foreign-born persons had documented linkage to HIV care, only 18.0% had documentation of continued HIV care. These data suggest that strengthening support for continuous HIV care after successful care linkage is warranted for foreign-born persons living with HIV in DC.
Eliminating malaria vectors
Killeen GF , Seyoum A , Sikaala C , Zomboko AS , Gimnig JE , Govella NJ , White MT . Parasit Vectors 2013 6 172 Malaria vectors that predominantly feed indoors on humans have been locally eliminated from several settings with insecticide-treated nets (ITNs), indoor residual spraying, or larval source management. Recent dramatic declines of An. gambiae in east Africa with imperfect ITN coverage suggest that mosquito populations can rapidly collapse when forced below realistically achievable, non-zero thresholds of density and supporting resource availability. Here we explain why insecticide-based mosquito elimination strategies are feasible and desirable and can be extended to a wider variety of species by expanding the vector control arsenal to cover a broader spectrum of the resources mosquitoes need to survive. The greatest advantage of eliminating mosquitoes, rather than merely controlling them, is that elimination precludes local selection for behavioural or physiological resistance traits. The greatest challenges are therefore to achieve high biological coverage of targeted resources rapidly enough to prevent local emergence of resistance and then to continually exclude, monitor for, and respond to re-invasion from external populations.
First detection of Heartland virus (Bunyaviridae: Phlebovirus) from field collected arthropods
Savage HM , Godsey MS Jr , Lambert A , Panella NA , Burkhalter KL , Harmon JR , Lash RR , Ashley DC , Nicholson WL . Am J Trop Med Hyg 2013 89 (3) 445-452 Heartland virus (HRTV), the first pathogenic Phlebovirus (Family: Bunyaviridae) discovered in the United States, was recently described from two Missouri farmers. In 2012, we collected 56,428 ticks representing three species at 12 sites, including both patients' farms. Amblyomma americanum and Dermacentor variabilis accounted for nearly all ticks collected. Ten pools composed of deplete nymphs of A. americanum collected at a patient farm and a nearby conservation area were reverse transcription-polymerase chain reaction positive, and eight pools yielded viable viruses. Sequence data from the nonstructural protein of the Small segment indicate that tick strains and human strains are very similar, with ≥97.6% sequence identity. This is the first study to isolate HRTV from field-collected arthropods and to implicate ticks as potential vectors. Amblyomma americanum likely becomes infected by feeding on viremic hosts during the larval stage, and transmission to humans occurs during the spring and early summer when nymphs are abundant and actively host seeking.
Anthropogenic roost switching and rabies virus dynamics in house-roosting big brown bats
Streicker DG , Franka R , Jackson FR , Rupprecht CE . Vector Borne Zoonotic Dis 2013 13 (7) 498-504 Big brown bats (Eptesicus fuscus) are the most commonly encountered rabid bat in North America and represent an important source of wildlife rabies epizootics. Urban and suburban colonies of E. fuscus are often evicted from their roosts in houses, with poorly understood consequences for bat dispersal, population dynamics, and rabies virus transmission. We combined radiotelemetry and mark-recapture of E. fuscus with enhanced surveillance to understand the frequency of rabies virus exposure in house-roosting bats and to assess the potential for behavioral responses of eviction to exacerbate viral transmission. Serology demonstrated the circulation of rabies virus in nearly all sites, with an overall seroprevalence of 12%, but no bats were excreting rabies virus at the time of capture. Bats that were excluded from roosts relocated to houses <1 km from the original roost. However, behavioral responses to eviction differed, with bats switching repeatedly among new roosts in 1 site, but fusing with a neighboring colony in another. These findings confirm the circulation of rabies virus in E. fuscus that live in close contact with humans and companion animals, suggest mechanisms through which anthropogenic disturbance of bats might influence pathogen transmission, and highlight simple strategies to balance conservation and public health priorities.
Efficacy of flow restrictors in limiting access of liquid medications by young children
Lovegrove MC , Hon S , Geller RJ , Rose KO , Hampton LM , Bradley J , Budnitz DS . J Pediatr 2013 163 (4) 1134-9 e1 OBJECTIVE: To assess whether adding flow restrictors (FRs) to liquid medicine bottles can provide additional protection against unsupervised medication ingestions by young children, even when the child-resistant closure is not fully secured. STUDY DESIGN: In April and May 2012, we conducted a block randomized trial with a convenience sample of 110 3- and 4-year-old children from 5 local preschools. Participants attempted to remove test liquid from an uncapped bottle with an FR and a control bottle without an FR (with either no cap or an incompletely closed cap). RESULTS: All but 1 (96%; 25 of 26) of the open control bottles and 82% (68 of 83) of the incompletely closed control bottles were emptied within 2 minutes. Only 6% (7 of 110) of the bottles with FRs were emptied during the 10-minute testing period, none before 6 minutes. Overall, children removed less liquid from the bottles with FRs than from the open or incompletely closed control bottles without FRs (both P < .001). All children assigned open control bottles and 90% of those assigned incompletely closed control bottles removed ≥25 mL of liquid. In contrast, 11% of children removed ≥25 mL of liquid from uncapped bottles with FRs. Older children (aged 54-59 months) were more successful than younger children at removing ≥25 mL of liquid (P = .002) from bottles with FRs. CONCLUSION: Our findings suggest that adding FRs to liquid medicine bottles limits the accessibility of their contents to young children and could complement the safety provided by current child-resistant packaging.
Legionnaires' disease case-finding algorithm, attack rates, and risk factors during a residential outbreak among older adults: an environmental and cohort study
Silk BJ , Foltz JL , Ngamsnga K , Brown E , Munoz MG , Hampton L , Jacobs-Slifka K , Kozak NA , Underwood JM , Krick J , Travis T , Farrow O , Fields BS , Blythe D , Hicks LA . BMC Infect Dis 2013 13 (1) 291 BACKGROUND: During a Legionnaires' disease (LD) outbreak, combined epidemiological and environmental investigations were conducted to identify prevention recommendations for facilities where elderly residents live independently but have an increased risk of legionellosis. METHODS: Survey responses (n = 143) were used to calculate attack rates and describe transmission routes by estimating relative risk (RR) and 95% confidence intervals (95% CI). Potable water collected from five apartments of LD patients and three randomly-selected apartments of residents without LD (n = 103 samples) was cultured for Legionella. RESULTS: Eight confirmed LD cases occurred among 171 residents (attack rate = 4.7%); two visitors also developed LD. One case was fatal. The average age of patients was 70 years (range: 62-77). LD risk was lower among residents who reported tub bathing instead of showering (RR = 0.13, 95% CI: 0.02-1.09, P = 0.03). Two respiratory cultures were characterized as L. pneumophila serogroup 1, monoclonal antibody type Knoxville (1,2,3), sequence type 222. An indistinguishable strain was detected in 31 (74%) of 42 potable water samples. CONCLUSIONS: Managers of elderly-housing facilities and local public health officials should consider developing a Legionella prevention plan. When Legionella colonization of potable water is detected in these facilities, remediation is indicated to protect residents at higher risk. If LD occurs among residents, exposure reduction, heightened awareness, and clinical surveillance activities should be coordinated among stakeholders. For prompt diagnosis and effective treatment, clinicians should recognize the increased risk and atypical presentation of LD in older adults.
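Attack rates and relative risks in a cohort-style outbreak investigation like this reduce to simple ratios. A sketch of the arithmetic, in which the overall case counts match the abstract but the 2x2 exposure split is hypothetical (the abstract reports only the resulting RR):

```python
# Overall attack rate: confirmed cases among residents at risk.
cases, residents = 8, 171
attack_rate = cases / residents
print(f"overall attack rate: {attack_rate:.1%}")   # 4.7%

def relative_risk(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    """RR = risk in the exposed group / risk in the unexposed group."""
    return (cases_exposed / n_exposed) / (cases_unexposed / n_unexposed)

# Illustrative split only: residents who tub-bathed vs. those who showered.
rr = relative_risk(cases_exposed=1, n_exposed=70,
                   cases_unexposed=7, n_unexposed=63)
print(f"RR (tub bathing vs. showering): {rr:.2f}")  # ~0.13 with these counts
```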
Hemodialysis and water quality
Coulliette AD , Arduino MJ . Semin Dial 2013 26 (4) 427-38 Over 383,900 individuals in the U.S. undergo maintenance hemodialysis, which exposes them to water, primarily in the form of dialysate. The quality of water and associated dialysis solutions has been implicated in adverse patient outcomes and is therefore critical. The Association for the Advancement of Medical Instrumentation has published standards and recommended practices that address both water and dialyzing solutions. Some of these recommendations have been adopted into federal regulations by the Centers for Medicare and Medicaid Services as part of the Conditions for Coverage, which include limits on specific contaminants within water used for dialysis, dialysate, and substitution fluids. Chemical, bacterial, and endotoxin contaminants are health threats to dialysis patients, as shown by the continued episodic nature of outbreaks since the 1960s, which have caused at least 592 cases and 16 deaths in the U.S. This review discusses the importance of the dialysis water distribution system, current standards and recommendations, acceptable monitoring methods, chemical, bacterial, and endotoxin outbreaks, and infection control programs.
Family history of myocardial infarction is a risk factor for venous thromboembolism among whites but not among blacks.
Mili FD , Hooper WC , Lally C , Austin H . Clin Appl Thromb Hemost 2013 19 (4) 410-7 In addition to potentially sharing common pathogenesis and clinical manifestations, venous and arterial thromboses might have overlapping risk factors. To evaluate family history of myocardial infarction (MI) as a risk factor for venous thromboembolism (VTE) among whites and blacks, we analyzed data from the Genetic Attributes and Thrombosis Epidemiology (GATE) study. Results indicate that the association between VTE and a family history of MI is statistically significant only among whites (odds ratio [OR] = 1.3; 95% confidence interval [CI] = 1.03-1.8), particularly when they have diabetes mellitus (OR = 3.1; 95% CI = 1.2-8.0). Among blacks, the association between VTE and a family history of MI is not statistically significant (OR = 1.2; 95% CI = 0.89-1.5), whether or not they have diabetes. We conclude that a family history of MI is a risk factor for VTE among certain populations stratified by race and comorbid conditions.
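The ORs and CIs quoted here are standard 2x2-table quantities from a case-control design. A sketch of the computation with a Wald confidence interval; the counts below are hypothetical, chosen only to land near the whites' OR of 1.3 (the abstract does not publish the underlying table):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: white participants with/without a family history
# of MI, among VTE cases (a, b) and controls (c, d).
or_, lo, hi = odds_ratio_ci(a=210, b=590, c=180, d=660)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # ~1.3 (1.04-1.64)
```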
Regional variations in esophageal cancer rates by census region in the United States, 1999-2008
Drahos J , Wu M , Anderson WF , Trivers KF , King J , Rosenberg PS , Eheman C , Cook MB . PLoS One 2013 8 (7) e67913 BACKGROUND: Assessments of cancer incidence trends within the U.S. have mostly relied upon Surveillance, Epidemiology, and End Results (SEER) data, with the implicit inference that such data are representative of the general population. However, many cancer policy decisions are made at a more granular level. To help inform such decisions, analyses of regional cancer incidence data are needed. Leveraging the unique resource of the National Program of Cancer Registries (NPCR)-SEER database, we assessed whether regional rates and trends of esophageal cancer significantly deviated from national estimates. METHODS: From NPCR-SEER, we extracted cancer case counts and populations for whites aged 45-84 years by calendar year, histology, sex, and census region for the period 1999-2008. We calculated age-standardized incidence rates (ASRs), annual percent changes (APCs), and male-to-female incidence rate ratios (IRRs). RESULTS: This analysis included 65,823 esophageal adenocarcinomas and 27,094 esophageal squamous cell carcinomas diagnosed during 778 million person-years. We observed significant geographic variability in incidence rates and trends, especially for esophageal adenocarcinomas in males: ASRs were highest in the Northeast (17.7 per 100,000) and Midwest (18.1), both significantly higher than the national estimate (16.0). In addition, the Northeast APC was 62% higher than the national estimate (3.19% vs. 1.97%). Lastly, IRRs remained fairly constant across calendar time, despite changes in incidence rates. CONCLUSION: Significant regional variations in esophageal cancer incidence trends exist in the U.S. Stable IRRs may indicate that the predominant factors affecting incidence rates are similar in men and women.
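Both headline statistics have compact definitions: an ASR is a weighted average of age-specific rates with weights drawn from a standard population, and an APC comes from a log-linear model of rates on calendar year. A sketch with invented counts and weights (U.S. registry analyses conventionally standardize to the 2000 US population, but the strata and values below are illustrative only):

```python
import math

# Direct age standardization: ASR = sum over age strata of
# (standard-population weight) x (age-specific rate).
# (cases, person_years, weight) per stratum; all values are invented.
strata = [
    (120, 2_500_000, 0.38),   # ages 45-54
    (310, 2_100_000, 0.30),   # ages 55-64
    (420, 1_500_000, 0.20),   # ages 65-74
    (330,   900_000, 0.12),   # ages 75-84
]
asr = sum(w * (cases / py) * 100_000 for cases, py, w in strata)
print(f"ASR = {asr:.1f} per 100,000")

# Annual percent change: fit log(rate) = b0 + b1*year; APC = (exp(b1)-1)*100.
# A crude endpoint estimate of b1 stands in for the regression here.
rates = [15.0, 15.4, 15.9, 16.3, 16.8]          # hypothetical ASRs by year
b1 = (math.log(rates[-1]) - math.log(rates[0])) / (len(rates) - 1)
print(f"APC = {(math.exp(b1) - 1) * 100:.2f}% per year")
```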
Monitoring avian influenza A(H7N9) virus through National Influenza-like Illness Surveillance, China
Xu C , Havers F , Wang L , Chen T , Shi J , Wang D , Yang J , Yang L , Widdowson MA , Shu Y . Emerg Infect Dis 2013 19 (8) 1289-92 In China during March 4-April 28, 2013, avian influenza A(H7N9) virus testing was performed on 20,739 specimens from patients with influenza-like illness in 10 provinces with confirmed human cases: 6 (0.03%) were positive, and increased numbers of unsubtypeable influenza-positive specimens were not seen. Careful monitoring and rapid characterization of influenza A(H7N9) and other influenza viruses remain critical.
The prevalence of Type 1 Diabetes in the United States
Menke A , Orchard TJ , Imperatore G , Bullard KM , Mayer-Davis E , Cowie CC . Epidemiology 2013 24 (5) 773-774 There are few data on the prevalence of type 1 diabetes mellitus [1, 2] and no estimates for the entire US population. The National Health and Nutrition Examination Survey (NHANES) is a representative cross-sectional survey of the civilian, noninstitutionalized US population. Although NHANES does not explicitly collect information on type 1 diabetes mellitus, we estimated its prevalence based on age at diabetes diagnosis, age at insulin initiation, and current use of insulin [3]. The protocol for the 1999–2010 NHANES was approved by the National Center for Health Statistics of the Centers for Disease Control and Prevention Ethics Review Board. All participants gave written informed consent. After excluding 2955 infants for whom diabetes data were not collected and 75 children and adults who were missing diabetes data, 59,130 participants remained in the analytic sample [4]. We considered participants to have type 1 diabetes mellitus if they started insulin within 1 year of diabetes diagnosis, were currently using insulin, and were diagnosed with diabetes under age 30 (definition 1) or under age 40 (definition 2).
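The case definition above is a simple decision rule and can be expressed directly. A sketch (the function and field names are illustrative, not actual NHANES variable names):

```python
# A sketch of the two case definitions described above.
def probable_type1(age_at_diagnosis: float,
                   years_from_diagnosis_to_insulin: float,
                   currently_uses_insulin: bool,
                   definition: int = 1) -> bool:
    """Definition 1 requires diagnosis under age 30; definition 2, under age 40.
    Both require insulin started within 1 year of diagnosis and current use."""
    age_cutoff = 30 if definition == 1 else 40
    return (currently_uses_insulin
            and years_from_diagnosis_to_insulin <= 1
            and age_at_diagnosis < age_cutoff)

# Example: diagnosed at 22, started insulin immediately, still on insulin.
print(probable_type1(22, 0, True))                   # True under definition 1
print(probable_type1(35, 0.5, True, definition=2))   # True only under definition 2
```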
Effect of nucleic acid amplification testing on population-based incidence rates of Clostridium difficile infection
Gould CV , Edwards JR , Cohen J , Bamberg WM , Clark LA , Farley MM , Johnston H , Nadle J , Winston L , Gerding DN , McDonald LC , Lessa FC . Clin Infect Dis 2013 57 (9) 1304-7 Nucleic acid amplification tests (NAAT) are increasingly being adopted for diagnosis of Clostridium difficile infection (CDI). Data from three states conducting population-based CDI surveillance showed increases ranging from 43% to 67% in CDI incidence attributable to changing from toxin enzyme immunoassays to NAAT. CDI surveillance requires adjustment for testing methods.
Acute gastroenteritis surveillance through the National Outbreak Reporting System, United States
Hall AJ , Wikswo ME , Manikonda K , Roberts VA , Yoder JS , Gould LH . Emerg Infect Dis 2013 19 (8) 1305-9 Implemented in 2009, the National Outbreak Reporting System provides surveillance for acute gastroenteritis outbreaks in the United States resulting from any transmission mode. Data from the first 2 years of surveillance highlight the predominant role of norovirus. The pathogen-specific transmission pathways and exposure settings identified can help inform prevention efforts. |
Vision loss following intraocular listeriosis associated with contaminated cantaloupe
Ibraheem M , Vance S , Jackson KA , Ettestad P , Smelser C , Silk B . Case Rep Ophthalmol 2013 4 (2) 7-11 Intraocular listeriosis, a rare manifestation of invasive listeriosis, has a poor visual prognosis. We report an intraocular listeriosis case related to a multistate outbreak associated with contaminated cantaloupe. Increasing awareness of rare listeriosis presentations might facilitate timely diagnosis and treatment, and case reporting can clarify medical and epidemiologic aspects of listeriosis. |
Norovirus surveillance among callers to foodborne illness complaint hotline, Minnesota, USA, 2011-2013
Saupe AA , Kaehler D , Cebelinski EA , Nefzger B , Hall AJ , Smith KE . Emerg Infect Dis 2013 19 (8) 1293-6 Norovirus is the leading cause of foodborne disease in the United States. During October 2011-January 2013, we conducted surveillance for norovirus infection in Minnesota among callers to a complaint-based foodborne illness hotline who reported diarrhea or vomiting. Of 241 complainants tested, 127 (52.7%) were positive for norovirus. |
Outbreak-associated Salmonella enterica serotypes and food commodities, United States, 1998-2008
Jackson BR , Griffin PM , Cole D , Walsh KA , Chai SJ . Emerg Infect Dis 2013 19 (8) 1239-44 Salmonella enterica infections are transmitted not only by animal-derived foods but also by vegetables, fruits, and other plant products. To clarify links between Salmonella serotypes and specific foods, we examined the diversity and predominance of food commodities implicated in outbreaks of salmonellosis during 1998-2008. More than 80% of outbreaks caused by serotypes Enteritidis, Heidelberg, and Hadar were attributed to eggs or poultry, whereas >50% of outbreaks caused by serotypes Javiana, Litchfield, Mbandaka, Muenchen, Poona, and Senftenberg were attributed to plant commodities. Serotypes Typhimurium and Newport were associated with a wide variety of food commodities. Knowledge about these associations can help guide outbreak investigations and control measures. |
Human aflatoxin exposure in Kenya, 2007: a cross-sectional study
Yard EE , Daniel JH , Lewis LS , Rybak ME , Paliakov EM , Kim AA , Montgomery JM , Bunnell R , Abudo MU , Akhwale W , Breiman RF , Sharif SK . Food Addit Contam Part A Chem Anal Control Expo Risk Assess 2013 30 (7) 1322-31 Aflatoxins contaminate approximately 25% of agricultural products worldwide. They can cause liver failure and liver cancer. Kenya has experienced multiple aflatoxicosis outbreaks in recent years, often resulting in fatalities. However, the full extent of aflatoxin exposure in Kenya has been unknown. Our objective was to quantify aflatoxin exposure across Kenya. We analysed aflatoxin levels in serum specimens from the 2007 Kenya AIDS Indicator Survey (KAIS) - a nationally representative, cross-sectional serosurvey. KAIS collected 15,853 blood specimens. Of the 3180 human immunodeficiency virus-negative specimens with ≥1 mL sera, we randomly selected 600 specimens stratified by province and sex. We analysed serum specimens for aflatoxin albumin adducts by using isotope dilution MS/MS to quantify aflatoxin B1-lysine, and normalised with serum albumin. Aflatoxin concentrations were then compared by demographic, socioeconomic and geographic characteristics. We detected serum aflatoxin B1-lysine in 78% of serum specimens (range = <LOD-211 pg/mg albumin, median = 1.78 pg/mg albumin). Aflatoxin exposure did not vary by sex, age group, marital status, religion or socioeconomic characteristics. Aflatoxin exposure varied by province (p < 0.05); it was highest in Eastern (median = 7.87 pg/mg albumin) and Coast (median = 3.70 pg/mg albumin) provinces and lowest in Nyanza (median = <LOD) and Rift Valley (median = 0.70 pg/mg albumin) provinces. Our findings suggest that aflatoxin exposure is a public health problem throughout Kenya and may be substantially affecting human health. Wide-scale, evidence-based interventions are urgently needed to decrease exposure and subsequent health effects. |
Impact of 2003 state regulation on raw oyster-associated Vibrio vulnificus illnesses and deaths, California, USA
Vugia DJ , Tabnak F , Newton AE , Hernandez M , Griffin PM . Emerg Infect Dis 2013 19 (8) 1276-80 US vibriosis rates have increased since 1996, and many Vibrio vulnificus infections are fatal. In April 2003, California implemented a regulation restricting the sale of raw oysters harvested from the Gulf of Mexico during April 1-October 31, unless they were processed to reduce V. vulnificus to nondetectable levels. We analyzed California cases of V. vulnificus infection before and after the regulation's implementation and compared case data with data from other states. The annual number of reported V. vulnificus infections and deaths in California in which raw oysters were the patient's sole exposure dropped from 0-6 cases and 0-5 deaths per year during 1991-2002, before implementation, to 0 during 2003-2010, after implementation (p = 0.0005 for both). In other states, median annual numbers of similar cases and deaths increased slightly after 2002. The data strongly suggest that the 2003 regulation led to a significant reduction in reported raw oyster-associated V. vulnificus illnesses and deaths. |
Shared Mycobacterium avium genotypes observed among unlinked clinical and environmental isolates.
Dirac MA , Weigel KM , Yakrus MA , Becker AL , Chen HL , Fridley G , Sikora A , Speake C , Hilborn ED , Pfaller S , Cangelosi GA . Appl Environ Microbiol 2013 79 (18) 5601-7 Our understanding of the sources of Mycobacterium avium infection is partially based on genotypic matching of pathogen isolates from cases and environmental sources. These approaches assume that genotypic identity is rare in isolates from unlinked cases or sources. To test this assumption, a high-resolution PCR-based genotyping approach, LSP-MVR, was selected and used to analyze clinical and environmental isolates of M. avium from geographically diverse sources. Among 127 clinical isolates from seven locations in North America, South America, and Europe, 42 genotypes were observed. Among twelve of these genotypes, matches were seen in isolates from apparently unlinked patients in two or more geographic locations. Six of the twelve were also observed in environmental isolates. A subset of these isolates was further analyzed by alternative strain genotyping methods, PFGE and MIRU-VNTR, which confirmed the existence of geographically dispersed strain genotypes. These results suggest that caution should be exercised in interpreting high-resolution genotypic matches as evidence for an acquisition event. |
The new global health
De Cock KM , Simone PM , Davison V , Slutsker L . Emerg Infect Dis 2013 19 (8) 1192-7 Global health reflects the realities of globalization, including worldwide dissemination of infectious and noninfectious public health risks. Global health architecture is complex, and better coordination is needed between multiple organizations. Three overlapping themes determine global health action and prioritization: development, security, and public health. These themes play out against a background of demographic change, socioeconomic development, and urbanization. Infectious diseases remain critical factors but are no longer the major cause of global illness and death. Traditional indicators of public health, such as maternal and infant mortality rates, no longer describe the health status of whole societies; this change highlights the need for investment in vital registration and disease-specific reporting. Noncommunicable diseases, injuries, and mental health will require greater attention from the world in the future. The new global health requires broader engagement by health organizations and all countries for the objectives of health equity, access, and coverage as priorities beyond the Millennium Development Goals are set. |
Sexual risk behavior and viremia among men who have sex with men in the HIV Outpatient Study, United States, 2007-2010
Durham MD , Buchacz K , Richardson J , Yang D , Wood K , Yangco B , Brooks JT . J Acquir Immune Defic Syndr 2013 63 (3) 372-8 BACKGROUND: Recent US data on unsafe sexual behaviors among viremic HIV-infected men who have sex with men (MSM) are limited. METHODS: Using data abstracted from medical records of the participants in the HIV Outpatient Study (HOPS) and a supplemental behavioral survey, we assessed the frequency of high-risk sexual practices among HIV-infected MSM in care and examined the factors associated with risky sexual practices. We also compared the frequency of unprotected anal sex (UAS) with HIV-negative or unknown-serostatus partners among viremic (HIV viral load ≥400 copies per milliliter) vs virologically suppressed (HIV viral load <400 copies per milliliter) MSM. RESULTS: Among 902 HIV-infected MSM surveyed, 704 (78%) reported having sex in the past 6 months, of whom 54% reported UAS (37% insertive, 42% receptive) and 40% UAS with a male partner who was HIV-negative or of unknown serostatus (24% insertive, 31% receptive). In multivariable regression with an outcome of engaging in any UAS with a male partner who was HIV-negative or of unknown serostatus, MSM aged <50 years, those who reported injection drug use risk, those with ≥2 sex partners, and those who disclosed their HIV status to some but not all of their sex partners were more likely to report this practice. Among MSM who reported any UAS, 15% were viremic; frequency of UAS did not differ between viremic and virologically suppressed MSM. CONCLUSIONS: The high frequency of UAS with HIV-negative or unknown-status partners among HIV-infected MSM in care suggests the need for targeted prevention strategies for this population. |
Community-level text messaging for 2009 H1N1 prevention in China
Chai SJ , Tan F , Ji Y , Wei X , Li R , Frost M . Am J Prev Med 2013 45 (2) 190-6 BACKGROUND: Although patients worldwide increasingly are using mobile phone text messaging (SMS) for clinical care, quality data are sparse on the community-level effectiveness of SMS to prevent and control disease. PURPOSE: To determine SMS effectiveness in improving 2009 H1N1 knowledge, attitudes, behaviors, and self-reported outcomes and to assess community SMS acceptability. METHODS: A program evaluation of Shanghai, China's SMS system using a single-blinded, randomized-controlled method was conducted in 2010 and results were analyzed in 2010-2011. Randomly selected community residents who agreed to participate were assigned to receive 3 weeks of either 2009 H1N1 prevention and control or tobacco-cessation messages. Assessments were made of 2009 H1N1 knowledge, attitudes, behaviors, and self-reported influenza-like illness before and after sending messages to participants. Acceptability of SMS also was assessed. RESULTS: Of 1992 respondents, those receiving 2009 H1N1 messages had higher scores measuring 2009 H1N1 knowledge (4.2% higher) and desired attitudes (9.4% higher) (p<0.001); 1.77 times greater odds of new 2009 H1N1 vaccination (p<0.001); and 0.12 times smaller odds of reporting influenza-like illness (p<0.001) than those receiving tobacco messages. More than 95% of participants found the SMS program useful and trustworthy; nearly 90% would use it again. CONCLUSIONS: SMS can improve self-reported uptake of short-term behaviors, such as vaccination, that can result in long-term prevention and control of disease. SMS can improve knowledge and influence attitudes about infection prevention and control and self-reported health outcomes. In Shanghai, health-based SMS is acceptable to users. |
Development of culturally tailored educational brochures on HPV and pap tests for American Indian women
Sharpe PA , Brandt HM , McCree DH , Owl-Myers E , Taylor B , Mullins G . J Transcult Nurs 2013 24 (3) 282-90 PURPOSE: Participatory formative research guided the creation of a culturally tailored educational brochure about human papillomavirus (HPV) at an American Indian women's clinic. METHOD: A review of existing educational materials and in-depth interviews were conducted. Nine steps for creating health communications messages that were patterned after National Cancer Institute guidelines guided the brochure development process. RESULTS: Of 95 women tested for HPV, 41% were positive, 32 (34%) agreed to the in-depth interview, and 9 agreed to the pretesting interview. Mean age was 41 years. Interviews revealed key themes concerning emotional reactions to abnormal Pap test results and HPV; need for basic information about HPV, Pap tests, and results; concerns about HPV stigma, sexual transmission, and communication with sexual partner; and the preferred source and format for HPV educational materials. A literature review revealed 12 areas of basic HPV content. CONCLUSIONS: A participatory process successfully engaged nursing staff and patients in creating culturally appropriate brochures for clinic use. IMPLICATIONS: This article provides specific steps for creating culturally tailored patient education materials. |
Evidence for the transmission of parvovirus B19 in patients with bleeding disorders treated with plasma-derived factor concentrates in the era of nucleic acid test screening.
Soucie JM , De Staercke C , Monahan PE , Recht M , Chitlur MB , Gruppo R , Hooper WC , Kessler C , Kulkarni R , Manco-Johnson MJ , Powell J , Pyle M , Riske B , Sabio H , Trimble S . Transfusion 2013 53 (6) 1217-25 BACKGROUND: Parvovirus B19 (B19V) is a small, nonenveloped virus that typically causes a benign flu-like illness that occurs most frequently in childhood. The virus is resistant to current viral inactivation steps used in the manufacture of antihemophilic factor concentrates and B19V transmission through these products has been documented. Since 2000, B19V nucleic acid test (NAT) screening of plasma pools has been implemented to further decrease the viral burden in these products, but no study has examined populations using these products to assess the impact of the screening on B19V transmission. STUDY DESIGN AND METHODS: Blood specimens obtained from participants of a surveillance system established in federally supported specialized bleeding disorders clinics were used in a B19V seroprevalence study. RESULTS: A total of 1643 specimens from 1043 participants age 2 to 7 years born after B19V NAT screening was implemented were tested. Age-specific prevalence rates were generally higher for subjects exposed to either plasma-derived products alone or in combination with other products compared to subjects with no exposure to antihemophilic products. Overall, compared to participants unexposed to blood or blood products, those exposed to plasma-derived products alone were 1.7 times more likely to have antibodies to B19V (p = 0.002). CONCLUSION: These results are consistent with continued B19V transmission through plasma-derived factor concentrates. Effective viral inactivation and detection processes are needed to protect users of these products from infection with B19V or other new or emerging viruses. |
Prevalence and risk factors for acquisition of carbapenem-resistant enterobacteriaceae in the setting of endemicity
Swaminathan M , Sharma S , Poliansky Blash S , Patel G , Banach DB , Phillips M , Labombardi V , Anderson KF , Kitchel B , Srinivasan A , Calfee DP . Infect Control Hosp Epidemiol 2013 34 (8) 809-17 OBJECTIVE: To describe the epidemiology of carbapenem-resistant Enterobacteriaceae (CRE) carriage and acquisition among hospitalized patients in an area of CRE endemicity. DESIGN: Cohort study with a nested case-control study. SETTING: Two acute care, academic hospitals in New York City. PARTICIPANTS: All patients admitted to 7 study units, including intensive care, medical-surgical, and acute rehabilitation units. METHOD: Perianal samples were collected from patients at admission and weekly thereafter to detect asymptomatic gastrointestinal carriage of CRE. A nested case-control study was performed to identify factors associated with CRE acquisition. Case patients were those who acquired CRE during a single hospitalization. Control subjects had no microbiologic evidence of CRE and at least 1 negative surveillance sample. Clinical data were abstracted from the medical record. RESULTS: The prevalence of CRE in the study population was 5.4% (306 of 5,676 patients), and 104 patients met the case definition of acquisition during a single hospital stay. Mechanical ventilation (odds ratio [OR], 11.5), pulmonary disease (OR, 5.2), days of antibiotic therapy (OR, 1.04), and CRE colonization pressure (OR, 1.15) were independently associated with CRE acquisition. Pulsed-field gel electrophoresis analysis identified 87% of tested Klebsiella pneumoniae isolates as sharing related patterns (greater than 78% similarity), which suggests clonal transmission within and between the study hospitals. CONCLUSIONS: Critical illness and underlying medical conditions, CRE colonization pressure, and antimicrobial exposure are important risk factors for CRE acquisition. Adherence to infection control practices and antimicrobial stewardship appear to be critical components of a CRE control program. |
Hepatitis C virus screening and management of seroconversions in hemodialysis facilities
Mbaeyi C , Thompson ND . Semin Dial 2013 26 (4) 439-46 Over the past two decades, healthcare-associated exposure has increasingly been recognized as a means of hepatitis C virus (HCV) transmission, especially in hemodialysis facilities. The prevalence of HCV among hemodialysis patients is known to be several times greater than that of the general population of the United States, and chronic HCV infection is associated with significant morbidity and mortality among these patients. During 2008-2011, HCV infection outbreaks were identified in multiple US hemodialysis facilities, resulting in at least 46 new HCV infections among hemodialysis patients. These outbreaks, linked to infection control breaches, also highlight the failure of some facilities to follow established guidelines for routine HCV antibody (anti-HCV) screening and response to new HCV infection among hemodialysis patients. Current national guidelines recommend screening of hemodialysis patients for anti-HCV on facility admission and, for susceptible patients, on a semiannual basis. Here, we seek to underscore the importance of compliance with national recommendations for anti-HCV screening of hemodialysis patients and actions to be taken in the event of possible HCV transmission within a hemodialysis facility. These include general steps to ensure that: hemodialysis patients are routinely screened for anti-HCV to facilitate early detection of new infections; newly infected patients are informed of the change in their HCV status and undergo clinical evaluation; and public health officials are notified of new HCV infections in a timely manner. We then focus on the need to assess infection control practices at the facility, with particular attention given to safe handling of injectable medications, hand hygiene, and disinfection practices. In the absence of a vaccine, routine screening and adherence to standard infection control practices will remain the key strategies for preventing HCV transmission in hemodialysis units. |
Impact of electronic surveillance on isolation practices
Larson E , Behta M , Cohen B , Jia H , Furuya EY , Ross B , Chaudhry R , Vawdrey DK , Ellingson K . Infect Control Hosp Epidemiol 2013 34 (7) 694-9 OBJECTIVE: To assess the impact of an electronic surveillance system on isolation practices and rates of methicillin-resistant Staphylococcus aureus (MRSA). DESIGN: A pre-post test intervention. SETTING: Inpatient units (except psychiatry and labor and delivery) in 4 New York City hospitals. PATIENTS: All patients for whom isolation precautions were indicated, May 2009-December 2011. METHODS: Trained observers assessed isolation sign postings, availability of isolation carts, and staff use of personal protective equipment (PPE). Infection rates were obtained from the infection control department. Regression analyses were used to examine the association between the surveillance system, infection prevention practices, and MRSA infection rates. RESULTS: A total of 54,159 isolation days and 7,628 staff opportunities for donning PPE were observed over a 31-month period. Odds of having an appropriate sign posted were significantly higher after intervention than before intervention (odds ratio [OR], 1.10 [95% confidence interval {CI}, 1.01-1.20]). Relative to baseline, postintervention sign posting improved significantly for airborne and droplet precautions but not for contact precautions. Sign posting improved for vancomycin-resistant enterococci (OR, 1.51 [95% CI, 1.23-1.86]), Clostridium difficile (OR, 1.59 [95% CI, 1.27-2.02]), and Acinetobacter baumannii (OR, 1.41 [95% CI, 1.21-1.64]) precautions but not for MRSA precautions (OR, 1.11 [95% CI, 0.89-1.39]). Staff and visitor adherence to PPE remained low throughout the study but improved from 29.1% to 37.0% after the intervention (OR, 1.14 [95% CI, 1.01-1.29]). MRSA infection rates were not significantly different after the intervention. CONCLUSIONS: An electronic surveillance system resulted in small but statistically significant improvements in isolation practices but no reductions in infection rates over the short term. Such innovations likely require considerable uptake time. |
Seroepidemiology of diphtheria and tetanus among children and young adults in Tajikistan: nationwide population-based survey, 2010
Khetsuriani N , Zakikhany K , Jabirov S , Saparova N , Ursu P , Wannemuehler K , Wassilak S , Efstratiou A , Martin R . Vaccine 2013 31 (42) 4917-22 BACKGROUND: Tajikistan had a major diphtheria outbreak (approximately 10,000 cases) in the 1990s, which was controlled after nationwide immunization campaigns with diphtheria-tetanus toxoid in 1995 and 1996. Since 2000, only 52 diphtheria cases have been reported. However, in coverage surveys conducted in 2000 and 2005, diphtheria-tetanus-pertussis vaccine coverage was lower than administratively reported estimates, raising concerns about potential immunity gaps. To further assess population immunity to diphtheria in Tajikistan, diphtheria antibody testing was included in a large-scale nationwide serosurvey for vaccine-preventable diseases conducted in connection with a poliomyelitis outbreak in 2010. In addition, the serosurvey provided an opportunity to assess population immunity to tetanus. METHODS: Residents of all regions of Tajikistan aged 1-24 years were included in the serosurvey implemented during September-October 2010. Participants were selected through stratified cluster sampling. Specimens were tested for diphtheria antibodies using a Vero cell neutralization assay and for tetanus antibodies using an anti-tetanus IgG ELISA. Antibody concentrations ≥0.1 IU/mL were considered seropositive. RESULTS: Overall, 51.4% (95% CI, 47.1%-55.6%) of participants were seropositive for diphtheria and 78.9% (95% CI, 74.7%-82.5%) were seropositive for tetanus. The lowest percentages of seropositivity for both diseases were observed among persons aged 10-19 years: diphtheria seropositivity was 37.1% (95% CI, 31.0%-43.7%) among 10-14 year-olds, and 35.3% (95% CI, 29.9%-41.1%) among 15-19 year-olds; tetanus seropositivity in respective age groups was 65.3% (95% CI, 58.4%-71.6%) and 70.1% (95% CI, 64.5%-75.2%). CONCLUSIONS: Population immunity for diphtheria in Tajikistan is low, particularly among 10-19 year-olds. Population immunity to tetanus is generally higher than for diphtheria, but is suboptimal among 10-19 year-olds. These findings highlight the need to improve routine immunization service delivery, and support a one-time supplementary immunization campaign with diphtheria-tetanus toxoid among birth cohorts aged 1-19 years in 2010 (3-21 years in 2012) to close immunity gaps and prevent diphtheria outbreaks. |
Uptake and effectiveness of monovalent influenza A (H1N1) pandemic 2009 vaccine among healthcare personnel in Kenya, 2010
Njuguna H , Ahmed J , Oria PA , Arunga G , Williamson J , Kosgey A , Muthoka P , Mott JA , Breiman RF , Katz MA . Vaccine 2013 31 (41) 4662-7 INTRODUCTION: During April-June 2010, the Kenya Ministry of Public Health and Sanitation distributed free monovalent influenza A(H1N1)pdm09 vaccines to health care personnel (HCP) and other vulnerable groups. We conducted a prospective cohort study among HCP to characterize influenza A(H1N1)pdm09 vaccine uptake and to assess influenza A(H1N1)pdm09 vaccine effectiveness. METHODS: We enrolled HCP from 5 hospitals and followed them for 6 months. At enrollment, we asked HCP if they had received the influenza A(H1N1)pdm09 vaccine and the reasons for their decision. We administered weekly questionnaires to participants about respiratory symptoms suffered during the previous week. Participants who had acute respiratory illness were asked to contact our surveillance clinician, and nasopharyngeal and oropharyngeal specimens were collected and later tested for influenza by real-time reverse-transcriptase polymerase-chain-reaction. Vaccine effectiveness was estimated by comparing the incidence of acute respiratory illness, absenteeism from work due to respiratory illness, and laboratory-confirmed influenza among vaccinated and unvaccinated HCP. RESULTS: We enrolled 3803 HCP from the five hospitals; 64% received influenza vaccine. Vaccinated HCP were more likely to develop acute respiratory illness (ARI) and more likely to report missed days of work due to respiratory illness compared to non-vaccinated HCP (adjusted incidence rate ratio (aIRR) 1.50, 95% confidence intervals (CI): 1.33-1.70) and (aIRR 2.02, 95% CI: 1.41-2.88), respectively. Of 531 samples collected from vaccinated and non-vaccinated HCP, 30 were influenza A and 3 were influenza B. Two influenza A(H1N1)pdm09 viruses were isolated: one from a vaccinated HCP and the other from a non-vaccinated HCP. DISCUSSION AND CONCLUSIONS: A majority of Kenyan HCP surveyed reported receiving the influenza A(H1N1)pdm09 vaccine. Because of low circulation of influenza A(H1N1)pdm09 virus during the study period, vaccine effectiveness could not be determined. The findings of increased ARI events and missed days of work among vaccinated HCP were likely confounded by vaccine-seeking behavioral factors. |
Number of antigens in early childhood vaccines and neuropsychological outcomes at age 7-10 years
Iqbal S , Barile JP , Thompson WW , Destefano F . Pharmacoepidemiol Drug Saf 2013 22 (12) 1263-70 PURPOSE: Concerns have been raised that children may be receiving too many immunizations under the recommended schedule in the USA. We used a publicly available dataset to evaluate the association between antibody-stimulating proteins and polysaccharides from early childhood vaccines and neuropsychological outcomes at age 7-10 years. METHODS: Children aged 7-10 years from four managed care organizations underwent standardized tests for domain-specific neuropsychological outcomes: general intellectual function, speech and language, verbal memory, attention and executive function, tics, achievement, visual spatial ability, and behavior regulation. Vaccination histories up to 24 months of age were obtained from medical charts, electronic records, and parents' records. Logistic regressions and structural equation modeling (SEM) were used to determine associations between total antigens up to 7, 12, and 24 months and domain-specific outcomes. RESULTS: On average, children (N = 1047) received 7,266, 8,127, and 10,341 antigens by ages 7, 12, and 24 months, respectively. For adjusted analyses, an increase (per 1,000) in the number of antigens was not associated with any neuropsychological outcomes. Antigen counts above the 10th percentile, compared with lower counts, were also not associated with any adverse outcomes. However, children with higher antigen counts up to 24 months performed better on attention and executive function tests (odds ratio for lower scores = 0.51, 95% confidence interval = 0.26, 0.99). Similar results were found with SEM analysis (b = 0.08, p = 0.02). CONCLUSIONS: We did not find any adverse associations between antigens received through vaccines in the first two years of life and neuropsychological outcomes in later childhood. |
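[Editor's note: as a sketch of how an odds ratio "per 1,000 antigens" is obtained from a logistic model, the snippet below fits the same kind of regression on simulated data; the study's actual models also adjust for covariates, and none of the values here come from the paper.]

# Sketch: OR per 1,000-antigen increase for a binary low-score outcome.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1047
antigens = rng.normal(10341, 1500, n)            # cumulative antigens by 24 months
low_score = rng.binomial(1, 0.1, n)              # 1 = low neuropsychological score

X = sm.add_constant(antigens / 1000.0)           # scale so the OR is per 1,000 antigens
fit = sm.Logit(low_score, X).fit(disp=0)
or_per_1000 = np.exp(fit.params[1])
ci = np.exp(fit.conf_int()[1])                   # 95% CI on the OR scale
print(f"OR per 1,000 antigens = {or_per_1000:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")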
Population immunity to polioviruses in the context of a large-scale wild poliovirus type 1 outbreak in Tajikistan, 2010
Khetsuriani N , Pallansch MA , Jabirov S , Saparova N , Oberste MS , Wannemuehler K , Ursu P , Wassilak S , Martin R . Vaccine 2013 31 (42) 4911-6 BACKGROUND: A serosurvey to evaluate population immunity to polioviruses (PVs) in the context of the importation-related wild PV1 outbreak in Tajikistan in 2010 (461 confirmed cases among children and young adults) was conducted. METHODS: Serum specimens from a nationwide sample of 1-24 year-old persons selected through stratified cluster sampling (n = 2447) were tested for neutralizing antibodies to all three PV types. Samples with titers <1:8 were considered seronegative. The serosurvey was conducted during the interval after mOPV1 supplementary immunization activities (SIAs) and before tOPV SIAs (targeting ages ≤15 years) implemented to control the outbreak. In the absence of pre-outbreak specimens, results for PV3 were used as a proxy for pre-outbreak PV1 immunity patterns. RESULTS: Overall, PV1 seroprevalence was 98.9%, PV2 seroprevalence was 98.8%, and PV3 seroprevalence was 86.9%. PV1 and PV2 seroprevalence exceeded 95% in all age groups and regions. PV3 seroprevalence was <90% in all age groups and regions, except 15-19 year-olds (91.7%) and Dushanbe (90.0%). PV3 seroprevalence was lowest among 1-4 (82.7%) and 5-9 (84.4%) year-olds, particularly among 1-4 year-olds in Kurgan-Tube (76.3%) and RRS (80.0%) regions. Birth cohorts immunized only through routine services (ages, 1-7 years) had lower PV3 seroprevalence than birth cohorts targeted by the SIAs during 1995-2002 (8-19 years): 82.5% versus 89.3%, p<0.001. CONCLUSIONS: Suboptimal (<90%) PV3 seroprevalence across a wide age range suggests the outbreak resulted from accumulation of susceptibles due to suboptimal coverage over a long time period, particularly in the birth cohorts immunized only through routine services and in areas where the outbreak began (Kurgan-Tube and RRS). High PV1 seroprevalence indicates that mOPV1 SIAs with an expanded target age (≤15 years) succeeded in closing the immunity gap and that ongoing WPV1 transmission is unlikely. To accelerate outbreak control in areas that have been polio-free for a long time, expanding the SIA target age should be considered. |
Potential influence of seasonal influenza vaccination requirement versus traditional vaccine promotion strategies on unvaccinated healthcare personnel
Thompson MG , McIntyre AF , Naleway AL , Black C , Kennedy ED , Ball S , Walker DK , Henkle EM , Gaglani MJ . Vaccine 2013 31 (37) 3915-21 In a prospective cohort study of 1670 healthcare personnel (HCP) providing direct patient care at Scott & White Healthcare in Texas and Kaiser Permanente Northwest in Oregon and Washington, we examined the potential impact of twelve vaccine promotion strategies on the likelihood of being vaccinated. Internet-based surveys were conducted at enrollment (Fall 2010) and post-season (Spring 2011), asking HCP whether twelve vaccination promotion strategies would make them "much less" to "much more" likely to be vaccinated next season (on a 5-point Likert scale). Overall, 366 of 1670 HCP (22%) were unvaccinated. Half (50%) of unvaccinated HCP self-reported that a vaccination requirement would make them more likely to be vaccinated, and most (62%) identified at least one strategy other than a vaccination requirement that would make them more likely to be vaccinated. In subgroups of unvaccinated HCP with specific barriers to vaccination, about one in three (range = 27-35%) indicated that an intervention targeting their specific vaccination barrier would increase the likelihood that they would be vaccinated. However, in all cases, significantly more unvaccinated HCP reported that a vaccination requirement would increase the likelihood of vaccination than reported that a targeted intervention would have this effect (range of difference scores = +11% to +23%). |
Geographic and temporal trends in antimicrobial nonsusceptibility in Streptococcus pneumoniae in the post-vaccine era in the United States
Link-Gelles R , Thomas A , Lynfield R , Petit S , Schaffner W , Harrison L , Farley MM , Aragon D , Nicols M , Kirley PD , Zansky S , Jorgensen J , Juni BA , Jackson D , Moore M , Lipsitch M . J Infect Dis 2013 208 (8) 1266-73 BACKGROUND: After introduction of pneumococcal conjugate vaccine in the U.S. in 2000, increases in antibiotic-nonsusceptible non-vaccine serotypes were observed. We sought to understand whether these increases were driven primarily by vaccine or antibiotic use. METHODS: Using active surveillance data, we evaluated geographic and temporal differences in serotype distribution and within-serotype nonsusceptibility during 2000-2009. We compared the proportions of nonsusceptibility to penicillin and erythromycin by study site after standardizing differences across time, place, and serotype by regressing standardized versus crude proportions. A regression slope (RS) approaching zero indicates greater importance of the standardizing factor. RESULTS: We evaluated 31,506 isolates. During 2000-2006, geographic differences in nonsusceptibility were better explained by within-serotype prevalence of nonsusceptibility (RS 0.32, 95% CI 0.08-0.55 for penicillin) than by geographic differences in serotype distribution (RS 0.71, 95% CI 0.44-0.97). During 2007-2009, differences in serotype distribution became more important for penicillin (within-serotype RS 0.52, 95% CI 0.11-0.93; serotype distribution RS 0.57, 95% CI 0.14-1.0). CONCLUSIONS: Differential nonsusceptibility within individual serotypes accounts for most geographic variation in nonsusceptibility, suggesting that selective pressure from antibiotic use, rather than differences in serotype distribution, mainly determines nonsusceptibility patterns. Recent trends suggest geographic differences in serotype distribution may be starting to affect the prevalence of nonsusceptibility, possibly due to the decrease in the number of nonsusceptible serotypes. |
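[Editor's note: a toy illustration of the standardization logic, with invented numbers for two serotypes at three sites. When within-serotype nonsusceptibility is similar everywhere and only the serotype mix differs, standardizing to a common serotype distribution flattens the site differences, so the regression slope approaches zero, signaling that the standardizing factor drives the variation.]

# Sketch: regress serotype-standardized on crude nonsusceptibility proportions.
import numpy as np

# rows = 3 sites, cols = 2 serotypes
mix  = np.array([[0.7, 0.3], [0.5, 0.5], [0.3, 0.7]])   # serotype mix per site
p_ns = np.array([[0.30, 0.10], [0.28, 0.12], [0.31, 0.09]])  # within-serotype nonsusceptibility
national_mix = mix.mean(axis=0)

crude = (mix * p_ns).sum(axis=1)                 # observed proportion per site
standardized = (national_mix * p_ns).sum(axis=1) # serotype mix held fixed nationally

slope = np.polyfit(crude, standardized, 1)[0]
print(f"regression slope = {slope:.2f}  (near 0: serotype mix drives the differences)")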
Childhood intussusception: a literature review
Jiang J , Jiang B , Parashar U , Nguyen T , Bines J , Patel MM . PLoS One 2013 8 (7) e68482 BACKGROUND: Postlicensure data have identified a causal link between rotavirus vaccines and intussusception in some settings. As rotavirus vaccines are introduced globally, monitoring intussusception will be crucial for ensuring safety of the vaccine programs. METHODS: To obtain updated information on background rates and clinical management of intussusception, we reviewed studies of intussusception in children <18 years of age published since 2002. We assessed the incidence of intussusception by month of life among children <1 year of age, seasonality, method of diagnosis, treatment, and case-fatality. FINDINGS: We identified 82 studies from North America, Asia, Europe, Oceania, Africa, Eastern Mediterranean, and Central & South America that reported a total of 44,454 intussusception events. The mean incidence of intussusception was 74 per 100,000 (range: 9-328) among children <1 year of age, with peak incidence among infants 5-7 months of age. No seasonal patterns were observed. A radiographic modality was used to diagnose intussusception in over 95% of the cases in all regions except Africa, where clinical findings or surgery were used in 65% of the cases. Surgical rates were substantially higher in Africa (77%) and Central and South America (86%) compared to other regions (13-29%). Case-fatality also was higher in Africa (9%) compared to other regions (<1%). The primary limitation of this review relates to the heterogeneity in intussusception surveillance across different regions. CONCLUSION: This review of the intussusception literature from the past decade provides pertinent information that should facilitate implementation of intussusception surveillance for monitoring the postlicensure safety of rotavirus vaccines. |
Cost-effectiveness of Haemophilus influenzae Type b conjugate vaccine in low- and middle-income countries: regional analysis and assessment of major determinants
Griffiths UK , Clark A , Hajjeh R . J Pediatr 2013 163 S50-S59.e9 OBJECTIVES: To estimate the cost-effectiveness of Haemophilus influenzae type b (Hib) conjugate vaccine in low- and middle-income countries and to identify the model variables that are most important to the results. STUDY DESIGN: A static decision tree model was developed to predict incremental costs and health impacts. Estimates were generated for 4 country groups: countries eligible for funding by the GAVI Alliance in Africa and Asia, lower middle-income countries, and upper middle-income countries. Values, including disease incidence, case fatality rates, and treatment costs, were based on international country estimates and the scientific literature. RESULTS: From the societal perspective, the estimated probability that Hib conjugate vaccine is cost saving is 34% in GAVI-eligible African countries and 53% in GAVI-eligible Asian countries. In middle-income countries, costs per discounted disability-adjusted life year (DALY) averted are between US$37 and US$733. Variation in vaccine prices and in risks of meningitis sequelae and mortality explains most of the difference in results. For all country groups, disease incidence causes the largest part of the uncertainty in the results. CONCLUSIONS: Hib conjugate vaccine is cost saving or highly cost-effective in low- and middle-income settings. This conclusion is especially influenced by the recent decline in Hib conjugate vaccine prices and new data revealing the high costs of lost productivity associated with meningitis sequelae. |
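[Editor's note: once a decision-tree model has produced incremental costs and DALYs averted, the headline ratio is simple arithmetic; a minimal sketch with illustrative inputs, not values from the paper.]

# Sketch: incremental cost per discounted DALY averted.
vaccine_cost      = 4_000_000   # program cost, US$ (illustrative)
treatment_averted = 2_500_000   # treatment + productivity costs averted, US$
dalys_averted     = 12_000      # discounted DALYs averted

incremental_cost = vaccine_cost - treatment_averted
icer = incremental_cost / dalys_averted      # negative result would mean cost saving
print(f"cost per DALY averted = US${icer:.0f}")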
Detection of novel rotavirus strain by vaccine postlicensure surveillance
Weinberg GA , Teel EN , Mijatovic-Rustempasic S , Payne DC , Roy S , Foytich K , Parashar UD , Gentsch JR , Bowen MD . Emerg Infect Dis 2013 19 (8) 1321-3 Surveillance for rotavirus-associated diarrhea after implementation of rotavirus vaccination can assess vaccine effectiveness and identify disease-associated genotypes. During active vaccine postlicensure surveillance in the United States, we found a novel rotavirus genotype, G14P[24], in a stool sample from a child who had diarrhea. Unusual rotavirus strains may become more prevalent after vaccine implementation. |
Duration of immunity to norovirus gastroenteritis
Simmons K , Gambhir M , Leon J , Lopman B . Emerg Infect Dis 2013 19 (8) 1260-7 The duration of immunity to norovirus (NoV) gastroenteritis has been believed to be from 6 months to 2 years. However, several observations are inconsistent with this short period. To gain better estimates of the duration of immunity to NoV, we developed a mathematical model of community NoV transmission. The model was parameterized from the literature and also fit to age-specific incidence data from England and Wales by using maximum likelihood. We developed several scenarios to determine the effect of unknowns regarding transmission and immunity on estimates of the duration of immunity. In the various models, duration of immunity to NoV gastroenteritis was estimated at 4.1 (95% CI 3.2-5.1) to 8.7 (95% CI 6.8-11.3) years. Moreover, we calculated that children (<5 years) are much more infectious than older children and adults. If a vaccine can achieve protection for the duration of natural immunity indicated by our results, its potential health and economic benefits could be substantial. |
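[Editor's note: the core idea is that the duration of immunity D enters a transmission model as a waning rate 1/D, which is then tuned until model incidence matches observed incidence. A minimal SIRS-style sketch of that idea follows; the published model is age-structured and fit by maximum likelihood, and every parameter below is illustrative.]

# Minimal SIRS sketch: immunity wanes at rate 1/D (D = duration of immunity, years).
import numpy as np
from scipy.integrate import odeint

gamma = 365.0 / 2.0       # recovery rate per year (~2-day infectious period)
beta = 2.0 * gamma        # transmission rate, so R0 = beta/gamma = 2
D = 5.0                   # candidate duration of immunity, years

def sirs(y, t):
    s, i, r = y
    return [-beta * s * i + r / D,    # susceptibles replenished by waning
            beta * s * i - gamma * i,
            gamma * i - r / D]

t = np.linspace(0, 50, 5001)
s, i, r = odeint(sirs, [0.99, 0.01, 0.0], t).T
print(f"endemic prevalence of infection ~ {i[-1]:.5f}")
# Fitting would adjust D (and beta) until modeled incidence matches the
# age-specific incidence data.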
One-step real-time PCR assay for detection and quantitation of hepatitis D virus RNA.
Kodani M , Martin A , Mixson-Hayden T , Drobeniuc J , Gish RR , Kamili S . J Virol Methods 2013 193 (2) 531-5 Hepatitis D virus (HDV) is a defective virus that requires hepatitis B virus (HBV) surface antigen (HBsAg) for its assembly. HBV-infected individuals co-infected or superinfected with HDV often present with more severe hepatitis, progress faster to liver disease, and have a higher mortality rate than individuals infected with HBV alone. Currently, there are no commercially available clinical tests for the detection and quantitation of HDV RNA in the United States. A one-step TaqMan quantitative reverse transcription-polymerase chain reaction (qRT-PCR) assay was developed for detection of HDV RNA, with primers targeting the region just downstream from the HDV antigen gene. The assay has the potential to detect all eight HDV genotypes. A quantifiable synthetic RNA control was also developed for use in the determination of HDV RNA titers in clinical samples. The limit of detection of this assay is 7.5 × 10^2 HDV RNA copies/ml with a dynamic range of six logs. Most clinical specimens tested (40/41) fell within the linear range of the assay. The median HDV RNA titer of the tested specimens was 6.24 × 10^6 copies/ml, with a range of 8.52 × 10^3 to 1.79 × 10^9 copies/ml. Of 132 anti-HDV-positive specimens, 41 (31.1%) were positive for HDV RNA. |
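[Editor's note: the abstract does not give the assay's standard-curve parameters; the sketch below shows the conventional Ct-versus-log10-copies calibration that a quantifiable synthetic RNA control supports, with invented Ct values spanning a six-log range.]

# Sketch: quantitation against a synthetic RNA standard curve.
import numpy as np

log10_copies = np.array([3, 4, 5, 6, 7, 8])          # six-log dynamic range
ct_standard  = np.array([33.1, 29.6, 26.2, 22.8, 19.3, 15.9])  # invented Ct values

slope, intercept = np.polyfit(log10_copies, ct_standard, 1)
efficiency = 10 ** (-1 / slope) - 1                  # ~1.0 means 100% PCR efficiency

def copies_per_ml(ct, dilution_factor=1.0):
    # invert the calibration: ct = slope*log10(copies) + intercept
    return dilution_factor * 10 ** ((ct - intercept) / slope)

print(f"PCR efficiency ~ {efficiency:.2f}")
print(f"sample at Ct 24.5 ~ {copies_per_ml(24.5):.2e} copies/ml")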
Sensitive and specific quantitative detection of rotavirus A by one-step real-time reverse transcription-PCR assay without antecedent double-stranded-RNA denaturation.
Mijatovic-Rustempasic S , Tam KI , Kerin TK , Lewis JM , Gautam R , Quaye O , Gentsch JR , Bowen MD . J Clin Microbiol 2013 51 (9) 3047-54 A real-time reverse transcription-polymerase chain reaction (qRT-PCR) assay using the recombinant thermostable Thermus thermophilus (rTth) enzyme was developed to detect and quantify rotavirus A (RVA). By using rTth polymerase, significant improvement was achieved over existing real-time RT-PCR assays, which require denaturation of the RVA double-stranded RNA (dsRNA) prior to assay set-up. Using a dsRNA transcript for segment 7, which encodes the assay target NSP3 gene, the limit of detection for the improved assay was calculated to be approximately 1 genome copy per reaction. The NSP3 qRT-PCR assay was validated using a panel of 1906 stool samples, 23 reference RVA strains, and 14 non-target enteric virus samples. The assay detected a diverse range of RVA genotypes and did not detect other enteric viruses, demonstrating analytical sensitivity and specificity for RVA when testing stool samples. A XenoRNA internal process control was introduced and detected in a multiplexed qRT-PCR format. Because it does not require an antecedent dsRNA denaturation step, this assay reduces the possibility of sample cross-contamination and requires less hands-on time than other published qRT-PCR protocols for RVA detection. |
Real-time RT-PCR assay to differentiate clades of H5N1 avian influenza viruses circulating in Vietnam.
Kis Z , Jones J , Creanga A , Ferdinand K , Inui K , Gerloff N , Davis CT , Nguyen T , Donis RO . J Virol Methods 2013 193 (2) 452-8 Continued circulation and geographical expansion of highly pathogenic avian influenza H5N1 virus have led to the emergence of numerous clades in Vietnam. Although viral RNA sequencing and phylogenetic analysis are the gold standard for H5N1 HA clade designation, limited sequencing capacity in many laboratories precludes rapid H5N1 clade identification and detection of novel viruses. Therefore, a Taqman real-time RT-PCR assay for rapid differentiation of the four major H5N1 clades detected in Vietnam was developed. Using HA sequence alignments of clades 1.1, 2.3.2.1, 2.3.4, and 7 viruses, primers and FAM-labeled probes were designed to target conserved regions characteristic of each clade. The assay was optimized and evaluated using circulating clades of H5N1 collected in Vietnam from 2007 to 2012 and shown to be both sensitive and specific for the differentiation of the four H5N1 clades. The assay provides a useful tool for screening of large specimen collections for HA gene sequencing and phylogenetic analysis and for the rapid identification of molecular clade signatures to support outbreak investigations and surveillance activities. Finally, this assay may be useful to monitor for the emergence of novel or variant clades of H5N1 in Vietnam in the future or in other countries where these particular clades may circulate. |
Rate constants for the gas-phase reactions of ozone and nitrate radicals with the sesquiterpenes: Valencene and farnesol
Ham JE . Int J Chem Kinet 2013 45 (8) 508-514 Sesquiterpenes are constituents of a variety of essential oils that are used in flavorings, perfumes, personal care, and cleaning products. Two sesquiterpenes that are commonly used as indoor fragrances are valencene and farnesol. Knowing the reaction rate constants of these chemicals with ozone (O3) and the nitrate radical (NO3•) is an important factor in determining their fate indoors. In this study, the bimolecular rate constants kO3+valencene = (0.35 ± 0.9) × 10^-16, kO3+farnesol = (21 ± 5.2) × 10^-16, kNO3•+valencene = (7.9 ± 2.0) × 10^-12, and kNO3•+farnesol = (44 ± 11) × 10^-12 cm^3 molecule^-1 s^-1 were measured using the relative rate technique at 297 ± 3 K and 1 atm total pressure. Using the rate constants reported here and measured/modeled indoor concentrations of O3 and NO3• (20 ppb and 1 ppt, respectively), pseudo-first-order rate constants k'O3+valencene = 0.06, k'O3+farnesol = 3.8, k'NO3•+valencene = 0.7, and k'NO3•+farnesol = 3.9 h^-1 were determined. |
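[Editor's note: as a check on the unit conversion, the short calculation below turns the central O3 + farnesol rate constant and the stated 20-ppb indoor O3 level into the reported pseudo-first-order rate constant of roughly 3.8 h^-1; the air number density at ~297 K and 1 atm is the only constant added beyond the abstract.]

# Sketch: k' = k * [oxidant], with ppb converted to molecule cm^-3.
N_AIR = 2.46e19                        # air number density at ~297 K, molecule cm^-3

k_o3_farnesol = 21e-16                 # cm^3 molecule^-1 s^-1 (central value above)
o3 = 20e-9 * N_AIR                     # 20 ppb -> ~4.9e11 molecule cm^-3

k_prime = k_o3_farnesol * o3 * 3600    # convert s^-1 to h^-1
print(f"k' = {k_prime:.1f} h^-1, lifetime ~ {1 / k_prime:.2f} h")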
System-based identification of toxicity pathways associated with multi-walled carbon nanotube-induced pathological responses
Snyder-Talkington BN , Dymacek J , Porter DW , Wolfarth MG , Mercer RR , Pacurari M , Denvir J , Castranova V , Qian Y , Guo NL . Toxicol Appl Pharmacol 2013 272 (2) 476-89 The fibrous shape and biopersistence of multi-walled carbon nanotubes (MWCNT) have raised concern over their potential toxicity after pulmonary exposure. As in vivo exposure to MWCNT produced a transient inflammatory and progressive fibrotic response, this study sought to identify significant biological processes associated with lung inflammation and fibrosis pathology data, based upon whole genome mRNA expression, bronchoalveolar lavage scores, and morphometric analysis from C57BL/6J mice exposed by pharyngeal aspiration to 0, 10, 20, 40, or 80 μg MWCNT at 1, 7, 28, or 56 days post-exposure. Using a novel computational model employing non-negative matrix factorization and Markov chain Monte Carlo simulation, significant biological processes with expression similar to MWCNT-induced lung inflammation and fibrosis pathology data in mice were identified. A subset of genes in these processes was determined to be functionally related to either fibrosis or inflammation by Ingenuity Pathway Analysis and was used to determine potential significant signaling cascades. Two genes determined to be functionally related to inflammation and fibrosis, vascular endothelial growth factor A (vegfa) and C-C motif chemokine 2 (ccl2), were confirmed by in vitro studies of mRNA and protein expression in small airway epithelial cells exposed to MWCNT as concordant with in vivo expression. This study showed that the novel computational model can identify biological processes strongly associated with the pathology of lung inflammation and fibrosis, as well as potential toxicity signaling pathways and mechanisms of MWCNT exposure, which could be used in future animal studies to support human risk assessment and intervention efforts. |
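[Editor's note: the paper couples non-negative matrix factorization with MCMC simulation; purely as a flavor of the factorization step, the sketch below applies scikit-learn's generic NMF to a random placeholder matrix and correlates each component's sample-wise profile with a placeholder pathology score. This stands in for, and is not, the authors' model.]

# Sketch: NMF of a genes x samples matrix, then component-pathology correlation.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)
expression = rng.random((500, 20))         # 500 genes x 20 dose/time samples (random)

model = NMF(n_components=4, init="nndsvd", max_iter=500)
W = model.fit_transform(expression)        # gene loadings per component
H = model.components_                      # component profiles across samples

pathology = rng.random(20)                 # e.g., fibrosis score per sample (random)
corr = [np.corrcoef(h, pathology)[0, 1] for h in H]
print("component-pathology correlations:", np.round(corr, 2))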
Laboratory faceseal leakage evaluation of N95 filtering facepiece respirators against nanoparticles and “all size” particles
Zhuang Z , Bergman MS , Eimer BC , Shaffer RE . J Occup Environ Hyg 2013 10 (9) 496-504 National Institute for Occupational Safety and Health (NIOSH)-certified N95 filtering facepiece respirators (FFRs) are used for respiratory protection in some workplaces handling engineered nanomaterials. Previous NIOSH research has focused on filtration performance against nanoparticles. This paper is the first NIOSH study using human test subjects to compare N95 FFR faceseal leakage (FSL) performance against nanoparticles and "all size" particles. In this study, estimates of FSL were obtained from fit factor (FF) measurements from nine test subjects who participated in previous fit test studies. These data were analyzed to compare values obtained by: (1) using the PortaCount Plus (8020A, TSI, Inc., MN, USA) alone (measurable particle size range 20 nm to > 1,000 nm, hereby referred to as the "all size particles test"), and (2) using the PortaCount Plus with the N95-Companion™ accessory (8095, TSI, Inc., MN, USA) (negatively charged particles, size range ~40 to 60 nm, hereby referred to as the "nanoparticles test"). Log-transformed FF values were compared for the "all size particles test" and "nanoparticles test" using one-way analysis of variance tests (significant at P < 0.05). For individual FFR models, the geometric mean (GM) FF using the "nanoparticles test" was the same as or higher than the GM FF using the "all size particles test". For all three FFR models combined, the GM FF using the "nanoparticles test" was significantly higher than the GM FF using the "all size particles test" (P < 0.05). These data suggest that FSL for negatively charged ~40-60 nm nanoparticles is not greater than the FSL for the larger distribution of charged and uncharged 20 to > 1,000 nm particles. |
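[Editor's note: geometric means of fit factors are computed on the log scale, matching the log-transformed ANOVA described above; the fit factors below are hypothetical.]

# Sketch: geometric mean (GM) fit factor via the log scale.
import numpy as np

ff = np.array([58, 112, 25, 204, 77, 143, 39, 96, 180])  # hypothetical fit factors
gm = np.exp(np.mean(np.log(ff)))
print(f"GM fit factor = {gm:.0f}")
# The study's ANOVA compares mean log(FF) between the "all size" and
# "nanoparticles" test conditions.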
Modified MALDI MS fatty acid profiling for bacterial identification
Voorhees KJ , Jensen KR , McAlpin CR , Rees JC , Cody R , Ubukata M , Cox CR . J Mass Spectrom 2013 48 (7) 850-5 Bacterial fatty acid profiling is a well-established technique for bacterial identification. Ten bacteria were analyzed using both positive- and negative-ion modes with a modified matrix-assisted laser desorption ionization mass spectrometry (MALDI MS) approach using CaO as a matrix replacement (metal oxide laser ionization MS (MOLI MS)). The results show that reproducible lipid cleavage similar to thermal in situ tetramethyl ammonium hydroxide saponification/derivatization had occurred. Principal component analysis showed that replicates from each organism grouped in a unique space. Cross validation (CV) of spectra from both ionization modes resulted in greater than 94% validation of the data. When CV results were compared for the two ionization modes, negative-ion data produced a superior outcome. MOLI MS provides clinicians a rapid, reproducible and cost-effective bacterial diagnostic tool. |
Nanoparticle filtration performance of filtering facepiece respirators and canister/cartridge filters
Rengasamy S , Ann RB , Szalajda J . J Occup Environ Hyg 2013 10 (9) Respiratory protection offered by a particulate respirator is a function of the filter efficiency and face seal leakage. A previous study in our laboratory measured the filter penetration and total inward leakage (TIL) of 20-1000 nm size particles for N95 filtering facepiece respirators (FFRs) using a breathing manikin. The results showed relatively higher filter penetration and TIL values under different leak sizes and flow rates at the most penetrating particle size (MPPS), ~45 nm for electrostatic FFRs and ~150 nm for the same FFRs after charge removal. This indicates an advantage of mechanical filters over electrostatic filters rated for similar filter efficiencies in providing respiratory protection in nanoparticle workplaces. To better understand the influence of the MPPS, the filtration performance of one commonly used N95 FFR model, one N100 FFR model, and four P100 canister/cartridge models was measured with monodisperse NaCl aerosols and with the polydisperse NaCl aerosols employed in the National Institute for Occupational Safety and Health (NIOSH) certification test method. As expected, the polydisperse aerosol penetration was below 5% for the N95 FFR, and below 0.03% for the N100 FFR and P100 canister/cartridge filters. Monodisperse aerosol penetration results showed an MPPS of ~40 nm for both the N95 and N100 FFRs. All four P100 canister/cartridge filters had an MPPS of ≥150 nm, similar to expectations for mechanical filters. The P100 canister/cartridge filters showed lower penetration values for different size nanoparticles than the N100 FFR. The results indicate that a mechanical filter would offer relatively higher filtration performance for nanoparticles than an electrostatic counterpart rated for the same filter efficiency. Overall, the results obtained in the study suggest that the MPPS should be considered as a key factor in the development of respirator standards and recommendations for protection against nanoparticles. |
Biodiesel versus diesel exposure: enhanced pulmonary inflammation, oxidative stress, and differential morphological changes in the mouse lung
Yanamala N , Hatfield MK , Farcas MT , Schwegler-Berry D , Hummer JA , Shurin MR , Birch ME , Gutkin DW , Kisin E , Kagan VE , Bugarski AD , Shvedova AA . Toxicol Appl Pharmacol 2013 272 (2) 373-83 The use of biodiesel (BD) or its blends with petroleum diesel (D) is considered to be a viable approach to reduce occupational and environmental exposures to particulate matter (PM). Due to its lower particulate mass emissions compared to D, use of BD is thought to alleviate adverse health effects. Considering that BD fuel is mainly composed of unsaturated fatty acids, we hypothesized that BD exhaust particles could induce pronounced adverse outcomes due to their ability to readily oxidize. The main objective of this study was to compare the effects of particles generated by an engine fueled with neat BD and with neat petroleum-based D. Biomarkers of tissue damage and inflammation were significantly elevated in lungs of mice exposed to BD particulates. Additionally, BD particulates caused a significant accumulation of oxidatively modified proteins and an increase in 4-hydroxynonenal. The up-regulation of inflammatory cytokines/chemokines/growth factors was higher in lungs upon BD particulate exposure. Histological evaluation of lung sections indicated the presence of a lymphocytic infiltrate and impaired clearance, with prolonged retention of BD particulate in pigment-laden macrophages. Taken together, these results clearly indicate that BD exhaust particles could exert more toxic effects compared to D. |
Comparison of 2 assays for diagnosing rotavirus and evaluating vaccine effectiveness in children with gastroenteritis
Tate JE , Mijatovic-Rustempasic S , Tam KI , Lyde FC , Payne DC , Szilagyi P , Edwards K , Staat MA , Weinberg GA , Hall CB , Chappell J , McNeal M , Gentsch JR , Bowen MD , Parashar UD . Emerg Infect Dis 2013 19 (8) 1245-52 We compared rotavirus detection rates in children with acute gastroenteritis (AGE) and in healthy controls using enzyme immunoassays (EIAs) and semiquantitative real-time reverse transcription PCR (qRT-PCR). We calculated rotavirus vaccine effectiveness using different laboratory-based case definitions to determine which best identified the proportion of disease that was vaccine preventable. Of 648 AGE patients, 158 (24%) were EIA positive, and 157 were also qRT-PCR positive. An additional 65 (10%) were qRT-PCR positive but EIA negative. Of 500 healthy controls, 1 was EIA positive and 24 (5%) were qRT-PCR positive. Rotavirus vaccine was highly effective (84% [95% CI 71%-91%]) in EIA-positive children but offered no significant protection (14% [95% CI -105% to 64%]) in EIA-negative children for whom virus was detected by qRT-PCR alone. Children with rotavirus detected by qRT-PCR but not by EIA were not protected by vaccination, suggesting that rotavirus detected by qRT-PCR alone might not be causally associated with AGE in all patients. |
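[Editor's note: the effectiveness figures above come from comparing vaccination odds in case-patients and controls, with VE = (1 - OR) x 100; a minimal sketch with made-up counts, not the study's data.]

# Sketch: vaccine effectiveness from a case-control 2x2 table.
vacc_cases, unvacc_cases = 40, 118     # vaccinated vs unvaccinated case-patients
vacc_ctrls, unvacc_ctrls = 250, 250    # vaccinated vs unvaccinated controls

odds_ratio = (vacc_cases / unvacc_cases) / (vacc_ctrls / unvacc_ctrls)
ve = (1 - odds_ratio) * 100
print(f"VE = {ve:.0f}%")               # ~66% with these made-up counts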
Comparison of Premier Rotaclone, ProSpecT, and RIDASCREEN rotavirus enzyme immunoassay kits for detection of rotavirus antigen in stool specimens
Gautam R , Lyde F , Esona MD , Quaye O , Bowen MD . J Clin Virol 2013 58 (1) 292-4 BACKGROUND: Rotaviruses are the major cause of severe dehydrating diarrhea in children throughout the world. Enzyme immunoassays (EIAs) have been the standard method for detection of rotavirus in stool specimens since the 1980s. The World Health Organization (WHO) Rotavirus Surveillance Network has proposed including three EIA kits in the WHO-GSM (Global Management System/Systeme Mondial de Gestion) catalog for easy procurement of EIA kits by participating rotavirus surveillance network laboratories. OBJECTIVES: In this study, we conducted a comparative analysis of 3 commercially available enzyme immunoassay kits: Premier Rotaclone® (Meridian Bioscience, Inc.), ProSpecT (Oxoid, Ltd.), and RIDASCREEN® (R-biopharm AG) for rotavirus diagnostics. STUDY DESIGN: Using reverse-transcriptase PCR (RT-PCR) as the gold standard, the 3 EIA kits were evaluated by testing a stool panel consisting of 56 rotavirus-positive and 54 rotavirus-negative samples. RESULTS: The sensitivities of the Premier Rotaclone®, ProSpecT, and RIDASCREEN® kits were 76.8%, 75%, and 82.1%, respectively, but did not differ significantly. The specificity of all 3 kits was 100%. The use of RT-PCR as a gold standard lowered the observed sensitivity of all 3 EIA kits but helps to reduce equivocal results that can be seen when another EIA or other non-molecular methods are used as the reference assay in comparison studies. CONCLUSION: Our study found that all three kits are suitable for use by rotavirus surveillance programs. |
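[Editor's note: sensitivity and specificity against the RT-PCR gold standard are simple 2x2 arithmetic; the counts below are chosen to reproduce the Premier Rotaclone figures (43/56 = 76.8% sensitivity, 54/54 = 100% specificity) and the per-kit true/false splits are inferred from the reported percentages, not taken directly from the paper.]

# Sketch: sensitivity and specificity from a 2x2 table vs the gold standard.
tp, fn = 43, 13          # EIA result among the 56 RT-PCR-positive samples
tn, fp = 54, 0           # EIA result among the 54 RT-PCR-negative samples

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")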
Diacetyl increases sensory innervation and substance P production in rat trachea
Goravanahally MP , Hubbs AF , Fedan JS , Kashon ML , Battelli LA , Mercer RR , Goldsmith WT , Jackson MC , Cumpston A , Frazer DG , Dey RD . Toxicol Pathol 2013 42 (3) 582-90 Inhalation of diacetyl, a butter flavoring, causes airway responses potentially mediated by sensory nerves. This study examines diacetyl-induced changes in sensory nerves of tracheal epithelium. Rats (n = 6/group) inhaled 0-, 25-, 249-, or 346-ppm diacetyl for 6 hr. Tracheas and vagal ganglia were removed 1 day postexposure and labeled for substance P (SP) or protein gene product 9.5 (PGP9.5). Vagal ganglia neurons projecting to airway epithelium were identified by axonal transport of fluorescent microspheres intratracheally instilled 14 days before diacetyl inhalation. End points were SP and PGP9.5 nerve fiber density (NFD) in tracheal epithelium and SP-positive neurons projecting to the trachea. PGP9.5-immunoreactive NFD decreased in foci with denuded epithelium, suggesting loss of airway sensory innervation. However, in the intact epithelium adjacent to denuded foci, SP-immunoreactive NFD increased from 0.01 ± 0.002 in controls to 0.05 ± 0.01 after exposure to 346-ppm diacetyl. In vagal ganglia, SP-positive airway neurons increased from 3.3 ± 3.0% in controls to 25.5 ± 6.6% after inhaling 346-ppm diacetyl. Thus, diacetyl inhalation increases SP levels in sensory nerves of airway epithelium. Because SP release in airways promotes inflammation and activation of sensory nerves mediates reflexes, neural changes may contribute to flavorings-related lung disease pathogenesis. |
Using mPINC data to measure breastfeeding support for hospital employees
Allen JA , Belay B , Perrine CG . J Hum Lact 2013 30 (1) 97-101 BACKGROUND: Employer support is important for mothers, as returning to work is a common reason for discontinuing breastfeeding. This article explores support available to breastfeeding employees of hospitals that provide maternity care. OBJECTIVES: This study aimed to describe the prevalence of 7 different types of worksite support and changes in these supports available to breastfeeding employees at hospitals that provide maternity care from 2007 to 2011. METHODS: Hospital data from the 2007, 2009, and 2011 Centers for Disease Control and Prevention Survey on Maternity Practices in Infant Nutrition and Care (mPINC) were analyzed. Survey respondents were asked if the hospital provides any of the following supports to hospital staff: (1) a designated room to express milk, (2) on-site child care, (3) an electric breast pump, (4) permission to use existing work breaks to express milk, (5) a breastfeeding support group, (6) lactation consultant/specialist available for consult, and (7) paid maternity leave other than accrued vacation or sick leave. This study was exempt from ethical approval because it was a secondary analysis of a publicly available dataset. RESULTS: Of the 7 worksite supports in hospitals measured, 6 increased and 1 decreased from 2007 to 2011. Across all survey years, more than 70% of hospitals provided supports for expressing breast milk, whereas less than 15% provided direct access to the breastfeeding child through on-site child care, and less than 35% offered paid maternity leave. Results differed by region and hospital size and type. In 2011, only 2% of maternity hospitals provided all 7 worksite supports; 40% provided 5 or more. CONCLUSION: The majority of maternity care hospitals (> 70%) offer breastfeeding supports that allow employees to express breast milk. Supports that provide direct access to the breastfeeding child, which would allow employees to breastfeed at the breast, and access to breastfeeding support groups are much less frequent than other supports, suggesting opportunities for improvement. |
Maternal prepregnancy weight status and associations with children's development and disabilities at kindergarten
Hinkle SN , Sharma AJ , Kim SY , Schieve LA . Int J Obes (Lond) 2013 37 (10) 1344-51 OBJECTIVE: Obesity is prevalent among women of reproductive age, and developmental disabilities in children continue to increase. We examined associations between mothers' prepregnancy body mass index (BMI) and children's physical and developmental disabilities, as well as objective measures of their reading and math skills and fine and gross motor function. METHODS: We used the Early Childhood Longitudinal Study-Birth Cohort (ECLS-B) (n=5200), a cohort of children born in 2001 and followed until kindergarten. Children were classified according to maternal prepregnancy BMI (kg/m2): underweight (BMI <18.5), normal weight (BMI 18.5-24.9), overweight (BMI 25.0-29.9), obese class I (BMI 30.0-34.9), and obese class II/III (BMI ≥35.0). Parent reports of doctor-diagnosed disabilities were collected up to kindergarten and classified as learning and behavioral or physical. Children's reading and math skills and fine and gross motor function were assessed at kindergarten with standardized tests. Linear and modified logistic regression models were adjusted for sociodemographic and enrichment variables as well as children's sex, age, and year of kindergarten entry. Additional adjustment for current child BMI was performed in separate models. All data were weighted to be nationally representative of children born in 2001. RESULTS: Compared with children of normal weight mothers, children born to obese class II/III mothers had an increased risk of learning or behavioral disabilities [risk ratio (RR) 1.67 (95% confidence interval (CI) 1.27, 2.21)], but not physical disabilities [RR 0.57 (95% CI 0.27, 1.22)]. Gross (P<0.001), but not fine (P=0.06), motor function was significantly associated with maternal BMI, although the gross motor association was attenuated after adjustment for current child BMI (P=0.05). Children's reading scores (P=0.01), but not math scores (P=0.11), were significantly associated with maternal BMI. CONCLUSIONS: In this nationally representative US cohort, children born to severely obese mothers had an increased risk for diagnosed learning and behavioral, but not physical, disabilities by kindergarten. |
Medications in the first trimester of pregnancy: most common exposures and critical gaps in understanding fetal risk
Thorpe PG , Gilboa SM , Hernandez-Diaz S , Lind J , Cragan JD , Briggs G , Kweder S , Friedman JM , Mitchell AA , Honein MA . Pharmacoepidemiol Drug Saf 2013 22 (9) 1013-8 PURPOSE: To determine which medications are most commonly used by women in the first trimester of pregnancy and identify the critical gaps in information about fetal risk for those medications. METHODS: Self-reported first-trimester medication use was assessed among women delivering liveborn infants without birth defects and serving as control mothers in two large case-control studies of major birth defects. The Teratology Information System (TERIS) expert Advisory Board ratings of quality and quantity of data available to assess fetal risk were reviewed to identify information gaps. RESULTS: Responses from 5381 mothers identified 54 different medication components used in the first trimester by at least 0.5% of pregnant women, including 31 prescription and 23 over-the-counter medications. The most commonly used prescription medication components reported were progestins from oral contraceptives, amoxicillin, progesterone, albuterol, promethazine, and estrogenic compounds. The most commonly used over-the-counter medication components reported were acetaminophen, ibuprofen, docusate, pseudoephedrine, aspirin, and naproxen. Among the 54 most commonly used medications, only two had "Good to Excellent" data available to assess teratogenic risk in humans, based on the TERIS review. CONCLUSIONS: For most medications commonly used in pregnancy, there are insufficient data available to characterize the fetal risk fully, limiting the opportunity for informed clinical decisions about the best management of acute and chronic disorders during pregnancy. Future research efforts should be directed at these critical knowledge gaps. |
Neonatal withdrawal syndrome, Michigan, 2000-2009
Hekman KA , Grigorescu VI , Cameron LL , Miller CE , Smith RA . Am J Prev Med 2013 45 (1) 113-7 BACKGROUND: Neonatal withdrawal syndrome, which is associated most frequently with opioid use in pregnancy, is an emerging public health concern, with recent studies documenting an increase in the rate of U.S. infants diagnosed. PURPOSE: This study examined neonatal withdrawal syndrome diagnoses among Michigan infants from 2000 to 2009 and compared hospital length of stay (LOS) between infants with and without the syndrome for a subset of years (2006-2009). METHODS: Michigan live birth records from 2000 to 2009 were linked with hospital discharge data to identify infants with neonatal withdrawal syndrome. Linked data were restricted to infants born between 2006 and 2009 to examine the difference in hospital LOS between infants with and without the syndrome. Multivariable regression models were constructed to examine the adjusted impact of syndrome diagnosis on infant LOS and were fit using a negative binomial distribution. Data were analyzed from July 2011 to February 2012. RESULTS: From 2000 to 2009, the overall birth rate of infants with neonatal withdrawal syndrome increased from 41.2 to 289.0 per 100,000 live births (p<0.0001). Among infants born from 2006 to 2009, the average hospital LOS for those with the syndrome was between 1.36 (95% CI=1.24, 1.49) and 5.75 (95% CI=5.41, 6.10) times longer than for infants without it. CONCLUSIONS: Diagnosis of neonatal withdrawal syndrome increased significantly in Michigan, with infants who had the syndrome requiring a significantly longer LOS than those without it. |
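For readers unfamiliar with the method: a negative binomial model is a standard choice for over-dispersed count outcomes such as length of stay, and exponentiating a coefficient yields the multiplicative LOS ratio quoted above (1.36 to 5.75 times longer). A minimal sketch with simulated data, assuming statsmodels; all variable names and values are illustrative, not from the study:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
nws = rng.binomial(1, 0.05, n)                # withdrawal-syndrome indicator (prevalence inflated for illustration)
mu = np.exp(1.0 + 1.2 * nws)                  # mean LOS in days; true ratio = exp(1.2) ~ 3.3
los = rng.negative_binomial(5, 5 / (5 + mu))  # over-dispersed counts, dispersion parameter 5

X = sm.add_constant(nws.astype(float))
fit = sm.GLM(los, X, family=sm.families.NegativeBinomial()).fit()
print(np.exp(fit.params[1]))                  # recovered LOS ratio, ~3.3
```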
Predictors of ascertainment of autism spectrum disorders across nine US communities
Pettygrove S , Pinborough-Zimmerman J , John Meaney F , Van Naarden Braun K , Nicholas J , Miller L , Miller J , Rice C . J Autism Dev Disord 2013 43 (8) 1867-79 Autism spectrum disorders (ASD) prevalence estimates derived from a single data source under-identify children and provide a biased profile of case characteristics. We analyzed characteristics of 1,919 children with ASD identified by the Autism and Developmental Disabilities Monitoring Network. Cases ascertained only at education sources were compared to those identified at health sources. Overall, 38% were education-only. Education-only cases were older at their earliest evaluation (54.5 vs. 42.0 months, p < 0.001) and at their earliest ASD diagnosis (62 vs. 53 months, p < 0.001). More lived in census blocks with lower adult education (p < 0.001). The lower educational attainment of adults in the census blocks of residence of education-only cases suggests disparities in access to clinical services, with the schools providing crucial services to many families. |
The effects of low to moderate alcohol consumption and binge drinking in early pregnancy on behaviour in 5-year-old children: a prospective cohort study on 1628 children
Skogerbo A , Kesmodel U , Denny CH , Kjaersgaard M , Wimberley T , Landro N , Mortensen E . BJOG 2013 120 (9) 1042-50 OBJECTIVE: To examine the effects of low to moderate maternal alcohol consumption and binge drinking in early pregnancy on behaviour in children at the age of 5 years. DESIGN: Prospective cohort study. SETTING: Neuropsychological testing in four Danish cities, 2003-2008. POPULATION: A total of 1628 women and their children sampled from the Danish National Birth Cohort. METHODS: Participants were sampled based on maternal alcohol drinking patterns during early pregnancy. When the children were 5 years of age the parent and teacher versions of the Strengths and Difficulties Questionnaire (SDQ) were completed by the mothers and a preschool teacher, respectively. The full statistical model included the following potential confounding factors: maternal binge drinking or low to moderate alcohol consumption, respectively; parental education; maternal IQ; prenatal maternal smoking; the child's age at testing; the child's gender; maternal age; parity; maternal marital status; family home environment; postnatal parental smoking; prepregnancy maternal body mass index (BMI); and the child's health status. MAIN OUTCOME MEASURE: Behaviour among children assessed by the SDQ parent and teacher forms. RESULTS: Adjusted for all potential confounding factors, no statistically significant associations were observed between maternal low to moderate average weekly alcohol consumption and SDQ behavioural scores (OR 1.1, 95% CI 0.5-2.3; OR 1.1, 95% CI 0.6-2.1 for the total difficulties scores) or between binge drinking and SDQ behavioural scores (OR 1.2, 95% CI 0.8-1.7; OR 0.8, 95% CI 0.6-1.2). CONCLUSION: This study observed no consistent effects of low to moderate alcohol consumption or binge drinking in early pregnancy on offspring behaviour at the age of 5 years. |
An analysis of pregnancy-related mortality in the KEMRI/CDC Health and Demographic Surveillance System in western Kenya
Desai M , Phillips-Howard PA , Odhiambo FO , Katana A , Ouma P , Hamel MJ , Omoto J , Macharia S , van Eijk A , Ogwang S , Slutsker L , Laserson KF . PLoS One 2013 8 (7) e68733 BACKGROUND: Pregnancy-related (PR) deaths are often a result of direct obstetric complications occurring at childbirth. METHODS AND FINDINGS: To estimate the burden of and characterize risk factors for PR mortality, we evaluated deaths that occurred between 2003 and 2008 among women of childbearing age (15 to 49 years) using Health and Demographic Surveillance System data in rural western Kenya. The WHO ICD definition of PR mortality was used: "the death of a woman while pregnant or within 42 days of termination of pregnancy, irrespective of the cause of death". In addition, symptoms and events at the time of death were examined using the WHO verbal autopsy methodology. Deaths were categorized as either (i) directly PR: main cause of death was ascribed as obstetric, or (ii) indirectly PR: main cause of death was non-obstetric. Of 3,223 deaths in women aged 15 to 49 years, 249 (7.7%) were PR. One-third (34%) of these were due to direct obstetric causes, predominantly postpartum hemorrhage, abortion complications, and puerperal sepsis. Two-thirds were indirect; three-quarters of these were attributable to human immunodeficiency virus (HIV)/AIDS, malaria, and tuberculosis. Significantly more women who died in lower socio-economic groups sought care from traditional birth attendants (p = 0.034), while less impoverished women were more likely to seek hospital care (p = 0.001). The PR mortality ratio over the six years was 740 (95% CI 651-838) per 100,000 live births, with no evidence of reduction over time (χ2 for linear trend = 1.07; p = 0.3). CONCLUSIONS: These data supplement the currently scanty information on the relationship between infectious diseases and poor maternal outcomes in Africa. They indicate low uptake of maternal health interventions in women dying during pregnancy and postpartum, suggesting that improved access to and increased uptake of skilled obstetric care, as well as preventive measures against HIV/AIDS, malaria, and tuberculosis among all women of childbearing age, may help to reduce pregnancy-related mortality. |
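The mortality ratio here is deaths per 100,000 live births; with 249 deaths, a simple Poisson approximation on the count reproduces an interval close to the reported 651-838. A sketch, noting that the live-birth denominator is back-calculated from the reported ratio rather than stated in the abstract:

```python
import math

deaths = 249
live_births = 33_650                    # back-calculated: 249 / 740 * 100,000
ratio = deaths / live_births * 100_000
half_width = 1.96 / math.sqrt(deaths)   # relative 95% half-width, Poisson approximation
print(f"{ratio:.0f} (95% CI {ratio*(1-half_width):.0f}-{ratio*(1+half_width):.0f}) per 100,000")
# ~740 (648-832); the paper's 651-838 presumably reflects an exact or modeled interval
```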
Convergent validity of parent-reported attention-deficit/hyperactivity disorder diagnosis: a cross-study comparison
Visser SN , Danielson ML , Bitsko RH , Perou R , Blumberg SJ . JAMA Pediatr 2013 167 (7) 674-5 Getahun and colleagues recently published a study entitled "Recent Trends in Childhood Attention-Deficit/Hyperactivity Disorder" in which they used medical records and well-defined criteria to generate the prevalence of diagnosed attention-deficit/hyperactivity disorder (ADHD) in a large Southern California administrative sample [1]. Their study contributes important geographically-based estimates of ADHD and draws conclusions about increasing ADHD prevalence within Southern California. However, the authors cited our previous research [2] to support a commonly held assertion that parent- and teacher-reports of ADHD "overestimate true prevalence." To date, parent-reported ADHD diagnosis on national health surveys has not been directly validated against a clinical standard, and thus needs further study before conclusions related to validity can be made. However, studies like that of Getahun et al. may inform the evidence base for the validity of using survey data for monitoring ADHD over time. Our research estimated that the parent-reported prevalence of ADHD for children aged 4-17 years in California was 6.2% (2007) [2], which may appear high compared to Getahun et al.'s estimate of 4.9% among children aged 5-11 in California (2001-2010). In this research letter we replicate our previous analyses of parent-reported ADHD with a sample more comparable to the Getahun study population. |
Discharge timing, outpatient follow-up, and home care of late-preterm and early-term infants
Hwang SS , Barfield WD , Smith RA , Morrow B , Shapiro-Mendoza CK , Prince CB , Smith VC , McCormick MC . Pediatrics 2013 132 (1) 101-8 OBJECTIVE: To compare the timing of hospital discharge, time to outpatient follow-up, and home care practices (breastfeeding initiation and continuation, tobacco smoke exposure, supine sleep position) of late-preterm (LPT; 34 0/7-36 6/7 weeks) and early-term (ET; 37 0/7-38 6/7 weeks) infants with those of term infants. METHODS: We analyzed 2000-2008 data from the Centers for Disease Control and Prevention's Pregnancy Risk Assessment Monitoring System. χ2 analyses were used to measure differences in maternal and infant characteristics, hospital discharge, outpatient care, and home care among LPT, ET, and term infants. We calculated adjusted risk ratios for the risk of adverse care outcomes among LPT and ET infants compared with term infants. RESULTS: In the adjusted analysis, LPT infants were less likely to be discharged early compared with term infants, whereas there was no difference for ET infants (odds ratio [OR; 95% confidence interval (CI)]: 0.65 [0.54-0.79]; 0.95 [0.88-1.02]). LPT and ET infants were more likely to have timely outpatient follow-up (1.07 [1.06-1.08]; 1.02 [1.02-1.03]), more likely to experience maternal tobacco smoke exposure (1.09 [1.05-1.14]; 1.08 [1.06-1.11]), less likely to be initially breastfed (0.95 [0.94-0.97]; 0.98 [0.97-0.98]), less likely to be breastfed for ≥10 weeks (0.88 [0.86-0.90]; 0.94 [0.93-0.96]), and less likely to be placed in a supine sleep position (0.95 [0.93-0.97]; 0.97 [0.96-0.98]). CONCLUSIONS: Given that LPT and ET infants bear an increased risk of morbidity and mortality, greater efforts are needed to ensure safe and healthy posthospitalization and home care practices for these vulnerable infants. |
Parental and home environmental facilitators of sugar-sweetened beverage consumption among overweight and obese Latino youth
Bogart LM , Cowgill BO , Sharma AJ , Uyeda K , Sticklor LA , Alijewicz KE , Schuster MA . Acad Pediatr 2013 13 (4) 348-55 OBJECTIVE: To explore parental and home environmental facilitators of sugar-sweetened beverage (SSB) and water consumption among obese/overweight Latino youth. METHODS: Semistructured interviews were conducted with 55 overweight/obese Latino youth aged 10 to 18 and 55 parents, recruited from school-based clinics and a school in one West Coast district. All youth consumed SSBs regularly and lived in a home where SSBs were available. We used qualitative methods to examine key themes around beliefs about SSBs and water, facilitators of SSB and water consumption, and barriers to reducing SSB consumption. RESULTS: A few parents and youth believed that sports drinks are healthy. Although nearly all thought that water is healthy, most parents and about half of youth thought that tap water is unsafe. About half of parent-child dyads had discordant beliefs regarding their perceptions of tap water. About half of parents believed that homemade culturally relevant drinks (eg, aguas frescas), which typically contain sugar, fruit, and water, were healthy because of their "natural" ingredients. Participants cited home availability as a key factor in SSB consumption. About half of parents set no rules about SSB consumption at home. Among those with rules, most parent-child pairs differed on their beliefs about the content of the rules, and youth reported few consequences for breaking rules. CONCLUSIONS: Obesity programs for Latino youth should address misconceptions around water and should discuss culturally relevant drinks and sports drinks as potential sources of weight gain. Health care providers can help parents set appropriate rules by educating about the risks of keeping SSBs at home. |
Changes in the concentrations of biochemical indicators of diet and nutritional status of pregnant women across pregnancy trimesters in Trujillo, Peru, 2004-2005
Horton DK , Adetona O , Aguilar-Villalobos M , Cassidy BE , Pfeiffer CM , Schleicher RL , Caldwell KL , Needham LL , Rathbun SL , Vena JE , Naeher LP . Nutr J 2013 12 80 BACKGROUND: In developing countries, deficiencies in essential micronutrients are common, particularly in pregnant women. Although biochemical indicators of diet and nutrition are useful to assess nutritional status, few studies have examined such indicators throughout pregnancy in women in developing countries. METHODS: The primary objective of this study was to assess the nutritional status of 78 Peruvian women throughout pregnancy for 16 different nutritional indicators, including fat-soluble vitamins and carotenoids, iron-status indicators, and selenium. Venous blood samples from which serum was prepared were collected during trimesters one (n = 78), two (n = 65), and three (n = 62), and at term via the umbilical cord (n = 52). Questionnaires were completed to determine the demographic characteristics of subjects. Linear mixed effects models were used to study the associations between each maternal indicator and the demographic characteristics. RESULTS: None of the women were vitamin A or E deficient at any stage of pregnancy, and only 1/62 women (1.6%) was selenium deficient during the third trimester. However, 6.4%, 44%, and 64% of women had ferritin levels indicative of iron deficiency during the first, second, and third trimester, respectively. Statistically significant changes (p ≤ 0.05) throughout pregnancy were noted for 15/16 nutritional indicators for this Peruvian cohort, with little-to-no association with demographic characteristics. Three carotenoids (beta-carotene, beta-cryptoxanthin, and trans-lycopene) were significantly associated with education status, while trans-lycopene was associated with age and beta-cryptoxanthin with SES (p < 0.05). Concentrations of retinol, tocopherol, beta-cryptoxanthin, lutein + zeaxanthin, and selenium were lower in cord serum compared with maternal serum (p < 0.05). Conversely, levels of iron status indicators (ferritin, transferrin saturation, and iron) were higher in cord serum (p < 0.05). CONCLUSION: The increasing prevalence of iron deficiency throughout pregnancy in these Peruvian women was expected. It was surprising, however, not to find deficiencies in other nutrients. The results highlight the importance of continually monitoring women throughout pregnancy for iron deficiency, which could be caused by increasing fetal needs and/or inadequate iron intake as pregnancy progresses. |
Dietary flavonol intake is associated with age of puberty in a longitudinal cohort of girls
Mervish NA , Gardiner EW , Galvez MP , Kushi LH , Windham GC , Biro FM , Pinney SM , Rybak ME , Teitelbaum SL , Wolff MS . Nutr Res 2013 33 (7) 534-42 Lignans and flavonols are dietary phytoestrogens found at high concentrations in the Western diet. They have the potential to influence the timing of puberty. We hypothesized that greater consumption of these 2 phytoestrogens would be related to later age at pubertal onset among girls. Pubertal assessment and 24-hour diet recall data were available for 1178 girls, ages 6 to 8 years (mean 7.3 years), in the Breast Cancer and Environment Research Project Puberty Study. Lignan and flavonol intakes were mainly derived from fruit and vegetable consumption. Average consumption was 6.5 mg/d for flavonols and 0.6 mg/d for lignans. The highest flavonol consumption (>5 mg/d) was associated with later breast development (adjusted hazard ratio [HR]: 0.74, 95% CI: [0.61-0.91]) compared to 2 to 5 mg/d (adjusted HR: 0.84, 95% CI: [0.70-1.0]) and <2 mg/d (referent group; P-trend = .006). Flavonol intake was not associated with pubic hair development. Lignan intake was not associated with either breast or pubic hair development. Dietary intake was only weakly correlated with urinary enterolactone, a biomarker for lignans (RS = 0.13). Consistent with biologic properties of phytoestrogens that indicate hormonal activity, their consumption may be associated with reproductive end points, even in childhood. |
The relationship between the social environment of work and workplace mistreatment
Sliter MT , Jex S , Grubb P . J Behav Health 2013 2 (2) 120-126 While a great deal of research has investigated employee reactions to mistreatment, considerably fewer studies have investigated how the social environment in the workplace contributes to both (1) the prevalence of mistreatment and (2) employee reactions when mistreatment occurs. In the present study we investigate three important components of the social environment of organizations (perceptions of organizational justice, the extent to which people within the organization generally treat each other with respect, and the level of social support within the organization) and show how these relate to the prevalence of both verbal aggression and social undermining. In addition to testing these relations, we test a mediational model whereby these components of the social environment are related to employee strain. More specifically, we propose that workplace mistreatment mediates the relation between these components of the social environment and employee strain. Data from the 2004 General Social Survey (GSS) supported the proposed relations between all three social environment components and the two forms of mistreatment. Furthermore, mediated regression analyses showed that mistreatment mediated the relation between perceived level of respect and all measures of strain. These results suggest that the social environment of the workplace may play a role in employee mistreatment and contribute, at least indirectly, to employee strain. |
Shift work and long-term injury among police officers
Violanti JM , Fekedulegn D , Andrew ME , Charles LE , Hartley TA , Vila B , Burchfiel CM . Scand J Work Environ Health 2013 39 (4) 361-8 OBJECTIVE: Our previous work has suggested that the incidence of any occurrence of injury leave among police officers is higher on night shifts. In this study, we extended our inquiry to determine whether the incidence of long-term injury leave varies across shifts. METHODS: Police officers (N=419) from an urban department were included in the analysis. Daily payroll work history data from 1994-2010 were collected. Injury leave durations ranging from ≥1 to ≥90 days were examined. Poisson regression models were used to compute incidence rates (IR) and incidence rate ratios (IRR) of long-term injury. RESULTS: Cumulative incidence of injury for durations of leave defined as ≥1, ≥5, ≥10, ≥15, ≥30, and ≥90 days was 61.3%, 45.4%, 39.9%, 33.9%, 26.5%, and 9.6%, respectively. The age- and gender-adjusted IRR of long-term injury (≥90 days) was 3.12 (95% confidence interval [95% CI] 1.35-7.21) for night versus day shifts and 2.21 (95% CI 1.04-4.68) for night versus afternoon shifts. Among all durations examined, the largest IRR was for injury ≥90 days, night versus day shifts (IRR 3.12, 95% CI 1.35-7.21). CONCLUSIONS: Night shift work was significantly associated with long-term injury among police officers after adjustment for age and gender. Although type of injury was not available, it is possible that variation in injury type across shifts might account for some of this association. |
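For context, incidence rate ratios of this kind come from events over person-time. The crude version of the calculation is sketched below with made-up counts chosen to echo the night-versus-day comparison; the study's estimates additionally adjust for age and gender within Poisson regression:

```python
import math

def crude_irr(events_a, persontime_a, events_b, persontime_b):
    """Crude incidence rate ratio with a Wald 95% CI on the log scale."""
    ratio = (events_a / persontime_a) / (events_b / persontime_b)
    se_log = math.sqrt(1 / events_a + 1 / events_b)
    return ratio, ratio * math.exp(-1.96 * se_log), ratio * math.exp(1.96 * se_log)

# Hypothetical: long-term (>=90 day) injuries per officer-year, night vs. day shifts
irr, lo, hi = crude_irr(events_a=18, persontime_a=1500, events_b=12, persontime_b=3100)
print(f"IRR {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```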
Work-related asthma and employment status - 38 states and District of Columbia, 2006-2009
White GE , Mazurek JM , Moorman JE . J Asthma 2013 50 (9) 954-9 OBJECTIVES: To examine differences in current employment status between persons with health professional-diagnosed work-related asthma and non-work-related asthma and to examine factors associated with unemployment in these groups. METHODS: We analyzed the 2006-2009 Behavioral Risk Factor Surveillance System Asthma Call-back Survey for ever-employed adults (excluding those who were retired, homemakers, and students at the time of the interview) with current asthma in 38 states and District of Columbia (N=25,680). We calculated prevalence ratios (PRs) adjusted for age, sex, race/ethnicity, education, and income. RESULTS: Among adults with current asthma, individuals with work-related asthma were less likely to be currently employed for wages (PR=0.89; 95% confidence interval [CI]=0.84-0.95) and more likely to be unable to work (PR=1.44; 95% CI=1.24-1.67) than those with non-work-related asthma. Among adults with current asthma who were unemployed at the time of the interview, adults with work-related asthma did not differ from those with non-work-related asthma in naming disability as reason for unemployment (PR=1.09; 95% CI=0.94-1.26). However, those with work-related asthma were more likely to be unable to work for health reasons other than disability (PR=1.46; 95% CI=1.01-2.12) than adults with non-work-related asthma. CONCLUSIONS: Additional studies are needed to determine what health reasons prevent individuals with work-related asthma from working and if the health reasons are asthma-related. |
Noise exposure reconstruction and evaluation of exposure trends in two large automotive plants
Brueck SE , Prince Panaccio M , Stancescu D , Woskie S , Estill C , Waters M . Ann Occup Hyg 2013 57 (9) 1091-104 This study used a task-based approach to reconstruct employee noise exposures at two large automotive manufacturing plants for the period 1970-1989, utilizing historic noise measurement data, work history records, documented changes in plant operations, focus group discussions, structured interviews with long-tenure employees, and task-based job profiles. Task-based job noise exposure profiles were developed in the 1990s when the plants conducted task-based noise monitoring. Under the assumption that tasks and time-at-task profile within jobs did not change over time, these profiles were applied to historic jobs. By linking historic noise exposure measurements to job tasks, this approach allowed task-based reconstructed noise exposure profiles to capture variability of daily noise exposures. Reconstructed noise exposures, along with task-based noise exposure measurements collected at each plant during the 1990s, were analyzed to examine time trends in workplace noise levels and worker noise exposure. Our analysis of noise exposure trends revealed that noise levels for many jobs declined by ≥3 dBA from 1970 to 1998 as operational and equipment changes occurred in the plants and some noise control measures were implemented, but for some jobs, noise levels increased in the mid- to late 1990s, most likely because of an increase in production at that time. Overall, the percentage of workers exposed to noise levels >90 dBA decreased from 95% in 1970 to 54% in 1998 at one of the plants and decreased from 36% in 1970 to ~5% in 1999 at the other plant. These reductions indicate a degree of success for the hearing conservation program. However, the actual number of employees with noise exposure >90 dBA increased because of a substantial increase in the number of production employees, particularly in jobs with high noise levels, which shows a hearing conservation program challenge that companies face during periods of increased production. Future analysis of hearing levels in these plant populations will help determine whether noise level reduction translates into decreased hearing loss at these plants. |
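Task-based reconstruction of this kind combines each task's noise level with time-at-task into a shift-level time-weighted average. The sketch below uses the equal-energy (3-dB exchange rate) convention; that choice is an assumption for illustration, since the abstract does not state which exchange rate the study applied:

```python
import math

def twa_equal_energy(tasks, ref_hours=8.0):
    """8-hour equivalent continuous level (dBA) from (level_dBA, hours) pairs,
    assuming a 3-dB exchange rate (equal-energy rule)."""
    energy = sum(hours * 10 ** (level / 10) for level, hours in tasks)
    return 10 * math.log10(energy / ref_hours)

# Illustrative task profile for one job: (noise level dBA, hours at task)
shift = [(95, 2.0), (88, 4.0), (82, 2.0)]
print(f"TWA = {twa_equal_energy(shift):.1f} dBA")  # ~90.6 dBA for this profile
```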
Occupational carbon monoxide fatalities in the US from unintentional non-fire related exposures, 1992-2008
Henn SA , Bell JL , Sussell AL , Konda S . Am J Ind Med 2013 56 (11) 1280-9 OBJECTIVE: To analyze characteristics of, and trends in, work-related carbon monoxide (CO) fatalities in the US. METHODS: Records of unintentional, non-fire related fatalities from CO exposure were extracted from the Bureau of Labor Statistics' Census of Fatal Occupational Injuries and the Occupational Safety and Health Administration's Integrated Management Information System for years 1992-2008 and analyzed separately. RESULTS: The average number of annual CO fatalities was 22 (standard deviation = 8). Fatality rates were highest among workers aged ≥65 years, males, and Hispanics, and fatalities were most frequent during winter months, in the Midwest, and in the Fishing, Hunting, and Trapping industry subsector. Self-employed workers accounted for 28% of all fatalities. Motor vehicles were the most frequent source of fatal CO exposure, followed by heating systems and generators. CONCLUSIONS: CO has been the most frequent cause of occupational fatality due to acute inhalation and has shown no significant decreasing trend since 1992. The high number of fatalities from motor vehicles warrants further investigation. |
Possible health benefits from reducing occupational magnetic fields
Bowman JD , Ray TK , Park RM . Am J Ind Med 2013 56 (7) 791-805 BACKGROUND: Magnetic fields (MF) from AC electricity are a Possible Human Carcinogen, based on limited epidemiologic evidence from exposures far below occupational health limits. METHODS: To help formulate government guidance on occupational MF, the cancer cases prevented and the monetary benefits accruing to society by reducing workplace exposures were determined. Life-table methods produced Disability Adjusted Life Years, which were converted to monetary values. RESULTS: Adjusted for probabilities of causality, the expected increase in a worker's disability-free life is 0.04 years (2 weeks) from a 1 microtesla (microT) reduction in average worklife MF exposure, which is equivalent to $5,100/worker/microT in year 2010 U.S. dollars (95% confidence interval $1,000-$9,000/worker/microT). Where nine electrosteel workers had 13.8 microT exposures, for example, moving them to ambient MFs would provide $600,000 in benefits to society (uncertainty interval $0-$1,000,000). CONCLUSIONS: When combined with the costs of controls, this analysis provides guidance for precautionary recommendations for managing occupational MF exposures. |
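The electrosteel example is a direct product of the number of workers, the exposure reduction, and the monetized central estimate per worker per microtesla; a worked check of that arithmetic (rounding to one significant figure yields the $600,000 quoted):

```python
workers = 9
exposure_reduction_ut = 13.8     # microtesla, moving from 13.8 uT to ~ambient
benefit_per_worker_ut = 5_100    # 2010 USD per worker per microtesla (central estimate)
print(workers * exposure_reduction_ut * benefit_per_worker_ut)  # 633,420 -> ~$600,000
```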
A prospective study of musculoskeletal outcomes among manufacturing workers: I. effects of physical risk factors
Gerr F , Fethke N , Merlino L , Anton D , Rosecrance J , Jones MP , Marcus M , Meyers A . Hum Factors 2013 56 (1) 112-30 OBJECTIVE: To better characterize associations between physical risk factors and upper-extremity musculoskeletal symptoms and disorders, a prospective epidemiologic study of 386 manufacturing workers was performed. BACKGROUND: Methodological limitations of previous studies have resulted in inconsistent associations. METHOD: An individual, task-based exposure assessment strategy was used to assess upper-extremity exertion intensity, repetition, and time-in-posture categories. Participants recorded time spent performing daily work tasks on a preprinted log, which was then used to calculate time-weighted-average exposures across each week of follow-up. In addition, a weekly Strain Index (SI) risk category was assigned to each participant. Incident musculoskeletal symptoms and disorders were assessed weekly. Proportional hazards analyses were used to examine associations between exposure measures and incident hand/arm and neck/shoulder symptoms and disorders. RESULTS: Incident symptoms and disorders were common (incident hand/arm symptoms = 58/100 person-years (PY), incident hand/arm disorders = 19/100 PY, incident neck/shoulder symptoms = 54/100 PY, incident neck/shoulder disorders = 14/100 PY). Few associations between separate estimates of physical exposure and hand/arm and neck/shoulder outcomes were observed. However, associations were observed between dichotomized SI risk category and incident hand/arm symptoms (hazard ratio [HR] = 1.73, 95% confidence interval [CI] = [0.99, 3.04]) and disorders (HR = 1.93, 95% CI = [0.85, 4.40]). CONCLUSION: Evidence of associations between physical risk factors and musculoskeletal outcome was strongest when exposure was estimated with the SI, in comparison to other metrics of exposure. APPLICATION: The results of this study provide evidence that physical exposures in the workplace contribute to musculoskeletal disorder incidence. Musculoskeletal disorder prevention efforts should include mitigation of these occupational risk factors. |
Epidemiology of neurodegeneration in American-style professional football players
Lehman EJ . Alzheimers Res Ther 2013 5 (4) 34 The purpose of this article is to review the history of head injuries in relation to American-style football play, summarize recent research that has linked football head injuries to neurodegeneration, and provide a discussion of the next steps for refining the examination of neurodegeneration in football players. For most of the history of football, the focus of media reports and scientific studies on football-related head injuries was on the acute or short-term effects of serious, traumatic head injuries. Beginning about 10 years ago, a growing concern developed among neurologists and researchers about the long-term effects that playing professional football has on the neurologic health of the players. Autopsy-based studies identified a pathologically distinct neurodegenerative disorder, chronic traumatic encephalopathy, among athletes who were known to have experienced concussive and subconcussive blows to the head during their playing careers. Football players have been well represented in these autopsy findings. A mortality study of a large cohort of retired professional football players found a significantly increased risk of death from neurodegeneration. Further analysis found that non-line players were at higher risk than line players, possibly because of an increased risk of concussion. Although the results of the studies reviewed do not establish a cause-and-effect relationship between football-related head injury and neurodegenerative disorders, a growing body of research supports the hypothesis that professional football players are at an increased risk of neurodegeneration. Significant progress has been made in the last few years on detecting and defining the pathology of neurodegenerative diseases. However, less progress has been made on other factors related to the progression of those diseases in football players. This review identifies three areas for further research: (a) quantification of exposure, where a consensus is needed on the use of clinically practical measurements of blows to the head among football players; (b) genetic susceptibility factors, where a more rigorous set of unbiased epidemiological and clinical studies is needed before any causal relationships can be drawn between suspected genetic factors, head injury, and neurodegeneration; and (c) earlier detection and prevention of neurodegenerative diseases. |
Are operating room nurses at higher risk of severe persistent asthma? The Nurses' Health Study
Le Moual N , Varraso R , Zock JP , Henneberger P , Speizer FE , Kauffmann F , Camargo CA Jr . J Occup Environ Med 2013 55 (8) 973-7 OBJECTIVE: To assess the associations between operating room (OR) nursing, a category of health care work with high risk of exposure to various inhaled agents, and asthma severity/control among women with asthma. METHODS: The level of asthma severity/control in nurses with prevalent doctor-diagnosed asthma in 1998/2000 was compared, using nominal logistic regression, between OR nurses (n = 69) and administrative nurses (n = 546) from the US Nurses' Health Study for whom detailed information on asthma and nursing employment status was available. RESULTS: We observed a significant association between OR nursing, compared with administrative nursing, and severe persistent asthma (adjusted odds ratio, 2.48; 95% confidence interval, 1.06 to 5.77). CONCLUSIONS: Our findings suggest that nurses working in the OR are at a higher risk of severe persistent asthma. Further studies with detailed estimates of occupational exposures, especially to disinfectant/cleaning agents, are warranted. |
A comparison of fatal occupational injury event characteristics from the Census of Fatal Occupational Injuries and the Vital Statistics Mortality System
Marsh SM , Jackson LL . J Safety Res 2013 46 119-125 OBJECTIVES: The aim of this study was to examine the utility of appending International Classification of Diseases (ICD) codes from Vital Statistics Mortality (VSM) data to the Bureau of Labor Statistics (BLS) Census of Fatal Occupational Injuries (CFOI) and to compare occupational event characteristics based on ICD external cause codes and BLS Occupational Injury and Illness Classification System (OIICS) event codes. METHODS: We linked VSM records with CFOI records for 2003 and 2004. RESULTS: Ninety-five percent of approximately 11,000 CFOI cases were linked to VSM cases. Linked data suggest that CFOI OIICS event codes and VSM ICD codes identified similar leading events. However, VSM data were generally less specific. CONCLUSION: The lack of detail inherent in ICD codes and death narratives limits the specificity of injury characteristics in VSM data. Appending ICD codes to CFOI appears to offer little value. Research comparing work- and non-work-related events may be better served by having a defined framework to crosswalk both coding schemes to facilitate comparisons. IMPACT ON INDUSTRY: Over the last two decades, both ICD and OIICS have been used to characterize occupational injury circumstances; however, this is the first study to use linked case comparisons of the OIICS and ICD codes at a detailed level. This study confirmed that multiple-source data systems provide more detail surrounding an incident than a single-source data system does. Our results suggest that OIICS-coded CFOI data are a better source for occupational injury research and prevention purposes. For future comparison studies requiring ICD, it would be advantageous to have a defined framework that could easily be used to map both coding schemes (OIICS and ICD). |
Development of an advanced respirator fit test headform
Bergman MS , Zhuang Z , Hanson D , Heimbuch BK , McDonald MJ , Palmiero AJ , Shaffer RE , Harnish D , Husband M , Wander JD . J Occup Environ Hyg 2013 11 (2) 117-25 Improved respirator test headforms are needed to measure the fit of N95 filtering facepiece respirators (FFRs) for protection studies against viable airborne particles. A Static (i.e., non-moving, non-speaking) Advanced Headform (StAH) was developed for evaluating the fit of N95 FFRs. The StAH is based on the anthropometric dimensions of a digital headform reported by the National Institute for Occupational Safety and Health and has a silicone polymer skin with defined local tissue thicknesses. Quantitative fit factor evaluations were performed on seven N95 FFR models of various sizes and designs. Donnings were performed with and without a pre-test leak checking method. For each method, four replicate samples of each of the seven models were tested with two donnings per replicate, resulting in a total of 56 tests per donning method. Each fit factor evaluation comprised three one-minute exercises: "Normal Breathing" (NB, 11.2 liters per minute (lpm)), "Deep Breathing" (DB, 20.4 lpm), then NB again. A fit factor for each exercise and an overall test fit factor were obtained. Analysis of variance methods were used to identify statistical differences among fit factors (analyzed as logarithms) for different FFR models, exercises, and testing methods. For each FFR model and for each testing method, the NB and DB fit factor data were not significantly different (P > 0.05). Significant differences were seen in the overall exercise fit factor data for the two donning methods among all FFR models (pooled data) and in the overall exercise fit factor data for the two testing methods within certain models. Utilization of the leak checking method improved the rate of obtaining overall exercise fit factors ≥ 100. The FFR models that are expected to achieve overall fit factors ≥ 100 on human subjects achieved overall exercise fit factors ≥ 100 on the StAH. Further research is needed to evaluate the correlation between FFRs fitted on the StAH and FFRs fitted on people. |
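For readers outside the respirator field: an overall fit factor across exercises is conventionally computed as a harmonic-mean-style combination of the per-exercise fit factors, so a single leaky exercise dominates the result. The abstract does not spell out the StAH's weighting, so treat this as the standard quantitative-fit-testing convention rather than the study's stated formula:

```python
def overall_fit_factor(exercise_ffs):
    """Harmonic-mean overall fit factor from equal-duration per-exercise fit factors."""
    return len(exercise_ffs) / sum(1.0 / ff for ff in exercise_ffs)

# NB, DB, NB per-exercise fit factors (illustrative values only)
print(f"{overall_fit_factor([180.0, 95.0, 160.0]):.0f}")  # ~134; the low DB value dominates
```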
Discounting the value of safety: effects of perceived risk and effort
Sigurdsson SO , Taylor MA , Wirth O . J Safety Res 2013 46 127-134 INTRODUCTION: Although falls from heights remain the most prevalent cause of fatalities in the construction industry, factors impacting safety-related choices associated with work at heights are not completely understood. Better tools are needed to identify and study the factors influencing safety-related choices and decision making. METHOD: Using a computer-based task within a behavioral economics paradigm, college students were presented a choice between two hypothetical scenarios that differed in working height and effort associated with retrieving and donning a safety harness. Participants were instructed to choose the scenario in which they were more likely to wear the safety harness. Based on choice patterns, switch points were identified, indicating when the perceived risk in both scenarios was equivalent. RESULTS: Switch points were a systematic function of working height and effort, and the quantified relation between perceived risk and effort was described well by a hyperbolic equation. CONCLUSION: Choice patterns revealed that the perceived risk of working at heights decreased as the effort to retrieve and don a safety harness increased. IMPACT ON INDUSTRY: Results contribute to the development of computer-based procedure for assessing risk discounting within a behavioral economics framework. Such a procedure can be used as a research tool to study factors that influence safety-related decision making with a goal of informing more effective prevention and intervention strategies. |
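Hyperbolic discounting models of this sort typically take the form V(E) = V0 / (1 + kE), where V is the perceived-risk equivalent (here indexed by the switch-point working height) and E is the effort of retrieving and donning the harness. The sketch below fits such a curve to switch-point data with SciPy; the functional form matches the family the abstract names, but the parameterization and data points are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import curve_fit

def hyperbolic(effort, v0, k):
    """Perceived risk discounted hyperbolically with effort: V = V0 / (1 + k*E)."""
    return v0 / (1 + k * effort)

# Hypothetical switch points: effort (minutes to retrieve/don harness) vs.
# working height (feet) at which participants switched to wearing the harness
effort = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
switch_height = np.array([48.0, 41.0, 33.0, 24.0, 15.0, 9.0])

(v0, k), _ = curve_fit(hyperbolic, effort, switch_height, p0=(50.0, 0.3))
print(f"V0 = {v0:.1f} ft, k = {k:.2f} per minute")
```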
Cutaneous infection caused by a novel Francisella sp.
Respicio-Kingry LB , Byrd L , Allison A , Brett M , Scott-Waldron C , Galliher K , Hannah P , Mead P , Petersen JM . J Clin Microbiol 2013 51 (10) 3456-60 A 69-year-old patient presented with a tender, thickly crusted skin lesion of one week's duration. A bacterial culture swab taken from the underlying granular tissue yielded a pure isolate of a gram-negative coccobacillus, presumptively identified as a novel Francisella species via 16S rRNA and multi-locus gene sequence analysis. |
A randomized trial of artemether-lumefantrine and dihydroartemisinin-piperaquine in the treatment of uncomplicated malaria among children in western Kenya
Agarwal A , McMorrow M , Onyango P , Otieno K , Odero C , Williamson J , Kariuki S , Kachur SP , Slutsker L , Desai M . Malar J 2013 12 254 BACKGROUND: Artemether-lumefantrine (AL) was adopted as first-line treatment for uncomplicated malaria in Kenya in 2006. Monitoring drug efficacy at regular intervals is essential to prevent unnecessary morbidity and mortality. The efficacies of AL and dihydroartemisinin-piperaquine (DP) were evaluated for the treatment of uncomplicated malaria in children aged 6 to 59 months in western Kenya. METHODS: From October 2010 to August 2011, children with fever or a history of fever and uncomplicated Plasmodium falciparum mono-infection were enrolled in an in vivo efficacy trial in accordance with World Health Organization (WHO) guidelines. The children were randomized to treatment with a three-day course of AL or DP, and efficacy outcomes were measured at 28 and 42 days after treatment initiation. RESULTS: A total of 137 children were enrolled in each treatment arm. There were no early treatment failures, and all children except one had cleared parasites by day 3. The polymerase chain reaction (PCR)-uncorrected adequate clinical and parasitological response rate (ACPR) was 61% in the AL arm and 83% in the DP arm at day 28 (p = 0.001). PCR-corrected ACPR at day 28 was 97% in the AL group and 99% in the DP group, and it was 96% in both arms at day 42. CONCLUSIONS: AL and DP remain efficacious for the treatment of uncomplicated malaria among children in western Kenya. The longer half-life of piperaquine relative to lumefantrine may provide a prophylactic effect, accounting for the lower rate of re-infection in the first 28 days after treatment in the DP arm. |
Recruiting patients into the CDC's Colorectal Cancer Screening Demonstration Program: strategies and challenges across 5 sites
Boehm JE , Rohan EA , Preissle J , Degroff A , Glover-Kudon R . Cancer 2013 119 Suppl 15 2914-25 BACKGROUND: In 2005, the Centers for Disease Control and Prevention (CDC) funded 5 sites as part of the Colorectal Cancer Screening Demonstration Program (CRCSDP) to provide colorectal cancer screening to low-income, uninsured, and underinsured individuals. Funded sites experienced unexpected challenges in recruiting patients for services. METHODS: The authors conducted a longitudinal, qualitative case study of all 5 sites to document program implementation, including recruitment. Data were collected during 3 periods over the 4-year program and included interviews, document review, and observations. After coding and analyzing the data, themes were identified and triangulated across the research team. Patterns were confirmed through member checking, further validating the analytic interpretation. RESULTS: During early implementation, patient enrollment was low at 4 of the 5 CRCSDP sites. Evaluators found 3 primary challenges to patient recruitment: overreliance on in-reach to National Breast and Cervical Cancer Early Detection Program patients, difficulty keeping colorectal cancer screening and the program a priority among staff at partnering primary care clinics responsible for patient recruitment, and a lack of public knowledge about the need for colorectal cancer screening among patients. To address these challenges, site staff expanded partnerships with additional primary care networks for greater reach, enhanced technical support to primary care providers to ensure more consistent patient enrollment, and developed tailored outreach and education. CONCLUSIONS: Removing financial barriers to colorectal cancer screening was necessary but not sufficient to reach the priority population. To optimize colorectal cancer screening, public health practitioners must work closely with the health care sector to implement evidence-based, comprehensive strategies across individual, environmental, and systems levels of society. |
Lessons learned from the CDC's Colorectal Cancer Screening Demonstration Program
Seeff LC , Rohan EA . Cancer 2013 119 Suppl 15 2817-9 This report briefly summarizes 13 articles in this dedicated supplement to Cancer documenting the full implementation and evaluation of CDC's Colorectal Cancer Screening Demonstration Program (CRCSDP). The supplement includes 3 articles that describe clinical and quality outcomes; 2 articles that describe programmatic and clinical costs; 3 that were based on a multiple case study, using qualitative methods to describe the overall implementation experience of this initiative; and 4 articles written by and about individual program sites. The comprehensive, multi-methods evaluation conducted alongside the program produced many important lessons regarding the design, start-up, and implementation of colorectal cancer screening in this high-need population, and paved the way for the CDC to establish a larger, population-based colorectal cancer control initiative, broadly aligned with expectations of the Patient Protection and Affordable Care Act through its population-based emphasis on using a health systems approach to increase colorectal cancer screening. |
The National Prevention Strategy and breast cancer screening: scientific evidence for public health action
Plescia M , White MC . Am J Public Health 2013 103 (9) 1545-8 Mammography screening rates in the United States have remained fairly stable over the past decade, and screening rates remain low for some groups. We examined insights from recent public health research on breast cancer screening to identify promising new approaches to improve screening rates and address persistent health disparities in mammography use. We considered this research in the context of the four strategic directions of the National Prevention Strategy: elimination of health disparities, empowered people, healthy and safe community environments, and clinical and community preventive services. This research points to the value of direct outreach and case management services, interventions to support more patient-centered models of care, and more organized, population-based approaches to identify women who are eligible to be screened, encourage participation, and monitor results. |
Outpatient colonoscopy complications in the CDC's Colorectal Cancer Screening Demonstration Program: a prospective analysis
Castro G , Azrak MF , Seeff LC , Royalty J . Cancer 2013 119 Suppl 15 2849-54 BACKGROUND: To the authors' knowledge, there are few published prospective cohort studies of colonoscopy complications in patients at average risk for colorectal cancer who received colorectal cancer screening from a community-based program. In this article, the authors report the rate of colonoscopy complications in the Centers for Disease Control and Prevention (CDC)'s Colorectal Cancer Screening Demonstration Program (CRCSDP), which provided colonoscopy to a medically underserved population aged 50 years to 64 years for screening, diagnostic follow-up after positive stool blood tests, and surveillance purposes. METHODS: Clinical data were collected prospectively from 5 community-based colorectal cancer screening programs. Complications were identified by reviewing the standardized clinical data and medical complication reporting forms submitted by the programs to the CDC. Serious complications were defined as conditions or symptoms that resulted in hospital admission within 30 days after the procedure, including perforation, gastrointestinal bleeding requiring or not requiring blood transfusion, cardiopulmonary events, postpolypectomy syndrome, excessive abdominal pain, or death. RESULTS: A total of 3215 individuals underwent 3355 colonoscopies. Of these, 89% of the colonoscopies were conducted for screening, 9% for diagnostic follow-up, and 2% for surveillance purposes. The mean age of the individuals was 55.9 years. Eight individuals experienced serious complications, for an incidence of 2.38 per 1000 colonoscopies. Three patients experienced bowel perforations that required surgery, 1 patient was hospitalized for postpolypectomy bleeding, 3 patients experienced cardiopulmonary events, and 1 patient visited the emergency room for excessive abdominal pain and underwent surgery for an identified colorectal mass. No deaths were reported. CONCLUSIONS: In the CDC's CRCSDP, in which a total of 3215 individuals underwent 3355 colonoscopies, the overall incidence of serious complications from colonoscopy was found to be low. |
Evaluating application of the National Healthcare Safety Network central line-associated bloodstream infection surveillance definition: a survey of pediatric intensive care and hematology/oncology units
Gaur AH , Miller MR , Gao C , Rosenberg C , Morrell GC , Coffin SE , Huskins WC . Infect Control Hosp Epidemiol 2013 34 (7) 663-70 OBJECTIVE: To evaluate the application of the National Healthcare Safety Network (NHSN) central line-associated bloodstream infection (CLABSI) definition in pediatric intensive care units (PICUs) and pediatric hematology/oncology units (PHOUs) participating in a multicenter quality improvement collaborative to reduce CLABSIs, and to identify sources of variability in the application of the definition. DESIGN: Online survey using 18 standardized case scenarios. Each described a positive blood culture in a patient and required a yes-or-no answer to the question "Is this a CLABSI?" NHSN staff responses were the reference standard. SETTING: Sixty-five US PICUs and PHOUs. PARTICIPANTS: Staff who routinely adjudicate CLABSIs using NHSN definitions. RESULTS: Sixty responses were received from 58 (89%) of 65 institutions; 78% of respondents were infection preventionists, infection control officers, or infectious disease physicians. Responses matched those of NHSN staff for 78% of questions. The mean (SE) percentage of concurring answers did not differ for scenarios evaluating application of 1 of the 3 criteria ("known pathogen," 78% [1.7%]; "skin contaminant, >1 year of age," 76% [2.5%]; "skin contaminant, ≤1 year of age," 81% [3.8%]). The mean percentage of concurring answers was lower for scenarios requiring respondents to determine whether a CLABSI was present or incubating on admission (64% [4.6%]) or to distinguish between primary and secondary bacteremia (65% [2.5%]). CONCLUSIONS: The accuracy of application of the CLABSI definition was suboptimal. Efforts to reduce variability in identifying CLABSIs that are present or incubating on admission and in distinguishing primary from secondary bloodstream infection are needed. |
Evaluation of a volunteer community-based health worker program for providing contraceptive services in Madagascar
Gallo MF , Walldorf J , Kolesar R , Agarwal A , Kourtis AP , Jamieson DJ , Finlay A . Contraception 2013 88 (5) 657-65 BACKGROUND: Madagascar recently scaled up its volunteer community health worker (CHW) program in maternal health and family planning to reach remote and underserved communities. STUDY DESIGN: We conducted a cross-sectional evaluation using a systematic sample of 100 CHWs trained to provide contraceptive counseling and short-acting contraceptive services at the community level. CHWs were interviewed on demographics, recruitment, training, supervision, commodity supply, and other measures of program functionality; tested on knowledge of injectable contraception; and observed by an expert while completing five simulated client encounters with uninstructed volunteers. We developed a CHW performance score (0-100%) based on the number of counseling activities adequately met during the client encounters and used multivariable linear regression to identify correlates of the score. RESULTS: CHWs had a mean performance score of 73.9% (95% confidence interval [CI]: 70.3-77.6%). More education, more weekly volunteer hours, and receipt of refresher training correlated with a higher performance score. We found no other associations between the performance score and measures of the components previously identified as essential for effective CHW programs. CONCLUSIONS: Although areas of deficiency were identified, CHWs proved capable of providing high-quality contraception services. |
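The performance score and regression described here are straightforward to illustrate. The sketch below builds a 0-100% score as the share of counseling activities met across five simulated encounters and fits an ordinary least-squares model on the three correlates the abstract names; the 10-activity checklist, all counts, and all covariate values are hypothetical.

```python
# Sketch of a 0-100% performance score plus an OLS fit on candidate correlates.
# All data and the assumed 10-activity checklist are hypothetical.
import numpy as np

# rows = CHWs; activities_met[i, j] = activities met in encounter j (of 10 possible)
activities_met = np.array([[8, 7, 9, 6, 8],
                           [5, 6, 4, 7, 5],
                           [9, 9, 8, 9, 10],
                           [6, 5, 7, 6, 6]])
score = 100 * activities_met.sum(axis=1) / (10 * activities_met.shape[1])

# correlates: years of education, weekly volunteer hours, refresher training (0/1)
X = np.array([[8, 10, 1],
              [5, 4, 0],
              [11, 12, 1],
              [7, 6, 0]], dtype=float)
X = np.column_stack([np.ones(len(X)), X])   # add intercept column
coef, *_ = np.linalg.lstsq(X, score, rcond=None)
print(dict(zip(["intercept", "education", "hours", "refresher"], coef.round(2))))
```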
Gynecologic cancer prevention and control in the National Comprehensive Cancer Control Program: progress, current activities, and future directions
Stewart SL , Lakhani N , Brown PM , Larkin OA , Moore AR , Hayes NS . J Womens Health (Larchmt) 2013 22 (8) 651-7 Gynecologic cancer imposes a large burden on women in the United States. Several evidence-based interventions are available to reduce the incidence, morbidity, and mortality of these cancers. The National Comprehensive Cancer Control Program (NCCCP) is uniquely positioned to implement these interventions in the US population. This review discusses progress and future directions for the NCCCP in preventing and controlling gynecologic cancer. |
Implementing the CDC's Colorectal Cancer Screening Demonstration Program: wisdom from the field
Rohan EA , Boehm JE , Degroff A , Glover-Kudon R , Preissle J . Cancer 2013 119 Suppl 15 2870-83 BACKGROUND: Colorectal cancer, as the second leading cause of cancer-related deaths among men and women in the United States, represents an important area for public health intervention. Although colorectal cancer screening can prevent cancer and detect disease early when treatment is most effective, few organized public health screening programs have been implemented and evaluated. From 2005 to 2009, the Centers for Disease Control and Prevention funded 5 sites to participate in the Colorectal Cancer Screening Demonstration Program (CRCSDP), which was designed to reach medically underserved populations. METHODS: The authors conducted a longitudinal, multiple case study to analyze program implementation processes. Qualitative methods included interviews with 100 stakeholders, 125 observations, and review of 19 documents. Data were analyzed within and across cases. RESULTS: Several themes related to CRCSDP implementation emerged from the cross-case analysis: the complexity of colorectal cancer screening, the need for teamwork and collaboration, integration of the program into existing systems, the ability of programs to use wisdom at the local level, and the influence of social norms. Although these themes were explored independently of one another, interaction across themes was evident. CONCLUSIONS: Colorectal cancer screening is clinically complex, and its screening methods are not well accepted by the general public; both of these circumstances have implications for program implementation. Using patient navigation, engaging in transdisciplinary teamwork, assimilating new programs into existing clinical settings, and deferring to local-level wisdom together helped to address complexity and enhance program implementation. In addition, public health efforts must confront negative social norms around colorectal cancer screening. |
Kenya's Health Workforce Information System: a model of impact on strategic human resources policy, planning and management
Waters KP , Zuber A , Willy RM , Kiriinya RN , Waudo AN , Oluoch T , Kimani FM , Riley PL . Int J Med Inform 2013 82 (9) 895-902 OBJECTIVE: Countries worldwide are challenged by health worker shortages, skill mix imbalances, and maldistribution. Human resources information systems (HRIS) are used to monitor and address these health workforce issues, but global understanding of such systems is minimal and baseline information regarding their scope and capability is practically non-existent. The Kenya Health Workforce Information System (KHWIS) has been identified as a promising example of a functioning HRIS. The objective of this paper is to document the impact of KHWIS data on human resources policy, planning and management. METHODS: Sources for this study included semi-structured interviews with senior officials at Kenya's Ministry of Medical Services (MOMS), Ministry of Public Health and Sanitation (MOPHS), the Department of Nursing within MOMS, the Nursing Council of Kenya, the Kenya Medical Practitioners and Dentists Board, Kenya's Clinical Officers Council, and the Kenya Medical Laboratory Technicians and Technologists Board. Additionally, quantitative data were extracted from KHWIS databases to supplement the interviews. Health sector policy documents were retrieved from MOMS and MOPHS websites and reviewed to assess whether they documented any changes to policy and practice as having been influenced by KHWIS data. RESULTS: Interviews with Kenyan government and regulatory officials indicated that health workforce data provided by KHWIS have influenced policy, regulation, and management. Policy changes include extension of Kenya's mandatory civil service retirement age from 55 to 60 years. Data retrieved from KHWIS document increased relicensing of professional nurses, midwives, medical practitioners, and dentists, and interviewees reported that this improved compliance raised the revenues of professional regulatory bodies. The review of government records revealed few references to KHWIS; however, the documentation that did exist specifically cited KHWIS as having improved the availability of human resources for health information for workforce planning, management, and development. CONCLUSION: KHWIS data have contributed to a range of improvements in health worker regulation, human resources management, and workforce policy and planning at Kenya's ministries of health. |
Assessing screening quality in the CDC's Colorectal Cancer Screening Demonstration Program
Nadel MR , Royalty J , Shapiro JA , Joseph D , Seeff LC , Lane DS , Dwyer DM . Cancer 2013 119 Suppl 15 2834-41 BACKGROUND: Gaps in screening quality in community practice have been well documented. The authors examined recommended indicators of screening quality in the Centers for Disease Control and Prevention's Colorectal Cancer Screening Demonstration Program (CRCSDP), which provided colorectal cancer screening and diagnostic services between 2005 and 2009 for asymptomatic, low-income, underinsured, or uninsured individuals at 5 sites around the United States. METHODS: For each client screened in the CRCSDP, a standardized set of colorectal cancer clinical data elements was collected. Data regarding client age, screening history, risk level, screening test indication, results, and recommendation for the next test were analyzed. For colonoscopies, data were analyzed regarding whether the cecum was reached, bowel preparation was adequate, and identified lesions were completely removed. RESULTS: Overall, 53% (2295) of the fecal occult blood tests (FOBTs) distributed were completed and returned. At the 2 sites with adequate numbers of FOBTs, 77% and 97%, respectively, of clients with positive results received follow-up colonoscopies. Site-specific cecal intubation rates ranged from 90% to 98%. Adenoma detection rates were 32% for men and 21% for women. For approximately one-third of colonoscopies, the interval to the next test was shorter than that recommended by national guidelines. At some sites, endoscopists failed to report on the adequacy of bowel preparation and the completeness of polyp removal. CONCLUSIONS: Cecal intubation rates and adenoma detection rates met recommended levels. The authors identified the need for improvements in the follow-up of positive FOBTs, documentation of important elements in colonoscopy reports, and recommendations for rescreening or surveillance intervals after colonoscopy. Monitoring quality indicators is important for improving screening quality. |
Clinical outcomes from the CDC's Colorectal Cancer Screening Demonstration Program
Seeff LC , Royalty J , Helsel WE , Kammerer WG , Boehm JE , Dwyer DM , Howe WR Jr , Joseph D , Lane DS , Laughlin M , Leypoldt M , Marroulis SC , Mattingly CA , Nadel MR , Phillips-Angeles E , Rockwell TJ , Ryerson AB , Tangka FK . Cancer 2013 119 Suppl 15 2820-33 BACKGROUND: Colorectal cancer remains the second leading cause of cancer-related deaths among US men and women. Screening rates have been slow to increase, and disparities in screening remain. METHODS: To address the disparity in screening for this high-burden but largely preventable disease, the Centers for Disease Control and Prevention (CDC) designed and established a 4-year Colorectal Cancer Screening Demonstration Program (CRCSDP) in 2005 for low-income, underinsured, or uninsured men and women aged 50 to 64 years in 5 participating US program sites. In this report, the authors describe the design of the CRCSDP and the overall clinical findings and screening test performance characteristics, including the positive fecal occult blood test (FOBT) rate; the rates of polyp, adenoma, and cancer detection with FOBTs and colonoscopies; and the positive predictive value for polyps, adenomas, and cancers. RESULTS: In total, 5233 individuals at average risk and increased risk were screened for colorectal cancer across all 5 sites, including 44% who underwent screening FOBT and 56% who underwent screening colonoscopy. Overall, 77% of all individuals screened were women. The FOBT positivity rate was 10%. Results from all screening or diagnostic colonoscopies indicated that 75% had negative results and required a repeat screening colonoscopy in 10 years, 16% had low-risk adenomas and required surveillance colonoscopy in 5 to 10 years, 8% had high-risk adenomas and required surveillance colonoscopy in 3 years, and 0.6% had invasive cancers. CONCLUSIONS: This report documents the successes and challenges in implementing the CDC's CRCSDP and describes the clinical outcomes of this 4-year initiative, the patterns in program uptake and test choice, and the comparative test performance characteristics of FOBT versus colonoscopy. Patterns in final outcomes from the follow-up of positive screening tests were consistent with national registry data. |
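Two of the test performance characteristics named here, FOBT positivity and positive predictive value, reduce to simple proportions. A sketch with hypothetical counts (chosen to echo the 10% positivity reported):

```python
# Screening-performance proportions from hypothetical counts. PPV here means:
# confirmed lesions among positives who completed diagnostic follow-up.
fobt_done, fobt_positive = 2300, 230   # hypothetical counts (echoing 10% positivity)
followup_colonoscopies = 200
adenomas_found = 64

positivity = fobt_positive / fobt_done                 # FOBT positivity rate
ppv_adenoma = adenomas_found / followup_colonoscopies  # PPV for adenoma
print(f"FOBT positivity {positivity:.1%}, PPV for adenoma {ppv_adenoma:.1%}")
```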
Developmental milestones across the programmatic life cycle: implementing the CDC's Colorectal Cancer Screening Demonstration Program
Glover-Kudon R , Degroff A , Rohan EA , Preissle J , Boehm JE . Cancer 2013 119 Suppl 15 2926-39 BACKGROUND: From 2005 through 2009, the Centers for Disease Control and Prevention (CDC) funded 5 sites to implement a colorectal cancer screening program for uninsured, low-income populations. These 5 sites composed a demonstration project intended to explore the feasibility of establishing a national colorectal cancer screening program through various service delivery models. METHODS: A longitudinal, multiple case study was conducted to understand and document program implementation processes. Using metaphor as a qualitative analytic technique, evaluators identified stages of maturation across the programmatic life cycle. RESULTS: Analysis yielded a working theory of program development during screening implementation. In early stages, program staff built relationships with the CDC and local partners around screening readiness, faced real-world challenges putting program policies into practice, revised initial program designs, and developed new professional skills. Midterm implementation was defined by establishing program cohesiveness and expanding programmatic reach. In later stages of implementation, staff focused on sustainability and formal program closeout, which prompted reflection about personal and programmatic accomplishments. CONCLUSIONS: Demonstration sites evolved through common developmental stages during screening implementation. The findings elucidate ways to target technical assistance to move programs more efficiently along their maturation trajectory. In practical terms, the time and cost associated with guiding a program to maturity may potentially be shortened to maximize return on investment for both organizations and the clients receiving service benefits. |
Variation in electronic health record adoption and readiness for meaningful use: 2008-2011
Patel V , Jamoom E , Hsiao CJ , Furukawa MF , Buntin M . J Gen Intern Med 2013 28 (7) 957-64 BACKGROUND: Federal initiatives are underway that provide physicians with financial incentives for meaningful use (MU) of electronic health records (EHRs) and assistance to purchase and implement EHRs. OBJECTIVE: We sought to examine readiness and interest in MU among primary care physicians and specialists, and to identify factors that may affect their readiness to obtain MU incentives. DESIGN/PARTICIPANTS: We analyzed 4 years of data (2008-2011) from the National Ambulatory Medical Care Survey (NAMCS) Electronic Medical Record (EMR) Supplement, an annual cross-sectional nationally representative survey of non-federally employed office-based physicians. MAIN MEASURES: Survey-weighted EHR adoption rates, potential to meet selected MU criteria, and self-reported intention to apply for MU incentives. We also examined the association between physician and practice characteristics and readiness for MU. KEY RESULTS: The overall sample consisted of 10,889 respondents, with weighted response rates of 62 % (2008), 74 % (2009), 66 % (2010), and 61 % (2011). Primary care physicians' adoption of EHRs with the potential to meet MU nearly doubled from 2009 to 2011 (18 % to 38 %, p < 0.01) and in 2011 was significantly higher than that of specialists (19 %, p < 0.01). In 2011, half of physicians (52 %) expressed an intention to apply for MU incentives; this did not vary by specialty. Multivariate analyses showed that EHR adoption was significantly higher in both 2010 and 2011 than in 2009, and that primary care physicians and physicians working in larger or multi-specialty practices or for HMOs were more likely to adopt EHRs with the potential to meet MU. CONCLUSIONS: Physician EHR adoption rates increased in advance of MU incentive payments. Although interest in MU incentives did not vary by specialty, primary care physicians had significantly higher rates of adopting EHRs with the potential to meet MU. Addressing barriers to EHR adoption, which may vary by specialty, will be important to enhancing coordination of care. |
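The adoption rates quoted here are survey-weighted estimates. The basic estimator is a weighted proportion, sketched below with hypothetical weights and responses; a real NAMCS analysis would also account for the complex survey design when computing standard errors.

```python
# Survey-weighted adoption rate: a weighted proportion. Weights and responses
# below are hypothetical; design-based SEs are omitted for brevity.
import numpy as np

adopted = np.array([1, 0, 1, 1, 0, 0, 1])  # EHR with MU potential? (0/1, hypothetical)
weights = np.array([120, 80, 200, 50, 90, 150, 60], dtype=float)  # survey weights

weighted_rate = np.average(adopted, weights=weights)
print(f"Survey-weighted adoption: {weighted_rate:.1%}")
```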
Excerpt from PHS Guideline for Reducing HIV, HBV and HCV Transmission through Organ Transplantation
Seem DL , Lee I , Umscheid CA , Kuehnert MJ . Am J Transplant 2013 13 (8) 1953-62 The intent of the PHS guideline is to improve organ transplant recipient outcomes by reducing the risk of unexpected HIV, HBV and HCV transmission, while preserving the availability of high-quality organs. An evidence-based approach was used to identify the most relevant studies and reports on which to formulate the recommendations. This excerpt from the guideline comprises (1) the executive summary; (2) 12 criteria for assessment of risk factors for recent HIV, HBV and HCV infection; (3) 34 recommendations on risk assessment (screening) of living and deceased donors; testing of living and deceased donors; informed consent discussion with transplant candidates; testing of recipients pre- and posttransplant; collection and/or storage of donor and recipient specimens; and tracking and reporting of HIV, HBV and HCV; and (4) 20 recommendations for further study. For the PHS guideline in its entirety, including the background, methodology and primary evidence underlying the recommendations, refer to the source document in Public Health Reports, accessible at http://www.publichealthreports.org/issuecontents.cfm?Volume=128&Issue=4. For more in-depth information on the evidence base, including tables of all study-level data, refer to Solid Organ Transplantation and the Probability of Transmitting HIV, HBV or HCV: A Systematic Review to Support an Evidence-Based Guideline, accessible at http://stacks.cdc.gov/view/cdc/12164/. |
Improving public health agency and system performance: fortification for promoting population health and wellness
Monroe JA , Thomas C . Prev Chronic Dis 2013 10 E122 America faces a new frontier in preventing chronic disease. Nearly 80% of the 10,000 people who turn 65 each day have at least 1 chronic health condition, and most have multiple chronic conditions (1). The costs of braving this new world are staggering, especially given the nation’s strained economy and budget cuts that have forced health departments to reduce their workforces, impose furloughs, and cut or eliminate chronic disease programs. Despite these daunting challenges, government public health agencies have opportunities to be the driving force behind improving the nation’s health. | The Office for State, Tribal, Local, and Territorial Support (OSTLTS) at the Centers for Disease Control and Prevention (CDC) was established in 2010 to better position CDC to support health departments. The mission of OSTLTS is to advance US public health agency and system performance, capacity, agility, and resilience (2). OSTLTS recognizes health departments as the front line of prevention and promotes a systems approach that works across sectors, partners, and programs to address the complex community and economic challenges that affect the public’s health. | An important recent advancement in the practice of public health is the establishment of a national public health accreditation program and the Public Health Accreditation Board (PHAB). Funded by OSTLTS and the Robert Wood Johnson Foundation, PHAB seeks to advance the quality and performance of public health departments through national standards, which more than 3,000 health agencies can use to continuously improve performance. Based on the 10 Essential Public Health Services (3), PHAB accreditation offers a means to ensure comprehensive, quality programs across a range of public health areas, including chronic disease prevention. In March 2013, the first 11 public health departments were accredited by PHAB, and many more applications are in process (4). |
Post-disaster reproductive health outcomes
Zotti ME , Williams AM , Robertson M , Horney J , Hsia J . Matern Child Health J 2013 17 (5) 783-96 We examined methodological issues in studies of disaster-related effects on reproductive health outcomes and fertility among women of reproductive age and infants in the United States (US). We conducted a systematic literature review of 1,635 articles and reports published in peer-reviewed journals or by the government from January 1981 through December 2010. We classified the studies by three exposure types: (1) physical exposure to toxicants; (2) psychological trauma; and (3) general exposure to disaster. Fifteen articles met our inclusion criteria concerning research focus and design. Overall, the studies pertained to eight different disasters, with most (n = 6) focused on the World Trade Center attack. Only one study examined pregnancy loss, i.e., the occurrence of spontaneous abortions post-disaster. Most studies focused on associations between disaster and adverse birth outcomes, but two studies pertained only to post-disaster fertility, while another two examined fertility in addition to adverse birth outcomes. In most studies disaster-affected populations were assumed to have experienced psychological trauma, but exposure to trauma was measured in only four studies. Furthermore, the effects of both physical exposure to toxicants and psychological trauma on disaster-affected populations were examined in only one study. Effects on birth outcomes were not consistently demonstrated, and study methodologies varied widely. Even so, these studies suggest an association between disasters and reproductive health and highlight the need for further studies to clarify associations. We postulate that post-disaster surveillance among pregnant women could improve our understanding of the effects of disasters on the reproductive health of US women. |
Psychosocial functioning and depressive symptoms among HIV-positive persons receiving care and treatment in Kenya, Namibia, and Tanzania
Seth P , Kidder D , Pals S , Parent J , Mbatia R , Chesang K , Mbilinyi D , Koech E , Nkingwa M , Katuta F , Ng'ang'a A , Bachanas P . Prev Sci 2013 15 (3) 318-28 In sub-Saharan Africa, the prevalence of depressive symptoms among people living with HIV (PLHIV) is considerably greater than that among members of the general population. It is particularly important to treat depressive symptoms among PLHIV because they have been associated with poorer HIV care-related outcomes. This study describes overall psychosocial functioning and factors associated with depressive symptoms among PLHIV attending HIV care and treatment clinics in Kenya, Namibia, and Tanzania. Eighteen HIV care and treatment clinics (six per country) each enrolled approximately 200 HIV-positive patients (3,538 participants in total) and collected data on patients' physical and mental well-being, medical/health status, and psychosocial functioning. Although the majority of participants did not report clinically significant depressive symptoms (72 %), 28 % reported mild to severe depressive symptoms, with 12 % reporting severe depressive symptoms. Regression models indicated that greater levels of depressive symptoms were associated with: (1) being female, (2) younger age, (3) not being completely adherent to HIV medications, (4) likely dependence on alcohol, (5) disclosure to three or more people (versus one person), (6) experiences of recent violence, (7) less social support, and (8) poorer physical functioning. Participants from Kenya and Namibia reported greater depressive symptoms than those from Tanzania. Approximately 28 % of PLHIV reported clinically significant depressive symptoms. The scale-up of care and treatment services in sub-Saharan Africa provides an opportunity to address psychosocial and mental health needs for PLHIV as part of comprehensive care. |
An international systematic review and meta-analysis of multisession psychosocial interventions compared with educational or minimal interventions on the HIV sex risk behaviors of people who use drugs
Meader N , Semaan S , Halton M , Bhatti H , Chan M , Llewellyn A , Des Jarlais DC . AIDS Behav 2013 17 (6) 1963-78 This systematic review and meta-analysis examines the effectiveness of multisession psychosocial interventions compared with educational interventions and minimal interventions in reducing sexual risk among people who use drugs (51 studies; 19,209 participants). We conducted comprehensive searches (MEDLINE, EMBASE, CINAHL, Cochrane Central Register of Controlled Trials, and PsycINFO; 1998-2012). Outcomes (unprotected sex, condom use, or a composite outcome) were extracted by two authors and synthesized using meta-analysis. Subgroup analyses and meta-regression were conducted to explore heterogeneity. Multisession psychosocial interventions had modest additional benefits compared with educational interventions (K = 46; OR 0.86; 95% CI 0.77, 0.96) and large positive effects compared with minimal interventions (K = 7; OR 0.60; 95% CI 0.46, 0.78). Comparison with previous meta-analyses suggested limited progress in recent years in developing more effective interventions. Multisession psychosocial and educational interventions provided similarly modest sexual risk reduction, justifying the offer of educational interventions in settings with limited exposure to sexual risk reduction interventions, messages, and resources. |
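The pooled odds ratios here come from standard meta-analytic machinery. Below is a minimal fixed-effect inverse-variance pooling of log odds ratios with study results that are entirely hypothetical; the review itself may have used random-effects models and other refinements.

```python
# Fixed-effect inverse-variance pooling of log odds ratios.
# Study ORs and 95% CIs below are hypothetical, not from the review.
import numpy as np

or_ = np.array([0.90, 0.80, 0.75])   # study odds ratios
lo = np.array([0.70, 0.60, 0.50])    # 95% CI lower bounds
hi = np.array([1.15, 1.05, 1.10])    # 95% CI upper bounds

log_or = np.log(or_)
se = (np.log(hi) - np.log(lo)) / (2 * 1.96)  # back out SE from the 95% CI
w = 1 / se**2                                # inverse-variance weights

pooled = (w * log_or).sum() / w.sum()
pooled_se = np.sqrt(1 / w.sum())
ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
print(f"Pooled OR {np.exp(pooled):.2f} (95% CI {ci[0]:.2f}, {ci[1]:.2f})")
```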
Patient-reported recall of smoking cessation interventions from a health professional
King BA , Dube SR , Babb SD , McAfee TA . Prev Med 2013 57 (5) 715-7 OBJECTIVE: To determine the prevalence and characteristics of current cigarette smokers who report receiving health care provider interventions ('5A's': ask, advise, assess, assist, arrange) for smoking cessation. METHODS: Data came from the 2009-2010 National Adult Tobacco Survey, a telephone survey of United States adults aged ≥18 years. Among current cigarette smokers who reported visiting a health professional in the past year (n=16,542), estimates were calculated overall and by sex, age, race/ethnicity, education, income, health insurance coverage, and sexual orientation. RESULTS: Among smokers who visited a health professional (75.2%), 87.9% were asked if they used tobacco, 65.8% were advised to quit, and 42.6% were asked if they wanted to quit. Among those wanting to quit, 78.2% were offered assistance and 17.5% had follow-up arranged. Receipt of the 'ask' component was lower among males and uninsured individuals. Receipt of the 'advise' and 'assess' components was lower among those aged 18-24 and uninsured individuals. Receipt of the 'assist' component was lower among non-Hispanic blacks. No differences were observed for the 'arrange' component. CONCLUSIONS: Many current smokers report receiving health care provider interventions for smoking cessation. Continued efforts to educate, encourage, and support all health professionals to provide effective, comprehensive tobacco cessation interventions to their patients may be beneficial. |
Perceptions about the harm of secondhand smoke exposure among U.S. middle and high school students: findings from the 2012 National Youth Tobacco Survey
King BA , Dube SR , Babb SD . Tob Induc Dis 2013 11 (1) 16 BACKGROUND: Increased knowledge of the harmful effects of secondhand smoke (SHS) is an evidence-based key indicator for eliminating nonsmokers' exposure to SHS. This study assessed the prevalence and predictors of perceptions about the harm of SHS exposure among U.S. middle and high school students. FINDINGS: Data were obtained from the 2012 National Youth Tobacco Survey, a nationally representative school-based survey of U.S. students in grades 6-12. Respondents who reported that they thought breathing smoke from other people's cigarettes or other tobacco products causes "some" or "a lot" of harm were considered to have the perception that SHS is harmful. Multivariate logistic regression was used to identify predictors of the perception that SHS is harmful. Predictors included sex, race/ethnicity, school grade level, current tobacco use, and whether the respondent currently lived with a tobacco user. Overall, 75.4% of students perceived SHS exposure as harmful. The adjusted odds of perceiving SHS exposure as harmful were higher among non-Hispanic Asians than among non-Hispanic whites, and among students in 10th-12th grades than among students in 8th grade. Adjusted odds were lower among boys than among girls, among non-Hispanic blacks than among non-Hispanic whites, among students living with a tobacco user than among those not living with one, and among users of combustible tobacco only or of both combustible and non-combustible tobacco than among those who use no tobacco. CONCLUSIONS: Most middle and high school students perceive SHS exposure as harmful, but efforts are needed to increase the prevalence of this perception in certain subpopulations, particularly tobacco users. |
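The adjusted odds reported here come from a multivariate logistic regression. A minimal sketch with simulated data and just two of the predictors named above (sex and living with a tobacco user); the real analysis used the full covariate set and the survey's complex design.

```python
# Logistic regression yielding adjusted odds ratios. Data are simulated;
# survey weights and most covariates are omitted for brevity.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
male = rng.integers(0, 2, n)
lives_with_user = rng.integers(0, 2, n)

# simulate "perceives SHS as harmful" with lower odds for both predictors
logit = 1.4 - 0.4 * male - 0.6 * lives_with_user
harmful = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(np.column_stack([male, lives_with_user]))
fit = sm.Logit(harmful, X).fit(disp=False)
print(np.exp(fit.params))  # adjusted ORs: intercept, male, lives-with-user
```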
An exogenous retrovirus isolated from koalas with malignant neoplasias in a US zoo
Xu W , Stadler CK , Gorman K , Jensen N , Kim D , Zheng H , Tang S , Switzer WM , Pye GW , Eiden MV . Proc Natl Acad Sci U S A 2013 110 (28) 11547-52 Leukemia and lymphoma account for more than 60% of deaths in captive koalas (Phascolarctos cinereus) in northeastern Australia. Although the endogenizing gammaretrovirus koala endogenous retrovirus (KoRV) was isolated from these koalas, KoRV has not been definitively associated with leukemogenesis. We performed KoRV screening in koalas from the San Diego Zoo, maintained for more than 45 y with very limited outbreeding, and the Los Angeles Zoo, maintained by continuously assimilating captive-born Australian koalas. San Diego Zoo koalas are currently free of malignant neoplasias and were infected with only endogenous KoRV, which we now term subtype "KoRV-A," whereas Los Angeles Zoo koalas with lymphomas/leukemias are infected in addition to KoRV-A by a unique KoRV we term subtype "KoRV-B." KoRV-B is most divergent in the envelope protein and uses a host receptor distinct from KoRV-A. KoRV-B also has duplicated enhancer regions in the LTR associated with increased pathology in gammaretroviruses. Whereas KoRV-A uses the sodium-dependent phosphate transporter 1 (PiT1) as a receptor, KoRV-B employs a different receptor, the thiamine transporter 1 (THTR1), to infect cells. KoRV-B is transmitted from dam to offspring through de novo infection, rather than via genetic inheritance like KoRV-A. Detection of KoRV-B in native Australian koalas should provide a history, and a mode for remediation, of leukemia/lymphoma currently endemic in this population. |
A high diversity of Eurasian lineage low pathogenicity avian influenza A viruses circulate among wild birds sampled in Egypt
Gerloff NA , Jones J , Simpson N , Balish A , Elbadry MA , Baghat V , Rusev I , de Mattos CC , de Mattos CA , Zonkle LE , Kis Z , Davis CT , Yingst S , Cornelius C , Soliman A , Mohareb E , Klimov A , Donis RO . PLoS One 2013 8 (7) e68522 Surveillance for influenza A viruses in wild birds has increased substantially as part of efforts to control the global movement of highly pathogenic avian influenza A (H5N1) virus. Studies conducted in Egypt from 2003 to 2007 to monitor birds for H5N1 identified multiple subtypes of low pathogenicity avian influenza A viruses, isolated primarily from migratory waterfowl collected in the Nile Delta. Phylogenetic analysis of 28 viral genomes was performed to estimate their nearest ancestors and identify possible reassortants. Migratory flyway patterns were included in the analysis to assess gene flow between overlapping flyways. Overall, the viruses were most closely related to Eurasian, African, and/or Central Asian lineage low pathogenicity viruses and belonged to 15 different subtypes. A subset of the internal genes appeared to originate from specific flyways (Black Sea-Mediterranean, East African-West Asian). The remaining genes were derived from a mixture of viruses broadly distributed across as many as 4 different flyways, suggesting the importance of the Nile Delta for virus dispersal. Molecular clock date estimates suggested that the time to the nearest common ancestor of all viruses analyzed ranged from 5 to 10 years, indicating frequent genetic exchange with viruses sampled elsewhere. The intersection of multiple migratory bird flyways and the resulting diversity of influenza virus gene lineages in the Nile Delta create conditions favoring reassortment, as is evident from the gene constellations identified by this study. In conclusion, we present the first comprehensive phylogenetic analysis of full-genome sequences from low pathogenicity avian influenza viruses circulating in Egypt. The findings underscore the significance of the region for viral reassortment and the potential emergence of novel avian influenza A viruses, and they document a highly diverse influenza A virus gene pool that merits continued monitoring. |
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Disease Reservoirs and Vectors
- Drug Safety
- Environmental Health
- Epidemiology and Surveillance
- Food Safety
- Genetics and Genomics
- Global Health
- Health Behavior and Risk
- Health Communication and Education
- Healthcare Associated Infections
- Immunity and Immunization
- Laboratory Sciences
- Maternal and Child Health
- Nutritional Sciences
- Occupational Safety and Health
- Parasitic Diseases
- Program Evaluation
- Public Health Leadership and Management
- Public Health, General
- Reproductive Health
- Social and Behavioral Sciences
- Substance Use and Abuse
- Veterinary Medicine
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.