Acute chest syndrome is associated with single nucleotide polymorphism-defined beta globin cluster haplotype in children with sickle cell anaemia.
Bean CJ , Boulet SL , Yang G , Payne AB , Ghaji N , Pyle ME , Hooper WC , Bhatnagar P , Keefer J , Barron-Casella EA , Casella JF , Debaun MR . Br J Haematol 2013 163 (2) 268-76 Genetic diversity at the human beta-globin locus has been implicated as a modifier of sickle cell anaemia (SCA) severity. However, haplotypes defined by restriction fragment length polymorphism sites across the beta-globin locus have not been consistently associated with clinical phenotypes. To define the genetic structure at the beta-globin locus more thoroughly, we performed high-density single nucleotide polymorphism (SNP) mapping in 820 children who were homozygous for the sickle cell mutation (HbSS). Genotyping results revealed very high linkage disequilibrium across a large region spanning the locus control region and the HBB (beta-globin gene) cluster. We identified three predominant haplotypes accounting for 96% of the betaS-carrying chromosomes in this population that could be distinguished using a minimal set of common SNPs. Consistent with previous studies, fetal haemoglobin level was significantly associated with betaS-haplotypes. After controlling for covariates, an association was detected between haplotype and rate of hospitalization for acute chest syndrome (ACS) (incidence rate ratio 0.51, 95% confidence interval 0.29-0.89) but not incidence rate of vaso-occlusive pain or presence of silent cerebral infarct (SCI). Our results suggest that these SNP-defined betaS-haplotypes may be associated with ACS, but not pain or SCI, in a study population of children with SCA. |
Utilization of epidermal growth factor receptor (EGFR) testing in the United States: a case study of T3 translational research.
Lynch JA , Khoury MJ , Borzecki A , Cromwell J , Hayman LL , Ponte PR , Miller GA , Lathan CS . Genet Med 2013 15 (8) 630-8 PURPOSE: We examined hospital use of the epidermal growth factor receptor assay in patients with lung cancer in the United States. Our goal was to inform the development of a model to predict phase 3 translation of guideline-directed molecular diagnostic tests. METHODS: This was a retrospective observational study. Using logistic regression, we analyzed the association between hospitals' institutional and regional characteristics and the likelihood that an epidermal growth factor receptor assay would be ordered. RESULTS: Significant institutional predictors included affiliation with an academic medical center (odds ratio, 1.48; 95% confidence interval, 1.20-1.83), participation in a National Cancer Institute clinical research cooperative group (odds ratio, 2.06, 1.66-2.55), and availability of positron emission tomography scan (odds ratio, 1.44, 1.07-1.94) and cardiothoracic surgery (odds ratio, 1.90, 1.52-2.37) services. Significant regional predictors included metropolitan county (odds ratio, 2.08, 1.48-2.91), population with above-average education (odds ratio, 1.46, 1.09-1.96), and population with above-average income (odds ratio, 1.46, 1.04-2.05). Distance from a National Cancer Institute cancer center was a negative predictor (odds ratio, 0.996, 0.995-0.998), with a 34% decrease in likelihood for every 100 miles. CONCLUSION: In 2010, only 12% of US acute-care hospitals ordered the epidermal growth factor receptor assay, suggesting that most patients with lung cancer did not have access to this test. This case study illustrated the need for: (i) increased dissemination and implementation research, and (ii) interventions to improve adoption of guideline-directed molecular diagnostic tests by community hospitals. |
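The distance effect reported above follows from compounding the per-mile odds ratio over 100 miles. A quick arithmetic check, using only the odds ratio given in the abstract, approximately reproduces the stated decrease:

```python
# Per-mile odds ratio for EGFR assay ordering, as reported in the abstract.
or_per_mile = 0.996

# Odds ratios compound multiplicatively with distance: 0.996^100 per 100 miles.
or_per_100_miles = or_per_mile ** 100
percent_decrease = (1 - or_per_100_miles) * 100

print(f"OR per 100 miles: {or_per_100_miles:.3f}")
print(f"Decrease per 100 miles: {percent_decrease:.0f}%")
```

This yields roughly a 33% decrease per 100 miles; the small gap from the reported 34% is plausibly due to rounding of the published per-mile odds ratio (95% CI 0.995-0.998).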
Use of genomic panels to determine risk of developing type 2 diabetes in the general population: a targeted evidence-based review.
Palomaki GE , Melillo S , Marrone M , Douglas MP . Genet Med 2013 15 (8) 600-11 This evidence review addresses whether type 2 diabetes genomic risk panels improve health outcomes (e.g., reduce rates of developing type 2 diabetes) in low- or high-risk adults; two clinical scenarios promulgated by commercial companies offering such testing. Evidence for the analytic validity of available genomic profiles was inadequate. Clinical validity ranged from inadequate to convincing for 30 variants identified on five type 2 diabetes genomic panels and by genome-wide association studies. Eight common variants were identified for general population use; evidence credibility based on published criteria was strong for two variants, moderate for two variants, and weak for four variants. TCF7L2 had the largest per-allele odds ratio of 1.39 (95% confidence interval 1.33-1.46). Models combining the best four, best eight, and all 30 variants used summary effect sizes, reported genotype frequencies, and assumed independent effects. Areas under the curve were 0.547, 0.551, and 0.570, respectively. In high-risk populations, per-allele odds ratios for TCF7L2 alone were similar to those of the general population. TCF7L2, in combination with other variants, yielded minimal improvement in risk reclassification. Evidence on TCF7L2 clinical validity was adequate. Three studies addressed the clinical utility of intervention effectiveness, stratified by TCF7L2 genotype; none found significant interactions. Clinical utility evidence was inadequate. In addition to analytic validity and clinical utility knowledge gaps, additional gaps were identified regarding how to inform, produce, and evaluate models combining multiple variants. |
The relationship between county-level contextual characteristics and use of diabetes care services
Luo H , Beckles GL , Zhang X , Sotnikov S , Thompson T , Bardenheier B . J Public Health Manag Pract 2013 20 (4) 401-10 OBJECTIVES: To examine the relationship between county-level measures of social determinants and use of preventive care among US adults with diagnosed diabetes, and to inform future diabetes prevention strategies. METHODS: Data are from the Behavioral Risk Factor Surveillance System (BRFSS) 2004 and 2005 surveys, the National Diabetes Surveillance System, and the Area Resource File. Use of diabetes care services was defined by self-reported receipt of 7 preventive care services. Our study sample included 46 806 respondents with self-reported diagnosed diabetes. Multilevel models were run to assess the association between county-level characteristics and receipt of each of the 7 preventive diabetes care services after controlling for characteristics of individuals. Results were considered significant if P < .05. RESULTS: Controlling for individual-level characteristics, our analyses showed that 7 of the 8 county-level factors examined were significantly associated with use of 1 or more preventive diabetes care services. For example, people with diabetes living in a county with a high uninsurance rate were less likely to have an influenza vaccination, visit a doctor for diabetes care, have an A1c test, or have a foot examination; people with diabetes living in a county with a high physician density were more likely to have an A1c test, foot examination, or an eye examination; and people with diabetes living in a county with more people with less than high-school education were less likely to have influenza vaccination, pneumococcal vaccination, or self-care education (all P < .05). CONCLUSIONS: Many of the county-level factors examined in this study were found to be significantly associated with use of preventive diabetes care services. County policy makers may need to consider local circumstances to address the disparities in use of these services. |
School-based programs aimed at the prevention and treatment of obesity: evidence-based interventions for youth in Latin America
Lobelo F , Garcia de Quevedo I , Holub CK , Nagle BJ , Arredondo EM , Barquera S , Elder JP . J Sch Health 2013 83 (9) 668-77 BACKGROUND: Rapidly rising childhood obesity rates constitute a public health priority in Latin America, which makes it imperative to develop evidence-based strategies. Schools are a promising setting, but to date it is unclear how many school-based obesity interventions have been documented in Latin America and what level of evidence can be gathered from such interventions. METHODS: We performed a systematic review of papers published between 1965 and December 2010. Interventions were considered eligible if they had a school-based component, were done in Latin America, evaluated an obesity-related outcome (body mass index [BMI], weight, %body fat, waist circumference, BMI z-score), and compared youth exposed vs not exposed. RESULTS: Ten studies were identified as having a school-based component. Most interventions had a sample of normal and overweight children. The most successful interventions focused on prevention rather than treatment, had longer follow-ups, a multidisciplinary team, and fewer limitations in execution. Three prevention and 2 treatment interventions found sufficient improvements in obesity-related outcomes. CONCLUSIONS: We found sufficient evidence to recommend school-based interventions to prevent obesity among youth in Latin America. Evidence-based interventions in the school setting should be promoted as an important component for integrated programs, policies, and monitoring frameworks designed to reverse childhood obesity in the region. |
Longitudinal nasopharyngeal carriage and antibiotic resistance of respiratory bacteria in Indigenous Australian and Alaska Native children with bronchiectasis
Hare KM , Singleton RJ , Grimwood K , Valery PC , Cheng AC , Morris PS , Leach AJ , Smith-Vaughan HC , Chatfield M , Redding G , Reasonover AL , McCallum GB , Chikoyak L , McDonald MI , Brown N , Torzillo PJ , Chang AB . PLoS One 2013 8 (8) e70478 BACKGROUND: Indigenous children in Australia and Alaska have very high rates of chronic suppurative lung disease (CSLD)/bronchiectasis. Antibiotics, including frequent or long-term azithromycin in Australia and short-term beta-lactam therapy in both countries, are often prescribed to treat these patients. In the Bronchiectasis Observational Study we examined over several years the nasopharyngeal carriage and antibiotic resistance of respiratory bacteria in these two PCV7-vaccinated populations. METHODS: Indigenous children aged 0.5-8.9 years with CSLD/bronchiectasis from remote Australia (n = 79) and Alaska (n = 41) were enrolled in a prospective cohort study during 2004-8. At scheduled study visits until 2010, antibiotic use in the preceding 2 weeks was recorded and nasopharyngeal swabs collected for culture and antimicrobial susceptibility testing. Analysis of respiratory bacterial carriage and antibiotic resistance was by baseline and final swabs, and total swabs by year. RESULTS: Streptococcus pneumoniae carriage changed little over time. In contrast, carriage of Haemophilus influenzae declined and Staphylococcus aureus increased (from 0% in 2005-6 to 23% in 2010 in Alaskan children); these changes were associated with increasing age. Moraxella catarrhalis carriage declined significantly in Australian, but not Alaskan, children (from 64% in 2004-6 to 11% in 2010). While beta-lactam antibiotic use was similar in the two cohorts, Australian children received more azithromycin. Macrolide resistance was significantly higher in Australian compared to Alaskan children, while H. influenzae beta-lactam resistance was higher in Alaskan children. Azithromycin use coincided significantly with reduced carriage of S. pneumoniae, H. influenzae and M. catarrhalis, but increased carriage of S. aureus and macrolide-resistant strains of S. pneumoniae and S. aureus (proportion of carriers and all swabs), in a 'cumulative dose-response' relationship. CONCLUSIONS: Over time, similar (possibly age-related) changes in nasopharyngeal bacterial carriage were observed in Australian and Alaskan children with CSLD/bronchiectasis. However, there were also significant frequency-dependent differences in carriage and antibiotic resistance that coincided with azithromycin use. |
Neutrophilic bacterial meningitis: pathology and etiologic diagnosis of fatal cases
Guarner J , Liu L , Bhatnagar J , Jones T , Patel M , Deleon-Carnes M , Zaki SR . Mod Pathol 2013 26 (8) 1076-85 The frequency of fatalities due to acute bacterial meningitis has decreased significantly due to vaccinations, early diagnoses, and treatments. We studied brain tissues of patients with fatal neutrophilic meningitis referred to the Centers for Disease Control for etiologic diagnosis from 2000-2009 to highlight aspects of the disease that may be preventable or treatable. Demographic, clinical, and laboratory data were extracted from records. Of 117 cases in the database with a diagnosis of meningitis or meningoencephalitis, 39 had neutrophilic inflammation in the meninges. Inflammatory cells infiltrated the superficial cortex in 16 of 39 (41%) cases. Bacteria were found using Gram and bacterial silver stains in 72% of cases, immunohistochemistry in 69% (including two cases where the meningococcus was found outside the meninges), and PCR in 74%. Streptococcus pneumoniae was the cause of the meningitis in 14 patients and Neisseria meningitidis in 9. In addition, Streptococcus spp. were found to be the cause in six cases, while Staphylococcus aureus, Staphylococcus spp., Enterococcus spp., and Fusobacterium were the cause of one case each. There were six cases in which no specific etiological agent could be determined. The mean age of the patients with S. pneumoniae was 39 years (range 0-65), with N. meningitidis was 19 years (range 7-51), whereas that for all others was 31 years (range 0-68). In summary, our study shows that S. pneumoniae continues to be the most frequent cause of fatal neutrophilic bacterial meningitis followed by N. meningitidis, both vaccine preventable diseases. |
Hypospadias and maternal intake of phytoestrogens
Carmichael SL , Cogswell ME , Ma C , Gonzalez-Feliciano A , Olney RS , Correa A , Shaw GM . Am J Epidemiol 2013 178 (3) 434-40 Experimental data indicate that gestational exposures to estrogenic compounds impact risk of hypospadias. We examined whether risk of hypospadias (i.e., a congenital malformation in which the opening of the penile urethra occurs on the ventral side of the penis) was associated with maternal intake of phytoestrogens, given their potential impact on estrogen metabolism. The analysis included data on mothers of 1,250 hypospadias cases and 3,118 controls who delivered their infants from 1997 to 2005 and participated in the National Birth Defects Prevention Study, a multistate, population-based, case-control study. After adjustment for several covariates, high intakes of daidzein, genistein, glycitein, secoisolariciresinol, total isoflavones, total lignans, and total phytoestrogens were associated with reduced risks; odds ratios comparing intakes ≥90th percentile with intakes between the 11th and 89th percentiles ranged from 0.6 to 0.8. For example, the odds ratio for total phytoestrogen intake was 0.7 (95% confidence interval: 0.5, 1.0). This study represents the first large-scale analysis of phytoestrogen intake and hypospadias. The observed associations merit investigation in additional populations before firm conclusions can be reached. |
The Association of depressive symptoms and pulmonary function in healthy adults
Ochs-Balcom HM , Lainhart W , Mnatsakanova A , Charles LE , Violanti JM , Andrew ME , Freudenheim JL , Muti P , Trevisan M , Burchfiel CM , Schunemann HJ . Psychosom Med 2013 75 (8) 737-43 OBJECTIVE: Chronic lung disease is exacerbated by comorbid psychiatric issues, and treatment of depression may improve disease symptoms. We sought to add to the literature as to whether depression is associated with pulmonary function in healthy adults. METHODS: In 2551 healthy adults from New York State, we studied the association of depression, measured by the Center for Epidemiologic Studies Depression (CES-D) scale score, with forced expiratory volume in 1 second (FEV1) and forced vital capacity (FVC) using general linear models and a cross-sectional design. RESULTS: We identified statistically significant inverse trends in FEV1, FVC, FEV1%, and FVC% by CES-D category, especially in ever-smokers and men. When adjusted for covariates, the difference in FEV1 and FEV1% for smokers with more than 18.5 lifetime pack-years, from the lowest CES-D category (scores 0-3) to the highest (16 or more, depressed), is approximately 0.25 L and 5.0% (adjusted p values for trend are <.001 and .019, respectively). In men, we also observed statistically significant inverse trends in pulmonary function with increasing CES-D. CONCLUSIONS: We identified an inverse association of depressive symptoms and pulmonary function in healthy adults, especially in men and individuals with a heavy smoking history. Further studies of these associations are essential for the development and tailoring of interventions for the prevention and treatment of chronic lung disease. |
Chronic obstructive pulmonary disease prevalence among nonsmokers by occupation in the United States
Bang KM , Syamlal G , Mazurek JM , Wassell JT . J Occup Environ Med 2013 55 (9) 1021-6 OBJECTIVE: To examine the prevalence of chronic obstructive pulmonary disease (COPD) among nonsmokers by occupation in the United States. METHODS: The 1997 to 2004 National Health Interview Survey data for working adults aged 25 years or more were used to estimate the COPD prevalence and to examine change in COPD prevalence between 1997 to 2000 and 2001 to 2004 by occupational groups. RESULTS: During 1997 to 2004, COPD prevalence was 2.8%. The COPD prevalence was highest in financial records processing (4.6%) occupations. There was a slight increase in COPD prevalence during the two survey periods from 2.8% during 1997 to 2000 compared with 2.9% during 2001 to 2004. CONCLUSIONS: No significant changes in the COPD prevalence between the two periods were found. Nevertheless, the elevated COPD prevalence in certain occupational groups suggests that other risk factors play a role in developing COPD. |
Search strategy has influenced the discovery rate of human viruses
Rosenberg R , Johansson MA , Powers AM , Miller BR . Proc Natl Acad Sci U S A 2013 110 (34) 13961-4 A widely held concern is that the pace of infectious disease emergence has been increasing. We have analyzed the rate of discovery of pathogenic viruses, the preeminent source of newly discovered causes of human disease, from 1897 through 2010. The rate was highest during 1950-1969, after which it moderated. This general picture masks two distinct trends: for arthropod-borne viruses, which comprised 39% of pathogenic viruses, the discovery rate peaked at three per year during 1960-1969, but subsequently fell nearly to zero by 1980; however, the rate of discovery of nonarboviruses remained stable at about two per year from 1950 through 2010. The period of highest arbovirus discovery coincided with a comprehensive program supported by The Rockefeller Foundation of isolating viruses from humans, animals, and arthropod vectors at field stations in Latin America, Africa, and India. The productivity of this strategy illustrates the importance of location, approach, long-term commitment, and sponsorship in the discovery of emerging pathogens. |
Trends in HIV prevalence and HIV testing among young MSM: five United States cities, 1994-2011
Oster AM , Johnson CH , Le BC , Balaji AB , Finlayson TJ , Lansky A , Mermin J , Valleroy L , Mackellar D , Behel S , Paz-Bailey G . AIDS Behav 2013 18 Suppl 3 S237-47 We examined trends in cross-sectional HIV prevalence (a surrogate for incidence) and past 12 month testing behavior among young men who have sex with men (MSM). The Young Men's Survey and the National HIV Behavioral Surveillance System conducted interviews and HIV testing among MSM recruited by venue-based sampling during 1994-2011. Using data from five cities, we determined whether interview year was associated with HIV prevalence and recent testing for MSM ages 18-22 and 23-29 years, after adjusting for city, race/ethnicity, and education. Multivariable analysis demonstrated an overall increase in prevalence among MSM ages 23-29 years, driven by an increase in Baltimore. There was no change in HIV prevalence among MSM ages 18-22 years overall, although prevalence increased in Baltimore. HIV testing increased significantly for both age groups. Gains in HIV testing are encouraging, but increasing prevalence among young MSM in Baltimore suggests increasing incidence and the need for additional efforts for this population. |
Leptospirosis and human immunodeficiency virus co-infection among febrile inpatients in northern Tanzania
Biggs HM , Galloway RL , Bui DM , Morrissey AB , Maro VP , Crump JA . Vector Borne Zoonotic Dis 2013 13 (8) 572-80 BACKGROUND: Leptospirosis and human immunodeficiency virus (HIV) infection are prevalent in many areas, including northern Tanzania, yet little is known about their interaction. METHODS: We enrolled febrile inpatients at two hospitals in Moshi, Tanzania, over 1 year and performed HIV antibody testing and the microscopic agglutination test (MAT) for leptospirosis. Confirmed leptospirosis was defined as ≥four-fold rise in MAT titer between acute and convalescent serum samples, and probable leptospirosis was defined as any reciprocal MAT titer ≥800. RESULTS: Confirmed or probable leptospirosis was found in 70 (8.4%) of 831 participants with at least one serum sample tested. A total of 823 (99.0%) of 831 participants had HIV testing performed, and 203 (24.7%) were HIV infected. Among HIV-infected participants, 9 (4.4%) of 203 had confirmed or probable leptospirosis, whereas among HIV-uninfected participants 61 (9.8%) of 620 had leptospirosis. Leptospirosis was less prevalent among HIV-infected as compared to HIV-uninfected participants [odds ratio (OR) 0.43, p=0.019]. Among those with leptospirosis, HIV-infected patients more commonly presented with features of severe sepsis syndrome than HIV-uninfected patients, but differences were not statistically significant. Among HIV-infected patients, severe immunosuppression was not significantly different between those with and without leptospirosis (p=0.476). Among HIV-infected adolescents and adults, median CD4 percent and median CD4 count were higher among those with leptospirosis as compared to those with other etiologies of febrile illness (p=0.015 and p=0.089, respectively), although the difference in CD4 count did not reach statistical significance. CONCLUSIONS: Among febrile inpatients in northern Tanzania, leptospirosis was not more prevalent among HIV-infected patients. 
Although some indicators of leptospirosis severity were more common among HIV-infected patients, a statistically significant difference was not demonstrated. Among HIV-infected patients, those with leptospirosis were not more immunosuppressed relative to those with other etiologies of febrile illness. |
'Let Us Protect Our Future' a culturally congruent evidence-based HIV/STD risk-reduction intervention for young South African adolescents
Jemmott LS , Jemmott JB 3rd , Ngwane Z , Icard L , O'Leary A , Gueits L , Brawner B . Health Educ Res 2013 29 (1) 166-81 One of the worst HIV/AIDS epidemics in the world is occurring in South Africa, where heterosexual exposure is the main mode of HIV transmission. Young people 15-24 years of age, particularly women, account for a large share of new infections. Accordingly, there is an urgent need for behavior-change interventions to reduce the incidence of HIV among adolescents in South Africa. However, there are few such interventions with proven efficacy for South African adolescents, especially young adolescents. A recent cluster-randomized controlled trial of the 'Let Us Protect Our Future!' HIV/STD risk-reduction intervention for Grade 6 South African adolescents (mean age = 12.4 years) found significant decreases in self-reported sexual risk behaviors compared with a control intervention. This article describes the intervention, the use of social cognitive theory and the reasoned action approach to develop the intervention, how formative research informed its development, and the acceptability of the intervention. Challenges in designing and implementing HIV/STD risk-reduction interventions for young adolescents in sub-Saharan Africa are discussed. |
Linking time-varying symptomatology and intensity of infectiousness to patterns of norovirus transmission
Zelner JL , Lopman BA , Hall AJ , Ballesteros S , Grenfell BT . PLoS One 2013 8 (7) e68413 BACKGROUND: Norovirus (NoV) transmission may be impacted by changes in symptom intensity. Sudden onset of vomiting, which may cause an initial period of hyper-infectiousness, often marks the beginning of symptoms. This is often followed by: a 1-3 day period of milder symptoms, environmental contamination following vomiting, and post-symptomatic shedding that may result in transmission at progressively lower rates. Existing models have not included time-varying infectiousness, though representing these features could add utility to models of NoV transmission. METHODS: We address this by comparing the fit of three models (Models 1-3) of NoV infection to household transmission data from a 2009 point-source outbreak of GII.12 norovirus in North Carolina. Model 1 is an SEIR compartmental model, modified to allow Gamma-distributed sojourn times in the latent and infectious classes, where symptomatic cases are uniformly infectious over time. Model 2 assumes infectiousness decays exponentially as a function of time since onset, while Model 3 is discontinuous, with a spike concentrating 50% of transmissibility at onset. We use Bayesian data augmentation techniques to estimate transmission parameters for each model, and compare their goodness of fit using qualitative and quantitative model comparison. We also assess the robustness of our findings to asymptomatic infections. RESULTS: We find that Model 3 (initial spike in shedding) best explains the household transmission data, using both quantitative and qualitative model comparisons. We also show that these results are robust to the presence of asymptomatic infections. CONCLUSIONS: Explicitly representing explosive NoV infectiousness at onset should be considered when developing models and interventions to interrupt and prevent outbreaks of norovirus in the community. 
The methods presented here are generally applicable to the transmission of pathogens that exhibit large variation in transmissibility over an infection. |
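The three candidate infectiousness profiles compared above can be sketched as simple hazard curves. The functional forms follow the abstract's descriptions (uniform; exponential decay; a 50% spike at onset); the durations, decay rate, and spike width below are illustrative assumptions, not the study's fitted parameters. Each profile is normalized so total transmissibility integrates to 1:

```python
import math

def model1_uniform(t, duration=3.0):
    """Model 1: infectiousness constant over the symptomatic period (assumed 3 days)."""
    return 1.0 / duration if 0.0 <= t <= duration else 0.0

def model2_exponential(t, rate=1.0):
    """Model 2: infectiousness decays exponentially with time since onset (assumed rate 1/day)."""
    return rate * math.exp(-rate * t) if t >= 0.0 else 0.0

def model3_spike(t, duration=3.0, spike_frac=0.5, spike_width=0.1):
    """Model 3: 50% of total transmissibility concentrated in a brief window at onset
    (assumed 0.1 days wide), with the remainder spread uniformly over the symptomatic period."""
    if t < 0.0 or t > duration:
        return 0.0
    base = (1.0 - spike_frac) / duration
    if t < spike_width:
        return base + spike_frac / spike_width
    return base

# At symptom onset the spike model is many times more infectious than the uniform model,
# capturing the hyper-infectious vomiting episode described in the abstract.
print(model3_spike(0.0) / model1_uniform(0.0))
```

With these illustrative parameters, the onset-time hazard under Model 3 is about 15 times that of Model 1, even though both models carry the same total transmissibility.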
Neurologic complications of influenza in children
Chaves SS . Contemp Pediatr 2013 30 (8) 26-37 Influenza-associated neurologic complications in children are rare, but can be severe. Familiarity with the clinical presentation and frequency of specific neurologic findings can help with early diagnosis and treatment. |
Influenza A (H1N1) 2009 monovalent and seasonal influenza vaccination among adults 25 to 64 years of age with high-risk conditions-United States, 2010
Lu PJ , Gonzalez-Feliciano A , Ding H , Bryan LN , Yankey D , Monsell EA , Greby SM , Euler GL . Am J Infect Control 2013 41 (8) 702-9 BACKGROUND: Seasonal influenza vaccination has been routinely recommended for adults with high-risk conditions. The Advisory Committee on Immunization Practices recommended that persons 25 to 64 years of age with high-risk conditions be one of the initial target groups to receive H1N1 vaccination during the 2009-2010 season. METHODS: We used data from the 2009-2010 Behavioral Risk Factor Surveillance System survey. Vaccination levels of H1N1 and seasonal influenza vaccination among respondents 25 to 64 years with high-risk conditions were assessed. Multivariable logistic regression models were performed to identify factors independently associated with vaccination. RESULTS: Overall, 24.8% of adults 25 to 64 years of age were identified to have high-risk conditions. Among adults 25 to 64 years of age with high-risk conditions, H1N1 and seasonal vaccination coverage were 26.3% and 47.6%, respectively. Characteristics independently associated with an increased likelihood of H1N1 vaccination were as follows: higher age; Hispanic race/ethnicity; medical insurance; ability to see a doctor if needed; having a primary doctor; a routine checkup in the previous year; not being a current smoker; and having high-risk conditions other than asthma, diabetes, and heart disease. Characteristics independently associated with seasonal influenza vaccination were similar compared with factors associated with H1N1 vaccination. CONCLUSION: Immunization programs should work with provider organizations to review efforts made to reach adults with high-risk conditions during the recent pandemic and assess how and where they can increase vaccination coverage during future pandemics. |
Influenza vaccination among health care personnel in California: 2010-2011 influenza season
Lee SJ , Harrison R , Rosenberg J , McLendon P , Boston E , Lindley MC . Am J Infect Control 2013 41 (8) e65-71 BACKGROUND: Influenza vaccination among health care personnel (HCP) is a key measure to prevent influenza infection and transmission in health care settings. This study described influenza vaccination coverage among employees in various health care settings in California and examined factors associated with HCP influenza vaccination. METHODS: This study analyzed data from 111 facilities recruited through statewide invitation. Data on facility characteristics, vaccination programs, and vaccination receipt within and outside facilities were collected using Web-based questionnaires. Employees were defined as all persons in the facility payroll system regardless of patient contact. Facility-level employee vaccination coverage was calculated for 91 facilities. RESULTS: The mean employee influenza vaccination coverage was 60.7% overall: 64.0% for acute care hospitals (n = 30), 54.7% for long-term care facilities (n = 22), 59.4% for ambulatory surgery centers (n = 8), 58.6% for dialysis centers (n = 25), and 77.2% for physician practices (n = 6). Vaccination promotion methods such as risk-benefit education, personal reminders, and vaccination data tracking and feedback were significantly associated with increased vaccination coverage. CONCLUSION: The study findings suggest some variations in HCP vaccination coverage by type of health care setting as well as substantial challenges in reaching the Healthy People 2020 goal of 90%. Health care facilities need to use comprehensive promotion methods to improve HCP influenza vaccinations. |
Anti-tuberculosis drug resistance among new and previously treated sputum smear-positive tuberculosis patients in Uganda: results of the first national survey
Lukoye D , Adatu F , Musisi K , Kasule GW , Were W , Odeke R , Kalamya JN , Awor A , Date A , Joloba ML . PLoS One 2013 8 (8) e70763 BACKGROUND: Multidrug resistant and extensively drug resistant tuberculosis (TB) have become major threats to control of tuberculosis globally. The rates of anti-TB drug resistance in Uganda are not known. We conducted a national drug resistance survey to investigate the levels and patterns of resistance to first and second line anti-TB drugs among new and previously treated sputum smear-positive TB cases. METHODS: Sputum samples were collected from a nationally representative sample of new and previously treated sputum smear-positive TB patients registered at TB diagnostic centers during December 2009 to February 2011 using a weighted cluster sampling method. Culture and drug susceptibility testing was performed at the national TB reference laboratory. RESULTS: A total of 1537 patients (1397 new and 140 previously treated) were enrolled in the survey from 44 health facilities. HIV test result and complete drug susceptibility testing (DST) results were available for 1524 (96.8%) and 1325 (85.9%) patients, respectively. Of the 1209 isolates from new cases, resistance to any anti-TB drug was 10.3%, 5% were resistant to isoniazid, 1.9% to rifampicin, and 1.4% were multidrug resistant. Among the 116 isolates from previously treated cases, the corresponding figures were 25.9% (any drug), 23.3% (isoniazid), 12.1% (rifampicin), and 12.1% (multidrug resistant). Of the 1524 patients who had HIV testing 469 (30.7%) tested positive. There was no association between anti-TB drug resistance (including MDR) and HIV infection. CONCLUSION: The prevalence of anti-TB drug resistance among new patients in Uganda is low relative to WHO estimates. The higher levels of MDR-TB (12.1%) and resistance to any drug (25.3%) among previously treated patients raises concerns about the quality of directly observed therapy (DOT) and adherence to treatment. 
This calls for strengthening existing TB control measures, especially DOT, routine DST among the previously treated TB patients or periodic drug resistance surveys, to prevent and monitor development and transmission of drug resistant TB. |
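The survey's headline percentages are point estimates from a weighted cluster sample. As a crude, unweighted illustration of how such a prevalence and its 95% confidence interval can be computed, the sketch below uses a Wilson score interval; the case count of 125 is back-calculated from the reported 10.3% of 1209 new-case isolates and is an assumption, not a figure from the paper:

```python
import math

def prevalence_ci(cases, n, z=1.96):
    """Point prevalence with a Wilson score 95% confidence interval
    (unweighted; the survey itself used weighted cluster estimates)."""
    p = cases / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return p, centre - half, centre + half

# Any-drug resistance among new cases: 125/1209, matching the reported 10.3%.
p, lo, hi = prevalence_ci(125, 1209)
print(f"prevalence {p:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```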
Clinical trials provide the evidence critical for patient empowerment
De Cock KM , El-Sadr WM . J Int AIDS Soc 2013 16 (1) 18811 In this issue of the Journal, Delva et al. discuss in a Viewpoint our Perspective article published in the New England Journal of Medicine, in which we argue for the urgent need for a clinical trial on when to initiate antiretroviral therapy (ART) in HIV-infected patients in sub-Saharan Africa [1]. The authors posit that there is currently sufficient evidence to make informed decisions regarding this issue and that, consequently, individual patients' autonomy should be the key factor in determining the timing of ART initiation. | As readers can review our Perspective article, we will not repeat arguments concerning the lack of definitive evidence to guide ART initiation, the limited evidence from observational studies, or the limitations of ongoing studies assessing the timing of ART initiation in African settings. Nonetheless, a few points that directly relate to the Viewpoint authors' arguments are important to address. | With regard to the observational studies the authors cite as evidence in support of early initiation of ART, two of the articles do not provide information relevant to the question of early versus deferred ART [2, 3], and the other two focus on the use of ART in individuals with early or acute HIV infection [4, 5]. Recently, Sabin et al. carefully reviewed the available observational studies of early versus deferred ART initiation and highlighted the inconsistent estimates of benefit (particularly with regard to mortality), the modest effect sizes noted with early ART use, and the risk of confounding inherent in observational studies [6]. |
Condom use and human papillomavirus in men
Hariri S , Warner L . J Infect Dis 2013 208 (3) 367-9 In this issue of the Journal, an article by Pierce Campbell et al [1] adds new information to the largely inconsistent body of observational studies on the protective effect of condom use against human papillomavirus (HPV) infection. Using data from the HPV Infection in Men (HIM) study [2], a multinational cohort study of the natural history of anogenital HPV in men, the authors examined the effect of self-reported condom use on the incidence and duration of penile HPV infection. Their results suggest reductions in HPV acquisition and duration of HPV infection for some men who reported consistent condom use. | In the HIM study, participants completed a physical examination every 6 months over a 4-year period. At each visit, participants provided DNA specimens and completed a self-administered questionnaire about recent sexual behaviors, including condom use since the prior visit. For this analysis, the authors divided the original cohort into 4 categories based on the risk of HPV exposure, as determined by participants’ self-reported sexual behavior. The categories, in order of decreasing exposure risk, were no steady sex partner; nonmonogamous, nonsteady sex partner; nonmonogamous, steady sex partner; and monogamous. Condom use in the previous 6 months was defined at a single time point (the baseline visit) and assessed at 3 levels (always, sometimes, and never). This baseline measure was extrapolated to represent condom use during the entire follow-up period. HPV infection was classified into 3 categories on the basis of HPV type: any HPV type, oncogenic HPV types, and nononcogenic HPV types. 
In the highest HPV exposure risk category (ie, men with no steady sex partners), those who reported always using condoms in the 6 months before study entry were about 50% less likely to become newly infected with any HPV types in the 12-month follow-up period, compared with men who never used condoms (adjusted hazard ratio [HR], 0.54; 95% confidence interval [CI], .31–.95). Similar reductions were reported when HPV types were stratified by oncogenic risk, but these associations did not reach statistical significance. No significant associations were observed between condom use and HPV incidence in the other 3 HPV exposure risk groups. Evaluating duration of infection, the authors found a faster rate of oncogenic HPV clearance only in the group of men who reported always using condoms with their nonsteady sex partners, compared with men who reported never using condoms (adjusted HR, 1.29; 95% CI, 1.03–1.61). Condom use did not impact the duration of infection in the other 3 HPV exposure risk groups, including the group with no steady sex partners, the same group for which consistent use decreased HPV incidence. |
Reaching men who have sex with men: a comparison of respondent-driven sampling and time-location sampling in Guatemala City
Paz-Bailey G , Miller W , Shiraishi RW , Jacobson JO , Abimbola TO , Chen SY . AIDS Behav 2013 17 (9) 3081-90 We present a comparison of respondent-driven sampling (RDS) and time-location sampling (TLS) for behavioral surveillance studies among men who have sex with men (MSM). In 2010, we conducted two simultaneous studies using TLS (N = 609) and RDS (N = 507) in Guatemala City. Differences in the characteristics of the populations reached, based on weighted estimates, as well as the time and cost of recruitment are presented. RDS MSM were marginally more likely to self-report as heterosexual, less likely to disclose their sexual orientation to family members, and more likely to report sex with women than TLS MSM. Although RDS MSM were less likely than TLS MSM to report ≥2 non-commercial male partners, they were more likely to report selling sex in the past 12 months. The cost per participant was $89 for RDS and $121 for TLS. Our results suggest that RDS reached a more hidden sub-population of non-gay-identifying MSM than TLS and had a lower implementation cost. |
Physical and mental health status of Iraqi refugees resettled in the United States
Taylor EM , Yanni EA , Pezzi C , Guterbock M , Rothney E , Harton E , Montour J , Elias C , Burke H . J Immigr Minor Health 2013 16 (6) 1130-7 We conducted a survey among Iraqi refugees resettled in the United States to assess their physical and mental health status and their healthcare access and utilization following the initial 8-month, post-arrival period. We randomly selected Iraqi refugees who were ≥18 years of age, had lived in the United States for 8-36 months, and were residents of Michigan, California, Texas, or Idaho. Participants completed a household questionnaire and mental health assessment. We distributed 366 surveys. Seventy-five percent of participants had health insurance at the time of the survey; 43 % reported delaying or not seeking care for a medical problem in the past year. Sixty percent of participants reported one chronic condition; 37 % reported ≥2 conditions. Emotional distress, anxiety, and depression were each reported by approximately 50 % of participants; 31 % were at risk for post-traumatic stress disorder. Iraqi refugees in this evaluation reported a high prevalence of chronic conditions and mental health symptoms despite relatively high access to healthcare. It is important for resettlement partners to be aware of the distinctive health concerns of this population to best address needs within this community. |
Profiling cholinesterase adduction: a high-throughput prioritization method for organophosphate exposure samples
Carter MD , Crow BS , Pantazides BG , Watson CM , Decastro BR , Thomas JD , Blake TA , Johnson RC . J Biomol Screen 2013 19 (2) 325-30 A high-throughput prioritization method was developed for use with a validated confirmatory method detecting organophosphorus nerve agent exposure by immunomagnetic separation high-performance liquid chromatography tandem mass spectrometry. A ballistic gradient was incorporated into this analytical method to profile unadducted butyrylcholinesterase (BChE) in clinical samples. With a Z' factor (Zhang et al.) of 0.88 +/- 0.01 (SD) for control analytes and a Z factor of 0.25 +/- 0.06 (SD) for serum samples, the assay is rated an "excellent assay" for the synthetic peptide controls used and a "doable assay" when used to prioritize clinical samples. Hits, defined as samples containing BChE Ser-198 adducts or no BChE present, were analyzed in a confirmatory method for identification and quantitation of the BChE adduct, if present. The ability to prioritize samples by highest exposure for confirmatory analysis is of particular importance in an exposure to cholinesterase inhibitors such as organophosphorus nerve agents, in which a large number of clinical samples may be collected. In an initial blind screen, 67 of 70 samples were accurately identified, giving an assay accuracy of 96% with no false negatives. The method is the first to provide a high-throughput prioritization assay for profiling adduction of Ser-198 BChE in clinical samples. |
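The "excellent" and "doable" ratings follow Zhang et al.'s Z-factor criterion for screening-window quality (0.5 < Z' ≤ 1 is "excellent"; 0 < Z' ≤ 0.5 is "doable"). A minimal sketch of the statistic, using hypothetical positive- and negative-control signal values rather than the paper's data:

```python
import statistics

def z_prime(pos, neg):
    """Zhang et al.'s Z'-factor: 1 - 3*(sd_pos + sd_neg)/|mean_pos - mean_neg|.
    0.5 < Z' <= 1 -> 'excellent assay'; 0 < Z' <= 0.5 -> 'doable assay'."""
    sp, sn = statistics.stdev(pos), statistics.stdev(neg)
    mp, mn = statistics.mean(pos), statistics.mean(neg)
    return 1 - 3 * (sp + sn) / abs(mp - mn)

# Hypothetical plate-control signals (arbitrary units), for illustration only.
pos = [100.2, 99.8, 101.1, 100.5, 99.4]
neg = [10.3, 9.7, 10.8, 9.9, 10.1]
print(f"Z' = {z_prime(pos, neg):.2f}")
```

A wide separation between control means relative to their spread drives Z' toward 1, which is why tight synthetic peptide controls can rate "excellent" while noisier serum samples rate only "doable".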
Urinary bisphenol A concentrations and cytochrome P450 19 A1 (Cyp19) gene expression in ovarian granulosa cells: an in vivo human study.
Ehrlich S , Williams PL , Hauser R , Missmer SA , Peretz J , Calafat AM , Flaws JA . Reprod Toxicol 2013 42C 18-23 BACKGROUND: Exposure to bisphenol A (BPA), a chemical widely used in consumer products, has been associated with in vitro Cyp19 gene expression. OBJECTIVE: To evaluate an in vivo human model of Cyp19 gene expression in granulosa cells. STUDY DESIGN: A subset of an ongoing prospective cohort study of women undergoing in vitro fertilization (IVF) at Massachusetts General Hospital. METHODS: Mixed effect models were used to evaluate the association of urinary BPA concentrations with granulosa cell Cyp19 mRNA expression. RESULTS: In 61 women undergoing 76 IVF cycles, adjusted changes in mean Cyp19 expression (beta estimate (95% CI)) for quartiles 2, 3 and 4 as compared to the lowest quartile were: -0.97 (-2.22, 0.28); -0.97 (-2.18, 0.24) and -0.38 (-1.58, 0.82). CONCLUSIONS: An in vivo model for evaluation of Cyp19 gene expression was developed for use in epidemiologic studies. In this pilot study, we found no statistically significant linear association between urinary BPA concentrations and Cyp19 expression. |
Nonfatal choking on food among children 14 years or younger in the United States, 2001-2009
Chapin Meyli M , Rochette Lynne M , Annest Joseph L , Haileyesus Tadesse , Conner Kristen A , Smith Gary A . Pediatrics 2013 132 (2) 275-81 OBJECTIVE: The objective of this study was to investigate the epidemiology of nonfatal choking on food among US children. METHODS: Using a nationally representative sample, nonfatal pediatric choking-related emergency department (ED) visits involving food for 2001 through 2009 were analyzed by using data from the National Electronic Injury Surveillance System-All Injury Program. Narratives abstracted from the medical record were reviewed to identify choking cases and the types of food involved. RESULTS: An estimated 111 914 (95% confidence interval: 83 975-139 854) children ages 0 to 14 years were treated in US hospital EDs from 2001 through 2009 for nonfatal food-related choking, yielding an average of 12 435 children annually and a rate of 20.4 (95% confidence interval: 15.4-25.3) visits per 100 000 population. The mean age of children treated for nonfatal food-related choking was 4.5 years. Children aged ≤1 year accounted for 37.8% of cases, and male children accounted for more than one-half (55.4%) of cases. Of all food types, hard candy was most frequently (15.5% [16 168 cases]) associated with choking, followed by other candy (12.8% [13 324]), meat (12.2% [12 671]), and bone (12.0% [12 496]). Most patients (87.3% [97 509]) were treated and released, but 10.0% (11 218) were hospitalized, and 2.6% (2911) left against medical advice. CONCLUSIONS: This is the first nationally representative study to focus solely on nonfatal pediatric food-related choking treated in US EDs over a multiyear period. Improved surveillance, food labeling and redesign, and public education are strategies that can help reduce pediatric choking on food. |
Foodborne outbreak of group A Streptococcus pharyngitis associated with a high school dance team banquet--Minnesota, 2012
Kemble SK , Westbrook A , Lynfield R , Bogard A , Koktavy N , Gall K , Lappi V , Devries AS , Kaplan E , Smith KE . Clin Infect Dis 2013 57 (5) 648-54 BACKGROUND: On 20 March 2012, the Minnesota Department of Health (MDH) was notified of multiple Facebook postings suggestive of a foodborne outbreak of Group A Streptococcus (GAS) pharyngitis occurring among attendees of a high school dance team banquet. An investigation was initiated. METHODS: Associations between GAS pharyngitis and specific food items were assessed among banquet attendees. Pharyngeal swabs were performed on attendees, household contacts, and food workers. Patient GAS isolates from clinical laboratories were also obtained. Pharyngeal and food specimens were cultured for GAS by the MDH Public Health Laboratory. Isolates were further characterized by pulsed-field gel electrophoresis (PFGE) and emm typing. RESULTS: Among 63 persons who consumed banquet food, 18 primary illnesses occurred, yielding an attack rate of 29%. Although no food or beverage items were significantly associated with illness, pasta consumption yielded the highest relative risk (risk ratio, 3.56; 95% confidence interval, .25-50.6). GAS colonies with indistinguishable PFGE patterns corresponding to emm subtype 1.0 were isolated from 5 patients and from leftover pasta. The pasta was prepared at home by a dance team member parent; both parent and child reported GAS pharyngitis episodes 3 weeks before the banquet. CONCLUSIONS: In this foodborne outbreak of GAS pharyngitis, pasta was implicated as the vehicle. Recognition of foodborne GAS illness is challenging because transmission is typically assumed to occur by respiratory spread; foodborne transmission should be considered when clusters of GAS pharyngitis patients are encountered. DNA-based typing can reveal potentially epidemiologically related isolates during GAS disease outbreaks and facilitate understanding and control of GAS disease. |
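The reported attack rate (18/63 = 29%) and food-specific risk ratios are standard cohort-outbreak calculations. In the sketch below, the 2x2 cell counts for pasta are hypothetical values chosen only to be consistent with the abstract's totals (63 attendees, 18 ill, RR near 3.56); the abstract does not report the actual cells:

```python
def attack_rate(ill, exposed):
    """Proportion of exposed persons who became ill."""
    return ill / exposed

def risk_ratio(ill_exp, n_exp, ill_unexp, n_unexp):
    """Crude risk ratio comparing attack rates in exposed vs unexposed."""
    return (ill_exp / n_exp) / (ill_unexp / n_unexp)

# Overall attack rate reported in the abstract: 18 ill of 63 who ate.
print(f"attack rate = {attack_rate(18, 63):.0%}")  # -> 29%

# Hypothetical pasta 2x2: 17/52 ill among eaters, 1/11 among non-eaters.
print(f"RR = {risk_ratio(17, 52, 1, 11):.2f}")
```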
Invasive methicillin-resistant Staphylococcus aureus infections among chronic dialysis patients in the United States, 2005-2011
Nguyen DB , Lessa FC , Belflower R , Mu Y , Wise M , Nadle J , Bamberg WM , Petit S , Ray SM , Harrison LH , Lynfield R , Dumyati G , Thompson J , Schaffner W , Patel PR . Clin Infect Dis 2013 57 (10) 1393-400 BACKGROUND: Approximately 15,700 invasive methicillin-resistant Staphylococcus aureus (MRSA) infections occurred in U.S. dialysis patients in 2010. Frequent hospital visits and prolonged bloodstream access, especially via central venous catheters (CVCs), are risk factors among hemodialysis patients. We describe the epidemiology of and recent trends in invasive MRSA infections among dialysis patients. METHODS: We analyzed population-based data from nine U.S. metropolitan areas during 2005-2011. Cases were defined as MRSA isolated from a normally sterile body site in a surveillance area resident who received dialysis, and were classified as hospital-onset (HO) (culture collected >3 days after hospital admission) or healthcare-associated community-onset (HACO) (all others). Incidence was calculated using denominators from the U.S. Renal Data System. Temporal trends in incidence and national estimates were calculated controlling for age, gender, and race. RESULTS: From 2005 to 2011, 7,489 cases were identified; 85.7% were HACO; 93.2% were bloodstream infections. The incidence of invasive MRSA infections decreased from 6.5 to 4.2 per 100 dialysis patients (annual decrease: 7.3%), with annual decreases of 6.7% for HACO and 10.5% for HO cases. Of cases identified during 2009-2011, 70% were hospitalized in the year prior to infection. Among hemodialysis cases, 60.4% were dialyzed through a CVC. The 2011 national estimated number of MRSA infections was 15,169. CONCLUSIONS: There has been a substantial decrease in the incidence of invasive MRSA infection among dialysis patients. Most cases had previous hospitalizations, suggesting that efforts to control MRSA in hospitals might have contributed to the declines.
Infection prevention measures should include improved vascular access and CVC care. |
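As a rough cross-check of the reported trend (the paper's 7.3% annual decrease comes from a model controlling for age, gender, and race, not from the endpoints alone), the constant compound annual change implied by the two endpoint rates can be computed as:

```python
def annual_pct_change(rate_start, rate_end, years):
    """Constant compound annual change implied by two endpoint rates."""
    return (rate_end / rate_start) ** (1 / years) - 1

# Invasive MRSA among dialysis patients: 6.5 -> 4.2 per 100 over 2005-2011.
change = annual_pct_change(6.5, 4.2, 6)
print(f"implied annual change: {change:.1%}")  # about -7%, near the modeled -7.3%
```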
Medical students' perceptions and knowledge about antimicrobial stewardship: how are we educating our future prescribers?
Abbo LM , Cosgrove SE , Pottinger PS , Pereyra M , Sinkowitz-Cochran R , Srinivasan A , Webb DJ , Hooton TM . Clin Infect Dis 2013 57 (5) 631-8 BACKGROUND: Better understanding of medical students' perceptions, attitudes, and knowledge about antimicrobial prescribing practices could facilitate more effective education of these future prescribers. METHODS: A 24-item electronic survey on antimicrobial prescribing and education was administered to fourth-year medical students at the University of Miami, the Johns Hopkins University, and the University of Washington (January-March 2012). RESULTS: Three hundred seventeen of 519 (61%) students completed the survey; 92% of respondents agreed that strong knowledge of antimicrobials is important in their careers, and 90% said that they would like more education on appropriate use of antimicrobials. Mean correct knowledge score (11 items) was 51%, with statistically significant differences between study sites and sources of information used to learn about antimicrobials. Only 15% had completed a clinical infectious diseases rotation during medical school; those who had done so rated the quality of their antimicrobial education significantly higher compared to those who had not (mean, 3.93 vs 3.44, on a 5-point scale; P = .0003). There were no statistically significant associations between knowledge scores and having had an infectious diseases clinical elective. Only one-third of respondents perceived their preparedness to be adequate in some fundamental principles of antimicrobial use. CONCLUSIONS: Differences exist between medical schools in educational resources used, perceived preparedness, and knowledge about antimicrobial use. Variability in formative education could frame behaviors and prescribing practices in future patient care. 
To help address the growing problem of antimicrobial resistance, efforts should be undertaken to ensure that our future doctors are well educated in the principles and practices of appropriate use of antibiotics and antimicrobial stewardship. |
Incidence Trends in Pathogen-Specific Central Line-Associated Bloodstream Infections in US Intensive Care Units, 1990-2010
Fagan RP , Edwards JR , Park BJ , Fridkin SK , Magill SS . Infect Control Hosp Epidemiol 2013 34 (9) 893-9 OBJECTIVE: To quantify historical trends in rates of central line-associated bloodstream infections (CLABSIs) in US intensive care units (ICUs) caused by major pathogen groups, including Candida spp., Enterococcus spp., specified gram-negative rods, and Staphylococcus aureus. DESIGN: Active surveillance in a cohort of participating ICUs reporting to the Centers for Disease Control and Prevention through the National Nosocomial Infections Surveillance system during 1990-2004 and the National Healthcare Safety Network during 2006-2010. SETTING: ICUs. PARTICIPANTS: Patients admitted to participating ICUs. RESULTS: The CLABSI incidence density rate for S. aureus decreased annually starting in 2002 and remained lower than that for other pathogen groups. Since 2006, the annual decrease for S. aureus CLABSIs in nonpediatric ICU types was -18.3% (95% confidence interval [CI], -20.8% to -15.8%), whereas the incidence density rate for S. aureus among pediatric ICUs did not change. The annual decrease for all ICUs combined since 2006 was -17.8% (95% CI, -19.4% to -16.1%) for Enterococcus spp., -16.4% (95% CI, -18.2% to -14.7%) for gram-negative rods, and -13.5% (95% CI, -15.4% to -11.5%) for Candida spp. CONCLUSIONS: Patterns of ICU CLABSI incidence density rates among major pathogen groups have changed considerably during recent decades. CLABSI incidence has declined steeply since 2006, except for CLABSIs due to S. aureus in pediatric ICUs. There is a need to better understand the CLABSIs that still occur, on the basis of microbiological and patient characteristics. New prevention approaches may be needed in addition to central line insertion and maintenance practices. |
Increasing adolescent immunization rates in primary care: strategies physicians use and would consider implementing
Humiston SG , Serwint JR , Szilagyi PG , Vincelli PA , Dhepyasuwan N , Rand CM , Schaffer SJ , Blumkin AK , Curtis CR . Clin Pediatr (Phila) 2013 52 (8) 710-20 Strategies to increase adolescent immunization rates have been suggested, but little is documented about which strategies clinicians actually use or would consider. In spring 2010, we surveyed primary care physicians from 2 practice-based research networks (PBRNs): Greater Rochester PBRN (GR-PBRN) and national pediatric COntinuity Research NETwork (CORNET). Network clinicians received mailed or online surveys (response rate 76%, n = 148). The GR-PBRN patient population (51% suburban, 33% rural, and 16% urban) differed from that served by CORNET (85% urban). For nonseasonal vaccines recommended for adolescents, many GR-PBRN and CORNET practices reported using nurse prompts to providers at preventive visits (61% and 52%, respectively), physician education (53% and 53%), and scheduled vaccine-only visits (91% and 82%). Strategies not used that clinicians frequently indicated they would consider included patient reminder/recall and prompts to providers via nurses or electronic health records. As preventive visits and immunization recommendations grow more complex, using technology to support immunization delivery to adolescents might be effective. |
Intussusception and rotavirus vaccination - balancing risk against benefit
Parashar UD , Orenstein WA . Clin Infect Dis 2013 57 (10) 1435-7 An association between intussusception, a form of bowel obstruction, and live oral rotavirus vaccines was first identified with Rotashield, a rhesus-human reassortant rotavirus vaccine that was recommended for routine immunization of US infants in 1998 [1]. During the first year after vaccine introduction, a cluster of intussusception cases temporally linked to Rotashield vaccination was reported to the US Vaccine Adverse Event Reporting System (VAERS), a national passive reporting system [2]. This prompted a national case-control study, which confirmed the association between Rotashield and intussusception [3], with the greatest risk (an approximately 37-fold increase) occurring 3–7 days after the first vaccine dose. A smaller increase was seen in the second week after dose 1 and during the first week after dose 2. The excess risk of approximately 1 intussusception case per 10 000 Rotashield recipients led to withdrawal of the vaccine from the US market in 1999 [4]. | Because of the legacy of Rotashield, the 2 other live oral rotavirus vaccines in advanced stages of clinical testing at the time—a pentavalent bovine-human reassortant vaccine (RV5, RotaTeq, Merck) and a monovalent human vaccine (RV1, Rotarix, GSK Biologicals)—each underwent large clinical trials of approximately 60 000–70 000 infants specifically to assess the risk of intussusception [5, 6]. In these trials, no elevated risk was found within 42 days of any of the 3 doses of RV5 or within 30 days of either of the 2 doses of RV1. This facilitated licensure of both products and a recommendation for universal use in the United States and around the world. The World Health Organization (WHO) recommends both RV5 and RV1 for global use and encourages postlicensure monitoring to further assess the intussusception risk during routine programmatic use [7]. |
Human pathogens on plants: designing a multidisciplinary strategy for research
Fletcher J , Leach JE , Eversole K , Tauxe R . Phytopathology 2013 103 (4) 306-15 Recent efforts to address concerns about microbial contamination of food plants and resulting foodborne illness have prompted new collaboration and interactions between the scientific communities of plant pathology and food safety. This article provides perspectives from scientists of both disciplines and presents selected research results and concepts that highlight existing and possible future synergisms for audiences of both disciplines. Plant pathology is a complex discipline that encompasses studies of the dissemination, colonization, and infection of plants by microbes such as bacteria, viruses, fungi, and oomycetes. Plant pathologists study plant diseases as well as host plant defense responses and disease management strategies with the goal of minimizing disease occurrences and impacts. Repeated outbreaks of human illness attributed to the contamination of fresh produce, nuts and seeds, and other plant-derived foods by human enteric pathogens such as Shiga toxin-producing Escherichia coli and Salmonella spp. have led some plant pathologists to broaden the application of their science in the past two decades, to address problems of human pathogens on plants (HPOPs). Food microbiology, which began with the study of microbes that spoil foods and those that are critical to produce food, now also focuses study on how foods become contaminated with pathogens and how this can be controlled or prevented. Thus, at the same time, public health researchers and food microbiologists have become more concerned about plant-microbe interactions before and after harvest. New collaborations are forming between members of the plant pathology and food safety communities, leading to enhanced research capacity and greater understanding of the issues for which research is needed. The two communities use somewhat different vocabularies and conceptual models. 
For example, traditional plant pathology concepts such as the disease triangle and the disease cycle can help to define cross-over issues that also pertain to HPOP research, and can suggest logical strategies for minimizing the risk of microbial contamination. Continued interaction and communication between these two disciplinary communities are essential and can be achieved by the creation of an interdisciplinary research coordination network. We hope that this article, an introduction to the multidisciplinary HPOP arena, will be useful to researchers in many related fields. |
Winter season, frequent hand washing, and irritant patch test reactions to detergents are associated with hand dermatitis in health care workers
Callahan A , Baron E , Fekedulegn D , Kashon M , Yucesoy B , Johnson VJ , Domingo DS , Kirkland B , Luster MI , Nedorost S . Dermatitis 2013 24 (4) 170-5 BACKGROUND: Irritant hand dermatitis (IHD) is common in health care workers. OBJECTIVE: We studied endogenous thresholds for irritant contact dermatitis, assessed by patch testing, and exogenous factors such as season and hand washing for their association with IHD in health care workers. METHODS: Irritant patch testing with sodium lauryl sulfate (SLS), sodium hydroxide, and benzalkonium chloride at varying concentrations was performed in 113 health care workers. Examination for hand dermatitis occurred at 1-month intervals for a period of 6 months in the Midwestern United States. RESULTS: A positive patch test to low-concentration SLS was associated with IHD (P = 0.0310) after adjusting for age, sex, ethnicity, season, history of childhood flexural dermatitis, mean indoor relative humidity, and glove and hand sanitizer usage. Subjects with a positive patch test to SLS were 78% more likely to have occurrence of IHD (incidence rate ratio [IRR] = 1.78; 95% confidence interval [CI], 0.92-3.45). Frequent hand washing (≥10 times a day; IRR = 1.55; 95% CI, 1.01-2.39) and the cold season (IRR = 2.76; 95% CI, 1.35-5.65) were also associated with IHD. No association was found between history of childhood flexural dermatitis and IHD in this population. CONCLUSIONS: Both genetic and environmental factors are important in the etiology of IHD and should be considered in designing strategies to protect, educate, and treat susceptible individuals. |
Office-based physicians are responding to incentives and assistance by adopting and using electronic health records
Hsiao CJ , Jha AK , King J , Patel V , Furukawa MF , Mostashari F . Health Aff (Millwood) 2013 32 (8) 1470-7 Expanding the use of interoperable electronic health record (EHR) systems to improve health care delivery is a national policy priority. We used the 2010-12 National Ambulatory Medical Care Survey-Electronic Health Records Survey to examine which physicians in what types of practices are implementing the systems, and how they are using them. We found that 72 percent of physicians had adopted some type of system and that 40 percent had adopted capabilities required for a basic EHR system. The highest relative increases in adoption were among physicians with historically low adoption levels, including older physicians and those working in solo practices or community health centers. As of 2012, physicians in rural areas had higher rates of adoption than those in large urban areas, and physicians in counties with high rates of poverty had rates of adoption comparable to those in areas with less poverty. However, small practices continued to lag behind larger practices. Finally, the majority of physicians who adopted the EHR capabilities required to obtain federal financial incentives used the capabilities routinely, with few differences across physician groups. |
Modeling a methylmalonic acid-derived change point for serum vitamin B-12 for adults in NHANES
Bailey RL , Durazo-Arvizu RA , Carmel R , Green R , Pfeiffer CM , Sempos CT , Carriquiry A , Yetley EA . Am J Clin Nutr 2013 98 (2) 460-7 BACKGROUND: No consensus exists about which cutoff point should be applied to serum vitamin B-12 (SB-12) concentrations to define vitamin B-12 status in population-based research. OBJECTIVE: The study's aim was to identify whether a change point exists at which the relation between plasma methylmalonic acid (MMA) and SB-12 changes slope, to differentiate between inadequate and adequate vitamin B-12 status, by using various statistical models. DESIGN: We used data on adults (≥19 y; n = 12,683) from NHANES 1999-2004, a nationally representative, cross-sectional survey. We evaluated 6 piecewise polynomial and exponential decay models that used different control levels for known covariates. RESULTS: The MMA-defined change point for SB-12 varied depending on the statistical model used. A linear-splines model, selected by an approximate permutation test, best fit the data; it yielded 3 slopes relating SB-12 and MMA, resulting in 2 change points and 3 subgroups. The first group (SB-12 <126 pmol/L) was small and had the highest MMA concentration (median: 281 nmol/L; 95% CI: 245, 366 nmol/L; n = 157, 1.2%); many in this group could be considered at high risk of severe deficiency because combined abnormalities of MMA and homocysteine were very frequent and the concentrations themselves were significantly higher. The highest SB-12 group (SB-12 >287 pmol/L; n = 8569, 67.6%) likely had adequate vitamin B-12 status (median MMA: 120 nmol/L; 95% CI: 119, 125 nmol/L). The vitamin B-12 status of the sizable intermediate group (n = 3957, 33%) was difficult to interpret. CONCLUSIONS: The 3 distinct slopes for the relation between SB-12 and MMA challenge the conventional use of one cutoff point for classifying vitamin B-12 status.
In epidemiologic research, the use of one cutoff point would fail to separate the small, severely deficient group from the intermediate group that has neither normal nor clearly deficient vitamin B-12 concentrations (ie, unknown vitamin B-12 status). This intermediate group requires further characterization. |
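The three-slope structure described above corresponds to a continuous linear spline with two knots. The sketch below fits such a spline to synthetic data; the knots are fixed at the abstract's change points (126 and 287 pmol/L), all data values are simulated for illustration (not NHANES data), and the paper itself selected its model via an approximate permutation test:

```python
import numpy as np

def fit_two_knot_spline(x, y, k1, k2):
    """Least-squares continuous linear spline with two fixed knots (3 slopes).

    Basis: intercept, x, (x - k1)+, (x - k2)+; the fitted curve changes
    slope at each knot, mirroring the 3-slope structure described above.
    """
    X = np.column_stack([
        np.ones_like(x),
        x,
        np.clip(x - k1, 0.0, None),
        np.clip(x - k2, 0.0, None),
    ])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Synthetic SB-12 (pmol/L) vs MMA (nmol/L) with true slopes -1.5, -0.3, 0.0.
rng = np.random.default_rng(0)
sb12 = rng.uniform(50.0, 600.0, 500)
mma = (400.0 - 1.5 * sb12
       + 1.2 * np.clip(sb12 - 126.0, 0.0, None)
       + 0.3 * np.clip(sb12 - 287.0, 0.0, None)
       + rng.normal(0.0, 5.0, 500))
beta = fit_two_knot_spline(sb12, mma, 126.0, 287.0)
print("slopes:", beta[1], beta[1] + beta[2], beta[1] + beta[2] + beta[3])
```

In practice the knot locations would themselves be estimated (e.g., by profiling the residual sum of squares over candidate knots), which is what makes formal change-point selection harder than this fixed-knot fit.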
The contribution of mixed dishes to vegetable intake among US children and adolescents
Branum AM , Rossen LM . Public Health Nutr 2013 17 (9) 1-8 OBJECTIVE: To describe the contribution of mixed dishes to vegetable consumption and to estimate vegetable intake according to specific types of vegetables and other foods among US children and adolescents. DESIGN: The 2003-2008 National Health and Nutrition Examination Survey (NHANES), a nationally representative probability survey conducted in the USA. SETTING: Civilian non-institutionalized US population. SUBJECTS: All children and adolescents aged 2-18 years who met eligibility criteria (n 9169). RESULTS: Approximately 59 % of total vegetable intake came from whole forms of vegetables with 41 % coming from a mixed dish. White potatoes (10.7 (se 0.6) %), fried potatoes (10.2 (se 0.4) %), potato chips (8.6 (se 0.5) %) and other vegetables (9.2 (se 0.5) %) accounted for most vegetables in their whole forms, whereas pasta dishes (9.5 (se 0.4) %), chilli/soups/stews (7.0 (se 0.5) %), pizza/calzones (7.6 (se 0.3) %) and other foods (13.7 (se 0.6) %) accounted for most mixed dishes. Usual mean vegetable intake was 1.02 cup equivalents/d; however, after excluding vegetables from mixed dishes, mean intake fell to 0.54 cup equivalents/d and to 0.32 cup equivalents/d when fried potatoes were further excluded. CONCLUSIONS: Mixed dishes account for nearly half of overall vegetable intake in US children and adolescents. It is critical for future research to examine various components of vegetable intake carefully in order to inform policy and programmatic efforts aimed at improving dietary intake among children and adolescents. |
Work-related knee injuries treated in US emergency departments
Chen Z , Chakrabarty S , Levine RS , Aliyu MH , Ding T , Jackson LL . J Occup Environ Med 2013 55 (9) 1091-9 OBJECTIVE: To characterize work-related knee injuries treated in US emergency departments (EDs). METHODS: We characterized work-related knee injuries treated in EDs in 2007 and examined trends from 1998 to 2007 by using the National Electronic Injury Surveillance System-occupational supplement. RESULTS: In 2007, 184,300 (+/-54,000; 95% confidence interval) occupational knee injuries were treated in US EDs, accounting for 5% of the 3.4 (+/-0.9) million ED-treated occupational injuries. The ED-treated knee injury rate was 13 (+/-4) injuries per 10,000 full-time equivalent workers. Younger workers and older female workers had high rates. Strains/sprains and contusions/abrasions were common-frequently resulting from falls and bodily reaction/overexertion events. Knee injury rates declined from 1998 through 2007. CONCLUSIONS: Knee injury prevention should emphasize reducing falls and bodily reaction/overexertion events, particularly among all youth and older women. |
A multi-stakeholder perspective on the use of alternative test strategies for nanomaterial safety assessment
Nel AE , Nasser E , Godwin H , Avery D , Bahadori T , Bergeson L , Beryt E , Bonner JC , Boverhof D , Carter J , Castranova V , Deshazo JR , Hussain SM , Kane AB , Klaessig F , Kuempel E , Lafranconi M , Landsiedel R , Malloy T , Miller MB , Morris J , Moss K , Oberdorster G , Pinkerton K , Pleus RC , Shatkin JA , Thomas R , Tolaymat T , Wang A , Wong J . ACS Nano 2013 7 (8) 6422-33 There has been a conceptual shift in toxicological studies from describing what happens to explaining how the adverse outcome occurs, thereby enabling a deeper and improved understanding of how biomolecular and mechanistic profiling can inform hazard identification and improve risk assessment. Compared to traditional toxicology methods, which have a heavy reliance on animals, new approaches to generate toxicological data are becoming available for the safety assessment of chemicals, including high-throughput and high-content screening (HTS, HCS). With the emergence of nanotechnology, the exponential increase in the total number of engineered nanomaterials (ENMs) in research, development, and commercialization requires a robust scientific approach to screen ENM safety in humans and the environment rapidly and efficiently. Spurred by the developments in chemical testing, a promising new toxicological paradigm for ENMs is to use alternative test strategies (ATS), which reduce reliance on animal testing through the use of in vitro and in silico methods such as HTS, HCS, and computational modeling. Furthermore, this allows for the comparative analysis of large numbers of ENMs simultaneously and for hazard assessment at various stages of the product development process and overall life cycle. Using carbon nanotubes as a case study, a workshop bringing together national and international leaders from government, industry, and academia was convened at the University of California, Los Angeles, to discuss the utility of ATS for decision-making analyses of ENMs. 
After lively discussions, a short list of generally shared viewpoints on this topic was generated, including a general view that ATS approaches for ENMs can significantly benefit chemical safety analysis. |
Multi-walled carbon nanotubes induce human microvascular endothelial cellular effects in an alveolar-capillary co-culture with small airway epithelial cells
Snyder-Talkington BN , Schwegler-Berry D , Castranova V , Qian Y , Guo NL . Part Fibre Toxicol 2013 10 35 BACKGROUND: Nanotechnology, particularly the use of multi-walled carbon nanotubes (MWCNT), is a rapidly growing discipline with implications for advancement in a variety of fields. A major route of exposure to MWCNT during both occupational and environmental contact is inhalation. While many studies showed adverse effects to the vascular endothelium upon MWCNT exposure, in vitro results often do not correlate with in vivo effects. This study aimed to determine if an alveolar-capillary co-culture model could determine changes in the vascular endothelium after epithelial exposure to MWCNT. METHODS: A co-culture system in which both human small airway epithelial cells and human microvascular endothelial cells were separated by a Transwell membrane so as to resemble an alveolar-capillary interaction was used. Following exposure of the epithelial layer to MWCNT, the effects to the endothelial barrier were determined. RESULTS: Exposure of the epithelial layer to MWCNT induced multiple changes in the endothelial cell barrier, including an increase in reactive oxygen species, actin rearrangement, loss of VE-cadherin at the cell surface, and an increase in endothelial angiogenic ability. Overall increases in secreted VEGFA, sICAM-1, and sVCAM-1 protein levels, as well as increases in intracellular phospho-NF-kappaB, phospho-Stat3, and phospho-p38 MAPK, were also noted in HMVEC after epithelial exposure. CONCLUSION: The co-culture system identified that alveolar-capillary exposure to MWCNT induced multiple changes to the underlying endothelium, potentially through cell signaling mediators derived from MWCNT-exposed epithelial cells. Therefore, the co-culture system appears to be a relevant in vitro method to study the pulmonary toxicity of MWCNT. |
Personal and workplace psychosocial risk factors for carpal tunnel syndrome: a pooled study cohort
Harris-Adamson C , Eisen EA , Dale AM , Evanoff B , Hegmann KT , Thiese MS , Kapellusch JM , Garg A , Burt S , Bao S , Silverstein B , Gerr F , Merlino L , Rempel D . Occup Environ Med 2013 70 (8) 529-37 BACKGROUND: Between 2001 and 2010, six research groups conducted coordinated multiyear, prospective studies of carpal tunnel syndrome (CTS) incidence in US workers from various industries and collected detailed subject-level exposure information with follow-up symptom, physical examination, electrophysiological measures and job changes. OBJECTIVE: This analysis of the pooled cohort examined the incidence of dominant-hand CTS in relation to demographic characteristics and estimated associations with occupational psychosocial factors and years worked, adjusting for confounding by personal risk factors. METHODS: 3515 participants, without baseline CTS, were followed for up to 7 years. Case criteria included symptoms and an electrodiagnostic study consistent with CTS. Adjusted HRs were estimated in Cox proportional hazard models. Workplace biomechanical factors were collected but not evaluated in this analysis. RESULTS: Women were at elevated risk for CTS (HR=1.30; 95% CI 0.98 to 1.72), and the incidence of CTS increased linearly with both age and body mass index (BMI) over most of the observed range. High job strain increased risk (HR=1.86; 95% CI 1.11 to 3.14), and social support was protective (HR=0.54; 95% CI 0.31 to 0.95). There was an inverse relationship with years worked among recent hires with the highest incidence in the first 3.5 years of work (HR=3.08; 95% CI 1.55 to 6.12). CONCLUSIONS: Personal factors associated with an increased risk of developing CTS were BMI, age and being a woman. The workplace risk factor was high job strain, while social support was protective. The inverse relationship between CTS incidence and years worked among recent hires suggests the presence of a healthy worker survivor effect in the cohort. |
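A hazard ratio from a Cox model is conceptually related to a ratio of incidence rates between groups. As a hedged illustration only (this is not the authors' covariate-adjusted Cox analysis, and the event counts and person-time below are hypothetical), a crude incidence rate ratio with a Poisson-approximation 95% confidence interval can be computed as:

```python
import math

def rate_ratio(events_a, person_time_a, events_b, person_time_b):
    """Crude incidence rate ratio (group A vs. group B) with a
    Poisson-approximation 95% CI on the log scale."""
    irr = (events_a / person_time_a) / (events_b / person_time_b)
    se_log = math.sqrt(1.0 / events_a + 1.0 / events_b)  # SE of log(IRR)
    lo = irr * math.exp(-1.96 * se_log)
    hi = irr * math.exp(1.96 * se_log)
    return irr, lo, hi

# Hypothetical groups: high job strain vs. low job strain
irr, lo, hi = rate_ratio(30, 2000.0, 15, 2000.0)
```

Unlike this crude contrast, the pooled-cohort analysis adjusted for personal risk factors (age, BMI, sex) in proportional hazards models.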
Popcorn flavoring effects on reactivity of rat airways in vivo and in vitro
Zaccone EJ , Thompson JA , Ponnoth DS , Cumpston AM , Goldsmith WT , Jackson MC , Kashon ML , Frazer DG , Hubbs AF , Shimko MJ , Fedan JS . J Toxicol Environ Health A 2013 76 (11) 669-89 Popcorn workers' lung is an obstructive pulmonary disease produced by inhalation of volatile artificial butter flavorings. In rats, inhalation of diacetyl, a major component of butter flavoring, and inhalation of a diacetyl substitute, 2,3-pentanedione, produce similar damage to airway epithelium. The effects of diacetyl and 2,3-pentanedione and mixtures of diacetyl, acetic acid, and acetoin, all components of butter flavoring, on pulmonary function and airway reactivity to methacholine (MCh) were investigated. Lung resistance (RL) and dynamic compliance (Cdyn) were negligibly changed 18 h after a 6-h inhalation exposure to diacetyl or 2,3-pentanedione (100-360 ppm). Reactivity to MCh was not markedly changed after diacetyl, but was modestly decreased after 2,3-pentanedione inhalation. Inhaled diacetyl exerted essentially no effect on reactivity to mucosally applied MCh, but 2,3-pentanedione (320 and 360 ppm) increased reactivity to MCh in the isolated, perfused trachea preparation (IPT). In IPT, diacetyl and 2,3-pentanedione (≥3 mM) applied to the serosal and mucosal surfaces of intact and epithelium-denuded tracheas initiated transient contractions followed by relaxations. Inhaled acetoin (150 ppm) exerted no effect on pulmonary function and airway reactivity in vivo; acetic acid (27 ppm) produced hyperreactivity to MCh; and exposure to diacetyl + acetoin + acetic acid (250 + 150 + 27 ppm) led to a diacetyl-like reduction in reactivity. Data suggest that the effects of 2,3-pentanedione on airway reactivity are greater than those of diacetyl, and that flavorings are airway smooth muscle relaxants and constrictors, thus indicating a complex mechanism. |
Evaluation of respiratory protection programs and practices in California hospitals during the 2009-2010 H1N1 influenza pandemic
Beckman S , Materna B , Goldmacher S , Zipprich J , D'Alessandro M , Novak D , Harrison R . Am J Infect Control 2013 41 (11) 1024-31 BACKGROUND: Emergence of the novel 2009 influenza A H1N1 virus in California led to an evaluation of hospital respiratory protection programs (RPPs) and practices by the California Department of Public Health during the 2009-2010 influenza season. METHODS: Onsite evaluation of 16 hospitals consisted of interviews with managers and health care workers about RPPs and practices, review of written RPPs, and limited observations of personnel using respirators. Data were analyzed using descriptive statistics. RESULTS: All hospitals had implemented policies requiring the minimum use of N95 filtering facepiece respirators when working with patients with H1N1 virus infection; 95.5% of health care workers (n = 199) reported they would wear at least this level of protection when in close contact with a patient with confirmed or suspected H1N1 virus infection. However, evaluation of written RPPs indicated deficiencies in required areas, most commonly in recordkeeping, designation of a program administrator, program evaluation, employee training, and fit testing procedures. CONCLUSIONS: Health care workers were aware of respiratory protection required when providing care for patients with confirmed or suspected H1N1 virus infection. Hospitals should improve written RPPs, fully implement written procedures, and conduct periodic program evaluation to ensure effectiveness of respirator use for health care worker protection. Increased accessibility of resources tailored for hospital respirator program administrators may be helpful. |
Extrapulmonary transport of MWCNT following inhalation exposure
Mercer RR , Scabilloni JF , Hubbs AF , Wang L , Battelli LA , McKinney W , Castranova V , Porter DW . Part Fibre Toxicol 2013 10 (1) 38 BACKGROUND: Inhalation exposure studies of mice were conducted to determine if multi-walled carbon nanotubes (MWCNT) distribute to the tracheobronchial lymphatics, parietal pleura, respiratory musculature and/or extrapulmonary organs. Male C57BL/6 J mice were exposed in a whole-body inhalation system to a 5 mg/m3 MWCNT aerosol for 5 hours/day for 12 days (4 times/week for 3 weeks, lung burden 28.1 ug/lung). At 1 day and 336 days after the 12 day exposure period, mice were anesthetized and lungs, lymph nodes and extrapulmonary tissues were preserved by whole body vascular perfusion of paraformaldehyde while the lungs were inflated with air. Separate, clean-air control groups were studied at 1 day and 336 days post-exposure. Sirius Red stained sections from lung, tracheobronchial lymph nodes, diaphragm, chest wall, heart, brain, kidney and liver were analyzed. Enhanced darkfield microscopy and morphometric methods were used to detect and count MWCNT in tissue sections. Counts in tissue sections were expressed as number of MWCNT per g of tissue and as a percentage of total lung burden (Mean +/- S.E., N = 8 mice per group). MWCNT burden in tracheobronchial lymph nodes was determined separately based on the volume density in the lymph nodes relative to the volume density in the lungs. Field emission scanning electron microscopy (FESEM) was used to examine MWCNT structure in the various tissues. RESULTS: Tracheobronchial lymph nodes were found to contain 1.08 and 7.34 percent of the lung burden at 1 day and 336 days post-exposure, respectively. Although agglomerates account for approximately 54% of lung burden, only singlet MWCNT were observed in the diaphragm, chest wall, liver, kidney, heart and brain. 
At 1 day post-exposure, the average length of singlet MWCNT in liver and kidney was comparable to that of singlet MWCNT in the lungs (8.2 +/- 0.3 versus 7.5 +/- 0.4 um, respectively). On average, there were 15,371 and 109,885 fibers per gram in liver, kidney, heart and brain at 1 day and 336 days post-exposure, respectively. The burden of singlet MWCNT in the lymph nodes, diaphragm, chest wall and extrapulmonary organs at 336 days post-exposure was significantly higher than at 1 day post-exposure. CONCLUSIONS: Inhaled MWCNT, which deposit in the lungs, are transported to the parietal pleura, the respiratory musculature, liver, kidney, heart and brain in a singlet form and accumulate with time following exposure. The tracheobronchial lymph nodes contain high levels of MWCNT following exposure and further accumulate over nearly a year to levels that are a significant fraction of the lung burden 1 day post-exposure. |
Chronic exposure to carbon nanotubes induces invasion of human mesothelial cells through matrix metalloproteinase-2
Lohcharoenkal W , Wang L , Stueckle TA , Dinu CZ , Castranova V , Liu Y , Rojanasakul Y . ACS Nano 2013 7 (9) 7711-23 Malignant mesothelioma is one of the most aggressive forms of cancer known. Recent studies have shown that carbon nanotubes (CNTs) are biopersistent and induce mesothelioma in animals, but the underlying mechanisms are not known. Here, we investigate the effect of long-term exposure to high aspect ratio CNTs on the aggressive behaviors of human pleural mesothelial cells, the primary cellular target of human lung mesothelioma. We show that chronic exposure (4 months) to single- and multiwalled CNTs induced proliferation, migration, and invasion of the cells similar to that observed in asbestos-exposed cells. An up-regulation of several key genes known to be important in cell invasion, notably matrix metalloproteinase-2 (MMP-2), was observed in the exposed mesothelial cells as determined by real-time PCR. Western blot and enzyme activity assays confirmed the increased expression and activity of MMP-2. Whole genome microarray analysis further indicated the importance of MMP-2 in the invasion gene signaling network of the exposed cells. Knockdown of MMP-2 in CNT and asbestos-exposed cells by shRNA-mediated gene silencing effectively inhibited the aggressive phenotypes. This study demonstrates CNT-induced cell invasion and indicates the role of MMP-2 in the process. |
Rapid decline in lung function in coal miners: evidence of disease in small airways
Stansbury RC , Beeckman-Wagner LA , Wang ML , Hogg JP , Petsonk EL . Am J Ind Med 2013 56 (9) 1107-12 BACKGROUND: Coal mine dust exposure can cause both pneumoconiosis and chronic airflow limitation. The contributions of various pathophysiologic mechanisms to dust-related lung function decrements remain unclear. METHODS: Clinical and physiological findings were assessed for 15 underground coal miners who had demonstrated accelerated FEV1 losses (decliners) over 6-18 years. Decliners' findings were evaluated in comparison to a group of 11 miners who had shown relatively stable lung function (referents) during the same period. RESULTS: At follow-up examination, the decliners showed significantly greater mean airway resistance (10.47 vs. 6.78 cmH2O/L/s; P = 0.05) and more air trapping (RV/TLC = 37.5 vs. 29.1%; P < 0.01) compared to the referents. Decliners also demonstrated more evidence of small airways dysfunction and tended to have more bronchospasm than the referent group. Total lung capacity, lung compliance, diffusing capacity, and chest radiography did not differ significantly between the two groups. After cessation of mine dust exposures, the decliners' mean rate of FEV1 loss normalized. CONCLUSION: In a series of working coal miners, accelerated lung function declines were associated with air trapping and evidence of small airways dysfunction. A preventive benefit from controlling dust exposures was suggested. |
Physical durability of PermaNet 2.0 long-lasting insecticidal nets over three to 32 months of use in Ethiopia
Wills AB , Smith SC , Anshebo GY , Graves PM , Endeshaw T , Shargie EB , Damte M , Gebre T , Mosher AW , Patterson AE , Tesema YB , Richards FO Jr , Emerson PM . Malar J 2013 12 (1) 242 BACKGROUND: Ethiopia scaled up net distribution markedly starting in 2006. Information on expected net life under field conditions (physical durability and persistence of insecticidal activity) is needed to improve planning for net replacement. Standardization of physical durability assessment methods is lacking. METHODS: PermaNet(R) 2.0 long-lasting insecticidal bed nets (LLINs), available for distribution in early 2007, were collected from households at three time intervals. The number, size and location of holes were recorded for 189 nets used for three to six months from nine sites (2007) and 220 nets used for 14 to 20 months from 11 sites (2008). In 2009, a "finger/fist" sizing method classified holes in 200 nets used for 26 to 32 months from ten sites into small (<2 cm), medium (>=2 to <=10 cm) and large (>10 cm) sizes. A proportionate hole index based on both hole number and area was derived from these size classifications. RESULTS: After three to six months, 54.5% (95% CI 47.1-61.7%) of 189 LLINs had at least one hole 0.5 cm (in the longest axis) or larger; mean holes per net was 4.4 (SD 8.4), median was 1.0 (Inter Quartile Range [IQR] 0-5) and median size was 1 cm (IQR 1-2). At 14 to 20 months, 85.5% (95% CI 80.1-89.8%) of 220 nets had at least one hole with mean 29.1 (SD 50.1) and median 12 (IQR 3-36.5) holes per net, and median size of 1 cm (IQR 1-2). At 26 to 32 months, 92.5% of 200 nets had at least one hole with a mean of 62.2 (SD 205.4) and median of 23 (IQR 6-55.5) holes per net. The mean hole index was 24.3, 169.1 and 352.8 at the three time periods respectively. Repairs were rarely observed. The majority of holes were in the lower half of the net walls. 
The proportion of nets in 'poor' condition (hole index >300) increased from 0% at three to six months to 30% at 26 to 32 months. CONCLUSIONS: Net damage began quickly: more than half the nets had holes by three to six months of use, with 40% of holes being larger than 2 cm. Holes continued to accumulate until 92.5% of nets had holes by 26 to 32 months of use. An almost complete lack of repairs shows the need for promoting proper use of nets and repairs, to increase LLIN longevity. Using the hole index, almost one third of the nets were classed as unusable and ineffective after two and a half years of potential use. |
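A proportionate hole index combining hole counts and size classes, as described above, could be sketched as follows. The class weights here are hypothetical placeholders standing in for representative per-class hole areas; they are not the weights derived in the study.

```python
# Sketch of a proportionate hole index: each size class gets a weight
# proportional to a representative hole area, and a net's index is the
# weighted sum of its hole counts. WEIGHTS values are hypothetical.
WEIGHTS = {"small": 1, "medium": 25, "large": 100}

def hole_index(hole_counts):
    """hole_counts: mapping of size class -> number of holes in one net."""
    return sum(WEIGHTS[size] * n for size, n in hole_counts.items())

# Two made-up nets from a survey round
nets = [
    {"small": 5, "medium": 2},   # lightly damaged
    {"small": 1, "large": 3},    # heavily damaged
]
indices = [hole_index(net) for net in nets]
poor = sum(idx > 300 for idx in indices)  # nets in 'poor' condition
```

With the abstract's threshold of a hole index above 300 for 'poor' condition, the second made-up net would be classed as unusable.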
Analysis of physical activity mass media campaign design
Lankford T , Wallace J , Brown D , Soares J , Epping JN , Fridinger F . J Phys Act Health 2013 11 (6) 1065-9 BACKGROUND: Mass media campaigns are a necessary tool for public health practitioners to reach large populations and promote healthy behaviors. Most health scholars have concluded that mass media can significantly influence the health behaviors of populations; however, the effects of such campaigns are typically modest and may require significant resources. A recent Community Preventive Services Task Force review on stand-alone mass media campaigns concluded there was insufficient evidence to determine their effectiveness in increasing physical activity, partly due to mixed methods and modest and inconsistent effects on levels of physical activity. METHODS: A secondary analysis was performed on the campaigns evaluated in the Task Force review to determine use of campaign-building principles, channels, and levels of awareness and their impact on campaign outcomes. Each study was analyzed by two reviewers for inclusion of campaign-building principles. RESULTS: Campaigns that included five or more campaign principles were more likely to be successful in achieving physical activity outcomes. CONCLUSION: Campaign success is more likely if the campaign-building principles (formative research, audience segmentation, message design, channel placement, process evaluation, and use of theory) are used as part of campaign design and planning. |
Monitoring population health for Healthy People 2020: evaluation of the NIH PROMIS(R) Global Health, CDC Healthy Days, and satisfaction with life instruments
Barile JP , Reeve BB , Smith AW , Zack MM , Mitchell SA , Kobau R , Cella DF , Luncheon C , Thompson WW . Qual Life Res 2013 22 (6) 1201-11 PURPOSE: Healthy People 2020 identified health-related quality of life and well-being (WB) as indicators of population health for the next decade. This study examined the measurement properties of the NIH PROMIS(R) Global Health Scale, the CDC Healthy Days items, and associations with the Satisfaction with Life Scale. METHODS: A total of 4,184 adults completed Porter Novelli's HealthStyles mailed survey. Physical and mental health (9 items from the PROMIS Global Scale and 3 items from the CDC Healthy Days measure) and 4 WB factor items were tested for measurement equivalence using multiple-group confirmatory factor analysis. RESULTS: The CDC items accounted for similar variance as the PROMIS items on physical and mental health factors; both factors were moderately correlated with WB. Measurement invariance was supported across gender and age; the magnitude of some factor loadings differed between those with and without a chronic medical condition. CONCLUSIONS: The PROMIS, CDC, and WB items all performed well. The PROMIS items captured a broad range of functioning across the entire continuum of physical and mental health, while the CDC items appear appropriate for assessing burden of disease for chronic conditions and are brief and easily interpretable. All three measures under study appear to be appropriate measures for monitoring several aspects of the Healthy People 2020 goals and objectives. |
Concordance between current job and usual job in occupational and industry groupings: assessment of the 2010 National Health Interview Survey
Luckhaupt SE , Cohen MA , Calvert GM . J Occup Environ Med 2013 55 (9) 1074-90 OBJECTIVE: To determine whether current job is a reasonable surrogate for usual job. METHODS: Data from the 2010 National Health Interview Survey were utilized to determine concordance between current and usual jobs for workers employed within the past year. Concordance was quantitated by kappa values for both simple and detailed industry and occupational groups. Good agreement is considered to be present when kappa values exceed 60. RESULTS: Overall kappa values +/- standard errors were 74.5 +/- 0.5 for simple industry, 72.4 +/- 0.5 for detailed industry, 76.3 +/- 0.4 for simple occupation, 73.7 +/- 0.5 for detailed occupation, and 80.4 +/- 0.6 for very broad occupational class. Sixty-five of 73 detailed industry groups and 78 of 81 detailed occupation groups evaluated had good agreement between current and usual jobs. CONCLUSIONS: Current job can often serve as a reliable surrogate for usual job in epidemiologic studies. |
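The concordance measure named in the abstract, Cohen's kappa, can be sketched in a few lines. This toy version ignores the survey design and weighting of the actual analysis; the (current job, usual job) category pairs are invented, and kappa is scaled to 0-100 to match the abstract's reporting convention.

```python
from collections import Counter

def cohens_kappa(pairs):
    """Cohen's kappa for paired categorical ratings, scaled to 0-100:
    100 * (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(pairs)
    p_obs = sum(a == b for a, b in pairs) / n               # observed agreement
    rows = Counter(a for a, _ in pairs)                     # current-job margins
    cols = Counter(b for _, b in pairs)                     # usual-job margins
    cats = set(rows) | set(cols)
    p_exp = sum(rows[c] * cols[c] for c in cats) / n ** 2   # chance agreement
    return 100.0 * (p_obs - p_exp) / (1.0 - p_exp)

# Hypothetical (current job, usual job) pairs for four workers
pairs = [("health", "health"), ("health", "retail"),
         ("retail", "retail"), ("retail", "retail")]
kappa = cohens_kappa(pairs)
```

With 3 of 4 pairs agreeing and chance agreement of 0.5, this example yields kappa = 50, below the abstract's threshold of 60 for good agreement.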
Characterization of the Federal Workforce at the Centers for Disease Control and Prevention
Coronado F , Polite M , Glynn MK , Massoudi MS , Sohani MM , Koo D . J Public Health Manag Pract 2013 20 (4) 432-41 CONTEXT: Studies characterizing the public health workforce are needed for providing the evidence on which to base planning and policy decision making both for workforce staffing and for addressing uncertainties regarding organizing, financing, and delivering effective public health strategies. The Centers for Disease Control and Prevention (CDC) is leading the enumeration of the US public health workforce with an initial focus on CDC as the leading federal public health agency. OBJECTIVE: To characterize CDC's workforce, assess retirement eligibility and potential staff losses, and contribute these data as the federal component of national enumeration efforts. METHODS: Two sources containing data related to CDC employees were analyzed. CDC's workforce was characterized by using data elements recommended for public health workforce enumeration, and the occupations of CDC staff were categorized into 15 standard occupational classifications by using position titles. Retirement eligibility and potential staffing losses were analyzed in 1-, 3-, and 5-year increments, and these data were compared across occupational classifications to determine the future impact of potential workforce losses. RESULTS: As of the first quarter of calendar year 2012, a total of 11 223 persons were working at CDC; 10 316 were civil servants, and 907 were Commissioned Corps officers. Women accounted for 61%. Public health managers, laboratory workers, and administrative-clerical staff comprised the top 3 most common occupational classifications among CDC staff. Sixteen percent of the workforce was eligible to retire by December 2012, and more than 30% will be eligible to retire by December 2017. 
CONCLUSIONS: This study represents the first characterization of CDC's workforce and provides an evidence base upon which to develop policies for ensuring an ongoing ability to fulfill the CDC mission of maintaining and strengthening the public's health. Establishing a system for continually monitoring the public health workforce will support future efforts in understanding workforce shortages, capacity, and effectiveness; projecting trends; and initiating policies. |
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Community Health Services
- Drug Safety
- Epidemiology and Surveillance
- Food Safety
- Genetics and Genomics
- Health Communication and Education
- Healthcare Associated Infections
- Immunity and Immunization
- Laboratory Sciences
- Medicine
- Nutritional Sciences
- Occupational Safety and Health
- Occupational Safety and Health - Mining
- Parasitic Diseases
- Physical Activity
- Public Health Leadership and Management
- Public Health, General
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed:Feb 1, 2024
- Page last updated:Sep 03, 2024
- Powered by CDC PHGKB Infrastructure