Lung cancer incidence trends among men and women - United States, 2005-2009
Henley JS , Richards TB , Underwood MJ , Sunderam CR , Plescia M , McAfee TA . MMWR Morb Mortal Wkly Rep 2014 63 (1) 1-5 Lung cancer is the leading cause of cancer death and the second most commonly diagnosed cancer (excluding skin cancer) among men and women in the United States. Although lung cancer can be caused by environmental exposures, most efforts to prevent lung cancer emphasize tobacco control because 80%-90% of lung cancers are attributed to cigarette smoking and secondhand smoke. One sentinel health consequence of tobacco use is lung cancer, and one way to measure the impact of tobacco control is by examining trends in lung cancer incidence rates, particularly among younger adults. Changes in lung cancer rates among younger adults likely reflect recent changes in risk exposure. To assess lung cancer incidence and trends among men and women by age group, CDC used data from the National Program of Cancer Registries (NPCR) and the National Cancer Institute's Surveillance, Epidemiology, and End Results (SEER) program for the period 2005-2009, the most recent data available. During the study period, lung cancer incidence decreased among men in all age groups except <35 years and decreased among women aged 35-44 years and 54-64 years. Lung cancer incidence decreased more rapidly among men than among women and more rapidly among adults aged 35-44 years than among other age groups. To further reduce lung cancer incidence in the United States, proven population-based tobacco prevention and control strategies should receive sustained attention and support. |
Prevalent inhibitors in haemophilia B subjects enrolled in the Universal Data Collection database
Puetz J , Soucie JM , Kempton CL , Monahan PE . Haemophilia 2014 20 (1) 25-31 Several risk factors for inhibitors have recently been described for haemophilia A. It has been assumed that similar risk factors are also relevant for haemophilia B, but there are limited data to confirm this notion. The aim of this study was to determine the prevalence of and risk factors associated with inhibitors in haemophilia B. The database of the Universal Data Collection (UDC) project of the Centers for Disease Control and Prevention for the years 1998-2011 was queried to determine the prevalence of inhibitors in haemophilia B subjects. In addition, disease severity, race/ethnicity, age, factor exposure and prophylaxis usage were evaluated to determine their impact on inhibitor prevalence. Of the 3785 male subjects with haemophilia B enrolled in the UDC database, 75 (2%) were determined to have an inhibitor at some point during the study period. Severe disease (OR 13.1, 95% CI 6.2-27.7), black race (OR 2.2, 95% CI 1.2-4.1), and age <11 years (OR 2.5, 95% CI 1.5-4.0) were found to be significantly associated with having an inhibitor. There were insufficient data to determine whether type of factor used and prophylaxis were associated with inhibitors. Inhibitors in haemophilia B are much less prevalent than in haemophilia A, especially in patients with mild disease. Similar factors associated with inhibitors in haemophilia A also seem to be present for haemophilia B. The information collected by this large surveillance project did not permit evaluation of potential risk factors related to treatment approaches and exposures, and additional studies will be required. |
Impairments in hearing and vision impact on mortality in older people: the AGES-Reykjavik Study
Fisher D , Li CM , Chiu MS , Themann CL , Petersen H , Jonasson F , Jonsson PV , Sverrisdottir JE , Garcia M , Harris TB , Launer LJ , Eiriksdottir G , Gudnason V , Hoffman HJ , Cotch MF . Age Ageing 2014 43 (1) 69-76 OBJECTIVE: to examine the relationships between impairments in hearing and vision and mortality from all causes and cardiovascular disease (CVD) among older people. DESIGN: population-based cohort study. PARTICIPANTS: the study population included 4,926 Icelandic individuals, aged ≥67 years, 43.4% male, who completed vision and hearing examinations between 2002 and 2006 in the Age, Gene/Environment Susceptibility-Reykjavik Study (AGES-RS) and were followed prospectively for mortality through 2009. METHODS: participants were classified as having a 'moderate or greater' degree of impairment for vision only (VI), hearing only (HI), or both vision and hearing (dual sensory impairment, DSI). Cox proportional hazards regression, with age as the time scale, was used to calculate hazard ratios (HR) associating impairment with mortality due to all causes and specifically CVD after a median follow-up of 5.3 years. RESULTS: the prevalence of HI, VI and DSI was 25.4%, 9.2% and 7.0%, respectively. After adjusting for age, significantly (P < 0.01) increased mortality from all causes and CVD was observed for HI and DSI, especially among men. After further adjustment for established mortality risk factors, people with HI remained at higher risk for CVD mortality [HR: 1.70 (1.27-2.27)], whereas people with DSI remained at higher risk of all-cause mortality [HR: 1.43 (1.11-1.85)] and CVD mortality [HR: 1.78 (1.18-2.69)]. Mortality rates were significantly higher in men with HI and DSI and were elevated, although not significantly, among women with HI. CONCLUSIONS: older men with HI or DSI had a greater risk of dying from any cause, and particularly from cardiovascular causes, within a median 5-year follow-up.
Women with hearing impairment had a non-significantly elevated risk. Vision impairment alone was not associated with increased mortality. |
Outbreaks of non-O157 Shiga toxin-producing Escherichia coli infection: USA
Luna-Gierke RE , Griffin PM , Gould LH , Herman K , Bopp CA , Strockbine N , Mody RK . Epidemiol Infect 2014 142 (11) 1-11 Non-O157 Shiga toxin-producing Escherichia coli (STEC) infections are increasingly detected, but sources are not well established. We summarize outbreaks reported in the USA through 2010. Single-aetiology outbreaks were defined as ≥2 epidemiologically linked culture-confirmed non-O157 STEC infections; multiple-aetiology outbreaks also had laboratory evidence of ≥2 infections caused by another enteric pathogen. Twenty-six states reported 46 outbreaks with 1727 illnesses and 144 hospitalizations. Of 38 single-aetiology outbreaks, 66% were caused by STEC O111 (n = 14) or O26 (n = 11), and 84% were transmitted through food (n = 17) or person-to-person spread (n = 15); food vehicles included dairy products, produce, and meats; childcare centres were the most common setting for person-to-person spread. In single-aetiology outbreaks, a greater percentage of persons infected by Shiga toxin 2-positive strains had haemolytic uraemic syndrome than persons infected by Shiga toxin 1-only positive strains (7% vs. 0.8%). Compared with single-aetiology outbreaks, multiple-aetiology outbreaks were more frequently transmitted through water or animal contact. |
Population genetic and admixture analyses of Culex pipiens complex (Diptera: Culicidae) populations in California, United States.
Kothera L , Nelms BM , Reisen WK , Savage HM . Am J Trop Med Hyg 2013 89 (6) 1154-67 Microsatellite markers were used to genetically characterize 19 Culex pipiens complex populations from California. Two populations showed characteristics of earlier genetic bottlenecks. The overall FST value and a neighbor-joining tree suggested moderate amounts of genetic differentiation. Analyses using Structure indicated K = 4 genetic clusters: Cx. pipiens form pipiens L., Cx. quinquefasciatus Say, Cx. pipiens form molestus Forskal, and a group of genetically similar individuals of hybrid origin. A Discriminant Analysis of Principal Components indicated that the latter group is a mixture of the other three taxa, with form pipiens and form molestus contributing somewhat more ancestry than Cx. quinquefasciatus. Characterization of 56 morphologically autogenous individuals classified most as Cx. pipiens form molestus, and none as Cx. pipiens form pipiens or Cx. quinquefasciatus. Comparison of California microsatellite data with those of Cx. pipiens pallens Coquillett from Japan indicated the latter does not contribute significantly to genotypes in California. |
Cough and cold medication adverse events after market withdrawal and labeling revision
Hampton LM , Nguyen DB , Edwards JR , Budnitz DS . Pediatrics 2013 132 (6) 1047-54 BACKGROUND: In October 2007, manufacturers voluntarily withdrew over-the-counter (OTC) infant cough and cold medications (CCMs) from the US market. A year later, manufacturers announced OTC CCM labeling would be revised to warn against OTC CCM use by children aged <4 years. We determined whether emergency department (ED) visits for CCM adverse drug events (ADEs) declined after these interventions. METHODS: We used National Electronic Injury Surveillance System-Cooperative Adverse Drug Event Surveillance data from 2004 to 2011 to estimate the number of ED visits for CCM ADEs before and after each intervention. RESULTS: Among children aged <2 years, ED visits for CCM ADEs decreased from 4.1% of all ADE ED visits before the market withdrawal to 2.4% of all ADE visits afterward (difference in proportion: -1.7%, 95% confidence interval [CI]: -2.7% to -0.6%). Among children aged 2 to 3 years, ED visits for CCM ADEs decreased from 9.5% of all ADE ED visits before the labeling revision announcement to 6.5% of all ADE visits afterward (difference in proportion: -3.0%, 95% CI: -5.4% to -0.6%). Unsupervised ingestions accounted for 64.3% (95% CI: 51.1% to 77.5%) of CCM ADE ED visits involving children aged <2 years after the withdrawal and 88.8% (95% CI: 83.8% to 93.8%) of visits involving children aged 2 to 3 years after the labeling revision announcement. CONCLUSIONS: After a voluntary market withdrawal and labeling revision, ED visits for CCM ADEs declined among children aged <2 years and 2 to 3 years relative to ADE ED visits for all drugs. Interventions addressing unsupervised ingestions are needed to reduce CCM ADEs. |
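The intervals reported in this abstract are 95% confidence intervals for a difference between two proportions. As a minimal stdlib-only sketch of that standard calculation (the counts below are illustrative placeholders, not the surveillance estimates from the study):

```python
import math

def diff_proportion_ci(x1, n1, x2, n2, z=1.96):
    """Difference in proportions (p2 - p1) with a Wald 95% CI."""
    p1, p2 = x1 / n1, x2 / n2
    d = p2 - p1
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return d, (d - z * se, d + z * se)

# Illustrative counts only: 4.1% of 1000 ADE visits before vs. 2.4% after.
d, (lo, hi) = diff_proportion_ci(41, 1000, 24, 1000)
```

With these placeholder counts the point estimate is -1.7 percentage points, mirroring the form of the estimates above; the study's own intervals additionally reflect the complex survey design of the surveillance system, which this simple Wald interval ignores.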
Recreational water-associated disease outbreaks - United States, 2009-2010
Hlavsa MC , Roberts VA , Kahler AM , Hilborn ED , Wade TJ , Backer LC , Yoder JS . MMWR Morb Mortal Wkly Rep 2014 63 (1) 6-10 Recreational water-associated disease outbreaks result from exposure to infectious pathogens or chemical agents in treated recreational water venues (e.g., pools and hot tubs or spas) or untreated recreational water venues (e.g., lakes and oceans). For 2009-2010, the most recent years for which finalized data are available, public health officials from 28 states and Puerto Rico electronically reported 81 recreational water-associated disease outbreaks to CDC's Waterborne Disease and Outbreak Surveillance System (WBDOSS) via the National Outbreak Reporting System (NORS). This report summarizes the characteristics of those outbreaks. Among the 57 outbreaks associated with treated recreational water, 24 (42%) were caused by Cryptosporidium. Among the 24 outbreaks associated with untreated recreational water, 11 (46%) were confirmed or suspected to have been caused by cyanobacterial toxins. In total, the 81 outbreaks resulted in at least 1,326 cases of illness and 62 hospitalizations; no deaths were reported. Laboratory and environmental data, in addition to epidemiologic data, can be used to direct and optimize the prevention and control of recreational water-associated disease outbreaks. |
Algal bloom-associated disease outbreaks among users of freshwater lakes - United States, 2009-2010
Hilborn ED , Roberts VA , Backer L , Deconno E , Egan JS , Hyde JB , Nicholas DC , Wiegert EJ , Billing LM , Diorio M , Mohr MC , Hardy JF , Wade TJ , Yoder JS , Hlavsa MC . MMWR Morb Mortal Wkly Rep 2014 63 (1) 11-5 Harmful algal blooms (HABs) are excessive accumulations of microscopic photosynthesizing aquatic organisms (phytoplankton) that produce biotoxins or otherwise adversely affect humans, animals, and ecosystems. HABs occur sporadically and often produce a visible algal scum on the water. This report summarizes human health data and water sampling results voluntarily reported to CDC's Waterborne Disease and Outbreak Surveillance System (WBDOSS) via the National Outbreak Reporting System (NORS) and the Harmful Algal Bloom-Related Illness Surveillance System (HABISS)* for the years 2009-2010. For 2009-2010, 11 waterborne disease outbreaks associated with algal blooms were reported; these HABs all occurred in freshwater lakes. The outbreaks occurred in three states and affected at least 61 persons. Health effects included dermatologic, gastrointestinal, respiratory, and neurologic signs and symptoms. These 11 HAB-associated outbreaks represented 46% of the 24 outbreaks associated with untreated recreational water reported for 2009-2010, and 79% of the 14 freshwater HAB-associated outbreaks that have been reported to CDC since 1978. Clinicians should be aware of the potential for HAB-associated illness among patients with a history of exposure to freshwater. |
Evidence that rodent control strategies ought to be improved to enhance food security and reduce the risk of rodent-borne illnesses within subsistence farming villages in the plague-endemic West Nile region, Uganda
Eisen RJ , Enscore RE , Atiku LA , Zielinski-Gutierrez E , Mpanga JT , Kajik E , Andama V , Mungujakisa C , Tibo E , MacMillan K , Borchert JN , Gage KL . Int J Pest Manag 2013 59 (4) 259-270 Rodents pose serious threats to human health and economics, particularly in developing countries where the animals play a dual role as pests: they are reservoirs of human pathogens, and they inflict damage levels to stored products sufficient to cause food shortages. To assess the magnitude of the damage caused by rodents to crops, their level of contact with humans, and to better understand current food storage and rodent control practices, we conducted a survey of 37 households from 17 subsistence farming villages within the West Nile region of Uganda. Our survey revealed that rodents cause both pre- and post-harvest damage to crops. Evidence of rodent access to stored foods was reported in conjunction with each of the reported storage practices. Approximately half of the respondents reported that at least one family member had been bitten by a rat within the previous three months. Approximately two-thirds of respondents practiced some form of rodent control in their homes. The abundance of rodents was similar within homes that practiced or did not practice rodent control. Together, our results show that current efforts are inadequate for effectively reducing rodent abundance in homes. |
Horizon scanning for translational genomic research beyond bench to bedside.
Clyne M , Schully SD , Dotson WD , Douglas MP , Gwinn M , Kolor K , Wulf A , Bowen MS , Khoury MJ . Genet Med 2014 16 (7) 535-8 PURPOSE: The dizzying pace of genomic discoveries is leading to an increasing number of clinical applications. In this report, we provide a method for horizon scanning and 1 year data on translational research beyond bench to bedside to assess the validity, utility, implementation, and outcomes of such applications. METHODS: We compiled cross-sectional results of ongoing horizon scanning of translational genomic research, conducted between 16 May 2012 and 15 May 2013, based on a weekly, systematic query of PubMed. A set of 505 beyond bench to bedside articles were collected and classified, including 312 original research articles; 123 systematic and other reviews; 38 clinical guidelines, policies, and recommendations; and 32 articles describing tools, decision support, and educational materials. RESULTS: Most articles (62%) addressed a specific genomic test or other health application; almost half of these (n = 180) were related to cancer. We estimate that these publications account for 0.5% of reported human genomics and genetics research during the same time. CONCLUSION: These data provide baseline information to track the evolving knowledge base and gaps in genomic medicine. Continuous horizon scanning of the translational genomics literature is crucial for an evidence-based translation of genomics discoveries into improved health care and disease prevention. |
Genetic analysis of G12P[8] rotaviruses detected in the largest U.S. G12 genotype outbreak on record.
Mijatovic-Rustempasic S , Teel EN , Kerin TK , Hull JJ , Roy S , Weinberg GA , Payne DC , Parashar UD , Gentsch JR , Bowen MD . Infect Genet Evol 2013 21 214-219 In 2006-07, 77 cases of gastroenteritis in Rochester, NY, USA were associated with rotavirus genotype G12P[8]. Sequence analysis identified a high degree of genetic relatedness among the VP7 and VP4 genes of the Rochester G12P[8] strains and between these strains and currently circulating human G12P[8] strains. Among the 77 samples, two and seven unique nucleotide sequences were identified for the VP7 and VP4 genes, respectively. Rochester strain VP7 genes were found to occupy the G12-III lineage, and VP4 genes clustered within the P[8]-3 lineage. Six strains contained non-synonymous nucleotide substitutions that produced amino acid changes at 6 sites in the VP8* region of the VP4 gene. Two sites (amino acids 242 and 246) were located in or near a described trypsin cleavage site. Selection analyses identified one positively selected VP7 site (107) and strong purifying selection at 58 sites within the VP7 gene, as well as at 2 of the 6 variant sites (79 and 218) in VP4. |
Prioritizing genomic applications for action by level of evidence: a horizon-scanning method.
Dotson WD , Douglas MP , Kolor K , Stewart AC , Bowen MS , Gwinn M , Wulf A , Anders HM , Chang CQ , Clyne M , Lam TK , Schully SD , Marrone M , Feero WG , Khoury MJ . Clin Pharmacol Ther 2013 95 (4) 394-402 As evidence accumulates on the use of genomic tests and other health-related applications of genomic technologies, decision makers may increasingly seek support in identifying which applications have sufficiently robust evidence to suggest they might be considered for action. As an interim working process to provide such support, we developed a horizon-scanning method that assigns genomic applications to tiers defined by availability of synthesized evidence. We illustrate an application of the method to pharmacogenomics tests. |
Suicidal thoughts and attempts among U.S. high school students: trends and associated health-risk behaviors, 1991-2011
Lowry R , Crosby AE , Brener ND , Kann L . J Adolesc Health 2014 54 (1) 100-8 PURPOSE: To describe secular trends in suicidal thoughts and attempts and the types of health-risk behaviors associated with suicidal thoughts and attempts among U.S. high school students. METHODS: Data were analyzed from 11 national Youth Risk Behavior Surveys conducted biennially during 1991-2011. Each survey employed a nationally representative sample of students in grades 9-12 and provided data from approximately 14,000 students. Using sex-stratified logistic regression models that controlled for race/ethnicity and grade, we analyzed secular trends in the prevalence of suicidal thoughts and attempts. Adjusted prevalence ratios (APR) were calculated to measure associations between suicide risk and a broad range of health-risk behaviors. RESULTS: During 1991-2011, among female students, both suicidal thoughts (seriously considered suicide; made a plan to attempt suicide) and attempts (any attempt; attempt with injury requiring medical treatment) decreased significantly; among male students, only suicidal thoughts decreased significantly. During 2011, compared with students with no suicidal thoughts or attempts, the health-risk behaviors most strongly associated with suicide attempts among female students were injection drug use (IDU; APR = 12.8), carrying a weapon on school property (APR = 9.7), and methamphetamine use (APR = 8.7); among male students, the strongest associations were for IDU (APR = 22.4), using vomiting/laxatives for weight control (APR = 17.1), and having been forced to have sex (APR = 14.8). CONCLUSIONS: School-based suicide prevention programs should consider confidential screening for health-risk behaviors that are strongly associated with suicide attempts to help identify students at increased risk for suicide and provide referrals to suicide and other prevention services (e.g., substance abuse and violence prevention) as appropriate. |
Infection prevention and control standards in assisted living facilities: are residents' needs being met?
Kossover RA , Chi CJ , Wise ME , Tran AH , Chande ND , Perz JF . J Am Med Dir Assoc 2014 15 (1) 47-53 BACKGROUND: Assisted living facilities (ALFs) provide housing and care to persons unable to live independently, and who often have increasing medical needs. Disease outbreaks illustrate challenges of maintaining adequate resident protections in these facilities. OBJECTIVES: Describe current state laws on assisted living admissions criteria, medical oversight, medication administration, vaccination requirements, and standards for infection control training. METHODS: We abstracted laws and regulations governing assisted living facilities for the 50 states using a structured abstraction tool. Selected characteristics were compared according to the time period in which the regulation took effect. Selected state health departments were queried regarding outbreaks identified in assisted living facilities. RESULTS: Of the 50 states, 84% specify health-based admissions criteria to assisted living facilities; 60% require licensed health care professionals to oversee medical care; 88% specifically allow subcontracting with outside entities to provide routine medical services onsite; 64% address medication administration by assisted living facility staff; 54% specify requirements for some form of initial infection control training for all staff; 50% require reporting of disease outbreaks to the health department; 18% specify requirements to offer or require vaccines to staff; 30% specify requirements to offer or require vaccines to residents. Twelve states identified approximately 1600 outbreaks from 2010 to 2013, with influenza or norovirus infections predominating. CONCLUSIONS: There is wide variation in how assisted living facilities are regulated in the United States. States may wish to consider regulatory changes that ensure safe health care delivery, and minimize risks of infections, outbreaks of disease, and other forms of harm among assisted living residents. |
Duration of colonization with methicillin-resistant Staphylococcus aureus in an acute care facility: a study to assess epidemiologic features
Rogers C , Sharma A , Rimland D , Stafford C , Jernigan J , Satola S , Crispell E , Gaynes R . Am J Infect Control 2014 42 (3) 249-53 BACKGROUND: Patients with a history of methicillin-resistant Staphylococcus aureus (MRSA) colonization or infection are often presumed to remain colonized when they are readmitted to the hospital. This assumption underlies the hospital practice that flags MRSA-positive patients so that these patients can be placed in contact isolation at hospital admission and, when necessary, be given the appropriate empirical therapy and/or antibiotic prophylaxis. METHODS: To determine the duration of and factors associated with MRSA colonization among patients following discharge, we designed a cohort study of patients hospitalized between October 1, 2007, and July 31, 2009, at the Atlanta Veterans Affairs Medical Center, a 128-bed acute care facility. We defined 3 cohorts: cohort A, patients with both a MRSA infection during hospitalization and nasal colonization at discharge; cohort B, patients with a MRSA infection but no nasal colonization at discharge; and cohort C, patients only nasally colonized at discharge. We collected information on demographic characteristics, underlying conditions, infections, and antibiotic use. We cultured nasal swabs obtained from patients at home. We calculated hazard ratios (HR) comparing cohorts A, B, and C after controlling for other factors. RESULTS: We obtained 231 swabs (23 in cohort A, 34 in cohort B, and 174 in cohort C). We documented MRSA colonization in 92 (39.9%) of the 231 patients who returned swabs. The median duration of colonization was 33.3 months. Factors significantly associated with persistent MRSA colonization were (1) total duration of hospital stay from previous admissions prior to study entry and (2) membership in cohort A, which had a longer duration of colonization compared with cohorts B and C (P < .001).
CONCLUSION: Our data suggest that higher initial inocula of bacteria may be an important determinant of persistent colonization with MRSA. |
Rotavirus vaccines: successes and challenges
Glass RI , Parashar U , Patel M , Gentsch J , Jiang B . J Infect 2014 68 Suppl 1 S9-S18 Since 2006, the availability of two new rotavirus vaccines has raised enthusiasm to consider the eventual control and elimination of severe rotavirus diarrhea through the global use of vaccines. Rotavirus remains the most severe cause of acute diarrhea in children worldwide, responsible for several hundred thousand deaths in low-income countries and up to half of hospital admissions for diarrhea around the world. The new vaccines have been recommended by WHO for all infants, and in more than 47 countries their introduction into routine childhood immunization programs has led to a remarkable decline in hospital admissions and even deaths within 3 years of introduction. Challenges remain with issues of vaccine finance globally and the problem that these live oral vaccines perform less well in the low-income settings where they are needed most. Ongoing research that will accompany vaccine introduction might help address these issues of efficacy, and new vaccines and novel financing schemes may both help make these vaccines universally available and affordable in the coming decade. |
Assessment of vaccine exemptions among Wyoming school children, 2009 and 2011
Pride KR , Geissler AL , Kolasa MS , Robinson B , Van Houten C , McClinton R , Bryan K , Murphy T . J Sch Nurs 2014 30 (5) 332-9 During 2010-2011, varicella vaccination was added as a requirement for school entrance in Wyoming. Vaccination exemption rates were compared during the 2009-2010 and 2011-2012 school years, and the impacts of implementing a new childhood vaccine requirement were evaluated. All public schools, grades K-12, were required to report the vaccination status of enrolled children for the 2009-2010 and 2011-2012 school years to the Wyoming Department of Health. Exemption data were analyzed by exemption category, vaccine, county, grade, and rurality. The proportion of children exempt for ≥1 vaccine increased from 1.2% (1,035/87,398) during the 2009-2010 school year to 1.9% (1,678/89,476) during 2011-2012. In 2011, exemptions were lowest (1.5%) in urban areas and highest (2.6%) in the most rural areas, and varicella vaccine exemptions represented 67.1% (294/438) of single-vaccination exemptions. Implementation of a new vaccination requirement for school admission led to an increased exemption rate across Wyoming. |
Comparative evaluation of commercially available manual and automated nucleic acid extraction methods for rotavirus RNA detection in stools.
Esona MD , McDonald S , Kamili S , Kerin T , Gautam R , Bowen MD . J Virol Methods 2013 194 242-9 Rotaviruses are a major cause of viral gastroenteritis in children. For accurate and sensitive detection of rotavirus RNA from stool samples by reverse transcription-polymerase chain reaction (RT-PCR), the extraction process must be robust. However, some extraction methods may not remove the strong RT-PCR inhibitors known to be present in stool samples. The objective of this study was to evaluate and compare the performance of six methods used commonly for extraction of rotavirus RNA from stool, which have never been formally evaluated: the MagNA Pure Compact, KingFisher Flex and NucliSENS easyMAG instruments, the NucliSENS miniMAG semi-automated system, and two manual purification kits, the QIAamp Viral RNA Mini kit and a modified RNaid kit. Using each method, total nucleic acid or RNA was extracted from eight rotavirus-positive stool samples with enzyme immunoassay optical density (EIA OD) values ranging from 0.176 to 3.098. Extracts prepared using the MagNA Pure Compact instrument yielded the most consistent results by qRT-PCR and conventional RT-PCR. When samples from a dilution series were extracted by the six methods and tested, rotavirus RNA was detected in all samples by qRT-PCR, but by conventional RT-PCR testing, only the MagNA Pure Compact and KingFisher Flex extracts were positive in all cases. RT-PCR inhibitors were detected in extracts produced with the QIAamp Viral RNA Mini kit. The findings of this study should prove useful for selection of extraction methods to be incorporated into future rotavirus detection and genotyping protocols. |
Effect of multi-walled carbon nanotube surface modification on bioactivity in the C57BL/6 mouse model
Sager TM , Wolfarth MW , Andrew M , Hubbs A , Friend S , Chen TH , Porter DW , Wu N , Yang F , Hamilton RF , Holian A . Nanotoxicology 2014 8 (3) 317-27 The current study tests the hypothesis that multi-walled carbon nanotubes (MWCNT) with different surface chemistries exhibit different bioactivity profiles in vivo. In addition, the study examined the potential contribution of the NLRP3 inflammasome to MWCNT-induced lung pathology. Unmodified MWCNT (BMWCNT) and MWCNT surface functionalised with -COOH (FMWCNT) were instilled into C57BL/6 mice. The mice were then examined for biomarkers of inflammation and injury, as well as examined histologically for development of pulmonary disease as a function of dose and time. Biomarkers of pulmonary inflammation included cytokines, mediators and the presence of inflammatory cells (IL-1beta, IL-18, IL-33, cathepsin B and neutrophils), and markers of injury included albumin and lactate dehydrogenase. The results show that surface modification by the addition of the -COOH group to the MWCNT significantly reduced bioactivity and pathogenicity. The results of this study also suggest that the in vivo pathogenicity of the BMWCNT and FMWCNT correlates with activation of the NLRP3 inflammasome in the lung. |
High variability in serum estradiol measurements in men and women
Vesper HW , Botelho JC , Vidal ML , Rahmani Y , Thienpont LM , Caudill SP . Steroids 2014 82 7-13 To reduce the variability in estradiol (E2) testing and to assure better patient care, standardization of E2 measurements has been recommended. This study aims to assess the accuracy and variability of E2 measurements performed by 11 routine immunological methods and 6 mass spectrometry methods using single-donor serum materials, and to compare the results to a reference method. The contributions of calibration bias, specificity or matrix effects, and imprecision to the overall variability of individual assays were evaluated. This study showed substantial variability in serum E2 measurements in samples from men and pre- and post-menopausal women. The mean bias across all samples, for each participant, ranged between -2.4% and 235%, with 3 participants having a mean bias of over 100%. The data suggest that calibration bias is the major contributor to the overall variability for nine assays. The analytical performance of most assays measuring E2 concentrations does not meet current needs in research and patient care. Three of the 17 assays would meet performance criteria derived from biological variability of ±12.5% bias at concentrations >20 pg/mL and a maximum allowable bias of ±2.5 pg/mL at concentrations <20 pg/mL. Sensitivity differs widely among assays; most assays are not able to measure E2 levels below 10 pg/mL. Standardization, specifically calibration to a common standard by using panels of individual patient samples, can reduce the observed variability and improve the utility of E2 levels in clinical settings. |
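The performance criterion described in this abstract is two-tiered: a relative allowable bias at higher concentrations and an absolute allowable bias at lower ones. A minimal sketch of applying that kind of check is below; the function name and example readings are hypothetical illustrations, not data or code from the study:

```python
def meets_e2_bias_criterion(measured, reference):
    """Two-tier allowable-bias check (hypothetical helper): within +/-12.5%
    when the reference concentration is >20 pg/mL, otherwise within an
    absolute +/-2.5 pg/mL."""
    bias = measured - reference
    if reference > 20:
        return abs(bias) / reference <= 0.125
    return abs(bias) <= 2.5

# Hypothetical assay readings (pg/mL):
print(meets_e2_bias_criterion(110.0, 100.0))  # 10% bias at a high concentration
print(meets_e2_bias_criterion(13.0, 10.0))    # 3 pg/mL bias at a low concentration
```

The split criterion reflects that a purely relative limit becomes unrealistically tight near zero, which matters here because many clinically relevant E2 concentrations fall below 20 pg/mL.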
Induction of miR-21-PDCD4 signaling by UVB in JB6 cells involves ROS-mediated MAPK pathways
Hou L , Bowman L , Meighan TG , Pratheeshkumar P , Shi X , Ding M . Exp Toxicol Pathol 2013 65 1145-8 Ultraviolet (UV) irradiation plays a major role in the development of human skin cancer. The present study examined alterations of miR-21-PDCD4 signaling in a mouse epidermal cell line (JB6 P(+)) after exposure to UVB irradiation. The results showed that (1) UVB caused PDCD4 inhibition in JB6 cells; (2) exposure of cells to UVB caused a significant increase in the expression of miR-21, the upstream regulator of PDCD4; (3) both inhibition of ERKs with U0126 and inhibition of p38 with SB203580 significantly reversed UVB-induced PDCD4 inhibition; and (4) the ROS scavenger N-acetyl-l-cysteine reversed the inhibitory effect of UVB on PDCD4 expression. These results suggest that UVB induced PDCD4 inhibition, which may be mediated through ROS, especially endogenous H2O2, and through phosphorylation of p38 and ERKs. Unraveling the complex mechanisms associated with these events may provide insights into the initiation and progression of UVB-induced carcinogenesis. |
Challenges and improvements in testosterone and estradiol testing
Vesper HW , Botelho JC , Wang Y . Asian J Androl 2013 16 (2) 178-84 Assays that measure steroid hormones in patient care, public health, and research need to be both accurate and precise, as these criteria help to ensure comparability across all clinical and research applications. This review addresses major issues relevant to assay variability and describes recent activities by the US Centers for Disease Control and Prevention (CDC) to improve assay performance. Currently, high degrees of accuracy and precision are not always met for testosterone and estradiol measurements; although technologies for steroid hormone measurement have advanced significantly, measurement variability within and across laboratories has not improved accordingly. Differences in calibration and specificity are discussed as sources of variability in measurement accuracy. Ultimately, a combination of factors appears to cause inaccuracy of steroid hormone measurements, with nonuniform assay calibration and lack of specificity being two major contributors to assay variability. Within-assay variability for current assays is generally high, especially at low analyte concentrations. The CDC Hormone Standardization (HoSt) Program is improving clinical assays, as evidenced by a 50% decline in mean absolute bias between mass spectrometry assays and the CDC reference method from 2007 to 2011. This program provides the measurement traceability to CDC reference methods and helps to minimize factors affecting measurement variability. |
Racial/ethnic and nativity differences in birth outcomes among mothers in New York City: the role of social ties and social support
Almeida J , Mulready-Ward C , Bettegowda VR , Ahluwalia IB . Matern Child Health J 2014 18 (1) 90-100 Immigrants have lower rates of low birth weight (LBW) and, to some extent, preterm birth (PTB) than their US-born counterparts. This pattern has been termed the 'immigrant health paradox'. Social ties and support are one proposed explanation for this phenomenon. We examined the contribution of social ties and social support to LBW and PTB by race/ethnicity and nativity among women in New York City (NYC). The NYC Pregnancy Risk Assessment Monitoring System survey (2004-2007) data, linked with selected items from birth certificates, were used to examine LBW and PTB by race/ethnicity and nativity status and the role of social ties and social support in adverse birth outcomes using bivariate and multivariable analyses. SUDAAN software was used to adjust for complex survey design and sampling weights. US- and foreign-born Blacks had significantly increased odds of PTB [adjusted odds ratio (AOR) = 2.43, 95% CI 1.56, 3.77 and AOR = 2.6, 95% CI 1.66, 4.24, respectively] compared to US-born Whites. Odds of PTB among foreign-born Other Latinas, Island-born Puerto Ricans, and foreign-born Asians were not significantly different from US-born Whites, while odds of PTB for foreign-born Whites were significantly lower (AOR = 0.47, 95% CI 0.26, 0.84). US- and foreign-born Blacks' odds of LBW were 2.5-fold those of US-born Whites. Fewer social ties were associated with 32-39% lower odds of PTB. Lower social support was associated with decreased odds of LBW (AOR 0.69, 95% CI 0.50, 0.96). We found stronger evidence of the immigrant health paradox across racial/ethnic groups for PTB than for LBW. Results also point to the importance of accurately assessing social ties and social support during pregnancy and of considering the potential downside of social ties. |
Differences in thrombotic risk factors in black and white women with adverse pregnancy outcome
Philipp CS , Faiz AS , Beckman MG , Grant A , Bockenstedt PL , Heit JA , James AH , Kulkarni R , Manco-Johnson MJ , Moll S , Ortel TL . Thromb Res 2014 133 (1) 108-11 INTRODUCTION: Black women have an increased risk of adverse pregnancy outcomes and the characteristics of thrombotic risk factors in this population are unknown. The objective of this study was to examine the racial differences in thrombotic risk factors among women with adverse pregnancy outcomes. METHODS: Uniform data were collected in women with adverse pregnancy outcomes (pregnancy losses, intrauterine growth restriction (IUGR), prematurity, placental abruption and preeclampsia) referred to Thrombosis Network Centers funded by the Centers for Disease Control and Prevention (CDC). RESULTS: Among 343 white and 66 black women seen for adverse pregnancy outcomes, protein S and antithrombin deficiencies were more common in black women. The prevalence of diagnosed thrombophilia was higher among whites compared to blacks, largely due to the Factor V Leiden mutation. The prevalence of a personal history of venous thromboembolism (VTE) did not differ significantly by race. A family history of VTE, thrombophilia, and stroke or myocardial infarction (MI) was higher among whites. Black women had a higher body mass index and a higher prevalence of hypertension, while the prevalence of sickle cell disease was approximately 27-fold higher compared to the general US black population. CONCLUSIONS: Thrombotic risk factors differ significantly in white and black women with adverse pregnancy outcomes. Such differences highlight the importance of considering race separately when assessing thrombotic risk factors for adverse pregnancy outcomes. |
Disparities in mortality rates among US infants born late preterm or early term, 2003-2005
King JP , Gazmararian JA , Shapiro-Mendoza CK . Matern Child Health J 2014 18 (1) 233-41 The purpose of this study was to identify disparities in neonatal, post-neonatal, and overall infant mortality rates among infants born late preterm (34-36 weeks gestation) and early term (37-38 weeks gestation) by race/ethnicity, maternal age, and plurality. In analyses of 2003-2005 data from US period linked birth/infant death datasets, we compared infant mortality rates by race/ethnicity, maternal age, and plurality among infants born late preterm or early term and also determined the leading causes of death among these infants. Among infants born late preterm, infants born to American Indian/Alaskan Native, non-Hispanic black, or teenage mothers had the highest infant mortality rates per 1,000 live births (14.85, 9.90, and 11.88, respectively). Among infants born early term, corresponding mortality rates were 5.69, 4.49, and 4.82, respectively. Among infants born late preterm, singletons had a higher infant mortality rate than twins (8.59 vs. 5.62), whereas among infants born early term, the rate was higher among twins (3.67 vs. 3.15). Congenital malformations and sudden infant death syndrome were the leading causes of death among both late preterm and early term infants. Infant mortality rates among infants born late preterm or early term varied substantially by maternal race/ethnicity, maternal age, and plurality. Information about these disparities may help in the development of clinical practice and prevention strategies targeting infants at highest risk. |
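The rates quoted in the entry above are deaths per 1,000 live births. As a minimal sketch of the calculation (the counts below are hypothetical, not the study's data):

```python
def infant_mortality_rate(deaths: int, live_births: int) -> float:
    """Infant mortality rate expressed as deaths per 1,000 live births."""
    return deaths / live_births * 1000

# Hypothetical example: 297 deaths among 20,000 late-preterm live births
print(round(infant_mortality_rate(297, 20_000), 2))  # 14.85
```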
Exploring risk perception and attitudes to miscarriage and congenital anomaly in rural Western Kenya
Dellicour S , Desai M , Mason L , Odidi B , Aol G , Phillips-Howard PA , Laserson KF , Ter Kuile FO . PLoS One 2013 8 (11) e80551 BACKGROUND: Understanding the socio-cultural context and perceptions of adverse pregnancy outcomes is important for informing the best approaches for public health programs. This article describes the perceptions, beliefs and health-seeking behaviours of women from rural western Kenya regarding congenital anomalies and miscarriages. METHODS: Ten focus group discussions (FGDs) were undertaken in a rural district in western Kenya in September 2010. The FGDs included separate groups consisting of adult women of childbearing age, adolescent girls, recently pregnant women, traditional birth attendants and mothers of children with a birth defect. Participants were selected purposively. A deductive thematic framework approach using the questions from the FGD guides was used to analyse the transcripts. RESULTS: There was substantial overlap between perceived causes of miscarriages and congenital anomalies and these were broadly categorized into two groups: biomedical and cultural. The biomedical causes included medications, illnesses, physical and emotional stresses, as well as hereditary causes. Cultural beliefs mostly related to the breaking of a taboo or not following cultural norms. Mothers were often stigmatised and blamed following miscarriage, or the birth of a child with a congenital anomaly. Often, women did not seek care following miscarriage unless there was a complication. Most reported that children with a congenital anomaly were neglected either because of lack of knowledge of where care could be sought or because these children brought shame to the family and were hidden from society. CONCLUSION: The local explanatory model of miscarriage and congenital anomalies covered many perceived causes within biomedical and cultural beliefs. Some of these fuelled stigmatisation and blame of the mother. 
Understanding of these beliefs, improving access to information about the possible causes of adverse outcomes, and greater collaboration between traditional healers and healthcare providers may help to reduce stigma and increase access to formal healthcare providers. |
An introduction to a medium frequency propagation characteristic measurement method of a transmission line in underground coal mines
Li J , Waynert JA , Whisner BG . Prog Electromagn Res B 2013 55 131-149 An underground coal mine medium frequency (MF) communication system generally couples its electromagnetic signals to a long conductor in a tunnel, which acts as a transmission line, and exchanges signals with transceivers along the line. The propagation characteristics of the transmission line, which is usually the longest signal path in an MF communication system, play a major role in determining system performance. To measure the MF propagation characteristics of transmission lines in coal mine tunnels, a method was developed based on a basic transmission line model. This paper presents the method along with propagation measurements made with it on a transmission line system in a coal mine. The measurements confirmed a low MF signal power loss rate and showed the influence of the electrical properties of the surrounding coal and rock on the MF propagation characteristics of the line. |
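The abstract does not detail the measurement model, but a power loss rate along a uniform line is conventionally reported in dB per unit length from power measured at two points. A hypothetical sketch under that assumption (all values illustrative):

```python
import math

def loss_rate_db_per_100m(p_near_w: float, p_far_w: float, distance_m: float) -> float:
    """Average power loss rate in dB per 100 m, assuming uniform attenuation
    between two measurement points separated by distance_m metres."""
    total_loss_db = 10 * math.log10(p_near_w / p_far_w)
    return total_loss_db / distance_m * 100

# Hypothetical: 1 W coupled in, 0.1 W measured 1,000 m down the line
print(loss_rate_db_per_100m(1.0, 0.1, 1000.0))  # 1.0 dB per 100 m
```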
Why U.S. children use dietary supplements
Bailey RL , Gahche JJ , Thomas PR , Dwyer JT . Pediatr Res 2013 74 (6) 737-41 BACKGROUND: Dietary supplements are used by one-third of children. We examined motivations for supplement use in children, the types of products used by motivations, and the role of physicians and health care practitioners in guiding choices about supplements. METHODS: We examined motivations for dietary supplement use reported for children (from birth to 19 y of age; n = 8,245) using the National Health and Nutrition Examination Survey 2007-2010. RESULTS: Dietary supplements were used by 31% of children; many different reasons were given, as follows: to "improve overall health" (41%), to "maintain health" (37%), for "supplementing the diet" (23%), to "prevent health problems" (20%), and to "boost immunity" (14%). Most children (~90%) who use dietary supplements use a multivitamin-mineral or multivitamin product. Supplement users tend to be non-Hispanic white, have higher family incomes, report more physical activity, and have health insurance. Only a small proportion of the supplements used by children (15%) were used on the recommendation of a physician or other health care provider. CONCLUSION: Most supplements used by children are not taken on the recommendation of a health care provider. The most common reasons for use of supplements in children relate to health promotion, yet little scientific data support this notion in nutrient-replete children. |
Mortality from cancer and other causes in commercial airline crews: a joint analysis of cohorts from 10 countries
Hammer GP , Auvinen A , De Stavola BL , Grajewski B , Gundestrup M , Haldorsen T , Hammar N , Lagorio S , Linnersjo A , Pinkerton L , Pukkala E , Rafnsson V , Dos-Santos-Silva I , Storm HH , Strand TE , Tzonou A , Zeeb H , Blettner M . Occup Environ Med 2014 71 (5) 313-22 BACKGROUND: Commercial airline crew is one of the occupational groups with the highest exposures to ionising radiation. Crew members are also exposed to other physical risk factors and subject to potential disruption of circadian rhythms. METHODS: This study analyses mortality in a pooled cohort of 93 771 crew members from 10 countries. The cohort was followed for a mean of 21.7 years (2.0 million person-years), during which 5508 deaths occurred. RESULTS: The overall mortality was strongly reduced in male cockpit (SMR 0.56) and female cabin crews (SMR 0.73). The mortality from radiation-related cancers was also reduced in male cockpit crew (SMR 0.73), but not in female or male cabin crews (SMR 1.01 and 1.00, respectively). The mortality from female breast cancer (SMR 1.06), leukaemia and brain cancer was similar to that of the general population. The mortality from malignant melanoma was elevated, and significantly so in male cockpit crew (SMR 1.57). The mortality from cardiovascular diseases was strongly reduced (SMR 0.46). On the other hand, the mortality from aircraft accidents was exceedingly high (SMR 33.9), as was that from AIDS in male cabin crew (SMR 14.0). CONCLUSIONS: This large study with highly complete follow-up shows a reduced overall mortality in male cockpit and female cabin crews, an increased mortality of aircraft accidents and an increased mortality in malignant skin melanoma in cockpit crew. Further analysis after longer follow-up is recommended. |
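The standardized mortality ratios (SMRs) reported in the airline-crew cohort above are ratios of observed deaths to the number expected from general-population rates. A minimal sketch (the counts below are hypothetical, not from the study):

```python
def smr(observed_deaths: int, expected_deaths: float) -> float:
    """Standardized mortality ratio: observed deaths divided by the deaths
    expected when general-population rates are applied to the cohort's
    person-years. SMR < 1 indicates lower-than-expected mortality."""
    return observed_deaths / expected_deaths

# Hypothetical: 28 observed vs 50 expected deaths -> SMR 0.56
print(smr(28, 50.0))  # 0.56
```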
Occupational safety and health criteria for responsible development of nanotechnology
Schulte PA , Geraci CL , Murashov V , Kuempel ED , Zumwalde RD , Castranova V , Hoover MD , Hodson L , Martinez KF . J Nanopart Res 2013 16 2153 Organizations around the world have called for the responsible development of nanotechnology. The goals of this approach are to emphasize the importance of considering and controlling the potential adverse impacts of nanotechnology in order to develop its capabilities and benefits. A primary area of concern is the potential adverse impact on workers, since they are the first people in society who are exposed to the potential hazards of nanotechnology. Occupational safety and health criteria for defining what constitutes responsible development of nanotechnology are needed. This article presents five criterion actions that should be practiced by decision-makers at the business and societal levels if nanotechnology is to be developed responsibly. These include (1) anticipate, identify, and track potentially hazardous nanomaterials in the workplace; (2) assess workers' exposures to nanomaterials; (3) assess and communicate hazards and risks to workers; (4) manage occupational safety and health risks; and (5) foster the safe development of nanotechnology and realization of its societal and commercial benefits. All these criteria are necessary for responsible development to occur. Since it is early in the commercialization of nanotechnology, there are still many unknowns and concerns about nanomaterials. Therefore, it is prudent to treat them as potentially hazardous until sufficient toxicology and exposure data are gathered for nanomaterial-specific hazard and risk assessments. In this emergent period, it is necessary to be clear about the extent of uncertainty and the need for prudent actions. |
Conference report: community health centers and vulnerable workers
Robbins A . New Solut 2013 23 (3) 435-7 A one-day conference in Washington brought together leaders of community health centers and worker advocates to discuss collaboration. They agreed that health centers could help protect vulnerable workers. They also agreed on several priorities: use of electronic medical records; access to workers' compensation; Medical-Legal Partnerships; better understanding of work settings in their communities; and education of clinicians about work and jobs. |
Are lithium ion cells intrinsically safe?
Dubaniewicz TH , DuCarme JP . IEEE Trans Ind Appl 2013 49 (6) 2451-60 National Institute for Occupational Safety and Health researchers are studying the potential for Li-ion-battery thermal runaway from an internal short circuit in equipment approved as permissible for use in underground coal mines. Researchers used a plastic wedge to induce internal short circuits for thermal runaway susceptibility evaluation purposes, which proved to be a more severe test than the flat plate method for selected Li-ion cells. Researchers conducted cell crush tests within a 20-L chamber filled with 6.5% CH4-air to simulate the mining hazard. Results indicate that LG Chem ICR18650S2 LiCoO2 cells pose a CH4 explosion hazard from a cell internal short circuit. Under specified test conditions, A123 Systems 26650 LiFePO4 cells were safer than the LG Chem ICR18650S2 LiCoO2 cells at a conservative statistical significance level. |
A systematic approach to evaluating public health training: the Obesity Prevention in Public Health Course
Mainor A , Leeman J , Sommers J , Heiser C , Gonzales C , Farris RP , Ammerman A . J Public Health Manag Pract 2014 20 (6) 647-53 OBJECTIVE: Public health practitioners require new knowledge and skills to address the multilevel factors contributing to obesity. This article presents the systematic approach the Center of Excellence for Training and Research Translation (Center TRT) used both to assess practitioners' competencies to lead public health obesity prevention initiatives and to evaluate its annual, competency-based obesity prevention course. DESIGN: In 2006, Center TRT identified priority public health competencies for obesity prevention and then planned 7 annual courses to address the priority competencies progressively over time. Each year, a longitudinal evaluation based on Kirkpatrick's training evaluation framework was administered to course participants (n = 243) to assess perceptions of the course (daily), changes in self-reported competency (immediately pre- and postcourse), and course impact on practice over time (at 6 months). RESULTS: Participants rated the course highly for quality and relevance. Although many participants reported low levels of confidence prior to the course, following the course, at least 70% reported feeling confident to perform almost all competencies. At 6-month follow-up, the majority of participants reported completing at least 1 activity identified during course action planning. CONCLUSIONS: We identified practitioners' high-priority competency needs and then designed 7 annual courses to progressively address those needs and new needs as they arose. This approach resulted in trainings valued by practitioners and effective in increasing their sense of competence to lead public health obesity prevention initiatives. The course's continuing impact was evidenced by participants' high level of completion of their action plans at 6-month follow-up. 
Competency-based training is important to develop a skilled public health workforce. |
Use of evidence-based practices and resources among Comprehensive Cancer Control Programs
Steele CB , Rose JM , Chovnick G , Townsend JS , Stockmyer CK , Fonseka J , Richardson LC . J Public Health Manag Pract 2014 21 (5) 441-8 CONTEXT: While efforts to promote use of evidence-based practices (EBPs) for cancer control have increased, questions remain whether this will result in widespread adoption of EBPs (eg, Guide to Community Preventive Services interventions) by comprehensive cancer control (CCC) programs. OBJECTIVE: To examine use of EBPs among CCC programs to develop cancer control plans and select interventions. DESIGN: Conducted Web-based surveys of and telephone interviews with CCC program staff between March and July 2012. SETTING: CCC programs funded by the Centers for Disease Control and Prevention's National Comprehensive Cancer Control Program (NCCCP). PARTICIPANTS: Sixty-one CCC program directors. MAIN OUTCOME MEASURES: 1) Use of and knowledge/attitudes about EBPs and related resources and 2) EBP-related technical assistance needs. RESULTS: Seventy-five percent of eligible program directors reported using EBPs to a moderate or great extent to address program objectives. Reported benefits of using EBPs included proven effectiveness, efficient use of resources, and added credibility for an intervention. Challenges to using EBPs included resource limitations, lack of culturally appropriate interventions, and limited skills in adapting EBPs for local use. Most respondents had heard of and used the Web sites for The Guide to Community Preventive Services (95% and 91%, respectively) and Cancer Control P.L.A.N.E.T. (98% and 75%, respectively). Training needs included how to adapt an EBP and its materials for cultural appropriateness (state 78%, tribe 86%, territory 80%) and how to maintain the fidelity of an EBP (state 75%, tribe 86%, territory 60%). CONCLUSIONS: While awareness, knowledge, and use of EBPs and related resources are high, respondents identified numerous challenges and training needs.
The findings from this study may be used to enhance technical assistance provided to NCCCP grantees related to selecting and implementing EBPs. |
Immunization Information Systems: a decade of progress in law and policy
Martin DW , Lowery NE , Brand B , Gold R , Horlick G . J Public Health Manag Pract 2014 21 (3) 296-303 This article reports on a study of laws, regulations, and policies governing Immunization Information Systems (IIS, also known as "immunization registries") in states and selected urban areas of the United States. The study included a search of relevant statutes, administrative codes, and published attorney general opinions/findings; an online questionnaire completed by immunization program managers and/or their staff; and follow-up telephone interviews. The legal/regulatory framework for IIS has changed considerably since 2000, largely in ways that improve IIS' ability to perform their public health functions while continuing to maintain strict confidentiality and privacy controls. Nevertheless, the exchange of immunization data and other health information between care providers and public health and between entities in different jurisdictions remains difficult, due in part to ongoing regulatory diversity. To continue to be leaders in health information exchange and facilitate immunization of children and adults, IIS will need to address the challenges presented by the interplay of federal and state legislation, regulations, and policies and continue to move toward the standardized data collection and sharing necessary for interoperable systems. |
The Infectious Diseases Society of America Emerging Infections Network -bridging the gap between clinical infectious diseases and public health
Pillai SK , Beekmann SE , Santibanez S , Polgreen PM . Clin Infect Dis 2014 58 (7) 991-6 In 1995 the Centers for Disease Control and Prevention (CDC) granted a Cooperative Agreement Program award to the Infectious Diseases Society of America (IDSA) to develop a provider-based emerging infections sentinel network: Emerging Infections Network (EIN). Over the past 17 years, the EIN has evolved into a flexible, nationwide network with membership representing a broad cross-section of infectious diseases physicians. The EIN has an active electronic mail conference (listserv) that facilitates communication among infectious diseases providers and the public health community, and also sends members periodic queries (short surveys on infectious disease topics) that have addressed numerous topics relevant to both clinical infectious diseases and public health practice. The following article reviews how the various functions of EIN contribute to clinical care and public health, identifies opportunities to further link clinical medicine and public health, and describes future directions for the EIN. |
Number of embryos transferred after in vitro fertilization and good perinatal outcome
Kissin DM , Kulkarni AD , Kushnir VA , Jamieson DJ . Obstet Gynecol 2014 123 239-247 OBJECTIVE: To assess the association between number of embryos transferred and a measure of assisted reproductive technology success that emphasizes good perinatal outcome. METHODS: We analyzed assisted reproductive technology cycles initiated in 2011 that progressed to fresh embryo transfer among women using autologous oocytes and reported to the U.S. National Assisted Reproductive Technology Surveillance System (n=82,508). Percentages of good perinatal outcome (live birth of a term [at or after 37 weeks of gestation], normal birth weight [2,500 g or greater] singleton) were stratified by prognosis (favorable, average, less favorable), age, embryo stage (day 3, day 5), and number of embryos transferred. Differences in the percentages by number of embryos transferred were evaluated using Fisher's exact test with Bonferroni correction. RESULTS: Among patients younger than 35 years with a favorable prognosis, chances of a good perinatal outcome were higher with transferring a single (compared with double) day 5 (43% compared with 27%) or day 3 embryo (36% compared with 30%). Likewise, a higher chance of a good perinatal outcome was observed with transferring a single day 5 embryo in patients 35-37 years old with a favorable prognosis (39% compared with 28%) or patients younger than 35 years old with an average prognosis (35% compared with 26%). A higher chance of good perinatal outcome was associated with transferring two (compared with one) day 3 embryos among patients aged 40 years or younger with an average prognosis or patients younger than 35 years old with a less favorable prognosis. CONCLUSION: The association between number of embryos transferred and the birth of a term, normal birth weight singleton is described. 
Among patients younger than 35 years of age undergoing in vitro fertilization with a favorable prognosis, the highest chance of good perinatal outcome is associated with a single embryo transfer. LEVEL OF EVIDENCE: II. |
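The group comparisons in the entry above used Fisher's exact test with Bonferroni correction. A self-contained sketch of both for a 2x2 outcome table, using only the standard library (the counts below are hypothetical, not the study's data):

```python
from math import comb

def fisher_exact_two_sided(a: int, b: int, c: int, d: int) -> float:
    """Two-sided Fisher's exact p-value for a 2x2 table [[a, b], [c, d]]:
    the sum of hypergeometric probabilities of all tables with the same
    margins that are no more likely than the observed table."""
    n = a + b + c + d
    row1, col1 = a + b, a + c

    def pmf(k: int) -> float:
        return comb(col1, k) * comb(n - col1, row1 - k) / comb(n, row1)

    p_obs = pmf(a)
    k_min = max(0, row1 + col1 - n)
    k_max = min(row1, col1)
    return sum(pmf(k) for k in range(k_min, k_max + 1)
               if pmf(k) <= p_obs * (1 + 1e-9))

def bonferroni(p: float, n_comparisons: int) -> float:
    """Bonferroni-adjusted p-value (capped at 1)."""
    return min(1.0, p * n_comparisons)

# Hypothetical counts echoing the 43% vs 27% contrast: good outcomes out of
# 100 single transfers vs 100 double transfers, with 4 planned comparisons
p = fisher_exact_two_sided(43, 57, 27, 73)
print(f"p = {p:.4f}, Bonferroni-adjusted = {bonferroni(p, 4):.4f}")
```

The inner `pmf` enumerates only the feasible values of the upper-left cell, so the function stays exact without any continuity approximation.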
Internationally comparable diagnosis-specific survival probabilities for calculation of the ICD-10-based Injury Severity Score
Gedeborg R , Warner M , Chen LH , Gulliver P , Cryer C , Robitaille Y , Bauer R , Ubeda C , Lauritsen J , Harrison J , Henley G , Langley J . J Trauma Acute Care Surg 2014 76 (2) 358-65 BACKGROUND: The International Statistical Classification of Diseases, 10th Revision (ICD-10)-based Injury Severity Score (ICISS) performs well but requires diagnosis-specific survival probabilities (DSPs), which are empirically derived, for its calculation. The objective was to examine if DSPs based on data pooled from several countries could increase accuracy, precision, utility, and international comparability of DSPs and ICISS. METHODS: Australia, Argentina, Austria, Canada, Denmark, New Zealand, and Sweden provided ICD-10-coded injury hospital discharge data, including in-hospital mortality status. Data from the seven countries were pooled using four different methods to create an international collaborative effort ICISS (ICE-ICISS). The ability of the ICISS to predict mortality using the country-specific DSPs and the pooled DSPs was estimated and compared. RESULTS: The pooled DSPs were based on a total of 3,966,550 observations of injury diagnoses from the seven countries. The proportion of injury diagnoses having at least 100 discharges to calculate the DSP varied from 12% to 48% in the country-specific data set and was 66% in the pooled data set. When compared with using a country's own DSPs for ICISS calculation, the pooled DSPs resulted in somewhat reduced discrimination in predicting mortality (difference in c statistic varied from 0.006 to 0.04). Calibration was generally good when the predicted mortality risk was less than 20%. When Danish and Swedish data were used, ICISS was combined with age and sex in a logistic regression model to predict in-hospital mortality. Including age and sex improved both discrimination and calibration substantially, and the differences from using country-specific or pooled DSPs were minor. 
CONCLUSION: Pooling data from seven countries generated empirically derived DSPs. These pooled DSPs facilitate international comparisons and enable the use of ICISS in all settings where ICD-10 hospital discharge diagnoses are available. The modest reduction in performance of the ICE-ICISS compared with the country-specific scores is unlikely to outweigh the benefit of the internationally comparable Injury Severity Scores made possible with pooled data. LEVEL OF EVIDENCE: Prognostic and epidemiological study. Level III. |
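ICISS is conventionally calculated as the product of the diagnosis-specific survival probabilities for all of a patient's injury diagnoses, which is why the empirically derived DSPs discussed above are its only required input. A minimal sketch (the ICD-10 codes and DSP values are illustrative only, not the pooled ICE-ICISS estimates):

```python
from math import prod

def iciss(diagnosis_codes: list[str], dsp: dict[str, float]) -> float:
    """ICD-10-based Injury Severity Score: the product of the empirically
    derived diagnosis-specific survival probabilities (DSPs) for all of a
    patient's injury diagnoses. Lower values indicate more severe injury."""
    return prod(dsp[code] for code in diagnosis_codes)

# Hypothetical DSP lookup table built from hospital discharge data
dsp_table = {"S06.5": 0.90, "S27.3": 0.95, "S72.0": 0.99}
print(iciss(["S06.5", "S27.3", "S72.0"], dsp_table))
```

The multiplicative form is what makes pooled DSPs attractive: once each diagnosis has at least ~100 discharges behind its estimate, the same lookup table scores patients in any participating country.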
Tobacco control progress and potential
Frieden TR . JAMA 2014 311 (2) 133-4 The 1964 surgeon general’s report on the health harms of smoking “hit the country like a bombshell.”1 More than 40% of US adults smoked, and smoking was accepted and considered normal behavior. Today, the US adult smoking rate is around 18%2 and about half of Americans are protected from secondhand smoke in workplaces.3 | Researchers documented the harms of tobacco through rigorous, often innovative studies; activists implemented tobacco control interventions and then evaluated them rigorously to establish practice-based evidence. Tobacco use declined steadily as the evidence base of successful tactics increased, and social mores changed gradually as a result of education, advocacy, and policy interventions. Tobacco control has been described, accurately, as one of the great public health successes of the 20th century. | However, there are 2 important and concerning surprises in tobacco control. First, even 50 years later, studies are continuing to elucidate new ways tobacco causes death and disability among both smokers and people exposed to secondhand smoke—new diseases it causes or complicates. Tobacco is, quite simply, in a league of its own in terms of the sheer numbers and varieties of ways it kills and maims people. Second, despite progress both in the United States and globally, proven strategies have not been fully implemented to protect children, support smokers who want to quit, and prevent myocardial infarctions, strokes, cancers, and other tragic and expensive health consequences of smoking. |
Vital Signs: communication between health professionals and their patients about alcohol use - 44 States and the District of Columbia, 2011
McKnight-Eily LR , Liu Y , Brewer RD , Kanny D , Lu H , Denny CH , Balluz L , Collins J . MMWR Morb Mortal Wkly Rep 2014 63 (1) 16-22 INTRODUCTION: Excessive alcohol use accounted for an estimated 88,000 deaths in the United States each year during 2006-2010, and $224 billion in economic costs in 2006. Since 2004, the U.S. Preventive Services Task Force (USPSTF) has recommended alcohol misuse screening and behavioral counseling (also known as alcohol screening and brief intervention [ASBI]) for adults to address excessive alcohol use; however, little is known about the prevalence of its implementation. ASBI will also be covered by many health insurance plans because of the Affordable Care Act. METHODS: CDC analyzed Behavioral Risk Factor Surveillance System (BRFSS) data from a question added to surveys in 44 states and the District of Columbia (DC) from August 1 to December 31, 2011, about patient-reported communication with a health professional about alcohol. Elements of ASBI are traditionally delivered via conversation. Weighted state-level prevalence estimates of this communication were generated for 166,753 U.S. adults aged ≥18 years by selected demographic characteristics and drinking behaviors. RESULTS: The prevalence of ever discussing alcohol use with a health professional was 15.7% among U.S. adults overall, 17.4% among current drinkers, and 25.4% among binge drinkers. It was most prevalent among those aged 18-24 years (27.9%). However, only 13.4% of binge drinkers reported discussing alcohol use with a health professional in the past year, and only 34.9% of those who reported binge drinking ≥10 times in the past month had ever discussed alcohol with a health professional. State-level estimates of communication about alcohol ranged from 8.7% in Kansas to 25.5% in DC. CONCLUSIONS: Only one of six U.S. 
adults, including binge drinkers, reported ever discussing alcohol consumption with a health professional, despite public health efforts to increase ASBI implementation. IMPLICATIONS FOR PUBLIC HEALTH PRACTICE: Increased implementation of ASBI, including systems-level changes such as integration into electronic health records processes, might reduce excessive alcohol consumption and the harms related to it. Routine surveillance of ASBI by states and communities might support monitoring and increasing its implementation. |
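The BRFSS analysis above reports survey-weighted, not raw, prevalences. As an illustrative sketch only (toy numbers and a hypothetical helper, not BRFSS data or CDC's analysis code), a weighted prevalence divides the weight-sum of respondents reporting the outcome by the total weight, so respondents representing larger population segments count proportionally more:

```python
# Minimal sketch of a survey-weighted prevalence estimate.
# The data below are invented for illustration; real BRFSS estimates
# also require design-based variance estimation, which is omitted here.
def weighted_prevalence(responses, weights):
    """responses: 1/0 outcome indicators; weights: final survey weights."""
    total_weight = sum(weights)
    return sum(r * w for r, w in zip(responses, weights)) / total_weight

# Hypothetical respondents: did they ever discuss alcohol use
# with a health professional?
responses = [1, 0, 0, 1, 0]
weights = [1200.0, 800.0, 500.0, 300.0, 1700.0]  # invented final weights
estimate = weighted_prevalence(responses, weights)
print(round(estimate, 3))  # 0.333
```

The unweighted proportion here would be 2/5 = 0.4; weighting shifts the estimate because the two "yes" respondents carry relatively little of the total weight.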
Patterns of current use of tobacco products among U.S. high school students for 2000-2012-findings from the National Youth Tobacco Survey
Arrazola RA , Kuiper NM , Dube SR . J Adolesc Health 2014 54 (1) 54-60.e9 PURPOSE: The purpose of this study was to assess patterns and trends of tobacco use among high school students to better understand which products are used individually or concurrently. METHODS: Data from the National Youth Tobacco Survey from 2000 through 2012 were used to assess patterns and trends of current tobacco use (cigarettes, cigars, smokeless tobacco, and other tobacco products) among U.S. high school students. We assessed use of products individually and concurrently. RESULTS: During 2000-2012, overall linear declines were observed among high school students in current use of any tobacco product, from 33.6% to 20.4% (p < .05); current use of only 1 tobacco product, from 18.8% to 10.5% (p < .05); and current poly tobacco use, from 14.7% to 9.9% (p < .05). Overall current use of only cigarettes showed both a linear decline, from 14.0% to 4.7%, and a quadratic trend. CONCLUSIONS: During 2000-2012, the most significant overall decline observed was for students who reported smoking only cigarettes. The results suggest that more data on the use of multiple tobacco products, not just cigarettes, are needed to guide tobacco prevention and control policies and programs. |
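The "linear decline" language above refers to trend tests over the survey years. As a rough, hedged sketch (synthetic prevalences interpolated between the reported 2000 and 2012 endpoints, not the actual NYTS estimates, which would be analyzed with weighted regression), a closed-form least-squares slope illustrates what a linear trend in annual prevalence looks like:

```python
# Illustrative only: fit a least-squares slope to annual prevalence.
# The prevalence values are synthetic points on a straight line between
# the reported endpoints (33.6% in 2000, 20.4% in 2012); real NYTS trend
# analyses use survey-weighted logistic regression, not this shortcut.
def ols_slope(xs, ys):
    """Ordinary least-squares slope: cov(x, y) / var(x)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    return sxy / sxx

years = list(range(2000, 2013))
prevalence = [33.6 - 1.1 * (y - 2000) for y in years]  # synthetic line
slope = ols_slope(years, prevalence)
print(round(slope, 2))  # ≈ -1.1 percentage points per year
```

A significant quadratic term on top of the linear one, as reported for cigarette-only use, would indicate that the rate of decline itself changed over the period.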
Estimating cotinine associations and a saliva cotinine level to identify active cigarette smoking in Alaska Native pregnant women
Smith JJ , Robinson RF , Khan BA , Sosnoff CS , Dillard DA . Matern Child Health J 2014 18 (1) 120-8 Studies indicate that nicotine metabolism varies by race and can change during pregnancy. Given high rates of tobacco use and limited studies among Alaska Native (AN) women, we estimated associations of saliva cotinine levels with cigarette use and secondhand smoke (SHS) exposure and estimated a saliva cotinine cutoff to distinguish smoking from non-smoking pregnant AN women. Using questionnaire data and saliva cotinine, we used multivariable linear regression (n = 370) to estimate cotinine associations with tobacco use, SHS exposure, demographic factors, and pregnancy-related factors. Additionally, we estimated an optimal saliva cotinine cutoff for identifying active cigarette use in pregnant AN women using receiver operating characteristic (ROC) curve analysis (n = 377). Saliva cotinine significantly decreased with maternal age and significantly increased with cigarettes smoked per day, SHS exposure, and number of previous full-term pregnancies. Using self-reported cigarette use in the past 7 days as the indicator of active smoking, the area under the ROC curve was 0.975 (95% CI: 0.960-0.990). The point closest to 100% specificity and sensitivity occurred at a cotinine concentration of 1.07 ng/mL, which corresponded to a sensitivity of 94% and a specificity of 94%. We recommend a saliva cotinine cutoff of 1 ng/mL to identify active smoking in pregnant AN women. This cutoff is lower than those used in other studies of pregnant women, most likely because of the high prevalence of light or intermittent smoking in the AN population. Continued study of cotinine levels in diverse populations is needed. |
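The cutoff-selection step described above (the ROC point closest to 100% sensitivity and 100% specificity) can be sketched in a few lines. This is an illustrative reimplementation on invented data, not the study's code or cotinine measurements: each candidate threshold is scored by its Euclidean distance to the ideal corner of the ROC plot, and the closest one wins.

```python
# Hedged sketch: pick the threshold minimizing distance to the ideal
# ROC point (sensitivity = 1, specificity = 1). Data are hypothetical.
import math

def closest_to_ideal_cutoff(values, labels):
    """values: biomarker levels; labels: True if self-reported smoker."""
    best = None
    for t in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if y and v >= t)
        fn = sum(1 for v, y in zip(values, labels) if y and v < t)
        tn = sum(1 for v, y in zip(values, labels) if not y and v < t)
        fp = sum(1 for v, y in zip(values, labels) if not y and v >= t)
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        dist = math.hypot(1 - sens, 1 - spec)  # distance to (1, 1)
        if best is None or dist < best[0]:
            best = (dist, t, sens, spec)
    return best[1:]  # (cutoff, sensitivity, specificity)

# Hypothetical saliva cotinine values (ng/mL); True = active smoker
values = [0.1, 0.2, 0.3, 0.5, 0.8, 1.2, 3.0, 8.0, 15.0, 40.0]
labels = [False, False, False, False, False, True, True, True, True, True]
cutoff, sens, spec = closest_to_ideal_cutoff(values, labels)
print(cutoff, sens, spec)  # 1.2 1.0 1.0
```

On this toy data the groups separate perfectly, so the chosen cutoff achieves 100%/100%; in the study's real data the best achievable point was 94% sensitivity and 94% specificity at 1.07 ng/mL.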
Prevalence and characterization of Cryptosporidium spp. in dairy cattle in Nile River delta provinces, Egypt
Amer S , Zidan S , Adamu H , Ye J , Roellig D , Xiao L , Feng Y . Exp Parasitol 2013 135 (3) 518-23 Molecular characterizations of Cryptosporidium spp. in dairy cattle in industrialized nations have mostly shown a dominance of Cryptosporidium parvum, especially its IIa subtypes in pre-weaned calves. Few studies, however, have been conducted on the distribution of Cryptosporidium species and C. parvum subtypes in various age groups of dairy cattle in developing countries. In this study, we examined the prevalence and molecular characteristics of Cryptosporidium in dairy cattle in four Nile River delta provinces in Egypt. Modified Ziehl-Neelsen acid-fast microscopy was used to screen for Cryptosporidium oocysts in 1974 fecal specimens from animals of different ages on 12 farms. Positive fecal specimens were identified from all studied farms with an overall prevalence of 13.6%. By age group, the infection rates were 12.5% in pre-weaned calves, 10.4% in post-weaned calves, 22.1% in heifers, and 10.7% in adults. PCR-RFLP and DNA sequence analyses of microscopy-positive fecal specimens revealed the presence of four major Cryptosporidium species. In pre-weaned calves, C. parvum was most common (30/69 or 43.5%), but Cryptosporidium ryanae (13/69 or 18.8%), Cryptosporidium bovis (7/69 or 10.2%), and Cryptosporidium andersoni (7/69 or 10.2%) were also present at much higher frequencies than those seen in most industrialized nations. Mixed infections were seen in 12/69 (17.4%) of genotyped specimens. In contrast, C. andersoni was the dominant species (193/195 or 99.0%) in post-weaned calves and older animals. Subtyping of C. parvum based on sequence analysis of the 60 kDa glycoprotein gene showed the presence of subtypes IIdA20G1 in nine specimens, IIaA15G1R1 in 27 specimens, and a rare subtype IIaA14G1R1r1b in one specimen. The common occurrence of non-C. 
parvum species and IId subtypes in pre-weaned calves is a distinct feature of cryptosporidiosis transmission in dairy cattle in Egypt. The detection in calves of the same two dominant IIa and IId C. parvum subtypes recently found in humans in Egypt suggests that calves are a potential reservoir of zoonotic cryptosporidiosis. |
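Subtype names such as IIaA15G1R1 follow the standard gp60 nomenclature, in which the leading token is the subtype family and the A/G/R fields count short sequence repeats. As a hedged illustration of how such names decompose (a hypothetical helper, and a simplification: trailing variant markers like the "r1b" in IIaA14G1R1r1b are not handled), a small parser might look like:

```python
# Illustrative parser for gp60-style subtype names, e.g. "IIaA15G1R1".
# Simplified: assumes the common family + A/G/R pattern and ignores
# rarer suffix variants such as the trailing "r1b" seen in the abstract.
import re

PATTERN = re.compile(
    r"^(?P<family>[IVX]+[a-z]?)"   # subtype family, e.g. IIa, IId
    r"A(?P<tca>\d+)"               # number of TCA trinucleotide repeats
    r"(?:G(?P<tcg>\d+))?"          # number of TCG repeats (optional)
    r"(?:R(?P<r>\d+))?$"           # copies of the R repeat (optional)
)

def parse_gp60(name):
    m = PATTERN.match(name)
    if not m:
        raise ValueError(f"unrecognized subtype name: {name}")
    d = m.groupdict()
    return {
        "family": d["family"],
        "TCA": int(d["tca"]),
        "TCG": int(d["tcg"]) if d["tcg"] else 0,
        "R": int(d["r"]) if d["r"] else 0,
    }

print(parse_gp60("IIaA15G1R1"))
print(parse_gp60("IIdA20G1"))
```

Grouping by the parsed `family` field is what lets studies like this one compare the zoonotic IIa and IId families across hosts.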
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Disease Reservoirs and Vectors
- Drug Safety
- Environmental Health
- Food Safety
- Genetics and Genomics
- Health Behavior and Risk
- Healthcare Associated Infections
- Immunity and Immunization
- Laboratory Sciences
- Maternal and Child Health
- Mining
- Nutritional Sciences
- Occupational Safety and Health
- Occupational Safety and Health - Mining
- Program Evaluation
- Public Health Law
- Public Health Leadership and Management
- Reproductive Health
- Statistics as Topic
- Substance Use and Abuse
- Veterinary Medicine
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed: Feb 1, 2024
- Page last updated: Apr 29, 2024