Relation of candidate genes that encode for endothelial function to migraine and stroke: the Stroke Prevention in Young Women Study
MacClellan LR , Howard TD , Cole JW , Stine OC , Giles WH , O'Connell JR , Wozniak MA , Stern BJ , Mitchell BD , Kittner SJ . Stroke 2009 40 (10) e550-7 BACKGROUND AND PURPOSE: Migraine with aura is a risk factor for ischemic stroke, but the mechanism by which these disorders are associated remains unclear. Both disorders exhibit familial clustering, which may imply a genetic influence on migraine and stroke risk. Genes encoding for endothelial function are promising candidate genes for migraine and stroke susceptibility because of the importance of endothelial function in regulating vascular tone and cerebral blood flow. METHODS: Using data from the Stroke Prevention in Young Women study, a population-based case-control study including 297 women aged 15 to 49 years with ischemic stroke and 422 women without stroke, we evaluated whether polymorphisms in genes regulating endothelial function, including endothelin-1 (EDN), endothelin receptor type B (EDNRB), and nitric oxide synthase-3 (NOS3), confer susceptibility to migraine and stroke. RESULTS: EDN SNPs rs1800542 and rs10478723 were associated with increased stroke susceptibility in whites (OR, 2.1; 95% CI, 1.1-4.2 and OR, 2.2; 95% CI, 1.1-4.4; P=0.02 and 0.02, respectively), as were EDNRB SNPs rs4885493 and rs10507875 (OR, 1.7; 95% CI, 1.1-2.7 and OR, 2.4; 95% CI, 1.4-4.3; P=0.01 and 0.002, respectively). Only 1 of the tested SNPs (NOS3 rs3918166) was associated with both migraine and stroke. CONCLUSIONS: In our study population, variants in EDN and EDNRB were associated with stroke susceptibility in white but not in black women. We found no evidence that these genes mediate the association between migraine and stroke. |
Three decade change in the prevalence of hearing impairment and its association with diabetes in the United States
Cheng YJ , Gregg EW , Saaddine JB , Imperatore G , Zhang X , Albright AL . Prev Med 2009 49 (5) 360-4 OBJECTIVE: To examine the secular change in the prevalence of hearing impairment over three decades in U.S. adults with and without diabetes. METHODS: The cross-sectional National Health and Nutrition Examination Surveys (NHANES; the 1971-1973 [NHANES I] and the 1999-2004 [NHANES 1999-2004]) were used. Average pure-tone audiometry thresholds in decibels (dB) at the 1, 2, 3, and 4 kHz frequencies of the worse ear were used to represent the participants' hearing status. Any hearing impairment was defined as an average pure-tone audiometry threshold of the worse ear >25 dB. RESULTS: From 1971 to 2004, among adults without diabetes aged 25 to 69 years, the unadjusted prevalence of hearing impairment decreased from 27.9% to 19.1% (P <0.001), but among adults with diabetes there was no significant change (46.4% to 48.5%). After adjustment for age, sex, race, and education, the prevalence of hearing impairment in the NHANES I and NHANES 1999-2004, respectively, was 24.4% (95% confidence interval [CI], 22.3-26.6%) and 22.3% (95% CI, 20.4-24.2%) for adults without diabetes and 28.5% (95% CI, 20.4-36.6%) and 34.4% (95% CI, 29.1-39.7%) for adults with diabetes. The adjusted prevalence ratio of hearing impairment for persons with diabetes vs. those without diabetes was 1.17 (95% CI, 0.87-1.57) for NHANES I and 1.53 (95% CI, 1.28-1.83) for NHANES 1999-2004. CONCLUSIONS: Persons with diabetes have a higher prevalence of hearing impairment, and they have not achieved the same reductions in hearing impairment over time as have persons without diabetes. |
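The impairment definition above (average pure-tone threshold of the worse ear at 1, 2, 3, and 4 kHz greater than 25 dB) can be sketched in code. This is an illustrative sketch only; the function names and audiometry readings below are hypothetical, not part of the NHANES analysis:

```python
# Sketch of the study's hearing-impairment definition: average the pure-tone
# thresholds (dB) at 1, 2, 3, and 4 kHz for each ear, take the worse (higher)
# ear, and flag impairment when that average exceeds 25 dB.

def worse_ear_average(left_db, right_db):
    """Return the worse-ear average of the four per-frequency thresholds."""
    left_avg = sum(left_db) / len(left_db)
    right_avg = sum(right_db) / len(right_db)
    return max(left_avg, right_avg)

def has_hearing_impairment(left_db, right_db, cutoff_db=25.0):
    """True if the worse-ear average exceeds the 25 dB cutoff used in the study."""
    return worse_ear_average(left_db, right_db) > cutoff_db

# Hypothetical thresholds at 1, 2, 3, 4 kHz for each ear:
print(has_hearing_impairment([10, 15, 20, 25], [15, 20, 30, 45]))  # True (worse ear averages 27.5 dB)
print(has_hearing_impairment([5, 10, 10, 15], [10, 10, 15, 20]))   # False
```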
Association of type 1 diabetes with month of birth among US youth: The SEARCH for Diabetes in Youth Study
Kahn HS , Morgan TM , Case LD , Dabelea D , Mayer-Davis EJ , Lawrence JM , Marcovina SM , Imperatore G , SEARCH for Diabetes in Youth Study Group . Diabetes Care 2009 32 (11) 2010-5 OBJECTIVE: Seasonal environment at birth may influence diabetes incidence in later life. We sought evidence for this effect in a large sample of diabetic youth residing in the US. RESEARCH DESIGN AND METHODS: We compared the distribution of birth months within the SEARCH for Diabetes in Youth Study with the monthly distributions in US births tabulated by race for years 1982-2005. SEARCH participants (9,737 youth with type 1 and 1,749 with type 2 diabetes) were identified by 6 collaborating US centers. RESULTS: Among type 1 diabetic youth the ratio of observed to expected births differed across the months (P = 0.0092; decreased in October-February, increased in March-July). Their smoothed birth-month estimates demonstrated a deficit in November-February births and an excess in April-July births (smoothed May vs. January relative risk [RR] 1.06; 95% CI 1.02-1.11). Stratifications by sex or by 3 racial groups showed similar patterns relating type 1 diabetes to month of birth. Stratification by geographic regions showed a peak-to-nadir RR of 1.10 (CI 1.04-1.16) in study regions from northern latitudes (Colorado, western Washington State, and southern Ohio) but no birth-month effect (P >0.9) in study regions from more southern locations. Among type 2 diabetic youth, associations with birth month were inconclusive. CONCLUSIONS: Spring births were associated with increased likelihood of type 1 diabetes, but possibly not in all US regions. Causal mechanisms may involve factors dependent on geographic latitude such as solar irradiance, but it is unknown whether they influence prenatal or early postnatal development. |
Healthy living is the best revenge: findings from the European Prospective Investigation Into Cancer and Nutrition-Potsdam study
Ford ES , Bergmann MM , Kroger J , Schienkiewitz A , Weikert C , Boeing H . Arch Intern Med 2009 169 (15) 1355-62 BACKGROUND: Our objective was to describe the reduction in relative risk of developing major chronic diseases such as cardiovascular disease, diabetes, and cancer associated with 4 healthy lifestyle factors among German adults. METHODS: We used data from 23,153 German participants aged 35 to 65 years from the European Prospective Investigation Into Cancer and Nutrition-Potsdam study. End points included confirmed incident type 2 diabetes mellitus, myocardial infarction, stroke, and cancer. The 4 factors were never smoking, having a body mass index lower than 30 (calculated as weight in kilograms divided by height in meters squared), performing 3.5 h/wk or more of physical activity, and adhering to healthy dietary principles (high intake of fruits, vegetables, and whole-grain bread and low meat consumption). The 4 factors (healthy, 1 point; unhealthy, 0 points) were summed to form an index that ranged from 0 to 4. RESULTS: During a mean follow-up of 7.8 years, 2,006 participants developed new-onset diabetes (3.7%), myocardial infarction (0.9%), stroke (0.8%), or cancer (3.8%). Fewer than 4% of participants had zero healthy factors, most had 1 to 3 healthy factors, and approximately 9% had 4 factors. After adjusting for age, sex, educational status, and occupational status, the hazard ratio for developing a chronic disease decreased progressively as the number of healthy factors increased. Participants with all 4 factors at baseline had a 78% (95% confidence interval [CI], 72% to 83%) lower risk of developing a chronic disease (diabetes, 93% [95% CI, 88% to 95%]; myocardial infarction, 81% [95% CI, 47% to 93%]; stroke, 50% [95% CI, -18% to 79%]; and cancer, 36% [95% CI, 5% to 57%]) than participants without a healthy factor. CONCLUSION: Adhering to 4 simple healthy lifestyle factors can have a strong impact on the prevention of chronic diseases. |
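The 4-point lifestyle index described in that abstract (one point each for never smoking, BMI below 30, at least 3.5 h/wk of physical activity, and a healthy diet) can be sketched as follows. The function and parameter names are hypothetical illustrations, not the EPIC-Potsdam study's analysis code:

```python
# Sketch of the 4-point healthy-lifestyle index: each healthy factor scores
# 1 point, each unhealthy factor 0 points, for a total of 0-4.

def lifestyle_index(never_smoked, bmi, activity_hours_per_week, healthy_diet):
    """Score one point per healthy factor, as described in the abstract."""
    return (
        int(never_smoked)                        # never smoking
        + int(bmi < 30)                          # body mass index below 30
        + int(activity_hours_per_week >= 3.5)    # >= 3.5 h/wk physical activity
        + int(healthy_diet)                      # adheres to healthy dietary principles
    )

print(lifestyle_index(True, 24.0, 4.0, True))    # 4: all factors healthy
print(lifestyle_index(False, 31.5, 2.0, False))  # 0: no healthy factors
```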
The influence of hepatitis B virus genotype and subgenotype on the natural history of chronic hepatitis B
McMahon BJ . Hepatol Int 2009 3 (2) 334-42 BACKGROUND: Chronic infection with hepatitis B virus (HBV) is associated with a high lifetime risk of developing hepatocellular carcinoma (HCC) and cirrhosis of the liver. PURPOSE: To review the studies published to date regarding the association of HBV genotypes and subgenotypes in the development of adverse sequelae from HBV. METHODS: Review of the literature for articles describing studies of HBV genotypes/subgenotypes and development of HCC, cirrhosis, and liver-related death. RESULTS: Eight genotypes of HBV (A through H), which differ from each other in viral genome sequence by more than 8%, and multiple subgenotypes, which differ from each other by 4-8%, have been identified. Recently, studies investigating the association between the risks of developing HCC and cirrhosis by specific HBV genotypes and subgenotypes have reported marked differences in outcome. Certain HBV genotypes and subgenotypes, including genotype C, B2-5, and F1, appear to be associated with a higher risk of developing HCC, and others, including genotypes B1, B6, and A2, appear to be associated with a lower risk of complications of HBV. Our understanding of the role of HBV genotypes and subgenotypes on the outcome of HBV infection is limited, as few population-based prospective studies have been performed; most studies compare outcomes only in areas where two genotypes predominate, and others have not examined subgenotypes. CONCLUSIONS: Studies to date suggest that HBV genotypes/subgenotypes have important influences on the outcome of chronic HBV infection, but more population-based prospective studies examining multiple genotypes are needed. |
Mupirocin resistance
Patel JB , Gorwitz RJ , Jernigan JA . Clin Infect Dis 2009 49 (6) 935-41 With increasing pressure to prevent methicillin-resistant Staphylococcus aureus (MRSA) infection, it is possible that there will be increased use of mupirocin for nasal decolonization of MRSA. Understanding the mechanisms, clinical significance, and epidemiology of mupirocin resistance is important for predicting how changes in mupirocin use may affect bacterial populations and MRSA control. High-level mupirocin resistance in S. aureus is mediated by a plasmid-encoded mupA gene. This gene can be found on conjugative plasmids that carry multiple resistance determinants for other classes of antimicrobial agents. High-level resistance has been associated with decolonization failure, and increased resistance rates have been associated with increased mupirocin use. Low-level mupirocin resistance is mediated via mutation in the native ileS gene, and the clinical significance of this resistance is unclear. Laboratory tests to detect and distinguish between these types of resistance have been described but are not widely available in the United States. Institutions that are considering the implementation of widespread mupirocin use should consider these resistance issues and develop strategies to monitor the impact of mupirocin use. |
Obstetrician/gynecologists' knowledge, attitudes, and practices regarding prevention of infections in pregnancy
Ross DS , Rasmussen SA , Cannon MJ , Anderson B , Kilker K , Tumpey A , Schulkin J , Jones JL . J Womens Health (Larchmt) 2009 18 (8) 1187-93 BACKGROUND: Maternal infection during pregnancy is a well-recognized cause of birth defects and developmental disabilities, as well as an important contributor to other adverse pregnancy outcomes. The objective of the present survey was to gain information about the knowledge, attitudes, and practices of obstetrician/gynecologists regarding prevention of infections during pregnancy. METHODS: A survey was mailed to 606 Collaborative Ambulatory Research Network (CARN) members of the American College of Obstetricians and Gynecologists (ACOG) (approximately 2% of membership). CARN members were sampled to demographically represent ACOG. RESULTS: Of the 606 eligible respondents, surveys were received from 305 (response rate: 50%). Most obstetrician/gynecologists knew that specific actions by pregnant women could reduce the risk of infection. Seventy-nine to eighty-eight percent reported counseling pregnant women about preventing infection from Toxoplasma gondii, hepatitis B virus, and influenza; 50%-68% about varicella-zoster virus, Listeria monocytogenes, and parvovirus B19; and <50% about cytomegalovirus, Bordetella pertussis, and lymphocytic choriomeningitis virus. The majority reported that time constraints were a barrier to counseling, although most reported that educational materials would be helpful. CONCLUSIONS: Knowledge was accurate and preventive counseling was appropriate for some infections, but for others it could be improved. Further studies are needed to identify strategies to increase preventive counseling. |
Reporting patterns and characteristics of tuberculosis among international travelers, United States, June 2006 to May 2008
Modi S , Buff AM , Lawson CJ , Rodriguez D , Kirking HL , Lipman H , Fishbein DB . Clin Infect Dis 2009 49 (6) 885-91 BACKGROUND: As part of efforts to prevent the introduction of communicable diseases into the United States, the Centers for Disease Control and Prevention (CDC) conducts surveillance for selected diseases in international travelers. One of these diseases, tuberculosis (TB), received substantial attention in May 2007 when the CDC issued travel restrictions and a federal isolation order for a person with drug-resistant TB who traveled internationally against public health recommendations. METHODS: Reports of TB in international travelers in the CDC's Quarantine Activity Reporting System (QARS) from 1 June 2006 through 31 May 2007 (year 1) were compared with reports from 1 June 2007 through 31 May 2008 (year 2). These reports were classified using the CDC and American Thoracic Society guidelines and analyzed for epidemiologic characteristics and trends. RESULTS: Among QARS reports, 4.6% were classified as active TB disease and 1.7% as no TB disease. Active TB disease reports increased from 2.5% of QARS reports in year 1 to 6.4% in year 2 ([Formula: see text]). The proportion of active TB disease reports leading to a federal travel restriction increased from 6.8% in year 1 to 15.4% in year 2 ([Formula: see text]). CONCLUSIONS: The significant increase in reports of international travelers with TB disease likely represents more attention to and a higher index of suspicion for TB. The increased use of federal travel restrictions was associated with the development of new procedures to limit travel for public health reasons. Continued efforts are needed to decrease the number of persons with TB who travel while potentially contagious. |
Risk factors for Toxoplasma gondii infection in the United States
Jones JL , Dargelas V , Roberts J , Press C , Remington JS , Montoya JG . Clin Infect Dis 2009 49 (6) 878-84 BACKGROUND: Toxoplasmosis can cause severe ocular and neurological disease. We sought to determine risk factors for Toxoplasma gondii infection in the United States. METHODS: We conducted a case-control study of adults recently infected with T. gondii. Case patients were selected from the Palo Alto Medical Foundation Toxoplasma Serology Laboratory from August 2002 through May 2007; control patients were randomly selected from among T. gondii-seronegative persons. Data were obtained from serological testing and patient questionnaires. RESULTS: We evaluated 148 case patients with recent T. gondii infection and 413 control patients. In multivariate analysis, an elevated risk of recent T. gondii infection was associated with the following factors: eating raw ground beef (adjusted odds ratio [aOR], 6.67; 95% confidence limits [CLs], 2.09, 21.24; attributable risk [AR], 7%); eating rare lamb (aOR, 8.39; 95% CLs, 3.68, 19.16; AR, 20%); eating locally produced cured, dried, or smoked meat (aOR, 1.97; 95% CLs, 1.18, 3.28; AR, 22%); working with meat (aOR, 3.15; 95% CLs, 1.09, 9.10; AR, 5%); drinking unpasteurized goat's milk (aOR, 5.09; 95% CLs, 1.45, 17.80; AR, 4%); and having 3 or more kittens (aOR, 27.89; 95% CLs, 5.72, 135.86; AR, 10%). Eating raw oysters, clams, or mussels (aOR, 2.22; 95% CLs, 1.07, 4.61; AR, 16%) was significant in a separate model among persons asked this question. Subgroup results are also provided for women and for pregnant women. CONCLUSIONS: In the United States, exposure to certain raw or undercooked foods and exposure to kittens are risk factors for T. gondii infection. Knowledge of these risk factors will help to target prevention efforts. |
What's new in the 2009 US guidelines for prevention and treatment of opportunistic infections among adults and adolescents with HIV?
Brooks JT , Kaplan JE , Masur H . Top HIV Med 2009 17 (3) 109-14 Despite dramatic declines in the incidence of opportunistic infections (OIs) in the United States, they remain an important cause of morbidity and mortality for HIV-infected persons. Previously separate guidelines on the prevention of OIs and on the treatment of OIs have been combined recently into an updated single document; the present article reviews salient changes to and new information contained in this guidance. Chapters on hepatitis B virus infection and tuberculosis have been expanded substantially, and each chapter now includes information on immune reconstitution inflammatory syndrome. In addition, there is detailed discussion on the role of antiretroviral therapy in OI prevention and issues concerning the initiation of antiretroviral therapy during treatment of an acute OI. In the future, these guidelines will likely be maintained as an internet-based document to facilitate wider dissemination and more rapid updates. |
Increased activity of coxsackievirus B1 strains associated with severe disease among young infants in the United States, 2007-2008
Wikswo ME , Khetsuriani N , Fowlkes AL , Zheng X , Penaranda S , Verma N , Shulman ST , Sircar K , Robinson CC , Schmidt T , Schnurr D , Oberste MS . Clin Infect Dis 2009 49 (5) e44-51 BACKGROUND: Enterovirus infections are very common and typically cause mild illness, although neonates are at higher risk for severe illness. In 2007, the Centers for Disease Control and Prevention (CDC) received multiple reports of severe neonatal illness and death associated with coxsackievirus B1 (CVB1), a less common enterovirus serotype not previously associated with death in surveillance reports to the CDC. METHODS: This report includes clinical, epidemiologic, and virologic data from cases of severe neonatal illness associated with CVB1 reported during the period from 2007 through 2008 to the National Enterovirus Surveillance System (NESS), a voluntary, passive surveillance system. Also included are data on additional cases reported to the CDC outside of the NESS. Virus isolates or original specimens obtained from patients from 25 states were referred to the CDC picornavirus laboratory for molecular typing or characterization. RESULTS: During 2007-2008, the NESS received 1079 reports of enterovirus infection. CVB1 accounted for 176 (23%) of 775 reported cases with known serotype, making it the most commonly reported serotype for the first time ever in the NESS. Six neonatal deaths due to CVB1 infection were also reported to the CDC during that time. Phylogenetic analysis of the 2007 and 2008 CVB1 strains indicated that the increase in cases resulted from widespread circulation of a single genetic lineage that had been present in the United States since at least 2001. CONCLUSIONS: Healthcare providers and public health departments should be vigilant to the possibility of continuing CVB1-associated neonatal illness, and testing and continued reporting of enterovirus infections should be encouraged. |
A multicenter study on optimizing piperacillin-tazobactam use: lessons on why interventions fail
Gaynes RP , Gould CV , Edwards J , Antoine TL , Blumberg HM , Desilva K , King M , Kraman A , Pack J , Ribner B , Seybold U , Steinberg J , Jernigan JA . Infect Control Hosp Epidemiol 2009 30 (8) 794-6 We examined interventions to optimize piperacillin-tazobactam use at 4 hospitals. Interventions for rotating house staff did not affect use. We could target empiric therapy in only 35% of cases. Because prescribing practices seemed to be institution specific, interventions should address attitudes of local prescribers. Interventions should target empiric therapy and ordering of appropriate cultures. |
Arbovirus surveillance of mosquitoes collected at sites of active Rift Valley fever virus transmission: Kenya, 2006-2007
Crabtree M , Sang R , Lutomiah J , Richardson J , Miller B . J Med Entomol 2009 46 (4) 961-4 Mosquitoes collected during an outbreak of Rift Valley fever in Kenya from December 2006 to February 2007 were tested to isolate other mosquito-borne arboviruses circulating in the region. Twenty-seven virus isolations were made, comprising seven viruses from three arbovirus families. |
Listeria marthii sp. nov., isolated from the natural environment, Finger Lakes National Forest
Graves LM , Helsel LO , Steigerwalt AG , Morey RE , Daneshvar MI , Roof SE , Orsi RH , Fortes ED , Millilo SR , den Bakker HC , Wiedmann M , Swaminathan B , Sauders BD . Int J Syst Evol Microbiol 2009 69 (6) 1280-1288 Four isolates (FSL S4-120T, FSL S4-696, FSL S4-710, and FSL S4-965) of Gram-positive, motile, facultatively anaerobic, non-spore-forming bacilli that were phenotypically similar to Listeria spp. were isolated from soil, standing water, and flowing water samples obtained from the natural environment in the Finger Lakes National Forest, New York, USA. The four isolates were closely related to one another and were determined to be the same species by whole-genome DNA-DNA hybridization studies (>82% relatedness at 55 degrees C and >76% relatedness at 70 degrees C with 0.0-0.5% divergence). 16S ribosomal RNA sequence analysis confirmed their close phylogenetic relatedness to L. monocytogenes and L. innocua and more distant relatedness to L. welshimeri, L. seeligeri, L. ivanovii, and L. grayi. Phylogenetic analysis of partial sequences for sigB, gap, and prs showed that these isolates form a well-supported sister group to L. monocytogenes. The four isolates were sufficiently different from L. monocytogenes and L. innocua by DNA-DNA hybridization to warrant their designation as a new Listeria species. The four isolates yielded positive reactions in the AccuProbe(R) test that is purported to be specific for L. monocytogenes, did not ferment L-rhamnose, were non-hemolytic on blood agar media, and did not contain a homologue of the L. monocytogenes virulence gene island. On the basis of their phenotypic characteristics and their genotypic distinctiveness from L. monocytogenes and L. innocua, the four isolates should be classified as a new species within the genus Listeria, for which the name Listeria marthii sp. nov. is proposed. The type strain of L. marthii is FSL S4-120T (=ATCC BAA 1595T =BEIR NR 9579T =CCUG 56148T). L. marthii has not been associated with human or animal disease at this time. |
Monitoring of human populations for early markers of cadmium toxicity: a review
Fowler BA . Toxicol Appl Pharmacol 2009 238 (3) 294-300 Exposure of human populations to cadmium (Cd) from air, food and water may produce effects in organs such as the kidneys, liver, lungs, cardiovascular, immune and reproductive systems. Since Cd has been identified as a human carcinogen, biomarkers for early detection of susceptibility to cancer are of importance to public health. The ability to document Cd exposure and uptake of this element through biological monitoring is a first step towards understanding its health effects. Interpretation and application of biological monitoring data for predicting human health outcomes require correlation with biological measures of organ system responses to the documented exposure. Essential to this understanding is the detection and linkage of early biological responses to toxic effects in target cell populations. Fortunately, advances in cell biology have resulted in the development of pre-clinical biological markers (biomarkers) that demonstrate measurable and characteristic molecular changes in organ systems following chemical exposures that occur prior to the onset of overt clinical disease or development of cancer. Technical advances have rendered a number of these biomarkers practical for monitoring Cd-exposed human populations. Biomarkers will be increasingly important in relation to monitoring effects from the exposure to new Cd-based high technology materials. For example, cadmium-selenium (CdSe) nano-materials made from combinations of these elements have greatly altered cellular uptake characteristics due to particle size. These differences may greatly alter effects at the target cell level and hence risks for organ toxicities from such exposures. The value of validated biomarkers for early detection of systemic Cd-induced effects in humans cannot be overestimated given the rapid expansion of nano-material technologies. 
This review briefly summarizes the applications, to date, of biomarker endpoints for assessing target organ system effects in humans and experimental systems from Cd exposure, and provides a prospective look at the possible future of biomarkers. The emphasis is on the detection of early toxic effects from exposure to Cd in new products such as nano-materials and identification of populations at special risk for Cd toxicity. |
Serum concentrations of selected persistent organic pollutants in a sample of pregnant females and changes in their concentrations during gestation
Wang RY , Jain RB , Wolkin AF , Rubin CH , Needham LL . Environ Health Perspect 2009 117 (8) 1244-9 OBJECTIVES: In this study we evaluated the concentrations of selected persistent organic pollutants in a sample of first-time pregnant females residing in the United States and assessed differences in these concentrations in all pregnant females during gestation. METHODS: We reviewed demographic and laboratory data for pregnant females participating in the National Health and Nutrition Examination Survey, including concentrations of 25 polychlorinated biphenyls (PCBs), 6 polychlorinated dibenzo-p-dioxins (PCDDs), 9 polychlorinated dibenzofurans (PCDFs), and 9 organochlorine pesticides. We report serum concentrations for first-time pregnant females (2001-2002; n = 49) and evaluate these concentrations in all pregnant females by trimester (1999-2002; n = 203) using a cross-sectional analysis. RESULTS: The chemicals with >/=60% detection included PCBs (congeners 126, 138/158, 153, 180), PCDDs/PCDFs [1,2,3,4,6,7,8-heptachlorodibenzo-p-dioxin (1234678HpCDD), 1,2,3,6,7,8-hexachlorodibenzo-p-dioxin (123678HxCDD), and 1,2,3,4,6,7,8-heptachlorodibenzofuran (1234678HpCDF)], and the organochlorine pesticides 1,1'-(2,2-dichloroethenylidene)-bis(4-chlorobenzene) (p,p'-DDE) and trans-nonachlor. The geometric mean concentration (95% confidence intervals) for 1234678HpCDD was 15.9 pg/g lipid (5.0-50.6 pg/g); for 123678HxCDD, 9.7 pg/g (5.5-17.1 pg/g); and for 1234678HpCDF, 5.4 pg/g (3.3-8.7 pg/g). The differences in concentrations of these chemicals by trimester were better accounted for with the use of lipid-adjusted units than with whole-weight units; however, the increase in the third-trimester concentration was greater for PCDDs/PCDFs (123678HxCDD, 1234678HpCDF) than for the highest concentration of indicator PCBs (138/158, 153, 180), even after adjusting for potential confounders. 
CONCLUSION: The concentrations of these persistent organic pollutants in a sample of first-time pregnant females living in the United States suggest a decline in exposures to these chemicals since their ban or restricted use and emission. The redistribution of body burden for these and other persistent organic pollutants during pregnancy needs to be more carefully defined to improve the assessment of fetal exposure to them based on maternal serum concentrations. Additional studies are needed to further the understanding of the potential health consequences to the fetus from persistent organic pollutants. |
Human zoonotic enteropathogens in a constructed free-surface flow wetland
Graczyk TK , Lucy FE , Mashinsky Y , Andrew Thompson RC , Koru O , Dasilva AJ . Parasitol Res 2009 105 (2) 423-8 Effluents from a small-scale free-surface flow constructed wetland, used for polishing of secondary treated wastewater, contained significantly higher concentrations of potentially viable Giardia duodenalis cysts and Enterocytozoon bieneusi spores than did wetland influents consisting of secondary treated wastewater. Zoonotic Assemblage A of G. duodenalis cysts was identified in wetland inflows, while Assemblage A and two nonhuman infective Assemblages (i.e., C and E) were present in wetland effluents. E. bieneusi spores represented genotype K based on DNA sequencing analysis of the internal transcribed spacer. The study demonstrated that (1) free-surface flow small-scale constructed wetlands may not provide sufficient remediation for human zoonotic protozoa and fungi present in secondary treated wastewater; (2) dogs and livestock can substantially contribute human-pathogenic protozoan and fungal microorganisms to engineered vegetated wetland systems; and (3) large volumes of wetland effluents can contribute to contamination of surface waters used for recreation and drinking water abstraction and therefore represent a serious public health threat. |
Interpreting longitudinal spirometry: weight gain and other factors affecting the recognition of excessive FEV(1) decline
Wang ML , Avashia BH , Petsonk EL . Am J Ind Med 2009 52 (10) 782-9 BACKGROUND: Excessive FEV(1) loss in an individual or a group can reflect hazardous exposures and development of lung disease. However, multiple factors may affect FEV(1) measurements. METHODS: Using medical screening data collected from 1,884 chemical plant workers between 1973 and 2003, we examined the influence of multiple factors on repeated measurements of FEV(1). RESULTS: The FEV(1) level was associated with age, height, race, sex, cigarette smoking, changes in body weight, and spirometer model. After controlling for these factors, longitudinal FEV(1) decline averaged 23.8 ml/year for white males; an additional loss of 8.3 ml was associated with one pack-year smoking and 5.4 ml with a one-pound weight gain. Depending on the spirometer model, FEV(1) differed by up to 95 ml. CONCLUSIONS: The study results provide quantitative estimates of the effect of specific factors on FEV(1), and should be useful to health professionals in the evaluation of accelerated lung function declines. |
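As a rough illustration of the coefficients reported in that abstract (a baseline longitudinal decline of 23.8 ml/year for white males, plus 8.3 ml per pack-year smoked and 5.4 ml per pound of weight gained), a back-of-the-envelope calculation might look like the sketch below. The function name is hypothetical, and this simple linear sum is only an illustration of the reported estimates, not the study's regression model:

```python
# Sketch: cumulative expected FEV1 loss (ml) over a follow-up interval,
# combining the baseline decline with the reported smoking and weight-gain
# increments. Coefficients are the point estimates quoted in the abstract.

def expected_fev1_loss_ml(years, pack_years, weight_gain_lb):
    """Expected cumulative FEV1 loss in ml (white males, per reported estimates)."""
    baseline = 23.8 * years          # 23.8 ml/year longitudinal decline
    smoking = 8.3 * pack_years       # 8.3 ml per pack-year smoked
    weight = 5.4 * weight_gain_lb    # 5.4 ml per pound of weight gained
    return baseline + smoking + weight

# 10 years of follow-up, 5 pack-years smoked, 10 lb gained:
print(expected_fev1_loss_ml(10, 5, 10))  # 333.5 ml
```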
Outbreak of life-threatening coxsackievirus b1 myocarditis in neonates
Verma NA , Zheng XT , Harris MU , Cadichon SB , Melin-Aldana H , Khetsuriani N , Oberste MS , Shulman ST . Clin Infect Dis 2009 49 (5) 759-63 In the summer and fall of 2007, we observed a unique cluster of cases of severe coxsackievirus B1 (CVB1) infection among Chicago area neonates. Eight neonates had closely related strains of CVB1 that were typed at the Centers for Disease Control and Prevention; 2 other neonates had CVB infections, 1 of which was further identified as serotype CVB1. All had severe myocarditis; 1 neonate underwent heart transplantation, and 1 died of severe left ventricular dysfunction. |
Chikungunya fever: an epidemiological review of a re-emerging infectious disease
Staples JE , Breiman RF , Powers AM . Clin Infect Dis 2009 49 (6) 942-8 Chikungunya fever is an acute febrile illness associated with severe, often debilitating polyarthralgias. The disease is caused by Chikungunya virus (CHIKV), an arthropod-borne virus that is transmitted to humans primarily via the bite of an infected mosquito. Since a re-emergence of CHIKV in 2004, the virus has spread into novel locations, such as Europe, and has led to millions of cases of disease throughout countries in and around the Indian Ocean. The risk of importation of CHIKV into new areas is ever present because of the high attack rates associated with the recurring epidemics, the high levels of viremia in infected humans, and the worldwide distribution of the vectors responsible for transmitting CHIKV. In this review, we will characterize the epidemiology and global expansion of CHIKV, describe the clinical features and laboratory testing for the disease, and discuss priorities for further studies needed for effective disease control and prevention. |
The Scientific Foundation for personal genomics: recommendations from a National Institutes of Health-Centers for Disease Control and Prevention multidisciplinary workshop
Khoury MJ , McBride C , Schully SD , Ioannidis JP , Feero WG , Janssens AC , Gwinn M , Simons-Morton DG , Bernhardt JM , Cargill M , Chanock SJ , Church GM , Coates RJ , Collins FS , Croyle RT , Davis BR , Downing GJ , Duross A , Friedman S , Gail MH , Ginsburg GS , Green RC , Greene MH , Greenland P , Gulcher JR , Hsu A , Hudson KL , Kardia SL , Kimmel PL , Lauer MS , Miller AM , Offit K , Ransohoff DF , Roberts HS , Rasooly RS , Stefansson K , Terry SF , Teutsch SM , Trepanier A , Wanke KL , Witte JS , Xu J . Genet Med 2009 11 (8) 559-67 The increasing availability of personal genomic tests has led to discussions about the validity and utility of such tests and the balance of benefits and harms. A multidisciplinary workshop was convened by the National Institutes of Health and the Centers for Disease Control and Prevention to review the scientific foundation for using personal genomics in risk assessment and disease prevention and to develop recommendations for targeted research. The clinical validity and utility of personal genomics is a moving target with rapidly developing discoveries but little translation research to close the gap between discoveries and health impact. Workshop participants made recommendations in five domains: (1) developing and applying scientific standards for assessing personal genomic tests; (2) developing and applying a multidisciplinary research agenda, including observational studies and clinical trials to fill knowledge gaps in clinical validity and utility; (3) enhancing credible knowledge synthesis and information dissemination to clinicians and consumers; (4) linking scientific findings to evidence-based recommendations for use of personal genomics; and (5) assessing how the concept of personal utility can affect health benefits, costs, and risks by developing appropriate metrics for evaluation. To fulfill the promise of personal genomics, a rigorous multidisciplinary research agenda is needed. |
Administrative coding data and health care-associated infections
Jhung MA , Banerjee SN . Clin Infect Dis 2009 49 (6) 949-55 Surveillance for health care-associated infections (HAIs) using administrative data has received attention from health care epidemiologists searching for efficient means to track infections in their institutions. Several states are also considering electronic surveillance that incorporates administrative data as a means to satisfy an increasing demand for mandatory public reporting of HAIs. International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) discharge diagnosis codes have attributes that make them suitable for detecting HAIs; for example, they may facilitate automated surveillance, freeing up infection control personnel to perform other important tasks, such as staff education and outbreak investigation. However, controversy surrounds the appropriate use of ICD-9-CM data in detecting HAIs, and administrative coding data have been criticized for lacking elements necessary for surveillance. Administrative coding data are inappropriate as the sole means of HAI surveillance but may have value to the health care epidemiologist as a way to augment traditional methods. |
Behavior and beliefs about influenza vaccine among adults aged 50-64 years
Santibanez TA , Mootrey GT , Euler GL , Janssen AP . Am J Health Behav 2010 34 (1) 77-89 OBJECTIVE: To examine demographics and beliefs about influenza disease and vaccine that may be associated with influenza vaccination among 50- to 64-year-olds. METHODS: A national sample of adults aged 50-64 years surveyed by telephone. RESULTS: Variables associated with receiving influenza vaccination included age, education level, recent doctor visit, and beliefs about vaccine effectiveness and vaccine safety. Beliefs about influenza vaccination varied by race/ethnicity, age, education, and gender. CONCLUSION: The finding of demographic differences in beliefs suggests that segmented communication messages designed for specific demographic subgroups may help to increase influenza vaccination coverage. |
An outbreak of varicella in elementary school children with two-dose varicella vaccine recipients--Arkansas, 2006
Gould PL , Leung J , Scott C , Schmid DS , Deng H , Lopez A , Chaves SS , Reynolds M , Gladden L , Harpaz R , Snow S . Pediatr Infect Dis J 2009 28 (8) 678-81 BACKGROUND: In June 2006, the Advisory Committee on Immunization Practices (ACIP) expanded its June 2005 recommendation for a second dose of varicella vaccine during outbreaks to a recommendation for routine second-dose varicella vaccination at school entry. In October 2006, the Arkansas Department of Health was notified of a varicella outbreak among students, some of whom had received a second dose during an outbreak-related vaccination campaign in February 2006. METHODS: The outbreak was investigated using a school-wide parental survey with a follow-up survey of identified case patients. Vaccination status was verified using state and local immunization records. Limited laboratory testing confirmed circulation of wild-type varicella, including varicella in 2-dose vaccine recipients. RESULTS: Vaccination information was available for 871 (99%) of the 880 children. Varicella vaccination coverage was 97% (2-dose, 39%; 1-dose, 58%). A review of the February vaccination clinic found no deficiencies; lot numbers did not differ between cases and noncases. Varicella was confirmed by PCR in 5 (42%) of 12 lesion specimens and by IgM in 1 (6%) of 16 serum specimens. Varicella was reported in 84 children, including 25 (30%) two-dose and 53 (63%) one-dose recipients. Attack rates among 2-dose recipients (10.4%) and 1-dose recipients (14.6%) were not significantly different (RR: 0.72, 95% CI: 0.44-1.15). All 2-dose recipients and 80% of 1-dose recipients reported having 50 or fewer skin lesions. CONCLUSION: This outbreak is the first to document varicella in both 1- and 2-dose vaccine recipients; both groups had mild disease. The vaccine effectiveness of 1 and 2 doses was similar. |
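The attack-rate comparison above (RR 0.72; 95% CI 0.44-1.15) can be reproduced in outline. This sketch assumes hypothetical round denominators consistent with the reported rates (not the study's exact counts) and uses the standard log-scale confidence interval for a risk ratio:

```python
import math

def relative_risk(cases_exp, n_exp, cases_unexp, n_unexp):
    """Risk ratio with an approximate 95% CI computed on the log scale."""
    r1 = cases_exp / n_exp          # attack rate, exposed (2-dose) group
    r0 = cases_unexp / n_unexp      # attack rate, comparison (1-dose) group
    rr = r1 / r0
    se = math.sqrt((1 - r1) / cases_exp + (1 - r0) / cases_unexp)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical denominators giving ~10.4% and ~14.6% attack rates
rr, lo, hi = relative_risk(25, 240, 53, 363)  # rr ~ 0.71
```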
Persistence of rubella antibodies after 2 doses of measles-mumps-rubella vaccine
Lebaron CW , Forghani B , Matter L , Reef SE , Beck C , Bi D , Cossen C , Sullivan BJ . J Infect Dis 2009 200 (6) 888-99 BACKGROUND: Since 1990, most schoolchildren in the United States have received a second dose of measles-mumps-rubella vaccine (MMR2) at kindergarten entry. Elimination of endemic rubella virus circulation in the United States was declared in 2004. The objective of the current study was to evaluate the short- and long-term rubella immunogenicity of MMR2. METHODS: At enrollment in 1994-1995, children ([Formula: see text]) in a rural Wisconsin health maintenance organization received MMR2 at age 4-6 years. A comparison group of older children ([Formula: see text]) was vaccinated at age 9-11 years. Serum specimens were collected during a 12-year period. Rubella antibody levels were evaluated by plaque-reduction neutralization (lowest detectable titer, 1:10). RESULTS: Before administration of MMR2 in the kindergarten group, 9% of subjects were seronegative, 60% had the lowest detectable titer, and the geometric mean titer (GMT) was 1:13. One month after administration of MMR2, 1% were seronegative, 6% had the lowest detectable titer, and the GMT was 1:42. Four-fold boosts occurred in 62% of subjects, but only 0.3% were immunoglobulin M positive. Twelve years after MMR2 administration, 10% were seronegative, 43% had the lowest detectable titer, and the GMT was 1:17. The middle-school group showed similar patterns. CONCLUSIONS: Rubella antibody response to MMR2 was vigorous, but titers decreased to pre-MMR2 levels after 12 years. Because rubella is a highly epidemic disease, vigilance will be required to assure continued elimination. |
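Geometric mean titers like those reported above are means taken on the log scale, and a "4-fold boost" is a ratio of post- to pre-vaccination titers. A minimal sketch (titer values are hypothetical):

```python
import math

def geometric_mean_titer(titers):
    """Geometric mean of reciprocal titers (a GMT of 1:42 is written 42)."""
    return math.exp(sum(math.log(t) for t in titers) / len(titers))

def fourfold_boost(pre, post):
    """True if the post-vaccination titer rose at least 4-fold."""
    return post / pre >= 4

# Hypothetical panel of reciprocal neutralization titers
gmt = geometric_mean_titer([10, 20, 40, 80])  # -> ~28.3
```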
Cytoplasmic nucleic acid sensors in antiviral immunity
Ranjan P , Bowzard JB , Schwerzmann JW , Jeisy-Scott V , Fujita T , Sambhara S . Trends Mol Med 2009 15 (8) 359-68 The innate immune system uses pattern recognition receptors (PRRs) to sense invading microbes and initiate a rapid protective response. PRRs bind and are activated by structural motifs, such as nucleic acids or bacterial and fungal cell wall components, collectively known as pathogen-associated molecular patterns. PRRs that recognize pathogen-derived nucleic acids are present in vesicular compartments and in the cytosol of most cell types. Here, we review recent studies of these cytosolic sensors, focusing on the nature of the ligands for DNA-dependent activator of interferon-regulatory factors (DAI), absent in melanoma 2 (AIM2), and the retinoic acid-inducible gene I-like helicase (RLH) family of receptors; the basis of ligand recognition; and the signaling pathways triggered by the activation of these receptors. An increased understanding of these molecular aspects of innate immunity will guide the development of novel antiviral therapeutics. |
Decline and change in seasonality of US rotavirus activity after the introduction of rotavirus vaccine
Tate JE , Panozzo CA , Payne DC , Patel MM , Cortese MM , Fowlkes AL , Parashar UD . Pediatrics 2009 124 (2) 465-71 BACKGROUND: In 2006, routine immunization of US infants against rotavirus was initiated. We assessed national, regional, and local trends in rotavirus testing and detection before and after vaccine introduction. METHODS: We examined data for July 2000 through June 2008 from a national network of approximately 70 US laboratories to compare geographical and temporal aspects of rotavirus season timing and peak activity. To assess trends in rotavirus testing and detection, we restricted the analyses to 33 laboratories that reported for ≥26 weeks per season from 2000 to 2008. RESULTS: Nationally, the onset and peak of the 2007-2008 rotavirus season were delayed 15 and 8 weeks, respectively, compared with prevaccine seasons from 2000-2006. Delays were observed in each region. The 2007-2008 rotavirus season lasted 14 weeks compared with a median of 26 weeks during the prevaccine era. Of 33 laboratories, 32 reported fewer positive results and a lower proportion of positive test results in 2007-2008 compared with the median in 2000-2006, with a 67% decline in the number and a 69% decline in the proportion of rotavirus-positive test results. The proportion of positive test results in 2007-2008 compared with the median in 2000-2006 declined >50% in 79% of the laboratories and >75% in 39% of the laboratories. CONCLUSIONS: The 2007-2008 US rotavirus season seems substantially delayed, shorter, and diminished in magnitude compared with seasons before vaccine implementation. The extent of change seems greater than expected on the basis of estimated vaccine coverage, suggesting indirect benefits to unvaccinated individuals. Monitoring in future seasons is needed to confirm these trends. |
Immunization programs for infants, children, adolescents, and adults: clinical practice guidelines by the Infectious Diseases Society of America
Pickering LK , Baker CJ , Freed GL , Gall SA , Grogg SE , Poland GA , Rodewald LE , Schaffner W , Stinchfield P , Tan L , Zimmerman RK , Orenstein WA . Clin Infect Dis 2009 49 (6) 817-40 Evidence-based guidelines for immunization of infants, children, adolescents, and adults have been prepared by an Expert Panel of the Infectious Diseases Society of America (IDSA). These updated guidelines replace the previous immunization guidelines published in 2002. These guidelines are prepared for health care professionals who care for either immunocompetent or immunocompromised people of all ages. Since 2002, the capacity to prevent more infectious diseases has increased markedly for several reasons: new vaccines have been licensed (human papillomavirus vaccine; live, attenuated influenza vaccine; meningococcal conjugate vaccine; rotavirus vaccine; tetanus toxoid, reduced diphtheria toxoid, and acellular pertussis [Tdap] vaccine; and zoster vaccine), new combination vaccines have become available (measles, mumps, rubella and varicella vaccine; tetanus, diphtheria, and pertussis and inactivated polio vaccine; and tetanus, diphtheria, and pertussis and inactivated polio/Haemophilus influenzae type b vaccine), hepatitis A vaccines are now recommended universally for young children, influenza vaccines are recommended annually for all children aged 6 months through 18 years and for adults aged 50 years and older, and a second dose of varicella vaccine has been added to the routine childhood and adolescent immunization schedule. Many of these changes have resulted in expansion of the adolescent and adult immunization schedules. In addition, increased emphasis has been placed on removing barriers to immunization, eliminating racial/ethnic disparities, addressing vaccine safety issues, financing recommended vaccines, and immunizing specific groups, including health care providers, immunocompromised people, pregnant women, international travelers, and internationally adopted children. 
This document includes 46 standards that, if followed, should lead to optimal disease prevention through vaccination in multiple population groups while maintaining high levels of safety. |
Influenza vaccination among adults with asthma findings from the 2007 BRFSS survey
Lu PJ , Euler GL , Callahan DB . Am J Prev Med 2009 37 (2) 109-15 BACKGROUND: Asthma prevalence among U.S. adults is estimated to be 6.7%. People with asthma are at increased risk of complications from influenza. Influenza vaccination of adults and children with asthma is recommended by the Advisory Committee on Immunization Practices. The Healthy People 2010 Objectives call for annual influenza vaccination of at least 60% of adults aged 18-64 years with asthma and other conditions associated with an increased risk of complications from influenza. PURPOSE: To assess influenza vaccination coverage among adults with asthma in the United States. METHODS: Data from the 2007 Behavioral Risk Factor Surveillance System restricted to individuals interviewed during February through August were analyzed in 2008 to estimate national and state prevalence of self-reported receipt of influenza vaccination among respondents aged 18-64 years with asthma. Logistic regression provided predictive marginal vaccination coverage for each covariate, adjusted for demographic and access to care characteristics. RESULTS: Among adults aged 18-64 years with asthma, influenza vaccination coverage was 39.9% (95% CI=38.3%, 41.5%) during the 2006-2007 season (coverage ranged from 26.9% [95% CI=19.8%, 35.3%] in California to 53.3% [95% CI=42.8%, 63.6%] in Tennessee). Influenza vaccination coverage was 33.9% (95% CI=31.9%, 35.9%) for adults aged 18-49 years with asthma compared with 54.7% (95% CI=52.4%, 57.0%) for adults aged 50-64 years with asthma. Among people aged 18-64 years, vaccination coverage was 28.8% among those without asthma. Among people with asthma, characteristics associated with an increased likelihood of vaccination included age 50-64 years, female sex, non-Hispanic white race/ethnicity, diabetes, activity limitations, health insurance, a regular healthcare provider, a routine checkup in the previous year, and former or never smoking. 
CONCLUSIONS: Influenza vaccination coverage continues to be below the national objective of 60% for people aged 18-64 years with asthma as a high-risk condition. Increased state and national efforts are needed to improve influenza vaccination levels among this population and particularly among those aged 18-49 years. |
Victimization by peers and adolescent suicide in three US samples
Kaminski JW , Fang X . J Pediatr 2009 155 (5) 683-8 OBJECTIVE: To investigate the association between victimization by peers and suicidal ideation and behavior in 3 samples of adolescents in the United States. STUDY DESIGN: This study was a secondary analysis of data from 3 cohorts of adolescents: (1) a nationally representative survey of adolescents in grade 7 through 12, Wave I of the National Longitudinal Study of Adolescent Health, conducted by the Carolina Population Center in 1994-1995; (2) a nationally representative survey, the Youth Risk Behavior Surveillance System, conducted by the Centers for Disease Control and Prevention in 2005; and (3) a survey in a high-risk community conducted by the Centers for Disease Control and Prevention in 2004. RESULTS: Controlling for differences in age, sex, race/ethnicity, and depressive symptomology, adolescents reporting more frequent victimization by peers were more likely to report suicidal ideation and suicidal behavior. Adjusted odds ratios ranged from 1.67 (95% confidence interval [CI] = 1.30-2.15) to 3.83 (95% CI = 2.78-5.27) for the different outcome measures and data sets. CONCLUSIONS: Our results provide further support for the need for effective prevention of peer victimization. Inclusion of questions about victimization experiences might aid formal and informal suicide screening efforts. |
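The adjusted odds ratios above come from logistic regression models. For orientation, the unadjusted version of the same quantity can be computed directly from a 2x2 table; this sketch uses hypothetical counts and the standard Woolf (log-scale) confidence interval, not the study's adjusted estimates:

```python
import math

def odds_ratio(a, b, c, d):
    """Unadjusted odds ratio with approximate 95% CI (Woolf method).
    a = exposed cases, b = exposed noncases,
    c = unexposed cases, d = unexposed noncases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: victimized vs. non-victimized adolescents
# reporting suicidal ideation
or_, lo, hi = odds_ratio(10, 20, 5, 20)  # or_ -> 2.0
```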
Developing teen dating violence prevention strategies: formative research with middle school youth
Noonan RK , Charles D . Violence Against Women 2009 15 (9) 1087-105 Intimate partner violence (IPV) peaks in youth and young adulthood and is associated with multiple adolescent risk behaviors and negative health outcomes. Targeting youth with prevention messages before they start dating may avert teen dating violence and subsequent adult IPV. This article discusses findings from focus groups with middle school youth to determine behaviors and beliefs regarding dating violence. To develop effective prevention messages, participants were asked questions about characteristics of middle school dating relationships, healthy relationships, relationship norms, unhealthy relationships, emotional abuse, physical abuse, sexual abuse, intervening in violent situations, and trusted sources for information about dating violence. The recommendations for prevention efforts include an emphasis on skill building, tailoring efforts for particular subgroups, and identifying innovative ways of reaching youth. |
Molecular identification of Aspergillus species: Transplant Associated Infection Surveillance Network (TRANSNET)
Balajee SA , Kano R , Baddley JW , Moser SA , Marr KA , Alexander BD , Andes D , Kontoyiannis DP , Perrone G , Peterson S , Brandt ME , Pappas PG , Chiller T . J Clin Microbiol 2009 47 (10) 3138-41 A large aggregate collection of clinical isolates of aspergilli (n=218) from transplant patients with proven or probable invasive aspergillosis was available from the Transplant Associated Infection Surveillance Network (TRANSNET), a six-year prospective surveillance study. To determine the Aspergillus species distribution in this collection, isolates were subjected to comparative sequence analyses of the internal transcribed spacer (ITS) and beta-tubulin regions. Aspergillus fumigatus was the predominant species recovered, followed by A. flavus and A. niger. Several newly described species were identified, including A. lentulus and A. calidoustus; both species had high in vitro MICs to multiple antifungal drugs. Aspergillus tubingensis, a member of the A. niger species complex, is described from clinical specimens; all A. tubingensis isolates had low in vitro MICs to antifungal drugs. |
Stability of the conjugated species of environmental phenols and parabens in human serum
Ye X , Wong LY , Jia LT , Needham LL , Calafat AM . Environ Int 2009 35 (8) 1160-3 In humans, the metabolism of environmental phenols may include the formation of conjugated species (e.g., glucuronides and sulfates), but the free species-not the conjugated forms-are considered biologically active. Therefore, information on the concentration of these free species in blood or urine could be helpful for risk assessment. Because conjugates could hydrolyze to their corresponding free forms during collection, handling, and storage of biological specimens, information on the temporal stability of the conjugates is of interest. Previously, we reported the temporal stability of urinary conjugates of several environmental phenols, but data on the stability of phenols' conjugated species in serum, although critical when concentrations of free and conjugated species are compared, are largely lacking. In the present study, we investigate the stability of the conjugates of four phenols-bisphenol A, benzophenone-3, triclosan, and 2,5-dichlorophenol-and two parabens-methyl paraben and propyl paraben-in 16 human serum samples for 30 days at above-freezing temperature storage conditions (4 degrees C, room temperature, and 37 degrees C). These conditions reflect the worst-case scenarios that could occur during the short-term storage of biological samples before their long-term storage at controlled subfreezing temperatures. We found that the percentage of the conjugated species of the four detected compounds (2,5-dichlorophenol, triclosan, and methyl and propyl parabens) in these serum specimens did not vary significantly, even when the specimens were stored at 37 degrees C for at least 30 days. These preliminary data suggest that the phenols' serum conjugates appear to be more stable than their corresponding urinary conjugates, some of which started to hydrolyze within 24 h under similar storage conditions. 
The reported stability of these conjugated species in human serum also suggests that the free species are unlikely to have resulted from the hydrolysis of their corresponding conjugates. This information could be important for interpreting the low concentrations of free phenol species detected in serum samples of nonoccupationally exposed populations. To our knowledge, this is the first study to report on the stability of conjugated species in serum, and as such requires replication. |
Consensus amplification and novel multiplex sequencing method for S segment species identification of 47 viruses of the Orthobunyavirus, Phlebovirus, and Nairovirus genera of the family Bunyaviridae
Lambert AJ , Lanciotti RS . J Clin Microbiol 2009 47 (8) 2398-404 A reverse transcription-PCR (RT-PCR) assay was designed, according to previously determined and newly derived genetic data, to target S genomic segments of 47 viruses, including 29 arthropod-borne human pathogens, of the family Bunyaviridae. The analytical sensitivity of the presented assay was evaluated through its application to RNAs extracted from quantitated dilutions of bunyaviruses of interest. Additionally, the assay's analytical specificity was determined through the evaluation of RNAs extracted from selected bunyaviruses and other representative arthropod-borne viruses isolated from a diverse group of host species and temporal and geographic origins. After RT-PCR amplification, DNAs amplified from bunyaviruses of interest were subjected to a novel multiplex sequencing method to confirm bunyavirus positivity and provide preliminary, species-level S segment identification. It is our goal that this assay will be used as a tool for identification and characterization of emergent arthropod-borne bunyavirus isolates of medical import as well as related viruses of the family Bunyaviridae that have not been associated with human illness. |
Genotypic comparison of invasive Neisseria meningitidis serogroup Y isolates from the United States, South Africa and Israel, 1999-2002
Whitney AM , Coulson GB , von Gottberg A , Block C , Keller N , Mayer LW , Messonnier NE , Klugman KP . J Clin Microbiol 2009 47 (9) 2787-93 The proportion of meningococcal disease in the US, South Africa, and Israel caused by Neisseria meningitidis serogroup Y (NmY) was greater than the worldwide average during the period 1999-2002. Genotypic characterization of 300 NmY isolates by MLST, 16S rRNA gene sequencing, and PorA variable region typing was conducted to determine the relationships of the isolates from these 3 countries. Seventy different genotypes were found. Two groups of ST-23 clonal complex isolates accounted for 88% of the US isolates, 12% of the South African isolates, and 96% of the isolates from Israel. The single common clone (ST-23/16S-19/P1.5-2,10-1) represented 57%, 5%, and 35% of the NmY isolates from the US, South Africa, and Israel, respectively. The predominant clone in South Africa (ST-175/16S-21/P1.5-1,2-2) and 11 other closely related clones made up 77% of the South African study isolates and were not found among the isolates from the US or Israel. ST-175 was the predicted founder of the ST-175 clonal complex, and isolates of ST-175 and related STs have been described previously in other African countries. Continued active surveillance and genetic characterization of NmY isolates causing disease in the US, South Africa, and Israel will provide valuable data for local and global epidemiology, allow monitoring for any expansion of existing clonal complexes and detection of the emergence of new virulent clones in the population. |
Kinetics of lethal factor and poly-D-glutamic acid antigenemia during inhalation anthrax in rhesus macaques
Boyer AE , Quinn CP , Hoffmaster AR , Kozel TR , Saile E , Marston CK , Percival A , Plikaytis BD , Woolfitt AR , Gallegos M , Sabourin P , McWilliams LG , Pirkle JL , Barr JR . Infect Immun 2009 77 (8) 3432-41 Systemic anthrax manifests as toxemia, rapidly disseminating septicemia, immune collapse, and death. Virulence factors include the anti-phagocytic gamma-linked poly-d-glutamic acid (PGA) capsule and two binary toxins, complexes of protective antigen (PA) with lethal factor (LF) and edema factor. We report the characterization of LF, PA, and PGA levels during the course of inhalation anthrax in five rhesus macaques. We describe bacteremia, blood differentials, and detection of the PA gene (pagA) by PCR analysis of the blood as confirmation of infection. For four of five animals tested, LF exhibited a triphasic kinetic profile. LF levels (mean +/- standard error [SE] between animals) were low at 24 h postchallenge (0.03 +/- 1.82 ng/ml), increased at 48 h to 39.53 +/- 0.12 ng/ml (phase 1), declined at 72 h to 13.31 +/- 0.24 ng/ml (phase 2), and increased at 96 h (82.78 +/- 2.01 ng/ml) and 120 h (185.12 +/- 5.68 ng/ml; phase 3). The fifth animal had an extended phase 2. PGA levels were triphasic; they were nondetectable at 24 h, increased at 48 h (2,037 +/- 2 ng/ml), declined at 72 h (14 +/- 0.2 ng/ml), and then increased at 96 h (3,401 +/- 8 ng/ml) and 120 h (6,004 +/- 187 ng/ml). Bacteremia was also triphasic: positive at 48 h, negative at 72 h, and positive at euthanasia. Blood neutrophils increased from preexposure (34.4% +/- 0.13%) to 48 h (75.6% +/- 0.08%) and declined at 72 h (62.4% +/- 0.05%). The 72-h declines may establish a "go/no go" turning point in infection, after which systemic bacteremia ensues and the host's condition deteriorates. This study emphasizes the value of LF detection as a tool for early diagnosis of inhalation anthrax before the onset of fulminant systemic infection. |
Serum vitamin C and the prevalence of vitamin C deficiency in the United States: 2003-2004 National Health and Nutrition Examination Survey (NHANES)
Schleicher RL , Carroll MD , Ford ES , Lacher DA . Am J Clin Nutr 2009 90 (5) 1252-63 BACKGROUND: Vitamin C (ascorbic acid) may be the most important water-soluble antioxidant in human plasma. In the third National Health and Nutrition Examination Survey (NHANES III, 1988-1994), approximately 13% of the US population was vitamin C deficient (serum concentrations <11.4 μmol/L). OBJECTIVE: The aim was to determine the most current distribution of serum vitamin C concentrations in the United States and the prevalence of deficiency in selected subgroups. DESIGN: Serum concentrations of total vitamin C were measured in 7277 noninstitutionalized civilians aged ≥6 y during the cross-sectional, nationally representative NHANES 2003-2004. The prevalence of deficiency was compared with results from NHANES III. RESULTS: The overall age-adjusted mean from the square-root transformed (SM) concentration was 51.4 μmol/L (95% CI: 48.4, 54.6). The highest concentrations were found in children and older persons. Within each race-ethnic group, women had higher concentrations than did men (P < 0.05). Mean concentrations of adult smokers were one-third lower than those of nonsmokers (SM: 35.2 compared with 50.7 μmol/L and 38.6 compared with 58.0 μmol/L in men and women, respectively). The overall prevalence (+/-SE) of age-adjusted vitamin C deficiency was 7.1 +/- 0.9%. Mean vitamin C concentrations increased (P < 0.05) and the prevalence of vitamin C deficiency decreased (P < 0.01) with increasing socioeconomic status. Recent vitamin C supplement use or adequate dietary intake decreased the risk of vitamin C deficiency (P < 0.05). CONCLUSIONS: In NHANES 2003-2004, vitamin C status improved, and the prevalence of vitamin C deficiency was significantly lower than that during NHANES III, but smokers and low-income persons were among those at increased risk of deficiency. |
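The "SM" summary above is a mean computed on the square-root scale and back-transformed. One plausible reading of that calculation, sketched here as an illustrative assumption (it ignores the survey's sampling weights and age adjustment):

```python
import math

def sqrt_transformed_mean(values):
    """Mean taken on the square-root scale, then back-transformed
    by squaring. Illustrative unweighted version only; the survey
    estimate also applies sampling weights and age adjustment."""
    m = sum(math.sqrt(v) for v in values) / len(values)
    return m * m

# Hypothetical serum vitamin C concentrations (umol/L)
sm = sqrt_transformed_mean([25, 49])  # -> 36.0
```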
Methods for deriving a representative biodynamic response of the hand-arm system to vibration
Dong RG , Welcome DE , McDowell TW , Wu JZ . J Sound Vib 2009 325 1047-61 Vibration-induced biodynamic responses (BR) of the human hand-arm system measured with subjects participating in an experiment are usually arithmetically averaged and used to represent their mean response. The mean BR data reported from different studies are further arithmetically averaged to form the reference mean response for standardization and other applications. The objectives of this study are to clarify whether such a response-based averaging process could significantly misrepresent the characteristics of the original responses, and to identify an appropriate derivation method. The arithmetically averaged response was directly compared with the response derived from a property-based method proposed in this study. Two sets of reported mechanical impedance data measured at the fingers and the palms of the hands were used to derive the models required for the comparison. This study found that the response-based arithmetic averaging could generate some systematic errors. The range of the subjects' natural frequencies in each resonance mode, the mode damping ratio, and the number of subjects participating in the experiment are among the major factors influencing the level of the errors. An effective and practical approach for reducing the potential for error is to increase the number of subjects in the BR measurement. On the other hand, the property-based derivation method can be generally used to obtain the representative response, but it is less efficient than the response-based derivation method. |
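The systematic error from response-based averaging can be illustrated with a toy model: averaging the frequency responses of subjects whose resonance frequencies differ flattens and lowers the peak relative to a single subject with the mean natural frequency. Everything below (natural frequencies, damping ratio, frequency grid) is a hypothetical sketch, not the study's data or model:

```python
import math

def sdof_magnitude(f, fn, zeta):
    """Magnitude response of a single-degree-of-freedom system at
    frequency f, with natural frequency fn and damping ratio zeta."""
    r = f / fn
    return 1.0 / math.sqrt((1.0 - r * r) ** 2 + (2.0 * zeta * r) ** 2)

# Hypothetical group of subjects with natural frequencies spanning
# 30-50 Hz (zeta = 0.1 for all), evaluated on a 10-100 Hz grid.
freqs = range(10, 101)
subject_fns = [30, 35, 40, 45, 50]

# Response-based arithmetic averaging across subjects at each frequency
averaged = [sum(sdof_magnitude(f, fn, 0.1) for fn in subject_fns) / len(subject_fns)
            for f in freqs]
# Response of a single "mean" subject (fn = 40 Hz, the mean natural frequency)
mean_subject = [sdof_magnitude(f, 40, 0.1) for f in freqs]

# Averaging flattens and lowers the resonance peak relative to the
# mean-subject response, illustrating the systematic error discussed above.
peak_error = max(mean_subject) - max(averaged)
```

The wider the spread of natural frequencies and the lighter the damping, the larger this peak underestimation becomes, consistent with the factors the study identifies.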
A reconsideration of acute beryllium disease
Cummings KJ , Stefaniak AB , Virji MA , Kreiss K . Environ Health Perspect 2009 117 (8) 1250-6 CONTEXT: Although chronic beryllium disease (CBD) is clearly an immune-mediated granulomatous reaction to beryllium, acute beryllium disease (ABD) is commonly considered an irritative chemical phenomenon related to high exposures. Given reported new cases of ABD and projected increased demand for beryllium, we aimed to reevaluate the pathophysiologic associations between ABD and CBD using two cases identified from a survey of beryllium production facility workers. CASE PRESENTATION: Within weeks after exposure to beryllium fluoride began, two workers had systemic illness characterized by dermal and respiratory symptoms and precipitous declines in pulmonary function. Symptoms and pulmonary function abnormalities improved with cessation of exposure and, in one worker, recurred with repeat exposure. Bronchoalveolar lavage fluid analyses and blood beryllium lymphocyte proliferation tests revealed lymphocytic alveolitis and cellular immune recognition of beryllium. None of the measured air samples exceeded 100 microg/m(3), and most were < 10 microg/m(3), lower than usually described. In both cases, lung biopsy about 18 months after acute illness revealed noncaseating granulomas. Years after first exposure, the workers left employment because of CBD. DISCUSSION: Contrary to common understanding, these cases suggest that ABD and CBD represent a continuum of disease, and both involve hypersensitivity reactions to beryllium. Differences in disease presentation and progression are likely influenced by the solubility of the beryllium compound involved. RELEVANCE TO PRACTICE: ABD may occur after exposures lower than the high concentrations commonly described. Prudence dictates limitation of further beryllium exposure in both ABD and CBD. |
Contributions of dust exposure and cigarette smoking to emphysema severity in coal miners in the United States
Kuempel ED , Wheeler MW , Smith RJ , Vallyathan V , Green FH . Am J Respir Crit Care Med 2009 180 (3) 257-64 RATIONALE: Previous studies have shown associations between dust exposure or lung burden and emphysema in coal miners, although the separate contributions of various predictors have not been clearly demonstrated. OBJECTIVES: To quantitatively evaluate the relationship between cumulative exposure to respirable coal mine dust, cigarette smoking, and other factors on emphysema severity. METHODS: The study group included 722 autopsied coal miners and nonminers in the United States. Data on work history, smoking, race, and age at death were obtained from medical records and a questionnaire completed by next-of-kin. Emphysema was classified and graded using a standardized schema. Job-specific mean concentrations of respirable coal mine dust were matched with work histories to estimate cumulative exposure. Relationships between various metrics of dust exposure (including cumulative exposure and lung dust burden) and emphysema severity were investigated in weighted least squares regression models. MEASUREMENTS AND MAIN RESULTS: Emphysema severity was significantly elevated in coal miners compared with nonminers among ever- and never-smokers (P < 0.0001). Cumulative exposure to respirable coal mine dust or coal dust retained in the lungs were significant predictors of emphysema severity (P < 0.0001) after accounting for cigarette smoking, age at death, and race. The contributions of coal mine dust exposure and cigarette smoking were similar in predicting emphysema severity averaged over this cohort. CONCLUSIONS: Coal dust exposure, cigarette smoking, age, and race are significant and additive predictors of emphysema severity in this study. |
Don't become a statistic: work safely at heights
Mulhern B , Lentz TJ . Occup Health Saf 2009 78 (7) 36, 38, 40 passim In Alabama, a framing crew member who was moving a roof truss into place while supporting himself on an 8-inch wide structural beam fell 27 feet to the ground inside the partially constructed building. The native Mexican laborer, who understood little English, was not wearing or using personal fall protection equipment. An 8-foot by 4-foot truss fell at the same time, striking the worker's head when he hit the ground. He was pronounced dead at a local hospital. |
Essential features for proactive risk management
Murashov V , Howard J . Nat Nanotechnol 2009 4 (8) 467-70 We propose a proactive approach to the management of occupational health risks in emerging technologies based on six features: qualitative risk assessment; the ability to adapt strategies and refine requirements; an appropriate level of precaution; global applicability; the ability to elicit voluntary cooperation by companies; and stakeholder involvement. |
Impact of publicly sponsored interventions on musculoskeletal injury claims in nursing homes
Park RM , Bushnell PT , Bailer AJ , Collins JW , Stayner LT . Am J Ind Med 2009 52 (9) 683-97 BACKGROUND: The rate of lost-time sprains and strains in private nursing homes is over three times the national average, and for back injuries, almost four times the national average. The Ohio Bureau of Workers' Compensation (BWC) has sponsored interventions that were preferentially promoted to nursing homes in 2000-2001, including training, consultation, and grants up to $40,000 for equipment purchases. METHODS: This study evaluated the impact of BWC interventions on back injury claim rates using BWC data on claims, interventions, and employer payroll for all Ohio nursing homes during 1995-2004 using Poisson regression. A subset of nursing homes was analyzed with more detailed data that allowed estimation of the impact of staffing levels and resident acuity on claim rates. Costs of interventions were compared to the associated savings in claim costs. RESULTS: A $500 equipment purchase per nursing home worker was associated with a 21% reduction in back injury rate. Assuming an equipment life of 10 years, this translates to an estimated $768 reduction in claim costs per worker, a present value of $495 with a 5% discount rate applied. Results for training courses were equivocal. Only those receiving below-median hours had a significant 19% reduction in claim rates. Injury rates did not generally decline with consultation independent of equipment purchases, although possible confounding, misclassification, and bias due to non-random management participation clouds interpretation. In nursing homes with available data, resident acuity was modestly associated with back injury risk, and the injury rate increased with resident-to-staff ratio (acting through three terms: RR = 1.50 for each additional resident per staff member; for the ratio alone, RR = 1.32, 95% CI = 1.18-1.48). 
In these NHs, an expenditure of $908 per resident care worker (equivalent to $500 per employee in the other model) was also associated with a 21% reduction in injury rate. However, with a resident-to-staff ratio greater than 2.0, the same expenditure was associated with a $1,643 reduction in back claim costs over 10 years per employee, a present value of $1,062 with 5% discount rate. CONCLUSIONS: Expenditures for ergonomic equipment in nursing homes by the Ohio BWC were associated with fewer worker injuries and reductions in claim costs that were similar in magnitude to expenditures. Un-estimated benefits and costs also need to be considered in assessing full health and financial impacts. Am. J. Ind. Med. 52:683-697, 2009. (c) 2009 Wiley-Liss, Inc. |
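The discounting step in this abstract is a standard present-value calculation. The sketch below is a generic annuity formulation assuming the $768 claim-cost saving accrues evenly over the 10-year equipment life; the abstract does not state the paper's exact discounting convention, so the result only approximates (and does not reproduce) the reported $495.

```python
def present_value(annual_saving, years, rate):
    """Present value of a level annual saving stream, discounted at `rate`,
    with each year's saving received at end of year (ordinary annuity)."""
    return sum(annual_saving / (1 + rate) ** t for t in range(1, years + 1))

# $768 total claim-cost reduction spread evenly over a 10-year equipment life,
# discounted at 5% per the abstract. (Assumed timing; the paper's own
# convention evidently differs, since it reports $495.)
pv = present_value(768 / 10, 10, 0.05)
print(round(pv, 2))
```

The gap between this generic figure and the reported $495 suggests the authors used a different timing or discounting assumption, which the abstract does not specify.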
In situ structure characterization of airborne carbon nanofibres by a tandem mobility-mass analysis
Ku BK , Emery MS , Maynard AD , Stolzenburg MR , McMurry PH . Nanotechnology 2006 17 (14) 3613-21 Carbon nanofibres aerosolized by the agitation of as-produced commercial powder have been characterized in situ by using the differential mobility analyser-aerosol particle mass analyser (DMA-APM) method to determine their structural properties, such as the effective density and fractal dimension, for toxicology studies. The effective density of the aerosolized carbon nanofibres decreased from 1.2 to 0.4 g cm(-3) as the mobility diameters increased from 100 to 700 nm, indicating that the carbon nanofibres had open structures with an overall void that increased with increasing diameter, due to increased agglomeration of the nanofibres. This was confirmed by transmission electron microscopy (TEM) observation, showing that 100 nm mobility diameter nanofibres were predominantly single fibres, while doubly or triply attached fibres were seen at mobility diameters of 200 and 400 nm. Effective densities calculated using Cox's theory were in reasonable agreement with experimental values. The mass fractal dimension of the carbon nanofibres was found to be 2.38 over the size range measured and higher than that of single-walled carbon nanotubes (SWCNTs), suggesting that the carbon nanofibres have a more compact structure than SWCNTs. |
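In a tandem DMA-APM measurement, effective density follows from the standard definition rho_eff = 6m / (pi * d_m^3), where m is the particle mass selected by the APM and d_m the mobility diameter selected by the DMA. The sketch below applies that definition; the mass values are illustrative assumptions chosen to be consistent with the density trend reported in the abstract, not the paper's raw data.

```python
import math

def effective_density(mass_fg, d_mobility_nm):
    """Effective density (g/cm^3) from APM mass (femtograms) and DMA
    mobility diameter (nm), via rho_eff = 6 m / (pi * d_m^3)."""
    d_cm = d_mobility_nm * 1e-7               # nm -> cm
    volume_cm3 = math.pi * d_cm ** 3 / 6.0    # mobility-equivalent sphere
    return mass_fg * 1e-15 / volume_cm3       # fg -> g

# Illustrative masses consistent with the reported trend: compact single
# fibres near 100 nm, increasingly void-filled agglomerates at 700 nm.
print(effective_density(0.63, 100))   # ~1.2 g/cm^3
print(effective_density(72, 700))     # ~0.4 g/cm^3
```

The decrease in effective density with mobility diameter is what signals the open, agglomerated structure: mass grows more slowly than the mobility-equivalent sphere volume.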
Cholesterol control beyond the clinic: New York City's trans fat restriction
Angell SY , Silver LD , Goldstein GP , Johnson CM , Deitcher DR , Frieden TR , Bassett MT . Ann Intern Med 2009 151 (2) 129-34 Decades after key modifiable risk factors were identified, cardiovascular disease remains the leading cause of preventable death, and only one quarter of persons with high cholesterol levels have attained recommended levels of control. Cholesterol control efforts have focused on consumer education and medical treatment. A powerful, complementary approach is to change the makeup of food, a route the New York City Department of Health and Mental Hygiene took when it restricted artificial trans fat--a contributor to coronary heart disease--in restaurants. The Department first undertook a voluntary campaign, but this effort did not decrease the proportion of restaurants that used artificial trans fat. In December 2006, the Board of Health required that artificial trans fat be phased out of restaurant food. To support implementation, the Department provided technical assistance to restaurants. By November 2008, the restriction was in full effect in all New York City restaurants and estimated restaurant use of artificial trans fat for frying, baking, or cooking or in spreads had decreased from 50% to less than 2%. Preliminary analyses suggest that replacement of artificial trans fat has resulted in products with more healthful fatty acid profiles. For example, in major restaurant chains, total saturated fat plus trans fat in French fries decreased by more than 50%. At 2 years, dozens of national chains had removed artificial trans fat, and 13 jurisdictions, including California, had adopted similar laws. Public health efforts that change food content to make default choices healthier enable consumers to more successfully meet dietary recommendations and reduce their cardiovascular risk. |
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Community Health Services
- Entomology
- Environmental Health
- Epidemiology and Surveillance
- Genetics and Genomics
- Healthcare Associated Infections
- Immunity and Immunization
- Injury and Violence
- Laboratory Sciences
- Nutritional Sciences
- Occupational Safety and Health
- Public Health Law
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed: Feb 1, 2024
- Page last updated: Apr 29, 2024