First detection of group C rotavirus in children with acute gastroenteritis in South Korea.
Moon S , Humphrey CD , Kim JS , Baek LJ , Song JW , Song KJ , Jiang B . Clin Microbiol Infect 2010 17 (2) 244-7 Group C rotavirus (GpC RV) causes sporadic cases and outbreaks of acute diarrhea in humans worldwide, but has not been detected among children in South Korea. The aims of the present study were to detect GpC RV among children hospitalized with gastroenteritis in South Korea and perform molecular characterization of GpC RV strains. From November 2003 to January 2006, 434 fecal samples were collected from children < 10 years of age who were hospitalized for treatment of acute diarrhea and screened for group C and A rotaviruses by EIA. GpC RV strains were characterized by sequence and phylogenetic analysis. Of the 434 samples screened, two were positive for GpC RV and one had a mixed GpC and GpA RV infection. One of the strains, Icheon, shared high sequence conservation in the VP4, VP6 and VP7 genes with other published GpC RV strains. This is the first report describing the molecular characteristics of GpC RV among children in South Korea; more surveillance is needed to determine the burden of GpC RV diarrhea. |
Screening and treatment to prevent sequelae in women with Chlamydia trachomatis genital infection: how much do we know?
Gottlieb SL , Berman SM , Low N . J Infect Dis 2010 201 Suppl 2 S156-67 BACKGROUND: An important question for chlamydia control programs is the extent to which finding and treating prevalent, asymptomatic Chlamydia trachomatis genital infection reduces reproductive sequelae in infected women. METHODS: We reviewed the literature to critically evaluate evidence on the effect of chlamydia screening on development of sequelae in infected women. RESULTS: Two randomized controlled trials of 1-time screening for chlamydial infection (in a Seattle-area health maintenance organization and a Danish school district) revealed that screening was associated with an approximately 50% reduction in the incidence of pelvic inflammatory disease over the following year. However, both of these trials had methodological issues that may have affected the magnitude of observed screening benefits and might limit generalizability to other populations. A large, nonrandomized cohort study of chlamydia screening among US Army recruits, although limited by lack of outpatient data, did not find a benefit of similar magnitude to the randomized trials. Methodological limitations restrict valid conclusions about individual benefits of screening using data from historical cohorts and ecological studies. We identified no trials directly evaluating the effect of chlamydia screening on subclinical tubal inflammation or damage, ectopic pregnancy, or tubal factor infertility and no studies addressing the effects of >1 round of screening, the optimal frequency of screening, or the benefits of screening for repeat infections. CONCLUSIONS: Additional studies of the effectiveness of chlamydia screening would be valuable; feasible study designs may depend on the degree to which screening programs are already established. In addition, better natural history data on the timing of tubal inflammation and damage after C. trachomatis infection and development of more accurate, noninvasive tools to assess chlamydial sequelae are essential to informing chlamydia control efforts. |
Tuberculosis investigations associated with air travel: U.S. Centers for Disease Control and Prevention, January 2007-June 2008
Marienau KJ , Burgess GW , Cramer E , Averhoff FM , Buff AM , Russell M , Kim C , Neatherlin JC , Lipman H . Travel Med Infect Dis 2010 8 (2) 104-12 INTRODUCTION: Contact investigations conducted in the United States of persons with tuberculosis (TB) who traveled by air while infectious have increased. However, data about transmission risks of Mycobacterium tuberculosis on aircraft are limited. METHODS: We analyzed data on index TB cases and passenger contacts from contact investigations initiated by the U.S. Centers for Disease Control and Prevention from January 2007 through June 2008. RESULTS: Contact investigations for 131 index cases met study inclusion criteria, including 4550 passenger contacts. U.S. health departments reported TB screening test results for 758 (22%) of assigned contacts; 182 (24%) had positive results. Of the 142 passenger contacts with positive TB test results with information about risk factors for prior TB infection, 130 (92%) had at least one risk factor and 12 (8%) had no risk factors. Positive TB test results were significantly associated with risk factors for prior TB infection (OR 23; p<0.001). No cases of TB disease among passenger contacts were reported. CONCLUSION: The risks of M. tuberculosis transmission during air travel remain difficult to quantify. Definitive assessment of transmission risks during flights and determination of the effectiveness of contact-tracing efforts will require comprehensive cohort studies. |
Pulmonary impairment after tuberculosis and its contribution to TB burden
Pasipanodya JG , McNabb SJ , Hilsenrath P , Bae S , Lykens K , Vecino E , Munguia G , Miller TL , Drewyer G , Weis SE . BMC Public Health 2010 10 (1) 259 BACKGROUND: The health impacts of pulmonary impairment after tuberculosis (TB) treatment have not been included in assessments of TB burden. Therefore, previous global and national TB burden estimates do not reflect the full consequences of surviving TB. We assessed the burden of TB, including pulmonary impairment after tuberculosis, in Tarrant County, Texas using disability-adjusted life years (DALYs). METHODS: TB burden was calculated for all culture-confirmed TB patients treated at Tarrant County Public Health between January 2005 and December 2006 using the same methods and life tables as the Global Burden of Disease Study. Years of life-lost were calculated as the difference between life expectancy using standardized life tables and age-at-death from TB. Years lived-with-disability were calculated from age- and gender-specific TB disease incidence using published disability weights. Non-fatal health impacts of TB were divided into years lived-with-disability-acute and years lived-with-disability-chronic. Years lived-with-disability-acute was defined as TB burden resulting from illness prior to completion of treatment, including the burden from treatment-related side effects. Years lived-with-disability-chronic was defined as TB burden from disability resulting from pulmonary impairment after tuberculosis. RESULTS: There were 224 TB cases in the study period; of these, 177 were culture confirmed. These 177 subjects lost a total of 1189 DALYs. Of these 1189 DALYs, 23% were from years of life-lost, 2% from years lived-with-disability-acute and 75% from years lived-with-disability-chronic. CONCLUSIONS: Our findings demonstrate that the disease burden from TB is greater than previously estimated. Pulmonary impairment after tuberculosis was responsible for the majority of the burden. These data demonstrate that successful TB control efforts may reduce the health burden more than previously recognized. |
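The DALY accounting described in the abstract above (total burden = years of life-lost plus acute and chronic years lived-with-disability) can be sketched in a few lines of Python. The function names and the example numbers below are illustrative only, chosen to roughly match the reported 23%/2%/75% split; they are not the study's actual inputs.

```python
def years_lived_with_disability(cases, disability_weight, duration_years):
    """YLD for one outcome: incident cases x disability weight x duration.

    Simple undiscounted form; the Global Burden of Disease Study also
    applies age weighting and time discounting.
    """
    return cases * disability_weight * duration_years


def daly(yll, yld_acute, yld_chronic):
    """Total burden: fatal component (YLL) plus non-fatal acute and chronic YLD."""
    return yll + yld_acute + yld_chronic


# Hypothetical cohort with shares roughly matching those reported (23%/2%/75%)
total = daly(yll=273.5, yld_acute=23.8, yld_chronic=891.7)   # ~1189 DALYs
chronic_share = 891.7 / total                                 # ~0.75
```

The chronic YLD term dominating the total is exactly the study's point: counting only deaths and acute illness would miss most of the burden.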
High-risk human papillomavirus reactivation in human immunodeficiency virus-infected women: risk factors for cervical viral shedding
Theiler RN , Farr SL , Karon JM , Paramsothy P , Viscidi R , Duerr A , Cu-Uvin S , Sobel J , Shah K , Klein RS , Jamieson DJ . Obstet Gynecol 2010 115 (6) 1150-1158 OBJECTIVE: To evaluate the presence of and estimate risk factors for reactivation of latent high-risk human papillomavirus (HPV) cervical infection in human immunodeficiency virus (HIV)-infected and HIV-uninfected women. METHODS: Data from 898 women in the HIV Epidemiology Research Study (HERS) were used to evaluate cervical HPV latency and reactivation. Prior exposure to HPV types (16, 18, 31, 35, and 45) was determined by serologic testing at enrollment, and cervical shedding of HPV was detected by polymerase chain reaction at 6-month intervals. Human papillomavirus cervical shedding and sexual history were used to estimate rates of reactivation and recurrence. Repeated measures survival analysis was used to estimate hazard ratios and 95% confidence intervals for reactivation and recurrence. Rates of total HPV shedding (recurrence and reactivation) during follow-up were assessed by HIV status and rate ratios were calculated. RESULTS: Reactivation of latent HPV infections was observed in HIV-infected women, but few reactivation events were identified in HIV-uninfected women. Factors consistently associated with reactivation in HIV-infected women included CD4 count less than 200 cells/mm(3) and age younger than 35 years. Women infected with HIV had 1.8 to 8.2 times higher rates of viral shedding (reactivation plus recurrence) compared with HIV-uninfected women. CONCLUSION: Women with a history of cervical HPV infection may be at risk of reactivation of latent viral infection even in the absence of sexual activity, and this risk is higher in women with HIV infection. LEVEL OF EVIDENCE: II. |
Introduction: The natural history and immunobiology of Chlamydia trachomatis genital infection and implications for chlamydia control
Gottlieb SL , Brunham RC , Byrne GI , Martin DH , Xu F , Berman SM . J Infect Dis 2010 201 Suppl 2 S85-7 Chlamydia trachomatis genital infection is the most common bacterial sexually transmitted infection worldwide [1], and an estimated 3 million cases occur each year in the United States [2]. In women, C. trachomatis genital infection can lead to serious complications, including pelvic inflammatory disease, ectopic pregnancy, tubal infertility, and chronic pelvic pain [3]. Because of this, many countries have implemented chlamydia control efforts that have primarily emphasized enhanced detection and treatment of asymptomatic infection in young women and have achieved varying degrees of screening coverage [4–6]. Early reports from regions that were the first to implement chlamydia control activities (during the late 1980s and early 1990s) revealed that both chlamydia case rates and rates of associated complications were decreasing [7–9]. However, since the mid-1990s, in virtually all countries with substantial investment in chlamydia control, the number of C. trachomatis infection case reports has been increasing in the setting of ongoing control efforts [10–12]. In the United States, regions that had initially shown decreases in chlamydia test positivity (prevalence of chlamydia among tested women) have since shown stable or increasing test positivity [11]. Although there are limitations in using these types of surveillance data to assess burden of disease [13], the substantial and continuing decreases in rates of C. trachomatis infection that were expected after implementation of control programs have not been observed [14], and many chlamydia control programs are currently at a crossroads. |
AIDS-defining opportunistic illnesses in US patients, 1994-2007: a cohort study
Buchacz K , Baker RK , Palella Jr FJ , Chmiel JS , Lichtenstein KA , Novak RM , Wood KC , Brooks JT . AIDS 2010 24 (10) 1549-59 OBJECTIVES: To assess the incidence and spectrum of AIDS-defining opportunistic illnesses in the combination antiretroviral therapy (cART) era. DESIGN: A prospective cohort study of 8070 participants in the HIV Outpatient Study at 12 U.S. HIV clinics. METHODS: We calculated incidence rates per 1000 person-years of observation for the first opportunistic infection, first opportunistic malignancy, and first occurrence of each individual opportunistic illness during 1994-2007. Using stratified Poisson regression models, and adjusting for sex, race, and HIV risk category, we modeled annual percentage changes in opportunistic illness incidence rates by calendar period. RESULTS: Eight thousand and seventy patients (baseline median age 38 years; median CD4 cell count 298 cells/microL) experienced 2027 incident opportunistic illnesses during a median of 2.9 years of observation. During 1994-1997, 1998-2002, and 2003-2007, respectively, rates of opportunistic infections (per 1000 person-years) were 89.0, 25.2 and 13.3, and rates of opportunistic malignancies were 23.4, 5.8 and 3.0 (P for trend <0.001 for both). Opportunistic illness rate decreases were similar for the subset of patients receiving cART. During 2003-2007, there were no significant changes in annual rates of opportunistic infections or opportunistic malignancies; the leading opportunistic illnesses (rate per 1000 person-years) were esophageal candidiasis (5.2), Pneumocystis pneumonia (3.9), cervical cancer (3.5), Mycobacterium avium complex infection (2.5), and cytomegalovirus disease (1.8); 36% of opportunistic illness events occurred at CD4 cell counts of at least 200 cells/microL. CONCLUSIONS: Opportunistic illness rates declined precipitously after the introduction of cART and stabilized at low levels during 2003-2007. In this contemporary cART era, a third of opportunistic illnesses were diagnosed at CD4 cell counts of at least 200 cells/microL. |
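Rates per 1000 person-years, as reported in the cohort study above, come from straightforward person-time accounting: events divided by the person-years of observation contributed by the cohort, scaled to 1000. A minimal sketch (the counts below are made up for illustration, not taken from the HIV Outpatient Study):

```python
def rate_per_1000_person_years(events, person_years):
    """Incidence rate per 1000 person-years of observation."""
    if person_years <= 0:
        raise ValueError("person-years must be positive")
    return 1000.0 * events / person_years


# e.g. 52 first episodes of an opportunistic illness observed over
# 10,000 person-years of follow-up -> 5.2 per 1000 person-years
rate = rate_per_1000_person_years(52, 10_000)
```

Using person-time rather than a simple proportion of patients is what lets the study compare calendar periods fairly when individuals are followed for different lengths of time.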
Chlamydia screening and pelvic inflammatory disease: Insights from exploratory time-series analyses
Owusu-Edusei Jr K , Bohm MK , Chesson HW , Kent CK . Am J Prev Med 2010 38 (6) 652-7 BACKGROUND: Screening for chlamydia has been reported to reduce pelvic inflammatory disease (PID) at the individual level. However, information on population-level association (or causality) is scant. PURPOSE: This study aims to examine the association between chlamydia and gonorrhea screening and PID diagnoses using time-series analyses. METHODS: Monthly chlamydia and gonorrhea screening and PID diagnosis rates were extracted for a cohort of 207,695 continuously enrolled privately insured women from January 2001 to December 2006. An autoregressive integrated moving average model was used to examine whether rates of PID diagnoses in a given month were associated with rates of chlamydia and gonorrhea screening in previous months. RESULTS: Monthly screening rates increased from about 300 to almost 700 per 100,000 for chlamydia and from 250 to almost 650 per 100,000 for gonorrhea, whereas PID diagnosis rates declined during the same period (from about 40 to 20 per 100,000). Increases in screening rates were associated with decreases in PID diagnosis rates 4 months later. On average, a one-unit (or 10%) increase in the growth of chlamydia and gonorrhea screening rates, separately, in the prior fourth month was significantly associated with a 0.36 (or 3.6%, p<0.05) and 0.32 (or 3.2%, p<0.10) decrease in the growth rate of the PID diagnosis rate, respectively. CONCLUSIONS: Although analyses such as these cannot prove causality, the results are consistent with the hypothesis that increases in chlamydia and gonorrhea screening coverage can lead to reductions in PID at the population level. A population-level focus offers advantages over individual-level analyses of screening and PID, such as the ability to capture indirect benefits of increased screening. |
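The study above fits an ARIMA model; a much simpler way to convey the underlying idea of a lagged population-level association is to correlate the screening rate at month t-4 with the PID diagnosis rate at month t. The sketch below is a toy proxy for that idea, not the authors' method, and the function name is our own.

```python
from statistics import mean

def lagged_correlation(x, y, lag):
    """Pearson correlation of x[t-lag] with y[t], i.e. x leads y by `lag` steps."""
    if lag < 0 or lag >= len(x):
        raise ValueError("lag out of range")
    xs, ys = x[:len(x) - lag], y[lag:]
    mx, my = mean(xs), mean(ys)
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    var_x = sum((a - mx) ** 2 for a in xs)
    var_y = sum((b - my) ** 2 for b in ys)
    return cov / (var_x * var_y) ** 0.5

# Steadily rising screening with steadily falling PID four months later
# yields a strongly negative lag-4 correlation.
screening = [300, 340, 380, 420, 460, 500, 540, 580, 620, 660]
pid = [40, 38, 36, 33, 30, 28, 26, 24, 22, 20]
r = lagged_correlation(screening, pid, lag=4)  # strongly negative
```

An ARIMA approach additionally differences the series and models autocorrelation, which is why the published analysis reports effects on growth rates rather than raw correlations.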
Coping strategies of adolescents living with HIV: disease-specific stressors and responses
Orban LA , Stein R , Koenig LJ , Conner LC , Rexhouse EL , Lewis JV , LaGrange R . AIDS Care 2010 22 (4) 420-30 This study examined disease-specific stressors and coping responses employed by youth with HIV. Data were analyzed from Adolescent Impact, a multi-site study of 166 adolescents infected with HIV in three major US cities. Participants identified HIV-related stressors during a face-to-face interview. Coping strategies were measured using the adolescent version of the Kidcope. Emotional and behavioral functioning were assessed with the Youth or Adult Self Report symptom checklists. Medication-related stressors were most common (30%) and reported more often by perinatally infected youth, whereas youth infected through risk behaviors reported more disclosure-related stressors. Passive emotional regulation was perceived as the most used and most helpful coping strategy overall. Youth reporting medication adherence-related stressors used resignation most frequently. A two-factor model (Passive and Active Coping) emerged. The Passive Coping factor included strategies that do not directly approach the problem, whereas Active Coping included strategies that involve an active approach. Youth with moderately advanced disease (CD4 200-500 cells/mm(3)) used a Passive Coping style more often than healthier youth (CD4 > 500 cells/mm(3)). Additionally, Passive Coping was associated with greater emotional and behavioral problems. Youth infected with HIV may benefit from interventions promoting adaptive coping responses to HIV-specific stressors, particularly medication adherence. |
Prenatal exposure to PBDEs and neurodevelopment
Herbstman JB , Sjodin A , Kurzon M , Lederman SA , Jones RS , Rauh V , Needham LL , Tang D , Niedzwiecki M , Wang RY , Perera F . Environ Health Perspect 2010 118 (5) 712-9 BACKGROUND: Polybrominated diphenyl ethers (PBDEs) are widely used flame retardant compounds that are persistent and bioaccumulative and therefore have become ubiquitous environmental contaminants. Animal studies suggest that prenatal PBDE exposure may result in adverse neurodevelopmental effects. OBJECTIVE: In a longitudinal cohort initiated after 11 September 2001, including 329 mothers who delivered in one of three hospitals in lower Manhattan, New York, we examined prenatal PBDE exposure and neurodevelopment when their children were 12-48 and 72 months of age. METHODS: We analyzed 210 cord blood specimens for selected PBDE congeners and assessed neurodevelopmental effects in the children at 12-48 and 72 months of age; 118, 117, 114, 104, and 96 children with available cord PBDE measurements were assessed at 12, 24, 36, 48, and 72 months, respectively. We used multivariate regression analyses to evaluate the associations between concentrations of individual PBDE congeners and neurodevelopmental indices. RESULTS: Median cord blood concentrations of PBDE congeners 47, 99, and 100 were 11.2, 3.2, and 1.4 ng/g lipid, respectively. After adjustment for potential confounders, children with higher concentrations of BDEs 47, 99, or 100 scored lower on tests of mental and physical development at 12-48 and 72 months. Associations were significant for 12-month Psychomotor Development Index (BDE-47), 24-month Mental Development Index (MDI) (BDE-47, 99, and 100), 36-month MDI (BDE-100), 48-month full-scale and verbal IQ (BDE-47, 99, and 100) and performance IQ (BDE-100), and 72-month performance IQ (BDE-100). CONCLUSIONS: This epidemiologic study demonstrates neurodevelopmental effects in relation to cord blood PBDE concentrations. Confirmation is needed in other longitudinal studies. 
EDITOR'S SUMMARY: Polybrominated diphenyl ethers (PBDEs) are widely used flame retardant compounds that are persistent and bioaccumulative. Animal studies suggest that prenatal PBDE exposure may result in adverse neurodevelopmental effects. Herbstman et al. (p. 712) initiated a longitudinal cohort following the World Trade Center attack on 11 September 2001 to evaluate associations between concentrations of individual PBDE congeners and neurodevelopmental indices. Outcomes were evaluated in approximately 100 children with PBDE concentrations measured in cord blood samples. After adjustment for potential confounders, higher concentrations of BDEs 47, 99, or 100 were associated with lower scores on tests of mental and physical development at 12-48 and 72 months. The authors conclude that developmental exposure to flame retardants following the World Trade Center disaster was associated with altered neurodevelopment of children up to 72 months of age. |
Exposure to phthalates and breast cancer risk in northern Mexico
Lopez-Carrillo L , Hernandez-Ramirez RU , Calafat AM , Torres-Sanchez L , Galvan-Portillo M , Needham LL , Ruiz-Ramos R , Cebrian ME . Environ Health Perspect 2010 118 (4) 539-44 BACKGROUND: Phthalates, ubiquitous environmental pollutants that may disturb the endocrine system, are used primarily as plasticizers of polyvinyl chloride and as additives in consumer and personal care products. OBJECTIVES: In this study, we examined the association between urinary concentrations of nine phthalate metabolites and breast cancer (BC) in Mexican women. METHODS: We age-matched 233 BC cases to 221 women residing in northern Mexico. Sociodemographic and reproductive characteristics were obtained by direct interviews. Phthalates were determined in urine samples (collected pretreatment from the cases) by isotope dilution/high-performance liquid chromatography coupled to tandem mass spectrometry. RESULTS: Phthalate metabolites were detected in at least 82% of women. The geometric mean concentrations of monoethyl phthalate (MEP) were higher in cases than in controls (169.58 vs. 106.78 microg/g creatinine). Controls showed significantly higher concentrations of mono-n-butyl phthalate, mono(2-ethyl-5-oxohexyl) phthalate, and mono(3-carboxypropyl) phthalate (MCPP) than did the cases. After adjusting for risk factors and other phthalates, MEP urinary concentrations were positively associated with BC [odds ratio (OR), highest vs. lowest tertile = 2.20; 95% confidence interval (CI), 1.33-3.63; p for trend < 0.01]. This association became stronger when estimated for premenopausal women (OR, highest vs. lowest tertile = 4.13; 95% CI, 1.60-10.70; p for trend < 0.01). In contrast, we observed significant negative associations for monobenzyl phthalate (MBzP) and MCPP. CONCLUSIONS: We show for the first time that exposure to diethyl phthalate, the parent compound of MEP, may be associated with increased risk of BC, whereas exposure to the parent phthalates of MBzP and MCPP might be negatively associated. These findings require confirmation. |
A case-control study of body mass index and breast cancer risk in white and African-American women.
Berstad P , Coates RJ , Bernstein L , Folger SG , Malone KE , Marchbanks PA , Weiss LK , Liff JM , McDonald JA , Strom BL , Simon MS , Deapen D , Press MF , Burkman RT , Spirtas R , Ursin G . Cancer Epidemiol Biomarkers Prev 2010 19 (6) 1532-44 OBJECTIVE: Large body size has been associated with decreased risk of breast cancer in premenopausal women but with increased risk in postmenopausal women. Limited information is available about African-American women and differences by estrogen and progesterone receptor status. METHODS: We analyzed data from the Women's Contraceptive and Reproductive Experiences Study among 3,997 white and African-American breast cancer case patients diagnosed from 1994 to 1998 and 4,041 control participants ages 35 to 64 years. We calculated multivariate odds ratios (OR) as measures of relative risk of breast cancer associated with self-reported body mass index (BMI) at age 18 and 5 years before diagnosis (recent BMI). RESULTS: Risk tended to decrease with increasing BMI at age 18 years in all women [OR(BMI ≥ 25 kg/m(2) versus < 20 kg/m(2)) = 0.76; 95% confidence interval (CI), 0.63-0.90; P(trend) = 0.005] and with recent BMI in premenopausal women (OR(BMI ≥ 35 kg/m(2) versus < 25 kg/m(2)) = 0.81; 95% CI, 0.61-1.06; P(trend) = 0.05), unmodified by race. Among postmenopausal white but not African-American women, there was an inverse relation between recent BMI and risk. High recent BMI was associated with increased risk of estrogen receptor- and progesterone receptor-positive tumors among postmenopausal African-American women (OR(BMI ≥ 35 kg/m(2) versus < 25 kg/m(2)) = 1.83; 95% CI, 1.08-3.09; P(trend) = 0.03). CONCLUSION: Among women aged 35 to 64 years, BMI at age 18 years is inversely associated with risk of breast cancer, but the association with recent BMI varies by menopause status, race, and hormone receptor status. Impact: Our findings indicate that studies of BMI and breast cancer should consider breast cancer subtypes. (c)2010 AACR. |
Asian American/Pacific Islander paradox in diabetic retinopathy: findings from the Behavioral Risk Factor Surveillance System, 2006-2008
Li Y , Liao Y , Fan A , Zhang X , Balluz L . Ethn Dis 2010 20 (2) 111-7 OBJECTIVES: To compare the self-reported prevalence of diabetic retinopathy (DR) between Asian Americans/Pacific Islanders (AAPIs) and Whites in the United States. METHODS: We analyzed data from 70,209 adults aged ≥18 years with diabetes derived from the 2006-2008 Behavioral Risk Factor Surveillance System (BRFSS), including 1,499 AAPIs and 68,710 White individuals. RESULTS: Compared with Whites with diabetes, AAPIs with diabetes had higher socioeconomic status, fewer risk factors (eg, smoking) and coexisting chronic diseases (eg, cardiovascular disease [CVD]). Diabetes duration and percentage of persons using insulin were similar between the 2 populations. However, AAPIs had a much higher prevalence of DR (27.6%) than Whites (18.2%) (P<.001). Comparing AAPIs to Whites, the age- and gender-adjusted odds ratio of DR was 1.97 (1.48-2.62). The adjusted odds ratio was 2.21 (1.63-3.00) after adjustment for sociodemographic factors (education and marital status), chronic conditions (CVD and smoking), severity of diabetes and diabetes care (age of diabetes onset, frequency of self-checking blood sugar, and frequency of dilated eye exam). CONCLUSIONS: Despite their favorable socio- and health-related profiles, AAPIs had significantly higher prevalence of DR compared with Whites. |
The role of a poison control center in identifying and limiting an outbreak of foodborne botulism
Brown J , Sutter ME , Algren DA , Thomas JD , Ragone S , Schier JG , Geller RJ . Am J Prev Med 2010 38 (6) 675-8 Many poison control centers partner with public health agencies to handle weekend and after-hours consultations and emergencies. This report describes the effective use of poison control center capabilities in identifying and limiting an outbreak of foodborne botulism. On September 8, 2006, the poison control center received a call regarding a man aged 77 years admitted to a hospital neurology service with dysarthria, dysphagia, and weakness. The poison control center was contacted regarding a concern for botulism. Further information revealed that the patient's wife and a friend had similar symptoms and had eaten together on the previous night. All three sought treatment at different hospitals. The poison control center successfully located the other two patients and provided information regarding the treatment of botulism. In addition, the poison control center notified the on-call local public health official and the CDC for the release of botulinum antitoxin. Public health officials were informed of our concerns for a foodborne outbreak given the common meal. Their investigation determined that the source of botulism was carrot juice. |
Improvements in ability to detect undiagnosed diabetes by using information on family history among adults in the United States.
Yang Q , Liu T , Valdez R , Moonesinghe R , Khoury MJ . Am J Epidemiol 2010 171 (10) 1079-89 Family history is an independent risk factor for diabetes, but it is not clear how much adding family history to other known risk factors would improve detection of undiagnosed diabetes in a population. Using the National Health and Nutrition Examination Survey for 1999-2004, the authors compared logistic regression models with established risk factors (model 1) with a model (model 2) that also included familial risk of diabetes (average, moderate, and high). Adjusted odds ratios for undiagnosed diabetes, using average familial risk as referent, were 1.7 (95% confidence interval (CI): 1.2, 2.5) and 3.8 (95% CI: 2.2, 6.3) for those with moderate and high familial risk, respectively. Model 2 was superior to model 1 in detecting undiagnosed diabetes, as reflected by several significant improvements, including weighted C statistics of 0.826 versus 0.842 (bootstrap P = 0.001) and integrated discrimination improvement of 0.012 (95% CI: 0.004, 0.030). With a risk threshold of 7.3% (sensitivity of 40% based on model 1), adding family history would identify an additional 620,000 (95% CI: 221,100, 1,020,000) cases without a significant change in false-positive fraction. Study findings suggest that adding family history of diabetes can provide significant improvements in detecting undiagnosed diabetes in the US population. Further research is needed to validate the authors' findings. |
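The adjusted odds ratios in the abstract above come from logistic regression models; the basic quantity itself, a crude (unadjusted) odds ratio with a Wald 95% confidence interval, can be computed directly from a 2x2 table. The sketch below uses hypothetical counts, not the NHANES data.

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d):
    """Crude odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    """
    if min(a, b, c, d) <= 0:
        raise ValueError("all four cells must be positive")
    or_ = (a * d) / (b * c)
    se_log_or = sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = exp(log(or_) - 1.96 * se_log_or)
    hi = exp(log(or_) + 1.96 * se_log_or)
    return or_, (lo, hi)

# Hypothetical: 20 of 100 with high familial risk have undiagnosed diabetes,
# versus 10 of 100 with average familial risk.
or_, (lo, hi) = odds_ratio_ci(20, 80, 10, 90)  # OR = 2.25
```

Adjusted ORs like those reported (1.7 and 3.8) additionally control for the established risk factors in the model, which a 2x2 table cannot do.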
Role of human immunodeficiency virus type 1 integrase in uncoating of the viral core
Briones MS , Dobard CW , Chow SA . J Virol 2010 84 (10) 5181-90 After membrane fusion with a target cell, the core of human immunodeficiency virus type 1 (HIV-1) enters into the cytoplasm, where uncoating occurs. The cone-shaped core is composed of the viral capsid protein (CA), which disassembles during uncoating. The underlying factors and mechanisms governing uncoating are poorly understood. Several CA mutations can cause changes in core stability and a block at reverse transcription, demonstrating the requirement for optimal core stability during viral replication. HIV-1 integrase (IN) catalyzes the insertion of the viral cDNA into the host genome, and certain IN mutations are pleiotropic. Similar to some CA mutants, two IN mutants, one with a complete deletion of IN (NL-DeltaIN) and the other with a Cys-to-Ser substitution (NL-C130S), were noninfectious, with a replication block at reverse transcription. Compared to the wild type (WT), the cytoplasmic CA levels of the IN mutants in infected cells were reduced, suggesting accelerated uncoating. The role of IN during uncoating was examined by isolating and characterizing cores from NL-DeltaIN and NL-C130S. Both IN mutants could form functional cores, but the core yield and stability were decreased. Also, virion incorporation of cyclophilin A (CypA), a cellular peptidyl-prolyl isomerase that binds specifically to CA, was decreased in the IN mutants. Cores isolated from WT virus depleted of CypA had an unstable-core phenotype, confirming a role of CypA in promoting optimal core stability. Taken together, our results indicate that IN is required during uncoating for maintaining CypA-CA interaction, which promotes optimal stability of the viral core. |
Lethal dissemination of H5N1 influenza virus is associated with dysregulation of inflammation and lipoxin signaling in a mouse model of infection
Cilloniz C , Pantin-Jackwood MJ , Ni C , Goodman AG , Peng X , Proll SC , Carter VS , Rosenzweig ER , Szretter KJ , Katz JM , Korth MJ , Swayne DE , Tumpey TM , Katze MG . J Virol 2010 84 (15) 7613-24 Periodic outbreaks of highly pathogenic avian H5N1 influenza viruses, and the current H1N1 pandemic, highlight the need for a more detailed understanding of influenza virus pathogenesis. To investigate the host transcriptional response induced by pathogenic influenza viruses, we used a functional genomics approach to compare gene expression profiles in lungs from 129S6/SvEv mice infected with either the fully reconstructed H1N1 1918 pandemic virus (1918) or the highly pathogenic avian H5N1 virus Vietnam/1203/04 (VN/1203). Although both viruses reached similar titers in the lung and caused a lethal infection, the mean time of death was 6 days for VN/1203-infected animals and 9 days for mice infected with the 1918 virus. VN/1203-infected animals also exhibited an earlier and more potent inflammatory response. This response included induction of genes encoding components of the inflammasome. VN/1203 was also able to disseminate to multiple organs, including the brain, which correlated with changes in the expression of genes associated with hematological functions and lipoxin biogenesis and signaling. Both viruses elicited expression of type I interferon (IFN)-regulated genes in wild-type mice and to a lesser extent in mice lacking the type I IFN receptor, suggesting alternative or redundant pathways for IFN signaling. Our findings suggest that VN/1203 is more pathogenic in mice as a consequence of several factors, including the early and sustained induction of the inflammatory response, the additive or synergistic effects of up-regulated components of the immune response, and inhibition of lipoxin-mediated anti-inflammatory responses, which correlated with the ability of VN/1203 to disseminate to extrapulmonary organs. |
Reasons for delay in seeking care for tuberculosis, Republic of Armenia, 2006-2007
Schneider D , McNabb SJN , Safaryan M , Davidyants V , Niazyan L , Orbelyan S . Interdiscip Perspect Infect Dis 2010 2010 412624. BACKGROUND: Tuberculosis (TB) is a leading cause of morbidity and mortality worldwide. In Armenia, case reports of active TB increased from 590 to 1538 between 1990 and 2003. However, the TB case detection rate in Armenia in 2007 was only 51%, indicating that many cases go undetected or that suspected cases are not referred for confirmatory diagnosis. Understanding why Armenians delay or do not seek TB medical care is important to increase detection rates, improve treatment outcomes, and reduce ongoing transmission. METHODS: Two hundred forty patients hospitalized between August 2006 and September 2007 at two Armenian TB reference hospitals were interviewed about symptoms, when they sought medical attention after symptom onset, outcomes of their first medical visit, and when they began treatment after diagnosis. We used logistic regression modeling to identify reasons for delay in diagnosis. RESULTS: Fatigue and weight loss were significantly associated with delay in seeking medical attention [aOR = 2.47 (95%CI = 1.15, 5.29); aOR = 2.99 (95%CI = 1.46, 6.14), resp.], while having night sweats protected against delay [aOR = 0.48 (95%CI = 0.24, 0.96)]. Believing the illness to be something other than TB was also significantly associated with delay [aOR = 2.63 (95%CI = 1.13, 6.12)]. Almost 20% of the 240 TB patients were neither diagnosed at their first medical visit nor referred for further evaluation. CONCLUSIONS: This study showed that raising awareness of the signs and symptoms of TB among both the public and clinical communities is urgently needed. |
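The adjusted odds ratios quoted above come from the authors' multivariable logistic regression, which is not reproduced here. For readers unfamiliar with how such interval estimates arise, a minimal sketch of the simpler, unadjusted case: an odds ratio and Wald confidence interval computed from a 2x2 table. All counts below are invented for illustration and do not come from the study.

```python
import math

def odds_ratio_wald_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald confidence interval from a 2x2 table:
        a = exposed with outcome,   b = exposed without outcome
        c = unexposed with outcome, d = unexposed without outcome
    The CI is built on the log-odds scale, where the standard error is
    sqrt(1/a + 1/b + 1/c + 1/d), then exponentiated back."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: patients reporting fatigue, cross-tabulated
# against delayed care-seeking (numbers invented).
or_, lo, hi = odds_ratio_wald_ci(a=60, b=40, c=50, d=90)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```

An adjusted OR differs in that the exposure coefficient is estimated jointly with covariates (age, other symptoms, etc.) in the regression model, but the exponentiate-the-log-coefficient logic is the same.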
Parenting predictors of early-adolescents' health behaviors: simultaneous group comparisons across sex and ethnic groups
Windle M , Brener N , Cuccaro P , Dittus P , Kanouse DE , Murray N , Wallander J , Schuster MA . J Youth Adolesc 2010 39 (6) 594-606 The purpose of this study was to evaluate the invariance of predictive relations across early-adolescent sex and ethnic groups regarding parenting factors and externalizing and internalizing problems and victimization. Data (n = 598; 54% female) from a triethnic (Hispanic, non-Hispanic white, and non-Hispanic black) probability sample of fifth graders collected from three sites (Birmingham, AL, Houston, TX, and Los Angeles, CA) were used in the analyses. Simultaneous group structural equation modeling supported the invariance of parenting-early adolescent outcomes across sex and ethnic groups. Parental monitoring and parental norms were relatively robust predictors of early-adolescent externalizing problems and victimization, and to a lesser extent, of internalizing problems. A maternal nurturance by parental monitoring interaction was statistically significant for all outcome behaviors, indicating that higher monitoring in conjunction with higher maternal nurturance was associated with lower levels of early-adolescent problem behaviors. The findings suggest that core parenting factors such as nurturance, monitoring, and normative expectations for early adolescent problem behaviors may serve as a foundation for parenting components of multi-component intervention studies. |
Survey of employee knowledge and attitudes before and after a multicenter Veterans' Administration quality improvement initiative to reduce nosocomial methicillin-resistant Staphylococcus aureus infections
Burkitt KH , Sinkowitz-Cochran RL , Obrosky DS , Cuerdon T , Miller LJ , Jain R , Jernigan JA , Fine MJ . Am J Infect Control 2010 38 (4) 274-82 BACKGROUND: Although guidelines currently recommend prevention practices to decrease in-hospital transmission of infections, increasing adherence to the practices remains a challenge. This study assessed the effect of a multicenter methicillin-resistant Staphylococcus aureus (MRSA) prevention initiative on changes in employees' knowledge, attitudes, and practices. METHODS: Two cross-sectional surveys were distributed at baseline (October 2006) and follow-up (July 2007) at 17 medical centers participating in the Veterans' Administration (VA) MRSA initiative. RESULTS: Surveys were completed by 1362 employees at baseline and 952 employees at follow-up (representing 57% and 56% of eligible respondents, respectively). Respondents included physicians (9%), nurses (38%), allied health professionals (30%), and other support staff (24%). Of the 5 knowledge items, the mean proportion answered correctly increased slightly from baseline to follow-up (from 71% to 73%; P = .07). The percentage of respondents who believed that MRSA was a problem on their unit increased over time (from 56% to 65%; P < .001). Respondents also reported increased comfort with reminding other staff about proper hand hygiene (from 61% to 70%; P < .001) and contact precautions (from 63% to 70%; P < .002). The percentage of respondents reporting at least one barrier to proper hand hygiene decreased over time (from 25% to 20%; P = .003). CONCLUSIONS: In this multicenter study of VA employees, implementation of a MRSA quality improvement initiative was associated with temporal improvements in knowledge and perceptions regarding MRSA prevention. |
Reduced risk of transfusion-transmitted HIV in Kenya through centrally co-ordinated blood centres, stringent donor selection and effective p24 antigen-HIV antibody screening
Basavaraju SV , Mwangi J , Nyamongo J , Zeh C , Kimani D , Shiraishi RW , Madoda R , Okonji JA , Sugut W , Ongwae S , Pitman JP , Marum LH . Vox Sang 2010 99 (3) 212-9 BACKGROUND: Following a 1994 study showing a high rate of transfusion-associated HIV, Kenya implemented WHO blood safety recommendations including: organizing the Kenya National Blood Transfusion Service (NBTS), stringent blood donor selection, and universal screening with fourth-generation p24 antigen and HIV antibody assays. Here, we estimate the risk of transfusion-associated HIV transmission in Kenya resulting from NBTS laboratory error and consider the potential safety benefit of instituting pooled nucleic acid testing (NAT) to reduce window period transmission. METHODS: From November to December 2008 in one NBTS regional centre, and from March to June 2009 in all six NBTS regional centres, every third unit of blood screened negative for HIV by the national algorithm was selected. Dried blood spots were prepared and sent to a reference laboratory for further testing, including NAT. Test results from the reference laboratory and NBTS were compared. Risk of transfusion-associated HIV transmission owing to laboratory error and the estimated yield of implementing NAT were calculated. FINDINGS: No cases of laboratory error were detected in 12 435 units tested. We estimate that during the study period, the percentage of units reactive for HIV by NAT but non-reactive by the national algorithm was 0.0% (95% exact binomial confidence interval, 0.00-0.024%). INTERPRETATION: By adopting WHO blood safety strategies for resource-limited settings, Kenya has substantially reduced the risk of transfusion-associated HIV infection. As the national testing and donor selection algorithm is effective, implementing NAT is unlikely to add a significant safety benefit. These findings should encourage other countries in the region to fully adopt the WHO strategies. |
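The reported interval (0.00-0.024% for zero reactive units among 12 435 tested) matches the exact binomial upper limit for zero observed events, which for small alpha is approximated by the well-known "rule of three" (3/n). A minimal sketch, assuming the authors used the one-sided exact upper bound of the form 1 - alpha^(1/n):

```python
import math

def exact_upper_bound_zero_events(n, alpha=0.05):
    """Exact binomial upper confidence limit on a proportion when
    0 events are observed in n trials: solve (1 - p)^n = alpha,
    giving p = 1 - alpha**(1/n). For small alpha and large n this
    is close to the 'rule of three' approximation, 3/n."""
    return 1.0 - alpha ** (1.0 / n)

n = 12435  # units that screened HIV-negative in the study
upper = exact_upper_bound_zero_events(n)
print(f"Exact upper 95% limit: {upper * 100:.3f}%")
print(f"Rule-of-three approx:  {3 / n * 100:.3f}%")
```

The lower limit is exactly 0 when no events are observed, which is why the interval starts at 0.00%.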
Streptococcus salivarius meningitis case strain traced to oral flora of anesthesiologist
Shewmaker PL , Gertz Jr RE , Kim CY , de Fijter S , Diorio M , Moore MR , Beall BW . J Clin Microbiol 2010 48 (7) 2589-91 Two women in labor received intrapartum spinal anesthesia from the same anesthesiologist approximately one hour apart. Within 15 hours both patients developed Streptococcus salivarius meningitis and one patient died. Blood and CSF from both patients and tongue swab specimens from the anesthesiologist yielded isolates of an indistinguishable S. salivarius strain. |
Advances in the development of vaccines against Neisseria meningitidis
Tan LK , Carlone GM , Borrow R . N Engl J Med 2010 362 (16) 1511-20 Although two centuries have passed since Vieusseux described epidemic meningococcal disease, (1) Neisseria meningitidis remains a leading cause of meningitis and sepsis. Overwhelming meningococcal disease can develop rapidly and is associated with mortality rates exceeding 20%. (2) Thus, efforts to control the disease have focused on vaccination. In the past, vaccines against meningococcal disease have failed to provide immunogenicity and long-term protection in infants, who are at greatest risk. Although recent vaccines have improved coverage for this age group, there is still no broadly effective vaccine against N. meningitidis group B (NMB), now the predominant disease-causing isolate in industrialized countries. |
Restraint use and seating position among children less than 13 years of age: Is it still a problem?
Greenspan AI , Dellinger AM , Chen J . J Safety Res 2010 41 (2) 183-5 INTRODUCTION: The purpose of this study was to calculate national estimates and examine the extent to which children prematurely use adult seat belts and ride in the front seat of a vehicle during a 30 day period. METHODS: Data were obtained from a nationally representative cross-sectional random-digit-dial telephone survey that included child-specific questions on motor vehicle restraint use and seating position. RESULTS: Among children less than 13 years, parents reported an estimated 618,337 who rode unrestrained and more than one million who rode in the front seat of a vehicle at least some of the time in the past 30 days. During the same time period, close to 11 million children 8 years and younger reportedly used only adult seat belts. DISCUSSION: Our results highlight the need for continued outreach to parents regarding optimal restraint use and rear seating position for children every trip, every time. |
Lifetime self-reported victimization among low-income, urban women: the relationship between childhood maltreatment and adult violent victimization
Parks SE , Kim KH , Day NL , Garza MA , Larkby CA . J Interpers Violence 2010 26 (6) 1111-28 Study aims were to examine the relations between multiple forms of childhood maltreatment (CM) and adult violent victimization (AVV) and to explore other significant covariates of the relations between CM and AVV. Data were collected from women (n = 477) who participated in two longitudinal studies in the Maternal Health Practices and Child Development Project. Women with a history of CM were more than twice as likely to experience AVV as women with no history of CM. Those who experienced one or two forms of CM were significantly more likely to report any AVV compared to women with no CM. The relationship between CM and AVV remained significant after controlling for illicit drug use at baseline. Among low-income women, a history of CM exposure increased the risk of AVV. Having had any CM exposure was more important than the specific form, or combination of forms, of CM exposure (e.g., sexual abuse or physical abuse). |
A population-based study of risk of epilepsy after hospitalization for traumatic brain injury
Ferguson PL , Smith GM , Wannamaker BB , Thurman DJ , Pickelsimer EE , Selassie AW . Epilepsia 2010 51 (5) 891-898 PURPOSE: This study was undertaken to determine the risk of developing posttraumatic epilepsy (PTE) within 3 years after discharge among a population-based sample of older adolescents and adults hospitalized with traumatic brain injury (TBI) in South Carolina. It also identifies characteristics related to development of PTE within this population. METHODS: A stratified random sample of persons aged 15 and older with TBI was selected from the South Carolina nonfederal hospital discharge dataset for four consecutive years. Medical records of recruits were reviewed, and they participated in up to three yearly follow-up telephone interviews. RESULTS: The cumulative incidence of PTE in the first 3 years after discharge, after adjusting for loss to follow-up, was 4.4 per 100 persons over 3 years for hospitalized mild TBI, 7.6 for moderate, and 13.6 for severe. Those with severe TBI, posttraumatic seizures prior to discharge, and a history of depression were most at risk for PTE. This higher risk group also included persons with three or more chronic medical conditions at discharge. DISCUSSION: These results raise the possibility that although some of the characteristics related to development of PTE are nonmodifiable, other factors, such as depression, might be altered with intervention. Further research into factors associated with developing PTE could lead to risk-reducing treatments. |
Foreword to "Graduated driver licensing research, 2007-present: a review and commentary"
Shults RA . J Safety Res 2010 41 (2) 75 Since the Journal of Safety Research published its first summary of Graduated Driver Licensing (GDL) research in 2003, tremendous progress has been made in reducing adolescent motor-vehicle crash deaths. During 2003–2008, the rate of crash deaths among 13–19 year-olds fell 31%, from 19.7 to 13.6 per 100,000 population (Insurance Institute for Highway Safety [IIHS], 2010). In contrast, crash deaths among adults aged 20–69 years declined 13%, from 16.4 to 14.3 per 100,000 population. Despite this progress, motor-vehicle crashes remain the leading cause of death for adolescents in the United States, accounting for one third of all deaths in this age group (Centers for Disease Control and Prevention [CDC], 2010). | GDL addresses this principal health threat to adolescents using a classic public health approach—primary prevention. By limiting the novice driver's exposure to challenging driving conditions, GDL seeks to prevent crash-related injuries from occurring during this high-risk period. The protection provided by GDL applies to the entire population of novice teen drivers (and their passengers), whereas law enforcement approaches focus more on individuals who are caught violating traffic laws. By extending the learner stage of the licensure process, GDL encourages teens to gain driving experience under the safest possible, realistic conditions—with an adult supervisor. During the provisional license stage, teens transition from driving under the supervision of an adult to being fully “in charge” of the vehicle. During this transition time, passenger and night driving restrictions provide continued protection by limiting exposure to the most dangerous known conditions. |
Quantification of dialkylphosphate metabolites of organophosphorus insecticides in human urine using 96-well plate sample preparation and high-performance liquid chromatography-electrospray ionization-tandem mass spectrometry
Odetokun MS , Montesano MA , Weerasekera G , Whitehead Jr RD , Needham LL , Barr DB . J Chromatogr B Analyt Technol Biomed Life Sci 2010 878 (27) 2567-74 Organophosphorus (OP) pesticides kill by disrupting a targeted pest's brain and nervous systems. But if humans and other animals are sufficiently exposed, OP pesticides can have the same effect on them. We developed a fast and accurate high-performance liquid chromatography-tandem mass spectrometry method for the quantitative measurement of the following six common dialkylphosphate (DAP) metabolites of organophosphorus insecticides: dimethylphosphate (DMP), dimethylthiophosphate (DMTP), dimethyldithiophosphate (DMDTP), diethylphosphate (DEP), diethylthiophosphate (DETP), and diethyldithiophosphate (DEDTP). The general sample preparation included 96-well plate solid phase extraction using weak anion exchange cartridges. The analytical separation was performed by high-performance liquid chromatography with a HILIC column. Detection involved a triple quadrupole mass spectrometer with an ESI probe in negative ion mode using multiple reaction monitoring. Repeated analyses of urine samples spiked at 150, 90 and 32 ng/mL with the analytes gave relative standard deviations of less than 22%. The extraction efficiency ranged from 40% to 98%. The limits of detection were in the range of 0.04-1.5 ng/mL. The throughput is 1152 samples per week, effectively quadrupling our previous throughput. The method is safe, quick, and sensitive enough to be used in environmental and emergency biological monitoring of occupational and nonoccupational exposure to organophosphates. |
Establishment and characterization of a Madin-Darby canine kidney reporter cell line for influenza A virus assays
Hossain MJ , Perez S , Guo Z , Chen LM , Donis RO . J Clin Microbiol 2010 48 (7) 2515-23 Influenza virus diagnosis has traditionally relied on virus isolation in chicken embryo or cell cultures. Many laboratories have adopted rapid molecular methods for detection of influenza viruses and discontinued routine utilization of the relatively slow viral culture methods. We describe an influenza A reporter cell line that contributes to more efficient viral detection in cell culture. Madin-Darby canine kidney (MDCK) cells were engineered to constitutively produce an influenza genome-like luciferase reporter RNA driven by the canine RNA polymerase I promoter. Induction of high levels of luciferase activity was detected in the Luc9.1 cells upon infection with various strains of influenza A virus, including 2009 H1N1-pandemic and highly pathogenic H5N1 virus. In contrast, infection with influenza B or human adenovirus type 5 did not induce significant levels of reporter expression. The reporter Luc9.1 cells were evaluated in neutralizing antibody assays with convalescent H3N2 ferret serum, yielding a neutralization titer comparable to that obtained by the conventional microneutralization assay and suggesting that the use of the reporter cell line might simplify neutralization assays by facilitating the establishment of infectious virus endpoints. Luc9.1 cells were also used to determine the susceptibility of influenza A viruses to a model antiviral drug. The equivalence with conventional antiviral assay results indicated that the Luc9.1 cells could provide an alternative cell-based platform for high-throughput drug discovery screens. In summary, the MDCK-derived Luc9.1 reporter cell line is highly permissive for influenza A virus replication and provides a very specific and sensitive approach for simultaneous detection and isolation of influenza A viruses as well as functional evaluation of antibodies and antiviral molecules. |
Evaluation of a procedure for the simultaneous quantification of 4-ketocyclophosphamide, cyclophosphamide, and ifosfamide in human urine
B'Hymer C , Cheever KL . J Chromatogr Sci 2010 48 (5) 328-333 An accurate and precise analysis procedure is presented for the detection and quantification of cyclophosphamide (CP), 4-ketocyclophosphamide (4-keto-CP), a primary metabolite of CP, and ifosfamide (IF) in human urine. CP and IF are common antineoplastic drugs used for the treatment of many types of cancer. Workers in the healthcare field, including nurses and pharmacists who interact with or prepare prescriptions for patients, have potential low-level exposure to the parent drugs; therefore, an analysis procedure is needed. The main focus of this procedure is the quantitation of 4-keto-CP because it is a primary metabolite of CP exposure and stable under physiological conditions. Sample preparation consists of liquid-liquid extraction of urine with ethyl acetate, and the analysis consists of reversed-phase high-performance liquid chromatography coupled with tandem mass spectrometry for detection of the analytes. Accuracy and precision of this procedure are demonstrated by means of recovery experiments. Recoveries are between 97% and 105% of theory for the three target analytes at various concentrations (25, 50, 100, and 375 ng/mL for 4-keto-CP; 1, 2, 4, and 15 ng/mL for CP and IF) with relative standard deviations of 8.4% or less. The limit of detection for this procedure is 1 ng/mL for 4-keto-CP, 0.1 ng/mL for CP, and 0.05 ng/mL for IF in urine. |
Evaluation of laboratory methods for diagnosis of varicella
Leung J , Harpaz R , Baughman AL , Heath K , Loparev V , Vazquez M , Watson BM , Schmid DS . Clin Infect Dis 2010 51 (1) 23-32 BACKGROUND: The incidence of varicella disease is declining as a result of vaccination, making clinical diagnosis more challenging, particularly for vaccine-modified cases. We conducted a comprehensive evaluation of laboratory tests and specimen types to assess diagnostic performance and determine what role testing can play after skin lesions have resolved. METHODS: We enrolled patients with suspected varicella disease in 2 communities. Enrollees were visited at the time of rash onset and 2 weeks later. Multiple skin lesion, oral, urine, and blood or serum specimens were requested at each visit and tested for varicella zoster virus (VZV) immunoglobulin (Ig) G, IgM, and IgA antibody by enzyme-linked immunoassay; for VZV antigen by direct fluorescent antibody; and/or for VZV DNA by polymerase chain reaction (PCR). Clinical certainty of the diagnosis of varicella disease was scored. PCR results from first-visit vesicles or scab specimens served as the gold standard in assessing test performance. RESULTS: Of 93 enrollees, 53 were confirmed to have varicella disease. Among 20 unmodified cases, PCR testing was 95%-100% sensitive for macular and/or papular lesions and for oral specimens collected at the first visit; most specimens from the second visit yielded negative results. Among 27 vaccine-modified cases, macular and/or papular lesions collected at the first visit were also 100% sensitive; yields from other specimens were poorer, and few specimens from the second visit tested positive. Clinical diagnosis was 100% and 85% sensitive for diagnosing unmodified and vaccine-modified varicella cases, respectively. CONCLUSIONS: PCR testing of skin lesion specimens remains convenient and accurate for diagnosing varicella disease in vaccinated and unvaccinated persons. 
PCR of oral specimens can sometimes aid in diagnosis of varicella disease, even after rash resolves. |
Formulation and stability of a novel artificial sebum under conditions of storage and use
Stefaniak AB , Harvey CJ , Wertz PW . Int J Cosmet Sci 2010 32 (5) 347-55 Materials in contact with liquids on the human skin surface may dissolve and permeate into skin. Release and permeation of chemicals in contact with skin is often estimated in vitro using artificial skin liquids, although sebum lipids are generally not included in these models. The purposes of this research were to develop a representative artificial sebum that contains the appropriate types of lipids at levels that match human values and quantitatively characterize the model to understand its utility for in vitro testing. Artificial sebum that consisted of 10 lipids at proportions that closely resembled human sebum was characterized using thin layer chromatography under a variety of storage and use conditions (dry and liquid, 4 degrees C and 32 degrees C, with and without vitamin E) for 28 days. Levels of sebum constituents maintained in solution and dry at 4 degrees C were stable through the duration of the test period. Levels of all sebum lipids maintained dry at 32 degrees C were stable in the presence of vitamin E; however, squalene oxidized rapidly in the absence of vitamin E. Liquids on the human skin surface consist of sebum and sweat with minor amounts of cellular debris and intercellular lipid from the stratum corneum. The relative importance of each component for release of chemicals from materials in contact with skin will depend upon the type of material (metal, organic, etc.). A model artificial sebum was formulated and characterized to aid researchers in understanding potential release of chemicals from materials in contact with skin and subsequent partitioning and absorption. |
Human antimicrobial peptide LL-37 induces mefE/mel-mediated macrolide resistance in Streptococcus pneumoniae
Zahner D , Zhou X , Chancey ST , Pohl J , Shafer WM , Stephens DS . Antimicrob Agents Chemother 2010 54 (8) 3516-9 Macrolide resistance is a major concern in the treatment of Streptococcus pneumoniae. Inducible macrolide resistance in the pneumococcus is mediated by the efflux pump MefE/Mel. We show here that the human antimicrobial peptide LL-37 induces the mefE promoter and confers resistance to erythromycin and LL-37. Such induction may impact efficacy of host defenses and of macrolide-based treatment of pneumococcal disease. |
Adherence to perinatal group B streptococcal prevention guidelines
Goins WP , Talbot TR , Schaffner W , Edwards KM , Craig AS , Schrag SJ , Van Dyke MK , Griffin MR . Obstet Gynecol 2010 115 (6) 1217-1224 OBJECTIVE: To estimate compliance with the 2002 revised perinatal group B streptococci (GBS) prevention guidelines in Tennessee, which recommend universal GBS screening of pregnant women at 35-37 weeks of gestation and, when indicated, administration of intrapartum chemoprophylaxis. METHODS: Active Bacterial Core surveillance conducts active, population-based surveillance for invasive GBS disease in 11 Tennessee counties. A retrospective case-cohort study was conducted using a stratified random sample of all live births in surveillance hospitals during 2003-2004, including all early-onset GBS cases. Factors associated with GBS screening and lack of optimal GBS chemoprophylaxis were analyzed using logistic regression. RESULTS: Screening was performed for 84.7% of pregnant women, but 26.3% of prenatal tests with documented test dates were performed before 35 weeks of gestation. Among women with an indication for GBS prophylaxis, 61.2% received optimal chemoprophylaxis, defined as initiation of a recommended antibiotic 4 hours or more before delivery. When the analysis was restricted to women who were admitted 4 hours or more before delivery, 70.9% received optimal chemoprophylaxis. Women not receiving optimal chemoprophylaxis were more likely to have penicillin allergy (11.7% compared with 2.5%, adjusted odds ratio [OR] 8.58, 95% confidence interval [CI] 1.57-47.04) or preterm delivery (45.5% compared with 13.2%, adjusted OR 5.52, 95% CI 2.29-13.30) and were less likely to have received the recommended prenatal serologic testing for other infectious diseases (77.9% compared with 91.1%, adjusted OR 0.30, 95% CI 0.09-0.98). 
Forty cases of early-onset GBS were identified (0.36 per 1,000 live births); 25% of these neonates were born to women who received screening at 35 weeks of gestation or later and, when indicated, optimal chemoprophylaxis. CONCLUSION: Universal prenatal GBS screening was implemented widely in Tennessee, although the timing of screening and administration of chemoprophylaxis often were not optimal. A substantial burden of early-onset GBS disease occurs despite optimal prenatal screening and chemoprophylaxis, suggesting that alternative strategies, such as vaccination, are needed. LEVEL OF EVIDENCE: II. |
Congenital syphilis: not gone and all too forgotten
Kamb ML . World J Pediatr 2010 6 (2) 101-2 This issue features a retrospective study of clinical records from one health facility in rural Tanzania in which the investigators uncovered an unexpected number of congenital syphilis cases among infants (pages 125-131). The authors point out that congenital syphilis is an age-old disease that has fallen off the priority list of the global health agenda, yet continues to cause substantial morbidity in countries with limited resources. They further note that syphilis can be cheaply and easily cured with penicillin anywhere in the world, suggesting that the most important reason this infection continues to kill and maim infants is simple neglect. The evidence supports the authors' assertions. The World Health Organization (WHO) estimates that each year between 715 000 and 1 575 000 pregnant women are infected with syphilis, and most of them (up to 80%) will suffer a serious adverse pregnancy outcome.[1] Among live-born infants of infected mothers, about half go on to have congenital syphilis infection as depicted in this article; but these surviving children represent just the tip of the iceberg. In low-income settings, syphilis is the most common infection associated with fetal loss or stillbirth, occurring in up to 40% of pregnancies among infected women who are inadequately treated.[1-3] Syphilis infection also contributes importantly to preterm delivery, affecting 20%-33% of infants born to untreated mothers, and through this means causes a substantial share of early neonatal deaths.[1,3] Morbidity aside, the global perinatal mortality associated with syphilis is estimated at about 327 000 cases each year (similar to or exceeding the perinatal deaths estimated for HIV, malaria, or tetanus), and almost all these cases could be prevented.[4] |
Caloric sweetener consumption and dyslipidemia among US adults
Welsh JA , Sharma A , Abramson JL , Vaccarino V , Gillespie C , Vos MB . JAMA 2010 303 (15) 1490-7 CONTEXT: Dietary carbohydrates have been associated with dyslipidemia, a lipid profile known to increase cardiovascular disease risk. Added sugars (caloric sweeteners used as ingredients in processed or prepared foods) are an increasing and potentially modifiable component in the US diet. No known studies have examined the association between the consumption of added sugars and lipid measures. OBJECTIVE: To assess the association between consumption of added sugars and blood lipid levels in US adults. DESIGN, SETTING, AND PARTICIPANTS: Cross-sectional study among US adults (n = 6113) from the National Health and Nutrition Examination Survey (NHANES) 1999-2006. Respondents were grouped by intake of added sugars using limits specified in dietary recommendations (< 5% [reference group], 5%-<10%, 10%-<17.5%, 17.5%-<25%, and > or = 25% of total calories). Linear regression was used to estimate adjusted mean lipid levels. Logistic regression was used to determine adjusted odds ratios of dyslipidemia. Interactions between added sugars and sex were evaluated. MAIN OUTCOME MEASURES: Adjusted mean high-density lipoprotein cholesterol (HDL-C), geometric mean triglycerides, and mean low-density lipoprotein cholesterol (LDL-C) levels and adjusted odds ratios of dyslipidemia, including low HDL-C levels (< 40 mg/dL for men; < 50 mg/dL for women), high triglyceride levels (> or = 150 mg/dL), high LDL-C levels (> or = 130 mg/dL), or high ratio of triglycerides to HDL-C (> 3.8). Results were weighted to be representative of the US population. RESULTS: A mean of 15.8% of consumed calories was from added sugars. 
Among participants consuming less than 5%, 5% to less than 10%, 10% to less than 17.5%, 17.5% to less than 25%, and 25% or greater of total energy as added sugars, adjusted mean HDL-C levels were, respectively, 58.7, 57.5, 53.7, 51.0, and 47.7 mg/dL (P < .001 for linear trend), geometric mean triglyceride levels were 105, 102, 111, 113, and 114 mg/dL (P < .001 for linear trend), and LDL-C levels modified by sex were 116, 115, 118, 121, and 123 mg/dL among women (P = .047 for linear trend). There were no significant trends in LDL-C levels among men. Among higher consumers (> or = 10% added sugars) the odds of low HDL-C levels were 50% to more than 300% greater compared with the reference group (< 5% added sugars). CONCLUSION: In this study, there was a statistically significant correlation between dietary added sugars and blood lipid levels among US adults. |
Welding fumes from stainless steel gas metal arc processes contain multiple manganese chemical species
Keane M , Stone S , Chen B . J Environ Monit 2010 12 (5) 1133-1140 Fumes from a group of gas metal arc welding (GMAW) processes used on stainless steel were generated using three different metal transfer modes and four different shield gases. The objective was to identify and measure manganese (Mn) species in the fumes, and identify processes that are minimal generators of Mn species. The robotic welding system was operated in short-circuit (SC) mode (Ar/CO2 and He/Ar), axial spray (AXS) mode (Ar/O-2 and Ar/CO2), and pulsed axial-spray (PAXS) mode (Ar/O-2). The fumes were analyzed for Mn by a sequential extraction process followed by inductively coupled plasma-atomic emission spectroscopy (ICP-AES) analysis, and by X-ray diffraction (XRD). Total elemental Mn, iron (Fe), chromium (Cr) and nickel (Ni) were separately measured after aqua regia digestion and ICP-AES analysis. Soluble Mn2+, Fe2+, Fe3+, and Ni2+ in a simple biological buffer (phosphate-buffered saline) were determined at pH 7.2 and 5.0 after 2 h incubation at 37 degrees C by ion chromatography. Results indicate that Mn was present in soluble form, acid-soluble form, and acid-soluble form after reduction by hydroxylamine, which represents soluble Mn-0 and Mn2+ compounds, other Mn2+ compounds, and (Mn3+ and Mn4+) compounds, respectively. The dominant fraction was the acid-soluble Mn2+ fraction, but results varied with the process and shield gas. Soluble Mn mass percent in the fume ranged from 0.2 to 0.9%, acid-soluble Mn2+ compounds ranged from 2.6 to 9.3%, and acid plus reducing agent-soluble (Mn3+ and Mn4+) compounds ranged from 0.6 to 5.1%. Total Mn composition ranged from 7 to 15%. XRD results showed fumes had a crystalline content of 90-99% Fe3O4, and showed evidence of multiple Mn oxides, but overlaps and weak signals limited identification. 
Small amounts of Mn2+ (<0.01 to ~1%, or <0.1 to ~10 mg/ml) and Ni2+ (<0.01 to ~0.2%, or <0.1 to ~2 mg/ml) ions from the fume were found in the biological buffer media, but amounts were highly dependent on pH and the welding process. Mn generation rates for the fractions were tabulated, and the influence of ozone is discussed. The conclusions are that exposures to welding fumes include multiple Mn species, both soluble and insoluble, and that exposures to Mn species vary with specific processes and shield gases. |
Workers with Libby amphibole exposure: retrospective identification and progression of radiographic changes
Larson TC , Meyer CA , Kapil V , Gurney JW , Tarver RD , Black CB , Lockey JE . Radiology 2010 255 (3) 924-33 PURPOSE: To assess how early pleural and/or parenchymal abnormalities consistent with asbestos exposure could be ascertained and to identify factors associated with progression. MATERIALS AND METHODS: Informed consent was obtained under an institutional review board-approved protocol. Multiple sequential chest radiographs obtained between 1955 and 2004 in 84 workers exposed to amphiboles associated with vermiculite in the town of Libby, Montana, were studied. A panel of three NIOSH B readers reviewed each worker's longitudinal chest radiograph series in reverse chronologic order and achieved a consensus reading for each radiograph. Measures of exposure were compared between workers with and those without progression of parenchymal and pleural abnormalities. RESULTS: Because of the way the study was designed, all subjects had pleural (n = 84) and/or parenchymal (n = 26) abnormalities on the most recent chest radiograph. Compared with other investigations that used different methods, this investigation revealed shorter latency periods (defined as the interval between date of hire and date of earliest radiographic detection) for circumscribed pleural plaque (median latency, 8.6 years) and pleural calcification (median latency, 17.5 years). Pleural abnormalities progressed in 64 workers, while parenchymal abnormalities progressed in 14. No significant differences were found with regard to measures of exposure between workers with and those without progression. CONCLUSION: The latency period for the development of pleural plaques may be shorter than previously reported. Early plaques are subtle and may not be detectable except at retrospective review. |
Estimation of the biodynamic responses distributed at fingers and palm based on the total response of the hand-arm system
Dong RG , Rakheja S , McDowell TW , Welcome DE , Wu JZ . Int J Ind Ergon 2010 40 (4) 425-436 The major objective of this study is to develop a modeling method for estimating the biodynamic responses distributed at the fingers and the palm of the hand based on the total driving-point mechanical impedance of the entire hand-arm system. A five degrees-of-freedom (DOF) model with a set of constraints proposed in this study was used in the estimation. Three sets of mechanical impedance data measured at the fingers and palm of the hand were used to examine the validity of the proposed method. The estimated response distributed at the palm was consistent with the measured data even when the real part of the impedance alone was used in the modeling (coefficient of correlation, r2 = 0.902). Better agreement between the estimated and measured responses was obtained (r2 = 0.929) when the magnitude and phase of the total impedance, or the magnitude alone, were used in the modeling estimation. In each case, the estimated response distributed at the fingers was also reliably correlated with the experimental data (r2 = 0.726), but it was not as consistent with the experimental data as that distributed at the palm. The applications of the proposed method were also demonstrated using five other sets of reported experimental data. This study also demonstrated that the modeling method may be used to assess the quality of the experimental data in some cases. As a special application of the acceptable data identified here, a 2-DOF model was also defined for the construction of a hand-arm simulator for tool tests. The results of this study and the proposed modeling method are expected to contribute to the revision of ISO 10068 (1998). |
Disproportionated rosin dehydroabietic acid in neoprene surgical gloves
Siegel PD , Law BF , Fowler Jr JF , Fowler LM . Dermatitis 2010 21 (3) 157-9 BACKGROUND: Allergic contact dermatitis (ACD) is a well-recognized immune-mediated disease often associated with the use of vulcanization accelerator-containing latex and nitrile gloves. Potential contact allergens in neoprene (polychloroisoprene, polychloroprene) gloves have not been reported. OBJECTIVE: The objective was to analyze extracts of neoprene surgical and examination gloves for potential contact allergens. METHODS: Four different brands of neoprene-type gloves were purchased, and dichloromethane extracts were derivatized and assayed by gas chromatographic mass spectrometry. A latex surgical glove was used as a negative control. RESULTS: Chemical species consistent with the composition of disproportionated rosin (dehydroabietic acid [DHA], didehydroabietic acid, and other pimaric or isopimaric species) were identified in dichloromethane extracts of neoprene gloves. Levels of DHA, a type IV prohapten that can be air oxidized to an active allergen, ranged from 7 to 31 mg/g of glove. A leaching study of DHA was conducted, and small amounts of DHA leached from the glove materials into artificial sweat. DHA oxidation products were not observed in any of the gloves assayed. CONCLUSION: DHA exposure may occur from neoprene-type glove use, although a potential association with glove ACD has not been established. |
Visual performance for trip hazard detection when using incandescent and LED miner cap lamps
Sammarco JJ , Gallagher S , Reyes M . J Safety Res 2010 41 (2) 85-91 INTRODUCTION: Accident data for 2003-2007 indicate that slip, trip, and falls (STFs) are the second leading accident class (17.8%, n=2,441) of lost-time injuries in underground mining. Proper lighting plays a critical role in enabling miners to detect STF hazards in this environment. Often, the only lighting available to the miner is from a cap lamp worn on the miner's helmet. The focus of this research was to determine if the spectral content of light from light-emitting diode (LED) cap lamps enabled visual performance improvements for the detection of tripping hazards as compared to incandescent cap lamps that are traditionally used in underground mining. A secondary objective was to determine the effects of aging on visual performance. METHOD: The visual performance of 30 subjects was quantified by measuring each subject's speed and accuracy in detecting objects positioned on the floor both in the near field, at 1.83 meters, and far field, at 3.66 meters. Near field objects were positioned at 0 degrees and +/-20 degrees off axis, while far field objects were positioned at 0 degrees and +/-10 degrees off axis. Three age groups were designated: group A consisted of subjects 18 to 25 years old, group B consisted of subjects 40 to 50 years old, and group C consisted of subjects 51 years and older. RESULTS: Results of the visual performance comparison for a commercially available LED, a prototype LED, and an incandescent cap lamp indicate that the location of objects on the floor, the type of cap lamp used, and subject age all had significant influences on the time required to identify potential trip hazards. The LED-based cap lamps enabled detection times that were an average of 0.96 seconds faster compared to the incandescent cap lamp. Use of the LED cap lamps resulted in average detection times that were about 13.6% faster than those recorded for the incandescent cap lamp. 
The visual performance differences between the commercially available LED and prototype LED cap lamp were not statistically significant. IMPACT ON INDUSTRY: It can be inferred from these data that the spectral content from LED-based cap lamps could enable significant visual performance improvements for miners in the detection of trip hazards. |
Machine-related injuries in the US mining industry and priorities for safety research
Ruff T , Coleman P , Martini L . Int J Inj Contr Saf Promot 2010 18 (1) 1-10 Researchers at the National Institute for Occupational Safety and Health studied mining accidents that involved a worker entangled in, struck by, or in contact with machinery or equipment in motion. The motivation for this study came from the large number of severe accidents, i.e. accidents resulting in a fatality or permanent disability, that are occurring despite available interventions. Accident descriptions were taken from an accident database maintained by the United States Department of Labor, Mine Safety and Health Administration, and 562 accidents that occurred during 2000-2007 fit the search criteria. Machine-related accidents accounted for 41% of all severe accidents in the mining industry during this period. Machinery most often involved in these accidents included conveyors, rock bolting machines, milling machines and haulage equipment such as trucks and loaders. The most common activities associated with these accidents were operation of the machine and maintenance and repair. The current methods to safeguard workers near machinery include mechanical guarding around moving components, lockout/tagout of machine power during maintenance and backup alarms for mobile equipment. To decrease accidents further, researchers recommend additional efforts in the development of new control technologies, training materials and dissemination of information on best practices. |
The role of clinical toxicologists and poison control centers in public health
Sutter ME , Bronstein AC , Heard SE , Barthold CL , Lando J , Lewis LS , Schier JG . Am J Prev Med 2010 38 (6) 658-62 BACKGROUND: Poison control centers and clinical toxicologists serve many roles within public health; however, the degree to which these entities collaborate is unknown. PURPOSE: The objective of this survey was to identify successful collaborations of public health agencies with clinical toxicologists and poison control centers. Four areas including outbreak identification, syndromic surveillance, terrorism preparedness, and daily public health responsibilities amenable to poison control center resources were assessed. METHODS: An online survey was sent to the directors of poison control centers, state epidemiologists, and the most senior public health official in each state and selected major metropolitan areas. This survey focused on three areas: service, structure within the local or state public health system, and remuneration. Questions regarding remuneration and poison control center location within the public health structure were asked to assess whether these were critical factors of successful collaborations. Senior state and local public health officials were excluded because of a low response rate. The survey was completed in October 2007. RESULTS: A total of 111 respondents, 61 poison control centers and 50 state epidemiologists, were eligible for the survey. Sixty-nine (62%) of the 111 respondents completed and returned the survey. Thirty-three (54%) of the 61 poison control centers responded, and 36 of the 50 state epidemiologists (72%) responded. The most frequent collaborations were terrorism preparedness and epidemic illness reporting. Additional collaborations also exist. Important collaborations exist outside of remuneration or poison control centers being a formal part of the public health structure. 
CONCLUSIONS: Poison control centers have expanded their efforts to include outbreak identification, syndromic surveillance, terrorism preparedness, and daily public health responsibilities amenable to poison control center resources. Collaboration in these areas and others should be expanded. |
Transportation-related hazardous materials incidents and the role of poison control centers
Sutter ME , Hon SL , Chang AS , Schwartz MD , Algren DA , Schier JG , Lando J , Lewis LS . Am J Prev Med 2010 38 (6) 663-6 BACKGROUND: The Department of Transportation (DOT) mandates reporting of all serious hazardous materials incidents. Hazardous material exposures may result in secondary contamination of emergency departments, or delayed clinical effects. Poison control centers specialize in the management of patients exposed to toxic substances; however, poison control center notification is not required. PURPOSE: The objective was to determine the frequency of poison control center notification after serious hazardous materials incidents when patients were transported to a hospital. METHODS: A retrospective analysis was conducted of serious hazardous materials incidents as reported by DOT, matched with data from the American Association of Poison Control Centers from 2002 through 2006 that involved patient transport. Incidents were divided into four groups: those reported to a poison control center within 0-360 minutes of the incident; those reported within 361-1440 minutes of the incident; those reported within 1441-4320 minutes of the incident; and no poison control center notification. Analyses were performed on variables including date, time, substance, and time to notification. Data were received in January 2008. RESULTS: One hundred fifty-four serious incidents met inclusion criteria. One hundred thirty-four incidents (87%) occurred without poison control center notification. Poison control centers were notified in 20 incidents (12.9%); 15 incidents (9.7%) were reported within 0-360 minutes of the incident (M=115 minutes, range=5-359 minutes); four incidents (2.6%) were reported within 361-1440 minutes of the incident (M=652 minutes, range=566-750 minutes); and one incident (0.7%) was reported after 4320 minutes following the incident. 
CONCLUSIONS: Most serious hazardous materials incidents involving patient transport are not reported to poison control centers. Opportunities exist to increase utilization of poison control center resources without increasing financial burdens of the hazardous materials incident. |
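The time-to-notification grouping described in the study above can be sketched in code. This is an illustrative sketch only: the function name `notification_group` and the handling of unreported incidents are assumptions for demonstration, not part of the study's published methods.

```python
# Illustrative sketch of the time-to-notification bins described above.
# `minutes` is the delay from incident to poison control center
# notification; None marks an incident with no notification at all.
def notification_group(minutes):
    if minutes is None:
        return "no notification"   # 134 of 154 incidents fell here
    if minutes <= 360:
        return "0-360 min"         # 15 incidents (M=115 min)
    if minutes <= 1440:
        return "361-1440 min"      # 4 incidents (M=652 min)
    if minutes <= 4320:
        return "1441-4320 min"
    return ">4320 min"             # the single incident reported later
```

For example, the incident reported 115 minutes after occurrence falls in the "0-360 min" group, and an unreported incident falls in "no notification".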
Public health partnerships in medical toxicology education and practice
Schier JG , Rubin C , Schwartz MD , Thomas JD , Geller RJ , Morgan BW , McGeehin MA , Frumkin H . Am J Prev Med 2010 38 (6) 667-74 In December 2002, the medical toxicology sub-board, which consists of representatives from emergency medicine, preventive medicine, and pediatrics, released revised core content for medical toxicology, aiming to better meet the academic challenges imposed by the continually expanding knowledge base of medical toxicology. These challenges included the addition of relatively new areas of interest in medical toxicology, including population health, while simultaneously ensuring that a structural framework existed to accommodate future areas of interest. There is no evidence readily available to assess how well the educational curricula of existing fellowship programs are meeting these needs. In an effort to address this, the authors describe a medical toxicology fellowship program that consists of a partnership among the Emory University School of Medicine, the Georgia Poison Control Center, and the CDC, as well as the results of a reorganization of its academic curriculum that occurred in 2006. To the best of the authors' knowledge, this is the first published report describing such a curriculum redesign. Suggestions and potential resources proposed as enhancements for the public health-associated education of medical toxicology fellows are discussed. The authors also seek to initiate a discussion among programs about how to optimally meet the new challenges developed by the medical toxicology sub-board. |
Improving human health in the Arctic: the expanding role of the Arctic Council's Sustainable Development Working Group
Parkinson AJ . Int J Circumpolar Health 2010 69 (3) 304-13 Human health is now a critical component of the Arctic Council's sustainable development program. The newly formed Arctic Human Health Expert Group (AHHEG), a subsidiary body of experts within the Sustainable Development Working Group (SDWG), will focus on identifying human health priorities that will improve the health of Arctic residents; engage experts in the field to evaluate possible actions; strengthen co-operation and collaboration between Arctic Council working groups and other Arctic co-operatives; and promote the translation of research into actions that will improve the health of Arctic peoples. |
Semi-volatiles in mainstream smoke delivery from select charcoal-filtered cigarette brand variants
Hearn BA , Ding YS , Vaughan C , Zhang L , Polzin G , Caudill SP , Watson CH , Ashley DL . Tob Control 2010 19 (3) 223-30 BACKGROUND: It has been reported that charcoal added to cigarette filters selectively removes many of the more volatile chemicals, but it is not clear to what extent charcoal may reduce the delivery of important less volatile chemical constituents in mainstream cigarette smoke. METHODS: We analysed machine-derived mainstream smoke deliveries (under three smoking regimens) for variants of a charcoal-filtered cigarette commercially test-marketed in the USA, focusing on selected polycyclic aromatic hydrocarbons (PAHs), phenols and tobacco-specific nitrosamines (TSNAs). RESULTS: While charcoal-containing filters selectively removed lower molecular weight PAHs from mainstream smoke, they did not significantly remove the heavier and more toxic PAHs studied, such as benzo[a]pyrene, a known carcinogen. Likewise, charcoal-containing filters removed phenols and TSNAs from mainstream smoke in differing amounts depending on the compound, filter design and the smoking regimen. CONCLUSIONS: The addition of sufficient charcoal to cigarette filters is known to remove many volatile compounds and can potentially reduce deliveries of certain semi-volatile compounds under some machine smoking regimens. Less volatile compounds, with a significant portion in the particulate phase, are less available for selective filtration by charcoal-containing filters than the more volatile compounds that reside predominantly in the gas phase. |
Effect of differing levels of tobacco-specific nitrosamines in cigarette smoke on the levels of biomarkers in smokers
Ashley DL , O'Connor RJ , Bernert JT , Watson CH , Polzin GM , Jain RB , Hammond D , Hatsukami DK , Giovino GA , Cummings KM , McNeill A , Shahab L , King B , Fong GT , Zhang L , Xia Y , Yan X , McCraw JM . Cancer Epidemiol Biomarkers Prev 2010 19 (6) 1389-98 BACKGROUND: Smokers are exposed to significant doses of carcinogens, including tobacco-specific nitrosamines (TSNA). Previous studies have shown significant global differences in the levels of TSNAs in cigarette smoke because of the variation in tobacco blending and curing practices around the world. METHODS: Mouth-level exposure to 4-(methylnitrosamino)-1-(3-pyridyl)-1-butanone (NNK) measured in cigarette butts and urinary concentrations of its major metabolite 4-(methylnitrosamino)-1-(3-pyridyl)-1-butanol (NNAL) were examined among 126 daily smokers in four countries over a 24-hour study period. RESULTS: As mouth-level exposure of NNK increased, the urinary NNAL increased even after adjustment for other covariates (beta = 0.46, P = 0.004). The relationship between mouth-level exposure to nicotine and its salivary metabolite, cotinine, was not statistically significant (beta = 0.29, P = 0.057), likely because of the very limited range of differences in mouth-level nicotine exposure in this population. CONCLUSIONS: We have shown a direct association between the 24-hour mouth-level exposure of NNK resulting from cigarette smoking and the concentration of its primary metabolite, NNAL, in the urine of smokers. Internal dose concentrations of urinary NNAL are significantly lower in smokers in countries that have lower TSNA levels in cigarettes such as Canada and Australia in contrast to countries that have high levels of these carcinogens in cigarettes, such as the United States. Impact: Lowering the levels of NNK in the mainstream smoke of cigarettes through the use of specific tobacco types and known curing practices can significantly affect the exposure of smokers to this known carcinogen. |
Content Index (Archived Edition)
- Communicable Diseases
- Environmental Health
- Epidemiology and Surveillance
- Food Safety
- Genetics and Genomics
- Health Behavior and Risk
- Health Communication and Education
- Healthcare Associated Infections
- Immunity and Immunization
- Injury and Violence
- Laboratory Sciences
- Maternal and Child Health
- Nutritional Sciences
- Occupational Safety and Health
- Occupational Safety and Health - Mining
- Public Health Leadership and Management
- Substance Use and Abuse
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.