Discordance between self-report and genetic confirmation of sickle cell disease status in African-American adults.
Bean CJ , Hooper WC , Ellingsen D , Debaun MR , Sonderman J , Blot WJ . Public Health Genomics 2014 17 (3) 169-72 BACKGROUND: Sickle cell disease (SCD) is an autosomal recessive genetic disorder, with persons heterozygous for the mutation said to have the sickle cell trait (SCT). Serious adverse effects are mainly limited to those with SCD, but the distinction between disease and trait is not always clear to the general population. We sought to determine the accuracy of self-reported SCD when compared to genetic confirmation. METHODS: From stratified random samples of Southern Community Cohort Study participants, we sequenced the beta-globin gene in 51 individuals reporting SCD and 75 individuals reporting no SCD. RESULTS: The median age of the group selected was 53 years (range, 40-69); 29% were male. Only 5.9% of the 51 individuals reporting SCD were confirmed by sequencing, with the remaining 62.7% having SCT, 5.9% having hemoglobin C trait, and 25.5% having neither SCD nor trait. Sequencing results of the 75 individuals reporting no SCD, by contrast, were 100% concordant with self-report. CONCLUSIONS: Misreporting of SCD is common in an older adult population, with most persons reporting SCD in this study being carriers of the trait and a sizeable minority completely unaffected. The results from this pilot survey support the need for increased efforts to raise community awareness and knowledge of SCD. |
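As a quick arithmetic check, the reported percentages are consistent with whole-number counts among the 51 self-reported SCD cases. The counts below (3, 32, 3, and 13) are back-calculated from the percentages and are not stated in the abstract itself:

```python
# Back-of-envelope check of the self-report discordance percentages.
# Counts are inferred from the reported percentages (assumption);
# the abstract reports only the percentages.
n_reported_scd = 51
counts = {
    "confirmed SCD": 3,
    "sickle cell trait": 32,
    "hemoglobin C trait": 3,
    "neither disease nor trait": 13,
}

def pct(k, n):
    """Percentage rounded to one decimal place, as reported."""
    return round(100 * k / n, 1)

shares = {label: pct(k, n_reported_scd) for label, k in counts.items()}
# shares reproduces 5.9, 62.7, 5.9, and 25.5
```

The four inferred counts sum to 51, and each reproduces the published percentage exactly, which supports the back-calculation.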
Peripheral nerve function and lower extremity muscle power in older men
Ward RE , Caserotti P , Faulkner K , Boudreau RM , Zivkovic S , Lee C , Goodpaster BH , Cawthon PM , Newman AB , Cauley JA , Strotmeyer ES . Arch Phys Med Rehabil 2014 95 (4) 726-33 OBJECTIVE: To assess whether sensorimotor peripheral nerve function is associated with muscle power in community-dwelling older men. DESIGN: Longitudinal cohort study with 2.3+/-0.3 years of follow-up. SETTING: One clinical site. PARTICIPANTS: Participants (n=372; mean age +/- SD, 77.2+/-5.1y; 99.5% white; body mass index, 27.9+/-3.7kg/m(2); power, 1.88+/-0.6W/kg) at 1 site of the Osteoporotic Fractures in Men Study (N=5994). INTERVENTIONS: Not applicable. MAIN OUTCOME MEASURES: A nerve function ancillary study was performed 4.6+/-0.4 years after baseline. Muscle power was measured using a power rig. Peroneal motor nerve conduction amplitude, distal motor latency, and mean f-wave latency were measured. Sensory nerve function was assessed using 10-g and 1.4-g monofilaments and sural sensory nerve conduction amplitude and distal latency. Peripheral neuropathy symptoms at the leg and feet were assessed by self-report. RESULTS: After adjustments for age, height, and total body lean and fat mass, 1 SD lower motor (beta=-.07, P<.05) and sensory amplitude (beta=-.09, P<.05) and 1.4-g (beta=-.11, P<.05) and 10-g monofilament insensitivity (beta=-.17, P<.05) were associated with lower muscle power/kg. Compared with the effect of age on muscle power (beta per year, -.05; P<.001), this was equivalent to aging 1.4 years for motor amplitude, 1.8 years for sensory amplitude, 2.2 years for 1.4-g monofilament detection, and 3.4 years for 10-g detection. Baseline 1.4-g monofilament detection predicted a greater decline in muscle power/kg. Short-term change in nerve function was not associated with concurrent short-term change in muscle power/kg. 
CONCLUSIONS: Worse sensory and motor nerve function was associated with lower muscle power/kg and is likely important for impaired muscle function in older men. Monofilament sensitivity was associated with a greater decline in muscle power/kg, and screening may identify an early risk for muscle function decline in late life, which has implications for disability. |
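The "equivalent years of aging" figures above follow from dividing each nerve-function coefficient by the coefficient for one year of age (-0.05 W/kg per year). A minimal sketch of that arithmetic:

```python
# Each "equivalent aging" value is the nerve-function beta divided by
# the beta for one year of age (both in muscle power W/kg units).
beta_age = -0.05  # change in muscle power/kg per year of age

betas = {
    "motor amplitude (1 SD lower)": -0.07,
    "sensory amplitude (1 SD lower)": -0.09,
    "1.4-g monofilament insensitivity": -0.11,
    "10-g monofilament insensitivity": -0.17,
}

years_equivalent = {k: round(b / beta_age, 1) for k, b in betas.items()}
# reproduces the reported 1.4, 1.8, 2.2, and 3.4 years
```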
Allergic sensitization in Canadian chronic rhinosinusitis patients
Green BJ , Beezhold DH , Gallinger Z , Barron CS , Melvin R , Bledsoe TA , Kashon ML , Sussman GL . Allergy Asthma Clin Immunol 2014 10 (1) 15 BACKGROUND: Chronic rhinosinusitis (CRS) is a societal burden and cause of morbidity in Canada; however, the prevalence of allergic sensitization in Canadian CRS patients has remained poorly characterized. OBJECTIVE: In this study, we used skin prick test (SPT) and specific immunoglobulin E (sIgE) and G (sIgG) titers to regionally relevant allergen sources in order to determine whether allergic sensitization is more prevalent in CRS patients compared to chronic idiopathic urticaria (CIU) control patients. METHODS: One hundred fifty-eight subjects (19-70 years of age) were recruited into the study: 101 subjects had a confirmed diagnostic history of CRS, and 57 subjects with a clinical diagnosis of CIU were recruited as controls. Enrolled subjects underwent SPT to a panel of perennial and seasonal allergens, and sIgE titers were quantified to selected environmental allergen mixes (grass, mold, and tree species) using Phadia ImmunoCAP. sIgG was additionally quantified to Alternaria alternata, Aspergillus versicolor, Cladosporium herbarum, and Stachybotrys atra. Differences between CRS and control CIU patient SPT and serological data were examined by chi-squared analysis and analysis of variance. RESULTS: Reactivity to at least one SPT extract occurred in 73% of CRS patients. Positive SPT reactivity to A. alternata (odds ratio [OR]: 4.34, 95% confidence interval [CI]: 1.57, 12.02), cat (OR: 3.23, 95% CI: 1.16, 9.02), and ragweed (OR: 2.31, 95% CI: 1.02, 5.19) extracts was more prevalent in patients with CRS (p < 0.05). Although dust mite and timothy grass sensitization approached statistical significance in the chi-squared analysis of SPT data, other common perennial and seasonal allergens were not associated with CRS.
No statistically significant differences were observed between mean sIgE and sIgG titers in CRS and control patients. CONCLUSIONS: This study supports previous data suggesting that A. alternata sensitization is associated with CRS; however, these findings additionally highlight the contribution of other regionally important allergens, including cat and ragweed. |
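The odds ratios above are the standard 2x2-table kind. A minimal sketch with a Wald confidence interval, using hypothetical cell counts chosen for illustration only (the study's actual counts are not given in the abstract):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% Wald CI for a 2x2 table:
    a = sensitized cases, b = non-sensitized cases,
    c = sensitized controls, d = non-sensitized controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical: 30 of 101 CRS patients and 5 of 57 CIU controls
# SPT-positive to a given allergen extract.
or_, lo, hi = odds_ratio_ci(30, 71, 5, 52)
# yields an OR near 4.4 with a CI excluding 1 (i.e. "significant")
```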
Association between upper digestive tract microbiota and cancer-predisposing states in the esophagus and stomach
Yu G , Gail MH , Shi J , Klepac-Ceraj V , Paster BJ , Dye BA , Wang GQ , Wei WQ , Fan JH , Qiao YL , Dawsey SM , Freedman ND , Abnet CC . Cancer Epidemiol Biomarkers Prev 2014 23 (5) 735-41 BACKGROUND: The human upper digestive tract microbial community (microbiota) is not well characterized, and few studies have explored how it relates to human health. We examined the relationship between upper digestive tract microbiota and two cancer-predisposing states, serum pepsinogen I/pepsinogen II ratio (PGI/II; a predictor of gastric cancer risk) and esophageal squamous dysplasia (ESD; the precursor lesion of esophageal squamous cell carcinoma, ESCC), in a cross-sectional design. METHODS: The Human Oral Microbe Identification Microarray was used to test for the presence of 272 bacterial species in 333 upper digestive tract samples from a Chinese cancer screening cohort. Serum PGI and PGII were determined by ELISA. ESD was determined by chromoendoscopy with biopsy. RESULTS: Lower microbial richness (number of bacterial genera per sample) was significantly associated with lower PGI/II ratio (P = 0.034) and the presence of ESD (P = 0.018). We conducted principal component (PC) analysis on a beta-diversity matrix (pairwise differences in microbiota) and observed significant correlations between PC1 and PGI/II and between PC3 and PGI/II (P = 0.004 and 0.009, respectively), and between PC1 and ESD (P = 0.003). CONCLUSIONS: Lower microbial richness in the upper digestive tract was independently associated with both cancer-predisposing states in the esophagus and stomach (presence of ESD and lower PGI/II). IMPACT: These novel findings suggest that the upper digestive tract microbiota may play a role in the etiology of chronic atrophic gastritis and ESD, and therefore in the development of gastric and esophageal cancers. |
Use of drug-susceptibility testing for management of drug-resistant tuberculosis, Thailand, 2004-2008
Lam E , Nateniyom S , Whitehead S , Anuwatnonthakate A , Monkongdee P , Kanphukiew A , Inyaphong J , Sitti W , Chiengsorn N , Moolphate S , Kavinum S , Suriyon N , Limsomboon P , Danyutapolchai J , Sinthuwattanawibool C , Podewils LJ . Emerg Infect Dis 2014 20 (3) 408-16 In 2004, routine use of culture and drug-susceptibility testing (DST) was implemented for persons in 5 Thailand provinces with a diagnosis of tuberculosis (TB). To determine if DST results were being used to guide treatment, we conducted a retrospective chart review for patients with rifampin-resistant or multidrug-resistant (MDR) TB during 2004-2008. A total of 208 patients were identified. Median time from clinical sample collection to physician review of DST results was 114 days. Only 5.8% of patients with MDR TB were empirically prescribed an appropriate regimen; an additional 31.3% received an appropriate regimen after DST results were reviewed. Most patients with rifampin-resistant or MDR TB had successful treatment outcomes. Patients with HIV co-infection and patients who were unmarried or had received category II treatment before DST results were reviewed had less successful outcomes. Overall, review of available DST results was delayed, and results were rarely used to improve treatment. |
Varicella outbreak in a daycare: challenges and opportunities for preventing varicella outbreaks in this setting
Daskalaki I , Thermitus R , Perella D , Viner K , Spells N , Mohanty S , Lopez A , Johnson C . Pediatr Infect Dis J 2014 33 (4) 420-2 As a result of single-dose varicella vaccination, daycare outbreaks have become rare. We investigated a daycare outbreak resulting from a misdiagnosed varicella case in an unvaccinated attendee. Of 25 attendees aged 12-32 months without evidence of immunity, 7 (28%) were unvaccinated due to religious/philosophical opposition or recent 1st birthday. Single-dose vaccination reduced disease by 92% compared with no vaccination. |
Pharmacokinetic interaction of rifapentine and raltegravir in healthy volunteers
Weiner M , Egelund EF , Engle M , Kiser M , Prihoda TJ , Gelfond JA , Mac Kenzie W , Peloquin CA . J Antimicrob Chemother 2014 69 (4) 1079-85 OBJECTIVES: Latent tuberculosis infection and tuberculosis disease are prevalent worldwide. However, antimycobacterial rifamycins have drug interactions with many antiretroviral drugs. We evaluated the effect of rifapentine on the pharmacokinetic properties of raltegravir. METHODS: In this open-label, fixed-sequence, three-period study, 21 healthy volunteers were given: raltegravir alone (400 mg every 12 h for 4 days) on days 1-4 of Period 1; rifapentine (900 mg once weekly for 3 weeks) on days 1, 8 and 15 of Period 2 and raltegravir (400 mg every 12 h for 4 days) on days 12-15 of Period 2; and rifapentine (600 mg once daily for 10 scheduled doses) on days 1, 4-8 and 11-14 of Period 3 and raltegravir (400 mg every 12 h for 4 days) on days 11-14 of Period 3. Plasma raltegravir concentrations were measured. ClinicalTrials.gov database: NCT00809718. RESULTS: In 16 subjects who completed the study, coadministration of raltegravir with rifapentine (900 mg once weekly; Period 2) compared with raltegravir alone resulted in the geometric mean of the raltegravir AUC from 0 to 12 h (AUC0-12) being increased by 71%; the peak concentration increased by 89% and the trough concentration decreased by 12%. Coadministration of raltegravir with rifapentine in Period 3 did not change the geometric mean of the raltegravir AUC0-12 or the peak concentration, but it decreased the trough concentration by 41%. Raltegravir coadministered with rifapentine was generally well tolerated. CONCLUSIONS: The increased raltegravir exposure observed with once-weekly rifapentine was safe and tolerable. Once-weekly rifapentine can be used with raltegravir to treat latent tuberculosis infection in patients who are infected with HIV. |
Effects of early versus delayed initiation of antiretroviral treatment on clinical outcomes of HIV-1 infection: results from the phase 3 HPTN 052 randomised controlled trial
Grinsztejn B , Hosseinipour MC , Ribaudo HJ , Swindells S , Eron J , Chen YQ , Wang L , Ou SS , Anderson M , McCauley M , Gamble T , Kumarasamy N , Hakim JG , Kumwenda J , Pilotto JH , Godbole SV , Chariyalertsak S , de Melo MG , Mayer KH , Eshleman SH , Piwowar-Manning E , Makhema J , Mills LA , Panchia R , Sanne I , Gallant J , Hoffman I , Taha TE , Nielsen-Saines K , Celentano D , Essex M , Havlir D , Cohen MS . Lancet Infect Dis 2014 14 (4) 281-90 BACKGROUND: Use of antiretroviral treatment for HIV-1 infection has decreased AIDS-related morbidity and mortality and prevents sexual transmission of HIV-1. However, the best time to initiate antiretroviral treatment to reduce progression of HIV-1 infection or non-AIDS clinical events is unknown. We reported previously that early antiretroviral treatment reduced HIV-1 transmission by 96%. We aimed to compare the effects of early and delayed initiation of antiretroviral treatment on clinical outcomes. METHODS: The HPTN 052 trial is a randomised controlled trial done at 13 sites in nine countries. We enrolled HIV-1-serodiscordant couples to the study and randomly allocated them to either early or delayed antiretroviral treatment by use of permuted block randomisation, stratified by site. Random assignment was unblinded. The HIV-1-infected member of every couple initiated antiretroviral treatment either on entry into the study (early treatment group) or after a decline in CD4 count or with onset of an AIDS-related illness (delayed treatment group). Primary events were AIDS clinical events (WHO stage 4 HIV-1 disease, tuberculosis, and severe bacterial infections) and the following serious medical conditions unrelated to AIDS: serious cardiovascular or vascular disease, serious liver disease, end-stage renal disease, new-onset diabetes mellitus, and non-AIDS malignant disease. Analysis was by intention-to-treat. This trial is registered with ClinicalTrials.gov, number NCT00074581. 
FINDINGS: 1763 people with HIV-1 infection and a serodiscordant partner were enrolled in the study; 886 were assigned to the early treatment group and 877 to the delayed treatment group (two individuals were excluded from this group after randomisation). Median CD4 counts at randomisation were 442 (IQR 373-522) cells per μL in patients assigned to the early treatment group and 428 (357-522) cells per μL in those allocated delayed antiretroviral treatment. In the delayed group, antiretroviral treatment was initiated at a median CD4 count of 230 (IQR 197-249) cells per μL. Primary clinical events were reported in 57 individuals assigned to early treatment initiation versus 77 people allocated to delayed antiretroviral treatment (hazard ratio 0.73, 95% CI 0.52-1.03; p=0.074). New-onset AIDS events were recorded in 40 participants assigned to early antiretroviral treatment versus 61 allocated delayed initiation (0.64, 0.43-0.96; p=0.031), tuberculosis developed in 17 versus 34 patients, respectively (0.49, 0.28-0.89; p=0.018), and primary non-AIDS events were rare (12 in the early group vs nine with delayed treatment). In total, 498 primary and secondary outcomes occurred in the early treatment group (incidence 24.9 per 100 person-years, 95% CI 22.5-27.5) versus 585 in the delayed treatment group (29.2 per 100 person-years, 26.5-32.1; p=0.025). 26 people died: 11 who were allocated to early antiretroviral treatment and 15 who were assigned to the delayed treatment group. INTERPRETATION: Early initiation of antiretroviral treatment delayed the time to AIDS events and decreased the incidence of primary and secondary outcomes. The clinical benefits recorded, combined with the striking reduction in HIV-1 transmission risk previously reported, provide strong support for earlier initiation of antiretroviral treatment. FUNDING: US National Institute of Allergy and Infectious Diseases. |
Incidence and predictors of first line antiretroviral regimen modification in western Kenya
Inzaule S , Otieno J , Kalyango J , Nafisa L , Kabugo C , Nalusiba J , Kwaro D , Zeh C , Karamagi C . PLoS One 2014 9 (4) e93106 BACKGROUND: Limited antiretroviral treatment regimens in resource-limited settings require long-term sustainability of patients on the few available options. We evaluated the incidence and predictors of combined antiretroviral treatment (cART) modifications in an outpatient cohort of 955 patients who initiated cART between January 2009 and January 2011 in western Kenya. METHODS: cART modification was defined as either first-time single drug substitution or switch. Incidence rates were determined by Poisson regression, and risk factor analysis was assessed using multivariate Cox regression modeling. RESULTS: Over a median follow-up period of 10.7 months, 178 (18.7%) patients modified regimens (incidence rate [IR]: 18.6 per 100 person-years [95% CI: 16.2-21.8]). Toxicity was the most commonly cited reason (66.3%). In an adjusted multivariate Cox piecewise regression model, WHO disease stage III/IV (aHR: 1.82, 95% CI: 1.25-2.66), stavudine (d4T) use (aHR: 2.21, 95% CI: 1.49-3.30) and increasing age (aHR: 1.02, 95% CI: 1.00-1.04) were associated with increased risk of treatment modification within the first year post-cART. Zidovudine (AZT) and tenofovir (TDF) use had a reduced risk of modification (aHR: 0.60, 95% CI: 0.38-0.96 and aHR: 0.51, 95% CI: 0.29-0.91, respectively). Beyond one year of treatment, d4T use (aHR: 2.75, 95% CI: 1.25-6.05), baseline CD4 counts ≤350 cells/mm3 (aHR: 2.45, 95% CI: 1.14-5.26), increasing age (aHR: 1.05, 95% CI: 1.02-1.07) and high baseline weight >60 kg (aHR: 2.69, 95% CI: 1.58-4.59) were associated with risk of cART modification. CONCLUSIONS: Early treatment initiation at higher CD4 counts and avoiding d4T use may reduce treatment modification and subsequently improve sustainability of patients on the available limited options. |
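The crude incidence rate above is events per person-time. A minimal sketch with a normal-approximation CI on the log scale (the paper used Poisson regression; this is a simpler illustration, and the total person-time of roughly 957 person-years is back-calculated from the reported rate, not taken from the paper):

```python
import math

def incidence_rate(events, person_years, per=100):
    """Crude incidence rate per `per` person-years with an
    approximate 95% CI (log-scale normal approximation)."""
    rate = per * events / person_years
    se_log = 1 / math.sqrt(events)  # SE of log(rate) for Poisson counts
    lo = rate * math.exp(-1.96 * se_log)
    hi = rate * math.exp(1.96 * se_log)
    return rate, lo, hi

# 178 modifications; ~957 person-years inferred from the reported
# 18.6 per 100 person-years (assumption, not stated in the abstract).
rate, lo, hi = incidence_rate(178, 957)
# recovers a rate near 18.6 with a CI close to the reported 16.2-21.8
```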
Influenza-like illness and presenteeism among school employees
de Perio MA , Wiegand DM , Brueck SE . Am J Infect Control 2014 42 (4) 450-2 We determined the prevalence of influenza-like illness (ILI) among employees of a suburban Ohio school district. In a survey of 412 of 841 employees (49%), 120 (29%) reported ILI symptoms during the school year, and 92 (77%) reported working while ill. Age ≥50 years and asthma were significantly associated with reporting of ILI symptoms. Encouraging school employees to receive the seasonal influenza vaccine and to stay home when ill should be part of a comprehensive influenza prevention strategy. |
Ciprofloxacin resistance and gonorrhea incidence rates in 17 cities, United States, 1991-2006
Chesson HW , Kirkcaldy RD , Gift TL , Owusu-Edusei K Jr , Weinstock HS . Emerg Infect Dis 2014 20 (4) 612-9 Antimicrobial drug resistance can hinder gonorrhea prevention and control efforts. In this study, we analyzed historical ciprofloxacin resistance data and gonorrhea incidence data to examine the possible effect of antimicrobial drug resistance on gonorrhea incidence at the population level. We analyzed data from the Gonococcal Isolate Surveillance Project and city-level gonorrhea incidence rates from surveillance data for 17 cities during 1991-2006. We found a strong positive association between ciprofloxacin resistance and gonorrhea incidence rates at the city level during this period. Their association was consistent with predictions of mathematical models in which resistance to treatment can increase gonorrhea incidence rates through factors such as increased duration of infection. These findings highlight the possibility of future increases in gonorrhea incidence caused by emerging cephalosporin resistance. |
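The abstract's mechanism, that resistance prolonging the duration of infection can raise incidence, can be illustrated with a minimal SIS-type equilibrium sketch. This is illustrative only, not the authors' model, and the parameter values are hypothetical:

```python
# Minimal SIS sketch: resistance that prolongs infection (larger mean
# duration D) raises R0 = beta_c * D and hence equilibrium prevalence.
# beta_c lumps contact rate and per-contact transmission probability.
def sis_equilibrium(beta_c, duration):
    """Equilibrium prevalence and per-capita incidence (per unit time)
    in a simple SIS model; returns (0, 0)-like values when R0 <= 1."""
    r0 = beta_c * duration
    prevalence = max(0.0, 1 - 1 / r0)
    incidence = prevalence / duration  # inflow equals outflow at equilibrium
    return prevalence, incidence

# Hypothetical parameters: same transmission, longer infectious period
# under resistance to treatment.
susceptible_tx = sis_equilibrium(beta_c=3.0, duration=0.4)
resistant_tx = sis_equilibrium(beta_c=3.0, duration=0.6)
# both prevalence and incidence are higher with the longer duration
```

For modest R0 values like these, lengthening the infectious period raises both equilibrium prevalence and incidence, which is the qualitative prediction the abstract invokes.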
Exposure to Borrelia burgdorferi and other tick-borne pathogens in Gettysburg National Military Park, south-central Pennsylvania, 2009
Han GS , Stromdahl EY , Wong D , Weltman AC . Vector Borne Zoonotic Dis 2014 14 (4) 227-33 Since 1998, Lyme disease cases have increased in south-central Pennsylvania, which includes Gettysburg National Military Park (NMP). Limited information is available about tick populations or pathogens in this area, and no data regarding frequency of tick bites or prevention measures among Gettysburg NMP employees are available. To address these gaps, ticks were collected, classified, and replaced (to minimize disruptions to tick populations) at two sites within Gettysburg NMP during April-September 2009, across eight nonremoval samplings. On two additional occasions during May and June 2009, ticks were collected and removed from the two original sites plus 10 additional sites and tested for tick-borne pathogens by using PCR. A self-administered anonymous survey of Gettysburg NMP employees was conducted to determine knowledge, attitudes, and practices regarding tick-borne diseases. Peak Ixodes scapularis nymph populations were observed during May-July. Of 115 I. scapularis ticks tested, 21% were infected with Borrelia burgdorferi, including 18% of 74 nymphs and 27% of 41 adults; no other pathogen was identified. The entomologic risk index was calculated at 1.3 infected nymphs/hour. An adult and nymph Amblyomma americanum were also found, representing the first confirmed field collection of this tick in Pennsylvania, but no pathogens were detected. The survey revealed that most park employees believed Lyme disease was a problem at Gettysburg NMP and that they frequently found ticks on their skin and clothing. However, use of personal preventive measures was inconsistent, and 6% of respondents reported contracting Lyme disease while employed at Gettysburg NMP. These findings indicate a need to improve surveillance for tick bites among employees and enhance prevention programs for park staff and visitors. |
Investigating a crow die-off in January-February 2011 during the introduction of a new clade of highly pathogenic avian influenza virus H5N1 into Bangladesh
Khan SU , Berman L , Haider N , Gerloff N , Rahman MZ , Shu B , Rahman M , Dey TK , Davis TC , Das BC , Balish A , Islam A , Teifke JP , Zeidner N , Lindstrom S , Klimov A , Donis RO , Luby SP , Shivaprasad HL , Mikolon AB . Arch Virol 2014 159 (3) 509-18 We investigated unusual crow mortality in Bangladesh during January-February 2011 at two sites. Crows of two species, Corvus splendens and C. macrorhynchos, were found sick and dead during the outbreaks. In selected crow roosts, morbidity was ~1% and mortality was ~4% during the investigation. Highly pathogenic avian influenza virus H5N1 clade 2.3.2.1 was isolated from dead crows. All isolates were closely related to A/duck/India/02CA10/2011 (H5N1), with 99.8% nucleotide sequence identity in the HA gene, and to A/crow/Bangladesh/11rs1984-15/2011 (H5N1), with 99% identity. The phylogenetic cluster of Bangladesh viruses suggested a common ancestor with viruses found in poultry from India, Myanmar and Nepal. Histopathological changes and immunohistochemistry staining in brain, pancreas, liver, heart, kidney, bursa of Fabricius, rectum, and cloaca were consistent with influenza virus infection. Through our limited investigation of domesticated birds near the crow roosts, we did not identify any samples that tested positive for influenza virus A/H5N1. However, environmental samples collected from live-bird markets near an outbreak site during the month of the outbreaks tested very weakly positive for influenza virus A/H5N1 in clade 2.3.2.1-specific rRT-PCR. Continued surveillance of wild and domestic birds may identify the evolution of new avian influenza viruses and associated public-health risks. |
Development of indicators for measuring outcomes of water safety plans
Lockhart G , Oswald WE , Hubbard B , Medlin E , Gelting RJ . J Water Sanit Hyg Dev 2014 4 (1) 171-181 Water safety plans (WSPs) are endorsed by the World Health Organization as the most effective method of protecting a water supply. With the increase in WSPs worldwide, several valuable resources have been developed to assist practitioners in the implementation of WSPs, yet there is still a need for a practical and standardized method of evaluating WSP effectiveness. In 2012, the Centers for Disease Control and Prevention (CDC) published a conceptual framework for the evaluation of WSPs, presenting four key outcomes of the WSP process: institutional, operational, financial and policy change. In this paper, we seek to operationalize this conceptual framework by providing a set of simple and practical indicators for assessing WSP outcomes. Using CDC's WSP framework as a foundation and incorporating various existing performance monitoring indicators for water utilities, we developed a set of approximately 25 indicators of institutional, operational, financial and policy change within the WSP context. These outcome indicators hold great potential for the continued implementation and expansion of WSPs worldwide. Having a defined framework for evaluating a WSP's effectiveness, along with a set of measurable indicators by which to carry out that evaluation, will help implementers assess key WSP outcomes internally, as well as benchmark their progress against other WSPs in their region and globally. |
Notes from the field: multistate outbreak of listeriosis linked to soft-ripened cheese - United States, 2013
Choi MJ , Jackson KA , Medu C , Beal J , Rigdon CE , Cloyd TC , Forstner MJ , Ball J , Bosch S , Bottichio L , Cantu V , Melka DC , Ishow W , Slette S , Irvin K , Wise M , Tarr C , Mahon B , Smith KE , Silk BJ . MMWR Morb Mortal Wkly Rep 2014 63 (13) 294-5 On June 27, 2013, the Minnesota Department of Health notified CDC of two patients with invasive Listeria monocytogenes infections (listeriosis) whose clinical isolates had indistinguishable pulsed-field gel electrophoresis (PFGE) patterns. A query of PulseNet, the national molecular subtyping network for foodborne disease surveillance, identified clinical and environmental isolates from other states. On June 28, CDC learned from the Food and Drug Administration's Coordinated Outbreak Response and Evaluation Network that environmental isolates indistinguishable from those of the two patients had been collected from Crave Brothers Farmstead Cheese during 2010-2011. An outbreak-related case was defined as isolation of L. monocytogenes with the outbreak PFGE pattern from an anatomic site that is normally sterile (e.g., blood or cerebrospinal fluid), or from a product of conception, with an isolate upload date during May 20-June 28, 2013. As of June 28, five cases were identified in four states (Minnesota, two cases; Illinois, Indiana, and Ohio, one each). Median age of the five patients was 58 years (range: 31-67 years). Four patients were female, including one who was pregnant at the time of infection. All five were hospitalized. One death and one miscarriage were reported. |
Mediation effects of problem drinking and marijuana use on HIV sexual risk behaviors among childhood sexually abused South African heterosexual men
Icard LD , Jemmott JB 3rd , Teitelman A , O'Leary A , Heeren GA . Child Abuse Negl 2014 38 (2) 234-42 HIV/AIDS prevalence in South Africa is one of the highest in the world, with heterosexual transmission predominantly driving the epidemic. The goal of this study was to examine whether marijuana use and problem drinking mediate the relationship between histories of childhood sexual abuse (CSA) and HIV risk behaviors among heterosexual men. Participants were 1181 Black men aged 18-45 from randomly selected neighborhoods in Eastern Cape Province, South Africa. Audio computer-assisted self-interviewing was used to assess self-reported childhood sexual abuse, problem drinking, marijuana (dagga) use, and HIV sexual transmission behavior with steady and casual partners. Data were analyzed using multiple mediational modeling. There was more support for problem drinking than marijuana use as a mediator. Findings suggest that problem drinking and marijuana use mediate HIV sexual risk behaviors in men with histories of CSA. Focusing on men with histories of CSA, and their use of marijuana and alcohol, may be particularly useful for designing strategies to reduce HIV sexual transmission in South Africa. |
Medicare reimbursement attributable to catheter-associated urinary tract infection in the inpatient setting: a retrospective cohort analysis
Yi SH , Baggs J , Gould CV , Scott RD 2nd , Jernigan JA . Med Care 2014 52 (6) 469-78 BACKGROUND: Most catheter-associated urinary tract infections (CAUTIs) are considered preventable and thus a potential target for health care quality improvement and cost savings. OBJECTIVES: We sought to estimate excess Medicare reimbursement, length of stay, and inpatient death associated with CAUTI among hospitalized beneficiaries. RESEARCH DESIGN: Using a retrospective cohort design with linked Medicare inpatient claims and National Healthcare Safety Network data from 2009, we compared Medicare reimbursement between Medicare beneficiaries with and without CAUTIs. SUBJECTS: Fee-for-service Medicare beneficiaries aged 65 years or older with continuous coverage of parts A (hospital insurance) and B (supplementary medical insurance). RESULTS: We found that beneficiaries with CAUTI had higher median Medicare reimbursement [intensive care unit (ICU): $8548; non-ICU: $1479] and length of stay (ICU: 8.1 d; non-ICU: 3.6 d) compared with those without CAUTI, controlling for potential confounding factors. Odds of inpatient death were higher among beneficiaries with versus without CAUTI only among those with an ICU stay (odds ratio 1.37). CONCLUSIONS: Beneficiaries with CAUTI had increased Medicare reimbursement and length of stay compared with those without CAUTI after adjusting for potential confounders. |
A polymicrobial outbreak of surgical site infections following cardiac surgery at a community hospital in Florida, 2011-2012
Nguyen DB , Gupta N , Abou-Daoud A , Klekamp BG , Rhone C , Winston T , Hedberg T , Scuteri A , Evans C , Jensen B , Moulton-Meissner H , Torok T , Berrios-Torres SI , Noble-Wang J , Kallen A . Am J Infect Control 2014 42 (4) 432-5 We describe an outbreak of 22 sternal surgical site infections following cardiac surgery, including 4 Gordonia infections. Possible operating room environmental contamination and suboptimal infection control practices regarding scrub attire may have contributed to the outbreak. |
Safety of diphtheria, tetanus, acellular pertussis and inactivated poliovirus (DTaP-IPV) vaccine
Daley MF , Yih WK , Glanz JM , Hambidge SJ , Narwaney KJ , Yin R , Li L , Nelson JC , Nordin JD , Klein NP , Jacobsen SJ , Weintraub E . Vaccine 2014 32 (25) 3019-24 BACKGROUND: In 2008, a diphtheria, tetanus, acellular pertussis, and inactivated poliovirus combined vaccine (DTaP-IPV) was licensed for use in children 4 through 6 years of age. While pre-licensure studies did not demonstrate significant safety concerns, the number vaccinated in these studies was not sufficient to examine the risk of uncommon but serious adverse events. OBJECTIVE: To assess the risk of serious adverse events following DTaP-IPV vaccination. METHODS: The study was conducted from January 2009 through September 2012 in the Vaccine Safety Datalink (VSD) project. In the VSD, electronic vaccination and encounter data are updated and aggregated weekly as part of ongoing surveillance activities. Based on previous reports and biologic plausibility, eight potential adverse events were monitored: meningitis/encephalitis; seizures; stroke; Guillain-Barre syndrome; Stevens-Johnson syndrome; anaphylaxis; serious allergic reactions other than anaphylaxis; and serious local reactions. Adverse event rates in DTaP-IPV recipients were compared to historical incidence rates in the VSD population prior to 2009. Sequential probability ratio testing was used to analyze the data on a weekly basis. RESULTS: During the study period, 201,116 children received DTaP-IPV vaccine. Ninety-seven percent of DTaP-IPV recipients also received other vaccines on the same day, typically measles-mumps-rubella and varicella vaccines. There was no statistically significant increased risk of any of the eight pre-specified adverse events among DTaP-IPV recipients when compared to historical incidence rates. CONCLUSIONS: In this safety surveillance study of more than 200,000 DTaP-IPV vaccine recipients, there was no evidence of increased risk for any of the pre-specified adverse events monitored. 
Continued surveillance of DTaP-IPV vaccine safety may be warranted to monitor for rare adverse events, such as Guillain-Barre syndrome. |
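The weekly analysis described above used sequential probability ratio testing against historical rates. As an illustrative sketch only (the VSD project uses a maximized SPRT variant with exact thresholds; the alternative-hypothesis relative risk and error rates below are assumptions for illustration), a classical Wald SPRT for Poisson-distributed weekly counts might look like:

```python
import math

def sprt_poisson(weekly_counts, weekly_expected, rr_alt=2.0,
                 alpha=0.05, beta=0.10):
    """Classical Wald SPRT for a Poisson event rate.

    H0: events occur at the historical rate (relative risk = 1).
    H1: relative risk = rr_alt.
    Returns "signal", "no signal", or "continue" (keep monitoring).
    Illustrative only; not the exact VSD implementation.
    """
    upper = math.log((1 - beta) / alpha)   # cross: evidence of elevated risk
    lower = math.log(beta / (1 - alpha))   # cross: evidence of no elevation
    llr = 0.0
    for observed, expected in zip(weekly_counts, weekly_expected):
        # weekly log-likelihood ratio contribution under Poisson counts
        llr += observed * math.log(rr_alt) - expected * (rr_alt - 1)
        if llr >= upper:
            return "signal"
        if llr <= lower:
            return "no signal"
    return "continue"
```

A persistent excess over the expected counts drives the cumulative log-likelihood ratio toward the upper boundary; counts consistent with history drive it toward the lower boundary, which is how weekly surveillance can stop early in either direction.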
Progress toward measles preelimination - African Region, 2011-2012
Masresha BG , Kaiser R , Eshetu M , Katsande R , Luce R , Fall A , Dosseh AR , Naouri B , Byabamazima CR , Perry R , Dabbagh AJ , Strebel P , Kretsinger K , Goodson JL , Nshimirimana D . MMWR Morb Mortal Wkly Rep 2014 63 (13) 285-91 In 2008, the 46 member states of the World Health Organization (WHO) African Region (AFR) adopted a measles preelimination goal to reach by the end of 2012 with the following targets: 1) >98% reduction in estimated regional measles mortality compared with 2000, 2) annual measles incidence of fewer than five reported cases per million population nationally, 3) >90% national first dose of measles-containing vaccine (MCV1) coverage and >80% MCV1 coverage in all districts, and 4) >95% MCV coverage in all districts by supplementary immunization activities (SIAs). Surveillance performance objectives were to report two or more cases of nonmeasles febrile rash illness per 100,000 population, one or more suspected measles cases investigated with blood specimens in ≥80% of districts, and 100% completeness of surveillance reporting from all districts. This report updates previous reports and describes progress toward the measles preelimination goal during 2011-2012. In 2012, 13 (28%) member states had >90% MCV1 coverage, and three (7%) reported >90% MCV1 coverage nationally and >80% coverage in all districts. During 2011-2012, four (15%) of 27 SIAs with available information met the target of >95% coverage in all districts. In 2012, 16 of 43 (37%) member states met the incidence target of fewer than five cases per million, and 19 of 43 (44%) met both surveillance performance targets. In 2011, the WHO Regional Committee for AFR established a goal to achieve measles elimination by 2020. 
To achieve this goal, intensified efforts to identify and close population immunity gaps and improve surveillance quality are needed, as well as committed leadership and ownership of the measles elimination activities and mobilization of adequate resources to complement funding from global partners. |
Effect of breastfeeding on immunogenicity of oral live-attenuated human rotavirus vaccine: a randomized trial in HIV-uninfected infants in Soweto, South Africa
Groome MJ , Moon SS , Velasquez D , Jones S , Koen A , van Niekerk N , Jiang B , Parashar UD , Madhi SA . Bull World Health Organ 2014 92 (4) 238-45 OBJECTIVE: To investigate the effect of abstention from breastfeeding, for an hour before and after each vaccination, on the immune responses of infants to two doses of rotavirus vaccine. METHODS: In Soweto, South Africa, mother-infant pairs who were uninfected with human immunodeficiency virus (HIV) were enrolled as they presented for the "6-week" immunizations of the infants. Each infant was randomly assigned to Group 1 - in which breastfeeding was deferred for at least 1 h before and after each dose of rotavirus vaccine - or Group 2 - in which unrestricted breastfeeding was encouraged. Enzyme-linked immunosorbent assays were used to evaluate the titres of rotavirus-specific IgA in samples of serum collected from each infant immediately before each vaccine dose and 1 month after the second dose. Among the infants, a fourfold or greater increase in titres of rotavirus-specific IgA following vaccination was considered indicative of seroconversion. FINDINGS: The evaluable infants in Group 1 (n = 98) were similar to those in Group 2 (n = 106) in their baseline demographic characteristics and their pre-vaccination titres of anti-rotavirus IgA. After the second vaccine doses, geometric mean titres of anti-rotavirus IgA in the sera of Group-1 infants were similar to those in the sera of Group-2 infants (P = 0.685) and the frequency of seroconversion in the Group-1 infants was similar to that in the Group-2 infants (P = 0.485). CONCLUSION: Among HIV-uninfected South African infants, abstention from breastfeeding for at least 1 h before and after each vaccination dose had no significant effect on the infants' immune response to a rotavirus vaccine. |
Effectiveness of the monovalent rotavirus vaccine in Colombia: a case-control study
Cotes-Cantillo K , Paternina-Caicedo A , Coronell-Rodriguez W , Alvis-Guzman N , Parashar UD , Patel M , De la Hoz-Restrepo F . Vaccine 2014 32 (25) 3035-40 OBJECTIVE: To assess the effectiveness of the monovalent rotavirus vaccine (RV1) to prevent rotavirus diarrhea admissions to emergency departments (ED) in Colombia. METHODS: A multicenter case-control study was carried out in six Colombian cities from 2011 through January 2013. Cases were laboratory-confirmed rotavirus diarrhea patients admitted to EDs of selected health centers. Controls were patients with non-rotavirus diarrhea. Vaccination status was card-confirmed. Vaccine effectiveness and 95% confidence intervals (CI) were calculated from conditional logistic regression models using the formula (1 - adjusted odds ratio) x 100. RESULTS: A total of 1,051 fecal samples were collected from 193 cases and 858 controls. Vaccination history was confirmed for 173 cases (90%) and 801 controls (93%). Among the rotavirus-positive samples with vaccination history, 57% were G2P[4], 9.8% G9P[8], and 6% G9P[6]. Median age of cases (17 months) was greater than that of controls (15 months) (P<0.001), and mothers of cases had a lower level of education (P=0.025). The adjusted effectiveness was 79.19% (95% CI, 23.7 to 94.32) among children 6-11 months of age and -39.75% (95% CI, -270.67 to 47.24) among those >12 months of age. Against overnight rotavirus hospitalizations, RV1 provided protection of 84.42% (95% CI, 22.68 to 96.86) among children 6-11 months of age, and -79.49% (95% CI, -555.8 to 51.08) among those >12 months. CONCLUSIONS: RV1 provided significant protection against rotavirus hospitalization among children under 1 year of age in the Colombian setting. The observation of lower effectiveness in children >12 months requires further assessment. |
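The effectiveness estimate is a direct transformation of the adjusted odds ratio from the conditional logistic regression. A minimal sketch (the OR values in the usage note are back-calculated from the reported point estimates, for illustration only):

```python
import math

def vaccine_effectiveness(adjusted_or):
    """VE (%) = (1 - adjusted odds ratio) x 100."""
    return (1 - adjusted_or) * 100

def ve_interval(or_lower, or_upper):
    """An OR confidence interval maps to a VE interval with bounds swapped:
    the upper OR bound gives the lower VE bound, and vice versa."""
    return vaccine_effectiveness(or_upper), vaccine_effectiveness(or_lower)
```

An adjusted OR of about 0.21 corresponds to the ~79% effectiveness reported for infants, while an OR above 1 (e.g. ~1.40) yields the negative point estimate reported for children over 12 months.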
Efficiency of points of dispensing for influenza A(H1N1)pdm09 vaccination, Los Angeles County, California, USA, 2009
Saha S , Dean B , Teutsch S , Borse RH , Meltzer MI , Bagwell D , Plough A , Fielding J . Emerg Infect Dis 2014 20 (3) 590-5 During October 23-December 8, 2009, the Los Angeles County Department of Public Health used points of dispensing (PODs) to improve access to and increase the number of vaccinations against influenza A(H1N1)pdm09. We assessed the efficiency of these units and access to vaccines among ethnic groups. An average of 251 persons per hour (SE 65) were vaccinated at the PODs; a 10% increase in use of live-attenuated monovalent vaccines reduced that rate by 23 persons per hour (SE 7). Vaccination rates were highest for Asians (257/10,000 persons), followed by Hispanics (114/10,000), whites (75/10,000), and African Americans (37/10,000). Average distance traveled to a POD was highest for whites (6.6 miles; SD 6.5) and lowest for Hispanics (4.7 miles; SD 5.3). Placing PODs in areas of high population density could be an effective strategy to reach large numbers of persons for mass vaccination, but additional PODs may be needed to improve coverage for specific populations. |
Barriers and facilitators to influenza vaccination and vaccine coverage in a cohort of health care personnel
Naleway AL , Henkle EM , Ball S , Bozeman S , Gaglani MJ , Kennedy ED , Thompson MG . Am J Infect Control 2014 42 (4) 371-5 BACKGROUND: Annual influenza vaccination is recommended for health care personnel (HCP). We describe influenza vaccination coverage among HCP during the 2010-2011 season and present reported facilitators of and barriers to vaccination. METHODS: We enrolled HCP 18 to 65 years of age, working full time, with direct patient contact. Participants completed an Internet-based survey at enrollment and at the end of influenza season. In addition to self-reported data, we collected information about the 2010-2011 influenza vaccine from electronic employee health and medical records. RESULTS: Vaccination coverage was 77% (1,307/1,701). Factors associated with higher vaccination coverage included older age, being married or partnered, working as a physician or dentist, prior history of influenza vaccination, more years in patient care, and higher job satisfaction. Personal protection was reported as the most important reason for vaccination, followed closely by convenience, protection of patients, and protection of family and friends. Concerns about perceived vaccine safety and effectiveness and low perceived susceptibility to influenza were the most commonly reported barriers to vaccination. About half of the unvaccinated HCP said they would have been vaccinated if required by their employer. CONCLUSION: Influenza vaccination in this cohort was relatively high but still fell short of the recommended target of 90% coverage for HCP. Addressing concerns about vaccine safety and effectiveness is a possible area for future education or intervention to improve coverage among HCP. |
Distribution of pandemic influenza vaccine and reporting of doses administered, New York, New York, USA
Marcello RK , Papadouka V , Misener M , Wake E , Mandell R , Zucker JR . Emerg Infect Dis 2014 20 (4) 525-31 In 2009, the New York City Department of Health and Mental Hygiene delivered influenza A(H1N1)pdm09 (pH1N1) vaccine to health care providers, who were required to report all administered doses to the Citywide Immunization Registry. Using data from this registry and a provider survey, we estimated the number of all pH1N1 vaccine doses administered. Of 2.8 million doses distributed during October 1, 2009-March 4, 2010, a total of 988,298 doses were administered and reported; another 172,289 doses were administered but not reported, for a total of 1,160,587 doses administered during this period. Reported doses represented an estimated 80%-85% of actual doses administered. Reporting by a wide range of provider types was feasible during a pandemic. Pediatric-care providers had the highest reporting rate (93%). Other private-care providers who routinely did not report vaccinations indicated that they had few, if any, problems, thereby suggesting that mandatory reporting of all vaccines would be feasible. |
EHR adopters vs. non-adopters: impacts of, barriers to, and federal initiatives for EHR adoption
Jamoom EW , Patel V , Furukawa MF , King J . Healthc (Amst) 2014 2 (1) 33-39 While adoption of electronic health record (EHR) systems has grown rapidly, little is known about physicians' perspectives on their adoption and use. Nationally representative survey data from 2011 are used to compare the perspectives of physicians who have adopted EHRs with those of physicians who have yet to do so across three key areas: the impact of EHRs on clinical care, practice efficiency, and operations; barriers to EHR adoption; and factors that influence physicians to adopt EHRs. Despite significant differences in perspectives between adopters and non-adopters, the majority of physicians perceive that EHR use yields overall clinical benefits, more efficient practices, and financial benefits. Purchase cost and productivity loss are the greatest barriers to EHR adoption among both adopters and non-adopters, although non-adopters report these barriers at significantly higher rates. Financial incentives and penalties, technical assistance, and the capability for electronic health information exchange are the factors with the greatest influence on EHR adoption among all physicians. However, a substantially higher proportion of non-adopters regard various national health IT policies, in particular financial incentives or penalties, as a major influence on their decision to adopt an EHR system. Contrasting these perspectives provides a window into how national policies have shaped adoption thus far, and how these policies may shape adoption in the near future. |
Pathway from child sexual and physical abuse to risky sex among emerging adults: the role of trauma-related intrusions and alcohol problems
Walsh K , Latzman NE , Latzman RD . J Adolesc Health 2014 54 (4) 442-8 PURPOSE: Some evidence suggests that risk reduction programming for sexual risk behaviors (SRB) has been minimally effective, which emphasizes the need for research on etiological and mechanistic factors that can be addressed in prevention and intervention programming. Childhood sexual and physical abuse have been linked with SRB among older adolescents and emerging adults; however, pathways to SRB remain unclear. This study adds to the literature by testing a model specifying that traumatic intrusions after early abuse may increase risk for alcohol problems, which in turn may increase the likelihood of engaging in various types of SRB. METHODS: Participants were 1,169 racially diverse college students (72.9% female, 37.6% black/African-American, and 33.6% white) who completed anonymous questionnaires assessing child abuse, traumatic intrusions, alcohol problems, and sexual risk behavior. RESULTS: The hypothesized path model specifying that traumatic intrusions and alcohol problems account for associations between child abuse and several aspects of SRB was a good fit for the data; however, for men, stronger associations emerged between physical abuse and traumatic intrusions and between traumatic intrusions and alcohol problems, whereas for women, alcohol problems were more strongly associated with intent to engage in risky sex. CONCLUSIONS: Findings highlight the role of traumatic intrusions and alcohol problems in explaining paths from childhood abuse to SRB in emerging adulthood, and suggest that risk reduction programs may benefit from an integrated focus on traumatic intrusions, alcohol problems, and SRB for individuals with abuse experiences. |
An economic evaluation of anonymised information sharing in a partnership between health services, police and local government for preventing violence-related injury
Florence C , Shepherd J , Brennan I , Simon TR . Inj Prev 2014 20 (2) 108-14 OBJECTIVE: To assess the costs and benefits of a partnership between health services, police and local government shown to reduce violence-related injury. METHODS: Benefit-cost analysis. RESULTS: Anonymised information sharing and use led to a reduction in wounding recorded by the police that reduced the economic and social costs of violence by £6.9 million in 2007 compared with the costs the intervention city, Cardiff, UK, would have experienced in the absence of the programme. This includes a gross cost reduction of £1.25 million to the health service and £1.62 million to the criminal justice system in 2007. By contrast, the costs associated with the programme were modest: setup costs of software modifications and prevention strategies were £107,769, while the annual operating costs of the system were estimated at £210,433 (2003 UK pounds). The cumulative social benefit-cost ratio of the programme from 2003 to 2007 was £82 in benefits for each £1 spent on the programme, including a benefit-cost ratio of 14.80 for the health service and 19.1 for the criminal justice system. Each of these benefit-cost ratios is above 1 across a wide range of sensitivity analyses. CONCLUSIONS: An effective information-sharing partnership between health services, police and local government in Cardiff, UK, led to substantial cost savings for the health service and the criminal justice system compared with 14 other cities in England and Wales designated as similar by the UK government where this intervention was not implemented. |
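The headline figure is simple arithmetic: cumulative benefits divided by cumulative programme costs. A hedged sketch using the cost figures reported in the abstract (the assumption that the system incurred five full years of operating costs over 2003-2007 is ours, not the paper's):

```python
def benefit_cost_ratio(total_benefits, total_costs):
    """Benefits delivered per unit of currency spent on the programme."""
    return total_benefits / total_costs

# Programme cost stream from the abstract (2003 UK pounds):
# one-off setup plus annual operating costs, assuming five years of operation.
setup_cost = 107_769
annual_operating = 210_433
total_cost = setup_cost + 5 * annual_operating  # 1,159,934
```

Under that assumption, the reported ratio of 82 would imply cumulative social benefits of roughly £95 million over the period; the paper's actual benefit stream is more detailed than this sketch.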
The impact of alcohol and road traffic policies on crash rates in Botswana, 2004-2011: a time-series analysis
Sebego M , Naumann RB , Rudd RA , Voetsch K , Dellinger AM , Ndlovu C . Accid Anal Prev 2014 70 33-39 In Botswana, increased development and motorization have brought increased road traffic-related death rates. Between 1981 and 2001, the road traffic-related death rate in Botswana more than tripled. The country has taken several steps in recent years to address the growing burden of road traffic crashes, particularly the burden of alcohol-related crashes. This study examines the impact of the implementation of alcohol and road safety-related policies on crash rates, including overall crash rates, fatal crash rates, and single-vehicle nighttime fatal (SVNF) crash rates, in Botswana from 2004 to 2011. The overall crash rate declined significantly in June 2009 and June 2010, such that the overall crash rate from June 2010 to December 2011 was 22% lower than the overall crash rate from January 2004 to May 2009. Additionally, there were significant declines in average fatal crash and SVNF crash rates in early 2010. Botswana's recent crash rate reductions occurred during a time when aggressive policies and other activities (e.g., education, enforcement) were implemented to reduce alcohol consumption and improve road safety. While it is unclear which of the policies or activities contributed to these declines and to what extent, these reductions are likely the result of several combined efforts. |
Characteristics of U.S. suicide decedents in 2005-2010 who had received mental health treatment
Niederkrotenthaler T , Logan JE , Karch DL , Crosby A . Psychiatr Serv 2014 65 (3) 387-90 OBJECTIVE: To inform suicide prevention efforts in mental health treatment, the study assessed associations between recent mental health treatment, personal characteristics, and circumstances of suicide among suicide decedents. METHODS: Data from 18 states reporting to the National Violent Death Reporting System between 2005 and 2010 (N=57,877 suicides) were used to compare circumstances among adult decedents receiving any or no type of mental health treatment within two months before death. RESULTS: Of suicide decedents, 28.5% received treatment before suicide. Several variables were associated with higher odds of receiving treatment, including death by poisoning with commonly prescribed substances (adjusted odds ratio [AOR]=3.04, 95% confidence interval [CI]=2.84-3.26), a history of suicide attempts (AOR=2.77, CI=2.64-2.90), depressed mood (AOR=1.69, CI=1.62-1.76), and nonalcoholic substance abuse or dependence (AOR=1.13, CI=1.07-1.19). CONCLUSIONS: For nearly a third of all suicide decedents, better mental health care might have prevented death. Efforts to reduce access to lethal doses of prescription medications seem warranted to prevent overdosing with commonly prescribed substances. |
Sensitive testing of plasma HIV-1 RNA and Sanger sequencing of cellular HIV-1 DNA for the detection of drug resistance prior to starting first-line antiretroviral therapy with etravirine or efavirenz.
Geretti AM , Conibear T , Hill A , Johnson JA , Tambuyzer L , Thys K , Vingerhoets J , Van Delft Y . J Antimicrob Chemother 2014 69 (4) 1090-7 OBJECTIVES: This study investigated strategies that may increase the yield of drug resistance testing prior to starting antiretroviral therapy (ART), and whether transmitted and polymorphic resistance-associated mutations (RAMs) correlated with virological outcomes. METHODS: We carried out retrospective testing of baseline samples from patients entering the SENSE trial of first-line ART in Europe, Russia and Israel. Prior to randomization to etravirine or efavirenz plus two nucleos(t)ide reverse transcriptase inhibitors (NRTIs), plasma samples underwent routine Sanger sequencing of HIV-1 RT and protease (plasmaSS) in order to exclude patients with transmitted RAMs. Retrospectively, Sanger sequencing was repeated with HIV-1 DNA from baseline peripheral blood mononuclear cells (PBMCSS); baseline plasma samples were retested by allele-specific PCR targeting seven RT RAMs (AS-PCR) and ultra-deep RT sequencing (UDS). RESULTS: By plasmaSS, 16/193 (8.3%) patients showed ≥1 transmitted RAM affecting the NRTIs (10/193, 5.2%), non-nucleoside reverse transcriptase inhibitors (4/193, 2.1%) or protease inhibitors (2/193, 1.0%). No additional RAMs were detected by AS-PCR (n = 152) and UDS (n = 24); PBMCSS (n = 91) yielded two additional samples with one RAM each. Over 48 weeks, 4/79 (5.1%) patients on etravirine and 7/78 (9.0%) on efavirenz experienced virological failure; none had baseline RAMs. Conversely, 11/79 (13.9%) patients randomized to etravirine had one polymorphic RAM from the etravirine score in baseline plasma (V90I, V106I or E138A), without any impact on virological outcomes. CONCLUSIONS: The detection of resistance increased marginally with PBMC testing but did not increase with sensitive plasma testing. 
Careful consideration of the cost-effectiveness of different strategies for baseline HIV drug resistance testing is required. |
Serological measures of malaria transmission in Haiti: comparison of longitudinal and cross-sectional methods
Arnold BF , Priest JW , Hamlin KL , Moss DM , Colford JM Jr , Lammie PJ . PLoS One 2014 9 (4) e93684 BACKGROUND: Efforts to monitor malaria transmission increasingly use cross-sectional surveys to estimate transmission intensity from seroprevalence data using malarial antibodies. To date, seroconversion rates estimated from cross-sectional surveys have not been compared to rates estimated in prospective cohorts. Our objective was to compare seroconversion rates estimated in a prospective cohort with those from a cross-sectional survey in a low-transmission population. METHODS AND FINDINGS: The analysis included two studies from Haiti: a prospective cohort of 142 children ages ≤11 years followed for up to 9 years, and a concurrent cross-sectional survey of 383 individuals ages 0-90 years. From all individuals, we analyzed 1,154 blood spot specimens for antibodies to the malaria antigen MSP-1(19) using a multiplex bead antigen assay. We classified individuals as positive for malaria using a cutoff derived from the mean plus 3 standard deviations of antibody responses in a negative control set of unexposed individuals. We estimated prospective seroconversion rates from the longitudinal cohort based on 13 incident seroconversions among 646 person-years at risk. We also estimated seroconversion rates from the cross-sectional survey using a reversible catalytic model fit with maximum likelihood. We found the two approaches provided consistent results: the seroconversion rate for ages ≤11 years was 0.020 (0.010, 0.032) estimated prospectively versus 0.023 (0.001, 0.052) in the cross-sectional survey. CONCLUSIONS: The estimation of seroconversion rates using cross-sectional data is a widespread and generalizable problem for many infectious diseases that can be measured using antibody titers. The consistency between these two estimates lends credibility to model-based estimates of malaria seroconversion rates using cross-sectional surveys.
This study also demonstrates the utility of including malaria antibody measures in multiplex assays alongside targets for vaccine coverage and other neglected tropical diseases, which together could comprise an integrated, large-scale serological surveillance platform. |
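Two quantitative pieces of the methods above lend themselves to a compact sketch: the seropositivity cutoff (mean plus 3 standard deviations of unexposed controls) and the reversible catalytic model fitted to cross-sectional seroprevalence. This is illustrative only; the study's maximum-likelihood fitting step is not shown, and the parameter names are ours:

```python
import math
import statistics

def seropositivity_cutoff(negative_controls):
    """Cutoff = mean + 3 standard deviations of antibody responses
    measured in unexposed (negative control) individuals."""
    return (statistics.mean(negative_controls)
            + 3 * statistics.stdev(negative_controls))

def catalytic_seroprevalence(age, scr, srr):
    """Reversible catalytic model: expected seroprevalence at a given age.

    scr: seroconversion rate (lambda); srr: seroreversion rate (rho).
    P(age) = scr / (scr + srr) * (1 - exp(-(scr + srr) * age))
    In the study these rates are estimated by maximum likelihood from
    age-stratified seroprevalence; that fitting step is omitted here.
    """
    total = scr + srr
    return scr / total * (1 - math.exp(-total * age))
```

With a seroconversion rate of about 0.02/year, expected seroprevalence rises with age toward the equilibrium value scr/(scr + srr), which is what the catalytic curve fitted to survey data traces out.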
What is the most reliable solid culture medium for tuberculosis treatment trials?
Joloba ML , Johnson JL , Feng PJ , Bozeman L , Goldberg SV , Morgan K , Gitta P , Boom HW , Heilig CM , Mayanja-Kizza H , Eisenach KD . Tuberculosis (Edinb) 2014 94 (3) 311-6 We conducted a prospective study to determine which solid medium is the most reliable overall and after two months of therapy to detect Mycobacterium tuberculosis complex (MTB). MTB isolation and contamination rates on Lowenstein-Jensen (LJ) medium and Middlebrook 7H10 and 7H11 agar, with and without selective antibiotics, were examined in a single laboratory and compared against a constructed reference standard and MGIT 960 results. Of 50 smear-positive adults with pulmonary TB enrolled, 45 successfully completed standard treatment. Two spot sputum specimens were collected before treatment and at week 8, and one spot specimen each at weeks 2, 4, 6, and 12. The MTB recovery rate among all solid media for pre-treatment specimens was similar. After 8 weeks, selective (S) 7H11 had the highest positivity rate. Latent class analysis was used to construct the primary reference standard. The 98.7% sensitivity of 7H11S (95% Wilson confidence interval 96.4%-99.6%) was highest among the 5 solid media (P = 0.003 by bootstrap); the 82.6% specificity of 7H10S (95% CI 75.7%-87.8%) was highest (P = 0.098). Our results support 7H11S as the medium of choice. Further studies in different areas, where recovery and contamination are likely to vary, are recommended. |
Metallic nickel nanoparticles may exhibit higher carcinogenic potential than fine particles in JB6 cells
Magaye R , Zhou Q , Bowman L , Zou B , Mao G , Xu J , Castranova V , Zhao J , Ding M . PLoS One 2014 9 (4) e92418 While numerous studies have described the pathogenic and carcinogenic effects of nickel compounds, little has been done on the biological effects of metallic nickel. Moreover, the carcinogenic potential of metallic nickel nanoparticles is unknown. Activator protein-1 (AP-1) and nuclear factor-kappaB (NF-kappaB) have been shown to play pivotal roles in tumor initiation, promotion, and progression. Mutation of the p53 tumor suppressor gene is considered to be one of the steps leading to the neoplastic state. The present study examines the effects of metallic nickel fine and nanoparticles on tumor promoter or suppressor gene expression as well as on cell transformation in JB6 cells. Our results demonstrate that metallic nickel nanoparticles caused higher activation of AP-1 and NF-kappaB, and a greater decrease of p53 transcription activity, than fine particles. Western blotting indicates that metallic nickel nanoparticles induced higher levels of protein expression for R-Ras, c-myc, c-Jun, p65, and p50 in a time-dependent manner. In addition, both metallic nickel nano- and fine particles increased anchorage-independent colony formation in JB6 P+ cells in the soft agar assay. These results imply that metallic nickel fine and nanoparticles are both carcinogenic in vitro in JB6 cells. Moreover, metallic nickel nanoparticles may exhibit higher carcinogenic potential, which suggests that precautionary measures should be taken in the use of nickel nanoparticles or nickel compounds in nanomedicine. |
Evaluation of luciferase and GFP-expressing Nipah viruses for rapid quantitative antiviral screening
Lo MK , Nichol ST , Spiropoulou CF . Antiviral Res 2014 106 53-60 Nipah virus (NiV) outbreaks have occurred in Malaysia, India, and Bangladesh, and the virus continues to cause annual outbreaks of fatal human encephalitis in Bangladesh due to spillover from its bat host reservoir. Due to its high pathogenicity, its potential use for bio/agro-terrorism, and the current lack of approved therapeutics, NiV is designated as an overlap select agent requiring biosafety level-4 containment. Although the development of therapeutic monoclonal antibodies and soluble protein subunit vaccines has shown great promise, the paucity of effective antiviral drugs against NiV merits further exploration of compound libraries using rapid quantitative antiviral assays. As a proof-of-concept study, we evaluated the use of fluorescent and luminescent reporter NiVs for antiviral screening. We constructed and rescued NiVs expressing either Renilla luciferase or green fluorescent protein, and characterized their reporter signal kinetics in different cell types as well as in the presence of several inhibitors. The 50 percent effective concentrations (EC50s) derived for inhibitors against both reporter viruses are within range of the EC50s derived from virus yield-based dose-response assays against wild-type NiV (within 1 log10), thus demonstrating that both reporter NiVs can serve as robust antiviral screening tools. Utilizing these live NiV-based reporter assays requires modest instrumentation and circumvents the time- and labor-intensive steps associated with cytopathic effect or viral antigen-based assays. These reporter NiVs will facilitate not only antiviral screening, but also the study of host cell components that influence the virus life cycle. |
Analysis of crystalline silica in bulk materials
Harper M , Key-Schwartz R . Ann Occup Hyg 2014 58 (5) 657-8 We are writing concerning results presented in Annals of Occupational Hygiene as part of a manuscript by Radnoff and Kutz (2014). The manuscript presents the results of seven analyses without associated uncertainty or a validated Limit of Quantitation (LOQ) for bulk crystalline silica content, with values reported <1% down to <0.1%. In the Methods section, they state the samples were ‘analysed according to NIOSH Method 7500 for the presence of quartz silica down to 0.1% w/w. This method includes protocols for analysing bulk or settled dust samples'. The NIOSH 7500 method (NIOSH, 2003) is designed to quantify respirable samples collected on a filter and does not include specific procedures for analyzing bulk or settled dust samples beyond using bulk samples to identify interferences in the air samples. The user of the method is directed from Paragraph 4a: ‘Interference check. Prepare area dust sample or settled dust bulk sample for XRD analysis …' to Paragraph 11: ‘Obtain a qualitative X-ray diffraction scan of the area air sample (or bulk settled dust) to determine the presence of free silica polymorphs.' The subsequent quantitative section (Paragraph 12) refers only to the air sample filter analysis. No evaluation data or LOQ estimate are presented in NIOSH 7500 to support bulk analysis, and it is unlikely that the method could be used to measure down to or below 0.1% in any case. Verma et al. (2002) evaluated an infrared method for bulk analysis between 1 and 75% and concluded that although it could be used to determine down to 1% in routine analyses, the method was ineffective below 1% silica. A more recent evaluation of an X-ray diffraction method (Martin et al., 2012) suggests that it may be very difficult to go much below 1%, even with the additional use of the Rietveld refinement in X-ray diffraction (LOQ = 0.76%).
We are unaware of any method that has been published for the determination of crystalline silica in bulk materials that can measure down to 0.1%. If the laboratory used by Radnoff and Kutz has been able to modify NIOSH 7500 to achieve this goal, we would welcome publication of the details of the modification and methods validation as this would be of value to the occupational health community. |
Burkholderia pseudomallei type G in Western Hemisphere
Gee JE , Allender CJ , Tuanyok A , Elrod MG , Hoffmaster AR . Emerg Infect Dis 2014 20 (4) 661-3 Burkholderia pseudomallei isolates from the Western Hemisphere are difficult to differentiate from those from regions in which melioidosis is traditionally endemic. We used internal transcribed spacer typing to determine that B. pseudomallei isolates from the Western Hemisphere are consistently type G. Knowledge of this relationship might be useful for epidemiologic investigations. |
Circulating lethal toxin decreases the ability of neutrophils to respond to Bacillus anthracis
Weiner ZP , Ernst SM , Boyer AE , Gallegos-Candela M , Barr JR , Glomski IJ . Cell Microbiol 2014 16 (4) 504-18 Polymorphonuclear leucocytes (PMNs) play a protective role during Bacillus anthracis infection. However, B. anthracis is able to subvert the PMN response effectively, as evidenced by the high mortality rates of anthrax. One major virulence factor produced by B. anthracis, lethal toxin (LT), is necessary for dissemination in the BSL2 model of mouse infection. While human and mouse PMNs kill vegetative B. anthracis, the short in vitro half-lives of PMNs have made it difficult to determine how or if LT alters their bactericidal function. Additionally, the effect of LT intoxication on PMNs' ability to migrate to inflammatory signals remains controversial. Concentrations of lethal factor (LF), the enzymatic component of LT, in both serum and major organs were determined from mice infected with B. anthracis Sterne strain at defined stages of infection to guide subsequent administration of purified toxin. Bactericidal activity of PMNs assessed using ex vivo cell culture assays showed significant defects in killing B. anthracis. In vivo PMN recruitment to inflammatory stimuli was significantly impaired at 24 h, as assessed by real-time analysis of light-producing PMNs within the mouse. These observations suggest that LT serves dual functions: it both attenuates accumulation of PMNs at sites of inflammation and impairs PMNs' bactericidal activity against vegetative B. anthracis. |
Socioeconomic status, child enrichment factors, and cognitive performance among preschool-age children: results from the Follow-Up of Growth and Development Experiences study
Christensen DL , Schieve LA , Devine O , Drews-Botsch C . Res Dev Disabil 2014 35 (7) 1789-801 Lower cognitive performance is associated with poorer health and functioning throughout the lifespan and disproportionately affects children from lower socioeconomic status (SES) populations. Previous studies reporting positive associations between child home enrichment and cognitive performance generally had a limited distribution of SES. We evaluated the associations of SES and child enrichment with cognitive performance in a population with a wide range of SES, particularly whether enrichment attenuates associations with SES. Children were sampled from a case-control study of small-for-gestational-age (SGA) births conducted in a public hospital serving a low-SES population (final n=198) and a private hospital serving a middle-to-high-SES population (final n=253). SES (maternal education and income) and perinatal factors (SGA, maternal smoking and drinking) were obtained from a maternal interview at birth. Five child home enrichment factors (e.g., books in the home) and preschool attendance were obtained from a follow-up interview at age 4.5 years. Cognitive performance was assessed with the Differential Ability Scales (DAS), a standardized psychometric test administered at follow-up. SES and enrichment scores were created by combining individual factors. Analyses were adjusted for perinatal factors. Children from the public birth hospital had a significantly lower mean DAS general cognitive ability (GCA) score than children born at the private birth hospital (adjusted mean difference -21.4, 95% CI: -24.0, -18.7); this was substantially attenuated by adjustment for individual SES, child enrichment factors, and preschool attendance (adjusted mean difference -5.1, 95% CI: -9.5, -0.7). Individual-level SES score was associated with DAS score, beyond the general SES effect associated with hospital of birth.
Adjustment for preschool attendance and home enrichment score attenuated the association between individual SES score and adjusted mean DAS-GCA among children born at both of the hospitals. The effect of being in the lower compared to the middle tertile of SES score was reduced by approximately a quarter; the effect of being in the upper compared to the middle tertile of SES score was reduced by nearly half, but this comparison was possible only for children born at the private hospital. A child's individual SES was associated with cognitive performance within advantaged and disadvantaged populations. Child enrichment was associated with better cognitive performance and attenuated the SES influence. Health care providers should reinforce guidelines for home enrichment and refer children with delays to early intervention and education, particularly children from disadvantaged populations. |
Cardiac arrest during hospitalization for delivery in the United States, 1998-2011
Mhyre JM , Tsen LC , Einav S , Kuklina EV , Leffert LR , Bateman BT . Anesthesiology 2014 120 (4) 810-8 BACKGROUND: The objective of this analysis was to evaluate the frequency, distribution of potential etiologies, and survival rates of maternal cardiopulmonary arrest during the hospitalization for delivery in the United States. METHODS: By using data from the Nationwide Inpatient Sample during the years 1998 through 2011, the authors obtained weighted estimates of the number of U.S. hospitalizations for delivery complicated by maternal cardiac arrest. Clinical and demographic risk factors, potential etiologies, and outcomes were identified and compared in women with and without cardiac arrest. The authors tested for temporal trends in the occurrence and survival associated with maternal arrest. RESULTS: Cardiac arrest complicated 1 in 12,000 or 8.5 per 100,000 hospitalizations for delivery (99% CI, 7.7 to 9.3 per 100,000). The most common potential etiologies of arrest included hemorrhage, heart failure, amniotic fluid embolism, and sepsis. Among patients with cardiac arrest, 58.9% of patients (99% CI, 54.8 to 63.0%) survived to hospital discharge. CONCLUSIONS: Approximately 1 in 12,000 hospitalizations for delivery is complicated by cardiac arrest, most frequently due to hemorrhage, heart failure, amniotic fluid embolism, or sepsis. Survival depends on the underlying etiology of arrest. |
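As a quick arithmetic check on the rates reported in the abstract above, the equivalence between "8.5 per 100,000" and "about 1 in 12,000" delivery hospitalizations can be verified in a few lines (an illustrative sketch; the function name is ours, not the study's):

```python
def one_in_n(rate_per_100k: float) -> float:
    """Convert a rate per 100,000 into the N of a '1 in N' expression."""
    return 100_000 / rate_per_100k

# 8.5 cardiac arrests per 100,000 delivery hospitalizations
n = one_in_n(8.5)
print(round(n))  # prints 11765, i.e. roughly 1 in 12,000
```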
Case-control analysis of maternal prenatal analgesic use and cardiovascular malformations: Baltimore-Washington Infant Study
Marsh CA , Cragan JD , Alverson CJ , Correa A . Am J Obstet Gynecol 2014 211 (4) 404 e1-9 OBJECTIVE: To assess maternal prenatal use of analgesics and risk of cardiovascular malformations (CVM) in the offspring. STUDY DESIGN: Data from the Baltimore-Washington Infant Study, a population based case-control investigation of CVM, were used to examine selected isolated CVM diagnoses and maternal analgesic use during the periconceptional period (3 months before and after conception). We compared case and control infants on frequency of maternal use of analgesics and estimated adjusted odds ratios (adjORs) and 95% confidence intervals (CI) with logistic regression models for specific CVM phenotypes. RESULTS: Frequency of periconceptional use of any analgesic was 52% among control mothers and 53% among case mothers. Analyses by CVM diagnoses identified an association of tetralogy of Fallot with maternal acetaminophen use (adjOR=1.6, 95%CI=1.1, 2.3) and dextro-transposition of the great arteries with intact ventricular septum with maternal non-steroidal anti-inflammatory drug use (adjOR=3.2, 95%CI=1.2, 8.7). CONCLUSION: Analgesic use during the periconceptional period was not associated with CVM in the aggregate or with most phenotypes of CVM examined. Associations with two phenotypes of CVM may have occurred by chance. These findings warrant corroboration and further study, including further evaluation of the observed associations, the dose of analgesic taken, more specific timing of analgesic use, and indications for use. |
Mortality among a cohort of U.S. commercial airline cockpit crew
Yong LC , Pinkerton LE , Yiin JH , Anderson JL , Deddens JA . Am J Ind Med 2014 57 (8) 906-14 BACKGROUND: We evaluated mortality among 5,964 former U.S. commercial cockpit crew (pilots and flight engineers). The outcomes of a priori interest were non-chronic lymphocytic leukemia, central nervous system (CNS) cancer (including brain), and malignant melanoma. METHODS: Vital status was ascertained through 2008. Life table and Cox regression analyses were conducted. Cumulative exposure to cosmic radiation was estimated from work history data. RESULTS: Compared to the U.S. general population, mortality from all causes, all cancer, and cardiovascular diseases was decreased, but mortality from aircraft accidents was highly elevated. Mortality was elevated for malignant melanoma but not for non-chronic lymphocytic leukemia. CNS cancer mortality increased with an increase in cumulative radiation dose. CONCLUSIONS: Cockpit crew had a low all-cause, all-cancer, and cardiovascular disease mortality but elevated aircraft accident mortality. Further studies are needed to clarify the risk of CNS and other radiation-associated cancers in relation to cosmic radiation and other workplace exposures. |
Prevalence of obesity by occupation among US workers: the National Health Interview Survey 2004-2011
Gu JK , Charles LE , Bang KM , Ma CC , Andrew ME , Violanti JM , Burchfiel CM . J Occup Environ Med 2014 56 (5) 516-28 OBJECTIVE: To estimate the prevalence of obesity and the change in prevalence of obesity between 2004-2007 and 2008-2011 by occupation among US workers in the National Health Interview Survey. METHODS: Self-reported weight and height were collected and used to assess obesity (body mass index ≥ 30 kg/m(2)). Gender-, race/ethnicity-, and occupation-specific prevalence of obesity were calculated. RESULTS: Prevalence of obesity steadily increased from 2004 through 2008 across gender and race/ethnicity but leveled off from 2008 through 2011. Non-Hispanic black female workers in health care support (49.2%) and transportation/material moving (46.6%) had the highest prevalence of obesity. Prevalence of obesity in relatively low-obesity (white-collar) occupations significantly increased between 2004-2007 and 2008-2011, whereas it did not change significantly in high-obesity (blue-collar) occupations. CONCLUSIONS: Workers in all occupational categories are appropriate targets for health promotion and intervention programs to reduce obesity. |
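The obesity definition used in the study above is the standard body mass index formula, BMI = weight (kg) / height (m)^2, with a cut-point of 30. A minimal sketch of that calculation (function names are illustrative, not from the study):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in meters squared."""
    return weight_kg / height_m ** 2

def is_obese(weight_kg: float, height_m: float) -> bool:
    """Obesity as defined in the study: BMI >= 30 kg/m^2."""
    return bmi(weight_kg, height_m) >= 30.0

print(round(bmi(95.0, 1.75), 1))  # prints 31.0
print(is_obese(95.0, 1.75))       # prints True
```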
Collection efficiencies of high flow rate personal respirable samplers when measuring Arizona road dust and analysis of quartz by X-ray diffraction
Stacey P , Lee T , Thorpe A , Roberts P , Frost G , Harper M . Ann Occup Hyg 2014 58 (4) 512-23 Prolonged exposure to respirable crystalline silica (RCS) causes silicosis and is also considered a cause of cancer. To meet emerging needs for precise measurements of RCS from shorter sampling periods (<4 h) and lower air concentrations, collaborative work was done to assess the differences between personal respirable samplers at higher flow rates. The performance of the FSP10, GK2.69, and CIP 10 R samplers was compared with that of the Safety In Mines Personal Dust Sampler (SIMPEDS), which is commonly used in the UK for the measurement of RCS, as a reference. In addition, the performance of the FSP10 and GK2.69 samplers was compared at the nominal flow rates recommended by the manufacturers (10 and 4.2 l min(-1), respectively) and at the flow rates proposed by the National Institute for Occupational Safety and Health (11.2 and 4.4 l min(-1)). Samplers were exposed to aerosols of ultrafine and medium grades of Arizona road dust (ARD) generated in a calm air chamber. All analyses for RCS in this study were performed at the Health and Safety Laboratory. The difference in flow rates for the GK2.69 is small and does not result in a substantial difference in collection efficiency for the dusts tested, while the performance of the FSP10 at 11.2 l min(-1) was more comparable with samples from the SIMPEDS. Conversely, the GK2.69 collected proportionately more crystalline silica in the respirable dust than the other samplers, which then produced RCS results most comparable with the SIMPEDS. The CIP 10 R collected less ultrafine ARD than the other samplers, as might be expected based on earlier performance evaluations. The higher flow rate of the FSP10 should be an added advantage for task-specific sampling or when measuring air concentrations below current occupational exposure limits. |
Do hearing protectors protect hearing?
Groenewold MR , Masterson EA , Themann CL , Davis RR . Am J Ind Med 2014 57 (9) 1001-10 BACKGROUND: We examined the association between self-reported hearing protection use at work and incidence of hearing shifts over a 5-year period. METHODS: Audiometric data from 19,911 workers were analyzed. Two hearing shift measures-OSHA standard threshold shift (OSTS) and high-frequency threshold shift (HFTS)-were used to identify incident shifts in hearing between workers' 2005 and 2009 audiograms. Adjusted odds ratios were generated using multivariable logistic regression with multi-level modeling. RESULTS: The odds ratio for hearing shift for workers who reported never versus always wearing hearing protection was nonsignificant for OSTS (OR 1.23, 95% CI 0.92-1.64) and marginally significant for HFTS (OR 1.26, 95% CI 1.00-1.59). A significant linear trend towards increased risk of HFTS with decreased use of hearing protection was observed (P = 0.02). CONCLUSION: The study raises concern about the effectiveness of hearing protection as a substitute for noise control to prevent noise-induced hearing loss in the workplace. |
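The odds ratios in the abstract above come from multivariable, multilevel logistic regression, but the unadjusted quantity they generalize is the familiar odds ratio from a 2x2 exposure-by-outcome table. A hedged sketch with made-up counts (these numbers are not data from the study):

```python
def odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """Unadjusted odds ratio for a 2x2 table:
                        shift   no shift
    never protected       a        b
    always protected      c        d
    Odds of a shift among the never-protected divided by odds among the always-protected.
    """
    return (a / b) / (c / d)

# Hypothetical counts chosen only to illustrate the calculation
print(round(odds_ratio(30, 170, 25, 175), 2))  # prints 1.24
```

An OR above 1 here would mean workers who never wore protection had higher odds of a hearing shift; the study's adjusted estimates account for covariates that this raw calculation ignores.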
Human schistosomiasis
Colley DG , Bustinduy AL , Secor WE , King CH . Lancet 2014 383 (9936) 2253-64 Human schistosomiasis-or bilharzia-is a parasitic disease caused by trematode flukes of the genus Schistosoma. By conservative estimates, at least 230 million people worldwide are infected with Schistosoma spp. Adult schistosome worms colonise human blood vessels for years, successfully evading the immune system while excreting hundreds to thousands of eggs daily, which must either leave the body in excreta or become trapped in nearby tissues. Trapped eggs induce a distinct immune-mediated granulomatous response that causes local and systemic pathological effects ranging from anaemia, growth stunting, impaired cognition, and decreased physical fitness, to organ-specific effects such as severe hepatosplenism, periportal fibrosis with portal hypertension, and urogenital inflammation and scarring. At present, preventive public health measures in endemic regions consist of treatment once every 1 or 2 years with the isoquinolinone drug, praziquantel, to suppress morbidity. In some locations, elimination of transmission is now the goal; however, more sensitive diagnostics are needed in both the field and clinics, and integrated environmental and health-care management will be needed to ensure elimination. |
Sustainability of water, sanitation and hygiene interventions in Central America
Sabogal RI , Medlin E , Aquino G , Gelting RJ . J Water Sanit Hyg Dev 2014 4 (1) 89-99 The American Red Cross and U.S. Centers for Disease Control and Prevention collaborated on a sustainability evaluation of post-hurricane water, sanitation and hygiene (WASH) interventions in Central America. In 2006 and 2009, we revisited six study areas in rural El Salvador, Guatemala, Honduras and Nicaragua to assess the sustainability of WASH interventions finalized in 2002, after Hurricane Mitch in 1998. We used surveys to collect data, calculate indicators and identify factors that influence sustainability. Regional sustainability indicator results showed a statistically significant decline in access to water. The presence of sanitation facilities had not changed since the beginning of the project; however, maintenance and use of latrines declined but continued to meet the goal of 75% use after 7 years. The hygiene indicator, hand washing, initially declined and then increased. Declines in water access were due to operational problems related to storm events and population changes. Sanitation facilities were still present and sometimes used even though they had reached or surpassed their original design life. Changes in hygiene practices appeared related to ongoing hygiene promotion from outside organizations. These results provide useful input for making WASH programs more sustainable and informing future, more in-depth research into factors influencing sustainability. |
Implementing systematic review in Toxicological Profiles: ATSDR and NIEHS/NTP collaboration
Murray HE , Thayer KA . J Environ Health 2014 76 (8) 34-35 The Agency for Toxic Substances and Disease Registry’s (ATSDR) Toxicological Profiles provide comprehensive qualitative and quantitative summations of potential adverse health effects from exposure to hazardous substances. This information is subsequently used to derive the Agency’s Minimal Risk Levels (MRLs). These Profiles and their attendant MRLs serve as the scientific basis for the Agency’s applied public health activities such as Site-Specific Health Assessments, Health Studies, Health Education, and Emergency Response. ATSDR’s Profile-development and MRL-derivation processes consist of a multi-tiered, critical evaluation and interpretation of the available scientific literature for a specific hazardous substance (ATSDR, 2003; ATSDR, 1996). Recently, ATSDR has been updating the approach used to develop Profiles by incorporating methods of systematic review (SR). Adoption of SR methods should provide for even more comprehensive, transparent, and organized examination and assessment of the information and conclusions presented in ATSDR’s Toxicological Profiles and Addenda. | Systematic review methods first gained traction in the area of health care interventions, prompting Congress, in 2008, to direct the Institute of Medicine to develop a set of standards for conducting SRs in order “…to assure objective, transparent, and scientifically valid systematic reviews…” of the effectiveness of medical and surgical interventions (IOM, 2011). Although originally intended to evaluate the strength of evidence used to develop guidelines for clinical practice and healthcare interventions (AHRQ, 2012), SR has become an increasingly important tool to search, analyze and summarize information used to make environmental health decisions (Silbergeld and Scherer, 2013; Woodruff and Sutton, 2011). |
Development of Haiti's rural water, sanitation and hygiene workforce
Hubbard B , Lockhart G , Gelting RJ , Bertrand F . J Water Sanit Hyg Dev 2014 4 (1) 159-163 In 2009 the Haitian Directorate of Potable Water and Sanitation (DINEPA) identified an inadequately trained and under-staffed rural workforce as one of their main institutional challenges. Plans to address this challenge were impacted by the devastating earthquake of January 12, 2010 and the cholera outbreak of October 2010, both of which further complicated Haiti's already poor water and sanitation conditions. Recognizing the importance of DINEPA's institutional priorities, donor and technical assistance groups provided needed support to improve the country's conditions and build the rural water and sanitation workforce. This report describes how DINEPA and the US Centers for Disease Control and Prevention (CDC) collaborated to design and implement a training program for 264 potable water and sanitation technicians for rural areas. The paper also describes the initial field activities of the newly trained technicians and the immediate impact of their work in the rural water, sanitation and hygiene sector. |
Higher urinary lignan concentrations in women but not men are positively associated with shorter time to pregnancy
Mumford SL , Sundaram R , Schisterman EF , Sweeney AM , Barr DB , Rybak ME , Maisog JM , Parker DL , Pfeiffer CM , Louis GM . J Nutr 2014 144 (3) 352-8 Phytoestrogens have been associated with subtle hormonal changes, although effects on fecundity are unknown. Our objective was to evaluate the association between male and female urinary phytoestrogen (isoflavone and lignan) concentrations and time to pregnancy (TTP) in a population-based cohort of 501 couples desiring pregnancy and discontinuing contraception. Couples were followed for 12 mo or until pregnancy. Fecundability ORs (FORs) and 95% CIs were estimated after adjusting for age, body mass index, race, site, creatinine, supplement use, and physical activity in relation to female, male, and joint couple concentrations. Models included the phytoestrogen of interest and the sum of the remaining individual phytoestrogens. FORs <1 denote a longer TTP and FORs >1 a shorter TTP. Urinary lignan concentrations were higher, on average, among female partners of couples who became pregnant during the study compared with women who did not become pregnant (median enterodiol: 118 vs. 80 nmol/L; P < 0.10; median enterolactone: 990 vs. 412 nmol/L; P < 0.05) and were associated with significantly shorter TTP in models based on both individual and couples' concentrations (couples' models: enterodiol FOR, 1.13; 95% CI: 1.02, 1.26; enterolactone FOR, 1.11; 95% CI: 1.01, 1.21). Male lignan concentrations were not associated with TTP, nor were isoflavone concentrations. Sensitivity analyses showed that associations observed are unlikely to be explained by potential unmeasured confounding by lifestyle or other nutrients. Our results suggest that female urinary lignan concentrations at levels characteristic of the U.S. population are associated with a shorter TTP among couples who are attempting to conceive, highlighting the importance of dietary influences on fecundity. |
Notes from the field: calls to poison centers for exposures to electronic cigarettes - United States, September 2010-February 2014
Chatham-Stephens K , Law R , Taylor E , Melstrom P , Bunnell R , Wang B , Apelberg B , Schier JG . MMWR Morb Mortal Wkly Rep 2014 63 (13) 292-3 Electronic nicotine delivery devices such as electronic cigarettes (e-cigarettes) are battery-powered devices that deliver nicotine, flavorings (e.g., fruit, mint, and chocolate), and other chemicals via an inhaled aerosol. E-cigarettes that are marketed without a therapeutic claim by the product manufacturer are currently not regulated by the Food and Drug Administration (FDA). In many states, there are no restrictions on the sale of e-cigarettes to minors. Although e-cigarette use is increasing among U.S. adolescents and adults, its overall impact on public health remains unclear. One area of concern is the potential of e-cigarettes to cause acute nicotine toxicity. To assess the frequency of exposures to e-cigarettes and characterize the reported adverse health effects associated with e-cigarettes, CDC analyzed data on calls to U.S. poison centers (PCs) about human exposures to e-cigarettes (exposure calls) for the period September 2010 (when new, unique codes were added specifically for capturing e-cigarette calls) through February 2014. To provide a comparison with a conventional product of known toxicity, the number and characteristics of e-cigarette exposure calls were compared with those of conventional tobacco cigarette exposure calls. |
Are tobacco control policies effective in reducing young adult smoking?
Farrelly MC , Loomis BR , Kuiper N , Han B , Gfroerer J , Caraballo RS , Pechacek TF , Couzens GL . J Adolesc Health 2014 54 (4) 481-6 PURPOSE: We examined the influence of tobacco control program funding, smoke-free air laws, and cigarette prices on young adult smoking outcomes. METHODS: We used a natural experiment design that exploits the variation in tobacco control policies across states and over time to understand their influence on tobacco outcomes. We combined individual outcome data with annual state-level policy data to conduct multivariable logistic regression models, controlling for an extensive set of sociodemographic factors. The participants were 18- to 25-year-olds from the 2002-2009 National Surveys on Drug Use and Health. The three main outcomes were past-year smoking initiation, and current and established smoking. A current smoker was one who had smoked on at least 1 day in the past 30 days. An established smoker was one who had smoked 1 or more cigarettes in the past 30 days and at least 100 cigarettes in his or her lifetime. RESULTS: Higher levels of tobacco control program funding and greater smoke-free air law coverage were both associated with declines in current and established smoking (p < .01). Greater coverage of smoke-free air laws was associated with lower past-year initiation with marginal significance (p = .058). Higher cigarette prices were not associated with smoking outcomes. Had smoke-free air law coverage and cumulative tobacco control funding remained at 2002 levels, current and established smoking would have been 5%-7% higher in 2009. CONCLUSIONS: Smoke-free air laws and state tobacco control programs are effective strategies for curbing young adult smoking. |
CDC Grand Rounds: global tobacco control
Asma S , Song Y , Cohen J , Eriksen M , Pechacek T , Cohen N , Iskander J . MMWR Morb Mortal Wkly Rep 2014 63 (13) 277-80 During the 20th century, use of tobacco products contributed to the deaths of 100 million persons worldwide. In 2011, approximately 6 million additional deaths were linked to tobacco use, the world's leading underlying cause of death, responsible for more deaths each year than human immunodeficiency virus/acquired immunodeficiency syndrome (HIV/AIDS), tuberculosis, and malaria combined. One third to one half of lifetime users die from tobacco products, and smokers die an average of 14 years earlier than nonsmokers. Manufactured cigarettes account for 96% of all tobacco sales worldwide. From 1880 to 2009, annual global consumption of cigarettes increased from an estimated 10 billion cigarettes to approximately 5.9 trillion cigarettes, with five countries accounting for 58% of the total consumption: China (38%), Russia (7%), the United States (5%), Indonesia (4%), and Japan (4%). Among the estimated 1 billion smokers worldwide, men outnumber women by four to one. In 14 countries, at least 50% of men smoke, whereas in more than half of these same countries, fewer than 10% of women smoke. If current trends persist, an estimated 500 million persons alive today will die from use of tobacco products. By 2030, tobacco use will result in the deaths of approximately 8 million persons worldwide each year. Yet, every death from tobacco products is preventable. |
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed: Feb 1, 2024
- Page last updated: Apr 29, 2024