One more lesson from the pandemic
Jhung MA , Finelli L . J Public Health Manag Pract 2011 17 (1) 1-3 Over the last year, many in the public health community have reflected upon lessons learned from the first pandemic of the 21st century. In the current issue of the Journal of Public Health Management and Practice, 3 articles remind us of the importance of public health partnerships in planning for and responding to influenza pandemics and other national public health emergencies. Two articles that evaluate surveillance for influenza-associated hospitalizations in New York State highlight the contribution that public health partnerships make to gathering accurate information for surveillance.1,2 A third article by Plough and colleagues addresses the role of partnerships in Los Angeles County's response to the pandemic and recommends a partnership strategy to remedy the inequity in vaccination that was observed.3 | The 2009 H1N1 pandemic was perceived as relatively mild by most measures,4–7 but the specter of widespread susceptibility combined with the early uncertainty surrounding the pathogenicity of the novel virus was alarming. As Noyes and colleagues point out, although enhanced surveillance for influenza-associated hospitalizations is conducted in some areas, measures of illness severity were not routinely part of influenza surveillance activities throughout New York State. In their description of sentinel surveillance in 6 hospitals in New York State, Noyes and colleagues highlight 2 realities of influenza surveillance that were observed in nearly all jurisdictions affected by the pandemic. First, there is a need for comprehensive influenza surveillance data, which include some measures of disease severity, such as hospitalizations or deaths, in addition to routine measures of disease burden through outpatient encounters for influenza-like illness.8 Second, local data are usually the best data.
Local surveillance data that were timely and specific to areas affected by the pandemic often provided the best information on which to base local interventions. Establishing this surveillance network in the midst of the pandemic required substantial cooperation between the State Department of Health and partners in healthcare, particularly the infection control personnel who collected surveillance data in each hospital. |
Influenza-associated mortality among children - United States: 2007-2008
Peebles PJ , Dhara R , Brammer L , Fry AM , Finelli L . Influenza Other Respir Viruses 2011 5 (1) 25-31 BACKGROUND: Since October 2004, pediatric influenza-associated deaths have been a nationally notifiable condition. To further investigate the bacterial organisms that may have contributed to death, we systematically collected information about bacterial cultures collected at non-sterile sites and about the timing of Staphylococcus aureus specimen collection relative to hospital admission. METHODS: We performed a retrospective, descriptive study of all influenza-associated pediatric deaths reported in the United States during the 2007-2008 influenza season. RESULTS: During the 2007-2008 influenza season, 88 influenza-associated pediatric deaths were reported. The median age was 5 years (range, 29 days to 17 years); 48% were <5 years of age. The median time from symptom onset to death was 4 days (range 0-64 days). S. aureus was identified at a sterile site or at a non-sterile site in 20 (35%) of the 57 children with specimens collected from these sites; in 17 (85%) of these children, specimens yielding S. aureus were obtained within three days of inpatient admission. These 17 children were older (10 versus 4 years, median; P < 0.05) and less likely to have a high-risk medical condition (P < 0.05) than children with cultures from the designated sites that did not grow S. aureus. CONCLUSIONS: S. aureus continues to be the most common bacterium isolated from children with influenza-associated mortality. S. aureus isolation was associated with older age and the absence of high-risk medical conditions. Healthcare providers should consider co-infection with S. aureus when empirically treating children with influenza and severe respiratory illness. |
The future of HIV testing
Branson BM . J Acquir Immune Defic Syndr 2010 55 S102-S105 HIV testing is the essential entry point for both treatment and prevention. The need to identify acute HIV infection (the period immediately after HIV acquisition, when persons are most infectious) and HIV-2 infection, which does not respond to many first-line antiretroviral agents, poses challenges for the traditional algorithm of Western blot confirmation after a repeatedly reactive antibody screening test. Immunoassays that detect antibodies earlier, tests for HIV RNA, and combination assays that screen simultaneously for both p24 antigen and HIV antibody are now approved for HIV diagnosis by the Food and Drug Administration. A revised testing algorithm can address the challenges posed by acute infection, HIV-2 infection, and the shortcomings of the Western blot. These new diagnostic strategies will allow earlier, more accurate identification of infected persons so that they can benefit from effective treatment, and will also enhance the ability to focus prevention efforts where HIV transmission is most active. |
Design and implementation of a China comprehensive AIDS response programme (China CARES), 2003-08
Han M , Chen Q , Hao Y , Hu Y , Wang D , Gao Y , Bulterys M . Int J Epidemiol 2010 39 Suppl 2 ii47-55 BACKGROUND: Prior to 2003, there was limited capacity for an HIV/AIDS response in China. In early 2003, China launched a 5-year China Comprehensive AIDS Response Programme (China CARES) to contain the spread of HIV infection and reduce its impact. This article describes the China CARES' practices and experiences. METHODS: China CARES covered 83.3 million people in 127 programme sites chosen from 28 provinces based on HIV prevalence. Each China CARES site was required to carry out surveillance and surveys to understand the local HIV/AIDS epidemic, to deliver primary interventions to reduce new HIV infections among and from high-risk groups, to prevent mother-to-child transmission, to treat AIDS patients with antiretroviral medicines and to provide support services to families affected by HIV/AIDS. Data were collected to monitor and evaluate implementation. RESULTS: HIV/AIDS prevention knowledge and awareness improved significantly in China CARES sites from <30% in 2004 to 86% in 2008. The number of persons tested for HIV increased by 67% between 2005 and 2007 from 1.5 to 2.5 million. China CARES enrolled 23 000 patients in anti-retroviral treatment and supported 6007 AIDS orphans. Among pregnant women, 81.8% received counselling and 75.8% received HIV testing during antenatal care, while 92.9% of HIV-infected pregnant women and 85.5% of their newborns received anti-retroviral prophylaxis. During the project period, no known HIV transmissions occurred through blood transfusions. CONCLUSION: China CARES has facilitated AIDS prevention, treatment and care in resource-poor, rural and ethnic minority areas of China. |
Mosquito vectors of West Nile virus during an epizootic outbreak in Puerto Rico
Barrera R , MacKay A , Amador M , Vasquez J , Smith J , Diaz A , Acevedo V , Caban B , Hunsperger EA , Munoz-Jordan JL . J Med Entomol 2010 47 (6) 1185-1195 The purpose of this investigation was to identify the mosquito (Diptera: Culicidae) vectors of West Nile virus (WNV; family Flaviviridae, genus Flavivirus) during an epizootic WNV outbreak in eastern Puerto Rico in 2007. In June 2006, 12 sentinel chicken pens with five chickens per pen were deployed in six types of habitats: herbaceous wetlands, mangrove forests, deciduous forests, evergreen forests, rural areas, and urban areas. Once WNV seroconversion in chickens was detected in June 2007, we began trapping mosquitoes using Centers for Disease Control and Prevention (CDC) miniature (light/CO2-baited) traps, CMT-20 collapsible mosquito (CO2- and ISCA SkinLure-baited) traps, and CDC gravid (hay infusion-baited) traps. We placed the CDC miniature traps both 2-4 m and >30 m from the chicken pens, the collapsible traps 2-4 m from the pens, and the gravid traps in backyards of houses with sentinel chicken pens and in a wetland adjacent to an urban area. We found numerous blood-engorged mosquitoes in the traps nearest to the sentinel chickens and reasoned that any such mosquitoes with a disseminated WNV infection likely served as vectors for the transmission of WNV to the sentinels. We used reverse transcriptase-polymerase chain reaction and virus isolation (in C6/36 cells) on pools of heads, thoraxes/abdomens, and legs of collected blood-engorged mosquitoes to determine whether the mosquitoes carried WNV. We detected WNV-disseminated infections in and obtained WNV isolates from Culex nigripalpus Theo (minimum infection rate [MIR] 1.1-9.7/1,000), Culex bahamensis Dyar and Knab (MIR 1.8-6.0/1,000), and Aedes taeniorhynchus (Wied.) (MIR 0.34-0.36/1,000). 
WNV was also identified in and isolated from the pool of thoraxes and abdomens of Culex quinquefasciatus Say (4.17/1,000) and identified in one pool of thoraxes and abdomens of Culex habilitator Dyar and Knab (13.39/1,000). Accumulated evidence since 2002 suggests that WNV has not become endemic in Puerto Rico. |
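A minimum infection rate of the kind reported above can be computed directly from pool-testing counts. The sketch below is a generic illustration rather than the authors' code; the counts in the example are hypothetical, and it uses the standard MIR assumption of at most one infected mosquito per positive pool.

```python
def minimum_infection_rate(positive_pools, mosquitoes_tested, per=1000):
    """Minimum infection rate (MIR): assumes at most one infected
    mosquito per positive pool, expressed per `per` mosquitoes tested."""
    if mosquitoes_tested <= 0:
        raise ValueError("mosquitoes_tested must be positive")
    return per * positive_pools / mosquitoes_tested

# Hypothetical counts: 2 WNV-positive pools from 560 mosquitoes tested
print(round(minimum_infection_rate(2, 560), 2))  # -> 3.57
```

Because each positive pool is counted once, the MIR is a lower bound; maximum-likelihood pooled estimators give higher values when infection is not rare.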
Mode of action for natural products isolated from essential oils of two trees is different from available mosquito adulticides
McAllister JC , Adams MF . J Med Entomol 2010 47 (6) 1123-1126 Insecticidal properties of natural products may present alternatives to the use of synthetic molecule pesticides that are of diminishing effectiveness due to resistance. Three compounds, thymoquinone, nootkatone, and carvacrol, components of Alaska yellow cedar, Chamaecyparis nootkatensis (D. Don) Spach, and incense cedar, Calocedrus decurrens (Torr.), essential oils, have been shown to have biological activity against a variety of mosquito and tick species. Although these components act as both repellents and insecticides, how they function in either capacity is unknown. Their utility as mosquito control insecticides would be greatly increased if their mode of action were not the same as that of currently used commercial products. This study compared the lethal dosages for nootkatone, carvacrol, and thymoquinone by using colony strains of Anopheles gambiae Giles with known mutations at three different target sites. The altered target sites evaluated were the sodium channel para-locus mutation (L1014F kdr) that confers permethrin resistance, the ACE-1 gene that confers organophosphate and carbamate resistance, and a gamma-aminobutyric acid receptor mutation of the Rdl locus conferring dieldrin resistance. Significant increases in lethal dose were not observed in any of the mosquito strains for any of the compounds tested compared with the doses required of chemicals with known modes of action at the mutated sites. Although the mode of action was not determined, this screening study indicates that none of these compounds interact at the target sites represented in the test mosquito strains. These compounds therefore appear to have a mode of action different from that of chemicals currently used in mosquito control. |
Spatio-temporal patterns of road traffic noise pollution in Karachi, Pakistan
Mehdi MR , Kim M , Seong JC , Arsalan MH . Environ Int 2011 37 (1) 97-104 We studied the spatial and temporal patterns of noise exposure due to road traffic in Karachi City, Pakistan, and found that levels of noise were generally higher during mornings and evenings because of the commuting pattern of Karachi residents. This study found the average value of noise levels to be over 66 dB, which could cause serious annoyance according to the World Health Organization (WHO) outdoor noise guidelines. Maximum peak noise was over 101 dB, approaching 110 dB, the level associated with possible hearing impairment according to the WHO guidelines. We found that noise pollution is not an environmental problem reserved for developed countries, but occurs in developing countries as well. For this reason, steps might be required to reduce noise levels caused by road traffic. |
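Average sound levels like those reported above cannot be computed arithmetically on the dB values themselves; an equivalent continuous level (Leq) averages on the linear energy scale. A minimal sketch, with made-up readings:

```python
import math

def equivalent_level_db(levels_db):
    """Energy-average sound levels (Leq): convert dB to linear power,
    take the arithmetic mean, and convert back to dB."""
    powers = [10 ** (level / 10) for level in levels_db]
    return 10 * math.log10(sum(powers) / len(powers))

# Identical readings average to themselves...
print(round(equivalent_level_db([66, 66, 66]), 1))  # -> 66.0
# ...but a single loud peak dominates the energy average
print(round(equivalent_level_db([66, 66, 101]), 1))  # -> 96.2
```

This is why a few loud peaks (such as the 101 dB maximum) can pull an area's equivalent level well above its typical reading.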
Pesticide exposure among pregnant women in Jerusalem, Israel: results of a pilot study
Berman T , Hochner-Celnikier D , Barr DB , Needham LL , Amitai Y , Wormser U , Richter E . Environ Int 2011 37 (1) 198-203 BACKGROUND: Pesticides have been shown to disrupt neurodevelopment in laboratory animals and in human populations. To date, there have been no studies on exposure to pesticides in pregnant women in Israel, despite reports of widespread exposure in other populations of pregnant women and the importance of evaluating exposure in this susceptible sub-population. METHODS: We measured urinary concentrations of organophosphorus (OP) insecticide metabolites and plasma concentrations of OP and other pesticides in 20 pregnant women, recruited in Jerusalem, Israel, in 2006, and collected questionnaire data on demographic factors and consumer habits from these women. We compared geometric mean concentrations in subgroups using the Mann-Whitney U-test for independent samples. We compared creatinine-adjusted OP pesticide metabolite concentrations, as well as plasma pesticide concentrations, with other populations of pregnant women. RESULTS: Creatinine-adjusted total dimethyl (DM) metabolite concentrations were between 4 and 6 times higher in this population compared to other populations of pregnant women in the United States while total diethyl (DE) metabolite concentrations were lower. Dimethylphosphate (DMP) was detected in 74% of the urine samples whereas dimethylthiophosphate (DMTP) was detected in 90% of the urine samples. The carbamate bendiocarb was detected in 89% of the plasma samples, while the OP insecticide chlorpyrifos was detected in 42% of the samples. Mean plasma concentrations of bendiocarb and chlorpyrifos in our sample were 4.4 and 3.9 times higher, respectively, than those of an urban minority cohort from New York City. Twelve women (63%) reported using some form of household pest control during their pregnancy and five (26%) reported using household pest control during the past month. 
Women with a graduate degree had significantly higher geometric mean concentrations of total urinary DM metabolites compared to other women (P=0.006). Finally, one woman in the study had exceptionally high concentrations of DMP, DMTP, and DMDTP compared to the other women in the study, despite reporting no current occupational exposure to OP pesticides and no other significant exposure sources. CONCLUSIONS: Pregnant women in the Jerusalem area are exposed to OP pesticides and to the carbamate pesticide bendiocarb. It is unclear why total DM metabolite concentrations were much higher in this population compared to other populations of pregnant women in the United States and the Netherlands. The finding of very high DM metabolite concentrations in one woman, who reported being moved from her regular laboratory work to administrative work upon becoming pregnant, raises questions about the adequacy of measures to protect pregnant women from pesticide exposures during pregnancy. |
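Creatinine adjustment of the kind used above corrects spot-urine metabolite concentrations for urine dilution by dividing by urinary creatinine. The function and the numbers below are illustrative only, not values from the study:

```python
def creatinine_adjusted(analyte_ug_per_l, creatinine_g_per_l):
    """Express a urinary analyte per gram of creatinine:
    (ug analyte / L urine) / (g creatinine / L urine) -> ug/g creatinine."""
    if creatinine_g_per_l <= 0:
        raise ValueError("creatinine concentration must be positive")
    return analyte_ug_per_l / creatinine_g_per_l

# Hypothetical spot sample: 12 ug/L of a DM metabolite, 1.5 g/L creatinine
print(creatinine_adjusted(12.0, 1.5))  # -> 8.0 (ug/g creatinine)
```

Adjustment makes concentrations comparable across samples of varying dilution, which is why cross-population comparisons such as those above are reported per gram of creatinine.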
Exposure to polyfluoroalkyl chemicals during pregnancy is not associated with offspring age at menarche in a contemporary British cohort
Christensen KY , Maisonet M , Rubin C , Holmes A , Calafat AM , Kato K , Flanders WD , Heron J , McGeehin MA , Marcus M . Environ Int 2011 37 (1) 129-35 INTRODUCTION: Polyfluoroalkyl chemicals (PFCs) are commercially synthesized chemicals used in consumer products. Exposure to certain PFCs is widespread, and some PFCs may act as endocrine disruptors. We used data from the Avon Longitudinal Study of Parents and Children (ALSPAC) in the United Kingdom to conduct a nested case-control study examining the association between age at menarche and exposure to PFCs during pregnancy. METHODS: Cases were selected from female offspring in the ALSPAC who reported menarche before the age of 11.5 years (n = 218), and controls were a random sample of remaining girls (n = 230). Serum samples taken from the girls' mothers during pregnancy (1991-1992) were analyzed for 8 PFCs using on-line solid-phase extraction coupled to isotope dilution high-performance liquid chromatography-tandem mass spectrometry. Logistic regression was used to determine the association between maternal serum PFC concentrations and odds of earlier age at menarche. RESULTS: PFOS and PFOA were the predominant PFCs (median serum concentrations of 19.8 ng/mL and 3.7 ng/mL). All but one PFC were detectable in most samples. Total PFC concentration varied by number of births (inverse association with birth order; p-value < 0.0001) and race of the child (higher among whites; p-value = 0.03). The serum concentrations of carboxylates were associated with increased odds of earlier age at menarche; concentrations of perfluorooctane sulfonamide, the sulfonamide esters and sulfonates were all associated with decreased odds of earlier age at menarche. However, all confidence intervals included the null value of 1.0. CONCLUSIONS: ALSPAC study participants had nearly ubiquitous exposure to most PFCs examined, but PFC exposure did not appear to be associated with altered age at menarche of their offspring. |
Under-mining health: environmental justice and mining in India
Saha S , Pattanayak SK , Sills EO , Singha AK . Health Place 2010 17 (1) 140-8 Despite the potential for economic growth, extractive mineral industries can impose negative health externalities in mining communities. We estimate the size of these externalities by combining household interviews with mine location and estimating statistical functions of respiratory illness and malaria among villagers living along a gradient of proximity to iron-ore mines in rural India. Two-stage regression modeling with cluster corrections suggests that villagers living closer to mines had higher respiratory illness and malaria-related workday loss, but the evidence for mine workers is mixed. These findings contribute to the thin empirical literature on environmental justice and public health in developing countries. |
Nonhygienic behavior, knowledge, and attitudes among interactive splash park visitors
Nett RJ , Toblin R , Sheehan A , Huang WT , Baughman A , Carter K . J Environ Health 2010 73 (4) 8-14 Nonhygienic behavior likely contributed to three recreational waterborne illness (RWI) outbreaks at Idaho splash parks. The study described in this article examined the influence of signage and hygiene attendant presence on rates of nonhygienic behavior among children at splash parks and knowledge and attitudes of their adult supervisors. Investigators observed children for nonhygienic behaviors at four Idaho splash parks, two of which had signage and hygiene attendants. Supervisors (N = 551) completed an eight-item survey. Individually observed children (N = 145) were often seen exposing their buttocks to splash feature water and placing an open mouth to water. The rate of nonhygienic behaviors was not lower at parks with signage or staff. Supervisors reported infrequently bathing children before splash park entry. Signage and hygiene attendants do not adequately limit nonhygienic behaviors at splash parks, and supervisors have insufficient understanding of RWI. These findings have implications for developing splash park regulations and RWI prevention efforts. |
Feasibility and reliability of classifying gross motor function among children with cerebral palsy using population-based record surveillance
Benedict RE , Patz J , Maenner MJ , Arneson CL , Yeargin-Allsopp M , Doernberg NS , Van Naarden Braun K , Kirby RS , Durkin MS . Paediatr Perinat Epidemiol 2011 25 (1) 88-96 For conditions with wide-ranging consequences, such as cerebral palsy (CP), population-based surveillance provides an estimate of the prevalence of case status but only the broadest understanding of the impact of the condition on children, families or society. Beyond case status, information regarding health, functional skills and participation is necessary to fully appreciate the consequences of the condition. The purpose of this study was to assess the feasibility and reliability of enhancing population-based surveillance by classifying gross motor function (GMF) from information available in medical records of children with CP. We assessed inter-rater reliability of two GMF classification methods, one the Gross Motor Function Classification System (GMFCS) and the other a 3-category classification of walking ability: (1) independently, (2) with handheld mobility device, or (3) limited or none. Two qualified clinicians independently reviewed abstracted evaluations from medical records of 8-year-old children residing in southeast Wisconsin, USA, who were identified as having CP (n = 154) through the Centers for Disease Control and Prevention's Autism and Developmental Disabilities Monitoring Network. Ninety per cent (n = 138) of the children with CP had information in the record after age 4 years and 108 (70%) had adequate descriptions of gross motor skills to classify using the GMFCS. Agreement was achieved on 75.0% of the GMFCS ratings (simple kappa = 0.67 [95% CI 0.57, 0.78]; weighted kappa = 0.83 [95% CI 0.77, 0.89]). Among case children for whom walking ability could be classified (n = 117), approximately half walked independently without devices and one-third had limited or no walking ability. 
Across walking ability categories, agreement was reached for 94% (simple kappa = 0.90 [95% CI 0.82, 0.96]; weighted kappa = 0.94 [95% CI 0.89, 0.98]). Classifying GMF in the context of active records-based surveillance is feasible and reliable. Future surveillance efforts that include functional level among children with cerebral palsy may provide important information for monitoring the impact of the condition for programmatic and policy purposes. |
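The simple and weighted kappa statistics reported above measure chance-corrected agreement between the two reviewers. An unweighted Cohen's kappa can be sketched as follows; the ratings in the example are invented for illustration, not study data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa: observed agreement corrected for the
    agreement expected by chance from each rater's marginal frequencies."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a.keys() | freq_b.keys()) / n ** 2
    return (observed - expected) / (1 - expected)

# Invented GMFCS-style levels assigned to 8 children by two reviewers
reviewer_1 = [1, 1, 2, 2, 3, 3, 1, 2]
reviewer_2 = [1, 1, 2, 3, 3, 3, 1, 1]
print(round(cohens_kappa(reviewer_1, reviewer_2), 2))  # -> 0.63
```

Weighted kappa extends this by giving partial credit to near-miss ratings on ordered scales such as the GMFCS, which is why the weighted values above exceed the simple ones.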
Epidemiology of HIV in the United States
Lansky A , Brooks JT , DiNenno E , Heffelfinger J , Hall HI , Mermin J . J Acquir Immune Defic Syndr 2010 55 S64-S68 BACKGROUND: The United States has a comprehensive system of HIV surveillance, including case reporting and disease staging, estimates of incidence, behavioral and clinical indicators, and monitoring of HIV-related mortality. These data are used to monitor the epidemic and to better design, implement, and evaluate public health programs. METHODS: We describe HIV-related surveillance systems and review recent data. RESULTS: There are more than 1.1 million people living with HIV in the United States, and approximately 56,000 new HIV infections annually. Risk behavior data show that 47% of men who have sex with men engaged in unprotected anal intercourse in the past year, and 33% of injection drug users had shared syringes. One third (32%) of people diagnosed with HIV in 2008 were diagnosed with AIDS within 12 months, indicating missed opportunities for care and prevention. An estimated 72% of HIV-diagnosed persons received HIV medical care within 4 months of initial diagnosis. CONCLUSIONS: Conducting accurate and comprehensive HIV surveillance is critical for measuring progress toward the goals of the 2010 National HIV/AIDS Strategy: reduced HIV incidence, increased access to care, and improvements in health equity. |
Heterosexual anal sex experiences among Puerto Rican and black young adults
Carter M , Henry-Moss D , Hock-Long L , Bergdall A , Andes K . Perspect Sex Reprod Health 2010 42 (4) 267-74 CONTEXT: Heterosexual anal sex is not uncommon in the United States, and it poses risk for STDs. However, who engages in it and why are not well understood, particularly among young adults. METHODS: In 2006-2008, data on sexual health-related topics were collected in surveys (483 respondents) and qualitative interviews (70 participants) with black and Puerto Rican 18-25-year-olds in Hartford and Philadelphia. Bivariate and multivariate analyses of survey data assessed predictors of anal sex with the most recent serious heterosexual partner. Interview transcripts were analyzed for anal sex experiences and reasons for and against engaging in this behavior. RESULTS: Some 34% of survey respondents had had anal sex; this behavior was more common with serious partners than with casual partners (22% vs. 8%). Black respondents were less likely than Puerto Ricans to report anal sex (odds ratio, 0.3); women were more likely to do so than were men (2.9). In the qualitative cohort, perceptions of anal sex as painful and unappealing were the predominant reasons for not having anal sex, whereas sexual pleasure and, in serious relationships, intimacy were the main reasons for engaging in it. Condom use during anal sex was rare and was motivated by STD or hygiene concerns. CONCLUSIONS: Heterosexual anal sex is not an infrequent behavior and should be considered in a broad sexual health context, not simply as an indicator of STD risk. Health providers should address it openly and, when appropriate, as a positive sexual and emotional experience. |
Daily participation in sports and students' sexual activity
Habel MA , Dittus PJ , De Rosa CJ , Chung EQ , Kerndt PR . Perspect Sex Reprod Health 2010 42 (4) 244-250 CONTEXT: Previous studies suggest that student athletes may be less likely than nonathletes to engage in sexual behavior. However, few have explored sexual risk behavior among athletes in early adolescence. METHODS: In 2005, a sample of 10,487 students in 26 Los Angeles public middle and high schools completed a self-administered survey that asked about their demographic characteristics, sports participation, sexual behaviors and expectations, and parental relationships. Chi-square analyses compared reported levels of daily participation in sports, experience with intercourse, experience with oral sex and condom use at last intercourse by selected characteristics. Predictors of sexual experience and condom use were assessed in multivariate logistic regression analyses. RESULTS: One-third of students reported daily participation in sports. This group had higher odds of ever having had intercourse and ever having had oral sex than their peers who did not play a sport daily (odds ratios, 1.2 and 1.1, respectively). The increases in risk were greater for middle school sports participants than for their high school counterparts (1.5 and 1.6, respectively). Among sexually experienced students, daily sports participants also had elevated odds of reporting condom use at last intercourse (1.4). CONCLUSIONS: Students as young as middle school age who participate in sports daily may have an elevated risk for STDs and pregnancy. Health professionals should counsel middle school athletes about sexual risk reduction, given that young students may find it particularly difficult to obtain contraceptives, STD testing and prevention counseling. |
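The odds ratios above come from logistic regression models, but the crude (unadjusted) version is simply the cross-product ratio of a 2x2 table. A sketch with hypothetical counts, not the study's data:

```python
def crude_odds_ratio(exposed_cases, exposed_noncases,
                     unexposed_cases, unexposed_noncases):
    """Cross-product odds ratio from a 2x2 table; unlike the study's
    multivariate models, this adjusts for no covariates."""
    return (exposed_cases * unexposed_noncases) / (exposed_noncases * unexposed_cases)

# Hypothetical table: sexual experience among daily athletes vs. non-athletes
print(crude_odds_ratio(300, 700, 250, 700))  # -> 1.2
```

Adjusted odds ratios from logistic regression, as in the study, can differ from the crude value when the exposure groups differ on other predictors.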
Sustained reduction in the clinical incidence of methicillin-resistant Staphylococcus aureus colonization or infection associated with a multifaceted infection control intervention
Ellingson K , Muder RR , Jain R , Kleinbaum D , Feng PJ , Cunningham C , Squier C , Lloyd J , Edwards J , Gebski V , Jernigan J . Infect Control Hosp Epidemiol 2010 32 (1) 1-8 OBJECTIVE: To assess the impact and sustainability of a multifaceted intervention to prevent methicillin-resistant Staphylococcus aureus (MRSA) transmission implemented in 3 chronologically overlapping phases at 1 hospital. DESIGN: Interrupted time-series analyses. SETTING: A Veterans Affairs hospital in the northeastern United States. PATIENTS AND PARTICIPANTS: Individuals admitted to acute care units from October 1, 1999, through September 30, 2008. To calculate the monthly clinical incidence of MRSA colonization or infection, the number of MRSA-positive cultures obtained from a clinical site more than 48 hours after admission among patients with no MRSA-positive clinical cultures during the previous year was divided by patient-days at risk. Secondary outcomes included clinical incidence of methicillin-sensitive S. aureus colonization or infection and incidence of MRSA bloodstream infections. INTERVENTIONS: The intervention, implemented in a surgical ward beginning October 2001, in a surgical intensive care unit beginning October 2003, and in all acute care units beginning July 2005, included systems and behavior change strategies to increase adherence to infection control precautions (eg, hand hygiene and active surveillance culturing for MRSA). RESULTS: Hospital-wide, the clinical incidence of MRSA colonization or infection decreased after initiation of the intervention in 2001, compared with the period before intervention (P = .002), and decreased by 61% (P < .001) in the 7-year postintervention period. In the postintervention period, the hospital-wide incidence of MRSA bloodstream infection decreased by 50% (P = .02), and the proportion of S. aureus isolates that were methicillin resistant decreased by 30% (P < .001). 
CONCLUSIONS: Sustained decreases in hospital-wide clinical incidence of MRSA colonization or infection, incidence of MRSA bloodstream infection, and proportion of S. aureus isolates resistant to methicillin followed implementation of a multifaceted prevention program at 1 Veterans Affairs hospital. Findings suggest that interventions designed to prevent transmission can impact endemic antimicrobial resistance problems. |
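The clinical incidence measure defined in the abstract (first MRSA-positive clinical culture more than 48 hours after admission, divided by patient-days at risk) is an incidence density. A generic sketch with invented numbers, not the hospital's data:

```python
def incidence_per_patient_days(incident_cases, patient_days_at_risk, per=1000):
    """Incidence density: incident cases per `per` patient-days at risk."""
    if patient_days_at_risk <= 0:
        raise ValueError("patient_days_at_risk must be positive")
    return per * incident_cases / patient_days_at_risk

# Hypothetical month: 6 incident MRSA cases over 4,000 patient-days at risk
print(incidence_per_patient_days(6, 4000))  # -> 1.5
```

Using patient-days at risk (rather than admissions) in the denominator lets rates be compared across units and periods with different lengths of stay.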
Clinical incidence of methicillin-resistant Staphylococcus aureus (MRSA) colonization or infection as a proxy measure for MRSA transmission in acute care hospitals
Feng PJ , Kallen AJ , Ellingson K , Muder R , Jain R , Jernigan JA . Infect Control Hosp Epidemiol 2010 32 (1) 20-5 BACKGROUND: The incidence of methicillin-resistant Staphylococcus aureus (MRSA) colonization or infection has been used as a proxy measure for MRSA transmission, but incidence calculations vary depending on whether active surveillance culture (ASC) data are included. OBJECTIVE: To evaluate the relationship between incidences of MRSA colonization or infection calculated with and without ASCs in intensive care units and non-intensive care units. SETTING: A Veterans Affairs medical center. METHODS: From microbiology records, incidences of MRSA colonization or infection were calculated with and without ASC data. Correlation coefficients were calculated for the 2 measures, and Poisson regression was used to model temporal trends. A Poisson interaction model was used to test for differences in incidence trends modeled with and without ASCs. RESULTS: The incidence of MRSA colonization or infection calculated with ASCs was 4.9 times higher than that calculated without ASCs. Correlation coefficients for incidences with and without ASCs were 0.42 for intensive care units, 0.59 for non-intensive care units, and 0.48 hospital-wide. Trends over time for the hospital were similar with and without ASCs (incidence rate ratio with ASCs, 0.95 [95% confidence interval, 0.93-0.97]; incidence rate ratio without ASCs, 0.95 [95% confidence interval, 0.92-0.99]; P = .68). Without ASCs, 35% of prevalent cases were falsely classified as incident. CONCLUSIONS: At 1 Veterans Affairs medical center, the incidence of MRSA colonization or infection calculated solely on the basis of clinical culture results commonly misclassified incident cases and underestimated incidence, compared with measures that included ASCs; however, temporal changes were similar. 
These findings suggest that incidence measured without ASCs may not accurately reflect the magnitude of MRSA transmission but may be useful for monitoring transmission trends over time, a crucial element for evaluating the impact of prevention activities. |
Diarrhea: case definition and guidelines for collection, analysis, and presentation of immunization safety data
Gidudu J , Sack DA , Pina M , Hudson MJ , Kohl KS , Bishop P , Chatterjee A , Chiappini E , Compingbutra A , da Costa C , Fernandopulle R , Fischer TK , Haber P , Masana W , de Menezes MR , Kang G , Khuri-Bulos N , Killion LA , Nair C , Poerschke G , Rath B , Salazar-Lindo E , Setse R , Wenger P , Wong VC , Zaman K . Vaccine 2010 29 (5) 1053-71 Diarrhea, also spelled diarrhoea, is a common medical condition that is characterized by increased frequency of bowel movements and increased liquidity of stool [1], [2]. Although acute diarrhea is typically self-limiting, it can be severe and can lead to profound dehydration and, in turn, to abnormally low blood volume, low blood pressure, and damage to the kidneys, heart, liver, brain, and other organs. Acute diarrhea remains a major cause of infant mortality around the world: over 2 million deaths are attributed to acute diarrhea each year worldwide, most of them in the developing world [3], [4], [5]. Children and the elderly are particularly prone to dehydration secondary to diarrhea. | Diarrhea has been defined over time by various scientific groups and health organizations in different ways, such as: “the passage of loose unformed stools” [6] or “three looser-than-normal stools in a 24-h period” [7], [8], with emphasis on the consistency of stools rather than the number [9]. In epidemiological studies, diarrhea is usually defined as the passage of three or more loose or watery stools in a 24-h period, a loose stool being one that takes the shape of its container [8], [9], [10], [11], [12], [13], [14], [15], [16]. |
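The epidemiological definition above ("three or more loose or watery stools in a 24-h period") can be operationalized as a sliding-window check over timestamped stool events. This is an illustrative sketch, not part of the Brighton case definition itself:

```python
from datetime import datetime, timedelta

def meets_epi_definition(stool_times, threshold=3, window=timedelta(hours=24)):
    """True if any 24-h window starting at a loose/watery stool
    contains `threshold` or more such stools."""
    times = sorted(stool_times)
    for i, start in enumerate(times):
        if sum(1 for t in times[i:] if t - start <= window) >= threshold:
            return True
    return False

day1 = datetime(2010, 6, 1)
# Three loose stools within 12 hours meet the definition...
print(meets_epi_definition(
    [day1, day1 + timedelta(hours=5), day1 + timedelta(hours=12)]))  # -> True
# ...but three spread over three days do not
print(meets_epi_definition(
    [day1, day1 + timedelta(days=1, hours=2), day1 + timedelta(days=2, hours=4)]))  # -> False
```

Anchoring each candidate window at an observed stool is sufficient here, since any qualifying 24-h window can be shifted to start at its earliest stool.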
Characterization of public health alerts and their suitability for alerting in electronic health record systems
Garrett NY , Mishra N , Nichols B , Staes CJ , Akin C , Safran C . J Public Health Manag Pract 2011 17 (1) 77-83 Public health agencies including federal, state, and local governments routinely send out public health advisories and alerts via e-mail and text messages to health care providers to increase awareness of public health events and situations. Agencies must ensure that practitioners have timely and accessible information at the point of care. Electronic health record (EHR) systems have the potential to alert physicians of emerging health conditions deemed important for public health at the most critical time of need. To understand how public health agencies can leverage existing alerting mechanisms in EHR systems, it is important to understand characteristics of public health alerts to determine their suitability for alerting in EHR systems. The authors conducted a review and analysis of public health alerts for a 3-year period to identify critical data attributes necessary to support public health alerting in EHR systems. The alerts were restricted to those most relevant for clinical care. The results showed that there is an opportunity for disseminating actionable information to clinical practitioners at the point of care to guide care and reporting. Public health alerts in EHR systems can be useful for reporting, recommending specific tests, and suggesting secondary prevention. |
Automated surveillance of Clostridium difficile infections using BioSense
Benoit SR , McDonald LC , English R , Tokars JI . Infect Control Hosp Epidemiol 2010 32 (1) 26-33 OBJECTIVE: To determine the feasibility of using electronic laboratory and admission-discharge-transfer data from BioSense, a national automated surveillance system, to apply new modified Clostridium difficile infection (CDI) surveillance definitions and calculate overall and facility-specific rates of disease. DESIGN: Retrospective, multicenter cohort study. SETTING: Thirty-four hospitals sending inpatient, emergency department, and/or outpatient data to BioSense. METHODS: Laboratory codes and text-parsing methods were used to extract C. difficile-positive toxin assay results from laboratory data sent to BioSense during the period from January 1, 2007, through June 30, 2008; these were merged with administrative records to determine whether cases were community associated or healthcare onset, as well as patient-day data for rate calculations. A patient was classified as having hospital-onset CDI if he or she had a C. difficile toxin-positive result on a stool sample collected 3 or more days after admission and community-onset CDI if the specimen was collected less than 3 days after admission or the patient was not hospitalized. RESULTS: A total of 4,585 patients from 34 hospitals in 12 states had C. difficile-positive assay results. More than half (53.0%) of the cases were community-onset, and 30.8% of these occurred in patients who were recently hospitalized. The overall rate of healthcare-onset CDI was 7.8 cases per 10,000 patient-days, with a range among facilities of 1.5-27.8 cases per 10,000 patient-days. CONCLUSIONS: Electronic laboratory data sent to the BioSense surveillance system were successfully used to produce disease rates of CDI comparable to those of other studies, which shows the feasibility of using electronic laboratory data to track a disease of public health importance. |
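The surveillance definition summarized above (hospital-onset if the toxin-positive stool sample was collected 3 or more days after admission) and the per-10,000-patient-day rate are simple enough to sketch. This is an illustration of the definition as stated in the abstract, not the BioSense implementation.

```python
def classify_cdi(days_since_admission, hospitalized):
    """Classify a C. difficile toxin-positive result per the study definition:
    hospital-onset if the stool sample was collected 3 or more days after
    admission; community-onset otherwise (including non-hospitalized patients)."""
    if hospitalized and days_since_admission >= 3:
        return "hospital-onset"
    return "community-onset"

def rate_per_10k(cases, patient_days):
    # Facility rate expressed per 10,000 patient-days, as reported in the study.
    return cases / patient_days * 10_000

print(classify_cdi(5, hospitalized=True))   # hospital-onset
print(classify_cdi(1, hospitalized=True))   # community-onset
print(classify_cdi(10, hospitalized=False)) # community-onset
```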
Heat illness among high school athletes - United States, 2005-2009
Yard EE , Gilchrist J , Haileyesus T , Murphy M , Collins C , McIlvain N , Comstock RD . J Safety Res 2010 41 (6) 471-4 INTRODUCTION: Heat illness is a leading cause of death and disability among U.S. high school athletes. METHODS: To examine the incidence and characteristics of heat illness among high school athletes, CDC analyzed data from the National High School Sports-Related Injury Surveillance Study for the period 2005-2009. RESULTS: During 2005-2009, the 100 schools sampled reported a total of 118 heat illnesses among high school athletes resulting in ≥1 day of time lost from athletic activity, a rate of 1.6 per 100,000 athlete-exposures, and an average of 29.5 time-loss heat illnesses per school year. The average corresponds to a weighted average annual estimate of 9,237 illnesses nationwide. The highest rate of time-loss heat illness was among football players, 4.5 per 100,000 athlete-exposures, a rate 10 times higher than the average rate (0.4) for the eight other sports. Time-loss heat illnesses occurred most frequently during August (66.3%) and while practicing or playing football (70.7%). No deaths were reported. CONCLUSIONS: Consistent with guidelines from the National Athletic Trainers' Association, to reduce the risk for heat illness, high school athletic programs should implement heat-acclimatization guidelines (e.g., set limits on summer practice duration and intensity). All athletes, coaches, athletic trainers, and parents/guardians should be aware of the risk factors for heat illness, follow recommended strategies, and be prepared to respond quickly to symptoms of illness. Coaches also should continue to stress to their athletes the importance of maintaining proper hydration before, during, and after sports activities. IMPACT OF INDUSTRY: By implementing preventive recommendations and quickly recognizing and responding to heat illness, coaches, athletic trainers, and the sporting community can prevent future deaths. |
Minority HIV mutation detection in dried blood spots indicates high specimen integrity and reveals hidden archived drug resistance.
Wei X , Youngpairoj AS , Garrido C , Zahonero N , Corral A , de Mendoza C , Heneine W , Johnson JA , Garcia-Lerma JG . J Clin Virol 2010 50 (2) 148-52 BACKGROUND: Dried blood spots (DBS) could serve as an attractive, cost-effective alternative to plasma for HIV drug resistance testing. OBJECTIVES: To assess the utility and potential gain in genotypic information with sensitive testing of DBS compared to conventional bulk plasma genotyping, and examine the correlation of majority and minority-level resistance mutations in DBS with treatment history. STUDY DESIGN: Evaluate nucleic acids from the DBS of 33 antiretroviral-experienced subtype B-infected subjects for minority M41L, K65R, K70R, K103N, Y181C, M184V, and T215Y/F mutations by real-time PCR. Compare minority resistance mutations in DBS with bulk genotypes from the same DBS cards and available plasma specimens. RESULTS: All but one (50/51, 98%) mutation from the original plasma bulk sequencing were still detectable in the DBS after three years of storage. The one mutation not identified in DBS was also no longer detectable by bulk sequencing. Furthermore, sensitive testing found 12 additional drug resistance mutations at minority levels in the DBS of 11 (33%) patients. Six minority mutations were in the RNA compartment and six were detected only in the DNA compartment. Resistance was detected in the DBS RNA compartment only in cases where the associated drug was in use within one year of sample collection. CONCLUSIONS: Our ability to identify majority and additional minority-level resistance mutations demonstrated that DBS, if stored properly, is a high-integrity specimen type for conventional and sensitive drug resistance testing. Our data further support the global utility of DBS for drug resistance surveillance and clinical monitoring. |
Impact of second-tier testing on the effectiveness of newborn screening
Chace DH , Hannon WH . Clin Chem 2010 56 (11) 1653-5 The goal of newborn screening (NBS) for inherited disorders of metabolism is the early detection and confirmation of disease, thus enabling early medical intervention, treatment, and improved outcomes (1). Important characteristics of a screening method include analytical specificity and sensitivity, coupled with rapid, high throughput and timely reporting of abnormal results. Routine primary screening methods are designed to identify as many abnormal infants as possible, with diagnostic sensitivity favored over specificity for disorder detection. This approach not only increases the numbers of false-positive test results, thus adding to the cost of operating NBS programs, but also places unnecessarily increased stress, anxiety, and possibly parent–child dysfunction on families (2). As the number of disorders in the NBS test panels grows, however, so does the overall number of false-positive results, which has increased severalfold per true case (3). One solution to this problem is to use improved methods or to couple primary screening methods with second-tier tests that improve selectivity. | The use of tandem mass spectrometry (MS/MS) for detecting phenylketonuria is an example of an NBS method that improves detection as a primary screen while also being more selective than older, classic NBS methods such as fluorometry. In one study, MS/MS analysis of newborn spot samples of dried blood collected ≤24 h after birth was compared with fluorometric analysis of the same samples. Because of this early time of collection, the decision level for an increased phenylalanine concentration was lowered by the public health laboratory using fluorometry to ensure that no infants with phenylketonuria were missed. MS/MS analysis of the identical samples demonstrated that disease detection could be sustained while improving selectivity (4). 
The ability to measure multiple analytes in the same analysis enabled calculation of the phenylalanine/tyrosine molar ratio, which reduced false-positive rates 100-fold. This screen for phenylketonuria was the first instance of a new paradigm in NBS, in which both current screens could be improved and new screens could be added for other disorders, such as fatty acid oxidation defects and organic acidemias (5). The ability of MS/MS to improve efficacy without the need to collect a second sample further reduces the false-positive rate. |
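The two-tier logic described above — flag a sample only when phenylalanine is elevated AND the phenylalanine/tyrosine molar ratio is also elevated — can be sketched as a simple decision rule. The cutoffs below are illustrative placeholders, not the decision levels used by any actual screening program.

```python
def pku_screen(phe_umol_l, tyr_umol_l, phe_cutoff=120.0, ratio_cutoff=2.0):
    """Two-tier flag in the spirit of the editorial: refer a sample only if
    phenylalanine exceeds its cutoff and the Phe/Tyr molar ratio is also
    elevated. Both cutoffs here are hypothetical illustration values."""
    if phe_umol_l <= phe_cutoff:
        return False  # first tier: phenylalanine not elevated
    return phe_umol_l / tyr_umol_l > ratio_cutoff  # second tier: ratio check

# Elevated Phe with a normal ratio is not flagged; elevated Phe with an
# elevated ratio is.
print(pku_screen(200.0, 150.0))  # ratio ~1.3: not flagged
print(pku_screen(300.0, 60.0))   # ratio 5.0: flagged
```

The second tier is what cuts the false-positive rate: many causes of an isolated phenylalanine elevation leave tyrosine proportionally elevated too, so the ratio stays near normal.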
Determination of total homocysteine, methylmalonic acid, and 2-methylcitric acid in dried blood spots by tandem mass spectrometry
Turgeon CT , Magera MJ , Cuthbert CD , Loken PR , Gavrilov DK , Tortorelli S , Raymond KM , Oglesbee D , Rinaldo P , Matern D . Clin Chem 2010 56 (11) 1686-95 BACKGROUND: Newborn screening (NBS) for inborn errors of propionate, methionine, and cobalamin metabolism relies on finding abnormal concentrations of methionine and propionylcarnitine. These analytes are not specific for these conditions and lead to frequent false-positive results. More specific markers are total homocysteine (tHCY), methylmalonic acid (MMA), and methylcitric acid (MCA), but these markers are not detected by current NBS methods. To improve this situation, we developed a method for the detection of tHCY, MMA, and MCA in dried blood spots (DBSs) by liquid chromatography-tandem mass spectrometry (LC-MS/MS). METHODS: The analytes were extracted from a single 4.8-mm DBS punch with acetonitrile:water:formic acid (59:41:0.42) containing dithiothreitol and isotopically labeled standards (d(3)-MMA, d(3)-MCA, d(8)-homocystine). The extract was dried and treated with 3 N HCl in n-butanol to form butylesters. After evaporation of the butanol, the residue was reconstituted and centrifuged and the supernatant was subjected to LC-MS/MS analysis. Algorithms were developed to apply this method as an efficient and effective second-tier assay on samples with abnormal results by primary screening. RESULTS: The 99th percentiles determined from the analysis of 200 control DBSs for MMA, MCA, and HCY were 1.5, 0.5, and 9.8 μmol/L, respectively. Since 2005, prospective application of this second-tier analysis to 2.3% of all NBS samples led to the identification of 13 affected infants. CONCLUSIONS: Application of this assay reduced the false-positive rate and improved the positive predictive value of NBS for conditions associated with abnormal propionylcarnitine and methionine concentrations. |
Timing of maturation and predictors of Tanner stage transitions in boys enrolled in a contemporary British cohort
Monteilh C , Kieszak S , Flanders WD , Maisonet M , Rubin C , Holmes AK , Heron J , Golding J , McGeehin MA , Marcus M . Paediatr Perinat Epidemiol 2011 25 (1) 75-87 This study describes the timing of puberty in 8- to 14-year-old boys enrolled in the Avon Longitudinal Study of Parents and Children (ALSPAC) and identifies factors associated with earlier achievement of advanced pubic hair stages. Women were enrolled during pregnancy and their offspring were followed prospectively. We analysed self-reported pubic hair Tanner staging collected annually. We used survival models to estimate median age of attainment of pubic hair stage >1, stage >2 and stage >3 of pubic hair development. We also constructed multivariable logistic regression models to identify factors associated with earlier achievement of pubic hair stages. Approximately 5% of the boys reported Tanner pubic hair stage >1 at age 8; 99% of boys were at stage >1 by age 14. The estimated median ages of entry into stages of pubic hair development were 11.4 years [95% confidence interval (CI) 11.3, 11.4] for stage >1, 12.7 years [95% CI 12.7, 12.8] for stage >2 and 13.5 years [95% CI 13.5, 13.6] for stage >3. Predictors of younger age at Tanner stage >1 included low birthweight, younger maternal age at delivery and being taller at age 8. Associations were found between younger age at attainment of stage >2 and gestational diabetes and taller or heavier body size at age 8. Being taller or heavier at age 8 also predicted younger age at Tanner stage >3. The results give added support to the strong influence of pre-adolescent body size on male pubertal development; the tallest and heaviest boys at 8 years achieved each stage earlier and the shortest boys later. Ages at attainment of pubic hair Tanner stages in the ALSPAC cohort are similar to those reported in other European studies that were conducted during overlapping time periods. This cohort will continue to be followed for maturational information until age 17. 
|
Frequency of human bocavirus (HBoV) infection among children with febrile respiratory symptoms in Argentina, Nicaragua and Peru
Salmon-Mulanovich G , Sovero M , Laguna-Torres VA , Kochel TJ , Lescano AG , Chauca G , Sanchez JF , Rodriguez F , Parrales E , Ocana V , Barrantes M , Blazes DL , Montgomery JM . Influenza Other Respir Viruses 2011 5 (1) 1-5 BACKGROUND: Globally, respiratory infections are the primary cause of illness in developing countries, specifically among children; however, an etiological agent for many of these illnesses is rarely identified. OBJECTIVES: Our study aimed to estimate the frequency of human bocavirus (HBoV) infection among pediatric populations in Argentina, Nicaragua and Peru. METHODS: We conducted a cross-sectional study using stored samples of an influenza-like illness surveillance program. Irrespective of previous diagnosis, nasopharyngeal or nasal swab specimens collected during 2007 at three sites from patients younger than 6 years old were randomly selected and tested using real-time PCR. RESULTS: A total of 568 specimens from Argentina (185), Nicaragua (192) and Peru (191) were tested. The prevalence of HBoV was 10.8% (95% CI: 6.3; 15.3) in Argentina, 33.3% in Nicaragua (95% CI: 26.6; 40.1) and 25.1% in Peru (95% CI: 18.9; 31.3). CONCLUSIONS: These findings demonstrate circulation of HBoV in Argentina, Nicaragua and Peru among children with influenza-like symptoms enrolled in a sentinel surveillance program. |
Maternal infection requiring hospitalization during pregnancy and autism spectrum disorders
Atladottir HO , Thorsen P , Ostergaard L , Schendel DE , Lemcke S , Abdallah M , Parner ET . J Autism Dev Disord 2010 40 (12) 1423-30 Exposure to prenatal infection has been suggested to cause deficiencies in fetal neurodevelopment. In this study we included all children born in Denmark from 1980 through 2005. Diagnoses of autism spectrum disorders (ASDs) and maternal infection were obtained through nationwide registers. Data were analyzed using Cox proportional hazards regression. No association was found between any maternal infection and diagnosis of ASDs in the child when looking at the total period of pregnancy: adjusted hazard ratio = 1.14 (CI: 0.96-1.34). However, admission to hospital due to maternal viral infection in the first trimester and maternal bacterial infection in the second trimester were found to be associated with diagnosis of ASDs in the offspring, adjusted hazard ratio = 2.98 (CI: 1.29-7.15) and adjusted hazard ratio = 1.42 (CI: 1.08-1.87), respectively. Our results support prior hypotheses concerning early prenatal viral infection increasing the risk of ASDs. |
Serum 25-hydroxyvitamin D and cancer mortality in the NHANES III study (1988-2006)
Freedman DM , Looker AC , Abnet CC , Linet MS , Graubard BI . Cancer Res 2010 70 (21) 8587-97 Vitamin D has been hypothesized to protect against cancer. We followed 16,819 participants in NHANES III (Third National Health and Nutritional Examination Survey) from 1988 to 2006, expanding on an earlier NHANES III study (1988-2000). Using Cox proportional hazards regression models, we examined risk related to baseline serum 25-hydroxyvitamin D [25(OH)D] for total cancer mortality, in both sexes, and by racial/ethnic groups, as well as for site-specific cancers. Because serum was collected in the south in cooler months and in the north in warmer months, we examined associations by collection season ("summer/higher latitude" and "winter/lower latitude"). We identified 884 cancer deaths during 225,212 person-years. Overall cancer mortality risks were unrelated to baseline 25(OH)D status in both season/latitude groups, and in non-Hispanic whites, non-Hispanic blacks, and Mexican-Americans. In men, risks were elevated at higher levels {e.g., for ≥100 nmol/L, relative risk (RR) = 1.85 [95% confidence interval (CI), 1.02-3.35] compared with <37.5 nmol/L}. Although risks were unrelated to 25(OH)D in all women combined, risks significantly decreased with increasing 25(OH)D in the summer/higher latitude group [for ≥100 nmol/L, RR = 0.52 (95% CI, 0.25-1.15) compared with <37.5 nmol/L; P(trend) = 0.03, based on continuous values]. We also observed a suggestion of an inverse association with colorectal cancer mortality (P(trend) = 0.09) and a positive association with lung cancer mortality among males (P(trend) = 0.03). Our results do not support the hypothesis that 25(OH)D is associated with reduced cancer mortality. Although cancer mortality in females was inversely associated with 25(OH)D in the summer/higher latitude group, cancer mortality at some sites was increased among men with higher 25(OH)D. 
These findings argue for caution before increasing 25(OH)D levels to prevent cancer. |
High dietary niacin intake is associated with decreased chromosome translocation frequency in airline pilots.
Yong LC , Petersen MR . Br J Nutr 2010 105 (4) 1-9 Experimental studies suggest that B vitamins such as niacin, folate, riboflavin, vitamin B6 and vitamin B12 may protect against DNA damage induced by ionising radiation (IR). However, to date, data from IR-exposed human populations are not available. We examined the intakes of these B vitamins and their food sources in relation to the frequency of chromosome translocations as a biomarker of cumulative DNA damage, in eighty-two male airline pilots. Dietary intakes were estimated by using a self-administered semi-quantitative FFQ. Translocations in peripheral blood lymphocytes were scored by using fluorescence in situ hybridisation whole-chromosome painting. Negative binomial regression was used to estimate rate ratios and 95 % CI, adjusted for age and occupational and lifestyle factors. We observed a significant inverse association between translocation frequency and dietary intake of niacin (P = 0.02): adjusted rate ratio for subjects in the highest tertile compared with the lowest tertile was 0.58 (95 % CI 0.40, 0.83). Translocation frequency was not associated with total niacin intake from food and supplements, nor with dietary or total intake of folate, riboflavin, vitamin B6 or vitamin B12. However, the adjusted rate ratios were significant for subjects with ≥ median compared with < median intake of whole grains (P = 0.03) and red and processed meat (P = 0.01): 0.69 (95 % CI 0.50, 0.96) and 1.56 (95 % CI 1.13, 2.16), respectively. Our data suggest that a high intake of niacin from food or a diet high in whole grains but low in red and processed meat may protect against cumulative DNA damage in IR-exposed persons. |
Use of historical data and a novel metric in the evaluation of the effectiveness of hearing conservation program components
Heyer N , Morata TC , Pinkerton LE , Brueck SE , Stancescu D , Panaccio MP , Kim H , Sinclair JS , Waters MA , Estill CF , Franks JR . Occup Environ Med 2010 68 (7) 510-7 OBJECTIVES: To evaluate the effectiveness of hearing conservation programs (HCP) and their specific components in reducing noise-induced hearing loss (NIHL). METHODS: This retrospective cohort study was conducted at one food-processing plant and two automotive plants. Audiometric and work-history databases were combined with historical noise monitoring data to develop a time-dependent exposure matrix for each plant. Historical changes in production and HCP implementation were collected from company records, employee interviews and focus groups. These data were used to develop time-dependent quality assessments for various HCP components. 5478 male (30 427 observations) and 1005 female (5816 observations) subjects were included in the analysis. RESULTS: Analyses were conducted separately for males and females. Females tended to have less NIHL at given exposure levels than males. Duration of noise exposure stratified by intensity (dBA) was a better predictor of NIHL than the standard equivalent continuous noise level (L(eq)) based upon a 3-dBA exchange. Within this cohort, efficient dBA strata for males were <95 versus ≥95, and for females <90 versus ≥90. The reported enforced use of hearing protection devices (HPDs) significantly reduced NIHL. The data did not have sufficient within-plant variation to determine the effectiveness of noise monitoring or worker training. An association between increased audiometric testing and NIHL was believed to be an artifact of increased participation in screening. CONCLUSIONS: Historical audiometric data combined with noise monitoring data can be used to better understand the effectiveness of HCPs. Regular collection and maintenance of quality data should be encouraged and used to monitor the effectiveness of these interventions. |
Mortality among members of a truck driver trade association
Birdsey J , Alterman T , Li J , Petersen MR , Sestito J . AAOHN J 2010 58 (11) 473-80 Previous studies report that truck drivers are at increased risk for illness and on-the-job mortality. It is unknown whether owner-operator truck drivers face the same risks as employee drivers, yet few studies have targeted owner-operators as a study population. This study examined the overall and cause-specific mortality ratios for a cohort with owner-operator truck drivers constituting 69% of the study population. Of the 26 major disease classifications and 92 specific causes of death examined, only mortality due to transportation accidents was significantly elevated (standardized mortality ratio=1.52, 95% confidence interval=1.36-1.70). Leading causes of death were ischemic heart disease and lung cancer, although risk was below that of the general population. Transportation accidents pose a particular hazard for members of the trade association. The absence of excess disease mortality deserves careful interpretation, and may be due to both a strong healthy worker effect and a short monitoring period. |
Effect of an interferent on the performance of two direct-reading organic vapor monitors
LeBouf RF , Rossner A , Hudnall JB , Slaven JE , Calvert CC , Pearce TA , Coffey CC . J Emerg Manag 2010 8 (5) 72-80 Direct-reading organic vapor monitors (DROVMs) are widely used by industrial hygienists and emergency responders as survey tools for the assessment of volatile organic compounds (VOCs) in occupational or emergency response settings. Although these monitors provide real-time information for expedient decision making, their utility in determining compliance with specific exposure limits is not well established. In addition, other VOCs that may be present in the same environment can act as interferents and adversely affect performance. This study assessed the effect of an interferent (hexane) on the performance of two representative commercially available monitors when measuring cyclohexane. The instrument readings were compared with concentrations measured with sorbent tubes, a standard compliance monitoring technique. Infrared-based concentration measurements were more precise at the two middle challenge concentrations (144 and 289 ppm), indicating a shift in instrument precision at the low and high end of the recommended operating range. Both photoionization detection and infrared-based concentration measurements were affected by the presence and amount of hexane in the test atmosphere. Emergency response personnel and industrial hygienists should be aware of the limitations of DROVMs in the assessment of hazardous situations involving VOCs. |
Assessing the performance of various restraints on ambulance patient compartment workers during crash events
Green JD , Yannaccone JR , Current RS , Sicher LA , Moore PH , Whitman GR . Int J Crashworthiness 2010 15 (5) 517-541 The inability of emergency medical service (EMS) workers to remain safely restrained while treating patients in the patient compartment of a moving ambulance has been identified as a key impediment to EMS worker safety in North America. It has been hypothesised that restraint systems designed to provide mobility while offering the ability to lock during an impact or sudden manoeuvre, could greatly enhance worker safety in the back of ambulances. Through a series of 33 sled and crash tests impacting the front, side, and rear of simulated and actual ambulance patient compartments, the National Institute for Occupational Safety and Health examined the biomechanical and kinematic effects of two-, four- and five-point restraints on 95th percentile male Hybrid III anthropomorphic test devices. Results indicate that the inclusion of restraint systems offering mobility have the potential to improve worker safety under many working conditions in this unique work environment. |
Rapid increase in ownership and use of long-lasting insecticidal nets and decrease in prevalence of malaria in three regional States of Ethiopia (2006-2007)
Shargie EB , Ngondi J , Graves PM , Getachew A , Hwang J , Gebre T , Mosher AW , Ceccato P , Endeshaw T , Jima D , Tadesse Z , Tenaw E , Reithinger R , Emerson PM , Richards FO , Ghebreyesus TA . J Trop Med 2010 2010 Following recent large scale-up of malaria control interventions in Ethiopia, this study aimed to compare ownership and use of long-lasting insecticidal nets (LLIN), and the change in malaria prevalence using two population-based household surveys in three regions of the country. Each survey used multistage cluster random sampling with 25 households per cluster. Household net ownership tripled from 19.6% in 2006 to 68.4% in 2007, with mean LLIN per household increasing from 0.3 to 1.2. Net use overall more than doubled from 15.3% to 34.5%, but in households owning LLIN, use declined from 71.7% to 48.3%. Parasitemia declined from 4.1% to 0.4%. Large scale-up of net ownership over a short period of time was possible. However, a large increase in net ownership was not necessarily mirrored directly by increased net use. Better targeting of nets to malaria-risk areas and sustained behavioural change communication are needed to increase and maintain net use. |
Stakeholder attitudes toward influenza vaccination policy in the United States
Berman PP , Orenstein WA , Hinman AR , Gazmararian J . Health Promot Pract 2010 11 (6) 807-16 There is growing interest in simplifying recommendations to vaccinate Americans against influenza. The article reports interviews with 35 stakeholders from the medical, public health, educational, insurance, and vaccine industry sectors to assess the potential for policy change, along with the questions posed to the interviewees on current and future influenza vaccination policy and barriers to policy change. About 97% of respondents support the expansion of vaccination for all school-age children, and about 95% support universal vaccination, although respondents expressed reservations despite this support. Barriers to influenza vaccination recommendations include access, supply, confusing recommendations, and public perceptions. Barriers to universal vaccination include lack of infrastructure, cost, need for education, and vaccine supply. Issues concerning resources and education are challenges that impede policy change. The study findings can be useful to policy makers and practitioners for reviewing U.S. vaccination policy and changes to the policy. |
A way forward: the National HIV/AIDS Strategy and reducing HIV incidence in the United States
Millett GA , Crowley JS , Koh H , Valdiserri RO , Frieden T , Dieffenbach CW , Fenton KA , Benjamin R , Whitescarver J , Mermin J , Parham-Hopson D , Fauci AS . J Acquir Immune Defic Syndr 2010 55 S144-S147 In July 2010, the Obama Administration released a National HIV/AIDS Strategy for the United States to refocus national attention on responding to the domestic HIV epidemic. The goals of the strategy are to reduce HIV incidence; to increase access to care and optimize health outcomes among people living with HIV; and to reduce HIV-related disparities. The strategy identifies a small number of action steps that will align efforts across federal, state, local, and tribal levels of government, and maximally impact the domestic HIV epidemic. In this article, we outline key programmatic and research issues that must be addressed to accomplish the prevention goals of the National HIV/AIDS Strategy. |
A brief history and overview of CDC's Centers for Public Health Preparedness Cooperative Agreement Program
Richmond A , Hostler L , Leeman G , King W . Public Health Rep 2010 125 Suppl 5 8-14 The Centers for Disease Control and Prevention (CDC) funded the Centers for Public Health Preparedness (CPHP) Cooperative Agreement program from 2004 through 2010. CDC gave approximately $134 million to 27 CPHPs within accredited schools of public health to enhance the relationship between academia and state and local health agencies to strengthen public health preparedness. Over the course of the program, CPHPs provided education and training services that met public health preparedness and response needs throughout the United States. The passage of the Pandemic and All-Hazards Preparedness Act in 2006 has had broad implications for the Department of Health and Human Services' future preparedness and response activities. Guidelines were established giving accredited schools of public health eligibility to receive federal grants to carry out the continual development and delivery of core curricula and training that responds to the needs of state, local, and tribal public health authorities. |
Concise formulas for the area and volume of a hyperspherical cap
Li S . Asian J Math Stat 2011 4 (1) 1-5 Spherical caps in hyperspace have found applications in stochastic optimizations and software engineering. However, there is a need for concise formulas for surface area and volume that are easy to express and compute. In this note, concise formulas are given in closed form. These formulas are obtained by integrating the area/volume of an (n-1)-sphere over a great circle arc in hyperspherical coordinates. |
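The derivation route described in the abstract — integrating the area of an (n-1)-sphere along a great-circle arc — can be checked numerically. The sketch below is not the paper's closed-form result; it evaluates the defining integral A_cap = A_{n-1} · ∫₀^φ sin^(n-2)(t) dt by midpoint quadrature, using only the standard library, and verifies it against the classical 3-D cap area 2π(1 − cos φ).

```python
import math

def sphere_area(n):
    # Surface area of the unit sphere S^(n-1) embedded in R^n: 2*pi^(n/2) / Gamma(n/2)
    return 2.0 * math.pi ** (n / 2.0) / math.gamma(n / 2.0)

def cap_area(n, phi, steps=20000):
    """Area of a hyperspherical cap with colatitude angle phi on the unit
    sphere in R^n, by integrating the area of an (n-1)-sphere of radius
    sin(t) along the great-circle arc (midpoint rule):
        A_cap = A_{n-1} * integral_0^phi sin^(n-2)(t) dt
    """
    ring = sphere_area(n - 1)
    h = phi / steps
    return ring * sum(math.sin((i + 0.5) * h) ** (n - 2) for i in range(steps)) * h

# Sanity checks: the ordinary 3-D sphere, and phi = pi recovering the full sphere.
phi = math.pi / 3
print(abs(cap_area(3, phi) - 2 * math.pi * (1 - math.cos(phi))) < 1e-6)
print(abs(cap_area(4, math.pi) - sphere_area(4)) < 1e-6)
```

The paper's contribution is to express this same integral in closed form (via the regularized incomplete beta function), which avoids the quadrature entirely.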
Factor structure and psychometric properties of the Brief Symptom Inventory-18 in women: a MACS approach to testing for invariance across racial/ethnic groups
Wiesner M , Chen V , Windle M , Elliott MN , Grunbaum JA , Kanouse DE , Schuster MA . Psychol Assess 2010 22 (4) 912-922 This study used data from 3 sites to examine the invariance and psychometric characteristics of the Brief Symptom Inventory-18 across Black, Hispanic, and White mothers of 5th graders (N = 4,711; M = 38.07 years of age, SD = 7.16). Internal consistencies were satisfactory for all subscale scores of the instrument regardless of ethnic group membership. Mean and covariance structures analysis indicated that the hypothesized 3-factor structure of the instrument was not robust across ethnic groups. It provided a reasonable approximation to the data for Black and White women but not for Hispanic women. Tests for differential item functioning (DIF) were therefore conducted for only Black and White women. Analyses revealed no more than trivial instances of nonuniform DIF but more substantial evidence of uniform DIF for 3 of the 18 items. After partial strong factorial invariance of the instrument was established, latent factor means were found to be significantly higher for Black than for White women on all 3 subscales (somatization, depression, anxiety). In conclusion, the instrument may be used for mean comparisons between Black and White women. |
Prevalence and molecular identification of Cryptosporidium spp. in pigs in Henan, China
Wang R , Qiu S , Jian F , Zhang S , Shen Y , Zhang L , Ning C , Cao J , Qi M , Xiao L . Parasitol Res 2010 107 (6) 1489-94 The distribution and public health significance of Cryptosporidium species/genotypes in pigs differ among geographic areas and studies. To characterize the prevalence of cryptosporidiosis in pigs in Henan, China, a total of 1,350 fecal samples from 14 farms in ten prefectures in Henan Province were examined. The overall prevalence of Cryptosporidium was 8.2% (111/1,350), with the highest infection rate (79/383, or 20.6%) in 1-2-month-old piglets and the lowest infection rates in 3-6-month-old pigs. Cryptosporidium-positive samples from 108 animals were analyzed by polymerase chain reaction (PCR)-restriction fragment length polymorphism analysis of the small subunit rRNA gene, and 35 were further analyzed by DNA sequencing of the PCR products. Two Cryptosporidium species/genotypes were identified: Cryptosporidium suis (94/108) and the Cryptosporidium pig genotype II (14/108). C. suis infection was more common in younger piglets, whereas the pig genotype II was relatively common in older pigs. These findings suggest that pigs are not a major source of zoonotic Cryptosporidium in the study area. |
Content Index (Archived Edition)
- Communicable Diseases
- Disease Reservoirs and Vectors
- Entomology
- Environmental Health
- Epidemiology and Surveillance
- Health Behavior and Risk
- Healthcare Associated Infections
- Immunity and Immunization
- Informatics
- Injury and Violence
- Laboratory Sciences
- Maternal and Child Health
- Nutritional Sciences
- Occupational Safety and Health
- Parasitic Diseases
- Public Health Leadership and Management
- Sciences, General
- Social and Behavioral Sciences
- Veterinary Medicine
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed:Feb 1, 2024
- Page last updated:Sep 03, 2024