Status of cardiovascular health in US adults: prevalence estimates from the National Health and Nutrition Examination Surveys (NHANES) 2003-2008
Shay CM , Ning H , Allen NB , Carnethon MR , Chiuve SE , Greenlund KJ , Daviglus ML , Lloyd-Jones DM . Circulation 2012 125 (1) 45-56 BACKGROUND: The American Heart Association's 2020 Strategic Impact Goals define a new concept, cardiovascular (CV) health; however, current prevalence estimates of the status of CV health in US adults according to age, sex, and race/ethnicity have not been published. METHODS AND RESULTS: We included 14 515 adults (≥20 years of age) from the 2003 to 2008 National Health and Nutrition Examination Surveys. Participants were stratified by young (20-39 years), middle (40-64 years), and older (≥65 years) ages. CV health behaviors (diet, physical activity, body mass index, smoking) and CV health factors (blood pressure, total cholesterol, fasting blood glucose, smoking) were defined as poor, intermediate, or ideal. Fewer than 1% of adults exhibited ideal CV health for all 7 metrics. For CV health behaviors, nonsmoking was most prevalent (range, 60.2%-90.4%), whereas ideal Healthy Diet Score was least prevalent (range, 0.2%-2.6%) across groups. Prevalences of ideal body mass index (range, 36.5%-45.3%) and ideal physical activity levels (range, 50.2%-58.8%) were higher in young adults compared with middle or older ages. Ideal total cholesterol (range, 23.7%-36.2%), blood pressure (range, 11.9%-16.3%), and fasting blood glucose (range, 31.2%-42.9%) were lower in older adults compared with young and middle-aged adults. Prevalence of poor CV health factors was lowest at young ages and higher at middle and older ages. Prevalence estimates by age and sex were consistent across race/ethnic groups. CONCLUSIONS: These prevalence estimates of CV health represent a starting point from which the effectiveness of efforts to promote CV health and prevent CV disease can be monitored and compared in US adult populations. |
Trends in preventive asthma medication use among children and adolescents, 1988-2008
Kit BK , Simon AE , Ogden CL , Akinbami LJ . Pediatrics 2012 129 (1) 62-9 OBJECTIVES: To examine trends in preventive asthma medication (PAM) use among children with current asthma in the United States from 1988 to 2008. METHODS: We performed a cross-sectional analysis of PAM use among 2499 children aged 1 to 19 years with current asthma using nationally representative data from the National Health and Nutrition Examination Survey (NHANES) during 3 time periods: 1988-1994, 1999-2002, and 2005-2008. PAMs included inhaled corticosteroids, leukotriene receptor antagonists, long-acting beta-agonists, mast-cell stabilizers, and methylxanthines. RESULTS: Among children with current asthma, there was an increase in the use of PAMs from 17.8% (SE: 3.3) in 1988-1994 to 34.9% (SE: 3.3) in 2005-2008 (P < .001 for trend). Adjusting for age, gender, race/ethnicity, and health insurance status, the odds of PAM use were higher in 2005-2008 compared with 1988-1994 (adjusted odds ratio [aOR] = 2.6; 95% confidence interval [CI]: 1.5-4.5). A multivariate analysis, combining all 3 time periods, showed lower use of PAMs among non-Hispanic black (aOR = 0.5 [95% CI: 0.4-0.7]) and Mexican American (aOR = 0.6 [95% CI: 0.4-0.9]) children compared to non-Hispanic white children. PAM use was also lower in 12 to 19 year olds compared with 1 to 5 year olds and also in children who did not have health insurance compared with those who did. CONCLUSIONS: Between 1988 and 2008, the use of PAM increased among children with current asthma. Non-Hispanic black and Mexican American children, adolescents aged 12 to 19 years, and uninsured children with current asthma had lower use of PAM. |
Complementary and alternative medicine (CAM) use among children with current asthma
Shen J , Oraka E . Prev Med 2012 54 (1) 27-31 OBJECTIVE: To estimate the prevalence of complementary and alternative medicine (CAM) use among children with current asthma. DESIGN: We analyzed data from the Asthma Call Back Survey (ACBS) 2006-2008. The ACBS is a follow-up to the state-based Behavioral Risk Factor Surveillance System (BRFSS) survey that collects information on asthma and related factors, including CAM use for asthma. In a subset of BRFSS states, the survey is administered to parents who report that their children have asthma. A total of 5435 children had current asthma and were included in this analysis. RESULTS: Overall, 26.7% (95% confidence interval [CI]=24.5-29.0) of children with current asthma reported CAM use in the previous 12 months. Among them, the three most commonly used therapies were breathing techniques (58.5%; 95% CI=53.6-63.5), vitamins (27.3%; 95% CI=23.0-31.5), and herbal products (12.8%; 95% CI=9.2-16.4). Multivariate analysis of CAM use revealed higher adjusted odds ratios (aOR) among children who experienced cost barriers to conventional health care compared with children with no cost barrier (aOR=1.8; 95% CI=1.2-2.8). Children with poorly controlled asthma were most likely to use all types of CAM when compared to their counterparts with well-controlled asthma: aOR=2.3 (95% CI=1.6-3.3) for any CAM; aOR=1.7 (95% CI=1.2-2.6) for self-care based CAM; and aOR=4.4 (95% CI=1.6-9.3) for practitioner-based CAM. CONCLUSIONS: Children with poorly controlled asthma are more likely to use CAM; this likelihood persists after controlling for other factors (including parent's education, barriers to conventional health care, and controller medication use). CAM is also more commonly used by children who experienced cost barriers to conventional asthma care. CAM use could serve as a marker to identify patients who need patient/family education and support, and thus facilitate improved asthma control. |
Declining rates of hospitalization for nontraumatic lower-extremity amputation in the diabetic population aged 40 years or older: U.S., 1988-2008
Li Y , Burrows NR , Gregg EW , Albright A , Geiss LS . Diabetes Care 2012 35 (2) 273-7 OBJECTIVE: To assess trends in rates of hospitalization for nontraumatic lower-extremity amputation (NLEA) in U.S. diabetic and nondiabetic populations and disparities in NLEA rates within the diabetic population. RESEARCH DESIGN AND METHODS: We calculated NLEA hospitalization rates, by diabetes status, among persons aged ≥40 years on the basis of National Hospital Discharge Survey data on NLEA procedures and National Health Interview Survey data on diabetes prevalence. We used joinpoint regression to calculate the annual percentage change (APC) and to assess trends in rates from 1988 to 2008. RESULTS: The age-adjusted NLEA discharge rate per 1,000 persons among those diagnosed with diabetes and aged ≥40 years decreased from 11.2 in 1996 to 3.9 in 2008 (APC -8.6%; P < 0.01), while rates among persons without diagnosed diabetes changed little. NLEA rates in the diabetic population decreased significantly from 1996 to 2008 in all demographic groups examined (all P < 0.05). Throughout the entire study period, rates of diabetes-related NLEA were higher among persons aged ≥75 years than among those who were younger, higher among men than women, and higher among blacks than whites. CONCLUSIONS: From 1996 to 2008, NLEA discharge rates declined significantly in the U.S. diabetic population. Nevertheless, NLEA continues to be substantially higher in the diabetic population than in the nondiabetic population and disproportionately affects people aged ≥75 years, blacks, and men. Continued efforts are needed to decrease the prevalence of NLEA risk factors and to improve foot care among certain subgroups within the U.S. diabetic population that are at higher risk. |
Diabetic retinopathy in the SEARCH for Diabetes in Youth cohort: a pilot study
Mayer-Davis EJ , Davis C , Saadine J , D'Agostino RB Jr , Dabelea D , Dolan L , Garg S , Lawrence JM , Pihoker C , Rodriguez BL , Klein BE , Klein R , Bell RA . Diabet Med 2012 29 (9) 1148-52 AIMS: The aim of this pilot study was to generate an initial estimate of the prevalence and correlates of diabetic retinopathy in a racially and ethnically diverse sample of youth with Type 1 and Type 2 diabetes mellitus. METHODS: A pilot study was conducted among 222 individuals with Type 1 diabetes (79% non-Hispanic white, 21% other) and 43 with Type 2 diabetes (28% non-Hispanic white, 72% other), all with diabetes of >5 years' duration (mean duration 6.8 years), who participated in the SEARCH for Diabetes in Youth study. Diabetic retinopathy was assessed using non-mydriatic retinal photography of both eyes. RESULTS: The prevalence of diabetic retinopathy was 17% for Type 1 diabetes and 42% for Type 2 diabetes (odds ratio 1.50, 95% CI 0.58-3.88; P = 0.40, adjusted for age, duration, gender, race/ethnicity, parental education and HbA(1c)). HbA(1c) was significantly higher among those with any diabetic retinopathy (adjusted mean 79 mmol/mol, 9.4%) vs. no diabetic retinopathy (adjusted mean 70 mmol/mol, 8.6%) (P = 0.015). LDL cholesterol was also significantly higher among those with any diabetic retinopathy (adjusted mean 107.2 mg/dl) compared with those without diabetic retinopathy (adjusted mean 97.9 mg/dl) (P = 0.04). CONCLUSIONS: The prevalence of diabetic retinopathy in contemporary young individuals was substantial, particularly among minority youth and those with Type 2 diabetes. Further long-term study of diabetic retinopathy in youth is needed. |
Multistate outbreak of MDR TB identified by genotype cluster investigation.
Barry PM , Gardner TJ , Funk E , Oren E , Field K , Shaw T , Langer AJ . Emerg Infect Dis 2012 18 (1) 113-6 In 2008, diagnosis and investigation of 2 multidrug-resistant tuberculosis cases with matching genotypes led to identification of an outbreak among foreign-born persons who performed short-term seafood production work in Alaska during 2006. Tuberculosis control programs should consider the possibility of domestic transmission even among foreign-born patients. |
Reinfection redux
Vernon AA , Villarino ME . Clin Infect Dis 2012 54 (6) 792-3 Andrews et al have published a creative effort to address an epidemiologic question of significance for tuberculosis control both domestically and globally [1]. The question has particular relevance as the importance of managing latent tuberculosis infection (LTBI) grows in the national strategy to eliminate tuberculosis and prevent tuberculosis transmission in the United States. | The authors endeavor to quantify the reduction in risk of progression to active tuberculosis following reexposure and reinfection compared with the risk of progression following new or primary infection. Prior efforts to define this risk have relied largely on population modeling. The authors instead use data obtained from studies of young healthcare workers in the era before chemotherapy. They identified 18 studies of nursing and medical students conducted from 1928 to 1954. At baseline, half the subjects were already tuberculin skin test (TST) positive, and the median annual risk of infection among the remainder was an astonishing 34%. Using clearly described analytic approaches, the authors estimate that the risk of disease after reexposure is only 21% of that among persons not previously TST positive, a reduction of 79%. |
Retention in care of adults and adolescents living with HIV in 13 U.S. areas
Hall HI , Gray KM , Tang T , Li J , Shouse L , Mermin J . J Acquir Immune Defic Syndr 2012 60 (1) 77-82 BACKGROUND: Monitoring immunologic and virologic response to antiretroviral therapy in HIV-1-infected patients is an important component of treatment in the United States. However, little population-based information is available on whether HIV-infected persons receive the recommended tests or continuous care. METHODS: Using data from 13 areas reporting relevant HIV-related tests to national HIV surveillance, we determined retention in care in persons >12 years old living with HIV at the end of 2009. We assessed retention in care, defined as ≥2 CD4 or viral load tests at least 3 months apart in the past year, by demographic, clinical, and risk characteristics, and calculated prevalence ratios (PR) and 95% confidence intervals (CI). We also assessed the percentage established in care within 12 months after HIV diagnosis in 2008 (≥2 tests, ≥3 months apart). RESULTS: Among 100,375 persons living with HIV, 45% had ≥2 tests at least 3 months apart. A higher percentage of whites was retained in care (50%) compared with blacks/African Americans (41%, PR 0.83, 95% CI 0.82, 0.84) and Hispanics/Latinos (40%, PR 0.90, 95% CI 0.87, 0.92). Compared with heterosexual women (50%), fewer men who have sex with men (48%), heterosexual males (45%), and male (37%) and female (43%) injection-drug users had ≥2 tests. About 64% established care within 12 months of diagnosis. CONCLUSIONS: Less than half of persons living with HIV had laboratory evidence of ongoing clinical care and only two-thirds established care after diagnosis. Further assessments determining modifiable barriers to accessing care could assist with achieving public health targets. |
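The retention-in-care definition in the abstract above (at least 2 CD4 or viral load tests at least 3 months apart in the past year) can be sketched as a simple classification rule. This is only an illustration of the stated definition: the function name, the 90-day approximation of "3 months," and the example dates are assumptions, not part of the surveillance methodology.

```python
from datetime import date

def retained_in_care(test_dates, min_gap_days=90):
    """Return True if >=2 tests fall at least ~3 months (90 days) apart.

    test_dates: dates of CD4 or viral load tests in the one-year window.
    With sorted dates, some pair is >=90 days apart exactly when the
    first and last tests are, so checking that one gap suffices.
    """
    dates = sorted(test_dates)
    return len(dates) >= 2 and (dates[-1] - dates[0]).days >= min_gap_days

# Two tests about four months apart -> retained
print(retained_in_care([date(2009, 1, 15), date(2009, 5, 20)]))  # True
# Two tests under a month apart -> not retained
print(retained_in_care([date(2009, 1, 15), date(2009, 2, 10)]))  # False
```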
Stopping the control arm in response to the DSMB: mother's choice of HIV prophylaxis during breastfeeding in the BAN study
Chavula C , Long D , Mzembe E , Kayira D , Chasela C , Hudgens MG , Hosseinipour M , King CC , Ellington S , Chigwenembe M , Jamieson DJ , van der Horst C . Contemp Clin Trials 2012 33 (1) 55-9 The Data and Safety Monitoring Board (DSMB) for the Breastfeeding, Antiretrovirals, and Nutrition (BAN) study, a clinical trial aimed at preventing postnatal HIV transmission, recommended halting randomization to the enhanced standard-of-care (control) arm. The 67 mother-infant pairs on the control arm who were less than 21 weeks postpartum at the time of the DSMB recommendation were read a script informing them of the DSMB decision and offering them the maternal or infant antiretroviral interventions for the remainder of the 28-week breastfeeding period. This paper describes the BAN study response to the DSMB decision and what the women on the control arm chose when given a choice to start the maternal or infant antiretroviral interventions. |
Toxoplasmosis hospitalizations in the United States, 2008, and trends, 1993-2008
Jones JL , Roberts JM . Clin Infect Dis 2012 54 (7) e58-61 BACKGROUND: Toxoplasmosis-related hospitalizations often occur in persons with human immunodeficiency virus (HIV) infection and other causes of immunosuppression. METHODS: Using the National Inpatient Sample (NIS) from the Healthcare Cost and Utilization Project, we examined trends in toxoplasmosis-related hospitalizations by HIV infection status from 1993 through 2008, and rates by sex and race or ethnicity in 2008. The NIS is designed to represent a 20% sample of US community hospitals and currently includes information on up to 8 million discharges per year from approximately 1000 hospitals. We used International Classification of Diseases, Ninth Revision, Clinical Modification codes 130-130.9 for toxoplasmosis and 042-044/795.8/795.71/V08 for HIV infection. RESULTS: Estimated HIV-associated toxoplasmosis hospitalizations increased from 9395 in 1993 to 10583 in 1995 (P = .0002), then dropped to 3643 in 2001 (P < .0001), with similar levels thereafter. The rate of HIV-associated toxoplasmosis hospitalizations among all HIV-related hospitalizations decreased from 3.33% in 1993 to 1.25% in 2008 (P < .0001). Estimated non-HIV-associated toxoplasmosis hospitalizations were less variable from 1993 to 2008 (range, 386-819; 0.0020% in 1993, 0.0015% in 2008). In 2008, the rates of both HIV- and non-HIV-associated toxoplasmosis hospitalizations were higher in Hispanic persons than in white persons. CONCLUSIONS: HIV-associated toxoplasmosis hospitalizations dropped markedly after 1995 when highly active antiretroviral therapy was introduced; however, hospitalizations decreased relatively little after 2000, suggesting late diagnosis of some HIV-infected persons or antiretroviral therapy failure. Non-HIV-associated toxoplasmosis hospitalizations have been more stable. The rates of toxoplasmosis-related hospitalizations varied markedly among racial and ethnic groups. |
Modeling insights into Haemophilus influenzae type b disease, transmission, and vaccine programs
Jackson ML , Rose CE , Cohn A , Coronado F , Clark TA , Wenger JD , Bulkow L , Bruce MG , Messonnier NE , Hennessy TW . Emerg Infect Dis 2012 18 (1) 13-20 In response to the 2007-2009 Haemophilus influenzae type b (Hib) vaccine shortage in the United States, we developed a flexible model of Hib transmission and disease for optimizing Hib vaccine programs in diverse populations and situations. The model classifies population members by age, colonization/disease status, and antibody levels, with movement across categories defined by differential equations. We implemented the model for the United States as a whole, England and Wales, and the Alaska Native population. This model accurately simulated Hib incidence in all 3 populations, including the increased incidence in England/Wales beginning in 1999 and the change in Hib incidence in Alaska Natives after switching Hib vaccines in 1996. The model suggests that a vaccine shortage requiring deferral of the booster dose could last 3 years in the United States before loss of herd immunity would result in increasing rates of invasive Hib disease in children <5 years of age. |
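The compartmental approach described above (population members moving across colonization/disease categories via differential equations) can be illustrated with a drastically simplified sketch. The two-compartment susceptible-colonized model, the rate constants, and the Euler integration below are illustrative assumptions only, not the study's actual model structure or parameters.

```python
def simulate_colonization(beta=0.4, gamma=0.2, c0=0.01, dt=0.01, t_end=100.0):
    """Euler integration of a minimal SIS-type colonization model:

        dC/dt = beta * (1 - C) * C - gamma * C

    where C is the colonized fraction of the population, beta the
    transmission rate, and gamma the clearance rate (illustrative values).
    """
    c = c0
    for _ in range(int(t_end / dt)):
        c += dt * (beta * (1.0 - c) * c - gamma * c)
    return c

# When beta > gamma, the colonized fraction approaches the endemic
# equilibrium 1 - gamma/beta (here 0.5); otherwise colonization dies out.
print(round(simulate_colonization(), 3))
```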
MRSA USA300 at Alaska Native Medical Center, Anchorage, Alaska, USA, 2000-2006
David MZ , Rudolph KM , Hennessy TW , Zychowski DL , Asthi K , Boyle-Vavra S , Daum RS . Emerg Infect Dis 2012 18 (1) 105-8 To determine whether methicillin-resistant Staphylococcus aureus (MRSA) USA300 commonly caused infections among Alaska Natives, we examined clinical MRSA isolates from the Alaska Native Medical Center, Anchorage, during 2000-2006. Among Anchorage-region residents, USA300 was a minor constituent among MRSA isolates in 2000-2003 (11/68, 16%); by 2006, USA300 was the exclusive genotype identified (10/10). |
Population-based incidence of typhoid fever in an urban informal settlement and a rural area in Kenya: implications for typhoid vaccine use in Africa
Breiman RF , Cosmas L , Njuguna H , Audi A , Olack B , Ochieng JB , Wamola N , Bigogo GM , Awiti G , Tabu CW , Burke H , Williamson J , Oundo JO , Mintz ED , Feikin DR . PLoS One 2012 7 (1) e29119 BACKGROUND: High rates of typhoid fever in children in urban settings in Asia have led to focus on childhood immunization in Asian cities, but not in Africa, where data, mostly from rural areas, have shown low disease incidence. We set out to compare incidence of typhoid fever in a densely populated urban slum and a rural community in Kenya, hypothesizing higher rates in the urban area, given crowding and suboptimal access to safe water, sanitation and hygiene. METHODS: During 2007-9, we conducted population-based surveillance in Kibera, an urban informal settlement in Nairobi, and in Lwak, a rural area in western Kenya. Participants had free access to study clinics; field workers visited their homes biweekly to collect information about acute illnesses. In clinic, blood cultures were processed from patients with fever or pneumonia. Crude and adjusted incidence rates were calculated. RESULTS: In the urban site, the overall crude incidence of Salmonella enterica serovar Typhi (S. Typhi) bacteremia was 247 cases per 100,000 person-years of observation (pyo) with highest rates in children 5-9 years old (596 per 100,000 pyo) and 2-4 years old (521 per 100,000 pyo). Crude overall incidence in Lwak was 29 cases per 100,000 pyo with low rates in children 2-4 and 5-9 years old (28 and 18 cases per 100,000 pyo, respectively). Adjusted incidence rates were highest in 2-4 year old urban children (2,243 per 100,000 pyo) which were >15-fold higher than rates in the rural site for the same age group. Nearly 75% of S. Typhi isolates were multi-drug resistant. CONCLUSIONS: This systematic urban slum and rural comparison showed dramatically higher typhoid incidence among urban children <10 years old with rates similar to those from Asian urban slums. 
The findings have potential policy implications for use of typhoid vaccines in increasingly urban Africa. |
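The crude incidence rates reported above are case counts divided by person-years of observation, scaled to 100,000. A minimal sketch of that arithmetic follows; the case and person-year counts are illustrative, not the study's raw data.

```python
def crude_incidence(cases, person_years, per=100_000):
    """Crude incidence rate per `per` person-years of observation (pyo)."""
    return cases / person_years * per

# Illustrative: 25 cases observed over 10,119 person-years
rate = crude_incidence(25, 10_119)
print(round(rate, 1))  # ~247.1 per 100,000 pyo
```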
Fatal outbreaks of jaundice in pregnancy and the epidemic history of hepatitis E
Teo CG . Epidemiol Infect 2012 140 (5) 1-21 SUMMARY: Space-time clustering of people who fall acutely ill with jaundice, then slip into coma and death, is an alarming phenomenon, more markedly so when the victims are mostly or exclusively pregnant. Documentation of the peculiar, fatal predisposition of pregnant women during outbreaks of jaundice identifies hepatitis E and enables construction of its epidemic history. Between the last decade of the 18th century and the early decades of the 20th century, hepatitis E-like outbreaks were reported mainly from Western Europe and several of its colonies. During the latter half of the 20th century, reports of these epidemics, including those that became serologically confirmed as hepatitis E, emanated from, first, the eastern and southern Mediterranean littoral and, thereafter, Southern and Central Asia, Eastern Europe, and the rest of Africa. The dispersal has been accompanied by a trend towards more frequent and larger-scale occurrences. Epidemic and endemic hepatitis E still beset people inhabiting Asia and Africa, especially pregnant women and their fetuses and infants. Their relief necessitates not only accelerated access to potable water and sanitation but also vaccination against hepatitis E. |
Incidence of influenza-like illness and severe acute respiratory infection during three influenza seasons in Bangladesh, 2008-2010
Azziz-Baumgartner E , Alamgir A , Rahman M , Homaira N , Sohel BM , Sharker MY , Zaman RU , Dee J , Gurley ES , Al Mamun A , Mah-E-Muneer S , Fry AM , Widdowson MA , Bresee J , Lindstrom S , Azim T , Brooks A , Podder G , Hossain MJ , Rahman M , Luby SP . Bull World Health Organ 2012 90 (1) 12-9 OBJECTIVE: To determine how much influenza contributes to severe acute respiratory infection (SARI), a leading cause of death in children, among people of all ages in Bangladesh. METHODS: Physicians obtained nasal and throat swabs to test for influenza virus from patients who were hospitalized within 7 days of the onset of SARI or who consulted as outpatients for influenza-like illness (ILI). A community health care utilization survey was conducted to determine the proportion of hospital catchment area residents who sought care at study hospitals, and the incidence of influenza was calculated using this denominator. FINDINGS: The estimated incidence of SARI associated with influenza in children <5 years old was 6.7 (95% confidence interval, CI: 0-18.3), 4.4 (95% CI: 0-13.4) and 6.5 (95% CI: 0-8.3) per 1000 person-years during the 2008, 2009 and 2010 influenza seasons, respectively. The incidence of SARI in people aged ≥5 years was 1.1 (95% CI: 0.4-2.0) and 1.3 (95% CI: 0.5-2.2) per 10 000 person-years during 2009 and 2010, respectively. The incidence of medically attended, laboratory-confirmed seasonal influenza in outpatients with ILI was 10 (95% CI: 8-14), 6.6 (95% CI: 5-9) and 17 (95% CI: 13-22) per 100 person-years during the 2008, 2009 and 2010 influenza seasons, respectively. CONCLUSION: Influenza-like illness is a frequent cause of consultation in the outpatient setting in Bangladesh. Children aged less than 5 years are hospitalized for influenza in greater proportions than children in other age groups. |
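The denominator approach described above, using a healthcare utilization survey, amounts to scaling facility-based counts by the proportion of catchment-area residents who seek care at study hospitals. A minimal sketch under that reading follows; the function and all numbers are illustrative assumptions, not the study's actual calculation.

```python
def adjusted_incidence(facility_cases, person_years, prop_seeking_care, per=1_000):
    """Facility-based incidence adjusted for healthcare utilization.

    Dividing facility case counts by the proportion of catchment residents
    who seek care at study hospitals scales counts up toward the community
    total, assuming cases outside study hospitals occur at the same rate.
    """
    community_cases = facility_cases / prop_seeking_care
    return community_cases / person_years * per

# Illustrative: 30 hospitalized cases, 15,000 person-years of observation,
# 45% of residents reporting they would use a study hospital
print(round(adjusted_incidence(30, 15_000, 0.45), 2))  # 4.44 per 1,000
```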
Asymmetric type F botulism with cranial nerve demyelination
Filozov A , Kattan JA , Jitendranath L , Smith CG , Luquez C , Phan QN , Fagan RP . Emerg Infect Dis 2012 18 (1) 102-4 We report a case of type F botulism in a patient with bilateral but asymmetric neurologic deficits. Cranial nerve demyelination was found during autopsy. Bilateral, asymmetric clinical signs, although rare, do not rule out botulism. Demyelination of cranial nerves might be underrecognized during autopsy of botulism patients. |
Clinical, epidemiologic, histopathologic and molecular features of an unexplained dermopathy
Pearson ML , Selby JV , Katz KA , Cantrell V , Braden CR , Parise ME , Paddock CD , Lewin-Smith MR , Kalasinsky VF , Goldstein FC , Hightower AW , Papier A , Lewis B , Motipara S , Eberhard ML . PLoS One 2012 7 (1) e29908 BACKGROUND: Morgellons is a poorly characterized constellation of symptoms, with the primary manifestations involving the skin. We conducted an investigation of this unexplained dermopathy to characterize the clinical and epidemiologic features and explore potential etiologies. METHODS: A descriptive study was conducted among persons at least 13 years of age and enrolled in Kaiser Permanente Northern California (KPNC) during 2006-2008. A case was defined as the self-reported emergence of fibers or materials from the skin accompanied by skin lesions and/or disturbing skin sensations. We collected detailed epidemiologic data, performed clinical evaluations and geospatial analyses and analyzed materials collected from participants' skin. RESULTS: We identified 115 case-patients. The prevalence was 3.65 (95% CI = 2.98, 4.40) cases per 100,000 enrollees. There was no clustering of cases within the 13-county KPNC catchment area (p = .113). Case-patients had a median age of 52 years (range: 17-93) and were primarily female (77%) and Caucasian (77%). Multi-system complaints were common; 70% reported chronic fatigue and 54% rated their overall health as fair or poor with mean Physical Component Scores and Mental Component Scores of 36.63 (SD = 12.9) and 35.45 (SD = 12.89), respectively. Cognitive deficits were detected in 59% of case-patients and 63% had evidence of clinically significant somatic complaints; 50% had drugs detected in hair samples and 78% reported exposure to solvents. Solar elastosis was the most common histopathologic abnormality (51% of biopsies); skin lesions were most consistent with arthropod bites or chronic excoriations. No parasites or mycobacteria were detected. 
Most materials collected from participants’ skin were composed of cellulose, likely of cotton origin. | CONCLUSIONS: This unexplained dermopathy was rare among this population of Northern California residents, but associated with significantly reduced health-related quality of life. No common underlying medical condition or infectious source was identified, similar to more commonly recognized conditions such as delusional infestation. |
Dengue outbreak in Key West, Florida, USA, 2009
Radke EG , Gregory CJ , Kintziger KW , Sauber-Schatz EK , Hunsperger EA , Gallagher GR , Barber JM , Biggerstaff BJ , Stanek DR , Tomashek KM , Blackmore CG . Emerg Infect Dis 2012 18 (1) 135-7 After 3 dengue cases were acquired in Key West, Florida, we conducted a serosurvey to determine the scope of the outbreak. Thirteen residents showed recent infection (infection rate 5%; 90% CI 2%-8%), demonstrating the reemergence of dengue in Florida. Increased awareness of dengue among health care providers is needed. |
Transmission of flea-borne zoonotic agents
Eisen RJ , Gage KL . Annu Rev Entomol 2012 57 61-82 Flea-borne zoonoses such as plague (Yersinia pestis) and murine typhus (Rickettsia typhi) caused significant numbers of human cases in the past and remain a public health concern. Other flea-borne human pathogens have emerged recently (e.g., Bartonella henselae, Rickettsia felis), and their mechanisms of transmission and impact on human health are not fully understood. Our review focuses on the ecology and epidemiology of the flea-borne bacterial zoonoses mentioned above with an emphasis on recent advancements in our understanding of how these organisms are transmitted by fleas, maintained in zoonotic cycles, and transmitted to humans. Emphasis is given to plague because of the considerable number of studies generated during the first decade of the twenty-first century that arose, in part, because of renewed interest in potential agents of bioterrorism, including Y. pestis. |
What do we need to know about disease ecology to prevent Lyme disease in the northeastern United States?
Eisen RJ , Piesman J , Zielinski-Gutierrez E , Eisen L . J Med Entomol 2012 49 (1) 11-22 Lyme disease is the most commonly reported vector-borne disease in the United States, with the majority of cases occurring in the Northeast. It has now been three decades since the etiological agent of the disease in North America, the spirochete Borrelia burgdorferi, and its primary North American vectors, the ticks Ixodes scapularis Say and I. pacificus Cooley & Kohls, were identified. Great strides have been made in our understanding of the ecology of the vectors and disease agent, and this knowledge has been used to design a wide range of prevention and control strategies. However, despite these advances, the number of Lyme disease cases has steadily increased. In this article, we assess potential reasons for the continued lack of success in prevention and control of Lyme disease in the northeastern United States, and identify conceptual areas where additional knowledge could be used to improve Lyme disease prevention and control strategies. Some of these areas include: (1) identifying critical host infestation rates required to maintain enzootic transmission of B. burgdorferi, (2) understanding how habitat diversity and forest fragmentation impact acarological risk of exposure to B. burgdorferi and the ability of interventions to reduce risk, (3) quantifying the epidemiological outcomes of interventions focusing on ticks or vertebrate reservoirs, and (4) refining knowledge of how human behavior influences Lyme disease risk and identifying barriers to the adoption of personal protective measures and environmental tick management. |
Evaluation and modification of off-host flea collection techniques used in Northwest Uganda: laboratory and field studies
Borchert JN , Eisen RJ , Holmes JL , Atiku LA , Mpanga JT , Brown HE , Graham CB , Babi N , Montenieri JA , Enscore RE , Gage KL . J Med Entomol 2012 49 (1) 210-214 Quantifying the abundance of host-seeking fleas is critical for assessing risk of human exposure to flea-borne disease agents, including Yersinia pestis, the etiological agent of plague. Yet, reliable measures of the efficacy of existing host-seeking flea collection methods are lacking. In this study, we compare the efficacy of passive and active methods for the collection of host-seeking fleas in both the laboratory and human habitations in a plague-endemic region of northwest Uganda. In the laboratory, lighted "Kilonzo" flea traps modified with blinking lights, the creation of shadows, or the generation of carbon dioxide were less efficient at collecting Xenopsylla cheopis Rothschild and Ctenocephalides felis Bouché fleas than an active collection method using white cotton socks or cotton flannel. Passive collection using Kilonzo light traps in the laboratory collected significantly more X. cheopis than C. felis, whereas active collection using white socks and flannel collected significantly more C. felis than X. cheopis. In field studies conducted in Uganda, Kilonzo traps using a flashlight were similar in collection efficacy to Kilonzo traps using kerosene lamps. However, in contrast to laboratory studies, Kilonzo flea traps using flashlights collected a greater number of fleas than swabbing. Within human habitations in Uganda, Kilonzo traps were especially useful for collecting C. felis, the dominant species found in human habitations in this area. |
Date palm sap linked to Nipah virus outbreak in Bangladesh, 2008
Rahman MA , Hossain MJ , Sultana S , Homaira N , Khan SU , Rahman M , Gurley ES , Rollin PE , Lo MK , Comer JA , Lowe L , Rota PA , Ksiazek TG , Kenah E , Sharker Y , Luby SP . Vector Borne Zoonotic Dis 2012 12 (1) 65-72 INTRODUCTION: We investigated a cluster of patients with encephalitis in the Manikgonj and Rajbari Districts of Bangladesh in February 2008 to determine the etiology and risk factors for disease. METHODS: We classified persons with onset of symptoms between February 6 and March 10, 2008, as confirmed Nipah cases if they had immunoglobulin M antibodies against Nipah virus (NiV), NiV RNA, or NiV isolated from cerebrospinal fluid or throat swabs. We classified persons as probable cases if they reported fever with convulsions or altered mental status, resided in the outbreak areas during that period, and died before serum samples were collected. For the case-control study, we compared both confirmed and probable Nipah case-patients with controls, who were free from illness during the reference period. We used motion-sensor infrared cameras to observe bats' contact with date palm sap. RESULTS: We identified four confirmed and six probable case-patients, nine (90%) of whom died. The median age of the cases was 10 years; eight were males. The outbreak occurred simultaneously in two communities that were 44 km apart and separated by a river. Drinking raw date palm sap 2-12 days before illness onset was the only risk factor associated with the illness (adjusted odds ratio 25, 95% confidence interval 3.3-infinity, p<0.001). Case-patients reported no history of physical contact with bats, though community members often reported seeing bats. Infrared camera photographs showed that Pteropus bats frequently visited date palm trees in those communities where sap was collected for human consumption.
CONCLUSION: This is the second Nipah outbreak in Bangladesh where date palm sap has been implicated as the vehicle of transmission. Fresh date palm sap should not be drunk, unless effective steps have been taken to prevent bat access to the sap during collection. |
Environmental health internship essentials
Choo A , Gerke J , Sellers V , Syed M . J Environ Health 2012 74 (6) 52-53 The Centers for Disease Control and Prevention (CDC) Summer Program in Environmental Health (SUPEH) provides students in academic programs accredited by the National Environmental Health Science and Protection Accreditation Council (EHAC) an opportunity to experience environmental health practice at the local, state, and federal levels. The internship exposes students to many aspects of the environmental health profession, from hands-on activities in the field to environmental health management in the office. Typically, the internship is the student's first glimpse into the real-world application of environmental health science. As interns, we recognized early on that environmental health practitioners must possess a wide range of competencies to be effective at promoting and improving environmental health. Based on observations during our internship, we recognized a need to continually develop not only technical skills and abilities but also competencies as well-rounded professionals. Those competencies fall under the three categories identified by the Environmental Health Core Competency Project: assessment, management, and communication (American Public Health Association and National Center for Environmental Health, Centers for Disease Control and Prevention, 2001). This column gives our unique perspectives as four environmental health interns who experienced, for the first time, general environmental health practice through the eyes of practitioners. |
Community stress, psychosocial hazards, and EPA decision-making in communities impacted by chronic technological disasters
Couch SR , Coles CJ . Am J Public Health 2011 101 Suppl 1 S140-8 Psychosocial stress has emerged as an important consideration in managing environmental health risks. Stress has adverse impacts on health and may interact with environmental hazards to increase health risk. This article's primary objective was to explore psychosocial stress related to environmental contamination. We hypothesized that knowledge about stress should be used in conjunction with chemical risk assessment to inform environmental risk management decisions. Knowledge of psychosocial stress at contaminated sites began by exploring the relationships among social capital, collective efficacy, and contamination at the community level. We discussed stress at the family and individual levels, focusing on stress proliferation, available resources, and coping styles and mechanisms. We then made recommendations on how to improve the use of information on psychosocial stress in environmental decision-making, particularly in communities facing chronic technological disasters. |
Critical biological pathways for chronic psychosocial stress and research opportunities to advance the consideration of stress in chemical risk assessment
McEwen BS , Tucker P . Am J Public Health 2011 101 Suppl 1 S131-9 Emerging evidence suggests that psychosocial stress and toxicants may interact to modify health risks. Stress-toxicant interactions could be important in chemical risk assessment, but these interactions are poorly understood and additional research is necessary to advance their application. Environmental health research can increase knowledge of these interactions by exploring hypotheses on allostatic load, which measures the cumulative impacts of stress across multiple physiological pathways, using knowledge about physiological pathways for stress-related health effects, and evidence of common target pathways for both stress and toxicants. In this article, critical physiological pathways for stress-related health effects are discussed, with specific attention to allostatic load and stress-toxicant interactions, concluding with research suggestions for potential applications of such research in chemical risk assessment. |
Monitoring HPV type-specific prevalence over time through clinic-based surveillance: a perspective on vaccine effectiveness
Gaffga NH , Flagg EW , Weinstock HS , Shlay JC , Ghanem KG , Koutsky LA , Kerndt PR , Hsu KK , Unger ER , Datta SD . Vaccine 2012 30 (11) 1959-64 We investigated the feasibility of monitoring trends in prevalence of vaccine-preventable human papillomavirus (HPV) types in different clinic populations. We collected cervical specimens from women presenting to family planning, primary care, and sexually transmitted disease (STD) clinics for routine Pap smears in five US cities during 2003-2005. We performed HPV genotyping and calculated annual type-specific prevalences; pre-vaccine era prevalence was highest for HPV 16 (6.0%; 95% confidence interval [CI] 5.5%-6.6%), and annual prevalences for vaccine-preventable types were stable, with few exceptions, after controlling for clinic type, age group, and city. With sufficient sample size and stable population characteristics, clinic-based surveillance systems can contribute to monitoring HPV vaccine impact in the cervical screening population. |
Surveillance for malaria elimination in Swaziland: a national cross-sectional study using pooled PCR and serology
Hsiang MS , Hwang J , Kunene S , Drakeley C , Kandula D , Novotny J , Parizo J , Jensen T , Tong M , Kemere J , Dlamini S , Moonen B , Angov E , Dutta S , Ockenhouse C , Dorsey G , Greenhouse B . PLoS One 2012 7 (1) e29550 BACKGROUND: To guide malaria elimination efforts in Swaziland and other countries, accurate assessments of transmission are critical. Pooled PCR has the potential to efficiently improve sensitivity to detect infections; serology may clarify temporal and spatial trends in exposure. METHODOLOGY/PRINCIPAL FINDINGS: Using a stratified, two-stage cluster, cross-sectional design, subjects were recruited from the malaria-endemic region of Swaziland. Blood was collected for rapid diagnostic testing (RDT), pooled PCR, and ELISA detecting antibodies to Plasmodium falciparum surface antigens. Of 4330 participants tested, three were RDT-positive, but all three were shown by PCR to be false positives. Pooled PCR led to the identification of one P. falciparum and one P. malariae infection among RDT-negative participants. The P. falciparum-infected participant reported recent travel to Mozambique. Compared to performing individual testing on thousands of samples, PCR pooling reduced labor and consumable costs by 95.5%. Seropositivity was associated with age ≥20 years (11.7% vs 1.9%, P<0.001), recent travel to Mozambique (OR 4.4 [95% CI 1.0-19.0]), and residence in southeast Swaziland (RR 3.78, P<0.001). CONCLUSIONS: The prevalence of malaria infection and recent exposure in Swaziland is extremely low, suggesting elimination is feasible. Future efforts should address imported malaria and target remaining foci of transmission. Pooled PCR and ELISA are valuable surveillance tools for guiding elimination efforts. |
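The efficiency gain the authors report from pooling can be illustrated with standard two-stage (Dorfman) pooling arithmetic. The sketch below is not the study's actual protocol: the pool sizes are hypothetical, and it counts assays only, not the full labor and consumable costs.

```python
def dorfman_expected_tests(n_samples, pool_size, prevalence):
    """Expected number of PCR assays under two-stage (Dorfman) pooling.

    Stage 1: test n/k pools of size k. Stage 2: individually retest
    every member of each positive pool.
    """
    n_pools = n_samples / pool_size
    p_pool_positive = 1.0 - (1.0 - prevalence) ** pool_size
    stage2 = n_samples * p_pool_positive  # k retests per positive pool
    return n_pools + stage2

# Survey size and observed number of PCR-positive infections.
n, p = 4330, 2 / 4330
for k in (10, 25, 50):
    tests = dorfman_expected_tests(n, k, p)
    saving = 1.0 - tests / n
    print(f"pool size {k:>2}: ~{tests:6.0f} assays ({saving:.1%} fewer)")
```

At the survey's very low prevalence, pools of 50 cut the expected assay count from 4330 to under 200, on the same order as the 95.5% cost reduction reported.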
Cognitive testing of human papillomavirus vaccine survey items for parents of adolescent girls
Richman AR , Coronado GD , Arnold LD , Fernandez ME , Glenn BA , Allen JD , Wilson KM , Brewer NT . J Low Genit Tract Dis 2012 16 (1) 16-23 OBJECTIVE: Many studies have been conducted to understand what factors are associated with human papillomavirus (HPV) vaccine acceptability and completion of the 3-dose vaccination series, but few have examined whether people understand the survey items used to assess these relationships. Through a multisite collaborative effort, we developed and cognitively tested survey items that represent constructs known to affect vaccine acceptability and completion. MATERIALS AND METHODS: Investigators from 7 research centers in the United States used cognitive interviewing techniques and in-person and telephone interviews to test 21 items. Four rounds of testing, revising, and retesting were conducted among racially and ethnically diverse parents (n = 62) of girls between the ages of 9 and 17 years. RESULTS: The final survey contained 20 items on attitudes and beliefs relevant to HPV vaccine. Some parents misinterpreted statements about hypothetical vaccine harms as statements of fact. Others were unwilling to answer items about perceived disease likelihood and perceived vaccine effectiveness, because they said the items seemed to have a "right" answer that they did not know. On the basis of these and other findings from cognitive testing, we revised the wording of 14 questions to improve clarity and comprehension. We also revised instructions, response options, and item order. CONCLUSIONS: Cognitive testing of HPV vaccine survey items revealed important differences between intended and ascribed item meaning by participants. Use of the tested survey questions presented here may increase measurement validity and researchers' ability to compare findings across studies and populations. Additional testing using quantitative methods can help to further validate these items. |
Multistate outbreak of Salmonella serotype Typhimurium infections associated with consumption of restaurant tomatoes, USA, 2006: hypothesis generation through case exposures in multiple restaurant clusters
Behravesh CB , Blaney D , Medus C , Bidol SA , Phan Q , Soliva S , Daly ER , Smith K , Miller B , Taylor T , Nguyen T , Perry C , Hill TA , Fogg N , Kleiza A , Moorhead D , Al-Khaldi S , Braden C , Lynch MF . Epidemiol Infect 2012 140 (11) 1-9 SUMMARY: Multiple salmonellosis outbreaks have been linked to contaminated tomatoes. We investigated a multistate outbreak of Salmonella Typhimurium infections among 190 cases. For hypothesis generation, review of patients' food histories from four restaurant-associated clusters in four states revealed that large tomatoes were the only common food consumed by patients. Two case-control studies were conducted to identify food exposures associated with infection. In a study conducted in nine states, illness was significantly associated with eating raw, large, round tomatoes in a restaurant [matched odds ratio (mOR) 3.1, 95% confidence interval (CI) 1.3-7.3]. In a Minnesota study, illness was associated with tomatoes eaten at a restaurant (OR 6.3, mid-P 95% CI 1.05-50.4, P=0.046). State, local, and federal regulatory officials traced the source of the tomatoes to Ohio tomato fields, a growing area not identified in past tomato-associated outbreaks. Because tomatoes are commonly eaten raw, prevention of tomato contamination should include interventions on the farm, during packing, and at restaurants. |
Assessing the incidence of ciguatera fish poisoning with two surveys conducted in Culebra, Puerto Rico, during 2005 and 2006
Azziz-Baumgartner E , Luber G , Conklin L , Tosteson TR , Granade HR , Dickey RW , Backer LC . Environ Health Perspect 2012 120 (4) 526-9 BACKGROUND: Although ciguatera fish poisoning (CFP) is the most common seafood intoxication worldwide, its burden has been difficult to establish because there are no biomarkers to diagnose human exposure. OBJECTIVE: We explored the incidence of CFP, the proportion of CFP case-patients with laboratory-confirmed ciguatoxic meal remnants, the cost of CFP illness, and potential risk factors for CFP. METHODS: During 2005 and again during 2006, we conducted a census of all occupied households on the island of Culebra, Puerto Rico, where locally caught fish are a staple food. We defined CFP case-patients as persons with gastrointestinal symptoms (i.e., abdominal pain, vomiting, diarrhea, or nausea) and neurological symptoms (i.e., extremity paresthesia, arthralgia, myalgia, malaise, pruritus, headache, dizziness, metallic taste, visual disturbance, circumoral paresthesia, temperature reversal, or toothache) or systemic symptoms (e.g., bradycardia) within 72 hours of eating a fish during the previous year. Participants were asked to save remnants of fish eaten by case-patients for ciguatoxin analysis at the Food and Drug Administration laboratory in Dauphin Island. RESULTS: We surveyed 340 households during 2005 and 335 households during 2006. The estimated annual incidence of possible CFP was 4.0/1000 person-years, and that of probable CFP was 7.5/1000 person-years. One of three fish samples submitted by probable case-patients was positive for ciguatoxins. None of the case-patients required respiratory support. Households that typically consumed barracuda were more likely to report CFP (p = 0.02). CONCLUSIONS: Our estimates, which are consistent with previous studies using similar case-finding, contribute to the overall information available to support public health decision-making about CFP prevention. |
How can we stimulate translational research in cancer genomics beyond bench to bedside?
Schully SD , Benedicto CB , Khoury MJ . Genet Med 2012 14 (1) 169-70 A huge expansion in cancer “omics” research—including germline and somatic mutation analysis, gene expression, epigenomics, and proteomics—is promising a new era of personalized cancer care and prevention.1 Nevertheless, to maximize the health benefits from genomic discoveries, translational research is needed to move basic science discoveries not only from the bench to the bedside but also into clinical and public health practice.2 Several advisory and working groups have cautioned about the chasm between basic science discoveries and translation to population health benefits.3 Khoury et al.2 proposed four phases of translational research in genomic medicine. T1 research (bench to bedside) develops candidate health applications (e.g., tests or therapies). T2 research evaluates candidate applications and leads to evidence-based recommendations and guidelines. T3 research assesses how to implement and integrate an evidence-based recommendation into practice. T4 research assesses health outcomes and population impact. The authors also found that less than 2% of published genomics research between 2001 and 2006 was post-bench to bedside (T2 and beyond). We recently applied this framework to a portfolio analysis of National Cancer Institute (NCI) fiscal year 2007 (FY2007) extramurally funded research in cancer genomics.4 We found that of the 1,019 funded grants, little research was funded or conducted in the later phases of translation (T2 and beyond); only 18 grants (1.8%) were funded in this area. |
Randomized trial of type 1 and type 3 oral monovalent poliovirus vaccines in newborns in Africa
Waggie Z , Geldenhuys H , Sutter RW , Jacks M , Mulenga H , Mahomed H , De Kock M , Hanekom W , Pallansch MA , Kahn AL , Burton AH , Sreevatsava M , Hussey G . J Infect Dis 2012 205 (2) 228-36 BACKGROUND: The Global Polio Eradication Initiative aims to eradicate wild poliovirus by the end of 2012. Therefore, more-immunogenic polio vaccines, including monovalent oral poliovirus vaccines (mOPVs), are needed for supplemental immunization activities. This trial assessed the immunogenicity of monovalent types 1 and 3, compared with that of trivalent oral poliovirus vaccine (tOPV), in South Africa. METHODS: We conducted a blinded, randomized, 4-arm controlled trial comparing the immunogenicity of a single dose of mOPV1 (from 2 manufacturers) and mOPV3 (from 1 manufacturer), given at birth, with the immunogenicity of tOPV. RESULTS: Eight hundred newborns were enrolled; 762 (95%) were included in the analysis. At 30 days after vaccine administration, seroconversion to poliovirus type 1 was 73.4% and 76.4% in the 2 mOPV1 arms, compared with 39.1% in the tOPV arm (P < .0000001), and seroconversion to poliovirus type 3 was 58.0% in the mOPV3 arm, compared with 21.2% in the tOPV arm (P < .0000001). The vaccines were well tolerated, and no adverse events were attributed to trial interventions. CONCLUSION: A dose of mOPV1 or mOPV3 at birth was superior to that of tOPV in inducing type-specific seroconversion in this sub-Saharan African population. Our results support continued use of mOPVs in supplemental immunization activities in countries where poliovirus types 1 or 3 circulate. CLINICAL TRIALS REGISTRATION: ISRCTN18107202. (See the editorial commentary by Cochi and Linkins, on pages 169-71.) |
Limited awareness of vaccines recommended for adolescents and other results from two national consumer health surveys in the United States
Kennedy A , Stokley S , Curtis CR , Gust D . J Adolesc Health 2012 50 (2) 198-200 PURPOSE: This study describes the vaccine-related knowledge and attitudes of adolescents aged 11-18 years and parents of adolescents aged 11-18 years. METHODS: We analyzed the 2007 HealthStyles and YouthStyles surveys related to vaccine knowledge and attitudes of parents (n = 1,208) and adolescents (n = 1,087). RESULTS: In all, 21% of parents and 11% of adolescents correctly identified the three vaccines recommended at the time of the survey for adolescents. Regarding the hypothetical scenario that minor adolescents should be allowed to consent to vaccination without parental knowledge, 70% of parents and 72% of adolescents disagreed. The majority of parents and adolescents recognized the importance of vaccines in protecting an adolescent's health yet a substantial minority of both groups also reported concerns about vaccine safety. CONCLUSIONS: Many parents and adolescents surveyed were not aware of all vaccine recommendations for adolescents and did not support adolescents receiving vaccinations independent of parental knowledge and/or consent. |
Hepatitis A vaccination coverage among adolescents in the United States
Dorell CG , Yankey D , Byrd KK , Murphy TV . Pediatrics 2012 129 (2) 213-21 OBJECTIVE: Hepatitis A infection causes severe disease among adolescents and adults. The Advisory Committee on Immunization Practices instituted incremental recommendations for hepatitis A vaccination (HepA): at 2 years of age based on risk (1996), in selected states (1999), and universally at 1 year of age, with vaccination through 18 years of age based on risk or desire for protection (2006). We assessed adolescent HepA coverage in the United States and factors independently associated with vaccination. METHODS: Data from the 2009 National Immunization Survey-Teen (n = 20 066) were analyzed to determine ≥1- and ≥2-dose HepA coverage among adolescents 13 to 17 years of age. We used bivariate and multivariable analyses to test associations between HepA initiation and sociodemographic characteristics stratified by state groups: group 1, universal child vaccination since 1999; group 2, consideration for child vaccination since 1999; group 3, universal child vaccination at 1 year of age since 2006. RESULTS: In 2009, national 1-dose HepA coverage among adolescents was 42.0%. Seventy percent of vaccinees completed the 2-dose series. One-dose coverage was 74.3% in group 1 states, 54.0% in group 2 states, and 27.8% in group 3 states. The adjusted prevalence ratios for vaccination initiation were highest for states with a vaccination requirement and for adolescents whose providers recommended HepA. CONCLUSIONS: HepA coverage was low among most adolescents in the United States in 2009, leaving a large population susceptible to hepatitis A infection as it matures into adulthood. |
Impact of more than a decade of pneumococcal conjugate vaccine use on carriage and invasive potential in Native American communities
Scott JR , Millar EV , Lipsitch M , Moulton LH , Weatherholtz R , Perilla MJ , Jackson DM , Beall B , Craig MJ , Reid R , Santosham M , O'Brien KL . J Infect Dis 2012 205 (2) 280-8 BACKGROUND: We assessed the impact of 12 years of pneumococcal conjugate vaccine (PCV7) use on pneumococcal nasopharyngeal carriage and serotype-specific invasive disease potential among Native Americans. METHODS: Families were enrolled in a carriage study from 2006 to 2008; nasopharyngeal specimens and risk factor information were collected monthly for 7 visits. Pneumococcal carriage prevalence was compared with that before (1998-2000) and during (2001-2002) PCV7 introduction. We compared invasive disease incidence and carriage prevalence before and after PCV7 introduction to estimate changes in serotype-specific invasive potential. RESULTS: We enrolled 1077 subjects from 302 households. There was an absolute reduction in carriage prevalence of 8.0% (95% confidence interval [CI], 4.5%-11.4%) in children aged <5 years and 3.1% (95% CI, 1.1%-5.1%) in adults. In children aged <5 years, vaccine-serotype carriage prevalence decreased by 22.8% (95% CI, 20.1%-25.3%), and nonvaccine-serotype (NVT) carriage prevalence increased by 15.9% (95% CI, 12.4%-19.3%). No significant change was detected in serotype-specific invasive potential after PCV7 introduction. CONCLUSIONS: Pneumococcal carriage prevalence has decreased in all ages since PCV7 introduction; vaccine-serotype carriage has been nearly eliminated, whereas the prevalence of NVT carriage has increased. The increase in the NVT invasive disease rate appears to be proportional to the increase in colonization prevalence. |
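The serotype-specific invasive potential compared in the methods is conventionally computed as a case-carrier ratio: invasive disease incidence divided by carriage prevalence. A minimal sketch with hypothetical numbers, not data from the study:

```python
def invasive_potential(invasive_incidence_per_100k, carriage_prevalence_pct):
    """Case-carrier ratio: invasive disease incidence per unit of carriage
    prevalence, a standard proxy for a serotype's invasiveness."""
    return invasive_incidence_per_100k / carriage_prevalence_pct

# Hypothetical nonvaccine serotype: if carriage doubles (2% -> 4%) and
# invasive incidence also doubles (5 -> 10 per 100,000), the invasive
# potential is unchanged -- the proportional pattern the study reports.
before = invasive_potential(5.0, 2.0)
after = invasive_potential(10.0, 4.0)
print(before, after)  # both 2.5
```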
Molecular dissection of an outbreak of carbapenem-resistant Enterobacteriaceae reveals intergenus KPC carbapenemase transmission through a promiscuous plasmid
Mathers AJ , Cox HL , Kitchel B , Bonatti H , Brassinga AK , Carroll J , Scheld WM , Hazen KC , Sifri CD . mBio 2011 2 (6) e00204-11 Carbapenem-resistant Enterobacteriaceae (CRE) have emerged as major causes of health care-associated infections worldwide. This diverse collection of organisms with various resistance mechanisms is associated with increased lengths of hospitalization, costs of care, morbidity, and mortality. The global spread of CRE has largely been attributed to dissemination of a dominant strain of Klebsiella pneumoniae producing a serine beta-lactamase, termed K. pneumoniae carbapenemase (KPC). Here we report an outbreak of KPC-producing CRE infections in which the degree of horizontal transmission between strains and species of a promiscuous plasmid is unprecedented. Sixteen isolates, comprising 11 unique strains, 6 species, and 4 genera of bacteria, were obtained from 14 patients over the first 8 months of the outbreak. Of the 11 unique strains, 9 harbored the same highly promiscuous plasmid carrying the KPC gene bla(KPC). The remaining strains harbored distinct bla(KPC) plasmids, one of which was carried in a strain of Klebsiella oxytoca coisolated from the index patient and the other generated from transposition of the bla(KPC) element Tn4401. All isolates could be genetically traced to the index patient. Molecular epidemiological investigation of the outbreak was aided by the adaptation of nested arbitrary PCR (ARB-PCR) for rapid plasmid identification. This detailed molecular genetic analysis, combined with traditional epidemiological investigation, provides insights into the highly fluid dynamics of drug resistance transmission during the outbreak. IMPORTANCE: The ease of horizontal transmission of carbapenemase resistance plasmids across strains, species, and genera of bacteria observed in this study has several important public health and epidemiological implications. 
First, it has the potential to promote dissemination of carbapenem resistance to new populations of Enterobacteriaceae, including organisms of low virulence, leading to the establishment of reservoirs of carbapenem resistance genes in patients and/or the environment, and organisms of high virulence, raising the specter of untreatable community-associated infections. Second, recognition of plasmid-mediated outbreaks, such as the one described here, is problematic because analysis of resistance plasmids from clinical isolates is laborious and technically challenging. Adaptation of nested arbitrary PCR (ARB-PCR) to investigate the plasmid outbreak facilitated our investigation, and the method may be broadly applicable to other outbreaks driven by other conserved mobile genetic elements. Whether infection control measures that focus on preventing transmission of drug-resistant clones are effective in controlling dissemination of these elements is unknown. |
Integrative deep sequencing of the mouse lung transcriptome reveals differential expression of diverse classes of small RNAs in response to respiratory virus infection
Peng X , Gralinski L , Ferris MT , Frieman MB , Thomas MJ , Proll S , Korth MJ , Tisoncik JR , Heise M , Luo S , Schroth GP , Tumpey TM , Li C , Kawaoka Y , Baric RS , Katze MG . mBio 2011 2 (6) We previously reported widespread differential expression of long non-protein-coding RNAs (ncRNAs) in response to virus infection. Here, we expanded the study through small RNA transcriptome sequencing analysis of the host response to both severe acute respiratory syndrome coronavirus (SARS-CoV) and influenza virus infections across four founder mouse strains of the Collaborative Cross, a recombinant inbred mouse resource for mapping complex traits. We observed differential expression of over 200 small RNAs of diverse classes during infection. A majority of identified microRNAs (miRNAs) showed divergent changes in expression across mouse strains with respect to SARS-CoV and influenza virus infections and responded differently to a highly pathogenic reconstructed 1918 virus compared to a minimally pathogenic seasonal influenza virus isolate. Novel insights into miRNA expression changes, including the association with pathogenic outcomes and large differences between in vivo and in vitro experimental systems, were further elucidated by a survey of selected miRNAs across diverse virus infections. The small RNAs identified also included many non-miRNA small RNAs, such as small nucleolar RNAs (snoRNAs), in addition to nonannotated small RNAs. An integrative sequencing analysis of both small RNAs and long transcripts from the same samples showed that the results revealing differential expression of miRNAs during infection were largely due to transcriptional regulation and that the predicted miRNA-mRNA network could modulate global host responses to virus infection in a combinatorial fashion. 
These findings represent the first integrated sequencing analysis of the response of host small RNAs to virus infection and show that small RNAs are an integrated component of complex networks involved in regulating the host response to infection. IMPORTANCE: Most studies examining the host transcriptional response to infection focus only on protein-coding genes. However, mammalian genomes transcribe many short and long non-protein-coding RNAs (ncRNAs). With the advent of deep-sequencing technologies, systematic transcriptome analysis of the host response, including analysis of ncRNAs of different sizes, is now possible. Using this approach, we recently discovered widespread differential expression of host long (>200 nucleotide [nt]) ncRNAs in response to virus infection. Here, the samples described in the previous report were again used, but we sequenced another fraction of the transcriptome to study very short (about 20 to 30 nt) ncRNAs. We demonstrated that virus infection also altered expression of many short ncRNAs of diverse classes. Putting the results of the two studies together, we show that small RNAs may also play an important role in regulating the host response to virus infection. |
Reference measurement procedure for total glycerides by isotope dilution GC-MS
Edwards SH , Stribling SL , Pyatt SD , Kimberly MM . Clin Chem 2012 58 (4) 768-76 BACKGROUND: The CDC's Lipid Standardization Program established the chromotropic acid (CA) reference measurement procedure (RMP) as the accuracy base for standardization and metrological traceability for triglyceride testing. The CA RMP has several disadvantages, including lack of ruggedness, and it uses obsolete instrumentation and hazardous reagents. To overcome these problems, the CDC developed an isotope dilution GC-MS (ID-GC-MS) RMP for total glycerides in serum. METHODS: We diluted serum samples with Tris-HCl buffer solution and spiked 200-mcL aliquots with [(13)C(3)]-glycerol. These samples were incubated and hydrolyzed under basic conditions. The samples were dried, derivatized with acetic anhydride and pyridine, extracted with ethyl acetate, and analyzed by ID-GC-MS. Linearity, imprecision, and accuracy were evaluated by analyzing calibrator solutions, 10 serum pools, and a standard reference material (SRM 1951b). RESULTS: The calibration response was linear for the range of calibrator concentrations examined (0-1.24 mmol/L), with a slope and intercept of 0.717 (95% CI, 0.7123-0.7225) and 0.3122 (95% CI, 0.3096-0.3140), respectively. The limit of detection was 14.8 mcmol/L. The mean %CV for the sample set (serum pools and SRM) was 1.2%. The mean %bias from NIST isotope dilution MS values for SRM 1951b was 0.7%. CONCLUSIONS: This ID-GC-MS RMP has the specificity and ruggedness to accurately quantify total glycerides in the serum pools used in the CDC's Lipid Standardization Program and demonstrates sufficiently acceptable agreement with the NIST primary RMP for total glyceride measurement. |
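As a rough illustration of how the reported calibration parameters would be used, the sketch below inverts the linear calibration to back-calculate a concentration from a measured response. Treating the response as a single ratio value is a simplification of the actual ID-GC-MS procedure:

```python
SLOPE = 0.7170       # reported calibration slope (95% CI 0.7123-0.7225)
INTERCEPT = 0.3122   # reported intercept (95% CI 0.3096-0.3140)
LOD_MMOL_L = 0.0148  # reported limit of detection (14.8 mcmol/L)

def total_glycerides_mmol_l(response):
    """Invert the linear calibration y = m*x + b to recover the total
    glyceride concentration from a measured response."""
    conc = (response - INTERCEPT) / SLOPE
    if conc < LOD_MMOL_L:
        return None  # below the reported limit of detection
    return conc

# Round trip: a 1.0 mmol/L calibrator should read back as 1.0 mmol/L.
reading = SLOPE * 1.0 + INTERCEPT
print(round(total_glycerides_mmol_l(reading), 6))
```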
Use of lean response to improve pandemic influenza surge in public health laboratories
Isaac-Renton JL , Chang Y , Prystajecky N , Petric M , Mak A , Abbott B , Paris B , Decker KC , Pittenger L , Guercio S , Stott J , Miller JD . Emerg Infect Dis 2012 18 (1) 57-62 A novel influenza A (H1N1) virus detected in April 2009 rapidly spread around the world. North American provincial and state laboratories have well-defined roles and responsibilities, including providing accurate, timely test results for patients and information for regional public health and other decision makers. We used a multidisciplinary response and rapid implementation of process changes based on Lean methods at the provincial public health laboratory in British Columbia, Canada, to improve laboratory surge capacity during the 2009 influenza pandemic. Observed and computer-simulated evaluation results from the rapid process changes showed that use of Lean tools successfully expanded surge capacity, enabling a response to the 10-fold increase in testing demand. |
Hollow-fiber ultrafiltration for simultaneous recovery of viruses, bacteria and parasites from reclaimed water
Liu P , Hill VR , Hahn D , Johnson TB , Pan Y , Jothikumar N , Moe CL . J Microbiol Methods 2012 88 (1) 155-61 Hollow-fiber ultrafiltration (UF) is a technique that has been reported to be effective for recovering a diverse array of microbes from water and may also be useful for microbial monitoring of effluent from water reclamation facilities. However, few data are available on the potential limitations and efficacy of the UF technique for treated wastewater. In this study, recovery efficiencies were determined for various options available for performing the tangential-flow UF technique, including hollow-fiber ultrafilter (i.e., dialyzer) type, ultrafilter pre-treatment (i.e., blocking), and elution. MS2 and PhiX174 bacteriophages, Clostridium perfringens spores, Escherichia coli, and Cryptosporidium parvum oocysts were seeded into 10-L reclaimed water samples to evaluate UF options. A single UF protocol was then established and studied using seeded and non-seeded 100-L samples from two water reclamation facilities in Georgia, USA. Baxter Exeltra Plus 210 and Fresenius F200NR dialyzers were found to provide significantly higher microbial recovery than Minntech HPH 1400 hemoconcentrators. The selected final UF method incorporated use of a non-blocked ultrafilter followed by elution with a surfactant-based solution. For 10-L samples, this method achieved recovery efficiencies greater than 50% for seeded viruses, bacteria, and parasites. There was no significant difference in overall microbial recovery efficiency when the method was applied to 10- and 100-L samples. In addition, detection levels for pathogens in seeded 100-L reclaimed water samples were 1000 PFU of HAV, 10,000 GI norovirus particles, <500 Salmonella, and <200 Cryptosporidium oocysts. 
These data demonstrate that UF can be an effective technique for recovering diverse microbes in reclaimed water to monitor and improve effluent water quality in wastewater treatment plants. |
Application of numerical methods for diffusion-based modeling of skin permeation
Frasch HF , Barbero AM . Adv Drug Deliv Rev 2012 65 (2) 208-20 The application of numerical methods for mechanistic, diffusion-based modeling of skin permeation is reviewed. Methods considered here are finite difference, method of lines, finite element, finite volume, random walk, cellular automata, and smoothed particle hydrodynamics. First, the methods are briefly explained with rudimentary mathematical underpinnings. Current state-of-the-art numerical models are described, and then a chronological overview of published models is provided. Key findings and insights of reviewed models are highlighted. Model results support a primarily transcellular pathway with anisotropic lipid transport. Future endeavors would benefit from a fundamental analysis of drug/vehicle/skin interactions. |
Discordant antigenic drift of neuraminidase and hemagglutinin in H1N1 and H3N2 influenza viruses
Sandbulte MR , Westgeest KB , Gao J , Xu X , Klimov AI , Russell CA , Burke DF , Smith DJ , Fouchier RA , Eichelberger MC . Proc Natl Acad Sci U S A 2011 108 (51) 20748-53 Seasonal epidemics caused by influenza virus are driven by antigenic changes (drift) in viral surface glycoproteins that allow evasion from preexisting humoral immunity. Antigenic drift is a feature of not only the hemagglutinin (HA), but also of neuraminidase (NA). We have evaluated the antigenic evolution of each protein in H1N1 and H3N2 viruses used in vaccine formulations during the last 15 y by analysis of HA and NA inhibition titers and antigenic cartography. As previously shown for HA, genetic changes in NA did not always lead to an antigenic change. The noncontinuous pattern of NA drift did not correspond closely with HA drift in either subtype. Although NA drift was demonstrated using ferret sera, we show that these changes also impact recognition by NA-inhibiting antibodies in human sera. Remarkably, a single point mutation in the NA of A/Brisbane/59/2007 was primarily responsible for the lack of inhibition by polyclonal antibodies specific for earlier strains. These data underscore the importance of NA inhibition testing to define antigenic drift when there are sequence changes in NA. |
Epidemiology of late and moderate preterm birth
Shapiro-Mendoza CK , Lackritz EM . Semin Fetal Neonatal Med 2012 17 (3) 120-5 Preterm birth affects 12.5% of all births in the USA. Infants of Black mothers are disproportionately affected, with 1.5 times the risk of preterm birth and 3.4 times the risk of preterm-related mortality. The preterm birth rate has increased by 33% in the last 25 years, almost entirely due to the rise in late preterm births (34-36 weeks' gestation). Recently, attention has been given to uncovering the often subtle morbidity and mortality risks associated with moderate (32-33 weeks' gestation) and late preterm delivery, including respiratory, infectious, and neurocognitive complications and infant mortality. This article summarizes the epidemiology of moderate and late preterm birth, case definitions, risk factors, recent trends, and the emerging body of knowledge of morbidity and mortality associated with moderate and late preterm birth. |
Clinical sepsis in neonates and young infants, United States, 1988-2006
Lukacs SL , Schrag SJ . J Pediatr 2012 160 (6) 960-5 e1 OBJECTIVE: To describe the burden and characteristics of clinical neonatal sepsis in the United States and evaluate incidence rates after the issuance of intrapartum antibiotic prophylaxis (IAP) guidelines. STUDY DESIGN: This is a cross-sectional study of hospitalizations of infants aged <3 months diagnosed with sepsis from the 1988-2006 National Hospital Discharge Survey. The National Hospital Discharge Survey collects data annually on inpatient discharges from a national probability sample of approximately 500 short-stay hospitals. We examined sepsis hospitalizations, defined by International Classification of Diseases, Ninth Revision, Clinical Modification codes, and compared sepsis hospitalization rates for 2 time periods after the issuance of IAP guidelines (1996-2001 and 2002-2006) with 1988-1995 using national natality data as the population denominator. We used Joinpoint (Surveillance Research Program, National Cancer Institute, Bethesda, Maryland) regression to assess the average annual percent change (AAPC) in rates. RESULTS: Between 1988 and 2006, there were more than 2.5 million sepsis-related hospitalizations in infants aged <3 months (112,000-146,000 annually). In 2006, the sepsis hospitalization rate was 30.8/1000 births. The rate was more than 3 times higher in preterm infants compared with term infants (85.4/1000 preterm births vs 23.1/1000 term births). The AAPC in sepsis hospitalization rate was -3.6% (95% CI, -5.1% to -2.1%) for term infants during 1996-2002 and did not change significantly after issuance of the revised 2002 guidelines. For preterm infants, the AAPC was -1.2% (95% CI, -2.2% to 0.1%) annually from 1988 to 2006. CONCLUSION: Clinical neonatal sepsis declined in the post-IAP era, mirroring trends observed in group B streptococcal early-onset neonatal sepsis surveillance. 
Preterm infants were affected disproportionately and exhibited a modest but steady decline in sepsis hospitalization rate. |
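The AAPC statistic in the sepsis entry above comes from log-linear trend models fit with the Joinpoint software; as a minimal illustrative sketch of the underlying arithmetic only (not the Joinpoint method itself), an annual percent change can be derived from the slope of an ordinary least-squares fit of log(rate) on year, using made-up rates rather than the NHDS data:

```python
import math

# Hypothetical hospitalization rates per 1000 births over 5 years
# (illustrative values only, not the NHDS data).
years = [0, 1, 2, 3, 4]
rates = [40.0, 38.5, 37.2, 35.8, 34.5]

# Ordinary least-squares slope of log(rate) on year.
n = len(years)
xbar = sum(years) / n
logy = [math.log(r) for r in rates]
ybar = sum(logy) / n
b = (sum((x - xbar) * (y - ybar) for x, y in zip(years, logy))
     / sum((x - xbar) ** 2 for x in years))

# Annual percent change implied by the log-linear model:
# a constant multiplicative change of exp(b) per year.
apc = (math.exp(b) - 1) * 100  # about -3.6% per year for this series
```

Joinpoint additionally searches for change points in the trend and averages segment-specific APCs into an AAPC; this sketch shows only the single-segment case.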
The rates of abnormal glucose challenge tests and gestational diabetes in women receiving 17alpha-hydroxyprogesterone caproate
Wolfe K , Dearmond C , How H , Henderson ZT , Sibai B . Am J Perinatol 2011 28 (10) 741-6 We compared the rates of abnormal 1-hour glucose challenge tests (GCT) and gestational diabetes (GDM) between women receiving 17alpha-hydroxyprogesterone caproate (17-P) and women who did not receive 17-P to determine if the effect varies based on the number of doses received or in a group of high-risk obese women. We performed a secondary analysis of a prospective cohort study where women with a history of a previous preterm delivery in the antecedent pregnancy followed at a high-risk clinic were offered 17-P. GCT was performed after the initiation of 17-P, and doses given prior to testing were recorded. Rates of abnormal GCT and GDM were compared between those receiving 17-P (N = 67) and controls (N = 140). Mean glucose values (112.4 versus 111.3, P = 0.8), rate of abnormal GCT (23.9% versus 20%, adjusted odds ratio 1.45, 95% confidence interval 0.7 to 3.0), and rate of GDM (6% versus 8.6%, adjusted odds ratio 1.21, 95% confidence interval 0.3 to 4.5) were similar between groups. In this prospective study, 17-P administration to women at risk of recurrent preterm delivery did not significantly affect glucose tolerance. |
Program experience with micronutrient powders and current evidence
Rah JH , Depee S , Kraemer K , Steiger G , Bloem MW , Spiegel P , Wilkinson C , Bilukha O . J Nutr 2012 142 (1) 191S-6S The efficacy of micronutrient powders (MNP) in the treatment of anemia in moderately anemic children aged 6-24 mo has been clearly demonstrated. The evidence of the effectiveness of MNP in large-scale programs, however, is scarce. This article describes the program experience and findings of large-scale MNP distribution in refugee camps and in an emergency context in Bangladesh, Nepal, and Kenya. The MNP contained 15-16 micronutrients as per the WHO/World Food Programme/UNICEF joint statement, whereas the iron content was reduced to 2.5 mg from NaFeEDTA in a malaria-endemic area in Kenya. Hundreds of thousands of children aged 6-59 mo and pregnant and lactating women were targeted to consume MNP either daily or every other day over an extended period of time. Extensive social marketing campaigns were undertaken to promote regular use of the product. A number of studies were embedded in the programs to assess the impact of MNP on the nutritional status of target beneficiaries. Some improvements in anemia prevalence estimates were observed in particular subgroups, but other results did not show significant improvements. A significant decrease in the prevalence of stunting was observed in Nepal and Kenya but not in Bangladesh. Diarrhea episodes decreased significantly among children receiving MNP in Nepal. A key challenge is to ensure high MNP acceptance and adherence among beneficiaries. Investigation of non-nutritional causes of anemia is warranted in settings with high compliance but no improvement in hemoglobin status. Further investigation into the most appropriate manner to use MNP in malaria endemic settings is warranted. |
Correcting for inflammation changes estimates of iron deficiency among rural Kenyan preschool children
Grant FK , Suchdev PS , Flores-Ayala R , Cole CR , Ramakrishnan U , Ruth LJ , Martorell R . J Nutr 2012 142 (1) 105-11 The assessment of iron status where infections are common is complicated by the effects of inflammation on iron indicators; in this study, we compared approaches that adjust for this influence. Blood was collected from 680 children (aged 6-35 mo) and indicators of iron status [hemoglobin (Hb), zinc protoporphyrin (ZP), ferritin, transferrin receptor (TfR), and the TfR/ferritin index] and subclinical inflammation [the acute phase proteins (APP) C-reactive protein (CRP) and alpha-1-acid glycoprotein (AGP)] were determined. Malaria parasitemia was assessed. Subclinical inflammation was defined as CRP >5 mg/L and/or AGP >1 g/L. Four groups were defined based on APP levels: reference (normal CRP and AGP), incubation (raised CRP and normal AGP), early convalescence (raised CRP and AGP), and late convalescence (normal CRP and raised AGP). Correction factors (CF) were estimated as the ratios of the geometric means of iron indicators in the reference group to those in each inflammation group. Corrected values of iron indicators within inflammation groups were obtained by multiplying values by their respective group CF. CRP correlated with AGP (r = 0.65; P < 0.001), ferritin (r = 0.38; P < 0.001), Hb (r = -0.27; P < 0.001), and ZP (r = 0.16; P < 0.001); AGP was correlated with ferritin (r = 0.39; P < 0.001), Hb (r = -0.29; P < 0.001), and ZP (r = 0.24; P < 0.001). Use of CF to adjust for inflammation increased the prevalence of ID based on ferritin <12 µg/L by 34% (from 27% to 41%). Applying the CF strengthened the expected relationship between Hb and ferritin (r = 0.10; P = 0.013 vs. r = 0.20; P < 0.001, before and after adjustment, respectively). Although the use of CF to adjust for inflammation appears indicated, further work is needed to confirm that this approach improves the accuracy of assessment of ID. |
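The correction-factor (CF) adjustment described in the abstract above is simple arithmetic on group geometric means; a minimal sketch, using hypothetical ferritin values rather than the study data, is:

```python
import math

# Hypothetical ferritin values (µg/L) labeled with inflammation group,
# following the paper's four groups (reference, incubation, early
# convalescence, late convalescence). Values are illustrative only.
samples = [
    ("reference", 30.0), ("reference", 18.0), ("reference", 25.0),
    ("incubation", 45.0), ("incubation", 60.0),
    ("early", 80.0), ("early", 95.0),
    ("late", 50.0), ("late", 40.0),
]

def geometric_mean(values):
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Geometric mean of the indicator within each inflammation group.
groups = {}
for g, v in samples:
    groups.setdefault(g, []).append(v)
gms = {g: geometric_mean(vs) for g, vs in groups.items()}

# CF per group: ratio of the reference group's geometric mean to that
# group's geometric mean (so the reference group's CF is 1).
cf = {g: gms["reference"] / gm for g, gm in gms.items()}

# Corrected values: multiply each value by its group's CF.
corrected = [(g, v * cf[g]) for g, v in samples]
```

Because inflammation raises ferritin, the CF for inflamed groups is below 1, pulling their values down toward the reference distribution and thereby raising the apparent prevalence of iron deficiency, as the study reports.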
A simulation study of the potential effects of healthy food and beverage substitutions on diet quality and total energy intake in Lower Mississippi Delta adults
Thomson JL , Tussing-Humphreys LM , Onufrak SJ , Zoellner JM , Connell CL , Bogle ML , Yadrick K . J Nutr 2011 141 (12) 2191-7 The majority of adult diets in the United States, particularly in the South, are of poor quality, putting these individuals at increased risk for chronic diseases. In this study, simulation modeling was used to determine the effects of substituting familiar, more healthful foods and beverages for less healthy ones on diet quality and total energy intake in Lower Mississippi Delta (LMD) adults. Dietary data collected in 2000 for 1689 LMD adults who participated in the Foods of Our Delta Study were analyzed. The Healthy Eating Index-2005 (HEI-2005) was used to measure diet quality. The effects of substituting targeted foods and beverages with more healthful items on diet quality were simulated by replacing the targeted items' nutrient profile with their replacements' profile. For the single food and beverage groups, 100% replacement of grain desserts with juice-packed fruit cocktail and sugar-sweetened beverages with water resulted in the largest improvements in diet quality (4.0 and 3.8 points, respectively) and greatest decreases in total energy intake (98 and 215 kcal/d, respectively). The 100% substitution of all food and beverage groups combined resulted in a 12.0-point increase in HEI-2005 score and a decrease of 785 kcal/d in total energy intake. Community interventions designed to improve the diet of LMD adults through the use of familiar, healthy food and beverage substitutions have the potential to improve diet quality and decrease energy intake of this health-disparate population. |
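The substitution simulation described above amounts to swapping a targeted item's nutrient profile for its replacement's profile and re-tallying intake. A minimal sketch, with illustrative food items (not the study's data or the full HEI-2005 scoring) and only energy tracked for brevity:

```python
# Hypothetical one-day intake record; kcal values are illustrative.
diet = [
    {"name": "grain dessert", "kcal": 250},
    {"name": "sugar-sweetened beverage", "kcal": 215},
    {"name": "vegetables", "kcal": 80},
]

def simulate_substitution(intakes, target, replacement_profile):
    """Return a new intake list in which every occurrence of the target
    item's profile is replaced by the replacement's profile."""
    return [replacement_profile if item["name"] == target else item
            for item in intakes]

def total_energy(intakes):
    return sum(item["kcal"] for item in intakes)

# 100% replacement of sugar-sweetened beverages with water.
new_diet = simulate_substitution(
    diet, "sugar-sweetened beverage", {"name": "water", "kcal": 0})

saved = total_energy(diet) - total_energy(new_diet)  # 215 kcal/d here
```

In the study, the same swap was applied to full nutrient profiles so that HEI-2005 component scores could be recomputed, not just energy.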
Work experiences of Latina immigrants
Eggerth DE , DeLaney SC , Flynn MA , Jacobson CJ . J Career Dev 2012 39 (1) 13-30 Almost half of the Latino immigrants working in the United States are women. However, studies concerning the work experiences of Latinas are almost absent in the literature. This article reports the findings from a qualitative study using eight focus groups (n = 53) of Latina immigrant workers. The focus group transcripts were analyzed using the grounded theory approach in which themes emerge from iterative readings of the transcripts by a group of investigators. This study identified themes related to excessive workload, familiar work/unfamiliar hazards, cultural tensions, lack of health care, pregnancy, sexual harassment, and family obligations/expectations. The responses of the Latina workers in this study clearly indicated that they live within a complex web of stressors, both as workers and as women. The increased economic opportunities that come with immigration to the United States are accompanied by many opportunities for exploitation, especially for those who are undocumented. It is hoped that the findings of this study will raise awareness regarding these issues and spur further work in this area. |
Occupational gradients in smoking behavior and exposure to workplace environmental tobacco smoke: the multi-ethnic study of atherosclerosis
Fujishiro K , Stukovsky KD , Roux AD , Landsbergis P , Burchfiel C . J Occup Environ Med 2012 54 (2) 136-45 OBJECTIVE: This study examines associations of occupation with smoking status, amount smoked among current and former smokers (number of cigarettes per day and lifetime cigarette consumption (pack-years)), and workplace exposure to environmental tobacco smoke (ETS) independent from income and education. METHODS: This is a cross-sectional analysis of data from a community sample (n = 6355, age range: 45-84) using logistic and multinomial regression. All analyses were stratified by sex and adjusted for socio-demographic variables. RESULTS: Male blue-collar and sales/office workers had higher odds of having consumed more than 20 pack-years of cigarettes than managers/professionals. For both male and female current or former smokers, exposure to workplace ETS was consistently and strongly associated with heavy smoking and greater pack-years. CONCLUSIONS: Blue-collar workplaces are associated with intense smoking and ETS exposure. Smoking must be addressed at both the individual and workplace levels especially in blue-collar workplaces. |
An evaluation of the proposed revision of the anti-vibration glove test method defined in ISO 10819 (1996)
Welcome DE , Dong RG , Xu XS , Warren C , McDowell TW . Int J Ind Ergon 2012 42 (1) 143-155 To improve the reliability of the anti-vibration (AV) glove test defined in the current standard, a revised version of the standard has been proposed. However, the revised method has not been fully tested and sufficiently evaluated, and it is unknown whether it is practically feasible and convenient to implement the standard. To help achieve the objective of the revision, the specific aims of this study are to examine the rationale behind the major revisions of the standard and to evaluate the major technical aspects of the revised method through an experiment. Five human subjects participated in the experiment for the evaluation. Fifteen gloves with anti-vibration features were used in the experiment. To help evaluate the AV glove criteria, the effects of the glove on grip strength were also examined. While this study failed to realize the constant-velocity spectrum proposed in the original revision, the glove vibration transmissibility values measured with a new spectrum proposed in the current study were very similar to those measured with the M and H spectra defined in the current standard. This suggests that the new spectrum can greatly simplify the test without changing the original test results, and that it should be adopted in the further revision of the standard. The results of this study also strongly support the proposed major revisions in the instrumentation and test procedures. Coincidentally, the glove that reduced grip strength the least was also the one that reduced vibration the most, which suggests that the negative and positive effects of the glove can be balanced in its design. While the subject is identified as a major influencing factor, this study proposed a novel approach: the use of a reference glove in the test to minimize inter-subject and inter-laboratory variations. 
Based on the results of this study, some further revisions in the test procedures, evaluation methods, and AV glove criteria were also proposed and discussed. Relevance to industry: Anti-vibration gloves have been used as an alternative approach to reduce hand-transmitted vibration exposure. A standard is required to conduct a reliable screening test to help select appropriate anti-vibration gloves. This study can significantly help improve the current standard on the test. The results of this study can also be directly used to help select appropriate AV gloves. |
Association between depressive symptoms and metabolic syndrome in police officers: results from two cross-sectional studies
Hartley TA , Knox SS , Fekedulegn D , Barbosa-Leiker C , Violanti JM , Andrew ME , Burchfiel CM . J Environ Public Health 2012 2012 861219 Policing is one of the most dangerous and stressful occupations, and such stress can have deleterious effects on health. The purpose of this study was to examine the association between depressive symptoms and metabolic syndrome (MetSyn) in male and female police officers from two study populations, Buffalo, NY, and Spokane, WA. Depressive symptoms were measured using the Center for Epidemiologic Studies-Depression (CES-D) scale. MetSyn was defined using the 2005 AHA/NHLBI guidelines. Analysis of covariance was used to describe differences in number of MetSyn components across depressive symptom categories. The number of MetSyn components increased significantly across categories of CES-D for Spokane men only (p-trend = 0.003). For each 5-unit increase in CES-D score, odds increased by 47.6% for having hypertriglyceridemia, by 51.8% for having hypertension, and by 56.7% for having glucose intolerance. Exploring this association is important since both are predictors of future chronic health problems, and the results could be helpful in developing future gender-specific prevention and intervention efforts among police officers. |
Body mass index versus dual energy x-ray absorptiometry-derived indexes: predictors of cardiovascular and diabetic disease risk factors
Sharp DS , Andrew ME , Burchfiel CM , Violanti JM , Wactawski-Wende J . Am J Hum Biol 2012 24 (4) 400-5 OBJECTIVES: The body mass index (BMI), a ratio of weight/height², dominates estimation of adiposity in population studies. BMI, however, does not distinguish among fat, muscle, or bone mass. Accordingly, its usage to assess and manage obesity in the population is limited. This study compares the use of BMI with direct measures of fat- and lean-mass to predict established cardiovascular and diabetes risk factors: blood pressure, lipids, and glucose. METHODS: The entire Buffalo Police Department was the object of recruitment to a baseline study of physiological and psychological stress. Four hundred nine officers constitute the sample for this analysis. Regression methods focusing on explained variance in blood pressure, high density lipoprotein (HDL) cholesterol, and blood glucose compare the use of BMI to that of fat- and lean-mass indexes derived from dual energy X-ray absorptiometry (DEXA). RESULTS: DEXA indexes explain 1.6%-3.3% (P < 0.05, all risk factors) more variance than BMI. Fat mass drives the association for blood pressure, trunk lean mass for HDL cholesterol, and both for blood glucose. High degrees of multicollinearity complicate interpretation of predictive models jointly containing BMI and DEXA indexes. CONCLUSIONS: In police officers, DEXA indexes are better predictors of cardiovascular disease and diabetes risk factors. However, populations with different distributions of fitness, diet, and health conditions may demonstrate different features. In contrast to BMI, DEXA-derived measurements suggest avenues to explore metabolic processes, which relate to an index's underlying association with risk and may suggest more effective intervention strategies. (c) 2012 Wiley Periodicals, Inc. |
Cadmium and lung cancer mortality accounting for simultaneous arsenic exposure
Park RM , Stayner LT , Petersen MR , Finley-Couch M , Hornung R , Rice C . Occup Environ Med 2012 69 (5) 303-9 OBJECTIVES: Prior investigations identified an association between airborne cadmium and lung cancer, but questions remain regarding confounding by arsenic, a well-established lung carcinogen. METHODS: A cadmium smelter population exhibiting excess lung cancer was re-analysed using a retrospective exposure assessment for arsenic (As), updated mortality (1940-2002), a revised cadmium (Cd) exposure matrix and improved work history information. RESULTS: Cumulative exposure metrics for both cadmium and arsenic were strongly associated, making estimation of their independent effects difficult. Standardised mortality ratios (SMRs) were modelled with Poisson regression, with the contribution of arsenic to lung cancer risk constrained by exposure-response estimates previously reported. The results demonstrate (1) a statistically significant effect of Cd independent of As (SMR=3.2 for 10 mg-year/m³ Cd, p=0.012), (2) a substantial healthy worker effect for lung cancer (for unexposed workers, SMR=0.69) and (3) a large deficit in lung cancer mortality among Hispanic workers (SMR=0.27, p=0.009), known to have low lung cancer rates. A supralinear dose-rate effect was observed (the contribution to risk with increasing exposure intensity has a declining positive slope). Lung cancer mortality was somewhat better predicted using a cadmium burden metric with a half-life of about 20-25 years. CONCLUSIONS: These findings support an independent effect for cadmium in risk of lung cancer mortality. An excess lifetime risk of lung cancer death of 1/1000 is predicted from an airborne exposure of about 2.4 µg/m³ Cd. |
Checklist model to improve work practices in small-scale demolition operations with silica dust exposures
Muianga C , Rice C , Lentz T , Lockey J , Niemeier R , Succop P . Int J Environ Res Public Health 2012 9 (2) 343-361 A systematic approach was developed to review, revise and adapt existing exposure control guidance used in developed countries for use in developing countries. One-page employee and multiple-page supervisor guidance sheets were adapted from existing documents using a logic framework, and workers were trained to use the information to improve work practices. Interactive, hands-on training was delivered to 26 workers at five small-scale demolition projects in Maputo City, Mozambique, and evaluated. Walkthrough surveys conducted by trained observers before and after the intervention documented work practice changes. Worker feedback indicated that the training was effective and useful. Workers acquired knowledge (84% increase, p < 0.01) and applied the work practice guidance. The difference of proportions between use of work practice components before and after the intervention was statistically significant (p < 0.05). Changes in work practices following training included preplanning, use of wet methods and natural ventilation, and end-of-task review. Respirable dust measurements indicated a reduction in exposure following training. Consistency in observer ratings and observations supports the reliability and validity of the instruments. This approach demonstrated the short-term benefit of training in changing work practices; follow-up is required to determine the long-term impact on changes in work practices and to evaluate the need for refresher training. |
Digital 3-D headforms representative of Chinese workers
Yu Y , Benson S , Cheng W , Hsiao J , Liu Y , Zhuang Z , Chen W . Ann Occup Hyg 2012 56 (1) 113-22 Headforms are useful for designing and testing various types of personal protective equipment used to protect millions of workers from occupational hazards in China. Although the Chinese national standard of head-and-face dimensions for adults was first published in 1981, headforms based on those dimensions were never developed. In 2006, an anthropometric survey of 3000 Chinese civilian workers was conducted. As part of the survey, 350 subjects were scanned with a Cyberware 3D Rapid Digitizer. The manual measurements and 3-D digital scans from this survey were used to develop 3-D digital headforms that represent Chinese workers. OBJECTIVE: The objective of this study was to develop headforms that represent today's Chinese workers. METHODS: Ten facial dimensions relevant to respirator fit were chosen for defining a principal component analysis model which divides the user population into five face size categories. Mean facial dimensions from manual measurements were then computed to target the ideal facial dimensions for each size category. Five scans were chosen from each face size category to be used in the construction process. Selected scans were then averaged to construct a representative headform for each face size category. RESULTS: Five digital 3-D headforms were developed: small, medium, large, long/narrow, and short/wide. These distinct sizes of digital 3-D headforms take into account the linear distance between landmarks as well as the surface contours captured during the 3-D scan. The dimensions of constructed headforms were within approximately 4 mm between the corresponding computed means and manual measurements of anthropometric landmarks for the sample population in each size category. CONCLUSIONS: These new headforms represent the facial size and shape distribution of current Chinese workers and may be useful for respirator research and development. 
The Chinese medium headform has a wider face width, shorter face length, and smaller nose protrusion when compared with the current U.S. standard headforms. Upon validation, it may be useful to incorporate these dimensions into Chinese and international respiratory protective devices standards. |
Preparation, certification and interlaboratory analysis of workplace air filters spiked with high-fired beryllium oxide
Oatts TJ , Hicks CE , Adams AR , Brisson MJ , Youmans-McDonald LD , Hoover MD , Ashley K . J Environ Monit 2011 14 (2) 391-401 Occupational sampling and analysis for multiple elements is generally approached using various approved methods from authoritative government sources such as the National Institute for Occupational Safety and Health (NIOSH), the Occupational Safety and Health Administration (OSHA) and the Environmental Protection Agency (EPA), as well as consensus standards bodies such as ASTM International. The constituents of a sample can exist as unidentified compounds requiring sample preparation to be chosen appropriately, as in the case of beryllium in the form of beryllium oxide (BeO). An interlaboratory study was performed to collect analytical data from volunteer laboratories to examine the effectiveness of methods currently in use for preparation and analysis of samples containing calcined BeO powder. NIST SRM® 1877 high-fired BeO powder (1100 to 1200 °C calcining temperature; count median primary particle diameter 0.12 µm) was used to spike air filter media as a representative form of beryllium particulate matter present in workplace sampling that is known to be resistant to dissolution. The BeO powder standard reference material was gravimetrically prepared in a suspension and deposited onto 37 mm mixed cellulose ester air filters at five different levels between 0.5 µg and 25 µg of Be (as BeO). Sample sets consisting of five BeO-spiked filters (in duplicate) and two blank filters, for a total of twelve unique air filter samples per set, were submitted as blind samples to each of 27 participating laboratories. Participants were instructed to follow their current process for sample preparation and utilize their normal analytical methods for processing samples containing substances of this nature. Laboratories using more than one sample preparation and analysis method were provided with more than one sample set. 
Results from 34 data sets ultimately received from the 27 volunteer laboratories were subjected to applicable statistical analyses. The observed performance data show that sample preparations using nitric acid alone, or combinations of nitric and hydrochloric acids, are not effective for complete extraction of Be from the SRM 1877 refractory BeO particulate matter spiked on air filters; but that effective recovery can be achieved by using sample preparation procedures utilizing either sulfuric or hydrofluoric acid, or by using methodologies involving ammonium bifluoride with heating. Laboratories responsible for quantitative determination of Be in workplace samples that may contain high-fired BeO should use quality assurance schemes that include BeO-spiked sampling media, rather than solely media spiked with soluble Be compounds, and should ensure that methods capable of quantitative digestion of Be from the actual material present are used. |
Survey of physician diagnostic and treatment practices for patients with acute diarrhea in Guangdong province, China
Ke B , Ran L , Wu S , Deng X , Ke C , Feng Z , Ma L , Varma JK . Foodborne Pathog Dis 2012 9 (1) 47-53 Although international clinical guidelines generally recommend performing bacterial stool culture in patients with acute diarrhea and fever and discourage routine antibiotic prescribing, clinical practice varies. Understanding practice patterns can help health officials assess the sensitivity of laboratory-based enteric infection surveillance systems and the need to improve antibiotic prescribing practices. We surveyed physicians in Guangdong province, China, to measure their practices for patients with acute diarrhea. A standardized questionnaire was used to interview physicians working in hospitals participating in a Salmonella surveillance system in Guangdong, China. The questionnaire asked physicians about their routine practice for patients with diarrhea, including how they managed the last patient they had seen with acute diarrhea. We calculated the odds ratio and 95% confidence interval for factors associated with ordering a stool culture and for prescribing antibiotics. We received surveys from 237 physicians across 22 hospitals in Guangdong. For the last patient with diarrhea whom they had evaluated, 134 (57%) reported ordering a stool culture. The most common reasons for not ordering a stool culture included that it takes too long to receive the result, that the patient is not willing to pay for the test, and that the patient's illness was too mild to warrant testing. Most physicians prescribed at least one medication for the last patient with diarrhea whom they had evaluated. Of the 237 physicians surveyed, 153 (65%) prescribed antibiotics, 135 (57%) probiotics, and 115 (49%), a gastric mucosal protective drug. In conclusion, physicians in Guangdong, China, reported high rates of ordering bacterial stool cultures from patients with diarrhea, possibly associated with their hospital's participation in a special surveillance project. The high rate of antibiotic prescribing suggests that efforts to promote judicious antibiotic use, such as physician education, are needed. |
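Several of the studies in this collection report odds ratios with 95% confidence intervals for factors associated with an outcome (here, ordering a stool culture or prescribing antibiotics). As a hedged illustration of how such an estimate is typically computed from a 2x2 table, using the standard Wald interval; the counts below are invented for illustration and are not taken from the study:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
                 exposed   unexposed
    outcome        a          b
    no outcome     c          d
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) is the square root of the
    # sum of reciprocal cell counts.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical counts, for illustration only
or_, lo, hi = odds_ratio_ci(40, 20, 30, 45)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

If the confidence interval excludes 1.0, the association is conventionally reported as statistically significant at the 5% level.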
Returning to our roots: immigrant populations at work
Stebleton MJ , Eggerth DE . J Career Dev 2012 39 (1) 3-12 This introductory article to the special issue on immigrants and work provides a historical context of the career development profession. Beginning with Parsons and the early reformers of the 1900s, the authors contend that the field was founded on principles of social justice and multiculturalism with an aim toward societal change. Just as helping professionals assisted the new immigrants of the previous century, there is a need and opportunity to be of service to the immigrants of the 21st century. Unique career-related issues for immigrant clients are discussed. An overview of the six pieces in this volume is briefly described along with common themes. |
Is the accuracy of self-reported colorectal cancer screening associated with social desirability?
Vernon SW , Abotchie PN , McQueen A , White A , Eberth JM , Coan SP . Cancer Epidemiol Biomarkers Prev 2012 21 (1) 61-5 BACKGROUND: Self-reported cancer screening behaviors are often overreported and may lead to biased estimates of prevalence and of subgroup differences in screening. We examined whether the tendency to give socially desirable responses was associated with concordance between self-reported colorectal cancer (CRC) screening behaviors and medical records. METHODS: Primary care patients (n = 857) age 50 to 74 years completed a mail, face-to-face, or telephone survey that assessed CRC screening and social desirability measured by a short version of the Marlowe-Crowne scale. We used medical records to verify self-reports of fecal occult blood testing (FOBT), sigmoidoscopy, colonoscopy, and barium enema. RESULTS: Social desirability scores were lower for whites versus African Americans, college graduates, and patients reporting no prior screening tests; they were higher for telephone versus mail or face-to-face survey respondents. In univariable logistic regression analysis, social desirability scores were not associated with concordance for FOBT (OR = 1.03, 95% CI = 0.94-1.13), sigmoidoscopy (OR = 0.95, 95% CI = 0.86-1.04), or colonoscopy (OR = 0.99, 95% CI = 0.88-1.11); however, lower social desirability scores were associated with increased concordance for barium enema (OR = 0.87, 95% CI = 0.77-0.99). In multivariable analyses, no associations were statistically significant. CONCLUSION: Social desirability as measured by the Marlowe-Crowne scale was not associated with accuracy of self-reported CRC tests in our sample, suggesting that other explanations for overreporting need to be explored. IMPACT: By understanding sources of response bias, we can improve the accuracy of self-report measures. (Cancer Epidemiol Biomarkers Prev; 21(1); 61-65. (c)2011 AACR.) |
Applying the theory of work adjustment to Latino immigrant workers
Eggerth DE , Flynn MA . J Career Dev 2012 39 (1) 76-98 Blustein mapped career decision making onto Maslow’s model of motivation and personality and concluded that most models of career development assume opportunities and decision-making latitude that do not exist for many individuals from low income or otherwise disadvantaged backgrounds. Consequently, Blustein argued that these models may be of limited utility for such individuals. Blustein challenged researchers to reevaluate current career development approaches, particularly those assuming a static world of work, from a perspective allowing for changing circumstances and recognizing career choice can be limited by access to opportunities, personal obligations, and social barriers. This article represents an exploratory effort to determine if the theory of work adjustment (TWA) might meaningfully be used to describe the work experiences of Latino immigrant workers, a group living with severe constraints and having very limited employment opportunities. It is argued that there is significant conceptual convergence between Maslow’s hierarchy of needs and the work reinforcers of TWA. The results of an exploratory, qualitative study with a sample of 10 Latino immigrants are also presented. These immigrants participated in key informant interviews concerning their work experiences both in the United States and in their home countries. The findings support Blustein’s contention that such workers will be most focused on basic survival needs and suggest that TWA reinforcers are descriptive of important aspects of how Latino immigrant workers conceptualize their jobs. |
A life course perspective on migration and mental health among Asian immigrants: the role of human agency
Gong F , Xu J , Fujishiro K , Takeuchi DT . Soc Sci Med 2011 73 (11) 1618-26 The relationship between human agency and health is an important yet under-researched topic. This study uses a life course perspective to examine how human agency (measured by voluntariness, migratory reasons, and planning) and timing (measured by age at immigration) affect mental health outcomes among Asian immigrants in the United States. Data from the National Latino and Asian American Study showed that Asian immigrants (n=1491) with multiple strong reasons to migrate were less likely to suffer from mental health problems (i.e., psychological distress and psychiatric disorders in the past 12 months) than those without clear goals. Moreover, Asian immigrants with adequate migratory planning had lower levels of distress and lower rates of 12-month psychiatric disorders than those with poorly planned migration. Compared with migrants of the youngest age category (six or younger), those who migrated during preteen and adolescent years without clear goals had higher levels of psychological distress, and those who migrated during adulthood (25 years or older) were less likely to suffer from recent depressive disorders (with the exception of those migrating for life-improving goals). Furthermore, we found that well-planned migration lowered acculturative stress, and multiple strong reasons for migration buffered the negative effect of acculturative stress upon mental health. Findings from this study advance research on immigrant health from the life course perspective by highlighting the effects of exercising human agency during the pre-migration stage upon post-migration mental health. |
Self-reported alcohol-impaired driving in the U.S., 2006 and 2008
Bergen G , Shults RA , Beck LF , Qayad M . Am J Prev Med 2012 42 (2) 142-9 BACKGROUND: Alcohol-impaired driving caused 10,839 deaths in 2009. Alcohol-impaired driving fatalities as a percentage of all motor vehicle fatalities decreased from 1982 to 1999 but have remained stable since. Understanding characteristics of those who engage in this behavior is critical to achieving future reductions. PURPOSE: The purpose of this study is to estimate the number of episodes of self-reported alcohol-impaired driving and to explore the related demographic factors and drinking patterns. METHODS: Data from the 2006 and 2008 Behavioral Risk Factor Surveillance System were used in 2010 to produce annualized estimates of alcohol-impaired driving episodes. Logistic regression modeling was used to explore the effects of drinking patterns, seatbelt use, and sociodemographics. RESULTS: The percentage of the population reporting at least one alcohol-impaired driving episode in the past 30 days was 2.2% for 2006 and 2008 combined. The number of annualized episodes of alcohol-impaired driving was 147 million. Annualized episode rates varied across states from 165 to 1242 episodes per 1000 population. Characteristics associated with alcohol-impaired driving differed by gender. The strongest correlate of alcohol-impaired driving was binge drinking, with those reporting binge drinking at least once per month being five to six times as likely to report alcohol-impaired driving when adjusting for all other variables. CONCLUSIONS: Understanding who is most likely to report alcohol-impaired driving is important in developing interventions to prevent this behavior. Interventions that are known to be effective, such as sobriety checkpoints and installing ignition interlocks on the vehicles of people convicted of alcohol-impaired driving, should be widely implemented. |
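The state-level figures above (165 to 1242 episodes per 1,000 population) are annualized rates of self-reported episodes normalized by population; a minimal sketch of that calculation, with hypothetical numbers rather than BRFSS data:

```python
def episodes_per_1000(annual_episodes, population):
    """Annualized episode rate per 1,000 population."""
    return annual_episodes / population * 1000

# Hypothetical state: 500,000 annualized episodes, 3 million residents
rate = episodes_per_1000(500_000, 3_000_000)
print(f"{rate:.1f} episodes per 1,000 population")
```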
Alcohol consumption, drinking pattern, and self-reported visual impairment
Fan AZ , Li Y , Zhang X , Klein R , Mokdad AH , Saaddine JB , Balluz L . Ophthalmic Epidemiol 2012 19 (1) 8-15 PURPOSE: To examine whether alcohol drinking status and drinking pattern are associated with self-reported visual impairment. METHODS: We used data from the Behavioral Risk Factor Surveillance System, a state-based telephone health survey conducted by random-digit dialing among non-institutionalized US adults. The Visual Impairment and Access to Eye Care module was implemented among 42,713 adults aged 50 years and older in 2005 and 2006. Visual impairment was defined as any degree of difficulty experienced in recognizing a friend across the street or reading print in newspaper, magazine, recipe, menu, or numbers on the telephone with usual correction. Drinking patterns included drinking quantity (drinks per drinking day), frequency (drinking days in the past month), and binge drinking. RESULTS: After adjustment for age, sex, race/ethnicity, educational attainment, smoking status, Body Mass Index, history of cardiovascular diseases, diabetes, and eye diseases, current drinking status was not associated with distance and/or near vision impairment. However, drinking more than 1 drink per drinking day (odds ratio [OR], 1.21; 95% confidence intervals [CI], 1.09-1.35) and binge drinking (OR, 1.32; 95% CI, 1.14-1.53) were associated with visual impairment among current drinkers. CONCLUSION: Among current drinkers, drinking patterns were significantly associated with near and distance vision impairment. Longitudinal studies are needed to confirm whether drinkers who drink beyond drinking guidelines, especially binge drinkers, are at higher risk of visual impairment than those who drink at lower levels. |
Rift Valley fever virus vaccine lacking the NSs and NSm genes is safe, nonteratogenic, and confers protection from viremia, pyrexia, and abortion following challenge in adult and pregnant sheep.
Bird BH , Maartens LH , Campbell S , Erasmus BJ , Erickson BR , Dodd KA , Spiropoulou CF , Cannon D , Drew CP , Knust B , McElroy AK , Khristova ML , Albarino CG , Nichol ST . J Virol 2011 85 (24) 12901-9 Rift Valley fever virus (RVFV) is a mosquito-borne human and veterinary pathogen causing large outbreaks of severe disease throughout Africa and the Arabian Peninsula. Safe and effective vaccines are critically needed, especially those that can be used in a targeted one-health approach to prevent both livestock and human disease. We report here on the safety, immunogenicity, and efficacy of the ΔNSs-ΔNSm recombinant RVFV (rRVFV) vaccine (which lacks the NSs and NSm virulence factors) in a total of 41 sheep, including 29 timed-pregnant ewes. This vaccine was proven safe and immunogenic for adult animals at doses ranging from 1.0 × 10^3 to 1.0 × 10^5 PFU administered subcutaneously (s.c.). Pregnant animals were vaccinated with 1.0 × 10^4 PFU s.c. at day 42 of gestation, when fetal sensitivity to RVFV vaccine-induced teratogenesis is highest. No febrile reactions, clinical illness, or pregnancy loss was observed following vaccination. Vaccination resulted in a rapid increase in anti-RVFV IgM (day 4) and IgG (day 7) titers. No seroconversion occurred in cohoused control animals. A subset of 20 ewes progressed to full-term delivery after vaccination. All lambs were born without musculoskeletal, neurological, or histological birth defects. Vaccine efficacy was assessed in 9 pregnant animals challenged at day 122 of gestation with virulent RVFV (1.0 × 10^6 PFU intravenously). Following challenge, 100% (9/9) of the animals were protected, progressed to full term, and delivered healthy lambs. As expected, all 3 sham-vaccinated controls experienced viremia, fetal death, and abortion postchallenge. These results demonstrate that the ΔNSs-ΔNSm rRVFV vaccine is safe and nonteratogenic and confers high-level protection in sheep. |
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Disease Reservoirs and Vectors
- Environmental Health
- Epidemiology and Surveillance
- Food Safety
- Genetics and Genomics
- Immunity and Immunization
- Laboratory Sciences
- Maternal and Child Health
- Nutritional Sciences
- Occupational Safety and Health
- Public Health Leadership and Management
- Social and Behavioral Sciences
- Substance Use and Abuse
- Veterinary Medicine
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed: Feb 1, 2024
- Page last updated: Sep 03, 2024
- Powered by CDC PHGKB Infrastructure