Recent trends in prostate cancer testing and incidence among men under age 50
Li J , German R , King J , Joseph D , Thompson T , Wu XC , Ajani U , Tai E . Cancer Epidemiol 2011 36 (2) 122-7 BACKGROUND: Information on prostate cancer testing and incidence among men under age 50 is scant. This study aims to describe trends in prostate cancer testing and incidence by demographic and clinical characteristics and identify potential correlations between prostate cancer testing and incidence. METHODS: We examined prostate cancer testing and incidence rates among American men under age 50 using data from the Behavioral Risk Factor Surveillance System (2002, 2004, 2006, and 2008) and data from the National Program of Cancer Registries and Surveillance, Epidemiology, and End Results programs (2001-2006). We conducted descriptive, logistic regression, and trend analyses using SUDAAN and SEER*Stat. RESULTS: The prostate cancer incidence rate among black men was more than 2-fold that of white men. The overall prostate cancer incidence rate slightly increased from 2001 to 2006; however, the prevalence of prostate cancer testing declined over time. There was a borderline significant increase in the prostate cancer incidence rate (APC=3.5, 95% confidence interval (CI)=0.0, 7.0) for men aged 40-44. Well-differentiated prostate cancer incidence decreased significantly (APC=-24.7; 95% CI=-34.9, -12.8) over time. CONCLUSIONS: We observed a large difference in prostate cancer incidence between blacks and whites under age 50. Similar patterns in prostate cancer testing and cancer incidence by race and ethnicity suggested prostate cancer testing might have influenced incidence to some extent in this young population. The different temporal patterns for prostate cancer testing and incidence, especially for men aged 40-44 years, suggested screening alone could not fully account for the increasing prostate cancer incidence rates. The decreasing trend in well-differentiated prostate cancer incidence may be partially due to "Grade Inflation". |
Testicular cancer: a narrative review of the role of socioeconomic position from risk to survivorship
Richardson LC , Neri AJ , Tai E , Glenn JD . Urol Oncol 2011 30 (1) 95-101 BACKGROUND: Testicular cancer (TC) is one of the most curable cancers. Given survival rates of close to 100% with appropriate therapy, ensuring proper treatment is essential. We reviewed and summarized the literature on the association of socioeconomic position (SEP) along the cancer control spectrum from risk factors to survivorship. METHODS: We searched PubMed from 1966 to 2011 using the following terms: testicular cancer, testicular neoplasm, poverty, and socioeconomic factors, retrieving 119 papers. After excluding papers for non-English language (10) and non-relevance (46), we reviewed 63 papers. We abstracted information on SEP, including occupation, education, income, and combinations of the 3. Five areas were examined: risk factors, diagnosis, treatment, survival, and survivorship. RESULTS: Most studies examined area-based measures, not individual measures of SEP. The majority of studies found an increased risk of developing TC with high SEP, though recent papers have indicated increased risk in low-income populations. Regarding diagnosis, recent papers have indicated that lower levels of education and SEP are risk factors for later-stage TC diagnosis and hence higher TC mortality. For treatment, 1 study that examined the use of radiation therapy (RT) in stage I seminoma reported that living in a county with lower educational attainment was associated with lower use of RT. For survival (mortality), several studies found that men living in lower SEP geographic areas experience lower survival and higher mortality. CONCLUSION: The strongest evidence for SEP impact on testicular germ cell tumor (TGCT) was found for the risk of developing cancer as well as survival. The association of SEP with TGCT risk appears to have changed over the last decade. 
Given the highly curable nature of TGCT, more research is needed to understand how SEP impacts diagnosis and treatment for TGCT and to design interventions to address SEP-related disparities in TGCT outcomes. |
Measures of self-efficacy: Arthritis Self-Efficacy Scale (ASES), Arthritis Self-Efficacy Scale-8 Item (ASES-8), Children's Arthritis Self-Efficacy Scale (CASE), Chronic Disease Self-Efficacy Scale (CDSES), Parent's Arthritis Self-Efficacy Scale (PASE), and Rheumatoid Arthritis Self-Efficacy Scale (RASE)
Brady TJ . Arthritis Care Res (Hoboken) 2011 63 S473-S485 Enhancing self-efficacy has become an essential feature of many arthritis management interventions because of its robust relationships with health behaviors and health status. Empirical studies document that self-efficacy predicts health behaviors such as physical activity, eating behaviors, and pain coping strategies (1). In rheumatoid arthritis and osteoarthritis, self-efficacy has also been correlated with measures of health status such as daily pain and mood ratings (2), pain, stiffness, function, and physical and mental well-being (3); it has also been correlated with changes in pain, function, and depression (4). Adherence with medications and other health recommendations has also been associated with self-efficacy (5, 6). In addition to evidence that self-efficacy is associated with health behaviors, current and future health status, and adherence to health recommendations, the fact that self-efficacy can change through efficacy-enhancing interventions makes it a rich target of arthritis interventions (1). | Self-efficacy, defined in Bandura's seminal 1977 article as “the conviction that one can successfully execute the behavior required to produce the outcomes” (7), was hypothesized to influence whether a behavior was initiated and sustained despite obstacles or adverse experiences, and to influence the level of effort invested in the behavior. Bandura's definition of self-efficacy evolved slightly over time; in his 1997 publication, Bandura defined self-efficacy as “belief in one's capability to organize and execute the courses of action required to produce given attainments” (8). Bandura has consistently described self-efficacy as domain specific and distinct from other constructs in social learning theory such as outcome expectations, defined as a person's estimate that a given behavior will lead to certain outcomes (7). 
Self-efficacy beliefs are also conceptualized as distinct from actual ability to perform a task (e.g., can I ride a bicycle), actual task performance (e.g., do I ride a bicycle), or intention to perform a task (e.g., do I intend to ride a bicycle) (8, 9). These different types of beliefs are clearly distinguished in Gecht et al's survey of exercise beliefs and habits among people with arthritis (10). In that survey, respondents were asked about their self-efficacy expectations regarding exercise (“If I want to exercise, I know I can do it”), and their outcome expectations regarding exercise (“regular exercise will probably make my arthritis worse in the future”); they were also asked to report their actual behavior (how often they did specific exercises in the past 2 weeks). Self-efficacy theory hypothesizes that both efficacy expectations and outcome expectations influence whether or not an individual will initiate and sustain a specific behavior (7). Gecht et al found that positive outcome expectations and self-efficacy for exercise were associated with participation in exercise (10). Conversely, self-efficacy theory predicts that if a patient believes that they can exercise (self-efficacy expectation) but also believes that exercise will be harmful for their arthritis (outcome expectation), the patient would be less likely to exercise than if they expected positive outcomes from exercise (7). Social learning theory suggests that it is important for clinicians and others hoping to help a person adopt health behaviors to understand both whether the person believes they can perform the behavior, and whether they believe that behavior will lead to positive outcomes. |
Multiple health behaviors and serum hepatic enzymes among US adults with obesity
Tsai J , Ford ES , Zhao G , Croft JB . Prev Med 2011 53 278-83 INTRODUCTION: This study examined the cumulative number and clustering patterns of low-risk health behaviors (i.e., not currently smoking, not excessive drinking, and physically active) associated with elevation of serum alanine aminotransferase (ALT), aspartate aminotransferase (AST), and gamma-glutamyltransferase (GGT) among adults with obesity in the United States. METHODS: We estimated the age-adjusted prevalence of elevated ALT, AST, and GGT from 4547 adults with obesity aged ≥20 years who participated in the 2005-2008 National Health and Nutrition Examination Survey. The associations between the cumulative number or clustering patterns of low-risk health behaviors and measures of serum ALT, AST, and GGT were assessed using multivariate regression models. RESULTS: Adult men who reported having three low-risk health behaviors were 62%, 39%, and 48% less likely to have elevated serum ALT, AST, and GGT, respectively; adult women were 56% and 73% less likely to have elevated serum AST and GGT, respectively, when compared to their respective counterparts who reported having none of the low-risk health behaviors. CONCLUSIONS: The findings of this study indicate that, among adults with obesity, having multiple low-risk health behaviors is associated with a decreased likelihood of elevated hepatic enzymes, including ALT in men, and AST and GGT in both men and women. |
Muscular strengthening activity patterns and metabolic health risk among U.S. adults
Churilla JR , Magyari PM , Ford ES , Fitzhugh EC , Johnson TM . J Diabetes 2011 4 (1) 77-84 OBJECTIVES: Many studies have examined the relationship between physical activity and metabolic disorders. However, few studies have focused on specific associations between these disorders and muscular strengthening activity (MSA) patterns. Our purpose was to examine the association(s) for each metabolic syndrome criterion and MSA patterns. METHODS: The study sample (N=5,618) included adults aged 20 years and older who participated in the 1999-2004 National Health and Nutrition Examination Survey. Cut-points for metabolic syndrome criteria were derived from the American Heart Association/National Heart, Lung, and Blood Institute definition. The aggregate of data on weight lifting, push-ups, and sit-ups was used to establish patterns of MSA. Participants reporting >=2 days per week of MSA were coded as meeting current U.S. MSA guidelines. RESULTS: Following adjustments, participants reporting >=2 days per week of MSA were found to be 28% (OR 0.72; 95% confidence interval (CI) 0.62, 0.83) less likely to have dyslipidemia, 29% (OR 0.71; 95% CI 0.54, 0.93) less likely to have impaired fasting glucose, 19% (OR 0.81; 95% CI 0.65, 0.99) less likely to have prehypertension, and 43% (OR 0.57; 95% CI 0.46, 0.72) less likely to have an augmented waist circumference, compared to those reporting no MSA. No association was found between hypertension and MSA. CONCLUSION: Engaging in >=2 days per week of MSA as part of an overall physical activity regimen may be prudent in preserving metabolic health. These findings strengthen the evidence for a relationship between MSA and metabolic health; thus, clinicians should include MSA when discussing lifestyle approaches to better health. SIGNIFICANCE OF STUDY: These findings suggest that U.S. 
adults reporting engaging in >=2 days per week of muscular strengthening activities may be significantly less likely to have impaired fasting glucose, dyslipidemia, abdominal obesity, and prehypertension. WHAT THIS STUDY ADDS: This study adds a novel perspective to the area of strength training and metabolic health. Our study is one of the first to report on the favorable dose-response effects of muscular strengthening activities on various metabolic markers in a nationally representative sample of the U.S. adult population. |
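A note on the arithmetic in the abstract above: the "X% less likely" figures are simply the complement of the adjusted odds ratios, i.e., (1 − OR) expressed as a percentage. A minimal sketch (the helper name is ours, not from the study):

```python
# Illustrative only: convert an adjusted odds ratio below 1 into the
# "percent less likely" phrasing used in the abstract (1 - OR, as a percent).
# The function name is hypothetical and not taken from the study.
def percent_reduction(odds_ratio: float) -> int:
    return round((1 - odds_ratio) * 100)

# Reproduces the reported pairs: OR 0.72 -> 28%, OR 0.71 -> 29%,
# OR 0.81 -> 19%, OR 0.57 -> 43%.
for or_value, reported in [(0.72, 28), (0.71, 29), (0.81, 19), (0.57, 43)]:
    assert percent_reduction(or_value) == reported
```

Note that this phrasing treats the odds ratio as an approximation of relative likelihood, which is how the abstract itself presents the numbers.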
Predictors of major bleeding in peri-procedural anticoagulation management
Tafur AJ , McBane R 2nd , Wysokinski WE , Litin S , Daniels P , Slusser J , Hodge D , Beckman MG , Heit JA . J Thromb Haemost 2011 10 (2) 261-7 BACKGROUND: Appropriate periprocedural management for chronically anticoagulated patients requires assessment of patient-specific thrombosis and bleeding risks. However, predictors of post-procedure bleeding are unknown. OBJECTIVES: To determine the 3-month cumulative incidence and independent predictors of peri-procedural bleeding in chronically anticoagulated patients requiring temporary warfarin interruption for an invasive procedure. METHODS: In a protocol-driven cohort study design, all patients referred to the Mayo Clinic Thrombophilia Center for peri-procedural anticoagulation management (1997-2007; n=2182) were followed forward in time to determine the 3-month cumulative incidence of peri-procedural bleeding (Kaplan-Meier product limit) and potential predictors of bleeding (Cox proportional hazards). Decisions to "bridge" with low molecular weight heparin (LMWH) were based on estimated thromboembolism and bleeding risk. RESULTS: Indications for chronic anticoagulation included venous thromboembolism (38%), atrial fibrillation (30%), and mechanical heart valves (27%). Of these, 1496 (69%) patients received bridging therapy. The 3-month cumulative incidence rates of major and overall bleeding were 2.1% and 5.1%, respectively. Major bleeding occurred more frequently in patients receiving bridging therapy (3% vs. 1%; p=0.017). Independent predictors (HR; 95% CI) of major bleeding included mitral mechanical heart valve (2.2; 1.1-4.3), active cancer (1.8; 1.0-3.1), prior bleeding history (2.6; 1.5-4.5), and re-initiation of heparin therapy within 24 hours after the procedure (1.9; 1.1-3.4). CONCLUSION: Factors predisposing to peri-procedural bleeding are primarily patient-specific. Premature heparin re-initiation is an avoidable provider-specific variable to consider. |
Failure to confirm XMRV/MLVs in the blood of patients with chronic fatigue syndrome: a multi-laboratory study
Simmons G , Glynn SA , Komaroff AL , Mikovits JA , Tobler LH , Hackett J Jr , Tang N , Switzer WM , Heneine W , Hewlett IK , Zhao J , Lo SC , Alter HJ , Linnen JM , Gao K , Coffin JM , Kearney MF , Ruscetti FW , Pfost MA , Bethel J , Kleinman S , Holmberg JA , Busch MP . Science 2011 334 (6057) 814-7 Murine leukemia viruses (MLVs), including xenotropic-MLV-related virus (XMRV), have been controversially linked to chronic fatigue syndrome (CFS). To explore this issue in greater depth, we compiled coded replicate samples of blood from 15 subjects previously reported to be XMRV/MLV-positive (14 with CFS) and from 15 healthy donors previously determined to be negative for the viruses. These samples were distributed in a blinded fashion to nine laboratories, which performed assays designed to detect XMRV/MLV nucleic acid, virus replication, and antibody. Only two laboratories reported evidence of XMRV/MLVs; however, replicate sample results showed disagreement, and reactivity was similar among CFS subjects and negative controls. These results indicate that current assays do not reproducibly detect XMRV/MLV in blood samples and that blood donor screening is not warranted. |
Complementary and alternative medicine use among adults with work-related and non-work-related asthma
Knoeller GE , Mazurek JM , Moorman JE . J Asthma 2011 49 (1) 107-13 BACKGROUND: The prevalence of complementary and alternative medicine (CAM) use among adults with current asthma has been estimated to be 40%. To our knowledge, there is no information on the prevalence of CAM use among individuals with work-related asthma (WRA). OBJECTIVES: To examine the associations between WRA, CAM use, and adverse asthma events. METHODS: We analyzed data from the 2006-2008 Behavioral Risk Factor Surveillance System Asthma Call-Back Survey from 37 states and the District of Columbia for ever-employed adults with current asthma. We defined WRA as health-professional-diagnosed WRA. We calculated prevalence ratios (PRs) adjusted for age, sex, race/ethnicity, education, income, health insurance, and geographic region of residence. RESULTS: Of ever-employed adults with current asthma, an estimated 38.1% used CAM and 8.6% had WRA. An estimated 56.6% of individuals with WRA reported using CAM compared with 27.9% of those with non-WRA (PR = 2.0). People with WRA were more likely than those with non-WRA to have adverse asthma events including an asthma attack in the past month (PR = 1.43), urgent treatment for worsening asthma (PR = 1.74), emergency room visit (PR = 1.95), overnight hospital stay (PR = 2.49), and poorly controlled asthma (PR = 1.27). The associations of WRA with adverse asthma events remained after stratifying for CAM use. CONCLUSIONS: Compared with non-WRA, individuals with WRA were more likely to use CAM to control their asthma. However, there was no evidence that the use of CAM modified the association of WRA with adverse asthma events. |
Report on a single topic conference on "chronic viral hepatitis - strategies to improve effectiveness of screening and treatment"
Ward JW , Lok AS , Thomas DL , El-Serag HB , Kim WR . Hepatology 2011 55 (1) 307-15 The 2010 Institute of Medicine Report on "Hepatitis and Liver Cancer" indicated that lack of knowledge and awareness about chronic hepatitis B virus (HBV) and hepatitis C virus (HCV) infections, and insufficient understanding of the extent and seriousness of this public health problem, impeded current efforts to prevent and control hepatitis B and C. A single topic conference was held in June 2011 to discuss strategies to improve the effectiveness of screening, care referral, and clinical management of chronic HBV and HCV infections, with the ultimate goal of reducing morbidity and mortality from these infections. Various models shown to improve hepatitis screening and the effectiveness of hepatitis treatment in the community, including rural settings and populations that have traditionally been excluded due to comorbidities, were presented. Recent advances in laboratory testing, medical management, and new antiviral therapies will not decrease the burden of viral hepatitis if persons at risk for, or living with, viral hepatitis are not aware of the risks, have not been diagnosed, or have no access to care. Systematic changes in our health care delivery system, along with enhanced coordination of prevention and care services through partnerships between public health leaders and clinicians, education of the public and health care providers, and linkage of infected persons with care and treatment services, can increase the success of preventing viral hepatitis and the effectiveness of hepatitis treatment in the real world. Implementation of these changes is feasible and will require policy changes, coordination among government agencies, and collaboration between government agencies, health care providers, community organizations, and advocacy groups. (HEPATOLOGY 2011). |
Trichomonas vaginalis genital infections: progress and challenges
Bachmann LH , Hobbs MM , Sena AC , Sobel JD , Schwebke JR , Krieger JN , McClelland RS , Workowski KA . Clin Infect Dis 2011 53 S160-S172 Trichomonas vaginalis (TV) infection is the most prevalent curable sexually transmitted infection in the United States and worldwide. Most TV infections are asymptomatic, and the accurate diagnosis of this infection has been limited by lack of sufficiently sensitive and specific diagnostic tests, particularly for men. To provide updates for the 2010 Centers for Disease Control and Prevention's Sexually Transmitted Diseases Treatment Guidelines, a PubMed search was conducted of all TV literature published from 9 January 2004 through 24 September 2008. Approximately 175 pertinent abstracts and articles were reviewed and discussed with national experts. This article describes advances in TV diagnostics which have led to an improved understanding of the epidemiology of this pathogen, as well as potential biologic and epidemiological interactions between TV and human immunodeficiency virus (HIV). New data on treatment outcomes, metronidazole-resistant TV, management of nitroimidazole-allergic patients, frequency of recurrent TV infection following treatment, and screening considerations for TV in certain populations are also presented. |
Unexpected decline in tuberculosis cases coincident with economic recession -- United States, 2009
Winston CA , Navin TR , Becerra JE , Chen MP , Armstrong LR , Jeffries C , Yelk Woodruff RS , Wing J , Starks AM , Hales CM , Kammerer JS , Mac Kenzie WR , Mitruka K , Miner MC , Price S , Scavotto J , Cronin AM , Griffin P , Lobue PA , Castro KG . BMC Public Health 2011 11 (1) 846 BACKGROUND: Since 1953, through the cooperation of state and local health departments, the U.S. Centers for Disease Control and Prevention (CDC) has collected information on incident cases of tuberculosis (TB) disease in the United States. In 2009, TB case rates declined 11.4%, compared with an average annual decline of 3.8% since 2000. The unexpectedly large decline raised concerns that TB cases may have gone unreported. To address the unexpected decline, we examined trends from multiple sources on TB treatment initiation, medication sales, and laboratory and genotyping data on culture-positive TB. METHODS: We analyzed 142,174 incident TB cases reported to the U.S. National Tuberculosis Surveillance System (NTSS) during January 1, 2000-December 31, 2009; TB control program data from 59 public health reporting areas; self-reported data from 50 CDC-funded public health laboratories; monthly electronic prescription claims for new TB therapy prescriptions; and complete genotyping results available for NTSS cases. Accounting for prior trends using regression and time-series analyses, we calculated the deviation between observed and expected TB cases in 2009 according to patient and clinical characteristics, and assessed at what point in time the deviation occurred. RESULTS: The overall deviation in TB cases in 2009 was -7.9%, with 994 fewer cases reported than expected (P <.001). We ruled out surveillance underreporting because declines were seen both in states that used new software for case reporting in 2009 and in states that did not, and we found no cases unreported to CDC in our examination of over 5400 individual line-listed reports in 11 areas. 
TB cases decreased substantially among both foreign-born and U.S.-born persons. The unexpected decline began in late 2008 or early 2009, and may have begun to reverse in late 2009. The decline was greater in terms of case counts among foreign-born than U.S.-born persons; among the foreign-born, the percentage deviation from expected was greatest among persons who had been in the United States less than 2 years. Among U.S.-born persons, the percentage deviations from expected were greatest among homeless persons and substance users. Independent information systems (NTSS, TB prescription claims, and public health laboratories) reported similar patterns of declines. Genotyping data did not suggest sudden decreases in recent transmission. CONCLUSIONS: Our assessments show that the decline in reported TB was not an artifact of changes in surveillance methods; rather, similar declines were found through multiple data sources. While the steady decline of TB cases before 2009 suggests ongoing improvement in TB control, we were not able to identify any substantial change in TB control activities or TB transmission that would account for the abrupt decline in 2009. It is possible that multiple causes coincident with the economic recession in the United States, including decreased immigration and delayed access to medical care, could be related to the TB declines. Our findings underscore important needs in addressing health disparities as we move toward TB elimination in the United States. |
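The deviation statistic reported in the TB abstract above is straightforward once expected counts are in hand (the expected counts themselves came from the authors' regression and time-series models, which are not reproduced here). A sketch using hypothetical counts back-calculated to be consistent with the reported -7.9% and 994-case deviation:

```python
# Illustrative only: percentage deviation of observed from expected cases.
# The counts below are back-calculated from the reported -7.9% / 994-case
# deviation; they are hypothetical and not taken from the paper's tables.
def percent_deviation(observed: int, expected: int) -> float:
    return (observed - expected) / expected * 100

expected_2009 = 12582                    # hypothetical expected case count
observed_2009 = expected_2009 - 994      # 994 fewer cases than expected
assert round(percent_deviation(observed_2009, expected_2009), 1) == -7.9
```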
Updates on human papillomavirus and genital warts and counseling messages from the 2010 Sexually Transmitted Diseases Treatment Guidelines
Dunne EF , Friedman A , Datta SD , Markowitz LE , Workowski KA . Clin Infect Dis 2011 53 S143-S152 BACKGROUND: In April 2009, experts on sexually transmitted diseases (STDs) were convened to review updates on STD prevention and treatment in preparation for the revision of the Centers for Disease Control and Prevention (CDC) STD Treatment Guidelines. At this meeting, there was a discussion of important updates on human papillomavirus (HPV), genital warts, and cervical cancer screening. METHODS: Key questions were identified with assistance from an expert panel, and systematic reviews of the literature were conducted searching the English-language literature of the PubMed computerized database (US National Library of Medicine). The available evidence was reviewed, and new information was incorporated in the 2010 CDC STD Treatment Guidelines. RESULTS: Two HPV vaccines are now available, the quadrivalent HPV vaccine and the bivalent HPV vaccine; either vaccine is recommended routinely for girls aged 11 or 12 years. The quadrivalent HPV vaccine may be given to boys and men aged 9-26 years. A new patient-applied treatment option for genital warts, sinecatechins 15% ointment, is available and recommended for treatment of external genital warts. This product is a mixture of active ingredients (catechins) from green tea. Finally, updated counseling guidelines and messages about HPV, genital warts, and cervical cancer are included. CONCLUSIONS: This manuscript highlights updates to the 2010 CDC STD Treatment Guidelines for HPV and genital warts. Important additions to the 2010 STD Treatment Guidelines include information on prophylactic HPV vaccine recommendations, new patient-applied treatment options for genital warts, and counseling messages for patients on HPV, genital warts, cervical cancer screening, and HPV tests. |
Management of adult syphilis
Ghanem KG , Workowski KA . Clin Infect Dis 2011 53 S110-S128 There are several important unanswered questions in the management of adult syphilis. A systematic literature review was conducted and tables of evidence were constructed to answer these questions. A single dose of 2.4 million units of benzathine penicillin G remains the drug of choice for managing early syphilis. Enhanced antibiotic therapy has not been shown to improve treatment outcomes, regardless of human immunodeficiency virus (HIV) status. Although additional data on the efficacy of azithromycin in treating early syphilis have emerged, reported increases in the prevalence of a mutation associated with azithromycin resistance preclude a recommendation for its routine use. Cerebrospinal fluid (CSF) examination should be performed in all persons with serologic evidence of syphilis infection and neurologic symptoms. CSF examination should also be considered in persons with early syphilis who do not achieve a >=4-fold serologic decline in their rapid plasma reagin (RPR) titers 6-12 months after adequate therapy and in those with late latent infection who do not achieve a similar decline within 12-24 months. Among HIV-infected persons, CSF examination of all those with asymptomatic late latent syphilis is not recommended owing to a lack of evidence demonstrating clinical benefit. HIV-infected persons with syphilis of any stage whose RPR titers are >=1:32 and/or whose CD4 cell counts are <350 cells/mm(3) may be at increased risk for asymptomatic neurosyphilis. If CSF pleocytosis is evident at the initial CSF examination, CSF examinations should be repeated every 6 months until the cell count is normal. Several important questions regarding the management of syphilis remain unanswered and should be a priority for future research. |
Prevalence and risk factors for tuberculosis infection among personnel in two hospitals in Viet Nam
Powell K , Han D , Hung NV , Vu T , Sy DN , Trinh TT , Le TC , Do K , Oeltmann JE , Whitehead S . Int J Tuberc Lung Dis 2011 15 (12) 1643-9 SETTING: Two general hospitals in Viet Nam. OBJECTIVE: To assess the risk of tuberculosis (TB) infection associated with hospital employment. DESIGN: During October-December 2009, we performed a cross-sectional study of hospital personnel and, for community comparison groups, staff from nearby schools. We tested for TB infection using the tuberculin skin test; an induration ≥10 mm indicated TB infection. RESULTS: Of 956 hospital personnel, 380 (40%) had TB infection, compared to 40 (26%) of 155 school personnel. Hospital personnel had twice the odds of TB infection compared with school personnel (OR 2.0, 95%CI 1.3-3.0) after adjustment for age and sex. Compared to hospital administrative staff, the odds of TB infection were similar among clinical staff (OR 1.0, 95%CI 0.6-1.3), clinical support staff (OR 0.9, 95%CI 0.5-1.6), and auxiliary staff (OR 1.1, 95%CI 0.6-2.0) at the hospitals. No additional infection risk was detected in high-risk departments (OR 1.1, 95%CI 0.6-2.0). CONCLUSIONS: Hospital personnel are at increased risk of TB infection. Among hospital personnel, risk was independent of job or department, suggesting that personnel are commonly at risk and that improvements in infection control are needed throughout hospitals. |
Provider-initiated HIV testing and counseling: increased uptake in two public community health centers in South Africa and implications for scale-up
Dalal S , Lee CW , Farirai T , Schilsky A , Goldman T , Moore J , Bock NN . PLoS One 2011 6 (11) e27293 BACKGROUND: International guidance recommends the scale-up of routinely recommended, offered, and delivered health care provider-initiated HIV testing and counseling (PITC) to increase the proportion of persons who know their HIV status. We compared HIV test uptake under PITC to provider referral to voluntary counseling and testing (VCT referral) in two primary health centers in South Africa. METHODS: Prior to introducing PITC, clinical providers were instructed to refer systematically selected study participants to VCT. After PITC and HIV rapid test training, providers were asked to recommend, offer, and provide HIV testing to study participants during the clinical consultation. Participants were interviewed before and after their consultation to assess their HIV testing experiences. RESULTS: HIV test uptake increased under PITC (OR 2.85, 95% CI 1.71, 4.76), and more patients felt providers answered their questions on HIV (104/141 [74%] versus 73/118 [62%] for VCT referral; p = 0.04). After three months, only 4/106 (3.8%) HIV-positive patients had registered for onsite HIV treatment. Providers found PITC useful, but tested very few patients (range 0-15). CONCLUSION: PITC increased the uptake of HIV testing compared with referral to onsite VCT, and patients reported a positive response to PITC. However, providing universal PITC will require strong leadership to train and motivate providers, and interventions to link HIV-positive persons to HIV treatment centers. |
Five-year trends in epidemiology and prevention of mother-to-child HIV transmission, St. Petersburg, Russia: results from perinatal HIV surveillance
Kissin DM , Mandel MG , Akatova N , Belyakov NA , Rakhmanova AG , Voronin EE , Volkova GV , Yakovlev AA , Jamieson DJ , Vitek C , Robinson J , Miller WC , Hillis S . BMC Infect Dis 2011 11 (1) 292 BACKGROUND: The HIV epidemic in Russia has increasingly involved reproductive-aged women, which may increase perinatal HIV transmission. METHODS: Standard HIV case-reporting and enhanced perinatal HIV surveillance systems were used for prospective assessment of HIV-infected women giving birth in St. Petersburg, Russia, during 2004-2008. Trends in social, perinatal, and clinical factors influencing mother-to-child HIV transmission, stratified by history of injection drug use, and rates of perinatal HIV transmission were assessed using two-sided chi-square or Cochran-Armitage tests. RESULTS: Among HIV-infected women who gave birth, the proportion who self-reported ever using injection drugs (IDUs) decreased from 62% in 2004 to 41% in 2008 (P<0.0001). Programmatic improvements led to increased uptake of the following clinical services from 2004 to 2008 (all P<0.01): initiation of antiretroviral prophylaxis at less than or equal to 28 weeks of gestation (IDUs 44%-54%, non-IDUs 45%-72%), monitoring of immunologic (IDUs 48%-64%, non-IDUs 58%-80%) and virologic status (IDUs 8%-58%, non-IDUs 10%-75%), and dual/triple antiretroviral prophylaxis (IDUs 9%-44%, non-IDUs 14%-59%). After an initial increase from 5.3% (95% confidence interval [CI] 3.5%-7.8%) in 2004 to 8.5% (CI 6.1%-11.7%) in 2005 (P<0.05), perinatal HIV transmission decreased to 5.3% (CI 3.4%-8.3%) in 2006 and 3.2% (CI 1.7%-5.8%) in 2007 (P for trend <0.05). However, the proportion of women without prenatal care and without HIV testing before labor and delivery remained unchanged. 
CONCLUSIONS: A reduced proportion of IDUs and improved clinical services among HIV-infected women giving birth were accompanied by decreased perinatal HIV transmission, which can be further reduced by increasing outreach and HIV testing of women before and during pregnancy. |
CD4 cell count and viral load monitoring in patients undergoing antiretroviral therapy in Uganda: cost effectiveness study
Kahn JG , Marseille E , Moore D , Bunnell R , Were W , Degerman R , Tappero JW , Ekwaru P , Kaharuza F , Mermin J . BMJ 2011 343 d6884 OBJECTIVE: To examine the cost and cost effectiveness of quarterly CD4 cell count and viral load monitoring among patients taking antiretroviral therapy (ART). DESIGN: Cost effectiveness study. SETTING: A randomised trial in a home based ART programme in Tororo, Uganda. PARTICIPANTS: People with HIV who were members of the AIDS Support Organisation and had CD4 cell counts <250 x 10(6) cells/L or World Health Organization stage 3 or 4 disease. MAIN OUTCOME MEASURES: Outcomes calculated for the study period and projected 15 years into the future included costs, disability adjusted life years (DALYs), and incremental cost effectiveness ratios (ICER; $ per DALY averted). Cost inputs were based on the trial and other sources. Clinical inputs were derived from the trial; in the base case, we assumed that point estimates reflected true differences even if non-significant. We conducted univariate and multivariate sensitivity analyses. INTERVENTIONS: Three monitoring strategies: clinical monitoring with quarterly CD4 cell counts and viral load measurement (clinical/CD4/viral load); clinical monitoring and quarterly CD4 counts (clinical/CD4); and clinical monitoring alone. RESULTS: With the intention to treat (ITT) results per 100 individuals starting ART, we found that clinical/CD4 monitoring compared with clinical monitoring alone increases costs by $20,458 (GBP 12,780, EUR 14,707) and averts 117.3 DALYs (ICER=$174 per DALY). Clinical/CD4/viral load monitoring compared with clinical/CD4 monitoring adds $142,458, and averts 27.5 DALYs ($5181 per DALY). The superior ICER for clinical/CD4 monitoring is robust to uncertainties in input values, and that strategy is dominant (less expensive and more effective) compared with clinical/CD4/viral load monitoring in one quarter of simulations. 
If clinical inputs are based on the as treated analysis starting at 90 days (after laboratory monitoring was initiated), then clinical/CD4/viral load monitoring is dominated by other strategies. CONCLUSIONS: Based on this trial, compared with clinical monitoring alone, monitoring of routine CD4 cell count is considerably more cost effective than additionally including routine viral load testing in the monitoring strategy and is more cost effective than ART. |
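The ICERs quoted in the abstract above are simple to verify: each is the incremental cost divided by the incremental DALYs averted, per 100 people starting ART. A minimal sketch using only the abstract's figures (expect a small rounding difference against the published $5181):

```python
def icer(delta_cost_usd, dalys_averted):
    """Incremental cost-effectiveness ratio: extra dollars per DALY averted."""
    return delta_cost_usd / dalys_averted

# clinical/CD4 monitoring vs clinical monitoring alone
cd4_only = icer(20_458, 117.3)   # ≈ $174 per DALY averted
# clinical/CD4/viral load monitoring vs clinical/CD4 monitoring
cd4_vl = icer(142_458, 27.5)     # ≈ $5,180 per DALY averted
```

Both results agree with the reported ratios of $174 and $5181 per DALY to within rounding.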
Centers for Disease Control and Prevention sexually transmitted disease treatment guidelines
Workowski KA , Berman SM . Clin Infect Dis 2011 53 S59-S63 Sexually transmitted diseases (STDs) constitute an epidemic of tremendous magnitude, with an estimated 18.9 million persons acquiring a new STD each year [1]. Reported disease rates underestimate the true burden of infection because the majority of STDs are asymptomatic and therefore go undetected, and also because of underreporting. STDs have far-reaching public health consequences on the sexual and reproductive health of individuals as well as the long-term health and health care costs of the community. | The accurate identification and effective clinical management of STDs represent a critical strategy for improving reproductive and sexual health and strengthening human immunodeficiency virus (HIV) prevention efforts. This is especially relevant to women, adolescents, and infants, as untreated infections frequently result in severe, long-term complications, including tubal infertility, adverse pregnancy outcomes, cancer, and facilitation of HIV infection. For more than 20 years, the Centers for Disease Control and Prevention’s (CDC) national guidelines for managing STDs have helped clinicians deliver optimal STD care. The CDC STD treatment guidelines are the most widely referenced and authoritative source of information on STD treatment and prevention strategies for clinicians who evaluate persons with STDs or those at risk for STDs. |
Changes in fluoroquinolone use for gonorrhea following publication of revised treatment guidelines
Dowell D , Tian LH , Stover JA , Donnelly JA , Martins S , Erbelding EJ , Pino R , Weinstock H , Newman LM . Am J Public Health 2011 102 (1) 148-55 OBJECTIVES: We evaluated the impact of revised national treatment recommendations on fluoroquinolone use for gonorrhea in selected states. METHODS: We evaluated gonorrhea cases reported through the Sexually Transmitted Disease Surveillance Network as treated between July 1, 2006 and May 31, 2008, using interrupted time series analysis. Outcomes were fluoroquinolone treatment overall, by area, and by practice setting. RESULTS: Of 16,126 cases with treatment dates in this period, 15,669 noted the medication used. After revised recommendations were released, fluoroquinolone use decreased abruptly overall (21.5%; 95% confidence interval [CI]=15.9%, 27.2%), in most geographic areas evaluated, and in sexually transmitted disease clinics (28.5%; 95% CI=19.0%, 37.9%). More gradual decreases were seen in primary care (8.6%; 95% CI=2.6%, 14.6%), and in emergency departments, urgent care, and hospitals (2.7%; 95% CI=1.7%, 3.7%). CONCLUSIONS: Fluoroquinolone use decreased after the publication of revised national guidelines, particularly in sexually transmitted disease clinics. Additional mechanisms are needed to increase the speed and magnitude of changes in prescribing in primary care, emergency departments, urgent care, and hospitals. (Am J Public Health. Published online ahead of print November 17, 2011: e1-e8. doi:10.2105/AJPH.2011.300283). |
Clinical and virologic outcomes in patients with oseltamivir-resistant seasonal influenza A (H1N1) infections: results from a clinical trial
Dharan NJ , Fry AM , Kieke BA , Coleman L , Meece J , Vandermause M , Gubareva LV , Klimov AI , Belongia EA . Influenza Other Respir Viruses 2011 6 (3) 153-8 Nineteen patients with oseltamivir-resistant seasonal influenza A (H1N1) infections were randomized to receive oseltamivir or placebo. Nasopharyngeal swabs were obtained, and clinical and virologic outcomes were compared, stratified by early or late treatment. Neuraminidase inhibition assay and pyrosequencing for H275Y confirmed resistance. Twelve (63%) patients received oseltamivir; 8 (67%) received late treatment. Seven (37%) patients received placebo; 6 (86%) presented >48 hours after onset. Time to 50% decrease in symptom severity, complete symptom resolution, and first negative culture were shortest among the early treatment group. While sample size prohibits a strong conclusion, future studies should evaluate for similar trends. (Please cite this paper as: Dharan et al. (2011) Clinical and virologic outcomes in patients with oseltamivir-resistant seasonal influenza A (H1N1) infections: results from a clinical trial. Influenza and Other Respiratory Viruses DOI: 10.1111/j.1750-2659.2011.00312.x). |
Dynamics of hantavirus infection in Peromyscus leucopus of central Pennsylvania
Luong LT , Vigliotti BA , Campbell S , Comer JA , Mills JN , Hudson PJ . Vector Borne Zoonotic Dis 2011 11 (11) 1459-64 Hantaviruses are distributed throughout the United States and are the etiologic agents for hantavirus pulmonary syndrome and hemorrhagic fever with renal syndrome. Hantavirus genotypes and epidemiologic patterns vary spatially across the United States. While several longitudinal studies have been performed in the western United States, little is known about the virus in the eastern United States. We undertook a longitudinal study of hantaviruses in the primary rodent reservoir host in central Pennsylvania, Peromyscus leucopus. Prevalence of hantavirus antibodies varied both by year and site, but was not correlated with host abundance. Males were significantly more likely to have antibodies to a hantavirus than females, and both antibody seroconversion and antibody prevalence increased with mass class (an indicator for age). Our findings suggest that one or more hantaviruses are present and circulating among P. leucopus of central Pennsylvania, and understanding the dynamics in this region could help prevent zoonotic transmission to humans. Our aim was to describe the differences in epizootiology of hantavirus infection in rodents from various geographical locations to enable improved analysis of the risk of rodent-to-human transmission and obtain insights that may indicate improved means of disease intervention. |
Emergency hospitalizations for adverse drug events in older Americans
Budnitz DS , Lovegrove MC , Shehab N , Richards CL . N Engl J Med 2011 365 (21) 2002-12 BACKGROUND: Adverse drug events are important preventable causes of hospitalization in older adults. However, nationally representative data on adverse drug events that result in hospitalization in this population have been limited. METHODS: We used adverse-event data from the National Electronic Injury Surveillance System-Cooperative Adverse Drug Event Surveillance project (2007 through 2009) to estimate the frequency and rates of hospitalization after emergency department visits for adverse drug events in older adults and to assess the contribution of specific medications, including those identified as high-risk or potentially inappropriate by national quality measures. RESULTS: On the basis of 5077 cases identified in our sample, there were an estimated 99,628 emergency hospitalizations (95% confidence interval [CI], 55,531 to 143,724) for adverse drug events in U.S. adults 65 years of age or older each year from 2007 through 2009. Nearly half of these hospitalizations were among adults 80 years of age or older (48.1%; 95% CI, 44.6 to 51.6). Nearly two thirds of hospitalizations were due to unintentional overdoses (65.7%; 95% CI, 60.1 to 71.3). Four medications or medication classes were implicated alone or in combination in 67.0% (95% CI, 60.0 to 74.1) of hospitalizations: warfarin (33.3%), insulins (13.9%), oral antiplatelet agents (13.3%), and oral hypoglycemic agents (10.7%). High-risk medications were implicated in only 1.2% (95% CI, 0.7 to 1.7) of hospitalizations. CONCLUSIONS: Most emergency hospitalizations for recognized adverse drug events in older adults resulted from a few commonly used medications, and relatively few resulted from medications typically designated as high-risk or inappropriate. Improved management of antithrombotic and antidiabetic drugs has the potential to reduce hospitalizations for adverse drug events in older adults. |
Whole blood lead levels are associated with biomarkers of joint tissue metabolism in African American and white men and women: the Johnston County Osteoarthritis Project
Nelson AE , Chaudhary S , Kraus VB , Fang F , Chen JC , Schwartz TA , Shi XA , Renner JB , Stabler TV , Helmick CG , Caldwell K , Robin Poole A , Jordan JM . Environ Res 2011 111 (8) 1208-14 PURPOSE: To examine associations between biomarkers of joint tissue metabolism and whole blood lead (Pb), separately for men and women in an African American and Caucasian population, which may reflect an underlying pathology. METHODS: Participants in the Johnston County Osteoarthritis Project Metals Exposure Sub-Study (329 men and 342 women) underwent assessment of whole blood Pb and biochemical biomarkers of joint tissue metabolism. Urinary cross-linked N telopeptide of type I collagen (uNTX-I) and C-telopeptide fragments of type II collagen (uCTX-II), serum cleavage neoepitope of type II collagen (C2C), serum type II procollagen synthesis C-propeptide (CPII), and serum hyaluronic acid (HA) were measured using commercially available kits; the ratio of [C2C:CPII] was calculated. Serum cartilage oligomeric matrix protein (COMP) was measured by an in-house assay. Multiple linear regression models were used to examine associations between continuous blood Pb and biomarker outcomes, adjusted for age, race, current smoking status, and body mass index. Results are reported as estimated change in biomarker level for a 5-unit change in Pb level. RESULTS: The median Pb level among men and women was 2.2 and 1.9 μg/dL, respectively. Correlations were noted between Pb levels and the biomarkers uNTX-I, uCTX-II, and COMP in women, and between Pb and uCTX-II, COMP, CPII, and the ratio [C2C:CPII] in men. In adjusted models among women, a 5-unit increase in blood Pb level was associated with a 28% increase in uCTX-II and a 45% increase in uNTX-I levels (uCTX-II: 1.28 [95% CI: 1.04-1.58], uNTX-I: 1.45 [95% CI: 1.21-1.74]). 
Among men, levels of Pb and COMP showed a borderline positive association (8% increase in COMP for a 5-unit change in Pb: 1.08 [95% CI: 1.00-1.18]); no other associations were significant after adjustment. CONCLUSIONS: Based upon known biomarker origins, the novel associations between blood Pb and biomarkers appear to be primarily reflective of relationships to bone and calcified cartilage turnover among women and cartilage metabolism among men, suggesting a potential gender-specific effect of Pb on joint tissue metabolism that may be relevant to osteoarthritis. |
Menstrual cycle characteristics and reproductive hormone levels in women exposed to atrazine in drinking water
Cragin LA , Kesner JS , Bachand AM , Barr DB , Meadows JW , Krieg EF , Reif JS . Environ Res 2011 111 (8) 1293-301 Atrazine is the most commonly used herbicide in the U.S. and a widespread groundwater contaminant. Epidemiologic and laboratory evidence exists that atrazine disrupts reproductive health and hormone secretion. We examined the relationship between exposure to atrazine in drinking water and menstrual cycle function including reproductive hormone levels. Women 18-40 years old residing in agricultural communities where atrazine is used extensively (Illinois) and sparingly (Vermont) answered a questionnaire (n=102), maintained menstrual cycle diaries (n=67), and provided daily urine samples for analyses of luteinizing hormone (LH), and estradiol and progesterone metabolites (n=35). Markers of exposure included state of residence, atrazine and chlorotriazine concentrations in tap water, municipal water and urine, and estimated dose from water consumption. Women who lived in Illinois were more likely to report menstrual cycle length irregularity (odds ratio (OR)=4.69; 95% confidence interval (CI): 1.58-13.95) and more than 6 weeks between periods (OR=6.16; 95% CI: 1.29-29.38) than those who lived in Vermont. Consumption of >2 cups of unfiltered Illinois water daily was associated with increased risk of irregular periods (OR=5.73; 95% CI: 1.58-20.77). Estimated "dose" of atrazine and chlorotriazine from tap water was inversely related to mean mid-luteal estradiol metabolite. Atrazine "dose" from municipal concentrations was directly related to follicular phase length and inversely related to mean mid-luteal progesterone metabolite levels. We present preliminary evidence that atrazine exposure, at levels below the US EPA MCL, is associated with increased menstrual cycle irregularity, longer follicular phases, and decreased levels of menstrual cycle endocrine biomarkers of infertile ovulatory cycles. |
Gestational lead exposure selectively decreases retinal dopamine amacrine cells and dopamine content in adult mice
Fox DA , Hamilton WR , Johnson JE , Xiao W , Chaney S , Mukherjee S , Miller DB , O'Callaghan JP . Toxicol Appl Pharmacol 2011 256 (3) 258-67 Gestational lead exposure (GLE) produces supernormal scotopic electroretinograms (ERG) in children, monkeys and rats, and a novel retinal phenotype characterized by an increased number of rod photoreceptors and bipolar cells in adult mice and rats. Since the loss of dopaminergic amacrine cells (DA ACs) in GLE monkeys and rats contributes to supernormal ERGs, the retinal DA system was analyzed in mice following GLE. C57BL/6 female mice were exposed to low (27 ppm), moderate (55 ppm) or high (109 ppm) lead throughout gestation and until postnatal day 10 (PN10). Blood [Pb] in control, low-, moderate- and high-dose GLE was ≤1, ≤10, ~25 and ~40 mcg/dL, respectively, on PN10, and by PN30 all were ≤1 mcg/dL. At PN60, confocal-stereology studies used vertical sections and wholemounts to characterize tyrosine hydroxylase (TH) expression and the number of DA and other ACs. GLE dose-dependently and selectively decreased the number of TH-immunoreactive (IR) DA ACs and their synaptic plexus without affecting GABAergic, glycinergic or cholinergic ACs. Immunoblots and confocal microscopy revealed dose-dependent decreases in retinal TH protein expression and content, although monoamine oxidase-A protein and gene expression were unchanged. High-pressure liquid chromatography showed that GLE dose-dependently decreased retinal DA content, its metabolites and DA utilization/release. The mechanism of DA selective vulnerability is unknown. However, a GLE-induced loss/dysfunction of DA ACs during development could increase the number of rods and bipolar cells since DA helps regulate neuronal proliferation, whereas during adulthood it could produce ERG supernormality as well as altered circadian rhythms, dark/light adaptation and spatial contrast sensitivity. |
The HPV vaccine impact monitoring project (HPV-IMPACT): assessing early evidence of vaccination impact on HPV-associated cervical cancer precursor lesions.
Hariri S , Unger ER , Powell SE , Bauer HM , Bennett NM , Bloch KC , Niccolai LM , Schafer S , Markowitz LE . Cancer Causes Control 2011 23 (2) 281-8 The following paper describes a collaboration between the Centers for Disease Control and Prevention and five Emerging Infections Program sites to develop a comprehensive population-based approach to monitoring human papillomavirus (HPV) vaccine impact on cervical cancer precursors and associated HPV genotypes. The process of establishing this novel monitoring system is described, and development details such as enumeration of sources for reporting cervical intraepithelial neoplasia 2/3 and adenocarcinoma in situ, approaches to case ascertainment, electronic reporting, and HPV typing are outlined. Implementation of a feasible and sustainable surveillance system for HPV-associated cervical precancers will enable evaluation of the direct impact of HPV vaccination. |
Prescription medication use among normal weight, overweight, and obese adults, United States, 2005-2008
Kit BK , Ogden CL , Flegal KM . Ann Epidemiol 2011 22 (2) 112-9 PURPOSE: We sought to describe differences between normal weight, overweight, and obese adults in use of specific prescription medication classes. METHODS: Cross-sectional analysis of prescription medication use among 9789 adults in the National Health and Nutrition Examination Survey, a nationally representative sample of the United States. RESULTS: In 2005-2008, 56.4% (95% confidence interval [CI], 54.6-58.3) of adults used 1+ prescription medication. Approximately one-quarter of adults used a hypertension medication (26.1%; 95% CI, 24.5%-27.8%). The use of hypertension medications increased with increasing weight status (normal weight: 17.2%; 95% CI, 15.6%-18.8%; overweight: 24.5%, 95% CI, 22.6%-26.4%; and obese: 35.1%, 95% CI, 32.8%-37.4%). Similarly, lipid-lowering, analgesic, antidepressant, proton pump inhibitor, thyroid, diabetes, and bronchodilator medication use was greater among obese compared with normal weight adults (each p < .01). Among adults 65+ years, 72% (95% CI, 68.2%-75.8%) of men and 67.7% (95% CI, 64.3%-71.2%) of women used a hypertension medication, and a majority of men (51.2%, 95% CI, 48.4%-54%) and 40.3% (95% CI, 36.8%-43.8%) of women used lipid-lowering medications; the use of both was greater among obese adults compared to normal weight adults (both p < .01). CONCLUSIONS: Obese adults in the United States use several prescription medication classes more frequently than normal weight adults, including hypertension, lipid-lowering, and diabetes medications. |
Public health surveillance and knowing about health in the context of growing sources of health data
Lee LM , Thacker SB . Am J Prev Med 2011 41 (6) 636-40 The past decade has brought substantial changes in how data related to a community's health are collected, stored, and used to inform decisions about health interventions. Despite these changes, the purpose of public health surveillance has remained constant for more than a century. Public health surveillance is the ongoing, systematic collection, analysis, and interpretation of health-related data with the a priori purpose of preventing or controlling disease or injury, or of identifying unusual events of public health importance, followed by the dissemination and use of information for public health action. Surveillance is an important and necessary contributor to knowledge of a community's health. The public health system is responsible for ensuring that public health surveillance is conducted with appropriate practices and safeguards in order to maintain the public's trust. |
Evaluation of pregnancy mortality in Louisiana using enhanced linkage and different indicators defined by WHO and CDC/ACOG: challenging and practical issues
Tran T , Roberson E , Borstell J , Hoyert DL . Matern Child Health J 2011 15 (7) 955-63 Differences in definitions and methods of data collection on deaths occurring during or shortly after pregnancy have created confusion and challenges in evaluating research findings. The study aimed to determine whether the use of enhanced linkage procedures improves data collection on deaths occurring during or shortly after pregnancy, and how different definitions of those deaths changed the results of data analysis. The study used the 2000-2005 Louisiana Pregnancy Mortality Surveillance System (LPMSS) and 2000-2005 death certificates linked with 1999-2005 live birth and fetal death certificates. Five indicators of deaths occurring during or shortly after pregnancy using WHO and CDC/ACOG definitions were estimated. A one-sided Spearman rank test was used to analyze maternal mortality trends from 2000 to 2005. Of 345 women who died within 1 year of pregnancy, 187 were identified through linkage; 38 of those were missed by the LPMSS. Total mortality ratios of deaths occurring within 1 year of pregnancy ranged from 13.4 to 88.9 per 100,000 live births depending on the indicator used. CDC/ACOG pregnancy-related death and pregnancy-associated death statistically increased, whereas WHO pregnancy-related death decreased between 2000 and 2005. The most common causes of death differed by indicator. Universal adoption of linkage procedures could improve data on deaths occurring during or shortly after pregnancy. Estimates, trends, and most common causes of death were markedly different depending on which indicator was used. Additionally, the use of different mortality indicators during analysis provides a more detailed picture of potential target areas for future research and interventions. |
The field-testing of a novel integrated mapping protocol for neglected tropical diseases
Pelletreau S , Nyaku M , Dembele M , Sarr B , Budge P , Ross R , Mathieu E . PLoS Negl Trop Dis 2011 5 (11) e1380 BACKGROUND: Vertical control and elimination programs focused on specific neglected tropical diseases (NTDs) can achieve notable success by reducing the prevalence and intensity of infection. However, many NTD-endemic countries have not been able to launch or scale-up programs because they lack the necessary baseline data for planning and advocacy. Each NTD program has its own mapping guidelines to collect missing data. Where geographic overlap among NTDs exists, an integrated mapping approach could result in significant resource savings. We developed and field-tested an innovative integrated NTD mapping protocol (Integrated Threshold Mapping (ITM) Methodology) for lymphatic filariasis (LF), trachoma, schistosomiasis and soil-transmitted helminths (STH). METHODOLOGY/PRINCIPAL FINDINGS: The protocol is designed to be resource-efficient, and its specific purpose is to determine whether a threshold to trigger public health interventions in an implementation unit has been attained. The protocol relies on World Health Organization (WHO) recommended indicators in the disease-specific age groups. For each disease, the sampling frame was the district, but for schistosomiasis, the sub-district rather than the ecological zone was used. We tested the protocol by comparing it to current WHO mapping methodologies for each of the targeted diseases in one district each in Mali and Senegal. Results were compared in terms of public health intervention and feasibility, including cost. In this study, the ITM methodology reached the same conclusions as the WHO methodologies regarding the initiation of public health interventions for trachoma, LF and STH, but resulted in more targeted intervention recommendations for schistosomiasis. ITM was practical, feasible and demonstrated an overall cost saving compared with the standard, non-integrated, WHO methodologies. 
CONCLUSIONS/SIGNIFICANCE: This integrated mapping tool could facilitate the implementation of much-needed programs in endemic countries. |
Beyond base pairs to bedside: a population perspective on how genomics can improve health.
Khoury MJ , Gwinn M , Bowen MS , Dotson WD . Am J Public Health 2011 102 (1) 34-7 A decade after the sequencing of the human genome, the National Human Genome Research Institute announced a strategic plan for genomic medicine. It calls for evaluating the structure and biology of genomes, understanding the biology of disease, advancing the science of medicine, and improving the effectiveness of health care. Fulfilling the promise of genomics urgently requires a population perspective to complement the bench-to-bedside model of translation. A population approach should assess the contribution of genomics to health in the context of social and environmental determinants of disease; evaluate genomic applications that may improve health care; design strategies for integrating genomics into practice; address ethical, legal, and social issues; and measure the population health impact of new technologies. (Am J Public Health. Published online ahead of print November 17, 2011: e5-e8. doi:10.2105/AJPH.2011.300299). |
Genetic epidemiology with a capital E, ten years after.
Khoury MJ , Gwinn M , Clyne M , Yu W . Genet Epidemiol 2011 35 (8) 845-52 More than a decade after Duncan Thomas gave his presidential address at the International Society for Genetic Epidemiology entitled "Genetic Epidemiology with a Capital E," genetic epidemiology has gone mainstream. Epidemiology has taken its place not only in gene discovery studies but also in characterizing genetic effects and gene-environment interactions in populations. Furthermore, epidemiologic principles are being applied to the evaluation of genetic tests. We used an online informatics tool, the HuGE Navigator, to describe the growth in the field in the past decade. We developed the HuGE Navigator as a means to continuously monitor the evolving information obtained from epidemiologic studies of the human genome. Between 2001 and 2010, the HuGE Navigator included 57,005 articles published in 2,396 journals. During that period, the annual number of publications increased almost four-fold. The articles included 986 genome-wide association studies and 1,879 meta-analyses of gene-disease associations. The total number of authors of published studies grew from 12,907 in 2001 to 48,389 in 2010. The number of diseases also increased over time, from 697 medical subject headings in 2001 to 1,404 in 2010. Gene-environment interaction was mentioned explicitly in 17% of published abstracts, almost half of which focused on gene-drug interactions. Clearly, genetic epidemiology has gone "capital E" in the past decade; however, the ever-expanding volume and variety of genomic information poses a formidable challenge for developing appropriate methods for analysis, synthesis, and inference on complex genetic and environmental effects. We extend Duncan Thomas' capital E to include "Evaluation" as the tools of epidemiology are increasingly used to assess how genome-based information can be applied in medicine and public health. |
F8 and F9 mutations in US haemophilia patients: correlation with history of inhibitor and race/ethnicity.
Miller CH , Benson J , Ellingsen D , Driggers J , Payne A , Kelly FM , Soucie JM , Hooper WC . Haemophilia 2011 18 (3) 375-82 Both genetic and treatment-related risk factors contribute to the development of inhibitors in haemophilia. An inhibitor surveillance system piloted at 12 US sites has the goal of assessing risk factors through prospective data collection. This report examines the relationship of genotype and race/ethnicity to history of inhibitor in a large cohort of US haemophilia patients. Mutation analysis was performed on 676 haemophilia A (HA) and 153 haemophilia B (HB) patients by sequencing, Multiplex Ligation-dependent Probe Amplification, and PCR for inversions in F8 introns 22 (inv22) and 1 (inv1). Two HB patients with deletions had history of inhibitor. In severe HA, frequency of history of inhibitor was: large deletion 57.1%, splice site 35.7%, inv22 26.8%, nonsense 24.5%, frameshift 12.9%, inv1 11.1% and missense 9.5%. In HA, 19.6% of 321 White non-Hispanics (Whites), 37.1% of 35 Black non-Hispanics (Blacks) and 46.9% of 32 Hispanics had history of inhibitor (P = 0.0003). Mutation types and novel mutation rates were similar across ethnicities. When F8 haplotypes were constructed, Whites and Hispanics showed only H1 and H2. Within H1, history of inhibitor was 12.4% in Whites, 40.0% in Blacks (P = 0.009) and 32.4% in Hispanics (P = 0.002). Inhibitor frequency is confirmed to vary by mutation type and race in a large US population. White patients with history of inhibitor did not exhibit rare F8 haplotypes. F8 gene analysis did not reveal a cause for the higher inhibitor frequencies in Black and Hispanic patients. |
Sufficient sleep, physical activity, and sedentary behaviors
Foti KE , Eaton DK , Lowry R , McKnight-Ely LR . Am J Prev Med 2011 41 (6) 596-602 BACKGROUND: Insufficient sleep among adolescents is common and has adverse health and behavior consequences. Understanding associations of physical activity and sedentary behaviors with sleep duration could shed light on ways to promote sufficient sleep. PURPOSE: The purpose of this study is to determine whether physical activity and sedentary behaviors are associated with sufficient sleep (8 or more hours of sleep on an average school night) among U.S. high school students. METHODS: Data were from the 2009 national Youth Risk Behavior Survey and are representative of 9th-12th-grade students nationally (n=14,782). Associations of physical activity and sedentary behaviors with sufficient sleep were determined using logistic regression models controlling for confounders. Data were analyzed in October 2010. RESULTS: Students who engaged in ≥60 minutes of physical activity daily during the 7 days before the survey had higher odds of sufficient sleep than those who did not engage in ≥60 minutes on any day. There was no association between the number of days students were vigorously active ≥20 minutes and sufficient sleep. Compared to their respective referent groups of 0 hours on an average school day, students who watched TV ≥4 hours/day had higher odds of sufficient sleep and students who played video or computer games or used a computer for something that was not school work ≥2 hours/day had lower odds of sufficient sleep. CONCLUSIONS: Daily physical activity for ≥60 minutes and limited computer use are associated with sufficient sleep among adolescents. |
Depressive symptoms and food insufficiency among HIV-infected crack users in Atlanta and Miami
Vogenthaler NS , Hadley C , Rodriguez AE , Valverde EE , del Rio C , Metsch LR . AIDS Behav 2011 15 (7) 1520-6 Depression contributes to worse general and HIV-related clinical outcomes. We examined the prevalence of and factors associated with depressive symptomatology among HIV-infected crack cocaine users recruited for Project HOPE (Hospital Visit is an Opportunity for Prevention and Engagement with HIV-positive Crack Users). We used multiple logistic regression to determine sociodemographic correlates associated with screening in for depression. Among 291 participants, three-quarters (73.5%) were identified as depressed. Higher odds of screening in for depression were associated with food insufficiency and monthly income below $600. Alcohol and crack use were not associated with screening in for depression. Depressive symptomatology is extremely prevalent among HIV-infected crack cocaine users and is associated with food insufficiency and lower income. Screening for depression and food insecurity should be included in HIV prevention and treatment programs. Improved recognition and mitigation of these conditions will help alleviate their contribution to HIV-related adverse health outcomes. |
Investigation of a cluster of cutaneous aspergillosis in a neonatal intensive care unit
Etienne KA , Subudhi CP , Chadwick PR , Settle P , Moise J , Magill SS , Chiller T , Balajee SA . J Hosp Infect 2011 79 (4) 344-8 Between December 2007 and July 2008, three neonates in a neonatal intensive care unit (NICU) in Salford, UK, were diagnosed with primary cutaneous aspergillosis (PCA) due to Aspergillus fumigatus. The first PCA case, in December 2007, developed multi-organ failure leading to death within a short time frame: the other two cases survived after treatment with intravenous antifungal therapy followed by oral posaconazole. Air, surface, and water samples were collected within the NICU and from the incubators that were occupied by the neonates. All recovered fungal isolates were confirmed as A. fumigatus by sequencing the beta-tubulin region. Microsatellite strain typing demonstrated genotypically related A. fumigatus isolates from the neonates and the humidity chambers (HCs) of the neonates' incubators, suggesting that the source of the infection may have been the HCs/incubators used in the NICU. Aspergillus strain typing may be a useful tool in clinical outbreak settings to help understand the source of exposure and to design targeted environmental interventions to prevent future infections. |
Rotavirus vaccines: update on global impact and future priorities
Yen C , Tate JE , Patel MM , Cortese MM , Lopman B , Fleming J , Lewis K , Jiang B , Gentsch J , Steele D , Parashar UD . Hum Vaccin 2011 7 (12) 1282-90 Early rotavirus vaccine adopter countries in the Americas, Europe, and in Australia have documented substantial declines in rotavirus disease burden following the introduction of vaccination. However, the full public health impact of rotavirus vaccines has not been realized as they have not been introduced into routine immunization programs in countries of Africa and Asia with the highest rotavirus disease morbidity and mortality burden. In this article, we review the epidemiology of rotavirus disease, the development and current status of rotavirus vaccines including newly available vaccine impact data from early-introducer countries, and future priorities for implementation and monitoring of rotavirus vaccination programs in developing countries. |
Varicella in infants after implementation of the US varicella vaccination program
Chaves SS , Lopez AS , Watson TL , Civen R , Watson B , Mascola L , Seward JF . Pediatrics 2011 128 (6) 1071-7 OBJECTIVE: To describe varicella disease in infants since implementation of the varicella vaccination program in the United States. PATIENTS AND METHODS: From 1995 to 2008, demographic, clinical, and epidemiologic data on cases of varicella in infants were collected prospectively through a community-based active surveillance project. We examined disease patterns for infants in 2 age groups: 0 to 5 and 6 to 11 months. RESULTS: Infant varicella disease incidence declined 89.7% from 1995 to 2008. Infants aged 0 to 5 months had milder clinical disease than those aged 6 to 11 months: ≥50 lesions, 49% vs 58% (P = .038); fever (body temperature > 38 degrees C), 12% vs 21% (P = .014); and varicella-related complications, 6% vs 14% (P = .009), respectively. Age was an independent predictor of the occurrence of complications. CONCLUSIONS: The varicella vaccination program has resulted in substantial indirect benefits for infants, who are not eligible for vaccination. Presence of maternal varicella-zoster virus antibodies might explain attenuated disease in very young infants likely born to mothers with history of varicella. Although varicella disease incidence has declined, exposure to varicella-zoster virus continues to occur. Improving varicella vaccination coverage in all age groups will further reduce the risk of varicella exposure and protect those not eligible for varicella vaccination. |
Magnitude of potential biases in a simulated case-control study of the effectiveness of influenza vaccination
Ferdinands JM , Shay DK . Clin Infect Dis 2011 54 (1) 25-32 BACKGROUND: Many influenza vaccine effectiveness estimates have been made using case-control methods. Although several forms of bias may distort estimates of vaccine effectiveness derived from case-control studies, there have been few attempts to quantify the magnitude of these biases. METHODS: We estimated the magnitude of potential biases in influenza vaccine effectiveness values derived from case-control studies from several factors, including bias from differential use of diagnostic testing based on influenza vaccine status, imperfect diagnostic test characteristics, and confounding. A decision tree model was used to simulate an influenza vaccine effectiveness case-control study in children. Using probability distributions, we varied the value of factors that influence vaccine effectiveness estimates, including diagnostic test characteristics, vaccine coverage, likelihood of receiving a diagnostic test for influenza, likelihood that a child hospitalized with acute respiratory infection had influenza, and others. Bias was measured as the difference between the effectiveness observed in the simulated case-control study and a true underlying effectiveness value. RESULTS AND CONCLUSIONS: We found an average difference between observed and true vaccine effectiveness of -11.9%. Observed vaccine effectiveness underestimated the true effectiveness in 88% of model iterations. Diagnostic test specificity exhibited the strongest association with observed vaccine effectiveness, followed by the likelihood of receiving a diagnostic test based on vaccination status and the likelihood that a child hospitalized with acute respiratory infection had influenza. Our findings suggest that the potential biases in case-control studies that we examined tend to result in underestimates of true influenza vaccine effects. |
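The central mechanism in this simulation — cases and controls defined by an imperfect diagnostic test, with observed effectiveness compared against a known true value — can be illustrated with a short deterministic sketch. This is not the authors' decision tree model; the function name, parameterization, and all values used are hypothetical simplifications:

```python
def observed_ve(true_ve, r_flu, r_other, sens, spec):
    """Expected vaccine effectiveness seen in a case-control study where
    cases (test-positive) and controls (test-negative) are defined by an
    imperfect influenza test. Hypothetical simplification:
      r_flu   -- influenza-related ARI admission risk if unvaccinated
      r_other -- non-influenza ARI admission risk (assumed vaccine-independent)
    Vaccine coverage cancels out of the odds ratio in this simplified model,
    so it is omitted here (the study varied it along with other factors)."""
    r_flu_vax = r_flu * (1 - true_ve)  # vaccine reduces only true influenza risk

    def cells(risk_flu):
        cases = risk_flu * sens + r_other * (1 - spec)     # test-positive
        controls = risk_flu * (1 - sens) + r_other * spec  # test-negative
        return cases, controls

    a, b = cells(r_flu_vax)  # vaccinated cases, controls
    c, d = cells(r_flu)      # unvaccinated cases, controls
    return 1 - (a / b) / (c / d)  # observed VE = 1 - odds ratio
```

With a perfect test (sens = spec = 1) the observed VE equals the true VE; lowering specificity pulls the estimate toward zero, matching the direction of bias the study reports.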
Meningococcal conjugate vaccines: optimizing global impact
Terranella A , Cohn A , Clark T . Infect Drug Resist 2011 4 161-9 Meningococcal conjugate vaccines have several advantages over polysaccharide vaccines, including the ability to induce greater antibody persistence, avidity, immunologic memory, and herd immunity. Since 1999, meningococcal conjugate vaccine programs have been established across the globe. Many of these vaccination programs have resulted in significant decline in meningococcal disease in several countries. Recent introduction of serogroup A conjugate vaccine in Africa offers the potential to eliminate meningococcal disease as a public health problem in Africa. However, the duration of immune response and the development of widespread herd immunity in the population remain important questions for meningococcal vaccine programs. Because of the unique epidemiology of meningococcal disease around the world, the optimal vaccination strategy for long-term disease prevention will vary by country. |
Influenza A (H1N1) 2009 monovalent vaccination among adults with asthma, U.S., 2010
Lu PJ , Callahan DB , Ding H , Euler GL . Am J Prev Med 2011 41 (6) 619-26 BACKGROUND: The 2009 pandemic influenza A (H1N1) virus (2009 H1N1) was first identified in April 2009 and quickly spread around the world. The first doses of influenza A (H1N1) 2009 monovalent vaccine (2009 H1N1 vaccine) became available in the U.S. in early October 2009. Because people with asthma are at increased risk of complications from influenza, people with asthma were included among the initial prioritized groups. PURPOSE: To evaluate 2009 H1N1 vaccination coverage and identify factors independently associated with vaccination among adults with asthma in the U.S. METHODS: Data from the 2009-2010 BRFSS (Behavioral Risk Factor Surveillance System) influenza supplemental survey were used; responses from March through June 2010 were analyzed to estimate vaccination levels of 2009 H1N1 vaccine among respondents aged 25-64 years with asthma. Multivariable logistic regression and predictive marginal models were performed to identify factors independently associated with vaccination. RESULTS: Among adults aged 25-64 years with asthma, 25.5% (95% CI=23.9%, 27.2%) received the 2009 H1N1 vaccination. Vaccination coverage ranged from 9.9% (95% CI=6.4%, 15.1%) in Mississippi to 46.1% (95% CI=33.3%, 61.2%) in Maine. Characteristics independently associated with an increased likelihood of vaccination among adults with asthma were as follows: had a primary doctor, had other high-risk conditions, and received seasonal influenza vaccination in the 2009-2010 season. CONCLUSIONS: Vaccination coverage among adults aged 25-64 years with asthma was only 25.5% and varied widely by state and demographic characteristics. National and state-specific 2009 H1N1 vaccination coverage data for adults with asthma are useful for evaluating the vaccination campaign and for planning and implementing strategies for increasing vaccination coverage in possible future pandemics. |
Compliance with recommended dosing intervals for HPV vaccination among females, 13-17 years, National Immunization Survey-Teen, 2008-2009
Dorell CG , Stokley S , Yankey D , Markowitz LE . Vaccine 2011 30 (3) 503-5 Data from the 2008 and 2009 National Immunization Survey-Teen were analyzed to determine age at initiation of the human papillomavirus vaccine (HPV) series among females 13-17 years (n=7594) and assess compliance with the recommended HPV dosing intervals. Among females who initiated the HPV series, 56.7% of females <13 years at the time of the HPV vaccine recommendation publication did so by age 13; while the majority of females 13-14 and 15-17 years at the time of the recommendation publication did so at ages 14 (44.4%) and 16 (46.7%), respectively. Forty-six percent of females who received three doses completed the vaccination series in a period longer than the recommended time interval. Series completion at an earlier age to ensure protection before sexual debut is optimal. Improved provider communication of the need for three doses for long-term protection and implementing clinical practice guidelines to use reminder-recall systems may increase HPV completion. |
Healthy people 2010 objectives for unintentional injury and violence among adolescents: trends from the National Youth Risk Behavior Survey, 1999-2009
Olsen EO , Hertz MF , Shults RA , Hamburger ME , Lowry R . Am J Prev Med 2011 41 (6) 551-8 BACKGROUND: In 2000, the USDHHS released Healthy People 2010 (HP2010), a series of disease prevention and health promotion objectives for the nation. Thirty-nine of these objectives were dedicated to injury prevention and six of these objectives related to adolescents, who were tracked through CDC's National Youth Risk Behavior Survey (YRBS). PURPOSE: This paper uses national YRBS data from 1999 to 2009 to analyze overall and subgroup trends and determine progress toward targets for the following HP2010 objectives: seatbelt use (HP2010 objective 15-19); motorcycle helmet use (15-21); riding with a driver who had been drinking alcohol (26-6); physical fighting (15-38); weapon carrying on school property (15-39); and suicide attempts requiring medical attention (18-2). METHODS: The CDC conducted the national YRBS biennially from 1999 to 2009 and used similar three-stage cluster-sample designs to obtain representative samples of high school students in the U.S. This study was conducted in 2010 and used linear and quadratic time variables simultaneously in logistic regression models while controlling for gender, race/ethnicity, and grade to test for secular trends over time. RESULTS: Only two objectives met their HP2010 targets: riding with a driver who had been drinking alcohol (26-6) and physical fighting (15-38). Progress was seen for four additional objectives and within some subgroups. CONCLUSIONS: Substantial policy and practice changes must occur if the recently released Healthy People 2020 targets are to be met. School-, community-, and state-level policies and programs may be effective tools to prevent injuries and victimizations. |
Rapid typing of Coxiella burnetii.
Hornstra HM , Priestley RA , Georgia SM , Kachur S , Birdsell DN , Hilsabeck R , Gates LT , Samuel JE , Heinzen RA , Kersh GJ , Keim P , Massung RF , Pearson T . PLoS One 2011 6 (11) e26201 Coxiella burnetii has the potential to cause serious disease and is highly prevalent in the environment. Despite this, epidemiological data are sparse and isolate collections are typically small, rare, and difficult to share among laboratories as this pathogen is governed by select agent rules and fastidious to culture. With the advent of whole genome sequencing, some of this knowledge gap has been overcome by the development of genotyping schemes; however, many of these methods are cumbersome and not readily transferable between institutions. As comparisons of the few existing collections can dramatically increase our knowledge of the evolution and phylogeography of the species, we aimed to facilitate such comparisons by extracting SNP signatures from past genotyping efforts and then incorporating these signatures into assays that quickly and easily define genotypes and phylogenetic groups. We found 91 polymorphisms (SNPs and indels) among multispacer sequence typing (MST) loci and designed 14 SNP-based assays that could be used to type samples based on previously established phylogenetic groups. These assays are rapid, inexpensive, real-time PCR assays whose results are unambiguous. Data from these assays allowed us to assign 43 previously untyped isolates to established genotypes and genomic groups. Furthermore, genotyping results based on assays from the signatures provided here are easily transferred between institutions, readily interpreted phylogenetically and simple to adapt to new genotyping technologies. |
Optimization of a low cost and broadly sensitive genotyping assay for HIV-1 drug resistance surveillance and monitoring in resource-limited settings.
Zhou Z , Wagar N , Devos JR , Rottinghaus E , Diallo K , Nguyen DB , Bassey O , Ugbena R , Wadonda-Kabondo N , McConnell MS , Zulu I , Chilima B , Nkengasong J , Yang C . PLoS One 2011 6 (11) e28184 Commercially available HIV-1 drug resistance (HIVDR) genotyping assays are expensive and have limitations in detecting non-B subtypes and circulating recombinant forms that are co-circulating in resource-limited settings (RLS). This study aimed to optimize a low cost and broadly sensitive in-house assay in detecting HIVDR mutations in the protease (PR) and reverse transcriptase (RT) regions of the pol gene. The overall plasma genotyping sensitivity was 95.8% (N = 96). Compared to the original in-house assay and two commercially available genotyping systems, TRUGENE(R) and ViroSeq(R), the optimized in-house assay showed a nucleotide sequence concordance of 99.3%, 99.6% and 99.1%, respectively. The optimized in-house assay was more sensitive in detecting mixture bases than the original in-house (N = 87, P<0.001), TRUGENE(R), and ViroSeq(R) assays. When the optimized in-house assay was applied to genotype samples collected for HIVDR surveys (N = 230), all 72 (100%) plasma and 69 (95.8%) of the matched dried blood spots (DBS) in the Vietnam transmitted HIVDR survey were genotyped and nucleotide sequence concordance was 98.8%; testing of treatment-experienced patient plasmas with viral load (VL) ≥3 and <3 log10 copies/ml from the Nigeria and Malawi surveys yielded 100% (N = 46) and 78.6% (N = 14) genotyping rates, respectively. Furthermore, all 18 matched DBS stored at room temperature from the Nigeria survey were genotyped. Phylogenetic analysis of the 236 sequences revealed that 43.6% were CRF01_AE, 25.9% subtype C, 13.1% CRF02_AG, 5.1% subtype G, 4.2% subtype B, 2.5% subtype A, 2.1% each subtype F and unclassifiable, and 0.4% each CRF06_CPX, CRF07_BC and CRF09_CPX. 
CONCLUSIONS: The optimized in-house assay is broadly sensitive in genotyping HIV-1 group M viral strains and more sensitive than the original in-house, TRUGENE(R) and ViroSeq(R) in detecting mixed viral populations. The broad sensitivity and substantial reagent cost saving make this assay more accessible for RLS where HIVDR surveillance is recommended to minimize the development and transmission of HIVDR. |
Semi-quantitative analysis of influenza samples using the Luminex xTAG(®) respiratory viral panel kit.
Smith J , Sammons D , Toennis C , Butler MA , Blachere F , Beezhold D . Toxicol Mech Methods 2011 22 (3) 211-7 The Luminex xTAG(R) respiratory viral panel (RVP) kit simultaneously detects and identifies multiple respiratory viruses, including several subtypes of influenza A, using a multiplex nucleic acid amplification test assay platform. The emitted fluorescence signal from the RVP assay provides qualitative information on the presence of a particular viral species in respiratory specimens. However, a quantitative assessment is preferred when monitoring environmental samples for respiratory viruses. In this study, we explored the potential use of the RVP kit as a semi-quantitative screening assay for influenza virus detection. The concentration-response of the RVP assay was modeled using four-parameter logistic (4-PL) fits of mean fluorescence intensity (MFI) versus dilution series of the influenza A matrix gene, seasonal influenza vaccine, and 2009 H1N1 influenza vaccine. The goodness of fit of the 4-PL model was evaluated by comparing the copy number determined with the fitted model (observed copy number) with the copy number calculated from the dilution of the matrix DNA or vaccine (expected copy number). For the matrix DNA and 2009 H1N1 vaccine, the 4-PL model provided a good fit for the influenza A RVP assay response over factors of 10(3) to 10(4). For the seasonal influenza vaccine, the model provided a good fit for the RVP assay response to influenza A, influenza B, H1, and H3. |
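The semi-quantitative step above rests on the 4-PL concentration-response model and its inverse (back-calculating a copy number from an observed MFI). A minimal sketch of the model follows; the function names and the parameter values in the example are hypothetical, and an actual fit would estimate the four parameters by nonlinear least squares on the calibration dilutions:

```python
def four_pl(x, a, b, c, d):
    """Four-parameter logistic response (e.g., MFI) at concentration x:
    a = response at zero dose, d = response at infinite dose,
    c = inflection point (EC50), b = slope factor."""
    return d + (a - d) / (1.0 + (x / c) ** b)

def four_pl_inverse(y, a, b, c, d):
    """Back-calculate concentration (e.g., copy number) from an observed
    response y; valid only for y strictly between the a and d asymptotes."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)
```

Round-tripping a concentration through `four_pl` and `four_pl_inverse` recovers it exactly, which is the property the observed-versus-expected copy number comparison in the study exploits.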
Multiple cytokines are released when blood from patients with tuberculosis is stimulated with Mycobacterium tuberculosis antigens
Kellar KL , Gehrke J , Weis SE , Mahmutovic-Mayhew A , Davila B , Zajdowicz MJ , Scarborough R , Lobue PA , Lardizabal AA , Daley CL , Reves RR , Bernardo J , Campbell BH , Whitworth WC , Mazurek GH . PLoS One 2011 6 (11) e26545 BACKGROUND: Mycobacterium tuberculosis (Mtb) infection may cause overt disease or remain latent. Interferon gamma release assays (IGRAs) detect Mtb infection, both latent infection and infection manifesting as overt disease, by measuring whole-blood interferon gamma (IFN-gamma) responses to Mtb antigens such as early secreted antigenic target-6 (ESAT-6), culture filtrate protein 10 (CFP-10), and TB7.7. Due to a lack of adequate diagnostic standards for confirming latent Mtb infection, IGRA sensitivity for detecting Mtb infection has been estimated using patients with culture-confirmed tuberculosis (CCTB) for whom recovery of Mtb confirms the infection. In this study, cytokines in addition to IFN-gamma were assessed for potential to provide robust measures of Mtb infection. METHODS: Cytokine responses to ESAT-6, CFP-10, TB7.7, or combinations of these Mtb antigens, for patients with CCTB were compared with responses for subjects at low risk for Mtb infection (controls). Three different multiplexed immunoassays were used to measure concentrations of 9 to 20 different cytokines. Responses were calculated by subtracting background cytokine concentrations from cytokine concentrations in plasma from blood stimulated with Mtb antigens. RESULTS: Two assays demonstrated that ESAT-6, CFP-10, ESAT-6+CFP-10, and ESAT-6+CFP-10+TB7.7 stimulated the release of significantly greater amounts of IFN-gamma, IL-2, IL-8, MCP-1 and MIP-1beta for CCTB patients than for controls. Responses to combination antigens were, or tended to be, greater than responses to individual antigens. 
A third assay, using whole blood stimulation with ESAT-6+CFP-10+TB7.7, revealed significantly greater IFN-gamma, IL-2, IL-6, IL-8, IP-10, MCP-1, MIP-1beta, and TNF-alpha responses among patients compared with controls. One CCTB patient with a falsely negative IFN-gamma response had elevated responses with other cytokines. CONCLUSIONS: Multiple cytokines are released when whole blood from patients with CCTB is stimulated with Mtb antigens. Measurement of multiple cytokine responses may improve diagnostic sensitivity for Mtb infection compared with assessment of IFN-gamma alone. |
Phytoestrogen biomonitoring: an extractionless LC-MS/MS method for measuring urinary isoflavones and lignans by use of atmospheric pressure photoionization (APPI)
Parker DL , Rybak ME , Pfeiffer CM . Anal Bioanal Chem 2011 402 (3) 1123-36 We present here a high-performance liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for quantifying phytoestrogenic isoflavones (daidzein, equol, genistein, and O-desmethylangolensin) and lignans (enterodiol and enterolactone) in urine without the use of extraction or the preconcentration techniques inherent in existing methods. This approach was made possible by use of atmospheric pressure photoionization (APPI), an ionization technique that we found to improve analyte sensitivity relative to electrospray ionization and atmospheric pressure chemical ionization for this particular group of compounds. The analytical performance of this method equaled or exceeded that of comparable methods. Between-run coefficients of variation (CVs) across three quality control (QC) pool levels analyzed in duplicate over 20 days were 3.1-5.8%; within-run CVs were 2.3-6.0%. Accuracy, as determined by average spike recovery in QC pools, was generally within +/-10% of being quantitative (100%). Relative limits of detection were 0.04-0.4 ng/mL urine, with absolute detection limits as low as 0.1 pg. This method was applied to the analysis of >2,500 urine specimens for the 2005-2006 Centers for Disease Control and Prevention's National Health and Nutrition Examination Survey (NHANES). The method was capable of quantifying these compounds in 95-100% of study samples. This work is the first report of using APPI for the LC-MS/MS determination of these compounds in urine, and the first method of its kind to do so without any need for analyte extraction or preconcentration prior to analysis. |
Comparative study of different sources of pertussis toxin (PT) as coating antigens in IgG anti-PT ELISAs
Kapasi A , Meade BD , Plikaytis B , Pawloski L , Martin MD , Yoder S , Rock MT , Coddens S , Haezebroeck V , Fievet-Groyne F , Bixler G , Jones C , Hildreth S , Edwards KM , Messonnier NE , Tondella ML . Clin Vaccine Immunol 2011 19 (1) 64-72 In an effort to improve the reliability and reproducibility of serological assays for Bordetella pertussis, a collaborative study was conducted to compare four different sources of pertussis toxin (PT) as coating antigens in the immunoglobulin G (IgG) anti-PT enzyme-linked immunosorbent assay (ELISA). Four sources of PT were used as coating antigens in the IgG anti-PT ELISA in four different testing laboratories (Labs A-D) to determine whether the different antigen preparations and different laboratories influenced assay results. A panel of 60 sera consisting of de-identified human specimens from previous vaccination trials of normal healthy adults and infants and clinical specimens from outbreak settings was tested. In the four laboratories, each sample was tested three times with the four PT antigens according to the standard coating optimization and IgG anti-PT ELISA testing procedures used in that laboratory. Differences among the antigens, as well as intra- and inter-laboratory variability, were evaluated. Excellent agreement was observed with the test panel results among the four antigens within each laboratory. Concordance correlation coefficient (r(c)) measurements among the different antigens were 0.99, 0.99-1.00, 1.00, and 0.97-1.00 for Labs A-D, respectively. The comparisons between pairs of laboratories also indicated a high degree of concordance for each PT preparation, with r(c) measurements of 0.90-0.98, 0.93-0.99, 0.92-0.98 and 0.93-0.99 for antigens 1-4, respectively. 
Relatively minor differences in results were observed among laboratories or among antigens, suggesting that the four PT antigens are quite similar and could be considered for acceptance in harmonized immunoassays used for serodiagnosis or vaccine evaluation. |
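The concordance correlation coefficient r(c) used above differs from Pearson's r in that it penalizes both poor correlation and systematic offset between two measurement series. A plain-Python sketch of the standard formula (illustrative only, not the laboratories' analysis code):

```python
from statistics import mean

def concordance_cc(x, y):
    """Lin's concordance correlation coefficient between paired measurements
    (e.g., the same serum panel titered against two PT antigen preparations)."""
    n = len(x)
    mx, my = mean(x), mean(y)
    sx = sum((v - mx) ** 2 for v in x) / n                    # biased variance
    sy = sum((v - my) ** 2 for v in y) / n
    sxy = sum((p - mx) * (q - my) for p, q in zip(x, y)) / n  # covariance
    # Unlike Pearson's r, a constant shift between x and y lowers r(c).
    return 2 * sxy / (sx + sy + (mx - my) ** 2)
```

Identical series give r(c) = 1; a fixed offset between two assays reduces r(c) even when the correlation is perfect, which is why it suits agreement studies like this one.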
Comparison of digital with film radiographs for the classification of pneumoconiotic pleural abnormalities
Larson TC , Holiday DB , Antao VC , Thomas J , Pinheiro G , Kapil V , Franzblau A . Acad Radiol 2011 19 (2) 131-40 RATIONALE AND OBJECTIVES: Analog film radiographs are typically used to classify pneumoconiosis to allow comparison with standard film radiographs. The aim of this study was to determine if digital radiography is comparable to film for the purpose of classifying pneumoconiotic pleural abnormalities. MATERIALS AND METHODS: Subjects were 200 asbestos-exposed patients, from whom digital and film chest radiographs were obtained along with chest high-resolution computed tomographic scans. Using a crossover design, radiographs were independently read on two occasions by seven readers, using conventional International Labour Organization standards for film and digitized standards for digital. High-resolution computed tomographic scans were read independently by three readers. Areas under the receiver-operating characteristic curves were calculated using high-resolution computed tomographic ratings as the gold standard for disease status. Mixed linear models were fit to estimate the effects of order of presentation, occasion, and modality, treating the seven readers as a random effect. Comparing digital and film radiography for each reader and occasion, crude agreement and agreement beyond chance (kappa) were also calculated. RESULTS: The linear models showed no statistically significant sequence effect for order of presentation (P = .73) or occasion (P = .28). Most important, the difference between modalities was not statistically significant (digital vs film, P = .54). The mean area under the curve for film was 0.736 and increased slightly to 0.741 for digital. Mean crude agreement for the presence of pleural abnormalities consistent with pneumoconiosis across all readers and occasions was 78.3%, while the mean kappa value was 0.49. 
CONCLUSIONS: These results indicate that digital radiography is not statistically different from analog film for the purpose of classifying pneumoconiotic pleural abnormalities, when appropriate standards are used. |
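Crude agreement and Cohen's kappa for a pair of readers' binary calls (pleural abnormality present = 1, absent = 0) can be computed as follows — a generic sketch of the standard statistics, not the study's code:

```python
def agreement_and_kappa(reader_a, reader_b):
    """Return (crude agreement, Cohen's kappa) for two readers' binary ratings.
    Kappa corrects observed agreement for the agreement expected by chance
    from each reader's marginal rating rates."""
    n = len(reader_a)
    observed = sum(a == b for a, b in zip(reader_a, reader_b)) / n
    pa = sum(reader_a) / n  # reader A's rate of "abnormal" calls
    pb = sum(reader_b) / n
    chance = pa * pb + (1 - pa) * (1 - pb)
    return observed, (observed - chance) / (1 - chance)
```

A kappa near 0.49, as reported above, indicates moderate agreement beyond chance even when crude agreement looks high (78.3%), since chance agreement inflates the raw figure.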
Coxsackievirus A24 variant uses sialic acid-containing O-linked glycoconjugates as cellular receptors on human ocular cells
Mistry N , Inoue H , Jamshidi F , Storm RJ , Oberste MS , Arnberg N . J Virol 2011 85 (21) 11283-90 Coxsackievirus A24 variant (CVA24v) is a main causative agent of acute hemorrhagic conjunctivitis (AHC), which is a highly contagious eye infection. Previously it has been suggested that CVA24v uses sialic acid-containing glycoconjugates as attachment receptors on corneal cells, but the nature of these receptors is poorly described. Here, we set out to characterize and identify the cellular components serving as receptors for CVA24v. Binding and infection experiments using corneal cells treated with deglycosylating enzymes or metabolic inhibitors of de novo glycosylation suggested that the receptor(s) used by CVA24v are constituted by sialylated O-linked glycans that are linked to one or more cell surface proteins but not to lipids. CVA24v bound better to mouse L929 cells overexpressing human P-selectin glycoprotein ligand-1 (PSGL-1) than to mock-transfected cells, suggesting that PSGL-1 is a candidate receptor for CVA24v. Finally, binding competition experiments using a library of mono- and oligosaccharides mimicking known PSGL-1 glycans suggested that CVA24v binds to Neu5Acalpha2,3Gal disaccharides (Neu5Ac is N-acetylneuraminic acid). These results provide further insights into the early steps of the CVA24v life cycle. |
Detection of an influenza B virus strain with reduced susceptibility to neuraminidase inhibitor drugs
Bastien N , Gubbay JB , Richardson D , Sleeman K , Gubareva L , Li Y . J Clin Microbiol 2011 49 (11) 4020-1 The neuraminidase inhibitors (NAIs) oseltamivir and zanamivir have played an essential role in the prophylaxis and treatment of influenza. The residues forming the NA active sites are conserved among influenza A and B viruses (3). Conserved residues are in direct contact with the substrate or provide structural framework for the functional residues. There have been reports of in vivo resistance for influenza B viruses (4, 7). Here we report the isolation of a novel influenza B virus with reduced sensitivity to NAIs. | On 22 December 2010, an 87-year-old woman presented to a hospital in Ontario, Canada, with an influenza-like illness. Her symptoms began on 19 December 2010. She was admitted to the hospital and treated with oseltamivir for 5 days (75 mg twice daily), making an uneventful recovery. Influenza B virus was detected in a nasopharyngeal swab collected on 22 December 2010. The specimen was cultured in rhesus monkey kidney cells, and the isolate was designated B/Ontario/RV75-11/2010. The susceptibility of B/Ontario/RV75-11/2010 to NAIs was determined by a chemiluminescence neuraminidase inhibition assay. The 50% inhibitory concentrations (IC50s) for B/Ontario/RV75-11/2010 showed a 7- to 13-fold increase and a 6- to 18-fold increase compared to the values for the wild-type control B/Hong Kong/36/2005 for oseltamivir and zanamivir, respectively (Table 1). Specimen collection and drug treatment initiation occurred on the same day, indicating that the reduced sensitivity may have occurred naturally. |
The National Children's Study: an opportunity for medical toxicology
Mortensen ME , Hirschfeld S . J Med Toxicol 2011 8 (2) 160-5 The National Children's Study (NCS) is a national longitudinal study that will prospectively investigate the influence of biological, environmental, genetic, and social factors on the health and development of US children. The NCS was mandated by the Children's Health Act of 2000 (Public Law 106-310) and is being implemented by the National Institutes of Health with input from the Centers for Disease Control and Prevention (CDC), the Environmental Protection Agency, and other federal departments and agencies. The NCS is a data-driven, evidence-based, community- and participant-informed study. Given its scale and scope, the NCS is an integrated system using several data acquisition strategies intended to provide evidenced-based design of methodologies and protocols. These strategies include the Vanguard Study, the Main Study, and formative research and sub-studies. The Vanguard Study, a pilot study, is currently underway and has been expanded from 7 to 37 study locations. The original study protocols and recruitment strategy have been field tested and revisions are under consideration. The CDC is collaborating with NCS in a pilot study that evaluates biological specimen protocols and will provide results on a broad array of environmental chemical exposures and nutritional indicators for a sample of Vanguard Study participants. This study is an example of the kind of collaborative opportunity that would benefit the NCS. Medical toxicologists have unique training in basic and clinical toxicology and laboratory assessments, and by partnering with study centers, both the NCS design and future NCS research projects could be enhanced. |
Prenatal care utilization in Mississippi: racial disparities and implications for unfavorable birth outcomes
Cox RG , Zhang L , Zotti ME , Graham J . Matern Child Health J 2011 15 (7) 931-42 The objective of the study is to identify racial disparities in prenatal care (PNC) utilization and to examine the relationship between PNC and preterm birth (PTB), low birth weight (LBW) and infant mortality in Mississippi. A retrospective cohort built from linked Mississippi birth and infant death files (1996-2003) was used. Analysis was limited to live-born singleton infants born to non-Hispanic white and black women (n = 292,776). PNC was classified by Kotelchuck's Adequacy of Prenatal Care Utilization Index. Factors associated with PTB, LBW and infant death were identified using multiple logistic regression after controlling for maternal age, education, marital status, place of residence, tobacco use and medical risk. About one in five Mississippi women had less than adequate PNC, and racial disparities in PNC utilization were observed. Black women delayed PNC, received too few visits, and were more likely to have either "inadequate PNC" (P < 0.0001) or "no care" (P < 0.0001) compared to white women. Furthermore, among women with medical conditions, black women were twice as likely to receive inadequate PNC compared to white women. Regardless of race, "no care" and "inadequate PNC" were strong risk factors for PTB, LBW and infant death. We provide empirical evidence to support the existence of racial disparities in PNC utilization and infant birth outcomes in Mississippi. Further study is needed to explain racial differences in PNC utilization. However, this study suggests that public health interventions designed to improve PNC utilization among women might reduce unfavorable birth outcomes, especially infant mortality. |
Emergency department laboratory evaluations of fever without source in children aged 3 to 36 months
Simon AE , Lukacs SL , Mendola P . Pediatrics 2011 128 (6) e1368-75 OBJECTIVE: This article describes ordering of diagnostic tests, admission rates, and antibiotic administration among visits to US emergency departments (EDs) by children aged 3 to 36 months with fever without source (FWS). METHODS: The 2006-2008 National Hospital Ambulatory Medical Care Survey-Emergency Department was used to identify visits by 3- to 36-month-old children with FWS. Percentages of visits that included a complete blood count (CBC), urinalysis, blood culture, radiograph, rapid influenza test, admission to hospital, and ceftriaxone and other antibiotic administration were calculated. Multivariate logistic regression was used to identify factors associated with ordering of a CBC and urinalysis. RESULTS: No tests were ordered in 58.6% of visits for FWS. CBCs were ordered in 20.5% of visits and urinalysis in 17.4% of visits. Even among girls with a temperature of ≥39 degrees C, urinalysis was ordered in only 40.2% of visits. Ceftriaxone was given in 7.1% and other antibiotics in 18.3% of visits; 5.2% of the children at these visits were admitted to the hospital. In multivariate analysis, increased temperature, being female, and higher median income of the patient's zip code were associated with increased odds of having a CBC and urinalysis ordered. Being 24 to 36 months of age was associated with lower odds of receiving both a CBC and a urinalysis. CONCLUSIONS: In most US emergency department visits for FWS among children aged 3 to 36 months, physicians do not order diagnostic tests. Being female, having a higher fever, and higher median income of the patient's zip code were associated with ordering CBCs and urinalysis. |
The health of HIV-exposed children after early weaning
Parker ME , Tembo M , Adair L , Chasela C , Piwoz EG , Jamieson DJ , Ellington S , Kayira D , Soko A , Mkhomawanthu C , Martinson F , van der Horst CM , Bentley ME . Matern Child Nutr 2011 9 (2) 217-32 There are potential health risks associated with the use of early weaning to prevent mother-to-child transmission of human immunodeficiency virus (HIV) in resource-poor settings. Our objective was to examine growth and nutrient inadequacies among a cohort of children weaned early. Children participating in the Breastfeeding Antiretrovirals and Nutrition (BAN) Study in Lilongwe, Malawi, had HIV-infected mothers, were weaned at 6 months and fed a lipid-based nutrient supplement (LNS) until 12 months. Forty HIV-negative, BAN-exited children were compared with 40 HIV-negative, community children matched on age, gender and local health clinic. Nutrient intake was calculated from 24-h dietary recalls collected from BAN-exited children. Anthropometric measurements were collected from BAN-exited and matched community children at 15-16 months, and 2 months later. Longitudinal random effects sex-stratified models were used to evaluate anthropometric differences between the two groups. BAN-exited children consumed adequate energy, protein and carbohydrates but inadequate amounts of fat. The prevalence of inadequate micronutrient intake was: 46% for vitamin A; 20% for vitamin B6; 69% for folate; 13% for vitamin C; 19% for iron; and 23% for zinc. Regarding growth, BAN-exited girls gained weight at a significantly lower rate {0.02 g kg(-1) per day [95% confidence interval (CI): 0.01, 0.03]} than their matched comparison [0.05 g kg(-1) per day (95% CI: 0.03, 0.07)]; BAN girls grew significantly slower [0.73 cm month(-1) (95% CI: 0.40, 1.06)] than their matched comparison [1.55 cm month(-1) (95% CI: 0.98, 2.12)]. Among this sample of BAN-exited children, early weaning was associated with dietary deficiencies and girls experienced reduced growth velocity. 
In resource-poor settings, HIV prevention programmes must ensure that breastfeeding stops only once a nutritionally adequate and safe diet without breast milk can be provided. |
Human coronavirus in young children hospitalized for acute respiratory illness and asymptomatic controls
Prill MM , Iwane MK , Edwards KM , Williams JV , Weinberg GA , Staat MA , Willby MJ , Talbot HK , Hall CB , Szilagyi PG , Griffin MR , Curns AT , Erdman DD . Pediatr Infect Dis J 2011 31 (3) 235-40 BACKGROUND: Human coronaviruses (HCoVs) have been detected in children with upper and lower respiratory symptoms but little is known about their relationship with severe respiratory illness. OBJECTIVE: To compare the prevalence of HCoV species among children hospitalized for acute respiratory illness and/or fever with asymptomatic controls and to assess the severity of outcomes among hospitalized children with HCoV compared with other respiratory viruses. METHODS: From December 2003-April 2004 and October 2004-April 2005, we conducted prospective, population-based surveillance of children <5 years of age hospitalized for ARI/fever in three US counties. Asymptomatic outpatient controls were enrolled concurrently. Nasal/throat swabs were tested for HCoV species HKU1, NL63, 229E, and OC43 by RT-PCR. Specimens from hospitalized children were also tested for other common respiratory viruses. Demographic and medical data were collected by parent/guardian interview and medical chart review. RESULTS: Overall, HCoV was detected in 113 (7.6%) of 1481 hospitalized children (83 [5.7%] after excluding 30 cases coinfected with other viruses) and 53 (7.1%) of 742 controls. The prevalence of HCoV or individual species was not significantly higher among hospitalized children than controls. Hospitalized children testing positive for HCoV alone tended to be less ill than those infected with other viruses, whereas those coinfected with HCoV and other viruses were clinically similar to those infected with other viruses alone. CONCLUSION: In this study of children hospitalized for acute respiratory illness and/or fever, HCoV infection was not associated with hospitalization or with increased severity of illness. |
Concurrent medical conditions and health care use and needs among children with learning and behavioral developmental disabilities, National Health Interview Survey, 2006-2010
Schieve LA , Gonzalez V , Boulet SL , Visser SN , Rice CE , Braun KV , Boyle CA . Res Dev Disabil 2011 33 (2) 467-476 Studies document various associated health risks for children with developmental disabilities (DDs). Further study is needed by disability type. Using the 2006-2010 National Health Interview Surveys, we assessed the prevalence of numerous medical conditions (e.g. asthma, frequent diarrhea/colitis, seizures), health care use measures (e.g. seeing a medical specialist and >9 office visits in past year), health impact measures (e.g. needing help with personal care), and selected indicators of unmet health needs (e.g. unable to afford needed prescription medications) among a nationally representative sample of children ages 3-17 years, with and without DDs. Children in four mutually exclusive developmental disability groups were compared to children without DDs (N=35,775) on each condition or health care measure of interest: autism (N=375); intellectual disability (ID) without autism (N=238); attention-deficit/hyperactivity disorder (ADHD) without autism or ID (N=2901); and learning disability (LD) or other developmental delay without ADHD, autism, or ID (N=1955). Adjusted odds ratios (aORs) were calculated from weighted logistic regression models that accounted for the complex sample design. Prevalence estimates for most medical conditions examined were moderately to markedly higher for children in all four DD groups than children without DDs. Most differences were statistically significant after adjustment for child sex, age, race/ethnicity, and maternal education. Children in all DD groups also had significantly higher estimates for health care use, impact, and unmet needs measures than children without DDs. This study provides empirical evidence that children with DDs require increased pediatric and specialist services, both for their core functional deficits and concurrent medical conditions. |
Crossing growth percentiles in infancy and risk of obesity in childhood
Taveras EM , Rifas-Shiman SL , Sherry B , Oken E , Haines J , Kleinman K , Rich-Edwards JW , Gillman MW . Arch Pediatr Adolesc Med 2011 165 (11) 993-8 OBJECTIVE: To examine the associations of upward crossing of major percentiles in weight-for-length in the first 24 months of life with the prevalence of obesity at ages 5 and 10 years. DESIGN: Longitudinal study. SETTING: Multisite clinical practice. PARTICIPANTS: We included 44,622 children aged from 1 month to less than 11 years with 122,214 length/height and weight measurements from January 1, 1980, through December 31, 2008. MAIN EXPOSURE: The number of major weight-for-length percentiles crossed during each of four 6-month intervals, that is, 1 to 6 months, 6 to 12 months, 12 to 18 months, and 18 to 24 months. MAIN OUTCOME MEASURES: Odds and observed prevalence of obesity (body mass index [calculated as weight in kilograms divided by height in meters squared] ≥95th percentile) at ages 5 and 10 years. RESULTS: Crossing upwards 2 or more weight-for-length percentiles was common in the first 6 months of life (43%) and less common during later age intervals. Crossing upwards 2 or more weight-for-length percentiles in the first 24 months was associated with elevated odds of obesity at ages 5 years (odds ratio, 2.08; 95% CI, 1.84-2.34) and 10 years (1.75; 1.53-2.00) compared with crossing less than 2 major percentiles. Obesity prevalence at ages 5 and 10 was highest among children who crossed upwards 2 or more weight-for-length percentiles in the first 6 months of life. CONCLUSIONS: Crossing upwards 2 or more major weight-for-length percentiles in the first 24 months of life is associated with later obesity. Upward crossing of 2 weight-for-length percentiles in the first 6 months is associated with the highest prevalence of obesity 5 and 10 years later. Efforts to curb excess weight gain in infancy may be useful in preventing later obesity. |
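The BMI definition quoted in the abstract above (weight in kilograms divided by height in meters squared, with obesity flagged at or above the 95th percentile) can be sketched as follows. This is an illustrative sketch only: the function names, the example child's measurements, and the percentile cutoff value are assumptions, not data from the study.

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight in kilograms divided by height in meters squared."""
    return weight_kg / height_m ** 2

def is_obese(weight_kg, height_m, p95_cutoff):
    """Flag obesity as BMI at or above the age- and sex-specific
    95th-percentile cutoff, which the caller must supply from a
    reference growth chart."""
    return bmi(weight_kg, height_m) >= p95_cutoff

# Hypothetical example: an 18 kg child measuring 1.10 m, checked
# against an assumed 95th-percentile cutoff of 18.0 kg/m^2.
print(round(bmi(18, 1.10), 2))
print(is_obese(18, 1.10, 18.0))
```

In practice the cutoff comes from age- and sex-specific reference tables, which is why it is passed in rather than hard-coded.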
Cytokines and neurodevelopmental outcomes in extremely low birth weight infants
Carlo WA , McDonald SA , Tyson JE , Stoll BJ , Ehrenkranz RA , Shankaran S , Goldberg RN , Das A , Schendel D , Thorsen P , Skogstrand K , Hougaard DM , Oh W , Laptook AR , Duara S , Fanaroff AA , Donovan EF , Korones SB , Stevenson DK , Papile LA , Finer NN , O'Shea TM , Poindexter BB , Wright LL , Ambalavanan N , Higgins RD . J Pediatr 2011 159 (6) 919-925 e3 OBJECTIVE: To determine if selected pro-inflammatory and anti-inflammatory cytokines and/or mediators of inflammation reported to be related to the development of cerebral palsy (CP) predict neurodevelopmental outcome in extremely low birth weight infants. STUDY DESIGN: Infants with birth weights ≤1000 g (n = 1067) had blood samples collected at birth and on days 3 +/- 1, 7 +/- 1, 14 +/- 3, and 21 +/- 3 to examine the association between cytokines and neurodevelopmental outcomes. The analyses were focused on 5 cytokines (interleukin [IL] 1beta; IL-8; tumor necrosis factor-alpha; regulated upon activation, normal T-cell expressed, and secreted (RANTES); and IL-2) reported to be most predictive of CP in term and late preterm infants. RESULTS: IL-8 was higher on days 0-4 and subsequently in infants who developed CP compared with infants who did not develop CP in both unadjusted and adjusted analyses. Other cytokines (IL-12, IL-17, tumor necrosis factor-beta, soluble IL R alpha, macrophage inflammatory protein 1beta) were found to be altered on days 0-4 in infants who developed CP. CONCLUSIONS: CP in former preterm infants may, in part, have a late perinatal and/or early neonatal inflammatory origin. |
Strategies for pediatric practitioners to increase fruit and vegetable consumption in children
Kim SA , Grimm KA , May AL , Harris DM , Kimmons J , Foltz JL . Pediatr Clin North Am 2011 58 (6) 1439-53 High intake of fruits and vegetables (FV) is associated with a decreased risk for many chronic diseases and may assist in weight management, but few children and adolescents consume the recommended amounts of FV. The pediatric practitioner can positively influence FV consumption of children through patient-level interventions (eg, counseling, connecting families to community resources), community-level interventions (eg, advocacy, community involvement), and health care facility-level interventions (eg, creating a healthy food environment in the clinical setting). This article reviews the importance of FV consumption, recommended intakes for children, and strategies by which pediatric practitioners can influence FV consumption of children. |
Occupational exposure to acrylamide in closed system production plants: air levels and biomonitoring
Moorman WJ , Reutman SS , Shaw PB , Blade LM , Marlow D , Vesper H , Clark JC , Schrader SM . J Toxicol Environ Health A 2012 75 (2) 100-11 The aim of this study was to evaluate biomarkers of acrylamide exposure, including hemoglobin adducts and urinary metabolites in acrylamide production workers. Biomarkers are integrated measures of the internal dose, and it is total acrylamide dose from all routes and sources that may present health risks. Workers from three companies were studied. Workers potentially exposed to acrylamide monomer wore personal breathing-zone air samplers. Air samples and surface-wipe samples were collected and analyzed for acrylamide. General-area air samples were collected in chemical processing units and control rooms. Hemoglobin adducts were isolated from ethylenediaminetetraacetic acid (EDTA)-whole blood, and adducts of acrylamide and glycidamide, at the N-terminal valines of hemoglobin, were cleaved from the protein chain by use of a modified Edman reaction. Full work-shift, personal breathing zone, and general-area air samples were collected and analyzed for particulate and acrylamide monomer vapor. The highest general-area concentration of acrylamide vapor was 350 mcg/cm(3) in monomer production. Personal breathing zone and general-area concentrations of acrylamide vapor were found to be highest in monomer production operations, and lower levels were found in the polymer production operations. Adduct levels varied widely among workers, with the highest in workers in the monomer and polymer production areas. The acrylamide adduct range was 15-1884 pmol/g; glycidamide adducts ranged from 17.8 to 1376 pmol/g. The highest acrylamide and glycidamide adduct levels were found among monomer production process operators. The primary urinary metabolite N-acetyl-S-(2-carbamoylethyl) cysteine (NACEC) ranged from the limit of detection to 15.4 mcg/ml. 
Correlation of workplace exposure and sentinel health effects is needed to determine and control safe levels of exposure for regulatory standards. |
Reaerosolization of MS2 bacteriophage from an N95 filtering facepiece respirator by simulated coughing
Fisher EM , Richardson AW , Harpest SD , Hofacre KC , Shaffer RE . Ann Occup Hyg 2011 56 (3) 315-25 The supply of N95 filtering facepiece respirators (FFRs) may not be adequate to match demand during a pandemic outbreak. One possible strategy to maintain supplies in healthcare settings is to extend FFR use for multiple patient encounters; however, contaminated FFRs may serve as a source for the airborne transmission of virus particles. In this study, reaerosolization of virus particles from contaminated FFRs was examined using bacteriophage MS2 as a surrogate for airborne pathogenic viruses. MS2 was applied to FFRs as droplets or droplet nuclei. A simulated cough (370 l min(-1) peak flow) provided reverse airflow through the contaminated FFR. The number and size of the reaerosolized particles were measured using gelatin filters and an Andersen Cascade Impactor (ACI). Two droplet nuclei challenges produced higher percentages of reaerosolized particles (0.21 and 0.08%) than a droplet challenge (<0.0001%). Overall, the ACI-determined size distribution of the reaerosolized particles was larger than the characterized loading virus aerosol. This study demonstrates that only a small percentage of viable MS2 viruses was reaerosolized from FFRs by reverse airflow under the conditions evaluated, suggesting that the risks of exposure due to reaerosolization associated with extended use can be considered negligible for most respiratory viruses. However, risk assessments should be updated as new viruses emerge and better workplace exposure data becomes available. |
Field evaluation of a new prototype self-contained breathing apparatus
Coca A , Kim JH , Duffy R , Williams WJ . Ergonomics 2011 54 (12) 1197-206 Firefighters are required to use a self-contained breathing apparatus (SCBA) for respiratory protection when engaged in a variety of firefighting duties. While the SCBA provides crucial respiratory support and protection, it is also cumbersome and heavy, thus adding to the physical work performed by the firefighter. The purpose of the present study was to evaluate and compare the low profile SCBA prototype to a standard SCBA, as assessed by the objective and subjective measures of mobility and comfort, time of donning/doffing, as well as by acquiring user feedback on SCBA design features during field activities. The results of the present study indicated that the prototype SCBA was rated as a significant improvement over the standard SCBA in the areas of range of motion (ROM), mobility, comfort, induction of fatigue, interaction with protective clothing, and operability when worn over a standard firefighter ensemble, while performing a series of International Association of Fire Fighters Fire Ground Survival Program training exercises. STATEMENT OF RELEVANCE: A prototype SCBA was evaluated and compared with a standard SCBA, focusing on the objective and subjective measures of mobility and comfort during field activities. Feedback from end users was collected during the evaluation. The findings of the present study can be used for improving the system design and overall performance of new prototype SCBAs. |
Evaluating portable infrared spectrometers for measuring the silica content of coal dust
Miller AL , Drake PL , Murphy NC , Noll JD , Volkwein JC . J Environ Monit 2011 14 (1) 48-55 Miners face a variety of respiratory hazards while on the job, including exposure to silica dust which can lead to silicosis, a potentially fatal lung disease. Currently, field-collected filter samples of silica are sent for laboratory analysis and the results take weeks to be reported. Since the mining workplace is constantly moving into new and often different geological strata with changing silica levels, more timely data on silica levels in mining workplaces could help reduce exposures. Improvements in infrared (IR) spectroscopy open the prospect for end-of-shift silica measurements at mine sites. Two field-portable IR spectrometers were evaluated for their ability to quantify the mass of silica on filter samples loaded with known amounts of either silica or silica-bearing coal dust (silica content ranging from 10-200 μg/filter). Analyses included a scheme to correct for the presence of kaolin, which is a confounder for IR analysis of silica. IR measurements of the samples were compared to parallel measurements derived using the laboratory-based U.S. Mine Safety and Health Administration P7 analytical method. Linear correlations between Fourier transform infrared (FTIR) and P7 data yielded slopes in the range of 0.90-0.97 with minimal bias. Data from a variable filter array spectrometer did not correlate as well, mainly due to poor wavelength resolution compared to the FTIR instrument. This work has shown that FTIR spectrometry has the potential to reasonably estimate the silica exposure of miners if employed in an end-of-shift method. |
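The slope comparison reported above (field FTIR readings regressed against the laboratory P7 reference method) is an ordinary least-squares fit. A minimal sketch of that calculation follows; the paired sample values are made up for illustration, not taken from the study.

```python
def ols_slope(x, y):
    """Ordinary least-squares slope of y on x:
    sum((x - mean_x) * (y - mean_y)) / sum((x - mean_x)**2).
    A slope near 1.0 indicates the field method tracks the reference method."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    num = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    den = sum((xi - mean_x) ** 2 for xi in x)
    return num / den

# Hypothetical paired measurements (micrograms of silica per filter):
p7 = [10, 50, 100, 200]        # laboratory P7 reference values
ftir = [9.5, 47.5, 95, 190]    # portable FTIR readings
print(ols_slope(p7, ftir))
```

With these invented values the FTIR readings sit at 95% of the reference values, so the fitted slope is 0.95, within the 0.90-0.97 range the abstract reports for the real instruments.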
Effect of transmission reduction by insecticide-treated bednets (ITNs) on antimalarial drug resistance in western Kenya
Shah M , Kariuki S , Vanden Eng J , Blackstock AJ , Garner K , Gatei W , Gimnig JE , Lindblade K , Terlouw D , Ter Kuile F , Hawley WA , Phillips-Howard P , Nahlen B , Walker E , Hamel MJ , Slutsker L , Shi YP . PLoS One 2011 6 (11) e26746 Despite the clear public health benefit of insecticide-treated bednets (ITNs), the impact of malaria transmission-reduction by vector control on the spread of drug resistance is not well understood. In the present study, the effect of sustained transmission reduction by ITNs on the prevalence of Plasmodium falciparum gene mutations associated with resistance to the antimalarial drugs sulfadoxine-pyrimethamine (SP) and chloroquine (CQ) in children under the age of five years was investigated during an ITN trial in Asembo area, western Kenya. During the ITN trial, the national first line antimalarial treatment changed from CQ to SP. Smear-positive samples collected from cross sectional surveys prior to ITN introduction (baseline, n = 250) and five years post-ITN intervention (year 5 survey, n = 242) were genotyped for single nucleotide polymorphisms (SNPs) at dhfr-51, 59, 108, 164 and dhps-437, 540 (SP resistance), and pfcrt-76 and pfmdr1-86 (CQ resistance). The association between the drug resistance mutations and epidemiological variables was evaluated. There were significant increases in the prevalence of SP dhps mutations and the dhfr/dhps quintuple mutant, and a significant reduction in the proportion of mixed infections detected at dhfr-51, 59 and dhps-437, 540 SNPs from baseline to the year 5 survey. There was no change in the high prevalence of pfcrt-76 and pfmdr1-86 mutations. Multivariable regression analysis further showed that current antifolate use and year of survey were significantly associated with more SP drug resistance mutations. 
These results suggest that increased antifolate drug use due to drug policy change likely led to the high prevalence of SP mutations 5 years post-ITN intervention and reduced transmission had no apparent effect on the existing high prevalence of CQ mutations. There is no evidence from the current study that sustained transmission reduction by ITNs reduces the prevalence of genes associated with malaria drug resistance. |
Evidence for a useful life of more than three years for a polyester-based long-lasting insecticidal mosquito net in Western Uganda
Kilian A , Byamukama W , Pigeon O , Gimnig J , Atieli F , Koekemoer L , Protopopoff N . Malar J 2011 10 299 BACKGROUND: Long-lasting insecticidal nets (LLIN) are now standard for the prevention of malaria. However, only products with recommendation for public use from the World Health Organization should be used and this evaluation includes the assessment of net effectiveness after three years of field use. Results for one of the polyester-based products, Interceptor(R), are presented. METHODS: In five villages, 190 LLIN and 90 nets conventionally treated with the insecticide alpha-cypermethrin at 25 mg/m2 were distributed randomly and used by the families. Following a baseline household survey a net survey was carried out every six months to capture use, washing habits and physical condition of the nets. Randomly selected nets were collected after 6, 12, 24, 36 and 42 months and tested for remaining insecticide content and ability to knock-down and kill malaria transmitting mosquitoes. RESULTS: During the three and a half years of observation only 16 nets were lost to follow-up resulting in an estimated attrition rate of 12% after three years and 20% after three and a half years. Nets were used regularly and washed on average 1.5 times per year. After three and a half years 29% of the nets were still in good condition while 13% were seriously torn with no difference between the LLIN and control nets. The conventionally treated nets quickly lost insecticide and after 24 months only 7% of the original dose remained (1.6 mg/m2). Baseline median concentration of alpha-cypermethrin for LLIN was 194.5 mg/m2 or 97% of the target dose with between and within net variation of 11% and 4% respectively (relative standard deviation). On the LLIN 73.8 mg/m2 alpha-cypermethrin remained after three years of use and 56.2 mg/m2 after three and a half, and 94% and 81% of the LLIN still had > 15 mg/m2 left respectively. 
Optimal effectiveness in bio-assays (≥95% 60 minute knock-down or ≥ 80% 24 hour mortality) was found in 83% of the sampled LLIN after three and 71% after three and a half years. CONCLUSIONS: Under conditions in Western Uganda the tested long-lasting insecticidal net Interceptor(R) fulfilled the criteria for phase III of WHO evaluations and, based on preliminary criteria of the useful life, this product is estimated to last on average between three and four years. |
Ethical justification for conducting public health surveillance without patient consent
Lee LM , Heilig CM , White A . Am J Public Health 2011 102 (1) 38-44 Public health surveillance by necessity occurs without explicit patient consent. There is strong legal and scientific support for maintaining name-based reporting of infectious diseases and other types of public health surveillance. We present conditions under which surveillance without explicit patient consent is ethically justifiable using principles of contemporary clinical and public health ethics. Overriding individual autonomy must be justified in terms of the obligation of public health to improve population health, reduce inequities, attend to the health of vulnerable and systematically disadvantaged persons, and prevent harm. In addition, data elements collected without consent must represent the minimal necessary interference, lead to effective public health action, and be maintained securely. (Am J Public Health. Published online ahead of print November 17, 2011: e1-e7. doi:10.2105/AJPH.2011.300297). |
Developing the next generation of vaccinologists
Klein NP , Gidudu J , Qiang Y , Pahud B , Rowhani-Rahbar A , Baxter R , Dekker CL , Edwards KM , Halsey NA , Larussa P , Marchant C , Tokars JI , Destefano F . Vaccine 2011 29 (50) 9296-7 Thank you for your editorial in the December 6, 2010 issue titled “Developing the Next Generation of Vaccinologists” [1]. While we support your call for expanded formal vaccinology training, we also wish to point out that the Centers for Disease Control and Prevention (CDC)’s Immunization Safety Office (ISO) currently has two programs focused on mentoring and training in vaccine safety. | The oldest training opportunity is an ISO position within CDC's Epidemic Intelligence Service (EIS) program (http://www.cdc.gov/eis/index.html). EIS is a 2-year post-graduate training program of service and on-the-job learning for health professionals that provides “hands-on” practical training in epidemiology and public health [2], [3]. Over the past 15 years, 11 EIS Officers have received training in immunization safety through assignments with ISO. |
Blood pressure and cholesterol screening prevalence among U.S. women of reproductive age: opportunities to improve screening
Robbins CL , Dietz PM , Bombard JM , Gibbs F , Ko JY , Valderrama AL . Am J Prev Med 2011 41 (6) 588-95 BACKGROUND: Blood pressure and cholesterol screening among women of reproductive age are important for early disease detection and intervention, and because hypertension and dyslipidemia are associated with adverse pregnancy outcomes. PURPOSE: The objective of this study was to examine associations of sociodemographic characteristics, cardiovascular disease risk factors, and healthcare access indicators with blood pressure and cholesterol screening among women of reproductive age. METHODS: In 2011, prevalence estimates for self-reported blood pressure screening within 2 years and cholesterol screening within 5 years and AORs for screenings were calculated for 4837 women aged 20-44 years, using weighted 2008 National Health Interview Survey data. RESULTS: Overall, recommended blood pressure and cholesterol screening was received by 89.6% and 63.3% of women, respectively. Those who were underinsured or uninsured had the lowest screening percentage at 76.6% for blood pressure (95% CI=73.4, 79.6) and 47.6% for cholesterol (95% CI=43.8, 51.5) screening. Suboptimal cholesterol screening prevalence was also found for women who smoke (54.5%, 95% CI=50.8, 58.2); obese women (69.8%, 95% CI=66.3, 73.0); and those with cardiovascular disease (70.3%, 95% CI=63.7, 76.1), prediabetes (73.3%, 95% CI=64.1, 80.8), or hypertension (81.4%, 95% CI=76.6, 85.4). CONCLUSIONS: Most women received blood pressure screening, but many did not receive cholesterol screening. Universal healthcare access may improve screening prevalence. |
Cervical cancer screening among women who attend sexually transmitted diseases (STD) clinics: background paper for 2010 STD treatment guidelines
Datta SD , Saraiya M . Clin Infect Dis 2011 53 S153-S159 BACKGROUND: In April 2008, experts reviewed updates on sexually transmitted disease (STD) prevention and treatment in preparation for the revision of the Centers for Disease Control and Prevention (CDC) STD Treatment Guidelines. This included a review of cervical cancer screening in the STD clinical setting. METHODS: Key questions were identified with assistance from an expert panel. Reviews of the literature were conducted using the PubMed computerized database and shared with the panel. Updated information was incorporated in the 2010 CDC STD Treatment Guidelines. RESULTS: We recommend that STD clinics offering cervical screening services screen and treat women according to guidelines by the American College of Obstetricians and Gynecologists, the American Cancer Society, the US Preventive Services Task Force, and the American Society for Colposcopy and Cervical Pathology. New to the 2010 guidelines are a higher age for initiating cervical screening (age >=21 years) and less frequent intervals of screening (at least every 3 years). New recommendations include new technologies, such as liquid-based cytology and high-risk human papillomavirus (HPV) DNA tests. Liquid-based technologies are not recommended over conventional testing. HPV DNA tests are recommended as adjunct tests and with new indications for use in cervical screening and management. Stronger recommendations were issued for STD clinics offering cervical screening services to have protocols in place for follow-up of test results and referral (eg, colposcopy). CONCLUSIONS: Important additions to the 2010 STD Treatment Guidelines include information on updated algorithms for screening and management of women and recommendations for use of liquid-based cytology and high-risk HPV testing. |
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Disease Reservoirs and Vectors
- Drug Safety
- Environmental Health
- Epidemiology and Surveillance
- Genetics and Genomics
- Health Behavior and Risk
- Healthcare Associated Infections
- Immunity and Immunization
- Injury and Violence
- Laboratory Sciences
- Maternal and Child Health
- Nutritional Sciences
- Occupational Safety and Health
- Occupational Safety and Health - Mining
- Parasitic Diseases
- Public Health Ethics
- Public Health Leadership and Management
- Reproductive Health
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions.