Fecal DNA testing for colorectal cancer screening: the ColoSure™ test
Ned RM , Melillo S , Marrone M . PLoS Curr 2011 3 RRN1220 Colorectal cancer is the third most common cancer and the second leading cause of cancer-related deaths in the United States. Screening has been shown to be effective in reducing colorectal cancer incidence and mortality. Colonoscopy, sigmoidoscopy, and fecal occult blood tests are all recommended screening tests that have widespread availability. Nevertheless, many people do not receive the evidence-based recommended screening for colorectal cancer. Additional stool-based methods have been developed that offer more options for colorectal cancer screening, including a variety of fecal DNA tests. The only fecal DNA test that is currently available commercially in the United States is ColoSure™, which is marketed as a non-invasive test that detects an epigenetic marker (methylated vimentin) associated with colorectal cancer and pre-cancerous adenomas. We examined the published literature on the analytic validity, clinical validity, and clinical utility of ColoSure and we briefly summarized the current colorectal cancer screening guidelines regarding fecal DNA testing. We also addressed the public health implications of the test and contextual issues surrounding the integration of fecal DNA testing into current colorectal cancer screening strategies. The primary goal was to provide a basic overview of ColoSure and identify gaps in knowledge and evidence that affect the recommendation and adoption of the test in colorectal cancer screening strategies. |
Effectiveness-based guidelines for the prevention of cardiovascular disease in women--2011 update: a guideline from the American Heart Association
Mosca L , Benjamin EJ , Berra K , Bezanson JL , Dolor RJ , Lloyd-Jones DM , Newby LK , Pina IL , Roger VL , Shaw LJ , Zhao D , Beckie TM , Bushnell C , D'Armiento J , Kris-Etherton PM , Fang J , Ganiats TG , Gomes AS , Gracia CR , Haan CK , Jackson EA , Judelson DR , Kelepouris E , Lavie CJ , Moore A , Nussmeier NA , Ofili E , Oparil S , Ouyang P , Pinn VW , Sherif K , Smith SC Jr , Sopko G , Chandra-Strobos N , Urbina EM , Vaccarino V , Wenger NK . Circulation 2011 123 (11) 1243-62 Substantial progress has been made in the awareness, treatment, and prevention of cardiovascular disease (CVD) in women since the first women-specific clinical recommendations for the prevention of CVD were published by the American Heart Association (AHA) in 1999.1 The myth that heart disease is a “man’s disease” has been debunked; the rate of public awareness of CVD as the leading cause of death among US women has increased from 30% in 1997 to 54% in 2009.2 The age-adjusted death rate resulting from coronary heart disease (CHD) in females, which accounts for about half of all CVD deaths in women, was 95.7 per 100 000 females in 2007, a third of what it was in 1980.3,4 Approximately 50% of this decline in CHD deaths has been attributed to reducing major risk factors and the other half to treatment of CHD including secondary preventive therapies.4 Major randomized controlled clinical trials such as the Women’s Health Initiative have changed the practice of CVD prevention in women over the past decade.5 The investment in combating this major public health issue for women has been significant, as have the scientific and medical achievements. | Despite the gains that have been made, considerable challenges remain. 
In 2007, CVD still caused ≈1 death per minute among women in the United States.6 This represents 421 918 deaths, more than the number of women’s lives claimed by cancer, chronic lower respiratory disease, Alzheimer disease, and accidents combined.6 Reversing a trend of the past 4 decades, CHD death rates in US women 35 to 54 years of age now actually appear to be increasing, likely because of the effects of the obesity epidemic.4 CVD rates in the United States are significantly higher for black females compared with their white counterparts (286.1/100 000 versus 205.7/100 000). This disparity parallels the substantially lower rate of awareness of heart disease and stroke that has been documented among black versus white women.2,6–8 Of concern is that in a recent AHA national survey, only 53% of women said the first thing they would do if they thought they were having a heart attack was to call 9-1-1. This distressing lack of appreciation by many women for the need for emergency care for acute cardiovascular events is a barrier to optimal survival among women and underscores the need for educational campaigns targeted to women.2 |
Cancer burden in the HIV-infected population in the United States
Shiels MS , Pfeiffer RM , Gail MH , Hall HI , Li J , Chaturvedi AK , Bhatia K , Uldrick TS , Yarchoan R , Goedert JJ , Engels EA . J Natl Cancer Inst 2011 103 (9) 753-62 BACKGROUND: Effective antiretroviral therapy has reduced the risk of AIDS and dramatically prolonged the survival of HIV-infected people in the United States. Consequently, an increasing number of HIV-infected people are at risk of non-AIDS-defining cancers that typically occur at older ages. We estimated the annual number of cancers in the HIV-infected population, both with and without AIDS, in the United States. METHODS: Incidence rates for individual cancer types were obtained from the HIV/AIDS Cancer Match Study by linking 15 HIV and cancer registries in the United States. Estimated counts of the US HIV-infected and AIDS populations were obtained from Centers for Disease Control and Prevention surveillance data. We obtained estimated counts of AIDS-defining (ie, Kaposi sarcoma, non-Hodgkin lymphoma, and cervical cancer) and non-AIDS-defining cancers in the US AIDS population during 1991-2005 by multiplying cancer incidence rates and AIDS population counts, stratified by year, age, sex, race and ethnicity, transmission category, and AIDS-relative time. We tested trends in counts and standardized incidence rates using linear regression models. We multiplied overall cancer rates and HIV-only (HIV infected, without AIDS) population counts, available from 34 US states during 2004-2007, to estimate cancers in the HIV-only population. All statistical tests were two-sided. RESULTS: The US AIDS population expanded fourfold from 1991 to 2005 (96,179 to 413,080) largely because of an increase in the number of people aged 40 years or older. During 1991-2005, an estimated 79,656 cancers occurred in the AIDS population. 
From 1991-1995 to 2001-2005, the estimated number of AIDS-defining cancers decreased more than threefold (34,587 to 10,325 cancers; P(trend) < .001), whereas non-AIDS-defining cancers increased approximately threefold (3193 to 10,059 cancers; P(trend) < .001). From 1991-1995 to 2001-2005, estimated counts increased for anal cancer (206 to 1564), liver cancer (116 to 583), prostate cancer (87 to 759), lung cancer (875 to 1882), and Hodgkin lymphoma (426 to 897). In the HIV-only population in 34 US states, an estimated 2191 non-AIDS-defining cancers occurred during 2004-2007, including 454 lung, 166 breast, and 154 anal cancers. CONCLUSIONS: Over a 15-year period (1991-2005), increases in non-AIDS-defining cancers were mainly driven by growth and aging of the AIDS population. This growing burden requires targeted cancer prevention and treatment strategies. |
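The count-estimation approach described in the methods above (multiplying stratum-specific cancer incidence rates by population counts) can be sketched as follows. The rates and person-year figures below are hypothetical illustrations, not data from the study.

```python
# Sketch of estimating expected cancer counts by summing, over strata,
# (incidence rate per 100,000 person-years) x (person-years at risk).
# All numbers below are hypothetical, not taken from the study.

def expected_cancer_count(strata):
    """Each stratum is (incidence_rate_per_100k, person_years)."""
    return sum(rate * person_years / 100_000 for rate, person_years in strata)

# Hypothetical strata, e.g. defined by age group and sex:
strata = [
    (150.0, 120_000),  # 150/100k over 120,000 person-years -> 180 cancers
    (300.0, 50_000),   # 300/100k over 50,000 person-years  -> 150 cancers
]
print(expected_cancer_count(strata))  # 330.0
```

In the study itself the strata additionally include year, race and ethnicity, transmission category, and AIDS-relative time; the arithmetic per stratum is the same.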
Trends in HIV diagnoses and testing among U.S. adolescents and young adults
Hall HI , Walker F , Shah D , Belle E . AIDS Behav 2011 16 (1) 36-43 The Centers for Disease Control and Prevention recommends routine HIV screening in health care settings. Using national surveillance data, we assessed trends in HIV diagnoses and testing frequency in youth aged 13-24 diagnosed with HIV in 2005-2008. Diagnosis rates increased among black (17.0% per year), Hispanic (13.5%), and white males (8.8%), with increases driven by men who have sex with men (MSM). A higher percentage of white males and MSM had previously been tested than their counterparts. No increases in diagnoses or differences in testing were observed among females. Intensified interventions are needed to reduce HIV infections and racial/ethnic disparities. |
Four-year treatment outcomes of adult patients enrolled in Mozambique's rapidly expanding antiretroviral therapy program
Auld AF , Mbofana F , Shiraishi RW , Sanchez M , Alfredo C , Nelson LJ , Ellerbrock T . PLoS One 2011 6 (4) e18453 BACKGROUND: In Mozambique during 2004-2007 numbers of adult patients (≥15 years old) enrolled on antiretroviral therapy (ART) increased about 16-fold, from <5,000 to 79,500. All ART patients were eligible for co-trimoxazole. ART program outcomes, and determinants of outcomes, have not yet been reported. METHODOLOGY/PRINCIPAL FINDINGS: In a retrospective cohort study, we investigated rates of mortality, attrition (death, loss to follow-up, or treatment cessation), immunologic treatment failure, and regimen-switch, as well as determinants of selected outcomes, among a nationally representative sample of 2,596 adults initiating ART during 2004-2007. At ART initiation, the median patient age was 34 years and 62% of patients were female. Malnutrition and advanced disease were common; 18% of patients weighed <45 kilograms, and 15% were WHO stage IV. Median baseline CD4(+) T-cell count was 153/microL and was lower for males than females (139/microL vs. 159/microL, p<0.01). Stavudine, lamivudine, and nevirapine or efavirenz were prescribed to 88% of patients; only 31% were prescribed co-trimoxazole. Mortality and attrition rates were 3.4 deaths and 19.8 attritions per 100 patient-years overall, and 12.9 deaths and 57.2 attritions per 100 patient-years in the first 90 days. Predictors of attrition included male sex [adjusted hazard ratio (AHR) 1.5; 95% confidence interval (CI), 1.3-1.8], weight <45 kg (AHR 2.1; 95% CI, 1.6-2.9, reference group >60 kg), WHO stage IV (AHR 1.7; 95% CI, 1.3-2.4, reference group WHO stage I/II), lack of co-trimoxazole prescription (AHR 1.4; 95% CI, 1.0-1.8), and later calendar year of ART initiation (AHR 1.5; 95% CI, 1.2-1.8). Rates of immunologic treatment failure and regimen-switch were 14.0 and 0.6 events per 100 patient-years, respectively.
CONCLUSIONS: ART initiation at earlier disease stages and scale-up of co-trimoxazole among ART patients could improve outcomes. Research to determine reasons for low regimen-switch rates and increasing rates of attrition during program expansion is needed. |
Gonorrhoea positivity among women aged 15-24 years in the USA, 2005-2007
Gorgos L , Newman L , Satterwhite C , Berman S , Weinstock H . Sex Transm Infect 2011 87 (3) 202-4 OBJECTIVE: To examine the epidemiology of young women screened for gonorrhoea in the USA. METHODS: Data on tests for gonorrhoea among women aged 15-24 years attending family planning clinics from 2005 to 2007 were obtained through the infertility prevention project. Clinics testing 90% or more of women for gonorrhoea and sending 50 or more gonorrhoea tests per year were included. Gonorrhoea positivity on a state and county level was calculated and compared by age and race/ethnicity. RESULTS: A total of 1,119,394 tests from 948 clinics were eligible for inclusion. Median state-specific gonorrhoea positivity was 1.3% (IQR 0.7-2.0%). Positivity was higher among women aged 15-19 years (1.4%, IQR 0.9-2.6%) than among those aged 20-24 years (1.1%, IQR 0.6-1.4%, p=0.03) and among non-Hispanic black women (3.8%, IQR 3.2-4.6%) than non-Hispanic white women (0.6%, IQR 0.4-0.8%, p<0.0001). Half of all gonorrhoea cases in these women originated from 57 of 753 counties. Among non-Hispanic white women, positivity was 2.0% or greater in 4% of counties, while 83% of counties had gonorrhoea positivity of less than 1.0%. Gonorrhoea positivity among non-Hispanic black women was 2.0% or greater in 58% of counties, and less than 1.0% in only one-third of counties. These disparities were present diffusely across the geographical areas included in this analysis. CONCLUSIONS: Gonorrhoea positivity was consistently high for young non-Hispanic black women attending family planning clinics across multiple geographical regions. A large proportion of gonorrhoea morbidity was concentrated in a relatively small number of counties in the USA among this population of young women. |
Rift Valley fever in Kenya: history of epizootics and identification of vulnerable districts
Murithi RM , Munyua P , Ithondeka PM , Macharia JM , Hightower A , Luman ET , Breiman RF , Njenga MK . Epidemiol Infect 2011 139 (3) 372-80 Since Kenya first reported Rift Valley fever (RVF)-like disease in livestock in 1912, the country has reported the most frequent epizootics of RVF disease. To determine the pattern of disease spread across the country after its introduction in 1912, and to identify regions vulnerable to the periodic epizootics, annual livestock disease records at the Department of Veterinary Services from 1910 to 2007 were analysed in order to document the number and location of RVF-infected livestock herds. A total of 38/69 (55%) administrative districts in the country had reported RVF epizootics by the end of 2007. During the 1912-1950 period, the disease was confined to a district in Rift Valley province that is prone to flooding and where livestock were raised in proximity to wildlife. Between 1951 and 2007, 11 national RVF epizootics were recorded with an average inter-epizootic period of 3.6 years (range 1-7 years); in addition, all epizootics occurred in years when the average annual rainfall increased by more than 50% in the affected districts. Whereas the first two national epizootics in 1951 and 1955 were confined to eight districts in the Rift Valley province, there was a sustained epizootic between 1961 and 1964 that spread the virus to over 30% of the districts across six out of eight provinces. The Western and Nyanza provinces, located in the southwestern region of the country, had not reported RVF infections by 2007. The probability of a district being involved in a national epizootic was fivefold higher (62%) in districts that had previously reported disease compared to districts that had no prior disease activity (11%).
These findings suggest that once introduced into certain permissive ecologies, the RVF virus becomes enzootic, making the region vulnerable to periodic epizootics that are probably precipitated by amplification of resident virus during heavy rainfall and flooding. |
Rickettsia rickettsii in Rhipicephalus ticks, Mexicali, Mexico
Eremeeva ME , Zambrano ML , Anaya L , Beati L , Karpathy SE , Santos-Silva MM , Salceda B , MacBeth D , Olguin H , Dasch GA , Aranda CA . J Med Entomol 2011 48 (2) 418-21 Circulation of a unique genetic type of Rickettsia rickettsii in ticks of the Rhipicephalus sanguineus complex was detected in Mexicali, Baja California, Mexico. The Mexican R. rickettsii differed from all isolates previously characterized from the endemic regions of Rocky Mountain spotted fever in North, Central, and South America. Rhipicephalus ticks in Mexicali are genetically different from Rh. sanguineus found in the United States. |
Effects of temperature on early-phase transmission of Yersinia pestis by the flea, Xenopsylla cheopis
Schotthoefer AM , Bearden SW , Vetter SM , Holmes J , Montenieri JA , Graham CB , Woods ME , Eisen RJ , Gage KL . J Med Entomol 2011 48 (2) 411-7 Sharp declines in human and animal cases of plague, caused by the bacterium Yersinia pestis (Yersin), have been observed when outbreaks coincide with hot weather. Failure of biofilm production, or blockage, to occur in the flea, as temperatures reach 30 degrees C has been suggested as an explanation for these declines. Recent work demonstrating efficient flea transmission during the first few days after fleas have taken an infectious blood meal, in the absence of blockage (e.g., early-phase transmission), however, has called this hypothesis into question. To explore the potential effects of temperature on early-phase transmission, we infected colony-reared Xenopsylla cheopis (Rothschild) fleas with a wild-type strain of plague bacteria using an artificial feeding system, and held groups of fleas at 10, 23, 27, and 30 degrees C. Naive Swiss Webster mice were exposed to fleas from each of these temperatures on days 1-4 postinfection, and monitored for signs of infection for 21 d. Temperature did not significantly influence the rates of transmission observed for fleas held at 23, 27, and 30 degrees C. Estimated per flea transmission efficiencies for these higher temperatures ranged from 2.32 to 4.96% (95% confidence interval [CI]: 0.96-8.74). In contrast, no transmission was observed in mice challenged by fleas held at 10 degrees C (per flea transmission efficiency estimates, 0-1.68%). These results suggest that declines in human and animal cases during hot weather are not related to changes in the abilities of X. cheopis fleas to transmit Y. pestis infections during the early-phase period. By contrast, transmission may be delayed or inhibited at low temperatures, indicating that epizootic spread of Y. pestis by X. cheopis via early-phase transmission is unlikely during colder periods of the year. |
The impact of adulticide applications on mosquito density in Chicago, 2005
Mutebi JP , Delorey MJ , Jones RC , Plate DK , Gerber SI , Gibbs KP , Sun G , Cohen NJ , Paul WS . J Am Mosq Control Assoc 2011 27 (1) 69-76 The city of Chicago used ground ultra-low volume treatments of sumithrin (ANVIL 10+10) in areas with high West Nile virus infection rates among Culex mosquitoes. Two sequential treatments in epidemiologic weeks (wk) 31 and 32 decreased mean mosquito density by 54% from 2.5 to 1.1 mosquitoes per trap-day, whereas mosquito density increased by 153% from 1.3 to 3.3 mosquitoes per trap-day at the nonsprayed sites. The difference between these changes in mosquito density was statistically significant (confidence interval for the difference in change: -4.7 to -1.9). Sequential adulticide treatments in September (wk 34 and 35) had no effect on mosquito density, probably because it was late in the season and the mosquitoes were presumably entering diapause and less active. Overall, there was a significant decrease in mosquito density at the trap sites treated in all 4 wk (wk 31, 32, 34, and 35), suggesting that sustained sequential treatments suppressed mosquito density. Maximum likelihood estimates (MLE) of infection rates varied independently of adulticide treatments, suggesting that the treatments had no direct effect on infection rates. Mosquito trap counts were low, probably because large numbers of alternative oviposition sites, especially catch basins, competed with the gravid traps. |
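Mosquito infection rates like those referenced above are typically estimated by maximum likelihood from pooled testing, since mosquitoes are assayed in batches rather than individually. A minimal sketch, under the simplifying assumption that all pools are the same size (the study's actual pool sizes and counts are not given here; the numbers below are hypothetical):

```python
# Pooled-testing MLE for a per-mosquito infection rate, assuming equal pool
# sizes. With k positive pools out of m pools of n mosquitoes each, the MLE
# is p = 1 - (1 - k/m)**(1/n). Real tools (e.g., CDC's PooledInfRate) also
# handle variable pool sizes; this sketch uses hypothetical numbers.

def mle_infection_rate(positive_pools, total_pools, pool_size):
    prop_negative_pools = 1 - positive_pools / total_pools
    return 1 - prop_negative_pools ** (1 / pool_size)

# e.g. 3 positive pools out of 40 pools, 50 mosquitoes per pool:
rate = mle_infection_rate(3, 40, 50)
print(f"{rate * 1000:.2f} infected per 1,000 mosquitoes")
```

The equal-pool-size formula follows from assuming each mosquito is independently infected with probability p, so a pool of n tests negative with probability (1 - p)^n.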
Metal ions affecting the hematological system
Roney N , Abadin HG , Fowler B , Pohl HR . Met Ions Life Sci 2011 8 143-55 Many metals are essential elements and necessary for proper biological function at low intake levels. However, exposure to high intake levels of these metals may result in adverse effects. In addition, exposures to mixtures of metals may produce interactions that result in synergistic or antagonistic effects. This chapter focuses on metals that affect the hematological system and how exposures to mixtures of metals may contribute to their hematotoxicity. Exposure to arsenic, cadmium, copper, lead, mercury, tin, or zinc has been shown to produce some effect on the hematological system. Binary interactions resulting from exposure to combinations of metals may increase or decrease the hematotoxicity induced by individual metals. For example, copper, iron, and zinc have been shown to have a protective effect on the hematotoxicity of lead. In contrast, co-exposure to manganese may increase the hematotoxicity of lead. |
Metal ions affecting the kidney
Fowler BA . Met Ions Life Sci 2011 8 133-41 This chapter provides a succinct summary of the nephrotoxic effects of a number of metals/metalloids on an individual or mixture basis. There is a discussion of routes of exposure, mechanisms of uptake by renal cells, and the potential impact of nanomaterials on these processes. An emphasis is placed on the toxicity of these metals/metalloids to individual cell types in the kidney and the application of biomarkers for the early detection of kidney cell injury prior to the onset of an overt clinical state such as end-stage renal disease. The issue of interactions between nephrotoxic metals in mixture exposures is discussed in relation to the application of molecular biomarkers for early detection of renal cell injury. |
Metal ions affecting the neurological system
Pohl HR , Roney N , Abadin HG . Met Ions Life Sci 2011 8 247-62 Several individual metals, including aluminum, arsenic, cadmium, lead, manganese, and mercury, have been demonstrated to affect the neurological system. Metals are ubiquitous in the environment, and environmental and occupational exposure to one metal is likely to be accompanied by exposure to other metals as well. It is, therefore, expected that interactions or "joint toxic actions" may occur in populations exposed to mixtures of metals or to mixtures of metals with other chemicals. Some metals seem to have a protective role against the neurotoxicity of other metals, yet other interactions may result in increased neurotoxicity. For example, zinc and copper provided a protective role in cases of lead-induced neurotoxicity. In contrast, arsenic and lead co-exposure resulted in synergistic effects. Similarly, information is available in the current literature on interactions of metals with some organic chemicals such as ethanol, polychlorinated biphenyls, and pesticides. An in-depth understanding of the toxicity and the mechanism of action (including toxicokinetics and toxicodynamics) of individual chemicals is important for predicting the outcomes of interactions in mixtures. Therefore, plausible mechanisms of action are also described. |
Mixtures and their risk assessment in toxicology
Mumtaz MM , Hansen H , Pohl HR . Met Ions Life Sci 2011 8 61-80 For communities generally and for persons living in the vicinity of waste sites specifically, potential exposures to chemical mixtures are genuine concerns. Such concerns often arise from perceptions of a site's higher than anticipated toxicity due to synergistic interactions among chemicals. This chapter outlines some historical approaches to mixtures risk assessment. It also outlines ATSDR's current approach to toxicity risk assessment. The ATSDR's joint toxicity assessment guidance for chemical mixtures addresses interactions among components of chemical mixtures. The guidance recommends a series of steps that include simple calculations for a systematic analysis of data leading to conclusions regarding any hazards chemical mixtures might pose. These conclusions can, in turn, lead to recommendations such as targeted research to fill data gaps, development of new methods using current science, and health education to raise awareness of residents and health care providers. The chapter also provides examples of future trends in chemical mixtures assessment. |
Environmental Health Specialists Network (EHS-Net) 2010-2015: the new funding cycle
Leonard M . J Environ Health 2011 73 (8) 22-3 The Centers for Disease Control and Prevention (CDC) Environmental Health Services Branch is funding a new national extramural program, “Revitalizing Core Environmental Health Programs Through the Environmental Health Specialists Network (EHS-Net).” This program merges the Environmental Health Capacity Building Program and EHS-Net into one comprehensive program with practice (nonresearch) and research components relevant to food and water safety. The funding cycle for this program is July 1, 2010–June 30, 2015. With this new cycle, the capacity-building program and the EHS-Net program are under the same EHS-Net umbrella. This program will provide opportunities for the EHS-Net grantees to conduct research on environmental causes of foodborne and waterborne illness, apply that research to environmental health practice where possible, and develop new ideas for further research. |
Evaluation of on-site wastewater system Escherichia coli contributions to shallow groundwater in coastal North Carolina
Humphrey Jr CP , O'Driscoll MA , Zarate MA . Water Sci Technol 2011 63 (4) 789-95 The study goal was to determine if on-site wastewater systems (OSWWS) installed in coastal areas were effective at reducing indicator bacteria densities before discharge to groundwater. Groundwater Escherichia coli (E. coli) densities and groundwater levels adjacent to 16 OSWWS in three different soil groups (sand, sandy loam, and sandy clay loam) were monitored and compared to background groundwater conditions on four occasions between March 2007 and February 2008 in coastal North Carolina. Groundwater beneath OSWWS had significantly (p≤0.05) lower densities of E. coli than septic tank effluent, but significantly higher densities of E. coli than background conditions for each soil type. Twenty-three percent of all groundwater samples near OSWWS had E. coli densities that exceeded the EPA freshwater contact standard for surface waters (single-sample maximum of 235 cfu/100 mL). Groundwater E. coli densities near OSWWS were highest during shallow water table periods. The results indicate that increasing the required vertical separation distance from drainfield trenches to the seasonal high water table could improve shallow groundwater quality. |
Risk factors for community-associated Staphylococcus aureus infections: results from parallel studies including methicillin-resistant and methicillin-sensitive S. aureus compared to uninfected controls
Como-Sabetti KJ , Harriman KH , Fridkin SK , Jawahir SL , Lynfield R . Epidemiol Infect 2011 139 (3) 419-29 Despite the increasing burden of community-associated methicillin-resistant Staphylococcus aureus (CA-MRSA) infections, the risk factors are not well understood. We conducted a hypothesis-generating study using three parallel case-control studies to identify risk factors for CA-MRSA and community-associated methicillin-susceptible S. aureus (CA-MSSA) infections. In the multivariate model, antimicrobial use in the 1-6 months prior to culture was associated with CA-MRSA infection compared to CA-MSSA cases [adjusted odds ratio (aOR) 1.7, P=0.07]. Antimicrobial use 1-6 months prior to culture (aOR 1.8, P=0.04), history of boils (aOR 1.6, P=0.03), and having a household member who was a smoker (aOR 1.3, P=0.05) were associated with CA-MRSA compared to uninfected community controls. The finding of an increased risk of CA-MRSA infection associated with prior antimicrobial use highlights the importance of careful antimicrobial stewardship. |
Epidemiology and prevention of respiratory syncytial virus infections among infants and young children
Langley GF , Anderson LJ . Pediatr Infect Dis J 2011 30 (6) 510-7 Since its discovery in 1956, respiratory syncytial virus (RSV) has been recognized as one of the most common causes of serious lower respiratory tract infections in young children worldwide. While considered a high priority, development of a safe and effective vaccine has remained elusive. Prevention of RSV disease relies on infection control and hygiene measures, as well as providing immunoprophylaxis in select infants. The prophylaxis, however, is costly, and so targeting the recipient population and timing of administration is important for optimal effectiveness and judicious use of limited health care resources. This article reviews the epidemiology of RSV infections in infants and young children, including risk factors for severe disease, so as to inform decisions about prevention efforts. |
Human rhinovirus infections in rural Thailand: epidemiological evidence for rhinovirus as both pathogen and bystander
Fry AM , Lu X , Olsen SJ , Chittaganpitch M , Sawatwong P , Chantra S , Baggett HC , Erdman D . PLoS One 2011 6 (3) e17780 BACKGROUND: We describe human rhinovirus (HRV) detections in SaKaeo province, Thailand. METHODS: From September 1, 2003-August 31, 2005, we tested hospitalized patients with acute lower respiratory illness and outpatient controls without fever or respiratory symptoms for HRVs with polymerase chain reaction and molecularly typed select HRVs. We compared HRV detection among hospitalized patients and controls and estimated enrollment-adjusted incidence. RESULTS: HRVs were detected in 315 (16%) of 1919 hospitalized patients and 27 (9.6%) of 280 controls. Children had the highest frequency of HRV detections (hospitalized: <1 year: 29%, 1-4 years: 29%, ≥65 years: 9%; controls: <1 year: 24%, 1-4 years: 14%, ≥65 years: 2.8%). Enrollment-adjusted hospitalized HRV detection rates were highest among persons aged <1 year (1038/100,000 persons/year), 1-4 years (457), and ≥65 years (71). All three HRV species were identified; HRV-A was the most common species in most age groups, including children aged <1 year (61%) and all adult age groups. HRV-C was the most common species in the 1-4 year (51%) and 5-19 year age groups (54%). Compared to controls, hospitalized adults (≥19 years) and children were more likely to have HRV detections (odds ratio [OR]: 4.8, 95% confidence interval [CI]: 1.5, 15.8; OR: 2.0, CI: 1.2, 3.3, respectively), and hospitalized children were more likely to have HRV-A (OR 1.7, CI: 0.8, 3.5) or HRV-C (OR 2.7, CI: 1.2, 5.9) detection. CONCLUSIONS: HRV rates were high among hospitalized children and the elderly, but asymptomatic children also had substantial HRV detection. HRV (all species), and HRV-A and HRV-C detections were epidemiologically associated with hospitalized illness. Treatment or prevention modalities effective against HRV could reduce hospitalizations due to HRV in Thailand. |
Ignoring the group in group-level HIV/AIDS intervention trials: a review of reported design and analytic methods
Pals SL , Wiegand RE , Murray DM . AIDS 2011 25 (7) 989-96 OBJECTIVES: Studies evaluating the efficacy of HIV/AIDS interventions often involve the random assignment of groups of participants or the treatment of participants in groups. These studies require analytic methods that take within-group correlation into account. We reviewed published studies to determine the extent to which within-group correlation was dealt with properly. DESIGN: We reviewed group-randomized trials (GRTs) and individually randomized group treatment (IRGT) trials published in HIV/AIDS and general public health journals 2005-2009. METHODS: At least two of the authors reviewed each article, recording descriptive characteristics, sample size estimation methods, analytic methods, and judgments about whether the methods took intraclass correlation into account properly. RESULTS: Of those articles including sufficient information to judge whether analytic methods were correct, only 24% used only appropriate methods for dealing with the intraclass correlation. The percentages differed substantially for GRTs (41.7%) and IRGT trials (8.0%). Most of the articles (69.2%) also made no mention of a priori sample size estimation. CONCLUSION: A majority of the articles in our review reported analyses ignoring the intraclass correlation. This practice may result in underestimated variance, inappropriately small P values, and incorrect conclusions about the effectiveness of interventions. Previous trials that were analyzed incorrectly need to be re-analyzed, and future trials should be designed and analyzed with appropriate methods. Also, journal reviewers and editors need to be aware of the special requirements for design and analysis of GRTs and IRGT trials and judge the quality of articles reporting on such trials according to appropriate standards. |
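The variance underestimation that the review above warns about can be quantified with the standard design-effect formula: for groups of equal size m and intraclass correlation ICC, the variance of a group-based mean is inflated by DEFF = 1 + (m − 1) × ICC relative to what an individual-level analysis assumes. A minimal sketch, with hypothetical group size and ICC:

```python
# Why ignoring the intraclass correlation (ICC) yields inappropriately small
# P values: with equal group sizes m, the design effect DEFF = 1 + (m - 1)*ICC
# is the factor by which the true variance of a group-based estimate exceeds
# the variance assumed by an individual-level analysis. The group size and
# ICC below are hypothetical, not values reported in the review.

def design_effect(group_size, icc):
    return 1 + (group_size - 1) * icc

def effective_sample_size(n_total, group_size, icc):
    """Number of independent observations the clustered sample is worth."""
    return n_total / design_effect(group_size, icc)

# Even a small ICC matters when groups are large:
print(design_effect(50, 0.02))                 # variance nearly doubles
print(effective_sample_size(1000, 50, 0.02))   # ~505 effective participants
```

This is why analyses that treat 1,000 clustered participants as 1,000 independent observations understate the variance and overstate significance.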
Improving public health surveillance using a dual-frame survey of landline and cell phone numbers
Hu SS , Balluz L , Battaglia MP , Frankel MR . Am J Epidemiol 2011 173 (6) 703-11 To meet challenges arising from increasing rates of noncoverage in US landline-based telephone samples due to cell-phone-only households, the Behavioral Risk Factor Surveillance System (BRFSS) expanded a traditional landline-based random digit dialing survey to a dual-frame survey of landline and cell phone numbers. In 2008, a survey of adults with cell phones only was conducted in parallel with an ongoing landline-based health survey in 18 states. The authors used an optimal approach to allocate samples into landline and cell-phone-only strata and used a new approach to weighting state-level landline and cell phone samples. They developed logistic models for each of 16 health indicators to examine whether exclusion of adults with cell phones only affected estimates after adjustment for demographic characteristics. The extent of the potential bias in landline telephone surveys that exclude cell phones was estimated. Biases resulting from exclusion of adults with cell phones only from the landline-based survey were found for 9 out of the 16 health indicators. Because landline noncoverage rates for adults with cell phones only continue to increase, these biases are likely to increase. Use of a dual-frame survey of landline and cell phone numbers assisted the BRFSS efforts in obtaining valid, reliable, and representative data. |
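One simple way to combine non-overlapping dual-frame strata (landline households plus a screened cell-phone-only stratum) is to give each respondent a base weight of stratum population over stratum sample size, then pool the weighted totals. This is a generic illustration, not BRFSS's actual weighting methodology, and every figure below is hypothetical:

```python
# Hedged sketch of dual-frame estimation when the cell sample is screened to
# cell-only adults, so the landline and cell-only strata do not overlap.
# Base weight = stratum population / stratum sample size. This is a generic
# illustration, NOT the BRFSS weighting method; all numbers are hypothetical.

def base_weight(population, sample_size):
    return population / sample_size

landline_pop, cell_only_pop = 3_000_000, 600_000  # hypothetical adult counts
landline_n, cell_only_n = 6_000, 800              # hypothetical completes

w_landline = base_weight(landline_pop, landline_n)     # each completes "speaks for" 500 adults
w_cell_only = base_weight(cell_only_pop, cell_only_n)  # each speaks for 750 adults

# Combined weighted prevalence of a hypothetical health indicator:
p_landline, p_cell_only = 0.20, 0.28
combined = (w_landline * landline_n * p_landline
            + w_cell_only * cell_only_n * p_cell_only) / (landline_pop + cell_only_pop)
print(round(combined, 4))
```

The sketch also shows the bias mechanism the article measures: dropping the cell-only stratum here would pull the estimate toward the landline-only prevalence.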
Developing a claim-based version of the ACE-27 comorbidity index: a comparison with medical record review
Fleming ST , Sabatino SA , Kimmick G , Cress R , Wu XC , Trentham-Dietz A , Huang B , Hwang W , Liff J . Med Care 2011 49 (8) 752-60 OBJECTIVES: The adult comorbidity evaluation (ACE-27) is a medical record-based comorbidity index that predicts survival among various types of cancer patients. The purpose of this study was to compare the medical record-based ACE-27 instrument to a newly developed administrative claim-based ACE-27 measure. STUDY DESIGN AND SETTING: Cross-sectional study of 4,300 breast and prostate cancer patients from the Centers for Disease Control and Prevention Patterns of Care Study. RESULTS: Comorbidities with the highest concordance were diabetes (sensitivity=84.6%, k=0.58 for breast cancer patients; sensitivity=76.4%, k=0.54 for prostate cancer patients) and hypertension (sensitivity=78.5%, k=0.32 for breast cancer patients; sensitivity=69.6%, k=0.28 for prostate cancer patients). Diseases with fair or moderate agreement in one or both cancer sites included congestive heart failure, arrhythmia, hypertension, respiratory diseases, hepatic disease, renal disease, dementia, and neuromuscular disease. For overall indices, agreement was fair but with high sensitivities in the collapsed indices, and the highest sensitivities in the lowest level of decompensation. CONCLUSIONS: The ACE-27 comorbidity score derived from administrative claims data provides a tool to examine the relationship between comorbidity, cancer diagnosis, and outcomes in future epidemiologic research, particularly when medical record review is logistically impossible. The classification of most comorbidities into 2 or 3 levels of severity within a claim-based measure is a major development. Future research should be directed toward refining the measure with a longer review period or different paradigms for diagnosis identification, and testing the predictive ability of the measure in terms of survival, complications, or other outcomes of care. |
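The agreement statistics reported in this abstract (sensitivity and the kappa coefficient, written k) can be computed from a 2×2 concordance table of claims-based versus record-based findings. The sketch below uses hypothetical counts and the standard definitions, not the study's data.

```python
def sensitivity(tp, fn):
    """Share of record-confirmed comorbidities also flagged by claims."""
    return tp / (tp + fn)

def cohens_kappa(tp, fp, fn, tn):
    """Cohen's kappa: chance-corrected agreement between two binary
    raters (here, claims data vs. medical record review)."""
    n = tp + fp + fn + tn
    po = (tp + tn) / n  # observed agreement
    # expected agreement by chance, from the marginal totals
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    return (po - pe) / (1 - pe)
```

Kappa can be low even when sensitivity is high, for example when a condition is common and chance agreement is large, which is consistent with the pattern the abstract reports for hypertension.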
Core gene set as the basis of multilocus sequence analysis of the subclass Actinobacteridae.
Adekambi T , Butler RW , Hanrahan F , Delcher AL , Drancourt M , Shinnick TM . PLoS One 2011 6 (3) e14792 Comparative genomic sequencing is shedding new light on bacterial identification, taxonomy and phylogeny. An in silico assessment of a core gene set necessary for cellular functioning was made to determine a consensus set of genes that would be useful for the identification, taxonomy and phylogeny of species belonging to the subclass Actinobacteridae, which contains the two orders Actinomycetales and Bifidobacteriales. The subclass Actinobacteridae comprises about 85% of the actinobacterial families. The following criteria were used to establish a comprehensive gene set: the gene should (i) be long enough to contain phylogenetically useful information, (ii) not be subject to horizontal gene transfer, (iii) be present as a single copy, (iv) have at least two regions sufficiently conserved to allow the design of amplification and sequencing primers, and (v) predict whole-genome relationships. We applied these constraints to 50 different Actinobacteridae genomes and made 1,224 pairwise comparisons of the genome conserved regions and gene fragments obtained using the Sequence VARiability Analysis Program (SVARAP), which allows primer design. Following a comparative statistical modeling phase, three gene fragments, ychF, rpoB, and secY, were selected with R(2)>0.85. Sets of broad-range primers from the three gene fragments were tested and demonstrated to be useful for amplification and sequencing of 25 species belonging to 9 genera of Actinobacteridae. The intraspecies similarities were 96.3-100% for ychF, 97.8-100% for rpoB and 96.9-100% for secY among 73 strains belonging to 15 species of the subclass Actinobacteridae, compared with 99.4-100% for 16S rRNA. The phylogenetic topology obtained from the combined dataset ychF+rpoB+secY was globally similar to that inferred from 16S rRNA but with higher confidence. It was concluded that multilocus sequence analysis using a core gene set might represent the first consensus and valid approach for bacterial identification, phylogeny and taxonomy. |
Prevalence and correlates of sexual risk behaviors among Jamaican adolescents
Ishida K , Stupp P , McDonald O . Int Perspect Sex Reprod Health 2011 37 (1) 6-15 CONTEXT: Despite high levels of sexual activity and risk behaviors among Jamaican youth, few population-based studies have examined their prevalence or correlates. METHODS: The prevalence of three sexual risk behaviors was assessed using data from the 2008-2009 Jamaican Reproductive Health Survey on a subsample of adolescents aged 15-19 who neither were in a union nor had a child. Factors associated with the risk behaviors were examined separately for females and males, using bivariate analysis and multivariate logistic regression. RESULTS: In the year prior to the survey, 32% of females and 54% of males had had sexual intercourse; of those, 12% and 52%, respectively, had had more than one sexual partner, and 49% and 46% had used condoms inconsistently or not at all. School enrollment was protective against females being sexually active and males having multiple partners. Females who were enrolled in an age-appropriate or higher grade had decreased odds of using condoms inconsistently or not at all, and males who were enrolled in a lower than age-appropriate grade had a decreased risk of being sexually active. Males in the lowest wealth tercile were less likely than those in the highest tercile to have been sexually active or to have had multiple partners. Weekly attendance at religious services was protective against all three risk behaviors for both genders, with the exception of inconsistent or no condom use among males. CONCLUSIONS: Future reproductive health programs should continue to target adolescents in venues other than schools and churches, and should also address the varying needs of females and males. |
Alcohol use as a marker for risky sexual behaviors and biologically confirmed sexually transmitted infections among young adult African-American women
Seth P , Wingood GM , DiClemente RJ , Robinson LS . Womens Health Issues 2011 21 (2) 130-5 INTRODUCTION: Previous research has primarily focused on the relationship between illicit drug use and HIV/sexually transmitted infection (STI) risk behavior among African-American women. Very few studies have examined the role of alcohol use alone in risky sexual behavior. The present study examined the relationship between alcohol use at non-abuse levels and risky sexual behaviors and STIs among young adult African-American women. METHODS: Eight hundred forty-eight African-American women, ages 18 to 29, participated at baseline, with 669 and 673 women at the 6- and 12-month follow-ups, respectively. Participants completed an Audio Computer Assisted Survey Interview assessing sociodemographics, alcohol use, and risky sexual behaviors. Subsequently, participants provided two vaginal swab specimens for STIs. RESULTS: Multivariate logistic regression analyses were conducted for cross-sectional analyses, with illicit drug use as a covariate. Women who consumed alcohol were more likely to have multiple partners and risky partners. Binary generalized estimating equation models assessed the impact of alcohol use at baseline on risky sexual behavior and STIs over a 12-month period. Illicit drug use, intervention group, and baseline outcome measures were entered as covariates. Alcohol consumption predicted positive results for chlamydia, positive results for any STI, and never using a condom with a casual partner over a 12-month follow-up period. DISCUSSION: Frequency of alcohol use at non-abuse levels was correlated with and predicted risky sexual behaviors and STIs. Prevention programs for African-American women should incorporate education regarding the link between alcohol and HIV/STI risk behaviors and the potential negative health consequences. |
Veterans Affairs initiative to prevent methicillin-resistant Staphylococcus aureus infections
Jain R , Kralovic SM , Evans ME , Ambrose M , Simbartl LA , Obrosky DS , Render ML , Freyberg RW , Jernigan JA , Muder RR , Miller LJ , Roselle GA . N Engl J Med 2011 364 (15) 1419-1430 BACKGROUND: Health care-associated infections with methicillin-resistant Staphylococcus aureus (MRSA) have been an increasing concern in Veterans Affairs (VA) hospitals. METHODS: A "MRSA bundle" was implemented in 2007 in acute care VA hospitals nationwide in an effort to decrease health care-associated infections with MRSA. The bundle consisted of universal nasal surveillance for MRSA, contact precautions for patients colonized or infected with MRSA, hand hygiene, and a change in the institutional culture whereby infection control would become the responsibility of everyone who had contact with patients. Each month, personnel at each facility entered into a central database aggregate data on adherence to surveillance practice, the prevalence of MRSA colonization or infection, and health care-associated transmissions of and infections with MRSA. We assessed the effect of the MRSA bundle on health care-associated MRSA infections. RESULTS: From October 2007, when the bundle was fully implemented, through June 2010, there were 1,934,598 admissions to or transfers or discharges from intensive care units (ICUs) and non-ICUs (ICUs, 365,139; non-ICUs, 1,569,459) and 8,318,675 patient-days (ICUs, 1,312,840; and non-ICUs, 7,005,835). During this period, the percentage of patients who were screened at admission increased from 82% to 96%, and the percentage who were screened at transfer or discharge increased from 72% to 93%. The mean (+/-SD) prevalence of MRSA colonization or infection at the time of hospital admission was 13.6+/-3.7%. 
The rates of health care-associated MRSA infections in ICUs had not changed in the 2 years before October 2007 (P=0.50 for trend) but declined with implementation of the bundle, from 1.64 infections per 1000 patient-days in October 2007 to 0.62 per 1000 patient-days in June 2010, a decrease of 62% (P<0.001 for trend). During this same period, the rates of health care-associated MRSA infections in non-ICUs fell from 0.47 per 1000 patient-days to 0.26 per 1000 patient-days, a decrease of 45% (P<0.001 for trend). CONCLUSIONS: A program of universal surveillance, contact precautions, hand hygiene, and institutional culture change was associated with a decrease in health care-associated transmissions of and infections with MRSA in a large health care system. |
Intervention to reduce transmission of resistant bacteria in intensive care
Huskins WC , Huckabee CM , O'Grady NP , Murray P , Kopetskie H , Zimmer L , Walker ME , Sinkowitz-Cochran RL , Jernigan JA , Samore M , Wallace D , Goldmann DA . N Engl J Med 2011 364 (15) 1407-1418 BACKGROUND: Intensive care units (ICUs) are high-risk settings for the transmission of methicillin-resistant Staphylococcus aureus (MRSA) and vancomycin-resistant enterococcus (VRE). METHODS: In a cluster-randomized trial, we evaluated the effect of surveillance for MRSA and VRE colonization and of the expanded use of barrier precautions (intervention) as compared with existing practice (control) on the incidence of MRSA or VRE colonization or infection in adult ICUs. Surveillance cultures were obtained from patients in all participating ICUs; the results were reported only to ICUs assigned to the intervention. In intervention ICUs, patients who were colonized or infected with MRSA or VRE were assigned to care with contact precautions; all the other patients were assigned to care with universal gloving until their discharge or until surveillance cultures obtained at admission were reported to be negative. RESULTS: During a 6-month intervention period, there were 5434 admissions to 10 intervention ICUs, and 3705 admissions to 8 control ICUs. Patients who were colonized or infected with MRSA or VRE were assigned to barrier precautions more frequently in intervention ICUs than in control ICUs (a median of 92% of ICU days with either contact precautions or universal gloving [51% with contact precautions and 43% with universal gloving] in intervention ICUs vs. a median of 38% of ICU days with contact precautions in control ICUs, P<0.001). 
In intervention ICUs, health care providers used clean gloves, gowns, and hand hygiene less frequently than required for contacts with patients assigned to barrier precautions; when contact precautions were specified, gloves were used for a median of 82% of contacts, gowns for 77% of contacts, and hand hygiene after 69% of contacts, and when universal gloving was specified, gloves were used for a median of 72% of contacts and hand hygiene after 62% of contacts. The mean (+/-SE) ICU-level incidence of events of colonization or infection with MRSA or VRE per 1000 patient-days at risk, adjusted for baseline incidence, did not differ significantly between the intervention and control ICUs (40.4+/-3.3 and 35.6+/-3.7 in the two groups, respectively; P=0.35). CONCLUSIONS: The intervention was not effective in reducing the transmission of MRSA or VRE, although the use of barrier precautions by providers was less than what was required. (Funded by the National Institute of Allergy and Infectious Diseases and others; STAR*ICU ClinicalTrials.gov number, NCT00100386 .). |
The association of activity level, parent mental distress, and parental involvement and monitoring with unintentional injury risk in fifth graders
Schwebel DC , Roth DL , Elliott MN , Windle M , Grunbaum JA , Low B , Cooper SP , Schuster MA . Accid Anal Prev 2011 43 (3) 848-52 OBJECTIVE: Extend findings with young children by examining the strength of association of activity level, parent mental distress, and parental involvement and monitoring with fifth graders' unintentional injuries. METHODS: Ordinal logistic regression models were used to predict unintentional injury frequency among 4745 fifth-graders. Examined predictors included demographics, parent reports of mental distress, temperamental activity level (tendency to be fidgety, restless, and constantly in motion), and parental involvement and monitoring in adolescents' lives. RESULTS: Higher levels of both activity level and parent mental distress predicted more frequent injuries. CONCLUSIONS: As has been found with younger children, unintentional injuries in fifth graders are associated with both parent and child characteristics. The result is discussed in the context of adolescent development. Implications include those for injury prevention (multi-dimensional prevention strategies that incorporate environmental modifications as well as training of youth and parents) and future research (study of potential mechanisms behind injury risk behavior via longitudinal and experimental research; study of injury risk during this phase of child development). |
Broadening the approach to youth violence prevention through public health
Hammond WR , Arias I . J Prev Interv Community 2011 39 (2) 167-75 Violence is a critical cause of death and nonfatal injuries among youth, and even those who witness violence can suffer serious health and mental health consequences. This highlights the need for prevention programs and policies aimed at reducing risks, promoting prosocial behavior, strengthening families, and creating communities in which youth are safe from violence. The Centers for Disease Control and Prevention's Injury Center is developing a National Public Health Strategy to Prevent Youth Violence. The strategy will establish a full application of the public health approach, ranging from research to practice. It also spotlights what is working, as a way to mobilize community leaders in supporting evidence-based initiatives. With the empirical guidance of articles such as those in this special issue, a shared strategy to prevent youth violence will help focus efforts and resources on solutions that show the most promise, and ensure that American communities undertake more comprehensive and coordinated prevention efforts to protect our nation's youth. |
Serotype distribution and invasive potential of group B streptococcus isolates causing disease in infants and colonizing maternal-newborn dyads
Madzivhandila M , Adrian PV , Cutland CL , Kuwanda L , Schrag SJ , Madhi SA . PLoS One 2011 6 (3) e17861 BACKGROUND: Serotype-specific polysaccharide based group B streptococcus (GBS) vaccines are being developed. An understanding of the serotype epidemiology associated with maternal colonization and invasive disease in infants is necessary to determine the potential coverage of serotype-specific GBS vaccines. METHODS: Colonizing GBS isolates were identified by vaginal swabbing of mothers during active labor and from the skin of their newborns post-delivery. Invasive GBS isolates from infants were identified through laboratory-based surveillance. GBS serotyping was done by latex agglutination. Serologically non-typeable isolates were typed by a serotype-specific PCR method. The invasive potential of GBS serotypes associated with sepsis within seven days of birth was evaluated in relation to maternal colonizing serotypes. RESULTS: GBS was identified in 289 (52.4%) newborns born to 551 women with GBS vaginal colonization and in 113 (5.6%) newborns born to 2,010 mothers in whom GBS was not cultured from vaginal swabs. The serotype distribution among vaginal-colonizing isolates was as follows: III (37.3%), Ia (30.1%), II (11.3%), V (10.2%), Ib (6.7%) and IV (3.7%). There were no significant differences in serotype distribution between vaginal and newborn colonizing isolates (P = 0.77). The serotype distribution of invasive GBS isolates was significantly different from that of colonizing isolates (P<0.0001). Serotype III was the most common invasive serotype in newborns less than 7 days of age (57.7%) and in infants 7 to 90 days of age (84.3%; P<0.001). Relative to serotype III, other serotypes showed reduced invasive potential: Ia (0.49; 95%CI 0.31-0.77), II (0.30; 95%CI 0.13-0.67) and V (0.38; 95%CI 0.17-0.83). 
CONCLUSION: In South Africa, an anti-GBS vaccine including serotypes Ia, Ib and III has the potential of preventing 74.1%, 85.4% and 98.2% of GBS associated with maternal vaginal-colonization, invasive disease in neonates less than 7 days and invasive disease in infants between 7-90 days of age, respectively. |
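The reduced invasive potential reported above for serotypes Ia, II and V is a ratio with a 95% confidence interval relative to serotype III. A common way to form such a ratio from invasive versus colonizing counts is an odds ratio with a Wald interval; the sketch below uses hypothetical counts and is not necessarily the authors' exact computation.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio comparing a serotype's invasive-to-colonizing odds (a/b)
    with the reference serotype's odds (c/d), plus a Wald 95% CI.
    All four counts must be nonzero for the log-based interval."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper
```

A ratio below 1 with an upper confidence limit below 1 (as reported for Ia, II and V) indicates that the serotype appears among invasive isolates less often than its colonization prevalence would predict.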
Laboratory diagnosis of tuberculosis in resource-poor countries: challenges and opportunities
Parsons LM , Somoskovi A , Gutierrez C , Lee E , Paramasivan CN , Abimiku A , Spector S , Roscigno G , Nkengasong J . Clin Microbiol Rev 2011 24 (2) 314-50 SUMMARY: With an estimated 9.4 million new cases globally, tuberculosis (TB) continues to be a major public health concern. Eighty percent of all cases worldwide occur in 22 high-burden, mainly resource-poor settings. This devastating impact of tuberculosis on vulnerable populations is also driven by its deadly synergy with HIV. Therefore, building capacity and enhancing universal access to rapid and accurate laboratory diagnostics are necessary to control TB and HIV-TB coinfections in resource-limited countries. The present review describes several new and established methods as well as the issues and challenges associated with implementing quality tuberculosis laboratory services in such countries. Recently, the WHO has endorsed some of these novel methods, and they have been made available at discounted prices for procurement by the public health sector of high-burden countries. In addition, international and national laboratory partners and donors are currently evaluating other new diagnostics that will allow further and more rapid testing in point-of-care settings. While some techniques are simple, others have complex requirements, and therefore, it is important to carefully determine how to link these new tests and incorporate them within a country's national diagnostic algorithm. Finally, the successful implementation of these methods is dependent on key partnerships in the international laboratory community and ensuring that adequate quality assurance programs are inherent in each country's laboratory network. |
Marginal iodide deficiency and thyroid function: dose-response analysis for quantitative pharmacokinetic modeling
Gilbert ME , McLanahan ED , Hedge J , Crofton KM , Fisher JW , Valentin-Blasini L , Blount BC . Toxicology 2011 283 (1) 41-8 Severe iodine deficiency (ID) results in adverse health outcomes and remains a benchmark for understanding the effects of developmental hypothyroidism. The implications of marginal ID, however, remain less well known. The current study examined the relationship between graded levels of ID in rats and serum thyroid hormones, thyroid iodine content, and urinary iodide excretion. The goals of this study were to provide parametric and dose-response information for development of a quantitative model of the thyroid axis. Female Long Evans rats were fed casein-based diets containing varying iodine (I) concentrations for 8 weeks. Diets were created by adding 975, 200, 125, 25, or 0 mcg/kg I to the base diet (approximately 25 mcg I/kg chow) to produce 5 nominal I levels, ranging from excess (basal + added I, Treatment 1: 1000 mcg I/kg chow) to deficient (Treatment 5: 25 mcg I/kg chow). Food intake and body weight were monitored throughout, and on 2 consecutive days each week over the 8-week exposure period animals were placed in metabolism cages to capture urine. Food and water intake and body weight gain did not differ among treatment groups. Serum T4 was dose-dependently reduced relative to Treatment 1, with significant declines (19% and 48%) in the two lowest I groups; no significant changes in serum T3 or TSH were detected. Increases in thyroid weight and decreases in thyroidal and urinary iodide content were observed as a function of decreasing I in the diet. Data were compared with predictions from a recently published biologically based dose-response (BBDR) model for ID. Relative to model predictions, female Long Evans rats under the conditions of this study appeared more resilient to low I intake. 
These results challenge existing models and provide essential information for development of quantitative BBDR models for ID during pregnancy and lactation. |
Mycobacterium tuberculosis components stimulate production of the antimicrobial peptide hepcidin
Sow FB , Nandakumar S , Velu V , Kellar KL , Schlesinger LS , Amara RR , Lafuse WP , Shinnick TM , Sable SB . Tuberculosis (Edinb) 2011 91 (4) 314-21 We investigated the in vitro production of the antimicrobial peptide hepcidin by cells of the innate immune system that harbor Mycobacterium tuberculosis. Stimulation of mouse lung macrophages with M. tuberculosis or IFN-gamma + M. tuberculosis induced hepcidin mRNA. In human alveolar A549 epithelial cells, lipoglycans of M. tuberculosis, in particular mannose-capped lipoarabinomannan and phosphatidyl-myo-inositol mannosides, were strong inducers of hepcidin mRNA. In mouse dendritic cells, hepcidin mRNA was increased by subcellular fractions and culture filtrate proteins of M. tuberculosis and by TLR2 and TLR4 agonists, but not by TLR9 agonists, IL-1-alpha, IL-6 or TNF-alpha. Flow cytometry evaluation of human peripheral blood mononuclear cells demonstrated that CD11c(+) myeloid dendritic cells stimulated with killed M. tuberculosis or live M. bovis BCG produced hepcidin. The production of the antimicrobial peptide hepcidin by cells that interact with M. tuberculosis suggests a host defense mechanism against mycobacteria. |
Nox5 forms a functional oligomer mediated by self-association of its dehydrogenase domain
Kawahara T , Jackson HM , Smith SM , Simpson PD , Lambeth JD . Biochemistry 2011 50 (12) 2013-25 Nox5 belongs to the calcium-regulated subfamily of NADPH oxidases (Nox). Like other calcium-regulated Noxes, Nox5 has an EF-hand-containing calcium-binding domain at its N-terminus, a transmembrane heme-containing region, and a C-terminal dehydrogenase (DH) domain that binds FAD and NADPH. While Nox1-4 require regulatory subunits, including p22phox, Nox5 activity does not depend on any subunits. We found that inactive point mutants and truncated forms of Nox5 (including the naturally expressed splice form, Nox5S) inhibit full-length Nox5, consistent with formation of a dominant negative complex. Oligomerization of full-length Nox5 was demonstrated using co-immunoprecipitation of coexpressed, differentially tagged forms of Nox5 and occurred in a manner independent of calcium ion. Several approaches were used to show that the DH domain mediates oligomerization: Nox5 could be isolated as a multimer when the calcium-binding domain and/or the N-terminal polybasic region (PBR-N) was deleted, but deletion of the DH domain eliminated oligomerization. Further, a chimera containing the transmembrane domain of Ciona intestinalis voltage sensor-containing phosphatase (CiVSP) fused to the Nox5 DH domain formed a co-immunoprecipitating complex with, and functioned as a dominant inhibitor of, full-length Nox5. Radiation inactivation of Nox5 overexpressed in HEK293 cells and endogenously expressed in human aortic smooth muscle cells indicated molecular masses of approximately 350 and approximately 300 kDa, respectively, consistent with a tetramer being the functionally active unit. Thus, Nox5 forms a catalytically active oligomer in the membrane that is mediated by its dehydrogenase domain. As a result of oligomerization, the short, calcium-independent splice form, Nox5S, may function as an endogenous inhibitor of calcium-stimulated ROS generation by full-length Nox5. |
Effect of receptor binding domain mutations on receptor binding and transmissibility of avian influenza H5N1 viruses
Maines TR , Chen LM , Van Hoeven N , Tumpey TM , Blixt O , Belser JA , Gustin KM , Pearce MB , Pappas C , Stevens J , Cox NJ , Paulson JC , Raman R , Sasisekharan R , Katz JM , Donis RO . Virology 2011 413 (1) 139-47 Although H5N1 influenza viruses have been responsible for hundreds of human infections, these avian influenza viruses have not fully adapted to the human host. The lack of sustained transmission in humans may be due, in part, to their avian-like receptor preference. Here, we have introduced receptor binding domain mutations within the hemagglutinin (HA) gene of two H5N1 viruses and evaluated changes in receptor binding specificity by glycan microarray analysis. The impact of these mutations on replication efficiency was assessed in vitro and in vivo. Although certain mutations switched the receptor binding preference of the H5 HA, the rescued mutant viruses displayed reduced replication in vitro and delayed peak virus shedding in ferrets. An improvement in transmission efficiency was not observed with any of the mutants compared to the parental viruses, indicating that alternative molecular changes are required for H5N1 viruses to fully adapt to humans and to acquire pandemic capability. |
k-Nearest neighbor based consistent entropy estimation for hyperspherical distributions
Li SQ , Mnatsakanov RM , Andrew ME . Entropy (Basel) 2011 13 (3) 650-667 A consistent entropy estimator for hyperspherical data is proposed based on the k-nearest neighbor (knn) approach. The asymptotic unbiasedness and consistency of the estimator are proved. Moreover, cross entropy and Kullback-Leibler (KL) divergence estimators are also discussed. Simulation studies are conducted to assess the performance of the estimators for models including uniform and von Mises-Fisher distributions. The proposed knn entropy estimator is compared with the moment-based counterpart via simulations. The results show that these two methods are comparable. |
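The paper's estimator is built for hyperspherical data. For intuition about the k-NN approach it extends, here is a minimal Euclidean Kozachenko-Leonenko differential entropy sketch with brute-force distances; this is the generic construction, not the authors' hyperspherical estimator.

```python
import math

EULER_GAMMA = 0.5772156649015329

def digamma_int(n):
    """Digamma at a positive integer: psi(n) = -gamma + sum_{j=1}^{n-1} 1/j."""
    return -EULER_GAMMA + sum(1.0 / j for j in range(1, n))

def knn_entropy(points, k=1):
    """Kozachenko-Leonenko k-NN differential entropy estimate (in nats)
    for distinct points in R^d, using brute-force Euclidean distances:
    H ~= psi(n) - psi(k) + log(c_d) + (d/n) * sum_i log(eps_i),
    where eps_i is the distance from point i to its k-th nearest neighbor
    and c_d is the volume of the unit ball in R^d."""
    n = len(points)
    d = len(points[0])
    log_cd = (d / 2) * math.log(math.pi) - math.lgamma(d / 2 + 1)
    acc = 0.0
    for i, p in enumerate(points):
        dists = sorted(math.dist(p, q) for j, q in enumerate(points) if j != i)
        acc += math.log(dists[k - 1])
    return digamma_int(n) - digamma_int(k) + log_cd + (d / n) * acc
```

The estimator inherits the scale property of differential entropy, H(aX) = H(X) + d·log a, which makes a useful sanity check even on small samples.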
Distribution of Chlamydia trachomatis genovars among youths and adults in Brazil
Machado AC , Bandea CI , Alves MF , Joseph K , Igietseme J , Miranda AE , Guimaraes EM , Turchi MD , Black CM . J Med Microbiol 2011 60 472-6 Despite a high prevalence of sexually transmitted Chlamydia trachomatis infections in Brazil and other countries in South America, very little is known about the distribution of C. trachomatis genovars. In this study, we genotyped C. trachomatis strains from urine or endocervical specimens collected from 163 C. trachomatis-positive female and male youths, and female adults, residing in two different regions of Brazil: the city of Goiania, located in the central part of Brazil, and the city of Vitoria, in the south-east region. C. trachomatis strains were genotyped by amplifying and sequencing the ompA gene encoding the chlamydial major outer-membrane protein, which is genovar specific. We found nine different C. trachomatis genovars: E (39.3%), F (16.6%), D (15.9%), I (8.6%), J (7.4%), G (4.9%), K (3.1%), H (2.4%) and B (1.8%). The distribution of the C. trachomatis genovars in the two regions of Brazil was similar, and there was no statistically significant association of genovars with age, gender, number of sexual partners or clinical symptoms. The overall distribution of C. trachomatis genovars in Brazil appears similar to that found in other regions of the world, where E, D and F are the most common. This supports the notion that, during the last few decades, the overall distribution of C. trachomatis genovars throughout the world has been relatively stable. |
Rapid intrapartum or postpartum HIV testing at a midwife obstetric unit and a district hospital in South Africa
Theron GB , Shapiro DE , Van Dyke R , Cababasay MP , Louw J , Watts DH , Smith E , Bulterys M , Maupin R . Int J Gynaecol Obstet 2011 113 (1) 44-9 OBJECTIVE: To compare the intrapartum and postpartum feasibility and acceptance of voluntary counseling and rapid testing (VCT) among women with unknown HIV status in South Africa. METHODS: Eligible women were randomized according to the calendar week of presentation to receive VCT either while in labor or after delivery. RESULTS: Of 7238 women approached, 542 (7.5%) were eligible, 343 (63%) were enrolled, and 45 (13%) were found to be HIV infected. The proportions of eligible women who accepted VCT were 66.8% (161 of 241) in the intrapartum arm and 60.5% (182 of 301) in the postpartum arm; the difference of 6.3% (95% CI, -1.8% to 14.5%) was not significant. The median times (44 and 45 minutes) required to conduct VCT were also similar in the 2 arms. In the intrapartum arm, all women in true labor received their test results before delivery, and all those found to be HIV positive accepted prophylaxis with nevirapine before delivery. CONCLUSIONS: Rapid testing in labor wards for women with an unknown HIV status is feasible and well accepted, and allows for more timely antiretroviral prophylaxis than postpartum testing. |
Analysis of nevirapine resistance in HIV-infected infants who received extended nevirapine or nevirapine/zidovudine prophylaxis
Fogel J , Hoover DR , Sun J , Mofenson LM , Fowler MG , Taylor AW , Kumwenda N , Taha TE , Eshleman SH . AIDS 2011 25 (7) 911-917 BACKGROUND: In the Post Exposure Prophylaxis of Infants (PEPI)-Malawi trial, infants received up to 14 weeks of extended nevirapine (NVP) or extended NVP with zidovudine (NVP + ZDV) to prevent postnatal HIV transmission. We examined emergence and persistence of NVP resistance in HIV-infected infants who received these regimens prior to HIV diagnosis. METHODS: Infant plasma samples collected at 14 weeks of age were tested using the ViroSeq HIV Genotyping System and a sensitive point mutation assay, LigAmp (for K103N and Y181C). Samples collected at 6 and 12 months of age were analyzed using LigAmp. RESULTS: At 14 weeks of age, NVP resistance was detected in samples from 82 (75.9%) of 108 HIV-infected infants. Although the frequency of NVP resistance detected by ViroSeq was lower in the extended NVP + ZDV arm than in the extended NVP arm, the difference was not statistically significant (38/55 = 69.1% vs. 44/53 = 83.0%, P = 0.12). Similar results were obtained using LigAmp. Using LigAmp, the proportion of infants who still had detectable NVP resistance at 6 and 12 months was similar among infants in the two study arms (at 6 months: 17/20 = 85.0% for extended NVP vs. 21/26 = 80.8% for extended NVP + ZDV, P = 1.00; at 12 months: 9/16 = 56.3% for extended NVP vs. 10/13 = 76.9% for extended NVP + ZDV, P = 0.43). CONCLUSION: Infants exposed to extended NVP or extended NVP + ZDV had high rates of NVP resistance at 14 weeks of age, and resistant variants frequently persisted for 6-12 months. Frequency and persistence of NVP resistance did not differ significantly among infants who received extended NVP only vs. extended NVP + ZDV prophylaxis. |
Folic acid food fortification - its history, effect, concerns, and future directions
Crider KS , Bailey LB , Berry RJ . Nutrients 2011 3 (3) 370-384 Periconceptional intake of folic acid is known to reduce a woman's risk of having an infant affected by a neural tube birth defect (NTD). National programs to mandate fortification of food with folic acid have reduced the prevalence of NTDs worldwide. Uncertainty surrounding possible unintended consequences has led to concerns about higher folic acid intake and food fortification programs. This uncertainty emphasizes the need to continually monitor fortification programs for accurate measures of their effect and the ability to address concerns as they arise. This review highlights the history, effect, concerns, and future directions of folic acid food fortification programs. |
Vasospasm in the feet in workers assessed for HAVS
House R , Jiang D , Thompson A , Eger T , Krajnak K , Sauve J , Schweigert M . Occup Med (Lond) 2011 61 (2) 115-20 BACKGROUND: Previous studies have suggested that the presence of the vascular component of hand-arm vibration syndrome (HAVS) in the hands increases the risk of cold-induced vasospasm in the feet. AIMS: To determine if objectively measured cold-induced vasospasm in the hands is a risk factor for objectively measured cold-induced vasospasm in the feet in workers being assessed for HAVS. METHODS: The subjects were 191 male construction workers who had a standardized assessment for HAVS including cold provocation digital photocell plethysmography of the hands and feet to measure cold-induced vasospasm. Bivariate analysis and multinomial logistic regression were used to examine the association between plethysmographic findings in the feet and predictor variables including years worked in construction, occupation, current smoking, cold intolerance in the feet, the Stockholm vascular stage and plethysmographic findings in the hands. RESULTS: Sixty-one (32%) subjects had non-severe vasospasm and 59 (31%) had severe vasospasm in the right foot, with the corresponding values being 57 (30%) and 62 (32%) in the left foot. Multinomial logistic regression indicated that the only statistically significant predictor of severe vasospasm in the feet was the presence of severe vasospasm in the hands (OR: 4.11, 95% CI: 1.60-10.6, P < 0.01 on the right side and OR: 4.97, 95% CI: 1.82-13.53, P < 0.01 on the left side). Multinomial logistic regression analysis did not indicate any statistically significant predictors of non-severe vasospasm in the feet. CONCLUSIONS: Workers assessed for HAVS frequently have cold-induced vasospasm of their feet. The main predictor of severe vasospastic foot abnormalities is severe cold-induced vasospasm in the hands. |
Pulmonary inflammation induced by office dust and the relation to (1→3)-beta-glucan using different extraction techniques
Young S , Cox-Ganser JM , Shogren ES , Wolfarth MG , Li S , Antonini JM , Castranova V , Park J . Toxicol Environ Chem 2011 93 (4) 806-823 It is observed that (1→3)-beta-glucan, a major cell wall component of fungi, induces pulmonary inflammation. There is inconsistency in determining the correlation between the levels of glucan measured by current extraction methods and the respiratory inflammation observed in individuals or lab animals exposed to environmental dust samples. The glucan-specific limulus amebocyte lysate (G-LAL) method was used after extraction with dimethyl sulfoxide (DMSO) or sodium hydroxide (NaOH) to analyze the glucan content of office dust samples collected from a water-damaged building. C3HeB/FeJ mice, an endotoxin-sensitive strain, were treated with different dust samples (2.5 mg kg-1 body weight) or saline (vehicle control) by pharyngeal aspiration. At 1 day after aspiration, bronchoalveolar lavage (BAL) was performed, and lung inflammation and injury were assessed by measuring: (1) neutrophil (PMN) infiltration, (2) inflammatory cytokine (IL-6, IL-10, MCP-1, IFN-γ, TNF-α, and IL-12p70) levels, and (3) albumin and lactate dehydrogenase in recovered BAL fluid. Both DMSO and NaOH extraction increased the detection of glucan by approximately 20-fold compared to water extraction. However, only the DMSO extraction method showed a statistically significant positive correlation between (1→3)-beta-glucan and albumin levels, total numbers of BAL cells and polymorphonuclear leukocytes (PMNs) recovered, and levels of TNF-α, MCP-1, and IL-6. In conclusion, (1→3)-beta-glucan is a potent inflammatory agent in dust samples and DMSO extraction for glucan analysis may prove useful in understanding the impact of environmental contamination by glucans on lung disease. © 2011 Taylor & Francis. |
Estimation of the kinetic energy dissipation in fall-arrest system and manikin during fall impact
Wu JZ , Powers JR , Harris JR , Pan CS . Ergonomics 2011 54 (4) 367-379 Fall-arrest systems (FASs) have been widely applied to provide a safe stop during fall incidents for occupational activities. The mechanical interaction and kinetic energy exchange between the human body and the fall-arrest system during fall impact is one of the most important factors in FAS ergonomic design. In the current study, we developed a systematic approach to evaluate the energy dissipated in the energy absorbing lanyard (EAL) and in the harness/manikin during fall impact. The kinematics of the manikin and EAL during the impact were derived using the arrest-force time histories that were measured experimentally. We applied the proposed method to analyse the experimental data of drop tests at heights of 1.83 and 3.35 m. Our preliminary results indicate that approximately 84-92% of the kinetic energy is dissipated in the EAL system and the remainder is dissipated in the harness/manikin during fall impact. The proposed approach would be useful for the ergonomic design and performance evaluation of an FAS. STATEMENT OF RELEVANCE: Mechanical interaction, especially kinetic energy exchange, between the human body and the fall-arrest system during fall impact is one of the most important factors in the ergonomic design of a fall-arrest system. In the current study, we propose an approach to quantify the kinetic energy dissipated in the energy absorbing lanyard and in the harness/body system during fall impact. |
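The energy bookkeeping described in the abstract — deriving manikin kinematics from the measured arrest-force history and integrating force times velocity to get the work absorbed by the lanyard — can be sketched numerically. The code below is an illustrative one-body model only: the mass, pulse duration, time step, and the half-sine force pulse standing in for the measured force record are all assumptions, not values from the study. A single rigid mass cannot reproduce the reported 84-92% EAL / harness-manikin split, since with one body all input energy becomes lanyard work; the sketch shows only the integration method.

```python
import math

# Illustrative assumptions (not from the study): manikin mass, arrest
# duration, and a synthetic half-sine arrest-force pulse.
m, g = 100.0, 9.81          # manikin mass (kg), gravity (m/s^2)
h, T, dt = 1.83, 0.2, 1e-4  # drop height (m), arrest duration (s), step (s)

v = math.sqrt(2 * g * h)    # impact speed at the start of arrest
v0 = v
# Peak force chosen so the net impulse over the pulse stops the manikin
f_peak = (m * v + m * g * T) * math.pi / (2 * T)

t = d = e_lanyard = 0.0
while v > 0 and t < T:
    f = f_peak * math.sin(math.pi * t / T)   # arrest force at time t
    e_lanyard += f * v * dt                  # work absorbed by the lanyard
    d += v * dt                              # fall distance during arrest
    v += (g - f / m) * dt                    # Newton's second law
    t += dt

e_total = 0.5 * m * v0**2 + m * g * d        # kinetic + potential energy input
print(f"energy into lanyard: {e_lanyard:.0f} J of {e_total:.0f} J total")
```

By the work-energy theorem the two printed values agree to within the integration error, confirming the bookkeeping; resolving the lanyard-versus-harness split as in the study would require a multi-body model of the harness and manikin compliance.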
Integrating direct-reading exposure assessment methods into industrial hygiene practice
Pearce T , Coffey C . J Occup Environ Hyg 2011 8 (5) 31-6 Real-time and near real-time methods for assessing workplace exposures are becoming increasingly available. While many conventional exposure assessment methods require collecting the agent of interest on some type of sampling media and subsequently sending it to a commercial laboratory for analysis, some workplace hazards are already routinely monitored in real time. Noise is the primary example of a workplace hazard that is monitored by direct-reading instruments, with the regulation governing the allowable exposure also specifying the operating characteristics for the monitors. | Advancements in the technology for monitoring the range of workplace hazards have led to increased usage of real-time monitoring either as a supplement to or a replacement for conventional workplace sampling. Direct-reading methods are also being used in innovative ways, such as for identifying workplace factors that influence exposure (determinants of exposure). Such characterization is possible because these methods have unique capabilities for measuring peak concentrations and for differentiating exposures across different work tasks or manufacturing processes; such determinations are not usually possible with conventional time-weighted average (TWA) exposure assessment methods. |
Cytogenetic analysis of an exposed-referent study: perchloroethylene-exposed dry cleaners compared to unexposed laundry workers
Tucker JD , Sorensen KJ , Ruder AM , McKernan LT , Forrester CL , Butler MA . Environ Health 2011 10 16 BACKGROUND: Significant numbers of people are exposed to tetrachloroethylene (perchloroethylene, PCE) every year, including workers in the dry cleaning industry. Adverse health effects have been associated with PCE exposure. However, investigations of possible cumulative cytogenetic damage resulting from PCE exposure are lacking. METHODS: Eighteen dry cleaning workers and 18 laundry workers (unexposed controls) provided a peripheral blood sample for cytogenetic analysis by whole chromosome painting. Pre-shift exhaled air on these same participants was collected and analyzed for PCE levels. The laundry workers were matched to the dry cleaners on race, age, and smoking status. The relationships between levels of cytological damage and exposures (including PCE levels in the shop and in workers' blood, pack-years, cumulative alcohol consumption, and age) were compared with correlation coefficients and t-tests. Multiple linear regressions considered blood PCE, pack-years, alcohol, and age. RESULTS: There were no significant differences between the PCE-exposed dry cleaners and the laundry workers for chromosome translocation frequencies, but PCE levels were significantly correlated with percentage of cells with acentric fragments (R² = 0.488, p < 0.026). CONCLUSIONS: There does not appear to be a strong effect in these dry cleaning workers of PCE exposure on persistent chromosome damage as measured by translocations. However, the correlation between frequencies of acentric fragments and PCE exposure level suggests that recent exposures to PCE may induce transient genetic damage. More heavily exposed participants and a larger sample size will be needed to determine whether PCE exposure induces significant levels of persistent chromosome damage. |
Training and technical assistance to enhance capacity building between prevention research centers and their partners
Spadaro AJ , Grunbaum JA , Dawkins NU , Wright DS , Rubel SK , Green DC , Simoes EJ . Prev Chronic Dis 2011 8 (3) A65 INTRODUCTION: The Centers for Disease Control and Prevention has administered the Prevention Research Centers Program since 1986. We quantified the number and reach of training programs across all centers, determined whether the centers' outcomes varied by characteristics of the academic institution, and explored potential benefits of training and technical assistance for academic researchers and community partners. We characterized how these activities enhanced capacity building within Prevention Research Centers and the community. METHODS: The program office collected quantitative information on training across all 33 centers via its Internet-based system from April through December 2007. Qualitative data were collected from April through May 2007. We selected 9 centers each for 2 separate, semistructured, telephone interviews, 1 on training and 1 on technical assistance. RESULTS: Across 24 centers, 4,777 people were trained in 99 training programs in fiscal year 2007 (October 1, 2006-September 30, 2007). Nearly 30% of people trained were community members or agency representatives. Training and technical assistance activities provided opportunities to enhance community partners' capacity in areas such as conducting needs assessments and writing grants and to improve the centers' capacity for cultural competency. CONCLUSION: Both qualitative and quantitative data demonstrated that training and technical assistance activities can foster capacity building and provide a reciprocal venue to support researchers' and the community's research interests. Future evaluation could assess community and public health partners' perception of centers' training programs and technical assistance. |
Overview and quality assurance for the oral health component of the National Health and Nutrition Examination Survey (NHANES), 2005-08
Dye BA , Barker LK , Li XF , Lewis BG , Beltran-Aguilar ED . J Public Health Dent 2011 71 (1) 54-61 The oral health component for the National Health and Nutrition Examination Survey (NHANES) was changed in 2005 from an examination conducted by dentists to an oral health screening conducted by health technologists rather than dental professionals. The oral health screening included a person-based assessment for dental caries, restorations, and sealants. This report provides oral health content information and presents results of data quality analyses that include dental examiner reliability statistics for data collected during NHANES 2005-08. Oral health data are available on 15,342 persons aged 5 years and older representing the civilian, noninstitutionalized population of the United States who participated in NHANES 2005-08. Overall, interrater reliability findings indicate that health technologist performance was excellent, with concordance between examination teams and the survey reference examiner being almost perfect for a number of assessments. Concordance for dental caries and sealants (kappa statistics) between health technologists and the survey reference examiner ranged from 0.82 to 0.90 for the combined 4-year period. These findings support the use of health technologists in the assessment of person-based estimates of dental caries and sealant prevalence as part of an oral health surveillance system. |
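The kappa statistics cited above measure chance-corrected agreement between each health technologist and the reference examiner. A minimal sketch of the unweighted Cohen's kappa calculation is below; the 2x2 agreement table of caries calls is entirely hypothetical, chosen only to illustrate a value in the "almost perfect" range the report describes.

```python
def cohens_kappa(table):
    """Unweighted Cohen's kappa for a square rater-agreement table."""
    n = sum(sum(row) for row in table)
    # Observed agreement: proportion of cases on the diagonal
    po = sum(table[i][i] for i in range(len(table))) / n
    # Expected agreement under independence: product of marginal proportions
    pe = sum(
        (sum(table[i]) / n) * (sum(row[i] for row in table) / n)
        for i in range(len(table))
    )
    return (po - pe) / (1 - pe)

# Hypothetical caries present/absent calls: technologist (rows) vs.
# reference examiner (columns)
table = [[40, 5],
         [3, 52]]
print(round(cohens_kappa(table), 2))  # 0.84
```

Raw percent agreement here is 92%, but kappa discounts the agreement expected by chance alone, which is why it is the preferred reliability statistic for examiner calibration in surveys like NHANES.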
Variability in cancer death certificate accuracy by characteristics of death certifiers
Johnson CJ , Hahn CG , Fink AK , German RR . Am J Forensic Med Pathol 2011 33 (2) 137-42 Death certificates are the source for mortality statistics and are used to set public health goals. Accurate death certificates are vital in tracking outcomes of cancer. Deaths may be certified by physicians or other medical professionals, coroners, or medical examiners. Idaho is one of 3 states that participated in a Centers for Disease Control and Prevention-funded study to assess the concordance between cancer-specific causes of death and primary cancer site among linked cancer registry/death certificate data. We investigated variability in the accuracy of cancer death certificates by characteristics of death certifiers, including certifier type (physician vs coroner), physician specialty, years of experience as death certifier, and number of deaths certified. This study showed significant differences by certifier type/physician specialty in the accuracy of cancer mortality measured by death certificates. Nonphysician coroners had lower accuracy rates compared with physicians. Although nonphysician coroners certified less than 5% of cancer deaths in Idaho, they were significantly less likely to match the primary site from the cancer registry. Results from this study may be useful in the future training of death certifiers to improve the accuracy of death certificates and cancer mortality statistics. |
Mean systolic and diastolic blood pressure in adults aged 18 and over in the United States, 2001-2008
Wright JD , Hughes JP , Ostchega Y , Yoon SS , Nwankwo T . Natl Health Stat Report 2011 (35) 1-22, 24 OBJECTIVE: This report presents estimates for the period 2001-2008 of means and selected percentiles of systolic and diastolic blood pressure by sex, race or ethnicity, age, and hypertension status in adults aged 18 and over. METHODS: Demographic characteristics were collected during a personal interview, and blood pressures were measured during a physical examination. All estimates were calculated using the mean of up to three measurements. The final analytic sample consisted of 19,921 adults aged 18 and over with complete data. Examination sample weights and sample design variables were used to calculate nationally representative estimates and standard error estimates that account for the complex design, using SAS and SUDAAN statistical software. RESULTS: Mean systolic blood pressure was 122 mm Hg for all adults aged 18 and over; it was 116 mm Hg for normotensive adults, 130 mm Hg for treated hypertensive adults, and 146 mm Hg for untreated hypertensive adults. Mean diastolic blood pressure was 71 mm Hg for all adults 18 and over; it was 69 mm Hg for normotensive adults, 75 mm Hg for treated hypertensive adults, and 85 mm Hg for untreated hypertensive adults. There was a trend of increasing systolic blood pressure with increasing age. A more curvilinear trend was seen in diastolic blood pressure, with increasing then decreasing means with age in both men and women. Men had higher mean systolic and diastolic pressures than women. There were some differences in mean blood pressure by race or ethnicity, with non-Hispanic black adults having higher mean systolic and diastolic blood pressures than non-Hispanic white and Mexican-American adults, but these differences were not consistent after stratification by hypertension status and sex.
CONCLUSIONS: These estimates of the distribution of blood pressure may be useful for policy makers who are considering ways to achieve a downward shift in the population distribution of blood pressure with the goal of reducing morbidity and mortality related to hypertension. |
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.