Use and timeliness of radiation therapy after breast-conserving surgery in low-income women with early-stage breast cancer
Wheeler SB , Wu Y , Meyer AM , Carpenter WR , Richardson LC , Smith JL , Lewis MA , Weiner BJ . Cancer Invest 2012 30 (4) 258-67 PURPOSE: To characterize overall receipt and timeliness of radiation therapy (RT) following breast-conserving surgery among Medicaid-insured patients. METHOD: State cancer registry data linked with Medicaid claims from 2003 to 2009 were analyzed. Multivariate logistic and Cox proportional hazards regressions were employed. RESULTS: Overall, 81% of patients received guideline-recommended RT. Significant variation in timing of RT initiation was documented. Having fewer comorbidities and receiving chemotherapy were correlated with higher odds of RT initiation within 1 year. CONCLUSION: Although RT use in Medicaid-insured women appears to have improved since earlier studies, documented delays in RT are troublesome and warrant further investigation. |
Ovarian cancer and body size: individual participant meta-analysis including 25,157 women with ovarian cancer from 47 epidemiological studies
Collaborative Group on Epidemiological Studies of Ovarian Cancer , Lee N , Marchbanks P , Ory HW , Peterson HB , Wingo PA . PLoS Med 2012 9 (4) e1001200 BACKGROUND: Only about half the studies that have collected information on the relevance of women's height and body mass index to their risk of developing ovarian cancer have published their results, and findings are inconsistent. Here, we bring together the worldwide evidence, published and unpublished, and describe these relationships. METHODS AND FINDINGS: Individual data on 25,157 women with ovarian cancer and 81,311 women without ovarian cancer from 47 epidemiological studies were collected, checked, and analysed centrally. Adjusted relative risks of ovarian cancer were calculated, by height and by body mass index. Ovarian cancer risk increased significantly with height and with body mass index, except in studies using hospital controls. For other study designs, the relative risk of ovarian cancer per 5 cm increase in height was 1.07 (95% confidence interval [CI], 1.05-1.09; p<0.001); this relationship did not vary significantly by women's age, year of birth, education, age at menarche, parity, menopausal status, smoking, alcohol consumption, having had a hysterectomy, having first degree relatives with ovarian or breast cancer, use of oral contraceptives, or use of menopausal hormone therapy. For body mass index, there was significant heterogeneity (p<0.001) in the findings between ever-users and never-users of menopausal hormone therapy, but not by the 11 other factors listed above. The relative risk for ovarian cancer per 5 kg/m2 increase in body mass index was 1.10 (95% CI, 1.07-1.13; p<0.001) in never-users and 0.95 (95% CI, 0.92-0.99; p = 0.02) in ever-users of hormone therapy. CONCLUSIONS: Ovarian cancer is associated with height and, among never-users of hormone therapy, with body mass index. 
In high-income countries, both height and body mass index have been increasing in birth cohorts now developing the disease. If all other relevant factors had remained constant, then these increases in height and weight would be associated with a 3% increase in ovarian cancer incidence per decade. (2012 Collaborative Group on Epidemiological Studies of Ovarian Cancer.) |
The association between depression and leptin is mediated by adiposity
Morris AA , Ahmed Y , Stoyanova N , Hooper WC , De Staerke C , Gibbons G , Quyyumi A , Vaccarino V . Psychosom Med 2012 74 (5) 483-8 OBJECTIVE: Animal models suggest that impaired leptin production, or leptin resistance despite increased leptin levels, may contribute to depression. The link between leptin and depression could be mediated by obesity, which is more common in depression and increases leptin production. METHODS: We administered the Beck Depression Inventory-II (BDI-II) to 537 participants (mean [standard deviation (SD)] age = 51 [9] years; female, 61%) enrolled in the Morehouse and Emory Team up to Eliminate Health Disparities (META-Health) study. Leptin levels were examined as continuous log-transformed values. RESULTS: Participants with moderate to severe depression had higher leptin levels (median [interquartile range] = 37.7 ng/mL [17.6-64.9 ng/mL]) than those with mild depression (22.9 ng/mL [7.0-57.9 ng/mL]) or minimal to no depression (19.8 ng/mL [7.8-39.1 ng/mL], p = .003). Participants with moderate to severe depression had higher body mass index (BMI) than those with mild or minimal depression (mean [SD] = 33 [8] versus 31 [9] versus 29 [7] kg/m(2), p = .001). After multivariate adjustment for age, sex, race, smoking status, hypertension, diabetes, blood pressure, lipids, and C-reactive protein, the BDI-II score remained a significant predictor of leptin levels (beta = 0.093, p = .01). Further adjustment for BMI eliminated the association between the BDI-II score and leptin (beta = 0.03, p = .3). Adjusting for waist circumference in place of BMI revealed similar findings. CONCLUSIONS: The association between depression and leptin seems to be mediated by increased adiposity in depressed individuals. |
Cervical cancer survivors at increased risk of subsequent tobacco-related malignancies, United States 1992-2008
Underwood JM , Rim SH , Fairley TL , Tai E , Stewart SL . Cancer Causes Control 2012 23 (7) 1009-16 PURPOSE: Persistent smoking among cancer survivors may increase their risk of subsequent malignancies, including tobacco-related malignancies. Despite these risks, nearly 40 % of women diagnosed with cervical cancer continue to smoke after diagnosis. This study describes the relative risk of developing any subsequent and tobacco-related malignancy among cervical cancer survivors. METHODS: We examined data from the year 1992 to 2008 in 13 Surveillance, Epidemiology and End Results registries. We calculated the standardized incidence ratio (SIR) and 95 % confidence limits (CLs) for all subsequent and tobacco-related malignancies among cervical cancer survivors. Tobacco-related malignancies were defined according to the 2004 Surgeon General's Report on the Health Consequences of Smoking. For comparison with cervical cancer survivors, SIRs for subsequent malignancies were also calculated for female survivors of breast or colorectal cancers. RESULTS: The SIR of developing a subsequent tobacco-related malignancy was higher among cervical cancer survivors (SIR = 2.2, 95 % CL = 2.0-2.4). Female breast (SIR = 1.1, 95 % CL = 1.0-1.1) and colorectal cancer survivors (1.1, 1.1-1.2) also had an elevated risk. The increased risk of a subsequent tobacco-related malignancy among cervical cancer survivors was greatest in the first 5 years after the initial diagnosis and decreased as time since diagnosis elapsed. CONCLUSION: Women with cervical cancer have a two-fold increased risk of subsequent tobacco-related malignancies, compared with the general population. In an effort to decrease their risk of subsequent tobacco-related malignancies, cancer survivors should be targeted for tobacco prevention and cessation services. Special attention should be given to cervical cancer survivors, whose risk is almost twice that of breast or colorectal cancer survivors. |
A retrospective survey of HIV drug resistance among patients 1 year after initiation of antiretroviral therapy at 4 clinics in Malawi
Wadonda-Kabondo N , Hedt BL , van Oosterhout JJ , Moyo K , Limbambala E , Bello G , Chilima B , Schouten E , Harries A , Massaquoi M , Porter C , Weigel R , Hosseinipour M , Aberle-Grasse J , Jordan MR , Kabuluzi S , Bennett DE . Clin Infect Dis 2012 54 Suppl 4 S355-61 In 2004, Malawi began scaling up its national antiretroviral therapy (ART) program. Because of limited treatment options, population-level surveillance of acquired human immunodeficiency virus drug resistance (HIVDR) is critical to ensuring long-term treatment success. The World Health Organization target for clinic-level HIVDR prevention at 12 months after ART initiation is ≥70%. In 2007, viral load and HIVDR genotyping was performed in a retrospective cohort of 596 patients at 4 ART clinics. Overall, HIVDR prevention (using viral load ≤400 copies/mL) was 72% (95% confidence interval [CI], 67%-77%; range by site, 60%-83%) and detected HIVDR was 3.4% (95% CI, 1.8%-5.8%; range by site, 2.5%-4.7%). Results demonstrate virological suppression and HIVDR consistent with previous reports from sub-Saharan Africa. High rates of attrition because of loss to follow-up were noted and merit attention. |
Transmitted HIV drug resistance among drug-naive subjects recently infected with HIV in Mexico City: a World Health Organization survey to classify resistance and to field test two alternative patient enrollment methods
Bertagnolio S , Rodriguez-Diaz RA , Fuentes-Romero LL , Bennett DE , Viveros-Rogel M , Hart S , Pilon R , Sandstrom P , Soto-Ramirez LE . Clin Infect Dis 2012 54 Suppl 4 S328-33 In 2004, the World Health Organization performed a survey to assess transmitted drug resistance in Mexico City among drug-naive persons with newly diagnosed human immunodeficiency virus (HIV) infection and likely to be recently infected who were attending 3 voluntary counseling and testing sites. A parallel study comparing 2 alternative methods of enrolling survey participants was conducted in 9 voluntary counseling and testing sites in central Mexico. In study arm 1, subject information, consent, and blood specimens were obtained during the HIV diagnostic testing visit. In study arm 2, consent and blood specimens were obtained at the return visit, only from those who were HIV infected. This survey classified nonnucleoside reverse-transcriptase inhibitor and nucleoside reverse-transcriptase inhibitor transmitted drug resistance as <5% and 5%-15%, respectively. Arm 2 yielded major advantages in cost and workload, with no evidence of increased sampling bias. |
Update on World Health Organization HIV drug resistance prevention and assessment strategy: 2004-2011
Jordan MR , Bennett DE , Wainberg MA , Havlir D , Hammer S , Yang C , Morris L , Peeters M , Wensing AM , Parkin N , Nachega JB , Phillips A , De Luca A , Geng E , Calmy A , Raizes E , Sandstrom P , Archibald CP , Perriens J , McClure CM , Hong SY , McMahon JH , Dedes N , Sutherland D , Bertagnolio S . Clin Infect Dis 2012 54 Suppl 4 S245-9 The HIV drug resistance (HIVDR) prevention and assessment strategy, developed by the World Health Organization (WHO) in partnership with HIVResNet, includes monitoring of HIVDR early warning indicators, surveys to assess acquired and transmitted HIVDR, and development of an accredited HIVDR genotyping laboratory network to support survey implementation in resource-limited settings. As of June 2011, 52 countries had implemented at least 1 element of the strategy, and 27 laboratories had been accredited. As access to antiretrovirals expands under the WHO/Joint United Nations Programme on HIV/AIDS Treatment 2.0 initiative, it is essential to strengthen HIVDR surveillance efforts in the face of increasing concern about HIVDR emergence and transmission. |
World Health Organization generic protocol to assess drug-resistant HIV among children <18 months of age and newly diagnosed with HIV in resource-limited countries
Bertagnolio S , Penazzato M , Jordan MR , Persaud D , Mofenson LM , Bennett DE . Clin Infect Dis 2012 54 S254-S260 Increased use of nonnucleoside reverse transcriptase inhibitors (NNRTIs) in pregnant and breastfeeding women will result in fewer children infected with human immunodeficiency virus (HIV). However, among children infected despite prevention of mother-to-child transmission (PMTCT), a substantial proportion will acquire NNRTI-resistant HIV, potentially compromising response to NNRTI-based antiretroviral therapy (ART). In countries scaling up PMTCT and pediatric ART programs, it is crucial to assess the proportion of young children with drug-resistant HIV to improve health outcomes and support national and global decision making on optimal selection of pediatric first-line ART. This article summarizes a new World Health Organization surveillance protocol to assess resistance using remnant dried blood spot specimens from a representative sample of children aged <18 months being tested for early infant diagnosis. |
Monitoring of early warning indicators for HIV drug resistance in antiretroviral therapy clinics in Zimbabwe
Dzangare J , Gonese E , Mugurungi O , Shamu T , Apollo T , Bennett DE , Kelley KF , Jordan MR , Chakanyuka C , Cham F , Banda RM . Clin Infect Dis 2012 54 Suppl 4 S313-6 Monitoring human immunodeficiency virus drug resistance (HIVDR) early warning indicators (EWIs) can help national antiretroviral treatment (ART) programs to identify clinic factors associated with HIVDR emergence and provide evidence to support national program and clinic-level adjustments, if necessary. World Health Organization-recommended HIVDR EWIs were monitored in Zimbabwe using routinely available data at selected ART clinics between 2007 and 2009. As Zimbabwe's national ART coverage increases, improved ART information systems are required to strengthen routine national ART monitoring and evaluation and facilitate scale-up of HIVDR EWI monitoring. Attention should be paid to minimizing loss to follow-up, supporting adherence, and ensuring clinic-level drug supply continuity. |
Prevalence of HIV drug resistance before and 1 year after treatment initiation in 4 sites in the Malawi antiretroviral treatment program
Wadonda-Kabondo N , Bennett D , van Oosterhout JJ , Moyo K , Hosseinipour M , Devos J , Zhou Z , Aberle-Grasse J , Warne TR , Mtika C , Chilima B , Banda R , Pasulani O , Porter C , Phiri S , Jahn A , Kamwendo D , Jordan MR , Kabuluzi S , Chimbwandira F , Kagoli M , Matatiyo B , Demby A , Yang C . Clin Infect Dis 2012 54 Suppl 4 S362-8 Since 2004, the Malawi antiretroviral treatment (ART) program has provided a public health-focused system based on World Health Organization clinical staging, standardized first-line ART regimens, limited laboratory monitoring, and no patient-level monitoring of human immunodeficiency virus drug resistance (HIVDR). The Malawi Ministry of Health conducts periodic evaluations of HIVDR development in prospective cohorts at sentinel clinics. We evaluated viral load suppression, HIVDR, and factors associated with HIVDR in 4 ART sites at 12-15 months after ART initiation. More than 70% of patients initiating ART had viral suppression at 12 months. HIVDR prevalence (6.1%) after 12 months of ART was low and largely associated with baseline HIVDR. Better follow-up, removal of barriers to on-time drug pickups, and adherence education for patients 16-24 years of age may further prevent HIVDR. |
Evaluation of obesity as an independent risk factor for medically attended laboratory-confirmed influenza
Coleman LA , Waring SC , Irving SA , Vandermause M , Shay DK , Belongia EA . Influenza Other Respir Viruses 2012 7 (2) 160-7 BACKGROUND: The relationship between obesity and susceptibility to influenza infection in humans is unclear. Morbidly obese people were at an increased risk of complications from 2009 pandemic H1N1 influenza [A(H1N1)pdm09]. OBJECTIVE: The goal of this study was to determine whether medically attended, laboratory-confirmed influenza is independently associated with obesity in adults with acute respiratory illness. PATIENTS/METHODS: Adults ≥20 years with a medical encounter for acute respiratory illness were recruited from a population cohort during the 2007-2008 (n = 903), 2008-2009 (n = 869), and 2009 pandemic (n = 851) seasons. Nasopharyngeal swabs were tested for influenza by real-time reverse-transcription polymerase chain reaction. Body mass index (BMI) was calculated using data from the electronic medical record. Logistic regression evaluated the association between influenza and obesity, adjusting for gender, vaccination, age, and high-risk medical condition. RESULTS: Influenza was detected in 50% of patients in 2007-2008, 15% in 2008-2009, and 14% during the 2009 pandemic. Predominant seasonal viruses in this population were A/H3N2 in 2007-2008, and A/H1N1 and B in 2008-2009. Mean (+/-SD) BMI was 30.58 (+/-7.31) in patients with influenza and 30.93 (+/-7.55) in test-negative controls during all seasons. Mean BMI of patients with influenza did not vary by season. After adjusting for confounders, neither obesity nor extreme obesity was associated with influenza by season or for all years combined (OR 0.95: 95% CI 0.75, 1.20 and 1.10: 0.80, 1.52, respectively, for obesity and extreme obesity, all years). CONCLUSIONS: Obesity was not associated with medically attended influenza among adults with acute respiratory illness in this population. |
Hospitalizations associated with influenza and respiratory syncytial virus in the United States, 1993-2008
Zhou H , Thompson WW , Viboud CG , Ringholz CM , Cheng PY , Steiner C , Abedi GR , Anderson LJ , Brammer L , Shay DK . Clin Infect Dis 2012 54 (10) 1427-36 BACKGROUND: Age-specific comparisons of influenza and respiratory syncytial virus (RSV) hospitalization rates can inform prevention efforts, including vaccine development plans. Previous US studies have not estimated jointly the burden of these viruses using similar data sources and over many seasons. METHODS: We estimated influenza and RSV hospitalizations in 5 age categories (<1, 1-4, 5-49, 50-64, and ≥65 years) with data for 13 states from 1993-1994 through 2007-2008. For each state and age group, we estimated the contribution of influenza and RSV to hospitalizations for respiratory and circulatory disease by using negative binomial regression models that incorporated weekly influenza and RSV surveillance data as covariates. RESULTS: Mean rates of influenza and RSV hospitalizations were 63.5 (95% confidence interval [CI], 37.5-237) and 55.3 (95% CI, 44.4-107) per 100,000 person-years, respectively. The highest hospitalization rates for influenza were among persons aged ≥65 years (309/100,000; 95% CI, 186-1100) and those aged <1 year (151/100,000; 95% CI, 151-660). For RSV, children aged <1 year had the highest hospitalization rate (2350/100,000; 95% CI, 2220-2520) followed by those aged 1-4 years (178/100,000; 95% CI, 155-230). Age-standardized annual rates per 100,000 person-years varied substantially for influenza (33-100) but less for RSV (42-77). CONCLUSIONS: Overall US hospitalization rates for influenza and RSV are similar; however, their age-specific burdens differ dramatically. Our estimates are consistent with those from previous studies focusing either on influenza or RSV. Our approach provides robust national comparisons of hospitalizations associated with these 2 viral respiratory pathogens by age group and over time. |
Antiviral treatment for children with influenza: what's the evidence? Reply
Garg S , Finelli L , Fry A . Pediatr Infect Dis J 2012 31 (6) 662 Dr. Moral comments that most of the information on the efficacy of neuraminidase inhibitors for treatment of influenza virus infection in hospitalized children or children at increased risk of complications comes from retrospective and uncontrolled studies.1 We agree with Dr. Moral that there is a paucity of clinical trials data to inform influenza antiviral treatment recommendations for children, especially those who are hospitalized or have underlying medical conditions. However, we disagree that observational studies should be disregarded as evidence to support guidance. While observational studies have inherent design limitations, they can inform clinical practice and public health, particularly when data from randomized controlled trials are unavailable, have not been conducted in certain high-risk groups, or are unethical to perform using a placebo group because antiviral treatment is recommended in the group under study. We support the need for additional studies to inform clinical practice and policy decisions including well-designed clinical trials and observational studies. 
However, while we await these data, we cannot ignore the fact that in the United States each year >100 children die from influenza and its complications2–5 and an estimated 35,000 children are hospitalized,6 with 11%–27% of hospitalized children having illness severe enough to require admission to an intensive care unit.7 Among hospitalized children, 32%–75% have underlying medical conditions,7 and chronic lung disease, neurological conditions, immunocompromised state and other high-risk conditions are associated with an increased risk of intensive care unit admission and death among children with influenza virus infection.8–10 Two meta-analyses of oseltamivir clinical trials data, using the same databases but different methodologies, have concluded that early outpatient use of oseltamivir reduced hospitalizations and lower respiratory tract illness among healthy persons.11,12 Furthermore, additional evidence supports a reduction in complications with early treatment.13–15 When providing antiviral treatment recommendations, policy makers consider the total body of evidence that is available and account for potential benefits and risks, with the aim of reducing the morbidity and mortality associated with influenza illness, especially among those most at risk for severe illness. As new data become available, guidance can be updated and revised. |
Assessment of physician knowledge and practices concerning Shiga toxin-producing Escherichia coli infection and enteric illness, 2009, Foodborne Diseases Active Surveillance Network (FoodNet)
Clogher P , Hurd S , Hoefer D , Hadler JL , Pasutti L , Cosgrove S , Segler S , Tobin-D'Angelo M , Nicholson C , Booth H , Garman K , Mody RK , Gould LH . Clin Infect Dis 2012 54 Suppl 5 S446-52 BACKGROUND: Shiga toxin-producing Escherichia coli (STEC) infections cause acute diarrheal illness and sometimes life-threatening hemolytic uremic syndrome (HUS). Escherichia coli O157 is the most common STEC, although the number of reported non-O157 STEC infections is growing with the increased availability and use of enzyme immunoassay testing, which detects the presence of Shiga toxin in stool specimens. Prompt and accurate diagnosis of STEC infection facilitates appropriate therapy and may improve patient outcomes. METHODS: We mailed 2400 surveys to physicians in 8 Foodborne Diseases Active Surveillance Network (FoodNet) sites to assess their knowledge and practices regarding STEC testing, treatment, and reporting, and their interpretation of Shiga toxin test results. RESULTS: Of 1102 completed surveys, 955 were included in this analysis. Most (83%) physicians reported often or always ordering a culture of bloody stool specimens; 49% believed that their laboratory routinely tested for STEC O157, and 30% believed that testing for non-O157 STEC was also included in a routine stool culture. Forty-two percent of physicians were aware that STEC, other than O157, can cause HUS, and 34% correctly interpreted a positive Shiga toxin test result. All STEC knowledge-related factors were strongly associated with correct interpretation of a positive Shiga toxin test result. CONCLUSIONS: Identification and management of STEC infection depends on laboratories testing for STEC and physicians ordering and correctly interpreting results of Shiga toxin tests. Although overall knowledge of STEC was low, physicians who had more knowledge were more likely to correctly interpret a Shiga toxin test result. Physician knowledge of STEC may be modifiable through educational interventions. |
Clinical research and development of tuberculosis diagnostics: moving from silos to synergy
Nahid P , Kim PS , Evans CA , Alland D , Barer M , Diefenbach J , Ellner J , Hafner R , Hamilton CD , Iademarco MF , Ireton G , Kimerling ME , Lienhardt C , Mackenzie WR , Murray M , Perkins MD , Posey JE , Roberts T , Sizemore C , Stevens WS , Via L , Williams SD , Yew WW , Swindells S . J Infect Dis 2012 205 Suppl 2 S159-68 The development, evaluation, and implementation of new and improved diagnostics have been identified as critical needs by human immunodeficiency virus (HIV) and tuberculosis researchers and clinicians alike. These needs exist in international and domestic settings and in adult and pediatric populations. Experts in tuberculosis and HIV care, researchers, healthcare providers, public health experts, and industry representatives, as well as representatives of pertinent US federal agencies (Centers for Disease Control and Prevention, Food and Drug Administration, National Institutes of Health, United States Agency for International Development) assembled at a workshop proposed by the Diagnostics Working Group of the Federal Tuberculosis Taskforce to review the state of tuberculosis diagnostics development in adult and pediatric populations. |
Does maternal use of tenofovir during pregnancy affect growth of HIV-exposed uninfected infants?
Kuhn L , Bulterys M . AIDS 2012 26 (9) 1167-9 Transmission of HIV from mother to child can be almost eliminated by antiretroviral drugs started early in pregnancy [1,2]. If HIV disease in a pregnant woman is more advanced (i.e., CD4 cell count below 350 cells/μl), then antiretroviral drugs are given as treatment for the woman herself with the additional benefit that transmission to the child is prevented. If HIV disease in a pregnant woman is less advanced, then combination antiretroviral regimens can be provided during pregnancy and lactation as prophylaxis. There is also gathering support for implementing universal test-and-treat strategies during pregnancy as a public health approach to preventing vertical transmission and optimizing coverage of treatment for women who need it [3–5]. Tenofovir disoproxil fumarate (TDF) is an attractive drug to recommend as part of first-line antiretroviral drug treatment regimens because of its generally favorable safety profile, excellent durability, and high efficacy [6,7]. It is also available in the United States and many other countries as part of fixed-dose combinations. Thus, determining whether or not there are untoward side effects of intrauterine exposure is of great public health importance. | There is another reason why the safety of intrauterine TDF exposure needs careful consideration. Oral TDF combined with emtricitabine used as preexposure prophylaxis (PrEP) has now been shown in three independent studies to be significantly associated with reduced risk of HIV acquisition among adult men and women [8–10]. In settings, particularly in sub-Saharan Africa, with high HIV incidence, PrEP has the potential to make sizable gains in terms of adult HIV infections prevented [11]. The young, at-risk, but still uninfected, women most in need of antiretroviral prophylaxis against sexual HIV transmission are also those most likely to become pregnant. 
Thus, not only will infants born to HIV-infected women potentially be exposed to TDF, but infants born to uninfected women may also be exposed in the future, in the optimistic scenario that these prevention programs are able to gather momentum. Vaginal gel formulations of TDF have produced promising [12] but somewhat inconsistent results and have not yet garnered support for implementation without further studies. Vaginal gel formulations are attractive, from the potential toxicity point of view, as systemic drug concentrations are considerably lower [13]. |
Model Aquatic Health Code (MAHC) and International Swimming Pool and Spa Code (ISPSC)
Blake R , Peters J . J Environ Health 2012 74 (9) 36-9 Swimming is the third most common form of physical exercise in the United States (MMWR 2010). But a lot of swimming occurs in facilities with inadequate public health protection. This is often due to local or state codes that are not up-to-date or are not based on the latest science. In 2009, the Centers for Disease Control and Prevention (CDC) reviewed 2008 data from four state and 11 local pool inspection programs. Pool codes in those jurisdictions were not uniform, resulting in inspection data recorded in different ways. The data analysis showed, however, that of the 120,000 inspections, over 12% found serious health and safety violations that resulted in immediate pool closure (MMWR 2010). | This lack of consistent health regulation in state and local codes results in uncertainty and confusion for pool and spa owners, operators, suppliers, and users. And in some existing laws and regulations, gaps or outdated standards leave many persons unnecessarily vulnerable to disease and injury. The need for a menu of regulatory and policy provisions becomes clear: a menu to help state and local governments review their laws and revisit the design, construction, operation, and maintenance of all aquatic venues within their jurisdictions. | Recent years have seen a steady increase in reported disease outbreaks tied to aquatic facilities. Table 1 contains examples of recreational water illnesses attributed to aquatic venues. Waterborne pathogens can cause a variety of ailments, many of which cause diarrhea. In fact, diarrheal disease is so common that some 5% of the public contracts it monthly (Roy et al. 2006). Annually, the incidence is up to 3.5 cases of diarrhea per person, with even higher rates for young children (Mack 2006). Behaviors such as swallowing water, inadequate showering before entering the water, and the lack of toilet breaks and diaper changes all increase the likelihood of disease in aquatic venues. |
The burden of environmental disease in the United States
Pugh KH , Zarus GM . J Environ Health 2012 74 (9) 30-4 The U.S. spends the most of any nation on health—over $2 trillion every year—yet ranks 37th in overall health among nations of the world (Healthiest Nation Alliance, 2011). Over 17% of the U.S. gross domestic product was spent on health expenditures in 2009 (Centers for Medicare and Medicaid Services, 2009). As our emphasis moves to health protection through health promotion, prevention, and preparedness, it is helpful to identify the economic burden of major disease groups in order to develop and support the best evidence-based health protection strategies. In an effort to establish environmental health prevention strategy targets, we have focused this report on defining the economic burden of environmental disease in the U.S. |
Relative risk of listeriosis in Foodborne Diseases Active Surveillance Network (FoodNet) sites according to age, pregnancy, and ethnicity
Pouillot R , Hoelzer K , Jackson KA , Henao OL , Silk BJ . Clin Infect Dis 2012 54 Suppl 5 S405-10 BACKGROUND: Quantitative estimates of the relative risk (RR) of listeriosis among higher-risk populations and a nuanced understanding of the age-specific risks are crucial for risk assessments, targeted interventions, and policy decisions. METHOD: The RR of invasive listeriosis was evaluated by age, pregnancy status, and ethnicity using 2004-2009 data from the Foodborne Diseases Active Surveillance Network (FoodNet). Nonparametric logistic regression was used to characterize changes in risk with age and ethnicity. Adjusted RRs and 95% confidence intervals (CIs) were evaluated using negative binomial generalized linear models. RESULTS: Among non-pregnancy-associated cases, listeriosis incidence rates increased gradually with age (45-59 years: RR, 4.7; 95% CI, 3.3-6.8; >85 years: RR, 53.8; 95% CI, 37.3-78.9; reference: 15-44 years). The RR was significantly higher for Hispanics than for non-Hispanics (RR, 1.8; 95% CI, 1.3-2.5). Among women of reproductive age (15-44 years), pregnant women had a markedly higher listeriosis risk (RR, 114.6; 95% CI, 68.9-205.1) than nonpregnant women. The RR was higher for Hispanic than non-Hispanic women, regardless of pregnancy status, and this increased during the study period (2004-2006: RR, 1.9; 95% CI, 1.0-3.3; 2007-2009: RR, 4.8; 95% CI, 3.1-7.1). CONCLUSIONS: This study quantifies the increases in risk of listeriosis among older persons, pregnant women, and Hispanics in the United States. Additional research is needed to better describe the independent effects of age on risk while accounting for underlying conditions. These estimates are needed both to optimize risk assessment models and to inform targeted interventions and policy decisions. |
Retention of autism spectrum diagnoses by community professionals: findings from the Autism and Developmental Disabilities Monitoring Network, 2000 and 2006
Wiggins LD , Baio J , Schieve L , Lee LC , Nicholas J , Rice CE . J Dev Behav Pediatr 2012 33 (5) 387-95 OBJECTIVE: Past research is inconsistent regarding the stability of autism spectrum disorder (ASD) diagnoses. The authors therefore sought to examine the proportion of children, identified from a population-based surveillance system, who had a change in classification from ASD to non-ASD and factors associated with such changes. METHODS: Children with a documented age of first ASD diagnosis noted in surveillance records by a community professional (n = 1392) were identified from the Autism and Developmental Disabilities Monitoring Network. Children were considered to have a change in classification if an ASD was excluded after the age of first recorded ASD diagnosis. Child and surveillance factors were entered into a multivariable regression model to determine factors associated with diagnostic change. RESULTS: Only 4% of our sample had a change in classification from ASD to non-ASD noted in evaluation records. Factors associated with change in classification from ASD to non-ASD were timing of first ASD diagnosis at 30 months or younger, onset other than developmental regression, presence of specific developmental delays, and participation in a special needs classroom other than autism at 8 years of age. CONCLUSIONS: Our results indicate that children with ASDs are likely to retain an ASD diagnosis, which underscores the need for continued services. Children diagnosed at 30 months or younger are more likely to experience a change in classification from ASD to non-ASD than children diagnosed at 31 months or older, suggesting earlier identification of ASD symptoms may be associated with response to intervention efforts or increased likelihood for overdiagnosis. |
Salmonella enterica serotype Enteritidis: increasing incidence of domestically acquired infections
Chai SJ , White PL , Lathrop SL , Solghan SM , Medus C , McGlinchey BM , Tobin-D'Angelo M , Marcus R , Mahon BE . Clin Infect Dis 2012 54 Suppl 5 S488-97 BACKGROUND: Salmonella enterica causes an estimated 1 million cases of domestically acquired foodborne illness in humans annually in the United States; Enteritidis (SE) is the most common serotype. Public health authorities, regulatory agencies, food producers, and food processors need accurate information about rates and changes in SE infection to implement and evaluate evidence-based control policies and practices. METHODS: We analyzed the incidence of human SE infection during 1996-2009 in the Foodborne Diseases Active Surveillance Network (FoodNet), an active, population-based surveillance system for laboratory-confirmed infections. We compared FoodNet incidence with passively collected data from complementary surveillance systems and with rates of SE isolation from processed chickens and egg products; shell eggs are not routinely tested. We also compared molecular subtyping patterns of SE isolated from humans and chickens. RESULTS: Since the period 1996-1999, the incidence of human SE infection in FoodNet has increased by 44%. This change is mirrored in passive national surveillance data. The greatest relative increases were in young children, older adults, and FoodNet sites in the southern United States. The proportion of patients with SE infection who reported recent international travel has decreased in recent years, whereas the proportion of chickens from which SE was isolated has increased. Similar molecular subtypes of SE are commonly isolated from humans and chickens. CONCLUSIONS: Most SE infections in the United States are acquired from domestic sources, and the problem is growing. Chicken and eggs are likely major sources of SE. Continued close attention to surveillance data is needed to monitor the impact of recent regulatory control measures. |
Sex-based differences in food consumption: Foodborne Diseases Active Surveillance Network (FoodNet) Population Survey, 2006-2007
Shiferaw B , Verrill L , Booth H , Zansky SM , Norton DM , Crim S , Henao OL . Clin Infect Dis 2012 54 Suppl 5 S453-7 BACKGROUND: This analysis used data from the most recent Foodborne Diseases Active Surveillance Network (FoodNet) Population Survey (May 2006 through April 2007) to examine differences in the consumption of various types of foods between men and women. METHODS: Participants were surveyed by telephone and asked whether or not they had consumed certain foods in the past 7 days, including the following "high-risk" foods commonly associated with foodborne illness: pink hamburger, raw oysters, unpasteurized milk, cheese made from unpasteurized milk, runny eggs, and alfalfa sprouts. Data were weighted to adjust for survey design and to reflect the age and sex distribution of the population under FoodNet surveillance. RESULTS: A total of 14,878 persons ≥18 years were interviewed, of whom 5688 (38%) were men. A higher proportion of men reported eating meat and certain types of poultry than women, whereas a higher proportion of women ate fruits and vegetables. A higher proportion of men than women reported consuming runny eggs (12% versus 8%), pink hamburger (7% versus 4%), and raw oysters (2% versus 0.4%). A higher proportion of women than men ate alfalfa sprouts (3% versus 2%). No differences by sex were observed for consumption of unpasteurized milk or cheese. CONCLUSIONS: Data from the FoodNet Population Surveys can be useful in efforts to design targeted interventions regarding consumption of high-risk foods. Moreover, understanding the background rates of food consumption, stratified by sex, may help investigators identify the kinds of foods likely to be associated with outbreaks in which a preponderance of cases occur among members of one sex. |
Strategies for surveillance of pediatric hemolytic uremic syndrome: Foodborne Diseases Active Surveillance Network (FoodNet), 2000-2007
Ong KL , Apostal M , Comstock N , Hurd S , Webb TH , Mickelson S , Scheftel J , Smith G , Shiferaw B , Boothe E , Gould LH . Clin Infect Dis 2012 54 Suppl 5 S424-31 BACKGROUND: Postdiarrheal hemolytic uremic syndrome (HUS) is the most common cause of acute kidney failure among US children. The Foodborne Diseases Active Surveillance Network (FoodNet) conducts population-based surveillance of pediatric HUS to measure the incidence of disease and to validate surveillance trends in associated Shiga toxin-producing Escherichia coli (STEC) O157 infection. METHODS: We report the incidence of pediatric HUS, which is defined as HUS in children <18 years. We compare the results from provider-based surveillance and hospital discharge data review and examine the impact of different case definitions on the findings of the surveillance system. RESULTS: During 2000-2007, 627 pediatric HUS cases were reported. Fifty-two percent of cases were classified as confirmed (diarrhea, anemia, microangiopathic changes, low platelet count, and acute renal impairment). The average annual crude incidence rate for all reported cases of pediatric HUS was 0.78 per 100,000 children <18 years. Regardless of the case definition used, the year-to-year pattern of incidence appeared similar. More cases were captured by provider-based surveillance (76%) than by hospital discharge data review (68%); only 49% were identified by both methods. CONCLUSIONS: The overall incidence of pediatric HUS was affected by key characteristics of the surveillance system, including the method of ascertainment and the case definitions. However, year-to-year patterns were similar for all methods examined, suggesting that several approaches to HUS surveillance can be used to track trends. |
Travel-associated enteric infections diagnosed after return to the United States, Foodborne Diseases Active Surveillance Network (FoodNet), 2004-2009
Kendall ME , Crim S , Fullerton K , Han PV , Cronquist AB , Shiferaw B , Ingram LA , Rounds J , Mintz ED , Mahon BE . Clin Infect Dis 2012 54 Suppl 5 S480-7 BACKGROUND: Approximately 40% of US travelers to less developed countries experience diarrheal illness. Using data from the Foodborne Diseases Active Surveillance Network (FoodNet), we describe travel-associated enteric infections during 2004-2009, characterizing the patients, pathogens, and destinations involved. METHODS: FoodNet conducts active surveillance at 10 US sites for laboratory-confirmed infections with 9 pathogens transmitted commonly through food. Travel-associated infections are infections diagnosed in the United States but likely acquired abroad based on a pathogen-specific time window between return from international travel to diagnosis. We compare the demographic, clinical, and exposure-related characteristics of travelers with those of nontravelers and estimate the risk of travel-associated infections by destination, using US Department of Commerce data. RESULTS: Of 64,039 enteric infections reported to FoodNet with information about travel, 8270 (13%) were travel associated. The pathogens identified most commonly in travelers were Campylobacter (42%), nontyphoidal Salmonella (32%), and Shigella (13%). The most common travel destinations were Mexico, India, Peru, Dominican Republic, and Jamaica. Most travel-associated infections occurred in travelers returning from Latin America and the Caribbean (LAC). Risk was greatest after travel to Africa (75.9 cases per 100,000 population), followed by Asia (22.7 cases per 100,000), and LAC (20.0 cases per 100,000). CONCLUSIONS: The Latin America and Caribbean region accounts for most travel-associated enteric infections diagnosed in the United States, although travel to Africa carries the greatest risk. 
Although FoodNet surveillance does not cover enterotoxigenic Escherichia coli, a common travel-associated infection, this information about other key enteric pathogens can be used by travelers and clinicians in pre- and posttravel consultations. |
Validating deaths reported in the Foodborne Diseases Active Surveillance Network (FoodNet): are all deaths being captured?
Manikonda K , Palmer A , Wymore K , McMillian M , Nicholson C , Hurd S , Hoefer D , Tobin-D'Angelo M , Cosgrove S , Lyons C , Lathrop S , Hedican E , Patrick M . Clin Infect Dis 2012 54 Suppl 5 S421-3 Accurate information about deaths is important when determining the human health and economic burden of foodborne diseases. We reviewed death certificate data to assess the accuracy of deaths reported to the Foodborne Diseases Active Surveillance Network (FoodNet). Data were highly accurate, and few deaths were missed through active surveillance. |
Population-based active surveillance for Cyclospora infection--United States, Foodborne Diseases Active Surveillance Network (FoodNet), 1997-2009
Hall RL , Jones JL , Hurd S , Smith G , Mahon BE , Herwaldt BL . Clin Infect Dis 2012 54 Suppl 5 S411-7 BACKGROUND: Cyclosporiasis is an enteric disease caused by the parasite Cyclospora cayetanensis. Since the mid-1990s, the Centers for Disease Control and Prevention has been notified of cases through various reporting and surveillance mechanisms. METHODS: We summarized data regarding laboratory-confirmed cases of Cyclospora infection reported during 1997-2009 via the Foodborne Diseases Active Surveillance Network (FoodNet), which gradually expanded to include 10 sites (Connecticut, Georgia, Maryland, Minnesota, New Mexico, Oregon, Tennessee, and selected counties in California, Colorado, and New York) that represent approximately 15% of the US population. Since 2004, the number of sites has remained constant and data on the international travel history and outbreak status of cases have been collected. RESULTS: A total of 370 cases were reported, 70.3% (260) of which were in residents of Connecticut (134 [36.2%]) and Georgia (126 [34.1%]), which on average during this 13-year period accounted for 29.0% of the total FoodNet population under surveillance. Positive stool specimens were collected in all months of the year, with a peak in June and July (208 cases [56.2%]). Approximately half (48.6%) of the 185 cases reported during 2004-2009 were associated with international travel, known outbreaks, or both. CONCLUSIONS: The reported cases were concentrated in time (spring and summer) and place (2 of 10 sites). The extent to which the geographic concentration reflects higher rates of testing, more sensitive testing methods, or higher exposure/infection rates is unknown. Clinicians should include Cyclospora infection in the differential diagnosis of prolonged or relapsing diarrheal illness and explicitly request stool examinations for this parasite. |
Primary care-based interventions are associated with increases in hepatitis C virus testing for patients at risk
Litwin AH , Smith BD , Drainoni ML , McKee D , Gifford AL , Koppelman E , Christiansen CL , Weinbaum CM , Southern WN . Dig Liver Dis 2012 44 (6) 497-503 BACKGROUND: An estimated 3.2 million persons are chronically infected with the hepatitis C virus (HCV) in the U.S. Effective treatment is available, but approximately 50% of patients are not aware that they are infected. Optimal testing strategies have not been described. METHODS: The Hepatitis C Assessment and Testing Project (HepCAT) was a serial cross-sectional evaluation of two community-based interventions designed to increase HCV testing in urban primary care clinics in comparison with a baseline period. The first intervention (risk-based screener) prompted physicians to order HCV tests based on the presence of HCV-related risks. The second intervention (birth cohort) prompted physicians to order HCV tests on all patients born within a high-prevalence birth cohort (1945-1964). The study was conducted at three primary care clinics in the Bronx, New York. RESULTS: Both interventions were associated with an increased proportion of patients tested for HCV from 6.0% at baseline to 13.1% during the risk-based screener period (P<0.001) and 9.9% during the birth cohort period (P<0.001). CONCLUSIONS: Two simple clinical reminder interventions were associated with significantly increased HCV testing rates. Our findings suggest that HCV screening programs, using either a risk-based or birth cohort strategy, should be adopted in primary care settings so that HCV-infected patients may benefit from antiviral treatment. |
Estimates of enteric illness attributable to contact with animals and their environments in the United States
Hale CR , Scallan E , Cronquist AB , Dunn J , Smith K , Robinson T , Lathrop S , Tobin-D'Angelo M , Clogher P . Clin Infect Dis 2012 54 Suppl 5 S472-9 BACKGROUND: Contact with animals and their environment is an important, and often preventable, route of transmission for enteric pathogens. This study estimated the annual burden of illness attributable to animal contact for 7 groups of pathogens: Campylobacter species, Cryptosporidium species, Shiga toxin-producing Escherichia coli (STEC) O157, STEC non-O157, Listeria monocytogenes, nontyphoidal Salmonella species, and Yersinia enterocolitica. METHODS: By using data from the US Foodborne Diseases Active Surveillance Network and other sources, we estimated the proportion of illnesses attributable to animal contact for each pathogen and applied those proportions to the estimated annual number of illnesses, hospitalizations, and deaths among US residents. We established credible intervals (CrIs) for each estimate. RESULTS: We estimated that 14% of all illnesses caused by these 7 groups of pathogens were attributable to animal contact. This estimate translates to 445,213 (90% CrI, 234,197-774,839) illnesses annually for the 7 groups combined. Campylobacter species caused an estimated 187,481 illnesses annually (90% CrI, 66,259-372,359), followed by nontyphoidal Salmonella species (127,155; 90% CrI, 66,502-219,886) and Cryptosporidium species (113,344; 90% CrI, 22,570-299,243). Of an estimated 4933 hospitalizations (90% CrI, 2704-7914), the majority were attributable to nontyphoidal Salmonella (48%), Campylobacter (38%), and Cryptosporidium (8%) species. Nontyphoidal Salmonella (62%), Campylobacter (22%), and Cryptosporidium (9%) were also responsible for the majority of the estimated 76 deaths (90% CrI, 5-211). CONCLUSIONS: Animal contact is an important transmission route for multiple major enteric pathogens. 
Continued efforts are needed to prevent pathogen transmission from animals to humans, including increasing awareness and encouraging hand hygiene. |
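The attribution arithmetic described in the methods above can be sketched in a few lines: an attributable proportion is applied to an estimated annual illness count. This is not the authors' code, and the credible-interval machinery is omitted; the 14% and 3.2 million figures below are round illustrative inputs, not the study's exact totals.

```python
# Illustrative sketch of attributable-burden arithmetic: an attributable
# proportion applied to an estimated annual illness count. Credible
# intervals, which the study derives separately, are omitted; the inputs
# below are hypothetical round numbers.
def attributable_illnesses(total_illnesses, attributable_fraction):
    """Estimated illnesses attributable to one transmission route."""
    return total_illnesses * attributable_fraction

estimate = attributable_illnesses(3_200_000, 0.14)  # ~448,000 illnesses
```

In the study this step is repeated per pathogen group and propagated through uncertainty distributions to produce the reported credible intervals.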
Impacts of culture-independent diagnostic practices on public health surveillance for bacterial enteric pathogens
Cronquist AB , Mody RK , Atkinson R , Besser J , D'Angelo MT , Hurd S , Robinson T , Nicholson C , Mahon BE . Clin Infect Dis 2012 54 Suppl 5 S432-9 For decades, culture has been the mainstay of diagnostic testing for bacterial enteric pathogens. This paradigm is changing as clinical laboratories adopt culture-independent methods, such as antigen-based tests and nucleic acid-based assays. Public health surveillance for enteric infections addresses 4 interrelated but distinct objectives: case investigation for localized disease control; assessment of disease burden and trends to prioritize and assess impact of population-based control measures; outbreak detection; and microbiologic characterization to improve understanding of pathogens, their virulence mechanisms, and epidemiology. We summarize the challenges and opportunities that culture-independent tests present and suggest strategies, such as validation studies and development of culture-independent tests compatible with subtyping, that could be adopted to ensure that surveillance remains robust. Many of these approaches will require time and resources to implement, but they will be necessary to maintain a strong surveillance system. Public health practitioners must clearly explain the value of surveillance, especially how outbreak detection benefits the public, and collaborate with all stakeholders to develop solutions. |
Increasing rates of vibriosis in the United States, 1996-2010: review of surveillance data from 2 systems
Newton A , Kendall M , Vugia DJ , Henao OL , Mahon BE . Clin Infect Dis 2012 54 Suppl 5 S391-5 BACKGROUND: The Centers for Disease Control and Prevention monitors vibriosis through 2 surveillance systems: the nationwide Cholera and Other Vibrio Illness Surveillance (COVIS) system and the 10-state Foodborne Diseases Active Surveillance Network (FoodNet). COVIS conducts passive surveillance and FoodNet conducts active surveillance for laboratory-confirmed Vibrio infections. METHODS: We summarized Vibrio infections (excluding toxigenic V. cholerae O1 and O139) reported to COVIS and FoodNet from 1996 through 2010. For each system, we calculated incidence rates using US Census Bureau population estimates for the surveillance area. RESULTS: From 1996 to 2010, 7700 cases of vibriosis were reported to COVIS and 1519 to FoodNet. Annual incidence of reported vibriosis per 100,000 population increased from 1996 to 2010 in both systems, from 0.09 to 0.28 in COVIS and from 0.15 to 0.42 in FoodNet. The 3 commonly reported Vibrio species were V. parahaemolyticus, V. vulnificus, and V. alginolyticus; both surveillance systems showed that the incidence of each increased. In both systems, most hospitalizations and deaths were caused by V. vulnificus infection, and most patients were white men. The number of cases peaked in the summer months. CONCLUSIONS: Surveillance data from both COVIS and FoodNet indicate that the incidence of vibriosis increased from 1996 to 2010 overall and for each of the 3 most commonly reported species. Epidemiologic patterns were similar in both systems. Current prevention efforts have failed to prevent increasing rates of vibriosis; more effective efforts will be needed to decrease rates. |
Invasive listeriosis in the Foodborne Diseases Active Surveillance Network (FoodNet), 2004-2009: further targeted prevention needed for higher-risk groups
Silk BJ , Date KA , Jackson KA , Pouillot R , Holt KG , Graves LM , Ong KL , Hurd S , Meyer R , Marcus R , Shiferaw B , Norton DM , Medus C , Zansky SM , Cronquist AB , Henao OL , Jones TF , Vugia DJ , Farley MM , Mahon BE . Clin Infect Dis 2012 54 Suppl 5 S396-404 BACKGROUND: Listeriosis can cause severe disease, especially in fetuses, neonates, older adults, and persons with certain immunocompromising and chronic conditions. We summarize US population-based surveillance data for invasive listeriosis from 2004 through 2009. METHODS: We analyzed Foodborne Diseases Active Surveillance Network (FoodNet) data for patients with Listeria monocytogenes isolated from normally sterile sites. We describe the epidemiology of listeriosis, estimate overall and specific incidence rates, and compare pregnancy-associated and nonpregnancy-associated listeriosis by age and ethnicity. RESULTS: A total of 762 listeriosis cases were identified during the 6-year reporting period, including 126 pregnancy-associated cases (17%), 234 nonpregnancy-associated cases (31%) in patients aged <65 years, and 400 nonpregnancy-associated cases (53%) in patients aged ≥65 years. Eighteen percent of all cases were fatal. Meningitis was diagnosed in 44% of neonates. For 2004-2009, the overall annual incidence of listeriosis varied from 0.25 to 0.32 cases per 100,000 population. Among Hispanic women, the crude incidence of pregnancy-associated listeriosis increased from 5.09 to 12.37 cases per 100,000 for the periods of 2004-2006 and 2007-2009, respectively; among non-Hispanic women, pregnancy-associated listeriosis increased from 1.74 to 2.80 cases per 100,000 for the same periods. Incidence rates of nonpregnancy-associated listeriosis in patients aged ≥65 years were 4-5 times greater than overall rates annually. CONCLUSIONS: Overall listeriosis incidence did not change significantly from 2004 through 2009. 
Further targeted prevention is needed, including food safety education and messaging (eg, avoiding Mexican-style cheese during pregnancy). Effective prevention among pregnant women, especially Hispanics, and older adults would substantially affect overall rates. |
Antimicrobial susceptibility patterns of Shigella isolates in Foodborne Diseases Active Surveillance Network (FoodNet) sites, 2000-2010
Shiferaw B , Solghan S , Palmer A , Joyce K , Barzilay EJ , Krueger A , Cieslak P . Clin Infect Dis 2012 54 Suppl 5 S458-63 BACKGROUND: Treatment of shigellosis with appropriate antimicrobial agents shortens duration of illness and bacterial shedding, but resistance to commonly used agents is increasing. METHODS: We describe resistance patterns among Shigella isolates in the United States with use of linked data from the Foodborne Diseases Active Surveillance Network (FoodNet) and National Antimicrobial Resistance Monitoring System (NARMS). FoodNet sites send every 20th Shigella isolate to the NARMS laboratory for susceptibility testing. RESULTS: During 2000-2010, the NARMS laboratory tested 1376 Shigella isolates from FoodNet sites. Of 1118 isolates (81%) linked to FoodNet, 826 (74%) were resistant to ampicillin, 649 (58%) to streptomycin, 402 (36%) to trimethoprim-sulfamethoxazole (TMP-SMX), 355 (32%) to sulfamethoxazole-sulfisoxazole, 312 (28%) to tetracycline, 19 (2%) to nalidixic acid, and 6 (0.5%) to ciprofloxacin. The proportion of Shigella isolates with resistance to TMP-SMX was 40% among white persons, 58% among Hispanic persons, and 75% among persons with a history of international travel. Resistance to at least TMP-SMX and ampicillin was present in 25% of isolates, and 5% were resistant to ampicillin, TMP-SMX, and chloramphenicol. Overall, 5% of isolates showed multidrug resistance to ampicillin, chloramphenicol, streptomycin, sulfamethoxazole-sulfisoxazole, and tetracycline, including 49 Shigella flexneri (33%) and 3 Shigella sonnei (0.3%) isolates. Male individuals were more likely than female individuals to be infected with a multidrug-resistant strain (7% versus 3%; P < .01). CONCLUSIONS: Antimicrobial resistance differed by race, ethnicity, age, travel, and species. Resistance to commonly used antibiotics is high; therefore, susceptibility patterns should be reviewed before starting treatment. |
Calculating a measure of overall change in the incidence of selected laboratory-confirmed infections with pathogens transmitted commonly through food in the Foodborne Diseases Active Surveillance Network (FoodNet), 1996-2010
Henao OL , Crim SM , Hoekstra RM . Clin Infect Dis 2012 54 Suppl 5 S418-20 To measure overall change in the incidence of illness, we combined data for infections caused by 6 bacterial pathogens monitored by the Foodborne Diseases Active Surveillance Network for which >50% of illnesses are estimated to be transmitted by food. The overall incidence for these pathogens was 23% lower in 2010 than during the period 1996-1998. This estimate provides a summary of changes in incidence of infection for these pathogens. |
Changing epidemiology of Yersinia enterocolitica infections: markedly decreased rates in young black children, Foodborne Diseases Active Surveillance Network (FoodNet), 1996-2009
Ong KL , Gould LH , Chen DL , Jones TF , Scheftel J , Webb TH , Mody RK , Mahon BE . Clin Infect Dis 2012 54 Suppl 5 S385-90 BACKGROUND: Yersinia enterocolitica causes an estimated 116,716 illnesses annually in the United States. Black children have historically had the highest rates of infection, with incidence peaking in the winter. METHODS: The Foodborne Diseases Active Surveillance Network (FoodNet) conducts active surveillance for laboratory-confirmed Y. enterocolitica infections, defined as the isolation of Y. enterocolitica or unspeciated Yersinia from a human clinical specimen. We calculated the average annual crude incidence rate per 100,000 persons from 1996 through 2009 and described rates by age, race, and geographic site. To account for changes in the FoodNet catchment area, we used a negative binomial model to estimate statistical changes in incidence using the average annual incidence in 1996-1998 as the baseline. RESULTS: From 1996 through 2009, 2085 Y. enterocolitica infections were reported to FoodNet. The average annual crude incidence was 0.5 per 100,000 persons and was highest in blacks (0.9 per 100,000 persons). Over time, the rate in blacks declined from 3.9 to 0.4 per 100,000 persons. Declines among other racial groups were not as pronounced. The largest decline occurred in black children <5 years old (from 41.5 per 100,000 persons in 1996 to 3.5 per 100,000 persons in 2009). From 2007 through 2009, the highest rate of infection was in Asian children (5.1 per 100,000 persons). Compared with 1996-1998, the incidence in 2009 was 66% (95% confidence interval, 51%-77%) lower among children <5 years old. CONCLUSIONS: Y. enterocolitica infections in FoodNet sites have significantly declined since 1996. These declines were greatest in young black children, the group that initially had the highest incidence, possibly as the result of educational efforts in Georgia. |
Characteristics of foodborne disease outbreak investigations conducted by Foodborne Diseases Active Surveillance Network (FoodNet) sites, 2003-2008
Murphree R , Garman K , Phan Q , Everstine K , Gould LH , Jones TF . Clin Infect Dis 2012 54 Suppl 5 S498-503 BACKGROUND: A mean of ≥1000 foodborne disease outbreaks (FBDOs) causing ≥20,000 illnesses are reported to the Centers for Disease Control and Prevention (CDC) annually. We evaluated characteristics of successful outbreak investigations (ie, those that identified an etiologic agent or food vehicle) in the Foodborne Diseases Active Surveillance Network (FoodNet). METHODS: FBDOs were defined as the occurrence of ≥2 cases of a similar illness resulting from ingestion of a common food. FBDOs reported to CDC Foodborne Disease Outbreak Surveillance System during 2003-2008 with FoodNet supplemental data available were included in the analyses. RESULTS: Data regarding 1200 FBDOs were available. An etiologic agent was confirmed in 715 (60%); a food vehicle was identified in 387 (32%). At least 4 fecal specimens were collected in 425 of 639 outbreaks (67%) with a confirmed etiologic agent and 48 of 232 (21%) without a confirmed etiologic agent (odds ratio [OR], 7.6; 95% confidence interval [CI], 5.3-10.9). A food vehicle was identified in 314 (47%) of 671 outbreaks investigated using a case-control or cohort study, compared with only 73 (14%) of 529 outbreaks investigated by using other methods (OR, 5.5; 95% CI, 4.1-7.3). At least 1 barrier affecting the success of the investigation was reported for 655 outbreaks, including too few patients (n = 172; 26%), too few stool specimens (n = 167; 25%), and too few control subjects (n = 152; 23%). CONCLUSIONS: Etiologic agent and vehicle are frequently undetermined in FBDOs. Greater emphasis on fecal specimen collection and overcoming barriers to pursuing analytic epidemiologic studies can improve ascertainment of these factors. |
Clinical laboratory practices for the isolation and identification of Campylobacter in Foodborne Diseases Active Surveillance Network (FoodNet) sites: baseline information for understanding changes in surveillance data
Hurd S , Patrick M , Hatch J , Clogher P , Wymore K , Cronquist AB , Segler S , Robinson T , Hanna S , Smith G , Fitzgerald C . Clin Infect Dis 2012 54 Suppl 5 S440-5 BACKGROUND: Campylobacter is a leading cause of foodborne illness in the United States. Understanding laboratory practices is essential to interpreting incidence and trends in reported campylobacteriosis over time and provides a baseline for evaluating the increasing use of culture-independent diagnostic methods for Campylobacter infection. METHODS: The Foodborne Diseases Active Surveillance Network (FoodNet) conducts surveillance for laboratory-confirmed Campylobacter infections. In 2005, FoodNet conducted a survey of clinical laboratories to describe routine practices used for isolation and identification of Campylobacter. A profile was assigned to laboratories based on complete responses to key survey questions that could impact the recovery and isolation of Campylobacter from stool specimens. RESULTS: Of 411 laboratories testing on-site for Campylobacter, 97% used only culture methods. Among those responding to the individual questions, nearly all used transport medium (97%) and incubated at 42 degrees C (94%); however, most deviated from existing guidelines in other areas: 68% held specimens in transport medium at room temperature before plating, 51% used Campy blood agar plate medium, 52% read plates at <72 hours of incubation, and 14% batched plates before placing them in a microaerobic environment. In all, there were 106 testing algorithms among 214 laboratories with a complete profile; only 16 laboratories were fully adherent to existing guidelines. CONCLUSIONS: Although most laboratories used culture-based methods, procedures differed widely and most did not adhere to existing guidelines, likely resulting in underdiagnosis. 
Given the availability of new culture-independent testing methods, these data highlight a clear need to develop best practice recommendations for Campylobacter infection diagnostic testing. |
Do differences in risk factors, medical care seeking, or medical practices explain the geographic variation in campylobacteriosis in Foodborne Diseases Active Surveillance Network (FoodNet) sites?
Ailes E , Scallan E , Berkelman RL , Kleinbaum DG , Tauxe RV , Moe CL . Clin Infect Dis 2012 54 Suppl 5 S464-71 BACKGROUND: In the United States, considerable geographic variation in the rates of culture-confirmed Campylobacter infection has been consistently observed among sites participating in the Foodborne Diseases Active Surveillance Network (FoodNet). METHODS: We used data from the FoodNet Population Surveys and a FoodNet case-control study of sporadic infection to examine whether differences in medical care seeking, medical practices, or risk factors contributed to geographic variation in incidence. RESULTS: We found differences across the FoodNet sites in the proportion of persons seeking medical care for an acute campylobacteriosis-like illness (range, 24.9%-43.5%) and in the proportion of ill persons who submitted a stool sample (range, 18.6%-40.7%), but these differences were not statistically significant. We found no evidence of geographic effect modification of previously identified risk factors for campylobacteriosis in the case-control study analysis. The prevalence of some exposures varied among control subjects in the FoodNet sites, including the proportion of controls reporting eating chicken at a commercial eating establishment (18.2%-46.1%); contact with animal stool (8.9%-30.9%); drinking water from a lake, river, or stream (0%-5.1%); and contact with a farm animal (2.1%-12.7%). However, these differences do not fully explain the geographic variation in campylobacteriosis. CONCLUSIONS: Future studies that quantify Campylobacter contamination in poultry or variation in host immunity may be useful in identifying sources of this geographic variation in incidence. |
Evidence of purifying selection on merozoite surface protein 8 (MSP8) and 10 (MSP10) in Plasmodium spp.
Pacheco MA , Elango AP , Rahman AA , Fisher D , Collins WE , Barnwell JW , Escalante AA . Infect Genet Evol 2012 12 (5) 978-86 Evidence for natural selection, positive or negative, on genes encoding antigens may indicate variation or functional constraints that are immunologically relevant. Most malaria surface antigens with high genetic diversity have been reported to be under positive-diversifying selection. However, antigens with limited genetic variation are usually ignored in terms of the role that natural selection may have in generating such patterns. We investigated orthologous genes encoding two merozoite proteins, MSP8 and MSP10, among several mammalian Plasmodium spp. These antigens, together with MSP1, are among the few MSPs that have two epidermal growth factor-like (EGF) domains at the C-terminal. These EGF domains are relatively conserved (low levels of genetic polymorphism) and have been proposed to act as ligands during the invasion of RBCs. We used several evolutionary genetic methods to detect patterns consistent with natural selection acting on MSP8 and MSP10 orthologs in the human parasites Plasmodium falciparum and P. vivax, as well as closely related malarial species found in non-human primates (NHPs). Overall, these antigens have low polymorphism in the human parasites in comparison with the orthologs from other Plasmodium spp. We found that the MSP10 gene polymorphism in P. falciparum harbors only non-synonymous substitutions, a pattern consistent with a gene under positive selection. Evidence of purifying selection was found in the polymorphism observed in both orthologs from P. cynomolgi, a non-human primate parasite closely related to P. vivax, but it was not conclusive in the human parasite. Yet, using phylogeny-based approaches, we found evidence for purifying selection on both MSP8 and MSP10 in the lineage leading to P. vivax. Such antigens evolving under strong functional constraints could become valuable vaccine candidates. 
We discuss how comparative approaches could allow detecting patterns consistent with negative selection even when there is low polymorphism in the extant populations. |
Convergence and coevolution of hepatitis B virus drug resistance.
Thai H , Campo DS , Lara J , Dimitrova Z , Ramachandran S , Xia G , Ganova-Raeva L , Teo CG , Lok A , Khudyakov Y . Nat Commun 2012 3 789 Treatment with lamivudine of patients infected with hepatitis B virus (HBV) results in a high rate of drug resistance, which is primarily associated with the rtM204I/V substitution in the HBV reverse transcriptase domain. Here we show that the rtM204I/V substitution, although essential, is insufficient for establishing resistance against lamivudine. The analysis of 639 HBV whole-genome sequences obtained from 11 patients shows that rtM204I/V is independently acquired by more than one intra-host HBV variant, indicating the convergent nature of lamivudine resistance. The differential capacity of HBV variants to develop drug resistance suggests that fitness effects of drug-resistance mutations depend on the genetic structure of the HBV genome. An analysis of Bayesian networks that connect rtM204I/V to many sites of HBV proteins confirms that lamivudine resistance is a complex trait encoded by the entire HBV genome rather than by a single mutation. These findings have implications for public health and offer a more general framework for understanding drug resistance. |
Ready, set, go: African American preadolescents' sexual thoughts, intentions, and behaviors
Miller KS , Fasula AM , Lin CY , Levin ML , Wyckoff SC , Forehand R . J Early Adolesc 2012 32 (2) 293-307 Understanding of preadolescent sexuality is limited. To help fill this gap, we calculated frequencies, percentages, and confidence intervals for 1,096 preadolescents' reports of sexual thoughts, intentions, and sexual behavior. Cochran-Armitage trend tests accounted for age effects. Findings show that 9-year-olds are readying for sexual activity, with sexual readiness increasing between ages of 9 and 12. Sexual thoughts increased with age (p < .001): 46% of 9-year-olds and 70% of 12-year-olds were ready to learn about sex, and 14% of 9-year-olds and 41% of 12-year-olds thought about having sex. Few 9-year-olds anticipated sexual debut, but this increased with age (p < .05): 25% of 12-year-olds were ready for sex, and 20% anticipated initiating sex within a year. Our results indicate that preadolescents are initiating dating relationships and anticipating intercourse, and some have engaged in risk behaviors. Thus preadolescence is a critical time to implement prevention programs. |
Results from two online surveys comparing sexual risk behaviors in Hispanic, black, and white men who have sex with men
Taylor BS , Chiasson MA , Scheinmann R , Hirshfield S , Humberstone M , Remien RH , Wolitski RJ , Wong T . AIDS Behav 2012 16 (3) 644-52 Many men who have sex with men (MSM) are among those who increasingly use the internet to find sexual partners. Few studies have compared behavior by race/ethnicity in internet-based samples of MSM. We examined the association of race/ethnicity with HIV risk-related behavior among 10,979 Hispanic, black, and white MSM recruited online. Significant variations by race/ethnicity were found in: age, income level, sexual orientation, number of lifetime male and female sexual partners, and rates of unprotected anal intercourse (UAI). Black and Hispanic men were more likely to report anal intercourse during the last sexual encounter, but white men were more likely to report UAI. In multivariate analysis, UAI was associated with HIV infection and sex with a main partner. Significant risk behavior variations by race/ethnicity were found. Research is needed to better target online interventions to MSM who engage in UAI or have other risk factors for transmitting or acquiring HIV. |
Routine brief risk-reduction counseling with biannual STD testing reduces STD incidence among HIV-infected men who have sex with men in care
Patel P , Bush T , Mayer K , Milam J , Richardson J , Hammer J , Henry K , Overton T , Conley L , Marks G , Brooks JT . Sex Transm Dis 2012 39 (6) 470-474 BACKGROUND: We evaluated whether routine biannual sexually transmitted disease (STD) testing coupled with brief risk-reduction counseling reduces STD incidence and high-risk behaviors. METHODS: The SUN study is a prospective observational HIV cohort study conducted in 4 US cities. At enrollment and every 6 months thereafter, participants completed a behavioral survey and were screened for STDs, and if diagnosed, were treated. Medical providers conducted brief risk-reduction counseling with all patients. Among men who have sex with men (MSM), we examined trends in STD incidence and rates of self-reported risk behaviors before and after exposure to the risk-reduction intervention. The "preintervention" visit was the study visit that was at least 6 months after enrollment STD screening and treatment and at which the participant was first exposed to the intervention. The "postintervention" visit was 12 months later. RESULTS: Among 216 MSM with complete STD and behavioral data, median age was 44.5 years; 77% were non-Hispanic white; 83% were on highly active antiretroviral treatment; 84% had an HIV RNA level <400 copies/mL, and the median CD4 (cluster of differentiation 4) count was 511 cells/mm³. Twelve months after first exposure to the risk-reduction intervention, STD incidence declined from 8.8% to 4.2% (P = 0.041). Rates of unprotected receptive or insertive anal intercourse with HIV-positive partners increased (19% to 25%, P = 0.024), but did not change with HIV-negative partners or partners of unknown HIV status (24% to 22%, P = 0.590). CONCLUSIONS: STD incidence declined significantly among HIV-infected MSM after implementing frequent, routine STD testing coupled with risk-reduction counseling. These findings support adoption of routine STD screening and risk-reduction counseling for HIV-infected MSM. |
Social network predictors of disclosure of MSM behavior and HIV-positive serostatus among African American MSM in Baltimore, Maryland
Latkin C , Yang C , Tobin K , Roebuck G , Spikes P , Patterson J . AIDS Behav 2012 16 (3) 535-42 This study examined correlates of disclosure of MSM behavior and seropositive HIV status to social network members among 187 African American MSM in Baltimore, MD. Of the participants, 49.7% were HIV-positive; 64% of their social network members (excluding male sex partners) were aware of their MSM behavior, and 71.3% were aware of their HIV-positive status. Disclosure of MSM behavior to network members was more frequent among participants who were younger, had a higher level of education, and were HIV-positive. Attributes of the social network members associated with MSM disclosure included the network member being HIV-positive, providing emotional support, socializing with the participant, and not being a female sex partner. Participants who were younger were more likely to disclose their positive HIV status. Attributes of social network members associated with disclosure of positive serostatus included the network member being older, HIV-positive, providing emotional support, loaning money, and not being a male sex partner. |
Having supportive social relationships is associated with reduced risk of unrecognized HIV infection among black and Latino men who have sex with men
Lauby JL , Marks G , Bingham T , Liu KL , Liau A , Stueve A , Millett GA . AIDS Behav 2012 16 (3) 508-15 We examined the hypothesis that black and Latino men who have sex with men (MSM) who have supportive social relationships with other people are less likely to have unrecognized HIV infection compared with MSM of color who report lower levels of social support. We interviewed 1286 black and Latino MSM without known HIV infection in three metropolitan areas who were recruited using respondent driven sampling. Participants completed a computer-administered questionnaire and were tested for HIV. Unrecognized HIV infection was found in 118 men (9.2%). MSM who scored higher on the supportive relationship index had significantly lower odds of testing HIV-positive in the study. The mediation analysis identified two possible behavioral pathways that may partially explain this association: men who had strong supportive relationships were more likely to have had a test for HIV infection in the past 2 years and less likely to have recently engaged in high-risk sexual behavior. The findings illuminate the protective role of social relationships among MSM of color in our sample. |
HIV risk behavior among HIV-infected men who have sex with men in Bangkok, Thailand
Sirivongrangson P , Lolekha R , Charoenwatanachokchai A , Siangphoe U , Fox KK , Jirarojwattana N , Bollen L , Yenyarsan N , Lokpichat S , Suksripanich O , McConnell M . AIDS Behav 2012 16 (3) 618-25 We assessed prevalence of sexually transmitted infections (STIs), sexual risk behaviors, and factors associated with risk behaviors among HIV-infected MSM attending a public STI clinic serving MSM in Bangkok, Thailand. Between October 2005 and October 2007, 154 HIV-infected MSM attending the clinic were interviewed about sexual risk behaviors and evaluated for STIs. Patients were examined for genital ulcers and had serologic testing for syphilis and PCR testing for chlamydia and gonorrhea. Results showed that sexual intercourse in the last 3 months was reported by 131 men. Of these, 32% reported anal sex without a condom. STIs were diagnosed in 41%. Factors associated with having sex without a condom were having a steady male partner, having a female partner, and awareness of HIV status for <1 month. Sexual risk behaviors and STIs were common among HIV-infected MSM in this study. This highlights the need for increased HIV prevention strategies for HIV-infected MSM. |
Predicting response to reassurances and uncertainties in bioterrorism communications for urban populations in New York and California
Vaughan E , Tinker TL , Truman BI , Edelson P , Morse SS . Biosecur Bioterror 2012 10 (2) 188-202 Recent national plans for recovery from bioterrorism acts perpetrated in densely populated urban areas acknowledge the formidable technical and social challenges of consequence management. Effective risk and crisis communication is one priority to strengthen the U.S. response and resilience. However, several notable risk events since September 11, 2001, have revealed vulnerabilities in the risk/crisis communication strategies and infrastructure of agencies responsible for protecting civilian populations. During recovery from a significant biocontamination event, 2 goals are essential: (1) effective communication of changing risk circumstances and uncertainties related to cleanup, restoration, and reoccupancy; and (2) adequate responsiveness to emerging information needs and priorities of diverse populations in high-threat, vulnerable locations. This telephone survey study explored predictors of public reactions to uncertainty communications and reassurances from leaders related to the remediation stage of an urban-based bioterrorism incident. African American and Hispanic adults (N=320) were randomly sampled from 2 ethnically and socioeconomically diverse geographic areas in New York and California assessed as high threat, high vulnerability for terrorism and other public health emergencies. Results suggest that considerable heterogeneity exists in risk perspectives and information needs within certain sociodemographic groups; that success of risk/crisis communication during recovery is likely to be uneven; that common assumptions about public responsiveness to particular risk communications need further consideration; and that communication effectiveness depends partly on preexisting values and risk perceptions and prior trust in leaders. 
Improvements in communication strategies are possible when agencies recognize where individuals start as a reference point for reasoning about risk information, and understand how this starting point influences subsequent interpretation of agencies' actions and communications. |
The development of an eHealth tool suite for prostate cancer patients and their partners
Van Bogaert D , Hawkins R , Pingree S , Jarrard D . J Support Oncol 2012 10 (5) 202-8 BACKGROUND: eHealth resources for people facing health crises must balance the expert knowledge and perspective of developers and clinicians against the very different needs and perspectives of prospective users. This formative study explores the information and support needs of posttreatment prostate cancer patients and their partners as a way to improve an existing eHealth information and support system called CHESS (Comprehensive Health Enhancement Support System). METHODS: Focus groups with patient survivors and their partners were used to identify information gaps and information-seeking milestones. RESULTS: Both patients and partners expressed a need for assistance in decision making, connecting with experienced patients, and making sexual adjustments. Female partners of patients are more active in searching for cancer information. All partners have information and support needs distinct from those of the patient. CONCLUSIONS: Findings were used to develop a series of interactive tools and navigational features for the CHESS prostate cancer computer-mediated system. |
Reactivation of hepatitis B during immunosuppressive therapy: potentially fatal yet preventable
Lok AS , Ward JW , Perrillo RP , McMahon BJ , Liang TJ . Ann Intern Med 2012 156 (10) 743-5 Reactivation of hepatitis B virus (HBV) replication, an abrupt increase or reappearance of serum HBV DNA in a patient with chronic or past HBV infection, is a known complication of immunosuppressive therapy. This condition can lead to hepatocellular injury, elevated alanine aminotransferase levels, symptoms of acute hepatitis, liver failure, and even death (1). Many physicians who regularly prescribe immunosuppressive therapy unfortunately do not recognize this potentially fatal condition. | Hepatitis B virus reactivation has been best studied in patients receiving chemotherapy for hematologic cancer, but it has also been reported during treatment of solid tumors (2). In addition, reactivation can occur in patients receiving antirejection treatment, long-term corticosteroid therapy, and tumor necrosis factor-α inhibitors (3, 4). Most cases of HBV reactivation occur in patients who are hepatitis B surface antigen (HBsAg)–positive, but it has also been reported in patients who are HBsAg-negative/hepatitis B core antibody (anti-HBc)–positive, particularly when rituximab is used (5). | A systematic review of 14 studies (including 2 randomized, controlled trials) evaluated 550 HBsAg-positive patients receiving cancer chemotherapy. In patients who did not receive prophylactic antiviral therapy, 36.8% had HBV reactivation, 33.4% had HBV-related hepatitis, 13% had liver failure, and 5.5% died (6). Prophylactic use of lamivudine decreased the risk for HBV reactivation and HBV-related hepatitis by 79% to 100%, and no cases of HBV-related liver failure occurred. Furthermore, patients who received prophylactic lamivudine had less interruption of chemotherapy and lower rates of cancer-related, as well as all-cause, mortality. |
Analysis of variola and vaccinia neutralization assays for smallpox vaccines
Hughes CM , Newman FK , Davidson WB , Olson VA , Smith SK , Holman RC , Yan L , Frey SE , Belshe RB , Karem KL , Damon IK . Clin Vaccine Immunol 2012 19 (7) 1116-8 Possible smallpox re-emergence drives research for third-generation vaccines that effectively neutralize variola virus. Comparison of neutralization assays using different substrates, variola and vaccinia (Dryvax and MVA), showed significantly different 90% neutralization titers; Dryvax underestimated while MVA overestimated variola neutralization. Third-generation vaccines may rely upon neutralization as a correlate of protection. |
Standardization of measurements of 25-hydroxyvitamin D3 and D2
Thienpont LM , Stepman HC , Vesper HW . Scand J Clin Lab Invest Suppl 2012 243 41-9 Vitamin D status is increasingly assessed and monitored in different populations, research cohorts, and individual patients. This is done by measuring the liver metabolites 25-hydroxyvitamin D3 and D2 as biomarkers. Recommendations have been made for using specific serum concentrations of these biomarkers to assess a person's vitamin D status. This requires current vitamin D assays to be sufficiently accurate over time, location, and laboratory procedures. Because several studies have demonstrated that current 25(OH)D measurement methods do not meet this prerequisite, standardization is needed. This paper reviews the basic concept of standardization, in particular as applied to measurements of 25-hydroxyvitamin D. Progress has been made by establishing a reference measurement system consisting of reference methods and reference materials. Coordinated efforts to improve accuracy and standardize measurements are being undertaken by organizations such as the U.S. National Institutes of Health (NIH), the Centers for Disease Control and Prevention (CDC), and the National Institute of Standards and Technology (NIST), together with their national and international partners. Beyond describing the available reference measurement system and its use as a calibration hierarchy to establish traceability of routine laboratory measurements to the SI unit, this report also focuses on other aspects considered essential for successful and sustainable standardization, such as analytical issues related to the definition of the measurand and analytical performance goals. |
Opportunities and challenges for cost-efficient implementation of new point-of-care diagnostics for HIV and tuberculosis
Schito M , Peter TF , Cavanaugh S , Piatek AS , Young GJ , Alexander H , Coggin W , Domingo GJ , Ellenberger D , Ermantraut E , Jani IV , Katamba A , Palamountain KM , Essajee S , Dowdy DW . J Infect Dis 2012 205 Suppl 2 S169-80 Stakeholders agree that supporting high-quality diagnostics is essential if we are to continue to make strides in the fight against human immunodeficiency virus (HIV) and tuberculosis. Despite the need to strengthen existing laboratory infrastructure, which includes expanding and developing new laboratories, there are clear diagnostic needs where conventional laboratory support is insufficient. Regarding HIV, rapid point-of-care (POC) testing for initial HIV diagnosis has been successful, but several needs remain. For tuberculosis, several new diagnostic tests have recently been endorsed by the World Health Organization, but a POC test remains elusive. Human immunodeficiency virus and tuberculosis are coendemic in many high prevalence locations, making parallel diagnosis of these conditions an important consideration. Despite its clear advantages, POC testing has important limitations, and laboratory-based testing will continue to be an important component of future diagnostic networks. Ideally, a strategic deployment plan should be used to define where and how POC technologies can be most efficiently and cost effectively integrated into diagnostic algorithms and existing test networks prior to widespread scale-up. In this fashion, the global community can best harness the tremendous capacity of novel diagnostics in fighting these 2 scourges. |
Influenza A virus neuraminidase protein enhances cell survival through interaction with carcinoembryonic antigen-related cell adhesion molecule 6 (CEACAM6) protein
Gaur P , Ranjan P , Sharma S , Patel JR , Bowzard JB , Rahman SK , Kumari R , Gangappa S , Katz JM , Cox NJ , Lal RB , Sambhara S , Lal SK . J Biol Chem 2012 287 (18) 15109-17 The influenza virus neuraminidase (NA) protein primarily aids in the release of progeny virions from infected cells. Here, we demonstrate a novel role for NA in enhancing host cell survival by activating the Src/Akt signaling axis via an interaction with carcinoembryonic antigen-related cell adhesion molecule 6/cluster of differentiation 66c (C6). NA/C6 interaction leads to increased tyrosyl phosphorylation of Src, FAK, Akt, GSK3beta, and Bcl-2, which affects cell survival, proliferation, migration, differentiation, and apoptosis. siRNA-mediated suppression of C6 resulted in a down-regulation of activated Src, FAK, and Akt, increased apoptosis, and reduced expression of viral proteins and viral titers in influenza virus-infected human lung adenocarcinoma epithelial and normal human bronchial epithelial cells. These findings indicate that influenza NA not only aids in the release of progeny virions but also promotes host cell survival during viral replication. |
Interlaboratory study of ASTM F2731, standard test method for measuring the transmitted and stored energy of firefighter protective clothing systems
Deuser L , Barker R , Deaton AS , Shepherd A . J ASTM Int 2012 9 (3) This paper describes an interlaboratory study conducted using ASTM F2731, Standard Test Method for Measuring the Transmitted and Stored Energy of Firefighter Protective Clothing Systems. Five replications of six different composites representative of firefighting turnout gear materials were tested at six different laboratories equipped to conduct the test. Data collected were used to predict the time to second degree burn for each of the turnout composite test specimens. Statistical analysis showed good agreement between test sites. This interlaboratory study confirmed the repeatability and reliability of ASTM F2731, a test method used to measure an important property associated with the thermal protective performance of firefighter turnout materials. |
Neonatal outcomes after influenza immunization during pregnancy: a randomized controlled trial
Steinhoff MC , Omer SB , Roy E , El Arifeen S , Raqib R , Dodd C , Breiman RF , Zaman K . CMAJ 2012 184 (6) 645-53 BACKGROUND: There are limited data about the effect of maternal influenza infection on fetuses and newborns. We performed a secondary analysis of data from the Mother's Gift project, a randomized study designed to test the effectiveness of inactivated influenza and pneumococcal vaccines during pregnancy. METHODS: In the Mother's Gift project, 340 pregnant women in Bangladesh received either inactivated influenza vaccine or 23-valent pneumococcal polysaccharide vaccine (control). This study was performed from August 2004 through December 2005. We performed a secondary analysis of outcomes following maternal influenza immunization during two periods: when influenza virus was not circulating (September 2004 through January 2005) and when influenza virus was circulating (February through October 2005). We assessed gestational age, mean birth weight and the proportion of infants who were small for gestational age. RESULTS: During the period with no circulating influenza virus, there were no differences in the incidence of respiratory illness with fever per 100 person-months among mothers and infants in the two groups (influenza vaccine: 3.9; control: 4.0; p > 0.9). The proportion of infants who were small for gestational age and the mean birth weight were similar between groups (small for gestational age: influenza vaccine 29.1%, control 34.3%; mean birth weight: influenza vaccine 3083 g, control 3053 g). During the period with circulating influenza virus, there was a substantial reduction in the incidence per 100 person-months of respiratory illness with fever among the mothers and infants who had received the influenza vaccine (influenza vaccine: 3.7; control: 7.2; p = 0.0003). During this period, the proportion of infants who were small for gestational age was lower in the influenza vaccine group than in the control group (25.9% v. 44.8%; p = 0.03). 
The mean birth weight was higher among infants whose mothers received the influenza vaccine than among those who received the control vaccine during this period (3178 g v. 2978 g; p = 0.02). INTERPRETATION: During the period with circulating influenza virus, maternal immunization during pregnancy was associated with a lower proportion of infants who were small for gestational age and an increase in mean birth weight. These data need confirmation but suggest that prevention of influenza infection in pregnancy can influence intrauterine growth. TRIAL REGISTRATION: ClinicalTrials.gov: NCT 00142389. |
Improving systems in perinatal care: quality, not quantity
Barfield WD . JAMA 2012 307 (16) 1750-1 More than 30 years of evidence demonstrates improved survival for very low-birth-weight (VLBW; <1500 g) or very preterm (<32 weeks' gestation) infants born at level III neonatal intensive care units (NICUs) compared with those born at lower-level facilities (1). However, little is known about the components of the quality of care provided within these NICUs. | In this issue of JAMA, Lake and colleagues consider an important dimension within the NICU: the provision of neonatal nursing care (2). Neonatal nurses in the United States are credentialed and highly trained specialists who provide a constant vigil at the bedside of critically ill patients (3). Nurses assess and monitor the status of vulnerable patients with actions including maintaining a patent airway, preventing hospital-acquired infections, assessing the status of multiple organ systems, conducting and assisting in procedures, as well as caring for the overall needs of the infant and the concerns of the mother and family. Given the high risk of death and severe morbidity among the infants for whom they provide care, these professionals must be dedicated to the principles of quality nursing care. |
Collaboration at the federal, state, and local levels to build capacity in maternal and child health: the impact of the Maternal and Child Health Epidemiology Program
Kroelinger CD . J Womens Health (Larchmt) 2012 21 (5) 471-5 This article provides a description of the Maternal and Child Health Epidemiology Program housed in the Division of Reproductive Health at the Centers for Disease Control and Prevention. The article highlights programmatic efforts to build capacity and increase infrastructure within states, localities, and among tribes in the field of maternal and child health by leveraging partnerships with other federal, nonprofit, private, and academic agencies. |
Farm to Institution: creating access to healthy local and regional foods
Harris D , Lott M , Lakins V , Bowden B , Kimmons J . Adv Nutr 2012 3 (3) 343-9 Farm to Institution (FTI) programs are one approach to align food service operations with health and sustainability guidelines, such as those recently developed by the U.S. Department of Health and Human Services and General Services Administration. Programs and policies that support sourcing local and regional foods for schools, hospitals, faith-based organizations, and worksites may benefit institutional customers and their families, farmers, the local community, and the economy. Different models of FTI programs exist. On-site farmers' markets at institutions have been promoted on federal government property, healthcare facilities, and private institutions nationwide. Farm to School programs focus on connecting schools with local agricultural production with the goal of improving school meals and increasing intake of fruits and vegetables in children. Sourcing food from local farms presents a number of challenges, including cost and availability of local products, food safety and liability considerations, and lack of skilled labor for food preparation. Institutions utilize multiple strategies to address these barriers, and local, state, and federal policies can help facilitate FTI approaches. FTI enables the purchasing power of institutions to contribute to regional and local food systems, thus potentially affecting social, economic, and ecological systems. Local and state food policy councils can assist in bringing stakeholders together to inform this process. Rigorous research and evaluation are needed to determine and document best practices and substantiate links between FTI and multiple outcomes. Nutritionists, public health practitioners, and researchers can help communities work with institutions to develop, implement, and evaluate programs and policies supporting FTI. |
Comparison of indicators of iron deficiency in Kenyan children
Grant FK , Martorell R , Flores-Ayala R , Cole CR , Ruth LJ , Ramakrishnan U , Suchdev PS . Am J Clin Nutr 2012 95 (5) 1231-7 BACKGROUND: In the absence of a feasible, noninvasive gold standard, iron deficiency (ID) is best measured by the use of multiple indicators. However, the choice of an appropriate single iron biomarker to replace the multiple-criteria model for screening for ID at the population level continues to be debated. OBJECTIVE: We compared ID defined as ≥2 of 3 abnormal ferritin (<12 mcg/L), soluble transferrin receptor (TfR; >8.3 mg/L), or zinc protoporphyrin (ZP; >80 mcmol/mol) concentrations (ie, multiple-criteria model) with ID defined by abnormal concentrations of any of the independent candidate iron biomarkers (ferritin alone, TfR alone, or ZP alone) and TfR/ferritin index (ID, >500). Values either were adjusted for inflammation [as measured by C-reactive protein (>5 mg/L) and alpha(1)-acid glycoprotein (>1 g/L) before applying cutoffs for ID] or were unadjusted. DESIGN: In this community-based cluster survey, capillary blood was obtained from 680 children (aged 6-35 mo) for measurement of iron status by using ferritin, TfR, and ZP. RESULTS: On the basis of the multiple-criteria model, the mean (+/-SE) prevalence of ID was 61.9 +/- 2.2%, whereas the prevalences based on abnormal ferritin, TfR, or ZP concentrations or an abnormal TfR/ferritin index were 26.9 +/- 1.7%, 60.9 +/- 2.2%, 82.8 +/- 1.6%, and 43.1 +/- 2.3%, respectively, for unadjusted values. The prevalences of ID were higher for adjusted values only for low ferritin and an elevated TfR/ferritin index compared with the unadjusted values. The kappa statistics for agreement between the multiple-criteria model and the other iron indicators ranged from 0.35 to 0.88; TfR had the best agreement (kappa = 0.88) with the multiple-criteria model. 
Positive predictive values of ID based on the other iron indicators in predicting ID based on the multiple-criteria model were highest for ferritin and TfR. Receiver operating characteristic curve analysis indicated that TfR (AUC = 0.94) was superior to the other indicators in diagnosing ID based on the multiple-criteria model (P < 0.001). The inflammation effect did not appear to alter these observations appreciably. CONCLUSION: TfR better estimates the prevalence of ID in preschoolers than do ferritin, ZP, and the TfR/ferritin index on the basis of multiple indexes in a high inflammation, resource-poor setting. This trial was registered at clinicaltrials.gov as NCT101088958. |
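The agreement statistics used in this comparison (Cohen's kappa and positive predictive value of each single indicator against the multiple-criteria reference) can be sketched as follows. The "≥2 of 3 abnormal" rule mirrors the abstract, but the data and the indicator probabilities below are simulated and purely illustrative.

```python
import random

random.seed(1)

# Simulated indicator flags for n children (illustrative only; the
# "2 of 3 abnormal" rule mirrors the abstract, the data do not).
n = 500
records = []
for _ in range(n):
    latent_id = random.random() < 0.6                  # "true" iron deficiency
    ferritin_low = random.random() < (0.45 if latent_id else 0.05)
    tfr_high = random.random() < (0.90 if latent_id else 0.15)
    zp_high = random.random() < (0.95 if latent_id else 0.60)
    records.append((ferritin_low, tfr_high, zp_high))

# Multiple-criteria model: ID if >=2 of the 3 indicators are abnormal.
multi = [sum(r) >= 2 for r in records]

def kappa(a, b):
    """Cohen's kappa for two binary classifications of the same subjects."""
    m = len(a)
    po = sum(x == y for x, y in zip(a, b)) / m         # observed agreement
    pa, pb = sum(a) / m, sum(b) / m
    pe = pa * pb + (1 - pa) * (1 - pb)                 # chance agreement
    return (po - pe) / (1 - pe)

for name, idx in [("ferritin", 0), ("TfR", 1), ("ZP", 2)]:
    single = [r[idx] for r in records]
    ppv = sum(s and m for s, m in zip(single, multi)) / sum(single)
    print(f"{name}: prevalence={sum(single)/n:.2f} "
          f"kappa={kappa(single, multi):.2f} PPV={ppv:.2f}")
```

With realistic parameters this reproduces the qualitative pattern in the abstract: an indicator with high sensitivity and few false positives (here, the simulated TfR) agrees best with the multiple-criteria model.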
Developing and implementing health and sustainability guidelines for institutional food service
Kimmons J , Jones S , McPeak HH , Bowden B . Adv Nutr 2012 3 (3) 337-42 Health and sustainability guidelines for institutional food service are directed at improving dietary intake and increasing the ecological benefits of the food system. The development and implementation of institutional food service guidelines, such as the Health and Human Services (HHS) and General Services Administration (GSA) Health and Sustainability Guidelines for Federal Concessions and Vending Operations (HHS/GSA Guidelines), have the potential to improve the health and sustainability of the food system. Institutional guidelines assist staff, managers, and vendors in aligning the food environment at food service venues with healthier and more sustainable choices and practices. Guideline specifics and their effective implementation depend on the size, culture, nature, and management structure of an institution and the individuals affected. They may be applied anywhere food is sold, served, or consumed. Changing institutional food service practice requires comprehensive analysis, engagement, and education of all relevant stakeholders including institutional management, members of the food supply chain, and customers. Current examples of food service guidelines presented here are the HHS and GSA Health and Sustainability Guidelines for Federal Concessions and Vending Operations, which translate evidence-based recommendations on health and sustainability into institutional food service practices and are currently being implemented at the federal level. Developing and implementing guidelines has the potential to improve long-term population health outcomes while simultaneously benefitting the food system. Nutritionists, public health practitioners, and researchers should consider working with institutions to develop, implement, and evaluate food service guidelines for health and sustainability. |
U.S. truck driver anthropometric study and multivariate anthropometric models for cab designs
Guan J , Hsiao H , Bradtmiller B , Kau T-Y , Reed MR , Jahns SK , Loczi J , Hardee HL , Piamonte DPT . Hum Factors 2012 54 (5) 849-71 OBJECTIVE: This study presents data from a large-scale anthropometric study of U.S. truck drivers and the multivariate anthropometric models developed for the design of next-generation truck cabs. BACKGROUND: Up-to-date anthropometric information on the U.S. truck driver population is needed for the design of safe and ergonomically efficient truck cabs. METHOD: We collected 35 anthropometric dimensions for 1,950 truck drivers (1,779 males and 171 females) across the continental United States using a sampling plan designed to capture the appropriate ethnic, gender, and age distributions of the truck driver population. RESULTS: Truck drivers are heavier than the U.S. general population, with a difference in mean body weight of 13.5 kg for males and 15.4 kg for females. They are also different in physique from the U.S. general population. In addition, current truck drivers are heavier and different in physique compared with their counterparts of 25 to 30 years ago. CONCLUSION: The data obtained in this study provide more accurate anthropometric information for cab designs than do the current U.S. general population data or truck driver data collected 25 to 30 years ago. Multivariate anthropometric models, spanning 95% of the current truck driver population on the basis of a set of 12 anthropometric measurements, have been developed to facilitate future cab designs. APPLICATION: The up-to-date truck driver anthropometric data and multivariate anthropometric models will benefit the design of future truck cabs, which, in turn, will help promote the safety and health of U.S. truck drivers. |
Lung cancer mortality in North Carolina and South Carolina chrysotile asbestos textile workers
Elliott L , Loomis D , Dement J , Hein MJ , Richardson D , Stayner L . Occup Environ Med 2012 69 (6) 385-90 OBJECTIVES: Studies of workers in two US cohorts of asbestos textile workers exposed to chrysotile (North Carolina (NC) and South Carolina (SC)) found increasing risk of lung cancer mortality with cumulative fibre exposure. However, the risk appeared to increase more steeply in SC, possibly due to differences in study methods. The authors conducted pooled analyses of the cohorts and investigated the exposure-disease relationship using uniform cohort inclusion criteria and statistical methods. METHODS: Workers were included after 30 days of employment in a production job during qualifying years, and vital status ascertained through 2003 (2001 for SC). Poisson regression was used to estimate the exposure-response relationship between asbestos and lung cancer, using both exponential and linear relative rate models adjusted for age, sex, race, birth cohort and decade of follow-up. RESULTS: The cohort included 6136 workers, contributing 218,631 person-years of observation and 3356 deaths. Cumulative exposures at the four study facilities varied considerably. The pooled relative rate for lung cancer, comparing 100 f-yr/ml to 0 f-yr/ml, was 1.11 (95% CI 1.06 to 1.16) for the combined cohort, with different effects in the NC cohort (RR=1.10, 95% CI 1.03 to 1.16) and the SC cohort (RR=1.67, 95% CI 1.44 to 1.93). CONCLUSIONS: Increased rates of lung cancer were significantly associated with cumulative fibre exposure overall and in both the Carolina asbestos-textile cohorts. Previously reported differences in exposure-response between the cohorts do not appear to be related to inclusion criteria or analytical methods. |
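Under the exponential relative rate model named in the abstract, RR(E) = exp(βE), so the pooled estimate of 1.11 per 100 f-yr/ml fixes the slope, and relative rates at other cumulative exposures follow directly. This is only an arithmetic illustration of the model form, not a re-analysis of the cohort data:

```python
import math

# Pooled exponential relative rate model: RR(E) = exp(beta * E).
# The abstract reports RR = 1.11 comparing 100 f-yr/ml to 0 f-yr/ml.
rr_per_100 = 1.11
beta = math.log(rr_per_100) / 100.0        # slope per fibre-year/ml

for exposure in (50, 100, 200, 400):       # cumulative exposures, f-yr/ml
    rr = math.exp(beta * exposure)
    print(f"RR at {exposure:>3} f-yr/ml: {rr:.2f}")
```

A useful property of this form is multiplicativity: the relative rate at 200 f-yr/ml is exactly 1.11 squared, and at 400 f-yr/ml it is 1.11 to the fourth power.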
Maternal occupational exposure to organic solvents during early pregnancy and risks of neural tube defects and orofacial clefts
Desrosiers TA , Lawson CC , Meyer RE , Richardson DB , Daniels JL , Waters MA , van Wijngaarden E , Langlois PH , Romitti PA , Correa A , Olshan A , National Birth Defects Prevention Study . Occup Environ Med 2012 69 (7) 493-9 OBJECTIVES: Though toxicological experiments demonstrate the teratogenicity of organic solvents in animal models, epidemiologic studies have reported inconsistent results. Using data from the population-based National Birth Defects Prevention Study, the authors examined the relation between maternal occupational exposure to aromatic solvents, chlorinated solvents and Stoddard solvent during early pregnancy and neural tube defects (NTDs) and orofacial clefts (OFCs). METHODS: Cases of NTDs (anencephaly, spina bifida and encephalocoele) and OFCs (cleft lip +/- cleft palate and cleft palate alone) delivered between 1997 and 2002 were identified by birth defect surveillance registries in eight states; non-malformed control infants were selected using birth certificates or hospital records. Maternal solvent exposure was estimated by industrial hygienist review of self-reported occupational histories in combination with a literature-derived exposure database. ORs and 95% CIs for the association between solvent class and each birth defect group and component phenotype were estimated using multivariable logistic regression, adjusting for maternal age, race/ethnicity, education, pre-pregnancy body mass index, folic acid supplement use and smoking. RESULTS: The prevalence of exposure to any solvent among mothers of NTD cases (n=511), OFC cases (n=1163) and controls (n=2977) was 13.1%, 9.6% and 8.2%, respectively. Exposure to chlorinated solvents was associated with increased odds of NTDs (OR=1.96, CI 1.34 to 2.87), especially spina bifida (OR=2.26, CI 1.44 to 3.53). No solvent class was strongly associated with OFCs in these data.
CONCLUSIONS: The findings suggest that maternal occupational exposure to chlorinated solvents during early pregnancy is positively associated with the prevalence of NTDs in offspring. |
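From the exposure prevalences reported for any solvent (13.1% of 511 NTD cases vs 8.2% of 2977 controls), a crude, unadjusted odds ratio can be back-calculated. This sketch illustrates only the OR arithmetic; it is not the adjusted, chlorinated-solvent estimate the study reports, and the cell counts are approximations recovered by rounding.

```python
import math

def crude_or(a, b, c, d):
    """Crude odds ratio with a 95% Woolf (log-scale) confidence interval.
    a/b: exposed/unexposed cases; c/d: exposed/unexposed controls."""
    or_ = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Counts back-calculated from the abstract's prevalences: 13.1% of 511
# NTD cases and 8.2% of 2977 controls had any solvent exposure.
a = round(0.131 * 511)       # exposed NTD cases
c = round(0.082 * 2977)      # exposed controls
or_, lo, hi = crude_or(a, 511 - a, c, 2977 - c)
print(f"Crude OR (any solvent, NTDs): {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```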
In reply to the letter to the editor titled 'Is carpal tunnel syndrome overdiagnosed?'
Burt S . Occup Environ Med 2012 69 (9) 690 In reply to the letter to the editor titled ‘Is carpal tunnel syndrome overdiagnosed?’, I would like to make several points.1 First, I would like to clarify that we did not observe any association between arm elevation and carpal tunnel syndrome (CTS) in our data.2 The only mention of arm elevation in our manuscript was to report in the discussion section that the inter-rater reliability of posture observations was higher for arm elevation than it was for wrist postures. | The case definition was discussed among several research groups in the USA that have met for several years to discuss such issues as part of a musculoskeletal disorder research consortium. Although each institution has carried out independent research and their case definitions for CTS are not identical, all agreed that the components would include nerve conduction testing and symptoms recorded on hand diagrams, but not physical examination findings. In the case of our study at the National Institute for Occupational Safety and Health (NIOSH), the health assessment portion of our study included physical examinations, nerve conduction tests and questionnaires administered to all study subjects during site visits to participating workplaces. Our physical examinations included the neck, shoulder and proximal upper limb as well as the forearm, wrist and hand, on all study subjects.3 However, because data collection for our study of prevalent CTS was cross-sectional, a physical examination is less informative than it would be for a patient scheduling a medical appointment soon after experiencing symptoms. We inquired about symptoms that had occurred within the past year; those who met the criteria for frequency, duration, type and location of symptoms recorded on hand diagrams and who also met the electrodiagnostic criteria met our case definition.
Hand diagram analysis differentiated symptoms in the ulnar and radial distributions from the median distribution. Ulnar nerve testing was done to rule out generalised peripheral neuropathy. Nerve conduction testing is our only objective indication of possible CTS. |
Better sleep: antidote to on-the-job fatigue
Caruso CC . Am Nurse Today 2012 7 (5) 38-39 Getting enough good-quality sleep each day is important not just for nurses’ personal health and safety but for patient safety, too. Like the basic need to eat and drink, the need to sleep is critical for maintaining life and health-and for working safely. Sleeping 7 to 8 hours per night is linked to a wide range of better health and safety outcomes. Long work hours and shift work, in contrast, are tied to sleep disturbances and health and safety risks for nurses, including declines in mental function and physical ability, reduction in immunologic function, and higher rates of depression, injury, heart disease, GI disorders, mood disturbances, and cancer. Multiple studies have found that performance in a person who has been awake for 17 hours or more resembles that of someone with alcohol intoxication. | Negative effects of fatigue extend to employers, who lose an estimated $2,000 to $10,000 per employee annually from reduced productivity, increased errors, absenteeism, lack of full functioning at work, increased healthcare and workers’ compensation costs, and worker turnover due to disability, death, or resigning to take jobs with less demanding schedules. Risks extend to the community, as when a tired nurse makes a patient-care error or has a motor vehicle accident while commuting. | Obviously, sleep deprivation can have serious and even fatal consequences. Nurses, managers, and employers all share in the responsibility to reduce risks connected with fatigue. A 2010 study found the percentage of American healthcare workers who reported 6 or fewer hours of sleep per day (too little, according to sleep experts) increased from 28% in the mid-1980s to 32% in the mid-2000s. So it’s likely that a growing number of nurses aren’t getting enough sleep. |
Depressive symptoms and bone mineral density among police officers in a northeastern US city
Charles LE , Fekedulegn D , Miller DB , Wactawski-Wende J , Violanti JM , Andrew ME , Burchfiel CM . Glob J Health Sci 2012 4 (3) 39-50 PURPOSE: The purpose of this study was to examine the association between depressive symptoms and bone mineral density (BMD). METHODS: Depressive symptoms were measured using the Center for Epidemiologic Studies Depression (CES-D) scale. BMD of the total hip, femoral neck, anteroposterior (AP) spine, wrist, and total body was measured by DXA using standardized procedures. Mean levels of BMD across gender-specific tertiles of CES-D score were obtained using ANOVA and ANCOVA. RESULTS: Participants included 97 police officers (41 women; 29-64 years). Depressive symptoms were not associated with BMD at any site among men. However, among women, mean BMD values decreased across increasing (worsening) tertiles of CES-D for the AP spine (low CES-D=1.22+/-0.04; medium CES-D=1.05+/-0.04; high CES-D=1.03+/-0.04 g/cm2; p=0.035) and for the whole body (low=1.26+/-0.03; medium=1.20+/-0.03; high=1.11+/-0.03 g/cm2; p=0.018) after adjustment. CONCLUSIONS: Higher depressive symptoms were associated with lower BMD among female but not male officers. |
The association between malnutrition and the incidence of malaria among young HIV-infected and -uninfected Ugandan children: a prospective study
Arinaitwe E , Gasasira A , Verret W , Homsy J , Wanzira H , Kakuru A , Sandison TG , Young S , Tappero JW , Kamya MR , Dorsey G . Malar J 2012 11 90 BACKGROUND: In sub-Saharan Africa, malnutrition and malaria remain major causes of morbidity and mortality in young children. There are conflicting data as to whether malnutrition is associated with an increased or decreased risk of malaria. In addition, data are limited on the potential interaction between HIV infection and the association between malnutrition and the risk of malaria. METHODS: A cohort of 100 HIV-unexposed, 203 HIV-exposed (HIV negative children born to HIV-infected mothers) and 48 HIV-infected children aged 6 weeks to 1 year were recruited from an area of high malaria transmission intensity in rural Uganda and followed until the age of 2.5 years. All children were provided with insecticide-treated bed nets at enrolment and daily trimethoprim-sulphamethoxazole prophylaxis (TS) was prescribed for HIV-exposed breastfeeding and HIV-infected children. Monthly routine assessments, including measurement of height and weight, were conducted at the study clinic. Nutritional outcomes including stunting (low height-for-age) and underweight (low weight-for-age), classified as mild (mean z-scores between -1 and -2 during follow-up) and moderate-severe (mean z-scores < -2 during follow-up) were considered. Malaria was diagnosed when a child presented with fever and a positive blood smear. The incidence of malaria was compared using negative binomial regression controlling for potential confounders with measures of association expressed as an incidence rate ratio (IRR). RESULTS: The overall incidence of malaria was 3.64 cases per person year. Mild stunting (IRR = 1.24, 95% CI 1.06-1.46, p = 0.008) and moderate-severe stunting (IRR = 1.24, 95% CI 1.03-1.48, p = 0.02) were associated with a similarly increased incidence of malaria compared to non-stunted children. 
Mild underweight (IRR = 1.09, 95% CI 0.95-1.25, p = 0.24) and moderate-severe underweight (IRR = 1.12, 95% CI 0.86-1.46, p = 0.39) were not associated with a significant difference in the incidence of malaria compared with children who were not underweight. There were no significant interactions between HIV status (HIV-infected or HIV-exposed children taking TS) and the associations between malnutrition and the incidence of malaria. CONCLUSIONS: Stunting, indicative of chronic malnutrition, was associated with an increased incidence of malaria in a cohort of HIV-infected and -uninfected young children living in an area of high malaria transmission intensity. However, caution should be exercised when making causal inferences, given the observational study design and the inability to disentangle the temporal relationship between malnutrition and the incidence of malaria. TRIAL REGISTRATION: ClinicalTrials.gov: NCT00527800. |
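The incidence rate ratios the authors report come from negative binomial regression adjusted for confounders, but the crude version of the calculation is straightforward. The episode counts and person-time below are hypothetical, chosen only to land near the reported IRR of about 1.24 for stunting:

```python
import math

def crude_irr(cases1, pt1, cases0, pt0):
    """Crude incidence rate ratio with a 95% log-normal confidence interval.
    cases: malaria episode counts; pt: person-years at risk."""
    ratio = (cases1 / pt1) / (cases0 / pt0)
    se = math.sqrt(1 / cases1 + 1 / cases0)
    lo = math.exp(math.log(ratio) - 1.96 * se)
    hi = math.exp(math.log(ratio) + 1.96 * se)
    return ratio, lo, hi

# Hypothetical counts (not from the paper): stunted vs non-stunted
# children, each group contributing 100 person-years of follow-up.
irr, lo, hi = crude_irr(cases1=452, pt1=100.0, cases0=364, pt0=100.0)
print(f"Crude IRR = {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```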
Worldwide application of prevention science in adolescent health
Catalano RF , Fagan AA , Gavin LE , Greenberg MT , Irwin CE Jr , Ross DA , Shek DT . Lancet 2012 379 (9826) 1653-64 The burden of morbidity and mortality from non-communicable disease has risen worldwide and is accelerating in low-income and middle-income countries, whereas the burden from infectious diseases has declined. Since this transition, the prevention of non-communicable disease as well as communicable disease causes of adolescent mortality has risen in importance. Problem behaviours that increase the short-term or long-term likelihood of morbidity and mortality, including alcohol, tobacco, and other drug misuse, mental health problems, unsafe sex, risky and unsafe driving, and violence are largely preventable. In the past 30 years new discoveries have led to prevention science being established as a discipline designed to mitigate these problem behaviours. Longitudinal studies have provided an understanding of risk and protective factors across the life course for many of these problem behaviours. Risks cluster across development to produce early accumulation of risk in childhood and more pervasive risk in adolescence. This understanding has led to the construction of developmentally appropriate prevention policies and programmes that have shown short-term and long-term reductions in these adolescent problem behaviours. We describe the principles of prevention science, provide examples of efficacious preventive interventions, describe challenges and potential solutions to take efficacious prevention policies and programmes to scale, and conclude with recommendations to reduce the burden of adolescent mortality and morbidity worldwide through preventive intervention. |
Is operational research delivering the goods? The journey to success in low-income countries
Zachariah R , Ford N , Maher D , Bissell K , Van den Bergh R , van den Boogaard W , Reid T , Castro KG , Draguez B , von Schreeb J , Chakaya J , Atun R , Lienhardt C , Enarson DA , Harries AD . Lancet Infect Dis 2012 12 (5) 415-21 Operational research in low-income countries has a key role in filling the gap between what we know from research and what we do with that knowledge-the so-called know-do gap, or implementation gap. Planned research that does not tangibly affect policies and practices is ineffective and wasteful, especially in settings where resources are scarce and disease burden is high. Clear parameters are urgently needed to measure and judge the success of operational research. We define operational research and its relation with policy and practice, identify why operational research might fail to affect policy and practice, and offer possible solutions to address these shortcomings. We also propose measures of success for operational research. Adoption and use of these measures could help to ensure that operational research better changes policy and practice and improves health-care delivery and disease programmes. |
Spontaneous preterm labor and cardiovascular disease risk: one step closer to a better understanding
Kuklina EV , Shilkrut A . J Womens Health (Larchmt) 2012 21 (6) 619-20 Four pregnancy complications are currently recognized as cardiovascular disease (CVD) risk factors: gestational diabetes, hypertensive disorders in pregnancy, preterm delivery, and delivery of a small-for-gestational-age (SGA) infant.1 Incorporating a history of gestational diabetes or hypertensive disorders in pregnancy has been shown to improve CVD risk prediction in women.2 The relationship between gestational diabetes or hypertensive disorders in pregnancy and CVD is believed to be explained, at least in part, by the atherogenic inflammation and oxidative stress that often accompany obesity.3 Indeed, overweight/obese women are 2–6 times more likely to develop gestational diabetes and 2–3 times more likely to develop hypertensive disorders in pregnancy in comparison to normal weight women.5 Women with a history of these complications are more likely to develop hypertension or type 2 diabetes.6 Despite substantial evidence linking these risk factors to CVD, a lack of modifying treatments for these risk factors may limit their use in CVD risk assessment and management. | Preterm delivery has also been associated with maternal overweight or obesity.7 In a meta-analysis of 39 studies, however, associations remained statistically significant only for medically indicated preterm labor, and no statistically significant associations were found for spontaneous preterm labor.7 Moreover, overweight and class I obesity emerged as protective factors for spontaneous preterm labor. In the case-control study by Bartha et al.8 in this issue of the Journal of Women’s Health, women with spontaneous preterm labor were matched with women without preterm labor. The study showed that women with spontaneous preterm labor had significantly lower prepregnancy body mass index (BMI). During the third trimester, they had higher levels of interleukin-6 (IL-6) and lower levels of myeloperoxidase.
Most noteworthy, these women had lower levels of total cholesterol, lower levels of high-density lipoprotein cholesterol (HDL-C), and a higher total cholesterol/HDL-C ratio. Thus, except for low HDL-C levels, women in this study did not fit the traditional CVD risk profile associated with high BMI. Nevertheless, they had increased levels of IL-6 and decreased levels of HDL-C, which are recognized biomarkers of inflammation and oxidative stress.9,10 |
Varying kernel density estimation on R+
Mnatsakanov R , Sarkisian K . Stat Probab Lett 2012 82 (7) 1337-1345 In this article, a new nonparametric density estimator based on a sequence of asymmetric kernels is proposed. This method is natural when estimating an unknown density function of a positive random variable. The rates of the Mean Squared Error and Mean Integrated Squared Error, and the L1-consistency, are investigated. Simulation studies are conducted to compare the new estimator and its modified version with the traditional kernel density construction. |
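One standard asymmetric-kernel construction on R+ is Chen's gamma kernel, which attaches a Gamma(x/b + 1, b) density to each evaluation point x so that no probability mass leaks below zero and the usual boundary bias of symmetric kernels is avoided. The sketch below (with a simulated exponential sample) illustrates that general idea only; it is not necessarily the exact estimator proposed in the article.

```python
import math
import random

def gamma_pdf(x, shape, scale):
    """Density of the Gamma(shape, scale) distribution at x > 0."""
    return (x ** (shape - 1) * math.exp(-x / scale)
            / (math.gamma(shape) * scale ** shape))

def gamma_kernel_density(x, sample, b):
    """Asymmetric (gamma) kernel density estimate at x >= 0, bandwidth b.
    Each observation X_i contributes the Gamma(x/b + 1, b) density
    evaluated at X_i, so the estimate is supported on [0, inf)."""
    shape = x / b + 1.0
    return sum(gamma_pdf(xi, shape, b) for xi in sample) / len(sample)

random.seed(0)
# Simulated positive data; true density is e^{-x} on R+.
sample = [random.expovariate(1.0) for _ in range(1000)]

for x in (0.0, 0.5, 1.0, 2.0):
    est = gamma_kernel_density(x, sample, b=0.05)
    print(f"f_hat({x}) = {est:.3f}   (true e^-x = {math.exp(-x):.3f})")
```

Because the kernel's support is all of R+, the estimate at x = 0 does not suffer the halving effect a symmetric kernel would show at the boundary.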
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Drug Safety
- Environmental Health
- Epidemiology and Surveillance
- Genetics and Genomics
- Health Behavior and Risk
- Health Communication and Education
- Immune System Disorders
- Immunity and Immunization
- Laboratory Sciences
- Maternal and Child Health
- Nutritional Sciences
- Occupational Safety and Health
- Parasitic Diseases
- Public Health, General
- Reproductive Health
- Statistics as Topic
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed: Feb 1, 2024
- Page last updated: Sep 03, 2024