Predictive value of C-reactive protein on 30-day and 1-year mortality in acute coronary syndromes: an analysis from the ACUITY trial
Caixeta A , Stone GW , Mehran R , Lee EA , McLaurin BT , Cox DA , Bertrand ME , Lincoff AM , Moses JW , White HD , Ohman EM , Palmerini T , Syros G , Kittas C , Fahy M , Hooper WC , Lansky AJ , Dangas GD . J Thromb Thrombolysis 2011 31 (2) 154-64 We sought to evaluate the association between C-reactive protein (CRP) sampled on admission and short- and long-term mortality in patients with acute coronary syndromes (ACS) undergoing early invasive treatment. Baseline levels of CRP were determined in 2,974 patients with moderate- and high-risk ACS undergoing an early invasive treatment strategy in the large-scale randomized ACUITY trial. The relationship of CRP to 30-day and 1-year clinical outcomes was assessed according to quartiles of CRP values. Patients with CRP levels in the fourth quartile compared to the first quartile had significantly higher 30-day mortality (2.3 vs. 0.3%, P = 0.0004) and 1-year mortality (5.5 vs. 2.8%, P = 0.0003). CRP level as a continuous variable was associated with 30-day mortality (OR [95% CI] for a one-unit increase in logarithmically transformed CRP level = 1.42 [1.08-1.89], P = 0.01) and 1-year mortality (OR [95% CI] = 1.24 [1.04-1.47], P = 0.02). By multivariable analysis, higher baseline CRP levels independently predicted 30-day and 1-year mortality, a relationship that was particularly strong for patients in the highest quartile of CRP (OR [95% CI] = 5.19 [1.14-23.68], P = 0.009). In troponin-positive patients, increasing quartiles of CRP were associated with a trend for 30-day mortality (P (trend) = 0.08) and a significant increase in 1-year mortality (P (trend) = 0.02); this relationship was not present in troponin-negative patients. Baseline CRP level is a powerful independent predictor of both early and late mortality in patients with ACS being treated with an early invasive strategy, especially in troponin-positive patients. |
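The odds ratios above come from a logistic model fit to log-transformed CRP. As a quick illustration (not the ACUITY analysis itself), a Wald-style odds ratio and 95% CI can be recovered from a logit coefficient and its standard error; the standard error of about 0.143 below is back-calculated from the reported CI (1.08-1.89), not taken from the paper:

```python
import math

def or_with_ci(beta, se, z=1.96):
    """Odds ratio and Wald 95% CI from a logistic-regression
    coefficient (here, per one-unit increase in log-CRP)."""
    return math.exp(beta), (math.exp(beta - z * se), math.exp(beta + z * se))

# beta chosen so exp(beta) matches the reported OR of 1.42;
# se ~ 0.143 is reverse-engineered from the reported CI, for illustration
odds_ratio, (ci_lo, ci_hi) = or_with_ci(math.log(1.42), 0.143)
```

Running this reproduces the reported OR of 1.42 with a CI close to the published 1.08-1.89, small differences reflecting rounding of the inputs.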
Geographic distribution of diagnosed diabetes in the U.S.: a diabetes belt
Barker LE , Kirtland KA , Gregg EW , Geiss LS , Thompson TJ . Am J Prev Med 2011 40 (4) 434-9 BACKGROUND: The American "stroke belt" has contributed to the study of stroke. However, U.S. geographic patterns of diabetes have not been as specifically characterized. PURPOSE: This study identifies a geographically coherent region of the U.S. where the prevalence of diagnosed diabetes is especially high, called the "diabetes belt." METHODS: In 2010, data from the 2007 and 2008 Behavioral Risk Factor Surveillance System were combined with county-level diagnosed diabetes prevalence estimates. Counties in close proximity with an estimated prevalence of diagnosed diabetes ≥11.0% were considered to define the diabetes belt. Prevalence of risk factors in the diabetes belt was compared to that in the rest of the U.S. The fraction of the excess risk associated with living in the diabetes belt that was attributable to selected risk factors, both modifiable (sedentary lifestyle, obesity) and nonmodifiable (age, gender, race/ethnicity, education), was calculated. RESULTS: A diabetes belt consisting of 644 counties in 15 mostly southern states was identified. People in the diabetes belt were more likely to be non-Hispanic African-American, lead a sedentary lifestyle, and be obese than people in the rest of the U.S. Thirty percent of the excess risk was associated with modifiable risk factors, and 37% with nonmodifiable factors. CONCLUSIONS: Nearly one third of the difference in diabetes prevalence between the diabetes belt and the rest of the U.S. is associated with sedentary lifestyle and obesity. Culturally appropriate interventions aimed at decreasing obesity and sedentary lifestyle in counties within the diabetes belt should be considered. |
Iron-deficiency anemia, non-iron-deficiency anemia and HbA1c among adults in the US
Ford ES , Cowie CC , Li C , Handelsman Y , Bloomgarden ZT . J Diabetes 2011 3 (1) 67-73 BACKGROUND: Conditions that affect erythrocyte turnover affect HbA1c concentrations. Although many forms of anemia are associated with lowering of HbA1c, iron deficiency tends to increase HbA1c. We examined the effect of iron and hemoglobin (Hb) status on HbA1c and on the relationship between concentrations of fasting glucose and HbA1c in a national sample of adults in the US. METHODS: Cross-sectional data from 8296 adults aged ≥20 years who participated in NHANES 1999-2002 were used. RESULTS: The prevalence of low Hb (defined as <120 and <118 g/L in women aged 20-69 and ≥70 years, respectively, and <137, <133, and <124 g/L in men aged 20-49, 50-69, and ≥70 years, respectively) was 5.5%. There was a significant positive correlation between Hb concentrations and HbA1c concentrations after adjusting for age, gender, and race or ethnicity, with HbA1c rising from a mean of 5.28% among participants with Hb <100 g/L to 5.72% among participants with Hb ≥170 g/L. The adjusted mean concentrations of HbA1c were 5.56% and 5.46% among participants with and without iron deficiency, respectively (P = 0.095). However, there was no evidence of differences in the relationship between fasting glucose and HbA1c when groups of anemic and non-anemic individuals with and without iron deficiency were examined individually. CONCLUSIONS: Caution should be used when diagnosing diabetes and prediabetes among people with high or low Hb when the HbA1c level is near 6.5% or 5.7%, respectively, as changes in erythrocyte turnover may alter the test result. However, the trend for HbA1c to increase with iron deficiency does not appear large enough to warrant screening for iron deficiency when assessing the reliability of HbA1c for diagnosing diabetes and prediabetes in a given individual. |
Bleeding events are associated with an increase in markers of inflammation in acute coronary syndromes: an ACUITY trial substudy
Campbell CL , Steinhubl SR , Hooper WC , Jozic J , Smyth SS , Bernstein D , De Staercke C , Syros G , Negus BH , Stuckey T , Stone GW , Mehran R , Dangas G . J Thromb Thrombolysis 2011 31 (2) 139-45 Bleeding events have been associated with adverse early and late outcomes in virtually all clinical settings. The mechanism behind this observation remains poorly understood. We sought to determine if the reason might be the provocation of an inflammatory response by bleeding events. In a formal substudy of the ACUITY trial, plasma samples were collected at baseline, discharge, 30 days, and 1 year from 192 patients with acute coronary syndromes (ACS), and a range of biomarkers was analyzed in a central core laboratory. Temporal changes in biomarker levels were assessed in patients who experienced in-hospital hemorrhagic events, recurrent ischemic events, or neither. Sixteen patients were excluded from the study (7 with incomplete samples, 5 undergoing coronary artery bypass grafting (CABG) during the index hospitalization, and 1 with both bleeding and ischemic events). Median high sensitivity C-reactive protein (hs-CRP) levels (mg/l) increased significantly more from admission to discharge among the 9 patients who experienced an in-hospital major bleed compared to either the 9 patients who had a recurrent ischemic event (+6.0 vs. +0.70, P = 0.04) or the 151 patients who had no event (+6.0 vs. +0.60, P = 0.003). Compared to patients with no in-hospital events, median interleukin-6 (IL-6) levels (pg/ml) increased non-significantly from admission to hospital discharge in those with a bleeding event (+0.92 vs. +2.46, P = 0.55) and in those who experienced an in-hospital recurrent ischemic event (+0.92 vs. +3.60, P = 0.09). These data suggest that major bleeding is associated with development of a pro-inflammatory state. If confirmed, this mechanism may in part explain the poor prognosis of patients experiencing an acute hemorrhagic event. |
Estimated number of infants born to HIV-infected women in the United States and five dependent areas, 2006
Whitmore SK , Zhang X , Taylor AW , Blair JM . J Acquir Immune Defic Syndr 2011 57 (3) 218-22 OBJECTIVE: Although perinatal HIV infections are declining in the United States (U.S.), there is no single source of nationally representative data available to estimate the number of infants born to HIV-infected women in the U.S. and its dependencies. This study estimates the total number of births to HIV-positive women in the U.S. in 2006. STUDY DESIGN: Estimates of diagnosed Stage 1 or 2 HIV disease in the U.S. were based on data reported from 39 areas that conducted confidential name-based HIV case reporting, and estimates of Stage 3 HIV disease were based on data from all areas in the U.S. A zero-inflated Poisson (ZIP) model was used to estimate the number of women aged 13-44 years living with diagnosed Stage 1 or 2 HIV disease in the U.S. The number of undiagnosed HIV-infected women (Stage 1 or 2) of childbearing age was estimated from the number of reported Stage 3 HIV (i.e., AIDS) cases using a back-calculation method. RESULTS: An estimated 115,200 women aged 13-44 years were living with Stage 1 or 2 HIV disease in 2006. A total of 56,200 women were living with diagnosed Stage 3 disease. The estimated number of births to all women living with HIV disease (diagnosed or undiagnosed) was 8,700 [95% Confidence Interval (CI): 8,400-8,800] in 2006. CONCLUSIONS: The number of infants born to HIV-infected women in 2006 was approximately 30% greater than the number of such births (6,075-6,422) in 2000. This increase highlights the need to continue and strengthen efforts to prevent perinatal HIV transmission in the U.S. |
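The zero-inflated Poisson model used above mixes "structural" zeros with an ordinary Poisson count process, which suits surveillance counts where many reporting units contribute no cases. A minimal sketch of the ZIP probability mass function (illustrative only; the parameter values are invented, not estimates from the paper):

```python
import math

def zip_pmf(k, pi, lam):
    """P(K = k) under a zero-inflated Poisson: with probability pi the
    count is a structural zero; otherwise K ~ Poisson(lam)."""
    pois = math.exp(-lam) * lam ** k / math.factorial(k)
    if k == 0:
        return pi + (1 - pi) * pois
    return (1 - pi) * pois
```

For example, with pi = 0.3 and lam = 2.0, the probability of observing a zero count is 0.3 + 0.7·e⁻² (about 0.39), well above the plain Poisson value of e⁻² ≈ 0.14.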
Increased detection of the HIV-1 reverse transcriptase M184V mutation using mutation-specific minority assays in a UK surveillance study suggests evidence of unrecognized transmitted drug resistance
Buckton A , Prabhu D , Motamed C , Harris R , Hill C , Murphy G , Parry J , Johnson J , Lowndes C , Gill N , Pillay D , Cane P . HIV Med 2011 12 (4) 250-4 OBJECTIVES: The aim of the study was to estimate the levels of transmitted drug resistance (TDR) in HIV-1 using very sensitive assays to detect minority drug-resistant populations. METHODS: We tested unlinked anonymous serum specimens from sexual health clinic attendees, who had not received an HIV diagnosis at the time of sampling, by both standard genotyping and minority detection assays. RESULTS: By standard genotyping, 21 of 165 specimens (12.7%) showed evidence of drug resistance; using a combination of standard genotyping and minority mutation assays targeting three commonly observed drug resistance mutations that cause high-level resistance to commonly prescribed first-line antiretroviral therapy (ART), this rose to 32 of 165 (19.4%). This 45% increase in drug resistance levels [95% confidence interval (CI) 15.2-83.7%; P=0.002] was statistically significant. Almost all of the increase was accounted for by additional detections of the M184V mutation. CONCLUSIONS: Future surveillance studies of TDR in the United Kingdom should consider combining standard genotyping and minority-specific assays to provide more accurate estimates, particularly when using specimens collected from chronic HIV infections in which TDR variants may have declined to low levels. |
Development of a standardized screening rule for tuberculosis in people living with HIV in resource-constrained settings: individual participant data meta-analysis of observational studies
Getahun H , Kittikraisak W , Heilig CM , Corbett EL , Ayles H , Cain KP , Grant AD , Churchyard GJ , Kimerling M , Shah S , Lawn SD , Wood R , Maartens G , Granich R , Date AA , Varma JK . PLoS Med 2011 8 (1) e1000391 BACKGROUND: The World Health Organization recommends the screening of all people living with HIV for tuberculosis (TB) disease, followed by TB treatment, or isoniazid preventive therapy (IPT) when TB is excluded. However, the difficulty of reliably excluding TB disease has severely limited TB screening and IPT uptake in resource-limited settings. We conducted an individual participant data meta-analysis of primary studies, aiming to identify a sensitive TB screening rule. METHODS AND FINDINGS: We identified 12 studies that had systematically collected sputum specimens regardless of signs or symptoms, at least one mycobacterial culture, clinical symptoms, and HIV and TB disease status. Bivariate random-effects meta-analysis and hierarchical summary receiver operating characteristic (HSROC) curves were used to evaluate the screening performance of all combinations of variables of interest. TB disease was diagnosed in 557 (5.8%) of 9,626 people living with HIV. The primary analysis included 8,148 people living with HIV who could be evaluated on five symptoms from nine of the 12 studies. The median age was 34 years. The best performing rule was the presence of any one of: current cough (any duration), fever, night sweats, or weight loss. The overall sensitivity of this rule was 78.9% (95% confidence interval [CI] 58.3%-90.9%) and specificity was 49.6% (95% CI 29.2%-70.1%). Its sensitivity increased to 90.1% (95% CI 76.3%-96.2%) among participants selected from clinical settings and to 88.0% (95% CI 76.1%-94.4%) among those who were not previously screened for TB. Negative predictive value was 97.7% (95% CI 97.4%-98.0%) and 90.0% (95% CI 88.6%-91.3%) at 5% and 20% prevalence of TB among people living with HIV, respectively. 
Adding abnormal chest radiographic findings increased the sensitivity of the rule by 11.7% (from 78.9% to 90.6%) at the cost of a 10.7% reduction in specificity (from 49.6% to 38.9%). CONCLUSIONS: Absence of all of current cough, fever, night sweats, and weight loss can identify a subset of people living with HIV who have a very low probability of having TB disease. A simplified screening rule using any one of these symptoms can be used in resource-constrained settings to identify people living with HIV in need of further diagnostic assessment for TB. Use of this algorithm should result in earlier TB diagnosis and treatment, and should allow for substantial scale-up of IPT. |
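The negative predictive values quoted in the abstract follow directly from sensitivity, specificity, and prevalence via Bayes' rule. A small sketch (tiny differences from the reported figures reflect rounding of the inputs):

```python
def npv(sens, spec, prev):
    """Negative predictive value: P(no disease | negative screen)."""
    true_neg = spec * (1 - prev)    # screen-negative and disease-free
    false_neg = (1 - sens) * prev   # screen-negative but diseased
    return true_neg / (true_neg + false_neg)

# reported four-symptom rule: sensitivity 78.9%, specificity 49.6%
npv(0.789, 0.496, 0.05)   # ~0.978 at 5% TB prevalence
npv(0.789, 0.496, 0.20)   # ~0.904 at 20% TB prevalence
```

These match the 97.7% and 90.0% point estimates in the abstract to within rounding, which is a useful sanity check on the rule's reported operating characteristics.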
Isolation of Bartonella capreoli from elk
Bai Y , Cross PC , Malania L , Kosoy M . Vet Microbiol 2011 148 329-32 The aim of the present study was to investigate the presence of Bartonella infections in elk populations. We report the isolation of four Bartonella strains from 55 elk blood samples. Sequencing analysis demonstrated that all four strains belong to Bartonella capreoli, a bacterium originally described in wild roe deer in Europe. Our findings demonstrate for the first time that B. capreoli has a wide geographic range, and that elk may be another host for this bacterium. Further investigations are needed to determine the impact of this bacterium on wildlife. |
Sensitivity of birth certificate reports of birth defects in Atlanta, 1995-2005: effects of maternal, infant, and hospital characteristics
Boulet SL , Shin M , Kirby RS , Goodman D , Correa A . Public Health Rep 2011 126 (2) 186-94 OBJECTIVES: We assessed variations in the sensitivity of birth defect diagnoses derived from birth certificate data by maternal, infant, and hospital characteristics. METHODS: We compared birth certificate data for 1995-2005 births in Atlanta with data from the Metropolitan Atlanta Congenital Defects Program (MACDP). We calculated the sensitivity of birth certificates for reporting defects often discernible at birth (e.g., anencephaly, spina bifida, cleft lip, clubfoot, Down syndrome, and rectal atresia or stenosis). We used multivariable logistic regression models to examine associations with sociodemographic and hospital factors. RESULTS: The overall sensitivity of birth certificates was 23% and ranged from 7% for rectal atresia/stenosis to 69% for anencephaly. Non-Hispanic black maternal race/ethnicity, less than a high school education, and preterm birth were independently associated with a lower probability of a birth defect diagnosis being reported on a birth certificate. Sensitivity was also lower for hospitals with > 1,000 births per year. CONCLUSIONS: The underreporting of birth defects on birth certificates is influenced by sociodemographic and hospital characteristics. Interpretation of birth defects prevalence estimates derived from birth certificate reports should take these issues into account. |
Effective state-based surveillance for multidrug-resistant organisms related to health care-associated infections
Duffy J , Sievert D , Rebmann C , Kainer M , Lynfield R , Smith P , Fridkin S . Public Health Rep 2011 126 (2) 176-85 In September 2008, the Council of State and Territorial Epidemiologists and the Centers for Disease Control and Prevention sponsored a meeting of public health and infection-control professionals to address the implementation of surveillance for multidrug-resistant organisms (MDROs)-particularly those related to health care-associated infections. The group discussed the role of health departments and defined goals for future surveillance activities. Participants identified the following main points: (1) surveillance should guide prevention and infection-control activities, (2) an MDRO surveillance system should be adaptable and not organism specific, (3) new systems should utilize and link existing systems, and (4) automated electronic laboratory reporting will be an important component of surveillance but will take time to develop. Current MDRO reporting mandates and surveillance methods vary across states and localities. Health departments that have not already done so should be proactive in determining what type of system, if any, will fit their needs. |
Efficacy of succimer chelation of mercury at background exposures in toddlers: a randomized trial
Cao Y , Chen A , Jones RL , Radcliffe J , Dietrich KN , Caldwell KL , Peddada S , Rogan WJ . J Pediatr 2011 158 (3) 480-485 e1 OBJECTIVE: To examine whether succimer, a mercaptan compound known to reduce blood lead concentration in children, reduces blood mercury concentration. STUDY DESIGN: We used samples from a randomized clinical trial of succimer chelation for lead-exposed children. We measured mercury levels in pre-treatment samples from 767 children. We also measured mercury levels in blood samples drawn 1 week after treatment began (n = 768) and in a 20% random sample of the children who received the maximum 3 courses of treatment (n = 67). A bootstrap-based isotonic regression method was used to assess the trend over time in the difference between adjusted mean mercury concentrations in the succimer and placebo groups. RESULTS: The adjusted mean organic mercury concentration in the succimer group relative to the placebo group fell from 99% at baseline to 82% after 3 courses of treatment (P for trend = .048), but this resulted from the prevention of the age-related increase in the succimer group. CONCLUSION: Succimer chelation for low-level organic mercury exposure in children has limited efficacy. |
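Isotonic regression constrains fitted values to be monotone in time, and the standard fitting procedure is the pool-adjacent-violators algorithm (PAVA). A self-contained sketch of PAVA for a non-decreasing, equal-weights fit (illustrative only; the trial's bootstrap-based procedure is more involved than this):

```python
def pava(y):
    """Pool-adjacent-violators: least-squares non-decreasing fit.
    Adjacent blocks that violate monotonicity are merged and
    replaced by their weighted mean."""
    blocks = []  # each block: [mean, weight]
    for v in y:
        blocks.append([float(v), 1])
        # merge backwards while the monotonicity constraint is violated
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2 = blocks.pop()
            m1, w1 = blocks.pop()
            blocks.append([(m1 * w1 + m2 * w2) / (w1 + w2), w1 + w2])
    return [m for m, w in blocks for _ in range(w)]

pava([1, 3, 2, 4])   # -> [1.0, 2.5, 2.5, 4.0]
```

Pairing a fit like this with bootstrap resampling of the underlying samples is one common way to attach a P value to a monotone trend, which is the flavor of analysis the abstract describes.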
Enhanced surveillance of norovirus outbreaks of gastroenteritis in Georgia
Widdowson MA , Bulens SN , Beard RS , Lane KM , Monroe SS , Lance S , Bresee JS , Glass RI . Public Health Rep 2011 126 (2) 251-8 OBJECTIVES: The role of noroviruses in both foodborne and person-to-person outbreaks of acute gastroenteritis (AGE) has been difficult to determine in the U.S. because of lack of routine norovirus testing and of national reporting of person-to-person outbreaks. We conducted a prospective study in one state in which enhanced testing for noroviruses was performed to better understand the relative contribution of all gastroenteric pathogens. METHODS: During the two-year period 2000-2001, we took all fecal specimens from AGE outbreaks reported in Georgia that were negative for bacteria and tested these for norovirus. RESULTS: We investigated 78 AGE outbreaks; suitable fecal samples were collected from 57 of them. Norovirus was identified in 25 (44%) outbreaks, bacteria in 20 (35%) outbreaks, and parasites in one (2%) outbreak. Forty-three (75%) of the outbreaks tested were foodborne, of which 17 (40%) were attributable to norovirus and 18 (42%) were attributable to bacteria. Adjusting for incomplete testing, we estimated that 53% of all AGE outbreaks were attributable to norovirus. A total of 2,674 people were reported ill in the 57 outbreaks, and norovirus infections accounted for 1,735 (65%) of these cases. Norovirus outbreaks tended to be larger than bacterial outbreaks, with a median of 30 vs. 16 cases per outbreak, respectively (p = 0.057). CONCLUSIONS: This study provides further evidence that noroviruses are, overall, the most common cause of AGE outbreaks in the U.S. Improved specimen collection, reporting of person-to-person outbreaks, and access to molecular assays are needed to further understand the role of these viruses and methods for their prevention. |
Implications of alternative definitions of prediabetes for prevalence in U.S. adults
James C , Bullard KM , Rolka DB , Geiss LS , Williams DE , Cowie CC , Albright A , Gregg EW . Diabetes Care 2011 34 (2) 387-91 OBJECTIVE: To compare the prevalence of prediabetes using A1C, fasting plasma glucose (FPG), and oral glucose tolerance test (OGTT) criteria, and to examine the degree of agreement between the measures. RESEARCH DESIGN AND METHODS: We used the 2005-2008 National Health and Nutrition Examination Surveys to classify 3,627 adults aged ≥ 18 years without diabetes according to their prediabetes status using A1C, FPG, and OGTT. We compared the prevalence of prediabetes according to different measures and used conditional probabilities to examine agreement between measures. RESULTS: In 2005-2008, the crude prevalence of prediabetes in adults aged ≥ 18 years was 14.2% for A1C 5.7-6.4% (A1C5.7), 26.2% for FPG 100-125 mg/dL (IFG100), 7.0% for FPG 110-125 mg/dL (IFG110), and 13.7% for OGTT 140-199 mg/dL (IGT). Prediabetes prevalence varied by age, sex, and race/ethnicity, and there was considerable discordance between measures of prediabetes. Among those with IGT, 58.2, 23.4, and 32.3% had IFG100, IFG110, and A1C5.7, respectively, and 67.1% had the combination of either A1C5.7 or IFG100. CONCLUSIONS: The prevalence of prediabetes varied by the indicator used to measure risk; there was considerable discordance between indicators and in the characteristics of individuals identified as having prediabetes. Programs to prevent diabetes may need to consider issues of equity, resources, need, and efficiency in targeting their efforts. |
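Agreement figures of the kind reported above (e.g., the share of IGT-positive adults who also meet IFG100) are conditional probabilities computed over paired classification flags. A toy sketch with made-up data, not the NHANES records themselves:

```python
def conditional_share(flags_a, flags_b):
    """P(positive on indicator B | positive on indicator A),
    given paired 0/1 classification flags for each subject."""
    b_given_a = [b for a, b in zip(flags_a, flags_b) if a]
    return sum(b_given_a) / len(b_given_a)

# hypothetical flags: 3 subjects positive on A, 2 of those positive on B
conditional_share([1, 1, 1, 0, 0], [1, 0, 1, 1, 0])   # -> 2/3
```

Note the asymmetry: P(B | A) generally differs from P(A | B) whenever the two indicators have different marginal prevalences, which is exactly the discordance pattern the abstract reports.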
National outbreak of salmonella serotype Saintpaul infections: importance of Texas restaurant investigations in implicating jalapeno peppers
Mody RK , Greene SA , Gaul L , Sever A , Pichette S , Zambrana I , Dang T , Gass A , Wood R , Herman K , Cantwell LB , Falkenhorst G , Wannemuehler K , Hoekstra RM , McCullum I , Cone A , Franklin L , Austin J , Delea K , Behravesh CB , Sodha SV , Yee JC , Emanuel B , Al-Khaldi SF , Jefferson V , Williams IT , Griffin PM , Swerdlow DL . PLoS One 2011 6 (2) e16579 BACKGROUND: In May 2008, PulseNet detected a multistate outbreak of Salmonella enterica serotype Saintpaul infections. Initial investigations identified an epidemiologic association between illness and consumption of raw tomatoes, yet cases continued. In mid-June, we investigated two clusters of outbreak strain infections in Texas among patrons of Restaurant A and two establishments of Restaurant Chain B to determine the outbreak's source. METHODOLOGY/PRINCIPAL FINDINGS: We conducted independent case-control studies of Restaurant A and B patrons. Patients were matched to well controls by meal date. We conducted restaurant environmental investigations and traced the origin of implicated products. Forty-seven case-patients and 40 controls were enrolled in the Restaurant A study. Thirty case-patients and 31 controls were enrolled in the Restaurant Chain B study. In both studies, illness was independently associated with only one menu item, fresh salsa (Restaurant A: matched odds ratio [mOR], 37; 95% confidence interval [CI], 7.2-386; Restaurant B: mOR, 13; 95% CI 1.3-infinity). The only ingredient in common between the two salsas was raw jalapeno peppers. Cultures of jalapeno peppers collected from an importer that supplied Restaurant Chain B and serrano peppers and irrigation water from a Mexican farm that supplied that importer with jalapeno and serrano peppers grew the outbreak strain. CONCLUSIONS/SIGNIFICANCE: Jalapeno peppers, contaminated before arrival at the restaurants and served in uncooked fresh salsas, were the source of these infections. 
Our investigations, critical in understanding the broader multistate outbreak, exemplify an effective approach to investigating large foodborne outbreaks. Additional measures are needed to reduce produce contamination. |
Perceived financial need and sexual risk behavior among urban, minority patients following sexually transmitted infection diagnosis
Schwartz RM , Bruno DM , Augenbraun MA , Hogben M , Joseph MA , Liddon N , McCormack WM , Rubin SR , Wilson TE . Sex Transm Dis 2011 38 (3) 230-4 BACKGROUND: Previous studies have shown that racial/ethnic and gender disparities in human immunodeficiency virus (HIV)/sexually transmitted infections (STI) may be due in part to factors such as poverty and income-inequality. Little has been published in the HIV/STI literature on the effect of the perception of having unmet basic needs on sexual risk behavior. METHODS: Data on perceived financial need and sexual risk were collected as part of a behavioral intervention aimed at promoting STI partner notification and reducing sexual risk behavior among minority patients presenting for care at 1 of 2 STI treatment centers in Brooklyn, NY, between January 2002 and December 2004. Data from 528 participants collected at the 6-month follow-up visit were used for the current study. RESULTS: Forty-three percent of participants were categorized as having unmet needs. Those with unmet needs were more likely to report unprotected anal or vaginal intercourse (UAVI; 62%) than those who had met needs (53%). This association was significant (adjusted odds ratio = 1.28; 95% confidence interval = 1.04-1.53), after controlling for age, sex, site of recruitment, intervention group membership, and country of origin. Stratified analyses indicated that, in the group that did not receive the intervention, there was a statistically significant interaction between sex and basic needs such that women with unmet needs were more likely to report any UAVI (78%) than those with met needs (54%) (adjusted odds ratio = 1.18; 95% confidence interval = 1.07-1.24). No such relationship was detected for men in this sample. CONCLUSIONS: The significant association between perceived unmet needs and UAVI appears to be particularly relevant for women. 
These findings provide preliminary evidence that HIV/STI intervention components that directly address reduction of partner conflict might benefit women with high perceived unmet basic needs, for whom the potential dissolution of a relationship may represent a further loss in the ability to meet basic needs. |
The impact of media coverage on the transmission dynamics of human influenza
Tchuenche JM , Dube N , Bhunu CP , Smith RJ , Bauch CT . BMC Public Health 2011 11 Suppl 1 S5 BACKGROUND: There is an urgent need to understand how the provision of information influences individual risk perception and how this in turn shapes the evolution of epidemics. Individuals are influenced by information in complex and unpredictable ways. Emerging infectious diseases, such as the recent swine flu epidemic, may be particular hotspots for a media-fueled rush to vaccination; conversely, seasonal diseases may receive little media attention, despite their high mortality rate, due to their perceived lack of newness. METHODS: We formulate a deterministic transmission and vaccination model to investigate the effects of media coverage on the transmission dynamics of influenza. The population is subdivided into different classes according to their disease status. The compartmental model includes the effect of media coverage on reporting the number of infections as well as the number of individuals successfully vaccinated. RESULTS: A threshold parameter (the basic reproductive ratio) is analytically derived and used to discuss the local stability of the disease-free steady state. The impact of costs that can be incurred, including vaccination, education, implementation, and media campaigns, is also investigated using optimal control theory. A simplified version of the model with pulse vaccination shows that the media can trigger a vaccinating panic if the vaccine is imperfect and simplified messages result in the vaccinated mixing with the infectives without regard to disease risk. CONCLUSIONS: The effects of media on an outbreak are complex. Simplified understandings of disease epidemiology, propagated through media soundbites, may make the disease significantly worse. |
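The flavor of model described above can be illustrated with a minimal SIR sketch in which media coverage damps transmission as prevalence rises, via beta(I) = beta0·exp(-m·I). This is an invented toy, not the authors' compartmental model, and every parameter value below is hypothetical:

```python
import math

def sir_with_media(beta0, m, gamma, i0, days, dt=0.1):
    """Forward-Euler SIR on population fractions (S + I + R = 1),
    where media awareness reduces the effective contact rate:
    beta(I) = beta0 * exp(-m * I). Returns final (S, I, R)."""
    s, i, r = 1.0 - i0, i0, 0.0
    for _ in range(int(round(days / dt))):
        beta = beta0 * math.exp(-m * i)   # media-damped transmission
        new_inf = beta * s * i * dt
        new_rec = gamma * i * dt
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
    return s, i, r

# stronger media response (larger m) -> smaller final epidemic size
_, _, r_no_media = sir_with_media(0.5, 0.0, 0.2, 0.01, 300)
_, _, r_media = sir_with_media(0.5, 50.0, 0.2, 0.01, 300)
```

With beta0 = 0.5 and gamma = 0.2 (basic reproductive ratio 2.5), the media term shrinks the attack rate; the abstract's point is that the interaction is not always this benign once vaccination behavior reacts to coverage too.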
Longitudinal predictors of human papillomavirus vaccine initiation among adolescent girls in a high-risk geographic area
Brewer NT , Gottlieb SL , Reiter PL , McRee AL , Liddon N , Markowitz L , Smith JS . Sex Transm Dis 2011 38 (3) 197-204 BACKGROUND: Human papillomavirus (HPV) vaccine uptake is low among adolescent girls in the United States. We sought to identify longitudinal predictors of HPV vaccine initiation in populations at elevated risk for cervical cancer. METHODS: We interviewed a population-based sample of parents of 10- to 18-year-old girls in areas of North Carolina with elevated cervical cancer rates. Baseline interviews occurred in summer 2007 and follow-up interviews in fall 2008. Measures included health belief model constructs. RESULTS: Parents reported that 27% (149/567) of their daughters had initiated HPV vaccine between baseline and follow-up. Of parents who at baseline intended to get their daughters the vaccine in the next year, only 38% (126/348) had done so by follow-up. Of parents of daughters who remained unvaccinated at follow-up but had seen a doctor since baseline, only 37% (122/388) received an HPV vaccine recommendation. Rates of HPV vaccine initiation were higher among parents who at baseline perceived lower barriers to getting HPV vaccine, anticipated greater regret if their daughters got HPV because they were unvaccinated, did not report "needing more information" as the main reason they had not already vaccinated, intended to get their daughters the vaccine, or were not born-again Christians. CONCLUSIONS: Missed opportunities to increase HPV vaccine uptake included unrealized parent intentions and absent doctor recommendations. While several health belief model constructs identified in early acceptability studies (e.g., perceived risk, perceived vaccine effectiveness) were not longitudinally associated with HPV vaccine initiation, our findings suggest correlates of uptake (e.g., anticipated regret) that offer novel opportunities for intervention. |
Group B streptococcal vaccine for resource-poor countries
Schrag SJ . Lancet 2011 378 (9785) 11-2 Neonatal deaths, which occur mostly in resource-poor countries during the first week of life, constitute 41% of the 8·8 million deaths in children aged less than 5 years worldwide.1 Sepsis and pneumonia cause about a third of neonatal deaths. Maternal immunisation—the prevention cornerstone of neonatal tetanus and influenza programmes—has untapped potential to protect neonates from other infectious diseases. Group B streptococcal vaccines are uniquely suited to maternal immunisation in view of the substantial perinatal morbidity and mortality, particularly in the first 48 h of life. Group B streptococcus emerged in industrialised countries as a leading neonatal pathogen in the 1970s. Transmission is mostly from mother to newborn during the intrapartum period. Sepsis, pneumonia, and meningitis in the first week of life (early-onset) are the most common diseases caused by group B streptococcus. Medical advances have reduced case fatality from 20–50% to 5%;2 however, infants who survive often have long-term neurological sequelae. Although data for neonatal infection from resource-poor countries are sparse, the burden is clear in sub-Saharan Africa, where neonates have the highest risk of dying. Compelling surveillance in southern Africa documents high rates of invasive disease (>2 cases per 1000 livebirths) and death (14–38% of cases).3, 4 Data for rural Kenya identify group B streptococcus as a leading neonatal pathogen.5 Maternal and neonatal colonisation rates in more than nine African countries are similar to those in countries with a substantial documented disease burden. |
Immunologic non-inferiority of a newly licensed inactivated trivalent influenza vaccine versus an established vaccine: a randomized study in US adults
Campbell JD , Chambers CV , Brady RC , Caldwell MC , Bennett NL , Fourneau MA , Jain VK , Innis BL . Hum Vaccin 2011 7 (1) 81-8 A trivalent inactivated influenza vaccine (Fluarix, GlaxoSmithKline Biologicals) was licensed under US accelerated approval regulations. We performed a randomized, observer-blind, post-approval study to demonstrate its immunological non-inferiority versus an established US-licensed vaccine (primary endpoint). Adult (including elderly) subjects received a single injection of newly-licensed vaccine (n=923) or established vaccine (n=922). Serum hemagglutination-inhibition titers were determined pre-vaccination and 21-28 days after vaccination. Non-inferiority was assessed by post-vaccination geometric mean titer (GMT) ratio (upper 95% confidence interval [CI] ≤1.5) and difference in seroconversion rate (upper 95% CI ≤0.1) for all three vaccine strains. Safety was monitored for 6 months. The newly-licensed vaccine was non-inferior to the established vaccine in all subjects (≥18 years) and in elderly subjects (≥65 years). Adjusted GMT ratios (established/newly-licensed) against the H1N1, H3N2 and B strains were 0.65 (95% CI: 0.58, 0.73), 0.93 (0.83, 1.04) and 1.13 (1.03, 1.25) for all subjects and 0.75 (0.67, 0.85), 0.95 (0.82, 1.09) and 1.13 (1.00, 1.27) for elderly subjects. Corresponding values for the differences in seroconversion rate (established minus newly-licensed) were -0.12 (-0.16, -0.07), -0.02 (-0.06, 0.03) and 0.01 (-0.04, 0.06) for all subjects and -0.11 (-0.16, -0.05), -0.02 (-0.07, 0.04) and 0.02 (-0.04, 0.08) for elderly subjects. The most common adverse events with both vaccines were injection site pain, fatigue and headache, and no serious adverse events or deaths were considered related; there were no clinically relevant differences between the vaccines. 
In conclusion, the newly-licensed vaccine was well tolerated and immunologically non-inferior to the established vaccine for all three vaccine strains in the whole population and the elderly. |
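The trial's decision rule can be made concrete with a minimal sketch. This is our illustration, not the study's analysis code: the margins (GMT-ratio upper 95% CI ≤1.5; seroconversion-difference upper 95% CI ≤0.1) and the CI bounds come from the abstract, while the function name is hypothetical.

```python
# Sketch of the non-inferiority decision rule described above.
# Inputs are the upper 95% CI bounds for the adjusted GMT ratio
# (established/newly-licensed) and for the seroconversion-rate difference
# (established minus newly-licensed), one value per vaccine strain.

def non_inferior(gmt_ratio_uppers, sc_diff_uppers,
                 gmt_margin=1.5, sc_margin=0.1):
    """Non-inferiority holds only if every strain meets both margins."""
    return (all(u <= gmt_margin for u in gmt_ratio_uppers) and
            all(u <= sc_margin for u in sc_diff_uppers))

# Reported upper bounds for all subjects (H1N1, H3N2, B):
print(non_inferior([0.73, 1.04, 1.25], [-0.07, 0.03, 0.06]))  # True
```

Note that the rule is conjunctive: a single strain exceeding either margin would fail the primary endpoint.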
Correlates of receiving recommended adolescent vaccines among adolescent females in North Carolina
Reiter PL , McRee AL , Gottlieb SL , Brewer NT . Hum Vaccin 2011 7 (1) 67-73 BACKGROUND: Immunization is a successful and cost-effective method for preventing disease, yet many adolescents do not receive recommended vaccines. We assessed correlates of uptake of three vaccines (tetanus booster, meningococcal, and human papillomavirus [HPV] vaccines) recommended for adolescent females. METHODS: We examined cross-sectional data from 647 parents of 11-20 year-old females from North Carolina who completed the Carolina HPV Immunization Measurement and Evaluation (CHIME) Project follow-up survey in late 2008. Analyses used ordinal and binary logistic regression. RESULTS: Only 17% of parents indicated their daughters had received all three vaccines. Eighty-seven percent of parents indicated their daughters had received tetanus booster vaccine, 36% reported vaccination against meningococcal disease, and 36% reported HPV vaccine initiation. Daughters aged 13-15 years (OR=1.70, 95% CI: 1.09-2.64) or 16-20 years (OR=2.28, 95% CI: 1.51-3.44) had received a greater number of these vaccines compared to daughters aged 11-12 years. Daughters who had preventive care visits in the last year (OR=4.81, 95% CI: 3.14-7.34) or whose parents had at least some college education (OR=1.90, 95% CI: 1.29-2.80) had also received a greater number of these vaccines. CONCLUSIONS: Few daughters, particularly 11-12 year-olds, had received all three vaccines recommended for adolescent females. Ensuring annual preventive care visits and increasing concomitant administration of adolescent vaccines may help increase vaccine coverage. |
Tracking antimicrobials dispensed during an anthrax attack: a case study from the New Hampshire anthrax exercise
Tropper J , Adamski C , Vinion C , Sapkota S . J Emerg Manag 2011 9 (1) 65-69 The Countermeasure and Response Administration (CRA) system is a Centers for Disease Control and Prevention informatics application developed to track countermeasures, including medical interventions (eg, vaccinations and pharmaceuticals) and nonmedical interventions (eg, patient isolation, quarantine, and personal protective equipment), administered during a public health response. This case study follows the use of CRA as a supplement to paper-based processes during an exercise in which antimicrobials dispensed to individual exposed persons were captured after a simulated bioterrorist attack of anthrax spores. The exercise was conducted by the New Hampshire Division of Public Health Services on April 14, 2007. Automated systems like CRA can track when medications are dispensed. The data can then be used for performance metrics, for statistics, and for locating victims for follow-up study. Given that this case study was limited to a single location in a relatively rural setting, the authors concluded that more study is needed to compare the feasibility of using an automated system rather than paper-based processes for effectively managing a very large-scale urgent public health response. |
Drowning mortality in the United States, 1999-2006
Nasrullah M , Muazzam S . J Community Health 2011 36 (1) 69-75 Drowning is the fifth leading cause of unintentional fatalities in the US. Our study describes the demographics and trends of unintentional drowning mortality in the US from 1999 to 2006 and identifies changes in deaths for specific population subgroups. Mortality data came from the CDC's Web-based Injury Statistics Query and Reporting System. Trends during 1999-2006 were analyzed by gender, age group, and race. Annual percentage change in deaths/rates and simple linear regression were used for time-trend analysis from 1999 to 2006 and to examine its significance. During 1999-2006, there were 27,514 deaths; 21,668 (78.8%) males, 21,380 (77.7%) whites, and 4,241 (15.4%) aged 00-04 years. The annual number of drowning deaths varied from a high of 3,529 in 1999 to a low of 3,281 in 2001. Overall, deaths increased 1.4%, from 3,529 in 1999 to 3,579 in 2006; however, the overall mortality rate decreased by 5%. The proportion of deaths was significantly greater among males than females (27.4 vs. 13.7%: p < 0.001) and among blacks than among all other races combined (32.5 vs. 21.3%: p < 0.001). Fatalities reported from California (n = 3,234; 11.75%), Florida (n = 2,852; 10.37%), and Texas (n = 2,395; 8.70%) accounted for 30.82% of all such deaths in the US. Sub-group analyses showed that drowning mortality decreased 0.72% for males but increased 9.52% for females; the trends differed significantly between males and females (p < 0.001). Males, American Indians, and blacks appear to have a higher risk of drowning mortality. The trends varied among sex, age, and racial groups from 1999 to 2006. Preventive measures and continuous surveillance are warranted to further decrease drowning mortality. |
Comparison of commercial systems for extraction of nucleic acids from DNA/RNA respiratory pathogens.
Yang G , Erdman DE , Kodani M , Kools J , Bowen MD , Fields BS . J Virol Methods 2011 171 (1) 195-9 This study compared six automated nucleic acid extraction systems and one manual kit for their ability to recover nucleic acids from human nasal wash specimens spiked with five respiratory pathogens, representing Gram-positive bacteria (Streptococcus pyogenes), Gram-negative bacteria (Legionella pneumophila), DNA viruses (adenovirus), segmented RNA viruses (human influenza virus A), and non-segmented RNA viruses (respiratory syncytial virus). The robots and kit evaluated represent major commercially available methods that are capable of simultaneous extraction of DNA and RNA from respiratory specimens, and included platforms based on magnetic-bead technology (KingFisher mL, Biorobot EZ1, easyMAG, KingFisher Flex, and MagNA Pure Compact) or glass fiber filter technology (Biorobot MDX and the manual kit Allprep). All methods yielded extracts free of cross-contamination and RT-PCR inhibition. All automated systems recovered L. pneumophila and adenovirus DNA equivalently. However, the MagNA Pure protocol demonstrated more than 4-fold higher DNA recovery from the S. pyogenes than other methods. The KingFisher mL and easyMAG protocols provided 1- to 3-log wider linearity and extracted 3- to 4-fold more RNA from the human influenza virus and respiratory syncytial virus. These findings suggest that systems differed in nucleic acid recovery, reproducibility, and linearity in a pathogen specific manner. |
Performance of A1C for the classification and prediction of diabetes
Selvin E , Steffes MW , Gregg E , Brancati FL , Coresh J . Diabetes Care 2011 34 (1) 84-9 OBJECTIVE: Although A1C is now recommended to diagnose diabetes, its test performance for diagnosis and prognosis is uncertain. Our objective was to assess the test performance of A1C against single and repeat glucose measurements for diagnosis of prevalent diabetes and for prediction of incident diabetes. RESEARCH DESIGN AND METHODS: We conducted population-based analyses of 12,485 participants in the Atherosclerosis Risk in Communities (ARIC) study and a subpopulation of 691 participants in the Third National Health and Nutrition Examination Survey (NHANES III) with repeat test results. RESULTS: Against a single fasting glucose ≥126 mg/dl, the sensitivity and specificity of A1C ≥6.5% for detection of prevalent diabetes were 47 and 98%, respectively (area under the curve 0.892). Against repeated fasting glucose (3 years apart) ≥126 mg/dl, sensitivity improved to 67% and specificity remained high (97%) (AUC 0.936). Similar results were obtained in NHANES III against repeated fasting glucose 2 weeks apart. The accuracy of A1C was consistent across age, BMI, and race groups. For individuals with fasting glucose ≥126 mg/dl and A1C ≥6.5% at baseline, the 10-year risk of diagnosed diabetes was 88% compared with 55% among those individuals with fasting glucose ≥126 mg/dl and A1C 5.7-<6.5%. CONCLUSIONS: A1C performs well as a diagnostic tool when diabetes definitions that most closely resemble those used in clinical practice are used as the "gold standard." The high risk of diabetes among individuals with both elevated fasting glucose and A1C suggests a dual role for fasting glucose and A1C for prediction of diabetes. |
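As a reminder of how the reported test performance is computed, here is a minimal sketch (ours, not the authors' code). The 2x2 counts are hypothetical, chosen only to reproduce the reported 47% sensitivity and 98% specificity of A1C ≥6.5% against a single fasting glucose ≥126 mg/dl.

```python
# Sensitivity and specificity of a diagnostic cutoff (e.g., A1C >= 6.5%)
# against a gold standard (e.g., fasting glucose >= 126 mg/dl).

def sens_spec(tp, fn, tn, fp):
    """Return (sensitivity, specificity) from 2x2 table counts."""
    sensitivity = tp / (tp + fn)  # true cases correctly flagged
    specificity = tn / (tn + fp)  # non-cases correctly cleared
    return sensitivity, specificity

# Hypothetical counts per 100 cases and 100 non-cases:
sens, spec = sens_spec(tp=47, fn=53, tn=98, fp=2)
print(sens, spec)  # 0.47 0.98
```

The abstract's point that sensitivity rises (to 67%) against a repeated-glucose gold standard corresponds to fewer true cases being missed (a smaller `fn`) when the reference test itself is more reliable.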
The preparation and storage of dried-blood spot quality control materials for lysosomal storage disease screening tests
Adam BW , Orsini JJ Jr , Martin M , Hall EM , Zobel SD , Caggana M , Hannon WH . Clin Biochem 2011 44 704-10 OBJECTIVE: We aimed to prepare dried-blood spot (DBS) quality control (QC) materials for lysosomal storage disease (LSD) screening tests and to determine optimum blood and DBS storage conditions. METHODS: We compared enzyme activities of five LSD markers in adult blood, umbilical-cord blood, and leukocyte-reduced blood. We measured activities in liquid blood and DBSs after predetermined intervals at controlled temperatures and humidities. RESULTS: Lysosomal-enzyme activity levels in umbilical-cord blood mimicked those in newborn screening samples. Lysosomal-enzyme activities in leukocyte-reduced blood were lower than in LSD-positive patient samples. Enzyme activities were stable in refrigerated liquid blood for 32 days and in frozen DBSs stored at low humidity for a year. Activity losses from DBSs after 34 days at 37 ± 1 °C were 35-66% in low humidity and 61-100% in high humidity. CONCLUSIONS: Umbilical-cord blood is the preferred matrix for LSD-normal DBS QC materials. Leukocyte-reduced blood is lysosomal enzyme-deficient. Failure to control humidity during DBS storage results in loss of lysosomal-enzyme activities. |
Immunotherapy with a combination of intravenous immune globulin and p4 peptide rescues mice from postinfluenza pneumococcal pneumonia
Weeks JN , Boyd KL , Rajam G , Ades EW , McCullers JA . Antimicrob Agents Chemother 2011 55 (5) 2276-81 Alternate therapies are needed for treatment of secondary bacterial pneumonia following influenza. The immunomodulatory peptide P4 has shown promise in mouse models of primary pneumococcal infection. Mice infected with influenza virus then challenged with Streptococcus pneumoniae were treated with a combination of P4 peptide and intravenous immune globulin. Survival was improved from 20% to 80% in treated mice relative to controls. Clinical cure correlated with increased clearance of bacteria and decreased lung consolidation. Greater trafficking of professional phagocytic cells to the site of pneumococcal infection, coupled with enhanced opsonophagocytosis as manifested by decreased surface display of Fcgamma receptors on neutrophils and macrophages, was associated with P4 peptide treatment. This suggests that the mechanism of action for the improved clearance of bacteria engendered by P4 is improved uptake by phagocytes mediated by IgG Fc-Fcgamma receptor interactions following antibody-mediated opsonophagocytosis of bacteria. Antibody-based therapies, when coupled with immune modulators such as P4 peptide, may be an effective tool together with antibiotics in our armamentarium against severe pneumonia. |
Antibody responses to a spore carbohydrate antigen as a marker of non-fatal inhalation anthrax in Rhesus macaques
Saile E , Boons GJ , Buskas T , Carlson RW , Kannenberg EL , Barr JR , Boyer AE , Gallegos-Candela M , Quinn CP . Clin Vaccine Immunol 2011 18 (5) 743-8 The Bacillus anthracis exosporium protein BclA contains an O-linked antigenic tetrasaccharide whose terminal sugar is known as anthrose (3). We hypothesized that serologic responses to anthrose may have diagnostic value in confirming exposure to aerosolized B. anthracis. We evaluated the serologic responses to a synthetic anthrose-containing trisaccharide (ATS) in a group of five Rhesus macaques (RM) that survived inhalation anthrax following exposure to B. anthracis Ames spores. Two of five animals were treated with ciprofloxacin starting at 48 hours (RM2, RM3) and two at 72 hours (RM4, RM5) post-exposure; one animal was untreated (RM1). Infection was confirmed by blood culture and detection of anthrax toxin lethal factor (LF) in plasma. Anti-ATS IgG responses were determined at 14, 21, 28, and 35 days post-exposure with pre-exposure serum as a control. All animals, irrespective of ciprofloxacin treatment, mounted a specific, measurable anti-ATS IgG response. The earliest detectable responses were on day 14 (RM1, RM2, RM5) and on days 21 (RM4) and 28 (RM3). Specificity of the anti-ATS responses was demonstrated by competitive inhibition enzyme immunoassay (CIEIA), in which a two-fold excess of carbohydrate (wt/wt) in a bovine serum albumin (BSA) conjugate of the oligosaccharide (ATS-BSA) effected >94% inhibition, whereas a structural analog lacking the 3-hydroxy-3-methyl-butyryl moiety at the C-4" of the anthrosyl residue had no inhibitory activity. These data suggest that anti-ATS antibody responses may be used to identify aerosol exposure to B. anthracis spores. The anti-ATS antibody responses were detectable during administration of ciprofloxacin. |
Comparative proteomics and pulmonary toxicity of instilled single-walled carbon nanotubes, crocidolite asbestos, and ultrafine carbon black in mice
Teeguarden JG , Webb-Robertson BJ , Waters KM , Murray AR , Kisin ER , Varnum SM , Jacobs JM , Pounds JG , Zanger RC , Shvedova AA . Toxicol Sci 2011 120 (1) 123-135 Reflecting their exceptional potential to advance a range of biomedical, aeronautic, and other industrial products, carbon nanotube (CNT) production and the potential for human exposure to aerosolized CNTs are increasing. CNTs have toxicologically significant structural and chemical similarities to asbestos (AB) and have repeatedly been shown to cause pulmonary inflammation, granuloma formation, and fibrosis after inhalation/instillation/aspiration exposure in rodents, a pattern of effects similar to those observed following exposure to AB. To determine the degree to which responses to single-walled CNTs (SWCNT) and AB are similar or different, the pulmonary responses of C57BL/6 mice to repeated exposures to SWCNTs, crocidolite AB, and ultrafine carbon black (UFCB) were compared using high-throughput global high-performance liquid chromatography Fourier transform ion cyclotron resonance mass spectrometry (HPLC-FTICR-MS) proteomics, histopathology, and bronchoalveolar lavage cytokine analyses. Mice were exposed to material suspensions (40 micrograms per mouse) twice a week for 3 weeks by pharyngeal aspiration. Histologically, the incidence and severity of inflammatory and fibrotic responses were greatest in mice treated with SWCNTs. SWCNT treatment produced the greatest changes in abundance of identified lung tissue proteins. The trend in number of proteins affected (SWCNT [376] > AB [231] > UFCB [184]) followed the potency of these materials in three biochemical assays of inflammation (cytokines). SWCNT treatment uniquely affected the abundance of 109 proteins, but these proteins largely represent cellular processes affected by AB treatment as well, further evidence of broad similarity in the tissue-level response to AB and SWCNTs. 
Two high-sensitivity markers of inflammation, one (S100a9) observed in humans exposed to AB, were found and may be promising biomarkers of human response to SWCNT exposure. |
Recent changes in the trends of teen birth rates, 1981-2006
Wingo PA , Smith RA , Tevendale HD , Ferre C . J Adolesc Health 2011 48 (3) 281-8 PURPOSE: To explore trends in teen birth rates by selected demographics. METHODS: We used birth certificate data and joinpoint regression to examine trends in teen birth rates by age (10-14, 15-17, and 18-19 years) and race during 1981-2006 and by age and Hispanic origin during 1990-2006. Joinpoint analysis describes changing trends over successive segments of time and uses annual percentage change (APC) to express the amount of increase or decrease within each segment. RESULTS: For teens younger than 18 years, the decline in birth rates began in 1994 and ended in 2003 (APC: -8.03% per year for ages 10-14 years; APC: -5.63% per year for ages 15-17 years). The downward trend for 18- and 19-year-old teens began earlier (1991) and ended 1 year later (2004) (APC: -2.37% per year). For each study population, the trend was approximately level during the most recent time segment, except for continuing declines for 18- and 19-year-old white and Asian/Pacific Islander teens. The only increasing trend in the most recent time segment was for 18- and 19-year-old Hispanic teens. During these declines, the age distribution of teens who gave birth shifted to slightly older ages, and the percentage whose current birth was at least their second birth decreased. CONCLUSIONS: Teen birth rates were generally level during 2003/2004-2006 after the long-term declines. Rates increased among older Hispanic teens. These results indicate a need for renewed attention to effective teen pregnancy prevention programs in specific populations. |
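Joinpoint software summarizes each trend segment by its annual percentage change, derived from the slope of a log-linear fit of rates on calendar year via APC = 100·(e^b − 1). As a minimal sketch of that conversion (ours, not the authors' software), with made-up rates rather than the study's data:

```python
import math

# Sketch of how an annual percentage change (APC) is derived from a
# log-linear trend: fit ln(rate) = a + b*year by least squares, then
# APC = 100 * (e^b - 1).  The rates below are illustrative only.

def apc(years, rates):
    n = len(years)
    ybar = sum(years) / n
    lbar = sum(math.log(r) for r in rates) / n
    b = (sum((y - ybar) * (math.log(r) - lbar) for y, r in zip(years, rates))
         / sum((y - ybar) ** 2 for y in years))
    return 100.0 * (math.exp(b) - 1.0)

# A series declining by exactly 5% per year recovers an APC near -5:
years = list(range(1994, 2004))
rates = [50.0 * 0.95 ** i for i in range(10)]
print(round(apc(years, rates), 1))  # -5.0
```

A negative APC such as the −8.03% per year reported for ages 10-14 thus corresponds to a fitted rate ratio of about 0.92 between consecutive years.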
Maternal caffeine intake and risk of selected birth defects in the National Birth Defects Prevention Study
Browne ML , Hoyt AT , Feldkamp ML , Rasmussen SA , Marshall EG , Druschel CM , Romitti PA . Birth Defects Res A Clin Mol Teratol 2011 91 (2) 93-101 BACKGROUND: Caffeine intake is common during pregnancy, yet few epidemiologic studies have examined the association between maternal caffeine consumption and birth defects. Using data from the National Birth Defects Prevention Study (NBDPS), we examined the association between maternal caffeine consumption and anotia/microtia, esophageal atresia, small intestinal atresia, craniosynostosis, diaphragmatic hernia, omphalocele, and gastroschisis. METHODS: The NBDPS is a multi-site population-based case-control study. The present analysis included 3,346 case infants and 6,642 control infants born from October 1997 through December 2005. Maternal telephone interview reports of demographic characteristics and conditions and exposures before and during pregnancy were collected. Odds ratios and 95% confidence intervals, adjusted for relevant covariates, were calculated to estimate the associations between maternal dietary caffeine intake (coffee, tea, soda, and chocolate) and maternal use of caffeine-containing medications and each defect. RESULTS: We observed small, statistically significant elevations in adjusted odds ratios ranging from 1.3 to 1.8 for total maternal dietary caffeine intake or specific types of caffeinated beverages and anotia/microtia, esophageal atresia, small intestinal atresia, and craniosynostosis; however, dose-response patterns were absent. Periconceptional use of caffeine-containing medications was infrequent and estimates were imprecise. CONCLUSIONS: We did not find convincing evidence of an association between maternal caffeine intake and the birth defects included in this study. The increasing popularity of caffeine-containing energy drinks and other caffeinated products may result in higher caffeine intake among women of childbearing age. 
Future studies should consider more detailed evaluation of such products. |
Maternal self-reported genital tract infections during pregnancy and the risk of selected birth defects
Carter TC , Olney RS , Mitchell AA , Romitti PA , Bell EM , Druschel CM . Birth Defects Res A Clin Mol Teratol 2011 91 (2) 108-16 BACKGROUND: Genital tract infections are common during pregnancy and can result in adverse outcomes including preterm birth and neonatal infection. This hypothesis-generating study examined whether these infections are associated with selected birth defects. METHODS: We conducted a case-control study of 5913 children identified as controls and 12,158 cases with birth defects from the National Birth Defects Prevention Study (1997-2004). Maternal interviews provided data on genital tract infections that occurred from one month before pregnancy through the end of the first trimester. Infections were either grouped together as a single overall exposure or were considered as a subgroup that included chlamydia/gonorrhea/pelvic inflammatory disease. Odds ratios (ORs) and 95% confidence intervals (CIs) were estimated using unconditional logistic regression with adjustment for potential confounders. RESULTS: Genital tract infections were associated with bilateral renal agenesis/hypoplasia (OR, 2.89; 95% CI, 1.11-7.50), cleft lip with or without cleft palate (OR, 1.46; 95% CI, 1.03-2.06), and transverse limb deficiency (OR, 1.84; 95% CI, 1.04-3.26). Chlamydia/gonorrhea/pelvic inflammatory disease was associated with cleft lip only (OR, 2.81; 95% CI, 1.39-5.69). These findings were not statistically significant after adjustment for multiple comparisons. CONCLUSIONS: Caution is needed in interpreting these findings due to the possible misclassification of infection, the limited sample size that constrained consideration of the effects of treatment, and the possibility of chance associations. Although these data do not provide strong evidence for an association between genital tract infections and birth defects, additional research on the possible effects of these relatively common infections is needed. |
Does maternal feeding restriction lead to childhood obesity in a prospective cohort study?
Rifas-Shiman SL , Sherry B , Scanlon K , Birch LL , Gillman MW , Taveras EM . Arch Dis Child 2011 96 (3) 265-9 BACKGROUND: Some studies show that greater parental control over children's eating habits predicts later obesity, but it is unclear whether parents are reacting to infants who are already overweight. OBJECTIVE: To examine the longitudinal association between maternal feeding restriction at age 1 and body mass index (BMI) at age 3 and the extent to which the association is explained by weight for length (WFL) at age 1. METHODS: We studied 837 mother-infant pairs from a prospective cohort study. The main exposure was maternal feeding restriction at age 1, defined as agreeing or strongly agreeing with the following question: "I have to be careful not to feed my child too much." We ran multivariable linear regression models before and after adjusting for WFL at age 1. All models were adjusted for parental and child sociodemographic characteristics. RESULTS: 100 (12.0%) mothers reported feeding restriction at age 1. Mean (SD) WFL z-score at age 1 was 0.32 (1.01), and BMI z-score at age 3 was 0.43 (1.01). Maternal feeding restriction at age 1 was associated with higher BMI z-score at age 3 before (beta 0.26 (95% CI 0.05 to 0.48)) but not after (beta 0.00 (95% CI -0.17 to 0.18)) adjusting for WFL z-score at age 1. Each unit of WFL z-score at age 1 was associated with an increment of 0.57 BMI z-score units at age 3 (95% CI 0.51 to 0.62). CONCLUSIONS: We found that maternal feeding restriction was associated with children having a higher BMI at age 3 before, but not after, adjusting for WFL at age 1. One potential reason may be that parents restrict the food intake of infants who are already overweight. |
Regulatory approaches to worker protection in nanotechnology industry in the USA and European Union
Murashov V , Schulte P , Geraci C , Howard J . Ind Health 2011 49 (3) 280-96 A number of reports have been published regarding the applicability of existing regulatory frameworks to protect consumers and the environment from potentially adverse effects related to introduction of nanomaterials into commerce in the United States and the European Union. However, a detailed comparison of the regulatory approaches to worker safety and health in the USA and in the EU is lacking. This report aims to fill this gap by reviewing regulatory frameworks designed to protect workers and their possible application to nanotechnology. |
Worksite wellness program for respiratory disease prevention in heavy-construction workers
Hnizdo E , Berry A , Hakobyan A , Beeckman-Wagner LA , Catlett L . J Occup Environ Med 2011 53 (3) 274-81 OBJECTIVE: To describe a respiratory disease prevention program in a US heavy-construction company. METHODS: The program uses periodic spirometry and questionnaires and is integrated into a worksite wellness program involving individualized intervention. Spirometry Longitudinal Data Analysis (SPIROLA) technology is used to assist the physician with (i) management and evaluation of longitudinal spirometry and questionnaire data; (ii) designing, recording, and implementing intervention; and (iii) evaluation of the impact of the intervention. Preintervention data provide benchmark results. RESULTS: Preintervention results on 1224 workers with 5 or more years of follow-up showed that the mean rate of FEV1 decline was 47 mL/year. Age-stratified prevalence of moderate airflow obstruction was higher than that for the US population. CONCLUSION: Preintervention results indicate the need for respiratory disease prevention in this construction workforce and provide a benchmark for future evaluation of the intervention. |
Fire fighter wellness regime
Samo DG , Bogucki S , Hales T , Haimes S , Czarnecki F , Louis D . J Occup Environ Med 2011 53 (3) 229 In their article, Leffer and Grizzell1 have mistakenly claimed that “In 2007, the National Fire Protection Association (NFPA) began recommending yearly screening for firefighters to include a 12-MET minimum Stage 4 Bruce Protocol Stress Test.”2 This statement is not consistent with the content or intent of NFPA 1582, Standard on Comprehensive Occupational Medical Program for Fire Departments. Chapter 7 of NFPA 1582 is titled “Occupational Medical Evaluation of Members,” and section 7.7.6.3 states: “Stress EKG with or without echocardiogram or radionuclide scanning shall be performed as clinically indicated by history or symptoms [emphasis added].” The appendix to this section provides suggestions as to when this testing might occur. There is no reference to an MET level requirement in this section of the document. | Chapter 8 of NFPA 1582 is titled “Annual Occupational Fitness Evaluation of Members.” This is a mandatory, nonpunitive program to “establish an individual's baseline [and is] measured against the individual's previous assessments and not against any standard or norm.” There is no mention of any MET requirement either in the standard or the appendix related to chapter 8. | The only reference to MET levels is in chapter 9, section 9.4.3.1. This section addresses the Fire Fighter (FF) with coronary artery disease (CAD) and states that physicians should report limitations for FFs with CAD if they have certain findings present. One of these findings is maximal exercise tolerance of <12 METS and a second is “Exercise induced ischemia or ventricular arrhythmias observed by radionuclide stress test during an evaluation reaching at least 12-METS workload.” The purpose of section 9.4.3.1 is evaluation for ongoing CAD, the risk of sudden incapacitation, and ensuring normal cardiac function while performing essential job tasks. 
| In addition to pointing out this error, we have a number of questions about the categorization of independent variables, the measurement of outcome variables, and whether the conclusions can be supported from the presented data. |
Investigating the associations between work hours, sleep status, and self-reported health among full-time employees
Nakata A . Int J Public Health 2011 57 (2) 403-11 OBJECTIVES: The extent to which work hours and sleep are associated with self-rated health (SRH) was investigated in full-time employees of small- and medium-scale businesses (SMBs) in a suburb of Tokyo. METHODS: A total of 2,579 employees (1,887 men and 692 women), aged 18-79 (mean 45) years, in 296 SMBs were surveyed using a self-administered questionnaire from August to December 2002. Work hours, sleep, and SRH were evaluated. RESULTS: Compared with those working 6-8 h/day, participants working >8 to 10 h/day and >10 h/day had significantly higher odds of suboptimal SRH [adjusted odds ratio (aOR) 1.36 and 1.87, respectively]. Similarly, compared with those sleeping 6+ h/day with sufficient sleep, participants with short sleep (<6 h/day) and insufficient sleep had increased odds of suboptimal SRH (aOR 1.65 and aOR 2.03, respectively). Combinations of the longest work hours with short sleep (aOR 3.30) or insufficient sleep (aOR 3.40) exerted synergistic negative associations on SRH. CONCLUSIONS: This study suggests that long work hours, poor sleep, and their combination are associated with suboptimal SRH. |
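The "synergistic" characterization can be checked against a multiplicative no-interaction benchmark: the expected joint odds ratio is the product of the individual aORs, and the observed joint aOR exceeds it. A sketch using the abstract's numbers (our illustration, not the paper's analysis):

```python
# Under a multiplicative model with no interaction, the joint odds ratio of
# two exposures is the product of their individual odds ratios.  An observed
# joint aOR above this benchmark points toward synergy on that scale.

def multiplicative_expected(or_a, or_b):
    """Expected joint OR if the two exposures act independently."""
    return or_a * or_b

# Longest work hours (aOR 1.87) combined with short sleep (aOR 1.65):
expected = multiplicative_expected(1.87, 1.65)
observed = 3.30  # reported joint aOR from the abstract
print(observed > expected)  # True
```

Here the expected joint aOR is about 3.09, below the observed 3.30, consistent with the abstract's synergy claim on the multiplicative scale.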
A research agenda to underpin malaria eradication
Alonso PL , Brown G , Arevalo-Herrera M , Binka F , Chitnis C , Collins F , Doumbo OK , Greenwood B , Hall BF , Levine MM , Mendis K , Newman RD , Plowe CV , Rodriguez MH , Sinden R , Slutsker L , Tanner M . PLoS Med 2011 8 (1) e1000406 The interruption of malaria transmission worldwide is one of the greatest challenges for international health and development communities. The current expert view suggests that, by aggressively scaling up control with currently available tools and strategies, much greater gains could be achieved against malaria, including elimination from a number of countries and regions; however, even with maximal effort we will fall short of global eradication. The Malaria Eradication Research Agenda (malERA) complements the current research agenda, which is primarily directed towards reducing morbidity and mortality, with one that aims to identify key knowledge gaps and define the strategies and tools that will result in reducing the basic reproduction rate to less than 1, with the ultimate aim of eradication of the parasite from the human population. Sustained commitment from local communities, civil society, policy leaders, and the scientific community, together with a massive effort to build a strong base of researchers from the endemic areas, will be critical factors in the success of this new agenda. |
Effect of nutritional status on response to treatment with artemisinin-based combination therapy in young Ugandan children with malaria
Verret WJ , Arinaitwe E , Wanzira H , Bigira V , Kakuru A , Kamya M , Tappero JW , Sandison T , Dorsey G . Antimicrob Agents Chemother 2011 55 (6) 2629-35 The relationship between malnutrition and malaria in young children is under debate, and no studies have been published evaluating the association between malnutrition and response to artemisinin-based combination therapies (ACTs). We evaluated the association between malnutrition and response to antimalarial therapy in Ugandan children treated with ACTs for repeated episodes of malaria. Children aged 4 to 12 months diagnosed with uncomplicated malaria were randomized to dihydroartemisinin-piperaquine (DP) or artemether-lumefantrine (AL) and followed for up to 2 years. All HIV-exposed and HIV-infected children received trimethoprim-sulfamethoxazole prophylaxis (TS). The primary exposure variables included height-for-age and weight-for-age z-scores. Outcomes included parasite clearance at days 2 and 3 and risk of recurrent parasitemia after 42 days of follow-up. A total of 292 children were randomized to DP or AL, resulting in 2,013 malaria treatments. Less than 1% of patients had a positive blood smear by day 3 (DP 0.2%, AL 0.6%, p=0.18). There was no significant association between height-for-age or weight-for-age z-scores and a positive blood smear 2 days following treatment. In children treated with DP not on TS, decreasing height-for-age z-scores <-1 were associated with a higher risk of recurrent parasitemia compared to a height-for-age z-score >0 (HR for height-for-age z-score <-1 and ≥-2 = 2.89, p=0.039; HR for height-for-age z-score <-2 = 3.18, p=0.022). DP and AL are effective antimalarial therapies in chronically malnourished children in a high-transmission setting. However, children with mild to moderate chronic malnutrition not taking TS are at higher risk of recurrent parasitemia and may be considered a target for chemoprevention. |
Reduced sexual risk behaviors among people living with HIV: results from the Healthy Relationships Outcome Monitoring Project
Heitgerd JL , Kalayil EJ , Patel-Larson A , Uhl G , Williams WO , Griffin T , Smith BD . AIDS Behav 2011 15 (8) 1677-90 In 2006, the Centers for Disease Control and Prevention funded seven community-based organizations (CBOs) to conduct outcome monitoring of Healthy Relationships. Healthy Relationships is an evidence-based behavioral intervention for people living with HIV. Demographic data and sexual risk behaviors, recalled by participants with a time referent of the past 90 days, were collected over a 17-month project period using a repeated-measures design. Data were collected at baseline, and at 3 and 6 months after the intervention. Generalized estimating equations were used to assess the changes in sexual risk behaviors after participation in Healthy Relationships. Our findings show that participants (n = 474) in the outcome monitoring project reported decreased sexual risk behaviors over time, such as fewer partners (RR = 0.55; 95% CI 0.41-0.73, P < 0.001) and fewer unprotected sex events (OR = 0.44; 95% CI 0.36-0.54, P < 0.001) at 6 months after the intervention. Additionally, this project demonstrates that CBOs can successfully collect and report longitudinal outcome monitoring data. |
Health in All Policies: addressing the legal and policy foundations of health impact assessment
Rajotte BR , Ross CL , Ekechi CO , Cadet VN . J Law Med Ethics 2011 39 Suppl 1 27-9 The concept of Health in All Policies aims to improve the health outcomes associated with policies in an attempt to mitigate health disparities and provide optimal environments for healthier living. This multidisciplinary framework seeks to improve health through effective assessment and reformation of policy for organizations of any level and stature. The importance of integrating health in policy assessment and decision making is a key concept in the growing field of Health Impact Assessment. | The World Health Organization defines Health Impact Assessment (HIA) as “a combination of procedures, methods, and tools by which a policy, program, or project may be judged as to its potential effects on the health of a population, and the distribution of those effects within the population.” HIA provides a mechanism for collaboration between various sectors and disciplines bridging the gap between research, policymaking, and implementation of policies, programs, and projects affecting health outcomes. In the United States, while some HIA efforts have focused on proposed public policies, HIA has been used primarily to analyze the health effects of proposed development projects and plans related to community design and transportation. |
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Disease Reservoirs and Vectors
- Epidemiology and Surveillance
- Food Safety
- Health Behavior and Risk
- Health Communication and Education
- Immunity and Immunization
- Informatics
- Injury and Violence
- Laboratory Sciences
- Maternal and Child Health
- Nutritional Sciences
- Occupational Safety and Health
- Parasitic Diseases
- Program Evaluation
- Public Health Law
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.