Plasma stromal cell-derived factor 1alpha/CXCL12 level predicts long-term adverse cardiovascular outcomes in patients with coronary artery disease
Ghasemzadeh N , Hritani AW , De Staercke C , Eapen DJ , Veledar E , Al Kassem H , Khayata M , Zafari AM , Sperling L , Hooper C , Vaccarino V , Mavromatis K , Quyyumi AA . Atherosclerosis 2015 238 (1) 113-8 OBJECTIVE: Stromal cell-derived factor-1alpha/CXCL12 is a chemoattractant responsible for homing of progenitor cells to ischemic tissues. We aimed to investigate the association of plasma CXCL12 with long-term cardiovascular outcomes in patients with coronary artery disease (CAD). METHODS: 785 patients (mean age 63 +/- 12 years) undergoing coronary angiography were independently enrolled into discovery (N = 186) and replication (N = 599) cohorts. Baseline levels of plasma CXCL12 were measured using the Quantikine CXCL12 ELISA (R&D Systems). Patients were followed for cardiovascular death and/or myocardial infarction (MI) for a mean of 2.6 yrs. Cox proportional hazards regression was used to determine independent predictors of cardiovascular death/MI. RESULTS: The incidence of cardiovascular death/MI was 13% (N = 99). High CXCL12 level, based on the best discriminatory threshold derived from ROC analysis, predicted risk of cardiovascular death/MI (HR = 4.81, p = 1 x 10(-6)) independent of traditional risk factors in the pooled cohort. Addition of CXCL12 to a baseline model was associated with a significant improvement in c-statistic (AUC 0.67 to 0.73, p = 0.03). Addition of CXCL12 was associated with correct risk reclassification of 40% of events and 10.5% of non-events. Similarly, for the outcome of cardiovascular death, the addition of CXCL12 to the baseline model was associated with correct reclassification of 20.7% of events and 9% of non-events. These results were replicated in two independent cohorts. CONCLUSION: Plasma CXCL12 level is a strong independent predictor of adverse cardiovascular outcomes in patients with CAD and improves risk reclassification. |
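The abstract above does not specify how the "best discriminatory threshold" was derived from the ROC analysis; one common choice is the cutoff maximizing Youden's J (sensitivity + specificity - 1). A minimal sketch with made-up biomarker values (the function name and data are illustrative assumptions, not from the study):

```python
def best_threshold(values, labels):
    """Pick the cutoff maximizing Youden's J = sensitivity + specificity - 1,
    a common way to derive a 'best discriminatory threshold' from an ROC curve.
    labels: 1 = event (e.g., CV death/MI), 0 = no event."""
    pos = [v for v, y in zip(values, labels) if y == 1]
    neg = [v for v, y in zip(values, labels) if y == 0]
    best_j, best_cut = -1.0, None
    for cut in sorted(set(values)):
        sens = sum(v >= cut for v in pos) / len(pos)   # true-positive rate at this cutoff
        spec = sum(v < cut for v in neg) / len(neg)    # true-negative rate at this cutoff
        j = sens + spec - 1.0
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j

# Illustrative (assumed) data: higher values in patients with events
cut, j = best_threshold([1, 2, 3, 10, 11, 12], [0, 0, 0, 1, 1, 1])  # cut = 10, J = 1.0
```

With perfectly separated toy data the maximizing cutoff falls between the two groups; on real biomarker data J is well below 1 and the chosen cutoff trades sensitivity against specificity.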
Shape Up Somerville: change in parent body mass indexes during a child-targeted, community-based environmental change intervention
Coffield E , Nihiser AJ , Sherry B , Economos CD . Am J Public Health 2014 105 (2) e1-e7 OBJECTIVES: We investigated the body mass index (BMI; weight in pounds/[height in inches]^2 x 703) of parents whose children participated in Shape Up Somerville (SUS), a community-based participatory research study that altered household, school, and community environments to prevent and reduce childhood obesity. METHODS: SUS was a nonrandomized controlled trial with 30 participating elementary schools in 3 Massachusetts communities that occurred from 2002 to 2005. It included first-, second-, and third-grade children. We used an inverse probability weighting estimator adjusted for clustering effects to isolate the influence of SUS on parent (n = 478) BMI. The model's dependent variable was the change in pre- and postintervention parent BMI. RESULTS: SUS was significantly associated with decreases in parent BMIs. SUS decreased treatment parents' BMIs by 0.411 points (95% confidence interval = -0.725, -0.097) relative to control parents. CONCLUSIONS: The benefits of a community-based environmental change childhood obesity intervention can spill over to parents, resulting in decreased parental BMI. Further research is warranted to examine the effects of this type of intervention on parental health behaviors and health outcomes. |
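The US-customary BMI conversion quoted in the abstract can be sketched directly; the input values below are illustrative assumptions, not data from the study:

```python
def bmi(weight_lb: float, height_in: float) -> float:
    """BMI from US customary units: weight (lb) / [height (in)]^2 x 703."""
    return weight_lb / (height_in ** 2) * 703

# Illustrative (assumed) values: a 170 lb, 5'5" adult
before = bmi(170, 65)        # ≈ 28.29
after = before - 0.411       # shifted by the mean treatment effect reported above
```

The 703 factor converts kg/m^2 to lb/in^2 units; a 0.411-point decrease on this scale is the study's estimated spillover effect on treatment parents relative to controls.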
Trends in indoor tanning among US high school students, 2009-2013
Guy GP Jr , Berkowitz Z , Everett Jones S , Holman DM , Garnett E , Watson M . JAMA Dermatol 2014 151 (4) 448-50 Indoor tanning increases the risk of skin cancer, particularly among frequent users and those initiating use at a young age.1,2 While previous research has demonstrated that indoor tanning is common among youth,3 to our knowledge, this study provides the first national estimates of indoor tanning trends among this population. |
Trends in predicted risk for atherosclerotic cardiovascular disease using the pooled cohort risk equations among US adults from 1999 to 2012
Ford ES , Will JC , Mercado CI , Loustalot F . JAMA Intern Med 2014 175 (2) 299-302 Risk assessment has become an important tool to assess an individual’s future risk for cardiovascular disease. Recently, the American College of Cardiology/American Heart Association (ACC/AHA) released a report that presented updated risk equations, the Pooled Cohort Risk Equations, for cardiovascular disease.1 Race and ethnicity-specific estimates were novel to the new risk equations. Because changes over time in predicted cardiovascular risk using these new risk equations have not been examined, our objectives were to (1) examine the trend in predicted 10-year cardiovascular risk using the new ACC/AHA Pooled Cohort Risk Equations and (2) estimate the potential for risk reduction by optimizing levels of cardiovascular risk factors. |
Variations in guideline-concordant breast cancer adjuvant therapy in rural Georgia
Guy GP Jr , Lipscomb J , Gillespie TW , Goodman M , Richardson LC , Ward KC . Health Serv Res 2014 50 (4) 1088-108 OBJECTIVE: To examine factors associated with guideline-concordant adjuvant therapy among breast cancer patients in a rural region of the United States and to present an advancement in quality-of-care assessment in the context of multiple treatments. DATA SOURCES: Chart abstraction on initial therapy received by 868 women diagnosed with primary, invasive, early-stage breast cancer in a largely rural region of southwest Georgia. STUDY DESIGN: Using multivariable logistic regression, we examined predictors of adjuvant chemo-, radiation, and hormonal therapy regimens defined as guideline-concordant according to the 2000 National Institutes of Health Consensus Development Conference Statement. PRINCIPAL FINDINGS: Overall, 35.2 percent of women received guideline-concordant care for all three adjuvant therapies. Higher socioeconomic status was associated with receiving guideline-concordant care for all three adjuvant therapies jointly, and for chemotherapy. Compared with private insurance, having Medicaid was associated with guideline-concordant chemotherapy. Unmarried women were more likely to be nonconcordant for chemotherapy and radiation therapy. Increased age predicted nonconcordance for adjuvant therapies jointly, for chemotherapy, and for hormonal therapy. CONCLUSIONS: A number of factors were independently associated with receiving guideline-concordant adjuvant therapy. Identifying and addressing factors that lead to nonconcordance may reduce disparities in treatment and survival. |
Obesity, diabetes, and the moving targets of healthy-years estimation
Gregg E . Lancet Diabetes Endocrinol 2014 3 (2) 93-4 Many studies have attempted to quantify the effect of obesity on death, fueling a sustained controversy about which levels of bodyweight can harm health.1 However, many investigators have argued that life expectancy does not capture the essence of the damage that obesity causes across a lifetime and that better long-term metrics are needed to convey risk, judge interventions, and motivate behaviour.2 In The Lancet Diabetes & Endocrinology, Steven Grover and colleagues3 model the effect of diabetes and cardiovascular disease in people who are overweight or obese and show what is intuitively known, but not often quantified, about obesity—that its effect on the number of healthy-years lost is far greater than its effect on total years of life. | Constructing a model from cohort studies about the probability of transition to diabetes, cardiovascular disease, and death, Grover and colleagues'3 study used data for obesity, blood pressure, glucose concentrations, lipid concentrations, and other risk factors from 3992 non-Hispanic white participants from the US National Health and Nutrition Examination Surveys 2003–10 to estimate the life-years and healthy life-years lost associated with different levels of overweight and obesity. |
Outcomes of cardiovascular disease risk factor screening and referrals in a family planning clinic
Robbins CL , Keyserling TC , Jilcott Pitts S , Morrow J , Moos MK , Johnston LF , Farr SL . J Womens Health (Larchmt) 2014 24 (2) 131-7 BACKGROUND: Cardiovascular disease (CVD) screening in Title X settings can identify low-income women at risk of future chronic disease. This study examines follow-up related to newly identified CVD risk factors in a Title X setting. METHODS: Female patients at a North Carolina Title X clinic were screened for CVD risk factors (n=462) and 167/462 (36.1%) were rescreened one year later. Clinical staff made protocol-driven referrals for women identified with newly diagnosed CVD risk factors. We used paired t-tests and chi-square tests to compare screening and rescreening results (two-tailed, p<0.05). RESULTS: Among 11 women in need of referrals for newly diagnosed hypertension or diabetes, 9 of 11 (81.8%) were referred, and 2 of 11 (18.2%) completed referrals. Among hypertensive women who were rescreened (n=21), systolic blood pressure decreased (139 to 132 mmHg, p=0.001) and diastolic blood pressure decreased (90 to 83 mmHg, p=0.006). Hemoglobin A1c did not improve among rescreened diabetic women (n=5, p=0.640). Among women who reported smoking at enrollment, 129 of 148 (87.2%) received cessation counseling and 8 of 148 (5.4%) accepted tobacco quitline referrals. Among smokers, 53 of 148 (35.8%) were rescreened and 11 of 53 (20.8%) reported nonsmoking at that time. Among 188 women identified as obese at enrollment, 22 (11.7%) scheduled nutrition appointments, but only one attended. Mean weight increased from 221 to 225 pounds (p<0.05) among the 70 of 188 (37.2%) obese women who were rescreened. CONCLUSIONS: The majority of women in need of referrals for CVD risk factors received them. Few women completed referrals. Future research should examine barriers and facilitators of referral care among low-income women. |
Estimating population attributable fractions to quantify the health burden of obesity
Flegal KM , Panagiotou OA , Graubard BI . Ann Epidemiol 2014 25 (3) 201-7 PURPOSE: Obesity is a highly prevalent condition in the United States and elsewhere and is associated with increased mortality and morbidity. Here, we discuss some issues involved in quantifying the health burden of obesity using population attributable fraction (PAF) estimates and provide examples. METHODS: We searched PubMed for articles reporting attributable fraction estimates for obesity. We reviewed eligible articles to identify methodological concerns and tabulated illustrative examples of PAF estimates for obesity relative to cancer, diabetes, cardiovascular disease, and all-cause mortality. RESULTS: There is considerable variability among studies regarding the methods used for PAF calculation and the selection of appropriate counterfactuals. The reported estimates ranged from 5% to 15% for all-cause mortality, -0.2% to 8% for all-cancer incidence, 7% to 44% for cardiovascular disease incidence, and 3% to 83% for diabetes incidence. CONCLUSIONS: To evaluate a given estimate, it is important to consider whether the exposure and outcome were defined similarly for the PAF and for the relative risks, whether the relative risks were suitable for the population at hand, and whether PAF was calculated using correct methods. Strong causal assumptions are not necessarily warranted. In general, PAFs for obesity may be best considered as indicators of association. |
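The abstract discusses PAF estimates without reproducing a formula; a standard choice (one of several the authors' methodological concerns apply to) is Levin's formula, sketched here with illustrative numbers that are assumptions, not values from the paper:

```python
def paf_levin(prevalence: float, rr: float) -> float:
    """Levin's population attributable fraction: p(RR - 1) / (1 + p(RR - 1)),
    where p is exposure prevalence and RR the relative risk.
    Valid for an unadjusted RR; using an adjusted RR here is one of the
    methodological pitfalls discussed in the literature."""
    excess = prevalence * (rr - 1.0)
    return excess / (1.0 + excess)

# Illustrative (assumed) inputs: 35% obesity prevalence, RR = 1.3 for the outcome
paf = paf_levin(0.35, 1.3)  # ≈ 0.095, i.e., ~9.5% of cases attributable
</n```

Note how sensitive the result is to the counterfactual and the RR: the same prevalence with RR = 2.0 gives a PAF of about 26%, which is one reason the reported estimates in the review span such wide ranges.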
Clinical and prognostic factors for renal parenchymal, pelvis, and ureter cancers in SEER registries: Collaborative Stage Data Collection System, version 2
Altekruse SF , Dickie L , Wu XC , Hsieh MC , Wu M , Lee R , Delacroix S Jr . Cancer 2014 120 Suppl 23 3826-35 BACKGROUND: The American Joint Committee on Cancer's (AJCC) 7th edition cancer staging manual reflects recent changes in cancer care practices. This report assesses changes from the AJCC 6th to the AJCC 7th edition stage distributions and the quality of site-specific factors (SSFs). METHODS: Incidence data for renal parenchyma and pelvis and ureter cancers from 18 Surveillance, Epidemiology, and End Results (SEER) registries were examined, including staging trends during 2004-2010, stage distribution changes between the AJCC 6th and 7th editions, and SSF completeness for cases diagnosed in 2010. RESULTS: From 2004 to 2010, the percentage of stage I renal parenchyma cancers increased from 50% to 58%, whereas stage IV and unknown stage cases decreased (18% to 15%, and 10% to 6%, respectively). During this period, the percentage of stage 0a renal pelvis and ureter cancers increased from 21% to 25%, and stage IV and unknown stage tumors decreased (20% to 18%, and 7% to 5%, respectively). Stage distributions under the AJCC 6th and 7th editions were about the same. For renal parenchymal cancers, 71%-90% of cases had known values for 6 required SSFs. For renal pelvis and ureter cancers, 74% of cases were coded as known for SSF1 (WHO/ISUP grade) and 47% as known for SSF2 (depth of renal parenchymal invasion). SSF values were known for larger proportions of cases with reported resections. CONCLUSIONS: Stage distributions between the AJCC 6th and 7th editions were similar. SSFs were known for more than two-thirds of cases, providing more detail in the SEER database relevant to prognosis. |
High frequency of active HCV infection among seropositive cases in west Africa and evidence for multiple transmission pathways.
Layden JE , Phillips RO , Owusu-Ofori S , Sarfo FS , Kliethermes S , Mora N , Owusu D , Nelson K , Opare-Sem O , Dugas L , Luke A , Shoham D , Forbi JC , Khudyakov YE , Cooper RS . Clin Infect Dis 2014 60 (7) 1033-41 BACKGROUND: Sub-Saharan Africa (SSA) has among the highest global hepatitis C virus (HCV) prevalence estimates. However, reports suggesting high rates of serologic false positives and low levels of viremia have led to uncertainty regarding the burden of active infection in this region. Additionally, little is known about the predominant transmission risk factors in SSA. METHODS: We prospectively recalled 363 past blood donors (180 who were rapid screen assay (RSA) positive and 183 who were RSA negative at time of donation) to identify the level of active infection and risk factors for infection at a teaching hospital in Kumasi, Ghana. Participants had repeat blood testing and were administered a questionnaire on risk factors. RESULTS: The frequency of HCV active infection ranged from 74.4% to 88% depending on the criteria used to define serologically positive cases. Individuals with active disease had biochemical evidence of liver inflammation and median viral loads of 5.7 log copies/ml. Individuals from the northern and upper regions of Ghana had greater risks of infection compared to participants from other areas. Additional risk factors included traditional circumcision, home birth, tribal scarring and HBV co-infection. CONCLUSIONS: Viremic infection was common among serologically confirmed cases. Attention to testing algorithms is needed to define the true HCV burden in SSA. These data also suggest that several transmission modes are likely contributing to the current HCV epidemic in Ghana, and the distribution of these practices may result in substantial regional variation in prevalence. |
TB-HAART trial
Lederer P , Briggs M , Hassani AS , Date A . Lancet Infect Dis 2015 15 (1) 14 We read with interest Sayoki Mfinanga and colleagues’ recent TB-HAART randomised trial in sub-Saharan Africa.1 Initiation of antiretroviral therapy (ART) within 2 weeks of the start of pulmonary tuberculosis treatment for patients with CD4 cell counts more than 220 cells per μL did not confer any advantage over delayed ART initiation on a composite outcome of tuberculosis treatment failure, recurrence, and death. The authors concluded that comanagement of HIV infection and tuberculosis in sub-Saharan Africa remains challenging because of toxic effects, drug interactions, risk of antiretroviral drug resistance, pill burden, immune reconstitution inflammatory syndrome, and cost. Thus, they argued that WHO guidelines should be updated to recommend the delay of ART initiation until completion of tuberculosis treatment for patients with HIV and CD4 cell counts more than 220 cells per μL. | Although the authors noted concern about potential toxic effects of early ART initiation, their study showed no harm in terms of mortality, grade 3 and 4 adverse events, and immune reconstitution inflammatory syndrome. However, several studies have shown that even a short deferral of ART comes at the expense of recovery of CD4-positive T cells. Deferment of ART until the completion of tuberculosis treatment could also lead to loss to follow-up and subsequent morbidity and mortality.2 Initiation of ART during tuberculosis treatment enables linkage between HIV and tuberculosis treatment programmes and could improve adherence. ART integration into tuberculosis treatment settings could help to improve ART uptake among patients with tuberculosis who also have HIV.3 |
Tuberculosis and excess alcohol use in the United States, 1997-2012
Volkmann T , Moonan PK , Miramontes R , Oeltmann JE . Int J Tuberc Lung Dis 2015 19 (1) 111-9 BACKGROUND: Excess alcohol use among tuberculosis (TB) patients complicates TB control strategies. OBJECTIVES: To characterize the role of excess alcohol use in TB control, we describe the epidemiology of excess alcohol use and TB in the United States among those aged ≥15 years. DESIGN: Using data reported to the National Tuberculosis Surveillance System, 1997-2012, we examined associations between excess alcohol use and TB treatment outcomes and markers for increased transmission (involvement in a local genotype cluster of cases) using multivariate logistic regression. We used Cox proportional hazards regression analysis to examine the relationship between excess alcohol use and the rate of conversion from positive to negative in sputum culture results. RESULTS: Excess alcohol use was documented for 31 207 (15.1%) of 207 307 patients. Prevalence of excess alcohol use was greater among male patients (20.6%) and US-born patients (24.6%). Excess alcohol use was associated with a positive sputum smear result (aOR 1.23, 95%CI 1.18-1.28) and death during treatment (vs. completion of treatment) (aOR 1.16, 95%CI 1.10-1.22). The rate of culture conversion was higher among patients without excess alcohol use (adjusted hazard ratio 1.20, 95%CI 1.18-1.23). CONCLUSIONS: Excess alcohol use was common among patients with TB, and was associated with TB transmission, lower rates of sputum culture conversion, and greater mortality. |
Viable influenza a virus in airborne particles from human coughs
Lindsley WG , Noti JD , Blachere FM , Thewlis RE , Martin SB , Othumpangat S , Noorbakhsh B , Goldsmith WT , Vishnu A , Palmer JE , Clark KE , Beezhold DH . J Occup Environ Hyg 2015 12 (2) 107-13 Patients with influenza release aerosol particles containing the virus into their environment. However, the importance of airborne transmission in the spread of influenza is unclear, in part because of a lack of information about the infectivity of the airborne virus. The purpose of this study was to determine the amount of viable influenza A virus that was expelled by patients in aerosol particles while coughing. Sixty-four symptomatic adult volunteer outpatients were asked to cough 6 times into a cough aerosol collection system. Seventeen of these participants tested positive for influenza A virus by viral plaque assay (VPA) with confirmation by viral replication assay (VRA). Viable influenza A virus was detected in the cough aerosol particles from 7 of these 17 test subjects (41%). Viable influenza A virus was found in the smallest particle size fraction (0.3 μm to 8 μm), with a mean of 142 plaque-forming units (SD 215) expelled during the 6 coughs in particles of this size. These results suggest that a significant proportion of patients with influenza A release small airborne particles containing viable virus into the environment. Although the amounts of influenza A detected in cough aerosol particles during our experiments were relatively low, larger quantities could be expelled by influenza patients during a pandemic when illnesses would be more severe. Our findings support the idea that airborne infectious particles could play an important role in the spread of influenza. |
Latent tuberculous infection in the United States and Canada: who completes treatment and why?
Hirsch-Moverman Y , Shrestha-Kuwahara R , Bethel J , Blumberg HM , Venkatappa TK , Horsburgh CR , Colson PW . Int J Tuberc Lung Dis 2015 19 (1) 31-8 OBJECTIVES: To assess latent tuberculous infection (LTBI) treatment completion rates in a large prospective US/Canada multisite cohort and identify associated risk factors. METHODS: This prospective cohort study assessed factors associated with LTBI treatment completion through interviews with persons who initiated treatment at 12 sites. Interviews were conducted at treatment initiation and completion/cessation. Participants received usual care according to each clinic's procedure. Multivariable models were constructed based on stepwise assessment of potential predictors and interactions. RESULTS: Of 1515 participants initiating LTBI treatment, 1323 had information available on treatment completion; 617 (46.6%) completed treatment. Baseline predictors of completion included male sex, foreign birth, not thinking it would be a problem to take anti-tuberculosis medication, and having health insurance. Participants in stable housing who received monthly appointment reminders were more likely to complete treatment than those without stable housing or without monthly reminders. End-of-treatment predictors of non-completion included severe symptoms and the inconvenience of clinic/pharmacy schedules, barriers to care and changes of residence. Common reasons for treatment non-completion were patient concerns about tolerability/toxicity, appointment conflicts, low prioritization of TB, and forgetfulness. CONCLUSIONS: Less than half of treatment initiators completed treatment in our multisite study. Addressing tangible issues such as not having health insurance, toxicity concerns, and clinic accessibility could help to improve treatment completion rates. |
Gonorrhea treatment practices in the STD Surveillance Network, 2010-2012
Kerani RP , Stenger MR , Weinstock H , Bernstein KT , Reed M , Schumacher C , Samuel MC , Eaglin M , Golden M . Sex Transm Dis 2015 42 (1) 6-12 BACKGROUND: Replacing oral treatments with ceftriaxone is a central component of public health efforts to slow the emergence of cephalosporin-resistant Neisseria gonorrhoeae in the United States; US gonorrhea treatment guidelines were revised accordingly in 2010. However, current US gonorrhea treatment practices have not been well characterized. METHODS: Six city and state health departments in Cycle II of the STD Surveillance Network (SSuN) contributed data on all gonorrhea cases reported in 101 counties and independent cities. Treatment data were obtained through local public health surveillance and interviews with a random sample of patients. Cases were weighted to adjust for site-specific sample fractions and for differential nonresponse by age, sex, and provider type. RESULTS: From 2010 to 2012, 135,984 gonorrhea cases were reported in participating areas, 15,246 (11.2%) of which were randomly sampled. Of these, 7,851 (51.5%) patients were interviewed. Among patients with complete treatment data, 76.8% received ceftriaxone, 16.4% received an oral cephalosporin, and 6.9% did not receive a cephalosporin; 51.9% of persons were treated with a regimen containing ceftriaxone and either doxycycline or azithromycin. Ceftriaxone treatment increased significantly by year (64.1% of patients in 2010, 79.3% in 2011, 85.4% in 2012; P = 0.0001). Ceftriaxone use varied widely by STD Surveillance Network site (from 44.6% to 95.1% in 2012). CONCLUSIONS: Most persons diagnosed as having gonorrhea between 2010 and 2012 in the United States received ceftriaxone, and its use has increased since the release of the 2010 Centers for Disease Control and Prevention STD Treatment Guidelines. |
HIV and STI prevalence and risk factors among male sex workers and other men who have sex with men in Nairobi, Kenya
Muraguri N , Tun W , Okal J , Broz D , Raymond HF , Kellogg T , Dadabhai S , Musyoki H , Sheehy M , Kuria D , Kaiser R , Geibel S . J Acquir Immune Defic Syndr 2015 68 (1) 91-6 Previous surveys of men who have sex with men (MSM) in Africa have not adequately profiled HIV status and risk factors by sex work status. MSM in Nairobi, Kenya, were recruited using respondent-driven sampling, completed a behavioral interview, and were tested for HIV and sexually transmitted infections. Overlapping recruitment among 273 male sex workers and 290 other MSM was common. Sex workers were more likely to report receptive anal sex with multiple partners (65.7% versus 18.0%, P < 0.001) and unprotected receptive anal intercourse (40.0% versus 22.8%, P = 0.005). Male sex workers were also more likely to be HIV infected (26.3% versus 12.2%, P = 0.007). |
Annual risk of tuberculous infection measured using serial skin testing, Orel Oblast, Russia, 1991-2005
Yuen CM , Krapivina TM , Kazennyy BY , Kiryanova EV , Aksenova VA , Gordina A , Finlay AM , Cegielski JP . Int J Tuberc Lung Dis 2015 19 (1) 39-43 OBJECTIVE: To compare trends in direct annual risk of tuberculous infection (ARTI) during 1991-2005 in relation to tuberculosis (TB) incidence and to indirect estimates of ARTI derived from the prevalence of tuberculin skin test (TST) positivity in schoolchildren in Orel Oblast, Russia. DESIGN: In 2005, we abstracted annual TST results and vaccination histories from a representative sample of schoolchildren in Orel Oblast, Russia, where bacille Calmette-Guerin (BCG) vaccination and annual TST of children are nearly universal. We calculated direct ARTI based on the percentage of children tested with TST conversions each year, excluding conversions following BCG vaccination. RESULTS: We analysed records from 13 206 children, with a median of 10 recorded TST results per child. The ARTI increased from 0.2% in 1991 to 1.6% in 2000, paralleling trends in TB incidence. Similar results were observed when the ARTI was estimated based on prevalence of infection among children aged 3-5 years using a 12 mm cut-off to define TST positivity. Results differed substantially when 10 or 15 mm cut-offs were used or when prevalence was determined among children aged 6-8 years. CONCLUSION: ARTI measured through TST conversion increased as TB incidence increased in Orel Oblast. ARTI measured through serial TSTs can thus provide an indicator of changing trends in TB incidence. |
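The two ARTI approaches contrasted in the abstract can be sketched as simple calculations. The direct estimate is the fraction of tested children converting in a year; the indirect estimate back-calculates a constant annual risk from the prevalence of infection at a given age. The function names and input values below are illustrative assumptions, not data from the study:

```python
def direct_arti(conversions: int, tested: int) -> float:
    """Direct ARTI: proportion of TST-tested children whose tests converted
    in one year (post-BCG conversions excluded upstream, as in the study)."""
    return conversions / tested

def indirect_arti(prevalence: float, mean_age: float) -> float:
    """Indirect ARTI from infection prevalence P at mean age a, assuming a
    constant annual risk r with P = 1 - (1 - r)^a, so r = 1 - (1 - P)^(1/a)."""
    return 1.0 - (1.0 - prevalence) ** (1.0 / mean_age)

# Illustrative (assumed) values: 16 conversions among 1000 tested children,
# and 6.2% TST prevalence among children with a mean age of 4 years
direct_arti(16, 1000)       # 0.016, i.e., 1.6% per year
indirect_arti(0.062, 4.0)   # ≈ 0.016
```

As the study found, the indirect estimate is sensitive to the TST positivity cutoff and the age band used for prevalence, whereas the direct estimate depends only on observed conversions among serially tested children.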
Rapid assessment of Ebola infection prevention and control needs - six districts, Sierra Leone, October 2014
Pathmanathan I , O'Connor KA , Adams ML , Rao CY , Kilmarx PH , Park BJ , Mermin J , Kargbo B , Wurie AH , Clarke KR . MMWR Morb Mortal Wkly Rep 2014 63 (49) 1172-4 As of October 31, 2014, the Sierra Leone Ministry of Health and Sanitation had reported 3,854 laboratory-confirmed cases of Ebola virus disease (Ebola) since the outbreak began in May 2014; 199 (5.2%) of these cases were among health care workers. Ebola infection prevention and control (IPC) measures are essential to interrupt Ebola virus transmission and protect the health workforce, which is disproportionately affected by Ebola because of its increased risk of exposure, yet is essential both to the patient care required for outbreak control and to the maintenance of the country's health system at large. To rapidly identify existing IPC resources and high-priority outbreak response needs, CDC Ebola Response Team members conducted an assessment in six of the 14 districts in Sierra Leone, consisting of health facility observations and structured interviews with key informants in facilities and government district health management offices. Health system gaps were identified in all six districts, including shortages or absence of trained health care staff, personal protective equipment (PPE), safe patient transport, and standardized IPC protocols. Based on rapid assessment findings and key stakeholder input, priority IPC actions were recommended. Progress has since been made in developing standard operating procedures, increasing laboratory and Ebola treatment capacity, and training the health workforce. However, further system strengthening is needed. In particular, a successful Ebola outbreak response in Sierra Leone will require an increase in coordinated and comprehensive district-level IPC support to prevent ongoing Ebola virus transmission. |
Reducing lost to follow-up in a large clinical trial of prevention of mother-to-child transmission of HIV: the Breastfeeding, Antiretrovirals and Nutrition study experience
Sellers CJ , Lee H , Chasela C , Kayira D , Soko A , Mofolo I , Ellington S , Hudgens MG , Kourtis AP , King CC , Jamieson DJ , van der Horst C . Clin Trials 2014 12 (2) 156-65 BACKGROUND/AIMS: Retaining patients in prevention of mother-to-child transmission of HIV studies can be challenging in resource-limited settings, where high lost to follow-up rates have been reported. In this article, we describe the effectiveness of methods used to encourage retention in the Breastfeeding, Antiretrovirals, and Nutrition study and analyze factors associated with lost to follow-up in the study. METHODS: The Breastfeeding, Antiretrovirals, and Nutrition clinical trial was designed to evaluate the efficacy of three different mother-to-child HIV transmission prevention strategies. Lower than expected participant retention prompted enhanced efforts to reduce lost to follow-up during the conduct of the trial. Following study completion, we employed regression modeling to determine predictors of perfect attendance and variables associated with being lost to follow-up. RESULTS: During the study, intensive tracing efforts were initiated after the first 1686 mother-infant pairs had been enrolled, and 327 pairs were missing. Of these pairs, 60 were located and had complete data obtained. Among the 683 participants enrolling after initiation of intensive tracing efforts, the lost to follow-up rate was 3.4%. At study's end, 290 (12.2%) of the 2369 mother-infant pairs were lost to follow-up. Among successfully traced missing pairs, relocation was common and three were deceased. Log-binomial regression modeling revealed higher maternal hemoglobin and older maternal age to be significant predictors of perfect attendance. These factors and the presence of food insecurity were also significantly associated with lower rates of lost to follow-up. 
CONCLUSION: In this large HIV prevention trial, intensive tracing efforts centered on reaching study participants at their homes succeeded in finding a substantial proportion of lost to follow-up participants and were very effective in preventing further lost to follow-up during the remainder of the trial. The association between food insecurity and lower rates of lost to follow-up is likely related to the study's provision of nutritional support, including a family maize supplement, which may have contributed to patient retention. |
Reintegration of Ebola survivors into their communities - Firestone District, Liberia, 2014
Arwady MA , Garcia EL , Wollor B , Mabande LG , Reaves EJ , Montgomery JM . MMWR Morb Mortal Wkly Rep 2014 63 (50) 1207-9 The current Ebola virus disease (Ebola) epidemic in West Africa is unprecedented in size and duration. Since the outbreak was recognized in March 2014, the World Health Organization (WHO) has reported 17,145 cases with 6,070 deaths, primarily in Guinea, Liberia, and Sierra Leone. Combined data show a case-fatality rate of approximately 70% in patients with a recorded outcome; a 30% survival rate means that thousands of patients have survived Ebola. An important component of a comprehensive Ebola response is the reintegration of Ebola survivors into their communities. |
Reply to Soman et al, Alffenaar et al, Metcalfe et al, and Raoult
Cegielski JP , Chen MP , Tupasi TE , Leimane V , Volchenkov GV . Clin Infect Dis 2014 60 (6) 971-3 We thank Metcalfe et al, Alffenaar et al, Soman et al, and Raoult for their interest in our study [1]. Metcalfe et al raise 2 issues about the analysis and reporting of results [2]. Alffenaar and colleagues raise issues related to drug dosing [3]. Soman and colleagues raise concern about standardized treatment regimens [4]. Raoult refers to the potential utility of existing drugs that are not standard antituberculosis drugs [5]. We respond to each of these letters in turn. | Metcalfe and colleagues suggest that patients who remain culture negative after 1 month of treatment could not have acquired drug resistance and therefore might have been included in the denominator when calculating the proportion of patients with acquired drug resistance [2]. However, the reality and the math are more complicated for at least 3 reasons. First, we disagree that the target population “is presented as all patients with MDR [multidrug-resistant] tuberculosis starting treatment with [second-line drugs].” The target population for this analysis was patients with at least one positive follow-up culture, as displayed in our Figure 1 [1]. Second, we described the excluded subset of patients as having no positive follow-up cultures rather than as having all negative follow-up cultures because these are not the same: 20.8% of the excluded group of patients did not complete treatment (ie, were classified as defaulting) after a median of <12 months (interquartile range, 5–16 months). Because “default” is a World Health Organization (WHO)–defined standard outcome category [6], it was the endpoint in our follow-up of these patients, and we cannot know whether these patients had any subsequent positive cultures. However, the duration of treatment for this group of patients is inadequate. These patients would be at high risk for again becoming culture positive and for acquired drug resistance. 
Third, many of these patients already had baseline resistance to fluoroquinolones, second-line injectable drugs, or both. It would not be appropriate to include them in the denominator when calculating the frequency of acquired resistance to these same drugs. The exact percentages are uncertain because we did not receive baseline cultures for all these patients and did not recover viable mycobacteria from all cultures received. However, of the 340 viable baseline isolates we received among patients with no positive follow-up cultures, 6.8% had fluoroquinolone resistance, 8.5% had resistance to 1 or more second-line injectable drugs, 11.8% had resistance to either, and 3.5% had resistance to both. |
Support services for survivors of Ebola virus disease - Sierra Leone, 2014
Lee-Kwan SH , DeLuca N , Adams M , Dalling M , Drevlow E , Gassama G , Davies T . MMWR Morb Mortal Wkly Rep 2014 63 (50) 1205-6 As of December 6, 2014, Sierra Leone reported 6,317 laboratory-confirmed cases of Ebola virus disease (Ebola), the highest number of reported cases in the current West Africa epidemic. The Sierra Leone Ministry of Health and Sanitation reported that as of December 6, 2014, there were 1,181 persons who had survived and were discharged. Survivors from previous Ebola outbreaks have reported major barriers to resuming normal lives after release from treatment, such as emotional distress, health issues, loss of possessions, and difficulty regaining their livelihoods. In August 2014, a knowledge, attitude, and practice survey regarding the Ebola outbreak in Sierra Leone, administered by a consortium of partners that included the Ministry of Health and Sanitation, UNICEF, CDC, and a local nongovernmental organization, Focus 1000, found that 96% of general population respondents reported discriminatory attitudes toward persons with suspected or known Ebola. Access to increased psychosocial support, provision of goods, and family and community reunification programs might reduce these barriers. Survivors also have unique potential to contribute to the Ebola response, particularly because survivors might have some immunity to the same virus strain. In previous outbreaks, survivors served as burial team members, contact tracers, and community educators promoting messages that seeking treatment improves the chances for survival and that persons who survived Ebola can help their communities. As caregivers in Ebola treatment units, survivors have encouraged patients to stay hydrated and eat and inspired them to believe that they, too, can survive. Survivors regaining livelihood through participation in the response might offset the stigma associated with Ebola. |
Syphilis in the United States: on the rise?
Peterman TA , Su J , Bernstein KT , Weinstock H . Expert Rev Anti Infect Ther 2014 13 (2) 1-8 Syphilis rates and trends vary by population subgroup. Among men who have sex with men (MSM), rates of primary and secondary (P&S) syphilis are high throughout the USA (228.8 per 100,000 in 2013). P&S syphilis among women is much less common (0.9 per 100,000 in 2013) and occurs in isolated outbreaks and in a few counties with persistent low levels of infection. Congenital syphilis trends closely follow P&S trends among women. These trends have implications for prevention. Routine screening among MSM can prevent tertiary syphilis, but despite interventions, rates of infection continue to rise among MSM and will soon approach those last seen in 1982 (estimate: 340.7 per 100,000). Control of syphilis among women is possible and important because syphilis among women often leads to congenital syphilis. Outbreaks among heterosexuals can be halted by intensive screening, treatment and partner notification. |
Update: Ebola virus disease epidemic - West Africa, December 2014
CDC Incident Management System Ebola Epidemiology Team , Guinea Interministerial Committee for Response Against the Ebola Virus , World Health Organization , CDC Guinea Response Team , Liberia Ministry of Health and Social Welfare , CDC Liberia Response Team , Sierra Leone Ministry of Health and Sanitation , CDC Sierra Leone Response Team , CDC NCEZID Viral Special Pathogens Branch . MMWR Morb Mortal Wkly Rep 2014 63 (50) 1199-1201 CDC is assisting ministries of health and working with other organizations to end the ongoing epidemic of Ebola virus disease (Ebola) in West Africa. The updated data in this report were compiled from situation reports from the Guinea Interministerial Committee for Response Against the Ebola Virus, the World Health Organization, the Liberia Ministry of Health and Social Welfare, and the Sierra Leone Ministry of Health and Sanitation. Total case counts include all suspected, probable, and confirmed cases, which are defined similarly by each country. These data reflect reported cases, which make up an unknown proportion of all cases, and reporting delays that vary from country to country. |
Update: influenza activity - United States, September 28-December 6, 2014
Rolfes M , Blanton L , Brammer L , Smith S , Mustaquim D , Steffens C , Cohen J , Leon M , Chaves SS , Abd Elal AI , Gubareva L , Hall H , Wallis T , Villanueva J , Bresee J , Cox N , Finelli L . MMWR Morb Mortal Wkly Rep 2014 63 (50) 1189-94 CDC collects, compiles, and analyzes data on influenza activity year-round in the United States (http://www.cdc.gov/flu/weekly/fluactivitysurv.htm). The influenza season generally begins in the fall and continues through the winter and spring months; however, the timing and severity of circulating influenza viruses can vary by geographic location and season. Influenza activity in the United States increased starting mid-October through December. This report summarizes U.S. influenza activity during September 28-December 6, 2014. |
Modeling options to manage type 1 wild poliovirus imported into Israel in 2013
Kalkowska DA , Duintjer Tebbens RJ , Grotto I , Shulman LM , Anis E , Wassilak SG , Pallansch MA , Thompson KM . J Infect Dis 2014 211 (11) 1800-12 BACKGROUND: After 25 years without poliomyelitis cases caused by wild poliovirus (WPV) circulation in Israel, sewage sampling detected WPV type 1 (WPV1) in April 2013, despite high vaccination coverage using only inactivated poliovirus vaccine (IPV) since 2005. METHODS: We used a differential equation-based model to simulate the dynamics of poliovirus transmission and population immunity in Israel due to past exposure to WPV and use of oral poliovirus vaccine (OPV) in addition to IPV. We explored the impacts of various immunization options to stop imported WPV1 circulation in Israel. RESULTS: We successfully modeled the potential for WPVs to circulate without detected cases in Israel. Maintaining a sequential IPV/OPV schedule instead of switching to IPV-only in 2005 would have kept population immunity high enough in Israel to prevent WPV1 circulation. The Israeli response to WPV1 detection prevented paralytic cases; a more rapid response might have interrupted transmission more quickly. CONCLUSIONS: IPV protection alone might not provide sufficient population immunity to prevent poliovirus transmission after an importation. As countries transition to IPV in immunization schedules, they may need to actively manage population immunity and consider continued OPV use to avoid the potential circulation of imported live polioviruses until globally coordinated OPV cessation. |
Modeling receipt of influenza A(H1N1)pdm09 vaccinations among US children during the 2009-2010 flu season: findings from the 2010 National Health Interview Survey
Blackwell DL . Med Care 2014 53 (2) 191-8 OBJECTIVE: Using 32 weeks of data from the 2010 National Health Interview Survey, factors associated with receipt of influenza A(H1N1)pdm09 vaccinations among US children during October 2009 through February 2010 are examined. METHODS: Logistic models estimated receipt of first dose by January 1, 2010 for all children aged 4.5 months through 17 years and receipt of second dose by February 1, 2010 for children aged 6 months through 9 years who received a first dose, using demographic characteristics and measures of family structure, parental education, family income, access to health care, and chronic condition status. All analyses were weighted to yield nationally representative results for the US child population. RESULTS: Receipt of a seasonal influenza vaccination in the 12 months before October 2009 as well as race/ethnicity, family structure, and various measures representing family socioeconomic status were statistically significant correlates of receipt of the first pH1N1 dose, whereas children's asthma and chronic condition status were not. CONCLUSIONS: In the event of future pandemics, public health officials may utilize these findings to target particular segments of the US child population that may have been underserved during the 2009 influenza pandemic. |
Notes from the field: measles transmission at a domestic terminal gate in an international airport - United States, January 2014
Vega JS , Escobedo M , Schulte CR , Rosen JB , Schauer S , Wiseman R , Lippold SA , Regan JJ . MMWR Morb Mortal Wkly Rep 2014 63 (50) 1211 In March 2014, CDC identified a possible cluster of four laboratory-confirmed measles cases among passengers transiting a domestic terminal in a U.S. international airport. Through epidemiologic assessments conducted by multiple health departments and investigation of flight itineraries by CDC, all four patients were linked to the same terminal gate during a 4-hour period on January 17, 2014. Patient 1, an unvaccinated man aged 21 years with rash onset February 1, traveled on two domestic flights on January 17 and 18 that connected at the international airport. Patient 2, an unvaccinated man aged 49 years with rash onset February 1, traveled from the airport on January 17. Patient 3, an unvaccinated man aged 19 years with rash onset January 30, traveled domestically with at least a 4-hour layover at the airport on January 17. Patient 4, an unvaccinated man aged 63 years with rash onset February 5, traveled on a flight to the airport on January 17. |
Ebola virus disease in health care workers - Sierra Leone, 2014
Kilmarx PH , Clarke KR , Dietz PM , Hamel MJ , Husain F , McFadden JD , Park BJ , Sugerman DE , Bresee JS , Mermin J , McAuley J , Jambai A . MMWR Morb Mortal Wkly Rep 2014 63 (49) 1168-71 Health care workers (HCWs) are at increased risk for infection in outbreaks of Ebola virus disease (Ebola). To characterize Ebola in HCWs in Sierra Leone and guide prevention efforts, surveillance data from the national Viral Hemorrhagic Fever database were analyzed. In addition, site visits and interviews with HCWs and health facility administrators were conducted. As of October 31, 2014, a total of 199 (5.2%) of the total of 3,854 laboratory-confirmed Ebola cases reported from Sierra Leone were in HCWs, representing a much higher estimated cumulative incidence of confirmed Ebola in HCWs than in non-HCWs, based on national data on the number of HCWs. The peak number of confirmed Ebola cases in HCWs was reported in August (65 cases), and the highest number and percentage of confirmed Ebola cases in HCWs were in Kenema District (65 cases, 12.9% of cases in Kenema), mostly from Kenema General Hospital. Confirmed Ebola cases in HCWs continued to be reported through October and were from 12 of 14 districts in Sierra Leone. A broad range of challenges was reported in implementing infection prevention and control measures. 
In response, the Ministry of Health and Sanitation and partners are developing standard operating procedures for multiple aspects of infection prevention, including patient isolation and safe burials; recruiting and training staff in infection prevention and control; procuring needed commodities and equipment, including personal protective equipment and vehicles for safe transport of Ebola patients and corpses; renovating and constructing Ebola care facilities designed to reduce risk for nosocomial transmission; monitoring and evaluating infection prevention and control practices; and investigating new cases of Ebola in HCWs as sentinel public health events to identify and address ongoing prevention failures. |
Fatal influenza outbreak aboard a sport fishing vessel in San Diego, California
Adam JK , Varan AK , Kao AS , McDonald EC , Waterman SH . Travel Med Infect Dis 2014 13 (1) 102-3 In January 2014, the Centers for Disease Control and Prevention (CDC) was notified about a death aboard a sport fishing vessel on a 16-day cruise off the Baja California peninsula. The male passenger, aged 70 years, had chronic obstructive pulmonary disease, coronary artery disease, was obese, and had not received the seasonal influenza vaccine. He had chills, productive cough, and dyspnea before his death on January 22, 2014. A second passenger, unrelated to the decedent, was medically evacuated on January 24, 2014, while at an international port due to respiratory illness. Upon return of the vessel to a U.S. port on January 27, 2014, federal and local public health officials assessed symptoms, offered influenza testing (rapid test and nasal swab for reverse transcription polymerase chain reaction [RT-PCR]) for all persons onboard, and planned for interviews to assess seasonal influenza vaccination beliefs. | Among 25 passengers (including the evacuee) and nine crew, all male adults, seven passengers (28%) and two crew (22%) met criteria for influenza-like illness (ILI), defined as subjective fever plus either cough or sore throat. Subjective fever could not be confirmed for the decedent via proxy interview; hence, he was not deemed to meet ILI criteria. Among persons with ILI, the median age was 52 years (range: 43–65 years). Illness onset dates among persons with ILI ranged from January 13 to January 23, 2014. The majority of persons with ILI (78%) had not received the 2013–2014 influenza vaccine. The ILI attack rate was 26% among all passengers and crew, 28% among the unvaccinated, and 22% among the vaccinated. Twenty-seven passengers and crew (79%) onboard agreed to influenza testing, including seven of the nine persons with ILI. Among persons with ILI, specimens were collected a median of 12 days after illness onset (range: 5–15 days). All rapid tests were negative. 
Two (7.4%) passengers were positive for H1N1pdm09 virus by RT-PCR; neither met criteria for ILI. The evacuated passenger was hospitalized; convalescent serum was positive for influenza A H1N1pdm09 virus by hemagglutination inhibition assay. Additionally, the decedent had a post-mortem nasopharyngeal swab positive for H1N1pdm09 virus by RT-PCR; cause of death by autopsy report was acute viral influenza and bacterial bronchopneumonia. |
GB Virus C (GBV-C) infection in Hepatitis C Virus (HCV) seropositive women with or at risk for HIV Infection
Blackard JT , Ma G , Welge JA , King CC , Taylor LE , Mayer KH , Klein RS , Celentano DD , Sobel JD , Jamieson DJ , Gardner L . PLoS One 2014 9 (12) e114467 BACKGROUND: GB virus C (GBV-C) may have a beneficial impact on HIV disease progression; however, the epidemiologic characteristics of this virus are not well characterized. Behavioral factors and gender may lead to differential rates of GBV-C infection; yet, studies have rarely addressed GBV-C infections in women or racial/ethnic minorities. Therefore, we evaluated GBV-C RNA prevalence and genotype distribution in a large prospective study of high-risk women in the US. RESULTS: 438 hepatitis C virus (HCV) seropositive women, including 306 HIV-infected and 132 HIV-uninfected women, from the HIV Epidemiologic Research Study were evaluated for GBV-C RNA. 347 (79.2%) women were GBV-C RNA negative, while 91 (20.8%) were GBV-C RNA positive. GBV-C positive women were younger than GBV-C negative women. Among 306 HIV-infected women, 70 (22.9%) women were HIV/GBV-C co-infected. Among HIV-infected women, the only significant difference between GBV-negative and GBV-positive women was age (mean 38.4 vs. 35.1 years; p<0.001). Median baseline CD4 cell counts and plasma HIV RNA levels were similar. The GBV-C genotypes were 1 (n = 31; 44.3%), 2 (n = 36; 51.4%), and 3 (n = 3; 4.3%). The distribution of GBV-C genotypes in co-infected women differed significantly by race/ethnicity. However, median CD4 cell counts and log10 HIV RNA levels did not differ by GBV-C genotype. GBV-C incidence was 2.7% over a median follow-up of 2.9 (IQR: 1.5, 4.9) years, while GBV-C clearance was 35.7% over a median follow-up of 2.44 (IQR: 1.4, 3.5) years. Four women switched genotypes. CONCLUSIONS: Age, injection drug use, a history of sex for money or drugs, and number of recent male sex partners were associated with GBV-C infection among all women in this analysis. 
However, among HIV/HCV/GBV-C co-infected women, CD4 cell counts and HIV viral loads did not differ, although race was associated with GBV-C genotype. |
HIV risk, prevention, and testing behaviors among heterosexuals at increased risk for HIV infection - National HIV Behavioral Surveillance System, 21 U.S. cities, 2010
Sionean C , Le BC , Hageman K , Oster AM , Wejnert C , Hess KL , Paz-Bailey G . MMWR Surveill Summ 2014 63 Suppl 14 (14) 1-39 PROBLEM/CONDITION: At the end of 2010, an estimated 872,990 persons in the United States were living with a diagnosis of human immunodeficiency virus (HIV) infection. Approximately one in four of the estimated HIV infections diagnosed in 2011 were attributed to heterosexual contact. Heterosexuals with a low socioeconomic status (SES) are disproportionately likely to be infected with HIV. REPORTING PERIOD: June-December 2010. DESCRIPTION OF SYSTEM: The National HIV Behavioral Surveillance System (NHBS) collects HIV prevalence and risk behavior data in selected metropolitan statistical areas (MSAs) from three populations at high risk for HIV infection: men who have sex with men, injecting drug users, and heterosexuals at increased risk for HIV infection. Data for NHBS are collected in rotating cycles in these three different populations. For the 2010 NHBS cycle among heterosexuals, men and women were eligible to participate if they were aged 18-60 years, lived in a participating MSA, were able to complete a behavioral survey in English or Spanish, and reported engaging in vaginal or anal sex with one or more opposite-sex partners in the 12 months before the interview. Persons who consented to participate completed an interviewer-administered, standardized questionnaire about HIV-associated behaviors and were offered anonymous HIV testing. Participants were sampled using respondent-driven sampling, a type of chain-referral sampling. Sampling focused on persons of low SES (i.e., income at the poverty level or no more than a high school education) because results of a pilot study indicated that heterosexual adults of low SES were more likely than those of high SES to be infected with HIV. 
To assess risk and testing experiences among persons at risk for acquiring HIV infection through heterosexual sex, analyses excluded participants who were not low SES, those who reported ever having tested positive for HIV, and those who reported recent (i.e., in the 12 months before the interview) male-male sex or injection drug use. This report summarizes unweighted data regarding HIV-associated risk, prevention, and testing behaviors from 9,278 heterosexual men and women interviewed in 2010 (the second cycle of NHBS data collection among heterosexuals). RESULTS: The median age of participants was 35 years; 47% were men. The majority of participants were black or African American (hereafter referred to as black) (72%) or Hispanic/Latino (21%). Most participants (men: 88%; women: 90%) reported having vaginal sex without a condom with one or more opposite-sex partners in the past 12 months; approximately one third (men: 30%; women: 29%) reported anal sex without a condom with one or more opposite-sex partners. The majority of participants (59%) reported using noninjection drugs in the 12 months before the interview; nearly one in seven (15%) had used crack cocaine. Although most participants (men: 71%; women: 77%) had ever been tested for HIV, this percentage was lower among Hispanic/Latino participants (men: 52%; women: 62%). Approximately one third (34%) of participants reported receiving free condoms in the 12 months before the interview; 11% reported participating in a behavioral HIV prevention program. INTERPRETATION: A substantial proportion of heterosexuals interviewed for the 2010 NHBS heterosexual cycle reported engaging in behaviors that increase the risk for HIV infection. However, HIV testing was suboptimal among the overall sample, including among groups disproportionately affected by HIV infection (i.e., blacks and Hispanics/Latinos). 
PUBLIC HEALTH ACTION: Increasing coverage of HIV testing and other HIV prevention services among heterosexuals at increased risk is important, especially among groups disproportionately affected by HIV infection, such as blacks and Hispanics/Latinos. The National HIV/AIDS Strategy for the United States delineates a coordinated national response to reduce infections and HIV-related health disparities among disproportionately affected groups. NHBS data can guide national and local planning efforts to maximize the impact of HIV prevention programs. |
Integrating active tuberculosis case finding in antenatal services in Zambia
Kancheya N , Luhanga D , Harris JB , Morse J , Kapata N , Bweupe M , Henostroza G , Reid SE . Int J Tuberc Lung Dis 2014 18 (12) 1466-72 SETTING: Three out-patient antenatal care (ANC) clinics in Lusaka, Zambia. OBJECTIVE: To estimate tuberculosis (TB) prevalence in human immunodeficiency virus (HIV) infected and symptomatic, non-HIV-infected pregnant women and explore the feasibility of routine TB screening in ANC settings. DESIGN: Peer educators administered TB symptom questionnaires to pregnant women attending their first ANC clinic visit. Presumptive TB patients were defined as all HIV-infected women and symptomatic non-HIV-infected women. Sputum samples were tested using smear microscopy and culture to estimate TB prevalence. RESULTS: All 5033 (100%) women invited to participate in the study agreed, and 17% reported one or more TB symptoms. Among 1152 presumed TB patients, 17 (1.5%) had previously undiagnosed culture-confirmed TB; 2 (12%) were smear-positive. Stratified by HIV status, TB prevalence was 10/664 (1.5%, 95%CI 0.7-2.8) among HIV-infected women and 7/488 (1.4%, 95%CI 0.6-2.9) among symptomatic non-HIV-infected women. In HIV-infected women, the only symptom significantly associated with TB was productive cough; symptom screening was only 50% sensitive. CONCLUSION: There is a sizable burden of TB in pregnant women in Zambia, which may lead to adverse maternal and infant outcomes. TB screening in ANC settings in Zambia is acceptable and feasible. More sensitive diagnostics are needed. |
Internalised homophobia is differentially associated with sexual risk behaviour by race/ethnicity and HIV serostatus among substance-using men who have sex with men in the United States
Mansergh G , Spikes P , Flores SA , Koblin BA , McKirnan D , Hudson SM , Colfax GN . Sex Transm Infect 2014 91 (5) 324-8 OBJECTIVES: There is a continuing need to identify factors associated with risk for HIV transmission among men who have sex with men (MSM), including a need for further research in the ongoing scientific debate about the association of internalised homophobia and sexual risk due partly to the lack of specificity in analysis. We assess the association of internalised homophobia by race/ethnicity within HIV serostatus for a large sample of substance-using MSM at high risk of HIV acquisition or transmission. METHODS: Convenience sample of substance-using (non-injection) MSM reporting unprotected anal sex in the prior 6 months residing in Chicago, Los Angeles, New York and San Francisco. The analytic sample included HIV-negative and HIV-positive black (n=391), Latino (n=220), and white (n=458) MSM. Internalised homophobia was assessed using a published four-item scale focusing on negative self-perceptions of and feelings about their own sexual behaviour with men, or about being gay or bisexual. Analyses tested associations of internalised homophobia with recent risk behaviour, stratified by laboratory-confirmed HIV serostatus within race/ethnicity, and controlling for other demographic variables. RESULTS: In multivariate analysis, internalised homophobia was inversely associated (p<0.05) with recent unprotected anal sex among black MSM, and not significantly associated with sexual risk behaviour among white and Latino MSM. CONCLUSIONS: More research is needed to further identify nuanced differences in subpopulations of MSM, but these results suggest differentially targeted intervention messages for MSM by race/ethnicity. |
Key populations in sub-Saharan Africa: population size estimates and high risk behaviors
Abdul-Quader AS , Gouws-Williams E , Tlou S , Wright-De Aguero L , Needle R . AIDS Behav 2014 19 Suppl 1 S1-2 The expansion of antiretroviral treatment and other biomedical and behavioral interventions has slowed HIV transmission in a number of countries in sub-Saharan Africa. However, populations at high risk of HIV infection including men who have sex with men (MSM), sex workers (SWs) and people who inject drugs (PWID) have limited access to and uptake of these interventions due to structural factors, legal barriers, stigma and discrimination. Other challenges related to populations at high risk of HIV infection include the lack of accurate population size estimates to help measure program coverage and program reach, lack of good quality epidemiologic data on HIV prevalence and related behaviors at the national and sub-national levels, and lack of real time analysis of programmatic data to guide programming for an AIDS free generation. Increasingly, major funding agencies such as the President’s Emergency Fund for AIDS Relief (PEPFAR) and The Global Fund to Fight AIDS, Tuberculosis and Malaria (GFATM) have recognized population size estimates as an integral part of national and sub-national level strategic planning, target setting and for assessing HIV program results. Program implementers, policy makers and funding organizations have supported population size estimation activities and bio-behavioral surveys among MSM, PWID and SW in a number of countries, including countries with generalized as well as concentrated epidemics, to target and strengthen HIV prevention, care and treatment programming. |
Knowledge, attitudes, and practices regarding antiretroviral management, reproductive health, sexually transmitted infections, and sexual risk behavior among perinatally HIV-infected youth in Thailand
Lolekha R , Boon-Yasidhi V , Leowsrisook P , Naiwatanakul T , Durier Y , Nuchanard W , Tarugsa J , Punpanich W , Pattanasin S , Chokephaibulkit K . AIDS Care 2014 27 (5) 1-11 More than 30% of perinatally HIV-infected children in Thailand are 12 years and older. As these youth become sexually active, there is a risk that they will transmit HIV to their partners. Data on the knowledge, attitudes, and practices (KAP) of HIV-infected youth in Thailand are limited. Therefore, we assessed the KAP of perinatally HIV-infected youth and youth reporting sexual risk behaviors receiving care at two tertiary care hospitals in Bangkok, Thailand and living in an orphanage in Lopburi, Thailand. From October 2010 to July 2011, 197 HIV-infected youth completed an audio computer-assisted self-interview to assess their KAP regarding antiretroviral (ARV) management, reproductive health, sexual risk behaviors, and sexually transmitted infections (STIs). A majority of youth in this study correctly answered questions about HIV transmission and prevention and the importance of taking ARVs regularly. More than half of the youth in this study demonstrated a lack of family planning, reproductive health, and STI knowledge. Girls had more appropriate attitudes toward safe sex and risk behaviors than boys. Although only 5% of the youth reported that they had engaged in sexual intercourse, about a third reported sexual risk behaviors (e.g., having or kissing boy/girlfriend or consuming an alcoholic beverage). We found low levels of condom use and other family planning practices, increasing the risk of HIV and/or STI transmission to sexual partners. Additional resources are needed to improve reproductive health knowledge and reduce risk behavior among HIV-infected youth in Thailand. |
Acceptability and willingness among men who have sex with men (MSM) to use a tablet-based HIV risk assessment in a clinical setting
Jones J , Stephenson R , Smith DK , Toledo L , La Pointe A , Taussig J , Sullivan PS . Springerplus 2014 3 708 We developed an iPad-based application to administer an HIV risk assessment tool in a clinical setting. We conducted focus group discussions (FGDs) with gay, bisexual and other men who have sex with men (MSM) to assess their opinions about using such a device to share risk behavior information in a clinical setting. Participants were asked about their current assessment of their risk or any risk reduction strategies that they discussed with their healthcare providers. Participants were then asked to provide feedback about the iPad-based risk assessment, their opinions about using it in a clinic setting, and suggestions on how the assessment could be improved. FGD participants were generally receptive to the idea of using an iPad-based risk assessment during healthcare visits. Based on the results of the FGDs, an iPad-based risk assessment is a promising method for identifying those patients at highest risk for HIV transmission. |
Airport exit and entry screening for Ebola - August-November 10, 2014
Brown CM , Aranas AE , Benenson GA , Brunette G , Cetron M , Chen TH , Cohen NJ , Diaz P , Haber Y , Hale CR , Holton K , Kohl K , Le AW , Palumbo GJ , Pearson K , Phares CR , Alvarado-Ramy F , Roohi S , Rotz LD , Tappero J , Washburn FM , Watkins J , Pesik N . MMWR Morb Mortal Wkly Rep 2014 63 (49) 1163-7 In response to the largest recognized Ebola virus disease epidemic now occurring in West Africa, the governments of affected countries, CDC, the World Health Organization (WHO), and other international organizations have collaborated to implement strategies to control spread of the virus. One strategy recommended by WHO calls for countries with Ebola transmission to screen all persons exiting the country for "unexplained febrile illness consistent with potential Ebola infection." Exit screening at points of departure is intended to reduce the likelihood of international spread of the virus. To initiate this strategy, CDC, WHO, and other global partners were invited by the ministries of health of Guinea, Liberia, and Sierra Leone to assist them in developing and implementing exit screening procedures. Since the program began in August 2014, an estimated 80,000 travelers, of whom approximately 12,000 were en route to the United States, have departed by air from the three countries with Ebola transmission. Procedures were implemented to deny boarding to ill travelers and persons who reported a high risk for exposure to Ebola; no international air traveler from these countries has been reported as symptomatic with Ebola during travel since these procedures were implemented. |
Anal intercourse without condoms among HIV-positive men who have sex with men recruited from a sexual networking web site, United States
Margolis AD , Joseph H , Hirshfield S , Chiasson MA , Belcher L , Purcell DW . Sex Transm Dis 2014 41 (12) 749-755 BACKGROUND: The changing landscape of HIV prevention in the United States underscores the need to improve our ability to efficiently reach HIV-positive men who have sex with men (MSM) who engage in behaviors that could transmit HIV. METHODS: We examined the prevalence of anal intercourse (AI) without condoms with HIV-negative or unknown serostatus partners ("at-risk partners") among 1319 HIV-positive adult male members of a sexual networking Web site for MSM. Sexual behaviors and substance use were measured over a 60-day recall period. Logistic regression was used to identify correlates of insertive and receptive AI without condoms with at-risk partners. RESULTS: Approximately 25% of the men had been diagnosed as having HIV 12 months or less before study enrollment. Overall, 32% of men engaged in AI without condoms with at-risk partners. Multiple logistic regression identified behavioral predictors of insertive AI without condoms with at-risk partners, including HIV diagnosis within the last 12 months, sex with multiple male partners, substance use in conjunction with sex, and use of phosphodiesterase type 5 inhibitors. Receptive AI without condoms with at-risk partners was associated with younger age (19-24 years), residing outside metropolitan cities, substance use in conjunction with sex, and having multiple male partners. CONCLUSIONS: High levels of sexual risk were found among these MSM. Increased Internet-based HIV prevention marketing efforts and prevention strategies should be considered to efficiently reach HIV-positive MSM who engage in serodiscordant AI without condoms. |
Awareness of HCV infection among persons who inject drugs in San Diego, California
Collier MG , Bhaurla SK , Cuevas-Mota J , Armenta RF , Teshale EH , Garfein RS . Am J Public Health 2014 105 (2) e1-e2 We asked persons who inject drugs about HCV, including past testing and diagnosis, and then tested them for HCV. Of 540 participants, 145 (27%) were anti-HCV positive, but of those who were positive, only 46 (32%) knew about their infection. Asking about previous HCV testing results yielded better results than did asking about prior HCV diagnosis. Factors associated with knowing about HCV infection included older age, HIV testing, and drug treatment. Comprehensive approaches to educating and screening this population for HCV need to be implemented. |
Challenges in responding to the Ebola epidemic - four rural counties, Liberia, August-November 2014
Summers A , Nyenswah TG , Montgomery JM , Neatherlin J , Tappero JW . MMWR Morb Mortal Wkly Rep 2014 63 (50) 1202-4 The first cases of Ebola virus disease (Ebola) in West Africa were identified in Guinea on March 22, 2014. On March 30, the first Liberian case was identified in Foya Town, Lofa County, near the Guinean border. Because the majority of early cases occurred in Lofa and Montserrado counties, resources were concentrated in these counties during the first several months of the response, and these counties have seen signs of successful disease control. By October 2014, the epidemic had reached all 15 counties of Liberia. During August 27-September 10, 2014, CDC in collaboration with the Liberian Ministry of Health and Social Welfare assessed county Ebola response plans in four rural counties (Grand Cape Mount, Grand Bassa, Rivercess, and Sinoe) to identify county-specific challenges in executing their Ebola response plans and to provide recommendations and training to enhance control efforts. Assessments were conducted through interviews with county health teams and health care providers and visits to health care facilities. At the time of assessment, county health teams reported lacking adequate training in core Ebola response strategies and reported facing many challenges because of poor transportation and communication networks. Development of communication and transportation network strategies for communities with limited access to roads and limited means of communication, in addition to adequate training in Ebola response strategies, is critical for successful management of Ebola in remote areas. |
Changes in glomerular kidney function among HIV-1-uninfected men and women receiving emtricitabine-tenofovir disoproxil fumarate preexposure prophylaxis: a randomized clinical trial
Mugwanya KK , Wyatt C , Celum C , Donnell D , Mugo NR , Tappero J , Kiarie J , Ronald A , Baeten JM . JAMA Intern Med 2014 175 (2) 246-54 IMPORTANCE: Tenofovir disoproxil fumarate (TDF) use has been associated with declines in the estimated glomerular filtration rate (eGFR) when used as part of antiretroviral treatment by persons with human immunodeficiency virus (HIV) type 1, but limited data are available for risk when used as preexposure prophylaxis (PrEP) for HIV-1 prevention. OBJECTIVE: To determine whether TDF-based PrEP causes eGFR decline in HIV-1-uninfected adults. DESIGN, SETTING, AND PARTICIPANTS: A per-protocol safety analysis of changes in eGFR in the Partners PrEP Study, a randomized, placebo-controlled trial of daily oral TDF and emtricitabine (FTC)-TDF PrEP among heterosexual HIV-1-uninfected members of serodiscordant couples in Kenya and Uganda. The trial was conducted from 2008 to 2012. MAIN OUTCOMES AND MEASURES: Predefined outcomes of this analysis were mean eGFR change and a 25% or greater eGFR decline from baseline. The eGFR was calculated using the Chronic Kidney Disease Epidemiology Collaboration equation. RESULTS: Of 4640 participants in the once-daily TDF (n = 1548), FTC-TDF (n = 1545), or placebo (n = 1547) groups, 63% were men. At enrollment, median age was 35 years (range, 18-64 years), and mean eGFR was 130 mL/min/1.73 m2. During a median follow-up of 18 months (interquartile range, 12-27 months), mean within-group eGFR change from baseline was +0.14 mL/min/1.73 m2 for TDF, -0.22 mL/min/1.73 m2 for FTC-TDF, and +1.37 mL/min/1.73 m2 for placebo, translating into average declines in eGFR attributable to PrEP vs placebo of -1.23 mL/min/1.73 m2 (95% CI, -2.06 to -0.40; P = .004) for TDF and -1.59 mL/min/1.73 m2 (95% CI, -2.44 to -0.74; P < .001) for FTC-TDF. The difference in mean eGFR between PrEP and placebo appeared by 1 month after randomization, was stable through 12 months, and then appeared to wane thereafter. 
The respective proportions of persons who developed a confirmed 25% or greater eGFR decline from baseline by 12 and 24 months were 1.3% and 1.8% for TDF and 1.2% and 2.5% for FTC-TDF, and these frequencies were not statistically different from the confirmed decline in the placebo group (0.9% and 1.3% by 12 and 24 months, respectively). CONCLUSIONS AND RELEVANCE: In this large randomized, placebo-controlled trial among heterosexual persons, with median follow-up of 18 months and maximum follow-up of 36 months, daily oral TDF-based PrEP resulted in a small but nonprogressive decline in eGFR that was not accompanied by a substantial increase in the risk of clinically relevant (≥25%) eGFR decline. |
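The abstract above notes that eGFR was calculated with the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equation. As a minimal sketch of that calculation (the 2009 creatinine-based coefficients, not the study's own code), the equation can be written as:

```python
def ckd_epi_egfr(scr_mg_dl, age, female, black=False):
    """Estimate GFR (mL/min/1.73 m^2) with the 2009 CKD-EPI creatinine equation.

    scr_mg_dl: serum creatinine in mg/dL; age in years.
    """
    kappa = 0.7 if female else 0.9      # sex-specific creatinine threshold
    alpha = -0.329 if female else -0.411  # sex-specific low-creatinine exponent
    ratio = scr_mg_dl / kappa
    egfr = 141.0 * min(ratio, 1.0) ** alpha * max(ratio, 1.0) ** -1.209 * 0.993 ** age
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr
```

For example, a 35-year-old non-black man with serum creatinine 0.9 mg/dL gets an eGFR of roughly 110 mL/min/1.73 m2; higher creatinine at the same age yields a lower eGFR.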
Characteristics and predictors of death among hospitalized HIV-infected patients in a low HIV prevalence country: Bangladesh
Shahrin L , Leung DT , Matin N , Pervez MM , Azim T , Bardhan PK , Heffelfinger JD , Chisti MJ . PLoS One 2014 9 (12) e113095 BACKGROUND: Predictors of death in hospitalized HIV-infected patients have not been previously reported in Bangladesh. OBJECTIVE: The primary aim of this study was to determine predictors of death among hospitalized HIV-infected patients at a large urban hospital in Bangladesh. METHODS: A study was conducted in the HIV in-patient unit (Jagori Ward) of icddr,b's Dhaka Hospital. Characteristics of patients who died during hospitalization were compared to those of patients discharged from the ward. Bivariate analysis was performed to determine associations between potential risk factors and death. Multivariable logistic regression was used to identify factors independently associated with death. RESULTS: Of 293 patients admitted to the Jagori Ward, 57 died during hospitalization. Most hospitalized patients (67%) were male and the median age was 35 (interquartile range: 2-65) years. Overall, 153 (52%) patients were diagnosed with HIV within 6 months of hospitalization. The most common presumptive opportunistic infections (OIs) identified were tuberculosis (32%), oesophageal candidiasis (9%), Pneumocystis jirovecii pneumonia (PJP) (8%), and histoplasmosis (7%). On multivariable analysis, independent predictors of mortality were CD4 count ≤200 cells/mm3 (adjusted odds ratio [aOR]: 16.6, 95% confidence interval [CI]: 3.7-74.4), PJP (aOR: 18.5, 95% CI: 4.68-73.3), oesophageal candidiasis (aOR: 27.5, 95% CI: 5.5-136.9), malignancy (aOR: 15.2, 95% CI: 2.3-99.4), and bacteriuria (aOR: 7.9, 95% CI: 1.2-50.5). Being on antiretroviral therapy prior to hospitalization (aOR: 0.2, 95% CI: 0.06-0.5) was associated with decreased mortality. CONCLUSION: This study showed that most patients who died during hospitalization on the Jagori Ward had HIV-related illnesses which could have been averted with earlier diagnosis of HIV and proper management of OIs. 
It is prudent to develop a national HIV screening programme to facilitate early identification of HIV. |
Clinical inquiries regarding Ebola virus disease received by CDC - United States, July 9-November 15, 2014
Karwowski MP , Meites E , Fullerton KE , Stroher U , Lowe L , Rayfield M , Blau DM , Knust B , Gindler J , Beneden CV , Bialek SR , Mead P , Oster AM . MMWR Morb Mortal Wkly Rep 2014 63 (49) 1175-9 Since early 2014, there have been more than 6,000 reported deaths from Ebola virus disease (Ebola), mostly in Guinea, Liberia, and Sierra Leone. On July 9, 2014, CDC activated its Emergency Operations Center for the Ebola outbreak response and formalized the consultation service it had been providing to assist state and local public health officials and health care providers in evaluating persons in the United States thought to be at risk for Ebola. During July 9-November 15, CDC responded to clinical inquiries from public health officials and health care providers from 49 states and the District of Columbia regarding 650 persons thought to be at risk. Among these, 118 (18%) had initial signs or symptoms consistent with Ebola and epidemiologic risk factors placing them at risk for infection, thereby meeting the definition of persons under investigation (PUIs). Testing was not always performed for PUIs because alternative diagnoses were made or symptoms resolved. In total, 61 (9%) persons were tested for Ebola virus, and four, all of whom met PUI criteria, had laboratory-confirmed Ebola. Overall, 490 (75%) inquiries concerned persons who had neither traveled to an Ebola-affected country nor had contact with an Ebola patient. Appropriate medical evaluation and treatment for other conditions were noted in some instances to have been delayed while a person was undergoing evaluation for Ebola. Evaluating and managing persons who might have Ebola is one component of the overall approach to domestic surveillance, the goal of which is to rapidly identify and isolate Ebola patients so that they receive appropriate medical care and secondary transmission is prevented. 
Health care providers should remain vigilant and consult their local and state health departments and CDC when assessing ill travelers from Ebola-affected countries. Most of these persons do not have Ebola; prompt diagnostic assessments, laboratory testing, and provision of appropriate care for other conditions are essential for appropriate patient care and reflect hospital preparedness. |
Correlates of prevalent HIV infection among adults and adolescents in the Kisumu incidence cohort study, Kisumu, Kenya
Gumbe A , McLellan-Lemal E , Gust DA , Pals SL , Gray KM , Ndivo R , Chen RT , Mills LA , Thomas TK . Int J STD AIDS 2014 26 (13) 929-40 We estimated HIV prevalence and identified correlates of HIV infection among 1106 men and women aged 16-34 years residing in Kisumu, Kenya. Demographic, sexual, and other behavioural data were collected using audio computer-assisted self-interview in conjunction with a medical examination, real-time parallel rapid HIV testing, and laboratory testing for pregnancy, gonorrhoea, chlamydia, syphilis, and herpes simplex virus type 2. Multivariate logistic regression was used to identify variables associated with prevalent HIV infection by gender. Overall HIV prevalence was 12.1%. HIV prevalence among women (17.1%) was approximately two and one half times the prevalence among men (6.6%). Odds of HIV infection in men increased with age (aOR associated with a one-year increase in age = 1.21, CI = 1.07-1.35) and were greater among those who were uncircumcised (aOR = 4.42, CI = 1.41-13.89) and those who had a positive herpes simplex virus type 2 test result (aOR = 3.13, CI = 1.12-8.73). Odds of prevalent HIV infection among women also increased with age (aOR associated with a one-year increase in age = 1.16, CI = 1.04-1.29). Women who tested herpes simplex virus type 2 positive had more than three times the odds (aOR = 3.85, CI = 1.38-10.46) of prevalent HIV infection compared with those who tested herpes simplex virus type 2 negative. Tailored sexual health interventions and programs may help mitigate HIV age and gender disparities. |
Communicating with school nurses about sexual orientation and sexual health: perspectives of teen young men who have sex with men
Rasberry CN , Morris E , Lesesne CA , Kroupa E , Topete P , Carver LH , Robin L . J Sch Nurs 2014 31 (5) 334-44 Black and Latino young men who have sex with men (YMSM) are at disproportionate risk for sexually transmitted diseases (STDs), including HIV. This study informs school-centered strategies for connecting YMSM to health services by describing their willingness, perceived safety, and experiences in talking to school staff about sexual health. Cross-sectional data were collected from Black and Latino YMSM aged 13-19 through web-based questionnaires (N = 415) and interviews (N = 32). School nurses were the staff members youth most often reported willingness to talk to about HIV testing (37.8%), STD testing (37.1%), or condoms (37.3%), but least often reported as safe to talk to about attraction to other guys (11.4%). Interviews revealed youth reluctance to talk with school staff including nurses when uncertain of staff members' perceptions of lesbian, gay, bisexual, transgender, and questioning (LGBTQ) people or perceiving staff to lack knowledge of LGBTQ issues, communities, or resources. Nurses may need additional training to effectively reach Black and Latino YMSM. |
Community-based control of the brown dog tick in a region with high rates of Rocky Mountain spotted fever, 2012-2013
Drexler N , Miller M , Gerding J , Todd S , Adams L , Dahlgren FS , Bryant N , Weis E , Herrick K , Francies J , Komatsu K , Piontkowski S , Velascosoltero J , Shelhamer T , Hamilton B , Eribes C , Brock A , Sneezy P , Goseyun C , Bendle H , Hovet R , Williams V , Massung R , McQuiston JH . PLoS One 2014 9 (12) e112368 Rocky Mountain spotted fever (RMSF) transmitted by the brown dog tick (Rhipicephalus sanguineus sensu lato) has emerged as a significant public health risk on American Indian reservations in eastern Arizona. During 2003-2012, more than 250 RMSF cases and 19 deaths were documented among Arizona's American Indian population. The high case fatality rate makes community-level interventions aimed at rapid and sustained reduction of ticks urgent. Beginning in 2012, a two-year pilot integrated tick prevention campaign called the RMSF Rodeo was launched in an approximately 600-home tribal community with high rates of RMSF. During year one, long-acting tick collars were placed on all dogs in the community, environmental acaricides were applied to yards monthly, and animal care practices such as spay and neuter and proper tethering procedures were encouraged. Tick levels, indicated by visible inspection of dogs, tick traps and homeowner reports were used to monitor tick presence and evaluate the efficacy of interventions throughout the project. By the end of year one, <1% of dogs in the RMSF Rodeo community had visible tick infestations five months after the project was started, compared to 64% of dogs in Non-Rodeo communities, and environmental tick levels were reduced below detectable levels. The second year of the project focused on use of the long-acting collar alone and achieved sustained tick control with fewer than 3% of dogs in the RMSF Rodeo community with visible tick infestations by the end of the second year. 
Homeowner reports of tick activity in the domestic and peridomestic setting showed similar decreases in tick activity compared to the non-project communities. Expansion of this successful project to other areas with Rhipicephalus-transmitted RMSF has the potential to reduce brown dog tick infestations and save human lives. |
Meteorological influences on nitrogen dynamics of a coastal onsite wastewater treatment system
O'Driscoll MA , Humphrey CP , Deal NE , Lindbo DL , Zarate-Bermudez MA . J Environ Qual 2014 43 (6) 1873-1885 On-site wastewater treatment systems (OWTS) can contribute nitrogen (N) to coastal waters. In coastal areas with shallow groundwater, OWTS are likely affected by meteorological events. However, the meteorological influences on temporal variability of N exports from OWTS are not well documented. Hydrogeological characterization and seasonal monitoring of wastewater and groundwater quality were conducted at a residence adjacent to the Pamlico River Estuary, North Carolina, during a 2-yr field study (October 2009-2011). Rainfall was elevated during the first study year, relative to the annual mean. In the second year, drought was followed by extreme precipitation from Hurricane Irene. Recent meteorological conditions influenced N speciation and concentrations in groundwater. Groundwater total dissolved nitrogen (TDN) beneath the OWTS drainfield was dominated by nitrate during the drought; during wetter periods, ammonium and organic N were common. Effective precipitation (precipitation [P] minus evapotranspiration [ET]) affected OWTS TDN exports because of its influence on groundwater recharge and discharge. Groundwater nitrate-N concentrations beneath the drainfield were typically higher than 10 mg/L when total biweekly precipitation was less than evapotranspiration (precipitation deficit: P < ET). Overall, groundwater TDN concentrations were elevated above background concentrations at distances > 15 m downgradient of the drainfield. Although OWTS nitrate inputs caused elevated groundwater nitrate concentrations between the drainfield and the estuary, the majority of nitrate was attenuated via denitrification between the OWTS and 48 m to the estuary. However, dissolved organic N (DON) originating from the OWTS was mobile and contributed to elevated TDN concentrations along the groundwater flowpath to the estuary. |
Heat stress illness hospitalizations - Environmental Public Health Tracking Program, 20 states, 2001-2010
Choudhary E , Vaidyanathan A . MMWR Surveill Summ 2014 63 (13) 1-10 PROBLEM/CONDITION: Heat stress illness (HSI), also known as heat-related illness, comprises mild heat edema, heat syncope, heat cramps, heat exhaustion (the most common type of HSI), and heat stroke (the most severe form). CDC's Environmental Public Health Tracking Program receives annual hospitalization discharge data from 23 states that are used to assess and monitor trends of HSI hospitalization over time. REPORTING PERIOD: May-September, 2001-2010. DESCRIPTION OF SYSTEM: The Environmental Public Health Tracking Program is a comprehensive surveillance system implemented in 25 states and one city health department. The core of the system is the Tracking Network, which collects data on environmental hazards, health effects, exposures, and population. The Tracking Network provides nationally consistent environmental and health outcome data that enable federal, state, and local public health agencies to assess trends, explore associations, and generate hypotheses using these data. For HSI surveillance, the Tracking Network uses state-based hospital discharge data. RESULTS: During 2001-2010, approximately 28,000 HSI hospitalizations occurred in 20 states participating in the Tracking Program. Data from three states were not included in this report because of missing data for ≥3 years. Two states joined the Tracking Program after the study period and also are not included in this report. The majority of HSI hospitalizations occurred among males and persons aged ≥65 years. The highest rates of hospitalizations were in the Midwest and the South. During this period, an overall 2%-5% increase in the rate of HSI hospitalizations occurred in all 20 states compared with the 2001 rate. The correlation between the average number of HSI hospitalizations and the average monthly maximum temperature/heat index was statistically significant (at p<0.0001) in all 20 states. 
INTERPRETATION: Consistent with previous studies, age and sex were identified as major risk factors for HSI hospitalizations. Certain Tracking states that experienced high temperatures during summer months showed an increase in the rate of HSI hospitalizations over the 10-year study period. PUBLIC HEALTH ACTION: HSIs are preventable and an important focus of public health interventions at state and local health departments. Federal, state, and local public health agencies can use data on HSI hospitalizations for surveillance purposes to estimate trends over time and to design targeted interventions to reduce heat stress morbidity among at-risk populations. |
Association of short-term exposure to ground-level ozone and respiratory outpatient clinic visits in a rural location - Sublette County, Wyoming, 2008-2011
Pride KR , Peel JL , Robinson BF , Busacker A , Grandpre J , Bisgard KM , Yip FY , Murphy TD . Environ Res 2014 137c 1-7 OBJECTIVE: Short-term exposure to ground-level ozone has been linked to adverse respiratory and other health effects; previous studies typically have focused on summer ground-level ozone in urban areas. During 2008-2011, Sublette County, Wyoming (population: ~10,000 persons), experienced periods of elevated ground-level ozone concentrations during the winter. This study sought to evaluate the association of daily ground-level ozone concentrations and health clinic visits for respiratory disease in this rural county. METHODS: Clinic visits for respiratory disease were ascertained from electronic billing records of the two clinics in Sublette County for January 1, 2008-December 31, 2011. A time-stratified case-crossover design, adjusted for temperature and humidity, was used to investigate associations between ground-level ozone concentrations measured at one station and clinic visits for a respiratory health concern by using an unconstrained distributed lag of 0-3 days and single-day lags of 0 day, 1 day, 2 days, and 3 days. RESULTS: The data set included 12,742 case-days and 43,285 selected control-days. The mean ground-level ozone observed was 47 +/- 8 ppb. The unconstrained distributed lag of 0-3 days was consistent with a null association (adjusted odds ratio [aOR]: 1.001; 95% confidence interval [CI]: 0.990-1.012); results for lags 0, 2, and 3 days were consistent with the null. However, the results for lag 1 were indicative of a positive association; for every 10-ppb increase in the 8-h maximum average ground-level ozone, a 3.0% increase in respiratory clinic visits the following day was observed (aOR: 1.031; 95% CI: 0.994-1.069). Season modified the adverse respiratory effects: ground-level ozone was significantly associated with respiratory clinic visits during the winter months. 
The patterns of results from all sensitivity analyses were consistent with the a priori model. CONCLUSIONS: The results demonstrate an association between increasing ground-level ozone and an increase in clinic visits for adverse respiratory-related effects on the following day (lag day 1) in Sublette County; the magnitude was strongest during the winter months, and this association in a rural location warrants further investigation. |
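The link between the reported odds ratio and the quoted percent increase is simple arithmetic. A small illustrative sketch (not the study's analysis code; the log-linearity assumption for rescaling is ours):

```python
def or_percent_change(odds_ratio):
    """Percent change in odds implied by an odds ratio (OR of 1.0 = no change)."""
    return (odds_ratio - 1.0) * 100.0

def rescale_or(odds_ratio, new_increment, base_increment=10.0):
    """Rescale an OR reported per base_increment units of exposure to a
    different increment, assuming a log-linear exposure-response relationship."""
    return odds_ratio ** (new_increment / base_increment)
```

For example, the lag-1 aOR of 1.031 per 10-ppb increase corresponds to a ~3.1% increase in the odds of a respiratory clinic visit, and under log-linearity a 20-ppb increase would imply an OR of about 1.063.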
System for rapid assessment of pneumonia and influenza-related mortality-Ohio, 2009-2010
Rodgers LE , Paulson J , Fowler B , Duffy R . Am J Public Health 2014 105 (2) e1-e4 Rapid mortality surveillance is critical for state emergency preparedness. To enhance timeliness during the 2009-2010 influenza A H1N1 pandemic, the Ohio Department of Health activated a drop-down menu within Ohio's Electronic Death Registration System for reporting of pneumonia- or influenza-related deaths approximately 5 days postmortem. We used International Classification of Diseases-Tenth Revision (ICD-10) codes, available 2-3 months postmortem, as the standard and assessed their agreement with drop-down-menu codes for pneumonia- or influenza-related deaths. Among 56,660 Ohio deaths during September 2009-March 2010, agreement was 97.9% for pneumonia (kappa = 0.85) and 99.9% for influenza (kappa = 0.79). Sensitivity was 80.2% for pneumonia and 73.9% for influenza. Drop-down menu coding enhanced timeliness while maintaining high agreement with ICD-10 codes. |
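The agreement (kappa) and sensitivity figures above come from comparing drop-down-menu codes against ICD-10 codes in a 2x2 table. A minimal sketch of those two statistics (the cell counts in the example are hypothetical, not the Ohio data):

```python
def kappa_and_sensitivity(tp, fp, fn, tn):
    """Cohen's kappa and sensitivity for a binary classification
    (e.g., rapid drop-down code vs. ICD-10 standard)."""
    n = tp + fp + fn + tn
    po = (tp + tn) / n                          # observed agreement
    p_yes = ((tp + fp) / n) * ((tp + fn) / n)   # chance agreement on 'yes'
    p_no = ((fn + tn) / n) * ((fp + tn) / n)    # chance agreement on 'no'
    pe = p_yes + p_no                           # total chance agreement
    kappa = (po - pe) / (1 - pe)
    sensitivity = tp / (tp + fn)
    return kappa, sensitivity
```

With perfect agreement kappa is 1; with chance-level agreement it is 0, which is why high raw agreement (97.9%) can coexist with a more modest kappa (0.85).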
Norovirus Infection and Disease in an Ecuadorian Birth Cohort: Association of Certain Norovirus Genotypes With Host FUT2 Secretor Status.
Lopman B , Trivedi T , Vicuna Y , Costantini V , Collins N , Gregoricus N , Parashar U , Sandoval C , Broncano N , Vaca M , Chico ME , Vinje J , Cooper PJ . J Infect Dis 2014 211 (11) 1813-21 BACKGROUND: Although norovirus is the most common cause of gastroenteritis, there are few data on the community incidence of infection/disease or the patterns of acquired immunity or innate resistance to norovirus. METHODS: We followed a community-based birth cohort of 194 children in Ecuador with the aim to estimate (1) the incidence of norovirus gastroenteritis from birth to three years, (2) the protective effect of norovirus infection on subsequent infection/disease, and (3) the association of infection/disease with secretor status. RESULTS: Over the 3-year period, we detected a mean of 2.26 (range, 0-12) diarrheal episodes per child. Norovirus was detected in 260 (18%) samples, but not more frequently in diarrhea samples (79/438; 18%) than in diarrhea-free samples (181/1016; 18%, p=0.919). Sixty-six percent of children had at least one norovirus infection during the first 3 years of life, and 40% of children had two infections. Previous norovirus infections were not associated with subsequent risk of infection. All GII.4 infections were among secretor-positive children (p<0.001), but higher rates of non-GII.4 infections were found in secretor-negative children (RR=0.56; p=0.029). CONCLUSIONS: GII.4 infections were uniquely detected in secretor-positive children, while non-GII.4 infections were more often found in secretor-negative children. |
First report of an infant botulism case due to Clostridium botulinum type Af
de Jong LI , Fernandez RA , Pareja V , Giaroli G , Guidarelli SR , Dykes JK , Luquez C . J Clin Microbiol 2014 53 (2) 740-2 Most infant botulism cases worldwide are due to botulinum toxin types A and B. Rarely, Clostridium botulinum strains that produce two serotypes (Ab, Ba, and Bf) have also been isolated from infant botulism cases. This is the first reported case of infant botulism due to C. botulinum type Af worldwide. |
Association of host, agent and environment characteristics and the duration of incubation and symptomatic periods of norovirus gastroenteritis
Devasia T , Lopman B , Leon J , Handel A . Epidemiol Infect 2014 143 (11) 1-7 We analysed the reported duration of incubation and symptomatic periods of norovirus for a dataset of 1022 outbreaks, 64 of which reported data on the average incubation period and 87 on the average symptomatic period. We found the mean and median incubation periods for norovirus to be 32.8 [95% confidence interval (CI) 30.9-34.6] hours and 33.5 (95% CI 32.0-34.0) hours, respectively. For the symptomatic period we found the mean and median to be 44.2 (95% CI 38.9-50.7) hours and 43.0 (95% CI 36.0-48.0) hours, respectively. We further investigated how these average periods were associated with several reported host, agent and environmental characteristics. We did not find any strong, biologically meaningful associations between the duration of incubation or symptomatic periods and the reported host, pathogen and environmental characteristics. Overall, we found that the distributions of incubation and symptomatic periods for norovirus infections are fairly constant and showed little differences with regard to the host, pathogen and environmental characteristics we analysed. |
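The means and 95% confidence intervals quoted above are standard sample estimates. A minimal sketch of a normal-approximation CI for a mean (the input values below are illustrative, not the outbreak dataset):

```python
import math

def mean_ci(values, z=1.96):
    """Sample mean with a normal-approximation 95% CI (z = 1.96)."""
    n = len(values)
    m = sum(values) / n
    var = sum((x - m) ** 2 for x in values) / (n - 1)  # sample variance
    se = math.sqrt(var / n)                            # standard error of the mean
    return m, (m - z * se, m + z * se)
```

For example, `mean_ci([30, 32, 34, 36])` returns a mean of 33.0 hours with a CI of roughly (30.5, 35.5); with the hundreds of outbreak averages analysed above, the interval narrows as the standard error shrinks with n.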
Recombinant Marburg viruses containing mutations in the IID region of VP35 prevent inhibition of host immune responses.
Albarino CG , Wiggleton Guerrero L , Spengler JR , Uebelhoer LS , Chakrabarti AK , Nichol ST , Towner JS . Virology 2014 476c 85-91 Previous in vitro studies have demonstrated that Ebola and Marburg virus (EBOV and MARV) VP35 antagonize the host cell immune response. Moreover, EBOV and MARV VP35 proteins carrying specific mutations in the IFN inhibitory domain (IID) that abrogate their interaction with virus-derived dsRNA lack the ability to inhibit the host immune response. To investigate the role of MARV VP35 in the context of infectious virus, we used our reverse genetics system to generate two recombinant MARVs carrying specific mutations in the IID region of VP35. Our data show that wild-type and mutant viruses grow to similar titers in interferon-deficient cells but exhibit attenuated growth in interferon-competent cells. Furthermore, in contrast to wild-type virus, both MARV mutants were unable to inhibit expression of various antiviral genes. The MARV VP35 mutants exhibit similar phenotypes to those previously described for EBOV, suggesting the existence of a shared immune-modulatory strategy between filoviruses. |
Genetic Analysis Workshop 18: Methods and strategies for analyzing human sequence and phenotype data in members of extended pedigrees.
Bickeboller H , Bailey JN , Beyene J , Cantor RM , Cordell HJ , Culverhouse RC , Engelman CD , Fardo DW , Ghosh S , Konig IR , Lorenzo Bermejo J , Melton PE , Santorico SA , Satten GA , Sun L , Tintle NL , Ziegler A , MacCluer JW , Almasy L . BMC Proc 2014 8 S1 Genetic Analysis Workshop 18 provided a platform for developing and evaluating statistical methods to analyze whole-genome sequence data from a pedigree-based sample. In this article we present an overview of the data sets and the contributions that analyzed these data. The family data, donated by the Type 2 Diabetes Genetic Exploration by Next-Generation Sequencing in Ethnic Samples Consortium, included sequence-level genotypes based on sequencing and imputation, genome-wide association genotypes from prior genotyping arrays, and phenotypes from longitudinal assessments. The contributions from individual research groups were extensively discussed before, during, and after the workshop in theme-based discussion groups before being submitted for publication. |
Use of highly pathogenic avian influenza A(H5N1) gain-of-function studies for molecular-based surveillance and pandemic preparedness.
Davis CT , Chen LM , Pappas C , Stevens J , Tumpey TM , Gubareva LV , Katz JM , Villanueva JM , Donis RO , Cox NJ . mBio 2014 5 (6) Zoonotic influenza viruses circulating in poultry and swine pose an ever-present threat to human health. In particular, the rapid geographical expansion of highly pathogenic avian influenza (HPAI) A(H5N1) throughout Asia and then into Europe, the Middle East, and Africa during the 2000s galvanized the global community in an attempt to control this rapidly growing threat. Despite successful control efforts in some countries, the virus remains endemic in poultry in at least six countries and continues to cause human illness and deaths as well as countless outbreaks in birds. During the past decade, 668 cases and 393 deaths were detected and reported to the World Health Organization (WHO) (1). During the 17 years since human infections with HPAI A(H5N1) were first identified in Hong Kong, Special Administrative Region, People’s Republic of China, in 1997, these viruses have evolved substantially through mutation and reassortment, resulting in multiple divergent genotypes and clades (2). Ongoing H5N1 circulation has appropriately resulted in a focus on sequencing viral genomes to understand the evolution of these viruses and the significance of observed genetic changes. Expanded laboratory capacity for high-throughput Sanger sequencing and recent technological advances, such as next-generation sequencing and parallel computing, have revolutionized the quantity, quality, and availability of gene sequences and our ability to quickly and accurately analyze these data (3). Consequently, the number of animal and human influenza virus sequences available in publicly accessible databases has dramatically increased over the years, as have the bioinformatics tools required for efficient investigation (4, 5). 
These advances in laboratory and analytical methods provide strong incentives to utilize molecular data for pandemic risk assessment of zoonotic influenza viruses at the animal-human interface (6). |
Complete genome sequences and phylogenetic analysis of two West Nile virus strains isolated from equines in Argentina in 2006 could indicate an early introduction of the virus in the Southern Cone.
Fabbri CM , Garcia JB , Morales MA , Enria DA , Levis S , Lanciotti RS . Vector Borne Zoonotic Dis 2014 14 (11) 794-800 The complete nucleotide sequences of two West Nile virus (WNV) strains isolated in Argentina were determined. Phylogenetic trees were constructed from the aligned nucleic acid sequences of these two strains along with other previously published complete WNV genome sequences. Phylogenetic data showed that both strains belonged to clade 1a of lineage 1 and clustered in a subclade with American strains isolated during 1999-2002. These results suggest two independent routes of introduction of WNV in Argentina and that the virus could have been circulating in Argentina for some time before being isolated. |
Isolation and enrichment of Cryptosporidium DNA and verification of DNA purity for whole-genome sequencing.
Guo Y , Li N , Lysen C , Frace M , Tang K , Sammons S , Roellig DM , Feng Y , Xiao L . J Clin Microbiol 2014 53 (2) 641-7 Whole-genome sequencing of Cryptosporidium spp. is hampered by difficulties in obtaining sufficient, highly pure genomic DNA from clinical specimens. In this study, we developed procedures for the isolation and enrichment of Cryptosporidium genomic DNA from fecal specimens and verification of DNA purity for whole-genome sequencing. The isolation and enrichment of genomic DNA were achieved by a combination of three oocyst purification steps and whole-genome amplification (WGA) of DNA from purified oocysts. qPCR analysis of WGA products was used as an initial quality assessment of amplified genomic DNA. The purity of WGA products was assessed by Sanger sequencing of cloned products. Next-generation sequencing tools were used in final evaluations of genome coverage and extent of contamination. Altogether, 24 fecal specimens of Cryptosporidium parvum, C. hominis, C. andersoni, C. ubiquitum, C. tyzzeri, and Cryptosporidium chipmunk genotype I were processed using these procedures. As expected, WGA products with low Ct values (<16.0) yielded mostly Cryptosporidium sequences in Sanger sequencing. The cloning-sequencing analysis, however, showed significant contamination in 5 WGA products (percentage of positive colonies derived from Cryptosporidium genomic DNA ≤ 25%). Following this strategy, 20 WGA products from six Cryptosporidium species/genotypes with low Ct values (mostly <14.0) were submitted for whole-genome sequencing, generating sequence data covering 94.5-99.7% of the Cryptosporidium genomes, with mostly minor contamination from bacterial, fungal, and host DNA. These results suggest that the described strategy can be used effectively for the isolation and enrichment of Cryptosporidium DNA from fecal specimens for whole-genome sequencing. |
Global phylogenomic analysis of nonencapsulated Streptococcus pneumoniae reveals a deep-branching classic lineage that is distinct from multiple sporadic lineages.
Hilty M , Wuthrich D , Salter SJ , Engel H , Campbell S , Sa-Leao R , de Lencastre H , Hermans P , Sadowy E , Turner P , Chewapreecha C , Diggle M , Pluschke G , McGee L , Eser OK , Low DE , Smith-Vaughan H , Endimiani A , Kuffer M , Dupasquier M , Beaudoing E , Weber J , Bruggmann R , Hanage WP , Parkhill J , Hathaway LJ , Muhlemann K , Bentley SD . Genome Biol Evol 2014 6 (12) 3281-94 The surrounding capsule of Streptococcus pneumoniae has been identified as a major virulence factor and is targeted by pneumococcal conjugate vaccines (PCV). However, nonencapsulated Streptococcus pneumoniae (Non-Ec-Sp) have also been isolated globally, mainly in carriage studies. It is unknown whether Non-Ec-Sp evolve sporadically, whether they have high antibiotic non-susceptibility rates, and whether they have a unique, specific gene content. Here, whole-genome sequencing of 131 Non-Ec-Sp isolates sourced from 17 different locations around the world was performed. Results revealed a deep-branching classic lineage that is distinct from multiple sporadic lineages. The sporadic lineages clustered with a previously sequenced, global collection of encapsulated S. pneumoniae (Ec-Sp) isolates, while the classic lineage is composed mainly of the frequently identified multi-locus sequence types ST344 (n=39) and ST448 (n=40). All ST344 and nine ST448 isolates had high non-susceptibility rates to beta-lactams and other antimicrobials. Analysis of the accessory genome reveals that the classic Non-Ec-Sp contained an increased number of mobile elements compared with Ec-Sp and sporadic Non-Ec-Sp. Adherence assays with human epithelial cells for selected classic and sporadic Non-Ec-Sp revealed that the presence of an integrative conjugative element (ICE) results in increased adherence to human epithelial cells (P=0.005). In contrast, sporadic Non-Ec-Sp lacking the ICE had greater growth in vitro, possibly resulting in improved fitness. In conclusion, Non-Ec-Sp isolates from the classic lineage have evolved separately.
They have spread globally, are well adapted to nasopharyngeal carriage and are able to coexist with Ec-Sp. Due to continued use of pneumococcal conjugate vaccines, Non-Ec-Sp may become more prevalent. |
Multi-locus analysis of Giardia duodenalis from nonhuman primates kept in zoos in China: geographical segregation and host-adaptation of assemblage B isolates.
Karim MR , Wang R , Yu F , Li T , Dong H , Li D , Zhang L , Li J , Jian F , Zhang S , Rume FI , Ning C , Xiao L . Infect Genet Evol 2014 30 82-88 Only a few studies based on single-locus characterization have been conducted on the molecular epidemiology of Giardia duodenalis in nonhuman primates (NHPs). The present study was conducted to examine the occurrence and genotype identity of G. duodenalis in NHPs based on multi-locus analysis of the small-subunit ribosomal RNA (SSU rRNA), triose phosphate isomerase (tpi), glutamate dehydrogenase (gdh), and beta-giardin (bg) genes. Fecal specimens were collected from 496 animals of 36 NHP species kept in seven zoos in China and screened for G. duodenalis by tpi-based PCR. G. duodenalis was detected in 92 (18.6%) specimens from 18 NHP species, belonging to assemblages A (n=4) and B (n=88). In positive NHP species, the infection rates ranged from 4.8% to 100%. In tpi sequence analysis, assemblage A included subtypes A1, A2, and one novel subtype. Multi-locus analysis of the tpi, gdh, and bg genes detected 11 (8 known and 3 new), 6 (3 known and 3 new), and 9 (2 known and 7 new) subtypes in 88, 47, and 35 assemblage B isolates, respectively. Thirty-two assemblage B isolates with data at all three loci yielded 15 multi-locus genotypes (MLGs), including 2 known and 13 new MLGs. Phylogenetic analysis of concatenated sequences of assemblage B showed that the MLGs found here were genetically different from those of humans, NHPs, rabbits, and guinea pigs in Italy and Sweden. It further indicated that assemblage B isolates in ring-tailed lemurs and squirrel monkeys might be genetically different from those in other NHPs. These data suggest that NHPs are mainly infected with G. duodenalis assemblage B and that there might be geographical segregation and host-adaptation in assemblage B in NHPs. |
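The MLG construction described above (an isolate contributes a multi-locus genotype only when it has subtype calls at all three loci, and distinct combinations are counted) can be sketched as follows. This is an illustrative sketch only; the function name and subtype labels are invented, not the study's data:

```python
# Illustrative sketch: deriving multi-locus genotypes (MLGs) from
# per-locus subtype calls. An isolate contributes an MLG only when it
# has data at all three loci (tpi, gdh, bg). Labels are hypothetical.

LOCI = ("tpi", "gdh", "bg")

def multilocus_genotypes(isolates):
    """Return the set of distinct MLGs among fully typed isolates."""
    mlgs = set()
    for calls in isolates.values():
        if all(locus in calls for locus in LOCI):
            mlgs.add(tuple(calls[locus] for locus in LOCI))
    return mlgs

isolates = {
    "NHP-1": {"tpi": "B1", "gdh": "B3", "bg": "B2"},
    "NHP-2": {"tpi": "B1", "gdh": "B3", "bg": "B2"},  # same MLG as NHP-1
    "NHP-3": {"tpi": "B4", "gdh": "B3"},              # bg missing: excluded
}
print(len(multilocus_genotypes(isolates)))  # 1 distinct MLG
```

This mirrors why only 32 of the 88 assemblage B isolates (those typed at all three loci) entered the MLG tally.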
Full Genome Sequence of a Reassortant Human G9P[4] Rotavirus Strain.
Lewis J , Roy S , Esona MD , Mijatovic-Rustempasic S , Hardy C , Wang Y , Cortese M , Bowen MD . Genome Announc 2014 2 (6) This is a report of the complete genomic sequence of a reassortant rotavirus group A G9-P[4]-I2-R2-C2-M2-A2-N2-T2-E6-H2 strain designated RVA/Human-wt/USA/LB1562/2010/G9P[4]. |
Molecular genotyping and quantitation assay for rotavirus surveillance.
Liu J , Lurain K , Sobuz SU , Begum S , Kumburu H , Gratz J , Kibiki G , Toney D , Gautam R , Bowen MD , Petri WA Jr , Haque R , Houpt ER . J Virol Methods 2014 213 157-63 Rotavirus genotyping is useful for surveillance purposes, especially in areas where rotavirus vaccination has been or will be implemented. RT-PCR-based molecular methods have been applied widely, but quantitative assays targeting a broad spectrum of genotypes have not been developed. Three real-time RT-PCR panels were designed to identify G1, G2, G9, G12 (panel GI); G3, G4, G8, G10 (panel GII); and P[4], P[6], P[8], P[10], P[11] (panel P), respectively. An assay targeting NSP3 was included in both G panels as an internal control. The cognate assays were also formulated as one RT-PCR-Luminex panel for simultaneous detection of all the genotypes listed above plus P[9]. The assays were evaluated with various rotavirus isolates and 89 clinical samples from Virginia, Bangladesh, and Tanzania, and exhibited 95% (81/85) sensitivity compared with the conventional RT-PCR/gel-electrophoresis method, and 100% concordance with sequencing. Real-time assays identified a significantly higher rate of mixed genotypes in Bangladeshi samples than the conventional gel-electrophoresis-based RT-PCR assay (32.5% versus 12.5%, P<0.05). In these mixed infections, the relative abundance of the rotavirus types could be estimated by Cq values. These typing assays detect and discriminate a broad range of G/P types circulating in different geographic regions with high sensitivity and specificity and can be used for rotavirus surveillance. |
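For the mixed infections mentioned above, estimating relative abundance from Cq values rests on the exponential nature of PCR: assuming roughly equal and near-perfect amplification efficiency for the co-detected assays, starting template scales as 2^-Cq. A minimal illustrative sketch (the Cq values below are hypothetical, not from the study):

```python
# Illustrative only: approximate relative abundance of two co-detected
# rotavirus genotypes from real-time RT-PCR quantification cycle (Cq)
# values, assuming equal, ~100% amplification efficiency for both assays
# (so template quantity scales as 2 ** -Cq). Cq values are hypothetical.

def relative_abundance(cq_values):
    """Map genotype -> estimated fraction of total template."""
    weights = {g: 2.0 ** -cq for g, cq in cq_values.items()}
    total = sum(weights.values())
    return {g: w / total for g, w in weights.items()}

mixed = {"G1": 22.0, "G9": 26.0}  # hypothetical Cq for a mixed infection
print(relative_abundance(mixed))  # G1 dominates ~16:1 (2 ** 4)
```

In practice, efficiency differences between assays would need calibration with standard curves; this sketch only shows the direction of the Cq-to-abundance reasoning.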
Draft Whole-Genome Sequences of 10 Serogroup O6 Enterotoxigenic Escherichia coli Strains.
Pattabiraman V , Bopp CA . Genome Announc 2014 2 (6) Enterotoxigenic Escherichia coli (ETEC) is a major cause of diarrhea globally, resulting in approximately 200 million cases and 300,000 to 400,000 deaths annually, primarily in children under the age of five. Here, we announce the release of the draft genomes of 10 ETEC isolates belonging to serogroup O6. |
LPS modification promotes maintenance of Yersinia pestis in fleas.
Aoyagi K , Brooks BD , Bearden SW , Montenieri JA , Gage KL , Fisher MA . Microbiology (Reading) 2014 161 628-38 Yersinia pestis, the causative agent of plague, can be transmitted by fleas in two different manners: by early phase transmission (EPT), which occurs shortly after flea infection, or by blocked fleas following long-term infection. Efficient flea-borne transmission is predicated upon the ability of Y. pestis to be maintained within the flea. Signature-tagged mutagenesis (STM) was used to identify genes required for Y. pestis maintenance in a genuine plague vector, Xenopsylla cheopis. The STM screen identified seven mutants that displayed markedly reduced fitness in fleas after four days, the time during which EPT occurs. Two of the mutants contained insertions in genes encoding glucose-1-phosphate uridylyltransferase (galU) and UDP-4-amino-4-deoxy-L-arabinose-oxoglutarate aminotransferase (arnB), which are involved in the modification of lipid A with aminoarabinose (Ara4N) and resistance to cationic antimicrobial peptides (CAMPs). These Y. pestis mutants were more susceptible to the CAMPs cecropin A and polymyxin B, and produced lipid A lacking Ara4N modifications. Surprisingly, an in-frame deletion of arnB retained modest levels of CAMP resistance and Ara4N modification, indicating the presence of compensatory factors. It was determined that WecE, an aminotransferase involved in biosynthesis of enterobacterial common antigen, plays a novel role in Y. pestis Ara4N modification by partially offsetting the loss of arnB. These results indicate that mechanisms of Ara4N modification of lipid A are more complex than previously thought, and these modifications, as well as several factors yet to be elucidated, play an important role in early survival and transmission of Y. pestis in the flea vector. |
Safety of quadrivalent human papillomavirus vaccine (Gardasil) in pregnancy: adverse events among non-manufacturer reports in the Vaccine Adverse Event Reporting System, 2006-2013
Moro PL , Zheteyeva Y , Lewis P , Shi J , Yue X , Museru OI , Broder K . Vaccine 2014 33 (4) 519-22 BACKGROUND: In 2006, quadrivalent human papillomavirus (HPV4; Gardasil, Merck & Co., Inc.) vaccine was licensed in the US for use in females aged 9-26 years. HPV4 is not recommended during pregnancy; however, inadvertent administration during pregnancy may occur. OBJECTIVES: To evaluate and summarize reports to the Vaccine Adverse Event Reporting System (VAERS) in pregnant women who received HPV4 vaccine and assess for potentially concerning adverse events among non-manufacturer reports. METHODS: We searched the VAERS database for non-manufacturer reports of adverse events (AEs) in pregnant women who received HPV4 vaccine from 6/1/2006 to 12/31/2013. We conducted clinical review of reports and available medical records. RESULTS: We found 147 reports submitted after HPV4 vaccine was administered to pregnant women. The most frequent pregnancy-specific AE was spontaneous abortion in 15 (10.2%) reports, followed by elective terminations in 6 (4.1%). Maternal fever was the most frequent non-pregnancy-specific AE, noted in 3 reports. Two reports of major birth defects were received. No maternal deaths were noted. One hundred three (70.1%) reports did not describe an AE. CONCLUSIONS: This review of VAERS non-manufacturer reports following vaccination with HPV4 in pregnancy did not find any unexpected patterns in maternal or fetal outcomes. |
Zinc supplementation fails to increase the immunogenicity of oral poliovirus vaccine: a randomized controlled trial
Habib MA , Soofi S , Sheraz A , Bhatti ZS , Okayasu H , Zaidi SZ , Molodecky NA , Pallansch MA , Sutter RW , Bhutta ZA . Vaccine 2014 33 (6) 819-25 BACKGROUND: Polio eradication remains a challenge in Pakistan and the causes for the failure to eradicate poliomyelitis are complex. Undernutrition and micronutrient deficiencies, especially zinc deficiency, are major public health problems in Pakistan and could potentially affect the response to enteric vaccines, including oral poliovirus vaccine (OPV). OBJECTIVE: To assess the impact of zinc supplementation among infants on immune response to oral poliovirus vaccine (OPV). METHODS: A double-blind, randomized placebo-controlled trial was conducted in newborns (aged 0-14 days). Subjects were assigned to receive either 10 mg of zinc or placebo supplementation daily for 18 weeks. Both groups received OPV doses at birth and at 6, 10, and 14 weeks. Data were collected on prior immunization status, diarrheal episodes, breastfeeding practices, and anthropometric measurements at recruitment and at 6 and 18 weeks. Blood samples were similarly collected to determine the antibody response to OPV and for micronutrient analysis. Logistic regression was used to determine the relationship between seroconversion and zinc status. RESULTS: Overall, 404 subjects were recruited. At recruitment, seropositivity was already high for poliovirus (PV) serotype 1 (zinc: 91.1%; control: 90.5%) and PV2 (90.0%; 92.7%), with lower estimates for PV3 (70.0%; 64.8%). By week 18, the proportion of subjects with measured zinc levels in the normal range (i.e., ≥60 μg/dL) was significantly greater in the intervention group than in the control group (71.9% vs. 27.4%; p<0.001). No significant difference in seroconversion was demonstrated between the groups for PV1, PV2, or PV3. CONCLUSIONS: There was no effect of zinc supplementation on OPV immunogenicity.
These conclusions were confirmed when restricting the analysis to those with measured higher zinc levels. |
Meningococcal carriage among Georgia and Maryland high school students
Harrison LH , Shutt KA , Arnold KE , Stern EJ , Pondo T , Kiehlbauch JA , Myers RA , Hollick RA , Schmink S , Vello M , Stephens DS , Messonnier NE , Mayer L , Clark TA . J Infect Dis 2014 211 (11) 1761-8 BACKGROUND: Meningococcal disease incidence in the U.S. is at an all-time low. In a previous study of Georgia high school students, meningococcal carriage prevalence was 7%. The purpose of this study was to measure the impact of a meningococcal conjugate vaccine on serogroup Y meningococcal carriage and to define the dynamics of carriage in high school students. METHODS: This was a prospective cohort study at 8 high schools, 4 each in Maryland and Georgia, during a school year. In each state, 2 high schools were randomized for participating students to receive MCV4-DT at the beginning of the study and 2 at the end. Oropharyngeal swab cultures for meningococcal carriage were performed three times during the school year. RESULTS: Among 3,311 students, prevalence of meningococcal carriage was 3.21%-4.01%. Phenotypically non-groupable strains accounted for 88% of carriage isolates. There were only 5 observed acquisitions of serogroup Y strains during the study; therefore, the impact of MCV4-DT on meningococcal carriage could not be determined. CONCLUSIONS: Meningococcal carriage rates in U.S. high school students were lower than expected and the vast majority of strains did not express capsule. These findings may help explain the historically low incidence of meningococcal disease in the U.S. |
Multifaceted strategies needed for influenza prevention in long-term care
Lindley MC , Bridges CB . J Infect Dis 2014 211 (12) 1860-1 The increased risk of severe illness from influenza among older adults is well established, with almost 90% of annual influenza-associated deaths occurring among adults 65 years and older.1 This burden is compounded by significantly reduced efficacy of inactivated influenza vaccine in older adults, particularly those aged 70 or older or with decreased functional status.2,3 Outbreaks of influenza among residents in long-term care facilities (LTCF) are well-documented even in the setting of high vaccination rates.4,5 Clearly, additional strategies are needed to prevent influenza in these high-risk populations. In this issue of the Journal of Infectious Diseases, Nace et al. present results from an immunogenicity study of high-dose influenza vaccine in a sample of frail elderly residents of LTCF. This study complements earlier work by DiazGranados et al., who demonstrated superior immunogenicity and superior clinical efficacy against laboratory-confirmed influenza of the high-dose compared to the standard-dose influenza vaccine in community-dwelling persons 65 years and older.6 | Immune senescence is associated with decreased vaccine immune response among healthy adults in their 70s or 80s, and differences in frailty (as measured by ability to live unassisted or other factors) may significantly impact the benefit of influenza vaccination in preventing influenza-related illness and death.2–3,7 The development of more immunogenic vaccines – including high-dose (HD) influenza vaccines – aims to address both of these problems, but confirmatory studies documenting the clinical impact of HD vaccine and superiority of HD over standard-dose vaccine in preventing influenza among LTCF residents are not yet available. Although the study by Nace et al.
provides some evidence of improved immunogenicity of HD over standard dose vaccine among LTCF residents, overall titers even after HD vaccination are modest at best, and far below titers reported among community-dwelling older adults following HD influenza vaccination.6 |
Paediatric influenza vaccination: time to better protect high-risk groups?
Nair H , Widdowson MA . Lancet Respir Med 2014 3 (2) 93-94 Influenza is an important contributor to severe acute lower respiratory infection (ALRI). It is estimated to cause 20 million cases of influenza-associated ALRI per year worldwide, including 1 million severe ALRI cases, and 28 000–111 500 deaths in children younger than 5 years, mostly in the first 2 years of life.1 The 2009 H1N1 influenza pandemic might have disproportionately affected children; in the USA, paediatric mortality from confirmed A(H1N1) influenza in 2009–10 was three times higher than that of seasonal influenza. Two-thirds of the children who died had a comorbidity that put them at high risk of influenza complications.2 Globally, paediatric pandemic H1N1 mortality was estimated to be 44 500 in 2009–10, with 70% of this burden in Africa and southeast Asia.3 Children in countries with a shortage of resources are likely to have fewer well managed underlying disorders, and a higher prevalence of some risk factors for severe influenza (eg, sickle cell disease or HIV).4 Up to now, however, the additional risk posed by underlying disorders has not been well quantified, which hampers policy decisions on the cost-effectiveness of preventive measures and adoption of childhood vaccination. |
Estimated influenza illnesses and hospitalizations averted by vaccination - United States, 2013-14 influenza season
Reed C , Kim IK , Singleton JA , Chaves SS , Flannery B , Finelli L , Fry A , Burns E , Gargiullo P , Jernigan D , Cox N , Bresee J . MMWR Morb Mortal Wkly Rep 2014 63 (49) 1151-4 The Advisory Committee on Immunization Practices recommends annual influenza vaccination for all persons aged ≥6 months to reduce morbidity and mortality caused by influenza in the United States. CDC previously developed a model to estimate that annual influenza vaccination resulted in 1.1-6.6 million fewer cases and 7,700-79,000 fewer hospitalizations per season during the 2005-2013 influenza seasons. For the 2013-14 influenza season, using updated estimates of vaccination coverage, vaccine effectiveness, and influenza hospitalizations, CDC estimates that influenza vaccination prevented approximately 7.2 million illnesses, 3.1 million medically attended illnesses, and 90,000 hospitalizations associated with influenza. Similar to prior seasons, fewer than half of persons aged ≥6 months are estimated to have been vaccinated. If influenza vaccination levels had reached the Healthy People 2020 target of 70%, an estimated additional 5.9 million illnesses, 2.3 million medically attended illnesses, and 42,000 hospitalizations associated with influenza might have been averted. For the nation to more fully benefit from influenza vaccines, more effort is needed to reach the Healthy People 2020 target. |
An evaluation of voluntary 2-dose varicella vaccination coverage in New York City public schools
Doll MK , Rosen JB , Bialek SR , Szeto H , Zimmerman CM . Am J Public Health 2014 105 (5) e1-e8 OBJECTIVES: We assessed coverage for 2-dose varicella vaccination, which is not required for school entry, among New York City public school students and examined characteristics associated with receipt of 2 doses. METHODS: We measured receipt of either at least 1 or 2 doses of varicella vaccine among students aged 4 years and older in a sample of 336 public schools (n = 223 864 students) during the 2010 to 2011 school year. Data came from merged student vaccination records from 2 administrative data systems. We conducted multivariable regression to assess associations of age, gender, race/ethnicity, and school location with 2-dose prevalence. RESULTS: Coverage with at least 1 varicella dose was 96.2% (95% confidence interval [CI] = 96.2%, 96.3%); coverage with at least 2 doses was 64.8% (95% CI = 64.6%, 64.9%). Increasing student age, non-Hispanic White race/ethnicity, and attendance at school in Staten Island were associated with lower 2-dose coverage. CONCLUSIONS: A 2-dose varicella vaccine requirement for school entry would likely improve 2-dose coverage, eliminate coverage disparities, and prevent disease. |
Global invasive bacterial vaccine-preventable diseases surveillance - 2008-2014
Murray J , Agocs M , Serhan F , Singh S , Deloria-Knoll M , O'Brien K , Mwenda JM , Mihigo R , Oliveira L , Teleb N , Ahmed H , Wasley A , Videbaek D , Wijesinghe P , Thapa AB , Fox K , Paladin FJ , Hajjeh R , Schwartz S , Beneden CV , Hyde T , Broome C , Cherian T . MMWR Morb Mortal Wkly Rep 2014 63 (49) 1159-62 Meningitis and pneumonia are leading causes of morbidity and mortality in children globally, with Streptococcus pneumoniae (pneumococcus), Neisseria meningitidis, and Haemophilus influenzae causing a large proportion of disease. Vaccines are available to prevent many of the common types of these infections. S. pneumoniae was estimated to have caused 11% of deaths in children aged <5 years globally in the pre-pneumococcal conjugate vaccine (PCV) era. Since 2007, the World Health Organization (WHO) has recommended inclusion of PCV in childhood immunization programs worldwide, especially in countries with high child mortality. As of November 26, 2014, a total of 112 (58%) of all 194 WHO member states and 44 (58%) of the 76 member states ever eligible for support from Gavi, the Vaccine Alliance (Gavi), have introduced PCV. Invasive pneumococcal disease (IPD) surveillance that includes data on serotypes, along with meningitis and pneumonia syndromic surveillance, provides important data to guide decisions to introduce PCV and monitor its impact. |
Influenza-related hospitalization and ED visits in children less than 5 years: 2000-2011
Jules A , Grijalva CG , Zhu Y , Talbot HK , Williams JV , Poehling KA , Chaves SS , Edwards KM , Schaffner W , Shay DK , Griffin MR . Pediatrics 2014 135 (1) e66-74 BACKGROUND AND OBJECTIVES: In the United States, recommendations for annual influenza vaccination gradually expanded from 2004 to 2008 to include all children aged ≥6 months. The effects of these policies on vaccine uptake and influenza-associated health care encounters are unclear. The objectives of the study were to examine the annual incidence of influenza-related health care encounters and vaccine uptake among children aged 6 to 59 months from 2000-2001 through 2010-2011 in Davidson County, TN. METHODS: We estimated the proportion of laboratory-confirmed influenza-related hospitalizations and emergency department (ED) visits by enrolling and testing children with acute respiratory illness or fever. We estimated influenza-related health care encounters by multiplying these proportions by the number of acute respiratory illness/fever hospitalizations and ED visits for county residents. We assessed temporal trends in vaccination coverage and in influenza-associated hospitalization and ED visit rates. RESULTS: The proportion of fully vaccinated children increased from 6% in 2000-2001 to 38% in 2010-2011 (P < .05). Influenza-related hospitalizations ranged from 1.9 to 16.0 per 10 000 children (median 4.5) per year. Influenza-related ED visits ranged from 89 to 620 per 10 000 children (median 143) per year. Significant decreases in hospitalizations (P < .05) and increases in ED visits (P < .05) over time were not clearly related to vaccination trends. Influenza-related encounters were greater when influenza A(H3N2) circulated than during other years, with median rates of 8.2 vs 3.2 hospitalizations and 307 vs 143 ED visits per 10 000 children, respectively. CONCLUSIONS: Influenza vaccination increased over time; however, the proportion of fully vaccinated children remained <50%.
Influenza was associated with a substantial illness burden, particularly when influenza A(H3N2) predominated. |
Accuracy of influenza vaccination rate estimates in United States nursing home residents
Grosholz JM , Blake S , Daugherty JD , Ayers E , Omer SB , Polivka-West L , Howard DH . Epidemiol Infect 2014 143 (12) 1-8 The US Centers for Medicare and Medicaid Services (CMS) requires nursing homes and long-term-care facilities to document residents' vaccination status on the Resident Assessment Instrument (RAI). Vaccinating residents can prevent costly hospital admissions and deaths. CMS and public health officials use RAI data to measure vaccination rates in long-term-care residents and assess the quality of care in nursing homes. We assessed the accuracy of RAI data against medical records in 39 nursing homes in Florida, Georgia, and Wisconsin. We randomly sampled residents in each home during the 2010-2011 and 2011-2012 influenza seasons. We collected data on receipt of influenza vaccination from charts and RAI data. Our final sample included 840 medical charts with matched RAI records. The agreement rate was 0.86. Using the chart as a gold standard, the sensitivity of the RAI with respect to influenza vaccination was 85% and the specificity was 77%. Agreement rates varied within facilities from 55% to 100%. Monitoring vaccination rates in the population is important for gauging the impact of programmes and policies to promote adherence to vaccination recommendations. Use of data from RAIs is a reasonable approach for gauging influenza vaccination rates in nursing-home residents. |
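The agreement, sensitivity, and specificity figures reported above follow from a standard 2x2 cross-tabulation of RAI entries against chart review (the gold standard). A minimal sketch with invented counts, not the study's data:

```python
# Illustrative sketch: agreement, sensitivity, and specificity of an
# administrative record (RAI) against a chart-review gold standard.
# The counts below are invented for illustration, not the study's data.

def concordance(tp, fp, fn, tn):
    """tp/fp/fn/tn: RAI result cross-tabulated against the chart."""
    n = tp + fp + fn + tn
    return {
        "agreement": (tp + tn) / n,     # both sources concur
        "sensitivity": tp / (tp + fn),  # vaccinated per chart, flagged by RAI
        "specificity": tn / (tn + fp),  # unvaccinated per chart, clear in RAI
    }

print(concordance(tp=600, fp=40, fn=80, tn=120))
```

With these hypothetical counts (n=840), agreement is about 0.86, in the ballpark of the study's reported figures, which shows how all three statistics derive from one cross-tabulation.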
Adverse events following vaccination with an inactivated, Vero cell culture-derived Japanese encephalitis vaccine in the United States, 2009-2012
Rabe IB , Miller E , Fischer M , Hills S . Vaccine 2014 33 (5) 708-12 BACKGROUND: In March 2009, the U.S. Food and Drug Administration licensed an inactivated, Vero cell culture-derived Japanese encephalitis vaccine (JE-VC [Ixiaro]) for use in adults. The vaccine was licensed based on clinical trial safety data in 3558 JE-VC recipients. It is essential to monitor post-licensure surveillance data to evaluate the safety of JE-VC because rare adverse events may not be detected until the vaccine is administered to a larger population. METHODS: We reviewed adverse events reported to the U.S. Vaccine Adverse Event Reporting System (VAERS) for adults (≥17 years) who received JE-VC from May 2009 through April 2012. Adverse event reporting rates were calculated using 275,848 JE-VC doses distributed. RESULTS: Over the 3-year period, 42 adverse events following vaccination with JE-VC were reported to VAERS, for an overall reporting rate of 15.2 adverse events per 100,000 doses distributed. Of the 42 total reports, 5 (12%) were classified as serious, for a reporting rate of 1.8 per 100,000 doses distributed; there were no deaths. Hypersensitivity reactions (N=12) were the most commonly reported type of adverse event, with a rate of 4.4 per 100,000 doses distributed; no cases of anaphylaxis were reported. Three adverse events of the central nervous system were reported (one case of encephalitis and two seizures) for a rate of 1.1 per 100,000; all occurred after receipt of JE-VC with other vaccines. CONCLUSIONS: These post-marketing surveillance data suggest a good safety profile for JE-VC consistent with findings from pre-licensure clinical trials. Post-licensure safety data should continue to be monitored for any evidence of rare serious or neurologic adverse events. |
The association between influenza vaccination and other preventative health behaviors in a cohort of pregnant women
Scheminske M , Henninger M , Irving SA , Thompson M , Williams J , Shifflett P , Ball SW , Avalos LA , Naleway AL . Health Educ Behav 2014 42 (3) 402-8 OBJECTIVES: Although pregnant women are a high-priority group for seasonal influenza vaccination, vaccination rates in this population remain below target levels. Previous studies have identified sociodemographic predictors of vaccine choice, but relationships between preconception health behaviors and seasonal influenza vaccination are poorly understood. This prospective cohort study followed pregnant women during the 2010-2011 influenza season to determine if certain health behaviors were associated with vaccination status. METHOD: Participants were pregnant women receiving prenatal care from Kaiser Permanente Northwest and Kaiser Permanente Northern California. Women were surveyed about preconception smoking, alcohol consumption, and vitamin/supplement use. Vaccination data were obtained from health plan databases and state immunization records. RESULTS: Data from 1,204 women were included in this analysis. Most participants (66.4%) received a seasonal influenza vaccine during the study period. Women vaccinated prior to pregnancy were more likely to use a supplement containing folic acid (80%) or vitamin D (30%) compared with women who were vaccinated during pregnancy (72% and 15%, respectively) or unvaccinated women (62% and 12%, respectively, p < .001). Women vaccinated prior to or during pregnancy were more likely (75%) to have never smoked compared with women who were not vaccinated (70%, p = .005). There were no significant differences in alcohol use or household cigarette smoke exposure by vaccination group. CONCLUSIONS: Women who engaged in specific preconception health behaviors were more likely to receive seasonal influenza vaccination. Failure to participate in these health behaviors could alert health care practitioners to patients' increased risk of remaining unvaccinated during pregnancy. |
Comprehensive efforts to increase healthcare personnel immunization
Graitcer SB , Kim D , Lindley M . Hum Vaccin Immunother 2014 10 (9) 2625-6 Vaccination of healthcare personnel (HCP) is an important component of worker and patient safety, yet vaccination rates are lagging. The findings from Taddei et al.'s study of healthcare personnel immunization attitudes and practices in Florence, Italy provide further evidence of the importance of routine assessment of, and recommendations for, vaccines for HCP in order to improve coverage. |
Developments in understanding acquired immunity and innate susceptibility to norovirus and rotavirus gastroenteritis in children
Payne DC , Parashar UD , Lopman BA . Curr Opin Pediatr 2014 27 (1) 105-9 PURPOSE OF REVIEW: We discuss recent advances in the understanding of acquired immunity and susceptibility to the two major pediatric enteric viral pathogens, norovirus and rotavirus. RECENT FINDINGS: The prominent decline in severe rotavirus gastroenteritis in areas with mature rotavirus vaccination programmes has correspondingly unmasked the significant burden of disease associated with norovirus gastroenteritis among children. As epidemiologists and vaccinologists set their sights on this next vaccine target, we provide an update on norovirus vaccine development. In addition to these developments regarding acquired immunity, our understanding of innate susceptibility to norovirus has also been refined. Significant recent advances describe similar immunologic mechanisms of susceptibility for both norovirus and rotavirus, involving histo-blood group antigen associations that may also prove to be genotype specific. SUMMARY: This information can potentially be used to tailor both applied and developmental efforts to public health interventions against these important pediatric enteric viral pathogens. |
The role of alcohol policies in preventing intimate partner violence: a review of the literature
Kearns MC , Reidy DE , Valle LA . J Stud Alcohol Drugs 2015 76 (1) 21-30 OBJECTIVE: This article summarizes existing research on the relationship between alcohol policies and intimate partner violence (IPV). Because alcohol use represents an important risk factor for IPV, interventions and policies aimed at decreasing problem drinking may also lead to reductions in IPV. METHOD: Electronic databases were searched to identify relevant peer-reviewed journal articles on alcohol policies and IPV, as well as reference sections of appropriate articles. Only policies that have been studied specifically for their impact on IPV were included. RESULTS: Three alcohol policy areas (outlet density, hours and days of sale, and pricing/taxation) have been studied in relation to IPV outcomes. Research on outlet density has the most consistent findings, with most studies indicating that higher densities of alcohol outlets are associated with higher rates of IPV. Fewer studies have been conducted on pricing policies and policies restricting hours/days of sale, with most studies suggesting no impact on IPV rates. CONCLUSIONS: A higher density of alcohol outlets appears to be associated with greater rates of IPV. However, there is limited evidence suggesting that alcohol pricing policies and restrictions on hours and days of sale are associated with IPV outcomes. Knowledge about the impact of alcohol-related policies on IPV and violence in general is limited by several significant research gaps. Additional research is needed to assess the impact of alcohol policies on IPV and other forms of violence. |
Indoor tanning-related injuries treated in a national sample of US hospital emergency departments
Guy GP Jr , Watson M , Haileyesus T , Annest JL . JAMA Intern Med 2014 175 (2) 309-11 Indoor tanning exposes users to intense UV radiation, which is a known carcinogen.1 However, little is known about the more immediate adverse outcomes of indoor tanning. To our knowledge, this study provides the first national estimates of indoor tanning–related injuries treated in US hospital emergency departments (EDs). |
Injury prevention practices as depicted in G- and PG-rated movies, 2008-2012
Pelletier AR , Tongren JE , Gilchrist J . J Community Health 2014 40 (4) 613-8 Unintentional injuries are the leading cause of death among children in the United States. The use of recommended safety practices can reduce injuries. Children often learn behaviors from media exposure. Children's movies released in 1995-2007 infrequently depicted appropriate injury prevention practices. The aim of this study was to determine if injury prevention practices in children's movies have improved. The top-grossing 25 G- and PG-rated movies in the United States per year for 2008-2012 were eligible for inclusion in the study. Movies or scenes were excluded if they were animated, not set in the present day, fantasy, documentary, or not in English. Injury prevention practices involving riding in a motor vehicle, walking, boating, bicycling, and four other activities were recorded for characters with speaking roles. Fifty-six (45%) of the 125 movies met the inclusion criteria. A total of 603 person-scenes were examined involving 175 (29%) children and 428 (71%) adults. Thirty-eight person-scenes involved crashes or falls, resulting in four injuries and no deaths. Overall, 59% (353/603) of person-scenes showed appropriate injury prevention practices. This included 313 (70%) of 445 motor-vehicle passengers who were belted; 15 (30%) of 50 pedestrians who used a crosswalk; 2 (7%) of 30 boaters who wore personal flotation devices; and 8 (29%) of 28 bicyclists who wore helmets. In comparison with previous studies, there were significant increases in usage of seat belts, crosswalks, personal flotation devices, and bicycle helmets. However, 41% of person-scenes still showed unsafe practices, and the consequences of those behaviors were infrequently depicted. |
Pathologic and molecular profiling of rapid-onset fibrosis and inflammation induced by multi-walled carbon nanotubes.
Dong J , Porter DW , Batteli LA , Wolfarth MG , Richardson DL , Ma Q . Arch Toxicol 2014 89 (4) 621-33 Multi-walled carbon nanotubes (MWCNT) are new materials with a wide range of industrial and commercial applications. However, their nano-scaled size and fiber-like shape render them respirable and potentially fibrogenic if inhaled into the lungs. To understand MWCNT fibrogenesis, we analyzed the pathologic and molecular aspects of the early-phase response to MWCNT in mouse lungs. MWCNT induced rapid and pronounced lesions in the lungs characterized by increased cellularity and formation of fibrotic foci, most notably near where MWCNT deposited, within 14 days post-exposure. Deposition of collagen fibers was markedly increased in the alveolar septa and fibrotic foci, accompanied by elevated expression of the fibrotic genes Col1a1, Col1a2, and Fn1 at both mRNA and protein levels. Fibrosis was induced rapidly at 40 μg, wherein fibrotic changes were detected on day 1 and reached a maximal intensity on day 7 through day 14. Induction of fibrosis was dose-dependent at the dose range of 5-40 μg, 7 days post-exposure. MWCNT elicited rapid and prominent infiltrations of neutrophils and macrophages alongside fibrosis, implicating acute inflammation in the fibrotic response. At the molecular level, MWCNT induced elevated expression of the proinflammatory cytokines TNFα, IL1α, IL1β, IL6, and CCL2 in lung tissues as well as the bronchoalveolar lavage fluid, in a dose- and time-dependent manner. MWCNT also significantly increased the expression of the fibrogenic growth factors TGF-β1 and PDGF-A in the lungs. These findings underscore the interplay between acute inflammation and the early fibrotic response in the initiation and propagation of pulmonary fibrosis induced by MWCNT. |
mRNA and miRNA regulatory networks reflective of multi-walled carbon nanotube-induced lung inflammatory and fibrotic pathologies in mice.
Dymacek J , Snyder-Talkington BN , Porter DW , Wolfarth MG , Castranova V , Qian Y , Guo NL . Toxicol Sci 2014 144 (1) 51-64 Multi-walled carbon nanotubes (MWCNT) are known for their transient inflammatory and progressive fibrotic pulmonary effects; however, the mechanisms underlying these pathologies are unknown. In this study, we used time-series microarray data of global lung mRNA and miRNA expression isolated from C57BL/6J mice exposed by pharyngeal aspiration to vehicle or 10, 20, 40, or 80 μg MWCNT at 1, 7, 28, or 56 days post-exposure to determine miRNA and mRNA regulatory networks that are potentially involved in MWCNT-induced inflammatory and fibrotic lung etiology. Using a non-negative matrix factorization method, we determined mRNAs and miRNAs with expression profiles associated with pathology patterns of MWCNT-induced inflammation (based upon bronchoalveolar lavage score) and fibrosis (based upon Sirius Red staining measured with quantitative morphometric analysis). Potential binding targets between pathology-related mRNAs and miRNAs were identified using Ingenuity Pathway Analysis and the miRTarBase, miRecords, and TargetScan databases. Using these experimentally validated and predicted binding targets, we were able to build molecular signaling networks that are potentially reflective of, and play a role in, MWCNT-induced lung inflammatory and fibrotic pathology. As understanding the regulatory networks between mRNAs and miRNAs in different disease states would be beneficial for understanding the complex mechanisms of pathogenesis, these identified genes and pathways may be useful for determining biomarkers of MWCNT-induced lung inflammation and fibrosis for early detection of disease. |
Biomarkers of Parkinson's disease: present and future.
Miller DB , O'Callaghan JP . Metabolism 2014 64 S40-6 Sporadic or idiopathic Parkinson's disease (PD) is an age-related neurodegenerative disorder of unknown origin that ranks second only to Alzheimer's disease (AD) in prevalence and its consequent social and economic burden. PD neuropathology is characterized by a selective loss of dopaminergic neurons in the substantia nigra pars compacta; however, more widespread involvement of other CNS structures and peripheral tissues is now widely documented. The onset of the molecular and cellular neuropathology of PD likely occurs decades before the onset of the motor symptoms characteristic of PD. The hallmark symptoms of PD, resting tremors, rigidity and postural disabilities, are related to dopamine (DA) deficiency. Current therapies treat these symptoms by replacing or boosting existing DA. All current interventions have limited therapeutic benefit for disease progression because damage likely has progressed, over an estimated period of ~5 to 15 years, to a loss of 60%-80% of the nigral DA neurons before symptoms emerge. There is no accepted definitive biomarker of PD. An urgent need exists to develop early diagnostic biomarkers for two reasons: (1) to intervene at the onset of disease and (2) to monitor the progress of therapeutic interventions that may slow or stop the course of the disease. In the context of disease development, one of the promises of personalized medicine is the ability to predict, on an individual basis, factors contributing to the susceptibility for the development of a given disease. Recent advances in our understanding of genetic factors underlying or contributing to PD offer the potential for monitoring susceptibility biomarkers that can be used to identify at-risk individuals and possibly prevent the onset of disease through treatment. 
Finally, the exposome concept is new to the biomarker discovery arena and has been suggested as a way forward in identifying biomarkers of neurological diseases. It is a two-stage scheme involving a first stage of exposome-wide association studies (EWAS) to profile omic features in serum to discover molecular biomarkers. The second stage involves application of this knowledge base in follow-up studies. This strategy is unique in that it promotes the use of data-driven (omic) strategies in interrogating diseased and healthy populations and encourages a movement away from using only reductionist strategies to discover biomarkers of exposure and disease. In this short review we will examine (1) advances in our understanding of the molecular mechanisms underlying PD that have led to candidate biomarkers for diagnosis and treatment efficacy and (2) new technologies on the horizon that will lead to novel approaches in biomarker development. |
Detection of co-colonization with Streptococcus pneumoniae by algorithmic use of conventional and molecular methods.
Saha S , Modak JK , Naziat H , Al-Emran HM , Chowdury M , Islam M , Hossain B , Darmstadt GL , Whitney CG , Saha SK . Vaccine 2014 33 (5) 713-8 Detection of pneumococcal carriage by multiple co-colonizing serotypes is important in assessing the benefits of pneumococcal conjugate vaccine (PCV). Various methods differing in sensitivity, cost and technical complexity have been employed to detect multiple serotypes of pneumococcus in respiratory specimens. We have developed an algorithmic method to detect all known serotypes that preserves the relative abundance of specific serotypes by using Quellung-guided molecular techniques. The method involves culturing respiratory swabs followed by serotyping of 100 colonies by either capsular (10 colonies) or PCR (90 colonies) reactions on 96-well plates. The method was evaluated using 102 nasal swabs from children carrying pneumococcus. Multiple serotypes were detected in 22% of carriers, compared to 3% by World Health Organization (WHO)-recommended morphology-based selection of 1 to 3 colonies. Our method, with a processing cost of $87, could detect subdominant strains making up as little as 1% of the population. The method is affordable, practical, and capable of detecting all known serotypes without false positive reactions or change in the native distribution of multiple serotypes. |
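The sensitivity gain from serotyping 100 colonies instead of 1-3 can be illustrated with a simple binomial calculation. The sketch below is illustrative only (it is not from the paper) and assumes colonies are drawn independently from the carried pneumococcal population:

```python
# Probability of detecting a subdominant serotype when serotyping n colonies,
# assuming each colony is an independent draw from the carried population.

def detection_probability(abundance: float, n_colonies: int) -> float:
    """Chance that at least one of n sampled colonies is the minor serotype."""
    return 1.0 - (1.0 - abundance) ** n_colonies

# A serotype at 1% abundance is detected ~63% of the time with 100 colonies,
# versus ~3% with a 3-colony morphology-based pick.
p100 = detection_probability(0.01, 100)
p3 = detection_probability(0.01, 3)
print(f"100 colonies: {p100:.2f}, 3 colonies: {p3:.2f}")
```

Under this simple model, the 100-colony algorithm is what makes a 1%-abundance subdominant strain detectable at all in practice.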
Newborn blood spot screening test using multiplexed real-time PCR to simultaneously screen for spinal muscular atrophy and severe combined immunodeficiency.
Taylor JL , Lee FK , Yazdanpanah GK , Staropoli JF , Liu M , Carulli JP , Sun C , Dobrowolski SF , Hannon WH , Vogt RF . Clin Chem 2014 61 (2) 412-9 BACKGROUND: Spinal muscular atrophy (SMA) is a motor neuron disorder caused by the absence of a functional survival of the motor neuron 1, telomeric (SMN1) gene. Type I SMA, a lethal disease of infancy, accounts for the majority of cases. Newborn bloodspot screening (NBS) to detect severe combined immunodeficiency (SCID) has been implemented in public health laboratories in the last 5 years. SCID detection is based on real-time PCR assays to measure T-cell receptor excision circles (TREC), a byproduct of T-cell development. We modified a multiplexed real-time PCR TREC assay to simultaneously determine the presence or absence of the SMN1 gene from a dried blood spot (DBS) punch in a single reaction well. METHOD: An SMN1 assay using a locked nucleic acid probe was initially developed with cell culture and umbilical cord blood (UCB) DNA extracts, and then integrated into the TREC assay. DBS punches were placed in 96-well arrays, washed, and amplified directly using reagents specific for TREC, a reference gene [ribonuclease P/MRP 30kDa subunit (RPP30)], and the SMN1 gene. The assay was tested on DBS made from UCB units and from peripheral blood samples of SMA-affected individuals and their family members. RESULTS: DBS made from SMA-affected individuals showed no SMN1-specific amplification, whereas DBS made from all unaffected carriers and UCB showed SMN1 amplification above a well-defined threshold. TREC and RPP30 content in all DBS were within the age-adjusted expected range. CONCLUSIONS: SMA caused by the absence of SMN1 can be detected from the same DBS punch used to screen newborns for SCID. |
Utility of real-time PCR for detection of Exserohilum rostratum in body and tissue fluids during the multistate outbreak of fungal meningitis and other infections.
Gade L , Grgurich DE , Kerkering TM , Brandt ME , Litvintseva AP . J Clin Microbiol 2014 53 (2) 618-25 Exserohilum rostratum was the major cause of the multistate outbreak of fungal meningitis linked to contaminated injections of methylprednisolone acetate (MPA) produced by the New England Compounding Center (NECC). Previously, we developed a fungal DNA extraction procedure and broad-range and E. rostratum-specific PCR assays, and confirmed the presence of fungal DNA in 28% of case-patients. Here we report the development and validation of a TaqMan® real-time PCR assay for detection of E. rostratum in body fluids, which we used to confirm infections in 57 additional case-patients, bringing the total number of case-patients with PCR-positive results for E. rostratum to 171 (37% of 461 case-patients with available specimens). Compared to fungal culture and the previous PCR assays, this real-time PCR assay was more sensitive: of 139 case-patients with identical specimens tested by all three methods, 19 (14%) specimens were positive by culture, 41 (29%) were positive by the conventional PCR, and 65 (47%) were positive by the real-time PCR. We also compared the utility of the real-time PCR with the previously described β-D-glucan (BDG) detection assay for monitoring response to treatment in case-patients with serially collected CSF. In most case-patients, only the incident CSF specimens were positive by real-time PCR, while most subsequently collected specimens were negative, confirming our previous observations that the BDG assay was more appropriate than real-time PCR for monitoring response to treatment. Our results also demonstrate that real-time PCR is extremely susceptible to contamination and its results should only be used in conjunction with clinical and epidemiological data. |
PET-PCR method for the molecular detection of malaria parasites in a national malaria surveillance study in Haiti, 2011.
Lucchi NW , Karell MA , Journel I , Rogier E , Goldman I , Ljolje D , Huber C , Mace KE , Jean SE , Akom EE , Oscar R , Buteau J , Boncy J , Barnwell JW , Udhayakumar V . Malar J 2014 13 (462) 462 BACKGROUND: Recently, a real-time PCR assay known as photo-induced electron transfer (PET)-PCR, which relies on self-quenching primers for the detection of Plasmodium spp. and Plasmodium falciparum, was described. The PET-PCR assay was found to be robust and easier to use than currently available real-time PCR methods. The potential of PET-PCR for molecular detection of malaria parasites in a nationwide malaria community survey in Haiti was investigated. METHODS: DNA from the dried blood spots was extracted using QIAGEN methodology. All 2,989 samples were screened using the PET-PCR assay in duplicate. Samples with a cycle threshold (CT) of 40 or less were scored as positive. A subset of the total samples (534) was retested using a nested PCR assay for confirmation. In addition, these same samples were also tested using a TaqMan-based real-time PCR assay. RESULTS: A total of 12 of the 2,989 samples screened (0.4%) were found to be positive by PET-PCR (mean CT value of 35.7). These same samples were also found to be positive by the nested and TaqMan-based methods. The nested PCR detected an additional positive sample in the subset of 534 samples that was not detected by either the PET-PCR or the TaqMan-based PCR method. CONCLUSION: While the nested PCR was found to be slightly more sensitive than the PET-PCR, it is not ideal for high-throughput screening of samples. Given its ease of use and lower cost than the nested PCR, PET-PCR provides an alternative assay for the rapid screening of a large number of samples in laboratory settings. |
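The screening rule above (duplicate real-time PCR runs, positive at CT ≤ 40) can be sketched as a small scoring function. This is an illustrative sketch, not the study's code; it assumes a sample is called positive if either duplicate amplifies at or below the cutoff, which the abstract does not specify:

```python
# Minimal sketch of a CT-cutoff screening rule for duplicate real-time PCR
# runs. Names and the either-replicate rule are illustrative assumptions.

CT_CUTOFF = 40.0

def score_sample(ct_values):
    """Score a sample from its replicate CT values (None = no amplification)."""
    detected = [ct for ct in ct_values if ct is not None and ct <= CT_CUTOFF]
    return "positive" if detected else "negative"

print(score_sample([35.7, 36.1]))  # -> positive
print(score_sample([None, 42.3]))  # -> negative
```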
Structure-dependent immune modulatory activity of protegrin-1 analogs
Zughaier SM , Svoboda P , Pohl J . Antibiotics (Basel) 2014 3 (4) 694-713 Protegrins are porcine antimicrobial peptides (AMPs) that belong to the cathelicidin family of host defense peptides. Protegrin-1 (PG-1), the most investigated member of the protegrin family, is an arginine-rich peptide consisting of 18 amino acid residues, its main chain adopting a β-hairpin structure that is linked by two disulfide bridges. We report on the immune modulatory activity of PG-1 and its analogs in neutralizing bacterial endotoxin and capsular polysaccharides, consequently inhibiting inflammatory mediators' release from macrophages. We demonstrate that the β-hairpin structure motif stabilized with at least one disulfide bridge is a prerequisite for the immune modulatory activity of this type of AMP. |
Towards elucidating the effects of purified MWCNTs on human lung epithelial cells
Dong C , EIdawud R , Sargent LM , Kashon ML , Lowry D , Rojanasakul Y , Dinu CZ . Environ Sci Nano 2014 1 (6) 95-603 Toxicity of engineered nanomaterials is associated with their inherent physical and chemical properties. Recent studies have shown that exposure to multi-walled carbon nanotubes (MWCNTs) promotes tumors and tumor-associated pathologies and leads to carcinogenesis in model in vivo systems. Herein we examined the potential of purified MWCNTs, used at occupationally relevant exposure doses for particles not otherwise regulated, to affect human lung epithelial cells. The uptake of the purified MWCNTs was evaluated using fluorescence-activated cell sorting (FACS), while the effects on cell fate were assessed using the 2-(4-iodophenyl)-3-(4-nitrophenyl)-5-(2,4-disulfophenyl)-2H-tetrazolium salt colorimetric assay, cell cycle analysis, and nanoindentation. Our results showed that exposure to MWCNTs reduced cell metabolic activity and induced cell cycle arrest. Our analysis further emphasized that MWCNT-induced cellular fate results from multiple types of interactions that can be analyzed by means of intracellular biomechanical changes and are pivotal in understanding the underlying MWCNT-induced cell transformation. |
Traceable assigned values in external quality assessment schemes compared to those obtained by alternative procedure: a case study for Cu, Se and Zn in serum
Patriarca M , Weykamp C , Arnaud J , Jones RL , Parsons PJ , Taylor A . J Anal At Spectrom 2014 30 (1) 148-153 International standards for the recognition of the competence of testing laboratories require that measurement results should be traceable to a conventionally agreed reference. This should be achieved by appropriate calibration of equipment and method validation involving analysis of certified reference materials (CRM). However, these are costly, and for many analytical procedures few are available. Participation in external quality assessment schemes (EQAS) may provide a means to support the laboratory traceability statement, if the values assigned to test samples are traceable to a stated reference. Values may be assigned to EQAS test samples by a variety of techniques, but there has been no direct comparison of results obtained when these procedures are applied to the same samples. In this study, traceable values for Cu, Se and Zn concentrations were assigned to three batches of EQAS serum samples by analysis by expert laboratories together with CRMs, and compared with those obtained by three other approaches described in ISO 13528: analysis by a definitive method (ID-ICP-MS); determination of a robust consensus mean from the results of expert laboratories; and a robust consensus mean of results from EQAS participants. The assigned values (μmol L-1), with expanded uncertainty (%), for the low, medium and high pools obtained by ID-ICP-MS were: Cu 13.37 (1.2), 21.03 (1.8), 28.73 (1.2); Se 0.74 (3.5), 1.51 (3.4), 3.11 (3.6); Zn 9.69 (4.9), 22.52 (1.5), 30.85 (3.8). Concentrations determined using the three other approaches were similar, but the uncertainties increased as the methodologies became less rigorous. |
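Two of the assigned-value approaches above rest on a robust consensus mean. A generic sketch of the iteratively winsorized estimator of ISO 13528 (Algorithm A) is shown below; this is an illustrative implementation, not the scheme's actual software:

```python
# Robust mean and standard deviation per ISO 13528 Algorithm A:
# start from the median and scaled MAD, then repeatedly winsorize
# results to x* +/- 1.5*s* and update the estimates.
import statistics

def robust_mean_sd(values, iterations=20):
    """Return the Algorithm A robust mean x* and robust SD s*."""
    x = statistics.median(values)
    s = 1.483 * statistics.median([abs(v - x) for v in values])
    for _ in range(iterations):
        delta = 1.5 * s
        # Winsorize: pull outlying results in to x +/- delta, then update.
        clipped = [min(max(v, x - delta), x + delta) for v in values]
        x = statistics.fmean(clipped)
        s = 1.134 * statistics.stdev(clipped, xbar=x)
    return x, s

# One gross outlier barely moves the robust mean (hypothetical Cu results):
mean, sd = robust_mean_sd([13.2, 13.4, 13.3, 13.5, 13.4, 19.0])
print(f"robust mean = {mean:.2f}")
```

The winsorization step is why a single aberrant participant result inflates the consensus mean far less than it would an ordinary arithmetic mean.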
Magnetic resonance imaging of graded skeletal muscle injury in live rats
Cutlip RG , Hollander MS , Johnson GA , Johnson BW , Friend SA , Baker BA . Environ Health Insights 2014 8 31-9 INTRODUCTION: An increasing number of stretch-shortening contractions (SSCs) results in increased muscle injury. METHODS: Fischer Hybrid rats were acutely exposed to an increasing number of SSCs in vivo using a custom-designed dynamometer. Magnetic resonance imaging (MRI) was conducted 72 hours after exposure; rats were infused with Prohance and imaged using a 7T rodent MRI system (GE Epic 12.0). Images were acquired in the transverse plane, with typically 60 total slices covering the entire length of the hind legs. Rats were euthanized after MRI, the lower limbs removed, and tibialis anterior muscles were prepared for histology and quantitative stereology. RESULTS: Stereological analyses showed that myofiber degeneration and cellular infiltrates significantly increased following 70 and 150 SSC exposures compared to controls. MRI revealed that the percent affected area significantly increased with exposure in all SSC groups in a graded fashion. Signal intensity also significantly increased with increasing SSC repetitions. DISCUSSION: These results suggest that contrast-enhanced MRI has the sensitivity to differentiate specific degrees of skeletal muscle strain injury, and imaging data are specifically representative of cellular histopathology quantified via stereological analyses. |
Morphologic differentiation of viruses beyond the family level
Goldsmith CS . Viruses 2014 6 (12) 4902-4913 Electron microscopy has been instrumental in the identification of viruses by being able to characterize a virus to the family level. There are a few cases where morphologic or morphogenesis factors can be used to differentiate further, to the genus level. These include viruses in the families Poxviridae, Reoviridae, Retroviridae, Herpesviridae, Filoviridae, and Bunyaviridae. |
Multi-walled carbon nanotube-induced gene expression in vitro: concordance with in vivo studies
Snyder-Talkington BN , Dong C , Zhao X , Dymacek J , Porter DW , Wolfarth MG , Castranova V , Qian Y , Guo NL . Toxicology 2014 328c 66-74 There is a current interest in reducing the in vivo toxicity testing of nanomaterials in animals by increasing toxicity testing using in vitro cellular assays; however, toxicological results are seldom concordant between in vivo and in vitro models. This study compared global multi-walled carbon nanotube (MWCNT)-induced gene expression from human lung epithelial and microvascular endothelial cells in monoculture and coculture with gene expression from mouse lungs exposed to MWCNT. Using a cutoff of 10% false discovery rate and 1.5 fold change, we determined that there were more concordant genes (gene expression both up- or downregulated in vivo and in vitro) expressed in both cell types in coculture than in monoculture. When reduced to only those genes involved in inflammation and fibrosis, known outcomes of in vivo MWCNT exposure, there were more disease-related concordant genes expressed in coculture than monoculture. Additionally, different cellular signaling pathways are activated in response to MWCNT dependent upon culturing conditions. As coculture gene expression better correlated with in vivo gene expression, we suggest that cellular cocultures may offer enhanced in vitro models for nanoparticle risk assessment and the reduction of in vivo toxicological testing. |
Multilaboratory testing of antifungal drug combinations against Candida species and Aspergillus fumigatus: utility of one-hundred percent inhibition as the end-point
Ren P , Luo M , Lin S , Ghannoum MA , Isham N , Diekema DJ , Pfaller MA , Messer S , Lockhart SR , Iqbal N , Chaturvedi V . Antimicrob Agents Chemother 2014 59 (3) 1759-66 Four laboratories tested three isolates of Candida species and two isolates of Aspergillus fumigatus using 96-well plates containing combinations of amphotericin B, anidulafungin, caspofungin, micafungin, fluconazole, itraconazole, posaconazole, and voriconazole. The majority of summation fractional inhibitory concentration indices (ΣFICI), based on the Loewe additivity formula, suggested indifferent drug interactions (0.5 < ΣFICI ≤ 4.0), and no instance of drug antagonism (ΣFICI > 4.0) was observed. The intra- and inter-laboratory agreement rates were superior when MIC100 readings were used as end-points (CI = 99%). |
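The interaction categories above follow the standard checkerboard calculation: for each drug, the fractional inhibitory concentration (FIC) is its MIC in combination divided by its MIC alone, and the two FICs are summed. A minimal sketch, using hypothetical MIC values rather than data from the study:

```python
# Summation fractional inhibitory concentration index (Loewe additivity)
# for a two-drug checkerboard well, with the cutoffs used above:
# synergy <= 0.5 < indifference <= 4.0 < antagonism.

def fici(mic_a_combo, mic_a_alone, mic_b_combo, mic_b_alone):
    """Sigma-FICI = FIC_A + FIC_B for one combination well."""
    return mic_a_combo / mic_a_alone + mic_b_combo / mic_b_alone

def classify(sigma_fici):
    if sigma_fici <= 0.5:
        return "synergy"
    if sigma_fici <= 4.0:
        return "indifference"
    return "antagonism"

# Hypothetical: drug A MIC 0.5 -> 0.25 in combination, drug B MIC 1.0 -> 0.5
s = fici(0.25, 0.5, 0.5, 1.0)
print(s, classify(s))  # 1.0 -> indifference
```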
Non-structural protein 1-specific immunoglobulin M and G antibody-capture enzyme-linked immunosorbent assays in diagnosis of flaviviral infections in humans
Chao DY , Galula JU , Shen WF , Davis BS , Chang GJ . J Clin Microbiol 2014 53 (2) 557-66 IgM antibody- and IgG antibody-capture ELISAs (MAC/GAC-ELISAs) targeted at the envelope protein (E) of dengue viruses (DENV), West Nile virus and Japanese encephalitis virus are widely used as sero-diagnostic tests for presumptive confirmation of viral infection. Antibodies directed against the flavivirus nonstructural protein-1 (NS1) have been proposed as serological markers of natural infection among vaccinated populations. The aim of the current study was to optimize MAC/GAC-ELISAs to detect anti-NS1 antibodies and compare them with the anti-E MAC/GAC-ELISAs. Plasmids to express premembrane/envelope (prM/E) or NS1 proteins of six medically important flaviviruses, including dengue (DENV-1 to DENV-4), West Nile (WNV) and Japanese encephalitis (JEV) viruses, were constructed. These plasmids were used for the production of prM/E-containing virus-like particles (VLPs) and secreted NS1 (sNS1) from COS-1 cells. Archived clinical specimens from patients with confirmed DENV, JEV, and WNV infections, along with naive sera, were subjected to NS1-MAC/GAC-ELISAs before or after depletion of anti-prM/E antibodies by pre-absorption with or without VLPs. Human serum specimens from previously confirmed DENV infections showed significantly enhanced P/N ratios for NS1-MAC/GAC-ELISAs after the depletion of anti-prM/E antibodies. No statistical differences in sensitivities and specificities were found between the newly developed NS1- and VLP-MAC/GAC-ELISAs. Further application of the assays to WNV- and JEV-infected serum panels showed similar results. A novel approach to performing MAC/GAC-ELISAs for NS1 antibody detection was successfully developed, with great potential to differentiate antibodies elicited by the tetravalent chimeric yellow fever-17D/dengue vaccine from those elicited by DENV infection. |
Production of a Chaetomium globosum enolase monoclonal antibody
Green BJ , Nayak AP , Lemons AR , Rittenour WR , Hettick JM , Beezhold DH . Monoclon Antib Immunodiagn Immunother 2014 33 (6) 428-37 Chaetomium globosum is a hydrophilic fungal species and a contaminant of water-damaged building materials in North America. Methods to detect Chaetomium species include subjective identification of ascospores, viable culture, or molecular-based detection methods. In this study, we describe the production and initial characterization of a monoclonal antibody (MAb) for C. globosum enolase. MAb 1C7, a murine IgG1 isotype MAb, was produced and reacted with recombinant C. globosum enolase (rCgEno) in an enzyme-linked immunosorbent assay and with a putative C. globosum enolase in a Western blot. Epitope mapping showed MAb 1C7 specific reactivity to an enolase decapeptide, LTYEELANLY, that is highly conserved within the fungal class Sordariomycetes. Cross-reactivity studies showed MAb 1C7 reactivity to C. atrobrunneum but not C. indicum. MAb 1C7 did not react with enolase from Aspergillus fumigatus, which is divergent in only two amino acids within this epitope. The results of this study suggest potential utility of MAb 1C7 in Western blot applications for the detection of Chaetomium and other Sordariomycetes species. |
Effect of wearing an N95 respirator on infrared tympanic membrane temperature measurements
Kim J , Roberge RJ , Powell JB . J Clin Monit Comput 2014 29 (6) 691-5 To determine the impact of wearing an N95 filtering facepiece respirator (N95 FFR) on tympanic membrane temperature (TMT) measurements, TMT measurements with and without an N95 FFR were obtained at the onset and termination of 1 h of treadmill exercise in 21 subjects, and at staggered time intervals (0, 20, 40, 60 min) during combined sedentary activity and exercise in another 46 subjects, to determine any effect on TMT. A total of 877 TMT measurements were obtained, demonstrating a mean TMT increase of 0.05 °C in the first study group (p = 0.04) and a 0.19 °C decrease in the second study group (p < 0.001) with the wearing of an N95 FFR, both of which were lower than controls. Wearing an N95 FFR for 1 h, at different levels of activity, results in significantly lower TMT values than not wearing an N95 FFR, but the magnitude of the changes would likely have minimal clinical significance. |
Endotoxin deposits on the inner surfaces of closed-face cassettes during bioaerosol sampling: a field investigation at composting facilities
Duquenne P , Simon X , Demange V , Harper M , Wild P . Ann Occup Hyg 2014 59 (4) 504-13 A set of 270 bioaerosol samples was taken from 15 composting facilities using polystyrene closed-face filter cassettes (CFCs). The objective was to measure the quantity of endotoxin deposited on the inner surfaces of the cassettes (sometimes referred to as 'wall deposits'). The results show that endotoxins are deposited on the inner surfaces of the CFCs during sampling and/or handling of samples. The quantity of endotoxins measured on inner surfaces ranged between 0.05 (the limit of detection of the method) and 3100 endotoxin units per cassette. The deposits can represent a large and variable percentage of the endotoxins sampled: more than a third of the samples had inner-surface deposits exceeding 40% of the total quantity of endotoxins collected (filter + inner surfaces). Omitting these inner-surface deposits from the analytical process leads to measurement errors relative to sampling all particles entering the CFC sampler, which corresponds to a developing consensus on matching the inhalable particulate sampling convention. The result would be underestimated exposures, which could affect the decision as to whether a result is acceptable in comparison to airborne concentration limits defined in terms of the inhalability convention. The results of this study suggest including the endotoxins deposited on the inner surfaces of CFCs during analysis. Further research is necessary to investigate endotoxin deposits on the inner cassette surfaces in other working sectors. |
Galactose-1-phosphate uridyltransferase dried blood spot quality control materials for newborn screening tests
Adam BW , Flores SR , Hou Y , Allen TW , De Jesus VR . Clin Biochem 2014 48 (6) 437-42 OBJECTIVES: We aimed to prepare dried-blood-spot (DBS) quality control (QC) materials for galactose-1-phosphate uridyltransferase (GALT), to evaluate their stability during storage and use, and to evaluate their performance in five DBS GALT test methods. DESIGN AND METHODS: We prepared and characterized GALT-normal and GALT-deficient DBS materials and compared GALT activities in DBSs after predetermined storage intervals at controlled temperatures and humidities. External evaluators documented the suitability of the DBS QC materials for use in five GALT test methods. RESULTS: GALT activity losses from DBSs stored in low (<30%) humidity for 14 days at 45 degrees C, 35 days at 37 degrees C, 91 days at room temperature, 182 days at 4 degrees C, and 367 days at -20 degrees C were 54%, 53%, 52%, 23%, and 7%, respectively. In paired DBSs stored in high (>50%) humidity for identical intervals, losses were 68% at 45 degrees C, 79% at 37 degrees C, 72% at room temperature, and 63% at 4 degrees C. GALT activities in DBSs stored at 4 degrees C were stable throughout 19 excursions to room temperature. Twenty-five of 26 external evaluators, using five different GALT test methods, classified the GALT-deficient DBSs as "outside normal limits". All evaluators classified the GALT-normal DBSs as "within normal limits". CONCLUSIONS: Most of the GALT activity loss from DBSs stored at elevated or room temperature was attributable to the effects of storage temperature. Most of the loss from DBSs stored at 4 degrees C was attributable to the effects of elevated humidity. Loss from DBSs stored at -20 degrees C was insignificant. The DBS materials were suitable for monitoring performance of all five GALT test methods. |
Human rhinovirus induced cytokine/chemokine responses in human airway epithelial and immune cells
Rajan D , McCracken CE , Kopleman HB , Kyu SY , Lee FE , Lu X , Anderson LJ . PLoS One 2014 9 (12) e114322 Infections with human rhinovirus (HRV) are commonly associated with acute upper and lower respiratory tract disease and asthma exacerbations. The role that HRVs play in these diseases suggests it is important to understand host-specific or virus-specific factors that contribute to pathogenesis. Since species A HRVs are often associated with more serious HRV disease than species B HRVs, differences in the immune responses they induce should inform disease pathogenesis. To identify species differences in induced responses, we evaluated 3 species A viruses (HRV 25, 31, and 36) and 3 species B viruses (HRV 4, 35, and 48) by exposing human peripheral blood mononuclear cells (PBMCs) to HRV-infected Calu-3 cells. To evaluate the potential effect of memory induced by previous HRV infection on study responses, we tested cord blood mononuclear cells (CBMCs), which should be HRV naive. There were HRV-associated increases (significant increases compared to mock-infected cells) for one or more HRVs for IP-10 and IL-15 that were unaffected by addition of PBMCs, for MIP-1alpha, MIP-1beta, IFN-alpha, and HGF only with addition of PBMCs, and for ENA-78 only without addition of PBMCs. All three species B HRVs induced higher levels, compared to A HRVs, of MIP-1alpha and MIP-1beta with PBMCs and of ENA-78 without PBMCs. In contrast, addition of CBMCs had less effect: it did not induce MIP-1alpha, MIP-1beta, or IFN-alpha, nor did it block ENA-78 production. Addition of CBMCs did, however, increase IP-10 levels for HRV 35 and HRV 36 infection. The presence of an effect with PBMCs and no effect with CBMCs for some responses suggests differences between the two types of cells, possibly because of HRV memory responses present in PBMCs but not CBMCs, or because of the limited response capacity of the immature CBMCs relative to PBMCs. 
Thus, our results indicate that different HRV strains can induce different patterns of cytokines and chemokines; some of these differences may be due to differences in memory responses induced by past HRV infections, and other differences related to virus factors that can inform disease pathogenesis. |
Intravenous and gastric cerium dioxide nanoparticle exposure disrupts microvascular smooth muscle signaling
Minarchick VC , Stapleton PA , Fix NR , Leonard SS , Sabolsky EM , Nurkiewicz TR . Toxicol Sci 2014 144 (1) 77-89 Cerium dioxide nanoparticles (CeO2 NP) hold great therapeutic potential, but the in vivo effects of non-pulmonary exposure routes are unclear. The first aim was to determine whether microvascular function is impaired after intravenous and gastric CeO2 NP exposure. The second aim was to investigate the mechanism(s) of action underlying microvascular dysfunction following CeO2 NP exposure. Rats were exposed to CeO2 NP (primary diameter: 4 +/- 1 nm, surface area: 81.36 m2/g) by intratracheal instillation, intravenous injection, or gastric gavage. Mesenteric arterioles were harvested 24 h post-exposure and vascular function was assessed using an isolated arteriole preparation. Endothelium-dependent and -independent function and vascular smooth muscle (VSM) signaling [soluble guanylyl cyclase (sGC) and cyclic guanosine monophosphate (cGMP)] were assessed. Reactive oxygen species (ROS) generation and nitric oxide (NO) production were analyzed. Compared to controls, endothelium-dependent and -independent dilation were impaired following intravenous injection (by 61% and 45%) and gastric gavage (by 63% and 49%). However, intravenous injection resulted in greater microvascular impairment (16% and 35%) compared to gastric gavage at an identical dose (100 μg). Furthermore, sGC activation and cGMP responsiveness were impaired following pulmonary, intravenous, and gastric CeO2 NP treatment. Finally, nanoparticle exposure resulted in route-dependent, increased ROS generation and decreased NO production. These results indicate that CeO2 NP exposure route differentially impairs microvascular function, which may be mechanistically linked to decreased NO production and subsequent VSM signaling. Fully understanding the mechanisms behind CeO2 NP in vivo effects is a critical step in the continued therapeutic development of this nanoparticle. |
Bacteriophage-mediated control of a two-species biofilm of CAUTI-associated microorganisms in an in vitro urinary catheter model
Lehman SM , Donlan RM . Antimicrob Agents Chemother 2014 59 (2) 1127-37 Microorganisms from a patient or their environment may colonize indwelling urinary catheters, forming biofilm communities on catheter surfaces and increasing patient morbidity and mortality. This study investigated the effect of pre-treating hydrogel-coated silicone catheters with mixtures of Pseudomonas aeruginosa and Proteus mirabilis bacteriophages on the development of single- and two-species biofilms in a multi-day, continuous-flow in vitro model using artificial urine. Novel phages were purified from sewage, characterized, and screened for their ability to reduce biofilm development by clinical isolates of their respective hosts. Screening data showed that Artificial Urine Medium (AUM) is a valid substitute for human urine for the purpose of evaluating uropathogen biofilm control by these bacteriophages. Defined phage cocktails targeting P. aeruginosa and P. mirabilis were designed based on the biofilm inhibition screens. Hydrogel-coated catheters were pre-treated with one or both cocktails and challenged with approximately 1 x 10(3) CFU/mL of the corresponding pathogen(s). Biofilm growth on catheter surfaces in AUM was monitored over 72 to 96 h. Phage pre-treatment reduced P. aeruginosa biofilm counts by 4 log10 CFU/cm2 (p ≤ 0.01) and P. mirabilis biofilm counts by >2 log10 CFU/cm2 (p ≤ 0.01) over 48 h. The presence of P. mirabilis was always associated with an increase in lumen pH from 7.5 to 9.5 and with eventual blockage of the reactor lines. The results of this study suggest that pre-treatment of a hydrogel-coated urinary catheter with a phage cocktail can significantly reduce mixed-species biofilm formation by clinically relevant bacteria. |
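The phage effects in the entry above are reported as log10 reductions in biofilm counts; a 4 log10 CFU/cm2 drop corresponds to a 10,000-fold decrease. A minimal sketch of that conversion, using hypothetical colony counts chosen only to illustrate the scale (the study reports reductions, not these raw counts):

```python
import math

def log10_reduction(before_cfu: float, after_cfu: float) -> float:
    """Log10 reduction between two colony counts (CFU per unit area)."""
    return math.log10(before_cfu) - math.log10(after_cfu)

# Hypothetical counts: 1e7 -> 1e3 CFU/cm2 is a 4-log (10,000-fold) reduction
print(log10_reduction(1e7, 1e3))  # 4.0
```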
A combined oral contraceptive affects mucosal SHIV susceptibility factors in a pigtail macaque (Macaca nemestrina) model
Dietz Ostergaard S , Butler K , Ritter JM , Johnson R , Sanders J , Powell N , Lathrop G , Zaki SR , McNicholl JM , Kersh EN . J Med Primatol 2014 44 (2) 97-107 BACKGROUND: Injectable hormonal contraception may increase women's risk of HIV acquisition and can affect biological risk factors in animal models of HIV. We established, for the first time, a model to investigate whether combined oral contraceptives (COC) alter SHIV susceptibility in macaques. METHODS: Seven pigtail macaques were administered a monophasic levonorgestrel (LNG)/ethinyl estradiol (EE) COC at 33% or 66% of the human dose for 60 days. Menstrual cycling, vaginal epithelial thickness, and other SHIV susceptibility factors were monitored for a mean of 18 weeks. RESULTS: Mean vaginal epithelial thickness was 290.8 μm at baseline and 186.2 μm during COC treatment (P = 0.0141, Mann-Whitney U-test). Vaginal pH decreased from 8.5 during treatment to 6.5 post-treatment (P = 0.0176, two-tailed t-test). Measured microflora were unchanged. CONCLUSIONS: COC caused thinning of the vaginal epithelium and vaginal pH changes, which may increase SHIV susceptibility. A dose of 0.033 mg LNG + 0.0066 mg EE appeared effective in suppressing ovulation. |
Incidence of sickle cell trait--United States, 2010.
Ojodu J , Hulihan MM , Pope SN , Grant AM . MMWR Morb Mortal Wkly Rep 2014 63 (49) 1155-8 Persons with sickle cell trait (SCT) are heterozygous carriers of an abnormal beta-globin gene that results in the production of an abnormal hemoglobin, Hb S, which can distort red blood cells (http://www.cdc.gov/ncbddd/sicklecell/facts.html). All state newborn screening (NBS) programs have provided universal sickle cell disease (SCD) screening for newborns since 2006. Screening for SCD detects both SCD and SCT. To obtain up-to-date measures of the occurrence of SCT among newborns by race/ethnicity and state of birth, data collected by state NBS programs in 2010 were examined. In 2010, the incidence of SCT in participating states was 15.5 per 1,000 newborns overall, 73.1 per 1,000 among black newborns, and 6.9 per 1,000 among Hispanic newborns. Incidence by state ranged from 0.8 per 1,000 screened newborns in Montana to 34.1 per 1,000 in Mississippi. Although the occurrence of SCT varies greatly from state to state and among different races and ethnicities, every state and racial/ethnic population includes persons living with the condition. The period immediately following NBS is ideal for primary care providers and genetic counselors to begin educating the families of identified persons with SCT about potential health complications and reproductive considerations. |
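The incidence figures above are simple rates per 1,000 screened newborns. A minimal sketch of the calculation, using hypothetical counts chosen only to reproduce the reported overall rate (the MMWR report publishes rates, not these raw counts):

```python
def incidence_per_1000(cases: int, screened: int) -> float:
    """Incidence of a condition per 1,000 newborns screened."""
    return 1000 * cases / screened

# Hypothetical counts that yield the reported overall SCT rate of 15.5
rate = incidence_per_1000(cases=1550, screened=100000)
print(round(rate, 1))  # 15.5
```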
The impact of case definition on attention-deficit/hyperactivity disorder prevalence estimates in community-based samples of school-aged children
McKeown RE , Holbrook JR , Danielson ML , Cuffe SP , Wolraich ML , Visser SN . J Am Acad Child Adolesc Psychiatry 2015 54 (1) 53-61 OBJECTIVE: To determine the impact of varying attention-deficit/hyperactivity disorder (ADHD) diagnostic criteria, including new DSM-5 criteria, on prevalence estimates. METHOD: Parent and teacher reports identified high- and low-screen children with ADHD from elementary schools in 2 states, producing a diverse overall sample. The parent interview stage included the Diagnostic Interview Schedule for Children-IV (DISC-IV), and up to 4 additional follow-up interviews. Weighted prevalence estimates, accounting for complex sampling, quantified the impact of varying ADHD criteria using baseline and the final follow-up interview data. RESULTS: At baseline, 1,060 caregivers were interviewed; 656 had at least 1 follow-up interview. Teachers and parents reported 6 or more ADHD symptoms for 20.5% (95% CI = 18.1%-23.2%) and 29.8% (CI = 24.5%-35.6%) of children, respectively, with criteria for impairment and onset by age 7 years (DSM-IV) reducing these proportions to 16.3% (CI = 14.7%-18.0%) and 17.5% (CI = 13.3%-22.8%); requiring at least 4 teacher-reported symptoms reduced the parent-reported prevalence to 8.9% (CI = 7.4%-10.6%). Revising age of onset to 12 years per DSM-5 increased the 8.9% estimate to 11.3% (CI = 9.5%-13.3%), with a similar increase seen at follow-up: 8.2% with onset by age 7 (CI = 5.9%-11.2%) versus 13.0% (CI = 7.6%-21.4%) with onset by age 12. Reducing the number of symptoms required for those aged 17 and older increased the overall estimate to 13.1% (CI = 7.7%-21.5%). CONCLUSION: These findings quantify the impact on prevalence estimates of varying case definition criteria for ADHD. Further research on impairment ratings and data from multiple informants is required to better inform clinicians conducting diagnostic assessments. 
DSM-5 changes in age of onset and number of symptoms required for older adolescents appear to increase prevalence estimates, although the full impact is uncertain due to the age of our sample. |
Repeated measures study of weekly and daily cytomegalovirus shedding patterns in saliva and urine of healthy cytomegalovirus-seropositive children
Cannon MJ , Stowell JD , Clark R , Dollard PR , Johnson D , Mask K , Stover C , Wu K , Amin M , Hendley W , Guo J , Schmid DS , Dollard SC . BMC Infect Dis 2014 14 (569) 569 BACKGROUND: To better understand potential transmission risks from contact with the body fluids of children, we monitored the presence and amount of CMV shedding over time in healthy CMV-seropositive children. METHODS: Through screening we identified 36 children from the Atlanta, Georgia area who were CMV-seropositive, including 23 who were shedding CMV at the time of screening. Each child received 12 weekly in-home visits at which field workers collected saliva and urine. During the final two weeks, parents also collected saliva and urine daily. RESULTS: Prevalence of shedding was highly correlated with initial shedding status: children shedding at the screening visit had CMV DNA in 84% of follow-up saliva specimens (455/543) and 28% of follow-up urine specimens (151/539); those not shedding at the screening visit had CMV DNA in 16% of follow-up saliva specimens (47/303) and 5% of follow-up urine specimens (16/305). Among positive specimens we found median viral loads of 82,900 copies/mL in saliva and 34,730 copies/mL in urine (P=0.01), while the viral load for the 75th percentile was nearly 1.5 million copies/mL for saliva compared to 86,800 copies/mL for urine. Younger age was significantly associated with higher viral loads, especially for saliva (P<0.001). Shedding prevalence and viral loads were relatively stable over time. All children who were shedding at the screening visit were still shedding at least some days during weeks 11 and 12, and median and mean viral loads did not change substantially over time. CONCLUSIONS: Healthy CMV-seropositive children can shed CMV for months at high, relatively stable levels. 
These data suggest that behavioral prevention messages need to address transmission via both saliva and urine, but also need to be informed by the potentially higher risks posed by saliva and by exposures to younger children. |
Seroprevalence of cytomegalovirus among children 1 to 5 years of age in the United States from the National Health and Nutrition Examination Survey of 2011 to 2012
Lanzieri TM , Kruszon-Moran D , Amin MM , Bialek SR , Cannon MJ , Carroll MD , Dollard SM . Clin Vaccine Immunol 2014 22 (2) 245-7 Cytomegalovirus (CMV) seroprevalence among U.S. children 1-5 years old was assessed in the National Health and Nutrition Examination Survey 2011-2012. Overall seroprevalence (95% confidence interval) was 20.7% (14.4%-28.2%) for CMV IgG, 1.1% (0.4%-2.4%) for IgM, and 3.6% (1.7%-6.6%) for low IgG avidity, corresponding to a 17.3% (10.1%-26.7%) prevalence of recent infection among IgG-positive children. |
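The 17.3% figure above is the prevalence of recent infection conditional on being IgG positive, i.e. the low-avidity fraction divided by the IgG-positive fraction. A minimal sketch of that point-estimate arithmetic (the survey's confidence intervals require design-weighted variance estimation not shown here, so the crude ratio only approximates the published value):

```python
def conditional_prevalence(marker_prev: float, base_prev: float) -> float:
    """Prevalence of a sub-state among those positive for a base marker."""
    return marker_prev / base_prev

# Low IgG avidity (recent infection) among IgG-positive children
recent = conditional_prevalence(marker_prev=0.036, base_prev=0.207)
print(round(100 * recent, 1))  # 17.4, close to the reported 17.3%
```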
Taking stock of the CSHCN screener: a review of common questions and current reflections
Bethell CD , Blumberg SJ , Stein RE , Strickland B , Robertson J , Newacheck PW . Acad Pediatr 2014 15 (2) 165-76 Since 2000, the Children with Special Health Care Needs (CSHCN) Screener (CS) has been widely used nationally, by states, and locally as a standardized and brief survey-based method to identify populations of children who experience chronic physical, mental, behavioral, or other conditions and who also require types and amounts of health and related services beyond those routinely used by children. Common questions about the CS include those related to its development and uses; its conceptual framework and potential for under- or overidentification; its ability to stratify CSHCN by complexity of service needs and daily life impacts; and its potential application in clinical settings and comparisons with other identification approaches. This review recaps the development, design, and findings from the use of the CS and synthesizes findings from studies conducted over the past 13 years as well as updated findings on the CS to briefly address the 12 most common questions asked about this tool through technical assistance provided regarding the CS since 2001. Across a range of analyses, the CS consistently identifies a subset of children with chronic conditions who need or use more than a routine type or amount of medical- and health-related services and who share common needs for health care, including care coordination, access to specialized and community-based services, and enhanced family engagement. Scoring algorithms exist to stratify CSHCN by complexity of needs and higher costs of care. Combining CS data with clinical diagnostic code algorithms may enhance capacity to further identify meaningful subgroups. Clinical application is most suited for identifying and characterizing populations of patients and assessing quality and system improvement impacts for children with a broad range of chronic conditions. 
Other clinical applications require further implementation research. Use of the CS in clinical settings is limited because integration of standardized patient-reported health information is not yet common practice in most settings or in electronic health records. The CS continues to demonstrate validity as a non-condition-specific, population-based tool that addresses many of the limits of condition or diagnosis checklists, including the relatively low prevalence of many individual conditions and substantial within-diagnosis variations and across-diagnoses similarities in health service needs, functioning, and quality of care. |
Persistent associations between maternal prenatal exposure to phthalates on child IQ at age 7 years
Factor-Litvak P , Insel B , Calafat AM , Liu X , Perera F , Rauh VA , Whyatt RM . PLoS One 2014 9 (12) e114003 BACKGROUND: Prior research reports inverse associations between maternal prenatal urinary phthalate metabolite concentrations and mental and motor development in preschoolers. No study has evaluated whether these associations persist into school age. METHODS: In a follow-up of 328 inner-city mothers and their children, we measured prenatal urinary metabolites of di-n-butyl phthalate (DnBP), butylbenzyl phthalate (BBzP), di-isobutyl phthalate (DiBP), di-2-ethylhexyl phthalate, and diethyl phthalate in late pregnancy. The Wechsler Intelligence Scale for Children, 4th edition, which evaluates four areas of cognitive function associated with overall intelligence quotient (IQ), was administered at child age 7 years. RESULTS: Child full-scale IQ was inversely associated with prenatal urinary metabolite concentrations of DnBP and DiBP: b = -2.69 (95% confidence interval [CI] = -4.33, -1.05) and b = -2.69 (95% CI = -4.22, -1.16) per log unit increase. Among children of mothers with the highest versus lowest quartile DnBP and DiBP metabolite concentrations, IQ was 6.7 (95% CI = 1.9, 11.4) and 7.6 (95% CI = 3.2, 12.1) points lower, respectively. Associations were unchanged after control for cognition at age 3 years. Significant inverse associations were also seen between maternal prenatal metabolite concentrations of DnBP and DiBP and child processing speed, perceptual reasoning and working memory; DiBP and child verbal comprehension; and BBzP and child perceptual reasoning. CONCLUSION: Maternal prenatal urinary metabolite concentrations of DnBP and DiBP measured in late pregnancy are associated with deficits in children's intellectual development at age 7 years. Because phthalate exposures are ubiquitous and the concentrations seen here are within the range previously observed among general populations, the results are of public health significance. |
Post-disaster health indicators for pregnant and postpartum women and infants
Zotti ME , Williams AM , Wako E . Matern Child Health J 2014 19 (6) 1179-88 United States (U.S.) pregnant and postpartum (P/PP) women and their infants may be particularly vulnerable to effects from disasters. In an effort to guide post-disaster assessment and surveillance, we initiated a collaborative process with nationwide expert partners to identify post-disaster epidemiologic indicators for these at-risk groups. This 12-month process began with conversations with partners at two national conferences to identify critical topics for P/PP women and infants affected by disaster. Next, we hosted teleconferences with a 23-member Indicator Development Working Group (IDWG) to review and prioritize the topics. We then divided the IDWG into three population subgroups (pregnant women, postpartum women, and infants), each of which conducted at least three teleconferences to discuss the proposed topics and identify or develop critical indicators, measures for each indicator, and relevant questions for each measure for its respective population subgroup. Lastly, we hosted a full IDWG teleconference to review and approve the indicators, measures, and questions. The final 25 indicators and measures with questions (available online) are organized by population subgroup: pregnant women (indicators = 9; measures = 24); postpartum women (indicators = 10; measures = 36); and infants (indicators = 6; measures = 30). We encourage our partners in disaster-affected areas to test these indicators and measures for relevancy and completeness. In post-disaster surveillance, we envision that users will not use all indicators and measures but will select ones appropriate for their setting. These proposed indicators and measures promote uniformity of measurement of disaster effects among U.S. P/PP women and their infants and assist public health practitioners to identify their post-disaster needs. |
Preconception antiretroviral therapy and birth defects: what is needed?
Bulterys M , Berry RJ , Watts DH . AIDS 2014 28 (18) 2777-2780 Prevention of maternal-to-child transmission (MTCT) of HIV remains a priority, as globally, approximately 700 infants are newly infected with HIV each day [1]. Remarkable progress has been made in the past decade in reducing MTCT, with one million infants prevented from acquiring HIV between 2003 and 2013 because of maternal and infant antiretroviral prophylaxis [2]. However, limited safety data currently exist on potential adverse outcomes among HIV-infected pregnant women and their infants after exposure to combination antiretroviral therapy (cART) before and throughout pregnancy [3]. | In the United States and other high-resource countries, pregnant HIV-infected women usually begin receiving cART early in pregnancy and, as a result, MTCT of HIV has been nearly eliminated in the past decade, with rates decreasing from approximately 25% to 1% currently [4,5]. The advent of more potent and better-tolerated antiretroviral drugs has led to a strong push toward earlier initiation of cART, including amongst women of reproductive age [6–8]. Thus, an increasing number of HIV-infected women are already taking cART when conception occurs. In resource-limited settings, initiating cART has previously been recommended only for pregnant women with more advanced HIV disease (i.e. CD4+ cell count below 200 cells/μl or symptomatic HIV with CD4+ cell count <350 cells/μl) [9]. However, the 2013 WHO consolidated guidelines recommend that all HIV-infected pregnant women initiate cART regardless of CD4+ cell count, and if breastfeeding, continue cART throughout breastfeeding [10,11]. Women may either continue lifelong treatment regardless of clinical status or stop if they do not yet meet country-specific treatment eligibility criteria (‘option B’). An increasing number of countries are in the process of adopting lifelong treatment (the so-called ‘option B+’) for all pregnant women found to be HIV-infected [12–14]. |
Prenatal exposure to polybrominated diphenyl ethers and polyfluoroalkyl chemicals and infant neurobehavior
Donauer S , Chen A , Xu Y , Calafat AM , Sjodin A , Yolton K . J Pediatr 2014 166 (3) 736-42 OBJECTIVE: To assess the impact of prenatal exposure to polybrominated diphenyl ethers (PBDEs) and polyfluoroalkyl chemicals (PFCs) on early infant neurobehavior. STUDY DESIGN: In a cohort of 349 mother/infant pairs, we measured maternal serum concentrations during pregnancy of PBDEs, including BDE-47 and other related congeners, as well as 2 common PFCs, perfluorooctanoic acid (PFOA) and perfluorooctane sulfonic acid. When the infants were 5 weeks of age, we measured their neurobehavior by using the Neonatal Intensive Care Unit Network Neurobehavioral Scale (NNNS). RESULTS: Neither PBDE nor PFC exposures during gestation were associated with the 11 individual NNNS outcomes included in our study; however, when we used latent profile analysis to categorize infants into neurobehavioral profiles based on performance on the NNNS (social/easygoing, high arousal/difficult, or hypotonic), a 10-fold increase in prenatal PFOA concentrations significantly increased the odds of being categorized as hypotonic compared with social/easygoing (aOR 3.79; 95% CI 1.1-12.8). CONCLUSIONS: Infants of mothers with greater serum concentrations of PFOA during pregnancy were more likely to be categorized as hypotonic. No association between PBDE concentrations and hypotonia was found. Additional studies should further investigate possible associations of prenatal PFC exposure and muscle tone in infants and children. |
Effect of maternal multiple micronutrient vs iron-folic acid supplementation on infant mortality and adverse birth outcomes in rural Bangladesh: the JiVitA-3 randomized trial
West KP Jr , Shamim AA , Mehra S , Labrique AB , Ali H , Shaikh S , Klemm RD , Wu LS , Mitra M , Haque R , Hanif AA , Massie AB , Merrill RD , Schulze KJ , Christian P . JAMA 2014 312 (24) 2649-58 IMPORTANCE: Maternal micronutrient deficiencies may adversely affect fetal and infant health, yet there is insufficient evidence of effects on these outcomes to guide antenatal micronutrient supplementation in South Asia. OBJECTIVE: To assess effects of antenatal multiple micronutrient vs iron-folic acid supplementation on 6-month infant mortality and adverse birth outcomes. DESIGN, SETTING, AND PARTICIPANTS: Cluster randomized, double-masked trial in Bangladesh, with pregnancy surveillance starting December 4, 2007, and recruitment beginning January 11, 2008. Six-month infant follow-up ended August 30, 2012. Surveillance included 127,282 women; 44,567 became pregnant, were included in the analysis, and delivered 28,516 live-born infants. Median gestation at enrollment was 9 weeks (interquartile range, 7-12). INTERVENTIONS: Women were provided supplements containing 15 micronutrients or iron-folic acid alone, taken daily from early pregnancy to 12 weeks postpartum. MAIN OUTCOMES AND MEASURES: The primary outcome was all-cause infant mortality through 6 months (180 days). Prespecified secondary outcomes in this analysis included stillbirth, preterm birth (<37 weeks), and low birth weight (<2500 g). To maintain overall significance of alpha = .05, a Bonferroni-corrected alpha = .01 was calculated to evaluate statistical significance of the primary and 4 secondary risk outcomes (.05/5). RESULTS: Among the 22,405 pregnancies in the multiple micronutrient group and the 22,162 pregnancies in the iron-folic acid group, there were 14,374 and 14,142 live-born infants, respectively, included in the analysis. 
At 6 months, multiple micronutrients did not significantly reduce infant mortality; there were 764 deaths (54.0 per 1000 live births) in the iron-folic acid group and 741 deaths (51.6 per 1000 live births) in the multiple micronutrient group (relative risk [RR], 0.95; 95% CI, 0.86-1.06). Multiple micronutrient supplementation resulted in a non-statistically significant reduction in stillbirths (43.1 vs 48.2 per 1000 births; RR, 0.89; 95% CI, 0.81-0.99; P = .02) and significant reductions in preterm births (18.6 vs 21.8 per 100 live births; RR, 0.85; 95% CI, 0.80-0.91; P < .001) and low birth weight (40.2 vs 45.7 per 100 live births; RR, 0.88; 95% CI, 0.85-0.91; P < .001). CONCLUSIONS AND RELEVANCE: In Bangladesh, antenatal multiple micronutrient compared with iron-folic acid supplementation did not reduce all-cause infant mortality to age 6 months but resulted in a non-statistically significant reduction in stillbirths and significant reductions in preterm births and low birth weight. TRIAL REGISTRATION: clinicaltrials.gov Identifier: NCT00860470. |
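The trial's multiplicity adjustment and effect estimates above follow standard formulas: a Bonferroni-corrected alpha of .05/5 = .01, and relative risks computed as ratios of event rates. A minimal sketch using the published stillbirth rates; the crude ratio of rates happens to match the reported RR of 0.89 here, but the paper's estimate and CI come from its own regression analysis, so this is an illustration, not a re-derivation:

```python
# Bonferroni correction: overall alpha of 0.05 split across 5 tested outcomes
bonferroni_alpha = 0.05 / 5  # 0.01

def relative_risk(rate_treated: float, rate_control: float) -> float:
    """Crude relative risk as a ratio of event rates (same denominator units)."""
    return rate_treated / rate_control

# Stillbirths: 43.1 vs 48.2 per 1,000 births (micronutrient vs iron-folic acid)
rr = relative_risk(43.1, 48.2)
print(round(bonferroni_alpha, 2), round(rr, 2))  # 0.01 0.89
# The reported stillbirth P = .02 exceeds the corrected alpha of .01,
# hence "non-statistically significant" despite the CI excluding 1.
```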
Inpatient and emergency room visits for adolescents and young adults with spina bifida living in South Carolina
Mann JR , Royer JA , Turk MA , McDermott S , Holland MM , Ozturk O , Hardin JW , Thibadeau JK . PM R 2014 7 (5) 499-511 OBJECTIVE: To compare emergency room (ER) and inpatient hospital (IP) use rates for people with spina bifida (SB) to peers without SB, when transition from pediatric to adult health care is likely to occur, and analyze those ER and IP rates by age, race, socioeconomic status, gender, and county type. DESIGN: A retrospective cohort study. SETTING: Secondary data analysis in South Carolina. PARTICIPANTS: We studied individuals who were between 15 and 24 years old and enrolled in the State Health Plan (SHP) or state Medicaid during the 2000-2010 study period. METHODS: Individuals with SB were identified using ICD-9 billing codes (741.0, 741.9) in SHP, Medicaid, and hospital uniform billing (UB) data. ER and IP encounters were identified using UB data. Multivariable Generalized Estimating Equation (GEE) Poisson models were estimated to compare rates of ER and IP use among the SB group to the comparison group. MAIN OUTCOME MEASUREMENTS: Total ER rate and IP rate, in addition to cause-specific rates for ambulatory care sensitive conditions (ACSC) and other condition categories. RESULTS: We found higher rates of ER and IP use in people with SB compared to the control group. Among individuals with SB, young adults (20-24 year olds) had higher rates of ER use due to all ACSC (p=.023), other ACSC (p=.04), and urinary tract infections (UTI; p=.002) compared to adolescents (15-19 year olds). CONCLUSIONS: Young adulthood is associated with increased ER use overall, as well as in specific condition categories (most notably UTI) in 15-24 year olds with SB. This association may be indicative of changing healthcare access as people with SB move from adolescent to adult health care, and/or physiologic changes during the age range studied. |
Adherence to extended postpartum antiretrovirals is associated with decreased breast milk HIV-1 transmission
Davis NL , Miller WC , Hudgens MG , Chasela CS , Sichali D , Kayira D , Nelson JA , Stringer JS , Ellington SR , Kourtis AP , Jamieson DJ , van der Horst C . AIDS 2014 28 (18) 2739-2749 OBJECTIVE: Estimate association between postpartum antiretroviral adherence and breast milk HIV-1 transmission. DESIGN: Prospective cohort study. METHODS: Mother-infant pairs were randomized after delivery to immediately begin receiving 28 weeks of either triple maternal antiretrovirals (zidovudine, lamivudine, and either nevirapine, nelfinavir, or lopinavir-ritonavir) or daily infant nevirapine as part of the Breastfeeding, Antiretrovirals, and Nutrition (BAN) study. Associations between postpartum antiretroviral adherence and rate of breast milk HIV-1 transmission were estimated using Cox models. We measured adherence over four postpartum time intervals using pill count, suspension bottle weight, and maternal self-report. Adherence was categorized and lagged by one interval. Missing adherence measures were multiply imputed. Infant HIV-1 infection was determined by DNA PCR every 2-6 weeks. The primary endpoint was infant HIV-1 infection by 38 weeks of age among infants alive and uninfected at 5 weeks. RESULTS: Analyses included 1479 mother-infant pairs and 45 transmission events. Using pill count and bottle weight information, 22-40% of mother-infant pairs at any given interval were less than 90% adherent. Having at least 90% adherence was associated with a 52% [95% confidence interval (CI) 3-76] relative reduction in the rate of breast milk HIV-1 transmission, compared with having less than 90% adherence when controlling for study arm, breastfeeding status, and maternal characteristics. Complete case analysis rendered similar results (n = 501; relative reduction 59%, 95% CI 6-82). CONCLUSION: Nonadherence to extended postpartum antiretroviral regimens in 'real world' settings is likely to be higher than that seen in BAN. 
Identifying mothers with difficulty adhering to antiretrovirals, and developing effective adherence interventions, will help maximize benefits of antiretroviral provision throughout breastfeeding. |
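The BAN analysis above dichotomizes per-interval adherence at 90% and lags it by one interval, so that transmission risk in interval t is modeled against adherence measured in interval t-1. A minimal sketch of that preprocessing step is shown below; the function name and data layout are assumptions for illustration, not the study's code.

```python
def lag_adherence(adherence_by_interval, threshold=0.90):
    """Dichotomize per-interval adherence at the given threshold and lag
    the resulting categories by one interval, as an illustrative version
    of a lagged time-varying covariate for a Cox model."""
    categories = [a >= threshold for a in adherence_by_interval]
    # The first interval has no prior measurement; mark it missing (None)
    return [None] + categories[:-1]

# e.g. adherence of 95%, 80%, 92%, 88% over four postpartum intervals
lagged = lag_adherence([0.95, 0.80, 0.92, 0.88])
```

Lagging the exposure this way ensures adherence is measured before, not during, the interval in which an infant infection could occur.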
Cross-sectional study of cytomegalovirus shedding and immunological markers among seropositive children and their mothers
Stowell JD , Mask K , Amin M , Clark R , Levis D , Hendley W , Lanzieri TM , Dollard SC , Cannon MJ . BMC Infect Dis 2014 14 (568) 568 BACKGROUND: Congenital cytomegalovirus (CMV) is the leading infectious cause of birth defects in the United States. To better understand factors that may influence CMV transmission risk, we compared viral and immunological factors in healthy children and their mothers. METHODS: We screened for CMV IgG antibodies in a convenience sample of 161 children aged 0-47 months from the Atlanta, Georgia metropolitan area, along with 32 mothers of children who screened CMV-seropositive. We assessed CMV shedding via PCR using saliva collected with oral swabs (children and mothers) and urine collected from diapers using filter paper inserts (children only). RESULTS: CMV IgG was present in 31% (50/161) of the children. Half (25/50) of seropositive children were shedding in at least one fluid. The proportion of seropositive children who shed in saliva was 100% (8/8) among the 4-12 month-olds, 64% (9/14) among 13-24 month-olds, and 40% (6/15) among 25-47 month-olds (P for trend=0.003). Seropositive mothers had a lower proportion of saliva shedding (21% [6/29]) than children (P<0.001). Among children who were shedding CMV, viral loads in saliva were significantly higher in younger children (P<0.001); on average, the saliva viral load of infants (i.e., <12 months) was approximately 300 times that of two year-olds (i.e., 24-35 months). Median CMV viral loads were similar in children's saliva and urine but were 10-50 times higher (P<0.001) than the median viral load of the mothers' saliva. However, very high viral loads (> one million copies/mL) were only found in children's saliva (31% of those shedding); children's urine and mothers' saliva specimens all had fewer than 100,000 copies/mL. Low IgG avidity, a marker of primary infection, was associated with younger age (p=0.03), higher viral loads in saliva (p=0.02), and lower antibody titers (p=0.005). 
CONCLUSIONS: Young CMV-seropositive children, especially those less than one year old, may present high-risk CMV exposures to pregnant women, particularly via saliva, though further research is needed to determine whether this finding can be generalized across racial or other demographic strata. |
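The declining proportion of saliva shedders across age groups (8/8, 9/14, 6/15) is the pattern a test for trend in proportions evaluates. A minimal sketch using a Cochran-Armitage trend test with equally spaced scores follows; the paper does not state which trend test was used, so this is illustrative only.

```python
import math

def cochran_armitage(cases, totals, scores=None):
    """Normal-approximation Cochran-Armitage test for trend in
    proportions across ordered groups; returns (z, one-sided p)."""
    if scores is None:
        scores = list(range(len(cases)))
    N, R = sum(totals), sum(cases)
    t = sum(s * r for s, r in zip(scores, cases))
    e = R * sum(s * n for s, n in zip(scores, totals)) / N
    var = (R * (N - R) / (N * (N - 1))) * (
        sum(n * s * s for s, n in zip(scores, totals))
        - sum(s * n for s, n in zip(scores, totals)) ** 2 / N)
    z = (t - e) / math.sqrt(var)
    p_one_sided = 0.5 * math.erfc(abs(z) / math.sqrt(2))
    return z, p_one_sided

# Saliva shedding among seropositive children: 4-12, 13-24, 25-47 months
z, p = cochran_armitage([8, 9, 6], [8, 14, 15])
```

On these counts the one-sided p is near the reported P for trend of 0.003, consistent with a monotone decline in shedding with age.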
Effectiveness evaluation of the food fortification program of Costa Rica: impact on anemia prevalence and hemoglobin concentrations in women and children
Martorell R , Ascencio M , Tacsan L , Alfaro T , Young MF , Addo OY , Dary O , Flores-Ayala R . Am J Clin Nutr 2015 101 (1) 210-7 BACKGROUND: Food fortification is one approach for addressing anemia, but information on program effectiveness is limited. OBJECTIVE: We evaluated the impact of Costa Rica's fortification program on anemia in women aged 15-45 y and children aged 1-7 y. DESIGN: Reduced iron, an ineffective fortificant, was replaced by ferrous fumarate in wheat flour in 2002, and ferrous bisglycinate was added to maize flour in 1999 and to liquid and powdered milk in 2001. We used a one-group pretest-posttest design and national survey data from 1996 (baseline; 910 women, 965 children) and 2008-2009 (endline; 863 women, 403 children) to assess changes in iron deficiency (children only) and anemia. Data were also available for sentinel sites (1 urban, 1 rural) for 1999-2000 (405 women, 404 children) and 2008-2009 (474 women, 195 children), including 24-h recall data in children. Monitoring of fortification levels was routine. RESULTS: Foods were fortified as mandated. Fortification provided about one-half the estimated average requirement for iron in children, mostly and equally through wheat flour and milk. Anemia was reduced in children and women in national and sentinel site comparisons. At the national level, anemia declined in children from 19.3% (95% CI: 16.8%, 21.8%) to 4.0% (95% CI: 2.1%, 5.9%) and in women from 18.4% (95% CI: 15.8%, 20.9%) to 10.2% (95% CI: 8.2%, 12.2%). In children, iron deficiency declined from 26.9% (95% CI: 21.1%, 32.7%) to 6.8% (95% CI: 4.2%, 9.3%), and iron deficiency anemia, which was 6.2% (95% CI: 3.0%, 9.3%) at baseline, could no longer be detected at the endline. CONCLUSIONS: A plausible impact pathway suggests that fortification improved iron status and reduced anemia. Although unlikely in the Costa Rican context, other explanations cannot be excluded in a pre/post comparison. |
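The prevalence estimates above carry 95% confidence intervals that can be reproduced with a simple binomial (Wald) formula. A minimal sketch, using the baseline child anemia figure (19.3% of 965 children surveyed); whether the study applied survey-design adjustments is not stated, so this is an approximation.

```python
import math

def wald_ci(p, n, z=1.96):
    """Wald 95% CI for a prevalence estimate p from a sample of size n."""
    se = math.sqrt(p * (1 - p) / n)
    return p - z * se, p + z * se

# Baseline anemia in children: 19.3% of 965 surveyed
lo, hi = wald_ci(0.193, 965)
```

This yields roughly (16.8%, 21.8%), matching the reported interval to rounding.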
A water availability intervention in New York City public schools: influence on youths' water and milk behaviors
Elbel B , Mijanovich T , Abrams C , Cantor J , Dunn L , Nonas C , Cappola K , Onufrak S , Park S . Am J Public Health 2014 105 (2) e1-e8 OBJECTIVES: We determined the influence of "water jets" on observed water and milk taking and self-reported fluid consumption in New York City public schools. METHODS: From 2010 to 2011, before and 3 months after water jet installation in 9 schools, we observed water and milk taking in cafeterias (mean 1000 students per school) and surveyed students in grades 5, 8, and 11 (n = 2899) in the 9 schools that received water jets and 10 schools that did not. We performed an observation 1 year after implementation (2011-2012) with a subset of schools. We also interviewed cafeteria workers regarding the intervention. RESULTS: Three months after implementation we observed a 3-fold increase in water taking (increase of 21.63 events per 100 students; P < .001) and a much smaller decline in milk taking (-6.73 events per 100 students; P = .012), relative to comparison schools. At 1 year, relative to baseline, there was a similar increase in water taking and no decrease in milk taking. Cafeteria workers reported that the water jets were simple to clean and operate. CONCLUSIONS: An environmental intervention in New York City public schools increased water taking and was simple to implement. |
Barriers to effective implementation of programs for the prevention of workplace violence in hospitals
Blando J , Ridenour M , Hartley D , Casteel C . Online J Issues Nurs 2015 20 (1) Effective workplace violence (WPV) prevention programs are essential, yet challenging to implement in healthcare. The aim of this study was to identify major barriers to implementation of effective violence prevention programs. After reviewing the related literature, the authors describe their research methods and analysis and report the following seven themes as major barriers to effective implementation of workplace violence programs: a lack of action despite reporting; varying perceptions of violence; bullying; profit-driven management models; lack of management accountability; a focus on customer service; and weak social service and law enforcement approaches to mentally ill patients. The authors discuss their findings in light of previous studies and experiences and offer suggestions for decreasing WPV in healthcare settings. They conclude that although many of these challenges to effective implementation of workplace violence programs are both within the program itself and relate to broader industry and societal issues, creative innovations can address these issues and improve WPV prevention programs. |
Selecting models for a respiratory protection program: what can we learn from the scientific literature?
Shaffer RE , Janssen LL . Am J Infect Control 2014 43 (2) 127-32 BACKGROUND: An unbiased source of comparable respirator performance data would be helpful in setting up a hospital respiratory protection program. METHODS: The scientific literature was examined to assess the extent to which performance data (respirator fit, comfort and usability) from N95 filtering facepiece respirator (FFR) models are available to assist with FFR model selection and procurement decisions. RESULTS: Ten studies were identified that met the search criteria for fit, whereas 5 studies met the criteria for comfort and usability. CONCLUSION: Analysis of these studies indicated that it is difficult to directly use the scientific literature to inform the FFR selection process because of differences in study populations, methodologies, and other factors. Although there does not appear to be a single best fitting FFR, studies demonstrate that fit testing programs can be designed to successfully fit nearly all workers with existing products. Comfort and usability are difficult to quantify. Among the studies found, no significant differences were noted. |
Stop gambling with your hearing
Murphy WJ , Grantham MA . Int J Audiol 2014 54 Suppl 1 1-2 In 2014, the National Hearing Conservation Association took its chances and held its 39th annual conference, titled Stop Gambling with your Hearing, in Las Vegas. The authors who contributed to this issue of the International Journal of Audiology demonstrated that they could be relied upon to deliver a royal flush when it comes to advancing research and knowledge for worker hearing loss prevention. The papers that we were privileged to shepherd through the review process share common themes: epidemiologic and workplace assessments of hearing, and new methods to better assess hearing and the effects of wearing hearing protection on speech intelligibility and localization. | Hearing loss prevention starts with education and testing. The study by Flamme et al. has expanded the work from last year’s supplement, considering how audiometric testing may be changed. Will pure-tone audiometry become passé? The potential to integrate hearing testing with hearing protector fit-testing seems natural, and combining testing with training in the use of personal protection technology makes sense. Without question, occupational hearing conservation programs must begin with engineering noise controls to reduce exposures for at-risk workers. Cantley et al. explored the relationships among hearing loss, tinnitus, and workplace injury. They found an increased risk of acute injury among workers with tinnitus and high-frequency hearing loss. Although their research does not draw strong correlations between tinnitus and increased incidence of workplace injury, the communication needs of hearing-impaired workers cannot be overlooked. Helleman et al. considered the effects of interrupted exposures to loud music at nightclubs, often cited as a potential cause of hearing loss. Their research suggests that quiet zones within clubs had little effect on the hearing of the subjects they evaluated. 
However, providing club patrons a place to get out of the noise was still thought to be important because high noise levels present a risk in and of themselves. Hong et al. investigated the relationships between occupational exposures and hearing among elderly Latino Americans. They concluded that reducing occupational exposure to noise and chemicals will have a positive impact on hearing later in life. |
Patient-physician communication about work-related asthma: what we do and do not know
Mazurek JM , White GE , Moorman JE , Storey E . Ann Allergy Asthma Immunol 2014 114 (2) 97-102 BACKGROUND: Effective patient-physician communication is the key component of the patient-physician relationship. OBJECTIVE: To assess the proportion of ever-employed adults with current asthma who talked about asthma associated with work with their physician or other health professional and to identify factors associated with this communication. METHODS: The 2006 to 2010 Behavioral Risk Factor Surveillance System Asthma Call-Back Survey data from 40 states and the District of Columbia for ever-employed adults (≥18 years old) with current asthma (N = 50,433) were examined. Multivariable logistic regression analyses were conducted to identify factors associated with communication with a health professional about asthma and work. RESULTS: Among ever-employed adults with current asthma, 9.1% were ever told by a physician that their asthma was related to any job they ever had and 11.7% ever told a physician or other health professional that this was the case. When responses to the 2 questions were combined, the proportion of those who communicated with a health professional about asthma and work was 14.7%. Communication with a health professional about asthma and work was associated with age, race or ethnicity, employment, education, income, insurance, and urgent treatment for worsening asthma. CONCLUSION: A small proportion of patients with asthma might communicate with a health professional about asthma associated with work. Future studies should examine whether patients with asthma ever discussed with a health professional the possibility that their asthma might be related to work to provide information on the frequency of patient-clinician communication about asthma related to work. |
Positive psychological factors are associated with lower PTSD symptoms among police officers: post Hurricane Katrina
McCanlies EC , Mnatsakanova A , Andrew ME , Burchfiel CM , Violanti JM . Stress Health 2014 30 (5) 405-15 Following Hurricane Katrina, police officers in the New Orleans geographic area faced a number of challenges. This cross-sectional study examined the association between resilience, satisfaction with life, gratitude, posttraumatic growth, and symptoms of posttraumatic stress disorder in 84 male and 30 female police officers from Louisiana. Protective factors were measured using the Connor-Davidson Resilience scale, Satisfaction with Life Scale, the Gratitude Questionnaire, and the Posttraumatic Growth inventory. Symptoms of posttraumatic stress disorder were measured using the Posttraumatic Stress Disorder Checklist-Civilian (PCL-C). Potential associations were measured using linear regression and analysis of variance. Models were adjusted for age, sex, race, education, and alcohol. Mean PCL-C symptoms were 29.5 +/- 14.5 for females and 27.8 +/- 12.1 for males. Adjusted mean levels of PCL-C symptoms significantly decreased as quartiles of resilience (p < .001), satisfaction with life (p < .001), and gratitude (p < .001) increased. In contrast, PCL-C symptoms were not associated with posttraumatic growth in this sample. These results indicate that positive factors such as resilience, satisfaction with life, and gratitude may help mitigate symptoms of posttraumatic stress disorder. To further explore these relationships, longitudinal follow-up in a larger population would be of interest. Copyright (c) 2014 John Wiley & Sons, Ltd. |
Post-disaster stressful life events and WTC-related posttraumatic stress, depressive symptoms, and overall functioning among responders to the World Trade Center disaster
Zvolensky MJ , Kotov R , Schechter CB , Gonzalez A , Vujanovic A , Pietrzak RH , Crane M , Kaplan J , Moline J , Southwick SM , Feder A , Udasin I , Reissman DB , Luft BJ . J Psychiatr Res 2014 61 97-105 BACKGROUND: The current study examined contributions of post-disaster stressful life events in relation to the maintenance of WTC-related posttraumatic stress, depressive symptoms, and overall functioning among rescue, recovery, and clean-up workers who responded to the September 11, 2001 World Trade Center (WTC) terrorist attacks. METHODS: Participants were 18,896 WTC responders, including 8466 police officers and 10,430 non-traditional responders (85.8% male; 86.4% Caucasian; mean age = 39.5 years, SD = 8.8) participating in the WTC Health Program who completed an initial examination between July, 2002 and April, 2010 and who were reassessed, on average, 2.5 years later. RESULTS: Path analyses were conducted to evaluate contributions of life events to the maintenance of WTC-related posttraumatic stress, depressive symptoms, and overall functioning. These analyses were stratified by police and non-traditional responder groups and adjusted for age, sex, time from 9/11 to initial visit, WTC exposures (three WTC contextual exposures: co-worker, friend, or a relative died in the disaster; co-worker, friend, or a relative was injured in the disaster; and responder was exposed to the dust cloud on 9/11), and interval from initial to first follow-up visit. In both groups, WTC-related posttraumatic stress, depressive symptoms, and overall functioning were stable over the follow-up period. WTC exposures were related to these three outcomes at the initial assessment. WTC-related posttraumatic stress, depressive symptoms, and overall functioning at the initial assessment each predicted the occurrence of post-disaster stressful life events, as measured by the Disaster Supplement of the Diagnostic Interview Schedule. 
Post-disaster stressful life events, in turn, were associated with subsequent mental health, indicating partial mediation of the stability of observed mental health. CONCLUSIONS: The present findings suggest a dynamic interplay between exposure, post-disaster stressful life events, and WTC-related posttraumatic stress, depressive symptoms, and overall functioning among WTC disaster responders. |
Potential contribution of work-related psychosocial stress to the development of cardiovascular disease and type II diabetes: a brief review
Krajnak KM . Environ Health Insights 2014 8 41-5 Two of the major causes of death worldwide are cardiovascular disease and Type II diabetes. Although death due to these diseases is assessed separately, the physiological process that is attributed to the development of cardiovascular disease can be linked to the development of Type II diabetes and the impact that this disease has on the cardiovascular system. Physiological, genetic, and personal factors contribute to the development of both these disorders. It has also been hypothesized that work-related stress may contribute to the development of Type II diabetes and cardiovascular disease. This review summarizes some of the studies examining the role of work-related stress on the development of these chronic disorders. Because women may be more susceptible to the physiological effects of work-related stress, the papers cited in this review focus on studies that examined the difference in responses of men or women to work-related stress or on studies that focused on the effects of stress on women alone. Based on the papers summarized, it is concluded that (1) work-related stress may directly contribute to the development of cardiovascular disease by inducing increases in blood pressure and changes in heart rate that have negative consequences on functioning of the cardiovascular system; (2) workers reporting increased levels of stress may display an increased risk of Type II diabetes because they adopt poor health habits (ie, increased level of smoking, inactivity etc), which in turn contribute to the development of cardiovascular problems; and (3) women in high demand and low-control occupations report an increased level of stress at work, and thus may be at a greater risk of negative health consequences. |
Examining pre-migration health among Filipino nurses
de Castro AB , Gee G , Fujishiro K , Rue T . J Immigr Minor Health 2014 17 (6) 1670-8 The healthy immigrant hypothesis asserts that immigrants arrive in the receiving country healthier than same race/ethnic counterparts born there. Contemporary research, however, has not evaluated pre-migration health among migrants, nor has it explicitly considered comparisons with non-migrants in the country of origin. Pre-migration health was examined among 621 Filipino nurses, including self-reported physical health, mental health, health behaviors, and social stress. Measures were compared by intention to migrate and also tested as predictors of actual migration using time-to-event analysis. Nurses intending to migrate had a higher proportion of depression and reported higher general perceived stress compared to those not intending to migrate. Predictors of actual migration included age, mentally unhealthy days, social strain, and social support. Physical health and health behavior measures had no association with migration intention or actual migration. Findings suggest that, relative to those not intending to migrate, nurses intending to migrate have worse mental health status and greater social stress, and do not have a physical health advantage. Future research must span the pre- to post-migration continuum to better understand the impact of moving from one country to another on health and well-being. |
Developing a pooled job physical exposure data set from multiple independent studies: an example of a consortium study of carpal tunnel syndrome
Bao SS , Kapellusch JM , Garg A , Silverstein BA , Harris-Adamson C , Burt SE , Dale AM , Evanoff BA , Gerr FE , Hegmann KT , Merlino LA , Thiese MS , Rempel DM . Occup Environ Med 2014 72 (2) 130-7 BACKGROUND: Six research groups independently conducted prospective studies of carpal tunnel syndrome (CTS) incidence in 54 US workplaces in 10 US States. Physical exposure variables were collected by all research groups at the individual worker level. Data from these research groups were pooled to increase the exposure spectrum and statistical power. OBJECTIVE: This paper provides a detailed description of the characteristics of the pooled physical exposure variables and the source data information from the individual research studies. METHODS: Physical exposure data were inspected and prepared by each of the individual research studies according to detailed instructions provided by an exposure subcommittee of the research consortium. Descriptive analyses were performed on the pooled physical exposure data set. Correlation analyses were performed among exposure variables estimating similar exposure aspects. RESULTS: At baseline, there were a total of 3010 participants in the pooled physical exposure data set. Overall, the pooled data meaningfully increased the spectra of most exposure variables. The increased spectra were due to the wider range in exposure data of different jobs provided by the research studies. The correlations between variables estimating similar exposure aspects showed different patterns among data provided by the research studies. CONCLUSIONS: The increased spectra of the physical exposure variables among the data pooled likely improved the possibility of detecting potential associations between these physical exposure variables and CTS incidence. It is also recognised that methods need to be developed for general use by all researchers for standardisation of physical exposure variable definition, data collection, processing and reduction. |
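The consortium's correlation analyses compare exposure variables intended to capture similar aspects of physical exposure; the core computation is a Pearson correlation. A minimal, self-contained sketch is below; the actual analyses presumably used statistical software, and the data here are invented for illustration.

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two equally long exposure series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# e.g. two hypothetical variables meant to capture the same aspect
# of hand repetition, measured on the same five jobs
r = pearson_r([1, 2, 3, 4, 5], [2, 4, 5, 4, 6])
```

Low correlations between variables that are supposed to measure the same exposure aspect, as the abstract notes, signal that harmonized definitions and collection methods are needed before pooling.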
Serological markers for monitoring historical changes in malaria transmission intensity in a highly endemic region of Western Kenya, 1994-2009
Wong J , Hamel MJ , Drakeley CJ , Kariuki S , Shi YP , Lal AA , Nahlen BL , Bloland PB , Lindblade KA , Were V , Otieno K , Otieno P , Odero C , Slutsker L , Vulule JM , Gimnig JE. Malar J 2014 13 (451) 451 BACKGROUND: Monitoring local malaria transmission intensity is essential for planning evidence-based control strategies and evaluating their impact over time. Anti-malarial antibodies provide information on cumulative exposure and have proven useful, in areas where transmission has dropped to low sustained levels, for retrospectively reconstructing the timing and magnitude of transmission reduction. It is unclear whether serological markers are also informative in high transmission settings, where interventions may reduce transmission, but to a level where considerable exposure continues. METHODS: This study was conducted through ongoing KEMRI and CDC collaboration. Asembo, in Western Kenya, is an area where intense malaria transmission was drastically reduced during a 1997-1999 community-randomized, controlled insecticide-treated net (ITN) trial. Two approaches were taken to reconstruct malaria transmission history during the period from 1994 to 2009. First, point measurements were calculated for seroprevalence, mean antibody titre, and seroconversion rate (SCR) against three Plasmodium falciparum antigens (AMA-1, MSP-1(19), and CSP) at five time points for comparison against traditional malaria indices (parasite prevalence and entomological inoculation rate). Second, within individual post-ITN years, age-stratified seroprevalence data were analysed retrospectively for an abrupt drop in SCR by fitting alternative reversible catalytic conversion models that allowed for change in SCR. RESULTS: Generally, point measurements of seroprevalence, antibody titres and SCR produced consistent patterns indicating that a gradual but substantial drop in malaria transmission (46-70%) occurred from 1994 to 2007, followed by a marginal increase beginning in 2008 or 2009. 
In particular, proportionate changes in seroprevalence and SCR point estimates (relative to 1994 baseline values) for AMA-1 and CSP, but not MSP-1(19), correlated closely with trends in parasite prevalence throughout the entire 15-year study period. However, retrospective analyses using datasets from 2007, 2008 and 2009 failed to detect any abrupt drop in transmission coinciding with the timing of the 1997-1999 ITN trial. CONCLUSIONS: In this highly endemic area, serological markers were useful for generating accurate point estimates of malaria transmission intensity, but not for retrospective analysis of historical changes. Further investigation, including exploration of different malaria antigens and/or alternative models of population seroconversion, may yield serological tools that are more informative in high transmission settings. |
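The reversible catalytic model used in the age-stratified analysis treats individuals as seroconverting at rate lambda (the SCR) and seroreverting at rate rho, so expected seroprevalence at age a is (lambda/(lambda+rho))*(1 - exp(-(lambda+rho)*a)). A minimal sketch of the model curve follows; the parameter values are illustrative, not estimates from the study.

```python
import math

def seroprevalence(age, scr, srr):
    """Reversible catalytic model: expected proportion seropositive at a
    given age, with seroconversion rate `scr` (lambda) and seroreversion
    rate `srr` (rho), both per year of age."""
    return (scr / (scr + srr)) * (1 - math.exp(-(scr + srr) * age))

# Illustrative parameters only (not estimates from the study)
curve = [seroprevalence(a, scr=0.3, srr=0.05) for a in (1, 5, 20, 60)]
```

Seroprevalence rises with age toward the equilibrium lambda/(lambda+rho); fitting this curve to age-stratified data, with a change point in lambda, is how an abrupt transmission drop would be detected.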
Spatial and temporal distribution of soil-transmitted helminth infection in sub-Saharan Africa: a systematic review and geostatistical meta-analysis
Karagiannis-Voules DA , Biedermann P , Ekpo UF , Garba A , Langer E , Mathieu E , Midzi N , Mwinzi P , Polderman AM , Raso G , Sacko M , Talla I , Tchuente LA , Toure S , Winkler MS , Utzinger J , Vounatsou P . Lancet Infect Dis 2014 15 (1) 74-84 BACKGROUND: Interest is growing in predictive risk mapping for neglected tropical diseases (NTDs), particularly to scale up preventive chemotherapy, surveillance, and elimination efforts. Soil-transmitted helminths (hookworm, Ascaris lumbricoides, and Trichuris trichiura) are the most widespread NTDs, but broad geographical analyses are scarce. We aimed to predict the spatial and temporal distribution of soil-transmitted helminth infections, including the number of infected people and treatment needs, across sub-Saharan Africa. METHODS: We systematically searched PubMed, Web of Knowledge, and African Journal Online from inception to Dec 31, 2013, without language restrictions, to identify georeferenced surveys. We extracted data from household surveys on sources of drinking water, sanitation, and women's level of education. Bayesian geostatistical models were used to align the data in space and estimate risk of infection with hookworm, A lumbricoides, and T trichiura over a grid of roughly 1 million pixels at a spatial resolution of 5 x 5 km. We calculated anthelmintic treatment needs on the basis of WHO guidelines (treatment of all school-aged children once per year where prevalence in this population is 20-50% or twice per year if prevalence is greater than 50%). FINDINGS: We identified 459 relevant survey reports that referenced 6040 unique locations. We estimate that the prevalence of hookworm, A lumbricoides, and T trichiura among school-aged children from 2000 onwards was 16.5%, 6.6%, and 4.4%, respectively. These estimates are between 52% and 74% lower than those in surveys done before 2000, and have become similar to values for entire communities. We estimated that 126 million doses of anthelmintic treatments are required per year. 
INTERPRETATION: Patterns of soil-transmitted helminth infection in sub-Saharan Africa have changed and the prevalence of infection has declined substantially in this millennium, probably due to socioeconomic development and large-scale deworming programmes. The global control strategy should be reassessed, with emphasis given also to adults to progress towards local elimination. FUNDING: Swiss National Science Foundation and European Research Council. |
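The treatment-needs estimate above follows the WHO rule stated in the methods: school-aged children are treated once per year where prevalence in that group is 20-50%, and twice per year where it exceeds 50%. A minimal sketch of that decision rule (function name and example populations are assumed for illustration):

```python
def annual_treatment_rounds(prevalence):
    """WHO preventive-chemotherapy rule for soil-transmitted helminths:
    `prevalence` is infection prevalence among school-aged children (as
    a fraction). Returns rounds of anthelmintic treatment per year."""
    if prevalence > 0.50:
        return 2
    if prevalence >= 0.20:
        return 1
    return 0

# Hypothetical districts: (prevalence, school-aged population)
doses = sum(annual_treatment_rounds(p) * n for p, n in
            [(0.65, 1_000_000), (0.30, 2_000_000), (0.10, 500_000)])
```

Applying this rule pixel by pixel over the predicted prevalence surface, then summing over school-aged populations, yields the kind of continent-wide dose estimate the study reports.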
Global programme to eliminate lymphatic filariasis: the processes underlying programme success
Ichimori K , King JD , Engels D , Yajima A , Mikhailov A , Lammie P , Ottesen EA . PLoS Negl Trop Dis 2014 8 (12) e3328 Lymphatic filariasis (LF) is caused by filarial worms that live in the lymphatic system and commonly lead to lymphoedema, elephantiasis, and hydrocele. LF is recognized as endemic in 73 countries and territories; an estimated 1.39 billion (thousand million) people live in areas where filariasis has been endemic and is now targeted for treatment [1]. Global momentum to eliminate LF has developed over the past 15 years as a result not only of research demonstrating the value of single-dose treatment strategies and point-of-care diagnostic tools, but also of the generous donations of medicines from committed pharmaceutical companies (GlaxoSmithKline, albendazole; Merck, ivermectin; and Eisai, diethylcarbamazine [DEC]) and of the essential financial support for programme implementation from the donor community [2]. During 2011, more than 50 countries carried out LF elimination programmes, and more than 500 million people received mass treatment [1]. A principal reason for the programme's dramatic expansion and success to date has been the galvanizing of efforts of all key partners around a common policy framework created and coordinated through the World Health Organization's Global Programme to Eliminate Lymphatic Filariasis (GPELF). This report, rather than highlighting the very considerable contributions of each individual partner or even chronicling most of the specific achievements of the GPELF, instead focuses on the details of the underlying processes themselves and their importance in determining programme success. |
Adherence to malaria prophylaxis among Peace Corps Volunteers in the Africa region, 2013
Landman KZ , Tan KR , Arguin PM . Travel Med Infect Dis 2014 13 (1) 61-8 BACKGROUND: Although malaria can be prevented with prophylaxis, it is diagnosed in over 100 Africa-region Peace Corps Volunteers annually. This suggests that prophylaxis non-adherence is a problem in these non-immune travelers. METHODS: We investigated Volunteers' knowledge, attitudes, and practices regarding prophylaxis using an internet-based survey during August 19-September 30, 2013. Adherence was defined as taking doxycycline or atovaquone-proguanil daily, or taking mefloquine doses no more than 8 days apart. RESULTS: The survey was sent to 3248 Volunteers. Of 781 whose responses were analyzed, 514 (73%) reported adherence to prophylaxis. The most common reasons for non-adherence were forgetting (n = 530, 90%); fear of long-term adverse effects (LTAEs; n = 316, 54%); and experiencing adverse events that Volunteers attributed to prophylaxis (n = 297, 51%). Two hundred fourteen (27%) Volunteers reported not worrying about malaria. On multivariate analysis controlling for sex and experiencing adverse events Volunteers attributed to prophylaxis, the factor most strongly associated with non-adherence was being prescribed mefloquine (OR 5.4, 95% confidence interval 3.2-9.0). CONCLUSIONS: We found moderate adherence and a prevailing fear of LTAEs among Volunteers. Strategies to improve prophylaxis adherence may include medication reminders, increasing education about prophylaxis safety and malaria risk, and promoting prompt management of prophylaxis side effects. |
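The survey's adherence definition is effectively a small algorithm: daily dosing for doxycycline or atovaquone-proguanil, or consecutive mefloquine doses no more than 8 days apart. A minimal sketch of the mefloquine rule, using hypothetical dosing dates (the survey collected self-reports, not dose logs; function and variable names are illustrative):

```python
from datetime import date

def mefloquine_adherent(dose_dates, max_gap_days=8):
    """Adherent if no two consecutive weekly mefloquine doses
    are more than max_gap_days apart (the survey's definition)."""
    doses = sorted(dose_dates)
    return all((b - a).days <= max_gap_days
               for a, b in zip(doses, doses[1:]))

# Hypothetical dosing histories for illustration
on_time = [date(2013, 8, d) for d in (1, 8, 15, 22)]   # weekly doses
lapsed = [date(2013, 8, 1), date(2013, 8, 12)]         # 11-day gap
print(mefloquine_adherent(on_time))   # True
print(mefloquine_adherent(lapsed))    # False
```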
Complex epidemiology and zoonotic potential for Cryptosporidium suis in rural Madagascar
Bodager JR , Parsons MB , Wright PC , Rasambainarivo F , Roellig D , Xiao L , Gillespie TR . Vet Parasitol 2014 207 140-3 Cryptosporidium spp. are the most important parasitic diarrheal agents in the world, are among the top four causes of moderate-to-severe diarrheal disease in young children in developing nations, and are problematic as opportunistic co-infections with HIV. In addition, Cryptosporidium is a persistent challenge for livestock production. Despite its zoonotic potential, few studies have examined the ecology and epidemiology of this pathogen in rural systems characterized by high rates of overlap among humans, domesticated animals, and wildlife. To improve our understanding of the zoonotic potential of Cryptosporidium species in the rural tropics, we screened humans, livestock, peridomestic rodents, and wildlife using PCR-RFLP and sequencing-based approaches to distinguish species of Cryptosporidium in rural southeastern Madagascar. Multiple Cryptosporidium species/genotypes were apparent in this study system. Interestingly, C. suis was the dominant species of Cryptosporidium in the region, infecting humans (n=1), cattle (n=18), pigs (n=3), and rodents (n=1). The broad species range of C. suis and the lack of common cattle Cryptosporidium species (Cryptosporidium parvum and Cryptosporidium andersoni) in this system are unique. This report represents the fifth confirmed case of C. suis infection in humans, and the first case in Africa. Few rural human and livestock populations have been screened for Cryptosporidium using genus-specific genotyping methods. Consequently, C. suis may be more widespread in human and cattle populations than previously believed. |
Gardening activities and physical health among older adults: a review of the evidence
Nicklett EJ , Anderson LA , Yen IH . J Appl Gerontol 2014 35 (6) 678-90 Few studies have examined the health-related consequences of gardening among older adults. This scoping review summarizes and characterizes current research that examines the relationship between physical health and participation in planned gardening activities, including establishing, maintaining, or caring for plants. Six databases were searched. Eligible studies were published between 2000 and 2013, were published in English, and assessed different aspects of physical health (e.g., functional ability, energy expenditure, injury) for older adults who had participated in a planned gardening activity. Of the eight eligible studies identified with these criteria, four assessed energy expenditures and four assessed physical functioning. Studies assessing energy expenditures documented that the majority of gardening tasks were classified into low-to-moderate intensity physical activity. The current literature does not provide sufficient evidence of the physical functioning consequences of gardening. Future studies should consider how specific gardening interventions help older adults meet physical activity guidelines. |
Building capacity for health impact assessment: training outcomes from the United States
Schuchter J , Rutt C , Satariano WA , Seto E . Environ Impact Assess Rev 2015 50 190-195 BACKGROUND: Despite the continued growth of Health Impact Assessment (HIA) in the US, there is little research on HIA capacity-building. A comprehensive study of longer-term training outcomes may reveal opportunities for improving capacity building activities and HIA practice. METHODS: We conducted in-depth interviews with HIA trainees in the United States to assess their outcomes and needs. Using a training evaluation framework, we measured outcomes across a spectrum of reaction, learning, behavior and results. RESULTS: From 2006 to 2012, four organizations trained over 2200 people in at least 75 in-person HIA trainings in 29 states. We interviewed 48 trainees, selected both randomly and purposefully. The mean duration between training and interview was 3.4 years. Trainees reported that their training objectives were met, especially when relevant case studies were used. They established new collaborations at the trainings and maintained them. Training appeared to catalyze more holistic thinking and practice, including a range of HIA-related activities. Many trainees disseminated what they learned and engaged in components of HIA, even without dedicated funding. Going forward, trainees need assistance with quantitative methods, project management, community engagement, framing recommendations, and evaluation. CONCLUSIONS: The research revealed opportunities for a range of HIA stakeholders to refine and coordinate training resources, apply a competency framework and leverage complementary workforce development efforts, and sensitize and build the capacity of communities. |
The National Prevention Strategy: leveraging multiple sectors to improve population health
Lushniak BD , Alley DE , Ulin B , Graffunder C . Am J Public Health 2014 105 (2) e1-e3 In 2013, the Institute of Medicine reported persistent gaps between the United States and other high-income countries across multiple risk factors, diseases, and health outcomes. Large gaps also exist within the United States, and life expectancy appears to be declining in some US counties and population groups. These alarming trends cannot be explained by the availability of health care alone; rather, they reflect a complex interplay between the physical and social environment, individual health behaviors, and the health care delivery system. Achieving progress will require population-based interventions that address these factors that contribute to health. |
Non-communicable disease training for public health workers in low- and middle-income countries: lessons learned from pilot training in Tanzania
Davila EP , Suleiman Z , Mghamba J , Rolle I , Ahluwalia I , Mmbuji P , de Courten M , Bader A , Zahniser SC , Krag M , Jarrar B . Int Health 2014 7 (5) 339-47 BACKGROUND: Non-communicable diseases (NCDs) are increasing worldwide. A lack of training and experience in NCDs among public health workers is evident in low- and middle-income countries. METHODS: We describe the design and outcomes of applied training in NCD epidemiology and control piloted in Tanzania that included a 2-week interactive course and a 6-month NCD field project. Trainees (n=14 initiated; n=13 completed) were epidemiology-trained Ministry of Health or hospital staff. We evaluated the training using Kirkpatrick's evaluation model for measuring reactions, learning, behavior and results using pre- and post-tests and closed-ended and open-ended questions. RESULTS: Significant improvements in knowledge and self-reported competencies were observed. Trainees reported applying competencies at work and supervisors reported improvements in trainees' performance. Six field projects were completed; one led to staffing changes and education materials for patients with diabetes and another to the initiation of an injury surveillance system. Workplace support and mentoring were factors that facilitated the completion of projects. Follow-up of participants was difficult, limiting our evaluation of the training's outcomes. CONCLUSIONS: The applied NCD epidemiology and control training piloted in Tanzania was well received and showed improvements in knowledge, skills and self-efficacy, changes in workplace behavior, and institutional and organizational changes. Further evaluations are needed to better understand the impact of similar NCD trainings, and future trainers should ensure that trainees have mentoring and workplace support prior to participating in an applied NCD training. |
Pregnancy desire and dual method contraceptive use among people living with HIV attending clinical care in Kenya, Namibia and Tanzania
Antelman G , Medley A , Mbatia R , Pals S , Arthur G , Haberlen S , Ackers M , Elul B , Parent J , Rwebembera A , Wanjiku L , Muraguri N , Gweshe J , Mudhune S , Bachanas P . J Fam Plann Reprod Health Care 2015 41 (1) e1 AIM: To describe factors associated with pregnancy desire and dual method use among people living with HIV in clinical care in sub-Saharan Africa. DESIGN: Sexually active HIV-positive adults were enrolled in 18 HIV clinics in Kenya, Namibia and Tanzania. Demographic, clinical and reproductive health data were captured by interview and medical record abstraction. Correlates of desiring a pregnancy within the next 6 months, and dual method use [defined as consistent condom use together with a highly effective method of contraception (hormonal, intrauterine device (IUD), permanent)], among those not desiring pregnancy, were identified using logistic regression. RESULTS: Among 3375 participants (median age 37 years, 42% male, 64% on antiretroviral treatment), 565 (17%) desired a pregnancy within the next 6 months. Of those with no short-term fertility desire (n=2542), 686 (27%) reported dual method use, 250 (10%) highly effective contraceptive use only, 1332 (52%) condom use only, and 274 (11%) no protection. Respondents were more likely to desire a pregnancy if they were from Namibia and Tanzania, male, had a primary education, were married/cohabitating, and had fewer children. Factors associated with increased likelihood of dual method use included being female, being comfortable asking a partner to use a condom, and communication with a health care provider about family planning. Participants who perceived that their partner wanted a pregnancy were less likely to report dual method use. CONCLUSIONS: There was low dual method use and low use of highly effective contraception. Contraceptive protection was predominantly through condom-only use. These findings demonstrate the importance of integrating reproductive health services into routine HIV care. |
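The four protection categories in the results follow directly from the study's dual-method definition (consistent condom use together with a highly effective method). A minimal sketch of that classification, with illustrative function and input names (not from the study):

```python
def classify_protection(consistent_condom, effective_method):
    """Categorize contraceptive protection as in the study:
    dual method = consistent condom use plus a highly effective
    method (hormonal, IUD, or permanent)."""
    if consistent_condom and effective_method:
        return "dual method"
    if effective_method:
        return "highly effective only"
    if consistent_condom:
        return "condom only"
    return "no protection"

print(classify_protection(True, True))    # dual method
print(classify_protection(True, False))   # condom only
```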
Time-related increase in urinary testosterone levels and stable semen analysis parameters after bariatric surgery in men
Legro RS , Kunselman AR , Meadows JW , Kesner JS , Krieg EF , Rogers AM , Cooney RN . Reprod Biomed Online 2014 30 (2) 150-6 The aim of this prospective cohort study was to determine the time course of androgen and semen parameters in men after weight loss associated with bariatric surgery. Six men aged 18-40 years, meeting National Institutes of Health bariatric surgery guidelines, were followed between 2005 and 2008. Study visits took place at baseline, then 1, 3, 6 and 12 months after surgery. All men underwent Roux-en-Y gastric bypass (RYGB). At each visit, biometric and questionnaire data, serum and urinary specimens, and semen analyses were collected. Urinary integrated total testosterone levels increased significantly (P < 0.0001) by 3 months after surgery, and remained elevated throughout the study. Circulating testosterone levels were also higher at 1 and 6 months after surgery, compared with baseline. Serum sex hormone-binding globulin levels were significantly elevated at all time points after surgery (P < 0.01 to P = 0.02). After RYGB surgery, no significant changes occurred in urinary oestrogen metabolites (oestrone 3-glucuronide), serum oestradiol levels, serial semen parameters or male sexual function by questionnaire. A threshold of weight loss is necessary to improve male reproductive function by reversing male hypogonadism, manifested as increased testosterone levels. Further serial semen analyses showed normal ranges for most parameters despite massive weight loss. |
Association of assisted reproductive technology (ART) treatment and parental infertility diagnosis with autism in ART-conceived children
Kissin DM , Zhang Y , Boulet SL , Fountain C , Bearman P , Schieve L , Yeargin-Allsopp M , Jamieson DJ . Hum Reprod 2014 30 (2) 454-65 STUDY QUESTION: Are assisted reproductive technology (ART) treatment factors or infertility diagnoses associated with autism among ART-conceived children? SUMMARY ANSWER: Our study suggests that the incidence of autism diagnosis in ART-conceived children during the first 5 years of life was higher when intracytoplasmic sperm injection (ICSI) was used compared with conventional IVF, and lower when parents had unexplained infertility (among singletons) or tubal factor infertility (among multiples) compared with other types of infertility. WHAT IS KNOWN ALREADY: Some studies found an increased risk of autism among ART-conceived infants compared with spontaneously conceived infants. However, few studies, and none in the USA, have examined the associations between types of ART procedures and parental infertility diagnoses with autism among ART-conceived children. STUDY DESIGN, SIZE, DURATION: Population-based retrospective cohort study using linkages between National ART Surveillance System (NASS) data for 1996-2006, California Birth Certificate data for 1997-2006, and California Department of Developmental Services (DDS) Autism Caseload data for 1997-2011. PARTICIPANTS/MATERIALS, SETTING, METHODS: All live born ART-conceived infants born in California in 1997-2006 (n = 42 383) with a 5-year observation period were included in the study. We assessed the annual incidence of autism diagnosis documented in DDS, which includes information on the vast majority of persons with autism in California, and the association of autism diagnosis with ART treatment factors and infertility diagnoses. 
MAIN RESULTS AND THE ROLE OF CHANCE: Among ART-conceived singletons born in California between 1997 and 2006, the incidence of autism diagnosis remained at approximately 0.8% (P for trend 0.19) and was lower with parental diagnosis of unexplained infertility (adjusted hazard risk ratio [aHRR] 0.38; 95% confidence interval 0.15-0.94) and higher when ICSI was used (aHRR 1.65; 1.08-2.52), when compared with cases without these patient and treatment characteristics. Among ART-conceived multiples, the incidence of autism diagnosis between 1997 and 2006 remained at approximately 1.2% (P for trend 0.93) and was lower with parental diagnosis of tubal factor infertility (aHRR 0.56; 0.35-0.90) and higher when ICSI was used (aHRR 1.71; 1.10-2.66). LIMITATIONS, REASONS FOR CAUTION: Study limitations include imperfect data linkages, lack of data on embryo quality and possible underestimation of autism diagnosis cases. Limitations of the observational study design could affect the analysis by the possibility of residual confounders. Since information about ICSI use was missing for most frozen/thawed embryo transfer cycles, our findings of an association between ICSI use and autism diagnosis are generalizable only to fresh embryo transfer cycles. WIDER IMPLICATIONS OF THE FINDINGS: Our study provides additional evidence of the association between some types of ART procedures and autism diagnosis. Additional research is required to explain the increased risk of autism diagnosis with ICSI use, as well as studies on the effectiveness and safety of ICSI. STUDY FUNDING/COMPETING INTERESTS: The study was partially supported by the National Institutes of Health. The authors have no competing interests that may be relevant to the study. |
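The study's aHRRs come from adjusted survival (proportional hazards) models. As intuition only: when hazards are roughly constant over follow-up, a hazard ratio behaves like a ratio of events per person-time. A sketch with made-up counts chosen so the crude ratio equals 1.65, echoing the reported aHRR for ICSI (these are not the study's data, and a crude ratio omits the adjustment for confounders):

```python
def rate_ratio(events_a, pyears_a, events_b, pyears_b):
    """Crude incidence-rate ratio: (events per person-year in group A)
    divided by (events per person-year in group B). Approximates a
    hazard ratio only when hazards are roughly constant."""
    return (events_a / pyears_a) / (events_b / pyears_b)

# Hypothetical counts for an ICSI-exposed vs. unexposed group
print(round(rate_ratio(33, 10_000, 20, 10_000), 2))  # 1.65
```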
Complications of cesarean deliveries among HIV-infected women in the United States
Kourtis AP , Ellington S , Pazol K , Flowers L , Haddad L , Jamieson DJ . AIDS 2014 28 (17) 2609-2618 OBJECTIVE: To compare rates of complications associated with cesarean delivery in HIV-infected and HIV-uninfected women in the United States and to investigate trends in such complications across four study cycles spanning the implementation of HAART in the United States (1995-1996, 2000-2001, 2005-2006, 2010-2011). DESIGN: The Nationwide Inpatient Sample from the Healthcare Cost and Utilization Project is the largest all-payer hospital inpatient care database in the United States; when weighted to account for the complex sampling design, nationally representative estimates are derived. After restricting the study sample to women aged 15-49 years, our study sample consisted of approximately 1 090 000 cesarean delivery hospitalizations annually. METHODS: Complications associated with cesarean deliveries were categorized as infection, hemorrhage, or surgical trauma, based on groups of specific International Classification of Diseases 9th revision codes. Length of hospitalization, hospital charges, and in-hospital deaths were also examined. RESULTS: The rate of complications significantly decreased during the study periods for HIV-infected and HIV-uninfected women. However, rates of infectious complications and surgical trauma associated with cesarean deliveries remained higher among HIV-infected, compared with HIV-uninfected women in 2010-2011, as did prolonged hospital stay and in-hospital deaths. Length of hospitalization decreased over time for cesarean deliveries of HIV-infected women to a greater extent compared with HIV-uninfected women. CONCLUSION: In the United States, rates of cesarean delivery complications decreased from 1995 to 2011. However, rates of infection, surgical trauma, hospital deaths, and prolonged hospitalization are still higher among HIV-infected women. 
Clinicians should remain alert to this persistently increased risk of cesarean delivery complications among HIV-infected women. |
Trends and patterns of sexual behaviors among adolescents and adults aged 14 to 59 years, United States
Liu G , Hariri S , Bradley H , Gottlieb SL , Leichliter JS , Markowitz LE . Sex Transm Dis 2015 42 (1) 20-6 BACKGROUND: Evaluation of sexual behaviors is essential to better understand the epidemiology of sexually transmitted infections and their sequelae. METHODS: The National Health and Nutrition Examination Surveys (NHANES) is an ongoing probability sample survey of the US population. Using NHANES sexual behavior data from 1999 to 2012, we performed the following: (1) trend analyses among adults aged 25 to 59 years by 10-year birth cohorts and (2) descriptive analyses among participants aged 14 to 24 years. Sex was defined as vaginal, anal, or oral sex. RESULTS: Among adults aged 25 to 59 years, median age at sexual initiation decreased between the 1940-1949 and 1980-1989 cohorts from 17.9 to 16.2 among females (Ptrend < 0.001) and from 17.1 to 16.1 among males (Ptrend < 0.001). Median lifetime partners increased between the 1940-1949 and 1970-1979 cohorts, from 2.6 to 5.3 among females (Ptrend < 0.001) and from 6.7 to 8.8 among males (Ptrend < 0.001). The percentage of females reporting ever having a same-sex partner increased from 5.2% to 9.3% between the 1940-1949 and 1970-1979 cohorts (Ptrend < 0.001). Among participants aged 14 to 24 years, the percentage having had sex increased with age, from 12.5% among females and 13.1% among males at age 14 years to more than 75% at age 19 years for both sexes. Among sexually experienced 14- to 19-year-olds, 45.2% of females and 55.0% of males had at least 3 lifetime partners; 39.4% of females and 48.6% of males had at least 2 partners in the past year. The proportion of females aged 20 to 24 years who reported ever having a same-sex partner was 14.9%. The proportion of participants aged 14-19 or 20-24 years reporting ever having sex did not differ by survey year from 1999 to 2012 for either males or females. 
CONCLUSIONS: Sexual behaviors changed with successive birth cohorts, with more pronounced changes among females. A substantial proportion of adolescents are sexually active and have multiple partners. These data reinforce existing recommendations for sexual health education and sexually transmitted infection prevention targeting adolescents before sexual debut. |
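The birth-cohort trend analysis above rests on binning respondents into 10-year birth cohorts and summarizing each cohort, e.g. by median lifetime partners. A minimal sketch with hypothetical respondents (NHANES estimates additionally require survey weights and design variables, omitted here):

```python
from statistics import median

# Hypothetical respondents: (birth_year, lifetime_partners)
respondents = [(1944, 2), (1947, 3), (1972, 5), (1975, 6), (1978, 5)]

def cohort_medians(rows, width=10):
    """Median lifetime partners by 10-year birth cohort."""
    cohorts = {}
    for year, partners in rows:
        start = year - year % width          # e.g. 1947 -> 1940
        cohorts.setdefault(start, []).append(partners)
    return {start: median(vals) for start, vals in sorted(cohorts.items())}

print(cohort_medians(respondents))  # {1940: 2.5, 1970: 5}
```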
Estimation of death rates in US states with small subpopulations
Voulgaraki A , Wei R , Kedem B . Stat Med 2014 34 (11) 1940-52 In US states with small subpopulations, the observed mortality rates are often zero, particularly among young ages. Because in life tables, death rates are reported mostly on a log scale, zero mortality rates are problematic. To overcome the observed zero death rates problem, appropriate probability models are used. Using these models, observed zero mortality rates are replaced by the corresponding expected values. This enables logarithmic transformations and, in some cases, the fitting of the eight-parameter Heligman-Pollard model to produce mortality estimates for ages 0-130 years, a procedure illustrated in terms of mortality data from several states. |
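The eight-parameter Heligman-Pollard model mentioned above expresses the odds of death at age x as the sum of a childhood term, an accident hump, and a Gompertz-like senescent term. A sketch with illustrative (not fitted) parameter values shows why model-based expected rates sidestep the log-of-zero problem:

```python
import numpy as np

def heligman_pollard_q(x, A, B, C, D, E, F, G, H):
    """Heligman-Pollard model: the odds of death q/(1-q) at age x are
    A**((x + B)**C)                        (childhood mortality)
    + D*exp(-E*(log(x) - log(F))**2)       (accident hump near age F)
    + G*H**x                               (Gompertz-like senescence)."""
    x = np.asarray(x, dtype=float)
    odds = (A ** ((x + B) ** C)
            + D * np.exp(-E * (np.log(x) - np.log(F)) ** 2)
            + G * H ** x)
    return odds / (1.0 + odds)

# Illustrative parameter values, not fitted to any state's data
ages = np.arange(1, 101)
q = heligman_pollard_q(ages, A=5e-4, B=0.01, C=0.1,
                       D=1e-3, E=10.0, F=20.0, G=5e-5, H=1.1)
# Model-based q is strictly positive, so log(q) is always defined,
# unlike observed rates that can be exactly zero in small subpopulations.
```

Fitting the eight parameters to observed rates (e.g. by least squares) yields smooth, nonzero expected rates that can replace observed zeros before the logarithmic transformation.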
Smoke-free policies in U.S. prisons and jails: a review of the literature
Kennedy SM , Davis SP , Thorne SL . Nicotine Tob Res 2014 17 (6) 629-35 INTRODUCTION: Despite progress in limiting exposure to secondhand smoke (SHS) in the United States, little is known about the impact of smoke-free polices in prisons and jails. SHS exposure in this setting may be great, as smoking prevalence among inmates is more than three times higher than among non-incarcerated adults. To inform the implementation of smoke-free policies, this article reviews the literature on the extent, nature, and impact of smoke-free policies in U.S. prisons and jails. METHODS: We systematically searched PubMed, Embase, EconLit, and Social Services Abstracts databases. We examined studies published prior to January 2014 that described policies prohibiting smoking tobacco in adult U.S. correctional facilities. RESULTS: Twenty-seven studies met inclusion criteria. Smoke-free policies in prisons were rare in the 1980s but, by 2007, 87% prohibited smoking indoors. Policies reduced SHS exposure and a small body of evidence suggests they are associated with health benefits. We did not identify any studies documenting economic outcomes. Non-compliance with policies was documented in a small number of prisons and jails, with 20%-76% of inmates reporting smoking in violation of a policy. Despite barriers, policies were implemented successfully when access to contraband tobacco was limited and penalties were enforced. CONCLUSION: Smoke-free policies have become increasingly common in prisons and jails, but evidence suggests they are not consistently implemented. Future studies should examine the health and economic outcomes of smoke-free policies in prisons and jails. By implementing smoke-free policies, prisons and jails have an opportunity to improve the health of staff and inmates. |
State laws prohibiting sales to minors and indoor use of electronic nicotine delivery systems - United States, November 2014
Marynak K , Holmes CB , King BA , Promoff G , Bunnell R , McAfee T . MMWR Morb Mortal Wkly Rep 2014 63 (49) 1145-50 Electronic nicotine delivery systems (ENDS), including electronic cigarettes (e-cigarettes) and other devices such as electronic hookahs, electronic cigars, and vape pens, are battery-powered devices capable of delivering aerosolized nicotine and additives to the user. Experimentation with and current use of e-cigarettes have risen sharply among youths and adults in the United States. Youth access to and use of ENDS is of particular concern given the potential adverse effects of nicotine on adolescent brain development. Additionally, ENDS use in public indoor areas might passively expose bystanders (e.g., children, pregnant women, and other nontobacco users) to nicotine and other potentially harmful constituents. ENDS use could have the potential to renormalize tobacco use and complicate enforcement of smoke-free policies. State governments can regulate the sales of ENDS and their use in indoor areas where nonusers might be involuntarily exposed to secondhand aerosol. To learn the current status of state laws regulating the sales and use of ENDS, CDC assessed state laws that prohibit ENDS sales to minors and laws that include ENDS use in conventional smoking prohibitions in indoor areas of private worksites, restaurants, and bars. Findings indicate that as of November 30, 2014, 40 states prohibited ENDS sales to minors, but only three states prohibited ENDS use in private worksites, restaurants, and bars. Of the 40 states that prohibited ENDS sales to minors, 21 did not prohibit ENDS use or conventional smoking in private worksites, restaurants, and bars. Three states had no statewide laws prohibiting ENDS sales to minors and no statewide laws prohibiting ENDS use or conventional smoking in private worksites, restaurants, and bars. According to the Surgeon General, ENDS have the potential for public health harm or public health benefit. 
The possibility of public health benefit from ENDS could arise only if 1) current smokers use these devices to switch completely from combustible tobacco products and 2) the availability and use of combustible tobacco products are rapidly reduced. Therefore, when addressing potential public health harms associated with ENDS, it is important to simultaneously uphold and accelerate strategies found by the Surgeon General to prevent and reduce combustible tobacco use, including tobacco price increases, comprehensive smoke-free laws, high-impact media campaigns, barrier-free cessation treatment and services, and comprehensive statewide tobacco control programs. |
Prevalence and sociodemographic determinants of tobacco use in four countries of the World Health Organization: South-East Asia region: findings from the Global Adult Tobacco Survey
Palipudi K , Rizwan SA , Sinha DN , Andes LJ , Amarchand R , Krishnan A , Asma S . Indian J Cancer 2014 51 Suppl S24-32 INTRODUCTION: Tobacco use is a leading cause of deaths and Disability Adjusted Life Years lost worldwide, particularly in South-East Asia. Exclusive use of one form of tobacco has a different health risk profile than dual use. In order to tease out the specific profiles of mutually exclusive categories of tobacco use, we carried out this analysis. METHODS: The Global Adult Tobacco Survey (GATS) data was used to describe the profiles of three mutually exclusive tobacco use categories ("Current smoking only," "Current smokeless tobacco [SLT] use only," and "Dual use") in four World Health Organization South-East Asia Region countries, namely Bangladesh, India, Indonesia and Thailand. GATS was a nationally representative household-based survey that used a stratified multistage cluster sampling design proportional to population size. The prevalence of each form of use was described as a proportion. Logistic regression analyses were performed to calculate odds ratios (OR) with 95% confidence intervals. All analyses were weighted, accounted for the complex sampling design and were conducted using SPSS version 18. RESULTS: The prevalence of different forms of tobacco use varied across countries. Current tobacco use ranged from 27.2% in Thailand to 43.3% in Bangladesh. Exclusive smoking was more common in Indonesia (34.0%) and Thailand (23.4%) and less common in Bangladesh (16.1%) and India (8.7%). Exclusive SLT use was more common in Bangladesh (20.3%) and India (20.6%) and less common in Indonesia (0.9%) and Thailand (3.5%). Dual use of smoking and SLT was found in Bangladesh (6.8%) and India (5.3%), but was negligible in Indonesia (0.8%) and Thailand (0.4%). 
Gender, age, education and wealth had significant effects on the OR for most forms of tobacco use across all four countries, with the exceptions of SLT use in Indonesia and dual use in both Indonesia and Thailand. In general, the different forms of tobacco use increased among males and with increasing age, and decreased with higher education and wealth. The results for urban versus rural residence were mixed and frequently not significant once the other demographic factors were controlled for. CONCLUSION: This study addressed the socioeconomic disparities that underlie health inequities due to tobacco use. Tobacco control activities in these countries should take into account local cultural, social and demographic factors for successful implementation. |
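Behind the GATS prevalence figures is survey-weighted estimation; the basic building block is a weighted proportion. A minimal sketch (real GATS estimates also account for stratification and clustering, which are omitted here; the data are hypothetical):

```python
def weighted_prevalence(indicator, weights):
    """Survey-weighted share of respondents with the attribute:
    sum of weights for those with indicator == 1, over total weight."""
    num = sum(i * w for i, w in zip(indicator, weights))
    return num / sum(weights)

# Hypothetical respondents: 1 = current tobacco user, unequal weights
user = [1, 0, 1, 0]
weight = [2.0, 1.0, 1.0, 4.0]
print(weighted_prevalence(user, weight))  # 0.375
```

Note that the unweighted proportion here would be 0.5; the weights pull the estimate toward the respondents who represent more of the population.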
Illnesses and deaths among persons attending an electronic dance-music festival - New York City, 2013
Ridpath A , Driver CR , Nolan ML , Karpati A , Kass D , Paone D , Jakubowski A , Hoffman RS , Nelson LS , Kunins HV . MMWR Morb Mortal Wkly Rep 2014 63 (50) 1195-8 Outdoor electronic dance-music festivals (EDMFs) are typically summer events where attendees can dance for hours in hot temperatures. EDMFs have received increased media attention because of their growing popularity and reports of illness among attendees associated with recreational drug use. MDMA (3,4-methylenedioxymethamphetamine) is one of the drugs often used at EDMFs. MDMA causes euphoria and mental stimulation but also can cause serious adverse effects, including hyperthermia, seizures, hyponatremia, rhabdomyolysis, and multiorgan failure. In this report, MDMA and other synthetic drugs commonly used at dance festivals are referred to as "synthetic club drugs." On September 1, 2013, the New York City (NYC) Department of Health and Mental Hygiene (DOHMH) received reports of two deaths of attendees at an EDMF (festival A) held August 31-September 1 in NYC. DOHMH conducted an investigation to identify and characterize adverse events resulting in emergency department (ED) visits among festival A attendees and to determine what drugs were associated with these adverse events. The investigation identified 22 cases of adverse events; nine cases were severe, including two deaths. Twenty-one (95%) of the 22 patients had used drugs or alcohol. Of 17 patients with toxicology testing, MDMA and other compounds were identified, most frequently methylone, in 11 patients. Public health messages and strategies regarding adverse health events might reduce illnesses and deaths at EDMFs. |
Addressing the social determinants of health to reduce tobacco-related disparities
Garrett BE , Dube SR , Babb S , McAfee T . Nicotine Tob Res 2014 17 (8) 892-7 Comprehensive tobacco prevention and control efforts that include implementing smoke-free air laws, increasing tobacco prices, conducting hard-hitting mass media campaigns, and making evidence-based cessation treatments available are effective in reducing tobacco use in the general population. However, if these interventions are not implemented in an equitable manner, certain population groups may be left out, causing or exacerbating disparities in tobacco use. Disparities in tobacco use have, in part, stemmed from inequities in the way tobacco control policies and programs have been adopted and implemented to reach and impact the most vulnerable segments of the population that have the highest rates of smoking, e.g., those with lower education and incomes. Education and income are two main social determinants that negatively impact health; however, there are other related social determinants of health that must be considered for tobacco control policies to be effective in reducing tobacco-related disparities. This paper provides an overview of how tobacco control policies and programs can address key social determinants of health in order to achieve equity and eliminate disparities in tobacco prevention and control. |
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Community Health Services
- Disease Reservoirs and Vectors
- Environmental Health
- Epidemiology and Surveillance
- Food Safety
- Genetics and Genomics
- Immunity and Immunization
- Injury and Violence
- Laboratory Sciences
- Maternal and Child Health
- Nutritional Sciences
- Occupational Safety and Health
- Parasitic Diseases
- Physical Activity
- Public Health Leadership and Management
- Reproductive Health
- Social and Behavioral Sciences
- Statistics as Topic
- Substance Use and Abuse
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed: Feb 1, 2024
- Page last updated: Apr 22, 2024
- Powered by CDC PHGKB Infrastructure