Reporting of adherence to healthy lifestyle behaviors among hypertensive adults in the 50 states and the District of Columbia, 2013
Fang J , Moore L , Loustalot F , Yang Q , Ayala C . J Am Soc Hypertens 2016 10 (3) 252-262 e3 Achieving and maintaining a healthy lifestyle is an important part of hypertension management. The purpose of this study was to assess US state-level prevalence of adherence to healthy lifestyle behaviors among those with self-reported hypertension. Using 2013 data from the Behavioral Risk Factor Surveillance System, a state-based telephone survey, we examined the adherence to five healthy lifestyle behaviors related to hypertension management: having a "normal" weight, not smoking, avoiding or limiting alcohol intake, consuming the recommended amount of fruits and vegetables, and engaging in the recommended amount of physical activity. We estimated age-standardized percentages of each healthy lifestyle behavior overall and by state, as well as prevalence of all five healthy lifestyle behaviors. Overall, the prevalence of healthy lifestyle behaviors varied widely among those with self-reported hypertension: 20.5% had a normal weight, 82.3% did not smoke, 94.1% reported no or limited alcohol intake, 14.1% consumed the recommended amounts of fruits or vegetables, and 46.6% engaged in the recommended amount of physical activity. Overall, only 1.7% of adults with self-reported hypertension reported all five healthy lifestyle behaviors, with significant variation by state. Age-standardized prevalence of individuals reporting all five healthy lifestyle behaviors ranged from 0.3% in Louisiana to 3.8% in the District of Columbia. In conclusion, adherence to healthy lifestyle behaviors varied among those with hypertension; fewer than 2% reported meeting current recommendations and standards when assessed collectively. Disparities were observed by demographic and descriptive characteristics, including geography. |
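The age-standardized percentages above come from direct standardization: stratum-specific prevalences are weighted by a standard population's age distribution. A minimal sketch, with hypothetical prevalences and a made-up standard population (not the BRFSS data or the US 2000 standard):

```python
# Direct age standardization: weight stratum-specific prevalences by a
# standard population's age distribution (illustrative numbers only).

def age_standardize(stratum_prev, standard_pop):
    """Return the age-standardized prevalence (%).

    stratum_prev: {age_group: prevalence_in_percent}
    standard_pop: {age_group: standard population count}
    """
    total = sum(standard_pop.values())
    return sum(stratum_prev[g] * standard_pop[g] for g in stratum_prev) / total

# Hypothetical crude prevalences of a behavior by age group
prev = {"18-44": 30.0, "45-64": 20.0, "65+": 10.0}
# Hypothetical standard population age distribution
std = {"18-44": 500, "45-64": 300, "65+": 200}

print(round(age_standardize(prev, std), 1))  # 23.0
```

Because every state's estimate is weighted to the same standard age distribution, differences between states are not driven by differences in age structure.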
Predicting periodontitis at state and local levels in the United States
Eke PI , Zhang X , Lu H , Wei L , Thornton-Evans G , Greenlund KJ , Holt JB , Croft JB . J Dent Res 2016 95 (5) 515-22 The objective of the study was to estimate the prevalence of periodontitis at state and local levels across the United States by using a novel, small area estimation (SAE) method. Extended multilevel regression and poststratification analyses were used to estimate the prevalence of periodontitis among adults aged 30 to 79 y at state, county, congressional district, and census tract levels by using periodontal data from the National Health and Nutrition Examination Survey (NHANES) 2009-2012, population counts from the 2010 US census, and smoking status estimates from the Behavioral Risk Factor Surveillance System in 2012. The SAE method used age, race, gender, smoking, and poverty variables to estimate the prevalence of periodontitis as defined by the Centers for Disease Control and Prevention/American Academy of Periodontology case definitions at the census block levels and aggregated to larger administrative and geographic areas of interest. Model-based SAEs were validated against national estimates directly from NHANES 2009-2012. Estimated prevalence of periodontitis ranged from 37.7% in Utah to 52.8% in New Mexico among the states (mean, 45.1%; median, 44.9%) and from 33.7% to 68% among counties (mean, 46.6%; median, 45.9%). Severe periodontitis ranged from 7.27% in New Hampshire to 10.26% in Louisiana among the states (mean, 8.9%; median, 8.8%) and from 5.2% to 17.9% among counties (mean, 9.2%; median, 8.8%). Overall, the predicted prevalence of periodontitis was highest for southeastern and southwestern states and for geographic areas in the Southeast along the Mississippi Delta, as well as along the US and Mexico border. Aggregated model-based SAEs were consistent with national prevalence estimates from NHANES 2009-2012. 
This study is the first-ever estimation of periodontitis prevalence at state and local levels in the United States, and this modeling approach complements public health surveillance efforts to identify areas with a high burden of periodontitis. |
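The poststratification half of the SAE method can be sketched as a weighted aggregation of cell-level model predictions up to any geography of interest. All cell predictions and census counts below are hypothetical:

```python
# Poststratification step of multilevel regression and poststratification
# (sketch): the model-predicted prevalence for each demographic cell is
# weighted by that cell's census count, then aggregated to larger areas.

def poststratify(cells):
    """cells: list of (predicted_prevalence, census_count) per cell."""
    total = sum(n for _, n in cells)
    return sum(p * n for p, n in cells) / total

# Two census tracts, each with hypothetical (predicted prevalence, count) cells
tract_a = [(0.50, 400), (0.35, 600)]
tract_b = [(0.60, 300), (0.40, 700)]

est_a = poststratify(tract_a)               # tract-level estimate, ~0.41
est_b = poststratify(tract_b)               # tract-level estimate, ~0.46
county = poststratify(tract_a + tract_b)    # aggregate over all cells
print(round(est_a, 3), round(est_b, 3), round(county, 3))
```

The same cell-level predictions can be rolled up to county, congressional district, or state simply by changing which cells are pooled, which is why the model-based estimates remain internally consistent across geographies.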
Asthma action plan receipt among children with asthma 2-17 years of age, United States, 2002-2013
Simon AE , Akinbami LJ . J Pediatr 2016 171 283-9 e1 OBJECTIVE: To examine national trends in the receipt of asthma action plans, an intervention recommended by the National Asthma Education and Prevention Program guidelines. STUDY DESIGN: We used data from the sample child component of the National Health Interview Survey from 2002, 2003, 2008, and 2013 to examine the percentage of children 2-17 years of age with asthma (n = 3714) that had ever received an asthma action plan. Bivariate and multivariate (with adjustment for sociodemographic characteristics and asthma outcomes consistent with greater disease severity) logistic regressions were conducted to examine trends from 2002 to 2013 and to examine, with 2013 data only, the relationship between having received an asthma action plan and both sociodemographic characteristics and indicators of asthma severity. RESULTS: The percentage of children with asthma that had ever received an asthma action plan increased from 41.7% in 2002 to 50.7% in 2013 (P < .001 for trend). In 2013, a greater percentage of non-Hispanic black (58.4%) than non-Hispanic white (47.4%) children (P = .028), privately insured (56.2%) vs those with public insurance only (46.3%) (P = .016), and users of inhaled preventive asthma medication vs those that did not (P < .001) had ever received an asthma action plan. Adjusted results were similar. CONCLUSION: The percentage of US children with asthma that had ever received an asthma action plan increased between 2002 and 2013, although one-half had never received an asthma action plan in 2013. Some sociodemographic and asthma severity measures are related to receipt of an asthma action plan. |
Clinical cancer advances 2016: Annual report on progress against cancer from the American Society of Clinical Oncology
Dizon DS , Krilov L , Cohen E , Gangadhar T , Ganz PA , Hensing TA , Hunger S , Krishnamurthi SS , Lassman AB , Markham MJ , Mayer E , Neuss M , Pal SK , Richardson LC , Schilsky R , Schwartz GK , Spriggs DR , Villalona-Calero MA , Villani G , Masters G . J Clin Oncol 2016 34 (9) 987-1011 The past few years have been incredibly exciting in cancer research and care. Some of the advances highlighted in Clinical Cancer Advances 2016 are already improving the lives of patients today, and many others provide direction for further research. | Compared with when I started my career in oncology, today we do the unthinkable. We no longer treat cancer simply by its type or stage. In the era of precision medicine, we select—and rule out—treatments based on the genomic profile of each patient and the tumor. We manage once-debilitating adverse effects to the point that many, if not most, patients can continue their daily activities during treatment. | No recent advance has been more transformative than the rise of immunotherapy, particularly over this past year, making immunotherapy the American Society of Clinical Oncology’s (ASCO’s) Advance of the Year. | The immunotherapy concept is simple: unleash the body’s immune system to attack cancer. It has proven extremely difficult, however, to develop treatments that deliver real, consistent results. Decades of bold ideas, dedication, and financial investment in research have been required to prove immunotherapy’s worth as a treatment for people with an array of different cancers. |
Recent biomarker-confirmed unprotected vaginal sex, but not self-reported unprotected sex, is associated with recurrent bacterial vaginosis
Turner AN , Carr Reese P , Snead MC , Fields K , Ervin M , Kourtis AP , Klebanoff MA , Gallo MF . Sex Transm Dis 2016 43 (3) 172-6 BACKGROUND: Self-reported unprotected vaginal sex seems to increase risk of bacterial vaginosis (BV). However, the validity of self-reports is questionable, given their inconsistency with more objective measures of recent semen exposure such as detection of prostate-specific antigen (PSA). We examined whether recent unprotected sex, as measured both by PSA detection on vaginal swabs and by self-report, was associated with increased BV recurrence. METHODS: We analyzed randomized trial data from nonpregnant, BV-positive adult women recruited from a sexually transmitted disease clinic. Participants received BV therapy at enrollment and were scheduled to return after 4, 12, and 24 weeks. Bacterial vaginosis (by Nugent score) and PSA were measured at each visit. We used Cox proportional hazards models to examine the association between PSA positivity and recurrent BV. We also evaluated associations between self-reported unprotected sex (ever/never since the last visit and in the last 48 hours, analyzed separately) and recurrent BV. RESULTS: Prostate-specific antigen and BV results were available for 96 women who contributed 226 follow-up visits. Prostate-specific antigen positivity was associated with increased BV recurrence (adjusted hazard ratio [aHR], 2.32; 95% confidence interval [CI], 1.28-4.21). In contrast, we observed no significant increase in BV recurrence among women self-reporting unprotected sex since the last visit (aHR, 1.63; 95% CI, 0.77-3.43) or in the last 48 hours (aHR, 1.28; 95% CI, 0.70-2.36). CONCLUSIONS: Estimates from earlier studies linking self-reported unprotected sex and BV may be biased by misclassification. Biomarkers can improve measurement of unprotected sex, a critical exposure variable in sexual health research. |
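The adjusted hazard ratios above come from Cox models; as a simplified, unadjusted analogue, a crude rate ratio with a log-scale 95% CI can be computed from event counts and follow-up time. The counts below are hypothetical, not the trial's data:

```python
import math

# Crude rate ratio with a log-scale 95% CI -- a simplified, unadjusted
# analogue of a Cox hazard ratio (counts below are hypothetical).

def rate_ratio_ci(events1, time1, events0, time0, z=1.96):
    rr = (events1 / time1) / (events0 / time0)
    se_log = math.sqrt(1 / events1 + 1 / events0)  # approximate SE of log RR
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# e.g., 30 recurrences over 100 follow-up intervals with PSA detected,
# versus 20 recurrences over 126 intervals without
rr, lo, hi = rate_ratio_ci(30, 100, 20, 126)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

As in the abstract, an interval whose lower bound stays above 1 indicates a statistically significant association, while one spanning 1 does not.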
Slow progress in finalizing measles and rubella elimination in the European region
Biellik R , Davidkin I , Esposito S , Lobanov A , Kojouharova M , Pfaff G , Santos JI , Simpson J , Mamou MB , Butler R , Deshevoi S , Huseynov S , Jankovic D , Shefer A . Health Aff (Millwood) 2016 35 (2) 322-6 All countries in the World Health Organization European Region committed to eliminating endemic transmission of measles and rubella by 2015, and disease incidence has decreased dramatically. However, there was little progress between 2012 and 2013, and the goal will likely not be achieved on time. Genuine political commitment, increased technical capacity, and greater public awareness are urgently needed, especially in Western Europe. |
What is the use of rapid syphilis tests in the United States?
Peterman TA , Fakile YF . Sex Transm Dis 2016 43 (3) 201-3 Syphilis testing is complicated. Rapid syphilis tests may change screening practices in the United States, but first more studies are needed to demonstrate the advantages and disadvantages in the field. |
Estimating the prevalence and predictors of incorrect condom use among sexually active adults in Kenya: results from a nationally representative survey
Grasso MA , Schwarcz S , Galbraith JS , Musyoki H , Kambona C , Kellogg TA . Sex Transm Dis 2016 43 (2) 87-93 BACKGROUND: Condom use continues to be an important primary prevention tool to reduce the acquisition and transmission of HIV and other sexually transmitted infections. However, incorrect use of condoms can reduce their effectiveness. METHODS: Using data from a 2012 nationally representative cross-sectional household survey conducted in Kenya, we analyzed a subpopulation of sexually active adults and estimated the percentage that used condoms incorrectly during sex and the types of condom errors. We used multivariable logistic regression to identify variables independently associated with incorrect condom use. RESULTS: Among 13,720 adolescents and adults, 8014 were sexually active in the previous 3 months (60.3%; 95% confidence interval [CI], 59.0-61.7). Among those who used a condom with a sex partner, 20% (95% CI, 17.4-22.6) experienced at least one instance of incorrect condom use in the previous 3 months. Among incorrect condom users, condom breakage or leakage was the most common error (52%; 95% CI, 44.5-59.6). Factors found to be associated with incorrect condom use were multiple sexual partnerships in the past 12 months (2 partners: adjusted odds ratio [aOR], 1.5; 95% CI, 1.0-2.0; P = 0.03; ≥3: aOR, 2.3; 95% CI, 1.5-3.5; P < 0.01) and reporting symptoms of a sexually transmitted infection (aOR, 2.8; 95% CI, 1.8-4.3; P < 0.01). CONCLUSIONS: Incorrect condom use is frequent among sexually active Kenyans and this may translate into substantial HIV transmission. Further understanding of the dynamics of condom use and misuse, in the broader context of other prevention strategies, will aid program planners in the delivery of appropriate interventions aimed at limiting such errors. |
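The adjusted odds ratios above come from multivariable logistic regression; their unadjusted counterpart, a crude odds ratio with a Woolf 95% CI, can be computed directly from a 2x2 table. The cell counts below are hypothetical:

```python
import math

# Crude odds ratio with a Woolf (log-scale) 95% CI -- the unadjusted
# counterpart of the aORs reported above (cell counts are hypothetical).

def odds_ratio_ci(a, b, c, d, z=1.96):
    """2x2 table: a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# e.g., incorrect condom use among respondents reporting STI symptoms vs. not
or_, lo, hi = odds_ratio_ci(40, 60, 160, 740)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

The multivariable models in the study additionally condition on covariates, which is why the reported aORs differ from what a crude table would give.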
The Global Polio Eradication Initiative: progress, lessons learned, and polio legacy transition planning
Cochi SL , Hegg L , Kaur A , Pandak C , Jafari H . Health Aff (Millwood) 2016 35 (2) 277-83 The world is closer than ever to achieving global polio eradication, with record-low polio cases in 2015 and the impending prospect of a polio-free Africa. Tens of millions of volunteers, social mobilizers, and health workers have participated in the Global Polio Eradication Initiative. The program contributes to efforts to deliver other health benefits, including health systems strengthening. As the initiative nears completion after more than twenty-five years, it becomes critical to document and transition the knowledge, lessons learned, assets, and infrastructure accumulated by the initiative to address other health goals and priorities. The primary goals of this process, known as polio legacy transition planning, are both to protect a polio-free world and to ensure that investments in polio eradication will contribute to other health goals after polio is completely eradicated. The initiative is engaged in an extensive transition process of consultations and planning at the global, regional, and country levels. A successful completion of this process will result in a well-planned and -managed conclusion of the initiative that will secure the global public good gained by ending one of the world's most devastating diseases and ensure that these investments provide public health benefits for years to come. |
Abrupt decline in tuberculosis among foreign-born persons in the United States
Baker BJ , Winston CA , Liu Y , France AM , Cain KP . PLoS One 2016 11 (2) e0147353 While the number of reported tuberculosis (TB) cases in the United States has declined over the past two decades, TB morbidity among foreign-born persons has remained persistently elevated. A recent unexpected decline in reported TB cases among foreign-born persons beginning in 2007 provided an opportunity to examine contributing factors and inform future TB control strategies. We investigated the relative influence of three factors on the decline: 1) changes in the size of the foreign-born population through immigration and emigration, 2) changes in distribution of country of origin among foreign-born persons, and 3) changes in the TB case rates among foreign-born subpopulations. Using data from the U.S. National Tuberculosis Surveillance System and the American Community Survey, we examined TB case counts, TB case rates, and population estimates, stratified by years since U.S. entry and country of origin. Regression modeling was used to assess statistically significant changes in trend. Among foreign-born recent entrants (<3 years since U.S. entry), we found a 39.5% decline (-1,013 cases) beginning in 2007 (P<0.05 compared to 2000-2007) and ending in 2011 (P<0.05 compared to 2011-2014). Among recent entrants from Mexico, 80.7% of the decline was attributable to a decrease in population, while the declines among recent entrants from the Philippines, India, Vietnam, and China were almost exclusively (95.5%-100%) the result of decreases in TB case rates. Among foreign-born non-recent entrants (≥3 years since U.S. entry), we found an 8.9% decline (-443 cases) that resulted entirely (100%) from a decrease in the TB case rate. Both recent and non-recent entrants contributed to the decline in TB cases; factors contributing to the decline among recent entrants varied by country of origin. 
Strategies that impact both recent and non-recent entrants (e.g., investment in overseas TB control) as well as those that focus on non-recent entrants (e.g., expanded targeted testing of high-risk subgroups among non-recent entrants) will be necessary to achieve further declines in TB morbidity among foreign-born persons. |
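Because cases = rate x population, the attribution of the decline to population change versus case-rate change can be sketched with a simple decomposition (one of several possible conventions; all numbers below are hypothetical):

```python
# Decomposing a change in case counts (cases = rate x population) into a
# population-change component and a rate-change component. This uses one
# simple convention among several possible; all inputs are hypothetical.

def decompose(pop0, rate0, pop1, rate1):
    """Rates are per 100,000. Returns (total change, pop part, rate part)."""
    cases0 = pop0 * rate0 / 1e5
    cases1 = pop1 * rate1 / 1e5
    from_pop = (pop1 - pop0) * rate0 / 1e5   # population change at baseline rate
    from_rate = pop1 * (rate1 - rate0) / 1e5  # rate change at new population size
    return cases1 - cases0, from_pop, from_rate

total, from_pop, from_rate = decompose(1_000_000, 150.0, 800_000, 140.0)
print(total, from_pop, from_rate)
print(round(100 * from_pop / total, 1), "% of the decline from population change")
```

With this convention the two components sum exactly to the total change, mirroring how the study apportions the decline among recent entrants from Mexico mostly to population decrease and the declines for other countries mostly to falling case rates.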
Decentralizing access to antiretroviral therapy for children living with HIV in Swaziland
Auld AF , Nuwagaba-Biribonwoha H , Azih C , Kamiru H , Baughman AL , Agolory S , Abrams E , Ellerbrock TV , Okello V , Bicego G , Ehrenkranz P . Pediatr Infect Dis J 2016 35 (8) 886-93 BACKGROUND: In 2007, Swaziland initiated a hub-and-spoke model for decentralizing antiretroviral therapy (ART) access for HIV-infected children (<15 years old). Decentralization was facilitated through: (1) down-referral of stable children on ART from overburdened central facilities (hubs) to primary healthcare clinics (spokes), and (2) pediatric ART initiation at spokes (spoke-initiation). METHODS: We conducted a nationally representative retrospective cohort study among children starting ART during 2004-2010 to assess effect of down-referral and spoke-initiation on rates of loss to follow-up (LTFU), death, and attrition (death or LTFU). Twelve of 28 pediatric ART hubs were randomly selected using probability-proportional-to-size sampling. Seven selected facilities had initiated hub-and-spoke decentralization by study start; at these facilities, 901 of 1,893 hub-initiated and maintained (hub-maintained) children, and 495 of 1,105 down-referred or spoke-initiated children were randomly selected for record abstraction. At the five hub-only facilities, 612 of 1,987 children were randomly selected. Multivariable proportional hazards regression was used to estimate adjusted hazards ratios (AHR) for effect of down-referral (a time-varying covariate) and spoke-initiation on outcomes. RESULTS: Among 2,008 children at ART initiation, median age was 5.0 years, median CD4 percentage 12.0%, median CD4 count 358 cells/microL, and median weight-for-age z-score -1.91. Controlling for known confounders, down-referral was strongly protective against LTFU (AHR 0.40; 95% CI, 0.20-0.79) and attrition (AHR 0.46; 95% CI, 0.26-0.83) but not mortality. Compared with hub-only children or hub-maintained children, spoke-initiated children had similar outcomes. 
CONCLUSIONS: Decentralization of pediatric ART through down-referral and spoke-initiation within a hub-and-spoke system should be continued and might improve program outcomes. |
Direct observation of treatment provided by a family member as compared to non-family member among children with new tuberculosis: a pragmatic, non-inferiority, cluster-randomized trial in Gujarat, India
Dave PV , Shah AN , Nimavat PB , Modi BB , Pujara KR , Patel P , Mehariya K , Rade KV , Shekar S , Sachdeva KS , Oeltmann JE , Kumar AM . PLoS One 2016 11 (2) e0148488 BACKGROUND: The World Health Organization recommends direct observation of treatment (DOT) to support patients with tuberculosis (TB) and to ensure treatment completion. As per national programme guidelines in India, a DOT provider can be anyone who is acceptable and accessible to the patient and accountable to the health system, except a family member. This poses challenges among children with TB who may be more comfortable receiving medicines from their parents or family members than from unfamiliar DOT providers. We conducted a non-inferiority trial to assess the effect of family DOT on treatment success rates among children with newly diagnosed TB registered for treatment during June-September 2012. METHODS: We randomly assigned all districts (n = 30) in Gujarat to the intervention (n = 15) or usual-practice group (n = 15). Adult family members in the intervention districts were given the choice to become their child's DOT provider. DOT was provided by a non-family member in the usual-practice districts. Using routinely collected clinic-based TB treatment cards, we compared treatment success rates (cured and treatment completed) between the two groups and the non-inferiority limit was kept at 5%. RESULTS: Of 624 children with newly diagnosed TB, 359 (58%) were from intervention districts and 265 (42%) were from usual-practice districts. The two groups were similar with respect to baseline characteristics including age, sex, type of TB, and initial body weight. The treatment success rates were 344 (95.8%) and 247 (93.2%) (p = 0.11) among the intervention and usual-practice groups respectively. CONCLUSION: DOT provided by a family member is not inferior to DOT provided by a non-family member among new TB cases in children and can attain international targets for treatment success. 
TRIAL REGISTRATION: Clinical Trials Registry-India, National Institute of Medical Statistics (Indian Council of Medical Research) CTRI/2015/09/006229. |
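The 5% non-inferiority margin works as follows: the lower 95% confidence bound for the difference in success proportions (intervention minus usual practice) must lie above -5 percentage points. A sketch using the success counts reported in the abstract, with a simple Wald interval that may differ slightly from the trial's own analysis:

```python
import math

# Non-inferiority check on two success proportions: the lower bound of the
# 95% Wald CI for (intervention - usual practice) must exceed the margin.
# Counts are taken from the abstract; the Wald interval is a simplification.

def noninferior(x1, n1, x0, n0, margin=-0.05, z=1.96):
    p1, p0 = x1 / n1, x0 / n0
    diff = p1 - p0
    se = math.sqrt(p1 * (1 - p1) / n1 + p0 * (1 - p0) / n0)
    lower = diff - z * se
    return diff, lower, lower > margin

# 344/359 successes with family DOT vs. 247/265 with non-family DOT
diff, lower, ok = noninferior(344, 359, 247, 265)
print(round(diff, 3), round(lower, 3), ok)
```

Here the observed difference is positive and the lower bound, although below zero, stays well above -0.05, which is the sense in which family DOT is "not inferior" even though superiority is not shown.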
HIV-related mortality among adults (≥18 years) of various Hispanic or Latino subgroups - United States, 2006-2010
Clark H , Surendera Babu A , Harris S , Hardnett F . J Racial Ethn Health Disparities 2015 2 (1) 53-61 Hispanics or Latinos residing in the USA are disproportionately affected by HIV when compared to whites. Health outcomes for Hispanics or Latinos diagnosed with HIV infection may vary by Hispanic or Latino subgroup. We analyzed national mortality data from the National Center for Health Statistics for the years 2006 to 2010 to examine differences in HIV-related mortality among Hispanics or Latinos by sociodemographic factors and by Hispanic or Latino subgroup. After adjusting for age, HIV-related death rates per 100,000 population were highest among Hispanics or Latinos who were male (45.6, 95 % confidence interval [CI], 44.4 to 46.9) compared to female (12.0, 95 % CI 11.4 to 12.6), or resided in the Northeast (75.1, 95 % CI 72.2 to 77.9) compared to other US regions at the time of death. The age-adjusted HIV-related death rate was highest among Puerto Ricans (100.9, 95 % CI 97.0 to 104.8) and lowest among Mexicans (16.9, 95 % CI 16.2 to 17.6). Among all deaths, the proportion of HIV-related deaths was more than four times as high among Puerto Ricans (adjusted prevalence ratio = 4.3, 95 % CI 4.1 to 4.5) compared to Mexicans. To ensure better health outcomes for Hispanics or Latinos living with HIV in the USA, medical care and treatment programs should be adapted to address the needs of various Hispanic or Latino subgroups. |
Spatial variation of insecticide resistance in the dengue vector Aedes aegypti presents unique vector control challenges
Deming R , Manrique-Saide P , Medina Barreiro A , Cardena EU , Che-Mendoza A , Jones B , Liebman K , Vizcaino L , Vazquez-Prokopec G , Lenhart A . Parasit Vectors 2016 9 (1) 67 BACKGROUND: Dengue is a major public health problem in Mexico, where the use of chemical insecticides to control the principal dengue vector, Aedes aegypti, is widespread. Resistance to insecticides has been reported in multiple sites, and the frequency of kdr mutations associated with pyrethroid resistance has increased rapidly in recent years. In the present study, we characterized patterns of insecticide resistance in Ae. aegypti populations in five small towns surrounding the city of Merida, Mexico. METHODS: A cross-sectional, entomological survey was performed between June and August 2013 in 250 houses in each of the five towns. Indoor resting adult mosquitoes were collected in all houses and four ovitraps were placed in each study block. CDC bottle bioassays were conducted using F0-F2 individuals reared from the ovitraps and kdr allele (Ile1016 and Cys1534) frequencies were determined. RESULTS: High, but varying, levels of resistance to chlorpyrifos-ethyl were detected in all study towns, complete susceptibility to bendiocarb in all except one town, and variations in resistance to deltamethrin between towns, ranging from 63-88 % mortality. Significant associations were detected between deltamethrin resistance and the presence of both kdr alleles. Phenotypic resistance was highly predictive of the presence of both alleles; however, not all mosquitoes containing a mutant allele were phenotypically resistant. An analysis of genotypic differentiation (exact G test) between the five towns based on the adult female Ae. aegypti collected from inside houses showed highly significant differences (p < 0.0001) between genotypes for both loci. 
When this was further analyzed to look for fine scale differences at the block level within towns, genotypic differentiation was significant for both loci in San Lorenzo (Ile1016, p = 0.018 and Cys1534, p = 0.007) and for Ile1016 in Acanceh (p = 0.013) and Conkal (p = 0.031). CONCLUSIONS: The results from this study suggest that 3 years after switching chemical groups, deltamethrin resistance and a high frequency of kdr alleles persisted in Ae. aegypti populations. The spatial variation that was detected in both resistance phenotypes and genotypes has practical implications, both for vector control operations as well as insecticide resistance management strategies. |
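Genotypic differentiation was assessed with an exact G test; as a simplified stand-in, the log-likelihood-ratio G statistic for a table of genotype counts can be computed directly (hypothetical counts, compared against a chi-square reference rather than an exact distribution):

```python
import math

# Log-likelihood-ratio (G) statistic for a contingency table of genotype
# counts between two towns -- a simplified stand-in for the exact G test
# used in the study (genotype counts below are hypothetical).

def g_statistic(table):
    """table: list of rows of observed counts."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    grand = sum(row_tot)
    g = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / grand
            if obs > 0:
                g += 2 * obs * math.log(obs / exp)
    return g

# Counts at one locus (e.g., Ile1016): wild-type / heterozygote / homozygote
town_a = [30, 50, 20]
town_b = [10, 40, 50]
g = g_statistic([town_a, town_b])
print(round(g, 2))  # compare against a chi-square distribution with 2 df
```

Large G values, as in this made-up comparison, correspond to the highly significant between-town differentiation the study reports for both loci.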
Urinary and blood cadmium and lead and kidney function: NHANES 2007-2012
Buser MC , Ingber SZ , Raines N , Fowler DA , Scinicariello F . Int J Hyg Environ Health 2016 219 (3) 261-7 BACKGROUND: Cadmium (Cd) and lead (Pb) are widespread environmental contaminants that are known nephrotoxins. However, their nephrotoxic effects at low-environmental exposure levels are debated. OBJECTIVE: We examined the association of blood Pb (B-Pb), blood Cd (B-Cd), urinary Pb (U-Pb) and urinary Cd (U-Cd) with estimated glomerular filtration rate (eGFR) and urinary albumin (ALB). METHODS: We used multivariate linear regression to analyze the association between B-Pb, B-Cd, U-Pb, and U-Cd with eGFR and ALB in adult participants (≥20 years of age) in NHANES 2007-2012. The dataset was limited to NHANES individuals with both blood and urinary metal measurements. RESULTS: We found a statistically significant inverse association between eGFR and B-Cd and statistically significant positive associations between eGFR and both U-Cd and U-Pb, as well as statistically significant associations between ALB and the 3rd and 4th quartiles of U-Cd. CONCLUSIONS: The inverse association between eGFR and B-Cd, in conjunction with positive associations between eGFR and ALB with U-Cd, suggest that U-Cd measurement at low levels of exposure may result from changes in renal excretion of Cd due to kidney function and protein excretion. However, renal effects such as hyperfiltration from Cd-mediated kidney damage or creatinine-specific Cd effects cannot be excluded with this cross-sectional design. |
Menstrual cycle perturbation by organohalogens and elements in the Cree of James Bay, Canada
Wainman BC , Kesner JS , Martin ID , Meadows JW , Krieg EF Jr , Nieboer E , Tsuji LJ . Chemosphere 2016 149 190-201 Persistent organohalogens (POHs) and metals have been linked to alterations in menstrual cycle function and fertility in humans. The Cree First Nations people living near James Bay in Ontario and Quebec, Canada, have elevated levels of POHs, mercury and lead compared to other Canadians. The present study examines the interrelationships between selected POHs and elements on menstrual cycle function in these Cree women. Menstrual cycle characteristics were derived from structured daily diaries and endocrine measurements from daily urine samples collected during one cycle for 42 women aged 19-42. We measured 31 POHs in blood plasma and 18 elements in whole blood, for 31 of the participants. POHs and elements detected in ≥70% of the participants were transformed by principal component (PC) analysis to reduce the contaminant exposure data to fewer, uncorrelated PCA variables. Multiple regression analysis revealed that, after adjusting for confounders, PC-3 values showed a significant negative association with cycle length (p = 0.002). PC-3 accounted for 9.2% of the variance and showed positive loadings for cadmium, selenium, and PBDE congeners 47 and 153, and a negative loading for copper. Sensitivity analysis of the model to quantify likely effect sizes showed a range of menstrual cycle length from 25.3 to 28.3 days using the lower and upper 95% confidence limits of mean measured contaminant concentrations to predict cycle length. Our observations support the hypothesis that the menstrual cycle function of these women may be altered by exposure to POHs and elements from their environment. |
Population density, poor sanitation, and enteric infections in Nueva Santa Rosa, Guatemala
Jarquin C , Arnold BF , Munoz F , Lopez B , Cuellar VM , Thornton A , Patel J , Reyes L , Roy SL , Bryan JP , McCracken JP , Colford JM Jr . Am J Trop Med Hyg 2016 94 (4) 912-919 Poor sanitation could pose greater risk for enteric pathogen transmission at higher human population densities because of greater potential for pathogens to infect new hosts through environmentally mediated and person-to-person transmission. We hypothesized that incidence and prevalence of diarrhea, enteric protozoans, and soil-transmitted helminth infections would be higher in high-population-density areas compared with low-population-density areas, and that poor sanitation would pose greater risk for these enteric infections at high density compared with low density. We tested our hypotheses using 6 years of clinic-based diarrhea surveillance (2007-2013) including 4,360 geolocated diarrhea cases tested for 13 pathogens and a 2010 cross-sectional survey that measured environmental exposures from 204 households (920 people) and tested 701 stool specimens for enteric parasites. We found that population density was neither a key determinant of enteric infection nor a strong effect modifier of risk posed by poor household sanitation in this setting. |
A systematic meta-analysis of toxoplasma gondii prevalence in food animals in the United States
Guo M , Mishra A , Buchanan RL , Dubey JP , Hill DE , Gamble HR , Jones JL , Pradhan AK . Foodborne Pathog Dis 2016 13 (3) 109-18 Toxoplasma gondii is a widely distributed protozoan parasite. The Centers for Disease Control and Prevention reported that T. gondii is one of three pathogens (along with Salmonella and Listeria) that together account for >70% of all deaths due to foodborne illness in the United States. Food animals are reservoirs for T. gondii and act as one of the sources for parasite transmission to humans. Based on limited population-based data, the Food and Agriculture Organization/World Health Organization estimated that approximately 22% of human T. gondii infections are meatborne. The objective of the current study was to conduct a systematic meta-analysis to provide a precise estimation of T. gondii infection prevalence in food animals produced in the United States. Four databases were searched to collect eligible studies. Prevalence was estimated in six animal categories (confinement-raised market pigs, confinement-raised sows, non-confinement-raised pigs, lamb, goats, and non-confinement-raised chickens) by a quality-effects model. A wide variation in prevalence was observed in each animal category. Animals raised outdoors or that have outdoor access had a higher prevalence as compared with animals raised indoors. T. gondii prevalence in non-confinement-raised pigs ranked the highest (31.0%) followed by goats (30.7%), non-confinement-raised chickens (24.1%), lambs (22.0%), confinement-raised sows (16.7%), and confinement-raised market pigs (5.6%). These results indicate that T. gondii-infected animals are a food safety concern. The computed prevalence can be used as an important input in quantitative microbial risk assessment models to further predict public health burden. |
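The study pooled prevalences with a quality-effects model; a simpler fixed-effect inverse-variance pooling illustrates the basic weighting idea behind any such pooled estimate (study prevalences and sample sizes below are hypothetical):

```python
import math

# Fixed-effect inverse-variance pooling of prevalence estimates -- a
# simplified stand-in for the quality-effects model used in the study
# (study prevalences and sample sizes below are hypothetical).

def pool_prevalence(studies, z=1.96):
    """studies: list of (prevalence, n). Weight = 1 / variance of p."""
    num = den = 0.0
    for p, n in studies:
        var = p * (1 - p) / n
        w = 1 / var
        num += w * p
        den += w
    pooled = num / den
    se = math.sqrt(1 / den)
    return pooled, pooled - z * se, pooled + z * se

pooled, lo, hi = pool_prevalence([(0.25, 200), (0.35, 120), (0.30, 400)])
print(round(pooled, 3), round(lo, 3), round(hi, 3))
```

Quality-effects models extend this weighting with study-quality scores and allowance for between-study heterogeneity, which matters here given the wide prevalence variation observed within each animal category.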
Complete Closed Genome Sequences of Salmonella enterica subsp. enterica Serotypes Anatum, Montevideo, Typhimurium, and Newport, Isolated from Beef, Cattle, and Humans.
Harhay DM , Bono JL , Smith TP , Fields PI , Dinsmore BA , Santovenia M , Kelley CM , Wang R , Harhay GP . Genome Announc 2016 4 (1) Salmonella enterica spp. are a diverse group of bacteria with a wide range of virulence potential. To facilitate genome comparisons across this virulence spectrum, we present eight complete closed genome sequences of four S. enterica serotypes (Anatum, Montevideo, Typhimurium, and Newport), isolated from various cattle samples and from humans. |
Haematospirillum jordaniae gen. nov., sp. nov., isolated from human blood samples
Humrighouse BW , Emery BD , Kelly AJ , Metcalfe MG , Mbizo J , McQuiston JR . Antonie Van Leeuwenhoek 2016 109 (4) 493-500 A Gram-negative, aerobic, motile, spiral-shaped bacterium, strain H5569T, was isolated from a human blood sample. Phenotypic and molecular characteristics of the isolate were investigated. Optimal growth was found to occur at 35 °C under aerobic conditions on Heart Infusion Agar supplemented with 5% rabbit blood. The major fatty acids present in the cells were identified as C16:0, C16:1ω7c and C18:1ω7c. The predominant respiratory quinone was found to be ubiquinone-Q10. The G+C content of genomic DNA for strain H5569T was found to be 49.9%. Based on 16S rRNA gene sequence analysis results, 13 additional isolates were also analysed in this study. Phylogenetic analysis based on 16S rRNA gene sequences revealed that the organism, represented by strain H5569T, forms a distinct lineage within the family Rhodospirillaceae, most closely related to two Novispirillum itersonii subspecies (93.9-94.1%) and two Caenispirillum species (91.2-91.6%). Based on these results, strain H5569T is concluded to represent a new genus and species, for which the name Haematospirillum jordaniae gen. nov., sp. nov. is proposed. The type strain is H5569T (=DSM 28903T = CCUG 66838T). |
Costs of expanded rapid HIV testing in four emergency departments
Schackman BR , Eggman AA , Leff JA , Braunlin M , Felsen UR , Fitzpatrick L , Telzak EE , El-Sadr W , Branson BM . Public Health Rep 2016 131 Suppl 1 71-81 OBJECTIVE: The HIV Prevention Trials Network (HPTN) 065 trial sought to expand HIV screening of emergency department (ED) patients in Bronx, New York, and Washington, D.C. This study assessed the testing costs associated with different expansion processes and compared them with costs of a hypothetical optimized process. METHODS: Micro-costing studies were conducted in two participating EDs in each city that switched from point-of-care (POC) to rapid-result laboratory testing. In three EDs, laboratory HIV testing was only conducted for patients having blood drawn for clinical reasons; in the other ED, all HIV testing was conducted with laboratory testing. Costs were estimated through direct observation and interviews to document process flows, time estimates, and labor and materials costs. A hypothetical optimized process flow used minimum time estimates for each process step. National wage and fringe rates and local reagent costs were used to determine the average cost (excluding overhead) per completed nonreactive and reactive test in 2013 U.S. dollars. RESULTS: Laboratory HIV testing costs in the EDs ranged from $17.00 to $23.83 per completed nonreactive test, and POC testing costs ranged from $17.64 to $37.60; cost per completed reactive test ranged from $89.29 to $123.17. Costs of hypothetical optimized HIV testing with automated process steps were approximately 45% lower for nonreactive tests and 20% lower for reactive tests. The cost per ED visit to conduct expanded HIV testing in each hospital ranged from $1.21 to $3.96. CONCLUSION: An optimized process could achieve additional cost savings but would require an investment in electronic system interfaces to further automate testing processes. |
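The per-test costs reported above come from a micro-costing approach: observed staff time priced at loaded wage rates, plus per-test materials. A minimal sketch of that arithmetic follows; the step times, wage, fringe rate, and materials cost are hypothetical illustrations, not the study's figures.

```python
# Hedged micro-costing sketch (hypothetical inputs, not the study's data):
# cost per completed test = staff time at a fringe-loaded wage + materials.

def cost_per_test(step_minutes, hourly_wage, fringe_rate, materials):
    """Average labor-plus-materials cost for one completed test."""
    labor = sum(step_minutes) / 60.0 * hourly_wage * (1.0 + fringe_rate)
    return labor + materials

# e.g. 12 minutes of staff time across three process steps at $35/h
# with a 30% fringe rate, plus $8.50 in reagents and supplies:
cost = cost_per_test([3, 5, 4], 35.0, 0.30, 8.50)   # -> $17.60
```

Shortening the observed step times (the study's "optimized process" with automated steps) lowers the labor term directly, which is how the hypothetical optimized flow achieves its savings.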
Meningococcal conjugate and tetanus toxoid, reduced diphtheria toxoid, and acellular pertussis vaccination among HIV-infected youth
Setse RW , Siberry GK , Moss WJ , Wheeling J , Bohannon BA , Dominguez KL . Pediatr Infect Dis J 2016 35 (5) e152-7 BACKGROUND: The meningococcal conjugate vaccine (MCV4) and the tetanus toxoid, reduced diphtheria toxoid, and acellular pertussis vaccine (Tdap) were first recommended for adolescents in the United States in 2005. The goal of our study was to determine MCV4 and Tdap vaccine coverage among perinatally and behaviorally HIV-infected adolescents in 2006 and to compare coverage estimates in our study population to similarly aged healthy youth in 2006. METHODS: LEGACY is a retrospective cohort study of HIV-infected youth in 22 HIV specialty clinics across the United States. Among LEGACY participants ≥11 years of age in 2006, we conducted a cross-sectional analysis to determine MCV4, Tdap, and MCV4/Tdap vaccine coverage. We compared vaccine coverage among our study population to coverage among similarly aged youth in the 2006 NIS-Teen Survey. Multivariable mixed effects logistic regression modeling was used to examine associations between MCV4/Tdap vaccination and mode of HIV transmission. RESULTS: MCV4 and Tdap coverage rates among 326 eligible participants were 31.6% and 28.8%, respectively. Among adolescents 13-17 years of age, MCV4 and Tdap coverage was significantly higher among HIV-infected youth than among youth in the 2006 NIS-Teen Survey (p < 0.01). In multivariable analysis, perinatally HIV-infected youth were significantly more likely to have received MCV4/Tdap vaccination compared with their behaviorally infected counterparts (AOR 5.1, 95% CI 2.0, 12.7). HIV-infected youth with CD4 cell counts of 200-499 cells/μL were more likely to have had MCV4/Tdap vaccination compared with those with CD4 counts ≥500 cells/μL (AOR 2.2, 95% CI 1.2, 4.3). Participants with plasma HIV RNA viral loads of >400 copies/mL were significantly less likely to have received MCV4/Tdap vaccination (p < 0.05). 
CONCLUSIONS: MCV4 and Tdap coverage among HIV-infected youth was suboptimal but higher than for healthy adolescents in the 2006 NIS-Teen Survey. Perinatal HIV infection was associated with increased likelihood of vaccination. Specific measures are needed to improve vaccine coverage among adolescents in the United States. |
Place of influenza vaccination among children-United States, 2010-11 through 2013-14 influenza seasons
Santibanez TA , Vogt T , Zhai Y , McIntyre A . Vaccine 2016 34 (10) 1296-303 BACKGROUND: Studies have examined the settings in which adults receive influenza vaccination, but few have reported on the settings in which children are vaccinated, how these settings might be changing over time, or how they vary by socio-demographics. METHODS: Data from the National Immunization Survey-Flu were analyzed to assess place of influenza vaccination among vaccinated children aged 6 months-17 years during the 2010-11, 2011-12, 2012-13, and 2013-14 influenza seasons. The percentage of children vaccinated at each place was calculated overall and by age, race/ethnicity, income, and Metropolitan Statistical Area (MSA). RESULTS: The places children received influenza vaccination varied little over four recent influenza seasons. From the 2010-11 through 2013-14 influenza seasons, the percentage of vaccinated children receiving influenza vaccination at a doctor's office was 64.1%, 65.1%, 65.3%, and 65.3%, respectively, with no differences from one season to the next. Likewise, for vaccination at clinics or health centers (17.8%, 17.5%, 17.0%, 18.0%), health departments (3.2%, 3.6%, 3.0%, 2.8%), and other non-medical places (1.6%, 1.4%, 1.2%, 1.1%), there were no differences from one season to the next. There were some differences for vaccinations at hospitals, pharmacies, and schools. There was considerable variability in the place of influenza vaccination by age, race/ethnicity, income, and MSA. Fewer Hispanic children were vaccinated at a doctor's office than black, white, and other or multiple race children, and fewer black children and children of other or multiple races were vaccinated at a doctor's office than white children. More children at or below the poverty level were vaccinated at a clinic or health center than children in all of the other income groups. CONCLUSION: Most vaccinated children receive their influenza vaccination at a doctor's office. Place of vaccination changed little over four recent influenza seasons. 
Large variability in place of vaccination exists by age, race/ethnicity, income, and MSA. Monitoring place of vaccination can help shape future immunization programs. |
The effect of heterogeneity in uptake of the measles, mumps, and rubella vaccine on the potential for outbreaks of measles: a modelling study
Glasser JW , Feng Z , Omer SB , Smith PJ , Rodewald LE . Lancet Infect Dis 2016 16 (5) 599-605 BACKGROUND: Vaccination programmes to prevent outbreaks after introductions of infectious people aim to maintain the average number of secondary infections per infectious person at one or less. We aimed to assess heterogeneity in vaccine uptake and other characteristics that, together with non-random mixing, could increase this number and to evaluate strategies that could mitigate their impact. METHODS: Because most US children attend elementary school in their own neighbourhoods, surveys of children entering elementary school (age 5 years before Sept 1) allow assessment of spatial heterogeneity in the proportion of children immune to vaccine-preventable diseases. We used data from a 2008 school-entry survey by the Immunization Division of the California Department of Public Health to obtain school addresses; numbers of students enrolled; proportions of enrolled students who had received one or two doses of the measles, mumps, and rubella (MMR) vaccine; and proportions with medical or personal-belief exemptions. Using a mixing model suitable for spatially-stratified populations, we projected the expected numbers of secondary infections per infectious person for measles, mumps, and rubella. We also mapped contributions to this number for measles in San Diego County's 638 elementary schools and its largest district, comprising 200 schools (31%). We then modelled the effect on measles' realised reproduction number (RV) of the following plausible interventions: vaccinating all children with personal-belief exemptions, increasing uptake by 10% to 50% in all low-immunity schools (<90% of students immune) or in only influential (effective daily contact rates >3 or contacts inter-school >30%) low-immunity schools, and increasing private school uptake to the public school average. FINDINGS: In 2008, 39 132 children began elementary school in San Diego County, CA, USA. 
At entry to school, 97% had received at least one dose of the MMR vaccine, with 2.5% having personal-belief exemptions. We note substantial heterogeneity in immunity throughout the county. Although the average population immunities for measles, mumps, and rubella (92%, 87%, 92%) were similar to the population-immunity thresholds in homogeneous, randomly-mixing populations (91%, 88%, 76%), after accounting for heterogeneity and non-random mixing, the basic reproduction numbers increased by 70%, meaning that introduced pathogens could cause outbreaks. The impact of our modelled interventions ranged from negligible to a nearly complete reduction in the outbreak potential of measles. The most effective intervention to lower the realised reproduction number (RV 3.39) was raising immunity by 50% in 114 schools with low immunity (RV 1.02), but raising immunity by this level in only influential, low-immunity schools also was effective (RV 2.02). The effectiveness of vaccinating the 972 children with personal-belief exemptions was similar to that of targeting all low-immunity schools (RV 1.11). Targeting only private schools had little effect. INTERPRETATION: Our findings suggest that increasing vaccine uptake could prevent outbreaks such as that of measles in San Diego in 2008. Vaccinating children with personal-belief exemptions was one of the most effective interventions that we modelled, but further research on mixing in heterogeneous populations is needed. |
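The central quantity in the abstract above, a reproduction number for a spatially stratified population with non-random mixing, can be illustrated with a minimal next-generation-matrix sketch. Everything below is an assumption for illustration: the immunity levels, enrolment shares, within-stratum contact preference, and the value of R0 are hypothetical, and this is not the authors' model or data.

```python
import numpy as np

# Minimal sketch: in a stratified population, the reproduction number is the
# dominant eigenvalue of a next-generation matrix K, with K[i][j] proportional
# to R0, the susceptible fraction s_i in stratum i, and a mixing matrix C.
R0 = 15.0                                  # assumed basic reproduction number
immunity = np.array([0.98, 0.95, 0.80])    # hypothetical per-school immunity
weights = np.array([0.5, 0.3, 0.2])        # hypothetical enrolment shares
eps = 0.6                                  # preference for within-stratum contact

def reproduction_number(immunity, weights, eps):
    s = 1.0 - immunity                     # susceptible fractions
    n = len(s)
    # Mixing: a fraction eps of contacts stay within the stratum; the rest
    # are distributed proportionally to stratum size (rows sum to 1).
    C = eps * np.eye(n) + (1 - eps) * np.tile(weights, (n, 1))
    K = R0 * s[:, None] * C
    return max(abs(np.linalg.eigvals(K)))

# Heterogeneous, preferentially mixed population vs. a homogeneous population
# with the same average immunity:
R_het = reproduction_number(immunity, weights, eps)   # above 1: outbreaks possible
R_hom = R0 * (1.0 - immunity @ weights)               # below 1: outbreaks die out
```

With these assumed numbers the average immunity (93.5%) would suffice under homogeneous random mixing, yet the low-immunity stratum's preferential self-mixing pushes the stratified reproduction number well above one, which is the qualitative effect the study reports.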
Gavi's transition policy: moving from development assistance to domestic financing of immunization programs
Kallenberg J , Mok W , Newman R , Nguyen A , Ryckman T , Saxenian H , Wilson P . Health Aff (Millwood) 2016 35 (2) 250-8 Gavi, the Vaccine Alliance, was created in 2000 to accelerate the introduction of new and underused vaccines in lower-income countries. The period 2000-15 was marked by the rapid uptake of new vaccines in more than seventy countries eligible for Gavi support. To stay focused on the poorest countries, Gavi's support phases out after countries' gross national income per capita surpasses a set threshold, which requires governments to assume responsibility for the continued financing of vaccines introduced with Gavi support. Gavi's funding will end in the period 2016-20 for nineteen countries that have exceeded the eligibility threshold. To avoid disrupting lifesaving immunization programs and to ensure the long-term sustainable impact of Gavi's investments, it is vital that governments succeed in transitioning from development assistance to domestic financing of immunization programs. This article discusses some of the challenges facing countries currently transitioning out of Gavi support, how Gavi's policies have evolved to help manage the risks involved in this process, and the lessons learned from this experience. |
HPV vaccination coverage of teen girls: The influence of health care providers
Smith PJ , Stokley S , Bednarczyk RA , Orenstein WA , Omer SB . Vaccine 2016 34 (13) 1604-1610 BACKGROUND: Between 2010 and 2014, the percentage of 13-17 year-old girls administered ≥3 doses of the human papillomavirus (HPV) vaccine ("fully vaccinated") increased by 7.7 percentage points to 39.7%, and the percentage not administered any doses of the HPV vaccine ("not immunized") decreased by 11.3 percentage points to 40.0%. OBJECTIVE: To evaluate the complex interactions between parents' vaccine-related beliefs, demographic factors, and HPV immunization status. METHODS: Vaccine-related parental beliefs and sociodemographic data collected by the 2010 National Immunization Survey-Teen among teen girls (n=8490) were analyzed. HPV vaccination status was determined from teens' health care provider (HCP) records. RESULTS: Among teen girls either unvaccinated or fully vaccinated against HPV, those whose parent was positively influenced to vaccinate against HPV were 48.2 percentage points more likely to be fully vaccinated. Parents who reported being positively influenced to vaccinate against HPV were 28.9 percentage points more likely to report that their daughter's HCP talked about the HPV vaccine, 27.2 percentage points more likely to report that their daughter's HCP gave enough time to discuss the HPV shot, and 43.4 percentage points more likely to report that their daughter's HCP recommended the HPV shot (p<0.05). Among teen girls administered 1-2 doses of the HPV vaccine, 87.0% had missed opportunities for HPV vaccine administration. CONCLUSION: Results suggest that an important pathway to achieving higher ≥3-dose HPV vaccine coverage is increasing HPV vaccination series initiation through HCPs talking to parents about the HPV vaccine, giving parents time to discuss the vaccine, and making a strong recommendation for the HPV vaccine. 
Also, HPV vaccination series completion rates may be increased by eliminating missed opportunities to vaccinate against HPV and scheduling additional follow-up visits to administer missing HPV vaccine doses. |
Increasing coverage of appropriate vaccinations: a Community Guide systematic economic review
Jacob V , Chattopadhyay SK , Hopkins DP , Murphy Morgan J , Pitan AA , Clymer JM . Am J Prev Med 2016 50 (6) 797-808 CONTEXT: Population-level coverage for immunization against many vaccine-preventable diseases remains below optimal rates in the U.S. The Community Preventive Services Task Force recently recommended several interventions to increase vaccination coverage based on systematic reviews of the evaluation literature. The present study provides the economic results from those reviews. EVIDENCE ACQUISITION: A systematic review was conducted (search period, January 1980 through February 2012) to identify economic evaluations of 12 interventions recommended by the Task Force. Evidence was drawn from included studies; estimates were constructed for the population reach of each strategy, cost of implementation, and cost per additional vaccinated person because of the intervention. Analyses were conducted in 2014. EVIDENCE SYNTHESIS: Reminder systems, whether for clients or providers, were among the lowest-cost strategies to implement and the most cost effective in terms of additional people vaccinated. Strategies involving home visits and combination strategies in community settings were both costly and less cost effective. Strategies based in settings such as schools and MCOs that reached the target population achieved additional vaccinations in the middle range of cost effectiveness. CONCLUSIONS: The interventions recommended by the Task Force differed in reach, cost, and cost effectiveness. This systematic review presents the economic information for 12 effective strategies to increase vaccination coverage that can guide implementers in their choice of interventions to fit their local needs, available resources, and budget. |
Knowledge, attitudes and beliefs related to seasonal influenza vaccine among pregnant women in Thailand
Ditsungnoen D , Greenbaum A , Prapasiri P , Dawood FS , Thompson MG , Yoocharoen P , Lindblade KA , Olsen SJ , Muangchana C . Vaccine 2016 34 (18) 2141-6 BACKGROUND: In 2009, Thailand recommended that pregnant women be prioritized for influenza vaccination. Vaccine uptake among Thai pregnant women is lower than among other high-risk groups. METHODS: During December 2012-April 2013, we conducted a cross-sectional survey of a convenience sample of Thai pregnant women aged ≥15 years attending antenatal clinics at public hospitals in 8 of 77 provinces. A self-administered questionnaire covered knowledge, attitudes, and beliefs related to influenza vaccination using the Health Belief Model. We examined factors associated with willingness to be vaccinated using log-binomial regression models. RESULTS: The survey was completed by 1031 (96%) of 1072 pregnant women approached. A total of 627 (61%) women had heard about influenza vaccine and were included in the analysis, of whom 262 (42%) were willing to be vaccinated, 155 (25%) had received a healthcare provider recommendation for influenza vaccination, and 25 (4%) had received the influenza vaccine during the current pregnancy. In unadjusted models, high levels of perceived susceptibility (prevalence ratio [PR] 1.5, 95% CI 1.2-2.0), high levels of belief in the benefits of vaccination (PR 2.3, 95% CI 1.7-3.1), and moderate (PR 1.7, 95% CI 1.2-2.3) and high (PR 3.4, 95% CI 2.6-4.5) levels of encouragement by others to be vaccinated (i.e., cues to action) were positively associated with willingness to be vaccinated. Moderate (PR 0.5, 95% CI 0.4-0.7) and high (PR 0.5, 95% CI 0.4-0.8) levels of perceived barriers were negatively associated with willingness to be vaccinated. In the final adjusted model, only moderate (PR 1.5, 95% CI 1.1-2.0) and high (PR 2.7, 95% CI 2.0-3.6) levels of cues to action remained statistically associated with willingness to be vaccinated. 
CONCLUSION: Cues to action were associated with willingness to be vaccinated and can be used to inform communication strategies during the vaccine campaign to increase influenza vaccination among Thai pregnant women. |
Combining global elimination of measles and rubella with strengthening of health systems in developing countries
Andrus JK , Cochi SL , Cooper LZ , Klein JD . Health Aff (Millwood) 2016 35 (2) 327-33 Global efforts to eliminate measles and rubella can be combined with other actions to accelerate the strengthening of health systems in developing countries. However, there are several challenges standing in the way of successfully combining measles and rubella vaccination campaigns with health systems strengthening. Those challenges include the following: achieving universal vaccine coverage while integrating the initiative with other primary care strategies and developing the necessary health system resilience to confront emergencies, ensuring epidemiological and laboratory surveillance of vaccine-preventable diseases, developing the human resources needed to effectively manage and implement national strategies, increasing community demand for health services, and obtaining long-term political support. We describe lessons learned from the successful elimination of measles and rubella in the Americas and elsewhere that strive to strengthen national health systems to both improve vaccine uptake and confront emerging threats. The elimination of measles and rubella provides opportunities for nations to strengthen health systems and thus to both reduce inequities and ensure national health security. |
Effectiveness of seasonal influenza vaccine in preventing influenza primary care visits and hospitalisation in Auckland, New Zealand in 2015: interim estimates
Bissielo A , Pierse N , Huang QS , Thompson MG , Kelly H , Mishin VP , Turner N . Euro Surveill 2015 21 (1) Preliminary results for influenza vaccine effectiveness (VE) against acute respiratory illness with circulating laboratory-confirmed influenza viruses in New Zealand from 27 April to 26 September 2015, using a case test-negative design were 36% (95% confidence interval (CI): 11-54) for general practice encounters and 50% (95% CI: 20-68) for hospitalisations. VE against hospitalised influenza A(H3N2) illnesses was moderate at 53% (95% CI: 6-76) but improved compared with previous seasons. |
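The interim estimates above use the test-negative design, where VE = (1 - odds ratio) x 100 and the odds ratio compares vaccination odds among influenza-positive patients (cases) with those among influenza-negative patients (controls). A minimal sketch of that calculation follows; the 2x2 counts are hypothetical, chosen only so the result lands on a 36% point estimate, and are not the study's data.

```python
# Hedged sketch of the test-negative VE calculation (hypothetical counts).

def vaccine_effectiveness(vacc_cases, unvacc_cases, vacc_controls, unvacc_controls):
    """VE (%) from a 2x2 table of vaccination status by test result."""
    odds_ratio = (vacc_cases / unvacc_cases) / (vacc_controls / unvacc_controls)
    return (1.0 - odds_ratio) * 100.0

# Hypothetical table: 32 vaccinated vs 100 unvaccinated among test-positives,
# 50 vaccinated vs 100 unvaccinated among test-negatives -> OR 0.64 -> VE 36%.
ve = vaccine_effectiveness(32, 100, 50, 100)
```

In practice the odds ratio is usually estimated by logistic regression so it can be adjusted for age, calendar time, and other confounders; the crude version above only shows the core arithmetic.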
Older adult falls seen by emergency medical service providers: a prevention opportunity
Faul M , Stevens JA , Sasser SM , Alee L , Deokar AJ , Kuhls DA , Burke PA . Am J Prev Med 2016 50 (6) 719-726 INTRODUCTION: Among people aged ≥65 years, falling is the leading cause of emergency department visits. Emergency medical services (EMS) are often called to help older adults who have fallen, with some requiring hospital transport. Chief aims were to determine where falls occurred and the circumstances under which patients were transported by EMS, and to identify future fall prevention opportunities. METHODS: In 2012, a total of 42 states contributed ambulatory data to the National EMS Information System, which were analyzed in 2014 and 2015. Using EMS records from 911 call events, logistic regression examined patient and environmental factors associated with older adult transport. RESULTS: Among people aged ≥65 years, falls accounted for 17% of all EMS calls. More than one in five (21%) of these emergency 911 calls did not result in a transport. Most falls occurred at home (60.2%) and residential institutions such as nursing homes (21.7%). Logistic regression showed AORs for transport were greatest among people aged ≥85 years (AOR=1.14, 95% CI=1.13, 1.16) and women (AOR=1.30, 95% CI=1.29, 1.32); for falls at residential institutions or nursing homes (AOR=3.52, 95% CI=3.46, 3.58) and in rural environments (AOR=1.15, 95% CI=1.13, 1.17); and where the EMS impression was a stroke (AOR=2.96, 95% CI=2.11, 4.10), followed by hypothermia (AOR=2.36, 95% CI=1.33, 4.43). CONCLUSIONS: This study provides unique insight into fall circumstances and EMS transport activity. EMS personnel are in a prime position to provide interventions that can prevent future falls, or referrals to community-based fall prevention programs and services. |
Implementation measurement for evidence-based violence prevention programs in communities
Massetti GM , Holland KM , Gorman-Smith D . J Community Health 2016 41 (4) 881-94 Increasing attention to the evaluation, dissemination, and implementation of evidence-based programs (EBPs) has led to significant advancements in the science of community-based violence prevention. One of the prevailing challenges in moving from science to community involves implementing EBPs and strategies with quality. The CDC-funded National Centers of Excellence in Youth Violence Prevention (YVPCs) partner with communities to implement a comprehensive community-based strategy to prevent violence and to evaluate that strategy for impact on community-wide rates of violence. As part of their implementation approach, YVPCs document implementation of and fidelity to the components of the comprehensive youth violence prevention strategy. We describe the strategies and methods used by the six YVPCs to assess implementation and to use implementation data to inform program improvement efforts. The information presented describes the approach and measurement strategies employed by each center and for each program implemented in the partner communities. YVPCs employ both established and innovative strategies for measurement and tracking of implementation across a broad range of programs, practices, and strategies. The work of the YVPCs highlights the need to use data to understand the relationship between implementation of EBPs and youth violence outcomes. |
Real-Time TaqMan PCR Assay for the Detection of Heat-Labile and Heat-Stable Enterotoxin Genes in a Geographically Diverse Collection of Enterotoxigenic Escherichia coli Strains and Stool Specimens
Pattabiraman V , Parsons MB , Bopp CA . Foodborne Pathog Dis 2016 13 (4) 212-20 Enterotoxigenic Escherichia coli (ETEC) are an important cause of diarrhea in children under the age of 5 years in developing countries and are the leading bacterial agent of traveler's diarrhea in persons traveling to these countries. ETEC strains secrete heat-labile (LT) and/or heat-stable (ST) enterotoxins that induce diarrhea by causing water and electrolyte imbalance. We describe the validation of a real-time TaqMan PCR (RT-PCR) assay to detect LT, ST1a, and ST1b enterotoxin genes in E. coli strains and in stool specimens. We validated the LT/ST1b duplex and ST1a single-plex RT-PCR assays against a conventional PCR assay as the gold standard with 188 ETEC strains and 42 non-ETEC strains, and in stool specimens (n = 106) against traditional culture as the gold standard. RT-PCR assay sensitivities for LT, ST1a, and ST1b detection in strains were 100%, 100%, and 98%; specificities were 95%, 98%, and 99%; and the Pearson correlation coefficient r was 0.9954 between the RT-PCR assay and the gold standard. In stool specimens, RT-PCR assay sensitivities for LT, ST1a, and ST1b detection were 97%, 100%, and 97%, and specificities were 99%, 94%, and 97%; the Pearson correlation coefficient r was 0.9975 between RT-PCR results in stool specimens and the gold standard. Limits of detection of LT, ST1a, and ST1b by the RT-PCR assay were 0.1 to 1.0 pg/μL and by the conventional PCR assay were 100 to 1000 pg/μL. The accuracy, rapidity, and sensitivity of this RT-PCR assay are promising for ETEC detection in public health/clinical laboratories and for laboratories in need of an independent method to confirm results of other culture-independent diagnostic tests. |
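The sensitivities and specificities reported in validation studies like the one above follow directly from tallying assay calls against the gold standard. A minimal sketch with hypothetical counts (not the study's data):

```python
# Hedged sketch of diagnostic-performance arithmetic (hypothetical counts).

def sensitivity(true_pos, false_neg):
    # Proportion of gold-standard positives the assay detects.
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    # Proportion of gold-standard negatives the assay calls negative.
    return true_neg / (true_neg + false_pos)

# e.g. the assay detects 98 of 100 gold-standard-positive strains and
# incorrectly flags 1 of 90 gold-standard-negative strains:
sens = sensitivity(98, 2)    # 0.98
spec = specificity(89, 1)    # ~0.989
```

The limit-of-detection comparison in the abstract is a separate measurement, determined by titrating known template amounts rather than by these proportions.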
Thyroid hormones and timing of pubertal onset in a longitudinal cohort of females, Northern California, 2006-11
Wilken JA , Greenspan LC , Kushi LH , Voss RW , Windham GC . Paediatr Perinat Epidemiol 2016 30 (3) 285-93 BACKGROUND: Pubertal timing is regulated by a complex interplay of hormones. Few studies have evaluated the role of thyroid hormones in pubertal onset. We investigated the associations between blood concentrations of free and total thyroxine (FT4, TT4), free triiodothyronine, and thyroid stimulating hormone and pubertal onset among females. METHODS: Participants included 323 Kaiser Permanente Northern California members followed at annual intervals during 2004-11, who provided a blood sample during the first 3 years of the study. Thyroid hormone concentrations were measured in serum in the first blood specimen available for each participant. Pubertal onset was defined as Tanner stage ≥2 for breast (thelarche) and pubic hair (pubarche) development. Associations between thyroid hormones and pubertal onset were assessed by multivariable logistic regression and Cox proportional hazards modelling. RESULTS: At blood draw, participants were age 6.5-10.1 (median 7.7) years, 10% had reached thelarche, and 12% had reached pubarche. Participants were followed 0-5 years after blood draw (median 4). At most recent clinical visit, participants were age 6.7-14.7 (median 12.3) years, 92% had reached thelarche, and 89% had reached pubarche. No associations were identified between having reached thelarche or pubarche at time of blood draw and thyroid hormones. Examined longitudinally, higher concentrations of pre-pubertal FT4 and TT4 were associated with earlier pubarche (adjusted hazard ratio (aHR) 1.41, 95% confidence interval (CI) 1.06, 1.86 per ng/dL; and aHR 1.07, 95% CI 1.02, 1.12 per μg/dL, respectively). CONCLUSIONS: Higher pre-pubertal concentrations of FT4 and TT4 are associated with earlier pubarche. |
Long-term daily vibration exposure alters current perception threshold (CPT) sensitivity and myelinated axons in a rat-tail model of vibration-induced injury
Krajnak K , Raju SG , Miller GR , Johnson C , Waugh S , Kashon ML , Riley DA . J Toxicol Environ Health A 2016 79 (3) 1-11 Repeated exposure to hand-transmitted vibration through the use of powered hand tools may result in pain and progressive reductions in tactile sensitivity. The goal of the present study was to use an established animal model of vibration-induced injury to characterize changes in sensory nerve function and cellular mechanisms associated with these alterations. Sensory nerve function was assessed weekly using the current perception threshold test and tail-flick analgesia test in male Sprague-Dawley rats exposed to 28 d of tail vibration. After 28 d of exposure, Aβ fiber sensitivity was reduced. This reduction in sensitivity was partly attributed to structural disruption of myelin. In addition, the decrease in sensitivity was also associated with a reduction in myelin basic protein and 2',3'-cyclic nucleotide phosphodiesterase (CNPase) staining in tail nerves, and an increase in circulating calcitonin gene-related peptide (CGRP) concentrations. Changes in Aβ fiber sensitivity and CGRP concentrations may serve as early markers of vibration-induced injury in peripheral nerves. It is conceivable that these markers may be utilized to monitor sensorineural alterations in workers exposed to vibration to potentially prevent additional injury. |
Measurement of macrocyclic trichothecene in floor dust of water-damaged buildings using gas chromatography/tandem mass spectrometry- dust matrix effects
Saito R , Park JH , LeBouf R , Green BJ , Park Y . J Occup Environ Hyg 2016 13 (6) 0 Gas chromatography-tandem mass spectrometry (GC-MS/MS) was used to detect fungal secondary metabolites. Detection of verrucarol, the hydrolysis product of Stachybotrys chartarum macrocyclic trichothecene (MCT), was confounded by matrix effects associated with heterogeneous indoor environmental samples. In this study, we examined the role of dust matrix effects associated with GC-MS/MS to better quantify verrucarol in dust as a measure of total MCT. The efficiency of the internal standard (ISTD, 1,12-dodecanediol), and application of a matrix-matched standard correction method in measuring MCT in floor dust of water-damaged buildings was additionally examined. Compared to verrucarol, the ISTD had substantially higher matrix effects in the dust extracts. The results of the ISTD evaluation showed that without ISTD adjustment, there was a 280% ion enhancement in the dust extracts compared to neat solvent. The recovery of verrucarol was 94% when the matrix-matched standard curve without the ISTD was used. Using traditional calibration curves with ISTD adjustment, verrucarol was not detectable in any of the 21 dust samples collected from water-damaged buildings. In contrast, when the matrix-matched calibration curves without ISTD adjustment were used, verrucarol was detectable in 38% of samples. The study results suggest that floor dust of water-damaged buildings may contain MCT. However, the measured levels of MCT in dust using the GC-MS/MS method could be significantly under- or overestimated, depending on the matrix effects, an inappropriate ISTD, or a combination of the two. Our study further shows that the routine application of matrix-matched calibration may prove useful in obtaining accurate measurements of MCT in dust derived from damp indoor environments when no isotopically labeled verrucarol is available. |
Quantification of influenza virus RNA in aerosols in patient rooms
Leung NH , Zhou J , Chu DK , Yu H , Lindsley WG , Beezhold DH , Yen HL , Li Y , Seto WH , Peiris JS , Cowling BJ . PLoS One 2016 11 (2) e0148669 BACKGROUND: The potential for human influenza viruses to spread through fine particle aerosols remains controversial. The objective of our study was to determine whether influenza viruses could be detected in fine particles in hospital rooms. METHODS AND FINDINGS: We sampled the air in 2-bed patient isolation rooms for four hours, placing cyclone samplers at heights of 1.5 m and 1.0 m. We collected ten air samples each in the presence of at least one patient with confirmed influenza A virus infection, and tested the samples by reverse transcription polymerase chain reaction. We recovered influenza A virus RNA from 5/10 collections (50%); 4/5 were from particles >4 μm, 1/5 from particles 1-4 μm, and none from particles <1 μm. CONCLUSIONS: Detection of influenza virus RNA in aerosols at low concentrations in patient rooms suggests that healthcare workers and visitors might have frequent exposure to airborne influenza virus in proximity to infected patients. A limitation of our study was the small sample size. Further studies should be done to quantify the concentration of viable influenza virus in healthcare settings, and factors affecting the detection of influenza viruses in fine particles in the air. |
Evaluation of multiple blood matrices for assessment of human exposure to nerve agents
Schulze ND , Hamelin EI , Winkeljohn WR , Shaner RL , Basden BJ , deCastro BR , Pantazides BG , Thomas JD , Johnson RC . J Anal Toxicol 2016 40 (3) 229-35 Biomedical samples may be used to determine human exposure to nerve agents through the analysis of specific biomarkers. Samples received may include serum, plasma, whole blood, lysed blood and, due to the toxicity of these compounds, postmortem blood. To quantitate metabolites resulting from exposure to sarin (GB), soman (GD), cyclosarin (GF), VX and VR, these blood matrices were evaluated individually for precision, accuracy, sensitivity and specificity. Accuracies for these metabolites ranged from 100 to 113% with coefficients of variation ranging from 2.31 to 13.5% across a reportable range of 1-100 ng/mL meeting FDA recommended guidelines for bioanalytical methods in all five matrices. Limits of detection were calculated to be 0.09-0.043 ng/mL, and no interferences were detected in unexposed matrix samples. The use of serum calibrators was also determined to be a suitable alternative to matrix-matched calibrators. Finally, to provide a comparative value between whole blood and plasma, the ratio of the five nerve agent metabolites measured in whole blood versus plasma was determined. Analysis of individual whole blood samples (n = 40), fortified with nerve agent metabolites across the reportable range, resulted in average nerve agent metabolite blood to plasma ratios ranging from 0.53 to 0.56. This study demonstrates the accurate and precise quantitation of nerve agent metabolites in serum, plasma, whole blood, lysed blood and postmortem blood. It also provides a comparative value between whole blood and plasma samples, which can assist epidemiologists and physicians with interpretation of test results from blood specimens obtained under variable conditions. |
Antimicrobial resistance in Salmonella in the United States: 1948-1995
Tadesse DA , Singh A , Zhao S , Bartholomew M , Womack N , Ayers S , Fields PI , McDermott PF . Antimicrob Agents Chemother 2016 60 (4) 2567-71 We conducted a retrospective study of 2,149 clinical Salmonella strains to help document the historical emergence of antimicrobial resistance. There were significant increases in resistance to older drugs including ampicillin, chloramphenicol, streptomycin, sulfamethoxazole, and tetracycline, which was most common in serotype Typhimurium. An increase in multidrug resistance was observed for each decade since the 1950s. These data help show how Salmonella has evolved over the past six decades following the introduction of new antimicrobial agents. |
Characterization and comparative analysis of 2,4-toluene diisocyanate and 1,6-hexamethylene diisocyanate haptenated human serum albumin and hemoglobin
Mhike M , Hettick JM , Chipinda I , Law BF , Bledsoe TA , Lemons AR , Nayak AP , Green BJ , Beezhold DH , Simoyi RH , Siegel PD . J Immunol Methods 2016 431 38-44 Diisocyanates (dNCOs) are low molecular weight chemical sensitizers that react with autologous proteins to produce neoantigens. dNCO-haptenated proteins have been used as immunogens for generation of dNCO-specific antibodies and as antigens to screen for dNCO-specific antibodies in exposed individuals. Detection of dNCO-specific antibodies in exposed individuals for diagnosis of dNCO asthma has been hampered by poor sensitivities of the assay methods in that specific IgE can only be detected in approximately 25% of the dNCO asthmatics. Apart from characterization of the conjugates used for these immunoassays, the choice of the carrier protein and the dNCO used are important parameters that can influence the detection of dNCO-specific antibodies. Human serum albumin (HSA) is the most common carrier protein used for detection of dNCO-specific IgE and IgG, but the immunogenicity and/or antigenicity of other proteins that may be modified by dNCO in vivo is not well documented. In the current study, 2,4-toluene diisocyanate (TDI) and 1,6-hexamethylene diisocyanate (HDI) were reacted with HSA and human hemoglobin (Hb), and the resultant adducts were characterized by (i) HPLC quantification of the diamine produced from acid hydrolysis of the adducts, (ii) 2,4,6-trinitrobenzene sulfonic acid (TNBS) assay to assess extent of cross-linking, (iii) electrophoretic migration in polyacrylamide gels to analyze intra- and inter-molecular cross-linking, and (iv) evaluation of antigenicity using a monoclonal antibody developed previously to TDI conjugated to keyhole limpet hemocyanin (KLH). Concentration-dependent increases in the amount of dNCO bound to the carrier proteins, cross-linking, migration in gels, and antibody binding were observed. TDI reactivity with both HSA and Hb was significantly higher than that of HDI.
Hb-TDI antigenicity was approximately 30% that of HSA-TDI. In conclusion, these data suggest that both the extent of haptenation and the degree of cross-linking differ between the two diisocyanates studied, which may influence their relative immunogenicity and/or antigenicity. |
Clinicopathologic, immunohistochemical, and ultrastructural findings of a fatal case of Middle East respiratory syndrome coronavirus infection in United Arab Emirates, April 2014
Ng DL , Al Hosani F , Keating MK , Gerber SI , Jones TL , Metcalfe MG , Tong S , Tao Y , Alami NN , Haynes LM , Mutei MA , Abdel-Wareth L , Uyeki TM , Swerdlow DL , Barakat M , Zaki SR . Am J Pathol 2016 186 (3) 652-8 Middle East respiratory syndrome coronavirus (MERS-CoV) infection causes an acute respiratory illness and is associated with a high case fatality rate; however, the pathogenesis of severe and fatal MERS-CoV infection is unknown. We describe the histopathologic, immunohistochemical, and ultrastructural findings from the first autopsy in the world performed on a fatal case of MERS-CoV infection, which was related to a hospital outbreak in the United Arab Emirates in April 2014. The main histopathologic finding in the lungs was diffuse alveolar damage. Evidence of chronic disease, including severe peripheral vascular disease, patchy cardiac fibrosis, and hepatic steatosis, was noted in the other organs. Double-staining immunoassays that used anti-MERS-CoV antibodies paired with immunohistochemistry for cytokeratin and surfactant identified pneumocytes and epithelial syncytial cells as important targets of MERS-CoV antigen; double immunostaining with dipeptidyl peptidase 4 showed colocalization in scattered pneumocytes and syncytial cells. No extrapulmonary MERS-CoV antigens were detected, including in the kidney. These results provide critical insights into the pathogenesis of MERS-CoV in humans. |
Differential diagnosis of Japanese encephalitis virus infections with the Inbios JE Detect and DEN Detect MAC-ELISA kits
Johnson BW , Goodman CH , Jee YM , Featherstone DA . Am J Trop Med Hyg 2016 94 (4) 820-828 Japanese encephalitis virus (JEV) is the leading cause of pediatric viral neurological disease in Asia. The JEV-specific IgM antibody-capture enzyme-linked immunosorbent assay (MAC-ELISA) in cerebrospinal fluid (CSF) and serum is the recommended method of laboratory diagnosis, but specificity of the JEV MAC-ELISA can be low due to cross-reactivity. To increase the specificity of the commercially available JE Detect MAC-ELISA (JE Detect), a differential testing algorithm was developed in which samples testing positive by JE Detect were subsequently tested by the DEN Detect MAC-ELISA (DEN Detect) kit, and the results of both tests were used to make the final interpretation. The testing algorithm was evaluated with a reference panel of serum and CSF samples submitted for confirmatory testing. In serum, the false Japanese encephalitis (JE) positive rate was reduced, but sequential testing in CSF resulted in reduced JE specificity, as true JEV-positive CSF samples had positive results by both JE Detect and DEN Detect and were therefore classified as JE-negative (dengue virus [DENV]-positive). Differential diagnosis of JE by sequential testing with JE Detect and DEN Detect increased specificity for JE in serum, but more CSF data are needed to make a final determination on the usefulness of this testing algorithm for CSF. |
The health system impact of false positive newborn screening results for medium-chain acyl-CoA dehydrogenase deficiency: a cohort study
Karaceper MD , Chakraborty P , Coyle D , Wilson K , Kronick JB , Hawken S , Davies C , Brownell M , Dodds L , Feigenbaum A , Fell DB , Grosse SD , Guttmann A , Laberge AM , Mhanni A , Miller FA , Mitchell JJ , Nakhla M , Prasad C , Rockman-Greenberg C , Sparkes R , Wilson BJ , Potter BK . Orphanet J Rare Dis 2016 11 (1) 12 BACKGROUND: There is no consensus in the literature regarding the impact of false positive newborn screening results on early health care utilization patterns. We evaluated the impact of false positive newborn screening results for medium-chain acyl-CoA dehydrogenase deficiency (MCADD) in a cohort of Ontario infants. METHODS: The cohort included all children who received newborn screening in Ontario between April 1, 2006 and March 31, 2010. Newborn screening and diagnostic confirmation results were linked to province-wide health care administrative datasets covering physician visits, emergency department visits, and inpatient hospitalizations, to determine health service utilization from April 1, 2006 through March 31, 2012. Incidence rate ratios (IRRs) were used to compare those with false positive results for MCADD to those with negative newborn screening results, stratified by age at service use. RESULTS: We identified 43 infants with a false positive newborn screening result for MCADD during the study period. These infants experienced significantly higher rates of physician visits (IRR: 1.42) and hospitalizations (IRR: 2.32) in the first year of life relative to a screen negative cohort in adjusted analyses. Differences in health services use were not observed after the first year of life. CONCLUSIONS: The higher use of some health services among false positive infants during the first year of life may be explained by a psychosocial impact of false positive results on parental perceptions of infant health, and/or by differences in underlying health status. 
Understanding the impact of false positive newborn screening results can help to inform newborn screening programs in designing support and education for families. This is particularly important as additional disorders are added to expanded screening panels, yielding important clinical benefits for affected children but also a higher frequency of false positive findings. |
Prevalence of cerebral palsy and intellectual disability among children identified in two U.S. National Surveys, 2011-2013
Maenner MJ , Blumberg SJ , Kogan MD , Christensen D , Yeargin-Allsopp M , Schieve LA . Ann Epidemiol 2016 26 (3) 222-6 PURPOSE: Cerebral palsy (CP) and intellectual disability (ID) are developmental disabilities that result in considerable functional limitations. There are few recent and nationally representative prevalence estimates of CP and ID in the United States. METHODS: We used two U.S. nationally representative surveys, the 2011-2012 National Survey of Children's Health (NSCH) and the 2011-2013 National Health Interview Survey (NHIS), to determine the prevalence of CP and ID based on parent report among children aged 2-17 years. RESULTS: CP prevalence was 2.6 (95% confidence interval [CI]: 2.1-3.2) per 1000 in the NSCH and 2.9 (95% CI: 2.3-3.7) in the NHIS. ID prevalence was 12.2 (95% CI: 10.7-13.9) and 12.1 (95% CI: 10.8-13.7) in NSCH and NHIS, respectively. For both conditions, the NSCH and NHIS prevalence estimates were similar to each other for nearly all sociodemographic subgroups examined. CONCLUSIONS: Despite using different modes of data collection, the two surveys produced similar and plausible estimates of CP and ID and offer opportunities to better understand the needs and situations of children with these conditions. |
Hepatitis B virus infection among pregnant women in Haiti: A cross-sectional serosurvey
Tohme RA , Andre-Alboth J , Tejada-Strop A , Shi R , Boncy J , Francois J , Domercant JW , Griswold M , Hyppolite E , Adrien P , Kamili S . J Clin Virol 2016 76 66-71 BACKGROUND: Hepatitis B vaccine administered shortly after birth is highly effective in preventing mother to child transmission (MTCT) of infection. While hepatitis B vaccine was introduced in Haiti as part of a combined pentavalent vaccine in 2012, a birth dose is not yet included in the immunization schedule. OBJECTIVES: Determine the seroprevalence of hepatitis B virus (HBV) infection among pregnant women to evaluate the risk of MTCT. STUDY DESIGN: We selected 1364 residual serum specimens collected during a 2012 human immunodeficiency virus (HIV) sentinel serosurvey among pregnant women attending antenatal care clinics. Haiti was stratified into two regions: West, which includes metropolitan Port-au-Prince, and non-West, which includes all other departments. We evaluated the association between demographic and socioeconomic characteristics and HIV infection with HBV infection. RESULTS: Of 1364 selected specimens, 1307 (96%) were available for testing. A total of 422 specimens (32.7%) tested positive for total anti-HBc (38.2% in West vs. 27% in non-West, p<0.001), and 33 specimens (2.5%) were HBsAg positive (2.1% in West vs. 3% in non-West, p=0.4). Of HBsAg positive specimens, 79% had detectable HBV DNA. Women aged 30 and older had more than double the odds of positive total anti-HBc than women aged 15-19 years (p<0.001). Women with secondary (adjusted odds ratio (aOR)=0.54; 95% CI: 0.36-0.81) and post-secondary education (aOR=0.40, 95% CI: 0.19-0.79) had lower odds of total anti-HBc positivity compared with women with no education. HIV-status was not associated with HBV infection. CONCLUSIONS: Haiti has an intermediate endemicity of chronic HBV infection with high prevalence of positive HBV DNA among chronically infected women. 
Introduction of a universal birth dose of hepatitis B vaccine might help prevent perinatal HBV transmission. |
Dietary and lifestyle determinants of acrylamide and glycidamide hemoglobin adducts in non-smoking postmenopausal women from the EPIC cohort
Obon-Santacana M , Lujan-Barroso L , Freisling H , Cadeau C , Fagherazzi G , Boutron-Ruault MC , Kaaks R , Fortner RT , Boeing H , Ramon Quiros J , Molina-Montes E , Chamosa S , Castano JM , Ardanaz E , Khaw KT , Wareham N , Key T , Trichopoulou A , Lagiou P , Naska A , Palli D , Grioni S , Tumino R , Vineis P , De Magistris MS , Bueno-de-Mesquita HB , Peeters PH , Wennberg M , Bergdahl IA , Vesper H , Riboli E , Duell EJ . Eur J Nutr 2016 56 (3) 1157-1168 PURPOSE: Acrylamide was classified as 'probably carcinogenic' to humans in 1994 by the International Agency for Research on Cancer. In 2002, public health concern increased when acrylamide was identified in starchy, plant-based foods processed at high temperatures. The purpose of this study was to identify which food groups and lifestyle variables were determinants of hemoglobin adduct concentrations of acrylamide (HbAA) and glycidamide (HbGA) in 801 non-smoking postmenopausal women from eight countries in the European Prospective Investigation into Cancer and Nutrition (EPIC) cohort. METHODS: Biomarkers of internal exposure were measured in red blood cells (collected at baseline) by high-performance liquid chromatography/tandem mass spectrometry (HPLC/MS/MS). In this cross-sectional analysis, four dependent variables were evaluated: HbAA, HbGA, the sum of total adducts (HbAA + HbGA), and their ratio (HbGA/HbAA). Simple and multiple regression analyses were used to identify determinants of the four outcome variables. All dependent variables (except HbGA/HbAA) and all independent variables were log-transformed (log2) to improve normality. Median (25th-75th percentile) HbAA and HbGA adduct levels were 41.3 (32.8-53.1) pmol/g Hb and 34.2 (25.4-46.9) pmol/g Hb, respectively. RESULTS: The main food group determinants of HbAA, HbGA, and HbAA + HbGA were biscuits, crackers, and dry cakes. Alcohol intake and body mass index were identified as the principal determinants of HbGA/HbAA.
The total percent variation in HbAA, HbGA, HbAA + HbGA, and HbGA/HbAA explained in this study was 30%, 26%, 29%, and 13%, respectively. CONCLUSIONS: Dietary and lifestyle factors explain a moderate proportion of acrylamide adduct variation in non-smoking postmenopausal women from the EPIC cohort. |
Implementing an integrated health protection/health promotion intervention in the hospital setting: Lessons learned from the Be Well, Work Well Study
Sorensen G , Nagler EM , Hashimoto D , Dennerlein JT , Theron JV , Stoddard AM , Buxton O , Wallace LM , Kenwood C , Nelson CC , Tamers SL , Grant MP , Wagner G . J Occup Environ Med 2016 58 (2) 185-94 OBJECTIVE: This study reports findings from a proof-of-concept trial designed to examine the feasibility and estimate the efficacy of the "Be Well, Work Well" workplace intervention. METHODS: The intervention included consultation for nurse managers to implement changes on patient-care units and educational programming for patient-care staff to facilitate improvements in safety and health behaviors. We used a mixed-methods approach to evaluate feasibility and efficacy. RESULTS: Using findings from process tracking and qualitative research, we observed challenges to implementing the intervention due to the physical demands, time constraints, and psychological strains of patient care. Using survey data, we found no significant intervention effects. CONCLUSIONS: Beyond educating individual workers, systemwide initiatives that respond to conditions of work might be needed to transform the workplace culture and broader milieu in support of worker health and safety. |
Annual decline in forced expiratory volume is steeper in aluminum potroom workers than in workers without exposure to potroom fumes
Soyseth V , Henneberger PK , Einvik G , Virji MA , Bakke B , Kongerud J . Am J Ind Med 2016 59 (4) 322-9 BACKGROUND: Aluminum potroom exposure is associated with increased mortality from COPD, but the association between potroom exposure and annual decline in lung function is unknown. We measured lung volumes annually using spirometry from 1986 to 1996. The objective was to compare annual decline in forced expiratory volume in 1 s (dFEV1) and forced vital capacity (dFVC). METHODS: The number of aluminum potroom workers was 4,546 (81% males) and the number of workers in the reference group was 651 (76% males). The numbers of spirometry measurements in the index group and the reference group were 24,060 and 2,243, respectively. RESULTS: After adjustment for confounders, the differences in dFEV1 and dFVC between the index and reference groups were 13.5 (P < 0.001) and -8.0 (P = 0.060) ml/year, respectively. CONCLUSION: Aluminum potroom operators have an increased annual decline in FEV1 relative to a comparable group without exposure to potroom fumes and gases. |
A case study of multi-seam coal mine entry stability analysis with strength reduction method
Tulu IB , Esterhuizen GS , Klemetti T , Murphy MM , Sumner J , Sloan M . Int J Min Sci Technol 2016 26 (2) 193-196 In this paper, the advantage of using numerical models with the strength reduction method (SRM) to evaluate entry stability in complex multiple-seam conditions is demonstrated. A coal mine under variable topography from the Central Appalachian region is used as a case study. At this mine, unexpected roof conditions were encountered during development below previously mined panels. Stress mapping and observation of ground conditions were used to quantify the success of entry support systems in three room-and-pillar panels. Numerical model analyses were initially conducted to estimate the stresses induced by the multiple-seam mining at the locations of the affected entries. The SRM was used to quantify the stability factor of the supported roof of the entries at selected locations. The SRM-calculated stability factors were compared with observations made during the site visits, and the results demonstrate that the SRM adequately identifies the unexpected roof conditions in this complex case. It is concluded that the SRM can be used to effectively evaluate the likely success of roof supports and the stability condition of entries in coal mines. |
Experiences of a community-based lymphedema management program for lymphatic filariasis in Odisha State, India: an analysis of focus group discussions with patients, families, community members and program volunteers
Cassidy T , Worrell CM , Little K , Prakash A , Patra I , Rout J , Fox LM . PLoS Negl Trop Dis 2016 10 (2) e0004424 BACKGROUND: Globally 68 million people are infected with lymphatic filariasis (LF), 17 million of whom have lymphedema. This study explores the effects of a lymphedema management program in Odisha State, India on morbidity and psychosocial effects associated with lymphedema. METHODOLOGY/PRINCIPAL FINDINGS: Focus groups were held with patients (eight groups, separated by gender), their family members (eight groups), community members (four groups) and program volunteers (four groups) who had participated in a lymphedema management program for the past three years. Significant social, physical, and economic difficulties were described by patients and family members, including marriageability, social stigma, and lost workdays. However, the positive impact of the lymphedema management program was also emphasized, and many family and community members indicated that community members were accepting of patients and had some improved understanding of the etiology of the disease. Program volunteers and community members stressed the role that the program had played in educating people, though interestingly, local explanations and treatments appear to coexist with knowledge of biomedical treatments and the mosquito vector. CONCLUSIONS/SIGNIFICANCE: Local and biomedical understandings of disease can co-exist and do not preclude individuals from participating in biomedical interventions, specifically lymphedema management for those with lymphatic filariasis. There is a continued need for gender-specific psychosocial support groups to address issues particular to men and women as well as a continued need for improved economic opportunities for LF-affected patients. There is an urgent need to scale up LF-related morbidity management programs to reduce the suffering of people affected by LF. |
Community perceptions of mass screening and treatment for malaria in Siaya County, western Kenya
Shuford K , Were F , Awino N , Samuels A , Ouma P , Kariuki S , Desai M , Allen DR . Malar J 2016 15 (1) 71 BACKGROUND: Intermittent mass screening and treatment (iMSaT) is currently being evaluated as a possible additional tool for malaria control and prevention in western Kenya. The literature identifying success and/or barriers to drug trial compliance and acceptability on malaria treatment and control interventions is considerable, especially as it relates to specific target groups, such as school-aged children and pregnant women, but there is a lack of such studies for mass screening and treatment and mass drug administration in the general population. METHODS: A qualitative study was conducted to explore community perceptions of the iMSaT intervention, and specifically of testing and treatment in the absence of symptoms, before and after implementation in order to identify aspects of iMSaT that should be improved in future rounds. Two rounds of qualitative data collection were completed in six randomly selected study communities: a total of 36 focus group discussions (FGDs) with men, women, and opinion leaders, and 12 individual or small group interviews with community health workers. All interviews were conducted in the local dialect Dholuo, digitally recorded, and transcribed into English. English transcripts were imported into the qualitative software programme NVivo8 for content analysis. RESULTS: There were mixed opinions of the intervention. In the pre-implementation round, respondents were generally positive and willing to participate in the upcoming study. However, there were concerns about testing in the absence of symptoms including fear of covert HIV testing and issues around blood sampling. There were fewer concerns about treatment, mostly because of the simpler dosing regimen of the study drug (dihydroartemisinin-piperaquine) compared to the current first-line treatment (artemether-lumefantrine). 
After the first implementation round, there was a clear shift in perceptions with less common concerns overall, although some of the same issues around testing and general misconceptions about research remained. CONCLUSIONS: Although iMSaT was generally accepted throughout the community, proper sensitization activities-and arguably, a more long-term approach to community engagement-are necessary for dispelling fears, clarifying misconceptions, and educating communities on the consequences of asymptomatic malaria. |
Relationship between mean leucocyte telomere length and measures of allostatic load in US reproductive-aged women, NHANES 1999-2002
Ahrens KA , Rossen LM , Simon AE . Paediatr Perinat Epidemiol 2016 30 (4) 325-35 BACKGROUND: Reproductive health disparities may be partly explained by the cumulative effects of chronic stress experienced by socially disadvantaged groups. Although telomere length (TL) and allostatic load score have each been used as biological markers of stress, the relationship between these two measures is unknown. METHODS: We investigated the association between leucocyte TL and allostatic load score in 1503 non-pregnant women (20-44 years) participating in the National Health and Nutrition Examination Survey, 1999-2002. We constructed six different allostatic load scores using either quartile- or clinical-based cut-points for 14 biomarkers based on previously published methods. We estimated associations between TL and allostatic load scores and component biomarkers using linear regression, also assessing interactions by race/ethnicity. RESULTS: After adjustment for age, longer TL was associated with higher HDL cholesterol and lower C-reactive protein and creatinine clearance; TL was not associated with the other component biomarkers. Shorter TL was associated with higher allostatic load scores for the two clinical cut-point-based scores after adjustment for age, but not the four scores based on quartile cut-points. Significant interactions by race/ethnicity were observed for TL and HbA1c and triglycerides, but not for other component biomarkers or allostatic load scores. CONCLUSIONS: Although TL and allostatic load score are both considered measures of cumulative stress, most component biomarkers and scores using quartile-based cut-points were not associated with TL. In reproductive-aged women, allostatic load scores using clinical-based cut-points were more strongly associated with TL compared with quartile-based scores. |
Association of progestin contraceptive implant and weight gain
Gallo MF , Legardy-Williams J , Hylton-Kong T , Rattray C , Kourtis AP , Jamieson DJ , Steiner MJ . Obstet Gynecol 2016 127 (3) 573-576 OBJECTIVE: To evaluate the effect of initiation of a two-rod, 150-mg levonorgestrel contraceptive implant on women's perceived and observed body weight. METHODS: We conducted a secondary analysis of data from an open, randomized controlled trial of adult, nonpregnant, human immunodeficiency virus-negative women attending a public clinic in Kingston, Jamaica, who were assigned to initiate implant use either immediately or after a 3-month delay. The primary objective of the parent study was to assess the effect of initiation of the implant on the frequency of condom use. We compared study arms during follow-up using one-sided chi-square tests for differences in perceived weight gain and loss, one-sided Wilcoxon-Mann-Whitney tests for median gain in measured weight, and logistic regression with generalized estimating equations for risk of gaining greater than 2 kg. RESULTS: From 2012 to 2014, women were assigned to the implant (n=208) or delay arm (n=206). At 3 months, more women in the implant arm (15.3%) reported perceived weight gain than in the control arm (4.3%) (P=.01). Despite differences in perception, the implant and control arms did not differ significantly in median weight gain at 1-month (0.0 kg and 0.0 kg, respectively; P=.44) and 3-month visits (0.5 kg and 0.0 kg, respectively; P=.27). Study arms did not differ in risk of gaining greater than 2 kg (odds ratio 0.9, 95% confidence interval 0.6-1.3). CONCLUSION: We found no evidence of weight gain from short-term implant use. Through the power of the nocebo effect, the practice of counseling women to expect possible weight gain from initiating implant use could lead them to perceive weight gain even in its absence and contribute to the early discontinuation of this highly effective contraceptive method. |
Use and effectiveness of quitlines versus Web-based tobacco cessation interventions among 4 state tobacco control programs
Neri AJ , Momin BR , Thompson TD , Kahende J , Zhang L , Puckett MC , Stewart SL . Cancer 2016 122 (7) 1126-33 BACKGROUND: Comparative effectiveness studies of state tobacco quitlines and Web-based tobacco cessation interventions are limited. In 2009, the US Centers for Disease Control and Prevention undertook a study of the comparative effectiveness of state quitlines and Web-based tobacco cessation interventions. METHODS: Standardized questionnaires were administered to smokers who enrolled exclusively in either quitlines or Web-based tobacco cessation services in 4 states in 2011-2012. The primary outcome was the 30-day point prevalence abstinence (PPA) rate at 7 months, both between and within interventions. RESULTS: A total of 4086 participants were included in the analysis. Quitline users were significantly older, more heterogeneous in terms of race and ethnicity, less educated, less likely to be employed, and more often single than Web-based users. The 7-month 30-day PPA rate was 32% for quitline users and 27% for Web-based users. Multivariate models comparing 30-day PPA rates between interventions indicated that significantly increased odds of quitting were associated with being partnered, not living with another smoker, low baseline cigarette use, and more interactions with the intervention. After adjustments for demographic and tobacco use characteristics, quitline users had 1.26 times the odds of being abstinent in comparison with Web-based users (95% confidence interval, 1.00-1.58; P = .053). CONCLUSIONS: This is one of the largest comparative effectiveness studies of state tobacco cessation interventions to date. These findings will help public health agencies develop and tailor evidence-based tobacco cessation programs. Further research should focus on users of Web-based cessation interventions sponsored by state health departments and their cost-effectiveness. |
How tobacco quitline callers in 38 US states reported hearing about quitline services, 2010-2013
Schauer GL , Malarcher A , Mann N , Fabrikant J , Zhang L , Babb S . Prev Chronic Dis 2016 13 E17 INTRODUCTION: Telephone-based tobacco quitlines are an evidence-based intervention, but little is known about how callers hear about quitlines and whether variations exist by demographics or state. This study assessed trends in "how-heard-abouts" (HHAs) in 38 states. METHODS: Data came from the Centers for Disease Control and Prevention's (CDC's) National Quitline Data Warehouse, which stores nonidentifiable data collected from individual callers at quitline registration and reported quarterly by states. Callers were asked how they heard about the quitline; responses were grouped into the following categories: media, health professional, family or friends, and "other." We examined trends from 2010 through 2013 (N = 1,564,437) using multivariable models that controlled for seasonality and the impact of CDC's national tobacco education campaign, Tips From Former Smokers (Tips). Using data from 2013 only, we assessed variation in HHAs by demographics (sex, age, race/ethnicity, education) and state in a 38-state sample (n = 378,935 callers). RESULTS: From 2010 through 2013, the proportion of HHAs through media increased; however, this increase was not significant when we controlled for calendar quarters in which Tips aired. The proportion of HHAs through health professionals increased, whereas those through family or friends decreased. In 2013, HHAs occurred as follows: media, 45.1%; health professionals, 27.5%; family or friends, 17.0%; and other, 10.4%. Media was the predominant HHA among quitline callers of all demographic groups, followed by health professionals (except among people aged 18-24 years). Large variations in source of HHAs were observed by state. CONCLUSION: Most quitline callers in the 38-state sample heard about quitlines through the media or health care professionals. Variations in source of HHAs exist across states; implementation of best-practice quitline promotional strategies is critical to maximize reach. |
Changes in the medical management of patients on opioid analgesics following a diagnosis of substance abuse
Paulozzi LJ , Zhou C , Jones CM , Xu L , Florence CS . Pharmacoepidemiol Drug Saf 2016 25 (5) 545-52 PURPOSE: When providers recognize that patients are abusing prescription drugs, review of the drugs they are prescribed and attempts to treat the substance use disorder are warranted. However, little is known about whether prescribing patterns change following such a diagnosis. METHODS: We used national longitudinal health claims data from the MarketScan® commercial claims database for January 2010-June 2011. We used a cohort of 1.85 million adults 18-64 years old prescribed opioid analgesics but without abuse diagnoses during a 6-month "preabuse" period. We identified a subset of 9009 patients receiving diagnoses of abuse of non-illicit drugs (abuse group) during a 6-month "abuse" period and compared them with patients without such a diagnosis (nonabuse group) during both the abuse period and a subsequent 6-month "postabuse" period. RESULTS: During the abuse period, 5.78% of the abuse group and 0.14% of the nonabuse group overdosed. Overdose rates declined to 2.12% in the abuse group in the postabuse period. Opioid prescribing rates declined 13.5%, and benzodiazepine rates declined 12.3%, in the abuse group in the postabuse period. Antidepressants and gabapentin were prescribed to roughly one half and one quarter of the abuse group, respectively, during all three periods. Daily opioid dosage did not decline in the abuse group following diagnosis. CONCLUSIONS: Prescribing to people who abuse drugs changes little after their abuse is documented. Actions such as tapering opioid and benzodiazepine prescriptions, maximizing alternative treatments for pain, and greater use of medication-assisted treatment such as buprenorphine could help reduce risk in this population. |
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Disease Reservoirs and Vectors
- Environmental Health
- Food Safety
- Genetics and Genomics
- Health Economics
- Immunity and Immunization
- Injury and Violence
- Laboratory Sciences
- Maternal and Child Health
- Nutritional Sciences
- Occupational Safety and Health
- Occupational Safety and Health - Mining
- Parasitic Diseases
- Reproductive Health
- Substance Use and Abuse
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.