Use of claims data to estimate annual cervical cancer screening percentages in Portland metropolitan area, Oregon.
Abdullah N , Laing RS , Hariri S , Young CM , Schafer S . Cancer Epidemiol 2016 41 106-112 BACKGROUND: Human papillomavirus (HPV) vaccine should reduce cervical dysplasia before cervical cancer. However, dysplasia diagnosis is screening-dependent. Accurate screening estimates are needed. PURPOSE: To estimate the percentage of women in a geographic population who have had cervical cancer screening. METHODS: We analyzed claims data for Papanicolaou (Pap) tests from 2008-2012 to estimate the percentage of insured women aged 18-39 years screened. We estimated screening in uninsured women by dividing the percentage of insured Behavioral Risk Factor Surveillance System respondents reporting previous-year testing by the percentage of uninsured respondents reporting previous-year testing, and multiplying this ratio by claims-based estimates of insured women with previous-year screening. We calculated a simple weighted average of the two estimates to estimate the overall screening percentage. We estimated credible intervals using Monte Carlo simulations. RESULTS: During 2008-2012, an annual average of 29.6% of women aged 18-39 years were screened. Screening increased from 2008 to 2009 in all age groups. During 2009-2012, the screening percentages decreased for all groups, but declined most in women aged 18-20 years, from 21.5% to 5.4%. Within age groups, compared to 2009, credible intervals did not overlap during 2011 (except age group 21-29 years) and 2012, and credible intervals in the 18-20 year group did not overlap with older groups in any year. CONCLUSIONS: This study introduces a novel method to estimate population-level cervical cancer screening. Overall, the percentage of women screened in Portland, Oregon, fell following changes in screening recommendations released in 2009 and later modified in 2012. |
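The ratio adjustment and Monte Carlo credible intervals described in this abstract can be sketched as follows. Every input is an illustrative assumption, not the study's data: the Beta parameters, the 85/15 insured/uninsured split, and the assumed direction of the self-report ratio (scaling the insured claims-based estimate by the uninsured/insured ratio) are all placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
n_draws = 10_000  # Monte Carlo simulations

# Hypothetical inputs: each proportion is drawn from a Beta
# distribution to propagate its sampling uncertainty.
insured_claims = rng.beta(300, 700, n_draws)   # claims-based % of insured women screened
brfss_insured = rng.beta(35, 65, n_draws)      # self-reported testing, insured
brfss_uninsured = rng.beta(20, 80, n_draws)    # self-reported testing, uninsured

# Assumed adjustment: scale the claims-based insured estimate by the
# uninsured/insured self-report ratio to estimate uninsured screening.
uninsured_est = insured_claims * (brfss_uninsured / brfss_insured)

# Simple weighted average of the two groups (85/15 split is illustrative).
overall = 0.85 * insured_claims + 0.15 * uninsured_est

lo, hi = np.percentile(overall, [2.5, 97.5])
print(f"overall screening: {overall.mean():.1%} (95% CrI {lo:.1%} to {hi:.1%})")
```

Drawing each proportion from its own distribution and recomputing the estimator per draw is what turns a point estimate into a credible interval.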
Reducing recreational sedentary screen time: A Community Guide systematic review
Ramsey Buchanan L , Rooks-Peck CR , Finnie RK , Wethington HR , Jacob V , Fulton JE , Johnson DB , Kahwati LC , Pratt CA , Ramirez G , Glanz K . Am J Prev Med 2016 50 (3) 402-15 CONTEXT: Sedentary time spent with screen media is associated with obesity among children and adults. Obesity has potentially serious health consequences, such as heart disease and diabetes. This Community Guide systematic review examined the effectiveness and economic efficiency of behavioral interventions aimed at reducing recreational (i.e., neither school- nor work-related) sedentary screen time, as measured by screen time, physical activity, diet, and weight-related outcomes. EVIDENCE ACQUISITION: For this review, an earlier ("original") review (search period, 1966 through July 2007) was combined with updated evidence (search period, April 2007 through June 2013) to assess effectiveness of behavioral interventions aimed at reducing recreational sedentary screen time. Existing Community Guide systematic review methods were used. Analyses were conducted in 2013-2014. EVIDENCE SYNTHESIS: The review included 49 studies. Two types of behavioral interventions were evaluated that either (1) focus on reducing recreational sedentary screen time only (12 studies); or (2) focus equally on reducing recreational sedentary screen time and improving physical activity or diet (37 studies). Most studies targeted children aged ≤13 years. Children's composite screen time (TV viewing plus other forms of recreational sedentary screen time) decreased 26.4 (interquartile interval = -74.4, -12.0) minutes/day and obesity prevalence decreased 2.3 (interquartile interval = -4.5, -1.2) percentage points versus a comparison group. Improvements in physical activity and diet were reported. Three study arms among adults found composite screen time decreased by 130.2 minutes/day. 
CONCLUSIONS: Among children, these interventions demonstrated reduced screen time, increased physical activity, and improved diet- and weight-related outcomes. More research is needed among adolescents and adults. |
Lung function decline over 25 years of follow-up among black and white adults in the ARIC study cohort
Mirabelli MC , Preisser JS , Loehr LR , Agarwal SK , Barr RG , Couper DJ , Hankinson JL , Hyun N , Folsom AR , London SJ . Respir Med 2016 113 57-64 BACKGROUND: Interpretation of longitudinal information about lung function decline from middle to older age has been limited by loss to follow-up that may be correlated with baseline lung function or the rate of decline. We conducted these analyses to estimate age-related decline in lung function across groups of race, sex, and smoking status while accounting for dropout from the Atherosclerosis Risk in Communities Study. METHODS: We analyzed data from 13,896 black and white participants, aged 45-64 years at the 1987-1989 baseline clinical examination. Using spirometry data collected at baseline and two follow-up visits, we estimated annual population-averaged mean changes in forced expiratory volume in one second (FEV1) and forced vital capacity (FVC) by race, sex, and smoking status using inverse-probability-weighted independence estimating equations, conditioning on being alive. RESULTS: Estimated rates of FEV1 decline were higher among white than black participants at age 45 years (e.g., male never smokers: black: -29.5 ml/year; white: -51.9 ml/year), but higher among black than white participants by age 75 (black: -51.2 ml/year; white: -26). Observed differences by race were more pronounced among men than among women. By smoking status, FEV1 declines were larger among current than former or never smokers at age 45 across all categories of race and sex. By age 60, FEV1 decline was larger among former and never smokers than among current smokers. Estimated annual declines generated using unweighted generalized estimating equations were smaller for current smokers at younger ages in all four groups of race and sex compared with results from weighted analyses that accounted for attrition. 
CONCLUSIONS: Using methods accounting for dropout from an approximately 25-year health study, estimated rates of lung function decline varied by age, race, sex, and smoking status, with largest declines observed among current smokers at younger ages. |
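The attrition problem this study addresses can be illustrated with a minimal inverse-probability-weighting sketch: when dropout depends on the outcome, a naive complete-case mean is biased, and reweighting each observed subject by 1/P(observed) recovers the population mean. Everything below (the dropout model, the distributions, the numbers) is simulated for illustration, not ARIC data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical true annual FEV1 change (ml/year), population mean -40.
decline = rng.normal(-40.0, 10.0, n)

# Dropout depends on the outcome: steeper decliners are less likely to
# remain under observation, which biases a naive complete-case mean.
p_observed = 1.0 / (1.0 + np.exp(-((decline + 40.0) / 10.0 + 1.0)))
observed = rng.random(n) < p_observed

naive = decline[observed].mean()

# Inverse-probability weighting: upweight each observed subject by
# 1 / P(observed) to stand in for the dropouts who resemble them.
w = 1.0 / p_observed[observed]
ipw = np.sum(w * decline[observed]) / np.sum(w)

print(f"true mean -40.0 | naive {naive:.1f} | IPW {ipw:.1f}")
```

In practice the observation probabilities are not known and must themselves be estimated, e.g. from a model of dropout given baseline covariates, which is the step the weighted estimating equations in the paper formalize.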
Huntington disease among the Navajo: a population-based study in the Navajo Nation
Gordon PH , Mehal JM , Rowland AS , Cheek JE , Bartholomew ML . Neurology 2016 86 (16) 1552-3 Huntington disease (HD) has a protracted course that imparts substantial personal and economic burden. Disease rates vary by geographic location. In Western countries, prevalence approximates 5.7/100,000; rates are tenfold lower in Asia. Epidemiologic studies from the United States report on mostly white populations. Few studies give rates of HD among minorities, and there are no comprehensive descriptions of HD in American Indians. Better understanding of how the disease affects discrete populations could produce hypotheses for new approaches to treatment. The goal of this study was to describe the epidemiology of HD among Navajo people living in the Navajo Nation, at 27,000 square miles, the largest reservation for American Indians in the United States. |
Genetic Determinants of Drug Resistance in Mycobacterium tuberculosis and Their Diagnostic Value.
Farhat MR , Sultana R , Iartchouk O , Bozeman S , Galagan J , Sisk P , Stolte C , Nebenzahl-Guimaraes H , Jacobson K , Sloutsky A , Kaur D , Posey J , Kreiswirth BN , Kurepina N , Rigouts L , Streicher EM , Victor TC , Warren RM , van Soolingen D , Murray M . Am J Respir Crit Care Med 2016 194 (5) 621-30 BACKGROUND: The development of molecular diagnostics that detect both the presence of Mycobacterium tuberculosis in clinical samples and drug resistance-conferring mutations promises to revolutionize patient care and interrupt transmission by ensuring early diagnosis. However, these tools require the identification of genetic determinants of resistance to the full range of anti-tuberculosis drugs. OBJECTIVES: To determine the optimal molecular approach needed, we sought to create a comprehensive catalogue of resistance mutations and assess their sensitivity and specificity in diagnosing drug resistance. METHODS: We developed and validated molecular inversion probes for DNA capture and deep sequencing of 28 drug resistance loci in M. tuberculosis. We used the probes for targeted sequencing of a geographically diverse set of 1397 clinical M. tuberculosis isolates with known drug resistance phenotypes. We identified a minimal set of mutations to predict resistance to first- and second-line anti-tuberculosis drugs and validated our predictions in an independent dataset. We constructed and piloted a web-based database that provides public access to the sequence data and prediction tool. RESULTS: The predicted resistance to rifampicin and isoniazid exceeded 90% sensitivity and specificity, but was lower for other drugs. The number of mutations needed to diagnose resistance is large: for the 13 drugs studied, it was 238 mutations across 18 genetic loci. CONCLUSION: These data suggest that a comprehensive M. tuberculosis drug resistance diagnostic will need to allow for a high dimension of mutation detection. 
They also support the hypothesis that currently unknown genetic determinants, potentially discoverable by whole genome sequencing, encode resistance to second-line TB drugs. |
Trends and determinants of survival for over 200 000 patients on antiretroviral treatment in the Botswana National Program: 2002-2013
Farahani M , Price N , El-Halabi S , Mlaudzi N , Keapoletswe K , Lebelonyane R , Fetogang EB , Chebani T , Kebaabetswe P , Masupe T , Gabaake K , Auld A , Nkomazana O , Marlink R . AIDS 2016 30 (3) 477-85 OBJECTIVES: To determine the incidence and risk factors of mortality for all HIV-infected patients receiving antiretroviral treatment at public and private healthcare facilities in the Botswana National HIV/AIDS Treatment Programme. DESIGN: We studied routinely collected data from 226 030 patients enrolled in the Botswana National HIV/AIDS Treatment Programme from 2002 to 2013. METHODS: A person-years (P-Y) approach was used to analyse all-cause mortality and follow-up rates for all HIV-infected individuals with documented antiretroviral therapy initiation dates. Marginal structural modelling was utilized to determine the effect of treatment on survival for those with documented drug regimens. Sensitivity analyses were performed to assess the robustness of our results. RESULTS: Median follow-up time was 37 months (interquartile range 11-75). Mortality was highest during the first 3 months after treatment initiation at 11.79 (95% confidence interval 11.49-12.11) deaths per 100 P-Y, but dropped to 1.01 (95% confidence interval 0.98-1.04) deaths per 100 P-Y after the first year of treatment. Twelve-month mortality declined from 7 to 2% of initiates during 2002-2012. Tenofovir was associated with lower mortality than stavudine and zidovudine. CONCLUSION: The observed mortality rates have been declining over time; however, mortality in the first year, particularly the first 3 months of antiretroviral treatment, remains a distinct problem. This analysis showed lower mortality with regimens containing tenofovir compared with zidovudine and stavudine. CD4 cell count less than 100 cells/μl, older age and being male were associated with higher odds of mortality. |
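The person-years (P-Y) rate used in this abstract is simple arithmetic: total deaths divided by total observed follow-up time, scaled per 100 P-Y. A minimal sketch with hypothetical follow-up records, not the program's data:

```python
# Hypothetical records: (years of follow-up, died 0/1) per patient.
records = [(0.2, 1), (3.1, 0), (0.1, 1), (6.3, 0), (2.5, 0), (0.25, 1), (4.0, 0)]

person_years = sum(t for t, _ in records)   # total observed time at risk
deaths = sum(d for _, d in records)         # total events
rate_per_100py = deaths / person_years * 100

print(f"{deaths} deaths over {person_years:.2f} P-Y = "
      f"{rate_per_100py:.1f} deaths per 100 P-Y")
```

Note how the three early deaths contribute very little follow-up time, which is why mortality rates computed over the first months after initiation can be so much higher than rates over later periods.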
Update: influenza activity - United States, October 4, 2015-February 6, 2016
Russell K , Blanton L , Kniss K , Mustaquim D , Smith S , Cohen J , Garg S , Flannery B , Fry AM , Grohskopf LA , Bresee J , Wallis T , Sessions W , Garten R , Xu X , Elal AI , Gubareva L , Barnes J , Wentworth DE , Burns E , Katz J , Jernigan D , Brammer L . MMWR Morb Mortal Wkly Rep 2016 65 (6) 146-53 From October through mid-December 2015, influenza activity remained low in most regions of the United States. Activity began to increase in late December 2015 and continued to increase slowly through early February 2016. Influenza A viruses have been most frequently identified, with influenza A (H3N2) viruses predominating during October until early December, and influenza A (H1N1)pdm09 viruses predominating from mid-December until early February. Most of the influenza viruses characterized during that time are antigenically similar to vaccine virus strains recommended for inclusion in the 2015-16 Northern Hemisphere vaccines. This report summarizes U.S. influenza activity* during October 4, 2015-February 6, 2016, and updates the previous summary. |
Mind the gap: TB trends in the USA and the UK, 2000-2011
Nnadi CD , Anderson LF , Armstrong LR , Stagg HR , Pedrazzoli D , Pratt R , Heilig CM , Abubakar I , Moonan PK . Thorax 2016 71 (4) 356-63 BACKGROUND: TB remains a major public health concern, even in low-incidence countries like the USA and the UK. Over the last two decades, cases of TB reported in the USA have declined, while they have increased substantially in the UK. We examined factors associated with this divergence in TB trends between the two countries. METHODS: We analysed all cases of TB reported to the US and UK national TB surveillance systems from 1 January 2000 through 31 December 2011. Negative binomial regression was used to assess potential demographic, clinical and risk factor variables associated with differences in observed trends. FINDINGS: A total of 259 609 cases were reported. From 2000 to 2011, annual TB incidence rates declined from 5.8 to 3.4 cases per 100 000 in the USA, whereas in the UK, TB incidence increased from 11.4 to 14.4 cases per 100 000. The majority of cases in both the USA (56%) and the UK (64%) were among foreign-born persons. The number of foreign-born cases reported in the USA declined by 15% (7731 in 2000 to 6564 in 2011) while native-born cases fell by 54% (8442 in 2000 to 3883 in 2011). In contrast, the number of foreign-born cases reported in the UK increased by 80% (3380 in 2000 to 6088 in 2011), while the number of native-born cases remained largely unchanged (2158 in 2000 to 2137 in 2011). In an adjusted negative binomial regression model, significant differences in trend were associated with sex, age, race/ethnicity, site of disease, HIV status and previous history of TB (p<0.01). Among the foreign-born, significant differences in trend were also associated with time since UK or US entry (p<0.01). INTERPRETATION: To achieve TB elimination in the UK, a re-evaluation of current TB control policies and practices with a focus on the foreign-born is needed. 
In the USA, maintaining and strengthening control practices are necessary to sustain the progress made over the last 20 years. |
Monitoring HIV and AIDS related policy reforms: a Road Map to Strengthen Policy Monitoring and Implementation in PEPFAR partner countries
Lane J , Verani A , Hijazi M , Hurley E , Hagopian A , Judice N , MacInnis R , Sanford S , Zelek S , Katz A . PLoS One 2016 11 (2) e0146720 Achieving an AIDS-free generation will require the adoption and implementation of critical health policy reforms. However, countries with high HIV burden often have low policy development, advocacy, and monitoring capacity. This lack of capacity may be a significant barrier to achieving the AIDS-free generation goals. This manuscript describes the increased focus on policy development and implementation by the United States President's Emergency Plan for AIDS Relief (PEPFAR). It evaluates the curriculum and learning modalities used for two regional policy capacity building workshops organized around the PEPFAR Partnership Framework agreements and the Road Map for Monitoring and Implementing Policy Reforms. A total of 64 participants representing the U.S. Government, partner country governments, and civil society organizations attended the workshops. On average, participants responded that their policy monitoring skills improved and that they felt they were better prepared to monitor policy reforms three months after the workshop. When followed up regarding utilization of the Road Map action plan, responses were mixed. Reasons cited for not making progress included an inability to meet or a lack of time, personnel, or governmental support. This lack of progress may point to a need for building policy monitoring systems in high HIV burden countries. Because the success of policy reforms cannot be measured by the mere adoption of written policy documents, monitoring the implementation of policy reforms and evaluating their public health impact is essential. In many high HIV burden countries, policy development and monitoring capacity remains weak. This lack of capacity could hinder efforts to achieve the ambitious AIDS-free generation treatment, care and prevention goals. 
The Road Map appears to be a useful tool for strengthening these critical capacities. |
Notes from the field: Ebola virus disease response activities during a mass displacement event after flooding - Freetown, Sierra Leone, September-November, 2015
Ratto J , Ivy W 3rd , Purfield A , Bangura J , Omoko A , Boateng I , Duffy N , Sims G , Beamer B , Pi-Sunyer T , Kamara S , Conteh S , Redd J . MMWR Morb Mortal Wkly Rep 2016 65 (7) 188-189 Since the start of the Ebola virus disease (Ebola) outbreak in West Africa, Sierra Leone has reported 8,706 confirmed Ebola cases and 3,956 deaths (1). During September 15-16, 2015, heavy rains flooded the capital, Freetown, resulting in eight deaths, home and property destruction, and thousands of persons in need of assistance (2). By September 27, approximately 13,000 flood-affected persons registered for flood relief services from the government (3). On September 17, two stadiums in Freetown were opened to provide shelter and assistance to flood-affected residents; a total of approximately 3,000 persons stayed overnight in both stadiums (Sierra Leone Ministry of Health and Sanitation, personal communication, September 2015). On the same day the stadiums were opened to flood-affected persons, the Ministry of Health and Sanitation (MoHS) and Western Area Ebola Response Center (WAERC) staff members from CDC, the World Health Organization (WHO), and the African Union evaluated the layout, logistics, and services at both stadiums and identified an immediate need to establish Ebola response activities. The patient in the last Ebola case in the Western Area, which includes Freetown, had died 37 days earlier, on August 11; however, transmission elsewhere in Sierra Leone was ongoing, and movement of persons throughout the country was common (4,5). |
Prevalence of HIV among U.S. female sex workers: Systematic review and meta-analysis
Paz-Bailey G , Noble M , Salo K , Tregear SJ . AIDS Behav 2016 20 (10) 2318-2331 Although female sex workers are known to be vulnerable to HIV infection, little is known about the epidemiology of HIV infection among this high-risk population in the United States. We systematically identified and critically assessed published studies reporting HIV prevalence among female sex workers in the United States. We searched for and included original English-language articles reporting data on the prevalence of HIV as determined by testing at least 50 females who exchanged sex for money or drugs. We did not apply any restrictions on date of publication. We included 14 studies from 1987 to 2013 that reported HIV prevalence for a total of 3975 adult female sex workers. Only two of the 14 studies were conducted in the last 10 years. The pooled estimate of HIV prevalence was 17.3% (95% CI 13.5-21.9%); however, the prevalence of HIV across individual studies varied considerably (ranging from 0.3% to 32%) and statistical heterogeneity was substantial (I² = 0.89, Q = 123; p < 0.001). Although the variance across the 14 studies was high, prevalence was generally high (10% or greater in 11 of the 14 included studies). Very few studies have documented the prevalence of HIV among female sex workers in the United States; however, the available evidence does suggest that HIV prevalence among this vulnerable population is high. |
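The pooled prevalence with Q and I² reported here follows the standard meta-analytic recipe. A sketch of inverse-variance pooling with Cochran's Q, I², and a DerSimonian-Laird random-effects estimate, using hypothetical study counts rather than the review's data (the review's exact pooling model is not specified in the abstract):

```python
import numpy as np

# Illustrative studies: HIV-positive counts and sample sizes.
events = np.array([12, 40, 3, 25, 9])
n = np.array([80, 150, 60, 120, 90])

p = events / n
var = p * (1 - p) / n            # binomial variance of each proportion
w = 1 / var                      # fixed-effect (inverse-variance) weights
p_fixed = np.sum(w * p) / np.sum(w)

# Cochran's Q and I^2 quantify between-study heterogeneity.
Q = np.sum(w * (p - p_fixed) ** 2)
df = len(p) - 1
I2 = max(0.0, (Q - df) / Q)

# DerSimonian-Laird between-study variance, then random-effects pooling.
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - df) / C)
w_re = 1 / (var + tau2)
p_re = np.sum(w_re * p) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))

print(f"pooled prevalence {p_re:.1%} "
      f"(95% CI {p_re - 1.96 * se_re:.1%} to {p_re + 1.96 * se_re:.1%}), "
      f"Q = {Q:.1f}, I2 = {I2:.2f}")
```

When I² is high, as in this review, the random-effects weights flatten toward equality and the pooled interval widens to reflect true between-study variation rather than sampling error alone.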
Enrollment in HIV care two years after HIV diagnosis in the kingdom of Swaziland: An evaluation of a national program of new linkage procedures
MacKellar DA , Williams D , Storer N , Okello V , Azih C , Drummond J , Nuwagaba-Biribonwoha H , Preko P , Morgan RL , Dlamini M , Byrd J , Agolory S , Baughman AL , McNairy ML , Sahabo R , Ehrenkranz P . PLoS One 2016 11 (2) e0150086 To improve early enrollment in HIV care, the Swaziland Ministry of Health implemented new linkage procedures for persons HIV diagnosed during the Soka Uncobe male circumcision campaign (SOKA, 2011-2012) and the Swaziland HIV Incidence Measurement Survey (SHIMS, 2011). Abstraction of clinical records and telephone interviews of a retrospective cohort of HIV-diagnosed SOKA and SHIMS clients were conducted in 2013-2014 to evaluate compliance with new linkage procedures and enrollment in HIV care at 92 facilities throughout Swaziland. Of 1,105 clients evaluated, within 3, 12, and 24 months of diagnosis, an estimated 14.0%, 24.3%, and 37.0% enrolled in HIV care, respectively, after adjusting for loss to follow-up and non-response. Kaplan-Meier functions indicated lower enrollment probability among clients 14-24 (P = 0.0001) and 25-29 (P = 0.001) years of age compared with clients >35 years of age. At 69 facilities to which clients were referred for HIV care, compliance with new linkage procedures was low: referral forms were located for less than half (46.8%) of the clients, and few (9.6%) were recorded in the appointment register or called either before (0.3%) or after (4.9%) their appointment. Of over one thousand clients newly HIV diagnosed in Swaziland in 2011 and 2012, few received linkage services in accordance with national procedures and most had not enrolled in HIV care two years after their diagnosis. Our findings are a call to action to improve linkage services and early enrollment in HIV care in Swaziland. |
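The Kaplan-Meier enrollment probabilities used above come from the standard product-limit formula, which handles clients censored before enrolling. A minimal sketch with hypothetical records of (months to enrollment or censoring, enrolled 0/1), not the cohort's data:

```python
# Hypothetical time-to-enrollment records: (months, enrolled 0/1).
data = [(1, 1), (2, 0), (3, 1), (3, 1), (5, 0), (8, 1), (12, 0), (24, 1)]

s = 1.0  # P(not yet enrolled), updated at each event time
km = []
for t in sorted({t for t, e in data if e == 1}):
    at_risk = sum(1 for ti, _ in data if ti >= t)            # still unenrolled and uncensored
    events = sum(1 for ti, ei in data if ti == t and ei == 1)  # enrollments at time t
    s *= 1.0 - events / at_risk
    km.append((t, 1.0 - s))  # cumulative enrollment probability

for t, p in km:
    print(f"month {t:>2}: P(enrolled) = {p:.3f}")
```

Censored clients count in the risk set up to their censoring time but never as events, which is what lets the estimator use partial follow-up instead of discarding it.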
Estimates of parainfluenza virus-associated hospitalizations and cost among children aged less than 5 years in the United States, 1998-2010
Abedi GR , Prill MM , Langley GE , Wikswo ME , Weinberg GA , Curns AT , Schneider E . J Pediatric Infect Dis Soc 2016 5 (1) 7-13 BACKGROUND: Parainfluenza virus (PIV) is the second leading cause of hospitalization for respiratory illness in young children in the United States. Infection can result in a full range of respiratory illness, including bronchiolitis, croup, and pneumonia. The recognized human subtypes of PIV are numbered 1-4. This study calculates estimates of PIV-associated hospitalizations among US children younger than 5 years using the latest available data. METHODS: Data from the National Respiratory and Enteric Virus Surveillance System were used to characterize seasonal PIV trends from July 2004 through June 2010. To estimate the number of PIV-associated hospitalizations that occurred annually among US children aged <5 years from 1998 through 2010, respiratory hospitalizations from the Healthcare Cost and Utilization Project Nationwide Inpatient Sample were multiplied by the proportion of acute respiratory infection hospitalizations positive for PIV among young children enrolled in the New Vaccine Surveillance Network. Estimates of hospitalization charges attributable to PIV infection were also calculated. RESULTS: Parainfluenza virus seasonality follows type-specific seasonal patterns, with PIV-1 circulating in odd-numbered years and PIV-2 and -3 circulating annually. The average annual estimates of PIV-associated bronchiolitis, croup, and pneumonia hospitalizations among children aged <5 years in the United States were 3888 (0.2 hospitalizations per 1000), 8481 per year (0.4 per 1000 children), and 10 186 (0.5 per 1000 children), respectively. Annual charges for PIV-associated bronchiolitis, croup, and pneumonia hospitalizations were approximately $43 million, $58 million, and $158 million, respectively. CONCLUSIONS: The majority of PIV-associated hospitalizations in young children occur among those aged 0 to 2 years. 
When vaccines for PIV become available, immunization would likely be most effective if administered within the first year of life. |
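The estimation approach in this study (respiratory hospitalizations multiplied by the proportion testing PIV-positive) reduces to simple arithmetic. Every number below is an illustrative placeholder, not the study's surveillance data:

```python
# Hypothetical inputs for one year, children aged <5 years.
resp_hosp = 250_000          # annual respiratory hospitalizations
piv_positive_frac = 0.041    # proportion of ARI hospitalizations PIV-positive
population_under5 = 20_000_000
mean_charge = 15_500.0       # mean charge per PIV-associated stay, USD

piv_hosp = resp_hosp * piv_positive_frac           # estimated PIV-associated stays
rate_per_1000 = piv_hosp / population_under5 * 1000
total_charges = piv_hosp * mean_charge

print(f"{piv_hosp:.0f} PIV-associated hospitalizations "
      f"({rate_per_1000:.2f} per 1000 children), "
      f"charges ~${total_charges / 1e6:.0f} million")
```

The uncertainty in such estimates is driven largely by the PIV-positive proportion, which comes from a much smaller surveillance sample than the hospitalization counts.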
Evaluation of routine HIV opt-out screening and continuum of care services following entry into eight prison reception centers - California, 2012
Lucas KD , Eckert V , Behrends CN , Wheeler C , MacGowan RJ , Mohle-Boetani JC . MMWR Morb Mortal Wkly Rep 2016 65 (7) 178-181 Early diagnosis of human immunodeficiency virus (HIV) infection and initiation of antiretroviral treatment (ART) improves health outcomes and prevents HIV transmission (1,2). Before 2010, HIV testing was available to inmates in the California state prison system upon request. In 2010, the California Correctional Health Care Services (CCHCS) integrated HIV opt-out screening into the health assessment for inmates entering California state prisons. Under this system, a medical care provider informs the inmate that an HIV test is routinely done, along with screening for sexually transmitted, communicable, and vaccine-preventable diseases, unless the inmate specifically declines the test. During 2012-2013, CCHCS, the California Department of Public Health, and CDC evaluated HIV screening, rates of new diagnoses, linkage to and retention in care, ART response, and post-release linkage to care among California prison inmates. All prison inmates are processed through one of eight specialized reception center facilities, where they undergo a comprehensive evaluation of their medical needs, mental health, and custody requirements for placement in one of 35 state prisons. Among 17,436 inmates who entered a reception center during April-September 2012, 77% were screened for HIV infection; 135 (1%) tested positive, including 10 (0.1%) with newly diagnosed infections. Among the 135 HIV-positive patient-inmates, 134 (99%) were linked to care within 90 days of diagnosis, including 122 (91%) who initiated ART. Among 83 who initiated ART and remained incarcerated through July 2013, 81 (98%) continued ART; 71 (88%) achieved viral suppression (<200 HIV RNA copies/mL). Thirty-nine patient-inmates were released on ART; 12 of 14 who were linked to care within 30 days of release were virally suppressed at that time. 
Only one of nine persons with a viral load test conducted between 91 days and 1 year post-release had viral suppression. Although high rates of viral suppression were achieved in prison, continuity of care in the community remains a challenge. An infrastructure for post-release linkage to care is needed to help ensure sustained HIV disease control. |
Histoplasmosis-associated hospitalizations in the United States, 2001-2012
Benedict K , Derado G , Mody RK . Open Forum Infect Dis 2016 3 (1) ofv219 We examined trends in histoplasmosis-associated hospitalizations in the United States using the 2001-2012 National (Nationwide) Inpatient Sample. An estimated 50 778 hospitalizations occurred, with significant increases in hospitalizations overall and in the proportion of hospitalizations associated with transplant, diabetes, and autoimmune conditions often treated with biologic therapies; therefore, histoplasmosis remains an important opportunistic infection. |
Benefit of Early Initiation of Influenza Antiviral Treatment to Pregnant Women Hospitalized With Laboratory-Confirmed Influenza
Oboho I , Reed C , Gargiullo P , Leon M , Aragon D , Meek J , Anderson EJ , Ryan P , Lynfield R , Morin C , Bargsten M , Zansky S , Fowler B , Thomas A , Lindegren ML , Schaffner W , Risk I , Finelli L , Chaves SS . J Infect Dis 2016 214 (4) 507-15 BACKGROUND: We describe the impact of early antiviral treatment among pregnant women hospitalized with laboratory-confirmed influenza (2010-14 influenza seasons). METHODS: Severe influenza was defined as intensive care unit admission, mechanical ventilation, respiratory failure, pulmonary embolism, sepsis, or death. Within severity stratum, we used parametric survival analysis to compare length of stay (LOS) by timing of antiviral treatment, adjusting for underlying conditions, influenza vaccination, and pregnancy trimester. RESULTS: Among 865 pregnant women, median age was 27 years (interquartile range [IQR], 23-31). Most (68%) were healthy, and 85% received antiviral treatment. Sixty-three (7%) women had severe influenza; 4 died. Severity was associated with preterm delivery and fetal loss. Women with severe influenza were less likely to be vaccinated than those without (14% vs. 26%, p=0.03). Comparing women treated with antivirals ≤2 vs. >2 days from illness onset, median LOS (days) was respectively 2.2 (IQR 0.9-5.8; n=8) vs. 7.8 (IQR 3.0-20.6; n=7) for severe (p=0.03), and 2.4 (IQR 2.3-2.5; n=153) vs. 3.1 (IQR 2.8-3.5; n=62) for non-severe influenza (p<0.01). CONCLUSIONS: Early influenza antiviral treatment for pregnant women hospitalized with influenza may reduce LOS, especially among those with severe influenza. Influenza during pregnancy is associated with maternal and infant morbidity, and annual influenza vaccination is warranted. |
Current and (potential) future effects of the Affordable Care Act on HIV prevention
Viall AH , McCray E , Mermin J , Wortley P . Curr HIV/AIDS Rep 2016 13 (2) 95-106 Recent advances in science, program, and policy could better position the nation to achieve its vision of the USA as a place where new HIV infections are rare. Among these developments, passage of the Patient Protection and Affordable Care Act (ACA) in 2010 may prove particularly important, as the health system transformations it has launched offer a supportive foundation for realizing the potential of other advances, both within and beyond the clinical arena. This article summarizes opportunities to expand access to high-impact HIV prevention interventions under the ACA, examines whether available evidence indicates that these opportunities are being realized, and considers potential challenges to further gains for HIV prevention in an era of health reform. This article also highlights the new roles that HIV prevention programs and providers may assume in a health system no longer defined by fragmentation among public health, medical care, and community service providers. |
Design and methods of a social network isolation study for reducing respiratory infection transmission: The eX-FLU cluster randomized trial
Aiello AE , Simanek AM , Eisenberg MC , Walsh AR , Davis B , Volz E , Cheng C , Rainey JJ , Uzicanin A , Gao H , Osgood N , Knowles D , Stanley K , Tarter K , Monto AS . Epidemics 2016 15 38-55 BACKGROUND: Social networks are increasingly recognized as important points of intervention, yet relatively few intervention studies of respiratory infection transmission have utilized a network design. Here we describe the design, methods, and social network structure of a randomized intervention for isolating respiratory infection cases in a university setting over a 10-week period. METHODOLOGY/PRINCIPAL FINDINGS: 590 students in six residence halls enrolled in the eX-FLU study during a chain-referral recruitment process from September 2012-January 2013. Of these, 262 joined as "seed" participants who nominated their social contacts to join the study; 328 of these nominated contacts enrolled as "nominees." Participants were cluster-randomized by 117 residence halls. Participants were asked to respond to weekly surveys on health behaviors, social interactions, and influenza-like illness (ILI) symptoms. Participants were randomized to either a 3-day dorm room isolation intervention or a control group (no isolation) upon illness onset. ILI cases reported on their isolation behavior during illness and provided throat and nasal swab specimens at onset, day three, and day six of illness. A subsample of individuals (n = 103) participated in a sub-study using a novel smartphone application, iEpi, which collected sensor and contextually-dependent survey data on social interactions. Within the social network, participants were significantly positively assortative by intervention group, enrollment type, residence hall, iEpi participation, age, gender, race, and alcohol use (all P < 0.002). CONCLUSIONS/SIGNIFICANCE: We identified a feasible study design for testing the impact of isolation from social networks in a university setting. 
These data provide an unparalleled opportunity to address questions about isolation and infection transmission, as well as insights into social networks and behaviors among college-aged students. Several important lessons were learned over the course of this project, including feasible isolation durations, the need for extensive organizational efforts, as well as the need for specialized programmers and server space for managing survey and smartphone data. |
Development of the World Health Organization measles programmatic risk assessment tool using experience from the 2009 measles outbreak in Namibia
Kriss JL , De Wee RJ , Lam E , Kaiser R , Shibeshi ME , Ndevaetela EE , Muroua C , Shapumba N , Masresha BG , Goodson JL . Risk Anal 2016 37 (6) 1072-1081 In the World Health Organization (WHO) African region, reported measles cases decreased by 80% and measles mortality declined by 88% during 2000-2012. Based on current performance trends, however, focused efforts will be needed to achieve the regional measles elimination goal. To prioritize efforts to strengthen implementation of elimination strategies, the Centers for Disease Control and Prevention and WHO developed a measles programmatic risk assessment tool to identify high-risk districts and guide and strengthen program activities at the subnational level. This article describes pilot testing of the tool in Namibia, comparing high-risk districts identified using 2006-2008 data with reported measles cases and incidence during the 2009 outbreak. Of the 34 health districts in Namibia, 11 (32%) were classified as high risk or very high risk, including the district of Engela, where the outbreak began in 2009. The district of Windhoek, including the capital city of Windhoek, had the highest overall risk score, driven primarily by poor population immunity and immunization program performance, and one of the highest incidences during the outbreak. Other high-risk districts were either around the capital district or in the northern part of the country near the border with Angola. Districts categorized as high or very high risk based on the 2006-2008 data generally experienced high measles incidence during the large outbreak in 2009, as did several medium- or low-risk districts. The tool can be used to guide measles elimination strategies and to identify programmatic areas that require strengthening. |
Mother-to-child HIV-1 transmission events are differentially impacted by breast milk and its components from HIV-1-infected women
Shen R , Achenbach J , Shen Y , Palaia J , Rahkola JT , Nick HJ , Smythies LE , McConnell M , Fowler MG , Smith PD , Janoff EN . PLoS One 2015 10 (12) e0145150 Breast milk is a vehicle of infection and source of protection in post-natal mother-to-child HIV-1 transmission (MTCT). Understanding the mechanism by which breast milk limits vertical transmission will provide critical insight into the design of preventive and therapeutic approaches to interrupt HIV-1 mucosal transmission. However, characterization of the inhibitory activity of breast milk in human intestinal mucosa, the portal of entry in postnatal MTCT, has been constrained by the limited availability of primary mucosal target cells and tissues to recapitulate mucosal transmission ex vivo. Here, we characterized the impact of skimmed breast milk, breast milk antibodies (Igs) and non-Ig components from HIV-1-infected Ugandan women on the major events of HIV-1 mucosal transmission using primary human intestinal cells and tissues. HIV-1-specific IgG antibodies and non-Ig components in breast milk inhibited the uptake of Ugandan HIV-1 isolates by primary human intestinal epithelial cells, viral replication in, and transport of, HIV-1-bearing dendritic cells through the human intestinal mucosa. Breast milk HIV-1-specific IgG and IgA, as well as innate factors, blocked the uptake and transport of HIV-1 through intestinal mucosa. Thus, breast milk components have distinct and complementary effects in reducing HIV-1 uptake, transport through and replication in the intestinal mucosa and, therefore, likely contribute to preventing postnatal HIV-1 transmission. Our data suggest that a successful preventive or therapeutic approach would require multiple immune factors acting at multiple steps in the HIV-1 mucosal transmission process. |
Predicting malaria vector distribution under climate change scenarios in China: Challenges for malaria elimination
Ren Z , Wang D , Ma A , Hwang J , Bennett A , Sturrock HJ , Fan J , Zhang W , Yang D , Feng X , Xia Z , Zhou XN , Wang J . Sci Rep 2016 6 20604 Projecting the distribution of malaria vectors under climate change is essential for planning integrated vector control activities for sustaining elimination and preventing reintroduction of malaria. In China, however, little knowledge exists on the possible effects of climate change on malaria vectors. Here we assess the potential impact of climate change on four dominant malaria vectors (An. dirus, An. minimus, An. lesteri and An. sinensis) using species distribution models for two future decades: the 2030s and the 2050s. Simulation-based estimates suggest that the environmentally suitable area (ESA) for An. dirus and An. minimus would increase by an average of 49% and 16%, respectively, under all three scenarios for the 2030s, but decrease by 11% and 16%, respectively, in the 2050s. By contrast, an increase of 36% and 11%, respectively, in the ESA of An. lesteri and An. sinensis was estimated under the medium stabilizing (RCP4.5) and very heavy (RCP8.5) emission scenarios in the 2050s. In total, we predict a substantial net increase in the population exposed to the four dominant malaria vectors in the 2030s and 2050s, considering land use changes and urbanization simultaneously. Strategies to achieve and sustain malaria elimination in China will need to account for these potential changes in vector distributions and receptivity. |
Mercapturic acids: recent advances in their determination by liquid chromatography/mass spectrometry and their use in toxicant metabolism studies and in occupational and environmental exposure studies
Mathias PI , B'Hymer C . Biomarkers 2016 21 (4) 1-23 This review describes recent selected HPLC/MS methods for the determination of urinary mercapturates that are useful as noninvasive biomarkers in characterizing human exposure to electrophilic industrial chemicals in occupational and environmental studies. High-performance liquid chromatography/mass spectrometry is a sensitive and specific method for analysis of small molecules found in biological fluids. In this review, recent selected mercapturate quantification methods are summarized and specific cases are presented. The biological formation of mercapturates is introduced and their use as indicators of metabolic processing of reactive toxicants is discussed, as well as future trends and limitations in this area of research. |
Prenatal exposure to perfluorocarboxylic acids (PFCAs) and fetal and postnatal growth in the Taiwan Maternal and Infant Cohort Study
Wang Y , Adgent M , Su PH , Chen HY , Chen PC , Hsiung CA , Wang SL . Environ Health Perspect 2016 124 (11) 1794-1800 BACKGROUND: Perfluorocarboxylic acids (PFCAs) are environmentally and biologically persistent synthetic chemicals. PFCAs include perfluorooctanoic acid (PFOA, C8) and long-chain PFCAs (C9-C20). Studies examining long-chain PFCAs and fetal and postnatal growth are limited. OBJECTIVES: To investigate associations between prenatal exposure to long-chain PFCAs and fetal and postnatal growth. METHODS: For 223 Taiwanese mothers and their term infants, we measured PFOA and 4 long-chain PFCAs (ng/mL) in third trimester maternal serum; infant weight (kg), length and head circumference (cm) at birth; and childhood weight and height at approximately 2, 5, 8, and 11 years of age. For each sex, we used multivariable linear regression to examine associations between ln-transformed prenatal PFCAs and continuous infant measures, and logistic regression to examine small for gestational age (SGA). Linear mixed models were applied to prenatal PFCAs and childhood weight and height z-scores. RESULTS: In girls, prenatal perfluorononanoic acid (PFNA), perfluorodecanoic acid (PFDeA), perfluoroundecanoic acid (PFUnDA), and perfluorododecanoic acid (PFDoDA) concentrations were inversely associated with birth weight (e.g., β for birth weight (kg) = -0.06, 95% confidence interval [CI]: -0.11, -0.01 per 1 ln-unit PFUnDA increase); prenatal PFDeA and PFUnDA were associated with elevated odds of SGA; and PFDeA, PFUnDA, and PFDoDA were associated with lower average childhood height z-score. In boys, prenatal PFNA and PFDoDA were associated with reductions in height at certain ages in childhood, but not with size at birth. CONCLUSIONS: Prenatal exposure to long-chain PFCAs may interfere with fetal and childhood growth in girls, and childhood growth in boys. |
Aggregation of adenovirus 2 in source water and impacts on disinfection by chlorine
Kahler AM , Cromeans TL , Metcalfe MG , Humphrey CD , Hill VR . Food Environ Virol 2016 8 (2) 148-55 It is generally accepted that viral particles in source water are likely to be found as aggregates attached to other particles. For this reason, it is important to investigate the disinfection efficacy of chlorine on aggregated viruses. A method to produce adenovirus particle aggregation was developed for this study. Negative stain electron microscopy was used to measure aggregation before and after addition of virus particles to surface water at different pH and specific conductance levels. The impact of aggregation on the efficacy of chlorine disinfection was also examined. Disinfection experiments with human adenovirus 2 (HAdV2) in source water were conducted using 0.2 mg/L free chlorine at 5 degrees C. Aggregation of HAdV2 in source water (≥3 aggregated particles) remained higher at higher specific conductance and pH levels. However, aggregation was highly variable, with the percentage of particles present in aggregates ranging from 43% to 71%. Upon addition into source water, the aggregation percentage dropped dramatically. On average, chlorination CT values (chlorine concentration in mg/L × contact time in minutes) for 3-log10 inactivation of aggregated HAdV2 were up to three times higher than those for dispersed HAdV2, indicating that aggregation reduced the disinfection rate. This information can be used by water utilities and regulators to guide decision making regarding disinfection of viruses in water. |
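The CT framework behind these disinfection experiments is simple arithmetic and can be sketched directly. The following is a minimal illustration of the two quantities the abstract reports (function names are ours, not from the study):

```python
import math

def ct_value(free_chlorine_mg_per_l, contact_time_min):
    """CT = disinfectant concentration (mg/L) x contact time (min)."""
    return free_chlorine_mg_per_l * contact_time_min

def log10_inactivation(initial_titer, surviving_titer):
    """Log10 reduction in infectious virus titer (3.0 means 99.9% inactivated)."""
    return math.log10(initial_titer / surviving_titer)

# Illustrative values: the study's 0.2 mg/L free chlorine, a hypothetical 30-min contact time
print(ct_value(0.2, 30))              # CT in mg*min/L
print(log10_inactivation(1e6, 1e3))   # a 3-log10 reduction
```

A threefold higher CT for the same 3-log10 inactivation, as reported for aggregated versus dispersed HAdV2, means either a higher residual chlorine concentration or a longer contact time is needed.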
Outbreak of foodborne botulism associated with improperly jarred pesto - Ohio and California, 2014
Burke P , Needham M , Jackson BR , Bokanyi R , St Germain E , Englender SJ . MMWR Morb Mortal Wkly Rep 2016 65 (7) 175-177 On July 28, 2014, the Cincinnati Health Department was notified of suspected cases of foodborne botulism in two women admitted to the same hospital 12 days apart. Patient A had been treated for 12 days for suspected autoimmune disease. When patient B, the roommate of patient A, was evaluated at the same medical center for similar symptoms, it was learned that on July 13, patient A and patient B had shared a meal that included prepackaged pesto from a jar; clinicians suspected botulism and notified the local health department. The pesto had been purchased from company A's farm stand in San Clemente, California. Laboratory testing detected botulinum toxin type B by enzyme-linked immunosorbent assay (ELISA) in leftovers of pasta with pesto. A culture of these food samples yielded Clostridium spp. that produced botulinum toxin type B; polymerase chain reaction (PCR) testing also was positive for type B toxin gene. Environmental assessment of company A identified improper acidification and pressurization practices and lack of licensure to sell canned products commercially, including products in hermetically-sealed jars. On July 30, the vendor voluntarily recalled all jarred products, and the California Department of Public Health (CDPH) warned the public not to consume company A's jarred foods. This report describes the two cases and the public health investigation that traced the source of the outbreak. |
Characterization of a Salivirus (Picornaviridae) from a Diarrheal Child in Guatemala.
Ng TF , Magana L , Montmayeur A , Lopez MR , Gregoricus N , Oberste MS , Vinje J , Nix WA . Genome Announc 2016 4 (1) The complete genome sequence of a salivirus was identified in a stool sample from a Guatemalan child with acute gastroenteritis during a 2009 norovirus outbreak. This genome (genotype A1 strain GUT/2009/A-1746) shares 82% to 94% genome-wide nucleotide identity with saliviruses from the United States, China, Germany, and Nigeria, representing the first salivirus sequence from Central America. |
Kroppenstedtia pulmonis sp. nov. and Kroppenstedtia sanguinis sp. nov., isolated from human patients.
Bell ME , Lasker BA , Klenk HP , Hoyles L , Sproer C , Schumann P , Brown JM . Antonie Van Leeuwenhoek 2016 109 (5) 603-10 Three human clinical strains (W9323T, X0209T and X0394) isolated from a lung biopsy, blood and cerebrospinal fluid, respectively, were characterised using a polyphasic taxonomic approach. Comparative analysis of the 16S rRNA gene sequences showed the three strains belong to two novel branches within the genus Kroppenstedtia: 16S rRNA gene sequence analysis of W9323T showed close sequence similarity to Kroppenstedtia eburnea JFMB-ATET (95.3 %), Kroppenstedtia guangzhouensis GD02T (94.7 %) and strain X0209T (94.6 %); sequence analysis of strain X0209T showed close sequence similarity to K. eburnea JFMB-ATET (96.4 %) and K. guangzhouensis GD02T (96.0 %). Strains X0209T and X0394 were 99.9 % similar to each other by 16S rRNA gene sequence analysis. The DNA-DNA relatedness was 94.6 %, confirming that X0209T and X0394 belong to the same species. Chemotaxonomic data for strains W9323T and X0209T were consistent with those described for the members of the genus Kroppenstedtia: the peptidoglycan was found to contain LL-diaminopimelic acid; the major cellular fatty acids were identified as iso-C15 and anteiso-C15; and the major menaquinone was identified as MK-7. Differences in endospore morphology, carbon source utilisation profiles, and cell wall sugar patterns of strains W9323T and X0209T, supported by phylogenetic analysis, enabled us to conclude that the strains each represent a new species within the genus Kroppenstedtia, for which the names Kroppenstedtia pulmonis sp. nov. (type strain W9323T = DSM 45752T = CCUG 68107T) and Kroppenstedtia sanguinis sp. nov. (type strain X0209T = DSM 45749T = CCUG 38657T) are proposed. |
The economic burden of childhood pneumococcal diseases in the Gambia
Usuf E , Mackenzie G , Sambou S , Atherly D , Suraratdecha C . Cost Eff Resour Alloc 2016 14 4 BACKGROUND: Streptococcus pneumoniae is a common cause of child death. However, the economic burden of pneumococcal disease in low-income countries is poorly described. We aimed to estimate, from a societal perspective, the costs incurred by health providers and families of children with pneumococcal diseases. METHODS: We recruited children less than 5 years of age with outpatient pneumonia, inpatient pneumonia, pneumococcal sepsis and bacterial meningitis at facilities in rural and urban Gambia. We collected provider costs, out of pocket costs and productivity loss for the families of children. For each disease diagnostic category, costs were collected before, during, and for 1 week after discharge from hospital or outpatient visit. RESULTS: A total of 340 children were enrolled; 100 outpatient pneumonia, 175 inpatient pneumonia, 36 pneumococcal sepsis, and 29 bacterial meningitis cases. The mean provider costs per patient for treating outpatient pneumonia, inpatient pneumonia, pneumococcal sepsis and meningitis were US$8, US$64, US$87 and US$124 respectively, and the mean out of pocket costs per patient were US$6, US$31, US$44 and US$34 respectively. The economic burden of outpatient pneumonia, inpatient pneumonia, pneumococcal sepsis and meningitis increased to US$15, US$109, US$144 and US$170 respectively when family members' time loss from work was taken into account. CONCLUSION: The economic burden of pneumococcal disease in The Gambia is substantial: costs to families were approximately one-third to one-half of the provider costs and accounted for up to 30% of total societal costs. The introduction of pneumococcal conjugate vaccine has the potential to significantly reduce this economic burden in this society. |
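The societal-cost arithmetic in this abstract decomposes cleanly: total societal cost per case = provider cost + out-of-pocket cost + family time loss from work. A minimal sketch using the rounded per-patient means reported above (function names are ours; the residual only approximates time loss because the reported means are rounded):

```python
def societal_cost(provider_usd, out_of_pocket_usd, family_time_loss_usd):
    """Total societal cost per case: sum of the three components (US$)."""
    return provider_usd + out_of_pocket_usd + family_time_loss_usd

def implied_family_time_loss(societal_usd, provider_usd, out_of_pocket_usd):
    """Back out the family time-loss component from the reported totals."""
    return societal_usd - (provider_usd + out_of_pocket_usd)

# Outpatient pneumonia means from the abstract: US$8 provider, US$6 out of pocket, US$15 societal
print(implied_family_time_loss(15, 8, 6))    # roughly US$1 of time loss per case
# Inpatient pneumonia: US$64 provider, US$31 out of pocket, US$109 societal
print(implied_family_time_loss(109, 64, 31))  # roughly US$14 of time loss per case
```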
Intestinal microbiome disruption in patients in a long-term acute care hospital: A case for development of microbiome disruption indices to improve infection prevention.
Halpin AL , de Man TJ , Kraft CS , Perry KA , Chan AW , Lieu S , Mikell J , Limbago BM , McDonald LC . Am J Infect Control 2016 44 (7) 830-6 BACKGROUND: Disruption of the composition and diversity of intestinal microbial communities (microbiota) is generally accepted as a risk factor for poor outcomes; however, we cannot yet use this information to prevent adverse outcomes. METHODS: Stool was collected from 8 long-term acute care hospital patients experiencing diarrhea and 2 fecal microbiota transplant donors; 16S rDNA V1-V2 hypervariable regions were sequenced. Composition and diversity of each sample were described. Stool was also tested for Clostridium difficile, vancomycin-resistant enterococci (VRE), and carbapenem-resistant Enterobacteriaceae. Associations between microbiota diversity and demographic and clinical characteristics, including antibiotic use, were analyzed. RESULTS: Antibiotic exposure and Charlson Comorbidity Index were inversely correlated with diversity (Spearman's ρ = -0.7). Two patients were positive for VRE; both had microbiomes dominated by Enterococcus faecium, accounting for 67%-84% of their microbiome. CONCLUSIONS: Antibiotic exposure correlated with diversity; however, other environmental and host factors not easily obtainable in a clinical setting are also known to impact the microbiota. Therefore, direct measurement of microbiome disruption by sequencing, rather than reliance on surrogate markers, might be most predictive of adverse outcomes. If and when microbiome characterization becomes a standard diagnostic test, improving our understanding of microbiome dynamics will allow for interpretation of results to improve patient outcomes. |
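Microbiome "diversity" in studies like this one is commonly summarized with an index such as Shannon's H, computed from taxon relative abundances; the abstract does not name its metric, so the following is a generic sketch of the idea, not the authors' analysis pipeline:

```python
import math

def shannon_diversity(counts):
    """Shannon index H = -sum(p_i * ln p_i) over taxon proportions; higher = more diverse."""
    total = sum(counts)
    proportions = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in proportions)

# A sample dominated by one taxon (like the E. faecium-dominated microbiomes above,
# 67%-84% of reads from a single genus) scores far lower than an even community:
print(shannon_diversity([84, 8, 4, 4]))     # dominated -> low H
print(shannon_diversity([25, 25, 25, 25]))  # even -> H = ln(4), about 1.386
```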
New societal approaches to empowering antibiotic stewardship
Spellberg B , Srinivasan A , Chambers HF . JAMA 2016 315 (12) 1229-30 Substantial concern regarding the ever-worsening crisis of antibiotic resistance has been raised by the World Health Organization, US Centers for Disease Control and Prevention (CDC), European Centre for Disease Prevention and Control, European Medicines Agency, Institute of Medicine, World Economic Forum, and the US Presidential Advisory Council on Science and Technology. The question is no longer whether to act, but how. | Antibiotic stewardship is the term used to describe efforts to optimize selection of antibiotic therapy. Formal antibiotic stewardship programs are essential to help society address antimicrobial resistance by reducing the more than 50% of antibiotic use that is estimated to be unnecessary or inappropriate.1 The US government has recently emphasized the need for implementation of antibiotic stewardship programs at all hospitals.2 To be effective, antibiotic stewardship programs must incorporate best practices, which include dedicating sufficient resources to the program, appointing a single leader to be accountable for performance, having appropriate antibiotic expertise, implementing action plans, monitoring bacterial resistance, reporting antibiotic usage to staff, and providing education.3 |
Notes from the field: Verona integron-encoded metallo-beta-lactamase-producing carbapenem-resistant Enterobacteriaceae in a neonatal and adult intensive care unit - Kentucky, 2015
Yaffee AQ , Roser L , Daniels K , Humbaugh K , Brawley R , Thoroughman D , Flinchum A . MMWR Morb Mortal Wkly Rep 2016 65 (7) 190 During August 4-September 1, 2015, eight cases of Verona integron-encoded metallo-beta-lactamase (VIM)-producing carbapenem-resistant Enterobacteriaceae (CRE) colonization were identified in six patients, using weekly active surveillance perirectal cultures in a Kentucky tertiary care hospital. No cases of clinical infection or complications attributable to colonization were reported. Four of the eight isolates were identified as Enterobacter cloacae; other organisms included Raoultella species (one), Escherichia coli (one), and Klebsiella pneumoniae (two). Six isolates were reported in a neonatal intensive care unit (ICU), and two isolates in an adult trauma and surgical ICU. Patient ages at isolate culture date ranged from 21 days to 68 years. Fifty percent of the patients were male. Previously, only one VIM-producing CRE-colonized patient (an adult, in 2013) had been reported by the same hospital. This cluster is the largest occurrence of VIM-producing CRE colonization reported in the United States and the only one recognized to include a neonatal population. Despite environmental sampling over the same period, surveying patients for exposure to health care outside the United States, surveying health care providers for risk factors, and surveillance culturing of health care provider nares and axillae, a source of VIM-producing CRE has not been identified for this cluster. Prevention measures throughout the ICUs have been enhanced in response to this cluster, as detailed in CDC's 2015 CRE toolkit update. |
Beyond infection: device utilization ratio as a performance measure for urinary catheter harm
Fakih MG , Gould CV , Trautner BW , Meddings J , Olmsted RN , Krein SL , Saint S . Infect Control Hosp Epidemiol 2016 37 (3) 327-33 Catheter-associated urinary tract infection (CAUTI) is considered a reasonably preventable event in the hospital setting, and it has been included in the US Department of Health and Human Services National Action Plan to Prevent Healthcare-Associated Infections. While multiple definitions for measuring CAUTI exist, each has important limitations, and understanding these limitations is important to both clinical practice and policy decisions. The National Healthcare Safety Network (NHSN) surveillance definition, the most frequently used outcome measure for CAUTI prevention efforts, has limited clinical correlation and does not necessarily reflect noninfectious harms related to the catheter. We advocate use of the device utilization ratio (DUR) as an additional performance measure for potential urinary catheter harm. The DUR is patient-centered and objective and is currently captured as part of NHSN reporting. Furthermore, these data are readily obtainable from electronic medical records. The DUR also provides a more direct reflection of improvement efforts focused on reducing inappropriate urinary catheter use. |
Vaccination and 30-day mortality risk in children, adolescents, and young adults
McCarthy NL , Gee J , Sukumaran L , Weintraub E , Duffy J , Kharbanda EO , Baxter R , Irving S , King J , Daley MF , Hechter R , McNeil MM . Pediatrics 2016 137 (3) e20152970 OBJECTIVE: This study evaluates the potential association of vaccination and death in the Vaccine Safety Datalink (VSD). METHODS: The study cohort included individuals ages 9 to 26 years with deaths between January 1, 2005, and December 31, 2011. We implemented a case-centered method to estimate a relative risk (RR) for death in days 0 to 30 after vaccination. Deaths due to external causes (accidents, homicides, and suicides) were excluded from the primary analysis. In a secondary analysis, we included all deaths regardless of cause. A team of physicians reviewed available medical records and coroner's reports to confirm cause of death and assess the causal relationship between death and vaccination. RESULTS: Of the 1100 deaths identified during the study period, 76 (7%) occurred 0 to 30 days after vaccination. The relative risks for deaths after any vaccination and influenza vaccination were significantly lower for deaths due to nonexternal causes (RR 0.57, 95% confidence interval [CI] 0.38-0.83, and RR 0.44, 95% CI 0.24-0.80, respectively) and deaths due to all causes (RR 0.72, 95% CI 0.56-0.91, and RR 0.44, 95% CI 0.28-0.65). No other individual vaccines were significantly associated with death. Among deaths reviewed, 1 cause of death was unknown, 25 deaths were due to nonexternal causes, and 34 deaths were due to external causes. The causality assessment found no evidence of a causal association between vaccination and death. CONCLUSIONS: Risk of death was not increased during the 30 days after vaccination, and no deaths were found to be causally associated with vaccination. |
Prevalence of HPV after introduction of the vaccination program in the United States
Markowitz LE , Liu G , Hariri S , Steinau M , Dunne EF , Unger ER . Pediatrics 2016 137 (3) e20151968 BACKGROUND: Since mid-2006, human papillomavirus (HPV) vaccination has been recommended for females aged 11 to 12 years and through 26 years if not previously vaccinated. METHODS: HPV DNA prevalence was analyzed in cervicovaginal specimens from females aged 14 to 34 years in NHANES in the prevaccine era (2003-2006) and 4 years of the vaccine era (2009-2012) according to age group. Prevalence of quadrivalent HPV vaccine (4vHPV) types (HPV-6, -11, -16, and -18) and other HPV type categories were compared between eras. Prevalence among sexually active females aged 14 to 24 years was also analyzed according to vaccination history. RESULTS: Between the prevaccine and vaccine eras, 4vHPV type prevalence declined from 11.5% to 4.3% (adjusted prevalence ratio [aPR]: 0.36 [95% confidence interval (CI): 0.21-0.61]) among females aged 14 to 19 years and from 18.5% to 12.1% (aPR: 0.66 [95% CI: 0.47-0.93]) among females aged 20 to 24 years. There was no decrease in 4vHPV type prevalence in older age groups. Within the vaccine era, among sexually active females aged 14 to 24 years, 4vHPV type prevalence was lower in vaccinated (≥1 dose) compared with unvaccinated females: 2.1% vs 16.9% (aPR: 0.11 [95% CI: 0.05-0.24]). There were no statistically significant changes in other HPV type categories that would indicate cross-protection. CONCLUSIONS: Within 6 years of vaccine introduction, there was a 64% decrease in 4vHPV type prevalence among females aged 14 to 19 years and a 34% decrease among those aged 20 to 24 years. This finding extends previous observations of population impact in the United States and demonstrates the first national evidence of impact among females in their 20s. |
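The percentage decreases quoted in the conclusions follow directly from the adjusted prevalence ratios reported in the results: a decline of (1 − aPR) × 100%. A minimal sketch of that arithmetic, rounding to whole percent as the abstract does:

```python
def percent_decline(adjusted_prevalence_ratio):
    """Percent decline implied by an adjusted prevalence ratio (aPR < 1)."""
    return round((1.0 - adjusted_prevalence_ratio) * 100)

print(percent_decline(0.36))  # 64% decline, females aged 14-19 years
print(percent_decline(0.66))  # 34% decline, females aged 20-24 years
```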
Prevention of antibiotic-nonsusceptible invasive pneumococcal disease with the 13-valent pneumococcal conjugate vaccine
Tomczyk SM , Lynfield R , Schaffner W , Reingold A , Miller L , Petit S , Holtzman C , Zansky SM , Thomas A , Baumbach J , Harrison LH , Farley MM , Beall B , McGee L , Gierke R , Pondo T , Kim L . Clin Infect Dis 2016 62 (9) 1119-25 BACKGROUND: Antibiotic-nonsusceptible invasive pneumococcal disease (IPD) decreased substantially after the US introduction of the pediatric 7-valent pneumococcal conjugate vaccine (PCV7) in 2000. However, rates of antibiotic-nonsusceptible non-PCV7-type IPD increased during 2004-2009. In 2010, the 13-valent pneumococcal conjugate vaccine (PCV13) replaced PCV7. We assessed the impact of PCV13 on antibiotic-nonsusceptible IPD rates. METHODS: We defined IPD as pneumococcal isolation from a normally sterile site in a resident of one of 10 US surveillance sites. Antibiotic-nonsusceptible isolates were those intermediate or resistant to ≥1 antibiotic class according to 2012 Clinical and Laboratory Standards Institute breakpoints. We examined rates of antibiotic-nonsusceptibility and estimated cases prevented as the difference between observed cases of antibiotic-nonsusceptible IPD and the cases that would have occurred if PCV13 had not been introduced. RESULTS: From 2009-2013, rates of antibiotic-nonsusceptible IPD caused by serotypes included in PCV13 but not in PCV7 decreased from 6.5 to 0.5 per 100,000 in children aged <5 years and from 4.4 to 1.4 per 100,000 in adults aged ≥65 years. During 2010-2013, we estimated that 1,636 and 1,327 cases of antibiotic-nonsusceptible IPD caused by serotypes included in PCV13 but not in PCV7 were prevented among children aged <5 years (-97% difference) and among adults aged ≥65 years (-64% difference), respectively. Although we observed small increases in antibiotic-nonsusceptible IPD caused by non-PCV13 serotypes, no non-PCV13 serotype dominated among antibiotic-nonsusceptible strains. CONCLUSIONS: Following PCV13 introduction, antibiotic-nonsusceptible IPD decreased in multiple age groups. 
Continued surveillance is needed to monitor trends of non-vaccine serotypes. Pneumococcal conjugate vaccines are important tools in the approach to combat antibiotic resistance. |
Global varicella vaccine effectiveness: A meta-analysis
Marin M , Marti M , Kambhampati A , Jeram SM , Seward JF . Pediatrics 2016 137 (3) e20153741 CONTEXT: Several varicella vaccines are available worldwide. Countries with a varicella vaccination program use 1- or 2-dose schedules. OBJECTIVE: We examined postlicensure estimates of varicella vaccine effectiveness (VE) among healthy children. DATA SOURCES: Systematic review, with descriptive analysis and meta-analysis, of the Medline, Embase, Cochrane, and CINAHL databases for reports published during 1995-2014. STUDY SELECTION: Publications that reported original data on dose-specific varicella VE among immunocompetent children. DATA EXTRACTION: We used random effects meta-analysis models to obtain pooled 1-dose VE estimates by disease severity (all varicella and moderate/severe varicella). Within each severity category, we assessed pooled VE by vaccine and by study design. We used descriptive statistics to summarize 1-dose VE against severe disease. For 2-dose VE, we calculated pooled estimates against all varicella and by study design. RESULTS: The pooled 1-dose VE was 81% (95% confidence interval [CI]: 78%-84%) against all varicella and 98% (95% CI: 97%-99%) against moderate/severe varicella, with no significant association between VE and vaccine type or study design (P > .1). For 1 dose, median VE for prevention of severe disease was 100% (mean = 99.4%). The pooled 2-dose VE against all varicella was 92% (95% CI: 88%-95%), with similar estimates by study design. LIMITATIONS: VE was assessed primarily during outbreak investigations and using clinically diagnosed varicella. CONCLUSIONS: One dose of varicella vaccine was moderately effective in preventing all varicella and highly effective in preventing moderate/severe varicella, with no differences by vaccine. The second dose adds improved protection against all varicella. |
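Vaccine effectiveness figures like the pooled estimates above are conventionally derived as 1 minus a measure of relative risk. A minimal sketch of that relationship (the attack rates below are hypothetical numbers chosen for illustration, not data from the review):

```python
def vaccine_effectiveness(attack_rate_vaccinated, attack_rate_unvaccinated):
    """VE (%) = (1 - RR) * 100, where RR = attack rate in vaccinated / unvaccinated."""
    relative_risk = attack_rate_vaccinated / attack_rate_unvaccinated
    return (1.0 - relative_risk) * 100.0

# Hypothetical outbreak: 3.8% of vaccinated vs 20% of unvaccinated children infected
print(vaccine_effectiveness(0.038, 0.20))  # about 81%, matching the pooled 1-dose estimate
```

In the pooled analysis above this computation is done within a random-effects meta-analysis across studies rather than on a single pair of attack rates.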
The impact and cost-effectiveness of 3 doses of 9-valent human papillomavirus (HPV) vaccine among US females previously vaccinated with 4-valent HPV vaccine
Chesson HW , Laprise JF , Brisson M , Markowitz LE . J Infect Dis 2016 213 (11) 1694-700 BACKGROUND: We estimated the potential impact and cost-effectiveness of providing 3 doses of nonavalent human papillomavirus (HPV) vaccine (9vHPV) to females aged 13 to 18 years who had previously completed a series of quadrivalent HPV vaccine (4vHPV), a strategy we refer to as "additional 9vHPV vaccination." METHODS: We used two distinct models: (1) the "simplified model," which is among the most basic of the published dynamic HPV models, and (2) US HPV-ADVISE, a complex, stochastic individual-based transmission-dynamic model. RESULTS: When assuming no 4vHPV cross-protection, the incremental cost per quality-adjusted life year (QALY) gained by additional 9vHPV vaccination was $146,200 in the simplified model and $108,200 in the US HPV-ADVISE model ($191,800 when assuming 4vHPV cross-protection). In one-way sensitivity analyses in the scenario of no 4vHPV cross-protection, the simplified model results ranged from $70,300 to $182,000 and the US HPV-ADVISE model results ranged from $97,600 to $118,900. CONCLUSIONS: The average cost per QALY gained by additional 9vHPV vaccination exceeded $100,000 in both models. However, the results varied considerably in sensitivity and uncertainty analyses. Additional 9vHPV vaccination is likely not as efficient as many other potential HPV vaccination strategies, such as increasing primary 9vHPV vaccine coverage. |
Impact and cost-effectiveness of a second tetanus toxoid, reduced diphtheria toxoid, and acellular pertussis (Tdap) vaccine dose to prevent pertussis in the United States
Kamiya H , Cho BH , Messonnier M , Clark TA , Liang JL . Vaccine 2016 34 (15) 1832-8 INTRODUCTION: The United States experienced a substantial increase in reported pertussis cases over the last decade. Since 2005, persons 11 years and older have been routinely recommended to receive a single dose of tetanus toxoid, reduced diphtheria toxoid and acellular pertussis (Tdap) vaccine. The objective of this analysis was to evaluate the potential impact and cost-effectiveness of recommending a second dose of Tdap. METHODS: A static cohort model was used to calculate the epidemiologic and economic impact of adding a second dose of Tdap at age 16 or 21 years. Projected costs and outcomes were examined from a societal perspective over a 20-year period. Quality-adjusted life-years (QALYs) saved were calculated. RESULTS: Using baseline pertussis incidence from the National Notifiable Diseases Surveillance System, Tdap revaccination at either age 16 or 21 years would reduce outpatient visits by 433 (5%) and 285 (4%), and hospitalizations by 7 (7%) and 5 (5%), respectively. The costs per QALY saved with a second dose of Tdap were approximately US $19.7 million (16 years) and $26.2 million (21 years). In sensitivity analyses, incidence most influenced the model; as incidence increased, the costs per QALY decreased. To a lesser degree, initial vaccine effectiveness and waning of effectiveness also affected cost outcomes. Multivariate sensitivity analyses showed that under a set of optimistic assumptions, the cost per QALY saved would be approximately $163,361 (16 years) and $204,556 (21 years). CONCLUSION: A second dose of Tdap resulted in a slight decrease in the number of cases and other outcomes, and that trend is more apparent when revaccinating at age 16 years than at age 21 years. Both revaccination strategies had a high cost per QALY saved even under optimistic assumptions in a multivariate sensitivity analysis. |
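The static-cohort arithmetic behind a dollars-per-QALY figure can be sketched in a few lines. All inputs below are hypothetical round numbers, not the study's parameters, and both helper functions are invented for illustration.

```python
def cases_averted(cohort_size, annual_incidence_per_100k, ve, coverage, years):
    """Cases averted in a static cohort: baseline cases x effectiveness x uptake."""
    baseline_cases = cohort_size * (annual_incidence_per_100k / 100_000) * years
    return baseline_cases * ve * coverage

def cost_per_qaly(net_program_cost, qalys_gained):
    """Incremental cost-effectiveness ratio in dollars per QALY saved."""
    return net_program_cost / qalys_gained

# Hypothetical inputs: a 4M-person birth cohort, 9 reported cases/100k/year,
# 60% vaccine effectiveness, 50% uptake, 20-year horizon
averted = cases_averted(4_000_000, 9, ve=0.60, coverage=0.50, years=20)
# Hypothetical net program cost and a small QALY gain per averted case
icer = cost_per_qaly(net_program_cost=150_000_000, qalys_gained=averted * 0.005)
```

Because the QALY gain per averted outpatient case is small, even a moderate program cost yields a cost per QALY in the millions of dollars, which mirrors the pattern of the base-case results reported above.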
Inactivated poliovirus type 2 vaccine delivered to rat skin via high density microprojection array elicits potent neutralising antibody responses
Muller DA , Pearson FE , Fernando GJ , Agyei-Yeboah C , Owens NS , Corrie SR , Crichton ML , Wei JC , Weldon WC , Oberste MS , Young PR , Kendall MA . Sci Rep 2016 6 22094 Polio eradication is progressing rapidly, and the live attenuated Sabin strains in the oral poliovirus vaccine (OPV) are being removed sequentially, starting with type 2 in April 2016. For risk mitigation, countries are introducing inactivated poliovirus vaccine (IPV) into routine vaccination programs. After April 2016, monovalent type 2 OPV will be available for type 2 outbreak control. Because the current IPV is not suitable for house-to-house vaccination campaigns (the intramuscular injections require health professionals), we developed a high-density microprojection array, the Nanopatch, to deliver monovalent type 2 IPV (IPV2) vaccine to the skin. To assess the immunogenicity of the Nanopatch, we performed a dose-matched study in rats, comparing the immunogenicity of IPV2 delivered by intramuscular injection or Nanopatch immunisation. A single Nanopatch-delivered dose of 0.2 D-antigen units of IPV2 elicited protective levels of poliovirus antibodies in 100% of animals. However, animals receiving IPV2 by intramuscular injection required at least 3 immunisations to reach the same neutralising antibody titres. This level of dose reduction (1/40th of a full dose) is unprecedented for poliovirus vaccine delivery. The ease of administration coupled with the dose reduction observed in this study points to the Nanopatch as a potential tool for facilitating inexpensive IPV for mass vaccination campaigns. |
Complete influenza vaccination trends for children six to twenty-three months
Santibanez TA , Grohskopf LA , Zhai Y , Kahn KE . Pediatrics 2016 137 (3) e20153280 OBJECTIVE: Prevention of influenza among infants and young children is a public health priority because of their high risk for influenza-related complications. Depending on their age and previous influenza vaccination history, children are recommended to receive either 1 dose or 2 doses of influenza vaccine to be considered fully vaccinated against influenza for the season. We compared estimates of full (complete) influenza vaccination coverage of children 6 to 23 months across 10 consecutive influenza seasons (2002-2012), by race/ethnicity, age group, and number of doses required to be fully vaccinated given the child's vaccination history. METHODS: National Immunization Survey data were used to estimate full influenza vaccination status among children 6 to 23 months on the basis of provider report. Estimates were computed by using Kaplan-Meier survival analysis methods. RESULTS: Full influenza vaccination coverage among children 6 to 23 months increased from 4.8% in the 2002-2003 influenza season to 44.7% in the 2011-2012 season. In all 10 influenza seasons studied, non-Hispanic black children and Hispanic children had lower full influenza vaccination coverage than non-Hispanic white children. For all 10 influenza seasons, full influenza vaccination coverage was higher among children requiring only 1 dose compared with those requiring 2 doses. CONCLUSIONS: Less than half of children 6 to 23 months in the United States, and an even smaller percentage of Hispanic and non-Hispanic black children, are fully vaccinated against influenza. Wider implementation of evidence-based strategies that increase the percentage of children who are fully vaccinated is needed. |
Declining effectiveness of herpes zoster vaccine in adults aged ≥60 years
Tseng HF , Harpaz R , Luo Y , Hales CM , Sy LS , Tartof SY , Bialek S , Hechter RC , Jacobsen SJ . J Infect Dis 2016 213 (12) 1872-5 Understanding the long-term effectiveness of herpes zoster (HZ) vaccine is critical for determining vaccine policy. The study included 176,078 Kaiser Permanente members aged ≥60 years vaccinated with HZ vaccine, each matched to three unvaccinated members. The hazard ratio (HR) and 95% confidence interval (CI) associated with vaccination in each year following vaccination were estimated by Cox regression. The effectiveness of HZ vaccine decreased from 68.7% (95% CI, 66.3%-70.9%) in the first year to 4.2% (95% CI, -24.0%-25.9%) in the eighth year. This rapid decline in effectiveness of HZ vaccine suggests that a revaccination strategy may be needed, if feasible. |
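The effectiveness figures above are one minus the hazard ratio from the Cox model. A minimal sketch of that back-transformation follows; the HR of 0.313 reproduces the reported first-year VE of 68.7%, but the standard error used here is a made-up value for illustration, not the study's.

```python
import math

def ve_from_hr(hr, se_log_hr, z=1.96):
    """Vaccine effectiveness (VE = 1 - HR) with a 95% CI.

    The CI is computed on the log-hazard-ratio scale and back-transformed,
    so the upper HR bound becomes the lower VE bound and vice versa.
    """
    log_hr = math.log(hr)
    hr_lo = math.exp(log_hr - z * se_log_hr)
    hr_hi = math.exp(log_hr + z * se_log_hr)
    return 1 - hr, 1 - hr_hi, 1 - hr_lo  # point estimate, lower, upper

# HR = 0.313 corresponds to the reported first-year VE of 68.7%;
# the standard error below is hypothetical.
ve, ve_lo, ve_hi = ve_from_hr(0.313, se_log_hr=0.035)
```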
Effectiveness of pneumococcal conjugate vaccine against presumed bacterial pneumonia hospitalisation in HIV-uninfected South African children: a case-control study
Madhi SA , Groome MJ , Zar HJ , Kapongo CN , Mulligan C , Nzenze S , Moore DP , Zell ER , Whitney CG , Verani JR . Thorax 2015 70 (12) 1149-55 INTRODUCTION: We evaluated pneumococcal conjugate vaccine (PCV) effectiveness against hospitalisation for presumed bacterial pneumonia (PBP) in HIV-uninfected South African children. 7-valent PCV was introduced in April 2009 using a 2+1 schedule (doses at age 6, 14 and 39 weeks) and superseded by 13-valent PCV in May 2011. METHODS: A matched case-control study was conducted at three public hospitals (Soweto, Cape Town and KwaZulu-Natal) between April 2009 and August 2012. PBP cases had either WHO-defined radiographically confirmed pneumonia or 'other infiltrate' on chest radiograph with C-reactive protein ≥ 40 mg/L. Hospitalised controls were children admitted with a disease unlikely to be pneumococcal and matched for case age, site and HIV infection status. Age-matched community controls were enrolled from Soweto. Adjusted vaccine effectiveness (aVE) was estimated using conditional logistic regression. RESULTS: Of 1444 HIV-uninfected enrolled PBP cases, 1326 had ≥ 1 hospital control (n=2075). Overall, aVE of an up-to-date PCV schedule was 20.1% (95% CI -9.3% to 41.6%) in children aged ≥ 8 weeks and 39.2% (95% CI 8.46% to 59.6%) among children 16-103 weeks of age. There were 889 PBP cases in Soweto with hospital controls and ≥ 1 community control (n=2628). In Soweto, aVE estimates using community controls were similar to those using hospital controls: 32.1% (95% CI 4.6% to 51.6%) and 38.4% (95% CI 7.7% to 58.8%), respectively, in the age group ≥ 8 weeks, and 52.7% (95% CI 25.7% to 69.9%) and 53.8% (95% CI 19.5% to 73.5%), respectively, in the age group 16-103 weeks. CONCLUSIONS: PCV implemented using a 2+1 schedule in the routine infant immunisation programme was effective at preventing PBP in HIV-uninfected children. 
Effectiveness estimates were similar to efficacy measured by earlier randomised controlled trials using different vaccination schedules. |
A user-centered model for designing consumer mobile health applications (apps)
Schnall R , Rojas M , Bakken S , Brown W , Carballo-Dieguez A , Carry M , Gelaude D , Mosley JP , Travers J . J Biomed Inform 2016 60 243-51 BACKGROUND: Mobile technologies are a useful platform for the delivery of health behavior interventions. Yet little work has been done to create a rigorous and standardized process for the design of mobile health (mHealth) apps. This project sought to explore the use of the Information Systems Research (ISR) framework as a guide for the design of mHealth apps. METHODS: Our work was guided by the ISR framework, which comprises 3 cycles: Relevance, Rigor and Design. In the Relevance cycle, we conducted 5 focus groups with 33 targeted end-users. In the Rigor cycle, we performed a review to identify technology-based interventions for meeting the health prevention needs of our target population. In the ISR Design Cycle, we employed usability evaluation methods to iteratively develop and refine mock-ups for an mHealth app. RESULTS: Through an iterative process, we identified barriers and facilitators to the use of mHealth technology for HIV prevention for high-risk men who have sex with men (MSM), developed 'use cases' and identified relevant functional content and features for inclusion in a design document to guide future app development. Findings from our work support the use of the ISR framework as a guide for designing future mHealth apps. DISCUSSION: Results from this work provide detailed descriptions of the user-centered design and system development and have heuristic value for those venturing into the area of technology-based intervention work. Findings from this study support the use of the ISR framework as a guide for future mHealth app development. CONCLUSION: Use of the ISR framework is a potentially useful approach for the design of a mobile app that incorporates end-users' design preferences. |
Road traffic fatalities in selected governorates of Iraq from 2010 to 2013: prospective surveillance
Leidman E , Maliniak M , Sultan AS , Hassan A , Hussain SJ , Bilukha OO . Confl Health 2016 10 2 BACKGROUND: The insurgency tactics that characterize modern warfare, such as suicide car bombs and roadside bombs, have the potential to significantly impact road traffic injuries in conflict-affected countries. As road traffic incidents are one of the top ten causes of death in Iraq, changes in incidence have important implications for the health system. We aimed to describe patterns of road traffic fatalities for all demographic groups and types of road users in Iraq during a period characterized by a resurgence in insurgency activity. METHODS: The Iraqi Ministry of Health's routine prospective injury surveillance collects information on all fatal injuries in eight governorates of Iraq: Baghdad, Al-Anbar, Basrah, Erbil, Kerbala, Maysan, Ninevah, and Al-Sulaimaniya. From all injury fatalities documented at the coroner's office, we analyzed only those attributed to road traffic that occurred between 1 January 2010 and 31 December 2013. Coroners ascertain information from physical examinations, police reports and family members. RESULTS: Analysis included 7,976 road traffic fatalities. Overall, 6,238 (78.2 %) fatalities were male and 2,272 (28.5 %) were children under 18 years of age. The highest numbers of road traffic fatalities were among males 15 to 34 years of age and children of both sexes under 5 years of age. Pedestrians accounted for 49.2 % of fatalities. Among children and females, the majority of road traffic fatalities were pedestrians, 69.0 % and 56.6 %, respectively. Fatalities among motorcyclists (3.7 %) and bicyclists (0.4 %) were least common. Rates of road traffic fatalities ranged from 8.6 to 10.7 per 100,000 population. CONCLUSIONS: The injury surveillance system provides the first data from a conflict-affected country on road traffic fatalities disaggregated by type of road user. The highest numbers of fatalities were among children and young men. 
Nearly half of fatalities were pedestrians, a proportion nearly double that of any neighboring country. As insurgency activity increased in 2013, the number of road traffic fatalities declined. |
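The per-100,000 figures quoted in the RESULTS section are crude rates. A one-line sketch follows; the counts below are hypothetical round numbers chosen only to land inside the reported range, not the surveillance totals.

```python
def rate_per_100k(events, population):
    """Crude rate per 100,000 population."""
    return events / population * 100_000

# Hypothetical: ~2,000 annual fatalities over a covered population of 20.5M
annual_rate = rate_per_100k(2_000, 20_500_000)
```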
Nocardia donostiensis sp. nov., isolated from human respiratory specimens.
Ercibengoa M , Bell M , Marimon JM , Humrighouse B , Klenk HP , Potter G , Perez-Trallero E . Antonie Van Leeuwenhoek 2016 109 (5) 653-60 Three human clinical isolates (X1654, X1655, and W9944) were recovered from the sputum and bronchial washings of two patients with pulmonary infections. The 16S rRNA gene sequence analysis of the isolates showed that they share 100 % sequence similarity with each other and belong to the genus Nocardia. Close phylogenetic neighbours are Nocardia brevicatena ATCC 15333T (98.6 %) and Nocardia paucivorans ATCC BAA-278T (98.4 %). The in silico DNA-DNA relatedness between the isolates ranges from 96.8 to 100 %, suggesting that they belong to the same genomic species. The DNA-DNA relatedness between X1654 and N. brevicatena ATCC 15333T is 13.3 +/- 2.3 % and between X1654 and N. paucivorans ATCC BAA-278T is 18.95 +/- 1.1 %, suggesting that they do not belong to the same genomic species. Believed to represent a novel species, these isolates were further characterised to establish their taxonomic standing within the genus. Chemotaxonomic data for isolate X1654 are consistent with those described for the genus Nocardia: the isolate produced saturated and unsaturated fatty acids, including tuberculostearic acid (15.9 %); the major menaquinone was MK-8 (H4cyclic); mycolic acid chain lengths ranged from 38 to 58 carbons; and the cell wall contained meso-diaminopimelic acid, with arabinose, glucose, and galactose as the whole-cell sugars. The polar lipids were diphosphatidylglycerol, phosphatidylethanolamine, phosphatidylinositol, and phosphatidylinositol mannosides. The DNA G+C content is 66.7 mol %. Based on the combination of phenotypic, chemotaxonomic, and genotypic data for X1654, X1655, and W9944, we conclude that these isolates represent a novel species within the genus Nocardia, for which we propose the name Nocardia donostiensis sp. nov., with X1654T (=DSM 46814T = CECT 8839T) as the type strain. |
Mammalian pathogenesis and transmission of H7N9 influenza viruses from three waves, 2013-2015.
Belser JA , Creager HM , Sun X , Gustin KM , Jones T , Shieh WJ , Maines TR , Tumpey TM . J Virol 2016 90 (9) 4647-4657 Three waves of human infection with H7N9 influenza viruses have concluded to date, but only viruses within the first wave (isolated between March and September 2013) have been extensively studied in mammalian models. While second- and third-wave viruses remain closely linked phylogenetically and antigenically, even subtle molecular changes can impart critical shifts in mammalian virulence. To determine if H7N9 viruses isolated from humans during 2013-2015 have maintained the phenotype first identified among 2013 isolates, we assessed the ability of first-, second-, and third-wave H7N9 viruses isolated from humans to cause disease in mice and ferrets and to transmit among ferrets. Similar to first-wave viruses, H7N9 viruses from 2013-2015 were highly infectious in mice, with lethality comparable to the well-studied A/Anhui/1/2013 virus. Second- and third-wave viruses caused moderate disease in ferrets, transmitted efficiently to cohoused, naive contact animals, and demonstrated limited transmissibility by respiratory droplets. All H7N9 viruses replicated efficiently in human bronchial epithelial cells, with subtle changes in pH fusion threshold identified between the H7N9 viruses examined. Our results indicate that despite increased genetic diversity and geographical distribution since their initial detection in 2013, H7N9 viruses have maintained a pathogenic phenotype in mammals and continue to represent an immediate threat to public health. IMPORTANCE: H7N9 influenza viruses, first isolated in 2013, continue to cause human infection and represent an ongoing public health threat. Now entering the fourth wave of human infection, H7N9 viruses continue to exhibit genetic diversity in avian hosts, necessitating continuous efforts to monitor their pandemic potential. 
However, viruses isolated after 2013 have not been extensively studied, limiting our understanding of potential changes in virus-host adaptation. In order to ensure that current research with first-wave H7N9 viruses still pertains to more recently isolated strains, we compared the relative virulence and transmissibility of H7N9 viruses isolated during the second and third waves, through 2015, in the mouse and ferret models. Our finding that second- and third-wave viruses generally cause disease in mammals comparable to that caused by first-wave viruses strengthens our ability to extrapolate research on the 2013 viruses to current public health efforts. These data further contribute to our understanding of the molecular determinants of pathogenicity, transmissibility, and tropism. |
Recovery of West Nile virus envelope protein domain III chimeras with altered antigenicity and mouse virulence
McAuley AJ , Torres M , Plante JA , Huang CY , Bente DA , Beasley DW . J Virol 2016 90 (9) 4757-4770 Flaviviruses are positive-sense, single-stranded RNA viruses responsible for millions of human infections annually. The envelope (E) protein of flaviviruses comprises three structural domains, of which domain III (EIII) represents a discrete subunit. EIII typically encodes epitopes recognized by virus-specific, potently neutralizing antibodies and is believed to play a major role in receptor binding. In order to assess potential interactions between EIII and the remainder of the E protein, and to assess effects of EIII sequence substitutions on antigenicity, growth, and virulence of a representative flavivirus, chimeric viruses were generated using the West Nile virus (WNV) infectious clone, into which EIIIs from nine flaviviruses of varying genetic divergence from WNV were substituted. Of the constructs tested, chimeras containing EIIIs from Koutango virus (KOUV), Japanese encephalitis virus (JEV), St Louis encephalitis virus (SLEV), and Bagaza virus (BAGV) were successfully recovered. Characterization of the chimeras in vitro and in vivo revealed differences in growth and virulence between the viruses, with in vivo pathogenesis often not correlated with in vitro growth. Taken together, the data demonstrate that substitutions of EIII can allow for the generation of viable chimeric viruses with significantly altered antigenicity and virulence. IMPORTANCE: The envelope (E) glycoprotein is the major protein present on the surface of flavivirus virions, and is responsible for mediating virus binding and entry into target cells. Several viable West Nile virus (WNV) variants were recovered with chimeric E proteins in which the putative receptor-binding domain (EIII) sequences of other mosquito-borne flaviviruses were substituted in place of the WNV EIII, although substitutions of several more divergent EIII sequences were not tolerated. 
The differences in virulence and tissue tropism observed with the chimeric viruses indicate a significant role for this sequence in determining the pathogenesis of the virus within the mammalian host. Our studies demonstrate that these chimeras are viable, and suggest that such recombinant viruses may be useful to investigate domain-specific antibody responses and to more extensively define the contributions of EIII to tropism and pathogenesis of WNV or other flaviviruses. |
Subchronic exposures to fungal bioaerosols promotes allergic pulmonary inflammation in naive mice
Nayak AP , Green BJ , Lemons AR , Marshall NB , Goldsmith WT , Kashon ML , Anderson SE , Germolec DR , Beezhold DH . Clin Exp Allergy 2016 46 (6) 861-70 BACKGROUND: Epidemiological surveys indicate that occupants of mold-contaminated environments are at increased risk of respiratory symptoms. The immunological mechanisms associated with these responses require further characterization. OBJECTIVE: The aim of this study was to characterize the immunotoxicological outcomes following repeated inhalation of dry Aspergillus fumigatus spores aerosolized at concentrations potentially encountered in contaminated indoor environments. METHODS: A. fumigatus spores were delivered to the lungs of naive BALB/cJ mice housed in a multi-animal nose-only chamber twice a week for a period of 13 weeks. Mice were evaluated at 24 and 48 hours post-exposure for histopathological changes in lung architecture, recruitment of specific immune cells to the airways, and serum antibody responses. RESULTS: Germinating A. fumigatus spores were observed in the lungs, along with persistent fungal debris in the perivascular regions. Repeated exposures promoted pleocellular infiltration with concomitant epithelial mucus hypersecretion, goblet cell metaplasia, subepithelial fibrosis and enhanced airway hyperreactivity. Cellular infiltration in airways was predominated by CD4+ T cells expressing the pro-allergic cytokine IL-13. Furthermore, our studies show that antifungal T cell responses (IFN-gamma+ or IL-17A+) co-expressed IL-13, revealing a novel mechanism for the dysregulated immune response to inhaled fungi. Total IgE production was augmented in animals repeatedly exposed to A. fumigatus. CONCLUSIONS & CLINICAL RELEVANCE: Repeated inhalation of fungal aerosols resulted in significant pulmonary pathology mediated by dynamic shifts in specific immune populations and their cytokines. 
These studies provide novel insights into the immunological mechanisms and targets that govern the health outcomes that result from repeated inhalation of fungal bioaerosols in contaminated environments. |
Long-term stability of inorganic, methyl and ethyl mercury in whole blood: Effects of storage temperature and time
Sommer YL , Ward CD , Pan Y , Caldwell KL , Jones RL . J Anal Toxicol 2016 40 (3) 222-8 In this study, we evaluated the effect of temperature on the long-term stability of three mercury species in bovine blood. We used inductively coupled plasma mass spectrometry (ICP-MS) analysis to determine the concentrations of inorganic (iHg), methyl (MeHg) and ethyl (EtHg) mercury species in two blood pools stored at temperatures of -70, -20, 4, 23 degrees C (room temperature) and 37 degrees C. Over the course of a year, we analyzed aliquots of pooled specimens at time intervals of 1, 2, 4 and 6 weeks and 2, 4, 6, 8, 10 and 12 months. We applied a fixed-effects linear model, step-down pairwise comparison and coefficient of variation statistical analysis to examine the temperature and time effects on changes in mercury species concentrations. We observed several instances of statistically significant differences in mercury species concentrations between different temperatures and time points; however, with considerations of experimental factors (such as instrumental drift and sample preparation procedures), not all differences were scientifically important. We concluded that iHg, MeHg and EtHg species in bovine whole blood were stable at -70, -20, 4 and 23 degrees C for 1 year, but blood samples stored at 37 degrees C were stable for no more than 2 weeks. |
NADPH oxidase 1 is associated with altered host survival and T cell phenotypes after influenza A virus infection in mice
Hofstetter AR , De La Cruz JA , Cao W , Patel J , Belser JA , McCoy J , Liepkalns JS , Amoah S , Cheng G , Ranjan P , Diebold BA , Shieh WJ , Zaki S , Katz JM , Sambhara S , Lambeth JD , Gangappa S . PLoS One 2016 11 (2) e0149864 The role of the reactive oxygen species-producing NADPH oxidase family of enzymes in the pathology of influenza A virus infection remains enigmatic. Previous reports implicated NADPH oxidase 2 in influenza A virus-induced inflammation. In contrast, NADPH oxidase 1 (Nox1) was reported to decrease inflammation in mice within 7 days post-influenza A virus infection. However, the effect of NADPH oxidase 1 on lethality and adaptive immunity after influenza A virus challenge has not been explored. Here we report improved survival and decreased morbidity in mice with catalytically inactive NADPH oxidase 1 (Nox1*/Y) compared with controls after challenge with A/PR/8/34 influenza A virus. While changes in lung inflammation were not obvious between Nox1*/Y and control mice, we observed alterations in the T cell response to influenza A virus by day 15 post-infection, including increased interleukin-7 receptor-expressing virus-specific CD8+ T cells in lungs and draining lymph nodes of Nox1*/Y, and increased cytokine-producing T cells in lungs and spleen. Furthermore, a greater percentage of conventional and interstitial dendritic cells from Nox1*/Y draining lymph nodes expressed the co-stimulatory ligand CD40 within 6 days post-infection. Results indicate that NADPH oxidase 1 modulates the innate and adaptive cellular immune response to influenza virus infection, while also playing a role in host survival. Results suggest that NADPH oxidase 1 inhibitors may be beneficial as adjunct therapeutics during acute influenza infection. |
Nanotopographical modulation of cell function through nuclear deformation
Wang K , Bruce A , Mezan R , Kadiyala A , Wang L , Dawson J , Rojanasakul Y , Yang Y . ACS Appl Mater Interfaces 2016 8 (8) 5082-92 Although nanotopography has been shown to be a potent modulator of cell behavior, it is unclear how the nanotopographical cue, through focal adhesions, affects the nucleus, eventually influencing cell phenotype and function. Thus, current methods to apply nanotopography to regulate cell behavior are basically empirical. Here, we engineered nanotopographies of various shapes (gratings and pillars) and dimensions (feature size, spacing and height), and thoroughly investigated cell spreading, focal adhesion organization and nuclear deformation of human primary fibroblasts as the model cell grown on the nanotopographies. We examined the correlation between nuclear deformation and cell functions such as cell proliferation, transfection and extracellular matrix protein type I collagen production. It was found that the nanoscale gratings and pillars could facilitate focal adhesion elongation by providing anchoring sites, and the nanogratings could orient focal adhesions and nuclei along the nanograting direction, depending on not only the feature size but also the spacing of the nanogratings. Compared with continuous nanogratings, discrete nanopillars tended to disrupt the formation and growth of focal adhesions and thus had less profound effects on nuclear deformation. Notably, nuclear volume could be effectively modulated by the height of nanotopography. Further, we demonstrated that cell proliferation, transfection, and type I collagen production were strongly associated with the nuclear volume, indicating that the nucleus serves as a critical mechanosensor for cell regulation. Our study delineated the relationships between focal adhesions, nucleus and cell function and highlighted that the nanotopography could regulate cell phenotype and function by modulating nuclear deformation. 
This study provides insight into the rational design of nanotopography for new biomaterials and the cell-substrate interfaces of implants and medical devices. |
Nanotoxicology ten years later: Lights and shadows
Shvedova A , Pietroiusti A , Kagan V . Toxicol Appl Pharmacol 2016 299 1-2 The mounting societal concerns about possible, and maybe even likely, adverse effects of nanomaterials are reflected in a large and growing number of publications in the field of nanotoxicology. Indeed, today's search in PubMed reveals >3700 publications matching the query (toxic + nanomaterials) - quite a growth for a decade that began with only two dozen such publications up to 2005. |
Optimizing virus identification in critically ill children suspected of having an acute severe viral infection
Randolph AG , Agan AA , Flanagan RF , Meece JK , Fitzgerald JC , Loftis LL , Truemper EJ , Li S , Ferdinands JM . Pediatr Crit Care Med 2016 17 (4) 279-86 OBJECTIVES: Multiplex rapid viral tests and nasopharyngeal flocked swabs are increasingly used for viral testing in PICUs. This study aimed to evaluate how the sampling site and the type of diagnostic test influence test results in children with suspected severe viral infection. DESIGN: Prospective cohort study. SETTING: PICUs at 21 tertiary pediatric referral centers in the United States. PATIENTS: During the 2010-2011 and 2011-2012 influenza seasons, we enrolled children (6 mo to 17 yr old) who were suspected to have severe viral infection. INTERVENTIONS: We collected samples by using a standardized protocol for nasopharyngeal aspirate and nasopharyngeal flocked swabs in nonintubated patients and for endotracheal tube aspirate and nasopharyngeal flocked swabs in intubated patients. MEASUREMENTS AND MAIN RESULTS: Viral testing included a single reverse transcription-polymerase chain reaction influenza test and the GenMark Respiratory Viral Panel (20 viruses). We enrolled 90 endotracheally intubated and 133 nonintubated children. We identified influenza in 45 patients with reverse transcription-polymerase chain reaction testing; the multiplex panel was falsely negative for influenza in two patients (4.4%). Six patients (13.3%) had not been diagnosed with influenza in the PICU. Non-influenza viruses were identified in 172 of 223 children (77.1%). In nonintubated children, the same virus was identified by nasopharyngeal flocked swabs and nasopharyngeal aspirate in 133 of 183 paired samples (72.7%), with +nasopharyngeal aspirate/-nasopharyngeal flocked swabs in 32 of 183 paired samples (17.4%). 
In intubated children, the same virus was identified by nasopharyngeal flocked swabs and endotracheal tube aspirate in 67 of 94 paired samples (71.3%), with +nasopharyngeal flocked swabs/-endotracheal tube aspirate in 22 of 94 paired samples (23.4%). Most discrepancies were either adenovirus or rhinovirus in both groups. CONCLUSIONS: Standardized specimen collection and sensitive diagnostic testing with a reverse transcription-polymerase chain reaction increased the identification of influenza in critically ill children. For most pathogenic viruses identified, results from nasopharyngeal flocked swabs agreed with those from nasopharyngeal or endotracheal aspirates. |
The effects of apolipoprotein B depletion on HDL subspecies composition and function
Davidson WS , Heink A , Sexmith H , Melchior JT , Gordon SM , Kuklenyik Z , Woolett L , Barr JR , Jones JI , Toth CA , Shah AS . J Lipid Res 2016 57 (4) 674-86 High density lipoprotein (HDL) cholesterol efflux function may be a more robust biomarker of coronary artery disease risk than HDL cholesterol (HDL-C). To study HDL function, apoB-containing lipoproteins are precipitated from serum. Whether apoB precipitation affects HDL subspecies composition and function has not been thoroughly investigated. We studied the effects of four common apoB precipitation methods (polyethylene glycol (PEG), dextran sulfate/MgCl2, heparin sodium/MnCl2 and LipoSep immunoprecipitant (IP)) on HDL subspecies composition, apolipoproteins and function (cholesterol efflux and reduction of LDL oxidation). PEG dramatically shifted the size distribution of HDL and apolipoproteins (assessed by two independent methods), while leaving substantial amounts of reagent in the sample. PEG also changed the distribution of cholesterol efflux and LDL oxidation across size fractions, but not overall efflux across the HDL range. Dextran sulfate/MgCl2 and heparin sodium/MnCl2 did not change the size distribution of HDL subspecies but altered the quantity of a subset of apolipoproteins. LipoSep IP resulted in a shift in the HDL size distribution, but less so than PEG. Thus, each of the apoB precipitation methods affected HDL composition and/or size distribution. We conclude that careful evaluation is needed when selecting apoB depletion methods for existing and future bioassays of HDL function. |
Effects of intratracheally instilled laser printer-emitted engineered nanoparticles in a mouse model: A case study of toxicological implications from nanomaterials released during consumer use
Pirela SV , Lu X , Miousse I , Sisler JD , Qian Y , Guo N , Koturbash I , Castranova V , Thomas T , Godleski J , Demokritou P . NanoImpact 2016 1 1-8 Incorporation of engineered nanomaterials (ENMs) into toners used in laser printers has led to countless quality and performance improvements. However, the release of ENMs during printing (consumer use) has raised concerns about their potential adverse health effects. The aim of this study was to use “real world” printer-emitted particles (PEPs), rather than raw toner powder, and assess the pulmonary responses following exposure by intratracheal instillation. Nine-week-old male Balb/c mice were exposed to various doses of PEPs (0.5, 2.5 and 5 mg/kg body weight) by intratracheal instillation. These exposure doses are comparable to real world human inhalation exposures ranging from 13.7 to 141.9 h of printing. Toxicological parameters reflecting distinct mechanisms of action were evaluated, including lung membrane integrity, inflammation and regulation of DNA methylation patterns. Results from this in vivo toxicological analysis showed that while intratracheal instillation of PEPs caused no changes in the lung membrane integrity, there was a pulmonary immune response, indicated by an elevation in neutrophil and macrophage percentage over the vehicle control and low-dose PEPs groups. Additionally, exposure to PEPs upregulated expression of the Ccl5 (Rantes), Nos1 and Ucp2 genes in the murine lung tissue and modified components of the DNA methylation machinery (Dnmt3a) and expression of transposable element (TE) LINE-1 compared to the control group. These genes are involved in both the repair of oxidative damage and the initiation of immune responses to foreign pathogens.
The results are in agreement with findings from previous in vitro cellular studies and suggest that PEPs may cause immune responses in addition to modifications in gene expression in the murine lung at doses that can be comparable to real world exposure scenarios, thereby raising concerns of deleterious health effects. |
Evaluation of the effect of valence state on cerium oxide nanoparticle toxicity following intratracheal instillation in rats
Dunnick KM , Morris AM , Badding MA , Barger M , Stefaniak AB , Sabolsky EM , Leonard SS . Nanotoxicology 2016 10 (7) 1-34 Cerium (Ce) is becoming a popular metal for use in electrochemical applications. When in the form of cerium oxide (CeO2), Ce can exist in both a 3+ and 4+ valence state, acting as an ideal catalyst. Previous in vitro and in vivo evidence has demonstrated that CeO2 has either anti- or pro-oxidant properties, possibly due to the ability of the nanoparticles to transition between valence states. Therefore, we chose to chemically modify the nanoparticles to shift the valence state toward 3+. During the hydrothermal synthesis process, 10 mol% and 20 mol% gadolinium (Gd) were substituted into the lattice of the CeO2 nanoparticles, forming a perfect solid solution with various A-site valence states. These two Gd-doped CeO2 nanoparticles were compared to pure CeO2 nanoparticles. Preliminary characterization indicated that doping results in minimal size and zeta potential changes but alters valence state. Following characterization, male Sprague-Dawley rats were exposed to 0.5 or 1.0 mg/kg nanoparticles via a single intratracheal instillation. Animals were sacrificed and bronchoalveolar lavage fluid and various tissues were collected to determine the effect of valence state and oxygen vacancies on toxicity 1, 7, or 84 days post-exposure. Results indicate that damage, as measured by elevations in lactate dehydrogenase, occurred within 1 day post-exposure and was sustained 7 days post-exposure, but subsided to control levels 84 days post-exposure. Further, no inflammatory signaling or lipid peroxidation occurred following exposure with any of the nanoparticles. Our results indicate that valence state has a minimal effect on CeO2 nanoparticle toxicity in vivo. |
Prevalence of sugar-sweetened beverage intake among adults - 23 states and the District of Columbia, 2013
Park S , Xu F , Town M , Blanck HM . MMWR Morb Mortal Wkly Rep 2016 65 (7) 169-174 The 2015-2020 Dietary Guidelines for Americans recommend that the daily intake of calories from added sugars not exceed 10% of total calories.* Sugar-sweetened beverages (SSBs) are significant sources of added sugars in the diet of U.S. adults and account for approximately one third of added sugar consumption (1). Among adults, frequent (i.e., at least once a day) SSB intake is associated with adverse health consequences, including obesity, type 2 diabetes, and cardiovascular disease (2). According to the 2009-2010 National Health and Nutrition Examination Survey (NHANES), an in-person and phone follow-up survey, 50.6% of U.S. adults consumed at least one SSB on a given day (3). In addition, SSB intake varies by geographical regions (4,5): the prevalence of daily SSB intake was higher among U.S. adults living in the Northeast (68.4%) and South (66.7%) than among persons living in the Midwest (58.8%). In 2013, the Behavioral Risk Factor Surveillance System (BRFSS), a telephone survey, revised the SSB two-item optional module to retain the first question on regular soda and expand the second question to include more types of SSBs than just fruit drinks. Using 2013 BRFSS data, self-reported SSB (i.e., regular soda, fruit drinks, sweet tea, and sports or energy drinks) intake among adults (aged ≥18 years) was assessed in 23 states and the District of Columbia (DC). The overall age-adjusted prevalence of SSB intake ≥1 time per day was 30.1% and ranged from 18.0% in Vermont to 47.5% in Mississippi. Overall, at least once daily SSB intake was most prevalent among adults aged 18-24 years (43.3%), men (34.1%), non-Hispanic blacks (blacks) (39.9%), unemployed adults (34.4%), and persons with less than a high school education (42.4%). 
States can use the data for program evaluation and monitoring trends, and information on disparities in SSB consumption could be used to create targeted intervention efforts to reduce SSB consumption. |
Factors associated with self-reported menu-labeling usage among US adults
Lee-Kwan SH , Pan L , Maynard LM , McGuire LC , Park S . J Acad Nutr Diet 2016 116 (7) 1127-35 BACKGROUND: Menu labeling can help people select foods and beverages with fewer calories and is a potential population-based strategy to reduce obesity and diet-related chronic diseases in the United States. OBJECTIVE: The aim of this cross-sectional study was to examine the prevalence of menu-labeling use among adults and its association with sociodemographic, behavioral, and policy factors. METHODS: The 2012 Behavioral Risk Factor Surveillance System data from 17 states, which included 100,141 adults who noticed menu labeling at fast-food or chain restaurants ("When calorie information is available in the restaurant, how often does this information help you decide what to order?"), were used. Menu-labeling use was categorized as frequent (always/most of the time), moderate (half the time/sometimes), and never. Multinomial logistic regression was used to examine associations among sociodemographic, behavioral, and policy factors with menu-labeling use. RESULTS: Overall, of adults who noticed menu labeling, 25.6% reported frequent use of menu labeling, 31.6% reported moderate use, and 42.7% reported that they never use menu labeling. Compared with never users, frequent users were significantly more likely to be younger, female, nonwhite, more educated, high-income, adults who were overweight or obese, physically active, former- or never-smokers, less than daily (<1 time/day) consumers of sugar-sweetened beverages, and living in states where menu-labeling legislation was enacted or proposed. CONCLUSIONS: Menu labeling is one method that consumers can use to help reduce their calorie consumption from restaurants. These findings can be used to develop targeted interventions to increase menu-labeling use among subpopulations with lower use. |
Respirable crystalline silica exposures during asphalt pavement milling at eleven highway construction sites
Hammond DR , Shulman SA , Echt AS . J Occup Environ Hyg 2016 13 (7) 0 Asphalt pavement milling machines use a rotating cutter drum to remove the deteriorated road surface for recycling. The removal of the road surface has the potential to release respirable crystalline silica, to which workers can be exposed. This paper describes an evaluation of respirable crystalline silica exposures to the operator and ground worker from two different half-lane and larger asphalt pavement milling machines that had ventilation dust controls and water-sprays designed and installed by the manufacturers. Manufacturer A completed milling for eleven days at four highway construction sites in Wisconsin, and Manufacturer B completed milling for ten days at seven highway construction sites in Indiana. To evaluate the dust controls, full-shift personal breathing zone air samples were collected from an operator and ground worker during the course of normal employee work activities of asphalt pavement milling at eleven different sites. Forty-two personal breathing zone air samples were collected over 21 days (sampling on an operator and ground worker each day). All samples were below 50 μg/m3 for respirable crystalline silica, the National Institute for Occupational Safety and Health recommended exposure limit. The geometric mean personal breathing zone air sample was 6.2 μg/m3 for the operator and 6.1 μg/m3 for the ground worker for the Manufacturer A milling machine. The geometric mean personal breathing zone air sample was 4.2 μg/m3 for the operator and 9.0 μg/m3 for the ground worker for the Manufacturer B milling machine. In addition, upper 95% confidence limits for the mean exposure for each occupation were well below 50 μg/m3 for both studies. The silica content in the bulk asphalt material being milled ranged from 7% to 23% silica for roads milled by Manufacturer A and from 5% to 12% silica for roads milled by Manufacturer B.
The results indicate that engineering controls consisting of ventilation controls in combination with water-sprays are capable of controlling occupational exposures to respirable crystalline silica generated by asphalt pavement milling machines on highway construction sites. |
Separate and joint associations of shift work and sleep quality with lipids
Charles LE , Gu JK , Tinney-Zara CA , Fekedulegn D , Ma CC , Baughman P , Hartley TA , Andrew ME , Violanti JM , Burchfiel CM . Saf Health Work 2016 7 (2) 111-9 BACKGROUND: Shift work and/or sleep quality may affect health. We investigated whether shift work and sleep quality, separately and jointly, were associated with abnormal levels of triglycerides, total cholesterol (TC), and low- and high-density lipoprotein cholesterol in 360 police officers (27.5% women). METHODS: Triglycerides, TC, and high-density lipoprotein were analyzed on the Abbott Architect; low-density lipoprotein was calculated. Shift work was assessed using City of Buffalo payroll work history records. Sleep quality (good, ≤5; intermediate, 6–8; poor, ≥9) was assessed using the Pittsburgh Sleep Quality Index questionnaire. A combined shift work and sleep quality variable was created: day shift plus good sleep; day shift plus poor sleep; afternoon/night shift plus good sleep; and afternoon/night shift plus poor sleep. Mean values of lipid biomarkers were compared across categories of the exposures using analysis of variance/analysis of covariance. RESULTS: Shift work was not significantly associated with lipids. However, as sleep quality worsened, mean levels of triglycerides and TC gradually increased, but only among female officers (age- and race-adjusted p = 0.013 and 0.030, respectively). Age significantly modified the association between sleep quality and TC. Among officers ≥40 years old, those reporting poor sleep quality had a significantly higher mean level of TC (202.9 ± 3.7 mg/dL) compared with those reporting good sleep quality (190.6 ± 4.0 mg/dL) (gender- and race-adjusted p = 0.010). Female officers who worked the day shift and also reported good sleep quality had the lowest mean level of TC compared with women in the other three categories (p = 0.014). CONCLUSION: Sleep quality and its combined influence with shift work may play a role in the alteration of some lipid measures. |
Fibrosis biomarkers in workers exposed to MWCNTs
Fatkhutdinova LM , Khaliullin TO , Vasil'yeva OL , Zalyalov RR , Mustafin IG , Kisin ER , Birch ME , Yanamala N , Shvedova AA . Toxicol Appl Pharmacol 2016 299 125-31 Multi-walled carbon nanotubes (MWCNT) with their unique physico-chemical properties offer numerous technological advantages and are projected to drive the next generation of manufacturing growth. As MWCNT have already found utility in different industries including construction, engineering, energy production, space exploration and biomedicine, large quantities of MWCNT may reach the environment and inadvertently lead to human exposure. This necessitates the urgent assessment of their potential health effects in humans. The current study was carried out at NanotechCenter Ltd. Enterprise (Tambov, Russia), where large-scale manufacturing of MWCNT along with relatively high occupational exposure levels was reported. The goal of this small cross-sectional study was to evaluate potential biomarkers during occupational exposure to MWCNT. All air samples were collected at the workplaces from both specific areas and personal breathing zones using filter-based devices to quantitate elemental carbon and perform particle analysis by TEM. Biological fluids of nasal lavage, induced sputum and blood serum were obtained from MWCNT-exposed and non-exposed workers for assessment of inflammatory and fibrotic markers. It was found that exposure to MWCNTs caused significant increases in the inflammatory cytokines IL-1beta, IL-6, and TNF-alpha, as well as in KL-6, a serological biomarker for interstitial lung disease, in collected sputum samples. Moreover, the level of TGF-beta1 was increased in serum obtained from young exposed workers. Overall, the results from this study revealed accumulation of inflammatory and fibrotic biomarkers in biofluids of workers manufacturing MWCNTs.
Therefore, the biomarkers analyzed should be considered for the assessment of health effects of occupational exposure to MWCNT in cross-sectional epidemiological studies. |
A longitudinal study of the durability of long-lasting insecticidal nets in Zambia
Tan KR , Coleman J , Smith B , Hamainza B , Katebe-Sakala C , Kean C , Kowal A , Vanden Eng J , Parris TK , Mapp CT , Smith SC , Wirtz R , Kamuliwo M , Craig AS . Malar J 2016 15 (1) 106 BACKGROUND: A key goal of malaria control is to achieve universal access to, and use of, long-lasting insecticidal nets (LLINs) among people at risk for malaria. Quantifying the number of LLINs needed to achieve and maintain universal coverage requires knowing when nets need replacement. Longitudinal studies have observed physical deterioration in LLINs well before the assumed net lifespan of 3 years. The objective of this study was to describe attrition, physical integrity and insecticide persistence of LLINs over time to assist with better quantification of nets needing replacement. METHODS: 999 LLINs distributed in 2011 in two highly endemic provinces in Zambia were randomly selected, and were enrolled at 12 months old. LLINs were followed every 6 months up to 30 months of age. Holes were counted and measured (finger, fist, and head method) and a proportional hole index (pHI) was calculated. Households were surveyed about net care and repair and if applicable, reasons for attrition. Functional survival was defined as nets with a pHI <643 and present for follow-up. At 12 and 24 months of age, 74 LLINs were randomly selected for examination of insecticidal activity and content using bioassay and chemical analysis methods previously described by the World Health Organization (WHO). RESULTS: A total of 999 LLINs were enrolled; 505 deltamethrin-treated polyester nets and 494 permethrin-treated polyethylene nets. With 74 used to examine insecticide activity, 925 were available for full follow-up. At 30 months, 325 (33 %) LLINs remained. Net attrition was primarily due to disposal (29 %). Presence of repairs and use over a reed mat were significantly associated with larger pHIs. By 30 months, only 56 % of remaining nets met criteria for functional survival. 
A shorter functional survival was associated with having been washed. At 24 months, nets had reduced insecticidal activity (57 % met WHO minimal criteria) and content (5 % met WHO target insecticide content). CONCLUSIONS: The median functional survival time for LLINs observed in this study was 2.5-3 years, and insecticide activity and content were markedly decreased by 2 years. A better measure of net survival incorporating insecticidal field effectiveness, net physical integrity, and attrition is needed. |
Primaquine to reduce transmission of Plasmodium falciparum malaria in Mali: a single-blind, dose-ranging, adaptive randomised phase 2 trial
Dicko A , Brown JM , Diawara H , Baber I , Mahamar A , Soumare HM , Sanogo K , Koita F , Keita S , Traore SF , Chen I , Poirot E , Hwang J , McCulloch C , Lanke K , Pett H , Niemi M , Nosten F , Bousema T , Gosling R . Lancet Infect Dis 2016 16 (6) 674-684 BACKGROUND: Single low doses of primaquine, when added to artemisinin-based combination therapy, might prevent transmission of Plasmodium falciparum malaria to mosquitoes. We aimed to establish the activity and safety of four low doses of primaquine combined with dihydroartemisinin-piperaquine in male patients in Mali. METHODS: In this phase 2, single-blind, dose-ranging, adaptive randomised trial, we enrolled boys and men with uncomplicated P falciparum malaria at the Malaria Research and Training Centre (MRTC) field site in Ouelessebougou, Mali. All participants were confirmed positive carriers of gametocytes through microscopy and had normal function of glucose-6-phosphate dehydrogenase (G6PD) on colorimetric quantification. In the first phase, participants were randomly assigned (1:1:1) to one of three primaquine doses: 0 mg/kg (control), 0.125 mg/kg, and 0.5 mg/kg. Randomisation was done with a computer-generated randomisation list (in block sizes of six) and concealed with sealed, opaque envelopes. In the second phase, different participants were sequentially assigned (1:1) to 0.25 mg/kg primaquine or 0.0625 mg/kg primaquine. Primaquine tablets were dissolved into a solution and administered orally in a single dose. Participants were also given a 3 day course of dihydroartemisinin-piperaquine, administered by weight (40 mg dihydroartemisinin and 320 mg piperaquine per tablet). Outcome assessors were masked to treatment allocation, but participants were permitted to find out group assignment. Infectivity was assessed through membrane-feeding assays, which were optimised during the early part of phase one.
The primary efficacy endpoint was the mean within-person percentage change in mosquito infectivity 2 days after primaquine treatment in participants who completed the study after optimisation of the infectivity assay, had both a pre-treatment infectivity measurement and at least one follow-up infectivity measurement, and who were given the correct primaquine dose. The safety endpoint was the mean within-person change in haemoglobin concentration during 28 days of study follow-up in participants with at least one follow-up visit. This study is registered with ClinicalTrials.gov, number NCT01743820. FINDINGS: Between Jan 2, 2013, and Nov 27, 2014, we enrolled 81 participants. In the primary analysis sample (n=71), participants in the 0.25 mg/kg primaquine dose group (n=15) and 0.5 mg/kg primaquine dose group (n=14) had significantly greater mean within-person reductions in infectivity at day 2 (92.6% [95% CI 78.3-100]; p=0.0014 for the 0.25 mg/kg group, and 75.0% [45.7-100]; p=0.014 for the 0.5 mg/kg group) compared with those in the control group (n=14; 11.3% [-27.4 to 50.0]). Reductions were not significantly different from control for participants assigned to the 0.0625 mg/kg dose group (n=16; 41.9% [1.4-82.5]; p=0.16) and the 0.125 mg/kg dose group (n=12; 54.9% [13.4-96.3]; p=0.096). No clinically meaningful or statistically significant drops in haemoglobin were recorded in any individual in the haemoglobin analysis (n=70) during follow-up. No serious adverse events were reported and adverse events did not differ between treatment groups. INTERPRETATION: A single dose of 0.25 mg/kg primaquine, given alongside dihydroartemisinin-piperaquine, was safe and efficacious for the prevention of P falciparum malaria transmission in boys and men who are not deficient in G6PD.
Future studies should assess the safety of single-dose primaquine in G6PD-deficient individuals to define the therapeutic range of primaquine to enable the safe roll-out of community interventions with primaquine. FUNDING: Bill & Melinda Gates Foundation. |
The impact of ART on live birth outcomes: Differing experiences across three states
Luke S , Sappenfield WM , Kirby RS , McKane P , Bernson D , Zhang Y , Chuong F , Cohen B , Boulet SL , Kissin DM . Paediatr Perinat Epidemiol 2016 30 (3) 209-16 BACKGROUND: Research has shown an association between assisted reproductive technology (ART) and adverse birth outcomes. We examined whether birth outcomes of ART-conceived pregnancies vary across states with different maternal characteristics, insurance coverage for ART services, and type of ART services provided. METHODS: CDC's National ART Surveillance System data were linked to Massachusetts, Florida, and Michigan vital records from 2000 through 2006. Maternal characteristics in ART- and non-ART-conceived live births were compared between states using chi-square tests. We performed multivariable logistic regression analyses and calculated adjusted odds ratios (aOR) to assess associations between ART use and singleton preterm delivery (<32 weeks, <37 weeks), singleton small for gestational age (SGA) (<5th and <10th percentiles) and multiple birth. RESULTS: ART use in Massachusetts was associated with significantly lower odds of twins as well as triplets and higher order births compared to Florida and Michigan (aOR 22.6 vs. 30.0 and 26.3, and aOR 37.6 vs. 92.8 and 99.2, respectively; p for interaction < 0.001). ART use was associated with increased odds of SGA in Michigan only, and with preterm delivery (<32 and <37 weeks) in all states (aOR range: 1.60-1.87). CONCLUSIONS: ART use was associated with an increased risk of preterm delivery among singletons that showed little variability between states. The number of twins, triplets and higher order gestations per cycle was lower in Massachusetts, which may be due to the availability of insurance coverage for ART in Massachusetts. |
Barriers and facilitators to health center implementation of evidence-based clinical practices in adolescent reproductive health services
Hallum-Montes R , Middleton D , Schlanger K , Romero L . J Adolesc Health 2016 58 (3) 276-83 PURPOSE: Despite the substantial evidence supporting the guidelines for the provision of reproductive health services for adolescents, research points to a persistent gap in their translation into health care practice. This study examines barriers and facilitators that health centers experience when implementing evidence-based clinical practices for adolescent reproductive health services and discusses strategies to address identified barriers. METHODS: Semistructured interviews were conducted with 85 leaders and staff of 30 health centers in Alabama, Georgia, Massachusetts, North Carolina, South Carolina, Pennsylvania, and Texas. Interview data were analyzed for emergent themes following a grounded theory approach. RESULTS: Data analysis revealed that certain factors at health system and community levels influenced health centers' efforts to implement evidence-based clinical practices for adolescent reproductive health care. In particular, support from health center leadership, communication between leadership and staff, and staff attitudes and beliefs were reported as factors that facilitated the implementation of new practices. CONCLUSIONS: Health center efforts to implement new practice guidelines should include efforts to build the capacity of health center leadership to mobilize staff and resources to ensure that new practices are implemented consistently and with quality. |
Exposure and response to current text-only smokeless tobacco health warnings among smokeless tobacco users aged ≥18 years, United States, 2012-2013
Agaku IT , Singh T , Rolle IV , Ayo-Yusuf OA . Prev Med 2016 87 200-206 INTRODUCTION: We assessed US adult smokeless tobacco (SLT) users' exposure and response to SLT health warnings, which are currently in text-only format, covering 30% of the two primary surfaces of SLT containers and 20% of advertisements. METHODS: Data were from the 2012-2013 National Adult Tobacco Survey. Past 30-day exposure to SLT health warnings among past 30-day SLT users (n=1626) was a self-report of seeing warnings on SLT packages: "Very often," "Often," or "Sometimes" (versus "Rarely" or "Never"). We measured the association between SLT health warning exposure and perceptions of SLT harmfulness and addictiveness using logistic regression. RESULTS: Of past 30-day SLT users, 77.5% reported exposure to SLT health warnings, with lower prevalence reported among females and users of novel SLT products (snus/dissolvable tobacco). Furthermore, exposure decreased linearly with decreasing education and annual household income (p<0.01). Among exposed past 30-day SLT users, 73.9% reported thinking about the health risks of SLT, while 17.1% reported stopping SLT use on ≥1 occasion within the past 30 days. Exposure to SLT warnings was associated with perceived SLT harmfulness (AOR=2.16; 95% CI=1.15-4.04), but not with perceived SLT addictiveness. CONCLUSION: Socioeconomic disparities found in exposure and response to SLT health warnings can be addressed through implementation of large pictorial warnings. |
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Disease Reservoirs and Vectors
- Environmental Health
- Food Safety
- Genetics and Genomics
- Health Economics
- Healthcare Associated Infections
- Immunity and Immunization
- Informatics
- Injury and Violence
- Laboratory Sciences
- Nutritional Sciences
- Occupational Safety and Health
- Parasitic Diseases
- Reproductive Health
- Substance Use and Abuse
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed:Feb 1, 2024
- Page last updated:Apr 29, 2024