Prevalence of inflammatory bowel disease among adults aged ≥18 years - United States, 2015
Dahlhamer JM , Zammitti EP , Ward BW , Wheaton AG , Croft JB . MMWR Morb Mortal Wkly Rep 2016 65 (42) 1166-1169 Crohn's disease and ulcerative colitis, collectively known as inflammatory bowel disease (IBD), are characterized by chronic inflammation of the gastrointestinal tract (1). IBD has been associated with poor quality of life and extensive morbidity and often results in complications requiring hospitalizations and surgical procedures (2-4). Most previous studies of IBD have used administrative claims data or data collected from limited geographic areas to demonstrate increases in estimated prevalence of IBD within the United States (5,6). Few national prevalence estimates of IBD among adults based on large, nationally representative data sources exist, and those that do tend to be based on older data. For example, the most recent national study used 1999 National Health Interview Survey (NHIS) data and estimated that 1.8 million (0.9%) U.S. adults had IBD (7). To examine the prevalence of IBD among the civilian, noninstitutionalized U.S. adult population, data from the 2015 NHIS were analyzed. Overall, an estimated 3.1 million U.S. adults (1.3%) have received a diagnosis of IBD. Within population subgroups, a higher prevalence of IBD was identified among adults aged ≥45 years, Hispanics, non-Hispanic whites, and adults with less than a high school level of education, not currently employed, born in the United States, living in poverty, or living in suburban areas. The use of a nationally representative data source such as the NHIS to estimate the prevalence of IBD overall and by population subgroups is important to understand the burden of IBD on the U.S. health care system. |
Prevalence of mixed connective tissue disease in a population-based registry of American Indian/Alaska Native people in 2007
Ferucci ED , Johnston JM , Gordon C , Helmick CG , Lim SS . Arthritis Care Res (Hoboken) 2016 69 (8) 1271-1275 OBJECTIVE: The objective of this surveillance project was to determine the prevalence of mixed connective tissue disease (MCTD) in 2007 in the Indian Health Service (IHS) active clinical population from 3 regions of the United States. METHODS: The IHS Lupus Registry was designed to identify possible MCTD cases in addition to lupus. The population denominator for this report includes American Indian or Alaska Native adults within the IHS active clinical population in 2007, residing in select communities in 3 regions of the US. Potential MCTD cases were identified using a broad range of diagnostic codes and were confirmed by detailed medical record abstraction. Classification as MCTD for this analysis required both rheumatologist diagnosis of MCTD without diagnosis of other connective tissue disease and documentation of the Alarcon-Segovia criteria in the medical record. Prevalence was also calculated using two alternate definitions of MCTD. RESULTS: The age-adjusted prevalence of MCTD using our primary definition was 6.4 per 100,000 (95% confidence interval (CI) 2.8-12.8). The prevalence was higher in women than men using all three definitions of MCTD, and no men met the primary definition of MCTD. CONCLUSION: The first population-based estimates of the prevalence of MCTD in the US American Indian/Alaska Native population show that the prevalence appears to be higher than in other populations. Additional population-based estimates are needed to better understand the epidemiology of MCTD. |
Hyperlipidemia and medical expenditures by cardiovascular disease status in US adults
Zhang D , Wang G , Fang J , Mercado C . Med Care 2016 55 (1) 4-11 BACKGROUND: Hyperlipidemia is a major risk factor for cardiovascular disease (CVD), affecting 73.5 million American adults. Information about health care expenditures associated with hyperlipidemia by CVD status is needed to evaluate the economic benefit of primary and secondary prevention programs for CVD. METHODS: The study sample includes 48,050 men and nonpregnant women aged ≥18 years from the 2010-2012 Medical Expenditure Panel Survey. A 2-part econometric model was used to estimate annual hyperlipidemia-associated medical expenditures by CVD status. The estimation results from the 2-part model were used to calculate per-capita and national medical expenditures associated with hyperlipidemia. We adjusted the medical expenditures into 2012 dollars. RESULTS: Among those with CVD, per person hyperlipidemia-associated expenditures were $1105 [95% confidence interval (CI), $877-$1661] per year, leading to an annual national expenditure of $15.47 billion (95% CI, $5.23-$27.75 billion). Among people without CVD, per person hyperlipidemia-associated expenditures were $856 (95% CI, $596-$1211) per year, resulting in an annual national expenditure of $23.11 billion (95% CI, $16.09-$32.71 billion). Hyperlipidemia-associated expenditures were attributable mostly to the costs of prescription medication (59%-90%). Among people without CVD, medication expenditures associated with hyperlipidemia were $13.72 billion (95% CI, $10.55-$15.74 billion), higher in men than in women. CONCLUSIONS: Hyperlipidemia significantly increased medical expenditures, and the increase was higher in people with CVD than in those without. The information on estimated expenditures could be used to evaluate and develop effective programs for CVD prevention. |
Admissions after discharge from an emergency department for chest symptoms
Moore BJ , Coffey RM , Heslin KC , Moy E . Diagnosis (Berl) 2016 3 (3) 103-113 Often patients who present to the emergency department (ED) with chest symptoms return to the hospital within 30 days with the same or closely related symptoms and are admitted, raising questions about quality of care, timeliness of diagnosis, and patient safety. This study examined the frequency of and patient characteristics associated with subsequent inpatient admissions for related symptoms after discharge from an ED for chest symptoms. We used data from the 2012 and 2013 Healthcare Cost and Utilization Project (HCUP) State Inpatient Databases (SID) and State Emergency Department Databases (SEDD) from eight states to identify over 1.8 million ED discharges for chest symptoms. Approximately 3% of ED discharges experienced potentially related subsequent admissions within 30 days - 0.2% for acute myocardial infarction (AMI), 1.7% for other cardiovascular conditions, 0.5% for respiratory conditions, and 0.6% for mental disorders. Logistic regression results showed higher odds of subsequent admission for older patients and those residing in low-income areas, and lower odds for females and non-White racial/ethnic groups. Privately insured patients had lower odds of subsequent admission than did those who were uninsured or covered by other programs. Because we included multiple diagnostic categories of subsequent admissions, our results show a more complete picture of patients presenting to the ED with chest symptoms compared with previous studies. In particular, we show a lower rate of subsequent admission for AMI versus other diagnoses. ED physicians and administrators can use the results to identify characteristics associated with increased odds of subsequent admission to target at-risk populations. |
1970s and 'Patient 0' HIV-1 genomes illuminate early HIV/AIDS history in North America.
Worobey M , Watts TD , McKay RA , Suchard MA , Granade T , Teuwen DE , Koblin BA , Heneine W , Lemey P , Jaffe HW . Nature 2016 539 (7627) 98-101 The emergence of HIV-1 group M subtype B in North American men who have sex with men was a key turning point in the HIV/AIDS pandemic. Phylogenetic studies have suggested cryptic subtype B circulation in the United States (US) throughout the 1970s and an even older presence in the Caribbean. However, these temporal and geographical inferences, based upon partial HIV-1 genomes that postdate the recognition of AIDS in 1981, remain contentious, and the earliest movements of the virus within the US are unknown. We serologically screened >2,000 1970s serum samples and developed a highly sensitive approach for recovering viral RNA from degraded archival samples. Here, we report eight coding-complete genomes from US serum samples from 1978-1979, which are eight of the nine oldest HIV-1 group M genomes to date. This early, full-genome 'snapshot' reveals that the US HIV-1 epidemic exhibited extensive genetic diversity in the 1970s but also provides strong evidence for its emergence from a pre-existing Caribbean epidemic. Bayesian phylogenetic analyses estimate the jump to the US at around 1970 and place the ancestral US virus in New York City with 0.99 posterior probability support, strongly suggesting this was the crucial hub of early US HIV/AIDS diversification. Logistic growth coalescent models reveal epidemic doubling times of 0.86 and 1.12 years for the US and Caribbean, respectively, suggesting rapid early expansion in each location. Comparisons with more recent data reveal many of these insights to be unattainable without archival, full-genome sequences. We also recovered the HIV-1 genome from the individual known as 'Patient 0' (ref. 5) and found neither biological nor historical evidence that he was the primary case in the US or for subtype B as a whole. 
We discuss the genesis and persistence of this belief in the light of these evolutionary insights. |
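The doubling times from the logistic growth coalescent models translate directly into early epidemic growth rates: under exponential growth, the growth rate r and doubling time t_d satisfy r = ln(2)/t_d. A minimal sketch of this arithmetic, using the doubling times reported in the abstract (the derived rates are illustrative, not values from the study):

```python
import math

def growth_rate_from_doubling_time(t_d: float) -> float:
    """Exponential growth rate r (per year) implied by a doubling time t_d (years)."""
    return math.log(2) / t_d

# Doubling times reported in the abstract (years)
r_us = growth_rate_from_doubling_time(0.86)         # US epidemic
r_caribbean = growth_rate_from_doubling_time(1.12)  # Caribbean epidemic

print(f"US growth rate: {r_us:.2f}/yr; Caribbean growth rate: {r_caribbean:.2f}/yr")
```

The shorter US doubling time corresponds to the faster implied growth rate, consistent with rapid expansion after the virus reached New York City.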
Infection With Hepatitis C Virus Genotype 3 is an Independent Risk Factor for End-stage Liver Disease, Hepatocellular Carcinoma, and Liver-related Death.
McMahon BJ , Bruden D , Townshend-Bulson L , Simons B , Spradling P , Livingston S , Gove J , Hewitt A , Plotnik J , Homan C , Espera H , Negus S , Snowball M , Barbour Y , Bruce M , Gounder P . Clin Gastroenterol Hepatol 2016 15 (3) 431-437 e2 BACKGROUND & AIMS: Few studies have examined factors associated with disease progression in hepatitis C virus (HCV) infection. We examined the association of 11 risk factors with adverse outcomes in a population-based prospective cohort observational study of Alaska Native/American Indian persons with chronic HCV infection. METHODS: We collected data from a population-based cohort study of liver-related adverse outcomes of infection in American Indian/Alaska Native persons with chronic HCV living in Alaska, recruited from 1995 through 2012. We calculated adjusted hazard ratios (aHR) and 95% CIs for end-stage liver disease (ESLD; presence of ascites, esophageal varices, hepatic encephalopathy, or coagulopathy), hepatocellular carcinoma (HCC), and liver-related death using a Cox proportional hazards model. RESULTS: We enrolled 1080 participants followed for 11,171 person-years (mean, 10.3 years); 66%, 19%, and 14% were infected with HCV genotypes 1, 2, and 3, respectively. On multivariate analysis, persons infected with HCV genotype 3 had a significantly increased risk of developing all 3 adverse outcomes. Compared to genotype 1, their aHR for ESLD was 2.1 (95% CI, 1.5-3.0), aHR for HCC was 3.1 (95% CI, 1.4-6.6), and aHR for liver-related death was 2.4 (95% CI, 1.5-4.0). Heavy alcohol use was an age-adjusted risk factor for ESLD (aHR, 2.2; 95% CI, 1.6-3.2) and liver-related death (aHR, 2.9; 95% CI, 1.8-4.6). Obesity was a risk factor for ESLD (aHR, 1.4; 95% CI, 1.0-1.9), and diabetes was a risk factor for ESLD (aHR, 1.5; 95% CI, 1.1-2.2). Male sex was a risk factor for HCC (aHR, 3.6; 95% CI, 1.6-8.2). 
CONCLUSIONS: In a population-based cohort study of American Indian/Alaska Native persons with chronic HCV infection, we found those infected with HCV genotype 3 to be at high risk for ESLD, HCC, and liver-related death. |
Typhoid fever in South Africa in an endemic HIV setting
Keddy KH , Sooka A , Smith AM , Musekiwa A , Tau NP , Klugman KP , Angulo FJ . PLoS One 2016 11 (10) e0164939 BACKGROUND: Typhoid fever remains an important disease in Africa, associated with outbreaks and the emerging multidrug resistant Salmonella enterica serotype Typhi (Salmonella Typhi) haplotype, H58. This study describes the incidence of, and factors associated with mortality due to, typhoid fever in South Africa, where HIV prevalence is high. METHODS AND FINDINGS: Nationwide active laboratory-based surveillance for culture-confirmed typhoid fever was undertaken from 2003-2013. At selected institutions, additional clinical data from patients were collected, including age, sex, HIV status, disease severity, and outcome. HIV prevalence among typhoid fever patients was compared to national HIV seroprevalence estimates. The national reference laboratory tested Salmonella Typhi isolates for antimicrobial susceptibility and haplotype. Unadjusted and adjusted logistic regression analyses were conducted to determine factors associated with typhoid fever mortality. We identified 855 typhoid fever cases: annual incidence ranged from 0.11 to 0.39 per 100,000 population. Additional clinical data were available for 369 (46.8%) cases presenting to the selected sites. Among typhoid fever patients with known HIV status, 19.3% (29/150) were HIV-infected. In adult females, HIV prevalence in typhoid fever patients was 43.2% (19/44) versus 15.7% national HIV seroprevalence (P < .001); in adult males, 16.3% (7/43) versus 12.3% national HIV seroprevalence (P = .2). H58 represented 11.9% (22/185) of Salmonella Typhi isolates tested. Increased mortality was associated with HIV infection (AOR 10.7; 95% CI 2.3-50.3) and disease severity (AOR 9.8; 95% CI 1.6-60.0) on multivariate analysis. CONCLUSIONS: Typhoid fever incidence in South Africa was largely unchanged from 2003-2013. Typhoid fever mortality was associated with disease severity. HIV infection may be a contributing factor. 
Interventions should improve health care access, including access to HIV management programmes, as well as patient education. Further studies are necessary to clarify relationships between HIV infection and typhoid fever in adults. |
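The adult-female HIV-prevalence comparison above (19/44 observed vs. 15.7% expected under national seroprevalence) can be approximated with an exact one-sided binomial test. This is a simplified, stdlib-only stand-in for whatever test the authors used, shown only to make the comparison concrete:

```python
from math import comb

def binom_sf(k: int, n: int, p: float) -> float:
    """Exact upper-tail probability P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Adult females: 19 of 44 typhoid patients HIV-infected vs. 15.7% national seroprevalence
p_female = binom_sf(19, 44, 0.157)
# Adult males: 7 of 43 vs. 12.3% national seroprevalence
p_male = binom_sf(7, 43, 0.123)

print(f"females: P = {p_female:.1e}; males: P = {p_male:.2f}")
```

As in the abstract, the excess HIV prevalence is highly significant in females but not in males.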
Uptake of hepatitis C screening, characteristics of patients tested, and intervention costs in the BEST-C Study
Brady JE , Liffmann DK , Yartel A , Kil N , Federman AD , Kannry J , Jordan C , Massoud OI , Nerenz DR , Brown KA , Smith BD , Vellozzi C , Rein DB . Hepatology 2016 65 (1) 44-53 BACKGROUND: From December 2012-March 2014, three randomized trials, each implementing a unique intervention in primary care settings (repeated mailing [repeated-mailing], an electronic health record best practice alert [BPA], and patient solicitation [patient-solicitation]), evaluated HCV antibody testing, diagnosis, and costs for each of the interventions compared to standard-of-care testing. Multilevel multivariable models were used to estimate the adjusted risk ratio (aRR) for receiving an HCV antibody test, and costs were estimated using activity-based costing. RATIONALE: To estimate the effects of interventions conducted as part of the Birth-cohort Evaluation to Advance Screening and Testing for Hepatitis C study on hepatitis C virus (HCV) testing and costs among persons of the 1945-1965 birth-cohort (BC). MAIN RESULTS: The interventions resulted in substantially higher HCV testing rates compared to standard-of-care (26.9% vs. 1.4% for repeated-mailing, 30.9% vs. 3.6% for BPA, and 63.5% vs. 2.0% for patient-solicitation), and significantly higher aRR for testing after controlling for sex, birth year, race, insurance type, and median household income (19.2 [95% Confidence Interval (CI) 9.7-38.2] for repeated-mailing, 13.2 [95% CI 3.6-48.6] for BPA, and 32.9 [95% CI 19.3-56.1] for patient-solicitation). The BPA intervention had the lowest incremental cost per completed test ($24 with fixed startup costs, $3 without) and also the lowest incremental cost per new case identified after omitting fixed startup costs ($1,691). CONCLUSION: HCV testing interventions resulted in an increase in BC testing compared to standard-of-care but also increased costs. 
The effect size and incremental costs of BPA intervention (excluding startup costs) support more widespread adoption compared to the other interventions. |
Notes from the field: Evaluation of the sensitivity and specificity of a commercially available rapid syphilis test - Escambia County, Florida, 2016
Matthias J , Dwiggins P , Totten Y , Blackmore C , Wilson C , Peterman TA . MMWR Morb Mortal Wkly Rep 2016 65 (42) 1174-1175 In December 2014, the Food and Drug Administration granted the first-ever Clinical Laboratory Improvement Amendments waiver for a rapid treponemal syphilis screening test, Syphilis Health Check (SHC) (1). SHC is a new tool for public health programs to combat increasing syphilis rates, specifically among persons without a prior syphilis infection. SHC can be performed by nonlaboratorian health care personnel, and results are available in 10 minutes. In 2015, a total of 7,094 noncongenital cases of syphilis (35.8 cases per 100,000) were reported to the Florida Department of Health (2). The Florida Department of Health evaluated the performance of SHC in comparison with treponemal and nontreponemal tests routinely used in its sexually transmitted disease (STD) clinic in Escambia County. | For this evaluation, patients seeking STD testing at the Florida Department of Health STD clinic in Escambia County during March 11–April 21, 2016, were tested for syphilis using the SHC on blood specimens obtained by fingerstick; a venous blood specimen was drawn concurrently and submitted for treponemal (Trep-Sure) and nontreponemal (Arlington Scientific, Inc. [ASI] rapid plasma reagin [RPR] card test for syphilis) testing at the state public health laboratory. The state public health laboratory in Florida uses the CDC-recommended algorithm for syphilis testing (i.e., nontreponemal testing followed by treponemal testing for persons with a reactive nontreponemal test); however, for the purpose of this study, all collected specimens underwent treponemal testing regardless of the nontreponemal test result. The SHC result was compared with results of routine syphilis testing using the traditional testing algorithm at the state laboratory. 
Sensitivity, specificity, and overall laboratory test agreement were determined using the Trep-Sure qualitative enzyme immunoassay (EIA) reference treponemal test as the standard for “true” positive or negative treponemal test results. |
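Sensitivity, specificity, and overall agreement against a reference standard reduce to simple ratios over a 2x2 table of index-test vs. reference results. A sketch with hypothetical cell counts (the abstract does not report the actual counts from the Escambia County evaluation):

```python
def evaluate_test(tp: int, fp: int, fn: int, tn: int):
    """Performance of an index test against a reference standard, from 2x2 counts."""
    sensitivity = tp / (tp + fn)                  # reactive when reference is positive
    specificity = tn / (tn + fp)                  # nonreactive when reference is negative
    agreement = (tp + tn) / (tp + fp + fn + tn)   # overall concordance with reference
    return sensitivity, specificity, agreement

# Hypothetical counts, for illustration only
sens, spec, agree = evaluate_test(tp=40, fp=10, fn=10, tn=90)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}, agreement={agree:.2f}")
```

In the evaluation described above, the Trep-Sure EIA result would define the reference-positive and reference-negative columns of this table.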
Partner disclosure and early CD4 response among HIV-infected adults initiating antiretroviral treatment in Nairobi Kenya
Trinh TT , Yatich N , Ngomoa R , McGrath CJ , Richardson BA , Sakr SR , Langat A , John-Stewart GC , Chung MH . PLoS One 2016 11 (10) e0163594 BACKGROUND: Disclosure of HIV serostatus can have significant benefits for people living with HIV/AIDS. However, there are limited data on whether partner disclosure influences ART treatment response. METHODS: We conducted a retrospective cohort study of newly diagnosed, ART-naive HIV-infected adults (>18 years) who enrolled at the Coptic Hope Center in Nairobi, Kenya between January 1st 2009 and July 1st 2011 and initiated ART within 3 months. Analysis was restricted to adults who reported having either disclosed or not disclosed their HIV status to their partner. Analysis of CD4 response at 6 and 12 months post-ART was stratified by age group. RESULTS: Among 615 adults newly initiating ART with partner disclosure data and 12 month follow-up, mean age was 38 years and 52% were male; 76% reported that they had disclosed their HIV status to their partner. Those who disclosed were significantly younger and more likely to be married/cohabitating than non-disclosers. At baseline, median CD4 counts were similar between disclosure groups. Among younger adults (<38 years) those who disclosed had higher CD4 recovery than those who did not at 6 months post-ART (mean difference = 31, 95% CI 3 to 58, p = 0.03) but not at 12 months (mean difference = 17, 95% CI -19 to 52, p = 0.4). Among older adults (≥38 years) there was no observed difference in CD4 recovery at 6 or 12 months between disclosure groups. CONCLUSION: Among younger adults, disclosure of HIV status to partners may be associated with CD4 recovery following ART. |
Early antiretroviral therapy initiation: Access and equity of viral load testing for HIV treatment monitoring
Peter T , Ellenberger D , Kim AA , Boeras D , Messele T , Roberts T , Stevens W , Jani I , Abimiku A , Ford N , Katz Z , Nkengasong JN . Lancet Infect Dis 2016 17 (1) e26-e29 Scaling up access to HIV viral load testing for individuals undergoing antiretroviral therapy in low-resource settings is a global health priority, as emphasised by research showing the benefits of suppressed viral load for the individual and the whole population. Historically, large-scale diagnostic test implementation has been slow and incomplete because of service delivery and other challenges. Building on lessons from the past, in this Personal View we propose a new framework to accelerate viral load scale-up and ensure equitable access to this essential test. The framework includes the following steps: (1) ensuring adequate financial investment in scaling up this test; (2) achieving pricing agreements and consolidating procurement to lower prices of the test; (3) strengthening functional tiered laboratory networks and systems to expand access to reliable, high-quality testing across countries; (4) strengthening national leadership, with prioritisation of laboratory services; and (5) demand creation and uptake of test results by clinicians, nurses, and patients, which will be vital in ensuring viral load tests are appropriately used to improve the quality of care. The use of dried blood spots to stabilise and ship samples from clinics to laboratories, and the use of point-of-care diagnostic tests, will also be important for ensuring access, especially in settings with reduced laboratory capacity. For countries that have just started to scale up viral load testing, lessons can be learnt from countries such as Botswana, Brazil, South Africa, and Thailand, which have already established viral load programmes. This framework might be useful for guiding the implementation of viral load with the aim of achieving the new global HIV 90-90-90 goals by 2020. |
Effect of pregnancy on response to antiretroviral therapy in HIV-infected African women
Kourtis AP , Wiener J , King CC , Heffron R , Mugo NR , Nanda K , Pyra M , Donnell D , Celum C , Lingappa JR , Baeten JM . J Acquir Immune Defic Syndr 2016 74 (1) 38-43 BACKGROUND: While most recent evidence does not support a role for pregnancy in accelerating HIV disease progression, very little information is available on the effects of incident pregnancy on response to antiretroviral therapy (ART). Hormonal, immune and behavioral changes during pregnancy may influence response to ART. We sought to explore the effects of incident pregnancy (after ART initiation) on virologic, immunologic, and clinical response to ART. METHODS: Data were collected from HIV-infected women participating in 3 prospective studies (Partners in Prevention HSV/HIV Transmission Study, Couples Observational Study, and Partners PrEP Study) from seven countries in Africa from 2004 to 2012. Women were included in this analysis if they were ≤45 years of age, were started on ART during the study, and were not pregnant at ART initiation. Pregnancy was treated as a time-dependent exposure variable covering the duration of pregnancy, including all pregnancies occurring after ART initiation. Virologic failure was defined as a viral load (VL) greater than 400 copies/ml ≥6 months after ART initiation, and viral suppression was defined as viral load ≤400 copies/ml. Multivariable Cox proportional hazards models were used to assess the association between pregnancy and time to viral suppression, virologic failure, WHO clinical stage III/IV and death. Linear mixed effects models were used to assess the association between pregnancy and CD4+ count and VL. All analyses were adjusted for confounders, including pre-ART CD4+ count and plasma VL. RESULTS: A total of 1041 women were followed, contributing 1196.1 person-years of follow-up. Median CD4+ count prior to ART initiation was 276 cells/mm3 (IQR, 209-375); median pre-ART VL was 17,511 copies/ml (IQR, 2,480-69,286). 
One hundred ten women became pregnant after ART initiation. Pregnancy was not associated with time to viral suppression (adjusted HR, 1.20, 95% CI, 0.82-1.77), time to virologic failure (adjusted HR, 0.67, 95% CI, 0.37-1.22), time to WHO clinical stage III or IV (adjusted HR, 0.79, 95% CI, 0.19-3.30) or time to death (adjusted HR, 2.04, 95% CI, 0.25-16.8). Incident pregnancy was associated with an adjusted mean decrease in CD4+ T cell count of 47.3 cells/mm3 (p<0.001), but not with a difference in VL (p=0.06). CONCLUSIONS: For HIV-infected women on ART, incident pregnancy does not affect virologic control or clinical HIV disease progression. A modest decrease in CD4+ T cell count could be due to physiologic effects of pregnancy. |
Global elimination of hepatitis C virus
Ward JW . Gastroenterol Hepatol (N Y) 2016 12 (10) 632-635 According to the World Health Organization, the goal is to eliminate hepatitis C virus (HCV) as a public health threat. Specifically, the goal is dramatic and large-scale reductions in new transmissions of HCV, as well as in the number of people becoming ill and dying from HCV, to a level where HCV no longer represents a major health concern. In numeric terms, the World Health Organization has proposed reductions of 60% in HCV-related mortality and 90% in HCV transmission globally and in member countries, including the United States. |
The impact of active surveillance and health education on an Ebola virus disease cluster - Kono District, Sierra Leone, 2014-2015
Stehling-Ariza T , Rosewell A , Moiba SA , Yorpie BB , Ndomaina KD , Jimissa KS , Leidman E , Rijken DJ , Basler C , Wood J , Manso D . BMC Infect Dis 2016 16 (1) 611 BACKGROUND: During December 2014-February 2015, an Ebola outbreak in a village in Kono district, Sierra Leone, began following unsafe funeral practices after the death of a person later confirmed to be infected with Ebola virus. In response, disease surveillance officers and community health workers, in collaboration with local leadership and international partners, conducted 1 day of active surveillance and health education for all households in the village followed by ongoing outreach. This study investigated the impact of these interventions on the outbreak. METHODS: Fifty confirmed Ebola cases were identified in the village between December 1, 2014 and February 28, 2015. Data from case investigations, treatment facility and laboratory records were analyzed to characterize the outbreak. The reproduction number (R) was estimated by fitting to the observed distribution of secondary cases. The impact of the active surveillance and health education was evaluated by comparing two outcomes before and after the day of the interventions: 1) the number of days from symptom onset to case-patient isolation or death and 2) a reported epidemiologic link to a prior Ebola case. RESULTS: The case fatality ratio among the 50 confirmed Ebola cases was 64.0 %. Twenty-three cases occurred among females (46.0 %); the mean age was 39 years (median: 37 years; range: 5 months to 75 years). Forty-three (87.8 %) cases were linked to the index case; 30 (61.2 %) were either at the funeral of Patient 1 or had contact with him while he was ill. R was 0.93 (95 % CI: 0.15-2.3); excluding the funeral, R was 0.29 (95 % CI: 0.11-0.53). 
The mean number of days in the community after onset of Ebola symptoms decreased from 4.0 days (median: 3 days; 95 % CI: 3.2-4.7) before the interventions to 2.9 days (median: 2 days; 95 % CI: 1.6-4.3) afterward. An epidemiologic link was reported in 47.6 % of case investigations prior to and 100 % after the interventions. CONCLUSIONS: Initial case investigation and contact tracing were hindered by delayed reporting and under-reporting of symptomatic individuals from the community. Active surveillance and health education contributed to quicker identification of suspected cases, interrupting further transmission. |
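In its simplest form, estimating the reproduction number from a reconstructed transmission chain amounts to averaging the observed secondary-case (offspring) counts; the study went further and fit a distribution to those counts, with uncertainty. A sketch with hypothetical counts (not the study's data), where one large count plays the role of the superspreading funeral event:

```python
from statistics import mean

# Hypothetical secondary-case counts per case-patient (offspring distribution);
# the single large count stands in for a superspreading funeral event.
secondary_cases = [5, 2, 1, 1, 0, 0, 0, 0, 0, 0]

r_all = mean(secondary_cases)        # crude point estimate of R over all cases
r_excl = mean(secondary_cases[1:])   # excluding the superspreading event
print(f"R = {r_all:.2f}; excluding the event, R = {r_excl:.2f}")
```

As in the abstract, excluding the superspreading event sharply reduces the estimate, since a single funeral accounted for much of the onward transmission.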
Implementation of a pragmatic, stepped-wedge cluster randomized trial to evaluate impact of Botswana's Xpert MTB/RIF diagnostic algorithm on TB diagnostic sensitivity and early antiretroviral therapy mortality
Auld AF , Agizew T , Pals S , Finlay A , Ndwapi N , Boyd R , Alexander H , Mathoma A , Basotli J , Gwebe-Nyirenda S , Shepherd J , Ellerbrock TV , Date A . BMC Infect Dis 2016 16 (1) 606 BACKGROUND: In 2012, as a pilot for Botswana's national Xpert MTB/RIF (Xpert) rollout plans, intensified tuberculosis (TB) case finding (ICF) activities were strengthened at 22 HIV treatment clinics prior to phased activation of 13 Xpert instruments. Together, the strengthened ICF intervention and Xpert activation are referred to as the "Xpert package". METHODS: The evaluation, called the Xpert Package Rollout Evaluation using a Stepped-wedge design (XPRES), has two key objectives: (1) to compare sensitivity of microscopy-based and Xpert-based pulmonary TB diagnostic algorithms in diagnosing sputum culture-positive TB; and (2) to evaluate impact of the "Xpert package" on all-cause, 6-month, adult antiretroviral therapy (ART) mortality. A pragmatic, stepped-wedge cluster-randomized trial design was chosen. The design involves enrollment of three cohorts: (1) cohort R, a retrospective cohort of all study clinic ART enrollees in the 24 months before study initiation (July 31, 2012); (2) cohort A, a prospective cohort of all consenting patients presenting to study clinics after study initiation, who received the ICF intervention and the microscopy-based TB diagnostic algorithm; and (3) cohort B, a prospective cohort of all consenting patients presenting to study clinics after Xpert activation, who received the ICF intervention and the Xpert-based TB diagnostic algorithm. TB diagnostic sensitivity will be compared between TB culture-positive enrollees in cohorts A and B. All-cause, 6-month ART-mortality will be compared between cohorts R and B. 
With anticipated cohort R, A, and B sample sizes of about 10,131, 1,878, and 4,258, respectively, the study is estimated to have >80 % power to detect differences in pre- versus post-Xpert TB diagnostic sensitivity if pre-Xpert sensitivity is ≤52.5 % and post-Xpert sensitivity ≥82.5 %, and >80 % power to detect a 40 % reduction in all-cause, 6-month, ART mortality between cohorts R and B if cohort R mortality is ≥13/100 person-years. DISCUSSION: Only one small previous trial (N = 424) among ART enrollees in Zimbabwe evaluated, in a secondary analysis, Xpert impact on all-cause 6-month ART mortality. No mortality impact was observed. This Botswana trial, with its larger sample size and powered specifically to detect differences in all-cause 6-month ART mortality, remains well-positioned to contribute to understanding of Xpert impact. TRIAL REGISTRATION: Retrospectively registered at ClinicalTrials.gov: NCT02538952 . |
A review of the mosquito species (Diptera: Culicidae) of Bangladesh
Irish SR , Al-Amin HM , Alam MS , Harbach RE . Parasit Vectors 2016 9 (1) 559 BACKGROUND: Diseases caused by mosquito-borne pathogens remain an important source of morbidity and mortality in Bangladesh. To better control the vectors that transmit the agents of disease, and hence the diseases they cause, and to appreciate the diversity of the family Culicidae, it is important to have an up-to-date list of the species present in the country. Original records were collected from a literature review to compile a list of the species recorded in Bangladesh. RESULTS: Records for 123 species were collected, although some species had only a single record. This is an increase of ten species over the most recent complete list, compiled nearly 30 years ago. Collection records of three additional species are included here: Anopheles pseudowillmori, Armigeres malayi and Mimomyia luzonensis. CONCLUSIONS: While this work constitutes the most complete list of mosquito species collected in Bangladesh, further work is needed to refine this list and understand the distributions of those species within the country. Improved morphological and molecular methods of identification will allow the refinement of this list in years to come. |
Urbanized white ibises (Eudocimus albus) as carriers of Salmonella enterica of significance to public health and wildlife
Hernandez SM , Welch CN , Peters VE , Lipp EK , Curry S , Yabsley MJ , Sanchez S , Presotto A , Gerner-Smidt P , Hise KB , Hammond E , Kistler WM , Madden M , Conway AL , Kwan T , Maurer JJ . PLoS One 2016 11 (10) e0164402 Worldwide, Salmonella spp. are a significant cause of disease for both humans and wildlife, with wild birds adapted to urban environments having different opportunities for pathogen exposure, infection, and transmission compared to their natural conspecifics. Food provisioning by people may influence these factors, especially when high-density mixed-species flocks aggregate. White Ibises (Eudocimus albus), an iconic Everglades species in decline in Florida, are becoming increasingly common in urbanized areas of south Florida, where most are hand-fed. We examined the prevalence of Salmonella shedding by ibises to determine how the characteristics of the landscapes where ibises forage, together with ibis behavior, affect shedding rates. We also compared Salmonella isolated from ibises to human isolates to better understand non-foodborne human salmonellosis. From 2010-2013, 13% of adult/subadult ibises (n = 261) and 35% of nestlings (n = 72) sampled were shedding Salmonella. The prevalence of Salmonella shedding by ibises significantly decreased as the percentage of Palustrine emergent wetlands and herbaceous grasslands increased, and increased as the proportion of open-developed land types (e.g. parks, lawns, golf courses) increased, suggesting that natural ecosystem land cover types supported birds with a lower prevalence of infection. A high diversity of Salmonella serotypes (n = 24) and strain types (43 PFGE types) was shed by ibises, and 33% of these serotypes ranked in the top 20 of significance for people in the years of the study. Importantly, 44% of the Salmonella Pulsed-Field Gel Electrophoresis patterns for ibis isolates (n = 43) matched profiles in the CDC PulseNet USA database. Of these, 20% came from Florida in the same three years we sampled ibises. 
Notably, there was a negative relationship between the amount of Palustrine emergent wetland and the number of Salmonella isolates from ibises that matched human cases in the PulseNet database (p = 0.056). Together, our results indicate that ibises are good indicators of the Salmonella strains circulating in their environment and that they have both the potential and the opportunity to transmit salmonellae to people. Finally, they may act as salmonellae carriers to natural environments where other, more highly susceptible groups (nestlings) may be detrimentally affected. |
Frequency of first-line antibiotic selection among US ambulatory care visits for otitis media, sinusitis, and pharyngitis
Hersh AL , Fleming-Dutra KE , Shapiro DJ , Hyun DY , Hicks LA . JAMA Intern Med 2016 176 (12) 1870-1872 The National Action Plan for Combating Antibiotic-Resistant Bacteria set a goal of reducing inappropriate outpatient antibiotic use by 50% by 2020 (1). A recent study (2) estimated that at least 30% of antibiotic prescriptions in ambulatory care settings in the United States during 2010–2011 were unnecessary. Inappropriate antibiotic prescribing also includes choosing an unnecessarily broad-spectrum antibiotic instead of an equally or more effective narrower-spectrum alternative. Otitis media (OM), sinusitis, and pharyngitis collectively account for nearly one-third of all antibiotics prescribed in outpatient settings (2), and professional guidelines recommend narrow-spectrum agents as first-line therapy for these conditions (2). Alternatives to first-line therapy are indicated in select circumstances, including for patients with penicillin allergy or recent treatment failure. The objective of this study was to measure the frequency with which first-line agents are prescribed for OM, sinusitis, and pharyngitis. |
Assessment of fungal diversity in a water-damaged office building.
Green BJ , Lemons AR , Park Y , Cox-Ganser JM , Park JH . J Occup Environ Hyg 2016 14 (4) 285-293 Recent studies have described fungal communities in indoor environments using gene sequencing-based approaches. In this study, dust-borne fungal communities were elucidated from a water-damaged office building located in the northeastern region of the United States using internal transcribed spacer (ITS) rRNA gene sequencing. Genomic DNA was extracted from 5 mg of floor dust derived from 22 samples collected from either the lower floors (n = 8) or a top floor (n = 14) of the office building. ITS gene sequencing resolved a total of 933 ITS sequences, which were clustered into 216 fungal operational taxonomic units (OTUs). Analysis of fungal OTUs at the 97% similarity threshold showed a difference between the lower and top floors that was marginally significant (p = 0.049). Species richness and diversity indices were reduced in the lower floor samples compared to the top floor samples, and there was a high degree of compositional dissimilarity within and between the two different areas within the building. Fungal OTUs were placed in the phyla Ascomycota (55%), Basidiomycota (41%), Zygomycota (3%), Glomeromycota (0.4%), Chytridiomycota (0.3%) and unassigned fungi (0.5%). The Ascomycota classes with the highest relative abundances included the Dothideomycetes (30%) and Eurotiomycetes (16%). The Basidiomycota consisted of the classes Ustilaginomycetes (14%), Tremellomycetes (11%), and Agaricomycetes (8%). Sequence reads derived from the plant pathogen Ustilago syntherismae and from obligate Basidiomycota yeast species were the most abundant in the analysis, accounting for 12% and 11% of fungal ITS sequences, respectively. ITS gene sequencing provides additional insight into the diversity of fungal OTUs. 
These data further highlight the contribution of fungi placed in the phylum Basidiomycota, obligate yeasts, as well as xerophilic species that are typically not resolved using traditional culture methods. |
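The species richness and diversity comparisons above typically rest on indices such as Shannon's H' computed over per-sample OTU counts. A minimal sketch; the OTU count vectors below are hypothetical, not the study's data:

```python
from math import log

def shannon_diversity(counts):
    """Shannon diversity index H' = -sum(p_i * ln(p_i)) over OTU counts."""
    total = sum(counts)
    return -sum((c / total) * log(c / total) for c in counts if c > 0)

# Hypothetical OTU counts for two floor-dust samples: a community
# dominated by one OTU vs a perfectly even one of equal richness.
skewed = [90, 5, 3, 2]
even = [25, 25, 25, 25]
print(shannon_diversity(skewed) < shannon_diversity(even))  # True
```

For equal richness, H' is maximized (ln of the number of OTUs) when abundances are even, which is why dominance by a single taxon such as Ustilago syntherismae pulls diversity down even when many OTUs are present.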
Survey of the perceptions of key stakeholders on the attributes of the South African Notifiable Diseases Surveillance System
Benson FG , Musekiwa A , Blumberg L , Rispel LC . BMC Public Health 2016 16 (1) 1120 BACKGROUND: An effective and efficient notifiable diseases surveillance system (NDSS) is essential for a rapid response to disease outbreaks, and the identification of priority diseases that may cause national, regional or public health emergencies of international concern (PHEICs). Regular assessments of country-based surveillance systems are needed to enable countries to respond to outbreaks before they become PHEICs. As part of a broader evaluation of the NDSS in South Africa, the aim of the study was to determine the perceptions of key stakeholders on the national NDSS attributes of acceptability, flexibility, simplicity, timeliness and usefulness. METHODS: During 2015, we conducted a nationally representative cross-sectional survey of communicable diseases coordinators and surveillance officers, as well as members of NDSS committees. Individuals with less than 1 year of experience with the NDSS were excluded. Consenting participants completed a self-administered questionnaire. The questionnaire elicited demographic information and perceptions of the NDSS attributes. Data were analysed using descriptive statistics and the unconditional logistic regression model. RESULTS: Most stakeholders interviewed (53 %, 60/114) were involved in disease control and response. The median number of years of experience with the NDSS was 11 years (inter-quartile range (IQR): 5 to 20 years). Regarding the NDSS attributes, 25 % of the stakeholders perceived the system to be acceptable, 51 % to be flexible, 45 % to be timely, 61 % to be useful, and 74 % to be simple. Health management stakeholders perceived the system to be more useful and timely than did the other stakeholders. 
Those with more years of experience were less likely to perceive the NDSS as acceptable (OR 0.91, 95 % CI: 0.84-1.00, p = 0.041); those in disease detection were less likely to perceive it as timely (OR 0.10, 95 % CI: 0.01-0.96, p = 0.046); and those participating in the National Outbreak Response Team were less likely to perceive it as useful (OR 0.38, 95 % CI: 0.16-0.93, p = 0.034). CONCLUSION: The overall poor perceptions of key stakeholders regarding the system attributes are a cause for concern. The study findings should inform the revitalisation and reform of the NDSS in South Africa, done in consultation and partnership with the key stakeholders. |
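The adjusted odds ratios above come from an unconditional logistic regression; for a single binary predictor, the crude version reduces to a 2x2-table odds ratio with a Wald confidence interval. A sketch of that arithmetic; the counts below are hypothetical, not the survey's data:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se_log_or = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = exp(log(or_) - z * se_log_or)
    upper = exp(log(or_) + z * se_log_or)
    return or_, lower, upper

# Hypothetical counts: Outbreak Response Team members perceiving the
# NDSS as useful vs other stakeholders doing so.
print(odds_ratio_ci(12, 18, 58, 26))
```

An OR below 1 with an upper CI bound below 1, as in the reported usefulness result, indicates the exposed group had significantly lower odds of the favourable perception.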
Poultry: the most common food in outbreaks with known pathogens, United States, 1998-2012
Chai SJ , Cole D , Nisler A , Mahon BE . Epidemiol Infect 2016 145 (2) 1-10 As poultry consumption continues to increase worldwide, and as the United States accounts for about one-third of all poultry exports globally, understanding factors leading to poultry-associated foodborne outbreaks in the United States has important implications for food safety. We analysed outbreaks reported to the United States' Foodborne Disease Outbreak Surveillance System from 1998 to 2012 in which the implicated food or ingredient could be assigned to one food category. Of 1114 outbreaks, poultry was associated with 279 (25%), accounting for the highest number of outbreaks, illnesses, and hospitalizations, and the second highest number of deaths. Of the 149 poultry-associated outbreaks caused by a confirmed pathogen, Salmonella enterica (43%) and Clostridium perfringens (26%) were the most common pathogens. Restaurants were the most commonly reported location of food preparation (37% of poultry-associated outbreaks), followed by private homes (25%), and catering facilities (13%). The most commonly reported factors contributing to poultry-associated outbreaks were food-handling errors (64%) and inadequate cooking (53%). Effective measures to reduce poultry contamination, promote safe food-handling practices, and ensure food handlers do not work while ill could reduce poultry-associated outbreaks and illnesses. |
Gastrointestinal illness associated with rancid tortilla chips at a correctional facility - Wyoming, 2015
Lupcho T , Harrist A , Van Houten C . MMWR Morb Mortal Wkly Rep 2016 65 (42) 1170-1173 On October 12, 2015, a county health department notified the Wyoming Department of Health of an outbreak of gastrointestinal illness among residents and staff members at a local correctional facility. The majority of ill persons reported onset of symptoms within 1-3 hours after eating lunch served at the facility cafeteria at noon on October 11. Residents and staff members reported that tortilla chips served at the lunch tasted and smelled like chemicals. The Wyoming Department of Health and county health department personnel conducted case-control studies to identify the outbreak source. Consuming lunch at the facility on October 11 was highly associated with illness; multivariate logistic regression analysis found that tortilla chips were the only food item associated with illness. Hexanal and peroxide, markers for rancidity, were detected in tortilla chips and composite food samples from the lunch. No infectious agent was detected in human stool specimens or food samples. Extensive testing of lunch items did not identify any unusual chemical. Epidemiologic and laboratory evidence implicated rancid tortilla chips as the most likely source of illness. This outbreak serves as a reminder to consider alternative food testing methods during outbreaks of unusual gastrointestinal illness when typical foodborne pathogens are not identified. For interpretation of alternative food testing results, samples of each type of food not suspected to be contaminated are needed to serve as controls. |
Validation of genotype cluster investigations for Mycobacterium tuberculosis: application results for 44 clusters from four heterogeneous United States jurisdictions.
Teeter LD , Vempaty P , Nguyen DT , Tapia J , Sharnprapai S , Ghosh S , Kammerer JS , Miramontes R , Cronin WA , Graviss EA . BMC Infect Dis 2016 16 (1) 594 BACKGROUND: Tracking the dissemination of specific Mycobacterium tuberculosis (Mtb) strains using genotyped Mtb isolates from tuberculosis patients is a routine public health practice in the United States. The present study proposes a standardized cluster investigation method to identify epidemiologic-linked patients in Mtb genotype clusters. The study also attempts to determine the proportion of epidemiologic-linked patients the proposed method would identify beyond the outcome of the conventional contact investigation. METHODS: The study population included Mtb culture positive patients from Georgia, Maryland, Massachusetts and Houston, Texas. Mtb isolates were genotyped by CDC's National TB Genotyping Service (NTGS) from January 2006 to October 2010. Mtb cluster investigations (CLIs) were conducted for patients whose isolates matched exactly by spoligotyping and 12-locus MIRU-VNTR. CLIs were carried out in four sequential steps: (1) Public Health Worker (PHW) Interview, (2) Contact Investigation (CI) Evaluation, (3) Public Health Records Review, and (4) CLI TB Patient Interviews. Comparison between patients whose links were identified through the study's CLI interviews (Step 4) and patients whose links were identified earlier in CLI (Steps 1-3) was conducted using logistic regression. RESULTS: Forty-four clusters were randomly selected from the four study sites (401 patients in total). Epidemiologic links were identified for 189/401 (47 %) study patients in a total of 201 linked patient-pairs. The numbers of linked patients identified in each CLI step were: Step 1 - 105/401 (26.2 %), Step 2 - 15/388 (3.9 %), Step 3 - 41/281 (14.6 %), and Step 4 - 28/119 (30 %). Among the 189 linked patients, 28 (14.8 %) were not identified in previous CI. No epidemiologic links were identified in 13/44 (30 %) clusters. 
CONCLUSIONS: We validated a standardized and practical method to systematically identify epidemiologic links among patients in Mtb genotype clusters, which can be integrated into the TB control and prevention programs in public health settings. The CLI interview identified additional epidemiologic links that were not identified in previous CI. One-third of the clusters showed no epidemiologic links despite being extensively investigated, suggesting that some improvement in the interviewing methods is still needed. |
Lassa and Ebola virus inhibitors identified using minigenome and recombinant virus reporter systems.
Welch SR , Guerrero LW , Chakrabarti AK , McMullan LK , Flint M , Bluemling GR , Painter GR , Nichol ST , Spiropoulou CF , Albarino CG . Antiviral Res 2016 136 9-18 Lassa virus (LASV) and Ebola virus (EBOV) infections are important global health issues resulting in significant morbidity and mortality. While several promising drug and vaccine trials for EBOV are ongoing, options for LASV infection are currently limited to ribavirin treatment. A major factor impeding the development of antiviral compounds to treat these infections is the need to manipulate the virus under BSL-4 containment, limiting research to a few institutes worldwide. Here we describe the development of a novel LASV minigenome assay based on the ambisense LASV S segment genome, with authentic terminal untranslated regions flanking a ZsGreen (ZsG) fluorescent reporter protein and a Gaussia princeps luciferase (gLuc) reporter gene. This assay, along with a similar previously established EBOV minigenome, was optimized for high-throughput screening (HTS) of potential antiviral compounds under BSL-2 containment. In addition, we rescued a recombinant LASV expressing ZsG, which, in conjunction with a recombinant EBOV reporter virus, was used to confirm any potential antiviral hits in vitro. Combining an initial screen to identify potential antiviral compounds at BSL-2 containment before progressing to HTS with infectious virus will reduce the amount of expensive and technically challenging BSL-4 containment research. Using these assays, we identified 6-azauridine as having anti-LASV activity, and demonstrated its anti-EBOV activity in human cells. We further identified 2'-deoxy-2'-fluorocytidine as having potent anti-LASV activity, with an EC50 value 10 times lower than that of ribavirin. |
The need for a next-generation public health response to rare diseases.
Valdez R , Grosse SD , Khoury MJ . Genet Med 2016 19 (5) 489-490 Genet Med advance online publication, 27 October 2016; doi:10.1038/gim.2016.166. |
Policy changes and improvements in health insurance coverage among MSM: 20 U.S. cities, 2008-2014
Cooley LA , Hoots B , Wejnert C , Lewis R , Paz-Bailey G . AIDS Behav 2016 21 (3) 615-618 Recent policy changes have improved the ability of gay, bisexual, and other men who have sex with men (MSM) to secure health insurance. We wanted to assess changes over time in self-reported health insurance status among MSM participating in CDC's National HIV Behavioral Surveillance (NHBS) in 2008, 2011, and 2014. We analyzed NHBS data from sexually active MSM interviewed at venues in 20 U.S. cities. To determine if interview year was associated with health insurance status, we used a Poisson model with robust standard errors. Among included MSM, the overall percentage of MSM with health insurance rose 16 % from 2008 (68 %) to 2014 (79 %) (p value for trend < 0.001). The change in coverage over time was greatest in the key demographic segments, defined by age, education, and income, that had lower health insurance coverage in all three interview years. Corresponding with recent policy changes, health insurance coverage improved among MSM participating in NHBS, with greater improvements in historically underinsured demographic segments. Despite these increases, improved coverage is still needed. Improved access to health insurance could lead to a reduction in health disparities among MSM over time. |
Cross-neutralizing and protective human antibody specificities to poxvirus infections
Gilchuk I , Gilchuk P , Sapparapu G , Lampley R , Singh V , Kose N , Blum DL , Hughes LJ , Satheshkumar PS , Townsend MB , Kondas AV , Reed Z , Weiner Z , Olson VA , Hammarlund E , Raue HP , Slifka MK , Slaughter JC , Graham BS , Edwards KM , Eisenberg RJ , Cohen GH , Joyce S , Crowe JE Jr . Cell 2016 167 (3) 684-694.e9 Monkeypox (MPXV) and cowpox (CPXV) are emerging agents that cause severe human infections on an intermittent basis, and variola virus (VARV) has potential for use as an agent of bioterror. Vaccinia immune globulin (VIG) has been used therapeutically to treat severe orthopoxvirus infections but is in short supply. We generated a large panel of orthopoxvirus-specific human monoclonal antibodies (Abs) from immune subjects to investigate the molecular basis of broadly neutralizing antibody responses for diverse orthopoxviruses. Detailed analysis revealed the principal neutralizing antibody specificities that are cross-reactive for VACV, CPXV, MPXV, and VARV and that are determinants of protection in murine challenge models. Optimal protection following respiratory or systemic infection required a mixture of Abs that targeted several membrane proteins, including proteins on enveloped and mature virion forms of virus. This work reveals orthopoxvirus targets for human Abs that mediate cross-protective immunity and identifies new candidate Ab therapeutic mixtures to replace VIG. |
Machine learning for predicting vaccine immunogenicity
Lee EK , Nakaya HI , Yuan F , Querec TD , Burel G , Pietz FH , Benecke BA , Pulendran B . Interfaces (Providence) 2016 46 (5) 368-390 The ability to predict how different individuals will respond to vaccination and to understand what best protects individuals from infection greatly facilitates developing next-generation vaccines. It facilitates both the rapid design and evaluation of new and emerging vaccines and identifies individuals unlikely to be protected by a vaccine. We describe a general-purpose machine-learning framework, DAMIP, for discovering gene signatures that can predict vaccine immunity and efficacy. DAMIP is a multiple-group, concurrent classifier that offers unique features not present in other models: a nonlinear data transformation to manage the curse of dimensionality and noise; a reserved-judgment region that handles fuzzy entities; and constraints on the allowed percentage of misclassifications. Using DAMIP, results for yellow fever demonstrated, for the first time, that a vaccine's ability to immunize a patient could be successfully predicted (with accuracy greater than 90 percent) within one week after vaccination. A gene identified by DAMIP, EIF2AK4, helped decrypt a seven-decade-old mystery of vaccination. Results for flu vaccine demonstrated DAMIP's applicability to both live-attenuated and inactivated vaccines. Results in a malaria study enabled targeted delivery to individual patients. Our project's methods and findings permit highlighting and probabilistic prioritization of candidate hypotheses to enhance biological discovery. Moreover, they guide the rapid development of better vaccines to fight emerging infections, and improve monitoring for poor responses in the elderly, infants, or others with weakened immune systems. In addition, the project's work should help with universal flu-vaccine design. © 2016 INFORMS. |
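DAMIP itself is a constrained optimization model, but its reserved-judgment feature can be illustrated with a toy abstention rule: a sample is assigned to a group only when its score clearly wins; otherwise judgment is reserved rather than forcing a classification. The group scores and threshold below are hypothetical, not part of the published framework:

```python
def classify_with_reserve(group_scores, threshold=0.7):
    """Return the best-scoring group, or 'reserved' when the best
    score does not clear the confidence threshold (the fuzzy case)."""
    best = max(group_scores, key=group_scores.get)
    return best if group_scores[best] >= threshold else "reserved"

# A confident sample is classified; an ambiguous one is reserved.
print(classify_with_reserve({"responder": 0.92, "non-responder": 0.08}))
print(classify_with_reserve({"responder": 0.55, "non-responder": 0.45}))
```

Reserving judgment on ambiguous samples is one way a classifier can keep misclassification rates within a preset bound, at the cost of leaving some samples unlabeled.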
Accelerated evaluation of automated vehicles safety in lane-change scenarios based on importance sampling techniques
Zhao D , Lam H , Peng H , Bao S , LeBlanc DJ , Nobukawa K , Pan CS . IEEE Trans Intell Transp Syst 2016 18 (3) 595-607 Automated vehicles (AVs) must be thoroughly evaluated before their release and deployment. A widely used evaluation approach is the Naturalistic-Field Operational Test (N-FOT), which tests prototype vehicles directly on the public roads. Due to the low exposure to safety-critical scenarios, N-FOTs are time consuming and expensive to conduct. In this paper, we propose an accelerated evaluation approach for AVs. The results can be used to generate motions of the other primary vehicles to accelerate the verification of AVs in simulations and controlled experiments. Frontal collision due to unsafe cut-ins is the target crash type of this paper. Human-controlled vehicles making unsafe lane changes are modeled as the primary disturbance to AVs based on data collected by the University of Michigan Safety Pilot Model Deployment Program. The cut-in scenarios are generated based on skewed statistics of collected human driver behaviors, which generate risky testing scenarios while preserving the statistical information so that the safety benefits of AVs in nonaccelerated cases can be accurately estimated. The cross-entropy method is used to recursively search for the optimal skewing parameters. The frequencies of the occurrences of conflicts, crashes, and injuries are estimated for a modeled AV, and the achieved acceleration rate is around 2000 to 20 000. In other words, in the accelerated simulations, driving for 1000 miles will expose the AV to challenging scenarios that would take about 2 to 20 million miles of real-world driving to encounter. This technique thus has the potential to greatly reduce the development and validation time for AVs. |
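The skewed-statistics idea above is classical importance sampling: sample from a distribution that makes the rare event common, then reweight each sample by the likelihood ratio so the estimate remains unbiased. A minimal sketch on a Gaussian tail probability (not the paper's cut-in model, whose proposal is tuned by the cross-entropy method):

```python
import random
from math import exp, pi, sqrt

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * sqrt(2 * pi))

def importance_estimate(n=100_000, seed=1):
    """Estimate the rare tail probability P(X > 4) for X ~ N(0, 1) by
    sampling from the skewed proposal N(4, 1) and weighting each hit
    by the likelihood ratio target density / proposal density."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(4, 1)  # skewed proposal makes the event common
        if x > 4:            # the event, rare under the target N(0, 1)
            total += normal_pdf(x, 0, 1) / normal_pdf(x, 4, 1)
    return total / n

# Plain Monte Carlo would need ~30,000 draws on average to see one
# event; the reweighted estimate converges to the true value ~3.17e-5.
print(importance_estimate())
```

The acceleration rate reported in the paper is exactly this effect at scale: rare crash-precursor scenarios are made frequent in simulation, while the reweighting preserves the statistics of the nonaccelerated world.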
Analysis of whole human blood for Pb, Cd, Hg, Se, and Mn by ICP-DRC-MS for biomonitoring and acute exposures
Jones DR , Jarrett JM , Tevis DS , Franklin M , Mullinix NJ , Wallon KL , Derrick Quarles C Jr , Caldwell KL , Jones RL . Talanta 2017 162 114-122 We improved our inductively coupled plasma mass spectrometry (ICP-MS) whole blood method [1] for determination of lead (Pb), cadmium (Cd), and mercury (Hg) by including manganese (Mn) and selenium (Se), and expanding the calibration range of all analytes. The method is validated on a PerkinElmer (PE) ELAN® DRC II ICP-MS (ICP-DRC-MS) and uses the Dynamic Reaction Cell (DRC) technology to attenuate interfering background ion signals via ion-molecule reactions. Methane gas (CH4) eliminates background signal from 40Ar2+ to permit determination of 80Se+, and oxygen gas (O2) eliminates several polyatomic interferences (e.g. 40Ar15N+, 54Fe1H+) on 55Mn+. Hg sensitivity in DRC mode is a factor of two higher than in vented mode when measured under the same DRC conditions as Mn due to collisional focusing of the ion beam. To compensate for the expanded method's longer analysis time (due to DRC mode pause delays), we implemented an SC4-FAST autosampler (ESI Scientific, Omaha, NE), which vacuum loads the sample onto a loop, to keep the sample-to-sample measurement time to less than 5 min, allowing for preparation and analysis of 60 samples in an 8-h work shift. The longer analysis time also resulted in faster breakdown of the hydrocarbon oil in the interface roughing pump. The replacement of the standard roughing pump with a pump using a fluorinated lubricant, Fomblin®, extended the time between pump maintenance. We optimized the diluent and rinse solution components to reduce carryover from high concentration samples and prevent the formation of precipitates. We performed a robust calculation to determine the following limits of detection (LOD) in whole blood: 0.07 µg/dL for Pb, 0.10 µg/L for Cd, 0.28 µg/L for Hg, 0.99 µg/L for Mn, and 24.5 µg/L for Se. |
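The paper applies a robust LOD calculation; the textbook version estimates the detection limit as the blank mean plus three standard deviations of replicate blank measurements. A sketch of that convention; the blank readings below are hypothetical, not the paper's data:

```python
from statistics import mean, stdev

def limit_of_detection(blank_results, k=3):
    """Classic LOD estimate: mean of replicate blank measurements
    plus k standard deviations (k = 3 is the common convention)."""
    return mean(blank_results) + k * stdev(blank_results)

# Hypothetical replicate blank readings (µg/L) from repeated runs.
blanks = [0.010, 0.014, 0.009, 0.012, 0.011, 0.013]
print(round(limit_of_detection(blanks), 4))  # 0.0171
```

Robust variants replace the mean and standard deviation with outlier-resistant estimators (e.g. median and MAD), which matters when occasional contaminated blanks would otherwise inflate the LOD.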
Refuge alternatives relief valve testing and design
Lutz TJ , Bissert PT , Homce GT , Yonkey JA . Min Eng 2016 68 (10) 55-59 The U.S. National Institute for Occupational Safety and Health (NIOSH) has been researching refuge alternatives (RAs) since 2007. RAs typically have built-in pressure relief valves (PRVs) to prevent the unit from reaching unsafe pressures. The U.S. Mine Safety and Health Administration requires that these valves vent the chamber at a maximum pressure of 1.25 kPa (0.18 psi, 5.0 in. H2O), or as specified by the manufacturer, above mine atmospheric pressure in the RA. To facilitate PRV testing, an instrumented benchtop test fixture was developed using an off-the-shelf centrifugal blower and ductwork. Relief pressures and flow characteristics were measured for three units: (1) a modified polyvinyl chloride check valve, (2) an off-the-shelf brass/cast-iron butterfly check valve and (3) a commercially available valve that was designed specifically for one manufacturer's steel prefabricated RAs and had been adapted for use in one mine operator's built-in-place RA. PRVs used in tent-style RAs were not investigated. The units were tested with different modifications and configurations in order to check compliance with Title 30 Code of Federal Regulations, or 30 CFR, regulations. The commercially available relief valve did not meet the 30 CFR relief pressure specification but may meet the manufacturer's specification. Alternative valve designs were modified to meet the 30 CFR relief pressure specification, but all valve designs will need further design research to examine survivability in the event of a 103 kPa (15.0 psi) impulse overpressure during a disaster. |
Near-infrared spectroscopy, a rapid method for predicting the age of male and female wild-type and Wolbachia infected Aedes aegypti
Sikulu-Lord MT , Milali MP , Henry M , Wirtz RA , Hugo LE , Dowell FE , Devine GJ . PLoS Negl Trop Dis 2016 10 (10) e0005040 Estimating the age distribution of mosquito populations is crucial for assessing their capacity to transmit disease and for evaluating the efficacy of available vector control programs. This study reports on the capacity of the near-infrared spectroscopy (NIRS) technique to rapidly predict the ages of the principal dengue and Zika vector, Aedes aegypti. The ages of wild-type males and females, and of males and females infected with wMel and wMelPop strains of Wolbachia pipientis, were characterized using this method. Calibrations were developed using spectra collected from their heads and thoraces using partial least squares (PLS) regression. A highly significant correlation was found between the true and predicted ages of mosquitoes. The coefficients of determination for wild-type females and males across all age groups were R2 = 0.84 and 0.78, respectively. The coefficients of determination for the age of wMel and wMelPop infected females were 0.71 and 0.80, respectively (P < 0.001 in both instances). The age of wild-type female Ae. aegypti could be identified as < or ≥ 8 days old with an accuracy of 91% (N = 501), whereas female Ae. aegypti infected with wMel and wMelPop were differentiated into the two age groups with an accuracy of 83% (N = 284) and 78% (N = 229), respectively. Our results also indicate that NIRS can distinguish between young and old male wild-type, wMel and wMelPop infected Ae. aegypti with accuracies of 87% (N = 253), 83% (N = 277) and 78% (N = 234), respectively. We have demonstrated the potential of NIRS as a predictor of the age of female and male wild-type and Wolbachia infected Ae. aegypti mosquitoes under laboratory conditions. After field validation, the tool has the potential to offer a cheap and rapid alternative for surveillance of dengue and Zika vector control programs. |
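The reported figures combine a regression metric (R2 between true and NIRS-predicted age) with a binary age-class accuracy at the 8-day cutoff. Both can be sketched directly; the true and predicted ages below are hypothetical, not the study's calibration data:

```python
def r_squared(true, pred):
    """Coefficient of determination between true and predicted ages."""
    mean_t = sum(true) / len(true)
    ss_res = sum((t - p) ** 2 for t, p in zip(true, pred))
    ss_tot = sum((t - mean_t) ** 2 for t in true)
    return 1 - ss_res / ss_tot

def binary_accuracy(true, pred, cutoff=8):
    """Fraction of mosquitoes placed in the correct age class
    (< cutoff vs >= cutoff days), mirroring the two-group evaluation."""
    return sum((t < cutoff) == (p < cutoff) for t, p in zip(true, pred)) / len(true)

# Hypothetical true vs NIRS-predicted ages (days) for six mosquitoes.
true_age = [1, 3, 5, 8, 10, 14]
pred_age = [2, 2, 6, 7, 11, 13]
print(r_squared(true_age, pred_age), binary_accuracy(true_age, pred_age))
```

Note that a high R2 does not guarantee high binary accuracy: small prediction errors near the 8-day cutoff (as for the fourth mosquito above) flip the age class even when the regression fit is good.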
Elevation of alanine aminotransferase activity occurs after activation of the cell-death signaling initiated by pattern-recognition receptors but before activation of cytolytic effectors in NK or CD8+ T cells in the liver during acute HCV infection
Choi YH , Jin N , Kelly F , Sakthivel SK , Yu T . PLoS One 2016 11 (10) e0165533 Pattern-recognition receptors (PRRs) promote host defenses against HCV infection by binding to their corresponding adapter molecules leading to the initiation of innate immune responses including cell death. We investigated the expression of PRR genes, biomarkers of liver cell-death, and T cell and NK cell activation/inhibition-related genes in liver and serum obtained from three chimpanzees with experimentally induced acute HCV infection, and analyzed the correlation between gene expression levels and clinical profiles. Our results showed that expression of hepatic RIG-I, TLR3, TLR7, 2OAS1, and CXCL10 mRNAs was upregulated as early as 7 days post-inoculation and peaked 12 to 83 days post-inoculation. All three HCV infected chimpanzees exhibited significant elevations of serum alanine aminotransferase (ALT) activity between 70 and 95 days after inoculation. Elevated levels of serum cytokeratin 18 (CK-18) and caspases 3 and 7 activity coincided closely with the rise of ALT activity, and were preceded by significant increases in levels of caspase 3 and caspase 7 mRNAs in the liver. In particular, significant positive lagged correlations were observed between RIG-I, TLR3, CXCL10, 2OAS1, and PD-L1 mRNA levels and ALT activity 3 to 12 days before the peak of ALT activity. However, we observed substantial negative lagged correlations between T cell and NK cell activation/inhibition-related genes and ALT activity 5 to 32 days after the peak of ALT activity. Our results indicate that cell-death signaling is preceded by early induction of RIG-I, TLR3, 2OAS1, and CXCL10 mRNAs, leads to elevation of ALT activity, and occurs before the activation of NK and T cells during acute HCV infection. Our study suggests that PRRs and the type I IFN response may play a critical role in the development of liver cell injury related to viral clearance during acute HCV infection. |
Endocytic pathways used by Andes virus to enter primary human lung endothelial cells
Chiang CF , Flint M , Lin JS , Spiropoulou CF . PLoS One 2016 11 (10) e0164768 Andes virus (ANDV) is the major cause of hantavirus pulmonary syndrome (HPS) in South America. Despite a high fatality rate (up to 40%), no vaccines or antiviral therapies are approved to treat ANDV infection. To understand the role of endocytic pathways in ANDV infection, we used 3 complementary approaches to identify cellular factors required for ANDV entry into human lung microvascular endothelial cells. We screened an siRNA library targeting 140 genes involved in membrane trafficking, and identified 55 genes required for ANDV infection. These genes control the major endocytic pathways, endosomal transport, cell signaling, and cytoskeleton rearrangement. We then used infectious ANDV and retroviral pseudovirions to further characterize the possible involvement of 9 of these genes in the early steps of ANDV entry. In addition, we used markers of cellular endocytosis along with chemical inhibitors of known endocytic pathways to show that ANDV uses multiple routes of entry to infect target cells. These entry mechanisms are mainly clathrin-, dynamin-, and cholesterol-dependent, but entry can also occur in a clathrin-independent manner. |
Evaluation of SMARTube to detect HIV infection before seroconversion using standard methods
Feldblum PJ , Chen PL , Fischer SJ , Sexton CJ . AIDS Res Hum Retroviruses 2016 32 1067-1071 The acute phase of HIV infection carries a substantial risk of transmission; identification of acute-phase infections may offer opportunities to reduce that risk. SMARTube incubation of blood specimens is designed to stimulate in vivo-primed HIV-specific lymphocytes to produce HIV antibodies in vitro. The resulting supernatant (S-plasma) can then be tested with commercially available HIV assays to identify acute infections. We assessed the performance of SMARTube in identifying acute HIV infections in studies at three developing-country sites. We conducted HIV incidence studies in Ho Chi Minh City, Vietnam, and Bloemfontein and Rustenburg, South Africa. We estimated HIV incidence in cross-sectional samples and measured prospective incidence in uninfected women followed for up to 12 months, incorporating SMARTube into the HIV testing algorithm at cross-sectional screening and monthly follow-up visits. We tested 1,384 persons in Vietnam, 1,145 women in Bloemfontein, and 538 persons in Rustenburg. Cross-sectional samples from 11 participants who tested positive with SMARTube after an initial unincubated negative test result (11 of 2,472; 0.4% of all specimens) were considered "potential acute" infections. Matching samples from 3 of the 11 (27.3%) were confirmed by polymerase chain reaction (PCR) testing. In follow-up of 355, 401, and 223 uninfected women in Vietnam, Bloemfontein, and Rustenburg, respectively, 11 seroconversions occurred in Bloemfontein and Rustenburg. In four of these incident infections (36.4%), SMARTube testing resulted in earlier detection of HIV infection than the eventual seroconversion visits. In our field studies, pretreatment with SMARTube allowed the identification of acute HIV-1 infection in some new infections, but with a positive predictive value of only 27%.
Larger studies are needed to evaluate SMARTube as an alternative to technically challenging and costly enzyme immunoassay and PCR testing to detect acute HIV infection. |
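The proportions cited in the SMARTube abstract follow directly from its reported counts. A minimal Python sketch (illustrative only, not from the study; the `proportion` helper is hypothetical) reproduces them:

```python
# Illustrative recomputation of the proportions reported in the
# SMARTube field-study abstract, from its stated counts.

def proportion(numerator: int, denominator: int) -> float:
    """Return a percentage rounded to one decimal place."""
    return round(100 * numerator / denominator, 1)

# 11 "potential acute" infections out of 2,472 screened specimens
potential_acute_rate = proportion(11, 2472)  # ~0.4% of all specimens

# 3 of the 11 potential acute infections confirmed by PCR:
# the positive predictive value cited in the abstract
ppv = proportion(3, 11)  # ~27.3%

# 4 of 11 prospective seroconversions detected earlier by SMARTube
earlier_detection = proportion(4, 11)  # ~36.4%

print(potential_acute_rate, ppv, earlier_detection)
```

Note that the low positive predictive value (about 27%) reflects the small number of PCR-confirmed cases among SMARTube-positive screens, which is why the authors call for larger studies.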
In vitro exposure system for study of aerosolized influenza virus
Creager HM , Zeng H , Pulit-Penaloza JA , Maines TR , Tumpey TM , Belser JA . Virology 2016 500 62-70 Infection of adherent cell monolayers using a liquid inoculum is an established method to reliably and quantitatively study virus infection, but it poorly recapitulates the exposure and infection of cells in the respiratory tract that occur during infection with aerosolized pathogens. To better simulate natural infection in vitro, we adapted a system that generates viral aerosols similar to those exhaled by infected humans to the inoculation of epithelial cell monolayers. Procedures for cellular infection and calculation of exposure dose were developed and tested using viruses characterized by distinct transmission and pathogenicity phenotypes: a highly pathogenic avian influenza (HPAI) H5N1 virus, a low pathogenicity avian influenza (LPAI) H7N9 virus, and a seasonal H3N2 virus. While all three aerosolized viruses were highly infectious in a human bronchial epithelial cell line (Calu-3) cultured submerged in media, differences between the viruses were observed in primary human alveolar epithelial cells and in Calu-3 cells cultured at air-liquid interface. This system provides a novel enhancement to traditional in vitro experiments, particularly those focused on the early stages of infection. |
Increases in NKG2C expression on T cells and higher levels of circulating CD8+ B cells are associated with sterilizing immunity provided by a live attenuated SIV vaccine
Hodara VL , Parodi LM , Keckler MS , Giavedoni LD . AIDS Res Hum Retroviruses 2016 32 1125-1134 Vaccines based on live attenuated viruses are highly effective immunogens in the simian immunodeficiency virus (SIV)/rhesus macaque animal model and offer the possibility of studying correlates of protection against infection with virulent virus. We utilized a tether system to study the acute events after challenge with pathogenic SIV in naive macaques and in animals vaccinated with a live attenuated vaccine. This approach allowed frequent sampling of small blood volumes without sedation or restraint of the animals, thus reducing the confounding effect of sampling stress. Before challenge, vaccinated animals presented significantly higher levels of proliferating and activated B cells than naive macaques, manifested by high expression of CD8 on B cells. After SIV challenge, the only changes observed in protected vaccinated macaques were significant increases in expression of the NK marker NKG2C on CD4 and CD8 T cells. We also found that infection of naive macaques with SIV resulted in a transient peak of expression of CD20 on CD8 T cells and a constant rise in the number of B cells expressing CD8. Finally, analysis of a larger cohort of vaccinated animals showed that, even when circulating levels of vaccine virus are below the limit of detection, live attenuated vaccines induce systemic increases of IP-10 and perforin. These studies indicate that components of both the innate and adaptive immune systems of animals inoculated with a live attenuated SIV vaccine respond to and control infection with virulent virus. Persistence of the vaccine virus in tissues may explain the elevated cytokine and B-cell activation levels. In addition, our report underscores the utility of the tether system for the intensive study of acute immune responses to viral infections. |
Ability of device to collect bacteria from cough aerosols generated by adults with cystic fibrosis
Ku DN , Ku SK , Helfman B , McCarty NA , Wolff BJ , Winchell JM , Anderson LJ . F1000Res 2016 5 1920 Background: Identifying lung pathogens and acute spikes in lung bacterial counts remains a challenge in the treatment of patients with cystic fibrosis (CF). Bacteria from the deep lung may be sampled from aerosols produced during coughing. Methods: A new device was used to collect and measure bacterial levels from cough aerosols of patients with CF. Sputum and oral specimens were also collected and measured for comparison. Pseudomonas aeruginosa, Staphylococcus aureus, Klebsiella pneumoniae, and Streptococcus mitis were detected in specimens using real-time polymerase chain reaction (RT-PCR) molecular assays. Results: Twenty adult patients with CF and 10 healthy controls participated. CF-related bacteria (CFRB) were detected in 13/20 (65%) cough specimens versus 15/15 (100%) sputum specimens. Commensal S. mitis was present in 0/17 (0%, p=0.0002) cough specimens and 13/14 (93%) sputum samples. In normal controls, no bacteria were collected in cough specimens, but 4/10 (40%) oral specimens were positive for CFRB. Conclusions: Non-invasive cough aerosol collection may detect lower respiratory pathogens in CF patients, with specificity and sensitivity similar to rates detected by bronchoalveolar lavage (BAL), without contamination by oral CFRB or commensal bacteria. |
Complexities in ferret influenza virus pathogenesis and transmission models
Belser JA , Eckert AM , Tumpey TM , Maines TR . Microbiol Mol Biol Rev 2016 80 (3) 733-44 Ferrets are widely employed to study the pathogenicity, transmissibility, and tropism of influenza viruses. However, inherent variations in inoculation methods, sampling schemes, and experimental designs are often overlooked when contextualizing or aggregating data between laboratories, leading to potential confusion or misinterpretation of results. Here, we provide a comprehensive overview of parameters to consider when planning an experiment using ferrets, collecting data from the experiment, and placing results in context with previously performed studies. This review offers information that is of particular importance for researchers in the field who rely on ferret data but do not perform the experiments themselves. Furthermore, this review highlights the breadth of experimental designs and techniques currently available to study influenza viruses in this model, underscoring the wide heterogeneity of protocols currently used for ferret studies while demonstrating the wealth of information which can benefit risk assessments of emerging influenza viruses. |
Computational fluid dynamics study on the influence of an alternate ventilation configuration on the possible flow path of infectious cough aerosols in a mock airborne infection isolation room
Sharan Thatiparti D , Ghia U , Mead KR . Sci Technol Built Environ 2016 23 (2) 355-366 When infectious epidemics occur, they can be perpetuated within health care settings, potentially resulting in severe health care workforce absenteeism, morbidity, mortality, and economic losses. The ventilation system configuration of an airborne infection isolation room is one factor that can play a role in protecting health care workers from infectious patient bioaerosols. Though commonly associated with airborne infectious diseases, airborne infection isolation room design can also impact other transmission routes, such as short-range airborne transmission and fomite and contact transmission, that are affected by contagion concentration and recirculation. This article presents a computational fluid dynamics study of the influence of the ventilation configuration on the possible flow path of bioaerosol dispersal in a mock airborne infection isolation room. First, a mock airborne infection isolation room was modeled with room geometry and layout, ventilation parameters, and pressurization corresponding to a traditional ceiling-mounted ventilation arrangement observed in existing hospitals. An alternate ventilation configuration was then modeled that retained the linear supply diffuser of the original mock room but interchanged the square supply and exhaust locations, placing the exhaust closer to the patient source and allowing clean air from the supply vents to follow clean-to-dirty flow paths, originating in uncontaminated parts of the room before entering the contaminated patient's air space. The modeled alternate airborne infection isolation room ventilation rate was 12 air changes per hour. Two human breathing models were used to simulate a source patient and a receiving health care worker.
A patient cough cycle was introduced into the simulation, and the airborne infection dispersal was tracked in time using a multiphase flow simulation approach. The results from the alternate configuration showed that cough aerosols were pulled toward the exhaust vent without encountering the health care worker by 0.93 s after the patient's cough, following an uninterrupted streamline from the patient to the ceiling exhaust. However, not all of the aerosols were vented out of the room. The remaining cough aerosols entered the health care worker's breathing zone by 0.98 s, one of the critical stages for the health care worker's exposure to airborne virus, presenting the opportunity for adverse health effects from the inhalation of cough aerosols. Within 2 s, the cough aerosols reentered and recirculated within the patient's and health care worker's surroundings, resulting in pockets of old, contaminated air. By this time, coalescence losses had decreased, as the aerosols were no longer in very close proximity and their movement was primarily influenced by the room's airflow patterns. In the patient's and health care worker's area away from the supply, the fresh air supply failed to reach this part of the room quickly enough to dilute the cough aerosol concentration. The exhaust was also found to have minimal effect on cough aerosol removal, except in areas with high exhaust velocities very close to the exhaust grill. Within 5-20 s after the patient's cough, the aerosols tended to break up into smaller aerosols of less than one micron in diameter; these remained airborne, became entrained in the supply air stream, and spread throughout the room. The suspended aerosols from a single cough cycle remained airborne in the room for more than 21 s. The duration of airborne contagion in the room, and the health care worker's prolonged exposure to it, would likely increase with successive cough cycles. Hence, the evaluated alternate airborne infection isolation room failed to remove at least 38% of the particles to which the health care worker was exposed within the first second of a patient's cough. |
Perinatal regionalization: A geospatial view of perinatal critical care, United States, 2010-2013
Brantley MD , Davis NL , Goodman DA , Callaghan WM , Barfield WD . Am J Obstet Gynecol 2016 216 (2) 185 e1-185 e10 BACKGROUND: Perinatal services exist today as a dyad of maternal and neonatal care. When perinatal care is fragmented or unavailable, excess morbidity and mortality may occur in pregnant women and newborns. OBJECTIVE: To describe spatial relationships between women of reproductive age, individual perinatal subspecialists (Maternal-Fetal Medicine and Neonatology), and obstetric and neonatal critical care facilities in the United States in order to identify gaps in health care access. STUDY DESIGN: We used geographic visualization and conducted surface interpolation, nearest neighbor, and proximity analyses. Source data included the 2010 United States Census, October 2013 National Provider Index, 2012 American Hospital Association, 2012 National Center for Health Statistics Natality File, and the 2011 American Academy of Pediatrics Directory. RESULTS: In October 2013, there were 2.5 neonatologists for every Maternal-Fetal Medicine specialist in the United States. In 2012 there were 1.4 Level III or higher neonatal intensive care units (NICUs) for every Level III obstetric unit (hereafter, obstetric critical care unit). Nationally, 87% of women of reproductive age live within 50 miles of both an obstetric critical care unit and a NICU. However, 18% of obstetric critical care units had no NICU, and 20% of NICUs had no obstetric critical care unit, within a 10-mile radius. Additionally, 26% of obstetric critical care units had no Maternal-Fetal Medicine specialist practicing within 10 miles of the facility, and 4% of NICUs had no neonatologist practicing within 10 miles. CONCLUSION: Gaps in access, and discordance between the availability of Level III or higher obstetric and neonatal care, may affect delivery of risk-appropriate care for high-risk maternal-fetal dyads.
Further study is needed to understand the importance of these gaps and discordance on maternal and neonatal outcomes. |
Sociodemographic and behavioral factors associated with added sugars intake among US adults
Park S , Thompson FE , McGuire LC , Pan L , Galuska DA , Blanck HM . J Acad Nutr Diet 2016 116 (10) 1589-98 BACKGROUND: Reducing added sugars intake is one of the Healthy People 2020 objectives. High added sugars intake may be associated with adverse health consequences. OBJECTIVE: This cross-sectional study identified sociodemographic and behavioral characteristics associated with added sugars intake among US adults (18 years and older) using the 2010 National Health Interview Survey data (n=24,967). METHODS: The outcome variable was added sugars intake from foods and beverages using scoring algorithms to convert dietary screener frequency responses on nine items to estimates of individual dietary intake of added sugars in teaspoons per day. Added sugars intake was categorized into tertiles (lowest, middle, highest) stratified by sex. The explanatory variables were sociodemographic and behavioral characteristics. Multinomial logistic regression was used to estimate the adjusted odds ratios for the highest and middle tertile added sugars intake groups as compared with the lowest tertile group. RESULTS: Estimated median added sugars intake was 17.6 tsp/d for men and 11.7 tsp/d for women. For men and women, those who had significantly greater odds for being in the highest tertile of added sugars intake (men: ≥22.0 tsp/d; women: ≥14.6 tsp/d) were younger, less educated, had lower income, were less physically active, were current smokers, and were former or current infrequent/light drinkers, whereas non-Hispanic other/multiracial and those living in the West had significantly lower odds for being in the highest tertile of added sugars intake. Different patterns were found by sex. Non-Hispanic black men had lower odds for being in the highest tertile of added sugars intake, whereas non-Hispanic black women had greater odds for being in the highest tertile. 
CONCLUSIONS: One in three men consumed ≥22.0 tsp added sugars and one in three women consumed ≥14.6 tsp added sugars daily. Higher added sugars intake was associated with various sociodemographic and behavioral characteristics; this information can inform efforts to design programs and policies specific to high-intake populations. |
Work-related illness and injury claims among nationally certified athletic trainers reported to Washington and California from 2001 to 2011
Kucera KL , Roos KG , Hootman JM , Lipscomb HJ , Dement JM , Silverstein BA . Am J Ind Med 2016 59 (12) 1156-1168 BACKGROUND: Little is known about the work-related injury and illnesses experienced by certified athletic trainers (AT). METHODS: The incidence and characteristics of injury/illness claims filed in two workers' compensation systems were described from 2001 to 2011. Yearly populations at risk were estimated from National Athletic Trainers' Association membership statistics. Incidence rate ratios (IRR) were reported by job setting. RESULTS: Claims were predominantly for traumatic injuries and disorders (82.7%: 45.7% sprains/strains, 12.0% open wounds, 6.5% bruises) and at these body sites (back 17.2%, fingers 12.3%, and knee 9.6%) and over half were caused by body motion and overexertion (51.5%). Compared with school settings, clinic/hospital settings had modestly higher claim rates (IRR = 1.29, 95% CI: 1.06-1.52) while other settings (e.g., professional or youth sport, nursing home) had lower claim rates (IRR = 0.63, 95% CI: 0.44-0.70). CONCLUSIONS: These first known estimates of work-related injuries/illnesses among a growing healthcare profession help identify occupational tasks and settings imposing injury risk for ATs. |
Nonstandard work arrangements and worker health and safety
Howard J . Am J Ind Med 2016 60 (1) 1-10 Arrangements between those who perform work and those who provide jobs come in many different forms. Standard work arrangements now exist alongside several nonstandard arrangements: agency work, contract work, and gig work. While standard work arrangements are still the most prevalent types, the rise of nonstandard work arrangements, especially temporary agency, contract, and "gig" arrangements, and the potential effects of these new arrangements on worker health and safety have captured the attention of government, business, labor, and academia. This article describes the major work arrangements in use today, profiles the nonstandard workforce, discusses several legal questions about how established principles of labor and employment law apply to nonstandard work arrangements, summarizes findings published in the past 20 years about the health and safety risks for workers in nonstandard work arrangements, and outlines current research efforts in the area of healthy work design and worker well-being. |
Occupational exposures and chronic obstructive pulmonary disease (COPD): Comparison of a COPD-specific job exposure matrix and expert-evaluated occupational exposures
Kurth L , Doney B , Weinmann S . Occup Environ Med 2016 74 (4) 290-293 OBJECTIVES: To compare the occupational exposure levels assigned by our National Institute for Occupational Safety and Health chronic obstructive pulmonary disease-specific job exposure matrix (NIOSH COPD JEM) and by expert evaluation of detailed occupational information for various jobs held by members of an integrated health plan in the Northwest USA. METHODS: We analysed data from a prior study examining COPD and occupational exposures. Jobs were assigned exposure levels using two methods: (1) the COPD JEM and (2) expert evaluation. Agreement (Cohen's kappa coefficients), sensitivity, and specificity were calculated to compare exposure levels assigned by the two methods for eight exposure categories. RESULTS: Kappa coefficients indicated slight to moderate agreement (0.19-0.51) between the two methods and were highest for organic dust and overall exposure. Sensitivity of the matrix ranged from 33.9% to 68.5% and was highest for sensitisers, diesel exhaust, and overall exposure. Specificity ranged from 74.7% to 97.1% and was highest for fumes, organic dust, and mineral dust. CONCLUSIONS: This COPD JEM was compared with exposures assigned by experts and offers a generalisable approach to assigning occupational exposure. |
Inter-laboratory comparison of three earplug fit-test systems
Byrne DC , Murphy WJ , Krieg EF , Ghent RM , Michael KL , Stefanson EW , Ahroon WA . J Occup Environ Hyg 2016 14 (4) 294-305 The National Institute for Occupational Safety and Health (NIOSH) sponsored tests of three earplug fit-test systems (NIOSH HPD Well-Fit, Michael & Associates FitCheck, and Honeywell Safety Products VeriPRO(R)). Each system was compared to laboratory-based real-ear attenuation at threshold (REAT) measurements in a sound field according to ANSI/ASA S12.6-2008 at the NIOSH, Honeywell Safety Products, and Michael & Associates testing laboratories. An identical study was conducted independently at the U.S. Army Aeromedical Research Laboratory (USAARL), which provided its data for inclusion in this report. The Howard Leight Airsoft premolded earplug was tested with twenty subjects at each of the four participating laboratories. The occluded fit of the earplug was maintained during testing with a sound-field-based laboratory REAT system as well as all three headphone-based fit-test systems. The Michael & Associates lab had the highest average A-weighted attenuations and the smallest standard deviations; the NIOSH lab had the lowest average attenuations and the largest standard deviations. Differences in octave-band attenuations between each fit-test system and the American National Standards Institute (ANSI) sound field method were calculated (fit-test attenuation minus ANSI attenuation). A-weighted attenuations measured with the FitCheck and HPD Well-Fit systems demonstrated approximately +/-2 dB agreement with the ANSI sound field method, but A-weighted attenuations measured with the VeriPRO system underestimated the ANSI laboratory attenuations. For each of the fit-test systems, the average A-weighted attenuation across the four laboratories was not significantly greater than the average of the ANSI sound field method. Standard deviations for residual attenuation differences were about +/-2 dB for FitCheck and HPD Well-Fit, compared to +/-4 dB for VeriPRO. Individual labs exhibited agreement ranging from less than 1 dB to as much as 9.4 dB difference from the ANSI REAT estimates. Factors such as the experience of study participants and test administrators, and the fit-test psychometric tasks, are suggested as possible contributors to the observed results. |
Walking and the perception of neighborhood attributes among U.S. adults, 2012
Paul P , Carlson SA , Fulton JE . J Phys Act Health 2016 14 (1) 1-26 BACKGROUND: The association between walking and environmental attributes depends on walking purpose. This study, based on a large survey of U.S. adults, examined the association between perceived neighborhood safety and built environment attributes and walking for transportation and leisure. METHODS: Data on transportation and leisure-time walking, perceived neighborhood safety and built environment attributes, and demographic characteristics were obtained from the summer wave of the 2012 ConsumerStyles survey of 3,951 U.S. adults. Associations were examined by demographic characteristics. RESULTS: Seventy-five percent of respondents reported walking for either transportation (54%) or leisure (56%) in the past week; 59% reported no safety concern, and 36% reported the absence of any nearby built environment attribute supporting walkability. Respondents with more education and those who lived in metropolitan areas were more likely to report built environment attributes supportive of walking. All built environment attributes examined, as well as safety concern due to speeding vehicles, were associated with walking after adjustment for demographic characteristics. CONCLUSIONS: Walking, particularly for transportation, is associated with many built environment attributes among U.S. adults. These attributes may be important to consider when designing and modifying the built environment of communities, especially those that are less walkable. |
Towards systemic evaluation
Reynolds M , Gates E , Hummelbrunner R , Marra M , Williams B . Syst Res Behav Sci 2016 33 (5) 662-673 Problems of conventional evaluation models can be understood as an impoverished conversation between realities (of non-linearity, indeterminate attributes, and ever-changing context) and models of evaluating such realities. Meanwhile, ideas of systems thinking and complexity science (grouped here under the acronym STCS) struggle to gain currency in the "big E" world of institutionalized evaluation. Four evaluation practitioners familiar with evaluation tools associated with STCS offer perspectives on issues regarding mainstream uptake of STCS in the big E world. The perspectives collectively suggest three features of practicing systemic evaluation: (i) developing value in conversing between bounded values (evaluations) and unbounded reality (evaluand), with humility; (ii) developing response-ability with evaluand stakeholders based on reflexivity, with empathy; and (iii) developing adaptive rather than merely contingent use(fulness) of STCS tools as part of evaluation praxis, with inevitable fallibility and an orientation towards bricolage (adaptive use). The features hint towards systemic evaluation as core to a reconfigured notion of developmental evaluation. |
The safety of intrauterine devices among young women: A systematic review
Jatlaoui TC , Riley HE , Curtis KM . Contraception 2016 95 (1) 17-39 OBJECTIVE: To determine the association between use of intrauterine devices (IUDs) by young women and risk of adverse outcomes. METHODS: We searched Pubmed, CINAHL, Embase, Popline and the Cochrane Library for articles from inception of database through December 2015. For outcomes specific to IUD use (IUD expulsion and perforation), we examined effect measures for IUD users generally aged 25 years or younger compared with older IUD users. For outcomes of pregnancy, infection, pelvic inflammatory disease (PID), and heavy bleeding or anemia, we examined young IUD users compared with young users of other contraceptive methods or no method. RESULTS: We identified 3169 articles, of which 16 articles from 14 studies met our inclusion criteria. Six studies (Level II-2, good to poor) reported increased risk of expulsion among younger age groups compared with older age groups using copper IUDs (Cu-IUDs). Two studies (Level II-2, fair) examined risks of expulsion among younger compared with older women using levonorgestrel IUDs (LNG-IUDs); one reported no difference in expulsion, while the other reported increased odds for younger women. Four studies (Level II-2, good to poor) examined risk of expulsion among Cu- and LNG-IUD users combined and reported no significant differences between younger and older women. For perforation, four studies (Level II-2, fair to poor) found very low perforation rates (range 0-0.1%), with no significant differences between younger and older women. Pregnancies were generally rare among young IUD users in nine studies (Level I to II-2, fair to poor), and no differences were reported for young IUD users compared with young combined oral contraceptive (COC) or etonogestrel (ENG) implant users.
PID was rare among young IUD users; one study reported no cases among COC or IUD users, and one reported no difference in PID among LNG-IUD users compared with ENG implant users from nationwide insurance claims data (Level I to II-2, fair). One study reported decreased odds of bleeding with LNG-IUD compared with COC use among young women, while one study of young women reported decreased odds of removal for bleeding with LNG-IUD compared with ENG implant (Level I to II-2, fair). CONCLUSION: Overall evidence suggests that the risk of adverse outcomes related to pregnancy, perforation, infection, heavy bleeding, or removals for bleeding among young IUD users is low and may not be clinically meaningful. However, the risk of expulsion, especially for Cu-IUDs, is higher for younger women compared with older women. If IUD expulsion occurs, a young woman is exposed to an increased risk of unintended pregnancy if replacement contraception is not initiated. IUDs are safe for young women and provide highly effective reversible contraception. |
Nonoral combined hormonal contraceptives and thromboembolism: A systematic review
Tepper NK , Dragoman MV , Gaffield ME , Curtis KM . Contraception 2016 95 (2) 130-139 BACKGROUND: Combined hormonal contraceptives (CHCs), containing estrogen and progestin, are associated with an increased risk of venous thromboembolism (VTE) and arterial thromboembolism (ATE) compared with non-use. Few studies have examined whether non-oral formulations (including the combined hormonal patch, combined vaginal ring, and combined injectable contraceptives) increase the risk of thrombosis compared with combined oral contraceptives (COCs). OBJECTIVES: To examine the risk of VTE and ATE among women using non-oral CHCs compared to women using COCs. METHODS: We searched the PubMed database for all English language articles published from database inception through May 2016. We included primary research studies that examined women using the patch, ring, or combined injectables compared with women using levonorgestrel-containing or norgestimate-containing COCs. Outcomes of interest included VTE (deep venous thrombosis or pulmonary embolism) or ATE (acute myocardial infarction or ischemic stroke). We assessed the quality of each individual piece of evidence using the system developed by the United States Preventive Services Task Force. RESULTS: Eight studies were identified that met inclusion criteria. Of seven analyses from six studies examining VTE among patch users compared with levonorgestrel- or norgestimate-containing COC users, two found a statistically significantly elevated risk among patch users (risk estimates 2.2-2.3), one found an elevated risk that did not meet statistical significance (risk estimate 2.0), and four found no increased risk. Of three studies examining VTE among ring users compared with levonorgestrel COC users, one found a statistically significantly elevated risk among ring users (risk estimate 1.9) and two did not. Two studies did not find an increased risk for ATE among women using the patch compared with norgestimate COCs.
We did not identify any studies examining combined injectable contraceptives. CONCLUSION: Limited Level II-2 good to fair evidence demonstrated conflicting results on whether women using the patch or the ring have a higher risk of VTE than women using COCs. Evidence did not demonstrate an increased risk of ATE among women using the patch. Overall, any potential elevated risk likely represents a small number of events on a population level. Additional studies with standard methodology are needed to further clarify any associations and better understand mechanisms of hormone-induced thrombosis among users of non-oral combined hormonal contraception. |
Normal pubertal development in daughters of women with PCOS: A controlled study
Legro RS , Kunselman AR , Stetter CM , Gnatuk CL , Estes SJ , Brindle E , Vesper HW , Botelho JC , Lee PA , Dodson WC . J Clin Endocrinol Metab 2016 102 (1) jc20162707 CONTEXT: Daughters of women with polycystic ovary syndrome (PCOS) are thought to be at increased risk for developing stigmata of the syndrome, but the ontogeny during puberty is uncertain. OBJECTIVE: To phenotype daughters (N = 76) of mothers with PCOS and daughters (N = 80) of control mothers for reproductive and metabolic parameters characteristic of PCOS. DESIGN, SETTING, AND PARTICIPANTS: Matched case/control study of non-Hispanic, Caucasian girls aged 4-17 years at Penn State Hershey Medical Center. INTERVENTION: Birth history, biometrics, ovarian ultrasound, whole-body DXA scan for body composition, 2-hour glucose-challenged salivary insulin levels, and two timed urinary collections (12 hours overnight and 3 hours in the morning) for gonadotropins and sex steroids were obtained. MAIN OUTCOME MEASURES: Main endpoints were integrated urinary levels of adrenal (DHEAS) and ovarian (testosterone) steroids; other endpoints were integrated salivary insulin levels and urinary LH levels. RESULTS: There were no differences in detection rates or mean levels of gonadotropins and sex steroids in timed urinary collections between PCOS daughters and control daughters, nor were there differences in integrated salivary insulin levels. Sixty-nine percent of Tanner 4/5 PCOS daughters vs. 31% of control daughters had hirsutism, defined as a Ferriman-Gallwey score > 8 (P = 0.04). There were no differences between groups in body composition as determined by DXA in the three major body contents (bone, lean body mass, and fat), or in ovarian volume. CONCLUSIONS: Matched for pubertal stage, PCOS daughters have levels of urinary androgens and gonadotropins, as well as glucose-challenged salivary insulin levels, similar to those of control daughters. |
Juxtaarticular myxoma in a pigtail macaque (Macaca nemestrina)
Skinner BL , Johnson CH , Lacy SH . Comp Med 2016 66 (5) 420-423 A 10-y-old pigtail macaque presented with a subcutaneous, soft-tissue mass overlying the right stifle joint. Here we describe the clinical case and histopathologic and immunohistochemical analysis of this lesion. This case represents the first published report of juxtaarticular myxoma in a pigtail macaque. |
Outbreak of influenza A(H3N2) variant virus infections among persons attending agricultural fairs housing infected swine - Michigan and Ohio, July-August 2016
Schicker RS , Rossow J , Eckel S , Fisher N , Bidol S , Tatham L , Matthews-Greer J , Sohner K , Bowman AS , Avrill J , Forshey T , Blanton L , Davis CT , Schiltz J , Skorupski S , Berman L , Jang Y , Bresee JS , Lindstrom S , Trock SC , Wentworth D , Fry AM , de Fijter S , Signs K , DiOrio M , Olsen SJ , Biggerstaff M . MMWR Morb Mortal Wkly Rep 2016 65 (42) 1157-1160 On August 3, 2016, the Ohio Department of Health Laboratory reported to CDC that a respiratory specimen collected on July 28 from a male aged 13 years who attended an agricultural fair in Ohio during July 22-29, 2016, and subsequently developed a respiratory illness, tested positive by real-time reverse transcription-polymerase chain reaction (rRT-PCR) for influenza A(H3N2) variant* (H3N2v). The respiratory specimen was collected as part of routine influenza surveillance activities. The next day, CDC was notified of a child aged 9 years who was a swine exhibitor at an agricultural fair in Michigan who became ill on July 29, 2016, and tested positive for H3N2v virus at the Michigan Department of Health and Human Services Laboratory. Investigations by Michigan and Ohio health authorities identified 18 human infections linked to swine exhibits at agricultural fairs. To minimize transmission of influenza viruses from infected swine to visitors, agricultural fair organizers should consider prevention measures such as shortening the time swine are on the fairgrounds, isolating ill swine, maintaining a veterinarian on call, providing handwashing stations, and prohibiting food and beverages in animal barns. Persons at high risk for influenza-associated complications should be discouraged from entering swine barns. |
Dengue outbreak in Mombasa City, Kenya, 2013-2014: Entomologic investigations
Lutomiah J , Barrera R , Makio A , Mutisya J , Koka H , Owaka S , Koskei E , Nyunja A , Eyase F , Coldren R , Sang R . PLoS Negl Trop Dis 2016 10 (10) e0004981 Dengue outbreaks were first reported in East Africa in the late 1970s to early 1980s, including the 1982 outbreak on the Kenyan coast. In 2011, dengue outbreaks occurred in Mandera in northern Kenya and subsequently in Mombasa city along the Kenyan coast in 2013-2014. Following laboratory confirmation of dengue fever cases, an entomologic investigation was conducted to establish the mosquito species and densities driving the outbreak. Affected parts of the city were identified with the help of public health officials. Adult Ae. aegypti mosquitoes were collected using various tools, processed, and screened for dengue virus (DENV) by cell culture and RT-PCR. All containers in every accessible house and compound within affected suburbs were inspected for immatures. A total of 2,065 Ae. aegypti adults were collected, and 192 houses and 1,676 containers were inspected. An overall house index of 22%, container index of 31.0% (indoor = 19%; outdoor = 43%), and Breteau index of 270.1 were observed, suggesting that the risk of dengue transmission was high. Overall, jerry cans were the most productive containers (18%), followed by drums (17%), buckets (16%), tires (14%), and tanks (10%). However, each site had its own most-productive container types: tanks (17%) in Kizingo; drums in Nyali (30%) and Changamwe (33%); plastic basins (35%) in Nyali-B; and plastic buckets (81%) in Ganjoni. We recommend that, for effective control of the dengue vector in Mombasa city, all container types be targeted. Measures should include proper covering of water storage containers and elimination of discarded containers outdoors through a public participatory environmental clean-up exercise. Providing reliable piped water to all households would minimize the need for water storage and reduce aquatic habitats. Isolation of DENV from male Ae. aegypti mosquitoes is a first observation in Kenya and provides further evidence that transovarial transmission may have a role in DENV circulation and/or maintenance in the environment. |
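The entomologic indices reported in the dengue entry above follow the standard WHO definitions (house index: percentage of inspected houses with at least one positive container; container index: percentage of inspected water-holding containers that are positive; Breteau index: positive containers per 100 houses inspected). A minimal sketch of the arithmetic, using hypothetical positive counts rather than the study's raw data:

```python
def house_index(positive_houses: int, houses_inspected: int) -> float:
    """Percentage of inspected houses with >=1 larva/pupa-positive container."""
    return 100 * positive_houses / houses_inspected

def container_index(positive_containers: int, containers_inspected: int) -> float:
    """Percentage of inspected water-holding containers that are positive."""
    return 100 * positive_containers / containers_inspected

def breteau_index(positive_containers: int, houses_inspected: int) -> float:
    """Positive containers per 100 houses inspected (can exceed 100)."""
    return 100 * positive_containers / houses_inspected

# Hypothetical example with the study's denominators (192 houses,
# 1,676 containers) and assumed positive counts of 40 houses and
# 520 containers:
print(round(house_index(40, 192), 1))        # 20.8
print(round(container_index(520, 1676), 1))  # 31.0
print(round(breteau_index(520, 192), 1))     # 270.8
```

Note that the Breteau index is not a percentage: because a single house can hold several positive containers, values well above 100 (such as the 270.1 reported in the study) indicate very high vector density.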
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Disease Reservoirs and Vectors
- Drug Safety
- Environmental Health
- Epidemiology and Surveillance
- Food Safety
- Genetics and Genomics
- Health Economics
- Immunity and Immunization
- Informatics
- Injury and Violence
- Laboratory Sciences
- Maternal and Child Health
- Nutritional Sciences
- Occupational Safety and Health
- Physical Activity
- Program Evaluation
- Reproductive Health
- Veterinary Medicine
- Zoonotic and Vectorborne Diseases
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed:Feb 1, 2024
- Page last updated:Apr 29, 2024