Prevalence and Predictors of Cholesterol Screening, Awareness, and Statin Treatment Among US Adults With Familial Hypercholesterolemia or Other Forms of Severe Dyslipidemia (1999-2014).
Bucholz EM , Rodday AM , Kolor K , Khoury MJ , de Ferranti SD . Circulation 2018 137 (21) 2218-2230 Background: Familial hypercholesterolemia (FH) and other extreme elevations in low-density lipoprotein cholesterol significantly increase the risk of atherosclerotic cardiovascular disease; however, recent data suggest that prescription rates for statins remain low in these patients. National rates of screening, awareness, and treatment with statins among individuals with FH or severe dyslipidemia are unknown. Methods: Data from the 1999 to 2014 National Health and Nutrition Examination Survey were used to estimate prevalence rates of self-reported screening, awareness, and statin therapy among US adults (n=42 471, weighted to represent 212 million US adults) with FH (defined using the Dutch Lipid Clinic criteria) and with severe dyslipidemia (defined as low-density lipoprotein cholesterol levels ≥190 mg/dL). Logistic regression was used to identify sociodemographic and clinical correlates of hypercholesterolemia awareness and statin therapy. Results: The estimated US prevalence of definite/probable FH was 0.47% (standard error, 0.03%) and of severe dyslipidemia was 6.6% (standard error, 0.2%). The frequency of cholesterol screening and awareness was high (>80%) among adults with definite/probable FH or severe dyslipidemia; however, statin use was uniformly low (52.3% [standard error, 8.2%] of adults with definite/probable FH and 37.6% [standard error, 1.2%] of adults with severe dyslipidemia). Only 30.3% of patients with definite/probable FH on statins were taking a high-intensity statin. The prevalence of statin use in adults with severe dyslipidemia increased over time (from 29.4% to 47.7%) but not faster than trends in the general population (from 5.7% to 17.6%).
Older age, health insurance status, having a usual source of care, diabetes mellitus, hypertension, and a personal history of early atherosclerotic cardiovascular disease were associated with higher statin use. Conclusions: Despite the high prevalence of cholesterol screening and awareness, only approximately 50% of adults with FH are on statin therapy, and even fewer are prescribed a high-intensity statin; young and uninsured patients are at the highest risk for lack of screening and for undertreatment. This study highlights an imperative to improve the frequency of cholesterol screening and statin prescription rates to better identify and treat this high-risk population. Additional studies are needed to better understand how to close these gaps in screening and treatment. |
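NHANES-based prevalence figures like those above are computed with survey weights so that the sample represents the US adult population. A minimal illustrative sketch of a weighted prevalence estimate, using made-up respondents and weights (not data from the paper):

```python
# Weighted prevalence: each respondent carries a sampling weight so the
# sample represents the target population (as in NHANES-style analyses).
def weighted_prevalence(outcomes, weights):
    """outcomes: 0/1 indicators (e.g., statin use); weights: survey weights."""
    if len(outcomes) != len(weights):
        raise ValueError("outcomes and weights must align")
    total = sum(weights)
    return sum(o * w for o, w in zip(outcomes, weights)) / total

# Hypothetical respondents: 3 statin users and 2 non-users with unequal weights.
on_statin = [1, 1, 1, 0, 0]
weights = [4000, 3000, 3000, 5000, 5000]
print(round(weighted_prevalence(on_statin, weights), 2))  # 0.5
```

With equal weights this reduces to a simple proportion; the weights matter exactly when sampling is unequal, which is why unweighted proportions from NHANES can mislead.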
Adults with an epilepsy history, especially those aged 45 years or older, those with lower family incomes, and those with a history of hypertension, report a history of stroke five times as often as adults without such a history - 2010, 2013, and 2015 U.S. National Health Interview Survey
Zack MM , Luncheon C . Epilepsy Behav 2018 83 236-238 Stroke is the fifth leading cause of death and a leading cause of serious, long-term disability, despite being largely preventable in most people [1]. Stroke is also a common and serious cardiovascular comorbidity affecting persons with epilepsy, potentially increasing the risk of early mortality [2], [3], [4], [5], [6], [7], [8], [9]. The relationship between stroke and epilepsy may be bidirectional, resulting from different causal mechanisms or shared risk factors [3], [10], [11]. Adults with epilepsy or seizures report higher levels of cardiovascular disease risk factors, including smoking, physical inactivity, overweight or obesity, and psychological distress [6], [7]. Stroke prevention in the general population, including those with epilepsy, is critical to reduce the burden of epilepsy and overall disability. This report uses three recent population-based U.S. adult samples to estimate how often stroke occurs in adults reporting a history of epilepsy, overall and at different levels of demographic characteristics and stroke risk factors. |
Association of birthplace and coronary heart disease and stroke among US adults: National Health Interview Survey, 2006 to 2014
Fang J , Yuan K , Gindi RM , Ward BW , Ayala C , Loustalot F . J Am Heart Assoc 2018 7 (7) BACKGROUND: The proportion of foreign-born US adults has almost tripled since 1970. However, less is known about cardiovascular morbidity by birthplace among adults residing in the United States. This study's objective was to compare the prevalence of coronary heart disease (CHD) and stroke among US adults by birthplace. METHODS AND RESULTS: We used data from the 2006 to 2014 National Health Interview Survey. Birthplace was categorized as US born or foreign born. Foreign born was then grouped into 6 birthplace regions. We defined CHD and stroke as ever being told by a physician that she or he had CHD or stroke. We adjusted for select demographic and health characteristics in the analysis. Of US adults, 16% were classified as foreign born. Age-standardized prevalences of both CHD and stroke were higher among US-born than foreign-born adults (CHD: 8.2% versus 5.5% for men and 4.8% versus 4.1% for women; stroke: 2.7% versus 2.1% for men and 2.7% versus 1.9% for women; all P<0.05). Comparing individual regions with US-born adults, CHD prevalence was lower among foreign-born adults from Asia and from Mexico, Central America, or the Caribbean. For stroke, men from South America or Africa had the lowest prevalence, whereas among women, those from Europe had the lowest prevalence. Years of living in the United States was not related to risk of CHD or stroke after adjustment for demographic and health characteristics. CONCLUSIONS: Overall, foreign-born adults residing in the United States had a lower prevalence of CHD and stroke than US-born adults. However, considerable heterogeneity of CHD and stroke risk was found by region of birth. |
Capture of tobacco use among population-based registries: Findings from 10 National Program of Cancer Registries states
Siegel DA , Henley SJ , Wike JM , Ryerson AB , Johnson CJ , Rees JR , Pollack LA . Cancer 2018 124 (11) 2381-2389 BACKGROUND: Tobacco use data are important when the epidemiology and prognosis of tobacco-associated cancers are being defined. Central cancer registries in 10 National Program of Cancer Registries states pilot-tested the collection of standardized tobacco use variables. This study evaluated the capture of tobacco use data and examined smoking prevalence among cancer patients. METHODS: Participating registries collected data about the use of tobacco (cigarettes, other smoked tobacco, and smokeless tobacco) for cases diagnosed during 2011-2013. The percentage of cases with known tobacco variable values was calculated, and the prevalence of tobacco use was analyzed by primary cancer site and state. RESULTS: Among 1,646,505 incident cancer cases, 51% had known cigarette use data: 18% were current users, 31% were former users, and 51% reported never using. The percentage of cases with a known status for both other smoked tobacco and smokeless tobacco was 43%, with 97% and 98% coded as never users, respectively. The percent known for cigarette use ranged from 27% to 81% by state and improved from 47% in 2011 to 59% in 2013 for all 10 states combined. The percent known for cigarette use and the prevalence of ever smoking cigarettes were highest for laryngeal cancer and for tracheal, lung, and bronchus cancer. CONCLUSIONS: Cancer registrars ascertained cigarette use for slightly more than half of all new cancer cases, but other tobacco-related fields were less complete. Studies to evaluate the validity of specific tobacco-related variables and the ability of cancer registries to capture this information from the medical record are needed to gauge the usefulness of collecting these variables through cancer surveillance systems. |
Common terms for rare epilepsies: Synonyms, associated terms, and links to structured vocabularies
Grinspan ZM , Tian N , Yozawitz EG , McGoldrick PE , Wolf SM , McDonough TL , Nelson A , Hafeez B , Johnson SB , Hesdorffer DC . Epilepsia Open 2018 3 (1) 91-97 Identifying individuals with rare epilepsy syndromes in electronic data sources is difficult, in part because of missing codes in the International Classification of Diseases (ICD) system. Our objectives were the following: (1) to describe the representation of rare epilepsies in other medical vocabularies, to identify gaps; and (2) to compile synonyms and associated terms for rare epilepsies, to facilitate text and natural language processing tools for cohort identification and population-based surveillance. We describe the representation of 33 epilepsies in 3 vocabularies: Orphanet, SNOMED-CT, and UMLS-Metathesaurus. We compiled terms via 2 surveys, correspondence with parent advocates, and review of web resources and standard vocabularies. UMLS-Metathesaurus had entries for all 33 epilepsies, Orphanet 32, and SNOMED-CT 25. The vocabularies had redundancies and missing phenotypes. Emerging epilepsies (SCN2A-, SCN8A-, KCNQ2-, SLC13A5-, and SYNGAP-related epilepsies) were underrepresented. Survey and correspondence respondents included 160 providers, 375 caregivers, and 11 advocacy group leaders. Each epilepsy syndrome had a median of 15 (range 6-28) synonyms. Nineteen had associated terms, with a median of 4 (range 1-41). We conclude that medical vocabularies should fill gaps in representation of rare epilepsies to improve their value for epilepsy research. We encourage epilepsy researchers to use this resource to develop tools to identify individuals with rare epilepsies in electronic data sources. |
Diabetes is associated with increased prevalence of latent tuberculosis infection: Findings from the National Health and Nutrition Examination Survey, 2011-2012
Barron MM , Shaw KM , McKeever Bullard K , Ali MK , Magee MJ . Diabetes Res Clin Pract 2018 139 366-379 AIMS: We aimed to determine the association of prediabetes and diabetes with latent TB infection using National Health and Nutrition Examination Survey data. METHODS: We performed a cross-sectional analysis of 2011-2012 National Health and Nutrition Examination Survey data. Participants aged ≥20 years were eligible. Diabetes status was defined by glycated hemoglobin (HbA1c) as no diabetes (≤5.6% [38 mmol/mol]), prediabetes (5.7-6.4% [39-46 mmol/mol]), and diabetes (≥6.5% [48 mmol/mol]), combined with self-reported diabetes. Latent TB infection was defined by the QuantiFERON(R)-TB Gold In-Tube (QFT-GIT) test. Adjusted odds ratios (aOR) of latent TB infection by diabetes status were calculated using logistic regression and accounted for the stratified probability sample. RESULTS: Diabetes and QFT-GIT measurements were available for 4,958 (89.2%) included participants. Prevalence of diabetes was 11.4% (95%CI 9.8-13.0%), and 22.1% (95%CI 20.5-23.8%) had prediabetes. Prevalence of latent TB infection was 5.9% (95%CI 4.9-7.0%). After adjusting for age, sex, smoking status, history of active TB, and foreign-born status, the odds of latent TB infection were greater among adults with diabetes (aOR 1.90, 95%CI 1.15-3.14) than among those without diabetes. The odds of latent TB infection in adults with prediabetes (aOR 1.15, 95%CI 0.90-1.47) were similar to those without diabetes. CONCLUSIONS: Diabetes is associated with latent TB infection among adults in the United States, even after adjusting for confounding factors. Given that diabetes increases the risk of active TB, patients with co-prevalent diabetes and latent TB infection may be targeted for latent TB treatment. |
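The study's adjusted odds ratios come from logistic regression, but the underlying quantity is easiest to see in its crude form from a 2x2 table. An illustrative sketch with hypothetical counts (not the study's data):

```python
# Crude (unadjusted) odds ratio from a 2x2 table:
# OR = (exposed cases * unexposed non-cases) / (exposed non-cases * unexposed cases)
def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    return (exposed_cases * unexposed_noncases) / (exposed_noncases * unexposed_cases)

# Hypothetical counts: diabetes & latent TB: 30; diabetes, no latent TB: 270;
# no diabetes & latent TB: 150; no diabetes, no latent TB: 2550.
print(round(odds_ratio(30, 270, 150, 2550), 2))  # 1.89
```

The adjusted OR in the paper additionally conditions on age, sex, smoking, TB history, and birthplace via regression; the crude version above ignores those confounders.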
Establishing a vision and eye health surveillance system for the nation: A status update on the Vision and Eye Health Surveillance System
Rein DB , Wittenborn JS , Phillips EA , Saaddine JB . Ophthalmology 2018 125 (4) 471-473 Vision loss and eye disorders cost the US health care system $65.1 billion in 2013, the fifth leading cause of medical expenditures.1 Americans fear losing vision as much as or more than memory, hearing, or speech, and consider blindness among the top 4 worst things that could happen to them.2 It is estimated that as much as 98% of visual impairment and blindness, much of it consisting of uncorrected refractive error and untreated cataracts, in the United States can be prevented through timely diagnosis and early treatment.3 |
The global epidemiology of diabetes and kidney disease
Koye DN , Magliano DJ , Nelson RG , Pavkov ME . Adv Chronic Kidney Dis 2018 25 (2) 121-132 The prevalence of diabetes is increasing worldwide, with the greatest increases occurring in low- and middle-income countries. In most developed countries, type 2 diabetes is presently the leading cause of end-stage renal disease and also contributes substantially to cardiovascular disease. In countries with weaker economies type 2 diabetes is rapidly replacing communicable diseases as a leading cause of kidney disease and is increasingly competing for scarce health care resources. Here, we present a narrative review of the prevalence and incidence of diabetes-related kidney disease worldwide. Mortality among those with diabetes and kidney disease will also be explored. Given the high morbidity and mortality associated with chronic kidney disease, we will also examine the level of awareness of this disease among people who have it. |
Heart disease death rates among blacks and whites aged 35 years - United States, 1968-2015
Van Dyke M , Greer S , Odom E , Schieb L , Vaughan A , Kramer M , Casper M . MMWR Surveill Summ 2018 67 (5) 1-11 PROBLEM/CONDITION: Heart disease is the leading cause of death in the United States. In 2015, heart disease accounted for approximately 630,000 deaths, representing one in four deaths in the United States. Although heart disease death rates decreased 68% for the total population from 1968 to 2015, marked disparities in these decreases exist by race and state. PERIOD COVERED: 1968-2015. DESCRIPTION OF SYSTEM: National Vital Statistics System (NVSS) data on deaths in the United States were abstracted for heart disease using diagnosis codes from the eighth, ninth, and tenth revisions of the International Classification of Diseases (ICD-8, ICD-9, and ICD-10) for 1968-2015. Population estimates were obtained from NVSS files. National and state-specific heart disease death rates for the total population and by race for adults aged ≥35 years were calculated for 1968-2015. National and state-specific black-white heart disease mortality ratios also were calculated. Death rates were age standardized to the 2000 U.S. standard population. Joinpoint regression was used to perform time trend analyses. RESULTS: From 1968 to 2015, heart disease death rates decreased for the total U.S. population among adults aged ≥35 years, from 1,034.5 to 327.2 per 100,000 population, with variations in the magnitude of the decreases by race and state. Rates decreased for the total population an average of 2.4% per year, with greater average decreases among whites (2.4% per year) than blacks (2.2% per year). At the national level, heart disease death rates for blacks and whites were similar at the start of the study period (1968) but began to diverge in the late 1970s, when rates for blacks plateaued while rates for whites continued to decrease. Heart disease death rates among blacks remained higher than among whites for the remainder of the study period.
Nationwide, the black-white ratio of heart disease death rates increased from 1.04 in 1968 to 1.21 in 2015, with large increases occurring during the 1970s and 1980s followed by small but steady increases until approximately 2005. Since 2005, modest decreases have occurred in the black-white ratio of heart disease death rates at the national level. The majority of states had increases in black-white mortality ratios from 1968 to 2015. The number of states with black-white mortality ratios >1 increased from 16 (40%) to 27 (67.5%). INTERPRETATION: Although heart disease death rates decreased both for blacks and whites from 1968 to 2015, substantial differences in decreases were found by race and state. At the national level and in most states, blacks experienced smaller decreases in heart disease death rates than whites for the majority of the period. Overall, the black-white disparity in heart disease death rates increased from 1968 to 2005, with a modest decrease from 2005 to 2015. PUBLIC HEALTH ACTION: Since 1968, substantial increases have occurred in black-white disparities of heart disease death rates in the United States at the national level and in many states. These increases appear to be due to faster decreases in heart disease death rates for whites than blacks, particularly from the late 1970s until the mid-2000s. Despite modest decreases in black-white disparities at the national level since 2005, in 2015, heart disease death rates were 21% higher among blacks than among whites. This study demonstrates the use of NVSS data to conduct surveillance of heart disease death rates by race and of black-white disparities in heart disease death rates. Continued surveillance of temporal trends in heart disease death rates by race can provide valuable information to policy makers and public health practitioners working to reduce heart disease death rates both for blacks and whites and disparities between blacks and whites. |
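The report's rates are age standardized to the 2000 U.S. standard population, i.e., each age group's death rate is weighted by that group's share of a fixed standard population so that rates are comparable across groups with different age structures. A small illustrative sketch with made-up rates and a toy three-group standard population:

```python
# Direct age standardization: weight each age group's death rate by the
# share of that age group in a fixed standard population.
def age_standardized_rate(rates_per_100k, standard_pop):
    """rates_per_100k: age-specific rates; standard_pop: standard-population counts."""
    total = sum(standard_pop)
    return sum(r * p for r, p in zip(rates_per_100k, standard_pop)) / total

# Hypothetical age-specific heart disease death rates (per 100,000)
# for three broad age groups, and toy standard-population counts.
rates = [100.0, 400.0, 1600.0]       # e.g., ages 35-54, 55-74, 75+
std_pop = [50_000, 30_000, 20_000]   # fixed standard-population counts
print(age_standardized_rate(rates, std_pop))  # 490.0
```

Because the same standard weights are applied to every group and year, differences in the standardized rates reflect differences in age-specific mortality rather than differences in population age structure.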
Patient-reported outcomes after ischemic stroke as part of routine care
George MG , Zhao X . Neurology 2018 90 (16) 717-718 Stroke is, unfortunately, an all too frequently experienced event and one that leaves the lives of patients and their families profoundly changed in a matter of minutes. Survival post stroke has markedly improved over the last 15 years with increasing rates in the use of IV alteplase and, more recently, the ability to perform successful endovascular reperfusion for large vessel strokes; however, the outcomes that matter to clinicians are often not the outcomes that matter most to survivors of stroke.1 Previous research has shown that, among patients with a normal modified Rankin Scale (mRS) score, there is a wide distribution of outcomes for physical function, fatigue, and other domains of cognition and social function.2,3 Providing treatment plans and ongoing care for stroke survivors requires understanding the outcomes that are most meaningful to patients because of the variety of poststroke affected domains. Identifying, documenting, and addressing meaningful patient-centric outcomes for each patient is essential for her or his optimal rehabilitation and recovery.4 |
Prevalence of diagnosed diabetes in adults by diabetes type - United States, 2016
Bullard KM , Cowie CC , Lessem SE , Saydah SH , Menke A , Geiss LS , Orchard TJ , Rolka DB , Imperatore G . MMWR Morb Mortal Wkly Rep 2018 67 (12) 359-361 Currently 23 million U.S. adults have been diagnosed with diabetes (1). The two most common forms of diabetes are type 1 and type 2. Type 1 diabetes results from the autoimmune destruction of the pancreas's beta cells, which produce insulin. Persons with type 1 diabetes require insulin for survival; insulin may be given as a daily shot or continuously with an insulin pump (2). Type 2 diabetes is mainly caused by a combination of insulin resistance and relative insulin deficiency (3). A small proportion of diabetes cases might be types other than type 1 or type 2, such as maturity-onset diabetes of the young or latent autoimmune diabetes in adults (3). Although the majority of prevalent cases of type 1 and type 2 diabetes are in adults, national data on the prevalence of type 1 and type 2 in the U.S. adult population are sparse, in part because of the previous difficulty in classifying diabetes by type in surveys (2,4,5). In 2016, supplemental questions to help distinguish diabetes type were added to the National Health Interview Survey (NHIS) (6). This study used NHIS data from 2016 to estimate the prevalence of diagnosed diabetes among adults by primary type. Overall, based on self-reported type and current insulin use, 0.55% of U.S. adults had diagnosed type 1 diabetes, representing 1.3 million adults; 8.6% had diagnosed type 2 diabetes, representing 21.0 million adults. Of all diagnosed cases, 5.8% were type 1 diabetes, and 90.9% were type 2 diabetes; the remaining 3.3% of cases were other types of diabetes. Understanding the prevalence of diagnosed diabetes by type is important for monitoring trends, planning public health responses, assessing the burden of disease for education and management programs, and prioritizing national plans for future type-specific health services. |
Primary care providers' intended use of decision aids for prostate-specific antigen testing for prostate cancer screening
Rim SH , Hall IJ , Massetti GM , Thomas CC , Li J , Richardson LC . J Cancer Educ 2018 34 (4) 666-670 Decision aids are tools intended to help people weigh the benefits and harms of a health decision. We examined primary care providers' perspective on use of decision aids and explored whether providers' beliefs and interest in use of a decision aid was associated with offering the prostate-specific antigen (PSA) test for early detection of prostate cancer. Data were obtained from 2016 DocStyles, an annual, web-based survey of U.S. healthcare professionals including primary care physicians (n = 1003) and nurse practitioners (n = 253). We found that the majority of primary care providers reported not using (patient) decision aids for prostate cancer screening, but were interested in learning about and incorporating these tools in their practice. Given the potential of decision aids to guide in informed decision-making, there is an opportunity for evaluating existing decision aids for prostate cancer screening for clinical use. |
Trends in diabetic ketoacidosis hospitalizations and in-hospital mortality - United States, 2000-2014
Benoit SR , Zhang Y , Geiss LS , Gregg EW , Albright A . MMWR Morb Mortal Wkly Rep 2018 67 (12) 362-365 Diabetes is a common chronic condition; as of 2015, approximately 30 million persons in the United States had diabetes (23 million diagnosed and 7 million undiagnosed) (1). Diabetic ketoacidosis (DKA) is a life-threatening but preventable complication of diabetes characterized by uncontrolled hyperglycemia (>250 mg/dL), metabolic acidosis, and increased ketone concentration that occurs most frequently in persons with type 1 diabetes (2). CDC's United States Diabetes Surveillance System (USDSS) indicated an increase in hospitalization rates for DKA during 2009-2014, most notably in persons aged <45 years. To explore this finding, 2000-2014 data from the Agency for Healthcare Research and Quality's National Inpatient Sample (NIS) were assembled to calculate trends in DKA hospitalization rates and in-hospital case-fatality rates. Overall, age-adjusted DKA hospitalization rates decreased slightly from 2000 to 2009, then reversed direction, steadily increasing from 2009 to 2014 at an average annual rate of 6.3%. In-hospital case-fatality rates declined consistently during the study period, from 1.1% to 0.4%. A better understanding of the causes of the increasing trend in DKA hospitalizations and the decreasing trend in in-hospital case-fatality, through further exploration using multiple data sources, will facilitate the targeting of prevention efforts. |
Trends in obesity and severe obesity prevalence in US youth and adults by sex and age, 2007-2008 to 2015-2016
Hales CM , Fryar CD , Carroll MD , Freedman DS , Ogden CL . JAMA 2018 319 (16) 1723-1725 This study uses National Health and Nutrition Examination Survey data to characterize trends in obesity prevalence among US youth and adults between 2007-2008 and 2015-2016. |
Two Cases of Meningococcal Disease in One Family Separated by an Extended Period - Colorado, 2015-2016.
Spence Davizon E , Soeters HM , Miller L , Barnes M . MMWR Morb Mortal Wkly Rep 2018 67 (12) 366-368 On April 26, 2015, a case of meningococcal disease in a woman aged 75 years was reported to the Colorado Department of Public Health and Environment (CDPHE). As part of routine public health investigation and control activities, all seven family contacts of the patient were advised to receive appropriate postexposure prophylaxis (PEP) to eradicate nasopharyngeal carriage of meningococci and prevent secondary disease (1), although it is not known whether the family contacts complied with PEP recommendations. Fifteen months later, on June 6, 2016, CDPHE was notified that the grandchild of the first patient, a male infant aged 3 months who lived with the first patient, also had meningococcal disease. The infant's immediate family members (parents and one sibling) were among family contacts for whom PEP was recommended in 2015. Neisseria meningitidis isolates from both patients were found to be serogroup C at the CDPHE laboratory. Whole genome sequence (WGS) analysis at CDC found that both isolates had the same sequence type, indicating close genetic relatedness. These cases represent a possible instance of meningococcal disease transmission within a family, despite appropriate PEP recommendations and with a long interval between cases. |
The California Multidrug-Resistant Tuberculosis Consult Service: a partnership of state and local programs
Shah NS , Westenhouse J , Lowenthal P , Schecter G , True L , Mase S , Barry PM , Flood J . Public Health Action 2018 8 (1) 7-13 Background: The US Centers for Disease Control and Prevention recommend expert consultation for multidrug-resistant tuberculosis (MDR-TB) cases. In 2002, the California MDR-TB Service was created to provide expert MDR-TB consultations. We describe the characteristics, treatment outcomes, and management of patients referred to the Service. Methods: Surveillance data were used for descriptive analysis of cases with consultation during July 2002-December 2012. Clinical consultation data and modified World Health Organization indicators were used to assess the care and management of cases with consultation from January 2009 to December 2012. Results: Of 339 MDR-TB patients, 140 received a consultation. The proportion of patients receiving a consultation increased from 12% in 2002 to 63% in 2012. There were 24 patients with pre-extensively drug-resistant TB and 5 with extensively drug-resistant TB. The majority (n = 123, 88%) completed treatment, 5 (4%) died, 7 (5%) moved before treatment completion, 4 (3%) stopped treatment because of an adverse event, and 1 (1%) had an unknown outcome. Indicator data showed that 86% underwent rapid molecular drug susceptibility testing, 98% received at least four drugs to which they had known or presumed susceptibility, and 93% culture converted within 6 months. Conclusions: Consultations with the MDR-TB Service increased over time. Results highlight successful treatment and indicator outcomes. |
Clinical signs of trachoma are prevalent among Solomon Islanders who have no persistent markers of prior infection with Chlamydia trachomatis
Butcher R , Sokana O , Jack K , Sui L , Russell C , Last A , Martin DL , Burton MJ , Solomon AW , Mabey DCW , Roberts CH . Wellcome Open Res 2018 3 14 Background: The low population-prevalence of trachomatous trichiasis and high prevalence of trachomatous inflammation-follicular (TF) provide contradictory estimates of the magnitude of the public health threat from trachoma in the Solomon Islands. Improved characterisation of the biology of trachoma in the region may support policy makers as they decide what interventions are required. Here, age-specific profiles of anti-Pgp3 antibodies and conjunctival scarring were examined to determine whether there is evidence of ongoing transmission and pathology from ocular Chlamydia trachomatis (Ct) infection. Methods: A total of 1511 individuals aged ≥1 year were enrolled from randomly selected households in 13 villages in which >10% of children aged 1-9 years had TF prior to a single round of azithromycin mass drug administration undertaken six months previously. Blood was collected to be screened for antibodies to the Ct antigen Pgp3. Tarsal conjunctival photographs were collected for analysis of scarring severity. Results: Anti-Pgp3 seropositivity was 18% in 1-9 year olds, sharply increasing around the age of sexual debut to reach 69% in those over 25 years. Anti-Pgp3 seropositivity did not increase significantly between the ages of 1 and 9 years and was not associated with TF (p=0.581) or with scarring in children (p=0.472). Conjunctival scars were visible in 13.1% of photographs. Mild (p<0.0001) but not severe (p=0.149) scars increased in prevalence with age. Conclusions: Neither conjunctival scars nor lymphoid follicles were associated with antibodies to Ct, suggesting that they are unlikely to be a direct result of ocular Ct infection. Clinical signs of trachoma were prevalent in this population but were not indicative of the underlying rates of Ct infection.
The current World Health Organization guidelines for trachoma elimination indicate that this population should receive intervention with mass distribution of antibiotics, but the data presented here suggest that this may not have been appropriate. |
Comparison of different treatments for isoniazid-resistant tuberculosis: an individual patient data meta-analysis
Fregonese F , Ahuja SD , Akkerman OW , Arakaki-Sanchez D , Ayakaka I , Baghaei P , Bang D , Bastos M , Benedetti A , Bonnet M , Cattamanchi A , Cegielski P , Chien JY , Cox H , Dedicoat M , Erkens C , Escalante P , Falzon D , Garcia-Prats AJ , Gegia M , Gillespie SH , Glynn JR , Goldberg S , Griffith D , Jacobson KR , Johnston JC , Jones-Lopez EC , Khan A , Koh WJ , Kritsk A , Lan ZY , Lee JH , Li PZ , Maciel EL , Galliez RM , Merle CSC , Munang M , Narendran G , Nguyen VN , Nunn A , Ohkado A , Park JS , Phillips PPJ , Ponnuraja C , Reves R , Romanowski K , Seung K , Schaaf HS , Skrahina A , van Soolingen D , Tabarsi P , Trajman A , Trieu L , Velayutham V Banurekha VV , Viiklepp P , Wang JY , Yoshiyama T , Menzies D . Lancet Respir Med 2018 6 (4) 265-275 BACKGROUND: Isoniazid-resistant, rifampicin-susceptible (INH-R) tuberculosis is the most common form of drug resistance, and is associated with failure, relapse, and acquired rifampicin resistance if treated with first-line anti-tuberculosis drugs. The aim of the study was to compare success, mortality, and acquired rifampicin resistance in patients with INH-R pulmonary tuberculosis given different durations of rifampicin, ethambutol, and pyrazinamide (REZ); a fluoroquinolone plus 6 months or more of REZ; and streptomycin plus a core regimen of REZ. METHODS: Studies with regimens and outcomes known for individual patients with INH-R tuberculosis were eligible, irrespective of the number of patients if randomised trials, or with at least 20 participants if a cohort study. Studies were identified from two relevant systematic reviews, an updated search of one of the systematic reviews (for papers published between April 1, 2015, and Feb 10, 2016), and personal communications. Individual patient data were obtained from authors of eligible studies. 
The individual patient data meta-analysis was performed with propensity score matched logistic regression to estimate adjusted odds ratios (aOR) and risk differences for treatment success (cure or treatment completion), death during treatment, and acquired rifampicin resistance. Outcomes were measured across different treatment regimens to assess the effects of: different durations of REZ (≤6 months vs >6 months); addition of a fluoroquinolone to REZ (fluoroquinolone plus 6 months or more of REZ vs 6 months or more of REZ); and addition of streptomycin to REZ (streptomycin plus 6 months of rifampicin and ethambutol and 1-3 months of pyrazinamide vs 6 months or more of REZ). The overall quality of the evidence was assessed using GRADE methodology. FINDINGS: Individual patient data were requested for 57 cohort studies and 17 randomised trials including 8089 patients with INH-R tuberculosis. We received 33 datasets with 6424 patients, of which 3923 patients in 23 studies received regimens related to the study objectives. Compared with a daily regimen of 6 months of (H)REZ (REZ with or without isoniazid), extending the duration to 8-9 months had similar outcomes; as such, 6 months or more of (H)REZ was used for subsequent comparisons. Addition of a fluoroquinolone to 6 months or more of (H)REZ was associated with significantly greater treatment success (aOR 2.8, 95% CI 1.1-7.3), but had no significant effect on mortality (aOR 0.7, 0.4-1.1) or acquired rifampicin resistance (aOR 0.1, 0.0-1.2). Compared with 6 months or more of (H)REZ, the standardised retreatment regimen (2 months of streptomycin, 3 months of pyrazinamide, and 8 months of isoniazid, rifampicin, and ethambutol) was associated with significantly worse treatment success (aOR 0.4, 0.2-0.7). The quality of the evidence was very low for all outcomes and treatment regimens assessed, owing to the observational nature of most of the data, the diverse settings, and the imprecision of estimates.
INTERPRETATION: In patients with INH-R tuberculosis, compared with treatment with at least 6 months of daily REZ, addition of a fluoroquinolone was associated with better treatment success, whereas addition of streptomycin was associated with less treatment success; however, the quality of the evidence was very low. These results support the conduct of randomised trials to identify the optimum regimen for this important and common form of drug-resistant tuberculosis. FUNDING: World Health Organization and Canadian Institutes of Health Research. |
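The study's core analytic step, propensity-score matching followed by an odds-ratio estimate of treatment success, can be sketched as below. This is an illustrative toy example on synthetic data with hypothetical covariates (age, HIV status), not the authors' analysis code; it matches 1:1 with replacement and fits the propensity model by plain gradient ascent for self-containment.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
# Hypothetical covariates (assumptions for illustration only).
age = rng.normal(size=n)
hiv = rng.integers(0, 2, size=n).astype(float)
X = np.column_stack([np.ones(n), age, hiv])  # design matrix with intercept

# Confounded treatment assignment and a binary success outcome.
p_treat = 1 / (1 + np.exp(-(0.5 * age + 0.8 * hiv)))
treated = (rng.random(n) < p_treat).astype(int)
success = (rng.random(n) < 1 / (1 + np.exp(-(0.7 * treated - 0.4 * hiv)))).astype(int)

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Logistic regression by gradient ascent on the log-likelihood."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-X @ w))
        w += lr * X.T @ (y - p) / len(y)
    return w

# 1) Propensity score: P(treated | covariates).
ps = 1 / (1 + np.exp(-X @ fit_logistic(X, treated)))

# 2) 1:1 nearest-neighbour matching on the propensity score (with replacement).
t_idx = np.where(treated == 1)[0]
c_idx = np.where(treated == 0)[0]
matches = np.array([c_idx[np.argmin(np.abs(ps[c_idx] - ps[i]))] for i in t_idx])

# 3) Odds ratio of success, treated vs matched controls.
a = success[t_idx].sum(); b = len(t_idx) - a
c = success[matches].sum(); d = len(matches) - c
odds_ratio = (a * d) / (b * c)
print(round(float(odds_ratio), 2))
```

In the actual meta-analysis the matched pairs would then feed a logistic model (and cluster-adjusted variance); the sketch stops at the crude matched odds ratio.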
Development of a surveillance definition for United States-Mexico binational cases of tuberculosis
Woodruff RSY , Miner MC , Miramontes R . Public Health Rep 2018 133 (2) 155-162 OBJECTIVES: Consistently collected binational surveillance data are important in advocating for resources to manage and treat binational cases of tuberculosis (TB). The objective of this study was to develop a surveillance definition for binational (United States-Mexico) cases of TB to assess the burden on US TB program resources. METHODS: We collaborated with state and local TB program staff members in the United States to identify characteristics associated with binational cases of TB. We collected data on all cases of TB from 9 pilot sites in 5 states (Arizona, California, Colorado, New Mexico, and Texas) during January 1-June 30, 2014, that had at least 1 binational characteristic (eg, "crossed border while on TB treatment" and "received treatment in another country, coordinated by an established, US-funded, binational TB program"). A workgroup of US state, local, and federal partners reviewed results and used them to develop a practical surveillance definition. RESULTS: The pilot sites reported 87 cases of TB with at least 1 binational characteristic during the project period. The workgroup drafted a proposed surveillance definition to include 2 binational characteristics: "crossed border while on TB treatment" (34 of 87 cases, 39%) and "received treatment in another country, coordinated by an established, US-funded, binational TB program" (26 of 87 cases, 30%). Applying the new proposed definition, 39 of 87 pilot cases of TB (45%) met the definition of binational. CONCLUSION: Input from partners who were responsible for the care and treatment of patients who cross the United States-Mexico border was crucial in defining a binational case of TB. |
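Operationally, the proposed surveillance definition is a simple disjunction of the two retained characteristics. A minimal sketch (field names are hypothetical, not taken from the surveillance system) might look like:

```python
# Hypothetical sketch: a case meets the proposed binational definition if it
# has either of the two characteristics retained by the workgroup.
def is_binational(case: dict) -> bool:
    return bool(
        case.get("crossed_border_on_tb_treatment")
        or case.get("treatment_abroad_coordinated_by_us_funded_program")
    )

pilot_cases = [
    {"crossed_border_on_tb_treatment": True},
    {"treatment_abroad_coordinated_by_us_funded_program": True},
    {"other_binational_characteristic": True},  # does not meet the definition
]
n_meeting = sum(is_binational(c) for c in pilot_cases)
print(n_meeting)  # → 2
```

Applied to the 87 pilot cases, this two-characteristic rule is what classified 39 (45%) as binational.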
HIV rapid diagnostic testing by lay providers in a key population-led health service programme in Thailand
Wongkanya R , Pankam T , Wolf S , Pattanachaiwit S , Jantarapakde J , Pengnongyang S , Thapwong P , Udomjirasirichot A , Churattanakraisri Y , Prawepray N , Paksornsit A , Sitthipau T , Petchaithong S , Jitsakulchaidejt R , Nookhai S , Lertpiriyasuwat C , Ongwandee S , Phanuphak P , Phanuphak N . J Virus Erad 2018 4 (1) 12-15 Introduction: Rapid diagnostic testing (RDT) for HIV has a quick turn-around time, which increases the proportion of people tested who receive their results. HIV RDT in Thailand has traditionally been performed only by medical technologists (MTs), which is a barrier to scaling it up. We evaluated the performance of HIV RDT conducted by trained lay providers who were members of, or worked closely with, men who have sex with men (MSM) and transgender women (TG) communities, and compared it with testing conducted by MTs. Methods: Lay providers received a 3-day intensive training course on how to perform a finger-prick blood collection and an HIV RDT as part of the Key Population-led Health Services (KPLHS) programme among MSM and TG. All samples were tested by lay providers using Alere Determine HIV 1/2. HIV-reactive samples were confirmed by DoubleCheckGold Ultra HIV 1&2 and SD Bioline HIV 1/2. All HIV-positive and 10% of HIV-negative samples were re-tested by MTs using Serodia HIV 1/2. Results: Of 1680 finger-prick blood samples collected and tested using HIV RDT by lay providers in six drop-in centres in Bangkok, Chiang Mai, Chonburi and Songkhla, 252 (15%) were HIV-positive. MTs re-tested these HIV-positive samples and 143 randomly selected HIV-negative samples, with 100% concordant test results. Conclusion: Lay providers in Thailand can be trained and empowered to perform HIV RDT, as their test results were fully concordant with those of MTs.
Based on the task-shifting concept, this rapid HIV testing performed by lay providers as part of the KPLHS programme has great potential to enhance HIV prevention and treatment programmes among key at-risk populations. |
Impact of enhanced viral haemorrhagic fever surveillance on outbreak detection and response in Uganda
Shoemaker TR , Balinandi S , Tumusiime A , Nyakarahuka L , Lutwama J , Mbidde E , Kofman A , Klena JD , Stroher U , Rollin PE , Nichol ST . Lancet Infect Dis 2018 18 (4) 373-375 The recent outbreak of Marburg virus disease in Kween District, eastern Uganda, reported in The Lancet Infectious Diseases,1 marks the 13th independent viral haemorrhagic fever outbreak identified and laboratory-confirmed by the Uganda Virus Research Institute (UVRI)’s viral haemorrhagic fever surveillance and laboratory programme since 2010. This Marburg virus disease outbreak was followed closely by three independent confirmations of human Rift Valley fever virus infection in three districts in central Uganda, and now brings the total number of viral haemorrhagic fever outbreak detections to 16. This exceptional number of early detections and subsequent outbreak responses has led to a significant decrease in the overall intensity (p=0·001) and duration (p<0·0001) of viral haemorrhagic fever outbreaks in Uganda, and serves as a model for detecting and responding to public health threats of international concern. |
Loss to follow-up and associated factors of patients through the National AIDS Program in Thailand
Teeraananchai S , Kerr SJ , Ruxrungtham K , Avihingsanon A , Chaivooth S , Teeraratkul A , Bhakeecheep S , Ongwandee S , Thanprasertsuk S , Law MG . Antivir Ther 2018 23 (6) 529-538 BACKGROUND: Loss to follow-up (LTFU) is a crucial indicator for evaluating the effectiveness of HIV care and treatment programs. We assessed the LTFU rate and associated factors among Thai HIV-infected patients enrolled in the National AIDS Program (NAP) during two periods: before starting antiretroviral therapy (pre-ART patients) and after starting ART (ART patients). METHODS: We included Thai HIV patients aged ≥15 years enrolled in NAP from 2008 to 2014. Vital status was ascertained by linkage with the National Death Registry. Competing risk models were used to calculate adjusted sub-distribution hazard ratios (aSHR) for LTFU for pre-ART and ART patients, with death considered as a competing risk. RESULTS: A total of 157,026 patients registered in care and were included in analyses. The cumulative incidence of LTFU at 1 year was 10.2% in pre-ART patients and 12.8% in ART patients. Among pre-ART patients, those with younger age (<30 vs ≥45 years, aSHR 1.60, 95% CI 1.49-1.72), less advanced HIV stage (aSHR 1.29, 95% CI 1.21-1.37) and higher CD4 count (≥350 vs <100; aSHR 6.31, 95% CI 5.74-6.95) had a higher risk of LTFU. ART patients with a high baseline CD4 count (CD4 ≥350 vs CD4 <50; aSHR 2.06, 95% CI 1.97-2.15) and non-advanced HIV stage had an increased risk of LTFU. CONCLUSIONS: Our findings provide new evidence on the LTFU rate among Thai HIV-infected patients in NAP. Emphasis needs to be placed on improving follow-up in all patients with higher CD4 counts. LTFU will be important to monitor as programs move to commence ART regardless of CD4 count. |
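The competing-risks analysis above rests on cumulative incidence functions in which death precludes LTFU. A minimal, pure-Python Aalen-Johansen-style calculation on hypothetical follow-up data (the event codes are assumptions: 0 = censored, 1 = LTFU, 2 = death) might look like:

```python
# Sketch, not the study's code: nonparametric cumulative incidence of one
# event type in the presence of a competing event.
def cuminc(times, events, cause=1):
    """Return the cumulative incidence of `cause` at the last event time.
    events: 0 = censored, 1 = LTFU, 2 = death (competing risk)."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0   # all-cause Kaplan-Meier survival just before each time
    cif = 0.0
    i = 0
    while i < len(data):
        t = data[i][0]
        d_cause = d_all = n_leaving = 0
        while i < len(data) and data[i][0] == t:  # group ties at time t
            if data[i][1] == cause:
                d_cause += 1
            if data[i][1] != 0:
                d_all += 1
            n_leaving += 1
            i += 1
        cif += surv * d_cause / at_risk   # hazard of `cause` times overall survival
        surv *= 1 - d_all / at_risk
        at_risk -= n_leaving
    return cif

times  = [1, 2, 2, 3, 4, 5, 6, 7]       # hypothetical follow-up times
events = [1, 0, 2, 1, 0, 2, 1, 0]       # hypothetical outcomes
print(round(cuminc(times, events, cause=1), 3))  # → 0.475
```

Treating death as censoring instead would overstate LTFU; the regression analogue of this estimator (Fine-Gray sub-distribution hazards) is what produces the aSHRs reported above.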
Maternal and perinatal outcomes in pregnant women with suspected Ebola virus disease in Sierra Leone, 2014
Lyman M , Johnson Mpofu J , Soud F , Oduyebo T , Ellington S , Schlough GW , Koroma AP , McFadden J , Morof D . Int J Gynaecol Obstet 2018 142 (1) 71-77 OBJECTIVE: To describe maternal and perinatal outcomes among pregnant women with suspected Ebola virus disease (EVD) in Sierra Leone. METHODS: Observational investigation of maternal and perinatal outcomes among pregnant women with suspected EVD in five districts of Sierra Leone during June-December 2014. Suspected cases were ill pregnant women with symptoms suggestive of EVD or relevant exposures who were tested for EVD. Case frequencies and odds ratios were calculated to compare patient characteristics and outcomes by EVD status. RESULTS: There were 192 suspected cases: 67 (34.9%) EVD-positive, 118 (61.5%) EVD-negative, and 7 (3.6%) of unknown EVD status. Women with EVD had increased odds of death (OR 10.22; 95% CI, 4.87-21.46) and spontaneous abortion (OR 4.93; 95% CI, 1.79-13.55) compared with those without EVD. Women without EVD also had a high frequency of death (30.2%) and stillbirth (65.9%). One of 14 neonates born following EVD-negative pregnancies and five of six neonates born following EVD-positive pregnancies died. CONCLUSION: Both EVD-positive and EVD-negative women with suspected EVD had poor outcomes, highlighting the need for increased attention and resources focused on maternal and perinatal health during an urgent public health response. Capturing pregnancy status in nationwide surveillance of EVD can help improve understanding of disease burden and design effective interventions. |
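The odds ratios above come from standard 2x2 comparisons. A sketch of that calculation with a Woolf (log-scale) 95% CI follows; the cell counts are hypothetical, since the abstract reports only the ratios.

```python
# Sketch of an odds-ratio calculation with a Woolf (log) 95% CI.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """2x2 table: a, b = outcome yes/no among exposed; c, d among unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration: 40/67 deaths among EVD-positive
# women, 36/118 among EVD-negative women (not the study's actual cells).
or_, lo, hi = odds_ratio_ci(40, 27, 36, 82)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

With the study's true cell counts, the same formula would reproduce the reported OR of 10.22 (4.87-21.46) for death.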
Notes from the field: Fatalities associated with human adenovirus type 7 at a substance abuse rehabilitation facility - New Jersey, 2017
Rozwadowski F , Caulcrick-Grimes M , McHugh L , Haldeman A , Fulton T , Killerby M , Schneider E , Lu X , Sakthivel SK , Bhatnagar J , Rabeneck DB , Zaki S , Watson J . MMWR Morb Mortal Wkly Rep 2018 67 (12) 371-372 On February 3, 2017, a local health department notified the New Jersey Department of Health (NJDOH) of a severe respiratory illness outbreak, including two hospitalizations and one death, at a substance abuse treatment facility. During December 2016–January 2017, NJDOH surveillance for noninfluenza respiratory viruses identified multiple human adenovirus (HAdV) cases in the surrounding community. HAdVs can cause severe respiratory illness, and outbreaks of HAdV type 4 (HAdV-4) and HAdV type 7 (HAdV-7) have been associated with communal living facilities, including military barracks (1). A combined HAdV-4 and HAdV-7 live oral vaccine is available but is currently limited to military use (2). NJDOH and the local health department investigated the outbreak in consultation with CDC to describe outbreak scope and provide infection control recommendations in this communal facility. |
Pediatric hospitalizations attributable to rotavirus gastroenteritis among Cambodian children: Seven years of active surveillance, 2010-2016
Angkeabos N , Rin E , Vichit O , Chea C , Tech N , Payne DC , Fox K , Heffelfinger JD , Grabovac V , Nyambat B , Diorditsa S , Samnang C , Hossain MS . Vaccine 2018 36 (51) 7856-7861 BACKGROUND: Each year, approximately 1,066 Cambodian children under five years old die of diarrhea, and 51% of these deaths are due to rotavirus gastroenteritis. Quantifying childhood hospitalizations caused by severe rotavirus infections is also important in demonstrating disease burden caused by this virus. The objective of this study is to update and confirm the current burden of pediatric hospitalizations attributable to rotavirus gastroenteritis among Cambodian children using seven years of continuous active, prospective surveillance from 2010 to 2016. We also characterize the circulating rotavirus genotypic strains during this period. METHODS: Active surveillance for rotavirus gastroenteritis was conducted from January 2010 through December 2016 at a national hospital in Phnom Penh, Cambodia. Children <60 months of age who were hospitalized for acute gastroenteritis (AGE) were consented and enrolled. Information on gender, age, clinical characteristics, and month of onset was collected. Stool specimens were collected and tested by enzyme immunoassay for the presence of rotavirus antigen, and genotyping was performed on rotavirus test-positive specimens to characterize predominant rotavirus strains during the surveillance period. RESULTS: Of 7007 children enrolled with AGE and having specimens collected, 3473 (50%) were attributed to rotavirus gastroenteritis. The majority of rotavirus hospitalizations occurred in children younger than two years old (92%). Year-round rotavirus transmission was observed, with seasonal peaks during the cooler, dry months between November and May.
Genotypic trends in rotavirus were observed over the surveillance period; the predominant rotavirus strain changed from G1P[8] (2010-2012) to G2P[4] (2013-2014), followed by the emergence of genotype G8P[8] in 2015 and G3P[8] in 2016. CONCLUSIONS: Rotavirus is the leading cause of severe acute gastroenteritis hospitalizations in Cambodian children under five years old, with 50% of such hospitalizations attributable to rotavirus. Over 90% of rotavirus hospitalizations occurred in children under 2 years of age. Changes in the predominant rotavirus strains occurred over time among these unvaccinated children. This information is important for understanding and prioritizing the potential child health impacts that could be achieved through the introduction of rotavirus vaccines in Cambodia. |
Private sector tuberculosis prevention in the US: Characteristics associated with interferon-gamma release assay or tuberculin skin testing
Stockbridge EL , Miller TL , Carlson EK , Ho C . PLoS One 2018 13 (3) e0193432 OBJECTIVE: To determine whether latent tuberculosis infection risk factors are associated with an increased likelihood of latent tuberculosis infection testing in the US private healthcare sector. DATA SOURCE: A national sample of medical and pharmacy claims representing services rendered January 2011 through December 2013 for 3,997,986 commercially insured individuals in the US who were 0 to 64 years of age. STUDY DESIGN: We used multivariable logistic regression models to determine whether TB/LTBI risk factors were associated with an increased likelihood of interferon-gamma release assay (IGRA) or tuberculin skin test (TST) testing in the private sector. PRINCIPAL FINDINGS: Overall, 4.31% (4.27-4.34%) of individuals received at least one TST/IGRA test between 2011 and 2013, while 1.69% (1.67-1.72%) received a TST/IGRA test in 2013. Clinical risk factors associated with a significantly increased likelihood of testing included HIV, immunosuppressive therapy, exposure to tuberculosis, a history of tuberculosis, diabetes, tobacco use, end-stage renal disease, and alcohol use disorder. Other significant variables included gender, age, asthma, the state tuberculosis rate, population density, and percent of foreign-born persons in a county. CONCLUSIONS: Private sector TST/IGRA testing is not uncommon, and testing varies with clinical risk indicators. Thus, the private sector can be a powerful resource in the fight against tuberculosis. Analyses of administrative data can inform how best to leverage private sector healthcare toward tuberculosis prevention activities. |
Public health resilience checklist for high-consequence infectious diseases-informed by the domestic Ebola response in the United States
Sell TK , Shearer MP , Meyer D , Chandler H , Schoch-Spana M , Thomas E , Rose DA , Carbone EG , Toner E . J Public Health Manag Pract 2018 24 (6) 510-518 CONTEXT: The experiences of communities that responded to confirmed cases of Ebola virus disease in the United States provide a rare opportunity for collective learning to improve resilience to future high-consequence infectious disease events. DESIGN: Key informant interviews (n = 73) were conducted between February and November 2016 with individuals who participated in Ebola virus disease planning or response in Atlanta, Georgia; Dallas, Texas; New York, New York; or Omaha, Nebraska; or had direct knowledge of response activities. Participants represented health care; local, state, and federal public health; law; local and state emergency management; academia; local and national media; individuals affected by the response; and local and state governments. Two focus groups were then conducted in New York and Dallas, and study results were vetted with an expert advisory group. RESULTS: Participants focused on a number of important areas to improve public health resilience to high-consequence infectious disease events, including governance and leadership, communication and public trust, quarantine and the law, monitoring programs, environmental decontamination, and waste management. CONCLUSIONS: Findings provided the basis for an evidence-informed checklist outlining specific actions for public health authorities to take to strengthen public health resilience to future high-consequence infectious disease events. |
Rotavirus surveillance in Pakistan during 2015-2016 reveals high prevalence of G12P[6]
Umair M , Salman M , Alam MM , Rana MS , Zaidi SSZ , Bowen MD , Aamir UB , Abbasi BH . J Med Virol 2018 90 (7) 1272-1276 The G12 rotavirus genotype has emerged globally since its first detection in the Philippines in 1987; however, it remains a rare cause of gastroenteritis in Pakistan. Rotavirus surveillance conducted during 2015-2016 assessed 3446 children <5 years hospitalized for gastroenteritis and found 802 (23.2%) positive by ELISA. Genotyping of a subset of positive samples (n = 319) revealed G12P[6] (11.28%) as the third most common G/P combination following G3P[8] (28.5%) and G1P[8] (12.5%); G2P[4] (10.65%) and G3P[6] (8.15%) were other frequently detected strains. Phylogenetic analysis of G12 strains from Pakistan revealed high genetic similarity to G12 strains from Italy, Thailand, Korea and Great Britain as well as to local strains within G12 lineage III. In conclusion, G12P[6] was a major contributor to RVA gastroenteritis in Pakistani children. Robust surveillance after the introduction of rotavirus vaccines will help determine the evolution of G12 and other circulating genotypes in the country. |
Safe water and hygiene integration with human immunodeficiency virus and antenatal services: Leveraging opportunities for public health interventions and improved service uptake
Routh JA , Loharikar A , Chemey E , Msoma A , Ntambo M , Mvula R , Ayers T , Gunda A , Russo ET , Barr BT , Wood S , Quick R . Am J Trop Med Hyg 2018 98 (5) 1234-1241 Integrating public health interventions with antenatal clinic (ANC) visits may motivate women to attend ANC, thereby improving maternal and neonatal health, particularly for human immunodeficiency virus (HIV)-infected persons. In 2009, in an integrated ANC/Preventing Mother-to-Child Transmission program, we provided free hygiene kits (safe storage containers, WaterGuard water treatment solution, soap, and oral rehydration salts) to women at their first ANC visit and refills at subsequent visits. To increase fathers' participation, we required partners' presence for women to receive hygiene kits. We surveyed pregnant women at baseline and at 12-month follow-up to assess ANC service utilization and HIV counseling and testing (HCT), tested drinking water for residual chlorine, and observed handwashing. We conducted in-depth interviews with pregnant women, partners, and health workers. We enrolled 106 participants; 97 (92%) were found at follow-up. During the program, 99% of pregnant women and their partners received HCT, and 99% mutually disclosed. Fifty-six percent of respondents had ≥4 ANC visits and 90% delivered at health facilities. From baseline to follow-up, the percentage of women who knew how to use WaterGuard (23% versus 80%, P < 0.0001), had residual chlorine in stored water (0% versus 73%, P < 0.0001), had confirmed WaterGuard use (0% versus 70%, P < 0.0003), and demonstrated proper handwashing technique (21% versus 64%, P < 0.0001) increased. Program participants showed significant improvements in water treatment and hygiene, and high use of ANC services and HCT. This evaluation suggests that integration of hygiene kits, refills, and HIV testing during ANC is feasible and may help improve household hygiene and increase use of health services. |
School practices to promote social distancing in K-12 schools: review of influenza pandemic policies and practices
Uscher-Pines L , Schwartz HL , Ahmed F , Zheteyeva Y , Meza E , Baker G , Uzicanin A . BMC Public Health 2018 18 (1) 406 BACKGROUND: During an evolving influenza pandemic, community mitigation strategies, such as social distancing, can slow down virus transmission in schools and surrounding communities. To date, research on school practices to promote social distancing in primary and secondary schools has focused on prolonged school closure, with little attention paid to the identification and feasibility of other more sustainable interventions. Our objectives were to develop a list and typology of school practices that have been proposed and/or implemented in an influenza pandemic and to uncover any identified barriers, lessons learned from their use, and documented impacts. METHODS: We conducted a review of the peer-reviewed and grey literature on social distancing interventions in schools other than school closure. We also collected state government guidance documents directed to local education agencies or schools to assess state policies regarding social distancing. We collected standardized information from each document using an abstraction form and generated descriptive statistics on common plan elements. RESULTS: The document review revealed limited literature on school practices to promote social distancing, as well as limited incorporation of such practices into state government guidance documents. Among the 38 states that had guidance documents meeting the inclusion criteria, fewer than half (42%) mentioned a single school practice to promote social distancing, and none provided substantive detail about the policies or practices needed to enact them. The most frequently identified school practices were canceling or postponing after-school activities, canceling classes or activities with a high rate of mixing/contact that occur within the school day, and reducing mixing during transport.
CONCLUSION: Little information is available to schools to develop policies and procedures on social distancing. Additional research and guidance are needed to assess the feasibility and effectiveness of school practices to promote social distancing. |
Syphilis elimination: Lessons learned again
Valentine JA , Bolan GA . Sex Transm Dis 2018 45 S80-S85 It is estimated that approximately 20 million new sexually transmitted infections (STIs) occur each year in the United States. The federally funded STD prevention program implemented by CDC is primarily focused on the prevention and control of the three most common bacterial STIs: syphilis, gonorrhea, and chlamydia. A range of factors facilitate the transmission and acquisition of sexually transmitted infections, including syphilis. In 1999, CDC launched the National Campaign to Eliminate Syphilis from the United States. The strategies were familiar to public health in general and to STD control in particular: 1) enhanced surveillance, 2) expanded clinical and laboratory services, 3) enhanced health promotion, 4) strengthened community involvement and partnerships, and 5) rapid outbreak response. This national commitment to syphilis elimination was not the first such effort and, like others before it, did not succeed. However, the lessons learned from this most recent campaign can inform the way forward to a more comprehensive approach to the prevention and control of STIs and improvement in the nation's health. |
Tuberculosis: progress and advances in development of new drugs, treatment regimens, and host-directed therapies
Tiberi S , du Plessis N , Walzl G , Vjecha MJ , Rao M , Ntoumi F , Mfinanga S , Kapata N , Mwaba P , McHugh TD , Ippolito G , Migliori GB , Maeurer MJ , Zumla A . Lancet Infect Dis 2018 18 (7) e183-e198 Tuberculosis remains the world's leading cause of death from an infectious disease, responsible for an estimated 1 674 000 deaths annually. WHO estimated 600 000 cases of rifampicin-resistant tuberculosis in 2016-of which 490 000 were multidrug resistant (MDR), with less than 50% survival after receiving recommended treatment regimens. Concerted efforts of stakeholders, advocates, and researchers are advancing further development of shorter course, more effective, safer, and better tolerated treatment regimens. We review the developmental pipeline and landscape of new and repurposed tuberculosis drugs, treatment regimens, and host-directed therapies (HDTs) for drug-sensitive and drug-resistant tuberculosis. 14 candidate drugs for drug-susceptible, drug-resistant, and latent tuberculosis are in clinical stages of drug development; nine are novel in phase 1 and 2 trials, and three new drugs are in advanced stages of development for MDR tuberculosis. Specific updates are provided on clinical trials of bedaquiline, delamanid, pretomanid, and other licensed or repurposed drugs that are undergoing investigation, including trials aimed at shortening duration of tuberculosis treatment, improving treatment outcomes and patient adherence, and reducing toxic effects. Ongoing clinical trials for shortening tuberculosis treatment duration, improving treatment outcomes in MDR tuberculosis, and preventing disease in people with latent tuberculosis infection are reviewed. A range of HDTs and immune-based treatments are under investigation as adjunctive therapy for shortening duration of therapy, preventing permanent lung injury, and improving treatment outcomes of MDR tuberculosis. 
We discuss the HDT development pipeline, ongoing clinical trials, and translational research efforts for adjunct tuberculosis treatment. |
Epidemiological observations on cryptosporidiosis and molecular characterization of Cryptosporidium spp. in sheep and goats in Kuwait.
Majeed QAH , El-Azazy OME , Abdou NMI , Al-Aal ZA , El-Kabbany AI , Tahrani LMA , AlAzemi MS , Wang Y , Feng Y , Xiao L . Parasitol Res 2018 117 (5) 1631-1636 Molecular epidemiological analysis of cryptosporidiosis in Middle Eastern countries suggests that small ruminants could play a major role in the transmission of Cryptosporidium spp. to humans, with a dominance of Cryptosporidium parvum, especially its IId subtypes. However, little information is available on the epidemiology and risk factors of cryptosporidiosis, as well as the distribution of Cryptosporidium species/genotypes and subtypes, in small ruminants in this area, including Kuwait. In the present study, 47 farms from several areas in Kuwait were visited once between October 2014 and September 2015 to collect data on risk factors associated with Cryptosporidium infection. Fecal samples from 334 sheep and 222 goats were examined for Cryptosporidium oocysts by Ziehl-Neelsen staining (ZN) and for antigens by enzymatic immunoassay (EIA). The Cryptosporidium prevalence was higher when samples were examined by EIA than by ZN (11.4 and 7.2% in sheep and goats by EIA, compared with 4.2 and 3.6% by ZN, respectively). Young age (less than 3 months) and a closed housing system were risk factors for Cryptosporidium infection. A correlation between fecal consistency and the occurrence of Cryptosporidium spp. was observed; non-formed fecal samples were often found positive. Molecular characterization of 30 ovine and caprine samples using PCR-RFLP analysis of the small subunit rRNA gene revealed the presence of C. parvum in 23 samples, Cryptosporidium ubiquitum in five samples, and Cryptosporidium xiaoi in two samples. Sequence analysis of C. parvum at the 60-kDa glycoprotein gene locus identified two subtypes, IIaA15G2R1 and IIdA20G1, with the latter being more common (in 2 and 20 successfully subtyped samples, respectively). Only one subtype of C. ubiquitum (XIIa) was recorded.
Cryptosporidiosis in small ruminants thus apparently poses a public health problem in Kuwait. |
Evaluation of oral rabies vaccination: Protection against rabies in wild caught raccoons (Procyon lotor)
Blanton JD , Niezgoda M , Hanlon CA , Swope CB , Suckow J , Saidy B , Nelson K , Chipman RB , Slate D . J Wildl Dis 2018 54 (3) 520-527 Oral rabies vaccination (ORV) is an effective tactic for wildlife rabies control, particularly for containment of disease spread along epizootic fronts. As part of the continuing evaluation of the ORV program in free-ranging raccoons in the US, 37 raccoons from ORV-baited areas in Pennsylvania were live-trapped and transferred to captivity to evaluate protection against rabies in animals with varying levels of existing neutralizing antibodies, expressed in international units per milliliter (IU/mL). Among the 37 raccoons at the date of capture, 24% (9/37) were seronegative (<0.05 IU/mL), 22% (8/37) were low positive (≥0.05-0.11 IU/mL), 27% (10/37) were medium positive (>0.11-<0.5 IU/mL), and 27% (10/37) were high positive (≥0.5 IU/mL). Raccoons were held for 86-199 d between the date of capture and rabies virus challenge. At challenge, 68% (25/37) of raccoons were seronegative. The overall survival rate among challenged animals was 46% (17/37). Based on antibody titers at the time of challenge, survivorship was 24% (6/25) among seronegative animals, 100% (4/4) among low positive animals, 83% (5/6) among medium positive animals, and 100% (2/2) among high positive animals. Evidence of high-titer seroconversion after vaccination is a good surrogate indicator of rabies survival; however, survival rates of approximately 45% (15/35) were found among raccoons with detectable titers below 0.5 IU/mL. In contrast, any detectable titer at the time of challenge (>3 mo after vaccination) appeared to be a surrogate indicator of survival. Overall, we illustrated significant differences in the value of specific titers as surrogates for survival based on the timing of measurement relative to vaccination.
However, survivorship was generally greater than 45% among animals with any detectable titer regardless of the timing of measurement. These findings suggest that lower titer cutoffs may represent a valid approach to measuring immunization coverage within ORV management zones, balancing both sensitivity and specificity for estimating herd immunity. |
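Treating "any detectable titer at challenge" as a binary predictor of survival, the counts reported in the abstract (11 of 12 titer-positive animals survived, 6 of 25 seronegative animals survived, 17 survivors overall) yield the sensitivity and specificity below. This is simply a rearrangement of the reported numbers, not an analysis from the paper.

```python
# Sensitivity/specificity of "any detectable titer at challenge" as a
# predictor of survival, from counts reported in the abstract.
tp, fn = 11, 6   # survivors with / without a detectable titer (11 + 6 = 17)
tn, fp = 19, 1   # non-survivors without / with a detectable titer
sensitivity = tp / (tp + fn)   # survivors correctly flagged by the cutoff
specificity = tn / (tn + fp)   # non-survivors correctly flagged
print(round(sensitivity, 2), round(specificity, 2))  # → 0.65 0.95
```

The high specificity and moderate sensitivity of the "any detectable titer" cutoff is what motivates the authors' suggestion that lower titer cutoffs can balance sensitivity and specificity when estimating herd immunity.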
Impact of community-delivered SMS alerts on dog-owner participation during a mass rabies vaccination campaign, Haiti 2017
Cleaton JM , Wallace RM , Crowdis K , Gibson A , Monroe B , Ludder F , Etheart MD , Natal Vigilato MA , King A . Vaccine 2018 36 (17) 2321-2325 Haiti has historically vaccinated between 100,000 and 300,000 dogs annually against rabies; however, national authorities have not been able to reach and maintain the 70% coverage required to eliminate the canine rabies virus variant. Haiti conducts mass dog vaccination campaigns on an annual basis, utilizing both central point and door-to-door methods. These methods require that dog owners are aware of the dates and locations of the campaign. To improve this awareness, 600,000 text messages were sent to phones in two Haitian communes (Gonaives and Saint-Marc) to remind dog owners to attend the campaign. Text messages were delivered on the second day and at the mid-point of the campaign. A post-campaign household survey was conducted to assess dog owners' perceptions of the text messages and their impact on participation in the vaccination campaign. Overall, 147 of 160 (91.9%) text-receiving dog owners indicated the text was helpful, and 162 of 187 (86.6%) responding dog owners said they would like to receive text reminders during future rabies vaccination campaigns. In areas hosting one-day central point campaigns, dog owners who received the text were 2.0 (95% CI 1.1, 3.6) times more likely to have participated in the campaign (73.1% attendance among those who received the text vs 36.4% among those who did not). In areas incorporating door-to-door vaccination over multiple days, there was no significant difference in participation between dog owners who did and did not receive a text. Text message reminders were well received and significantly improved campaign attendance, indicating that short message service (SMS) alerts may be a successful strategy in low-resource areas with large free-roaming dog populations. |
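The reported 2.0-fold difference for central point areas is consistent with a simple ratio of the two attendance proportions given in the abstract; the sketch below recomputes it. The confidence interval would require the underlying counts, which the abstract does not report.

```python
# Recomputing the 2.0-fold attendance difference as a risk ratio
# from the two proportions reported in the abstract.
attended_with_text = 0.731     # attendance among owners who received the text
attended_without_text = 0.364  # attendance among owners who did not
risk_ratio = attended_with_text / attended_without_text
print(round(risk_ratio, 1))  # → 2.0
```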
Nationwide insecticide resistance status and biting behaviour of malaria vector species in the Democratic Republic of Congo
Wat'senga F , Manzambi EZ , Lunkula A , Mulumbu R , Mampangulu T , Lobo N , Hendershot A , Fornadel C , Jacob D , Niang M , Ntoya F , Muyembe T , Likwela J , Irish SR , Oxborough RM . Malar J 2018 17 (1) 129 BACKGROUND: Globally, the Democratic Republic of Congo (DRC) accounted for 9% of malaria cases and 10% of malaria deaths in 2015. As part of control efforts, more than 40 million long-lasting insecticidal nets (LLINs) were distributed between 2008 and 2013, resulting in 70% of households owning one or more LLINs in 2014. To optimize vector control efforts, it is critical to monitor vector behaviour and insecticide resistance trends. Entomological data were collected from eight sentinel sites throughout DRC between 2013 and 2016 in Kingasani, Mikalayi, Lodja, Kabondo, Katana, Kapolowe, Tshikaji and Kalemie. Mosquito species present, relative densities, and biting times were monitored using human landing catches (HLC) conducted in eight houses, three times per year. HLC was conducted monthly in Lodja and Kapolowe during 2016 to assess seasonal dynamics. Laboratory data included resistance mechanism frequency and sporozoite rates. Insecticide susceptibility testing was conducted with commonly used insecticides, including deltamethrin and permethrin. Synergist bioassays were conducted with PBO to determine the role of oxidases in permethrin resistance. RESULTS: In Lodja, monthly Anopheles gambiae s.l. biting rates were consistently high at > 10 bites/person/night indoors and outdoors. In Kapolowe, An. gambiae s.l. dominated during the rainy season, and Anopheles funestus s.l. during the dry season. In all sites, An. gambiae and An. funestus biting occurred mostly late at night. In Kapolowe, significant biting of both species started around 19:00, typically before householders use nets. Sporozoite rates were high, with a mean of 4.3% (95% CI 3.4-5.2) for An. gambiae and 3.3% (95% CI 1.3-5.3) for An. funestus.
Anopheles gambiae were resistant to permethrin in six out of seven sites in 2016. In three sites, susceptibility to deltamethrin was observed despite high-frequency permethrin resistance, indicating the presence of pyrethroid-specific resistance mechanisms. Pre-exposure to PBO increased absolute permethrin-associated mortality by 24%, indicating that resistance was partly due to metabolic mechanisms. The kdr-1014F mutation in An. gambiae was present at high frequency (> 70%) in three sites (Kabondo, Kingasani and Tshikaji), and at lower frequency (< 20%) in two sites (Lodja and Kapolowe). CONCLUSION: The finding of widespread resistance to permethrin in DRC is concerning, and alternative insecticides should be evaluated. |
Antibiotic therapy duration in US adults with sinusitis
King LM , Sanchez GV , Bartoces M , Hicks LA , Fleming-Dutra KE . JAMA Intern Med 2018 178 (7) 992-994 This study evaluates the duration of antibiotic therapy prescribed for US adults with sinusitis. |
A study of telomere length, arsenic exposure, and arsenic toxicity in a Bangladeshi cohort
Zhang C , Kibriya MG , Jasmine F , Roy S , Gao J , Sabarinathan M , Shinkle J , Delgado D , Ahmed A , Islam T , Eunus M , Islam MT , Hasan R , Graziano JH , Ahsan H , Pierce BL . Environ Res 2018 164 346-355 BACKGROUND: Chronic arsenic exposure is associated with increased risk for arsenical skin lesions, cancer, and other adverse health outcomes. One potential mechanism of arsenic toxicity is telomere dysfunction. However, prior epidemiological studies of arsenic exposure, telomere length (TL), and skin lesions are small and cross-sectional. We investigated the associations between arsenic exposure and TL and between baseline TL and incident skin lesion risk among individuals participating in the Health Effects of Arsenic Longitudinal Study in Bangladesh (2000-2009). METHODS: Quantitative PCR was used to measure the average TL of peripheral blood DNA collected at baseline. The association between baseline arsenic exposure (well water and urine) and TL was estimated in a randomly-selected subcohort (n=1469). A nested case-control study (466 cases and 464 age- and sex-matched controls) was used to estimate the association between baseline TL and incident skin lesion risk (diagnosed <8 years after baseline). RESULTS: No association was observed between arsenic exposure (water or urine) and TL. Among incident skin lesion cases and matched controls, we observed higher skin lesion risk among individuals with shorter TL (Ptrend =1.5 x 10(-5)) with odds ratios of 2.60, 1.59, and 1.10 for the first (shortest), second, and third TL quartiles compared to the fourth (longest). CONCLUSIONS: Arsenic exposure was not associated with TL among Bangladeshi adults, suggesting that leukocyte TL may not reflect a primary mode of action for arsenic's toxicity. However, short TL was associated with increased skin lesion risk, and may be a biomarker of arsenic susceptibility modifying arsenic's effect on skin lesion risk. |
Complete Genome Sequences of a Hantavirus Isolate from New York.
McMullan LK , Albarino CG , Ksiazek TG , Nichol ST , Spiropoulou CF . Genome Announc 2018 6 (12) We report here the complete genome sequences for all three segments of the New York hantavirus (New York 1). This is the first reported L segment sequence for hantaviruses maintained in Peromyscus spp. endemic to the eastern United States and Canada. |
Systematic review of the effect of economic compensation and incentives on uptake of voluntary medical male circumcision among men in sub-Saharan Africa
Carrasco MA , Grund JM , Davis SM , Ridzon R , Mattingly M , Wilkinson J , Kasdan B , Kiggundu V , Njeuhmeli E . AIDS Care 2018 30 (9) 1-12 Voluntary medical male circumcision (VMMC) prevalence in priority countries in sub-Saharan Africa, particularly among men aged >/=20 years, has not yet reached the goal of 80% coverage recommended by the World Health Organization. Determining novel strategies to increase VMMC uptake among men >/=20 years is critical to reach HIV epidemic control. We conducted a systematic review to analyze the effectiveness of economic compensation and incentives to increase VMMC uptake among older men in order to inform VMMC demand creation programs. The review included five qualitative, quantitative, and mixed methods studies published in peer reviewed journals. Data were extracted into a study summary table and into tables synthesizing study characteristics and results. Results indicate that cash reimbursements for transportation and food vouchers of small nominal amounts to partially compensate for wage loss were effective, while enrollment into lotteries offering prizes was not. Economic compensation provided a final push toward VMMC uptake for men who had already been considering undergoing circumcision. This was in settings with high circumcision prevalence brought about by various VMMC demand creation strategies. Lottery prizes offered in the studies did not appear to help overcome barriers to access VMMC, and qualitative evidence suggests this may partially explain why they were not effective. Economic compensation may help to increase VMMC uptake in priority countries with high circumcision prevalence when it addresses barriers to uptake. Ethical considerations, sustainability, and possible externalities should be carefully analyzed in countries considering economic compensation as an additional strategy to increase VMMC uptake. |
Post-licensure safety surveillance of zoster vaccine live (Zostavax®) in the United States, Vaccine Adverse Event Reporting System (VAERS), 2006-2015.
Miller ER , Lewis P , Shimabukuro TT , Su J , Moro P , Woo EJ , Jankosky C , Cano M . Hum Vaccin Immunother 2018 14 (8) 1-23 BACKGROUND: Herpes zoster (HZ), or shingles, is caused by reactivation of varicella-zoster virus in latently infected individuals. Live-attenuated HZ vaccine (zoster vaccine live, ZVL) is approved in the United States for persons aged >/=50 years and recommended by the CDC for persons >/=60 years. METHODS: We analyzed U.S. reports of adverse events (AEs) following ZVL submitted to the Vaccine Adverse Event Reporting System (VAERS), a spontaneous reporting system to monitor vaccine safety, for persons vaccinated May 1, 2006, through January 31, 2015. We conducted descriptive analysis, clinical reviews of reports with selected pre-specified conditions, and empirical Bayesian data mining. RESULTS: VAERS received 23,092 reports following ZVL, of which 22,120 (96%) were classified as non-serious. Of reports where age was documented (n = 18,817), 83% were in persons aged >/=60 years. Reporting rates of AEs were 106 and 4.4 per 100,000 ZVL doses distributed for all reports and serious reports, respectively. When ZVL was administered alone among persons aged >/=50 years, injection site erythema (27%), HZ (17%), injection site swelling (17%), and rash (14%) were the most commonly reported symptoms among non-serious reports; HZ (29%), pain (18%), and rash (16%) were the most commonly reported symptoms among serious reports. Six reports included laboratory evidence of vaccine-strain varicella-zoster virus (Oka/Merck strain) infection; AEs included HZ, HZ- or varicella-like illness, and local reaction with vesicles. In our review of reports of death with sufficient information to determine cause (n = 46, median age 75 years), the most common causes were heart disease (n = 28), sepsis (n = 4), and stroke (n = 3). Empirical Bayesian data mining did not detect new or unexpected safety signals. 
CONCLUSIONS: Findings from our safety review of ZVL are consistent with those from pre-licensure clinical trials and other post-licensure assessments. Transient injection-site reactions, HZ, and rashes were most frequently reported to VAERS following ZVL. Overall, our results are reassuring regarding the safety of ZVL. |
The future control of rotavirus disease: Can live oral vaccines alone solve the rotavirus problem
Glass RI , Jiang B , Parashar U . Vaccine 2018 36 (17) 2233-2236 Live oral rotavirus (RV) vaccines used worldwide are most effective in reducing diarrheal hospitalizations from RV in high-income countries and least effective in low-income countries, where RV remains a prime cause of death in children. Research has failed to fully explain the reason for this difference in efficacy for RV vaccines, an observation also made with other live oral vaccines for polio, cholera, and typhoid fever. Use of parenteral vaccines has been successful in overcoming this problem for both polio and typhoid, and parenteral RV vaccines are now in development. This approach should be pursued for rotavirus vaccines as well because in low-income countries where oral RV vaccines have been introduced and are only partially effective, RV remains the most common cause of diarrhea in children under 5 years. The ultimate control of RV diarrheal disease will likely require both oral and parenteral vaccines. |
Immunogenicity of type 2 monovalent oral and inactivated poliovirus vaccines for type 2 poliovirus outbreak response: an open-label, randomised controlled trial
Zaman K , Estivariz CF , Morales M , Yunus M , Snider CJ , Gary HE Jr , Weldon WC , Oberste MS , Wassilak SG , Pallansch MA , Anand A . Lancet Infect Dis 2018 18 (6) 657-665 BACKGROUND: Monovalent type 2 oral poliovirus vaccine (mOPV2) and inactivated poliovirus vaccine (IPV) are used to respond to type 2 poliovirus outbreaks. We aimed to assess the effect of two mOPV2 doses on the type 2 immune response by varying the time interval between mOPV2 doses and IPV co-administration with mOPV2. METHODS: We did a randomised, controlled, parallel, open-label, non-inferiority, inequality trial at two study clinics in Dhaka, Bangladesh. Healthy infants aged 6 weeks (42-48 days) at enrolment were randomly assigned (1:1:1:1) to receive two mOPV2 doses (each dose consisting of two drops [0.1 mL in total] of about 10(5) 50% cell culture infectious dose of type 2 Sabin strain) at intervals of 1 week, 2 weeks, 4 weeks (standard or control group), or 4 weeks with IPV (0.5 mL of type 1 [Mahoney, 40 D-antigen units], type 2 [MEF-1, 8 D-antigen units], and type 3 [Saukett, 32 D-antigen units]) administered intramuscularly with the first mOPV2 dose. We used block randomisation, randomly selecting blocks of sizes four, eight, 12, or 16 stratified by study sites. We concealed randomisation assignment from staff managing participants in opaque, sequentially numbered, sealed envelopes. Parents and clinic staff were unmasked to assignment after the randomisation envelope was opened. Laboratory staff analysing sera were masked to assignment, but investigators analysing data and assessing outcomes were not. The primary outcome was type 2 immune response measured 4 weeks after mOPV2 administration. The primary modified intention-to-treat analysis included participants with testable serum samples before and after vaccination. A non-inferiority margin of 10% and a one-tailed p=0.05 were used. This trial is registered at ClinicalTrials.gov, number NCT02643368, and is closed to accrual.
FINDINGS: Between Dec 7, 2015, and Jan 5, 2016, we randomly assigned 760 infants to receive two mOPV2 doses at intervals of 1 week (n=191), 2 weeks (n=191), 4 weeks (n=188), or 4 weeks plus IPV (n=190). Immune responses after two mOPV2 doses were observed in 161 (93%) of 173 infants with testable serum samples in the 1 week group, 169 (96%) of 177 in the 2 week group, and 176 (97%) of 181 in the 4 week group. 1 week and 2 week intervals between two mOPV2 doses were non-inferior to 4 week intervals because the lower bound of the absolute differences in the percentage of immune responses was greater than -10% (-4.2% [90% CI -7.9 to -0.4] in the 1 week group and -1.8% [-5.0 to 1.5] in the 2 week group vs the 4 week group). The immune response elicited by two mOPV2 doses 4 weeks apart was not different when IPV was added to the first dose (176 [97%] of 182 infants with IPV vs 176 [97%] of 181 without IPV; p=1.0). During the trial, two serious adverse events (pneumonia; one [1%] of 186 patients in the 1 week group and one [1%] of 182 in the 4 week group) and no deaths were reported; the adverse events were not attributed to the vaccines. INTERPRETATION: Administration of mOPV2 at short intervals does not interfere with its immunogenicity. The addition of IPV to the first mOPV2 dose did not improve poliovirus type 2 immune response. FUNDING: US Centers for Disease Control and Prevention. |
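The non-inferiority comparison above can be reproduced from the response counts in the abstract. A sketch assuming a simple Wald-type two-sided 90% CI for the difference in proportions (the trial's exact CI method is not stated in the abstract, but this approximation matches the reported values):

```python
import math

def diff_with_90ci(x1: int, n1: int, x0: int, n0: int):
    """Difference in immune-response proportions (group minus reference)
    with a Wald-type two-sided 90% CI (z = 1.645)."""
    p1, p0 = x1 / n1, x0 / n0
    diff = p1 - p0
    se = math.sqrt(p1 * (1 - p1) / n1 + p0 * (1 - p0) / n0)
    return diff, diff - 1.645 * se, diff + 1.645 * se

# 1-week group (161/173 responders) vs 4-week standard group (176/181):
diff, lower, upper = diff_with_90ci(161, 173, 176, 181)
print(round(diff * 100, 1), round(lower * 100, 1), round(upper * 100, 1))
# → -4.2 -7.9 -0.4
# Non-inferior because the lower bound exceeds the -10% margin:
print(lower > -0.10)  # → True
```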
Vaccination patterns in children after autism spectrum disorder diagnosis and in their younger siblings
Zerbo O , Modaressi S , Goddard K , Lewis E , Fireman BH , Daley MF , Irving SA , Jackson LA , Donahue JG , Qian L , Getahun D , DeStefano F , McNeil MM , Klein NP . JAMA Pediatr 2018 172 (5) 469-475 Importance: In recent years, rates of vaccination have been declining. Whether this phenomenon disproportionately affects children with autism spectrum disorder (ASD) or their younger siblings is unknown. Objectives: To investigate whether children who have received an ASD diagnosis obtain their remaining scheduled vaccines according to the Advisory Committee on Immunization Practices (ACIP) recommendations and to compare the vaccination patterns of younger siblings of children with ASD with the vaccination patterns of younger siblings of children without ASD. Design, Setting, and Participants: This investigation was a retrospective matched cohort study. The setting was 6 integrated health care delivery systems across the United States within the Vaccine Safety Datalink. Participants were children born between January 1, 1995, and September 30, 2010, and their younger siblings born between January 1, 1997, and September 30, 2014. The end of follow-up was September 30, 2015. Exposures: Recommended childhood vaccines between ages 1 month and 12 years. Main Outcome and Measure: The proportion of children who received all of their vaccine doses according to ACIP recommendations. Results: The study included 3729 children with ASD (676 [18.1%] female), 592,907 children without ASD, and their respective younger siblings. Among children without ASD, 250,193 (42.2%) were female. For vaccines recommended between ages 4 and 6 years, children with ASD were significantly less likely to be fully vaccinated compared with children without ASD (adjusted rate ratio, 0.87; 95% CI, 0.85-0.88). Within each age category, vaccination rates were significantly lower among younger siblings of children with ASD compared with younger siblings of children without ASD.
The adjusted rate ratios varied from 0.86 for siblings younger than 1 year to 0.96 for those 11 to 12 years old. Parents who had a child with ASD were more likely to refuse at least 1 recommended vaccine for that child's younger sibling and to limit the number of vaccines administered during the younger sibling's first year of life. Conclusions and Relevance: Children with ASD and their younger siblings were undervaccinated compared with the general population. The results of this study suggest that children with ASD and their younger siblings are at increased risk of vaccine-preventable diseases. |
Detection of immunoglobulin G antibodies to Taenia solium cysticercosis antigen glutathione-S-transferase-rT24H in Malian children using multiplex bead assay
Moss DM , Handali S , Chard AN , Trinies V , Bullard S , Wiegand RE , Doumbia S , Freeman MC , Lammie PJ . Am J Trop Med Hyg 2018 98 (5) 1408-1412 Blood samples were collected from 805 students attending 42 elementary schools in the Mopti, Sikasso, and Koulikoro regions and the Bamako district of Mali who participated in a school water, sanitation, and hygiene intervention. Immunoglobulin (Ig) G responses to several antigens/pathogens were assessed by a multiplex bead assay (MBA), which included the recombinant Taenia solium T24H antigen. Of all students tested, 8.0% were positive for rT24H, although positivity reached 25-30% in some schools. A cluster of 12 widespread school locations showed not only a relative risk of 3.23 for T. solium exposure and significantly higher IgG responses (P < 0.001) but also significantly lower elevation in meters above sea level (P = 0.04) compared with schools outside the cluster. All schools at elevations < 425 m showed significantly higher IgG responses (P = 0.017) than schools at elevations >/= 425 m. The MBA is an excellent serological platform that provides cost-effective opportunities to expand testing in serosurveys. |
Development of a repeat-exposure penile SHIV infection model in macaques to evaluate biomedical preventions against HIV
Garber DA , Mitchell J , Adams D , Guenthner P , Deyounks F , Ellis S , Kelley K , Johnson R , Dobard C , Heneine W , McNicholl J . PLoS One 2018 13 (3) e0194837 Penile acquisition of HIV infection contributes substantially to the global epidemic. Our goal was to establish a preclinical macaque model of penile HIV infection for evaluating the efficacy of new HIV prevention modalities. Rhesus macaques were challenged once or twice weekly with consistent doses of SHIVsf162P3 (a chimeric simian-human immunodeficiency virus containing HIV env) ranging from 4-600 TCID50 (50% tissue culture infective dose), via two penile routes, until systemic SHIV infection was confirmed. One route exposed the inner foreskin, glans and urethral os to virus following deposition into the prepuce (foreskin) pouch. The second route introduced the virus non-traumatically into the distal urethra only. Single-route challenges resulted in dose-dependent rates of SHIV acquisition informing selection of optimal SHIV dosing. Concurrent SHIV challenges via the prepuce pouch (200 TCID50) and urethra (16 TCID50) resulted in infection of 100% (10/10) animals following a median of 2.5 virus exposures (range, 1-12). We describe the first rhesus macaque repeat-exposure SHIV challenge model of penile HIV acquisition. Utilization of the model should further our understanding of penile HIV infection and facilitate the development of new HIV prevention strategies for men. |
Fibrous nanocellulose, crystalline nanocellulose, carbon nanotubes, and crocidolite asbestos elicit disparate immune responses upon pharyngeal aspiration in mice
Park EJ , Khaliullin TO , Shurin MR , Kisin ER , Yanamala N , Fadeel B , Chang J , Shvedova AA . J Immunotoxicol 2018 15 (1) 12-23 With the rapid development of synthetic alternatives to mineral fibers, their possible effects on the environment and human health have become recognized as important issues worldwide. This study investigated the effects of four fibrous materials, i.e. nanofibrillar/nanocrystalline celluloses (NCF and CNC), single-walled carbon nanotubes (CNTs), and crocidolite asbestos (ASB), on pulmonary inflammation and immune responses in the lungs, as well as on spleen and peripheral blood immune cell subsets. BALB/c mice were given NCF, CNC, CNT, and ASB on Day 1 by oropharyngeal aspiration. At 14 days post-exposure, the animals were evaluated. Total cell number, mononuclear phagocytes, polymorphonuclear leukocytes, lymphocytes, and LDH levels were significantly increased in ASB- and CNT-exposed mice. Expression of cytokines and chemokines in bronchoalveolar lavage (BAL) differed markedly among mice exposed to the four particle types, as did expression of antigen presentation-related surface proteins on BAL cells. The results revealed that pulmonary exposure to fibrous materials led to discrete local immune cell polarization patterns, with a TH2-like response caused by ASB and a TH1-like immune reaction to NCF, while CNT and CNC caused non-classical or non-uniform responses. These alterations in immune response following pulmonary exposure should be taken into account when testing the applicability of new nanosized materials with fibrous morphology. |
Reagent substitutions in the Centers for Disease Control and Prevention Nijmegen-Bethesda assay for factor VIII inhibitors
Miller CH , Payne AB , Driggers J , Ellingsen D , Boylan B , Bean CJ . Haemophilia 2018 24 (3) e116-e119 The Nijmegen-Bethesda assay (NBA), considered the “gold standard” for measurement of factor VIII (FVIII) inhibitors in haemophilia A,1 introduced two modifications to the traditional Bethesda assay (BA) for stabilization during the 2-hour incubation at 37°C: (i) buffering of normal pooled plasma (NPP) in the test and control mixtures with imidazole and (ii) substitution of FVIII-deficient plasma (FVIIIDP) for imidazole buffer (IB) in the control mixture and for specimen predilution.2 The NBA has not been widely adopted in the United States, because of the increased cost incurred by use of FVIIIDP rather than buffer and the lack of FDA-approved commercial reagents.3 Surveys of North American coagulation laboratories have shown that only 20% use the NBA, 70% use buffered NPP in a “hybrid” of the NBA and BA, and one-third use diluents other than those recommended in published methods.3 This lack of methodological uniformity may partially account for poor interlaboratory reproducibility, a well-known problem with FVIII inhibitor testing.3 |
Bladder management and continence outcomes in adults with spina bifida: Results from the National Spina Bifida Patient Registry 2009-15
Wiener JS , Suson KD , Castillo J , Routh JC , Tanaka S , Liu T , Ward E , Thibadeau J , Joseph D . J Urol 2018 200 (1) 187-194 INTRODUCTION: Most children with spina bifida (SB) now survive into adulthood, but most have neuropathic bladder with potential complications of incontinence, infection, renal damage, and diminished quality of life. We sought to 1) describe contemporary bladder management and continence outcomes of adults with SB; 2) describe differences from younger patients; and 3) assess for association with socio-economic factors. METHODS: We analyzed data on bladder management and outcomes from The National Spina Bifida Patient Registry in adults with SB. A strict definition of continence was utilized. Results were compared to young children (5-11 years) and adolescents (12-19 years). Statistical analysis compared cohorts by gender, ethnicity, SB type, lesion level, insurance status, educational attainment, employment status, and continence. RESULTS: 5250 SB patients were included; 1372 (26.1%) were adults. 45.8% of adults did not take medication, but 76.8% performed clean intermittent catheterization. Continence was lower in adults with myelomeningocele (MMC) (45.8%) versus those with non-MMC SB (63.1%) (p<0.0001). Continence rates were higher in older age cohorts for MMC patients (p<0.0001) but not non-MMC patients (p=0.1192). Bladder management and histories of urologic surgery varied among age groups. On univariate analysis with SB-related or socio-economic variables, continence was significantly associated with educational level, but on multivariable logistic regression analysis, bladder continence was significantly associated with employment status only. CONCLUSIONS: Bladder management techniques differ between adults and younger patients with SB. Bladder continence outcomes were better in adults with nearly one-half reporting continence. Continence was significantly associated with employment status in adults 25 years or older. |
Neonates with congenital Cytomegalovirus and hearing loss identified via the universal newborn hearing screening program
Rawlinson WD , Palasanthiran P , Hall B , Al Yazidi L , Cannon MJ , Cottier C , van Zuylen WJ , Wilkinson M . J Clin Virol 2018 102 110-115 BACKGROUND: Congenital cytomegalovirus (CMV) is the most common non-genetic cause of sensorineural hearing loss. Currently, there are no universal CMV screening programs for newborns or routine CMV testing of neonates with hearing loss in Australia, or elsewhere. OBJECTIVES: This study was undertaken to determine the prevalence of congenital CMV infection in infants with hearing loss identified using routine resources via the Australian universal neonatal hearing screening (UNHS) program. STUDY DESIGN: Infants who failed UNHS, were referred for audiological testing, and were found to have permanent hearing loss were screened for CMV via PCR of urine and saliva. Congenital CMV was diagnosed if CMV was detected in infants </=30 days of age or, if >30 days of age, using retrospective testing of stored newborn screening cards or clinical criteria. The cohort was analyzed for timing of testing, and the prevalence of congenital CMV was determined. RESULTS: The Audiology Department reviewed 1669 infants who failed UNHS between 2009 and 2016. Thirty percent (502/1669) had permanent hearing loss confirmed, of whom 336/502 were offered CMV testing. A definite (n=11) or probable (n=8) diagnosis of congenital CMV occurred in 19/323 (5.9%): definite diagnoses were made in 4/19 on tests positive prior to 21 days of life, in 5/19 who were positive on neonatal blood screening card (NBSC) testing, and in 2/19 who were positive on placental testing. In 8/19, probable diagnoses were made based on positive testing between ages 23 and 42 days and a consistent clinical syndrome in the absence of another cause for hearing loss after genetic and other testing. CMV testing mirrored the timing of audiological testing, with approximately 40% completing audiology and CMV testing by 21 days, and 64% by 30 days.
CONCLUSION: This program, utilizing existing clinical services, identified definite or probable congenital CMV in approximately 6% of a large cohort with permanent hearing loss after failing UNHS, of whom more than half had definite diagnoses. No additional resources were required beyond those already existing in this tertiary referral pediatric centre, whilst providing useful and timely data for clinical and audiological management. |
Use of iodine-containing dietary supplements remains low among women of reproductive age in the United States: NHANES 2011-2014
Gupta PM , Gahche JJ , Herrick KA , Ershow AG , Potischman N , Perrine CG . Nutrients 2018 10 (4) In the United States, the American Thyroid Association recommends that women take a dietary supplement containing 150 µg of iodine 3 months prior to conception and while pregnant and lactating to support fetal growth and neurological development. We used data from the National Health and Nutrition Examination Survey 2011–2014 to describe the use of dietary supplements with and without iodine in the past 30 days among 2155 non-pregnant, non-lactating (NPNL) women; 122 pregnant women; and 61 lactating women. Among NPNL women, 45.3% (95% Confidence Interval [CI]: 42.0, 48.6) used any dietary supplement and 14.8% (95% CI: 12.7, 16.8) used a dietary supplement with iodine in the past 30 days. Non-Hispanic black and Hispanic women were less likely to use any dietary supplement, as well as one with iodine, than non-Hispanic white or non-Hispanic Asian women (p < 0.05). Among pregnant women, 72.2% (95% CI: 65.8, 78.6) used any dietary supplement; however, only 17.8% (95% CI: 11.4, 24.3) used a dietary supplement with iodine. Among lactating women, 75.0% (95% CI: 63.0, 87.0) used a dietary supplement; however, only 19.0% (95% CI: 8.8, 29.2) used a dietary supplement with iodine. Among NPNL women using a supplement with iodine, median daily iodine intake was 75.0 µg. Self-reported data suggest that the use of iodine-containing dietary supplements among pregnant and lactating women remains low in contrast with current recommendations. |
Influence of specific surface area on coal dust explosibility using the 20-L chamber
Zlochower IA , Sapko MJ , Perera IE , Brown CB , Harris ML , Rayyan NS . J Loss Prev Process Ind 2018 54 103-109 The relationship between the explosion inerting effectiveness of rock dusts on coal dusts, as a function of the specific surface area (cm2/g) of each component, is examined through the use of 20-L explosion chamber testing. More specifically, a linear relationship is demonstrated for the rock dust to coal dust (or incombustible to combustible) content of such inerted mixtures with the specific surface area of the coal and the inverse of that area of the rock dust. Hence, the inerting effectiveness, defined as above, is more generally linearly dependent on the ratio of the two surface areas. The focus on specific surface areas, particularly of the rock dust, provides supporting data for minimum surface area requirements in addition to the 70% less than 200 mesh requirement specified in 30 CFR 75.2. |
NIOSH remembers James Melius, M.D., Dr.P.H
Howard J . Am J Ind Med 2018 61 (5) 446 Dr. Jim Melius, accomplished occupational physician, epidemiologist, and longstanding friend of the National Institute for Occupational Safety and Health (NIOSH), will be remembered by everyone at NIOSH for his over 40 years of work protecting the health and safety of all workers. |
Assessing occupational erionite and respirable crystalline silica exposure among outdoor workers in Wyoming, South Dakota, and Montana
Beaucham C , King B , Feldmann K , Harper M , Dozier A . J Occup Environ Hyg 2018 15 (6) 1-33 Erionite is a naturally occurring fibrous mineral found in many parts of the world, including the western United States. Inhalational exposure to erionite fibers in some localities is associated with health effects similar to those caused by asbestos exposure, including malignant mesothelioma. Therefore, there is concern regarding occupational exposures in the western United States. Currently, there are no standard sampling and analytical methods for airborne erionite fibers and no established occupational exposure limits. Due to the potential adverse health effects, characterizing and minimizing exposures is prudent. Crystalline silica also occurs naturally in areas where erionite is found, principally as the mineral quartz. Work activities involving rocks containing quartz and soils derived from those rocks can lead to exposure to respirable crystalline silica (RCS). The typically dry and dusty environment of the western United States can increase the likelihood of exposures to aerosolized rocks and soils, but inhalation exposure is also possible in more humid conditions. In this case study, we describe several outdoor occupational environments with potential exposures to erionite and RCS. We describe our method for evaluating those exposures and demonstrate: (1) the occurrence of occupational exposures to airborne erionite and RCS, (2) that the chemical make-up of the erionite mineral can be determined, and (3) that effective dust control practices are needed to reduce employee exposures to these minerals. |
Characterizing risk assessments for the development of occupational exposure limits for engineered nanomaterials
Schulte PA , Kuempel ED , Drew NM . Regul Toxicol Pharmacol 2018 95 207-219 The commercialization of engineered nanomaterials (ENMs) began in the early 2000s. Since then, the number of commercial products and the number of workers potentially exposed to ENMs have grown, as has the need to evaluate and manage the potential health risks. Occupational exposure limits (OELs) have been developed for some of the first generation of ENMs. These OELs have been based on risk assessments that progressed from qualitative to quantitative as nanotoxicology data became available. In this paper, that progression is characterized. It traces OEL development through the qualitative approach of general groups of ENMs based primarily on read-across with other materials to quantitative risk assessments for nanoscale particles including titanium dioxide, carbon nanotubes and nanofibers, silver nanoparticles, and cellulose nanocrystals. These represent prototypic approaches to risk assessment and OEL development for ENMs. Such substance-by-substance efforts are not practical given the insufficient data for many ENMs that are currently being used or potentially entering commerce. Consequently, categorical approaches are emerging to group and rank ENMs by hazard and potential health risk. The strengths and limitations of these approaches are described, and future derivations and research needs are discussed. Critical needs in moving forward with understanding the health effects of the numerous ENMs include more standardized and accessible quantitative data on the toxicity and physicochemical properties of ENMs. |
Codability of industry and occupation information from cancer registry records: Differences by patient demographics, casefinding source, payor, and cancer type
Silver SR , Tsai RJ , Morris CR , Boiano JM , Ju J , Scocozza MS , Calvert GM . Am J Ind Med 2018 61 (6) 524-532 INTRODUCTION: Industry and occupation (I&O) information collected by cancer registries is useful for assessing associations among jobs and malignancies. However, systematic differences in I&O availability can bias findings. METHODS: Codability by patient demographics, payor, identifying (casefinding) source, and cancer site was assessed using I&O text from first primaries diagnosed 2011-2012 and reported to the California Cancer Registry. I&O were coded to a U.S. Census code or classified as blank/inadequate/unknown, retired, or not working for pay. RESULTS: Industry was codable for 37% of cases; 50% had "unknown" and 9% "retired" instead of usual industry. Cases initially reported by hospitals, covered by preferred providers, or with known occupational etiology had the highest rates of codable industry; cases from private pathology laboratories, with Medicaid, or diagnosed in outpatient settings had the lowest. Occupation results were similar. CONCLUSIONS: Recording usual I&O for retirees and improving linkages for reporting entities without patient access would improve I&O codability and research validity. |
Evaluation of heat stress and heat strain among employees working outdoors in an extremely hot environment
Methner M , Eisenberg J . J Occup Environ Hyg 2018 15 (6) 1-20 A heat stress evaluation was conducted among employees engaged in strenuous work in an extremely hot outdoor environment. Environmental conditions that contribute to heat stress, along with various physiological indicators of heat strain, were monitored on a task basis for nine employees daily across 4 workdays. Employees performed moderate to heavy tasks in elevated environmental conditions for longer periods of time than recommended by various heat stress exposure limits. Seven of nine employees showed evidence of excessive heat strain according to established criteria, yet all employees were able to self-regulate task duration and intensity to avoid heat-related illness. |
Launching the dialogue: Safety and innovation as partners for success in advanced manufacturing
Geraci CL , Tinkle SS , Brenner SA , Hodson LL , Pomeroy-Carter CA , Neu-Baker N . J Occup Environ Hyg 2018 15 (6) 1-14 Emerging and novel technologies, materials, and information integrated into increasingly automated and networked manufacturing processes or into traditional manufacturing settings are enhancing the efficiency and productivity of manufacturing. Globally, there is a move toward a new era in manufacturing that is characterized by: (1) the ability to create and deliver more complex designs of products; (2) the creation and use of materials with new properties that meet a design need; (3) the employment of new technologies, such as additive and digital techniques that improve on conventional manufacturing processes; and (4) a compression of the time from initial design concept to the creation of a final product. Globally, this movement has many names, but "advanced manufacturing" has become the shorthand for this complex integration of material and technology elements that enable new ways to manufacture existing products, as well as new products emerging from new technologies and new design methods. As the breadth of activities associated with advanced manufacturing suggests, there is no single advanced manufacturing industry. Instead, aspects of advanced manufacturing can be identified across a diverse set of business sectors that use manufacturing technologies, ranging from the semiconductors and electronics to the automotive and pharmaceutical industries. The breadth and diversity of advanced manufacturing may change the occupational and environmental risk profile, challenge the basic elements of comprehensive health and safety (material, process, worker, environment, product, and general public health and safety), and provide an opportunity for development and dissemination of occupational and environmental health and safety (OEHS) guidance and best practices. 
How much the risk profile of different elements of OEHS will change remains unknown, and health and safety practices will need to evolve accordingly. These changes may be accomplished most effectively through multi-disciplinary, multi-sector, public-private dialogue that identifies issues and offers solutions. |
Occupational exposure monitoring data collection, storage, and use among state-based and private workers' compensation insurers
Shockey TM , Babik KR , Wurzelbacher SJ , Moore LL , Bisesi MS . J Occup Environ Hyg 2018 15 (6) 1-19 Despite substantial financial and personnel resources being devoted to occupational exposure monitoring (OEM) by employers, workers' compensation insurers, and other organizations, the United States (US) lacks comprehensive occupational exposure databases to use for research and surveillance activities. OEM data are necessary for determining the levels of workers' exposures; compliance with regulations; developing control measures; establishing worker exposure profiles; and improving preventive and responsive exposure surveillance and policy efforts. Workers' compensation insurers as a group may have particular potential for understanding exposures in various industries, especially among small employers. This is the first study to determine how selected state-based and private workers' compensation insurers collect, store, and use OEM data related specifically to air and noise sampling. Of 50 insurers contacted to participate in this study, 28 completed an online survey. All of the responding private and the majority of state-based insurers offered industrial hygiene (IH) services to policyholders and employed one to three certified industrial hygienists on average. Many, but not all, insurers used standardized forms for data collection, but the data were not commonly stored in centralized databases. Data were most often used to provide recommendations for improvement to policyholders. Although not representative of all insurers, the survey was completed by insurers that cover a substantial number of employers and workers. The 20 participating state-based insurers on average provided 48% of the workers' compensation insurance benefits in their respective states or provinces. 
These results provide insight into potential next steps for improving the access to and usability of existing data as well as ways researchers can help organizations improve data collection strategies. This effort represents an opportunity for collaboration among insurers, researchers, and others that can help insurers and employers while advancing the exposure assessment field in the US. |
Results of a confirmatory mapping tool for Lymphatic filariasis endemicity classification in areas where transmission was uncertain in Ethiopia
Sime H , Gass KM , Mekasha S , Assefa A , Woyessa A , Shafi O , Meribo K , Kebede B , Ogoussan K , Pelletreau S , Bockarie MJ , Kebede A , Rebollo MP . PLoS Negl Trop Dis 2018 12 (3) e0006325 BACKGROUND: The goal of the global lymphatic filariasis (LF) program is to eliminate the disease as a public health problem by the year 2020. The WHO mapping protocol that is used to identify endemic areas in need of mass drug administration (MDA) uses convenience-based sampling. This rapid mapping has allowed the global program to dramatically scale up treatment, but as the program approaches its elimination goal, it is important to ensure that all endemic areas have been identified and have received MDA. In low transmission settings, the WHO mapping protocol for LF mapping has several limitations. To correctly identify the LF endemicity of woredas, a new confirmatory mapping tool was developed to test older school children for circulating filarial antigen (CFA) in settings where endemicity is uncertain. Ethiopia is the first country to implement this new tool. In this paper, we present the Ethiopian experience of implementing the new confirmatory mapping tool and discuss the implications of the results for the LF program in Ethiopia and globally. METHODS: Confirmatory LF mapping was conducted in 1,191 schools in 45 woredas, the implementation unit in Ethiopia, in the regions of Tigray, Amhara, Oromia, SNNP, Afar and Harari, where the results of previous mapping for LF using the current WHO protocol indicated that LF endemicity was uncertain. Within each woreda, schools were selected using either cluster or systematic sampling. From selected schools, a total of 18,254 children were tested for circulating filarial antigen (CFA) using the immuno-chromatographic test (ICT). RESULTS: Of the 18,254 children in 45 woredas who participated in the survey, 28 (0.16%) in 9 woredas tested CFA positive. 
According to the confirmatory mapping threshold, which is ≥2% CFA positivity in children 9-14 years of age, only 3 woredas out of the total 45 had CFA positivity at or above the threshold and thus were confirmed to be endemic; the remaining 42 woredas were declared non-endemic. These results drastically decreased the estimated total population living in LF-endemic woredas in Ethiopia and in need of MDA by 49.1%, from 11,580,010 to 5,893,309. CONCLUSION: This study demonstrated that the new confirmatory mapping tool for LF can benefit national LF programs by generating information that not only can confirm where LF is endemic, but also can save time and resources by preventing MDA where there is no evidence of ongoing LF transmission. |
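The 49.1% figure reported above follows directly from the before and after population totals quoted in the abstract; a minimal arithmetic check (figures taken from the abstract, variable names illustrative):

```python
# Verify the reported reduction in the population needing MDA,
# using the totals quoted in the abstract.
before = 11_580_010  # estimated population in woredas treated as LF-endemic before confirmatory mapping
after = 5_893_309    # population in the 3 woredas confirmed endemic afterward

reduction_pct = (before - after) / before * 100
print(f"{reduction_pct:.1f}%")  # prints "49.1%"
```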
Accelerating momentum: The impact of CDC and RWJF investments in support of public health accreditation and quality improvement
Corso LC , Russo P . J Public Health Manag Pract 2018 24 Suppl 3 S114-s116 National public health accreditation was designed for and built by public health practitioners, through considerable volunteer participation from the field and the leadership of national organizations that represent the agencies to be accredited. Another important ingredient for success was the long-term commitment from 2 cofunders, the Centers for Disease Control and Prevention (CDC) and the Robert Wood Johnson Foundation (RWJF). These organizations provided the necessary fuel and, by continually listening to the field and working collaboratively with many partners, initiated complementary strategies that are serving a vital role in the success of national accreditation and its impact on the field of public health. | Spurred by a 2003 recommendation from the Institute of Medicine,1 the growth of state-specific efforts,2 and interest from their respective organizations,3,4 CDC and RWJF joined forces in 2005 to support the Exploring Accreditation Initiative and the subsequent Public Health Accreditation Board (PHAB).5 These efforts drew information from prior and simultaneous efforts supported by CDC and RWJF, such as the Multi-State Learning Collaborative,2 the National Public Health Performance Standards,6 and the Operational Definition of a Local Health Department.7 | This commentary describes major CDC and RWJF areas of support to the field since the launch of accreditation in 2011. CDC and RWJF have provided a variety of opportunities, such as direct funding related to accreditation readiness and quality improvement (QI), funding and technical assistance through national partner organizations,* and opportunities for training and peer exchange. CDC and RWJF have worked together, sharing observations about needs and successes in the field, to refine opportunities and complement efforts. 
As a result, several themes emerge that can inform other large-scale collaborative efforts and provide guidance for continued advancement of accreditation. |
Driving change and reinforcing expectations by linking accreditation with programmatic and strategic priorities
Corso LC , Thomas CW . J Public Health Manag Pract 2018 24 Suppl 3 Supplement S109-s113 In 2011, the national accreditation program for public health departments was launched, thus establishing national standards that specify and reinforce expectations for health departments. Accreditation can play an important role in stabilizing public health practice, strengthening quality and performance, and driving change. Accreditation-oriented crosswalks are a type of tool that can help realize the benefits of accreditation by highlighting connections between the national accreditation standards and public health programs, policies, and practices. While many different types of accreditation-oriented crosswalks exist, they all provide the same opportunity: to maximize accreditation's impact on improving public health programs specifically and public health agency practices generally. This commentary discusses development and use of crosswalks as a tool that links accreditation to programmatic and strategic priorities. |
Intracytoplasmic sperm injection use in states with and without insurance coverage mandates for infertility treatment, United States, 2000-2015
Dieke AC , Mehta A , Kissin DM , Nangia AK , Warner L , Boulet SL . Fertil Steril 2018 109 (4) 691-697 OBJECTIVE: To compare indications and trends in intracytoplasmic sperm injection (ICSI) use for in vitro fertilization (IVF) cycles among residents of states with and without insurance mandates for IVF coverage. DESIGN: Cross-sectional analysis of the National Assisted Reproductive Technology Surveillance System from 2011 to 2015 for the main outcome and from 2000 to 2015 for trends. SETTING: IVF cycles performed in U.S. fertility clinics. PATIENT(S): Fresh IVF cycles. INTERVENTION(S): Residency in a state with an insurance mandate for IVF (n = 8 states) versus no mandate (n = 43 states, including DC). MAIN OUTCOME MEASURE(S): ICSI use by insurance coverage mandate status stratified by male-factor infertility diagnosis. RESULT(S): During 2000-2015, there were 1,356,377 fresh IVF cycles, of which 25.8% (n = 350,344) were performed for residents of states with an insurance coverage mandate for IVF. ICSI use increased significantly during 2000-2015 in states both with and without a mandate; however, for non-male-factor infertility cycles, the percentage increase in ICSI use was greater among nonmandate states (34.6% in 2000 to 73.9% in 2015) versus mandate states (39.5% in 2000 to 63.5% in 2015). For male-factor infertility cycles, this percentage increase was approximately 7.3% regardless of residency in a state with an insurance mandate for IVF. From 2011 to 2015, ICSI use was lower in mandate versus nonmandate states, both for cycles with (91.5% vs. 94.5%), and without (60.3% vs. 70.9%) male-factor infertility. CONCLUSION(S): Mandates for IVF coverage were associated with lower ICSI use for non-male-factor infertility cycles. |
Evaluating comprehensive state tobacco prevention and control programs using an outcome indicator framework
Fulmer E , Rogers T , Glasgow L , Brown S , Kuiper N . Health Promot Pract 2018 20 (2) 1524839918760557 The outcome indicator framework helps tobacco prevention and control programs (TCPs) plan and implement theory-driven evaluations of their efforts to reduce and prevent tobacco use. Tobacco use is the single-most preventable cause of morbidity and mortality in the United States. The implementation of public health best practices by comprehensive state TCPs has been shown to prevent the initiation of tobacco use, reduce tobacco use prevalence, and decrease tobacco-related health care expenditures. Achieving and sustaining program goals require TCPs to evaluate the effectiveness and impact of their programs. To guide evaluation efforts by TCPs, the Centers for Disease Control and Prevention's Office on Smoking and Health developed an outcome indicator framework that includes a high-level logic model and evidence-based outcome indicators for each tobacco prevention and control goal area. In this article, we describe how TCPs and other community organizations can use the outcome indicator framework in their evaluation efforts. We also discuss how the framework is used at the national level to unify tobacco prevention and control efforts across varying state contexts, identify promising practices, and expand the public health evidence base. |
Overdose deaths involving opioids, cocaine, and psychostimulants - United States, 2015-2016
Seth P , Scholl L , Rudd RA , Bacon S . MMWR Morb Mortal Wkly Rep 2018 67 (12) 349-358 During 1999-2015, 568,699 persons died from drug overdoses in the United States.* Drug overdose deaths in the United States increased 11.4% from 2014 to 2015 resulting in 52,404 deaths in 2015, including 33,091 (63.1%) that involved an opioid. The largest rate increases from 2014 to 2015 occurred among deaths involving synthetic opioids other than methadone (synthetic opioids) (72.2%) (1). Because of demographic and geographic variations in overdose deaths involving different drugs (2,3),† CDC examined age-adjusted death rates for overdoses involving all opioids, opioid subcategories (i.e., prescription opioids, heroin, and synthetic opioids),§ cocaine, and psychostimulants with abuse potential (psychostimulants) by demographics, urbanization levels, and in 31 states and the District of Columbia (DC). There were 63,632 drug overdose deaths in 2016; 42,249 (66.4%) involved an opioid.¶ From 2015 to 2016, deaths increased across all drug categories examined. The largest overall rate increases occurred among deaths involving cocaine (52.4%) and synthetic opioids (100%), likely driven by illicitly manufactured fentanyl (IMF) (2,3). Increases were observed across demographics, urbanization levels, and states and DC. The opioid overdose epidemic in the United States continues to worsen. A multifaceted approach, with faster and more comprehensive surveillance, is needed to track emerging threats to prevent and respond to the overdose epidemic through naloxone availability, safe prescribing practices, harm-reduction services, linkage into treatment, and more collaboration between public health and public safety agencies. |
Flea market finds and global exports: Four multistate outbreaks of human Salmonella infections linked to small turtles, United States, 2015
Gambino-Shirley K , Stevenson L , Concepcion-Acevedo J , Trees E , Wagner D , Whitlock L , Roberts J , Garrett N , Van Duyne S , McAllister G , Schick B , Schlater L , Peralta V , Reporter R , Li L , Waechter H , Gomez T , Fernandez Ordenes J , Ulloa S , Ragimbeau C , Mossong J , Nichols M . Zoonoses Public Health 2018 65 (5) 560-568 Zoonotic transmission of Salmonella infections causes an estimated 11% of salmonellosis annually in the United States. This report describes the epidemiologic, traceback and laboratory investigations conducted in the United States as part of four multistate outbreaks of Salmonella infections linked to small turtles. Salmonella isolates indistinguishable from the outbreak strains were isolated from a total of 143 ill people in the United States, pet turtles, and pond water samples collected from turtle farm A, as well as ill people from Chile and Luxembourg. Almost half (45%) of infections occurred in children aged <5 years, underscoring the importance of the Centers for Disease Control and Prevention recommendation to keep pet turtles and other reptiles out of homes and childcare settings with young children. Although only 43% of the ill people who reported turtle exposure provided purchase information, most small turtles were purchased from flea markets or street vendors, which made it difficult to locate the vendor, trace the turtles to a farm of origin, provide education and enforce the United States federal ban on the sale and distribution of small turtles. These outbreaks highlight the importance of improving public awareness and education about the risk of Salmonella from small turtles not only in the United States but also worldwide. |
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Disease Reservoirs and Vectors
- Drug Safety
- Environmental Health
- Genetics and Genomics
- Health Economics
- Immunity and Immunization
- Laboratory Sciences
- Maternal and Child Health
- Mining
- Obituary
- Occupational Safety and Health
- Parasitic Diseases
- Public Health Leadership and Management
- Reproductive Health
- Substance Use and Abuse
- Zoonotic and Vectorborne Diseases
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed:Feb 1, 2024
- Page last updated:Sep 03, 2024
- Powered by CDC PHGKB Infrastructure