Association between long-term adherence to class-I recommended medications and risk for potentially preventable heart failure hospitalizations among younger adults
Chang TE , Park S , Yang Q , Loustalot F , Butler J , Ritchey MD . PLoS One 2019 14 (9) e0222868 BACKGROUND: Five guideline-recommended medication categories are available to treat patients who have heart failure (HF) with reduced ejection fraction. However, adherence to these medications is often suboptimal, which places patients at increased risk for poor health outcomes, including hospitalization. We aimed to examine the association between adherence to these medications and potentially preventable HF hospitalizations among younger insured adults with newly diagnosed HF. METHODS AND RESULTS: Using the 2008-2012 IBM MarketScan Commercial database, we followed 26,439 individuals aged 18-64 years with newly diagnosed HF and calculated their adherence (using the proportion of days covered (PDC) algorithm) to the five guideline-recommended medication categories: angiotensin-converting enzyme inhibitors/angiotensin-receptor blockers; beta blockers; aldosterone receptor antagonists; hydralazine; and isosorbide dinitrate. We determined the association between PDC and long-term preventable HF hospitalizations (observation years 3-5) as defined by the United States (U.S.) Agency for Healthcare Research and Quality. Overall, 49.0% of enrollees had good adherence (PDC ≥80%), which was more common among enrollees who were older, male, residing in higher-income counties, initially diagnosed with HF in an outpatient setting, and who filled prescriptions for fewer of the medication categories assessed. Adherence differed by medication category and was lowest for isosorbide dinitrate (PDC = 60.7%). In total, 7.6% of enrollees had preventable HF hospitalizations. Good adherers, compared to poor adherers (PDC <40%), were 15% less likely to have a preventable hospitalization (hazard ratio [HR] 0.85; 95% confidence interval, 0.75-0.96). CONCLUSION: We found that approximately half of insured U.S. adults aged 18-64 years with newly diagnosed HF had good adherence to their HF medications.
Patients with good adherence, compared to those with poor adherence, were less likely to have a potentially preventable HF hospitalization 3-5 years after their initial diagnosis. Because HF is a chronic condition that requires long-term management, future studies may want to assess the effectiveness of interventions in sustaining adherence. |
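The proportion of days covered (PDC) measure used in the study above has a simple mechanical definition: the fraction of days in an observation window on which the patient had medication on hand, derived from fill dates and days supplied. A minimal sketch (the fill dates, days supplied, and window below are hypothetical, not study data):

```python
from datetime import date, timedelta

def proportion_of_days_covered(fills, start, end):
    """Fraction of days in [start, end] covered by at least one fill.

    fills: list of (fill_date, days_supply) tuples.
    """
    covered = set()
    for fill_date, days_supply in fills:
        for offset in range(days_supply):
            day = fill_date + timedelta(days=offset)
            if start <= day <= end:
                covered.add(day)  # a day counts once, even if fills overlap
    total_days = (end - start).days + 1
    return len(covered) / total_days

# Hypothetical example: three 30-day fills over a 120-day window.
fills = [(date(2012, 1, 1), 30), (date(2012, 2, 5), 30), (date(2012, 3, 15), 30)]
pdc = proportion_of_days_covered(fills, date(2012, 1, 1), date(2012, 4, 29))
```

With these hypothetical fills the PDC is 90/120 = 0.75, which would fall below the ≥80% good-adherence threshold used in the study. Production PDC implementations often also shift overlapping supply forward to the next uncovered day; the set-based approach here simply ignores overlap.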
Prevention, diagnosis, evaluation, and treatment of hepatitis C virus infection in chronic kidney disease: Synopsis of the Kidney Disease: Improving Global Outcomes 2018 Clinical Practice Guideline
Gordon CE , Berenguer MC , Doss W , Fabrizi F , Izopet J , Jha V , Kamar N , Kasiske BL , Lai CL , Morales JM , Patel PR , Pol S , Silva MO , Balk EM , Earley A , Di M , Cheung M , Jadoul M , Martin P . Ann Intern Med 2019 171 (7) 496-504 Description: The Kidney Disease: Improving Global Outcomes (KDIGO) 2018 clinical practice guideline for the prevention, diagnosis, evaluation, and treatment of hepatitis C virus (HCV) infection in chronic kidney disease (CKD) is an extensive update of KDIGO's 2008 guideline on HCV infection in CKD. This update reflects the major advances since the introduction of direct-acting antivirals (DAAs) in the management of HCV infection in the CKD population. Methods: The KDIGO work group tasked with developing the HCV and CKD guideline defined the scope of the guideline, gathered evidence, determined topics for systematic review, and graded the quality of evidence previously summarized by the evidence review team. The GRADE (Grading of Recommendations Assessment, Development and Evaluation) approach was used to appraise the quality of evidence and rate the strength of the recommendations. Searches of the English-language literature were conducted through May 2017 and were supplemented with targeted searches for studies of DAA treatment and with abstracts from nephrology, hepatology, and transplantation conferences. A review process involving many stakeholders, subject matter experts, and industry and national organizations informed the guideline's final modification. Recommendation: The updated guideline comprises 66 recommendations. This synopsis focuses on 32 key recommendations pertinent to the prevention, diagnosis, treatment, and management of HCV infection in adult CKD populations. |
Cancer collection efforts in the United States provide clinically relevant data on all primary brain and other CNS tumors
Kruchko C , Gittleman H , Ruhl J , Hofferkamp J , Ward EM , Ostrom QT , Sherman RL , Jones SF , Barnholtz-Sloan JS , Wilson RJ . Neurooncol Pract 2019 6 (5) 330-339 Cancer surveillance is critical for monitoring the burden of cancer and the progress in cancer control. The accuracy of these data is important for decision makers and others who determine resource allocation for cancer prevention and research. In the United States, cancer registration is conducted according to uniform data standards, which are updated and maintained by the North American Association of Central Cancer Registries. Underlying cancer registration efforts is a firm commitment to ensure that data are accurate, complete, and reflective of current clinical practices. Cancer registries ultimately depend on medical records that are generated for individual patients by clinicians to record newly diagnosed cases. For the cancer registration of brain and other CNS tumors, the Central Brain Tumor Registry of the United States is the self-appointed guardian of these data. In 2017, the Central Brain Tumor Registry of the United States took the initiative to promote the inclusion of molecular markers found in the 2016 WHO Classification of Tumours of the Central Nervous System into information collected by cancer registries. The complexities of executing this latest objective are presented according to the cancer registry standard-setting organizations whose collection practices for CNS tumors are directly affected. |
Diagnosis and management of Guillain-Barre syndrome in ten steps
Leonhard SE , Mandarakas MR , Gondim FAA , Bateman K , Ferreira MLB , Cornblath DR , van Doorn PA , Dourado ME , Hughes RAC , Islam B , Kusunoki S , Pardo CA , Reisin R , Sejvar JJ , Shahrizaila N , Soares C , Umapathi T , Wang Y , Yiu EM , Willison HJ , Jacobs BC . Nat Rev Neurol 2019 15 (11) 671-683 Guillain-Barre syndrome (GBS) is a rare, but potentially fatal, immune-mediated disease of the peripheral nerves and nerve roots that is usually triggered by infections. The incidence of GBS can therefore increase during outbreaks of infectious diseases, as was seen during the Zika virus epidemics in 2013 in French Polynesia and 2015 in Latin America. Diagnosis and management of GBS can be complicated as its clinical presentation and disease course are heterogeneous, and no international clinical guidelines are currently available. To support clinicians, especially in the context of an outbreak, we have developed a globally applicable guideline for the diagnosis and management of GBS. The guideline is based on current literature and expert consensus, and has a ten-step structure to facilitate its use in clinical practice. We first provide an introduction to the diagnostic criteria, clinical variants and differential diagnoses of GBS. The ten steps then cover early recognition and diagnosis of GBS, admission to the intensive care unit, treatment indication and selection, monitoring and treatment of disease progression, prediction of clinical course and outcome, and management of complications and sequelae. |
Prescription opioid use in patients with and without systemic lupus erythematosus - Michigan Lupus Epidemiology and Surveillance Program, 2014-2015
Somers EC , Lee J , Hassett AL , Zick SM , Harlow SD , Helmick CG , Barbour KE , Gordon C , Brummett CM , Minhas D , Padda A , Wang L , McCune WJ , Marder W . MMWR Morb Mortal Wkly Rep 2019 68 (38) 819-824 Rheumatic diseases are a leading cause of chronic, noncancer pain. Systemic lupus erythematosus (SLE) is a chronic autoimmune rheumatic disease characterized by periodic flares that can result in irreversible target organ damage, including end-stage renal disease. Both intermittent and chronic musculoskeletal pain, as well as fibromyalgia (considered a centralized pain disorder due to dysregulation of pain processing in the central nervous system), are common in SLE. Opioids are generally not indicated for long-term management of musculoskeletal pain or centralized pain (fibromyalgia) because of lack of efficacy, safety issues ranging from adverse medical effects to overdose, and risk for addiction (1,2). In this study of 462 patients with SLE from the population-based Michigan Lupus Epidemiology and Surveillance (MILES) Cohort and 192 frequency-matched persons without SLE, nearly one third (31%) of SLE patients were using prescription opioids during the study period (2014-2015), compared with 8% of persons without SLE (p<0.001). Among the SLE patients using opioids, 97 (68%) were using them for >1 year, and 31 (22%) were concomitantly on two or more opioid medications. Among SLE patients, those using the emergency department (ED) were approximately twice as likely to use prescription opioids (odds ratio [OR] = 2.1; 95% confidence interval [CI] = 1.3-3.6; p = 0.004). In SLE, the combined contributions of underlying disease and adverse effects of immunosuppressive and glucocorticoid therapies already put patients at higher risk for some known adverse effects attributed to long-term opioid use. 
Addressing the widespread and long-term use of opioid therapy in SLE will require strategies aimed at preventing opioid initiation, tapering and discontinuation of opioids among patients who are not achieving treatment goals of reduced pain and increased function, and consideration of nonopioid pain management strategies. |
Rapid Identification and Investigation of an HIV Risk Network Among People Who Inject Drugs - Miami, FL, 2018.
Tookes H , Bartholomew TS , Geary S , Matthias J , Poschman K , Blackmore C , Philip C , Suarez E , Forrest DW , Rodriguez AE , Kolber MA , Knaul F , Colucci L , Spencer E . AIDS Behav 2019 24 (1) 246-256 Prevention of HIV outbreaks among people who inject drugs remains a challenge to ending the HIV epidemic in the United States. The first legal syringe services program (SSP) in Florida implemented routine screening in 2018 leading to the identification of ten anonymous HIV seroconversions. The SSP collaborated with the Department of Health to conduct an epidemiologic investigation. All seven acute HIV seroconversions were linked to care (86% within 30 days) and achieved viral suppression (mean 70 days). Six of the seven individuals are epidemiologically and/or socially linked to at least two other seroconversions. Analysis of the HIV genotypes revealed that two individuals are connected molecularly at 0.5% genetic distance. We identified a risk network with complex transmission dynamics that could not be explained by epidemiological methods or molecular analyses alone. Providing wrap-around services through the SSP, including routine screening, intensive linkage and patient navigation, could be an effective model for achieving viral suppression for people who inject drugs. |
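The 0.5% genetic distance mentioned above refers to pairwise nucleotide distance between aligned viral sequences, the basis of molecular cluster analysis. A minimal sketch of the underlying calculation, using made-up sequences rather than actual HIV genotype data:

```python
def p_distance(seq_a, seq_b):
    """Proportion of positions that differ between two aligned sequences."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    diffs = sum(1 for a, b in zip(seq_a, seq_b) if a != b)
    return diffs / len(seq_a)

# Hypothetical aligned 200-nt fragments differing at exactly one position.
seq1 = "ACGT" * 50
seq2 = "ACGT" * 49 + "ACGA"
dist = p_distance(seq1, seq2)       # 1/200 = 0.005, i.e., 0.5%
linked = dist <= 0.005              # within the 0.5% linkage threshold
```

Real molecular analyses work on aligned sequences (e.g., pol region genotypes) and use model-corrected distances, but the threshold logic is as above: pairs at or below the cutoff are treated as molecularly connected.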
The Diagnosis of Fungal Neglected Tropical Diseases (Fungal NTDs) and the Role of Investigation and Laboratory Tests: An Expert Consensus Report.
Hay R , Denning DW , Bonifaz A , Queiroz-Telles F , Beer K , Bustamante B , Chakrabarti A , Chavez-Lopez MG , Chiller T , Cornet M , Estrada R , Estrada-Chavez G , Fahal A , Gomez BL , Li R , Mahabeer Y , Mosam A , Soavina Ramarozatovo L , Rakoto Andrianarivelo M , Rapelanoro Rabenja F , van de Sande W , Zijlstra EE . Trop Med Infect Dis 2019 4 (4) The diagnosis of fungal Neglected Tropical Diseases (NTD) is primarily based on initial visual recognition of a suspected case followed by confirmatory laboratory testing, which is often limited to specialized facilities. Although molecular and serodiagnostic tools have advanced, a substantial gap remains between the desirable and the practical in endemic settings. To explore this issue further, we conducted a survey of subject matter experts on the optimal diagnostic methods sufficient to initiate treatment in well-equipped versus basic healthcare settings, as well as optimal sampling methods, for three fungal NTDs: mycetoma, chromoblastomycosis, and sporotrichosis. A survey of 23 centres found consensus on the key role of semi-invasive sampling methods such as biopsy diagnosis as compared with swabs or impression smears, and on the importance of histopathology, direct microscopy, and culture for mycetoma and chromoblastomycosis confirmation in well-equipped laboratories. In basic healthcare settings, direct microscopy combined with clinical signs were reported to be the most useful diagnostic indicators to prompt referral for treatment. The survey identified that the diagnosis of sporotrichosis is the most problematic with poor sensitivity across the most widely available laboratory tests except fungal culture, highlighting the need to improve mycological diagnostic capacity and to develop innovative diagnostic solutions. Fungal microscopy and culture are now recognized as WHO essential diagnostic tests and better training in their application will help improve the situation. 
For mycetoma and sporotrichosis, in particular, advances in identifying specific marker antigens or genomic sequences may pave the way for new laboratory-based or point-of-care tests, although this is a formidable task given the large number of different organisms that can cause fungal NTDs. |
Surveillance for coccidioidomycosis - United States, 2011-2017
Benedict K , McCotter OZ , Brady S , Komatsu K , Sondermeyer Cooksey GL , Nguyen A , Jain S , Vugia DJ , Jackson BR . MMWR Surveill Summ 2019 68 (7) 1-15 PROBLEM/CONDITION: Coccidioidomycosis (Valley fever) is an infection caused by the environmental fungus Coccidioides spp., which typically causes respiratory illness but also can lead to disseminated disease. This fungus typically lives in soils in warm, arid regions, including the southwestern United States. REPORTING PERIOD: 2011-2017. DESCRIPTION OF SYSTEM: Coccidioidomycosis has been nationally notifiable since 1995 and is reportable in 26 states and the District of Columbia (DC), where laboratories and physicians notify local and state public health departments about possible coccidioidomycosis cases. Health department staff determine which cases qualify as confirmed cases according to the definition established by Council of State and Territorial Epidemiologists and voluntarily submit basic case information to CDC through the National Notifiable Diseases Surveillance System. RESULTS: During 2011-2017, a total of 95,371 coccidioidomycosis cases from 26 states and DC were reported to CDC. The number of cases decreased from 2011 (22,634 cases) to 2014 (8,232 cases) and subsequently increased to 14,364 cases in 2017; >95% of cases were reported from Arizona and California. Reported incidence in Arizona decreased from 261 per 100,000 persons in 2011 to 101 in 2017, whereas California incidence increased from 15.7 to 18.2, and other state incidence rates stayed relatively constant. Patient demographic characteristics were largely consistent with previous years, with an overall predominance among males and among adults aged >60 years in Arizona and adults aged 40-59 years in California. INTERPRETATION: Coccidioidomycosis remains an important national public health problem with a well-established geographic focus. 
The reasons for the changing trends in reported cases are unclear but might include environmental factors (e.g., temperature and precipitation), surveillance artifacts, land use changes, and changes in the population at risk for the infection. PUBLIC HEALTH ACTION: Health care providers should consider a diagnosis of coccidioidomycosis in patients who live or work in or have traveled to areas with known geographic risk for Coccidioides and be aware that those areas might be broader than previously recognized. Coccidioidomycosis surveillance provides important information about the epidemiology of the disease but is incomplete both in terms of geographic coverage and data availability. Expanding surveillance to additional states could help identify emerging areas that pose a risk for locally acquired infections. In Arizona and California, where most cases occur, collecting systematic enhanced data, such as more detailed patient characteristics and disease severity, could help clarify the reasons behind the recent changes in incidence and identify additional opportunities for focused prevention and educational efforts. |
Characteristics of large mumps outbreaks in the United States, July 2010-December 2015
Clemmons NS , Redd SB , Gastanaduy PA , Marin M , Patel M , Fiebelkorn AP . Clin Infect Dis 2019 68 (10) 1684-1690 BACKGROUND: Mumps is an acute viral illness that classically presents with parotitis. Although the United States experienced a 99% reduction in mumps cases following implementation of the 2-dose vaccination program in 1989, mumps has resurged in the past 10 years. METHODS: We assessed the epidemiological characteristics of mumps outbreaks with ≥20 cases reported in the United States electronically through the National Notifiable Diseases Surveillance System and from supplemental outbreak data through direct communications with jurisdictions from July 2010 through December 2015. Mumps cases were defined using the 2012 Council of State and Territorial Epidemiologists case definition. RESULTS: Twenty-three outbreaks with 20-485 cases per outbreak were reported in 18 jurisdictions. The duration of outbreaks ranged from 1.5 to 8.5 months (median, 3 months). All outbreaks involved close-contact settings; 18 (78%) involved universities, 16 (70%) occurred primarily among young adults (median age, 18-24 years), and 9 (39%) occurred in highly vaccinated populations (2-dose measles-mumps-rubella vaccine coverage ≥85%). CONCLUSIONS: During 2010-2015, multiple mumps outbreaks among highly vaccinated populations in close-contact settings occurred. Most cases occurred among vaccinated young adults, suggesting that waning immunity played a role. Further evaluation of risk factors associated with these outbreaks is warranted. |
Caregiver perspectives on TB case-finding and HIV clinical services for children diagnosed with TB in Tanzania
Emerson C , Ndakidemi E , Ngowi B , Medley A , Ng'eno B , Godwin M , Ntinginya N , Carpenter D , Kohi W , Modi S . AIDS Care 2019 32 (4) 1-5 Caregivers of children with tuberculosis (TB) and HIV play a critical role in seeking healthcare for their children. To assess the perspectives of caregivers of pediatric TB patients, we conducted 76 in-depth interviews at 10 TB clinics in 5 districts of Tanzania in March 2016. We assessed how the child received their TB diagnosis, the decision-making process around testing the child for HIV, and the process of linking the child to HIV treatment in the event of an HIV diagnosis. Caregivers suspected TB due to cases in their family, or the child being ill and not improving. Most caregivers noted delays before confirmation of a TB diagnosis and having to visit multiple facilities before a diagnosis. Once diagnosed, some caregivers reported challenges administering TB medications due to lack of pediatric formulations. Reasons for accepting HIV testing included recurrent illness and HIV symptoms, history of HIV in the family, and recommendation of the clinical provider. Caregivers described a relatively seamless process for linking their child to HIV treatment, highlighting the success of TB/HIV integration efforts. The multiple clinic visits required prior to TB diagnosis suggests the need for additional training and sensitization of healthcare workers and better TB diagnostic tools. |
Bedaquiline for the treatment of multidrug-resistant tuberculosis in the United States
Mase S , Chorba T , Parks S , Belanger A , Dworkin F , Seaworth B , Warkentin J , Barry P , Shah N . Clin Infect Dis 2019 71 (4) 1010-1016 BACKGROUND: In 2012, the Food and Drug Administration approved the use of bedaquiline fumarate as part of combination therapy for treatment of multidrug-resistant tuberculosis (MDR TB). We describe the treatment outcomes, safety, and tolerability of bedaquiline in our case series. METHODS: Data on patients started on bedaquiline for MDR TB between September 2012 and August 2016 were collected retrospectively through four TB programs using a standardized abstraction tool. Data were analyzed using univariate methods. Adverse events were graded using the Common Terminology Criteria for Adverse Events. RESULTS: Of 14 patients in this case series, 7/14 (50%) had MDR TB, 4/14 (29%) had pre-extensively drug-resistant (pre-XDR) TB, and 3/14 (21%) had extensively drug-resistant (XDR) TB. All had pulmonary TB, 5/14 (36%) had pulmonary and extrapulmonary TB, and 9/13 (69%) were smear-positive. One patient (7%) had HIV co-infection, 5/14 (36%) had diabetes mellitus, and 5 (36%) had been previously treated for TB. All patients were non-U.S.-born and 5 (36%) had private insurance. All patients achieved sputum culture conversion within a mean of 71 days (range, 26-116); 6/14 converted after starting bedaquiline. Twelve (86%) completed treatment and 1/14 (7%) moved out of the country. One patient (7%) had QTc prolongation >500 milliseconds and died 20 months after discontinuing bedaquiline of a cause not attributable to the drug. The most common adverse events were peripheral neuropathy (50%), not customarily associated with bedaquiline use, and QTc prolongation (43%). CONCLUSIONS: Of 14 patients, one had an adverse event necessitating bedaquiline discontinuation. Safety, culture conversion, and treatment completion in this series support use of bedaquiline for the treatment of MDR/XDR TB. |
Missed opportunities for prevention and treatment of hepatitis C among persons with HIV/HCV coinfection
Millman AJ , Luo Q , Nelson NP , Vellozzi C , Weiser J . AIDS Care 2019 32 (7) 1-9 Hepatitis C virus (HCV) and HIV have common modes of transmission, but information about HCV transmission risk, prevention, and treatment among persons with coinfection is lacking. The Medical Monitoring Project produces nationally representative estimates describing adults with diagnosed HIV in the United States. Using medical record data recorded during 6/2013-5/2017, we identified persons with detectable HCV RNA documented during the past 24 months. Among persons with coinfection, we described HCV transmission risk factors and receipt of HCV prevention services during the past 12 months and prescription of HCV treatment during the past 24 months. Overall, 4.9% had documented active HCV coinfection, among whom 30.2% were men who have sex with men (MSM), 6.7% reported injection drug use, and 62.1% were prescribed HCV treatment. Among MSM, 45.5% reported condomless anal sex and 45.5% received free condoms. Among persons who used drugs, 30.8% received drug or alcohol counseling, and among persons who injected drugs, 79.2% received sterile syringes. Among persons with HIV/HCV coinfection, recent drug injection was uncommon and most received sterile syringes. However, 1 in 3 were MSM, of whom half reported recent HCV sexual transmission risk behaviors. More than one-third of those with coinfection were not prescribed curative HCV treatment. |
Progress toward poliovirus containment implementation - worldwide, 2018-2019
Moffett DB , Llewellyn A , Singh H , Saxentoff E , Partridge J , Iakovenko M , Roesel S , Asghar H , Baig N , Grabovac V , Gurung S , Gumede-Moeletsi N , Barnor J , Theo A , Rey-Benito G , Villalobos A , Boualam L , Swan J , Sutter RW , Pandel E , Wassilak S , Oberste MS , Lewis I , Zaffran M . MMWR Morb Mortal Wkly Rep 2019 68 (38) 825-829 Among the three wild poliovirus (WPV) types, type 2 (WPV2) was declared eradicated globally by the Global Commission for the Certification of Poliomyelitis Eradication (GCC) in 2015. Subsequently, in 2016, a global withdrawal of Sabin type 2 oral poliovirus vaccine (OPV2) from routine use, through a synchronized switch from the trivalent formulation of oral poliovirus vaccine (tOPV, containing vaccine virus types 1, 2, and 3) to the bivalent form (bOPV, containing types 1 and 3), was implemented. WPV type 3 (WPV3), last detected in 2012 (1), will possibly be declared eradicated in late 2019.* To ensure that polioviruses are not reintroduced to the human population after eradication, World Health Organization (WHO) Member States committed in 2015 to containing all polioviruses in poliovirus-essential facilities (PEFs) that are certified to meet stringent containment criteria; implementation of containment activities began that year for facilities retaining type 2 polioviruses (PV2), including type 2 oral poliovirus vaccine (OPV) materials (2). As of August 1, 2019, 26 countries have nominated 74 PEFs to retain PV2 materials. Twenty-five of these countries have established national authorities for containment (NACs), which are institutions nominated by ministries of health or equivalent bodies to be responsible for poliovirus containment certification. All designated PEFs are required to be enrolled in the certification process by December 31, 2019 (3). 
When GCC certifies WPV3 eradication, WPV3 and vaccine-derived poliovirus (VDPV) type 3 materials will also be required to be contained, leading to a temporary increase in the number of designated PEFs. When safer alternatives to wild and OPV/Sabin strains that do not require containment conditions are available for diagnostic and serologic testing, the number of PEFs will decrease. Facilities continuing to work with polioviruses after global eradication must minimize the risk for reintroduction into communities by adopting effective biorisk management practices. |
Respiratory viral surveillance of healthcare personnel and patients at an adult long-term care facility
O'Neil CA , Kim L , Prill MM , Talbot HK , Whitaker B , Sakthivel SK , Zhang Y , Zhang J , Tong S , Stone N , Garg S , Gerber SI , Babcock HM . Infect Control Hosp Epidemiol 2019 40 (11) 1-4 We conducted active surveillance of acute respiratory viral infections (ARIs) among residents and healthcare personnel (HCP) at a long-term care facility during the 2015-2016 respiratory illness season. ARIs were observed among both HCP and patients, highlighting the importance of including HCP in surveillance programs. |
Rollout of ShangRing circumcision with active surveillance for adverse events and monitoring for uptake in Kenya
Odoyo-June E , Owuor N , Kassim S , Davis S , Agot K , Serrem K , Otieno G , Awori Q , Hines J , Toledo C , Laube C , Kisia C , Aoko A , Ojiambo V , Mwandi Z , Juma A , Kigen B . PLoS One 2019 14 (9) e0222942 INTRODUCTION: Since 2011, Kenya has been evaluating the ShangRing device for use in its voluntary medical male circumcision (VMMC) program according to World Health Organization (WHO) guidelines. Compared to conventional surgical circumcision, the ShangRing procedure is shorter, does not require suturing, and gives better cosmetic outcomes. After a pilot evaluation of the ShangRing in 2011, Kenya conducted active surveillance for adverse events associated with its use from 2016-2018 to further assess its safety and uptake, and to identify any operational bottlenecks to its widespread use based on data from a larger pool of procedures in routine health care settings. METHODS: From December 2017 to August 2018, HIV-negative clients aged 13 years or older seeking VMMC at six sites across five counties in Kenya were offered the ShangRing under injectable local anesthetic as an alternative to conventional surgical circumcision. Providers described both procedures to clients before letting them make a choice. Outcome measures recorded for clients who chose the ShangRing included the proportions who were clinically eligible, had successful device placement, experienced adverse events (AEs), or failed to return for device removal. Clients failing to return for follow-up were sought through phone calls, text messages, or home visits to ensure removal and complete information on adverse events. RESULTS: Of 3,692 eligible clients, 1,079 (29.2%) chose the ShangRing; of these, 11 (1.0%) were excluded due to ongoing clinical conditions, 17 (1.6%) underwent conventional surgery due to lack of an appropriate device size at the time of the procedure, and 1,051/1,079 (97.3%) had successful ShangRing placement. Uptake of the ShangRing varied from 11% to 97% across different sites.
There was one severe AE, a failed ShangRing placement (0.1%) managed by conventional wound suturing, plus two moderate AEs (0.2%), post-removal wound dehiscence and bleeding, that resolved without sequelae. The overall AE rate was 0.3%. All clients returned for device removal between the fifth and eleventh day after placement. CONCLUSION: ShangRing circumcision is effective and safe in the Kenyan context, but its uptake varies widely across settings. It should be rolled out under programmatic implementation for eligible males to take advantage of its unique benefits and to offer freedom of choice beyond conventional surgical VMMC. Public education on its availability and unique advantages is necessary to optimize its uptake and to actualize the benefit of its inclusion in VMMC programs. |
The impact of maternal HIV and malaria infection on the prevalence of congenital cytomegalovirus infection in Western Kenya
Otieno NA , Nyawanda BO , Otiato F , Oneko M , Amin MM , Otieno M , Omollo D , McMorrow M , Chaves SS , Dollard SC , Lanzieri TM . J Clin Virol 2019 120 33-37 BACKGROUND: Data on congenital cytomegalovirus (CMV) infection in Africa are limited. OBJECTIVE: To describe the prevalence of congenital CMV infection in a population with high prevalence of maternal HIV and malaria infection in western Kenya. STUDY DESIGN: We screened newborns for CMV by polymerase chain reaction assay of saliva swabs and dried blood spots (DBS), and assessed maternal CMV immunoglobulin G (IgG) status by testing serum eluted from newborns' DBS. We calculated adjusted prevalence ratios (aPRs) using log-binomial regression models. RESULTS: Among 1066 mothers, 210 (19.7%) had HIV infection and 207 (19.4%) had malaria infection; 33 (3.1%) mothers had both. Maternal CMV IgG prevalence was 93.1% (95% confidence interval [CI]: 88.3%-96.0%). Among 1078 newborns (12 sets of twins), 39 (3.6%, 95% CI: 2.7-4.9%) were CMV positive. The prevalence of congenital CMV infection by maternal HIV and malaria infection status was 5.0% (95% CI: 2.7-9.2%) for HIV only, 5.1% (95% CI: 2.7-9.4%) for malaria only, 8.8% (95% CI: 3.1-23.0%) for HIV and malaria co-infection, and 2.6% (95% CI: 1.7-4.1%) for neither. Congenital CMV infection was independently associated with maternal HIV infection (aPR=2.1; 95% CI: 1.0-4.2), adjusting for maternal age, parity, and malaria infection. CONCLUSIONS: The prevalence of congenital CMV infection was higher than the 0.2-0.7% reported in developed countries. Maternal HIV infection may increase the risk of congenital CMV infection, but the role of maternal malaria in intrauterine transmission of CMV remains unclear. |
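The adjusted prevalence ratios above come from log-binomial regression models; for intuition, a crude (unadjusted) prevalence ratio with a Wald-type confidence interval can be computed directly from counts. The counts below are illustrative, chosen only to mirror the reported 5.0% versus 2.6% prevalences, and are not the study's actual cell counts:

```python
import math

def prevalence_ratio(exposed_cases, exposed_n, unexposed_cases, unexposed_n, z=1.96):
    """Crude prevalence ratio with a Wald-type CI on the log scale."""
    pr = (exposed_cases / exposed_n) / (unexposed_cases / unexposed_n)
    # Standard error of log(PR) for cohort/cross-sectional counts
    se = math.sqrt(
        1 / exposed_cases - 1 / exposed_n + 1 / unexposed_cases - 1 / unexposed_n
    )
    lower = math.exp(math.log(pr) - z * se)
    upper = math.exp(math.log(pr) + z * se)
    return pr, lower, upper

# Illustrative counts: 10/200 (5.0%) CMV-positive among exposed newborns
# versus 13/500 (2.6%) among unexposed.
pr, lower, upper = prevalence_ratio(10, 200, 13, 500)
```

With these made-up counts the crude PR is about 1.9 with a CI spanning 1; the study's aPR of 2.1 additionally adjusts for maternal age, parity, and malaria infection through the regression model, which a simple 2x2 calculation cannot do.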
Population-based active surveillance for culture-confirmed candidemia - four sites, United States, 2012-2016
Toda M , Williams SR , Berkow EL , Farley MM , Harrison LH , Bonner L , Marceaux KM , Hollick R , Zhang AY , Schaffner W , Lockhart SR , Jackson BR , Vallabhaneni S . MMWR Surveill Summ 2019 68 (8) 1-15 PROBLEM/CONDITION: Candidemia is a bloodstream infection (BSI) caused by yeasts in the genus Candida. Candidemia is one of the most common health care-associated BSIs in the United States, with all-cause in-hospital mortality of up to 30%. PERIOD COVERED: 2012-2016. DESCRIPTION OF SYSTEM: CDC's Emerging Infections Program (EIP), a collaboration among CDC, state health departments, and academic partners that was established in 1995, was used to conduct active, population-based laboratory surveillance for candidemia in 22 counties in four states (Georgia, Maryland, Oregon, and Tennessee) with a combined population of approximately 8 million persons. Laboratories serving the catchment areas were recruited to report candidemia cases to the local EIP program staff. A case was defined as a blood culture that was positive for a Candida species collected from a surveillance area resident during 2012-2016. Any subsequent blood cultures with Candida within 30 days of the initial positive culture in the same patient were considered part of the same case. Trained surveillance officers collected clinical information from the medical chart for all cases, and isolates were sent to CDC for species confirmation and antifungal susceptibility testing. RESULTS: Across all sites and surveillance years (2012-2016), 3,492 cases of candidemia were identified. The crude candidemia incidence averaged across sites and years during 2012-2016 was 8.7 per 100,000 population; important differences in incidence were found by site, age group, sex, and race. The crude annual incidence was highest in Maryland (14.1 per 100,000 population) and lowest in Oregon (4.0 per 100,000 population).
The crude annual incidence of candidemia was highest among adults aged ≥65 years (25.5 per 100,000 population), followed by infants aged <1 year (15.8). The crude annual incidence was higher among males (9.4) than among females (8.0) and was approximately 2 times greater among blacks than among nonblacks (13.7 versus 5.8). Ninety-six percent of cases occurred in patients who were hospitalized at the time of or during the week after having a positive culture. One-third of cases occurred in patients who had undergone a surgical procedure in the 90 days before the candidemia diagnosis, 77% occurred in patients who had received systemic antibiotics in the 14 days before the diagnosis, and 73% occurred in patients who had had a central venous catheter (CVC) in place within 2 days before the diagnosis. Ten percent were in patients who had used injection drugs in the past 12 months. The median time from admission to candidemia diagnosis was 5 days (interquartile range [IQR]: 0-16 days). Among 2,662 treated cases in adults aged >18 years, 34% were treated with fluconazole alone, 30% with echinocandins alone, and 34% with both. The all-cause in-hospital case-fatality ratio was 25% for any time after admission and 8% within 48 hours after a positive culture for Candida species. Candida albicans accounted for 39% of cases, followed by Candida glabrata (28%) and Candida parapsilosis (15%). Overall, 7% of isolates were resistant to fluconazole and 1.6% were resistant to echinocandins, with no clear trends in resistance over the 5-year surveillance period. INTERPRETATION: Approximately nine out of 100,000 persons developed culture-positive candidemia annually in four U.S. sites. The youngest and oldest persons, men, and blacks had the highest incidences of candidemia. 
Patients with candidemia identified in the surveillance program had many of the typical risk factors for candidemia, including recent surgery, exposure to broad-spectrum antibiotics, and presence of a CVC. However, an unexpectedly high proportion of candidemia cases (10%) occurred in patients with a history of injection drug use (IDU), suggesting that IDU has become a common risk factor for candidemia. Mortality associated with candidemia remains high, with one in four cases resulting in death during hospitalization. PUBLIC HEALTH ACTION: Active surveillance for candidemia yielded important information about disease incidence, mortality, and the persons at greatest risk. The surveillance was expanded to nine sites in 2017, which will improve understanding of the geographic variability in candidemia incidence and associated clinical and demographic features. This surveillance will help monitor incidence trends, track emergence of resistance and species distribution, monitor changes in underlying conditions and predisposing factors, assess trends in antifungal treatment and outcomes, and be helpful for those developing prevention efforts. IDU has emerged as an important risk factor for candidemia, and interventions to prevent invasive fungal infections in this population are needed. Surveillance data documenting that approximately two thirds of candidemia cases were caused by species other than C. albicans, which are generally associated with greater antifungal resistance than C. albicans, and the presence of substantial fluconazole resistance support 2016 clinical guidelines recommending a switch from fluconazole to echinocandins as the initial treatment for candidemia in most patients. |
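The candidemia abstract's headline figure can be checked directly from the numbers it reports: 3,492 cases in a catchment of roughly 8 million persons over the 5-year period. The sketch below reproduces that arithmetic (assuming, as a simplification, a stable catchment population across 2012-2016):

```python
# Rough arithmetic check of the crude candidemia incidence reported above.
# Assumes the ~8 million combined catchment population was stable across 2012-2016.
cases = 3492            # culture-confirmed cases, all sites, 2012-2016
population = 8_000_000  # approximate combined catchment population
years = 5               # surveillance period

person_years = population * years
incidence_per_100k = cases / person_years * 100_000
print(round(incidence_per_100k, 1))  # -> 8.7, matching the reported crude incidence
```

The result, 8.73 per 100,000 person-years, rounds to the 8.7 per 100,000 reported in the abstract.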
Sneathia amnii and maternal chorioamnionitis and stillbirth, Mozambique
Vitorino P , Varo R , Castillo P , Hurtado JC , Fernandes F , Valente AM , Mabunda R , Mocumbi S , Gary JM , Jenkinson TG , Mandomando I , Blau DM , Breiman RF , Bassat Q . Emerg Infect Dis 2019 25 (8) 1614-1616 We report a case of Sneathia amnii as the causative agent of maternal chorioamnionitis and congenital pneumonia resulting in a late fetal death in Mozambique, with strong supportive postmortem molecular and histopathologic confirmation. This rare, fastidious gram-negative coccobacillus has been reported to infrequently cause abortions, stillbirths, and neonatal infections. |
On the Fly: Interactions Between Birds, Mosquitoes, and Environment That Have Molded West Nile Virus Genomic Structure Over Two Decades.
Duggal NK , Langwig KE , Ebel GD , Brault AC . J Med Entomol 2019 56 (6) 1467-1474 West Nile virus (WNV) was first identified in North America almost 20 yr ago. In that time, WNV has crossed the continent and established enzootic transmission cycles, resulting in intermittent outbreaks of human disease that have largely been linked with climatic variables and waning avian seroprevalence. During the transcontinental dissemination of WNV, the original genotype has been displaced by two principal extant genotypes which contain an envelope mutation that has been associated with enhanced vector competence by Culex pipiens L. (Diptera: Culicidae) and Culex tarsalis Coquillett vectors. Analyses of retrospective avian host competence data generated using the founding NY99 genotype strain have demonstrated a steady reduction in viremias of house sparrows over time. Reciprocally, the current genotype strains WN02 and SW03 have demonstrated an inverse correlation between house sparrow viremia magnitude and the time since isolation. These data collectively indicate that WNV has evolved for increased avian viremia while house sparrows have evolved resistance to the virus such that the relative host competence has remained constant. Intrahost analyses of WNV evolution demonstrate that selection pressures are avian species-specific and purifying selection is greater in individual birds compared with individual mosquitoes, suggesting that the avian adaptive and/or innate immune response may impose a selection pressure on WNV. Phylogenomic, experimental evolutionary systems, and models that link viral evolution with climate, host, and vector competence studies will be needed to identify the relative effect of different selective and stochastic mechanisms on viral phenotypes and the capacity of newly evolved WNV genotypes for transmission in continuously changing landscapes. |
Isolation and characterization of Akhmeta virus from wild caught rodents ( Apodemus spp.) in Georgia.
Doty JB , Maghlakelidze G , Sikharulidze I , Tu SL , Morgan CN , Mauldin MR , Parkadze O , Kartskhia N , Turmanidze M , Matheny A , Davidson W , Tang S , Gao J , Li Y , Upton C , Carroll DS , Emerson GL , Nakazawa Y . J Virol 2019 93 (24) In 2013, a novel orthopoxvirus was detected in skin lesions of two cattle herders from the Kakheti region of Georgia (country); this virus was named Akhmeta virus. Subsequent investigation of these cases revealed that small mammals in the area had serological evidence of orthopoxvirus infections, suggesting their involvement in the maintenance of these viruses in nature. In October 2015, we began a longitudinal study assessing the natural history of orthopoxviruses in Georgia. As part of this effort, we trapped small mammals near Akhmeta (n=176) and Gudauri (n=110). Here, we describe the isolation and molecular characterization of Akhmeta virus from lesion material and pooled heart and lung samples collected from five wood mice (Apodemus uralensis and A. flavicollis) in these two locations. The genomes of Akhmeta virus obtained from rodents group into two clades: one represented by viruses isolated from A. uralensis samples, and one represented by viruses isolated from A. flavicollis samples. These genomes also display several presumptive recombination events, for which gene truncation and identity have been examined. Importance: Akhmeta virus is a unique Orthopoxvirus that was described in 2013 from the country of Georgia. This paper presents the first isolation of this virus from small mammal (Rodentia; Apodemus spp.) samples and the molecular characterization of those isolates. The identification of the virus in small mammals is an essential component of understanding the natural history of this virus and its transmission to human populations, and could guide public health interventions in Georgia. 
Akhmeta virus genomes harbor evidence suggestive of recombination with a variety of other orthopoxviruses; this has implications for the evolution of orthopoxviruses, their ability to infect mammalian hosts, and their ability to adapt to novel host species. |
Reducing West Nile virus risk through vector management
Nasci RS , Mutebi JP . J Med Entomol 2019 56 (6) 1516-1521 Over 50,000 human West Nile virus (WNV) (Flaviviridae: Flavivirus) clinical disease cases have been reported to the CDC during the 20 yr that the virus has been present in the United States. Despite the establishment and expansion of WNV-focused mosquito surveillance and control efforts and a renewed emphasis on applying integrated pest management (IPM) principles to WNV control, periodic local and regional WNV epidemics with case reports exceeding 2,000 cases per year have occurred during 13 of those 20 yr in the United States. In this article, we examine the scientific literature for evidence that mosquito control activities directed at either preventing WNV outbreaks or stopping those outbreaks once in progress reduce WNV human disease or have a measurable impact on entomological indicators of human WNV risk. We found that, despite a proliferation of research investigating larval and adult mosquito control effectiveness, few of these studies actually measure epidemiological outcomes or the entomological surrogates of WNV risk. Although many IPM principles (e.g., control decisions based on surveillance, use of multiple control methodologies appropriate for the ecosystem) have been implemented effectively, action thresholds and meaningful public health outcome assessments have not been used routinely. Establishing thresholds for entomological indicators of human risk analogous to the economic injury levels and economic thresholds utilized in crop IPM programs may result in more effective WNV prevention. |
Maternal exposure to outdoor air pollution and congenital limb deficiencies in the National Birth Defects Prevention Study
Choi G , Stingone JA , Desrosiers TA , Olshan AF , Nembhard WN , Shaw GM , Pruitt S , Romitti PA , Yazdy MM , Browne ML , Langlois PH , Botto L , Luben TJ . Environ Res 2019 179 108716 BACKGROUND: Congenital limb deficiencies (CLDs) are a relatively common group of birth defects whose etiology is mostly unknown. Recent studies suggest maternal air pollution exposure as a potential risk factor. AIM: To investigate the relationship between ambient air pollution exposure during early pregnancy and offspring CLDs. METHODS: The study population was identified from the National Birth Defects Prevention Study, a population-based multi-center case-control study, and consisted of 615 CLD cases and 5,701 controls with due dates during 1997 through 2006. Daily averages and/or maxima of six criteria air pollutants (particulate matter <2.5 µm [PM2.5], particulate matter <10 µm [PM10], nitrogen dioxide [NO2], sulfur dioxide [SO2], carbon monoxide [CO], and ozone [O3]) were averaged over gestational weeks 2-8, as well as for individual weeks during this period, using data from EPA air monitors nearest to the maternal address. Logistic regression was used to estimate adjusted odds ratios (aORs) and 95% confidence intervals (CIs), adjusting for maternal age, race/ethnicity, education, and study center. We estimated aORs for any CLD and CLD subtypes (i.e., transverse, longitudinal, and preaxial). Potential confounding by co-pollutants was assessed by adjusting for one additional air pollutant. Using the single-pollutant model, we further investigated effect measure modification by body mass index, cigarette smoking, and folic acid use. Sensitivity analyses were conducted restricting to mothers with a residence closer to an air monitor. RESULTS: We observed near-null aORs for CLDs per interquartile range (IQR) increase in PM10, PM2.5, and O3. However, weekly averages of the daily average NO2 and SO2, and daily max NO2, SO2, and CO concentrations were associated with increased odds of CLDs. 
The crude ORs ranged from 1.03 to 1.12 per IQR increase in these air pollution concentrations, and consistently elevated aORs were observed for CO. Stronger associations were observed for SO2 and O3 in subtype analysis (preaxial). In co-pollutant adjusted models, associations with CO remained elevated (aORs: 1.02-1.30); but aORs for SO2 and NO2 became near-null. The aORs for CO remained elevated among mothers who lived within 20km of an air monitor. The aORs varied by maternal BMI, smoking status, and folic acid use. CONCLUSION: We observed modest associations between CLDs and air pollution exposures during pregnancy, including CO, SO2, and NO2, though replication through further epidemiologic research is warranted. |
Differences in price of flavoured and non-flavoured tobacco products sold in the USA, 2011-2016
Agaku IT , Odani S , Armour B , Mahoney M , Garrett BE , Loomis BR , Rogers T , Gammon DG , King BA . Tob Control 2019 29 (5) 537-547 BACKGROUND: Limited data exist on whether there is differential pricing of flavoured and non-flavoured varieties of the same product type. We assessed price of tobacco products by flavour type. METHODS: Retail scanner data from Nielsen were obtained for October 2011 to January 2016. Universal product codes were used to classify tobacco product (cigarettes, roll-your-own cigarettes (RYO), little cigars and moist snuff) flavours as: menthol, flavoured or non-flavoured. Prices were standardised to a cigarette pack (20 cigarette sticks) or cigarette pack equivalent (CPE). Average prices during 2015 were calculated overall and by flavour designation. Joinpoint regression and average monthly percentage change were used to assess trends. RESULTS: During October 2011 to January 2016, price trends increased for menthol (the only flavour allowed in cigarettes) and non-flavoured cigarettes; decreased for menthol, flavoured and non-flavoured RYO; increased for flavoured little cigars, but decreased for non-flavoured and menthol little cigars; and increased for menthol and non-flavoured moist snuff, but decreased for flavoured moist snuff. In 2015, average national prices were US$5.52 and US$5.47 for menthol and non-flavoured cigarettes; US$1.89, US$2.51 and US$4.77 for menthol, non-flavoured and flavoured little cigars; US$1.49, US$1.64 and US$1.78 per CPE for menthol, non-flavoured and flavoured moist snuff; and US$0.93, US$1.03 and US$1.64 per CPE for flavoured, menthol and non-flavoured RYO, respectively. CONCLUSION: Trends in the price of tobacco products varied across products and flavour types. Menthol little cigars, moist snuff and RYO were less expensive than non-flavoured varieties. Efforts to make flavoured tobacco products less accessible and less affordable could help reduce tobacco product use. |
First-line antibiotic selection in outpatient settings
Palms DL , Hicks LA , Bartoces M , Hersh AL , Zetts R , Hyun DY , Fleming-Dutra KE . Antimicrob Agents Chemother 2019 63 (11) Using the 2014 IBM MarketScan Commercial Database, we compared antibiotic selection for pharyngitis, sinusitis, and acute otitis media in retail clinics, emergency departments, urgent cares, and offices. Only 50% of visits for these conditions received recommended first-line antibiotics. Improving antibiotic selection for common outpatient conditions is an important stewardship target. |
Infectious causes of acute gastroenteritis in US children undergoing allogeneic hematopoietic cell transplant: A longitudinal, multicenter study
Schuster JE , Johnston SH , Piya B , Dulek DE , Wikswo ME , McHenry R , Browne H , Gautam R , Bowen MD , Vinje J , Payne DC , Azimi P , Selvarangan R , Halasa N , Englund JA . J Pediatric Infect Dis Soc 2019 9 (4) 421-427 BACKGROUND: Acute gastroenteritis (AGE) in hematopoietic cell transplant (HCT) patients causes significant morbidity and mortality. Data regarding the longitudinal assessment of infectious pathogens during symptomatic AGE and asymptomatic periods, particularly in children, are limited. We investigated the prevalence of AGE-associated infectious pathogens in children undergoing allogeneic HCT. METHODS: From March 2015 through May 2016, 31 pediatric patients at 4 US children's hospitals were enrolled and had stool collected weekly from pre-HCT through 100 days post-HCT and tested for infectious AGE pathogens by molecular methods. Demographics, clinical symptoms, antimicrobials, vaccination history, and outcomes were manually abstracted from the medical record into a standardized case report form. RESULTS: We identified a pathogen in 18% (38/206) of samples, with many detections occurring during asymptomatic periods. Clostridioides difficile was the most commonly detected pathogen, found in 39% (15/38) of positive specimens, although only 20% (3/15) of C. difficile-positive specimens were obtained from children with diarrhea. Sapovirus, detected in 21% (8/38) of pathogen-positive specimens, was commonly associated with AGE; 87.5% of sapovirus-positive specimens were obtained during symptomatic periods. Norovirus was not detected, and rotavirus was detected infrequently. Prolonged shedding of infectious pathogens was rare. CONCLUSIONS: This multicenter, prospective, longitudinal study suggests that the epidemiology of AGE pathogens identified from allogeneic HCT patients may be changing. 
Previously reported viruses, such as rotavirus and norovirus, may be less common due to widespread vaccination and institution of infection control precautions, and emerging viruses such as sapoviruses may be increasingly recognized due to the use of molecular diagnostics. |
Oral cholera vaccination coverage after the first global stockpile deployment in Haiti, 2014
Burnett EM , Francois J , Sreenivasan N , Wannemuehler K , Faye PC , Tohme RA , Delly P , Deslouches YG , Etheart MD , Dismer AM , Patel R , Date K . Vaccine 2019 37 (43) 6348-6355 INTRODUCTION: In 2014, an oral cholera vaccine (OCV) campaign targeting 185,314 persons aged ≥1 year was conducted in 3 departments via fixed-post and door-to-door strategies. This was the first use of the global OCV stockpile in Haiti. METHODS: We conducted a multi-stage cluster survey to assess departmental OCV coverage. Target population estimates were projected from the 2003 Haiti population census with adjustments for population growth and estimated proportion of pregnant women. In the three departments, we sampled 30/106 enumeration areas (EAs) in Artibonite, 30/244 EAs in Centre, and 20/29 EAs in Ouest; 20 households were systematically sampled in each EA. Household and individual interviews using a standard questionnaire were conducted in each selected household; data on OCV receipt were obtained from vaccination card or verbal report. We calculated OCV campaign coverage estimates and 95% confidence intervals (CIs) accounting for survey design. RESULTS: Overall two-dose OCV coverage was 70% (95% CI: 60, 79), 63% (95% CI: 55, 71), and 44% (95% CI: 35, 53) in Artibonite, Centre, and Ouest, respectively. Two-dose coverage was higher in the 1-4 years age group than among those aged ≥15 years in Artibonite (difference: 11%; 95% CI: 5%, 17%) and Ouest (difference: 12%; 95% CI: 3%, 20%). A higher percentage of children aged 5-14 years received both recommended doses than did those aged ≥15 years (Artibonite: 14% difference, 95% CI: 8%, 19%; Centre: 11% difference, 95% CI: 5%, 17%; Ouest: 10% difference, 95% CI: 2%, 17%). The most common reason for not receiving any OCV dose was being absent during the campaign or not having heard about vaccination activities. 
CONCLUSIONS: While coverage estimates in Artibonite and Centre were comparable with other OCV campaigns in Haiti and elsewhere, inadequate social mobilization and outdated population estimates might have contributed to lower coverage in Ouest. |
Inactivated influenza vaccine and spontaneous abortion in the Vaccine Safety Datalink in 2012-13, 2013-14, and 2014-15
Donahue JG , Kieke BA , King JP , Mascola MA , Shimabukuro TT , DeStefano F , Hanson KE , McClure DL , Olaiya O , Glanz JM , Hechter RC , Irving SA , Jackson LA , Klein NP , Naleway AL , Weintraub ES , Belongia EA . Vaccine 2019 37 (44) 6673-6681 INTRODUCTION: A recent study reported an association between inactivated influenza vaccine (IIV) and spontaneous abortion (SAB), but only among women who had also been vaccinated in the previous influenza season. We sought to estimate the association between IIV administered in three recent influenza seasons and SAB among women who were and were not vaccinated in the previous influenza season. METHODS: We conducted a case-control study over three influenza seasons (2012-13, 2013-14, 2014-15) in the Vaccine Safety Datalink (VSD). Cases (women with SAB) and controls (women with live births) were matched on VSD site, date of last menstrual period, age group, and influenza vaccination status in the previous influenza season. Of 1908 presumptive cases identified from the electronic record, 1236 were included in the main analysis. Administration of IIV was documented in several risk windows, including 1-28, 29-56, and >56 days before the SAB date. RESULTS: Among 627 matched pairs vaccinated in the previous season, no association was found between vaccination in the 1-28-day risk window and SAB (adjusted odds ratio (aOR) 0.9; 95% confidence interval (CI) 0.6-1.5). The season-specific aOR ranged from 0.5 to 1.7, with all CIs including the null value of 1.0. Similarly, no association was found among women who were not vaccinated in the previous season; the season-specific aOR in the 1-28-day risk window ranged from 0.6 to 0.7, and the 95% CI included 1.0 in each season. No association was found between SAB and influenza vaccination in the other risk windows, or when vaccine receipt was analyzed relative to date of conception. 
CONCLUSION: During these seasons we found no association between IIV and SAB, including among women vaccinated in the previous season. These findings lend support to current recommendations for influenza vaccination at any time during pregnancy, including the first trimester. |
Combining serological and contact data to derive target immunity levels for achieving and maintaining measles elimination
Funk S , Knapp JK , Lebo E , Reef SE , Dabbagh AJ , Kretsinger K , Jit M , Edmunds WJ , Strebel PM . BMC Med 2019 17 (1) 180 BACKGROUND: Vaccination has reduced the global incidence of measles to the lowest rates in history. However, local interruption of measles virus transmission requires sustained high levels of population immunity that can be challenging to achieve and maintain. The herd immunity threshold for measles is typically stipulated at 90-95%. This figure does not easily translate into age-specific immunity levels required to interrupt transmission. Previous estimates of such levels were based on speculative contact patterns derived from historical data in high-income countries. The aim of this study was to determine age-specific immunity levels that would ensure elimination of measles when taking into account empirically observed contact patterns. METHODS: We combined estimated immunity levels from serological data in 17 countries with studies of age-specific mixing patterns to derive contact-adjusted immunity levels. We then compared these to case data from the 10 years following the seroprevalence studies to establish a contact-adjusted immunity threshold for elimination. Lastly, we combined a range of hypothetical immunity profiles with contact data from a wide range of socioeconomic and demographic settings to determine whether they would be sufficient for elimination. RESULTS: We found that contact-adjusted immunity levels were able to predict whether countries would experience outbreaks in the decade following the serological studies in about 70% of countries. The corresponding threshold level of contact-adjusted immunity was found to be 93%, corresponding to an average basic reproduction number of approximately 14. 
Testing different scenarios of immunity with this threshold level using contact studies from around the world, we found that 95% immunity would have to be achieved by the age of five and maintained across older age groups to guarantee elimination. This reflects a greater level of immunity required in 5-9-year-olds than established previously. CONCLUSIONS: The immunity levels we found necessary for measles elimination are higher than previous guidance. The importance of achieving high immunity levels in 5-9-year-olds presents both a challenge and an opportunity. While such high levels can be difficult to achieve, school entry provides an opportunity to ensure sufficient vaccination coverage. Combined with observations of contact patterns, further national and sub-national serological studies could serve to highlight key gaps in immunity that need to be filled in order to achieve national and regional measles elimination. |
Uptake and safety of hepatitis A vaccination during pregnancy: A Vaccine Safety Datalink study
Groom HC , Smith N , Irving SA , Koppolu P , Vazquez-Benitez G , Kharbanda EO , Daley MF , Donahue JG , Getahun D , Jackson LA , Klein NP , McCarthy NL , Nordin JD , Panagiotakopoulos L , Naleway AL . Vaccine 2019 37 (44) 6648-6655 INTRODUCTION: Infection with hepatitis A virus (HAV) during pregnancy, although uncommon, is associated with gestational complications and pre-term labor. Hepatitis A vaccine (HepA) is recommended for anyone at increased risk for contracting hepatitis A, including women at risk who are also pregnant. Limited data are available on the safety of maternal HepA vaccination. OBJECTIVES: Assess the frequency of maternal HepA receipt and evaluate the potential association between maternal vaccination and pre-specified maternal and infant safety outcomes. METHODS: A retrospective cohort of pregnancies in the Vaccine Safety Datalink (VSD) resulting in live births from 2004 through 2015 was included. Pregnancies with HepA exposure were compared to those with other vaccine exposures, and to those with no vaccine exposures. Risk factors for contracting hepatitis A were identified up to one year prior to or during the pregnancy using ICD-9 codes. Maternal and fetal adverse events were evaluated according to maternal HepA exposure status. Adjusted odds ratios (aORs) were used to describe the associations. RESULTS: Among 666,233 pregnancies in the study period, HepA was administered at a rate of 1.7 per 1000 (n=1140), most commonly within the first six weeks of pregnancy. Less than 3% of those exposed to HepA during pregnancy had an ICD-confirmed risk factor. There were no significant associations between HepA exposure during pregnancy and gestational hypertension, gestational diabetes, pre-eclampsia/eclampsia, cesarean delivery, pre-term delivery, and low birthweight. There was a statistically significant association between HepA exposure during pregnancy and small-for-gestational-age (SGA) infants (aOR 1.32, [95% CI 1.09, 1.60], p=0.004). 
CONCLUSIONS: The rate of maternal HepA vaccination was low and rarely due to documented risk factors for vaccination. HepA vaccination during pregnancy was not associated with an increased risk for a range of adverse events examined among pregnancies resulting in live births, but an identified association between maternal HepA and SGA infant outcomes, while likely due to unmeasured confounding, warrants further exploration. |
Adverse events following adenovirus type 4 and type 7 vaccine, live, oral in the Vaccine Adverse Event Reporting System (VAERS), United States, October 2011-July 2018
McNeil MM , Paradowska-Stankiewicz I , Miller ER , Marquez PL , Seshadri S , Collins LCJr , Cano MV . Vaccine 2019 37 (44) 6760-6767 BACKGROUND: In March 2011, the U.S. Food and Drug Administration licensed adenovirus type 4 and type 7 vaccine, live, oral (Barr Labs, Inc.) (adenovirus vaccine) for use in military personnel 17 through 50 years of age. The vaccine was first universally administered to U.S. military recruits in October 2011. We investigated adverse event (AE) reports following the adenovirus vaccine submitted to the Vaccine Adverse Event Reporting System (VAERS). METHODS: We searched the VAERS database for U.S. reports among persons who received adenovirus vaccine during October 2011 through July 2018, including participants in a military observational study. We reviewed all serious reports and accompanying medical records. We compared the proportion of serious reports in a proxy military recruit population and reviewed all reports of suspected allergic reactions following adenovirus vaccination. RESULTS: During the analytic period, VAERS received 100 reports following adenovirus vaccination; 39 (39%) were classified as serious, and of these, 17 (44%) were from the observational study. One death was reported. Males accounted for 72% of reports. Median age of vaccinees was 19 years (range 17-32). The most frequently reported serious AEs were Guillain-Barré syndrome (GBS) (n=12) and anaphylaxis (n=8); of these, two GBS and all the anaphylaxis reports were reported in the observational study. Reports documented concurrent receipt of multiple other vaccines (95%) and penicillin G (IM Pen G) or other antibiotics (50%). CONCLUSIONS: The reporting rate for serious AEs was higher than with other vaccines administered in the comparison military recruit population (39% vs 18%); however, we identified no unexpected or concerning pattern of adenovirus vaccine AEs. Co-administration of vaccines and IM Pen G was commonly reported in this military population. 
These exposures may have contributed to the GBS and anaphylaxis outcomes observed with the adenovirus vaccine. Future adenovirus vaccine safety studies in a population without these co-administrations would be helpful in clarifying the vaccine's safety profile. |
Impact of the introduction of rotavirus vaccine on hospital admissions for diarrhoea among children in Kenya: A controlled interrupted time series analysis
Otieno GP , Bottomley C , Khagayi S , Adetifa I , Ngama M , Omore R , Ogwel B , Owor BE , Bigogo G , Ochieng JB , Onyango C , Juma J , Mwenda J , Tabu C , Tate JE , Addo Y , Britton T , Parashar UD , Breiman RF , Verani JR , Nokes DJ . Clin Infect Dis 2019 70 (11) 2306-2313 INTRODUCTION: Monovalent rotavirus vaccine, Rotarix™, was introduced in Kenya in July 2014 and is recommended for infants as oral doses at ages 6 and 10 weeks. A multi-site study was established in two population-based surveillance sites to evaluate vaccine impact on the incidence of rotavirus-associated hospitalisations (RVH). METHODS: Hospital-based surveillance was conducted from January 2010 to June 2017 for acute diarrhoea hospitalisations among children aged <5 years in two health facilities in Kenya. A controlled interrupted time series analysis was undertaken to compare RVH before and after vaccine introduction, using rotavirus-negative cases as a control series. The change in incidence post vaccine introduction was estimated from a negative binomial model that adjusted for secular trend, seasonality, and multiple health worker industrial actions (strikes). RESULTS: Between January 2010 and June 2017 there were 1513 and 1652 diarrhoea hospitalisations in Kilifi and Siaya, respectively; among those tested for rotavirus, 28% (315/1142) and 23% (197/877) were positive. There was a 57% (95% CI: 8 to 80) reduction in RVH observed in the first year post vaccine introduction in Kilifi and a 59% (95% CI: 20 to 79) reduction in Siaya. In the second year, RVH decreased further at both sites, with an 80% (95% CI: 46 to 93) reduction in Kilifi and an 82% reduction in Siaya (95% CI: 61 to 92), and this reduction was sustained at both sites into the third year. CONCLUSIONS: A substantial reduction of RVH and all-cause diarrhoea has been observed in two demographic surveillance sites in Kenya within 3 years of vaccine introduction. |
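The interrupted-time-series design described in the rotavirus abstract can be illustrated with a small simulation. The study fit a negative binomial model with a rotavirus-negative control series; the sketch below is a simplified stand-in that fits a plain Poisson log-linear model (via iteratively reweighted least squares) to entirely synthetic monthly counts, just to show how the post-introduction step change translates into a percent reduction. All data, dates, and parameter values here are invented for illustration, not taken from the study.

```python
import numpy as np

# Simplified interrupted-time-series sketch on simulated monthly admission counts.
# (The study used a negative binomial model with a control series; this is a
# plain Poisson log-linear approximation fit by IRLS.)
rng = np.random.default_rng(42)
n_months = 90
t = np.arange(n_months)
post = (t >= 54).astype(float)                 # hypothetical vaccine-introduction month
season = np.column_stack([np.sin(2 * np.pi * t / 12),
                          np.cos(2 * np.pi * t / 12)])
true_drop = 0.9                                # simulated log-scale step at introduction
mu = np.exp(2.5 + 0.3 * season[:, 0] - true_drop * post)
y = rng.poisson(mu).astype(float)              # simulated rotavirus-positive admissions

# Design matrix: intercept, secular trend, step change, seasonality.
X = np.column_stack([np.ones(n_months), t / n_months, post, season])
beta = np.zeros(X.shape[1])
for _ in range(50):                            # IRLS iterations for Poisson regression
    eta = X @ beta
    m = np.exp(eta)
    z = eta + (y - m) / m                      # working response
    beta = np.linalg.solve(X.T @ (m[:, None] * X), X.T @ (m * z))

reduction = 1 - np.exp(beta[2])                # step-change coefficient as % reduction
print(f"estimated post-introduction reduction: {reduction:.0%}")
```

With the simulated step of 0.9 on the log scale, the recovered reduction is close to 1 - exp(-0.9) ≈ 59%, of the same order as the first-year reductions the study reports; the real analysis additionally used the rotavirus-negative series to control for strikes and other shared shocks.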
Priming effect of bivalent and quadrivalent vaccine for HPV 31/33/45/52: an exploratory analysis from two clinical trials
Sauvageau C , Panicker G , Unger ER , De Serres G , Schiller J , Ouakki M , Gilca V . Hum Vaccin Immunother 2019 16 (3) 590-594 The main objective of this post hoc analysis is to compare the magnitude of the immune response to HPV31/33/45/52 and 58 after a dose of 9vHPV vaccine given to naive (previously unvaccinated) subjects and subjects previously vaccinated with a dose of 2vHPV or 4vHPV vaccine. Results from two clinical trials conducted in the same region, in comparable populations and by the same research team were included in this analysis. In study A, a dose of 9vHPV was administered 6 months after a single dose of 2vHPV as well as to naive subjects. In study B, a dose of 9vHPV was administered 36-96 months (mean 65 months) after a single dose of 4vHPV. Blood samples were collected just before and one month post-9vHPV vaccine administration. For both studies, antibody responses were measured using the same 9-plex virus-like particle based IgG ELISA (M9ELISA). One month after 9vHPV dose administration, all subjects were seropositive to HPV 31/33/45/52 and 58. Subjects who had previously received 2vHPV or 4vHPV had significantly higher (1.8-8.0-fold) GMTs than naive subjects for HPV31/33/45/52 types but not for HPV58. GMTs to HPV31/33/45/52 and 58 were not significantly different between subjects who received a 2vHPV or 4vHPV dose prior to 9vHPV. The strong anamnestic response to one dose of 9vHPV given as late as 3-8 years after a single dose of 2vHPV or 4vHPV vaccine indicates these vaccines induced priming to types only included in the 9vHPV vaccine. |
Safety, tolerability, and immunogenicity of PfSPZ Vaccine administered by direct venous inoculation to infants and young children: findings from an age de-escalation, dose-escalation double-blinded randomized, controlled study in western Kenya
Steinhardt LC , Richie TL , Yego R , Akach D , Hamel MJ , Gutman JR , Wiegand RE , Nzuu EL , Dungani A , Kc N , Murshedkar T , Church LWP , Sim BKL , Billingsley PF , James ER , Abebe Y , Kariuki S , Samuels AM , Otieno K , Sang T , Kachur SP , Styers D , Schlessman K , Abarbanell G , Hoffman SL , Seder RA , Oneko M . Clin Infect Dis 2019 71 (4) 1063-1071 BACKGROUND: The whole sporozoite PfSPZ Vaccine is being evaluated for malaria prevention. The vaccine is administered intravenously for maximal efficacy. Direct venous inoculation (DVI) with PfSPZ Vaccine has been safe, tolerable, and feasible in adults, but safety data for children and infants are limited. METHODS: We conducted an age de-escalation, dose-escalation randomized controlled trial in Siaya County, western Kenya. Children and infants (5-9 years, 13-59 months, and 5-12 months) were enrolled into 13 age-dose cohorts of 12 participants and randomized 2:1 to vaccine or normal saline placebo in escalating doses: 1.35x10^5, 2.7x10^5, 4.5x10^5, 9.0x10^5, and 1.8x10^6 Plasmodium falciparum sporozoites (PfSPZ), with the two highest doses given twice, 8 weeks apart. Solicited adverse events (AEs) were monitored for eight days after vaccination; unsolicited AEs for 29 days; and serious AEs (SAEs) throughout the study. Blood taken pre-vaccination and one week post-vaccination was tested for IgG antibodies to Pf circumsporozoite protein (PfCSP) using enzyme-linked immunosorbent assay (ELISA). RESULTS: Rates of AEs were similar in vaccinees and controls for solicited (35.7% vs. 41.5%) and unsolicited (83.9% vs. 92.5%) AEs, respectively. No related grade 3 AEs, SAEs, or grade 3 laboratory abnormalities occurred. Most (79.0%) vaccinations were administered by a single DVI. Among those in the 9.0x10^5 and 1.8x10^6 PfSPZ groups, 36/45 (80.0%) vaccinees and 4/21 (19.0%) placebo controls developed antibodies to PfCSP, p<0.001. 
CONCLUSIONS: PfSPZ Vaccine in doses as high as 1.8x10^6 can be administered to infants and children by DVI, and was safe, well tolerated, and immunogenic. |
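The seroconversion comparison above (36/45 vaccinees vs. 4/21 placebo controls, p<0.001) is a 2x2 contingency comparison. A minimal pure-Python two-sided Fisher's exact test reproduces an association of this strength; this is an illustrative re-computation, not necessarily the trial's prespecified analysis.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    r1, k = a + b, a + c              # first row total, first column total
    def p_table(x):                   # hypergeometric probability of a table with cell (1,1) = x
        return comb(r1, x) * comb(n - r1, k - x) / comb(n, k)
    p_obs = p_table(a)
    lo, hi = max(0, k - (n - r1)), min(r1, k)
    # Sum probabilities of all tables at least as extreme as the observed one.
    return sum(p for p in (p_table(x) for x in range(lo, hi + 1))
               if p <= p_obs * (1 + 1e-9))

# Seroconversion counts from the abstract: 36/45 vaccinees vs. 4/21 controls.
p_value = fisher_exact_two_sided(36, 9, 4, 17)
```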
A comparison of machine learning algorithms for the surveillance of autism spectrum disorder.
Lee SH , Maenner MJ , Heilig CM . PLoS One 2019 14 (9) e0222907 OBJECTIVE: The Centers for Disease Control and Prevention (CDC) coordinates a labor-intensive process to measure the prevalence of autism spectrum disorder (ASD) among children in the United States. Random forests methods have shown promise in speeding up this process, but they lag behind human classification accuracy by about 5%. We explore whether more recently available document classification algorithms can close this gap. MATERIALS AND METHODS: Using data gathered from a single surveillance site, we applied 8 supervised learning algorithms to predict whether children meet the case definition for ASD based solely on the words in their evaluations. We compared the algorithms' performance across 10 random train-test splits of the data, using classification accuracy, F1 score, and number of positive calls to evaluate their potential use for surveillance. RESULTS: Across the 10 train-test cycles, the random forest and support vector machine with Naive Bayes features (NB-SVM) each achieved slightly more than 87% mean accuracy. The NB-SVM produced significantly more false negatives than false positives (P = 0.027), but the random forest did not, making its prevalence estimates very close to the true prevalence in the data. The best-performing neural network performed similarly to the random forest on both measures. DISCUSSION: The random forest performed as well as more recently available models like the NB-SVM and the neural network, and it also produced good prevalence estimates. NB-SVM may not be a good candidate for use in a fully-automated surveillance workflow due to increased false negatives. More sophisticated algorithms, like hierarchical convolutional neural networks, may not be feasible to train due to characteristics of the data. 
Current algorithms might perform better if the data are abstracted and processed differently and if they take into account information about the children in addition to their evaluations. CONCLUSION: Deep learning models performed similarly to traditional machine learning methods at predicting the clinician-assigned case status for CDC's autism surveillance system. While deep learning methods had limited benefit in this task, they may have applications in other surveillance systems. |
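The evaluation quantities in the comparison above (accuracy, F1 score, and the balance of false negatives against false positives) all derive from a confusion matrix. A minimal sketch with hypothetical labels (1 = meets the ASD case definition):

```python
def confusion(y_true, y_pred):
    """Return (TP, FP, FN, TN) counts for binary labels."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    return tp, fp, fn, tn

def accuracy(y_true, y_pred):
    tp, fp, fn, tn = confusion(y_true, y_pred)
    return (tp + tn) / len(y_true)

def f1_score(y_true, y_pred):
    tp, fp, fn, _ = confusion(y_true, y_pred)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical clinician labels vs. model predictions.
y_true = [1, 1, 1, 0, 0]
y_pred = [1, 1, 0, 0, 1]
```

The "number of positive calls" criterion in the study corresponds to comparing FP and FN: a model whose false negatives outnumber its false positives (like the NB-SVM here) will understate prevalence.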
Motor vehicle injury prevention in eight American Indian/Alaska Native communities: results from the 2010-2014 Centers for Disease Control and Prevention Tribal Motor Vehicle Injury Prevention Program
Crump CE , Letourneau RJ , Billie H , Zhang X , West B . Public Health 2019 176 29-35 OBJECTIVES: To increase seat belt (SB) use and reduce motor vehicle (MV) injuries and deaths, eight tribal communities implemented evidence-based strategies from the Guide to Community Preventive Services during 2010-2014. STUDY DESIGN: SB use was measured through direct observational surveys and traffic safety activity data. Traffic safety activities included enhanced enforcement campaign events, ongoing enforcement of SB laws, and media. The number of MV injuries (including fatal and non-fatal) was measured through MV crash data collected by police. RESULTS: Percentage change increases in SB use were observed in all eight projects; average annual increases of three projects were statistically significant (ranging from 10% to 43%). Four of the eight projects exceeded their goals for percentage change increases in SB use. Approximately 200 media events and 100 enforcement events focused on SB use were conducted across the eight projects. Five projects had an annual average of ≥100 SB use citations during the project period. MV injuries (fatal and non-fatal combined) significantly decreased in three projects (ranging from a 10% to 21% average annual decrease). CONCLUSIONS: Increases in SB use and decreases in the number of MV injuries can be achieved by tailoring evidence-based strategies to tribal communities. |
SeqSero2: rapid and improved Salmonella serotype determination using whole genome sequencing data.
Zhang S , Den-Bakker HC , Li S , Chen J , Dinsmore BA , Lane C , Lauer AC , Fields PI , Deng X . Appl Environ Microbiol 2019 85 (23) SeqSero, launched in 2015, is a software tool for Salmonella serotype determination from whole genome sequencing (WGS) data. Despite its routine use in public health and food safety laboratories in the United States and other countries, the original SeqSero pipeline is relatively slow (minutes per genome using sequencing reads), is not optimized for draft genome assemblies, and may assign multiple serotypes for a strain. Here we present SeqSero2 (github.com/denglab/SeqSero2; denglab.info/SeqSero2), an algorithmic transformation and functional update of the original SeqSero. Major improvements include: 1) additional sequence markers for identification of Salmonella species and subspecies and certain serotypes; 2) a k-mer based algorithm for rapid serotype prediction from raw reads (seconds per genome) and improved serotype prediction from assemblies; and 3) a targeted assembly approach for specific retrieval of serotype determinants from WGS for serotype prediction, new allele discovery, and prediction troubleshooting. Evaluated using 5,794 genomes representing 364 common US serotypes, including 2,280 human isolates of 117 serotypes from the National Antimicrobial Resistance Monitoring System, SeqSero2 is up to 50 times faster than the original SeqSero while maintaining equivalent accuracy for raw reads and substantially improving accuracy for assemblies. SeqSero2 further suggested that 3% of the tested genomes contained reads from multiple serotypes, indicating a use for contamination detection. In addition to short reads, SeqSero2 demonstrated potential for accurate and rapid serotype prediction directly from long nanopore reads despite base call errors. 
Testing of 40 nanopore-sequenced genomes of 17 serotypes yielded a single H antigen misidentification. IMPORTANCE: Serotyping is the basis of public health surveillance of Salmonella. It remains a first-line subtyping method even as surveillance continues to be transformed by whole genome sequencing. SeqSero allows the integration of Salmonella serotyping into a whole genome sequencing-based laboratory workflow while maintaining continuity with the classic serotyping scheme. SeqSero2, informed by extensive testing and application of SeqSero in the United States and other countries, incorporates important improvements and updates that further strengthen its application in routine and large-scale surveillance of Salmonella by whole genome sequencing. |
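The k-mer idea behind SeqSero2's rapid raw-read prediction can be caricatured as follows: decompose reads into k-mers and score each serotype determinant by the fraction of its k-mers observed in the read set. The 10-bp "marker alleles" and k=5 below are invented for illustration and do not reflect SeqSero2's actual determinant database or scoring scheme.

```python
def kmers(seq, k=5):
    """All overlapping substrings of length k."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def best_marker(reads, markers, k=5):
    """Score each marker allele by the fraction of its k-mers seen in the reads."""
    read_kmers = set()
    for r in reads:
        read_kmers |= kmers(r, k)
    scores = {name: len(kmers(seq, k) & read_kmers) / len(kmers(seq, k))
              for name, seq in markers.items()}
    return max(scores, key=scores.get), scores

# Invented toy marker alleles; the single read fully covers allele_A.
markers = {"allele_A": "ACGTACGTGG", "allele_B": "TTGCATTGCA"}
call, scores = best_marker(["TTACGTACGTGGTT"], markers)
```

Scoring by k-mer containment rather than alignment is what makes this style of prediction run in seconds per genome on raw reads.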
Replicative fitness of seasonal influenza A viruses with decreased susceptibility to baloxavir
Chesnokov A , Patel MC , Mishin VP , De La Cruz JA , Lollis L , Nguyen HT , Dugan V , Wentworth DE , Gubareva LV . J Infect Dis 2019 221 (3) 367-371 Susceptibility of influenza A viruses to baloxavir can be affected by changes at amino acid residue 38 in the polymerase acidic (PA) protein. Information on the replicative fitness of PA-I38-substituted viruses remains sparse. We demonstrated that substitutions I38L/M/S/T had a differential effect not only on baloxavir susceptibility (9- to 116-fold) but also on in vitro replicative fitness. While I38L conferred undiminished growth, the other substitutions led to mild attenuation. In a ferret model, control viruses outcompeted those carrying I38M or I38T substitutions, although their advantage was limited. These findings offer insights into the attributes of baloxavir-resistant viruses needed for informed risk assessment. |
Integration of inflammation, fibrosis, and cancer induced by carbon nanotubes
Dong J , Ma Q . Nanotoxicology 2019 13 (9) 1-31 Carbon nanotubes (CNTs) are nanomaterials with unique physicochemical properties that are targets of great interest for industrial and commercial applications. Notwithstanding, some characteristics of CNTs are associated with adverse outcomes from exposure to pathogenic particulates, raising concerns over health risks in exposed workers and consumers. Indeed, certain forms of CNTs induce a range of harmful effects in laboratory animals, among which inflammation, fibrosis, and cancer are consistently observed for some CNTs. Inflammation, fibrosis, and malignancy are complex pathological processes that, in summation, underlie a major portion of human disease. Moreover, the functional interrelationship among them in disease pathogenesis has been increasingly recognized. The CNT-induced adverse effects resemble certain human disease conditions, such as pneumoconiosis, idiopathic pulmonary fibrosis (IPF), and mesothelioma, to some extent. Progress has been made in understanding CNT-induced pathologic conditions in recent years, demonstrating a close interconnection among inflammation, fibrosis, and cancer. Mechanistically, a number of mediators, signaling pathways, and cellular processes are identified as major mechanisms that underlie the interplay among inflammation, fibrosis, and malignancy, and serve as pathogenic bases for these disease conditions in CNT-exposed animals. These studies indicate that CNT-induced pathological effects, in particular, inflammation, fibrosis, and cancer, are mechanistically, and in some cases, causatively, interrelated. These findings generate new insights into CNT adverse effects and pathogenesis and provide new targets for exposure monitoring and drug development against inflammation, fibrosis, and cancer caused by inhaled nanomaterials. |
Azithromycin susceptibility among Neisseria gonorrhoeae isolates and seasonal macrolide use
Olesen SW , Torrone EA , Papp JR , Kirkcaldy RD , Lipsitch M , Grad YH . J Infect Dis 2019 219 (4) 619-623 Rising azithromycin nonsusceptibility among Neisseria gonorrhoeae isolates threatens current treatment recommendations, but the cause of this rise is not well understood. We performed an ecological study of seasonal patterns in macrolide use and azithromycin resistance in N. gonorrhoeae, finding that population-wide macrolide use is associated with increased azithromycin nonsusceptibility. These results, indicative of bystander selection, have implications for antibiotic prescribing guidelines. |
Monoamine oxidase inhibitory activity of flavoured e-cigarette liquids
Truman P , Stanfill S , Heydari A , Silver E , Fowles J . Neurotoxicology 2019 75 123-128 BACKGROUND AND AIMS: Monoamine oxidase inhibitors have been hypothesised to be important in tobacco dependence, reinforcing the brain's response to nicotine by delaying the degradation of neurotransmitters by monoamine oxidases. The development of electronic cigarettes has provided an alternative nicotine delivery system, which is widely viewed as less toxic than tobacco smoke. However, significant data gaps remain. This paper reports the results of measurements of monoamine oxidase inhibitory activity in a small sample of commercially available, flavoured e-liquids. METHODS: Twelve e-liquids were tested for monoamine oxidase inhibitory activity, using the kynuramine assay and monoamine oxidase enzymes (human, recombinant). Control samples of carrier liquids, propylene glycol and glycerol, and nicotine were also tested. RESULTS: Four e-liquids contained high levels of inhibitory activity, four more were moderately inhibitory. The remaining four e-liquids were mildly inhibitory, while the carrier liquids, and nicotine were inactive at relevant concentrations. The active compounds in the e-liquids were subsequently identified as vanillin and ethyl vanillin. Under some conditions of use, the sampled e-liquids with the highest concentrations of monoamine oxidase inhibitory activity have the potential to expose consumers to physiologically significant levels of MAO inhibitory activity. CONCLUSIONS: While only a small sample of e-liquids was tested, the findings suggest that some flavours have pharmacological actions, with potential to enhance the response to nicotine or to other drugs. The public health implications of these preliminary findings on addiction and smoking cessation warrant exploration and further research. |
Characterization of Monkeypox virus dissemination in the black-tailed prairie dog (Cynomys ludovicianus) through in vivo bioluminescent imaging
Weiner ZP , Salzer JS , LeMasters E , Ellison JA , Kondas AV , Morgan CN , Doty JB , Martin BE , Satheshkumar PS , Olson VA , Hutson CL . PLoS One 2019 14 (9) e0222612 Monkeypox virus (MPXV) is a member of the genus Orthopoxvirus, endemic in Central and West Africa. This viral zoonosis was introduced into the United States in 2003 via African rodents imported for the pet trade and caused 37 human cases, all linked to exposure to MPXV-infected black-tailed prairie dogs (Cynomys ludovicianus). Prairie dogs have since become a useful model of MPXV disease, utilized for testing of potential medical countermeasures. In this study, we used recombinant MPXV containing the firefly luciferase gene (luc) and in vivo imaging technology to characterize MPXV pathogenesis in the black-tailed prairie dog in real time. West African (WA) MPXV could be visualized using in vivo imaging in the nose, lymph nodes, intestines, heart, lung, kidneys, and liver as early as day 6 post infection (p.i.). By day 9 p.i., lesions became visible on the skin and in some cases in the spleen. After day 9 p.i., luminescent signal representing MPXV replication either increased, indicating a progression to what would be a fatal infection, or decreased as infection was resolved. Use of recombinant luc+ MPXV allowed for a greater understanding of how MPXV disseminates throughout the body in prairie dogs during the course of infection. This technology will be used to reduce the number of animals required in future pathogenesis studies as well as aid in determining the effectiveness of potential medical countermeasures. |
An overview of the quality assurance programme for HIV rapid testing in South Africa: Outcome of a 2-year phased implementation of quality assurance program
Woldesenbet SA , Kalou M , Mhlongo D , Kufa T , Makhanya M , Adelekan A , Diallo K , Maleka M , Singh B , Parekh B , Mohlala A , Manyike PT , Tucker TJ , Puren AJ . PLoS One 2019 14 (9) e0221906 OBJECTIVE: This is the first large-scale assessment of the implementation of the HIV Rapid Test Quality Improvement Initiative in South Africa. METHODS: We used a quasi-experimental, one-group post-test-only design. The intervention, implemented starting April 2014, comprised health-care worker training on quality assurance (QA) of HIV rapid testing and enrolment of the facilities in proficiency testing (PT), targeting 2,077 healthcare facilities in 32 high HIV burden districts. Following the intervention, two consecutive rounds of site assessments were undertaken. The first, conducted after a median of 7.5 months following the training, included 1,915 facilities that participated in the QA training, while the second, conducted after a median of one year following the first-round assessment, included 517 (27.0%) of the 1,915 facilities. In both assessments, the Stepwise Process for Improving the Quality of HIV Rapid Testing (SPI-RT) checklist was used to score facilities' performance in 7 domains: training, physical facility, safety, pre-testing, testing, post-testing and external quality assessment. Facilities' level of readiness for national certification was assessed. RESULTS: Between 2016 and 2017, there were four PT cycles. PT participation increased from 32.4% (620/1,915) in 2016 to 91.5% (1,753/1,915) in 2017. In each PT cycle, PT results were returned by 76%-87% of facilities and a satisfactory result (>80%) was achieved by ≥95% of facilities. In the SPI-RT assessment, in round one, 22.3% of facilities were close to or eligible for national certification; this significantly increased to 38.8% in round two (P-value<0.001). The median SPI-RT score for the domains HIV pre-testing (83.3%) and post-testing (72.2%) remained the same between the two rounds. 
The median score for the testing domain increased by 5.6% (to 77.8%). CONCLUSION: Facilities performance on the domains that are critical for accuracy of diagnosis (i.e. pre-testing, testing and post-testing) remained largely unchanged. This study provided several recommendations to improve QA implementation in South Africa, including the need to improve routine use of internal quality control for corrective actions. |
Comparisons of self-reported and measured height and weight, BMI, and obesity prevalence from national surveys: 1999-2016
Flegal KM , Ogden CL , Fryar C , Afful J , Klein R , Huang DT . Obesity (Silver Spring) 2019 27 (10) 1711-1719 OBJECTIVE: The aim of this study was to compare national estimates of self-reported and measured height and weight, BMI, and obesity prevalence among adults from US surveys. METHODS: Self-reported height and weight data came from the National Health and Nutrition Examination Survey (NHANES), the National Health Interview Survey, and the Behavioral Risk Factor Surveillance System for the years 1999 to 2016. Measured height and weight data were available from NHANES. BMI was calculated from height and weight; obesity was defined as BMI ≥30. RESULTS: In all three surveys, mean self-reported height was higher than mean measured height in NHANES for both men and women. Mean BMI from self-reported data was lower than mean BMI from measured data across all surveys. For women, mean self-reported weight, BMI, and obesity prevalence in the National Health Interview Survey and Behavioral Risk Factor Surveillance System were lower than self-report in NHANES. The distribution of BMI was narrower for self-reported than for measured data, leading to lower estimates of obesity prevalence. CONCLUSIONS: Self-reported height, weight, BMI, and obesity prevalence were not identical across the three surveys, particularly for women. Patterns of misreporting of height and weight and their effects on BMI and obesity prevalence are complex. |
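The arithmetic behind these comparisons is simple (BMI = weight in kg divided by height in meters squared, obesity at BMI ≥30), and it shows directly why self-report bias matters: overstating height while understating weight pushes BMI down and can move a person across the obesity threshold. The values below are made up for illustration.

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def obesity_prevalence(people):
    """Fraction of (weight_kg, height_m) records with BMI >= 30."""
    return sum(bmi(w, h) >= 30 for w, h in people) / len(people)

measured = bmi(92.0, 1.75)       # measured values: just above the threshold
self_report = bmi(90.0, 1.78)    # slightly taller, lighter self-report: below it
```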
Lack of in-home piped water and reported consumption of sugar-sweetened beverages among adults in rural Alaska
Mosites E , Seeman S , Fenaughty A , Fink K , Eichelberger L , Holck P , Thomas TK , Bruce MG , Hennessy TW . Public Health Nutr 2019 23 (5) 1-8 OBJECTIVE: To assess whether a community water service is associated with the frequency of sugar-sweetened beverages (SSB) consumption, obesity, or perceived health status in rural Alaska. DESIGN: We examined the cross-sectional associations between community water access and frequency of SSB consumption, body mass index categories, and perceived health status using data from the 2013 and 2015 Alaska Behavioral Risk Factor Surveillance System (BRFSS). Participants were categorized by zip code to 'in-home piped water service' or 'no in-home piped water service' based on water utility data. We evaluated the univariable and multivariable (adjusting for age, household income and education) associations between water service and outcomes using log-linear survey-weighted generalized linear models. SETTING: Rural Alaska, USA. SUBJECTS: Eight hundred and eighty-seven adults, aged 25 years and older. RESULTS: In unadjusted models, participants without in-home water reported consuming SSB more often than participants with in-home water (1.46, 95 % CI: 1.06, 2.00). After adjustment for potential confounders, the effect decreased but remained borderline significant (1.29, 95 % CI: 1.00, 1.67). Obesity was not significantly associated with water service but self-reported poor health was higher in those communities without in-home water (1.63, 95 % CI: 1.05, 2.54). CONCLUSIONS: Not having access to in-home piped water could affect behaviours surrounding SSB consumption and general perception of health in rural Alaska. |
Comparative analyses of workers' compensation claims of injury among temporary and permanent employed workers in Ohio
Al-Tarawneh IS , Wurzelbacher SJ , Bertke SJ . Am J Ind Med 2019 63 (1) 3-22 BACKGROUND: A small but increasing number of studies have examined the risk of injury among temporary workers compared to that among workers in permanent employer arrangements. The purpose of this study was to conduct a comparative analysis of injury risk among temporary and permanent employer workers using a large dataset of workers' compensation (WC) claims of injury. METHODS: Over 1.3 million accepted WC claims in Ohio during the years 2001 to 2013 were analyzed, including 45 046 claims from workers employed by temporary services agencies. General descriptive statistics, injury rates and rate ratios (temporary to permanent workers) were calculated by injury type and event, industry group, and industry manual classes. RESULTS: Injured temporary workers were younger and had less tenure compared to injured permanent workers. Temporary workers had higher injury rates, and lower lost-time and medical costs. Differences in injury rates between temporary and permanent workers varied by injury event, industry, and manual class. CONCLUSION: Temporary workers had higher overall injury rates than permanent workers, controlling for industry manual class. These differences were pronounced for certain industries and injury events. We were not able to control for age and tenure of the worker, so it is not clear how these factors affected observed results. These findings were mostly similar to those from other studies using WC data from the states of Washington and Illinois. Together, these studies provide insights to improve injury prevention among temporary workers, however, additional research is still needed to improve safety and health programming for this group of workers. |
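Injury rates and the temporary-to-permanent rate ratios reported above can be sketched as claims per 100 full-time-equivalent (FTE) workers; the counts below are invented for illustration and do not come from the Ohio claims data.

```python
def injury_rate(claims, fte, per=100):
    """Claims per `per` full-time-equivalent workers."""
    return claims / fte * per

def rate_ratio(claims_temp, fte_temp, claims_perm, fte_perm):
    """Temporary-to-permanent injury rate ratio."""
    return injury_rate(claims_temp, fte_temp) / injury_rate(claims_perm, fte_perm)

# Invented counts: 90 claims among 1,500 temporary FTEs vs.
# 400 claims among 10,000 permanent FTEs.
rr = rate_ratio(claims_temp=90, fte_temp=1_500, claims_perm=400, fte_perm=10_000)
```

A ratio above 1 indicates a higher injury rate among temporary workers, which is the direction the study reports after controlling for industry manual class.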
Evaluating employment quality as a determinant of health in a changing labor market
Peckham T , Fujishiro K , Hajat A , Flaherty BP , Seixas N . RSF 2019 5 (4) 258-281 The shifting nature of employment in recent decades has not been adequately examined from a public health perspective. To that end, traditional models of work and health research need to be expanded to include the relational and contractual aspects of employment that also affect health. We examine the association of three health outcomes with different types of employment in the contemporary U.S. labor market, as measured by a multidimensional construct of employment quality (EQ) derived from latent class analysis. We find that EQ is associated with self-rated health, mental health, and occupational injury. Further, we explore three proposed mediating mechanisms of the EQ-health relationship (material deprivation, employment-related stressors, and occupational risk factors), and find each to be supported by these data. |
Severe silicosis in engineered stone fabrication workers - California, Colorado, Texas, and Washington, 2017-2019
Rose C , Heinzerling A , Patel K , Sack C , Wolff J , Zell-Baran L , Weissman D , Hall E , Sooriash R , McCarthy RB , Bojes H , Korotzer B , Flattery J , Weinberg JL , Potocko J , Jones KD , Reeb-Whitaker CK , Reul NK , LaSee CR , Materna BL , Raghu G , Harrison R . MMWR Morb Mortal Wkly Rep 2019 68 (38) 813-818 Silicosis is an incurable occupational lung disease caused by inhaling particles of respirable crystalline silica. These particles trigger inflammation and fibrosis in the lungs, leading to progressive, irreversible, and potentially disabling disease. Silica exposure is also associated with increased risk for lung infection (notably, tuberculosis), lung cancer, emphysema, autoimmune diseases, and kidney disease (1). Because quartz, a type of crystalline silica, is commonly found in stone, workers who cut, polish, or grind stone materials can be exposed to silica dust. Recently, silicosis outbreaks have been reported in several countries among workers who cut and finish stone slabs for countertops, a process known as stone fabrication (2-5). Most worked with engineered stone, a manufactured, quartz-based composite material that can contain >90% crystalline silica (6). This report describes 18 cases of silicosis, including the first two fatalities reported in the United States, among workers in the stone fabrication industry in California, Colorado, Texas, and Washington. Several patients had severe progressive disease, and some had associated autoimmune diseases and latent tuberculosis infection. Cases were identified through independent investigations in each state and confirmed based on computed tomography (CT) scan of the chest or lung biopsy findings. Silica dust exposure reduction and effective regulatory enforcement, along with enhanced workplace medical and public health surveillance, are urgently needed to address the emerging public health threat of silicosis in the stone fabrication industry. |
Trends in malaria prevalence and health related socioeconomic inequality in rural western Kenya: results from repeated household malaria cross-sectional surveys from 2006 to 2013
Were V , Buff AM , Desai M , Kariuki S , Samuels AM , Phillips-Howard P , Ter Kuile FO , Kachur SP , Niessen LW . BMJ Open 2019 9 (9) e033883 OBJECTIVE: The objective of this analysis was to examine trends in malaria parasite prevalence and related socioeconomic inequalities in malaria indicators from 2006 to 2013 during a period of intensification of malaria control interventions in Siaya County, western Kenya. METHODS: Data were analysed from eight independent annual cross-sectional surveys from a combined sample of 19 315 individuals selected from 7253 households. Study setting was a health and demographic surveillance area of western Kenya. Data collected included demographic factors, household assets, fever and medication use, malaria parasitaemia by microscopy, insecticide-treated bed net (ITN) use and care-seeking behaviour. Households were classified into five socioeconomic status groups and dichotomised into poorest households (poorest 60%) and less poor households (richest 40%). Adjusted prevalence ratios (aPR) were calculated using a multivariate generalised linear model accounting for clustering and Cox proportional hazards models for pooled data assuming constant follow-up time. RESULTS: Overall, malaria infection prevalence was 36.5% and was significantly higher among poorest individuals compared with the less poor (39.9% vs 33.5%, aPR=1.17; 95% CI 1.11 to 1.23) but no change in prevalence over time (trend p value <0.256). Care-seeking (61.1% vs 62.5%, aPR=0.99; 95% CI 0.95 to 1.03) and use of any medication were similar among the poorest and less poor. Poorest individuals were less likely to use Artemether-Lumefantrine or quinine for malaria treatment (18.8% vs 22.1%, aPR=0.81, 95% CI 0.72 to 0.91) while use of ITNs was lower among the poorest individuals compared with less poor (54.8% vs 57.9%; aPR=0.95; 95% CI 0.91 to 0.99), but the difference was negligible. 
CONCLUSIONS: Despite attainment of equity in ITN use over time, socioeconomic inequalities still existed in the distribution of malaria. This might be due to a lower likelihood of treatment with an effective antimalarial and lower use of ITNs by poorest individuals. Additional strategies are necessary to reduce socioeconomic inequities in prevention and control of malaria in endemic areas in order to achieve universal health coverage and sustainable development goals. |
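A crude (unadjusted) prevalence ratio with a Wald confidence interval on the log scale illustrates the kind of comparison reported above; the study's aPRs additionally adjusted for covariates and clustering in a generalised linear model. The counts below are invented to mimic the reported 39.9% vs 33.5% split.

```python
from math import exp, log, sqrt

def prevalence_ratio(a, n1, b, n2, z=1.96):
    """PR of group 1 vs group 2 with a Wald 95% CI on the log scale.

    a/n1 = prevalence in group 1, b/n2 = prevalence in group 2.
    """
    p1, p2 = a / n1, b / n2
    pr = p1 / p2
    se_log_pr = sqrt((1 - p1) / a + (1 - p2) / b)
    lower = exp(log(pr) - z * se_log_pr)
    upper = exp(log(pr) + z * se_log_pr)
    return pr, lower, upper

# Invented counts: 399/1000 infected among the poorest vs 335/1000 among the less poor.
pr, lower, upper = prevalence_ratio(399, 1000, 335, 1000)
```

A lower confidence bound above 1 corresponds to the significantly higher prevalence among the poorest group reported in the abstract.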
Socioeconomic patterns of smoking cessation behavior in low and middle-income countries: Emerging evidence from the Global Adult Tobacco Surveys and International Tobacco Control Surveys
Nargis N , Yong HH , Driezen P , Mbulo L , Zhao L , Fong GT , Thompson ME , Borland R , Palipudi KM , Giovino GA , Thrasher JF , Siahpush M . PLoS One 2019 14 (9) e0220223 INTRODUCTION: Tobacco smoking is often more prevalent among those with lower socio-economic status (SES) in high-income countries, which can be driven by the inequalities in initiation and cessation of smoking. Smoking is a leading contributor to socio-economic disparities in health. To date, the evidence for any socio-economic inequality in smoking cessation is lacking, especially in low- and middle-income countries (LMICs). This study examined the association between cessation behaviours and SES of smokers from eight LMICs. METHODS: Data among former and current adult smokers aged 18 and older came from contemporaneous Global Adult Tobacco Surveys (2008-2011) and the International Tobacco Control Surveys (2009-2013) conducted in eight LMICs (Bangladesh, Brazil, China, India, Mexico, Malaysia, Thailand and Uruguay). Adjusted odds ratios (AORs) of successful quitting in the past year by SES indicators (household income/wealth, education, employment status, and rural-urban residence) were estimated using multivariable logistic regression controlling for socio-demographics and average tobacco product prices. A random effects meta-analysis was used to combine the estimates of AORs pooled across countries and two concurrent surveys for each country. RESULTS: Estimated quit rates among smokers (both daily and occasional) varied widely across countries. Meta-analysis of pooled AORs across countries and data sources indicated that there was no clear evidence of an association between SES indicators and successful quitting. The only exception was employed smokers, who were less likely to quit than their non-employed counterparts, which included students, homemakers, retirees, and the unemployed (pooled AOR approximately 0.8, p<0.10). 
CONCLUSION: The lack of clear evidence of an effect of lower SES on adult cessation behaviour in LMICs suggests that lower-SES smokers are no less successful in their quit attempts than their higher-SES counterparts. Non-employment may be associated with quitting for different reasons across groups: for students it is indicative of younger age and lower nicotine dependence, while for the unemployed and retirees it reflects lower personal disposable income and reduced affordability of tobacco. Raising taxes and prices of tobacco products, thereby lowering their affordability, might be a key strategy for inducing cessation among current smokers and reducing overall tobacco consumption. Because low-SES smokers are more sensitive to price increases, tobacco taxation policy can induce disproportionately larger decreases in tobacco consumption among them and help reduce socio-economic disparities in smoking and consequent health outcomes.
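The random-effects pooling of AORs described in this abstract can be sketched with the standard DerSimonian-Laird estimator, working on the log-odds scale. The per-country log(AOR) values and standard errors below are hypothetical illustrations, not data from the study.

```python
import math

def pool_random_effects(log_ors, ses):
    """DerSimonian-Laird random-effects pooling of log odds ratios."""
    k = len(log_ors)
    w = [1 / se ** 2 for se in ses]  # inverse-variance (fixed-effect) weights
    fe = sum(wi * y for wi, y in zip(w, log_ors)) / sum(w)
    # Cochran's Q and the between-study variance tau^2
    q = sum(wi * (y - fe) ** 2 for wi, y in zip(w, log_ors))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)
    # random-effects weights add tau^2 to each study's variance
    w_re = [1 / (se ** 2 + tau2) for se in ses]
    pooled = sum(wi * y for wi, y in zip(w_re, log_ors)) / sum(w_re)
    se_pooled = math.sqrt(1 / sum(w_re))
    return pooled, se_pooled

# Hypothetical per-country estimates: log(AOR) and its standard error
log_ors = [math.log(0.75), math.log(0.85), math.log(0.80), math.log(0.95)]
ses = [0.15, 0.20, 0.10, 0.25]
pooled, se = pool_random_effects(log_ors, ses)
lo, hi = math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se)
print(f"pooled AOR = {math.exp(pooled):.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

Pooling on the log scale and exponentiating the result and its confidence limits is the usual convention for ratio measures such as AORs.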
Knowledge, attitudes, and practices among veterinarians during an outbreak of canine leptospirosis-Maricopa County, Arizona, 2017
LaFerla Jenni M , Woodward P , Yaglom H , Levy C , Iverson SA , Kretschmer M , Jarrett N , Dooley E , Narang J , Venkat H . Prev Vet Med 2019 172 104779 Leptospirosis, caused by Leptospira spp., is a zoonotic bacterial disease important to both human and animal health. Six pathogenic serovars are currently known to commonly infect and cause disease in dogs in the United States. While canine leptospirosis infection is historically rare in Arizona (≤5 cases reported annually) (ADHS unpublished data), several clusters were reported in Maricopa County (MC) during February 2016-January 2017. Public health officials initiated an outbreak response and developed a knowledge, attitudes, and practices survey for veterinarians. The goals were to determine awareness and general attitudes about canine leptospirosis and to identify gaps in veterinarians' knowledge of treatment and prevention. We distributed a 40-question self-administered online survey to 1058 Arizona Veterinary Medical Association members, made available during February 9-May 15, 2017. We analyzed the results using Pearson's chi-squared or Fisher's exact test; a P-value <0.05 was considered statistically significant. We analyzed 202 complete responses. Veterinarians from 10 (66%) of 15 Arizona counties were represented. MC practices were more likely to stock leptospirosis vaccine (80%) than practices in the other counties combined (58%) (P=0.004). The average composite knowledge score was 24.4 out of 38 (range 12-37, median 24); 49% of respondents demonstrated higher knowledge as defined by the authors, largely in identification of leptospirosis risk factors (86%) and routes of exposure (73%). Fewer than half (45%) of respondents correctly identified the length of time bacteria can be shed in dogs' urine. Eighty-one percent of respondents demonstrated lower knowledge about clinical signs associated with leptospirosis; only 47% of respondents identified eight clinical signs commonly associated with the disease.
Sixty-one percent of MC respondents agreed that leptospirosis is an important canine disease in their geographic area, while only 40% of respondents from other counties agreed (P=0.03). Seventy percent of respondents identified diagnostic testing options. The majority correctly selected infection-control practices in line with recommendations from two national clinical guidelines. More respondents would recommend leptospirosis vaccination for dogs that traveled to or lived in rural areas (87-96%) than for dogs that attended day care or were boarded (63%). We identified opportunities for education, including the local epidemiology of leptospirosis, transmission prevention strategies, vaccine safety, testing, clinical identification, and emerging risk factors. Our findings will help guide the design of educational materials for small animal veterinarians in Arizona regarding recommendations for prevention of animal and human leptospirosis infections; these efforts could also shift the culture of reporting companion animal diseases to improve future One Health collaborations.
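The Pearson chi-squared comparison reported in this abstract (80% vs 58% of practices stocking vaccine, P=0.004) can be illustrated on a 2x2 table. The counts below assume 100 practices per group, a hypothetical denominator, so the resulting p-value will not exactly match the paper's.

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-squared test (1 df, no continuity correction) for the
    2x2 table [[a, b], [c, d]]; returns (statistic, p_value)."""
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # survival function of chi-squared with 1 df, via the complementary
    # error function: P(X > x) = erfc(sqrt(x / 2))
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

# Hypothetical counts matching the reported proportions:
# 80/100 Maricopa County practices stocked vaccine vs 58/100 elsewhere
stat, p = chi2_2x2(80, 20, 58, 42)
print(f"chi2 = {stat:.2f}, p = {p:.4f}")
```

When any expected cell count is small (commonly taken as <5), Fisher's exact test is preferred, which is why the abstract reports using both.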
Epidemiology of West Nile virus in the United States: Implications for arbovirology and public health
Petersen LR . J Med Entomol 2019 56 (6) 1456-1462 Since West Nile virus (WNV) emerged in the United States in 1999, 22,999 human neuroinvasive disease cases have been reported through 2017. These cases have arisen from an estimated seven million human infections. Population incidence is geographically heterogeneous and is highest in the West and Midwest. Upwards of 2% of the population in some jurisdictions may become infected during outbreaks. Before universal screening of the United States blood supply, this high infection incidence, combined with the fact that approximately 75% of those infected remain asymptomatic, translated into a considerable risk of WNV transfusion transmission despite the short duration of viremia following infection. Universal blood donor screening has nearly eliminated the risk of WNV transfusion transmission, but at enormous cost. WNV transmission via transplanted organs carries extremely high morbidity and mortality. Improved vector surveillance and timely, effective response to surveillance data can reduce the impact of WNV and should remain public health priorities.
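The abstract's own figures imply the scale of underascertainment; a quick arithmetic check using only the numbers quoted above:

```python
# Figures quoted in the abstract (reported cases 1999-2017)
neuroinvasive_cases = 22_999
estimated_infections = 7_000_000
asymptomatic_fraction = 0.75

# Total infections implied per reported neuroinvasive case
ratio = estimated_infections / neuroinvasive_cases
print(f"~{ratio:.0f} infections per reported neuroinvasive case")
print(f"~{asymptomatic_fraction * estimated_infections:,.0f} asymptomatic infections")
```

This roughly 300:1 ratio is why seroprevalence-based multipliers, rather than reported case counts, are used to estimate total infection burden.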
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Disease Reservoirs and Vectors
- Environmental Health
- Health Economics
- Healthcare Associated Infections
- Immunity and Immunization
- Informatics
- Injury and Violence
- Laboratory Sciences
- Nutritional Sciences
- Occupational Safety and Health
- Parasitic Diseases
- Substance Use and Abuse
- Veterinary Medicine
- Zoonotic and Vectorborne Diseases
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed:Feb 1, 2024
- Page last updated:Apr 22, 2024
- Powered by CDC PHGKB Infrastructure