Assessing health outcomes, quality of life, and healthcare use among school-age children with asthma
Lozier MJ , Zahran HS , Bailey CM . J Asthma 2018 56 (1) 1-8 OBJECTIVE: Asthma affects six million children in the United States. Most people can control their asthma symptoms with effective care, management, and appropriate medical treatment. Information on the relationship between asthma control and quality of life indicators and health care use among school-age children is limited. METHODS: Using the 2006-2010 combined Behavioral Risk Factor Surveillance System Asthma Call-back Survey child data, we examined asthma control and asthma attack status among school-age (aged 5-17 years) children with asthma from 35 states and the District of Columbia. Multivariable logistic regression models were used to assess whether having uncontrolled asthma and having ≥1 asthma attack affected quality of life (activity limitation and missed school days) and healthcare use (emergency department [ED] visits and hospitalizations). RESULTS: About one-third (36.5%) of the 8,484 respondents with current asthma had uncontrolled asthma and 56.8% reported ≥1 asthma attack in the past year. Having uncontrolled asthma and having ≥1 asthma attack were significantly associated with activity limitation (aPR = 1.43 and 1.74, respectively), missed school (1.45 and 1.68), ED visits (2.05 and 4.78), and hospitalizations (2.38 and 3.64). Long-term control (LTC) medication use was higher among respondents with uncontrolled asthma (61.3%) than respondents with well-controlled asthma (33.5%). CONCLUSIONS: Having uncontrolled asthma is associated with reduced quality of life and increased health care use. However, only 61.3% of respondents with uncontrolled asthma used LTC medications. Increasing use of LTC medications among children with uncontrolled asthma could help improve quality of life and reduce health care use. |
Differences in breast cancer incidence among young women aged 20-49 years by stage and tumor characteristics, age, race, and ethnicity, 2004-2013
Shoemaker ML , White MC , Wu M , Weir HK , Romieu I . Breast Cancer Res Treat 2018 169 (3) 595-606 PURPOSE: Younger women diagnosed with breast cancer have poorer prognoses and higher mortality compared to older women. Young black women have higher incidence rates of breast cancer and more aggressive subtypes than women of other races/ethnicities. In this study, we examined recent trends and variations in breast cancer incidence among young women in the United States. METHODS: Using 2004-2013 National Program of Cancer Registries and Surveillance, Epidemiology, and End Results Program data, we calculated breast cancer incidence rates and trends and examined variations in stage, grade, and tumor subtype by age and race/ethnicity among young women aged 20-49 years. RESULTS: The majority of breast cancer cases occurred in women aged 40-44 and 45-49 years (77.3%). Among women aged < 45 years, breast cancer incidence was highest among black women. Incidence trends increased from 2004 to 2013 for Asian or Pacific Islander (API) women and white women aged 20-34 years. Black, American Indian or Alaska Native, and Hispanic women had higher proportions of cases diagnosed at later stages than white and API women. Black women had a higher proportion of grade III-IV tumors than other racial/ethnic groups. Across all age groups, incidence rates for triple-negative breast cancer were significantly higher in black women than women of other races/ethnicities, and this disparity increased with age. CONCLUSIONS: Breast cancer among young women is a highly heterogeneous disease. Differences in tumor characteristics by age and race/ethnicity suggest opportunities for further research into personal and cultural factors that may influence breast cancer risk among younger women. |
Health-risk behaviors and chronic conditions among adults with inflammatory bowel disease - United States, 2015 and 2016
Xu F , Dahlhamer JM , Zammitti EP , Wheaton AG , Croft JB . MMWR Morb Mortal Wkly Rep 2018 67 (6) 190-195 Inflammatory bowel disease (IBD), which includes Crohn's disease and ulcerative colitis, involves chronic inflammation of the gastrointestinal tract. In 2015, an estimated 3.1 million adults in the United States had ever received a diagnosis of IBD (1). Nationally representative samples of adults with IBD have been unavailable or too small to assess relationships between IBD and other chronic conditions and health-risk behaviors (2). To assess the prevalence of health-risk behaviors and chronic conditions among adults with and without IBD, CDC aggregated survey data from the 2015 and 2016 National Health Interview Survey (NHIS). An estimated 3.1 million (unadjusted lifetime prevalence = 1.3%) U.S. adults had ever received a diagnosis of IBD. Adults with IBD had a significantly lower prevalence of having never smoked cigarettes than did adults without the disease (55.9% versus 63.5%). Adults with IBD had significantly higher prevalences than did those without the disease in the following categories: having smoked and quit (26.0% versus 21.0%); having met neither aerobic nor muscle-strengthening activity guidelines (50.4% versus 45.2%); reporting <7 hours of sleep, on average, during a 24-hour period (38.2% versus 32.2%); and having serious psychological distress (7.4% versus 3.4%). In addition, nearly all of the chronic conditions evaluated were more common among adults with IBD than among adults without IBD. Understanding the health-risk behaviors and prevalence of certain chronic conditions among adults with IBD could inform clinical practice and lead to better disease management. |
Prevalence of obesity among youths by household income and education level of head of household - United States, 2011-2014
Ogden CL , Carroll MD , Fakhouri TH , Hales CM , Fryar CD , Li X , Freedman DS . MMWR Morb Mortal Wkly Rep 2018 67 (6) 186-189 Obesity prevalence varies by income and education level, although patterns might differ among adults and youths (1-3). Previous analyses of national data showed that the prevalence of childhood obesity by income and education of household head varied across race/Hispanic origin groups (4). CDC analyzed 2011-2014 data from the National Health and Nutrition Examination Survey (NHANES) to obtain estimates of childhood obesity prevalence by household income (≤130%, >130% to ≤350%, and >350% of the federal poverty level [FPL]) and head of household education level (high school graduate or less, some college, and college graduate). During 2011-2014, the prevalence of obesity among U.S. youths (persons aged 2-19 years) was 17.0%, and was lower in the highest income group (10.9%) than in the other groups (19.9% and 18.9%) and also lower in the highest education group (9.6%) than in the other groups (18.3% and 21.6%). Continued progress is needed to reduce disparities, a goal of Healthy People 2020. The overall Healthy People 2020 target for childhood obesity prevalence is <14.5% (5). |
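The income groupings used in this analysis are ratios of family income to the federal poverty level (FPL). As an illustrative sketch of the banding (the cut-points come from the abstract; the function name and interface are hypothetical, not NHANES analysis code):

```python
def fpl_group(income_to_fpl_ratio: float) -> str:
    """Classify a household's income-to-FPL ratio into the three
    groups used in the analysis: <=130%, >130% to <=350%, >350%."""
    if income_to_fpl_ratio <= 1.30:
        return "<=130% FPL"
    elif income_to_fpl_ratio <= 3.50:
        return ">130% to <=350% FPL"
    return ">350% FPL"

# Example: a household at 200% of the FPL falls in the middle group
print(fpl_group(2.00))  # -> >130% to <=350% FPL
```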
Adenovirus type 4 respiratory infections among civilian adults, northeastern United States, 2011-2015
Kajon AE , Lamson DM , Bair CR , Lu X , Landry ML , Menegus M , Erdman DD , St George K . Emerg Infect Dis 2018 24 (2) 201-209 Human adenovirus type 4 (HAdV-4) is most commonly isolated in military settings. We conducted detailed molecular characterization on 36 HAdV-4 isolates recovered from civilian adults with acute respiratory disease (ARD) in the northeastern United States during 2011-2015. Specimens came from college students, residents of long-term care facilities or nursing homes, a cancer patient, and young adults without co-morbidities. HAdV-4 genome types 4a1 and 4a2, the variants most frequently detected among US military recruits in basic training before the restoration of vaccination protocols, were isolated in most cases. Two novel a-like variants were recovered from students enrolled at a college in Tompkins County, New York, USA, and a prototype-like variant distinguishable from the vaccine strain was isolated from an 18-year-old woman visiting a physician's office in Ulster County, New York, USA, with symptoms of influenza-like illness. Our data suggest that HAdV-4 might be an underestimated causative agent of ARD among civilian adults. |
Agreement between self-reported and physically verified male circumcision status in Nyanza region, Kenya: Evidence from the TASCO study
Odoyo-June E , Agot K , Mboya E , Grund J , Musingila P , Emusu D , Soo L , Otieno-Nyunya B . PLoS One 2018 13 (2) e0192823 BACKGROUND: Self-reported male circumcision (MC) status is widely used to estimate community prevalence of circumcision, although its accuracy varies in different settings depending on the extent of misreporting. Despite this challenge, self-reported MC status remains essential because it is the most feasible method of collecting MC status data in community surveys. Therefore, its accuracy is an important determinant of the reliability of MC prevalence estimates based on such surveys. We measured the concurrence between self-reported and physically verified MC status among men aged 25-39 years during a baseline household survey for a study to test strategies for enhancing MC uptake by older men in Nyanza region of Kenya. The objective was to determine the accuracy of self-reported MC status in communities where MC for HIV prevention is being rolled out. METHODS: Agreement between self-reported and physically verified MC status was measured among 4,232 men. A structured questionnaire was used to collect data on MC status, followed by physical examination to verify the actual MC status, whose outcome was recorded as fully circumcised (no foreskin), partially circumcised (foreskin extends past the coronal sulcus but covers less than half of the glans) or uncircumcised (foreskin covers half or more of the glans). The sensitivity and specificity of self-reported MC status were calculated using physically verified MC status as the gold standard. RESULTS: Out of 4,232 men, 2,197 (51.9%) reported being circumcised, of whom 99.0% were confirmed to be fully circumcised on physical examination. Among 2,035 men who reported being uncircumcised, 93.7% (1,907/2,035) were confirmed uncircumcised on physical examination. Agreement between self-reported and physically verified MC status was almost perfect, kappa (k) = 98.6% (95% CI, 98.1%-99.1%).
The sensitivity of self-reporting being circumcised was 99.6% (95% CI, 99.2-99.8), while the specificity of self-reporting being uncircumcised was 99.0% (95% CI, 98.4-99.4); neither differed significantly by age group based on chi-square tests. Rates of consenting to physical verification of MC status differed by client characteristics: unemployed men were more likely to consent to physical verification than employed men (odds ratio [OR] = 1.48; 95% CI, 1.30-1.69), and men with post-secondary education were less likely to consent than those with primary education or less (OR = 0.61; 95% CI, 0.51-0.74). CONCLUSIONS: In this Kenyan context, both the sensitivity and specificity of self-reported MC status were high; therefore, MC prevalence estimates based on self-reported MC status should be deemed accurate and applicable for planning. However, MC programs should assess the accuracy of self-reported MC status periodically for any secular changes that may undermine its usefulness for estimating community MC prevalence in their unique settings. |
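For readers wanting the mechanics behind these agreement statistics: sensitivity, specificity, and Cohen's kappa can all be computed from a 2x2 table of self-report versus physical verification. A minimal sketch with made-up counts (not the study's actual data, which also involved a "partially circumcised" category):

```python
def agreement_stats(tp, fp, fn, tn):
    """Sensitivity, specificity, and Cohen's kappa from a 2x2 table.

    tp: self-reported circumcised, verified circumcised
    fp: self-reported circumcised, verified uncircumcised
    fn: self-reported uncircumcised, verified circumcised
    tn: self-reported uncircumcised, verified uncircumcised
    """
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    observed = (tp + tn) / n                # observed agreement
    expected = ((tp + fp) * (tp + fn)       # agreement expected by chance
                + (fn + tn) * (fp + tn)) / n**2
    kappa = (observed - expected) / (1 - expected)
    return sensitivity, specificity, kappa

# Hypothetical counts, chosen to give round numbers:
sens, spec, kappa = agreement_stats(tp=90, fp=10, fn=5, tn=95)
print(round(kappa, 2))  # -> 0.85
```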
Annual estimates of the burden of seasonal influenza in the United States: A tool for strengthening influenza surveillance and preparedness
Rolfes MA , Foppa IM , Garg S , Flannery B , Brammer L , Singleton JA , Burns E , Jernigan D , Olsen SJ , Bresee J , Reed C . Influenza Other Respir Viruses 2018 12 (1) 132-137 BACKGROUND: Estimates of influenza disease burden are broadly useful for public health, helping national and local authorities monitor epidemiologic trends, plan and allocate resources, and promote influenza vaccination. Historically, estimates of the burden of seasonal influenza in the United States, focused mainly on influenza-related mortality and hospitalization, were generated every few years. Since the 2010-2011 influenza season, annual US influenza burden estimates have been generated and expanded to include estimates of influenza-related outpatient medical visits and symptomatic illness in the community. METHODS: We used routinely collected surveillance data, outbreak field investigations, and proportions of people seeking health care from survey results to estimate the number of illnesses, medical visits, hospitalizations, and deaths due to influenza during six influenza seasons (2010-2011 through 2015-2016). RESULTS: We estimate that the number of influenza-related illnesses occurring per season ranged from 9.2 million to 35.6 million, including 140,000 to 710,000 influenza-related hospitalizations. DISCUSSION: These annual efforts have strengthened public health communications products and supported timely assessment of the impact of vaccination through estimates of illness and hospitalizations averted. Additionally, annual estimates of influenza burden have highlighted areas where disease surveillance needs improvement to better support public health decision making for seasonal influenza epidemics as well as future pandemics. |
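The general shape of such burden estimation is to work upward from counted hospitalizations: observed counts are adjusted for under-detection (not everyone hospitalized with influenza is tested), and community illnesses and medical visits are then back-calculated using survey-based proportions. A heavily simplified sketch; every multiplier below is a placeholder value for illustration, not a published CDC parameter:

```python
def burden_from_hospitalizations(observed_hosp,
                                 testing_multiplier,
                                 frac_ill_hospitalized,
                                 frac_seek_care):
    """Back-calculate total illnesses and outpatient visits
    from observed influenza hospitalizations (toy model)."""
    # Correct for under-detection among hospitalized patients
    true_hosp = observed_hosp * testing_multiplier
    # Infer community illnesses from the fraction hospitalized
    illnesses = true_hosp / frac_ill_hospitalized
    # Outpatient visits among the non-hospitalized ill
    visits = (illnesses - true_hosp) * frac_seek_care
    return true_hosp, illnesses, visits

# Placeholder inputs, for illustration only:
hosp, ill, visits = burden_from_hospitalizations(
    observed_hosp=100_000, testing_multiplier=2.0,
    frac_ill_hospitalized=0.01, frac_seek_care=0.40)
print(f"{ill:,.0f} illnesses")  # -> 20,000,000 illnesses
```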
Epidemiology and risk factors for hepatitis C virus infection in a high-prevalence population
Fill MA , Sizemore LA , Rickles M , Cooper KC , Buecker CM , Mullins HL , Hofmeister MG , Abara WE , Foster MA , Asher AK , Schaffner W , Dunn JR , Jones TF , Wester C . Epidemiol Infect 2018 146 (4) 1-7 To understand increasing rates of hepatitis C virus (HCV) infection in Tennessee, we conducted testing, risk factor analysis and a nested case-control study among persons who use drugs. During June-October 2016, HCV testing with risk factor assessment was conducted in sexually transmitted disease clinics, family planning clinics and an addiction treatment facility in eastern Tennessee; data were analysed by using multivariable logistic regression. A nested case-control study was conducted to assess drug-using risks and behaviours among persons who reported intranasal or injection drug use (IDU). Of 4753 persons tested, 397 (8.4%) were HCV-antibody positive. HCV infection was significantly associated with a history of both intranasal and IDU (adjusted odds ratio (aOR) 35.4, 95% confidence interval (CI) 24.1-51.9), IDU alone (aOR 52.7, CI 25.3-109.9), intranasal drug use alone (aOR 2.6, CI 1.8-3.9) and incarceration (aOR 2.7, CI 2.0-3.8). By 4 October 2016, 574 persons with a reported history of drug use had been identified; 63 (11%) were interviewed further. Of 31 persons who used both intranasal and injection drugs, 26 (84%) reported previous intranasal drug use, occurring 1-18 years (median 5.5 years) before their first IDU. Our findings provide evidence that reported IDU, intranasal drug use and incarceration are independent indicators of risk for past or present HCV infection in the study population. |
Human coronavirus circulation in the United States 2014-2017
Killerby ME , Biggs HM , Haynes A , Dahl RM , Mustaquim D , Gerber SI , Watson JT . J Clin Virol 2018 101 52-56 BACKGROUND: Human coronaviruses (HCoVs) -OC43, -229E, -NL63 and -HKU1 cause upper and lower respiratory tract infections. HCoVs are globally distributed and the predominant species may vary by region or year. Prior studies have shown seasonal patterns of HCoV species and annual variation in species prevalence, but national circulation patterns in the US have not yet been described. OBJECTIVES: To describe circulation patterns of HCoVs -OC43, -229E, -NL63 and -HKU1 in the US. STUDY DESIGN: We reviewed real-time reverse transcription polymerase chain reaction (rRT-PCR) test results for HCoV-OC43, -229E, -NL63 and -HKU1 reported to the National Respiratory and Enteric Virus Surveillance System (NREVSS) by U.S. laboratories from July 2014-June 2017. We calculated the total number of tests and percent positive by week. For a subset of HCoV-positive submissions with age and sex of the patient available, we tested for differences in age and sex across the four HCoV species using chi-square and Kruskal-Wallis tests. RESULTS: 117 laboratories reported 854,575 HCoV tests; 2.2% were positive for HCoV-OC43, 1.0% for HCoV-NL63, 0.8% for HCoV-229E, and 0.6% for HCoV-HKU1. The percentage of positive tests peaked during December-March each year. No significant differences in sex were seen across species, although a significant difference in age distribution was noted. CONCLUSIONS: Common HCoVs may have annual peaks of circulation in winter months in the US, and individual HCoVs may show variable circulation from year to year. Different HCoV species may be detected more frequently in different age groups. Further years of data are needed to better understand patterns of activity for HCoVs. |
In what circumstances could nondaily preexposure prophylaxis for HIV substantially reduce program costs?
Mitchell KM , Dimitrov D , Hughes JP , Xia F , Donnell D , Amico KR , Bokoch K , Chitwarakorn A , Bekker LG , Holtz TH , Mannheimer S , Grant RM , Boily MC . AIDS 2018 32 (6) 809-818 OBJECTIVES: To review the main factors influencing the costs of nondaily oral preexposure prophylaxis (PrEP) with tenofovir (±emtricitabine), and to estimate the cost reductions possible with nondaily PrEP compared with daily PrEP for different populations (MSM and heterosexual populations). DESIGN: Systematic review and data triangulation. METHODS: We estimated the required number of tablets/person/week for dosing regimens used in the HPTN 067/ADAPT (daily/time-driven/event-driven) and IPERGAY (on-demand) trials for different patterns of sexual intercourse. Using trial data, and behavioural and cost data obtained through systematic literature reviews, we estimated cost savings because of tablet reductions for nondaily versus daily oral PrEP, assuming 100% adherence. RESULTS: Among different populations being prioritized for PrEP, the median reported number of days of sexual activity varied between 0 and 2 days/week (0-1.5 days/week for MSM, 1-2 days/week for heterosexual populations). With 100% adherence and two or fewer sex-days/week, HPTN 067/ADAPT nondaily regimens reduced the number of tablets/week by more than 40% compared with daily PrEP. PrEP program costs were reduced the most in settings with high drug costs, for example, by 66-69% with event-driven PrEP for French/US populations reporting on average one sex-day/week. CONCLUSION: Nondaily oral PrEP could lower costs substantially (>50%) compared with daily PrEP, particularly in high-income countries. Adherence and efficacy data are needed to determine cost-effectiveness. |
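To make the tablet arithmetic concrete, weekly tablet counts under nondaily dosing can be compared against daily PrEP for a given number of sex-days per week. The dosing rules below are simplified paraphrases of the trial regimens and should be treated as assumptions, not the protocols' full details:

```python
def tablets_per_week(sex_days, regimen):
    """Approximate tablets/week under simplified dosing rules
    (assumed: time-driven = 2 scheduled tablets + 1 post-sex dose
    per sex-day; event-driven = 1 tablet before + 1 after each
    sex-day; on-demand = 4 tablets per isolated sex episode)."""
    rules = {
        "daily": 7,
        "time-driven": 2 + sex_days,
        "event-driven": 2 * sex_days,
        "on-demand": 4 * sex_days,
    }
    return min(rules[regimen], 7)  # never more than daily dosing

def pct_reduction_vs_daily(sex_days, regimen):
    return 100 * (1 - tablets_per_week(sex_days, regimen) / 7)

# One sex-day/week under event-driven dosing:
print(round(pct_reduction_vs_daily(1, "event-driven")))  # -> 71
```

Under these assumptions, two or fewer sex-days/week keeps the event-driven tablet count at or below 4/week, consistent with the abstract's ">40% reduction" finding.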
Influenza-associated pediatric deaths in the United States, 2010-2016
Shang M , Blanton L , Brammer L , Olsen SJ , Fry AM . Pediatrics 2018 141 (4) BACKGROUND: Influenza-associated pediatric deaths became a notifiable condition in the United States in 2004. METHODS: We analyzed deaths in children aged <18 years with laboratory-confirmed influenza virus infection reported to the Centers for Disease Control and Prevention during the 2010-2011 to 2015-2016 influenza seasons. Data were collected with a standard case report form that included demographics, medical conditions, and clinical diagnoses. RESULTS: Overall, 675 deaths were reported. The median age was 6 years (interquartile range: 2-12). The average annual incidence was 0.15 per 100,000 children (95% confidence interval: 0.14-0.16) and was highest among children aged <6 months (incidence: 0.66; 95% confidence interval: 0.53-0.82), followed by children aged 6-23 months (incidence: 0.33; 95% confidence interval: 0.27-0.39). Only 31% (n = 149 of 477) of children aged ≥6 months had received any influenza vaccination. Overall, 65% (n = 410 of 628) of children died within 7 days after symptom onset. Half of the children (n = 327 of 654) had no preexisting medical conditions. Compared with children with preexisting medical conditions, children with none were younger (median: 5 vs 8 years old), less vaccinated (27% vs 36%), more likely to die before hospital admission (77% vs 48%), and had a shorter illness duration (4 vs 7 days; P < .05 for all). CONCLUSIONS: Each year, influenza-associated pediatric deaths are reported. Young children have the highest death rates, especially infants aged <6 months. Increasing vaccination among children, pregnant women, and caregivers of infants may reduce influenza-associated pediatric deaths. |
Mortality estimates among adult patients with severe acute respiratory infections from two sentinel hospitals in southern Arizona, United States, 2010-2014
Barnes SR , Wansaula Z , Herrick K , Oren E , Ernst K , Olsen SJ , Casal MG . BMC Infect Dis 2018 18 (1) 78 BACKGROUND: From October 2010 through February 2016, Arizona conducted surveillance for severe acute respiratory infections (SARI) among adults hospitalized in the Arizona-Mexico border region. There are few accurate mortality estimates in SARI patients, particularly in adults ≥65 years old. The purpose of this study was to generate mortality estimates among SARI patients that include deaths occurring shortly after hospital discharge and to identify risk factors for mortality. METHODS: Patients admitted to two sentinel hospitals between 2010 and 2014 who met the SARI case definition were enrolled. Demographic data were used to link SARI patients to Arizona death certificates. Mortality within 30 days after the date of admission was calculated, and risk factors were identified using logistic regression models. RESULTS: Among 258 SARI patients, 47% were female, 51% were white, non-Hispanic, and 39% were Hispanic. The median age was 63 years (range, 19 to 97 years), and 80% had one or more pre-existing health conditions; 9% died in hospital. Mortality increased to 12% (30/258, a 30% increase) when electronic vital records and a 30-day post-hospitalization time frame were used. Age ≥65 years (OR = 4.0; 95% CI: 1.6-9.9) and intensive care unit admission (OR = 7.4; 95% CI: 3.0-17.9) were independently associated with mortality. CONCLUSION: The use of electronic vital records increased SARI-associated mortality estimates by 30%. These findings may help guide prevention and treatment measures, particularly in high-risk persons in this highly fluid border population. |
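Note that the "30% increase" is a relative change in the number of deaths captured, not the 3-percentage-point arithmetic difference between 9% and 12%. A quick check (the in-hospital count of 23 is inferred from the reported 9% of 258 patients, so treat it as an assumption):

```python
in_hospital_deaths = 23   # assumed: ~9% of the 258 SARI patients
deaths_within_30d = 30    # after linkage to electronic vital records

# Relative increase in captured deaths once post-discharge deaths are linked
relative_increase = 100 * (deaths_within_30d - in_hospital_deaths) / in_hospital_deaths
print(round(relative_increase))  # -> 30
```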
Nasopharyngeal carriage of Streptococcus pneumoniae among HIV-infected and -uninfected children <5 years of age before introduction of pneumococcal conjugate vaccine in Mozambique
Verani JR , Massora S , Acacio S , Dos Santos RT , Vubil D , Pimenta F , Moura I , Whitney CG , Costa MH , Macete E , Matsinhe MB , Carvalho MDG , Sigauque B . PLoS One 2018 13 (2) e0191113 Nasopharyngeal carriage is a precursor for pneumococcal disease and can be useful for evaluating pneumococcal conjugate vaccine (PCV) impact. We studied pre-PCV pneumococcal carriage among HIV-infected and -uninfected children in Mozambique. Between October 2012 and March 2013, we enrolled HIV-infected children aged <5 years presenting for routine care at seven HIV clinics in three sites: Maputo (urban-south), Nampula (urban-north), and Manhica (rural-south). We also enrolled a random sample of HIV-uninfected children aged <5 years from a demographic surveillance site in Manhica. A single nasopharyngeal swab was obtained and cultured following enrichment in Todd Hewitt broth with yeast extract and rabbit serum. Pneumococcal isolates were serotyped by Quellung reaction and multiplex polymerase chain reaction. Factors associated with pneumococcal carriage were examined using logistic regression. Overall pneumococcal carriage prevalence was 80.5% (585/727), with similar prevalences among HIV-infected (81.5%, 339/416) and HIV-uninfected (79.1%, 246/311) children, and across age strata. Among HIV-infected children, after adjusting for recent antibiotic use and hospitalization, there was no significant association between study site and colonization: Maputo (74.8%, 92/123), Nampula (83.7%, 82/98), Manhica (84.6%, 165/195). Among HIV-uninfected children, report of having been born to an HIV-infected mother was not associated with colonization. Among 601 pneumococcal isolates from 585 children, serotypes 19F (13.5%), 23F (13.1%), 6A (9.2%), 6B (6.2%) and 19A (5.2%) were most common. The proportion of serotypes included in the 10- and 13-valent vaccines was 44.9% and 61.7%, respectively, with no significant differences by HIV status or age group.
Overall, 36.9% (n = 268) of children were colonized with a PCV10 serotype and 49.7% (n = 361) with a PCV13 serotype. Pneumococcal carriage was common, with little variation by geographic region, age, or HIV status. PCV10 was introduced in April 2013; ongoing carriage studies will examine the benefits of PCV10 among HIV-infected and -uninfected children. |
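The PCV10/PCV13 coverage figures are simple proportions of the 727 children swabbed, and the arithmetic can be reproduced directly from the counts given in the abstract:

```python
children_swabbed = 727
pcv10_colonized = 268   # colonized with a PCV10-type pneumococcus
pcv13_colonized = 361   # colonized with a PCV13-type pneumococcus

pcv10_pct = 100 * pcv10_colonized / children_swabbed
pcv13_pct = 100 * pcv13_colonized / children_swabbed
print(round(pcv10_pct, 1), round(pcv13_pct, 1))  # -> 36.9 49.7
```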
Persistent pandemic lineages of uropathogenic Escherichia coli in a college community, 1999-2017
Yamaji R , Rubin J , Thys E , Friedman CR , Riley LW . J Clin Microbiol 2018 56 (4) BACKGROUND: The incidence of drug-resistant community-acquired urinary tract infections (CA-UTI) continues to increase worldwide. In 1999-2000, a single lineage of uropathogenic Escherichia coli (UPEC), ST69, caused 51% of trimethoprim-sulfamethoxazole-resistant UTI in a Northern California university community. We compared the clonal distribution of UPEC and its impact on antimicrobial resistance prevalence in the same community during two periods separated by 17 years. METHODS: We analyzed E. coli isolates from urine samples from patients with symptoms of UTI who visited a health service between September 2016 and May 2017 and compared them to UPEC isolates collected similarly between October 1999 and March 2000. Isolates were tested for antimicrobial drug susceptibility and genotyped by multilocus sequence typing. RESULTS: In 1999-2000, strains belonging to ST95, ST127, ST73, ST69, ST131, and ST10 caused 125 (56%) of 225 UTI cases, while in 2016-2017 the same sequence types (STs) caused 148 (64%) of 233 UTI cases. The frequency of ampicillin resistance rose from 24.4% to 41.6% (P<0.001) and that of ciprofloxacin resistance from 0.9% to 5.1% (P<0.003); the six STs accounted for 78.6% and 72.7% of these increases, respectively. CONCLUSION: The prevalence of drug-resistant UTI in this community appears to be largely influenced by a small set of dominant UPEC STs circulating in the same community 17 years apart. Further research to determine the origin and reasons for persistence of these dominant genotypes is necessary to combat antimicrobial-resistant CA-UTI. |
Prevalence and correlates of HIV infection among men who inject drugs in a remote area of Vietnam
Nghiem VT , Bui TC , Nadol PP , Phan SH , Kieu BT , Kling R , Hammett TM . Harm Reduct J 2018 15 (1) 8 BACKGROUND: Lack of information on the HIV epidemic among men who inject drugs (MWID) in northwestern Vietnam, a remote area, may hamper national efforts to control the disease. We examined HIV prevalence, needle-syringe sharing behaviors, and associated factors among MWID in three areas of northwestern Vietnam. METHODS: We used descriptive analysis to report the characteristics, frequency of risk behaviors, and access to healthcare services among the MWID. Univariable logistic regression was used to assess the associations between HIV infection, needle-syringe sharing behaviors, and their independent variables. We further explored these associations in multivariable analyses that included independent variables based on a priori knowledge and on their associations with the dependent variables in univariable analyses (p < 0.25). RESULTS: The HIV prevalence was 37.9%, 16.9%, and 18.5% for Tuan Giao, Bat Xat, and Lao Cai City, respectively, and 25.4% overall. MWID of Thai minority ethnicity were more likely to be HIV-positive (adjusted odds ratio (AOR) 3.55; 95% confidence interval (CI) 1.84-6.87). The rate of needle-syringe sharing in the previous 6 months was approximately 9% among the MWID in Tuan Giao and Lao Cai City, and 27.8% in Bat Xat. Two-thirds of the participants had never undergone HIV testing before this study. Having ever been tested for HIV before this study was not associated with any needle-syringe sharing behaviors. Among the HIV-positive MWID, those who received free clean needles and syringes were less likely to give used needles and syringes to peers (AOR 0.21; 95% CI 0.06-0.79). Going to a "hotspot" in the previous week was associated with increased odds of needle-syringe sharing in multiple subgroups.
CONCLUSION: Our findings on HIV prevalence and testing participation among a subset of MWID in northwestern Vietnam were corroborated by trend analysis results from the most recent HIV/STI Integrated Biological and Behavioral Surveillance report (data last collected in 2013). We provide important insights into these MWID's risky injection behaviors and suggest heightened emphasis on HIV testing and on needle and syringe provision for this population. Policymakers and program implementers should also target hotspots as a main venue for tackling HIV epidemics. |
Update: Influenza activity - United States, October 1, 2017-February 3, 2018
Budd AP , Wentworth DE , Blanton L , Elal AIA , Alabi N , Barnes J , Brammer L , Burns E , Cummings CN , Davis T , Flannery B , Fry AM , Garg S , Garten R , Gubareva L , Jang Y , Kniss K , Kramer N , Lindstrom S , Mustaquim D , O'Halloran A , Olsen SJ , Sessions W , Taylor C , Xu X , Dugan VG , Katz J , Jernigan D . MMWR Morb Mortal Wkly Rep 2018 67 (6) 169-179 Influenza activity in the United States began to increase in early November 2017 and rose sharply from December through February 3, 2018; elevated influenza activity is expected to continue for several more weeks. Influenza A viruses have been most commonly identified, with influenza A(H3N2) viruses predominating, but influenza A(H1N1)pdm09 and influenza B viruses were also reported. This report summarizes U.S. influenza activity during October 1, 2017-February 3, 2018, and updates the previous summary (1). |
Use of long-acting reversible contraception among adolescent and young adult women and receipt of sexually transmitted infection/human immunodeficiency virus-related services
Steiner RJ , Pazol K , Swartzendruber A , Liddon N , Kramer MR , Gaydos LM , Sales JM . J Adolesc Health 2018 62 (4) 417-423 PURPOSE: Long-acting reversible contraceptive (LARC) methods do not require annual clinic visits for continuation, potentially impacting receipt of recommended sexually transmitted infection (STI)/human immunodeficiency virus (HIV) services for young women. We assess service receipt among new and continuing LARC users versus moderately and less effective method users and non-contraceptors. METHODS: Using 2011-2015 National Survey of Family Growth data from sexually active women aged 15-24 years (n = 2,018), we conducted logistic comparisons of chlamydia, any STI and HIV testing, and sexual risk assessment in the past year by current contraceptive type. RESULTS: Less than half of respondents were tested for chlamydia (40.9%), any STI (47.3%), or HIV (25.9%); 66.5% had their sexual risk assessed. Differences in service receipt between new and continuing LARC users as compared with moderately effective method users were not detected in multivariable models, except that continuing LARC users were less likely to be tested for HIV (adjusted prevalence ratio [aPR] = 0.52, 95% confidence interval [CI] = 0.32-0.85). New, but not continuing, LARC users were more likely than less effective method users (aPR = 1.35, 95% CI = 1.03-1.76) and non-contraceptors (aPR = 1.43, 95% CI = 1.11-1.85) to have their sexual risk assessed, although both groups were more likely than non-contraceptors to be tested for chlamydia (new: aPR = 1.52, 95% CI = 1.08-2.15; continuing: aPR = 1.69, 95% CI = 1.24-2.29). CONCLUSIONS: We found little evidence that LARC use was associated with lower prevalence of STI testing. However, new, but not continuing, LARC users were more likely than those not using a method requiring a clinic visit to have had their risk assessed, suggesting that initiating LARC may offer an opportunity to receive services, but one that does not persist with continued use. |
West Nile and St. Louis encephalitis viral genetic determinants of avian host competence
Maharaj PD , Bosco-Lauth AM , Langevin SA , Anishchenko M , Bowen RA , Reisen WK , Brault AC . PLoS Negl Trop Dis 2018 12 (2) e0006302 West Nile virus (WNV) and St. Louis encephalitis virus (SLEV) are enzootically maintained in North America in cycles involving the same mosquito vectors and similar avian hosts. However, these viruses exhibit dissimilar viremia and virulence phenotypes in birds: WNV is associated with high-magnitude viremias that can result in mortality in certain species such as American crows (AMCRs, Corvus brachyrhynchos), whereas SLEV infection yields lower viremias that have not been associated with avian mortality. Cross-neutralization of these viruses in avian sera has been proposed to explain the reduced circulation of SLEV since the introduction of WNV in North America; however, in 2015, both viruses were the etiologic agents of concurrent human encephalitis outbreaks in Arizona, indicating the need to re-evaluate host factors and cross-neutralization responses as factors potentially affecting viral co-circulation. Reciprocal chimeric WNV and SLEV viruses were constructed by interchanging the pre-membrane (prM)-envelope (E) genes, and the resulting viruses were used to inoculate three avian species: house sparrows (HOSPs; Passer domesticus), house finches (Haemorhous mexicanus), and AMCRs. Cross-protective immunity between parental and chimeric viruses was also assessed in HOSPs. Results indicated that the prM-E genes did not modulate avian replication or virulence differences between WNV and SLEV in any of the three avian species. However, WNV prM-E proteins did dictate cross-protective immunity between these antigenically heterologous viruses. Our data provide further evidence of the important role that the WNV and SLEV non-structural genetic elements play in viral replication, avian host competence, and virulence. |
Phylogenetic Evidence for the Existence of Multiple Strains of Rickettsia parkeri in the New World
Nieri-Bastos FA , Marcili A , De Sousa R , Paddock CD , Labruna MB . Appl Environ Microbiol 2018 84 (8) The bacterium Rickettsia parkeri has been reported infecting ticks of the 'Amblyomma maculatum species complex' in the New World, where it causes spotted fever illness in humans. In South America, three additional rickettsial strains, namely Atlantic rainforest, NOD, and Parvitarsum, have been isolated from the ticks Amblyomma ovale, Amblyomma nodosum, and Amblyomma parvitarsum, respectively. These three strains are phylogenetically closely related to R. parkeri, Rickettsia africae, and Rickettsia sibirica. Herein, we performed a robust phylogenetic analysis encompassing 5 genes (gltA, ompA, virB4, dnaA, dnaK) and 3 intergenic spacers (mppE-pur, rrl-rrf-ITS, rpmE-tRNA(fmet)) from 41 rickettsial isolates, including different isolates of R. parkeri, R. africae, R. sibirica, R. conorii, and strains Atlantic rainforest, NOD, and Parvitarsum. In our phylogenetic analyses, all New World isolates grouped in a major clade distinct from the Old World Rickettsia species (R. conorii, R. sibirica, R. africae). This New World clade was subdivided into the following 4 clades: the R. parkeri sensu stricto clade, comprising the type strain Maculatum 20(T) and all other isolates of R. parkeri from North and South America, associated with ticks of the A. maculatum species complex; the strain NOD clade, comprising two South American isolates from A. nodosum ticks; the Parvitarsum clade, comprising two South American isolates from A. parvitarsum ticks; and the strain Atlantic rainforest clade, comprising six South American isolates from the A. ovale species complex (A. ovale or A. aureolatum). On the basis of this evidence, we propose that strains Atlantic rainforest, NOD, and Parvitarsum are South American strains of R.
parkeri. IMPORTANCE: Since the description of Rickettsia parkeri infecting ticks of the 'Amblyomma maculatum species complex' and humans in the New World, three novel, phylogenetically closely related rickettsial isolates have been reported in South America. Herein, we provide genetic evidence that these novel isolates, namely strains Atlantic rainforest, NOD, and Parvitarsum, are South American strains of R. parkeri. Interestingly, each of these R. parkeri strains seems to be primarily associated with a tick species group, namely, R. parkeri sensu stricto with the 'A. maculatum species group', R. parkeri strain NOD with A. nodosum, R. parkeri strain Parvitarsum with A. parvitarsum, and R. parkeri strain Atlantic rainforest with the 'A. ovale species group'. Such rickettsial strain-tick species specificity suggests coevolution of each tick-strain association. Finally, because R. parkeri sensu stricto and R. parkeri strain Atlantic rainforest are human pathogens, the potential of R. parkeri strains NOD and Parvitarsum to be human pathogens cannot be ruled out. |
Integrated vector control of Aedes aegypti mosquitoes around target houses
Barrera R , Amador M , Munoz J , Acevedo V . Parasit Vectors 2018 11 (1) 88 BACKGROUND: Fetuses of pregnant women infected with Zika virus are at high risk of serious birth defects. We applied an Integrated Vector Control (IVC) approach using source reduction, larviciding, and mass trapping with non-insecticidal sticky traps to protect targeted houses by reducing the density of female Aedes aegypti mosquitoes. METHODS: We tested the hypothesis that Ae. aegypti density could be reduced to below three female mosquitoes/trap/week around a target house in the center of a circular area with a 150 m radius using IVC. Two non-adjacent areas within the same neighbourhood were selected and randomly designated as the treatment or control areas. Sentinel Autocidal Gravid Ovitraps (SAGO traps) were placed in each study area and were sampled weekly from May to November, during the 2016 Zika epidemic in Puerto Rico. The experimental design was longitudinal with pre- and post-IVC treatment observations between treatment and control areas, and a partial cross-over design, where IVC was applied to the original control area after 2 months to determine if Ae. aegypti density converged to levels observed in the treatment area. Pools of female Ae. aegypti mosquitoes were analyzed by RT-PCR to detect Zika, dengue and chikungunya virus RNA. RESULTS: Overall, pre-treatment mosquito densities in the inner (0-50 m; 15.6 mosquitoes/trap/week), intermediate (50-100 m; 18.1) and outer rings (100-150 m; 15.6) were reduced after treatment to 2.8, 4.1, and 4.3 in the inner, intermediate, and outer rings, respectively. Density at the target house in the treatment area changed from 27.7 mosquitoes/trap/week before IVC to 2.1 after IVC (92.4% reduction), whereas after treating the original control area (cross-over) density changed from 22.4 to 3.5 (84.3% reduction). Vector reductions were sustained in both areas after IVC. Zika virus was detected in Ae.
aegypti, but the low incidence of the virus precluded assessing the impact of IVC on Zika transmission during the study. CONCLUSIONS: Applying IVC to circular areas that were surrounded by untreated areas significantly decreased the number of mosquitoes around target houses located in the center. Densities of gravid Ae. aegypti females in the center of the 150 m areas fell below threshold levels that may protect against invading arboviruses such as chikungunya and Zika. |
Mosquitoes of northwestern Uganda
Mutebi JP , Crabtree MB , Kading RC , Powers AM , Ledermann JP , Mossel EC , Zeidner N , Lutwama JJ , Miller BR . J Med Entomol 2018 55 (3) 587-599 Despite evidence of arbovirus activity in northwestern Uganda (West Nile Sub-region), there is very limited information on the mosquito fauna of this region. The only published study, conducted in 1950, reported 52 mosquito species in northwestern Uganda, and the information has not been updated in more than 60 yr. In January and June 2011, CO2-baited light traps were used to collect 49,231 mosquitoes from four locations: Paraa (9,487), Chobe (20,025), Sunguru (759), and Rhino Camp (18,960). Overall, 72 mosquito species representing 11 genera were collected. The largest number of distinct species was collected at Chobe (43 species), followed by Paraa (40), Sunguru (34), and Rhino Camp (25). Only eight of the 72 species (11.1%) were collected from all four sites: Aedes (Stegomyia) aegypti formosus (Walker), Anopheles (Cellia) funestus group, Culex (Culex) decens group, Cx. (Culex) neavei Theobald, Cx. (Culex) univittatus Theobald, Cx. (Culiciomyia) cinereus Theobald, Cx. (Oculeomyia) poicilipes (Theobald), and Mansonia (Mansonoides) uniformis (Theobald). Fifty-four species were detected in northwestern Uganda for the first time; however, these species have been detected elsewhere in Uganda and do not represent new introductions to the country. Thirty-three species collected during this study have previously been implicated in the transmission of arboviruses of public health importance. |
Potential impact of antibiotic stewardship programs on overall antibiotic use in adult acute-care hospitals in the United States
Kabbani S , Baggs J , Hicks LA , Srinivasan A . Infect Control Hosp Epidemiol 2018 39 (3) 1-4 We sought to characterize the expected decline in US acute-care antibiotic prescribing resulting from new accreditation standards requiring antibiotic stewardship programs.1 We conducted a narrative review of published literature assessing the impact of antibiotic stewardship program implementation on total antibiotic prescribing in acute-care hospitals in the United States. A PubMed search was performed using the following search strategy: antimicrobial OR antibiotic AND stewardship, from January 1996 to December 2016. In total, 12 articles and 1 abstract that reported the effect of antimicrobial stewardship programs on total antibiotic use in adult US acute-care hospitals were included. The median and interquartile range (IQR) of the decline in antibiotic use observed with implementation of antibiotic stewardship programs were calculated (Table 1). If no significant decline in antibiotic use was noted, the percentage decline was considered to be zero. To quantify the expected national decline in antibiotic use following the implementation of antibiotic stewardship programs, the calculated median and IQR were applied to the 2012 national estimate of adult antibiotic use in acute-care hospitals obtained from the Truven Health MarketScan Hospital Drug Database (HDD).2 |
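The summary calculation described above (pool per-study percentage declines, take the median and IQR, and scale a national baseline) can be sketched in a few lines. The decline values and baseline below are placeholders, not the review's data, which appear in its Table 1 and the HDD estimate:

```python
import statistics

# Hypothetical per-study percentage declines in total antibiotic use
# (illustrative values only; the review's actual figures are in its Table 1)
declines = [0.0, 0.0, 3.5, 6.1, 8.0, 9.8, 11.2, 12.3, 14.7, 17.4, 19.0, 22.5, 26.0]

median_decline = statistics.median(declines)
q1, _, q3 = statistics.quantiles(declines, n=4)  # interquartile range endpoints

# Apply the median decline to a national baseline, as the authors did with
# the 2012 HDD estimate of adult acute-care antibiotic use (placeholder value)
national_baseline = 100_000_000  # e.g., days of therapy -- illustrative
expected_reduction = national_baseline * median_decline / 100
```

Repeating the last step with q1 and q3 gives the plausible range around the expected national decline, which is how an IQR translates into bounds on the projection.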
Pool water quality and prevalence of microbes in filter backwash from metro-Atlanta swimming pools
Murphy JL , Hlavsa MC , Carter BC , Miller C , Jothikumar N , Gerth TR , Beach MJ , Hill VR . J Water Health 2018 16 (1) 87-92 During the 2012 summer swim season, aquatic venue data and filter backwash samples were collected from 127 metro-Atlanta pools. Last-recorded water chemistry measures indicated that 98% (157/161) of samples were from pools with >/=1 mg/L residual chlorine without stabilized chlorine or >/=2 mg/L with stabilized chlorine, and 89% (144/161) had pH readings of 7.2-7.8. These water quality parameters are consistent with the 2016 Model Aquatic Health Code (2nd edition) recommendations. We used previously validated real-time polymerase chain reaction assays to detect seven enteric microbes, including Escherichia coli, as well as Pseudomonas aeruginosa. E. coli was detected in 58% (93/161) of samples, signifying that swimmers likely introduced fecal material into pool water. P. aeruginosa was detected in 59% (95/161) of samples, indicating contamination from swimmers or biofilm growth on surfaces. Cryptosporidium spp. and Giardia duodenalis were each detected in approximately 1% of samples. These findings indicate the need for aquatics staff, state and local environmental health practitioners, and swimmers to each take steps to minimize the risk of transmission of infectious pathogens. |
Sodium hypochlorite dosage for household and emergency water treatment: updated recommendations
Wilhelm N , Kaufmann A , Blanton E , Lantagne D . J Water Health 2018 16 (1) 112-125 Household water treatment with chlorine can improve the microbiological quality of household water and reduce diarrheal disease. We conducted laboratory and field studies to inform chlorine dosage recommendations. In the laboratory, reactors of varying turbidity (10-300 NTU) and total organic carbon (0-25 mg/L addition) were created, spiked with Escherichia coli, and dosed with 3.75 mg/L sodium hypochlorite. All reactors had >4 log reduction of E. coli 24 hours after chlorine addition. In the field, we tested 158 sources in 22 countries for chlorine demand. A 1.88 mg/L dosage for water from improved sources of <5 or <10 NTU turbidity met free chlorine residual criteria (</=2.0 mg/L at 1 hour, >/=0.2 mg/L at 24 hours) 91-94% and 82-87% of the time at 8 and 24 hours, respectively. In unimproved water source samples, a 3.75 mg/L dosage met relaxed criteria (</=4.0 mg/L at 1 hour, >/=0.2 mg/L after 24 hours) 83% and 65% of the time after 8 and 24 hours, respectively. We recommend water from improved/low turbidity sources be dosed at 1.88 mg/L and used within 24 hours, and from unimproved/higher turbidity sources be dosed at 3.75 mg/L and consumed within 8 hours. Further research on field effectiveness of chlorination is recommended. |
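The free chlorine residual (FCR) criteria quoted above are a pair of bounds: an upper limit at 1 hour (strict for improved sources, relaxed for unimproved) and a lower limit at 24 hours. A minimal sketch of the check, assuming only the thresholds stated in the abstract (the function itself is illustrative, not the authors' tool):

```python
def meets_criteria(fcr_1h, fcr_24h, improved_source=True):
    """Check free chlorine residual (FCR, in mg/L) at 1 and 24 hours against
    the criteria quoted above: improved sources use the </=2.0 mg/L bound at
    1 hour, unimproved sources the relaxed </=4.0 mg/L bound; both require
    >/=0.2 mg/L at 24 hours. Illustrative sketch only."""
    upper_1h = 2.0 if improved_source else 4.0
    return fcr_1h <= upper_1h and fcr_24h >= 0.2

# A dose that leaves 1.5 mg/L at 1 h and 0.3 mg/L at 24 h passes for an
# improved source; 2.5 mg/L at 1 h only passes under the relaxed criteria
```

The two-sided structure reflects the trade-off the study measured: enough chlorine to maintain a protective residual for the recommended use window, but not so much that taste and acceptability suffer at 1 hour.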
Temporal patterns in principal Salmonella serotypes in the USA, 1996-2014
Powell MR , Crim SM , Hoekstra RM , Williams MS , Gu W . Epidemiol Infect 2018 146 (4) 1-5 Analysing temporal patterns in foodborne illness is important to designing and implementing effective food safety measures. The reported incidence of illness due to Salmonella in the USA Foodborne Diseases Active Surveillance Network (FoodNet) sites has exhibited no declining trend since 1996; however, there have been significant annual trends among principal Salmonella serotypes, which may exhibit complex seasonal patterns. Data from the original FoodNet sites and penalised cubic B-spline regression are used to estimate temporal patterns in the reported incidence of illness for the top three Salmonella serotypes during 1996-2014. Our results include 95% confidence bands around the estimated annual and monthly curves for each serotype. The results show that Salmonella serotype Typhimurium exhibits a statistically significant declining annual trend and seasonality (P < 0.001) marked by peaks in late summer and early winter. Serotype Enteritidis exhibits a significant annual trend with a higher incidence in later years and seasonality (P < 0.001) marked by a peak in late summer. Serotype Newport exhibits no significant annual trend with significant seasonality (P < 0.001) marked by a peak in late summer. |
Comparative genome analysis reveals a complex population structure of Legionella pneumophila subspecies
Kozak-Muiznieks NA , Morrison SS , Mercante JW , Ishaq MK , Johnson T , Caravas J , Lucas CE , Brown E , Raphael BH , Winchell JM . Infect Genet Evol 2018 59 172-185 The majority of Legionnaires' disease (LD) cases are caused by Legionella pneumophila, a genetically heterogeneous species composed of at least 17 serogroups. Previously, it was demonstrated that L. pneumophila consists of three subspecies: pneumophila, fraseri and pascullei. During an LD outbreak investigation in 2012, we detected that representatives of both subspecies fraseri and pascullei colonized the same water system and that the outbreak-causing strain was a new member of the least represented subspecies pascullei. We used partial sequence based typing consensus patterns to mine an international database for additional representatives of fraseri and pascullei subspecies. As a result, we identified 46 sequence types (STs) belonging to subspecies fraseri and two STs belonging to subspecies pascullei. Moreover, a recent retrospective whole genome sequencing analysis of isolates from New York State LD clusters revealed the presence of a fourth L. pneumophila subspecies that we have termed raphaeli. This subspecies consists of 15 STs. Comparative analysis was conducted using the genomes of multiple members of all four L. pneumophila subspecies. Whereas each subspecies forms a distinct phylogenetic clade within the L. pneumophila species, they share more average nucleotide identity with each other than with other Legionella species. Unique genes for each subspecies were identified and could be used for rapid subspecies detection. Improved taxonomic classification of L. pneumophila strains may help identify environmental niches and virulence attributes associated with these genetically distinct subspecies. |
Whole-Genome Sequence of Human Rhinovirus C47, Isolated from an Adult Respiratory Illness Outbreak in Butte County, California, 2017
Pan CY , Padilla T , Yagi S , Lewis LS , Ng TFF , Marine RL , Nix WA , Wadford DA . Genome Announc 2018 6 (5) Here, we report the full coding sequence of rhinovirus C47 (RV-C47), obtained from a patient respiratory sample collected during an acute respiratory illness investigation in Butte County, California, in January 2017. This is the first whole-genome sequence of RV-C47 to be reported. |
Near-Complete Genome Sequences of Several New Norovirus Genogroup II Genotypes
Chhabra P , Aswath K , Collins N , Ahmed T , Olortegui MP , Kosek M , Cebelinski E , Cooper PJ , Bucardo F , Lopez MR , Castro CJ , Marine RL , Ng TFF , Vinje J . Genome Announc 2018 6 (6) We report here the near-complete genome sequences of 13 norovirus strains detected in stool samples from patients with acute gastroenteritis from Bangladesh, Ecuador, Guatemala, Peru, Nicaragua, and the United States that are classified into one existing genotype (genotype II.22 [GII.22]), three novel genotypes (GII.23, GII.24, and GII.25), and three tentative novel genotypes (GII.NA1, GII.NA2, and GII.NA3). |
Changes in health insurance coverage associated with the Affordable Care Act among adults with and without a cancer history: Population-based national estimates
Davidoff AJ , Guy GP Jr , Hu X , Gonzales F , Han X , Zheng Z , Parsons H , Ekwueme DU , Jemal A . Med Care 2018 56 (3) 220-227 BACKGROUND: The Affordable Care Act (ACA) improved health care coverage accessibility by expanding Medicaid eligibility, creating insurance Marketplaces, and subsidizing premiums. We examine coverage changes associated with ACA implementation, comparing adults with and without a cancer history. METHODS: We included nonelderly adults from the 2012 to 2015 National Health Interview Survey. Using information on state Medicaid policies (2013), expansion decisions (2015), family structure, income, insurance offers, and current coverage, we assigned adults in all 4 years to mutually exclusive eligibility categories, including: Medicaid eligible pre-ACA; expansion eligible for Medicaid; and Marketplace premium subsidy eligible. Linear probability regressions estimated pre-post (2012-2013 vs. 2014-2015) coverage changes by eligibility category, stratified by cancer history. RESULTS: The uninsured rate for cancer survivors decreased from 12.4% to 7.7% (P<0.001) pre-post ACA implementation. Relative to income >400% of the federal poverty guideline, the uninsured rate for cancer survivors decreased by an adjusted 8.4 percentage points [95% confidence interval (CI), 1.3-15.6] among the pre-ACA Medicaid eligible; 16.7 percentage points (95% CI, 9.0-24.5) among the expansion eligible; and 11.3 percentage points (95% CI, -0.8 to 23.5; trend P=0.069) among the premium subsidy eligible. Decreases in the uninsured rate among expansion-eligible adults without a cancer history [9.7 percentage points (95% CI, 7.4-12.0)] were smaller than for cancer survivors (trend P=0.086). Despite coverage gains, approximately 528,000 cancer survivors and 19.1 million adults without a cancer history remained uninsured post-ACA, yet over half were eligible for Medicaid or subsidized Marketplace coverage.
CONCLUSIONS: ACA implementation was associated with large coverage gains in targeted expansion groups, including cancer survivors, but additional progress is needed. |
Epidemiology and outcomes of Clostridium difficile infection in allogeneic hematopoietic cell and lung transplant recipients
Dubberke ER , Reske KA , Olsen MA , Bommarito K , Cleveland AA , Silveira FP , Schuster MG , Kauffman CA , Avery RK , Pappas PG , Chiller TM . Transpl Infect Dis 2018 20 (2) e12855 BACKGROUND: Clostridium difficile infection (CDI) is a common complication of lung and allogeneic hematopoietic cell (HCT) transplant, but the epidemiology and outcomes of CDI after transplant are poorly described. METHODS: We performed a prospective, multicenter study of CDI within 365 days post-allogeneic HCT or lung transplantation. Data were collected via patient interviews and medical chart review. Participants were followed weekly in the 12 weeks post-transplant and while hospitalized, and contacted monthly up to 18 months post-transplantation. RESULTS: Six sites participated in the study, with 614 total participants; 4 enrolled allogeneic HCT recipients (385 participants) and 5 enrolled lung transplant recipients (229 participants). A total of 150 CDI cases occurred within one year of transplantation; the incidence among lung transplant recipients was 13.1% and among allogeneic HCT recipients was 31.2%. Median time to CDI was significantly shorter among allogeneic HCT than lung transplant recipients (27 days vs. 90 days; p=0.037). CDI was associated with significantly higher mortality from 31-180 days post-index date among the allogeneic HCT recipients (hazard ratio [HR]=1.80; p=0.007). There was a trend towards increased mortality among lung transplant recipients from 120-180 days post-index date (HR=4.7, p=0.09). CONCLUSIONS: The epidemiology and outcomes of CDI vary by transplant population; surveillance for CDI should continue beyond the immediate post-transplant period. |
Meningococcal conjugate vaccine safety surveillance in the Vaccine Safety Datalink using a tree-temporal scan data mining method
Li R , Weintraub E , McNeil MM , Kulldorff M , Lewis EM , Nelson J , Xu S , Qian L , Klein NP , Destefano F . Pharmacoepidemiol Drug Saf 2018 27 (4) 391-397 PURPOSE: The objective of our study was to conduct a data mining analysis to identify potential adverse events (AEs) following MenACWY-D using the tree-temporal scan statistic in the Vaccine Safety Datalink population and demonstrate the feasibility of this method in a large distributed safety data setting. METHODS: Traditional pharmacovigilance techniques used in vaccine safety are generally geared to detecting AEs based on pre-defined sets of conditions or diagnoses. Using a newly developed tree-temporal scan statistic data mining method, we performed a pilot study to evaluate the safety profile of the meningococcal conjugate vaccine Menactra(R) (MenACWY-D), screening thousands of potential AE diagnoses and diagnosis groupings. The study cohort included enrolled participants in the Vaccine Safety Datalink aged 11 to 18 years who had received MenACWY-D vaccination(s) between 2005 and 2014. The tree-temporal scan statistic was employed to identify statistical associations (signals) of AEs following MenACWY-D at a 0.05 level of significance, adjusted for multiple testing. RESULTS: We detected signals for 2 groups of outcomes: diseases of the skin and subcutaneous tissue, fever, and urticaria. Both groups are known AEs following MenACWY-D vaccination. We also identified a statistical signal for pleurisy, but further examination suggested it was likely a false signal. No new MenACWY-D safety concerns were raised. CONCLUSIONS: As a pilot study, we demonstrated that the tree-temporal scan statistic data mining method can be successfully applied to screen broadly for a wide range of vaccine-AE associations within a large health care data network. |
Annual changes in rotavirus hospitalization rates before and after rotavirus vaccine implementation in the United States
Shah MP , Dahl RM , Parashar UD , Lopman BA . PLoS One 2018 13 (2) e0191429 BACKGROUND: Hospitalizations for rotavirus and acute gastroenteritis (AGE) have declined in the US with rotavirus vaccination, though biennial peaks in incidence in children aged less than 5 years occur. This pattern may be explained by lower rotavirus vaccination coverage in US children (59% to 73% from 2010-2015), resulting in accumulation of susceptible children over two successive birth cohorts. METHODS: Retrospective cohort analysis of claims data for commercially insured US children aged <5 years. Age-stratified hospitalization rates for rotavirus and for AGE from the 2002-2015 rotavirus seasons were examined. Median age and rotavirus vaccination coverage were compared for biennial rotavirus seasons during pre-vaccine (2002-2005), early post-vaccine (2008-2011), and late post-vaccine (2012-2015) years. RESULTS: Age-stratified hospitalization rates decreased from pre-vaccine to early post-vaccine and then to late post-vaccine years. The clearest biennial pattern in hospitalization rates was in the early post-vaccine period, with higher rates in 2009 and 2011 than in 2008 and 2010. The pattern diminished in the late post-vaccine period. For rotavirus hospitalizations, the median age and the difference in age between biennial seasons were highest during the early post-vaccine period; these differences were not observed for AGE hospitalizations. There was no significant difference in vaccination coverage between biennial seasons. CONCLUSIONS: These observations provide conflicting evidence that incomplete vaccine coverage drove the biennial pattern in rotavirus hospitalizations that has emerged with rotavirus vaccination in the US. As this pattern is diminishing with higher vaccine coverage in recent years, further increases in vaccine coverage may reach a threshold that eliminates peak seasons in hospitalizations. |
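The proposed mechanism behind the biennial peaks (susceptible children accumulating across successive birth cohorts under incomplete coverage) can be illustrated with a toy model. The threshold value and complete-depletion rule below are my illustrative assumptions, not the study's analysis:

```python
def outbreak_years(coverage, years=10, threshold=0.6):
    """Toy model of susceptible accumulation: each year a birth cohort of
    size 1.0 enters and a fraction (1 - coverage) remains susceptible.
    When accumulated susceptibles reach the outbreak threshold, a peak
    season occurs and the susceptible pool is depleted. Threshold and
    depletion rule are illustrative assumptions only."""
    susceptibles = 0.0
    peaks = []
    for year in range(years):
        susceptibles += 1.0 - coverage
        if susceptibles >= threshold:
            peaks.append(year)
            susceptibles = 0.0
    return peaks

# Moderate coverage (65%) takes two birth cohorts to cross the threshold,
# producing a peak every other year; higher coverage (75%) stretches the gap
biennial = outbreak_years(0.65)
longer_gap = outbreak_years(0.75)
```

The qualitative behavior matches the abstract's reasoning: at moderate coverage, peaks recur on a two-year cycle, and raising coverage lengthens (and, at high enough coverage, could effectively eliminate) the cycle.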
Constrained minimization problems for the reproduction number in meta-population models
Poghotanyan G , Feng Z , Glasser JW , Hill AN . J Math Biol 2018 77 1795-1831 The basic reproduction number (R0) can be considerably higher in an SIR model with heterogeneous mixing compared to that from a corresponding model with homogeneous mixing. For example, in the case of measles, mumps and rubella in San Diego, CA, Glasser et al. (Lancet Infect Dis 16(5):599-605, 2016. https://doi.org/10.1016/S1473-3099(16)00004-9 ) reported an increase of 70% in R0 when heterogeneity was accounted for. Meta-population models with simple heterogeneous mixing functions, e.g., proportionate mixing, have been employed to identify optimal vaccination strategies using an approach based on the gradient of the effective reproduction number (Rv), which consists of partial derivatives of Rv with respect to the proportions immune (p_i) in sub-groups i (Feng et al. in J Theor Biol 386:177-187, 2015. https://doi.org/10.1016/j.jtbi.2015.09.006 ; Math Biosci 287:93-104, 2017. https://doi.org/10.1016/j.mbs.2016.09.013 ). These papers consider cases in which an optimal vaccination strategy exists. However, in general, the optimal solution identified using the gradient may not be feasible for some parameter values (i.e., vaccination coverages outside the unit interval). In this paper, we derive the analytic conditions under which the optimal solution is feasible. Explicit expressions for the optimal solutions in the case of n = 2 sub-populations are obtained, and bounds for the optimal solutions are derived for n > 2 sub-populations. This is done for general mixing functions, and examples of proportionate and preferential mixing are presented. Of special significance is the result that for general mixing schemes, both R0 and Rv are bounded below and above by their corresponding expressions when mixing is proportionate and isolated, respectively. |
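The gap between heterogeneous and homogeneous estimates of the basic reproduction number that motivates the paper can be illustrated numerically. Under proportionate mixing, the next-generation matrix is rank one and its spectral radius has a well-known closed form. The sketch below (all parameter values illustrative, not from the paper) checks a power-iteration estimate against that closed form:

```python
def spectral_radius(matrix, iters=100):
    """Dominant eigenvalue of a nonnegative square matrix by power iteration."""
    n = len(matrix)
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(matrix[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(w)             # max-norm of the iterate approximates the eigenvalue
        v = [x / lam for x in w]
    return lam

# Two-group SIR with proportionate mixing; all parameter values illustrative
a = [2.0, 5.0]           # contact (activity) rates per group
N = [800.0, 200.0]       # group sizes
beta, gamma = 0.1, 0.2   # transmission probability per contact, recovery rate
total_activity = sum(ai * Ni for ai, Ni in zip(a, N))

# Next-generation matrix: K[i][j] = new infections in group i per infective in j
K = [[(beta / gamma) * a[j] * a[i] * N[i] / total_activity for j in range(2)]
     for i in range(2)]

r0_heterogeneous = spectral_radius(K)
# Closed form under proportionate mixing: (beta/gamma) * sum(a_i^2 N_i) / sum(a_i N_i)
r0_closed_form = (beta / gamma) * sum(ai * ai * Ni for ai, Ni in zip(a, N)) / total_activity
# Homogeneous mixing uses the population-mean contact rate instead
r0_homogeneous = (beta / gamma) * total_activity / sum(N)
```

Because the variance of the contact rates is positive, the heterogeneous value exceeds the homogeneous one, which is the effect (an increase of 70% in the San Diego example) that the abstract describes.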
Influenza vaccination coverage in children with neurologic disorders and their siblings, July 2006-June 2014
Havers FP , Fry AM , Peacock G , Chen J , Reed C . Pediatr Infect Dis J 2018 37 (8) 814-816 Children with neurologic disorders are at high risk for influenza-associated complications. We identified 184,460 children aged 1-17 years with neurologic disorders and 204,966 siblings in a commercial insurance claims database from July 2006-June 2014. Among children with neurologic disorders, influenza vaccination coverage increased from 22.4% in 2006-07 to 42.3% in 2013-14 but remained suboptimal. A lower proportion of siblings were vaccinated. |
Interim estimates of 2017-18 seasonal influenza vaccine effectiveness - United States, February 2018
Flannery B , Chung JR , Belongia EA , McLean HQ , Gaglani M , Murthy K , Zimmerman RK , Nowalk MP , Jackson ML , Jackson LA , Monto AS , Martin ET , Foust A , Sessions W , Berman L , Barnes JR , Spencer S , Fry AM . MMWR Morb Mortal Wkly Rep 2018 67 (6) 180-185 In the United States, annual vaccination against seasonal influenza is recommended for all persons aged >/=6 months (1). During each influenza season since 2004-05, CDC has estimated the effectiveness of seasonal influenza vaccine to prevent laboratory-confirmed influenza associated with medically attended acute respiratory illness (ARI). This report uses data from 4,562 children and adults enrolled in the U.S. Influenza Vaccine Effectiveness Network (U.S. Flu VE Network) during November 2, 2017-February 3, 2018. During this period, overall adjusted vaccine effectiveness (VE) against influenza A and influenza B virus infection associated with medically attended ARI was 36% (95% confidence interval [CI] = 27%-44%). Most (69%) influenza infections were caused by A(H3N2) viruses. VE was estimated to be 25% (CI = 13% to 36%) against illness caused by influenza A(H3N2) virus, 67% (CI = 54%-76%) against A(H1N1)pdm09 viruses, and 42% (CI = 25%-56%) against influenza B viruses. These early VE estimates underscore the need for ongoing influenza prevention and treatment measures. CDC continues to recommend influenza vaccination because the vaccine can still prevent some infections with currently circulating influenza viruses, which are expected to continue circulating for several weeks. Even with current vaccine effectiveness estimates, vaccination will still prevent influenza illness, including thousands of hospitalizations and deaths. Persons aged >/=6 months who have not yet been vaccinated this season should be vaccinated. |
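The U.S. Flu VE Network estimates vaccine effectiveness with a test-negative design, where VE = (1 - odds ratio) x 100 and the odds ratio compares vaccination odds in influenza-positive versus influenza-negative patients. The network's published estimates are adjusted for covariates such as study site, age, and calendar time; the crude version, with illustrative counts, looks like this:

```python
def vaccine_effectiveness(vax_cases, vax_controls, unvax_cases, unvax_controls):
    """Crude VE from a test-negative design: VE = (1 - odds ratio) x 100.
    Cases are influenza-positive ARI patients; controls are influenza-negative.
    Adjusted estimates would come from a logistic model instead."""
    odds_ratio = (vax_cases * unvax_controls) / (unvax_cases * vax_controls)
    return (1.0 - odds_ratio) * 100.0

# Illustrative counts, not U.S. Flu VE Network data
ve = vaccine_effectiveness(200, 400, 300, 400)
```

A VE near zero means vaccinated and unvaccinated patients test positive at similar rates; the interim estimates above (36% overall, 25% against A(H3N2)) correspond to odds ratios of roughly 0.64 and 0.75.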
Methods for addressing "innocent bystanders" when evaluating safety of concomitant vaccines
Wang SV , Abdurrob A , Spoendlin J , Lewis E , Newcomer SR , Fireman B , Daley MF , Glanz JM , Duffy J , Weintraub ES , Kulldorff M . Pharmacoepidemiol Drug Saf 2018 27 (4) 405-412 PURPOSE: The need to develop methods for studying the safety of childhood immunization schedules has been recognized by the Institute of Medicine and Department of Health and Human Services. The recommended childhood immunization schedule includes multiple vaccines in a single visit. A key concern is safety of concomitant (same day) versus separate day vaccination. This paper addresses a methodological challenge for observational studies using a self-controlled design to investigate the safety of concomitant vaccination. METHODS: We propose a process for distinguishing which of several concomitantly administered vaccines is responsible for increased risk of an adverse event while adjusting for confounding due to relationships between effect modifying risk factors and concomitant vaccine combinations. We illustrate the approach by re-examining the known increase in risk of seizure 7 to 10 days after measles-mumps-rubella (MMR) vaccination and evaluating potential independent or modifying effects of other vaccines. RESULTS: Initial analyses suggested that DTaP had both an independent and potentiating effect on seizure. After accounting for the relationship between age at vaccination and vaccine combination, there was little evidence for increased risk of seizure with same day administration of DTaP and MMR; incidence rate ratio = 1.2 (95% confidence interval, 0.9-1.6), P value = 0.226. CONCLUSION: We have shown that when using a self-controlled design to investigate safety of concomitant vaccination, it can be critically important to adjust for time-invariant effect modifying risk factors, such as age at time of vaccination, which are structurally related to vaccination patterns due to recommended immunization schedules. |
Statin use and risks of influenza-related outcomes among older adults receiving standard-dose or high-dose influenza vaccines through Medicare during 2010-2015
Izurieta HS , Chillarige Y , Kelman JA , Forshee R , Qiang Y , Wernecke M , Ferdinands JM , Lu Y , Wei Y , Xu W , Lu M , Fry A , Pratt D , Shay DK . Clin Infect Dis 2018 67 (3) 378-387 Background: Statins are used to reduce cardiovascular disease risk. Recent studies suggest that statin use may be associated with an increased influenza risk among influenza vaccinees. We used Medicare data to evaluate associations between statins and risks of influenza-related encounters among vaccinees. Methods: In this retrospective cohort study, we identified Medicare beneficiaries aged ≥65 years who received high-dose (HD) or standard-dose (SD) influenza vaccines at pharmacies from 2010-11 through 2014-15. Statin users were matched to non-users by vaccine type, demographics, prior medical encounters, and comorbidities. We used multivariable Poisson models to estimate associations between statin use around the time of vaccination and risk of influenza-related encounters. Study outcomes included influenza-related office visits with a rapid test followed by dispensing of oseltamivir, and influenza-related hospitalizations (including emergency room visits), during high influenza circulation periods. Results: The study included 1,403,651 statin users matched to non-users. Cohorts were well balanced, with standardized mean differences ≤0.03 for all measured covariates. For statin users, compared to non-users, the adjusted relative risk was 1.086 (95% confidence interval [CI] 1.025-1.150) for influenza-related visits and 1.096 (95% CI 1.013-1.185) for influenza-related hospitalizations. The risk difference ranged from -0.02 to 0.23 for influenza-related visits and from -0.04 to 0.13 for hospitalizations, depending on season severity. Results were similar for HD and SD vaccinees, and for non-synthetic and synthetic statin users. 
Conclusions: Among 2.8 million Medicare beneficiaries, these results suggest that statin use around time of vaccination does not substantially affect the risk of influenza-related medical encounters among older adults. |
Immunogenicity of fractional-dose vaccine during a yellow fever outbreak - preliminary report
Ahuka-Mundeke S , Casey RM , Harris JB , Dixon MG , Nsele PM , Kizito GM , Umutesi G , Laven J , Paluku G , Gueye AS , Hyde TB , Sheria GKM , Muyembe-Tanfum JJ , Staples JE . N Engl J Med 2018 381 (5) 444-454 Background In 2016, the response to a yellow fever outbreak in Angola and the Democratic Republic of Congo led to a global shortage of yellow fever vaccine. As a result, a fractional dose of the 17DD yellow fever vaccine (containing one fifth [0.1 ml] of the standard dose) was offered to 7.6 million children 2 years of age or older and nonpregnant adults in a preemptive campaign in Kinshasa. The goal of this study was to assess the immune response to the fractional dose in a large-scale campaign. Methods We recruited participants in four age strata at six vaccination sites. We assessed neutralizing antibody titers against yellow fever virus in blood samples obtained before vaccination and 28 to 35 days after vaccination, using a plaque reduction neutralization test with a 50% cutoff (PRNT50). Participants with a PRNT50 titer of 10 or higher at baseline were considered to be seropositive. Those with a baseline titer of less than 10 who became seropositive at follow-up were classified as having undergone seroconversion. Participants who were seropositive at baseline and who had an increase in the titer by a factor of 4 or more at follow-up were classified as having an immune response. Results Among 716 participants who completed follow-up, 705 (98%; 95% confidence interval [CI], 97 to 99) were seropositive after vaccination. Among 493 participants who were seronegative at baseline, 482 (98%; 95% CI, 96 to 99) underwent seroconversion. Among 223 participants who were seropositive at baseline, 148 (66%; 95% CI, 60 to 72) had an immune response. Lower baseline titers were associated with a higher probability of having an immune response (P<0.001). 
Conclusions A fractional dose of the 17DD yellow fever vaccine was effective at inducing seroconversion in most of the participants who were seronegative at baseline. These findings support the use of fractional-dose vaccination for outbreak control. (Funded by the U.S. Agency for International Development and the Centers for Disease Control and Prevention.). |
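The serological classification rules in the abstract (seropositive at PRNT50 titer ≥10; seroconversion for baseline-seronegative participants who become seropositive; an immune response for baseline-seropositive participants with a ≥4-fold titer rise) can be sketched as a small helper, assuming titers are given as reciprocal dilutions. The function name and return labels are illustrative, not from the study protocol:

```python
def classify_prnt(baseline, followup):
    """Classify a participant's post-vaccination response from PRNT50 titers.

    baseline, followup: PRNT50 titers as reciprocal dilutions.
    Returns 'seroconversion', 'immune response', or 'no response'.
    Illustrative helper based on the thresholds stated in the abstract.
    """
    if baseline < 10:
        # Seronegative at baseline: seroconversion if seropositive at follow-up.
        return "seroconversion" if followup >= 10 else "no response"
    # Seropositive at baseline: immune response requires a >=4-fold titer rise.
    return "immune response" if followup >= 4 * baseline else "no response"
```

For example, a baseline titer of 5 rising to 40 counts as seroconversion, while a baseline of 20 rising only to 40 (a 2-fold rise) does not meet the immune-response criterion.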
Initiation, extension, and termination of RNA synthesis by a paramyxovirus polymerase.
Jordan PC , Liu C , Raynaud P , Lo MK , Spiropoulou CF , Symons JA , Beigelman L , Deval J . PLoS Pathog 2018 14 (2) e1006889 Paramyxoviruses represent a family of RNA viruses causing significant human diseases. These include measles virus, the most infectious virus ever reported, in addition to parainfluenza virus, and other emerging viruses. Paramyxoviruses likely share common replication machinery but their mechanisms of RNA biosynthesis activities and details of their complex polymerase structures are unknown. Mechanistic and functional details of a paramyxovirus polymerase would have sweeping implications for understanding RNA virus replication and for the development of new antiviral medicines. To study paramyxovirus polymerase structure and function, we expressed an active recombinant Nipah virus (NiV) polymerase complex assembled from the multifunctional NiV L protein bound to its phosphoprotein cofactor. NiV is an emerging highly pathogenic virus that causes severe encephalitis and has been declared a global public health concern due to its high mortality rate. Using negative-stain electron microscopy, we demonstrated NiV polymerase forms ring-like particles resembling related RNA polymerases. We identified conserved sequence elements driving recognition of the 3'-terminal genomic promoter by NiV polymerase, and leading to initiation of RNA synthesis, primer extension, and transition to elongation mode. Polyadenylation resulting from NiV polymerase stuttering provides a mechanistic basis for transcription termination. It also suggests a divergent adaptation in promoter recognition between pneumo- and paramyxoviruses. The lack of available antiviral therapy for NiV prompted us to identify the triphosphate forms of R1479 and GS-5734, two clinically relevant nucleotide analogs, as substrates and inhibitors of NiV polymerase activity by delayed chain termination. 
Overall, these findings provide low-resolution structural details and the mechanism of an RNA polymerase from a previously uncharacterized virus family. This work illustrates important functional differences yet remarkable similarities between the polymerases of nonsegmented negative-strand RNA viruses. |
Dynamics of Evolution of Poliovirus Neutralizing Antigenic Sites and Other Capsid Functional Domains during a Large and Prolonged Outbreak.
Shaw J , Jorba J , Zhao K , Iber J , Chen Q , Adu F , Adeniji A , Bukbuk D , Baba M , Henderson E , Dybdahl-Sissoko N , Macdonald S , Weldon WC , Gumede N , Oberste MS , Kew OM , Burns CC . J Virol 2018 92 (9) We followed the dynamics of capsid amino acid replacement among 403 Nigerian outbreak isolates of type 2 circulating vaccine-derived poliovirus (cVDPV2) from 2005 through 2011. Four different functional domains were analyzed: 1) neutralizing antigenic (NAg) sites, 2) residues binding the poliovirus receptor (PVR), 3) VP1 residues 1-32, and 4) the capsid structural core. Amino acid replacements mapped to 37 of 43 positions across all 4 NAg sites; the most variable and polymorphic residues were in NAg sites 2 and 3b. The most divergent of the 120 NAg variants had no more than 5 replacements in all NAg sites, and were still neutralized at titers similar to those of Sabin 2. PVR-binding residues were less variable (25 different variants; 0-2 replacements/isolate; 30/44 invariant positions), with the most variable residues also forming parts of NAg sites 2 and 3a. Residues 1-32 of VP1 were highly variable (133 different variants; 0-6 replacements/isolate; 5/32 invariant positions), with residues 1-18 predicted to form a well-conserved amphipathic helix. Replacement events were dated by mapping them onto the branches of time-scaled phylogenies. Rates of amino acid replacement varied widely across positions and followed no simple substitution model. Replacements into the structural core were the most conservative and were fixed at an overall rate approximately 20-fold lower than rates for the NAg sites and VP1 1-32, and approximately 5-fold lower than the rate for the PVR-binding sites. Only VP1-143-Ile, a non-NAg site surface residue and known attenuation site, appeared to be under strong negative selection. IMPORTANCE: The high rate of poliovirus evolution is offset by strong selection against amino acid replacement at most positions of the capsid. 
Consequently, poliovirus vaccines developed from strains isolated decades ago have been used worldwide to bring wild polioviruses almost to extinction. The apparent antigenic stability of poliovirus obscures a dynamic of continuous change within the neutralizing antigenic (NAg) sites. During seven years of a large outbreak in Nigeria, the circulating type 2 vaccine-derived polioviruses generated 120 different NAg site variants via multiple independent pathways. Nonetheless, overall antigenic evolution was constrained, as no isolate had fixed more than 5 amino acid differences from the Sabin 2 NAg sites, and the most divergent isolates were efficiently neutralized by human immune sera. Evolution elsewhere in the capsid was also constrained. Amino acids binding the poliovirus receptor were strongly conserved, and extensive variation in the VP1 amino terminus still conserved a predicted amphipathic helix. |
Evaluating the recombinant T24H enzyme-linked immunoelectrotransfer blot assay for the diagnosis of neurocysticercosis in a panel of samples from a large community-based randomized control trial in 60 villages in Burkina Faso
Dermauw V , Carabin H , Cisse A , Millogo A , Tarnagda Z , Ganaba R , Noh J , Handali S , Breen K , Richter V , Cisse R , Preux PM , Boncoeur-Martel MP , Winkler AS , Van Hul A , Dorny P , Gabriel S . Am J Trop Med Hyg 2018 98 (2) 565-569 Current guidelines for the diagnosis of neurocysticercosis (NCC) recommend the use of the lentil lectin-bound glycoprotein enzyme-linked immunoelectrotransfer blot assay (LLGP-EITB) as the reference standard for serological testing. In response to the drawbacks involved with the use of the LLGP-EITB, a recombinant T24H antigen (rT24H) EITB assay was developed, with promising results. However, the test has yet to be evaluated among individuals from sub-Saharan Africa (SSA). The aim of the present study was to investigate the performance of the rT24H EITB assay for the detection of NCC cases in a panel of serum samples (N = 366, of which 173 patients presented with epileptic seizures and/or severe chronic headaches, and 193 matched manifestation-free participants) collected as part of a large community-based trial in Burkina Faso. A perfect agreement between the rT24H EITB and the native gp24 (and its homodimer, gp42) LLGP-EITB was found (kappa value of 1.0). Furthermore, among patients with the neurological manifestations of interest who underwent a computed tomography scan, the rT24H EITB and native antigen LLGP-EITB had a comparable ability to correctly identify NCC cases with multiple viable (rT24H: sensitivity: 80.0%), single viable (66.7%), and calcified/degenerating cysts only (25.0%), albeit for multiple viable and calcified cysts, the rT24H estimated sensitivity seemed lower, but more uncertain, than previously reported. The rT24H EITB specificity was high (98.2%) and in line with previous studies. This study confirms the value of the recombinant rT24H EITB as an alternative to the native antigen LLGP-EITB for the diagnosis of NCC in a SSA community setting. |
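The perfect agreement reported above (kappa value of 1.0) is Cohen's kappa over paired positive/negative assay results: observed agreement corrected for the agreement expected by chance. A minimal implementation for two binary raters; any standard statistics package computes the same quantity:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two binary raters.

    a, b: equal-length lists of 0/1 calls (e.g. one per serum sample,
    one list per assay). Illustrative implementation; kappa = 1.0 means
    perfect agreement, 0.0 means agreement no better than chance.
    """
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n   # observed agreement
    pa = sum(a) / n                              # rater A positive rate
    pb = sum(b) / n                              # rater B positive rate
    pe = pa * pb + (1 - pa) * (1 - pb)           # chance agreement
    return 1.0 if pe == 1.0 else (po - pe) / (1 - pe)
```

Two assays that call every sample identically yield kappa = 1.0, matching the agreement the study reports between the rT24H EITB and the LLGP-EITB.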
High-intensity stretch-shortening contraction training modifies responsivity of skeletal muscle in old male rats
Rader EP , Naimo MA , Ensey J , Baker BA . Exp Gerontol 2018 104 118-126 Utilization of high-intensity resistance training to counter age-related sarcopenia is currently debated because of the potential for maladaptation when training design is inappropriate. Training design is problematic because the influence of various loading variables (e.g. contraction mode, repetition number, and training frequency) is still not well characterized at old age. To address this in a precisely controlled manner, we developed a rodent model of high-intensity training consisting of maximally-activated stretch-shortening contractions (SSCs), contractions typical during resistance training. With this model, we determined that at old age, high-repetition SSC training (80 SSCs: 8 sets of 10 repetitions) performed frequently (i.e. 3 days per week) for 4.5 weeks induced strength deficits with no muscle mass gain, while decreasing frequency to 2 days per week promoted increases in muscle mass and muscle quality (i.e. performance normalized to muscle mass). This finding confirmed the popular notion that decreasing training frequency has a robust effect with age. Meanwhile, the influence of other loading variables remains contentious. The aim of the present study was to assess muscle adaptation following modulation of contraction mode and repetition number during high-intensity SSC training. Muscles of young (3-month-old) and old (30-month-old) male rats were exposed to 4.5 weeks of low-repetition static training of 4 isometric (ISO) contractions (i.e. 4 sets of one repetition) 3 days per week or a more moderate-repetition dynamic training of 40 SSCs (i.e. 4 sets of 10 repetitions) 3 days per week. For young rats, performance and muscle mass increased regardless of training protocol. For old rats, no muscle mass adaptation was observed for 4 ISO training, while 40 SSC training induced muscle mass gain without improvement in muscle quality, an outcome distinct from modulating training frequency. 
Muscle mass gain for old rats was accompanied by decreased protein levels of tumor necrosis factor alpha, a mediator of age-related chronic inflammatory signaling, to young levels. These findings suggest that while dynamic high-intensity training with a moderate number of repetitions has a limited capacity for altering muscle quality, such training is a viable strategy for countering age-related inflammatory signaling and modifying muscle mass. |
Impact of enzymatic hydrolysis on the quantification of total urinary concentrations of chemical biomarkers
Dwivedi P , Zhou X , Powell TG , Calafat AM , Ye X . Chemosphere 2018 199 256-262 Human exposure to consumer and personal care product chemicals such as phenols, including parabens and other antimicrobial agents, can be assessed through biomonitoring by quantifying urinary concentrations of the parent chemical or its metabolites, often after hydrolysis of phase II conjugates. Developing suitable analytical methods for the concurrent quantification of multiple exposure biomarkers is challenging because optimal conditions for the hydrolysis of such conjugates (e.g., O-glucuronides, N-glucuronides, sulfates) may differ depending on the biomarker. We evaluated the effectiveness of seven commercial hydrolytic enzymes to simultaneously hydrolyze N-glucuronides (using the antibacterial triclocarban as example compound) and other conjugates (using select phenols and parabens as examples) by using on-line solid phase extraction-high performance liquid chromatography-isotope dilution-tandem mass spectrometry. Incubation (30 min, 55 °C) with a genetically engineered beta-glucuronidase (IMCS, ≥15 units/μL urine) hydrolyzed N-glucuronide triclocarban, but did not fully hydrolyze the conjugates of phenols and parabens. By contrast, incubation (4 h, 37 °C) with solid beta-glucuronidase (Helix pomatia, Type H-1, ≥30 units/μL urine) or liquid beta-glucuronidase/arylsulfatase (Helix pomatia, 30 units/μL urine [i.e., 30 μL/100 μL urine]) in the presence of 100 μL methanol per 100 μL urine completely hydrolyzed N-glucuronide triclocarban and the conjugates of several phenols and parabens, without cleaving the ester bond of the parabens to form p-hydroxybenzoic acid. These results highlight the relevance of method validation procedures that include optimizing the hydrolysis of phase II urinary conjugates (e.g., enzyme type and amount used, reaction time, temperature) to quantify accurately and concurrently multiple exposure biomarkers for biomonitoring purposes. |
Laboratory testing for factor VIII and IX inhibitors in haemophilia: A review
Miller CH . Haemophilia 2018 24 (2) 186-197 Inhibitors are antibodies directed against haemophilia treatment products which interfere with their function. Factor VIII (FVIII) inhibitors in haemophilia A and factor IX (FIX) inhibitors in haemophilia B are significant clinically when they require a change in a patient's treatment regimen. Their persistence may increase morbidity and mortality. Multiple laboratory tests are now available for detecting and understanding inhibitors in haemophilia. Inhibitors are traditionally measured by their interference in clotting or chromogenic factor assays. They may also be detected using immunologic assays, such as enzyme-linked immunosorbent assay or fluorescence immunoassay. Anti-FVIII or anti-FIX antibodies of IgG4 subclass best correlate with the presence of functional inhibitors. Improvements in inhibitor measurement have been recently introduced. Preanalytical heat treatment of patient specimens allows testing of patients without delaying treatment. Use of chromogenic and immunologic assays may aid in identification of false-positive results, which are frequent among low-titre inhibitors. Validated reagent substitutions can be used to reduce assay cost. New methods for defining assay positivity and reporting low-titre inhibitors have been suggested. Challenges remain in the areas of quality control, assay standardization, monitoring of patients undergoing immune tolerance induction therapy and testing in the presence of modified and novel treatment products. |
Standard Reference Material (SRM) 2378 fatty acids in frozen human serum. Certification of a clinical SRM based on endogenous supplementation of polyunsaturated fatty acids
Benner BA Jr , Schantz MM , Powers CD , Schleicher RL , Camara JE , Sharpless KE , Yen JH , Sniegoski LT . Anal Bioanal Chem 2018 410 (9) 2321-2329 Dietary fatty acids can be both beneficial and detrimental to human health depending on the degree and type of saturation. Healthcare providers and research scientists monitor the fatty acid content of human plasma and serum as an indicator of health status and diet. In addition, both the Centers for Disease Control and Prevention (CDC) and the National Institutes of Health - Office of Dietary Supplements are interested in circulating fatty acids (FAs) because they may be predictive of coronary heart disease. The National Institute of Standards and Technology (NIST) provides a wide variety of reference materials (RMs) and Standard Reference Materials® (SRM®s) including blood, serum, plasma, and urine with values assigned for analytes of clinical interest. NIST SRM 2378 Fatty Acids in Frozen Human Serum was introduced in 2015 to help validate methods used for the analysis of FAs in serum, and consists of three different pools of serum acquired from (1) healthy donors who had taken fish oil dietary supplements (at least 1000 mg per day) for at least one month (level 1 material), (2) healthy donors who had taken flaxseed oil dietary supplements (at least 1000 mg per day) for at least one month (level 2 material), and (3) healthy donors eating "normal" diets who had not taken dietary supplements containing fish or plant oils (level 3 material). The use of dietary supplements by donors provided SRMs with natural endogenous ranges of FAs at concentrations observed in human populations. Results from analyses using two methods at NIST, including one involving a novel microwave-assisted acid hydrolysis procedure, and one at the CDC are presented here. 
These results and their respective uncertainties were combined to yield certified values with expanded uncertainties for 12 FAs and reference values with expanded uncertainties for an additional 18 FAs. |
Talaromycosis (penicilliosis) in a cynomolgus macaque
Iverson WO , Karanth S , Wilcox A , Pham CD , Lockhart SR , Nicholson SM . Vet Pathol 2018 55 (4) 300985818758468 A sexually mature Chinese-origin female Macaca fascicularis assigned to the high-dose group in a 26-week toxicology study with an experimental immunomodulatory therapeutic antibody (a CD40L antagonist fusion protein) was euthanized at the scheduled terminal sacrifice on study day 192. The animal was healthy at study initiation and remained clinically normal throughout the study. On study day 141, abnormal clinical pathology changes were found during a scheduled evaluation; splenomegaly was detected on study day 149 and supported by ultrasound examination. At the scheduled necropsy, there was marked splenomegaly with a nodular and discolored appearance. Cytologic examination of a splenic impression smear revealed yeast-like organisms within macrophages. Histologically, there was disseminated systemic granulomatous inflammation with 2- to 3-μm oval, intracytoplasmic yeast-like organisms in multiple organs, identified as Talaromyces (Penicillium) marneffei. This organism, not previously reported as a pathogen in macaques, causes an important opportunistic infection in immunosuppressed humans in specific global geographic locations. |
Brief report: Self-injurious behaviors in preschool children with autism spectrum disorder compared to other developmental delays and disorders
Soke GN , Rosenberg SA , Rosenberg CR , Vasa RA , Lee LC , DiGuiseppi C . J Autism Dev Disord 2018 48 (7) 2558-2566 We compared the prevalence of self-injurious behaviors (SIB) in preschoolers aged 30-68 months with autism spectrum disorder (ASD) (n = 691) versus other developmental delays and disorders (DD) (n = 977) accounting for sociodemographic, cognitive, and medical factors. SIB prevalence was higher in ASD versus all DD [adjusted odds-ratio (aOR) 2.13 (95% confidence interval (95% CI) 1.53, 2.97)]. In subgroup analyses, SIB prevalence was higher in ASD versus DD without ASD symptoms [aOR 4.42 (95% CI 2.66, 7.33)], but was similar between ASD and DD with ASD symptoms [aOR 1.09 (95% CI 0.68, 1.77)]. We confirmed higher prevalence of SIB in ASD versus DD, independent of confounders. In children with DD, SIB prevalence increased with more ASD symptoms. These findings are informative to clinicians, researchers, and policymakers. |
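The adjusted odds ratios above come from multivariable models, but the underlying measure is the odds ratio from a 2x2 exposure-by-outcome table. A sketch of the crude (unadjusted) version with a Wald 95% confidence interval; the counts in the usage example are made up for illustration and are not from the study:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with a Wald 95% CI from a 2x2 table.

    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    Illustrative only: the study's aORs additionally adjust for
    sociodemographic, cognitive, and medical confounders.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

With hypothetical counts of 40/60 exposed cases/non-cases versus 20/80 unexposed, the crude OR is about 2.67, the same order as the aOR of 2.13 reported for ASD versus all DD.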
Impact of Tourette syndrome on school measures in a nationally representative sample
Claussen AH , Bitsko RH , Holbrook JR , Bloomfield J , Giordano K . J Dev Behav Pediatr 2018 39 (4) 335-342 OBJECTIVE: Children with Tourette syndrome (TS) are at risk for a variety of co-occurring conditions and learning and school problems. The purpose of this study was to determine the impact of TS and co-occurring conditions on school measures. METHODS: Parent-reported data from the 2007-2008 and 2011-2012 National Survey of Children's Health were combined (n = 129,353 children aged 6-17 yrs). Parent report of health care provider diagnosis of TS; co-occurring mental, emotional, and behavioral conditions; learning and language conditions; and school measures were assessed. School measures included type of school, individual education plan (IEP), number of school days missed, school problems, doing well in school, doing homework, and repeating a grade. Children with TS were compared with those who never had TS on school measures accounting for co-occurring conditions. RESULTS: After adjusting for demographics, compared with children without TS, children currently with TS were more likely to have an IEP, have a parent contacted about school problems, and not complete homework. After further adjusting for co-occurring conditions, only IEP status remained statistically significant. Compared with children with mild TS, children with moderate or severe TS were more likely to have an IEP, repeat a grade, encounter school problems, and not care about doing well in school. CONCLUSION: Tourette syndrome severity and co-occurring conditions are associated with school challenges and educational service needs. Awareness among health care providers, teachers and parents of the potential challenges related to both TS and co-occurring conditions would help to best support the child's education. |
Influence of family demographic factors on social communication questionnaire scores
Rosenberg SA , Moody EJ , Lee LC , DiGuiseppi C , Windham GC , Wiggins LD , Schieve LA , Ledbetter CM , Levy SE , Blaskey L , Young L , Bernal P , Rosenberg CR , Fallin MD . Autism Res 2018 11 (5) 695-706 This study examined the effect of demographic factors on Social Communication Questionnaire (SCQ) scores in children aged 30-68 months. Diagnoses of ASD were made after a gold standard evaluation that included the Autism Diagnostic Observation Schedule (ADOS) and the Autism Diagnostic Interview-Revised (ADI-R). The relationship of demographic variables to SCQ scores was compared in two source populations: (a) children recruited from clinical and educational sources serving children who have ASD or other developmental disorders (CE) and (b) children recruited from birth certificates to represent the general population (BC). The impact of the demographic variables (child sex, child age, maternal language, maternal ethnicity, maternal education, maternal race, and household income) on total SCQ score was studied to examine their impact on the SCQ's performance. Demographic factors predicting the SCQ total score were used to generate ROCs. Factors that had a significant influence on SCQ performance were identified by examining the area under the ROCs. Optimal SCQ cut-points were generated for significant factors using Youden's Index. Overall, male sex, lower household income, lower maternal education, and Black race predicted higher SCQ scores. In this sample, the most common optimum value for the SCQ cut-point across the different sociodemographic groups was 11. Autism Res 2018. © 2018 International Society for Autism Research, Wiley Periodicals, Inc. LAY SUMMARY: Screeners are used to help identify children who are more likely to have ASD than their peers. Ideally screeners should be accurate for different groups of children and families. This study examined how well the Social Communication Questionnaire (SCQ) predicts ASD. 
We found that male sex, lower household income, lower maternal education and Black race were associated with higher SCQ scores. In this study an SCQ cut-point of 11 worked best across the different sociodemographic groups in our sample. |
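Youden's Index, used above to select optimal SCQ cut-points, picks the threshold maximizing sensitivity + specificity - 1 along the ROC curve. A minimal sketch over hypothetical score/label pairs (the study derived separate cut-points per sociodemographic group from full ROCs):

```python
def youden_cutpoint(scores, labels):
    """Choose the score threshold maximizing Youden's J.

    scores: total questionnaire scores; labels: 1 = case (e.g. ASD), 0 = not.
    A score >= cut-point is treated as screen-positive.
    Returns (best cut-point, J = sensitivity + specificity - 1).
    Illustrative brute-force search over observed score values.
    """
    pos = sum(labels)
    neg = len(labels) - pos
    best_cut, best_j = None, -1.0
    for cut in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= cut and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s < cut and y == 0)
        j = tp / pos + tn / neg - 1  # sensitivity + specificity - 1
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut, best_j
```

On a toy sample constructed so cases score 11 or higher, the search returns 11, echoing the cut-point the study found most common across groups.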
National and state trends in sudden unexpected infant death: 1990-2015
Erck Lambert AB , Parks SE , Shapiro-Mendoza CK . Pediatrics 2018 141 (3) BACKGROUND: Sharp declines in sudden unexpected infant death (SUID) in the 1990s and a diagnostic shift from sudden infant death syndrome (SIDS) to unknown cause and accidental suffocation and strangulation in bed (ASSB) in 1999-2001 have been documented. We examined trends in SUID and SIDS, unknown cause, and ASSB from 1990 to 2015 and compared state-specific SUID rates to identify significant trends that may be used to inform SUID prevention efforts. METHODS: We used data from US mortality files to evaluate national and state-specific SUID rates (deaths per 100,000 live births) for 1990-2015. SUID included infants with an underlying cause of death of SIDS, unknown cause, or ASSB. To examine overall US rates for SUID and SUID subtypes, we calculated the percent change by fitting Poisson regression models. We report state differences in SUID and compared state-specific rates from 2000-2002 to 2013-2015 by calculating the percent change. RESULTS: SUID rates declined from 154.6 per 100,000 live births in 1990 to 92.4 in 2015, declining 44.6% from 1990 to 1998 and 7% from 1999 to 2015. From 1999 to 2015, SIDS rates decreased 35.8%, ASSB rates increased 183.8%, and there was no significant change in unknown cause rates. State-specific SUID rates varied widely, from 41.5 to 184.3 in 2000-2002 and from 33.2 to 202.2 in 2013-2015. CONCLUSIONS: Reductions in SUID rates since 1999 have been minimal, and wide variations in state-specific rates remain. States with significant declines in SUID rates might have SUID risk-reduction programs that could serve as models for other states. |
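The rate and percent-change arithmetic behind these trend comparisons is simple to sketch (the death and birth counts in the example are hypothetical; the study fits Poisson models to obtain statistically tested percent changes):

```python
def rate_per_100k(deaths, live_births):
    """A mortality rate expressed as deaths per 100,000 live births."""
    return deaths / live_births * 100_000

def percent_change(rate_start, rate_end):
    """Percent change between two period rates; negative means a decline."""
    return (rate_end - rate_start) / rate_start * 100
```

Applying `percent_change` to the reported 1990 and 2015 SUID rates (154.6 and 92.4 per 100,000) gives about -40%, consistent with the staged declines described in the abstract.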
A survey of atmospheric monitoring systems in U.S. underground coal mines
Rowland JH III , Harteis SP , Yuan L . Min Eng 2018 70 (2) 37-40 In 1995 and 2003, the U.S. Mine Safety and Health Administration (MSHA) conducted surveys to determine the number of atmospheric monitoring systems (AMS) that were being used in underground coal mines in the United States. The survey reports gave data for the different AMS manufacturers, the different types of equipment monitored, and the different types of gas sensors and their locations. Since the last survey in 2003, MSHA has changed the regulation requirements for early fire detection along belt haulage entries. As of Dec. 31, 2009, point-type heat sensors are prohibited for use for an early fire detection system. Instead, carbon monoxide (CO) sensors are now required. This report presents results from a new survey and examines how the regulation changes have had an impact on the use of CO sensors in underground coal mines in the United States. The locations and parameters monitored by AMS and CO systems are also discussed. |
Possible role of regional variation in allergic contact dermatitis: case report
Jiang A , Harrison JC , Siegel PD , Maibach H . Contact Dermatitis 2018 78 (3) 228-229 A 27-year-old male presented to our dermatitis clinic with 6 months' duration of red oedematous lesions on his ankles. He was previously treated for this suspected allergic contact dermatitis with prednisone (30 mg daily) for 2 weeks, during which time these lesions cleared. However, upon prednisone discontinuation the lesions recurred within several days. |
Pulmonary impairment and risk assessment in a diacetyl-exposed population: Microwave popcorn workers
Park RM , Gilbert SJ , Whittaker C . J Occup Environ Med 2018 60 (6) 496-506 OBJECTIVES: The butter flavoring additive diacetyl (DA) can cause bronchiolitis obliterans (BO) by inhalation. A risk assessment was performed using data from a microwave popcorn manufacturing plant. METHODS: Current employees' medical history and pulmonary function tests, together with air sampling over a 2.7 yr period, were used to analyze FEV1 and FEV1/FVC. The exposure responses for declining pulmonary function and for possible early onset of BO were estimated using multiple regression methods. Several exposure metrics were investigated; benchmark dose and excess lifetime risk of impairment were calculated. RESULTS: Forty-six percent of the population had less than 6 mo exposure to DA. Percent-of-predicted FEV1 declined with cumulative exposure (0.40 per ppm-yr, p < 10), as did percent FEV1/FVC (0.13 per ppm-yr, p = 0.0004). Lifetime respiratory impairment prevalence of one per thousand resulted from 0.005 ppm DA, and one per thousand lifetime incidence of impairment was predicted for 0.002 ppm DA. CONCLUSION: DA exposures, often exceeding 1 ppm in the past, place workers at high risk of pulmonary impairment. |
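The reported exposure-response slope lets one project pulmonary function loss for an exposure scenario, since cumulative exposure is concentration times duration (in ppm-years). A sketch using the reported 0.40 percentage points of percent-of-predicted FEV1 per ppm-yr; the scenario values in the example are hypothetical and this linear projection ignores the covariates in the study's regression models:

```python
def predicted_fev1_decline(slope_per_ppm_yr, ppm, years):
    """Projected drop in percent-of-predicted FEV1 for a cumulative exposure.

    slope_per_ppm_yr: regression coefficient (0.40 reported for diacetyl).
    ppm: average airborne concentration; years: exposure duration.
    Crude linear projection for illustration only.
    """
    cumulative_exposure = ppm * years  # ppm-years
    return slope_per_ppm_yr * cumulative_exposure
```

For instance, ten years at a hypothetical 0.5 ppm would project a 2-percentage-point drop in percent-of-predicted FEV1 under this simple linear model.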
A next-generation sequencing and bioinformatics protocol for Malaria drug Resistance marker Surveillance (MaRS).
Talundzic E , Ravishankar S , Kelly J , Patel D , Plucinski M , Schmedes S , Ljolje D , Clemons B , Madison-Antenucci S , Arguin PM , Lucchi N , Vannberg F , Udhayakumar V . Antimicrob Agents Chemother 2018 62 (4) The recent advances in next-generation sequencing technologies provide a new and effective way of tracking malaria drug-resistant parasites. To take advantage of this technology, an end-to-end Illumina targeted amplicon deep sequencing (TADS) and bioinformatics pipeline for molecular surveillance of drug resistance in P. falciparum, called Malaria Resistance Surveillance (MaRS), was developed. TADS relies on PCR enrichment of genomic regions, specifically target genes of interest, prior to deep sequencing. MaRS enables researchers to simultaneously collect data on allele frequencies of multiple full-length P. falciparum drug resistance genes (crt, mdr1, k13, dhfr, dhps, and cytochrome b) as well as the mitochondrial genome. Information is captured at the individual patient level for both known and potential new single nucleotide polymorphisms associated with drug resistance. The MaRS pipeline was validated using 245 imported malaria cases that were reported to the Centers for Disease Control and Prevention (CDC). The chloroquine-resistant crt CVIET genotype was observed in 42% of samples, the highly pyrimethamine-resistant triple mutant dhfr IRN in 92% of samples, and the sulfadoxine-resistant dhps SGEAA in 26% of samples. The mdr1 NFSND genotype was found in 40% of samples. With the exception of two cases imported from Cambodia, no artemisinin-resistant k13 alleles were identified, and 99% of patients carried parasites susceptible to atovaquone-proguanil. Our goal is to implement MaRS at the CDC for routine surveillance of imported malaria cases in the U.S. and to aid in the adoption of this system in participating state public health laboratories as well as global partners. |
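At its simplest, the allele-frequency reporting that MaRS enables amounts to tallying per-sample haplotype calls across a surveillance set. A sketch assuming one haplotype string per sample, such as the five-residue crt haplotypes mentioned above; the input format is hypothetical, since MaRS itself derives calls from deep-sequencing reads:

```python
from collections import Counter

def allele_frequencies(calls):
    """Haplotype frequencies at one locus from per-sample genotype calls.

    calls: e.g. ['CVIET', 'CVMNK', 'CVIET', ...], one call per sample.
    Returns a dict mapping each haplotype to its fraction of samples.
    Illustrative tally only; not the MaRS pipeline's actual data model.
    """
    counts = Counter(calls)
    total = sum(counts.values())
    return {allele: n / total for allele, n in counts.items()}
```

Running this over a cohort of calls yields the kind of summary quoted in the abstract, such as a given genotype observed in 42% of samples.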
Candidate-gene based GWAS identifies reproducible DNA markers for metabolic pyrethroid resistance from standing genetic variation in East African Anopheles gambiae
Weetman D , Wilding CS , Neafsey DE , Muller P , Ochomo E , Isaacs AT , Steen K , Rippon EJ , Morgan JC , Mawejje HD , Rigden DJ , Okedi LM , Donnelly MJ . Sci Rep 2018 8 (1) 2920 Metabolic resistance to pyrethroid insecticides is widespread in Anopheles mosquitoes and is a major threat to malaria control. DNA markers would aid predictive monitoring of resistance, but few mutations have been discovered outside of insecticide-targeted genes. Isofemale family pools from a wild Ugandan Anopheles gambiae population, from an area where operational pyrethroid failure is suspected, were genotyped using a candidate-gene enriched SNP array. Resistance-associated SNPs were detected in three genes from detoxification superfamilies, in addition to the insecticide target site (the Voltage Gated Sodium Channel gene, Vgsc). The putative associations were confirmed for two of the marker SNPs, in the P450 Cyp4j5 and the esterase Coeae1d by reproducible association with pyrethroid resistance in multiple field collections from Uganda and Kenya, and together with the Vgsc-1014S (kdr) mutation these SNPs explained around 20% of variation in resistance. Moreover, the >20 Mb 2La inversion also showed evidence of association with resistance as did environmental humidity. Sequencing of Cyp4j5 and Coeae1d detected no resistance-linked loss of diversity, suggesting selection from standing variation. Our study provides novel, regionally-validated DNA assays for resistance to the most important insecticide class, and establishes both 2La karyotype variation and humidity as common factors impacting the resistance phenotype. |
Case report: Conjunctival infestation with Thelazia gulosa: A novel agent of human thelaziasis in the United States
Bradbury RS , Breen KV , Bonura EM , Hoyt JW , Bishop HS . Am J Trop Med Hyg 2018 98 (4) 1171-1174 We report a case of thelaziasis in a 26-year-old female, acquired in Oregon. A total of 14 worms were removed from the patient's left eye and were morphologically identified as being Thelazia gulosa. Until now, only two species of Thelazia have been implicated in causing human disease, Thelazia callipaeda in Asia and Europe and occasional reports of Thelazia californiensis from the United States of America. Here, we describe a third, previously unreported parasite of humans, T. gulosa (the cattle eyeworm) as an agent of human thelaziasis and the first reported case of human thelaziasis in North America in over two decades. |
Efficacy and safety of primaquine and methylene blue for prevention of Plasmodium falciparum transmission in Mali: a phase 2, single-blind, randomised controlled trial
Dicko A , Roh ME , Diawara H , Mahamar A , Soumare HM , Lanke K , Bradley J , Sanogo K , Kone DT , Diarra K , Keita S , Issiaka D , Traore SF , McCulloch C , Stone WJR , Hwang J , Muller O , Brown JM , Srinivasan V , Drakeley C , Gosling R , Chen I , Bousema T . Lancet Infect Dis 2018 18 (6) 627-639 BACKGROUND: Primaquine and methylene blue are gametocytocidal compounds that could prevent Plasmodium falciparum transmission to mosquitoes. We aimed to assess the efficacy and safety of primaquine and methylene blue in preventing human to mosquito transmission of P falciparum among glucose-6-phosphate dehydrogenase (G6PD)-normal, gametocytaemic male participants. METHODS: This was a phase 2, single-blind, randomised controlled trial done at the Clinical Research Centre of the Malaria Research and Training Centre (MRTC) of the University of Bamako (Bamako, Mali). We enrolled male participants aged 5-50 years with asymptomatic P falciparum malaria. G6PD-normal participants with gametocytes detected by blood smear were randomised 1:1:1:1 in block sizes of eight, using a sealed-envelope design, to receive either sulfadoxine-pyrimethamine and amodiaquine, sulfadoxine-pyrimethamine and amodiaquine plus a single dose of 0.25 mg/kg primaquine, dihydroartemisinin-piperaquine, or dihydroartemisinin-piperaquine plus 15 mg/kg per day methylene blue for 3 days. Laboratory staff, investigators, and insectary technicians were masked to the treatment group and gametocyte density of study participants. The study pharmacist and treating physician were not masked. Participants could request unmasking. The primary efficacy endpoint, analysed in all infected patients with at least one infectivity measure before and after treatment, was median within-person percentage change in mosquito infectivity 2 and 7 days after treatment, assessed by membrane feeding. This study is registered with ClinicalTrials.gov, number NCT02831023. 
FINDINGS: Between June 27, 2016, and Nov 1, 2016, 80 participants were enrolled and assigned to the sulfadoxine-pyrimethamine and amodiaquine (n=20), sulfadoxine-pyrimethamine and amodiaquine plus primaquine (n=20), dihydroartemisinin-piperaquine (n=20), or dihydroartemisinin-piperaquine plus methylene blue (n=20) groups. Among participants infectious at baseline (54 [68%] of 80), those in the sulfadoxine-pyrimethamine and amodiaquine plus primaquine group (n=19) had a median 100% (IQR 100 to 100) within-person reduction in mosquito infectivity on day 2, a larger reduction than was noted with sulfadoxine-pyrimethamine and amodiaquine alone (n=12; -10.2%, IQR -143.9 to 56.6; p<0.0001). The dihydroartemisinin-piperaquine plus methylene blue (n=11) group had a median 100% (IQR 100 to 100) within-person reduction in mosquito infectivity on day 2, a larger reduction than was noted with dihydroartemisinin-piperaquine alone (n=12; -6.0%, IQR -126.1 to 86.9; p<0.0001). Haemoglobin changes were similar between gametocytocidal arms and their respective controls. After exclusion of blue urine, adverse events were similar across all groups (59 [74%] of 80 participants had 162 adverse events overall, 145 [90%] of which were mild). INTERPRETATION: Adding a single dose of 0.25 mg/kg primaquine to sulfadoxine-pyrimethamine and amodiaquine or 3 days of 15 mg/kg per day methylene blue to dihydroartemisinin-piperaquine was highly efficacious for preventing P falciparum transmission. Both primaquine and methylene blue were well tolerated. FUNDING: Bill & Melinda Gates Foundation, European Research Council. |
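The primary endpoint above, the median within-person percentage change in mosquito infectivity, can be illustrated with a toy calculation; the participant values below are invented, not trial data.

```python
import statistics

def pct_reduction(before, after):
    """Within-person percentage reduction in mosquito infectivity.

    before/after: proportion of membrane-fed mosquitoes infected (0-1)
    at baseline and at the post-treatment time point.
    """
    if before == 0:
        raise ValueError("undefined when baseline infectivity is zero")
    return 100.0 * (before - after) / before

# Hypothetical participants: (baseline, day-2) infectivity proportions.
arm = [(0.30, 0.0), (0.12, 0.0), (0.45, 0.0)]
reductions = [pct_reduction(b, a) for b, a in arm]
print(statistics.median(reductions))  # 100.0
```

A negative value (as in the control arms) indicates infectivity increased relative to baseline for that participant.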
Multiplex serology for impact evaluation of bed net distribution on burden of lymphatic filariasis and four species of human malaria in northern Mozambique
Plucinski MM , Candrinho B , Chambe G , Muchanga J , Muguande O , Matsinhe G , Mathe G , Rogier E , Doyle T , Zulliger R , Colborn J , Saifodine A , Lammie P , Priest JW . PLoS Negl Trop Dis 2018 12 (2) e0006278 BACKGROUND: Universal coverage with long-lasting insecticidal nets (LLINs) is a primary control strategy against Plasmodium falciparum malaria. However, its impact on the three other main species of human malaria and lymphatic filariasis (LF), which share the same vectors in many co-endemic areas, is not as well characterized. The recent development of multiplex antibody detection provides the opportunity for simultaneous evaluation of the impact of control measures on the burden of multiple diseases. METHODOLOGY/PRINCIPAL FINDINGS: Two cross-sectional household surveys at baseline and one year after an LLIN distribution campaign were implemented in Mecuburi and Nacala-a-Velha Districts in Nampula Province, Mozambique. Both districts were known to be endemic for LF; both received mass drug administration (MDA) with antifilarial drugs during the evaluation period. Access to and use of LLINs was recorded, and household members were tested with P. falciparum rapid diagnostic tests (RDTs). Dried blood spots were collected and analyzed for presence of antibodies to three P. falciparum antigens, P. vivax MSP-1(19), P. ovale MSP-1(19), P. malariae MSP-1(19), and three LF antigens. Seroconversion rates were calculated and the association between LLIN use and post-campaign seropositivity was estimated using multivariate regression. The campaign covered 68% (95% CI: 58-77) of the population in Nacala-a-Velha and 46% (37-56) in Mecuburi. There was no statistically significant change in P. falciparum RDT positivity between the two surveys. Population seropositivity at baseline ranged from 31-81% for the P. falciparum antigens, 3-4% for P. vivax MSP-1(19), 41-43% for P. ovale MSP-1(19), 46-56% for P. malariae MSP-1(19), and 37-76% for the LF antigens. 
The seroconversion rate to the LF Bm33 antigen decreased significantly in both districts. The seroconversion rate to P. malariae MSP-1(19) and the LF Wb123 and Bm14 antigens each decreased significantly in one of the two districts. Community LLIN use was associated with a decreased risk of P. falciparum RDT positivity, P. falciparum CSP and LSA-1 seropositivity, and P. malariae MSP-1(19) seropositivity, but not LF antigen seropositivity. CONCLUSIONS/SIGNIFICANCE: The study area noted significant declines in LF seropositivity, but these were not associated with LLIN use. The MDA could have masked any impact of the LLINs on population LF seropositivity. The LLIN campaign did not reach adequately high coverage to decrease P. falciparum RDT positivity, the most common measure of P. falciparum burden. However, the significant decreases in the seroconversion rate to the P. malariae antigen, coupled with an association between community LLIN use and individual-level decreases in seropositivity to P. falciparum and P. malariae antigens, show evidence of impact of the LLIN campaign and highlight the utility of using multiantigenic serological approaches for measuring intervention impact. |
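Seroconversion rates of the kind reported above are typically derived from age-seroprevalence data. A minimal sketch follows, using a simple irreversible catalytic model p(a) = 1 - exp(-λa); the study itself fitted formal serocatalytic models, so this closed-form inversion is an illustrative simplification, and the numbers are invented.

```python
import math

def seroconversion_rate(age_years, seroprevalence):
    """Crude seroconversion rate (force of infection) from one age group.

    Assumes a simple irreversible catalytic model p(a) = 1 - exp(-lambda * a)
    with no seroreversion; solves for lambda given seroprevalence at age a.
    """
    if not 0.0 <= seroprevalence < 1.0:
        raise ValueError("seroprevalence must be in [0, 1)")
    return -math.log(1.0 - seroprevalence) / age_years

# e.g. 39% seropositive by age 10 implies lambda of about 0.05 per year
lam = seroconversion_rate(10.0, 0.39)
```

Comparing such rates before and after a campaign (or across districts) is what supports statements like "the seroconversion rate decreased significantly."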
Feasibility of using pedometers in a state-based surveillance system: 2014 Arizona Behavioral Risk Factor Surveillance System
Florez-Pregonero A , Fulton JE , Dorn JM , Ainsworth BE . J Sport Health Sci 2018 7 (1) 34-41 Background: Despite their utility in assessing ambulatory movement, pedometers have not been used consistently to monitor physical activity in U.S. surveillance systems. This study was designed to determine the feasibility of using pedometers to assess daily steps taken in a sub-sample of adults from Maricopa County who completed the 2014 Arizona Behavioral Risk Factor Surveillance System Survey. Methods: Respondents were sent an Omron HJ324U pedometer, a logbook to record steps taken, and a walking questionnaire. The pedometer was worn for 7 days. Feasibility was assessed for acceptability (interest in study), demand (procedures followed correctly), implementation (time to complete study), and practicality (cost). Results: Acceptability was modest, with 23.9% (830/3476) agreeing to participate. Among those participating (92.9%; 771/830), 50.1% (386/771) returned the logbook. Demand was modest, with 39.3% (303/771) of logbooks returned with valid data. Implementation required 5 months of participant recruitment. The cost to obtain valid step-count data was US$61.60 per person. An average of 6363 ± 3049 steps/day was taken, with most participants classified as sedentary (36.0%) or low active (35.6%). Conclusion: The feasibility of using pedometers in a state-based surveillance system is modest at best. Feasibility may potentially be improved with easy-to-use pedometers where data can be electronically downloaded. |
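The "sedentary" and "low active" labels above come from step-count categories. A sketch of that classification is below, using commonly cited Tudor-Locke style cutpoints; the abstract does not state the study's exact thresholds, so these bounds are an assumption.

```python
def activity_category(steps_per_day):
    """Classify mean daily steps into activity categories.

    Cutpoints are the commonly used Tudor-Locke values (assumed here;
    the paper's exact thresholds are not given in the abstract).
    """
    if steps_per_day < 5000:
        return "sedentary"
    if steps_per_day < 7500:
        return "low active"
    if steps_per_day < 10000:
        return "somewhat active"
    if steps_per_day < 12500:
        return "active"
    return "highly active"

print(activity_category(6363))  # "low active" -- the reported sample mean
```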
Assisted reproductive technology surveillance - United States, 2015
Sunderam S , Kissin DM , Crawford SB , Folger SG , Boulet SL , Warner L , Barfield WD . MMWR Surveill Summ 2018 67 (3) 1-28 PROBLEM/CONDITION: Since the first U.S. infant conceived with assisted reproductive technology (ART) was born in 1981, both the use of ART and the number of fertility clinics providing ART services have increased steadily in the United States. ART includes fertility treatments in which eggs or embryos are handled in the laboratory (i.e., in vitro fertilization [IVF] and related procedures). Although the majority of infants conceived through ART are singletons, women who undergo ART procedures are more likely than women who conceive naturally to deliver multiple-birth infants. Multiple births pose substantial risks for both mothers and infants, including obstetric complications, preterm delivery (<37 weeks), and low birthweight (<2,500 g) infants. This report provides state-specific information for the United States (including the District of Columbia and Puerto Rico) on ART procedures performed in 2015 and compares birth outcomes that occurred in 2015 (resulting from ART procedures performed in 2014 and 2015) with outcomes for all infants born in the United States in 2015. PERIOD COVERED: 2015. DESCRIPTION OF SYSTEM: In 1995, CDC began collecting data on ART procedures performed in fertility clinics in the United States as mandated by the Fertility Clinic Success Rate and Certification Act of 1992 (FCSRCA) (Public Law 102-493 [October 24, 1992]). Data are collected through the National ART Surveillance System, a web-based data collection system developed by CDC. This report includes data from 52 reporting areas (the 50 states, the District of Columbia, and Puerto Rico). RESULTS: In 2015, a total of 182,111 ART procedures (range: 135 in Alaska to 23,198 in California) with the intent to transfer at least one embryo were performed in 464 U.S. fertility clinics and reported to CDC. 
These procedures resulted in 59,334 live-birth deliveries (range: 55 in Wyoming to 7,802 in California) and 71,152 infants born (range: 68 in Wyoming to 9,176 in California). Nationally, the number of ART procedures performed per 1 million women of reproductive age (15-44 years), a proxy measure of the ART utilization rate, was 2,832. ART use exceeded the national rate in 13 reporting areas (California, Connecticut, Delaware, the District of Columbia, Hawaii, Illinois, Maryland, Massachusetts, New Hampshire, New Jersey, New York, Rhode Island, and Virginia). Nationally, among ART transfer procedures in patients using fresh embryos from their own eggs, the average number of embryos transferred increased with increasing age of the woman (1.6 among women aged <35 years, 1.8 among women aged 35-37 years, and 2.3 among women aged >37 years). Among women aged <35 years, the national elective single-embryo transfer (eSET) rate was 34.7% (range: 11.3% in Puerto Rico to 88.1% in Delaware). In 2015, ART contributed to 1.7% of all infants born in the United States (range: 0.3% in Puerto Rico to 4.5% in Massachusetts). ART also contributed to 17.0% of all multiple-birth infants, 16.8% of all twin infants, and 22.2% of all triplets and higher-order infants. The percentage of multiple-birth infants was higher among infants conceived with ART (35.3%) than among all infants born in the total birth population (3.4%). Approximately 34.0% of ART-conceived infants were twins and 1.0% were triplets and higher-order infants. Nationally, infants conceived with ART contributed to 5.1% of all low birthweight infants. Among ART-conceived infants, 25.5% had low birthweight, compared with 8.1% among all infants. ART-conceived infants contributed to 5.3% of all preterm (gestational age <37 weeks) infants. The percentage of preterm births was higher among infants conceived with ART (31.2%) than among all infants born in the total birth population (9.7%). 
Among singletons, the percentage of ART-conceived infants who had low birthweight was 8.7% compared with 6.4% among all infants born. The percentage of ART-conceived infants who were born preterm was 13.4% among singletons compared with 7.9% among all infants. INTERPRETATION: Multiple births from ART contributed to a substantial proportion of all twins, triplets, and higher-order infants born in the United States. For women aged <35 years, who are typically considered good candidates for eSET, the national average of 1.6 embryos was transferred per ART procedure. Of the four states (Illinois, Massachusetts, New Jersey, and Rhode Island) with comprehensive mandated health insurance coverage for ART procedures (i.e., coverage for at least four cycles of IVF), three (Illinois, Massachusetts, and New Jersey) had rates of ART use exceeding 1.5 times the national rate. This type of mandated insurance coverage has been associated with greater use of ART and likely accounts for some of the difference in per capita ART use observed among states. PUBLIC HEALTH ACTION: Twins account for the majority of ART-conceived multiple births. Reducing the number of embryos transferred and increasing use of eSET when clinically appropriate could help reduce multiple births and related adverse health consequences for both mothers and infants. State-based surveillance of ART might be useful for monitoring and evaluating maternal and infant health outcomes of ART in states with high ART use. |
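The national ART utilization proxy above (procedures per 1 million women aged 15-44) is a straightforward rate. The sketch below reproduces the reported figure of 2,832; the denominator is an assumption, back-calculated from the abstract's numerator and rate rather than taken from the report.

```python
def art_rate_per_million(procedures, women_15_44):
    """ART procedures per 1 million women of reproductive age (15-44),
    the proxy utilization measure used in the surveillance summary."""
    return procedures / women_15_44 * 1_000_000

# 182,111 procedures against roughly 64.3 million women aged 15-44
# (denominator inferred from the reported national rate, not from the report)
print(round(art_rate_per_million(182_111, 64_300_000)))  # 2832
```

State rates are computed the same way with state-level numerators and denominators, which is how the 13 reporting areas exceeding the national rate were identified.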
E-cigarette and smokeless tobacco use and switching among smokers: Findings from the National Adult Tobacco Survey
Anic GM , Holder-Hayes E , Ambrose BK , Rostron BL , Coleman B , Jamal A , Apelberg BJ . Am J Prev Med 2018 54 (4) 539-551 INTRODUCTION: Assessing the extent that cigarette smokers use or switch to e-cigarettes and smokeless tobacco can help inform the population health impact of these products. This study estimated the prevalence of e-cigarette and smokeless tobacco use and switching among current and recent former adult cigarette smokers. METHODS: Data from the 2012-2013 (n=8,891) and 2013-2014 (n=11,379) National Adult Tobacco Survey were analyzed in 2016. Response rates for this telephone survey were 44.9% and 36.1%, respectively. Tobacco product use was assessed by smoking status. RESULTS: Current e-cigarette use increased for all groups, with a greater increase among recent quitters, 9.1% (95% CI=7.1%, 11.1%) in 2012-2013 and 15.8% (95% CI=13.7%, 17.9%) in 2013-2014, than smokers with an unsuccessful quit attempt, 10.4% (95% CI=9.1%, 11.7%) in 2012-2013 and 14.8% (95% CI=13.5%, 16.1%) in 2013-2014, or smokers with no quit attempt, 5.9% (95% CI=4.8%, 6.9%) in 2012-2013 and 10.7% (95% CI=9.4%, 12.0%) in 2013-2014. Between 2012-2013 and 2013-2014, current use of smokeless tobacco remained steady for recent quitters (4.6% to 4.7%, p=0.92) and smokers with no quit attempt (4.0% to 4.3%, p=0.97), and decreased in smokers with an unsuccessful quit attempt (5.7% to 3.8%, p=0.004). More recent quitters completely switched to e-cigarettes in the past year (15.3% in 2012-2013, 25.7% in 2013-2014) than to smokeless tobacco (4.6% in 2012-2013, 4.5% in 2013-2014). CONCLUSIONS: Current and recent former adult smokers are more likely to use e-cigarettes than smokeless tobacco. Current e-cigarette use was most prevalent among unsuccessful quitters and recent quitters, who were substantially more likely to report complete switching to e-cigarettes than smokeless tobacco. |
Impact of e-cigarette minimum legal sale age laws on current cigarette smoking
Dutra LM , Glantz SA , Arrazola RA , King BA . J Adolesc Health 2018 62 (5) 532-538 PURPOSE: The purpose of this study was to use individual-level data to examine the relationship between e-cigarette minimum legal sale age (MLSA) laws and cigarette smoking among U.S. adolescents, adjusting for e-cigarette use. METHODS: In 2016 and 2017, we regressed (logistic) current (past 30-day) cigarette smoking (from 2009-2014 National Youth Tobacco Surveys [NYTS]) on lagged (laws enacted each year counted for the following year) and unlagged (laws enacted January-June counted for that year) state e-cigarette MLSA laws prohibiting sales to youth aged <18 or <19 years (depending on the state). Models were adjusted for year and individual- (e-cigarette and other tobacco use, sex, race/ethnicity, and age) and state-level (smoke-free laws, cigarette taxes, medical marijuana legalization, income, and unemployment) covariates. RESULTS: Cigarette smoking was not significantly associated with lagged MLSA laws after adjusting for year (odds ratio [OR] = .87, 95% confidence interval [CI]: .73-1.03; p = .10) and covariates (OR = .85, .69-1.03; p = .10). Unlagged laws were significantly and negatively associated with cigarette smoking (OR = .84, .71-.98, p = .02), but not after adjusting for covariates (OR = .84, .70-1.01, p = .07). E-cigarette and other tobacco use, sex, race/ethnicity, age, and smoke-free laws were associated with cigarette smoking (p <.05). Results unadjusted for e-cigarette use and other tobacco use yielded a significant negative association between e-cigarette MLSA laws and cigarette smoking (lagged: OR = .78, .64-.93, p = .01; unlagged: OR = .80, .68-.95, p = .01). CONCLUSIONS: After adjusting for covariates, state e-cigarette MLSA laws did not affect youth cigarette smoking. Unadjusted for e-cigarette and other tobacco use, these laws were associated with lower cigarette smoking. |
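The lagged versus unlagged law coding described above determines which survey year a state's MLSA law "counts" for. A sketch of that coding is below; the function names are illustrative and the logic is inferred from the abstract's parenthetical definitions.

```python
def lagged_indicator(enact_year, survey_year):
    """Lagged coding: a law enacted in year Y counts from year Y + 1."""
    return survey_year >= enact_year + 1

def unlagged_indicator(enact_year, enact_month, survey_year):
    """Unlagged coding: laws enacted January-June of year Y count for Y
    itself; laws enacted July-December count from Y + 1."""
    effective_year = enact_year if enact_month <= 6 else enact_year + 1
    return survey_year >= effective_year

# A law enacted March 2013, evaluated against the 2013 NYTS wave:
print(lagged_indicator(2013, 2013), unlagged_indicator(2013, 3, 2013))  # False True
```

This binary indicator would then enter the logistic regression of current smoking alongside the individual- and state-level covariates listed in the abstract.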
Reasons for electronic cigarette use among middle and high school students - National Youth Tobacco Survey, United States, 2016
Tsai J , Walton K , Coleman BN , Sharapova SR , Johnson SE , Kennedy SM , Caraballo RS . MMWR Morb Mortal Wkly Rep 2018 67 (6) 196-200 Electronic cigarettes (e-cigarettes) were the most commonly used tobacco product among U.S. middle school and high school students in 2016 (1). CDC and the Food and Drug Administration (FDA) analyzed data from the 2016 National Youth Tobacco Survey (NYTS) to assess self-reported reasons for e-cigarette use among U.S. middle school (grades 6-8) and high school (grades 9-12) student e-cigarette users. Among students who reported ever using e-cigarettes in 2016, the most commonly selected reasons for use were 1) use by "friend or family member" (39.0%); 2) availability of "flavors such as mint, candy, fruit, or chocolate" (31.0%); and 3) the belief that "they are less harmful than other forms of tobacco such as cigarettes" (17.1%). The least commonly selected reasons were 1) "they are easier to get than other tobacco products, such as cigarettes" (4.8%); 2) "they cost less than other tobacco products such as cigarettes" (3.2%); and 3) "famous people on TV or in movies use them" (1.5%). Availability of flavors as a reason for use was more commonly selected by high school users (32.3%) than by middle school users (26.8%). Efforts to prevent middle school and high school students from initiating the use of any tobacco product, including e-cigarettes, are important to reduce tobacco product use among U.S. youths (2). |
Identification and characterization of influenza A viruses in selected domestic animals in Kenya, 2010-2012.
Munyua P , Onyango C , Mwasi L , Waiboci LW , Arunga G , Fields B , Mott JA , Cardona CJ , Kitala P , Nyaga PN , Njenga MK . PLoS One 2018 13 (2) e0192721 BACKGROUND: Influenza A virus subtypes in non-human hosts have not been characterized in Kenya. We carried out influenza surveillance in selected domestic animals and compared the virus isolates with isolates obtained in humans during the same period. METHODS: We collected nasal swabs from pigs, dogs and cats; oropharyngeal and cloacal swabs from poultry; and blood samples from all animals between 2010 and 2012. A standardized questionnaire was administered to farmers and traders. Swabs were tested for influenza A by rRT-PCR; virus isolation and subtyping were done on all positive swabs. All sera were screened for influenza A antibodies by ELISA, and positives were evaluated by hemagglutination inhibition (HI). Full genome sequencing was done on four selected pig virus isolates. RESULTS: Among 3,798 sera tested by ELISA, influenza A seroprevalence was highest in pigs (15.9%; 172/1084), followed by 1.2% (3/258) in ducks, 1.4% (1/72) in cats, 0.6% (3/467) in dogs, 0.1% (2/1894) in chickens, and 0% in geese and turkeys. HI testing of ELISA-positive pig sera showed that 71.5% had positive titers to A/California/04/2009(H1N1). Among 6,289 swabs tested by rRT-PCR, influenza A prevalence was highest in ducks (1.2%; 5/423) and 0% in cats and turkeys. Eight virus isolates were obtained from pig nasal swabs collected in 2011 and were determined to be A(H1N1)pdm09 on subtyping. On phylogenetic analysis, four hemagglutinin segments from pig isolates clustered together and were closely associated with human influenza viruses that circulated in Kenya in 2011. CONCLUSION: Influenza A(H1N1)pdm09 isolated in pigs was genetically similar to contemporary human pandemic influenza virus isolates. 
This suggests that the virus was likely transmitted from humans to pigs, became established, and circulated in Kenyan pig populations during the study period. Minimal influenza A prevalence was observed in the other animals studied. |
Assessing the impact of public education on a preventable zoonotic disease: rabies
Hasanov E , Zeynalova S , Geleishvili M , Maes E , Tongren E , Marshall E , Banyard A , McElhinney LM , Whatmore AM , Fooks AR , Horton DL . Epidemiol Infect 2018 146 (2) 227-235 Effective methods to increase awareness of preventable infectious diseases are key components of successful control programmes. Rabies is an example of a disease with significant impact, where public awareness is variable. A recent awareness campaign in a rabies endemic region of Azerbaijan provided a unique opportunity to assess the efficacy of such campaigns. A cluster cross-sectional survey concerning rabies was undertaken following the awareness campaign in 600 households in 38 randomly selected towns, in districts covered by the campaign and matched control regions. This survey demonstrated that the relatively simple awareness campaign was effective at improving knowledge of rabies symptoms and vaccination schedules. Crucially, those in the awareness campaign group were also 1.4 times more likely to report that they had vaccinated their pets, an essential component of human rabies prevention. In addition, low knowledge of appropriate post-exposure treatment and animal sources of rabies provide information useful for future public awareness campaigns in the region and other similar areas. |
Lyme disease surveillance in the United States: Looking for ways to cut the Gordian knot
Cartter ML , Lynfield R , Feldman KA , Hook SA , Hinckley AF . Zoonoses Public Health 2018 65 (2) 227-229 Current surveillance methods have been useful to document geographic expansion of Lyme disease in the United States and to monitor the increasing incidence of this major public health problem. Nevertheless, these approaches are resource-intensive, generate results that are difficult to compare across jurisdictions, and measure less than the total burden of disease. By adopting more efficient methods, resources could be diverted instead to education of at-risk populations and new approaches to prevention. In this special issue of Zoonoses and Public Health, seven articles are presented that either evaluate traditional Lyme disease surveillance methods or explore alternatives that have the potential to be less costly, more reliable, and sustainable. Twenty-five years have passed since Lyme disease became a notifiable condition - it is time to reevaluate the purpose and goals of national surveillance. |
Lyme disease testing in a high-incidence state: Clinician knowledge and patterns
Conant JL , Powers J , Sharp G , Mead PS , Nelson CA . Am J Clin Pathol 2018 149 (3) 234-240 Objectives: Lyme disease (LD) incidence is increasing, but data suggest some clinicians are not fully aware of recommended procedures for ordering and interpreting diagnostic tests. The study objective was to assess clinicians' knowledge and practices regarding LD testing in a high-incidence region. Methods: We distributed surveys to 1,142 clinicians in the University of Vermont Medical Center region, of which 144 were completed (12.6% response rate). We also examined LD laboratory test results and logs of calls to laboratory customer service over a period of 2.5 years and 6 months, respectively. Results: Most clinicians demonstrated basic knowledge of diagnostic protocols, but many misinterpreted Western blot results. For example, 42.4% incorrectly interpreted a positive immunoglobulin M result as an overall positive test in a patient with longstanding symptoms. Many also reported receiving patient requests for unvalidated tests. Conclusions: Additional education and modifications to LD test ordering and reporting systems would likely reduce errors and improve patient care. |
Motor abnormalities and epilepsy in infants and children with evidence of congenital Zika virus infection
Pessoa A , van der Linden V , Yeargin-Allsopp M , Carvalho Mdcg , Ribeiro EM , Van Naarden Braun K , Durkin MS , Pastula DM , Moore JT , Moore CA . Pediatrics 2018 141 S167-s179 Initial reports of congenital Zika virus (ZIKV) infection focused on microcephaly at birth with severe brain anomalies; the phenotype has broadened to include microcephaly that develops after birth and neurodevelopmental sequelae. In this narrative review, we summarize medical literature describing motor abnormalities and epilepsy in infants with evidence of congenital ZIKV infection and provide information on the impact of these conditions. Specific scenarios are used to illustrate the complex clinical course in infants with abnormalities that are consistent with congenital Zika syndrome. A search of the English-language medical literature was done to identify motor abnormalities and epilepsy in infants with evidence of congenital ZIKV infection by using Medline and PubMed, Embase, Scientific Electronic Library Online, Scopus, the OpenGrey Repository, and the Grey Literature Report in Public Health. Search terms included "Zika" only and "Zika" in combination with any of the following terms: "epilepsy," "seizure," "motor," and "cerebral palsy." Clinical features of motor abnormalities and epilepsy in these children were reviewed. Thirty-six publications were identified; 8 were selected for further review. Among infants with clinical findings that are consistent with congenital Zika syndrome, 54% had epilepsy and 100% had motor abnormalities. In these infants, impairments that are consistent with diagnoses of cerebral palsy and epilepsy occur frequently. Pyramidal and extrapyramidal motor abnormalities were notable for their early development and co-occurrence. Prompt identification of potential disabilities enables early intervention to improve the quality of life for affected children. 
Long-term studies of developmental outcomes and interventions in children with congenital ZIKV infection are needed. |
Public health approach to addressing the needs of children affected by congenital Zika syndrome
Broussard CS , Shapiro-Mendoza CK , Peacock G , Rasmussen SA , Mai CT , Petersen EE , Galang RR , Newsome K , Reynolds MR , Gilboa SM , Boyle CA , Moore CA . Pediatrics 2018 141 S146-s153 We have learned much about the short-term sequelae of congenital Zika virus (ZIKV) infection since the Centers for Disease Control and Prevention activated its ZIKV emergency response in January 2016. Nevertheless, gaps remain in our understanding of the full spectrum of adverse health outcomes related to congenital ZIKV infection and how to optimize health in those who are affected. To address the remaining knowledge gaps, support affected children so they can reach their full potential, and make the best use of available resources, a carefully planned public health approach in partnership with pediatric health care providers is needed. An essential step is to use population-based data captured through surveillance systems to describe congenital Zika syndrome. Another key step is using collected data to investigate why some children exhibit certain sequelae during infancy and beyond, whereas others do not, and to describe the clustering of anomalies and the timing of when these anomalies occur, among other research questions. The final critical step in the public health framework for congenital Zika syndrome is an intervention strategy with evidence-based best practices for longer-term monitoring and care. Adherence to recommended evaluation and management procedures for infants with possible congenital ZIKV infection, including for those with less obvious developmental and medical needs at birth, is essential. It will take many years to fully understand the effects of ZIKV on those who are congenitally infected; however, the lifetime medical and educational costs as well as the emotional impact on affected children and families are likely to be substantial. |
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Disease Reservoirs and Vectors
- Drug Safety
- Environmental Health
- Food Safety
- Genetics and Genomics
- Health Economics
- Healthcare Associated Infections
- Immunity and Immunization
- Laboratory Sciences
- Maternal and Child Health
- Mining
- Occupational Safety and Health
- Parasitic Diseases
- Physical Activity
- Reproductive Health
- Substance Use and Abuse
- Zoonotic and Vectorborne Diseases
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed: Feb 1, 2024
- Page last updated: Apr 22, 2024