Dysbiosis, inflammation, and response to treatment: a longitudinal study of pediatric subjects with newly diagnosed inflammatory bowel disease.
Shaw KA , Bertha M , Hofmekler T , Chopra P , Vatanen T , Srivatsa A , Prince J , Kumar A , Sauer C , Zwick ME , Satten GA , Kostic AD , Mulle JG , Xavier RJ , Kugathasan S . Genome Med 2016 8 (1) 75 BACKGROUND: Gut microbiome dysbiosis has been demonstrated in subjects with newly diagnosed and chronic inflammatory bowel disease (IBD). In this study we sought to explore longitudinal changes in dysbiosis and ascertain associations between dysbiosis and markers of disease activity and treatment outcome. METHODS: We performed a prospective cohort study of 19 treatment-naive pediatric IBD subjects and 10 healthy controls, measuring fecal calprotectin and assessing the gut microbiome via repeated stool samples. Associations between clinical characteristics and the microbiome were tested using generalized estimating equations. Random forest classification was used to predict ultimate treatment response (presence of mucosal healing at follow-up colonoscopy) or non-response using patients' pretreatment samples. RESULTS: Patients with Crohn's disease had increased markers of inflammation and dysbiosis compared to controls. Patients with ulcerative colitis had even higher inflammation and dysbiosis compared to those with Crohn's disease. For all cases, the gut microbial dysbiosis index associated significantly with clinical and biological measures of disease severity, but did not associate with treatment response. We found differences in specific gut microbiome genera between cases/controls and responders/non-responders including Akkermansia, Coprococcus, Fusobacterium, Veillonella, Faecalibacterium, and Adlercreutzia. Using pretreatment microbiome data in a weighted random forest classifier, we were able to obtain 76.5 % accuracy for prediction of responder status. CONCLUSIONS: Patient dysbiosis improved over time but persisted even among those who responded to treatment and achieved mucosal healing. 
Although the dysbiosis index was not significantly different between responders and non-responders, we found specific genus-level differences. We found that pretreatment microbiome signatures are a promising avenue for predicting remission and response to treatment. |
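As a rough illustration of the random forest idea used here for responder prediction (this is not the authors' pipeline, and the two-genus abundance data and labels below are synthetic), a forest can be sketched in pure Python as bootstrap-aggregated decision stumps with majority voting:

```python
import random

def fit_stump(X, y, features):
    """Best single-feature threshold split (by training error) over `features`."""
    best = (None, None, None, len(y) + 1)
    for j in features:
        for t in sorted({row[j] for row in X}):
            for pol in (0, 1):  # class predicted when value > threshold
                err = sum((pol if row[j] > t else 1 - pol) != yi
                          for row, yi in zip(X, y))
                if err < best[3]:
                    best = (j, t, pol, err)
    return best[:3]

def predict_stump(stump, row):
    j, t, pol = stump
    return pol if row[j] > t else 1 - pol

def fit_forest(X, y, n_trees=25, mtry=1):
    """Bootstrap-aggregated stumps, each fit on a random feature subset."""
    rng = random.Random(0)
    n, p = len(X), len(X[0])
    forest = []
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]   # bootstrap resample
        feats = rng.sample(range(p), mtry)           # random feature subset
        forest.append(fit_stump([X[i] for i in idx], [y[i] for i in idx], feats))
    return forest

def predict_forest(forest, row):
    votes = sum(predict_stump(s, row) for s in forest)
    return 1 if 2 * votes >= len(forest) else 0

# synthetic two-genus abundance profiles: responders (1) vs non-responders (0)
rng = random.Random(42)
X = ([[rng.gauss(1.2, 0.4), rng.gauss(0.3, 0.3)] for _ in range(20)]
     + [[rng.gauss(0.3, 0.4), rng.gauss(1.0, 0.3)] for _ in range(20)])
y = [1] * 20 + [0] * 20

forest = fit_forest(X, y)
acc = sum(predict_forest(forest, row) == yi for row, yi in zip(X, y)) / len(y)
print(f"training accuracy: {acc:.2f}")
```

A production analysis would use a full tree-based implementation with class weighting and held-out evaluation; this sketch only shows the bagging-and-voting mechanics.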
Recommendations from the International Colorectal Cancer Screening Network on the evaluation of the cost of screening programs
Subramanian S , Tangka FK , Hoover S , Nadel M , Smith R , Atkin W , Patnick J . J Public Health Manag Pract 2016 22 (5) 461-5 Worldwide, colorectal cancer is the fourth leading cause of death from cancer and the incidence is projected to increase. Many countries are exploring the introduction of organized screening programs, but there is limited information on the resources required and guidance for cost-effective implementation. To facilitate the generation of the economic evidence base for program implementation, we collected and analyzed detailed program cost data from 5 European members of the International Colorectal Cancer Screening Network. The cost-per-person-screened estimates, often used to compare across programs as an overall measure, varied significantly across the programs. In addition, there were substantial differences in the programmatic and clinical costs incurred, even when the same type of screening test was used. Based on these findings, several recommendations are provided to enhance the underlying methodology and validity of the comparative economic assessments. The recommendations include the need for detailed activity-based cost information, the use of a comprehensive set of effectiveness measures to adequately capture differences between programs, and the incorporation of data from multiple programs in cost-effectiveness models to increase generalizability. Economic evaluation of real-world colorectal cancer-screening programs is essential to derive valuable insights to improve program operations and ensure optimal use of available resources. |
Reducing health inequities in the U.S.: recommendations from the NHLBI's health inequities Think Tank meeting
Sampson UK , Kaplan RM , Cooper RS , Diez Roux AV , Marks JS , Engelgau MM , Peprah E , Mishoe H , Boulware LE , Felix KL , Califf RM , Flack JM , Cooper LA , Gracia JN , Henderson JA , Davidson KW , Krishnan JA , Lewis TT , Sanchez E , Luban NL , Vaccarino V , Wong WF , Wright JT Jr , Meyers D , Ogedegbe OG , Presley-Cantrell L , Chambers DA , Belis D , Bennett GC , Boyington JE , Creazzo TL , de Jesus JM , Krishnamurti C , Lowden MR , Punturieri A , Shero ST , Young NS , Zou S , Mensah GA . J Am Coll Cardiol 2016 68 (5) 517-24 The National Heart, Lung, and Blood Institute convened a Think Tank meeting to obtain insight and recommendations regarding the objectives and design of the next generation of research aimed at reducing health inequities in the United States. The panel recommended several specific actions, including: 1) embrace broad and inclusive research themes; 2) develop research platforms that optimize the ability to conduct informative and innovative research, and promote systems science approaches; 3) develop networks of collaborators and stakeholders, and launch transformative studies that can serve as benchmarks; 4) optimize the use of new data sources, platforms, and natural experiments; and 5) develop unique transdisciplinary training programs to build research capacity. Confronting health inequities will require engaging multiple disciplines and sectors (including communities), using systems science, and intervening through combinations of individual, family, provider, health system, and community-targeted approaches. Details of the panel's remarks and recommendations are provided in this report. |
Trends in prevalence of chronic kidney disease in the United States
Murphy D , McCulloch CE , Lin F , Banerjee T , Bragg-Gresham JL , Eberhardt MS , Morgenstern H , Pavkov ME , Saran R , Powe NR , Hsu CY . Ann Intern Med 2016 165 (7) 473-481 Background: Trends in the prevalence of chronic kidney disease (CKD) are important for health care policy and planning. Objective: To update trends in CKD prevalence. Design: Repeated cross-sectional study. Setting: NHANES (National Health and Nutrition Examination Survey) for 1988 to 1994 and every 2 years from 1999 to 2012. Participants: Adults aged 20 years or older. Measurements: Chronic kidney disease (stages 3 and 4) was defined as an estimated glomerular filtration rate (eGFR) of 15 to 59 mL/min/1.73 m2, estimated with the Chronic Kidney Disease Epidemiology Collaboration equation from calibrated serum creatinine measurements. An expanded definition of CKD also included persons with an eGFR of at least 60 mL/min/1.73 m2 and a 1-time urine albumin-creatinine ratio of at least 30 mg/g. Results: The unadjusted prevalence of stage 3 and 4 CKD increased from the late 1990s to the early 2000s. Since 2003 to 2004, however, the overall prevalence has largely stabilized (for example, 6.9% prevalence in 2003 to 2004 and in 2011 to 2012). There was little difference in adjusted prevalence of stage 3 and 4 CKD overall in 2003 to 2004 versus 2011 to 2012 after age, sex, race/ethnicity, and diabetes mellitus status were controlled for (P = 0.26). Lack of increase in CKD prevalence since the early 2000s was observed in most subgroups and with an expanded definition of CKD that included persons with higher eGFRs and albuminuria. Limitation: Serum creatinine and albuminuria were measured only once in each person. Conclusion: In a reversal of prior trends, there has been no appreciable increase in the prevalence of stage 3 and 4 CKD in the U.S. population overall during the most recent decade. 
Primary Funding Source: American Society of Nephrology Foundation for Kidney Research Student Scholar Grant Program, Centers for Disease Control and Prevention, and National Institutes of Health. |
Ultraviolet radiation exposure and its impact on skin cancer risk
Watson M , Holman DM , Maguire-Eisen M . Semin Oncol Nurs 2016 32 (3) 241-54 Objectives: To review research and evidence-based resources on skin cancer prevention and early detection and their importance for oncology nurses. Data Sources: Journal articles, federal reports, cancer surveillance data, behavioral surveillance data. Conclusion: Most cases of skin cancer are preventable. Survivors of many types of cancer are at increased risk of skin cancers. Implications for Nursing Practice: Oncology nurses can play an important role in protecting their patients from future skin cancer morbidity and mortality. |
Prevalence of amyotrophic lateral sclerosis - United States, 2012-2013
Mehta P , Kaye W , Bryan L , Larson T , Copeland T , Wu J , Muravov O , Horton K . MMWR Surveill Summ 2016 65 (8) 1-12 PROBLEM/CONDITION: Amyotrophic lateral sclerosis (ALS), commonly known as Lou Gehrig's disease, is a progressive and fatal neuromuscular disease for which no cure or viable treatment has been identified. ALS, like most noncommunicable diseases, is not a nationally notifiable disease in the United States. The prevalence of ALS in the United States during 2010-2011 was estimated to be 3.9 cases per 100,000 persons in the general population. Updated prevalence estimates are needed to help monitor disease status, better understand etiology, and identify risk factors for ALS. PERIOD COVERED: 2012-2013. DESCRIPTION OF SYSTEM: The National ALS Registry, established in 2009, collects data on ALS patients in the United States to better describe the incidence and prevalence of ALS, examine risk factors such as environmental and occupational exposures, and characterize the demographics of those living with ALS. To identify prevalent cases of ALS, data are compiled from four national administrative databases (maintained by the Centers for Medicare and Medicaid Services, the Veterans Health Administration, and the Veterans Benefits Administration). To identify cases not included in these databases and to better understand risk-factors associated with ALS and disease progression, the Registry also includes data that are collected from patients who voluntarily enroll and complete online surveys. RESULTS: During 2012 and 2013, the Registry identified 14,713 and 15,908 persons, respectively, who met the surveillance case definition of ALS. The estimated ALS prevalence rate was 4.7 cases per 100,000 U.S. population for 2012 and 5.0 per 100,000 for 2013. Due to revisions to the algorithm and use of death data from the National Death Index, an updated prevalence estimate has been calculated retrospectively for October 19, 2010-December 31, 2011. 
This updated estimate showed a prevalence rate of 4.3 per 100,000 population and a total of 13,282 cases. Since the inception of the Registry, the pattern of characteristics (e.g., age, sex, and race/ethnicity) among persons with ALS has remained unchanged. Overall, ALS was more common among whites, males, and persons aged 60-69 years. The age groups with the lowest number of ALS cases were persons aged 18-39 years and those aged ≥80 years. Males had a higher prevalence rate of ALS than females overall and across all data sources. These findings remained consistent during October 2010-December 2013. INTERPRETATION: The Registry is the only available data source that can be used to estimate the national prevalence for ALS in the United States. Use of both administrative national databases and self-report from patients enables a comprehensive approach to estimate ALS prevalence. The overall increase in the prevalence rate from 4.3 per 100,000 persons (revised) during 2010-2011 to 4.7 and 5.0 per 100,000 persons, respectively, during 2012-2013 likely is not an actual increase in the number of ALS cases. Rather, this increase might be attributed to improved case ascertainment due to the refinement of the algorithm used to identify definite ALS cases, along with an increased public awareness of the Registry. Registry estimates of ALS prevalence are consistent with findings from long-established ALS registries in Europe and from smaller-scale epidemiologic studies previously conducted in the United States. PUBLIC HEALTH ACTIONS: Data collected by the National ALS Registry are being used to better describe the epidemiology of ALS in the United States and to help facilitate research. The combined approach of using national administrative databases and a self-enrollment web portal to collect data is novel and potentially could be used for other non-notifiable diseases such as Parkinson's disease or multiple sclerosis.
Increased public awareness of the Registry might lead to more ALS cases being identified from the secure web portal (https://www.cdc.gov/als), which can ascertain cases apart from the national administrative databases. For example, in 2014, the ALS Ice Bucket Challenge, a social media-centered campaign, received extensive public visibility and created increased awareness of ALS. The Agency for Toxic Substances and Disease Registry (ATSDR) works closely with ALS advocacy and support groups, researchers, health care professionals, and others to promote the National ALS Registry and to identify all cases of ALS in the United States. In addition to estimating the prevalence of ALS, the Registry is being used to collect specimens from patient enrollees through a new biorepository, connect patient enrollees with new clinical trials and epidemiologic studies, and fund studies to help learn more about the etiology of ALS. Additional information about the National ALS Registry is available at http://www.cdc.gov/als or by calling toll-free at 1-877-442-9719. |
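The prevalence rates reported in this entry follow directly from case counts and population denominators. The U.S. population figures below are approximate mid-year estimates assumed for illustration; they are not taken from the report:

```python
def prevalence_per_100k(cases, population):
    """Prevalence rate per 100,000 persons."""
    return cases / population * 100_000

# Registry case counts with approximate mid-year U.S. population estimates
print(round(prevalence_per_100k(14_713, 313_900_000), 1))  # 2012 -> 4.7
print(round(prevalence_per_100k(15_908, 316_100_000), 1))  # 2013 -> 5.0
```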
The global epidemic of chronic kidney disease: a call for action
Whelan E . Occup Environ Med 2016 73 (8) 499-500 While the Ebola and Zika viruses have made national and international headlines in recent months, another epidemic of larger magnitude is quietly devastating agricultural communities in developing countries worldwide. In Central America, the death toll from a mysterious type of chronic kidney disease (CKD) is estimated to be 20 000 in just 10 years.1 Unlike the CKD seen in developed countries, which is typically linked to hypertension and diabetes, this disease appears to be multifactorial and disproportionately afflicts young men of working age. In El Salvador, CKD is the second leading cause of mortality among men of working age.2 Similar excesses have been reported in other parts of Central America,3 as well as in Sri Lanka,4 India5 and Egypt.6 Occupation is believed to be the driving factor. According to the leading hypothesis, heat stress and dehydration from strenuous work such as manual cutting of sugar cane, perhaps in a synergistic association with exposure to environmental toxins, result in kidney damage that leads to permanent loss of function. Manual sugarcane cutting involves high cardiovascular demand comparable to that experienced by endurance athletes, except that cane cutters are exposed to ‘daily’ demands for the entire harvest season. The risk for the disease is exacerbated by the pay structure of the work, in that cane cutters are paid according to how much cane they cut, creating a disincentive to take breaks for rest and water. A particularly disturbing characteristic of this type of CKD is that, in its early stages, people show no symptoms. It is a silent killer. By some accounts, the disease has existed in parts of the world for decades, but the death rate has accelerated with industrial-scale agriculture expansion and global climate change. |
Changes in disparity in county-level diagnosed diabetes prevalence and incidence in the United States, between 2004 and 2012
Shrestha SS , Thompson TJ , Kirtland KA , Gregg EW , Beckles GL , Luman ET , Barker LE , Geiss LS . PLoS One 2016 11 (8) e0159876 BACKGROUND: In recent decades, the United States experienced increasing prevalence and incidence of diabetes, accompanied by large disparities in county-level diabetes prevalence and incidence. However, whether these disparities are widening, narrowing, or staying the same has not been studied. We examined changes in disparity among U.S. counties in diagnosed diabetes prevalence and incidence between 2004 and 2012. METHODS: We used 2004 and 2012 county-level diabetes (type 1 and type 2) prevalence and incidence data, along with demographic, socio-economic, and risk factor data from various sources. To determine whether disparities widened or narrowed over the time period, we used a regression-based beta-convergence approach, accounting for spatial autocorrelation. We calculated diabetes prevalence/incidence percentage point (ppt) changes between 2004 and 2012 and modeled these changes as a function of baseline diabetes prevalence/incidence in 2004. Covariates included county-level demographic and socio-economic data, and known type 2 diabetes risk factors (obesity and leisure-time physical inactivity). RESULTS: For each county-level ppt increase in diabetes prevalence in 2004 there was an annual average increase of 0.02 ppt (p<0.001) in diabetes prevalence between 2004 and 2012, indicating a widening of disparities. However, after accounting for covariates, diabetes prevalence decreased by an annual average of 0.04 ppt (p<0.001). In contrast, changes in diabetes incidence decreased by an average of 0.04 ppt (unadjusted) and 0.09 ppt (adjusted) for each ppt increase in diabetes incidence in 2004, indicating a narrowing of county-level disparities. CONCLUSIONS: County-level disparities in diagnosed diabetes prevalence in the United States widened between 2004 and 2012, while disparities in incidence narrowed. 
Accounting for demographic and socio-economic characteristics and risk factors for type 2 diabetes narrowed the disparities, suggesting that these factors are strongly associated with changes in disparities. Public health interventions that target modifiable risk factors, such as obesity and physical inactivity, in high burden counties might further reduce disparities in incidence and, over time, in prevalence. |
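In the beta-convergence approach described above, the change in prevalence or incidence is regressed on its baseline level: a positive slope indicates widening disparities, a negative slope narrowing. A minimal sketch with simulated county data (the county values and coefficients below are illustrative, not the study's, and spatial autocorrelation is ignored):

```python
import random

def ols_slope(x, y):
    # closed-form simple-regression slope: cov(x, y) / var(x)
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    return cov / var

random.seed(1)
baseline = [random.uniform(4, 14) for _ in range(500)]  # simulated 2004 incidence
# simulate convergence: counties with higher baselines decline more
change = [-0.09 * b + random.gauss(0.3, 0.2) for b in baseline]

beta = ols_slope(baseline, change)
print(beta)  # negative beta: high-baseline counties converge toward the rest
```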
Recovery of Neisseria gonorrhoeae from 4 commercially available transport systems
Papp JR , Henning T , Khubbar M , Kalve V , Bhattacharyya S , Travanty E , Xavier K , Jones K , Rudrik JT , Gaynor A , Hagan C . Diagn Microbiol Infect Dis 2016 86 (2) 144-7 Four commercial transport systems for the recovery of Neisseria gonorrhoeae were evaluated in support of the need to obtain culture isolates for the detection of antimicrobial resistance. Bacterial recovery from the InTray GC system was superior with minimal loss of viability in contrast to non-nutritive transport systems. |
Surveillance of infectious diseases in the Arctic
Bruce M , Zulz T , Koch A . Public Health 2016 137 5-12 OBJECTIVES: This study reviews how social and environmental issues affect health in Arctic populations and describes infectious disease surveillance in Arctic Nations with a special focus on the activities of the International Circumpolar Surveillance (ICS) project. METHODS: We reviewed the literature over the past 2 decades looking at Arctic living conditions and their effects on health and Arctic surveillance for infectious diseases. RESULTS: Compared with other regions worldwide, the Arctic climate and environment are extreme. Arctic and sub-Arctic populations live in markedly different social and physical environments compared to those of their more southern dwelling counterparts. A cold northern climate means people spend more time indoors, amplifying the effects of household crowding, smoking and inadequate ventilation on the person-to-person spread of infectious diseases. The spread of zoonotic infections north as the climate warms, emergence of antibiotic resistance among bacterial pathogens, the re-emergence of tuberculosis, the entrance of HIV into Arctic communities, the specter of pandemic influenza or the sudden emergence and introduction of new viral pathogens pose new challenges to residents, governments and public health authorities of all Arctic countries. ICS is a network of hospitals, public health agencies, and reference laboratories throughout the Arctic working together for the purposes of collecting, comparing and sharing of uniform laboratory and epidemiological data on infectious diseases of concern and assisting in the formulation of prevention and control strategies (Fig. 1). In addition, circumpolar infectious disease research workgroups and sentinel surveillance systems for bacterial and viral pathogens exist. CONCLUSIONS: The ICS system is a successful example of collaborative surveillance and research in an extreme environment. |
Update on vaccine-derived polioviruses - worldwide, January 2015-May 2016
Jorba J , Diop OM , Iber J , Sutter RW , Wassilak SG , Burns CC . MMWR Morb Mortal Wkly Rep 2016 65 (30) 763-9 In 1988, the World Health Assembly resolved to eradicate poliomyelitis worldwide. One of the main tools used in polio eradication efforts has been the live, attenuated, oral poliovirus vaccine (OPV), an inexpensive vaccine easily administered by trained volunteers. OPV might require several doses to induce immunity, but provides long-term protection against paralytic disease. Through effective use of OPV, the Global Polio Eradication Initiative (GPEI) has brought wild polioviruses to the threshold of eradication. However, OPV use, particularly in areas with low routine vaccination coverage, is associated with the emergence of genetically divergent vaccine-derived polioviruses (VDPVs) whose genetic drift from the parental OPV strains indicates prolonged replication or circulation (3). VDPVs can emerge among immunologically normal vaccine recipients and their contacts as well as among persons with primary immunodeficiencies (PIDs). Immunodeficiency-associated VDPVs (iVDPVs) can replicate for years in some persons with PIDs. In addition, circulating vaccine-derived polioviruses (cVDPVs) can emerge in areas with low OPV coverage and can cause outbreaks of paralytic polio. This report updates previous summaries regarding VDPVs. |
Ebola virus disease and critical illness
Leligdowicz A , Fischer WA 2nd , Uyeki TM , Fletcher TE , Adhikari NK , Portella G , Lamontagne F , Clement C , Jacob ST , Rubinson L , Vanderschuren A , Hajek J , Murthy S , Ferri M , Crozier I , Ibrahima E , Lamah MC , Schieffelin JS , Brett-Major D , Bausch DG , Shindo N , Chan AK , O'Dempsey T , Mishra S , Jacobs M , Dickson S , Lyon GM 3rd , Fowler RA . Crit Care 2016 20 (1) 217 As of 20 May 2016 there have been 28,646 cases and 11,323 deaths resulting from the West African Ebola virus disease (EVD) outbreak reported to the World Health Organization. There continue to be sporadic flare-ups of EVD cases in West Africa. EVD presentation is nonspecific and characterized initially by onset of fatigue, myalgias, arthralgias, headache, and fever; this is followed several days later by anorexia, nausea, vomiting, diarrhea, and abdominal pain. Anorexia and gastrointestinal losses lead to dehydration, electrolyte abnormalities, and metabolic acidosis, and, in some patients, acute kidney injury. Hypoxia and ventilation failure occur most often with severe illness and may be exacerbated by substantial fluid requirements for intravascular volume repletion and some degree of systemic capillary leak. Although minor bleeding manifestations are common, hypovolemic and septic shock complicated by multisystem organ dysfunction appear to be the most frequent causes of death. Males and females have been equally affected, with children (0-14 years of age) accounting for 19 %, young adults (15-44 years) 58 %, and older adults (≥45 years) 23 % of reported cases. 
While the current case fatality proportion in West Africa is approximately 40 %, it has varied substantially over time (highest near the outbreak onset) according to available resources (40-90 % mortality in West Africa compared to under 20 % in Western Europe and the USA), by age (near universal among neonates and high among older adults), and by Ebola viral load at admission. While there is no Ebola virus-specific therapy proven to be effective in clinical trials, mortality has been dramatically lower among EVD patients managed with supportive intensive care in highly resourced settings, allowing for the avoidance of hypovolemia, correction of electrolyte and metabolic abnormalities, and the provision of oxygen, ventilation, vasopressors, and dialysis when indicated. This experience emphasizes that, in addition to evaluating specific medical treatments, improving the global capacity to provide supportive critical care to patients with EVD may be the greatest opportunity to improve patient outcomes. |
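The approximate 40 % case fatality proportion cited above can be reproduced from the reported outbreak totals:

```python
# Case fatality proportion from the reported West African outbreak totals
deaths, cases = 11_323, 28_646
cfp = deaths / cases * 100
print(f"{cfp:.1f}%")  # 39.5%, consistent with the ~40% cited
```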
Impact of health system inputs on health outcome: A multilevel longitudinal analysis of Botswana National Antiretroviral Program (2002-2013)
Farahani M , Price N , El-Halabi S , Mlaudzi N , Keapoletswe K , Lebelonyane R , Fetogang EB , Chebani T , Kebaabetswe P , Masupe T , Gabaake K , Auld AF , Nkomazana O , Marlink R . PLoS One 2016 11 (8) e0160206 OBJECTIVE: To measure the association between the number of doctors, nurses and hospital beds per 10,000 people and individual HIV-infected patient outcomes in Botswana. DESIGN: Analysis of routinely collected longitudinal data from 97,627 patients who received ART through the Botswana National HIV/AIDS Treatment Program across all 24 health districts from 2002 to 2013. District-level data on doctor, nurse, and hospital bed density were collected from various sources. METHODS: A multilevel, longitudinal analysis method was used to analyze the data at both patient- and district-level simultaneously to measure the impact of the health system input at district-level on probability of death or loss-to-follow-up (LTFU) at the individual level. A marginal structural model was used to account for LTFU over time. RESULTS: Increasing doctor density from one doctor to two doctors per 10,000 population decreased the predicted probability of death for each patient by 27%. Increasing nurse density from 20 nurses to 25 nurses decreased the predicted probability of death by 28%. A 9% decrease in the predicted mortality of an individual in the Masa program was noted for every increase of five hospital beds per 10,000 population. CONCLUSION: Considerable variation was observed in doctor, nurse, and hospital bed density across health districts. Predictive margins of mortality and LTFU were inversely correlated with doctor, nurse and hospital bed density. Doctor density had a much greater impact than nurse or bed density on mortality or LTFU of individual patients. While long-term investment in training more healthcare professionals should be made, redistribution of available doctors and nurses can be a feasible solution in the short term. |
Infrequent clinical assessment of chronic hepatitis B patients in United States general healthcare settings
Spradling PR , Xing J , Rupp LB , Moorman AC , Gordon SC , Teshale ET , Lu M , Boscarino JA , Trinacty CM , Schmidt MA , Holmberg SD . Clin Infect Dis 2016 63 (9) 1205-1208 Among 2,338 chronic hepatitis B patients followed during 2006-2013 in the Chronic Hepatitis Cohort Study, 78% had ≥1 alanine aminotransferase and 37% had ≥1 HBV DNA level assessed annually. Among cirrhotic patients, 46% never had hepatic imaging. Patients in this cohort were insufficiently monitored for disease activity and hepatocellular carcinoma. |
2016 Infectious Diseases Society of America (IDSA) Clinical Practice Guideline for the Treatment of Coccidioidomycosis
Galgiani JN , Ampel NM , Blair JE , Catanzaro A , Geertsma F , Hoover SE , Johnson RH , Kusne S , Lisse J , MacDonald JD , Meyerson SL , Raksin PB , Siever J , Stevens DA , Sunenshine R , Theodore N . Clin Infect Dis 2016 63 (6) e112-46 It is important to realize that guidelines cannot always account for individual variation among patients. They are not intended to supplant physician judgment with respect to particular patients or special clinical situations. The Infectious Diseases Society of America considers adherence to these guidelines to be voluntary, with the ultimate determination regarding their application to be made by the physician in the light of each patient's individual circumstances. Coccidioidomycosis, also known as San Joaquin Valley fever, is a systemic infection endemic to parts of the southwestern United States and elsewhere in the Western Hemisphere. Residence in and recent travel to these areas are critical elements for the accurate recognition of patients who develop this infection. In this practice guideline, we have organized our recommendations to address actionable questions concerning the entire spectrum of clinical syndromes. These can range from initial pulmonary infection, which eventually resolves whether or not antifungal therapy is administered, to a variety of pulmonary and extrapulmonary complications. Additional recommendations address management of coccidioidomycosis occurring in special at-risk populations. Finally, preemptive management strategies are outlined for certain at-risk populations and after unintentional laboratory exposure. |
Augmented passive immunotherapy with P4 peptide improves phagocyte activity in severe sepsis
Morton B , Mitsi E , Pennington SH , Reine J , Wright AD , Parker R , Welters ID , Blakey JD , Rajam G , Ades EW , Ferreira DM , Wang D , Kadioglu A , Gordon SB . Shock 2016 46 (6) 635-641 INTRODUCTION: Antimicrobial resistance threatens to undermine treatment for severe infection; new therapeutic strategies are urgently needed. Pre-clinical work shows that augmented passive immunotherapy with P4 peptide increases phagocytic activity and shows promise as a novel therapeutic strategy. Our aim was to determine ex vivo P4 activity in a target population of patients admitted to critical care with severe infection. METHODS: We prospectively recruited UK critical care unit patients with severe sepsis and observed clinical course (≥3 months post discharge). Blood samples were taken in early (≤48 hours post-diagnosis, n = 54), latent (seven days post-diagnosis, n = 39) and convalescent (3-6 months post-diagnosis, n = 18) phases of disease. The primary outcome measure was killing of opsonised S. pneumoniae by neutrophils with and without P4 peptide stimulation. We also used a flow cytometric whole blood phagocytosis assay to determine phagocyte association and oxidation of intraphagosomal reporter beads. RESULTS: P4 peptide increased neutrophil killing of opsonised pneumococci by 8.6% (CI 6.35-10.76, p < 0.001) in all phases of sepsis, independent of infection source and microbiological status. This represented a 54.9% increase in bacterial killing compared to unstimulated neutrophils (15.6%) in early phase samples. Similarly, P4 peptide treatment significantly increased neutrophil and monocyte intraphagosomal reporter bead association and oxidation, independent of infection source. CONCLUSIONS: We have extended pre-clinical work to demonstrate that P4 peptide significantly increases phagocytosis and bacterial killing in samples from a target patient population with severe sepsis. 
This study supports the rationale for augmented passive immunotherapy as a therapeutic strategy in severe sepsis. |
Chronic health conditions in Medicare beneficiaries 65 years and older with HIV infection
Friedman EE , Duffus WA . AIDS 2016 30 (16) 2529-2536 OBJECTIVES: To examine sociodemographic factors and chronic health conditions of people living with HIV (PLWHIV/HIV+) ≥65 years, and to compare their chronic disease prevalence to beneficiaries without HIV. DESIGN: National fee for service (FFS) Medicare claims data (parts A and B) from 2006-2009 were used to create a retrospective cohort of beneficiaries ≥65 years old. METHODS: Beneficiaries with 1 inpatient or skilled nursing facility claim, or 2 outpatient claims with HIV diagnosis codes were considered HIV+. HIV+ beneficiaries were compared to uninfected beneficiaries on demographic factors and on the prevalence of hypertension, hyperlipidemia, ischemic heart disease, rheumatoid arthritis/osteoarthritis, and diabetes. Odds ratios (OR), 95% confidence intervals (CI), and p-values were calculated. Adjustment variables included age, sex, race/ethnicity, end stage renal disease (ESRD), and dual Medicare-Medicaid enrollment. Chronic conditions were examined individually, and as an index from zero to all five conditions. RESULTS: Of 29,060,418 eligible beneficiaries, 24,735 (0.09%) were HIV+. HIV+ beneficiaries were more likely to be Hispanic, African American, male, and younger (p<0.0001), and were 1.5 to 2.1 times as likely to have a chronic disease (diabetes (aOR) 1.51 95% CI (1.47, 1.55); rheumatoid arthritis/osteoarthritis 2.14 95% CI (2.08, 2.19)), and 2.4 to 7 times as likely to have 1-5 co-morbid chronic conditions (1 condition (aOR) 2.38 95% CI (2.21, 2.57); 5 conditions 7.07 95% CI (6.61, 7.56)). CONCLUSIONS: Our results show that PLWHIV ≥65 years are at higher risk of comorbidities than other FFS Medicare beneficiaries. This finding has implications for both the management and cost of the health of PLWHIV ≥65. |
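The odds ratios in this study are adjusted estimates from regression models; for orientation, an unadjusted odds ratio with a Woolf-type 95% confidence interval can be computed directly from a 2x2 table. The counts below are hypothetical, not the study's:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Woolf 95% CI from a 2x2 table:
    a = exposed with outcome, b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# hypothetical counts for one chronic condition (illustrative only)
print(odds_ratio_ci(600, 1400, 400, 1600))
```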
Clinical and laboratory profile of persons living with human immunodeficiency virus/acquired immune deficiency syndrome and histoplasmosis from a Colombian hospital
Caceres DH , Tobon AM , Cleveland AA , Scheel CM , Berbesi DY , Ochoa J , Restrepo A , Brandt ME , Chiller T , Gomez BL . Am J Trop Med Hyg 2016 95 (4) 918-924 Histoplasmosis is common among persons living with human immunodeficiency virus/acquired immune deficiency syndrome (PLWHA) in Latin America, but its diagnosis is difficult and often nonspecific. We conducted prospective screening for histoplasmosis among PLWHA hospitalized in Hospital La Maria in Medellin, Colombia, with signs or symptoms suggesting progressive disseminated histoplasmosis (PDH). The study's aim was to obtain a clinical and laboratory profile of PLWHA with PDH. Over 3 years (May 2008 to August 2011), we identified 89 PLWHA hospitalized with symptoms suggestive of PDH, of whom 45 (51%) had histoplasmosis. We observed tuberculosis (TB) coinfection in a large proportion of patients with PDH (35%), so all analyses were performed adjusting for this coinfection and, alternatively, excluding histoplasmosis patients with TB. Results showed that patients with PDH were more likely to have a Karnofsky score ≤ 30 (prevalence ratio [PR] = 1.98, 95% confidence interval [CI] = 0.97-4.06), liver compromise with hepatomegaly and/or splenomegaly (PR = 1.77, CI = 1.03-3.06), and serum alanine aminotransferase and aspartate aminotransferase elevated to values > 40 mU/mL (PR = 2.06, CI = 1.09-3.88 and PR = 1.53, CI = 0.99-2.35, respectively). Using multiple correspondence analyses, we identified in patients with PDH a profile characterized by constitutional symptoms, namely weight loss and Karnofsky classification ≤ 30; gastrointestinal manifestations with alteration of liver enzymes and hepatomegaly and/or splenomegaly; skin lesions; and hematological alterations.
These profiles are no substitute for laboratory diagnostics, but identifying clinical and laboratory indicators of PLWHA with PDH should allow development of strategies for reducing the time to diagnosis, and thus the mortality, caused by Histoplasma capsulatum. |
Clinical characteristics, virulence factors and molecular typing of methicillin-resistant Staphylococcus aureus infections in Shenzhen City, China
Hu L , Li Y , Lu Y , Klena JD , Qiu Y , Lin Y , Jiang M , Shi X , Chen L , Liu X , Ma H , Cheng J , Wu S , Kan B , Hu Q . Epidemiol Infect 2016 144 (14) 1-9 Methicillin-resistant Staphylococcus aureus (MRSA) has emerged as a serious cause of hospital- and community-acquired infection, and some strains are associated with greater severity. We investigated the clinical variability and molecular characteristics of MRSA infections in Shenzhen, China, through a study at nine sentinel hospitals from January to December 2014. MRSA infections were classified as community-associated (CA-MRSA), healthcare-associated (HA-MRSA), and healthcare-associated community-onset (HACO-MRSA). In total, 812 MRSA isolates were collected and 183 of these were selected for further study. Patients with HA-MRSA infections were generally older than patients in the other groups. Distinct body sites and clinical presentations were evident in infected patients, e.g. CA-MRSA (skin and soft tissue, 53%), HA-MRSA (respiratory tract, 22%; surgical site, 20%; trauma wounds, 20%), and HACO-MRSA (mastitis, 47%). Compared with HA-MRSA, strains in the other categories were significantly more susceptible to gentamicin, sulfamethoxazole/trimethoprim, and tetracycline. No resistance to vancomycin or linezolid was recorded. The predominant clonal lineage within each strain category was CC59-t437-SCCmec IV/V-agr I (CA, 51.4%; HA, 28.9%; HACO, 52.9%), which exhibited characteristics of a traditional CA clone together with agr I, which is more often associated with HA clones. In conclusion, for the three categories of MRSA infections there were significant differences in the clinical characteristics of patients, but the predominant clone in each category shared a similar genetic background, suggesting that transmission of MRSA strains has occurred between the community and hospitals in Shenzhen. |
How to prescribe fewer unnecessary antibiotics: Talking points that work with patients and their families
Fleming-Dutra KE , Mangione-Smith R , Hicks LA . Am Fam Physician 2016 94 (3) 200-2 Antibiotic resistance is one of the world’s most pressing public health problems. Antibiotic-resistant infections account for an estimated 2 million illnesses and 23,000 deaths annually in the United States.1 Antibiotic use is a major driver of resistance,1 and most antibiotics are used in outpatient settings.2 Nationally, population-based rates of antibiotic prescribing for children decreased from 2000 to 2010, but did not change among adults 18 to 64 years of age and increased for adults 65 years and older.3 Additionally, broad-spectrum antibiotic prescribing increased for all age groups from 2000 to 2010,3 and antibiotic prescribing for acute respiratory tract infections remained common in children and adults.3,4 In 2011, U.S. clinicians prescribed 262.5 million outpatient courses of antibiotics; of these, family physicians prescribed one-fourth, more than any other subspecialty.5 Thus, family physicians are critical partners in the effort to avoid antibiotic overuse. | Being a good antibiotic steward means protecting patients and the public from antibiotic resistance and adverse events by prescribing antibiotics only when needed, and prescribing the right drug at the right dosage for the right duration. Antibiotic use in childhood has been linked to increased risks of autoimmune diseases and obesity, which are likely mediated via disruptions in the microbiome.6 Clinicians should carefully weigh the risks and benefits when prescribing these drugs. In focus groups, messages about the risks associated with antibiotic use resonated with parents, who stated that they want to be informed about possible adverse drug events.7 However, adult patients seemed to be less concerned about the possibility of adverse drug events.7 |
Characterization of clinical and environmental isolates of Vibrio cidicii sp. nov., a close relative of Vibrio navarrensis.
Orata FD , Xu Y , Gladney LM , Rishishwar L , Case RJ , Boucher Y , Jordan IK , Tarr CL . Int J Syst Evol Microbiol 2016 66 (10) 4148-4155 Four Vibrio spp. isolates from the historical culture collection at the Centers for Disease Control and Prevention, obtained from human blood specimens (n = 3) and river water (n = 1), show characteristics distinct from those of isolates of the most closely related species, Vibrio navarrensis and Vibrio vulnificus, based on phenotypic and genotypic tests. They are specifically adapted to survival in both freshwater and seawater, being able to grow in rich media without added salts as well as salinities above that of seawater. Phenotypically, these isolates resemble V. navarrensis, their closest known relative with a validly published name, but the group of isolates is distinguished from V. navarrensis by the ability to utilize L-rhamnose. Average nucleotide identity and percent DNA-DNA hybridization values obtained from the pairwise comparisons of whole genome sequences of these isolates to V. navarrensis range from 95.4-95.8% and 61.9-64.3%, respectively, suggesting that the group represents a different species. Phylogenetic analysis of the core genome, including four protein-coding housekeeping genes (pyrH, recA, rpoA, and rpoB), places these four isolates into their own monophyletic clade, distinct from V. navarrensis and V. vulnificus. Based on these differences, we propose these isolates belong to a novel Vibrio species. The name Vibrio cidicii sp. nov. is proposed for these isolates; strain LMG 29267T (= CIP 111013T = 2756-81T), isolated from river water, is the type strain. |
Exposure science in an age of rapidly changing climate: challenges and opportunities
LaKind JS , Overpeck J , Breysse PN , Backer L , Richardson SD , Sobus J , Sapkota A , Upperman CR , Jiang C , Beard CB , Brunkard JM , Bell JE , Harris R , Chretien JP , Peltier RE , Chew GL , Blount BC . J Expo Sci Environ Epidemiol 2016 26 (6) 529-538 Climate change is anticipated to alter the production, use, release, and fate of environmental chemicals, likely leading to increased uncertainty in exposure and human health risk predictions. Exposure science provides a key connection between changes in climate and associated health outcomes. The theme of the 2015 Annual Meeting of the International Society of Exposure Science, "Exposures in an Evolving Environment," brought this issue to the fore. By directing attention to questions that may affect society in profound ways, exposure scientists have an opportunity to conduct "consequential science": doing science that matters, using our tools for the greater good and to answer key policy questions, and identifying causes leading to implementation of solutions. Understanding the implications of changing exposures on public health may be one of the most consequential areas of study in which exposure scientists could currently be engaged. In this paper, we use a series of case studies to identify exposure data gaps and research paths that will enable us to capture the information necessary for understanding climate change-related human exposures and consequent health impacts. We hope that this paper will focus attention on underdeveloped areas of exposure science that will likely have broad implications for public health. |
Spike gene deletion quasispecies in serum of patient with acute MERS-CoV infection.
Lu X , Rowe LA , Frace M , Stevens J , Abedi GR , El Nile O , Banassir T , Al-Masri M , Watson JT , Assiri A , Erdman DD . J Med Virol 2016 89 (3) 542-545 The spike glycoprotein of the Middle East respiratory syndrome coronavirus (MERS-CoV) facilitates receptor binding and cell entry. During investigation of a multi-facility outbreak of MERS-CoV in Taif, Saudi Arabia, we identified a mixed population of wild-type and variant sequences with a large 530-nucleotide deletion in the spike gene in the serum of one patient. The out-of-frame deletion predicted loss of most of the S2 subunit of the spike protein, leaving the S1 subunit with an intact receptor-binding domain. This finding documents human infection with a novel genetic variant of MERS-CoV present as a quasispecies. |
Inhibition of influenza A virus matrix and nonstructural gene expression using RNA interference.
McMillen CM , Beezhold DH , Blachere FM , Othumpangat S , Kashon ML , Noti JD . Virology 2016 497 171-184 Influenza antiviral drugs that act as protein inhibitors can lose their efficacy as resistant strains emerge. As an alternative strategy, we investigated the use of small interfering RNA molecules (siRNAs) by characterizing three siRNAs (M747, M776 and M832) targeting the influenza matrix 2 gene and three (NS570, NS595 and NS615) targeting the nonstructural protein 1 and 2 genes. We also re-examined two previously reported siRNAs, M331 and M950, which target the matrix 1 and 2 genes. Treatment with M331-, M776-, M832-, and M950-siRNAs reduced influenza virus titers. M776-siRNA-treated cells had 29.8% less infectious virus than cells treated with the previously characterized siRNA, M950. NS570-, NS595- and NS615-siRNAs reduced nonstructural protein 1 and 2 expression and enhanced type I interferon expression by 50%. Combination siRNA treatment attenuated 20.9% more infectious virus than single-siRNA treatment. Our results suggest a potential use for these siRNAs as an effective anti-influenza virus therapy. |
Healthy or unhealthy migrants? Identifying internal migration effects on mortality in Africa using health and demographic surveillance systems of the INDEPTH network
Ginsburg C , Bocquier P , Beguy D , Afolabi S , Augusto O , Derra K , Herbst K , Lankoande B , Odhiambo F , Otiende M , Soura A , Wamukoya M , Zabre P , White MJ , Collinson MA . Soc Sci Med 2016 164 59-73 Migration has been hypothesised to be selective on health but this healthy migrant hypothesis has generally been tested at destinations, and for only one type of flow, from deprived to better-off areas. The circulatory nature of migration is rarely accounted for. This study examines the relationship between different types of internal migration and adult mortality in Health and Demographic Surveillance System (HDSS) populations in West, East, and Southern Africa, and asks how the processes of selection, adaptation and propagation explain the migration-mortality relationship experienced in these contexts. The paper uses longitudinal data representing approximately 900 000 adults living in nine sub-Saharan African HDSS sites of the INDEPTH Network. Event History Analysis techniques are employed to examine the relationship between all-cause mortality and migration status, over periods ranging from 3 to 14 years for a total of nearly 4.5 million person-years. The study confirms the importance of migration in explaining variation in mortality, and the diversity of the migration-mortality relationship over a range of rural and urban local areas in the three African regions. The results confirm that the pattern of migration-mortality relationship is not exclusively explained by selection but also by propagation and adaptation. Consequences for public health policy are drawn. |
Assessing the effect of potential reductions in non-hepatic mortality on the estimated cost-effectiveness of hepatitis C treatment in early stages of liver disease
Leidner AJ , Chesson HW , Spradling PR , Holmberg SD . Appl Health Econ Health Policy 2016 15 (1) 65-74 BACKGROUND: Most cost-effectiveness analyses of hepatitis C (HCV) therapy focus on the benefits of reducing liver-related morbidity and mortality. OBJECTIVES: Our objective was to assess how cost-effectiveness estimates of HCV therapy can vary depending on assumptions regarding the potential impact of HCV therapy on non-hepatic mortality. METHODS: We adapted a state-transition model to include potential effects of HCV therapy on non-hepatic mortality. We assumed successful treatment could reduce non-hepatic mortality by as little as 0 % to as much as 100 %. Incremental cost-effectiveness ratios were computed comparing immediate treatment versus delayed treatment and comparing immediate treatment versus non-treatment. RESULTS: Comparing immediate treatment versus delayed treatment, when we included a 44 % reduction in non-hepatic mortality following successful HCV treatment, the incremental cost per quality-adjusted life year (QALY) gained by HCV treatment fell by 76 % (from US$314,100 to US$76,900) for patients with no fibrosis and by 43 % (from US$62,500 to US$35,800) for patients with moderate fibrosis. Comparing immediate treatment versus non-treatment, assuming a 44 % reduction in non-hepatic mortality following successful HCV treatment, the incremental cost per QALY gained by HCV treatment fell by 64 % (from US$186,700 to US$67,300) for patients with no fibrosis and by 27 % (from US$35,000 to US$25,500) for patients with moderate fibrosis. CONCLUSION: Including reductions in non-hepatic mortality from HCV treatment can have substantial effects on the estimated cost-effectiveness of treatment. |
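The incremental cost-effectiveness ratios reported above are, by definition, the difference in cost between two strategies divided by the difference in QALYs gained. A minimal sketch of that arithmetic; the per-patient totals below are hypothetical illustrations, since the abstract reports only the resulting ratios:

```python
def icer(cost_a, qaly_a, cost_b, qaly_b):
    """Incremental cost-effectiveness ratio of strategy A versus
    strategy B: extra cost divided by extra QALYs gained."""
    return (cost_a - cost_b) / (qaly_a - qaly_b)

# Hypothetical per-patient totals (immediate vs. delayed treatment),
# not values from the paper:
cost_per_qaly = icer(cost_a=85_000, qaly_a=18.2, cost_b=60_000, qaly_b=17.4)
```

In the paper's framework, including a reduction in non-hepatic mortality raises the QALYs gained by treatment (the denominator), which is why the reported cost-per-QALY figures fall so sharply.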
Vaccination coverage disparities between foreign-born and U.S.-born children aged 19-35 months, United States, 2010-2012
Varan AK , Rodriguez-Lainz A , Hill HA , Elam-Evans LD , Yankey D , Li Q . J Immigr Minor Health 2016 19 (4) 779-789 Healthy People 2020 targets high vaccination coverage among children. Although reductions in coverage disparities by race/ethnicity have been described, data by nativity are limited. The National Immunization Survey is a random-digit-dialed telephone survey that estimates vaccination coverage among U.S. children aged 19-35 months. We assessed coverage among 52,441 children from pooled 2010-2012 data for individual vaccines and the combined 4:3:1:3*:3:1:4 series (which includes ≥4 doses of diphtheria, tetanus, and acellular pertussis vaccine/diphtheria and tetanus toxoids vaccine/diphtheria, tetanus toxoids, and pertussis vaccine, ≥3 doses of poliovirus vaccine, ≥1 dose of measles-containing vaccine, ≥3 or ≥4 doses of Haemophilus influenzae type b vaccine (depending on product type of vaccine; denoted as 3* in the series name), ≥3 doses of hepatitis B vaccine, ≥1 dose of varicella vaccine, and ≥4 doses of pneumococcal conjugate vaccine). Coverage estimates controlling for sociodemographic factors and multivariable logistic regression modeling for 4:3:1:3*:3:1:4 series completion are presented. Significantly lower coverage among foreign-born children was detected for DTaP, hepatitis A, hepatitis B, Hib, pneumococcal conjugate, and rotavirus vaccines, and for the combined series. Series completion disparities persisted after control for demographic, access-to-care, poverty, and language effects. Substantial and potentially widening disparities in vaccination coverage exist among foreign-born children. Improved immunization strategies targeting this population and continued vaccination coverage monitoring by nativity are needed. |
Environmental isolation of circulating vaccine-derived poliovirus after interruption of wild poliovirus transmission - Nigeria, 2016
Etsano A , Damisa E , Shuaib F , Nganda GW , Enemaku O , Usman S , Adeniji A , Jorba J , Iber J , Ohuabunwo C , Nnadi C , Wiesen E . MMWR Morb Mortal Wkly Rep 2016 65 (30) 770-3 In September 2015, more than 1 year after reporting its last wild poliovirus (WPV) case in July 2014, Nigeria was removed from the list of countries with endemic poliovirus transmission, leaving Afghanistan and Pakistan as the only remaining countries with endemic WPV. However, on April 29, 2016, a laboratory-confirmed, circulating vaccine-derived poliovirus type 2 (cVDPV2) isolate was reported from an environmental sample collected in March from a sewage effluent site in Maiduguri Municipal Council, Borno State, a security-compromised area in northeastern Nigeria. VDPVs are genetic variants of the vaccine viruses with the potential to cause paralysis and can circulate in areas with low population immunity. The Nigeria National Polio Emergency Operations Center initiated emergency response activities, including administration of at least 2 doses of oral poliovirus vaccine (OPV) to all children aged <5 years through mass campaigns; retroactive searches for missed cases of acute flaccid paralysis (AFP); and enhanced environmental surveillance. Approximately 1 million children were vaccinated in the first OPV round. Thirteen previously unreported AFP cases were identified. Enhanced environmental surveillance has not resulted in detection of additional VDPV isolates. The detection of persistent circulation of VDPV2 in Borno State highlights the low population immunity, surveillance limitations, and risk for international spread of cVDPVs associated with insurgency-related insecurity. Increasing vaccination coverage with additional targeted supplemental immunization activities and reestablishment of effective routine immunization activities in newly secured and difficult-to-reach areas in Borno is urgently needed. |
The globally synchronized switch-another milestone toward achieving polio eradication
Wassilak SG , Vertefeuille JF , Martin RM . JAMA Pediatr 2016 170 (10) 927-928 To avoid the risks for vaccine-associated paralytic polio1 and circulating vaccine-derived poliovirus (cVDPV) outbreaks,2 the Polio Eradication and End-game Strategic Plan3 2013-2018 directs the phasing out of all oral poliovirus vaccine (OPV) use after wild poliovirus (WPV) is eradicated. This has started with the type 2 component in the vaccine. From April 17 through May 1, 2016, 155 countries using trivalent OPV (tOPV) (Sabin strains of types 1, 2, and 3) in their national immunization schedules removed it and introduced bivalent OPV (bOPV) (Sabin strains of types 1 and 3).3 This massive public health event required the engagement of every health facility providing vaccines in each of these countries. This unprecedented, synchronized vaccine introduction and withdrawal is termed the switch.4,5 | The switch was only possible owing to the efforts of thousands of health care professionals working in a coordinated manner across the globe. Successful implementation of the switch required extensive planning by national governments in partnership with Global Polio Eradication Initiative (GPEI) partners (World Health Organization [WHO], United Nations Children's Fund [UNICEF], Rotary International, the Bill and Melinda Gates Foundation, and the US Centers for Disease Control and Prevention) in collaboration with Gavi, the Vaccine Alliance, and other key stakeholders and partners. Independent monitors in each country, both national and international, visited health facilities and vaccine stores to check that tOPV was no longer being stored and that bOPV was in stock for use. |
Guideline for collection, analysis and presentation of safety data in clinical trials of vaccines in pregnant women
Jones CE , Munoz FM , Spiegel HM , Heininger U , Zuber PL , Edwards KM , Lambach P , Neels P , Kohl KS , Gidudu J , Hirschfeld S , Oleske JM , Khuri-Bulos N , Bauwens J , Eckert LO , Kochhar S , Bonhoeffer J , Heath PT . Vaccine 2016 34 (49) 5998-6006 Vaccination during pregnancy is increasingly being used as an effective approach for protecting both young infants and their mothers from serious infections. Drawing conclusions from published studies in this area can be difficult because of the inability to compare vaccine trial results across different studies and settings, owing to heterogeneity in the definitions of terms used to assess the safety of vaccines in pregnancy and in the data collected in such studies. The guidelines proposed in this document have been developed to harmonize safety data collection in all phases of clinical trials of vaccines in pregnant women and apply to data from the mother, fetus, and infant. Guidelines on the prioritization of the data to be collected are also provided to allow applicability in various geographic, cultural, and resource settings, including high-, middle-, and low-income countries. |
High-dose influenza vaccine favors acute plasmablast responses rather than long-term cellular responses
Kim JH , Talbot HK , Mishina M , Zhu Y , Chen J , Cao W , Reber AJ , Griffin MR , Shay DK , Spencer SM , Sambhara S . Vaccine 2016 34 (38) 4594-4601 High-dose (HD) influenza vaccine shows improved relative efficacy against influenza disease compared to standard-dose (SD) vaccine in individuals aged ≥65 years. This has been partially credited to superior serological responses, but a comprehensive understanding of the cell-mediated immunity (CMI) elicited by HD vaccine remains lacking. In the current study, a total of 105 participants were randomly administered HD or SD vaccine and were evaluated for serological responses. Subsets of the group (n=12-26 per group) were evaluated for B and T cell responses at days 0, 7, 14 and 28 post-vaccination by flow cytometry or ELISPOT assay. HD vaccine elicited significantly higher hemagglutination inhibition (HI) titers than SD vaccine at d28, but comparable titers at d365 post-vaccination. HD vaccine also elicited higher vaccine-specific plasmablast responses at d7 post-vaccination than SD vaccine. However, long-lived memory B cell induction, cytokine-secreting T cell responses, and persistence of serological memory were comparable regardless of vaccine dose. Strategies beyond increasing antigen amount may be needed to improve CMI in older adults. TRIAL REGISTRATION: ClinicalTrials.gov NCT01189123. |
Case-control study of vaccine effectiveness in preventing laboratory-confirmed influenza hospitalizations in older adults, United States, 2010-11
Havers FP , Sokolow L , Shay DK , Farley MM , Monroe M , Meek J , Kirley PD , Bennett NM , Morin C , Aragon D , Thomas A , Schaffner W , Zansky SM , Baumbach J , Ferdinands J , Fry AM . Clin Infect Dis 2016 63 (10) 1304-1311 BACKGROUND: Older adults are at increased risk of influenza-associated complications, including hospitalization, but influenza vaccine effectiveness (VE) data are limited for this population. We conducted a case-control study to estimate VE to prevent laboratory-confirmed influenza hospitalizations among adults aged ≥50 years in eleven U.S. Emerging Infections Program (EIP) hospitalization surveillance sites. METHODS: Cases were RT-PCR-confirmed influenza infections in adults ≥50 years old hospitalized during the 2010-11 influenza season, identified through EIP surveillance. Community controls, identified through home telephone lists, were matched by age group (±5 years), county, and month of case hospitalization. Vaccination status was determined by self-report (with location and date) or medical records. Conditional logistic regression models were used to calculate adjusted VE (aVE) estimates [100 x (1 - adjusted odds ratio)], adjusting for sex, race, socioeconomic factors, smoking, chronic medical conditions, recent respiratory hospitalizations, and functional status. RESULTS: Among case-patients, 205/368 (55%) were vaccinated; 489/773 (63%) of controls were vaccinated. Case-patients were more likely to be persons of non-white race, to have ≥2 chronic health conditions, to have had a recent hospitalization for a respiratory condition, to have an income <$35,000, and to report a lower functional status score (P-values <0.01 for all). Adjusted VE was 56.8% (95% confidence interval (CI): 34.1%-71.7%) and was similar by age, including among adults ≥75 years [aVE 57.3% (95% CI 15.9%-78.4%)].
CONCLUSION: During 2010-11, influenza vaccination was associated with a significant reduction of the risk of laboratory-confirmed influenza hospitalization among adults aged ≥50 years regardless of age group. |
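The abstract states the VE formula explicitly: VE = 100 x (1 - adjusted odds ratio). A one-line sketch; the aOR of 0.432 below is back-calculated from the reported overall VE of 56.8% for illustration, not taken from the paper's tables:

```python
def vaccine_effectiveness(adjusted_or):
    """VE (%) = 100 x (1 - adjusted odds ratio), per the abstract."""
    return 100 * (1 - adjusted_or)

# aOR back-calculated from the reported 56.8% overall VE:
ve = vaccine_effectiveness(0.432)
```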
A qualitative study of male veterans' violence perpetration and treatment preferences
Tharp AT , Sherman M , Holland K , Townsend B , Bowling U . Mil Med 2016 181 (8) 735-9 Prevention and treatment of intimate partner violence (IPV) have increasingly focused on engaging men; however, very little work has examined how men manage the negative emotions associated with relationship conflict, or their preferences for and perceived barriers to treatment. Given the overrepresentation of IPV among men with post-traumatic stress disorder, the perspectives of male veterans with and without post-traumatic stress disorder are critical to informing IPV prevention and treatment within the Veterans Administration (VA) healthcare system. This qualitative study involved interviews with 25 male veterans who reported recent IPV perpetration. Interview themes included coping with the emotions associated with violence, and preferences for and barriers to seeking IPV-related treatment. Results indicated that participants were interested in receiving IPV treatment through the VA, and the interviews yielded several suggestions for developing or adapting prevention and treatment options for male veterans and their families that take violence in their relationships into account. |
Influence of Aspergillus fumigatus conidia viability on murine pulmonary microRNA and mRNA expression following subchronic inhalation exposure.
Croston TL , Nayak AP , Lemons AR , Goldsmith WT , Gu JK , Germolec DR , Beezhold DH , Green BJ . Clin Exp Allergy 2016 46 (10) 1315-27 BACKGROUND: Personal exposure to fungal bioaerosols derived from contaminated building materials or agricultural commodities may induce or exacerbate a variety of adverse health effects. The genomic mechanisms that underlie pulmonary immune responses to fungal bioaerosols have remained unclear. OBJECTIVE: The impact of fungal viability on the pulmonary microRNA and messenger RNA profiles that regulate murine immune responses was evaluated following subchronic inhalation exposure to Aspergillus fumigatus conidia. METHODS: Three groups of naive B6C3F1/N mice were exposed via nose-only inhalation to A. fumigatus viable conidia, heat-inactivated conidia, or HEPA-filtered air twice a week for 13 weeks. Total RNA was isolated from whole lung 24 and 48 hours post final exposure and was further processed for gene expression and microRNA array analysis. The molecular network pathways between viable and heat-inactivated conidia groups were evaluated. RESULTS: Comparison of datasets revealed increased Il4, Il13, and Il33 expression in mice exposed to viable versus heat-inactivated conidia. Of 415 microRNAs detected, approximately 50% were altered in mice exposed to viable versus heat-inactivated conidia 48 hours post exposure. Significantly downregulated (P < 0.05) miR-29a-3p was predicted to regulate TGF-beta3 and Clec7a, genes involved in innate responses to viable A. fumigatus. Also significantly downregulated (P < 0.05), miR-23b-3p regulates genes involved in pulmonary IL-13 and IL-33 responses and SMAD2, downstream of TGF-beta signaling. Using Ingenuity Pathway Analysis, a novel interaction was identified between viable conidia and SMAD2/3. CONCLUSION AND CLINICAL RELEVANCE: Examination of the pulmonary genetic profiles revealed differentially expressed genes and microRNAs following subchronic inhalation exposure to A. fumigatus. 
MicroRNAs regulating genes involved in the pulmonary immune responses were those with the greatest fold change. Specifically, germinating A. fumigatus conidia were associated with Clec7a and were predicted to interact with Il13 and Il33. Furthermore, altered microRNAs may serve as potential biomarkers to evaluate fungal exposure. |
Clinical evaluation of the BD FACSPresto Near-Patient CD4 Counter in Kenya
Angira F , Akoth B , Omolo P , Opollo V , Bornheimer S , Judge K , Tilahun H , Lu B , Omana-Zapata I , Zeh C . PLoS One 2016 11 (8) e0157939 BACKGROUND: The BD FACSPresto Near-Patient CD4 Counter was developed to expand HIV/AIDS management in resource-limited settings. It measures absolute CD4 counts (AbsCD4), percent CD4 (%CD4), and hemoglobin (Hb) from a single drop of capillary or venous blood in approximately 23 minutes, with a throughput of 10 samples per hour. We assessed the performance of the BD FACSPresto system, evaluating accuracy, stability, linearity, precision, and reference intervals using capillary and venous blood at the KEMRI/CDC HIV-research laboratory, Kisumu, Kenya, and precision and linearity at BD Biosciences, California, USA. METHODS: For accuracy, venous samples were tested using the BD FACSCalibur instrument with BD Tritest CD3/CD4/CD45 reagent, BD Trucount tubes, and BD Multiset software for AbsCD4 and %CD4, and the Sysmex KX-21N for Hb. Stability studies evaluated duration of staining (18-120-minute incubation) and the effects of venous blood storage <6-24 hours post-draw. A normal cohort was tested for reference intervals. Precision covered multiple days, operators, and instruments. Linearity required mixing two pools of samples to obtain evenly spaced concentrations for AbsCD4, total lymphocytes, and Hb. RESULTS: AbsCD4 and %CD4 venous/capillary (N = 189/N = 162) accuracy results gave Deming regression slopes within 0.97-1.03 and R2 ≥0.96. For Hb, Deming regression results were R2 ≥0.94 and slope ≥0.94 for both venous and capillary samples. Stability remained within 10% for 2 hours after staining and for venous blood stored less than 24 hours. Reference interval results showed that gender differences, but not age differences, were statistically significant (p<0.05). Precision results had <3.5% coefficient of variation for AbsCD4, %CD4, and Hb, except for low-AbsCD4 samples (<6.8%).
Linearity was 42-4,897 cells/μL for AbsCD4, 182-11,704 cells/μL for total lymphocytes, and 2-24 g/dL for Hb. CONCLUSIONS: The BD FACSPresto system provides accurate, precise clinical results for capillary or venous blood samples and is suitable for near-patient CD4 testing. TRIAL REGISTRATION: ClinicalTrials.gov NCT02396355. |
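Deming regression, used above for method comparison, fits a line allowing measurement error in both the reference and test methods, unlike ordinary least squares. A pure-Python sketch of the standard moment-based estimator (with an assumed error-variance ratio; this is illustrative, not the analysis code used in the study):

```python
def deming_fit(xs, ys, var_ratio=1.0):
    """Deming regression slope and intercept. var_ratio is the assumed
    ratio of y-error variance to x-error variance; 1.0 gives an
    orthogonal (equal-error) fit."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Sample variances and covariance:
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    d = var_ratio
    slope = (syy - d * sxx
             + ((syy - d * sxx) ** 2 + 4 * d * sxy ** 2) ** 0.5) / (2 * sxy)
    return slope, my - slope * mx

# Data lying exactly on y = 2x recovers slope 2, intercept 0:
slope, intercept = deming_fit([1, 2, 3, 4], [2, 4, 6, 8])
```

In a method-comparison study like this one, a slope near 1 and intercept near 0 indicate agreement between the new counter and the reference instrument.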
Usual intake of key minerals among children in the second year of life, NHANES 2003-2012
Hamner HC , Perrine CG , Scanlon KS . Nutrients 2016 8 (8) Iron, calcium, and zinc are important nutrients for the young, developing child. This study describes the usual intake of iron, calcium, and zinc among US children in the second year of life using two days of dietary intake data from the National Health and Nutrition Examination Survey 2003-2012. Estimates were calculated using PC-SIDE to account for within- and between-person variation. Mean usual iron, calcium, and zinc intakes were 9.5 mg/day, 1046 mg/day, and 7.1 mg/day, respectively. Over a quarter of children (26.1%) had usual iron intakes less than the Recommended Dietary Allowance (RDA). Eleven percent of children had usual calcium intakes below the RDA, and over half of children had usual intakes of zinc that exceeded the tolerable upper intake level (UL). Two percent or less had usual intakes below the Estimated Average Requirement (EAR) for iron, calcium, and zinc. Our findings suggest that during 2003-2012, one in four children and one in ten children had usual intakes below the RDA for iron and calcium, respectively. Children who are not meeting their nutrient requirements could be at increased risk for developing deficiencies, such as iron deficiency, or for lacking the adequate nutrients required for growth and development. One in every two children is exceeding the UL for zinc, but these estimates should be interpreted with caution given the limited data on adverse health outcomes. Continued monitoring of zinc intake and further assessment of the potential for adverse health outcomes associated with high zinc intakes may be needed. |
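PC-SIDE estimates a usual-intake distribution by removing within-person (day-to-day) variance before computing the proportion below a cutpoint such as the RDA or EAR. A simplified illustration of that final step, assuming a normal usual-intake distribution; the actual method estimates the distribution semi-parametrically from the repeated recall days, and any numbers plugged in here are hypothetical:

```python
import math

def prop_below(cutpoint, usual_mean, usual_sd):
    """P(usual intake < cutpoint) under a normal usual-intake
    distribution whose SD reflects between-person variation only."""
    z = (cutpoint - usual_mean) / usual_sd
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

Because day-to-day noise is stripped out, the usual-intake SD is smaller than the raw single-day SD, which pulls the estimated proportion below (or above) a cutpoint toward less extreme values.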
CDC Grand Rounds: Adolescence - preparing for lifelong health and wellness
Banspach S , Zaza S , Dittus P , Michael S , Brindis CD , Thorpe P . MMWR Morb Mortal Wkly Rep 2016 65 (30) 759-62 Approximately 42 million adolescents aged 10-19 years, representing 13% of the population, resided in the United States in 2014. Adolescence is characterized by rapid and profound physical, intellectual, emotional, and psychological changes, as well as development of healthy or risky behaviors that can last a lifetime. Parents have strong influence on their adolescent children's lives, and family-based programs can help parents support healthy adolescent development. Because schools are natural learning environments, implementing and improving school-based policies and programs are strategic ways to reinforce healthy behaviors and educate adolescents about reducing risky behaviors. Health care during adolescence should be tailored to meet the changing developmental needs of the adolescent while providing welcoming, safe, and confidential care. Parents, educators, care providers, public health officials, and communities should collaborate in fostering healthy environments for all adolescents, now and into the future. |
Design and Methodological Considerations of the Centers for Disease Control and Prevention Urologic and Renal Protocol for the Newborn and Young Child with Spina Bifida
Routh JC , Cheng EY , Austin JC , Baum MA , Gargollo PC , Grady RW , Herron AR , Kim SS , King SJ , Koh CJ , Paramsothy P , Raman L , Schechter MS , Smith KA , Tanaka ST , Thibadeau JK , Walker WO , Wallis MC , Wiener JS , Joseph DB . J Urol 2016 196 (6) 1728-1734 INTRODUCTION: Care of children with spina bifida (SB) has significantly advanced over the last half-century, resulting in gains in longevity and quality of life for affected children and caregivers. Bladder dysfunction is the norm in SB patients and may result in infection, renal scarring, and chronic kidney disease. However, the optimal urologic management for SB-related bladder dysfunction is unknown. METHODS: In 2012, Centers for Disease Control and Prevention (CDC) convened a working group composed of pediatric urologists, nephrologists, epidemiologists, methodologists, community advocates, and CDC personnel to develop a protocol to optimize urologic care of children with SB from the newborn period through 5 years of age. RESULTS: An iterative quality-improvement protocol was selected; in this model, participating institutions agree to prospectively treat all newborns with SB using a single consensus-based protocol. Over the course of the 5-year study period, study outcomes are routinely assessed and the protocol adjusted as needed in order to optimize patient and process outcomes. Primary study outcomes include urinary tract infections (UTI), renal scarring, renal function, and bladder characteristics. The protocol specifies the timing and use of testing (e.g., ultrasonography, urodynamics) and interventions (e.g., intermittent catheterization, prophylactic antibiotics, antimuscarinic medications). Starting in 2014, the CDC began funding nine study sites to implement and evaluate the protocol. CONCLUSIONS: The CDC Urologic and Renal Protocol for the Newborn and Young Child with Spina Bifida began accruing patients in 2015. 
Assessment in the first 5 years will focus on UTIs, renal function, renal scarring, and clinical process improvements. |
Simulation and measurement of medium-frequency signals coupling from a line to a loop antenna
Damiano NW , Li J , Zhou C , Brocker DE , Qin Y , Werner DH , Werner PL . IEEE Trans Ind Appl 2016 52 (4) 3527-3534 The underground-mining environment can affect radio-signal propagation in various ways. Understanding these effects is especially critical in evaluating communications systems used during normal mining operations and during mine emergencies. One of these types of communications systems relies on medium-frequency (MF) radio frequencies. This paper presents the simulation and measurement results of recent National Institute for Occupational Safety and Health (NIOSH) research aimed at investigating MF coupling between a transmission line (TL) and a loop antenna in an underground coal mine. Two different types of measurements were completed: 1) line-current distribution and 2) line-to-antenna coupling. Measurements were taken underground in an experimental coal mine and on a specially designed surface test area. The results of these tests are characterized by current along a TL and voltage induced in the loop from a line. This paper concludes with a discussion of issues for MF TLs. These include electromagnetic fields at the ends of the TL, connection of the ends of the TL, the effect of other conductors underground, and the proximity of coal or earth. These results could help operators by providing examples of these challenges that may be experienced underground and a method by which to measure voltage induced by a line. |
Vibrations transmitted from human hands to upper arm, shoulder, back, neck, and head
Xu XS , Dong RG , Welcome DE , Warren C , McDowell TW , Wu JZ . Int J Ind Ergon 2016 62 1-12 Some powered hand tools can generate significant vibration at frequencies below 25 Hz. It is not clear whether such vibration can be effectively transmitted to the upper arm, shoulder, neck, and head and cause adverse effects in these substructures. The objective of this study is to investigate the vibration transmission from the human hands to these substructures. Eight human subjects participated in the experiment, which was conducted on a 1-D vibration test system. Unlike many vibration transmission studies, both the right and left hand-arm systems were simultaneously exposed to the vibration to simulate a working posture in the experiment. A laser vibrometer and three accelerometers were used to measure the vibration transmitted to the substructures. The apparent mass at the palm of each hand was also measured to help in understanding the transmitted vibration and biodynamic response. This study found that the upper arm resonance frequency was 7-12 Hz, the shoulder resonance was 7-9 Hz, and the back and neck resonances were 6-7 Hz. The responses were affected by the hand-arm posture, applied hand force, and vibration magnitude. The transmissibility measured on the upper arm had a trend similar to that of the apparent mass measured at the palm in their major resonant frequency ranges. The implications of the results are discussed. Relevance to industry: Musculoskeletal disorders (MSDs) of the shoulder and neck are important issues among many workers. Many of these workers use heavy-duty powered hand tools. The combined mechanical loads and vibration exposures are among the major factors contributing to the development of MSDs. The vibration characteristics of the body segments examined in this study can be used to help understand MSDs and to help develop more effective intervention methods. © 2016. |
Evaluation of the impact of the revised National Institute for Occupational Safety and Health lifting equation
Lu ML , Putz-Anderson V , Garg A , Davis KG . Hum Factors 2016 58 (5) 667-82 OBJECTIVE: The objective of this article is to evaluate the impact of the Revised National Institute for Occupational Safety and Health Lifting Equation (RNLE). BACKGROUND: The RNLE has been used extensively as a risk assessment method for prevention of low back pain (LBP). However, the impact of the RNLE has not been documented. METHODS: A systematic review of the literature on the RNLE was conducted. The review consisted of three parts: characterization of the RNLE publications, assessment of the impact of the RNLE, and evaluation of the influences of the RNLE on ergonomic standards. The literature for assessing the impact was categorized into four research areas: methodology, laboratory, field, and risk assessment studies using the Lifting Index (LI) or Composite LI (CLI), both of which are the products of the RNLE. RESULTS: The impact of the RNLE has been both widespread and influential. We found 24 studies that examined the criteria used to define lifting capacity used by the RNLE, 28 studies that compared risk assessment methods for identifying LBP, 23 studies that found the RNLE useful in identifying the risk of LBP with different work populations, and 13 studies on the relationship between LI/CLI and LBP outcomes. We also found evidence on the adoption of the RNLE as an ergonomic standard for use by various local, state, and international entities. CONCLUSION: The review found 13 studies that link LI/CLI to adverse LBP outcomes. These studies showed a positive relationship between LI/CLI metrics and the severity of LBP outcomes. |
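The RNLE reviewed above multiplies a 23-kg load constant by six task multipliers to give a Recommended Weight Limit (RWL); the Lifting Index is the actual load divided by the RWL. A sketch of the metric-unit formulas, with the frequency (FM) and coupling (CM) multipliers passed in as given values since they come from NIOSH lookup tables rather than closed-form expressions:

```python
def rnle_rwl(h_cm, v_cm, d_cm, a_deg, fm=1.0, cm=1.0):
    """Recommended Weight Limit (kg) per the revised NIOSH lifting
    equation, metric form. fm and cm are table-derived multipliers."""
    lc = 23.0                                          # load constant, kg
    hm = min(1.0, 25.0 / h_cm)                         # horizontal multiplier
    vm = 1.0 - 0.003 * abs(v_cm - 75.0)                # vertical multiplier
    dm = 1.0 if d_cm <= 25.0 else 0.82 + 4.5 / d_cm    # distance multiplier
    am = 1.0 - 0.0032 * a_deg                          # asymmetry multiplier
    return lc * hm * vm * dm * am * fm * cm

def lifting_index(load_kg, rwl_kg):
    # LI > 1.0 indicates elevated risk of low back pain
    return load_kg / rwl_kg
```

Under ideal conditions (H = 25 cm, V = 75 cm, D ≤ 25 cm, no asymmetry, FM = CM = 1) every multiplier equals 1 and the RWL is the full 23 kg.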
Population pharmacokinetics and pharmacodynamics of lumefantrine in young Ugandan children treated with artemether-lumefantrine for uncomplicated malaria
Tchaparian E , Sambol NC , Arinaitwe E , McCormack SA , Bigira V , Wanzira H , Muhindo M , Creek DJ , Sukumar N , Blessborn D , Tappero JW , Kakuru A , Bergqvist Y , Aweeka FT , Parikh S . J Infect Dis 2016 214 (8) 1243-51 BACKGROUND: The pharmacokinetics and pharmacodynamics of lumefantrine, a component of the most widely-used treatment for malaria, artemether-lumefantrine, have not been adequately characterized in young children. METHODS: Capillary whole blood was collected in 105 Ugandan children, ages 6 months to 2 years, treated for 249 episodes of Plasmodium falciparum malaria with artemether-lumefantrine. RESULTS: Population pharmacokinetics for lumefantrine employed a 2-compartment open model with first-order absorption. Age had a significant positive correlation with bioavailability in a model that included allometric scaling. Children not on trimethoprim-sulfamethoxazole (TS) with concentrations below 200 ng/mL had a 3-fold higher hazard of 28-day recurrent parasitemia compared to those above 200 ng/mL (p=0.0007). However, for children on TS, the risk of recurrent parasitemia did not differ significantly based on this threshold. Day 3 concentrations were a stronger predictor of 28-day recurrence than day 7 concentrations. CONCLUSIONS: We demonstrate that age, in addition to weight, is a determinant of lumefantrine exposure, and in the absence of TS, lumefantrine exposure is a determinant of recurrent parasitemia. Exposure in children 6 months to 2 years was generally lower than exposures published for older children and adults. Further refinement of artemether-lumefantrine dosing to improve exposure in infants and very young children may be warranted. |
Pregnant women and infants as sentinel populations to monitor prevalence of malaria: results of pilot study in Lake Zone of Tanzania
Willilo RA , Molteni F , Mandike R , Mugalura FE , Mutafungwa A , Thadeo A , Benedictor E , Kafuko JM , Kaspar N , Ramsan MM , Mwaipape O , McElroy PD , Gutman J , Colaco R , Reithinger R , Ngondi JM . Malar J 2016 15 (1) 392 BACKGROUND: As malaria control interventions are scaled-up, rational approaches are needed for monitoring impact over time. One proposed approach includes monitoring the prevalence of malaria infection among pregnant women and children at the time of routine preventive health facility (HF) visits. This pilot explored the feasibility and utility of tracking the prevalence of malaria infection in pregnant women attending their first antenatal care (ANC) visit and infants presenting at 9-12 months of age for measles vaccination. METHODS: Pregnant women attending first ANC and infants nine to 12 months old presenting for measles vaccination at a non-probability sample of 54 HFs in Tanzania's Lake Zone (Mara, Mwanza and Kagera Regions) were screened for malaria infection using a malaria rapid diagnostic test (RDT) from December 2012 to November 2013, regardless of symptoms. Participants who tested positive were treated for malaria per national guidelines. Data were collected monthly. RESULTS: Overall 89.9 and 78.1 % of expected monthly reports on malaria infection prevalence were received for pregnant women and infants, respectively. Among 51,467 pregnant women and 35,155 infants attending routine preventive HF visits, 41.2 and 37.3 % were tested with RDT, respectively. Malaria infection prevalence was 12.8 % [95 % confidence interval (CI) 11.3-14.3] among pregnant women and 11.0 % (95 % CI 9.5-12.5) among infants, and varied by month. There was good correlation of the prevalence of malaria among pregnant women and infants at the HF level (Spearman rho = 0.6; p < 0.001). This approach is estimated to cost $1.28 for every person tested, with the RDT accounting for 72 % of the cost. 
CONCLUSIONS: Malaria infection was common and well correlated among pregnant women and infants attending routine health services. Routine screening of these readily accessible populations may offer a practical strategy for continuously tracking malaria trends, particularly seasonal variation. Positivity rates among afebrile individuals presenting for routine care offer an advantage as they are unaffected by the prevalence of other causes of febrile illness, which could influence positivity rates among febrile patients presenting to outpatient clinics. The data presented here suggest that in addition to contributing to clinical management, ongoing screening of pregnant women could be used for routine surveillance and detection of hotspots. |
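The facility-level correlation reported above between prevalence in pregnant women and in infants is Spearman's rho, i.e., Pearson correlation applied to ranks. A self-contained sketch with average-rank handling of ties; the study's exact software is not stated:

```python
import math

def spearman_rho(x, y):
    """Spearman rank correlation of two equal-length sequences."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(order):
            j = i
            # Group tied values and assign them their average rank
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = math.sqrt(sum((a - mx) ** 2 for a in rx)
                    * sum((b - my) ** 2 for b in ry))
    return num / den
```

A rho of 0.6, as in the study, indicates a moderately strong monotone relationship: facilities with higher ANC prevalence tended to also rank higher on infant prevalence.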
Quantifying heterogeneous malaria exposure and clinical protection in a cohort of Ugandan children
Rodriguez-Barraquer I , Arinaitwe E , Jagannathan P , Boyle MJ , Tappero J , Muhindo M , Kamya MR , Dorsey G , Drakeley C , Ssewanyana I , Smith DL , Greenhouse B . J Infect Dis 2016 214 (7) 1072-80 BACKGROUND: Plasmodium falciparum malaria remains a leading cause of childhood morbidity and mortality. There are important gaps in our understanding of the factors driving the development of anti-malaria immunity as a function of age and exposure. METHODS: We use data from a cohort of 93 children participating in a clinical trial in an area of very high exposure to P. falciparum in Tororo, Uganda. We jointly quantify individual heterogeneity in risk of infection, and development of immunity against infection and clinical disease. RESULTS: Results show significant heterogeneity in the hazard of infection, and independent effects of age and cumulative number of infections on the risk of infection and disease. The risk of developing clinical malaria upon infection decreased on average by 6% (95% CI 0-12%) for each additional year of age and by 2% (95% CI 1-3%) for each additional prior infection. Children randomized to dihydroartemisinin-piperaquine (DP) for treatment appeared to develop immunity more slowly than those receiving artemether-lumefantrine (AL). CONCLUSION: Heterogeneity in P. falciparum exposure and immunity can be independently evaluated using detailed longitudinal studies. Improved understanding of the factors driving immunity will provide key information to anticipate the impact of malaria-control interventions and to understand the mechanisms of clinical immunity. |
A conversation about chemoprophylaxis
Itoh M , Arguin PM . Travel Med Infect Dis 2016 14 (5) 434-435 Meg: I'm in Angola right now where the country is seeing the largest yellow fever outbreak since 1986 and the number of malaria cases appears to be higher than normal this time of the year. I was at dinner the other night among seasoned travelers and I was struck by how different our attitudes were about malaria prophylaxis. | Paul: I hope you're taking yours. | Meg: Of course! I'm religiously taking mine every morning. Some of us are meticulous about not missing a dose whereas others seem to have a more cavalier attitude about it. They don't seem to be aware of the potential serious consequences of a malaria infection. And I was surprised by the spectrum of side effects they say they've experienced in the past with the different prophylaxis options. A lot of GI upset, and of course, the most common – weird, vivid dreams with mefloquine. But I've also heard people tell me about neuropsychiatric side effects even on other antimalarials. |
How reported usefulness modifies the association between neighborhood supports and walking behavior
Carlson SA , Paul P , Watson KB , Schmid TL , Fulton JE . Prev Med 2016 91 76-81 Neighborhood supports have been associated with walking, but this association may be modified by reports about the usefulness of these supports for promoting walking. This study examined the association between reported presence of neighborhood supports and walking and whether usefulness modified this association in a nationwide sample of U.S. adults. Measures of reported presence and use or potential use (i.e., usefulness) of neighborhood supports (shops within walking distance, transit stops, sidewalks, parks, interesting things to look at, well-lit at night, low crime rate, and cars following speed limit) were examined in 3973 adults who completed the 2014 SummerStyles survey. Multinomial regression models were used to examine the association between presence of supports with walking frequency (frequently, sometimes, rarely (referent)) and the role usefulness had on this association. The interaction term between reported presence and usefulness was significant for all supports (p<0.05). For adults who reported a support as useful, a positive association between presence of the support and walking frequency was observed for all supports. For adults who did not report a support as useful, the association between presence of the support and walking frequency was null for most supports and negative for sidewalks, well-lit at night, and low crime rate. The association between presence of neighborhood supports and walking is modified by reported usefulness of the support. Tailoring initiatives to meet a community's supply of and affinity for neighborhood supports may help initiatives designed to promote walking and walkable communities succeed. |
The integrated transport and health impact modeling tool in Nashville, Tennessee, USA: implementation steps and lessons learned
Whitfield GP , Meehan LA , Maizlish N , Wendel AM . J Transp Health 2016 5 172-181 The Integrated Transport and Health Impact Model (ITHIM) is a comprehensive tool that estimates the hypothetical health effects of transportation mode shifts through changes to physical activity, air pollution, and injuries. The purpose of this paper is to describe the implementation of ITHIM in greater Nashville, Tennessee (USA), describe important lessons learned, and serve as an implementation guide for other practitioners and researchers interested in running ITHIM. As might be expected in other metropolitan areas in the US, not all the required calibration data was available locally. We utilized data from local, state, and federal sources to fulfill the 14 ITHIM calibration items, which include disease burdens, travel habits, physical activity participation, air pollution levels, and traffic injuries and fatalities. Three scenarios were developed that modeled stepwise increases in walking and bicycling, and one that modeled reductions in car travel. Cost savings estimates were calculated by scaling national-level, disease-specific direct treatment costs and indirect lost productivity costs to the greater Nashville population of approximately 1.5 million. Implementation required approximately one year of intermittent, part-time work. Across the range of scenarios, results suggested that 24-123 deaths per year could be averted in the region through a 1-5% reduction in the burden of several chronic diseases. This translated into $10-$63 million in estimated direct and indirect cost savings per year. Implementing ITHIM in greater Nashville has provided local decision makers with important information on the potential health effects of transportation choices. Other jurisdictions interested in ITHIM might find the Nashville example as a useful guide to streamline the effort required to calibrate and run the model. © 2016. |
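The Nashville cost estimates above come from scaling national disease-specific cost totals to the local population and then applying the modeled fractional reduction in disease burden. A toy sketch of that proportional scaling; all figures used here are hypothetical, not taken from the study:

```python
def scaled_savings(national_cost, national_pop, local_pop, burden_reduction):
    """Scale a national annual disease cost to a local population,
    then apply the modeled fractional reduction in disease burden.

    Assumes per-capita costs are uniform nationally, a simplification
    relative to ITHIM's disease-specific calibration."""
    local_cost = national_cost * (local_pop / national_pop)
    return local_cost * burden_reduction

# Hypothetical example: a $1B national burden, a 1.5M-person region
# within a 300M-person country, and a modeled 2% burden reduction.
savings = scaled_savings(1e9, 300e6, 1.5e6, 0.02)
```

With those hypothetical inputs the region carries 0.5% of the national cost, so a 2% burden reduction yields $100,000 per year in estimated savings.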
Research gaps from evidence-based contraception guidance: the U.S. Medical Eligibility Criteria for Contraceptive Use, 2016 and the U.S. Selected Practice Recommendations for Contraceptive Use, 2016
Horton LG , Folger SG , Berry-Bibee E , Jatlaoui TC , Tepper NK , Curtis KM . Contraception 2016 94 (6) 582-589 The US Medical Eligibility Criteria for Contraceptive Use (US MEC) and the US Selected Practice Recommendations for Contraceptive Use (US SPR) were first adapted by the Centers for Disease Control and Prevention (CDC) from their World Health Organization counterparts in 2010 and 2013, respectively [1], [2]. These evidence-based guidance documents provide critically important information for health care providers on the safety of contraceptive methods for women with specific characteristics or medical conditions (US MEC) and on common but sometimes complex clinical issues regarding how to use contraceptive methods in select circumstances (US SPR). Through knowledge and use of the contraception guidance, providers are helping women select and use safe and effective contraceptive methods. Some of these methods may not otherwise have been considered due to patient or provider concerns about potential risks given the patient's particular characteristics or underlying medical condition, or their use may have been delayed due to unnecessary tests prior to initiation. The US MEC and US SPR remove unnecessary medical barriers to access and use of effective contraceptive methods; this is especially important for women with medical conditions associated with heightened risk during pregnancy [1]. | Between August 2014 and August 2015, CDC conducted a formal process to update the 2010 US MEC and the 2013 US SPR. Through this process, CDC identified topics warranting systematic reviews of published evidence to help inform recommendation updates and to serve as a basis for the addition of new topics to the existing guidance. Systematic reviews were then conducted, presented and discussed at a meeting of national family planning experts, and CDC determined the final recommendations. 
As part of this review process, gaps in the evidence were identified for which research is needed to further inform the recommendations. In this paper, we list these research gaps (Tables 1 and 2) and explore three of the topics in depth. |
Contraceptive use among nonpregnant and postpartum women at risk for unintended pregnancy, and female high school students, in the context of Zika preparedness - United States, 2011-2013 and 2015
Boulet SL , D'Angelo DV , Morrow B , Zapata L , Berry-Bibee E , Rivera M , Ellington S , Romero L , Lathrop E , Frey M , Williams T , Goldberg H , Warner L , Harrison L , Cox S , Pazol K , Barfield W , Jamieson DJ , Honein MA , Kroelinger CD . MMWR Morb Mortal Wkly Rep 2016 65 (30) 780-7 Zika virus infection during pregnancy can cause congenital microcephaly and brain abnormalities. Since 2015, Zika virus has been spreading through much of the World Health Organization's Region of the Americas, including U.S. territories. Zika virus is spread through the bite of Aedes aegypti or Aedes albopictus mosquitoes, by sex with an infected partner, or from a pregnant woman to her fetus during pregnancy. CDC estimates that 41 states are in the potential range of Aedes aegypti or Aedes albopictus mosquitoes, and on July 29, 2016, the Florida Department of Health identified an area in one neighborhood of Miami where Zika virus infections in multiple persons are being spread by bites of local mosquitoes. These are the first known cases of local mosquito-borne Zika virus transmission in the continental United States.(dagger) CDC prevention efforts include mosquito surveillance and control, targeted education about Zika virus and condom use to prevent sexual transmission, and guidance for providers on contraceptive counseling to reduce unintended pregnancy. 
To estimate the prevalence of contraceptive use among nonpregnant and postpartum women at risk for unintended pregnancy and sexually active female high school students living in the 41 states where mosquito-borne transmission might be possible, CDC used 2011-2013 and 2015 survey data from four state-based surveillance systems: the Behavioral Risk Factor Surveillance System (BRFSS, 2011-2013), which surveys adult women; the Pregnancy Risk Assessment Monitoring System (PRAMS, 2013) and the Maternal and Infant Health Assessment (MIHA, 2013), which survey women with a recent live birth; and the Youth Risk Behavior Survey (YRBS, 2015), which surveys students in grades 9-12. CDC defines an unintended pregnancy as one that is either unwanted (i.e., the pregnancy occurred when no children, or no more children, were desired) or mistimed (i.e., the pregnancy occurred earlier than desired). The proportion of women at risk for unintended pregnancy who used a highly effective reversible method, known as long-acting reversible contraception (LARC), ranged from 5.5% to 18.9% for BRFSS-surveyed women and 6.9% to 30.5% for PRAMS/MIHA-surveyed women. The proportion of women not using any contraception ranged from 12.3% to 34.3% (BRFSS) and from 3.5% to 15.3% (PRAMS/MIHA). YRBS data indicated that among sexually active female high school students, use of LARC at last intercourse ranged from 1.7% to 8.4%, and use of no contraception ranged from 7.3% to 22.8%. In the context of Zika preparedness, the full range of contraceptive methods approved by the Food and Drug Administration (FDA), including LARC, should be readily available and accessible for women who want to avoid or delay pregnancy.
Given low rates of LARC use, states can implement strategies to remove barriers to the access and availability of LARC including high device costs, limited provider reimbursement, lack of training for providers serving women and adolescents on insertion and removal of LARC, provider lack of knowledge and misperceptions about LARC, limited availability of youth-friendly services that address adolescent confidentiality concerns, inadequate client-centered counseling, and low consumer awareness of the range of contraceptive methods available. |
Multiple imputation of completely missing repeated measures data within persons from a complex sample: application to accelerometer data in the National Health and Nutrition Examination Survey
Liu B , Yu M , Graubard BI , Troiano RP , Schenker N . Stat Med 2016 35 (28) 5170-5188 The Physical Activity Monitor component was introduced into the 2003-2004 National Health and Nutrition Examination Survey (NHANES) to collect objective information on physical activity including both movement intensity counts and ambulatory steps. Because of an error in the accelerometer device initialization process, the steps data were missing for all participants in several primary sampling units, typically a single county or group of contiguous counties, who had intensity count data from their accelerometers. To avoid potential bias and loss in efficiency in estimation and inference involving the steps data, we considered methods to accurately impute the missing values for steps collected in the 2003-2004 NHANES. The objective was to come up with an efficient imputation method that minimized model-based assumptions. We adopted a multiple imputation approach based on additive regression, bootstrapping and predictive mean matching methods. This method fits alternative conditional expectation (ace) models, which use an automated procedure to estimate optimal transformations for both the predictor and response variables. This paper describes the approaches used in this imputation and evaluates the methods by comparing the distributions of the original and the imputed data. A simulation study using the observed data is also conducted as part of the model diagnostics. Finally, some real data analyses are performed to compare the before and after imputation results. |
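Predictive mean matching, one component of the imputation approach described above, fills each missing value with an observed ("donor") value whose model prediction is closest to the prediction for the missing case, so imputed values are always plausible observed values. A minimal single-predictor sketch; the paper's actual procedure adds bootstrapping, ace-style optimal transformations, and survey design considerations:

```python
def pmm_impute(x_obs, y_obs, x_mis):
    """Impute y for cases with missing y via predictive mean matching,
    using a simple least-squares fit of y on a single predictor x."""
    n = len(x_obs)
    mx = sum(x_obs) / n
    my = sum(y_obs) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x_obs, y_obs))
    sxx = sum((xi - mx) ** 2 for xi in x_obs)
    b = sxy / sxx
    a = my - b * mx
    y_hat_obs = [a + b * xi for xi in x_obs]
    imputed = []
    for xm in x_mis:
        y_hat = a + b * xm
        # Donate the observed y whose prediction is nearest to y_hat
        donor = min(range(n), key=lambda i: abs(y_hat_obs[i] - y_hat))
        imputed.append(y_obs[donor])
    return imputed
```

In multiple imputation this matching step is repeated across several perturbed fits (e.g., bootstrap resamples of the observed cases) to produce multiple completed datasets whose between-dataset variability reflects imputation uncertainty.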
Disparities in adult cigarette smoking - United States, 2002-2005 and 2010-2013
Martell BN , Garrett BE , Caraballo RS . MMWR Morb Mortal Wkly Rep 2016 65 (30) 753-8 Although cigarette smoking has substantially declined since the release of the 1964 Surgeon General's report on smoking and health,* disparities in tobacco use exist among racial/ethnic populations (1). Moreover, because estimates of U.S. adult cigarette smoking and tobacco use are usually limited to aggregate racial or ethnic population categories (i.e., non-Hispanic whites [whites]; non-Hispanic blacks or African Americans [blacks]; American Indians and Alaska Natives [American Indians/Alaska Natives]; Asians; Native Hawaiians or Pacific Islanders [Native Hawaiians/Pacific Islanders]; and Hispanics/Latinos [Hispanics]), these estimates can mask differences in cigarette smoking prevalence among subgroups of these populations. To assess the prevalence of and changes in cigarette smoking among persons aged ≥18 years in six racial/ethnic populations and 10 select subgroups in the United States,(dagger) CDC analyzed self-reported data collected during 2002-2005 and 2010-2013 from the National Survey on Drug Use and Health (NSDUH) (2) and compared differences between the two periods. During 2010-2013, the overall prevalence of cigarette smoking among the racial/ethnic populations and subgroups ranged from 38.9% for American Indians/Alaska Natives to 7.6% for both Chinese and Asian Indians. During 2010-2013, although cigarette smoking prevalence was relatively low among Asians overall (10.9%) compared with whites (24.9%), wide within-group differences in smoking prevalence existed among Asian subgroups, from 7.6% among both Chinese and Asian Indians to 20.0% among Koreans. Similarly, among Hispanics, the overall prevalence of current cigarette smoking was 19.9%; however, within Hispanic subgroups, prevalences ranged from 15.6% among Central/South Americans to 28.5% among Puerto Ricans. 
The overall prevalence of cigarette smoking was higher among men than among women during both 2002-2005 (30.0% men versus 23.9% women) and 2010-2013 (26.4% versus 21.1%) (p<0.05). These findings highlight the importance of disaggregating tobacco use estimates within broad racial/ethnic population categories to better understand and address disparities in tobacco use among U.S. adults. |
Paired real-time PCR assays for detection of Borrelia miyamotoi in North American Ixodes scapularis and Ixodes pacificus (Acari: Ixodidae).
Graham CB , Pilgard MA , Maes SE , Hojgaard A , Eisen RJ . Ticks Tick Borne Dis 2016 7 (6) 1230-1235 Borrelia miyamotoi is an emerging, tick-borne human pathogen. In North America, it is primarily associated with Ixodes scapularis and Ixodes pacificus, two species known to bite humans. Here we describe the development and evaluation of a pair of real-time TaqMan PCR assays designed to detect B. miyamotoi in North American ticks. We sought to achieve sensitivity to B. miyamotoi strains associated with ticks throughout North America, the full genetic diversity of which is unknown, by targeting sequences that are largely conserved between B. miyamotoi strains from the eastern United States and genetically distinct B. miyamotoi strains from Japan. The two assays target different loci on the B. miyamotoi chromosome and can be run side by side under identical cycling conditions. One of the assays also includes a tick DNA target that can be used to verify the integrity of tick-derived samples. Using both recombinant plasmid controls and genomic DNA from North American and Japanese strains, we determined that both assays reliably detect as few as 5 copies of the B. miyamotoi genome. We verified that neither detects B. burgdorferi, B. lonestari, or B. turicatae. This sensitive and specific pair of assays successfully detected B. miyamotoi in naturally infected, colony-reared nymphs and in field-collected I. scapularis and I. pacificus from the Northeast and the Pacific Northwest, respectively. These assays will be useful in screening field-collected Ixodes spp. from varied regions of North America to assess the risk of human exposure to this emerging pathogen. |
Phylogenetic and geographic patterns of bartonella host shifts among bat species.
McKee CD , Hayman DT , Kosoy MY , Webb CT . Infect Genet Evol 2016 44 382-394 The influence of factors contributing to parasite diversity in individual hosts and communities is increasingly studied, but there has been less focus on the dominant processes leading to parasite diversification. Using bartonella infections in bats as a model system, we explored the influence of three processes that can contribute to bartonella diversification and lineage formation: (1) spatial correlation in the invasion and transmission of bartonella among bats (phylogeography); (2) divergent adaptation of bartonellae to bat hosts and arthropod vectors; and (3) evolutionary codivergence between bats and bartonellae. Using a combination of global fit techniques and ancestral state reconstruction, we found that codivergence appears to be the dominant process leading to diversification of bartonella in bats, with lineages of bartonellae corresponding to separate bat suborders, superfamilies, and families. Furthermore, we estimated the rates at which bartonellae shift bat hosts across taxonomic scales (suborders, superfamilies, and families) and found that transition rates decrease with increasing taxonomic distance, providing support for a mechanism that can contribute to the observed evolutionary congruence between bats and their associated bartonellae. While bartonella diversification is associated with host sympatry, the influence of this factor is minor compared to the influence of codivergence, and there is a clear indication that some bartonella lineages span multiple regions, particularly between Africa and Southeast Asia. Divergent adaptation of bartonellae to bat hosts and arthropod vectors is apparent and can dilute the overall pattern of codivergence; however, its importance in the formation of Bartonella lineages in bats is small relative to codivergence. 
We argue that exploring all three of these processes yields a more complete understanding of bat-bartonella relationships and the evolution of the genus Bartonella, generally. Application of these methods to other infectious bacteria and viruses could uncover common processes that lead to parasite diversification and the formation of host-parasite relationships. |
Full-Genome Sequence of a Neuroinvasive West Nile Virus Lineage 2 Strain from a Fatal Horse Infection in South Africa.
Mentoor JL , Lubisi AB , Gerdes T , Human S , Williams JH , Venter M . Genome Announc 2016 4 (4) We report here the complete genome sequence of a lineage 2 West Nile virus (WNV) strain that resulted in fatal neurological disease in a horse in South Africa. Several recent reports exist of neurological disease associated with lineage 2 WNV in humans and horses in South Africa and Europe; however, there is a lack of sequencing data from recent fatal cases in Southern Africa, where these strains likely originate. A better understanding of the genetic composition of highly neuroinvasive lineage 2 strains may facilitate the identification of putative genetic factors associated with increased virulence. |
Development of a Rickettsia bellii-Specific TaqMan Assay Targeting the Citrate Synthase Gene.
Hecht JA , Allerdice ME , Krawczak FS , Labruna MB , Paddock CD , Karpathy SE . J Med Entomol 2016 53 (6) 1492-1495 Rickettsia bellii is a rickettsial species of unknown pathogenicity that infects argasid and ixodid ticks throughout the Americas. Many molecular assays used to detect spotted fever group (SFG) Rickettsia species do not detect R. bellii, so that infection with this bacterium may be concealed in tick populations when assays are used that screen specifically for SFG rickettsiae. We describe the development and validation of a R. bellii-specific, quantitative, real-time PCR TaqMan assay that targets a segment of the citrate synthase (gltA) gene. The specificity of this assay was validated against a panel of DNA samples that included 26 species of Rickettsia, Orientia, Ehrlichia, Anaplasma, and Bartonella, five samples of tick and human DNA, and DNA from 20 isolates of R. bellii, including 11 from North America and nine from South America. A R. bellii control plasmid was constructed, and serial dilutions of the plasmid were used to determine the limit of detection of the assay to be one copy per 4 µl of template DNA. This assay can be used to better determine the role of R. bellii in the epidemiology of tick-borne rickettsioses in the Western Hemisphere. |
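Serial dilutions of a quantified plasmid control, as described above, start from a conversion of the plasmid stock's mass concentration to copy number. The arithmetic is generic; the sketch below uses illustrative function names, and the example concentration and plasmid length in the usage note are hypothetical, not values from the study.

```python
AVOGADRO = 6.022e23
BP_MASS_G_PER_MOL = 650  # average molar mass of one double-stranded base pair

def plasmid_copies_per_ul(conc_ng_per_ul, plasmid_length_bp):
    """Copies/ul of a dsDNA plasmid stock, for building a qPCR dilution series."""
    grams_per_ul = conc_ng_per_ul * 1e-9
    plasmid_mol_weight = plasmid_length_bp * BP_MASS_G_PER_MOL  # g/mol per plasmid
    return grams_per_ul / plasmid_mol_weight * AVOGADRO

def dilution_series(stock_copies_per_ul, steps, factor=10):
    """Expected copies/ul at each step of a serial dilution of the stock."""
    return [stock_copies_per_ul / factor ** i for i in range(steps)]
```

For a hypothetical 3,000-bp plasmid at 1 ng/µl, this works out to roughly 3 x 10^8 copies/µl, which a tenfold series then steps down toward the single-copy range probed in limit-of-detection experiments.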
An Optimized Method for Quantification of Pathogenic Leptospira in Environmental Water Samples.
Riediger IN , Hoffmaster AR , Casanovas-Massana A , Biondo AW , Ko AI , Stoddard RA . PLoS One 2016 11 (8) e0160523 Leptospirosis is a zoonotic disease usually acquired by contact with water contaminated with urine of infected animals. However, few molecular methods have been used to monitor or quantify pathogenic Leptospira in environmental water samples. Here we optimized a DNA extraction method for the quantification of leptospires using a previously described TaqMan-based qPCR method targeting lipL32, a gene unique to and highly conserved in pathogenic Leptospira. QIAamp DNA mini, MO BIO PowerWater DNA and PowerSoil DNA Isolation kits were evaluated to extract DNA from sewage, pond, river and ultrapure water samples spiked with leptospires. Performance of each kit varied with sample type. Sample processing methods were further evaluated and optimized using the PowerSoil DNA kit due to its performance on turbid water samples and reproducibility. Centrifugation speeds, water volumes and use of Escherichia coli as a carrier were compared to improve DNA recovery. All matrices showed a strong linearity in a range of concentrations from 10^6 to 10^0 leptospires/mL and lower limits of detection ranging from <1 cell/mL for river water to 36 cells/mL for ultrapure water with E. coli as a carrier. In conclusion, we optimized a method to quantify pathogenic Leptospira in environmental waters (river, pond and sewage) which consists of the concentration of 40 mL samples by centrifugation at 15,000 x g for 20 minutes at 4°C, followed by DNA extraction with the PowerSoil DNA Isolation kit. Although the method described herein needs to be validated in environmental studies, it potentially provides the opportunity for effective, timely and sensitive assessment of environmental leptospiral burden. |
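Quantification with a lipL32-style qPCR typically rests on a standard curve of quantification cycle (Cq) against log10 concentration, fit across dilution points such as the 10^6 to 10^0 leptospires/mL range above. A minimal sketch of the fit and back-calculation, with illustrative helper names (not from the study):

```python
def fit_standard_curve(log10_conc, cq):
    """Least-squares fit of Cq = slope * log10(concentration) + intercept."""
    n = len(cq)
    mx = sum(log10_conc) / n
    my = sum(cq) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_conc, cq))
    sxx = sum((x - mx) ** 2 for x in log10_conc)
    slope = sxy / sxx
    intercept = my - slope * mx
    return slope, intercept

def quantify(cq, slope, intercept):
    """Back-calculate concentration (e.g. cells/mL) for an unknown from its Cq."""
    return 10 ** ((cq - intercept) / slope)

def efficiency(slope):
    """Amplification efficiency implied by the curve slope (1.0 = 100%)."""
    return 10 ** (-1 / slope) - 1
```

A slope near -3.32 corresponds to ~100% amplification efficiency (perfect doubling per cycle), which is the usual sanity check before trusting back-calculated concentrations.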
A simulation model to estimate the risk of transfusion-transmitted arboviral infection.
Shang G , Biggerstaff BJ , Richardson AM , Gahan ME , Lidbury BA . Transfus Apher Sci 2016 55 (2) 233-239 BACKGROUND: The arboviruses West Nile virus (WNV), dengue virus (DENV) and Ross River virus (RRV) have been demonstrated to be blood transfusion-transmissible. A model to estimate the risk of WNV to the blood supply using a Monte Carlo approach has been developed and also applied to Chikungunya virus. Also, a probabilistic model was developed to assess the risk of DENV to blood safety, which was later adapted to RRV. To address the strengths and limitations of each model, we present a hybrid model that promises improved accuracy and is broadly applicable for assessing the risk of arboviral transmission by blood transfusion. MATERIAL AND METHODS: Data were drawn from the Cairns Public Health Unit (Australia) and published literature. Based on the published models and using R code, a novel 'combined' model was developed and validated against the BP model using sensitivity testing. RESULTS: The mean risk per 10,000 from the combined model was 0.98 (range 0.79 to 1.25), and the maximum risk was 4.45 (range 2.62 to 7.67). The corresponding values for the BP model were a mean of 1.20 (range 0.84 to 1.55) and a maximum of 2.86 (range 1.33 to 5.23). CONCLUSION: The combined simulation model is simple and robust. We propose it can be applied as a 'generic' arbovirus model to assess the risk from known or novel arboviral threats to the blood supply. |
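The abstract gives no model internals, but the general Monte Carlo shape of such risk estimators is straightforward: repeatedly draw uncertain inputs, compute the risk per 10,000 donations for each draw, and summarize the resulting distribution. A deliberately stripped-down sketch follows; the parameter names and values are entirely hypothetical, and a uniform uncertainty band stands in for the study's actual input distributions:

```python
import random

def simulate_risk(n_iter=10_000, mean_prevalence=1e-4, transmissibility=0.9, seed=1):
    """Monte Carlo sketch: risk of an infectious donation, per 10,000 donations.

    mean_prevalence: assumed mean proportion of donors viremic at donation.
    transmissibility: assumed probability a viremic unit transmits infection.
    Both values are illustrative placeholders, not figures from the study.
    """
    rng = random.Random(seed)
    draws = []
    for _ in range(n_iter):
        # Propagate input uncertainty by drawing prevalence from a plausible band
        prev = rng.uniform(0.5 * mean_prevalence, 1.5 * mean_prevalence)
        draws.append(prev * transmissibility * 10_000)
    draws.sort()
    mean = sum(draws) / n_iter
    lo, hi = draws[int(0.025 * n_iter)], draws[int(0.975 * n_iter)]
    return mean, lo, hi  # point estimate plus a 95% simulation interval
```

A real model of this kind would replace the uniform band with distributions fitted to outbreak surveillance data (e.g. an epidemic curve of notified cases) and add terms for the infectious window, donation timing, and asymptomatic fraction.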
Update: Ongoing Zika virus transmission - Puerto Rico, November 1, 2015-July 7, 2016
Adams L , Bello-Pagan M , Lozier M , Ryff KR , Espinet C , Torres J , Perez-Padilla J , Febo MF , Dirlikov E , Martinez A , Munoz-Jordan J , Garcia M , Segarra MO , Malave G , Rivera A , Shapiro-Mendoza C , Rosinger A , Kuehnert MJ , Chung KW , Pate LL , Harris A , Hemme RR , Lenhart A , Aquino G , Zaki S , Read JS , Waterman SH , Alvarado LI , Alvarado-Ramy F , Valencia-Prado M , Thomas D , Sharp TM , Rivera-Garcia B . MMWR Morb Mortal Wkly Rep 2016 65 (30) 774-9 Zika virus is a flavivirus transmitted primarily by Aedes aegypti and Aedes albopictus mosquitoes, and infection can be asymptomatic or result in an acute febrile illness with rash. Zika virus infection during pregnancy is a cause of microcephaly and other severe birth defects. Infection has also been associated with Guillain-Barre syndrome (GBS) and severe thrombocytopenia. In December 2015, the Puerto Rico Department of Health (PRDH) reported the first locally acquired case of Zika virus infection. This report provides an update on the epidemiology of and public health response to ongoing Zika virus transmission in Puerto Rico. A confirmed case of Zika virus infection is defined as a positive result for Zika virus by reverse transcription-polymerase chain reaction (RT-PCR) testing of a blood or urine specimen. A presumptive case is defined as a positive result by Zika virus immunoglobulin M (IgM) enzyme-linked immunosorbent assay (MAC-ELISA) and a negative result by dengue virus IgM ELISA, or a positive result by Zika IgM MAC-ELISA in a pregnant woman. An unspecified flavivirus case is defined as positive or equivocal results for both Zika and dengue virus by IgM ELISA. 
During November 1, 2015-July 7, 2016, a total of 23,487 persons were evaluated by PRDH and CDC Dengue Branch for Zika virus infection, including asymptomatic pregnant women and persons with signs or symptoms consistent with Zika virus disease or suspected GBS; 5,582 (24%) confirmed and presumptive Zika virus cases were identified. Persons with Zika virus infection were residents of 77 (99%) of Puerto Rico's 78 municipalities. During 2016, the percentage of positive Zika virus infection cases among symptomatic males and nonpregnant females who were tested increased from 14% in February to 64% in June. Among 9,343 pregnant women tested, 672 had confirmed or presumptive Zika virus infection, including 441 (66%) symptomatic women and 231 (34%) asymptomatic women. One patient died after developing severe thrombocytopenia (4). Evidence of Zika virus infection or recent unspecified flavivirus infection was detected in 21 patients with confirmed GBS. The widespread outbreak and accelerating increase in the number of cases in Puerto Rico warrant intensified vector control and personal protective behaviors to prevent new infections, particularly among pregnant women. |
Notes from the field: Fatal infection associated with equine exposure - King County, Washington, 2016
Kawakami V , Rietberg K , Lipton B , Eckmann K , Watkins M , Oltean H , Kay M , Rothschild C , Kobayashi M , Van Beneden C , Duchin J . MMWR Morb Mortal Wkly Rep 2016 65 (30) 788 On March 17, 2016, Public Health-Seattle & King County in Washington was notified of two persons who received a diagnosis of Streptococcus equi subspecies zooepidemicus (S. zooepidemicus) infection. S. zooepidemicus is a zoonotic pathogen that rarely causes human illness and is usually associated with consuming unpasteurized dairy products or with direct horse contact (1). In horses, S. zooepidemicus is a commensal bacterium that can cause respiratory, wound, and uterine infections (2). The health department investigated to determine the magnitude of the outbreak, identify risk factors, and offer recommendations. |
Prolonged detection of Zika virus RNA in pregnant women
Meaney-Delman D , Oduyebo T , Polen KN , White JL , Bingham AM , Slavinski SA , Heberlein-Larson L , St George K , Rakeman JL , Hills S , Olson CK , Adamski A , Culver Barlow L , Lee EH , Likos AM , Munoz JL , Petersen EE , Dufort EM , Dean AB , Cortese MM , Santiago GA , Bhatnagar J , Powers AM , Zaki S , Petersen LR , Jamieson DJ , Honein MA . Obstet Gynecol 2016 128 (4) 724-730 OBJECTIVE: Zika virus infection during pregnancy is a cause of microcephaly and other fetal brain abnormalities. Reports indicate that the duration of detectable viral RNA in serum after symptom onset is brief. In a recent case report involving a severely affected fetus, Zika virus RNA was detected in maternal serum 10 weeks after symptom onset, longer than the duration of RNA detection in serum previously reported. This report summarizes the clinical and laboratory characteristics of pregnant women with prolonged detection of Zika virus RNA in serum that were reported to the U.S. Zika Pregnancy Registry. METHODS: Data were obtained from the U.S. Zika Pregnancy Registry, an enhanced surveillance system of pregnant women with laboratory evidence of confirmed or possible Zika virus infection. For this case series, we defined prolonged detection of Zika virus RNA as Zika virus RNA detection in serum by real-time reverse transcription-polymerase chain reaction (RT-PCR) 14 or more days after symptom onset or, for women not reporting signs or symptoms consistent with Zika virus disease (asymptomatic), 21 or more days after last possible exposure to Zika virus. RESULTS: Prolonged Zika virus RNA detection in serum was identified in four symptomatic pregnant women up to 46 days after symptom onset and in one asymptomatic pregnant woman 53 days postexposure. 
Among the five pregnancies, one pregnancy had evidence of fetal Zika virus infection confirmed by histopathologic examination of fetal tissue, three pregnancies resulted in live births of apparently healthy neonates with no reported abnormalities, and one pregnancy is ongoing. CONCLUSION: Zika virus RNA was detected in the serum of five pregnant women beyond the previously estimated timeframe. Additional real-time RT-PCR testing of pregnant women might provide more data about prolonged detection of Zika virus RNA and the possible diagnostic, epidemiologic, and clinical implications for pregnant women. |
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Drug Safety
- Environmental Health
- Genetics and Genomics
- Global Health
- Health Economics
- Immunity and Immunization
- Injury and Violence
- Laboratory Sciences
- Maternal and Child Health
- Mining
- Occupational Safety and Health
- Parasitic Diseases
- Physical Activity
- Reproductive Health
- Statistics as Topic
- Substance Use and Abuse
- Zoonotic and Vectorborne Diseases
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed: Feb 1, 2024
- Page last updated: Sep 03, 2024
- Powered by CDC PHGKB Infrastructure