Last data update: Oct 28, 2024. (Total: 48004 publications since 2009)
Records 1-30 (of 47 Records) |
Query Trace: Devine C [original query] |
---|
Extrapolating sentinel surveillance information to estimate national COVID hospital admission rates: A Bayesian modeling approach
Devine O , Pham H , Gunnels B , Reese HE , Steele M , Couture A , Iuliano D , Sachdev D , Alden NB , Meek J , Witt L , Ryan PA , Reeg L , Lynfield R , Ropp SL , Barney G , Tesini BL , Shiltz E , Sutton M , Talbot HK , Reyes I , Havers FP . Influenza Other Respir Viruses 2024 18 (10) e70026 The COVID-19-Associated Hospitalization Surveillance Network (COVID-NET) was established in March 2020 to monitor trends in hospitalizations associated with SARS-CoV-2 infection. COVID-NET is a geographically diverse population-based surveillance system for laboratory-confirmed COVID-19-associated hospitalizations with a combined catchment area covering approximately 10% of the US population. Data collected in COVID-NET include monthly counts of hospitalizations for persons with confirmed SARS-CoV-2 infection who reside within the defined catchment area. A Bayesian modeling approach is proposed to estimate US national COVID-associated hospital admission rates based on information reported in the COVID-NET system. A key component of the approach is the ability to estimate uncertainty resulting from extrapolation of hospitalization rates observed within COVID-NET to the US population. In addition, the proposed model enables estimation of other contributors to uncertainty, including temporal dependence among reported COVID-NET admission counts, the impact of unmeasured site-specific factors, and the frequency and accuracy of testing for SARS-CoV-2 infection. Based on the proposed model, an estimated 6.3 million (95% uncertainty interval (UI) 5.4-7.3 million) COVID-19-associated hospital admissions occurred in the United States from September 2020 through December 2023. Between April 2020 and December 2023, model-based monthly admission rate estimates ranged from a minimum of 1 per 10,000 population (95% UI 0.7-1.2) in June 2023 to a maximum of 16 per 10,000 (95% UI 13-19) in January 2022. |
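The core extrapolation step described in this abstract, projecting an admission rate observed in a ~10% catchment to a national count with an uncertainty interval, can be illustrated with a minimal Monte Carlo sketch. This is not the COVID-NET model (which also accounts for temporal dependence, site-specific effects, and testing practices); the counts, populations, and the conjugate gamma-Poisson posterior below are illustrative assumptions only.

```python
import random

random.seed(1)

# Hypothetical inputs, not COVID-NET data:
catchment_pop = 33_000_000    # surveillance catchment (~10% of US population)
observed = 33_000             # admissions observed in the catchment in one month
us_pop = 330_000_000

# Conjugate gamma-Poisson posterior for the per-person admission rate,
# using a Jeffreys-style Gamma(0.5) prior: rate ~ Gamma(observed + 0.5) / pop.
rate_draws = [random.gammavariate(observed + 0.5, 1.0) / catchment_pop
              for _ in range(5_000)]

# Extrapolate each posterior draw to the national population.
national = sorted(r * us_pop for r in rate_draws)
median = national[len(national) // 2]
lo, hi = national[125], national[-126]   # central 95% uncertainty interval

print(f"estimated national admissions: {median:,.0f} (95% UI {lo:,.0f}-{hi:,.0f})")
```

Note that with a single pooled count the interval is very narrow; the wider intervals reported in the paper arise mostly from between-site variability and testing adjustments, which this sketch deliberately omits.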
Reported incidence of infections caused by pathogens transmitted commonly through food: Impact of increased use of culture-independent diagnostic tests - Foodborne Diseases Active Surveillance Network, 1996-2023
Shah HJ , Jervis RH , Wymore K , Rissman T , LaClair B , Boyle MM , Smith K , Lathrop S , McGuire S , Trevejo R , McMillian M , Harris S , Zablotsky Kufel J , Houck K , Lau CE , Devine CJ , Boxrud D , Weller DL . MMWR Morb Mortal Wkly Rep 2024 73 (26) 584-593 Reducing foodborne disease incidence is a public health priority. This report summarizes preliminary 2023 Foodborne Diseases Active Surveillance Network (FoodNet) data and highlights efforts to increase the representativeness of FoodNet. During 2023, incidences of domestically acquired campylobacteriosis, Shiga toxin-producing Escherichia coli infection, yersiniosis, vibriosis, and cyclosporiasis increased, whereas those of listeriosis, salmonellosis, and shigellosis remained stable compared with incidences during 2016-2018, the baseline used for tracking progress towards federal disease reduction goals. During 2023, the incidence and percentage of infections diagnosed by culture-independent diagnostic tests (CIDTs) reported to FoodNet continued to increase, and the percentage of cases that yielded an isolate decreased, affecting observed trends in incidence. Because CIDTs allow for diagnosis of infections that previously would have gone undetected, lack of progress toward disease reduction goals might reflect changing diagnostic practices rather than an actual increase in incidence. Continued surveillance is needed to monitor the impact of changing diagnostic practices on disease trends, and targeted prevention efforts are needed to meet disease reduction goals. During 2023, FoodNet expanded its catchment area for the first time since 2004. This expansion improved the representativeness of the FoodNet catchment area, the ability of FoodNet to monitor trends in disease incidence, and the generalizability of FoodNet data. |
The 2018-2019 FoodNet Population Survey: A tool to estimate risks and behaviors associated with enteric infections
Devine CJ , Molinari NA , Shah HJ , Blackstock AJ , Geissler A , Marder EP , Payne DC . Am J Epidemiol 2024 The FoodNet Population Survey is a periodic survey of randomly selected residents in 10 US sites on exposures and behaviors that may be associated with acute diarrheal infections and the health care sought for those infections. This survey is used to estimate the true disease burden of enteric illness in the United States and to estimate rates of exposure to potential sources of illness. Unlike previous FoodNet Population Surveys, this cycle used multiple sampling frames and administration modes, including cell phone and web-based questionnaires, that allowed for additional question topics and a larger sample size. It also oversampled children to increase representation of this population. Analytic modeling adjusted for mode effects when estimating the prevalence estimates of exposures and behaviors. This report describes the design, methodology, challenges, and descriptive results from the 2018-19 FoodNet Population Survey. |
Comparing individual and community-level characteristics of people with ground beef-associated salmonellosis and other ground beef eaters: a case-control analysis
Salah Z , Canning M , Rickless D , Devine C , Buckman R , Payne DC , Marshall KE . J Food Prot 2024 100303 Salmonella is estimated to be the leading bacterial cause of U.S. domestically acquired foodborne illness. Large outbreaks of Salmonella attributed to ground beef have been reported in recent years. The demographic and sociodemographic characteristics of infected individuals linked to these outbreaks are poorly understood. We employed a retrospective case-control design; case-patients were people with laboratory-confirmed Salmonella infections linked to ground beef-associated outbreaks during 2012-2019, and controls were respondents to the 2018-2019 FoodNet Population Survey who reported eating ground beef and denied recent gastrointestinal illness. We used the county-level CDC/ATSDR Social Vulnerability Index (SVI) to compare case-patients and controls. Case-patient status was regressed on county-level social vulnerability and individual-level demographic characteristics. We identified 376 case-patients and 1,321 controls in the FoodNet sites. Being a case-patient was associated with increased overall county-level social vulnerability (OR: 1.21 [95% CI: 1.07-1.36]) and socioeconomic vulnerability (OR: 1.24 [1.05-1.47]) when adjusted for individual-level demographics. Case-patient status was not strongly associated with the other SVI themes of household composition and disability, minority status and language, and housing type and transportation. Data on individual-level factors such as income, poverty, unemployment, and education could facilitate further analyses to understand this relationship. |
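As a toy companion to the case-control comparison described above, a crude odds ratio and a Woolf-type 95% confidence interval can be computed from a 2x2 table. The counts below are invented for illustration; the study itself regressed case status on SVI while adjusting for individual-level demographics, which a crude OR does not do.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with a Woolf (log-normal) confidence interval.
    a: exposed cases, b: unexposed cases,
    c: exposed controls, d: unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    return or_, or_ * math.exp(-z * se), or_ * math.exp(z * se)

# Hypothetical counts: cases/controls living in high- vs. low-vulnerability counties.
or_, lo, hi = odds_ratio_ci(200, 176, 600, 721)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```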
Preliminary incidence and trends of infections caused by pathogens transmitted commonly through food - Foodborne Diseases Active Surveillance Network, 10 U.S. Sites, 2022
Delahoy MJ , Shah HJ , Weller DL , Ray LC , Smith K , McGuire S , Trevejo RT , Scallan Walter E , Wymore K , Rissman T , McMillian M , Lathrop S , LaClair B , Boyle MM , Harris S , Zablotsky-Kufel J , Houck K , Devine CJ , Lau CE , Tauxe RV , Bruce BB , Griffin PM , Payne DC . MMWR Morb Mortal Wkly Rep 2023 72 (26) 701-706 Each year, infections from major foodborne pathogens are responsible for an estimated 9.4 million illnesses, 56,000 hospitalizations, and 1,350 deaths in the United States (1). To evaluate progress toward prevention of enteric infections in the United States, the Foodborne Diseases Active Surveillance Network (FoodNet) conducts surveillance for laboratory-diagnosed infections caused by eight pathogens transmitted commonly through food at 10 U.S. sites. During 2020-2021, FoodNet detected decreases in many infections that were due to behavioral modifications, public health interventions, and changes in health care-seeking and testing practices during the COVID-19 pandemic. This report presents preliminary estimates of pathogen-specific annual incidences during 2022, compared with average annual incidences during 2016-2018, the reference period for the U.S. Department of Health and Human Services' Healthy People 2030 targets (2). Many pandemic interventions ended by 2022, resulting in a resumption of outbreaks, international travel, and other factors leading to enteric infections. During 2022, annual incidences of illnesses caused by the pathogens Campylobacter, Salmonella, Shigella, and Listeria were similar to average annual incidences during 2016-2018; however, incidences of Shiga toxin-producing Escherichia coli (STEC), Yersinia, Vibrio, and Cyclospora illnesses were higher. Increasing culture-independent diagnostic test (CIDT) usage likely contributed to increased detection by identifying infections that would have remained undetected before widespread CIDT usage. 
Reducing pathogen contamination during poultry slaughter and processing of leafy greens requires collaboration among food growers and processors, retail stores, restaurants, and regulators. |
Head Impact Exposures Among Youth Tackle and Flag American Football Athletes
Waltzman D , Sarmiento K , Devine O , Zhang X , DePadilla L , Kresnow MJ , Borradaile K , Hurwitz A , Jones D , Goyal R , Breiding MJ . Sports Health 2021 13 (5) 454-462 BACKGROUND: Promoted as a safer alternative to tackle football, there has been an increase in flag football participation in recent years. However, examinations of head impact exposure in flag football as compared with tackle football are currently limited. HYPOTHESIS: Tackle football athletes will have a greater number and magnitude of head impacts compared with flag football athletes. STUDY DESIGN: Cohort study. LEVEL OF EVIDENCE: Level 4. METHODS: Using mouthguard sensors, this observational, prospective cohort study captured data on the number and magnitude of head impacts among 524 male tackle and flag football athletes (6-14 years old) over the course of a single football season. Estimates of interest based on regression models used Bayesian methods to estimate differences between tackle and flag athletes. RESULTS: There were 186,239 head impacts recorded during the study. Tackle football athletes sustained 14.67 (95% CI 9.75-21.95) times more head impacts during an athletic exposure (game or practice) compared with flag football athletes. Magnitude of impact for the 50th and 95th percentile was 18.15g (17.95-18.34) and 52.55g (51.06-54.09) for a tackle football athlete and 16.84g (15.57-18.21) and 33.51g (28.23-39.08) for a flag football athlete, respectively. A tackle football athlete sustained 23.00 (13.59-39.55) times more high-magnitude impacts (≥40g) per athletic exposure compared with a flag football athlete. CONCLUSION: This study demonstrates that youth athletes who play tackle football are more likely to experience a greater number of head impacts and are at a markedly increased risk for high-magnitude impacts compared with flag football athletes. 
CLINICAL RELEVANCE: These results suggest that flag football has fewer head impact exposures, which potentially minimizes concussion risk, making it a safer alternative for 6- to 14-year-old youth football athletes. |
Insecticide resistance compromises the control of Aedes aegypti in Bangladesh.
Al-Amin HM , Gyawali N , Graham M , Alam MS , Lenhart A , Hugo LE , Rašić G , Beebe NW , Devine GJ . Pest Manag Sci 2023 79 (8) 2846-2861 BACKGROUND: With no effective drugs or widely available vaccines, dengue control in Bangladesh is dependent on targeting the primary vector Aedes aegypti with insecticides and larval source management. Despite these interventions, the dengue burden is increasing in Bangladesh, and the country experienced its worst outbreak in 2019 with 101,354 hospitalized cases. This may be partially facilitated by the presence of intense insecticide resistance in vector populations. Here, we describe the intensity and mechanisms of resistance to insecticides commonly deployed against Ae. aegypti in Dhaka, Bangladesh. RESULTS: Dhaka Ae. aegypti colonies exhibited high-intensity resistance to pyrethroids. Using CDC bottle assays, we recorded 2 - 24% mortality (recorded at 24 hours) to permethrin and 48 - 94% mortality to deltamethrin, at 10x the diagnostic dose. Bioassays conducted using insecticide-synergist combinations suggested that metabolic mechanisms were contributing to pyrethroid resistance, specifically multi-function oxidases, esterases, and glutathione S-transferases. In addition, kdr alleles were detected, with a high frequency (78-98%) of homozygotes for the V1016G mutation. A large proportion (≤ 74%) of free-flying and resting mosquitoes from Dhaka colonies survived exposure to standard applications of pyrethroid aerosols in an experimental free-flight room. Although that exposure affected Ae. aegypti's immediate host-seeking behavior, the effect was transient in surviving mosquitoes. CONCLUSION: The intense resistance characterized in this study is likely compromising the operational effectiveness of pyrethroids against Ae. aegypti in Dhaka. Switching to alternative chemical classes may offer a medium-term solution, but ultimately a more sustainable and effective approach to controlling dengue vectors is required. 
|
Point Prevalence Estimates of Activity-Limiting Long-Term Symptoms among U.S. Adults ≥1 Month After Reported SARS-CoV-2 Infection, November 1, 2021.
Tenforde MW , Devine OJ , Reese HE , Silk BJ , Iuliano AD , Threlkel R , Vu QM , Plumb ID , Cadwell BL , Rose C , Steele MK , Briggs-Hagen M , Ayoubkhani D , Pawelek P , Nafilyan V , Saydah SH , Bertolli J . J Infect Dis 2023 227 (7) 855-863 BACKGROUND: Although most adults infected with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) fully recover, a proportion have ongoing symptoms, or post-COVID conditions (PCC), after infection. The objective of this analysis was to estimate the number of United States (US) adults with activity-limiting PCC on 1 November 2021. METHODS: We modeled the prevalence of PCC using reported infections occurring from 1 February 2020 to 30 September 2021, and population-based, household survey data on new activity-limiting symptoms ≥1 month following SARS-CoV-2 infection. From these data sources, we estimated the number and proportion of US adults with activity-limiting PCC on 1 November 2021 as 95% uncertainty intervals, stratified by sex and age. Sensitivity analyses adjusted for underascertainment of infections and uncertainty about symptom duration. RESULTS: On 1 November 2021, at least 3.0-5.0 million US adults, or 1.2%-1.9% of the US adult population, were estimated to have activity-limiting PCC of ≥1 month's duration. Population prevalence was higher in females (1.4%-2.2%) than males. The estimated prevalence after adjusting for underascertainment of infections was 1.7%-3.8%. CONCLUSIONS: Millions of US adults were estimated to have activity-limiting PCC. These estimates can support future efforts to address the impact of PCC on the US population. |
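The estimation strategy in the abstract above, combining an infection count with a probability of lasting symptoms, can be sketched as a Monte Carlo product of uncertain inputs. Every number below is a placeholder chosen for illustration; the paper's model used reported case counts, household-survey symptom data, and underascertainment adjustments rather than these assumed distributions.

```python
import random

random.seed(0)

adult_pop = 258_000_000   # assumed US adult population
draws = []
for _ in range(10_000):
    infections = random.uniform(80e6, 120e6)   # cumulative adult infections (assumed range)
    p_limiting = random.betavariate(20, 980)   # ~2% with activity-limiting PCC (assumed)
    draws.append(infections * p_limiting)

draws.sort()
lo, hi = draws[250], draws[-251]               # central 95% uncertainty interval
print(f"{lo / 1e6:.1f}-{hi / 1e6:.1f} million adults "
      f"({100 * lo / adult_pop:.1f}%-{100 * hi / adult_pop:.1f}% of adults)")
```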
Differences in Head Impact Exposures Between Youth Tackle and Flag Football Games and Practices: Potential Implications for Prevention Strategies
Sarmiento K , Waltzman D , Devine O , Zhang X , DePadilla L , Kresnow MJ , Borradaile K , Hurwitz A , Jones D , Goyal R , Breiding MJ . Am J Sports Med 2021 49 (8) 3635465211011754 BACKGROUND: Interventions designed to reduce the risk for head impacts and concussion in youth football have increased over the past decade; however, understanding of the role of regular game play on head impact exposure among youth tackle and flag football athletes is currently limited. PURPOSE: To explore head impact exposure among youth tackle and flag football athletes (age range, 6-14 years) during both practices and games. STUDY DESIGN: Cohort study; Level of evidence, 2. METHODS: Using the Vector MouthGuard sensor, the authors collected head impact data from 524 tackle and flag youth football athletes over the course of a football season. Quantities of interest were estimated from regression models using Bayesian methods. RESULTS: For impacts ≥10g, a tackle football athlete had an estimated 17.55 (95% CI, 10.78-28.96) times more head impacts per practice compared with a flag football athlete (6.85 [95% CI, 6.05-7.76] and 0.39 [95% CI, 0.24-0.62] head impacts, respectively). Additionally, a tackle football athlete had an estimated 19.48 (95% CI, 12.74-29.98) times more head impacts per game compared with a flag football athlete (13.59 [95% CI, 11.97-15.41] and 0.70 [95% CI, 0.46-1.05] head impacts, respectively). Among tackle football athletes, the estimated average impact rate was 6.51 (95% CI, 5.75-7.37) head impacts during a practice and 12.97 (95% CI, 11.36-14.73) impacts during a game, resulting in 2.00 (95% CI, 1.74-2.29) times more ≥10g head impacts in games versus practices. Tackle football athletes had 2.06 (95% CI, 1.80-2.34) times more high-magnitude head impacts (≥40g) during a game than during a practice. 
On average, flag football athletes experienced an estimated 0.37 (95% CI, 0.20-0.60) head impacts during a practice and 0.77 (95% CI, 0.53-1.06) impacts during a game, resulting in 2.06 (95% CI, 1.29-3.58) times more ≥10g head impacts in games versus practices. Because of model instability caused by a large number of zero impacts for flag football athletes, a comparison of high-magnitude head impacts is not reported for practices or games. CONCLUSION: This study provides a characterization of the head impact exposure of practices and games among a large population of youth tackle and flag football athletes aged 6 to 14 years. These findings suggest that a greater focus on game-based interventions, such as fair play interventions and strict officiating, may be beneficial to reduce head impact exposures for youth football athletes. |
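Both head-impact studies above report rate ratios (impacts per athletic exposure, tackle vs. flag). A classical analogue of that comparison, a Poisson rate ratio with a log-scale normal confidence interval, is sketched below with invented counts; the published estimates came from Bayesian regression models, not this formula.

```python
import math

def rate_ratio(events1, exposure1, events2, exposure2, z=1.96):
    """Rate ratio (group 1 vs. group 2) with a 95% CI from the normal
    approximation on the log scale, assuming Poisson event counts."""
    rr = (events1 / exposure1) / (events2 / exposure2)
    se = math.sqrt(1 / events1 + 1 / events2)  # SE of log(rate ratio)
    return rr, rr * math.exp(-z * se), rr * math.exp(z * se)

# Hypothetical: 1,359 impacts over 100 tackle games vs. 70 over 100 flag games.
rr, lo, hi = rate_ratio(1359, 100, 70, 100)
print(f"rate ratio = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```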
Second WIN International Conference on "Integrated approaches and innovative tools for combating insecticide resistance in vectors of arboviruses", October 2018, Singapore
Corbel V , Durot C , Achee NL , Chandre F , Coulibaly MB , David JP , Devine GJ , Dusfour I , Fonseca DM , Griego J , Juntarajumnong W , Lenhart A , Kasai S , Martins AJ , Moyes C , Ng LC , Pinto J , Pompon JF , Muller P , Raghavendra K , Roiz D , Vatandoost H , Vontas J , Weetman D . Parasit Vectors 2019 12 (1) 331 The past 40 years have seen a dramatic emergence of epidemic arboviral diseases transmitted primarily by mosquitoes. The frequency and magnitude of the epidemics, especially those transmitted by urban Aedes species, have progressively increased over time, accelerating in the past 10 years. To reduce the burden and threat of vector-borne diseases, the World Health Organization (WHO) has recently adopted the Global Vector Control Response (GVCR) in order to support countries in implementing effective sustainable vector control. The evidence-base to support vector control is however limited for arboviral diseases which make prioritization difficult. Knowledge gaps in the distribution, mechanisms and impact of insecticide resistance on vector control impedes the implementation of locally tailored Aedes control measures. This report summarizes the main outputs of the second international conference of the Worldwide Insecticide resistance Network (WIN) on "Integrated approaches and innovative tools for combating insecticide resistance in arbovirus vectors" held in Singapore, 1-3 October 2018. The aims of the conference were to review progress and achievements made in insecticide resistance surveillance worldwide, and to discuss the potential of integrated vector management and innovative technologies for efficiently controlling arboviral diseases. The conference brought together 150 participants from 26 countries. |
Systematic review and Bayesian meta-analysis of the dose-response relationship between folic acid intake and changes in blood folate concentrations
Crider KS , Devine O , Qi YP , Yeung LF , Sekkarie A , Zaganjor I , Wong E , Rose CE , Berry RJ . Nutrients 2019 11 (1) The threshold for population-level optimal red blood cell (RBC) folate concentration among women of reproductive age for the prevention of neural tube defects has been estimated at 906 nmol/L; however, the dose-response relationship between folic acid intake and blood folate concentrations is uncharacterized. To estimate the magnitude of blood folate concentration increase in response to specific dosages of folic acid under steady-state conditions (as could be achieved with food fortification), a systematic review of the literature and meta-analysis was conducted. Of the 14,002 records we identified, 533 were selected for full-text review, and data were extracted from 108 articles. The steady-state concentrations (homeostasis) of both serum/plasma and RBC folate were estimated using a Bayesian meta-analytic approach and one-compartment physiologically based pharmacokinetic models. RBC folate concentrations increased 1.78-fold (95% credible interval (CI): 1.66, 1.93) from baseline to steady state at 375-570 μg folic acid/day, and it took a median of 36 weeks of folic acid intake (95% CI: 27, 52) to achieve steady-state RBC folate concentrations. Based on regression analysis, we estimate that serum/plasma folate concentrations increased 11.6% (95% CI: 8.4, 14.9) for every 100 μg/day of folic acid intake. These results will help plan and monitor folic acid fortification programs. |
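The "36 weeks to steady state" finding above follows the standard accumulation behavior of a one-compartment model, in which the approach to steady state depends only on the effective elimination half-life. The sketch below uses a hypothetical ~8-week half-life for RBC folate turnover, chosen only to show the arithmetic; it is not a value reported in the paper.

```python
import math

def frac_of_steady_state(weeks, half_life_weeks):
    """Fraction of the steady-state concentration reached after a given
    duration of constant daily intake (one-compartment accumulation)."""
    k = math.log(2) / half_life_weeks   # first-order elimination rate
    return 1.0 - math.exp(-k * weeks)

def weeks_to_reach(frac, half_life_weeks):
    """Time needed to reach a given fraction of steady state."""
    k = math.log(2) / half_life_weeks
    return -math.log(1.0 - frac) / k

half_life = 8.0  # hypothetical effective half-life, in weeks
print(f"after one half-life: {frac_of_steady_state(half_life, half_life):.0%}")
print(f"weeks to 95% of steady state: {weeks_to_reach(0.95, half_life):.1f}")
```

One half-life always yields 50% of steady state, and ~95% is reached after about 4.3 half-lives, which is why a half-life on the order of eight weeks is consistent with a months-long approach to steady state.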
Quantifying primaquine effectiveness and improving adherence: a round table discussion of the APMEN Vivax Working Group
Thriemer K , Bobogare A , Ley B , Gudo CS , Alam MS , Anstey NM , Ashley E , Baird JK , Gryseels C , Jambert E , Lacerda M , Laihad F , Marfurt J , Pasaribu AP , Poespoprodjo JR , Sutanto I , Taylor WR , van den Boogaard C , Battle KE , Dysoley L , Ghimire P , Hawley B , Hwang J , Khan WA , Mudin RNB , Sumiwi ME , Ahmed R , Aktaruzzaman MM , Awasthi KR , Bardaji A , Bell D , Boaz L , Burdam FH , Chandramohan D , Cheng Q , Chindawongsa K , Culpepper J , Das S , Deray R , Desai M , Domingo G , Duoquan W , Duparc S , Floranita R , Gerth-Guyette E , Howes RE , Hugo C , Jagoe G , Sariwati E , Jhora ST , Jinwei W , Karunajeewa H , Kenangalem E , Lal BK , Landuwulang C , Le Perru E , Lee SE , Makita LS , McCarthy J , Mekuria A , Mishra N , Naket E , Nambanya S , Nausien J , Duc TN , Thi TN , Noviyanti R , Pfeffer D , Qi G , Rahmalia A , Rogerson S , Samad I , Sattabongkot J , Satyagraha A , Shanks D , Sharma SN , Sibley CH , Sungkar A , Syafruddin D , Talukdar A , Tarning J , Kuile FT , Thapa S , Theodora M , Huy TT , Waramin E , Waramori G , Woyessa A , Wongsrichanalai C , Xa NX , Yeom JS , Hermawan L , Devine A , Nowak S , Jaya I , Supargiyono S , Grietens KP , Price RN . Malar J 2018 17 (1) 241 The goal to eliminate malaria from the Asia-Pacific by 2030 will require the safe and widespread delivery of effective radical cure of malaria. In October 2017, the Asia Pacific Malaria Elimination Network Vivax Working Group met to discuss the impediments to primaquine (PQ) radical cure, how these can be overcome and the methodological difficulties in assessing clinical effectiveness of radical cure. The salient discussions of this meeting which involved 110 representatives from 18 partner countries and 21 institutional partner organizations are reported. Context specific strategies to improve adherence are needed to increase understanding and awareness of PQ within affected communities; these must include education and health promotion programs. 
Lessons learned from other disease programs highlight that a package of approaches has the greatest potential to change patient and prescriber habits; however, optimizing the components of this approach and quantifying their effectiveness is challenging. In a trial setting, the reactivity of participants results in patients altering their behaviour and creates inherent bias. Although bias can be reduced by integrating data collection into the routine health care and surveillance systems, this comes at the cost of decreased detection of clinical outcomes. Measuring adherence and the factors that relate to it also requires an in-depth understanding of the context and the underlying sociocultural logic that supports it. Reaching the elimination goal will require innovative approaches to improve radical cure for vivax malaria, as well as the methods to evaluate its effectiveness. |
Engendering healthy masculinities to prevent sexual violence: Rationale for and design of the Manhood 2.0 trial
Abebe KZ , Jones KA , Culyba AJ , Feliz NB , Anderson H , Torres I , Zelazny S , Bamwine P , Boateng A , Cirba B , Detchon A , Devine D , Feinstein Z , Macak J , Massof M , Miller-Walfish S , Morrow SE , Mulbah P , Mulwa Z , Paglisotti T , Ripper L , Ports KA , Matjasko JL , Garg A , Kato-Wallace J , Pulerwitz J , Miller E . Contemp Clin Trials 2018 71 18-32 Violence against women and girls is an important global health concern. Numerous health organizations highlight engaging men and boys in preventing violence against women as a potentially impactful public health prevention strategy. Adapted from an international setting for use in the US, "Manhood 2.0" is a "gender transformative" program that involves challenging harmful gender and sexuality norms that foster violence against women while promoting bystander intervention (i.e., giving boys skills to interrupt abusive behaviors they witness among peers) to reduce the perpetration of sexual violence (SV) and adolescent relationship abuse (ARA). Manhood 2.0 is being rigorously evaluated in a community-based cluster-randomized trial in 21 lower resource Pittsburgh neighborhoods with 866 adolescent males ages 13-19. The comparison intervention is a job readiness training program which focuses on the skills needed to prepare youth for entering the workforce, including goal setting, accountability, resume building, and interview preparation. This study will provide urgently needed information about the effectiveness of a gender transformative program, which combines healthy sexuality education, gender norms change, and bystander skills to interrupt peers' disrespectful and harmful behaviors to reduce SV/ARA perpetration among adolescent males. In this manuscript, we outline the rationale for and evaluation design of Manhood 2.0. Clinical Trials #: NCT02427061. |
Modeling the impact of folic acid fortification and supplementation on red blood cell folate concentrations and predicted neural tube defect risk in the United States: have we reached optimal prevention
Crider KS , Qi YP , Devine O , Tinker SC , Berry RJ . Am J Clin Nutr 2018 107 (6) 1027-1034 Background: The US CDC and the Institute of Medicine recommend that women capable of becoming pregnant consume ≥400 μg synthetic folic acid/d to prevent neural tube defects (NTDs). The United States has 3 sources of folic acid: fortified enriched cereal grain products (ECGPs), fortified ready-to-eat (RTE) cereals, and dietary supplements. Objective: Our objectives were as follows: 1) to estimate the usual daily folic acid intake and distributions of red blood cell (RBC) folate concentrations among women consuming folic acid from different sources; 2) to assess the usual daily total folic acid intake associated with optimal RBC folate concentrations for NTD prevention; 3) to predict NTD prevalence; and 4) to estimate the number of preventable folate-sensitive NTDs. Design: NHANES data (2007-2012) for nonpregnant women of reproductive age (12-49 y) were used to estimate usual daily intakes of synthetic folic acid and natural food folate. We applied existing models of the relation between RBC folate concentrations and NTD risk to predict NTD prevalence. Results: Based on the distribution of overall RBC folate concentrations (4783 women), the predicted NTD prevalence was 7.3/10,000 live births [95% uncertainty interval (UI): 5.5-9.4/10,000 live births]. Women consuming folic acid from ECGPs as their only source had lower usual daily total folic acid intakes (median: 115 μg/d; IQR: 79-156 μg/d), lower RBC folate concentrations (median: 881 nmol/L; IQR: 704-1108 nmol/L), and higher predicted NTD prevalence (8.5/10,000 live births; 95% UI: 6.4-10.8/10,000 live births) compared with women consuming additional folic acid from diet or supplements. If women who currently consume folic acid from ECGPs only (48% of women) consumed additional folic acid sources, 345 (95% UI: 0-821) to 701 (95% UI: 242-1189) additional NTDs/y could be prevented. 
Conclusions: This analysis supports current recommendations and does not indicate any need for higher intakes of folic acid to achieve optimal NTD prevention. Ensuring 400 μg/d intake of folic acid prior to pregnancy has the potential to increase the number of babies born without an NTD. |
Estimating the numbers of pregnant women infected with Zika virus and infants with congenital microcephaly in Colombia, 2015-2017
Adamski A , Bertolli J , Castaneda-Orjuela C , Devine OJ , Johansson MA , Duarte MAG , Tinker SC , Farr SL , Reyes MMM , Tong VT , Garcia OEP , Valencia D , Ortiz DAC , Honein MA , Jamieson DJ , Martinez MLO , Gilboa SM . J Infect 2018 76 (6) 529-535 BACKGROUND: Colombia experienced a Zika virus (ZIKV) outbreak in 2015-2016. To assist with planning for medical and supportive services for infants affected by prenatal ZIKV infection, we used a model to estimate the number of pregnant women infected with ZIKV and the number of infants with congenital microcephaly from August 2015- August 2017. METHODS: We used nationally-reported cases of symptomatic ZIKV disease among pregnant women and information from the literature on the percent of asymptomatic infections to estimate the number of pregnant women with ZIKV infection occurring August 2015 - December 2016. We then estimated the number of infants with congenital microcephaly expected to occur August 2015 - August 2017. To compare to the observed counts of infants with congenital microcephaly due to all causes reported through the national birth defects surveillance system, the model was time limited to produce estimates for February - November 2016. FINDINGS: We estimated 1,140-2,160 (interquartile range [IQR]) infants with congenital microcephaly in Colombia, during August 2015 - August 2017, whereas 340-540 infants with congenital microcephaly would be expected in the absence of ZIKV. Based on the time limited version of the model, for February - November 2016, we estimated 650-1,410 infants with congenital microcephaly in Colombia. The 95% uncertainty interval for the latter estimate encompasses the 476 infants with congenital microcephaly reported during that approximate time frame based on national birth defects surveillance. 
INTERPRETATION: Based on modeled estimates, ZIKV infection during pregnancy in Colombia could lead to 3 to 4 times as many infants with congenital microcephaly in 2015-2017 as would have been expected in the absence of the ZIKV outbreak. FUNDING: This publication was made possible through support provided by the Bureau for Global Health, U.S. Agency for International Development under the terms of an Interagency Agreement with Centers for Disease Control and Prevention. |
Proportion of selected congenital heart defects attributable to recognized risk factors
Simeone RM , Tinker SC , Gilboa SM , Agopian AJ , Oster ME , Devine OJ , Honein MA . Ann Epidemiol 2016 26 (12) 838-845 PURPOSE: To assess the contribution of multiple risk factors for two congenital heart defects-hypoplastic left heart syndrome (HLHS) and tetralogy of Fallot (TOF). METHODS: We used data from the National Birth Defects Prevention Study (1997-2011) to estimate average adjusted population attributable fractions for several recognized risk factors, including maternal prepregnancy overweight-obesity, pregestational diabetes, age, and infant sex. RESULTS: There were 594 cases of isolated simple HLHS, 971 cases of isolated simple TOF, and 11,829 controls in the analysis. Overall, 57.0% of HLHS cases and 37.0% of TOF cases were estimated to be attributable to risk factors included in our model. Among modifiable HLHS risk factors, maternal prepregnancy overweight-obesity accounted for the largest proportion of cases (6.5%). Among modifiable TOF risk factors, maternal prepregnancy overweight-obesity and maternal age of 35 years or older accounted for the largest proportions of cases (8.3% and 4.3%, respectively). CONCLUSIONS: Approximately half of HLHS cases and one-third of TOF cases were estimated to be attributable to risk factors included in our models. Interventions targeting factors that can be modified may help reduce the risk of HLHS and TOF development. Additional research into the etiology of HLHS and TOF may reveal other modifiable risk factors that might contribute to primary prevention efforts. |
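The attributable-fraction idea in the abstract above can be shown with the simplest single-factor version, Miettinen's case-load formula. The study itself computed average adjusted population attributable fractions across multiple, possibly correlated risk factors, which requires model-based averaging rather than this one-liner; the inputs below are invented.

```python
def attributable_fraction(p_exposed_cases, rr):
    """Miettinen's case-load formula: PAF = p_c * (RR - 1) / RR,
    where p_c is the exposure prevalence among cases."""
    return p_exposed_cases * (rr - 1.0) / rr

# Hypothetical: 30% of cases exposed, relative risk of 1.3.
paf = attributable_fraction(0.30, 1.3)
print(f"PAF = {paf:.1%}")  # fraction of cases attributable to the exposure
```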
Near-infrared spectroscopy, a rapid method for predicting the age of male and female wild-type and Wolbachia infected Aedes aegypti
Sikulu-Lord MT , Milali MP , Henry M , Wirtz RA , Hugo LE , Dowell FE , Devine GJ . PLoS Negl Trop Dis 2016 10 (10) e0005040 Estimating the age distribution of mosquito populations is crucial for assessing their capacity to transmit disease and for evaluating the efficacy of available vector control programs. This study reports on the capacity of the near-infrared spectroscopy (NIRS) technique to rapidly predict the ages of the principal dengue and Zika vector, Aedes aegypti. The ages of wild-type males and females, and of males and females infected with the wMel and wMelPop strains of Wolbachia pipientis, were characterized using this method. Calibrations were developed using spectra collected from their heads and thoraces using partial least squares (PLS) regression. A highly significant correlation was found between the true and predicted ages of mosquitoes. The coefficients of determination for wild-type females and males across all age groups were R2 = 0.84 and 0.78, respectively. The coefficients of determination for the age of wMel and wMelPop infected females were 0.71 and 0.80, respectively (P < 0.001 in both instances). The age of wild-type female Ae. aegypti could be identified as < or ≥ 8 days old with an accuracy of 91% (N = 501), whereas female Ae. aegypti infected with wMel and wMelPop were differentiated into the two age groups with accuracies of 83% (N = 284) and 78% (N = 229), respectively. Our results also indicate NIRS can distinguish between young and old male wild-type, wMel and wMelPop infected Ae. aegypti with accuracies of 87% (N = 253), 83% (N = 277) and 78% (N = 234), respectively. We have demonstrated the potential of NIRS as a predictor of the age of female and male wild-type and Wolbachia infected Ae. aegypti mosquitoes under laboratory conditions. After field validation, the tool has the potential to offer a cheap and rapid surveillance alternative for dengue and Zika vector control programs. |
Estimating the Number of Pregnant Women Infected With Zika Virus and Expected Infants With Microcephaly Following the Zika Virus Outbreak in Puerto Rico, 2016.
Ellington SR , Devine O , Bertolli J , Martinez Quinones A , Shapiro-Mendoza CK , Perez-Padilla J , Rivera-Garcia B , Simeone RM , Jamieson DJ , Valencia-Prado M , Gilboa SM , Honein MA , Johansson MA . JAMA Pediatr 2016 170 (10) 940-945 Importance: Zika virus (ZIKV) infection during pregnancy is a cause of congenital microcephaly and severe fetal brain defects, and it has been associated with other adverse pregnancy and birth outcomes. Objective: To estimate the number of pregnant women infected with ZIKV in Puerto Rico and the number of associated congenital microcephaly cases. Design, Setting, and Participants: We conducted a modeling study from April to July 2016. Using parameters derived from published reports, outcomes were modeled probabilistically using Monte Carlo simulation. We used uncertainty distributions to reflect the limited information available for parameter values. Given the high level of uncertainty in model parameters, interquartile ranges (IQRs) are presented as primary results. Outcomes were modeled for pregnant women in Puerto Rico, which currently has more confirmed ZIKV cases than any other US location. Exposure: Zika virus infection in pregnant women. Main Outcomes and Measures: Number of pregnant women infected with ZIKV and number of congenital microcephaly cases. Results: We estimated an IQR of 5900 to 10300 pregnant women (median, 7800) might be infected during the initial ZIKV outbreak in Puerto Rico. Of these, an IQR of 100 to 270 infants (median, 180) may be born with microcephaly due to congenital ZIKV infection from mid-2016 to mid-2017. In the absence of a ZIKV outbreak, an IQR of 9 to 16 cases (median, 12) of congenital microcephaly are expected in Puerto Rico per year. Conclusions and Relevance: The estimate of 5900 to 10300 pregnant women that might be infected with ZIKV provides an estimate for the number of infants that could potentially have ZIKV-associated adverse outcomes. 
Including baseline cases of microcephaly, we estimated that an IQR of 110 to 290 total cases of congenital microcephaly, mostly attributable to ZIKV infection, could occur from mid-2016 to mid-2017 in the absence of effective interventions. The primary limitation in this analysis is uncertainty in model parameters. Multivariate sensitivity analyses indicated that the cumulative incidence of ZIKV infection and risk of microcephaly given maternal infection in the first trimester were the primary drivers of both magnitude and uncertainty in the estimated number of microcephaly cases. Increased information on these parameters would lead to more precise estimates. Nonetheless, the results underscore the need for urgent actions being undertaken in Puerto Rico to prevent congenital ZIKV infection and prepare for affected infants. |
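The probabilistic approach described in the abstract above — drawing model inputs from uncertainty distributions and reporting interquartile ranges of the simulated outcome — can be sketched in a few lines. Every distribution and parameter value here is a hypothetical placeholder, not one of the published model inputs.

```python
# Minimal Monte Carlo sketch: propagate parameter uncertainty through
# a simple chain (pregnancies x infection risk x microcephaly risk)
# and summarize with an interquartile range (IQR). All inputs are
# assumed placeholders for illustration only.
import random

random.seed(42)

def simulate(n_draws: int = 10_000) -> list[float]:
    cases = []
    for _ in range(n_draws):
        pregnancies = random.uniform(30_000, 34_000)   # assumed annual pregnancies
        attack_rate = random.betavariate(20, 60)       # assumed ZIKV infection risk
        risk_given_infection = random.betavariate(2, 80)  # assumed microcephaly risk
        cases.append(pregnancies * attack_rate * risk_given_infection)
    return sorted(cases)

draws = simulate()
q1 = draws[len(draws) // 4]
q3 = draws[3 * len(draws) // 4]
print(f"IQR: {q1:.0f} to {q3:.0f} cases")
```

Reporting the IQR rather than a point estimate mirrors the authors' choice to foreground parameter uncertainty when data are limited.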
Congenital Heart Defects in the United States: Estimating the Magnitude of the Affected Population in 2010.
Gilboa SM , Devine OJ , Kucik JE , Oster ME , Riehle-Colarusso T , Nembhard WN , Xu P , Correa A , Jenkins K , Marelli AJ . Circulation 2016 134 (2) 101-9 BACKGROUND: Because of advancements in care, there has been a decline in mortality from congenital heart defects (CHD) over the last several decades. However, there are no current empirical data documenting the number of people living with CHD in the United States (US). Our aim was to estimate the CHD prevalence across all age groups in the US in the year 2010. METHODS: The age-, sex-, and severity-specific observed prevalence of CHD in Quebec, Canada in the year 2010 was assumed to equal the CHD prevalence in the non-Hispanic white population in the US in 2010. A race-ethnicity adjustment factor, reflecting differential survival between racial-ethnic groups through age 5 for persons with a CHD and that in the general US population, was applied to the estimated non-Hispanic white rates to derive CHD prevalence estimates among US non-Hispanic blacks and Hispanics. Confidence intervals for the estimated CHD prevalence rates and case counts were derived using a combination of Taylor series approximations and Monte Carlo simulation. RESULTS: We estimated that approximately 2.4 million people (1.4 million adults, 1 million children) were living with CHD in the US in 2010. Nearly 300,000 of these individuals had severe CHD. CONCLUSIONS: Our estimates highlight the need for two important efforts: (1) planning for health services delivery to meet the needs of the growing population of adults with CHD and; (2) the development of surveillance data across the lifespan to provide empirical estimates of the prevalence of CHD across all age groups in the US. |
Rapid and non-destructive detection and identification of two strains of Wolbachia in Aedes aegypti by near-infrared spectroscopy
Sikulu-Lord MT , Maia MF , Milali MP , Henry M , Mkandawile G , Kho EA , Wirtz RA , Hugo LE , Dowell FE , Devine GJ . PLoS Negl Trop Dis 2016 10 (6) e0004759 The release of Wolbachia infected mosquitoes is likely to form a key component of disease control strategies in the near future. We investigated the potential of using near-infrared spectroscopy (NIRS) to simultaneously detect and identify two strains of Wolbachia pipientis (wMelPop and wMel) in male and female laboratory-reared Aedes aegypti mosquitoes. Our aim is to find faster, cheaper alternatives for monitoring those releases than the molecular diagnostic techniques that are currently in use. Our findings indicate that NIRS can differentiate females and males infected with wMelPop from uninfected wild type samples with an accuracy of 96% (N = 299) and 87.5% (N = 377), respectively. Similarly, females and males infected with wMel were differentiated from uninfected wild type samples with accuracies of 92% (N = 352) and 89% (N = 444). NIRS could differentiate wMelPop and wMel transinfected females with an accuracy of 96.6% (N = 442) and males with an accuracy of 84.5% (N = 443). This non-destructive technique is faster than the standard polymerase chain reaction diagnostic techniques. After the purchase of a NIRS spectrometer, the technique requires little sample processing and does not consume any reagents. |
Quantifying the epidemiological impact of vector control on dengue
Reiner RC Jr , Achee N , Barrera R , Burkot TR , Chadee DD , Devine GJ , Endy T , Gubler D , Hombach J , Kleinschmidt I , Lenhart A , Lindsay SW , Longini I , Mondy M , Morrison AC , Perkins TA , Vazquez-Prokopec G , Reiter P , Ritchie SA , Smith DL , Strickman D , Scott TW . PLoS Negl Trop Dis 2016 10 (5) e0004588 Dengue virus (DENV) is a self-limiting illness in tropical and subtropical regions around the globe caused by four closely related, but distinct, virus serotypes (DENV-1, -2, -3, and -4) that are transmitted among humans by mosquitoes, primarily Aedes aegypti [1]. Approximately 4 billion people living in more than 128 countries are at risk of infection [2]. Each year there are an estimated 400 million new infections, of which about 100 million manifest as apparent illness [3]. The outcome of human infections ranges from asymptomatic to mild illness to severe, life-threatening disease [4]. DENV not only causes more human morbidity and mortality than any other arthropod-borne virus but it is also a growing public health threat. There has been a dramatic 4-fold increase in dengue cases between 1990–2013 and dengue continues to expand in geographic range [2,3,5,6]. | Presently, vector control is the primary means for preventing dengue [7]. Several vaccine constructs are in clinical trials and initial results are encouraging [8]; recently licensure was granted for the Sanofi Pasteur vaccine in Mexico, Brazil, and the Philippines [9]. A few well-documented successes indicate that, when rigorously applied, vector control can reduce dengue. The advent of DDT in 1947 led to a hemisphere-wide program in the 1950s and 1960s across Central and South America that dramatically reduced Ae. aegypti populations, resulting in impressive reductions in yellow fever and dengue [10]. 
During the 1970s–1980s [11] and the 1980s–1990s [12], respectively, Singapore and Cuba successfully used vector control and larval source reduction to reduce the force of DENV infection (i.e., per capita risk of human infection [13]) and, thus, disease. Recent trials of indoor residual spraying [14] and indoor space spraying [15] appeared to reduce human DENV infections. Regrettably, these control achievements were rare and ultimately transient. Dengue reinvaded Latin America after the Ae. aegypti eradication campaign ended, rebounded in Singapore and Cuba after 20 and 16 years of successful control, respectively, and is increasingly being reported in Africa due to improved surveillance [16]. |
Red blood cell folate insufficiency among nonpregnant women of childbearing age in Guatemala 2009 to 2010: Prevalence and predicted neural tube defects risk
Rosenthal J , Reeve ME , Ramirez N , Crider KS , Sniezek J , Vellozzi C , Devine O , Lopez-Pazos E . Birth Defects Res A Clin Mol Teratol 2016 106 (7) 587-95 BACKGROUND: The World Health Organization recently released recommendations stating that red blood cell (RBC) folate concentrations should be above 400 ng/mL (906 nmol/L) for optimal prevention of folate-sensitive neural tube defects (NTDs). The objective of this study was to determine the distribution of folate insufficiency (FI) (<906 nmol/L) and potential risk of NTDs based on RBC folate concentrations among nonpregnant women of childbearing age in Guatemala. METHODS: A national and regional multistage cluster probability survey was completed during 2009 to 2010 among Guatemalan women of childbearing age (15 to 49 years). Demographic and health information and blood samples for RBC folate analyses were collected from 1473 women. Prevalence rate ratios of FI and predicted NTD prevalence were estimated based on RBC folate concentrations comparing subpopulations of interest. RESULTS: National FI prevalence was 47.2% [95% confidence interval, 43.3-51.1] and showed wide variation by region (18-81%). In all regions, FI prevalence was higher among indigenous (27-89%) than among nonindigenous populations (16-44%). National NTD risk based on RBC folate concentrations was estimated to be 14 per 10,000 live births (95% uncertainty interval, 11.1-18.6) and showed wide regional variation (from 11 NTDs per 10,000 live births in the Metropolitan region to 26 per 10,000 in the Norte region). CONCLUSION: FI remains a common problem in populations with limited access to fortified products, specifically rural, low income, and indigenous populations. However, among subpopulations that are most likely to have fortified food, the prevalence of FI is similar to countries with well-established fortification programs. |
An assessment of data quality in a multi-site electronic medical record system in Haiti
Puttkammer N , Baseman JG , Devine EB , Valles JS , Hyppolite N , Garilus F , Honore JG , Matheson AI , Zeliadt S , Yuhas K , Sherr K , Cadet JR , Zamor G , Pierre E , Barnhart S . Int J Med Inform 2015 86 104-16 OBJECTIVES: Strong data quality (DQ) is a precursor to strong data use. In resource limited settings, routine DQ assessment (DQA) within electronic medical record (EMR) systems can be resource-intensive using manual methods such as audit and chart review; automated queries offer an efficient alternative. This DQA focused on Haiti's national EMR - iSante - and included longitudinal data for over 100,000 persons living with HIV (PLHIV) enrolled in HIV care and treatment services at 95 health care facilities (HCF). METHODS: This mixed-methods evaluation used a qualitative Delphi process to identify DQ priorities among local stakeholders, followed by a quantitative DQA on these priority areas. The quantitative DQA examined 13 indicators of completeness, accuracy, and timeliness of retrospective data collected from 2005 to 2013. We described levels of DQ for each indicator over time, and examined the consistency of within-HCF performance and associations between DQ and HCF and EMR system characteristics. RESULTS: Over all iSante data, age was incomplete in <1% of cases, while height, pregnancy status, TB status, and ART eligibility were more incomplete (approximately 20-40%). Suspicious data flags were present for <3% of cases of male sex, ART dispenses, CD4 values, and visit dates, but for 26% of cases of age. Discontinuation forms were available for about half of all patients without visits for 180 or more days, and >60% of encounter forms were entered late. For most indicators, DQ tended to improve over time. DQ was highly variable across HCF, and within HCFs DQ was variable across indicators. 
In adjusted analyses, HCF and system factors with generally favorable and statistically significant associations with DQ were University hospital category, private sector governance, presence of local iSante server, greater HCF experience with the EMR, greater maturity of the EMR itself, and having more system users but fewer new users. In qualitative feedback, local stakeholders emphasized lack of stable power supply as a key challenge to data quality and use of the iSante EMR. CONCLUSIONS: Variable performance on key DQ indicators across HCF suggests that excellent DQ is achievable in Haiti, but further effort is needed to systematize and routinize DQ approaches within HCFs. A dynamic, interactive "DQ dashboard" within iSante could bring transparency and motivate improvement. While the results of the study are specific to Haiti's iSante data system, the study's methods and thematic lessons learned hold generalized relevance for other large-scale EMR systems in resource-limited countries. |
The challenges of introducing routine G6PD testing into radical cure: a workshop report
Ley B , Luter N , Espino FE , Devine A , Kalnoky M , Lubell Y , Thriemer K , Baird JK , Poirot E , Conan N , Kheong CC , Dysoley L , Khan WA , Dion-Berboso AG , Bancone G , Hwang J , Kumar R , Price RN , von Seidlein L , Domingo GJ . Malar J 2015 14 (1) 377 The only currently available drug that effectively removes malaria hypnozoites from the human host is primaquine. The use of 8-aminoquinolines is hampered by haemolytic side effects in glucose-6-phosphate dehydrogenase (G6PD) deficient individuals. Recently, a number of qualitative rapid diagnostic tests (RDTs) and a quantitative RDT format have been developed that provide an alternative to the current standard G6PD activity assays. The WHO has recently recommended routine testing of G6PD status prior to primaquine radical cure whenever possible. A workshop was held in the Philippines in early 2015 to discuss key challenges and knowledge gaps that hinder the introduction of routine G6PD testing. Two point-of-care (PoC) test formats for the measurement of G6PD activity are currently available: qualitative tests comparable to malaria RDTs as well as biosensors that provide a quantitative reading. Qualitative G6PD PoC tests provide a binary test result, are easy to use, and some products are comparable in price to the widely used fluorescent spot test. Qualitative test results can accurately classify hemizygous males and homozygous females, but may misclassify heterozygous females with intermediate G6PD activity. Biosensors provide a more complex quantitative readout and are better suited to identify heterozygous females. While associated with higher costs per sample tested, biosensors have the potential for broader use in other scenarios where knowledge of G6PD activity is also relevant. The introduction of routine G6PD testing is associated with additional costs on top of routine treatment that will vary by setting and will need to be assessed prior to test introduction. 
Reliable G6PD PoC tests have the potential to play an essential role in future malaria elimination programmes; however, an improved understanding of how best to integrate routine G6PD testing into different health settings is required. |
Specific SSRIs and birth defects: bayesian analysis to interpret new data in the context of previous reports
Reefhuis J , Devine O , Friedman JM , Louik C , Honein MA . BMJ 2015 351 h3190 OBJECTIVE: To follow up on previously reported associations between periconceptional use of selective serotonin reuptake inhibitors (SSRIs) and specific birth defects using an expanded dataset from the National Birth Defects Prevention Study. DESIGN: Bayesian analysis combining results from independent published analyses with data from a multicenter population based case-control study of birth defects. SETTING: 10 centers in the United States. PARTICIPANTS: 17 952 mothers of infants with birth defects and 9857 mothers of infants without birth defects, identified through birth certificates or birth hospitals, with estimated dates of delivery between 1997 and 2009. EXPOSURES: Citalopram, escitalopram, fluoxetine, paroxetine, or sertraline use in the month before through the third month of pregnancy. Posterior odds ratio estimates were adjusted to account for maternal race/ethnicity, education, smoking, and prepregnancy obesity. MAIN OUTCOME MEASURE: 14 birth defects categories that had associations with SSRIs reported in the literature. RESULTS: Sertraline was the most commonly reported SSRI, but none of the five previously reported birth defect associations with sertraline was confirmed. For nine previously reported associations between maternal SSRI use and birth defects in infants, findings were consistent with no association. High posterior odds ratios excluding the null value were observed for five birth defects with paroxetine (anencephaly 3.2, 95% credible interval 1.6 to 6.2; atrial septal defects 1.8, 1.1 to 3.0; right ventricular outflow tract obstruction defects 2.4, 1.4 to 3.9; gastroschisis 2.5, 1.2 to 4.8; and omphalocele 3.5, 1.3 to 8.0) and for two defects with fluoxetine (right ventricular outflow tract obstruction defects 2.0, 1.4 to 3.1 and craniosynostosis 1.9, 1.1 to 3.0). 
CONCLUSIONS: These data provide reassuring evidence for some SSRIs but suggest that some birth defects occur 2-3.5 times more frequently among the infants of women treated with paroxetine or fluoxetine early in pregnancy. |
Assessing the association between natural food folate intake and blood folate concentrations: a systematic review and Bayesian meta-analysis of trials and observational studies
Marchetta CM , Devine OJ , Crider KS , Tsang BL , Cordero AM , Qi YP , Guo J , Berry RJ , Rosenthal J , Mulinare J , Mersereau P , Hamner HC . Nutrients 2015 7 (4) 2663-86 Folate is found naturally in foods or as synthetic folic acid in dietary supplements and fortified foods. Adequate periconceptional folic acid intake can prevent neural tube defects. Folate intake impacts blood folate concentration; however, the dose-response between natural food folate and blood folate concentrations has not been well described. We estimated this association among healthy females. A systematic literature review identified studies (January 1992–March 2014) with both natural food folate intake alone and blood folate concentration among females aged 12-49 years. Bayesian methods were used to estimate regression model parameters describing the association between natural food folate intake and subsequent blood folate concentration. Seven controlled trials and 29 observational studies met the inclusion criteria. For the six studies using microbiologic assay (MA) included in the meta-analysis, we estimate that a 6% (95% Credible Interval (CrI): 4%, 9%) increase in red blood cell (RBC) folate concentration and a 7% (95% CrI: 1%, 12%) increase in serum/plasma folate concentration can occur for every 10% increase in natural food folate intake. Using modeled results, we estimate that a natural food folate intake of ≥450 μg dietary folate equivalents (DFE)/day could achieve the lower bound of an RBC folate concentration (~1050 nmol/L) associated with the lowest risk of a neural tube defect. Natural food folate intake affects blood folate concentration and adequate intakes could help women achieve a RBC folate concentration associated with a risk of 6 neural tube defects/10,000 live births. |
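The dose-response reported in the abstract above (a 10% increase in natural food folate intake corresponding to a ~6% increase in RBC folate concentration) implies a log-log relationship whose slope can be back-calculated from those two percentages. This is a sketch of that arithmetic only; the study's Bayesian regression model is not reproduced here.

```python
# Back-calculate the elasticity b of a log-log dose-response model
# log(blood folate) = a + b * log(intake), using the reported result
# that a 10% intake increase yields a ~6% RBC folate increase.
import math

b = math.log(1.06) / math.log(1.10)  # ~0.61

def rbc_folate_ratio(intake_ratio: float) -> float:
    """Multiplicative change in RBC folate for a given intake ratio."""
    return intake_ratio ** b

# A 10% intake increase reproduces the reported 6% concentration increase.
print(f"{rbc_folate_ratio(1.10):.2f}")  # 1.06
```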
Assessing the association between the methylenetetrahydrofolate reductase (MTHFR) 677C>T polymorphism and blood folate concentrations: a systematic review and meta-analysis of trials and observational studies.
Tsang BL , Devine OJ , Cordero AM , Marchetta CM , Mulinare J , Mersereau P , Guo J , Qi YP , Berry RJ , Rosenthal J , Crider KS , Hamner HC . Am J Clin Nutr 2015 101 (6) 1286-94 BACKGROUND: The methylenetetrahydrofolate reductase (MTHFR) 677C>T polymorphism is a risk factor for neural tube defects. The T allele produces an enzyme with reduced folate-processing capacity, which has been associated with lower blood folate concentrations. OBJECTIVE: We assessed the association between MTHFR C677T genotypes and blood folate concentrations among healthy women aged 12-49 y. DESIGN: We conducted a systematic review of the literature published from January 1992 to March 2014 to identify trials and observational studies that reported serum, plasma, or red blood cell (RBC) folate concentrations and MTHFR C677T genotype. We conducted a meta-analysis for estimates of percentage differences in blood folate concentrations between genotypes. RESULTS: Forty studies met the inclusion criteria. Of the 6 studies that used the microbiologic assay (MA) to measure serum or plasma (S/P) and RBC folate concentrations, the percentage difference between genotypes showed a clear pattern of CC > CT > TT. The percentage difference was greatest for CC > TT [S/P: 13%; 95% credible interval (CrI): 7%, 18%; RBC: 16%; 95% CrI: 12%, 20%] followed by CC > CT (S/P: 7%; 95% CrI: 1%, 12%; RBC: 8%; 95% CrI: 4%, 12%) and CT > TT (S/P: 6%; 95% CrI: 1%, 11%; RBC: 9%; 95% CrI: 5%, 13%). S/P folate concentrations measured by using protein-binding assays (PBAs) also showed this pattern but to a greater extent (e.g., CC > TT: 20%; 95% CrI: 17%, 22%). In contrast, RBC folate concentrations measured by using PBAs did not show the same pattern and are presented in the Supplemental Material only. CONCLUSIONS: Meta-analysis results (limited to the MA, the recommended population assessment method) indicated a consistent percentage difference in S/P and RBC folate concentrations across MTHFR C677T genotypes. 
Lower blood folate concentrations associated with this polymorphism could have implications for a population-level risk of neural tube defects. |
Using Bayesian models to assess the effects of under-reporting of cannabis use on the association with birth defects, National Birth Defects Prevention Study, 1997-2005
van Gelder MM , Donders AR , Devine O , Roeleveld N , Reefhuis J . Paediatr Perinat Epidemiol 2014 28 (5) 424-33 BACKGROUND: Studies on associations between periconceptional cannabis exposure and birth defects have mainly relied on self-reported exposure. Therefore, the results may be biased due to under-reporting of the exposure. The aim of this study was to quantify the potential effects of this form of exposure misclassification. METHODS: Using multivariable logistic regression, we re-analysed associations between periconceptional cannabis use and 20 specific birth defects using data from the National Birth Defects Prevention Study from 1997-2005 for 13 859 case infants and 6556 control infants. For seven birth defects, we implemented four Bayesian models based on various assumptions concerning the sensitivity of self-reported cannabis use to estimate odds ratios (ORs), adjusted for confounding and under-reporting of the exposure. We used information on sensitivity of self-reported cannabis use from the literature for prior assumptions. RESULTS: The results unadjusted for under-reporting of the exposure showed an association between cannabis use and anencephaly (posterior OR 1.9 [95% credible interval (CRI) 1.1, 3.2]) which persisted after adjustment for potential exposure misclassification. Initially, no statistically significant associations were observed between cannabis use and the other birth defect categories studied. Although adjustment for under-reporting did not notably change these effect estimates, cannabis use was associated with esophageal atresia (posterior OR 1.7 [95% CRI 1.0, 2.9]), diaphragmatic hernia (posterior OR 1.8 [95% CRI 1.1, 3.0]), and gastroschisis (posterior OR 1.7 [95% CRI 1.2, 2.3]) after correction for exposure misclassification. CONCLUSIONS: Under-reporting of the exposure may have obscured some cannabis-birth defect associations in previous studies. However, the resulting bias is likely to be limited. |
Diabetes and congenital heart defects: a systematic review, meta-analysis, and modeling project.
Simeone RM , Devine OJ , Marcinkevage JA , Gilboa SM , Razzaghi H , Bardenheier BH , Sharma AJ , Honein MA . Am J Prev Med 2014 48 (2) 195-204 CONTEXT: Maternal pregestational diabetes (PGDM) is a risk factor for development of congenital heart defects (CHDs). Glycemic control before pregnancy reduces the risk of CHDs. A meta-analysis was used to estimate summary ORs and mathematical modeling was used to estimate population attributable fractions (PAFs) and the annual number of CHDs in the U.S. potentially preventable by establishing glycemic control before pregnancy. EVIDENCE ACQUISITION: A systematic search of the literature through December 2012 was conducted in 2012 and 2013. Case-control or cohort studies were included. Data were abstracted from 12 studies for a meta-analysis of all CHDs. EVIDENCE SYNTHESIS: Summary estimates of the association between PGDM and CHDs and 95% credible intervals (95% CrIs) were developed using Bayesian random-effects meta-analyses for all CHDs and specific CHD subtypes. Posterior estimates of this association were combined with estimates of CHD prevalence to produce estimates of PAFs and annual prevented cases. Ninety-five percent uncertainty intervals (95% UIs) for estimates of the annual number of preventable cases were developed using Monte Carlo simulation. Analyses were conducted in 2013. The summary OR estimate for the association between PGDM and CHDs was 3.8 (95% CrI=3.0, 4.9). Approximately 2670 (95% UI=1795, 3795) cases of CHDs could potentially be prevented annually if all women in the U.S. with PGDM achieved glycemic control before pregnancy. CONCLUSIONS: Estimates from this analysis suggest that preconception care of women with PGDM could have a measurable impact by reducing the number of infants born with CHDs. |
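The modeling step in the abstract above — converting a summary odds ratio into an annual count of potentially preventable cases with a Monte Carlo uncertainty interval — can be sketched as follows. None of the numeric inputs (annual CHD case count, OR distribution, PGDM prevalence) are study values; all are assumed placeholders.

```python
# Hypothetical sketch: draw an odds ratio and an exposure prevalence
# from uncertainty distributions, convert each draw to a PAF, apply it
# to an assumed annual case count, and report a 95% uncertainty
# interval from the simulated distribution.
import random

random.seed(1)

ANNUAL_CHD_CASES = 40_000  # assumed US annual CHD births (placeholder)

def preventable_cases(n_draws: int = 20_000) -> tuple[float, float]:
    draws = []
    for _ in range(n_draws):
        or_draw = random.lognormvariate(1.335, 0.12)  # centered near OR ~3.8
        prev = random.betavariate(10, 990)            # assumed PGDM prevalence ~1%
        paf = prev * (or_draw - 1) / (1 + prev * (or_draw - 1))
        draws.append(paf * ANNUAL_CHD_CASES)
    draws.sort()
    return draws[int(0.025 * n_draws)], draws[int(0.975 * n_draws)]

low, high = preventable_cases()
print(f"95% UI: {low:.0f} to {high:.0f} cases")
```

Using the odds ratio in place of the relative risk in the PAF formula is a common approximation when the outcome is rare, as it is for specific CHD subtypes.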
Community socioeconomic disadvantage and the survival of infants with congenital heart defects
Kucik JE , Nembhard WN , Donohue P , Devine O , Wang Y , Minkovitz CS , Burke T . Am J Public Health 2014 104 (11) e1-e8 OBJECTIVES: We examined the association between survival of infants with severe congenital heart defects (CHDs) and community-level indicators of socioeconomic status. METHODS: We identified infants born to residents of Arizona, New Jersey, New York, and Texas between 1999 and 2007 with selected CHDs from 4 population-based, statewide birth defect surveillance programs. We linked data to the 2000 US Census to obtain 11 census tract-level socioeconomic indicators. We estimated survival probabilities and hazard ratios adjusted for individual characteristics. RESULTS: We observed differences in infant survival for 8 community socioeconomic indicators (P < .05). The greatest mortality risk was associated with residing in communities in the most disadvantaged deciles for poverty (adjusted hazard ratio [AHR] = 1.49; 95% confidence interval [CI] = 1.11, 1.99), education (AHR = 1.51; 95% CI = 1.16, 1.96), and operator or laborer occupations (AHR = 1.54; 95% CI = 1.16, 1.96). Survival decreased with increasing numbers of indicators that were in the most disadvantaged decile. Community-level mortality risk persisted when we adjusted for individual-level characteristics. CONCLUSIONS: The increased mortality risk among infants with CHDs living in socioeconomically deprived communities might indicate barriers to quality and timely care at which public health interventions might be targeted. |