Last data update: Apr 18, 2025. (Total: 49119 publications since 2009)
Records 1-23 (of 23 Records) |
Query Trace: O'Hagan J[original query] |
Incubation period of Clostridioides difficile infection in hospitalized patients and long-term care facility residents: a prospective cohort study
Curry SR , Hecker MT , O'Hagan J , Kutty PK , Ng-Wong YK , Cadnum JL , Alhmidi H , Gonzalez-Orta MI , Saldana C , Wilson BM , Donskey CJ . Antimicrob Steward Healthc Epidemiol 2024 4 (1) e144 BACKGROUND: The incubation period for Clostridioides difficile infection (CDI) is generally considered to be less than 1 week, but some recent studies suggest that prolonged carriage prior to disease onset may be common. OBJECTIVE: To estimate the incubation period for patients developing CDI after initial negative cultures. METHODS: In 3 tertiary care medical centers, we conducted a cohort study to identify hospitalized patients and long-term care facility residents with negative initial cultures for C. difficile followed by a diagnosis of CDI with or without prior detection of carriage. Cases were classified as healthcare facility-onset, community-onset, healthcare facility-associated, or community-associated and were further classified as probable, possible, or unlikely CDI. A parametric accelerated failure time model was used to estimate the distribution of the incubation period. RESULTS: Of 4,179 patients with negative enrollment cultures and no prior CDI diagnosis within 56 days, 107 (2.6%) were diagnosed as having CDI, including 19 (17.8%) with and 88 (82.2%) without prior detection of carriage. When the data were censored to only include participants with negative cultures collected within 14 days, the estimated median incubation period was 6 days with 25% and 75% of estimated incubation periods occurring within 3 and 12 days, respectively. The estimated incubation period did not differ significantly for patients classified as probable, possible, or unlikely CDI. CONCLUSION: Our findings are consistent with previous studies suggesting that the incubation period for CDI is typically less than 1 week and is less than 2 weeks in most cases. |
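The parametric estimate described in the abstract above can be illustrated with a minimal interval-censored Weibull fit: onset times are only known to fall between serial cultures, so the likelihood uses the probability mass of each censoring interval. This is an illustrative sketch, not the study's actual model or data; the Weibull form, the 3-day culture interval, and the simulated incubation times are all assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

# Simulated incubation times (days), roughly matching the reported
# quartiles (median ~6 d, IQR ~3-12 d); purely illustrative data.
true = weibull_min.rvs(c=1.2, scale=8.0, size=500, random_state=0)

# Interval censoring: onset is only known to fall between serial cultures,
# assumed here to be collected every 3 days.
lo = np.floor(true / 3) * 3
hi = lo + 3

def neg_log_lik(params):
    shape, scale = np.exp(params)  # log-parameterization keeps both positive
    # Likelihood contribution of each subject is the probability that the
    # incubation time falls inside its censoring interval.
    p = weibull_min.cdf(hi, c=shape, scale=scale) - weibull_min.cdf(lo, c=shape, scale=scale)
    return -np.sum(np.log(np.clip(p, 1e-12, None)))

res = minimize(neg_log_lik, x0=np.log([1.0, 7.0]), method="Nelder-Mead")
shape, scale = np.exp(res.x)
median = scale * np.log(2) ** (1 / shape)  # Weibull median survival time
print(f"estimated median incubation period: {median:.1f} days")
```

The same likelihood construction extends to an accelerated failure time regression by letting the scale parameter depend on covariates.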
Natural history of Clostridioides difficile colonization and infection following new acquisition of carriage in healthcare settings: A prospective cohort study
Curry SR , Hecker MT , O'Hagan J , Kutty PK , Alhmidi H , Ng-Wong YK , Cadnum JL , Jencson AL , Gonzalez-Orta M , Saldana C , Wilson BM , Donskey CJ . Clin Infect Dis 2023 77 (1) 77-83 BACKGROUND: Limited information is available on the natural history of Clostridioides difficile colonization and infection in patients with new acquisition of C. difficile in healthcare settings. METHODS: In 3 hospitals and affiliated long-term care facilities, we collected serial perirectal cultures from patients with no diarrhea on enrollment to identify new acquisition of toxigenic C. difficile carriage and determined the duration and burden of carriage. Asymptomatic carriage was defined as transient if only 1 culture was positive with negative cultures before and after or persistent if 2 or more cultures were positive. Clearance of carriage was defined as 2 consecutive negative perirectal cultures. RESULTS: Of 1,432 patients with negative initial cultures and at least 1 follow-up culture, 39 (2.7%) developed CDI without prior detection of carriage and 142 (9.9%) acquired asymptomatic carriage with 19 (13.4%) subsequently diagnosed with CDI. Of 82 patients analyzed for persistence of carriage, 50 (61.0%) had transient carriage and 32 (39.0%) had persistent carriage, with an estimated median of 77 days to clearance of colonization (range, 14 to 133 days). Most persistent carriers had a relatively high burden of carriage and maintained the same ribotype over time, whereas most transient carriers had a low burden of carriage detected only using broth enrichment cultures. CONCLUSIONS: In 3 healthcare facilities, 9.9% of patients acquired asymptomatic carriage of toxigenic C. difficile, and 13.4% were subsequently diagnosed with CDI. Most carriers had transient rather than persistent carriage and most patients developing CDI did not have prior detection of carriage. |
Improving mathematical modeling of interventions to prevent healthcare-associated infections by interrupting transmission of pathogens: How common modeling assumptions about colonized individuals impact intervention effectiveness estimates
Gowler CD , Slayton RB , Reddy SC , O'Hagan JJ . PLoS One 2022 17 (2) e0264344 Mathematical models are used to gauge the impact of interventions for healthcare-associated infections. As with any analytic method, such models require many assumptions. Two common assumptions are that asymptomatically colonized individuals are more likely to be hospitalized and that they spend longer in the hospital per admission because of their colonization status. These assumptions have no biological basis and could impact the estimated effects of interventions in unintended ways. Therefore, we developed a model of methicillin-resistant Staphylococcus aureus transmission to explicitly evaluate the impact of these assumptions. We found that assuming that asymptomatically colonized individuals were more likely to be admitted to the hospital or spend longer in the hospital than uncolonized individuals biased results compared to a more realistic model that did not make either assumption. Results were heavily biased when estimating the impact of an intervention that directly reduced transmission in a hospital. In contrast, results were moderately biased when estimating the impact of an intervention that decolonized hospital patients. Our findings can inform choices modelers face when constructing models of healthcare-associated infection interventions and thereby improve their validity. |
Modeling the potential impact of administering vaccines against Clostridioides difficile infection to individuals in healthcare facilities
Toth DJA , Keegan LT , Samore MH , Khader K , O'Hagan JJ , Yu H , Quintana A , Swerdlow DL . Vaccine 2020 38 (37) 5927-5932 BACKGROUND: A vaccine against Clostridioides difficile infection (CDI) is in development. While the vaccine has potential to both directly protect those vaccinated and mitigate transmission by reducing environmental contamination, the impact of the vaccine on C. difficile colonization remains unclear. Consequently, the transmission-reduction effect of the vaccine depends on the contribution of symptomatic CDI to overall transmission of C. difficile. METHODS: We designed a simulation model of CDI among patients in a network of 10 hospitals and nursing homes and calibrated the model using estimates of transmissibility from whole genome sequencing studies that estimated the fraction of CDI attributable to transmission from other CDI patients. We assumed the vaccine reduced the rate of progression to CDI among carriers by 25-95% after completion of a 3-dose vaccine course administered to randomly chosen patients at facility discharge. We simulated the administration of this vaccination campaign and tallied effects over 5 years. RESULTS: We estimated 30 times higher infectivity of CDI patients compared to other carriers. Simulations of the vaccination campaign produced an average reduction of 3-16 CDI cases per 1000 vaccinated patients, with 2-11 of those cases prevented among those vaccinated and 1-5 prevented among unvaccinated patients. CONCLUSIONS: Our findings demonstrate potential for a vaccine against CDI to reduce transmissions in healthcare facilities, even with no direct effect on carriage susceptibility. The vaccine's population impact will increase if received by individuals at risk for CDI onset in high-transmission settings. |
Modeling Infectious Diseases in Healthcare Network (MInD-Healthcare) framework for describing and reporting multidrug resistant organism and healthcare-associated infections agent-based modeling methods
Slayton RB , O'Hagan JJ , Barnes S , Rhea S , Hilscher R , Rubin M , Lofgren E , Singh B , Segre A , Paul P . Clin Infect Dis 2020 71 (9) 2527-2532 Mathematical modeling of healthcare-associated infections (HAIs) and multidrug resistant organisms (MDROs) improves our understanding of pathogen transmission dynamics and provides a framework for evaluating prevention strategies. One way of improving communication among modelers is by providing a standardized way of describing and reporting models, thereby instilling confidence in the reproducibility and generalizability of such models. We updated the Overview, Design concepts, and Details protocol developed by Grimm et al. for describing agent-based models (ABMs) to better align with elements commonly included in healthcare-related ABMs. The MInD-Healthcare framework includes the following nine key elements: 1. Purpose and scope; 2. Entities, state variables, and scales; 3. Initialization; 4. Process overview and scheduling; 5. Input data; 6. Agent interactions and organism transmission; 7. Stochasticity; 8. Submodels; 9. Model verification, calibration, and validation. Our objective is that this framework will improve the quality of evidence generated utilizing these models. |
Forecasting the 2014 West African Ebola outbreak
Carias C , O'Hagan JJ , Gambhir M , Kahn EB , Swerdlow DL , Meltzer MI . Epidemiol Rev 2019 41 (1) 34-50 In 2014/15, an Ebola outbreak of unprecedented dimensions afflicted the West African countries of Liberia, Guinea, and Sierra Leone. We performed a systematic review of manuscripts that forecasted the outbreak while it was occurring and derived implications for the ways results could be interpreted by policy makers. We reviewed 26 manuscripts, published between 2014 and April 2015, that presented forecasts of the West African Ebola outbreak. Forecasted case counts varied widely. An important determinant of forecast accuracy for case counts was how far into the future predictions were made. Generally, those that made forecasts less than 2 months into the future tended to be more accurate than those that made forecasts more than 10 weeks into the future. The exceptions were parsimonious statistical models in which the decay of the rate of spread of the pathogen among susceptible individuals was dealt with explicitly. Regarding future outbreaks, the most important lessons for policy makers when using similar modeling results are: i) uncertainty of forecasts will be higher at the beginning of an outbreak; ii) when data are limited, forecasts produced by models designed to inform specific decisions should be used in complementary fashion for robust decision making - for this outbreak, two statistical models produced the most reliable case count forecasts but did not allow the impact of interventions to be understood, while several compartmental models could estimate the impact of interventions but required data that were not available; iii) timely collection of essential data is necessary for optimal model use. |
Sample size estimates for cluster-randomized trials in hospital infection control and antimicrobial stewardship
Blanco N , Harris AD , Magder LS , Jernigan JA , Reddy SC , O'Hagan J , Hatfield KM , Pineles L , Perencevich E , O'Hara LM . JAMA Netw Open 2019 2 (10) e1912644 Importance: An important step in designing, executing, and evaluating cluster-randomized trials (CRTs) is understanding the correlation and thus nonindependence that exists among individuals in a cluster. In hospital epidemiology, there is a shortage of CRTs that have published their intraclass correlation coefficient or coefficient of variation (CV), making prospective sample size calculations difficult for investigators. Objectives: To estimate the number of hospitals needed to power parallel CRTs of interventions to reduce health care-associated infection outcomes and to demonstrate how different parameters such as CV and expected effect size are associated with the sample size estimates in practice. Design, Setting, and Participants: This longitudinal cohort study estimated parameters for sample size calculations using national rates developed by the Centers for Disease Control and Prevention for methicillin-resistant Staphylococcus aureus (MRSA) bacteremia, central-line-associated bloodstream infections (CLABSI), catheter-associated urinary tract infections (CAUTI), and Clostridium difficile infections (CDI) from 2016. For MRSA and vancomycin-resistant enterococci (VRE) acquisition, outcomes were estimated using data from 2012 from the Benefits of Universal Glove and Gown study. Data were collected from June 2017 through September 2018 and analyzed from September 2018 through January 2019. Main Outcomes and Measures: Calculated number of clusters needed for adequate power to detect an intervention effect using a 2-group parallel CRT. Results: To study an intervention with a 30% decrease in daily rates, 73 total clusters were needed (37 in the intervention group and 36 in the control group) for MRSA bacteremia, 82 for CAUTI, 60 for CLABSI, and 31 for CDI. 
If a 10% decrease in rates was expected, 768 clusters were needed for MRSA bacteremia, 875 for CAUTI, 631 for CLABSI, and 329 for CDI. For MRSA or VRE acquisition, 50 or 40 total clusters, respectively, were required to observe a 30% decrease, whereas 540 or 426 clusters, respectively, were required to detect a 10% decrease. Conclusions and Relevance: This study suggests that large sample sizes are needed to appropriately power parallel CRTs targeting infection prevention outcomes. Sample sizes are most associated with expected effect size and CV of hospital rates. |
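The strong dependence of required cluster counts on expected effect size and CV reported above follows from the standard unmatched cluster-randomized trial sample-size formula for comparing rates (Hayes & Bennett). The sketch below implements that formula; the baseline rate, patient-days per hospital, and CV values are illustrative assumptions, not the paper's actual parameters, so the outputs will not reproduce the paper's numbers.

```python
from math import ceil
from scipy.stats import norm

def clusters_per_arm(lam0, effect, person_time, k, alpha=0.05, power=0.8):
    """Hayes & Bennett-style formula for an unmatched cluster-randomized
    trial comparing event rates, with between-cluster coefficient of
    variation k and person_time of follow-up per cluster.
    Returns the number of clusters needed per arm."""
    lam1 = lam0 * (1 - effect)               # rate expected under the intervention
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    c = 1 + z**2 * ((lam0 + lam1) / person_time
                    + k**2 * (lam0**2 + lam1**2)) / (lam0 - lam1)**2
    return ceil(c)

# Illustrative inputs: baseline rate of 0.5 infections per 1000 patient-days,
# 100,000 patient-days of follow-up per hospital, CV of 0.6 across hospitals.
print(clusters_per_arm(0.0005, 0.30, 100_000, 0.6))  # 30% expected reduction
print(clusters_per_arm(0.0005, 0.10, 100_000, 0.6))  # 10% expected reduction
```

Note how the k**2 term dominates once clusters are large: beyond a point, adding patient-time per hospital barely reduces the number of hospitals needed, which is why small expected effects demand hundreds of clusters.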
Iterative development of a tailored mHealth intervention for adolescent and young adult survivors of childhood cancer
Schwartz LA , Psihogios AM , Henry-Moss D , Daniel LC , Ver Hoeve ES , Velazquez-Martin B , Butler E , Hobbie WL , Buchanan Lunsford N , Sabatino SA , Barakat LP , Ginsberg JP , Fleisher L , Deatrick JA , Jacobs LA , O'Hagan B , Anderson L , Fredericks E , Amaral S , Dowshen N , Houston K , Vachani C , Hampshire MK , Metz JM , Hill-Kayser CE , Szalda D . Clin Pract Pediatr Psychol 2019 7 (1) 31-43 Objective: Methods for developing mobile health (mHealth) interventions are not well described. To guide the development of future mHealth interventions, we describe the application of the agile science framework to iteratively develop an mHealth intervention for adolescent and young adult (AYA) survivors of childhood cancer. Method: We created the AYA STEPS mobile app (AYA Self-management via Texting, Education, and Plans for Survivorship) by modifying and integrating 2 existing programs: an online survivorship care plan (SCP) generator and a text messaging self-management intervention for AYA off treatment. The iterative development process involved 3 stages of agile science: (1) formative work, (2) obtaining feedback about the first AYA STEPS prototype, and (3) pilot testing and finalization of a prototype. We determined preferences of AYA stakeholders as well as discovered and addressed technology problems prior to beginning a subsequent randomized controlled trial. Results: AYA survivors reported that the app and the embedded tailored messages related to their health and SCP were easy to use and generally satisfying and beneficial. Usage data supported that AYA were engaged in the app. Technology glitches were discovered in the pilot and addressed. Conclusion: The iterative development of AYA STEPS was essential for creating a consistent and acceptable end user experience. This study serves as one example of how behavioral scientists may apply agile science to their own mHealth research. |
The challenges of tracking Clostridium difficile to its source in hospitalized patients
O'Hagan JJ , McDonald LC . Clin Infect Dis 2018 68 (2) 210-212 Clostridium difficile is responsible for between 400,000 and 500,000 infections in the United States each year and is the leading cause of healthcare-associated infections [1, 2]. A major risk factor for Clostridium difficile infection (CDI) is a current or recent stay in a hospital, where rates of both symptomatic CDI and asymptomatic colonization are higher than in the community [3–6]. Infection control recommendations for hospitals focus on preventing transmission from symptomatic CDI patients; active surveillance testing to detect asymptomatically-colonized patients is not currently recommended [7]. However, asymptomatically-colonized patients, as well as patients with active CDI, may cause CDI through transmission to other patients, although their relative contributions to overall transmission remain uncertain [8–11]. For example, using restriction enzyme analysis typing, Clabots et al found that 84% (16/19) of hospital acquisitions were linked to asymptomatic carriers while, in another study that used variable-number tandem repeats genotyping, 29% (16/56) of acquisitions were linked to carriers [9, 10]. Transmission occurs via C. difficile spores that are resistant to commonly-used hand sanitizers and environmental disinfectants [3]. Therefore, evidence that asymptomatic carriers are responsible for a large amount of transmission within hospitals would prompt new interventions designed to reduce transmission from such patients. In this issue, Kong et al report findings from the largest study to date investigating the relative roles of carriers and cases as sources of transmission to patients with CDI [12]. |
The potential for interventions in a long-term acute care hospital to reduce transmission of carbapenem-resistant Enterobacteriaceae in affiliated healthcare facilities
Toth DJA , Khader K , Slayton RB , Kallen AJ , Gundlapalli AV , O'Hagan JJ , Fiore AE , Rubin MA , Jernigan JA , Samore MH . Clin Infect Dis 2017 65 (4) 581-587 Background: Carbapenem-resistant Enterobacteriaceae (CRE) are high-priority bacterial pathogens targeted for efforts to decrease transmissions and infections in healthcare facilities. Some regions have experienced CRE outbreaks that were likely amplified by frequent transmission in long-term acute care hospitals (LTACHs). Planning and funding intervention efforts focused on LTACHs is one proposed strategy to contain outbreaks; however, the potential regional benefits of such efforts are unclear. Methods: We designed an agent-based simulation model of patients in a regional network of 10 healthcare facilities including 1 LTACH, 3 short-stay acute care hospitals (ACHs), and 6 nursing homes (NHs). The model was calibrated to achieve realistic patient flow and CRE transmission and detection rates. We then simulated the initiation of an entirely LTACH-focused intervention in a previously CRE-free region, including active surveillance for CRE carriers and enhanced isolation of identified carriers. Results: When initiating the intervention at the 1st clinical CRE detection in the LTACH, cumulative CRE transmissions over 5 years across all 10 facilities were reduced by 79-93% compared to no-intervention simulations. This result was robust to changing assumptions for transmission within non-LTACH facilities and flow of patients from the LTACH. Delaying the intervention until the 20th CRE detection resulted in substantial delays in achieving optimal regional prevalence, while still reducing transmissions by 60-79% over 5 years. Conclusions: Focusing intervention efforts on LTACHs is potentially a highly efficient strategy for reducing CRE transmissions across an entire region, particularly when implemented as early as possible in an emerging outbreak. |
Mortality from circulatory diseases and other non-cancer outcomes among nuclear workers in France, the United Kingdom and the United States (INWORKS)
Gillies M , Richardson DB , Cardis E , Daniels RD , O'Hagan JA , Haylock R , Laurier D , Leuraud K , Moissonnier M , Schubauer-Berigan MK , Thierry-Chef I , Kesminiene A . Radiat Res 2017 188 (3) 276-290 Positive associations between external radiation dose and non-cancer mortality have been reported in a number of published studies, primarily of populations exposed to high-dose, high-dose-rate ionizing radiation. The goal of this study was to determine whether external radiation dose was associated with non-cancer mortality in a large pooled cohort of nuclear workers exposed to low-dose radiation accumulated at low dose rates. The cohort comprised 308,297 workers from France, the United Kingdom and the United States. The average cumulative equivalent dose at a tissue depth of 10 mm [Hp(10)] was 25.2 mSv. In total, 22% of the cohort were deceased by the end of follow-up, with 46,029 deaths attributed to non-cancer outcomes, including 27,848 deaths attributed to circulatory diseases. Poisson regression was used to investigate the relationship between cumulative radiation dose and non-cancer mortality rates. A statistically significant association between radiation dose and all non-cancer causes of death was observed [excess relative risk per sievert (ERR/Sv) = 0.19; 90% CI: 0.07, 0.30]. This was largely driven by the association between radiation dose and mortality due to circulatory diseases (ERR/Sv = 0.22; 90% CI: 0.08, 0.37), with slightly smaller positive, but nonsignificant, point estimates for mortality due to nonmalignant respiratory disease (ERR/Sv = 0.13; 90% CI: -0.17, 0.47) and digestive disease (ERR/Sv = 0.11; 90% CI: -0.36, 0.69). The point estimate for the association between radiation dose and deaths due to external causes was nonsignificantly negative (ERR = -0.12; 90% CI: <-0.60, 0.45). 
Within circulatory disease subtypes, associations with dose were observed for mortality due to cerebrovascular disease (ERR/Sv = 0.50; 90% CI: 0.12, 0.94) and mortality due to ischemic heart disease (ERR/Sv = 0.18; 90% CI: 0.004, 0.36). The estimates of associations between radiation dose and non-cancer mortality are generally consistent with those observed in atomic bomb survivor studies. The findings of this study could be interpreted as providing further evidence that non-cancer disease risks may be increased by external radiation exposure, particularly for ischemic heart disease and cerebrovascular disease. However, heterogeneity in the estimated ERR/Sv was observed, which warrants further investigation. Further follow-up of these cohorts, with the inclusion of internal exposure information and other potential confounders associated with lifestyle factors, may prove informative, as will further work on elucidating the biological mechanisms that might cause these non-cancer effects at low doses. |
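The ERR/Sv quantities quoted above come from a linear excess relative risk model, in which the mortality rate takes the form rate = λ0 × (1 + β × dose) and β is the ERR per sievert, fit by Poisson maximum likelihood. The sketch below fits that model form to simulated grouped data; the dose strata, person-years, baseline rate, and true ERR/Sv of 0.2 are all invented for illustration and are not the INWORKS data.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)

# Simulated grouped data: mean cumulative dose (Sv), person-years, and
# Poisson death counts per dose stratum. All values are illustrative.
dose = np.array([0.005, 0.02, 0.05, 0.1, 0.25, 0.5])
pyr = np.array([4e6, 2e6, 1e6, 5e5, 2e5, 5e4])
deaths = rng.poisson(1.5e-3 * (1 + 0.2 * dose) * pyr)  # true ERR/Sv = 0.2

def neg_log_lik(err):
    # Linear ERR model: rate = lam0 * (1 + err * dose).
    rr = 1 + err * dose
    # Profile out the baseline rate lam0 analytically (its Poisson MLE
    # given err), leaving a one-dimensional search over the ERR.
    lam0 = deaths.sum() / (rr * pyr).sum()
    mu = lam0 * rr * pyr                    # expected deaths per stratum
    return -(deaths * np.log(mu) - mu).sum()

# The lower bound keeps 1 + err * dose positive for the maximum dose.
res = minimize_scalar(neg_log_lik, bounds=(-1.9, 10), method="bounded")
print(f"ERR/Sv point estimate: {res.x:.2f}")
```

With realistically small doses and a true ERR of this size the estimate is very noisy, which is one reason studies of this kind need hundreds of thousands of workers to produce the confidence intervals reported above.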
Examining temporal effects on cancer risk in the International Nuclear Workers' Study (INWORKS)
Daniels RD , Bertke SJ , Richardson DB , Cardis E , Gillies M , O'Hagan JA , Haylock R , Laurier D , Leuraud K , Moissonnier M , Thierry-Chef I , Kesminiene A , Schubauer-Berigan MK . Int J Cancer 2016 140 (6) 1260-1269 This paper continues the series of publications from the International Nuclear Workers Study cohort (INWORKS) that comprises 308,297 workers from France, the United Kingdom and the United States, providing 8.2 million person-years of observation from a combined follow-up period (at earliest 1944 to at latest 2005). These workers' external radiation exposures were primarily to photons, resulting in an estimated average career absorbed dose to the colon of 17.4 milligray. The association between cumulative ionizing radiation dose and cancer mortality was evaluated in general relative risk models that describe modification of the excess relative risk (ERR) per gray (Gy) by time since exposure and age at exposure. Methods analogous to a nested case-control study using conditional logistic regression of sampled risk sets were used. Outcomes included: all solid cancers, lung cancer, leukemias excluding chronic lymphocytic, acute myeloid leukemia, chronic myeloid leukemia, multiple myeloma, Hodgkin lymphoma, and non-Hodgkin lymphoma. Significant risk heterogeneity was evident in chronic myeloid leukemia with time since exposure, where we observed increased ERR per Gy estimates shortly after exposure (2-10 years) and again later (20-30 years). We observed delayed effects for acute myeloid leukemia although estimates were not statistically significant. Solid cancer excess risk was restricted to exposure at age 35+ years and also diminished for exposure 30 years prior to attained age. Persistent or late effects suggest additional follow-up may inform lifetime risks. However, cautious interpretation of results is needed due to analytical limitations and a lack of confirmatory results from other studies. |
The International Nuclear Workers Study (INWORKS): a collaborative epidemiological study to improve knowledge about health effects of protracted low-dose exposure
Laurier D , Richardson DB , Cardis E , Daniels RD , Gillies M , O'Hagan J , Hamra GB , Haylock R , Leuraud K , Moissonnier M , Schubauer-Berigan MK , Thierry-Chef I , Kesminiene A . Radiat Prot Dosimetry 2016 173 21-25 INWORKS is a multinational cohort study, gathering 308 297 workers in the nuclear industry in France, the United Kingdom and the United States of America, with detailed individual monitoring data for external exposure to ionising radiation. Over a mean duration of follow-up of 27 y, the number of observed deaths was 66 632, including 17 957 deaths due to solid cancers, 1791 deaths due to haematological cancers and 27 848 deaths due to cardiovascular diseases. Mean individual cumulative external dose over the period 1945-2005 was 25 mSv. Analyses demonstrated a significant association between red bone marrow dose and the risk of leukaemia (excluding chronic lymphocytic leukaemia) and between colon dose and the risk of solid cancers. INWORKS assembled some of the strongest evidence to strengthen the scientific basis for the protection of adults from low dose, low-dose rate, exposures to ionising radiation. |
Temporally varying relative risks for infectious diseases: implications for infectious disease control
Goldstein E , Pitzer VE , O'Hagan JJ , Lipsitch M . Epidemiology 2016 28 (1) 136-144 Risks for disease in some population groups relative to others (relative risks) are usually considered to be consistent over time, though they are often modified by other, non-temporal factors. For infectious diseases, in which overall incidence often varies substantially over time, the patterns of temporal changes in relative risks can inform our understanding of basic epidemiologic questions. For example, recent work suggests that temporal changes in relative risks of infection over the course of an epidemic cycle can both be used to identify population groups that drive infectious disease outbreaks, and help elucidate differences in the effect of vaccination against infection (that is relevant to transmission control) compared with its effect against disease episodes (that reflects individual protection). Patterns of change in the age groups affected over the course of seasonal outbreaks can provide clues to the types of pathogens that could be responsible for diseases for which an infectious cause is suspected. Changing apparent efficacy of vaccines during trials may provide clues to the vaccine's mode of action and/or indicate risk heterogeneity in the trial population. Declining importance of unusual behavioral risk factors may be a signal of increased local transmission of an infection. We review these developments and the related public health implications. |
Estimation of severe Middle East Respiratory Syndrome cases in the Middle East, 2012-2016
O'Hagan JJ , Carias C , Rudd JM , Pham HT , Haber Y , Pesik N , Cetron MS , Gambhir M , Gerber SI , Swerdlow DL . Emerg Infect Dis 2016 22 (10) 1797-9 Using data from travelers to 4 countries in the Middle East, we estimated 3,250 (95% CI 1,300-6,600) severe cases of Middle East respiratory syndrome occurred in this region during September 2012-January 2016. This number is 2.3-fold higher than the number of laboratory-confirmed cases recorded in these countries. |
Exportations of symptomatic cases of MERS-CoV infection to countries outside the Middle East
Carias C , O'Hagan JJ , Jewett A , Gambhir M , Cohen NJ , Haber Y , Pesik N , Swerdlow DL . Emerg Infect Dis 2016 22 (3) 723-5 In 2012, an outbreak of infection with Middle East respiratory syndrome coronavirus (MERS-CoV) was detected in the Arabian Peninsula. Modeling can produce estimates of the expected annual number of symptomatic cases of MERS-CoV infection exported and the likelihood of exportation from source countries in the Middle East to countries outside the region. |
Risk of cancer from occupational exposure to ionising radiation: retrospective cohort study of workers in France, the United Kingdom, and the United States (INWORKS)
Richardson DB , Cardis E , Daniels RD , Gillies M , O'Hagan JA , Hamra GB , Haylock R , Laurier D , Leuraud K , Moissonnier M , Schubauer-Berigan MK , Thierry-Chef I , Kesminiene A . BMJ 2015 351 h5359 STUDY QUESTION: Is protracted exposure to low doses of ionising radiation associated with an increased risk of solid cancer? METHODS: In this cohort study, 308 297 workers in the nuclear industry from France, the United Kingdom, and the United States with detailed monitoring data for external exposure to ionising radiation were linked to death registries. Excess relative rate per Gy of radiation dose for mortality from cancer was estimated. Follow-up encompassed 8.2 million person years. Of 66 632 known deaths by the end of follow-up, 17 957 were due to solid cancers. STUDY ANSWER AND LIMITATIONS: Results suggest a linear increase in the rate of cancer with increasing radiation exposure. The average cumulative colon dose estimated among exposed workers was 20.9 mGy (median 4.1 mGy). The estimated rate of mortality from all cancers excluding leukaemia increased with cumulative dose by 48% per Gy (90% confidence interval 20% to 79%), lagged by 10 years. Similar associations were seen for mortality from all solid cancers (47% (18% to 79%)), and within each country. The estimated association over the dose range of 0-100 mGy was similar in magnitude to that obtained over the entire dose range but less precise. Smoking and occupational asbestos exposure are potential confounders; however, exclusion of deaths from lung cancer and pleural cancer did not affect the estimated association. Despite substantial efforts to characterise the performance of the radiation dosimeters used, the possibility of measurement error remains. WHAT THIS STUDY ADDS: The study provides a direct estimate of the association between protracted low dose exposure to ionising radiation and solid cancer mortality. 
Although high dose rate exposures are thought to be more dangerous than low dose rate exposures, the risk per unit of radiation dose for cancer among radiation workers was similar to estimates derived from studies of Japanese atomic bomb survivors. Quantifying the cancer risks associated with protracted radiation exposures can help strengthen the foundation for radiation protection standards. FUNDING, COMPETING INTERESTS, DATA SHARING: Support from the US Centers for Disease Control and Prevention; Ministry of Health, Labour and Welfare of Japan; Institut de Radioprotection et de Surete Nucleaire; AREVA; Electricite de France; US National Institute for Occupational Safety and Health; US Department of Energy; and Public Health England. Data are maintained and kept at the International Agency for Research on Cancer. |
Cohort profile: the International Nuclear Workers Study (INWORKS)
Hamra GB , Richardson DB , Cardis E , Daniels RD , Gillies M , O'Hagan JA , Haylock R , Laurier D , Leuraud K , Moissonnier M , Schubauer-Berigan M , Thierry-Chef I , Kesminiene A . Int J Epidemiol 2015 45 (3) 693-9 The effects of exposure to ionizing radiation have been studied for decades. The health effects of moderate to high exposure are well characterized, but the effects of low-level, chronic exposure remain a subject of continued debate.1 Moreover, repeated or protracted low-dose rate exposures to ionizing radiation have become increasingly common over the past quarter-century.1 The largest contributor to this trend has been medical radiation exposure.2,3 Since the 1980s, studies of nuclear industry workers have been conducted to provide direct information about these effects.2,3 These cohorts are well suited for this purpose: they include large numbers of workers, with individual (person-specific) monitoring of external doses and many years of follow-up. Estimates from early, cohort-specific studies were, however, compatible with a wide range of possibilities, from a reduction of risk at low doses to risks higher than those on which current radiation protection recommendations are based. | To further improve the precision of estimates of radiation-induced cancer risk following protracted low doses of ionizing radiation and to strengthen the scientific basis of radiation protection standards, an International Collaborative Study of Cancer Risk among Radiation Workers in the Nuclear Industry, the ‘15-Country Study’, was carried out using a common core protocol in 15 countries.3–5 Information was collected on nearly 600,000 workers and a thorough study of errors in recorded doses was carried out to evaluate the comparability of recorded dose estimates across facilities and time, and to identify and quantify sources of bias and uncertainties in dose estimates, which were taken into account in the statistical analyses of the results.6 |
Dose estimation for a study of nuclear workers in France, the United Kingdom and the United States of America: methods for the International Nuclear Workers Study (INWORKS)
Thierry-Chef I , Richardson DB , Daniels RD , Gillies M , Hamra GB , Haylock R , Kesminiene A , Laurier D , Leuraud K , Moissonnier M , O'Hagan J , Schubauer-Berigan MK , Cardis E . Radiat Res 2015 183 (6) 632-42 In the framework of the International Nuclear Workers Study conducted in France, the UK and the U.S. (INWORKS), updated and expanded methods were developed to convert recorded doses of ionizing radiation to estimates of organ doses or individual personal dose equivalent [Hp(10)] for a total of 308,297 workers, including 40,035 women. This approach accounts for differences in dosimeter response to the predominant workplace energy and geometry of exposure and incorporates the recently published ICRP dose coefficients for men and women separately. The overall mean annual individual personal dose equivalent, including zero doses, is 1.73 mSv [median = 0.42; interquartile range (IQR): 0.07, 1.59]. Associated individual organ doses were estimated. INWORKS includes workers who had potential for exposure to neutrons, so we analyzed neutron dosimetry data to identify workers potentially exposed to neutrons. We created a time-varying indicator for each worker, classifying them according to whether they had a positive recorded neutron dose and, if so, whether their neutron dose ever exceeded 10% of their total external penetrating radiation dose. The proportion of workers flagged as exposed to neutrons was 13% for the full cohort: 15% of the cohort in France, 12% in the UK and 14% in the U.S. We also used available information on in vivo and bioassay monitoring to identify workers with known depositions or suspected internal contamination. As a result of this work, information is now available that will allow various types of sensitivity analyses. |
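The neutron-exposure indicator described in this abstract amounts to a simple per-worker classification rule. The following is a minimal illustrative sketch of that rule, not the INWORKS code; the class and function names are hypothetical, and doses are assumed to be recorded per worker-year in millisieverts.

```python
from dataclasses import dataclass


@dataclass
class WorkerYear:
    neutron_dose_msv: float    # recorded neutron dose for the year
    total_external_msv: float  # total external penetrating radiation dose


def neutron_category(history: list[WorkerYear]) -> str:
    """Classify a worker's monitoring history by neutron exposure.

    'none': no positive recorded neutron dose in any year;
    'high': neutron dose exceeded 10% of total external dose in some year;
    'low':  positive neutron dose, but never above the 10% threshold.
    """
    if all(year.neutron_dose_msv <= 0 for year in history):
        return "none"
    if any(year.total_external_msv > 0
           and year.neutron_dose_msv / year.total_external_msv > 0.10
           for year in history):
        return "high"
    return "low"
```

Because the indicator is time varying, an analysis would evaluate it on the history accrued up to each year rather than once per worker; the sketch shows only the classification step.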
Ionising radiation and risk of death from leukaemia and lymphoma in radiation-monitored workers (INWORKS): an international cohort study
Leuraud K , Richardson DB , Cardis E , Daniels RD , Gillies M , O'Hagan JA , Hamra GB , Haylock R , Laurier D , Moissonnier M , Schubauer-Berigan MK , Thierry-Chef I , Kesminiene A . Lancet Haematol 2015 2 (7) e276-81 BACKGROUND: There is much uncertainty about the risks of leukaemia and lymphoma after repeated or protracted low-dose radiation exposure typical of occupational, environmental, and diagnostic medical settings. We quantified associations between protracted low-dose radiation exposures and leukaemia, lymphoma, and multiple myeloma mortality among radiation-monitored adults employed in France, the UK, and the USA. METHODS: We assembled a cohort of 308 297 radiation-monitored workers employed for at least 1 year by the Atomic Energy Commission, AREVA Nuclear Cycle, or the National Electricity Company in France, the Departments of Energy and Defence in the USA, and nuclear industry employers included in the National Registry for Radiation Workers in the UK. The cohort was followed up for a total of 8·22 million person-years. We ascertained deaths caused by leukaemia, lymphoma, and multiple myeloma. We used Poisson regression to quantify associations between estimated red bone marrow absorbed dose and leukaemia and lymphoma mortality. FINDINGS: Doses were accrued at very low rates (mean 1·1 mGy per year, SD 2·6). The excess relative risk of leukaemia mortality (excluding chronic lymphocytic leukaemia) was 2·96 per Gy (90% CI 1·17-5·21; lagged 2 years), most notably because of an association between radiation dose and mortality from chronic myeloid leukaemia (excess relative risk per Gy 10·45, 90% CI 4·48-19·65). INTERPRETATION: This study provides strong evidence of positive associations between protracted low-dose radiation exposure and leukaemia. 
FUNDING: Centers for Disease Control and Prevention, Ministry of Health, Labour and Welfare of Japan, Institut de Radioprotection et de Sûreté Nucléaire, AREVA, Electricité de France, National Institute for Occupational Safety and Health, US Department of Energy, US Department of Health and Human Services, University of North Carolina, Public Health England. |
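The excess relative risk (ERR) figures quoted in this abstract come from a linear relative-risk model of the form rate(d) = baseline × (1 + ERR_per_Gy × d). A minimal sketch of that relationship, using the paper's reported slope of 2.96 per Gy for leukaemia mortality (excluding chronic lymphocytic leukaemia); the function name is ours, and this computes only the relative risk, not the full Poisson regression fit:

```python
def leukaemia_relative_risk(dose_gy: float, err_per_gy: float = 2.96) -> float:
    """Relative risk under a linear ERR model.

    Default slope is the paper's central estimate of 2.96 per Gy
    (90% CI 1.17-5.21, 2-year lag) for leukaemia excluding CLL.
    """
    return 1.0 + err_per_gy * dose_gy
```

For example, a cumulative red bone marrow dose of 100 mGy (0.1 Gy) corresponds under this model to a relative risk of about 1.3 versus an unexposed worker.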
Estimating the United States demand for influenza antivirals and the effect on severe influenza disease during a potential pandemic
O'Hagan JJ , Wong KK , Campbell AP , Patel A , Swerdlow DL , Fry AM , Koonin LM , Meltzer MI . Clin Infect Dis 2015 60 Suppl 1 S30-41 Following the detection of a novel influenza strain A(H7N9), we modeled the use of antiviral treatment in the United States to mitigate severe disease across a range of hypothetical pandemic scenarios. Our outcomes were total demand for antiviral (neuraminidase inhibitor) treatment and the number of hospitalizations and deaths averted. The model included estimates of attack rate, healthcare-seeking behavior, prescription rates, adherence, disease severity, and the potential effect of antivirals on the risks of hospitalization and death. Based on these inputs, the total antiviral regimens estimated to be available in the United States (as of April 2013) were sufficient to meet treatment needs for the scenarios considered. However, distribution logistics were not examined and should be addressed in future work. Treatment was estimated to avert many severe outcomes (5200-248 000 deaths; 4800-504 000 hospitalizations); however, large numbers remained (25 000-425 000 deaths; 580 000-3 700 000 hospitalizations), suggesting that the impact of combinations of interventions should be examined. |
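The demand estimate in this abstract rests on a multiplicative chain of the listed inputs (attack rate, care seeking, prescribing, adherence, baseline severity, treatment effect). The following is an illustrative sketch of that arithmetic, not the paper's model; all parameter values in the usage line are hypothetical placeholders, not the paper's inputs.

```python
def antiviral_demand(population: float, attack_rate: float,
                     seek_care: float, prescribed: float) -> float:
    """Treatment regimens demanded: infected people who seek care
    and receive a prescription."""
    return population * attack_rate * seek_care * prescribed


def deaths_averted(treated: float, adherence: float,
                   baseline_death_risk: float, risk_reduction: float) -> float:
    """Deaths averted among treated patients who adhere to the regimen,
    given a baseline risk of death and a proportional risk reduction
    attributable to treatment."""
    return treated * adherence * baseline_death_risk * risk_reduction


# Hypothetical inputs: US population ~316 million, 20% attack rate,
# 55% seek care, 80% of those prescribed an antiviral.
regimens = antiviral_demand(316e6, 0.20, 0.55, 0.80)
```

Comparing `regimens` against stockpiled courses reproduces the style of sufficiency check the paper reports; the paper additionally varied severity and effectiveness across scenarios and, as noted, did not model distribution logistics.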
Infectious disease modeling methods as tools for informing response to novel influenza viruses of unknown pandemic potential
Gambhir M , Bozio C , O'Hagan JJ , Uzicanin A , Johnson LE , Biggerstaff M , Swerdlow DL . Clin Infect Dis 2015 60 Suppl 1 S11-9 The rising importance of infectious disease modeling makes this an appropriate time for a guide for public health practitioners tasked with preparing for, and responding to, an influenza pandemic. We list several questions that public health practitioners commonly ask about pandemic influenza and match these with analytical methods, giving details on when during a pandemic the methods can be used, how long it might take to implement them, and what data are required. Although software to perform these tasks is available, care needs to be taken to understand: (1) the type of data needed, (2) the implementation of the methods, and (3) the interpretation of results in terms of model uncertainty and sensitivity. Public health leaders can use this article to evaluate the modeling literature, determine which methods can provide appropriate evidence for decision-making, and to help them request modeling work from in-house teams or academic groups. |
Mobile messaging as surveillance tool during pandemic (H1N1) 2009, Mexico
Lajous M , Danon L , Lopez-Ridaura R , Astley CM , Miller JC , Dowell SF , O'Hagan JJ , Goldstein E , Lipsitch M . Emerg Infect Dis 2010 16 (9) 1488-9 To the Editor: Pandemic (H1N1) 2009 highlighted challenges faced by disease surveillance systems. New approaches to complement traditional surveillance are needed, and new technologies provide new opportunities. We evaluated cell phone technology for surveillance of influenza outbreaks during the outbreak of pandemic (H1N1) 2009 in Mexico. |
- Page last reviewed:Feb 1, 2024
- Page last updated:Apr 18, 2025