Last data update: Apr 18, 2025. (Total: 49119 publications since 2009)
Records 1-30 (of 41 Records) |
Query Trace: Kendall ME [original query] |
Long-term impact of 10-valent pneumococcal conjugate vaccine among children <5 years, Uganda, 2014-2021
Wanyana MW , Migisha R , King P , Bulage L , Kwesiga B , Kadobera D , Ario AR , Harris JR . PLOS Glob Public Health 2025 5 (1) e0002980 Pneumonia is the second leading cause of hospital admissions and deaths among children <5 years in Uganda. In 2014, Uganda officially introduced the pneumococcal conjugate vaccine (PCV) into the routine immunization schedule. However, little is known about the long-term impact of PCV on pneumonia admissions and deaths. In this study, we described the trends and spatial distribution of pneumonia hospital admissions and mortality among children <5 years in Uganda, 2014-2021. We analysed secondary data on pneumonia admissions and deaths from the District Health Information System version 2 during 2014-2021. The proportion of pneumonia cases admitted and case-fatality rates (CFRs) were calculated among children <5 years presenting at the outpatient department. At national, regional, and district levels, pneumonia mortality rates were calculated per 100,000 children <5 years. The Mann-Kendall test was used to assess trend significance. We found 667,122 pneumonia admissions and 11,692 (2%) deaths during 2014-2021. The overall proportion of pneumonia cases admitted among children <5 years was 22%. The overall CFR was 0.39%, and the overall pneumonia mortality rate among children <5 years was 19 deaths per 100,000. From 2014 to 2021, there were declines in the proportion of pneumonia cases admitted (31% to 15%; p = 0.051), mortality rates (24 to 14 per 100,000; p = 0.019), and CFR (0.57% to 0.24%; p = 0.019), concomitant with increasing PCV coverage. Kotido District had a persistently high proportion of pneumonia cases admitted (>30%) every year, while Kasese District had persistently high mortality rates (68-150 deaths per 100,000 children <5 years). Pneumonia admissions, mortality, and case fatality among children <5 years declined during 2014-2021 in Uganda after the introduction of PCV. However, with these trends, it is unlikely that Uganda will meet the 2025 GAPPD targets. There is a need to review the implementation of existing interventions and identify gaps to highlight priority actions that further accelerate declines. |
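Editor's note: the Mann-Kendall test named in the methods above is a nonparametric test for a monotonic trend in a short time series. The sketch below is a minimal implementation applied to an invented annual mortality series; the rates shown are illustrative placeholders, not the study's data.

```python
# Mann-Kendall trend test: a nonparametric check for a monotonic trend in a
# short annual series. The example values below are illustrative only, not
# the study's data.
from math import sqrt
from scipy.stats import norm

def mann_kendall(series):
    """Return the Mann-Kendall S statistic, Z score, and two-sided p-value."""
    n = len(series)
    s = sum(
        (series[j] > series[i]) - (series[j] < series[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18  # variance under H0, no tie correction
    if s > 0:
        z = (s - 1) / sqrt(var_s)
    elif s < 0:
        z = (s + 1) / sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - norm.cdf(abs(z)))
    return s, z, p

# Hypothetical pneumonia mortality rates per 100,000 children <5 years, 2014-2021
rates = [24, 23, 22, 20, 19, 17, 15, 14]
print(mann_kendall(rates))
```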
Statewide outbreak of Neisseria meningitidis serogroup Y, sequence type 1466 - Virginia, 2022-2024
Robinson M , Crain J , Kendall B , Alexander V , Diskin E , Saady D , Hicks C , Myrick-West A , Bordwine P , Sockwell D , Craig E , Rubis A , McNamara L , Sharma S , Howie R , Marasini D , Marjuki H , Colón A . MMWR Morb Mortal Wkly Rep 2024 73 (43) 973-977 Invasive meningococcal disease (IMD) is a severe illness that can have devastating effects; outbreaks are uncommon in the United States. Vaccination is the preferred control measure for IMD outbreaks when a defined population at risk (e.g., college students or persons experiencing homelessness) can be identified. In August 2022, the Virginia Department of Health (VDH) began investigating an IMD outbreak in Virginia's Eastern Health Planning Region, prompted by the detection of four confirmed cases within 8 weeks. Clinical isolates available from three cases were characterized as Neisseria meningitidis serogroup Y, sequence type 1466. A subsequent statewide investigation identified 36 genetically related cases, including seven deaths (case fatality rate = 19.4%) as of March 1, 2024. A majority of patients (63.9%) were in an age group (30-60 years) not generally considered at increased risk for IMD; 78.0% were non-Hispanic Black or African American. No common exposures, affiliations, or risk factors were identified, and a defined population could not be identified for vaccination. VDH recommended quadrivalent (serogroups A, C, W, and Y) meningococcal conjugate vaccination of a subset of close contacts of patients based on IMD risk factors and age range similar to that of patients with identified cases. IMD outbreaks might affect populations without established IMD risk factors. Lack of a well-defined population at risk might prompt exploration of novel control strategies, such as selective vaccination of close contacts. |
A historical survey of key epidemiological studies of ionizing radiation exposure
Little MP , Bazyka D , Gonzalez AB , Brenner AV , Chumak VV , Cullings H , Daniels RD , French B , Grant E , Hamada N , Hauptmann M , Kendall GM , Laurier D , Lee C , Lee WJ , Linet MS , Mabuchi K , Morton LM , Muirhead CR , Preston DL , Rajaraman P , Richardson DB , Sakata R , Samet JM , Simon SL , Sugiyama H , Wakeford R , Zablotska LB . Radiat Res 2024 In this article we review the history of key epidemiological studies of populations exposed to ionizing radiation. We highlight historical and recent findings regarding radiation-associated risks for incidence and mortality of cancer and non-cancer outcomes, with emphasis on study design and methods of exposure assessment and dose estimation, along with brief consideration of sources of bias for a few of the more important studies. We examine the findings from the epidemiological studies of the Japanese atomic bomb survivors, persons exposed to radiation for diagnostic or therapeutic purposes, those exposed to environmental sources including Chornobyl and other reactor accidents, and occupationally exposed cohorts. We also summarize results of pooled studies. These summaries are necessarily brief, but we provide references to more detailed information. We discuss possible future directions of study, to include assessment of susceptible populations, and possible new populations, data sources, study designs and methods of analysis. |
Coronavirus disease 2019 infections among emergency health care personnel: Impact on delivery of United States emergency medical care, 2020
Weber KD , Mower W , Krishnadasan A , Mohr NM , Montoy JC , Rodriguez RM , Giordano PA , Eyck PT , Harland KK , Wallace K , McDonald LC , Kutty PK , Hesse EM , Talan DA . Ann Emerg Med 2024 STUDY OBJECTIVE: In the early months of the coronavirus disease 2019 (COVID-19) pandemic and before vaccine availability, there were concerns that infected emergency department (ED) health care personnel could present a threat to the delivery of emergency medical care. We examined how the pandemic affected staffing levels and whether COVID-19 positive staff were potentially infectious at work in a cohort of US ED health care personnel in 2020. METHODS: The COVID-19 Evaluation of Risks in Emergency Departments (Project COVERED) project was a multicenter prospective cohort study of US ED health care personnel conducted from May to December 2020. During surveillance, health care personnel completed weekly electronic surveys and underwent periodic serology and nasal reverse transcription polymerase chain reaction testing for SARS-CoV-2, and investigators captured weekly data on health care facility COVID-19 prevalence and health care personnel staffing. Surveys asked about symptoms, potential exposures, work attendance, personal protective equipment use, and behaviors. RESULTS: We enrolled 1,673 health care personnel who completed 29,825 person weeks of surveillance. Eighty-nine (5.3%) health care personnel documented 90 (0.3%; 95% confidence interval [CI] 0.2% to 0.4%) person weeks of missed work related to documented COVID-19 infection or concerns for infection. Health care personnel experienced symptoms of COVID-19 during 1,256 (4.2%) person weeks and worked at least one shift while symptomatic during 1,042 (83.0%) of these periods. Seventy-five (4.5%) participants tested positive for SARS-CoV-2 during the surveillance period, including 43 (57.3%) who indicated they never experienced symptoms; 74 (98.7%; 95% CI 90.7% to 99.9%) infected health care personnel worked at least one shift during the initial period of infection, and 71 (94.7%) continued working until laboratory confirmation of their infection. Physician staffing was not associated with the facility or community COVID-19 levels within any time frame studied (Kendall taus of 0.02, 0.056, and 0.081 for no time shift, one-week time shift, and two-week time shift, respectively). CONCLUSIONS: During the first wave of the pandemic, COVID-19 infections in ED health care personnel were infrequent, and the time lost from the workforce was minimal. Health care personnel frequently reported for work while infected with SARS-CoV-2 before laboratory confirmation. The ED staffing levels were poorly correlated with facility and community COVID-19 burden. |
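Editor's note: the staffing analysis above reports Kendall's tau at no, one-week, and two-week time shifts. A hedged sketch of how such lagged rank correlations could be computed; the weekly series are invented, and only the lag structure mirrors the abstract.

```python
# Kendall's tau between weekly physician staffing and facility COVID-19 levels,
# computed at several time lags. The weekly series here are invented for
# illustration; they are not the Project COVERED data.
from scipy.stats import kendalltau

staffing = [100, 98, 97, 99, 101, 100, 96, 95, 97, 98, 99, 100]
covid_level = [5, 8, 12, 20, 18, 15, 10, 9, 11, 14, 13, 12]

for lag in (0, 1, 2):  # no shift, one-week shift, two-week shift
    # correlate staffing at week t with COVID-19 level at week t - lag
    x = staffing[lag:]
    y = covid_level[: len(covid_level) - lag]
    tau, p = kendalltau(x, y)
    print(f"lag={lag} weeks: tau={tau:.3f}, p={p:.3f}")
```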
Trends in tuberculosis clinicians' adoption of short-course regimens for latent tuberculosis infection
Feng PI , Horne DJ , Wortham JM , Katz DJ . J Clin Tuberc Other Mycobact Dis 2023 33 100382 OBJECTIVE: Little is known about regimen choice for latent tuberculosis infection in the United States. Since 2011, the Centers for Disease Control and Prevention has recommended shorter regimens-12 weeks of isoniazid and rifapentine or 4 months of rifampin-because they have similar efficacy, better tolerability, and higher treatment completion than 6-9 months of isoniazid. The objective of this analysis is to describe frequencies of latent tuberculosis infection regimens prescribed in the United States and assess changes over time. METHODS: Persons at high risk for latent tuberculosis infection or progression to tuberculosis disease were enrolled into an observational cohort study from September 2012-May 2017, tested for tuberculosis infection, and followed for 24 months. This analysis included those with at least one positive test who started treatment. RESULTS: Frequencies of latent tuberculosis infection regimens and 95% confidence intervals were calculated overall and by important risk groups. Changes in the frequencies of regimens by quarter were assessed using the Mann-Kendall statistic. Of 20,220 participants, 4,068 had at least one positive test and started treatment: 95% non-U.S.-born, 46% female, 12% <15 years old. Most received 4 months of rifampin (49%), 6-9 months of isoniazid (32%), or 12 weeks of isoniazid and rifapentine (13%). Selection of short-course regimens increased from 55% in 2013 to 81% in late 2016 (p < 0.001). CONCLUSIONS: Our study identified a trend towards adoption of shorter regimens. Future studies should assess the impact of updated treatment guidelines, which have added 3 months of daily isoniazid and rifampin to recommended regimens. |
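Editor's note: the regimen frequencies with 95% confidence intervals described above are binomial proportions. A minimal sketch assuming Wilson intervals; the counts are back-calculated approximately from the percentages quoted in the abstract, and the interval method is an assumption for illustration.

```python
# 95% confidence intervals for regimen frequencies as binomial proportions.
# Counts are approximate, back-calculated from the percentages in the abstract,
# and serve only to illustrate the calculation.
from statsmodels.stats.proportion import proportion_confint

n_treated = 4068
regimens = {
    "4 months rifampin": 1993,              # ~49%
    "6-9 months isoniazid": 1302,           # ~32%
    "12 weeks isoniazid-rifapentine": 529,  # ~13%
}

for name, count in regimens.items():
    low, high = proportion_confint(count, n_treated, alpha=0.05, method="wilson")
    print(f"{name}: {count / n_treated:.1%} (95% CI {low:.1%}-{high:.1%})")
```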
Trends in stigmatizing language about addiction: A longitudinal analysis of multiple public communication channels
McLaren N , Jones CM , Noonan R , Idaikkadar N , Sumner SA . Drug Alcohol Depend 2023 245 109807 INTRODUCTION: Stigma associated with substance use and addiction is a major barrier to overdose prevention. Although stigma reduction is a key goal of federal strategies to prevent overdose, there are limited data to assess progress in reducing the use of stigmatizing language about addiction. METHODS: Using language guidelines published by the federal National Institute on Drug Abuse (NIDA), we examined trends in use of stigmatizing terms about addiction across four popular public communication modalities: news articles, blogs, Twitter, and Reddit. We calculated percent changes in the rates of articles/posts using stigmatizing terms over a five-year period (2017-2021) by fitting a linear trendline and assessed the statistical significance of trends using the Mann-Kendall test. RESULTS: The rate of articles containing stigmatizing language decreased over the past five years for news articles (-68.2%, p < 0.001) and blogs (-33.6%, p < 0.001). Among social media platforms, the rate of posts using stigmatizing language increased (Twitter [43.5%, p = 0.01]) or remained stable (Reddit [3.1%, p = 0.29]). In absolute terms, news articles had the highest rate of articles containing stigmatizing terms over the five-year period (324.9 articles per million) compared to 132.3, 18.3, and 138.6 posts per million for blogs, Twitter, and Reddit, respectively. CONCLUSIONS: Use of stigmatizing language about addiction appears to have decreased across more traditional, longer-format communication modalities such as news articles. Additional work is needed to reduce use of stigmatizing language on social media. |
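Editor's note: a rough sketch of estimating a percent change by fitting a linear trendline over 2017-2021, as the methods describe. The yearly rates are placeholders, and whether the authors computed the change exactly this way is an assumption.

```python
# Percent change in the rate of posts using stigmatizing terms, estimated by
# fitting a linear trendline over 2017-2021. The yearly rates are placeholders.
import numpy as np

years = np.array([2017, 2018, 2019, 2020, 2021])
rate = np.array([420.0, 380.0, 330.0, 250.0, 140.0])  # articles per million (made up)

slope, intercept = np.polyfit(years, rate, 1)
fitted_start = slope * years[0] + intercept
fitted_end = slope * years[-1] + intercept
pct_change = (fitted_end - fitted_start) / fitted_start * 100
print(f"fitted percent change 2017-2021: {pct_change:.1f}%")
```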
Scale-up of HIV index testing in an urban population: experiences and achievements from Nairobi County, Kenya
Joel JN , Awuor P , Blanco N , Lavoie MC , Lascko T , Ngunu C , Mwangi J , Mutisya I , Ng'eno C , Wangusi R , Koech E . Trop Med Int Health 2022 28 (2) 116-125 OBJECTIVE: To describe the implementation strategies of the index testing program across Nairobi County in Kenya, assess outcomes along the HIV index testing cascade (acceptance, elicitation ratio, HIV positivity, and linkage to treatment), and assess annual changes along the HIV index testing cascade during the first two years of implementation. METHODS: Retrospective analysis of programmatic aggregate data collected from October 2017 to September 2019 after the roll-out of index testing services in 48 health facilities in Nairobi County. Proportions and ratios were calculated for acceptance, elicitation ratio, testing uptake, and HIV positivity. We compared these outcomes between years using a chi-squared test, Fisher's exact test, or Wilcoxon sign test, and we assessed trends using the Mann-Kendall test. RESULTS: Testing among eligible partners increased from 42.4% (1,471/3,470) to 74.9% (6,114/8,159) in the general population, and the positivity yield remained high across both years (25.2% in year 1 and 24.1% in year 2). Index testing positivity yield remained significantly higher than other testing modalities (24.3% versus 1.3%, p<0.001). The contribution of index testing services to the total number of HIV-positive individuals identified increased from 7.5% in the first year to 28.6% in the second year (p<0.001). More men were tested, but the positivity yield was higher among women (30.0%) and those aged 50 years or older (32.4%). Testing eligible partners in key populations decreased from 52.4% (183/349) to 40.7% (109/268) (p=0.674); however, the HIV positivity yield increased from 8.6% to 23.9% (p<0.001) by the second year of implementation. The HIV positivity yield from index testing remained higher than other testing modalities (14% vs. 0.9%, p<0.001) for key populations. CONCLUSION: Index testing was well-accepted and effective in identifying individuals living with HIV in a Kenyan urban setting across both general populations and key populations. Ongoing adaptations to the strategies deployed as part of index testing services helped improve most of the outcomes along the index testing cascade. |
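Editor's note: a sketch of a year-to-year comparison of partner testing uptake as a chi-squared test on a 2x2 table, using the general-population counts quoted above. Treating this particular comparison with a chi-squared test (rather than one of the other tests named) is an assumption for illustration.

```python
# Chi-squared comparison of partner testing uptake between the two
# implementation years, using the general-population counts from the abstract.
from scipy.stats import chi2_contingency

#                 tested, not tested (among eligible partners)
year1 = [1471, 3470 - 1471]
year2 = [6114, 8159 - 6114]

chi2, p, dof, expected = chi2_contingency([year1, year2])
print(f"chi2={chi2:.1f}, dof={dof}, p={p:.3g}")
```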
Pharmacist-driven transitions of care practice model for prescribing oral antimicrobials at hospital discharge
Mercuro NJ , Medler CJ , Kenney RM , MacDonald NC , Neuhauser MM , Hicks LA , Srinivasan A , Divine G , Beaulac A , Eriksson E , Kendall R , Martinez M , Weinmann A , Zervos M , Davis SL . JAMA Netw Open 2022 5 (5) e2211331 IMPORTANCE: Although prescribers face numerous patient-centered challenges during transitions of care (TOC) at hospital discharge, prolonged duration of antimicrobial therapy for common infections remains problematic, and resources are needed for antimicrobial stewardship throughout this period. OBJECTIVE: To evaluate a pharmacist-driven intervention designed to improve selection and duration of oral antimicrobial therapy prescribed at hospital discharge for common infections. DESIGN, SETTING, AND PARTICIPANTS: This quality improvement study used a nonrandomized stepped-wedge design with 3 study phases from September 1, 2018, to August 31, 2019. Seventeen distinct medicine, surgery, and specialty units from a health system in Southeast Michigan participated, including 1 academic tertiary hospital and 4 community hospitals. Hospitalized adults who had urinary, respiratory, skin and/or soft tissue, and intra-abdominal infections and were prescribed antimicrobials at discharge were included in the analysis. Data were analyzed from February 18, 2020, to February 28, 2022. INTERVENTIONS: Clinical pharmacists engaged in a new standard of care for antimicrobial stewardship practices during TOC by identifying patients to be discharged with a prescription for oral antimicrobials and collaborating with primary teams to prescribe optimal therapy. Academic and community hospitals used both antimicrobial stewardship and clinical pharmacists in a multidisciplinary rounding model to discuss, document, and facilitate order entry of the antimicrobial prescription at discharge. MAIN OUTCOMES AND MEASURES: The primary end point was frequency of optimized antimicrobial prescription at discharge. Health system guidelines developed from national guidelines and best practices for short-course therapies were used to evaluate optimal therapy. RESULTS: A total of 800 patients prescribed oral antimicrobials at hospital discharge were included in the analysis (441 women [55.1%]; mean [SD] age, 66.8 [17.3] years): 400 in the preintervention period and 400 in the postintervention period. The most common diagnoses were pneumonia (264 [33.0%]), upper respiratory tract infection and/or acute exacerbation of chronic obstructive pulmonary disease (214 [26.8%]), and urinary tract infection (203 [25.4%]). Patients in the postintervention group were more likely to have an optimal antimicrobial prescription (time-adjusted generalized estimating equation odds ratio, 5.63 [95% CI, 3.69-8.60]). The absolute increase in optimal prescribing in the postintervention group was consistent in both academic (37.4% [95% CI, 27.5%-46.7%]) and community (43.2% [95% CI, 32.4%-52.8%]) TOC models. There were no differences in clinical resolution or mortality. Fewer severe antimicrobial-related adverse effects (time-adjusted generalized estimating equation odds ratio, 0.40 [95% CI, 0.18-0.88]) were identified in the postintervention (13 [3.2%]) compared with the preintervention (36 [9.0%]) groups. CONCLUSIONS AND RELEVANCE: The findings of this quality improvement study suggest that targeted antimicrobial stewardship interventions during TOC were associated with increased optimal, guideline-concordant antimicrobial prescriptions at discharge. |
Trends of notification rates and treatment outcomes of tuberculosis cases with and without HIV co-infection in eight rural districts of Uganda (2015-2019)
Baluku JB , Nanyonjo R , Ayo J , Obwalatum JE , Nakaweesi J , Senyimba C , Lukoye D , Lubwama J , Ward J , Mukasa B . BMC Public Health 2022 22 (1) 651 BACKGROUND: The End TB Strategy aims to reduce new tuberculosis (TB) cases by 90% and TB-related deaths by 95% between 2015 and 2035. We determined the trend of case notification rates (CNRs) and treatment outcomes of TB cases with and without HIV co-infection in rural Uganda to provide an interim evaluation of progress towards this global target in rural settings. METHODS: We extracted retrospective programmatic data on notified TB cases and treatment outcomes from 2015-2019 for eight districts in rural Uganda from the District Health Information System 2. We estimated CNRs as the number of TB cases per 100,000 population. Treatment success rate (TSR) was calculated as the sum of TB cure and treatment completion for each year. Trends were estimated using the Mann-Kendall test. RESULTS: A total of 11,804 TB cases, of which 5,811 (49.2%) were HIV co-infected, were notified. The overall TB CNR increased 3.7-fold, from 37.7 to 141.3 cases per 100,000 population between 2015 and 2019. The increase was observed among people with HIV (from 204.7 to 730.2 per 100,000, p = 0.028) and HIV-uninfected individuals (from 19.9 to 78.7 per 100,000, p = 0.028). There was a decline in the TSR among HIV-negative TB cases from 82.1% in 2015 to 63.9% in 2019 (p = 0.086). Conversely, there was an increase in the TSR among HIV co-infected TB cases (from 69.9% to 81.9%, p = 0.807). CONCLUSION: The CNR increased among people with and without HIV while the TSR declined among HIV-negative TB cases. There is a need to refocus programs to address barriers to treatment success among HIV-negative TB cases. |
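Editor's note: the case notification rate and treatment success rate defined in the methods above reduce to simple ratios; a minimal sketch with made-up inputs chosen to echo the figures in the abstract.

```python
# Case notification rate (CNR) per 100,000 population and treatment success
# rate (TSR) as defined in the methods. Input totals are made up for illustration.
def case_notification_rate(notified_cases, population):
    return notified_cases / population * 100_000

def treatment_success_rate(cured, completed, evaluated):
    # TSR = (cured + treatment completed) / cases with a treatment outcome
    return (cured + completed) / evaluated * 100

print(case_notification_rate(notified_cases=1_413, population=1_000_000))  # 141.3
print(treatment_success_rate(cured=520, completed=299, evaluated=1_000))   # 81.9
```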
Social determinants of health and race disparities in kidney transplant
Wesselman H , Ford CG , Leyva Y , Li X , Chang CH , Dew MA , Kendall K , Croswell E , Pleis JR , Ng YH , Unruh ML , Shapiro R , Myaskovsky L . Clin J Am Soc Nephrol 2021 16 (2) 262-274 BACKGROUND AND OBJECTIVES: Black patients have a higher incidence of kidney failure but lower rate of deceased- and living-donor kidney transplantation compared with White patients, even after taking differences in comorbidities into account. We assessed whether social determinants of health (e.g., demographics, cultural, psychosocial, knowledge factors) could account for race differences in receiving deceased- and living-donor kidney transplantation. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS: Via medical record review, we prospectively followed 1056 patients referred for kidney transplant (2010-2012), who completed an interview soon after kidney transplant evaluation, until their kidney transplant. We used multivariable competing risk models to estimate the cumulative incidence of receipt of any kidney transplant, deceased-donor transplant, or living-donor transplant, and the factors associated with each outcome. RESULTS: Even after accounting for social determinants of health, Black patients had a lower likelihood of kidney transplant (subdistribution hazard ratio, 0.74; 95% confidence interval, 0.55 to 0.99) and living-donor transplant (subdistribution hazard ratio, 0.49; 95% confidence interval, 0.26 to 0.95), but not deceased-donor transplant (subdistribution hazard ratio, 0.92; 95% confidence interval, 0.67 to 1.26). Black race, older age, lower income, public insurance, more comorbidities, being transplanted before changes to the Kidney Allocation System, greater religiosity, less social support, less transplant knowledge, and fewer learning activities were each associated with a lower probability of any kidney transplant. Older age, more comorbidities, being transplanted before changes to the Kidney Allocation System, greater religiosity, less social support, and fewer learning activities were each associated with a lower probability of deceased-donor transplant. Black race, older age, lower income, public insurance, higher body mass index, dialysis before kidney transplant, not presenting with a potential living donor, religious objection to living-donor transplant, and less transplant knowledge were each associated with a lower probability of living-donor transplant. CONCLUSIONS: Race and social determinants of health are associated with the likelihood of undergoing kidney transplant. |
Epidemiological studies of low-dose ionizing radiation and cancer: Summary bias assessment and meta-analysis
Hauptmann M , Daniels RD , Cardis E , Cullings HM , Kendall G , Laurier D , Linet MS , Little MP , Lubin JH , Preston DL , Richardson DB , Stram DO , Thierry-Chef I , Schubauer-Berigan MK , Gilbert ES , Berrington de Gonzalez A . J Natl Cancer Inst Monogr 2020 2020 (56) 188-200 BACKGROUND: Ionizing radiation is an established carcinogen, but risks from low-dose exposures are controversial. Since the Biological Effects of Ionizing Radiation VII review of the epidemiological data in 2006, many subsequent publications have reported excess cancer risks from low-dose exposures. Our aim was to systematically review these studies to assess the magnitude of the risk and whether the positive findings could be explained by biases. METHODS: Eligible studies had mean cumulative doses of less than 100 mGy, individualized dose estimates, risk estimates, and confidence intervals (CI) for the dose-response and were published in 2006-2017. We summarized the evidence for bias (dose error, confounding, outcome ascertainment) and its likely direction for each study. We tested whether the median excess relative risk (ERR) per unit dose equals zero and assessed the impact of excluding positive studies with potential bias away from the null. We performed a meta-analysis to quantify the ERR and assess consistency across studies for all solid cancers and leukemia. RESULTS: Of the 26 eligible studies, 8 concerned environmental, 4 medical, and 14 occupational exposure. For solid cancers, 16 of 22 studies reported positive ERRs per unit dose, and we rejected the hypothesis that the median ERR equals zero (P = .03). After exclusion of 4 positive studies with potential positive bias, 12 of 18 studies reported positive ERRs per unit dose (P = .12). For leukemia, 17 of 20 studies were positive, and we rejected the hypothesis that the median ERR per unit dose equals zero (P = .001), also after exclusion of 5 positive studies with potential positive bias (P = .02). For adulthood exposure, the meta-ERR at 100 mGy was 0.029 (95% CI = 0.011 to 0.047) for solid cancers and 0.16 (95% CI = 0.07 to 0.25) for leukemia. For childhood exposure, the meta-ERR at 100 mGy for leukemia was 2.84 (95% CI = 0.37 to 5.32); there were only two eligible studies of all solid cancers. CONCLUSIONS: Our systematic assessments in this monograph showed that these new epidemiological studies are characterized by several limitations, but only a few positive studies were potentially biased away from the null. After exclusion of these studies, the majority of studies still reported positive risk estimates. We therefore conclude that these new epidemiological studies directly support excess cancer risks from low-dose ionizing radiation. Furthermore, the magnitude of the cancer risks from these low-dose radiation exposures was statistically compatible with the radiation dose-related cancer risks of the atomic bomb survivors. |
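Editor's note: one way a meta-ERR like those above could be assembled is fixed-effect, inverse-variance pooling with standard errors recovered from the reported 95% CIs. The study inputs below are invented, and the pooling model is an assumption for illustration, not the monograph's exact method.

```python
# Fixed-effect (inverse-variance) pooling of excess relative risks (ERR) at
# 100 mGy. Study estimates and confidence intervals below are invented and do
# not reproduce the monograph's data.
from math import sqrt

studies = [  # (ERR at 100 mGy, 95% CI lower, 95% CI upper)
    (0.02, -0.01, 0.05),
    (0.05, 0.01, 0.09),
    (0.01, -0.03, 0.05),
]

weights, weighted_errs = [], []
for err, lo, hi in studies:
    se = (hi - lo) / (2 * 1.96)        # SE recovered from a symmetric 95% CI
    w = 1 / se ** 2
    weights.append(w)
    weighted_errs.append(w * err)

pooled = sum(weighted_errs) / sum(weights)
pooled_se = sqrt(1 / sum(weights))
print(f"meta-ERR at 100 mGy: {pooled:.3f} "
      f"(95% CI {pooled - 1.96 * pooled_se:.3f} to {pooled + 1.96 * pooled_se:.3f})")
```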
Rearing Aedes aegypti mosquitoes in a laboratory setting
Masters SW , Knapek KJ , Kendall LV . Lab Animal Sci Prof 2020 55 (6) 42-45 Many species of insects have been used in a plethora of scientific studies.3 Laboratory-reared mosquitoes are particularly integral to arthropod-borne disease experiments, especially with the recent emergence in North America of Zika virus in the human population. Aedes aegypti (Figure 1) is the preferred model for researchers because of their capacity to spread viruses such as Zika, dengue, chikungunya, and yellow fever4 as well as their relatively short rearing times in a laboratory insectary and the ability to store their eggs for up to 6 mo. |
Epidemiological studies of low-dose ionizing radiation and cancer: Rationale and framework for the monograph and overview of eligible studies
Berrington de Gonzalez A , Daniels RD , Cardis E , Cullings HM , Gilbert E , Hauptmann M , Kendall G , Laurier D , Linet MS , Little MP , Lubin JH , Preston DL , Richardson DB , Stram D , Thierry-Chef I , Schubauer-Berigan MK . J Natl Cancer Inst Monogr 2020 2020 (56) 97-113 Whether low-dose ionizing radiation can cause cancer is a critical and long-debated question in radiation protection. Since the Biological Effects of Ionizing Radiation report by the National Academies in 2006, new publications from large, well-powered epidemiological studies of low doses have reported positive dose-response relationships. It has been suggested, however, that biases could explain these findings. We conducted a systematic review of epidemiological studies with mean doses less than 100 mGy published 2006-2017. We required individualized doses and dose-response estimates with confidence intervals. We identified 26 eligible studies (eight environmental, four medical, and 14 occupational), including 91 000 solid cancers and 13 000 leukemias. Mean doses ranged from 0.1 to 82 mGy. The excess relative risk at 100 mGy was positive for 16 of 22 solid cancer studies and 17 of 20 leukemia studies. The aim of this monograph was to systematically review the potential biases in these studies (including dose uncertainty, confounding, and outcome misclassification) and to assess whether the subset of minimally biased studies provides evidence for cancer risks from low-dose radiation. Here, we describe the framework for the systematic bias review and provide an overview of the eligible studies. |
Strengths and weaknesses of dosimetry used in studies of low-dose radiation exposure and cancer
Daniels RD , Kendall GM , Thierry-Chef I , Linet MS , Cullings HM . J Natl Cancer Inst Monogr 2020 2020 (56) 114-132 BACKGROUND: A monograph systematically evaluating recent evidence on the dose-response relationship between low-dose ionizing radiation exposure and cancer risk required a critical appraisal of dosimetry methods in 26 potentially informative studies. METHODS: The relevant literature included studies published in 2006-2017. Studies comprised case-control and cohort designs examining populations predominantly exposed to sparsely ionizing radiation, mostly from external sources, resulting in average doses of no more than 100 mGy. At least two dosimetrists reviewed each study and appraised the strengths and weaknesses of the dosimetry systems used, including assessment of sources and effects of dose estimation error. An overarching concern was whether dose error might cause the spurious appearance of a dose-response where none was present. RESULTS: The review included 8 environmental, 4 medical, and 14 occupational studies that varied in properties relative to evaluation criteria. Treatment of dose estimation error also varied among studies, although few conducted a comprehensive evaluation. Six studies appeared to have known or suspected biases in dose estimates. The potential for these biases to cause a spurious dose-response association was constrained to three case-control studies that relied extensively on information gathered in interviews conducted after case ascertainment. CONCLUSIONS: The potential for spurious dose-response associations from dose information appeared limited to case-control studies vulnerable to recall errors that may be differential by case status. Otherwise, risk estimates appeared reasonably free of a substantial bias from dose estimation error. Future studies would benefit from a comprehensive evaluation of dose estimation errors, including methods accounting for their potential effects on dose-response associations. |
Does racial disparity in kidney transplant waitlisting persist after accounting for social determinants of health
Ng YH , Pankratz VS , Leyva Y , Ford CG , Pleis JR , Kendall K , Croswell E , Dew MA , Shapiro R , Switzer GE , Unruh ML , Myaskovsky L . Transplantation 2020 104 (7) 1445-1455 BACKGROUND: African Americans (AA) have lower rates of kidney transplantation (KT) compared with Whites (WH), even after adjusting for demographic and medical factors. In this study, we examined whether the racial disparity in KT waitlisting persists after adjusting for social determinants of health (eg, cultural, psychosocial, and knowledge). METHODS: We prospectively followed a cohort of 1055 patients who were evaluated for KT between March 2010 and October 2012 and followed them through August 2018. Participants completed a semistructured telephone interview shortly after their first KT evaluation appointment. We used the Wilcoxon rank-sum and Pearson chi-square tests to examine race differences in the baseline characteristics. We then assessed racial differences in the probability of waitlisting while accounting for all predictors using cumulative incidence curves and Fine and Gray proportional subdistribution hazards models. RESULTS: There were significant differences in the baseline characteristics between non-Hispanic AA and non-Hispanic WH. AA were 25% less likely (95% confidence interval, 0.60-0.96) to be waitlisted than WH even after adjusting for medical factors and social determinants of health. In addition, being older, having lower income, public insurance, more comorbidities, and being on dialysis decreased the probability of waitlisting while having more social support and transplant knowledge increased the probability of waitlisting. CONCLUSIONS: Racial disparity in kidney transplant waitlisting persisted even after adjusting for medical factors and social determinants of health, suggesting the need to identify novel factors that impact racial disparity in transplant waitlisting. Developing interventions targeting cultural and psychosocial factors may enhance equity in access to transplantation. |
Unexpected race and ethnicity differences in the US National Veterans Affairs Kidney Transplant Program
Myaskovsky L , Kendall K , Li X , Chang CH , Pleis JR , Croswell E , Ford CG , Switzer GE , Langone A , Mittal-Henkle A , Saha S , Thomas CP , Adams Flohr J , Ramkumar M , Dew MA . Transplantation 2019 103 (12) 2701-2714 BACKGROUND: Racial/ethnic minorities have lower rates of deceased donor kidney transplantation (DDKT) and living donor kidney transplantation (LDKT) in the United States. We examined whether social determinants of health (eg, demographics, cultural, psychosocial, knowledge factors) could account for differences in the Veterans Affairs (VA) Kidney Transplantation (KT) Program. METHODS: We conducted a multicenter longitudinal cohort study of 611 Veterans undergoing evaluation for KT at all National VA KT Centers (2010-2012) using an interview after KT evaluation and tracking participants via medical records through 2017. RESULTS: Hispanics were more likely to get any KT (subdistribution hazard ratios [SHR] [95% confidence interval (CI)]: 1.8 [1.2-2.8]) or DDKT (SHR [95% CI]: 2.0 [1.3-3.2]) than non-Hispanic white in univariable analysis. Social determinants of health, including marital status (SHR [95% CI]: 0.6 [0.4-0.9]), religious objection to LDKT (SHR [95% CI]: 0.6 [0.4-1.0]), and donor preference (SHR [95% CI]: 2.5 [1.2-5.1]), accounted for some racial differences, and changes to Kidney Allocation System policy (SHR [95% CI]: 0.3 [0.2-0.5]) mitigated race differences in DDKT in multivariable analysis. For LDKT, non-Hispanic African American Veterans were less likely to receive an LDKT than non-Hispanic white (SHR [95% CI]: 0.2 [0.0-0.7]), but accounting for age (SHR [95% CI]: 1.0 [0.9-1.0]), insurance (SHR [95% CI]: 5.9 [1.1-33.7]), presenting with a living donor (SHR [95% CI]: 4.1 [1.4-12.3]), dialysis duration (SHR [95% CI]: 0.3 [0.2-0.6]), network of potential donors (SHR [95% CI]: 1.0 [1.0-1.1]), self-esteem (SHR [95% CI]: 0.4 [0.2-0.8]), transplant knowledge (SHR [95% CI]: 1.3 [1.0-1.7]), and changes to Kidney Allocation System policy (SHR [95% CI]: 10.3 [2.5-42.1]) in multivariable analysis eliminated those disparities. CONCLUSIONS: The VA KT Program does not exhibit the same pattern of disparities in KT receipt as non-VA centers. Transplant centers can use identified risk factors to target patients who may need more support to ensure they receive a transplant. |
Pan-viral serology implicates enteroviruses in acute flaccid myelitis.
Schubert RD , Hawes IA , Ramachandran PS , Ramesh A , Crawford ED , Pak JE , Wu W , Cheung CK , O'Donovan BD , Tato CM , Lyden A , Tan M , Sit R , Sowa GA , Sample HA , Zorn KC , Banerji D , Khan LM , Bove R , Hauser SL , Gelfand AA , Johnson-Kerner BL , Nash K , Krishnamoorthy KS , Chitnis T , Ding JZ , McMillan HJ , Chiu CY , Briggs B , Glaser CA , Yen C , Chu V , Wadford DA , Dominguez SR , Ng TFF , Marine RL , Lopez AS , Nix WA , Soldatos A , Gorman MP , Benson L , Messacar K , Konopka-Anstadt JL , Oberste MS , DeRisi JL , Wilson MR . Nat Med 2019 25 (11) 1748-1752 Since 2012, the United States of America has experienced a biennial spike in pediatric acute flaccid myelitis (AFM)(1-6). Epidemiologic evidence suggests non-polio enteroviruses (EVs) are a potential etiology, yet EV RNA is rarely detected in cerebrospinal fluid (CSF)(2). CSF from children with AFM (n = 42) and other pediatric neurologic disease controls (n = 58) were investigated for intrathecal antiviral antibodies, using a phage display library expressing 481,966 overlapping peptides derived from all known vertebrate and arboviruses (VirScan). Metagenomic next-generation sequencing (mNGS) of AFM CSF RNA (n = 20 cases) was also performed, both unbiased sequencing and with targeted enrichment for EVs. Using VirScan, the viral family significantly enriched by the CSF of AFM cases relative to controls was Picornaviridae, with the most enriched Picornaviridae peptides belonging to the genus Enterovirus (n = 29/42 cases versus 4/58 controls). EV VP1 ELISA confirmed this finding (n = 22/26 cases versus 7/50 controls). mNGS did not detect additional EV RNA. Despite rare detection of EV RNA, pan-viral serology frequently identified high levels of CSF EV-specific antibodies in AFM compared with controls, providing further evidence for a causal role of non-polio EVs in AFM. |
Spatio-temporal coherence of dengue, chikungunya and Zika outbreaks in Merida, Mexico
Bisanzio D , Dzul-Manzanilla F , Gomez-Dantes H , Pavia-Ruz N , Hladish TJ , Lenhart A , Palacio-Vargas J , Gonzalez Roldan JF , Correa-Morales F , Sanchez-Tejeda G , Kuri Morales P , Manrique-Saide P , Longini IM , Halloran ME , Vazquez-Prokopec GM . PLoS Negl Trop Dis 2018 12 (3) e0006298 Response to Zika virus (ZIKV) invasion in Brazil lagged a year from its estimated February 2014 introduction, and was triggered by the occurrence of severe congenital malformations. Dengue (DENV) and chikungunya (CHIKV) invasions tend to show similar response lags. We analyzed geo-coded symptomatic case reports from the city of Merida, Mexico, with the goal of assessing the utility of historical DENV data to infer CHIKV and ZIKV introduction and propagation. About 42% of the 40,028 DENV cases reported during 2008-2015 clustered in 27% of the city, and these clustering areas were where the first CHIKV and ZIKV cases were reported in 2015 and 2016, respectively. Furthermore, the three viruses had significant agreement in their spatio-temporal distribution (Kendall W>0.63; p<0.01). Longitudinal DENV data generated patterns indicative of the resulting introduction and transmission patterns of CHIKV and ZIKV, leading to important insights for the surveillance and targeted control of emerging Aedes-borne viruses. |
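Editor's note: Kendall's W quantifies agreement among the three viruses' rankings of city areas; a small sketch of the coefficient with invented case counts (tie correction omitted).

```python
# Kendall's coefficient of concordance (W) across the three viruses' rankings
# of city areas by case counts. Case counts are invented; tie correction omitted.
import numpy as np
from scipy.stats import rankdata

# rows = viruses (DENV, CHIKV, ZIKV), columns = city areas
cases = np.array([
    [120, 40, 15, 300, 80],
    [60, 25, 10, 150, 45],
    [30, 12, 8, 90, 35],
])

ranks = np.apply_along_axis(rankdata, 1, cases)  # rank areas within each virus
m, n = ranks.shape
rank_sums = ranks.sum(axis=0)
s = ((rank_sums - rank_sums.mean()) ** 2).sum()
w = 12 * s / (m ** 2 * (n ** 3 - n))
print(f"Kendall's W = {w:.2f}")
```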
High-density microprojection array delivery to rat skin of low doses of trivalent inactivated poliovirus vaccine elicits potent neutralising antibody responses
Muller DA , Fernando GJP , Owens NS , Agyei-Yeboah C , Wei JCJ , Depelsenaire ACI , Forster A , Fahey P , Weldon WC , Oberste MS , Young PR , Kendall MAF . Sci Rep 2017 7 (1) 12644 To secure a polio-free world, the live attenuated oral poliovirus vaccine (OPV) will eventually need to be replaced with inactivated poliovirus vaccines (IPV). However, current IPV delivery is less suitable for campaign use than OPV, and more expensive. We are progressing a microarray patch delivery platform, the Nanopatch, as an easy-to-use device to administer vaccines, including IPV. The Nanopatch contains an ultra-high density array (10,000/cm2) of short (~230 μm) microprojections that delivers dry coated vaccine into the skin. Here, we compare the relative immunogenicity of Nanopatch immunisation versus intramuscular injection in rats, using monovalent and trivalent formulations of IPV. Nanopatch delivery elicits faster antibody response kinetics, with high titres of neutralising antibody after just one (IPV2) or two (IPV1 and IPV3) immunisations, while IM injection requires two (IPV2) or three (IPV1 and IPV3) immunisations to induce similar responses. Seroconversion to each poliovirus type was seen in 100% of rats that received ~1/40th of a human dose of IPV delivered by Nanopatch, but not in rats given ~1/8th or ~1/40th dose by IM injection. Ease of administration coupled with dose reduction observed in this study suggests the Nanopatch could facilitate inexpensive IPV vaccination in campaign settings. |
Impact of Larger Sputum Volume on Xpert® MTB/RIF Assay Detection of Mycobacterium tuberculosis in Smear-Negative Individuals with Suspected Tuberculosis.
Badal-Faesen S , Firnhaber C , Kendall MA , Wu X , Grinsztejn B , Escada Rods , Fernandez M , Hogg E , Sanne I , Johnson P , Alland D , Mazurek GH , Benator DA , Luetkemeyer AF . J Clin Med 2017 6 (8) As a strategy to improve the sensitivity of nucleic acid-based testing in acid-fast bacilli (AFB) negative samples, larger volumes of sputum (5-10 mL) were tested with Xpert® MTB/RIF from 176 individuals with smear-negative sputum undergoing tuberculosis evaluation. Despite larger volumes, this strategy had a suboptimal sensitivity of 50% (4/8). |
Development of a roof bolter canopy air curtain for respirable dust control
Reed WR , Joy GJ , Kendall B , Bailey A , Zheng Y . Min Eng 2017 69 (1) 33-39 Testing of the roof bolter canopy air curtain (CAC) designed by the U.S. National Institute for Occupational Safety and Health (NIOSH) has gone through many iterations, demonstrating successful dust control performance under controlled laboratory conditions. J.H. Fletcher & Co., an original equipment manufacturer of mining equipment, further developed the concept by incorporating it into the design of its roof bolting machines. In the present work, laboratory testing was conducted, showing dust control efficiencies ranging from 17.2 to 24.5 percent. Subsequent computational fluid dynamics (CFD) analysis revealed limitations in the design, and a potential improvement was analyzed and recommended. As a result, a new CAC design is being developed, incorporating the results of the testing and CFD analysis. |
Examination of a newly developed mobile dry scrubber (DS) for coal mine dust control applications
Organiscak J , Noll J , Yantek D , Kendall B . Trans Soc Min Metall Explor Inc 2016 340 38-47 The Office of Mine Safety and Health Research of the U.S. National Institute for Occupational Safety and Health (NIOSH OMSHR) conducted laboratory testing of a self-tramming, remotely controlled mobile Dry Scrubber (DS) that J.H. Fletcher and Co. developed under a contract with NIOSH OMSHR to reduce the exposure of miners to airborne dust. The scrubber was found to average greater than 95 percent dust removal efficiency with disposable filters, and 88 and 90 percent, respectively, with optional washable filters in their prewash and post-wash test conditions. Although the washable filters can be reused, washing them generated personal and downstream respirable dust concentrations of 1.2 and 8.3 mg/m(3), respectively, for a 10-min washing period. The scrubber's velocity-pressure-regulated variable-frequency-drive fan maintained relatively consistent airflow near the targeted 1.42 and 4.25 m(3)/s (3,000 and 9,000 ft(3)/min) airflow rates during most of the laboratory dust testing until reaching its maximum 60-Hz fan motor frequency or horsepower rating at 2,610 Pa (10.5 in. w.g.) of filter differential pressure and 3.97 m(3)/s (8,420 ft(3)/min) of scrubber airflow quantity. Laboratory sound level measurements of the scrubber showed that the outlet side of the scrubber was noisier, and the loaded filters increased sound levels compared with clean filters at the same airflow quantities. With loaded filters, the scrubber reached a 90 dB(A) sound level at 2.83 m(3)/s (6,000 ft(3)/min) of scrubber airflow, indicating that miners should not be overexposed in relation to MSHA's permissible exposure level - under Title 30 Code of Federal Regulations Part 62.101- of 90 dB(A) at or below this airflow quantity. The scrubber's washable filters were not used during field-testing because of their lower respirable dust removal efficiency and the airborne dust generated by filter washing. Field-testing the scrubber with disposable filters at two underground coal mine sections showed that it could clean a portion of the section return air and provide dust reduction of about 50 percent at the face area downstream of the continuous-miner operation. |
Inactivated poliovirus type 2 vaccine delivered to rat skin via high density microprojection array elicits potent neutralising antibody responses
Muller DA , Pearson FE , Fernando GJ , Agyei-Yeboah C , Owens NS , Corrie SR , Crichton ML , Wei JC , Weldon WC , Oberste MS , Young PR , Kendall MA . Sci Rep 2016 6 22094 Polio eradication is progressing rapidly, and the live attenuated Sabin strains in the oral poliovirus vaccine (OPV) are being removed sequentially, starting with type 2 in April 2016. For risk mitigation, countries are introducing inactivated poliovirus vaccine (IPV) into routine vaccination programs. After April 2016, monovalent type 2 OPV will be available for type 2 outbreak control. Because the current IPV is not suitable for house-to-house vaccination campaigns (the intramuscular injections require health professionals), we developed a high-density microprojection array, the Nanopatch, to deliver monovalent type 2 IPV (IPV2) vaccine to the skin. To assess the immunogenicity of the Nanopatch, we performed a dose-matched study in rats, comparing the immunogenicity of IPV2 delivered by intramuscular injection or Nanopatch immunisation. A single dose of 0.2 D-antigen units of IPV2 elicited protective levels of poliovirus antibodies in 100% of animals. However, animals receiving IPV2 by IM injection required at least 3 immunisations to reach the same neutralising antibody titres. This level of dose reduction (1/40th of a full dose) is unprecedented for poliovirus vaccine delivery. The ease of administration coupled with the dose reduction observed in this study points to the Nanopatch as a potential tool for facilitating inexpensive IPV for mass vaccination campaigns. |
Evaluation of Xpert MTB/RIF to identify pulmonary tuberculosis in tuberculosis suspects from low and higher prevalence settings compared to acid fast smear and culture
Firnhaber C , Kendall MA , Wu X , Mazurek GH , Benator DA , Arduino R , Fernandez M , Guy E , Johnson P , Metchock B , Sattler F , Telzak E , Wang YF , Weiner M , Swindells S , Sanne IM , Havlir DV , Grinsztejn B , Alland D . Clin Infect Dis 2016 62 (9) 1081-8 BACKGROUND: Xpert MTB/RIF (Xpert) is a rapid nucleic acid amplification test widely used in high tuberculosis (TB) prevalence settings to detect tuberculosis as well as rpoB mutations associated with rifampin resistance. Data are needed on the diagnostic performance of Xpert in lower prevalence settings to inform appropriate use for both tuberculosis detection and the need for respiratory isolation. METHODS: Xpert was compared to two sputa, each evaluated with AFB smear and mycobacterial culture using liquid and solid culture media, from participants with suspected pulmonary TB from the US, Brazil, and South Africa. RESULTS: Of 992 participants enrolled with evaluable results, 22% had culture-confirmed TB. In 638 (64%) US participants, one Xpert demonstrated sensitivity of 85.2% (96.7% in participants with AFB smear-positive (AFB+) sputum, 59.3% with AFB- sputum), specificity of 99.2%, NPV 97.6%, and PPV 94.9%. Results did not differ between higher and low prevalence settings. A second Xpert increased overall sensitivity to 91.1% (100% if AFB+, 71.4% if AFB-), with specificity of 98.9%. In US participants, a single negative Xpert predicted the absence of AFB+/culture+ tuberculosis with an NPV of 99.7%; NPV of two Xperts was 100%, suggesting a role in removing patients from airborne infection isolation. Xpert detected TB DNA and mutations associated with rifampin resistance in five of seven participants with rifampin-resistant, culture+ tuberculosis. Specificity for rifampin resistance was 99.5%, NPV was 98.9%. CONCLUSIONS: In the US, Xpert testing performed comparably to two higher TB prevalence settings. These data support the use of Xpert in the initial evaluation of TB suspects and in algorithms assessing need for respiratory isolation. |
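Editor's note: sensitivity, specificity, PPV, and NPV follow directly from the 2x2 cross-tabulation of the index test (Xpert) against the reference standard (culture). A brief sketch; the counts are hypothetical, chosen only to roughly echo the reported sensitivity and specificity, not the study's actual cross-tabulation.

```python
# Sensitivity, specificity, PPV, and NPV from a 2x2 table of index test
# (Xpert) versus reference standard (culture). Counts are hypothetical.
def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# e.g. 115 culture-positive and 500 culture-negative participants
print(diagnostic_metrics(tp=98, fp=4, fn=17, tn=496))
```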
Communitywide cryptosporidiosis outbreak associated with a surface water-supplied municipal water system - Baker City, Oregon, 2013
De Silva MB , Schafer S , Kendall Scott M , Robinson B , Hills A , Buser GL , Salis K , Gargano J , Yoder J , Hill V , Xiao L , Roellig D , Hedberg K . Epidemiol Infect 2015 144 (2) 1-11 Cryptosporidium, a parasite known to cause large drinking and recreational water outbreaks, is tolerant of chlorine concentrations used for drinking water treatment. Human laboratory-based surveillance for enteric pathogens detected a cryptosporidiosis outbreak in Baker City, Oregon during July 2013 associated with municipal drinking water. Objectives of the investigation were to confirm the outbreak source and assess outbreak extent. The watershed was inspected and city water was tested for contamination. To determine the community attack rate, a standardized questionnaire was administered to randomly sampled households. Weighted attack rates and confidence intervals (CIs) were calculated. Water samples tested positive for Cryptosporidium species; a Cryptosporidium parvum subtype common in cattle was detected in human stool specimens. Cattle were observed grazing along watershed borders; cattle faeces were observed within watershed barriers. The city water treatment facility chlorinated, but did not filter, water. The community attack rate was 28.3% (95% CI 22.1-33.6), sickening an estimated 2780 persons. Watershed contamination by cattle probably caused this outbreak; water treatments effective against Cryptosporidium were not in place. This outbreak highlights vulnerability of drinking water systems to pathogen contamination and underscores the need for communities to invest in system improvements to maintain multiple barriers to drinking water contamination. |
Strengthening the Reporting of Observational Studies in Epidemiology for respondent-driven sampling studies: "STROBE-RDS" statement
White RG , Hakim AJ , Salganik MJ , Spiller MW , Johnston LG , Kerr L , Kendall C , Drake A , Wilson D , Orroth K , Egger M , Hladik W . J Clin Epidemiol 2015 68 (12) 1463-71 OBJECTIVES: Respondent-driven sampling (RDS) is a new data collection methodology used to estimate characteristics of hard-to-reach groups, such as the HIV prevalence in drug users. Many national public health systems and international organizations rely on RDS data. However, RDS reporting quality and available reporting guidelines are inadequate. We carried out a systematic review of RDS studies and present Strengthening the Reporting of Observational Studies in Epidemiology for RDS Studies (STROBE-RDS), a checklist of essential items to present in RDS publications, justified by an explanation and elaboration document. STUDY DESIGN AND SETTING: We searched the MEDLINE (1970-2013), EMBASE (1974-2013), and Global Health (1910-2013) databases to assess the number and geographical distribution of published RDS studies. STROBE-RDS was developed based on STROBE guidelines, following Guidance for Developers of Health Research Reporting Guidelines. RESULTS: RDS has been used in over 460 studies from 69 countries, including the USA (151 studies), China (70), and India (32). STROBE-RDS includes modifications to 12 of the 22 items on the STROBE checklist. The two key areas that required modification concerned the selection of participants and statistical analysis of the sample. CONCLUSION: STROBE-RDS seeks to enhance the transparency and utility of research using RDS. If widely adopted, STROBE-RDS should improve global infectious diseases public health decision making. |
Total cholesterol performance of Abell-Levy-Brodie-Kendall reference measurement procedure: certification of Japanese in-vitro diagnostic assay manufacturers through CDC's Cholesterol Reference Method Laboratory Network
Nakamura M , Iso H , Kitamura A , Imano H , Kiyama M , Yokoyama S , Kayamori Y , Koyama I , Nishimura K , Nakai M , Dasti M , Vesper HW , Miyamoto Y . Clin Chim Acta 2015 445 127-32 BACKGROUND: Accurate measurement of total cholesterol (TC) is important for cardiovascular disease risk management. The US Centers for Disease Control and Prevention (CDC) and Cholesterol Reference Method Laboratory Network (CRMLN) perform the Abell-Levy-Brodie-Kendall (AK) reference measurement procedure (RMP) for TC as a secondary reference method, and implement the Certification Protocol for Manufacturers. The Japanese CRMLN laboratory at Osaka performed the AK RMP for 22 years, and conducted TC certification for reagent/calibrator/instrument systems of six Japanese manufacturers every 2 years for 16 years. Osaka TC performance was examined and compared to CDC's reference values. METHODS: The AK RMP involved sample hydrolysis, cholesterol extraction, and determination of cholesterol levels by spectrophotometry. The Certification Protocol for Manufacturers includes comparison with the AK RMP using at least 40 fresh specimens. Demonstration of average bias ≤3% and total coefficient of variation ≤3% qualified an analytical system for certification. RESULTS: In the AK RMP used in the Osaka CRMLN laboratory, the regression equation for measuring TC was y (Osaka) = 1.000x (CDC) + 0.032 (n = 619, R² = 1.000). Six Japanese manufacturers had allowable performance for certification. CONCLUSIONS: The AK RMP for TC measurement was accurate, precise, and stable for 22 years. Six Japanese manufacturers were certified for 16 years. |
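Editor's note: the certification criteria above (average bias ≤3% and total CV ≤3% against AK reference values) can be checked with a short calculation. The paired measurements and replicates below are fabricated, and the simple replicate CV shown is a stand-in for the network's full total-CV procedure.

```python
# Check a manufacturer's system against the CRMLN-style criteria:
# average bias <= 3% and CV <= 3% versus AK reference values.
# All total-cholesterol results (mg/dL) below are fabricated.
import numpy as np

# Bias: manufacturer's results versus AK reference values on fresh specimens
reference = np.array([152.0, 187.0, 203.0, 241.0, 266.0])   # AK RMP values
candidate = np.array([150.5, 189.1, 201.0, 244.0, 268.5])   # manufacturer system
avg_bias = ((candidate - reference) / reference * 100).mean()

# Imprecision: CV of replicate measurements of a single specimen
replicates = np.array([201.0, 203.5, 199.8, 202.2, 200.9])
cv = replicates.std(ddof=1) / replicates.mean() * 100

print(f"average bias = {avg_bias:.2f}%, CV = {cv:.2f}%, "
      f"meets criteria = {abs(avg_bias) <= 3 and cv <= 3}")
```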
Eliminating preventable HIV-related maternal mortality in sub-Saharan Africa: what do we need to know?
Kendall T , Danel I , Cooper D , Dilmitis S , Kaida A , Kourtis AP , Langer A , Lapidos-Salaiz I , Lathrop E , Moran AC , Sebitloane H , Turan JM , Watts DH , Wegner MN . J Acquir Immune Defic Syndr 2014 67 Suppl 4 S250-8 INTRODUCTION: HIV makes a significant contribution to maternal mortality, and women living in sub-Saharan Africa are most affected. International commitments to eliminate preventable maternal mortality and reduce HIV-related deaths among pregnant and postpartum women by 50% will not be achieved without a better understanding of the links between HIV and poor maternal health outcomes and improved health services for the care of women living with HIV (WLWH) during pregnancy, childbirth, and postpartum. METHODS: This article summarizes priorities for research and evaluation identified through consultation with 30 international researchers and policymakers with experience in maternal health and HIV in sub-Saharan Africa and a review of the published literature. RESULTS: Priorities for improving the evidence about effective interventions to reduce maternal mortality and improve maternal health among WLWH include better quality data about causes of maternal death among WLWH, enhanced and harmonized program monitoring, and research and evaluation that contributes to improving: (1) clinical management of pregnant and postpartum WLWH, including assessment of the impact of expanded antiretroviral therapy on maternal mortality and morbidity, (2) integrated service delivery models, and (3) interventions to create an enabling social environment for women to begin and remain in care. CONCLUSIONS: As the global community evaluates progress and prepares for new maternal mortality and HIV targets, addressing the needs of WLWH must be a priority now and after 2015. Research and evaluation on maternal health and HIV can increase collaboration on these 2 global priorities, strengthen political constituencies and communities of practice, and accelerate progress toward achievement of goals in both areas. |
Typhoid fever acquired in the United States, 1999-2010: epidemiology, microbiology, and use of a space-time scan statistic for outbreak detection
Imanishi M , Newton AE , Vieira AR , Gonzalez-Aviles G , Kendall Scott ME , Manikonda K , Maxwell TN , Halpin JL , Freeman MM , Medalla F , Ayers TL , Derado G , Mahon BE , Mintz ED . Epidemiol Infect 2014 143 (11) 1-12 Although rare, typhoid fever cases acquired in the United States continue to be reported. Detection and investigation of outbreaks in these domestically acquired cases offer opportunities to identify chronic carriers. We searched surveillance and laboratory databases for domestically acquired typhoid fever cases, used a space-time scan statistic to identify clusters, and classified clusters as outbreaks or non-outbreaks. From 1999 to 2010, domestically acquired cases accounted for 18% of 3373 reported typhoid fever cases; their isolates were less often multidrug-resistant (2% vs. 15%) compared to isolates from travel-associated cases. We identified 28 outbreaks and two possible outbreaks within 45 space-time clusters of ≥2 domestically acquired cases, including three outbreaks involving ≥2 molecular subtypes. The approach detected seven of the ten outbreaks published in the literature or reported to CDC. Although this approach did not definitively identify any previously unrecognized outbreaks, it showed the potential to detect outbreaks of typhoid fever that may escape detection by routine analysis of surveillance data. Sixteen outbreaks had been linked to a carrier. Every case of typhoid fever acquired in a non-endemic country warrants thorough investigation. Space-time scan statistics, together with shoe-leather epidemiology and molecular subtyping, may improve outbreak detection. |
Population size, HIV, and behavior among MSM in Luanda, Angola: challenges and findings in the first ever HIV and syphilis biological and behavioral survey
Kendall C , Kerr LR , Mota RM , Cavalcante S , Macena RH , Chen S , Gaffga N , Monterosso E , Bastos FI , Serrano D . J Acquir Immune Defic Syndr 2014 66 (5) 544-51 OBJECTIVES: To conduct the first population size estimation and biological and behavioral surveillance survey among men who have sex with men (MSM) in Angola. DESIGN: Population size estimation with multiplier method and a cross-sectional study using respondent-driven sampling. SETTING: Luanda Province, Angola. Study was conducted in a large hospital. PARTICIPANTS: Seven hundred ninety-two self-identified MSM accepted a unique object for population size estimation. Three hundred fifty-one MSM were recruited with respondent-driven sampling for biological and behavioral surveillance survey. METHODS: Interviews and testing for HIV and syphilis were conducted on-site. Analysis used the Respondent-Driven Sampling Analysis Tool and STATA 11.0. Univariate, bivariate, and multivariate analyses examined factors associated with HIV and unprotected sex. Six imputation strategies were used for missing data for those refusing to test for HIV. MAIN OUTCOME: A population size of 6236 MSM was estimated. Twenty-seven of 351 individuals tested positive. Adjusted HIV prevalence was 3.7% (8.7% crude). With imputation, HIV seroprevalence was estimated between 3.8% [95% confidence interval (CI): 1.6 to 6.5] and 10.5% (95% CI: 5.6 to 15.3). Being older than 25 (odds ratio = 10.8, 95% CI: 3.5 to 32.8) and having suffered episodes of homophobia (odds ratio = 12.7, 95% CI: 3.2 to 49.6) significantly increased the chance of HIV seropositivity. CONCLUSIONS: Risk behaviors are widely reported, but HIV seroprevalence is lower than expected. The difference between crude and adjusted values was mostly due to treatment of missing values in the Respondent-Driven Sampling Analysis Tool. Solutions are proposed in this article. Although concerns were raised about feasibility and adverse outcomes for MSM, the study was successfully and rapidly completed with no adverse effects. |
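Editor's note: the multiplier method behind the population size estimate above divides the number of unique objects distributed by the proportion of the survey sample reporting receipt. The proportion below is an assumed value chosen so the result lands near the published estimate; it is not reported in the abstract.

```python
# Multiplier-method population size estimate: unique objects distributed,
# divided by the (RDS-adjusted) proportion of survey respondents who report
# having received one. The proportion below is assumed for illustration.
objects_distributed = 792          # MSM who accepted the unique object
prop_received_in_survey = 0.127    # assumed RDS-adjusted proportion reporting receipt

population_size = objects_distributed / prop_received_in_survey
print(f"estimated MSM population size: {population_size:,.0f}")   # ~6,236
```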