Last data update: Apr 29, 2024. (Total: 46658 publications since 2009)
Records 1-23 (of 23 Records) |
Query Trace: Bloland P [original query] |
Targeted Short Message Service-Based Intervention to Improve Routine Immunization Reporting in Bauchi State, Nigeria, 2016
Adegoke OJ , Mungure E , Osadebe LU , Adeoye OB , Aduloju M , Makinde I , Ahmed B , Nguku PM , Waziri NE , Bloland PB , MacNeil A . Pan Afr Med J 12/28/2021 40 11 INTRODUCTION: High quality, timely and complete immunization data are essential for program planning and decision-making. In Nigeria, the National Health Management Information System (NHMIS) Routine Immunization (RI) module and dashboard (on the District Health Information System version 2 (DHIS2) platform) support the use of real-time RI data. We deployed an automated short message service (SMS) notification system that works with the existing RI module to facilitate improvements in RI data in the DHIS2. METHODS: A pilot project was performed using intervention and control local government areas (LGAs). A mixed-methods approach, combining qualitative and quantitative methods, was used to evaluate the system. We assessed changes in reporting rates across different reports. The evaluation also included baseline and post-intervention surveys of health facility (HF) staff. RESULTS: Reporting timeliness (76% pre and 99% post intervention) and completeness (83% pre and 99% post intervention) were consistently higher during the post-intervention than the pre-intervention period for facilities in the intervention LGA, while reporting timeliness (65% pre and 66% post intervention) and completeness (71% pre and 77% post intervention) for facilities in the control LGA showed no change. Users reported that the SMS system was easy to understand and helped to facilitate improvements in consistency of data and timeliness of reporting. Inability of health care workers to effect changes at the HF level and the lack of immediate feedback were reported as key challenges to timeliness and quality of reports. CONCLUSION: An SMS-based intervention improved timeliness and completeness of health data reporting. 
However, the intervention should be evaluated on a larger scale over a longer time period before considering a national implementation. |
Use of a district health information system 2 routine immunization dashboard for immunization program monitoring and decision making, Kano State, Nigeria
Tchoualeu DD , Elmousaad HE , Osadebe LU , Adegoke OJ , Nnadi C , Haladu SA , Jacenko SM , Davis LB , Bloland PB , Sandhu HS . Pan Afr Med J 12/28/2021 40 2 INTRODUCTION: a district health information system 2 tool with a customized routine immunization (RI) module and indicator dashboard was introduced in Kano State, Nigeria, in November 2014 to improve data management and analysis of RI services. We assessed the use of the module for program monitoring and decision-making, as well as the enabling factors and barriers to data collection and use. METHODS: a mixed-methods approach was used to assess user experience with the RI data module and dashboard, including 1) a semi-structured survey questionnaire administered at 60 health facilities administering vaccinations and 2) focus group discussions and 16 in-depth interviews conducted with immunization program staff members at the local government area (LGA) and state levels. RESULTS: in health facilities, an RI monitoring chart was used to review progress toward meeting vaccination coverage targets. At the LGA level, staff members used RI dashboard data to prioritize health facilities for additional support. At the state level, immunization program staff members used RI data to make policy decisions. They viewed the provision of real-time data through the RI dashboard as a "game changer". Use of immunization data was facilitated through review meetings and supportive supervision visits. Barriers to data use among LGA staff members included inadequate understanding of the data collection tools and computer illiteracy. CONCLUSION: the routine immunization data dashboard facilitated access to and use of data for decision-making at the LGA, state and national levels; however, use at the health facility level remains limited. Ongoing data review meetings and training on computer skills and data collection tools are recommended. |
Association of malnutrition with subsequent malaria parasitemia among children younger than three years in Kenya: A secondary data analysis of the Asembo Bay Cohort Study
Donovan CV , McElroy P , Adair L , Pence BW , Oloo AJ , Lal A , Bloland P , Nahlen B , Juliano JJ , Meshnick S . Am J Trop Med Hyg 2020 104 (1) 243-254 Malaria and malnutrition remain primary causes of morbidity and mortality among children younger than 5 years in Africa. Studies investigating the association between malnutrition and subsequent malaria outcomes are inconsistent. We studied the effects of malnutrition on incidence and prevalence of malaria parasitemia in data from a cohort studied in the 1990s. Data came from the Asembo Bay cohort study, which collected malaria and health information on children from 1992 to 1996 in western Kenya. Infants were enrolled at birth and followed up until loss to follow-up, death, end of study, or 5 years old. Anthropometric measures and blood specimens were obtained monthly. Nutritional exposures included categorized Z-scores for height-for-age, weight-for-age, and weight-for-height. Febrile parasitemia and afebrile parasitemia were assessed with thick and thin blood films. Multiply imputed and weighted multinomial generalized estimating equation models estimated odds ratios (OR) for the association between exposures and outcomes. The sample included 1,182 children aged 0-30 months who contributed 18,028 follow-up visits. There was no significant association between malnutrition and either incident febrile parasitemia or prevalent febrile parasitemia. Prevalence ORs for afebrile parasitemia increased from 1.07 (95% CI: 0.89, 1.29) to 1.35 (1.03, 1.76) as stunting severity increased from mild to severe, and from 1.16 (1.02, 1.33) to 1.35 (1.09, 1.66) as underweight increased from mild to moderate. Stunting and underweight did not show a significant association with subsequent febrile parasitemia infections, but they did show a modest association with subsequent afebrile parasitemia. Consideration should be given to testing malnourished children for malaria, even if they present without fever. |
Implementing the routine immunisation data module and dashboard of DHIS2 in Nigeria, 2014-2019
Shuaib F , Garba AB , Meribole E , Obasi S , Sule A , Nnadi C , Waziri NE , Bolu O , Nguku PM , Ghiselli M , Adegoke OJ , Jacenko S , Mungure E , Gidado S , Wilson I , Wiesen E , Elmousaad H , Bloland P , Rosencrans L , Mahoney F , MacNeil A , Franka R , Vertefeuille J . BMJ Glob Health 2020 5 (7) In 2010, Nigeria adopted the use of web-based software District Health Information System, V.2 (DHIS2) as the platform for the National Health Management Information System. The platform supports real-time data reporting and promotes government ownership and accountability. To strengthen its routine immunisation (RI) component, the US Centers for Disease Control and Prevention (CDC) through its implementing partner, the African Field Epidemiology Network-National Stop Transmission of Polio, in collaboration with the Government of Nigeria, developed the RI module and dashboard and piloted it in Kano state in 2014. The module was scaled up nationally over the next 4 years with funding from the Bill & Melinda Gates Foundation and CDC. One implementation officer was deployed per state for 2 years to support operations. Over 60 000 RI healthcare workers were trained on data collection, entry and interpretation and each local immunisation officer in the 774 local government areas (LGAs) received a laptop and stock of RI paper data tools. Templates for national-level and state-level RI bulletins and LGA quarterly performance tools were developed to promote real-time data use for feedback and decision making, and enhance the performance of RI services. By December 2017, the DHIS2 RI module had been rolled out in all 36 states and the Federal Capital Territory, and all states now report their RI data through the RI Module. All states identified at least one government DHIS2 focal person for oversight of the system's reporting and management operations. Government officials routinely collect RI data and use them to improve RI vaccination coverage. 
This article describes the implementation process-including planning and implementation activities, achievements, lessons learnt, challenges and innovative solutions-and reports the achievements in improving timeliness and completeness rates. |
Financial cost analysis of a strategy to improve the quality of administrative vaccination data in Uganda
Ward K , Mugenyi K , MacNeil A , Luzze H , Kyozira C , Kisakye A , Matseketse D , Newall AT , Heywood AE , Bloland P , Pallas SW . Vaccine 2019 38 (5) 1105-1113 BACKGROUND: High-quality vaccination data are critical to planning, implementation and evaluation of immunization programs. However, sub-optimal administrative vaccination data quality in low- and middle-income countries persists for heterogeneous reasons, though most relate to organizational factors and human behavior. The nationwide Data Improvement Team (DIT) strategy in Uganda aimed to strengthen human resource capacity to generate quality administrative vaccination data at the health facility. METHODS: A financial cost analysis of the Uganda DIT strategy (2014-2016) was conducted from the program funder perspective. Activity-based micro-costing from funder financial and program monitoring records was used to estimate total and unit costs by program area (in 2016 US dollars). Hypothetical scenarios were developed to illustrate potential approaches to reducing costs. RESULTS: Over 25 months the DIT strategy was implemented in all 116 operational districts and 3443 (89%) health facilities in Uganda at a total financial cost of US $575 275. Training and deployment of DITs accounted for the highest proportion of expenditure across program areas (69%). Transport, per diems, lodging, and honoraria for DIT members and national supervisors were the main cost drivers of the strategy. Deployment of 557 DIT members cost US $839 per DIT member, US $4 030 per district, and US $136 per health facility. The estimated opportunity cost of government staff time was not a major cost driver (2.5%) of total cost. CONCLUSION: The results provide the first estimates of the magnitude and drivers of cost to implement a national workforce capacity building strategy to improve administrative vaccination data quality in a low- or middle-income country. 
Financial costs are a critical input to combine with future outcome data to describe the cost of strategies relative to performance outcomes. The operational costs of the strategy were modest (0.5-1.6%) relative to the estimated operational costs of Uganda's national immunization program. |
Defining & assessing the quality, usability, and utilization of immunization data
Bloland P , MacNeil A . BMC Public Health 2019 19 (1) 380 BACKGROUND: High quality data are needed for decision-making at all levels of the public health system, from guiding public health activities at the local level, to informing national policy development, to monitoring the impact of global initiatives. Although a number of approaches have been developed to evaluate the underlying quality of routinely collected vaccination administrative data, there remains a lack of consensus around how data quality is best defined or measured. DISCUSSION: We present a definitional framework that is intended to disentangle many of the elements that have confused discussions of vaccination data quality to date. The framework describes immunization data in terms of three key characteristics: data quality, data usability, and data utilization. The framework also offers concrete suggestions for a specific set of indicators that could be used to better understand those key characteristics of immunization data, including Trueness, Concurrence, Relevancy, Efficiency, Completeness, Timeliness, Integrity, Consistency, and Utilization. CONCLUSION: Being deliberate about the choice of indicators; being clear on their definitions, limitations, and methods of measurement; and describing how those indicators work together to give a more comprehensive and practical understanding of immunization data quality, usability, and use, should yield more informed, and therefore better, programmatic decision-making. |
Developing standardized competencies to strengthen immunization systems and workforce
Traicoff D , Pope A , Bloland P , Lal D , Bahl J , Stewart S , Ryman T , Abbruzzese M , Lee C , Ahrendts J , Shamalla L , Sandhu H . Vaccine 2019 37 (11) 1428-1435 Despite global support for immunization as a core component of the human right to health and the maturity of immunization programs in low- and middle-income countries throughout the world, there is no comprehensive description of the standardized competencies needed for immunization programs at the national, multiple sub-national, and community levels. The lack of defined and standardized competencies means countries have few guidelines to help them address immunization workforce planning, program management, and performance monitoring. Potential consequences resulting from the lack of defined competencies include inadequate or inefficient distribution of resources to support the required functions and difficulties in adequately managing the health workforce. In 2015, an international multi-agency working group convened to define standardized competencies that national immunization programs could adapt for their own workforce planning needs. The working group used a stepwise approach to ensure that the competencies would align with immunization programs' objectives. The first step defined the attributes of a successful immunization program. The group then defined the work functions needed to achieve those attributes. Based on the work functions, the working group defined specific competencies. This process resulted in three products: (1) Attributes of an immunization program described within eight technical domains at four levels within a health system: National, Provincial, District/Local, and Community; (2) 229 distinct functions within those eight domains at each of the four levels; and (3) 242 competencies, representing eight technical domains and two foundational domains (Management and Leadership and Vaccine Preventable Diseases and Program). 
Currently available as a working draft and being tested with immunization projects in several countries, the final document will be published by WHO as normative guidelines. Vertical immunization programs as well as integrated systems can customize the framework to suit their needs. Standardized competencies can support immunization program improvements and help strengthen effective health systems. |
Considerations for the development and implementation of electronic immunization registries in Africa
Namageyo-Funa A , Samuel A , Bloland P , Macneil A . Pan Afr Med J 2018 30 81 While paper-based immunization registries are the prevalent form of documenting individual-level immunization service delivery in Africa, some countries are interested in transitioning to electronic immunization registries (EIRs), which have the potential to transform immunization data into usable information for decision making to optimize the performance of immunization programs. This report discusses opportunities and challenges in the adoption of EIRs on the African continent. |
Assessment of select electronic health information systems that support immunization data capture - Kenya, 2017
Namageyo-Funa A , Aketch M , Tabu C , MacNeil A , Bloland P . BMC Health Serv Res 2018 18 (1) 621 BACKGROUND: Although electronic health information systems (EHIS) with immunization components exist in Kenya, questions and concerns remain about their use and alignment with the Kenya Ministry of Health's (MOH) National Vaccine and Immunization Program (NVIP). This article reports on the findings of an assessment of select EHIS with immunization components in Kenya, specifically related to system design, development, and implementation. METHODS: We conducted a rapid assessment of select EHIS with immunization components in Kenya from January to May 2017 to understand the design, development, and implementation of the EHIS, including the lessons learned from their use. We also assessed how the data elements in the EHIS compared to the data elements in the Maternal and Child Health Booklet used in the existing paper-based system in Kenya. RESULTS: The EHIS reviewed varied in purpose, content, and population covered. Only one system was built to focus specifically on immunization data. Substantial differences in system functionality and immunization-related data elements included in the EHIS were identified. None of the EHIS had all the data elements necessary to fully replace or operate independently from the standardized paper-based system for recording immunization data in Kenya. CONCLUSIONS: Overall, the findings of this assessment highlighted substantial variation in the EHIS with immunization components. The findings provide insights and lessons learned for the Kenya MOH NVIP, immunization partners, vendors of EHIS, and users of EHIS to consider as Kenya transitions from paper-based to electronic immunization information systems. |
Enhancing workforce capacity to improve vaccination data quality, Uganda
Ward K , Mugenyi K , Benke A , Luzze H , Kyozira C , Immaculate A , Tanifum P , Kisakye A , Bloland P , MacNeil A . Emerg Infect Dis 2017 23 (13) S85-93 In Uganda, vaccine dose administration data are often not available or are of insufficient quality to optimally plan, monitor, and evaluate program performance. A collaboration of partners aimed to address these key issues by deploying data improvement teams (DITs) to improve data collection, management, analysis, and use in district health offices and health facilities. During November 2014-September 2016, DITs visited all districts and 89% of health facilities in Uganda. DITs identified gaps in awareness and processes, assessed accuracy of data, and provided on-the-job training to strengthen systems and improve healthcare workers' knowledge and skills in data quality. Inaccurate data were observed primarily at the health facility level. Improvements in data management and collection practices were observed, although routine follow-up and accountability will be needed to sustain change. The DIT strategy offers a useful approach to enhancing the quality of health data. |
Assessing inactivated polio vaccine introduction and utilization in Kano State, Nigeria, April-November 2015
Osadebe LU , Macneil A , Elmousaad H , Davis L , Idris JM , Haladu SA , Adeoye OB , Nguku P , Aliu-Mamudu U , Hassan E , Vertefeuille J , Bloland P . J Infect Dis 2017 216 S137-S145 Background. Kano State, Nigeria, introduced inactivated polio vaccine (IPV) into its routine immunization (RI) schedule in March 2015 and was the pilot site for an RI data module for the National Health Management Information System (NHMIS). We determined factors impacting IPV introduction and the value of the RI module on monitoring new vaccine introduction. Methods. Two assessment approaches were used: (1) analysis of IPV vaccinations reported in NHMIS, and (2) survey of 20 local government areas (LGAs) and 60 associated health facilities (HF). Results. By April 2015, 66% of LGAs had at least 20% of HFs administering IPV, by June all LGAs had HFs administering IPV and by July, 91% of the HFs in Kano reported administering IPV. Among surveyed staff, most rated training and implementation as successful. Among HFs, 97% had updated RI reporting tools, although only 50% had updated microplans. Challenges among HFs included: IPV shortages (20%), hesitancy to administer 2 injectable vaccines (28%), lack of knowledge on multi-dose vial policy (30%) and age of IPV administration (8%). Conclusion. The introduction of IPV was largely successful in Kano and the RI module was effective in monitoring progress, although certain gaps were noted, which should be used to inform plans for future vaccine introductions. |
Fifty years of global immunization at CDC, 1966-2015
Mast EE , Cochi SL , Kew OM , Cairns KL , Bloland PB , Martin R . Public Health Rep 2017 132 (1) 18-26 On November 23, 1965, President Lyndon Johnson announced plans for a 5-year smallpox eradication and measles control program in West Africa that enabled the Centers for Disease Control and Prevention (CDC) to establish a Smallpox Eradication Program in January 1966. Since then, CDC’s global immunization endeavors have encompassed global smallpox eradication, the establishment and growth of the Expanded Program on Immunization (EPI) to strengthen national immunization programs, global efforts to eradicate polio and eliminate measles and rubella, and vaccine introduction into national immunization schedules beyond the original 6 EPI vaccines. CDC has provided scientific leadership, evidence-based guidance, and programmatic strategies to build public health infrastructure around the world, needed to achieve and measure the impact of these global immunization initiatives. This article marks the 50th anniversary of CDC’s global immunization leadership, highlights key historical events, and provides an overview of CDC’s future directions. | Before 1955, smallpox and diphtheria-tetanus-pertussis vaccines were the only routinely recommended childhood vaccines in the United States. The roots of global immunization at CDC began after clinical trials for the Salk inactivated polio vaccine (IPV) in 1954. After investigators announced on April 12, 1955, that Salk IPV was safe and effective, large-scale vaccination campaigns were implemented across the United States, and IPV was set to join diphtheria-tetanus-pertussis and smallpox vaccines in the childhood vaccination schedule. However, improperly prepared IPV by Cutter Laboratories used for the vaccination campaigns led to 200 cases of paralysis and 10 deaths. |
Serological markers for monitoring historical changes in malaria transmission intensity in a highly endemic region of Western Kenya, 1994-2009
Wong J , Hamel MJ , Drakeley CJ , Kariuki S , Shi YP , Lal AA , Nahlen BL , Bloland PB , Lindblade KA , Were V , Otieno K , Otieno P , Odero C , Slutsker L , Vulule JM , Gimnig JE . Malar J 2014 13 (451) 451 BACKGROUND: Monitoring local malaria transmission intensity is essential for planning evidence-based control strategies and evaluating their impact over time. Anti-malarial antibodies provide information on cumulative exposure and have proven useful, in areas where transmission has dropped to low sustained levels, for retrospectively reconstructing the timing and magnitude of transmission reduction. It is unclear whether serological markers are also informative in high transmission settings, where interventions may reduce transmission, but to a level where considerable exposure continues. METHODS: This study was conducted through ongoing KEMRI and CDC collaboration. Asembo, in Western Kenya, is an area where intense malaria transmission was drastically reduced during a 1997-1999 community-randomized, controlled insecticide-treated net (ITN) trial. Two approaches were taken to reconstruct malaria transmission history during the period from 1994 to 2009. First, point measurements were calculated for seroprevalence, mean antibody titre, and seroconversion rate (SCR) against three Plasmodium falciparum antigens (AMA-1, MSP-1(19), and CSP) at five time points for comparison against traditional malaria indices (parasite prevalence and entomological inoculation rate). Second, within individual post-ITN years, age-stratified seroprevalence data were analysed retrospectively for an abrupt drop in SCR by fitting alternative reversible catalytic conversion models that allowed for change in SCR. RESULTS: Generally, point measurements of seroprevalence, antibody titres and SCR produced consistent patterns indicating that a gradual but substantial drop in malaria transmission (46-70%) occurred from 1994 to 2007, followed by a marginal increase beginning in 2008 or 2009. 
In particular, proportionate changes in seroprevalence and SCR point estimates (relative to 1994 baseline values) for AMA-1 and CSP, but not MSP-1(19), correlated closely with trends in parasite prevalence throughout the entire 15-year study period. However, retrospective analyses using datasets from 2007, 2008 and 2009 failed to detect any abrupt drop in transmission coinciding with the timing of the 1997-1999 ITN trial. CONCLUSIONS: In this highly endemic area, serological markers were useful for generating accurate point estimates of malaria transmission intensity, but not for retrospective analysis of historical changes. Further investigation, including exploration of different malaria antigens and/or alternative models of population seroconversion, may yield serological tools that are more informative in high transmission settings. |
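The abstract does not specify the form of the reversible catalytic model. A commonly used form in malaria serology (an assumption here, not taken from the study itself) expresses the expected seroprevalence at age $a$ as:

```latex
P(a) = \frac{\lambda}{\lambda + \rho}\left(1 - e^{-(\lambda + \rho)\,a}\right)
```

where $\lambda$ is the seroconversion rate (SCR) and $\rho$ is the seroreversion rate. A historical change in transmission $\tau$ years before a survey can be accommodated by letting $\lambda$ take one value for exposure accrued before the change and another afterwards, and comparing the fit of the one-rate and two-rate models to detect an abrupt drop in SCR.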
Pattern of morbidity and mortality in Karbala hospitals during Ashura mass gathering at Karbala, Iraq, 2010
Al-Lami F , Al-Fatlawi A , Bloland P , Nawwar A , Jetheer A , Hantoosh H , Radhi F , Mohan B , Abbas M , Kamil A , Khayatt I , Baqir H . East Mediterr Health J 2013 19 S13-S18 Religious mass gatherings are increasingly common in Iraq and can harbour considerable public health risks. This study was aimed at determining morbidity and mortality patterns in hospitals in Karbala city, Iraq during the mass gathering for Ashura in 2010. We conducted a cross-sectional study on attendees at the 3 public hospitals in the city. The study period was divided into pre-event, event, and post-event phases. Morbidity and mortality data were obtained from hospital registry books and the coroner's office. About 80% of the 18 415 consultations were at emergency rooms. Average daily emergency room attendance was higher during the event compared with pre- and post-event phases, while average daily admissions decreased. Compared with the pre-event phase, a 7-fold increase in febrile disorders and a 2-fold increase in chronic diseases and injuries were noted during the event phase. There was no difference between the 3 phases for average daily death rate, nor for cause of death. |
Adverse drug events resulting from use of drugs with sulphonamide-containing anti-malarials and artemisinin-based ingredients: findings on incidence and household costs from three districts with routine demographic surveillance systems in rural Tanzania
Njau JD , Kabanywanyi AM , Goodman CA , MacArthur JR , Kapella BK , Gimnig JE , Kahigwa E , Bloland PB , Abdulla SM , Kachur SP . Malar J 2013 12 (1) 236 BACKGROUND: Anti-malarial regimens containing sulphonamide or artemisinin ingredients are widely used in malaria-endemic countries. However, evidence of the incidence of adverse drug reactions (ADR) to these drugs is limited, especially in Africa, and there is a complete absence of information on the economic burden such ADR place on patients. This study aimed to document ADR incidence and associated household costs in three high malaria transmission districts in rural Tanzania covered by demographic surveillance systems. METHODS: Active and passive surveillance methods were used to identify ADR from sulphadoxine-pyrimethamine (SP) and artemisinin (AS) use. ADR were identified by trained clinicians at health facilities (passive surveillance) and through cross-sectional household surveys (active surveillance). Potential cases were followed up at home, where a complete history and physical examination was undertaken, and household cost data collected. Patients were classified as having 'possible' or 'probable' ADR by a physician. RESULTS: A total of 95 suspected ADR were identified during a two-year period, of which 79 were traced, and 67 reported use of SP and/or AS prior to ADR onset. Thirty-four cases were classified as 'probable' and 33 as 'possible' ADRs. Most (53) cases were associated with SP monotherapy, 13 with the AS/SP combination (available in one of the two areas only), and one with AS monotherapy. Annual ADR incidence per 100,000 exposures was estimated based on 'probable' ADR only at 5.6 for AS/SP in combination, and 25.0 and 11.6 for SP monotherapy. Median ADR treatment costs per episode ranged from US$2.23 for those making a single provider visit to US$146.93 for patients with four visits. 
Seventy-three per cent of patients used out-of-pocket funds or sold part of their farm harvests to pay for treatment, and 19% borrowed money. CONCLUSION: Both passive and active surveillance methods proved feasible methods for anti-malarial ADR surveillance, with active surveillance being an important complement to facility-based surveillance, given the widespread practice of self-medication. Household costs associated with ADR treatment were high and potentially catastrophic. Efforts should be made to both improve pharmacovigilance across Africa and to identify strategies to reduce the economic burden endured by households suffering from ADR. |
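The incidence estimates above ("annual ADR incidence per 100,000 exposures") follow directly from case counts, exposure denominators, and the observation period. A minimal sketch of the calculation, with entirely hypothetical numbers (the study derived its denominators from demographic surveillance data, which are not reproduced in the abstract):

```python
def incidence_per_100k(cases: int, exposures: int, years: float) -> float:
    """Annual adverse-drug-reaction incidence per 100,000 drug exposures."""
    return cases / exposures / years * 100_000

# Hypothetical example: 10 'probable' ADRs against 1,000,000 exposures over 2 years
print(incidence_per_100k(10, 1_000_000, 2.0))  # 0.5 per 100,000 exposures per year
```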
The role of public health institutions in global health system strengthening efforts: the US CDC's perspective
Bloland P , Simone P , Burkholder B , Slutsker L , De Cock KM . PLoS Med 2012 9 (4) e1001199 Peter Bloland and colleagues from the US CDC lay out the agency's priorities for health systems strengthening efforts. |
Highlights and conclusions from the Eastern Mediterranean Public Health Network (EMPHNET) conference 2011
Al Nsour M , Kaiser R , Elkreem EA , Walke H , Kandeel A , Bloland P . East Mediterr Health J 2012 18 (2) 189-191 As a follow-up to a short communication published in the Eastern Mediterranean Health Journal in December 2011, this article reports on highlights and conclusions from scientific abstracts, methodology workshops and plenary sessions that were presented as part of the Eastern Mediterranean Public Health Network (EMPHNET) conference held from 6 to 9 December 2011 in Sharm Al Sheikh, Egypt. |
Training the global public health workforce through applied epidemiology training programs: CDC's experience, 1951-2011
Schneider D , Evering-Watley M , Walke H , Bloland PB . Public Health Rev 2011 33 (1) 190-203 The strengthening of health systems is becoming increasingly recognized as necessary for the achievement of many objectives promoted or supported by global public health initiatives. Key within the effort to strengthen health systems is the development of a well-prepared, skilled, and knowledgeable public health workforce. Over 60 years ago, the United States Centers for Disease Control and Prevention (CDC) began the first training program in applied epidemiology, the Epidemic Intelligence Service (EIS), a two-year, in-service training program in epidemiology and public health practice. Since 1951, the EIS has produced well-trained and highly qualified applied or field epidemiologists, many of whom later became leaders within the US public health system. In 1980, the CDC began assisting other countries to develop their own field epidemiology training programs (FETPs), modeling them after the highly successful EIS program. FETPs differ from other training programs in epidemiology in that: (1) they are positioned within Ministries of Health and the activities of the residents are designed to address the priority health issues of the Ministry; (2) they stress the principle of training through service; and (3) they provide close supervision and mentoring by trained field epidemiologists. While FETPs are designed to be adaptable to the needs of any given country, there exist many fundamental similarities in the skills and knowledge required by public health workers. Recognizing this, CDC developed a standard core FETP curriculum that can be adapted to any country's needs. Countries can further customize FETP trainings to meet their specific needs by adding specialized "tracks" or by targeting different audiences and levels of the health system. 
Although FETPs require substantial investments in time and resources as well as significant commitment from ministries, CDC's vision is that every country will have access to an FETP to help build its public health workforce and strengthen its public health systems. |
Erratum: Molecular monitoring of resistant dhfr and dhps allelic haplotypes in Morogoro and Mvomero districts in south eastern Tanzania
Malisa A , Pearce R , Abdullah S , Mutayoba B , Mshinda H , Kachur P , Bloland P , Roper C . Afr Health Sci 2011 11 (2) 142-50 BACKGROUND: Resistance to the antimalarial drug sulfadoxine-pyrimethamine (SP) emerged in Plasmodium falciparum from Asia in the 1960s and subsequently spread to Africa. In Tanzania, SP use as a national policy began in 1983 as a second line to chloroquine (CQ) for the treatment of uncomplicated malaria, until August 2001 when it was approved to replace CQ as the national first line. OBJECTIVE: The present study assesses the frequency of resistant dhfr and dhps alleles in the Morogoro and Mvomero districts in south eastern Tanzania and contrasts their rate of change during 17 years of SP second-line use against five years of SP first-line use. METHODOLOGY: Cross-sectional surveys of asymptomatic infections were carried out at the end of the rainy season during July-September of 2000, when SP was the national second-line treatment (CQ was the first line), and 2006, when SP was the national first-line antimalarial treatment. Genetic analysis of SP resistance genes was carried out on 1,044 asymptomatic infections and the effect of the two policies on the evolution of SP resistance was compared. RESULTS: The frequency of the most resistant allele, the double dhps-triple dhfr mutant genotype, increased by only 1% during 17 years of SP second-line use, but increased dramatically, by 45%, during five years of SP first-line use. CONCLUSION: We conclude that the national policy change from second-line to first-line SP brought about an immediate shift in treatment practice, and this in turn had a highly significant impact on drug pressure. The use of SP only in specific programs, such as intermittent preventive treatment of infants (IPTi) and intermittent preventive treatment of pregnant women (IPTp), will most likely substantially reduce SP selection pressure and the frequency of SP resistance alleles alike. [This corrects the article on p. 367 in vol. 10, PMID: 21416039.]. |
Drug dispensing practices during implementation of artemisinin-based combination therapy at health facilities in rural Tanzania, 2002-2005
Thwing JI , Njau JD , Goodman C , Munkondya J , Kahigwa E , Bloland PB , Mkikima S , Mills A , Abdulla S , Kachur SP . Trop Med Int Health 2011 16 (3) 272-9 OBJECTIVE: To assess the degree to which policy changes to artemisinin-based combination therapies (ACTs) as first-line treatment for uncomplicated malaria translate into effective ACT delivery. METHODS: A prospective observational study of drug dispensing practices at baseline and during the 3 years following introduction of ACT with sulfadoxine-pyrimethamine (SP) plus artesunate (AS) in Rufiji District, compared with two neighbouring districts where SP monotherapy remained the first-line treatment, was carried out. Demographic and dispensing data were collected from all patients at the dispensing units of selected facilities for 1 month per quarter, documenting a total of 271 953 patient encounters in the three districts. RESULTS: In Rufiji, the proportion of patients who received a clinical diagnosis of malaria increased from 47.6% to 57.0%. A majority (75.9%) of these received SP + AS during the intervention period. Of patients who received SP + AS, 94.6% received the correct dose of both. Among patients in Rufiji who received SP, 14.2% received SP monotherapy, and among patients who received AS, 0.3% received AS monotherapy. CONCLUSIONS: The uptake of SP + AS in Rufiji was rapid and sustained. Although some SP monotherapy occurred, AS monotherapy was rare, and most patients received the correct dose of both drugs. These results suggest that implementation of an artemisinin-based combination therapy, accompanied by training, job aids and assistance in stock management, can rapidly increase access to effective antimalarial treatment. |
Rift Valley fever outbreak in livestock in Kenya, 2006-2007
Munyua P , Murithi RM , Wainwright S , Githinji J , Hightower A , Mutonga D , Macharia J , Ithondeka PM , Musaa J , Breiman RF , Bloland P , Njenga MK . Am J Trop Med Hyg 2010 83 58-64 We analyzed the extent of livestock involvement in the latest Rift Valley fever (RVF) outbreak in Kenya, which started in December 2006 and continued until June 2007. When compared with previous RVF outbreaks in the country, the 2006-07 outbreak was the most extensive in cattle, sheep, goats, and camels, affecting thousands of animals in 29 of 69 administrative districts across six of the eight provinces. This contrasted with the distribution of approximately 700 human RVF cases in the country, where over 85% of these cases were located in four districts: Garissa and Ijara districts in Northeastern Province, Baringo district in Rift Valley Province, and Kilifi district in Coast Province. Analysis of livestock and human data suggests that livestock infections occur before virus detection in humans, as supported by clustering of human RVF cases around livestock cases in Baringo district. The highest livestock morbidity and mortality rates were recorded in Garissa and Baringo districts, the same districts that recorded a high number of human cases. The districts that reported RVF in livestock for the first time in 2006-07 included Kitui, Tharaka, Meru South, Meru Central, Mwingi, Embu, and Mbeere in Eastern Province, Malindi and Taita Taveta in Coast Province, Kirinyaga and Murang'a in Central Province, and Baringo and Samburu in Rift Valley Province, indicating that the disease was occurring in new regions of the country. |
Pathologic studies on suspect animal and human cases of Rift Valley fever from an outbreak in Eastern Africa, 2006-2007
Shieh WJ , Paddock CD , Lederman E , Rao CY , Gould LH , Mohamed M , Mosha F , Mghamba J , Bloland P , Njenga MK , Mutonga D , Samuel AA , Guarner J , Breiman RF , Zaki SR . Am J Trop Med Hyg 2010 83 38-42 Rift Valley fever (RVF) is an important viral zoonotic disease in Africa with periodic outbreaks associated with severe disease, death, and economic hardship. During the 2006-2007 outbreaks in Eastern Africa, postmortem and necropsy tissue samples from 14 animals and 20 humans clinically suspected of RVF were studied with histopathologic evaluation and immunohistochemical (IHC) assays. Six animal and 11 human samples had IHC evidence of Rift Valley fever virus (RVFV) antigens. We found that extensive hepatocellular necrosis without prominent inflammatory cell infiltrates is the most distinctive histopathologic change in liver tissues infected with RVFV. Pathologic studies on postmortem tissue samples can help establish the diagnosis of RVF, differentiating it from endemic diseases with clinical manifestations similar to RVF, such as malaria, leptospirosis, or yellow fever. |
Epidemiologic and clinical aspects of a Rift Valley fever outbreak in humans in Tanzania, 2007
Mohamed M , Mosha F , Mghamba J , Zaki SR , Shieh WJ , Paweska J , Omulo S , Gikundi S , Mmbuji P , Bloland P , Zeidner N , Kalinga R , Breiman RF , Njenga MK . Am J Trop Med Hyg 2010 83 22-7 In January 2007, an outbreak of Rift Valley fever (RVF) was detected among humans in districts of northern Tanzania. By the end of the outbreak in June 2007, 511 suspect RVF cases had been recorded from 10 of the 21 regions of Tanzania, with laboratory confirmation of 186 cases and another 123 probable cases. All confirmed RVF cases were located in the north-central and southern regions of the country, with an eventual fatality rate of 28.2% (N = 144). All suspected cases had fever; 89% had encephalopathy, 10% hemorrhage, and 3% retinopathy. A total of 169 (55%) of the 309 confirmed or probable cases were also positive for malaria as detected by peripheral blood smear. In a cohort of 20 RVF cases with known outcome that were also positive for human immunodeficiency virus, 15 (75%) died. Contact with sick animals and animal products, including blood, meat, and milk, was identified as a major risk factor for acquiring RVF. |
- Page last reviewed: Feb 1, 2024
- Page last updated: Apr 29, 2024