Using simulation to compare established and emerging interventions to reduce cardiovascular disease risk in the United States
Homer J , Wile K , Yarnoff B , Trogdon JG , Hirsch G , Cooper L , Soler R , Orenstein D . Prev Chronic Dis 2014 11 E195 INTRODUCTION: Computer simulation offers the ability to compare diverse interventions for reducing cardiovascular disease risks in a controlled and systematic way that cannot be done in the real world. METHODS: We used the Prevention Impacts Simulation Model (PRISM) to analyze the effect of 50 intervention levers, grouped into 6 (2 x 3) clusters on the basis of whether they were established or emerging and whether they acted in the policy domains of care (clinical, mental health, and behavioral services), air (smoking, secondhand smoke, and air pollution), or lifestyle (nutrition and physical activity). Uncertainty ranges were established through probabilistic sensitivity analysis. RESULTS: Results indicate that by 2040, all 6 intervention clusters combined could result in cumulative reductions of 49% to 54% in the cardiovascular risk-related death rate and of 13% to 21% in risk factor-attributable costs. A majority of the death reduction would come from Established interventions, but Emerging interventions would also contribute strongly. A slim majority of the cost reduction would come from Emerging interventions. CONCLUSION: PRISM allows public health officials to examine the potential influence of different types of interventions - both established and emerging - for reducing cardiovascular risks. Our modeling suggests that established interventions could still contribute much to reducing deaths and costs, especially through greater use of well-known approaches to preventive and acute clinical care, whereas emerging interventions have the potential to contribute significantly, especially through certain types of preventive care and improved nutrition. |
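The uncertainty ranges above come from probabilistic sensitivity analysis: re-running the model many times with parameters drawn from their uncertainty distributions and summarizing a percentile interval of the outcomes. A minimal sketch of that general idea in Python (the six cluster effects, their uniform ranges, and the multiplicative combination are illustrative placeholders, not PRISM's actual structure):

```python
import random

def simulate_cumulative_reduction(effects):
    # Toy stand-in for one model run: combine per-cluster effects
    # multiplicatively into one overall death-rate reduction.
    remaining = 1.0
    for e in effects:
        remaining *= (1.0 - e)
    return 1.0 - remaining

def probabilistic_sensitivity_analysis(n_draws=10_000, seed=42):
    rng = random.Random(seed)
    reductions = []
    for _ in range(n_draws):
        # Draw each of 6 cluster effects from a hypothetical
        # uniform uncertainty range (placeholder values).
        effects = [rng.uniform(0.05, 0.15) for _ in range(6)]
        reductions.append(simulate_cumulative_reduction(effects))
    reductions.sort()
    # Report a 95% percentile interval across the draws.
    return reductions[int(0.025 * n_draws)], reductions[int(0.975 * n_draws)]
```

The reported "49% to 54%" range is exactly this kind of percentile summary over repeated parameter draws.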
Using the Community Readiness Model to examine the built and social environment: a case study of the High Point neighborhood, Seattle, Washington, 2000-2010
Buckner-Brown J , Sharify DT , Blake B , Phillips T , Whitten K . Prev Chronic Dis 2014 11 E194 BACKGROUND: Residents of many cities lack affordable, quality housing. Economically disadvantaged neighborhoods often have high rates of poverty and crime, few institutions that enhance the quality of their residents' lives, and unsafe environments for walking and other physical activity. Deteriorating housing contributes to asthma-related illness. We describe the redevelopment of High Point, a West Seattle neighborhood, to improve its built environment, increase neighborhood physical activity, and reduce indoor asthma triggers. COMMUNITY CONTEXT: High Point is one of Seattle's most demographically diverse neighborhoods. Prior to redevelopment, it had a distressed infrastructure, rising crime rates, and indoor environments that increased asthma-related illness in children and adolescents. High Point residents and partners developed and implemented a comprehensive redevelopment plan to create a sustainable built environment to increase outdoor physical activity and improve indoor environments. METHODS: We conducted a retrospective analysis of the High Point redevelopment, organized by the different stages of change in the Community Readiness Model. We also examined the multisector partnerships among government and community groups that contributed to the success of the High Point project. OUTCOME: Overall quality of life for residents improved as a result of neighborhood redevelopment. Physical activity increased, residents reported fewer days of poor physical or mental health, and social connectedness between neighbors grew. Asthma-friendly homes significantly decreased asthma-related illness among children and adolescents. INTERPRETATION: Providing affordable, quality housing to low-income families improved individual and neighborhood quality of life. 
Efforts to create social change and improve the health outcomes for entire populations are more effective when multiple organizations work together to improve neighborhood health. |
Vital Signs: cervical cancer incidence, mortality, and screening - United States, 2007-2012
Benard VB , Thomas CC , King J , Massetti GM , Doria-Rose VP , Saraiya M . MMWR Morb Mortal Wkly Rep 2014 63 (44) 1004-1009 BACKGROUND: Cervical cancer screening is one of the greatest cancer prevention achievements, yet some women still develop or die from this disease. OBJECTIVE: To assess recent trends in cervical cancer incidence and mortality, current screening percentages, and factors associated with higher incidence and death rates and inadequate screening. METHODS: Percentages of women who had not been screened for cervical cancer in the past 5 years were estimated using data from the 2012 Behavioral Risk Factor Surveillance System survey. State-specific cervical cancer incidence data from the United States Cancer Statistics and mortality data from the National Vital Statistics System were used to calculate incidence and death rates for 2011 by state. Incidence and death rates and annual percentage changes from 2007 to 2011 were calculated by state and U.S. Census region. RESULTS: In 2012, the percentage of women who had not been screened for cervical cancer in the past 5 years was estimated to be 11.4%; the percentage was larger for women without health insurance (23.1%) and for those without a regular health care provider (25.5%). From 2007 to 2011, the cervical cancer incidence rate decreased by 1.9% per year while the death rate remained stable. The South had the highest incidence rate (8.5 per 100,000), death rate (2.7 per 100,000), and percentage of women who had not been screened in the past 5 years (12.3%). CONCLUSIONS: Trends in cervical cancer incidence rates have decreased slightly while death rates have been stable over the last 5 years. The proportion of inadequately screened women is higher among older women, Asians/Pacific Islanders, and American Indians/Alaska Natives. IMPLICATIONS FOR PUBLIC HEALTH PRACTICE: There continue to be women who are not screened as recommended, and women who die from this preventable cancer. 
Evidence-based public health approaches are available to increase women's access to screening and timely follow-up of abnormal results. |
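The trend figures above (e.g., the 1.9% per year decline in incidence) are annual percent changes, conventionally estimated by fitting a log-linear model to the yearly rates. A sketch, assuming an ordinary least-squares fit of ln(rate) on calendar year (the rates below are constructed for illustration, not the study's data):

```python
import math

def annual_percent_change(years, rates):
    """APC (%) from a log-linear fit ln(rate) = a + b*year:
    APC = (exp(b) - 1) * 100."""
    n = len(years)
    logs = [math.log(r) for r in rates]
    mean_x = sum(years) / n
    mean_y = sum(logs) / n
    sxx = sum((x - mean_x) ** 2 for x in years)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, logs))
    slope = sxy / sxx
    return (math.exp(slope) - 1.0) * 100.0

# Hypothetical rates declining exactly 1.9% per year:
years = [2007, 2008, 2009, 2010, 2011]
rates = [8.1 * (1 - 0.019) ** (y - 2007) for y in years]
```

Run on data that declines exactly 1.9% per year, the function recovers an APC of -1.9.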
Neoplasms misdiagnosed as "chronic Lyme disease"
Nelson C , Elmendorf S , Mead P . JAMA Intern Med 2014 175 (1) 132-3 Clinical features of Lyme disease include erythema migrans rash, facial palsy, arthritis, and peripheral neuropathy. In endemic areas, patients with erythema migrans can be diagnosed clinically. Otherwise, diagnosis is based on the history of possible exposure, compatible clinical features, and positive 2-tier serologic testing.1 | Chronic Lyme disease is a loosely defined diagnosis given by a small number of physicians—who are not usually infectious disease experts—to patients with various nonspecific symptoms, including patients with no objective evidence of Lyme disease.2 In addition to adverse outcomes from unconventional treatments for chronic Lyme disease,3,4 patients misdiagnosed with chronic Lyme disease may be harmed when their actual condition remains untreated. | We report 3 cases in which diagnosis of the patients’ actual conditions was delayed due to the misdiagnosis of chronic Lyme disease. Institutional review board approval was not obtained for this case series because it did not meet the regulatory definition of research and was outside the scope of institutional review board requirements. All 3 patients gave written informed consent to share their medical records for this case series. |
Premature deaths among children with epilepsy - South Carolina, 2000-2011
Selassie AW , Wilson DA , Malek AM , Wagner JL , Smith G , Martz G , Edwards J , Wannamaker B , Zack MM , Kobau R . MMWR Morb Mortal Wkly Rep 2014 63 (44) 989-994 Epilepsy is a common childhood neurologic disorder. In 2007, epilepsy affected an estimated 450,000 children aged 0-17 years in the United States. Approximately 53% of children with epilepsy and special health care needs have co-occurring conditions, and only about one third have access to comprehensive care. The few studies of mortality risk among children with epilepsy as compared with the general population generally find a higher risk for death among children with epilepsy with co-occurring conditions but a similar risk for death among children with epilepsy with no co-occurring conditions. However, samples from these mortality studies are often small, limiting comparisons, and are not representative. This highlights the need for expanded mortality surveillance among children with epilepsy to better understand their excess mortality. This report describes mortality among children with epilepsy in South Carolina during 2000-2011 by demographic characteristics and underlying causes of death. The overall mortality rate among children with epilepsy was 8.8 deaths per 1,000 person-years, and the annual risk for death was 0.84%. Developmental conditions, cardiovascular disorders, and injuries were the most common causes of death among children with epilepsy. Team-based care coordination across medical and nonmedical systems can improve outcomes and reduce health care costs for children with special health care needs, but such approaches require more study among children with epilepsy. Ensuring appropriate and timely health care and social services for children with epilepsy, especially those with complications, might reduce the risk for premature death. 
Health care providers, social service providers, advocacy groups and others can work together to assess whether coordinated care can improve outcomes for children with epilepsy. |
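The two summary figures above are related: a mortality rate per 1,000 person-years converts to an approximate annual risk under a constant-hazard (exponential) assumption, annual risk = 1 - exp(-rate). A minimal sketch (the exponential assumption is ours; for a rate of 8.8 per 1,000 person-years it gives about 0.88%, close to but not exactly the reported 0.84%, which was computed from the cohort's actual denominators):

```python
import math

def rate_per_1000py(deaths, person_years):
    """Crude mortality rate per 1,000 person-years."""
    return 1000.0 * deaths / person_years

def annual_risk_from_rate(rate_per_1000):
    """Approximate 1-year risk of death assuming a constant hazard:
    risk = 1 - exp(-rate), with the rate expressed per person-year."""
    return 1.0 - math.exp(-rate_per_1000 / 1000.0)
```

For rates this small, risk and rate are numerically close, which is why the two reported figures look similar.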
The alliance to reduce disparities in diabetes: infusing policy and system change with local experience
Goode TD , Jack L Jr . Health Promot Pract 2014 15 6s-10s This supplement provides a comprehensive and in-depth examination of proven clinical-community health strategies employed by the Alliance to Reduce Disparities in Diabetes, across five sites located in diverse geographic regions of the United States, including a tribal community. Alliance projects in these communities focused on African Americans, Hispanics/Latinos, and American Indians as priority populations. Each project was implemented with an understanding that there are cultural norms, community characteristics, and health care system challenges that require sustained multicomponent approaches to ameliorate factors that exacerbate poor disease management and health outcomes. The articles increase understanding of what is required to implement evidence-based approaches shaped by local experiences in order to meet the needs of diverse communities affected by diabetes. Lessons learned have generic elements that can be used in other priority populations and settings. |
Arthritis among veterans - United States, 2011-2013
Murphy LB , Helmick CG , Allen KD , Theis KA , Baker NA , Murray GR , Qin J , Hootman JM , Brady TJ , Barbour KE . MMWR Morb Mortal Wkly Rep 2014 63 (44) 999-1003 Arthritis is among the most common chronic conditions among veterans and is more prevalent among veterans than nonveterans. Contemporary population-based estimates of arthritis prevalence among veterans are needed because previous population-based studies predate the Persian Gulf War, were small, or studied men only despite the fact that women comprise an increasing proportion of military personnel and typically have a higher prevalence of arthritis than men. To address this knowledge gap, CDC analyzed combined 2011, 2012, and 2013 Behavioral Risk Factor Surveillance System (BRFSS) data among all adults aged ≥18 years, by veteran status, to estimate the total and sex-specific prevalence of doctor-diagnosed arthritis overall and by sociodemographic categories, and the state-specific prevalence (overall and sex-specific) of doctor-diagnosed arthritis. This report summarizes the results of these analyses, which found that one in four veterans reported that they had arthritis (25.6%) and that prevalence was higher among veterans than nonveterans across most sociodemographic categories, including sex (prevalence among male and female veterans was 25.0% and 31.3%, respectively). State-specific, age-standardized arthritis prevalence among veterans ranged from 18.8% in Hawaii to 32.7% in West Virginia. Veterans comprise a large and important target group for reducing the growing burden of arthritis. Those interested in veterans' health can help to improve the quality of life of veterans by ensuring that they have access to affordable, evidence-based, physical activity and self-management education classes that reduce the adverse effects of arthritis (e.g., pain and depression) and its common comorbidities (e.g., heart disease and diabetes). |
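The state-specific figures above are age-standardized: stratum-specific prevalences weighted by a standard population's age distribution (direct standardization), so that states with older veteran populations are comparable with younger ones. A minimal sketch (BRFSS analyses typically standardize to the projected year 2000 U.S. population; the strata and numbers below are invented for illustration):

```python
def age_standardized_prevalence(stratum_prev, standard_pop):
    """Direct standardization: weight each age stratum's prevalence
    by the standard population's share of that stratum."""
    total = sum(standard_pop.values())
    return sum(stratum_prev[age] * standard_pop[age] / total
               for age in stratum_prev)

# Hypothetical age strata (illustrative values only):
prev = {"18-44": 0.08, "45-64": 0.30, "65+": 0.50}
std  = {"18-44": 110_000, "45-64": 80_000, "65+": 50_000}
standardized = age_standardized_prevalence(prev, std)
```

The same crude prevalences yield different standardized values under different standard populations, which is why the standard must be stated.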
Increasing HIV-1 molecular complexity among men who have sex with men in Bangkok
Leelawiwat W , Rutvisuttinunt W , Arroyo M , Mueanpai F , Kongpechsatit O , Chonwattana W , Chaikummao S , de Souza M , vanGriensven DF , McNicholl JM , Curlin M . AIDS Res Hum Retroviruses 2014 31 (4) 393-400 BACKGROUND: In Thailand, new HIV-1 infections are largely concentrated in certain risk groups such as men who have sex with men (MSM), where annual incidence may be as high as 12% per year. The paucity of information on the molecular epidemiology of HIV-1 in Thai MSM limits progress in understanding the epidemic and developing new prevention methods. We evaluated HIV-1 subtypes in seroincident and seroprevalent HIV-1 infected men enrolled in the Bangkok MSM Cohort Study (BMCS) between 2006 and 2011. METHODS: We characterized HIV-1 subtype in 231 seroprevalent and 194 seroincident subjects using the multihybridization assay (MHA). Apparent dual infections, recombinant strains, and isolates found to be non-typeable by MHA were further characterized by targeted genomic sequencing. RESULTS: Most subjects were infected with HIV-1 CRF01_AE (82%), followed by infections with recombinants (11%, primarily CRF01_AE/B recombinants), subtype B (5%), and dual infections (2%). More than 11 distinct chimeric patterns were observed among CRF01_AE/B recombinants, most involving recombination within integrase. A significant increase in the proportion of non-typeable strains was observed among seroincident MSM between 2006 and 2011. CONCLUSION: CRF01_AE and subtype B were the most and least common infecting strains, respectively. The predominance of CRF01_AE among HIV-1 infections in Thai MSM participating in the BMCS parallels trends observed in Thai heterosexuals and injecting drug users. The presence of complex recombinants, and a significant rise in non-typeable strains suggest ongoing changes in the genetic makeup of the HIV-1 epidemic in Thailand, which may pose challenges for HIV-1 prevention efforts and vaccine development. |
Shigellosis with decreased susceptibility to azithromycin
Heiman KE , Grass JE , Sjolund-Karlsson M , Bowen A . Pediatr Infect Dis J 2014 33 (11) 1204-5 Shigella with decreased susceptibility to azithromycin (DSA-Shigella) is emerging in the United States.1 This is concerning because azithromycin is recommended for treatment of multidrug-resistant shigellosis among children and adults.2 In the United States, Shigella causes approximately 500,000 illnesses annually, mainly in children <10 years of age, and it can cause large school- and childcare-associated outbreaks.3 Because clinical guidelines for determining susceptibility of Shigella to azithromycin do not exist, DSA-Shigella isolates are difficult to identify and treatment decisions must be made without azithromycin susceptibility data. | We identified DSA-Shigella isolates through the National Antimicrobial Resistance Monitoring System (NARMS), which in 2011 began measuring azithromycin minimum inhibitory concentrations among all Shigella isolates submitted from public health laboratories to Centers for Disease Control and Prevention for routine surveillance and outbreak evaluation (∼5% of US Shigella isolates). Additional DSA-Shigella isolates were identified through NARMS retrospective studies.1 We defined DSA as azithromycin minimum inhibitory concentration >16 μg/mL using broth microdilution.1 Macrolide resistance genes mphA and ermB were detected using polymerase chain reaction. |
Strong agreement of nationally recommended retention measures from the Institute of Medicine and Department of Health and Human Services
Rebeiro PF , Horberg MA , Gange SJ , Gebo KA , Yehia BR , Brooks JT , Buchacz K , Silverberg MJ , Gill J , Moore RD , Althoff KN . PLoS One 2014 9 (11) e111772 OBJECTIVE: We sought to quantify agreement between Institute of Medicine (IOM) and Department of Health and Human Services (DHHS) retention indicators, which have not been compared in the same population, and assess clinical retention within the largest HIV cohort collaboration in the U.S. DESIGN: Observational study from 2008-2010, using clinical cohort data in the North American AIDS Cohort Collaboration on Research and Design (NA-ACCORD). METHODS: Retention definitions used HIV primary care visits. The IOM retention indicator was: ≥2 visits, ≥90 days apart, each calendar year. This was extended to a 2-year period; retention required meeting the definition in both years. The DHHS retention indicator was: ≥1 visit each semester over 2 years, each ≥60 days apart. Kappa statistics detected agreement between indicators and C statistics (areas under Receiver-Operating Characteristic curves) from logistic regression analyses summarized discrimination of the IOM indicator by the DHHS indicator. RESULTS: Among 36,769 patients in 2008-2009 and 34,017 in 2009-2010, there were higher percentages of participants retained in care under the IOM indicator than the DHHS indicator (80% vs. 75% in 2008-2009; 78% vs. 72% in 2009-2010, respectively) (p<0.01), persisting across all demographic and clinical characteristics (p<0.01). There was high agreement between indicators overall (kappa = 0.83 in 2008-2009; kappa = 0.79 in 2009-2010, p<0.001), and C statistics revealed a very strong ability to predict retention according to the IOM indicator based on DHHS indicator status, even within characteristic strata. 
CONCLUSIONS: Although the IOM indicator consistently reported higher retention in care compared with the DHHS indicator, there was strong agreement between IOM and DHHS retention indicators in a cohort demographically similar to persons living with HIV/AIDS in the U.S. Persons with poorer retention represent subgroups of interest for retention improvement programs nationally, particularly in light of the White House Executive Order on the HIV Care Continuum. |
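The kappa statistics above measure agreement between the two binary retention indicators beyond what chance alone would produce: kappa = (observed agreement - expected agreement) / (1 - expected agreement). A minimal sketch with hypothetical per-patient flags (the flags below are invented for illustration, not NA-ACCORD data):

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two binary indicators (1 = retained in care)."""
    n = len(a)
    # Observed agreement: share of patients the indicators classify alike.
    po = sum(x == y for x, y in zip(a, b)) / n
    # Expected agreement under independence of the two indicators.
    pa1, pb1 = sum(a) / n, sum(b) / n
    pe = pa1 * pb1 + (1 - pa1) * (1 - pb1)
    return (po - pe) / (1 - pe)

# Hypothetical retention flags for 10 patients under each indicator:
iom  = [1, 1, 1, 1, 1, 1, 1, 1, 0, 0]
dhhs = [1, 1, 1, 1, 1, 1, 1, 0, 0, 0]
```

Values near 0.8, like those reported, indicate substantial-to-excellent agreement on common benchmarks.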
Photovoice: a novel approach to improving antituberculosis treatment adherence in Pune, India
Shelke SC , Adhav PS , Moonan PK , Willis M , Parande MA , Satyanarayana S , Kshirsagar VD , Ghosh S . Tuberc Res Treat 2014 2014 302601 We compared antituberculosis treatment (ATT) adherence and outcomes among patients exposed to Photovoice (video of previously cured TB patients sharing experiences about TB treatment) versus those not exposed. The odds of successful outcome (i.e., cured or completing treatment) for the 135 patients who watched Photovoice were nearly 3 times greater (odds ratio: 2.8; 95% CI: 1.3-6.1) than for patients who did not watch Photovoice. The comparison group, on average, missed more doses (10.9 doses; 95% CI: 6.6-11.1) than the intervention group who saw Photovoice (5.5 doses; 95% CI: 3.7-6.1). Using Photovoice at initiation of ATT has the potential to improve treatment adherence and outcomes. |
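The reported effect is an odds ratio from a 2x2 table of Photovoice exposure by treatment outcome, typically with a Wald-type confidence interval computed on the log scale. A sketch using hypothetical cell counts chosen to reproduce an OR of 2.8 (the study's actual cell counts and CI method are not given in the abstract):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and Wald 95% CI from a 2x2 table:
    a = exposed successes, b = exposed failures,
    c = unexposed successes, d = unexposed failures."""
    or_ = (a * d) / (b * c)
    # Standard error of ln(OR) is sqrt of summed reciprocal cell counts.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts (135 exposed patients, as in the study;
# the split into successes/failures is invented):
or_, lo, hi = odds_ratio_ci(120, 15, 100, 35)
```

With these invented cells the OR is exactly 2.8; the resulting CI differs from the published 1.3-6.1 because the true cell counts differ.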
Establishment of a community care center for isolation and management of Ebola patients - Bomi County, Liberia, October 2014
Logan G , Vora NM , Nyensuah TG , Gasasira A , Mott J , Walke H , Mahoney F , Luce R , Flannery B . MMWR Morb Mortal Wkly Rep 2014 63 (44) 1010-1012 As of October 29, 2014, a total of 6,454 Ebola virus disease (Ebola) cases had been reported in Liberia by the Liberian Ministry of Health and Social Welfare, with 2,609 deaths. Although the national strategy for combating the ongoing Ebola epidemic calls for construction of Ebola treatment units (ETUs) in all 15 counties of Liberia, only a limited number are operational, and most of these are within Montserrado County. ETUs are intended to improve medical care delivery to persons whose illnesses meet Ebola case definitions, while also allowing for the safe isolation of patients to break chains of transmission in the community. Until additional ETUs are constructed, the Ministry of Health and Social Welfare is supporting development of community care centers (CCCs) for isolation of patients who are awaiting Ebola diagnostic test results and for provision of basic care (e.g., oral rehydration salts solutions) to patients confirmed to have Ebola who are awaiting transfer to ETUs. CCCs often have less bed capacity than ETUs and are frequently placed in areas not served by ETUs; if built rapidly enough and in sufficient quantity, CCCs will allow Ebola-related health measures to reach a larger proportion of the population. Staffing requirements for CCCs are frequently lower than for ETUs because CCCs are often designed such that basic patient needs such as food are provided for by friends and family of patients rather than by CCC staff. (It is customary in Liberia for friends and family to provide food for hospitalized patients.) 
Creation of CCCs in Liberia has been led by county health officials and nongovernmental organizations, and this local, community-based approach is intended to destigmatize Ebola, to encourage persons with illness to seek care rather than remain at home, and to facilitate contact tracing of exposed family members. This report describes one Liberian county's approach to establishing a CCC. |
Estimating the cost to U.S. health departments to conduct HIV surveillance
Shrestha RK , Sansom SL , Laffoon BT , Farnham PG , Shouse RL , MacMaster K , Hall HI . Public Health Rep 2014 129 (6) 496-504 OBJECTIVES: HIV case surveillance is a primary source of information for monitoring HIV burden in the United States and guiding the allocation of prevention and treatment funds. While the number of people living with HIV and the need for surveillance data have increased, little is known about the cost of surveillance. We estimated the economic cost to health departments of conducting high-quality HIV case surveillance. METHODS: We collected primary data on the unit cost and quantity of resources used to operate the HIV case surveillance program in Michigan, where HIV burden (i.e., the number of HIV cases) is moderate to high (n=14,864 cases). Based on Michigan's data, we projected the expected annual HIV surveillance cost for U.S., state, local, and territorial health departments. We based our cost projection on the variation in the number of new and established cases, area-specific wages, and potential economies of scale. RESULTS: We estimated the annual total HIV surveillance cost to the Michigan health department to be $1,286,524 ($87/case), the annual total cost of new cases to be $108,657 ($133/case), and the annual total cost of established cases to be $1,177,867 ($84/case). Our projected median annual HIV surveillance cost per health department ranged from $210,600 in low-HIV burden sites to $1,835,000 in high-HIV burden sites. CONCLUSIONS: Our analysis shows that a systematic approach to costing HIV surveillance at the health department level is feasible. For HIV surveillance, a substantial portion of total surveillance costs is attributable to maintaining established cases. |
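The per-case figures above follow directly from dividing each cost component by its case count. A sketch that reproduces the reported rounding (the new-case count of 817 is back-calculated from $108,657 at about $133/case, so it is an approximation, not a figure stated in the study):

```python
def surveillance_cost_summary(total_cost, new_cases, new_case_cost,
                              total_cases):
    """Split total surveillance cost into new vs. established cases
    and report rounded cost per case, mirroring the Michigan breakdown."""
    established_cost = total_cost - new_case_cost
    established_cases = total_cases - new_cases
    return {
        "per_case_overall": round(total_cost / total_cases),
        "per_case_new": round(new_case_cost / new_cases),
        "per_case_established": round(established_cost / established_cases),
    }

summary = surveillance_cost_summary(
    total_cost=1_286_524, new_cases=817,
    new_case_cost=108_657, total_cases=14_864)
```

This recovers the reported $87/case overall, $133/new case, and $84/established case, and makes visible why established cases dominate total cost: they make up nearly all of the caseload.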
Etiologic agents of central nervous system infections among febrile hospitalized patients in the country of Georgia
Akhvlediani T , Bautista CT , Shakarishvili R , Tsertsvadze T , Imnadze P , Tatishvili N , Davitashvili T , Samkharadze T , Chlikadze R , Dvali N , Dzigua L , Karchava M , Gatserelia L , Macharashvili N , Kvirkvelia N , Habashy EE , Farrell M , Rowlinson E , Sejvar J , Hepburn M , Pimentel G , Dueger E , House B , Rivard R . PLoS One 2014 9 (11) e111393 OBJECTIVES: There is a large spectrum of viral, bacterial, fungal, and prion pathogens that cause central nervous system (CNS) infections. As such, identification of the etiological agent requires multiple laboratory tests and accurate diagnosis requires clinical and epidemiological information. This hospital-based study aimed to determine the main causes of acute meningitis and encephalitis and enhance laboratory capacity for CNS infection diagnosis. METHODS: Child and adult patients clinically diagnosed with meningitis or encephalitis were enrolled at four reference health centers. Cerebrospinal fluid (CSF) was collected for bacterial culture, and in-house and multiplex RT-PCR testing was conducted for herpes simplex virus (HSV) types 1 and 2, mumps virus, enterovirus, varicella zoster virus (VZV), Streptococcus pneumoniae, Haemophilus influenzae type b (Hib) and Neisseria meningitidis. RESULTS: Among the 140 enrolled patients, the mean age was 23.9 years, and 58% were children. Bacterial or viral etiologies were determined in 51% of patients. Five Streptococcus pneumoniae cultures were isolated from CSF. Based on in-house PCR analysis, 25 patients were positive for S. pneumoniae, 6 for N. meningitidis, and 1 for H. influenzae. Viral multiplex PCR identified infections with enterovirus (n = 26), VZV (n = 4), and HSV-1 (n = 2). No patient was positive for mumps or HSV-2. CONCLUSIONS: Study findings indicate that S. pneumoniae and enteroviruses are the main etiologies in this patient cohort. 
The utility of molecular diagnostics for pathogen identification combined with the knowledge provided by the investigation may improve health outcomes of CNS infection cases in Georgia. |
First population-level effectiveness evaluation of a national programme to prevent HIV transmission from mother to child, South Africa
Goga AE , Dinh TH , Jackson DJ , Lombard C , Delaney KP , Puren A , Sherman G , Woldesenbet S , Ramokolo V , Crowley S , Doherty T , Chopra M , Shaffer N , Pillay Y . J Epidemiol Community Health 2014 69 (3) 240-8 BACKGROUND: There is a paucity of data on the national population-level effectiveness of preventing mother-to-child transmission (PMTCT) programmes in high-HIV-prevalence, resource-limited settings. We assessed national PMTCT impact in South Africa (SA), 2010. METHODS: A facility-based survey was conducted using a stratified multistage, cluster sampling design. A nationally representative sample of 10 178 infants aged 4-8 weeks was recruited from 565 clinics. Data collection included caregiver interviews, record reviews and infant dried blood spots to identify HIV-exposed infants (HEI) and HIV-infected infants. During analysis, self-reported antiretroviral (ARV) use was categorised: 1a: triple ARV treatment; 1b: azidothymidine >10 weeks; 2a: azidothymidine ≤10 weeks; 2b: incomplete ARV prophylaxis; 3a: no antenatal ARV and 3b: missing ARV information. Findings were adjusted for non-response, survey design and weighted for live-birth distributions. RESULTS: Nationally, 32% of live infants were HEI; early mother-to-child transmission (MTCT) was 3.5% (95% CI 2.9% to 4.1%). In total, 29.4% of HEI were born to mothers on triple ARV treatment (category 1a), 55.6% on prophylaxis (1b, 2a, 2b), 9.5% received no antenatal ARV (3a) and 5.5% had missing ARV information (3b). Controlling for other factors, groups 1b and 2a had similar MTCT to 1a (Ref; adjusted OR (AOR) for 1b, 0.98, 0.52 to 1.83; and 2a, 1.31, 0.69 to 2.48). MTCT was higher in group 2b (AOR 3.68, 1.69 to 7.97). Within group 3a, early MTCT was highest among breastfeeding mothers: 11.50% (4.67% to 18.33%) for exclusive breast feeding, 11.90% (7.45% to 16.35%) for mixed breast feeding, and 3.45% (0.53% to 6.35%) for no breast feeding. 
Antiretroviral therapy or >10 weeks prophylaxis negated this difference (MTCT 3.94%, 1.98% to 5.90%; 2.07%, 0.55% to 3.60% and 2.11%, 1.28% to 2.95%, respectively). CONCLUSIONS: SA, a high-HIV-prevalence, middle-income country, achieved <5% MTCT by 4-8 weeks post partum. The long-term impact of PMTCT on HIV-free survival needs urgent assessment. |
HIV, chlamydia, gonorrhea, and primary and secondary syphilis among American Indians and Alaska Natives within Indian Health Service Areas in the United States, 2007-2010
Walker FJ , Llata E , Doshani M , Taylor MM , Bertolli J , Weinstock HS , Hall HI . J Community Health 2014 40 (3) 484-92 National rates from human immunodeficiency virus (HIV) and sexually transmitted disease (STD) surveillance may not effectively convey the impact of HIV and STDs on American Indian/Alaska Native (AI/AN) communities. Instead, we compared average annual diagnosis rates per 100,000 population of HIV, chlamydia (CT), gonorrhea (GC), and primary and secondary (P&S) syphilis, from 2007 to 2010, among AI/AN aged ≥13 years residing in 625 counties in the 12 Indian Health Service (IHS) Areas, all AI/AN, and all races/ethnicities to address this gap. AI/AN comprised persons reported as AI/AN only, with or without Hispanic ethnicity. Out of 12 IHS Areas, 10 had higher case rates for CT, 3 for GC, and 4 for P&S syphilis compared to rates for all races/ethnicities. Eight Areas had higher HIV diagnosis rates than for all AI/AN, but HIV rates for all IHS Areas were lower than national rates for all races/ethnicities. Two IHS Areas ranking highest in rates of CT and GC and four Areas with highest P&S syphilis also had high HIV rates. STD and HIV rates among AI/AN were greater in certain IHS Areas than expected from observing national rates for AI/AN. Integrated surveillance of overlapping trends in STDs and HIV may be useful in guiding prevention efforts for AI/AN populations. |
The impact of external donor support through the U.S. President's Emergency Plan for AIDS Relief on the cost of red cell concentrate in Namibia, 2004-2011
Pitman JP , Bocking A , Wilkinson R , Postma MJ , Basavaraju SV , Von Finckenstein B , Mataranyika M , Marfin AA , Lowrance DW , Sibinga CT . Blood Transfus 2014 13 (2) 1-8 BACKGROUND: External assistance can rapidly strengthen health programmes in developing countries, but such funding can also create sustainability challenges. From 2004-2011, the U.S. President's Emergency Plan for AIDS Relief (PEPFAR) provided more than $8 million to the Blood Transfusion Service of Namibia (NAMBTS) for supplies, equipment, and staff salaries. This analysis describes the impact that support had on actual production costs and the unit prices charged for red cell concentrate (RCC) units issued to public sector hospitals. MATERIALS AND METHODS: A costing system developed by NAMBTS to set public sector RCC unit prices was used to describe production costs and unit prices during the period of PEPFAR scale-up (2004-2009) and the 2 years in which PEPFAR support began to decline (2010-2011). Hypothetical production costs were estimated to illustrate differences had PEPFAR support not been available. RESULTS: Between 2004-2006, NAMBTS sold 22,575 RCC units to public sector facilities. During this time, RCC unit prices exceeded per unit cost-recovery targets by between 40.3% (US$16.75 or N$109.86) and 168.3% (US$48.72 or N$333.28) per year. However, revenue surpluses dwindled between 2007 and 2011, the final year of the study period, when NAMBTS sold 20,382 RCC units to public facilities but lost US$23.31 (N$170.43) on each unit. DISCUSSION: PEPFAR support allowed NAMBTS to leverage domestic cost-recovery revenue to rapidly increase blood collections and the distribution of RCC. However, external support kept production costs lower than they would have been without PEPFAR. If PEPFAR funds had not been available, RCC prices would have needed to increase by 20% per year to have met annual cost-recovery targets and funded the same level of investments as were made with PEPFAR support. 
Tracking the subsidising influence of external support can help blood services make strategic investments and plan for unit price increases as external funds are withdrawn. |
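The cost-recovery arithmetic quoted in the abstract can be reconstructed as follows. This is a minimal sketch: the NAMBTS costing system itself is not described here, and the `implied_target` relationship (surplus = price − target, surplus fraction = surplus / target) is inferred from the quoted figures, not taken from the paper.

```python
# Hedged sketch: back out the annual cost-recovery target implied by a
# reported per-unit surplus, using figures quoted in the abstract.

def implied_target(unit_surplus, surplus_fraction):
    """Cost-recovery target implied by surplus = price - target
    and surplus_fraction = surplus / target."""
    return unit_surplus / surplus_fraction

def compounded_price(base_price, annual_increase, years):
    """Unit price after `years` of a constant annual percentage increase."""
    return base_price * (1 + annual_increase) ** years

# 2004-2006 lower bound: prices exceeded the target by 40.3%, i.e. US$16.75/unit
target = implied_target(16.75, 0.403)   # implied target, ~US$41.56 per RCC unit
price = target + 16.75                  # implied price charged, ~US$58.31

# Counterfactual from the discussion: 20% annual increases over 5 years
# would roughly have doubled the unit price (factor ~2.49).
print(round(target, 2), round(price, 2), round(compounded_price(1.0, 0.20, 5), 2))
# -> 41.56 58.31 2.49
```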
Community-based evaluation of PMTCT uptake in Nyanza province, Kenya
Kohler PK , Okanda J , Kinuthia J , Mills LA , Olilo G , Odhiambo F , Laserson KF , Zierler B , Voss J , John-Stewart G . PLoS One 2014 9 (10) e110110 INTRODUCTION: Facility-based assessments of prevention of mother-to-child HIV transmission (PMTCT) programs may overestimate population coverage. There are few community-based studies that evaluate PMTCT coverage and uptake. METHODS: During 2011, a cross-sectional community survey among women who gave birth in the prior year was performed using the KEMRI-CDC Health and Demographic Surveillance System in Western Kenya. A random sample (n = 405) and a sample of women known to be HIV-positive through previous home-based testing (n = 247) were enrolled. Rates and correlates of uptake of antenatal care (ANC), HIV testing, and antiretrovirals (ARVs) were determined. RESULTS: Among 405 women in the random sample, 379 (94%) reported accessing ANC, most of whom (87%) were HIV tested. Uptake of HIV testing was associated with employment, higher socioeconomic status, and partner HIV testing. Among 247 known HIV-positive women, 173 (70%) self-disclosed their HIV status. Among 216 self-reported HIV-positive women (including 43 from the random sample), 82% took PMTCT ARVs, with 54% completing the full antenatal, peripartum, and postpartum course. Maternal ARV use was associated with more ANC visits and having an HIV-tested partner. ARV use during delivery was lowest (62%) and associated with facility delivery. Eighty percent of HIV-infected women reported having their infant HIV tested; 11% of these reported that the child was HIV-infected, 76% uninfected, 6% declined to say, and 7% did not recall; 79% of infected children were reportedly receiving HIV care and treatment. CONCLUSIONS: Community-based assessments provide data that complement clinic-based PMTCT evaluations. In this survey, antenatal HIV test uptake was high, and most HIV-infected women received ARVs, though many women did not self-disclose their HIV status to the field team.
Community-driven strategies that encourage early ANC, partner involvement, and skilled delivery, and provide PMTCT education, may facilitate further reductions in vertical transmission. |
Introduction: health equity among incarcerated female adolescents and adult women: infectious and other disease morbidity
LeBlanc TT , Reid L , Dean HD , Green Y . Women Health 2014 54 (8) 687-693 The number of persons under correctional supervision in the United States increased in the mid-1970s and peaked in 2009 (Bureau of Justice Statistics, 2013). Though incarcerated populations declined slightly in subsequent years, the United States continues to have one of the highest rates of incarceration among developed nations and in the world, with 1 in 4 American adults behind bars (Pew Center on the States, 2012). Though detained populations are predominantly male, in the past 30 years the number of women inmates in correctional facilities has increased dramatically. From 1977 to 2004, the number of U.S. female prisoners serving more than a year grew by 757%, while the number of male prisoners grew by 388% over the same period (Frost, Greene, & Pranis, 2006). The growth of the female jail and prison population has surpassed male inmate population growth in 50 states (Frost, Greene, & Pranis, 2006). From 2000 to 2009, the number of women incarcerated in state or federal prisons rose by 21.6%, compared with a 15.6% increase for men (Mauer, 2013). | Nationally, there are more than eight times as many women under correctional supervision as there were in 1980 (American Civil Liberties Union, 2006). The United States has the highest incarceration rate for women in the world. In 2006, the rate was approximately 123 per 100,000 for women, much higher than the rates in England (17 per 100,000), France (6 per 100,000), Russia (73 per 100,000), and Thailand (88 per 100,000) (Hartney, 2006). |
A new strategy for public health surveillance at CDC: improving national surveillance activities and outcomes
Richards CL , Iademarco MF , Anderson TC . Public Health Rep 2014 129 (6) 472-6 Public health surveillance is the cornerstone of public health practice and can be defined as the “… systematic, ongoing collection, management, analysis, and interpretation of data followed by the dissemination of these data to public health programs to stimulate public health action.”1 Stakeholders in the United States at all levels of government (i.e., federal and state, territorial, local, and tribal [STLT]), in academia and industry, and the general public rely on high-quality, timely surveillance data to detect and monitor diseases, injuries, and conditions; assess the impact of interventions; and assist in the management of large-scale disease incidents. Surveillance data are crucially important to inform policy changes, guide new program interventions, sharpen public communications, and help agencies assess research investments. | The public health surveillance enterprise in the U.S. is a long-term partnership that operates through thousands of agencies at the federal and STLT levels. The U.S. Centers for Disease Control and Prevention (CDC) generally does not collect public health surveillance information directly, but relies on state and local health departments and other systems to do so. CDC, however, plays an important collaborative role in aggregating, analyzing, and disseminating surveillance data; creating tools for surveillance; providing technical assistance to states and territories; researching surveillance policy; and funding surveillance activities. In the past few years, observers inside and outside CDC have identified some of the most important influences shaping surveillance in the 21st century (e.g., security concerns, technological advances, and health-care reform) and how these influences may affect the surveillance enterprise. 
Observers have touched on the need for ongoing evaluation of surveillance systems; standardization, with the goal of developing sustainable and integrated systems; and system and workforce adaptability to current demands. These observers have recognized many challenges that could impede progress, such as funding, workforce, information technology standards, patient confidentiality, and concerns about data access, quality, and sharing.1–3 For example, one fundamental challenge is the tension, both at the federal and STLT levels, between the needs of the whole surveillance enterprise and specific disease control programs, which require specialized surveillance data and are organized and funded along disease-specific lines. |
Restaurant manager and worker food safety certification and knowledge
Brown LG , Le B , Wong MR , Reimann D , Nicholas D , Faw B , Davis E , Selman CA . Foodborne Pathog Dis 2014 11 (11) 835-43 Over half of foodborne illness outbreaks occur in restaurants. To combat these outbreaks, many public health agencies require food safety certification for restaurant managers, and sometimes workers. Certification entails passing a food safety knowledge examination, which is typically preceded by food safety training. Current certification efforts are based on the assumption that certification leads to greater food safety knowledge. The Centers for Disease Control and Prevention conducted this study to examine the relationship between food safety knowledge and certification. We also examined the relationships between food safety knowledge and restaurant, manager, and worker characteristics. We interviewed managers (N=387) and workers (N=365) about their characteristics and assessed their food safety knowledge. Analyses showed that certified managers and workers had greater food safety knowledge than noncertified managers and workers. Additionally, managers and workers whose primary language was English had greater food safety knowledge than those whose primary language was not English. Other factors associated with greater food safety knowledge included working in a chain restaurant, working in a larger restaurant, having more experience, and having more duties. These findings indicate that certification improves food safety knowledge, and that complex relationships exist among restaurant, manager, and worker characteristics and food safety knowledge. |
Computational framework for next-generation sequencing of heterogeneous viral populations using combinatorial pooling.
Skums P , Artyomenko A , Glebova O , Ramachandran S , Mandoiu I , Campo DS , Dimitrova Z , Zelikovsky A , Khudyakov Y . Bioinformatics 2014 31 (5) 682-90 MOTIVATION: Next-generation sequencing (NGS) allows for analyzing a large number of viral sequences from infected patients, providing an opportunity to implement large-scale molecular surveillance of viral diseases. However, despite improvements in technology, traditional protocols for NGS of large numbers of samples are still highly cost- and labor-intensive. One of the possible cost-effective alternatives is combinatorial pooling. Although a number of pooling strategies for consensus sequencing of DNA samples and detection of SNPs have been proposed, these strategies cannot be applied to sequencing of highly heterogeneous viral populations. RESULTS: We developed a cost-effective and reliable protocol for sequencing of viral samples that combines NGS using barcoding and combinatorial pooling with a computational framework including algorithms for optimal virus-specific pool design and deconvolution of individual samples from sequenced pools. Evaluation of the framework on experimental and simulated data for hepatitis C virus showed that it substantially reduces sequencing costs and allows deconvolution of viral populations with high accuracy. AVAILABILITY: The source code and experimental data sets are available at http://alan.cs.gsu.edu/NGS/?q=content/pooling |
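The pooling idea underlying this kind of protocol can be illustrated with classic non-adaptive group testing: give each sample a unique binary "signature" over a small number of pools, and attribute a variant seen in exactly that set of pools back to the sample. This is a toy sketch of the general principle only, not the paper's virus-specific pool-design or deconvolution algorithms, and all names below are hypothetical.

```python
# Toy non-adaptive combinatorial pooling (group testing) sketch.
from itertools import combinations

def design_pools(n_samples, k_pools, weight=2):
    """Assign each sample a distinct subset of `weight` pools as its signature."""
    sigs = list(combinations(range(k_pools), weight))
    if n_samples > len(sigs):
        raise ValueError("not enough distinct signatures for this many samples")
    return {s: frozenset(sigs[s]) for s in range(n_samples)}

def deconvolve(positive_pools, assignment):
    """Return samples whose signature lies within the positive pools
    (unambiguous for a single positive sample; real protocols must
    handle multiple positives and sequencing noise)."""
    pos = frozenset(positive_pools)
    return [s for s, sig in assignment.items() if sig <= pos]

assignment = design_pools(n_samples=6, k_pools=4)   # C(4,2) = 6 signatures
# Suppose only sample 3 carries a variant: exactly its two pools test positive.
positives = assignment[3]
print(deconvolve(positives, assignment))            # -> [3]
```

With 4 pools instead of 6 individual runs the toy design already saves lanes; the paper's framework optimizes this trade-off for heterogeneous viral populations.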
Non-avian animal reservoirs present a source of influenza A PB1-F2 proteins with novel virulence-enhancing markers.
Alymova IV , York IA , McCullers JA . PLoS One 2014 9 (11) e111603 PB1-F2 protein, expressed from an alternative reading frame of most influenza A virus (IAV) PB1 segments, may possess specific residues associated with enhanced inflammation (L62, R75, R79, and L82) and cytotoxicity (I68, L69, and V70). These residues were shown to increase the pathogenicity of primary viral and secondary bacterial infections in a mouse model. In contrast to human seasonal influenza strains, virulence-associated residues are present in PB1-F2 proteins from pandemic H1N1 1918, H2N2 1957, and H3N2 1968, and highly pathogenic H5N1 strains, suggesting their contribution to viruses' pathogenic phenotypes. Non-human influenza strains may act as donors of virulent PB1-F2 proteins. Previously, avian influenza strains were identified as a potential source of inflammatory, but not cytotoxic, PB1-F2 residues. Here, we analyze the frequency of virulence-associated residues in PB1-F2 sequences from IAVs circulating in mammalian species in close contact with humans: pigs, horses, and dogs. All four inflammatory residues were found in PB1-F2 proteins from these viruses. Among cytotoxic residues, I68 was the most common and was especially prevalent in equine and canine IAVs. Historically, PB1-F2 from equine (about 75%) and canine (about 20%) IAVs were most likely to have combinations of the highest numbers of residues associated with inflammation and cytotoxicity, compared to about 7% of swine IAVs. Our analyses show that, in addition to birds, pigs, horses, and dogs are potentially important sources of pathogenic PB1-F2 variants. There is a need for surveillance of IAVs with genetic markers of virulence that may be emerging from these reservoirs in order to improve pandemic preparedness and response. |
Dominant drug targets suppress the emergence of antiviral resistance.
Tanner EJ , Liu HM , Oberste MS , Pallansch M , Collett MS , Kirkegaard K . Elife 2014 3 The emergence of drug resistance can defeat the successful treatment of pathogens that display high mutation rates, as exemplified by RNA viruses. Here we detail a new paradigm in which a single compound directed against a 'dominant drug target' suppresses the emergence of naturally occurring drug-resistant variants in mice and cultured cells. All new drug-resistant viruses arise during intracellular replication and initially express their phenotypes in the presence of drug-susceptible genomes. For the targets of most anti-viral compounds, the presence of these drug-susceptible viral genomes does not prevent the selection of drug resistance. Here we show that, for an inhibitor of the function of oligomeric capsid proteins of poliovirus, the expression of drug-susceptible genomes causes chimeric oligomers to form, thus rendering the drug-susceptible genomes dominant. The use of dominant drug targets should suppress drug resistance whenever multiple genomes arise in the same cell and express products in a common milieu. |
Immunity to polio, measles and rubella in women of child-bearing age and estimated congenital rubella syndrome incidence, Cambodia, 2012
Mao B , Chheng K , Wannemuehler K , Vynnycky E , Buth S , Soeung SC , Reef S , Weldon W , Quick L , Gregory CJ . Epidemiol Infect 2014 143 (9) 1-10 Significant gaps in immunity to polio, measles, and rubella may exist in adults in Cambodia and threaten vaccine-preventable disease (VPD) elimination and control goals, despite high childhood vaccination coverage. We conducted a nationwide serological survey during November-December 2012 of 2154 women aged 15-39 years to assess immunity to polio, measles, and rubella and to estimate congenital rubella syndrome (CRS) incidence. Measles and rubella antibodies were detected by IgG ELISA and polio antibodies by microneutralization testing. Age-structured catalytic models were fitted to rubella serological data to predict CRS cases. Overall, 29.8% of women lacked immunity to at least one poliovirus (PV); seroprevalence to PV1, PV2 and PV3 was 85.9%, 93.4% and 83.3%, respectively. Rubella and measles antibody seroprevalence was 73.3% and 95.9%, respectively. In the 15-19 years age group, 48.2% [95% confidence interval (CI) 42.4-54.1] were susceptible to either PV1 or PV3, and 40.3% (95% CI 33.0-47.5) to rubella virus. Based on rubella antibody seroprevalence, we estimate that >600 infants are born with CRS in Cambodia annually. Significant numbers of Cambodian women are still susceptible to polio and rubella, especially those aged 15-19 years, emphasizing the need to include adults in VPD surveillance and a potential role for vaccination strategies targeted at adults. |
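The study's CRS projection rests on catalytic modelling of the rubella serology. As a much-simplified sketch, a constant-force-of-infection catalytic model makes the susceptible fraction at age a equal exp(-lambda*a); the paper fitted age-structured models, so the single lambda below is an illustrative assumption, seeded with the 40.3% susceptibility reported for ages 15-19.

```python
# Toy constant-force-of-infection catalytic model (illustrative only).
import math

def force_of_infection(susceptible_frac, age):
    """Solve exp(-lambda * age) = susceptible_frac for lambda."""
    return -math.log(susceptible_frac) / age

def seroprevalence(lam, age):
    """Fraction ever infected (seropositive) by age under the toy model."""
    return 1.0 - math.exp(-lam * age)

# 40.3% of women aged 15-19 years (midpoint ~17.5 y) were rubella-susceptible
lam = force_of_infection(0.403, 17.5)   # ~0.052 infections per susceptible per year
print(round(lam, 3), round(seroprevalence(lam, 30), 2))   # -> 0.052 0.79
```

A fitted force of infection like this, combined with age-specific fertility rates, is what lets such surveys translate seroprevalence into an expected annual number of CRS births.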
Declines in pneumonia hospitalizations of children aged <2 years associated with the use of pneumococcal conjugate vaccines - Tennessee, 1998-2012
Griffin MR , Mitchel E , Moore MR , Whitney CG , Grijalva CG . MMWR Morb Mortal Wkly Rep 2014 63 (44) 995-998 The 7-valent pneumococcal conjugate vaccine (PCV7) was added to the U.S. infant immunization schedule in the year 2000. By 2009, PCV7 introduction was associated with a 43% decline in all-cause pneumonia among U.S. children aged <2 years. In 2010, a new 13-valent pneumococcal conjugate vaccine (PCV13) replaced PCV7 in the infant immunization schedule, expanding protection from seven to 13 pneumococcal serotypes. To examine changes in all-cause pneumonia hospitalizations among children aged <2 years after the switch to PCV13, Tennessee hospital discharge data for 1998-2012 were analyzed. By 2012, all-cause pneumonia hospitalizations in children aged <2 years had declined an additional 27%, relative to the PCV7 years. Pneumonia hospitalizations were estimated to be 4.1 per 1,000 population in 2012, a historically low rate that represents a 72% decline from the rate before PCV7 introduction. Tennessee children aged <2 years experienced about 1,300 fewer pneumonia hospitalizations annually in 2011 and 2012 than in the years before pneumococcal conjugate vaccine (PCV) use. These data attest to the powerful impact of the PCV program on pneumonia in Tennessee children. The observed trend likely represents a major decline in pneumococcal pneumonia, which should stimulate a reassessment of current causes and appropriate management of pneumonia in children. |
Oral shedding of Marburg virus in experimentally infected Egyptian Fruit Bats (Rousettus aegyptiacus)
Amman BR , Jones ME , Sealy TK , Uebelhoer LS , Schuh AJ , Bird BH , Coleman-McCray JD , Martin BE , Nichol ST , Towner JS . J Wildl Dis 2014 51 (1) 113-24 Marburg virus (Marburg marburgvirus; MARV) causes sporadic outbreaks of Marburg hemorrhagic fever (MHF) in Africa. The Egyptian fruit bat (Rousettus aegyptiacus) has been identified as a natural reservoir, based most recently on the repeated isolation of MARV directly from bats caught at two locations in southwestern Uganda where miners and tourists separately contracted MHF from 2007-2008. Although extensive field investigations have revealed much about the ecology of MARV, unanswered questions remained, such as the primary routes of virus shedding and the severity of disease, if any, caused by MARV in infected bats. To answer these questions and others, we experimentally infected captive-bred R. aegyptiacus with MARV under high (biosafety level 4) containment. These experiments showed infection profiles consistent with R. aegyptiacus being a bona fide natural reservoir host for MARV and revealed routes of viral shedding capable of infecting humans and other animals. |
Desorption atmospheric pressure photoionization and direct analysis in real time coupled with travelling wave ion mobility mass spectrometry
Rasanen RM , Dwivedi P , Fernandez FM , Kauppila TJ . Rapid Commun Mass Spectrom 2014 28 (21) 2325-36 RATIONALE: Ambient mass spectrometry (MS) is a tool for screening analytes directly from sample surfaces. However, background impurities may complicate the spectra and therefore fast separation techniques are needed. Here, we demonstrate the use of travelling wave ion mobility spectrometry in a comparative study of two ambient MS techniques. METHODS: Desorption atmospheric pressure photoionization (DAPPI) and direct analysis in real time (DART) were coupled with travelling wave ion mobility mass spectrometry (TWIM-MS) for highly selective surface analysis. The ionization efficiencies of DAPPI and DART were compared. Test compounds were: bisphenol A, benzo[a]pyrene, ranitidine, cortisol and alpha-tocopherol. DAPPI-MS and DART-TWIM-MS were also applied to the analysis of chloroquine from dried blood spots, and alpha-tocopherol from almond surface, and DAPPI-TWIM-MS was applied to analysis of pharmaceuticals and multivitamin tablets. RESULTS: DAPPI was approximately 100 times more sensitive than DART for bisphenol A and 10-20 times more sensitive for the other compounds. The limits of detection were between 30-290 and 330-8200 fmol for DAPPI and DART, respectively. Also, from the authentic samples, DAPPI ionized chloroquine and alpha-tocopherol more efficiently than DART. The mobility separation enabled the detection of species with low signal intensities, e.g. thiamine and cholecalciferol, in the DAPPI-TWIM-MS analysis of multivitamin tablets. CONCLUSIONS: DAPPI ionized the studied compounds of interest more efficiently than DART. For both DAPPI and DART, the mobility separation prior to MS analysis reduced the amount of chemical noise in the mass spectrum and significantly increased the signal-to-noise ratio for the analytes. |
Tools for improving clinical preventive services receipt among women with disabilities of childbearing ages and beyond
Sinclair LB , Taft KE , Sloan ML , Stevens AC , Krahn GL . Matern Child Health J 2014 19 (6) 1189-201 Efforts to improve clinical preventive services (CPS) receipt among women with disabilities are poorly understood and not widely disseminated. The reported results represent a 2-year, Centers for Disease Control and Prevention and Association of Maternal and Child Health Programs partnership to develop a central resource for existing tools that are of potential use to maternal and child health practitioners who work with women with disabilities. Steps included contacting experts in the fields of disability and women's health, searching the Internet to locate examples of existing tools that may facilitate CPS receipt, convening key stakeholders from state and community-based programs to determine their potential use of the tools, and developing an online Toolbox. Nine examples of existing tools were located. The tools focused on facilitating use of the CPS guidelines, monitoring CPS receipt among women with disabilities, improving the accessibility of communities and local transportation, and training clinicians and women with disabilities. Stakeholders affirmed the relevance of these tools to their work and encouraged developing a Toolbox. The Toolbox, launched in May 2013, provides information and links to existing tools and accepts feedback and proposals for additional tools. This Toolbox offers central access to existing tools. Maternal and child health stakeholders and other service providers can better locate, adopt and implement existing tools to facilitate CPS receipt among adolescent girls with disabilities who are transitioning into adult care as well as women with disabilities of childbearing ages and beyond. |
Undertreated and untreated pain should be considered an adverse event of neonatal circumcision - reply
El Bcheraoui C , Dominguez KL , Kilmarx PH . JAMA Pediatr 2014 168 (11) 1077 We would like to thank Bisogni et al. for their comments about intra- and postoperative pain as an adverse event (AE) of male circumcision (MC). Use of appropriate analgesia for pain management is good practice that should be the standard of care during and after any surgical procedure, as it can substantially control pain1. In a prospective study of 583 neonatal circumcisions performed between December 2005 and December 2008, when appropriate analgesia was applied, 93.5% of neonates circumcised in the first week of life showed no indication of pain on an objective standardized neonatal pain rating system used by the authors2. | The recently published manuscript3, which found a low incidence of AEs (<0.5%) associated with male circumcision in U.S. medical settings during 2001–2010, is based on data from a healthcare reimbursement claims database. This database captures only diagnoses and procedures billed to third parties. The analysis studied the association between 41 AEs, not including pain, and male circumcision. A search of the same healthcare reimbursement claims database for the ICD-9 codes 338.18 (other acute postoperative pain) and 338.19 (other acute pain) detected one patient with pain associated with the male circumcision procedure among 1,400,920 circumcised males. Taking into consideration the possibility that pain may be an underreported AE in the healthcare reimbursement claims database used for this analysis, a more thorough analysis of the association between pain and male circumcision would require additional data sources, including information on the use and type of pain control methods. As previously recommended3, future researchers studying the association between AEs and male circumcision should consider using additional data sources to ascertain AEs that are not captured in reimbursement claims data. |
Leveraging birth defects surveillance data for health services research
Cassell CH , Grosse SD , Kirby RS . Birth Defects Res A Clin Mol Teratol 2014 100 (11) 815-21 In this editorial, we define health services research (HSR) and its relevance and importance to birth defects surveillance and research. We briefly discuss key HSR concepts, several types of HSR data sources, and the linkage of these data for birth defects research. We also examine some challenges in data linkages and conclude by identifying research gaps in HSR for children with birth defects and their families. | Health services research is multidisciplinary and broad in scope, examining how financing systems, organization structures and processes, social factors, health technologies and personal behaviors affect access to care, cost and quality of care, health and well-being (AHRQ, 2012). The leveraging of birth defects surveillance data for HSR has been noted as a critical strategy to further the public health research priorities for birth defects, including congenital heart defects (Oster et al., 2013a), craniosynostosis (Rasmussen et al., 2008a), Down syndrome (Rasmussen et al., 2008b) and orofacial clefts (Yazdy et al., 2007). |
Epidemiology of twinning in the National Birth Defects Prevention Study, 1997 to 2007
Dawson AL , Tinker SC , Jamieson DJ , Hobbs CA , Rasmussen SA , Reefhuis J . Birth Defects Res A Clin Mol Teratol 2014 103 (2) 85-99 BACKGROUND: Our objective was to evaluate associations between twinning and maternal demographic factors and periconceptional exposures among infants with and without orofacial clefts. METHODS: We used data from the National Birth Defects Prevention Study; 228 twins and 8242 singletons without birth defects (controls), and 117 twins and 2859 singletons with orofacial clefts, born 1997 to 2007, were included in the analyses. Because twinning can result from the use of assisted reproductive technologies, logistic regression models were computed to estimate odds ratios and 95% confidence intervals for each exposure, stratified by fertility treatment use. To evaluate factors by zygosity, we used sex-pairing data and a simulation approach to estimate the zygosity of like-sex twin pairs for unassisted conceptions. RESULTS: Among control mothers who did not use fertility treatments, predictors of twinning included non-Hispanic black maternal race (adjusted odds ratio, 1.6; 95% confidence interval, 1.0-2.4) and tobacco smoking (adjusted odds ratio, 1.6; 95% confidence interval, 1.1-2.4). Among control mothers who used fertility treatments, older maternal age, higher income, and state of residence were associated with twinning. Associations were generally stronger among mothers of dizygotic (estimated) twins than among mothers of monozygotic (estimated) twins. Results for mothers of infants with isolated orofacial clefts were similar to those for controls. CONCLUSION: We observed an increased twinning frequency with increasing maternal age, but factors such as maternal race/ethnicity and socioeconomic status may also contribute. Among women receiving fertility treatments, factors associated with twinning suggested a relation with treatment specifics (e.g., treatment type and number of embryos implanted) and availability of insurance coverage. |
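Estimating zygosity from sex-pairing data, as this study did, is classically done with Weinberg's differential rule: dizygotic pairs are equally likely to be like- or unlike-sex, so the dizygotic count is about twice the unlike-sex count. The paper used a simulation approach rather than this closed form, so the sketch below, with made-up counts, only illustrates the underlying logic.

```python
# Weinberg's differential rule for zygosity estimation (hypothetical counts).

def weinberg(like_sex_pairs, unlike_sex_pairs):
    """DZ pairs are ~equally split between like- and unlike-sex,
    so DZ ~= 2 * unlike-sex and MZ ~= total - DZ."""
    total = like_sex_pairs + unlike_sex_pairs
    dz = 2 * unlike_sex_pairs
    mz = total - dz
    return mz, dz

print(weinberg(70, 30))   # -> (40, 60) for these hypothetical counts
```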
Transfusion-related adverse reactions reported to the National Healthcare Safety Network Hemovigilance Module, United States, 2010 to 2012
Harvey AR , Basavaraju SV , Chung KW , Kuehnert MJ . Transfusion 2014 55 (4) 709-18 BACKGROUND: In 2010, health care facilities in the United States began voluntary enrollment in the National Healthcare Safety Network (NHSN) Hemovigilance Module. Participants report transfusion practices; red blood cell, platelet (PLT), plasma, and cryoprecipitate units transfused; and transfusion-related adverse reactions and process errors to the Centers for Disease Control and Prevention through a secure, Internet-accessible surveillance application available to transfusing facilities. STUDY DESIGN AND METHODS: Facilities submitting at least 1 month of transfused components data and adverse reactions from January 1, 2010, to December 31, 2012, were included in this analysis. Adverse reaction rates for transfused components, stratified by component type and collection and modification methods, were calculated. RESULTS: In 2010 to 2012, a total of 77 facilities reported 5136 adverse reactions among 2,144,723 components transfused (239.5/100,000). Allergic (46.8%) and febrile nonhemolytic (36.1%) reactions were most frequent; 7.2% of all reactions were severe or life-threatening and 0.1% were fatal. PLT transfusions (421.7/100,000) had the highest adverse reaction rate. CONCLUSION: Adverse transfusion reaction rates from the NHSN Hemovigilance Module in the United States are comparable to early hemovigilance reporting from other countries. Although severe reactions are infrequent, the numbers of transfusion reactions in US hospitals suggest that interventions to prevent these reactions are important for patient safety. Further investigation is needed to understand the apparent increased risk of reactions from apheresis-derived blood components. Comprehensive evaluation, including data validation, is important to continued refinement of the module. |
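The headline rate in this abstract is a straightforward events-per-denominator calculation, reproduced here as a check (the stratified rates in the paper follow the same form with component-specific denominators):

```python
# Adverse reaction rate per 100,000 components transfused,
# using the totals quoted in the abstract.

def rate_per_100k(events, denominator):
    return events / denominator * 100_000

overall = rate_per_100k(5136, 2_144_723)
print(round(overall, 1))   # -> 239.5, matching the abstract
```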
The preventability of ventilator-associated events: the CDC Prevention Epicenters' Wake Up and Breathe Collaborative
Klompas M , Anderson D , Trick W , Babcock H , Kerlin MP , Li L , Sinkowitz-Cochran R , Ely EW , Jernigan J , Magill S , Lyles R , O'Neil C , Kitch BT , Arrington E , Balas MC , Kleinman K , Bruce C , Lankiewicz J , Murphy MV , Cox C , Lautenbach E , Sexton D , Fraser V , Weinstein RA , Platt R . Am J Respir Crit Care Med 2014 191 (3) 292-301 RATIONALE: The Centers for Disease Control and Prevention (CDC) introduced ventilator-associated event (VAE) definitions in January 2013. Little is known about VAE prevention. We hypothesized that daily, coordinated spontaneous awakening trials (SATs) and spontaneous breathing trials (SBTs) might prevent VAEs. OBJECTIVES: To assess the preventability of VAEs. METHODS: We nested a multicenter quality improvement collaborative within a prospective study of VAE surveillance among 20 intensive care units between November 2011 and May 2013. Twelve units joined the collaborative and implemented an opt-out protocol for nurses and respiratory therapists to perform paired daily SATs and SBTs. The remaining 8 units conducted surveillance alone. We measured temporal trends in VAEs using generalized mixed-effects regression models adjusted for unit and for patient-level age, sex, reason for intubation, SOFA score, and comorbidity index. MEASUREMENTS AND MAIN RESULTS: We tracked 5,164 consecutive episodes of mechanical ventilation: 3,425 in collaborative units and 1,739 in surveillance-only units. Within collaborative units, significant increases in SATs, SBTs, and the percentage of SBTs performed without sedation were mirrored by significant decreases in duration of mechanical ventilation and hospital length of stay. There was no change in VAE risk per ventilator-day, but there were significant decreases in VAE risk per episode of mechanical ventilation (OR 0.63, 95% CI 0.42-0.97) and in infection-related ventilator-associated complications (OR 0.35, 95% CI 0.17-0.71), but not pneumonias (OR 0.51, 95% CI 0.19-1.3).
Within surveillance-only units, there were no significant changes in SAT, SBT, or VAE rates. CONCLUSIONS: Enhanced performance of paired, daily SATs and SBTs is associated with lower VAE rates. Clinical trial registration available at www.clinicaltrials.gov, ID NCT01583413. |
An analysis of trainers' perspectives within an ecological framework: factors that influence mine safety training processes
Haas EJ , Hoebbel CL , Rost KA . Saf Health Work 2014 5 (3) 118-124 BACKGROUND: Satisfactory completion of mine safety training is a prerequisite for being hired and for continued employment in the coal industry. Although training includes content to develop skills in a variety of mineworker competencies, research and recommendations continue to specify that specific limitations in the self-escape portion of training still exist and that mineworkers need to be better prepared to respond to emergencies that could occur in their mine. Ecological models are often used to inform the development of health promotion programs but have not been widely applied to occupational health and safety training programs. METHODS: Nine mine safety trainers participated in in-depth semi-structured interviews. A theoretical analysis of the interviews was completed via an ecological lens. Each level of the social ecological model was used to examine factors that could be addressed both during and after mine safety training. RESULTS: The analysis suggests that problems surrounding communication and collaboration, leadership development, and responsibility and accountability at different levels within the mining industry contribute to deficiencies in mineworkers’ mastery and maintenance of skills. CONCLUSION: This study offers a new technique to identify limitations in safety training systems and processes. The analysis suggests that training should be developed and disseminated with consideration of various levels - individual, interpersonal, organizational, and community - to promote skills. If factors identified within and between levels are addressed, it may be easier to sustain mineworker competencies that are established during safety training. |
Further study of the intrinsic safety of internally shorted lithium and lithium-ion cells within methane-air
Dubaniewicz Jr TH , DuCarme JP . J Loss Prev Process Ind 2014 32 165-173 National Institute for Occupational Safety and Health (NIOSH) researchers continue to study the potential for lithium and lithium-ion battery thermal runaway from an internal short circuit in equipment for use in underground coal mines. Researchers conducted cell crush tests using a plastic wedge within a 20-L explosion-containment chamber filled with 6.5% CH4-air to simulate the mining hazard. The present work extends earlier findings to include a study of LiFePO4 cells crushed while under charge, prismatic form factor LiCoO2 cells, spiral-wound primary LiMnO2 cells, and crush speed influence on thermal runaway susceptibility. The plastic wedge crush was a more severe test than the flat plate crush with a prismatic format cell. Test results indicate that prismatic Saft MP 174565 LiCoO2 and spiral-wound primary Saft FRIWO M52EX LiMnO2 cells pose a CH4-air ignition hazard from internal short circuit. Under specified test conditions, A123 Systems ANR26650M1A LiFePO4 cylindrical cells produced no chamber ignitions while under a charge of up to 5 A. Common spiral-wound cell separators are too thin to meet intrinsic safety standards provisions for distance through solid insulation, suggesting that a hard internal short circuit within these cells should be considered for intrinsic safety evaluation purposes, even as a non-countable fault. Observed flames from a LiMnO2 spiral-wound cell after a chamber ignition within an inert atmosphere indicate a sustained exothermic reaction within the cell. The influence of crush speed on ignitions under specified test conditions was not statistically significant. |
Prevalence and spectrum of illness among hospitalized adults with malaria in Blantyre, Malawi
Segula D , Frosch AP , SanJoaquin M , Taulo D , Skarbinski J , Mathanga DP , Allain TJ , Molyneux M , Laufer MK , Heyderman RS . Malar J 2014 13 (1) 391 BACKGROUND: As control interventions are rolled out, the burden of malaria may shift from young children to older children and adults as acquisition of immunity is slowed and persistence of immunity is short-lived. Data for malaria disease in adults are difficult to obtain because of co-morbid conditions and because parasitaemia may be asymptomatic. Regular surveys of adult admissions to a hospital in Malawi were conducted to characterize the clinical spectrum of malaria and to establish a baseline for monitoring future changes. METHODS: In 2011-2012, at Queen Elizabeth Hospital, Blantyre, four separate one-week surveys in the peak malaria transmission period (wet season) and three one-week surveys in the low transmission period (dry season) were conducted using rapid diagnostic tests (RDT), with confirmation of parasitaemia by microscopy. All adults (aged ≥15 years) admitted to the adult medical wards, regardless of the suspected diagnosis, were enrolled. Participants with a positive malaria test underwent a standardized physical examination and laboratory tests. Malaria syndromes were characterized by reviewing charts and laboratory results on discharge. RESULTS: Of 765 adult admissions screened, 63 (8.2%) were RDT-positive, with 61 (8.0%) positive by microscopy. Over the course of the seven study weeks, two patients were judged to have incidental parasitaemia, 31 (4.1%) had uncomplicated malaria and 28 (3.7%) had severe malaria. Both uncomplicated and severe malaria cases were more common in the rainy season than in the dry season. Prostration (22/28 cases) and hyperparasitaemia (>250,000 parasites/µl) (9/28) were the most common features of severe malaria.
Jaundice (4/28), severe anaemia (2/28), hyperlactataemia (2/28), shock (1/28) and haemoglobinuria (1/28) were less commonly seen, and no patient had severe metabolic derangement or organ failure. There were no deaths attributable to malaria. CONCLUSION: In this study of adults admitted to hospital in southern Malawi, an area with year-round transmission of Plasmodium falciparum, classical metabolic and organ complications of malaria were not encountered. Prostration and hyperparasitaemia were more common indicators of severity in patients admitted with malaria, none of whom died. These data will provide a baseline for monitoring trends in the frequency and clinical patterns of severe malaria in adults. |
Efficacy of artemether-lumefantrine and dihydroartemisinin-piperaquine for the treatment of uncomplicated malaria in children in Zaire and Uige provinces, Angola
Plucinski MM , Talundzic E , Morton L , Dimbu PR , Macaia AP , Fortes F , Goldman I , Lucchi N , Stennies G , MacArthur JR , Udhayakumar V . Antimicrob Agents Chemother 2014 59 (1) 437-43 The development of resistance to antimalarials is a major challenge for global malaria control. Artemisinin-based combination therapies, the newest class of antimalarials, are used worldwide but there have been reports of artemisinin resistance in Southeast Asia. In February-May 2013, we conducted open-label, nonrandomized therapeutic efficacy studies of artemether-lumefantrine (AL) and dihydroartemisinin-piperaquine (DP) in Zaire and Uige Provinces in northern Angola. The parasitological and clinical responses to treatment in children with uncomplicated P. falciparum monoinfection were measured over 28 days and the main outcome was PCR-corrected adequate clinical and parasitological response (ACPR) proportion on day 28. Parasites from treatment failures were analyzed for the presence of putative molecular markers of resistance to lumefantrine and artemisinins, including the recently identified mutations in the K13-propeller gene. In the 320 children finishing the study, 25 treatment failures were observed, 24 in the AL arms and one in the DP arm. The PCR-corrected ACPR proportion on day 28 for AL was 88% (95% CI: 78-95) in Zaire and 97% (91-100) in Uige. For DP, it was 100% (95-100) in Zaire, and 100% (96-100) in Uige. None of the treatment failures had molecular evidence of artemisinin resistance. In contrast, 91% of AL treatment failures had markers associated with lumefantrine resistance on day of failure. The absence of molecular markers for artemisinin resistance and the observed efficacies of both drug combinations suggest no evidence of artemisinin resistance in northern Angola. There is evidence of increased lumefantrine resistance in Zaire, which should continue to be monitored. |
Evidence of autochthonous Chagas disease in southeastern Texas
Garcia MN , Aguilar D , Gorchakov R , Rossmann SN , Montgomery SP , Rivera H , Woc-Colburn L , Hotez P , Murray KO . Am J Trop Med Hyg 2014 92 (2) 325-30 Autochthonous transmission of Trypanosoma cruzi in the United States is rarely reported. Here, we describe five newly identified patients with autochthonously acquired infections from a small pilot study of positive blood donors in southeast Texas. Case-patients 1-4 were possibly infected near their residences, which were all in the same region approximately 100 miles west of Houston. Case-patient 5 was a young male with considerable exposure from routine outdoor and camping activities associated with a youth civic organization. Only one of the five autochthonous case-patients received anti-parasitic treatment. Our findings suggest an unrecognized risk of human vector-borne transmission in southeast Texas. Education of physicians and public health officials is crucial for identifying the true disease burden and source of infection in Texas. |
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Community Health Services
- Epidemiology and Surveillance
- Food Safety
- Genetics and Genomics
- Immunity and Immunization
- Laboratory Sciences
- Maternal and Child Health
- Medicine
- Occupational Safety and Health
- Occupational Safety and Health - Mining
- Parasitic Diseases
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions.