Reducing the burden of disease and death from familial hypercholesterolemia: a call to action.
Knowles JW , O'Brien EC , Greendale K , Wilemon K , Genest J , Sperling LS , Neal WA , Rader DJ , Khoury MJ . Am Heart J 2014 168 (6) 807-11 Familial hypercholesterolemia (FH) is a genetic disease characterized by substantial elevations of low-density lipoprotein cholesterol, unrelated to diet or lifestyle. Untreated FH patients have 20 times the risk of developing coronary artery disease, compared with the general population. Estimates indicate that as many as 1 in 500 people of all ethnicities and 1 in 250 people of Northern European descent may have FH; nevertheless, the condition remains largely undiagnosed. In the United States alone, perhaps as little as 1% of FH patients have been diagnosed. Consequently, there are potentially millions of children and adults worldwide who are unaware that they have a life-threatening condition. In countries like the Netherlands, the United Kingdom, and Spain, cascade screening programs have led to dramatic improvements in FH case identification. Given that there are currently no systematic approaches in the United States to identify FH patients or affected relatives, the patient-centric nonprofit FH Foundation convened a national FH Summit in 2013, where participants issued a "call to action" to health care providers, professional organizations, public health programs, patient advocacy groups, and FH experts, in order to bring greater attention to this potentially deadly, but (with proper diagnosis) eminently treatable, condition. |
The Steps to Health Randomized Trial for Arthritis: a self-directed exercise versus nutrition control program
Wilcox S , McClenaghan B , Sharpe PA , Baruth M , Hootman JM , Leith K , Dowda M . Am J Prev Med 2014 48 (1) 1-12 BACKGROUND: Despite the established benefits of exercise for adults with arthritis, participation is low. Safe, evidence-based, self-directed programs, which have the potential for high reach at a low cost, are needed. PURPOSE: To test a 12-week, self-directed, multicomponent exercise program for adults with arthritis. DESIGN: Randomized controlled trial. Data were collected from 2010 to 2012. Data were analyzed in 2013 and 2014. SETTING/PARTICIPANTS: Adults with arthritis (N=401, aged 56.3 [10.7] years, 85.8% women, 63.8% white, 35.2% African American, BMI of 33.0 [8.2]) completed measures at a university research center and participated in a self-directed exercise intervention (First Step to Active Health®) or a nutrition control program (Steps to Healthy Eating). INTERVENTION: Intervention participants received a self-directed multicomponent exercise program and returned self-monitoring logs for 12 weeks. MAIN OUTCOME MEASURES: Self-reported physical activity, functional performance measures, and disease-specific outcomes (arthritis symptoms and self-efficacy) assessed at baseline, 12 weeks, and 9 months. RESULTS: Participants in the exercise condition showed greater increases in physical activity than those in the nutrition control group (p=0.01). Significant improvements, irrespective of condition, were seen in lower body strength, functional exercise capacity, lower body flexibility, pain, fatigue, stiffness, and arthritis management self-efficacy (p values<0.0001). More adverse events occurred in the exercise condition than in the nutrition control condition, but only one was severe and most were expected with increased physical activity. CONCLUSIONS: The exercise program improves physical activity, and both programs improve functional and psychosocial outcomes. Potential reasons for improvements in the nutrition control condition are discussed. 
These interventions have the potential for large-scale dissemination. This study is registered at Clinicaltrials.gov NCT01172327. |
Lung function and metabolic syndrome: findings of National Health and Nutrition Examination Survey 2007-2010
Ford ES , Cunningham TJ , Mercado CI . J Diabetes 2014 6 (6) 603-613 BACKGROUND: Considerable uncertainty remains about obstructive lung function (OLF) in adults with metabolic syndrome (MetS). The aim of the present study was to examine pulmonary function status in adults with and without MetS. METHODS: We used data from 3109 participants aged ≥20 years of the National Health and Nutrition Examination Survey 2007-2010. Subjects' MetS status was established on the basis of the 2009 harmonizing definition. Participants received spirometry. RESULTS: After age adjustment, 79.3% (SE 1.1) of participants with MetS had normal lung function, 8.7% (0.9) had restrictive lung function (RLF), 7.1% (0.8) had mild OLF, and 4.8% (0.6) had moderate OLF or worse. Among participants without MetS, these estimates were 78.7% (1.2), 3.9% (0.6), 10.9% (1.1), and 6.4% (0.8), respectively. After multiple adjustment, participants with MetS were more likely to have RLF (adjusted prevalence ratio [aPR] 2.20; 95% confidence interval [CI] 1.67, 2.90) and less likely to have any OLF (aPR 0.73; 95% CI 0.62, 0.86) than those without MetS. Furthermore, participants with MetS had lower mean levels of forced expiratory volume in one second (FEV1), FEV1 % predicted, forced vital capacity (FVC), and FVC % predicted, but a higher FEV1/FVC ratio, than participants without MetS. Mean levels of FEV1, FEV1 % predicted, FVC, and FVC % predicted declined significantly as the number of MetS components increased, but the FEV1/FVC ratio did not. CONCLUSIONS: Compared with adults without MetS, adults with MetS are more likely to show a restrictive pattern and less likely to show an obstructive pattern on spirometry. |
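The restrictive and obstructive patterns distinguished in the abstract above can be illustrated with a minimal classifier. This is only a sketch using commonly taught fixed cutoffs (FEV1/FVC below 0.70 for obstruction; FVC below 80% predicted with a preserved ratio for restriction); the study itself graded lung function against survey reference values, and the cutoffs and function name here are illustrative assumptions, not the authors' method.

```python
def classify_spirometry(fev1_pct_pred, fvc_pct_pred, fev1_fvc_ratio):
    """Classify a spirometry result into normal, restrictive, or obstructive
    patterns using simplified fixed cutoffs (illustrative only)."""
    if fev1_fvc_ratio < 0.70:
        # Obstructive pattern: disproportionate reduction in FEV1 vs FVC;
        # grade severity crudely by FEV1 % predicted
        if fev1_pct_pred < 70:
            return "moderate obstructive or worse"
        return "mild obstructive"
    if fvc_pct_pred < 80:
        # Restrictive pattern: reduced FVC with a preserved FEV1/FVC ratio,
        # the pattern the study found to be more common in MetS
        return "restrictive"
    return "normal"
```

Note how the classifier makes the abstract's finding concrete: a higher FEV1/FVC ratio with lower FVC % predicted, as observed in MetS, points toward the restrictive rather than the obstructive branch.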
Ohio primary health care providers' practices and attitudes regarding screening women with prior gestational diabetes for type 2 diabetes mellitus - 2010
Rodgers L , Conrey EJ , Wapner A , Ko JY , Dietz PM , Oza-Frank R . Prev Chronic Dis 2014 11 E213 INTRODUCTION: Gestational diabetes mellitus (GDM) is associated with a 7-fold increased lifetime risk for developing type 2 diabetes mellitus. Early diagnosis of type 2 diabetes is crucial for preventing complications. Despite recommendations for type 2 diabetes screening every 1 to 3 years for women with previous diagnoses of GDM and all women aged 45 years or older, screening prevalence is unknown. We sought to assess Ohio primary health care providers' practices and attitudes regarding assessing GDM history and risk for progression to type 2 diabetes. METHODS: During 2010, we mailed surveys to 1,400 randomly selected Ohio family physicians and internal medicine physicians; we conducted analyses during 2011-2013. Overall responses were weighted to adjust for stratified sampling. Chi-square tests compared categorical variables. RESULTS: Overall response rate was 34% (380 eligible responses). Among all respondents, 57% reported that all new female patients in their practices are routinely asked about GDM history; 62% reported screening women aged 45 years or younger with prior GDM every 1 to 3 years for glucose intolerance; and 42% reported that screening for type 2 diabetes among women with prior GDM is a high or very high priority in their practice. CONCLUSION: Because knowing a patient's GDM history is the critical first step in the prevention of progression to type 2 diabetes for women who had GDM, suboptimal screening for both GDM history and subsequent glucose abnormalities demonstrates missed opportunities for identifying and counseling women with increased risk for type 2 diabetes. |
Predicted 10-year risk of developing cardiovascular disease at the state level in the U.S
Yang Q , Zhong Y , Ritchey M , Loustalot F , Hong Y , Merritt R , Bowman BA . Am J Prev Med 2014 48 (1) 58-69 BACKGROUND: Cardiovascular disease (CVD) is the leading cause of death in the U.S. State-specific predicted 10-year risk of developing CVD could provide useful information for state health planning and policy. PURPOSE: To estimate state-specific 10-year risk of developing CVD. METHODS: Using the updated non-laboratory-based Framingham CVD Risk Score (RS), this study estimated 10-year risk of developing CVD; coronary heart disease (CHD); and stroke, stratified by demographic factors and by state among 2009 Behavioral Risk Factors Surveillance System participants aged 30-74 years. Data analysis was completed in June 2014. RESULTS: The age-standardized mean CVD, CHD, and stroke RSs for adults aged 30-74 years were 14.6%, 10.4%, and 2.3% among men, respectively, and 7.5%, 4.5%, and 1.8% among women. RSs increased significantly with age and were highest among non-Hispanic blacks, those with less than high school education, and households with incomes <$35,000. State-specific age-standardized CVD, CHD, and stroke RS ranged, among men, from lows in Utah (13.2%, 9.6%, and 2.1%, respectively) to highs in Louisiana (16.2%, 11.7%, and 2.6%), and among women, from lows in Minnesota (6.3%, 3.8%, and 1.5%) to highs in Mississippi (8.7%, 5.3%, and 2.1%). CONCLUSIONS: The predicted 10-year risk of developing CVD varies significantly by age, gender, race/ethnicity, educational attainment, household income, and state of residence. These results support the development and implementation of targeted prevention programs by states to address the risk of developing CVD, CHD, and stroke among their populations. |
Prevalence and costs of skin cancer treatment in the U.S., 2002-2006 and 2007-2011
Guy GP Jr , Machlin SR , Ekwueme DU , Yabroff KR . Am J Prev Med 2014 48 (2) 183-187 BACKGROUND: Skin cancer, the most common cancer in the U.S., is a major public health problem. The incidence of nonmelanoma and melanoma skin cancer is increasing; however, little is known about the economic burden of treatment. PURPOSE: To examine trends in the treated prevalence and treatment costs of nonmelanoma and melanoma skin cancers. METHODS: This study used data on adults from the 2002-2011 Medical Expenditure Panel Survey full-year consolidated files and information from corresponding medical conditions and medical event files to estimate the treated prevalence and treatment cost of nonmelanoma skin cancer, melanoma skin cancer, and all other cancer sites. Analyses were conducted in January 2014. RESULTS: The average annual number of adults treated for skin cancer increased from 3.4 million in 2002-2006 to 4.9 million in 2007-2011 (p<0.001). During this period, the average annual total cost for skin cancer increased from $3.6 billion to $8.1 billion (p=0.001), representing an increase of 126.2%, while the average annual total cost for all other cancers increased by 25.1%. During 2007-2011, nearly 5 million adults were treated for skin cancer annually, with average treatment costs of $8.1 billion each year. CONCLUSIONS: These findings demonstrate that the health and economic burden of skin cancer treatment is substantial and increasing. Such findings highlight the importance of skin cancer prevention efforts, which may result in future savings to the healthcare system. |
Prevalence of health promotion programs in primary health care units in Brazil
Ramos LR , Malta DC , Gomes GA , Bracco MM , Florindo AA , Mielke GI , Parra DC , Lobelo F , Simoes EJ , Hallal PC . Rev Saude Publica 2014 48 (5) 837-44 OBJECTIVE: To assess the prevalence of health promotion programs in primary health care units within Brazil's health system. METHODS: We conducted a cross-sectional descriptive study based on telephone interviews with managers of primary care units. Of a total of 42,486 primary health care units listed in the Brazilian Unified Health System directory, 1,600 were randomly selected. Care units from all five Brazilian macroregions were selected proportionally to the number of units in each region. We examined whether any of the following five types of health promotion programs was available: physical activity; smoking cessation; cessation of alcohol and illicit drug use; healthy eating; and healthy environment. Information was collected on the kinds of activities offered and the status of implementation of the Family Health Strategy at the units. RESULTS: Most units (62.0%) reported having three or more health promotion programs in place, and only 3.0% reported having none. Healthy environment (77.0%) and healthy eating (72.0%) programs were the most widely available; smoking cessation and alcohol use cessation programs were reported in 54.0% and 42.0% of the units, respectively. Physical activity programs were offered in less than 40.0% of the units, and their availability varied greatly nationwide, from 51.0% in the Southeast to as low as 21.0% in the North. The Family Health Strategy was implemented in most units (61.0%); however, these units did not offer more health promotion programs than others did. CONCLUSIONS: Our study showed that most primary care units have health promotion programs in place. Public policies are needed to strengthen primary care services and improve training of health providers to meet the goals of the agenda for health promotion in Brazil. |
The future burden of CKD in the United States: a simulation model for the CDC CKD initiative
Hoerger TJ , Simpson SA , Yarnoff BO , Pavkov ME , Rios Burrows N , Saydah SH , Williams DE , Zhuo X . Am J Kidney Dis 2014 65 (3) 403-11 BACKGROUND: Awareness of chronic kidney disease (CKD), defined by kidney damage or reduced glomerular filtration rate, remains low in the United States, and few estimates of its future burden exist. STUDY DESIGN: We used the CKD Health Policy Model to simulate the residual lifetime incidence of CKD and project the prevalence of CKD in 2020 and 2030. The simulation sample was based on nationally representative data from the 1999 to 2010 National Health and Nutrition Examination Surveys. SETTING & POPULATION: Current US population. MODEL, PERSPECTIVE, & TIMELINE: Simulation model following up individuals from current age through death or age 90 years. OUTCOMES: Residual lifetime incidence represents the projected percentage of persons who will develop new CKD during their lifetimes. Future prevalence is projected for 2020 and 2030. MEASUREMENTS: Development and progression of CKD are based on annual decrements in estimated glomerular filtration rates that depend on age and risk factors. RESULTS: For US adults aged 30 to 49, 50 to 64, and 65 years or older with no CKD at baseline, the residual lifetime incidences of CKD are 54%, 52%, and 42%, respectively. The prevalence of CKD in adults 30 years or older is projected to increase from 13.2% currently to 14.4% in 2020 and 16.7% in 2030. LIMITATIONS: Due to limited data, our simulation model estimates are based on assumptions about annual decrements in estimated glomerular filtration rates. CONCLUSIONS: For an individual, lifetime risk of CKD is high, with more than half of US adults aged 30 to 64 years likely to develop CKD. Knowing the lifetime incidence of CKD may raise individuals' awareness and encourage them to take steps to prevent CKD. 
From a national burden perspective, we estimate that the population prevalence of CKD will increase in coming decades, suggesting that development of interventions to slow CKD onset and progression should be considered. |
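The simulation approach described in this entry, following individuals forward with annual eGFR decrements until CKD onset, death, or age 90, can be sketched as a toy Monte Carlo model. Every parameter below (baseline eGFR distribution, decrement sizes, mortality hazard, the eGFR<60 threshold for CKD) is an illustrative assumption, not an input of the actual CKD Health Policy Model.

```python
import random

def lifetime_ckd_incidence(start_age, n=10000, seed=1):
    """Toy Monte Carlo sketch of residual lifetime CKD incidence.
    All parameters are illustrative assumptions for demonstration."""
    random.seed(seed)
    cases = 0
    for _ in range(n):
        egfr = random.gauss(100, 10)              # assumed baseline eGFR
        for age in range(start_age, 91):          # follow to age 90 or death
            mean_drop = 0.5 if age < 65 else 1.0  # assumed age-dependent decline
            egfr -= random.gauss(mean_drop, 0.5)
            if egfr < 60:                         # reduced GFR -> incident CKD
                cases += 1
                break
            if random.random() < 0.002 + 0.0005 * age:  # assumed mortality
                break
    return cases / n
```

With made-up parameters like these the output is not meaningful epidemiologically; the point is the mechanism: residual lifetime incidence falls out of the competition between eGFR decline reaching the CKD threshold and death occurring first, which is why older starting cohorts can have a lower lifetime incidence despite faster decline.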
Global surveillance of cancer survival 1995-2009: analysis of individual data for 25 676 887 patients from 279 population-based registries in 67 countries (CONCORD-2)
Allemani C , Weir HK , Carreira H , Harewood R , Spika D , Wang XS , Bannon F , Ahn JV , Johnson CJ , Bonaventure A , Marcos-Gragera R , Stiller C , Azevedo e Silva G , Chen WQ , Ogunbiyi OJ , Rachet B , Soeberg MJ , You H , Matsuda T , Bielska-Lasota M , Storm H , Tucker TC , Coleman MP . Lancet 2014 385 (9972) 977-1010 BACKGROUND: Worldwide data for cancer survival are scarce. We aimed to initiate worldwide surveillance of cancer survival by central analysis of population-based registry data, as a metric of the effectiveness of health systems, and to inform global policy on cancer control. METHODS: Individual tumour records were submitted by 279 population-based cancer registries in 67 countries for 25.7 million adults (age 15-99 years) and 75 000 children (age 0-14 years) diagnosed with cancer during 1995-2009 and followed up to Dec 31, 2009, or later. We looked at cancers of the stomach, colon, rectum, liver, lung, breast (women), cervix, ovary, and prostate in adults, and adult and childhood leukaemia. Standardised quality control procedures were applied; errors were corrected by the registry concerned. We estimated 5-year net survival, adjusted for background mortality in every country or region by age (single year), sex, and calendar year, and by race or ethnic origin in some countries. Estimates were age-standardised with the International Cancer Survival Standard weights. FINDINGS: 5-year survival from colon, rectal, and breast cancers has increased steadily in most developed countries. For patients diagnosed during 2005-09, survival for colon and rectal cancer reached 60% or more in 22 countries around the world; for breast cancer, 5-year survival rose to 85% or higher in 17 countries worldwide. Liver and lung cancer remain lethal in all nations: for both cancers, 5-year survival is below 20% everywhere in Europe, in the range 15-19% in North America, and as low as 7-9% in Mongolia and Thailand. 
Striking rises in 5-year survival from prostate cancer have occurred in many countries: survival rose by 10-20% between 1995-99 and 2005-09 in 22 countries in South America, Asia, and Europe, but survival still varies widely around the world, from less than 60% in Bulgaria and Thailand to 95% or more in Brazil, Puerto Rico, and the USA. For cervical cancer, national estimates of 5-year survival range from less than 50% to more than 70%; regional variations are much wider, and improvements between 1995-99 and 2005-09 have generally been slight. For women diagnosed with ovarian cancer in 2005-09, 5-year survival was 40% or higher only in Ecuador, the USA, and 17 countries in Asia and Europe. 5-year survival for stomach cancer in 2005-09 was high (54-58%) in Japan and South Korea, compared with less than 40% in other countries. By contrast, 5-year survival from adult leukaemia in Japan and South Korea (18-23%) is lower than in most other countries. 5-year survival from childhood acute lymphoblastic leukaemia is less than 60% in several countries, but as high as 90% in Canada and four European countries, which suggests major deficiencies in the management of a largely curable disease. INTERPRETATION: International comparison of survival trends reveals very wide differences that are likely to be attributable to differences in access to early diagnosis and optimum treatment. Continuous worldwide surveillance of cancer survival should become an indispensable source of information for cancer patients and researchers and a stimulus for politicians to improve health policy and health-care systems. 
FUNDING: Canadian Partnership Against Cancer (Toronto, Canada), Cancer Focus Northern Ireland (Belfast, UK), Cancer Institute New South Wales (Sydney, Australia), Cancer Research UK (London, UK), Centers for Disease Control and Prevention (Atlanta, GA, USA), Swiss Re (London, UK), Swiss Cancer Research foundation (Bern, Switzerland), Swiss Cancer League (Bern, Switzerland), and University of Kentucky (Lexington, KY, USA). |
Habitual sleep duration and predicted 10-year cardiovascular risk using the pooled cohort risk equations among US adults
Ford ES . J Am Heart Assoc 2014 3 (6) e001454 BACKGROUND: The association between sleep duration and predicted cardiovascular risk has been poorly characterized. The objective of this study was to examine the association between self-reported sleep duration and predicted 10-year cardiovascular risk among US adults. METHODS AND RESULTS: Data from 7690 men and nonpregnant women who were aged 40 to 79 years, who were free of self-reported heart disease and stroke, and who participated in a National Health and Nutrition Examination Survey from 2005 to 2012 were analyzed. Sleep duration was self-reported. Predicted 10-year cardiovascular risk was calculated using the pooled cohort equations. Among the included participants, 13.1% reported sleeping ≤5 hours, 24.4% reported sleeping 6 hours, 31.9% reported sleeping 7 hours, 25.2% reported sleeping 8 hours, 4.0% reported sleeping 9 hours, and 1.3% reported sleeping ≥10 hours. After adjustment for covariates, geometric mean predicted 10-year cardiovascular risk was 4.0%, 3.6%, 3.4%, 3.5%, 3.7%, and 3.7% among participants who reported sleeping ≤5, 6, 7, 8, 9, and ≥10 hours per night, respectively (Wald chi-square P<0.001). The age-adjusted percentages of predicted cardiovascular risk ≥20% for the 6 intervals of sleep duration were 14.5%, 11.9%, 11.0%, 11.4%, 11.8%, and 16.3% (Wald chi-square P=0.022). After maximal adjustment, however, sleep duration was not significantly associated with cardiovascular risk ≥20% (Wald chi-square P=0.698). CONCLUSIONS: Mean predicted 10-year cardiovascular risk was lowest among adults who reported sleeping 7 hours per night and increased among those who reported sleeping fewer or more hours. |
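The geometric mean reported above is the exponential of the mean of log risks; it is the natural summary when risk distributions are right-skewed, as predicted cardiovascular risks typically are. A minimal sketch (the covariate adjustment used in the study is not shown):

```python
import math

def geometric_mean(risks):
    """Geometric mean of a list of positive predicted risks:
    exp of the arithmetic mean of the log risks. Less sensitive
    to a long right tail than the arithmetic mean."""
    return math.exp(sum(math.log(r) for r in risks) / len(risks))
```

For example, for risks of 2% and 8% the geometric mean is 4%, whereas the arithmetic mean is 5%; the few very-high-risk individuals pull the arithmetic mean upward far more than the geometric mean.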
Challenges of ascertaining national trends in the incidence of coronary heart disease in the United States
Ford ES , Roger VL , Dunlay SM , Go AS , Rosamond WD . J Am Heart Assoc 2014 3 (6) e001097 Despite major therapeutic advances, the public health burden associated with coronary heart disease (CHD) remains enormous with approximately 525 000 people predicted to have a new myocardial infarction (MI) in 2013, ≈15.4 million estimated to be living with CHD in 2013, and ≈1 346 000 people hospitalized in 2009 for CHD.1 | There are a variety of ways to measure the population impact of a disease including prevalence, associated morbidity and mortality, quality of life, health care utilization, and economic costs, and one of the most critical is disease incidence. From a surveillance perspective in the United States, the national vital statistics data system provides information about the death rate for CHD, various national data systems provide estimates of hospitalizations for CHD and outpatient visits for CHD, and national data systems provide data about levels of risk factors for CHD. The data systems allowing for estimates of prevalent CHD are less robust as they rely primarily on self‐reported information. | A particularly glaring gap in our knowledge base has been the lack of nationally representative data to measure the incidence of CHD. Measuring incidence of a disease is particularly salient because incidence (1) is a key measure in helping to define the burden of a disease and identify high‐risk populations, (2) provides valuable information in helping decision makers set public health priorities, and (3) is a more relevant measure to assess the collective influence of risk factors in a population than prevalence. 
Consequently, tracking incidence of a disease in populations can: (1) yield timely data about potentially unfavorable changes in incidence that may prompt a search for explanations and corrective actions to redirect the course of a disease in a population, (2) provide valuable feedback in assessing efforts to control a disease, and (3) generate useful information for updating priorities regarding health promotion and disease prevention. The reasons why a national surveillance system to track CHD incidence in the United States has never been developed are not entirely clear but may relate to the cost and complexity of implementing such a system. |
Current depression among adult cancer survivors: findings from the 2010 Behavioral Risk Factor Surveillance System
Zhao G , Okoro CA , Li J , White A , Dhingra S , Li C . Cancer Epidemiol 2014 38 (6) 757-64 BACKGROUND: A cancer diagnosis and subsequent treatment impose a substantial psychological burden on cancer patients. This study examined the prevalence of current depression and the risk factors associated with a high burden of depression among cancer survivors in the US. METHODS: We analyzed data from 3550 cancer survivors (aged ≥18 years) and 26,917 adults without cancer who participated in the 2010 Behavioral Risk Factor Surveillance System. Depressive symptoms were assessed with the Patient Health Questionnaire-8 diagnostic algorithm. Participants with a total depression severity score of ≥10 were defined as having current depression. Prevalence and prevalence ratios were estimated by conducting log-linear regression analysis while controlling for potential confounders. RESULTS: Overall, 13.7% of cancer survivors (vs. 8.9% of adults without cancer, P<0.001) reported having current depression; the prevalence varied significantly by cancer category. Among cancer survivors, after multivariate adjustment for covariates, a cancer diagnosis within the past year; 'other' race/ethnicity; being divorced, separated, widowed, or never married; current or former smoking; and a history of diabetes, disability, or depression were associated with significantly higher prevalence ratios for current depression, whereas older age (≥60 years), education beyond high school graduation, and engaging in leisure-time physical activity were associated with significantly lower prevalence ratios. CONCLUSION: Our results indicate that cancer survivors are at increased risk of current depression. Targeting cancer survivors at high risk of depression may be especially important for clinical support and interventions aimed at improving mental well-being. |
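The case definition in this study is simple to state in code: the PHQ-8 comprises eight items, each scored 0-3 (total 0-24), and a total severity score of ≥10 is taken as current depression, as in the abstract above. A minimal sketch (function name is ours; the survey's item wording and administration are not reproduced here):

```python
def phq8_current_depression(item_scores):
    """PHQ-8 scoring: eight items, each 0-3; a total score of 10 or
    more is classified as current depression (cutoff per the study)."""
    if len(item_scores) != 8 or not all(0 <= s <= 3 for s in item_scores):
        raise ValueError("expected eight item scores, each between 0 and 3")
    return sum(item_scores) >= 10
```

The hard epidemiologic work in the paper is not this classification step but the survey-weighted log-linear regression that turns the binary outcome into adjusted prevalence ratios.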
Dilated eye examination screening guideline compliance among patients with diabetes without a diabetic retinopathy diagnosis: the role of geographic access
Lee DJ , Kumar N , Feuer WJ , Chou CF , Rosa PR , Schiffman JC , Morante A , Aldahan A , Staropoli P , Fernandez CA , Tannenbaum SL , Lam BL . BMJ Open Diabetes Res Care 2014 2 (1) e000031 OBJECTIVE: To estimate the prevalence of, and factors associated with, dilated eye examination guideline compliance among patients with diabetes mellitus (DM), but without diabetic retinopathy. RESEARCH DESIGN AND METHODS: Utilizing the computerized billing records database, we identified patients with International Classification of Diseases (ICD)-9 diagnoses of DM, but without any ocular diagnoses. The available medical records of patients in 2007-2008 were reviewed for demographic and ocular information, including visits through 2010 (n=200). Patients were considered guideline compliant if they returned at least every 15 months for screening. Participant street addresses were assigned latitude and longitude coordinates to assess their neighborhood socioeconomic status (using the 2000 US census data), distance to the screening facility, and public transportation access. Patients not compliant, based on the medical record review, were contacted by phone or mail and asked to complete a follow-up survey to determine if screening took place at other locations. RESULTS: The overall screening compliance rate was 31%. Patient sociodemographic characteristics, insurance status, and neighborhood socioeconomic measures were not significantly associated with compliance. However, in separate multivariable logistic regression models, those living eight or more miles from the screening facility were significantly less likely to be compliant relative to those living within eight miles (OR=0.36 (95% CI 0.14 to 0.86)), while public transit access quality was positively associated with screening compliance (1.34 (1.07 to 1.68)). 
CONCLUSIONS: Less than one-third of patients returned for diabetic retinopathy screening at least every 15 months, with transportation challenges associated with noncompliance. Our results suggest that reducing transportation barriers or utilizing community-based screening strategies may improve compliance. |
Recruitment strategies to engage African American men in HIV testing randomized controlled trials in the rural southern United States
Brown EJ , Thomas P , Willis LA , Sutton MY . J Assoc Nurses AIDS Care 2014 25 (6) 670-4 Of the estimated 1.1 million Americans living with HIV infection, about 18% are unaware of their infection (Centers for Disease Control and Prevention [CDC], 2012). Routine HIV testing can facilitate early diagnosis and linkage to care for persons who may be unaware of their infection, thereby decreasing unintended ongoing HIV transmission (CDC, 2011; Hall, Holtgrave & Maulsby, 2012). African Americans, especially men, are disproportionately represented among both persons living with HIV and persons unaware of their infection (CDC, 2013). Increased HIV testing efforts are vital, and routine HIV testing has been recommended by the CDC and the United States Preventive Services Task Force (CDC, 2006; Chou et al., 2012). | For many African Americans in the southern United States, the challenges of HIV-related stigma, racism, and distrust (both historical and institutional), poverty, and decreased health care access often make seeking and utilizing routine HIV testing and/or care at traditional health venues a challenge, especially in rural areas (Adimora, Ramirez, Schoenbach, & Cohen, 2014; Prejean, Tang, & Hall, 2013). In rural northern Florida, where African Americans are disproportionately affected by high rates of HIV (Figure 1), conducting research to better understand how to best develop and deliver HIV services is a vital part of reducing the burden of HIV. Community-based testing approaches that bring services to people where they live, work, and play could increase HIV testing of at-risk populations in the rural south. Research is needed to identify effective means of increasing HIV testing among southern rural African American men as part of our national HIV prevention strategy (Office of National AIDS Policy, 2010). |
Respiratory syncytial virus - United States, July 2012-June 2014
Haynes AK , Prill MM , Iwane MK , Gerber SI . MMWR Morb Mortal Wkly Rep 2014 63 (48) 1133-6 Respiratory syncytial virus (RSV) causes lower respiratory infection among infants and young children worldwide. Annually in the United States, RSV infection has been associated with an estimated 57,527 hospitalizations and 2.1 million outpatient visits among children aged <5 years. In temperate climate zones, RSV generally circulates during the fall, winter, and spring. However, the exact timing and duration of RSV seasons vary by region and from year to year. Knowing the start of the RSV season in any given locality is important to health care providers and public health officials, who use RSV seasonality data to guide diagnostic testing and the timing of RSV immunoprophylaxis for children at high risk for severe respiratory infection. To describe RSV seasonality (defined as onset, offset, peak, and duration) nationally, by U.S. Department of Health and Human Services (HHS) region, and for the state of Florida, CDC analyzes RSV laboratory detections reported to the National Respiratory and Enteric Virus Surveillance System (NREVSS). Florida is reported separately because it has an earlier season onset and longer season duration than the rest of the country. For 2012-13, the RSV season onset ranged from late October to late December, and season offset ranged from late December to late April, excluding Florida. For 2013-14, the RSV season onset ranged from late October to late January, and season offset from late January to early April, excluding Florida. Weekly updates of national, regional, and state RSV trends are available from NREVSS at http://www.cdc.gov/surveillance/nrevss. |
Typhoid fever
Wain J , Hendriksen RS , Mikoleit ML , Keddy KH , Ochiai RL . Lancet 2014 385 (9973) 1136-45 Control of typhoid fever relies on clinical information, diagnosis, and an understanding of the epidemiology of the disease. Despite the breadth of work done so far, much is not known about the biology of this human-adapted bacterial pathogen and the complexity of the disease in endemic areas, especially those in Africa. The main barriers to control are vaccines that are not immunogenic in very young children and the development of multidrug resistance, which threatens the efficacy of antimicrobial chemotherapy. Clinicians, microbiologists, and epidemiologists worldwide need to be familiar with shifting trends in enteric fever. This knowledge is crucial, both to control the disease and to manage cases. Additionally, the salmonella serovars that cause human infection can change over time and location. In areas of Asia, multidrug-resistant Salmonella enterica serovar Typhi (S Typhi) has been the main cause of enteric fever, but S Typhi is now being displaced by infections with drug-resistant S enterica serovar Paratyphi A. New conjugate vaccines are imminent and new treatments have been promised, but the engagement of local medical and public health institutions in endemic areas is needed to enable surveillance and to implement control measures. |
Using geospatial modelling to optimize the rollout of antiretroviral-based pre-exposure HIV interventions in sub-Saharan Africa
Gerberry DJ , Wagner BG , Garcia-Lerma JG , Heneine W , Blower S . Nat Commun 2014 5 5454 Antiretroviral (ARV)-based pre-exposure HIV interventions may soon be rolled out in resource-constrained Sub-Saharan African countries, but rollout plans have yet to be designed. Here we use geospatial modelling and optimization techniques to compare two rollout plans for ARV-based microbicides in South Africa: a utilitarian plan that minimizes incidence by using geographic targeting, and an egalitarian plan that maximizes geographic equity in access to interventions. We find significant geographic variation in the efficiency of interventions in reducing HIV transmission, and that efficiency increases disproportionately with increasing incidence. The utilitarian plan would result in considerable geographic inequity in access to interventions, but (by exploiting geographic variation in incidence) could prevent ~40% more infections than the egalitarian plan. Our results show that the geographic resource allocation decisions made at the beginning of a rollout, and the location where the rollout is initiated, will be crucial in determining the success of interventions in reducing HIV epidemics. |
Why is school closed today? Unplanned K-12 school closures in the United States, 2011-2013
Wong KK , Shi J , Gao H , Zheteyeva YA , Lane K , Copeland D , Hendricks J , McMurray L , Sliger K , Rainey JJ , Uzicanin A . PLoS One 2014 9 (12) e113755 INTRODUCTION: We describe characteristics of unplanned school closures (USCs) in the United States over two consecutive academic years during a non-pandemic period to provide context for implementation of school closures during a pandemic. METHODS: From August 1, 2011 through June 30, 2013, daily systematic internet searches were conducted for publicly announced USCs lasting ≥1 day. The reason for closure and the closure dates were recorded. Information on school characteristics was obtained from the National Center for Education Statistics. RESULTS: During the two-year study period, 20,723 USCs were identified affecting 27,066,426 students. Common causes of closure included weather (79%), natural disasters (14%), and problems with school buildings or utilities (4%). Only 771 (4%) USCs lasted ≥4 school days. Illness was the cause of 212 (1%) USCs; of these, 126 (59%) were related to respiratory illnesses and showed seasonal variation with peaks in February 2012 and January 2013. CONCLUSIONS: USCs are common events resulting in missed school days for millions of students. Illness causes few USCs compared with weather and natural disasters. Few communities have experience with prolonged closures for illness. |
Management of Cryptococcus gattii meningoencephalitis
Franco-Paredes C , Womack T , Bohlmeyer T , Sellers B , Hays A , Patel K , Lizarazo J , Lockhart SR , Siddiqui W , Marr KA . Lancet Infect Dis 2014 15 (3) 348-55 Cryptococcosis is a fungal disease caused by Cryptococcus neoformans and Cryptococcus gattii. After inhalation and subsequent pulmonary infection, the organism may disseminate to the CNS and cause meningitis or meningoencephalitis. Most cases occur in immunosuppressed hosts, including patients with HIV/AIDS, patients receiving immunosuppressive drugs, and solid organ transplant recipients. However, cryptococcosis also occurs in individuals with apparently healthy immune systems. A growing number of cases are caused by C gattii, with infections occurring in both immunosuppressed and immunocompetent individuals. In the majority of documented cases, treatment of C gattii infection of the CNS requires aggressive management of raised intracranial pressure along with standard antifungal therapy. Early cerebrospinal fluid evacuation is often needed through placement of a percutaneous lumbar drain or ventriculostomy. Furthermore, pharmacological immunosuppression with a high dose of dexamethasone is sometimes needed to ameliorate a persistently increased inflammatory response and to reduce intracranial pressure. In this Grand Round, we present the case of an otherwise healthy adolescent female patient who, despite aggressive management, succumbed to C gattii meningoencephalitis. We also review the existing literature and discuss optimum clinical management of meningoencephalitis caused by C gattii. |
Gender-related differences in outcomes and attrition on antiretroviral treatment among an HIV-infected patient cohort in Zimbabwe: 2007-2010
Takarinda KC , Harries AD , Shiraishi RW , Mutasa-Apollo T , Abdul-Quader A , Mugurungi O . Int J Infect Dis 2014 30 98-105 OBJECTIVES: To determine 1) gender-related differences in antiretroviral therapy (ART) outcomes and 2) gender-specific characteristics associated with attrition. METHODS: This was a retrospective patient record review of 3,919 HIV-infected patients aged ≥15 years who initiated ART between 2007 and 2009 in 40 randomly selected ART facilities countrywide. RESULTS: Compared to females, males had more documented active tuberculosis (12% vs. 9%, p<0.02) and a lower median CD4 cell count (117 cells/μL vs. 143 cells/μL, p<0.001) at ART initiation. Males had a higher risk of attrition [adjusted hazard ratio (AHR) 1.28, 95% confidence interval (CI) 1.10-1.49] and mortality [AHR 1.56, 95% CI 1.10-2.20]. Factors associated with attrition for both sexes were lower baseline weight (<45 kg and 45-60 kg vs. >60 kg), initiating ART at an urban health facility, and receiving care at central/provincial or district/mission hospitals vs. primary healthcare facilities. CONCLUSIONS: Our findings show that males presented late for ART initiation compared to females. Similar to other studies, males had higher patient attrition and mortality compared to females, which may be partly attributed to late presentation for HIV treatment and care. These observations highlight the need to encourage early HIV testing, enrollment into HIV treatment and care, and patient retention on ART, particularly amongst men. |
Hepatitis E as a cause of acute jaundice syndrome in northern Uganda, 2010-2012
Gerbi GB , Williams R , Bakamutumaho B , Liu S , Downing R , Drobeniuc J , Kamili S , Xu F , Holmberg SD , Teshale EH . Am J Trop Med Hyg 2014 92 (2) 411-4 Hepatitis E virus (HEV) is a common cause of acute viral hepatitis in developing countries; however, its contribution to acute jaundice syndrome is not well described. A large outbreak of hepatitis E occurred in northern Uganda from 2007 to 2009. In response to this outbreak, acute jaundice syndrome surveillance was established in 10 district healthcare facilities to determine the proportion of cases attributable to hepatitis E. Of 347 acute jaundice syndrome cases reported, the largest proportion (42%) had hepatitis E, followed by hepatitis B (14%), malaria (10%), hepatitis C (5%), and other/unknown causes (29%). Of hepatitis E cases, 72% occurred in Kaabong district, and 68% of these cases occurred between May and August of 2011. Residence in Kaabong district was independently associated with hepatitis E (adjusted odds ratio = 13; 95% confidence interval = 7-24). The findings from this surveillance show that both outbreak and sporadic transmission of hepatitis E occur in northern Uganda. |
Intimate partner violence and human immunodeficiency virus risk among black and Hispanic women
Morales-Aleman MM , Hageman K , Gaul ZJ , Le B , Paz-Bailey G , Sutton MY . Am J Prev Med 2014 47 (6) 689-702 BACKGROUND: Approximately 80% of new HIV infections among U.S. women occur among black/African American and Hispanic women. HIV risk may be associated with intimate partner violence (IPV); data regarding IPV for women in high-HIV prevalence areas are scarce. PURPOSE: To examine the prevalence and correlates of IPV among women. METHODS: Heterosexual women and their male partners in cities with high HIV prevalence were enrolled. During 2006-2007, participants completed interviews about HIV risk factors and IPV (physical violence or forced sex) experiences. Data were analyzed during 2012-2013 using multivariate logistic regression to identify individual- and partner-level IPV correlates. RESULTS: Of 1,011 female respondents, 985 (97.4%) provided risk factor and demographic data. Most were non-Hispanic black/African American (82.7%); living at or below poverty (86.7%); and tested HIV-negative (96.8%). IPV-physical violence was reported by 29.1%, and IPV-forced sex by 13.7%. Being married/living with a partner (AOR=1.60, 95% CI=1.06, 2.40); non-injection drug use (AOR=1.74, 95% CI=1.22, 2.48); and ever discussing male partners' number of current sex partners (AOR=1.60, 95% CI=1.15, 2.24) were associated with IPV-physical violence. Reporting concurrent sex partners (AOR=1.80, 95% CI=1.04, 3.13) and ever discussing the number of male partners' past sex partners (AOR=1.85, 95% CI=1.13, 3.05) were associated with IPV-forced sex. Feeling comfortable asking a male partner to use condoms was associated with decreased IPV-physical violence (AOR=0.32, 95% CI=0.16, 0.64) and IPV-forced sex (AOR=0.37, 95% CI=0.16, 0.85). CONCLUSIONS: Prevention interventions that enhance women's skills to decrease HIV and IPV risk are important strategies for decreasing racial/ethnic disparities among women. |
Investigation of an outbreak of bloody diarrhea complicated with hemolytic uremic syndrome
Chokoshvili O , Lomashvili K , Malakmadze N , Geleishvil M , Brant J , Imnadze P , Chitadze N , Tevzadze L , Chanturia G , Tevdoradze T , Tsertsvadze T , Talkington D , Mody RK , Strockbine N , Gerber RA , Maes E , Rush T . J Epidemiol Glob Health 2014 4 (4) 249-59 In July-August 2009, eight patients with bloody diarrhea complicated by hemolytic uremic syndrome (HUS) were admitted to hospitals in Tbilisi, Georgia. We started active surveillance in two regions for bloody diarrhea and post-diarrheal HUS. Of 25 case-patients who developed HUS, including the initial 8 cases, half were 15 years old, 67% were female, and seven (28%) died. No common exposures were identified. Among 20 HUS case-patients tested, Shiga toxin was detected in the stools of 2 patients (one with elevated serum IgG titers to several Escherichia coli serogroups, including O111 and O104). Among 56 persons with only bloody diarrhea, we isolated Shiga toxin-producing E. coli (STEC) O104:H4 from 2 and Shigella from 10; 2 had serologic evidence of E. coli O26 infection. These cases may indicate a previously unrecognized burden of HUS in Georgia. We recommend national reporting of HUS and improving STEC detection capacity. |
Analysis of Neisseria gonorrhoeae azithromycin susceptibility in the United States by the Gonococcal Isolate Surveillance Project, 2005 to 2013
Kirkcaldy RD , Soge O , Papp JR , Hook EW 3rd , Del Rio C , Kubin G , Weinstock HS . Antimicrob Agents Chemother 2014 59 (2) 998-1003 BACKGROUND: Azithromycin, administered with ceftriaxone, is recommended by CDC for treatment of gonorrhea. Many experts have expressed concern about the ease with which Neisseria gonorrhoeae can acquire macrolide resistance. OBJECTIVE: We sought to describe gonococcal azithromycin susceptibility in the United States and determine whether azithromycin susceptibility has changed over time. METHODS: We analyzed 2005-2013 data from the Gonococcal Isolate Surveillance Project, a CDC-supported sentinel surveillance network that monitors gonococcal antimicrobial susceptibility. RESULTS: 44,144 N. gonorrhoeae isolates were tested for azithromycin susceptibility by agar dilution methods. The overall azithromycin MIC50 was 0.25 μg/ml and the MIC90 was 0.5 μg/ml. There were no overall temporal trends in geometric means. Isolates from men who have sex with men had significantly higher geometric mean MICs than isolates from men who have sex exclusively with women. The overall prevalence of reduced azithromycin susceptibility (MIC ≥2 μg/ml) was 0.4% and varied by year from 0.3% (2006 and 2009) to 0.6% (2013). CONCLUSION: We did not find a clear temporal trend in gonococcal azithromycin MICs in the United States, and the prevalence of reduced azithromycin susceptibility remains low. These findings support continued use of azithromycin in the combination therapy regimen for gonorrhea. |
Assessment of the potential for international dissemination of Ebola virus via commercial air travel during the 2014 west African outbreak
Bogoch II , Creatore MI , Cetron MS , Brownstein JS , Pesik N , Miniota J , Tam T , Hu W , Nicolucci A , Ahmed S , Yoon JW , Berry I , Hay SI , Anema A , Tatem AJ , MacFadden D , German M , Khan K . Lancet 2014 385 (9962) 29-35 BACKGROUND: The WHO declared the 2014 west African Ebola epidemic a public health emergency of international concern in view of its potential for further international spread. Decision makers worldwide are in need of empirical data to inform and implement emergency response measures. Our aim was to assess the potential for Ebola virus to spread across international borders via commercial air travel and assess the relative efficiency of exit versus entry screening of travellers at commercial airports. METHODS: We analysed International Air Transport Association data for worldwide flight schedules between Sept 1, 2014, and Dec 31, 2014, and historic traveller flight itinerary data from 2013 to describe expected global population movements via commercial air travel out of Guinea, Liberia, and Sierra Leone. Coupled with Ebola virus surveillance data, we modelled the expected number of internationally exported Ebola virus infections, the potential effect of air travel restrictions, and the efficiency of airport-based traveller screening at international ports of entry and exit. We deemed individuals initiating travel from any domestic or international airport within these three countries to have possible exposure to Ebola virus. We deemed all other travellers to have no significant risk of exposure to Ebola virus. FINDINGS: Based on epidemic conditions and international flight restrictions to and from Guinea, Liberia, and Sierra Leone as of Sept 1, 2014 (reductions in passenger seats by 51% for Liberia, 66% for Guinea, and 85% for Sierra Leone), our model projects 2.8 travellers infected with Ebola virus departing the above three countries via commercial flights, on average, every month. 
91 547 (64%) of all air travellers departing Guinea, Liberia, and Sierra Leone had expected destinations in low-income and lower-middle-income countries. Screening international travellers departing three airports would enable health assessments of all travellers at highest risk of exposure to Ebola virus infection. INTERPRETATION: Decision makers must carefully balance the potential harms from travel restrictions imposed on countries that have Ebola virus activity against any potential reductions in risk from Ebola virus importations. Exit screening of travellers at airports in Guinea, Liberia, and Sierra Leone would be the most efficient frontier at which to assess the health status of travellers at risk of Ebola virus exposure; however, this intervention might require international support to implement effectively. FUNDING: Canadian Institutes of Health Research. |
Case-control study of risk factors for human infection with avian influenza A(H7N9) virus in Shanghai, China, 2013
Li J , Chen J , Yang G , Zheng YX , Mao SH , Zhu WP , Yu XL , Gao Y , Pan QC , Yuan ZA . Epidemiol Infect 2014 143 (9) 1-7 The first human infection with avian influenza A(H7N9) virus was reported in Shanghai, China in March 2013. An additional 32 cases of human H7N9 infection were identified in the following months from March to April 2013 in Shanghai. Here we conducted a case-control study of the patients with H7N9 infection (n = 25) using controls matched by age, sex, and residence to determine risk factors for H7N9 infection. Our findings suggest that chronic disease and frequency of visiting a live poultry market (>10 times, or 1-9 times during the 2 weeks before illness onset) were likely to be significantly associated with H7N9 infection, with the odds ratios being 4.07 [95% confidence interval (CI) 1.32-12.56], 10.61 (95% CI 1.85-60.74), and 3.76 (95% CI 1.31-10.79), respectively. Effective strategies for live poultry market control should be reinforced and ongoing education of the public is warranted to promote behavioural changes that can help to eliminate direct or indirect contact with influenza A(H7N9) virus. |
Condom effectiveness for HIV prevention by consistency of use among men who have sex with men (MSM) in the U.S
Smith DK , Herbst JH , Zhang X , Rose CE . J Acquir Immune Defic Syndr 2014 68 (3) 337-44 OBJECTIVE: We derived an estimate of male condom effectiveness during anal sex among men who have sex with men (MSM) because the most widely used estimate of condom effectiveness (80%) was based on studies of persons during heterosexual sex with an HIV-positive partner. DESIGN: Assessed male condom effectiveness during anal sex between MSM in two prospective cohort studies of HIV incidence by self-reported consistency of use. METHODS: Analyzed data combined from US participants in the EXPLORE trial (1999-2001) public use dataset and in the VAX 004 trial (1998-1999) dataset. Initially HIV-uninfected MSM enrolled in these trials completed baseline and semiannual interviews about their sexual behaviors with male partners and underwent HIV testing. Using a time-to-event model, effectiveness of consistent condom use in preventing HIV infection was estimated among men reporting receptive and/or insertive anal sex with an HIV-positive partner, and consistency of condom use. RESULTS: Among MSM reporting any anal sex with an HIV-positive male partner, we found 70% effectiveness with reported consistent condom use (compared to never use) and no significant protection when comparing sometimes use to never use. This point estimate for MSM was less than the 80% effectiveness estimate reported for heterosexuals in HIV discordant couples reporting consistent condom use. However, the point estimates in the two populations are not statistically different. Only 16% of MSM reported consistent condom use during anal sex with male partners of any HIV status over the entire observation period. CONCLUSIONS: These estimates are useful for counseling efforts, and for modeling the impact and comparative effectiveness of condoms and other prevention methods used by MSM. |
Diabetes and tuberculosis in the Pacific Islands region
Viney K , Brostrom R , Nasa J , Defang R , Kienene T . Lancet Diabetes Endocrinol 2014 2 (12) 932 We read with great interest the Series paper by Lönnroth and colleagues1 about tuberculosis and diabetes published in The Lancet Diabetes & Endocrinology. The Series paper focused on countries with the largest burden of tuberculosis and diabetes-associated tuberculosis, but did not mention the Pacific Island countries. Pacific Island countries generally have small population sizes, and we presume that this is the reason for the omission. | However, many of the 22 countries that comprise the Pacific Islands region have very high rates of type 2 diabetes (up to 37% prevalence in adults)2 and high rates of tuberculosis (up to 343 cases per 100 000 population).3 Seven of the world's ten countries with the highest prevalence of diabetes are in the Pacific Islands region, with rates exceeding 25% in five island countries.2 Similarly, tuberculosis case notification rates are also high in selected Pacific Island countries, with the region also recording some of the highest tuberculosis rates in the world.3 Because the population sizes are small, the Pacific Islands region is often not prioritised by policy makers.4 | Regarding diabetes-associated tuberculosis, the Pacific Islands region should be highly prioritised by policy makers, programme managers, and researchers.
With the use of original data (Viney K and Brostrom R, unpublished), we estimate that 25% of tuberculosis cases in Kiribati and 41% in the Marshall Islands are directly attributable to diabetes, rates that are much higher than those in other countries in Lönnroth and colleagues' Series paper.1 Additionally, the Pacific Islands region continues to pioneer the development and implementation of useful clinical standards to manage and prevent tuberculosis and diabetes.5 We urge policy makers to recognise the important health problems in the Pacific Islands region, including the challenging work undertaken by Ministries of Health and technical agencies to improve patient outcomes for populations with this dual burden of disease. |
Nationwide assessment of insecticide susceptibility in Anopheles gambiae populations from Zimbabwe
Lukwa N , Sande S , Makuwaza A , Chiwade T , Netsa M , Asamoa K , Vazquez-Prokopec G , Reithinger R , Williams J . Malar J 2014 13 408 BACKGROUND: The scale-up of malaria interventions in sub-Saharan Africa has been accompanied by a dramatic increase in insecticide resistance in Anopheles spp. In Zimbabwe resistance to pyrethroid insecticides was reported in Gokwe District in 2008. This study reports results of the first nation-wide assessment of insecticide susceptibility in wild populations of Anopheles gambiae sensu lato (s.l.) in Zimbabwe, and provides a comprehensive review of the insecticide resistance status of An. gambiae s.l. in southern African countries. METHODS: World Health Organization (WHO) insecticide susceptibility tests were performed on 2,568 field collected mosquitoes originating from 13 sentinel sites covering all endemic regions in Zimbabwe in 2011-2012. At each site, 24-hour mortality and knock-down values for 50% and 90% of exposed mosquitoes (KD50 and KD90, respectively) were calculated for pools of 20-84 (mean, 54) mosquitoes exposed to 4% DDT, 0.1% bendiocarb, 0.05% lambda-cyhalothrin or 5% malathion. Susceptibility results from Zimbabwe were compiled with results published during 2002-2012 for all southern African countries to investigate the resistance status of An. gambiae s.l. in the region. RESULTS: Using WHO criteria, insecticide resistance was not detected at any site sampled and for any of the insecticide formulations tested during the malaria transmission season in 2012. Knock-down within 1 hr post-insecticide exposure ranged from 95% to 100%; mortality 24 hours post-insecticide exposure ranged from 98% to 100%. Despite the lack of insecticide resistance, high variability was found across sites in KD50 and KD90 values. A total of 24 out of 64 (37.5%) sites in southern Africa with reported data had evidence of phenotypic insecticide resistance in An. gambiae s.l. to at least one insecticide. 
CONCLUSION: Despite a long history of indoor residual spraying of households with insecticide, up to 2012 there was no evidence of phenotypic resistance to any of the four insecticide classes in An. gambiae s.l. collected across different eco-epidemiological areas in Zimbabwe. Results reinforce the need for careful monitoring over time in sentinel sites in order to detect the potential emergence and propagation of insecticide resistance as insecticidal vector control interventions in Zimbabwe continue to be implemented. |
Introduction of the exotic tick Hyalomma truncatum on a human with travel to Ethiopia: a case report
Mathison BA , Gerth WJ , Pritt BS , Baugh S . Ticks Tick Borne Dis 2014 6 (2) 152-4 An Oregon resident returned from a photography trip to Ethiopia with a male Hyalomma truncatum tick attached to the skin on his lower back. The tick was identified morphologically and deposited in the U.S. National Tick Collection housed at Georgia Southern University, Statesboro, Georgia. The public health importance of Hyalomma species of ticks and diagnostic dilemmas with identifying exotic ticks imported into the U.S. are discussed. |
Serum concentrations of perfluorinated compounds (PFC) among selected populations of children and adults in California
Wu XM , Bennett DH , Calafat AM , Kato K , Strynar M , Andersen E , Moran RE , Tancredi DJ , Tulve NS , Hertz-Picciotto I . Environ Res 2014 136c 264-273 Perfluorinated compounds (PFCs) have been widely used in industrial applications and consumer products. Their persistent nature and potential health impacts are of concern. Given the high cost of collecting serum samples, this study aimed to determine whether PFC serum concentrations can be quantified using factors extracted from questionnaire responses and indirect measurements, and whether a single serum measurement can be used to classify an individual's exposure over a one-year period. The study population included three demographic groups: young children (2-8 years old) (N=67), parents of young children (<55 years old) (N=90), and older adults (>55 years old) (N=59). PFC serum concentrations, house dust concentrations, and questionnaires were collected. The geometric mean of perfluorooctane sulfonic acid (PFOS) was highest for the older adults. In contrast, the geometric mean of perfluorooctanoic acid (PFOA) was highest for children. Serum concentrations of the parent and the child from the same family were moderately correlated (Spearman correlation (r)=0.26-0.79, p<0.05), indicating common sources within a family. For adults, age, having occupational exposure or having used a fire extinguisher, frequencies of consuming butter/margarine, pork, canned meat entrees, tuna and white fish, and freshwater fish, and whether they ate microwave popcorn were significantly positively associated with serum concentrations of individual PFCs. For children, residential dust concentrations, frequency of wearing waterproof clothes, frequencies of having canned fish, hotdogs, chicken nuggets, French fries, and chips, and whether they ate microwave popcorn were significant positive predictors of individual PFC serum concentrations.
In addition, the serum concentrations collected in a subset of young children (N=20) and parents (N=42) one year later were strongly correlated (r=0.68-0.98, p<0.001) with the levels measured at the first visits but showed a decreasing trend. Children had a moderate correlation (r=0.43) between serum and dust concentrations of PFOS, indicating that indoor sources contribute to exposure. In conclusion, besides food intake, occupational exposure, consumer product use, and exposure to residential dust contribute to PFC exposure. The downward temporal trend in serum concentrations reflects the reduction in PFC use in recent years, while the year-to-year correlation indicates that a single serum measurement could serve as an estimate of exposure relative to the population over a one-year period in epidemiology studies. |
Case-control study of breast cancer and exposure to synthetic environmental chemicals among Alaska Native women
Holmes AK , Koller KR , Kieszak SM , Sjodin A , Calafat AM , Sacco FD , Varner DW , Lanier AP , Rubin CH . Int J Circumpolar Health 2014 73 (1) 25760 BACKGROUND: Exposure to environmental chemicals may impair endocrine system function. Alaska Native (AN) women may be at higher risk of exposure to these endocrine disrupting chemicals, which may contribute to breast cancer in this population. OBJECTIVE: To measure the association between exposure to select environmental chemicals and breast cancer among AN women. DESIGN: A case-control study of 170 women (75 cases, 95 controls) recruited from the AN Medical Center from 1999 to 2002. Participants provided urine and serum samples. Serum was analyzed for 9 persistent pesticides, 34 polychlorinated biphenyl (PCB) congeners, and 8 polybrominated diphenyl ether (PBDE) congeners. Urine was analyzed for 10 phthalate metabolites. We calculated geometric means (GM) and compared cases and controls using logistic regression. RESULTS: Serum concentrations of most pesticides and 3 indicator PCB congeners (PCB-138/158, PCB-153, and PCB-180) were lower in case women than in controls. BDE-47 was significantly higher in case women (GM=38.8 ng/g lipid) than controls (GM=25.1 ng/g lipid) (p=0.04). Persistent pesticides, PCBs, and most phthalate metabolites were not associated with case status in univariate logistic regression. The odds of being a case were higher for those with urinary mono-(2-ethylhexyl) phthalate (MEHP) concentrations that were above the median; this relationship was seen in both univariate (OR 2.16, 95% CI 1.16-4.05, p=0.02) and multivariable (OR 2.43, 95% CI 1.13-5.25, p=0.02) logistic regression. Women with oestrogen receptor-negative/progesterone receptor-negative (ER-/PR-) tumour types tended to have higher concentrations of persistent pesticides than did ER+/PR+ women, although these differences were not statistically significant.
CONCLUSIONS: Exposure to the parent compound of the phthalate metabolite MEHP may be associated with breast cancer. However, our study is limited by small sample size and an inability to control for the confounding effects of body mass index. The association between BDE-47 and breast cancer warrants further investigation. |
SEER cancer registry biospecimen research: yesterday and tomorrow.
Altekruse SF , Rosenfeld GE , Carrick DM , Pressman EJ , Schully SD , Mechanic LE , Cronin KA , Hernandez BY , Lynch CF , Cozen W , Khoury MJ , Penberthy LT . Cancer Epidemiol Biomarkers Prev 2014 23 (12) 2681-7 The National Cancer Institute's (NCI) Surveillance, Epidemiology, and End Results (SEER) registries have been a source of biospecimens for cancer research for decades. Recently, registry-based biospecimen studies have become more practical, with the expansion of electronic networks for pathology and medical record reporting. Formalin-fixed paraffin-embedded specimens are now used for next-generation sequencing and other molecular techniques. These developments create new opportunities for SEER biospecimen research. We evaluated 31 research articles published during 2005 to 2013 based on authors' confirmation that these studies involved linkage of SEER data to biospecimens. Rather than providing an exhaustive review of all possible articles, our intent was to indicate the breadth of research made possible by such a resource. We also summarize responses to a 2012 questionnaire that was broadly distributed to the NCI intra- and extramural biospecimen research community. This included responses from 30 investigators who had used SEER biospecimens in their research. The survey was not intended to be a systematic sample, but instead to provide anecdotal insight on strengths, limitations, and the future of SEER biospecimen research. Identified strengths of this research resource include biospecimen availability, cost, and annotation of data, including demographic information, stage, and survival. Shortcomings include limited annotation of clinical attributes such as detailed chemotherapy history and recurrence, and timeliness of turnaround following biospecimen requests. A review of selected SEER biospecimen articles, investigator feedback, and technological advances reinforced our view that SEER biospecimen resources should be developed. 
This would advance cancer biology, etiology, and personalized therapy research. |
Molecular mechanisms of fluconazole resistance in Candida parapsilosis isolates from a U.S. surveillance system.
Grossman NT , Pham CD , Cleveland AA , Lockhart SR . Antimicrob Agents Chemother 2014 59 (2) 1030-7 Candida parapsilosis is the second or third most common cause of candidemia in many countries. The Infectious Diseases Society of America recommends fluconazole as primary therapy for C. parapsilosis candidemia. Although fluconazole resistance among C. parapsilosis isolates is low in most US institutions, the resistance rate can be as high as 7.5%. This study was designed to assess the mechanisms of fluconazole resistance in 706 incident bloodstream isolates from US hospitals. We sequenced the ERG11 and MRR1 genes of 122 C. parapsilosis isolates with resistant (30 isolates; 4.2%), susceptible dose-dependent (37 isolates; 5.2%), and susceptible (55 isolates) fluconazole MIC values, and used RT-PCR on RNA from 17 isolates to investigate the regulation of MDR1. By comparing these isolates to fully fluconazole-susceptible isolates, we detected at least two mechanisms of fluconazole resistance: an amino acid substitution in the 14-alpha-demethylase gene ERG11, and overexpression of the efflux pump MDR1, possibly due to point mutations in the MRR1 transcription factor that regulates MDR1. The ERG11 single nucleotide polymorphism (SNP) was found in 57% of the fluconazole-resistant isolates and in no susceptible isolates. The MRR1 SNPs were more difficult to characterize, as not all resulted in overexpression of MDR1 and not all MDR1 overexpression was associated with an SNP in MRR1. Further work to characterize the MRR1 SNPs and to search for overexpression of other efflux pumps is needed. |
Clinical utility of gene-expression profiling in women with early breast cancer: an overview of systematic reviews.
Marrone M , Stewart A , Dotson WD . Genet Med 2014 17 (7) 519-32 PURPOSE: This overview systematically evaluates the clinical utility of using the Oncotype DX and MammaPrint gene-expression profiling tests to direct treatment decisions in women with breast cancer. The findings are intended to inform an updated recommendation from the Evaluation of Genomic Applications in Practice and Prevention Working Group. METHODS: Evidence reported in systematic reviews evaluating the clinical utility of Oncotype DX and MammaPrint, as well as their ability to predict treatment outcomes, change treatment decisions, and demonstrate cost-effectiveness, was qualitatively synthesized. RESULTS: Five systematic reviews found no direct evidence of clinical utility for either test. Indirect evidence showed that Oncotype DX was able to predict treatment effects of adjuvant chemotherapy, whereas no evidence of predictive value was found for MammaPrint. Both tests influenced a change in treatment recommendations in 21-74% of participants. The cost-effectiveness of Oncotype DX varied with the alternative compared. For MammaPrint, the lack of evidence of predictive value led to uncertainty in the cost-effectiveness. CONCLUSION: No studies were identified that provided direct evidence that using gene-expression profiling tests to direct treatment decisions improves outcomes in women with breast cancer. Three ongoing studies may provide direct evidence for determining the clinical utility of gene-expression profiling testing. |
Full genome characterization of human Rotavirus A strains isolated in Cameroon, 2010-2011: diverse combinations of the G and P genes and lack of reassortment of the backbone genes.
Ndze VN , Esona MD , Achidi EA , Gonsu KH , Doro R , Marton S , Farkas S , Ngeng MB , Ngu AF , Obama-Abena MT , Bányai K . Infect Genet Evol 2014 28 537-60 Over the past few years, whole genome sequencing of rotaviruses has become a routine laboratory method in many strain surveillance studies. To study the molecular evolutionary pattern of representative Cameroonian Rotavirus A (RVA) strains, the semiconductor sequencing approach was used following random amplification of genomic RNA. In total, 31 RVA strains collected during 2010-2011 in three Cameroonian study sites located 120 to 1240 km from each other were sequenced and analyzed. Sequence analysis of the randomly selected representative strains showed that 18 RVAs were Wa-like, expressing G1P[6], G12P[6], or G12P[8] neutralization antigens on the genotype 1 genomic constellation (I1-R1-C1-M1-A1-N1-T1-E1-H1), whereas 13 other strains were DS-1-like, expressing G2P[4], G2P[6], G3P[6], and G6P[6] on the genotype 2 genomic constellation (I2-R2-C2-M2-A2-N2-T2-E2-H2). No inter-genogroup reassortment in the backbone genes was observed. Phylogenetic analysis of the Cameroonian G6P[6] strains indicated the separation of the strains identified in the Far North region (Maroua) and the Northwest region (Bamenda and Esu) into two branches, which is consistent with multiple introductions of G6P[6] strains into this country. The present whole genome-based molecular characterization study indicates that the emerging G6P[6] strain is fully heterotypic to Rotarix, the vaccine introduced into the childhood immunization program in Cameroon during 2014. Continuous strain monitoring is therefore needed in this area and elsewhere to determine whether G6, in addition to genotypes G1 to G4, G8, G9, and G12, may become a new, regionally important genotype in the post-vaccine-licensure era in Africa. |
Plasmid-mediated quinolone resistance in isolates of Salmonella enterica serotype Typhi, USA.
Sjolund-Karlsson M , Howie R , Rickert R , Newton A , Gonzalez-Aviles G , Crump JA . Int J Antimicrob Agents 2014 45 (1) 88-90 Salmonella enterica serotype Typhi is the causative agent of typhoid fever, a severe, systemic, febrile illness. The infection is usually acquired through consumption of water or food contaminated with human faecal material and is therefore more common in areas with poor sanitation and crowding. Typhoid fever was estimated to cause 11.9 million illnesses and 129,000 deaths in 2010 [1]. A large proportion of typhoid fever cases occur among infants and children in South-Central and Southeast Asia. Typhoid fever in the USA is often associated with international travel. | Timely treatment with appropriate antimicrobial agents is critical for optimal management of typhoid fever. However, resistance to traditional first-line antimicrobial agents (chloramphenicol, ampicillin and trimethoprim/sulfamethoxazole) is common and has prompted the use of other drugs such as fluoroquinolones (e.g. ciprofloxacin) [2]. In the USA, antimicrobial susceptibility among S. Typhi is monitored by the National Antimicrobial Resistance Monitoring System (NARMS) at the US Centers for Disease Control and Prevention (CDC). |
Outbreak of hepatitis C virus infection associated with narcotics diversion by a hepatitis C virus-infected surgical technician.
Warner AE , Schaefer MK , Patel PR , Drobeniuc J , Xia G , Lin Y , Khudyakov Y , Vonderwahl CW , Miller L , Thompson ND . Am J Infect Control 2014 43 (1) 53-8 BACKGROUND: Drug diversion by health care personnel poses a risk for serious patient harm. Public health officials identified 2 patients diagnosed with acute hepatitis C virus (HCV) infection who shared a common link with a hospital. Further investigation implicated a drug-diverting, HCV-infected surgical technician who was subsequently employed at an ambulatory surgical center. METHODS: Patients at the 2 facilities were offered testing for HCV infection if they were potentially exposed. Serum from the surgical technician and from patients testing positive for HCV but without evidence of infection before their surgical procedure was further tested to determine HCV genotype and quasi-species sequences. Parenteral medication handling practices at the 2 facilities were evaluated. RESULTS: The 2 facilities notified 5970 patients of their possible exposure to HCV, 88% of whom were tested and had results reported to the state public health departments. Eighteen patients had HCV highly related to the surgical technician's virus. The surgical technician gained unauthorized access to fentanyl owing to limitations in procedures for securing controlled substances. CONCLUSIONS: Public health surveillance identified an outbreak of HCV infection due to an infected health care provider engaged in diversion of injectable narcotics. The investigation highlights the value of public health surveillance in identifying HCV outbreaks and in uncovering a method of drug diversion and its impact on patients. |
Assessment of empirical antibiotic therapy optimisation in six hospitals: an observational cohort study
Braykov NP , Morgan DJ , Schweizer ML , Uslan DZ , Kelesidis T , Weisenberg SA , Johannsson B , Young H , Cantey J , Srinivasan A , Perencevich E , Septimus E , Laxminarayan R . Lancet Infect Dis 2014 14 (12) 1220-7 BACKGROUND: Modification of empirical antimicrobials when warranted by culture results or clinical signs is recommended to control antimicrobial overuse and resistance. We aimed to assess the frequency with which patients were started on empirical antimicrobials, characteristics of the empirical regimen and the clinical characteristics of patients at the time of starting antimicrobials, patterns of changes to empirical therapy at different timepoints, and modifiable factors associated with changes to the initial empirical regimen in the first 5 days of therapy. METHODS: We did a chart review of adult inpatients receiving one or more antimicrobials in six US hospitals on 4 days during 2009 and 2010. Our primary outcome was the modification of antimicrobial regimen on or before the 5th day of empirical therapy, analysed as a three-category variable. Bivariate analyses were used to establish demographic and clinical variables associated with the outcome. Variables with p values below 0.1 were included in a multivariable generalised linear latent and mixed model with multinomial logit link to adjust for clustering within hospitals and accommodate a non-binary outcome variable. FINDINGS: Across the six study sites, 4119 (60%) of 6812 inpatients received antimicrobials. Of 1200 randomly selected patients with active antimicrobials, 730 (61%) met inclusion criteria. At the start of therapy, 220 (30%) patients were afebrile and had normal white blood cell counts. Appropriate cultures were collected from 432 (59%) patients, and 250 (58%) were negative. By the 5th day of therapy, 12.5% of empirical antimicrobials were escalated, 21.5% were narrowed or discontinued, and 66.4% were unchanged. 
Narrowing or discontinuation was more likely when cultures were collected at the start of therapy (adjusted OR 1.68, 95% CI 1.05-2.70) and no infection was noted on an initial radiological study (1.76, 1.11-2.79). Escalation was associated with multiple infection sites (2.54, 1.34-4.83) and a positive culture (1.99, 1.20-3.29). INTERPRETATION: Broad-spectrum empirical therapy is common, even when clinical signs of infection are absent. Fewer than one in three inpatients have their regimens narrowed within 5 days of starting empirical antimicrobials. Improved diagnostic methods and continued education are needed to guide discontinuation of antimicrobials. FUNDING: US Centers for Disease Control and Prevention, Division of Healthcare Quality Promotion; Robert Wood Johnson Foundation; US Department of Veterans Affairs; US Department of Homeland Security. |
Reduced rotavirus vaccine effectiveness among children born during the rotavirus season: a pooled analysis of 5 case-control studies from the Americas
Premkumar PS , Parashar UD , Gastanaduy PA , McCracken JP , de Oliveira LH , Payne DC , Patel MM , Tate JE , Lopman BA . Clin Infect Dis 2014 60 (7) 1075-8 We assessed whether birth during the rotavirus season modifies rotavirus vaccine effectiveness (VE), using data from rotavirus VE studies. In the first year of life, adjusted VE was 72% for children born during the rotavirus season and 84% for children born in other months (P = .01). Seasonal factors may interfere with vaccine performance. |
Safety and immunogenicity of dry powder measles vaccine administered by inhalation: a randomized controlled Phase I clinical trial
Agarkhedkar S , Kulkarni PS , Winston S , Sievers R , Dhere RM , Gunale B , Powell K , Rota PA , Papania M . Vaccine 2014 32 (50) 6791-7 BACKGROUND: Measles is a highly infectious respiratory disease that causes 122,000 deaths annually. Although measles vaccine is extremely safe and effective, vaccine coverage could be improved by a vaccine that is more easily administered and transported. We developed an inhalable dry powder measles vaccine (MVDP) and two delivery devices, and demonstrated the safety, immunogenicity, and efficacy of the vaccine in preclinical studies. Here we report the first clinical trial of MVDP delivered by inhalation. METHODOLOGY: Sixty adult males aged 18 to 45 years, seropositive for measles antibody, were enrolled in this controlled Phase I clinical study. Subjects were randomly assigned in a 1:1:1 ratio to receive MVDP by the Puffhaler(R) or Solovent device, or the licensed subcutaneous measles vaccine. Adverse events (AEs) were recorded with diary cards until day 28 post-vaccination, and subjects were followed for 180 days post-vaccination to assess potential serious long-term adverse events. Measles antibody was measured 7 days before vaccination and at days 21 and 77 after vaccination by ELISA and a plaque reduction neutralization test. RESULTS: All subjects completed the study according to protocol. Most subjects had high levels of baseline measles antibody. No adverse events were reported. MVDP produced serologic responses similar to subcutaneous vaccination. CONCLUSIONS: MVDP was well tolerated in all subjects. Most subjects had high baseline measles antibody titers, which limited the ability to measure serologic responses and may have limited the adverse events following vaccination. Additional studies in subjects without pre-existing measles antibody are needed to further elucidate the safety and immunogenicity of MVDP. |
Seasonal influenza vaccine coverage among high-risk populations in Thailand, 2010-2012
Owusu JT , Prapasiri P , Ditsungnoen D , Leetongin G , Yoocharoen P , Rattanayot J , Olsen SJ , Muangchana C . Vaccine 2014 33 (5) 742-7 BACKGROUND: The Advisory Committee on Immunization Practice of Thailand prioritizes seasonal influenza vaccination for populations at highest risk for serious complications (pregnant women, children 6 months-2 years, persons ≥65 years, persons with chronic diseases, and obese persons), as well as healthcare personnel and poultry cullers. The Thailand government purchases seasonal influenza vaccine for these groups. We assessed vaccination coverage among high-risk groups in Thailand from 2010 to 2012. METHODS: National records on persons who received publicly purchased vaccines from 2010 to 2012 were analyzed by high-risk category. Denominator data from multiple sources were compared to calculate coverage. Vaccine coverage was defined as the proportion of individuals in each category who received the vaccine. Vaccine wastage was defined as the proportion of publicly purchased vaccines that were not used. RESULTS: From 2010 to 2012, 8.18 million influenza vaccine doses were publicly purchased (range, 2.37-3.29 million doses/year), and vaccine purchases increased 39% over this period. Vaccine wastage was 9.5%. Approximately 5.7 million (77%) vaccine doses were administered to persons ≥65 years and persons with chronic diseases, 1.4 million (19%) to healthcare personnel/poultry cullers, 82,570 (1.1%) to children 6 months-2 years, 78,885 (1.1%) to obese persons, 26,481 (0.4%) to mentally disabled persons, and 17,787 (0.2%) to pregnant women. Between 2010 and 2012, coverage increased among persons with chronic diseases (8.6% versus 14%; p<0.01) and persons ≥65 years (12% versus 20%; p<0.01); however, coverage decreased for mentally disabled persons (6.1% versus 4.9%; p<0.01), children 6 months-2 years (2.3% versus 0.9%; p<0.01), pregnant women (1.1% versus 0.9%; p<0.01), and obese persons (0.2% versus 0.1%; p<0.01). 
CONCLUSIONS: From 2010 to 2012, the availability of publicly purchased vaccines increased. While coverage remained low for all target groups, coverage was highest among persons ≥65 years and persons with chronic diseases. Annual coverage assessments are necessary to promote higher coverage among high-risk groups in Thailand. |
Strain diversity plays no major role in the varying efficacy of rotavirus vaccines: an overview
Velasquez DE , Parashar UD , Jiang B . Infect Genet Evol 2014 28 561-71 While the monovalent Rotarix(R) [RV1] and pentavalent RotaTeq(R) [RV5] vaccines have been extensively tested and found generally safe and equally efficacious in clinical trials, questions linger about the evolving diversity of circulating rotavirus strains over time and their relationship with the protective immunity induced by rotavirus vaccines. We reviewed data from clinical trials and observational studies that assessed the efficacy or field effectiveness of rotavirus vaccines against different rotavirus strains worldwide. RV1 provided broad clinical efficacy and field effectiveness against severe diarrhea due to all major circulating strains, including the homotypic G1P[8] and the fully heterotypic G2P[4] strains. Similarly, RV5 provided broad efficacy and effectiveness against RV5 and non-RV5 strains across different locations. Rotavirus vaccination provides broad heterotypic protection; however, continuing surveillance is needed to track changes in circulating strains and to monitor the effectiveness and safety of vaccines. |
Live attenuated H7N7 influenza vaccine primes for a vigorous antibody response to inactivated H7N7 influenza vaccine
Babu TM , Levine M , Fitzgerald T , Luke C , Sangster MY , Jin H , Topham D , Katz J , Treanor J , Subbarao K . Vaccine 2014 32 (50) 6798-804 BACKGROUND: H7 influenza viruses have emerged as a potential pandemic threat. We evaluated the safety and immunogenicity of two candidate H7 pandemic live attenuated influenza vaccines (pLAIV) and their ability to prime for responses to an unadjuvanted H7 pandemic inactivated influenza vaccine (pIIV). METHODS: Healthy seronegative adults received two doses of A/Netherlands/219/03 (H7N7) or one dose of A/chicken/British Columbia/CN-6/04 (H7N3) pLAIV, all given intranasally as 10^7.5 50% tissue culture infective doses (TCID50). A subset of subjects received one 45 μg dose of H7N7 pIIV containing the A/Mallard/Netherlands/12/2000 HA intramuscularly 18-24 months after pLAIV. Viral shedding was assessed by culture and real-time reverse transcription-polymerase chain reaction (rRT-PCR); B cell responses following pLAIV were evaluated by ELISPOT and flow cytometry. Serum antibody was assessed by hemagglutination-inhibition (HAI), microneutralization (MN), and ELISA assays after each vaccine. RESULTS: Serum HAI or MN responses were not detected in any subject following one or two doses of either H7 pLAIV, although some subjects had detectable H7-specific B cells after vaccination. However, 10/13 subjects primed with two doses of H7N7 pLAIV responded to a subsequent dose of the homologous H7N7 pIIV with high-titer HAI and MN antibody that cross-reacted with both North American and Eurasian lineage H7 viruses, including H7N9. In contrast, naive subjects and recipients of a single dose of the mismatched H7N3 pLAIV did not develop HAI or MN antibody after pIIV. CONCLUSIONS: While pLAIVs did not elicit detectable serum MN or HAI antibody, strain-specific pLAIV priming established long-term immune memory that was cross-reactive with other H7 influenza strains. Understanding the mechanisms underlying priming by pLAIV may aid in pandemic vaccine development. |
The history of the United States Advisory Committee on Immunization Practices (ACIP)
Walton LR , Orenstein WA , Pickering LK . Vaccine 2014 33 (3) 405-14 The United States Advisory Committee on Immunization Practices (ACIP) is a federal advisory committee that develops written recommendations for use of vaccines licensed by the Food and Drug Administration (FDA) for the U.S. civilian population. Vaccine development and disease outbreaks contributed to the need for a systematized, science-based, formal mechanism for establishing national immunization policy in this country. Formed in 1964, the ACIP was charged with this role. The committee has undergone significant changes in structure and operational activities during its 50-year history. The ACIP works closely with many liaison organizations to develop its immunization recommendations, which are harmonized among key professional medical societies. ACIP vaccine recommendations form two immunization schedules, which are updated annually: (1) the childhood and adolescent immunization schedule and (2) the adult immunization schedule. Today, once ACIP recommendations are adopted by the Director of the Centers for Disease Control and Prevention and the Secretary of the Department of Health and Human Services, these recommendations are published in Morbidity and Mortality Weekly Report (MMWR), become official policy, and are incorporated into the appropriate immunization schedule. |
International meeting on influenza vaccine effectiveness, 3-4 December 2012, Geneva, Switzerland
Lafond KE , Tam JS , Bresee JS , Widdowson MA . Vaccine 2014 32 (49) 6591-5 On 3-4 December 2012, the World Health Organization convened a meeting of influenza vaccine effectiveness (VE) experts from more than 25 countries in Geneva, Switzerland, to review recent developments in the global influenza vaccine landscape and evaluate approaches to determining the effectiveness of influenza vaccine products among target populations. Vaccine manufacturers from Thailand, Vietnam, India, and Brazil shared recent advances illustrating the expansion of influenza vaccine production worldwide. Randomized controlled trials are underway in several low- and middle-income countries, including India, Thailand, Bangladesh, and South Africa, to fill knowledge gaps in target populations such as children and pregnant women. National and international networks in the United States, Canada, Europe, Latin America, and Australia are conducting multi-site observational studies with shared methodologies to generate national influenza VE estimates and pool data for regional estimates. Standardized VE estimation methods are key to generating point estimates that are comparable internationally and across different settings. |
Childhood vaccines and Kawasaki disease, Vaccine Safety Datalink, 1996-2006
Abrams JY , Weintraub ES , Baggs JM , McCarthy NL , Schonberger LB , Lee GM , Klein NP , Belongia EA , Jackson ML , Naleway AL , Nordin JD , Hambidge SJ , Belay ED . Vaccine 2014 33 (2) 382-7 BACKGROUND: Kawasaki disease is a childhood vascular disorder of unknown etiology. Concerns have been raised about vaccinations being a potential risk factor for Kawasaki disease. METHODS: Data from the Vaccine Safety Datalink were collected on children aged 0-6 years at seven managed care organizations across the United States. Defining exposure as one of several time periods up to 42 days after vaccination, we conducted Poisson regressions controlling for age, sex, season, and managed care organization to determine if rates of physician-diagnosed and verified Kawasaki disease were elevated following vaccination compared to rates during all unexposed periods. We also performed case-crossover analyses to control for unmeasured confounding. RESULTS: A total of 1,721,186 children aged 0-6 years from seven managed care organizations were followed for a combined 4,417,766 person-years. The rate of verified Kawasaki disease was significantly lower during the 1-42 days after vaccination (rate ratio=0.50, 95% CL=0.27-0.92) and 8-42 days after vaccination (rate ratio=0.45, 95% CL=0.22-0.90) compared to rates during unexposed periods. Breaking down the analysis by vaccination category did not identify a subset of vaccines that was solely responsible for this association. The case-crossover analyses revealed that children with Kawasaki disease had lower rates of vaccination in the 42 days prior to symptom onset for both physician-diagnosed Kawasaki disease (rate ratio=0.79, 95% CL=0.64-0.97) and verified Kawasaki disease (rate ratio=0.38, 95% CL=0.20-0.75). CONCLUSIONS: The childhood vaccinations studied did not increase the risk of Kawasaki disease; conversely, vaccination was associated with a transient decrease in Kawasaki disease incidence. 
Verifying and understanding this potential protective effect could yield clues to the underlying etiology of Kawasaki disease. |
The complementary roles of Phase 3 trials and post-licensure surveillance in the evaluation of new vaccines
Lopalco PL , DeStefano F . Vaccine 2014 33 (13) 1541-8 Vaccines have led to significant reductions in morbidity, have saved countless lives from many infectious diseases, and are one of the most important public health successes of the modern era. Both vaccine effectiveness and safety are key to the success of immunisation programmes. The role of post-licensure surveillance in the overall vaccine development process has become increasingly recognised by regulatory authorities. The safety, purity, and effectiveness of vaccines are carefully assessed before licensure, but some safety and effectiveness aspects need continuing monitoring after licensure; post-marketing activities are a necessary complement to pre-licensure activities for monitoring vaccine quality and informing public health programmes. In the recent past, the availability of large databases, together with data-mining and cross-linkage techniques, has significantly improved the potential of post-licensure surveillance. The scope of this review is to present the challenges and opportunities offered by vaccine post-licensure surveillance. While pre-licensure activities form the foundation for the development of effective and safe vaccines, post-licensure monitoring and assessment are necessary to assure that vaccines remain effective and safe when translated into real-world settings. Strong partnerships and collaboration at an international level between different stakeholders are necessary for finding and optimally allocating resources and establishing robust post-licensure processes. |
Does closure of children's medical home impact their immunization coverage?
Kolasa MS , Stevenson J , Ossa A , Lutz J . Public Health 2014 128 (12) 1106-11 OBJECTIVES: Little is known about the impact that closing a health care facility has on the immunization coverage of children utilizing that facility as a medical home. The authors assessed the impact of closing a Medicaid managed care facility in Philadelphia on the immunization coverage of children, primarily low-income children from racial/ethnic minority groups, utilizing that facility for routine immunizations. STUDY DESIGN: Observational longitudinal cohort case study. METHODS: Eligible children were born 03/01/05-06/30/07, were present in Philadelphia's immunization information system (IIS), and were active clients of the facility before it closed in September 2007. IIS-recorded immunization coverage at ages 5, 7, 13, 16, and 19 months through January 2009 was compared between clinic children age-eligible to receive specific vaccines before the clinic closed (preclosure cohorts) and children not age-eligible to receive those vaccines prior to closing (postclosure cohorts). RESULTS: Of 630 eligible children, 99 (16%) had no additional IIS-recorded immunizations. Third-dose DTaP vaccine coverage at age seven months among preclosure cohorts was 54.4% vs. 40.3% among postclosure cohorts [risk ratio 1.31 (1.15, 1.49)]. Fourth-dose DTaP coverage at 19 months was 65.9% vs. 57.7% [risk ratio 1.24 (1.08, 1.42)]. MMR coverage at 16 months was 79.5% vs. 69.9% [risk ratio 1.47 (1.22, 1.76)]. Coverage for the 4:3:1:3:3:1 vaccination series at 19 months was 63.8% vs. 53.8% [risk ratio 1.28 (1.12, 1.88)]. CONCLUSIONS: Immunization coverage declined at key age milestones for active clients of a Medicaid managed care facility that closed, as compared with preclosure cohorts of clients from the same facility. When a primary health care facility closes, efforts should be made to ensure that children who had received vaccinations at that facility quickly establish a new medical home. |
A bibliometric analysis of U.S.-based research on the Behavioral Risk Factor Surveillance System
Khalil GM , Gotway Crawford CA . Am J Prev Med 2014 48 (1) 50-7 BACKGROUND: Since Alan Pritchard defined bibliometrics as "the application of statistical methods to media of communication" in 1969, bibliometric analyses have become widespread. To date, however, bibliometrics has not been used to analyze publications related to the U.S. Behavioral Risk Factor Surveillance System (BRFSS). PURPOSE: To determine the most frequently cited BRFSS-related topical areas, institutions, and journals. METHODS: A search of the Web of Knowledge database in 2013 identified U.S.-published studies related to BRFSS, from its start in 1984 through 2012. Search terms were BRFSS, Behavioral Risk Factor Surveillance System, or Behavioral Risk Survey. The resulting 1,387 articles were analyzed descriptively and provided data for VOSviewer, a computer program that plotted a relevance distance-based map and clustered keywords from text in titles and abstracts. RESULTS: Topics, journals, and publishing institutions ranged widely. Most research was clustered by content area, such as cancer screening, access to care, heart health, and quality of life. The American Journal of Preventive Medicine and the American Journal of Public Health published the most BRFSS-related papers (95 and 70, respectively). CONCLUSIONS: Bibliometrics can help identify the most frequently published BRFSS-related topics, publishing journals, and publishing institutions. BRFSS data are widely used, particularly by CDC and academic institutions such as the University of Washington and other universities hosting top-ranked schools of public health. Bibliometric analysis and mapping provide an innovative way of quantifying and visualizing the plethora of research conducted using BRFSS data and summarizing the contribution of this surveillance system to public health. |
Media violence exposure and physical aggression in fifth-grade children
Coker TR , Elliott MN , Schwebel DC , Windle M , Toomey SL , Tortolero SR , Hertz MF , Peskin MF , Schuster MA . Acad Pediatr 2014 15 (1) 82-8 OBJECTIVE: To examine the association of media violence exposure and physical aggression in fifth graders across 3 media types. METHODS: We analyzed data from a population-based, cross-sectional survey of 5,147 fifth graders and their parents in 3 US metropolitan areas. We used multivariable linear regression and report partial correlation coefficients to examine associations between children's exposure to violence in television/film, video games, and music (reported time spent consuming media and reported frequency of violent content: physical fighting, hurting, shooting, or killing) and the Problem Behavior Frequency Scale. RESULTS: Child-reported media violence exposure was associated with physical aggression after multivariable adjustment for sociodemographics, family and community violence, and child mental health symptoms (partial correlation coefficients: TV, 0.17; video games, 0.15; music, 0.14). This association was significant and independent for television, video game, and music violence exposure in a model including all 3 media types (partial correlation coefficients: TV, 0.11; video games, 0.09; music, 0.09). There was a significant positive interaction between media time and media violence for video games and music, but not for television. Effect sizes for the association of media violence exposure with physical aggression were greater in magnitude than those for most of the other examined variables. CONCLUSIONS: The association between physical aggression and media violence exposure is robust and persistent; the strength of this association may be at least as great as that of other factors associated with physical aggression in children, such as neighborhood violence, home violence, child mental health, and male gender. |
Firearm use in G- and PG-rated movies, 2008-2012
Pelletier AR , Eric Tongren J , Gilchrist J . Am J Prev Med 2014 47 (6) e11-2 Popular movies represent a common form of media exposure for children, whether viewed in theaters, on TV, or over the Internet. Based on social cognitive theory, children learn behaviors in part through their exposure to media images [1]. Exposure to violence in media may have a negative impact on children [2]. From 1995 to 2007, almost a third (31%) of the G- and PG-rated movies with the highest U.S. box-office gross revenues had scenes involving firearms [3-5]. Movies released during 2008-2012 were examined to determine whether the depiction of firearms in movies marketed to children has changed. |
Improving injury prevention through health information technology
Haegerich TM , Sugerman DE , Annest JL , Klevens J , Baldwin GT . Am J Prev Med 2014 48 (2) 219-228 Health information technology is an emerging area of focus in clinical medicine with the potential to improve injury and violence prevention practice. With injuries being the leading cause of death for Americans aged 1-44 years, greater implementation of evidence-based preventive services, referral to community resources, and real-time surveillance of emerging threats is needed. Through a review of the literature and capturing of current practice in the field, this paper showcases how health information technology applied to injury and violence prevention can lead to strengthened clinical preventive services, more rigorous measurement of clinical outcomes, and improved injury surveillance, potentially resulting in health improvement. |
Evaluation of fast-track diagnostics and TaqMan array card real-time PCR assays for the detection of respiratory pathogens.
Driscoll AJ , Karron RA , Bhat N , Thumar B , Kodani M , Fields BS , Whitney CG , Levine OS , O'Brien KL , Murdoch DR . J Microbiol Methods 2014 107c 222-226 Several commercial assays are now available to detect the nucleic acid of multiple respiratory pathogens from a single specimen. Head-to-head comparisons of such assays using a single set of standard specimens provide additional information about key assay parameters such as sensitivity, specificity and lower limits of detection, and help to inform the decision regarding which method to use. We evaluated two real-time PCR platforms: the Fast-track Diagnostics(R) (FTD) multiplex respiratory panel and a TaqMan array card (TAC) for simultaneous uniplex detection of multiple respiratory pathogens. Two sets of samples were used to evaluate the assays. One set was created by spiking pooled nasal wash or phosphate buffered saline with specified volumes of known concentrations of virus and/or bacteria. Clinical nasal wash specimens from children with lower respiratory tract illness comprised the other set. Thirteen pathogen targets were compared between the two platforms. Testing with a validation panel of spiked samples revealed a sensitivity of 96.1% and 92.9% for the FTD and TAC assays, respectively. Specificity could not be reliably calculated due to a suspected contamination of the sample substrate. Inter-assay agreement was high (>95%) for most targets. Previously untested clinical specimens tested by both assays revealed a high percent agreement (>95%) for all except rhinovirus, enterovirus and Streptococcus pneumoniae. Limitations of this evaluation included extraction of the validation samples by two different methods and the evaluation of the assays in different laboratories. 
However, neither of these factors significantly impacted inter-assay agreement for these sets of samples, and it was demonstrated that both assays could reliably detect clinically relevant concentrations of bacterial and viral pathogens. |
Disparate detection outcomes for anti-HCV IgG and HCV RNA in dried blood spots.
Tejada-Strop A , Drobeniuc J , Mixson-Hayden T , Forbi JC , Le NT , Li L , Mei J , Terrault N , Kamili S . J Virol Methods 2014 212c 66-70 Dried blood spots (DBS) expedite the collection, storage and shipping of blood samples, thereby facilitating large-scale serologic studies. We evaluated the sensitivity of anti-HCV IgG testing and HCV-RNA quantitation using freshly prepared and stored DBS derived from HCV-infected patients. Protocols for elution were optimized using DBS prepared from plasma of 52 HCV-infected persons and 51 uninfected persons (control DBS), then applied to DBS from 33 chronic hepatitis C patients that had been stored at -20 degrees C for 5 years (stored DBS). Control and stored DBS, and their corresponding plasma, were processed for anti-HCV IgG testing using the VITROS chemiluminescence assay (CIA) and the HCV 3.0 enzyme immunoassay (EIA) (Ortho-Clinical Diagnostics), and for HCV RNA quantitation by quantitative (q) RT-PCR. HCV genotyping was conducted by nucleotide sequencing. The sensitivity of CIA and EIA in control DBS was 92% and 90%, respectively, compared to 100% and 97%, respectively, in stored DBS. The sensitivity of HCV RNA detection was 88% in control DBS, compared to 36% in stored DBS. Specificity was 100% for all the assays in both control and stored DBS. Genotypes 1, 2 and 3 were detected in 16 (62%), 6 (23.1%), and 4 (15.3%) samples, respectively. Sequences generated from DBS and their corresponding plasma samples were identical. Whereas the sensitivity of anti-HCV IgG detection in stored DBS was equivalent to that in recently prepared DBS, the sensitivity of HCV RNA detection was markedly lower in stored DBS compared to recently prepared DBS. Stored DBS may be reliably used for anti-HCV detection but for HCV-RNA-based testing freshly prepared DBS is preferable to stored DBS. |
An HIV-1 RNA test following a reactive fourth-generation antigen/antibody combination assay confirms a high proportion of HIV infections.
Westheimer E , Fu J , Radix A , Giancotti FR , Hall L , Daskalakis DC , Tsoi B , Peters PJ . J Clin Virol 2014 61 (4) 623-624 The HIV testing algorithm in the United States was updated in June 2014 (new testing algorithm) and recommends screening for HIV infection with a fourth-generation HIV antigen/antibody combination immunoassay (IA) followed by a supplemental antibody assay to validate the presence of anti-HIV-1 or HIV-2 antibodies and, if necessary, an HIV-1 RNA assay to resolve discrepancies [1]. This new testing algorithm was reported in a series of manuscripts in a 2013 supplement of this journal [2]. The algorithm includes tests that can detect HIV-1 p24 antigen and HIV-1 RNA resulting in improved acute HIV-1 infection diagnosis compared with HIV antibody testing alone [3]. As an important advantage of this new testing algorithm is the improved sensitivity for detecting acute HIV-1 infection, an alternative has been proposed which uses an FDA-approved HIV-1 RNA assay as the supplemental test after a reactive fourth-generation result instead of an antibody assay. If the HIV-1 RNA result is negative, an HIV-1/HIV-2 antibody differentiation immunoassay is then performed. This algorithm (alternative testing algorithm) is included in the updated recommendations as an “Alternative Testing Sequence When Tests in the Recommended Algorithm Cannot be Used” [1]. We present the results of an evaluation of this alternative testing algorithm using an HIV-1 RNA assay as the confirmatory test. |
Secretory IgA is concentrated in the outer layer of colonic mucus along with gut bacteria
Rogier EW , Frantz AL , Bruno ME , Kaetzel CS . Pathogens 2014 3 (2) 390-403 Antibodies of the secretory IgA (SIgA) class comprise the first line of antigen-specific immune defense, preventing access of commensal and pathogenic microorganisms and their secreted products into the body proper. In addition to preventing infection, SIgA shapes the composition of the gut microbiome. SIgA is transported across intestinal epithelial cells into gut secretions by the polymeric immunoglobulin receptor (pIgR). The epithelial surface is protected by a thick network of mucus, which is composed of a dense, sterile inner layer and a loose outer layer that is colonized by commensal bacteria. Immunofluorescence microscopy of mouse and human colon tissues demonstrated that the SIgA co-localizes with gut bacteria in the outer mucus layer. Using mice genetically deficient for pIgR and/or mucin-2 (Muc2, the major glycoprotein of intestinal mucus), we found that Muc2 but not SIgA was necessary for excluding gut bacteria from the inner mucus layer in the colon. Our findings support a model whereby SIgA is anchored in the outer layer of colonic mucus through combined interactions with mucin proteins and gut bacteria, thus providing immune protection against pathogens while maintaining a mutually beneficial relationship with commensals. |
An ultra-trace analysis technique for SF6 using gas chromatography with negative ion chemical ionization mass spectrometry
Jong EC , Macek PV , Perera IE , Luxbacher KD , McNair HM . J Chromatogr Sci 2014 53 (6) 854-9 Sulfur hexafluoride (SF6) is widely used as a tracer gas because of its detectability at low concentrations. This attribute of SF6 allows the quantification of both small-scale flows, such as leakage, and large-scale flows, such as atmospheric currents. SF6's high detection sensitivity also reduces the quantity of gas required, improving usage efficiency and lowering operating costs for tracer deployments. The detectability of SF6 is produced by its high molecular electronegativity. This property provides a high potential for negative ion formation through electron capture, thus naturally translating to selective detection using negative ion chemical ionization mass spectrometry (NCI-MS). This paper investigates the potential of using gas chromatography (GC) with NCI-MS for the detection of SF6. The experimental parameters for an ultra-trace SF6 detection method utilizing minimal customizations of the analytical instrument are detailed. A method for the detection of parts per trillion (ppt) level concentrations of SF6 for the purpose of underground ventilation tracer gas analysis was successfully developed in this study. The method utilized a Shimadzu gas chromatograph coupled to a negative ion chemical ionization mass spectrometer, equipped with an Agilent J&W HP porous layer open tubular (PLOT) column coated with aluminum oxide (Al2O3) "S". Method detection limit (MDL) analysis of the tracer data, as defined by the Environmental Protection Agency, showed the MDL to be 5.2 ppt. |
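The EPA's method detection limit is computed from replicate low-level measurements as the sample standard deviation multiplied by the one-sided 99th-percentile Student's t value for n-1 degrees of freedom (40 CFR Part 136, Appendix B). A minimal sketch, using illustrative replicate values rather than the study's data:

```python
from statistics import stdev

# One-sided 99th-percentile Student-t values for n-1 degrees of freedom,
# keyed by n (number of replicates); e.g. 7 replicates -> t = 3.143
T99 = {6: 3.365, 7: 3.143, 8: 2.998, 9: 2.896, 10: 2.821}

def epa_mdl(replicates):
    """MDL = t(n-1, 0.99) * s over n replicate low-level spikes,
    per 40 CFR Part 136, Appendix B."""
    return T99[len(replicates)] * stdev(replicates)

# Seven illustrative replicate SF6 measurements (ppt) -- not the study's data
reps = [5.0, 6.1, 4.4, 5.6, 4.9, 5.8, 5.2]
print(round(epa_mdl(reps), 2))  # prints the MDL for these replicates
```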
Validation of the Endopep-MS method for qualitative detection of active botulinum neurotoxins in human and chicken serum
Bjornstad K , Tevell Aberg A , Kalb SR , Wang D , Barr JR , Bondesson U , Hedeland M . Anal Bioanal Chem 2014 406 (28) 7149-61 Botulinum neurotoxins (BoNTs) are highly toxic proteases produced by anaerobic bacteria. Traditionally, a mouse bioassay (MBA) has been used for detection of BoNTs, but for a long time, laboratories have worked with alternative methods for their detection. One of the most promising in vitro methods is a combination of an enzymatic and mass spectrometric assay called Endopep-MS. However, no comprehensive validation of the method has been presented. The main purpose of this work was to perform a validation for the qualitative analysis of BoNT-A, B, C, C/D, D, D/C, and F in serum. The limit of detection (LOD), selectivity, precision, stability in matrix and solution, and correlation with the MBA were evaluated. The LOD was equal to or even better than that of the MBA for BoNT-A, B, D/C, E, and F. Furthermore, Endopep-MS was for the first time successfully used to differentiate between BoNT-C and D and their mosaics C/D and D/C by different combinations of antibodies and target peptides. In addition, sequential antibody capture was presented as a new way to multiplex the method when only a small sample volume is available. In the comparison with the MBA, all the samples analyzed were positive for BoNT-C/D with both methods. These results indicate that the Endopep-MS method is a valid alternative to the MBA as the gold standard for BoNT detection based on its sensitivity, selectivity, and speed and that it does not require experimental animals. |
Evaluation of a new handheld instrument for the detection of counterfeit artesunate by visual fluorescence comparison
Ranieri N , Tabernero P , Green MD , Verbois L , Herrington J , Sampson E , Satzger RD , Phonlavong C , Thao K , Newton PN , Witkowski MR . Am J Trop Med Hyg 2014 91 (5) 920-4 There is an urgent need for accurate and inexpensive handheld instruments for the evaluation of medicine quality in the field. A blinded evaluation of the diagnostic accuracy of the Counterfeit Detection Device 3 (CD-3), developed by the US Food and Drug Administration Forensic Chemistry Center, was conducted in the Lao People's Democratic Republic. Two hundred three samples of the oral antimalarial artesunate were compared with authentic products using the CD-3 by a trainer and two trainees. The specificity (95% confidence interval [95% CI]), sensitivity (95% CI), positive predictive value (95% CI), and negative predictive value (95% CI) of the CD-3 for detecting counterfeit (falsified) artesunate were 100% (93.8-100%), 98.4% (93.8-99.7%), 100% (96.2-100%), and 97.4% (90.2-99.6%), respectively. Interobserver agreement for 203 samples of artesunate was 100%. The CD-3 holds promise as a relatively inexpensive and easy to use instrument for field evaluation of medicines, potentially empowering drug inspectors, customs agents, and pharmacists. |
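The four accuracy measures reported for the CD-3 are simple ratios from a 2x2 comparison against the authentic-product reference. A minimal sketch; the counts below are assumptions chosen only to reproduce the reported sensitivity, since the paper's full 2x2 tabulation is not given here:

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from 2x2 counts
    (device result vs. reference standard)."""
    return {
        "sensitivity": tp / (tp + fn),  # counterfeits correctly flagged
        "specificity": tn / (tn + fp),  # genuine samples correctly passed
        "ppv": tp / (tp + fp),          # flagged samples that are counterfeit
        "npv": tn / (tn + fn),          # passed samples that are genuine
    }

# Assumed split (illustrative): 125 counterfeit with 2 missed, 58 genuine
# with none flagged -- these are not the study's reported raw counts
print(diagnostic_accuracy(tp=123, fp=0, fn=2, tn=58))
```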
Highlights from the United States Food and Drug Administration's public workshop on the development of animal models of pregnancy to address medical countermeasures in an "at-risk" population of pregnant women: influenza as a case study
Williams D , Basavarajappa MS , Rasmussen SA , Morris S , Mattison D . Birth Defects Res A Clin Mol Teratol 2014 100 (10) 806-10 The U.S. Food and Drug Administration (FDA) and other federal agencies partner to ensure that medical countermeasures (e.g., drug therapies and vaccines) are available for public health emergencies (FDA, 2014). Despite continuing progress, providing medical countermeasures and treatment guidelines for certain populations (e.g., pregnant women) is challenging due to the lack of clinical and/or animal data. Thus, a workshop was convened to discuss animal models of pregnancy for the evaluation of disease progression and medical countermeasures. |
A comparison of two laboratories for the measurement of wood dust using button sampler and diffuse reflection infrared Fourier-transform spectroscopy (DRIFTS)
Chirila MM , Sarkisian K , Andrew ME , Kwon CW , Rando RJ , Harper M . Ann Occup Hyg 2014 59 (3) 336-46 The current measurement method for occupational exposure to wood dust is by gravimetric analysis and is thus non-specific. In this work, diffuse reflection infrared Fourier transform spectroscopy (DRIFTS) for the analysis of only the wood component of dust was further evaluated by analysis of the same samples between two laboratories. Field samples were collected from six wood product factories using 25-mm glass fiber filters with the Button aerosol sampler. Gravimetric mass was determined in one laboratory by weighing the filters before and after aerosol collection. Diffuse reflection mid-infrared spectra were obtained from the wood dust on the filter, which was placed on a motorized stage inside the spectrometer. The metric used for the DRIFTS analysis was the intensity of the carbonyl band in cellulose and hemicellulose at ~1735 cm-1. Calibration curves were constructed separately in both laboratories using the same sets of prepared filters from the inhalable sampling fraction of red oak, southern yellow pine, and western red cedar in the range of 0.125-4 mg of wood dust. Using the same procedure in both laboratories to build the calibration curve and analyze the field samples, 62.3% of the samples measured within 25% of the average result, with a mean difference between the laboratories of 18.5%. Some observations are included as to how the calibration and analysis can be improved. In particular, determining the wood type on each sample to allow matching to the most appropriate calibration increases the apparent proportion of wood dust in the sample, and this likely provides more realistic DRIFTS results. |
Recent trends in hepatic diseases during pregnancy in the United States, 2002-2010
Ellington SR , Flowers L , Legardy-Williams JK , Jamieson DJ , Kourtis AP . Am J Obstet Gynecol 2014 212 (4) 524 e1-7 OBJECTIVE: While pregnancy-related severe liver disorders are rare, when they occur morbidity and mortality rates are increased for mothers and infants. The objective of this study was to examine the prevalence and trends of hepatic diseases during pregnancy hospitalizations from 2002 through 2010 in the United States. STUDY DESIGN: Hospital discharge data were obtained from the Nationwide Inpatient Sample, the largest all-payer hospital inpatient care database in the United States that provides nationally representative estimates. Pregnancy hospitalizations with the following diagnoses were identified: hepatitis B, hepatitis C, gallbladder disease/cholelithiasis, liver disorders of pregnancy, chronic/alcohol-related liver disease, biliary tract disease, and HELLP (hemolysis, elevated liver enzymes, low platelet count) syndrome. Age, insurance status, hospital location, and hospital region were compared among women with and without hepatic diseases using a chi2 test. Trends in rates of pregnancy hospitalizations and mean charges were analyzed using multivariable logistic and linear regression, respectively. RESULTS: From 2002 through 2010 there were an estimated 41,479,358 pregnancy hospitalizations in the United States. Gallbladder disease and liver disorders of pregnancy were the most common hepatic diseases (rates = 7.18 and 4.65/1000 pregnancy hospitalizations, respectively). Adjusted rates and mean charges significantly increased for all hepatic diseases during pregnancy over the study period. All hepatic diseases were associated with significantly higher charges compared to all pregnancy hospitalizations. HELLP syndrome was associated with the highest mean charges. 
CONCLUSION: This large study among a representative sample of the US population provides valuable information that can aid policy planning and management of these hepatic diseases during pregnancy in the United States. |
State-based maternal death reviews: assessing opportunities to alter outcomes
Callaghan WM . Am J Obstet Gynecol 2014 211 (6) 581-2 In 1950, the Journal of the American Medical Association reported that, for the first time, the US maternal mortality rate had dipped below 1 per 1000 live births and declared that “Childbearing has been made quite safe.”1 The Centers for Disease Control and Prevention called the decline from 800-900 maternal deaths per 100,000 live births in 1900 to 10-20 deaths per 100,000 live births in 2000 one of the 10 great public health achievements of the 20th century.2 There is no question that great strides were made in maternity care and that women and society benefited. But the story is not over. Women still die from conditions directly or indirectly related to pregnancy, and evidence is emerging that the trend that we so rightly celebrated is not continuing. | Accounting for maternal deaths ought to be easy. The events are rare, dramatic, and devastating for the woman's family and those who cared for her. We have a functioning vital statistics system, and all deaths are registered. However, even today, we struggle to assess accurately the number of women who die in the United States because they became pregnant. There is no question that vital statistics by themselves underestimate the number of maternal deaths, largely because of the lack of diagnostic nuance allowed by the coding rules of International Classification of Diseases; this limitation has been demonstrated in the United States and other developed countries.3-6 In response to the inadequacy of vital records for public health surveillance, in 1986 the Centers for Disease Control and Prevention's Division of Reproductive Health and the American College of Obstetricians and Gynecologists worked to enhance the identification of deaths that are related to pregnancy by establishing the Pregnancy Mortality Surveillance System (PMSS). 
PMSS relies on state departments of vital statistics to identify deaths during and within 1 year of the end of a pregnancy by all means available. Currently, this system reports a pregnancy-related mortality ratio of approximately 17 per 100,000 live births for 2010. Although the ratio may be stabilizing in recent years, it increased by 50% over the preceding 15 years.7 Moreover, although PMSS likely captures almost all of the deaths that can be captured through a process based on voluntary reporting, it still likely undercounts these events. Another recent estimate, based on statistical models, places the US maternal mortality rate at 18.5 per 100,000 live births for 2013 and suggests that the United States is among the few countries in the world where the rate is increasing.8 There is reason to suspect that better identification plays some role in the observed increases, but it would be presumptuous to state categorically that there is no true increase in the risk of maternal death in the United States. We certainly have no evidence that the risk is falling. |
Trends in infant bedding use: National Infant Sleep Position Study, 1993-2010
Shapiro-Mendoza CK , Colson ER , Willinger M , Rybin DV , Camperlengo L , Corwin MJ . Pediatrics 2014 135 (1) 10-7 BACKGROUND: Use of potentially hazardous bedding, as defined by the American Academy of Pediatrics (eg, pillows, quilts, comforters, loose bedding), is a modifiable risk factor for sudden infant death syndrome and unintentional sleep-related suffocation. The proportion of US infants sleeping with these types of bedding is unknown. METHODS: To investigate the US prevalence of and trends in bedding use, we analyzed 1993-2010 data from the National Infant Sleep Position study. Infants reported as being usually placed to sleep with blankets, quilts, pillows, and other similar materials under or covering them in the last 2 weeks were classified as bedding users. Logistic regression was used to describe characteristics associated with bedding use. RESULTS: From 1993 to 2010, bedding use declined but remained a widespread practice (moving average of 85.9% in 1993-1995 to 54.7% in 2008-2010). Prevalence was highest for infants of teen-aged mothers (83.5%) and lowest for infants born at term (55.6%). Bedding use was also frequently reported among infants sleeping in adult beds, on their sides, and on a shared surface. The rate of decline in bedding use was markedly less from 2001-2010 compared with 1993-2000. For 2007 to 2010, the strongest predictors (adjusted odds ratio: ≥1.5) of bedding use were young maternal age, non-white race and ethnicity, and not being college educated. CONCLUSIONS: Bedding use for infant sleep remains common despite recommendations against this practice. Understanding trends in bedding use is important for tailoring safe sleep interventions. |
Missed opportunities for early infant HIV diagnosis: results of a national study in South Africa
Woldesenbet SA , Jackson D , Goga AE , Crowley S , Doherty T , Mogashoa MM , Dinh TH , Sherman GG . J Acquir Immune Defic Syndr 2014 68 (3) e26-32 BACKGROUND: Services to diagnose early infant HIV infection should be offered at the six-week immunisation visit. Despite high six-week immunisation attendance, the coverage of early infant diagnosis (EID) is low in many sub-Saharan countries. We explored reasons for such missed opportunities at six-week immunisation visits. METHODS: We used data from two cross-sectional surveys conducted in 2010 in South Africa. A national assessment was undertaken among randomly selected public facilities (n=625) to ascertain procedures for EID. A sub-sample of these facilities (n=565) was revisited to assess the HIV-status of 4-8 week old infants receiving six-week immunisation. We examined potential missed opportunities for EID. We used logistic regression to assess factors influencing maternal intention to report for EID at six-week immunisation visits. RESULTS: EID services were available in >95% of facilities, and 72% of immunisation service points (ISPs). The majority (68%) of ISPs provided EID for infants with reported or documented (on infant's Road-to-Health Chart/booklet - iRtHC) HIV-exposure. Only 9% of ISPs offered provider-initiated counselling and testing (PICT) for infants of undocumented/unknown HIV-exposure. Interviews with self-reported HIV-positive mothers at ISPs revealed only 55% had their HIV-status documented on their iRtHC and 35% intended to request EID during six-week immunisation. Maternal non-reporting for EID was associated with fear of discrimination, poor adherence to antiretrovirals, and inadequate knowledge about mother-to-child HIV transmission (MTCT).
CONCLUSION: Missed opportunities for EID were attributed to poor documentation of HIV-status on iRtHC, inadequate maternal knowledge about MTCT, fear of discrimination, and the lack of PICT service for undocumented, unknown, or undeclared HIV-exposed infants. |
The National Spina Bifida Patient Registry: profile of a large cohort of participants from the first 10 clinics
Sawin KJ , Liu T , Ward E , Thibadeau J , Schechter MS , Soe MM , Walker W . J Pediatr 2014 166 (2) 444-50 e1 OBJECTIVE: To use data from the US National Spina Bifida Patient Registry (NSBPR) to describe variations in Contexts of Care, Processes of Care, and Health Outcomes among individuals with spina bifida (SB) receiving care in 10 clinics. STUDY DESIGN: Reported here are baseline cross-sectional data representing the first visit of 2172 participants from 10 specialized, multidisciplinary SB clinics participating in the NSBPR. We used descriptive statistics, the Fisher exact test, chi2 test, and Wilcoxon rank-sum test to examine the data. RESULTS: The mean age was 10.1 (SD 8.1) years with slightly more female subjects (52.5%). The majority was white (63.4%) and relied upon public insurance (53.5%). One-third had sacral lesions, 44.8% had mid-low lumbar lesions, and 24.9% had high lumbar and thoracic lesions. The most common surgery was ventricular shunt placement (65.7%). The most common bladder-management technique among those with bladder impairment was intermittent catheterization (69.0%). Almost 14% experienced a pressure ulcer in the last year. Of those ages 5 years or older with bowel or bladder impairments, almost 30% were continent of stool; a similar percentage was continent of urine. Most variables were associated with type of SB diagnosis. CONCLUSION: The NSBPR provides a cross section of a predominantly pediatric population of patients followed in specialized SB programs. There were wide variations in the variables studied and major differences in Context of Care, Processes of Care, and Health Outcomes by type of SB. Such wide variation and the differences by type of SB should be considered in future analyses of outcomes. |
Prevalence of groups A and C rotavirus antibodies in infants with biliary atresia and cholestatic controls
Clemente MG , Patton JT , Yolken R , Whitington PF , Parashar UD , Jiang B , Raghunathan T , Schwarz KB . J Pediatr 2014 166 (1) 79-84 OBJECTIVE: To analyze the prevalence of acute asymptomatic group A and C rotavirus (RV-A and RV-C) infection in neonates with cholestasis. STUDY DESIGN: Participants were infants <180 days of age with cholestasis (serum direct or conjugated bilirubin >20% of total and ≥2 mg/dL) enrolled in the Childhood Liver Disease Research and Education Network during RV season (December-May). Forty infants with biliary atresia (BA), age 62 +/- 29 days (range, 4.7-13 weeks) and 38 infants with cholestasis, age 67 +/- 44 days (range, 3-15.8 weeks) were enrolled. RESULTS: At enrollment, RV-A IgM positivity rates did not differ between infants with BA (10%) vs those without (18%) (P = .349). RV-C IgM was positive in 0% of infants with BA vs 3% in those without BA (P = .49). RV-A IgG was lower in infants with BA: 51 +/- 39 vs 56 +/- 44 enzyme-linked immunoassay unit, P = .045 but this difference may lack biological relevance as maternal RV-A IgG titers were similar between groups. Infant RV-A IgM titers at 2-6 months follow-up increased markedly vs at presentation in both infants with BA (50 +/- 30 vs 9 +/- 9) and those without (43 +/- 18 vs 16 +/- 20 enzyme-linked immunoassay unit) (P < .0001), without differences between groups. CONCLUSIONS: RV-A infection in the first 6 months of life is common in infants with cholestasis of any cause. RV-A could have different pathogenetic effects by initiating different hepatic immune responses in infants with vs without BA or could lack pathogenetic significance. |
Primary prevention of lead poisoning in children: a cross-sectional study to evaluate state specific lead-based paint risk reduction laws in preventing lead poisoning in children
Kennedy C , Lordo R , Sucosky MS , Boehm R , Brown MJ . Environ Health 2014 13 93 BACKGROUND: Children younger than 72 months are most at risk of environmental exposure to lead from ingestion through normal mouthing behavior. Young children are more vulnerable to lead poisoning than adults because lead is absorbed more readily in a child's gastrointestinal tract. Our focus in this study was to determine the extent to which state mandated lead laws have helped decrease the number of new cases of elevated blood-lead levels (EBLL) in homes where an index case had been identified. METHODS: A cross-sectional study was conducted to compare 682 residential addresses, identified between 2000 and 2009, in two states with and one state without laws to prevent childhood lead poisoning among children younger than 72 months, to determine whether the laws were effective in preventing subsequent cases of lead poisoning detected in residential addresses after the identification of an index case. In this study, childhood lead poisoning was defined as the blood lead level (BLL) that would have triggered an environmental investigation in the residence. The two states with lead laws, Massachusetts (MA) and Ohio (OH), had trigger levels of ≥25 μg/dL and ≥15 μg/dL, respectively. In Mississippi (MS), the state without legislation, the trigger level was ≥15 μg/dL. RESULTS: The two states with lead laws, MA and OH, were 79% less likely than the one without legislation, MS, to have residential addresses with subsequent lead poisoning cases among children younger than 72 months, adjusted OR = 0.21, 95% CI (0.08-0.54). CONCLUSIONS: For the three states studied, the evidence suggests that lead laws such as those studied herein effectively reduced primary exposure to lead among young children living in residential addresses that may have had lead contaminants. |
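The adjusted odds ratio above comes from a multivariable model, but the crude calculation behind an OR and its Woolf (log-normal) confidence interval can be sketched as follows; the 2x2 counts here are purely illustrative, not the study's data:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with Woolf (log-normal) 95% CI for a 2x2 table:
    a, b = exposed with/without outcome; c, d = unexposed with/without."""
    or_ = (a * d) / (b * c)
    se = sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Illustrative counts only: addresses with/without subsequent cases in
# law states (a, b) vs. the no-law state (c, d)
print(odds_ratio_ci(a=8, b=400, c=20, d=210))
```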
A prospective study of prepregnancy serum concentrations of perfluorochemicals and the risk of gestational diabetes
Zhang C , Sundaram R , Maisog J , Calafat AM , Barr DB , Buck Louis GM . Fertil Steril 2014 103 (1) 184-9 OBJECTIVE: To examine preconception serum concentrations of perfluorooctanoic acid (PFOA) and six other PFCs in relation to gestational diabetes (GDM) risk. DESIGN: Prospective cohort with longitudinal follow-up. SETTING: Not applicable. PATIENT(S): Among 501 women recruited upon discontinuing contraception for the purpose of becoming pregnant, 258 (51%) became pregnant and were eligible for the study, of which 28 (11%) reported having physician-diagnosed GDM during follow-up. INTERVENTION(S): None. MAIN OUTCOME MEASURE(S): The odds ratios (ORs) and 95% confidence intervals (CIs) of GDM associated with each standard deviation (SD) increment of preconception serum PFOA concentration (ng/mL, log-transformed) and six other PFCs were estimated with the use of logistic regression after adjusting for age, prepregnancy body mass index, smoking, and parity conditional on gravidity. RESULT(S): Preconception geometric mean (95% CI) PFOA concentrations (in ng/mL) were higher for women with than without GDM (3.94 [3.15-4.93] vs. 3.07 [2.83-3.12], respectively). Each SD increment in PFOA was associated with a 1.87-fold increased GDM risk (adjusted OR 1.86 [95% CI 1.14-3.02]). A slightly increased risk associated with each SD increment for the six other PFCs was observed as well (all ORs >1.0, range 1.06-1.27), although the associations were not statistically significant. CONCLUSION(S): Our findings suggested that higher environmentally relevant concentrations of PFOA were significantly associated with an increased risk of GDM. If corroborated, these findings may be suggestive of a possible environmental etiology for GDM. |
Protective efficacy of prolonged co-trimoxazole prophylaxis in HIV-exposed children up to age 4 years for the prevention of malaria in Uganda: a randomised controlled open-label trial
Homsy J , Dorsey G , Arinaitwe E , Wanzira H , Kakuru A , Bigira V , Muhindo M , Kamya MR , Sandison TG , Tappero JW . Lancet Glob Health 2014 2 (12) e727-36 BACKGROUND: WHO recommends daily co-trimoxazole for children born to HIV-infected mothers from 6 weeks of age until breastfeeding cessation and exclusion of HIV infection. We have previously reported on the effectiveness of continuation of co-trimoxazole prophylaxis up to age 2 years in these children. We assessed the protective efficacy and safety of prolonging co-trimoxazole prophylaxis until age 4 years in HIV-exposed children. METHODS: We undertook an open-label randomised controlled trial alongside two observational cohorts in eastern Uganda, an area with high HIV prevalence, malaria transmission intensity, and antifolate resistance. We enrolled HIV-exposed infants between 6 weeks and 9 months of age and prescribed them daily co-trimoxazole until breastfeeding cessation and HIV-status confirmation. At the end of breastfeeding, children who remained HIV-uninfected were randomly assigned (1:1) to discontinue co-trimoxazole or to continue taking it up to age 2 years. At age 2 years, children who continued co-trimoxazole prophylaxis were randomly assigned (1:1) to discontinue or continue prophylaxis from age 2 years to age 4 years. The primary outcome was incidence of malaria (defined as the number of treatments for new episodes of malaria diagnosed with positive thick smear) at age 4 years. For additional comparisons, we observed 48 HIV-infected children who took continuous co-trimoxazole prophylaxis and 100 HIV-unexposed uninfected children who never received prophylaxis. We measured grade 3 and 4 serious adverse events and hospital admissions. All children were followed up to age 5 years and all analyses were by intention to treat. This study is registered with ClinicalTrials.gov, number NCT00527800. FINDINGS: 203 HIV-exposed infants were enrolled between Aug 10, 2007, and March 28, 2008. 
After breastfeeding ended, 185 children were not infected with HIV and were randomly assigned to stop (n=87) or continue (n=98) co-trimoxazole up to age 2 years. At age 2 years, 91 HIV-exposed children who had remained on co-trimoxazole prophylaxis were randomly assigned to discontinue (n=46) or continue (n=45) co-trimoxazole from age 2 years to age 4 years. We recorded 243 malaria episodes (2.91 per person-year) in the 45 HIV-exposed children assigned to continue co-trimoxazole until age 4 years compared with 503 episodes (5.60 per person-year) in the 46 children assigned to stop co-trimoxazole at age 2 years (incidence rate ratio 0.53, 95% CI 0.39-0.71; p<0.0001). There was no evidence of malaria incidence rebound in the year after discontinuation of co-trimoxazole in the HIV-exposed children who stopped co-trimoxazole at age 2 years, but incidence increased significantly in HIV-exposed children who stopped co-trimoxazole at age 4 years (odds ratio 1.78, 95% CI 1.19-2.66; p=0.005). Incidence of grade 3 or 4 serious adverse events, hospital admissions, or deaths did not significantly differ between HIV-exposed, HIV-unexposed, and HIV-infected children. INTERPRETATION: Continuation of co-trimoxazole prophylaxis up to 4 years of age seems safe and efficacious in protecting HIV-exposed children living in malaria-endemic areas. FUNDING: Centers for Disease Control and Prevention Global AIDS Program, Doris Duke Charitable Foundation. |
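The headline comparison in this abstract can be checked with back-of-envelope arithmetic. The sketch below is not the trial's actual analysis (the published CI of 0.39-0.71 comes from the authors' own model); it simply takes the reported episode counts and rates and computes a crude incidence rate ratio with a Wald confidence interval.

```python
import math

# Reported figures: 243 episodes at 2.91 per person-year (continued co-trimoxazole)
# vs. 503 episodes at 5.60 per person-year (stopped at age 2 years).
def rate_ratio(events_a, rate_a, events_b, rate_b):
    # The point estimate is just the ratio of the reported rates; the event
    # counts drive the crude Wald standard error on the log scale.
    irr = rate_a / rate_b
    se_log = math.sqrt(1 / events_a + 1 / events_b)
    lo = math.exp(math.log(irr) - 1.96 * se_log)
    hi = math.exp(math.log(irr) + 1.96 * se_log)
    return irr, lo, hi

irr, lo, hi = rate_ratio(243, 2.91, 503, 5.60)
print(f"crude IRR {irr:.2f} (crude 95% CI {lo:.2f}-{hi:.2f})")
```

The crude estimate lands close to the published 0.53; the crude interval is narrower than the published one, as expected when the paper's model accounts for additional sources of variability.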
Eliminating preventable HIV-related maternal mortality in sub-Saharan Africa: what do we need to know?
Kendall T , Danel I , Cooper D , Dilmitis S , Kaida A , Kourtis AP , Langer A , Lapidos-Salaiz I , Lathrop E , Moran AC , Sebitloane H , Turan JM , Watts DH , Wegner MN . J Acquir Immune Defic Syndr 2014 67 Suppl 4 S250-8 INTRODUCTION: HIV makes a significant contribution to maternal mortality, and women living in sub-Saharan Africa are most affected. International commitments to eliminate preventable maternal mortality and reduce HIV-related deaths among pregnant and postpartum women by 50% will not be achieved without a better understanding of the links between HIV and poor maternal health outcomes and improved health services for the care of women living with HIV (WLWH) during pregnancy, childbirth, and postpartum. METHODS: This article summarizes priorities for research and evaluation identified through consultation with 30 international researchers and policymakers with experience in maternal health and HIV in sub-Saharan Africa and a review of the published literature. RESULTS: Priorities for improving the evidence about effective interventions to reduce maternal mortality and improve maternal health among WLWH include better quality data about causes of maternal death among WLWH, enhanced and harmonized program monitoring, and research and evaluation that contributes to improving: (1) clinical management of pregnant and postpartum WLWH, including assessment of the impact of expanded antiretroviral therapy on maternal mortality and morbidity, (2) integrated service delivery models, and (3) interventions to create an enabling social environment for women to begin and remain in care. CONCLUSIONS: As the global community evaluates progress and prepares for new maternal mortality and HIV targets, addressing the needs of WLWH must be a priority now and after 2015. 
Research and evaluation on maternal health and HIV can increase collaboration on these 2 global priorities, strengthen political constituencies and communities of practice, and accelerate progress toward achievement of goals in both areas. |
Concentrations of environmental phenols and parabens in milk, urine and serum of lactating North Carolina women
Hines EP , Mendola P , vonEhrenstein OS , Ye X , Calafat AM , Fenton SE . Reprod Toxicol 2014 54 120-8 Phenols and parabens show some evidence of endocrine disruption in laboratory animals. The goal of the Methods Advancement for Milk Analysis (MAMA) Study was to develop or adapt methods to measure parabens (methyl, ethyl, butyl, propyl) and phenols (bisphenol A (BPA), 2,4- and 2,5-dichlorophenol, benzophenone-3, triclosan) in urine, milk and serum twice during lactation, and to compare concentrations across matrices and with endogenous biomarkers among 34 North Carolina women. These non-persistent chemicals were detected in most urine samples (53-100%) and less frequently in milk or serum; concentrations differed by matrix. Although urinary paraben, triclosan, and dichlorophenol concentrations correlated significantly at the two time points, those of BPA and benzophenone-3 did not, suggesting considerable variability in those exposures. These pilot data suggest that nursing mothers are exposed to phenols and parabens, that urine is the best measurement matrix, and that correlations between chemical and endogenous immune-related biomarkers merit further investigation. |
Reduction of spinal loads through adjustable interventions at the origin and destination of palletizing tasks
Ramsey T , Davis KG , Kotowski SE , Anderson VP , Waters T . Hum Factors 2014 56 (7) 1222-1234 OBJECTIVE: This article evaluates the effectiveness of two interventions for reducing spine loads: a self-leveling pallet carousel designed to position loads vertically and horizontally at the origin, and an adjustable cart designed to raise loads vertically at the destination. BACKGROUND: Low back disorders among workers in manual material handling industries are very prevalent and have been linked to manual palletizing operations. Evidence on the effectiveness of ergonomic interventions is limited, and no research has investigated interventions with adjustable load locations. METHOD: Thirteen males experienced in manual material handling participated in simulated order-selecting tasks in which spine loads were quantified for each intervention condition: carousel to traditional cart, pallet to traditional cart, pallet to adjustable cart, and carousel to adjustable cart. RESULTS: The results showed that combining both devices reduced spine compression (61%), anterior-posterior shear (72%), and lateral shear (63%) compared with traditional palletizing conditions. Individually, the carousel was responsible for the greatest reductions, but the lowest values were typically achieved by combining the adjustable cart and carousel. CONCLUSION: The combination of the interventions (self-leveling carousel and adjustable cart) was most effective in reducing spine loads when compared with the traditional pallet-cart condition. The individual interventions also reduced loads compared with the traditional condition. APPLICATION: With de-palletizing/palletizing tasks being a major source of low back injuries, the combination of self-leveling carousel and adjustable cart has been found to be effective in reducing peak spine loading as compared with traditional pallet-on-floor and nonadjustable flat cart conditions. |
Usefulness of the working conditions and health survey in central America in prevention. Author response to comments by Jensen
Benavides FG , Wesseling C , Delclos GL , Felknor S , Pinilla J , Rodrigo F . Occup Environ Med 2014 72 (3) 236-7 We appreciate Dr Jensen's comments1 on the Central American Survey on Working Conditions and Health (ECCTS)2 and his concern that our sampling methodology may have produced biased results. The ECCTS has broadly followed the methodological criteria of the European Working Condition Survey (EWCS).3,4 | Of note is that, to estimate prevalence of exposures to different working conditions, the EWCS is applied every 5 years to a representative sample of only 1000 workers in the majority of European countries. The ECCTS had double the number, that is, 2000 per country. However, we acknowledge that a larger sample is better and, in fact, some European countries have started to increase their sample size. | To achieve representativeness, a national population sample must be properly spread over geographic sub-areas and population sub-groups. The random selection of a large number of census segments, proportional to the respective populations of the departments or provinces, accounted for geographic regions and levels of urbanisation, similar to the stratified procedures in the EWCS. In addition, we applied weights for sex, age and economic sector to each individual in the national samples to correct for differences between the sample and the underlying national working population with regard to these key socio-demographic parameters. Finally, for regional comparisons we applied an additional weight to adjust for the population size of the different countries. However, unlike the EWCS, we did not adjust for type of industry and occupation because the latter information was not always available from the census. As Dr Jensen points out, this would have been preferable. |
Opportunities and challenges of nanotechnology in the green economy
Iavicoli I , Leso V , Ricciardi W , Hodson LL , Hoover MD . Environ Health 2014 13 78 In a world of finite resources and ecosystem capacity, the prevailing model of economic growth, founded on ever-increasing consumption of resources and emission of pollutants, cannot be sustained any longer. In this context, the "green economy" concept has offered the opportunity to change the way that society manages the interaction of the environmental and economic domains. To enable society to build and sustain a green economy, the associated concept of "green nanotechnology" aims to exploit nano-innovations in materials science and engineering to generate products and processes that are energy efficient as well as economically and environmentally sustainable. These applications are expected to impact a large range of economic sectors, such as energy production and storage, clean-up technologies, as well as construction and related infrastructure industries. These solutions may offer opportunities to reduce pressure on raw materials by drawing on renewable energy, to make power delivery systems more reliable, efficient and safe, and to use unconventional water sources or nano-enabled construction products, thereby providing better ecosystem and livelihood conditions. However, the benefits of incorporating nanomaterials in green products and processes may bring with them challenges concerning environmental, health and safety risks, ethical and social issues, as well as uncertainty concerning market and consumer acceptance. Therefore, our aim is to examine the relationships among guiding principles for a green economy and opportunities for introducing nano-applications in this field, as well as to critically analyze their practical challenges, especially related to the impact that they may have on the health and safety of workers involved in this innovative sector. 
These challenges are principally due to the incompletely characterized hazardous properties of nanomaterials, as well as to the difficulties in characterizing exposure and defining emerging risks for the workforce. This review proposes action strategies for the assessment, management and communication of risks, aimed at the precautionary adoption of preventive measures including education and training of employees, collective and personal protective equipment, and health surveillance programs to protect the health and safety of nano-workers. It finally underlines the importance that occupational health considerations will have in achieving an effectively sustainable development of nanotechnology. |
Commentary on (1) 'Application of the health belief model: development of the hearing beliefs questionnaire (HBQ) and its associations with hearing health behaviors' (International Journal of Audiology, 2013; 52, 558-567), and (2) 'Development and evaluation of a questionnaire to assess knowledge, attitudes, and behaviors towards hearing loss prevention' (International Journal of Audiology, 2014; 53, 209-218)
Stephenson MR , Stephenson CM . Int J Audiol 2014 54 (1) 1-3 Saunders et al recently published two manuscripts regarding the use of the Health Belief Model (HBM) to develop a survey capable of addressing hearing health behaviors, particularly those associated with hearing loss prevention (Saunders et al, 2013, 2014). Both of these are fine articles but we also call your attention to earlier reports describing the use of the Health Belief Model and the development and application of a survey tool in a program designed to positively influence attitudes, beliefs, and behavioral intentions regarding hearing health behaviors. Given those earlier efforts, statements from Saunders et al (2013, 2014) that they were the first to have developed a psychometrically valid survey in the context of a comprehensive application of the HBM might mislead some readers. | NIOSH initiated a research program in this area in the early 1990s, focusing on the application of health communication/health promotion theory to prevent noise-induced hearing loss. The program includes research (partly funded by NIOSH grants) by Dr. Sally Lusk and her colleagues regarding the use of Health Promotion models to prevent occupational hearing loss (Lusk et al, 1994, 1995, 2003; Kerr et al, 2002) along with an extensive body of intramural research conducted by NIOSH scientists. |
The geography of malaria genetics in the Democratic Republic of Congo: A complex and fragmented landscape.
Carrel M , Patel J , Taylor SM , Janko M , Mwandagalirwa MK , Tshefu AK , Escalante AA , McCollum A , Alam MT , Udhayakumar V , Meshnick S , Emch M . Soc Sci Med 2014 133 233-41 Understanding how malaria parasites move between populations is important, particularly given the potential for malaria to be reintroduced into areas where it was previously eliminated. We examine the distribution of malaria genetics across seven sites within the Democratic Republic of Congo (DRC) and two nearby countries, Ghana and Kenya, in order to understand how the relatedness of malaria parasites varies across space, and whether there are barriers to the flow of malaria parasites within the DRC or across borders. Parasite DNA was retrieved from dried blood spots from 7 Demographic and Health Survey sample clusters in the DRC. Malaria genetic characteristics of parasites from Ghana and Kenya were also obtained. For each of 9 geographic sites (7 DRC, 1 Ghana and 1 Kenya), a pair-wise RST statistic was calculated, indicating the genetic distance between malaria parasites found in those locations. Mapping genetics across the spatial extent of the study area indicates a complex genetic landscape, where relatedness between two proximal sites may be relatively high (RST > 0.64) or low (RST < 0.05), and where distal sites also exhibit both high and low genetic similarity. Mantel's tests suggest that malaria genetics differ as geographic distances increase. Principal Coordinate Analysis suggests that genetically related samples are not co-located. Barrier analysis reveals no significant barriers to gene flow between locations. Malaria genetics in the DRC have a complex and fragmented landscape. Limited exchange of genes across space is reflected in greater genetic distance between malaria parasites isolated at greater geographic distances. 
There is, however, evidence for close genetic ties between distally located sample locations, indicating that movement of malaria parasites and flow of genes is being driven by factors other than distance decay. This research demonstrates the contributions that spatial disease ecology and landscape genetics can make to understanding the evolutionary dynamics of infectious diseases. |
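The Mantel test mentioned in this abstract correlates a matrix of genetic distances with a matrix of geographic distances and assesses significance by permuting site labels. A minimal sketch, using invented 4-site distance matrices rather than the study's RST data, could look like this:

```python
import random

def upper(m):
    # Flatten the upper triangle of a symmetric distance matrix.
    n = len(m)
    return [m[i][j] for i in range(n) for j in range(i + 1, n)]

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def mantel(gen, geo, perms=999, seed=0):
    # Two-sided permutation test: shuffle site labels of the genetic matrix
    # and count permutations at least as extreme as the observed correlation.
    rng = random.Random(seed)
    n = len(gen)
    geo_flat = upper(geo)
    r_obs = pearson(upper(gen), geo_flat)
    hits = 0
    for _ in range(perms):
        order = list(range(n))
        rng.shuffle(order)
        perm = [[gen[order[i]][order[j]] for j in range(n)] for i in range(n)]
        if abs(pearson(upper(perm), geo_flat)) >= abs(r_obs):
            hits += 1
    return r_obs, (hits + 1) / (perms + 1)

# Hypothetical matrices: 4 sites along a line, genetic distance roughly
# increasing with geographic distance.
geo = [[0, 1, 2, 3], [1, 0, 1, 2], [2, 1, 0, 1], [3, 2, 1, 0]]
gen = [[0.0, 0.1, 0.5, 0.6], [0.1, 0.0, 0.4, 0.5],
       [0.5, 0.4, 0.0, 0.2], [0.6, 0.5, 0.2, 0.0]]
r, p = mantel(gen, geo)
print(f"Mantel r = {r:.2f}, permutation p = {p:.3f}")
```

With only four sites the permutation p-value cannot reach conventional significance, which is why the study's analysis relies on many more sampling locations.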
Malaria surveillance - United States, 2012
Cullen KA , Arguin PM . MMWR Surveill Summ 2014 63 Suppl 12 (12) 1-22 PROBLEM/CONDITION: Malaria in humans is caused by intraerythrocytic protozoa of the genus Plasmodium. These parasites are transmitted by the bite of an infective female Anopheles mosquito. The majority of malaria infections in the United States occur among persons who have traveled to regions with ongoing malaria transmission. However, malaria is also occasionally acquired by persons who have not traveled out of the country, through exposure to infected blood products, congenital transmission, laboratory exposure, or local mosquitoborne transmission. Malaria surveillance in the United States is conducted to identify episodes of local transmission and to guide prevention recommendations for travelers. PERIOD COVERED: This report summarizes cases in persons with onset of symptoms in 2012 and summarizes trends during previous years. DESCRIPTION OF SYSTEM: Malaria cases diagnosed by blood film, polymerase chain reaction, or rapid diagnostic tests are mandated to be reported to local and state health departments by health-care providers or laboratory staff. Case investigations are conducted by local and state health departments, and reports are transmitted to CDC through the National Malaria Surveillance System (NMSS), National Notifiable Diseases Surveillance System (NNDSS), or direct CDC consults. For the first time, CDC conducted antimalarial drug resistance testing on blood samples submitted to CDC by health-care providers or local/state health departments. Data from these reporting systems serve as the basis for this report. RESULTS: CDC received 1,687 reported cases of malaria with an onset of symptoms in 2012 among persons in the United States, including 1,683 cases classified as imported, one laboratory-acquired case, one nosocomial case, and two cryptic cases. The total number of cases represents a 12% decrease from the 1,925 cases reported for 2011. Plasmodium falciparum, P. vivax, P. 
malariae, and P. ovale were identified in 58%, 17%, 3%, and 3% of cases, respectively. Twenty (1%) patients were infected by two species. The infecting species was unreported or undetermined in 17% of cases, a decrease of 6 percentage points from 2011. Polymerase chain reaction testing determined or corrected the species for 45 (43%) of the 104 samples submitted for drug resistance testing. Of the 909 patients who reported purpose of travel, 604 (66%) were visiting friends or relatives (VFR). Among the 983 cases in U.S. civilians for whom information on chemoprophylaxis use and travel region was known, 63 (6%) patients reported that they had followed and adhered to a chemoprophylaxis drug regimen recommended by CDC for the regions to which they had traveled. Thirty-two cases were reported in pregnant women, among whom only one adhered to chemoprophylaxis. Among all reported cases, 231 (14%) were classified as severe infections in 2012. Of these, six persons with malaria died in 2012. Beginning in 2012, there were 104 blood samples submitted to CDC that were tested for molecular markers associated with antimalarial drug resistance. Of the 65 P. falciparum-positive samples, 53 (82%) had genetic polymorphisms associated with pyrimethamine drug resistance, 61 (94%) with sulfadoxine resistance, 29 (45%) with chloroquine resistance, 1 (2%) with mefloquine drug resistance, 2 (3%) with atovaquone resistance, and none with artemisinin resistance. INTERPRETATION: Despite the 12% decline in the number of cases reported in 2012 compared with 2011, the overall trend in malaria cases has been increasing since 1973. Although progress has been made in reducing the global burden of malaria, the disease remains endemic in many regions, and the use of appropriate prevention measures by travelers is still inadequate. PUBLIC HEALTH ACTIONS: Completion of data elements on the malaria case report form increased slightly in 2012 compared with 2011, but still remains unacceptably low. 
This incomplete reporting compromises efforts to examine trends in malaria cases and prevent infections. VFRs continue to be a difficult population to reach with effective malaria prevention strategies. Evidence-based prevention strategies that effectively target VFRs need to be developed and implemented to have a substantial impact on the numbers of imported malaria cases in the United States. Although more patients reported taking chemoprophylaxis to prevent malaria, the majority reported not taking it, and adherence was poor among those who did take chemoprophylaxis. Proper use of malaria chemoprophylaxis will prevent the majority of malaria illness and reduce the risk for severe disease (http://www.cdc.gov/malaria/travelers/drugs.html). Malaria infections can be fatal if not diagnosed and treated promptly with antimalarial medications appropriate for the patient's age and medical history, the likely country of malaria acquisition, and previous use of antimalarial chemoprophylaxis. Recent molecular laboratory advances have enabled CDC to identify and conduct molecular surveillance of antimalarial drug resistance (http://www.cdc.gov/malaria/features/ars.html). These advances will allow CDC to track, guide treatment for, and manage drug-resistant malaria parasites both domestically and globally. For this to be successful, specimens from cases diagnosed in the United States should be submitted, and specimen collection and testing should continue globally. Clinicians should consult the CDC Guidelines for Treatment of Malaria and contact the CDC's Malaria Hotline for case management advice when needed. Malaria treatment recommendations can be obtained online (http://www.cdc.gov/malaria/diagnosis_treatment) or by calling the Malaria Hotline (770-488-7788 or toll-free at 855-856-4713). |
Evaluation of a universal coverage bed net distribution campaign in four districts in Sofala Province, Mozambique
Plucinski MM , Chicuecue S , Macete E , Colborn J , Yoon SS , Kachur SP , Aide P , Alonso P , Guinovart C , Morgan J . Malar J 2014 13 427 BACKGROUND: Malaria is the leading cause of death in Mozambique in children under five years old. In 2009, Mozambique developed a novel bed net distribution model to increase coverage, based on assumptions about sleeping patterns. The coverage and impact of a bed net distribution campaign using this model in four districts in Sofala Province, Mozambique was evaluated. METHODS: Paired household, cross-sectional surveys were conducted one month after the 2010 distribution of 140,000 bed nets and again 14 months after the campaign in 2011. During household visits, malaria blood smears were performed and haemoglobin levels were assessed on children under five and data on bed net ownership, access and use were collected; these indicators were analysed at individual, household and community levels. Logistic regression was used to evaluate predictors of malaria infection and anaemia. RESULTS: The campaign reached 98% (95% CI: 97-99%) of households registered during the precampaign listing, with 81% (95% CI: 77-85%) of sleeping spaces covered by campaign bed nets and 85% (95% CI: 81-88%) of the population sleeping in a sleeping space with a campaign bed net designated for the sleeping space. One year after the campaign, 65% (95% CI: 57-72%) of sleeping spaces were observed to have hanging bed nets. The proportion of sleeping spaces for which bed nets were reported used four or more times per week was 65% (95% CI: 56-74%) in the wet season and 60% (95% CI: 52-68%) in the dry season. Malaria parasitaemia prevalence in children under five years old was 47% (95% CI: 40-54%) in 2010 and 36% (95% CI: 27-45%) in 2011. Individual-level malaria infection and anaemia were significantly associated with community-level use of bed nets. 
CONCLUSIONS: The campaign using the novel distribution model achieved high coverage, although usage was not uniformly high. A significant decrease in malaria parasitaemia prevalence a year after the campaign was not observed, but community-level use of bed nets was significantly associated with a reduced risk for malaria infection and anaemia in children under five. |
The public health workforce: moving forward in the 21st century
Coronado F , Koo D , Gebbie K . Am J Prev Med 2014 47 S275-7 In 1994, the Core Public Health Functions Steering Committee, which was convened by the Assistant Secretary for Health and included representatives from U.S. Public Health Service agencies and other major public health organizations, was organized to clarify the public health functions of assessment, policy development, and assurance identified by the IOM Committee on Public Health. Among its other activities, the Steering Committee was charged with developing the framework for the Essential Public Health Services to categorize all public health activities.1 It also commissioned a subcommittee on public health workforce, training, and education to provide a profile of the public health workforce and make projections regarding the workforce of the 21st century. | Twenty years later, the Essential Public Health Services continue to serve as a framework for public health initiatives across public health organizations; additionally, substantial advances have been made in establishing strong national, state, and local leadership, with emphasis on collaborative partnerships among practice and academic entities to deliver the Essential Public Health Services across the nation. Efforts to better understand the workforce composition also followed these seminal efforts. Gebbie et al.2 estimated the size and composition of the public health workforce in 2000, and others3–5 followed more recently with comprehensive assessments of the epidemiologist, public health nurse, and other public health workforce capacity. Despite continuing challenges with defining position classifications and the organizational level and location of departments of health, workforce information is now routinely collected by the National Association of County and City Health Officials and the Association of State and Territorial Health Officials in their regular profile surveys of local and state public health agencies, respectively. 
The Council of State and Territorial Epidemiologists and Association of Public Health Laboratories also have contributed similar efforts for their respective constituents. Improvements have also occurred in training and educating the public health workforce. Within the last 10 years, schools of public health and health professions schools have made substantial changes in their curricula, including improved practice-based education and stronger applied research agendas. Both schools of public health and liberal arts colleges have introduced public health education at the undergraduate level, where it has become one of the fastest-growing majors. Enhancements in the continuing education curricula to support training of public health workers, including use of integrated distance learning systems built on existing public networks, make information on best public health practices readily available.6,7 Additionally, as information systems and technologies improve, the introduction of the evolving field of public health informatics is fundamentally transforming certain aspects of public health practice, research, and learning.8 |
What we know, and don't know, about the impact of state policy and systems-level interventions on prescription drug overdose
Haegerich TM , Paulozzi LJ , Manns BJ , Jones CM . Drug Alcohol Depend 2014 145c 34-47 BACKGROUND: Drug overdose deaths have been rising since the early 1990s, and drug overdose is now the leading cause of injury death in the United States. Overdose from prescription opioids constitutes a large proportion of this burden. State policy and systems-level interventions have the potential to impact prescription drug misuse and overdose. METHODS: We searched the literature to identify evaluations of state policy or systems-level interventions using non-comparative, cross-sectional, before-after, time series, cohort, or comparison group designs or randomized/non-randomized trials. Eligible studies examined intervention effects on provider behavior, patient behavior, and health outcomes. RESULTS: Overall study quality is low, with a limited number of time-series or experimental designs. Knowledge and prescribing practices were measured more often than health outcomes (e.g., overdoses). Limitations include lack of baseline data and comparison groups, inadequate statistical testing, small sample sizes, self-reported outcomes, and short-term follow-up. Strategies that reduce inappropriate prescribing and use of multiple providers and focus on overdose response, such as prescription drug monitoring programs, insurer strategies, pain clinic legislation, clinical guidelines, and naloxone distribution programs, are promising. Evidence of improved health outcomes, particularly from safe storage and disposal strategies and patient education, is weak. CONCLUSIONS: While important efforts are underway to affect prescriber and patient behavior, data on state policy and systems-level interventions are limited and inconsistent. Improving the evidence base is a critical need so states, regulatory agencies, and organizations can make informed choices about policies and practices that will improve prescribing and use, while protecting patient health. |
Long-term health and medical cost impact of smoking prevention in adolescence
Wang LY , Michael SL . J Adolesc Health 2014 56 (2) 160-6 PURPOSE: To estimate smoking progression probabilities from adolescence to young adulthood and to estimate long-term health and medical cost impacts of preventing smoking in today's adolescents. METHODS: Using data from the National Longitudinal Study of Adolescent Health (Add Health), we first estimated smoking progression probabilities from adolescence to young adulthood. Then, using the predicted probabilities, we estimated the number of adolescents who were prevented from becoming adult daily smokers as a result of a hypothetical 1 percentage point reduction in the prevalence of ever smoking in today's adolescents. We further estimated lifetime medical costs saved and quality-adjusted life years (QALYs) gained as a result of preventing adolescents from becoming adult daily smokers. All costs were in 2010 dollars. RESULTS: Compared with never smokers, those who had tried smoking at baseline had higher probabilities of becoming current or former daily smokers at follow-up regardless of baseline grade or sex. A hypothetical 1 percentage point reduction in the prevalence of ever smoking in 24.5 million students in 7th-12th grades today could prevent 35,962 individuals from becoming former daily smokers and 44,318 from becoming current daily smokers at ages 24-32 years. As a result, lifetime medical care costs are estimated to decrease by $1.2 billion and lifetime QALYs are estimated to increase by 98,590. CONCLUSIONS: Effective smoking prevention programs for adolescents go beyond reducing smoking prevalence in adolescence; they also reduce daily smokers in young adulthood, increase QALYs, and reduce medical costs substantially in later life. This finding indicates the importance of continued investment in effective youth smoking prevention programs. |
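The arithmetic behind this projection can be illustrated by working backward from the abstract's own figures. This is a reading of the reported numbers, not the authors' Add Health-based model:

```python
# A 1-percentage-point drop in ever smoking among 24.5 million students means
# roughly 245,000 fewer ever-smokers; dividing the reported prevented-smoker
# counts by that figure recovers the implied excess transition probabilities.
students = 24_500_000
fewer_ever_smokers = students * 0.01  # ~245,000 fewer ever-smokers

prevented_current = 44_318  # prevented current daily smokers (reported)
prevented_former = 35_962   # prevented former daily smokers (reported)

implied_p_current = prevented_current / fewer_ever_smokers
implied_p_former = prevented_former / fewer_ever_smokers
print(f"implied excess probability of current daily smoking: {implied_p_current:.1%}")
print(f"implied excess probability of former daily smoking: {implied_p_former:.1%}")
```

In other words, the model implies that trying smoking in adolescence raises the probability of becoming a current daily smoker by roughly 18 percentage points, relative to never smokers.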
Observed transition from opioid analgesic deaths toward heroin
Dasgupta N , Creppage K , Austin A , Ringwalt C , Sanford C , Proescholdbell SK . Drug Alcohol Depend 2014 145c 238-241 BACKGROUND: In the United States, overdose mortality from controlled substances has increased over the last two decades, largely involving prescription opioid analgesics. Recently, there has been speculation on a transition away from prescription opioid use toward heroin; however, the impact on overdose deaths has not been evaluated. METHODS: Time series study of North Carolina residents, 2007 through 2013. Monthly ratio of prescription opioid-to-heroin overdose deaths. Non-parametric local regression models were used to ascertain temporal shifts from overdoses involving prescription opioids to heroin. RESULTS: There were 4332 overdose deaths involving prescription opioids and 455 involving heroin, including 44 where both were involved (total n=4743). A gradual 6-year shift toward increasing heroin deaths was observed. In January 2007, there were 16 opioid analgesic deaths for each heroin death; in December 2013, there were 3 prescription opioid deaths for each heroin death. The transition to heroin appears to have started prior to the introduction of tamper-resistant opioid analgesics. The age of death among heroin decedents shifted toward younger adults. Most heroin and opioid analgesic deaths occurred in metropolitan areas, with little change between 2007 and 2013. CONCLUSIONS: The observed increases in heroin overdose deaths can no longer be considered speculation. Deaths among younger adults were noted to have increased in particular, suggesting new directions for targeting interventions. More research beyond vital statistics is needed to understand the root causes of the shift from prescription opioids to heroin. |
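The ratio measure described in this abstract can be sketched with hypothetical monthly counts. The numbers below are invented to mirror the reported 16:1-to-3:1 shift (the study's data are North Carolina vital statistics), and a simple trailing moving average stands in for the authors' non-parametric local regression:

```python
# Hypothetical monthly overdose death counts, NOT the study's data.
opioid_deaths = [48, 52, 50, 55, 51, 49, 53, 50, 52, 54, 50, 51]
heroin_deaths = [3, 4, 3, 5, 6, 7, 8, 10, 12, 13, 15, 17]

# Monthly ratio of prescription opioid to heroin overdose deaths.
ratios = [o / h for o, h in zip(opioid_deaths, heroin_deaths)]

def moving_average(xs, window=3):
    # Trailing-window average; a crude stand-in for the paper's smoother.
    out = []
    for i in range(len(xs)):
        chunk = xs[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

smoothed = moving_average(ratios)
print(f"ratio at start ~{smoothed[0]:.0f}:1, at end ~{smoothed[-1]:.0f}:1")
```

The smoothed series makes the declining opioid-to-heroin ratio visible even when monthly counts are noisy, which is the role the local regression plays in the paper.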
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Disease Reservoirs and Vectors
- Environmental Health
- Genetics and Genomics
- Healthcare Associated Infections
- Immunity and Immunization
- Informatics
- Injury and Violence
- Laboratory Sciences
- Maternal and Child Health
- Occupational Safety and Health
- Parasitic Diseases
- Public Health Leadership and Management
- Substance Use and Abuse
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.