Reducing indoor tanning - an opportunity for melanoma prevention
Guy GP Jr , Watson M , Richardson LC , Lushniak BD . JAMA Dermatol 2016 152 (3) 257-9 The incidence of melanoma has been rapidly increasing in the United States. Since exposure to UV radiation from indoor tanning is preventable, reducing exposure is an important strategy for melanoma prevention.1 The article by Lazovich et al2 in this issue of JAMA Dermatology provides an in-depth analysis of a case-control study conducted in Minnesota examining the association between indoor tanning and melanoma. The authors2 found that indoor tanning was strongly associated with increased melanoma risk among women, especially among women younger than 30 years, for whom indoor tanning was associated with a 6-fold increase in the likelihood of developing melanoma. Nearly all women in the study (96.8%) diagnosed as having melanoma when younger than 30 years had engaged in indoor tanning, all initiating indoor tanning before age 25 years, and nearly all (90.5%) engaging in frequent indoor tanning (>10 times per year). | Several other studies3,4 have noted increases in melanoma among young white women and hypothesized that increases among this demographic may be related to increases in indoor tanning. By focusing on sex and age at diagnosis, Lazovich et al2 provide important additional support for this hypothesis. Their findings are also consistent with those of a study5 demonstrating the widespread use of indoor tanning among young non-Hispanic white women. This article2 builds on the previous literature and demonstrates the importance of public health efforts in reducing indoor tanning. While all exposure to UV radiation can increase the risk of melanoma, exposure to artificial UV radiation from indoor tanning is a completely avoidable risk factor. In addition, UV radiation from indoor tanning is often more intense than UV radiation from the sun, and users often expose more areas of the body. |
Risk factors for radiation exposure in newly diagnosed IBD patients
Grand DJ , Harris A , Shapiro J , Wu E , Giacalone J , Sands BE , Bright R , Moniz H , Mallette M , Leleiko N , Wallenstein S , Samad Z , Merrick M , Shah SA . Abdom Radiol (NY) 2016 41 (7) 1363-9 PURPOSE: Patients with inflammatory bowel disease (IBD) may be exposed to high doses of diagnostic radiation. The purpose of this study is to identify subsets of this population at risk for significant radiation exposure. METHODS: This HIPAA-compliant, IRB-approved study consists of 336 patients (237 adult and 99 pediatric) within the Ocean State Crohn's & Colitis Area Registry (OSCCAR). All were newly diagnosed with IBD and prospectively enrolled between 1/2008 and 12/2012. Comprehensive chart review was performed. RESULTS: 207 (61.6%) patients were diagnosed with Crohn's disease (CD), 120 (35.7%) with ulcerative colitis (UC), and 9 (2.7%) with inflammatory bowel disease, type unspecified (IBDU). 192 (57.1%) patients were exposed to GI-specific radiation. Average GI-specific radiation dose for adult IBD patients was 14.1 mSv and was significantly greater among adult CD than adult UC patients (p = 0.01). Pediatric patients underwent fewer CT scans (p < 0.0001). Risk factors for increased radiation exposure include: GI surgery (p = 0.003), biologic therapy (p = 0.01), pain-predominant symptoms (as compared to diarrhea-predominant symptoms; p < 0.05), and isolated ileal disease (p = 0.02). Patients with stricturing or penetrating disease received higher radiation doses than patients with non-stricturing, non-penetrating disease (p < 0.0001). CONCLUSIONS: A variety of risk factors are associated with increased exposure to ionizing radiation after diagnosis of IBD. Knowledge of these risk factors can help physicians prospectively identify patients at risk for elevated radiation exposure and consider low-dose or radiation-free imaging. |
Timing of initiation of maintenance dialysis: a qualitative analysis of the electronic medical records of a national cohort of patients from the Department of Veterans Affairs
Wong SP , Vig EK , Taylor JS , Burrows NR , Liu CF , Williams DE , Hebert PL , O'Hare AM . JAMA Intern Med 2016 176 (2) 228-35 IMPORTANCE: There is often considerable uncertainty about the optimal time to initiate maintenance dialysis in individual patients and little medical evidence to guide this decision. OBJECTIVE: To gain a better understanding of the factors influencing the timing of initiation of dialysis in clinical practice. DESIGN, SETTING, AND PARTICIPANTS: A qualitative analysis was conducted using the electronic medical records from the Department of Veterans Affairs (VA) of a national random sample of 1691 patients for whom the decision to initiate maintenance dialysis occurred in the VA between January 1, 2000, and December 31, 2009. Data analysis took place from June 1 to November 30, 2014. MAIN OUTCOMES AND MEASURES: Central themes related to the timing of initiation of dialysis as documented in patients' electronic medical records. RESULTS: Of the 1691 patients, 1264 (74.7%) initiated dialysis as inpatients and 1228 (72.6%) initiated dialysis with a hemodialysis catheter. Cohort members met with a nephrologist during an outpatient clinic visit a median of 3 times (interquartile range, 0-6) in the year prior to initiation of dialysis. The mean (SD) estimated glomerular filtration rate at the time of initiation for cohort members was 10.4 (5.7) mL/min/1.73 m2. The timing of initiation of dialysis reflected the complex interplay of at least 3 interrelated and dynamic processes. The first was physician practices, which ranged from practices intended to prepare patients for dialysis to those intended to forestall the need for dialysis by managing the signs and symptoms of uremia with medical interventions. The second process was sources of momentum. Initiation of dialysis was often precipitated by clinical events involving acute illness or medical procedures. In these settings, the imperative to treat often seemed to override patient choice. 
The third process was patient-physician dynamics. Interactions between patients and physicians were sometimes adversarial, and physician recommendations to initiate dialysis sometimes seemed to conflict with patient priorities. CONCLUSIONS AND RELEVANCE: The initiation of maintenance dialysis reflects the care practices of individual physicians, sources of momentum for initiation of dialysis, interactions between patients and physicians, and the complex interplay of these dynamic processes over time. Our findings suggest opportunities to improve communication between patients and physicians and to better align these processes with patients' values, goals, and preferences. |
What predicts an advanced-stage diagnosis of breast cancer? sorting out the influence of method of detection, access to care and biological factors
Lipscomb J , Fleming ST , Trentham-Dietz A , Kimmick G , Wu XC , Morris CR , Zhang K , Smith RA , Anderson RT , Sabatino SA . Cancer Epidemiol Biomarkers Prev 2016 25 (4) 613-23 BACKGROUND: Multiple studies have yielded important findings regarding the determinants of an advanced-stage diagnosis of breast cancer. We seek to advance this line of inquiry through a broadened conceptual framework and accompanying statistical modeling strategy that recognize the dual importance of access-to-care and biological factors on stage. METHODS: The CDC-sponsored Breast and Prostate Cancer Data Quality and Patterns of Care Study yielded a 7-state, cancer registry-derived population-based sample of 9,142 women diagnosed with a first primary in situ or invasive breast cancer in 2004. The likelihood of advanced-stage cancer (AJCC IIIB, IIIC, or IV) was investigated through multivariable regression modeling, with base-case analyses using the method of instrumental variables (IV) to detect and correct for possible selection bias. The robustness of base-case findings was examined through extensive sensitivity analyses. RESULTS: Advanced-stage disease was negatively associated with detection by mammography (p<0.001) and age<50 (p<0.001), and positively related to black race (p=0.066), not being privately insured [Medicaid (p=0.012), Medicare (p=0.036), uninsured (p=0.069)], being single (p=0.058), BMI>40 (p<0.001), a HER2 Type tumor (p<0.001), and tumor grade not well differentiated (p<0.001). This IV model detected and adjusted for significant selection effects associated with method of detection (p=0.017). Sensitivity analyses generally supported these base-case results. CONCLUSIONS: Through our comprehensive modeling strategy and multiple sensitivity analyses, we provide new estimates of the magnitude and robustness of the determinants of advanced-stage breast cancer. 
IMPACT: Statistical approaches frequently used to address observational data biases in treatment-outcome studies can be applied similarly in analyses of the determinants of stage-at-diagnosis. |
Location isn't everything: Proximity, hospital characteristics, choice of hospital, and disparities for breast cancer surgery patients
Keating NL , Kouri EM , He Y , Freedman RA , Volya R , Zaslavsky AM . Health Serv Res 2016 51 (4) 1561-83 OBJECTIVE: Assess the relative importance of proximity and other hospital characteristics in the choice of hospital for breast cancer surgery by race/ethnicity. DATA: SEER-Medicare data. STUDY DESIGN: Observational study of women aged >65 years receiving surgery for stage I/II/III breast cancer diagnosed in 1992-2007 in Detroit (N = 10,746 white/black), Atlanta (N = 4,018 white/black), Los Angeles (N = 9,433 white/black/Asian/Hispanic), and San Francisco (N = 4,856 white/black/Asian). We calculated the distance from each patient's census tract of residence to each area hospital. We estimated discrete choice models for the probability of receiving surgery at each hospital based on distance and assessed whether deviations from these predictions entailed interactions of hospital characteristics with the patient's race/ethnicity. We identified high-quality hospitals by rates of adjuvant radiation therapy and by survey measures of patient experiences, and we assessed how observed surgery rates at high-quality hospitals deviated from those predicted based on distance alone. PRINCIPAL FINDINGS: Proximity was significantly associated with hospital choice in all areas. Minority breast cancer patients, more often than white patients, had surgery at hospitals with more minority patients, at hospitals treating more Medicaid patients, and, in some areas, at lower-quality hospitals. CONCLUSIONS: Residential location alone does not explain the concentration of racial/ethnic-minority breast cancer surgery patients in certain hospitals that are sometimes of lower quality. |
The effect of multiple primary rules on cancer incidence rates and trends
Weir HK , Johnson CJ , Ward KC , Coleman MP . Cancer Causes Control 2016 27 (3) 377-90 PURPOSE: An examination of multiple primary cancers can provide insight into the etiologic role of genes, the environment, and prior cancer treatment on a cancer patient's risk of developing a subsequent cancer. Different rules for registering multiple primary cancers (MP) are used by cancer registries throughout the world, making data comparisons difficult. METHODS: We evaluated the effect of SEER and IARC/IACR rules on cancer incidence rates and trends using data from the SEER Program. We estimated age-standardized incidence rates (ASIR) and trends (1975-2011) for the top 26 cancer categories using joinpoint regression analysis. RESULTS: ASIRs were higher using SEER compared to IARC/IACR rules for all cancers combined (3 %) and, in rank order, melanoma (9 %), female breast (7 %), urinary bladder (6 %), colon (4 %), kidney and renal pelvis (4 %), oral cavity and pharynx (3 %), lung and bronchus (2 %), and non-Hodgkin lymphoma (2 %). ASIR differences were largest for patients aged 65+ years. Trends were similar using both MP rules with the exception of cancers of the urinary bladder, and kidney and renal pelvis. CONCLUSIONS: The choice of multiple primary coding rules affects incidence rates and trends. Compared to SEER MP coding rules, IARC/IACR rules are less complex, have not changed over time, and report fewer multiple primary cancers, particularly cancers that occur in paired organs, at the same anatomic site and with the same or related histologic type. Cancer registries collecting incidence data using SEER rules may want to consider including incidence rates and trends using IARC/IACR rules to facilitate international data comparisons. |
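The age-standardized incidence rates (ASIRs) compared in the abstract above come from direct standardization: each age group's crude rate is weighted by that group's share of a standard population (SEER typically uses the 2000 U.S. standard population), so rates computed under different coding rules, or in populations with different age structures, become comparable. A minimal sketch of the arithmetic, using made-up counts and weights rather than real SEER data:

```python
# Direct age standardization: ASIR = sum_i (w_i * r_i), where r_i is the
# age-specific rate (cases_i / person_years_i, scaled per 100,000) and
# w_i is the standard-population weight for age group i.
# All counts and weights below are illustrative, not real SEER data.

def age_specific_rate(cases, person_years, per=100_000):
    """Crude incidence rate for one age group, per `per` person-years."""
    return cases / person_years * per

def asir(rates, weights):
    """Directly age-standardized rate from age-specific rates and weights."""
    if abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("standard-population weights must sum to 1")
    return sum(w * r for w, r in zip(weights, rates))

# Two hypothetical age groups: younger (low rate) and older (high rate).
rates = [age_specific_rate(20, 200_000),   # 10 per 100,000
         age_specific_rate(150, 300_000)]  # 50 per 100,000
weights = [0.6, 0.4]                       # standard-population shares
print(asir(rates, weights))                # 26.0 per 100,000
```

Because both rule sets are standardized to the same weights, the percentage differences reported above (e.g., 9% for melanoma) reflect extra registered tumors, not differences in population age structure.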
Breast and cervical cancer screening among Hispanic subgroups in the USA: estimates from the National Health Interview Survey 2008, 2010, and 2013
Shoemaker ML , White MC . Cancer Causes Control 2016 27 (3) 453-7 PURPOSE: This study examined patterns in mammography and Pap test use across and within subpopulations of Hispanic women. METHODS: Based on data from the National Health Interview Survey (2008, 2010, and 2013), we estimated the proportion of Hispanic women reporting testing for breast and cervical cancer for specific subgroups. We examined test use by demographic characteristics using Chi-square tests. RESULTS: Overall, the proportion of women aged 50-74 years who reported a mammogram within the past 2 years did not differ significantly across Hispanic subgroups. Among publicly insured and uninsured women, however, proportions of mammography utilization varied significantly across Hispanic subgroups. The proportion of women aged 21-65 years who received a Pap test within the past 3 years differed significantly across Hispanic subgroups. CONCLUSIONS: Among subgroups of Hispanic women, patterns in mammography and Pap test use vary by insurance status, length of US residency, and type of screening. Certain subgroups of Hispanic women may benefit from culturally tailored efforts to promote breast and cervical cancer screening. |
Cancer incidence in Appalachia, 2004-2011
Wilson RJ , Ryerson AB , Singh SD , King JB . Cancer Epidemiol Biomarkers Prev 2016 25 (2) 250-8 BACKGROUND: Limited literature is available about cancer in the Appalachian Region. This is the only known analysis of all cancers for Appalachia and non-Appalachia covering 100% of the US population. Appalachian cancer incidence and trends were evaluated by state, sex, and race and compared with those found in non-Appalachian regions. METHODS: US counties were identified as Appalachian or non-Appalachian. Age-adjusted cancer incidence rates, standard errors, and confidence intervals were calculated using the most recent data from the United States Cancer Statistics for 2004 to 2011. RESULTS: Generally, Appalachia carries a higher cancer burden compared with non-Appalachia, particularly for tobacco-related cancers. For all cancer sites combined, Appalachia has higher rates regardless of sex, race, or region. The Appalachia and non-Appalachia cancer incidence gap has narrowed, with the exception of oral cavity and pharynx, larynx, lung and bronchus, and thyroid cancers. CONCLUSIONS: Higher cancer incidence continues in Appalachia and appears at least in part to reflect high tobacco use and potential differences in socioeconomic status, other risk factors, patient health care utilization, or provider practices. It is important to continue to evaluate this population to monitor results from screening and early detection programs, understand behavioral risk factors related to cancer incidence, increase efforts to reduce tobacco use and increase cancer screening, and identify other areas where effective interventions may mediate disparities. IMPACT: Surveillance and evaluation of special populations provide means to monitor screening and early detection programs, understand behavioral risk factors, and increase efforts to reduce tobacco use to mediate disparities. |
Diabetes and cardiovascular disease risk in Cambodian refugees
Marshall GN , Schell TL , Wong EC , Berthold SM , Hambarsoomian K , Elliott MN , Bardenheier BH , Gregg EW . J Immigr Minor Health 2016 18 (1) 110-7 To determine rates of diabetes, hypertension, and hyperlipidemia in Cambodian refugees, and to assess the proportion whose conditions are satisfactorily managed in comparison to the general population. Self-report and laboratory/physical health assessment data obtained from a household probability sample of U.S.-residing Cambodian refugees (N = 331) in 2010-2011 were compared to a probability sample of the adult U.S. population (N = 6,360) from the 2009-2010 National Health and Nutrition Examination Survey. Prevalence of diabetes, hypertension and hyperlipidemia in Cambodian refugees greatly exceeded rates found in the age- and gender-adjusted U.S. population. Cambodian refugees with diagnosed hypertension or hyperlipidemia were less likely than their counterparts in the general U.S. population to have blood pressure and total cholesterol within recommended levels. Increased attention should be paid to prevention and management of diabetes and cardiovascular disease risk factors in the Cambodian refugee community. Research is needed to determine whether this pattern extends to other refugee groups. |
Bordetella pertussis Strain Lacking Pertactin and Pertussis Toxin.
Williams MM , Sen K , Weigand MR , Skoff TH , Cunningham VA , Halse TA , Tondella ML . Emerg Infect Dis 2016 22 (2) 319-22 A Bordetella pertussis strain lacking 2 acellular vaccine immunogens, pertussis toxin and pertactin, was isolated from an unvaccinated infant in New York State in 2013. Comparison with a French strain that was pertussis toxin-deficient, pertactin wild-type showed that the strains carry the same 28-kb deletion in similar genomes. |
Serum Biomarkers Indicate Long-term Reduction in Liver Fibrosis in Patients With Sustained Virological Response to Treatment for HCV Infection
Lu M , Li J , Zhang T , Rupp LB , Trudeau S , Holmberg SD , Moorman AC , Spradling PR , Teshale EH , Xu F , Boscarino JA , Schmidt MA , Vijayadeva V , Gordon SC . Clin Gastroenterol Hepatol 2016 14 (7) 1044-1055 e3 BACKGROUND & AIMS: Sustained viral response (SVR) to antiviral therapy for hepatitis C virus (HCV) correlates with changes in biochemical measures of liver function. However, little is known about the long-term effects of SVR on liver fibrosis. We investigated the effects of HCV therapy on fibrosis, based on fibrosis-4 (FIB4) score, over a 10-year period. METHODS: We collected data from participants in the Chronic Hepatitis C Cohort, part of an observational multicenter study of patients with hepatitis C at 4 large US health systems, from January 1, 2006 through December 31, 2013. We calculated patients' FIB4 and aminotransferase-to-platelet ratio index (APRI) scores over a 10-year period. Of 4731 patients with HCV infection, 1657 (35%) were treated and 755 (46%) of these patients achieved an SVR. RESULTS: In propensity score-adjusted analyses, we observed significant longitudinal changes in FIB4 score that varied with treatment and response to treatment. In patients with an SVR, FIB4 scores were initially higher than in patients without SVRs, but then decreased sharply, remaining significantly lower over the 10-year period than in untreated patients or patients with treatment failure (P<.001). In independent analyses, men and patients with HCV genotype 1 or 3 infections had higher FIB4 scores than women or patients with HCV genotype 2 infections (P<.01 for both). Findings were similar in a sensitivity analysis that substituted the APRI as the marker of fibrosis instead of FIB4 score. CONCLUSIONS: An SVR to HCV treatment appears to induce long-term regression of fibrosis, based on FIB4 or APRI scores collected over 10 years from patients in a large study. Patients receiving no treatment or with treatment failure had progressive increases in FIB4 scores. |
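The two noninvasive fibrosis markers tracked in this study are simple functions of routine laboratory values: FIB-4 = (age x AST) / (platelets x sqrt(ALT)), and APRI = 100 x (AST / AST upper limit of normal) / platelets, with AST and ALT in U/L and platelets in 10^9/L. A small sketch of the calculations, with an illustrative patient rather than study data, and assuming the commonly used AST upper limit of normal of 40 U/L:

```python
import math

def fib4(age_years, ast_u_l, alt_u_l, platelets_10e9_l):
    """FIB-4 index: (age x AST) / (platelets x sqrt(ALT))."""
    return (age_years * ast_u_l) / (platelets_10e9_l * math.sqrt(alt_u_l))

def apri(ast_u_l, platelets_10e9_l, ast_uln=40.0):
    """AST-to-platelet ratio index; ast_uln is the assay's upper limit of normal."""
    return (ast_u_l / ast_uln) / platelets_10e9_l * 100

# Hypothetical patient: age 55, AST 80 U/L, ALT 60 U/L, platelets 150 x 10^9/L.
print(round(fib4(55, 80, 60, 150), 2))  # 3.79
print(round(apri(80, 150), 2))          # 1.33
```

Because both indices rise with AST and fall with platelet count, they can be recomputed at every routine lab draw, which is what makes the 10-year longitudinal trajectories in this study feasible without repeat biopsies.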
Standardizing the influenza neuraminidase inhibition assay among United States public health laboratories conducting virological surveillance
Okomo-Adhiambo M , Mishin VP , Sleeman K , Saguar E , Guevara H , Reisdorf E , Griesser RH , Spackman KJ , Mendenhall M , Carlos MP , Healey B , St George K , Laplante J , Aden T , Chester S , Xu X , Gubareva LV . Antiviral Res 2016 128 28-35 BACKGROUND: Monitoring influenza virus susceptibility to neuraminidase (NA) inhibitors (NAIs) is vital for detecting drug-resistant variants, and is primarily assessed using NA inhibition (NI) assays, supplemented by NA sequence analysis. However, differences in NI testing methodologies between surveillance laboratories result in variability of 50% inhibitory concentration (IC50) values, which impacts data sharing, reporting and interpretation. In 2011, the Centers for Disease Control and Prevention (CDC), in collaboration with the Association of Public Health Laboratories (APHL), spearheaded efforts to standardize fluorescence-based NI assay testing in the United States (U.S.), with the goal of achieving consistency of IC50 data. METHODS: For the standardization process, three participating state public health laboratories (PHLs), designated as National Surveillance Reference Centers for Influenza (NSRC-Is), assessed the NAI susceptibility of the 2011-12 CDC reference virus panel using stepwise procedures with support from the CDC reference laboratory. Next, the NSRC-Is assessed the NAI susceptibility of season 2011-12 U.S. influenza surveillance isolates (n=940), with a large subset (n=742) tested in parallel by CDC. Subsequently, U.S. influenza surveillance isolates (n=9629) circulating during the next three influenza seasons (2012-15) were independently tested by the three NSRC-Is (n=7331) and CDC (n=2298). RESULTS: The NI assay IC50s generated by respective NSRC-Is using viruses and drugs prepared by CDC were similar to those obtained with viruses and drugs prepared in-house, and were uniform between laboratories. IC50s for U.S. 
surveillance isolates tested during four consecutive influenza seasons (2011-15) were consistent from season to season, within and between laboratories. CONCLUSION: These results show that the NI assay is robust enough to be standardized, marking the first time IC50 data have been normalized across multiple laboratories, and used for U.S. national NAI susceptibility surveillance. |
Young people and HIV: A call to action
Koenig LJ , Hoyer D , Purcell DW , Zaza S , Mermin J . Am J Public Health 2016 106 (3) e1-e4 HIV is having a significant impact on young people, among whom the rate of new diagnoses is high and health disparities are more pronounced. Incidence is increasing among young gay and bisexual men, and, among Black males, the largest percentage of new infections occurs among those aged between 13 and 24 years. Youths are least likely to experience the health and prevention benefits of treatment. Nearly half of young people with HIV are not diagnosed; among those diagnosed, nearly a quarter are not linked to care, and three quarters are not virally suppressed. Addressing this burden will require renewed efforts to implement effective prevention strategies across multiple sectors, including educational, social, policy, and health care systems that influence prevention knowledge, service use, and treatment options for youths. |
Notes from the field: Ongoing cholera outbreak - Kenya, 2014-2016
George G , Rotich J , Kigen H , Catherine K , Waweru B , Boru W , Galgalo T , Githuku J , Obonyo M , Curran K , Narra R , Crowe SJ , O'Reilly CE , Macharia D , Montgomery J , Neatherlin J , De Cock KM , Lowther S , Gura Z , Langat D , Njeru I , Kioko J , Muraguri N . MMWR Morb Mortal Wkly Rep 2016 65 (3) 68-69 On January 6, 2015, a man aged 40 years was admitted to Kenyatta National Hospital in Nairobi, Kenya, with acute watery diarrhea. The patient was found to be infected with toxigenic Vibrio cholerae serogroup O1, serotype Inaba. A subsequent review of surveillance reports identified four patients in Nairobi County during the preceding month who met either of the Kenya Ministry of Health suspected cholera case definitions: 1) severe dehydration or death from acute watery diarrhea (more than four episodes in 12 hours) in a patient aged ≥5 years, or 2) acute watery diarrhea in a patient aged ≥2 years in an area where there was an outbreak of cholera. An outbreak investigation was immediately initiated. A confirmed cholera case was defined as isolation of V. cholerae O1 or O139 from the stool of a patient with suspected cholera or a suspected cholera case that was epidemiologically linked to a confirmed case. By January 15, 2016, a total of 11,033 suspected or confirmed cases had been reported from 22 of Kenya's 47 counties. The outbreak is ongoing. |
Notes from the field: Tetanus cases after voluntary medical male circumcision for HIV prevention - Eastern and Southern Africa, 2012-2015
Grund JM , Toledo C , Davis SM , Ridzon R , Moturi E , Scobie H , Naouri B , Reed JB , Njeuhmeli E , Thomas AG , Benson FN , Sirengo MW , Muyenzi LN , Lija GJ , Rogers JH , Mwanasalli S , Odoyo-June E , Wamai N , Kabuye G , Zulu JE , Aceng JR , Bock N . MMWR Morb Mortal Wkly Rep 2016 65 (2) 36-7 Voluntary medical male circumcision (VMMC) decreases the risk for female-to-male HIV transmission by approximately 60% (1), and the President's Emergency Plan for AIDS Relief (PEPFAR) is supporting the scale-up of VMMC for adolescent and adult males in countries with high prevalence of human immunodeficiency virus (HIV) and low coverage of male circumcision (2). As of September 2015, PEPFAR has supported approximately 8.9 million VMMCs (3). |
Ensuring quality: a key consideration in scaling-up HIV-related point-of-care testing programs
Fonjungo PN , Osmanov S , Kuritsky J , Ndihokubwayo JB , Bachanas P , Peeling RW , Timperi R , Fine G , Stevens W , Habiyambere V , Nkengasong JN . AIDS 2016 30 (8) 1317-23 OBJECTIVE: The objective of the World Health Organization (WHO)/U.S. President's Emergency Plan for AIDS Relief (PEPFAR) consultation was to discuss innovative strategies, offer guidance and develop a comprehensive policy framework for implementing quality-assured HIV-related point-of-care testing (POCT). METHODS: The consultation was attended by representatives from international agencies (WHO, UNICEF, UNITAID, Clinton Health Access Initiative [CHAI]), USAID, Centers for Disease Control and Prevention [CDC]/PEPFAR Cooperative Agreement Partners, and experts from more than 25 countries including policy makers, clinicians, laboratory experts and program implementers. MAIN OUTCOMES: There was strong consensus among all participants that ensuring access to quality-assured POCT represents one of the key challenges for the success of HIV prevention, treatment and care programs. The following four strategies were recommended: 1) implement a newly proposed concept of a sustainable quality assurance cycle that includes (a) careful planning; (b) definition of goals and targets; (c) timely implementation; (d) continuous monitoring; (e) improvements and adjustments, where necessary; and (f) a detailed evaluation; 2) the importance of supporting a cadre of workers (e.g. volunteer quality corps [Q-Corps]) with the role to ensure that the quality assurance cycle is followed and sustained; 3) implementation of the new strategy should be seen as a step-wise process, supported by development of appropriate policies and tools; and 4) joint partnership under the leadership of the Ministries of Health to ensure sustainability of implementing novel approaches. CONCLUSIONS: The outcomes of this consultation have been well received by program implementers in the field. 
The recommendations also laid the groundwork for developing key policy and quality documents for the implementation of HIV-related POCT. |
Epidemiology of invasive group A streptococcal disease in Alaska, 2001 to 2013
Rudolph K , Bruce MG , Bruden D , Zulz T , Reasonover A , Hurlburt D , Hennessy T . J Clin Microbiol 2016 54 (1) 134-41 The Arctic Investigations Program (AIP) began surveillance for invasive group A streptococcal (GAS) infections in Alaska in 2000 as part of the invasive bacterial diseases population-based laboratory surveillance program. Between 2001 and 2013, there were 516 cases of GAS infection reported, for an overall annual incidence of 5.8 cases per 100,000 persons, with 56 deaths (case fatality rate, 10.7%). Of the 516 confirmed cases of invasive GAS infection, 422 (82%) had isolates available for laboratory analysis. All isolates were susceptible to penicillin, cefotaxime, and levofloxacin. Resistance to tetracycline, erythromycin, and clindamycin was seen in 11% (n = 8), 5.8% (n = 20), and 1.2% (n = 4) of the isolates, respectively. A total of 51 emm types were identified, of which emm1 (11.1%) was the most prevalent, followed by emm82 (8.8%), emm49 (7.8%), emm12 and emm3 (6.6% each), emm89 (6.2%), emm108 (5.5%), emm28 (4.7%), emm92 (4%), and emm41 (3.8%). The five most common emm types accounted for 41% of isolates. The emm types in the proposed 26-valent and 30-valent vaccines accounted for 56% and 78% of all cases, respectively. GAS remains an important cause of invasive bacterial disease in Alaska. Continued surveillance of GAS infections will help improve understanding of the epidemiology of invasive disease, with an impact on disease control, notification of outbreaks, and vaccine development. |
Estimates of CDC-funded and national HIV diagnoses: A comparison by demographic and HIV-related factors
Krueger A , Dietz P , Van Handel M , Belcher L , Johnson AS . AIDS Behav 2016 20 (12) 2961-2965 To determine whether CDC-funded HIV testing programs are reaching persons disproportionately affected by HIV infection. The percentage distribution for HIV testing and diagnoses by demographics and transmission risk group (diagnoses only) was calculated using 2013 data from CDC's National HIV Surveillance System and CDC's national HIV testing program data. In 2013, nearly 3.2 million CDC-funded tests were provided to persons aged 13 years and older. Among persons who received a CDC-funded test, 41.1% were aged 20-29 years; 49.2% were male, 46.2% were black/African American, and 56.2% of the tests were conducted in the South. Compared with the characteristics of all persons diagnosed with HIV in the United States in 2013, among persons diagnosed as a result of CDC-funded tests, a higher percentage were aged 20-29 years (40.3 vs 33.7%) and black/African American (55.3 vs 46.0%). CDC-funded HIV testing programs are reaching young people and blacks/African Americans. |
Factors associated with ever being HIV-tested in Zimbabwe: an extended analysis of the Zimbabwe Demographic and Health Survey (2010-2011)
Takarinda KC , Madyira LK , Mhangara M , Makaza V , Maphosa-Mutsaka M , Rusakaniko S , Kilmarx PH , Mutasa-Apollo T , Ncube G , Harries AD . PLoS One 2016 11 (1) e0147828 INTRODUCTION: Zimbabwe has a high human immunodeficiency virus (HIV) burden. It is therefore important to scale up HIV-testing and counseling (HTC) as a gateway to HIV prevention, treatment and care. OBJECTIVE: To determine factors associated with being HIV-tested among adult men and women in Zimbabwe. METHODS: Secondary analysis was done using data from 7,313 women and 6,584 men who completed interviewer-administered questionnaires and provided blood specimens for HIV testing during the Zimbabwe Demographic and Health Survey (ZDHS) 2010-11. Factors associated with ever being HIV-tested were determined using multivariate logistic regression. RESULTS: HIV-testing was higher among women compared to men (61% versus 39%). HIV-infected respondents were more likely to be tested compared to those who were HIV-negative for both men [adjusted odds ratio (AOR) = 1.53; 95% confidence interval (CI) (1.27-1.84)] and women [AOR = 1.42; 95% CI (1.20-1.69)]. However, only 55% and 74% of these HIV-infected men and women respectively had ever been tested. Among women, visiting antenatal care (ANC) [AOR = 5.48, 95% CI (4.08-7.36)] was the most significant predictor of being tested whilst a novel finding for men was higher odds of testing among those reporting a sexually transmitted infection (STI) in the past 12 months [AOR = 1.86, 95%CI (1.26-2.74)]. Among men, the odds of ever being tested increased with age ≥20 years, particularly those 45-49 years [AOR = 4.21; 95% CI (2.74-6.48)] whilst for women testing was highest among those aged 25-29 years [AOR = 2.01; 95% CI (1.63-2.48)]. Other significant factors for both sexes were increasing education level, higher wealth status and currently/formerly being in union. 
CONCLUSIONS: There remains a high proportion of undiagnosed HIV-infected persons and hence there is a need for innovative strategies aimed at increasing HIV-testing, particularly for men and in lower-income and lower-educated populations. Promotion of STI services can be an important gateway for testing more men whilst ANC still remains an important option for HIV-testing among pregnant women. |
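The adjusted odds ratios above come from multivariate logistic regression; as a simpler illustration of the underlying measure, a crude odds ratio with a Woolf (log-scale) 95% confidence interval can be computed directly from a 2x2 table. The counts below are hypothetical, not drawn from the ZDHS data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Woolf (log-scale) 95% CI from a 2x2 table:
    a = exposed, tested; b = exposed, untested;
    c = unexposed, tested; d = unexposed, untested."""
    or_point = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lower = math.exp(math.log(or_point) - z * se_log)
    upper = math.exp(math.log(or_point) + z * se_log)
    return or_point, lower, upper

# Hypothetical counts: 10/100 exposed tested vs 5/100 unexposed tested
print(odds_ratio_ci(10, 90, 5, 95))
```

Unlike this crude calculation, the study's adjusted odds ratios control for covariates (age, education, wealth, union status) within the regression model.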
Inadequate diagnosis and treatment of malaria among travelers returning from Africa during the Ebola epidemic - United States, 2014-2015
Tan KR , Cullen KA , Koumans EH , Arguin PM . MMWR Morb Mortal Wkly Rep 2016 65 (2) 27-9 Among 1,683 persons in the United States who developed malaria following international travel during 2012, more than half acquired disease in one of 16 countries* in West Africa (1). Since March 2014, West Africa has experienced the world's largest epidemic of Ebola virus disease (Ebola), primarily affecting Guinea, Sierra Leone, and Liberia; in 2014, approximately 20,000 Ebola cases were reported (2). Both Ebola and malaria are often characterized by fever and malaise and can be clinically indistinguishable, especially early in the course of disease. Immediate laboratory testing is critical for diagnosis of both Ebola and malaria, so that appropriate lifesaving treatment can be initiated. CDC recommends prompt malaria testing of patients with fever and history of travel to an area that is endemic for malaria, using blood smear microscopy, with results available within a few hours (3). Empiric treatment of malaria is not recommended by CDC (4). Reverse transcription-polymerase chain reaction (RT-PCR) testing is recommended to diagnose Ebola (5). During the Ebola outbreak in West Africa, CDC received reports of delayed laboratory testing for malaria in travelers returning to the United States because of infection control concerns related to Ebola (6). CDC reviewed documented calls to its malaria consultation service and selected three patient cases to present as examples of deficiencies in the evaluation and treatment of malaria among travelers returning from Africa during the Ebola epidemic. |
Increases in acute hepatitis B virus infections - Kentucky, Tennessee, and West Virginia, 2006-2013
Harris AM , Iqbal K , Schillie S , Britton J , Kainer MA , Tressler S , Vellozzi C . MMWR Morb Mortal Wkly Rep 2016 65 (3) 47-50 As many as 2.2 million persons in the United States are chronically infected with hepatitis B virus (HBV) (1), and approximately 15%-25% of persons with chronic HBV infection will die prematurely from cirrhosis or liver cancer (2). Since 2006, the overall U.S. incidence of acute HBV infection has remained stable; the rate in 2013 was 1.0 case per 100,000 persons (3). Hepatitis B vaccination is highly effective in preventing HBV infection and is recommended for all infants (beginning at birth), all adolescents, and adults at risk for HBV infection (e.g., persons who inject drugs, men who have sexual contact with men, persons infected with human immunodeficiency virus [HIV], and others). Hepatitis B vaccination coverage is low among adults: 2013 National Health Interview Survey data indicated that coverage with ≥3 doses of hepatitis B vaccine was 32.6% for adults aged 19-49 years (4). Injection drug use is a risk factor for both hepatitis C virus (HCV) and HBV. Among young adults in some rural U.S. communities, an increased incidence of HCV infection has been associated with a concurrent increase of injection drug use (5); and recent data indicate an increase of acute HCV infection in the Appalachian region associated with injection drug use (6). Using data from the National Notifiable Diseases Surveillance System (NNDSS) during 2006-2013, CDC assessed the incidence of acute HBV infection in three of the four Appalachian states (Kentucky, Tennessee, and West Virginia) included in the HCV infection study (6). Similar to the increase of HCV infections recently reported, an increase in incident cases of acute HBV infection in these three states has occurred among non-Hispanic whites (whites) aged 30-39 years who reported injection drug use as a common risk factor. 
Since 2009, cases of acute HBV infection have been reported from more non-urban than urban regions. Evidence-based services to prevent HBV infection are needed. |
Active monitoring of travelers arriving from Ebola-affected countries - New York City, October 2014-April 2015
Millman AJ , Chamany S , Guthartz S , Thihalolipavan S , Porter M , Schroeder A , Vora NM , Varma JK , Starr D . MMWR Morb Mortal Wkly Rep 2016 65 (3) 51-54 The Ebola virus disease (Ebola) outbreak in West Africa has claimed approximately 11,300 lives (1), and the magnitude and course of the epidemic prompted many nonaffected countries to prepare for Ebola cases imported from affected countries. In October 2014, CDC and the Department of Homeland Security (DHS) implemented enhanced entry risk assessment and management at five U.S. airports: John F. Kennedy (JFK) International Airport in New York City (NYC), O'Hare International Airport in Chicago, Newark Liberty International Airport in New Jersey, Hartsfield-Jackson International Airport in Atlanta, and Dulles International Airport in Virginia (2). Enhanced entry risk assessment began at JFK on October 11, 2014, and at the remaining airports on October 16 (3). On October 21, DHS exercised its authority to direct all travelers flying into the United States from an Ebola-affected country to arrive at one of the five participating airports. At the time, the Ebola-affected countries included Guinea, Liberia, Mali, and Sierra Leone. On October 27, CDC issued updated guidance for monitoring persons with potential Ebola virus exposure (4), including recommending daily monitoring of such persons to ascertain the presence of fever or symptoms for a period of 21 days (the maximum incubation period of Ebola virus) after the last potential exposure; this was termed "active monitoring." CDC also recommended "direct active monitoring" of persons with a higher risk for Ebola virus exposure, including health care workers who had provided direct patient care in Ebola-affected countries. Direct active monitoring required direct observation of the person being monitored by the local health authority at least once daily (5). 
This report describes the operational structure of the NYC Department of Health and Mental Hygiene's (DOHMH) active monitoring program during its first 6 months (October 2014-April 2015) of operation. Data collected on persons who required direct active monitoring are not included in this report. |
Anaemia in HIV-infected pregnant women receiving triple antiretroviral combination therapy for prevention of mother-to-child transmission: a secondary analysis of the Kisumu breastfeeding study (KiBS)
Odhiambo C , Zeh C , Angira F , Opollo V , Akinyi B , Masaba R , Williamson JM , Otieno J , Mills LA , Lecher SL , Thomas TK . Trop Med Int Health 2016 21 (3) 373-84 OBJECTIVE: The prevalence of anaemia during pregnancy is estimated to be 35-75% in sub-Saharan Africa and is associated with an increased risk of maternal mortality. We evaluated the frequency and factors associated with anaemia in HIV-infected women undergoing antiretroviral (ARV) therapy for prevention of mother-to-child transmission (PMTCT) enrolled in The Kisumu Breastfeeding Study 2003-2009. METHODS: Maternal haematological parameters were monitored from 32 to 34 weeks of gestation to 2 years post-delivery among 522 enrolled women. Clinical and laboratory assessments for causes of anaemia were performed, and appropriate management was initiated. Anaemia was graded using the National Institutes of Health Division of AIDS 1994 Adult Toxicity Tables. Data were analysed using SAS software, v 9.2. The Wilcoxon two-sample rank test was used to compare groups. A logistic regression model was fitted to describe the trend in anaemia over time. RESULTS: At enrolment, the prevalence of any grade anaemia (Hb < 9.4 g/dl) was 61.8%, but fell during ARV therapy, reaching a nadir (7.4%) by 6 months post-partum. A total of 41 women (8%) developed severe anaemia (Hb < 7 g/dl) during follow-up; 2 (4.9%) were hospitalised for blood transfusion, whereas 3 (7.3%) were transfused while hospitalised (for delivery). The greatest proportion of severe anaemia events occurred around delivery (48.8%; n = 20). Anaemia (Hb ≥ 7 and < 9.4 g/dl) at enrolment was associated with severe anaemia at delivery (OR 5.87; 95% CI: 4.48, 7.68, P < 0.01). Few cases of severe anaemia coincided with clinical malaria (24.4%; n = 10) and helminth (7.3%; n = 3) infections. CONCLUSION: Resolution of anaemia among most participants during study follow-up was likely related to receipt of ARV therapy. 
Efforts should be geared towards addressing common causes of anaemia in HIV-infected pregnant women, prioritising initiation of ARV therapy and management of peripartum blood loss. |
Burden of nursing home-onset Clostridium difficile infection in the United States: estimates of incidence and patient outcomes
Hunter JC , Mu Y , Dumyati GK , Farley MM , Winston LG , Johnston HL , Meek JI , Perlmutter R , Holzbauer SM , Beldavs ZG , Phipps EC , Dunn JR , Cohen JA , Avillan J , Stone ND , Gerding DN , McDonald LC , Lessa FC . Open Forum Infect Dis 2016 3 (1) ofv196 BACKGROUND: Approximately 4 million Americans receive nursing home (NH) care annually. Nursing home residents commonly have risk factors for Clostridium difficile infection (CDI), including advanced age and antibiotic exposures. We estimated national incidence of NH-onset (NHO) CDI and patient outcomes. METHODS: We identified NHO-CDI cases from population-based surveillance of 10 geographic areas in the United States. Cases were defined by C difficile-positive stool collected in an NH (or from NH residents in outpatient settings or ≤3 days after hospital admission) without a positive stool in the prior 8 weeks. Medical records were reviewed on a sample of cases. Incidence was estimated using regression models accounting for age and laboratory testing method; sampling weights were applied to estimate hospitalizations, recurrences, and deaths. RESULTS: A total of 3503 NHO-CDI cases were identified. Among 262 sampled cases, median age was 82 years, 76% received antibiotics in the 12 weeks prior to the C difficile-positive specimen, and 57% were discharged from a hospital in the month before specimen collection. After adjusting for age and testing method, the 2012 national estimate for NHO-CDI incidence was 112 800 cases (95% confidence interval [CI], 93 400-131 800); 31 400 (28%) were hospitalized within 7 days after a positive specimen (95% CI, 25 500-37 300), 20 900 (19%) recurred within 14-60 days (95% CI, 14 600-27 100), and 8700 (8%) died within 30 days (95% CI, 6600-10 700). CONCLUSIONS: Nursing home onset CDI is associated with substantial morbidity and mortality. 
Strategies focused on infection prevention in NHs and appropriate antibiotic use in both NHs and acute care settings may decrease the burden of NHO CDI. |
New records and updated checklist of phlebotomine sand flies (Diptera: Psychodidae) from Liberia
Obenauer PJ , Rueda LM , El-Hossary SS , Watany N , Stoops CA , Fakoli LS , Bolay FK , Diclaro JW 2nd . J Med Entomol 2016 53 (3) 717-720 Phlebotomine sand flies from three counties in Liberia were collected from January 2011 to July 2013. In total, 3,118 sand flies were collected: 18 species were identified, 13 of which represented new records for Liberia. An updated taxonomic checklist is provided with a brief note on sand fly biology, and the disease vector potential for species is discussed. |
A detailed exploration into the association of prescribed opioid dosage and overdose deaths among patients with chronic pain
Bohnert AS , Logan JE , Ganoczy D , Dowell D . Med Care 2016 54 (5) 435-41 BACKGROUND: High opioid dosage has been associated with overdose, and clinical guidelines have cautioned against escalating dosages above 100 morphine-equivalent mg (MEM) based on the potential harm and the absence of evidence of benefit from high dosages. However, this 100 MEM threshold was chosen somewhat arbitrarily. OBJECTIVE: To examine the association of prescribed opioid dosage as a continuous measure in relation to risk of unintentional opioid overdose to identify the range of dosages associated with risk of overdose at a detailed level. METHODS: In this nested case-control study with risk-set sampling of controls, cases (opioid overdose decedents) and controls were identified from a population of patients of the Veterans Health Administration who were prescribed opioids and who have a chronic pain diagnosis. Unintentional fatal opioid analgesic overdose was measured from National Death Index records and prescribed opioid dosage from pharmacy records. RESULTS: The average prescribed opioid dosage was higher (P<0.001) for cases (mean=98.1 MEM, SD=112.7; median=60, interquartile range, 30-120), than controls (mean=47.7 MEM, SD=65.2; median=25, interquartile range, 15-45). In a ROC analysis, dosage was a moderately good "predictor" of opioid overdose death, indicating that, on average, overdose cases had a prescribed opioid dosage higher than 71% of controls. CONCLUSIONS: A clear cut-point in opioid dosage to distinguish between overdose cases and controls was not found. However, lowering the recommended dosage threshold below the 100 MEM used in many recent guidelines would affect proportionately few patients not at risk for overdose while potentially benefitting many of those at risk for overdose. |
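The dosage measure above, morphine-equivalent mg (MEM), puts different opioids on a common scale by multiplying each dose by a drug-specific conversion factor. A minimal sketch, using commonly published conversion factors for illustration (the factors and drug names below are assumptions, not values taken from the study's VHA pharmacy records):

```python
# Commonly published morphine-equivalent conversion factors (illustrative).
MEM_FACTOR = {
    "morphine": 1.0,
    "hydrocodone": 1.0,
    "oxycodone": 1.5,
    "hydromorphone": 4.0,
}

def daily_mem(regimen):
    """Total daily morphine-equivalent mg.
    regimen: iterable of (drug, mg_per_dose, doses_per_day) tuples."""
    return sum(MEM_FACTOR[drug] * mg * n for drug, mg, n in regimen)

# 10 mg oxycodone four times daily -> 60 MEM, below the 100 MEM threshold
# discussed in the guidelines cited above.
print(daily_mem([("oxycodone", 10, 4)]))
```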
Economic and social impact of Pertussis among adolescents in San Diego County
Varan AK , Harriman KH , Winter K , Thun MD , McDonald EC . J Adolesc Health 2016 58 (2) 241-4 PURPOSE: During recent pertussis epidemics, adolescents have experienced a large burden of disease. We assessed the impact of pertussis among San Diego adolescents and their households. METHODS: Parents of pertussis patients aged 13-17 years were surveyed about health care utilization, missed work and school, and other factors. Costs of medical visits, medication use, and lost wages were estimated. RESULTS: The parents of 53 (of 108 [49%]) eligible 2013 pertussis patients were interviewed; 51 (96%) of these patients previously received tetanus, diphtheria, and acellular pertussis vaccine. Medical visits included primary care (81%), urgent care (11%), and emergency department (9%); all patients received antibiotics. Forty-seven households (89%) received a post-exposure prophylaxis recommendation, and five (9%) reported ≥1 unpaid parental leave day. Thirty-eight patients (72%) missed ≥1 school day (mean = 5.4 days). Societal costs were estimated at $315.15 per household and $236,047.35 in San Diego during 2013-2014. CONCLUSIONS: Even among vaccinated adolescents, pertussis can result in considerable societal costs. |
Antidepressant prescription claims among reproductive-aged women with private employer-sponsored insurance - United States 2008-2013
Dawson AL , Ailes EC , Gilboa SM , Simeone RM , Lind JN , Farr SL , Broussard CS , Reefhuis J , Carrino G , Biermann J , Honein MA . MMWR Morb Mortal Wkly Rep 2016 65 (3) 41-46 Antidepressant medication use during pregnancy has been increasing in the United States (1). Many women require antidepressants on an ongoing basis, and a clear consensus on the safest medication options for both the mother and her fetus does not exist (2). Given that half of all U.S. pregnancies are unplanned (3), antidepressant use will occur during the first weeks of pregnancy, a critical period for fetal development. To understand trends among women of reproductive age, CDC used Truven Health's MarketScan Commercial Claims and Encounters data* to estimate the number of antidepressant prescriptions filled by women aged 15-44 years with private employer-sponsored insurance. During 2008-2013, an average of 15.4% of women aged 15-44 years filled at least one prescription for an antidepressant in a single year. The most frequently filled antidepressants included sertraline, bupropion, and citalopram. Prescribing of antidepressants is common, and research on antidepressant safety during pregnancy needs to be accelerated to provide evidence-based information to health care providers and women about the potential risks for antidepressant exposure before and during pregnancy and between pregnancies. |
Effects of repeated annual inactivated influenza vaccination among healthcare personnel on serum hemagglutinin inhibition antibody response to A/Perth/16/2009 (H3N2)-like virus during 2010-11
Thompson MG , Naleway A , Fry AM , Ball S , Spencer SM , Reynolds S , Bozeman S , Levine M , Katz JM , Gaglani M . Vaccine 2016 34 (7) 981-8 BACKGROUND: Recently, lower estimates of influenza vaccine effectiveness (VE) against A(H3N2) virus illness among those vaccinated during the previous season or multiple seasons have been reported; however, it is unclear whether these effects are due to differences in immunogenicity. METHODS: We performed hemagglutination inhibition antibody (HI) assays on serum collected at preseason, approximately 30 days post-vaccination, and postseason from a prospective cohort of healthcare personnel (HCP). Eligible participants had medical and vaccination records for at least four years (since July, 2006), including 578 HCP who received 2010-11 trivalent inactivated influenza vaccine [IIV3, containing A/Perth/16/2009-like A(H3N2)] and 209 HCP who declined vaccination. Estimates of the percentage with high titers (≥40 and>100) and geometric mean fold change ratios (GMRs) to A/Perth/16/2009-like virus by number of prior vaccinations were adjusted for age, sex, race, education, household size, hospital care responsibilities, and study site. RESULTS: Post-vaccination GMRs were inversely associated with the number of prior vaccinations, increasing from 2.3 among those with 4 prior vaccinations to 6.2 among HCP with zero prior vaccinations (F[4,567]=9.97, p<.0005). Thirty-two percent of HCP with 1 prior vaccination achieved titers >100 compared to only 11% of HCP with 4 prior vaccinations (adjusted odds ratio=6.8, 95% CI=3.1 - 15.3). CONCLUSION: Our findings point to an exposure-response association between repeated IIV3 vaccination and HI for A(H3N2) and are consistent with recent VE observations. Ultimately, better vaccines and vaccine strategies may be needed in order to optimize immunogenicity and VE for HCP and other repeated vaccinees. |
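The geometric mean fold change ratio (GMR) reported above is the geometric mean, across participants, of each person's post-vaccination/preseason HI titer ratio. A minimal sketch with hypothetical titers (not data from the cohort):

```python
import math

def geometric_mean_ratio(pre_titers, post_titers):
    """Geometric mean of per-participant post/pre titer fold changes,
    computed as the exponential of the mean log fold change."""
    log_ratios = [math.log(post / pre)
                  for pre, post in zip(pre_titers, post_titers)]
    return math.exp(sum(log_ratios) / len(log_ratios))

# Three hypothetical participants with 4-, 2-, and 4-fold rises:
print(geometric_mean_ratio([10, 20, 40], [40, 40, 160]))
```

Averaging on the log scale is standard for titer data, whose fold changes are multiplicative rather than additive.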
Antibody levels and protection after hepatitis B vaccine: Results of a 30-year follow-up study and response to a booster dose
Bruce MG , Bruden D , Hurlburt D , Zanis C , Thompson G , Rea L , Toomey M , Townshend-Bulson L , Rudolph K , Bulkow L , Spradling PR , Baum R , Hennessy T , McMahon BJ . J Infect Dis 2016 214 (1) 16-22 BACKGROUND: The duration of protection in children and adults resulting from hepatitis B vaccination is unknown. In 1981, we immunized a cohort of 1578 Alaska Native adults and children from 15 Alaska communities aged ≥6 months using 3 doses of plasma-derived hepatitis B vaccine. METHODS: Persons were tested for antibody to hepatitis B surface antigen (anti-HBs) levels 30 years after receiving the primary series. Those with levels <10 mIU/mL received 1 booster dose of recombinant hepatitis B vaccine 2-4 weeks later and were then evaluated on the basis of anti-HBs measurements 30 days after the booster. RESULTS: Among 243 persons (56%) who responded to the original primary series but received no subsequent doses during the 30-year period, 125 (51%) had an anti-HBs level ≥10 mIU/mL. Among participants with anti-HBs levels <10 mIU/mL who were available for follow-up, 75 of 85 (88%) responded to a booster dose with an anti-HBs level ≥10 mIU/mL at 30 days. Initial anti-HBs level after the primary series was correlated with higher anti-HBs levels at 30 years. CONCLUSIONS: Based on anti-HBs level ≥10 mIU/mL at 30 years and an 88% booster dose response, we estimate that ≥90% of participants had evidence of protection 30 years later. Booster doses are not needed. |
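The authors' estimate of ≥90% protection follows from simple arithmetic: combine the proportion still seropositive at 30 years with the booster response rate among the remainder (treating a booster response as evidence of persistent immune memory):

```python
# Back-of-envelope check of the >=90% protection estimate:
# 51% still had anti-HBs >= 10 mIU/mL at 30 years; of the remainder,
# 88% mounted a booster response.
seropositive = 0.51
booster_response = 0.88
protected = seropositive + (1 - seropositive) * booster_response
print(round(protected, 3))  # ~0.941, consistent with the stated >=90%
```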
Beyond efficacy: the full public health impact of vaccines
Saadatian-Elahi M , Horstick O , Breiman RF , Gessner BD , Gubler DJ , Louis J , Parashar UD , Tapia R , Picot V , Zinsou JA , Nelson CB . Vaccine 2016 34 (9) 1139-47 There is an active discussion in the public health community on how to assess and incorporate, in addition to safety and measures of protective efficacy, the full public health value of preventive vaccines into the evidence-based decision-making process of vaccine licensure and recommendations for public health use. The conference "Beyond efficacy: the full public health impact of vaccines in addition to efficacy measures in trials" held in Annecy, France (June 22-24, 2015) addressed this issue and provided recommendations on how to better capture the whole public health impact of vaccines. Using key examples, the expert group stressed that we are in the midst of a new paradigm in vaccine evaluation, in which all aspects of the public health value of vaccines beyond efficacy should be evaluated. To yield a wider scope of vaccine benefits, additional measures such as vaccine-preventable disease incidence, overall efficacy, and other outcomes such as under-five mortality or non-etiologically confirmed clinical syndromes should be assessed in addition to traditional efficacy or effectiveness measurements. Dynamic modelling and the use of probe studies should also be considered to provide additional insight into the full public health value of a vaccine. The use of burden reduction and conditional licensure of vaccines based on collection of outcome results should be considered by regulatory agencies. |
Effect of a clinical decision support system on early action on immunological treatment failure in patients with HIV in Kenya: A cluster randomised controlled trial
Oluoch T , Katana A , Kwaro D , Santas X , Langat P , Mwalili S , Muthusi K , Okeyo N , Ojwang JK , Cornet R , Abu-Hanna A , de Keizer N . Lancet HIV 2015 3018 (15) 00242-8 BACKGROUND: A clinical decision support system (CDSS) is a computer program that applies a set of rules to data stored in electronic health records to offer actionable recommendations. We aimed to establish whether a CDSS that supports detection of immunological treatment failure among patients with HIV taking antiretroviral therapy (ART) would improve appropriate and timely action. METHODS: We did this prospective, cluster randomised controlled trial in adults and children (aged ≥18 months) who were eligible for, and receiving, ART at HIV clinics in Siaya County, western Kenya. Health facilities were randomly assigned (1:1), via block randomisation (block size of two) with a computer-generated random number sequence, to use electronic health records either alone (control) or with CDSS (intervention). Facilities were matched by type and by number of patients enrolled in HIV care. The primary outcome measure was the difference between groups in the proportion of patients who experienced immunological treatment failure and had a documented clinical action. We used generalised linear mixed models with random effects to analyse clustered data. This trial is registered with ClinicalTrials.gov, number NCT01634802. FINDINGS: Between Sept 1, 2012, and Jan 31, 2014, 13 clinics, comprising 41 062 patients, were randomly assigned to the control group (n=6) or the intervention group (n=7). Data collection at each site took 12 months. Among patients eligible for ART, 10 358 (99%) of 10 478 patients were receiving ART at control sites and 10 991 (99%) of 11 028 patients were receiving ART at intervention sites. 
Of these patients, 1125 (11%) in the control group and 1342 (12%) in the intervention group had immunological treatment failure, of whom 332 (30%) and 727 (54%), respectively, received appropriate action. The likelihood of clinicians taking appropriate action on treatment failure was higher with CDSS alerts than with no decision support system (adjusted odds ratio 3·18, 95% CI 1·02-9·87). INTERPRETATION: CDSS significantly improved the likelihood of appropriate and timely action on immunological treatment failure. We expect our findings will be generalisable to virological monitoring of patients with HIV receiving ART once countries implement the 2015 WHO recommendation to scale up viral load monitoring. |
Masculinity and bystander attitudes: Moderating effects of masculine gender role stress
Leone RM , Parrott DJ , Swartout KM , Tharp AT . Psychol Violence 2016 6 (1) 82-90 OBJECTIVE: The purpose of the current study was to examine the bystander decision-making process as a mechanism by which men's adherence to various dimensions of traditional masculinity is associated with their confidence to intervene in sexually aggressive events. Further, this study examined the stress men experience from their attempts to adhere to traditional male gender roles as a moderator of this mediational path. METHOD: Participants (n = 252) completed measures of traditional masculinity, decisional balance (i.e., weighing the pros and cons) for intervening, masculine gender role stress, and bystander efficacy. RESULTS: The belief that men must attain social status was associated with more confidence in men's ability to intervene. This effect was mediated by greater perceived positive consequences for intervention among men high, but not low, in masculine gender role stress. The belief that men should be tough and aggressive was associated with greater perceived negative consequences for intervention and less confidence to intervene. The belief that men should not act in stereotypically feminine ways was directly associated with less confidence for intervention. CONCLUSIONS: Findings highlight the importance of examining masculinity from a multidimensional perspective to better understand how adherence to various norms differentially influences bystander behavior. These findings may help to inform bystander intervention programming. |
Global prevalence of past-year violence against children: A systematic review and minimum estimates
Hillis S , Mercy J , Amobi A , Kress H . Pediatrics 2016 137 (3) e20154079 CONTEXT: Evidence confirms associations between childhood violence and major causes of mortality in adulthood. A synthesis of data on past-year prevalence of violence against children will help advance the United Nations' call to end all violence against children. OBJECTIVES: Investigators systematically reviewed population-based surveys on the prevalence of past-year violence against children and synthesized the best available evidence to generate minimum regional and global estimates. DATA SOURCES: We searched Medline, PubMed, Global Health, EMBASE, CINAHL, and the World Wide Web for reports of representative surveys estimating prevalences of violence against children. STUDY SELECTION: Two investigators independently assessed surveys against inclusion criteria and rated those included on indicators of quality. DATA EXTRACTION: Investigators extracted data on past-year prevalences of violent victimization by country, age group, and type (physical, sexual, emotional, or multiple types). We used a triangulation approach that synthesized data to generate minimum regional prevalences, derived from population-weighted averages of the country-specific prevalences. RESULTS: Thirty-eight reports provided quality data for 96 countries on past-year prevalences of violence against children. Base case estimates showed that a minimum of 50% or more of children in Asia, Africa, and Northern America experienced past-year violence, and that globally over half of all children (1 billion children, ages 2-17 years) experienced such violence. LIMITATIONS: Due to variations in timing and types of violence reported, triangulation could only be used to generate minimum prevalence estimates. CONCLUSIONS: Expanded population-based surveillance of violence against children is essential to target prevention and drive the urgent investment in action endorsed in the United Nations 2030 Sustainable Development Agenda. |
Multi-center evaluation of the Xpert Norovirus assay for detection of norovirus GI and GII in fecal specimens
Gonzalez MD , Langley LC , Buchan BW , Faron ML , Maier M , Templeton K , Walker K , Popowitch EB , Miller MB , Rao A , Liebert UG , Ledeboer NA , Vinje J , Burnham CA . J Clin Microbiol 2016 54 (1) 142-7 Norovirus is the most common cause of sporadic gastroenteritis and outbreaks worldwide. The rapid identification of norovirus has important implications for infection prevention measures and may reduce the need for additional diagnostic testing. The Xpert Norovirus assay recently received FDA clearance for the detection and differentiation of norovirus genogroups I and II (GI and GII), which account for the vast majority of infections. In this study, we evaluated the performance of the Xpert Norovirus assay with both fresh, prospectively collected (n = 914) and frozen, archived (n = 489) fecal specimens. A Centers for Disease Control and Prevention (CDC) composite reference method was used as the gold standard for comparison. For both prospective and frozen specimens, the Xpert Norovirus assay showed positive percent agreement (PPA) and negative percent agreement (NPA) values of 98.3% and 98.1% for GI and of 99.4% and 98.2% for GII, respectively. Norovirus prevalence in the prospective specimens (collected from March to May of 2014) was 9.9% (n = 90), with the majority of positives caused by genogroup II (82%, n = 74). The positive predictive value (PPV) of the Xpert Norovirus assay was 75% for GI-positive specimens, whereas it was 86.5% for GII-positive specimens. The negative predictive values (NPV) for GI and GII were 100% and 99.9%, respectively. |
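The gap between the assay's high percent agreement and its more modest PPV reflects prevalence: predictive values follow from Bayes' rule, so even a highly accurate test yields many false positives when infection is uncommon. A small sketch with round illustrative numbers, not the study's composite-reference figures:

```python
def ppv(sens, spec, prev):
    """Positive predictive value from sensitivity, specificity, prevalence."""
    true_pos = sens * prev
    false_pos = (1 - spec) * (1 - prev)
    return true_pos / (true_pos + false_pos)

def npv(sens, spec, prev):
    """Negative predictive value."""
    true_neg = spec * (1 - prev)
    false_neg = (1 - sens) * prev
    return true_neg / (true_neg + false_neg)

# Illustrative: 99% sensitivity and 98% specificity at 10% prevalence
# still gives a PPV well below 99%, while the NPV stays near 1.
print(round(ppv(0.99, 0.98, 0.10), 3))
```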
Development and validation of an improved PCR method using the 23S-5S intergenic spacer for detection of rickettsiae in Dermacentor variabilis ticks and tissue samples from humans and laboratory animals
Kakumanu ML , Ponnusamy L , Sutton HT , Meshnick SR , Nicholson WL , Apperson CS . J Clin Microbiol 2016 54 (4) 972-9 A novel nested PCR assay was developed to detect Rickettsia spp. in ticks and tissue samples from humans and laboratory animals. Primers were designed for the nested run to amplify a variable region of the 23S-5S intergenic spacer (IGS) of Rickettsia spp. The newly designed primers were evaluated using genomic DNA from 11 Rickettsia species belonging to the spotted fever, typhus, and ancestral groups, and compared in parallel to other Rickettsia-specific PCR targets (ompA, gltA, and 17-kDa). The new 23S-5S nested PCR assay amplified all 11 Rickettsia spp., but the assays employing the other PCR targets did not. The novel nested assay was sensitive enough to detect one copy of a cloned 23S-5S IGS fragment from "Candidatus R. amblyommii". Subsequently, the detection efficiency of the 23S-5S nested assay was compared to that of the other three assays using genomic DNA extracted from 40 adult Dermacentor variabilis ticks. The nested 23S-5S assay detected Rickettsia DNA in 45% of the ticks, while the amplification rates of the three other assays ranged from 5% to 20%. The novel PCR assay was validated using clinical samples from humans and laboratory animals known to be infected with pathogenic species of Rickettsia. The nested 23S-5S PCR assay was coupled with reverse line blot hybridization with species-specific probes for high-throughput detection and simultaneous identification of the species of Rickettsia in the ticks. Rickettsia amblyommii, R. montanensis, R. felis, and R. bellii were the species most frequently identified, along with some potentially novel Rickettsia species closely related to R. bellii and R. conorii. |
MMP-9-dependent serum-borne bioactivity caused by multi-walled carbon nanotube exposure induces vascular dysfunction via the CD36 scavenger receptor
Aragon M , Erdely A , Bishop L , Salmen R , Weaver J , Liu J , Hall P , Eye T , Kodali V , Zeidler-Erdely P , Stafflinger JE , Ottens AK , Campen MJ . Toxicol Sci 2016 150 (2) 488-98 Inhalation of multi-walled carbon nanotubes (MWCNT) causes systemic effects including vascular inflammation, endothelial dysfunction, and acute phase protein expression. MWCNTs translocate only minimally beyond the lungs, so their cardiovascular effects may be caused by secondary biomolecular factors generated by MWCNT-pulmonary interactions that spill over into the systemic circulation. Therefore, we hypothesized that induced matrix metalloproteinase-9 (MMP-9) generates factors that, in turn, drive vascular effects through ligand-receptor interactions with the multiligand pattern recognition receptor CD36. To test this, wild-type (WT; C57BL/6) and MMP-9-/- mice were exposed to 10 or 40 μg of MWCNTs via oropharyngeal aspiration, and serum was collected at 4 and 24 h post-exposure. Endothelial cells treated with serum from MWCNT-exposed WT mice exhibited significantly reduced nitric oxide (NO) generation, as measured by electron paramagnetic resonance, an effect that was independent of NO scavenging. Serum from MWCNT-exposed WT mice inhibited acetylcholine-mediated relaxation of aortic rings at both time points. Absence of CD36 on the aortic rings (obtained from CD36-deficient mice) abolished the serum-induced impairment of vasorelaxation. MWCNT exposure induced MMP-9 protein levels in both bronchoalveolar lavage and whole lung lysates. Serum from MMP-9-/- mice exposed to MWCNT did not diminish the magnitude of vasorelaxation in naive WT aortic rings, although a modest right shift of the acetylcholine dose-response curve was observed in both MWCNT dose groups relative to controls. 
In conclusion, pulmonary exposure to MWCNT leads to elevated MMP-9 levels and MMP-9-dependent generation of circulating bioactive factors that promote endothelial dysfunction and decreased NO bioavailability via interaction with vascular CD36. |
Acute infections, cost per infection and turnaround time in three United States hospital laboratories using fourth-generation antigen-antibody human immunodeficiency virus immunoassays
Wesolowski LG , Nasrullah M , Coombs RW , Rosenberg E , Ethridge SF , Hutchinson AB , Dragavon J , Rychert J , Nolte FS , Madory JE , Werner BG . Open Forum Infect Dis 2016 3 (1) ofv188 BACKGROUND: To improve clinical and public health outcomes through early human immunodeficiency virus (HIV) detection, fourth-generation antigen/antibody immunoassay (4IA) and supplemental testing results must be returned rapidly. METHODS: We examined HIV testing data at Harborview Medical Center (HMC), Massachusetts General Hospital (MGH), and the Medical University of South Carolina (MUSC), which used 4IA and supplemental antibody and nucleic acid tests (NATs). At MGH and MUSC, HIV-1 Western blot (WB) and HIV-2 testing were conducted at a reference laboratory. We compared time from specimen collection to laboratory result for established (positive WB) and acute infections (reactive 4IA, negative/indeterminate WB, detectable NAT), and we calculated testing cost per positive-test result. RESULTS: The number of tests conducted ranged from 3731 (MUSC) to 19,774 (MGH); 0.01% (MGH) to 0.05% (HMC) were acute infections. Each laboratory had reactive 4IA, WB-negative, or indeterminate specimens without NAT (ie, potential acute infections). Time to result was 1.5 (HMC) to 5.2 days (MGH) for acute and 1.0 (HMC) to 5.2 days (MGH) for established infections. Costs were $1054 (MGH) to $1521 (MUSC). CONCLUSIONS: Conducting supplemental testing in-house lowered turnaround times, which may be further reduced with rapid HIV-1/HIV-2 differentiation tests. Hospitals may benefit from quantitative NATs not requiring physician orders, so all potential acute infections receive NAT. |
Maternity care practices and breastfeeding among adolescent mothers aged 12-19 years - United States, 2009-2011
Olaiya O , Dee DL , Sharma AJ , Smith RA . MMWR Morb Mortal Wkly Rep 2016 65 (2) 17-22 The American Academy of Pediatrics recommends that infants be breastfed exclusively* for the first 6 months of life, and that mothers continue breastfeeding for at least 1 year (1). However, in 2011, only 19.3% of mothers aged ≤20 years in the United States exclusively breastfed their infants at 3 months, compared with 36.4% of women aged 20-29 years and 45.0% of women aged ≥30 years.† Hospitals play an essential role in providing care that helps mothers establish and continue breastfeeding. The U.S. Surgeon General and numerous health professional organizations recommend providing care aligned with the Baby-Friendly Hospital Initiative (BFHI), including adherence to the Ten Steps to Successful Breastfeeding (Ten Steps), as well as not providing gift packs containing infant formula (2,3). Implementing BFHI-aligned maternity care improves duration of any and exclusive breastfeeding among mothers (4,5); however, studies have not examined associations between BFHI-aligned maternity care and breastfeeding outcomes solely among adolescent mothers (for this report, adolescents refers to persons aged 12-19 years). Therefore, CDC analyzed 2009-2011 Pregnancy Risk Assessment Monitoring System (PRAMS) data and determined that among adolescent mothers who initiated breastfeeding, self-reported prevalence of experiencing any of the nine selected BFHI-aligned maternity care practices included in the PRAMS survey ranged from 29.2% to 95.4%. Among the five practices identified to be significantly associated with breastfeeding outcomes in this study, the more practices a mother experienced, the more likely she was to be breastfeeding (any amount or exclusively) at 4 weeks and 8 weeks postpartum. 
Given the substantial health advantages conferred to mothers and children through breastfeeding, and the particular vulnerability of adolescent mothers to lower breastfeeding rates, it is important for hospitals to provide evidence-based maternity practices related to breastfeeding as part of their routine care to all mothers, including adolescent mothers. |
Increasing prevalence of gastroschisis - 14 States, 1995-2012
Jones AM , Isenburg J , Salemi JL , Arnold KE , Mai CT , Aggarwal D , Arias W , Carrino GE , Ferrell E , Folorunso O , Ibe B , Kirby RS , Krapfl HR , Marengo LK , Mosley BS , Nance AE , Romitti PA , Spadafino J , Stock J , Honein MA . MMWR Morb Mortal Wkly Rep 2016 65 (2) 23-6 Gastroschisis is a serious congenital defect in which the intestines protrude through an opening in the abdominal wall. Gastroschisis requires surgical repair soon after birth and is associated with an increased risk for medical complications and mortality during infancy. Reports from multiple surveillance systems worldwide have documented increasing prevalence of gastroschisis since the 1980s, particularly among younger mothers; however, since publication of a multistate U.S. report that included data through 2005 (1), it is not known whether prevalence has continued to increase. Data on gastroschisis from 14 population-based state surveillance programs were pooled and analyzed to assess the average annual percent change (AAPC) in prevalence and to compare the prevalence during 2006-2012 with that during 1995-2005, stratified by maternal age and race/ethnicity. The pooled data included approximately 29% of U.S. births for the period 1995-2012. During 1995-2012, gastroschisis prevalence increased in every category of maternal age and race/ethnicity, and the AAPC ranged from 3.1% in non-Hispanic white (white) mothers aged <20 years to 7.9% in non-Hispanic black (black) mothers aged <20 years. These corresponded to overall percentage increases during 1995-2012 that ranged from 68% in white mothers aged <20 years to 263% in black mothers aged <20 years. Gastroschisis prevalence increased 30% between the two periods, from 3.6 per 10,000 births during 1995-2005 to 4.9 per 10,000 births during 2006-2012 (prevalence ratio = 1.3, 95% confidence interval [CI]: 1.3-1.4), with the largest increase among black mothers aged <20 years (prevalence ratio = 2.0, 95% CI: 1.6-2.5). 
Public health research is urgently needed to identify factors contributing to this increase. |
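The reported average annual percent changes (AAPCs) can be sanity-checked against the overall percentage increases with a short calculation. This is an illustrative sketch only, not the authors' surveillance methodology: it assumes constant log-linear growth across the 17 one-year intervals from 1995 to 2012, so the AAPC is simply the annualized geometric rate.

```python
def aapc_from_total_change(total_pct_change: float, n_intervals: int) -> float:
    """Annualize a cumulative percent change, assuming constant
    exponential (log-linear) growth across n_intervals years."""
    ratio = 1.0 + total_pct_change / 100.0      # e.g. +263% -> 3.63-fold
    return (ratio ** (1.0 / n_intervals) - 1.0) * 100.0

# 1995-2012 spans 17 one-year intervals.
# Black mothers aged <20 years: +263% overall
print(round(aapc_from_total_change(263, 17), 1))  # 7.9
# White mothers aged <20 years: +68% overall
print(round(aapc_from_total_change(68, 17), 1))   # 3.1
```

Under this simple constant-growth assumption, the annualized rates land on the 3.1% and 7.9% AAPC endpoints reported in the abstract.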
Total Worker Health(R): more implications for the occupational health nurse
Schill AL , Chosewood LC . Workplace Health Saf 2016 64 (1) 4-5 As co-managers of the National Institute for Occupational Safety and Health (NIOSH) Total Worker Health® (TWH) Program, we read the Campbell and Burns article titled “Total Worker Health Implications for the Occupational Health Nurse” published in the July issue with great interest. We are delighted that TWH resonates so strongly with many of our stakeholders, especially occupational health nurses, who, we believe, play a critically important role in the health of working Americans. Therefore, we take this opportunity to more fully describe TWH concepts and expand upon four points to better inform all Journal readers about TWH efforts at NIOSH. | First, TWH is not synonymous with wellness programs. This misunderstanding has been so common that recently (July 2015) the definition of TWH was revised to more clearly distinguish the TWH approach from that of wellness programs that focus primarily on worker health-related behaviors. The revised definition is, | A Total Worker Health® (TWH) approach is defined as policies, programs, and practices that integrate protection from work-related safety and health hazards with promotion of injury and illness prevention efforts to advance worker well-being. | Simply put, the TWH approach integrates workplace interventions that protect workers’ safety and health with activities that advance their overall well-being. The TWH approach always prioritizes a hazard-free work environment that protects the safety and health of all workers. Simultaneously, the approach advocates integration of all organizational policies, programs, and practices that contribute to worker safety, health, and well-being, including those relevant to the control of hazards and exposures, the organization of work, compensation and benefits, work–life management, a health-supporting built environment, and well-being supports. |
The use of personal flotation devices in the Northeast lobster fishing industry: An examination of the decision-making process
Weil R , Pinto K , Lincoln J , Hall-Arber M , Sorensen J . Am J Ind Med 2016 59 (1) 73-80 BACKGROUND: This study explored perspectives of Northeast commercial lobstermen regarding the use of personal flotation devices (PFDs). Researchers sought to identify factors contributing to low PFD use, and motivators that could lead to increased use of PFDs. METHODS: This qualitative research (n = 72) included 25 commercial fishermen who participated in in-depth, semi-structured interviews, and 47 attendees of Lobstermen's meetings who engaged in focus groups. RESULTS: The results showed substantial barriers to PFD use. Fishermen described themselves as being proactive about safety whenever possible, but described a longstanding tradition of not wearing PFDs. Key factors integrally linked with the lack of PFD use were workability, identity/social stigma, and risk diffusion. CONCLUSION: Future safety interventions will need to address significant barriers to PFD use that include issues of comfort and ease of use, as well as social acceptability of PFDs and reorientation of risk perceptions related to falls overboard. |
Hearing difficulty and tinnitus among U.S. workers and non-workers in 2007
Masterson EA , Themann CL , Luckhaupt SE , Li J , Calvert GM . Am J Ind Med 2016 59 (4) 290-300 BACKGROUND: Hearing loss and tinnitus are two potentially debilitating physical conditions affecting many people in the United States. The purpose of this study was to estimate the prevalence of hearing difficulty, tinnitus, and their co-occurrence within U.S. populations. METHODS: Data from the 2007 National Health Interview Survey (NHIS) were examined. Weighted prevalence and adjusted prevalence ratios for self-reported hearing difficulty, tinnitus, and their co-occurrence were estimated and compared by demographic characteristics, among workers with and without occupational noise exposure, and across industries and occupations. RESULTS: Seven percent of U.S. workers never exposed to occupational noise had hearing difficulty, 5% had tinnitus, and 2% had both conditions. However, among workers who had ever been exposed to occupational noise, the corresponding prevalences were 23%, 15%, and 9% (P < 0.0001). CONCLUSIONS: Hearing difficulty and tinnitus are prevalent in the U.S., especially among noise-exposed workers. Improved strategies for hearing conservation, or better implementation of existing ones, are needed. |
Age-dependent muscle adaptation after chronic stretch-shortening contractions in rats
Rader EP , Layner K , Triscuit AM , Chetlin RD , Ensey J , Baker BA . Aging Dis 2016 7 (1) 1-13 Age-related differences in contraction-induced adaptation have been well characterized, especially for young and old rodent models, but much less so at intermediate ages. Therefore, additional research is warranted to determine to what extent alterations in adaptation are due to maturation versus aging per se. The purpose of our study was to evaluate muscles of Fischer 344 × Brown Norway rats of various ages following one month of exposure to stretch-shortening contractions (SSCs). With exposure, muscle mass increased by ~10% for 27- and 30-month-old rats vs. ~20% for 3- and 6-month-old rats (P < 0.05). For 3-month-old rats, maximum isometric force and dynamic peak force increased by 22 ± 8% and 27 ± 10%, respectively (P < 0.05). For 6-month-old rats, these forces were unaltered by exposure and positive work capacity diminished by 27 ± 2% (P = 0.006). By 30 months of age, age-related deficits in maximum isometric force, peak force, negative work, and positive work were apparent, and SSC exposure was ineffective at counteracting such deficits. Recovery from fatigue was also tested, and exposure-induced improvements in fatigue recovery were indicated for 6-month-old rats and, to a lesser extent, for 3-month-old rats, whereas no such effect was observed for older rats. Alterations in fatigue recovery were accompanied by evidence of substantial type IIb to IIx fiber type shifting. These results highlight the exceptional adaptive capacity for strength at a young age, the inclination for adaptation in fatigue recovery at early adulthood, and diminished adaptation for muscle performance in general beginning at late adulthood. Such findings motivate careful investigation to determine appropriate SSC exposures at all stages of life. |
Using dust assessment technology to leverage mine site manager-worker communication and health behavior: A longitudinal case study
Haas EJ , Cecala AB , Hoebbel CL . J Progress Res Soc Sci 2016 3 (1) 154-167 Research continues to investigate barriers to managing occupational health and safety behaviors among the workforce. Recent literature argues that (1) there is a lack of consistent, multilevel communication and application of health and safety practices, and (2) social scientific methods are absent when determining how to manage injury prevention in the workplace. In response, the current study developed and tested a multilevel intervention case study at two industrial mineral mines to help managers and workers communicate about and reduce respirable silica dust exposures at their mine sites. A dust assessment technology, the Helmet-CAM, was used to identify and encourage communication about potential problem areas and tasks on site that contributed to elevated exposures. The intervention involved pre- and post-assessment field visits, four weeks apart, that included multiple forms of data collection from workers and managers. Results revealed that mine management can utilize dust assessment technology as a risk communication tool to prompt and communicate about healthier behaviors with their workforce. Additionally, when workers were debriefed with the Helmet-CAM data through the device software, the dust exposure data helped improve their knowledge and awareness, empowering them to change subtle behaviors that could reduce future elevated exposures to respirable silica dust. This case study demonstrates that incorporating social scientific methods into the application of health and safety management strategies, such as behavioral modification and technology integration, can leverage managers' communication practices with workers, subsequently improving health and safety behaviors. |
Analysis of the current rib support practices and techniques in U.S. coal mines
Mohamed KM , Murphy MM , Lawson HE , Klemetti T . Int J Min Sci Technol 2016 26 (1) 77-87 Design of rib support systems in U.S. coal mines is based primarily on local practices and experience. A better understanding of current rib support practices in U.S. coal mines is crucial for developing a sound engineering rib support design tool. The objective of this paper is to analyze the current practices of rib control in U.S. coal mines. Twenty underground coal mines were studied representing various coal basins, coal seams, geology, loading conditions, and rib control strategies. The key findings are: (1) any rib design guideline or tool should take into account external rib support as well as internal bolting; (2) rib bolts on their own cannot contain rib spall, especially in soft ribs subjected to significant load - external rib control devices such as mesh are required in such cases to contain rib sloughing; (3) the majority of the studied mines follow the overburden depth and entry height thresholds recommended by the Program Information Bulletin 11-29 issued by the Mine Safety and Health Administration; (4) potential rib instability occurred when certain geological features prevailed - these include draw slate and/or bone coal near the rib/roof line, claystone partings, and soft coal bench overlain by rock strata; (5) 47% of the studied rib spall was classified as blocky - this could indicate a high potential of rib hazards; and (6) rib injury rates of the studied mines for the last three years emphasize the need for more rib control management for mines operating at overburden depths between 152.4 m and 304.8 m. |
Distribution of Cryptosporidium species in Tibetan sheep and yaks in Qinghai, China
Li P , Cai J , Cai M , Wu W , Li C , Lei M , Xu H , Feng L , Ma J , Feng Y , Xiao L . Vet Parasitol 2016 215 58-62 Few data are available on the distribution of Cryptosporidium species in Tibetan sheep and yaks, which are free-range animals living in a cold, low-oxygen, and high-ultraviolet-radiation habitat. In this study, 904 fecal specimens were collected from 350 Tibetan sheep and 554 yaks in six counties. Cryptosporidium spp. were detected and differentiated by PCR and sequence analyses. Altogether, 43 (12.3%) Tibetan sheep and 158 (28.5%) yaks were positive for Cryptosporidium spp. In Tibetan sheep, Cryptosporidium xiaoi (39/43, 90.7%) was the dominant species, with the remaining cases (4/43, 9.3%) caused by Cryptosporidium ubiquitum. All C. ubiquitum specimens belonged to the subtype family XIIa. In contrast, Cryptosporidium andersoni (72/158, 45.6%), Cryptosporidium bovis (47/158, 29.7%), Cryptosporidium ryanae cattle type (35/158, 22.2%), C. ryanae buffalo type (2/158, 1.3%), and Cryptosporidium suis-like (2/158, 1.3%) were identified in yaks. Contrary to previous observations, C. andersoni was one of the dominant Cryptosporidium species in yaks in this study. Despite sharing habitats, Tibetan sheep and yaks are evidently infected with different Cryptosporidium species. |
Advancing translation and dissemination research and practice through the Physical Activity Policy Research Network Plus
Pollack KM , Schmid TL , Wilson AL , Schulman E . Environ Behav 2016 48 (1) 266-272 In the United States (U.S.), physical inactivity is the fourth leading cause of death, with an estimated 200,000 deaths annually (Danaei et al., 2009). Physical inactivity across the life span is a well-documented risk factor for leading non-communicable diseases including cardiovascular disease, cancers, obesity, and type 2 diabetes, as well as impaired quality of life (U.S. Burden of Disease Collaborators, 2013; U.S. Department of Health and Human Services, 2008). Policies and environments that promote population-wide increases in physical activity are needed, given that only half of U.S. adults meet the recommended 150 minutes of moderate-intensity physical activity weekly, 75 minutes of vigorous-intensity activity, or an equivalent combination (U.S. Department of Health and Human Services, 2010). In addition, approximately 25% of adults report no leisure-time physical activity (U.S. Department of Health and Human Services, 2010). | Despite the availability of evidence-based interventions targeting the various factors that influence participation in, and opportunities for, physical activity, there is little indication that many of these interventions are being widely disseminated or implemented in the U.S. (King & Sallis, 2009; Owen, Glanz, Sallis, & Kelder, 2006). Eyler, Brownson, and Schmid (2013) recently noted slow progress in the evolution of physical activity interventions from those targeting individual behavior change to ones that focus on multilevel policy and environmental changes. Moreover, the authors noted the persistence of health disparities in physical activity and a need for more work on translation, dissemination, and implementation (TDI) research, specifically to reduce physical activity disparities (Eyler et al., 2013). |
Physical activities of U.S. high school students - 2010 National Youth Physical Activity and Nutrition Survey
Song M , Carroll DD , Lee SM , Fulton JE . J Phys Act Health 2015 12 Suppl 1 S11-7 BACKGROUND: The 2008 Physical Activity Guidelines recommend youth participate in a variety of physical activities; however, few nationally representative studies describe the types and variety of youth activity. This study assessed the most frequently reported types and variety of activities among U.S. high school students, and examined the association between variety and meeting the 2008 Guidelines for aerobic activity (aerobic guideline). METHODS: We analyzed data on 8628 U.S. high school students in grades 9-12 from the 2010 National Youth Physical Activity and Nutrition Survey. Types of physical activity were assessed by identifying which activities each student reported in the past 7 days. Variety was assessed by the total number of different activities each student reported. Percentage (95% CI) of students who reported engaging in each activity was assessed. Logistic regression was used to examine the association between variety and meeting the aerobic guideline. RESULTS: Walking was the most frequently reported activity among U.S. high school students. On average, students reported participating in 6 different activities. Variety was positively associated with meeting the aerobic guideline. CONCLUSIONS: These findings support encouraging youth to participate in many physical activities and may be useful for developing interventions that focus on the most prevalent activities. |
School and sport initiatives led by the Fédération Internationale de Football Association (FIFA): a systematic review
Correa JE , Meneses-Echavez JF , Barengo NC , Tovar G , Ruiz-Castellanos E , Lobelo F , Ramirez-Velez R . Glob Health Promot 2015 22 (3) 67-76 Introduction: Programs initiated by the Fédération Internationale de Football Association (FIFA) disseminate health-care messages and serve as a strategy for preventing sports injuries among children and young people. The objective of this systematic review was to summarize the results of implementing the "FIFA 11 for Health" and "FIFA 11+" programs. Methods: A systematic search of the MEDLINE, EMBASE, and Scopus electronic databases identified studies evaluating the implementation of the "FIFA 11 for Health" and "FIFA 11+" programs over the last 10 years (January 1, 2003 to December 1, 2013). Results: We included 17 studies. Two studies evaluated the implementation of the "FIFA 11 for Health" program and found a significant increase in knowledge of health promotion messages; 15 studies evaluated the effects of the "FIFA 11+" program, reporting a reduction in the risk of sports injuries and improvements in athletic performance. Discussion: The "FIFA 11 for Health" and "FIFA 11+" programs have demonstrated positive health outcomes in school and sport settings. Conclusions: These FIFA programs represent an opportunity to build protective habits and promote healthy lifestyles among children and young people. |
Soy intake modifies the relation between urinary bisphenol A concentrations and pregnancy outcomes among women undergoing assisted reproduction
Chavarro JE , Minguez-Alarcon L , Chiu YH , Gaskins AJ , Souter I , Williams PL , Calafat AM , Hauser R . J Clin Endocrinol Metab 2016 101 (3) jc20153473 CONTEXT: Experimental data in rodents suggest that the adverse reproductive health effects of bisphenol A (BPA) can be modified by intake of soy phytoestrogens. Whether the same is true in humans is not known. OBJECTIVE: The purpose of this study was to evaluate whether soy consumption modifies the relation between urinary BPA levels and infertility treatment outcomes among women undergoing assisted reproduction. SETTING: The study was conducted in a fertility center in a teaching hospital. DESIGN: We evaluated 239 women enrolled between 2007 and 2012 in the Environment and Reproductive Health (EARTH) Study, a prospective cohort study, who underwent 347 in vitro fertilization (IVF) cycles. Participants completed a baseline questionnaire and provided up to 2 urine samples in each treatment cycle before oocyte retrieval. IVF outcomes were abstracted from electronic medical records. We used generalized linear mixed models with interaction terms to evaluate whether the association between urinary BPA concentrations and IVF outcomes was modified by soy intake. MAIN OUTCOME MEASURE: Live birth rates per initiated treatment cycle were measured. RESULTS: Soy food consumption modified the association of urinary BPA concentration with live birth rates (P for interaction = .01). Among women who did not consume soy foods, the adjusted live birth rates per initiated cycle in increasing quartiles of cycle-specific urinary BPA concentrations were 54%, 35%, 31%, and 17% (P for trend = .03). The corresponding live birth rates among women reporting pretreatment consumption of soy foods were 38%, 42%, 47%, and 49% (P for trend = .35). 
A similar pattern was found for implantation (P for interaction = .02) and clinical pregnancy rates (P for interaction = .03) per initiated cycle, where urinary BPA was inversely related to these outcomes among women not consuming soy foods but unrelated to them among soy consumers. CONCLUSION: Soy food intake may protect against the adverse reproductive effects of BPA. As these findings represent the first report suggesting a potential interaction between soy and BPA in humans, they should be further evaluated in other populations. |
Prevalence of reproductive tract infections and the predictive value of girls' symptom-based reporting: findings from a cross-sectional survey in rural western Kenya
Kerubo E , Laserson KF , Otecko N , Odhiambo C , Mason L , Nyothach E , Oruko KO , Bauman A , Vulule J , Zeh C , Phillips-Howard PA . Sex Transm Infect 2016 92 (4) 251-6 OBJECTIVES: Reproductive tract infections (RTIs), including sexually acquired infections, among adolescent girls are a public health concern, but few studies have measured prevalence in low- and middle-income countries. The objective of this study was to examine prevalence among rural schoolgirls in Kenya against their reported symptoms. METHODS: In 2013, a survey was conducted in 542 adolescent schoolgirls aged 14-17 years who were enrolled in a menstrual feasibility study. Vaginal self-swabbing was conducted after girls were interviewed face-to-face by trained nurses on symptoms. The prevalence of girls with symptoms and laboratory-confirmed infections, and the sensitivity, specificity, positive and negative predictive values of symptoms compared with laboratory results, were calculated. RESULTS: Of 515 girls agreeing to self-swab, 510 answered symptom questions. A quarter (24%) reported one or more symptoms, most commonly vaginal discharge (11%), pain (9%) or itching (4%). Laboratory tests confirmed that 28% of girls had one or more RTIs. Prevalence rose with age; among girls aged 16-17 years, 33% had infections. Bacterial vaginosis was the most common (18%), followed by Candida albicans (9%), Chlamydia trachomatis (3%), Trichomonas vaginalis (3%) and Neisseria gonorrhoeae (1%). Reported symptoms had a low sensitivity and positive predictive value. Three-quarters of girls with bacterial vaginosis or C. albicans, and 50% with T. vaginalis, were asymptomatic. CONCLUSIONS: There is a high prevalence of RTIs among adolescent schoolgirls in rural Kenya. Public health efforts are required to identify and treat infections among girls to reduce longer-term sequelae, but the poor reliability of symptom reporting minimises the utility of symptom-based diagnosis in this population. TRIAL REGISTRATION NUMBER: ISRCTN17486946. |
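The screening metrics compared in this study follow directly from a 2x2 table of reported symptoms versus laboratory results. A minimal sketch with hypothetical counts (not the study's data) shows why a mostly asymptomatic infection yields low sensitivity even when specificity looks reasonable:

```python
def screening_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity, specificity, PPV, and NPV from 2x2 counts:
    tp = symptomatic & infected, fp = symptomatic & uninfected,
    fn = asymptomatic & infected, tn = asymptomatic & uninfected."""
    return {
        "sensitivity": tp / (tp + fn),  # infected girls who reported symptoms
        "specificity": tn / (tn + fp),  # uninfected girls with no symptoms
        "ppv": tp / (tp + fp),          # symptomatic girls truly infected
        "npv": tn / (tn + fn),          # asymptomatic girls truly uninfected
    }

# Hypothetical counts: if three-quarters of infections are asymptomatic
# (as reported for bacterial vaginosis), sensitivity is only 25%.
m = screening_metrics(tp=25, fp=75, fn=75, tn=325)
print(m["sensitivity"])  # 0.25
```

Because asymptomatic infections land in the false-negative cell, symptom-based diagnosis misses them by construction, which is the limitation the authors highlight.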
Excess mortality due to depression and anxiety in the United States: results from a nationally representative survey
Pratt LA , Druss BG , Manderscheid RW , Walker ER . Gen Hosp Psychiatry 2015 39 39-45 OBJECTIVES: We compared the mortality of persons with and without anxiety and depression in a nationally representative survey and examined the role of socioeconomic factors, chronic diseases and health behaviors in explaining excess mortality. METHODS: The 1999 National Health Interview Survey was linked with mortality data through 2011. We calculated the hazard ratio (HR) for mortality by presence or absence of anxiety/depression and evaluated potential mediators. We calculated the population attributable risk of mortality for anxiety/depression. RESULTS: Persons with anxiety/depression died 7.9 years earlier than other persons. At a population level, 3.5% of deaths were attributable to anxiety/depression. Adjusting for demographic factors, anxiety/depression was associated with an elevated risk of mortality [HR=1.61, 95% confidence interval (CI)=1.40, 1.84]. Chronic diseases and health behaviors explained much of the elevated risk. Adjusting for demographic factors, people with past-year contact with a mental health professional did not demonstrate excess mortality associated with anxiety/depression while those without contact did. CONCLUSIONS: Anxiety/depression presents a mortality burden at both individual and population levels. Our findings are consistent with targeting health behaviors and physical illnesses as strategies for reducing this excess mortality among people with anxiety/depression. |
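A population attributable risk of this kind is conventionally computed with Levin's formula, PAF = p(RR - 1) / (1 + p(RR - 1)), treating the hazard ratio as the relative risk. A sketch follows; the 6% exposure prevalence is chosen purely for illustration, since the survey's actual anxiety/depression prevalence is not given in the abstract:

```python
def paf(prevalence: float, relative_risk: float) -> float:
    """Levin's population attributable fraction: the share of deaths
    that would be averted if the exposure were eliminated."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# With HR = 1.61 (from the abstract) and an assumed (hypothetical)
# anxiety/depression prevalence of 6%:
print(round(paf(0.06, 1.61), 3))  # 0.035, i.e. ~3.5% of deaths
```

Under that assumed prevalence, the formula reproduces a figure of the same magnitude as the 3.5% reported; the point of the sketch is the formula, not the input.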
Relationship between nonmedical prescription-opioid use and heroin use
Compton WM , Jones CM , Baldwin GT . N Engl J Med 2016 374 (2) 154-63 The nonmedical use of prescription opioids is a major public health issue in the United States, both because of the overall high prevalence and because of marked increases in associated morbidity and mortality.1 In 2014, a total of 10.3 million persons reported using prescription opioids nonmedically (i.e., using medications that were not prescribed for them or were taken only for the experience or feeling that they caused).2 Emergency department visits involving misuse or abuse of prescription opioids increased 153% between 2004 and 2011, and admissions to substance-abuse treatment programs linked to prescription opioids more than quadrupled between 2002 and 2012.3,4 Most troubling, between 2000 and 2014 the rates of death from prescription-opioid overdose nearly quadrupled (from 1.5 to 5.9 deaths per 100,000 persons) (Figure 1). | The pattern of nonmedical use of prescription opioids varies, from infrequent use once or twice per year to daily or compulsive heavy use and addiction. A key underlying characteristic of the epidemic is the association between the increasing rate of opioid prescribing and increasing opioid-related morbidity and mortality.6-9 Pain has also been identified as a poorly addressed clinical and public health problem for which treatment with prescription opioids may play an important role.10 Taken together, these trends suggest the need for balanced prevention responses that aim to reduce the rates of nonmedical use and overdose while maintaining access to prescription opioids when indicated. |
Smoke-free public policies and voluntary policies in personal settings in Tbilisi, Georgia: a qualitative study
Berg CJ , Smith SA , Bascombe TM , Maglakelidze N , Starua L , Topuridze M . Int J Environ Res Public Health 2016 13 (2) 156 Georgia has limited tobacco control policies, particularly in the area of smoke-free public policies, which may influence the adoption of smoke-free home rules. We qualitatively examined knowledge about and reactions to public and personal smoke-free policies among Tbilisi residents. In Spring 2014, we conducted six focus groups among 47 total participants: two among male smokers, one among male nonsmokers, two among female smokers, and one among female nonsmokers. Our sample was 48.9% male and 70.2% past 30-day smokers. Most believed that secondhand smoke (SHS) was dangerous, with particular concern regarding the impact of SHS on children and pregnant women. Many had misconceptions about how to protect others from SHS and the effectiveness of some approaches. Many indicated that they had some type of home rules, but few reported a complete ban on smoking in the home. Even when some restrictions were in place, they rarely were effective or enforced. Common concerns about the partial smoke-free public policy in Georgia included its economic impact, perceived discrimination against smokers, and the policy being against the Georgian culture. These concerns were heightened when participants were asked about the possible implementation of a complete smoke-free policy. Educational programs are needed to promote smoke-free policies in Georgia. |
Cigarette smoking among working women of reproductive age-United States, 2009-2013
Mazurek JM , England LJ . Nicotine Tob Res 2016 18 (5) 894-9 BACKGROUND: Employers play a vital role in promoting and supporting tobacco use cessation among tobacco-using workers. Cigarette smoking during pregnancy is a preventable cause of complications in pregnancy and adverse infant health outcomes. PURPOSE: To estimate cigarette smoking prevalence and attempts to quit among working women of reproductive age in different industries and occupations using a nationally representative survey. METHODS: The 2009-2013 National Health Interview Survey data for women of reproductive age (18-49 years) who were working in the week prior to the interview (n = 30855) were analyzed. Data were adjusted for nonresponse and weighted to produce nationally representative estimates. RESULTS: During 2009-2013, among working women of reproductive age, an estimated 17.3% (95% confidence interval [CI]: 16.7-17.8) and 12.9% (95% CI: 12.4-13.4) were current and former cigarette smokers, respectively. Of women who smoke daily, 44.5% (95% CI: 42.5-46.5) had made a quit attempt for more than 1 day in the year before the interview. Cigarette smoking prevalence was highest among women working in the construction industry (29.2%; 95% CI: 22.8-35.7) and in construction and extraction occupations (34.6%; 95% CI: 23.4-45.9). Among working women who were pregnant at the time of the interview, 6.8% (95% CI: 4.4-9.2) and 20.4% (95% CI: 16.9-24.0) were current and former cigarette smokers, respectively. CONCLUSIONS: Cigarette smoking prevalence varies by industry and occupation. Intensifying tobacco control efforts in high prevalence industries and occupations could result in higher cessation rates and improvements in health among women of reproductive age. IMPLICATIONS: This study identified discrepancies in cigarette smoking among women of reproductive age across industries and occupations. 
In the absence of smoke-free local and state laws, employer-established smoke-free policies and workplace cessation programs are important for achieving reduction of tobacco use among women and for protecting other workers' health. Results in this report may assist in developing educational campaigns targeting women in industries and occupations with high prevalence of cigarette smoking and low percentage of ever-smokers who had quit. |
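The abstract's prevalence estimates ("data were adjusted for nonresponse and weighted to produce nationally representative estimates") are design-weighted proportions. A minimal sketch of that core calculation follows; the flags and weights are invented for illustration, and a real NHIS analysis would additionally use its complex-survey variance estimators to obtain the reported confidence intervals.

```python
# Hedged sketch of a design-weighted prevalence estimate, the kind of
# calculation behind the abstract's 17.3% current-smoking figure.

def weighted_prevalence(smoker_flags, weights):
    """Weighted share of respondents whose flag is truthy."""
    total = sum(weights)
    smokers = sum(w for flag, w in zip(smoker_flags, weights) if flag)
    return smokers / total

# Hypothetical respondents: 1 = current smoker, paired with
# invented survey weights (persons represented by each respondent).
flags = [1, 0, 0, 1, 0]
weights = [1200, 800, 1500, 600, 900]
print(f"{weighted_prevalence(flags, weights):.1%}")
```

Note that the unweighted prevalence here would be 2/5 = 40%, while the weighted estimate is 36%; the gap is exactly what the nonresponse adjustment and weighting are meant to correct for.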
Epidemiological observations on cryptosporidiosis in diarrheic goat kids in Greece
Giadinis ND , Papadopoulos E , Lafi SQ , Papanikolopoulou V , Karanikola S , Diakou A , Vergidis V , Xiao L , Ioannidou E , Karatzias H . Vet Med Int 2015 2015 764193 This study aimed at investigating the occurrence of Cryptosporidium spp. in diarrheic goat kids in Greece and the risk factors associated with cryptosporidiosis. Altogether, 292 diarrheic 4-15-day-old goat kids from 54 dairy goat herds of Northern Greece were examined. Oocysts of Cryptosporidium spp. were detected in 223 of 292 (76.4%) goat kids and the intensity of infection was scored as "high" in 142 samples, "moderate" in 45 samples, and "low" in 36 samples. Larger herds (>200 animals) had higher infection rates than smaller ones, although this difference was not statistically significant. Significantly higher infection rates were observed in herds during late kidding season (1 January to 30 April) compared to the early one (1 September to 31 December). These results suggest that cryptosporidiosis is very common in diarrheic goat kids in Greece, especially in large herds during the late parturition season. |
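The abstract contrasts a non-significant herd-size difference with a significantly higher infection rate in the late kidding season. As a sketch of the kind of comparison involved, a two-proportion z-test is shown below; the per-season counts are hypothetical (chosen only to sum to the study's 223/292 overall total), since the abstract does not report season-level sample sizes, and the authors' actual test may differ.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test using the pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# Hypothetical counts: 130/150 kids infected in the late kidding season
# versus 93/142 in the early season (totals match the study's 223/292).
z, p = two_proportion_z(130, 150, 93, 142)
print(round(z, 2), round(p, 4))
```

With these invented counts the gap is highly significant, consistent in direction with the abstract's finding, but the numbers themselves carry no evidential weight.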
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Disease Reservoirs and Vectors
- Drug Safety
- Health Economics
- Immunity and Immunization
- Informatics
- Injury and Violence
- Laboratory Sciences
- Maternal and Child Health
- Occupational Safety and Health
- Occupational Safety and Health - Mining
- Parasitic Diseases
- Physical Activity
- Reproductive Health
- Social and Behavioral Sciences
- Substance Use and Abuse
- Veterinary Medicine
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed: Feb 1, 2024
- Page last updated: Sep 03, 2024
- Content source:
- Powered by CDC PHGKB Infrastructure