Trends in the prevalence of ketoacidosis at diabetes diagnosis: the SEARCH for Diabetes in Youth Study
Dabelea D , Rewers A , Stafford JM , Standiford DA , Lawrence JM , Saydah S , Imperatore G , D'Agostino RB Jr , Mayer-Davis EJ , Pihoker C . Pediatrics 2014 133 (4) e938-45 OBJECTIVE: To estimate temporal changes in the prevalence of diabetic ketoacidosis (DKA) at diagnosis of type 1 or type 2 diabetes in youth and to explore factors associated with its occurrence. METHODS: Five centers identified incident cases of diabetes among youth aged 0 to 19 years starting in 2002. DKA presence was defined as a bicarbonate level <15 mmol/L and/or a pH <7.25 (venous) or <7.30 (arterial or capillary) or mention of DKA in the medical records. We assessed trends in the prevalence of DKA over 3 time periods (2002-2003, 2004-2005, and 2008-2010). Logistic regression was used to determine factors associated with DKA. RESULTS: In youth with type 1 diabetes (n = 5615), the prevalence of DKA was high and stable over time (30.2% in 2002-2003, 29.1% in 2004-2005, and 31.1% in 2008-2010; P for trend = .42). Higher prevalence was associated with younger age at diagnosis (P < .0001), minority race/ethnicity (P = .019), income (P = .019), and lack of private health insurance (P = .008). Among youth with type 2 diabetes (n = 1425), DKA prevalence decreased from 11.7% in 2002-2003 to 5.7% in 2008-2010 (P for trend = .005). Higher prevalence was associated with younger age at diagnosis (P = .001), minority race/ethnicity (P = .013), and male gender (P = .001). CONCLUSIONS: The frequency of DKA in youth with type 1 diabetes, although stable, remains high, indicating a persistent need for increased awareness of signs and symptoms of diabetes and better access to health care. In youth with type 2 diabetes, DKA at onset is less common and is decreasing over time. |
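The study's three-part DKA case definition translates directly into a screening rule; a minimal sketch (function and parameter names are illustrative, not from the paper):

```python
def meets_dka_criteria(bicarbonate_mmol_l=None, ph=None, ph_site="venous",
                       dka_in_chart=False):
    """SEARCH DKA-at-diagnosis definition: bicarbonate <15 mmol/L,
    and/or pH <7.25 (venous) or <7.30 (arterial or capillary),
    or mention of DKA in the medical records."""
    if dka_in_chart:
        return True
    if bicarbonate_mmol_l is not None and bicarbonate_mmol_l < 15:
        return True
    if ph is not None:
        limit = 7.25 if ph_site == "venous" else 7.30
        if ph < limit:
            return True
    return False
```

Note the site-specific pH cutoff: a pH of 7.28 counts as DKA for an arterial or capillary sample but not for a venous one.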
Protein nanoparticles as drug delivery carriers for cancer therapy
Lohcharoenkal W , Wang L , Chen YC , Rojanasakul Y . Biomed Res Int 2014 2014 180549 Nanoparticles have increasingly been used for a variety of applications, most notably for the delivery of therapeutic and diagnostic agents. A large number of nanoparticle drug delivery systems have been developed for cancer treatment and various materials have been explored as drug delivery agents to improve the therapeutic efficacy and safety of anticancer drugs. Natural biomolecules such as proteins are an attractive alternative to synthetic polymers which are commonly used in drug formulations because of their safety. In general, protein nanoparticles offer a number of advantages including biocompatibility and biodegradability. They can be prepared under mild conditions without the use of toxic chemicals or organic solvents. Moreover, due to their defined primary structure, protein-based nanoparticles offer various possibilities for surface modifications including covalent attachment of drugs and targeting ligands. In this paper, we review the most significant advancements in protein nanoparticle technology and their use in the drug delivery arena. We then examine the various sources of protein materials that have been used successfully for the construction of protein nanoparticles as well as their methods of preparation. Finally, we discuss the applications of protein nanoparticles in cancer therapy. |
Implementing survivorship care plans for colon cancer survivors
Mayer DK , Gerstel A , Walton AL , Triglianos T , Sadiq TE , Hawkins NA , Davies JM . Oncol Nurs Forum 2014 41 (3) 266-73 PURPOSE/OBJECTIVES: To evaluate the feasibility, usability, and satisfaction of a survivorship care plan (SCP) and identify the optimum time for its delivery during the first 12 months after diagnosis. DESIGN: Prospective, descriptive, single-arm study. SETTING: A National Cancer Institute-designated cancer center in the southeastern United States. SAMPLE: 28 nonmetastatic colon cancer survivors within the first year of diagnosis and their primary care physicians (PCPs). METHODS: Regular screening identified potential participants who were followed until treatment ended. An oncology certified nurse developed the JourneyForward SCP, which then was delivered to the patient by the oncology nurse practitioner (NP) during a routine follow-up visit and mailed to the PCP. MAIN RESEARCH VARIABLES: Time to complete, time to deliver, usability, and satisfaction with the SCP. FINDINGS: During one year, 75 patients were screened for eligibility, 34 SCPs were delivered, and 28 survivors and 15 PCPs participated in the study. It took an average of 49 minutes to complete a surgery SCP and 90 minutes to complete a surgery plus chemotherapy SCP. Most survivors identified that before treatment ended or within the first three months was the preferred time to receive an SCP. CONCLUSIONS: The SCPs were well received by the survivors and their PCPs, but were too time and labor intensive to track and complete. IMPLICATIONS FOR NURSING: More work needs to be done to streamline processes that identify eligible patients and to develop and implement SCPs. Measuring outcomes will be needed to demonstrate whether SCPs are useful or not. |
Association of overweight and obesity with the use of self and home-based infusion therapy among haemophilic men
Ullman M , Zhang QC , Brown D , Grant A , Soucie JM . Haemophilia 2014 20 (3) 340-8 An elevated body mass index (BMI) may make venipuncture more difficult, potentially impacting the use of home infusion (HI) and self-infusion (SI). We sought to determine whether above-normal BMI is associated with decreased use of HI treatment and SI of clotting factor concentrate among haemophilic persons. We analysed data from 10 814 male patients with haemophilia A and B (45% with severe disease) aged 6-79 years enrolled in the Centers for Disease Control and Prevention Universal Data Collection surveillance project between 1998 and 2008. Associations between the use of HI and SI and BMI were evaluated using logistic regression. Fifty per cent of haemophilic men were overweight or obese, similar to rates reported among the general US population by the 2007-2008 National Health and Nutrition Examination Survey [Flegal, KM et al., JAMA 2010;303:235-241]. Twenty per cent of children and 22% of teens were obese, as were 28% of adults [Ogden, CL et al., JAMA 2010;303:235, 242]. Overall, 70% of the study sample used HI; 44% of those who used HI also used SI. Overweight and obese men were each less likely to use HI than those of normal weight [odds ratio (OR) 0.8; 95% confidence interval (CI) 0.7-1.0 and OR 0.7; 95% CI 0.6-0.8 respectively]. Obese teens and adult men were also less likely to practice SI than teens and adults of normal weight (OR 0.8; 95% CI 0.7-0.9 for each). We conclude that overweight and obese haemophilic men are less likely to use HI and obese men are less likely to use SI than their normal-weight counterparts. |
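The overweight/obese categories compared here follow the standard adult BMI cutoffs (children and teens are instead classified by age- and sex-specific percentiles, which this sketch does not implement):

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def adult_bmi_category(bmi_value):
    """Standard adult cutoffs: <18.5 underweight, 18.5-24.9 normal,
    25-29.9 overweight, >=30 obese."""
    if bmi_value < 18.5:
        return "underweight"
    if bmi_value < 25:
        return "normal"
    if bmi_value < 30:
        return "overweight"
    return "obese"
```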
Are we there yet? The smallpox research agenda using variola virus.
Damon IK , Damaso CR , McFadden G . PLoS Pathog 2014 10 (5) e1004108 Despite significant advances, there is more work to be done before the international community can be confident that it possesses sufficient protection against any future smallpox threats. The current World Health Organization (WHO)-approved research agenda for smallpox has been tightly focused by the interpretation that research “essential for public health” equates solely to applied research related directly to the development of new antiviral drugs, safer vaccines, and better diagnostics. Despite considerable advances in this direction, we argue that the research agenda with live variola virus is not yet finished and that significant gaps still remain. | Variola virus is unique amongst the orthopoxviruses in that it is known to be an exclusively human pathogen. The viral and host factors responsible for this human-specific tropism remain essentially unknown to this day, although the current genomic information across orthopoxviruses makes hypothesis-driven experimental design using functional genomic approaches more feasible. Indeed, greater exploitation of current technologies may lead to additional therapeutic or diagnostic products to better respond to any future emergency situation resulting from a smallpox appearance. |
Genetic diversity of rotavirus genome segment 6 (encoding VP6) in Pretoria, South Africa.
Nyaga MM , Esona MD , Jere KC , Peenze I , Seheri ML , Mphahlele MJ . Springerplus 2014 3 179 BACKGROUND: Rotavirus viral protein 6 (VP6), encoded by genome segment (GS) 6, is the primary target for rotavirus diagnosis by serological and some molecular techniques. Selected full-length nucleotide sequences of GS 6 of rotavirus strains from South Africa were sequenced and analysed to determine genetic diversity and variation within the circulating rotaviruses. FINDINGS: The VP6 amplicons were sequenced on the Sanger ABI 3130xl platform. Phylogenetic and pairwise analyses revealed that the VP6 genes of the study strains belonged to two different VP6 [I] genotypes: five sequences were assigned to genotype I1 and seven to genotype I2. Comparison of the group-specific antigenic regions of the South African strains with those of the reference strains showed that the South African VP6 sequences belonging to genotype I2 were highly conserved, with only two amino acid changes, at positions 239 (T to N) and 261 (I to V). By contrast, the South African VP6 sequences belonging to genotype I1 showed several amino acid variations, mostly within antigenic region III. CONCLUSIONS: Rotavirus strains of genotypes I1 and I2 predominate in South African communities, and the latter genotype appears to be more conserved within the antigenic regions. The genetic variation observed within GS 6 of the rotaviruses analysed in the current study is unlikely to impact negatively on the performance of current VP6-based detection methods. Nevertheless, investigators should continually consider this diversity and adapt primer design for the detection and characterization of the VP6 gene accordingly. |
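Substitutions such as the T-to-N change at position 239 are found by comparing aligned amino acid sequences position by position; a minimal sketch (the sequences below are toy examples, not VP6 data):

```python
def aa_differences(ref_seq, query_seq):
    """Return (1-based position, 'RefPosQuery') tuples where two
    aligned, equal-length amino acid sequences differ."""
    return [(i + 1, f"{a}{i + 1}{b}")
            for i, (a, b) in enumerate(zip(ref_seq, query_seq))
            if a != b]
```

For example, comparing "MKT" against "MRT" reports the single substitution K2R.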
SHIV susceptibility changes during the menstrual cycle of pigtail macaques
Kersh EN , Henning T , Vishwanathan SA , Morris M , Butler K , Adams DR , Guenthner P , Srinivasan P , Smith J , Radzio J , Garcia-Lerma JG , Dobard C , Heneine W , McNicholl J . J Med Primatol 2014 43 (5) 310-6 BACKGROUND: Hormonal changes during menstrual cycling may affect susceptibility to HIV. METHODS: We determined the simian human immunodeficiency virus (SHIV) acquisition time point in 43 cycling pigtail macaques infected by repeated vaginal virus exposures initiated randomly in the cycle. RESULTS: SHIV infection was first detected in the follicular phase in 38 macaques (88%), and in the luteal phase in five macaques (12%), indicating a statistically significant timing difference. Assuming a 7-day eclipse phase, most infections occurred during or following a high-progesterone period associated with menstruation, vaginal epithelium thinning, and suppressed mucosal immunity. CONCLUSIONS: This raises questions whether other high-progesterone conditions (pregnancy, hormonal contraception) similarly affect HIV risk. |
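The 38-versus-5 split of first detections can be checked with an exact binomial test; a sketch assuming, purely for illustration, a null of equal time at risk in each phase (p0 = 0.5; the study's actual analysis may have weighted by phase length):

```python
from math import comb

def binom_two_sided_p(k, n, p0):
    """Exact two-sided binomial p-value: sum the probabilities of all
    outcomes no more likely than the observed count k."""
    def pmf(x):
        return comb(n, x) * p0 ** x * (1 - p0) ** (n - x)
    p_obs = pmf(k)
    return sum(pmf(x) for x in range(n + 1) if pmf(x) <= p_obs + 1e-12)

# 38 of 43 first detections in the follicular phase:
p_value = binom_two_sided_p(38, 43, 0.5)
```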
Modeling risks of infection with transient maternal antibodies and waning active immunity: application to Bordetella pertussis in Sweden
Feng Z , Glasser JW , Hill AN , Franko MA , Carlsson RM , Hallander H , Tull P , Olin P . J Theor Biol 2014 356 123-32 Serological surveys provide reliable information from which to calculate forces (instantaneous rates) of infection, but waning immunity and clinical consequences that depend on residual immunity complicate interpretation of results. We devised a means of calculating these rates that accounts for passively acquired maternal antibodies that decay or active immunity that wanes, permitting re-infection. We applied our method to pertussis (whooping cough) in Sweden, where vaccination was discontinued from 1979 to 1995. A national cross-sectional serosurvey of antibodies to pertussis toxin, which peak soon after infection and then decay, was conducted shortly after vaccination resumed. Together with age-specific contact rates in Finland, contemporary forces of infection enable us to evaluate the recent assertion that the probability of infection upon contact is age-independent. We find elevated probabilities among children, adolescents and young adults, whose contacts may be more intimate than others. Products of contact rates and probabilities of infection permit transmission modeling and estimation of the intrinsic reproduction number. In contrast to another recent estimate, ours approximates the ratio of life expectancy and age at first infection. Our framework is sufficiently general to accommodate more realistic sojourn distributions and additional lifetime infections. |
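The closing comparison invokes the classic endemic-equilibrium approximation, in which the intrinsic reproduction number is roughly the ratio of life expectancy L to mean age at first infection A; a sketch with purely illustrative numbers (not estimates from the paper):

```python
# Endemic-equilibrium approximation: R0 ~ L / A, where L is life
# expectancy and A is the mean age at first infection.
def r0_from_age_at_first_infection(life_expectancy, mean_age_first_infection):
    return life_expectancy / mean_age_first_infection

# Illustrative inputs only:
r0 = r0_from_age_at_first_infection(80.0, 5.0)  # 16.0
```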
Monitoring prevention of mother-to-child transmission in Botswana
Legwaila K , Motswere-Chirwa C , Matambo S , Kolobe T , Jimbo W , Keapoletswe K , Letsholathebe V , Lu L . Afr J Midwifery Womens Health 2014 8 (2) 73-75 BACKGROUND: In Botswana, the prevention of mother-to-child transmission (PMTCT) programme has succeeded in reducing rates of transmission of HIV from mother to child since the start of the national antiretroviral (ARV) programme in 2002. METHODS: Data on PMTCT interventions for women who delivered at Nyangabgwe Referral Hospital (NRH), the second largest hospital in Botswana, from 2003 to 2012 were collected from maternity registers. RESULTS: Of 46,354 women, 33% were HIV-positive, 58% were HIV-negative, and 9% were not tested. The percentage of women with a known HIV status increased from 50% in 2003 to 97% in 2012. PMTCT uptake for women on any ARV increased from 61% in 2003 to 86% in 2012. Infants given azidothymidine (AZT) and nevirapine prophylaxis increased from 61% to 85%. CONCLUSIONS: Review of maternity registers demonstrated improvement of multiple PMTCT interventions at NRH. This is a useful approach for monitoring programme quality and guiding strategic planning. |
Outbreak of measles among persons with prior evidence of immunity, New York City, 2011
Rosen JB , Rota JS , Hickman CJ , Sowers SB , Mercader S , Rota PA , Bellini WJ , Huang AJ , Doll MK , Zucker JR , Zimmerman CM . Clin Infect Dis 2014 58 (9) 1205-10 BACKGROUND: Measles was eliminated in the United States through high vaccination coverage and a public health system able to rapidly respond to measles. Measles may occur among vaccinated individuals, but secondary transmission from such individuals has not been documented. METHODS: Suspected patients and contacts exposed during a measles outbreak in New York City in 2011 were investigated. Medical histories and immunization records were obtained. Cases were confirmed by detection of measles-specific immunoglobulin M and/or RNA. Tests for measles immunoglobulin G (IgG), IgG avidity, measurement of measles neutralizing antibody titers, and genotyping were performed to characterize the cases. RESULTS: The index patient had 2 doses of measles-containing vaccine; of 88 contacts, 4 secondary patients were confirmed who had either 2 doses of measles-containing vaccine or a past positive measles IgG antibody. All patients had laboratory confirmation of measles infection, clinical symptoms consistent with measles, and high-avidity IgG antibody characteristic of a secondary immune response. Neutralizing antibody titers of secondary patients reached >80 000 mIU/mL 3-4 days after rash onset and that of the index was <500 mIU/mL 9 days after rash onset. No additional cases of measles occurred among 231 contacts of secondary patients. CONCLUSIONS: This is the first report of measles transmission from a twice-vaccinated individual with documented secondary vaccine failure. The clinical presentation and laboratory data of the index patient were typical of measles in a naive individual. Secondary patients had robust anamnestic antibody responses. No tertiary cases occurred despite numerous contacts. 
This outbreak underscores the need for thorough epidemiologic and laboratory investigation of suspected cases of measles regardless of vaccination status. |
Persistent racial/ethnic disparities in AIDS diagnosis rates among people who inject drugs in U.S. metropolitan areas, 1993-2007
Pouget ER , West BS , Tempalski B , Cooper HL , Hall HI , Hu X , Friedman SR . Public Health Rep 2014 129 (3) 267-279 OBJECTIVES: We estimated race/ethnicity-specific incident AIDS diagnosis rates (IARs) among people who inject drugs (PWID) in U.S. metropolitan statistical areas (MSAs) over time to assess the change in disparities after highly active antiretroviral therapy (HAART) dissemination. METHODS: We compared IARs and 95% confidence intervals (CIs) for black/African American and Hispanic/Latino PWID with those of white PWID in 93 of the most populous MSAs. We selected two three-year periods from the years immediately preceding HAART (1993-1995) and the years with the most recent available data (2005-2007). To maximize stability, we aggregated data across three-year periods, and we aggregated data for black/African American and Hispanic/Latino PWID for most comparisons with data for white PWID. We assessed disparities by comparing IAR 95% CIs for overlap. RESULTS: IARs were significantly higher for black/African American and Hispanic/Latino PWID than for white PWID in 81% of MSAs in 1993-1995 and 77% of MSAs in 2005-2007. MSAs where disparities became non-significant over time were concentrated in the West. Significant differences were more frequent in comparisons between black/African American and white PWID (85% of MSAs in 1993-1995, 79% of MSAs in 2005-2007) than in comparisons between Hispanic/Latino and white PWID (53% of MSAs in 1993-1995, 56% of MSAs in 2005-2007). IARs declined modestly across racial/ethnic groups in most MSAs. CONCLUSIONS: AIDS diagnosis rates continue to be substantially higher for black/African American and Hispanic/Latino PWID than for white PWID in most large MSAs. This finding suggests a need for increased targeting of prevention and treatment programs, as well as research on MSA-level conditions that may serve to maintain the disparities. |
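The disparity criterion here is non-overlap of 95% CIs, which reduces to a simple interval check; a minimal sketch:

```python
def cis_overlap(ci_a, ci_b):
    """True if two (lower, upper) confidence intervals overlap; the
    study treated non-overlap as evidence of a significant disparity."""
    (lo_a, hi_a), (lo_b, hi_b) = ci_a, ci_b
    return lo_a <= hi_b and lo_b <= hi_a
```

Note that CI overlap is a conservative screen: two intervals can overlap slightly even when a formal test of the difference would be significant.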
Alcohol use and its association with HIV risk behaviors among a cohort of patients attending HIV clinical care in Tanzania, Kenya, and Namibia
Medley A , Seth P , Pathak S , Howard AA , Deluca N , Matiko E , Mwinyi A , Katuta F , Sheriff M , Makyao N , Wanjiku L , Ngare C , Bachanas P . AIDS Care 2014 26 (10) 1-10 This article describes the frequency of alcohol use among HIV-positive patients attending clinical care in sub-Saharan Africa and explores the association between alcohol use, medication adherence, and sexual risk behavior. Data from 3538 patients attending an HIV clinic in Kenya, Tanzania, or Namibia were captured through interview and medical record abstraction. Participants were categorized into three drinking categories: nondrinkers, nonharmful drinkers, and harmful/likely dependent drinkers. A proportional odds model was used to identify correlates associated with categories of alcohol use. Overall, 20% of participants reported alcohol use in the past 6 months; 15% were categorized as nonharmful drinkers and 5% as harmful/likely dependent drinkers. Participants who reported missing a dose of their HIV medications [adjusted odds ratio (AOR): 2.04, 95% confidence interval (CI): 1.67, 2.49]; inconsistent condom use (AOR: 1.49, 95% CI: 1.23, 1.79); exchanging sex for food, money, gifts, or a place to stay (AOR: 1.57, 95% CI: 1.06, 2.32); and having a sexually transmitted infection symptom (AOR: 1.40, 95% CI: 1.10, 1.77) were more likely to be categorized in the higher risk drinking categories. This research highlights the need to integrate alcohol screening and counseling into the adherence and risk reduction counseling offered to HIV-positive patients as part of their routine care. Moreover, given the numerous intersections between alcohol and HIV, policies that focus on reducing alcohol consumption and alcohol-related risk behavior should be integrated into HIV prevention, care, and treatment strategies. |
Assessing the impact of public health interventions on the transmission of pandemic H1N1 influenza A virus aboard a Peruvian Navy ship
Vera DM , Hora RA , Murillo A , Wong JF , Torre AJ , Wang D , Boulay D , Hancock K , Katz JM , Ramos M , Loayza L , Quispe J , Reaves EJ , Bausch DG , Chowell G , Montgomery JM . Influenza Other Respir Viruses 2014 8 (3) 353-9 BACKGROUND: Limited data exist on transmission dynamics and effectiveness of control measures for influenza in confined settings. OBJECTIVES: To investigate the transmission dynamics of a 2009 pandemic H1N1 influenza A outbreak aboard a Peruvian Navy ship and quantify the effectiveness of the implemented control measures. METHODS: We used surveillance data and a simple stochastic epidemic model to characterize and evaluate the effectiveness of control interventions implemented during an outbreak of 2009 pandemic H1N1 influenza A aboard a Peruvian Navy ship. RESULTS: The serological attack rate for the outbreak was 49.1%, with younger cadets and low-ranking officers at greater risk of infection than older, higher-ranking officers. Our transmission model yielded a good fit to the daily time series of new influenza cases by date of symptom onset. We estimated a reduction of 54.4% in the reproduction number during the period of intense control interventions. CONCLUSION: Our results indicate that the patient isolation strategy and other control measures put in place during the outbreak reduced the infectiousness of isolated individuals by 86.7%. Our findings support that early implementation of control interventions can limit the spread of influenza epidemics in confined settings. |
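A chain-binomial (Reed-Frost) process is one simple stochastic formulation of the kind used for confined-setting outbreaks; the sketch below lowers the per-contact transmission probability once control measures begin (all parameter values are illustrative, not the paper's estimates):

```python
import random

def reed_frost(n_susceptible, n_infected, p_transmit, p_transmit_controlled,
               control_start, generations, seed=0):
    """Chain-binomial (Reed-Frost) outbreak in a closed population.
    The per-contact transmission probability drops once control
    measures begin. Returns new-case counts per generation."""
    rng = random.Random(seed)
    s, i = n_susceptible, n_infected
    new_cases = []
    for t in range(generations):
        p = p_transmit_controlled if t >= control_start else p_transmit
        # Each susceptible escapes infection by every infective independently.
        p_infection = 1 - (1 - p) ** i
        cases = sum(1 for _ in range(s) if rng.random() < p_infection)
        new_cases.append(cases)
        s -= cases
        i = cases  # infectives are infectious for one generation
    return new_cases
```

Comparing runs with and without the controlled phase gives a rough sense of how much an intervention of a given strength curtails cumulative cases.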
Community-based electronic data collections for HIV prevention research with black/African-American men in the rural, Southern USA
Djawe K , Brown EE , Gaul Z , Sutton M . AIDS Care 2014 26 (10) 1-9 In Florida, the HIV case rate among black men is five times that of white men; tailored HIV prevention interventions are lacking. Historical concerns regarding trust with public health venues and sharing sensitive information make face-to-face data collection with some rural, southern black men challenging. We evaluated the feasibility and acceptability of using audio computer-assisted self-interviews (ACASIs) by local community-based organization members to collect HIV-related information from black men in rural settings. We used logistic regression to estimate associations between using ACASI and participants' sociodemographic characteristics. Of 636 men approached, 586 (92.0%) participated, 479 (81.7%) had never previously completed a computer survey, and 287 (71%) of those reporting a preference preferred ACASI for future data collections. Increased age, past computer use, and sharing a household with someone were significantly associated with ACASI feasibility and acceptability. Using ACASI with black men in rural settings is feasible for HIV intervention research and disparity-reducing goals. |
Complications among adults hospitalized with influenza: a comparison of seasonal influenza and the 2009 H1N1 pandemic
Reed C , Chaves SS , Perez A , D'Mello T , Kirley PD , Aragon D , Meek JI , Farley MM , Ryan P , Lynfield R , Morin CA , Hancock EB , Bennett NM , Zansky SM , Thomas A , Lindegren ML , Schaffner W , Finelli L . Clin Infect Dis 2014 59 (2) 166-74 BACKGROUND: Persons with influenza can develop complications that result in hospitalization and death. These are most commonly respiratory-related, but cardiovascular or neurologic complications or exacerbations of underlying chronic medical conditions may also occur. Patterns of complications observed during pandemics may differ from typical influenza seasons, and characterizing variations in influenza-related complications can provide a better understanding of the impact of pandemics and guide appropriate clinical management and planning for the future. METHODS: Using a population-based surveillance system, we compared clinical complications using ICD-9 discharge diagnosis codes in adults hospitalized with seasonal influenza (n=5,270) or 2009 pandemic influenza A(H1N1) (H1N1pdm09) (n=4,962). RESULTS: Adults hospitalized with H1N1pdm09 were younger (median age 47 years) than those with seasonal influenza (median: 68 years, p<0.01), and differed in the frequency of certain underlying medical conditions. While there was similar risk for many influenza-associated complications, after controlling for age and type of underlying medical condition adults hospitalized with H1N1pdm09 were more likely to have lower respiratory tract complications, shock/sepsis, and organ failure than those with seasonal influenza. They were also more likely to be admitted to the ICU, require mechanical ventilation, or die. Young adults, in particular, had 2-4 times the risk of severe outcomes from H1N1pdm09 than persons of the same ages with seasonal influenza. 
CONCLUSIONS: While thought of as a relatively mild pandemic, these data highlight the impact of the 2009 pandemic on the risk of severe influenza, especially among younger adults, and the impact this virus may continue to have. |
Disparities among 2009 pandemic influenza A (H1N1) hospital admissions: a mixed methods analysis - Illinois, April-December 2009
Soyemi K , Medina-Marino A , Sinkowitz-Cochran R , Schneider A , Njai R , McDonald M , Glover M , Garcia J , Aiello AE . PLoS One 2014 9 (4) e84380 During late April 2009, the first cases of 2009 pandemic influenza A (H1N1) (pH1N1) in Illinois were reported. Ongoing, sustained local transmission resulted in an estimated 500,000 infected persons. We conducted a mixed-methods analysis using both quantitative (surveillance) and qualitative (interview) data; surveillance data were used to analyze the demographic distribution of hospitalized cases, and follow-up interview data were used to assess health-seeking behavior. Invitations to participate in a telephone interview were sent to 120 randomly selected Illinois residents who were hospitalized during April-December 2009. During April-December 2009, 2,824 pH1N1 hospitalizations occurred in Illinois hospitals; median age at admission was 24 years (interquartile range: 6-49). Hospitalization rates per 100,000 persons for blacks and Hispanics, regardless of age or sex, were 2-3 times greater than for whites (blacks: 36/100,000 [95% confidence interval (CI) 33-39]; Hispanics: 35/100,000 [95% CI 32-37]; whites: 13/100,000 [95% CI 12-14]; p<0.001). Mortality rates were higher for blacks (0.9/100,000; p<0.09) and Hispanics (1/100,000; p<0.04) than for whites (0.6/100,000). Of 33 interview respondents, 31 (94%) stated that they had heard of pH1N1 before being hospitalized, and 24 (73%) did not believe they were at risk for pH1N1. On average, respondents reported experiencing symptoms for 2 days (range: 1-7) before seeking medical care. When asked how to prevent pH1N1 infection in the future, the most common responses were getting vaccinated and practicing hand hygiene. Blacks and Hispanics in Illinois experienced disproportionate pH1N1 hospitalization and mortality rates. 
Public health education and outreach efforts in preparation for future influenza pandemics should include prevention messaging focused on perception of risk, and ensure community wide access to prevention messages and practices. |
How should social mixing be measured: comparing web-based survey and sensor-based methods
Smieszek T , Barclay VC , Seeni I , Rainey JJ , Gao H , Uzicanin A , Salathe M . BMC Infect Dis 2014 14 136 BACKGROUND: Contact surveys and diaries have conventionally been used to measure contact networks in different settings for elucidating infectious disease transmission dynamics of respiratory infections. More recently, technological advances have permitted the use of wireless sensor devices, which can be worn by individuals interacting in a particular social context to record high resolution mixing patterns. To date, a direct comparison of these two different methods for collecting contact data has not been performed. METHODS: We studied the contact network at a United States high school in the spring of 2012. All school members (i.e., students, teachers, and other staff) were invited to wear wireless sensor devices for a single school day, and asked to remember and report the name and duration of all of their close proximity conversational contacts for that day in an online contact survey. We compared the two methods in terms of the resulting network densities, nodal degrees, and degree distributions. We also assessed the correspondence between the methods at the dyadic and individual levels. RESULTS: We found limited congruence in recorded contact data between the online contact survey and wireless sensors. In particular, there was only negligible correlation between the two methods for nodal degree, and the degree distribution differed substantially between both methods. We found that survey underreporting was a significant source of the difference between the two methods, and that this difference could be improved by excluding individuals who reported only a few contact partners. Additionally, survey reporting was more accurate for contacts of longer duration, and very inaccurate for contacts of shorter duration. Finally, female participants tended to report more accurately than male participants. 
CONCLUSIONS: Online contact surveys and wireless sensor devices collected incongruent network data from an identical setting. This finding suggests that these two methods cannot be used interchangeably for informing models of infectious disease dynamics. |
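The network measures compared in this study (density, degree, dyad-level agreement) can be computed directly from edge sets; a minimal sketch with toy dyads (not the study's data):

```python
# Undirected dyads stored as frozensets; five hypothetical participants.
survey = {frozenset(e) for e in [("a", "b"), ("b", "c")]}
sensor = {frozenset(e) for e in [("a", "b"), ("a", "c"), ("c", "d"), ("d", "e")]}
n_participants = 5

def density(edges, n_nodes):
    """Observed contacts as a share of all possible dyads."""
    return len(edges) / (n_nodes * (n_nodes - 1) / 2)

def degree_counts(edges):
    """Number of distinct contact partners per participant."""
    deg = {}
    for dyad in edges:
        for node in dyad:
            deg[node] = deg.get(node, 0) + 1
    return deg

# Dyad-level agreement: share of sensor-recorded contacts also self-reported.
recall = len(survey & sensor) / len(sensor)
```

In this toy example the survey network is sparser than the sensor network and captures only a quarter of the sensor-recorded dyads, the qualitative pattern of underreporting the study describes.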
Completeness of reporting of race and ethnicity data in the Nationally Notifiable Diseases Surveillance System, United States, 2006-2010
Adekoya N , Truman BI , Ajani UA . J Public Health Manag Pract 2014 21 (2) E16-22 CONTEXT: During 1994-1997, approximately 70% and 60% of the cases of conditions reported to the National Notifiable Diseases Surveillance System included persons of known race and ethnicity, respectively. A major goal of the Healthy People 2020 initiative is to eliminate health disparities. OBJECTIVE: To describe trends in the completeness of race and ethnicity data in case reports of the National Notifiable Diseases Surveillance System during 2006-2010. METHODS: The National Notifiable Diseases Surveillance System is a public health surveillance system that aggregates case reports of infectious diseases and conditions that are designated nationally notifiable and are collected by US states and territories. The Centers for Disease Control and Prevention (Atlanta, Georgia) maintains this surveillance system in collaboration with the Council of State and Territorial Epidemiologists. We used the Cochran-Armitage trend test (SAS, version 9.2) to test the hypothesis that the percentage of case reports with complete race and ethnicity data increased or decreased linearly during 2006-2010. MAIN OUTCOME MEASURE: Completeness of race and ethnicity variables. RESULTS: The 32 conditions reviewed included 1,030,804 case records. Seventy percent of records included a known value for race, and 49% of records included ethnicity during 2006-2010. During 2006-2010, race was known in 70% or more of records in 24 of 32 conditions and in 23 of 51 jurisdictions. During 2006-2010, the systemwide reporting of race remained at the same level of completeness (70%), but the reporting of ethnicity increased slightly, from 48% in 2006 to 53% in 2010. In comparison with race, the proportions of records coded to ethnicity were lower for all conditions. CONCLUSIONS: Significant change has occurred in the completeness of reporting of ethnicity but not race during 2006-2010. 
However, the reporting of ethnicity still lags substantially behind the reporting of race. Jurisdictions that identify conditions with lower rates of completeness of race and ethnicity can assess the net benefits of efforts to improve the completeness of race and ethnicity data. |
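The Cochran-Armitage trend test applied here (via SAS) fits in a few lines of code; a sketch using the standard score-based statistic (the yearly completeness counts below are illustrative, not the study's data):

```python
from math import sqrt, erf

def cochran_armitage(successes, totals, scores=None):
    """Two-sided Cochran-Armitage test for a linear trend in proportions
    across ordered groups (e.g. completeness by year). Returns (z, p)."""
    k = len(successes)
    if scores is None:
        scores = list(range(k))  # equally spaced scores, e.g. years
    n = sum(totals)
    p_bar = sum(successes) / n
    t = sum(x * (r - nt * p_bar)
            for x, r, nt in zip(scores, successes, totals))
    var = p_bar * (1 - p_bar) * (
        sum(nt * x * x for x, nt in zip(scores, totals))
        - sum(nt * x for x, nt in zip(scores, totals)) ** 2 / n
    )
    z = t / sqrt(var)
    p_two_sided = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_two_sided

# Illustrative counts of complete records per 1000 reports over five years:
z, p = cochran_armitage([480, 500, 510, 520, 530], [1000] * 5)
```

A rising sequence of proportions yields a positive z with a small p-value, while a flat sequence yields z near zero.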
Emergence and clonal dissemination of Salmonella enterica serovar Enteritidis causing salmonellosis in Mauritius
Issack MI , Hendriksen RS , Hyytia-Trees E , Svendsen CA , Mikoleit M . J Infect Dev Ctries 2014 8 (4) 454-60 INTRODUCTION: For decades, Salmonella enterica serovar Enteritidis has been among the most prevalent serovars reported worldwide. However, it was rarely encountered in Mauritius until 2007; since then, the number of non-typhoidal Salmonella serogroup O:9 isolates (including serovar Enteritidis) has increased. A study was conducted to investigate the genetic relatedness of S. Enteritidis isolates recovered in Mauritius from food and clinical specimens (stool, blood, and exudate). METHODOLOGY: Forty-seven isolates of S. Enteritidis obtained in 2009 from human stools, blood cultures, and exudates, and from food specimens were characterized by antimicrobial susceptibility testing and Multiple-Locus Variable-number tandem repeat Analysis (MLVA). RESULTS: With the exception of a single isolate that demonstrated intermediate susceptibility to streptomycin, all isolates were susceptible to the 14 antimicrobials tested. Thirty-seven of the 47 isolates (78.7%) exhibited an indistinguishable MLVA profile; these included isolates from ready-to-eat food products, chicken, and human clinical specimens (stool, blood, and exudate). CONCLUSIONS: The presence of highly related strains in both humans and raw chicken, and the failure to isolate the serovar from other foods, suggest that poultry is the main reservoir of S. Enteritidis in Mauritius and that the majority of human cases are associated with consumption of chicken originating from one major producer. Stool isolates were indistinguishable from or closely related to blood and exudate isolates, indicating that, besides gastroenteritis, the same strain caused invasive infections. Control of S. Enteritidis by poultry breeders would lower the financial burden associated with morbidity caused by this organism in Mauritius. |
Social influence in child care centers: a test of the theory of normative social behavior
Lapinski MK , Anderson J , Shugart A , Todd E . Health Commun 2014 29 (3) 219-32 Child care centers are a unique context for studying communication about social and personal expectations regarding health behaviors. The theory of normative social behavior (TNSB; Rimal & Real, 2005) provides a framework for testing the role of social and psychological influences on handwashing behaviors among child care workers. A cross-sectional survey of child care workers in 21 centers indicates that outcome expectations and group identity increase the strength of the relationship between descriptive norms and handwashing behavior. Injunctive norms also moderate the effect of descriptive norms on handwashing behavior such that when strong injunctive norms are reported, descriptive norms are positively related to handwashing, but when weak injunctive norms are reported, descriptive norms are negatively related to handwashing. The findings suggest that communication interventions in child care centers can focus on strengthening injunctive norms in order to increase handwashing behaviors. The findings also suggest that the theory of normative social behavior can be useful in organizational contexts. |
Endemic fungal infections in solid organ and hematopoietic cell transplant recipients enrolled in the Transplant-Associated Infection Surveillance Network (TRANSNET)
Kauffman CA , Freifeld AG , Andes DR , Baddley JW , Herwaldt L , Walker RC , Alexander BD , Anaissie EJ , Benedict K , Ito JI , Knapp KM , Lyon GM , Marr KA , Morrison VA , Park BJ , Patterson TF , Schuster MG , Chiller TM , Pappas PG . Transpl Infect Dis 2014 16 (2) 213-24 BACKGROUND: Invasive fungal infections are a major cause of morbidity and mortality among solid organ transplant (SOT) and hematopoietic cell transplant (HCT) recipients, but few data have been reported on the epidemiology of endemic fungal infections in these populations. METHODS: Fifteen institutions belonging to the Transplant-Associated Infection Surveillance Network prospectively enrolled SOT and HCT recipients with histoplasmosis, blastomycosis, or coccidioidomycosis occurring between March 2001 and March 2006. RESULTS: A total of 70 patients (64 SOT recipients and 6 HCT recipients) had infection with an endemic mycosis, including 52 with histoplasmosis, 9 with blastomycosis, and 9 with coccidioidomycosis. The 12-month cumulative incidence rate among SOT recipients for histoplasmosis was 0.102%. Occurrence of infection was bimodal; 28 (40%) infections occurred in the first 6 months post transplantation, and 24 (34%) occurred between 2 and 11 years post transplantation. Three patients were documented to have acquired infection from the donor organ. Seven SOT recipients with histoplasmosis and 3 with coccidioidomycosis died (16%); no HCT recipient died. CONCLUSIONS: This 5-year multicenter prospective surveillance study found that endemic mycoses occur uncommonly in SOT and HCT recipients, and that the period at risk extends for years after transplantation. |
An epidemiologic investigation of occupational transmission of Mycobacterium tuberculosis infection to dental health care personnel: infection prevention and control implications
Merte JL , Kroll CM , Collins AS , Melnick AL . J Am Dent Assoc 2014 145 (5) 464-471 BACKGROUND: The authors describe an investigation of a dental hygienist who developed active pulmonary tuberculosis (TB), worked for several months while infectious, and likely transmitted Mycobacterium tuberculosis in a dental setting in Washington state. METHODS: Clark County Public Health (CCPH) conducted an epidemiologic investigation of 20 potentially exposed close contacts and 734 direct-care dental patients in 2010. RESULTS: Of 20 close contacts, one family member and two coworkers, all of whom were from countries in which TB is endemic, had latent TB infection (LTBI). One U.S.-born coworker experienced a tuberculin skin test (TST) conversion from 0 to 8 millimeters. Of the 305 of 731 (41.7 percent) potentially exposed patients who received a single TST, 23 (7.5 percent) had a positive TST result of at least 5 mm. Among the subset of 157 patients tested by CCPH staff, 16 (10.2 percent) had a positive TST result. The dental office did not have infection prevention and control policies related to TB identification, prevention or education. CONCLUSIONS: The coworker's TST conversion indicated a recent infection, likely owing to occupational transmission. The proportion of dental patients with positive TST results was greater than the 1999-2000 National Health and Nutrition Examination Survey prevalence estimate in the general population, and it may reflect transmission from the hygienist with active TB or the prevalence of LTBI in the community. PRACTICAL IMPLICATIONS: All dental practices should implement administrative procedures for TB identification and control as described in this article, even if none of their patients are known to have TB. |
Incidence of and risk factors for hospital-acquired diarrhea in three tertiary care public hospitals in Bangladesh
Bhuiyan MU , Luby SP , Zaman RU , Rahman MW , Sharker MA , Hossain MJ , Rasul CH , Ekram AR , Rahman M , Sturm-Ramirez K , Azziz-Baumgartner E , Gurley ES . Am J Trop Med Hyg 2014 91 (1) 165-172 During April 2007-April 2010, surveillance physicians in adult and pediatric medicine wards of three tertiary public hospitals in Bangladesh identified patients who developed hospital-acquired diarrhea. We calculated incidence of hospital-acquired diarrhea. To identify risk factors, we compared these patients to randomly selected patients from the same wards who were admitted > 72 hours without having diarrhea. The incidence of hospital-acquired diarrhea was 4.8 cases per 1,000 patient-days. Children < 1 year of age were more likely to develop hospital-acquired diarrhea than older children. The risk of developing hospital-acquired diarrhea increased for each additional day of hospitalization beyond 72 hours, whereas exposure to antibiotics within 72 hours of admission decreased the risk. There were three deaths among case-patients; all were infants. Patients, particularly young children, are at risk for hospital-acquired diarrhea and associated deaths in Bangladeshi hospitals. Further research to identify the responsible organisms and transmission routes could inform prevention strategies. |
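The incidence measure reported above (cases per 1,000 patient-days) is an incidence density: events divided by total person-time at risk, scaled to a conventional denominator. A minimal sketch of the calculation follows; the counts in the usage example are hypothetical values chosen only to reproduce the 4.8 figure, not the study's actual case and patient-day totals:

```python
def incidence_per_1000_patient_days(cases, patient_days):
    """Incidence density of hospital-acquired events, expressed per
    1,000 patient-days of observation."""
    return cases / patient_days * 1000
```

For example, 48 hypothetical cases observed over 10,000 patient-days give `incidence_per_1000_patient_days(48, 10000)` = 4.8 cases per 1,000 patient-days.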
School-located influenza vaccination with third-party billing: outcomes, cost, and reimbursement
Kempe A , Daley MF , Pyrzanowski J , Vogt T , Fang H , Rinehart DJ , Morgan N , Riis M , Rodgers S , McCormick E , Hammer A , Campagna EJ , Kile D , Dickinson M , Hambidge SJ , Shlay JC . Acad Pediatr 2014 14 (3) 234-40 OBJECTIVE: To assess rates of immunization; costs of conducting clinics; and reimbursements for a school-located influenza vaccination (SLIV) program that billed third-party payers. METHODS: SLIV clinics were conducted in 19 elementary schools in the Denver Public School district (September 2010 to February 2011). School personnel obtained parental consent, and a community vaccinator conducted clinics and performed billing. Vaccines For Children vaccine was available for eligible students. Parents were not billed for any fees. Data were collected regarding implementation costs and vaccine cost was calculated using published private sector prices. Reimbursement amounts were compared to costs. RESULTS: Overall, 30% of students (2784 of 9295) received ≥1 influenza vaccine; 39% (1079 of 2784) needed 2 doses and 80% received both. Excluding vaccine costs, implementation costs were $24.69 per vaccination. The percentage of vaccine costs reimbursed was 62% overall (82% from State Child Health Insurance Program (SCHIP), 50% from private insurance). The percentage of implementation costs reimbursed was 19% overall (23% from private, 27% from Medicaid, 29% from SCHIP and 0% among uninsured). Overall, 25% of total costs (implementation plus vaccine) were reimbursed. CONCLUSIONS: A SLIV program resulted in vaccination of nearly one third of elementary students. Reimbursement rates were limited by 1) school restrictions on charging parents fees, 2) low payments for vaccine administration from public payers and 3) high rates of denials from private insurers. Some of these problems might be reduced by provisions in the Affordable Care Act. |
School-located influenza vaccination with third-party billing: what do parents think?
Kempe A , Daley MF , Pyrzanowski J , Vogt TM , Campagna EJ , Dickinson LM , Hambidge SJ , Shlay JC . Acad Pediatr 2014 14 (3) 241-8 OBJECTIVE: School-located influenza vaccination (SLIV) may be instrumental in achieving high vaccination rates among children. Sustainability of SLIV programs may require third-party billing. This study assessed, among parents of elementary school students, attitudes about SLIV and billing at school, as well as factors associated with being supportive of SLIV. METHODS: We conducted a survey (April 2010 to June 2010) of parents of 1000 randomly selected, primarily low-income children at 20 elementary schools at which SLIV with billing had occurred. RESULTS: Response rate was 70% (n = 699). Eighty-one percent agreed (61% strongly) that they "would be okay" with SLIV for their child. Many agreed it was better to get vaccinated at their child's doctor's office because they could take care of other health issues (72%) and the doctor knows the child's medical history (65%). However, equal percentages (47% each) identified the child's doctor's office and the child's school as the best place for influenza vaccination. Twenty-five percent did not want to give the health insurance information necessary for billing at school. Factors independently associated with strongly supporting SLIV included parental education of high school or less (relative risk 1.30; 95% confidence interval 1.09-1.58); Hispanic ethnicity (1.25; 1.08-1.45); believing the vaccine is efficacious (1.49; 1.23-1.84); and finding school delivery more convenient (2.37; 1.82-3.45). Having concerns about the safety of influenza vaccine (0.80; 0.72-0.88) and not wanting their child to be vaccinated without a parent present (0.74; 0.64-0.83) were negatively associated. CONCLUSIONS: The majority of parents were supportive of SLIV, although parental concerns about not being present for vaccination and about the safety and efficacy of the vaccine will need to be addressed. |
The effectiveness of seasonal trivalent inactivated influenza vaccine in preventing laboratory confirmed influenza hospitalisations in Auckland, New Zealand in 2012
Turner N , Pierse N , Bissielo A , Huang QS , Baker MG , Widdowson MA , Kelly H . Vaccine 2014 32 (29) 3687-93 BACKGROUND: Few studies report the effectiveness of trivalent inactivated influenza vaccine (TIV) in preventing hospitalisation for influenza-confirmed respiratory infections. Using a prospective surveillance platform, this study reports the first such estimate from a well-defined, ethnically diverse population in New Zealand (NZ). METHODS: A case test-negative design was used to estimate propensity-adjusted vaccine effectiveness. Patients with a severe acute respiratory infection (SARI), defined as a patient of any age requiring hospitalisation with a history of fever or a measured temperature ≥38 degrees C, cough, and onset within the past 7 days, admitted to public hospitals in South and Central Auckland were eligible for inclusion in the study. Cases were SARI patients who tested positive for influenza, while non-cases (controls) were SARI patients who tested negative. Results were adjusted for the propensity to be vaccinated and the timing of the influenza season. RESULTS: The propensity- and season-adjusted vaccine effectiveness (VE) was estimated as 39% (95% CI 16;56). The VE point estimate against influenza A (H1N1) was lower than for influenza B or influenza A (H3N2), but confidence intervals were wide and overlapping. Estimated VE was 59% (95% CI 26;77) in patients aged 45-64 years but only 8% (-78;53) in those aged 65 years and above. CONCLUSION: Prospective surveillance for SARI has been successfully established in NZ. In its first year, covering the 2012 influenza season, this study showed low to moderate protection by TIV against influenza-positive hospitalisation. |
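In a case test-negative design like the one above, crude vaccine effectiveness is derived from the odds ratio (OR) comparing the odds of vaccination in test-positive patients (cases) with the odds in test-negative patients (controls): VE = (1 - OR) × 100. The sketch below shows only this unadjusted calculation with hypothetical counts; the study's published 39% estimate came from a model additionally adjusted for vaccination propensity and season timing:

```python
def crude_vaccine_effectiveness(vacc_cases, unvacc_cases,
                                vacc_controls, unvacc_controls):
    """Unadjusted VE from a case test-negative design.
    OR = odds of vaccination among test-positives divided by
    odds of vaccination among test-negatives; VE = (1 - OR) * 100."""
    odds_ratio = (vacc_cases / unvacc_cases) / (vacc_controls / unvacc_controls)
    return (1 - odds_ratio) * 100
```

With illustrative counts of 25 vaccinated and 75 unvaccinated cases versus 50 vaccinated and 50 unvaccinated controls, the OR is 1/3 and crude VE is about 66.7%.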
Evaluation of sex, race, body mass index and pre-vaccination serum progesterone levels and post-vaccination serum anti-anthrax protective immunoglobulin G on injection site adverse events following anthrax vaccine adsorbed (AVA) in the CDC AVA human clinical trial
Pondo T , Rose CE , Martin SW , Keitel WA , Keyserling HL , Babcock J , Parker S , Jacobson RM , Poland GA , McNeil MM . Vaccine 2014 32 (28) 3548-54 BACKGROUND: Anthrax vaccine adsorbed (AVA) administered intramuscularly (IM) results in fewer adverse events (AEs) than subcutaneous (SQ) administration. Women experience more AEs than men. Antibody response, female hormones, race, and body mass index (BMI) may contribute to the increased frequency of reported injection site AEs. METHODS: We analyzed data from the CDC anthrax vaccine adsorbed human clinical trial. This double-blind, randomized, placebo-controlled trial enrolled 1563 participants and followed them through 8 injections (AVA or placebo) over a period of 42 months. For the trial's vaccinated cohort (n=1267), we used multivariable logistic regression to model the effects of study group (SQ or IM), sex, race, study site, BMI, age, and post-vaccination serum anti-PA IgG on occurrence of AEs of any severity grade. Also, in a women-only subset (n=227), we assessed the effect of pre-vaccination serum progesterone level and menstrual phase on AEs. RESULTS: Participants who received SQ injections had significantly higher proportions of itching, redness, swelling, tenderness and warmth compared to the IM study group after adjusting for other risk factors. The proportions of redness, swelling, tenderness and warmth were all significantly lower in black vs. non-black participants. We found that arm motion limitation, itching, pain, swelling and tenderness were more likely to occur in participants with the highest anti-PA IgG concentrations. In the SQ study group, redness and swelling were more common for obese participants compared to participants who were not overweight. Females had significantly higher proportions of all AEs compared to males. Menstrual phase was not associated with any AEs. CONCLUSIONS: Female and non-black participants had a higher proportion of AVA-associated AEs and higher anti-PA IgG concentrations. 
Antibody responses to other vaccines may also vary by gender and race. Further studies may provide a better understanding of the higher proportions of AEs in women and non-black participants. |
Absence of associations between influenza vaccines and increased risks of seizures, Guillain-Barre syndrome, encephalitis, or anaphylaxis in the 2012-2013 season
Kawai AT , Li L , Kulldorff M , Vellozzi C , Weintraub E , Baxter R , Belongia EA , Daley MF , Jacobsen SJ , Naleway A , Nordin JD , Lee GM . Pharmacoepidemiol Drug Saf 2014 23 (5) 548-53 PURPOSE: We conducted weekly surveillance for pre-specified adverse events following receipt of the 2012-2013 influenza vaccines in the Vaccine Safety Datalink (VSD). METHODS: For each outcome, risk intervals (i.e., periods after vaccination with a potentially increased risk) were defined on the basis of biologic plausibility and prior literature. Seizures following inactivated influenza vaccine (IIV) were monitored in children in three age groups (6-23 months, 24-59 months, and 5-17 years) using a self-controlled risk interval design. We also monitored for Guillain-Barre syndrome, encephalitis, and anaphylaxis following IIV in patients ≥6 months of age using a cohort design with historical controls. In the risk intervals following live attenuated influenza vaccine (LAIV), we collected weekly counts of Guillain-Barre syndrome, encephalitis, and anaphylaxis in patients aged 2-49 years. Among LAIV vaccinees, numbers of expected events based on rates in historical controls were calculated, adjusted for age and site. RESULTS: At the end of surveillance, approximately 3.6 million first doses of IIV and 250 000 first doses of LAIV had been administered in the VSD. No elevated risks were identified in risk intervals following 2012-2013 IIV, as compared with a self-matched control interval or with historical controls. For each outcome, fewer than three events occurred in the risk interval following 2012-2013 LAIV, and we thus were unable to estimate measures of relative risk. CONCLUSIONS: No increased risk was identified for any of the pre-specified outcomes following 2012-2013 influenza vaccinations in the VSD. Published 2014. This article is a U.S. Government work and is in the public domain in the USA. |
Completeness and timeliness of electronic vs. conventional laboratory reporting for communicable disease surveillance - Oklahoma, 2011
Johnson MG , Williams J , Lee A , Bradley KK . Public Health Rep 2014 129 (3) 261-266 OBJECTIVES: The Health Information Technology for Economic and Clinical Health (HITECH) Act encourages the meaningful use of certified electronic health record technology. A core component of HITECH compliance is nationwide implementation of electronic laboratory reporting (ELR) for communicable disease surveillance. In Oklahoma, laboratories with ≥400 positive tests/year for reportable diseases must use ELR. Of 18 such laboratories, two have adopted ELR. We compared the completeness and timeliness of ELR reports from these two laboratories with conventional reports from all other Oklahoma laboratories. METHODS: We retrospectively reviewed confirmed reportable disease cases for January 1-December 31, 2011, excluding tuberculosis, hepatitis, sexually transmitted infections, diseases without laboratory diagnoses, and immediately reportable diseases. Probable reportable tickborne disease cases were included. We compared ELR with conventional reporting (i.e., mail, fax, telephone, and Internet). We assessed data completeness based on eight demographic and two laboratory fields in each disease report, and timeliness by the percentage of cases reported in ≤1 business day. RESULTS: Overall, 1,867 reports met the inclusion criteria; 24% of these reports had been submitted by ELR. Data completeness was 90% for ELR and 95% for conventional reporting. Missing patient addresses accounted for 97% of the missing data fields in ELR reports. Timeliness was 91% for ELR and 87% for conventional reports. CONCLUSIONS: Although the transition to ELR compliance in Oklahoma is still at an early stage, ELR has already yielded improved timeliness for communicable disease surveillance. However, ELR did not yield more complete reports than conventional reporting. Requiring specific demographic data fields in ELR reports could improve the completeness of ELR. |
School violence and bullying among sexual minority high school students, 2009-2011
O'Malley Olsen E , Kann L , Vivolo-Kantor A , Kinchen S , McManus T . J Adolesc Health 2014 55 (3) 432-8 PURPOSE: School-based victimization has short- and long-term implications for the health and academic lives of sexual minority students. This analysis assessed the prevalence and relative risk of school violence and bullying among sexual minority and heterosexual high school students. METHODS: Youth Risk Behavior Survey data from 10 states and 10 large urban school districts that assessed sexual identity and had weighted data in the 2009 and/or 2011 cycle were combined to create two large population-based data sets, one containing state data and one containing district data. The prevalence of physical fighting, being threatened or injured with a weapon, weapon carrying, being bullied on school property, and not going to school because of safety concerns was calculated. Associations between these behaviors and sexual identity were identified. RESULTS: In the state data, sexual minority male students were at greater risk than heterosexual male students for being threatened or injured with a weapon, not going to school because of safety concerns, and being bullied. Sexual minority female students were at greater risk than heterosexual female students for all five behaviors. In the district data, with one exception, sexual minority male and female students were at greater risk than heterosexual students for all five behaviors. CONCLUSIONS: Sexual minority students still routinely experience more school victimization than their heterosexual counterparts. The implementation of comprehensive, evidence-based programs and policies has the ability to reduce school violence and bullying, especially among sexual minority students. |
Falls and fall injuries among adults with arthritis - United States, 2012
Barbour KE , Stevens JA , Helmick CG , Luo YH , Murphy LB , Hootman JM , Theis K , Anderson LA , Baker NA , Sugerman DE . MMWR Morb Mortal Wkly Rep 2014 63 (17) 379-83 Falls are the leading cause of injury-related morbidity and mortality among older adults, with more than one in three older adults falling each year, resulting in direct medical costs of nearly $30 billion. Some of the major consequences of falls among older adults are hip fractures, brain injuries, decline in functional abilities, and reductions in social and physical activities. Although the burden of falls among older adults is well-documented, research suggests that falls and fall injuries are also common among middle-aged adults. One risk factor for falling is poor neuromuscular function (i.e., gait speed and balance), which is common among persons with arthritis. In the United States, the prevalence of arthritis is highest among middle-aged adults (aged 45-64 years) (30.2%) and older adults (aged ≥65 years) (49.7%), and these populations account for 52% of U.S. adults. Moreover, arthritis is the most common cause of disability. To examine the prevalence of falls among middle-aged and older adults with arthritis in different states/territories, CDC analyzed data from the 2012 Behavioral Risk Factor Surveillance System (BRFSS) to assess the state-specific prevalence of having fallen and having experienced a fall injury in the past 12 months among adults aged ≥45 years with and without doctor-diagnosed arthritis. This report summarizes the results of that analysis, which found that for all 50 states and the District of Columbia (DC), the prevalence of any fall (one or more), two or more falls, and fall injuries in the past 12 months was significantly higher among adults with arthritis compared with those without arthritis. 
The prevalence of falls and fall injuries is high among adults with arthritis but can be addressed through greater dissemination of arthritis management and fall prevention programs in clinical and community practice. |
Bullying as a longitudinal predictor of adolescent dating violence
Foshee VA , McNaughton Reyes HL , Vivolo-Kantor AM , Basile KC , Chang LY , Faris R , Ennett ST . J Adolesc Health 2014 55 (3) 439-44 PURPOSE: One suggested approach to preventing adolescent dating violence is to prevent behavioral precursors to dating violence, such as bullying. However, no longitudinal study has examined bullying as a behavioral precursor to dating violence. In this study, longitudinal data were used to examine (1) whether direct and indirect bullying perpetration in the sixth grade predicted the onset of physical dating violence perpetration by the eighth grade and (2) whether the associations varied by sex and race/ethnicity of the adolescent. METHODS: Data were collected in school from sixth graders in three primarily rural counties and then again when students were in the eighth grade. Analyses were conducted with 1,154 adolescents who had not perpetrated dating violence at the sixth-grade assessment. The sample was 47% male, 29% black, and 10% of a race/ethnicity other than black or white. RESULTS: Direct bullying, defined as hitting, slapping, or picking on another kid in the sixth grade, predicted the onset of physical dating violence perpetration by the eighth grade, controlling for indirect bullying and potential confounders. Although indirect bullying, defined as spreading false rumors and excluding students from friendship groups, was associated with the onset of physical dating violence perpetration in bivariate analyses, it did not predict the onset of physical dating violence when controlling for direct bullying. None of the associations examined varied by sex or race/ethnicity of the adolescents. CONCLUSIONS: Our findings suggest that efforts targeted at preventing direct bullying may also prevent the onset of physical dating violence. |
Real-time detection of HIV-2 by reverse transcription-loop-mediated isothermal amplification
Curtis KA , Niedzwiedz P , Youngpairoj AS , Rudolph DL , Owen SM . J Clin Microbiol 2014 52 (7) 2674-6 Currently, there are no FDA-approved nucleic acid amplification tests (NAATs) for the detection or confirmation of HIV-2 infection. We describe the development of a real-time assay for the detection of HIV-2 DNA and RNA using reverse transcription-loop-mediated isothermal amplification (RT-LAMP) and the ESEQuant Tube Scanner, a portable isothermal amplification/detection device. |
Rationale for eliminating Staphylococcus breakpoints for beta-lactam agents other than penicillin, oxacillin or cefoxitin, and ceftaroline
Dien Bard J , Hindler JA , Gold HS , Limbago B . Clin Infect Dis 2014 58 (9) 1287-96 Due to the ongoing concern about the reliability of Staphylococcus breakpoints (interpretive criteria) for other beta-lactam agents, the Clinical and Laboratory Standards Institute recently approved the elimination of all breakpoints for antistaphylococcal beta-lactams except for penicillin, oxacillin or cefoxitin, and ceftaroline. Routine testing of penicillin and oxacillin or cefoxitin should be used to infer susceptibility for all beta-lactams with approved clinical indications for staphylococcal infections. It is critical for laboratories to reject requests for susceptibility testing of other beta-lactams against staphylococci and to indicate that susceptibility to these agents can be predicted from the penicillin and oxacillin or cefoxitin results. This article reviews beta-lactam resistance mechanisms in staphylococci, current antimicrobial susceptibility testing and reporting recommendations for beta-lactams and staphylococci, and microbiologic data and clinical data supporting the elimination of staphylococcal breakpoints for other beta-lactam agents. |
Development and validation of benomyl birdseed agar for the isolation of Cryptococcus neoformans and Cryptococcus gattii from environmental samples
Pham CD , Ahn S , Turner LA , Wohrle R , Lockhart SR . Med Mycol 2014 52 (4) 417-21 One of the difficulties of isolating Cryptococcus neoformans and Cryptococcus gattii from environmental samples is the abundant overgrowth of other yeast and mold species on the isolation plates. Here we report the application of benomyl to Guizotia abyssinica seed extract growth medium to improve the isolation of C. neoformans and C. gattii from environmental samples. We validated this medium by recovering C. neoformans and C. gattii from convenience samples of soil and swabs collected in a region of the United States where these yeasts are endemic. |
Trends in weight management goals and behaviors among 9th-12th grade students: United States, 1999-2009
Demissie Z , Lowry R , Eaton DK , Nihiser AJ . Matern Child Health J 2014 19 (1) 74-83 To examine trends in weight management goals and behaviors among U.S. high school students during 1999-2009. Data from six biennial cycles (1999-2009) of the national Youth Risk Behavior Survey were analyzed. Cross-sectional, nationally representative samples of 9th-12th grade students (approximately 14,000 students/cycle) completed self-administered questionnaires. Logistic regression models adjusted for grade, race/ethnicity, and obesity were used to test for trends in weight management goals and behaviors among subgroups of students. Combined prevalences and trends differed by sex, and by race/ethnicity and weight status within sex. During 1999-2009, the prevalence of female students trying to gain weight decreased (from 7.6% to 5.7%). Among female students trying to lose or stay the same weight, prevalences decreased for eating less (from 69.6% to 63.2%); fasting (from 23.3% to 17.6%); using diet pills/powders/liquids (from 13.7% to 7.8%); and vomiting/laxatives (from 9.5% to 6.6%) for weight control. During 1999-2009, the prevalence of male students trying to lose weight increased (from 26.1% to 30.5%). Among male students trying to lose or stay the same weight, the prevalence of exercising to control weight did not change during 1999-2003 and then increased (from 74.0% to 79.1%), while the prevalence of taking diet pills/powders/liquids for weight control decreased (from 6.9% to 5.1%) during 1999-2009. Weight management goals and behaviors changed during 1999-2009 and differed by subgroup. To combat the use of unhealthy weight control behaviors, efforts may be needed to teach adolescents recommended weight management strategies and the importance of avoiding unhealthy methods and their associated risks. |
Validation of self-reported maternal and infant health indicators in the Pregnancy Risk Assessment Monitoring System
Dietz P , Bombard J , Mulready-Ward C , Gauthier J , Sackoff J , Brozicevic P , Gambatese M , Nyland-Funke M , England L , Harrison L , Taylor A . Matern Child Health J 2014 18 (10) 2489-98 To assess the validity of self-reported maternal and infant health indicators reported by mothers an average of 4 months after delivery. Three validity measures (sensitivity, specificity, and positive predictive value [PPV]) were calculated for pregnancy history, pregnancy complications, health care utilization, and infant health indicators self-reported on the Pregnancy Risk Assessment Monitoring System (PRAMS) questionnaire by a representative sample of mothers delivering live births in New York City (NYC) (n = 603) and Vermont (n = 664) in 2009. Data abstracted from hospital records served as the gold standard. All data were weighted to be representative of women delivering live births in NYC or Vermont during the study period. Most PRAMS indicators had >90% specificity. Indicators with >90% sensitivity and PPV for both sites included prior live birth, any diabetes, and Medicaid insurance at delivery, and for Vermont only, infant admission to the NICU and breastfeeding in the hospital. Indicators with poor sensitivity and PPV (<70%) for both sites (i.e., NYC and Vermont) included placenta previa and/or placental abruption and urinary tract infection or kidney infection, and for NYC only, preterm labor, prior low-birth-weight birth, and prior preterm birth. For Vermont only, receipt of an HIV test during pregnancy had poor sensitivity and PPV. Mothers accurately reported information on prior live births and Medicaid insurance at delivery; however, mothers' recall of certain pregnancy complications and pregnancy history was poor. These findings could be used to prioritize data collection of indicators with high validity. |
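The three validity measures used above are computed from a 2×2 cross-classification of self-report against the gold-standard (hospital-record) classification. A minimal sketch follows, with hypothetical cell counts in the usage example (not the PRAMS data):

```python
def validity_measures(tp, fp, fn, tn):
    """Validity of a self-reported indicator against a gold standard.
    tp: self-report positive, gold standard positive
    fp: self-report positive, gold standard negative
    fn: self-report negative, gold standard positive
    tn: self-report negative, gold standard negative"""
    sensitivity = tp / (tp + fn)  # proportion of true positives detected
    specificity = tn / (tn + fp)  # proportion of true negatives detected
    ppv = tp / (tp + fp)          # proportion of positive reports that are true
    return sensitivity, specificity, ppv
```

For example, with 90 true positives, 10 false positives, 10 false negatives, and 90 true negatives, all three measures equal 0.9, which would place a hypothetical indicator above the >90% thresholds only at the margin.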
Postpartum venous thromboembolism: incidence and risk factors
Tepper NK , Boulet SL , Whiteman MK , Monsour M , Marchbanks PA , Hooper WC , Curtis KM . Obstet Gynecol 2014 123 (5) 987-996 OBJECTIVE: To calculate incidence of postpartum venous thromboembolism by week after delivery and to examine potential risk factors for venous thromboembolism overall and at different times during the postpartum period. METHODS: A deidentified health care claims information database from employers, health plans, hospitals, and Medicaid programs across the United States was used to identify delivery hospitalizations among women aged 15-44 years during the years 2005-2011. International Classification of Diseases, 9th Revision, Clinical Modification diagnosis and procedure codes were used to identify instances of venous thromboembolism and associated characteristics and conditions among women with recent delivery. Incidence proportions of venous thromboembolism by week postpartum through week 12 were calculated per 10,000 deliveries. Logistic regression was used to calculate odds ratios for selected risk factors among women with postpartum venous thromboembolism and among women with venous thromboembolism during the early or later postpartum periods. RESULTS: The incidence proportion of postpartum venous thromboembolism was highest during the first 3 weeks after delivery, dropping from nine per 10,000 during the first week to one per 10,000 at 4 weeks after delivery and decreasing steadily through the 12th week. Certain obstetric procedures and complications such as cesarean delivery, preeclampsia, hemorrhage, and postpartum infection conferred an increased risk for venous thromboembolism (odds ratios ranging from 1.3 to 6.4), which persisted over the 12-week period compared with women without these risk factors. CONCLUSION: Risk for postpartum venous thromboembolism is highest during the first 3 weeks after delivery. 
Women with obstetric complications are at highest risk for postpartum venous thromboembolism, and this risk remains elevated throughout the first 12 weeks after delivery. LEVEL OF EVIDENCE: II. |
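The weekly incidence proportions reported above reduce to a simple ratio scaled to 10,000 deliveries; a minimal sketch with made-up counts (not the study data):

```python
def incidence_per_10k(cases, deliveries):
    """Incidence proportion of postpartum VTE per 10,000 deliveries."""
    return 10_000 * cases / deliveries

# Hypothetical: 900 first-week VTE events among 1,000,000 deliveries
rate_week1 = incidence_per_10k(900, 1_000_000)  # 9.0 per 10,000
```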
Facility-based identification of women with severe maternal morbidity: it is time to start
Callaghan WM , Grobman WA , Kilpatrick SJ , Main EK , D'Alton M . Obstet Gynecol 2014 123 (5) 978-981 Although maternal deaths have been the traditional indicator of maternal health, these events are the "tip of the iceberg" in that there are many women who have significant complications of pregnancy, labor, and delivery. Identifying women who experience severe maternal morbidity and reviewing their care can provide critical information to inform quality improvement in obstetrics. In this commentary, we review methods to identify women who experienced severe complications of pregnancy. We propose a simple validated approach based on transfusion of four or more units of blood products, admission to an intensive care unit, or both as a starting point for identification and review of severe maternal morbidity in health care settings for the purpose of understanding successes and failures in systems of care. |
Clinical interventions to reduce secondhand smoke exposure among pregnant women: a systematic review
Tong VT , Dietz PM , Rolle IV , Kennedy SM , Thomas W , England LJ . Tob Control 2014 24 (3) 217-23 OBJECTIVE: To conduct a systematic review of clinical interventions to reduce secondhand smoke (SHS) exposure among non-smoking pregnant women. DATA SOURCES: We searched 16 databases for publications from 1990 to January 2013, with no language restrictions. STUDY SELECTION: Papers were included if they met the following criteria: (1) the study population included non-smoking pregnant women exposed to SHS, (2) the clinical interventions were intended to reduce SHS exposure at home, (3) the study included a control group and (4) outcomes included either reduced SHS exposure of non-smoking pregnant women at home or quit rates among smoking partners during the pregnancy of the woman. DATA EXTRACTION: Two coders independently reviewed each abstract or full text to identify eligible papers. Two abstractors independently coded papers based on US Preventive Services Task Force criteria for study quality (good, fair, poor), and studies without biochemically-verified outcome measures were considered poor quality. DATA SYNTHESIS: From 4670 papers, we identified five studies that met our inclusion criteria: four focused on reducing SHS exposure among non-smoking pregnant women, and one focused on providing cessation support for smoking partners of pregnant women. All were randomised controlled trials, and all reported positive findings. Three studies were judged poor quality because outcome measures were not biochemically-verified, and two were considered fair quality. CONCLUSIONS: Clinical interventions delivered in prenatal care settings appear to reduce SHS exposure, but study weaknesses limit our ability to draw firm conclusions. More rigorous studies, using biochemical validation, are needed to identify strategies for reducing SHS exposure in pregnant women. |
Complementary and alternative medicine for Duchenne and Becker muscular dystrophies: characteristics of users and caregivers
Zhu Y , Romitti PA , Conway KM , Andrews J , Liu K , Meaney FJ , Street N , Puzhankara S , Druschel CM , Matthews DJ . Pediatr Neurol 2014 51 (1) 71-7 BACKGROUND: Complementary and alternative medicine is frequently used in the management of chronic pediatric diseases, but little is known about its use by those with Duchenne or Becker muscular dystrophy. METHODS: Complementary and alternative medicine use by male patients with Duchenne or Becker muscular dystrophy and associations with characteristics of male patients and their caregivers were examined through interviews with 362 primary caregivers identified from the Muscular Dystrophy Surveillance, Tracking, and Research Network. RESULTS: Overall, 272 of the 362 (75.1%) primary caregivers reported that they had used any complementary and alternative medicine for the oldest Muscular Dystrophy Surveillance, Tracking, and Research Network male in their family. The most commonly reported therapies were from the mind-body medicine domain (61.0%) followed by those from the biologically based practice (39.2%), manipulative and body-based practice (29.3%), and whole medical system (6.9%) domains. Aquatherapy, prayer and/or blessing, special diet, and massage were the most frequently used therapies. Compared with nonusers, male patients who used any therapy were more likely to have an early onset of symptoms and use a wheelchair; their caregivers were more likely to be non-Hispanic white. Among domains, associations were observed with caregiver education and family income (mind-body medicines [excluding prayer and/or blessing only] and whole medical systems) and Muscular Dystrophy Surveillance, Tracking, and Research Network site (biologically based practices and mind-body medicines [excluding prayer and/or blessing only]). 
CONCLUSIONS: Complementary and alternative medicine use was common in the management of Duchenne and Becker muscular dystrophies among Muscular Dystrophy Surveillance, Tracking, and Research Network males. This widespread use suggests that further study is needed to evaluate the efficacy of integrating complementary and alternative medicine into treatment regimens for Duchenne and Becker muscular dystrophies. |
Cost-effectiveness of testing hepatitis B-positive pregnant women for hepatitis B e antigen or viral load
Fan L , Owusu-Edusei K Jr , Schillie SF , Murphy TV . Obstet Gynecol 2014 123 (5) 929-937 OBJECTIVE: To estimate the cost-effectiveness of testing pregnant women with hepatitis B (hepatitis B surface antigen [HBsAg]-positive) for hepatitis B e antigen (HBeAg) or hepatitis B virus (HBV) DNA, and administering maternal antiviral prophylaxis if indicated, to decrease breakthrough perinatal HBV transmission from the U.S. health care perspective. METHODS: A Markov decision model was constructed for a 2010 birth cohort of 4 million neonates to estimate the cost-effectiveness of two strategies: testing HBsAg-positive pregnant women for 1) HBeAg or 2) HBV load. Maternal antiviral prophylaxis is given from 28 weeks of gestation through 4 weeks postpartum when HBeAg is positive or HBV load is high (10 copies/mL or greater). These strategies were compared with the current recommendation. All neonates born to HBsAg-positive women received recommended active-passive immunoprophylaxis. Effects were measured in quality-adjusted life-years (QALYs) and all costs were in 2010 U.S. dollars. RESULTS: The HBeAg testing strategy saved $3.3 million and 3,080 QALYs and prevented 486 chronic HBV infections compared with the current recommendation. The HBV load testing strategy cost $3 million more than current recommendation, saved 2,080 QALYs, and prevented 324 chronic infections with an incremental cost-effectiveness ratio of $1,583 per QALY saved compared with the current recommendations. The results remained robust over a wide range of assumptions. CONCLUSION: Testing HBsAg-positive pregnant women for HBeAg or HBV load followed by maternal antiviral prophylaxis if HBeAg-positive or high viral load to reduce perinatal hepatitis B transmission in the United States is cost-effective. |
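The incremental cost-effectiveness ratio reported for the HBV-load strategy is incremental cost divided by incremental QALYs relative to the comparator; a minimal sketch with hypothetical values (not the model's outputs):

```python
def icer(incremental_cost, incremental_qalys):
    """Incremental cost-effectiveness ratio: extra dollars per QALY
    gained by one strategy relative to its comparator."""
    return incremental_cost / incremental_qalys

# Hypothetical: a strategy costing $3.0M more while saving 2,000 QALYs
ratio = icer(3_000_000, 2_000)  # $1,500 per QALY saved
```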
Safety, effectiveness and acceptability of the PrePex device for adult male circumcision in Kenya
Feldblum PJ , Odoyo-June E , Obiero W , Bailey RC , Combes S , Hart C , Jou Lai J , Fischer S , Cherutich P . PLoS One 2014 9 (5) e95357 OBJECTIVE: To assess the safety, effectiveness and acceptability of the PrePex device for adult medical male circumcision (MMC) in routine service delivery in Kenya. METHODS: We enrolled 427 men ages 18-49 at one fixed and two outreach clinics. Procedures were performed by trained clinical officers and nurses. The first 50 enrollees were scheduled for six follow-up visits, and the remaining men were followed at Days 7 and 42. We recorded adverse events (AEs) and time to complete healing, and interviewed men about acceptability and pain. RESULTS: Placement and removal procedures each averaged between 3 and 4 minutes. Self-reported pain was minimal during placement but was fleetingly intense during removal. The rate of moderate/severe AEs was 5.9% overall (95% confidence interval [CI] 3.8%-8.5%), all of which resolved without sequelae. AEs included 5 device displacements, 2 spontaneous foreskin detachments, and 9 cases of insufficient foreskin removal. Surgical completion of MMC was required for 9 men (2.1%). Among the closely monitored first 50 participants, the probability of complete healing by Day 42 was 0.44 (95% CI 0.30-0.58), and 0.90 by Day 56. A large majority of men were favorable toward their MMC procedure and would recommend PrePex to friends and family. CONCLUSIONS: The PrePex device was effective for MMC in Kenya, and well-accepted. The AE rate was higher than rates reported for surgical procedures there or in previous PrePex studies. Healing time is longer than following surgical circumcision. Provider experience and clearer counseling on post-placement and post-removal care should lead to lower AE rates. TRIAL REGISTRATION: ClinicalTrials.gov NCT01711411. |
Introducing a new monitoring manual for home fortification and strengthening capacity to monitor nutrition interventions
Jefferds ME , Flores-Ayala R . Matern Child Nutr 2014 11 Suppl 4 229-33 Lack of monitoring capacity is a key barrier for nutrition interventions and limits programme management, decision making and programme effectiveness in many low-income and middle-income countries. A 2011 global assessment reported lack of monitoring capacity was the top barrier for home fortification interventions, such as micronutrient powders or lipid-based nutrient supplements. A Manual for Developing and Implementing Monitoring Systems for Home Fortification Interventions was recently disseminated. It is comprehensive and describes monitoring concepts and frameworks and includes monitoring tools and worksheets. The monitoring manual describes the steps of developing and implementing a monitoring system for home fortification interventions, including identifying and engaging stakeholders; developing a programme description including logic model and logical framework; refining the purpose of the monitoring system, identifying users and their monitoring needs; describing the design of the monitoring system; developing indicators; describing the core components of a comprehensive monitoring plan; and considering factors related to stage of programme development, sustainability and scale up. A fictional home fortification example is used throughout the monitoring manual to illustrate these steps. The monitoring manual is a useful tool to support the development and implementation of home fortification intervention monitoring systems. In the context of systematic capacity gaps to design, implement and monitor nutrition interventions in many low-income and middle-income countries, the dissemination of new tools, such as monitoring manuals, may have limited impact without additional attention to strengthening other individual, organisational and systems-level capacities. |
Association of the neighborhood retail food environment with sodium and potassium intake among US adults
Greer S , Schieb L , Schwartz G , Onufrak S , Park S . Prev Chronic Dis 2014 11 E70 INTRODUCTION: High sodium intake and low potassium intake, which can contribute to hypertension and risk of cardiovascular disease, may be related to the availability of healthful food in neighborhood stores. Despite evidence linking the food environment with diet quality, the relationship between the food environment and sodium and potassium intake has not been evaluated in the United States. The modified retail food environment index (mRFEI) provides a composite measure of the retail food environment and represents the percentage of healthful-food vendors within a 0.5 mile buffer of a census tract. METHODS: We analyzed data from 8,779 participants in the National Health and Nutrition Examination Survey, 2005-2008. By using linear regression, we assessed the relationship between mRFEI and sodium intake, potassium intake, and the sodium-potassium ratio. Models were stratified by region (South and non-South) and included participant and neighborhood characteristics. RESULTS: In the non-South region, higher mRFEI scores (indicating a more healthful food environment) were not associated with sodium intake, were positively associated with potassium intake (P [trend] = .005), and were negatively associated with the sodium-potassium ratio (P [trend] = .02); these associations diminished when neighborhood characteristics were included, but remained close to statistical significance for potassium intake (P [trend] = .05) and sodium-potassium ratio (P [trend] = .07). In the South, mRFEI scores were not associated with sodium intake, were negatively associated with potassium intake (P [trend] < .001), and were positively associated with sodium-potassium ratio (P [trend] = .01). These associations also diminished after controlling for neighborhood characteristics for both potassium intake (P [trend] = .03) and sodium-potassium ratio (P [trend] = .40). CONCLUSION: We found no association between mRFEI and sodium intake. 
The association between mRFEI and potassium intake and the sodium-potassium ratio varied by region. National strategies to reduce sodium in the food supply may be most effective to reduce sodium intake. Strategies aimed at the local level should consider regional context and neighborhood characteristics. |
Characteristics of US health care providers who counsel adolescents on sports and energy drink consumption
Xiang N , Wethington H , Onufrak S , Belay B . Int J Pediatr 2014 2014 987082 OBJECTIVE: To examine the proportion of health care providers who counsel adolescent patients on sports and energy drink (SED) consumption and the association with provider characteristics. METHODS: This is a cross-sectional analysis of a survey of providers who see patients ≤17 years old. The proportion providing regular counseling on sports drinks (SDs), energy drinks (EDs), or both was assessed. Chi-square analyses examined differences in counseling based on provider characteristics. Multivariate logistic regression calculated adjusted odds ratios (aOR) for characteristics independently associated with SED counseling. RESULTS: Overall, 34% of health care providers regularly counseled on both SEDs, with 41% regularly counseling on SDs and 55% regularly counseling on EDs. On adjusted modeling, regular SED counseling was associated with female sex (aOR: 1.44 [95% CI: 1.07-1.93]), high fruit/vegetable intake (aOR: 2.05 [95% CI: 1.54-2.73]), family/general practitioners (aOR: 0.58 [95% CI: 0.41-0.82]) and internists (aOR: 0.37 [95% CI: 0.20-0.70]) versus pediatricians, and group versus individual practices (aOR: 0.59 [95% CI: 0.42-0.84]). Modeling for SD- and ED-specific counseling found similar associations with provider characteristics. CONCLUSION: The prevalence of regular SED counseling is low overall and varies by provider characteristics. Provider education on the significance of SED counseling and consumption is important. |
Mental work demands, retirement, and longitudinal trajectories of cognitive functioning
Fisher GG , Stachowski A , Infurna FJ , Faul JD , Grosch J , Tetrick LE . J Occup Health Psychol 2014 19 (2) 231-42 Age-related changes in cognitive abilities are well-documented and a very important indicator of health, functioning, and decline in later life. However, less is known about the course of cognitive functioning before and after retirement and specifically whether job characteristics during one's time of employment (i.e., higher vs. lower levels of mental work demands) moderate how cognition changes both before and after the transition to retirement. We used data from n = 4,182 (50% women) individuals in the Health and Retirement Study, a nationally representative panel study in the United States, across an 18-year time span (1992-2010). Data were linked to the O*NET occupation codes to gather information about mental job demands to examine whether job characteristics during one's time of employment moderate level and rate of change in cognitive functioning (episodic memory and mental status) both before and after retirement. Results indicated that working in an occupation characterized by higher levels of mental demands was associated with higher levels of cognitive functioning before retirement and a slower rate of cognitive decline after retirement. We controlled for a number of important covariates, including socioeconomic (education and income), demographic, and health variables. Our discussion focuses on pathways through which job characteristics may be associated with the course of cognitive functioning in relation to the important transition of retirement. Implications for job design as well as retirement are offered. |
Notes from the field: investigation of infectious disease risks associated with a nontransplant anatomical donation center - Arizona, 2014
de Perio MA , Bernard BP , Delaney LJ , Pesik N , Cohen NJ . MMWR Morb Mortal Wkly Rep 2014 63 (17) 384-5 CDC is investigating reports of potential occupational exposure to human immunodeficiency virus (HIV), hepatitis B virus (HBV), hepatitis C virus (HCV), and Mycobacterium tuberculosis among workers performing preparation and dissection procedures on human nontransplant anatomical materials at a nontransplant anatomical donation center in Arizona. CDC is working with Arizona public health officials to inform persons exposed to these potentially infected materials. Nontransplant anatomical centers around the United States process thousands of donated cadavers annually. These materials (which might be fresh, frozen, or chemically preserved) are used by universities and surgical instrument and pharmaceutical companies for medical education and research. The American Association of Tissue Banks has developed accreditation policies for nontransplant anatomical donation organizations. It also has written standards that specify exclusion criteria for donor material, as well as use of proper environmental controls and safe work practices to prevent transmission of infectious agents during receipt and handling of nontransplant anatomical materials. At the center under investigation, which is now closed, these standards might not have been consistently implemented. |
A novel algorithm for determining contact area between a respirator and a headform
Lei Z , Yang J , Zhuang Z . J Occup Environ Hyg 2014 11 (4) 227-37 The contact area, as well as the contact pressure, is created when a respiratory protection device (a respirator or surgical mask) contacts a human face. A computer-based algorithm for determining the contact area between a headform and N95 filtering facepiece respirator (FFR) was proposed. Six N95 FFRs were applied to five sizes of standard headforms (large, medium, small, long/narrow, and short/wide) to simulate respirator donning. After the contact simulation between a headform and an N95 FFR was conducted, a contact area was determined by extracting the intersection surfaces of the headform and the N95 FFR. Using computer-aided design tools, a superimposed contact area and an average contact area, which are non-uniform rational basis spline (NURBS) surfaces, were developed for each headform. Experiments that directly measured dimensions of the contact areas between headform prototypes and N95 FFRs were used to validate the simulation results. Headform sizes influenced all contact area dimensions (P < 0.0001), and N95 FFR sizing systems influenced all contact area dimensions (P < 0.05) except the left and right chin regions. The medium headform produced the largest contact area, while the large and small headforms produced the smallest. |
Effect of fiber length on carbon nanotube-induced fibrogenesis
Manke A , Luanpitpong S , Dong C , Wang L , He X , Battelli L , Derk R , Stueckle TA , Porter DW , Sager T , Gou H , Dinu CZ , Wu N , Mercer RR , Rojanasakul Y . Int J Mol Sci 2014 15 (5) 7444-7461 Given their extremely small size and light weight, carbon nanotubes (CNTs) can be readily inhaled by human lungs resulting in increased rates of pulmonary disorders, particularly fibrosis. Although the fibrogenic potential of CNTs is well established, there is a lack of consensus regarding the contribution of physicochemical attributes of CNTs on the underlying fibrotic outcome. We designed an experimentally validated in vitro fibroblast culture model aimed at investigating the effect of fiber length on single-walled CNT (SWCNT)-induced pulmonary fibrosis. The fibrogenic response to short and long SWCNTs was assessed via oxidative stress generation, collagen expression and transforming growth factor-beta (TGF-beta) production as potential fibrosis biomarkers. Long SWCNTs were significantly more potent than short SWCNTs in terms of reactive oxygen species (ROS) response, collagen production and TGF-beta release. Furthermore, our finding on the length-dependent in vitro fibrogenic response was validated by the in vivo lung fibrosis outcome, thus supporting the predictive value of the in vitro model. Our results also demonstrated the key role of ROS in SWCNT-induced collagen expression and TGF-beta activation, indicating the potential mechanisms of length-dependent SWCNT-induced fibrosis. Together, our study provides new evidence for the role of fiber length in SWCNT-induced lung fibrosis and offers a rapid cell-based assay for fibrogenicity testing of nanomaterials with the ability to predict pulmonary fibrogenic response in vivo. |
Evaluation of a diffusion charger for measuring aerosols in a workplace
Vosburgh DJ , Ku BK , Peters TM . Ann Occup Hyg 2014 58 (4) 424-36 The model DC2000CE diffusion charger from EcoChem Analytics (League City, TX, USA) has the potential to be of considerable use to measure airborne surface area concentrations of nanoparticles in the workplace. The detection efficiency of the DC2000CE relative to reference instruments was determined with monodispersed spherical particles from 54 to 565.7 nm. Surface area concentrations measured by a DC2000CE were then compared to measured and detection-efficiency-adjusted reference surface area concentrations for polydispersed aerosols (propylene torch exhaust, incense, diesel exhaust, and Arizona road dust) over a range of particle sizes that may be encountered in a workplace. The ratio of surface area concentrations measured by the DC2000CE to that measured with the reference instruments for unimodal and multimodal aerosols ranged from 0.02 to 0.52. The ratios for detection-efficiency-adjusted unimodal and multimodal surface area concentrations were closer to unity (0.93-1.19) for aerosols where the majority of the surface area was within the size range of particles used to create the correction. A detection efficiency that includes the entire size range of the DC2000CE is needed before a calibration correction for the DC2000CE can be created. For diesel exhaust, the DC2000CE retained a linear response compared to reference instruments up to 2500 mm² m⁻³, which was greater than the maximum range stated by the manufacturer (1000 mm² m⁻³). Physical limitations with regard to DC2000CE orientation, movement, and vibration were identified. Vibrating the DC2000CE while measuring aerosol concentrations may cause an increase of ~35 mm² m⁻³, whereas moving the DC2000CE may cause concentrations to be inflated by as much as 400 mm² m⁻³. Depending on the concentration of the aerosol of interest, moving or vibrating a DC2000CE during measurement should be avoided. |
Alterations in cardiomyocyte function after pulmonary treatment with stainless steel welding fume in rats
Popstojanov R , Antonini JM , Salmen R , Ye M , Zheng W , Castranova V , Fekedulegn DB , Kan H . J Toxicol Environ Health A 2014 77 (12) 705-715 Welding fume is composed of a complex mixture of different metal particulates. Pulmonary exposure to different welding fumes may exert a negative impact on cardiac function, although the underlying mechanisms remain unclear. To explore the effect of welding fumes on cardiac function, Sprague-Dawley rats were exposed by intratracheal instillation to 2 mg/rat of manual metal arc hard surfacing welding fume (MMA-HS) once per week for 7 wk. Control rats received saline. Cardiomyocytes were isolated enzymatically at d 1 and 7 postexposure. Intracellular calcium ([Ca2+]i) transients (fluorescence ratio) were measured on the stage of an inverted phase-contrast microscope using a myocyte calcium imaging/cell length system. Phosphorylation levels of cardiac troponin I (cTnI) were determined by Western blot. The levels of the nonspecific inflammatory marker C-reactive protein (CRP) and the proinflammatory cytokine interleukin-6 (IL-6) in serum were measured by enzyme-linked immunosorbent assay (ELISA). Contraction of isolated cardiomyocytes was significantly reduced at d 1 and d 7 postexposure. Intracellular calcium levels were decreased in response to extracellular calcium stimulation at d 7 postexposure. Changes of intracellular calcium levels after isoprenaline hydrochloride (ISO) stimulation were not markedly different between groups at either time point. Phosphorylation levels of cTnI in the left ventricle were significantly lower at d 1 postexposure. Serum levels of CRP were not markedly different between groups at either time point. Serum levels of IL-6 were not detectable in either group. Cardiomyocyte alterations observed after welding fume treatment were mainly due to alterations in intracellular calcium handling and phosphorylation levels of cTnI. |
Inference of strata separation and gas emission paths in longwall overburden using continuous wavelet transform of well logs and geostatistical simulation
Karacan CO , Olea RA . J Appl Geophy 2014 105 147-158 Prediction of potential methane emission pathways from various sources into active mine workings or sealed gobs from longwall overburden is important for controlling methane and for improving mining safety. The aim of this paper is to infer strata separation intervals and thus gas emission pathways from standard well log data. The proposed technique was applied to well logs acquired through the Mary Lee/Blue Creek coal seam of the Upper Pottsville Formation in the Black Warrior Basin, Alabama, using well logs from a series of boreholes aligned along a nearly linear profile. For this purpose, continuous wavelet transform (CWT) of digitized gamma well logs was performed using Mexican hat and Morlet as the mother wavelets to identify potential discontinuities in the signal. Pointwise Holder exponents (PHE) of gamma logs were also computed using the generalized quadratic variations (GQV) method to identify the location and strength of singularities of well log signals as a complementary analysis. PHEs and wavelet coefficients were analyzed to find the locations of singularities along the logs. Using the well logs in this study, locations of predicted singularities were used as indicators in single normal equation simulation (SNESIM) to generate equi-probable realizations of potential strata separation intervals. Horizontal and vertical variograms of realizations were then analyzed and compared with those of indicator data and training image (TI) data using the Kruskal-Wallis test. A sum of squared differences was employed to select the most probable realization representing the locations of potential strata separations and methane flow paths. Results indicated that singularities located in well log signals reliably correlated with strata transitions or discontinuities within the strata. 
Geostatistical simulation of these discontinuities provided information about the location and extents of the continuous channels that may form during mining. If there is a gas source within their zone of influence, paths may develop and allow methane movement towards sealed or active gobs under pressure differentials. Knowledge gained from this research will better prepare mine operations for potential methane inflows, thus improving mine safety. |
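The CWT step described above slides a mother wavelet across the log over a range of scales; large coefficient magnitudes flag singularities such as strata transitions. A minimal NumPy sketch using a Mexican hat wavelet on a synthetic gamma log (the step location and all names are illustrative, not from the study):

```python
import numpy as np

def ricker(points, a):
    """Mexican hat (Ricker) wavelet sampled at `points` positions, width a."""
    t = np.arange(points) - (points - 1) / 2.0
    amp = 2 / (np.sqrt(3 * a) * np.pi ** 0.25)
    return amp * (1 - (t / a) ** 2) * np.exp(-(t ** 2) / (2 * a ** 2))

def cwt(signal, widths):
    """Continuous wavelet transform: convolve the signal with the wavelet
    at each width; rows index scale, columns index position (depth)."""
    out = np.empty((len(widths), len(signal)))
    for i, a in enumerate(widths):
        wavelet = ricker(min(10 * int(a), len(signal)), a)
        out[i] = np.convolve(signal, wavelet, mode="same")
    return out

# Hypothetical gamma log: a sharp lithologic step at sample 100
log = np.ones(200)
log[100:] = 3.0
coeffs = cwt(log, widths=np.arange(1, 8))

# Large |coefficients| cluster around the discontinuity; restrict to
# interior samples to ignore convolution edge effects at the boundaries
row = np.abs(coeffs[3])
peak = int(np.argmax(row[20:180])) + 20
```

In a real workflow the flagged depths would feed the SNESIM indicator simulation; here the peak index simply recovers the synthetic discontinuity near sample 100.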
Occurrence and molecular characterization of Cryptosporidium spp. in yaks (Bos grunniens) in China.
Ma J , Cai J , Ma J , Feng Y , Xiao L . Vet Parasitol 2014 202 113-8 Compared with dairy and beef cattle, few data are available on the occurrence and distribution of Cryptosporidium species in yaks, which live in a very different habitat. In this study, 327 fecal specimens were collected from yaks in 4 counties in Qinghai Province of China and screened for Cryptosporidium by nested PCR analysis of the 18S rRNA gene. A total of 98 (30.0%) specimens were positive for Cryptosporidium. The occurrence of Cryptosporidium varied significantly among age groups; infection rates were 49.3% in weaned calves, 31.7% in yearlings, and 17.4% in adults. PCR products of all Cryptosporidium-positive specimens were successfully sequenced, with 56 specimens (57.1%) having C. bovis, 33 (33.7%) having C. ryanae, 2 (2.0%) having C. andersoni, 1 (1.0%) having C. ubiquitum, 1 (1.0%) having C. xiaoi, 2 (2.0%) having a novel genotype, and 3 (3.1%) having mixed infections of C. bovis and C. ryanae. There were some age-related differences in the distribution of Cryptosporidium species among the post-weaned yaks examined. To our knowledge, this is the first report of C. andersoni, C. ubiquitum, C. xiaoi and a novel Cryptosporidium genotype in yaks. |
Trichomonas vaginalis metronidazole resistance is associated with single nucleotide polymorphisms in the nitroreductase genes ntr4Tv and ntr6Tv
Paulish-Miller TE , Augostini P , Schuyler JA , Smith WL , Mordechai E , Adelson ME , Gygax SE , Secor WE , Hilbert DW . Antimicrob Agents Chemother 2014 58 (5) 2938-43 Metronidazole resistance in the sexually transmitted parasite Trichomonas vaginalis is a problematic public health issue. We have identified single nucleotide polymorphisms (SNPs) in two nitroreductase genes (ntr4Tv and ntr6Tv) associated with resistance. These SNPs were associated with one of two distinct T. vaginalis populations identified by multilocus sequence typing, yet one SNP (ntr6Tv A238T), which results in a premature stop codon, was associated with resistance independent of population structure and may be of diagnostic value. |
Trends and seasonal variation in outpatient antibiotic prescription rates in the United States, 2006 to 2010
Suda KJ , Hicks LA , Roberts RM , Hunkler RJ , Taylor TH . Antimicrob Agents Chemother 2014 58 (5) 2763-6 Antibiotic-resistant bacteria are an increasing threat to the effectiveness of antibiotics. The majority of antibiotics are prescribed in primary care settings for upper respiratory tract infections. The purpose of this study was to describe seasonal trends in outpatient antibiotic prescriptions (Rx) in the United States over a 5-year period. This study was a retrospective, cross-sectional observation of systemic antibiotic prescriptions in the outpatient setting from 2006 to 2010. Winter months were defined as the first and fourth quarters of the calendar year. Antibiotic prescribing rates were calculated (prescriptions/1,000 population) using annual U.S. Census Bureau population data. Over 1.34 billion antibiotic prescriptions were dispensed over the 5-year period. The antibiotic prescription rate decreased from 892 Rx/1,000 population in 2006 to 867 Rx/1,000 population in 2010. Penicillins and macrolides were the primary antibiotic classes prescribed, but penicillin prescribing decreased while macrolide prescribing increased over the study period. Overall, antibiotic prescriptions were 24.5% higher in winter months than in the summer, with the largest difference (28.8%) in 2008 and the smallest (20.4%) in 2010. This seasonality was consistently drug-class dependent, driven by 75% and 100% increases in penicillin and macrolide prescriptions, respectively, in the winter months. The mean outpatient antibiotic prescription rate decreased in the United States from 2006 to 2010. More outpatient antibiotic prescribing, driven predominantly by the macrolide and penicillin classes, was observed in the winter months. Understanding annual variability in antibiotic use can assist with designing interventions to improve the judicious use of antibiotics. |
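The rate and seasonality calculations described in the abstract above can be sketched as follows. The prescription and population counts below are illustrative stand-ins, not the study's data; only the resulting 892 Rx/1,000 and 24.5% figures come from the abstract.

```python
def rx_rate_per_1000(prescriptions: int, population: int) -> float:
    """Antibiotic prescribing rate: prescriptions per 1,000 population."""
    return prescriptions / population * 1000

def winter_excess_pct(winter_rx: int, summer_rx: int) -> float:
    """Percent by which winter prescriptions exceed summer prescriptions."""
    return (winter_rx - summer_rx) / summer_rx * 100

# Illustrative counts chosen to reproduce the reported 2006 rate of 892 Rx/1,000
rate_2006 = rx_rate_per_1000(prescriptions=267_600_000, population=300_000_000)
print(round(rate_2006))  # 892

# A 24.5% winter excess corresponds to e.g. 124,500 winter vs 100,000 summer Rx
print(round(winter_excess_pct(124_500, 100_000), 1))  # 24.5
```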
Effects of Massachusetts health reform on the use of clinical preventive services
Okoro CA , Dhingra SS , Coates RJ , Zack M , Simoes EJ . J Gen Intern Med 2014 29 (9) 1287-95 BACKGROUND: Expansion of health insurance coverage, and hence clinical preventive services (CPS), provides an opportunity for improvements in the health of adults. The degree to which expansion of health insurance coverage affects the use of CPS is unknown. OBJECTIVE: To assess whether Massachusetts health reform was associated with changes in healthcare access and use of CPS. DESIGN: We used a difference-in-differences framework to examine change in healthcare access and use of CPS among working-aged adults pre-reform (2002-2005) and post-reform (2007-2010) in Massachusetts compared with change in other New England states (ONES). SETTING: Population-based, cross-sectional Behavioral Risk Factor Surveillance System surveys. PARTICIPANTS: A total of 208,831 survey participants aged 18 to 64 years. INTERVENTION: Massachusetts health reform enacted in 2006. MEASUREMENTS: Four healthcare access outcome measures and five CPS. KEY RESULTS: The proportions of adults who had health insurance coverage, a healthcare provider, no cost barrier to healthcare, an annual routine checkup, and a colorectal cancer screening increased significantly more in Massachusetts than in the ONES. In Massachusetts, the prevalence of cervical cancer screening in pre-reform and post-reform periods was about the same; however, the ONES had a decrease of -1.6 percentage points (95% confidence interval [CI] -2.5, -0.7; p <0.001). As a result, the prevalence of cervical cancer screening in Massachusetts increased relative to the ONES (1.7, 95% CI 0.2, 3.2; p = 0.02). Cholesterol screening, influenza immunization, and breast cancer screening did not improve more in Massachusetts than in the ONES. LIMITATIONS: Data are self-reported. CONCLUSIONS: Health reform may increase healthcare access and improve use of CPS. However, the effects of health reform on CPS use may vary by type of service and by state. |
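The difference-in-differences framework described in the abstract above can be sketched in a few lines. The pre- and post-reform screening prevalences below are hypothetical; only the resulting 1.7 percentage-point difference and the -1.6-point ONES change come from the abstract.

```python
def did(ma_pre: float, ma_post: float, ones_pre: float, ones_post: float) -> float:
    """Difference-in-differences in percentage points:
    (change in Massachusetts) minus (change in other New England states)."""
    return (ma_post - ma_pre) - (ones_post - ones_pre)

# Cervical cancer screening: MA roughly flat, ONES down 1.6 points
# (prevalence values are hypothetical)
print(round(did(ma_pre=85.0, ma_post=85.1, ones_pre=84.0, ones_post=82.4), 1))  # 1.7
```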
Addressing disparities in the health of American Indian and Alaska Native people: the importance of improved public health data
Bauer UE , Plescia M . Am J Public Health 2014 104 Suppl 3 S255-7 Chronic diseases and injuries are now the greatest threat to health in the 21st century. Racial and ethnic disparities in health status, largely attributable to chronic diseases, are widely recognized as a priority public health and civil rights challenge. The articles in this supplement of the American Journal of Public Health document the substantial burden of disease borne by American Indian and Alaska Native (AI/AN) people. Addressing these issues should continue to be a major priority for public health, amplified in urgency by the legacy of social, environmental and cultural injustices that have been inflicted on these populations. |
The African Health Profession Regulatory Collaborative (ARC) at two years
McCarthy CF , Zuber A , Kelley MA , Verani AR , Riley PL . Afr J Midwifery Womens Health 2014 8 4-9 BACKGROUND: The African Health Profession Regulatory Collaborative (ARC) for nurses and midwives was created in response to the increasing reliance on shifting HIV tasks to nurses and midwives without the necessary regulation supporting this enhanced professional role. ARC APPROACH: The ARC initiative comprises regional meetings, technical assistance, and regulatory improvement grants which enhance HIV service delivery by nurses and midwives, and systematic evaluation of project impact. RESULTS: Eight of 11 countries funded by ARC advanced a full stage in regulatory capacity during their 1-year project period. Countries in ARC also demonstrated increased capacity in project management and proposal writing. DISCUSSION: The progress of country teams thus far suggests ARC is a successful model for regulation strengthening and capacity building, as well as presenting a novel approach for sustainability and country ownership. The ARC platform has been a successful vehicle for regional harmonisation of updated regulations and promises to help facilitate the enhancement of HIV service delivery by nurses and midwives. |
Assessing the contributions of an academic health department for a school of public health in New York State
Eidson M , Pradhan E , Morse D . Public Health Rep 2014 129 (3) 296-302 The Institute of Medicine has released key reports that address issues of public health, training, and collaboration.1–3 During the past two decades, efforts have been made to bridge the gap between academic and public health practice, including grants from the Centers for Disease Control and Prevention (CDC) through the Association of Schools and Programs of Public Health (ASPPH).4 ASPPH has issued reports on demonstrating excellence using the practice-based model for teaching,5 service,6 and research.7 ASPPH continues to assist schools of public health (SPHs) through its ASPPH Practice Council and the Academic Public Health Practice Committee, which comprises deans of SPHs. | Academic/health department partnerships have been used to facilitate public health preparedness,8–11 chronic disease epidemiology,12 and response to outbreaks.13,14 One important area for such partnerships is the establishment of field-based internships.5 Surveys of SPHs and state health agencies in the 1990s provided a broad summary of such partnerships at that time.15,16 | A health department such as the New York State Department of Health (NYSDOH) that has a formal affiliation with an academic institution is referred to as an academic health department (AHD).17 The University at Albany School of Public Health (UA-SPH) was created in 1985 by a memorandum of understanding (MOU) between NYSDOH and UA. Faculty members were initially primarily NYSDOH staff. Over time, full-time UA-funded staff were hired. The partnership was partially supported through ASPPH grants. This model was described in the ASPPH “Strong Schools, Strong Partners” report.18 The history of the MOU and the UA-SPH/NYSDOH AHD partnership has been detailed elsewhere.19 NYSDOH does not require its staff to apply for UA-SPH faculty status, and all staff involvement with UA-SPH is voluntary. The MOU does not specify the type, schedule, or level of effort for NYSDOH faculty contributions. However, UA-SPH and the individual academic departments have guidelines that apply to faculty applications and renewals. The MOU states that NYSDOH staff cannot receive a salary or bonuses for their UA-SPH affiliations. |
Developing a Continuing Professional Development Program to Improve Nursing Practice in Lesotho
Moetsana-Poka F , Lehana T , Lebaka M , McCarthy CF . Afr J Midwifery Womens Health 2014 8 10-13 INTRODUCTION: In 2010, the Lesotho government required professional regulatory bodies to enforce continuing professional development (CPD) amongst their members. The Lesotho Nursing Council (LNC) sought financial and technical support from the African Health Profession Regulatory Collaborative (ARC) to develop a national CPD programme. METHODS: Through a step-by-step process of designing a CPD framework, engaging key stakeholders, and pilot testing CPD implementation, the LNC and other national nursing and midwifery leaders drafted a nursing and midwifery CPD framework. CONCLUSIONS: The Lesotho nursing leaders developed the first health professional CPD programme in the country. Development of a CPD programme in Lesotho has reconciled nursing practice with the legislative standards governing the workforce. |
Accuracy of assisted reproductive technology information on birth certificates: Florida and Massachusetts, 2004-06
Cohen B , Bernson D , Sappenfield W , Kirby RS , Kissin D , Zhang Y , Copeland G , Zhang Z , Macaluso M . Paediatr Perinat Epidemiol 2014 28 (3) 181-90 BACKGROUND: Assisted Reproductive Technology (ART) includes fertility procedures in which both egg and sperm are handled in the laboratory. ART use has increased considerably in recent years, accounting for 47 090 livebirths in the US in 2010. ART increases the probability of multiple gestation births, which are at higher risks than singletons for adverse outcomes. Additionally, ART is associated with a greater risk of complications during pregnancy, labour, and delivery, and increased risk of adverse perinatal outcomes in singleton births. METHODS: We merged Florida and Massachusetts birth records from 2004-06 with the National ART Surveillance System (NASS) and, using NASS as the gold standard, calculated sensitivity, specificity, and positive predictive value (PPV) of ART reporting on the birth certificates by maternal, infant, and hospital characteristics. We fit random-effects logistic regression models to evaluate simultaneously the association of ART reporting with these predictors while accounting for correlation among births occurring in the same hospital. RESULTS: Sensitivity of ART reporting on the birth certificate was 28.9% in Florida and 41.4% in Massachusetts. Specificity was >99% in both states. PPV was 45.5% in Florida and 54.6% in Massachusetts. The odds of ART reporting varied by state and by several maternal and delivery characteristics including age, parity, history of fetal loss, plurality, race/Hispanic ethnicity, delivery payment source, pre-existing conditions, and complications during pregnancy or labour and delivery. CONCLUSIONS: There was significant under-reporting of ART procedures on the birth certificates. Using data on ART births identified only from birth certificates yields a biased sample of the population of ART births. |
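The validation metrics used in the abstract above follow directly from a 2x2 table of birth-certificate ART reporting against the NASS gold standard. The counts below are hypothetical, chosen only to reproduce Massachusetts' reported 41.4% sensitivity and 54.6% PPV; they are not the study's data.

```python
def sensitivity(tp: int, fn: int) -> float:
    """Fraction of true ART births flagged ART on the certificate."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """Fraction of non-ART births correctly left unflagged."""
    return tn / (tn + fp)

def ppv(tp: int, fp: int) -> float:
    """Fraction of certificate-flagged ART births that are truly ART."""
    return tp / (tp + fp)

# Hypothetical counts: true/false positives and negatives vs NASS
tp, fp, fn, tn = 414, 344, 586, 98_656
print(f"sensitivity={sensitivity(tp, fn):.1%}")  # sensitivity=41.4%
print(f"specificity={specificity(tn, fp):.2%}")
print(f"PPV={ppv(tp, fp):.1%}")
```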
Disparities in current cigarette smoking prevalence by type of disability, 2009-2011
Courtney-Long E , Stevens A , Caraballo R , Ramon I , Armour BS . Public Health Rep 2014 129 (3) 252-260 OBJECTIVES: Smoking, the leading cause of disease and death in the United States, has been linked to a number of health conditions including cancer and cardiovascular disease. While people with a disability have been shown to be more likely to report smoking, little is known about the prevalence of smoking by type of disability, particularly for adults younger than 50 years of age. METHODS: We used data from the 2009-2011 National Health Interview Survey to estimate the prevalence of smoking by type of disability and to examine the association of functional disability type and smoking among adults aged 18-49 years. RESULTS: Adults with a disability were more likely than adults without a disability to be current smokers (38.8% vs. 20.7%, p<0.001). Among adults with disabilities, the prevalence of smoking ranged from 32.4% (self-care difficulty) to 43.8% (cognitive limitation). When controlling for sociodemographic characteristics, having a disability was associated with statistically significantly higher odds of current smoking (adjusted odds ratio = 1.57, 95% confidence interval 1.40, 1.77). CONCLUSIONS: The prevalence of current smoking for adults was higher for every functional disability type than for adults without a disability. By understanding the association between smoking and disability type among adults younger than 50 years of age, resources for cessation services can be better targeted during the ages when increased time for health improvement can occur. |
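A crude (unadjusted) odds ratio can be computed directly from the smoking prevalences reported in the abstract above; note that the study's 1.57 estimate is adjusted for sociodemographic characteristics, so the crude value differs.

```python
def odds(p: float) -> float:
    """Convert a prevalence (proportion) to odds."""
    return p / (1 - p)

def odds_ratio(p_exposed: float, p_unexposed: float) -> float:
    """Crude odds ratio comparing two group prevalences."""
    return odds(p_exposed) / odds(p_unexposed)

# Current smoking: 38.8% with a disability vs 20.7% without
print(round(odds_ratio(0.388, 0.207), 2))  # 2.43 (crude, not the adjusted 1.57)
```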
Potentially preventable deaths from the five leading causes of death - United States, 2008-2010
Yoon PW , Bastian B , Anderson RN , Collins JL , Jaffe HW . MMWR Morb Mortal Wkly Rep 2014 63 (17) 369-74 In 2010, the top five causes of death in the United States were 1) diseases of the heart, 2) cancer, 3) chronic lower respiratory diseases, 4) cerebrovascular diseases (stroke), and 5) unintentional injuries. The rates of death from each cause vary greatly across the 50 states and the District of Columbia (2). An understanding of state differences in death rates for the leading causes might help state health officials establish disease prevention goals, priorities, and strategies. States with lower death rates can be used as benchmarks for setting achievable goals and calculating the number of deaths that might be prevented in states with higher rates. To determine the number of premature annual deaths for the five leading causes of death that potentially could be prevented ("potentially preventable deaths"), CDC analyzed National Vital Statistics System mortality data from 2008-2010. The number of annual potentially preventable deaths per state before age 80 years was determined by comparing the number of expected deaths (based on average death rates for the three states with the lowest rates for each cause) with the number of observed deaths. The results of this analysis indicate that, when considered separately, 91,757 deaths from diseases of the heart, 84,443 from cancer, 28,831 from chronic lower respiratory diseases, 16,973 from cerebrovascular diseases (stroke), and 36,836 from unintentional injuries potentially could be prevented each year. In addition, states in the Southeast had the highest number of potentially preventable deaths for each of the five leading causes. The findings provide disease-specific targets that states can use to measure their progress in preventing the leading causes of deaths in their populations. |
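The benchmarking method described in the abstract above can be sketched as follows: expected deaths come from the average death rate of the three lowest-rate states, and potentially preventable deaths are observed minus expected. The state population, death count, and benchmark rates below are hypothetical.

```python
def potentially_preventable(observed: int, population: int,
                            benchmark_rates: list[float]) -> int:
    """Observed deaths minus deaths expected at the benchmark rate.

    benchmark_rates: death rates (per 100,000) of the three states with
    the lowest rates for the cause; floored at zero for benchmark states.
    """
    benchmark = sum(benchmark_rates) / len(benchmark_rates)
    expected = benchmark / 100_000 * population
    return max(0, round(observed - expected))

# Hypothetical state: 5,000 observed heart-disease deaths, population 4 million,
# benchmark rates of 90, 95, and 100 per 100,000 (average 95)
print(potentially_preventable(5_000, 4_000_000, [90, 95, 100]))  # 1200
```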
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Epidemiology and Surveillance
- Food Safety
- Health Behavior and Risk
- Healthcare Associated Infections
- Immunity and Immunization
- Informatics
- Injury and Violence
- Laboratory Sciences
- Maternal and Child Health
- Medicine
- Nutritional Sciences
- Occupational Safety and Health
- Occupational Safety and Health - Mining
- Parasitic Diseases
- Public Health Leadership and Management
- Reproductive Health
- Substance Use and Abuse
- Vital Statistics
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed: Feb 1, 2024
- Page last updated: Sep 03, 2024