Updated recommendations for client- and provider-oriented interventions to increase breast, cervical, and colorectal cancer screening
Community Preventive Services Task Force , Sabatino SA . Am J Prev Med 2012 43 (1) 92-6 The Community Preventive Services Task Force (Task Force) recommends increasing screening for breast cancer through use of group education, one-on-one education, client reminders, reducing client out-of-pocket costs, and provider assessment and feedback; increasing screening for cervical cancer through use of one-on-one education, client reminders, and provider assessment and feedback; and increasing screening for colorectal cancer through use of one-on-one education, client reminders, reducing structural barriers to screening, and provider assessment and feedback. The Task Force found insufficient evidence to determine the effectiveness of increasing screening for breast cancer through use of client incentives, mass media, or provider incentives; for cervical cancer screening through use of group education, client incentives, mass media, reducing client out-of-pocket costs, reducing structural barriers, or provider incentives; and for colorectal cancer screening through use of group education, client incentives, mass media, reducing client out-of-pocket costs, or provider incentives. Details of these findings, and some considerations for use, are provided in this article. |
A national survey of hemochromatosis patients
Mainous AG 3rd, Knoll ME, Everett CJ, Hulihan MM, Grant AM, Garrison C, Koenig G, Sayers C, Allen KW. J Am Board Fam Med 2012 25 (4) 432-6 BACKGROUND: Hereditary hemochromatosis (HH) is a common genetic disease in the United States, but little is known about the diagnosis from the patient's perspective. The purpose of this study was to characterize the circumstances surrounding the diagnosis of HH and assess treatments and health information needs. METHODS: We surveyed US adults aged 18 years and older who were diagnosed with HH after 1996. The response rate was 46%, with a total sample size of 979. Respondents were asked about the use of genetic and clinical markers in their diagnosis, current treatments, and health information needs. RESULTS: Results were stratified by age, education, and income status. A total of 90.0% of women and 75.5% of men were genetically tested for HH (P < .01). Approximately half (52.5%) were diagnosed by a gastroenterologist, hematologist, or other specialty physician, and half were diagnosed by a primary care provider. Most of the respondents thought their HH had improved with the initial treatment, and most patients were still receiving treatment for HH. Patient interest in learning more about specific hemochromatosis topics was generally high. CONCLUSIONS: Since the introduction of genetic identification of HH, these tests have been used in the diagnosis of the majority of patients. Primary care physicians may need to be more aware of HH and strategies for its diagnosis. |
Effectiveness of interventions to increase screening for breast, cervical, and colorectal cancers: nine updated systematic reviews for the Guide to Community Preventive Services
Sabatino SA, Lawrence B, Elder R, Mercer SL, Wilson KM, Devinney B, Melillo S, Carvalho M, Taplin S, Bastani R, Rimer BK, Vernon SW, Melvin CL, Taylor V, Fernandez M, Glanz K. Am J Prev Med 2012 43 (1) 97-118 CONTEXT: Screening reduces mortality from breast, cervical, and colorectal cancers. The Guide to Community Preventive Services previously conducted systematic reviews on the effectiveness of 11 interventions to increase screening for these cancers. This article presents results of updated systematic reviews for nine of these interventions. EVIDENCE ACQUISITION: Five databases were searched for studies published during January 2004-October 2008. Studies had to (1) be a primary investigation of one or more of the intervention categories; (2) be conducted in a country with a high-income economy; (3) provide information on at least one cancer screening outcome of interest; and (4) include screening use prior to intervention implementation or a concurrent group unexposed to the intervention category of interest. Forty-five studies were included in the reviews. EVIDENCE SYNTHESIS: Recommendations were added for one-on-one education to increase screening with fecal occult blood testing (FOBT) and group education to increase mammography screening. Strength of evidence for client reminder interventions to increase FOBT screening was upgraded from sufficient to strong. Previous findings and recommendations for reducing out-of-pocket costs (breast cancer screening); provider assessment and feedback (breast, cervical, and FOBT screening); one-on-one education and client reminders (breast and cervical cancer screening); and reducing structural barriers (breast cancer and FOBT screening) were reaffirmed or unchanged. Evidence remains insufficient to determine effectiveness for the remaining screening tests and intervention categories. 
CONCLUSIONS: Findings indicate new and reaffirmed interventions effective in promoting recommended cancer screening, including colorectal cancer screening. Findings can be used in community and healthcare settings to promote recommended care. Important research gaps also are described. |
First things first: protecting children with asthma from infection with influenza
Garbe PL, Callahan DB, Lu PJ, Euler GL. Am J Respir Crit Care Med 2012 185 (12) i-ii Currently in the U.S., approximately 7 million children (9.4%) have asthma (1), making it the most prevalent serious chronic illness among U.S. children. Clinically, the association of viral respiratory infections and asthma exacerbations has been understood for decades. More recently, infections with particular viruses have been identified as being particularly risky: respiratory syncytial virus, rhinovirus, and influenza virus are notable examples. In the spring of 2009, a new influenza virus (A(H1N1)pdm09 [2009 H1N1]) with pandemic potential was isolated from patients in the U.S. and around the world (2). Early data indicated that certain comorbid medical conditions increased the risk for hospitalization and intensive care unit admission (3). Persons with asthma appeared to bear a disproportionate risk, and local and state health departments along with the Centers for Disease Control and Prevention (CDC) developed and disseminated guidance early in the outbreak for persons with asthma and their health care providers. Early diagnosis and use of antiviral medication, along with public health practices like social distancing and hand-washing, were emphasized. Persons with comorbid conditions (including asthma) were prioritized to receive vaccine once it became available. These recommendations, however, were largely reiterations of existing practices and policies rather than de novo interventions. Consistent with previous recommendations, vaccination of persons with asthma was intended to prevent influenza because of the risk of increased disease severity, rather than an increased risk of becoming infected with influenza virus. Analysis of existing data did not, at that point in time, support (nor refute) an increased risk of infection among persons with asthma. |
Asthma prevalence among US elderly by age groups: age still matters
Oraka E , Kim HJ , King ME , Callahan DB . J Asthma 2012 49 (6) 593-599 OBJECTIVE: For over three decades, the greatest burden of asthma deaths has occurred among persons aged 65 years and older. This study analyzed the association between increasing age and asthma prevalence among age groups within the US elderly population. METHODS: We analyzed aggregated data on 54,485 civilian, noninstitutionalized US adults aged 65 years and older from the 2001-2010 National Health Interview Survey (NHIS). We estimated the prevalence of current asthma, lifetime asthma, and chronic obstructive pulmonary disease (COPD) among US elderly by 5-year age groups and age stages ("young elderly" aged 65-84 years and "oldest old" aged ≥85 years). We calculated adjusted odds ratios (AOR) and 95% confidence intervals (CI) to identify asthma prevalence patterns among elderly populations. RESULTS: From 2001 to 2010, the estimated average annual prevalence of current asthma among US elderly was 7.0%. Estimates of lifetime asthma, COPD, and co-occurring current asthma and COPD were 9.9%, 9.7%, and 3.0%, respectively. Prevalence of asthma decreased with advancing age while prevalence of COPD increased with advancing age. When controlling for study variables and significant interactions (p = .05) with COPD, the odds of reporting current asthma decreased with advancing age: 0.87 (95% CI, 0.76-1.01) for 70- to 74-year-olds; 0.76 (95% CI, 0.66-0.87) for 75- to 79-year-olds; 0.62 (95% CI, 0.51-0.75) for 80- to 84-year-olds; and 0.45 (95% CI, 0.36-0.55) for ≥85-year-olds, as compared to 65- to 69-year-olds. CONCLUSIONS: Asthma continues to affect a substantial proportion of the US elderly population. Increased diagnosis of COPD may overshadow correct diagnosis and treatment in populations with advancing age. Treatment guidelines should focus on preventable risk behaviors to increase the quality of life within this population. |
Management and control of varicella on cruise ships: a collaborative approach to promoting public health
Cramer EH , Slaten DD , Guerreiro A , Robbins D , Ganzon A . J Travel Med 2012 19 (4) 226-32 BACKGROUND: In most years varicella is the vaccine-preventable disease most frequently reported to Centers for Disease Control and Prevention (CDC) by cruise ships. Since 2005, CDC has received numerous isolated case reports of varicella among crew members and has investigated varicella outbreaks aboard vessels sailing into and from US seaports. METHODS: CDC investigators reviewed electronic varicella case reports from 2005 to 2009 and outbreak reports from 2009 to characterize the response and control efforts implemented by cruise ships in accordance with CDC protocols. Outbreak reports from 2009 were manually reviewed for details of case identification, contact investigations, isolation and restriction of cases and contacts, respectively, and number of contacts administered varicella vaccine post-exposure by cruise lines. RESULTS: During 2005 to 2009, cruise ships reported 278 cases of varicella to CDC among predominantly male (80%) crew members, three-quarters of whom were residents of Caribbean countries, Indonesia, the Philippines, or India, and whose median age was 29 years. Cases were more commonly reported during spring and winter months. During 2009, cruise ships reported 94 varicella cases among crew members of which 66 (70%) were associated with 18 reported varicella outbreaks. Outbreak response included isolation of 66 (100%) of 66 cases, restriction of 66 (26%) of 255 crew-contacts, and administration of post-exposure vaccine to 522 close contacts and other susceptible crew members per standard CDC recommendations. DISCUSSION: Most cases reported to CDC during 2005 to 2009 were among non-US resident crew members. Overall, cruise lines sailing into North America have the onboard capability to manage varicella cases and outbreaks and appear responsive to CDC recommendations. 
Cruise lines should continue to implement CDC-recommended response protocols to curtail outbreaks rapidly and should consider whether pre-placement varicella immunity screening and vaccination of crew members is a cost-effective option for their respective fleet operations. |
Measles, rubella, and varicella among the crew of a cruise ship sailing from Florida, United States, 2006
Mitruka K , Felsen CB , Tomianovic D , Inman B , Street K , Yambor P , Reef SE . J Travel Med 2012 19 (4) 233-7 BACKGROUND: Cruise ship outbreaks of vaccine-preventable diseases (VPD) such as rubella and varicella have been previously associated with introduction and spread among susceptible crew members originating from countries with endemic transmission of these diseases. METHODS: During February to April 2006, we investigated a cluster of rash illnesses due to measles, rubella, or varicella on a cruise ship sailing from Florida to the Caribbean. Case-finding measures included review of medical logs, active surveillance for rash illness among crew members, and passive surveillance for rash illness in the ship's infirmary lasting two incubation periods from the last case of measles. Passengers with potential exposure to these VPD were notified by letters. All susceptible crew members with potential exposure were administered the measles, mumps, and rubella vaccine after informed consent. RESULTS: A total of 16 cases were identified only among crew members: 1 rubella, 3 measles (two-generation spread), 11 varicella (three-generation spread), and 1 unknown diagnosis. Of 1,197 crew members evaluated, 4 had proof of immunity to measles and rubella. Based on passive surveillance, no cases were identified among passengers, the majority of whom resided in the United States. CONCLUSION: The international makeup of the population aboard cruise ships combined with their semi-enclosed environment has the potential to facilitate introduction and spread of VPD such as measles, rubella, and varicella onboard and into communities. Cruise lines should ensure crew members have evidence of immunity to these diseases. Passengers should be up to date with all vaccinations, including those that are travel-specific, prior to embarking on cruise travel. |
Norovirus outbreak of probable waterborne transmission with high attack rate in a Guatemalan resort
Arvelo W, Sosa SM, Juliao P, Lopez MR, Estevez A, Lopez B, Morales-Betoulle ME, Gonzalez M, Gregoricus NA, Hall AJ, Vinje J, Parashar U, Lindblade KA. J Clin Virol 2012 55 (1) 8-11 BACKGROUND: In February 2009, a group of Guatemalan school children developed acute gastroenteritis (AGE) after participating in a school excursion. OBJECTIVES: We conducted a retrospective cohort investigation to characterize the outbreak and guide control measures. STUDY DESIGN: A case was defined as an illness with onset of diarrhea or vomiting during February 25-March 5, 2009. Participants were interviewed using a standardized questionnaire, and stool specimens were collected. We inspected the excursion site and tested water samples for total coliforms and Escherichia coli. RESULTS: We identified 119 excursion participants, of whom 92 (77%) had been ill. Fifty-six (62%) patients sought care for their illness, and three (3%) were hospitalized. Eighteen (90%) of the 20 specimens from ill children tested positive for norovirus. Among these, 16 (89%) were of genogroup I (GI.7) and two (11%) were genogroup II (GII.12 and GII.17). One (8%) of the 12 food handlers had norovirus (GI.7). Drinking water samples contained total coliforms at 146 most probable number (MPN)/100 ml and E. coli at 5 MPN/100 ml. CONCLUSION: We describe the first laboratory-confirmed norovirus outbreak in Guatemala. The high illness attack rate, detection of multiple norovirus strains in sick persons, and presence of fecal contamination of drinking water indicate likely waterborne transmission. |
Oseltamivir-resistant 2009 H1N1 influenza pneumonia during therapy in a renal transplant recipient
Shetty AK, Ross GA, Pranikoff T, Gubareva LV, Sechrist C, Guirand DM, Abramson J, Lin JJ. Pediatr Transplant 2012 16 (5) E153-7 The emergence of oseltamivir-resistant 2009 H1N1 influenza virus (conferred by the H275Y substitution in NA) during therapy or prophylaxis in immunocompromised patients is a serious concern. The optimal therapy for immunosuppressed patients with oseltamivir-resistant 2009 H1N1 influenza virus is unknown and few options exist. We report a 10-yr-old kidney transplant recipient who was hospitalized with oseltamivir-resistant 2009 H1N1 influenza pneumonia complicated by severe respiratory failure, ARDS, and renal failure requiring institution of ECMO and CRRT. On presentation, treatment with oseltamivir (second course) and broad-spectrum antibiotics was initiated. Immunosuppressive agents were stopped on hospital day (d) 2. On hospital d 7, given his critical status, immunocompromised state, and difficulty in obtaining intravenous zanamivir, after obtaining ethical approval and parental consent, he was treated with intravenous peramivir (through an Emergency Investigational New Drug Application) for two wk. He tolerated the regimen well and his clinical status improved gradually. Several factors may have contributed to virus clearance and survival, including recovery of the immune system, aggressive critical care support, and administration of peramivir. Ongoing surveillance is essential to monitor how oseltamivir-resistant H275Y mutant viruses may evolve in the future. |
Population pharmacokinetics and pharmacodynamics of ofloxacin in South African patients with multidrug-resistant tuberculosis
Chigutsa E, Meredith S, Wiesner L, Padayatchi N, Harding J, Moodley P, Mac Kenzie WR, Weiner M, McIlleron H, Kirkpatrick CM. Antimicrob Agents Chemother 2012 56 (7) 3857-63 Despite the important role of fluoroquinolones and the predominant use of ofloxacin for treating multidrug-resistant tuberculosis in South Africa, there are limited data on ofloxacin pharmacokinetics in patients with multidrug-resistant tuberculosis, no ofloxacin pharmacokinetic data from South African patients, and no direct assessment of the relationship between ofloxacin pharmacokinetics and the MIC of ofloxacin of patient isolates. Our objectives were to describe ofloxacin pharmacokinetics in South African patients being treated for multidrug-resistant tuberculosis and assess the adequacy of ofloxacin drug exposure with respect to the probability of pharmacodynamic target attainment (area under the concentration-time curve/MIC ratio of at least 100). Sixty-five patients with multidrug-resistant tuberculosis were recruited from 2 hospitals in South Africa. We determined the ofloxacin MICs for the Mycobacterium tuberculosis isolates from baseline sputum specimens. Patients received daily doses of 800 mg ofloxacin, in addition to other antitubercular drugs. Patients underwent pharmacokinetic sampling at steady state. NONMEM was used for data analysis. The population pharmacokinetics of ofloxacin in this study were adequately described. The expected probability of target attainment in the study population was 0.45. Doubling the dose to 1,600 mg could increase this to only 0.77. The currently recommended ofloxacin dose appeared inadequate for the majority of this study population. Studies to assess the tolerability of higher doses are warranted. Alternatively, ofloxacin should be replaced with more potent fluoroquinolones. |
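The pharmacodynamic target in the abstract above (AUC/MIC ratio of at least 100) lends itself to a simple Monte Carlo illustration. The sketch below is hypothetical: the lognormal clearance distribution, the MIC value, and the AUC = dose/clearance approximation are stand-in assumptions for illustration only, not the study's NONMEM estimates.

```python
import numpy as np

def pta(dose_mg, mic_mg_l, target=100, cl_median=12.0, cl_cv=0.4,
        n=100_000, seed=1):
    """Monte Carlo probability of target attainment: P(AUC/MIC >= target).

    AUC (mg*h/L) is approximated as dose/CL; clearance (L/h) is drawn from
    a lognormal distribution. All parameter values here are illustrative.
    """
    rng = np.random.default_rng(seed)
    sigma = np.sqrt(np.log(1 + cl_cv ** 2))   # lognormal shape from the CV
    cl = rng.lognormal(np.log(cl_median), sigma, n)
    auc = dose_mg / cl
    return float(np.mean(auc / mic_mg_l >= target))

p_800 = pta(800, 0.5)    # standard daily dose
p_1600 = pta(1600, 0.5)  # doubled dose
```

Doubling the dose shifts the whole simulated AUC distribution upward and so raises the attainment probability, mirroring the pattern (0.45 rising only to 0.77) reported in the abstract.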
Protecting adults from influenza: tis the season to learn from the pandemic
Schuchat A, Katz JM. J Infect Dis 2012 206 (6) 803-5 Although influenza seasons come and go, one unfortunate constant over the past decade has been a lack of measurable progress in protecting adults from influenza. Despite greater vaccine supply, rising vaccination rates in children, and universal recommendations for all adults to be vaccinated annually, vaccination rates among the general adult population have scarcely budged. This stagnation in population coverage accentuates the value of approaches that improve influenza vaccine efficacy in adults. In this issue of the Journal of Infectious Diseases, Jackson et al. report on the comparative immunogenicity of multiple formulations of A(H1N1)pdm09 influenza vaccine among adults [1]. Their study has potential relevance for improved control of seasonal influenza, as well as better preparedness against future pandemic and avian influenza threats. | The availability of inactivated vaccine and the recurring burden of influenza and its complications led the US surgeon general in 1960 to issue the first recommendations for routine annual influenza vaccination of older adults, pregnant women, and others with chronic medical conditions [2]. Vaccination rates among the elderly increased substantially during the 1990s [3] but subsequently plateaued at approximately 60% to 70%. Older adults continue to experience a disproportionate burden of severe illness caused by influenza. Unfortunately, influenza vaccine effectiveness is generally lower among older populations, even during seasons when vaccine strains are well matched to circulating viruses. Efforts to overcome immune senescence and identify formulations with improved immunogenicity and clinical protection have been a focus of researchers, manufacturers, and the government. The potential roles of high-dose antigen formulations as well as adjuvants in improving immune response have been of particular interest with respect to both avian and seasonal influenza vaccines. |
HIV nucleic acid amplification testing versus rapid testing: it is worth the wait. Testing preferences of men who have sex with men
O'Neal JD, Golden MR, Branson BM, Stekler JD. J Acquir Immune Defic Syndr 2012 60 (4) e119-22 We conducted a study comparing the OraQuick ADVANCE Rapid HIV-1/2 Antibody Test, Uni-Gold Recombigen HIV Test, Determine HIV 1/2 Ag/Ab Combo, EIA, and pooled nucleic acid amplification testing (NAAT). Men who have sex with men rated tests based on specimen collection method and trust in each test. Among 490 subjects, OraQuick performed on oral fluids ranked highest for specimen collection method but lowest on trust; NAAT scored highest on trust. Among a subset of these subjects, 46% would opt for NAAT if choosing one test. Strategies are needed to increase HIV testing that is accurate and consistent with client preferences. |
Antiretroviral preexposure prophylaxis for heterosexual HIV transmission in Botswana
Thigpen MC , Kebaabetswe PM , Paxton LA , Smith DK , Rose CE , Segolodi TM , Henderson FL , Pathak SR , Soud FA , Chillag KL , Mutanhaurwa R , Chirwa LI , Kasonde M , Abebe D , Buliva E , Gvetadze RJ , Johnson S , Sukalac T , Thomas VT , Hart C , Johnson JA , Malotte CK , Hendrix CW , Brooks JT . N Engl J Med 2012 367 (5) 423-34 BACKGROUND: Preexposure prophylaxis with antiretroviral agents has been shown to reduce the transmission of human immunodeficiency virus (HIV) among men who have sex with men; however, the efficacy among heterosexuals is uncertain. METHODS: We randomly assigned HIV-seronegative men and women to receive either tenofovir disoproxil fumarate and emtricitabine (TDF-FTC) or matching placebo once daily. Monthly study visits were scheduled, and participants received a comprehensive package of prevention services, including HIV testing, counseling on adherence to medication, management of sexually transmitted infections, monitoring for adverse events, and individualized counseling on risk reduction; bone mineral density testing was performed semiannually in a subgroup of participants. RESULTS: A total of 1219 men and women underwent randomization (45.7% women) and were followed for 1563 person-years (median, 1.1 years; maximum, 3.7 years). Because of low retention and logistic limitations, we concluded the study early and followed enrolled participants through an orderly study closure rather than expanding enrollment. The TDF-FTC group had higher rates of nausea (18.5% vs. 7.1%, P<0.001), vomiting (11.3% vs. 7.1%, P=0.008), and dizziness (15.1% vs. 11.0%, P=0.03) than the placebo group, but the rates of serious adverse events were similar (P=0.90). Participants who received TDF-FTC, as compared with those who received placebo, had a significant decline in bone mineral density. K65R, M184V, and A62V resistance mutations developed in 1 participant in the TDF-FTC group who had had an unrecognized acute HIV infection at enrollment. 
In a modified intention-to-treat analysis that included the 33 participants who became infected during the study (9 in the TDF-FTC group and 24 in the placebo group; 1.2 and 3.1 infections per 100 person-years, respectively), the efficacy of TDF-FTC was 62.2% (95% confidence interval, 21.5 to 83.4; P=0.03). CONCLUSIONS: Daily TDF-FTC prophylaxis prevented HIV infection in sexually active heterosexual adults. The long-term safety of daily TDF-FTC prophylaxis, including the effect on bone mineral density, remains unknown. (Funded by the Centers for Disease Control and Prevention and the National Institutes of Health; TDF2 ClinicalTrials.gov number, NCT00448669.) |
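The headline efficacy figure can be sanity-checked against the incidence rates reported above. The crude calculation below (one minus the rate ratio) approximates, but does not reproduce, the published 62.2%, which comes from the full model-based analysis.

```python
# Crude prophylaxis efficacy from the reported incidence rates:
# efficacy = 1 - (rate in TDF-FTC arm / rate in placebo arm).
rate_tdf_ftc = 1.2   # infections per 100 person-years, TDF-FTC arm
rate_placebo = 3.1   # infections per 100 person-years, placebo arm
efficacy = 1 - rate_tdf_ftc / rate_placebo
print(f"crude efficacy = {efficacy:.1%}")  # prints "crude efficacy = 61.3%"
```

The small gap between the crude 61.3% and the reported 62.2% reflects the adjustments in the modified intention-to-treat analysis.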
Factors affecting the spread and maintenance of plague
Gage KL . Adv Exp Med Biol 2012 954 79-94 Plague is characterized by its potentially explosive spread during human epidemics and rodent epizootics. Recent research has suggested how this spread is likely to occur and what factors are associated with the onset of plague outbreaks and the continued spread of the disease. Among the apparent drivers of these outbreaks are climatic variables, host and vector densities, percolation thresholds, and the ability of many fleas to transmit efficiently soon after taking an infectious blood meal and before Yersinia pestis biofilm-related blockages appear in their guts. This presentation discusses each of these topics and their likely contribution to the rapid spread of plague to humans and in natural systems. |
Cryptosporidium spp. in quails (Coturnix coturnix japonica) in Henan, China: molecular characterization and public health significance
Wang R, Wang F, Zhao J, Qi M, Ning C, Zhang L, Xiao L. Vet Parasitol 2012 187 534-7 The prevalence of Cryptosporidium spp. was investigated on quail (Coturnix coturnix japonica) farms in Henan Province, China between September 2006 and August 2007. One thousand eight hundred and eighteen fecal samples from 47 quail farms in five areas were collected for the examination of Cryptosporidium oocysts. The overall prevalence of Cryptosporidium was 13.1% (95% CI 11.5-14.7%) (29 of 47 farms), with 72-100-day-old quails having the highest prevalence (23.6%; 95% CI 21.0-26.2%) (χ2=64.91; P<0.01). The highest prevalence was observed in autumn (21.8%; 95% CI 18.7-24.9%) and the lowest in winter (χ2=74.83; P<0.01). Two hundred and thirty-nine Cryptosporidium-positive samples were analyzed by PCR-restriction fragment length polymorphism (RFLP) analysis of the small subunit (SSU) rRNA gene, and 42 were further analyzed by DNA sequencing of the PCR products. Two Cryptosporidium species were identified: Cryptosporidium baileyi in 237 birds on 29 farms, and the potentially zoonotic Cryptosporidium meleagridis in only two birds on two farms. These findings suggest that quails are not a major source of zoonotic Cryptosporidium in the study area. |
Inappropriate medication in home health care
Lau DT , Dwyer LL . J Gen Intern Med 2012 27 (5) 490; author reply 491 We read with interest the study by Bao and colleagues examining the use of Beers-defined potentially inappropriate medications among older patients receiving home health care (HHC) services in the United States.1 Ensuring proper medication use especially in older adults remains a public health priority, and the authors argue that HHC patients may be at high risk for using ineffective or unsafe medications likely due to their often complex medication regimens and multiple physician prescribers. The study analyzes the 2007 National Home and Hospice Care Survey (NHHCS), a nationally representative survey of U.S. home health and hospice care agencies that collected data on current HHC patients and hospice care discharges.2 Bao and colleagues restricted their analysis to HHC patients who were age 65 or older and used at least one medication (n = 3,124). The authors, however, did not differentiate between patients receiving and patients not receiving end-of-life (EOL) care. According to NHHCS, we calculated that 15% (weighted) of HHC patients in their study had a medical prognosis indicating a six-month-or-less life expectancy and received “palliative, end of life, or terminal care instead of active or curative treatment.” | It is important for the Bao et al. study to distinguish between HHC patients who did and who did not receive EOL care. While the 2003 Beers list is commonly used to define medications to avoid among older patients3 (albeit not without controversy), there is no clear consensus about which medications are unsuitable for older patients receiving EOL care. 
The validity of the Beers list as a prescribing quality indicator to assess EOL treatment is disputable in principle and evidence.4 Research has argued that short-acting benzodiazepines, gastrointestinal antispasmodics, anticholinergics, and antihistamines that are on the Beers list may be clinically appropriate for older patients receiving EOL care whose goal of care is to manage pain and other distressing symptoms.5 Furthermore, although long half-life benzodiazepines generally should be avoided in older patients according to Beers, withdrawing a long half-life benzodiazepine may pose unnecessary, significant risk for major withdrawal symptoms in older patients receiving EOL care.6 Consequently, the examination of inappropriate medication use among HHC patients without differentiating between those receiving and those not receiving EOL care raises concerns about the Bao et al. study findings and their suggested policy and practice implications. In general, applying the Beers list to examine medication appropriateness in older adults should be performed judiciously in settings where EOL care is provided. |
Establishing exposure science as a distinct scientific discipline
Pleil JD, Blount BC, Waidyanatha S, Harper M. J Expo Sci Environ Epidemiol 2012 22 (4) 317-9 As readers of this journal, we are likely in agreement that “Exposure science is the bedrock for protection of public health,”1 and despite some differing opinions as to what the exact definition of “exposure science” should be, a general consensus states that it “… studies human contact with chemical, physical, or biological agents occurring in their environments, and advances knowledge of the mechanisms and dynamics of events either causing or preventing adverse health outcomes.”2,3 | We have probably also observed that, in the greater scheme of scientific professions, those who practice exposure science are erstwhile chemists, biologists, physicists, toxicologists, epidemiologists, mathematicians, computer scientists, statisticians, environmental engineers, and medical/public health doctors; few, if any, of us are formally trained “exposure scientists”. Furthermore, exposure science tends to be considered a part of the other public health disciplines; the toxicologists, statisticians, and epidemiologists treat exposure as a subset of their disciplines, and often express concern about the lack of sufficient exposure information. In this article, we hope to promote exposure science as a distinct and recognizable scientific discipline. | The question is, how can we improve the perception, practice, and value of exposure science among the more established medical and public health disciplines? |
Childhood lead poisoning associated with gold ore processing: a village-level investigation - Zamfara State, Nigeria, October-November 2010
Lo YC, Dooyema CA, Neri A, Durant J, Jefferies T, Medina-Marino A, de Ravello L, Thoroughman D, Davis L, Dankoli RS, Samson MY, Ibrahim LM, Okechukwu O, Umar-Tsafe NT, Dama AH, Brown MJ. Environ Health Perspect 2012 120 (10) 1450-5 BACKGROUND: During May-June 2010, a childhood lead poisoning outbreak related to gold-ore-processing was confirmed in 2 villages in Zamfara State, Nigeria. During June-September, villages with suspected or confirmed childhood lead poisoning continued to be identified in Zamfara State. OBJECTIVES: We investigated the extent of childhood lead poisoning (≥1 child with a blood lead level [BLL] ≥10 microg/dL) and lead contamination (≥1 soil/dust sample with a lead level >400 parts per million) among villages in Zamfara State and identified villages that should be prioritized for urgent interventions. METHODS: We used chain-referral sampling to identify villages of interest, defined as villages suspected of participation in gold-ore-processing during the previous 12 months. We interviewed villagers, determined BLLs among children aged <5 years, and analyzed soil/dust from public areas and homes for lead. RESULTS: We identified 131 villages of interest and visited 74 (56%) villages in 3 local government areas. Fifty-four (77%) of 70 villages that completed the survey reported gold-ore-processing. Ore-processing villages were more likely to have ≥1 child aged <5 years with lead poisoning (68% vs. 50%, p=0.17) or death following convulsions (74% vs. 44%, p=0.02). Soil/dust contamination and BLL ≥45 microg/dL were identified in ore-processing villages only [50% (p<0.001) and 15% (p=0.22), respectively]. The odds of childhood lead poisoning or lead contamination were 3.5 times as high in ore-processing villages as in the other villages (95% CI: 1.1, 11.3). CONCLUSION: Childhood lead poisoning and lead contamination were widespread in surveyed areas, particularly among villages that had processed ore recently. 
Urgent interventions are required to reduce lead exposure, morbidity, and mortality in affected communities. |
Using multiple imputation to assign pesticide use for non-responders in the follow-up questionnaire in the Agricultural Health Study
Heltshe SL , Lubin JH , Koutros S , Coble JB , Ji BT , Alavanja MC , Blair A , Sandler DP , Hines CJ , Thomas KW , Barker J , Andreotti G , Hoppin JA , Beane Freeman LE . J Expo Sci Environ Epidemiol 2012 22 (4) 409-16 The Agricultural Health Study (AHS), a large prospective cohort, was designed to elucidate associations between pesticide use and other agricultural exposures and health outcomes. The cohort includes 57,310 pesticide applicators who were enrolled between 1993 and 1997 in Iowa and North Carolina. A follow-up questionnaire administered 5 years later was completed by 36,342 (63%) of the original participants. Missing pesticide use information from participants who did not complete the second questionnaire impedes both long-term pesticide exposure estimation and statistical inference of risk for health outcomes. Logistic regression and stratified sampling were used to impute key variables related to the use of specific pesticides for 20,968 applicators who did not complete the second questionnaire. To assess the imputation procedure, a 20% random sample of participants was withheld for comparison. The observed and imputed prevalences of any pesticide use in the holdout dataset were 85.7% and 85.3%, respectively. The distributions of prevalence and days/year of use for specific pesticides were similar between the observed and imputed values in the holdout sample. When appropriately implemented, multiple imputation can reduce bias and increase precision and can be more valid than other missing-data approaches. |
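The holdout assessment described in this abstract (withhold a 20% random sample, impute its pesticide use, then compare observed and imputed prevalence) can be illustrated with a stratified-sampling toy model. This is a minimal sketch under invented strata, rates, and sample sizes, not the AHS procedure, which also used logistic regression on participant covariates.

```python
import random

random.seed(42)

# Hypothetical strata (e.g., state x applicator type) with invented use rates.
strata = ["IA-private", "IA-commercial", "NC-private"]
true_rate = {"IA-private": 0.90, "IA-commercial": 0.80, "NC-private": 0.85}

# Simulated cohort: (stratum, any-pesticide-use) pairs.
cohort = [(s, random.random() < true_rate[s])
          for s in strata for _ in range(5000)]

# Withhold a 20% random sample to assess the imputation against.
random.shuffle(cohort)
holdout = cohort[: len(cohort) // 5]
observed = cohort[len(cohort) // 5:]

# "Fit": estimate use prevalence within each stratum from observed responders.
prev = {}
for s in strata:
    vals = [use for (st, use) in observed if st == s]
    prev[s] = sum(vals) / len(vals)

# Impute the holdout by stratified random draws, then compare prevalences.
imputed = [(s, random.random() < prev[s]) for (s, _) in holdout]
obs_prev = sum(use for _, use in holdout) / len(holdout)
imp_prev = sum(use for _, use in imputed) / len(imputed)
print(f"observed holdout prevalence: {obs_prev:.3f}")
print(f"imputed holdout prevalence:  {imp_prev:.3f}")
```

Drawing imputed values at random (rather than assigning the modal value) preserves the variability of the original data, which is what lets multiple imputation yield valid standard errors.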
Epidemiological and laboratory characterization of a yellow fever outbreak in northern Uganda, October 2010-January 2011
Wamala JF , Malimbo M , Okot CL , Atai-Omoruto AD , Tenywa E , Miller JR , Balinandi S , Shoemaker T , Oyoo C , Omony EO , Kagirita A , Musenero MM , Makumbi I , Nanyunja M , Lutwama JJ , Downing R , Mbonye AK . Int J Infect Dis 2012 16 (7) e536-42 BACKGROUND: In November 2010, following reports of an outbreak of a fatal, febrile, hemorrhagic illness in northern Uganda, the Uganda Ministry of Health established multisector teams to respond to the outbreak. METHODS: This was a case-series investigation in which the response teams conducted epidemiological and laboratory investigations on suspect cases. The cases identified were line-listed and a data analysis was undertaken regularly to guide the outbreak response. RESULTS: Overall, 181 cases met the yellow fever (YF) suspected case definition; there were 45 deaths (case fatality rate 24.9%). Only 13 (7.5%) of the suspected YF cases were laboratory confirmed, and molecular sequencing revealed 92% homology to the YF virus strain Couma (Ethiopia), East African genotype. Suspected YF cases had fever (100%) and unexplained bleeding (97.8%), but jaundice was rare (11.6%). The overall attack rate was 13 cases/100,000 population, and the attack rate was higher for males than females and increased with age. The index clusters were linked to economic activities undertaken by males around forests. CONCLUSIONS: This was the largest YF outbreak ever reported in Uganda. The wide geographical case dispersion as well as the male and older age preponderance suggests transmission during the outbreak was largely sylvatic and related to occupational activities around forests. |
Acanthamoeba keratitis: the persistence of cases following a multistate outbreak
Yoder JS , Verani J , Heidman N , Hoppe-Bauer J , Alfonso EC , Miller D , Jones DB , Bruckner D , Langston R , Jeng BH , Joslin CE , Tu E , Colby K , Vetter E , Ritterband D , Mathers W , Kowalski RP , Acharya NR , Limaye AP , Leiter C , Roy S , Lorick S , Roberts J , Beach MJ . Ophthalmic Epidemiol 2012 19 (4) 221-5 PURPOSE: To describe the trend of Acanthamoeba keratitis case reports following an outbreak and the recall of a multipurpose contact lens disinfection solution. Acanthamoeba keratitis is a serious eye infection caused by the free-living amoeba Acanthamoeba that primarily affects contact lens users. METHODS: A convenience sample of 13 ophthalmology centers and laboratories in the USA provided annual numbers of Acanthamoeba keratitis cases diagnosed between 1999 and 2009 and monthly numbers of cases diagnosed between 2007 and 2009. Data on ophthalmic preparations of anti-Acanthamoeba therapies were collected from a national compounding pharmacy. RESULTS: Data from sentinel site ophthalmology centers and laboratories revealed that the yearly number of cases gradually increased from 22 in 1999 to 43 in 2003, with a marked increase beginning in 2004 (93 cases) that continued through 2007 (170 cases; p < 0.0001). The outbreak identified from these sentinel sites resulted in the recall of a contact lens disinfecting solution. There was a statistically significant (p ≤ 0.0001) decrease in monthly cases reported from 28 cases in June 2007 (following the recall) to seven cases in June 2008, followed by an increase (p = 0.0004) in reported cases thereafter; cases have remained higher than pre-outbreak levels. A similar trend was seen in prescriptions for Acanthamoeba keratitis chemotherapy. Cases were significantly more likely to be reported during summer than during other seasons. CONCLUSION: The persistently elevated number of reported cases supports the need to understand the risk factors and environmental exposures associated with Acanthamoeba keratitis. 
Further prevention efforts are needed to reduce the number of cases occurring among contact lens wearers. |
Current status of Clostridium difficile infection epidemiology
Lessa FC , Gould CV , McDonald LC . Clin Infect Dis 2012 55 Suppl 2 S65-70 The dramatic changes in the epidemiology of Clostridium difficile infection (CDI) during recent years, with increases in incidence and severity of disease in several countries, have made CDI a global public health challenge. Increases in CDI incidence have been largely attributed to the emergence of a previously rare and more virulent strain, BI/NAP1/027. Increased toxin production and high-level resistance to fluoroquinolones have made this strain a very successful pathogen in healthcare settings. In addition, populations previously thought to be at low risk are now being identified as having severe CDI. Recent genetic analysis suggests that C. difficile has a highly fluid genome with multiple mechanisms to modify its content and functionality, which can make C. difficile adaptable to environmental changes and potentially lead to the emergence of more virulent strains. In the face of these changes in the epidemiology and microbiology of CDI, surveillance systems are necessary to monitor trends and inform public health actions. |
Developmental genetics of secretory vesicle acidification during Caenorhabditis elegans spermatogenesis.
Gleason EJ , Hartley PD , Henderson M , Hill-Harfe KL , Price PW , Weimer RM , Kroft TL , Zhu GD , Cordovado S , L'Hernault S W . Genetics 2012 191 (2) 477-91 Secretory vesicles are used during spermatogenesis to deliver proteins to the cell surface. In Caenorhabditis elegans, secretory membranous organelles (MO) fuse with the plasma membrane to transform spermatids into fertilization-competent spermatozoa. We show that, like the acrosomal vesicle of mammalian sperm, MOs undergo acidification during development. Treatment of spermatids with the V-ATPase inhibitor bafilomycin blocks both MO acidification and formation of functional spermatozoa. There are several spermatogenesis-defective mutants that cause defects in MO morphogenesis, including spe-5. We determined that spe-5, which is on chromosome I, encodes one of two V-ATPase B paralogous subunits. The spe-5 null mutant is viable but sterile because it forms arrested, multi-nucleate spermatocytes. Immunofluorescence with a SPE-5-specific monoclonal antibody shows that SPE-5 expression begins in spermatocytes and is found in all subsequent stages of spermatogenesis. Most SPE-5 is discarded into the residual body during spermatid budding, but a small amount remains in budded spermatids where it localizes to MOs as a discrete dot. The other V-ATPase B subunit is encoded by vha-12, which is located on the X chromosome. Usually, spe-5 mutants are self-sterile in a wild-type vha-12 background. However, an extrachromosomal transgene containing wild-type vha-12 driven by its own promoter allows spe-5 mutant hermaphrodites to produce progeny, indicating that VHA-12 can at least partially substitute for SPE-5. Others have shown that the X chromosome is transcriptionally silent in the male germline, so expression of the autosomally located spe-5 gene ensures that a V-ATPase B subunit is present during spermatogenesis. |
Genetic variants in IGF-I, IGF-II, IGFBP-3, and adiponectin genes and colon cancer risk in African Americans and Whites.
Keku TO , Vidal A , Oliver S , Hoyo C , Hall IJ , Omofoye O , McDoom M , Worley K , Galanko J , Sandler RS , Millikan R . Cancer Causes Control 2012 23 (7) 1127-38 PURPOSE: Evaluating genetic susceptibility may clarify effects of known environmental factors and also identify individuals at high risk. We evaluated the association of four insulin-related pathway gene polymorphisms in insulin-like growth factor-1 (IGF-I) (CA)(n) repeat, insulin-like growth factor-2 (IGF-II) (rs680), insulin-like growth factor-binding protein-3 (IGFBP-3) (rs2854744), and adiponectin (APM1 rs1501299) with colon cancer risk, as well as relationships with circulating IGF-I, IGF-II, IGFBP-3, and C-peptide in a population-based study. METHODS: Participants were African Americans (231 cases and 306 controls) and Whites (297 cases, 530 controls). Consenting subjects provided blood specimens and lifestyle/diet information. Genotyping for all genes except IGF-I was performed by the 5'-exonuclease (Taqman) assay. The IGF-I (CA)(n) repeat was assayed by PCR and fragment analysis. Circulating proteins were measured by enzyme immunoassays. Odds ratios (ORs) and 95 % confidence intervals (CIs) were calculated by logistic regression. RESULTS: The IGF-I (CA)(19) repeat was higher in White controls (50 %) than African American controls (31 %). Whites homozygous for the IGF-I (CA)(19) repeat had a nearly twofold increase in risk of colon cancer (OR = 1.77; 95 % CI = 1.15-2.73), but not African Americans (OR = 0.73, 95 % CI 0.50-1.51). We observed an inverse association between the IGF-II Apa1 A-variant and colon cancer risk (OR = 0.49, 95 % CI 0.28-0.88) in Whites only. Carrying the IGFBP-3 variant alleles was associated with lower IGFBP-3 protein levels, a difference most pronounced in Whites (p-trend <0.05). CONCLUSIONS: These results support an association between insulin pathway-related genes and elevated colon cancer risk in Whites but not in African Americans. |
Estrogen-related genes and their contribution to racial differences in breast cancer risk.
Reding KW , Chen C , Lowe K , Doody DR , Carlson CS , Chen CT , Houck J , Weiss LK , Marchbanks PA , Bernstein L , Spirtas R , McDonald JA , Strom BL , Burkman RT , Simon MS , Liff JM , Daling JR , Malone KE . Cancer Causes Control 2012 23 (5) 671-81 Racial differences in breast cancer risk, including the risks of hormone receptor subtypes of breast cancer, have been previously reported. We evaluated whether variation in genes related to estrogen metabolism (COMT, CYP1A1, CYP1B1, CYP17A1, CYP19A1, ESR1, GSTM1, GSTP1, GSTT1, HSD17B1, SULT1A1, and UGT1A1) contributes to breast cancer risk and/or racial differences in risk within the CARE study, a multi-centered, population-based case-control study of breast cancer. Genetic variation was assessed as single nucleotide polymorphisms (SNPs), haplotypes, and SNP-hormone therapy (HT) interactions within a subset of 1,644 cases and 1,451 controls, including 949 Black women (493 cases and 456 controls), sampled from the CARE study population. No appreciable associations with breast cancer risk were detected for single SNPs or haplotypes in women overall. We detected SNP-HT interactions in women overall within CYP1B1 (rs1800440; p (het)=0.003) and within CYP17A1 (rs743572; p (het)=0.009) in which never users of HT were at a decreased risk of breast cancer, while ever users were at a non-significant increased risk. When investigated among racial groups, we detected evidence of an SNP-HT interaction with CYP1B1 in White women (p value=0.02) and with CYP17A1 in Black women (p value=0.04). This analysis suggests that HT use may modify the effect of variation in estrogen-related genes on breast cancer risk, which may affect Black and White women to a different extent. |
The influence of perceptions of HIV infection, care, and identity on care entry
Fagan JL , Beer L , Garland P , Valverde E , Courogen M , Hillman D , Brady K , Bertolli J . AIDS Care 2012 24 (6) 737-43 The benefits of accessing HIV care after diagnosis (e.g., improved clinical outcomes and reduced transmission) are well established. However, many persons who are aware that they are HIV infected have never received HIV medical care. During 2008-2010, we conducted 43 in-depth interviews in three health department jurisdictions among adults who had received an HIV diagnosis but who had never accessed HIV medical care. Respondents were selected from the HIV/AIDS Reporting System, a population-based surveillance system. We explored how respondents perceived HIV infection and HIV medical care. Most respondents associated HIV with death. Many respondents said that HIV medical care was not necessary until one is sick. Further, we explored how these perceptions may have conflicted with one's identity and thus served as barriers to timely care entry. Most respondents perceived themselves as healthy. All respondents acknowledged their HIV serostatus, but many did not self-identify as HIV-positive. Several respondents expressed that they were not ready to receive HIV care immediately but felt that they would eventually attempt to access care. Some stated that they needed time to accept their HIV diagnosis before entering care. To improve timely linkage to care, we suggest that during the posttest counseling session and subsequent linkage-to-care activities, counselors and service providers discuss patient perceptions of HIV, particularly to address beliefs that HIV infection is a "death sentence" or that HIV care is necessary only for those who exhibit symptoms. |
A genome-wide association study of host genetic determinants of the antibody response to Anthrax Vaccine Adsorbed.
Pajewski NM , Shrestha S , Quinn CP , Parker SD , Wiener H , Aissani B , McKinney BA , Poland GA , Edberg JC , Kimberly RP , Tang J , Kaslow RA . Vaccine 2012 30 (32) 4778-84 Several lines of evidence have supported a host genetic contribution to vaccine response, but genome-wide assessments for specific determinants have been sparse. Here we describe a genome-wide association study (GWAS) of protective antigen-specific antibody (AbPA) responses among 726 European-Americans who received Anthrax Vaccine Adsorbed (AVA) as part of a clinical trial. After quality control, 736,996 SNPs were tested for association with the AbPA response to 3 or 4 AVA vaccinations given over a 6-month period. No SNP achieved the threshold of genome-wide significance (p = 5 × 10^-8), but suggestive associations (p < 1 × 10^-5) were observed for SNPs in or near the class II region of the major histocompatibility complex (MHC), in the promoter region of SPSB1, and adjacent to MEX3C. Multivariable regression modeling suggested that much of the association signal within the MHC corresponded to previously identified HLA DR-DQ haplotypes involving component HLA-DRB1 alleles of *15:01, *01:01, or *01:02. We estimated the proportion of additive genetic variance explained by common SNP variation for the AbPA response after the 6-month vaccination. This analysis indicated a significant, albeit imprecisely estimated, contribution of variation tagged by common polymorphisms (p=0.032). Future studies will be required to replicate these findings in European Americans and to further elucidate the host genetic factors underlying variable immune response to AVA. |
Efficient error correction for next-generation sequencing of viral amplicons.
Skums P , Dimitrova Z , Campo DS , Vaughan G , Rossi L , Forbi JC , Yokosawa J , Zelikovsky A , Khudyakov Y . BMC Bioinformatics 2012 13 Suppl 10 S6 BACKGROUND: Next-generation sequencing allows the analysis of an unprecedented number of viral sequence variants from infected patients, presenting a novel opportunity for understanding virus evolution, drug resistance and immune escape. However, sequencing in bulk is error prone. Thus, the generated data require error identification and correction. Most error-correction methods to date are not optimized for amplicon analysis and assume that the error rate is randomly distributed. Recent quality assessment of amplicon sequences obtained using 454-sequencing showed that the error rate is strongly linked to the presence and size of homopolymers, position in the sequence and length of the amplicon. All these parameters are strongly sequence specific and should be incorporated into the calibration of error-correction algorithms designed for amplicon sequencing. RESULTS: In this paper, we present two new efficient error correction algorithms optimized for viral amplicons: (i) k-mer-based error correction (KEC) and (ii) empirical frequency threshold (ET). Both were compared to a previously published clustering algorithm (SHORAH), in order to evaluate their relative performance on 24 experimental datasets obtained by 454-sequencing of amplicons with known sequences. All three algorithms show similar accuracy in finding true haplotypes. However, KEC and ET were significantly more efficient than SHORAH in removing false haplotypes and estimating the frequency of true ones. CONCLUSIONS: Both algorithms, KEC and ET, are highly suitable for rapid recovery of error-free haplotypes obtained by 454-sequencing of amplicons from heterogeneous viruses. The implementations of the algorithms and data sets used for their testing are available at: http://alan.cs.gsu.edu/NGS/?q=content/pyrosequencing-error-correction-algorithm. |
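The general k-mer-frequency idea behind methods like KEC can be sketched in toy form: count k-mers across all reads, treat rare k-mers as evidence of sequencing error, and repair a base when a substitution makes every k-mer covering that position frequent. This is a generic illustration with an invented sequence, read counts, and threshold, not the published KEC algorithm, which additionally calibrates for homopolymers and read position.

```python
from collections import Counter

K = 5
TRUE = "ACGTACGTTAGCCGA" * 2  # hypothetical true haplotype sequence

# Simulated amplicon reads: 50 exact copies of the true sequence plus
# two reads carrying a single substitution (TRUE[8] is 'T', misread as 'A').
reads = [TRUE] * 50 + [TRUE[:8] + "A" + TRUE[9:]] * 2

def kmers(seq):
    return [seq[i:i + K] for i in range(len(seq) - K + 1)]

counts = Counter(km for r in reads for km in kmers(r))
THRESHOLD = 5  # k-mers seen fewer than this many times are presumed errors

def correct(read):
    """Replace each base covered only by rare k-mers with the substitution
    (if any) that makes every covering k-mer frequent."""
    read = list(read)
    for i in range(len(read)):
        window = range(max(0, i - K + 1), min(i + 1, len(read) - K + 1))
        if all(counts["".join(read[j:j + K])] >= THRESHOLD for j in window):
            continue  # every k-mer covering this base is frequent
        for alt in "ACGT":
            trial = read[:i] + [alt] + read[i + 1:]
            if all(counts["".join(trial[j:j + K])] >= THRESHOLD for j in window):
                read[i] = alt
                break
    return "".join(read)

corrected = [correct(r) for r in reads]
print(all(c == TRUE for c in corrected))  # True: every read matches TRUE
```

Because correct bases in an erroneous read are also covered by some rare k-mers, the sketch only rewrites a position when a substitution restores frequency across its whole window, which leaves the non-error bases untouched.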
Moving knowledge into action: developing the Rapid Synthesis and Translation Process within the Interactive Systems Framework
Thigpen S , Puddy RW , Singer HH , Hall DM . Am J Community Psychol 2012 50 285-94 The Interactive Systems Framework (ISF) for Dissemination and Implementation presents an overall framework for translating knowledge into action. Each of its three systems requires further clarification and explanation to truly understand how to conduct this work. This article describes the development and initial application of the Rapid Synthesis and Translation Process (RSTP) using the exchange model of knowledge transfer in the context of one of the ISF systems: the Prevention Synthesis and Translation System (see [special issue "introduction" article] for a translation of the Wandersman et al. (Am J Community Psychol 41:3-4, 2008) article using the RSTP). This six-step process, which was developed by and for the Division of Violence Prevention at the Centers for Disease Control and Prevention in collaboration with partners, serves as an example of how a federal agency can expedite the transfer of research knowledge to practitioners to prevent violence. While the RSTP itself represents one of the possible functions in the Prevention Synthesis and Translation System, the resulting products affect both prevention support and prevention delivery as well. Examples of how practitioner and researcher feedback were incorporated into the Rapid Synthesis and Translation Process are discussed. |
Attitudes towards requiring ignition interlocks for all driving while intoxicated offenders: findings from the 2010 HealthStyles Survey
Shults RA , Bergen G . Inj Prev 2012 19 (1) 68-71 Ignition interlocks are effective in reducing recidivism among driving while intoxicated (DWI) offenders while installed on their vehicles. However, the devices are not widely used in the USA. This survey gauged public support for requiring ignition interlocks for all convicted DWI offenders including first-time offenders. 69% of respondents supported such a policy. Support was lowest (38%) among persons who reported drinking and driving in the past 30 days. Multivariate regression analysis indicated that support varied little by region, community size or most measured individual characteristics. Persons who did not drink and drive were 80% more likely to support the requirement than those who drink and drive. These findings suggest that laws requiring ignition interlocks for all convicted DWI offenders may face the most opposition in communities with high levels of drinking and driving. |
Evaluation of blood collection filter papers for HIV-1 DNA PCR.
Masciotra S , Khamadi S , Bile E , Puren A , Fonjungo P , Nguyen S , Girma M , Downing R , Ramos A , Subbarao S , Ellenberger D . J Clin Virol 2012 55 (2) 101-6 BACKGROUND: The collection of dried blood spots (DBS) on Whatman 903 cards has for years facilitated the detection of HIV-1 in infants by DNA PCR as early as 4-6 weeks after birth in resource-limited settings (RLS), but alternate blood collection devices are proving to be necessary. OBJECTIVES: The qualitative detection of HIV-1 DNA by PCR from DBS prepared on three commercially available blood collection cards was evaluated at the Centers for Disease Control and Prevention (CDC) and in four laboratories in Africa. STUDY DESIGN: DBS were prepared on Ahlstrom grade 226, Munktell TFN and Whatman 903, and stored under a variety of conditions: at ambient temperature (RT), at 37 degrees C with high humidity, and at -20 degrees C for varying lengths of time. The presence of HIV-1 DNA was tested using Roche Amplicor HIV-1 DNA (v 1.5) weekly for 4 weeks and at weeks 8 and 12 (RT and 37 degrees C), and at weeks 4, 8, and 18 (-20 degrees C) of storage. DBS specimens were also tested after international shipment at RT. In addition, after nearly 3 years of storage at -20 degrees C, DBS were also evaluated independently using the COBAS Ampliprep/TaqMan HIV-1 Qual and Abbott RealTime HIV-1 Qualitative tests. RESULTS: HIV-1 DNA was detected equally well on the three blood collection cards regardless of storage conditions and PCR assay. CONCLUSIONS: Ahlstrom 226 and Munktell TFN papers were comparable to Whatman 903 for HIV-1 DNA detection and may be considered as optional blood collection devices in resource-limited countries. |
Genotyping of Candida parapsilosis from three neonatal intensive care units (NICUs) using a panel of five multilocus microsatellite markers: broad genetic diversity and a cluster of related strains in one NICU.
Reiss E , Lasker BA , Lott TJ , Bendel CM , Kaufman DA , Hazen KC , Wade KC , McGowan KL , Lockhart SR . Infect Genet Evol 2012 12 (8) 1654-60 Candida parapsilosis (CP) isolates (n=40) from an unselected patient population in the neonatal intensive care units (NICUs) of 3 U.S. hospitals were collected over periods of 3.5-9 years. Two previously published microsatellite markers and three additional trinucleotide markers were used to produce multiplex genotypes, which revealed broad strain diversity among the NICU isolates with a combined index of discrimination (D) = 0.997. A cluster of 8 related CP strains from 4 infants in a single NICU was observed. An extended collection of 24 CP isolates from the general population of that hospital showed that the cluster of NICU isolates was related to 3 isolates from general hospital patients. This microsatellite marker set is suitable to investigate clusters of colonizing and infecting strains of CP. |
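The combined index of discrimination (D) reported for marker panels like this one is conventionally the Hunter-Gaston adaptation of Simpson's diversity index: the probability that two isolates drawn at random have different genotypes. A short sketch of the computation on hypothetical genotype counts (the cluster sizes below are invented, not the study's data):

```python
def discrimination_index(type_counts):
    """Hunter-Gaston index of discrimination:
    D = 1 - sum(n_j * (n_j - 1)) / (N * (N - 1)),
    where n_j is the number of isolates of genotype j and N is the total.
    D is the probability that two randomly chosen isolates differ in type."""
    N = sum(type_counts)
    return 1 - sum(n * (n - 1) for n in type_counts) / (N * (N - 1))

# Hypothetical panel result for 40 isolates: 37 unique genotypes plus
# one cluster of 3 identical isolates.
counts = [3] + [1] * 37
print(round(discrimination_index(counts), 3))  # 0.996
```

D approaches 1 as the marker panel resolves more isolates into distinct genotypes, which is why a value of 0.997 indicates a highly discriminatory typing scheme.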
Protocol for the detection of Treponema pallidum in paraffin-embedded specimens.
Chen CY , Pillay A . Methods Mol Biol 2012 903 295-306 Formalin-fixed paraffin-embedded (FFPE) tissue blocks are routinely used for histopathological examination and are also useful for specific pathogen detection by polymerase chain reaction (PCR). FFPE tissue is stable at ambient temperature for an extended period of time and relatively easy to transport compared to fresh tissue, which has to be processed or frozen immediately. In addition, archival material is an invaluable source for retrospective molecular and clinical investigation. This chapter describes detailed procedures for nucleic acid extraction and PCR detection of Treponema pallidum using FFPE tissue. |
Protocol for the use of a rapid real-time PCR method for the detection of HIV-1 proviral DNA using double-stranded primer.
Pau CP , Wells SK , Granade TC . Methods Mol Biol 2012 903 263-71 This chapter describes a real-time PCR method for the detection of HIV-1 proviral DNA in whole blood samples using a novel double-stranded primer system. The assay utilizes a simple commercially available DNA extraction method and a rapid and easy-to-perform real-time PCR protocol to consistently detect a minimum of four copies of HIV-1 group M proviral DNA in as little as 90 min after sample (whole blood) collection. Co-amplification of the human RNase P gene serves as an internal control to monitor the efficiency of both the DNA extraction and amplification. Once the assay is validated properly, it may be suitable as an alternative confirmation test for HIV-1 infections in a variety of HIV testing venues including the mother-to-child transmission testing sites, clinics, and diagnostic testing centers. |
The molecular diagnosis of sexually transmitted genital ulcer disease.
Chen CY , Ballard RC . Methods Mol Biol 2012 903 103-12 Highly sensitive and specific nucleic acid amplification tests (NAATs) have emerged as the gold standard diagnostic tests for many infectious diseases. Real-time PCR has further refined the technology of nucleic acid amplification with detection in a closed system and enabled multiplexing to simultaneously detect multiple pathogens. It is a versatile, fast, and high-throughput system for pathogen detection that has reduced the risk of PCR contamination, eliminated post-PCR manipulations, and improved the cost-effectiveness of testing. In addition, real-time PCR can be applied to self-collected noninvasive specimens. Here, we describe an in-house-developed TaqMan-based real-time multiplex PCR (M-PCR) assay for the diagnosis of sexually transmitted genital ulcer disease (GUD) and briefly discuss issues associated with validation of assay performance. |
Fungal hemolysins
Nayak AP , Green BJ , Beezhold DH . Med Mycol 2012 51 (1) 1-16 Hemolysins are a class of proteins defined by their ability to lyse red cells but have been described to exhibit pleiotropic functions. These proteins have been extensively studied in bacteria and more recently in fungi. Within the last decade, a number of studies have characterized fungal hemolysins and revealed a fascinating yet diverse group of proteins. The purpose of this review is to provide a synopsis of the known fungal hemolysins with an emphasis on those belonging to the aegerolysin protein family. New insight and perspective into fungal hemolysins in biotechnology and health are additionally presented. |
Chronic exposure to corticosterone enhances the neuroinflammatory and neurotoxic responses to methamphetamine
Kelly KA , Miller DB , Bowyer JF , O'Callaghan JP . J Neurochem 2012 122 (5) 995-1009 Upregulation of proinflammatory cytokines and chemokines in brain ("neuroinflammation") accompanies neurological disease and neurotoxicity. Previously, we documented a striatal neuroinflammatory response to acute administration of a neurotoxic dose of methamphetamine (METH), i.e. one associated with evidence of dopaminergic terminal damage and activation of microglia and astroglia. When we used minocycline to suppress METH-induced neuroinflammation, indices of dopaminergic neurotoxicity were not affected but suppression of neuroinflammation was incomplete. Here, we administered the classic anti-inflammatory glucocorticoid, corticosterone (CORT), in an attempt to completely suppress METH-related neuroinflammation. METH alone caused large increases in striatal proinflammatory cytokine/chemokine mRNA and subsequent astrocytic hypertrophy, microglial activation and dopaminergic nerve terminal damage. Pretreatment of mice with acute CORT failed to prevent neuroinflammatory responses to METH. Surprisingly, when mice were pretreated with chronic CORT in the drinking water, an enhanced striatal neuroinflammatory response to METH was observed, an effect that was accompanied by enhanced METH-induced astrogliosis and dopaminergic neurotoxicity. Chronic CORT pretreatment also sensitized frontal cortex and hippocampus to mount a neuroinflammatory response to METH. Because the levels of chronic CORT used are associated with high physiological stress, our data suggest that chronic CORT therapy or sustained physiological stress may sensitize the neuroinflammatory and neurotoxic responses to METH. (Published 2012. This article is a US Government work and is in the public domain in the USA.) |
Rash, hepatotoxicity and hyperbilirubinemia among Kenyan infants born to HIV-infected women receiving triple-antiretroviral drugs for the prevention of mother-to-child HIV transmission
Minniear TD , Zeh C , Polle N , Masaba R , Peters PJ , Oyaro B , Akoth B , Ndivo R , Angira F , Mills LA , Thomas TK . Pediatr Infect Dis J 2012 31 (11) 1155-7 We compared adverse events among breastfeeding neonates born to Kenyan mothers receiving triple-antiretroviral therapy including either nevirapine or nelfinavir. Nevirapine-exposed infants had an absolute increase in risk for rash but no significant risk differences for hepatotoxicity or high-risk hyperbilirubinemia compared with nelfinavir-exposed infants. From an infant-safety perspective, nevirapine-based regimens given during pregnancy and breastfeeding are viable options where alternatives to breast milk are not safe, affordable, or feasible. |
Early anthropometric indices predict short stature and overweight status in a cohort of Peruvians in early adolescence
Sterling R , Miranda JJ , Gilman RH , Cabrera L , Sterling CR , Bern C , Checkley W . Am J Phys Anthropol 2012 148 (3) 451-61 While childhood malnutrition is associated with increased morbidity and mortality, less well understood is how early childhood growth influences height and body composition later in life. We revisited 152 Peruvian children who participated in a birth cohort study between 1995 and 1998, and obtained anthropometric and bioimpedance measurements 11-14 years later. We used multivariable regression models to study the effects of childhood anthropometric indices on height and body composition in early adolescence. Each standard deviation decrease in length-for-age at birth was associated with a decrease in adolescent height-for-age of 0.7 SD in both boys and girls (all P < 0.001) and with 9.7-fold greater odds of stunting (95% CI 3.3-28.6). Each SD decrease in length-for-age in the first 30 months of life was associated with a decrease in adolescent height-for-age of 0.4 SD in boys and 0.6 SD in girls (all P < 0.001) and with 5.8-fold greater odds of stunting (95% CI 2.6-13.5). The effect of weight gain during early childhood on weight in early adolescence was more complex. Weight-for-length at birth and rate of change in weight-for-length in early childhood were positively associated with age- and sex-adjusted body mass index and a greater risk of being overweight in early adolescence. Linear growth retardation in early childhood is a strong determinant of adolescent stature, indicating that, in developing countries, growth failure in height during early childhood persists through early adolescence. Interventions addressing linear growth retardation in childhood are likely to improve adolescent stature and related health outcomes in adulthood. (Am J Phys Anthropol 148:451-461, 2012. (c) 2012 Wiley Periodicals, Inc.) |
In-utero exposure to dichlorodiphenyltrichloroethane and cognitive development among infants and school-aged children
Jusko TA , Klebanoff MA , Brock JW , Longnecker MP . Epidemiology 2012 23 (5) 689-98 BACKGROUND: Dichlorodiphenyltrichloroethane (DDT) continues to be used for control of infectious diseases in several countries. In-utero exposure to DDT and dichlorodiphenyldichloroethylene (DDE) has been associated with developmental and cognitive impairment among children. We examined this association in an historical cohort in which the level of exposure was greater than in previous studies. METHODS: The association of in-utero DDT and DDE exposure with infant and child neurodevelopment was examined in 1100 subjects in the Collaborative Perinatal Project, a prospective birth cohort enrolling pregnant women from 12 study centers in the United States from 1959 to 1965. Maternal DDT and DDE concentrations were measured in archived serum specimens. Infant mental and motor development was assessed at age 8 months using the Bayley Scales of Infant Development, and child cognitive development was assessed at age 7 years, using the Wechsler Intelligence Scale for Children. RESULTS: Although levels of DDT and DDE were relatively high in this population (median DDT concentration, 8.9 mcg/L; DDE, 24.5 mcg/L), neither was related to Mental or Psychomotor Development scores on the Bayley Scales nor to Full-Scale Intelligence Quotient at 7 years of age. Categorical analyses showed no evidence of dose-response for either maternal DDT or DDE, and estimates of the association between continuous measures of exposure and neurodevelopment were indistinguishable from 0. CONCLUSIONS: Adverse associations were not observed between maternal serum DDT and DDE concentrations and offspring neurodevelopment at 8 months or 7 years in this cohort. |
Eliminating the use of partially hydrogenated oil in food production and preparation
Dietz WH , Scanlon KS . JAMA 2012 308 (2) 143-4 Consumption of trans-fatty acids (TFAs) adversely affects cardiovascular risk factors and is associated with increased risk of coronary heart disease (CHD) events,1 making the reduction of TFA intake key to achieving the Department of Health and Human Services' Million Hearts goal to reduce myocardial infarctions and associated medical costs. Effects of TFA intake include increases in low-density lipoprotein cholesterol levels and decreases in high-density lipoprotein cholesterol levels.1 trans-Fatty acids also have been associated with proinflammatory effects, endothelial dysfunction, and decreased insulin sensitivity in persons with insulin resistance.1 To address this public health concern, the Dietary Guidelines for Americans2 and the Institute of Medicine have recommended that TFA intake should be as low as possible.3 This Viewpoint focuses on progress in reducing TFA intake in the United States and the potential health benefits of further reducing intake by eliminating a primary source of TFA. | Some TFAs are naturally present in dairy and meat products of ruminant animals, referred to as ruminant TFAs, and small amounts of industrially produced TFAs are formed during refinement of oils and prolonged deep frying of foods. However, the primary dietary source of industrially produced TFAs is partially hydrogenated oils. These industrially produced TFAs are commonly present in vegetable shortenings, margarines, baked goods, snack foods, and other foods made with or fried in partially hydrogenated oils. The partial hydrogenation process was initially thought to produce fats that were less harmful than saturated fat, but evidence has emerged that TFAs adversely affect health. Current dietary guidelines recommend keeping total TFA consumption as low as possible, and specifically limiting intake of foods that contain industrially produced TFAs such as partially hydrogenated oils.2 |
Prevalence of hearing loss in the United States by industry
Masterson EA , Tak S , Themann CL , Wall DK , Groenewold MR , Deddens JA , Calvert GM . Am J Ind Med 2012 56 (6) 670-81 BACKGROUND: Twenty-two million workers are exposed to hazardous noise in the United States. The purpose of this study is to estimate the prevalence of hearing loss across U.S. industries. METHODS: We examined 2000-2008 audiograms for male and female workers ages 18-65, who had higher occupational noise exposures than the general population. Prevalence and adjusted prevalence ratios (PRs) for hearing loss were estimated and compared across industries. RESULTS: In our sample, 18% of workers had hearing loss. When compared with the Couriers and Messengers industry sub-sector, workers employed in Mining (PR = 1.65, CI = 1.57-1.73), Wood Product Manufacturing (PR = 1.65, CI = 1.61-1.70), Construction of Buildings (PR = 1.52, CI = 1.45-1.59), and Real Estate and Rental and Leasing (PR = 1.59, CI = 1.51-1.68) had higher risks for hearing loss. CONCLUSIONS: Workers in the Mining, Manufacturing, and Construction industries need better engineering controls for noise and stronger hearing conservation strategies. More hearing loss research is also needed within traditional "low-risk" industries like Real Estate. (Am. J. Ind. Med. (c) 2012 Wiley Periodicals, Inc.) |
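The prevalence ratios above come from the study's adjusted regression models; purely as an illustration of the underlying measure, a crude (unadjusted) prevalence ratio with a 95% CI can be sketched as below. The counts, function name, and the log-normal CI approximation are illustrative assumptions, not the authors' method.

```python
import math

def prevalence_ratio(cases_exp, n_exp, cases_ref, n_ref, z=1.96):
    """Crude prevalence ratio (exposed vs. reference group) with a
    confidence interval from the log-normal approximation."""
    p_exp = cases_exp / n_exp          # prevalence in the exposed group
    p_ref = cases_ref / n_ref          # prevalence in the reference group
    pr = p_exp / p_ref
    # Standard error of ln(PR) for prevalence (cumulative) data
    se = math.sqrt((1 - p_exp) / cases_exp + (1 - p_ref) / cases_ref)
    lo = math.exp(math.log(pr) - z * se)
    hi = math.exp(math.log(pr) + z * se)
    return pr, lo, hi

# Hypothetical counts: 300/1000 workers with hearing loss in one industry
# vs. 180/1000 in a reference industry
pr, lo, hi = prevalence_ratio(300, 1000, 180, 1000)
```

A PR of 1.65, as reported for Mining, means hearing loss was 65% more prevalent than in the Couriers and Messengers reference sub-sector after adjustment.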
Effect of boot weight and sole flexibility on gait and physiological responses of firefighters in stepping over obstacles
Chiou SS , Turner N , Zwiener J , Weaver DL , Haskell WE . Hum Factors 2012 54 (3) 373-86 OBJECTIVE: The authors investigated the effect of boot weight and sole flexibility on spatiotemporal gait characteristics and physiological responses of firefighters in negotiating obstacles. BACKGROUND: Falls and overexertion are the leading causes of fire ground injuries and fatalities among firefighters. There have been few in-depth studies conducted to evaluate the risk factors of falls and overexertion associated with firefighter boots. METHOD: For the study, 13 female and 14 male firefighters, while wearing full turnout clothing and randomly assigned boots, walked for 5 min while stepping over obstacles. The independent variables included boot weight, sole flexibility, gender, and task duration. Spatiotemporal measures of foot trajectories and toe clearance were determined. Minute ventilation, oxygen consumption, carbon dioxide production, and heart rate were measured. RESULTS: Increased boot weight was found to significantly reduce trailing toe clearance when crossing the 30-cm obstacle. Significant increases in lateral displacement of the foot were found near the end of the 5-min walk compared with the beginning of the task. Increased boot weight significantly increased oxygen consumption. There were significant decreases in oxygen consumption for more flexible soles. CONCLUSION: Firefighters were more likely to trip over obstacles when wearing heavier boots and after walking for a period of time. Boot weight affected metabolic variables (5% to 11% increases per 1-kg increase in boot weight), which were mitigated by sole flexibility (5% to 7% decrease for more flexible soles). APPLICATION: This study provides useful information for firefighters and boot manufacturers in boot selection and design for reducing falls and overexertion. |
Factors affecting extension ladder angular positioning
Simeonov P , Hsiao H , Kim IJ , Powers JR , Kau TY . Hum Factors 2012 54 (3) 334-45 OBJECTIVE: The study objectives were to identify factors affecting extension ladders' angular positioning and evaluate the effectiveness of two anthropometric positioning methods. BACKGROUND: A leading cause for extension ladder fall incidents is a slide-out event, usually related to suboptimal ladder inclination. An improved ladder positioning method or procedure could reduce the risk of ladder stability failure and the related fall injury. METHOD: Participants in the study were 20 experienced and 20 inexperienced ladder users. A series of ladder positioning tests was performed in a laboratory environment with 4.88-m (16-ft) and 7.32-m (24-ft) ladders in extended and retracted positions. The setup methods included a no-instruction condition and two anthropometric approaches: the American National Standards Institute A14 and "fireman" methods. Performance measures included positioning angle and time. RESULTS: The results indicated that ladder setup method and ladder effective length, defined by size and extended state, affected ladder positioning angle. On average, both anthropometric methods were effective in improving extension ladder positioning; however, they required 50% more time than did the no-instruction condition and had a 9.5% probability of setting the ladder at an angle of less than 70 degrees. Shorter ladders were consistently positioned at shallower angles. CONCLUSION: Anthropometric methods may lead to safer ladder positioning than does no instruction when accurately and correctly performed. Workers tended to underperform as compared with their theoretical anthropometric estimates. Specific training or use of an assistive device may be needed to improve ladder users' performance. APPLICATION: The results provide practical insights for employers and workers to correctly set up extension ladders. |
Assessment of fall-arrest systems for scissor lift operators: computer modeling and manikin drop testing
Pan CS , Powers JR , Hartsell JJ , Harris JR , Wimer BM , Dong RG , Wu JZ . Hum Factors 2012 54 (3) 358-72 OBJECTIVE: The current study is intended to evaluate the stability of a scissor lift and the performance of various fall-arrest harnesses/lanyards during drop/fall-arrest conditions and to quantify the dynamic loading to the head/neck caused by fall-arrest forces. BACKGROUND: No data exist that establish the efficacy of fall-arrest systems for use on scissor lifts or the injury potential from fall incidents using a fall-arrest system. METHOD: The authors developed a multibody dynamic model of the scissor lift and a human lift operator model using ADAMS and LifeMOD Biomechanics Human Modeler. They evaluated lift stability for four fall-arrest system products and quantified biomechanical impacts on operators during drop/fall arrest, using manikin drop tests. Test conditions were constrained to flat surfaces to isolate the effect of manikin-lanyard interaction. RESULTS: The fully extended scissor lift maintained structural and dynamic stability for all manikin drop test conditions. The maximum arrest forces from the harnesses/lanyards were all within the limits of ANSI Z359.1. The dynamic loading in the lower neck during the fall impact reached a level that is typically observed in automobile crash tests, indicating a potential injury risk for vulnerable participants. CONCLUSION: Fall-arrest systems may function as an effective mechanism for fall injury protection for operators of scissor lifts. However, operators may be subjected to significant biomechanical loadings on the lower neck during fall impact. APPLICATION: Results suggest that scissor lifts retain stability under test conditions approximating human falls from predefined distances but injury could occur to vulnerable body structures. |
Associations of work hours with carotid intima-media thickness and ankle-brachial index: the Multi-Ethnic Study of Atherosclerosis (MESA)
Charles LE , Fekedulegn D , Burchfiel CM , Fujishiro K , Landsbergis P , Diez Roux AV , MacDonald L , Foy CG , Andrew ME , Stukovsky KH , Baron S . Occup Environ Med 2012 69 (10) 713-20 OBJECTIVES: Long working hours may be associated with cardiovascular disease (CVD). The objective was to investigate cross-sectional associations of work hours with carotid intima-media thickness (CIMT) and ankle-brachial index (ABI). METHODS: Participants were 1694 women and 1868 men from the Multi-Ethnic Study of Atherosclerosis. CIMT and ABI were measured using standard protocols. Information on work hours was obtained from questionnaires. Mean values of CIMT and ABI were examined across five categories of hours worked per week (≤20, 21-39, 40, 41-50 and >50) using analysis of variance/analysis of covariance. p values for trend were obtained from linear regression models. RESULTS: Mean age of participants was 56.9±8.4 years; 52.4% were men. Distinct patterns of association between work hours and the subclinical CVD biomarkers were found for women and men, although this heterogeneity by gender was not statistically significant. Among women only, work hours were positively associated with common (but not internal) CIMT (p=0.073) after full risk factor adjustment. Compared with women working 40 h, those working >50 h were more likely to have an ABI <1 (vs 1-1.4) (OR=1.85, 95% CI 1.01 to 3.38). In men, work hours and ABI were inversely associated (p=0.046). There was some evidence that the association between work hours and ABI was modified by occupational category (interaction p=0.061). Among persons classified as management/professionals, longer work hours were associated with lower ABI (p=0.015). No significant associations were observed among other occupational groups. CONCLUSIONS: Working longer hours may be associated with subclinical CVD. These associations should be investigated using longitudinal studies. |
Vaccinations and preventive screening services for older adults: opportunities and challenges in the USA
Shenson D , Anderson L , Slonim A , Benson W . Perspect Public Health 2012 132 (4) 165-70 Vaccinations and disease-screening services occupy an important position within the constellation of interventions designed to prevent, forestall or mitigate illness: they straddle the worlds of clinical medicine and public health. This paper focuses on a set of clinical preventive services that are recommended in the USA for adults aged 65 and older, based on their age and gender. These services include immunisations against influenza and pneumococcal disease, and screening for colorectal and breast cancers. We explore opportunities and challenges to enhance the delivery of these interventions, and describe some recently developed models for integrating prevention efforts based in clinician offices and in communities. We also report on a state-level surveillance measure that assesses whether older adults are 'up to date' on this subset of preventive services. To better protect the health of older Americans and change the projected trajectory of medical costs, expanded delivery of recommended vaccinations and disease screenings is likely to remain a focus for both US medicine and public health. |
Interim evaluation of a large scale sanitation, hygiene and water improvement programme on childhood diarrhea and respiratory disease in rural Bangladesh
Huda TM , Unicomb L , Johnston RB , Halder AK , Yushuf Sharker MA , Luby SP . Soc Sci Med 2012 75 (4) 604-11 Started in 2007, the Sanitation Hygiene Education and Water Supply in Bangladesh (SHEWA-B) project aims to improve the hygiene, sanitation and water supply for 20 million people in Bangladesh, and thus reduce disease among this population. This paper assesses the effectiveness of SHEWA-B in changing behaviors and reducing diarrhea and respiratory illness among children <5 years of age. We assessed behaviors at baseline in 2007 and after 6 months and 18 months by conducting structured observation of handwashing behavior in 500 intervention and 500 control households. In addition, we conducted spot checks of water and sanitation facilities in 850 intervention and 850 control households. We also collected monthly data on diarrhea and respiratory illness from 500 intervention and 500 control households from October 2007 to September 2009. Participants washed their hands with soap <3% of the time around food-related events in both intervention and control households at baseline and after 18 months. Washing both hands with soap or ash after cleaning a child's anus increased from 22% to 36%, and lack of access to a latrine decreased from 10% to 6.8% from baseline to 18 months. The prevalence of diarrhea and respiratory illness among children <5 years of age was similar in intervention and control communities throughout the study. This large-scale sanitation, hygiene and water improvement programme resulted in improvements in a few of its targeted behaviors, but these modest behavior changes have not yet resulted in a measurable reduction in childhood diarrhea and respiratory illness. |
Magnetic dipole moment of a moving electric dipole
Hnizdo V . Am J Phys 2012 80 (7) 645-647 The relativistic transformations of the polarization (electric moment density) P and magnetization (magnetic moment density) M of macroscopic electrodynamics1 imply corresponding transformations of the electric and magnetic dipole moments p and m, respectively, of a particle. Thus, to first order in v/c,2 p = p0 + (1/c) v × m0 (1) and m = m0 − (1/c) v × p0 (2). Here, the subscript 0 denotes quantities in the particle's rest frame and v is the particle's velocity. According to Eq. (1), a moving rest-frame magnetic dipole m0 develops an electric dipole moment p = v × m0/c. While this fact is well known and understood,3–5 the complementary effect that a moving electric dipole acquires a magnetic moment does not seem to be understood equally well. |
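The first-order transformation in Eqs. (1) and (2) can be checked numerically; below is a minimal sketch in Python (Gaussian units with c = 1; the function names and example values are illustrative, not from the paper).

```python
def cross(a, b):
    """Cross product of two 3-vectors given as sequences."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def moving_dipole_moments(p0, m0, v, c=1.0):
    """First-order-in-v/c transformation of dipole moments (Gaussian units):
       p = p0 + (v x m0)/c   (Eq. 1)
       m = m0 - (v x p0)/c   (Eq. 2)
    p0, m0: rest-frame electric and magnetic dipole moments; v: velocity."""
    vxm = cross(v, m0)
    vxp = cross(v, p0)
    p = tuple(p0[i] + vxm[i] / c for i in range(3))
    m = tuple(m0[i] - vxp[i] / c for i in range(3))
    return p, m

# A rest-frame electric dipole p0 along z, moving with v along x,
# acquires a magnetic moment m = -(v x p0)/c along +y: the complementary
# effect the paper discusses.
p, m = moving_dipole_moments(p0=(0, 0, 1), m0=(0, 0, 0), v=(0.1, 0, 0))
```

With m0 = 0 the electric moment is unchanged to first order, while the magnetic moment picked up is proportional to v/c, mirroring the better-known electric moment acquired by a moving magnetic dipole.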
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Disease Reservoirs and Vectors
- Drug Safety
- Environmental Health
- Epidemiology and Surveillance
- Genetics and Genomics
- Health Behavior and Risk
- Immunity and Immunization
- Informatics
- Injury and Violence
- Laboratory Sciences
- Maternal and Child Health
- Nutritional Sciences
- Occupational Safety and Health
- Public Health, General
- Sciences, General
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.
- Page last reviewed: Feb 1, 2024
- Page last updated: Sep 03, 2024
- Content source:
- Powered by CDC PHGKB Infrastructure