Trends in hospitalization with chronic obstructive pulmonary disease-United States, 1990-2005
Brown DW , Croft JB , Greenlund KJ , Giles WH . COPD 2010 7 (1) 59-62 Chronic obstructive pulmonary disease (COPD) is the fourth leading cause of death in the United States and a major cause of morbidity and disability. To update national estimates and examine trends for hospitalization with COPD between 1990 and 2005, we analyzed data from the National Hospital Discharge Survey (NHDS). The results indicated that an estimated 715,000 hospitalizations with COPD, or 23.6 per 10,000 population, occurred during 2005, an increase in the number and the rate of COPD hospitalizations since 1990 (370,000 hospitalizations; rate = 15.9 per 10,000 population). To reverse increases in the number of COPD hospitalizations and decrease the burden of COPD, public health programs should continue focused efforts to reduce total personal exposure to tobacco smoke, including passive smoke exposure; to occupational dusts and chemicals; and to other indoor and outdoor air pollutants linked to COPD.
Weight loss from maximum body weight and mortality: the Third National Health and Nutrition Examination Survey Linked Mortality File
Ingram DD , Mussolino ME . Int J Obes (Lond) 2010 34 (6) 1044-50 OBJECTIVE: The aim of this longitudinal study is to examine the relationship between weight loss from maximum body weight, body mass index (BMI), and mortality in a nationally representative sample of men and women. DESIGN: Longitudinal cohort study. SUBJECTS: In all, 6117 whites, blacks, and Mexican-Americans 50 years and over at baseline who survived at least 3 years of follow-up, from the Third National Health and Nutrition Examination Survey Linked Mortality Files (1988-1994 with passive mortality follow-up through 2000), were included. MEASUREMENTS: Measured body weight and self-reported maximum body weight obtained at baseline. Weight loss (maximum body weight minus baseline weight) was categorized as <5%, 5-<15%, and ≥15%. Maximum BMI (reported maximum weight (kg)/measured baseline height (m)²) was categorized as healthy weight (18.5-24.9), overweight (25.0-29.9), and obese (≥30.0). RESULTS: In all, 1602 deaths were identified. After adjusting for age, race, smoking, health status, and preexisting illness, overweight men with weight loss of 15% or more, overweight women with weight loss of 5-<15%, and women in all BMI categories with weight loss of 15% or more were at increased risk of death from all causes compared with those in the same BMI category who lost <5%; hazard ratios ranged from 1.46 to 2.70. Weight loss of 5-<15% reduced risk of death from cardiovascular diseases among obese men. CONCLUSIONS: Weight loss of 15% or more from maximum body weight is associated with increased risk of death from all causes among overweight men and among women regardless of maximum BMI.
Medical complications among hospitalizations for ischemic stroke in the United States from 1998 to 2007
Tong X , Kuklina EV , Gillespie C , George MG . Stroke 2010 41 (5) 980-6 BACKGROUND AND PURPOSE: The common medical complications after ischemic stroke are associated with increased mortality and resource use. METHOD: The study population consisted of 1 150 336 adult hospitalizations with ischemic stroke as a primary diagnosis included in the 1998 to 2007 Nationwide Inpatient Sample of the Healthcare Cost and Utilization Project. Multiple logistic regression analyses were used to examine changes between 1998 to 1999 and 2006 to 2007 in the prevalence of acute myocardial infarction, pneumonia, deep venous thrombosis, pulmonary embolism, or urinary tract infection, in-hospital mortality, and length of stay. RESULTS: In 2006 to 2007, the prevalence of hospitalizations with a secondary diagnosis of acute myocardial infarction, pneumonia, deep venous thrombosis, pulmonary embolism, and urinary tract infection was 1.6%, 2.9%, 0.8%, 0.3%, and 10.1%, respectively. The adjusted ORs for a hospitalization in 2006 to 2007 complicated by acute myocardial infarction, deep venous thrombosis, pulmonary embolism, or urinary tract infection, using 1998 to 1999 as the referent, were 1.39, 1.68, 2.39, and 1.18, respectively. The odds of pneumonia did not change significantly between 1998 to 1999 and 2006 to 2007. In-hospital mortality was significantly lower in 2006 to 2007 than in 1998 to 1999. Although overall length of stay decreased significantly between 1998 to 1999 and 2006 to 2007, it remained unchanged for hospitalizations with acute myocardial infarction, pneumonia, deep venous thrombosis, and pulmonary embolism. CONCLUSIONS: Although in-hospital mortality decreased over the study period, 4 of the 5 complications were more common in 2006 to 2007 than they were 8 years earlier, with the largest increase observed for deep venous thrombosis and pulmonary embolism.
Physical activity levels and differences in the prevalence of diabetes between the United States and Canada
Zhang X , Geiss LS , Caspersen CJ , Cheng YJ , Engelgau MM , Johnson JA , Plotnikoff RC , Gregg EW . Prev Med 2010 50 241-5 OBJECTIVE: To examine the American-Canadian difference in physical activity (PA) and its association with diabetes prevalence. METHODS: We used cross-sectional data from nationally representative samples of adults (8688 persons aged ≥18 years) participating in the 2004 Joint Canada/U.S. Survey of Health. Using data on up to 22 activities in the past 3 months, we defined 3 PA groups (in MET-hours/day) as: low (<1.5), moderate (1.5-2.9), and high (≥3.0). We employed logistic regression models in our analyses. RESULTS: Self-reported diabetes prevalence was 7.6% in the U.S. and 5.4% in Canada. The prevalence of low PA was considerably higher in the U.S. (70.9%) than in Canada (52.3%), while levels of moderate and high PA were higher in Canada (24.6% and 23.1%, respectively) than in the U.S. (14.3% and 14.8%, respectively). Using nationality (Canada as reference) to predict diabetes status, the adjusted odds ratio was 1.48 (95% CI, 1.22-1.79), and became 1.38 (95% CI, 1.15-1.66) when additionally adjusting for PA level. We estimate that 20.8% of the U.S.-Canada difference in diabetes prevalence is associated with PA. CONCLUSIONS: The difference in the prevalence of diabetes between U.S. and Canadian adults may be partially explained by differences in PA between the two countries.
Annual report to the nation on the status of cancer, 1975-2006, featuring colorectal cancer trends and impact of interventions (risk factors, screening, and treatment) to reduce future rates
Edwards BK , Ward E , Kohler BA , Eheman C , Zauber AG , Anderson RN , Jemal A , Schymura MJ , Lansdorp-Vogelaar I , Seeff LC , van Ballegooijen M , Goede SL , Ries LA . Cancer 2010 116 (3) 544-73 BACKGROUND: The American Cancer Society, the Centers for Disease Control and Prevention (CDC), the National Cancer Institute (NCI), and the North American Association of Central Cancer Registries (NAACCR) collaborate annually to provide updated information regarding cancer occurrence and trends in the United States. This year's report includes trends in colorectal cancer (CRC) incidence and death rates and highlights the use of microsimulation modeling as a tool for interpreting past trends and projecting future trends to assist in cancer control planning and policy decisions. METHODS: Information regarding invasive cancers was obtained from the NCI, CDC, and NAACCR; and information on deaths was obtained from the CDC's National Center for Health Statistics. Annual percentage changes in the age-standardized incidence and death rates (based on the year 2000 US population standard) for all cancers combined and for the top 15 cancers were estimated by joinpoint analysis of long-term trends (1975-2006) and for short-term fixed-interval trends (1997-2006). All statistical tests were 2-sided. RESULTS: Both incidence and death rates from all cancers combined significantly declined (P < .05) in the most recent time period for men and women overall and for most racial and ethnic populations. These decreases were driven largely by declines in both incidence and death rates for the 3 most common cancers in men (ie, lung and prostate cancers and CRC) and for 2 of the 3 leading cancers in women (ie, breast cancer and CRC). The long-term trends for lung cancer mortality in women had smaller and smaller increases until 2003, when there was a change to a nonsignificant decline. 
Microsimulation modeling demonstrates that declines in CRC death rates are consistent with a relatively large contribution from screening and with a smaller but demonstrable impact of risk factor reductions and improved treatments. These declines are projected to continue if risk factor modification, screening, and treatment remain at current rates, but they could be accelerated further with favorable trends in risk factors and higher utilization of screening and optimal treatment. CONCLUSIONS: Although the decrease in overall cancer incidence and death rates is encouraging, rising incidence and mortality for some cancers are of concern.
Limited evolution of human immunodeficiency virus type 1 in the thymus of a perinatally infected child
Scinicariello F , Kourtis AP , Nesheim S , Abramowsky C , Lee FK . Clin Infect Dis 2010 50 (5) 726-32 BACKGROUND: Involvement of the thymus during human immunodeficiency virus (HIV) infection may impair production of naive lymphocytes leading to more rapid depletion, but the characteristics of primary strains in the thymus are not well studied because of the unavailability of tissue in living individuals. METHODS: We studied the characteristics of HIV type 1 (HIV-1) in a 5-year-old perinatally infected child with thymitis and compared the genomic sequences of the HIV-1 C2-V5 region of the env gene in the thymic tissue and peripheral blood. RESULTS: The thymus harbored predominantly viral sequences close to the founder HIV-1 variant that circulated in the blood at 2 and 3 months of age, whereas the peripheral blood virus at 5 years of age had evolved extensively. Viral sequences from circulating CD8(+) T cells at 5 years of age phylogenetically clustered with those from the thymic tissue. CONCLUSIONS: These results indicate the existence of a distinct thymic viral reservoir and suggest that circulating CD8(+) T cells were infected in the thymus, presumably at the CD4(+)CD8(+) thymocyte stage. They also demonstrate that not all thymic HIV infections will necessarily lead to severe thymic dysfunction. The characteristics of the virus strain seeding the thymus may dictate the rate of disease progression.
New drugs and new regimens for the treatment of tuberculosis: review of the drug development pipeline and implications for national programmes
Lienhardt C , Vernon A , Raviglione MC . Curr Opin Pulm Med 2010 16 (3) 186-93 PURPOSE OF REVIEW: The aim is to review briefly the problems related to treatment of drug-susceptible and drug-resistant tuberculosis (TB), describe recent advances in the development of new drugs and new regimens, and discuss implications for control programmes. RECENT FINDINGS: Encouraging advances in TB drug research and development have been made since the turn of the century, resulting in a large number of new products introduced into the global portfolio. SUMMARY: Currently, at least nine compounds have advanced to clinical development, including four existing drugs redeveloped for TB indication and five new chemical entities. Present clinical trials are testing new combinations of drugs for a shortened treatment of drug-susceptible TB (<6 months duration) or the safety and efficacy of new drugs in addition to an optimized background therapy for the treatment of multidrug-resistant TB. There are at least 34 compounds or projects in the discovery and preclinical stages, including eight compounds in preclinical development. This increasing development of single compounds underscores the need for a novel approach to test for optimal drug combinations that would be proposed for treatment of TB in all its forms, and the necessary collaboration of pharmaceutical companies, academia, research institutions, donors, and regulatory authorities.
Haemophilus influenzae type b disease in HIV-infected children: a review of the disease epidemiology and effectiveness of Hib conjugate vaccines
Mangtani P , Mulholland K , Madhi SA , Edmond K , O'Loughlin R , Hajjeh R . Vaccine 2010 28 (7) 1677-83 The paper reviews the literature on the epidemiology of Hib disease and the effectiveness of Hib conjugate vaccine (HibCV) in HIV-infected children. The current three-dose primary Hib conjugate vaccine schedule in low-income settings has had a striking impact on the incidence of Hib disease. However, HIV-infected children have an almost 6-fold higher risk of Haemophilus influenzae type b (Hib) invasive disease than HIV-uninfected children, and HibCV effectiveness is lower in this population. HIV-related HibCV failures are difficult to detect without well-functioning surveillance systems and HIV testing of cases. Breakthrough Hib cases have been noted in vaccinated HIV-infected children in South Africa. A HibCV booster dose in addition to the three-dose primary schedule is routine in many, but not all, high-income countries. In order to determine whether a booster dose should be given to HIV-infected children in developing countries, well-designed studies need to be conducted to better determine the persistence of protective antibody concentrations, response to booster doses of vaccine, and the timing of and risk factors for vaccine failure in HIV-infected children both treated with and naive to antiretroviral drug therapy (ART). Meanwhile, physicians and public health personnel should be especially vigilant in ensuring that HIV-infected infants receive their primary doses of HibCV, ART, and co-trimoxazole prophylaxis. Until more definitive evidence is available, physicians may also need to consider a booster dose for such children irrespective of ART status. In any updating of vaccine schedules, HIV-infected children need particular consideration.
Characterization of group A rotavirus infections in adolescents and adults from Pune, India: 1993-1996 and 2004-2007
Tatte VS , Gentsch JR , Chitambar SD . J Med Virol 2010 82 (3) 519-27 A total of 1,591 fecal specimens were collected in 1993-1996 and 2004-2007 from adolescents and adults with acute gastroenteritis in Pune, India, for detection and characterization of rotavirus. At the two time points, group A rotavirus was detected in 8.6% and 16.2% of the adolescents and 5.2% and 17.2% of the adults, respectively. Reverse transcription-PCR with consensus primers followed by multiplex genotyping PCR detected common strains G1P[8], G2P[4], G3P[8], and G4P[8] in a total of 53.1% of the samples from 1993 to 1996, while the only prevalent strain identified in 2004-2007 was G2P[4] (23.5% of total). Uncommon rotavirus strains (G1P[4], G2P[8], G9P[6]/P[4]) increased from 7.8% (1993-1996) to 41.2% (2004-2007), while the prevalence of mixed rotavirus infections was high (39%/35%) at both time points. Mixed infections detected by multiplex PCR were confirmed by sequencing two or more individual genotype-specific PCR products of the VP7 and VP4 genes from the same sample. Phylogenetic analysis of the sequences showed circulation of a heterogeneous rotavirus strain population comprising genotypes G1 (lineages I and IIb), G2 (lineages I and IIb), G4 (lineage Ia), P[4] (lineages P[4]-5 and P[4]-1), P[8] (lineages P[8]-II and P[8]-III), and P[6] (M37-like lineage). The VP6 gene sequences of the nontypeable strains were most homologous to animal strains. This study documents the molecular epidemiology of rotavirus strains in adolescents and adults in India, and suggests that it may be important to monitor these strains over time for the potential impact on rotavirus vaccines under development for use in the Indian population.
Phthalate exposure and precocious puberty in females
Lomenick JP , Calafat AM , Melguizo Castro MS , Mier R , Stenger P , Foster MB , Wintergerst KA . J Pediatr 2010 156 (2) 221-5 OBJECTIVE: To determine whether phthalate exposure is associated with precocious puberty in girls. STUDY DESIGN: This was a multicenter cross-sectional study in which 28 girls with central precocious puberty (CPP) and 28 age- and race-matched prepubertal females were enrolled. Nine phthalate metabolites and creatinine were measured in spot urine samples from these 56 children. RESULTS: Levels of 8 of the 9 phthalate metabolites were above the limit of detection (LOD) in all 56 subjects. Mono (2-ethylhexyl) phthalate (MEHP) was below the LOD in 25/56 samples (14 subjects with precocious puberty and 11 controls). No significant differences were found between the children with CPP and the controls in either absolute or creatinine-normalized concentrations of any of the 9 phthalate metabolites. CONCLUSIONS: Although phthalates may be associated with certain other toxicities in humans, our study suggests that phthalate exposure is not associated with precocious puberty in female children.
General public health considerations for responding to animal hoarding cases
Castrodale L , Bellay YM , Brown CM , Cantor FL , Gibbins JD , Headrick ML , Leslie MJ , MacMahon K , O'Quin JM , Patronek GJ , Silva RA , Wright JC , Yu DT . J Environ Health 2010 72 (7) 14-18 Animal hoarding is an under-recognized problem that exists in most communities and adversely impacts the health, welfare, and safety of humans, animals, and the environment. These guidelines address public health and worker safety concerns in handling situations where animal hoarding or other dense concentrations of animals have caused unhealthy and unsafe conditions. Because animal hoarding situations are often complex, a full response is likely to be prolonged and require a cross-jurisdictional multiagency effort. Each animal hoarding case has unique circumstances related to the types and numbers of animals involved, the physical structure(s) where they are being kept, and the health status of the animals, among other factors that must be taken into account in planning a response. However, some general public health considerations and associated recommendations for personal protective equipment use that apply to all cases are presented.
Impact of the Red River catastrophic flood on women giving birth in North Dakota, 1994-2000
Tong VT , Zotti ME , Hsia J . Matern Child Health J 2010 15 (3) 281-8 To document changes in birth rates, birth outcomes, and pregnancy risk factors among women giving birth after the 1997 Red River flood in North Dakota. We analyzed detailed county-level birth files pre-disaster (1994-1996) and post-disaster (1997-2000) in North Dakota. Crude birth rates and adjusted fertility rates were calculated. The demographic and pregnancy risk factors were described among women delivering singleton births. Logistic regression was conducted to examine associations between the disaster and low birth weight (<2,500 g), preterm birth (<37 weeks), and small for gestational age infants adjusting for confounders. The crude birth rate and direct-adjusted fertility rate decreased significantly after the disaster in North Dakota. The proportion of women giving birth who were older, non-white, unmarried, and had a higher education increased. Compared to pre-disaster, there were significant increases in the following maternal measures after the disaster: any medical risks (5.1-7.1%), anemia (0.7-1.1%), acute or chronic lung disease (0.4-0.5%), eclampsia (0.3-2.1%), and uterine bleeding (0.3-0.4%). In addition, there was a significant increase in births that were low birth weight (OR 1.11, 95% CI 1.03-1.21) and preterm (OR 1.09, 95% CI 1.03-1.16) after adjusting for maternal characteristics and smoking. Following the flood, there was an increase in medical risks, low birth weight, and preterm delivery among women giving birth in North Dakota. Further research that examines birth outcomes of women following a catastrophic disaster is warranted.
A study of variations in the reported haemophilia A prevalence around the world
Stonebraker JS , Bolton-Maggs PHB , Soucie JM , Walker I , Brooker M . Haemophilia 2010 16 (1) 20-32 The objectives of this paper were to study the reported haemophilia A prevalence (per 100 000 males) on a country-by-country basis and address the following: Does the reported prevalence of haemophilia A vary by national economy? We collected prevalence data for 106 countries from the World Federation of Hemophilia (WFH) annual global surveys and the literature. We found that the reported haemophilia A prevalence varied considerably among countries, even among the wealthiest of countries. The prevalence (per 100 000 males) for high income countries was 12.8 +/- 6.0 (mean +/- SD) whereas it was 6.6 +/- 4.8 for the rest of the world. Within a country, there was a strong trend of increasing prevalence over time - the prevalence for Canada rose from 10.2 in 1989 to 14.2 in 2008 (R = 0.94 and P < 0.001) and for the United Kingdom it rose from 9.3 in 1974 to 21.6 in 2006 (R = 0.94 and P < 0.001). Prevalence data reported from the WFH compared well with prevalence data from the literature. Patient registries generally provided the highest quality of prevalence data. The lack of accurate country-specific prevalence data has constrained planning efforts for the treatment and care of people with haemophilia A. With improved information, healthcare agencies can assess budgetary needs to develop better diagnostic and treatment facilities for affected patients and families and work to ensure adequate supplies of factor VIII concentrates for treatment. In addition, this information can help manufacturers plan the production of concentrates and prevent future shortages.
The things that get measured are the things that get done
Backinger CL , Malarcher AM . Am J Prev Med 2010 38 S433-6 The things that get measured are the things that get done.1 This simple, yet insightful statement underscores the priority area of surveillance, one of the six core strategies from the Consumer Demand Workshop that has as its ultimate goal to increase demand among smokers for proven tobacco-cessation products and services.2 The specific core strategy related to surveillance is, “Systematically measuring, tracking, reporting, and studying quitting and treatment use—and their drivers and benefits—to identify opportunities and successes.”2 Surveillance is needed to assess all the steps on the quitter's journey, starting from the decision to make a quit attempt, through the choice of method to quit, the actual quit attempt, short-term success including relapse and re-cycling, and long-term success.3 No national survey exists in the U.S. that measures all the dynamic changes in tobacco-use behavior (host), tobacco products (agent), tobacco industry (vector), and social, policy, and media environments (environment).4

Although this commentary addresses a specific part of the host domain, tobacco cessation, it is important to recognize that all domains influence the quitting process. The core strategy of surveillance for building consumer demand among smokers for proven tobacco-cessation products and services can also inform the other five core strategies: perspective on quitting, redesigning products and services, marketing and promotion, understanding policies as opportunities for cessation, and combining and integrating the strategies.
Survival and growth of salmonella in salsa and related ingredients
Ma L , Zhang G , Gerner-Smidt P , Tauxe RV , Doyle MP . J Food Prot 2010 73 (3) 434-44 A large outbreak of Salmonella Saintpaul associated with raw jalapeno peppers, serrano peppers, and possibly tomatoes was reported in the United States in 2008. During the outbreak, two clusters of illness investigated among restaurant patrons were significantly associated with eating salsa. Experiments were performed to determine the survival and growth characteristics of Salmonella in salsa and related major ingredients, i.e., tomatoes, jalapeno peppers, and cilantro. Intact and chopped vegetables and different formulations of salsas were inoculated with a five-strain mixture of Salmonella and then stored at 4, 12, and 21 degrees C for up to 7 days. Salmonella populations were monitored during storage. Salmonella did not grow, but survived on intact tomatoes and jalapeno peppers, whereas significant growth at 12 and 21 degrees C was observed on intact cilantro. In general, growth of Salmonella occurred in all chopped vegetables when stored at 12 and 21 degrees C, with chopped jalapeno peppers being the most supportive of Salmonella growth. Regardless of differences in salsa formulation, no growth of Salmonella (initial inoculation ca. 3 log CFU/g) was observed in salsa held at 4 degrees C; however, rapid or gradual decreases in Salmonella populations were only observed in formulations that contained both fresh garlic and lime juice. Salmonella grew at 12 and 21 degrees C in salsas, except for those formulations that contained both fresh garlic and lime juice, in which salmonellae were rapidly or gradually inactivated, depending on salsa formulation. These results highlight the importance of preharvest pathogen contamination control of fresh produce and proper formulation and storage of salsa.
Latent TB infection treatment acceptance and completion in the United States and Canada
Horsburgh Jr CR , Goldberg S , Bethel J , Chen S , Colson PW , Hirsch-Moverman Y , Hughes S , Shrestha-Kuwahara R , Sterling TR , Wall K , Weinfurter P . Chest 2010 137 (2) 401-9 BACKGROUND: Treatment of latent TB infection (LTBI) is essential for preventing TB in North America, but acceptance and completion of this treatment have not been systematically assessed. METHODS: We performed a retrospective, randomized two-stage cross-sectional survey of treatment and completion of LTBI at public and private clinics in 19 regions of the United States and Canada in 2002. RESULTS: At 32 clinics that both performed tuberculin skin testing and offered treatment, 123 (17.1%; 95% CI, 14.5%-20.0%) of 720 subjects tested and offered treatment declined. Employees at health-care facilities were more likely to decline (odds ratio [OR], 4.74; 95% CI, 1.75-12.9; P = .003), whereas those in contact with a patient with TB were less likely to decline (OR, 0.19; 95% CI, 0.07-0.50; P = .001). At 68 clinics starting treatment regardless of where skin testing was performed, 1,045 (52.7%; 95% CI, 48.5%-56.8%) of 1,994 people starting treatment failed to complete the recommended course. Risk factors for failure to complete included starting the 9-month isoniazid regimen (OR, 2.08; 95% CI, 1.23-3.57), residence in a congregate setting (nursing home, shelter, or jail; OR, 2.94; 95% CI, 1.58-5.56), injection drug use (OR, 2.13; 95% CI, 1.04-4.35), age ≥15 years (OR, 1.49; 95% CI, 1.14-1.94), and employment at a health-care facility (OR, 1.37; 95% CI, 1.00-1.85). CONCLUSIONS: Fewer than half of the people starting treatment of LTBI completed therapy. Shorter regimens and interventions targeting residents of congregate settings, injection drug users, and employees of health-care facilities are needed to increase completion.
Measurement of HIV prevention indicators: a comparison of the PLACE method and a household survey in Zambia
Tate J , Singh K , Ndubani P , Kamwanga J , Buckner B . AIDS Behav 2010 14 (1) 209-17 Reaching populations at greatest risk for acquiring HIV is essential for efforts to combat the epidemic. This paper presents the Priorities for Local AIDS Control Efforts (PLACE) method, which focuses on understanding the venues where people are meeting new sexual partners and behaviors which put people at risk. A comparison of data from two PLACE studies in Zambia with a national household survey, the Zambia Sexual Behavior Survey 2005, indicated that the PLACE population was at greater risk of acquiring HIV. Respondents in the two PLACE studies were significantly more likely to report 1+ new partners in the past 4 weeks, 2+ partners in the past 12 months, 1+ new partner in the past 12 months, and transactional sex. Data from the PLACE method are important for targeting interventions for those most likely to acquire and transmit HIV.
Health risks and travel preparation among foreign visitors and expatriates during the 2008 Beijing Olympic and Paralympic Games
Jentes ES , Davis XM , Macdonald S , Snyman PJ , Nelson H , Quarry D , Lai I , van Vliet EW , Balaban V , Marano C , Mues K , Kozarsky P , Marano N . Am J Trop Med Hyg 2010 82 (3) 466-472 During the 2008 Olympic and Paralympic Games, we conducted surveillance of illnesses among travelers at six Beijing clinics. Surveys asked about demographic, pre-travel, and vaccination information, and physician-provided diagnoses. Of 807 respondents, 38% and 57% were classified as foreign visitors (FV) and expatriates, respectively. Less than one-half of FV sought pre-travel advice; sources included health-care providers and friends/family. FV vaccination rate was also low; however, most vaccines given were recommended by the Centers for Disease Control and Prevention. The most common FV diagnoses were respiratory, injury/musculoskeletal, and gastrointestinal illnesses; for expatriates, injury/musculoskeletal, respiratory, and dermatologic were the most common illnesses. Respiratory illnesses in expatriates were significantly less common in 2008 than during 2004-2007 (χ² = 10.2; P = 0.0014), suggesting that control programs may have reduced pollutants/respiratory irritants during the 2008 Games. We found no previous studies of health outcomes among expatriates living in cities with mass travel events. These findings highlight the need to continuously disseminate information to health-care providers advising travelers.
Is sexual serosorting occurring among HIV-positive injection drug users? Comparison between those with HIV-positive partners only, HIV-negative partners only, and those with any partners of unknown status
Mizuno Y , Purcell DW , Latka MH , Metsch LR , Ding H , Gomez CA , Knowlton AR . AIDS Behav 2010 14 (1) 92-102 Using baseline data from a multi-site, randomized controlled study (INSPIRE), we categorized 999 HIV-positive IDUs into three groups based on serostatus of their sex partners. Our data provide some evidence for serosorting occurring in our sample; about 40% of the sample had sex exclusively with HIV-positive partners, and about half of them reported having unprotected sex with these partners. Twenty percent had sex exclusively with HIV-negative partners; their sexual behaviors tended to be least risky, with about two-thirds reporting their sex was protected. However, we also found that another 40% had at least one partner of unknown HIV status, and sexual and drug risk was the highest among them. They were also least empowered, showing attributes that may undermine HIV prevention. Some of these findings are consistent with findings from MSM studies, suggesting that partner selection practices are similar between primarily heterosexual IDUs and MSM.
The LIFE Project: a community-based weight loss intervention program for rural African American women
Parker VG , Coles C , Logan BN , Davis L . Fam Community Health 2010 33 (2) 133-143 Obesity continues to be a significant health problem for African American women. While a number of obesity interventions target urban African American women, few target rural ones. The LIFE Project is a 10-week intervention designed to reduce obesity in this rural population. Two different interventions (spiritually based and nonspiritually based) were pilot tested, each utilizing a pretest-posttest design. Results demonstrated that both interventions led to significant reductions in weight, but the spiritually based intervention led to additional improvements. The LIFE Project also demonstrated that churches are appropriate settings to deliver health interventions to these women.
Effects of folic acid awareness on knowledge and consumption for the prevention of birth defects among Hispanic women in several U.S. communities
Prue CE , Hamner HC , Flores AL . J Womens Health (Larchmt) 2010 19 (4) 689-98 BACKGROUND: The neural tube defects (NTDs) anencephaly and spina bifida are serious birth defects of the brain and spine that affect about 3000 pregnancies per year in the United States. Research has found a strong link between periconceptional folic acid consumption and NTD prevention. METHODS: Because Hispanic women have higher rates of NTD-affected births, targeted folic acid promotion efforts were conducted in several major cities from 1999 to 2002. Efforts included paid and unpaid placements of Spanish-language public service announcements (PSAs) and community-level education through the use of promotoras. Analyses focused on whether women's reported awareness of folic acid, regardless of promotion type, affected their knowledge or behavior. RESULTS AND CONCLUSIONS: Women who reported awareness of folic acid had greater folic acid knowledge and use of vitamins containing folic acid than those not aware. Analyses also examined the use of vitamins containing folic acid by pregnancy intention among women who reported awareness of folic acid. The results were varied. Pregnancy wanters were most likely to use vitamins containing folic acid daily. For this group, however, awareness played a smaller role in whether they reported consuming a vitamin containing folic acid than it did for pregnancy waiters and avoiders. |
Building consumer demand for tobacco-cessation products and services: the national tobacco cessation collaborative's consumer demand roundtable
Backinger CL , Thornton-Bullock A , Miner C , Orleans CT , Siener K , DiClemente CC , Phillips TM , Rowden JN , Arkin E . Am J Prev Med 2010 38 S307-11 Of the 43.4 million current smokers in the U.S., 70% say that they want to quit, and over 40% report making at least one serious quit attempt each year [1,2]. But most smokers who try to quit do not use the proven treatments that could double or triple their chances of succeeding [3]. Unfortunately, smokers in the groups and populations with the highest smoking prevalence (Native Americans/Alaska Natives, low-income smokers, and those with limited formal education) are the most likely to try to quit, the least likely to use proven treatments, and the most likely to fail in their attempts [4,5]. There is no better way to improve the nation's health and reduce health and healthcare disparities than to reach, with treatments that work, more of the 17 million U.S. smokers who try to quit. | Boosting smokers' success by increasing their awareness of, demand for, access to, and use of effective treatments was recently identified as a priority by the Treating Tobacco Use and Dependence: 2008 Update—Clinical Practice Guideline and previously by the NIH State-of-the-Science Conference on Tobacco Cessation, Prevention, and Control in 2006 [3,6]. The National Tobacco Cessation Collaborative (NTCC)'s Consumer Demand Roundtable was created to focus on this priority of building greater demand for tobacco-cessation products and services. Part of the reason for the underuse of science-based treatments is that for decades the public health community has seen smokers as “patients” who are prescribed treatments and told how to quit. With this view, treatments only have to be effective, but not necessarily appealing. But in today's consumer culture, smokers have many options, both proven and unproven. Viewing smokers instead as “consumers” involves seeing them as empowered to make treatment choices. 
Viewing smokers and quitters as consumers makes it clear that proven treatments must not only be effective, but also engaging and able to produce a positive consumer experience. From this perspective, if we are doing our jobs, quitters should want to use the treatments that work. The fact that treatment use remains low even when proven treatments are offered free of charge or are fully covered by health insurance indicates that we have more work to do. |
Contribution of integrated campaign distribution of long-lasting insecticidal nets to coverage of target groups and total populations in malaria-endemic areas in Madagascar
Kulkarni MA , Eng JV , Desrochers RE , Cotte AH , Goodson JL , Johnston A , Wolkon A , Erskine M , Berti P , Rakotoarisoa A , Ranaivo L , Peat J . Am J Trop Med Hyg 2010 82 (3) 420-425 In October 2007, Madagascar conducted a nationwide integrated campaign to deliver measles vaccination, mebendazole, and vitamin A to children six months to five years of age. In 59 of the 111 districts, long-lasting insecticidal nets (LLINs) were delivered to children less than five years of age in combination with the other interventions. A community-based, cross-sectional survey assessed LLIN ownership and use six months post-campaign during the rainy season. LLIN ownership was analyzed by wealth quintile to assess equity. In the 59 districts, 76.8% of households possessed at least one LLIN from any source and 56.4% of households possessed a campaign net. Equity of campaign net ownership was evident. Post-campaign, the LLIN use target of ≥ 80% by children less than five years of age and a high level of LLIN use (69%) by pregnant women were attained. Targeted LLIN distribution further contributed to total population coverage (60%) through use of campaign nets by all age groups. |
Mothers' preferences and willingness to pay for vaccinating daughters against human papillomavirus
Brown DS , Johnson FR , Poulos C , Messonnier ML . Vaccine 2010 28 (7) 1702-8 A choice-format, conjoint-analysis survey was developed and fielded to estimate how features of human papillomavirus (HPV) vaccines affect mothers' perceived benefit and stated vaccine uptake for daughters. Data were collected from a national sample of 307 U.S. mothers of girls aged 13-17 years who had not yet received an HPV vaccine. Preferences for four features of HPV vaccines were evaluated: protection against cervical cancer, protection against genital warts, duration of protection, and cost. We estimate that mean maximum willingness-to-pay (WTP)-an economic measure of the total benefits to consumers-for current HPV vaccine technology ranges between $560 and $660. All vaccine features were statistically significant determinants of WTP and uptake. Mothers were willing to pay $238 more for a vaccine that provides 90% protection for genital warts relative to a vaccine that provides no protection against warts. WTP for lifetime protection vs. 10 years protection was $245. Mothers strongly valued greater cervical cancer efficacy, with 100% protection against cervical cancers the most desired feature overall. Adding a second HPV vaccine choice to U.S. consumers' alternatives is predicted to increase stated uptake by 16%. Several features were significantly associated with stated choices and uptake: age of mother, race/ethnicity, household income, and concern about HPV risks. These findings provide new data on how HPV vaccines are viewed and valued by mothers, and how uptake may change in the context of evolving vaccine technology and as new data are reported on duration and efficacy. |
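In choice-format conjoint analysis of the kind described above, willingness-to-pay for a vaccine feature is conventionally derived as the ratio of the feature's utility coefficient to the (negative) cost coefficient. A minimal sketch of that calculation, using invented logit coefficients for illustration only (the study's estimates are not reproduced here):

```python
def wtp(attr_coef: float, cost_coef: float) -> float:
    """Marginal willingness-to-pay for an attribute: the utility gained
    from the attribute, converted to dollars via the marginal
    (dis)utility of cost. cost_coef is expected to be negative."""
    return -attr_coef / cost_coef

# Hypothetical coefficients, for illustration only
beta_warts_90pct = 0.95   # utility of 90% genital-wart protection vs. none
beta_cost = -0.004        # disutility per dollar of vaccine cost

print(round(wtp(beta_warts_90pct, beta_cost), 2))
```

With these invented inputs the implied WTP is $237.50, the same order of magnitude as the $238 reported in the abstract, but the coefficients themselves are not taken from the paper.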
Evaluating a web-based test results system at an urban STI clinic
Ling SB , Richardson DB , Mettenbrink CJ , Westergaard BC , Sapp-Jones TD , Crane LA , Nyquist AC , McFarlane M , Kachur R , Rietmeijer CA . Sex Transm Dis 2010 37 (4) 259-63 BACKGROUND: Notifying patients of gonorrhea and chlamydia test results using online services may improve clinic efficiency and increase receipt of test results. This study evaluated the implementation of an online results system in an urban sexually transmitted infections clinic. METHODS: Using the clinic's electronic medical records system to assess if and how gonorrhea and chlamydia test results were obtained, 3 time periods were examined between December 2007 and April 2009: Period 1, six months before initiation of the online results system; Period 2, six months when patients could opt in for online results by creating their own access codes; and Period 3, four months when access codes were assigned. In addition, a survey was conducted to assess reasons for accepting or declining the online results system. RESULTS: A total of 9056 new patient visits were evaluated. During Periods 1, 2, and 3, respectively, 67%, 67%, and 70% of patients received results either online or by telephone (NS). The proportion of patients calling the clinic for results decreased from 67% in Period 1, to 51% in Period 2, and 36% in Period 3 (P < 0.0001). Survey results indicated that patients accepted online results primarily because of the ability to check results at any time of day. Reasons for not accepting results online included lack of Internet access or a preference to receive results via the telephone. CONCLUSIONS: The online results system decreased the number of phone calls to the clinic pertaining to STI test results but had no effect on the overall proportion of patients receiving results. |
Relative transmissibility of an R5 clade C simian-human immunodeficiency virus across different mucosae in macaques parallels the relative risks of sexual HIV-1 transmission in humans via different routes
Chenine AL , Siddappa NB , Kramer VG , Sciaranghella G , Rasmussen RA , Lee SJ , Santosuosso M , Poznansky MC , Velu V , Amara RR , Souder C , Anderson DC , Villinger F , Else JG , Novembre FJ , Strobert E , O'Neil SP , Secor WE , Ruprecht RM . J Infect Dis 2010 201 (8) 1155-63 BACKGROUND: Worldwide, approximately 90% of all human immunodeficiency virus (HIV) transmissions occur mucosally; almost all involve R5 strains. Risks of sexual HIV acquisition are highest for rectal, then vaginal, and finally oral exposures. METHODS: Mucosal lacerations may affect the rank order of susceptibility to HIV but cannot be assessed in humans. We measured relative virus transmissibility across intact mucosae in macaques using a single stock of SHIV-1157ipd3N4, a simian-human immunodeficiency virus encoding a primary R5 HIV clade C env (SHIV-C). RESULTS: The penetrability of rhesus macaque mucosae differed significantly, with rectal challenge requiring the least virus, followed by vaginal and then oral routes ([Formula: see text], oral vs vaginal; [Formula: see text] rectal vs vaginal). These findings imply that intrinsic mucosal properties are responsible for the differential mucosal permeability. The latter paralleled the rank order reported for humans, with relative risk estimates within the range of epidemiological human studies. To test whether inflammation facilitates virus transmission-as predicted from human studies-we established a macaque model of localized buccal inflammation. Systemic infection occurred across inflamed but not normal buccal mucosa. CONCLUSION: Our primate data recapitulate virus transmission risks observed in humans, thus establishing R5 SHIV-1157ipd3N4 in macaques as a robust model system to study cofactors involved in human mucosal HIV transmission and its prevention. |
Revisiting pneumococcal carriage using broth-enrichment and PCR techniques for enhanced detection of carriage and serotypes
Carvalho MD , Pimenta FC , Jackson D , Roundtree A , Ahmad Y , Millar EV , O'Brien KL , Whitney CG , Cohen AL , Beall BW . J Clin Microbiol 2010 48 (5) 1611-8 The measurement of pneumococcal carriage in the nasopharyngeal reservoir is subject to potential confounders that include low-density and multiple-strain colonization. To compare different methodologies, we picked a random sampling of 100 nasopharyngeal (NP) specimens recovered from infants less than 2 years of age that were previously assessed for pneumococcal carriage and serotypes using a conventional method employing direct plating from the transport/storage medium (50 pneumococcal culture-negative and 50 pneumococcal culture-positive). We used a broth-enrichment approach and a conventional PCR approach (with and without broth enrichment) for determining pneumococcal carriage and serotypes to compare to initial conventional culture-based results. Additionally, we used lytA-targeted real-time PCR for pneumococcal detection. Broth enrichment for both culture-based and PCR-based methods enhanced the isolation of pneumococci and detection of serotype diversity, with the most effective serotype-deduction method employing broth enrichment prior to sequential multiplex PCR. Similarly, we also found that broth enrichment followed by lytA-specific real-time PCR was most sensitive for detecting apparent pneumococcal carriage. The broth-enrichment, conventional multiplex PCR, and real-time PCR approaches used in this study were effective in detecting pneumococcal carriage within the 50 specimens that were negative using conventional direct plating from transport medium (8/50 to 22/50 [16%-44%] positive), and the 3 different serotyping approaches employing broth enrichment increased the number of serotype identifications from the 100 specimens (12-29 additional identifications). 
A PCR-based approach that employed a broth enrichment step appeared to best enhance the detection of mixed serotypes and low density pneumococcal carriage. |
Evaluation of antimicrobial resistance phenotypes for predicting multidrug-resistant Salmonella recovered from retail meats and humans in the United States
Whichard JM , Medalla F , Hoekstra RM , McDermott PF , Joyce K , Chiller T , Barrett TJ , White DG . J Food Prot 2010 73 (3) 445-51 Although multidrug-resistant (MDR) non-Typhi Salmonella (NTS) strains are a concern in food production, determining resistance to multiple antimicrobial agents at slaughter or processing may be impractical. Single antimicrobial resistance results that predict multidrug resistance would therefore be desirable. Sensitivity, specificity, positive predictive value (PPV), and negative predictive value were used to determine each antimicrobial agent's ability to predict MDR phenotypes of human health significance: ACSSuT (resistance to at least ampicillin, chloramphenicol, streptomycin, sulfamethoxazole, and tetracycline) in NTS isolates, and MDR-AmpC-SN (resistance to ACSSuT, additional resistance to amoxicillin-clavulanate and to ceftiofur, and decreased susceptibility [MIC ≥ 2 μg/ml] to ceftriaxone) in NTS serotype Newport. The U.S. National Antimicrobial Resistance Monitoring System determined MICs to 15 or more antimicrobial agents for 9,955 NTS isolates from humans from 1999 to 2004 and 689 NTS isolates from retail meat from 2002 to 2004. A total of 847 (8.5%) human and 26 (3.8%) retail NTS isolates were ACSSuT; 995 (10.0%) human and 16 (2.3%) retail isolates were serotype Newport. Among Salmonella Newport, 204 (20.5%) human and 9 (56.3%) retail isolates were MDR-AmpC-SN. Chloramphenicol resistance provided the highest PPVs for ACSSuT among human (90.5%; 95% confidence interval, 88.4 to 92.3) and retail NTS isolates (96.3%; 95% confidence interval, 81.0 to 99.9). Resistance to ceftiofur and to amoxicillin-clavulanate and decreased susceptibility to ceftriaxone provided the highest PPVs (97.1, 98.1, and 98.6%, respectively) for MDR-AmpC-SN from humans. High PPVs for these agents applied to retail meat MDR-AmpC-SN, but isolate numbers were lower. Variations in MIC results may complicate ceftriaxone's predictive utility. 
Selecting specific antimicrobial resistance offers practical alternatives for predicting MDR phenotypes. Chloramphenicol resistance works best for ACSSuT-NTS, and resistance to ceftiofur, amoxicillin-clavulanate, or chloramphenicol works best for MDR-AmpC-SN. |
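The predictive measures used in this study follow the standard 2x2 screening-test definitions. A minimal sketch of those definitions, with invented counts (not the study's data) that happen to land near the reported PPV range:

```python
def predictive_values(tp: int, fp: int, fn: int, tn: int):
    """Return (sensitivity, specificity, PPV, NPV) from 2x2 counts.
    tp: marker-resistant AND MDR; fp: marker-resistant, not MDR;
    fn: marker-susceptible but MDR; tn: marker-susceptible, not MDR."""
    sens = tp / (tp + fn)   # fraction of MDR isolates flagged by the marker
    spec = tn / (tn + fp)   # fraction of non-MDR isolates not flagged
    ppv = tp / (tp + fp)    # probability an isolate flagged by the marker is MDR
    npv = tn / (tn + fn)    # probability an unflagged isolate is not MDR
    return sens, spec, ppv, npv

# Hypothetical counts, for illustration only (e.g., chloramphenicol
# resistance as a single-agent marker for the ACSSuT phenotype)
sens, spec, ppv, npv = predictive_values(tp=800, fp=84, fn=47, tn=9024)
print(f"PPV = {ppv:.1%}")
```

A high PPV is the property that matters for the practical use proposed here: it means an isolate flagged by the single marker is very likely to carry the full MDR phenotype.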
Identification and characterization of CTX-M-producing Shigella isolates in the United States
Folster JP , Pecic G , Krueger A , Rickert R , Burger K , Carattoli A , Whichard JM . Antimicrob Agents Chemother 2010 54 (5) 2269-70 Shigellosis is a major source of gastroenteritis throughout the world (14). ... |
Analyzing digital vector waveforms of 0-3000 Hz magnetic fields for health studies
Bowman JD , Miller CK , Krieg EF , Song R . Bioelectromagnetics 2010 31 (5) 391-405 To improve the assessment of magnetic field exposures for occupational health studies, the Multiwave(R) System III (MW3) was developed to capture personal exposures to the three-dimensional magnetic field vector B(t) in the 0-3000 Hz band. To process hundreds of full-shift MW3 measurements from epidemiologic studies, new computer programs were developed to calculate the magnetic field's physical properties and its interaction with biological systems through various mechanisms (magnetic induction, radical pair interactions, ion resonance, etc.). For automated calculations in the frequency domain, the software uses new algorithms that remove artifacts in the magnetic field's Fourier transform due to electronic noise and the person's motion through perturbations in the geomagnetic field from steel objects. These algorithms correctly removed the Fourier transform artifacts in 92% of samples and have improved the accuracy of frequency-dependent metrics by as much as 3300%. The output of the MwBatch software is a matrix of 41 exposure metrics calculated for each 2/15 s sample combined with 8 summary metrics for the person's full-period exposure, giving 294 summary-exposure metrics for each person monitored. In addition, the MwVisualizer software graphically explores the magnetic field's vector trace, its component waveforms, and the metrics over time. The output was validated against spreadsheet calculations with pilot data. This software successfully analyzed full-shift MW3 monitoring with 507 electric utility workers, comprising over 1 million vector waveforms. The software's output can be used to test hypotheses about magnetic field biology and disease with biophysical models and also assess compliance with exposure limits. |
Comparison of immunoassay and HPLC-MS/MS used to measure urinary metabolites of atrazine, metolachlor, and chlorpyrifos from farmers and non-farmers in Iowa
Curwin BD , Hein MJ , Barr DB , Striley C . J Expo Sci Environ Epidemiol 2010 20 (2) 205-12 Urine samples were collected from 51 participants in a study investigating pesticide exposure among farm families in Iowa. Aliquots from the samples were sent to two different labs and analyzed for metabolites of atrazine (atrazine mercapturate), metolachlor (metolachlor mercapturate), and chlorpyrifos (TCP) by two different analytical methods: immunoassay and high-performance liquid chromatography tandem mass spectrometry (HPLC-MS/MS). HPLC-MS/MS methods tend to be highly specific, but are costly and time consuming. Immunoassay methods are cheaper and faster, but can be less sensitive due to cross-reactivity and matrix effects. Three statistical methods were employed to compare the two analytical methods. Each statistical method differed in how the samples that had results below the limit of detection (LOD) were treated. The first two methods involved an imputation procedure and the third method used maximum likelihood estimation (MLE). A fourth statistical method that modeled each lab separately using MLE was used for comparison. The immunoassay and HPLC-MS/MS methods were moderately correlated (correlation 0.40-0.49), but the immunoassay methods consistently had significantly higher geometric mean (GM) estimates for each pesticide metabolite. The GM estimates for atrazine mercapturate, metolachlor mercapturate, and TCP by immunoassay ranged from 0.16-0.98 μg/l, 0.24-0.45 μg/l, and 14-14 μg/l, respectively, and by HPLC-MS/MS ranged from 0.0015-0.0039 μg/l, 0.12-0.16 μg/l, and 2.9-3.0 μg/l, respectively. Immunoassays tend to be cheaper and faster than HPLC-MS/MS; however, they may result in an upward bias of urinary pesticide metabolite levels. |
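One common imputation for results below the LOD, of the kind the first two statistical methods above rely on, is to substitute LOD/√2 before computing the log-scale mean. A sketch under that assumption (the values are invented, and the paper's exact imputation rule is not specified here):

```python
import math

def geometric_mean_with_lod(values, lod):
    """Geometric mean with non-detects (None) imputed at LOD/sqrt(2),
    a common convention for left-censored exposure data."""
    imputed = [v if v is not None else lod / math.sqrt(2) for v in values]
    return math.exp(sum(math.log(v) for v in imputed) / len(imputed))

# Hypothetical urinary metabolite results in ug/l; None = below LOD
results = [0.31, 0.55, None, 0.12, None, 0.87]
gm = geometric_mean_with_lod(results, lod=0.05)
print(round(gm, 3))
```

Because the substitution value is fixed, the choice of imputation can itself bias GM estimates when many samples are non-detects, which is why the paper also compares against MLE-based handling of censored values.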
Risk factors for non-syndromic holoprosencephaly in the National Birth Defects Prevention Study
Miller EA , Rasmussen SA , Siega-Riz AM , Frias JL , Honein MA . Am J Med Genet C Semin Med Genet 2010 154C (1) 62-72 Holoprosencephaly (HPE) is a complex structural brain anomaly that results from incomplete cleavage of the forebrain. The prevalence of HPE at birth is low, and risk factors have been difficult to identify. Using data from a large multi-state population-based case-control study, we examined risk factors for non-syndromic HPE. Data from maternal telephone interviews were available for 74 infants with HPE and 5871 controls born between 1997 and 2004. Several characteristics and exposures were examined, including pregnancy history, medical history, maternal diet and use of nutritional supplements, medications, tobacco, alcohol, and illegal substances. We used χ2 tests and logistic regression (excluding women with pre-existing diabetes) to examine associations with HPE. Except for diet (year before pregnancy) and sexually transmitted infections (STIs) (throughout pregnancy), most exposures were examined for the time period from the month before to the third month of pregnancy. HPE was found to be associated with pre-existing diabetes (χ2 = 6.0; P = 0.01), aspirin use [adjusted odds ratio (aOR) = 3.4; 95% confidence interval (CI) 1.6-6.9], lower education level (aOR = 2.5; 95%CI 1.1-5.6), and use of assisted reproductive technologies (ART) (crude OR = 4.2; 95%CI 1.3-13.7). Consistent maternal folic acid use appeared to be protective (aOR = 0.4; 95%CI 0.2-1.0), but the association was of borderline statistical significance. While some of these findings support previous observations, other potential risk factors identified warrant further study. |
An update on cardiovascular malformations in congenital rubella syndrome
Oster ME , Riehle-Colarusso T , Correa A . Birth Defects Res A Clin Mol Teratol 2010 88 (1) 1-8 BACKGROUND: Congenital rubella syndrome (CRS) has long been characterized by the triad of deafness, cataract, and cardiovascular malformations (CVMs). While initial reports identified patent ductus arteriosus (PDA) as the primary CVM in CRS, the exact nature of the CVMs found in CRS has not been well established. METHODS: We searched the English literature from 1941 through 2008 to identify studies that used cardiac catheterization or echocardiography to evaluate the CVMs in CRS. RESULTS: Of the 121 patients in the 10 studies with catheterization data, 78% had branch pulmonary artery stenosis, and 62% had a PDA. In 49% of cases, both branch pulmonary artery stenosis and PDA were present, whereas isolated branch pulmonary artery stenosis and isolated PDA were found in 29% and 13% of cases, respectively. Of the 12 patients in the 10 studies with echocardiographic data, PDA was more common than branch pulmonary artery stenosis, but this finding is greatly limited by the small numbers of patients and limitations of echocardiography. Although published studies of CVMs in CRS have in general reported PDA as the CVM phenotype most commonly associated with CRS, among CRS cases evaluated by catheterization, branch pulmonary artery stenosis was actually more common than PDA. Moreover, although the combination of branch pulmonary artery stenosis and PDA was more common than either branch pulmonary artery stenosis or PDA alone, isolated branch pulmonary artery stenosis was twice as common as isolated PDA. CONCLUSION: Among children with suspected CRS, clinical evaluations for the presence of CVMs should include examinations for both branch pulmonary artery stenosis and PDA. |
Late-treated phenylketonuria and partial reversibility of intellectual impairment
Grosse SD . Child Dev 2010 81 (1) 200-211 Individuals with late-treated phenylketonuria (PKU) not detected by newborn screening but who followed dietary treatment for at least 12 months before 7 years of age have intelligence quotient (IQ) scores that range from severe impairment to the low-normal range. Among adults with late-treated PKU in California, 85% of those who were born from 1961 to 1978 had IQ scores of 70 or above. Longitudinal studies with repeated cognitive assessments often show average changes in cognitive test scores as high as 20-45 points. Although the severe cognitive impairment associated with untreated PKU can in many cases be partially reversed with dietary treatment, prompt initiation of treatment following newborn metabolic screening is essential for optimal development and the prevention of disability. |
Fetal constraint as a potential risk factor for craniosynostosis
Sanchez-Lara PA , Carmichael SL , Graham Jr JM , Lammer EJ , Shaw GM , Ma C , Rasmussen SA . Am J Med Genet A 2010 152A (2) 394-400 Non-syndromic craniosynostosis is multifactorial, and fetal head constraint has been hypothesized as one factor thought to play a role. Data from the National Birth Defects Prevention Study (NBDPS), a large multi-site case-control study of birth defects, were used to evaluate associations between four selected factors related to fetal constraint and craniosynostosis: plurality (twins or higher), macrosomia (birth weight >4,000 g), post-term gestational age (> or =42 weeks), and nulliparity (no previous live births). Case infants (n = 675) had craniosynostosis documented either by radiographic evidence or by surgical intervention. Infants with recognized or strongly suspected single-gene conditions or chromosomal abnormalities were excluded. Control infants (n = 5,958) had no major birth defects and were randomly selected from the same population as case infants. Logistic regression was used to estimate odds ratios for the association between these four factors and craniosynostosis, while adjusting for several covariates. We found that plurality and nulliparity were associated with a twofold increased risk for metopic craniosynostosis, and macrosomia was associated with almost twice the risk of coronal craniosynostosis. Contrary to our hypothesis, prematurity and low birth weight were also associated with craniosynostosis. In conclusion, these four constraint-related factors were not found to be associated with craniosynostosis when all suture types were combined, though some types of craniosynostosis were associated with individual constraint-related factors. |
History and current status of newborn screening for hemoglobinopathies
Benson JM , Therrell Jr BL . Semin Perinatol 2010 34 (2) 134-144 The impact of hemoglobinopathies on healthcare in the United States, particularly sickle cell disease (SCD), has been significant. Enactment of the Sickle Cell Anemia Control Act in 1972 significantly increased the federal interest in the SCDs and other hemoglobinopathies. Only since May 1, 2006, have all states required and provided universal newborn screening for SCD despite a national recommendation to this effect in 1987. In this article, we review the history of screening for SCD and other hemoglobinopathies, along with federal and state activities that have contributed to improved health outcomes for patients with SCD, as well as current newborn screening practices. We also chronicle the federal activities that have helped to shape and to refine laboratory screening and diagnostic proficiency. Finally, we review molecular testing strategies that have evolved and outline their possible future impacts on disease detection and outcome improvement. |
Improving and assuring newborn screening laboratory quality worldwide: 30-year experience at the Centers for Disease Control and Prevention
De Jesus VR , Mei JV , Bell CJ , Hannon WH . Semin Perinatol 2010 34 (2) 125-33 Newborn screening is the largest population-based genetic screening effort in the United States. The detection of treatable, inherited congenital disorders is a major public health responsibility. The Centers for Disease Control and Prevention's (CDC's) Newborn Screening Quality Assurance Program helps newborn screening laboratories ensure that testing accurately detects these disorders, does not delay diagnosis, minimizes false-positive reports, and sustains high-quality performance. For over 30 years, the CDC's Newborn Screening Quality Assurance Program has performed this essential public health service, ensuring the quality and accuracy of screening tests for more than 4 million infants born each year in the United States and millions more worldwide. The Program has grown from 1 disorder in 1978 for 31 participants to more than 50 disorders for 459 participants in 2009. This report reviews the Program's milestones and services to the newborn screening community. |
Serious complications within 30 days of screening and surveillance colonoscopy are uncommon
Ko CW , Riffle S , Michaels L , Morris C , Holub J , Shapiro JA , Ciol MA , Kimmey MB , Seeff LC , Lieberman D . Clin Gastroenterol Hepatol 2010 8 (2) 166-73 BACKGROUND & AIMS: The risk of serious complications after colonoscopy has important implications for the overall benefits of colorectal cancer screening programs. We evaluated the incidence of serious complications within 30 days after screening or surveillance colonoscopies in diverse clinical settings and sought to identify potential risk factors for complications. METHODS: Patients aged 40 years and over undergoing colonoscopy for screening, surveillance, or evaluation based on an abnormal result from another screening test were enrolled through the National Endoscopic Database (CORI). Patients completed a standardized telephone interview approximately 7 and 30 days after their colonoscopy. We estimated the incidence of serious complications within 30 days of colonoscopy and identified risk factors associated with complications using logistic regression analyses. RESULTS: We enrolled 21,375 patients. Gastrointestinal bleeding requiring hospitalization occurred in 34 patients (incidence 1.59/1000 exams; 95% confidence interval [CI], 1.10-2.22). Perforations occurred in 4 patients (0.19/1000 exams; 95% CI, 0.05-0.48), diverticulitis requiring hospitalization in 5 patients (0.23/1000 exams; 95% CI, 0.08-0.54), and postpolypectomy syndrome in 2 patients (0.09/1000 exams; 95% CI, 0.02-0.30). The overall incidence of complications directly related to colonoscopy was 2.01 per 1000 exams (95% CI, 1.46-2.71). Two of the 4 perforations occurred without biopsy or polypectomy. The risk of complications increased with preprocedure warfarin use and performance of polypectomy with cautery. CONCLUSIONS: Complications after screening or surveillance colonoscopy are uncommon. Risk factors for complications include warfarin use and polypectomy with cautery. |
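The per-exam incidence figures above follow directly from the reported counts (rate per 1000 exams = events / exams x 1000); the confidence intervals would require an exact Poisson method not shown here. A minimal check against the abstract's counts:

```python
def rate_per_1000(events: int, exams: int) -> float:
    """Complication incidence per 1000 colonoscopy exams."""
    return events / exams * 1000

# Counts reported in the abstract: 21,375 patients enrolled
exams = 21375
print(round(rate_per_1000(34, exams), 2))  # GI bleeding: 1.59
print(round(rate_per_1000(4, exams), 2))   # perforation: 0.19
```

These reproduce the abstract's point estimates of 1.59 and 0.19 per 1000 exams.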
Does inadequate diet during childhood explain the higher hip fracture rates in the Southern United States?
Paulozzi LJ . Osteoporos Int 2010 21 (3) 417-23 SUMMARY: Southern states have the highest age-adjusted hip fracture rates among older adults in the United States. Regional hip fracture rates in the United States in 1986-1993 correlate with death rates from rickets in the 1940s. Historical patterns of bone nutrition early in life might explain contemporary geographic patterns in bone fragility. INTRODUCTION: State of residence early in life is a better predictor of the risk of hip fracture after age 65 than state of current residence. Therefore, the geography of rickets mortality in the United States before 1950 was compared with the geography of hip fracture rates among older adults in the United States during 1986-1993. METHODS: Vital statistics data for the US white population for 1942-1948 allowed calculation of the ratio of deaths from rickets to live births for each geographic division of the USA. These ratios were correlated with previously published, standardized hip fracture rates among whites 65-89 years old during 1986-1993 by census division. RESULTS: During 1942-1948, the rickets mortality ratio among whites was 3.11 in the South, 1.91 in the Northeast, 1.75 in the Midwest, and 1.04 in the West. The correlation of mortality with risk of hip fracture was 0.71 (p = 0.03) for both sexes combined and 0.86 (p = 0.01) for women. CONCLUSIONS: Inadequate nutrition during skeletal formation early in life might explain the higher incidence of hip fracture among older adults in the South. |
ROPS performance during field upset and static testing
Harris JR , McKenzie Jr EA , Etherton JR , Cantis DM , Ronaghi M . J Agric Saf Health 2010 16 (1) 5-18 Agriculture remains one of the most hazardous occupations in the U.S. By conservative estimates, tractor overturns alone claim 120 lives annually. A rollover protective structure (ROPS) used with a seatbelt is a highly effective engineering safety control that can prevent many of these fatalities and reduce the severity of injuries associated with tractor overturns. SAE J2194 is a consensus performance standard established for agricultural ROPS. According to this standard, satisfactory ROPS performance can be demonstrated through static testing, field upset testing, or impact testing. A previous modeling study suggested that static testing may underpredict the strain induced in a ROPS during a field upset. In the current study, field upset testing and laboratory static testing results were compared. Field upset testing included six rear and six side upset tests performed according to SAE J2194 guidelines. Additionally, static testing was performed on a ROPS of the same model. The results support findings from the modeling study. Near the lowest sections of the ROPS, the plastic strain resulting from rear upset testing exceeded the plastic strain from static testing for 18 of 24 data points. Conversely, the ROPS plastic strain from side upset testing was typically less than the plastic strain from laboratory static testing. However, data indicate that the side upset test may not be very repeatable. This study suggests that the longitudinal loading energy criterion for static testing might not be a conservative predictor of rear upset ROPS response. |
Underground coal mining injury: a look at how age and experience relate to days lost from work following an injury
Margolis KA . Saf Sci 2010 48 (4) 417-421 Coal has been mined in the United States since colonial times, and coal mining has always been a dangerous occupation. Despite the dangers involved in coal mining, coal is essential to the functioning of our society. Coal provides energy for products, businesses, and homes. Not only is coal mining a dangerous occupation, but, like many other industries, it has also been referred to as a "graying occupation," as many coal miners are reaching retirement age. Younger workers possess certain advantages, as older workers may have age-associated decrements in cognitive function, health, and recuperative ability. Although there are documented decreases in health and safety associated with age, there are also benefits at the workplace associated with increasing age: more experience and familiarity with the work environment. This study used the Mine Safety and Health Administration's (MSHA) database on accidents, injuries, and illnesses from the years 2003 through 2007 to examine how age, experience at the current mine, total years of experience as a coal miner, and experience in the current job affect injury severity. The results indicated that there was a relationship between age and days lost, as well as between total mining experience and days lost, following an injury. Furthermore, the data indicated an increased risk of overexertion injuries as age increases. These are important findings for the coal mining industry, as many miners are older and more experienced. |
Work-related non-fatal injuries to adults on farms in the U.S., 2001 and 2004
Goldcamp EM . J Agric Saf Health 2010 16 (1) 41-51 The National Institute for Occupational Safety and Health (NIOSH), in an ongoing effort to address the issue of injuries on farms in the U.S., collaborated with the USDA to complete the 2001 and 2004 Occupational Injury Surveillance of Production Agriculture Surveys (OISPAS). The OISPAS data indicated that the estimated adult working population (household and hired) on U.S. farms decreased from 6,170,940 in 2001 to 5,294,912 in 2004. The estimated number of work-related injuries decreased from 75,756 to 71,081. The rate of injury increased over this same time period (from 12.3 injuries per 1,000 working adults to 13.4 injuries per 1,000 working adults). The majority of these injuries occurred to adults in the age range of 45 to 54 years. The vast majority of injuries occurred to males, over 75% in both years. Animals (17%) and the ground (17%) were the source of injury in approximately 35% of injuries reported in each year. The most common injury events were "struck by objects" and falls. These two events combined accounted for over half of all work-related injuries in both 2001 and 2004. The OISPAS data indicated that although injuries are decreasing as the size of the at-risk population decreases, the rate of injury is increasing. The results of this research may be used to direct current injury prevention efforts and to plan for future injury surveillance. |
Occupational fatalities, injuries, illnesses, and related economic loss in the wholesale and retail trade sector
Anderson VP , Schulte PA , Sestito J , Linn H , Nguyen LS . Am J Ind Med 2010 53 (7) 673-85 BACKGROUND: The wholesale and retail trade (WRT) sector employs over 21 million workers, or nearly 19% of the annual average employment in private industry. The perception is that workers in this sector are generally at low risk of occupational injury and death. These workers, however, are engaged in a wide range of demanding job activities and are exposed to a variety of hazards. Prior to this report, a comprehensive appraisal of the occupational fatal and nonfatal burdens affecting the retail and wholesale sectors was lacking. The focus of this review is to assess the overall occupational safety and health burden in WRT and to identify various subsectors that have high rates of burden from occupational causes. Ultimately, these findings should be useful for targeted intervention efforts. METHODS: We reviewed Bureau of Labor Statistics (BLS) 2006 fatality, injury, and illness data for the WRT sector and provide comparisons between the WRT sector, its subsectors, and private industry, which serves as a baseline. The BLS data provide both counts and standardized incidence rates for various exposures, events, and injury types for fatalities, injuries, and illnesses. In an effort to estimate the economic burden of these fatalities, injuries, and illnesses, a focused review of the literature was conducted. RESULTS AND CONCLUSION: In 2006, WRT workers experienced 820,500 injuries/illnesses and 581 fatalities. The total case injury/illness rate for the retail sector was 4.9/100 FTE and for the wholesale sector 4.1/100 FTE. The WRT sector represented 15.5% of the private sector work population in 2006, yet accounted for 20.1% of nonfatal injuries and illnesses of the private sector. In 2003, the disparity was only 2% but increased to 3% in 2004 and 2005. 
Three WRT subsectors had injury/illness rates well above the national average: beer/wine/liquor (8.4/100); building materials/supplies (7.6/100); and grocery-related products (7.0/100). Occupational deaths with the highest rates were found in gasoline stations (9.8/100,000), convenience stores (6.1/100,000), and used car dealers (5.5/100,000). In terms of actual numbers, the category of food and beverage stores had 82 fatalities in 2006. Based on 1993 data, costs, both direct and indirect, in the WRT sector for fatal injuries were estimated to exceed $8.6 billion. The full economic loss to society and the family has not been adequately measured. Overexertion and contact with objects/equipment represent the top two events or exposures leading to injury or illness. Together they account for 57% of the events or exposures for nonfatal WRT injuries and illnesses. This sector is important because it is large and pervasive; as a result, even a relatively small increase in injury rates and accompanying days away from work will have a significant impact on working families and society. |
Facial anthropometric differences among gender, ethnicity, and age groups
Zhuang Z , Landsittel D , Benson S , Roberge R , Shaffer R . Ann Occup Hyg 2010 54 (4) 391-402 OBJECTIVES: The impact of race/ethnicity on facial anthropometric data in the US workforce, and its implications for the development of personal protective equipment, has not been investigated to any significant degree. The proliferation of minority populations in the US workforce has increased the need to investigate differences in facial dimensions among these workers. The objective of this study was to determine the face shape and size differences among race and age groups from the National Institute for Occupational Safety and Health survey of 3997 US civilian workers. METHODS: Survey participants were divided into two gender groups, four racial/ethnic groups, and three age groups. Measurements of height, weight, neck circumference, and 18 facial dimensions were collected using traditional anthropometric techniques. A multivariate analysis of the data was performed using Principal Component Analysis. An exploratory analysis of the effect that different demographic factors had on anthropometric features was conducted via a linear model. The 21 anthropometric measurements, body mass index, and the first and second principal component scores were dependent variables, while gender, ethnicity, age, occupation, weight, and height served as independent variables. RESULTS: Gender significantly contributes to size for 19 of 24 dependent variables. African-Americans have statistically shorter, wider, and shallower noses than Caucasians. Hispanic workers have 14 facial features that are significantly larger than those of Caucasians, while their nose protrusion, height, and head length are significantly shorter. The other ethnic group was composed primarily of Asian subjects and has statistically different dimensions from Caucasians for 16 anthropometric values. 
Nineteen anthropometric values for subjects at least 45 years of age are statistically different from those measured for subjects between 18 and 29 years of age. Workers employed in manufacturing, fire fighting, healthcare, law enforcement, and other occupational groups have facial features that differ significantly from those of workers in construction. CONCLUSIONS: Statistically significant differences in facial anthropometric dimensions (P < 0.05) were noted between males and females, among all racial/ethnic groups, and between subjects who were at least 45 years old and workers between 18 and 29 years of age. These findings could be important to the design and manufacture of respirators, as well as to employers responsible for supplying respiratory protective equipment to their employees. |
Foreword for special edition on migration and occupational health
Howard J . Am J Ind Med 2010 53 (4) 325-6 Anthropologists tell us that human beings have been migrating since Homo erectus left Africa for Eurasia a million years ago. After occupying Africa 150,000 years ago, Homo sapiens migrated out of Africa about 70,000 years ago and began arriving in the Americas about 20,000 years ago. Even though the reasons for these historically very remote migrations are unclear, what is clear, though, is that migration forms an important part of human history and continues to do so. Modern humans migrate voluntarily, are forced to migrate through hostile circumstances, or they are enslaved and transported against their will. Voluntary migration generally occurs because people seek better economic, social, or political opportunities. Forced migration occurs because people flee war, persecution, or famine. Enslavement has occurred throughout history, but the forced transport of 20 million native Africans to North America from the 1600s to the 1800s represents one of the most shameful coerced migrations in human history. |
Assessing the impact of state insurance policies on chlamydia screening: a panel data analysis
Owusu-Edusei Jr K , Gift TL . Health Policy 2010 96 (3) 231-8 OBJECTIVES: In the late 1990s, three Southern states (Maryland (MD), Georgia (GA), and Tennessee (TN)) enacted laws that required health plans to reimburse for chlamydia screening for populations at risk. We assessed the impact of the laws on chlamydia screening rates for Georgia and Tennessee. METHODS: We extracted monthly chlamydia screening rates for employer-sponsored privately insured women and used a panel regression analysis to conduct an intervention analysis that compared changes in screening rates in Georgia and Tennessee to those in ten other southern states, based on the dates the laws were enacted in the two states. Maryland was excluded due to the non-specificity of the law and insufficient data. RESULTS: Although there were substantial increases in screening rates in both GA and TN after the enactment of the laws, data from the other ten states showed similar increases over the same period. Thus, there was no significant difference in the increase in screening rates between Georgia and Tennessee and the other states. CONCLUSION: Because this analysis was restricted to privately insured patients, additional studies are needed to assess the effectiveness (or lack thereof) of the laws for other populations, such as those covered by Medicaid, within the individual states. |
Manifestations and effects of violence and social and economic disadvantage. Foreword
Hall JE . Fam Community Health 2010 33 (2) 80-81 This issue presents a diverse mixture of articles addressing the manifestations and effects of violence and social and economic disadvantage. The first collection of articles presents new research in the areas of family and intimate partner violence (IPV) and community violence. Exposure to each of these violence types shapes health trajectories over the course of life by increasing adverse health experiences and the likelihood of involvement in risk behaviors that may lead to social problems, disability, or premature death. |
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.