Satisfaction with life domains in people with epilepsy
Kobau R , Luncheon C , Zack MM , Shegog R , Price PH . Epilepsy Behav 2012 25 (4) 546-551 While commonly used quality-of-life instruments assess perceived epilepsy-associated limitations in life domains and formally document patient concerns, less is known about how satisfied community-dwelling adults with epilepsy are with broader life domains, such as housing, education, neighborhood, ability to help others, and achievement of goals. The purpose of this study was to examine satisfaction with life domains in a representative sample of community-dwelling adults with self-reported epilepsy from the 2008 HealthStyles survey. Following adjustment for sex, age group, race/ethnicity, education, and income, people with epilepsy were more likely to report dissatisfaction in the domains of achievement (e.g., dissatisfaction with education and life goals), compromised social interactions (dissatisfaction with family life, friends, and social life), and compromised physical capability (dissatisfaction with health and energy level). Life satisfaction and other well-being domains can supplement health indicators to guide treatment and program services for people with epilepsy to maximize their well-being. |
Tracking stroke hospitalization clusters over time and associations with county-level socioeconomic and healthcare characteristics
Schieb LJ , Mobley LR , George M , Casper M . Stroke 2012 44 (1) 146-52 BACKGROUND AND PURPOSE: This study evaluated clustering of stroke hospitalization rates, patterns of the clustering over time, and associations with community-level characteristics. METHODS: We used Medicare hospital claims data from 1995-1996 to 2005-2006 with a principal discharge diagnosis of stroke to calculate county-level stroke hospitalization rates. We identified statistically significant clusters of high- and low-rate counties by using local indicators of spatial association, tracked cluster status over time, and assessed associations between cluster status and county-level socioeconomic and healthcare profiles. RESULTS: Clearly defined clusters of counties with high and low stroke hospitalization rates were identified in each time period. Approximately 75% of counties maintained their cluster status from 1995-1996 to 2005-2006. In addition, 243 counties transitioned into high-rate clusters, and 148 transitioned out of high-rate clusters. Persistently high-rate clusters were located primarily in the Southeast, whereas persistently low-rate clusters occurred mostly in New England and in the West. In general, persistently low-rate counties had the most favorable socioeconomic and healthcare profiles, followed by counties that transitioned out of or into high-rate clusters. Persistently high-rate counties experienced the least favorable socioeconomic and healthcare profiles. CONCLUSIONS: The persistence of clusters of high and low stroke hospitalization rates during a 10-year period suggests that the underlying causes of stroke in these areas have also persisted. The associations found between cluster status (persistently high, transitional, persistently low) and socioeconomic and healthcare profiles shed new light on the contributions of community-level characteristics to geographic disparities in stroke hospitalizations. |
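The high- and low-rate clusters in the study above were flagged with local indicators of spatial association (LISA); a minimal sketch of one common LISA statistic, local Moran's I, on hypothetical county rates and neighbor weights (all numbers below are illustrative, not study data):

```python
import numpy as np

def local_morans_i(rates, W):
    """Local Moran's I for each area: I_i = z_i * sum_j(w_ij * z_j) / m2.

    rates: 1-D array of county rates; W: row-standardized spatial weights.
    A large positive I_i for a high-rate county flags a high-rate cluster.
    """
    z = rates - rates.mean()          # deviations from the overall mean
    m2 = (z ** 2).sum() / len(rates)  # second moment (variance-like scaling)
    return z * (W @ z) / m2

# Toy example: 4 counties in a row; neighbors share an edge (weights row-standardized).
W = np.array([[0.0, 1.0, 0.0, 0.0],
              [0.5, 0.0, 0.5, 0.0],
              [0.0, 0.5, 0.0, 0.5],
              [0.0, 0.0, 1.0, 0.0]])
rates = np.array([30.0, 28.0, 5.0, 4.0])  # hypothetical hospitalizations per 1,000
I = local_morans_i(rates, W)
# Counties 0-1 (both high) and 2-3 (both low) all get positive I values.
```

In practice each local statistic is then tested for significance (e.g., by permutation), which is how counties were classified as belonging to high- or low-rate clusters.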
Trends in the coronary heart disease risk profile of middle-aged adults
Kramarow E , Lubitz J , Francis R Jr . Ann Epidemiol 2012 23 (1) 31-4 PURPOSE: To examine recent trends in the coronary heart disease (CHD) risk profiles of the population aged 45 to 64 in the United States. METHODS: Data from the National Health and Nutrition Examination Surveys (NHANES) from 2 time periods (1988-1994 and 2005-2008) are used to estimate the CHD risk functions derived from the Framingham Heart Study. The risk functions take account of levels of blood pressure (systolic and diastolic), total and high-density lipoprotein serum cholesterol, diabetes (doctor diagnosed or based on fasting glucose), and smoking status to estimate the 10-year risk of myocardial infarction or coronary death. We estimate the risk functions by gender, race, and age group (45-54 and 55-64). RESULTS: The CHD risk profile of middle-aged adults has improved over time. For example, the mean 10-year risk of heart attack or CHD death among persons 55 to 64 years has declined from 7.1% to 5.2%. Declines are seen among both men and women and among non-Hispanic Blacks and non-Hispanic Whites. CONCLUSIONS: Despite increases in diabetes and obesity, the CHD risk profile of middle-aged adults improved during the period from 1988-1994 to 2005-2008. |
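A Framingham-style risk function of the kind used above combines log-transformed risk factors in a proportional-hazards form. The sketch below shows only the functional form; the coefficients, baseline survival, and mean linear predictor are invented for illustration and are NOT the published Framingham values:

```python
import math

# Illustrative constants only -- NOT the published Framingham coefficients.
BETAS = {"log_age": 3.1, "log_sbp": 1.9, "log_tc_hdl": 1.2,
         "diabetes": 0.57, "smoker": 0.65}
BASELINE_SURVIVAL = 0.95  # assumed 10-year event-free survival at mean risk
MEAN_LP = 23.0            # assumed linear predictor at population-mean values

def ten_year_chd_risk(age, sbp, total_chol, hdl, diabetes, smoker):
    """10-year risk of MI or coronary death in the Framingham-style form:
       risk = 1 - S0(10) ** exp(LP - mean LP)."""
    lp = (BETAS["log_age"] * math.log(age)
          + BETAS["log_sbp"] * math.log(sbp)
          + BETAS["log_tc_hdl"] * math.log(total_chol / hdl)
          + BETAS["diabetes"] * diabetes
          + BETAS["smoker"] * smoker)
    return 1.0 - BASELINE_SURVIVAL ** math.exp(lp - MEAN_LP)

# Hypothetical 60-year-old smoker: SBP 140, total cholesterol 220, HDL 45.
risk_smoker = ten_year_chd_risk(60, 140, 220, 45, diabetes=0, smoker=1)
risk_nonsmoker = ten_year_chd_risk(60, 140, 220, 45, diabetes=0, smoker=0)
```

Population-level trend estimates like those in the study come from evaluating such a function on each survey participant's measured risk factors and averaging.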
Menarche, menopause, and breast cancer risk: individual participant meta-analysis, including 118,964 women with breast cancer from 117 epidemiological studies
Collaborative Group on Hormonal Factors in Breast Cancer , Lee N , Marchbanks P , Ory HW , Peterson HB , Wingo P . Lancet Oncol 2012 13 (11) 1141-51 BACKGROUND: Menarche and menopause mark the onset and cessation, respectively, of ovarian activity associated with reproduction, and affect breast cancer risk. Our aim was to assess the strengths of their effects and determine whether they depend on characteristics of the tumours or the affected women. METHODS: Individual data from 117 epidemiological studies, including 118,964 women with invasive breast cancer and 306,091 without the disease, none of whom had used menopausal hormone therapy, were included in the analyses. We calculated adjusted relative risks (RRs) associated with menarche and menopause for breast cancer overall, and by tumour histology and by oestrogen receptor expression. FINDINGS: Breast cancer risk increased by a factor of 1.050 (95% CI 1.044-1.057; p<0.0001) for every year younger at menarche, and independently by a smaller amount (1.029, 1.025-1.032; p<0.0001), for every year older at menopause. Premenopausal women had a greater risk of breast cancer than postmenopausal women of an identical age (RR at age 45-54 years 1.43, 1.33-1.52, p<0.001). All three of these associations were attenuated by increasing adiposity among postmenopausal women, but did not vary materially by women's year of birth, ethnic origin, childbearing history, smoking, alcohol consumption, or hormonal contraceptive use. All three associations were stronger for lobular than for ductal tumours (p<0.006 for each comparison). The effect of menopause in women of an identical age and trends by age at menopause were stronger for oestrogen receptor-positive disease than for oestrogen receptor-negative disease (p<0.01 for both comparisons). INTERPRETATION: The effects of menarche and menopause on breast cancer risk might not be acting merely by lengthening women's total number of reproductive years. 
Endogenous ovarian hormones are more relevant for oestrogen receptor-positive disease than for oestrogen receptor-negative disease and for lobular than for ductal tumours. FUNDING: Cancer Research UK. |
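The per-year relative risks reported in the abstract above compound multiplicatively across years. A small sketch of how they translate into risk for a given reproductive history (the reference ages of 13 and 50 are my own illustrative choice, not from the paper):

```python
# Per-year relative risks taken from the abstract.
RR_PER_YEAR_EARLIER_MENARCHE = 1.050
RR_PER_YEAR_LATER_MENOPAUSE = 1.029

def relative_risk(menarche_age, menopause_age,
                  ref_menarche=13, ref_menopause=50):
    """Multiplicative RR vs. a hypothetical reference woman."""
    rr_menarche = RR_PER_YEAR_EARLIER_MENARCHE ** (ref_menarche - menarche_age)
    rr_menopause = RR_PER_YEAR_LATER_MENOPAUSE ** (menopause_age - ref_menopause)
    return rr_menarche * rr_menopause

# Menarche 2 years earlier and menopause 5 years later than the reference:
rr = relative_risk(11, 55)  # 1.050**2 * 1.029**5, roughly 1.27
```

The multiplicative form follows from the abstract's statement that the menarche and menopause effects are independent.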
Physical activity, psychological distress, and receipt of mental healthcare services among cancer survivors
Zhao G , Li C , Li J , Balluz LS . J Cancer Surviv 2012 7 (1) 131-9 PURPOSE: Physical activity confers multiple health benefits in the general population. This study examined the associations of physical activity with serious psychological distress (SPD) and receipt of mental healthcare services among U.S. adult cancer survivors. METHODS: We analyzed data from 4,797 cancer survivors (aged ≥18 years) and 38,571 adults without cancer who participated in the 2009 Behavioral Risk Factor Surveillance System. SPD was assessed using the Kessler-6 questionnaire. Adjusted prevalence and prevalence ratios were estimated by conducting log-linear regression analysis while controlling for potential confounders. RESULTS: Overall, 6.6 % of cancer survivors (vs. 3.7 % of adults without cancer, P < 0.001) reported having SPD, and 14.0 % of cancer survivors (vs. 10.0 % of adults without cancer, P < 0.001) reported receiving mental healthcare services; the percentages decreased with increasing physical activity levels. After multivariate adjustment, compared to cancer survivors who were physically inactive, cancer survivors who engaged in physical activity >0 to <150 min/week and ≥150 min/week were 62 % and 61 % (P < 0.001 for both) less likely to report SPD, respectively; cancer survivors who engaged in physical activity ≥150 min/week were 33 % (P < 0.05) less likely to report receiving mental healthcare services. Additionally, the inverse association between physical activity and receiving mental healthcare services persisted among women with breast or reproductive cancers and among men and women with gastrointestinal cancers. CONCLUSION: The inverse associations between physical activity and SPD or receiving mental healthcare services suggest that physical activity may play a role in improving mental health among cancer survivors. 
IMPLICATIONS FOR CANCER SURVIVORS: Healthcare clinicians may consider routinely monitoring and assessing the psychological well-being of cancer survivors and educating them about the potential benefits of physical activity in improving their mental health. |
Elevated C-reactive protein is associated with severe periodic leg movements of sleep in patients with restless legs syndrome
Trotti LM , Rye DB , Staercke CD , Hooper WC , Quyyumi A , Bliwise DL . Brain Behav Immun 2012 26 (8) 1239-43 BACKGROUND: Restless legs syndrome (RLS) is a common sleep disorder in which urges to move the legs occur during rest, worsen at night, and are relieved by leg movement. RLS has been implicated in the development of cardiovascular disease. Periodic leg movements (PLMs) may be a mediator of this relationship. We evaluated systemic inflammation and PLMs in RLS patients to further assess cardiovascular risk. METHODS: 137 RLS patients had PLM measurements taken while unmedicated for RLS. Banked plasma was assayed for high-sensitivity C-reactive protein (CRP), interleukin-6 (IL-6), and tumor necrosis factor alpha (TNF-alpha). RESULTS: Mean (SD) PLM index was 19.3 (22.0). PLMs were unrelated to TNF-alpha and IL-6 but were modestly correlated with logCRP (r(129)=0.19, p=0.03). Patients with at least 45 PLMs/h had an odds ratio of 3.56 (95% CI 1.26-10.03, p=0.02, df=1) for elevated CRP compared with those with fewer than 45 PLMs/h. After adjustment for age, race, gender, diabetes, hypertension, hyperlipidemia, inflammatory disorders, CRP-lowering medications, and body mass index, the OR for those with at least 45 PLMs/h was 8.60 (95% CI 1.23 to 60.17, p=0.03, df=10). CONCLUSIONS: PLMs are associated with increased inflammation, such that RLS patients with at least 45 PLMs/h had more than triple the odds of elevated CRP compared with those with fewer PLMs. Further investigation into PLMs and inflammation is warranted. |
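An unadjusted odds ratio like the 3.56 above comes from a 2x2 table of exposure (≥45 vs. <45 PLMs/h) by outcome (elevated vs. normal CRP). A sketch with hypothetical cell counts (the paper's actual counts are not given in the abstract):

```python
import math

def odds_ratio(a, b, c, d):
    """OR with a Wald 95% CI from a 2x2 table:
       a = exposed cases, b = exposed non-cases,
       c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: >=45 PLMs/h (20 elevated CRP, 15 not)
# vs. <45 PLMs/h (30 elevated CRP, 72 not).
or_, lo, hi = odds_ratio(20, 15, 30, 72)  # OR = (20*72)/(15*30) = 3.2
```

The adjusted OR of 8.60 in the abstract instead comes from a multivariable logistic regression, where the exposure coefficient is exponentiated after controlling for the listed covariates.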
Breast MRI use uncommon among U.S. women
Miller JW , Sabatino S , Thompson TD , Breen N , White MC , Ryerson AB , Taplin SH , Ballard-Barbash R . Cancer Epidemiol Biomarkers Prev 2012 22 (1) 159-66 BACKGROUND: The goal of breast cancer screening is to reduce breast cancer mortality. Mammography is the standard screening method for detecting breast cancer early. Breast magnetic resonance imaging (MRI) is recommended to be used in conjunction with mammography for screening subsets of women at high risk for breast cancer. We offer the first study to provide national estimates of breast MRI use among women in the United States. METHODS: We analyzed data from women who responded to questions about having a breast MRI on the 2010 National Health Interview Survey. We assessed report of having a breast MRI and reasons for it by sociodemographic characteristics and access to health care and computed 5-year and lifetime breast cancer risk using the Gail model. RESULTS: Among 11,222 women who responded, almost 5% reported ever having a breast MRI and 2% reported having an MRI within the 2 years preceding the survey. Less than half of the women who reported having a breast MRI were at increased risk. Approximately 60% of women reported having the breast MRI for diagnostic reasons. Women who ever had a breast MRI were more likely to be older, black, and insured and to report a usual source of health care compared to women who reported no MRI. CONCLUSIONS: Breast MRI may be underused or overused in certain subgroups of women. IMPACT: As access to health care improves, the use of breast MRI and the appropriateness of its use for breast cancer detection will be important to monitor. |
The epidemiology of chronic hepatitis C and one-time hepatitis C virus testing of persons born during 1945 to 1965 in the United States
Ward JW . Clin Liver Dis 2013 17 (1) 1-11 Hepatitis C virus (HCV) is the most common blood-borne infection in the United States. HCV infection is a leading cause of chronic liver disease, end-stage liver disease, and liver transplantation. Newly available therapies can clear HCV in most infected persons who receive treatment. However, many persons living with HCV infection are unaware of their infection status, including those born during 1945-1965 (a population at increased risk for chronic hepatitis C in the United States). This review highlights the epidemiology of hepatitis C and the importance of HCV testing and linkage to care in an era of more effective antiviral therapies. |
Lost to follow-up but perhaps not lost in the health system
Delcher C , Meredith G , Griswold M , Roussel B , Duval N , Louissaint E , Joseph P . J Acquir Immune Defic Syndr 2012 61 (5) e75-7 In their recent paper, Coria et al. [1] express concern over the large proportion (22%) of Haitian postpartum mothers who were lost to follow-up after seeking services in the Haitian Study Group for Kaposi Sarcoma and Opportunistic Infections (GHESKIO) clinic during the period 1999–2005. The purpose of this letter is to (1) inform the reader of the development of a new national HIV/AIDS case surveillance system in Haiti (unavailable during the time of the Coria et al. study) and (2) examine the resulting case surveillance data with regard to patient records reported from GHESKIO and other clinics in Haiti. We believe that the national surveillance data may help to identify patients who seem to be lost to follow-up. These data were presented in part as a poster at the 2012 International AIDS Conference in Washington, DC [2]. |
Outbreak of influenza A (H3N2) variant virus infection among attendees of an agricultural fair, Pennsylvania, USA, 2011
Wong KK , Greenbaum A , Moll ME , Lando J , Moore EL , Ganatra R , Biggerstaff M , Lam E , Smith EE , Storms AD , Miller JR , Dato V , Nalluswami K , Nambiar A , Silvestri SA , Lute JR , Ostroff S , Hancock K , Branch A , Trock SC , Klimov A , Shu B , Brammer L , Epperson S , Finelli L , Jhung MA . Emerg Infect Dis 2012 18 (12) 1937-44 During August 2011, influenza A (H3N2) variant [A(H3N2)v] virus infection developed in a child who attended an agricultural fair in Pennsylvania, USA; the virus resulted from reassortment of a swine influenza virus with influenza A(H1N1)pdm09. We interviewed fair attendees and conducted a retrospective cohort study among members of an agricultural club who attended the fair. Probable and confirmed cases of A(H3N2)v virus infection were defined by serology and genomic sequencing results, respectively. We identified 82 suspected, 4 probable, and 3 confirmed case-patients who attended the fair. Among 127 cohort study members, the risk for suspected case status increased as swine exposure increased from none (4%; referent) to visiting swine exhibits (8%; relative risk 2.1; 95% CI 0.2-53.4) to touching swine (16%; relative risk 4.4; 95% CI 0.8-116.3). Fairs may be venues for zoonotic transmission of viruses with epidemic potential; thus, health officials should investigate respiratory illness outbreaks associated with agricultural events. |
Pediatric and adolescent tuberculosis in the United States, 2008-2010
Winston CA , Menzies HJ . Pediatrics 2012 130 (6) e1425-32 OBJECTIVE: We examined heterogeneity among children and adolescents diagnosed with tuberculosis (TB) in the United States, and we investigated potential international TB exposure risk. METHODS: We analyzed demographic and clinical characteristics by origin of birth for persons <18 years with a verified case of incident TB disease reported to the National TB Surveillance System from 2008 to 2010. We describe newly available data on parent or guardian countries of origin and history of having lived internationally for pediatric patients with TB (<15 years of age). RESULTS: Of 2660 children and adolescents diagnosed with TB during 2008-2010, 822 (31%) were foreign-born; Mexico was the most frequently reported country of foreign birth. Over half (52%) of foreign-born patients diagnosed with TB were adolescents aged 13 to 17 years who had lived in the United States on average >3 years before TB diagnosis. Foreign-born pediatric patients with foreign-born parents were older (mean, 7.8 years) than foreign-born patients with US-born parents (4.2 years) or US-born patients (3.6 years). Among US-born pediatric patients, 66% had at least 1 foreign-born parent, which is >3 times the proportion in the general population. Only 25% of pediatric patients with TB diagnosed in the United States had no known international connection through family or residence history. CONCLUSIONS: Three-quarters of pediatric patients with TB in the United States have potential TB exposures through foreign-born parents or residence outside the United States. Missed opportunities to prevent TB disease may occur if clinicians fail to assess all potential TB exposures during routine clinic visits. |
Epidemiology of invasive pneumococcal disease among high-risk adults since introduction of pneumococcal conjugate vaccine for children
Muhammad RD , Oza-Frank R , Zell E , Link-Gelles R , Narayan KM , Schaffner W , Thomas A , Lexau C , Bennett NM , Farley MM , Harrison LH , Reingold A , Hadler J , Beall B , Klugman KP , Moore MR . Clin Infect Dis 2012 56 (5) e59-67 BACKGROUND: Certain chronic diseases increase risk for invasive pneumococcal disease (IPD) and are indications for receipt of 23-valent pneumococcal polysaccharide vaccine (PPV23). Since the pediatric introduction of 7-valent pneumococcal conjugate vaccine (PCV7) in 2000, incidence of IPD among adults has declined. The relative magnitude of these indirect effects among persons with and without PPV23 indications is unknown. METHODS: We evaluated IPD incidence among adults with and without PPV23 indications using population- and laboratory-based data collected during 1998-2009 and estimates of the denominator populations with PPV23 indications from the National Health Interview Survey. We compared rates before and after PCV7 use by age, race, PPV23 indication and serotype. RESULTS: The proportion of adult IPD cases with PPV23 indications increased from 51% before to 61% after PCV7 introduction (p<0.0001). PCV7-serotype IPD declined among all race, age and PPV23 indication strata, ranging from 82-97%. Overall IPD rates declined in most strata, by up to 65%. However, incidence remained highest among adults with PPV23 indications compared to those without (34.9 vs. 8.8 cases per 100,000 population, respectively). Apart from age ≥65 years, diabetes is now the most common indication for PPV23 (20% of all cases vs. 10% of cases in 1998-99). CONCLUSIONS: Although IPD rates have declined among adults, adults with underlying conditions remain at increased risk of IPD and comprise a larger proportion of adult IPD cases in 2009 compared to 2000. A continued increase in the prevalence of diabetes among U.S. adults could lead to increased burden of pneumococcal disease. |
Epidemiology, seasonality, and burden of influenza and influenza-like illness in urban and rural Kenya, 2007-2010
Katz MA , Lebo E , Emukule G , Njuguna HN , Aura B , Cosmas L , Audi A , Junghae M , Waiboci LW , Olack B , Bigogo G , Njenga MK , Feikin DR , Breiman RF . J Infect Dis 2012 206 Suppl 1 S53-60 BACKGROUND: The epidemiology and burden of influenza remain poorly defined in sub-Saharan Africa. Since 2005, the Kenya Medical Research Institute and Centers for Disease Control and Prevention-Kenya have conducted population-based infectious disease surveillance in Kibera, an urban informal settlement in Nairobi, and in Lwak, a rural community in western Kenya. METHODS: Nasopharyngeal and oropharyngeal swab specimens were obtained from patients who attended the study clinic and had acute lower respiratory tract (LRT) illness. Specimens were tested for influenza virus by real-time reverse-transcription polymerase chain reaction. We adjusted the incidence of influenza-associated acute LRT illness to account for patients with acute LRT illness who attended the clinic but were not sampled. RESULTS: From March 2007 through February 2010, 4140 cases of acute LRT illness were evaluated in Kibera, and specimens were collected from 1197 (27%); 319 (27%) were positive for influenza virus. In Lwak, there were 6733 cases of acute LRT illness, and specimens were collected from 1641 (24%); 359 (22%) were positive for influenza virus. The crude and adjusted rates of medically attended influenza-associated acute LRT illness were 6.9 and 13.6 cases per 1000 person-years, respectively, in Kibera, and 5.6 and 23.0 cases per 1000 person-years, respectively, in Lwak. In both sites, rates of influenza-associated acute LRT illness were highest among children <2 years old and lowest among adults ≥50 years old. CONCLUSION: In Kenya, the incidence of influenza-associated acute LRT illness was high in both rural and urban settings, particularly among the most vulnerable age groups. |
Evidence of an explosive epidemic of HIV infection in a cohort of men who have sex with men in Bangkok, Thailand
van Griensven F , Thienkrua W , McNicholl J , Wimonsate W , Chaikummao S , Chonwattana W , Varangrat A , Sirivongrangson P , Mock PA , Akarasewi P , Tappero JW . AIDS 2012 27 (5) 825-32 OBJECTIVE: To assess HIV-prevalence, incidence and risk factors in a cohort of men who have sex with men (MSM) in Bangkok. DESIGN: Cohort study with 4-monthly follow-up visits conducted between April 2006 and December 2011 at a dedicated study clinic in a central Bangkok hospital. Participants were 1744 homosexually active Thai men, ≥18 years old and residents of Bangkok. METHODS: Men were tested for HIV-infection at every study visit and for sexually transmitted infections at baseline. Demographic and behavioural data were collected by audio-computer-assisted self-interview. Logistic regression analysis was used to evaluate risk factors for HIV-prevalence and Cox proportional hazards analysis to evaluate risk factors for HIV-incidence. RESULTS: Baseline HIV-prevalence was 21.3% (n = 372) and 60-month cumulative HIV-free survival was 76.1% (n = 222). Overall HIV-incidence density was 5.9 per 100 person-years (PY). Multivariate risk factors for HIV-prevalence were older age, secondary/vocational education (vs. university or higher), employed or unemployed (vs. studying), nitrite inhalation, drug use for sexual pleasure, receptive anal intercourse, history of sexual coercion, no prior HIV-testing, and anti-HSV-1 and 2 and T. pallidum (TP) positivity at baseline. Multivariate risk factors for HIV-incidence were younger age, living alone or with roommate (vs. with a partner or family), drug use for sexual pleasure, inconsistent condom use, receptive anal intercourse, group sex, and anti-HSV-1 and 2 and TP positivity at baseline. Having no anal intercourse partners was inversely associated with HIV-incidence. CONCLUSIONS: The high HIV prevalence and incidence in this cohort of Bangkok MSM documents an explosive epidemic. Additional preventive interventions for MSM are urgently needed. |
Global seasonality of rotavirus disease
Patel MM , Pitzer V , Alonso WJ , Vera D , Lopman B , Tate J , Viboud C , Parashar UD . Pediatr Infect Dis J 2012 32 (4) e134-47 BACKGROUND: A substantial number of surveillance studies have documented rotavirus prevalence among children admitted for dehydrating diarrhea. We sought to establish global seasonal patterns of rotavirus disease before widespread vaccine introduction. METHODS: We reviewed studies of rotavirus detection in children with diarrhea published since 1995. We assessed potential relationships between seasonal prevalence and locality by plotting the average monthly proportion of diarrhea cases positive for rotavirus according to geography, country development, and latitude. We used linear regression to identify variables that were potentially associated with the seasonal intensity of rotavirus. RESULTS: Among a total of 99 studies representing all six geographical regions of the world, patterns of year-round disease were more evident in low- and low-middle income countries compared with upper-middle and high income countries where disease was more likely to be seasonal. The level of country development was a stronger predictor of strength of seasonality (P=0.001) than geographical location or climate. However, the observation of distinctly different seasonal patterns of rotavirus disease in some countries with similar geographical location, climate and level of development indicate that a single unifying explanation for variation in seasonality of rotavirus disease is unlikely. CONCLUSION: While no unifying explanation emerged for varying rotavirus seasonality globally, the country income level was somewhat more predictive of the likelihood of having seasonal disease than other factors. Future evaluation of the effect of rotavirus vaccination on seasonal patterns of disease in different settings may help understand factors that drive the global seasonality of rotavirus disease. |
Hepatitis C testing, infection, and linkage to care among racial and ethnic minorities in the United States, 2009-2010
Tohme RA , Xing J , Liao Y , Holmberg SD . Am J Public Health 2012 103 (1) 112-9 OBJECTIVES: We estimated rates and determinants of hepatitis C virus (HCV) testing, infection, and linkage to care among US racial/ethnic minorities. METHODS: We analyzed the Racial and Ethnic Approaches to Community Health Across the US Risk Factor Survey conducted in 2009-2010 (n = 53,896 minority adults). RESULTS: Overall, 19% of respondents were tested for HCV. Only 60% of those reporting a risk factor were tested, with much lower rates among Asians reporting injection drug use (40%). Odds of HCV testing decreased with age and increased with higher education. Of those tested, 8.3% reported HCV infection. Respondents with income of $75,000 or more were less likely to report HCV infection than those with income less than $25,000. College-educated non-Hispanic Blacks and Asians had lower odds of HCV infection than those who did not finish high school. Of those infected, 44.4% were currently being followed by a physician, and 41.9% had taken HCV medications. CONCLUSIONS: HCV testing and linkage to care among racial/ethnic minorities are suboptimal, particularly among those reporting HCV risk factors. Socioeconomic factors were significant determinants of HCV testing, infection, and access to care. Future HCV testing and prevention activities should be directed toward racial/ethnic minorities, particularly those of low socioeconomic status. (Am J Public Health. Published online ahead of print November 15, 2012: e1-e8. doi:10.2105/AJPH.2012.300858). |
Association between community socioeconomic position and HIV diagnosis rate among adults and adolescents in the United States, 2005 to 2009
An Q , Prejean J , McDavid Harrison K , Fang X . Am J Public Health 2012 103 (1) 120-6 OBJECTIVES: We examined the association between socioeconomic position (SEP) and HIV diagnosis rates in the United States and whether racial/ethnic disparities in diagnosis rates persist after control for SEP. METHODS: We used cases of HIV infection among persons aged 13 years and older, diagnosed 2005 through 2009 in 37 states and reported to national HIV surveillance through June 2010, and US Census data, to examine associations between county-level SEP measures and 5-year average annual HIV diagnosis rates overall and among race/ethnicity-sex groups. RESULTS: The HIV diagnosis rate was significantly higher for individuals in the low-SEP tertile than for those in the high-SEP tertile (rate ratios for low- vs high-SEP tertiles range = 1.68-3.38) except for White males and Hispanic females. The SEP disparities were larger for minorities than for Whites. Racial disparities persisted after we controlled for SEP, urbanicity, and percentage of population aged 20 to 50 years, and were high in the low-SEP tertile for males and in low- and high-SEP tertiles for females. CONCLUSIONS: Findings support continued prioritization of HIV testing, prevention, and treatment to persons in economically deprived areas, and Blacks of all SEP levels. (Am J Public Health. Published online ahead of print November 15, 2012: e1-e7. doi:10.2105/AJPH.2012.300853). |
Clinician practices and attitudes regarding early antiretroviral therapy in the United States
Kurth AE , Mayer K , Beauchamp G , McKinstry L , Farrior J , Buchacz K , Donnell D , Branson B , El-Sadr W . J Acquir Immune Defic Syndr 2012 61 (5) e65-e69 BACKGROUND: Use of antiretroviral therapy (ART) to prevent HIV transmission has received substantial attention after a recent trial demonstrating efficacy of ART to reduce HIV transmission in HIV-discordant couples. OBJECTIVE: To assess practices and attitudes of HIV clinicians regarding early initiation of ART for treatment and prevention of HIV at sites participating in the HIV Prevention Trials Network 065 study. DESIGN: Cross-sectional internet-based survey. METHODS: ART-prescribing clinicians (n = 165 physicians, nurse practitioners, physician assistants) at 38 HIV care sites in Bronx, NY, and Washington, DC, completed a brief anonymous Internet survey, before any participation in the HIV Prevention Trials Network 065 study. Analyses included associations between clinician characteristics and willingness to prescribe ART for prevention. RESULTS: Almost all respondents (95%), of whom 59% were female, 66% white, and 77% HIV specialists, "strongly agreed/agreed" that early ART can decrease HIV transmission. Fifty-six percent currently recommend ART initiation for HIV-infected patients with CD4+ count <500 cells per cubic millimeter, and 14% indicated that they initiate ART irrespective of CD4+ count. Most (75%) indicated that they would consider initiating ART earlier than otherwise indicated for patients in HIV-discordant sexual partnerships, and 40% would do so if a patient was having unprotected sex with a partner of unknown HIV status. There were no significant differences by age, gender, or clinician type in likelihood of initiating ART for reasons including HIV transmission prevention to sexual partners. CONCLUSIONS: This sample of US clinicians indicated support for early ART initiation to prevent HIV transmission, especially for situations where such transmission would be more likely to occur. |
Delayed 2009 pandemic influenza A virus subtype H1N1 circulation in West Africa, May 2009-April 2010
Nzussouo NT , Michalove J , Diop OM , Njouom R , Monteiro Mde L , Adje HK , Manoncourt S , Amankwa J , Koivogui L , Sow S , Elkory MB , Collard JM , Dalhatu I , Niang MN , Lafond K , Moniz F , Coulibaly D , Kronman KC , Oyofo BA , Ampofo W , Tamboura B , Bara AO , Jusot JF , Ekanem E , Sarr FD , Hwang I , Cornelius C , Coker B , Lindstrom S , Davis R , Dueger E , Moen A , Widdowson MA . J Infect Dis 2012 206 Suppl 1 S101-7 To understand 2009 pandemic influenza A virus subtype H1N1 (A[H1N1]pdm09) circulation in West Africa, we collected influenza surveillance data from ministries of health and influenza laboratories in 10 countries, including Cameroon, from 4 May 2009 through 3 April 2010. A total of 10,203 respiratory specimens were tested, of which 25% were positive for influenza virus. Until the end of December 2009, only 14% of all detected strains were A(H1N1)pdm09, but the frequency increased to 89% from January through 3 April 2010. Five West African countries did not report their first A(H1N1)pdm09 case until 6 months after the emergence of the pandemic in North America, in April 2009. The time from first detection of A(H1N1)pdm09 in a country to the time of A(H1N1)pdm09 predominance varied from 0 to 37 weeks. Seven countries did not report A(H1N1)pdm09 predominance until 2010. Introduction and transmission of A(H1N1)pdm09 were delayed in this region. |
Global distribution and genetic diversity of Bartonella in bat flies (Hippoboscoidea, Streblidae, Nycteribiidae)
Morse SF , Olival KJ , Kosoy M , Billeter S , Patterson BD , Dick CW , Dittmar K . Infect Genet Evol 2012 12 (8) 1717-23 Recently, a growing number of Bartonella spp. have been identified as causative agents for a broadening spectrum of zoonotic diseases, emphasizing their medical importance. Many mammalian reservoirs and vectors, however, are still unknown, hindering our understanding of pathogen ecology and obscuring epidemiological connections. New Bartonella genotypes were detected in a global sampling of 19 species of blood-feeding bat flies (Diptera, Hippoboscoidea, Nycteribiidae, Streblidae) from 20 host bat species, suggesting an important role of bat flies in harboring bartonellae. Evolutionary relationships were explored in the context of currently known Bartonella species and genotypes. Phylogenetic and gene network analyses point to an early evolutionary association and subsequent radiation of bartonellae with bat flies and their hosts. The recovery of unique clades, uniting Bartonella genotypes from bat flies and bats, supports previous ideas of these flies potentially being vectors for Bartonella. The presence of bartonellae in some female bat flies and their pupae suggests vertical transmission across developmental stages. The specific function of bartonellae in bats and bat flies remains a subject of debate, but in addition to pathogenic interactions, parasitic, mutualistic, or reservoir functions need to be considered. |
Pyrethroid resistance in Aedes aegypti and Aedes albopictus from Port-au-Prince, Haiti
McAllister JC , Godsey MS , Scott ML . J Vector Ecol 2012 37 (2) 325-32 In Port-au-Prince, Haiti, the status of insecticide resistance in Aedes aegypti (L.) and Aedes albopictus (Skuse) populations had not been recently evaluated. No prophylactics exist for dengue, so prevention relies solely on vector control methods. A magnitude-7.0 (Mw) earthquake struck Haiti on January 12, 2010, devastating the area, and dengue became a major concern for the humanitarian relief workers who entered the country. Bottle bioassays were conducted in the field on adult mosquitoes reared from larvae collected from the grounds of the U.S. Embassy and from an adjacent neighborhood in eastern Port-au-Prince, Haiti. At the CDC, Fort Collins, CO, bioassays and molecular and biochemical assays were performed on mosquitoes reared from field-collected eggs. A small percentage of the population was able to survive the diagnostic dose in bioassays run in Haiti. Mosquitoes tested at the CDC demonstrated no phenotypic resistance. A variety of factors could be responsible for the discrepancies between the field and lab data, but temperature and larval nutrition are probably most important. Knowledge of localized resistance and underlying mechanisms helps in making rational decisions in the selection of appropriate and effective insecticides in the event of a dengue outbreak. |
Urinary perchlorate as a measure of dietary and drinking water exposure in a representative sample of the United States population 2001-2008
Lau FK , Decastro BR , Mills-Herring L , Tao L , Valentin-Blasini L , Alwis KU , Blount BC . J Expo Sci Environ Epidemiol 2012 23 (2) 207-14 Perchlorate (ClO(4)(-)) is ubiquitous in the environment and inhibits the thyroid's uptake of iodide. Food and tap water are likely sources of environmental exposure to perchlorate. The aim of this study was to identify significant dietary sources of perchlorate using perchlorate measured in urine as an exposure indicator. Sample-weighted, age-stratified linear regression models of National Health and Nutrition Examination Survey (NHANES) 2001-2008 data (n=16,955 participants) characterized the association between urinary perchlorate and the mass consumed in USDA food groups, controlling for urinary creatinine and other potential confounders. Separate models of NHANES 2005-2006 data (n=2841) evaluated the association between urinary perchlorate and perchlorate consumed via residential tap water. Consumption of milk products was associated with statistically significant contributions to urinary perchlorate across all age strata: 2.93 ng ClO(4)(-)/ml per kg consumed for children (6-11 years-old (YO)); 1.54 ng ClO(4)(-)/ml per kg for adolescents (12-19 YO); and 0.69 ng ClO(4)(-)/ml per kg for adults (20-84 YO). Vegetables were a significant contributor for adolescents and adults, whereas fruits and eggs contributed significantly only for adults. Dark-green leafy vegetables contributed the most among all age strata: 30.83 ng ClO(4)(-)/ml per kg for adults. Fats, oils, and salad dressings were significant contributors only for children. Three food groups were negatively associated with urinary perchlorate: grain products for children; sugars, sweets, and beverages for adolescents; and home tap water for adults. In a separate model, however, perchlorate consumed via home tap water contributed significantly to adult urinary perchlorate: 2.11E-4 ng ClO(4)(-)/ml per ng perchlorate in tap water consumed. 
In a nationally representative sample of the United States population aged 6-84 years, diet and tap water contributed significantly to urinary perchlorate, with diet contributing substantially more than tap water. |
Predictors and variability of urinary paraben concentrations in men and women, including before and during pregnancy
Smith KW , Braun JM , Williams PL , Ehrlich S , Correia KF , Calafat AM , Ye X , Ford J , Keller M , Meeker JD , Hauser R . Environ Health Perspect 2012 120 (11) 1538-43 BACKGROUND: Parabens are suspected endocrine disruptors and ubiquitous preservatives used in personal care products, pharmaceuticals, and foods. No studies have assessed the variability of parabens in women, including during pregnancy. OBJECTIVE: We evaluated predictors and variability of urinary paraben concentrations. METHODS: We measured urinary concentrations of methyl (MP), propyl (PP), and butyl paraben (BP) among couples from a fertility center. Mixed-effects regression models were fit to examine demographic predictors of paraben concentrations and to calculate intraclass correlation coefficients (ICCs). RESULTS: Between 2005 and 2010, we collected 2,721 spot urine samples from 245 men and 408 women. The median concentrations were 112 microg/L (MP), 24.2 microg/L (PP), and 0.70 microg/L (BP). Urinary MP and PP concentrations were 4.6 and 7.8 times higher in women than men, respectively, and concentrations of both MP and PP were 3.8 times higher in African Americans than Caucasians. MP and PP concentrations were slightly more variable in women (ICC = 0.42, 0.43) than men (ICC = 0.54, 0.51), and were weakly correlated between partners (r = 0.27-0.32). Among 129 pregnant women, urinary paraben concentrations were 25-45% lower during pregnancy than before pregnancy, and MP and PP concentrations were more variable (ICCs of 0.38 and 0.36 compared with 0.46 and 0.44, respectively). CONCLUSIONS: Urinary paraben concentrations were more variable in women compared with men, and during pregnancy compared with before pregnancy. However, results for this study population suggest that a single urine sample may reasonably represent an individual's exposure over several months, and that a single sample collected during pregnancy may reasonably classify gestational exposure. |
Excessive heat and respiratory hospitalizations in New York State: estimating current and future public health burden related to climate change
Lin S , Hsu WH , Van Zutphen AR , Saha S , Luber G , Hwang SA . Environ Health Perspect 2012 120 (11) 1571-7 BACKGROUND: Although many climate-sensitive environmental exposures are related to mortality and morbidity, there is a paucity of estimates of the public health burden attributable to climate change. OBJECTIVE: We estimated the excess current and future public health impacts related to respiratory hospitalizations attributable to extreme heat in summer in New York State (NYS) overall, its geographic regions, and across different demographic strata. METHODS: On the basis of threshold temperature and percent risk changes identified from our study in NYS, we estimated recent and future attributable risks related to extreme heat due to climate change using the global climate model with various climate scenarios. We estimated effects of extreme high apparent temperature in summer on respiratory admissions, days hospitalized, direct hospitalization costs, and lost productivity from days hospitalized after adjusting for inflation. RESULTS: The estimated respiratory disease burden attributable to extreme heat at baseline (1991-2004) in NYS was 100 hospital admissions, US$644,069 in direct hospitalization costs, and 616 days of hospitalization per year. Projections for 2080-2099 based on three different climate scenarios ranged from 206-607 excess hospital admissions, US$26-$76 million in hospitalization costs, and 1,299-3,744 days of hospitalization per year. Estimated impacts varied by geographic region and population demographics. CONCLUSIONS: We estimated that excess respiratory admissions in NYS due to excessive heat would be 2 to 6 times higher in 2080-2099 than in 1991-2004. When combined with other heat-associated diseases and mortality, the potential public health burden associated with global warming could be substantial. |
Adiposity, body composition, and weight change in relation to organochlorine pollutant plasma concentrations
De Roos AJ , Ulrich CM , Sjodin A , McTiernan A . J Expo Sci Environ Epidemiol 2012 22 (6) 617-24 We investigated cross-sectional associations of body composition and weight change with polychlorinated biphenyls (PCB) and organochlorine pesticides/pesticide metabolites measured in blood collected at the baseline of the Physical Activity for Total Health study of postmenopausal, overweight women living in the Seattle, Washington metropolitan area. Indicators of greater adiposity were associated with lower plasma concentrations of most PCBs with six or more chlorine atoms. This pattern was observed for current weight, body mass index, fat mass percent, subcutaneous abdominal fat, intra-abdominal fat, waist circumference, hip circumference, waist-to-hip ratio, and maximum adult weight. Conversely, PCB 105, PCB 118, and p,p'-DDE were generally increased or showed no association with these variables. Weight gain since age 35 was associated with lower concentrations of almost every organochlorine we studied, and past weight loss episodes of at least 20 pounds (≥9.1 kg) were associated with higher concentrations. Our results have implications for epidemiologic studies of organochlorines in terms of covariates that may be important to consider in statistical analyses, particularly as such considerations may differ importantly by specific analyte. Our finding of increased organochlorine concentrations with past weight loss episodes may have public health significance; however, this association requires confirmation in longitudinal studies. |
Use of electronic health records and administrative data for public health surveillance of eye health and vision-related conditions in the United States
Elliott AF , Davidson A , Lum F , Chiang MF , Saaddine JB , Zhang X , Crews JE , Chou CF . Am J Ophthalmol 2012 154 S63-70 PURPOSE: To discuss the current trend toward greater use of electronic health records and how these records could enhance public health surveillance of eye health and vision-related conditions. DESIGN: Perspective, comparing systems. METHODS: We describe 3 currently available sources of electronic health data (Kaiser Permanente, the Veterans Health Administration, and the Centers for Medicare & Medicaid Services) and how these sources can contribute to a comprehensive vision and eye health surveillance system. RESULTS: Each of the 3 sources of electronic health data can contribute meaningfully to a comprehensive vision and eye health surveillance system, but none currently provide all the information required. The use of electronic health records for vision and eye health surveillance has both advantages and disadvantages. CONCLUSIONS: Electronic health records may provide additional information needed to create a comprehensive vision and eye health surveillance system. Recommendations for incorporating electronic health records into such a system are presented. |
The variability of vision loss assessment in federally sponsored surveys: seeking conceptual clarity and comparability
Crews JE , Lollar DJ , Kemper AR , Lee LM , Owsley C , Zhang X , Elliott AF , Chou CF , Saaddine JB . Am J Ophthalmol 2012 154 S31-S44 e1 PURPOSE: To review U.S. national population-based surveys to evaluate comparability and conceptual clarity of vision measures. DESIGN: Perspective. METHODS: The vision questions in 12 surveys were mapped to the World Health Organization's International Classification of Functioning, Disability and Health framework under the domains of condition, impairment, activity limitation, participation, and environment. Surveys examined include the National Health Interview Survey, the Behavioral Risk Factor Surveillance System, the National Health and Nutrition Examination Survey, the Census, and the Visual Function Questionnaire. RESULTS: Nearly 100 vision measures were identified in the 12 surveys. These surveys provided no consistent measure of vision or vision impairment. Survey questions asked about differing characteristics of vision-related disease, function, and social roles. A question about the ability to read newspaper print was the most commonly asked question across surveys. CONCLUSIONS: Limited comparability of data and lack of conceptual clarity in the population-based surveys resulted in an inability to consistently characterize the population of people experiencing vision impairment. Consequently, vision surveillance was limited. |
Mortality trends from 2003 to 2009 among adolescents and young adults in rural western Kenya using a health and demographic surveillance system
Phillips-Howard PA , Odhiambo FO , Hamel M , Adazu K , Ackers M , van Eijk AM , Orimba V , Hoog AV , Beynon C , Vulule J , Bellis MA , Slutsker L , de Cock K , Breiman R , Laserson KF . PLoS One 2012 7 (11) e47017 BACKGROUND: Targeted global efforts to improve survival of young adults need information on mortality trends; contributions from health and demographic surveillance systems (HDSS) are required. METHODS AND FINDINGS: This study aimed to explore changing trends in deaths among adolescents (15-19 years) and young adults (20-24 years), using census and verbal autopsy data from an HDSS in rural western Kenya. Mid-year population estimates were used to generate all-cause mortality rates per 100,000 population by age and gender, and by communicable (CD) and non-communicable disease (NCD) causes. Linear trends from 2003 to 2009 were examined. In 2003, all-cause mortality rates of adolescents and young adults were 403 and 1,613 per 100,000 population, respectively, among females; and 217 and 716 per 100,000, respectively, among males. CD mortality rates among females and males 15-24 years were 500 and 191 per 100,000 (relative risk [RR] 2.6; 95% confidence interval [CI] 1.7-4.0; p<0.001). NCD mortality rates in same-aged females and males were similar (141 and 128 per 100,000, respectively; p = 0.76). By 2009, all-cause mortality rates had fallen 53% among young adult females (chi(2) for linear trend 30.4; p<0.001) and 61.5% among adolescent females (chi(2) for linear trend 11.9; p<0.001). No significant CD mortality reductions occurred among males, nor for NCD mortality in either gender. By 2009, all-cause, CD, and NCD mortality rates were not significantly different between males and females, and among males, injuries equalled HIV as the top cause of death. 
CONCLUSIONS: This study found significant reductions in adolescent and young adult female mortality rates, evidencing the effects of targeted public health programmes; however, all-cause and CD mortality rates among females remain alarmingly high. These data underscore the need to strengthen programmes and target strategies to reach both males and females, and to promote NCD as well as CD initiatives to reduce the mortality burden among both genders. |
Newborn screening for critical congenital heart disease: essential public health roles for birth defects monitoring programs
Olney RS , Botto LD . Birth Defects Res A Clin Mol Teratol 2012 94 (12) 965-9 Newborn screening for critical congenital heart defects, added in September 2011 to the Recommended Uniform Screening Panel in the United States, is a new public health priority and has particular relevance for state birth defects surveillance programs. In this commentary, we review the background to potential involvement by birth defects programs with screening, and detail key questions that these programs can evaluate: (1) health outcomes after newborn screening among affected children; (2) missed primary targets of screening (i.e., affected children who were not screened or had false-negative screens); (3) burden and screening accuracy for secondary targets; (4) the role of altitude, sociodemographic characteristics, and other special circumstances; (5) the contribution of prenatal and clinical diagnoses before newborn screening; and (6) costs and service utilization. To address these issues, monitoring programs will need to pay particular attention to: (1) data sources and quality; (2) timeliness; (3) long-term follow-up for comprehensive outcomes; (4) reporting standards; and (5) state and national program coordination. Although some aspects of involvement with these screening programs will require new partnerships and paradigm shifts in birth defects program operations, the visibility of these screening programs among stakeholders will also provide birth defects programs with new opportunities to demonstrate their usefulness. |
Influenza surveillance in 15 countries in Africa, 2006-2010
Radin JM , Katz MA , Tempia S , Nzussouo NT , Davis R , Duque J , Adedeji A , Adjabeng MJ , Ampofo WK , Ayele W , Bakamutumaho B , Barakat A , Cohen AL , Cohen C , Dalhatu IT , Daouda C , Dueger E , Francisco M , Heraud JM , Jima D , Kabanda A , Kadjo H , Kandeel A , Bi Shamamba SK , Kasolo F , Kronmann KC , Mazaba Liwewe ML , Lutwama JJ , Matonya M , Mmbaga V , Mott JA , Muhimpundu MA , Muthoka P , Njuguna H , Randrianasolo L , Refaey S , Sanders C , Talaat M , Theo A , Valente F , Venter M , Woodfill C , Bresee J , Moen A , Widdowson MA . J Infect Dis 2012 206 Suppl 1 S14-21 BACKGROUND: In response to the potential threat of an influenza pandemic, several international institutions and governments, in partnership with African countries, invested in the development of epidemiologic and laboratory influenza surveillance capacity in Africa and the African Network of Influenza Surveillance and Epidemiology (ANISE) was formed. METHODS: We used a standardized form to collect information on influenza surveillance system characteristics, the number and percent of influenza-positive patients with influenza-like illness (ILI), or severe acute respiratory infection (SARI) and virologic data from countries participating in ANISE. RESULTS: Between 2006 and 2010, the number of ILI and SARI sites in 15 African countries increased from 21 to 127 and from 2 to 98, respectively. Children 0-4 years accounted for 48% of all ILI and SARI cases of which 22% and 10%, respectively, were positive for influenza. Influenza peaks were generally discernible in North and South Africa. Substantial cocirculation of influenza A and B occurred most years. CONCLUSIONS: Influenza is a major cause of respiratory illness in Africa, especially in children. 
Further strengthening influenza surveillance, along with conducting special studies on influenza burden, cost of illness, and the role of other respiratory pathogens, will help detect novel influenza viruses and inform the development of targeted influenza prevention policies in the region. |
Genomic comparison of Escherichia coli O104:H4 isolates from 2009 and 2011 reveals plasmid and prophage heterogeneity, including Shiga toxin-encoding phage stx2
Ahmed SA , Awosika J , Baldwin C , Bishop-Lilly KA , Biswas B , Broomall S , Chain PS , Chertkov O , Chokoshvili O , Coyne S , Davenport K , Detter JC , Dorman W , Erkkila TH , Folster JP , Frey KG , George M , Gleasner C , Henry M , Hill KK , Hubbard K , Insalaco J , Johnson S , Kitzmiller A , Krepps M , Lo CC , Luu T , McNew LA , Minogue T , Munk CA , Osborne B , Patel M , Reitenga KG , Rosenzweig CN , Shea A , Shen X , Strockbine N , Tarr C , Teshima H , van Gieson E , Verratti K , Wolcott M , Xie G , Sozhamannan S , Gibbons HS . PLoS One 2012 7 (11) e48228 In May of 2011, an enteroaggregative Escherichia coli O104:H4 strain that had acquired a Shiga toxin 2-converting phage caused a large outbreak of bloody diarrhea in Europe which was notable for its high prevalence of hemolytic uremic syndrome cases. Several studies have described the genomic inventory and phylogenies of strains associated with the outbreak and a collection of historical E. coli O104:H4 isolates using draft genome assemblies. We present the complete, closed genome sequences of an isolate from the 2011 outbreak (2011C-3493) and two isolates from cases of bloody diarrhea that occurred in the Republic of Georgia in 2009 (2009EL-2050 and 2009EL-2071). Comparative genome analysis indicates that, while the Georgian strains are the nearest neighbors to the 2011 outbreak isolates sequenced to date, structural and nucleotide-level differences are evident in the Stx2 phage genomes, the mer/tet antibiotic resistance island, and in the prophage and plasmid profiles of the strains, including a previously undescribed plasmid with homology to the pMT virulence plasmid of Yersinia pestis. In addition, multiphenotype analysis showed that 2009EL-2071 possessed higher resistance to polymyxin and membrane-disrupting agents. Finally, we show evidence by electron microscopy of the presence of a common phage morphotype among the European and Georgian strains and a second phage morphotype among the Georgian strains. 
The presence of at least two stx2 phage genotypes in host genetic backgrounds that may derive from a recent common ancestor of the 2011 outbreak isolates indicates that the emergence of stx2 phage-containing E. coli O104:H4 strains probably occurred more than once, or that the current outbreak isolates may be the result of a recent transfer of a new stx2 phage element into a pre-existing stx2-positive genetic background. |
Genomic basis of a polyagglutinating isolate of Neisseria meningitidis
Rishishwar L , Katz LS , Sharma NV , Rowe L , Frace M , Thomas JD , Harcourt BH , Mayer LW , Jordan IK . J Bacteriol 2012 194 (20) 5649-56 Containment strategies for outbreaks of invasive Neisseria meningitidis disease are informed by serogroup assays that characterize the polysaccharide capsule. We sought to uncover the genomic basis of conflicting serogroup assay results for an isolate (M16917) from a patient with acute meningococcal disease. To this end, we characterized the complete genome sequence of the M16917 isolate and performed a variety of comparative sequence analyses against N. meningitidis reference genome sequences of known serogroups. Multilocus sequence typing and whole-genome sequence comparison revealed that M16917 is a member of the ST-11 sequence group, which is most often associated with serogroup C. However, sequence similarity comparisons and phylogenetic analysis showed that the serogroup diagnostic capsule polymerase gene (synD) of M16917 belongs to serogroup B. These results suggest that a capsule-switching event occurred based on homologous recombination at or around the capsule locus of M16917. Detailed analysis of this locus uncovered the locations of recombination breakpoints in the M16917 genome sequence, which led to the introduction of an approximately 2-kb serogroup B sequence cassette into the serogroup C genomic background. Since there is no currently available vaccine for serogroup B strains of N. meningitidis, this kind of capsule-switching event could have public health relevance as a vaccine escape mutant. |
Attitudes and program preferences of African-American urban young adults about pre-exposure prophylaxis (PrEP)
Smith DK , Toledo L , Smith DJ , Adams MA , Rothenberg R . AIDS Educ Prev 2012 24 (5) 408-21 We elicited attitudes about, and service access preferences for, daily oral antiretroviral pre-exposure prophylaxis (PrEP) from urban African-American young men and women, ages 18-24 years, at risk for HIV transmission through their sexual and drug-related behaviors, who participated in eight mixed-gender and two MSM-only focus groups in Atlanta, Georgia. Participants reported substantial interest in PrEP, associated with its perceived cost, effectiveness, and ease of accessing services and medication near their homes or by public transportation. Frequent HIV testing was a perceived benefit. Participants differed about whether risk-reduction behaviors would change, and in which direction, and about whether PrEP use would be associated with HIV stigma or would enhance the reputation of PrEP users. This study provides the first information about the interests, concerns, and preferences of young adult African Americans that can be used to inform the introduction of PrEP services into HIV prevention efforts for this critical population group. |
Receipt of cancer treatment summaries and follow-up instructions among adult cancer survivors: results from a national survey
Sabatino SA , Thompson TD , Smith JL , Rowland JH , Forsythe LP , Pollack L , Hawkins NA . J Cancer Surviv 2012 7 (1) 32-43 PURPOSE: The purpose of this study is to examine reporting of treatment summaries and follow-up instructions among cancer survivors. METHODS: Using the 2010 National Health Interview Survey, we created logistic regression models among cancer survivors not in treatment (n = 1,345) to determine characteristics associated with reporting treatment summaries and written follow-up instructions, adjusting for sociodemographic, access, and cancer-related factors. Findings are presented for all survivors and those recently diagnosed (≤4 years). We also examined unadjusted associations between written instructions and subsequent surveillance and screening. RESULTS: Among those recently diagnosed, 38% reported receiving treatment summaries and 58% reported written instructions. Among all survivors, approximately one third reported summaries and 44% reported written instructions. After adjustment, lower reporting of summaries was associated with cancer site, race, and number of treatment modalities among those recently diagnosed, and white vs. black or Hispanic race/ethnicity, breast vs. colorectal cancer, >10 vs. ≤5 years since diagnosis, no clinical trials participation, and better than fair health among all survivors. For instructions, lower reporting was associated with no trials participation and lower income among those recently diagnosed, and increasing age, white vs. black race, lower income, >10 vs. ≤5 years since diagnosis, 1 vs. ≥2 treatment modalities, no trials participation, and at least good vs. fair/poor health among all survivors. Written instructions were associated with reporting provider recommendations for breast and cervical cancer surveillance, and recent screening mammograms. CONCLUSION: Many recently diagnosed cancer survivors did not report receiving treatment summaries and written follow-up instructions. 
Opportunities exist to examine associations between use of these documents and recommended care and outcomes, and to facilitate their adoption. IMPLICATIONS FOR CANCER SURVIVORS: Cancer survivors who have completed therapy should ask their providers for treatment summaries and written follow-up instructions, and discuss with them how their cancer and therapy impact their future health care. |
Influenza virus H5 DNA vaccination is immunogenic by intramuscular and intradermal routes in humans
Ledgerwood JE , Hu Z , Gordon IJ , Yamshchikov G , Enama ME , Plummer S , Bailer R , Pearce MB , Tumpey TM , Koup RA , Mascola JR , Nabel GJ , Graham BS . Clin Vaccine Immunol 2012 19 (11) 1792-7 Avian influenza virus causes outbreaks in domestic and wild birds around the world, and sporadic human infections have been reported. A DNA vaccine encoding hemagglutinin (HA) protein from the A/Indonesia/5/05 (H5N1) strain was initially tested in two randomized phase I clinical studies. Vaccine Research Center study 304 (VRC 304) was a double-blinded study with 45 subjects randomized to placebo, 1 mg of vaccine, or 4 mg of vaccine treatment groups (n = 15/group) by intramuscular (i.m.) Biojector injection. VRC 305 was an open-label study to evaluate route, with 44 subjects randomized to intradermal (i.d.) injections of 0.5 mg by needle/syringe or by Biojector or 1 mg delivered as two 0.5-mg Biojector injections in the same deltoid or as 0.5 mg in each deltoid (n = 11/group). Injections were administered at weeks 0, 4, and 8 in both studies. Antibody responses to H5 were assessed by hemagglutination inhibition (HAI) assay, enzyme-linked immunosorbent assay (ELISA), and neutralization assay, and the H5 T cell responses were assessed by enzyme-linked immunospot and intracellular cytokine staining assays. There were no vaccine-related serious adverse events, and the vaccine was well tolerated in all groups. At 1 mg, i.d. vaccination compared to i.m. vaccination induced a greater frequency and magnitude of response by ELISA, but there were no significant differences in the frequency or magnitude of response between the i.d. and i.m. routes in the HAI or neutralization assays. T cell responses were more common in subjects who received the 1- or 4-mg dose i.m. These studies demonstrated that the DNA vaccine encoding H5 is safe and immunogenic and served to define the proper dose and route for further studies. The i.d. 
injection route did not offer a significant advantage over the i.m. route, and no difference was detected by delivery to one site versus splitting the dose between two sites for i.d. vaccine administration. The 4-mg dose (i.m.) was further investigated in prime-boost regimens. |
Rules and tools that improved Vaccines for Children vaccine-ordering practices in Oregon: a 2010 pilot project
Hewett R , Vancuren A , Trocio L , Beaudrault S , Gund A , Luther M , Groom H . J Public Health Manag Pract 2013 19 (1) 9-15 OBJECTIVE: This project's objective was to enhance efforts to improve vaccine-ordering efficiencies among targeted clinics using publicly purchased vaccines. DESIGN: Using an assessment of ordering behavior developed by the Centers for Disease Control and Prevention, we selected and trained immunization providers and assessed improvements in ordering behavior by comparing ordering patterns before and after the intervention. SETTING: A total of 144 Vaccines for Children program providers in Oregon. PARTICIPANTS: We assessed 144 providers trained in the Economic Order Quantity process between January and November 2010. INTERVENTION: Providers were invited to participate in regional trainings. Trainings included assignment of ordering frequency and dissemination of tools to support adherence to the recommended ordering frequency. MAIN OUTCOME MEASURE(S): The percent increase in targeted clinics ordering according to the recommended order frequency and the resulting decrease in orders placed, as an outcome of training and ordering tools. RESULTS: Only 35% of targeted providers were ordering according to the recommended ordering frequency before the project began. After completing training and utilizing ordering tools over a 7-month period, 78% of the targeted clinics were ordering according to the recommended frequency, a 120% increase in the number of clinics doing so. At baseline, targeted clinics placed 915 total vaccine orders over a 7-month period. After completing training and participating in the Economic Order Quantity process, only 645 orders were placed, a reduction of 30%. CONCLUSIONS: The initiative was successful in reducing the number of orders placed by Vaccines for Children providers in Oregon. 
A previous effort to reduce ordering, without the use of training or tools, did not achieve the same levels of provider compliance, suggesting that the addition of staff and development of tools were helpful in supporting behavior change and improving providers' ability to adhere to assigned order frequencies. Reducing order frequency results in more efficient vaccine ordering patterns and benefits vaccine distributors, Oregon Immunization Program staff, and provider staff. |
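The ordering-improvement figures above can be checked with simple percent-change arithmetic. A minimal sketch (the clinic percentages and order counts are taken from the abstract; the 120% figure in the abstract is based on clinic counts, so small rounding differences from the percentage-based calculation are expected):

```python
def percent_change(before, after):
    """Relative change from `before` to `after`, in percent."""
    return (after - before) / before * 100

# Orders placed over comparable 7-month periods: 915 before, 645 after.
print(round(percent_change(915, 645)))  # -30, i.e., "a reduction of 30%"

# Share of clinics ordering at the recommended frequency: 35% -> 78%.
print(round(percent_change(35, 78)))  # 123, close to the reported 120%
```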
Vaccination coverage among American Indian and Alaska Native children, 2006-2010
Groom AV , Santibanez TA , Bryan RT . Pediatrics 2012 130 (6) e1592-9 BACKGROUND AND OBJECTIVES: A previous study on vaccination coverage in the American Indian/Alaska Native (AI/AN) population found that disparities in coverage between AI/AN and white children existed from 2001 to 2004 but were absent in 2005. The objective of this study was to describe vaccination coverage levels for AI/AN children aged 19-35 months in the United States between 2006 and 2010, examining whether gains found for AI/AN children in 2005 have been sustained. METHODS: Data from the 2006 through 2010 National Immunization Surveys were analyzed. Groups were defined as AI/AN (alone or in combination with any other race and excluding Hispanics) and white-only non-Hispanic children. Comparisons in demographics and vaccination coverage were made. RESULTS: Demographic risk factors often associated with underimmunization were significantly higher for AI/AN respondents compared with white respondents in most years studied. Overall, vaccination coverage was similar between the 2 groups in most years, although coverage with 4 or more doses of pneumococcal conjugate vaccine was lower for AI/AN children in 2008 and 2009, as was coverage with the vaccine series measures in 2006 and 2009. When stratified by geographic regions, AI/AN children had coverage that was similar to or higher than that of white children for most vaccines in most years studied. CONCLUSIONS: The gains in vaccination coverage found in 2005 have been maintained. The absence of disparities in coverage with most vaccines between AI/AN children and white children from 2006 through 2010 is a clear success. These types of periodic reviews are important to ensure we remain vigilant. |
Long-term immune responses to Coxiella burnetii after vaccination
Kersh GJ , Fitzpatrick KA , Self JS , Biggerstaff BJ , Massung RF . Clin Vaccine Immunol 2012 20 (2) 129-33 Q fever is a zoonotic disease caused by infection with the bacterium Coxiella burnetii. Infection with C. burnetii results in humoral and cellular immune responses, both of which are thought to contribute to protection against subsequent infection. Whole-cell formalin-inactivated vaccines have also been shown to induce both humoral and cellular immunity and provide protection. Whether measurement of cellular or humoral immunity is a better indicator of immune protection is not known, and the duration of immunity induced by natural infection or vaccination is also poorly understood. To better understand the measurement and duration of C. burnetii immunity, sixteen people vaccinated against Q fever (0.2 to 10.3 years before analysis) and 29 controls with low risk of Q fever exposure were tested for immune responses to C. burnetii by an indirect fluorescent antibody test (IFA) to measure circulating antibody and by an interferon gamma release assay (IGRA) to measure cellular immunity. In vaccinated subjects, the IFA detected antibodies in 13/16, and the IGRA also detected positive responses in 13/16. All of the vaccinated subjects had a positive response in at least one of the assays, whereas 8/29 control subjects were positive in at least one assay. There was not a correlation between time since vaccination and responses in these assays. These results show that IFA and IGRA perform similarly in detection of C. burnetii immune responses, and that Q fever vaccination establishes long-lived immune responses to C. burnetii. |
More evidence for use of pneumococcal conjugate vaccines
Whitney CG . Lancet 2012 381 (9862) 182-3 Pneumococcal conjugate vaccines (PCVs) are among the leading interventions for reducing deaths and improving the health of children around the world. These vaccines are now routinely used in about 88 countries, with the number of countries increasing quickly.1 PCVs are used on various schedules, designed to complement existing schedules for other vaccines that are already part of national immunisation programmes. Until now, however, clinical trial evidence to support some of the different ways PCVs can be used was missing. Results from Arto Palmu and colleagues' Finnish Invasive Pneumococcal disease (FinIP) vaccine trial2 in The Lancet fill some of these gaps. | This cluster randomised trial in Finland showed vaccine efficacy of 93% (95% CI 75–99) to 100% (79–100) against invasive pneumococcal disease for the ten-valent pneumococcal vaccine formulation. This vaccine is constructed of pneumococcal polysaccharides conjugated to Haemophilus influenzae protein D, tetanus toxoid, and diphtheria toxoid carriers (PHiD-CV10, marketed under the trade name of Synflorix). PHiD-CV10 was first licensed in 2009 on the basis of a comparable immune response to the already licensed seven-valent pneumococcal conjugate vaccine (Prevnar/Prevenar) and on data showing efficacy against otitis media for an earlier, similar 11-valent version.3 While the data used for licensure showed that PHiD-CV10 would be efficacious against disease, the evidence was somewhat indirect. Palmu and colleagues provide confirmatory, conclusive evidence about the vaccine's benefits against invasive disease. Additional direct data, from a clinical trial that assessed pneumonia endpoints, have been presented in abstract form, but are not yet published.4 |
Association of childhood pertussis with receipt of 5 doses of pertussis vaccine by time since last vaccine dose, California, 2010
Misegades LK , Winter K , Harriman K , Talarico J , Messonnier NE , Clark TA , Martin SW . JAMA 2012 308 (20) 2126-32 CONTEXT: In 2010, California experienced its largest pertussis epidemic in more than 60 years; a substantial burden of disease was noted in the 7- to 10-year-old age group despite high diphtheria, tetanus, and acellular pertussis vaccine (DTaP) coverage, indicating the possibility of waning protection. OBJECTIVE: To evaluate the association between pertussis and receipt of 5 DTaP doses by time since fifth DTaP dose. DESIGN, SETTING, AND PARTICIPANTS: Case-control evaluation conducted in 15 California counties. Cases (n = 682) were all suspected, probable, and confirmed pertussis cases among children aged 4 to 10 years reported from January through December 14, 2010; controls (n = 2016) were children in the same age group who received care from the clinicians reporting the cases. Three controls were selected per case. Vaccination histories were obtained from medical records and immunization registries. MAIN OUTCOME MEASURES: Primary outcomes were (1) odds ratios (ORs) for the association between pertussis and receipt of the 5-dose DTaP series and (2) ORs for the association between pertussis and time since completion (<12, 12-23, 24-35, 36-47, 48-59, or ≥60 months) of the 5-dose DTaP series. Logistic regression was used to calculate ORs, accounting for clustering by county and clinician, and vaccine effectiveness (VE) was estimated as (1 - OR) x 100%. RESULTS: Among cases and controls, 53 (7.8%) and 19 (0.9%) had not received any pertussis-containing vaccines, respectively. Compared with controls, children with pertussis had a lower odds of having received all 5 doses of DTaP (OR, 0.11; 95% CI, 0.06-0.21 [estimated VE, 88.7%; 95% CI, 79.4%-93.8%]). 
When children were categorized by time since completion of the DTaP series, using an unvaccinated reference group, children with pertussis compared with controls were less likely to have received their fifth dose within the prior 12 months (19 [2.8%] vs 354 [17.6%], respectively; OR, 0.02; 95% CI, 0.01-0.04 [estimated VE, 98.1%; 95% CI, 96.1%-99.1%]). This association was evident with longer time since vaccination, with ORs increasing with time since the fifth dose. At 60 months or longer (n = 231 cases [33.9%] and n = 288 controls [14.3%]), the OR was 0.29 (95% CI, 0.15-0.54 [estimated VE, 71.2%; 95% CI, 45.8%-84.8%]). Accordingly, the estimated VE declined each year after receipt of the fifth dose of DTaP. CONCLUSION: Among children in 15 California counties, children with pertussis, compared with controls, had lower odds of having received the 5-dose DTaP series; as time since last DTaP dose increased, the odds increased, which is consistent with a progressive decrease in estimated vaccine effectiveness each year after the final dose of pertussis vaccine. |
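The abstract's vaccine effectiveness estimates follow directly from the stated formula VE = (1 - OR) x 100%. A minimal sketch using the point estimates reported above (the small differences from the published percentages reflect adjustment in the original models):

```python
def vaccine_effectiveness(odds_ratio):
    """Estimated vaccine effectiveness (%) from a case-control odds ratio,
    using the abstract's formula VE = (1 - OR) x 100%."""
    return (1.0 - odds_ratio) * 100.0

# ORs reported above: 0.11 overall, 0.02 within 12 months of the fifth
# dose, 0.29 at 60 months or longer since the fifth dose.
print(round(vaccine_effectiveness(0.11), 1))  # 89.0 (reported: 88.7%)
print(round(vaccine_effectiveness(0.02), 1))  # 98.0 (reported: 98.1%)
print(round(vaccine_effectiveness(0.29), 1))  # 71.0 (reported: 71.2%)
```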
Dose sparing and enhanced immunogenicity of inactivated rotavirus vaccine administered by skin vaccination using a microneedle patch
Moon S , Wang Y , Edens C , Gentsch JR , Prausnitz MR , Jiang B . Vaccine 2012 31 (34) 3396-402 Skin immunization is effective against a number of infectious diseases, including smallpox and tuberculosis, but is difficult to administer. Here, we assessed the use of an easy-to-administer microneedle (MN) patch for skin vaccination using an inactivated rotavirus vaccine (IRV) in mice. Female inbred BALB/c mice in groups of six were immunized once in the skin using MN coated with 5mcg or 0.5mcg of inactivated rotavirus antigen or by intramuscular (IM) injection with 5mcg or 0.5mcg of the same antigen, bled at 0 and 10 days, and exsanguinated at 28 days. Rotavirus-specific IgG titers increased over time in sera of mice immunized with IRV using MN or IM injection. However, titers of IgG and neutralizing activity were generally higher in MN immunized mice than in IM immunized mice; the titers in mice that received 0.5mcg of antigen with MN were comparable or higher than those that received 5mcg of antigen IM, indicating dose sparing. None of the mice receiving negative-control, antigen-free MN had any IgG titers. In addition, MN immunization was at least as effective as IM administration in inducing a memory response of dendritic cells in the spleen. Our findings demonstrate that MN delivery can reduce the IRV dose needed to mount a robust immune response compared to IM injection and holds promise as a strategy for developing a safer and more effective rotavirus vaccine for use among children throughout the world. |
Development and evaluation of a Naive Bayesian model for coding causation of workers' compensation claims
Bertke SJ , Meyers AR , Wurzelbacher SJ , Bell J , Lampl ML , Robins D . J Safety Res 2012 43 327-32 INTRODUCTION: Tracking and trending rates of injuries and illnesses classified as musculoskeletal disorders caused by ergonomic risk factors such as overexertion and repetitive motion (MSDs) and slips, trips, or falls (STFs) in different industry sectors is of high interest to many researchers. Unfortunately, identifying the cause of injuries and illnesses in large datasets such as workers' compensation systems often requires reading and coding the free-form accident text narrative for potentially millions of records. METHOD: To alleviate the need for manual coding, this paper describes and evaluates a computer auto-coding algorithm that demonstrated the ability to code millions of claims quickly and accurately by learning from a set of previously manually coded claims. CONCLUSIONS: The auto-coding program was able to code claims as an MSD, STF, or other with approximately 90% accuracy. IMPACT ON INDUSTRY: The program developed and discussed in this paper provides an accurate and efficient method for identifying the causation of workers' compensation claims as a STF or MSD in a large database based on the unstructured text narrative and resulting injury diagnoses. The program coded thousands of claims in minutes. The method described in this paper can be used by researchers and practitioners to relieve the manual burden of reading and identifying the causation of claims as a STF or MSD. Furthermore, the method can be easily generalized to code/classify other unstructured text narratives. |
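The classifier named in the title is a standard multinomial Naive Bayes text model: per-class word frequencies learned from manually coded claims, combined via log prior plus smoothed log likelihoods. A minimal sketch of that general technique; the toy narratives and labels below are invented for illustration and are not the paper's training data or features:

```python
import math
from collections import Counter, defaultdict

def train(examples):
    """examples: iterable of (narrative, label) pairs."""
    label_counts = Counter(label for _, label in examples)
    word_counts = defaultdict(Counter)  # label -> word frequencies
    vocab = set()
    for text, label in examples:
        for word in text.lower().split():
            word_counts[label][word] += 1
            vocab.add(word)
    return label_counts, word_counts, vocab

def classify(model, text):
    """Return the label maximizing log prior + summed log likelihoods,
    with add-one (Laplace) smoothing for unseen words."""
    label_counts, word_counts, vocab = model
    total = sum(label_counts.values())
    scores = {}
    for label in label_counts:
        score = math.log(label_counts[label] / total)
        n_label_words = sum(word_counts[label].values())
        for word in text.lower().split():
            score += math.log((word_counts[label][word] + 1)
                              / (n_label_words + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)

# Invented toy claim narratives, for illustration only
claims = [
    ("slipped on wet floor and fell", "STF"),
    ("tripped over cord fell on knee", "STF"),
    ("lifting heavy box strained lower back", "MSD"),
    ("repetitive motion pain in wrist", "MSD"),
    ("cut finger on box cutter", "OTHER"),
]
model = train(claims)
print(classify(model, "fell on wet floor"))  # STF on this toy data
```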
Self-reported seatbelt use, United States, 2002-2010: does prevalence vary by state and type of seatbelt law?
Shults RA , Beck LF . J Safety Res 2012 43 417-20 PROBLEM: Motor-vehicle crashes are a leading cause of death in the United States. Seatbelts are highly effective in preventing serious injury and death in the event of a crash. Not all states have primary enforcement of seatbelt laws. METHODS: Data from the 2002, 2006, 2008, and 2010 Behavioral Risk Factor Surveillance System were used to calculate prevalence of seatbelt use by state and type of state seatbelt law (primary vs. secondary enforcement). RESULTS AND DISCUSSION: Self-reported seatbelt use among adults in the United States increased steadily between 2002 and 2010, with the national prevalence reaching 87% in 2010. Overall, seatbelt use in 2010 was 9 percentage points higher in the states with primary enforcement laws than in the states with secondary enforcement laws (89% vs. 80%). IMPACT ON INDUSTRY: Primary enforcement seatbelt laws and enhanced enforcement of seatbelt laws are proven strategies for increasing seatbelt use and reducing traffic fatalities. |
Patients with severe traumatic brain injury transferred to a Level I or II trauma center: United States, 2007 to 2009
Sugerman DE , Xu L , Pearson WS , Faul M . J Trauma Acute Care Surg 2012 73 (6) 1489-97 BACKGROUND: Patients with severe traumatic brain injury (TBI), head Abbreviated Injury Scale (AIS) score of 3 or greater, who are indirectly transported from the scene of injury to a nontrauma center can experience delays to definitive neurosurgical management. Transport to a hospital with appropriate initial emergency department treatment and rapid admission has been shown to reduce mortality in a state's trauma system. This study was conducted to see if the same finding holds with a nationally representative sample of patients with severe TBI seen at Level I and II trauma centers. METHODS: This study is based on adult (≥18 years), severe TBI patients treated in a nationally representative sample of Level I and II trauma centers, submitting data to the National Trauma Databank National Sample Program from 2007 to 2009. We analyzed independent variables including age, sex, primary payer, race, ethnicity, mode of transport, injury type (blunt vs. penetrating), mechanism of injury, trauma center level, head AIS, initial Glasgow Coma Scale (GCS), Injury Severity Score (ISS), and systolic blood pressure by transfer status. The primary outcome variable was inpatient death, with discharge disposition, neurosurgical procedures, and mean hospital, intensive care unit, and ventilator days serving as secondary outcomes. RESULTS: After exclusion criteria were applied (ISS < 16; age < 18 years; GCS motor score = 6; non-head AIS score ≥ 3; head AIS < 3; patients with missing transfer status, and death on arrival), a weighted sample of 51,300 (16%) patients was eligible for analysis. In bivariate analyses, transferred patients were older (≥60 years), white, insured, less severely injured (head AIS score ≤ 4, ISS ≤ 25), and less likely to have sustained penetrating trauma (p < 0.001). 
After controlling for all variables, direct transport, 1 or more comorbidities, advanced age, head AIS score, intracranial hemorrhage, and firearm injury remained significant predictors of death. Being transferred (adjusted odds ratio, 0.79; 95% confidence interval, 0.64-0.96) lowered the risk of death. CONCLUSION: Patients with severe TBI who were transferred to a Level I or II trauma center had lower injury severity, including less penetrating trauma, and, as a result, were less likely to die compared with patients who were directly admitted to a Level I or II trauma center. The results may demonstrate adherence with the current Guidelines for Prehospital Management of Traumatic Brain Injury and Guidelines for Field Triage of Injured Patients, which recommend the direct transport of patients with severe TBI to the highest level trauma center. Patients with severe TBI who cannot be taken to a trauma center should be stabilized at a nontrauma center and then transferred to a Level I or II trauma center. Regional and national trauma databases should consider collecting information on patient outcomes at referral facilities and total transport time after injury, to better address the outcomes of patient triage decisions. LEVEL OF EVIDENCE: Prognostic study, level III; therapeutic study, level IV. |
Injuries from ingesting wire bristles dislodged from grill-cleaning brushes - Providence, Rhode Island, 2009-2012
Grand DJ , Egglin TK , Mayo-Smith WW , Cronan JJ , Gilchrist J . J Safety Res 2012 43 413-5 Foreign object ingestion is a common reason for visiting an emergency department; however, wire grill-cleaning brush bristles are an uncommon foreign object. This report describes a series of twelve cases identified in a single hospital system from July 2009 through June 2012. Patients included six males and six females; ages ranged from 11 to 75 (mean: 47 years). The patients all reported recent outdoor residential food grilling and use of commercially available wire grill-cleaning brushes. The severity of injury ranged from puncture of the soft tissues of the neck, causing severe pain on swallowing, to perforation of the gastrointestinal tract requiring emergent surgery. Before cooking, persons should examine the grill surface carefully for the presence of wire bristles that might have dislodged from the grill brush and could embed in cooked food. Alternative residential grill-cleaning methods or products might be considered. |
A multidomain magnetic passive aerosol sampler: development and experimental evaluation
Hsiao TC , Jaques PA , Gao PF . Aerosol Sci Technol 2013 47 (1) 37-45 An innovative "quarter-sized" multidomain magnetic passive aerosol sampler (MPAS) has been developed, mainly for determining particle penetration through personal protective ensembles. The MPAS is a 28 mm disc with a height of 8.6 mm. It consists of 186 small magnets with about 140 on the collection area, arranged in an alternating N and S pole pattern. In contrast to conventional passive samplers, it uses magnetic force to collect a quantifiable amount of surrogate Fe3O4 particles within a substantially shortened sampling time. This article presents detailed design, principles of operation, performance evaluation, and the development of a deposition velocity model for the MPAS. Performance of the MPAS was evaluated under various test conditions, including different particle sizes ranging from 95 to 350 nm and wind speeds ranging from 0.48 to 1.17 m/s. A previously developed recirculation aerosol wind tunnel was employed to evaluate its performance. Experimental results show that the dimensionless deposition velocity increased with increasing particle size (e.g., about 5-fold greater for 300 nm particles than 100 nm particles at 0.48 m/s) and slightly decreased with increasing wind speeds. Our results show that this sampler is promising for the measurement of particle penetration through protective ensembles for which high collection efficiency is needed. |
Sculpting humoral immunity through dengue vaccination to enhance protective immunity
Crill WD , Hughes HR , Trainor NB , Davis BS , Whitney MT , Chang GJ . Front Immunol 2012 3 334 Dengue viruses (DENV) are the most important mosquito transmitted viral pathogens infecting humans. DENV infection produces a spectrum of disease, most commonly causing a self-limiting flu-like illness known as dengue fever; yet with increased frequency, manifesting as life-threatening dengue hemorrhagic fever (DHF). Waning cross-protective immunity from any of the four dengue serotypes may enhance subsequent infection with another heterologous serotype to increase the probability of DHF. Decades of effort to develop dengue vaccines are reaching the finishing line with multiple candidates in clinical trials. Nevertheless, concerns remain that imbalanced immunity, due to the prolonged prime-boost schedules currently used in clinical trials, could leave some vaccinees temporarily unprotected or with increased susceptibility to enhanced disease. Here we develop a DENV serotype 1 (DENV-1) DNA vaccine with the immunodominant cross-reactive B cell epitopes associated with immune enhancement removed. We compare wild-type (WT) with this cross-reactivity reduced (CRR) vaccine and demonstrate that both vaccines are equally protective against lethal homologous DENV-1 challenge. Under conditions mimicking natural exposure prior to acquiring protective immunity, WT vaccinated mice enhanced a normally sub-lethal heterologous DENV-2 infection resulting in DHF-like disease and 95% mortality in AG129 mice. However, CRR vaccinated mice exhibited redirected serotype-specific and protective immunity, and significantly reduced morbidity and mortality not differing from naïve mice. Thus, we demonstrate in an in vivo DENV disease model, that non-protective vaccine-induced immunity can prime vaccinees for enhanced DHF-like disease and that CRR DNA immunization significantly reduces this potential vaccine safety concern.
The sculpting of immune memory by the modified vaccine and resulting redirection of humoral immunity provide insight into DENV vaccine-induced immune responses. |
New perspectives for in vitro risk assessment of multiwalled carbon nanotubes: application of coculture and bioinformatics
Snyder-Talkington BN , Qian Y , Castranova V , Guo NL . J Toxicol Environ Health B Crit Rev 2012 15 (7) 468-92 Nanotechnology is a rapidly expanding field with wide application for industrial and medical use; therefore, understanding the toxicity of engineered nanomaterials is critical for their commercialization. While short-term in vivo studies have been performed to understand the toxicity profile of various nanomaterials, there is a current effort to shift toxicological testing from in vivo observational models to predictive and high-throughput in vitro models. However, conventional monoculture results of nanoparticle exposure are often disparate and not predictive of in vivo toxic effects. A coculture system of multiple cell types allows for cross-talk between cells and better mimics the in vivo environment. This review proposes that advanced coculture models, combined with integrated analysis of genome-wide in vivo and in vitro toxicogenomic data, may lead to development of predictive multigene expression-based models to better determine toxicity profiles of nanomaterials and consequent potential human health risk due to exposure to these compounds. |
Exposure to triclosan augments the allergic response to ovalbumin in a mouse model of asthma
Anderson SE , Franko J , Kashon ML , Anderson KL , Hubbs AF , Lukomska E , Meade BJ . Toxicol Sci 2012 132 (1) 96-106 During the last decade there has been a remarkable and unexplained increase in the prevalence of asthma. These studies were conducted to investigate the role of dermal exposure to triclosan, an endocrine-disrupting compound, on the hypersensitivity response to ovalbumin (OVA) in a murine model of asthma. Triclosan has had widespread use in the general population as an antibacterial and antifungal agent and is commonly found in consumer products such as soaps, deodorants, toothpastes, shaving creams, mouth washes, and cleaning supplies. For these studies, BALB/c mice were exposed dermally to concentrations of triclosan ranging from 0.75-3% (0.375-1.5 mg/mouse/day) for 28 consecutive days. Concordantly, mice were intraperitoneally injected with OVA (0.9 mcg) and aluminum hydroxide (0.5 mg) on days 1 and 10 and challenged with OVA (125 mcg) by pharyngeal aspiration on days 19 and 27. Compared to the animals exposed to OVA alone, increased spleen weights, OVA-specific IgE, Interleukin (IL)-13 cytokine levels, and lung eosinophils were demonstrated when mice were co-exposed to OVA and triclosan. Statistically significant increases in OVA-specific and non-specific airway hyperreactivity (AHR) were observed for all triclosan co-exposed groups when compared to the vehicle and OVA controls. In these studies exposure to triclosan alone was not demonstrated to be allergenic; however, co-exposure with a known allergen resulted in enhancement of the hypersensitivity response to that allergen, suggesting that triclosan exposure may augment the allergic responses to other environmental allergens. |
2-Butoxyethanol and benzyl alcohol reactions with the nitrate radical: rate coefficients and gas-phase products
Harrison JC , Wells JR . Int J Chem Kinet 2012 44 (12) 778-788 The bimolecular rate coefficients k(NO3• + 2-butoxyethanol) and k(NO3• + benzyl alcohol) were measured using the relative rate technique at (297 ± 3) K and 1 atmosphere total pressure. Values of (2.7 ± 0.7) × 10−15 and (4.0 ± 1.0) × 10−15 cm3 molecule−1 s−1 were observed for k(NO3• + 2-butoxyethanol) and k(NO3• + benzyl alcohol), respectively. In addition, the products of the 2-butoxyethanol + NO3• and benzyl alcohol + NO3• gas-phase reactions were investigated. The derivatizing agents O-(2,3,4,5,6-pentafluorobenzyl)hydroxylamine and N,O-bis(trimethylsilyl)trifluoroacetamide, together with gas chromatography/mass spectrometry (GC/MS), were used to identify the reaction products. For the 2-butoxyethanol + NO3• reaction, hydroxyacetaldehyde, 3-hydroxypropanal, 4-hydroxybutanal, butoxyacetaldehyde, and 4-(2-oxoethoxy)butan-2-yl nitrate were the derivatized products observed. For the benzyl alcohol + NO3• reaction, benzaldehyde ((C6H5)C(=O)H) was the only derivatized product observed. Negative chemical ionization was used to identify the following nitrate products: [(2-butoxyethoxy)(oxido)amino]oxidanide and benzyl nitrate for 2-butoxyethanol + NO3• and benzyl alcohol + NO3•, respectively. The elucidation of these products was facilitated by mass spectrometry of the derivatized reaction products coupled with plausible 2-butoxyethanol or benzyl alcohol + NO3• reaction mechanisms based on previously published volatile organic compound + NO3• gas-phase mechanisms. |
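In the relative rate technique named above, the unknown rate coefficient is obtained from the slope of ln([A]0/[A]t) versus ln([R]0/[R]t), where A is the target compound and R a reference compound with known rate coefficient, since ln([A]0/[A]t) = (kA/kR) ln([R]0/[R]t). A minimal sketch of that analysis; the decay data and reference value below are synthetic, not the paper's measurements:

```python
def relative_rate(pairs, k_ref):
    """pairs: list of (ln([A]0/[A]t), ln([R]0/[R]t)) measurements.
    Least-squares slope through the origin, times k_ref, gives k_A."""
    num = sum(x * y for y, x in pairs)
    den = sum(x * x for _, x in pairs)
    return (num / den) * k_ref

# Synthetic noiseless data with a true ratio k_A/k_ref of 1.35
k_ref = 2.0e-15  # cm3 molecule-1 s-1 (hypothetical reference value)
data = [(1.35 * x, x) for x in (0.1, 0.2, 0.4, 0.8)]
print(f"{relative_rate(data, k_ref):.2e}")  # 2.70e-15
```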
Accounting for an isobaric interference allows correct determination of folate vitamers in serum by isotope dilution-liquid chromatography-tandem MS
Fazili Z , Pfeiffer CM . J Nutr 2012 143 (1) 108-13 Mild and prolonged oxidative degradation of 5-methyltetrahydrofolate (5-methylTHF) leads to the biologically inactive pyrazino-s-triazine derivative of 4α-hydroxy-5-methylTHF (MeFox). MeFox and the biologically active 5-formyltetrahydrofolate (5-formylTHF) are isobaric compounds that behave similarly during chromatographic and mass separation, making coelution and misidentification likely. Our published routine liquid chromatography-tandem MS (LC-MS/MS) method did not discern between 5-formylTHF and MeFox, measuring the sum of these compounds at a mass to charge ratio (m/z) of 474→327 as 5-formylTHF. We modified this method to separate MeFox and 5-formylTHF by either chromatography or unique mass transitions and then applied the 2 methods to serum specimens to determine typical concentrations of these compounds. The 2 unique transitions (m/z: 5-formylTHF, 474→299; MeFox, 474→284) showed good sensitivity [limit of detection (nmol/L): 5-formylTHF, 0.21; MeFox, 0.34], selectivity (no interfering peaks), spiking recovery (mean ± SD: 5-formylTHF, 103 ± 3.4%; MeFox, 94 ± 10%), and low imprecision (CV: 5-formylTHF, 3.9% at 2.4 nmol/L; MeFox, 5.1% at 2.9 nmol/L). The mass separation method detected 5-formylTHF in the same specimens as the chromatographic separation method. Analysis of several thousand serum specimens showed that the majority (approximately 85%) contained MeFox at <3 nmol/L but no detectable 5-formylTHF concentrations, some (approximately 14%) contained 5-formylTHF at <0.5 nmol/L, and a few specimens contained 5-formylTHF at >1 nmol/L and MeFox at >10 nmol/L. In summary, serum can contain 5-formylTHF concentrations high enough to contribute to total folate and contains MeFox that will bias total folate if not appropriately separated.
Including measurements of MeFox and 5-formylTHF along with the other folate vitamers will enhance assessments of the association between biologically active folate and health effects. |
Carrageenan-based gel retains limited anti-HIV-1 activity 8-24 hours after vaginal application by HIV-infected Thai women enrolled in a phase I safety trial
Haaland RE , Chaowanachan T , Evans-Strickfaden T , van de Wijgert JH , Kilmarx PH , McLean CA , Hart CE . J Acquir Immune Defic Syndr 2012 61 (5) e71-3 There is a need to improve in vitro testing to evaluate topical microbicide candidates that prevent acquisition of HIV. Many vaginal microbicides with different anti-HIV activities have undergone preclinical testing and a few of those have been selected for clinical safety and efficacy testing. Clinical efficacy trials of vaginal microbicide gels have yielded mixed results. Initial nonspecific entry inhibitors were shown to be ineffective in clinical efficacy trials,1–4 whereas more recent testing of microbicide gels containing the nucleoside reverse transcriptase inhibitor, tenofovir, has shown some level of protection in 1 of 2 clinical trials.5,6 Yet, nearly all preclinical testing outcomes predicted that products should significantly reduce sexual acquisition of HIV when used appropriately in a clinical trial. However, a recent report using ex vivo testing of the microbicide Pro2000 demonstrated that this product loses anti-HIV activity after vaginal application and sexual intercourse.7 Changes in anti-HIV activity over time after vaginal application have not been studied for most candidate vaginal microbicides. | We investigated the carrageenan-based vaginal gel Carraguard, an HIV entry inhibitor that failed in a clinical efficacy trial,1 for degradation and loss of anti-infective activity after vaginal application. Cervicovaginal lavages (CVLs) were collected from 16 HIV-infected Thai women participating in a randomized, controlled, 3-treatment (Carraguard, methylcellulose placebo, no product) crossover safety trial.8 Women applied each treatment daily for 7 days after menses. The order of the 3 treatments was randomized. 
CVLs (5 mL) were collected on the first clinic visit in each treatment cycle before gel application (T0), 15 minutes after gel application (T15min), and on day 7 clinic visit which was 8–24 hours after the final gel application (T8–24hr). Self-reported adherence was 98% overall, and participants reported that vaginal application of the gel was highly acceptable.9 |
Dengue virus: isolation, propagation, quantification, and storage
Medina F , Medina JF , Colon C , Vergne E , Santiago GA , Munoz-Jordan JL . Curr Protoc Microbiol 2012 Chapter 15, Unit 15D.2 Dengue is a disease caused by infection with one of the four dengue virus serotypes (DENV-1, -2, -3, and -4). The virus is transmitted to humans by Aedes sp. mosquitoes. This enveloped virus contains a positive single-stranded RNA genome. Clinical manifestations of dengue can have a wide range of outcomes varying from a mild febrile illness to a life-threatening condition. New techniques have largely replaced the use of DENV isolation in disease diagnosis. However, virus isolation still serves as the gold standard for detection and serotyping of DENV and is common practice in research and reference laboratories where clinical isolates of the virus are characterized and sequenced, or used for a variety of research experiments. Isolation of DENV from clinical samples can be achieved in mammalian and mosquito cells or by inoculation of mosquitoes. The experimental methods presented here describe the most common procedures used for the isolation, serotyping, propagation, and quantification of DENV. (Curr. Protoc. Microbiol. 27:15D.2.1-15D.2.24. (c) 2012 by John Wiley & Sons, Inc.) |
Development of human posture simulation method for assessing posture angles and spinal loads
Lu ML , Waters T , Werren D . Hum Factors Ergon Manuf 2012 25 (1) 123-136 Video-based posture analysis employing a biomechanical model is gaining popularity for ergonomic assessments. A human posture simulation method for estimating multiple body postural angles and spinal loads from a video record was developed to expedite ergonomic assessments. The method was evaluated by a repeated measures study design with three trunk flexion levels, two lift asymmetry levels, three viewing angles, and three trial repetitions as experimental factors. The study comprised two phases evaluating the accuracy of simulating self- and other people's lifting postures via a proxy of a computer-generated humanoid. The mean accuracy of simulating self- and humanoid postures was 12° and 15°, respectively. The repeatability of the method for the same lifting condition was excellent (~2°). The least simulation error was associated with the side viewing angle. The estimated back compressive force and moment, calculated by a three-dimensional biomechanical model, showed approximately 5% underestimation. The posture simulation method enables researchers to quantify body posture angles and spinal loading variables simultaneously, with accuracy and precision comparable to on-screen posture-matching methods. |
The National Center on Birth Defects and Developmental Disabilities: past, present, and future
Boyle CA , Cordero JF , Trevathan E . Am J Prev Med 2012 43 (6) 655-8 The National Center on Birth Defects and Developmental Disabilities (NCBDDD) was established in April 2001 as a result of the Children’s Health Act of 2000.1 The supporters of the center’s creation wanted to raise the visibility of child health and disability at the CDC. From its inception, the NCBDDD included the established programs in birth defects, genetic disorders, developmental disabilities, and disabilities and health, with a more than 30-year history in surveillance, research, and public health practice, including the training of public health professionals in these respective fields. | In 2003, the NCBDDD expanded with the addition of the blood disorders program, which initially had been established in response to the discovery that HIV, a bloodborne infection, highly affected people with hemophilia who required treatments using blood products. This led to the development of a comprehensive program to ensure optimal care for this and other populations affected by nonmalignant blood disorders.2 The current paper examines the health impact of the center’s work that was highlighted in a year-long reflection, “10 Years of Service,”3 and provides a framework for how the NCBDDD will move forward with a renewed emphasis on enhancing public health and healthcare-system capacity to have greater health impact for the populations served. |
Global, regional, and national causes of child mortality
Modell B , Berry RJ , Boyle CA , Christianson A , Darlison M , Dolk H , Howson CP , Mastroiacovo P , Mossey P , Rankin J . Lancet 2012 380 (9853) 1556; author reply 1556-7 Using vital registration and verbal autopsy, Li Liu and colleagues (June 9, p 2151)1 attribute 270 000 deaths worldwide in children younger than 5 years to congenital anomalies (chromosomal disorders and congenital malformations)—3·5% of the total, or 2·0 per 1000 births. Liu and colleagues' comments that medically certified vital registration data are available for only 2·7% of under-5 deaths, and that verbal autopsy methods “are subject to inherent misclassification errors”, are especially applicable to congenital anomalies, since correct diagnosis often requires advanced diagnostic facilities. Consequently, valid mortality rates are obtainable only for high-income settings, where mortality has been greatly reduced by multiple interventions. In lower-income settings, most congenital anomalies remain undiagnosed, and the associated mortality is inevitably mis-attributed. | The Global Burden of Disease Congenital Expert Group (of which we were members), starting from the known birth prevalence of major congenital anomalies (in the absence of intervention) of 22·5 per 1000 births worldwide2, 3, 4 and known mortality in the absence of diagnosis and care, estimated a minimum of 10 under-5 deaths per 1000 births worldwide—ie, more than four times higher than Liu and colleagues' estimate. In Hungary in 1970–80, congenital anomalies caused around 6·5 under-5 deaths per 1000 births even though diagnosis and care were generally available.3 Currently in the UK they cause around 2·5 under-5 deaths per 1000 births.5 Liu and colleagues' estimates are lower than this for almost all WHO regions. (Although their report in fact only covers neonatal deaths, this omission hardly affects the difference between our estimates.) |
Attenuation and duration of seismic signals generated from controlled methane and coal dust explosions in an underground mine
Murphy MM , Westman EC , Barczak TM . Int J Rock Mech Min Sci 2012 56 112-120 Seismic monitoring provides a useful means for detection and evaluation of events resulting from mining activity. Seismic signature characteristics such as arrival times, amplitudes, duration and frequency content can indicate the nature and location of the source. In the past, most mining-related seismic measurements have focused on events such as rockbursts, production blasts from quarries, roof falls and rock fractures [1,2,3,4]. However, little or no effort has been expended towards examining the characteristics of a signature emanating from a methane and coal dust explosion in an underground mine. The Sago Mine disaster in 2006 provides an example of why these particular signatures should be researched. A small amplitude signal was identified on records of the regional seismic network stations that were closest to the mine [5]. The epicentral location of the small amplitude signal was at the Sago Mine. However, it was unclear whether the signature represented the explosion itself or another type of mining-related seismicity such as a large roof fall. This paper presents findings from a study aimed at examining seismicity from methane and coal dust explosions by analyzing the attenuation and duration of seismic signatures collected from controlled methane and coal dust explosions, with potential applications to forensic studies of mine explosions such as the Sago Mine disaster. |
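Exponential (anelastic) attenuation is one common way to summarize how such signal amplitudes decay with distance. A sketch of estimating an attenuation coefficient from two hypothetical amplitude readings, assuming a simple A(r) = A0·exp(-αr) model rather than the paper's actual fitting procedure:

```python
import math

def attenuation_coefficient(a1, r1, a2, r2):
    """Anelastic attenuation coefficient alpha (per km), assuming
    A(r) = A0 * exp(-alpha * r), so alpha = ln(a1/a2) / (r2 - r1)."""
    return math.log(a1 / a2) / (r2 - r1)

# Hypothetical velocity amplitudes: 1.0e-6 m/s at 5 km and 2.5e-7 m/s at 20 km.
alpha = attenuation_coefficient(1.0e-6, 5.0, 2.5e-7, 20.0)
# alpha = ln(4) / 15 ~ 0.092 per km
```

With more than two stations, one would instead fit ln A against r by least squares; geometric spreading terms (e.g. a 1/r factor) are also usually included in real attenuation models.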
Sustainability of market-based community distribution of Sprinkles in western Kenya
Suchdev PS , Shah A , Jefferds ME , Eleveld A , Patel M , Stein AD , Macdonald B , Ruth L . Matern Child Nutr 2013 9 Suppl 1 78-88 To evaluate the sustainability of market-based community distribution of micronutrient powders (Sprinkles®, Hexagon Nutrition, Mumbai, India) among pre-school children in Kenya, we conducted a follow-up survey in August 2010, 18 months after study-related marketing and household monitoring ended. We surveyed 849 children aged 6-35 months randomly selected from 60 study villages. Nutritional biomarkers were measured by fingerstick; demographic characteristics and Sprinkles purchases and use were assessed through household questionnaires. We compared Sprinkles use, marketing efforts, and biomarker levels with data from surveys conducted in March 2007, March 2008 and March 2009. We used logistic regression to evaluate associations between marketing activities and Sprinkles use in the 2010 survey. At the 2010 follow-up, 21.9% of children had used Sprinkles in the previous 7 days, compared with 64.9% in 2008 (P < 0.001). Average intake was 3.2 sachets per week in 2008, 1.6 in 2009 and 1.1 in 2010 (P < 0.001). Factors associated with recent Sprinkles use in 2010 included young age [6-23 months vs. 24-35 months, adjusted odds ratio (aOR) = 1.5, P = 0.02], lowest 2 quintiles of socio-economic status (aOR = 1.7, P = 0.004), household attendance at trainings or launches (aOR = 2.8, P < 0.001) and ever receiving promotional items including free Sprinkles, calendars, cups and t-shirts (aOR = 1.7, P = 0.04). In 2010, there was increased prevalence of anaemia and malaria (P < 0.001), but not iron deficiency (P = 0.44), compared with 2008. Sprinkles use in 2010 was associated with decreased iron deficiency (P = 0.03). Sprinkles coverage declined after household monitoring stopped and marketing activities were reduced.
Continued promotion and monitoring of Sprinkles usage may be important components to sustain the programme. |
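Adjusted odds ratios like those above come from logistic regression; the crude (unadjusted) version of the same quantity can be computed directly from a 2×2 table. A sketch with hypothetical counts, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Woolf-type 95% CI from a 2x2 table:

                 exposed   unexposed
    outcome+        a          b
    outcome-        c          d
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical: 60 of 200 training attendees used Sprinkles recently,
# vs. 40 of 300 non-attendees.
or_, lo, hi = odds_ratio_ci(60, 40, 140, 260)
```

The study's aORs differ from this crude version because logistic regression simultaneously adjusts for the other covariates listed.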
Reducing sodium intake at the community level: the Sodium Reduction in Communities Program
Mugavero K , Losby JL , Gunn JP , Levings JL , Lane RI . Prev Chronic Dis 2012 9 E168 Approximately 90% of Americans aged 2 years or older consume too much sodium (1). The consumption of too much sodium increases blood pressure, which increases the risk for stroke, coronary heart disease, heart failure, and renal disease (2). Population-based strategies to reduce salt intake are cost-effective, can reduce blood pressure (3), and, according to the Institute of Medicine, are needed at national, state, and community levels (2). To improve food environments and reduce sodium intake at the community level, the Centers for Disease Control and Prevention (CDC) funds the Sodium Reduction in Communities Program (SRCP). This demonstration project supports communities in creating more healthful food environments and aims to expand the evidence base for effective community strategies to address sodium intake at the population level. In this article, we describe the role of communities and environments in influencing health and strategies being implemented and evaluated by SRCP communities. |
Serum 25-hydroxyvitamin D and risk of major osteoporotic fractures in older U.S. adults
Looker AC . J Bone Miner Res 2012 28 (5) 997-1006 Results from previous prospective studies linking serum 25-hydroxyvitamin D (25OHD) with fracture risk have been inconsistent. The present study examined the relationship between serum 25OHD and risk of incident major osteoporotic fracture (hip, spine, radius and humerus) in older US adults. The study used a pooled cohort of 4749 men and women ages 65 years and older from the third National Health and Nutrition Examination Survey (NHANES III, 1988-94) and NHANES 2000-2004. Incident fractures were identified using linked mortality and Medicare records that were obtained for participants from both surveys. Serum 25OHD values were measured by radioimmunoassay in both surveys. Cox proportional hazards models were used to estimate the relative risk (RR) of fracture by serum 25OHD level. There were 525 incident major osteoporotic fractures (287 hip fractures) in the sample. Serum 25OHD was a significant linear predictor of major osteoporotic fracture and a significant quadratic predictor of hip fracture in the total sample and among those with less than 10 years of follow-up, but it was not related to risk of either fracture type among those with more than 10 years of follow-up. Major osteoporotic fracture risk was increased by 26-27% for each SD decrease in serum 25OHD among those with less than 10 years of follow-up. Serum 25OHD was significantly related to risk of major osteoporotic fractures as a group and to hip fracture alone in this cohort of older US adults from NHANES III and NHANES 2000-2004. However, the predictive utility of serum 25OHD diminished after 10 years. In addition, the relationship appeared to be linear when major osteoporotic fracture risk was considered but quadratic when hip fracture risk was assessed. (c) 2012 American Society for Bone and Mineral Research. |
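Under a log-linear Cox model (the "linear predictor" case above), the per-SD relative risk compounds multiplicatively across SD decrements. A worked instance of that arithmetic, using the reported 26% figure:

```python
def risk_ratio_for_k_sd(rr_per_sd, k):
    """Relative risk for a k-SD shift in the predictor, assuming a
    log-linear (Cox) model in which risk compounds multiplicatively
    across standard-deviation decrements."""
    return rr_per_sd ** k

# 26% higher risk per 1-SD decrease in serum 25OHD implies,
# under log-linearity, for a 2-SD decrease:
rr_2sd = risk_ratio_for_k_sd(1.26, 2)  # 1.26**2 ~ 1.59
```

Note this compounding holds only where the linear model applied (major osteoporotic fracture); the quadratic hip-fracture relationship would not scale this way.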
Modeling of the biodynamic responses distributed at the fingers and palm of the hand in three orthogonal directions
Dong RG , Welcome DE , McDowell TW , Wu JZ . J Sound Vib 2013 332 (4) 1125-1140 The objectives of this study were to develop models of the hand–arm system in the three orthogonal directions (xh, yh, and zh) and to enhance the understanding of the hand vibration dynamics. A four-degrees-of-freedom (DOF) model and 5-DOF model were used in the simulation for each direction. The driving-point mechanical impedances distributed at the fingers and palm of the hand reported in a previous study were used to determine the parameters of the models. The 5-DOF models were generally superior to the 4-DOF models for the simulation. Hence, as examples of applications, the 5-DOF models were used to predict the transmissibility of a vibration-reducing glove and the vibration transmissibility on the major substructures of the hand-arm system. The model-predicted results were also compared with the experimental data reported in two other recent studies. Some reasonable agreements were observed in the comparisons, which provided some validation of the developed models. This study concluded that the 5-DOF models are acceptable for helping to design and analyze vibrating tools and anti-vibration devices. This study also confirmed that the 5-DOF model in the zh direction is acceptable for a coarse estimation of the biodynamic responses distributed throughout the major substructures of the hand-arm system. Some interesting phenomena observed in the experimental study of the biodynamic responses in the three directions were also explained in this study. |
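The driving-point mechanical impedance that anchors these model fits is, for a single mass-spring-damper element, Z(ω) = c + j(ωm - k/ω); the 4- and 5-DOF models chain several such elements together. A one-element sketch with assumed parameters (not values from the paper):

```python
import cmath
import math

def driving_point_impedance(m, c, k, freq_hz):
    """Driving-point mechanical impedance Z = F/v (N·s/m) of a single
    mass (m, kg) - damper (c, N·s/m) - spring (k, N/m) element:
    Z(w) = c + j*(w*m - k/w), with w = 2*pi*f."""
    w = 2 * math.pi * freq_hz
    return complex(c, w * m - k / w)

# Illustrative hand-arm-like parameters (assumptions for the sketch):
z = driving_point_impedance(m=0.5, c=50.0, k=2.0e4, freq_hz=40.0)
mag = abs(z)  # impedance magnitude at 40 Hz
```

Model fitting in studies like this one amounts to choosing the m, c, k values of each element so that the summed impedance matches the measured impedance distributed at the fingers and palm across the frequency range.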
Exposure assessment for roofers exposed to silica during installation of roof tiles
Hall RM , Achutan C , Sollberger R , McCleery RE , Rodriguez M . J Occup Environ Hyg 2013 10 (1) D6-D10 Occupational exposure to silica in the construction industry has been well documented,(1–7) and respirable crystalline silica (quartz and cristobalite) has been associated with silicosis,(8,9) lung cancer,(10,11) pulmonary tuberculosis,(12,13) and airway diseases.(14,15) | These concerns prompted a local construction union to request assistance from the National Institute for Occupational Safety and Health (NIOSH) for health hazard evaluations concerning exposures to dust and silica among roofers in Phoenix, Arizona. In response to these requests, NIOSH performed field studies to evaluate roofers’ exposures to silica. |
Research to improve extension ladder angular positioning
Simeonov P , Hsiao H , Powers J , Kim IJ , Kau TY , Weaver D . Appl Ergon 2012 44 (3) 496-502 A leading cause for extension ladder fall incidents is a slide-out event usually related to suboptimal ladder inclination. An improved ladder positioning method or procedure could reduce the risk of ladder stability failure and the related fall injury. The objective of the study was to comparatively evaluate the effectiveness of a multimodal angle indicator with other existing methods for extension ladder angular positioning. Twenty experienced and 20 inexperienced ladder users participated in the study. Four ladder positioning methods were tested in a controlled laboratory environment with 4.88 m (16 ft) and 7.32 m (24 ft) ladders in extended and retracted positions. The positioning methods included a no-instruction method, the current standard anthropometric method, and two instrumental methods - a bubble level indicator, and a multimodal indicator providing direct feedback with visual and sound signals. Performance measures included positioning angle and time. The results indicated that the anthropometric method was effective in improving the extension ladder positioning angle (p < 0.001); however, it was associated with considerable variability and required 50% more time than no-instruction. The bubble level indicator was an accurate positioning method (with very low variability), but required more than double the time of the no-instruction method (p < 0.001). The multimodal indicator improved the ladder angle setting as compared to the no-instruction and anthropometry methods (p < 0.001) and required the least time for ladder positioning among the tested methods (p < 0.001). An indicator with direct multimodal feedback is a viable approach for quick and accurate ladder positioning. The main advantage of the new multimodal method is that it provides continuous feedback on the angle of the device and hence does not require repositioning of the ladder. 
Furthermore, this indicator can be a valuable tool for training ladder users to correctly apply the current ANSI A14 standard anthropometric method in ladder angular positioning. The multimodal indicator concept has been further developed to become a hand-held tool in the form of a smart phone application. |
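The anthropometric method referenced here approximates the ANSI A14 guidance of about a 4:1 working-height-to-setback ratio, which corresponds to an inclination of roughly 75.5-76°. The underlying trigonometry, as a small sketch:

```python
import math

def ladder_angle_deg(working_height, setback):
    """Ladder inclination angle (degrees from horizontal) from the
    vertical working height and the horizontal setback at the base."""
    return math.degrees(math.atan(working_height / setback))

# The ~4:1 ratio behind the ANSI A14 setup guidance:
angle = ladder_angle_deg(4.0, 1.0)  # ~75.96 degrees
```

An angle indicator such as the multimodal device in the study removes the need to measure height and setback at all: it reads the inclination directly and signals when the ladder is within the target range.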
Uncompensated consequences of workplace injuries and illness: long-term disability and early termination
Park RM , Bhattacharya A . J Safety Res 2012 44 119-24 PROBLEM: Costs related to early retirement or termination, and long-term disability resulting from work-related injury or illness, or their residual effects, could fall outside the workers' compensation (WC) envelope. METHOD: Using a benefits database providing utilization information for medical insurance and WC, statistical models were fit to determine whether the rate of early retirement, long-term disability status, or any early termination depended on a prior WC claim. RESULTS: Rates of early retirement or long-term disability varied widely across industrial sectors and by employee classification, likely reflecting variable benefits structures or reporting across employers. For any early termination, the WC-associated rate ratio in hourly nonunion employees was 1.20 (95%CI = 1.14-1.28); for hourly union employees the rate ratio was 1.05 (95%CI = 0.97-1.13); for salaried nonunion employees, the rate ratio was 3.43 (95%CI = 3.11-3.79). In the manufacturing-durable sector the WC-associated rate ratio for hourly nonunion employees was 1.58 (95%CI = 1.42-1.76); for union hourly employees the rate ratio was 1.23 (95%CI = 1.10-1.38). In contrast, in the transportation-utilities-communications sector, for hourly nonunion employees the WC-associated rate ratio was 0.52 (95%CI = 0.46-0.59), whereas for union hourly employees the rate ratio was 1.22 (95%CI = 1.08-1.38). DISCUSSION: Prior WC predicts increased early termination in some workplaces but not others. Substantial uncompensated costs of workplace injuries and illnesses may result either from adverse events previously compensated by WC or from uncompensated events in individuals having other, WC-compensated episodes, i.e., workers in higher risk jobs. In other workplaces, reduced termination rates with prior WC suggest added costs internalized by employers.
SUMMARY: Conditions leading to WC claims appear to have cost implications related to early - or delayed - removal from the workforce. These costs can affect both employees and employers and should be included in estimates of burden of occupational injury and illness. |
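Rate ratios with confidence intervals like those reported above follow standard person-time arithmetic on the log scale. A sketch with hypothetical counts, not the study's data:

```python
import math

def rate_ratio_ci(events1, persontime1, events2, persontime2, z=1.96):
    """Rate ratio of group 1 vs group 2 with a 95% CI based on the
    standard error of the log rate ratio, sqrt(1/e1 + 1/e2)."""
    rr = (events1 / persontime1) / (events2 / persontime2)
    se = math.sqrt(1 / events1 + 1 / events2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical: 120 early terminations over 10,000 person-years among
# workers with a prior WC claim vs. 100 over 10,000 without one.
rr, lo, hi = rate_ratio_ci(120, 10000, 100, 10000)
# rr = 1.2; the CI crossing 1.0 here would indicate a non-significant ratio
```

The study's estimates additionally come from regression models, so they are adjusted rather than crude ratios, but the log-scale CI construction is the same idea.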
Workplace violence among Pennsylvania education workers: differences among occupations
Tiesman H , Konda S , Hendricks S , Mercer D , Amandus H . J Safety Res 2012 44 65-71 PROBLEM: The literature on education employees as victims of workplace violence (WPV) is limited. Moreover, prior studies have focused primarily on teachers. The purpose of this study was to measure the prevalence and characteristics of physical and non-physical WPV in a state-based cohort of education workers. METHOD: A state-wide sample of 6,450 workers was drawn using de-identified union membership lists provided by Pennsylvania's education unions. The sample was stratified on gender, occupation, and school location. Occupational groups included special education teachers, general education teachers, pupil service professionals, education support personnel, and teaching aides. A cross-sectional survey was mailed to participants. Analyses were performed using the SURVEY procedures in SAS. RESULTS: An estimated 7.8% (95%CI = 6.6-9.1) of education workers were physically assaulted and 28.9% (95%CI = 26.4-31.5) experienced a non-physical WPV event during the 2009-2010 school year. Special education teachers were significantly more likely than general education teachers to be physically assaulted and to experience a non-physical WPV event (prevalence rate ratio [PRR] = 3.6, 95%CI = 2.4-5.5; PRR = 1.4, 95%CI = 1.1-1.8). The majority of education workers were physically assaulted during regular school hours (97%) by a student (95%). Education support personnel experienced a large percentage of physical assaults perpetrated by co-workers (36%). The most common perpetrator of non-physical WPV was a student (73%); however, 15% of non-physical events were perpetrated by co-workers. DISCUSSION: Special education teachers were at the highest risk for both physical and non-physical WPV. Education support personnel experienced a high percentage of WPV perpetrated by co-workers.
If not already present, schools should consider implementing comprehensive WPV prevention programs for their employees. IMPACT ON INDUSTRY: Those employed in a school setting are at risk for physical and non-physical WPV. Special education teachers have unique workplace hazards. Strategies that protect the special education teacher, while still protecting the special education student should be considered. |
Lifetime organophosphorous insecticide use among private pesticide applicators in the Agricultural Health Study
Hoppin JA , Long S , Umbach DM , Lubin JH , Starks SE , Gerr F , Thomas K , Hines CJ , Weichenthal S , Kamel F , Koutros S , Alavanja M , Beane Freeman LE , Sandler DP . J Expo Sci Environ Epidemiol 2012 22 (6) 584-92 Organophosphorous insecticides (OPs) are the most commonly used insecticides in US agriculture, but little information is available regarding specific OP use by individual farmers. We describe OP use for licensed private pesticide applicators from Iowa and North Carolina in the Agricultural Health Study (AHS) using lifetime pesticide use data from 701 randomly selected male participants collected at three time periods. Of 27 OPs studied, 20 were used by >1% of applicators. Overall, 95% had ever applied at least one OP. The median number of different OPs used was 4 (maximum=13). Malathion was the most commonly used OP (74%) followed by chlorpyrifos (54%). OP use declined over time. At the first interview (1993-1997), 68% of participants had applied OPs in the past year; by the last interview (2005-2007), only 42% had. Similarly, median annual application days of OPs declined from 13.5 to 6 days. Although OP use was common, the specific OPs used varied by state, time period, and individual. Much of the variability in OP use was associated with the choice of OP, rather than the frequency or duration of application. Information on farmers' OP use enhances our ability to characterize and understand the potential health effects of multiple OP exposures. |
Predicting temporal trends in sickness absence rates for civil service employees of a federal public health agency
Spears DR , McNeil C , Warnock E , Trapp J , Oyinloye O , Whitehurst V , Decker KC , Chapman S , Campbell M , Meechan P . J Occup Environ Med 2012 55 (2) 179-90 OBJECTIVE: To determine whether trends of sickness absence in employees at a federal agency are predictable, and whether the variance was minimal enough to detect unusual levels of employee illness for further investigation. METHODS: Ten years of absenteeism data from an attendance system were analyzed for rates of sickness absence. Specifically, week of year and day of week were used to describe temporal trends. RESULTS: Trends follow regular patterns during a given year that correspond to seasonal illnesses. Temporal trends in sick leave proved highly predictable. CONCLUSION: The minimal variance allows the detection of sick-leave anomalies that may be ascribable to specific causes, allowing the business or agency to follow up and develop interventions. |
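Anomaly detection of the kind described, flagging sick-leave levels that exceed expected variance, can be sketched as a simple z-score rule. This toy version compares each week to a global mean; the study itself models week-of-year and day-of-week trends, which a real baseline would need to account for:

```python
from statistics import mean, stdev

def flag_anomalies(weekly_rates, z_threshold=2.0):
    """Return indices of weeks whose sickness-absence rate exceeds
    the overall mean by more than z_threshold standard deviations."""
    m, s = mean(weekly_rates), stdev(weekly_rates)
    return [i for i, r in enumerate(weekly_rates) if (r - m) / s > z_threshold]

# A flat hypothetical series (% of workforce absent) with a spike in week 5:
rates = [2.0, 2.1, 1.9, 2.0, 2.2, 5.0, 2.1, 2.0]
spikes = flag_anomalies(rates)  # [5]
```

With a seasonal baseline, the comparison would instead be against the expected rate for that calendar week, so that an ordinary flu-season peak is not flagged as an anomaly.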
Incidence and costs of family member hospitalization following injuries of workers' compensation claimants
Asfaw A , Pana-Cryan R , Bushnell PT . Am J Ind Med 2012 55 (11) 1028-36 BACKGROUND: The consequences of occupational injuries for the health of family members have rarely been studied. We hypothesized that non-fatal occupational injury would increase the incidence and costs of hospitalization among workers' families, and that family members of severely injured workers would be likely to experience greater increases in hospitalizations than family members of non-severely injured workers. DATA AND METHODS: We used the MarketScan databases from Thomson Reuters for 2002-2005, which include workers' compensation and inpatient medical care claims data for injured workers' families. We used a before-after analysis to compare the odds and costs of family hospitalization 3 months before and after the index occupational injury among 18,411 families. Severe injuries were defined by receipt of indemnity payments and at least 7 days of lost work. Family hospitalizations were measured by the incidence of hospitalization of at least one family member. RESULTS: Among families of all injured workers, the odds of at least one family member being hospitalized were 31% higher [95% confidence intervals (CI) = 1.11-1.55] in the 3 months following occupational injury than in the 3 months preceding injury. Among the families of severely injured workers, the odds of hospitalization were 56% higher [95% CI = 1.05-2.34] in the 3 months following injury. Hospitalization costs were found to rise by approximately the same percentage as hospitalization incidence. CONCLUSION: The impact of occupational injury may extend beyond the workplace and adversely affect the health and inpatient medical care use of family members. (Am. J. Ind. Med. 55:1028-1036, 2012. (c) 2012 Wiley Periodicals, Inc.) |
The classic pneumoconioses: new epidemiological and laboratory observations
Laney AS , Weissman DN . Clin Chest Med 2012 33 (4) 745-58 The purpose of this article is to provide an update on selected issues of current interest and recent developments related to 3 types of inorganic mineral dust exposures causing classic forms of pneumoconiosis: coal mine dust, crystalline silica, and asbestos. Common themes include new imaging modalities, emerging exposures, and evolving appreciation of additional adverse health effects associated with exposure to these inorganic mineral dusts. |
Malaria prevalence among pregnant women in two districts with differing endemicity in Chhattisgarh, India
Singh N , Singh MP , Wylie BJ , Hussain M , Kojo YA , Shekhar C , Sabin L , Desai M , Udhayakumar V , Hamer DH . Malar J 2012 11 274 BACKGROUND: In India, malaria is not uniformly distributed. Chhattisgarh is a highly malarious state where both Plasmodium falciparum and Plasmodium vivax are prevalent, with a preponderance of P. falciparum. Malaria in pregnancy (MIP), especially when caused by P. falciparum, poses substantial risk to the mother and foetus by increasing the risk of foetal death, prematurity, low birth weight (LBW), and maternal anaemia. These risks vary between areas with stable and unstable transmission. The specific objectives of this study were to determine the prevalence of malaria, its association with maternal and birth outcomes, and use of anti-malarial preventive measures for development of evidence-based interventions to reduce the burden of MIP. METHODS: A cross-sectional study of pregnant women presenting to antenatal clinics (ANC) or delivery units (DU), or hospitalized for non-obstetric illness, was conducted over 12 months in high (Bastar) and low (Rajnandgaon) transmission districts in Chhattisgarh state. Intensity of transmission was defined on the basis of slide positivity rates with a high proportion due to P. falciparum. In each district, a rural and an urban health facility were selected. RESULTS: Prevalence of peripheral parasitaemia was low: 1.3% (35/2696) among women at ANCs and 1.9% at DUs (19/1025). Peripheral parasitaemia was significantly more common in Bastar (2.8%) than in Rajnandgaon (0.1%) (p < 0.0001). On multivariate analysis of ANC participants, residence in Bastar district (stable malaria transmission) was strongly associated with peripheral parasitaemia (adjusted OR [aOR] 43.4; 95% CI, 5.6-335.2). Additional covariates associated with parasitaemia were moderate anaemia (aOR 3.7; 95% CI 1.8-7.7), fever within the past week (aOR 3.2; 95% CI 1.2-8.6), and lack of formal education (aOR 4.6; 95% CI 2.0-10.7). 
Similarly, analysis of DU participants revealed that moderate anaemia (aOR 2.5; 95% CI 1.1-5.4) and fever within the past week (aOR 5.8; 95% CI 2.4-13.9) were strongly associated with peripheral and/or placental parasitaemia. Malaria-related admissions were more frequent among pregnant women in Bastar, the district with greater malaria prevalence (51% vs. 11%, p < 0.0001). CONCLUSIONS: Given the overall low prevalence of malaria, a strategy of enhanced anti-vector measures coupled with intermittent screening and targeted treatment during pregnancy should be considered for preventing malaria-associated morbidity in central India. |
Ongoing outbreak of an acute muscular Sarcocystis-like illness among travellers returning from Tioman Island, Malaysia, 2011-2012
Esposito D , Freedman D , Neumayr A , Parola P . Euro Surveill 2012 17 (45) As of 4 November 2012, 100 patients with an acute muscular Sarcocystis-like illness associated with travel to Tioman Island, Malaysia, have been identified. Thirty-five travelled there mostly during July and August 2011 and 65 mostly during July and August 2012, suggesting an ongoing outbreak. Epidemiological investigations are ongoing. Public health agencies and practicing clinicians should be aware of this rarely reported disease in humans and consider it in the differential diagnosis for travellers returning from Tioman Island. |
Personality and reduced incidence of walking limitation in late life: findings from the Health, Aging, and Body Composition study
Tolea MI , Ferrucci L , Costa PT , Faulkner K , Rosano C , Satterfield S , Ayonayon HN , Simonsick EM . J Gerontol B Psychol Sci Soc Sci 2012 67 (6) 712-719 OBJECTIVES: To examine the association between openness to experience and conscientiousness and incident reported walking limitation. METHOD: The study population consisted of 786 men and women aged 71-81 years (M = 75 years, SD = 2.7) participating in the Health, Aging, and Body Composition-Cognitive Vitality Substudy. RESULTS: Nearly 20% of participants (155/786) developed walking limitation during 6 years of follow-up. High openness was associated with a reduced risk of walking limitation (hazard ratio [HR] = 0.83, 95% confidence interval [CI] = 0.69-0.98), independent of sociodemographic factors, health conditions, and conscientiousness. This association was not mediated by lifestyle factors and was not substantially modified by other risk factors for functional disability. Conscientiousness was not associated with risk of walking limitation (HR = 0.91, 95% CI = 0.77-1.07). DISCUSSION: Findings suggest that personality dimensions, specifically higher openness to experience, may contribute to functional resilience in late life. |
Associations between neighborhood characteristics and physical activity among youth within Rural-Urban Commuting Areas in the US
Kasehagen L , Busacker A , Kane D , Rohan A . Matern Child Health J 2012 16 Suppl 2 258-67 Reported associations among rural-urban community type, neighborhood characteristics, and youth physical activity are inconsistent in the literature. We used data from the 2007 National Survey of Children's Health, for youth aged 10-17 years (n = 45,392), to examine the association between physical activity and neighborhood characteristics, after adjusting for known confounders. We also examined the association between physical activity and neighborhood characteristics within seven levels of Rural-Urban Commuting Areas (RUCAs) that depict a continuum from isolated rural to dense urban communities. Attainment of a minimum physical activity level differed by RUCA (P = 0.0004). In adjusted, RUCA-specific models, the presence of parks was associated with attaining a minimum physical activity level in only one of the seven RUCAs (adjusted odds ratio: 3.49; 95 % confidence interval: 1.55, 7.84). This analysis identified no association between youths' minimum physical activity attainment and neighborhood characteristics in unstratified models, and RUCA-specific models showed little heterogeneity by rural-urban community type. Although this analysis found little association between youth physical activity and neighborhood characteristics, the findings could reflect the crude categorization of the neighborhood amenities (sidewalks, parks, recreation centers) and detracting elements (litter, dilapidated housing, vandalism), and they suggest that simple measurement of the presence of an amenity or detracting element is insufficient for determining potential associations with reaching minimum levels of physical activity. By exploring neighborhood characteristics and features of neighborhood amenities within the context of well-defined community types, like RUCAs, we can better understand how and why these factors contribute to different levels of youth physical activity. |
Associations of openness and conscientiousness with walking speed decline: findings from the Health, Aging, and Body Composition study
Tolea MI , Costa PT Jr , Terracciano A , Ferrucci L , Faulkner K , Coday MM , Ayonayon HN , Simonsick EM . J Gerontol B Psychol Sci Soc Sci 2012 67 (6) 705-711 OBJECTIVES: The objective of this study was to explore the associations between openness to experience and conscientiousness, two dimensions of the five-factor model of personality, and usual gait speed and gait speed decline. METHODS: Baseline analyses were conducted on 907 men and women aged 71-82 years participating in the Cognitive Vitality substudy of the Health, Aging, and Body Composition study. The longitudinal analytic sample consisted of 740 participants who had walking speed assessed 3 years later. RESULTS: At baseline, gait speed averaged 1.2 m/s, and an average decline of 5% over the 3-year follow-up period was observed. Higher conscientiousness was associated with faster initial walking speed and less decline in walking speed over the study period, independent of sociodemographic characteristics. Lifestyle factors and disease status appear to play a role in the baseline but not the longitudinal association between conscientiousness and gait speed. Openness was associated with neither initial gait speed nor decline in gait speed. DISCUSSION: These findings extend the body of evidence suggesting a protective association between conscientiousness and physical function to performance-based assessment of gait speed. Future studies are needed to confirm these associations and to explore mechanisms that underlie the conscientiousness-mobility connection in aging adults. |
Variation in delivery of the 10 essential public health services by local health departments for obesity control in 2005 and 2008
Luo H , Sotnikov S , Shah G , Galuska DA , Zhang X . J Public Health Manag Pract 2013 19 (1) 53-61 OBJECTIVES: To describe and compare the capacity of local health departments (LHDs) to perform 10 essential public health services (EPHS) for obesity control in 2005 and 2008, and to explore factors associated with provision of these services. METHODS: The data for this study were drawn from the 2005 and 2008 National Profile of Local Health Department surveys, conducted by the National Association of County and City Health Officials. Data were analyzed in SAS version 9.1 (SAS Institute Inc, Cary, North Carolina). RESULTS: The proportion of LHDs that reported providing none of the EPHS for obesity control decreased from 27.9% in 2005 to 17.0% in 2008. In both 2005 and 2008, the 2 EPHS for obesity control most frequently provided by LHDs were informing, educating, and empowering people (EPHS 3) and linking people to needed personal health services (EPHS 7). The 2 least frequently provided services were enforcing laws and regulations (EPHS 6) and conducting research (EPHS 10). On average, LHDs provided 3.05 EPHS in 2005 and 3.69 EPHS in 2008. Multiple logistic regression results show that LHDs with larger jurisdiction populations, with local governance, and those that had completed a community health improvement plan were more likely to provide more of the EPHS for obesity control (P < .05). CONCLUSIONS: The provision of the 10 EPHS for obesity control by LHDs remains low. Local health departments need more assistance and resources to expand performance of EPHS for obesity control. Future studies are needed to evaluate and promote LHD capacity to deliver evidence-based strategies for obesity control in local communities. |
Louisiana implementation of the National Fetal and Infant Mortality Review (NFIMR) program model: successes and opportunities
Kieltyka L , Craig M , Goodman DA , Wise R . Matern Child Health J 2012 16 Suppl 2 353-9 Common features of successful, local-level Fetal and Infant Mortality Review (FIMR) programs are identified by the National Fetal and Infant Mortality Review (NFIMR) Program, including medical records abstraction and home interviews, case reviews by a case review team (CRT), and community systems action recommendations implemented by a community action team (CAT). This paper presents Louisiana's FIMR program, an adaptation of NFIMR recommendations. In 2001, the Louisiana Maternal and Child Health Program began a statewide FIMR Network (LaFIMR) based on the NFIMR model. Geographic areas of focus, case identification, staffing, data collection methods, and CRT and CAT membership and activities include modifications of the NFIMR recommendations unique to LaFIMR implementation. Adaptations made to the NFIMR model were advantageous to LaFIMR's success. Specifically, LaFIMR geographic areas of interest cover multiple natural communities. Compared with independent FIMR programs elsewhere, LaFIMR represents a Title V Program-based coordinated network of regional LaFIMR teams offering opportunities for expanded partnerships. Primary sources for LaFIMR case identification include obituaries and hospital logs, with secondary identification available through vital records. Improvements in vital records data systems are expected to enhance future LaFIMR case identification. LaFIMR-identified records that are linked with vital event certificates provide enhanced contextual findings for reviews and support continuous quality improvement processes. These differences in the LaFIMR implementation reinforce the NFIMR-supported uniqueness of FIMR programs across the United States, and may encourage other FIMR programs to consider how adaptations to NFIMR recommendations could benefit their programs. |
Enumerating the environmental public health workforce - challenges and opportunities
Massoudi M , Blake R , Marcum L . J Environ Health 2012 75 (4) 34-36 Workforce enumeration is the foundation for identifying workforce needs. In 2000, the Health Resources and Services Administration (HRSA) sponsored an enumeration of the public health workforce (HRSA, 2000), but since then, no comprehensive enumeration has occurred. The Centers for Disease Control and Prevention (CDC) and HRSA are now collaborating on an effort to determine the number and composition of the U.S. workforce at the federal, state, and local levels. |
National trends in visit rates and antibiotic prescribing for adults with acute sinusitis
Fairlie T , Shapiro DJ , Hersh AL , Hicks LA . Arch Intern Med 2012 172 (19) 1513-1514 Acute sinusitis is diagnosed in over 3 million visits annually among adults and children in the United States.1 Of these, more than 80% result in an antibiotic prescription1; however, many of these prescriptions may be unnecessary,2,3 since sinusitis is most often of viral origin and benefits of antibiotics may be limited.4 Prior to 2012, amoxicillin was the recommended empirical treatment for acute bacterial sinusitis; current guidelines now recommend amoxicillin-clavulanate.5-7 In light of recent studies4 and new treatment guidelines,7 we sought to examine visit rates and antibiotic prescribing patterns for adults with acute sinusitis in the United States. |
Appropriate follow up to detect potential adverse events after initiation of select contraceptive methods: a systematic review
Steenland MW , Zapata LB , Brahmi D , Marchbanks PA , Curtis KM . Contraception 2012 87 (5) 611-24 BACKGROUND: After a woman initiates certain methods of contraception [e.g., hormonal methods, intrauterine devices (IUDs)], she is generally asked to return at some interval for a follow-up visit; however, it is unclear whether follow-up is needed, what an appropriate follow-up schedule is, and what should be done at follow-up visits. METHODS: We conducted four separate searches in the PubMed database for all peer-reviewed articles in any language published from database inception through April 2012 that examined the following health outcomes for combined hormonal contraceptives (CHCs), IUDs or depot medroxyprogesterone acetate (DMPA): (a) incidence of hypertension among women who began using a CHC compared to women not using a CHC; (b) incidence of migraine among women who began using a CHC compared to women not using a CHC; (c) incidence of pelvic inflammatory disease (PID) among women who began using an IUD compared to women who started another method or used no method of contraception, or incidence of PID at two or more time periods after IUD insertion; and (d) whether initial weight gain predicts future weight gain among women who began using DMPA. The quality of each study was assessed using the United States Preventive Services Task Force grading system. RESULTS: A total of 15 studies met our inclusion criteria: 5 examined hypertension and combined oral contraceptive (COC) use, 7 examined PID and IUD use, and 3 examined weight gain after DMPA initiation. No studies that examined migraine after CHC initiation met our inclusion criteria. Few women developed hypertension after initiating COCs, and studies examining increases in blood pressure after COC initiation found mixed results (Level I, fair to II-2, fair). 
Among women who had a copper IUD inserted, there was little difference in incidence of PID, or IUD removal for PID, compared with women who initiated DMPA, a hormone-releasing IUD, or COCs (Level I, good to Level II-2, fair). Studies that examined when women were diagnosed with PID after IUD insertion found mixed results. The study with the largest sample size found a much greater incidence of PID in the first 20 days after insertion, with very low rates of PID up to 8 years postinsertion (Level I, good to Level II-3, poor). Studies that examined weight gain after DMPA initiation found that weight gain >5% of baseline weight at 6 months was associated with greater mean change in weight and greater mean change in body mass index at follow-up times ranging from 12 to 36 months (Level II-2, fair to Level II-3, fair). CONCLUSIONS: Evidence on select adverse events associated with initiation of contraceptive use is limited but does not suggest increased risk of hypertension among COC users or increased risk of PID among IUD users. DMPA users who gain >5% of baseline body weight may be at increased risk of future weight gain. |
Association between intensive handwashing promotion and child development in Karachi, Pakistan: a cluster randomized controlled trial
Bowen A , Agboatwalla M , Luby S , Tobery T , Ayers T , Hoekstra RM . Arch Pediatr Adolesc Med 2012 166 (11) 1037-1044 OBJECTIVE: To evaluate associations between handwashing promotion and child growth and development. DESIGN: Cluster randomized controlled trial. SETTING: Informal settlements in Karachi, Pakistan. PARTICIPANTS: A total of 461 children who were enrolled in a trial of household-level handwashing promotion in 2003 and were younger than 8 years at reassessment in 2009. INTERVENTIONS: In 2003, neighborhoods were randomized to control (n = 9), handwashing promotion (n = 9), or handwashing promotion and drinking water treatment (n = 10); intervention households received free soap and weekly handwashing promotion for 9 months. MAIN OUTCOME MEASURES: Anthropometrics and developmental quotients measured with the Battelle Developmental Inventory II at 5 to 7 years of age. RESULTS: Overall, 24.9% (95% CI, 20.0%-30.6%) and 22.1% (95% CI, 18.0%-26.8%) of children had z scores that were more than 2 SDs below the expected z scores for height and body mass index for age, respectively; anthropometrics did not differ significantly across study groups. Global developmental quotients averaged 104.4 (95% CI, 101.9-107.0) among intervention children and 98.3 (95% CI, 93.1-103.4) among control children (P = .04). Differences of similar magnitude were measured across adaptive, personal-social, communication, cognitive, and motor domains. CONCLUSIONS: Although growth was similar across groups, children randomized to the handwashing promotion during their first 30 months of age attained global developmental quotients 0.4 SDs greater than those of control children at 5 to 7 years of age. These gains are comparable to those of at-risk children enrolled in publicly funded preschools in the United States and suggest that handwashing promotion could improve child well-being and societal productivity. TRIAL REGISTRATION: clinicaltrials.gov IDENTIFIER: NCT01538953. 
|
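The anthropometric flags reported above (z scores more than 2 SDs below the expected value) follow the standard z-score calculation. A minimal sketch, using hypothetical reference values rather than the growth-reference tables a study like this would actually use:

```python
def z_score(observed, ref_median, ref_sd):
    """Anthropometric z-score: how many reference-population
    standard deviations the observed measurement lies from
    the reference median (negative = below the median)."""
    return (observed - ref_median) / ref_sd

# Hypothetical height-for-age reference values, for illustration only:
z = z_score(100.0, 110.0, 4.5)   # child's height, cm
flagged = z < -2                 # "more than 2 SDs below expected"
```

The proportions reported in the abstract (24.9% and 22.1%) are the shares of children whose height-for-age and BMI-for-age z scores fell below this -2 cutoff.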
Brucellosis seroprevalence among workers in at-risk professions: northwestern Wyoming, 2005 to 2006
Luce R , Snow J , Gross D , Murphy T , Grandpre J , Daley WR , Brudvig JM , Ari MD , Harris L , Clark TA . J Occup Environ Med 2012 54 (12) 1557-60 OBJECTIVE: Brucellosis is uncommon in the United States; however, its circulation among wildlife and domestic cattle has been ongoing in Wyoming. To assess the public health threat of brucellosis circulation among animals, a seroprevalence study was undertaken among workers in professions considered to be at the highest risk for infection. METHODS: A seroprevalence study was undertaken targeting individuals in at-risk professions in the affected area of the state. RESULTS: Seroprevalence among study participants was 14.4%. Veterinarians were the main professional group that demonstrated a statistically significant association with measurable anti-Brucella antibodies. Vaccinating animals with Brucella vaccines was associated with seropositivity. CONCLUSION: The risk to the general public's health from the circulation of Brucella among wildlife and cattle can be attributed primarily to a limited subpopulation at high risk rather than a generally elevated risk. |
Content Index (Archived Edition)
- Chronic Diseases and Conditions
- Communicable Diseases
- Disease Reservoirs and Vectors
- Environmental Health
- Epidemiology and Surveillance
- Genetics and Genomics
- Health Behavior and Risk
- Health Communication and Education
- Immunity and Immunization
- Informatics
- Injury and Violence
- Laboratory Sciences
- Maternal and Child Health
- Mining
- Nutritional Sciences
- Occupational Safety and Health
- Parasitic Diseases
- Physical Activity
- Public Health Leadership and Management
- Public Health, General
- Reproductive Health
- Social and Behavioral Sciences
- Veterinary Medicine
About
CDC Science Clips is an online, continuously updated, searchable database of scientific literature published by CDC authors. Each article features an Altmetric Attention Score to track social and mainstream media mentions. If you are aware of a CDC-authored publication that does not appear in this database, please let us know.