Last data update: Jul 01, 2024. (Total: 47134 publications since 2009)
Records 1-30 (of 69 Records) |
Query Trace: Crews J [original query] |
---|
The wildland firefighter exposure and health effect (WFFEHE) study: cohort characteristics and health behavior changes in context
Scott KA , Wingate KC , DuBose KN , Butler CR , Ramirez-Cardenas A , Hale CR . Ann Work Expo Health 2024 OBJECTIVES: Work is an under-recognized social determinant of health. There is limited research describing US wildland firefighter (WFF) workforce demographics or how work is associated with WFF health behaviors. In this study, researchers characterized a WFF cohort and tested hypotheses that WFFs used tobacco, alcohol, and sugar-sweetened beverages (SSBs) differently over the course of the fire season and that different fire crews may exhibit different behavior patterns. METHODS: Researchers collected data in the field with 6 WFF crews during 2 consecutive fire seasons (2018 and 2019). WFF crews completed questionnaires before and after each season. WFFs with an initial preseason questionnaire and at least 1 follow-up questionnaire were included (n = 138). Descriptive statistics summarized WFFs' baseline demographic, employment, and health characteristics. Linear mixed models were used to test for changes in WFFs' substance use over time and assess crew-level differences. A meta-analysis of WFF longitudinal studies' population characteristics was attempted to contextualize baseline findings. RESULTS: WFFs were predominantly male, less than 35 yr of age, and non-Hispanic White, and had a healthy weight. Smokeless tobacco use and binge drinking were prevalent in this cohort (52% and 78%, respectively, among respondents). Longitudinal analyses revealed that during the fire season WFFs' use of tobacco and SSBs increased and the number of days they consumed alcohol decreased. Crew-level associations varied by substance. The meta-analysis was not completed due to cross-study heterogeneity and inconsistent reporting. DISCUSSION: WFF agencies can promote evidence-based substance use prevention and management programs and modify working conditions that may influence WFF stress or substance use. |
Evaluation of commercially available high-throughput SARS-CoV-2 serological assays for serosurveillance and related applications (preprint)
Stone M , Grebe E , Sulaeman H , Di Germanio C , Dave H , Kelly K , Biggerstaff BJ , Crews BO , Tran N , Jerome KR , Denny TN , Hogema B , Destree M , Jones JM , Thornburg N , Simmons G , Krajden M , Kleinman S , Dumont LJ , Busch MP . medRxiv 2021 2021.09.04.21262414 SARS-CoV-2 serosurveys can estimate cumulative incidence for monitoring epidemics but require characterization of the performance of the employed serological assays to inform testing algorithm development and interpretation of results. We conducted a multi-laboratory evaluation of 21 commercial high-throughput SARS-CoV-2 serological assays using blinded panels of 1,000 highly-characterized blood-donor specimens. Assays demonstrated a range of sensitivities (96%-63%), specificities (99%-96%), and precision (intraclass correlation coefficient 0.55-0.99). Durability of antibody detection in longitudinal samples was dependent on assay format and immunoglobulin target, with anti-spike, direct, or total Ig assays demonstrating more stable or increasing reactivity over time than anti-nucleocapsid, indirect, or IgG assays. Assays with high sensitivity, specificity, and durable antibody detection are ideal for serosurveillance. Less sensitive assays demonstrating waning reactivity are appropriate for other applications, including characterizing antibody responses after infection and vaccination, and detection of anamnestic boosting by reinfections and vaccine breakthrough infections.
Assay performance must be evaluated in the context of the intended use. |
Proinflammatory diets and risk of ESKD in US adults with CKD
Banerjee T , McCulloch CE , Crews DC , Burrows NR , Pavkov ME , Saran R , Morgenstern H , Bragg-Gresham J , Powe NR . Kidney360 2022 3 (11) 1852-1860 BACKGROUND: Inflammation may affect long-term kidney function. Diet may play a role in chronic inflammation. We hypothesized that proinflammatory diets increase the risk of progression to kidney failure with replacement therapy (KFRT), and that systemic inflammation is a mediator of the effect of diet on progression to KFRT. METHODS: In the 1988-1994 National Health and Nutrition Examination Survey linked to the national ESKD registry, in adults with CKD (eGFR 15-59 ml/min per 1.73 m(2)), aged ≥20 years, we calculated the Adapted Dietary Inflammatory Index (ADII) at baseline from a 24-hour dietary recall and an inflammation score (IS) using the average of z scores of four inflammation biomarkers. We explored the association of the ADII and IS with risk of incident KFRT using a Cox proportional hazards model, adjusting for sociodemographics, physical activity, Framingham risk score, eGFR, and urinary ACR. We evaluated whether, and to what extent, IS mediated the effect of the ADII on KFRT incidence, using causal mediation analysis. RESULTS: Of 1084 adults with CKD, 109 (10%) developed KFRT. The ADII was associated with increased risk of KFRT (relative hazard [RH] per SD (2.56) increase: 1.4 [95% CI, 1.04-1.78]). IS was also associated with KFRT (RH: 1.12; 95% CI, 1.02 to 1.25). Approximately 36% of the association between the ADII and KFRT was explained by IS. CONCLUSIONS: Among adults with CKD, a proinflammatory diet was associated with risk of KFRT, and that association was partially explained by an increase in inflammatory markers. Dietary interventions that reduce inflammation may offer an approach for preventing KFRT. |
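The inflammation score (IS) described above, the average of z scores across four inflammation biomarkers, can be sketched as follows. This is a minimal illustration: the biomarker names and values are hypothetical, since the abstract does not list the four biomarkers used.

```python
import statistics

def z_scores(values):
    """Standardize one biomarker's values across participants to z scores."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return [(v - mean) / sd for v in values]

def inflammation_score(biomarker_columns):
    """Average the per-biomarker z scores for each participant."""
    standardized = [z_scores(col) for col in biomarker_columns]
    n = len(standardized[0])
    return [statistics.mean(col[i] for col in standardized) for i in range(n)]

# Hypothetical biomarker measurements for three participants (illustrative only)
crp = [1.2, 5.4, 0.8]
il6 = [2.0, 7.1, 1.5]
tnf = [3.3, 6.0, 2.9]
fib = [280, 420, 250]

scores = inflammation_score([crp, il6, tnf, fib])
```

Standardizing each biomarker before averaging keeps markers measured on different scales from dominating the composite score.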
Notes from the field: Coccidioidomycosis outbreak among wildland firefighters - California, 2021
Donnelly MAP , Maffei D , Sondermeyer Cooksey GL , Ferguson TJ , Jain S , Vugia D , Materna BL , Kamali A . MMWR Morb Mortal Wkly Rep 2022 71 (34) 1095-1096 Coccidioidomycosis, also known as Valley fever, is caused by inhalation of spores of the soil-dwelling fungi Coccidioides spp. Although most illness is mild, coccidioidomycosis can cause severe disease resulting in hospitalization or death. On July 28, 2021, the California Department of Forestry and Fire Protection (CAL FIRE) notified the California Department of Public Health (CDPH) of seven wildland firefighters from two crews who had respiratory illness. Crew A (19 members) and crew B (21 members) had worked on wildfires in late June 2021 near the Tehachapi Mountains, a California region with historically high coccidioidomycosis incidence.* Among the seven symptomatic firefighters, three cases of coccidioidomycosis were laboratory-confirmed; two patients developed severe disease. All three firefighters with confirmed coccidioidomycosis reported working in dusty conditions without wearing respiratory protection. Because no vaccine for coccidioidomycosis currently exists, correct use of respiratory protection is important for preventing coccidioidomycosis, especially in regions with high disease incidence. |
Investigation of COVID-19 Outbreak among Wildland Firefighters during Wildfire Response, Colorado, USA, 2020.
Metz AR , Bauer M , Epperly C , Stringer G , Marshall KE , Webb LM , Hetherington-Rauth M , Matzinger SR , Totten SE , Travanty EA , Good KM , Burakoff A . Emerg Infect Dis 2022 28 (8) 1551-1558 A COVID-19 outbreak occurred among Cameron Peak Fire responders in Colorado, USA, during August 2020-January 2021. The Cameron Peak Fire was the largest recorded wildfire in Colorado history, lasting August-December 2020. At least 6,123 responders were involved, including 1,260 firefighters in 63 crews who mobilized to the fire camps. A total of 79 COVID-19 cases were identified among responders, and 273 close contacts were quarantined. State and local public health investigated the outbreak and coordinated with wildfire management teams to prevent disease spread. We performed whole-genome sequencing and applied social network analysis to visualize clusters and transmission dynamics. Phylogenetic analysis identified 8 lineages among sequenced specimens, implying multiple introductions. Social network analysis identified spread between and within crews. Strategies such as implementing symptom screening and testing of arriving responders, educating responders about overlapping symptoms of smoke inhalation and COVID-19, improving physical distancing of crews, and encouraging vaccinations are recommended. |
Physiological stress in flat and uphill walking with different backpack loads in professional mountain rescue crews
Pinedo-Jauregi A , Quinn T , Coca A , Mejuto G , Cámara J . Appl Ergon 2022 103 103784 This study aimed to determine the interactive physiological effect of backpack load carriage and slope during walking in professional mountain rescuers. Sixteen mountain rescuers walked on a treadmill at 3.6 km/h for 5 min in each combination of three slopes (1%, 10%, 20%) and five backpack loads (0%, 10%, 20%, 30%, and 40% body weight). Relative heart rate (%HRmax), relative oxygen consumption (%VO(2)max), and rating of perceived exertion (RPE, Borg 1-10 scale) were compared across conditions using two-way ANOVA. Significant differences in %VO(2)max, %HRmax, and RPE across slopes and loads were found where burden increased directly with slope and load (main effect of slope, p < 0.001 for all; main effect of load, p < 0.001 for all). Additionally, significant slope by load interactions were found for all parameters, indicating an additive effect (p < 0.001 for all). Mountain rescuers should consider the physiological interaction between slope and load when determining safe occupational walking capacity. |
Evaluation of Commercially Available High-Throughput SARS-CoV-2 Serologic Assays for Serosurveillance and Related Applications.
Stone M , Grebe E , Sulaeman H , Di Germanio C , Dave H , Kelly K , Biggerstaff BJ , Crews BO , Tran N , Jerome KR , Denny TN , Hogema B , Destree M , Jones JM , Thornburg N , Simmons G , Krajden M , Kleinman S , Dumont LJ , Busch MP . Emerg Infect Dis 2022 28 (3) 672-683 Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) serosurveys can estimate cumulative incidence for monitoring epidemics, requiring assessment of serologic assays to inform testing algorithm development and interpretation of results. We conducted a multilaboratory evaluation of 21 commercial high-throughput SARS-CoV-2 serologic assays using blinded panels of 1,000 highly characterized specimens. Assays demonstrated a range of sensitivities (96%-63%), specificities (99%-96%), and precision (intraclass correlation coefficient 0.55-0.99). Durability of antibody detection was dependent on antigen and immunoglobulin targets; antispike and total Ig assays demonstrated more stable longitudinal reactivity than antinucleocapsid and IgG assays. Assays with high sensitivity, specificity, and durable antibody detection are ideal for serosurveillance, but assays demonstrating waning reactivity are appropriate for other applications, including correlation with neutralizing activity and detection of anamnestic boosting by reinfections. Assay performance must be evaluated in context of intended use, particularly in the context of widespread vaccination and circulation of SARS-CoV-2 variants. |
The Wildland Firefighter Exposure and Health Effect (WFFEHE) Study: Rationale, design, and methods of a repeated-measures study
Navarro KM , Butler CR , Fent K , Toennis C , Sammons D , Ramirez-Cardenas A , Clark KA , Byrne DC , Graydon PS , Hale CR , Wilkinson AF , Smith DL , Alexander-Scott MC , Pinkerton LE , Eisenberg J , Domitrovich JW . Ann Work Expo Health 2021 66 (6) 714-727 The wildland firefighter exposure and health effect (WFFEHE) study was a 2-year repeated-measures study to investigate occupational exposures and acute and subacute health effects among wildland firefighters. This manuscript describes the study rationale, design, methods, limitations, challenges, and lessons learned. The WFFEHE cohort included fire personnel ages 18-57 from six federal wildland firefighting crews in Colorado and Idaho during the 2018 and 2019 fire seasons. All wildland firefighters employed by the recruited crews were invited to participate in the study at preseason and postseason study intervals. In 2019, one of the crews also participated in a 3-day midseason study interval where workplace exposures and pre/postshift measurements were collected while at a wildland fire incident. Study components assessed cardiovascular health, pulmonary function and inflammation, kidney function, workplace exposures, and noise-induced hearing loss. Measurements included self-reported risk factors and symptoms collected through questionnaires; serum and urine biomarkers of exposure, effect, and inflammation; pulmonary function; platelet function and arterial stiffness; and audiometric testing. Throughout the study, 154 wildland firefighters participated in at least one study interval, while 144 participated in two or more study intervals. This study was completed by the Centers for Disease Control and Prevention's National Institute for Occupational Safety and Health through a collaborative effort with the U.S. Department of Agriculture Forest Service, Department of the Interior National Park Service, and Skidmore College.
Conducting research in the wildfire environment came with many challenges, including collecting data from study participants with changing work schedules, conducting study protocols safely, and operating laboratory equipment in remote field locations. Forthcoming WFFEHE study results will contribute to the scientific evidence regarding occupational risk factors and exposures that can impact wildland firefighter health over a season and across two wildland fire seasons. This research is anticipated to lead to the development of preventive measures and policies aimed at reducing risk for wildland firefighters and aid in identifying future research needs for the wildland fire community. |
Trends in chronic kidney disease care in the US by race and ethnicity, 2012-2019
Chu CD , Powe NR , McCulloch CE , Crews DC , Han Y , Bragg-Gresham JL , Saran R , Koyama A , Burrows NR , Tuot DS . JAMA Netw Open 2021 4 (9) e2127014 IMPORTANCE: Significant racial and ethnic disparities in chronic kidney disease (CKD) progression and outcomes are well documented, as is low use of guideline-recommended CKD care. OBJECTIVE: To examine guideline-recommended CKD care delivery by race and ethnicity in a large, diverse population. DESIGN, SETTING, AND PARTICIPANTS: In this serial cross-sectional study, adult patients with CKD that did not require dialysis, defined as a persistent estimated glomerular filtration rate less than 60 mL/min/1.73 m2 or a urine albumin-creatinine ratio of 30 mg/g or higher for at least 90 days, were identified in 2-year cross-sections from January 1, 2012, to December 31, 2019. Data from the OptumLabs Data Warehouse, a national data set of administrative and electronic health record data for commercially insured and Medicare Advantage patients, were used. EXPOSURES: The independent variables were race and ethnicity, as reported in linked electronic health records. MAIN OUTCOMES AND MEASURES: On the basis of guideline-recommended CKD care, the study examined care delivery process measures (angiotensin-converting enzyme inhibitor or angiotensin II receptor blocker prescription for albuminuria, statin prescription, albuminuria testing, nephrology care for CKD stage 4 or higher, and avoidance of chronic nonsteroidal anti-inflammatory drug prescription) and care delivery outcome measures (blood pressure and diabetes control). RESULTS: A total of 452 238 patients met the inclusion criteria (mean [SD] age, 74.0 [10.2] years; 262 089 [58.0%] female; a total of 7573 [1.7%] Asian, 49 970 [11.0%] Black, 15 540 [3.4%] Hispanic, and 379 155 [83.8%] White). 
Performance on process measures was higher among Asian, Black, and Hispanic patients compared with White patients for angiotensin-converting enzyme inhibitor and angiotensin II receptor blocker use (79.8% for Asian patients, 76.7% for Black patients, and 79.9% for Hispanic patients compared with 72.3% for White patients in 2018-2019), statin use (72.6% for Asian patients, 69.1% for Black patients, and 74.1% for Hispanic patients compared with 61.5% for White patients), nephrology care (64.8% for Asian patients, 72.9% for Black patients, and 69.4% for Hispanic patients compared with 58.3% for White patients), and albuminuria testing (53.9% for Asian patients, 41.0% for Black patients, and 52.6% for Hispanic patients compared with 30.7% for White patients). Achievement of blood pressure control to less than 140/90 mm Hg was similar or lower among Asian (71.8%), Black (63.3%), and Hispanic (69.8%) patients compared with White patients (72.9%). Achievement of diabetes control with hemoglobin A1c less than 7.0% was 50.1% in Asian patients, 49.3% in Black patients, and 46.0% in Hispanic patients compared with 50.3% for White patients. CONCLUSIONS AND RELEVANCE: Higher performance on CKD care process measures among Asian, Black, and Hispanic patients suggests that differences in medication prescription and diagnostic testing are unlikely to fully explain known disparities in CKD progression and kidney failure. Improving care delivery processes alone may be inadequate for reducing these disparities. |
Exposure to particulate matter and estimation of volatile organic compounds across wildland firefighter job tasks
Navarro KM , West MR , O'Dell K , Sen P , Chen IC , Fischer EV , Hornbrook RS , Apel EC , Hills AJ , Jarnot A , DeMott P , Domitrovich JW . Environ Sci Technol 2021 55 (17) 11795-11804 Wildland firefighters are exposed to smoke containing particulate matter (PM) and volatile organic compounds (VOCs) while suppressing wildfires. From 2015 to 2017, the U.S. Forest Service conducted a field study collecting breathing zone measurements of PM(4) (particulate matter with aerodynamic diameter ≤4 μm) on wildland firefighters from different crew types and while performing various fire suppression tasks on wildfires. Emission ratios of VOCs (parts per billion; ppb) to PM(1) (particulate matter with aerodynamic diameter ≤1 μm; mg/m(3)) were calculated using data from a separate field study conducted in summer 2018, the Western Wildfire Experiment for Cloud Chemistry, Aerosol Absorption, and Nitrogen (WE-CAN) Campaign. These emission ratios were used to estimate wildland firefighter exposure to acrolein, benzene, and formaldehyde. This field sampling campaign found that exposure to PM(4) and VOCs varied across wildland firefighter crew type and job task. Type 1 crews had greater exposures to both PM(4) and VOCs than type 2 or type 2 initial attack crews, and wildland firefighters performing direct suppression had statistically higher exposures than those performing staging and other tasks (mean differences = 0.82 and 0.75 mg/m(3); 95% confidence intervals = 0.38-1.26 and 0.41-1.08 mg/m(3), respectively). Of the 81 personal exposure samples collected, 19% of measured PM(4) exposures exceeded the recommended National Wildfire Coordinating Group occupational exposure limit (0.7 mg/m(3)). Wildland fire management should continue to find strategies to reduce smoke exposures for wildland firefighters. |
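The exposure-estimation step described above multiplies a measured PM concentration by a VOC:PM emission ratio. A minimal sketch of that arithmetic, using hypothetical ratio values (the abstract does not report the WE-CAN emission ratios):

```python
def estimate_voc_exposure(pm_mg_m3, emission_ratio_ppb_per_mg_m3):
    """Estimate a VOC exposure (ppb) from a measured PM concentration
    (mg/m3) using a VOC:PM emission ratio (ppb VOC per mg/m3 PM)."""
    return pm_mg_m3 * emission_ratio_ppb_per_mg_m3

# Hypothetical emission ratios (ppb per mg/m3); illustrative only,
# not the values derived from the WE-CAN campaign
ratios = {"acrolein": 2.0, "benzene": 5.0, "formaldehyde": 12.0}

pm4 = 0.82  # example measured PM4 exposure, mg/m3
exposures = {voc: estimate_voc_exposure(pm4, r) for voc, r in ratios.items()}
```

The estimate scales linearly with the PM measurement, so any bias in the assumed emission ratio propagates directly into the estimated VOC exposure.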
Reassessing the Inclusion of Race in Diagnosing Kidney Diseases: An Interim Report from the NKF-ASN Task Force
Delgado C , Baweja M , Burrows NR , Crews DC , Eneanya ND , Gadegbeku CA , Inker LA , Mendu ML , Miller WG , Moxey-Mims MM , Roberts GV , St Peter WL , Warfield C , Powe NR . Am J Kidney Dis 2021 78 (1) 103-115 For almost two decades, equations that use serum creatinine, age, sex, and race to estimate GFR (eGFR) have included "race" as Black or non-Black. Given considerable evidence of disparities in health and healthcare delivery in African American communities, some regard keeping a race term in GFR equations as a practice that differentially influences access to care and kidney transplantation. Others assert that race captures important GFR determinants and that its removal from the calculation may perpetuate other disparities. The National Kidney Foundation (NKF) and American Society of Nephrology (ASN) established a task force in 2020 to reassess the inclusion of race in the estimation of GFR in the United States and its implications for the diagnosis and subsequent management of patients with, or at risk for, kidney diseases. This interim report details the process, initial assessment of evidence, and values defined regarding the use of race to estimate GFR. We organized activities in phases: (1) clarify the problem and examine evidence, (2) evaluate different approaches to address the use of race in GFR estimation, and (3) make recommendations. In phase one, we constructed statements about the evidence and defined values regarding equity and disparities; race and racism; GFR measurement, estimation, and equation performance; laboratory standardization; and patient perspectives. We also identified several approaches to estimate GFR and a set of attributes to evaluate these approaches. Building on evidence and values, the attributes of alternative approaches to estimate GFR will be evaluated in the next phases, and recommendations will be made. |
Angiotensin-converting enzyme inhibitor or angiotensin receptor blocker use among hypertensive US adults with albuminuria
Chu CD , Powe NR , McCulloch CE , Banerjee T , Crews DC , Saran R , Bragg-Gresham J , Morgenstern H , Pavkov ME , Saydah SH , Tuot DS . Hypertension 2020 77 (1) 94-102 Since 2003, US hypertension guidelines have recommended ACE (angiotensin-converting enzyme) inhibitors or ARBs (angiotensin receptor blockers) as first-line antihypertensive therapy in the presence of albuminuria (urine albumin/creatinine ratio ≥300 mg/g). To examine national trends in guideline-concordant ACE inhibitor/ARB utilization, we studied adults participating in the National Health and Nutrition Examination Surveys 2001 to 2018 with hypertension (defined by self-report of high blood pressure, systolic blood pressure ≥140 mm Hg or diastolic ≥90 mm Hg, or use of antihypertensive medications). Among 20 538 included adults, the prevalence of albuminuria ≥300 mg/g was 2.8% in 2001 to 2006, 2.8% in 2007 to 2012, and 3.2% in 2013 to 2018. Among those with albuminuria ≥300 mg/g, no consistent trends were observed for the proportion receiving ACE inhibitor/ARB treatment from 2001 to 2018 among persons with diabetes, without diabetes, or overall. In 2013 to 2018, ACE inhibitor/ARB usage in the setting of albuminuria ≥300 mg/g was 55.3% (95% CI, 46.8%-63.6%) among adults with diabetes and 33.4% (95% CI, 23.1%-45.5%) among those without diabetes. Based on US population counts, these estimates represent 1.6 million adults with albuminuria ≥300 mg/g currently not receiving ACE inhibitor/ARB therapy, nearly half of whom do not have diabetes. ACE inhibitor/ARB underutilization represents a significant gap in preventive care delivery for adults with hypertension and albuminuria that has not substantially changed over time. |
National trends in the prevalence of chronic kidney disease among racial/ethnic and socioeconomic status groups, 1988-2016
Vart P , Powe NR , McCulloch CE , Saran R , Gillespie BW , Saydah S , Crews DC . JAMA Netw Open 2020 3 (7) e207932 Importance: The overall prevalence of chronic kidney disease (CKD) has stabilized in the United States in recent years. However, it is unclear whether all major sociodemographic groups experienced this trend. Objective: To examine trends in CKD prevalence across major sociodemographic groups as defined by race/ethnicity and socioeconomic status. Design, Setting, and Participants: This repeated cross-sectional study used data from the National Health and Nutrition Examination Surveys for 1988 to 1994 and every 2 years from 1999 to 2016 on individuals 20 years or older with information on race/ethnicity, socioeconomic status, and serum creatinine levels. Statistical analysis was conducted from May 1, 2017, to April 6, 2020. Exposures: Race/ethnicity and socioeconomic status. Main Outcomes and Measures: Prevalence of CKD was defined as an estimated glomerular filtration rate of 15 to 59 mL/min/1.73 m2. Results: A total of 54554 participants (mean [SE] age, 46.2 [0.2] years; 51.7% female) were examined. The age-, sex- and race/ethnicity-adjusted overall prevalence of stage 3 and 4 CKD increased from 3.9% in 1988-1994 to 5.2% in 2003-2004 (difference, 1.3%; 95% CI, 0.9%-1.7%; P < .001 for change) and remained relatively stable thereafter at 5.1% in 2015-2016 (difference, -0.1%; 95% CI, -0.7% to 0.4%; P = .61 for change). The trend in adjusted CKD prevalence differed significantly by race/ethnicity (P = .009 for interaction). In non-Hispanic white and non-Hispanic black persons, CKD prevalence increased between 1988-1994 and 2003-2004 and remained stable thereafter. 
Among Mexican American persons, CKD prevalence was lower than in other racial/ethnic groups and remained stable between 1988-1994 and 2003-2004 but nearly doubled (difference, 2.1%; 95% CI, 0.9%-3.3%; P = .001 for change) between 2003-2004 and 2015-2016 to rates similar to those in other racial/ethnic groups. There were higher rates of CKD prevalence among groups with lower educational level and income (eg, 5.8% vs 4.3% and 4.3% vs 3.1% in low vs high education and income, respectively, in 1988-1994), but trends in CKD prevalence mirrored those for the overall population. The higher CKD prevalence among individuals with lower educational level and income remained largely consistent throughout the entire period. Results were similar in most subgroups when including albuminuria to define CKD. Conclusions and Relevance: The prevalence of CKD in the United States has stabilized overall in recent years but has increased among Mexican American persons. More important, gaps in CKD prevalence across racial/ethnic groups and levels of socioeconomic status largely persisted over 28 years. There is a need to identify and address causes of increasing CKD prevalence among Mexican American persons and a need to renew efforts to effectively mitigate persistent disparities in CKD prevalence. |
Self-reported oral health status among adults age 40+ years with and without vision impairment: National Health Interview Survey, 2008
Crews JE , Chou CF , Naavaal S , Griffin S , Saaddine JB . Am J Ophthalmol 2019 210 184-191 PURPOSE: To examine self-reported oral health among adults age 40 years and older with and without vision impairment. DESIGN: Cross-sectional with a nationally representative sample. METHODS: We used publicly available data from the Oral Health Module, last administered in 2008, of the National Health Interview Survey. Outcome variables included fair/poor oral health status, mouth condition compared to others the same age, mouth problems (mouth sores, difficulty eating, dry mouth, bad breath, and/or jaw pain), teeth problems (toothache; broken/missing fillings or teeth; loose, crooked, or stained teeth; and/or bleeding gums), and lack of social participation. Using descriptive statistics and multivariate logistic regression, we examined the association (p<0.05) between vision impairment and oral health outcomes by age group, sociodemographic, and other explanatory variables. RESULTS: Our study sample included 12,090 adults; 12.8% of adults aged 40-64 years reported vision impairment, and among them, 44.5% reported fair/poor oral health status and 47.2% reported any mouth problems. Among adults aged ≥65 years, 17.3% reported vision impairment, of whom 36.3% reported fair/poor oral health status and 57.3% reported any mouth problems. There is a strong association between vision impairment and poorer oral health of adults; adults aged 40-64 years with vision impairment reported 90% to 150% greater odds of oral health problems, including fair/poor oral health status, mouth problems, and teeth problems, compared to people without vision impairment. CONCLUSIONS: Oral health disparities exist between adults with and without vision impairment. Targeted interventions are required to improve oral health in this vulnerable population. |
Elevated serum anion gap in adults with moderate chronic kidney disease increases risk for progression to end stage renal disease
Banerjee T , Crews D , Wesson DE , McCulloch C , Johansen K , Saydah S , Rios Burrows N , Saran R , Gillespie B , Bragg-Gresham J , Powe NR . Am J Physiol Renal Physiol 2019 316 (6) F1244-F1253 BACKGROUND: Acid retention associated with reduced GFR exacerbates nephropathy progression in partial nephrectomy models of CKD and might be reflected in CKD patients with reduced eGFR by an increased anion gap (AG). METHODS: We explored the presence of AG and its association with CKD in 14,924 adults, aged ≥20 years and with eGFR ≥15 ml/min/1.73 m(2), enrolled in the National Health and Nutrition Examination Survey III, 1988-1994, using multivariable regression analysis. The model was adjusted for sociodemographic characteristics, diabetes, and hypertension. We further examined the association between AG and incident end-stage renal disease (ESRD) using frailty models, adjusting for demographics, clinical factors, BMI, serum albumin, bicarbonate, eGFR, and urinary albumin-to-creatinine ratio, by following 558 adults with moderate CKD for 12 years via the United States Renal Data System. Laboratory measures determined AG using the traditional, albumin-corrected, and full AG definitions. RESULTS: Individuals with moderate CKD (eGFR 30-59 ml/min/1.73 m(2)) had a greater AG than those with eGFR ≥60 ml/min/1.73 m(2) in multivariable regression analysis with adjustment for covariates. We found a graded relationship between the adjusted mean for all three definitions of AG and eGFR categories (p trend <0.0001). During follow-up, 9.2% of adults with moderate CKD developed ESRD. Those with AG in the highest tertile had a higher risk of ESRD, after adjusting for covariates in a frailty model (relative risk [95% CI] for traditional AG: 1.8 [1.2-2.3]), compared to those in the middle tertile. CONCLUSIONS: The data suggest that a high AG, even after adjusting for serum bicarbonate, is a contributing acid-base mechanism to CKD progression in moderate CKD. |
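The three AG definitions named above (traditional, albumin-corrected, and full) are standard clinical formulas rather than constructs unique to this study; a sketch follows, with electrolytes in mEq/L and albumin in g/dL. The example values are illustrative, not taken from the study data.

```python
def traditional_ag(na, cl, hco3):
    """Traditional anion gap (mEq/L): Na - (Cl + HCO3)."""
    return na - (cl + hco3)

def albumin_corrected_ag(na, cl, hco3, albumin_g_dl):
    """Anion gap corrected for hypoalbuminemia: adds ~2.5 mEq/L
    per 1 g/dL of albumin below a reference of 4.0 g/dL."""
    return traditional_ag(na, cl, hco3) + 2.5 * (4.0 - albumin_g_dl)

def full_ag(na, k, cl, hco3):
    """Full anion gap including potassium: (Na + K) - (Cl + HCO3)."""
    return (na + k) - (cl + hco3)

# Illustrative labs: Na 140, K 4.0, Cl 104, HCO3 24 (mEq/L), albumin 3.0 g/dL
ag = traditional_ag(140, 104, 24)
ag_corr = albumin_corrected_ag(140, 104, 24, 3.0)
ag_full = full_ag(140, 4.0, 104, 24)
```

Because hypoalbuminemia is common in CKD and albumin is itself an unmeasured anion, the albumin-corrected definition raises the apparent gap when albumin is low, which is why the study reports results under all three definitions.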
Poor accordance to a DASH dietary pattern is associated with higher risk of ESRD among adults with moderate chronic kidney disease and hypertension
Banerjee T , Crews DC , Tuot DS , Pavkov ME , Burrows NR , Stack AG , Saran R , Bragg-Gresham J , Powe NR . Kidney Int 2019 95 (6) 1433-1442 The Dietary Approaches to Stop Hypertension (DASH) diet lowers blood pressure, an important risk factor for chronic kidney disease (CKD) and end-stage renal disease (ESRD). However, it is unclear whether adherence to a DASH diet confers protection against future ESRD, especially among those with pre-existing CKD and hypertension. We examined whether a DASH diet is associated with lower risk of ESRD among 1,110 adults aged ≥20 years with hypertension and CKD (estimated glomerular filtration rate, eGFR 30-59 ml/min/1.73 m(2)) enrolled in the National Health and Nutrition Examination Survey (1988-1994). Baseline DASH diet accordance score was assessed using a 24-hour dietary recall questionnaire. ESRD was ascertained by linkage to the U.S. Renal Data System registry. We used the Fine-Gray competing risks method to estimate the relative hazard (RH) for ESRD after adjusting for sociodemographics, clinical and nutritional factors, eGFR, and albuminuria. Over a median follow-up of 7.8 years, 18.4% of subjects developed ESRD. Compared to the highest quintile of DASH diet accordance, there was a greater risk of ESRD among subjects in quintiles 1 (RH=1.7; 95% CI 1.1-2.7) and 2 (RH 2.2; 95% CI 1.1-4.1). Significant interactions were observed with diabetes status and race/ethnicity, with the strongest association between DASH diet adherence and ESRD risk observed in individuals with diabetes and in non-Hispanic blacks. Low accordance to a DASH diet is associated with greater risk of ESRD in adults with moderate CKD and hypertension, particularly in non-Hispanic blacks and persons with diabetes. |
Firefighter hood contamination: Efficiency of laundering to remove PAHs and FRs
Mayer AC , Fent KW , Bertke S , Horn GP , Smith DL , Kerber S , La Guardia MJ . J Occup Environ Hyg 2018 16 (2) 1-32 Firefighters are occupationally exposed to products of combustion containing polycyclic aromatic hydrocarbons (PAHs) and flame retardants (FRs), potentially contributing to their increased risk for certain cancers. Personal protective equipment (PPE), including firefighter hoods, helps to reduce firefighters' exposure to toxic substances during fire responses by providing a layer of material on which contaminants deposit before reaching the firefighter's skin. However, over time, hoods that retain some contamination may actually contribute to firefighters' systemic dose. We investigated the effectiveness of laundering to reduce or remove contamination on the hoods, specifically PAHs and three classes of FRs: polybrominated diphenyl ethers (PBDEs), non-PBDE flame retardants (NPBFRs), and organophosphate flame retardants (OPFRs). Participants in the study were grouped into crews of 12 firefighters who worked in pairs by job assignment while responding to controlled fires in a single-family residential structure. For each pair of firefighters, one hood was laundered after every scenario and one was not. Bulk samples of the routinely laundered and unlaundered hoods from five pairs of firefighters were collected and analyzed. Residual levels of OPFRs, NPBFRs, and PAHs were lower in the routinely laundered hoods, with total levels of each class of chemicals being 56-81% lower, on average, than the unlaundered hoods. PBDEs, on average, were 43% higher in the laundered hoods, most likely from cross contamination. After this initial testing, four of the five unlaundered exposed hoods were subsequently laundered with other heavily exposed (unlaundered) and unexposed (new) hoods. Post-laundering evaluation of these hoods revealed increased levels of PBDEs, NPBFRs, and OPFRs in both previously exposed and unexposed hoods, indicating cross contamination. 
For PAHs, there was little evidence of cross contamination and the exposed hoods were significantly less contaminated after laundering (76% reduction; p = 0.011). Further research is needed to understand how residual contamination on hoods could contribute to firefighters' systemic exposures. |
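The laundering effect sizes reported above (56-81% lower FR levels, a 76% PAH reduction) are simple pre/post percent reductions. The arithmetic, with hypothetical contaminant levels, is:

```python
def percent_reduction(pre, post):
    """Percent reduction in contaminant level from pre- to post-laundering."""
    return 100.0 * (pre - post) / pre

# Hypothetical hood PAH levels: 50 ug/g before laundering, 12 ug/g after
print(percent_reduction(50.0, 12.0))  # 76.0
```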
Race/ethnicity, dietary acid load, and risk of end-stage renal disease among US adults with chronic kidney disease
Crews DC , Banerjee T , Wesson DE , Morgenstern H , Saran R , Burrows NR , Williams DE , Powe NR . Am J Nephrol 2018 47 (3) 174-181 BACKGROUND: Dietary acid load (DAL) contributes to the risk of CKD and CKD progression. We sought to determine the relation of DAL to racial/ethnic differences in the risk of end-stage renal disease (ESRD) among persons with CKD. METHODS: Among 1,123 non-Hispanic black (NHB) and non-Hispanic white (NHW) National Health and Nutrition Examination Survey III participants with estimated glomerular filtration rate 15-59 mL/min/1.73 m2, DAL was estimated using the Remer and Manz net acid excretion (NAEes) formula and 24-h dietary recall. ESRD events were ascertained via linkage with Medicare. A competing risk model (accounting for death) was used to estimate the hazard ratio (HR) for treated ESRD, comparing NHBs with NHWs, adjusting for demographic, clinical, and nutritional factors (body surface area, total caloric intake, serum bicarbonate, protein intake), and NAEes. Additionally, we tested whether the relation of NAEes with ESRD risk varied by race/ethnicity. RESULTS: At baseline, NHBs had greater NAEes (50.9 vs. 44.2 mEq/day) than NHWs. Overall, 22% developed ESRD over a median of 7.5 years. The unadjusted HR comparing NHBs to NHWs was 3.35 (95% CI 2.51-4.48), and the HR adjusted for the factors above was 1.68 (95% CI 1.18-2.38). A stronger association of NAEes with risk of ESRD was observed among NHBs (adjusted HR per mEq/day increase in NAEes 1.21, 95% CI 1.12-1.31) than among NHWs (HR 1.08, 95% CI 0.96-1.20); P for interaction (race/ethnicity × NAEes) = 0.004. CONCLUSIONS: Among US adults with CKD, the association of DAL with progression to ESRD is stronger among NHBs than NHWs. DAL is worthy of further investigation for its contribution to kidney outcomes across racial/ethnic groups. |
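The Remer and Manz NAEes estimate combines the potential renal acid load (PRAL) of reported nutrient intakes with an organic-acid term scaled to body surface area. A sketch using the commonly published coefficients; the study's exact implementation may differ, and the intake values below are hypothetical:

```python
def pral_meq_per_day(protein_g, phosphorus_mg, potassium_mg, magnesium_mg, calcium_mg):
    """Potential renal acid load (mEq/day) from 24-h dietary recall intakes."""
    return (0.49 * protein_g + 0.037 * phosphorus_mg
            - 0.021 * potassium_mg - 0.026 * magnesium_mg
            - 0.013 * calcium_mg)

def nae_es(pral, body_surface_area_m2):
    """Estimated net acid excretion: PRAL plus organic acids (~41 mEq/day per 1.73 m2 BSA)."""
    return pral + body_surface_area_m2 * 41.0 / 1.73

# Hypothetical intakes: 70 g protein, 1200 mg P, 2600 mg K, 300 mg Mg, 800 mg Ca
p = pral_meq_per_day(70, 1200, 2600, 300, 800)
print(round(nae_es(p, 1.73), 1))  # 46.9
```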
Laboratory-based respiratory virus surveillance pilot project on select cruise ships in Alaska, 2013-15
Rogers KB , Roohi S , Uyeki TM , Montgomery D , Parker J , Fowler NH , Xu X , Ingram DJ , Fearey D , Williams SM , Tarling G , Brown CM , Cohen NJ . J Travel Med 2017 24 (6) Background: Influenza outbreaks can occur among passengers and crews during the Alaska summertime cruise season. Ill travellers represent a potential source for introduction of novel or antigenically drifted influenza virus strains to the United States. From May to September 2013-2015, the Alaska Division of Public Health, the Centers for Disease Control and Prevention (CDC), and two cruise lines implemented a laboratory-based public health surveillance project to detect influenza and other respiratory viruses among ill crew members and passengers on select cruise ships in Alaska. Methods: Cruise ship medical staff collected 2-3 nasopharyngeal swab specimens per week from passengers and crew members presenting to the ship infirmary with acute respiratory illness (ARI). Specimens were tested for respiratory viruses at the Alaska State Virology Laboratory (ASVL); a subset of specimens positive for influenza virus were sent to CDC for further antigenic characterization. Results: Of 410 nasopharyngeal specimens, 83% tested positive for at least one respiratory virus; 71% tested positive for influenza A or B virus. Antigenic characterization of pilot project specimens identified strains matching predominant circulating seasonal influenza virus strains, which were included in the northern or southern hemisphere influenza vaccines during those years. Results were relatively consistent across age groups, recent travel history, and influenza vaccination status. Onset dates of illness relative to date of boarding differed between northbound (occurring later in the voyage) and southbound (occurring within the first days of the voyage) cruises. Conclusions: The high yield of positive results indicated that influenza was common among passengers and crews sampled with ARI. 
This finding reinforces the need to bolster influenza prevention and control activities on cruise ships. Laboratory-based influenza surveillance on cruise ships may augment inland influenza surveillance and inform control activities. However, these benefits should be weighed against the costs and operational limitations of instituting laboratory-based surveillance programs on ships. |
Philadelphia telemedicine glaucoma detection and follow-up study: Methods and screening results
Hark LA , Katz LJ , Myers JS , Waisbourd M , Johnson D , Pizzi LT , Leiby BE , Fudemberg SJ , Mantravadi AV , Henderer JD , Zhan T , Molineaux J , Doyle V , Divers M , Burns C , Murchison AP , Reber S , Resende A , Bui TDV , Lee J , Crews JE , Saaddine JB , Lee PP , Pasquale LR , Haller JA . Am J Ophthalmol 2017 181 114-124 PURPOSE: To describe the methodology and screening results from the Philadelphia Telemedicine Glaucoma Detection and Follow-up Study. DESIGN: Screening program results for a prospective, randomized clinical trial. MATERIALS AND METHODS: Individuals were recruited who were African-American, Hispanic/Latino, or Asian and over age 40 years; Caucasian and over age 65 years; or of any ethnicity, over age 40 years, with a family history of glaucoma or diabetes. Primary care offices and Federally Qualified Health Centers were used for telemedicine (Visit 1). Two posterior fundus photographs and 1 anterior segment photograph were captured per eye in each participant, using a non-mydriatic, auto-focus, hand-held fundus camera (Volk Optical, Mentor, Ohio, USA). Medical and ocular history, family history of glaucoma, visual acuity, and intraocular pressure measurements using the iCare rebound tonometer (iCare, Helsinki, Finland) were obtained. Images were read remotely by a trained retina reader and a glaucoma specialist. RESULTS: From 4/1/15 to 2/6/17, 906 individuals consented and attended Visit 1. Of these, 553 participants were female (61.0%) and 550 were African American (60.7%), with a mean age of 58.7 years. A total of 532 (58.7%) participants had diabetes, and 616 (68%) had a history of hypertension. During Visit 1, 356 (39.3%) participants were graded with a normal image. Using image data from the worse eye, 333 (36.8%) were abnormal and 155 (17.1%) were unreadable. A total of 258 (28.5%) had a suspicious nerve; 62 (6.8%) had ocular hypertension; 102 (11.3%) had diabetic retinopathy; and 68 (7.5%) had other retinal abnormalities. 
CONCLUSION: An integrated telemedicine screening intervention in primary care offices and Federally Qualified Health Centers detected a high rate of suspicious optic nerves, ocular hypertension, and retinal pathology. |
The prevalence of chronic conditions and poor health among people with and without vision impairment, aged ≥ 65 years, 2010-2014
Crews JE , Chou CF , Sekar S , Saaddine JB . Am J Ophthalmol 2017 182 18-30 PURPOSE: To examine the prevalence of 13 chronic conditions and fair/poor health among people aged ≥65 years in the U.S. with and without vision impairment. DESIGN: Cross-sectional study from the 2010-2014 National Health Interview Survey. METHODS: We examined hypertension, heart disease, high cholesterol, stroke, arthritis, asthma, chronic obstructive pulmonary disease (COPD), cancer, weak/failing kidneys, diabetes, hepatitis, depression, and hearing impairment. We used logistic regression to assess the association between vision impairment and chronic conditions and the association between vision impairment and poor health among those with chronic conditions. RESULTS: People aged ≥65 years with vision impairment reported a greater prevalence of chronic conditions than people without vision impairment. After controlling for covariates (age, sex, education, race, smoking, physical activity, and obesity), people with vision impairment were more likely than those without to report chronic conditions (hypertension: OR [odds ratio] 1.43; heart disease: OR 1.68; high cholesterol: OR 1.26; stroke: OR 1.99; arthritis: OR 1.71; asthma: OR 1.56; COPD: OR 1.65; cancer: OR 1.23; weak/failing kidneys: OR 2.29; diabetes: OR 1.56; hepatitis: OR 1.30; depression: OR 1.47; hearing impairment: OR 1.91) (all P<0.05). Among older people with chronic conditions, those with vision impairment were 1.66 to 2.98 times more likely to report fair/poor health than those without vision impairment (all P<0.05). CONCLUSION: Among older people, chronic conditions are strongly associated with vision impairment, and among those with chronic conditions, poor health is strongly associated with vision impairment. |
Applying RE-AIM to evaluate two community-based programs designed to improve access to eye care for those at high-risk for glaucoma
Sapru S , Berktold J , Crews JE , Katz LJ , Hark L , Girkin CA , Owsley C , Francis B , Saaddine JB . Eval Program Plann 2017 65 40-46 INTRODUCTION: Glaucoma is a leading cause of vision loss and blindness in the U.S. Risk factors include African American race, older age, family history of glaucoma, and diabetes. This paper describes the evaluation of a mobile eye health and a telemedicine program designed to improve access to eye care among people at high-risk for glaucoma. METHODS: The RE-AIM (reach, efficacy, adoption, implementation, and maintenance) evaluation framework was used to harmonize indicators. Both programs provided community-based eye health education and eye services related to glaucoma detection and care. Each program reported data on participants and community partners. An external evaluator conducted site visit interviews with program staff and community partners. Quantitative and qualitative data were integrated and analyzed using the RE-AIM dimensions. DISCUSSION: By targeting high-risk populations and providing comprehensive eye exams, both programs detected a large proportion of new glaucoma-related cases (17-19%) - a much larger proportion than that found in the general population (<2%). The educational intervention increased glaucoma knowledge; evidence that it led people to seek eye care was inconclusive. CONCLUSIONS: Evaluation findings from the mobile eye health program and the telemedicine program may provide useful information for wider implementation in public health clinics and in optometrist clinics located in retail outlets. |
Costs of a community-based glaucoma detection programme: analysis of the Philadelphia Glaucoma Detection and Treatment Project
Pizzi LT , Waisbourd M , Hark L , Sembhi H , Lee P , Crews JE , Saaddine JB , Steele D , Katz LJ . Br J Ophthalmol 2017 102 (2) 225-232 BACKGROUND: Glaucoma is the foremost cause of irreversible blindness, and more than 50% of cases remain undiagnosed. Our objective was to report the costs of a glaucoma detection programme operationalised through Philadelphia community centres. METHODS: The analysis was performed using a healthcare system perspective in 2013 US dollars. Costs of examination and educational workshops were captured. Measures were total programme costs, cost/case of glaucoma detected and cost/case of any ocular disease detected (including glaucoma). Diagnoses are reported at the individual level (therefore representing a diagnosis made in one or both eyes). Staff time was captured during site visits to 15 of 43 sites and included time to deliver examinations and workshops, supervision, training and travel. Staff time was converted to costs by applying wage and fringe benefit costs from the US Bureau of Labor Statistics. Non-staff costs (equipment and mileage) were collected using study logs. Participants with previously diagnosed glaucoma were excluded. RESULTS: 1649 participants were examined. Mean total per-participant examination time was 56 min (SD 4). Mean total examination cost/participant was $139. The cost/case of glaucoma newly identified (open-angle glaucoma, angle-closure glaucoma, glaucoma suspect, or primary angle closure) was $420 and cost/case for any ocular disease identified was $273. CONCLUSION: Glaucoma examinations delivered through this programme provided significant health benefit to hard-to-reach communities. On a per-person basis, examinations were fairly low cost, though opportunities exist to improve efficiency. Findings serve as an important benchmark for planning future community-based glaucoma examination programmes. |
Contamination of firefighter personal protective equipment and skin and the effectiveness of decontamination procedures
Fent KW , Alexander B , Roberts J , Robertson S , Toennis C , Sammons D , Bertke S , Kerber S , Smith D , Horn G . J Occup Environ Hyg 2017 14 (10) 0 Firefighters' skin may be exposed to chemicals via permeation/penetration of combustion byproducts through or around personal protective equipment (PPE) or from the cross-transfer of contaminants on PPE to the skin. Additionally, volatile contaminants can evaporate from PPE following a response and be inhaled by firefighters. Using polycyclic aromatic hydrocarbons (PAHs) and volatile organic compounds (VOCs) as respective markers for non-volatile and volatile substances, we investigated the contamination of firefighters' turnout gear and skin following controlled residential fire responses. Participants were grouped into three crews of twelve firefighters. Each crew was deployed to a fire scenario (one per day, four total) and then paired up to complete six fireground job assignments. Wipe sampling of the exterior of the turnout gear was conducted pre- and post-fire. Wipe samples were also collected from a subset of the gear after field decontamination. VOCs off-gassing from gear were also measured pre-fire, post-fire, and post-decon. Wipe sampling of the firefighters' hands and neck was conducted pre- and post-fire. Additional wipes were collected after cleaning neck skin. PAH levels on turnout gear increased after each response and were greatest for gear worn by firefighters assigned to fire attack and to search and rescue activities. Field decontamination using dish soap, water, and scrubbing was able to reduce PAH contamination on turnout jackets by a median of 85%. Off-gassing VOC levels increased post-fire and then decreased 17-36 minutes later regardless of whether field decontamination was performed. Median post-fire PAH levels on the neck were near or below the limit of detection (< 24 micrograms per square meter [microg/m2]) for all positions. 
For firefighters assigned to attack, search, and outside ventilation, the 75th percentile values on the neck were 152, 71.7, and 39.3 microg/m2, respectively. Firefighters assigned to attack and search had higher post-fire median hand contamination (135 and 226 microg/m2, respectively) than other positions (< 10.5 microg/m2). Cleansing wipes were able to reduce PAH contamination on neck skin by a median of 54%. |
Optimizing glaucoma screening in high risk population: design and 1-year findings of the Screening to Prevent (SToP) Glaucoma study
Zhao D , Guallar E , Gajwani P , Swenor B , Crews J , Saaddine J , Mudie L , Varadaraj V , Friedman DS . Am J Ophthalmol 2017 180 18-28 PURPOSE: To develop, implement, and evaluate a replicable community-based screening intervention designed to improve the detection of glaucoma and other eye diseases, and follow-up care, in high-risk populations in the United States. We present the design of the study and describe the findings of the first year of the program. DESIGN: Prospective study to evaluate screening and follow-up. METHODS: This is an ongoing study to develop an eye screening program using trained personnel to identify individuals with ophthalmic needs, focusing on African Americans ≥50 years of age at multiple inner-city community sites in Baltimore, MD. The screening examination uses a sequential referral approach and assesses presenting visual acuity (VA), best-corrected VA, digital fundus imaging, visual field testing, and measurement of intraocular pressure. RESULTS: We screened 901 individuals between Jan 2015 and Oct 2015. Subjects were mostly African American (94.9%), with a mean (SD) age of 64.3 (9.9) years. Among them, 356 (39.5%) participants were referred for a definitive eye exam and 107 (11.9%) only needed prescription glasses. The most common reasons for referral were an ungradable fundus image (39.3% of those referred), best-corrected VA < 20/40 (14.6%), and ungradable autorefraction (11.8%). Among people referred for a definitive exam, 153 (43%) attended their scheduled exam. The most common diagnoses at the definitive exam were glaucoma and cataract (51% and 40%, respectively). CONCLUSIONS: A large proportion of individuals screened required ophthalmic services, particularly those who were older and less well educated. To reach and encourage these individuals to attend screenings and follow-up exams, programs could develop innovative strategies and approaches. |
Food insecurity, CKD, and subsequent ESRD in US adults
Banerjee T , Crews DC , Wesson DE , Dharmarajan S , Saran R , Rios Burrows N , Saydah S , Powe NR . Am J Kidney Dis 2017 70 (1) 38-47 BACKGROUND: Poor access to food among low-income adults has been recognized as a risk factor for chronic kidney disease (CKD), but there are no data for the impact of food insecurity on progression to end-stage renal disease (ESRD). We hypothesized that food insecurity would be independently associated with risk for ESRD among persons with and without earlier stages of CKD. STUDY DESIGN: Longitudinal cohort study. SETTING & PARTICIPANTS: 2,320 adults (aged ≥ 20 years) with CKD and 10,448 adults with no CKD enrolled in NHANES III (1988-1994) with household income ≤ 400% of the federal poverty level linked to the Medicare ESRD Registry for a median follow-up of 12 years. PREDICTOR: Food insecurity, defined as an affirmative response to the food-insecurity screening question. OUTCOME: Development of ESRD. MEASUREMENTS: Demographics, income, diabetes, hypertension, estimated glomerular filtration rate, and albuminuria. Dietary acid load was estimated from 24-hour dietary recall. We used a Fine-Gray competing-risk model to estimate the relative hazard (RH) for ESRD associated with food insecurity after adjusting for covariates. RESULTS: 4.5% of adults with CKD were food insecure. Food-insecure individuals were more likely to be younger and have diabetes (29.9%), hypertension (73.9%), or albuminuria (90.4%) as compared with their counterparts (P<0.05). Median dietary acid load in the food-secure versus food-insecure group was 51.2 mEq/d versus 55.6 mEq/d, respectively (P=0.05). Food-insecure adults were more likely to develop ESRD (RH, 1.38; 95% CI, 1.08-3.10) compared with food-secure adults after adjustment for demographics, income, diabetes, hypertension, estimated glomerular filtration rate, and albuminuria. In the non-CKD group, 5.7% were food insecure. 
We did not find a significant association between food insecurity and ESRD (RH, 0.77; 95% CI, 0.40-1.49). LIMITATIONS: Use of single 24-hour diet recall; lack of laboratory follow-up data and measure of changes in food insecurity over time; follow-up of cohort ended 10 years ago. CONCLUSIONS: Among adults with CKD, food insecurity was independently associated with a higher likelihood of developing ESRD. Innovative approaches to address food insecurity should be tested for their impact on CKD outcomes. |
Notes from the field: Occupational lead exposures at a shipyard - Douglas County, Wisconsin, 2016
Weiss D , Yendell SJ , Baertlein LA , Christensen KY , Tomasallo CD , Creswell PD , Camponeschi JL , Meiman JG , Anderson HA . MMWR Morb Mortal Wkly Rep 2017 66 (1) 34 On March 28, 2016, the Minnesota Poison Control System was consulted by an emergency department provider regarding clinical management of a shipyard worker with a blood lead level (BLL) >60 μg/dL; the National Institute for Occupational Safety and Health defines elevated BLLs as ≥5 μg/dL (1). The Minnesota Poison Control System notified the Minnesota Department of Health (MDH). Concurrently, the Wisconsin Department of Health Services (WDHS) received laboratory reports concerning two workers from the same shipyard with BLLs >40 μg/dL. These three workers had been retrofitting the engine room of a 690-foot vessel since January 4, 2016. | Work was suspended during March 29–April 4 in the vessel’s engine room, the presumptive primary source of lead exposure. On March 29, the shipyard partnered with a local occupational health clinic to provide testing for workers. Employees and their household members were also tested by general practitioners and local laboratories. The shipyard hired sanitation crews for lead clean-up and abatement and provided personal protective equipment for its employees. On April 1, WDHS and MDH issued advisories to alert regional health care organizations, local public health agencies, and tribal health departments to the situation and launched a joint investigation on April 4. Subsequently, WDHS activated its Incident Command System and worked with MDH to compile a list of potentially exposed workers. By August 31, a total of 357 workers who might have been employed at the shipyard during December 2015–March 2016 had been identified. | During April–July 2016, WDHS and MDH attempted telephone interviews with workers. 
The goal of the interviews was to gather information regarding employment history, work tasks, personal exposure prevention, symptoms commonly associated with lead exposure, take-home contamination prevention, and household composition, and to convey health messages. |
Eye Care Quality and Accessibility Improvement in the Community (EQUALITY): Impact of an eye health education program on patient knowledge about glaucoma and attitudes about eye care
Rhodes LA , Huisingh CE , McGwin G Jr , Mennemeyer ST , Bregantini M , Patel N , Saaddine J , Crews JE , Girkin CA , Owsley C . Patient Relat Outcome Meas 2016 7 37-48 PURPOSE: To assess the impact of the education program of the Eye Care Quality and Accessibility Improvement in the Community (EQUALITY) telemedicine program on at-risk patients' knowledge about glaucoma and attitudes about eye care as well as to assess patient satisfaction with EQUALITY. PATIENTS AND METHODS: New or existing patients presenting for a comprehensive eye exam (CEE) at one of two retail-based primary eye clinics were enrolled based on ≥1 of the following at-risk criteria for glaucoma: African Americans ≥40 years of age, Whites ≥50 years of age, diabetes, family history of glaucoma, and/or preexisting diagnosis of glaucoma. A total of 651 patients were enrolled. A questionnaire was administered prior to the patients' CEE and prior to the patients receiving any of the evidence-based eye health education program; a follow-up questionnaire was administered 2-4 weeks later by phone. Baseline and follow-up patient responses regarding knowledge about glaucoma and attitudes about eye care were compared using McNemar's test. Logistic regression models were used to assess the association of patient-level characteristics with improvement in knowledge and attitudes. Overall patient satisfaction was summarized. RESULTS: At follow-up, all patient responses in the knowledge and attitude domains significantly improved from baseline (P≤0.01 for all questions). Those who were unemployed (odds ratio =0.63, 95% confidence interval =0.42-0.95, P=0.026) or had lower education (odds ratio =0.55, 95% confidence interval =0.29-1.02, P=0.058) were less likely to improve their knowledge after adjusting for age, sex, race, and prior glaucoma diagnosis. This association was attenuated after further adjustment for other patient-level characteristics. 
Ninety-eight percent (n=501) of patients reported being likely to have a CEE within the next 2 years, whereas 63% (n=326) had a CEE in the previous 2 years. Patient satisfaction with EQUALITY was high (99%). CONCLUSION: Improved knowledge about glaucoma and a high intent to pursue eye care may lead to improved detection of early disease, thus lowering the risk of blindness. |
The Philadelphia Glaucoma Detection and Treatment Project: Detection rates and initial management
Waisbourd M , Pruzan NL , Johnson D , Ugorets A , Crews JE , Saaddine JB , Henderer JD , Hark LA , Katz LJ . Ophthalmology 2016 123 (8) 1667-1674 PURPOSE: To evaluate the detection rates of glaucoma-related diagnoses and the initial treatments received in the Philadelphia Glaucoma Detection and Treatment Project, a community-based initiative aimed at improving the detection, treatment, and follow-up care of individuals at risk for glaucoma. DESIGN: Retrospective analysis. PARTICIPANTS: A total of 1649 individuals at risk for glaucoma who were examined and treated in 43 community centers located in underserved communities of Philadelphia. METHODS: Individuals were enrolled if they were African American aged ≥50 years, were any other adult aged ≥60 years, or had a family history of glaucoma. After attending an informational glaucoma workshop, participants underwent a targeted glaucoma examination including an ocular, medical, and family history; visual acuity testing, intraocular pressure (IOP) measurement, and corneal pachymetry; slit-lamp and optic nerve examination; automated visual field testing; and fundus color photography. If indicated, treatments included selective laser trabeculoplasty (SLT), laser peripheral iridotomy (LPI), or IOP-lowering medications. Follow-up examinations were scheduled at the community sites after 4 to 6 weeks or 4 to 6 months, depending on the clinical scenario. MAIN OUTCOME MEASURES: Detection rates of glaucoma-related diagnoses and types of treatments administered. RESULTS: Of the 1649 individuals enrolled, 645 (39.1%) received a glaucoma-related diagnosis; 20.0% (n = 330) were identified as open-angle glaucoma (OAG) suspects, 9.2% (n = 151) were identified as having narrow angles (or as a primary angle closure/suspect), and 10.0% (n = 164) were diagnosed with glaucoma, including 9.0% (n = 148) with OAG and 1.0% (n = 16) with angle-closure glaucoma. 
Overall, 39.0% (n = 64 of 164) of those diagnosed with glaucoma were unaware of their diagnosis. A total of 196 patients (11.9%) received glaucoma-related treatment, including 84 (5.1%) who underwent LPI, 13 (0.8%) who underwent SLT, and 103 (6.2%) who were prescribed IOP-lowering medication. CONCLUSIONS: Targeting individuals at risk for glaucoma in underserved communities in Philadelphia yielded a high detection rate (39.1%) of glaucoma-related diagnoses. Providing examinations and offering treatment, including first-line laser procedures, at community-based sites serving older adults is an effective way to improve access to eye care for underserved populations. |
Falls among persons aged ≥65 years with and without severe vision impairment - United States, 2014
Crews JE , Chou CF , Stevens JA , Saaddine JB . MMWR Morb Mortal Wkly Rep 2016 65 (17) 433-7 In 2014, an estimated 2.8 million persons aged ≥65 years in the United States reported severe vision impairment defined as being blind or having severe difficulty seeing, even with eyeglasses. Good vision is important for maintaining balance as well as for identifying low-contrast hazards, estimating distances, and discerning spatial relationships. Conversely, having poor vision increases the risk for falls (1,2). Falls among older adults are common and can cause serious injuries, disabilities, and premature death (1,3). To date, no state-level investigations have examined the annual prevalence of falls among persons with and without severe vision impairment. CDC analyzed data from the 2014 Behavioral Risk Factor Surveillance System (BRFSS) to estimate the state-specific annual prevalence of falls among persons aged ≥65 years with and without self-reported severe vision impairment. Overall, 46.7% of persons with, and 27.7% of older adults without, self-reported severe vision impairment reported having fallen during the previous year. The state-specific annual prevalence of falls among persons aged ≥65 years with severe vision impairment ranged from 30.8% (Hawaii) to 59.1% (California). In contrast, the prevalence of falls among persons aged ≥65 years without severe vision impairment ranged from 20.4% (Hawaii) to 32.4% (Alaska). Developing fall-prevention interventions intended for persons with severe vision impairment will help states manage the impact of vision impairment and falls on health care resources, and can inform state-specific fall prevention initiatives. |