Determination of iodine content in dairy products by inductively coupled plasma mass spectrometry
Vance KA , Makhmudov A , Shakirova G , Roenfanz H , Jones RL , Caldwell KL . At Spectrosc 2018 39 (3) 95-99 A probing study was performed to establish a reliable and robust method for determining iodine concentration with the ELAN DRC II ICP-MS, in combination with a sample digestion and filtration step. Dairy products from locally available sources were evaluated to help determine the possibility of, and need for, further evaluations in relation to the U.S. population's iodine intake. Prior to analysis, the samples were aliquoted and digested for 3 hours at 90 ± 3 °C. Dilution and filtration were performed following the digestion. The sample extract was analyzed, and the results were confirmed against NIST SRM 1549a Whole Milk Powder. Further experimentation will be needed to optimize the method for projected sample concentrations and throughput. |
LAMP: A CDC program to ensure the quality of blood-lead laboratory measurements
Caldwell KL , Cheng PY , Vance KA , Makhmudov A , Jarrett JM , Caudill SP , Ho DP , Jones RL . J Public Health Manag Pract 2019 25 S23-S30 CONTEXT: The Lead and Multielement Proficiency (LAMP) program is an external quality assurance program promoting high-quality blood-lead measurements. OBJECTIVES: To investigate the ability of US laboratories participating in the Centers for Disease Control and Prevention (CDC) LAMP program to accurately measure blood-lead levels (BLL) of 0.70 to 47.5 µg/dL, using evaluation criteria of ±2 µg/dL or 10%, whichever is greater. METHODS: The CDC distributes bovine blood specimens to participating laboratories 4 times per year. We evaluated participant performance over 5 challenges on samples with BLL between 0.70 and 47.5 µg/dL. The CDC sent 15 pooled samples (3 samples shipped in 5 rounds) to US laboratories. The LAMP laboratories used 3 primary technologies to analyze lead in blood: inductively coupled plasma mass spectrometry, graphite furnace atomic absorption spectroscopy, and LeadCare technologies based on anodic stripping voltammetry. Laboratories reported their BLL analytical results to the CDC. LAMP uses these results to provide performance feedback to the laboratories. SETTING: The CDC sent blood samples to approximately 50 US laboratories for lead analysis. PARTICIPANTS: Of the approximately 200 laboratories enrolled in LAMP, 38 to 46 US laboratories provided data used in this report (January 2017 to March 2018). RESULTS: Laboratory precision ranged from 0.26 µg/dL for inductively coupled plasma mass spectrometry to 1.50 µg/dL for LeadCare instruments. Participating US LAMP laboratories reported accurate BLL for 89% of challenge samples, using the ±2 µg/dL or 10% evaluation criteria. CONCLUSIONS: Laboratories participating in the CDC's LAMP program can accurately measure blood lead using the current Clinical Laboratory Improvement Amendments of 1988 (CLIA) guidance of ±4 µg/dL or ±10%, with a success rate of 96%. However, when we apply limits of ±2 µg/dL or ±10%, the success rate drops to 89%. When challenged with samples that have target values between 3 and 5 µg/dL, nearly 100% of reported results fall within ±4 µg/dL, while 5% of the results fall outside the acceptability criteria used by the CDC's LAMP program. As public health focuses on lower blood lead levels, laboratories must evaluate their ability to meet the analytical challenges of measuring blood lead at these levels. In addition, the proposed CLIA guidelines (±2 µg/dL or 10%) would be achievable for a majority of US laboratories participating in the LAMP program. |
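The "within ±2 µg/dL or 10% of the target, whichever is greater" rule is straightforward to apply; the R sketch below illustrates it with a hypothetical helper function and example values (this is not LAMP program code).

```r
# Illustrative check of the "+/-2 ug/dL or 10%, whichever is greater" rule described above.
# The function name and example values are hypothetical, not taken from the LAMP program.
is_acceptable <- function(reported, target, abs_limit = 2, rel_limit = 0.10) {
  abs(reported - target) <= pmax(abs_limit, rel_limit * target)
}

is_acceptable(4.9, 2.5)                 # FALSE: allowance is max(2, 0.25) = 2 ug/dL, error is 2.4
is_acceptable(4.9, 2.5, abs_limit = 4)  # TRUE under the CLIA-88 style +/-4 ug/dL or 10% rule
```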
Comparison of nicotine and toxicant exposure in users of electronic cigarettes and combustible cigarettes
Goniewicz ML , Smith DM , Edwards KC , Blount BC , Caldwell KL , Feng J , Wang L , Christensen C , Ambrose B , Borek N , van Bemmel D , Konkel K , Erives G , Stanton CA , Lambert E , Kimmel HL , Hatsukami D , Hecht SS , Niaura RS , Travers M , Lawrence C , Hyland AJ . JAMA Netw Open 2018 1 (8) e185937 Importance: Use of electronic cigarettes (e-cigarettes) is increasing. Measures of exposure to known tobacco-related toxicants among e-cigarette users will inform potential health risks to individual product users. Objectives: To estimate concentrations of tobacco-related toxicants among e-cigarette users and compare these biomarker concentrations with those observed in combustible cigarette users, dual users, and never tobacco users. Design, Setting, and Participants: A population-based, longitudinal cohort study was conducted in the United States in 2013-2014. Cross-sectional analysis was performed between November 4, 2016, and October 5, 2017, of biomarkers of exposure to tobacco-related toxicants collected by the Population Assessment of Tobacco and Health Study. Participants included adults who provided a urine sample and data on tobacco use (N = 5105). Exposures: The primary exposure was tobacco use, including current exclusive e-cigarette users (n = 247), current exclusive cigarette smokers (n = 2411), and users of both products (dual users) (n = 792) compared with never tobacco users (n = 1655). Main Outcomes and Measures: Geometric mean concentrations of 50 individual biomarkers from 5 major classes of tobacco product constituents were measured: nicotine, tobacco-specific nitrosamines (TSNAs), metals, polycyclic aromatic hydrocarbons (PAHs), and volatile organic compounds (VOCs). Results: Of the 5105 participants, most were aged 35 to 54 years (weighted percentage, 38%; 95% CI, 35%-40%), women (60%; 95% CI, 59%-62%), and non-Hispanic white (61%; 95% CI, 58%-64%). Compared with exclusive e-cigarette users, never users had 19% to 81% significantly lower concentrations of biomarkers of exposure to nicotine, TSNAs, some metals (eg, cadmium and lead), and some VOCs (including acrylonitrile). Exclusive e-cigarette users showed 10% to 98% significantly lower concentrations of biomarkers of exposure, including TSNAs, PAHs, most VOCs, and nicotine, compared with exclusive cigarette smokers; concentrations were comparable for metals and 3 VOCs. Exclusive cigarette users showed 10% to 36% lower concentrations of several biomarkers than dual users. Frequency of cigarette use among dual users was positively correlated with nicotine and toxicant exposure. Conclusions and Relevance: Exclusive use of e-cigarettes appears to result in measurable exposure to known tobacco-related toxicants, generally at lower levels than cigarette smoking. Toxicant exposure is greatest among dual users, and frequency of combustible cigarette use is positively correlated with tobacco toxicant concentration. These findings provide evidence that using combusted tobacco cigarettes alone or in combination with e-cigarettes is associated with higher concentrations of potentially harmful tobacco constituents in comparison with using e-cigarettes alone. |
Lead exposure during childhood and subsequent anthropometry through adolescence in girls
Deierlein AL , Teitelbaum SL , Windham GC , Pinney SM , Galvez MP , Caldwell KL , Jarrett JM , Gajek R , Kushi LH , Biro F , Wolff MS . Environ Int 2018 122 310-315 INTRODUCTION: Cross-sectional studies suggest that postnatal blood lead (PbB) concentrations are negatively associated with child growth. Few studies prospectively examined this association in populations with lower PbB concentrations. We investigated longitudinal associations of childhood PbB concentrations and subsequent anthropometric measurements in a multi-ethnic cohort of girls. METHODS: Data were from The Breast Cancer and the Environment Research Program at three sites in the United States (U.S.): New York City, Cincinnati, and San Francisco Bay Area. Girls were enrolled at ages 6-8 years in 2004-2007. Girls with PbB concentrations collected at ≤10 years old (mean 7.8 years, standard deviation (SD) 0.82) and anthropometry collected at ≥3 follow-up visits were included (n=683). The median PbB concentration was 0.99 µg/dL (10th percentile=0.59 µg/dL and 90th percentile=2.00 µg/dL) and the geometric mean was 1.03 µg/dL (95% Confidence Interval (CI): 0.99, 1.06). For analyses, PbB concentrations were dichotomized as <1 µg/dL (n=342) and ≥1 µg/dL (n=341). Anthropometric measurements of height, body mass index (BMI), waist circumference (WC), and percent body fat (%BF) were collected at enrollment and follow-up visits through 2015. Linear mixed effects regression estimated how PbB concentrations related to changes in girls' measurements from ages 7-14 years. RESULTS: At 7 years, the mean difference in height was -2.0 cm (95% CI: -3.0, -1.0) for girls with ≥1 µg/dL versus <1 µg/dL PbB concentrations; differences persisted, but were attenuated, with age to -1.5 cm (95% CI: -2.5, -0.4) at 14 years. Mean differences for BMI, WC, and %BF at 7 years between girls with ≥1 µg/dL versus <1 µg/dL PbB concentrations were -0.7 kg/m² (95% CI: -1.2, -0.2), -2.2 cm (95% CI: -3.8, -0.6), and -1.8% (95% CI: -3.2, -0.4), respectively. Overall, these differences generally persisted with advancing age, and at 14 years the differences were -0.8 kg/m² (95% CI: -1.5, -0.02), -2.9 cm (95% CI: -4.8, -0.9), and -1.7% (95% CI: -3.1, -0.4) for BMI, WC, and %BF, respectively. CONCLUSIONS: These findings suggest that higher concentrations of PbB during childhood, even though relatively low by screening standards, may be inversely associated with anthropometric measurements in girls. |
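As a rough illustration of the modeling approach named in this abstract, the following R sketch fits a linear mixed effects model with lme4; the data frame and variable names are hypothetical and the specification is not the authors' exact model.

```r
# Hypothetical sketch of a linear mixed effects model for repeated anthropometry,
# with a random intercept per girl; not the authors' exact specification.
library(lme4)

# growth: one row per girl per visit, with columns id, age_yr, height_cm,
# and pbb_high (1 if childhood PbB >= 1 ug/dL, else 0)
fit <- lmer(height_cm ~ pbb_high * age_yr + (1 | id), data = growth)
summary(fit)  # the pbb_high terms describe the group difference and how it changes with age
```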
Iodine status and consumption of key iodine sources in the U.S. population with special attention to reproductive age women
Herrick KA , Perrine CG , Aoki Y , Caldwell KL . Nutrients 2018 10 (7) We estimated iodine status (median urinary iodine concentration, mUIC, in µg/L) for the US population (6 years and over; n = 4613) and women of reproductive age (WRA) (15-44 years; n = 901). We estimated mean intake of key iodine sources by race and Hispanic origin. We present the first national estimates of mUIC for non-Hispanic Asian persons and examine the intake of soy products, a potential source of goitrogens. One-third of National Health and Nutrition Examination Survey (NHANES) participants in 2011-2014 provided casual urine samples; UIC was measured in these samples. We assessed dietary intake with one 24-h recall and created food groups using the USDA's food/beverage coding scheme. For WRA, mUIC was 110 µg/L. For both non-Hispanic white (106 µg/L) and non-Hispanic Asian (81 µg/L) WRA, mUIC was significantly lower than mUIC among Hispanic WRA (133 µg/L). Non-Hispanic black WRA had a mUIC of 124 µg/L. Dairy consumption was significantly higher among non-Hispanic white WRA (162 g) compared to non-Hispanic black WRA (113 g). Soy consumption was also higher among non-Hispanic Asian WRA (18 g) compared to non-Hispanic black WRA (1 g). Differences in the consumption pattern of key sources of iodine and goitrogens may put subgroups of individuals at risk of mild iodine deficiency. Continued monitoring of iodine status and variations in consumption patterns is needed. |
Assessing the stability of Cd, Mn, Pb, Se, and total Hg in whole human blood by ICP-DRC-MS as a function of temperature and time
Tevis DS , Jarrett JM , Jones DR , Cheng PY , Franklin M , Mullinex N , Caldwell KL , Jones RL . Clin Chim Acta 2018 485 1-6 BACKGROUND: Comprehensive information on the effect of time and temperature storage on the measurement of elements in human whole blood (WB) by inductively coupled plasma-dynamic reaction cell-mass spectrometry (ICP-DRC-MS) is lacking, particularly for Mn and Se. METHODS: Human WB was spiked at 3 concentration levels, dispensed, and then stored at 5 different temperatures: -70 °C, -20 °C, 4 °C, 23 °C, and 37 °C. At 3 and 5 weeks, and at 2, 4, 6, 8, 10, 12, and 36 months, samples were analyzed for Pb, Cd, Mn, Se and total Hg, using ICP-DRC-MS. We used a multiple linear regression model including time and temperature as covariates to fit the data with the measurement value as the outcome. We used an equivalence test using ratios to determine if results from the test storage conditions, warmer temperature and longer time, were comparable to the reference storage condition of 3 weeks storage time at -70 °C. RESULTS: Model estimates for all elements in human WB samples stored in polypropylene cryovials at -70 °C were equivalent to estimates from samples stored at 37 °C for up to 2 months, 23 °C up to 10 months, and -20 °C and 4 °C for up to 36 months. Model estimates for samples stored for 3 weeks at -70 °C were equivalent to estimates from samples stored for 2 months at -20 °C, 4 °C, 23 °C and 37 °C; 10 months at -20 °C, 4 °C, and 23 °C; and 36 months at -20 °C and 4 °C. This equivalence was true for all elements and pools except for the low concentration blood pool for Cd. CONCLUSIONS: Storage temperatures of -20 °C and 4 °C are equivalent to -70 °C for stability of Cd, Mn, Pb, Se, and Hg in human whole blood for at least 36 months when blood is stored in sealed polypropylene vials. Increasing the sample storage temperature from -70 °C to -20 °C or 4 °C can lead to large energy savings. The best analytical results are obtained when storage time at higher temperature conditions (e.g., 23 °C and 37 °C) is minimized because recovery of Se and Hg is reduced. Blood samples stored in polypropylene vials also lose volume over time and develop clots at higher temperature conditions (e.g., 23 °C and 37 °C), making them unacceptable for elemental testing after 10 months and 2 months, respectively. |
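The abstract mentions an equivalence test using ratios; the R sketch below shows a generic two one-sided tests (TOST) comparison on paired log-ratios, with assumed ±10% equivalence bounds and hypothetical inputs, not the authors' exact procedure or criteria.

```r
# Generic TOST equivalence sketch on log-ratios of paired results
# (test storage condition vs. the -70 C reference condition).
# The 0.90-1.10 bounds and the input vectors are assumptions, not the study's criteria.
tost_ratio <- function(test, ref, lower = 0.90, upper = 1.10, alpha = 0.05) {
  d <- log(test) - log(ref)  # paired log-ratios
  p_low  <- t.test(d, mu = log(lower), alternative = "greater")$p.value
  p_high <- t.test(d, mu = log(upper), alternative = "less")$p.value
  c(p_low = p_low, p_high = p_high, equivalent = max(p_low, p_high) < alpha)
}
```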
Measurement challenges at low blood lead levels
Caldwell KL , Cheng PY , Jarrett JM , Makhmudov A , Vance K , Ward CD , Jones RL , Mortensen ME . Pediatrics 2017 140 (2) In 2012, the Centers for Disease Control and Prevention (CDC) adopted its Advisory Committee on Childhood Lead Poisoning Prevention recommendation to use a population-based reference value to identify children and environments associated with lead hazards. The current reference value of 5 µg/dL is calculated as the 97.5th percentile of the distribution of blood lead levels (BLLs) in children 1 to 5 years old from 2007 to 2010 NHANES data. We calculated and updated selected percentiles, including the 97.5th percentile, by using NHANES 2011 to 2014 blood lead data and examined demographic characteristics of children whose blood lead was at or above the 90th percentile value. The 97.5th percentile BLL of 3.48 µg/dL highlighted analytical laboratory and clinical interpretation challenges of blood lead measurements ≤5 µg/dL. A review of 5 years of results for target blood lead values <11 µg/dL for US clinical laboratories participating in the CDC's voluntary Lead and Multi-Element Proficiency quality assurance program showed that 40% were unable to quantify lead and reported a nondetectable result at a target blood lead value of 1.48 µg/dL, compared with 5.5% at a target BLL of 4.60 µg/dL. We describe actions taken at the CDC's Environmental Health Laboratory in the National Center for Environmental Health, which measures blood lead for NHANES, to improve analytical accuracy and precision and to reduce external lead contamination during blood collection and analysis. |
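For readers who want to reproduce this kind of population percentile, a survey-weighted quantile can be computed in R with the survey package. The outline below uses standard NHANES design variable names but omits details such as the weight adjustment needed when combining survey cycles; treat it as a sketch, not the authors' analysis code.

```r
# Outline of a survey-weighted percentile calculation with the R `survey` package.
# NHANES-style variable names (SDMVPSU, SDMVSTRA, WTMEC2YR, LBXBPB, RIDAGEYR) are shown,
# but cycle-combining weight adjustments and variance options are omitted.
library(survey)

des  <- svydesign(ids = ~SDMVPSU, strata = ~SDMVSTRA, weights = ~WTMEC2YR,
                  nest = TRUE, data = nhanes_pb)     # nhanes_pb: analyst-assembled data frame
kids <- subset(des, RIDAGEYR >= 1 & RIDAGEYR <= 5)   # children 1 to 5 years old
svyquantile(~LBXBPB, kids, quantiles = c(0.50, 0.90, 0.975))
```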
Biomonitoring method for the analysis of chromium and cobalt in human whole blood using inductively coupled plasma-kinetic energy discrimination-mass spectrometry (ICP-KED-MS)
Georgi JC , Sommer YL , Ward CD , Cheng P , Jones RL , Caldwell KL . Anal Methods 2017 9 (23) 3464-3476 The Centers for Disease Control and Prevention developed a biomonitoring method to rapidly and accurately quantify chromium and cobalt in human whole blood by ICP-MS. Many metal-on-metal hip implants, which contain significant amounts of chromium and cobalt, are susceptible to metal degradation. This method is used to gather data about chromium and cobalt exposure in the segment of the U.S. population that does not have metal-on-metal hip implants, so that a reference value can be established for a baseline level in blood. We evaluated parameters such as helium gas flow rate, choice and composition of the diluent solution for sample preparation, and sample rinse time to determine the optimal conditions for analysis. The limits of detection for chromium and cobalt in blood were determined to be 0.41 and 0.06 µg L-1, respectively. Method precision, accuracy, and recovery for this method were determined using quality control material created in-house and historical proficiency testing samples. We conducted experiments to determine if quantitative changes in the method parameters affect the results obtained by changing four parameters while analyzing human whole blood spiked with National Institute of Standards and Technology traceable materials: the dilution factor used during sample preparation, sample rinse time, diluent composition, and kinetic energy discrimination gas flow rate. The results at the increased and decreased levels for each parameter were statistically compared to the results obtained at the optimized parameters. We assessed the degree of reproducibility obtained under a variety of conditions and evaluated the method's robustness by analyzing the same set of proficiency testing samples by different analysts, on different instruments, with different reagents, and on different days. The short-term stability of chromium and cobalt in human blood samples stored at room temperature was monitored over a period of 64 hours by diluting and analyzing samples at different time intervals. The stability of chromium and cobalt post-dilution was also evaluated over a period of 48 hours and at two storage temperatures (room temperature and refrigerated at 4 °C). The results obtained during the stability studies showed that chromium and cobalt are stable in human blood for a period of 64 hours. © The Royal Society of Chemistry 2017. |
Association of acute toxic encephalopathy with litchi consumption in an outbreak in Muzaffarpur, India, 2014: a case-control study
Shrivastava A , Kumar A , Thomas JD , Laserson KF , Bhushan G , Carter MD , Chhabra M , Mittal V , Khare S , Sejvar JJ , Dwivedi M , Isenberg SL , Johnson R , Pirkle JL , Sharer JD , Hall PL , Yadav R , Velayudhan A , Papanna M , Singh P , Somashekar D , Pradhan A , Goel K , Pandey R , Kumar M , Kumar S , Chakrabarti A , Sivaperumal P , Kumar AR , Schier JG , Chang A , Graham LA , Mathews TP , Johnson D , Valentin L , Caldwell KL , Jarrett JM , Harden LA , Takeoka GR , Tong S , Queen K , Paden C , Whitney A , Haberling DL , Singh R , Singh RS , Earhart KC , Dhariwal AC , Chauhan LS , Venkatesh S , Srikantiah P . Lancet Glob Health 2017 5 (4) e458-e466 BACKGROUND: Outbreaks of unexplained illness frequently remain under-investigated. In India, outbreaks of an acute neurological illness with high mortality among children occur annually in Muzaffarpur, the country's largest litchi cultivation region. In 2014, we aimed to investigate the cause and risk factors for this illness. METHODS: In this hospital-based surveillance and nested age-matched case-control study, we did laboratory investigations to assess potential infectious and non-infectious causes of this acute neurological illness. Cases were children aged 15 years or younger who were admitted to two hospitals in Muzaffarpur with new-onset seizures or altered sensorium. Age-matched controls were residents of Muzaffarpur who were admitted to the same two hospitals for a non-neurologic illness within seven days of the date of admission of the case. Clinical specimens (blood, cerebrospinal fluid, and urine) and environmental specimens (litchis) were tested for evidence of infectious pathogens, pesticides, toxic metals, and other non-infectious causes, including presence of hypoglycin A or methylenecyclopropylglycine (MCPG), naturally-occurring fruit-based toxins that cause hypoglycaemia and metabolic derangement. Matched and unmatched (controlling for age) bivariate analyses were done and risk factors for illness were expressed as matched odds ratios and odds ratios (unmatched analyses). FINDINGS: Between May 26 and July 17, 2014, 390 patients meeting the case definition were admitted to the two referral hospitals in Muzaffarpur, of whom 122 (31%) died. On admission, 204 (62%) of 327 had a blood glucose concentration of 70 mg/dL or less. 104 cases were compared with 104 age-matched hospital controls. Litchi consumption (matched odds ratio [mOR] 9.6 [95% CI 3.6-24]) and absence of an evening meal (2.2 [1.2-4.3]) in the 24 h preceding illness onset were associated with illness. The absence of an evening meal significantly modified the effect of eating litchis on illness (odds ratio [OR] 7.8 [95% CI 3.3-18.8], without evening meal; OR 3.6 [1.1-11.1] with an evening meal). Tests for infectious agents and pesticides were negative. Metabolites of hypoglycin A, MCPG, or both were detected in 48 (66%) of 73 urine specimens from case-patients and none from 15 controls; 72 (90%) of 80 case-patient specimens had abnormal plasma acylcarnitine profiles, consistent with severe disruption of fatty acid metabolism. In 36 litchi arils tested from Muzaffarpur, hypoglycin A concentrations ranged from 12.4 µg/g to 152.0 µg/g and MCPG ranged from 44.9 µg/g to 220.0 µg/g. INTERPRETATION: Our investigation suggests an outbreak of acute encephalopathy in Muzaffarpur associated with both hypoglycin A and MCPG toxicity. To prevent illness and reduce mortality in the region, we recommended minimising litchi consumption, ensuring receipt of an evening meal, and implementing rapid glucose correction for suspected illness. A comprehensive investigative approach in Muzaffarpur led to timely public health recommendations, underscoring the importance of using systematic methods in other unexplained illness outbreaks. FUNDING: US Centers for Disease Control and Prevention. |
Analysis of whole human blood for Pb, Cd, Hg, Se, and Mn by ICP-DRC-MS for biomonitoring and acute exposures
Jones DR , Jarrett JM , Tevis DS , Franklin M , Mullinix NJ , Wallon KL , Derrick Quarles C Jr , Caldwell KL , Jones RL . Talanta 2017 162 114-122 We improved our inductively coupled plasma mass spectrometry (ICP-MS) whole blood method [1] for determination of lead (Pb), cadmium (Cd), and mercury (Hg) by including manganese (Mn) and selenium (Se), and expanding the calibration range of all analytes. The method is validated on a PerkinElmer (PE) ELAN® DRC II ICP-MS (ICP-DRC-MS) and uses the Dynamic Reaction Cell (DRC) technology to attenuate interfering background ion signals via ion-molecule reactions. Methane gas (CH4) eliminates background signal from 40Ar2 + to permit determination of 80Se+, and oxygen gas (O2) eliminates several polyatomic interferences (e.g. 40Ar15N+, 54Fe1H+) on 55Mn+. Hg sensitivity in DRC mode is a factor of two higher than vented mode when measured under the same DRC conditions as Mn due to collisional focusing of the ion beam. To compensate for the expanded method's longer analysis time (due to DRC mode pause delays), we implemented an SC4-FAST autosampler (ESI Scientific, Omaha, NE), which vacuum loads the sample onto a loop, to keep the sample-to-sample measurement time to less than 5 min, allowing for preparation and analysis of 60 samples in an 8-h work shift. The longer analysis time also resulted in faster breakdown of the hydrocarbon oil in the interface roughing pump. The replacement of the standard roughing pump with a pump using a fluorinated lubricant, Fomblin®, extended the time between pump maintenance. We optimized the diluent and rinse solution components to reduce carryover from high concentration samples and prevent the formation of precipitates. We performed a robust calculation to determine the following limits of detection (LOD) in whole blood: 0.07 µg dL−1 for Pb, 0.10 µg L−1 for Cd, 0.28 μg L−1 for Hg, 0.99 µg L−1 for Mn, and 24.5 µg L−1 for Se. © 2016 |
Urinary iodine, thyroid function, and thyroglobulin as biomarkers of iodine status
Pearce EN , Caldwell KL . Am J Clin Nutr 2016 104 Suppl 3 898S-901S The accurate assessment of population iodine status is necessary to inform public health policies and clinical research on iodine nutrition, particularly the role of iodine adequacy in normal neurodevelopment. Urinary iodine concentration (UIC) directly reflects dietary iodine intake and is the most common indicator used worldwide to assess population iodine status. The CDC established the Ensuring the Quality of Iodine Procedures program in 2001 to provide laboratories that measure urinary iodine with an independent assessment of their analytic performance; this program fosters improvement in the assessment of UIC. Clinical laboratory tests of thyroid function (including serum concentrations of the pituitary hormone thyrotropin and the thyroid hormones thyroxine and triiodothyronine) are sometimes used as indicators of iodine status, although such use is often problematic. Even in severely iodine-deficient regions, there is a great deal of intraindividual variation in the ability of the thyroid to adapt. In most settings and in most population subgroups other than newborns, thyroid function tests are not considered sensitive indicators of population iodine status. However, the thyroid-derived protein thyroglobulin is increasingly being used for this purpose. Thyroglobulin can be measured in either serum or dried blood spot (DBS) samples. The use of DBS samples is advantageous in resource-poor regions. Improved methodologies for ascertaining maternal iodine status are needed to facilitate research on developmental correlates of iodine status. Thyroglobulin may prove to be a useful biomarker for both maternal and neonatal iodine status, but validated assay-specific reference ranges are needed for the determination of iodine sufficiency in both pregnant women and neonates, and trimester-specific ranges are possibly needed for pregnant women. UIC is currently a well-validated population biomarker, but individual biomarkers that could be used for research, patient care, and public health are lacking. |
Thyroid antagonists and thyroid indicators in U.S. pregnant women in the Vanguard Study of the National Children's Study
Mortensen ME , Birch R , Wong LY , Valentin-Blasini L , Boyle EB , Caldwell KL , Merrill LS , Moye J Jr , Blount BC . Environ Res 2016 149 179-188 The sodium-iodide symporter (NIS) mediates uptake of iodide into thyroid follicular cells. This key step in thyroid hormone synthesis is inhibited by perchlorate, thiocyanate (SCN) and nitrate (NO3) anions. When these exposures occur during pregnancy, the resulting decreases in thyroid hormones may adversely affect neurodevelopment of the human fetus. Our objectives were to describe and examine the relationship of these anions to the serum thyroid indicators, thyroid stimulating hormone (TSH) and free thyroxine (FT4), in third trimester women from the initial Vanguard Study of the National Children's Study (NCS); and to compare urine perchlorate results with those in pregnant women from the National Health and Nutritional Examination Survey (NHANES). Urinary perchlorate, SCN, NO3, and iodine, serum TSH, FT4, and cotinine were measured and a food frequency questionnaire (FFQ) was administered to pregnant women enrolled in the initial Vanguard Study. We used multiple regression models of FT4 and TSH that included perchlorate equivalent concentration (PEC, which estimates the combined inhibitory effects of the anions perchlorate, SCN, and NO3 on the NIS). We used multiple regression to model predictors of each urinary anion, using FFQ results, drinking water source, season of year, smoking status, and demographic characteristics. Descriptive statistics were calculated for pregnant women in NHANES 2001-2012. The geometric mean (GM) for urinary perchlorate was 4.04 µg/L, for TSH 1.46 mIU/L, and the arithmetic mean for FT4 was 1.11 ng/dL in 359 NCS women. In 330 women with completed FFQs, consumption of leafy greens, winter season, and Hispanic ethnicity were significant predictors of higher urinary perchlorate, which differed significantly by study site and primary drinking water source, and bottled water was associated with higher urinary perchlorate compared to filtered tap water. Leafy greens consumption was associated with higher urinary NO3 and higher urinary SCN. There was no association between urinary perchlorate or PEC and TSH or FT4, even for women with urinary iodine <100 µg/L. GM urinary perchlorate concentrations in the full sample (n=494) of third trimester NCS women (4.03 µg/L) were similar to those in pregnant women in NHANES (3.58 µg/L). |
Cord blood methylmercury and fetal growth outcomes in Baltimore newborns: Potential confounding and effect modification by omega-3 fatty acids, selenium, and sex
Wells EM , Herbstman JB , Lin YH , Jarrett J , Verdon CP , Ward C , Caldwell KL , Hibbeln JR , Witter FR , Halden RU , Goldman LR . Environ Health Perspect 2016 124 (3) 373-9 BACKGROUND: Methylmercury (MeHg) may affect fetal growth; however, prior research often lacked assessment of mercury speciation, confounders, and interactions. OBJECTIVE: Our objective was to assess the relationship between MeHg and fetal growth as well as the potential for confounding or interaction of this relationship from speciated mercury, fatty acids, selenium, and sex. METHODS: This cross-sectional study includes 271 singletons born in Baltimore, Maryland, 2004-2005. Umbilical cord blood was analyzed for speciated mercury, serum omega-3 highly unsaturated fatty acids (n-3 HUFAs), and selenium. Multivariable linear regression models controlled for gestational age, birth weight, maternal age, parity, prepregnancy body mass index, smoking, hypertension, diabetes, selenium, n-3 HUFAs, and inorganic mercury (IHg). RESULTS: Geometric mean cord blood MeHg was 0.94 µg/L (95% CI: 0.84, 1.07). In adjusted models for ponderal index, β(ln MeHg) = -0.045 (g/cm³) × 100 (95% CI: -0.084, -0.005). There was no evidence of a MeHg × sex interaction with ponderal index. Contrastingly, there was evidence of a MeHg × n-3 HUFAs interaction with birth length [among low n-3 HUFAs, β(ln MeHg) = 0.40 cm, 95% CI: -0.02, 0.81; among high n-3 HUFAs, β(ln MeHg) = -0.15, 95% CI: -0.54, 0.25; p-interaction = 0.048] and head circumference [among low n-3 HUFAs, β(ln MeHg) = 0.01 cm, 95% CI: -0.27, 0.29; among high n-3 HUFAs, β(ln MeHg) = -0.37, 95% CI: -0.63, -0.10; p-interaction = 0.042]. The association of MeHg with birth weight and ponderal index was affected by n-3 HUFAs, selenium, and IHg. For birth weight, β(ln MeHg) without these variables was -16.8 g (95% CI: -75.0, 41.3) versus -29.7 (95% CI: -93.9, 34.6) with all covariates. Corresponding values for ponderal index were -0.030 (g/cm³) × 100 (95% CI: -0.065, 0.005) and -0.045 (95% CI: -0.084, -0.005). CONCLUSION: We observed an association of increased MeHg with decreased ponderal index. There is evidence for interaction between MeHg and n-3 HUFAs; infants with higher MeHg and n-3 HUFAs had lower birth length and head circumference. These results should be verified with additional studies. |
Long-term stability of inorganic, methyl and ethyl mercury in whole blood: Effects of storage temperature and time
Sommer YL , Ward CD , Pan Y , Caldwell KL , Jones RL . J Anal Toxicol 2016 40 (3) 222-8 In this study, we evaluated the effect of temperature on the long-term stability of three mercury species in bovine blood. We used inductively coupled plasma mass spectrometry (ICP-MS) analysis to determine the concentrations of inorganic (iHg), methyl (MeHg) and ethyl (EtHg) mercury species in two blood pools stored at temperatures of -70 °C, -20 °C, 4 °C, 23 °C (room temperature) and 37 °C. Over the course of a year, we analyzed aliquots of pooled specimens at time intervals of 1, 2, 4 and 6 weeks and 2, 4, 6, 8, 10 and 12 months. We applied a fixed-effects linear model, step-down pairwise comparison and coefficient of variation statistical analysis to examine the temperature and time effects on changes in mercury species concentrations. We observed several instances of statistically significant differences in mercury species concentrations between different temperatures and time points; however, with considerations of experimental factors (such as instrumental drift and sample preparation procedures), not all differences were scientifically important. We concluded that iHg, MeHg and EtHg species in bovine whole blood were stable at -70, -20, 4 and 23 °C for 1 year, but blood samples stored at 37 °C were stable for no more than 2 weeks. |
Metals exposures of residents living near the Akaki river in Addis Ababa, Ethiopia: A cross-sectional study
Yard E , Bayleyegn T , Abebe A , Mekonnen A , Murphy M , Caldwell KL , Luce R , Hunt DR , Tesfaye K , Abate M , Assefa T , Abera F , Habte K , Chala F , Lewis L , Kebede A . J Environ Public Health 2015 2015 935297 BACKGROUND: The Akaki River in Ethiopia has been found to contain elevated levels of several metals. Our objectives were to characterize metals exposures of residents living near the Akaki River and to assess metal levels in their drinking water. METHODS: In 2011, we conducted a cross-sectional study of 101 households in Akaki-Kality subcity (near the Akaki River) and 50 households in Yeka subcity (distant from the Akaki River). One willing adult in each household provided urine, blood, and drinking water samples. RESULTS: Urinary molybdenum (p < 0.001), tungsten (p < 0.001), lead (p < 0.001), uranium (p < 0.001), and mercury (p = 0.049) were higher in Akaki-Kality participants compared to Yeka participants. Participants in both subcities had low urinary iodine; 45% met the World Health Organization (WHO) classification for being at risk of moderate iodine deficiency. In Yeka, 47% of households exceeded the WHO aesthetic-based reference value for manganese; in Akaki-Kality, only 2% of households exceeded this value (p < 0.001). There was no correlation between metal levels in water samples and clinical specimens. CONCLUSIONS: Most of the exposures found during this investigation seem unlikely to cause acute health effects based on known toxic thresholds. However, toxicity data for many of these metals are very limited. |
Assessing arsenic exposure in households using bottled water or point-of-use treatment systems to mitigate well water contamination
Smith AE , Lincoln RA , Paulu C , Simones TL , Caldwell KL , Jones RL , Backer LC . Sci Total Environ 2015 544 701-710 There is little published literature on the efficacy of strategies to reduce exposure to residential well water arsenic. The objectives of our study were to: 1) determine if water arsenic remained a significant exposure source in households using bottled water or point-of-use treatment systems; and 2) evaluate the major sources and routes of any remaining arsenic exposure. We conducted a cross-sectional study of 167 households in Maine using one of these two strategies to prevent exposure to arsenic. Most households included one adult and at least one child. Untreated well water arsenic concentrations ranged from <10 µg/L to 640 µg/L. Urine samples, water samples, daily diet and bathing diaries, and household dietary and water use habit surveys were collected. Generalized estimating equations were used to model the relationship between urinary arsenic and untreated well water arsenic concentration, while accounting for documented consumption of untreated water and dietary sources. If the mitigation strategies were fully effective, there should be no relationship between urinary arsenic and well water arsenic. To the contrary, we found that untreated well water arsenic concentration remained a significant (p ≤ 0.001) predictor of urinary arsenic levels. When untreated water arsenic concentrations were <40 µg/L, untreated water arsenic was no longer a significant predictor of urinary arsenic. Time spent bathing (alone or in combination with water arsenic concentration) was not associated with urinary arsenic. A predictive analysis of the average study participant suggested that when untreated water arsenic ranged from 100 to 500 µg/L, elimination of any untreated water use would result in an 8%-32% reduction in urinary arsenic for young children, and a 14%-59% reduction for adults. These results demonstrate the importance of complying with a point-of-use or bottled water exposure reduction strategy. However, there remained unexplained, water-related routes of exposure. |
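As a rough sketch of the modeling approach named above (generalized estimating equations with clustering by household), the following R code uses geepack with hypothetical variable names; it is not the authors' actual model or covariate set.

```r
# Hypothetical GEE sketch: urinary arsenic regressed on untreated well water arsenic,
# with household as the clustering unit. Variable names and covariates are illustrative.
library(geepack)

fit <- geeglm(log(urine_as) ~ log(water_as) + untreated_water_use + age_group,
              id = household_id, data = wells, family = gaussian,
              corstr = "exchangeable")
summary(fit)  # the log(water_as) coefficient tests whether well water remains a predictor
```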
Dietary sources of methylated arsenic species in urine of the United States population, NHANES 2003-2010
deCastro BR , Caldwell KL , Jones RL , Blount BC , Pan Y , Ward C , Mortensen ME . PLoS One 2014 9 (9) e108098 BACKGROUND: Arsenic is a ubiquitous element linked to carcinogenicity and neurotoxicity, as well as adverse respiratory, gastrointestinal, hepatic, and dermal health effects. OBJECTIVE: Identify dietary sources of speciated arsenic: monomethylarsonic acid (MMA) and dimethylarsinic acid (DMA). METHODS: Age-stratified, sample-weighted regression of NHANES (National Health and Nutrition Examination Survey) 2003-2010 data (approximately 8,300 participants ≥6 years old) characterized the association between urinary arsenic species and the additional mass consumed of USDA-standardized food groups (24-hour dietary recall data), controlling for potential confounders. RESULTS: For all arsenic species, the rank-order of age strata for median urinary molar concentration was children 6-11 years > adults 20-84 years > adolescents 12-19 years, and for all age strata, the rank-order was DMA > MMA. Median urinary molar concentrations of methylated arsenic species ranged from 0.56 to 3.52 µmol/mol creatinine. Statistically significant increases in urinary arsenic species were associated with increased consumption of: fish (DMA); fruits (DMA, MMA); grain products (DMA, MMA); legumes, nuts, seeds (DMA); meat, poultry (DMA); rice (DMA, MMA); rice cakes/crackers (DMA, MMA); sugars, sweets, beverages (MMA); and, for adults, rice beverage/milk (DMA, MMA). In addition, based on US (United States) median and 90th percentile consumption rates of each food group, exposure from the following food groups was highlighted: fish; fruits; grain products; legumes, nuts, seeds; meat, poultry; and sugars, sweets, beverages. CONCLUSIONS: In a nationally representative sample of the US civilian, noninstitutionalized population, fish (adults), rice (children), and rice cakes/crackers (adolescents) had the largest associations with urinary DMA. For MMA, rice beverage/milk (adults) and rice cakes/crackers (children, adolescents) had the largest associations. |
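The sample-weighted regression described here can be outlined in R with svyglm from the survey package; the design variables, weights, and covariates below are placeholders rather than the authors' NHANES analysis code.

```r
# Outline of a sample-weighted regression of a urinary arsenic species on food-group
# intake under a complex survey design. All variable names are placeholders.
library(survey)

des <- svydesign(ids = ~psu, strata = ~stratum, weights = ~diet_wt,
                 nest = TRUE, data = nhanes_as)
fit <- svyglm(log(urine_dma) ~ fish_g + rice_g + fruit_g + log(creatinine) + age + sex,
              design = des)
summary(fit)  # coefficients describe change in log urinary DMA per added gram consumed
```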
Total and methyl mercury in whole blood measured for the first time in the U.S. population: NHANES 2011-2012
Mortensen ME , Caudill SP , Caldwell KL , Ward CD , Jones RL . Environ Res 2014 134C 257-264 BACKGROUND: Despite the public health and toxicologic interest in methyl mercury (MeHg) and ethyl mercury (EHg), these mercury species have been technically difficult to measure in large population studies. METHODS: Using NHANES 2011-2012 data, we calculated reference ranges and examined demographic factors associated with specific mercury species concentrations and the ratio of MeHg to THg. We conducted several multiple regression analyses to examine factors associated with MeHg concentrations and also with the ratio of MeHg to THg. RESULTS: Asians had the highest geometric mean concentrations for MeHg, 1.58 µg/L (95% CI 1.29, 1.93) and THg, 1.86 µg/L (1.58, 2.19), followed by non-Hispanic blacks with MeHg, 0.52 µg/L (0.39, 0.68) and THg, 0.68 µg/L (0.54, 0.85). Greater educational attainment in adults and male sex were associated with higher MeHg and THg concentrations. Race/ethnicity, age, and sex were significant predictors of MeHg concentrations, which increased with age and were highest in Asians in all age categories, followed by non-Hispanic blacks. Mexican Americans had the lowest adjusted MeHg concentrations. The ratio of MeHg to THg was highest in Asians, varied by racial/ethnic group, and increased with age in a non-linear fashion. The amount of increase in the MeHg to THg ratio with age depended on the initial ratio, with a greater increase as age increased. Of the overall population, 3.05% (95% CI 1.77, 4.87) had MeHg concentrations >5.8 µg/L (a value that corresponds to the U.S. EPA reference dose). The prevalence was highest in Asians at 15.85% (95% CI 11.85, 20.56), increased with age, and reached a maximum of 9.26% (3.03, 20.42) at ages 60-69 years. Females 16-44 years old had a 1.76% (0.82-3.28) prevalence of MeHg concentrations >5.8 µg/L. CONCLUSIONS: Asians, males, older individuals, and adults with greater educational attainment had higher MeHg concentrations. The ratio of MeHg to THg varied with racial/ethnic group, increased with age, and was nonlinear. U.S. population reference values for MeHg and the ratio of MeHg to THg can assist in more precise assessment of public health risk from MeHg consumed in seafood. |
Median and quantile tests under complex survey design using SAS and R
Pan Y , Caudill SP , Li R , Caldwell KL . Comput Methods Programs Biomed 2014 117 (2) 292-7 Techniques for conducting hypothesis testing on the median and other quantiles of two or more subgroups under complex survey design are limited. In this paper, we introduce programs in both SAS and R to perform such a test. A detailed illustration of the computations, macro variable definitions, input and output for the SAS and R programs are also included in the text. Urinary iodine data from National Health and Nutrition Examination Survey (NHANES) are used as examples for comparing medians between females and males as well as comparing the 75th percentiles among three salt consumption groups. |
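The paper describes its own SAS and R programs for this test. As a related illustration only (not the published programs), the R survey package offers a design-based test for differences in medians between subgroups via svyranktest; the design object and variable names below are hypothetical NHANES-style placeholders.

```r
# Related illustration using the R `survey` package: a design-based (Mood-type) test for
# a difference in medians between subgroups. This is not the program published in the
# paper; the data frame and variable names are hypothetical placeholders.
library(survey)

des <- svydesign(ids = ~psu, strata = ~stratum, weights = ~examwt,
                 nest = TRUE, data = nhanes_uic)
svyranktest(urinary_iodine ~ gender, des, test = "median")
```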
Postnatal exposure to methyl mercury and neuropsychological development in 7-year-old urban inner-city children exposed to lead in the United States
Wang Y , Chen A , Dietrich KN , Radcliffe J , Caldwell KL , Rogan WJ . Child Neuropsychol 2014 20 (5) 527-38 BACKGROUND: The most common route for general population exposure to methyl mercury (MeHg) is fish consumption. Recommendations to pregnant women about consuming fish contaminated with MeHg are also applied to children, but there are few studies available about the effects of low-level postnatal MeHg exposure in them. OBJECTIVES: To investigate the association between postnatal methyl mercury exposure and neuropsychological development in a study of children also exposed to lead, both measured at 7 years. METHODS: We measured MeHg concentrations in blood samples from the Treatment of Lead-Exposed Children (TLC) trial, in which 780 children with elevated concentrations of lead in blood were followed with neuropsychological tests from ages 12-33 months through 7 years. Here we examine blood MeHg concentration and neuropsychological test scores, both measured at age 7 years. We used a maximum likelihood method to estimate geometric mean MeHg concentration and generalized linear regression models to analyze MeHg and neuropsychological test scores. RESULTS: Geometric mean MeHg concentration was 0.56 (95% confidence interval: 0.52, 0.59) µg/L. A 1 µg/L increase in MeHg was associated with a 2.1 (95% confidence interval: 0.4, 3.8) point increase in Full-Scale IQ and a 0.2 (95% confidence interval: 0.02, 0.4) point increase in Learning Slope index T-score on a test of verbal memory. CONCLUSIONS: Our results suggest that the relatively low MeHg exposure in US school-aged children from this population has no detectable adverse effect on neuropsychological development. The positive associations observed between MeHg and neurodevelopment may indirectly reflect consumption of beneficial polyunsaturated fatty acids from seafood. |
Determination of 241Am in Urine Using Sector Field Inductively Coupled Plasma Mass Spectrometry (SF-ICP-MS)
Xiao G , Saunders D , Jones RL , Caldwell KL . J Radioanal Nucl Chem 2014 301 (1) 285-291 Quantification of Am-241 in urine at low levels is important for assessment of individuals' or populations' accidental, environmental, or terrorism-related internal contamination, but no convenient, precise method has been established to rapidly determine these low levels. Here we report a new analytical method to measure Am-241, developed and validated at the Centers for Disease Control and Prevention (CDC), by means of the selective retention of Am from urine directly on DGA resin, followed by SF-ICP-MS detection. The method provides rapid results with a limit of detection (LOD) of 0.22 pg/L (0.028 Bq/L), which is lower than 1/3 of the C/P CDG for Am-241 at 5 days post-exposure. The results obtained by this method closely agree with CDC values as measured by liquid scintillation counting, and with National Institute of Standards and Technology (NIST) Certified Reference Materials (CRM) target values. |
Association of selenium and copper with lipids in umbilical cord blood
Wells EM , Navas-Acien A , Apelberg BJ , Herbstman JB , Jarrett JM , Lin YH , Verdon CP , Ward C , Caldwell KL , Hibbeln JR , Halden RU , Witter FR , Goldman LR . J Dev Orig Health Dis 2014 5 (4) 281-7 Altered levels of selenium and copper have been linked with altered cardiovascular disease risk factors including changes in blood triglyceride and cholesterol levels. However, it is unclear whether this can be observed prenatally. This cross-sectional study includes 274 singleton births from 2004 to 2005 in Baltimore, Maryland. We measured umbilical cord serum selenium and copper using inductively coupled plasma mass spectrometry. We evaluated exposure levels vis-a-vis umbilical cord serum triglyceride and total cholesterol concentrations in multivariable regression models adjusted for gestational age, birth weight, maternal age, race, parity, smoking, prepregnancy body mass index, n-3 fatty acids and methyl mercury. The percent difference in triglycerides comparing those in the highest v. lowest quartile of selenium was 22.3% (95% confidence interval (CI): 7.1, 39.7). For copper this was 43.8% (95% CI: 25.9, 64.3). In multivariable models including both copper and selenium as covariates, copper, but not selenium, maintained a statistically significant association with increased triglycerides (percent difference: 40.7%, 95% CI: 22.1, 62.1). There was limited evidence of a relationship of increasing selenium with increasing total cholesterol. Our findings provide evidence that higher serum copper levels are associated with higher serum triglycerides in newborns, but should be confirmed in larger studies. |
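One common way to obtain percent differences like these is to fit the regression on the log scale and back-transform the quartile coefficient; the R sketch below illustrates that calculation with hypothetical variable names and covariates (it is not the authors' exact model).

```r
# Sketch: percent difference in cord serum triglycerides, highest vs. lowest quartile of
# an exposure, from a regression fit on the log scale. Variables are hypothetical.
cord$cu_q <- cut(cord$copper, quantile(cord$copper, 0:4 / 4),
                 include.lowest = TRUE, labels = c("Q1", "Q2", "Q3", "Q4"))
fit <- lm(log(triglycerides) ~ cu_q + gest_age + birth_wt + maternal_age, data = cord)
b <- coef(fit)["cu_qQ4"]
100 * (exp(b) - 1)  # percent difference relative to the lowest quartile (Q1, the reference)
```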
Measurement of mercury species in human blood using triple spike isotope dilution with SPME-GC-ICP-DRC-MS
Sommer YL , Verdon CP , Fresquez MR , Ward CD , Wood EB , Pan Y , Caldwell KL , Jones RL . Anal Bioanal Chem 2014 406 (20) 5039-47 The measurement of different mercury compounds in human blood can provide valuable information about the type of mercury exposure. To this end, our laboratory developed a biomonitoring method for the quantification of inorganic (iHg), methyl (MeHg), and ethyl (EtHg) mercury in whole blood using a triple-spike isotope dilution (TSID) quantification method employing capillary gas chromatography (GC) and inductively coupled plasma-dynamic reaction cell-mass spectrometry (ICP-DRC-MS). We used a robotic CombiPAL® sample handling station featuring twin fiber-based solid-phase microextraction (SPME) injector heads. The use of two SPME fibers significantly reduces sample analysis cycle times, making this method very suitable for high sample throughput, which is a requirement for large public health biomonitoring studies. Our sample preparation procedure involved solubilization of blood samples with tetramethylammonium hydroxide (TMAH) followed by derivatization with sodium tetra(n-propyl)borate (NaBPr4) to promote volatility of the mercury species. We thoroughly investigated mercury species stability in the blood matrix during the course of sample treatment and analysis. The method accuracy for quantifying iHg, MeHg, and EtHg was validated using NIST standard reference materials (SRM 955c level 3) and the Centre de Toxicologie du Québec (CTQ) proficiency testing (PT) samples. The limit of detection (LOD) for iHg, MeHg, and EtHg in human blood was determined to be 0.27, 0.12, and 0.16 µg/L, respectively. |
Certification of total arsenic in blood and urine standard reference materials by radiochemical neutron activation analysis and inductively coupled plasma-mass spectrometry
Paul RL , Davis WC , Yu L , Murphy KE , Guthrie WF , Leber DD , Bryan CE , Vetter TW , Shakirova G , Mitchell G , Kyle DJ , Jarrett JM , Caldwell KL , Jones RL , Eckdahl S , Wermers M , Maras M , Palmer CD , Verostek MF , Geraghty CM , Steuerwald AJ , Parsons PJ . J Radioanal Nucl Chem 2014 299 (3) 1555-1563 Radiochemical neutron activation analysis (RNAA) was used to measure arsenic at four levels in standard reference material (SRM) 955c Toxic Elements in Caprine Blood and at two levels in SRM 2668 Toxic Elements in Frozen Human Urine for the purpose of providing mass concentration values for certification. Samples were freeze-dried prior to analysis followed by neutron irradiation for 3 h at a fluence rate of 1 × 10^14 cm^−2 s^−1. After sample dissolution in perchloric and nitric acids, arsenic was separated from the matrix either by retention on hydrated manganese dioxide (urine) or by extraction into zinc diethyldithiocarbamate in chloroform (blood). ^76As was quantified by gamma-ray spectroscopy. Differences in chemical yield and counting geometry between samples and standards were monitored by measuring the count rate of a ^77As tracer added before sample dissolution. RNAA results were combined with inductively coupled plasma-mass spectrometry values from National Institute of Standards and Technology and collaborating laboratories to provide certified values of 10.81 ± 0.54 and 213.1 ± 0.73 μg/L for SRM 2668 Levels I and II, and certified values of 21.66 ± 0.73, 52.7 ± 1.1, and 78.8 ± 4.9 μg/L for SRM 955c Levels II–IV, respectively. Because of discrepancies between values obtained by different methods for SRM 955c Level I, an information value of <5 μg/L was assigned for this material. |
Determination of 234U/238U, 235U/238U and 236U/238U isotope ratios in urine using sector field inductively coupled plasma mass spectrometry
Xiao G , Jones RL , Saunders D , Caldwell KL . Radiat Prot Dosimetry 2014 162 (4) 618-24 Quantification of the isotopic composition of uranium in urine at low levels of concentration is important for assessing both military and civilian populations' exposures to uranium. However, until now there has been no convenient, precise method established for rapid determination of multiple uranium isotope ratios. Here, the authors report a new method to measure 234U/238U, 235U/238U and 236U/238U. It uses solid-phase chelation extraction (via TRU columns) of actinides from the urine matrix, followed by measurement using a magnetic sector field inductively coupled plasma mass spectrometer (SF-ICP-MS-Thermo Element XR) equipped with a high-efficiency nebulizer (Apex PFA microflow) and coupled with a membrane desolvating nebulizer system (Aridus II). This method provides rapid and reliable results and has been used successfully to analyse Certified Reference Materials. |
Analytical considerations in the clinical laboratory assessment of metals
Wang RY , Caldwell KL , Jones RL . J Med Toxicol 2014 10 (2) 232-9 The presence of metals in the environment is ubiquitous and humans are constantly being exposed to them. As such, a general concern exists about potential health consequences that result from the exposure to metals. The continued efforts of environmental scientists to measure metals in clinical specimens are important for defining the extent of human exposure to these chemicals. Laboratory methods to measure the concentration of metals in human blood or urine are available, and they can be used to assess the extent of human exposure to these chemicals. However, several considerations should be reviewed when requesting a laboratory measurement of metals because some factors can affect the test result or its interpretation. These considerations are discussed in this article and include pre-analytical, analytical, and post-analytical factors. Clinicians with this knowledge will be able to request these laboratory tests for their patients with enhanced confidence. |
Validity of predictive equations for 24-h urinary sodium excretion in adults aged 18-39 y
Cogswell ME , Wang CY , Chen TC , Pfeiffer CM , Elliott P , Gillespie CD , Carriquiry AL , Sempos CT , Liu K , Perrine CG , Swanson CA , Caldwell KL , Loria CM . Am J Clin Nutr 2013 98 (6) 1502-13 BACKGROUND: Collecting a 24-h urine sample is recommended for monitoring the mean population sodium intake, but implementation can be difficult. OBJECTIVE: The objective was to assess the validity of published equations by using spot urinary sodium concentrations to predict 24-h sodium excretion. DESIGN: This was a cross-sectional study, conducted from June to August 2011 in metropolitan Washington, DC, of 407 adults aged 18-39 y, 48% black, who collected each urine void in a separate container for 24 h. Four timed voids (morning, afternoon, evening, and overnight) were selected from each 24-h collection. Published equations were used to predict 24-h sodium excretion with spot urine by specimen timing and race-sex subgroups. We examined mean differences with measured 24-h sodium excretion (bias) and individual differences with the use of Bland-Altman plots. RESULTS: Across equations and specimens, mean bias in predicting 24-h sodium excretion for all participants ranged from -267 to 1300 mg (Kawasaki equation). Bias was least with International Cooperative Study on Salt, Other Factors, and Blood Pressure (INTERSALT) equations with morning (-165 mg; 95% CI: -295, 36 mg), afternoon (-90 mg; -208, 28 mg), and evening (-120 mg; -230, -11 mg) specimens. With overnight specimens, mean bias was least when the Tanaka (-23 mg; 95% CI: -141, 95 mg) or Mage (-145 mg; -314, 25 mg) equations were used but was statistically significant when using the Tanaka equations among females (216 to 243 mg) and the Mage equations among races other than black (-554 to -372 mg). Significant over- and underprediction occurred across individual sodium excretion concentrations. CONCLUSIONS: Using a single spot urine, INTERSALT equations may provide the least biased information about population mean sodium intakes among young US adults. None of the equations evaluated provided unbiased estimates of individual 24-h sodium excretion. This trial was registered at clinicaltrials.gov as NCT01631240. |
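As an illustration of the bias and agreement summaries used in this abstract, here is a brief R sketch of mean bias and Bland-Altman limits of agreement between predicted and measured 24-h sodium excretion; the prediction step itself (e.g., the INTERSALT, Kawasaki, Tanaka, or Mage equations) is not reproduced, and `predicted` and `measured` are hypothetical vectors in mg/d.

```r
# Sketch of mean bias and Bland-Altman limits of agreement between predicted and measured
# 24-h sodium excretion (mg/d). `predicted` and `measured` are hypothetical vectors;
# the published prediction equations themselves are not reproduced here.
d       <- predicted - measured
bias    <- mean(d)                          # mean bias
bias_ci <- t.test(d)$conf.int               # 95% CI for the mean bias
loa     <- bias + c(-1.96, 1.96) * sd(d)    # Bland-Altman limits of agreement

plot((predicted + measured) / 2, d,
     xlab = "Mean of predicted and measured (mg/d)", ylab = "Difference (mg/d)")
abline(h = bias, lty = 1)
abline(h = loa,  lty = 2)
```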
The impact of succimer chelation on blood cadmium in children with background exposures: a randomized trial
Cao Y , Chen A , Bottai M , Caldwell KL , Rogan WJ . J Pediatr 2013 163 (2) 598-600 Succimer lowers blood lead concentrations in children, and the structures of succimer chelates of lead and cadmium are similar. Using blood samples from a randomized trial of succimer for lead poisoning, however, we found that succimer did not lower blood cadmium in children with background exposure. |
Changes in the concentrations of biochemical indicators of diet and nutritional status of pregnant women across pregnancy trimesters in Trujillo, Peru, 2004-2005
Horton DK , Adetona O , Aguilar-Villalobos M , Cassidy BE , Pfeiffer CM , Schleicher RL , Caldwell KL , Needham LL , Rathbun SL , Vena JE , Naeher LP . Nutr J 2013 12 80 BACKGROUND: In developing countries, deficiencies in essential micronutrients are common, particularly in pregnant women. Although, biochemical indicators of diet and nutrition are useful to assess nutritional status, few studies have examined such indicators throughout pregnancy in women in developing countries. METHODS: The primary objective of this study was to assess the nutritional status of 78 Peruvian women throughout pregnancy for 16 different nutritional indicators including fat-soluble vitamins and carotenoids, iron-status indicators, and selenium. Venous blood samples from which serum was prepared were collected during trimesters one (n = 78), two (n = 65), three (n = 62), and at term via the umbilical cord (n = 52). Questionnaires were completed to determine the demographic characteristics of subjects. Linear mixed effects models were used to study the associations between each maternal indicator and the demographic characteristics. RESULTS: None of the women were vitamin A and E deficient at any stage of pregnancy and only 1/62 women (1.6%) was selenium deficient during the third trimester. However, 6.4%, 44% and 64% of women had ferritin levels indicative of iron deficiency during the first, second and third trimester, respectively. Statistically significant changes (p ≤ 0.05) throughout pregnancy were noted for 15/16 nutritional indicators for this Peruvian cohort, with little-to-no association with demographic characteristics. Three carotenoids (beta-carotene, beta-cryptoxanthin and trans-lycopene) were significantly associated with education status, while trans-lycopene was associated with age and beta-cryptoxanthin with SES (p < 0.05). Concentrations of retinol, tocopherol, beta-cryptoxanthin, lutein + zeaxanthin and selenium were lower in cord serum compared with maternal serum (p < 0.05). Conversely, levels of iron status indicators (ferritin, transferrin saturation and iron) were higher in cord serum (p < 0.05). CONCLUSION: The increasing prevalence of iron deficiency throughout pregnancy in these Peruvian women was expected. It was surprising though not to find deficiencies in other nutrients. The results highlight the importance of continual monitoring of women throughout pregnancy for iron deficiency which could be caused by increasing fetal needs and/or inadequate iron intake as pregnancy progresses. |
Urinary excretion of sodium, potassium, and chloride, but not iodine, varies by timing of collection in a 24-hour calibration study
Wang CY , Cogswell ME , Loria CM , Chen TC , Pfeiffer CM , Swanson CA , Caldwell KL , Perrine CG , Carriquiry AL , Liu K , Sempos CT , Gillespie CD , Burt VL . J Nutr 2013 143 (8) 1276-82 Because of the logistic complexity, excessive respondent burden, and high cost of conducting 24-h urine collections in a national survey, alternative strategies to monitor sodium intake at the population level need to be evaluated. We conducted a calibration study to assess the ability to characterize sodium intake from timed-spot urine samples calibrated to a 24-h urine collection. In this report, we described the overall design and basic results of the study. Adults aged 18-39 y were recruited to collect urine for a 24-h period, placing each void in a separate container. Four timed-spot specimens (morning, afternoon, evening, and overnight) and the 24-h collection were analyzed for sodium, potassium, chloride, creatinine, and iodine. Of 481 eligible persons, 407 (54% female, 48% black) completed a 24-h urine collection. A subsample (n = 133) collected a second 24-h urine 4-11 d later. Mean sodium excretion was 3.54 ± 1.51 g/d for males and 3.09 ± 1.26 g/d for females. Sensitivity analysis excluding those who did not meet the expected creatinine excretion criterion showed the same results. Day-to-day variability for sodium, potassium, chloride, and iodine was observed among those collecting two 24-h urine samples (CV = 16-29% for 24-h urine samples and 21-41% for timed-spot specimens). Among all race-gender groups, overnight specimens had larger volumes (P < 0.01) and lower sodium (P < 0.01 to P = 0.26), potassium (P < 0.01), and chloride (P < 0.01) concentrations compared with other timed-spot urine samples, although the differences were not always significant. Urine creatinine and iodine concentrations did not differ by the timing of collection. The observed day-to-day and diurnal variations in sodium excretion illustrate the importance of accounting for these factors when developing calibration equations from this study. |
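The day-to-day CVs quoted above come from the subsample with two 24-h collections; a generic within-person CV from duplicate collections can be computed as sketched below (a standard duplicate-measures formula, not necessarily the authors' exact computation).

```r
# Generic within-person CV from duplicate 24-h collections. `day1` and `day2` are
# hypothetical vectors of sodium excretion (g/d) for participants with two collections;
# this standard duplicate-measures formula may differ from the authors' computation.
within_sd <- sqrt(mean((day1 - day2)^2) / 2)   # within-person SD from duplicates
100 * within_sd / mean(c(day1, day2))          # within-person CV, %
```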