Last data update: Apr 22, 2024. (Total: 46599 publications since 2009)
Records 1-30 (of 48 Records) |
Query Trace: Jones RL [original query] |
High-throughput determination of ultratrace actinides in urine by in-line extraction chromatography combined with quadrupole inductively coupled plasma mass spectrometry (EC-ICP-MS)
Liu Y , Xiao G , Jones RL . Anal Chem 2022 94 (51) 18042-18049 Determining actinides in urine is vital for occupational exposure monitoring and radiological emergency response because of the toxicity and radiological dose effects of actinides on human health. Traditional radiochemistry analytical methods used to determine actinide concentrations in urine are time-consuming (sample analysis takes several days) and are hindered by a variety of technical and instrumentation-related obstacles. A high-throughput, fully automated, precise, and accurate in-line method was developed for determining five actinides ((241)Am, (239)Pu, (237)Np, (232)Th, and (238)U) at ng/L levels in urine using extraction chromatography combined with quadrupole inductively coupled plasma mass spectrometry (EC-ICP-MS). In this method, the five actinides were successfully separated with the required sensitivity, peak shape, and resolution using a simplified single Eichrom TRU column with a Dionex ICS-5000 system. The separated actinides were subsequently injected into an in-line PerkinElmer (PE) NexION 300D ICP-MS for quantitative determination. The sample-to-sample run time was 23 min for automatic chemical separation and quantification using only 0.5 mL of urine. The limits of detection (LOD) obtained using this method were 0.015, 0.022, 0.039, 4.5, and 2.4 ng/L for (241)Am, (239)Pu, (237)Np, (232)Th, and (238)U, respectively. The method routinely had a chemical yield of >84% as well as a linearity (R(2)) coefficient of ≥0.999 for the calibrators. The method proved to be rapid, reliable, and effective for actinide quantification in urine and therefore is appropriate for radiological emergency response incidents. |
Rapid determination of thorium in urine by quadrupole inductively coupled plasma mass spectrometry (Q-ICP-MS)
Liu Y , Xiao G , Jones RL . J Radioanal Nucl Chem 2022 331 (9) 3957-3964 Inductively coupled plasma mass spectrometry (ICP-MS) has proven to be an excellent analytical technique with high sensitivity for detecting low levels of long-lived radionuclides, such as thorium. However, the high-sensitivity technique increases the memory effect of thorium. This study developed a rapid, high-throughput, simple method for measuring thorium in urine using quadrupole inductively coupled plasma mass spectrometry (Q-ICP-MS). Replacing the commonly used hazardous hydrofluoric acid with a rinse solution of 0.025 mol/L oxalic acid and 5% (v/v) nitric acid eliminated the memory effect of thorium. 233U was used as internal standard in this study. The limit of detection (LOD) for thorium in this study is 0.77 ng/L, which is comparable to those of reported methods using more sophisticated and expensive sector field inductively coupled plasma mass spectrometry (SF-ICP-MS). This proposed method can determine thorium concentrations in urine in both occupationally exposed workers and populations that live in areas with high background levels of thorium. © 2022, Akadémiai Kiadó, Budapest, Hungary. |
Lead poisoning among asymptomatic individuals with a long-term history of opiate use in Golestan Cohort Study
Etemadi A , Hariri S , Hassanian-Moghaddam H , Poustchi H , Roshandel G , Shayanrad A , Kamangar F , Boffetta P , Brennan P , Dargan PI , Dawsey SM , Jones RL , Freedman ND , Malekzadeh R , Abnet CC . Int J Drug Policy 2022 104 103695 BACKGROUND: Recent reports of lead poisoning suggest that people who use opium may be exposed to high amounts of lead. Here, we investigate the association between opium use and blood lead levels (BLL) in a population-based cohort study. METHODS: In 2017, we studied a random sample of 410 people who currently (both within the past year and the past month) used opium and 104 who did not, drawn from participants of the Golestan Cohort Study in northeast Iran. Participants were stratified by sex and tobacco use history, completed a comprehensive opiate and tobacco use questionnaire, and provided blood. BLL was measured with the LeadCare® II Blood Lead Test Kit and validated by inductively coupled plasma triple quadrupole mass spectrometry. BLL was categorized as "<5 µg/dL", "elevated" (5-10 µg/dL), "high" (10-50 µg/dL), and "very high" (above 50 µg/dL). To assess the association between BLL categories and opiate use, route of consumption, and weekly use, we used ordered logistic regression models, and report OR (odds ratio) and 95% CI (confidence interval) adjusted for age, sex, place of residence, education, occupation, household fuel type, and tobacco use. RESULTS: In the cohort, participants used only raw (teriak) or refined (shireh) opium, which were smoked (45%, n = 184), taken orally (46%, n = 189), or both (9%, n = 37), for a mean duration of 24.2 (standard deviation: 11.6) years. The median BLL was significantly higher in people who currently used opium (11.4 µg/dL; IQR: 5.2-23.4) compared with those who did not (2.3 µg/dL; IQR: 2.3-4.2), and the highest median BLL was seen in oral use (21.7 µg/dL; IQR: 12.1-34.1). The BLL was <5 µg/dL among 79.8% of people with no opiate use, compared with only 22.7% in those using opium. 
BLL was elevated in 21.7%, high in 50.5% and very high in 5.1% of people using opium. About 95% of those with oral (180/189) or dual use (35/37) and 55% (102/184) of those who smoked opium had levels of blood lead above 5 µg/dL. The OR for the association between any opium use and each unit of increase in BLL category was 10.5 (95%CI: 5.8-19.1), and oral use of opium was a very strong predictor of increasing BLL category (OR=74.1; 95%CI: 35.1-156.3). This odds ratio was 38.8 (95%CI: 15.9-95.1) for dual use and 4.9 (95%CI: 2.6-9.1) for opium smoking. There was an independent dose-response association between average weekly dose and BLL among people using opium, overall and when stratified by route of use. CONCLUSION: Our results indicate that regular use of lead-adulterated opium can expose individuals to high levels of lead, which may contribute to mortality and cancer risks associated with long-term opium use. |
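The BLL categorization used in this record can be expressed as a small classifier. This is an illustrative sketch only (Python chosen for illustration; the handling of the boundary values 5, 10, and 50 µg/dL is an assumption, since the abstract's category ranges share endpoints):

```python
def bll_category(bll_ug_dl: float) -> str:
    """Map a blood lead level (µg/dL) onto the study's categories.
    Boundary handling at exactly 5, 10, and 50 is an assumption."""
    if bll_ug_dl < 5:
        return "<5"
    if bll_ug_dl < 10:
        return "elevated"   # 5-10 µg/dL
    if bll_ug_dl < 50:
        return "high"       # 10-50 µg/dL
    return "very high"      # above 50 µg/dL
```

For example, the median of 11.4 µg/dL reported for people who used opium falls in the "high" category, while the 2.3 µg/dL median for those who did not falls in "<5".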
Update of the Blood Lead Reference Value - United States, 2021
Ruckart PZ , Jones RL , Courtney JG , LeBlanc TT , Jackson W , Karwowski MP , Cheng PY , Allwood P , Svendsen ER , Breysse PN . MMWR Morb Mortal Wkly Rep 2021 70 (43) 1509-1512 The negative impact of lead exposure on young children and those who become pregnant is well documented but is not well known by those at highest risk from this hazard. Scientific evidence suggests that there is no known safe blood lead level (BLL), because even small amounts of lead can be harmful to a child's developing brain (1). In 2012, CDC introduced the population-based blood lead reference value (BLRV) to identify children exposed to more lead than most other children in the United States. The BLRV should be used as a guide to 1) help determine whether medical or environmental follow-up actions should be initiated for an individual child and 2) prioritize communities with the most need for primary prevention of exposure and evaluate the effectiveness of prevention efforts. The BLRV is based on the 97.5th percentile of the blood lead distribution in U.S. children aged 1-5 years from National Health and Nutrition Examination Survey (NHANES) data. NHANES is a complex, multistage survey designed to provide a nationally representative assessment of health and nutritional status of the noninstitutionalized civilian adult and child populations in the United States (2). The initial BLRV of 5 μg/dL, established in 2012, was based on data from the 2007-2008 and 2009-2010 NHANES cycles. Consistent with recommendations from a former advisory committee, this report updates CDC's BLRV in children to 3.5 μg/dL using NHANES data derived from the 2015-2016 and 2017-2018 cycles and provides helpful information to support adoption by state and local health departments, health care providers (HCPs), clinical laboratories, and others and serves as an opportunity to advance health equity and environmental justice related to preventable lead exposure. 
CDC recommends that public health and clinical professionals focus screening efforts on populations at high risk based on age of housing and sociodemographic risk factors. Public health and clinical professionals should collaborate to develop screening plans responsive to local conditions using local data. In the absence of such plans, universal BLL testing is recommended. In addition, jurisdictions should follow the Centers for Medicare & Medicaid Services requirement that all Medicaid-enrolled children be tested at ages 12 and 24 months or at age 24-72 months if they have not previously been screened (3). |
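The BLRV described above is the 97.5th percentile of the NHANES blood lead distribution. CDC's actual estimate accounts for NHANES's complex survey design (sampling weights, strata, and clusters); the simplified weighted-percentile helper below is only a sketch of the percentile idea, not the official estimator:

```python
import numpy as np

def weighted_percentile(values, weights, q):
    """Simplified weighted percentile (q in [0, 100]).

    Sorts the observations, places each at the midpoint of its
    cumulative-weight interval, and interpolates. A stand-in for the
    survey-weighted estimate CDC derives from NHANES, not a replica."""
    order = np.argsort(values)
    v = np.asarray(values, dtype=float)[order]
    w = np.asarray(weights, dtype=float)[order]
    cum = (np.cumsum(w) - 0.5 * w) / np.sum(w)  # midpoint positions in (0, 1)
    return float(np.interp(q / 100.0, cum, v))
```

With equal weights this reduces to an ordinary interpolated percentile; survey weights shift the estimate toward the observations that represent more of the population.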
Limit of detection comparison on urine gross alpha/beta, H-3, and P-32 analysis between different liquid scintillation counters
Piraner O , Jones RL . J Radioanal Nucl Chem 2021 330 (1) 381-384 As part of the Centers for Disease Control and Prevention’s post-radiological/nuclear incident response mission, we developed rapid bioassay analytical methods to assess possible human exposure to radionuclides and internal contamination. Liquid scintillation counting (LSC) is a valuable analytical tool for the rapid detection and quantification of gross alpha/beta-emitting radionuclides in urine samples. A key characteristic of this type of bioassay method is its detection sensitivity for the priority threat radionuclides. We evaluated the limit of detection of selected LSC instruments to determine which instrument can be used when low-dose measurement is important. © 2021, This is a U.S. government work and not under copyright protection in the U.S.; foreign copyright protection may apply. |
Rapid HPGe well detector gamma bioassay of (137)Cs, (60)Co, and (192)Ir method
Button J , Jones RL . Appl Radiat Isot 2021 175 109824 CDC designed a rapid HPGe bioassay method for (137)Cs, (60)Co, and (192)Ir that is suitable for a public health response to a radiological incident in which people may ingest or inhale radionuclides. The method uses a short count time, a small sample volume, and a detector with a large volume and well size. It measures a patient's urine sample collected post-incident. The levels of concern are directly related to the Clinical Decision Guide levels recommended in National Council on Radiation Protection and Measurements (NCRP) Report No. 161. |
Urine strontium-90 (Sr-90) manual and automated pre-analytical separation followed by liquid scintillation counting
Piraner O , Jones RL . J Radioanal Nucl Chem 2021 329 (1) 383-390 Responding to a radiological or nuclear incident may require assessing tens to hundreds of thousands of people for possible radionuclide contamination. The measurement of radioactive Sr is important because of its impact on people’s health. The existing analytical method for urine Sr-90 analysis using crown ethers is laborious and involves possible exposure to concentrated acids; therefore, this work is devoted to the development of the automated Sr-90 separation process, which became possible with the prepFast pre-analytical system (Elemental Scientific, Inc). © 2021, This is a U.S. government work and not under copyright protection in the U.S.; foreign copyright protection may apply. |
The effect of Sr resin cartridge age on stable Sr recovery methods used in Sr-90 analysis
Piraner O , Jones RL . J Radioanal Nucl Chem 2021 328 369-375 Radioactive strontium is a nuclear fission decay product found in industrial products and nuclear waste and is released during nuclear accidents. Current urine radiostrontium separation methods often are based on the use of Sr resin columns or cartridges (Eichrom Technologies). Most of these analytical methods use stable Sr as a tracer, with subsequent Sr recovery. The gravimetric recovery method requires 120 times more stable Sr than does the inductively coupled plasma mass spectrometry method described here. This difference can affect cartridge performance especially with aging cartridges. © 2021, This is a U.S. government work and not under copyright protection in the U.S.; foreign copyright protection may apply. |
Determination of 239Pu in urine by sector field inductively coupled plasma mass spectrometry (SF-ICP-MS) using an automated offline sample preparation technique
Xiao G , Jones RL . J Radioanal Nucl Chem 2021 328 (1) 277-287 Here we report a new procedure to determine 239Pu in urine using a custom-made automated pre-analytical processing system (single probe) with 242Pu as a tracer, followed by analysis by SF-ICP-MS. An average 242Pu recovery rate of 88% was obtained with CF-ThU-1000 columns reused >100 times. Analytical results agree with measurements obtained using the CDC manual method, with an R2 of 0.9994. Results for Oak Ridge National Laboratory reference materials align with target values, with a bias range of −3.44% to 3.05%. The limit of detection for this method is 0.63 pg/L, which is comparable to that of previous manual methods. |
Development and validation of a biomonitoring method to measure As, Cr, and Ni in human urine samples by ICP-UCT-MS
Jones DR , Jarrett JM , Stukes D , Baer A , McMichael M , Wallon K , Xiao G , Jones RL . Int J Hyg Environ Health 2021 234 113713 We developed an inductively coupled plasma mass spectrometry (ICP-MS) method using Universal Cell Technology (UCT) with a PerkinElmer NexION ICP-MS to measure arsenic (As), chromium (Cr), and nickel (Ni) in human urine samples. The advancements of the UCT allowed us to expand the calibration range, making the method applicable both to the low concentrations typical of biomonitoring and to the high concentrations that may be observed in acute exposures and emergency response. Our method analyzes As and Ni in kinetic energy discrimination (KED) mode with helium (He) gas, and Cr in dynamic reaction cell (DRC) mode with ammonia (NH(3)) gas. The combination of these elements is challenging because a carbon source, ethanol (EtOH), is required to normalize As ionization in urine samples, which creates a spectral overlap ((40)Ar(12)C(+)) on (52)Cr. This method additionally improved lab efficiency by combining elements from two of our previously published methods (Jarrett et al., 2007; Quarles et al., 2014), allowing us to measure Cr and Ni concentrations in urine samples collected as part of the National Health and Nutrition Examination Survey (NHANES) beginning with the 2017-2018 survey cycle. We present our rigorous validation of method selectivity and accuracy using National Institute of Standards and Technology (NIST) Standard Reference Materials (SRM), precision using in-house prepared quality control materials, and a discussion of the use of a modified UCT, a BioUCell, to address an ion transmission phenomenon we observed on the NexION 300 platform when using higher elemental concentrations and high cell gas pressures. The rugged method detection limits, calculated from measurements in more than 60 runs, for As, Cr, and Ni are 0.23 μg L-1, 0.19 μg L-1, and 0.31 μg L-1, respectively. |
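Detection limits like those reported above are commonly derived from the spread of repeated blank or low-level measurements. A minimal sketch of the classical 3-sigma approach (illustrative only; the paper's "rugged" limits pool variability across more than 60 runs, which this sketch does not reproduce):

```python
import statistics

def method_detection_limit(blank_results, k=3.0):
    """Hypothetical detection-limit estimate: k times the standard
    deviation of repeated blank/low-level measurements (3-sigma rule).
    Illustrative only; not the paper's pooled multi-run procedure."""
    return k * statistics.stdev(blank_results)
```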
Universal use of alpha/beta mode in liquid scintillation counting analysis for both alpha/beta and single nuclide determination
Piraner O , Jones RL . J Radioanal Nucl Chem 2021 327 975-983 Nuclear industry advancements and growing concerns about environmental contamination and terrorist activity have increased interest in quantifying radioisotopes in environmental and human samples. Increased presence in the environment, ease of entry into the food chain, nuclear medicine applications, and the possibility of radiological terrorism incidents can lead to human intake of these radionuclides (Radionuclides/Radiation Protection/US EPA (2020). https://www.epa.gov/radiation/radionuclides; Radiation from the Earth (Terrestrial Radiation) (2015) Radiation and Your Health, Centers for Disease Control and Prevention. 7 December. https://www.cdc.gov/nceh/radiation/terrestrial.html). A universal method to screen for and quantify individual radionuclides, as well as gross alpha and beta emitter levels, would address these concerns. |
Urine gross alpha/beta bioassay method development using liquid scintillation counting techniques
Piraner O , Jones RL . J Radioanal Nucl Chem 2021 327 (1) 513-523 In the case of a radiological or nuclear incident, valuable information could be obtained in a timely manner by using the liquid scintillation counting (LSC) technique to rapidly screen urine samples from potentially contaminated persons. This work describes the optimization of LSC parameters on PerkinElmer (PE) Tri-Carb and Quantulus GCT series instruments to develop a rapid method for screening urine in an emergency response situation. |
Determination of 226Ra in urine using triple quadrupole inductively coupled plasma mass spectrometry
Xiao G , Liu Y , Jones RL . Radiat Prot Dosimetry 2020 Measuring 226Ra in urine at low levels is critical for both biomonitoring and radiological emergency response. Here we report a new analytical method to quantify 226Ra, developed and validated with a simple dilute-and-shoot procedure followed by inductively coupled plasma triple quadrupole mass spectrometry detection using 'No Gas MS-MS' mode. The method provides rapid and accurate results for 226Ra with a limit of detection (LOD) down to 0.007 ng/L (0.26 Bq/L). This LOD is well below the recommended action levels for 226Ra detection in children and pregnant women (C/P) set by the Clinical Decision Guide (NCRP Report #161). Results for 226Ra obtained by this method are within ±7.0% of the target values of standard reference materials spiked into urine. |
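The LOD quoted above is given in both mass and activity units. The conversion follows from A = λN with λ = ln 2 / T½. A sketch using standard values for 226Ra (half-life ≈ 1600 y, molar mass ≈ 226 g/mol) reproduces the quoted 0.26 Bq/L:

```python
import math

AVOGADRO = 6.02214076e23      # atoms per mole
SECONDS_PER_YEAR = 3.156e7

def mass_conc_to_activity(ng_per_l, half_life_y, molar_mass):
    """Convert a radionuclide mass concentration (ng/L) to activity (Bq/L).

    A = lambda * N, with lambda = ln(2) / T_half."""
    atoms_per_l = ng_per_l * 1e-9 / molar_mass * AVOGADRO
    decay_const = math.log(2) / (half_life_y * SECONDS_PER_YEAR)  # per second
    return decay_const * atoms_per_l

# 226Ra (T_half ~ 1600 y, M ~ 226 g/mol):
# 0.007 ng/L converts to roughly 0.26 Bq/L, matching the abstract's LOD.
```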
The effect of quench agent on urine bioassay for various radionuclides using Quantulus™ 1220 and Tri-Carb™ 3110
Piraner O , Jones RL . J Radioanal Nucl Chem 2020 326 (1) 657-663 Following a radiological or nuclear incident, the National Response Plan gives the Department of Health and Human Services/Centers for Disease Control and Prevention the responsibility for assessing the population's contamination with radionuclides. In the public health response to the incident, valuable information could be obtained in a timely and accurate manner by using liquid scintillation counting techniques to determine who has been contaminated above background with alpha- and beta-emitting radionuclides. Calibration plays a major role in this process; therefore, knowing the effect of quench agents on calibration is essential. |
Importance of preanalytical factors in measuring Cr and Co levels in human whole blood: Contamination control, proper sample collection, and long-term storage stability
Sommer YL , Ward CD , Georgi JC , Cheng PY , Jones RL . J Anal Toxicol 2020 45 (3) 297-307 A number of errors with potentially significant consequences may be introduced at various points in the analytical process, resulting in skewed, erroneous analytical results. Precautionary procedures such as contamination control, following established sample collection protocols, and having a complete understanding of the long-term stability of the elements of interest can minimize or eliminate these errors. Contamination control is critical in the quantification of Cr and Co in human whole blood. Cr and Co levels in most biological samples are low, but these elements occur naturally in the environment and are often found in commercial and consumer products, which increases the risk of contamination. In this paper, we demonstrated that a lot screening process, in which we pre-screen a subset of the manufactured lots used in collecting, analyzing, and storing blood samples, is a critical step in controlling Cr and Co contamination. Stainless steel needles are often used in blood collection but are considered a potential source of metal contamination in the patient sample. We conducted two studies to determine whether Cr or Co can leach into human whole blood from the needles during blood collection. We analyzed blood collected from 100 donors and blood collected in vitro in the laboratory from a designated vessel containing blood spiked with higher levels of Cr and Co. Two blood tubes were consecutively collected through one needle. In both studies, Cr and Co concentrations in the two consecutively collected tubes were compared. Based on the results from the donor and in vitro blood collection studies, we concluded there was no Cr or Co leaching from the limited sets of stainless steel needles used in these studies. 
Further, we demonstrated that Cr and Co in human whole blood samples are stable for one year when stored at -70 °C, -20 °C, or 4 °C, and for six months at room temperature. |
Determination of 237Np and 239Pu in urine using sector field inductively coupled plasma mass spectrometry (SF-ICP-MS)
Xiao G , Liu Y , Jones RL . J Radioanal Nucl Chem 2020 324 (2) 887-896 Measuring 237Np and 239Pu in urine at low levels is important for both biomonitoring and radiological emergency response. Here we report a newly developed and validated analytical method used to determine 237Np and 239Pu in urine by selective retention of Np and Pu from 2 mL of urine directly onto TEVA® resin followed by SF-ICP-MS (coupled to a membrane desolvating introduction system) detection. The method provides solid phase extraction of Np/Pu with observed recovery ratios ranging from 89 to 113% and rapid results with limits of detection well below recommended detection guidelines for children and pregnant women (NCRP 161 reference). |
Notes from the field: Methylmercury toxicity from a skin lightening cream obtained from Mexico - California, 2019
Mudan A , Copan L , Wang R , Pugh A , Lebin J , Barreau T , Jones RL , Ghosal S , Lee M , Albertson T , Jarrett JM , Lee J , Betting D , Ward CD , De Leon Salazar A , Smollin CG , Blanc PD . MMWR Morb Mortal Wkly Rep 2019 68 (50) 1166-1167 In July 2019, a Mexican-American woman aged 47 years in Sacramento, California, sought medical care for dysesthesias and weakness of her upper extremities. Over the ensuing 2 weeks of outpatient follow-up, her condition progressed to dysarthria, blurry vision, and gait unsteadiness, leading to hospital admission. While hospitalized, her condition declined rapidly to an agitated delirium. Two weeks into the hospitalization, screening blood and urine tests detected mercury concentrations exceeding the upper limit (UL) of quantification, indicative of abnormally high values of mercury (>160 μg/L [blood] and >80 μg/L [urine]). The hospital notified the California Poison Control System (CPCS) and the California Department of Public Health (CDPH). CPCS recommended oral dimercaptosuccinic acid, 10 mg/kg every 8 hours, which was administered via feeding tube. CDPH interviewed the patient’s family and learned that the patient was a long-term user of skin lightening creams obtained from Mexico (applied to the face twice daily for the past 7 years); the cream was analyzed and found to contain 12,000 ppm mercury. Mercury levels from the hospital specimens that initially implicated mercury were 2,620 μg/L blood mercury (reference population UL <1.81 μg/L)* and 110 μg/L urine mercury (UL <0.90 μg/L). A second blood specimen collected 11 days after the hospital initiation of ongoing dimercaptosuccinic acid chelation therapy detected 1,114 μg/L mercury. |
Determination of iodine content in dairy products by inductively coupled plasma mass spectrometry
Vance KA , Makhmudov A , Shakirova G , Roenfanz H , Jones RL , Caldwell KL . At Spectrosc 2018 39 (3) 95-99 A probing study to establish a reliable and robust method for determining iodine concentrations using the ELAN DRC II ICP-MS was performed, combined with a sample digestion and filtration step. Dairy products from locally available sources were evaluated to help determine the possibility of, and need for, further evaluations in relation to the U.S. population’s iodine intake. Prior to analysis, the samples were aliquoted and digested for 3 hours at 90 ± 3 °C. Dilution and filtration were performed following the digestion. The sample extract was analyzed, and the results were confirmed with NIST SRM 1549a Whole Milk Powder. Further experimentation will need to be performed to optimize the method for projected sample concentration and throughput. |
Trace metals screening process of devices used for the collection, analysis, and storage of biological specimens
Ward CD , Williams RJ , Mullenix K , Syhapanha K , Jones RL , Caldwell K . At Spectrosc 2018 39 (6) 219-228 The Centers for Disease Control and Prevention’s (CDC) Environmental Health Laboratory uses modified versions of inductively coupled plasma mass spectrometry (ICP-MS) analytical methods to quantify metal contamination present in items that will come into contact with patient samples during the preanalytical, analytical, and postanalytical stages. This lot screening process allows us to reduce the likelihood of introducing contamination, which can lead to falsely elevated results. This is particularly important when measuring biomonitoring levels in humans, which tend to be near the limit of detection of many methods. The fundamental requirements for a lot screening program in terms of facilities and processes are presented, along with a discussion of the sample preparation techniques used for lot screening. The criteria used to evaluate the lot screening data and determine the acceptability of a particular manufacturing lot are presented as well. As a result of lot testing, unsuitable manufactured lots are identified and excluded from use. |
Pre- and postnatal polybrominated diphenyl ether concentrations in relation to thyroid parameters measured during early childhood
Cowell W , Sjodin A , Jones RL , Wang Y , Wang S , Whyatt R , Factor-Litvak P , Bradwin G , Hassoun A , Oberfield S , Herbstman J . Thyroid 2019 29 (5) 631-641 BACKGROUND: Penta-brominated diphenyl ethers (PentaBDEs) are endocrine-disrupting chemicals that structurally resemble thyroid hormones and were widely used as flame retardants in household consumer products from 1975 to 2004. Polybrominated diphenyl ethers (PBDEs) cross the placenta, and evidence suggests that for many children, body burdens may peak during the toddler years. We aimed to understand the impact of exposure timing by examining both pre- and postnatal exposure to BDE-47, the predominant PentaBDE congener detected in humans, in relation to thyroid hormone parameters measured during early childhood. METHODS: The Columbia Center for Children's Environmental Health Mothers and Newborns Study is a prospective birth cohort of African American and Dominican maternal-child pairs. Pregnant women were recruited from two prenatal clinics in Northern Manhattan and the South Bronx between 1998 and 2006. Participants included 158 children with 1) plasma PBDE concentrations measured at birth and in toddler years (age 2-3), and 2) serum thyroid parameters measured at 3 and/or 5 years. Outcomes included concentrations of serum thyroid stimulating hormone (TSH), free thyroxine (fT4), and total thyroxine (T4). RESULTS: Children with high exposure to BDE-47 during the prenatal period (-17%, 95% CI: -29, -2) or toddler age (-19%, 95% CI: -31, -5) had significantly lower geometric mean TSH levels compared with children with low BDE-47 exposure throughout early life. Associations with T4 were also inverse; however, they did not reach statistical significance at the p=0.05 level. Sex-stratified models suggest associations with postnatal exposure may be stronger among boys than girls. CONCLUSIONS: The thyroid regulatory system may be sensitive to BDE-47 during the prenatal and postnatal periods. |
LAMP: A CDC program to ensure the quality of blood-lead laboratory measurements
Caldwell KL , Cheng PY , Vance KA , Makhmudov A , Jarrett JM , Caudill SP , Ho DP , Jones RL . J Public Health Manag Pract 2019 25 S23-s30 CONTEXT: The Lead and Multielement Proficiency (LAMP) program is an external quality assurance program promoting high-quality blood-lead measurements. OBJECTIVES: To investigate the ability of US laboratories participating in the Centers for Disease Control and Prevention (CDC) LAMP program to accurately measure blood-lead levels (BLL) of 0.70 to 47.5 μg/dL using evaluation criteria of ±2 μg/dL or 10%, whichever is greater. METHODS: The CDC distributes bovine blood specimens to participating laboratories 4 times per year. We evaluated participant performance over 5 challenges on samples with BLL between 0.70 and 47.5 μg/dL. The CDC sent 15 pooled samples (3 samples shipped in 5 rounds) to US laboratories. The LAMP laboratories used 3 primary technologies to analyze lead in blood: inductively coupled plasma mass spectrometry, graphite furnace atomic absorption spectroscopy, and LeadCare technologies based on anodic stripping voltammetry. Laboratories reported their BLL analytical results to the CDC. LAMP uses these results to provide performance feedback to the laboratories. SETTING: The CDC sent blood samples to approximately 50 US laboratories for lead analysis. PARTICIPANTS: Of the approximately 200 laboratories enrolled in LAMP, 38 to 46 US laboratories provided data used in this report (January 2017 to March 2018). RESULTS: Laboratory precision ranged from 0.26 μg/dL for inductively coupled plasma mass spectrometry to 1.50 μg/dL for LeadCare instruments. Participating US LAMP laboratories reported accurate BLL for 89% of challenge samples, using the ±2 μg/dL or 10% evaluation criteria. 
CONCLUSIONS: Laboratories participating in the CDC's LAMP program can accurately measure blood lead using the current Clinical Laboratory Improvement Amendments of 1988 guidance of ±4 μg/dL or ±10%, with a success rate of 96%. However, when we apply limits of ±2 μg/dL or ±10%, the success rate drops to 89%. When challenged with samples that have target values between 3 and 5 μg/dL, nearly 100% of reported results fall within ±4 μg/dL, while 5% of the results fall outside of the acceptability criteria used by the CDC's LAMP program. As public health focuses on lower blood lead levels, laboratories must evaluate their ability to meet the analytical challenges of measuring blood lead at these levels. In addition, the proposed CLIA guidelines (±2 μg/dL or 10%) would be achievable by a majority of US laboratories participating in the LAMP program. |
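The acceptability rule used throughout this record ("±2 μg/dL or 10%, whichever is greater") is simple to state in code. A sketch (the function name and parameter defaults are illustrative, not from the paper):

```python
def lamp_acceptable(measured, target, abs_limit=2.0, rel_limit=0.10):
    """True when |measured - target| (µg/dL) is within ±abs_limit
    or ±rel_limit of the target, whichever is greater."""
    return abs(measured - target) <= max(abs_limit, rel_limit * target)
```

The 1988 CLIA guidance cited in the conclusions (±4 μg/dL or ±10%) corresponds to calling the same check with `abs_limit=4.0`.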
Very low-level prenatal mercury exposure and behaviors in children: the HOME Study
Patel NB , Xu Y , McCandless LC , Chen A , Yolton K , Braun J , Jones RL , Dietrich KN , Lanphear BP . Environ Health 2019 18 (1) 4 BACKGROUND: Mercury is toxic to the developing brain, but the lowest concentration associated with the development of behavior problems is unclear. The purpose of this study was to examine the association between very low-level mercury exposure during fetal development and behavior problems in children. METHODS: We used data from 389 mothers and children in a prospective pregnancy and birth cohort study. We defined the mean prenatal mercury concentration as the mean of total whole blood mercury concentrations in maternal samples collected at 16 and 26 weeks of gestation and at delivery, and in neonatal cord blood samples. We assessed parent-reported child behavior up to five times from 2 to 8 years of age using the Behavioral Assessment System for Children (BASC-2). At 8 years of age, we assessed self-reported child anxiety using the Spence Children's Anxiety Scale (SCAS). We used multiple linear mixed models and linear regression models to estimate the association between mean prenatal mercury concentrations and child behavior and anxiety, respectively. RESULTS: The median prenatal total blood mercury concentration was 0.67 μg/L. Overall, we did not find statistically significant associations between mean prenatal mercury concentrations and behavior problem scores, but a 2-fold increase in mercury concentrations at 16 weeks of gestation was associated with 0.83-point (95% CI: 0.05, 1.62) higher BASC-2 anxiety scores. Maternal and cord blood mercury concentrations at delivery were associated with parent-reported anxiety at 8 years. CONCLUSION: We found limited evidence of an association between very low-level prenatal mercury exposure and behaviors in children, with the exception of anxiety. |
US Centers For Disease Control and Prevention experience in the joint external evaluation process - radiation emergencies technical area
Whitcomb RC Jr , Ansari AJ , Salame-Alfie A , McCurley MC , Buzzell J , Chang A , Jones RL . Radiat Prot Dosimetry 2018 182 (1) 9-13 In 2015-16, the US Department of Health and Human Services led 23 US Government (USG) agencies, including the Centers for Disease Control and Prevention (CDC), and more than 120 subject matter experts in conducting an in-depth review of US core public health capacities and an evaluation of the country's compliance with the International Health Regulations using the Joint External Evaluation (JEE) methodology. This two-part process began with a detailed 'self-assessment' followed by a comprehensive, independent external evaluation conducted by 15 foreign assessors. In the Radiation Emergencies Technical Area, on a scale from 1 (lowest) to 5 (highest), the assessors concurred with the USG self-assessed score of 3 in both of the relevant indicators. The report identified five priority actions recommended to improve USG capacity to handle large-scale radiation emergencies. CDC is working to implement a post-JEE roadmap to address these priority actions in partnership with national and international partners.
Assessing the stability of Cd, Mn, Pb, Se, and total Hg in whole human blood by ICP-DRC-MS as a function of temperature and time
Tevis DS , Jarrett JM , Jones DR , Cheng PY , Franklin M , Mullinex N , Caldwell KL , Jones RL . Clin Chim Acta 2018 485 1-6 BACKGROUND: Comprehensive information on the effect of storage time and temperature on the measurement of elements in human whole blood (WB) by inductively coupled plasma-dynamic reaction cell-mass spectrometry (ICP-DRC-MS) is lacking, particularly for Mn and Se. METHODS: Human WB was spiked at 3 concentration levels, dispensed, and then stored at 5 different temperatures: -70 °C, -20 °C, 4 °C, 23 °C, and 37 °C. At 3 and 5 weeks, and at 2, 4, 6, 8, 10, 12, and 36 months, samples were analyzed for Pb, Cd, Mn, Se, and total Hg using ICP-DRC-MS. We used a multiple linear regression model including time and temperature as covariates to fit the data, with the measurement value as the outcome. We used an equivalence test based on ratios to determine whether results from the test storage conditions (warmer temperatures and longer times) were comparable to the reference storage condition of 3 weeks at -70 °C. RESULTS: Model estimates for all elements in human WB samples stored in polypropylene cryovials at -70 °C were equivalent to estimates from samples stored at 37 °C for up to 2 months, at 23 °C for up to 10 months, and at -20 °C and 4 °C for up to 36 months. Model estimates for samples stored for 3 weeks at -70 °C were equivalent to estimates from samples stored for 2 months at -20 °C, 4 °C, 23 °C, and 37 °C; for 10 months at -20 °C, 4 °C, and 23 °C; and for 36 months at -20 °C and 4 °C. This equivalence held for all elements and pools except the low-concentration blood pool for Cd. CONCLUSIONS: Storage temperatures of -20 °C and 4 °C are equivalent to -70 °C for the stability of Cd, Mn, Pb, Se, and Hg in human whole blood for at least 36 months when blood is stored in sealed polypropylene vials.
Increasing the sample storage temperature from -70 °C to -20 °C or 4 °C can lead to large energy savings. The best analytical results are obtained when storage time at higher temperatures (e.g., 23 °C and 37 °C) is minimized, because recovery of Se and Hg is reduced. Blood samples stored in polypropylene vials also lose volume over time and develop clots at higher temperatures (e.g., 23 °C and 37 °C), making them unacceptable for elemental testing after 10 months and 2 months, respectively.
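The stability analysis above relies on an equivalence test based on ratios of model estimates against a reference storage condition. A minimal, generic sketch of a ratio-based equivalence check is below; the ±10% margin, the 90% confidence level, and the paired-sample layout are illustrative assumptions, not the paper's actual statistical procedure.

```python
import math
from statistics import mean, stdev

def ratio_equivalent(test, ref, margin=0.10, z=1.645):
    """Crude equivalence check on paired measurements: the geometric-mean
    ratio test/ref counts as 'equivalent' if its ~90% CI lies entirely
    within [1 - margin, 1 + margin]. Margin and z are assumptions."""
    log_ratio = [math.log(t / r) for t, r in zip(test, ref)]
    m = mean(log_ratio)
    se = stdev(log_ratio) / math.sqrt(len(log_ratio))
    lo, hi = math.exp(m - z * se), math.exp(m + z * se)
    return (1 - margin) < lo and hi < (1 + margin)

# Nearly identical paired results pass; a ~30% loss fails.
print(ratio_equivalent([10.1, 9.9, 10.0, 10.2], [10.0, 10.0, 10.1, 10.0]))
print(ratio_equivalent([7.0, 7.2, 6.9, 7.1], [10.0, 10.0, 10.0, 10.0]))
```

Requiring the whole confidence interval to sit inside the margin (rather than merely failing to reject a difference) is what distinguishes equivalence testing from an ordinary significance test.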
Measurement challenges at low blood lead levels
Caldwell KL , Cheng PY , Jarrett JM , Makhmudov A , Vance K , Ward CD , Jones RL , Mortensen ME . Pediatrics 2017 140 (2) In 2012, the Centers for Disease Control and Prevention (CDC) adopted its Advisory Committee on Childhood Lead Poisoning Prevention recommendation to use a population-based reference value to identify children and environments associated with lead hazards. The current reference value of 5 µg/dL is calculated as the 97.5th percentile of the distribution of blood lead levels (BLLs) in children 1 to 5 years old from 2007 to 2010 NHANES data. We calculated and updated selected percentiles, including the 97.5th percentile, by using NHANES 2011 to 2014 blood lead data and examined demographic characteristics of children whose blood lead was ≥90th percentile value. The 97.5th percentile BLL of 3.48 µg/dL highlighted analytical laboratory and clinical interpretation challenges of blood lead measurements ≤5 µg/dL. Review of 5 years of results for target blood lead values <11 µg/dL for US clinical laboratories participating in the CDC's voluntary Lead and Multi-Element Proficiency quality assurance program showed that 40% of laboratories were unable to quantify lead and reported a nondetectable result at a target blood lead value of 1.48 µg/dL, compared with 5.5% at a target BLL of 4.60 µg/dL. We describe actions taken at the CDC's Environmental Health Laboratory in the National Center for Environmental Health, which measures blood lead for NHANES, to improve analytical accuracy and precision and to reduce external lead contamination during blood collection and analysis.
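The reference value described above is a distribution percentile: the 97.5th percentile of blood lead levels in young children. A minimal sketch on simulated values is below; real NHANES estimates incorporate complex survey weights and design effects, which this toy example omits.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical right-skewed blood lead values (µg/dL) standing in for
# survey data; the lognormal shape mimics typical biomonitoring data.
bll = rng.lognormal(mean=-0.3, sigma=0.7, size=5000)

# The reference value is defined as the 97.5th percentile of the
# population distribution, not a health-based threshold.
reference_value = np.percentile(bll, 97.5)
print(f"97.5th percentile: {reference_value:.2f} µg/dL")
```

Because the reference value tracks the population distribution, it falls as population exposures decline, which is exactly why measurement precision near and below 5 µg/dL becomes the limiting factor.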
Biomonitoring method for the analysis of chromium and cobalt in human whole blood using inductively coupled plasma-kinetic energy discrimination-mass spectrometry (ICP-KED-MS)
Georgi JC , Sommer YL , Ward CD , Cheng P , Jones RL , Caldwell KL . Anal Methods 2017 9 (23) 3464-3476 The Centers for Disease Control and Prevention developed a biomonitoring method to rapidly and accurately quantify chromium and cobalt in human whole blood by ICP-MS. Many metal-on-metal hip implants, which contain significant amounts of chromium and cobalt, are susceptible to metal degradation. This method is used to gather population data on chromium and cobalt exposure in the U.S. population, excluding people with metal-on-metal hip implants, so that a reference value can be established for baseline levels in blood. We evaluated parameters such as helium gas flow rate, choice and composition of the diluent solution for sample preparation, and sample rinse time to determine the optimal conditions for analysis. The limits of detection for chromium and cobalt in blood were determined to be 0.41 and 0.06 µg L-1, respectively. Method precision, accuracy, and recovery were determined using quality control material created in-house and historical proficiency testing samples. We conducted experiments to determine whether quantitative changes in the method parameters affect the results, varying four parameters while analyzing human whole blood spiked with National Institute of Standards and Technology traceable materials: the dilution factor used during sample preparation, sample rinse time, diluent composition, and kinetic energy discrimination gas flow rate. The results at the increased and decreased levels for each parameter were statistically compared with the results obtained at the optimized parameters. We assessed the degree of reproducibility obtained under a variety of conditions and evaluated the method's robustness by analyzing the same set of proficiency testing samples by different analysts, on different instruments, with different reagents, and on different days.
The short-term stability of chromium and cobalt in human blood samples stored at room temperature was monitored over a period of 64 hours by diluting and analyzing samples at different time intervals. The stability of chromium and cobalt post-dilution was also evaluated over a period of 48 hours and at two storage temperatures (room temperature and refrigerated at 4 °C). The results obtained during the stability studies showed that chromium and cobalt are stable in human blood for a period of 64 hours.
Reply: Iodine content in milk alternatives
Vance K , Makhmudov A , Jones RL , Caldwell K . Thyroid 2017 Response to "Iodine in Milk Alternatives".
Analysis of whole human blood for Pb, Cd, Hg, Se, and Mn by ICP-DRC-MS for biomonitoring and acute exposures
Jones DR , Jarrett JM , Tevis DS , Franklin M , Mullinix NJ , Wallon KL , Derrick Quarles C Jr , Caldwell KL , Jones RL . Talanta 2017 162 114-122 We improved our inductively coupled plasma mass spectrometry (ICP-MS) whole blood method [1] for determination of lead (Pb), cadmium (Cd), and mercury (Hg) by including manganese (Mn) and selenium (Se) and expanding the calibration range of all analytes. The method is validated on a PerkinElmer (PE) ELAN® DRC II ICP-MS (ICP-DRC-MS) and uses the Dynamic Reaction Cell (DRC) technology to attenuate interfering background ion signals via ion-molecule reactions. Methane gas (CH4) eliminates background signal from 40Ar2+ to permit determination of 80Se+, and oxygen gas (O2) eliminates several polyatomic interferences (e.g., 40Ar15N+, 54Fe1H+) on 55Mn+. Hg sensitivity in DRC mode is a factor of two higher than in vented mode when measured under the same DRC conditions as Mn, due to collisional focusing of the ion beam. To compensate for the expanded method's longer analysis time (due to DRC mode pause delays), we implemented an SC4-FAST autosampler (ESI Scientific, Omaha, NE), which vacuum loads the sample onto a loop, to keep the sample-to-sample measurement time to less than 5 min, allowing for preparation and analysis of 60 samples in an 8-h work shift. The longer analysis time also resulted in faster breakdown of the hydrocarbon oil in the interface roughing pump. Replacing the standard roughing pump with a pump using a fluorinated lubricant, Fomblin®, extended the time between pump maintenance. We optimized the diluent and rinse solution components to reduce carryover from high-concentration samples and prevent the formation of precipitates. We performed a robust calculation to determine the following limits of detection (LOD) in whole blood: 0.07 µg dL−1 for Pb, 0.10 µg L−1 for Cd, 0.28 µg L−1 for Hg, 0.99 µg L−1 for Mn, and 24.5 µg L−1 for Se.
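The abstract above cites a "robust calculation" for the LODs without detailing it here. As an illustrative stand-in, a common convention estimates the LOD as three standard deviations of replicate blank measurements; the blank values below are hypothetical.

```python
from statistics import stdev

def lod_3sd(blank_readings):
    """Estimate a limit of detection as 3 x the sample standard deviation
    of replicate blank measurements. This is a common convention; the
    published method's actual 'robust' procedure may differ."""
    return 3 * stdev(blank_readings)

blanks = [0.021, 0.018, 0.025, 0.020, 0.019, 0.023]  # hypothetical µg/L
print(f"LOD estimate: {lod_3sd(blanks):.3f} µg/L")
```

Robust variants substitute outlier-resistant spread estimates (e.g., based on the median absolute deviation) for the ordinary standard deviation, so a single contaminated blank does not inflate the LOD.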
GHSI emergency radionuclide bioassay laboratory network - summary of the second exercise
Li C , Bartizel C , Battisti P , Bottger A , Bouvier C , Capote-Cuellar A , Carr Z , Hammond D , Hartmann M , Heikkinen T , Jones RL , Kim E , Ko R , Koga R , Kukhta B , Mitchell L , Morhard R , Paquet F , Quayle D , Rulik P , Sadi B , Sergei A , Sierra I , de Oliveira Sousa W , Szabomicron G . Radiat Prot Dosimetry 2016 174 (4) 449-456 The Global Health Security Initiative (GHSI) established a laboratory network within the GHSI community to develop collective surge capacity for radionuclide bioassay in response to a radiological or nuclear emergency as a means of enhancing response capability, health outcomes, and community resilience. GHSI partners conducted an exercise, in collaboration with the WHO Radiation Emergency Medical Preparedness and Assistance Network and the IAEA Response and Assistance Network, to test the 18 participating laboratories for their capabilities in in vitro assay of biological samples, using a urine sample spiked with multiple high-risk radionuclides (90Sr, 106Ru, 137Cs, and 239Pu). Laboratories were required to submit their reports within 72 h following receipt of the sample, using a pre-formatted template, on the procedures, methods, and techniques used to identify and quantify the radionuclides in the sample, as well as the bioassay results with a 95% confidence interval. All of the participating laboratories identified and measured all or some of the radionuclides in the sample. However, gaps were identified both in the procedures used to assay multiple radionuclides in one sample and in the methods or techniques used to assay specific radionuclides in urine. Two-thirds of the participating laboratories had difficulties determining all the radionuclides in the sample. Results from this exercise indicate that challenges remain with respect to ensuring that results are delivered in a timely, consistent, and reliable manner to support medical interventions.
Laboratories within the networks are encouraged to work together to develop and maintain collective capabilities and capacity for emergency bioassay, which is an important component of radiation emergency response.
Long-term stability of inorganic, methyl and ethyl mercury in whole blood: Effects of storage temperature and time
Sommer YL , Ward CD , Pan Y , Caldwell KL , Jones RL . J Anal Toxicol 2016 40 (3) 222-8 In this study, we evaluated the effect of temperature on the long-term stability of three mercury species in bovine blood. We used inductively coupled plasma mass spectrometry (ICP-MS) analysis to determine the concentrations of inorganic (iHg), methyl (MeHg), and ethyl (EtHg) mercury species in two blood pools stored at -70, -20, 4, and 23 °C (room temperature) and 37 °C. Over the course of a year, we analyzed aliquots of pooled specimens at 1, 2, 4, and 6 weeks and at 2, 4, 6, 8, 10, and 12 months. We applied a fixed-effects linear model, step-down pairwise comparisons, and coefficient-of-variation statistical analysis to examine temperature and time effects on changes in mercury species concentrations. We observed several instances of statistically significant differences in mercury species concentrations between temperatures and time points; however, with consideration of experimental factors (such as instrumental drift and sample preparation procedures), not all differences were scientifically important. We concluded that iHg, MeHg, and EtHg species in bovine whole blood were stable at -70, -20, 4, and 23 °C for 1 year, but blood samples stored at 37 °C were stable for no more than 2 weeks.