Biomonitoring for manganese (Mn) exposure is important due to its potential to cause adverse health effects. In this study, we investigate how different sample preparation methods (dilution, digestion), calibration protocols (aqueous, blood-based, standard additions), and instrumental techniques affect Mn method bias and analytical imprecision. The techniques used included graphite furnace atomic absorption spectrometry (GFAAS), dynamic reaction cell inductively coupled plasma mass spectrometry (DRC-ICP-MS), and sector field (SF-) ICP-MS. We analyzed NIST SRM 1643e Trace Elements in Water and SRM 1598a Inorganic Constituents in Animal Serum (both certified for Mn), and SRM 955c Toxic Metals in Caprine Blood – Level 1 (not certified for Mn). Various matrix effects in ICP-MS produced inaccurate results for SRM 1643e and discrepant results for SRM 955c. In the absence of a certified value for Mn in SRM 955c, we assigned a "consensus" value by combining data from the New York State Department of Health (NYS), the Centers for Disease Control and Prevention (CDC) and the Centre de toxicologie du Québec (CTQ). With this interlaboratory approach, we established an "all-lab" consensus value of 16.3 ± 0.8 μg L⁻¹ based on data from DRC-ICP-MS with dilution sample preparation and blood-based calibration. We also assigned an "all-method" consensus value of 16.3 ± 0.9 μg L⁻¹ based on GFAAS and SF-ICP-MS data from the NYS lab and the DRC-ICP-MS all-lab consensus value. Although the expanded uncertainty (U) calculated for the consensus values may not fully account for all sources of uncertainty, it does show the relative variation that might be expected from one study to the next for the determination of Mn in blood.
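The interlaboratory consensus approach described above can be illustrated with a minimal sketch. This example assumes a simple unweighted mean of independent lab means and an expanded uncertainty U = k·u with a coverage factor k = 2; the study's actual uncertainty budget may include additional components, and the lab means shown are hypothetical values for illustration only, not the study's data.

```python
import statistics


def consensus_value(lab_means, k=2.0):
    """Combine independent laboratory means into a consensus value.

    Returns (mean, U), where u is the standard error of the
    unweighted mean and U = k * u is the expanded uncertainty.
    This is a simplified model, not the paper's exact method.
    """
    n = len(lab_means)
    mean = statistics.fmean(lab_means)
    u = statistics.stdev(lab_means) / n ** 0.5  # standard uncertainty of the mean
    return mean, k * u


# Hypothetical per-lab means (ug/L) -- illustrative only
m, U = consensus_value([16.0, 16.5, 16.4])
print(f"consensus = {m:.1f} +/- {U:.1f} ug/L")
```

A weighted mean (weights inversely proportional to each lab's variance) or a random-effects model would be natural refinements when the labs report very different within-lab uncertainties.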