Investigation of a method for the correction of self-absorption by Planck function in laser-induced breakdown spectroscopy
Abstract
The electron density and temperature of a laser-induced plasma can be determined from the width and intensity of spectral lines, provided that the corresponding optical transitions are optically thin. However, the lines in laser-induced plasmas are often self-absorbed. One method for correcting this effect is based on the Planck function and an iterative numerical calculation of the plasma temperature. In this study, the method is explored further and its inherent errors and limitations are evaluated. To this end, synthetic spectra are used that fully correspond to the assumed conditions of a homogeneous isothermal plasma in local thermodynamic equilibrium. Based on the error analysis, the advantages and disadvantages of the method are discussed in comparison with other methods of self-absorption correction.
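The physical basis of Planck-function-based correction can be illustrated numerically. The sketch below is not the authors' algorithm; it only shows the standard radiative-transfer relation for a homogeneous, isothermal LTE plasma layer, I(λ) = B(λ, T)·(1 − e^(−τ)), under which a strongly self-absorbed line saturates at the Planck function B(λ, T). The function names and the example wavelength, temperature, and optical depths are illustrative assumptions.

```python
import math

# Physical constants (SI units)
H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temperature_k):
    """Blackbody spectral radiance B(lambda, T), W / (m^2 sr m)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = H * C / (wavelength_m * KB * temperature_k)
    return a / math.expm1(b)

def layer_radiance(wavelength_m, temperature_k, optical_depth):
    """Radiance emitted by a homogeneous LTE plasma layer of optical
    depth tau: I = B(lambda, T) * (1 - exp(-tau)).  For tau << 1 the
    line is optically thin (I ~ tau * B); for tau >> 1 the line is
    fully self-absorbed and saturates at the Planck function B."""
    return planck_radiance(wavelength_m, temperature_k) * (-math.expm1(-optical_depth))

# Hypothetical example: a line at 400 nm in a 10 000 K plasma
wl, T = 400e-9, 1.0e4
B = planck_radiance(wl, T)
thin = layer_radiance(wl, T, 0.05)    # optically thin regime
thick = layer_radiance(wl, T, 50.0)   # optically thick regime
print(thin / B, thick / B)
```

Because the saturated intensity depends only on temperature, comparing a measured line peak with B(λ, T) gives a handle on the self-absorption, which is why the correction requires an (iterative) estimate of the plasma temperature.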
- This article is part of the themed collection: JAAS HOT Articles 2023