Analysis of relative standard deviation of spectral line intensity and intensity ratio in laser-induced breakdown spectroscopy using CuIn1−xGaxSe2 thin film samples
Abstract
The precision of laser-induced breakdown spectroscopy (LIBS) measurements is influenced by both the shot noise of the detector and the shot-to-shot fluctuation noise of the laser plasma. Since both types of noise are related to variations in photon number, it is difficult to estimate how much each noise source contributes to the measured LIBS signal. In this work, to quantitatively estimate the shot noise of an intensified charge-coupled device detector, a model in which the detector noise is linearly correlated with the signal intensity was developed. Using a stable light source for intensity calibration, the correlation parameter of the detector noise was determined. By subtracting the relative standard deviation (RSD) of the detector shot noise from the RSD of the measured LIBS signal intensity or intensity ratio, the RSD due to plasma fluctuation alone was obtained. From LIBS analysis of CuIn1−xGaxSe2 thin films under various experimental conditions (laser energy: 0.16–2.9 mJ, gate delay: 0.12–3.6 μs, spot size: 42 and 104 μm), it was found that the plasma-fluctuation-induced RSD of the absolute intensity tended to increase with gate delay and to decrease with laser fluence. The plasma-fluctuation-induced RSD of the intensity ratio, however, tended to decrease initially with gate delay, reaching a minimum before increasing again at longer gate delays. This behavior of the plasma-fluctuation-induced RSD of the intensity ratio is understood to be related to the spectral properties of the emission lines and the elements of interest.
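To illustrate the noise-separation idea summarized above, the following Python sketch shows one possible way to remove a detector-noise contribution, modeled as linear in the signal intensity, from the measured shot-to-shot RSD. This is a minimal sketch rather than the authors' procedure: the parameter names a and b, the quadrature subtraction (which assumes the two noise sources are statistically independent), and the function names are assumptions introduced here for illustration only.

```python
import numpy as np

def rsd(values):
    """Relative standard deviation (sample std / mean) of shot-to-shot intensities."""
    values = np.asarray(values, dtype=float)
    return values.std(ddof=1) / values.mean()

def detector_rsd(mean_intensity, a, b=0.0):
    """Hypothetical linear detector-noise model: sigma_det = a * I + b.
    The parameters a and b would be calibrated with a stable light source,
    as described in the abstract; their form here is an assumption."""
    return (a * mean_intensity + b) / mean_intensity

def plasma_rsd(measured_intensities, a, b=0.0):
    """Estimate the plasma-fluctuation-only RSD by removing the detector
    contribution in quadrature (assumes independent noise sources)."""
    total = rsd(measured_intensities)
    det = detector_rsd(np.mean(measured_intensities), a, b)
    return np.sqrt(max(total**2 - det**2, 0.0))

# Example with synthetic shot-to-shot line intensities (arbitrary units).
intensities = np.array([1020.0, 980.0, 1005.0, 995.0, 1030.0, 970.0])
print(plasma_rsd(intensities, a=0.01))
```

The same decomposition can be applied to an intensity ratio by first forming the shot-to-shot ratio of two line intensities and then subtracting the corresponding detector-noise RSD estimate.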