I'm starting a new topic here on the FIGMAS k-ratio consensus measurements. This topic is related to the Open Letter to the Microanalysis Community on the effort to develop high purity, high accuracy synthetic silicates and oxides in ~kilogram quantities for global distribution, as discussed here:
https://probesoftware.com/smf/index.php?topic=1415.0

The initial effort involves small amounts of synthetic materials obtained by John Donovan and Will Nachlas, specifically synthetic MgO, Al2O3 and MgAl2O4. Clearly, for high accuracy measurements on such different materials the instrument at the very least needs properly calibrated dead time constants. That is especially true for modern instruments run at moderate beam currents with large area crystals (and therefore higher count rates), and especially for Cameca instruments with their typically longer dead time constants!
Have you all actually measured the dead time constants on your instruments, or are you simply using the "factory defaults"? It matters for accuracy, and it's not hard! Much helpful information on JEOL instrument dead time calibrations can be found here:
https://probesoftware.com/smf/index.php?topic=394.0

And here for Cameca instrument dead time calibrations:
https://probesoftware.com/smf/index.php?topic=33.0

And general information on dead time calibrations (including using the StartWin application for automated dead time data acquisition) can be found here:
https://probesoftware.com/smf/index.php?topic=1160.0

Of course other instrument calibrations are also very important. For example, have you checked the accuracy of your high voltage power supply using the Duane-Hunt limit test?
https://probesoftware.com/smf/index.php?topic=1063.0

Or the tilt of your stage (most important for SEMs)? Or the effective takeoff angle of your spectrometers? Or the simultaneous k-ratio test on multiple WDS (or EDS!) spectrometers?
https://probesoftware.com/smf/index.php?topic=369.msg1948#msg1948

These are all important things to check, especially as the instrument ages over time. In any event, I wanted to share my own efforts to measure k-ratios on these materials and discuss the effect of the dead time calibration on such k-ratio measurements...
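For anyone who hasn't done this in a while, the idea behind a dead time calibration is simple: with the usual non-paralyzable counting model, the observed rate is the true rate divided by (1 + true rate x tau), and since the true rate should be proportional to beam current on a homogeneous standard, tau can be estimated by measuring count rates over a range of currents and fitting. Here is a minimal sketch in Python with made-up currents and count rates, just to illustrate the arithmetic (it is not the StartWin/Probe for EPMA procedure from the links above, and the function and variable names are mine):

import numpy as np
from scipy.optimize import curve_fit

def observed_rate(current_nA, cps_per_nA, tau):
    # Non-paralyzable dead time model: observed = true / (1 + true * tau),
    # assuming the true count rate is proportional to beam current.
    true_rate = cps_per_nA * current_nA
    return true_rate / (1.0 + true_rate * tau)

# Synthetic "measured" count rates at several beam currents, generated
# here with a pretend dead time of 3.8 us purely for illustration.
currents = np.array([5.0, 10.0, 20.0, 40.0, 80.0])        # nA
measured = observed_rate(currents, 2500.0, 3.8e-6)        # cps

# Fit both the cps-per-nA proportionality and the dead time constant.
popt, pcov = curve_fit(observed_rate, currents, measured, p0=[2000.0, 1.0e-6])
print(f"fitted dead time: {popt[1] * 1e6:.2f} us")        # recovers ~3.80 us

# The same expression inverted is the correction applied to an observed rate:
def corrected_rate(observed_cps, tau):
    return observed_cps / (1.0 - observed_cps * tau)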
So here are my Mg Ka and Al Ka measurements at 10 nA, using element setups that I had (unthinkingly) loaded from 2015 and which still carried somewhat outdated dead time constants of 3.0 and 2.9 us respectively:
St 3100 Set 3 MgAl2O4 FIGMAS, Results in Elemental Weight Percents
ELEM: Mg Al O
TYPE: ANAL ANAL SPEC
BGDS: EXP EXP
TIME: 40.00 40.00 ---
BEAM: 10.06 10.06 ---
ELEM: Mg Al O SUM
141 17.399 38.789 44.985 101.174
142 17.300 38.787 44.985 101.073
143 17.244 38.544 44.985 100.772
144 17.250 38.722 44.985 100.958
145 17.329 38.724 44.985 101.038
AVER: 17.305 38.713 44.985 101.003
SDEV: .064 .100 .000 .150
SERR: .029 .045 .000
%RSD: .37 .26 .00
PUBL: 17.084 37.931 44.985 100.000
%VAR: 1.29 2.06 .00
DIFF: .221 .782 .000
STDS: 3012 3013 ---
STKF: .4740 .4353 ---
STCT: 582.29 753.61 ---
UNKF: .1341 .2700 ---
UNCT: 164.74 467.49 ---
UNBG: .67 .73 ---
ZCOR: 1.2904 1.4336 ---
KRAW: .2829 .6203 ---
PKBG: 247.22 639.60 ---
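As a quick arithmetic check on how those footer values hang together (to within rounding of the printed digits), the raw k-ratio times the standard k-factor gives the unknown k-factor, and multiplying by the ZAF factor gives the weight percent:

# Quick check using the footer values printed above (to within rounding):
for elem, kraw, stkf, zcor in [("Mg", 0.2829, 0.4740, 1.2904),
                               ("Al", 0.6203, 0.4353, 1.4336)]:
    unkf = kraw * stkf               # compare to the UNKF row
    wtpct = unkf * zcor * 100.0      # compare to the AVER row
    print(f"{elem}: UNKF = {unkf:.4f}, wt% = {wtpct:.2f}")
# Mg: UNKF = 0.1341, wt% = 17.30    Al: UNKF = 0.2700, wt% = 38.71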
As one can see, the values compared to ideal stoichiometry aren't too bad, but both are a little high. Of course one could compare different matrix corrections to double check the accuracy, since the matrix correction effects here are quite large, roughly 29% and 43% respectively (the ZCOR factors above), as seen here:
Summary of All Calculated (averaged) Matrix Corrections:
St 3100 Set 3 MgAl2O4 FIGMAS
LINEMU Henke (LBL, 1985) < 10KeV / CITZMU > 10KeV
Elemental Weight Percents:
ELEM: Mg Al O TOTAL
1 17.305 38.713 44.985 101.003 Armstrong/Love Scott (default)
2 17.213 39.034 44.985 101.232 Conventional Philibert/Duncumb-Reed
3 17.276 38.986 44.985 101.247 Heinrich/Duncumb-Reed
4 17.307 38.887 44.985 101.179 Love-Scott I
5 17.301 38.704 44.985 100.990 Love-Scott II
6 17.248 38.500 44.985 100.733 Packwood Phi(pz) (EPQ-91)
7 17.451 38.831 44.985 101.267 Bastin (original) Phi(pz)
8 17.333 39.217 44.985 101.535 Bastin PROZA Phi(pz) (EPQ-91)
9 17.318 39.096 44.985 101.400 Pouchou and Pichoir-Full (PAP)
10 17.303 38.915 44.985 101.203 Pouchou and Pichoir-Simplified (XPP)
AVER: 17.305 38.888 44.985 101.179
SDEV: .062 .211 .000 .225
SERR: .020 .067 .000
MIN: 17.213 38.500 44.985 100.733
MAX: 17.451 39.217 44.985 101.535
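Note how tightly the ten models agree with each other compared to how far they all sit above the published stoichiometry. A quick comparison using the printed averages and standard deviations (and the PUBL values from the first table):

# Model-to-model spread vs. excess over the published values:
for elem, aver, sdev, publ in [("Mg", 17.305, 0.062, 17.084),
                               ("Al", 38.888, 0.211, 37.931)]:
    spread = 100.0 * sdev / aver           # relative spread across the ten models
    excess = 100.0 * (aver - publ) / publ  # relative excess over the published value
    print(f"{elem}: spread {spread:.2f} %rel vs excess {excess:.2f} %rel")
# Mg: spread 0.36 %rel vs excess 1.29 %rel
# Al: spread 0.54 %rel vs excess 2.52 %rel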
So they all look a little high, so what could be the problem? The ten matrix correction models agree with each other far better than any of them agrees with the published stoichiometry, so the matrix correction is probably not the culprit. To me this points to a dead time correction that is too small: if the dead time constants are too low, the high count rate pure oxide (primary standard) intensities are under-corrected more than the unknown intensities, so the k-ratios, and hence the concentrations, come out too high (and your spectrometer dead times will only get longer over time as the instrument ages!). And sure enough, when I looked at the dead time constants in the PFE Elements/Cations window, they were from a dead time calibration I had done 7 years ago in 2015: 3.0 us and 2.9 us respectively!
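To see why a too-low dead time constant pushes the k-ratios (and hence the totals) up, here is a minimal numerical sketch. The count rates are made up purely for illustration (they are not the rates from the runs above), and the usual non-paralyzable expressions are assumed; the point is just that the high count rate oxide standard gets under-corrected more than the lower count rate unknown:

TAU_ACTUAL  = 3.8e-6   # pretend this is the real detector dead time (seconds)
TAU_APPLIED = 3.0e-6   # the outdated constant actually used in the correction

def observe(true_cps, tau):
    # What a non-paralyzable counting chain reports for a given true rate.
    return true_cps / (1.0 + true_cps * tau)

def correct(observed_cps, tau):
    # The usual dead time correction applied in software.
    return observed_cps / (1.0 - observed_cps * tau)

std_true, unk_true = 30_000.0, 8_500.0   # illustrative standard and unknown rates (cps)
std_corr = correct(observe(std_true, TAU_ACTUAL), TAU_APPLIED)
unk_corr = correct(observe(unk_true, TAU_ACTUAL), TAU_APPLIED)
k_true, k_wrong = unk_true / std_true, unk_corr / std_corr
print(f"k-ratio error from the stale dead time: {100.0 * (k_wrong / k_true - 1.0):+.1f} %")   # about +1.7 %

With these made-up rates the k-ratio comes out almost 2% high, which is the right sign and the right ballpark for the excesses seen above.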

So, one could edit the dead time constants for each element in each sample, but that would be tedious. If only there were an easier way... and yes, there is. Just go to the Analytical | Update Dead Time Constants menu, open the dialog, select all samples (standards and unknowns), and edit the dead time value for that spectrometer (and crystal), as seen here:

By the way, this Update Dead Time Constants dialog automatically loads the current dead time constants from the SCALERS.DAT file (it's almost as though it were designed for exactly this situation!).

So using the newly calibrated dead time constants (from 2019!) of 3.8 us and 3.5 us respectively for Mg and Al, we obtain the following results:
St 3100 Set 3 MgAl2O4 FIGMAS, Results in Elemental Weight Percents
ELEM: Mg Al O
TYPE: ANAL ANAL SPEC
BGDS: EXP EXP
TIME: 40.00 40.00 ---
BEAM: 10.06 10.06 ---
ELEM: Mg Al O SUM
141 17.343 38.710 44.985 101.038
142 17.244 38.708 44.985 100.938
143 17.188 38.464 44.985 100.637
144 17.194 38.643 44.985 100.823
145 17.273 38.645 44.985 100.903
AVER: 17.248 38.634 44.985 100.868
SDEV: .064 .100 .000 .150
SERR: .028 .045 .000
%RSD: .37 .26 .00
PUBL: 17.084 37.931 44.985 100.000
%VAR: .96 1.85 .00
DIFF: .165 .703 .000
STDS: 3012 3013 ---
So now we have relative variances of 0.96% and 1.85% respectively, which really isn't too bad at all for extrapolating from pure MgO and Al2O3 to MgAl2O4. But since that 2019 dead time calibration is now about 3 years old, I'm going to re-run it as soon as I get a chance, because if I look at a test run at 6 nA (compared to the previous run at 10 nA), I obtain these results:
St 3100 Set 8 MgAl2O4 FIGMAS, Results in Elemental Weight Percents
ELEM: Mg Al O
TYPE: ANAL ANAL SPEC
BGDS: EXP EXP
TIME: 40.00 40.00 ---
BEAM: 6.03 6.03 ---
ELEM: Mg Al O SUM
216 17.123 38.535 44.985 100.643
217 17.230 38.581 44.985 100.797
218 17.053 38.557 44.985 100.595
219 17.219 38.757 44.985 100.961
220 17.199 38.502 44.985 100.686
AVER: 17.165 38.587 44.985 100.737
SDEV: .075 .100 .000 .146
SERR: .034 .045 .000
%RSD: .44 .26 .00
PUBL: 17.084 37.931 44.985 100.000
%VAR: .47 1.73 .00
DIFF: .081 .656 .000
STDS: 3012 3013 ---
Now we are even closer, with relative variances of 0.47% and 1.73% respectively. So I suspect the dead times have increased slightly since 2019, and I will run a new dead time calibration and report the new results as soon as Julie lets me have some time on the instrument! 😁
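For what it's worth, the better agreement at the lower beam current is just what one would expect if the applied dead time constant is still slightly too small, since the size of the error scales with count rate. A quick illustration with the same sort of made-up numbers as above (assuming, purely hypothetically, that the true dead time has drifted up to 4.2 us while 3.8 us is applied):

TAU_ACTUAL, TAU_APPLIED = 4.2e-6, 3.8e-6   # hypothetical drift since the 2019 calibration

def observe(true_cps, tau):
    return true_cps / (1.0 + true_cps * tau)      # non-paralyzable counting model

def correct(observed_cps, tau):
    return observed_cps / (1.0 - observed_cps * tau)

# Illustrative count rates at ~10 nA, scaled by 0.6 for the ~6 nA run.
for label, scale in [("10 nA", 1.0), (" 6 nA", 0.6)]:
    std_true, unk_true = 30_000.0 * scale, 8_500.0 * scale
    k_true = unk_true / std_true
    k_wrong = (correct(observe(unk_true, TAU_ACTUAL), TAU_APPLIED) /
               correct(observe(std_true, TAU_ACTUAL), TAU_APPLIED))
    print(f"{label}: residual k-ratio error {100.0 * (k_wrong / k_true - 1.0):+.2f} %")
# 10 nA: about +0.86 %      6 nA: about +0.51 %

So with these illustrative numbers, dropping the beam current roughly halves the residual bias, consistent with the trend in the two runs above, although only a fresh calibration will pin down the actual constants.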