Hi all,
I would like to ask for help or comments about a sudden increase in off-peak X-ray background intensities from samples in a similar mean atomic number range, measured under the same analytical conditions.
The issue is: we have been measuring trace elements in olivines of varying composition over four different sessions, using a single set of conditions and settings:
- 13 elements: Si, Mg (EDS); Na, Al, K, Ca, P, Fe, Mn, Zn, Ti, Cr, Ni (WDS)
- 15 kV, 500 nA, 3 µm beam
- 360 s on peak, 180 s each on upper and lower background (off-peak)
- exponential background model, ±2 mm off-peak positions
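To make the off-peak scheme above concrete, here is a minimal sketch of a two-point exponential background interpolation (an assumption on my part, illustrating the general technique rather than the probe software's exact implementation; the function name and argument order are my own):

```python
import math

def interp_background_exponential(x_lo, i_lo, x_hi, i_hi, x_peak):
    """Interpolate the background at the peak position from lower and
    upper off-peak measurements, assuming I(x) = a * exp(b * x).

    x_* are spectrometer positions (e.g. mm), i_* are background
    intensities (e.g. cps/nA).  Two points determine a and b exactly.
    """
    b = math.log(i_hi / i_lo) / (x_hi - x_lo)   # slope in log space
    a = i_lo / math.exp(b * x_lo)               # amplitude from lower point
    return a * math.exp(b * x_peak)

# Example: equal off-peak counts give a flat background at the peak.
print(interp_background_exponential(10.0, 1.3, 14.0, 1.3, 12.0))  # → 1.3
```

A linear model is the other common choice; the exponential form usually tracks the continuum shape better when the two off-peak intensities differ strongly.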
These are the measured and interpolated background intensities at the Al Kα position with a TAP crystal:
(Time intervals: 3 weeks between the 1st and 2nd sessions, 2 days between the 2nd and 3rd, and 6 hours between the 3rd and 4th.)
You can see that the background levels in the 2nd and 4th sessions are higher than those in the 1st and 3rd sessions. (Beam, spectrometer and detector settings were *exactly the same* for all sessions, except for a minor change of peak position.) This increase is seen for all WDS elements, not only Al.
And here are the background X-ray intensities vs. the calculated mean atomic number (Z-bar) of the olivines:
The mean atomic number (MAN) range of 10.5 to 13.5 reflects the different olivine compositions. The background levels between 1.2 and 1.4 cps/nA in the 1st and 3rd sessions seem normal, because they sit within the range of past background measurements on our probe from other matrices of similar mean atomic number. Why are there two different background-vs-MAN correlation lines even though we used the same analytical settings?
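For reference, the MAN values quoted above are consistent with the simplest mass-fraction-weighted definition of Z-bar; a minimal sketch (element list and function name are my own, and electron-fraction weighting is another common convention):

```python
# Atomic number and atomic mass for the elements in olivine end-members.
ATOMIC = {"O": (8, 15.999), "Mg": (12, 24.305),
          "Si": (14, 28.086), "Fe": (26, 55.845)}

def zbar_mass(formula_counts):
    """Mass-fraction-weighted mean atomic number: Z-bar = sum(c_i * Z_i),
    where c_i are mass fractions, for a mineral given as a dict of
    element -> atoms per formula unit."""
    masses = {el: n * ATOMIC[el][1] for el, n in formula_counts.items()}
    total = sum(masses.values())
    return sum((m / total) * ATOMIC[el][0] for el, m in masses.items())

# Forsterite (Mg2SiO4) lands near the low end of the 10.5-13.5 range.
print(round(zbar_mass({"Mg": 2, "Si": 1, "O": 4}), 2))  # → 10.58
```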
All olivine grains were in a single mount, and vacuum conditions remained good throughout all sessions. There are daily temperature variations in our lab, but the 3rd session alone took 3 days, so we believe we can rule out problems with the coating, vacuum and temperature.
So far we cannot find any reason for this dichotomy. Any comments would be helpful.
Thank you.