Probe Software Users Forum

General EPMA => Discussion of General EPMA Issues => Topic started by: Hwayoung Kim on October 20, 2019, 11:08:38 PM

Title: Unexplained Changes in Spectrometer Intensities
Post by: Hwayoung Kim on October 20, 2019, 11:08:38 PM
Hi all,

I would like to ask for help or comments about a sudden increase in off-peak X-ray background intensities from samples in a similar mean atomic number range, measured under the same analytical conditions.

The issue is: we have been measuring trace elements in olivines of varying compositions over 4 different sessions, with a single set of conditions and settings:
  - 13 elements: Si, Mg (EDS) / Na, Al, K, Ca, P, Fe, Mn, Zn, Ti, Cr, Ni (WDS)
  - 15 kV, 500 nA, 3 µm beam
  - 360 s on peak, 180 s each for the upper and lower (off-peak) backgrounds
  - exponential background model, ±2 mm off-peak positions
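For reference, the off-peak interpolation step can be sketched as a two-point exponential fit. This is only an illustration: the positions and intensities below are made up, and the actual exponential background model in the software may use a different functional form.

```python
import math

def interp_background(x_lo, i_lo, x_hi, i_hi, x_peak):
    """Two-point exponential interpolation I(x) = a * exp(b * x).

    x_lo/x_hi: spectrometer positions of the lower/upper off-peak
    measurements (e.g. peak position -/+ 2 mm); i_lo/i_hi: their
    background intensities (cps/nA); x_peak: on-peak position.
    """
    b = math.log(i_hi / i_lo) / (x_hi - x_lo)   # decay constant from the two points
    a = i_lo * math.exp(-b * x_lo)              # amplitude from the lower point
    return a * math.exp(b * x_peak)

# Hypothetical numbers: off-peaks at 87.5 and 91.5 mm around an
# 89.5 mm peak position.
bkg = interp_background(87.5, 1.40, 91.5, 1.10, 89.5)
```

When the peak sits exactly midway between the two off-peak positions, this reduces to the geometric mean of the two background intensities.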

These are the measured and interpolated background intensities at the Al Kα position on a TAP crystal:
(https://probesoftware.com/smf/gallery/1632_18_10_19_12_15_53.png)
(Time intervals: 3 weeks between the 1st and 2nd sessions, 2 days between the 2nd and 3rd, and 6 hours between the 3rd and 4th.)

You can see that the background levels in the 2nd & 4th sessions are higher than those in the 1st & 3rd sessions. (Beam, spectrometer, and detector settings are *exactly the same* for all sessions, except for minor changes of peak position.) And this increase is seen for all WDS elements, not only Al.

And here are the background X-ray intensities vs. the calculated mean atomic number (Z-BAR) of the olivines:
(https://probesoftware.com/smf/gallery/1632_18_10_19_12_17_05.png)

The mean atomic number (MAN) range from 10.5 to 13.5 reflects the different olivine compositions. The background levels between 1.2 and 1.4 cps/nA in the 1st and 3rd sessions seem normal, because they sit within the range of past background measurements on our probe from other matrices of similar mean atomic number. Why are there two different background-vs-MAN correlation lines even though we used the same analytical settings?
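For what it's worth, the simple mass-fraction-weighted mean atomic number used in such plots can be sketched as below. This is a minimal illustration: other exponent-weighted MAN definitions exist, and the atomic weights here are rounded.

```python
# Zbar = sum over elements of (mass fraction * atomic number)
Z = {"O": 8, "Mg": 12, "Si": 14, "Fe": 26}        # atomic numbers
A = {"O": 15.999, "Mg": 24.305, "Si": 28.086, "Fe": 55.845}  # atomic weights

def zbar(formula):
    """formula: dict of element -> atoms per formula unit."""
    mass = {el: n * A[el] for el, n in formula.items()}
    total = sum(mass.values())
    return sum(mass[el] / total * Z[el] for el in formula)

fo = zbar({"Mg": 2, "Si": 1, "O": 4})   # forsterite, Mg2SiO4
```

Pure forsterite comes out near 10.6, i.e. the low end of the 10.5 to 13.5 range quoted above, with more Fe-rich compositions pushing Zbar upward.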

All olivine grains were in a single mount, and vacuum conditions stayed good throughout the sessions. There are daily temperature variations in our lab, but the 3rd session alone took 3 days, so we think we can rule out problems with the coating, vacuum, and temperature.

So far, we cannot find any reason for this dichotomy. Any comments would be helpful.
Thank you.
Title: Re: Unexplained Changes in Spectrometer Intensities
Post by: Probeman on October 21, 2019, 11:48:03 AM
Hi Hwayoung,
Are you only seeing this "dichotomy" in intensities on spectrometer 2?
john
Title: Re: Unexplained Changes in Spectrometer Intensities
Post by: Hwayoung Kim on October 21, 2019, 06:24:05 PM
Hi John,

No, we are seeing this in all five spectrometers.

(https://probesoftware.com/smf/gallery/1632_21_10_19_7_11_51.png)


Hwayoung
Title: Re: Unexplained Changes in Spectrometer Intensities
Post by: Probeman on October 21, 2019, 06:30:53 PM
OK, then it has to be the high voltage power supply that provides the bias voltage for the flow detectors.

Or maybe a bad P-10 bottle?

Are the intensities changing minute by minute?  It's only around 30-50% more counts at the background level. You probably wouldn't notice it when counting peak intensities, I assume?

Weird.
Title: Re: Unexplained Changes in Spectrometer Intensities
Post by: Hwayoung Kim on October 21, 2019, 07:38:15 PM
We are not seeing short-term changes; it changes session by session.

Each session took 1-2 days or more, and the background level is constant within a session (checked with the chart recorder in the JEOL software).

In sessions where the background intensity increased, the peak intensity also increased, so the background increase does not affect the net intensities.
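A toy sketch of why such a change need not bias the results: if a session-wide sensitivity change acts as a single multiplicative factor on both peak and background, it scales the net intensity too, but it cancels in the k-ratio as long as standards are measured in the same session. All numbers below are made up.

```python
g = 1.4  # hypothetical session-to-session gain factor

p_std, b_std = 100.0, 1.2   # standard peak/background, cps/nA (made up)
p_unk, b_unk = 4.0, 1.2     # unknown (trace element) peak/background

k_normal = (p_unk - b_unk) / (p_std - b_std)
# Same session, everything scaled by the same factor g:
k_scaled = (g * p_unk - g * b_unk) / (g * p_std - g * b_std)
# g cancels, so k_scaled equals k_normal
```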

That's why we think this is related to the detectors (probably bias voltage or gain?). Our spectrometers 3, 4, and 5 have sealed Xe counters, so I think P-10 is not the problem.
Title: Re: Unexplained Changes in Spectrometer Intensities
Post by: Probeman on October 21, 2019, 07:45:59 PM
Do the different sessions use different bias voltages on the detectors?

What happens if you change the bias voltage manually?  Do the counts visibly change? 

Can you monitor the detector bias power supply?
john
Title: Re: Unexplained Changes in Spectrometer Intensities
Post by: Probeman on October 22, 2019, 07:57:20 AM
Hwayoung,
You might also want to use the Drift application to check how your spectrometer intensities have changed over time:

https://probesoftware.com/smf/index.php?topic=575.msg3565#msg3565

Using Drift one can search for all runs that utilize a specific standard on a specific spectrometer/crystal at a specific condition.
Title: Re: Unexplained Changes in Spectrometer Intensities
Post by: Mike Jercinovic on October 22, 2019, 08:42:36 AM
I would certainly have a look at the comparison of measured and absorbed current before and after each point. Then compare session to session.  Could be something is not right with current measurement or regulation.
Title: Re: Unexplained Changes in Spectrometer Intensities
Post by: JohnF on October 22, 2019, 11:29:15 AM
Please describe your PHA conditions. Are you using differential? With a wide or tight window? If there were significant weather (=barometric pressure) changes between the sessions, and you were using a tight window, there conceivably might be a change in the pulse energy position (=height). Though the binary (2 of each condition) seems somewhat fortuitous.
Title: Re: Unexplained Changes in Spectrometer Intensities
Post by: Hwayoung Kim on October 22, 2019, 09:27:49 PM
Quote
Do the different sessions use different bias voltages on the detectors?

What happens if you change the bias voltage manually?  Do the counts visibly change? 

Can you monitor the detector bias power supply?

The bias voltage was the same for all sessions. I will check how the counts change with bias voltage later, when our probe is available.


Quote
I would certainly have a look at the comparison of measured and absorbed current before and after each point. Then compare session to session.  Could be something is not right with current measurement or regulation.

Unfortunately I did not check or record the absorbed currents for these measurements. I will look at the absorbed current if I see this intensity increase again later.


Quote
Please describe your PHA conditions. Are you using differential? With a wide or tight window? If there were significant weather (=barometric pressure) changes between the sessions, and you were using a tight window, there conceivably might be a change in the pulse energy position (=height). Though the binary (2 of each condition) seems somewhat fortuitous.

We used differential mode with a very wide window, from 0.7 to 10.0 V.
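As an illustration of why such a wide window is forgiving, a differential PHA can be sketched as a simple acceptance filter on pulse height (the pulse heights below are made up):

```python
def pha_accept(pulse_heights, baseline=0.7, upper=10.0):
    """Differential PHA: keep only pulses whose height (in volts)
    falls inside the [baseline, upper] window."""
    return [v for v in pulse_heights if baseline <= v <= upper]

# With a 0.7-10.0 V window, almost any plausible pulse-height
# distribution passes, so a modest shift of the pulse-height peak
# alone should not change the accepted count rate much.
pulses = [0.5, 1.2, 3.4, 4.1, 9.8, 10.5]
accepted = pha_accept(pulses)  # drops only 0.5 and 10.5
```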


Quote
You might also want to use the Drift application to check how your spectrometer intensities have changed over time:

Actually, we saw this intensity increase not only in the olivines but also in routine standards measurements.
These are background intensities on the 5 spectrometers, measured on the JEOL MgO standard at 15 kV with variable (but mostly 20 nA) beam currents, using the same spectrometer/crystal configuration and detector settings since June 2018.
(https://probesoftware.com/smf/gallery/1632_22_10_19_7_52_30.png)

In mid-September 2019 (red arrow; a day between the 1st and 2nd olivine sessions) there was a sudden increase in intensities on Spec 1, 2, and probably 5, compared to the usual counts over the long term. And this was seen in many other standards measurements on that day.
(For Spec 5, the sudden drop in intensities in April 2019 was due to replacement of some parts in the spectrometer.)
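A crude way to quantify a step like this from the Drift output is to compare the mean intensity before and after the candidate date (the values below are made up, just to illustrate the magnitude of the jump):

```python
import statistics

def step_ratio(series, split):
    """Ratio of mean intensity after vs. before a candidate change
    point -- a rough measure of the jump seen in a Drift plot."""
    return statistics.mean(series[split:]) / statistics.mean(series[:split])

# Hypothetical background values (cps/nA) bracketing mid-September:
bkg = [1.25, 1.31, 1.28, 1.30, 1.27, 1.72, 1.69, 1.75, 1.71]
jump = step_ratio(bkg, split=5)  # about a 34 % step
```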

Title: Re: Unexplained Changes in Spectrometer Intensities
Post by: Probeman on October 23, 2019, 09:46:29 AM
Maybe there was a solar flare that day?    ;)   

Seriously, I'm sure you've got the bias and gain properly set on your detectors, but I do have a question regarding JEOL detectors vs. Cameca detectors.

On Cameca instruments we have two types of detectors: both are P-10 flow detectors, but some are 1 atm pressure and some are 2 atm pressure. On the 1 atm detectors the bias settings are usually around 1300 to 1350 volts, but on the 2 atm detectors the bias settings are usually around 1850 volts for stable "proportional" photon response.

On JEOL instruments (and I speak as one who has never owned a JEOL instrument so please correct me if I am wrong), there are also two detector types: a flow P-10 detector and a sealed Xe detector.  From the Cameca perspective, I find it odd that both JEOL detector types seem to want to be set around 1650 to 1700 volts for proper proportional operation.   Is that because both JEOL detectors are running at 1 atm pressure?
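For what it's worth, the pressure dependence of the required bias can be illustrated with the Diethorn gas-gain relation for a cylindrical proportional counter. This is only a sketch: the ΔV and K constants below are textbook values for P-10, and the wire and tube radii are assumed, not actual Cameca or JEOL geometry.

```python
import math

# Diethorn relation:
#   ln M = (V / ln(b/a)) * (ln 2 / dV) * ln( V / (p * a * ln(b/a) * K) )
DV = 23.6        # potential per avalanche generation, volts (P-10, textbook)
K = 4.8e4        # minimum E/p for multiplication, V/(cm*atm) (P-10, textbook)
A_WIRE = 0.0025  # anode wire radius, cm (assumed)
B_TUBE = 1.0     # cathode (tube) radius, cm (assumed)

def gas_gain(bias, pressure_atm):
    """Gas multiplication factor M at a given bias (V) and fill pressure."""
    lnba = math.log(B_TUBE / A_WIRE)
    return math.exp((bias / lnba) * (math.log(2) / DV)
                    * math.log(bias / (pressure_atm * A_WIRE * lnba * K)))

# Same bias, doubled pressure -> much lower gain, so a 2 atm counter
# needs a substantially higher bias to recover the same proportional
# response:
m1 = gas_gain(1850, 1.0)
m2 = gas_gain(1850, 2.0)
```

The absolute gains from these assumed numbers shouldn't be taken seriously, but the direction of the effect matches the Cameca behavior: at fixed bias, doubling the fill pressure sharply reduces the multiplication.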
Title: Re: Unexplained Changes in Spectrometer Intensities
Post by: Mike Jercinovic on October 23, 2019, 12:13:43 PM
John, maybe there is more information out there now, but there is a Geller and Herrington paper, High Count Rate Electron Probe Microanalysis (Journal of Research of NIST, v107, no6 2002, pp 503-508), where they state in reference to the JEOL Xe counters... "For the sealed counter the xenon gas may be at a partial pressure of 1300 Pa to 7800 Pa (10 torr to 60 torr)".  This should make sense if the total pressure is about 760 torr given the absorption of moderate energy x-rays through about a 5 or 10% Xe in methane, at least with some crude calculations I did awhile back. 
Title: Re: Unexplained Changes in Spectrometer Intensities
Post by: Probeman on October 23, 2019, 01:04:33 PM
Hi Mike,
That is interesting.

So you're thinking that the exact gas mixture (10% Ar versus 10% Xe) doesn't matter so much, but that the difference in gas pressure in the Cameca detectors is why they run at such different bias voltages for proportional response?
john
Title: Re: Unexplained Changes in Spectrometer Intensities
Post by: Mike Jercinovic on October 23, 2019, 01:50:33 PM
Well, P-10 is 10% methane, balance Ar, so the mixture means a lot (about 90% Ar compared to 10% Xe). Doubling the Ar counter pressure to 2 bars dramatically increases the quantum efficiency for photon energies above 4 keV or so, but the gas then becomes quite dense and absorbing for photon energies much below 2 keV.  So everyone uses Ar at just above 1 bar for low-energy photon detection.
Title: Re: Unexplained Changes in Spectrometer Intensities
Post by: Probeman on October 23, 2019, 01:58:58 PM
Hi Mike,
Yes, I get that. But I thought you were guessing that the JEOL Xe detectors were filled with 5 to 10% Xe and the balance with methane to 760 torr (1 atm)?

Quote
This should make sense if the total pressure is about 760 torr given the absorption of moderate energy x-rays through about a 5 or 10% Xe in methane, at least with some crude calculations I did awhile back.

So similar to 1 atm P-10, just replacing the Ar with Xe?

I'm just trying to understand why both JEOL detectors seem to use about the same bias voltage, but Cameca detectors (1 atm vs. 2 atm) are so different in their required bias voltages...
Title: Re: Unexplained Changes in Spectrometer Intensities
Post by: Mike Jercinovic on October 23, 2019, 02:33:05 PM
I guess I am just being more confusing than usual.
P-10 is 90% Ar
"Xe" in Xe sealed counters appears to be about 10% Xe (via the Geller and Herrington paper)
Both are operating at about 1 bar.

So JEOL can use a bias voltage of around 1700 V for either counter type to efficiently avalanche ionizations, given the physical dimensions of their counters.  Despite the large difference in the gas mixtures, Xe is very efficient (54 electrons) compared to Ar (18 electrons) at the same pressure.
Cameca operates the low-pressure P-10 counters at a lower bias voltage to optimize low-energy photon detection given the physical dimensions of their counters, and then requires a higher voltage for an efficient proportional response at the higher pressure used for higher-energy photon detection.
Title: Re: Unexplained Changes in Spectrometer Intensities
Post by: Probeman on October 23, 2019, 03:47:25 PM
Hi Mike,
No I'm just being dense.  I thought that is what you meant when you said:

Quote
This should make sense if the total pressure is about 760 torr given the absorption of moderate energy x-rays through about a 5 or 10% Xe in methane, at least with some crude calculations I did awhile back. 

I'm surprised that they would use so little Xe in their sealed detector gas mixture.  But in any event, you think it's the higher pressure in the high-energy Cameca detectors that requires the higher bias voltage.

Thanks for explaining all this.
Title: Re: Unexplained Changes in Spectrometer Intensities
Post by: Probeman on October 23, 2019, 04:33:52 PM
This article here:

https://science.mcmaster.ca/radgrad/images/6R06CourseResources/4R6Notes3_GasFilled_Detectors.pdf

Mentions proportional counters using Xe or Kr:

Quote
The type of fill gas used is dependent on the function the counter is to perform. Commonly used gases for β measurements are the noble gases. These often require a quench gas, however. Cost dictates that argon is commonly used, usually as a mixture of 90% argon with 10% methane. This is called P-10 gas. For better γ-ray detection the fill gas is switched to krypton or xenon.

This Wiki article says something similar:

https://en.wikipedia.org/wiki/Proportional_counter

Quote
Usually the detector is filled with a noble gas; they have the lowest ionization voltages and do not degrade chemically. Typically neon, argon, krypton or xenon are used. Low-energy x-rays are best detected with lighter nuclei (neon), which are less sensitive to higher-energy photons. Krypton or xenon are chosen for higher-energy x-rays or for higher desired efficiency.

Often the main gas is mixed with a quenching additive. A popular mixture is P10 (10% methane, 90% argon).

Typical working pressure is 1 atmosphere (about 100 kPa)

Neither mentions noble gas percentages except for P-10 of course.