Probeman recently reported that when calculating detection limit maps in CalcImage, some elements would display zero for all pixels, as seen here for F and O mapped at 10 keV and 30 nA with a 30 msec dwell time per pixel in an AlN sample:
![](https://probesoftware.com/smf/gallery/1_12_05_20_11_59_17.jpeg)
For those who didn't already know, one can calculate detection limit (and analytical sensitivity) maps in CalcImage when quantifying x-ray maps. Here are some of the calculation options in CalcImage:
![](https://probesoftware.com/smf/gallery/1_12_05_20_11_59_01.jpeg)
Unlike x-ray mapping calibration curve methods, we can perform a rigorous detection limit calculation because we do a complete quantification on each pixel, including deadtime, background, matrix, and spectral interference corrections. The background intensity is, of course, necessary for detection limit calculations.
Now why did the F Ka detection limit map work, but not the O Ka map? Well, these maps were corrected using the MAN background correction, which eliminates the need to acquire separate off-peak maps, which can double map acquisition times. MAN background corrections, explained here:
https://probesoftware.com/smf/index.php?topic=307.0
are wonderful for point analyses but really "shine" for quantitative x-ray mapping, because they take about half the time and produce better sensitivity. The full peer-reviewed paper on the MAN background correction, published in American Mineralogist (2016), is available here for those who want all the nasty statistical details:
https://epmalab.uoregon.edu/publ/A%20new%20EPMA%20method%20for%20fast%20trace%20element%20analysis%20in%20simple%20matrices.pdf
So what is going on with the O Ka detection limit map? Well, as explained in the paper, there is the variance on the peak measurement, but also a variance on the background measurement. Traditionally, when using off-peak backgrounds, the calculation is straightforward: we take the interpolated off-peak intensity and calculate its standard deviation (assuming Gaussian statistics, by taking the square root of the counts), because the variance is determined by the continuum statistics, just as it would be for an on-peak measurement at zero concentration.
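To make the off-peak case concrete, here is a minimal Python sketch (the function names and the 3-sigma criterion are illustrative assumptions, not CalcImage's actual code):

```python
import math

def offpeak_background_sigma(bg_counts: float) -> float:
    """Under Poisson counting statistics the variance of the interpolated
    off-peak background equals the counts themselves, so the standard
    deviation is simply the square root of the counts."""
    return math.sqrt(bg_counts)

def detection_limit_counts(bg_counts: float) -> float:
    """A common 3-sigma style detection limit estimate, in counts."""
    return 3.0 * offpeak_background_sigma(bg_counts)

print(detection_limit_counts(400.0))  # sigma = 20 counts, so DL = 60.0 counts
```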
But when we perform an MAN background correction, the variance of the background measurement is not limited by continuum statistics, but rather by the variance of the average atomic number of the sample. And the average atomic number is determined by the concentrations of the major elements.
Think of it this way: if you were measuring trace elements in a known matrix, say SiO2 or ZrSiO4, or pyrite, etc., and you *did not* measure the major elements, but simply specified the matrix by difference or as fixed concentrations, then the variance of those major element concentrations is close to zero, hence the variance of the average atomic number is close to zero, and therefore the variance of the background intensity is close to zero.
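A tiny Python sketch of that point (the mass-fraction-weighted average Z below is one common definition; the numbers are merely illustrative):

```python
def average_z(concs, zs):
    """Mass-fraction-weighted average atomic number (one common definition)."""
    return sum(c * z for c, z in zip(concs, zs))

# SiO2 specified as a fixed matrix: the Si and O weight fractions are
# constants, so the average Z is identical for every pixel, and the
# MAN-predicted background intensity therefore carries (almost) no variance.
zbar_sio2 = average_z([0.4674, 0.5326], [14, 8])  # wt fractions of Si and O
print(round(zbar_sio2, 2))  # about 10.8
```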
![Cool 8)](https://probesoftware.com/smf/Smileys/default/cool.gif)
If instead we do actually measure the major elements, we still get better MAN background statistics than by directly measuring the off-peak intensities, because the variance of the measured major elements is still going to be much smaller than the continuum statistics.
And in addition, of course, we tend to build the MAN calibration from several points on several standards covering the range of average Z we need. So we get excellent sensitivity, and as for accuracy, well, that's what the blank correction is for! The accuracy is around 200 to 300 PPM in most cases. If you don't believe these claims, please read the linked paper above. It should answer your questions (the second author was originally a strong skeptic too), but if you still have questions, please reply to this topic.
So anyway, because the MAN background intensity is determined using such a radically different method than what we do for typical off-peak backgrounds, we also need to calculate the MAN variance differently, because merely taking the square root significantly overestimates the variance of the MAN background intensities.
This involves including the statistics of the MAN fit (which is a much smaller contribution than one would think, but see the paper for the details), but in any case, for this calculation we need at least three points (standards) on the MAN calibration curve. Because two points always fit a line!
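A pure-Python illustration of why two points are not enough (illustrative only, not the software's actual fitting code):

```python
def line_fit_residual_ss(xs, ys):
    """Least-squares straight-line fit; returns the sum of squared residuals,
    which is what any estimate of the fit variance must be built from."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    return sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))

# Two MAN standards: the line passes exactly through both points, leaving
# zero residuals, hence no information about the scatter of the fit.
print(line_fit_residual_ss([10.0, 14.0], [50.0, 70.0]))  # 0.0

# Three standards leave a residual degree of freedom to estimate scatter.
print(line_fit_residual_ss([10.0, 14.0, 20.0], [50.0, 71.0, 99.0]))  # > 0
```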
![Smiley :)](https://probesoftware.com/smf/Smileys/default/smiley.gif)
So the rigorous MAN variance calculation cannot be performed for an MAN curve using only two standards (or, God forbid, only one standard!). So for elements with fewer than three MAN standards, we revert to a crude estimate: calculate the Gaussian variance on the intensity and then reduce it by 30%, since on average the MAN statistics improve on continuum statistics by about the square root of two (think about it).
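A one-line Python sketch of that fallback (hypothetical function name; the 0.7 factor is the ~30% reduction described above, roughly 1/sqrt(2)):

```python
import math

def crude_man_sigma(bg_counts: float) -> float:
    """Fallback when fewer than three MAN standards are available:
    take the Poisson standard deviation of the background counts and
    reduce it by about 30%, since MAN statistics improve on continuum
    statistics by roughly the square root of two on average."""
    return math.sqrt(bg_counts) * 0.7  # 0.7 is approximately 1/sqrt(2)

print(crude_man_sigma(400.0))  # 20 counts * 0.7 = 14.0
```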
So what was the "bug" in the oxygen detection limit map? Well, when the code routine found that the number of MAN standards was fewer than three, it dropped out of the routine, but didn't bother to load the alternative calculation! Bad routine!
![Angry >:(](https://probesoftware.com/smf/Smileys/default/angry.gif)
But this is now fixed, and the updated software is uploaded and ready for you to install using the Help menu in Probe for EPMA (yes, this same detection limit bug was also present in Probe for EPMA's MAN detection limit calculations). And here are the detection limits re-calculated with the new code:
![](https://probesoftware.com/smf/gallery/1_12_05_20_1_13_07.jpeg)