This board is for general EDS issues, but for the first post I'd like to make a radical suggestion for improving EDS analysis:
Simply eliminate EDS "standardless" analysis!
Too radical for you? Let's explore this question...
1. Dale Newbury at NIST estimates that 95-98% of all EDS analyses are performed "standardless".
2. Dale Newbury has demonstrated in the past that, although "standardless" EDS can be fairly accurate in some situations (e.g., major and minor elements in steel alloys), there are a very large number of other materials for which it performs much worse. New comparison studies of "standardless" analysis by NIST are forthcoming, but initial results reported by Dale Newbury suggest that things have *not* improved much, if at all.
3. Dale Newbury and Nicholas Ritchie have demonstrated that in many situations (though not all!), EDS performed carefully with standards can approach and often equal WDS analysis. But again, this is only being done some 5% or less of the time!
4. At the risk of "tooting my own horn," I should mention that EDS analysis in Probe for EPMA can *only* be performed using full standards: the net intensity from the standard is combined with the unknown's net intensity to create an actual k-ratio, which is then cranked through the matrix correction along with the WDS k-ratios. This is very easy because a full EDS spectrum is stored with each WDS analysis for all samples (standards and unknowns), and these are automatically acquired using stage automation.
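To make point 4 concrete, the standards-based workflow (net intensities → k-ratio → matrix correction) can be sketched roughly as below. This is just an illustrative sketch, not Probe for EPMA's actual code: the function names, the intensity values, and the single lumped ZAF factor are all hypothetical, and real matrix corrections are of course computed iteratively from the full composition.

```python
# Hypothetical sketch of standards-based EDS quantification:
# net intensities from standard and unknown form a k-ratio, and a
# matrix (ZAF) correction converts it to a concentration.

def k_ratio(net_unknown: float, net_standard: float) -> float:
    """k-ratio: unknown net intensity over standard net intensity."""
    return net_unknown / net_standard

def first_approximation(k: float, c_standard: float) -> float:
    """Castaing's first approximation: C_unknown ~ k * C_standard."""
    return k * c_standard

def matrix_corrected(k: float, c_standard: float, zaf: float) -> float:
    """Apply a (precomputed, purely illustrative) ZAF correction factor."""
    return k * c_standard * zaf

# Made-up counts for illustration only:
k = k_ratio(net_unknown=4500.0, net_standard=9000.0)  # k = 0.5
c0 = first_approximation(k, c_standard=1.0)           # pure-element standard
c = matrix_corrected(k, c_standard=1.0, zaf=1.08)     # hypothetical ZAF
print(k, c0, c)
```

The point is that with a stored standard spectrum, the unknown's concentration follows from a measured ratio rather than from a modeled detector efficiency, which is exactly what "standardless" analysis has to guess at.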
What am I suggesting? Am I suggesting that every EDS user use standards every time? In a word, yes. We know it provides the best data by far and heck, EPMA users *always* use standards!
Does that mean that every SEM lab must obtain suitable standards? Again, yes.
Is this a bad thing? No.
If every EDS lab needed standards, far more resources and effort would be applied to obtaining excellent standards, as described in more detail here:
http://probesoftware.com/smf/index.php?topic=301.0

What do you all think?