Author Topic: Proposal for a New Feature- Hyper-Gamma Correction  (Read 7812 times)

John Donovan

  • Administrator
  • Emeritus
  • *****
  • Posts: 3304
  • Other duties as assigned...
    • Probe Software
Proposal for a New Feature- Hyper-Gamma Correction
« on: July 02, 2013, 11:55:48 AM »
Jason Herrin (JEOL 8530 in Singapore) asks if we can calibrate BSE images using quant data, and as Ed Vicenzi points out, this is an old method that has its share of problematic issues.  The main issue is that the BSE image doesn't really contain any chemical information, but rather only average atomic number contrast, which could be utilized (as Cameca has recently tried to implement) as a sort of MAN background correction by correlating BSE intensity with continuum measurements.

But besides the problem of using a proxy for continuum intensity, there is the additional issue that even in this somewhat useful case one cannot easily perform an absorption correction on the continuum intensities, so the accuracy at minor and particularly trace levels is very problematic.

Anyway, I'm sure there are still some situations in which a "gamma" type correction could be useful for correlating BSE intensity to some chemical data type, but I haven't thought of anything especially sexy, though Jason suggests the following:

Quote
"Their goal is to take advantage of the high spatial resolution (and ease) of electron imaging for small features in systems where a single chemical parameter dictates BSE contrast, like Fo-Fa substitution in olivine or An-Ab in plag. Then export BSE intensity profiles across features wherein BSE intensity has been calibrated to EPMA analyses performed independently on selected regions of the image. Obviously they are doing diffusion work – which is subject to large errors anyway  :P One could conceivably jazz it up with options like Gaussian calibration points and rectangular profiles with median-weighted intensities and the like."

What I have thought of (and it may not be original, but it is cool) is to make a new window in CalcImage where one could import any analog signal image (or heck even a raw x-ray map) and then simply specify a PFE quant sample and a data type to correlate to the image raw data.

The program would then find the X and Y coordinates of each point analysis in the specified sample and use that data to calibrate the BSE intensity for each pixel. Automatically!  Dare I say with "just two mouse clicks"!   ;)
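
Just to make the idea concrete, here is a minimal sketch in Python of what that calibration could look like, assuming a simple linear relation between grey level and the chosen data type. The calibrate_bse helper and its inputs are hypothetical, not existing PFE/CalcImage code:

Code:
import numpy as np

def calibrate_bse(bse, points):
    # bse: 2D array of image grey levels.
    # points: list of (x_pix, y_pix, value) tuples, one per point analysis
    # in the specified quant sample, with coordinates already in pixels.
    grey = np.array([bse[y, x] for x, y, _ in points], dtype=float)
    vals = np.array([v for _, _, v in points], dtype=float)
    slope, intercept = np.polyfit(grey, vals, 1)  # value = slope*grey + intercept
    return slope * bse + intercept                # per-pixel "semi-quant" map

A real implementation would also need the stage-to-pixel coordinate conversion and some sanity checks (sufficient spread in the calibration values, fit residuals, etc.), but the core fit really is about that simple.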

What do you all think?
« Last Edit: July 02, 2013, 12:20:47 PM by John Donovan »
John J. Donovan, Pres. 
(541) 343-3400

"Not Absolutely Certain, Yet Reliable"

Karsten Goemann

  • Global Moderator
  • Professor
  • *****
  • Posts: 228
Re: Proposal for a New Feature- Hyper-Gamma Correction
« Reply #1 on: July 04, 2013, 07:01:14 PM »
Actually, this is something we've thought about in the past for those exact two systems: An content in plagioclase and Fo content in olivine. A couple of years ago we tried this for plagioclase, but for the sample set we had there wasn't enough variation in the An content to get decent calibration curves. But one of my users has recently collected BSE images and probe measurements on a set of olivine samples where he wanted to try something similar. I could check with him whether we could use that sample set for testing purposes?

It would be good if CalcImage could perform this on external BSE images, e.g. collected on an SEM, as we're collecting a lot of large-scale BSE maps automatically on our SEMs. I'm not sure how this could be done, as the probe measurement positions would then somehow have to be registered within those images, or CI would have to parse a probe measurement dataset to which we would then have to assign BSE grey values.
Another reason for "external calibration" is that the apparent position of the deflected beam on our Cameca SX100 probe is not necessarily exactly the same as the corresponding stage position, due to scanning hardware hysteresis issues.
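
For what it's worth, the registration step could conceivably be a least-squares affine fit between stage coordinates and a few user-picked tie points in the image. A rough Python sketch under that assumption (hypothetical helpers, not a CI API):

Code:
import numpy as np

def fit_affine(stage_xy, pixel_xy):
    # stage_xy, pixel_xy: (N, 2) arrays of N >= 3 corresponding tie points.
    stage_xy = np.asarray(stage_xy, dtype=float)
    pixel_xy = np.asarray(pixel_xy, dtype=float)
    # Augment with ones so the fit includes translation: [x y 1] @ T = [u v]
    A = np.column_stack([stage_xy, np.ones(len(stage_xy))])
    T, *_ = np.linalg.lstsq(A, pixel_xy, rcond=None)
    return T                                  # (3, 2) stage -> pixel transform

def stage_to_pixel(T, stage_xy):
    pts = np.atleast_2d(np.asarray(stage_xy, dtype=float))
    return np.column_stack([pts, np.ones(len(pts))]) @ T

An affine fit absorbs scale, rotation, and any constant offset between beam deflection and stage position, though not the nonlinear part of the hysteresis; that would need more tie points and a higher-order model.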

Cheers, Karsten

John Donovan

  • Administrator
  • Emeritus
  • *****
  • Posts: 3304
  • Other duties as assigned...
    • Probe Software
Re: Proposal for a New Feature- Hyper-Gamma Correction
« Reply #2 on: July 09, 2013, 06:39:28 PM »
OK, so here's my question:  are we simply talking about a method to calculate the end-member values for various minerals (e.g., olivine Fo to Fa) using x-ray maps, or is there an intrinsic value to using a BSE image for this purpose?

I agree that there could be real advantages to utilizing BSE images (e.g., saving time), notwithstanding the fact that the BSE signal itself does not contain any explicit compositional information.

But I would like to hear all of your thoughts regarding this question.
John J. Donovan, Pres. 
(541) 343-3400

"Not Absolutely Certain, Yet Reliable"

Jason in Singapore

  • Post Doc
  • ***
  • Posts: 15
    • FACTS Lab at NTU
Re: Proposal for a New Feature- Hyper-Gamma Correction
« Reply #3 on: July 11, 2013, 07:27:25 PM »
 
Thank you for starting this thread. The goal is to develop an easy-to-use tool to calibrate BSE images to EPMA analyses performed independently on selected regions of the image. This would allow the user to take advantage of the high spatial resolution and ease of electron imaging to characterize small features in systems where a single chemical parameter dictates BSE contrast, like Fo-Fa substitution in olivine or An-Ab in plagioclase. As an example, think of fine-scale oscillatory zoning frequently observed in volcanic plagioclase. These features are often too small to accurately characterize by x-ray techniques, yet there are usually patches within the same phenocryst that are homogeneous and could be used to calibrate BSE intensity. There are a handful of papers out there where people have done this using makeshift methodology with image analysis software like ImageJ, so this is a wheel that has been re-invented a few times with varying levels of sophistication.

Firstly, let’s establish that there are many potential pitfalls with this method and that the results can only ever be semi-quantitative at best. Moving on with the discussion, Probe Software has a huge advantage for this application because it already has the point locations on the image stored along with the chemical data for those point locations, so all of the data entry has already been done. That saves a lot of squinting and clicking. I envision a feature where you could click on “calibrate BSE intensity to quant” and then you select or deselect which analyses you would use for calibration. The result would be a pixel-by-pixel semi-quant dataset, which would be mostly latent, but you could use it to export the information you want.

 
Output (for our purposes):
In the case of diffusion modeling, what is desired is usually a one-dimensional concentration profile. So you can imagine a software feature where you draw a line on the image and it outputs the concentration of some or all elements along the length of that traverse. For finite difference modeling, the user probably has a pre-defined step distance in their model, so they will want the profile quantized into distance increments of, let's say, 1 or 0.1 µm. Ideally, the user could define a "superpixel" size along the length of the traverse they have drawn that would dictate the area of pixels to average. This superpixel size should be independently adjustable in the directions parallel with and orthogonal to the traverse. Bonus points if the software then updates the image with little rectangular (or ellipsoidal?) boxes showing the location of each superpixel along the traverse. The ability to overlap superpixels might also be useful.
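
As a sketch of what that extraction might look like (a hypothetical Python/numpy helper; rectangular superpixels only, and overlap simply means choosing a step smaller than the parallel superpixel dimension):

Code:
import numpy as np

def traverse_profile(cal_map, p0, p1, step, par, perp):
    # Sample a calibrated map along the line p0 -> p1 (pixel coordinates),
    # averaging a par x perp rectangle aligned with the traverse at each step.
    # Superpixels falling entirely off the image come back as NaN.
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    d = p1 - p0
    length = np.hypot(d[0], d[1])
    u = d / length                      # unit vector along the traverse
    v = np.array([-u[1], u[0]])         # unit vector orthogonal to it
    ny, nx = cal_map.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    rel = np.stack([xx - p0[0], yy - p0[1]], axis=-1)
    s = rel @ u                         # pixel distance along the traverse
    t = rel @ v                         # pixel distance off the traverse
    centers = np.arange(0.0, length + step / 2, step)
    prof = [cal_map[(np.abs(s - c) <= par / 2) & (np.abs(t) <= perp / 2)].mean()
            for c in centers]
    return centers, np.array(prof)

Converting the pixel distances to micrometres is then just a matter of the image scale, and par/perp map directly onto the independently adjustable superpixel dimensions described above.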
 

Here are a few additional thoughts:
 
 - John has suggested calibrating the MAN instead of a single chemical parameter (e.g. Fo or An). This is a great idea.
 
 - The option to include or exclude elements from the MAN/BSE calibration (and output results) might be useful, just so that irregular distribution of minor/trace elements can be ignored for the sake of simplicity. For instance, you might want to exclude Fe from the calibration of a BSE image of plagioclase if you were only interested in An and thought that there were reasons why the Fe distribution did not precisely mimic An content in a specific ROI.
 
 - A feature to adjust the pixel area used for calibration. I suppose the default setting might be the mean BSE intensity over a 1 µm diameter circle, but in the case of plagioclase we sometimes use a 3 µm beam. Or maybe at low kV you would want a 1 µm circle with a Gaussian center-weighted average (see the sampling sketch after this list).

 - Some of the more sophisticated diffusion modelers are moving toward 2D modeling, so the ability to export semi-quant for arbitrary rectangles, in addition to line profiles, might also come in handy. This is also true for x-ray maps. Again, bonus points if the software then updates the image with little rectangular or ellipsoidal boxes showing the location of each superpixel and can accommodate overlap or asymmetry.
 
 - Would it be possible to work in cation space? Produce image and numeric output with user-defined chemical parameters (e.g. Fo, An)?
 
 - What is the minimum number of points one would need to calibrate an image? I don't think this should be necessary, but would one have to define cation substitution mechanisms (e.g. CaAl2Si2O8 - NaAlSi3O8) in order to calibrate the BSE image using only 2 EPMA analyses? Would the BSE response be nonlinear for a system with a coupled-substitution mechanism?
 
 - Even with an annular detector, electron imaging is sometimes subject to shading effects, wherein an image of a homogeneous material can appear slightly brighter on one side. Could an option for shading correction be included? This would take into account not only the MAN of the calibration points but also their location on the image, and make a linear correction (a plane-fit sketch follows after the sampling sketch below). This would require more calibration points. Maybe the user could even specify horizontal, vertical, or arbitrary-angle correction.
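
On the adjustable calibration footprint mentioned in the list above, a Gaussian center-weighted circular mean might look something like this (a hypothetical helper; the default sigma is an arbitrary placeholder, not a PFE value):

Code:
import numpy as np

def weighted_sample(img, cx, cy, diameter_px, sigma_px=None):
    # Gaussian center-weighted mean of img over a circle of the given
    # diameter (in pixels) centered at (cx, cy).
    r = diameter_px / 2.0
    if sigma_px is None:
        sigma_px = r / 2.0              # arbitrary default width
    yy, xx = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    d2 = (xx - cx) ** 2 + (yy - cy) ** 2
    w = np.exp(-d2 / (2.0 * sigma_px ** 2))
    w[d2 > r ** 2] = 0.0                # clip weights to the circular footprint
    return float((w * img).sum() / w.sum())

A flat (unweighted) mean over the circle is just the limit of a very large sigma_px.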

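And for the shading correction in the last point, a first-order (planar) fit to the grey-level residuals at the calibration points would handle a brightness ramp at any angle automatically. Again just a sketch; "residuals" here means measured grey minus the grey expected from each point's composition:

Code:
import numpy as np

def deshade(img, xs, ys, residuals):
    # Fit a plane a*x + b*y + c to the residuals at the calibration points
    # (needs >= 3 non-collinear points) and subtract it from the whole image.
    # The constant c is harmless, since a subsequent calibration fit absorbs it.
    A = np.column_stack([xs, ys, np.ones(len(xs))])
    (a, b, c), *_ = np.linalg.lstsq(A, np.asarray(residuals, float), rcond=None)
    yy, xx = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    return img - (a * xx + b * yy + c)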

Anyway, it would be great to see this kind of tool developed. It would not only increase our capabilities, but would also decrease the demand for unnecessary x-ray mapping.
 

Cheers,
Jason Herrin
Nanyang Technological University, Singapore

"Truth is relative, belief absolute"

Zack Gainsforth

  • Graduate
  • **
  • Posts: 7
Re: Proposal for a New Feature- Hyper-Gamma Correction
« Reply #4 on: July 21, 2013, 06:21:27 PM »
We’d have to consider several factors to do this quantitatively.  To list (I believe) the most important issues:

  • Surface polish and topography.  Rough surfaces can have very different backscatter intensities, so only a high-quality polish would allow quantitative work.  Even then, you will still be left with electron channeling, which can contribute as much as a 1% variation.  So you would have to ensure a >> 1% variation in your backscattering coefficient to overcome this "floor."  For many mineral systems this should be fine.  However, you should still watch out for preferential etching, for example by colloidal silica, which can also produce large backscatter differences.
  • "Backscatter leakage" is a term I made up (if you know an official term, I'd like to hear it).  In high-resolution backscatter images you can see false zoning near the interface of a high-Z and a low-Z material even with a good polish.  The cause is that backscattered electrons escape more easily through the low-Z material, so more are collected by the detector when the beam is in the high-Z material but near the interface.  Electrons are "leaking" out through the low-Z material.  When you are trying to do high-resolution quantitative work this will be a problem.  It can be ameliorated by using a lower voltage and staying away from interfaces.  Possibly you can do modeling, or imaging at multiple energies, to separate backscatter leakage from real zoning.
  • Likely the most important factor is computing error bars relating a given composition to a given backscatter coefficient (eta).  Here I did a quick computation assuming a simple model, I = Z^(1/2), and approximate EDS error bars from an olivine, and found that the uncertainty connecting a particular EDS composition with an exact backscatter intensity should be about 1-2% in this case, assuming no other unseen elements are substituting.  (Attached PDF.)  It is very important to consider compositional changes from other cations, and to remember that the density of the mineral often changes with the chemical substitutions too.  For example, in the case of olivine, the intensity will increase rapidly as Fe substitutes for Mg, but only slowly as Ca substitutes for Mg.  Ca is a heavier cation, but it is also much larger and will expand the unit cell disproportionately.  A computation using CASINO 2.0 at 10 keV shows that Ca produces only about half the difference in the backscatter coefficient relative to Fe.  A more troublesome case may be K vs. Ca substitution in plagioclase.  Properly, all possible substitutions should be considered quantitatively, in a manner similar to the attached computation (but also including density), before saying anything definitive about your mineral system.  (A rough sketch of this kind of check follows below.)
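
To make that last point concrete, here is a rough version of this kind of sensitivity check in Python, using Reuter's empirical fit for the backscatter coefficient eta(Z) and simple mass-fraction weighting for the compound; it ignores density and detector response entirely, so treat the numbers only as a first look:

Code:
def eta(z):
    # Reuter's empirical fit for the backscatter coefficient at normal incidence
    return -0.0254 + 0.016 * z - 1.86e-4 * z**2 + 8.3e-7 * z**3

M = {"O": 15.999, "Mg": 24.305, "Si": 28.086, "Fe": 55.845}  # atomic weights, g/mol
Z = {"O": 8, "Mg": 12, "Si": 14, "Fe": 26}

def eta_compound(formula):
    # Mass-fraction-weighted eta for a formula given as {element: atoms}.
    total = sum(n * M[el] for el, n in formula.items())
    return sum(n * M[el] / total * eta(Z[el]) for el, n in formula.items())

fo90 = {"Mg": 1.8, "Fe": 0.2, "Si": 1, "O": 4}
fo80 = {"Mg": 1.6, "Fe": 0.4, "Si": 1, "O": 4}
# Compare the two etas to judge whether the contrast clears the ~1% channeling floor
print(eta_compound(fo90), eta_compound(fo80))

For this olivine pair the relative difference in eta comes out at several percent, comfortably above the channeling floor; the same check on An-Ab plagioclase gives a much smaller contrast per mole percent of substitution.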

Cheers,

Zack Gainsforth

Jason in Singapore

  • Post Doc
  • ***
  • Posts: 15
    • FACTS Lab at NTU
Re: Proposal for a New Feature- Hyper-Gamma Correction
« Reply #5 on: July 21, 2013, 10:23:42 PM »
Hi Zack,

Thanks for the response. There are definitely some pitfalls to this approach, as you point out, but the baseline assumption that BSE intensity can be proportional to some compositional parameter within simple mineral systems is valid. It is important to recognize the many sources of error and then to determine which are quantifiable and which, if any, are worth applying corrections to.

The two big problems that I was thinking about were irregularities in the carbon coating (of course you would always want to take the BSE image before analysis) and the fact that most annular BSE detectors seem to have a slight shading bias, so that at high contrast the images appear ever-so-slightly brighter on one side. Anyway, I think that this method can only ever be quasi-semi-quantitative, but there are certain applications for which it might be useful.

(1)   Good point. Also, I have seen backscatter irregularities with colloidal silica polish before (in olivine). I attributed it to residue on the surface, but it could have also been preferential etching I suppose. Grain centers appeared brighter.

(2)   Backscatter leakage?… I think that can also be caused by some food additive in fat-free potato chips :o   The primary point I would make is that spatial convolution effects are less severe for electrons than for x-rays, which is one of the reasons folks might opt to use electrons in the first place (another reason would be to quickly obtain a rough compositional map). For very fine scale imaging of interfaces (not that this technique is necessarily applicable to chemically abrupt interfaces), one could possibly try to deconvolve the signal by normalizing images taken at different accelerating voltages, as you suggest. A less troublesome (for the user) alternative could be an error assessment that takes into account "leakage" or signal convolution from neighboring regions. For instance, the error-due-to-leakage on a profile of a given superpixel spacing could be estimated from a simplified BSE escape volume model that would take into account MAN, accelerating voltage, and perhaps an input for density (a rough range-based sketch follows after these numbered points). Since the error on each superpixel would be slightly different, it would be a separate column of data. My gut feeling is that leakage would be insignificant for most reasonable applications, but it might ward off some abuse by people wanting unrealistically small spatial resolution. If the error were determined online, then the user could get a feel for what spatial resolution is realistic. When the error due to the influence of neighboring regions becomes too high, the user can simply increase the desired superpixel size, swapping spatial resolution for precision. Alternately, I can almost envision an image filter/data correction algorithm that could correct for leakage through neighboring regions based on their calibrated composition, but I expect this effect might be negligible within an individual mineral grain with a restricted compositional range – which is really the only situation where this method is applicable anyway.

(3)   1-2% would be great. The sum of errors will be difficult to quantify, but essential for people wanting to use these data quantitatively.  Again, an online error estimate would be a handy tool, but to do it correctly it might have to be phase specific, or at least you would need to define some sort of density range. Is a simpler estimate of error possible? Perhaps somebody will come out with a publication showing the inherent error for a few popular mineral systems, and that citation could come spewing out together with the results?
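
On the escape-volume estimate in point (2): one crude way to scale it would be the Kanaya-Okayama electron range, taking the lateral BSE spread as being of that order (a common rule of thumb, not a rigorous model; the example inputs below are rough olivine-like values, not measured ones):

Code:
def ko_range_um(A, Z, rho, E_keV):
    # Kanaya-Okayama electron range in micrometres.
    # A: mean atomic weight (g/mol), Z: mean atomic number, rho: density (g/cm^3)
    return 0.0276 * A * E_keV**1.67 / (Z**0.889 * rho)

r = ko_range_um(20.0, 10.0, 3.3, 15.0)   # roughly olivine at 15 kV
print(f"KO range ~ {r:.1f} um")          # superpixels much smaller than this mix signal

An online estimate along these lines, driven by the calibrated MAN, the accelerating voltage, and a user-supplied density, could flag profile steps that are unrealistically small.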

Again, thanks for the interest and response. It would be nice to see this tool developed and developed correctly. Also, it would be nice to make sure the many shortcomings of this approach are published somewhere so that reviewers have a citation to wield against those who might misuse this tool.

There is definitely quantitative information in the electron signal that is currently underutilized. Calibration of BSE images is a start. Could backscattered EELS (REELS?) ever become viable, or are the fundamental limitations too overwhelming?

-Jason
« Last Edit: July 22, 2013, 02:36:07 AM by jsherrin »
"Truth is relative, belief absolute"