Author Topic: saga gis and swath profile  (Read 2879 times)

Ben Buse

  • Professor
  • ****
  • Posts: 498
saga gis and swath profile
« on: October 17, 2019, 06:32:36 AM »
Hi,

I was trying out SAGA GIS: you can plot Surfer grid maps, plus do swath (wide-line average) profiles.

SAGA GIS is here:

http://www.saga-gis.org/en/index.html

Here's an image of a map plotted using SAGA GIS and printed as a PDF:



Here's a screenshot showing the swath profile



The data is averaged between the two lines paralleling the central line.

The swath profile is interactive: you click on the two end points with the left mouse button, then right-click, and the graph and table update.
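In case it helps to see what the averaging amounts to, here's a minimal Python sketch of a wide-line average over a regular grid (this is not SAGA's actual code; the grid, end points, and half-width below are just placeholders):

Code: [Select]
import numpy as np
from scipy.ndimage import map_coordinates

def swath_profile(grid, p0, p1, half_width, n_samples=200, n_offsets=21):
    """Average grid values across a band of width 2*half_width (pixels)
    centred on the line from p0 to p1 (given as (row, col) pixels)."""
    grid = np.asarray(grid, dtype=float)
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    direction = p1 - p0
    length = np.hypot(direction[0], direction[1])
    unit = direction / length
    normal = np.array([-unit[1], unit[0]])          # perpendicular unit vector
    t = np.linspace(0.0, 1.0, n_samples)            # fractions along the line
    offsets = np.linspace(-half_width, half_width, n_offsets)
    # Sample points: each position along the line, shifted sideways.
    pts = (p0[None, None, :] + t[:, None, None] * direction[None, None, :]
           + offsets[None, :, None] * normal[None, None, :])
    vals = map_coordinates(grid, [pts[..., 0].ravel(), pts[..., 1].ravel()],
                           order=1, cval=np.nan)    # bilinear interpolation
    vals = vals.reshape(n_samples, n_offsets)
    return t * length, np.nanmean(vals, axis=1)     # distance, swath average

distance, profile = swath_profile(np.random.rand(500, 500),
                                  (50, 50), (450, 400), half_width=10)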

If anyone's interested, I can explain how to load a map and create a swath profile.

Ben

« Last Edit: October 17, 2019, 06:35:38 AM by Ben Buse »

Probeman

  • Emeritus
  • *****
  • Posts: 2856
  • Never sleeps...
    • John Donovan
Re: saga gis and swath profile
« Reply #1 on: October 17, 2019, 05:03:23 PM »
Hi Ben,
Never saw these GIS tools before, but good to know. Thanks.

But it is interesting, isn't it, that we microscopists utilize GIS tools designed for kilometer scales at micrometer scales?  :D    Only a difference in "mapping" scale of 10^9!

As you probably know, the Surfer software that CalcImage uses for output is also designed for GIS. And just in case anyone doesn't know, Golden Software offers a "student" license for $50 for one year, if cost is an issue for students:

https://probesoftware.com/smf/index.php?topic=707.msg8220#msg8220
« Last Edit: October 17, 2019, 05:33:00 PM by Probeman »
The only stupid question is the one not asked!

theo_nt

  • Post Doc
  • ***
  • Posts: 17
Re: saga gis and swath profile
« Reply #2 on: February 24, 2020, 07:33:51 AM »
Hi John,

is it possible to do a wide-line average profile with Surfer, as Ben does it using SAGA GIS? Cameca has this option as well.

Theo


John Donovan

  • Administrator
  • Emeritus
  • *****
  • Posts: 3304
  • Other duties as assigned...
    • Probe Software
Re: saga gis and swath profile
« Reply #3 on: February 24, 2020, 07:52:48 AM »
Quote from: theo_nt on February 24, 2020, 07:33:51 AM
Hi John,

is it possible to do a wide-line average profile with Surfer, as Ben does it using SAGA GIS? Cameca has this option as well.

Theo

Hi Theo,
The current Surfer "slice" scripts do not have any averaging capability. I suspect there is a method within Surfer to do this sort of line averaging, but you'd have to ask Golden Software.

However, this average profile capability is already built into CalcImage as shown here:

https://probesoftware.com/smf/index.php?topic=41.msg8435#msg8435

Just be sure your PFE is up to date and it should work fine.
John J. Donovan, Pres. 
(541) 343-3400

"Not Absolutely Certain, Yet Reliable"

sem-geologist

  • Professor
  • ****
  • Posts: 304
Re: saga gis and swath profile
« Reply #4 on: April 15, 2021, 10:57:42 AM »
Quote from: Probeman on October 17, 2019, 05:03:23 PM
But it is interesting, isn't it, that we microscopists utilize GIS tools designed for kilometer scales at micrometer scales?  :D    Only a difference in "mapping" scale of 10^9!

GIS tools are the benchmark I compare most other software against; they are an example that things can be programmed right from the beginning.

GIS tools are basically tools for dealing with spatial data, which mostly comes in raster or vector form (i.e. gridded or sparse data). Scale is completely irrelevant: GIS is designed for solving spatial problems. Scale only enters at the final presentation stage, for human beings, while the GIS algorithms and core are about handling horrendous amounts of data and making spatial sense of them.

I have used, and still use, GIS for microscopic data. Here are a few examples:

I used QGIS during my PhD as the main place for all data organisation (something like your PictureSnapApp). My EPMA results were fed into a PostgreSQL database with PostGIS (the spatial extension for PostgreSQL). My geological maps, sample pictures, and thin section scans (1 µm resolution, in both plane- and cross-polarised light) were saved as GeoTIFFs (PostgreSQL could not handle images very well back then, and there was an obvious performance penalty, hence GeoTIFFs). All of this data shared one geographic coordinate system: each thin section image was anchored by its top-left corner to the coordinates (in meters) of where the sample was collected, and sub-meter resolution was used for all the micro-scale material. You could zoom from the map of a granitoid intrusion dozens of km^2 in area down to micron-scale detail in the samples, and the thin section images were overlaid with SEM BSE images at the right positions. I had special bash scripts which, from two correlation points, could recalculate the coordinate transformation back and forth between the GIS and the EPMA and SEM stages, and apply that transformation whenever new data was imported into this QGIS workflow.
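The transformation itself is simple. Here is a minimal Python sketch of the two-correlation-point idea (my actual scripts were bash; the names and numbers here are made up, and it assumes no axis flip between the two coordinate systems):

Code: [Select]
import numpy as np

def two_point_transform(src, dst):
    """Build a similarity transform (rotation + uniform scale +
    translation) from two correlation points. src and dst are 2x2
    arrays of (x, y) pairs in the two coordinate systems."""
    s = src[:, 0] + 1j * src[:, 1]        # points as complex numbers
    d = dst[:, 0] + 1j * dst[:, 1]
    a = (d[1] - d[0]) / (s[1] - s[0])     # complex: rotation + scale
    b = d[0] - a * s[0]                   # complex: translation
    def apply(points):
        p = points[:, 0] + 1j * points[:, 1]
        q = a * p + b
        return np.column_stack([q.real, q.imag])
    return apply

# E.g. map EPMA stage coordinates (mm) into the GIS frame:
stage_to_gis = two_point_transform(np.array([[0.0, 0.0], [10.0, 0.0]]),
                                   np.array([[532100.0, 6120050.0],
                                             [532100.0, 6120040.0]]))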
The whole project was nearly half a terabyte (those TIFFs/GeoTIFFs are fat)... but QGIS could launch in a single second and load that project in a few seconds on a cheap 2-core laptop with <4 GB of RAM and old spinning disk drives, and I could jump instantly to any of the data while QGIS used just a fraction of the available RAM.

Well, that setup sounds bizarre... and I agree, I would do some things differently now. But it allowed me to do such insane things as spatial analysis of microprobe data at intrusion scale, i.e. seeing on the map how the Cl and F concentrations of all analysed apatites relate to position within the intrusion.
Using a proper database (PostgreSQL) actually allowed much more. For example, I could define once, in PostgreSQL, what the amphibole recalculation should look like. The SQL code for the apfu calculation of amphibole (with a full implementation of the logical site balancing proposed by Leake) creates a virtual table (a view), and that table updates itself whenever new amphibole data is injected into the database. The advantage is that data collected over four years got exactly the same recalculation/treatment. For some minerals, such as amphibole, this went a step further, e.g. calculating pressures and temperatures with a single-mineral thermobarometer. The map updated the instant new data was inserted into the database.
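To give a flavour of what such a view looks like, here is a heavily simplified sketch sent from Python via psycopg2 (table and column names are made up, only two oxides are shown, and it is nothing like the full Leake balancing):

Code: [Select]
import psycopg2

# Simplified apfu view: oxide wt% -> moles of cations, normalised to
# 23 oxygens (anhydrous amphibole basis). A real version handles ~10
# oxides plus the Fe3+/Fe2+ site balancing of Leake's scheme.
VIEW_SQL = """
CREATE OR REPLACE VIEW amphibole_apfu AS
WITH mol AS (
    SELECT id, geom,
           sio2  / 60.084       AS si,   -- 1 Si per SiO2
           al2o3 / 101.961 * 2  AS al    -- 2 Al per Al2O3
    FROM analyses
    WHERE mineral = 'amphibole'
)
SELECT id, geom,
       si * 23.0 / (si * 2.0 + al * 1.5) AS si_apfu,
       al * 23.0 / (si * 2.0 + al * 1.5) AS al_apfu
FROM mol;
"""

with psycopg2.connect("dbname=phd_project") as conn:
    with conn.cursor() as cur:
        cur.execute(VIEW_SQL)   # every later SELECT sees fresh data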
It is a breeze to explore different kinds of geological ideas in seconds, without needing to create new tables or relations with samples. Zero Excel, no copy-pasting, reduced room for human error. And all of this is possible because QGIS and PostgreSQL on 64-bit systems use double-precision floats for coordinates, so they work even at submicron scale. I found patterns I would never have found without GIS; hopefully one day I will have time to publish those findings.

Another example was using PostGIS spatial analysis to filter framboidal pyrites for redox-state calculations. There is a technique based on measuring framboid sizes, which is often done manually; I automated the process using the Bruker Esprit particle workflow. The problem: it would recognise all framboidal pyrites, including those in organic-rich cavities, which would bias the mean size and size distribution. With PostgreSQL I could filter out all the framboids that sit side by side, and take into account only framboids that are not touching. The Bruker system is quite capable at recognising particles but poor at filtering the results; GIS was very handy.
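The filtering step itself is short in spatial SQL. A minimal sketch (the framboids table with polygon outlines in geom is hypothetical):

Code: [Select]
import psycopg2

# Keep only framboids whose outline intersects no other framboid, so
# clustered grains (e.g. in organic-rich cavities) drop out of the stats.
QUERY = """
SELECT f.id, ST_Area(f.geom) AS area
FROM framboids AS f
WHERE NOT EXISTS (
    SELECT 1
    FROM framboids AS g
    WHERE g.id <> f.id
      AND ST_Intersects(f.geom, g.geom)
);
"""

with psycopg2.connect("dbname=phd_project") as conn:
    with conn.cursor() as cur:
        cur.execute(QUERY)
        isolated = cur.fetchall()   # (id, area) of non-touching framboids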

I often use QGIS for exploring thin section element maps, as it has lots of ways to overlay element maps as layers: not only transparency but a dozen blending modes (such as Lighten, Darken, Burn, Multiply, Subtract, Difference...), custom color ramps, raster calculators...
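For anyone wondering what those blend modes actually do, they are just per-pixel arithmetic on the layers. Roughly (the exact QGIS formulas may differ):

Code: [Select]
import numpy as np

def blend(lower, upper, mode):
    """Per-pixel layer blending for maps normalised to 0..1."""
    if mode == "multiply":
        return lower * upper
    if mode == "subtract":
        return np.clip(lower - upper, 0.0, 1.0)
    if mode == "difference":
        return np.abs(lower - upper)
    if mode == "lighten":
        return np.maximum(lower, upper)
    if mode == "darken":
        return np.minimum(lower, upper)
    raise ValueError(f"unknown mode: {mode}")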

Coming back to GIS tools as a benchmark: they do a really good job of handling huge amounts of data. There are some weaknesses, but most of the time they come down to improper data formats. For example, GeoTIFFs (or TIFF technology as a whole) work much better than PNGs or JPEGs in a massive data system, because a PNG or JPEG needs to be read and decompressed in memory in full before displaying even a small chunk of it. TIFFs can be read in chunks from the disk drive, and exactly the chunk that is needed, so RAM usage is tremendously reduced. GIS also often uses indexing: the data is not loaded wholly into the GIS, only its geometric/geographic representation (or bounding box), so the data itself is read only if it falls within the current viewport. There are many clever programming and data handling techniques to be learned by studying how these GIS packages are built. Unfortunately, they spoil you and raise your expectations for other kinds of software.
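To make the chunked-reading point concrete, here is a sketch using rasterio (the file name is made up): only the tiles overlapping the requested window are decompressed, not the whole scan:

Code: [Select]
import rasterio
from rasterio.windows import Window

# Read a 512x512 pixel window from a large tiled GeoTIFF. The RAM cost
# is a few tiles, no matter how big the full image is.
with rasterio.open("thin_section_scan.tif") as src:
    chunk = src.read(1, window=Window(col_off=10000, row_off=20000,
                                      width=512, height=512))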
« Last Edit: April 16, 2021, 05:39:14 AM by sem-geologist »