But it is interesting, isn't it? That we microscopists take GIS tools designed for kilometre scales and use them at micrometre scales.
Only a difference in "mapping" scale of 10^9!
GIS tools are the benchmark I compare most other software against: proof that things can be programmed right from the beginning.
GIS tools are basically tools for dealing with spatial data, which mostly comes in raster or vector form (i.e. grid and sparse data). Scale is completely irrelevant; GIS is designed for solving spatial problems. Scale only enters at the final stage, when data is presented to human beings, while the GIS core and algorithms are for handling horrendous amounts of data and making spatial sense of them.
I have used, and still use, GIS for microscopic data. A few examples:
I used QGIS during my PhD as the main place for all data organisation (something like your PictureSnapApp). My EPMA results were fed into a PostgreSQL database with PostGIS (the spatial extension for PostgreSQL). My geological maps, sample pictures, and thin-section scans (at 1 µm resolution, in both plane- and cross-polarised light) were saved as GeoTIFFs (PostgreSQL could not handle images very well back then, and there was an obvious performance penalty, hence GeoTIFFs). All this data lived in the same geographic coordinate system: each thin-section image was anchored by its top-left corner to the coordinates (in metres) of where the sample was collected, and sub-metre resolution was used for all the micro-scale material. You could zoom from a map of a granitoid intrusion dozens of km² in size down to micron-scale details of the samples. The thin-section images were overlaid with SEM BSE images at the right positions. I had special bash scripts which, from two correlation points, could compute the coordinate transformation back and forth between GIS and the EPMA and SEM stages, and apply it when new data was imported into this QGIS workflow.
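Two correlation points are enough to pin down a 2D similarity transform (uniform scale, rotation, translation) between two coordinate systems. A minimal sketch of that maths in Python, using complex numbers for the 2D points (this is an illustration of the idea, not the author's actual bash scripts; all coordinate values are invented):

```python
# Fit a 2D similarity transform q = a*p + b from two correlation points,
# with points encoded as complex numbers x + y*1j. Multiplying by the
# complex number `a` applies scale and rotation in one step.

def fit_transform(p1, p2, q1, q2):
    """Return (a, b) mapping source points p1, p2 onto targets q1, q2."""
    a = (q2 - q1) / (p2 - p1)   # scale and rotation
    b = q1 - a * p1             # translation
    return a, b

def apply_transform(a, b, p):
    return a * p + b

def invert_transform(a, b):
    """Inverse mapping, e.g. for going back from GIS to stage coordinates."""
    return 1 / a, -b / a

# Hypothetical example: SEM stage coordinates (µm) -> GIS project
# coordinates (m). Two markers were located in both systems.
a, b = fit_transform(0 + 0j, 1000 + 0j, 10 + 10j, 10 + 12j)
gis = apply_transform(a, b, 500 + 0j)       # a point between the markers
ia, ib = invert_transform(a, b)
back = apply_transform(ia, ib, gis)         # round-trip to stage coords
```

Once `a` and `b` are fitted, every new analysis point can be imported into the GIS project with one multiplication and one addition, and the inverse transform drives the stage back to a feature seen on the map.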
The whole project was nearly half a terabyte (those TIFFs/GeoTIFFs are fat)... but QGIS launched in a single second and loaded that project in a few seconds on a cheap 2-core laptop with <4 GB of RAM and old spinning disk drives, and I could jump instantly to any of the data while QGIS used just a fraction of the available RAM.
Well, that setup sounds bizarre... and I agree, I would do things a bit differently now. But back then it allowed me to do insane things like spatial analysis of microprobe data at intrusion scale, i.e. seeing on the map how the Cl and F concentrations of all analysed apatites relate to their position in the intrusion.
Using a proper database (PostgreSQL) allowed much more. E.g. I could define once, in PostgreSQL, how the recalculation of amphibole should look. Such SQL code for the apfu calculation of amphibole (with a full implementation of the logical cation balancing proposed by Leake) would create a virtual table (a view), and that view would update as new amphibole data was injected into the database. The advantage is that data collected over four years got exactly the same recalculation/treatment. For some minerals, such as amphibole, this went even a step further, i.e. calculating pressures and temperatures using single-mineral thermobarometers. The map updated the instant new data was inserted into the database.
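The "define the recalculation once as a view" pattern can be sketched in a few lines. For a self-contained demo this uses SQLite instead of PostgreSQL, and the formula is a placeholder normalisation, not Leake's actual amphibole apfu scheme; the table and column names are invented:

```python
# Pattern sketch: raw analyses go into a table, the recalculation lives
# in a view defined once, and every query (or map layer) sees up-to-date
# derived values automatically.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE epma_amphibole (
    analysis_id INTEGER PRIMARY KEY,
    sio2 REAL, al2o3 REAL, mgo REAL, feo REAL)""")

# Placeholder recalculation -- NOT a real apfu scheme -- written once:
db.execute("""CREATE VIEW amphibole_recalc AS
    SELECT analysis_id,
           sio2 / (sio2 + al2o3 + mgo + feo) AS si_fraction
    FROM epma_amphibole""")

db.execute("INSERT INTO epma_amphibole VALUES (1, 50.0, 10.0, 20.0, 20.0)")
first = db.execute("SELECT * FROM amphibole_recalc").fetchall()

# Data inserted years later passes through the identical treatment:
db.execute("INSERT INTO epma_amphibole VALUES (2, 40.0, 20.0, 20.0, 20.0)")
later = db.execute(
    "SELECT si_fraction FROM amphibole_recalc WHERE analysis_id = 2"
).fetchall()
```

In PostgreSQL the view can be joined against PostGIS geometry columns, which is what lets a QGIS layer show recalculated values the moment a row is inserted.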
It is a breeze to explore different kinds of geological ideas in seconds, without needing to create new tables or relations with samples. Zero Excel, no copy-pasting, reduced room for human error. And this is all possible because QGIS and PostgreSQL on 64-bit systems use double-precision floats for coordinates, so they can go even to submicron scale. I could find patterns I would never have found without GIS; hopefully one day I will have some time to publish those findings.
Another example was using PostgreSQL's spatial analysis functions to filter framboidal pyrites for redox-state estimation. The technique requires measuring framboid sizes, which is often done manually. I automated the process using the Bruker Esprit particle workflow. The problem: it would recognise all framboidal pyrites, including those clustered in organic-rich cavities, which would bias the mean size and size distribution. With PostgreSQL I could filter out all the framboids sitting side by side and keep only those that are not touching. The Bruker system is quite capable at recognising particles, but poor at filtering the results. GIS was very handy.
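The filtering idea itself is simple geometry. A minimal pure-Python sketch, treating each framboid as a circle `(x, y, radius)` and keeping only those not touching any other (in PostGIS the same filter could be expressed on geometry columns with functions like ST_DWithin; the coordinates below are made up for illustration):

```python
# Keep only framboids that do not touch any neighbour, so clustered
# framboids in cavities do not bias the size distribution.
import math

def touching(a, b):
    """Two circles (x, y, r) touch or overlap if the distance between
    centres is at most the sum of their radii."""
    ax, ay, ar = a
    bx, by, br = b
    return math.hypot(ax - bx, ay - by) <= ar + br

def isolated_framboids(framboids):
    return [f for f in framboids
            if not any(touching(f, g) for g in framboids if g is not f)]

framboids = [
    (0.0, 0.0, 1.0),    # touches the next one -> excluded
    (1.5, 0.0, 1.0),    # touches the previous one -> excluded
    (10.0, 10.0, 1.2),  # isolated -> kept
]
kept = isolated_framboids(framboids)
```

The pairwise check is O(n²); a database with a spatial index does the same job efficiently on tens of thousands of detected particles.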
I often use QGIS for exploring thin-section elemental maps, as it has lots of possibilities for overlaying element maps as layers: not only transparency, but a dozen blending modes (Lighten, Darken, Burn, Multiply, Subtract, Difference...), custom colour ramps, raster calculators...
Coming back to GIS tools as a benchmark: they do a really good job of handling huge amounts of data. There are some weaknesses, but most of the time they come down to improper data formats. E.g. GeoTIFFs (or TIFF technology as a whole) work much better than PNGs or JPEGs in a massive data system, because a PNG or JPEG has to be read and decompressed into memory in full before displaying even a small chunk of it. Tiled TIFFs can be read in chunks from the disk drive, at exactly the chunk that is needed, so RAM usage is tremendously reduced. GIS also often uses indexing: the data is not exposed to the GIS in full, only its geometric/geographic representation (e.g. a bounding box), so the data is read only if it falls in the current viewport. There are many clever programming and data-handling techniques that can be learned by studying how GIS software is built. Unfortunately, they spoil you and raise your expectations for other kinds of software.
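Both tricks above — bounding-box culling and reading only the needed tiles — can be sketched in a few lines. A simplified illustration, assuming axis-aligned boxes and a square tile grid (layer names, extents, and the 256-pixel tile size are invented; real tiled TIFF readers work from the tile offsets stored in the file):

```python
# Viewport-driven loading: skip layers whose bounding box misses the
# viewport, and compute which tiles of a tiled raster must be fetched.

def intersects(a, b):
    """Axis-aligned boxes as (xmin, ymin, xmax, ymax)."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def visible_layers(layers, viewport):
    """Only layers overlapping the viewport need any disk reads at all."""
    return [name for name, bbox in layers if intersects(bbox, viewport)]

def tiles_for_viewport(viewport, tile_size):
    """(col, row) indices of the tiles a reader would fetch; every other
    tile stays on disk, which is why RAM usage stays tiny."""
    xmin, ymin, xmax, ymax = viewport
    return [(c, r)
            for r in range(int(ymin) // tile_size, int(ymax) // tile_size + 1)
            for c in range(int(xmin) // tile_size, int(xmax) // tile_size + 1)]

layers = [("bse_mosaic",      (0, 0, 4096, 4096)),
          ("geology_map",     (-500, -500, 500, 500)),
          ("far_away_sample", (90000, 90000, 99000, 99000))]
viewport = (100, 100, 700, 700)

vis = visible_layers(layers, viewport)    # far_away_sample is culled
tiles = tiles_for_viewport(viewport, 256) # a 3x3 patch of tiles suffices
```

A half-terabyte project stays responsive on a weak laptop precisely because the work per redraw scales with the viewport, not with the total data volume.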