There was a recent post to the microscopy mailing list about remote access to microscope systems. In summary, the original submitter was asking:
"The Pandemic has forced even the most ardent to adopt web meetings whose numbers have exploded for doing things like microscope demonstrations and remote training. Growing further towards remote support for diagnosis of possible problems or tweaking settings to improve a customers use of their microscope or EDS."
"At the same time, corporations and the government entities have been implementing stricter "traffic cops". We have recently even seen USB drivers getting blocked. Then comes TCP/IP traffic and IT roadblock police restricting Administrator rights to a local PC that make it feel like George Orwell is running things. Everyone pointing fingers at the source of the problem. Chaos and frustration ensue."
"QUESTION: What do you find is the best solution to achieve these needs when their is suffocating IT overhead that the system is unable to tolerate?"
John asked me to post my reply here in case it is of interest to this audience and perhaps spark a discussion about the various issues:
These are good questions worthy of some discussion. I think, however, an important preface is that there is no one-size-fits-all solution. Local IT policies vary widely, and the consequences of breaching them can range from a slap on the wrist to being fired and/or having to answer questions from the authorities, so please be very careful about what you do on your network!
As a microanalytical consultant I work with a wide range of users located in labs all around the world. When asked similar questions in the past, my initial advice has always been to try to work with your local IT services to achieve the results that you require. Fundamentally they are providing you with a service, and they should be working to help you do your job and to provide your users with access to their data. The disconnect arises when the IT services are either inexperienced or following a mandate that has been over-optimised for office computing to the detriment of laboratory requirements.
There are two main questions you should have for your IT services:
1) What storage do you provide and how can users access it?
The idea here is that IT services will deal with all of the issues surrounding access to the server, and all you have to do is push data from your microscope PC to the local storage server.
These types of services should be perfectly adequate for data access, but of course this does not help with remote control of the instrument.
2) What remote access VPN service do you provide?
The idea here is that network services should make it possible for users to access the local network via their VPN. IT services handle all of the issues and hassle of getting users onto the network, but once they are in, they can access services running on your microscope PC directly. This is a nice option because you can use whatever services suit you best without requiring any additional input from IT.
While this approach works very well, it falls over when you have users who are not part of the organisation. Some IT departments will be able and willing to issue those users with credentials, while many will not.
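To make this concrete, one of the simplest services you can run on the microscope PC is a small read-only web view of your data directory, so users on the VPN can browse and download files with nothing more than a web browser. The sketch below does this with Python's built-in web server; the directory path and port are just examples, and you may well prefer a proper file share or FTP server instead.

# Minimal read-only data service, assuming Python 3.7+ on the microscope PC.
# Users on the VPN browse to http://<microscope-ip>:8080/ and download files
# directly. The directory and port below are examples only.
from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler

DATA_DIR = r"D:\MicroscopeData"   # hypothetical local data directory
PORT = 8080

handler = partial(SimpleHTTPRequestHandler, directory=DATA_DIR)
server = HTTPServer(("0.0.0.0", PORT), handler)
print(f"Serving {DATA_DIR} on port {PORT} - press Ctrl+C to stop")
server.serve_forever()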
What can you do when local IT services are unable or unwilling to provide the aforementioned services?
As you suggest, you can do an end run around the local network and use a GSM dongle to access cellular data directly. I have used this approach on a number of occasions and it has always worked extremely well; in one case the access was faster than when we had previously tried to use the LAN. Rather than leave it on all of the time, my client would set up the dongle in advance of the service window and then remove it afterwards. Permanent use of a GSM dongle may not make much sense if there are data caps on the service, and from a security perspective you may want to think carefully before trying it. For this reason it will most likely be forbidden by your local IT policy.
Another approach is to use your own VPN service. There are a number of these that are free to use, but there can be a lot of confusion surrounding them, since they are often used by "gamers" and, by itself, the VPN software does absolutely nothing. You still need to run a service (such as file sharing, an FTP server, VNC remote access, etc...) on the microscope PC yourself; the VPN software simply provides access to it. You also lose the ability to use DNS names to identify your computers, but this is not much of a drawback: users can just as easily enter an IP address like "10.1.2.3" as they can "mysem.company.org".
My personal preference is ZeroTierOne (https://www.zerotier.com/download/), but I have also had good success with LogMeIn Hamachi (https://vpn.net/) in the past.
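Before a remote session it is worth a quick sanity check that the microscope PC really is reachable at its VPN address and that the remote access service is listening. A minimal sketch in Python, using a made-up VPN address and the standard VNC port, might look like this:

# Quick connectivity test: can we reach the microscope PC's VNC service over
# the VPN? The address below is a made-up example of a VPN-assigned IP.
import socket

MICROSCOPE_IP = "10.147.17.25"   # example VPN-assigned address
VNC_PORT = 5900                  # standard VNC port

try:
    with socket.create_connection((MICROSCOPE_IP, VNC_PORT), timeout=5):
        print("Microscope PC is reachable - VNC is listening")
except OSError as err:
    print(f"Could not connect: {err}")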
The benefit of using a VPN like this is that you can often keep using the services you already have in place. The downside is that you now have a VPN to administer, and membership of that VPN will continue to grow over time as you pick up users. To manage this, I would recommend regenerating the network every 6-12 months and then advertising the new network ID to the current pool of users.
In terms of hardening access, it is always a good idea to use a second PC as the point of contact for user access to data, and then push your working data to that PC from the microscope PC. This limits the services and software running on the microscope PC. The second PC can be anything that is available, provided it has enough storage; the CPU and RAM requirements are minimal. It can even run a different OS to the one on your microscope PC, which is useful in terms of securing a system that will be accessed by multiple users. The challenge is that a second PC is primarily only of use for serving up data. If you require remote control of the instrument, it can still be used as an entry point to the network, with users accessing it via the VPN and then initiating a remote desktop connection to the microscope PC, but such a setup is quite complicated and not recommended unless you are super keen.
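For the super keen, the bare bones of that relay idea are sketched below: a small script running on the second PC that forwards any connection it receives straight through to the Remote Desktop port on the microscope PC. The addresses and ports are made-up examples, and in practice you would more likely lean on SSH tunnelling or an existing remote desktop gateway rather than roll your own.

# Bare-bones TCP relay, run on the second ("data") PC. Users on the VPN
# connect to this PC on RELAY_PORT and the connection is forwarded to the
# microscope PC's Remote Desktop port. Addresses/ports are examples only.
import socket
import threading

RELAY_PORT = 13389                           # port users connect to on this PC
MICROSCOPE_ADDR = ("192.168.10.50", 3389)    # microscope PC on the lab network (example)

def pump(src, dst):
    # Copy bytes one way until either side closes the connection.
    try:
        while True:
            chunk = src.recv(65536)
            if not chunk:
                break
            dst.sendall(chunk)
    except OSError:
        pass
    finally:
        src.close()
        dst.close()

def handle(client):
    upstream = socket.create_connection(MICROSCOPE_ADDR)
    threading.Thread(target=pump, args=(client, upstream), daemon=True).start()
    threading.Thread(target=pump, args=(upstream, client), daemon=True).start()

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as listener:
    listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    listener.bind(("0.0.0.0", RELAY_PORT))
    listener.listen()
    while True:
        client, _ = listener.accept()
        handle(client)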
For pushing data to the server, you have a range of options. FreeFileSync (https://freefilesync.org/) is a good one thanks to its friendly interface and cross-platform support, but ultimately the command line program "rsync" (and the many programs based upon it) is fundamentally the best way to sync large data sets across a network. For Windows PCs you will get good mileage out of the command line program "RoboCopy". The premise of all these programs is that they only copy new or updated files to the server, avoiding the need to copy the entire dataset each time. How frequently you push data to the server is up to you; for most practical purposes daily is usually sufficient.
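As an illustration of what a daily push could look like (with made-up paths and server names), the small script below wraps RoboCopy and could be run from Task Scheduler on the microscope PC; the commented-out line shows the equivalent idea with rsync on Linux or macOS.

# Daily incremental push of the acquisition directory to the storage server.
# Paths and server names are examples only - substitute your own.
import subprocess

SOURCE = r"D:\MicroscopeData"              # local acquisition directory (example)
DEST = r"\\storage.company.org\sem\data"   # IT-provided share (example)

# RoboCopy: /E copies subfolders, /XO skips files already up to date on the
# server, /R:2 /W:5 limits retries so a locked file cannot stall the job.
subprocess.run(
    ["robocopy", SOURCE, DEST, "/E", "/XO", "/R:2", "/W:5"],
    check=False,   # RoboCopy uses non-zero exit codes even for normal success
)

# Equivalent idea with rsync on Linux/macOS (only changed files are sent):
# subprocess.run(["rsync", "-av", SOURCE + "/", "user@storage.company.org:/sem/data/"])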
If anyone is looking for specific advice on these sorts of setups please feel free to contact me directly (or post a reply here) and I'll do what I can to help.
All the best,
Ash