Tech talks are so much better when vendors present to a small group of like-minded techie friends in an informal way, followed by some fun and chats afterwards. vRetreat, run by Patrick Redknap, is exactly such an event, which is why I try not to miss it.

This time, we had Mark Hoffmann (Senior Product Manager – Progress Software) who talked about Application Experience Delivery and Analysis using Flowmon, followed by Vic Camacho (Principal Technologist – Cohesity) talking about how to address modern-day data breaches.

Application Experience Delivery and Analysis

I briefly touched upon Flowmon last year when Enhanced Network Telemetry was bundled with Kemp Load Balancers. That was a great move because having visibility into the entire application delivery chain allows application developers and operations to proactively identify bottlenecks, misconfigurations, and potential security issues. Having the Flowmon collector included means one less thing to buy and install to gain that telemetry data.

Once you have that data from the load balancer (and other network paths), what next? How does one derive meaningful insights from all that telemetry? That, after all, is what really matters: proactively ensuring the application experience doesn’t suffer from outages or bottlenecks. Those inferences become even more meaningful when the relevant context is provided alongside them, using the device configuration data. That is what Mark’s presentation was all about.

The representative setup relies on a Flowmon Collector VM appliance, to which all telemetry data is sent. Its job is to analyse the collected flow traffic and correlate it with the application context, provided by the “Virtual Service” configurations acquired from the LoadMaster appliances.
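To make the idea concrete, here is a minimal sketch (my own illustration, not Flowmon’s actual implementation) of what “correlating flows with application context” means: raw flow records only carry IPs and ports, but joining them against the Virtual Service definitions pulled from the load balancer tags each flow with the application it belongs to. All names and fields here are hypothetical.

```python
# Illustrative only: join raw flow records against "Virtual Service"
# definitions (VIP + port) so each flow gains application context.
from dataclasses import dataclass

@dataclass
class FlowRecord:
    dst_ip: str       # destination IP seen in the flow
    dst_port: int     # destination port
    bytes: int        # bytes transferred
    rtt_ms: float     # measured round-trip time

@dataclass
class VirtualService:
    name: str         # the application this virtual service fronts
    vip: str          # virtual IP configured on the load balancer
    port: int

def correlate(flows, services):
    """Tag each flow with the virtual service it belongs to, if any."""
    index = {(s.vip, s.port): s.name for s in services}
    return [(f, index.get((f.dst_ip, f.dst_port), "unclassified"))
            for f in flows]

services = [VirtualService("webshop", "10.0.0.10", 443)]
flows = [
    FlowRecord("10.0.0.10", 443, 15_000, 12.5),  # hits the webshop VIP
    FlowRecord("10.0.0.99", 22, 800, 3.1),       # unrelated SSH traffic
]

for flow, app in correlate(flows, services):
    print(f"{flow.dst_ip}:{flow.dst_port} -> {app}")
```

Once flows carry an application label like this, per-application dashboards and response-time alerts fall out naturally, which is essentially what the demo showed at a much larger scale.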

Being an architect, I wondered what the impact of running Flowmon on the LoadMaster is, because you do need to configure the appliances with enough headroom.

As you would expect with all things IT, it depends! The number of services being probed is a major factor, as is the size of the LoadMaster appliance itself. According to Mark, the overhead can range from a few per cent up to 25 per cent, depending on the configuration. Do bear that in mind when deploying the probes in your environment.

LoadMaster - Flowmon

Mark ran a couple of scripts in his demo that brought all those inferences to life, so make sure to watch the whole video. Once the scripts are run, dashboards are populated with the things that matter about that application. More importantly, the dashboards can highlight any issues or degradation in service or response times – exactly the things that typically require immediate attention.

Personally, I would have liked this functionality to be built into the UI – with options to run it periodically and/or automatically – but I am sure that will come soon. That said, I am all for vendors taking the heavy lifting away from the customer, who shouldn’t be expected to weed out all the important metrics and create dashboards. All they want are timely and clear alerts on performance and application issues – things that they know, care about, and can fix.

This is just one example of how the wealth of data collected from LoadMaster using Flowmon can be used to derive meaningful results and, best of all, quite easily. If you are running such an environment, or are considering implementing one, this is a feature that deserves consideration.

For more information, do check out these resources:

Addressing Modern-Day Data Breaches

Ask any company and they will tell you that they consider ransomware the single biggest threat to their business. While it has been around for years now, data protection products are having a tough time keeping up with the fast pace of its evolution.

Cohesity is one of the leading data protection vendors and in his presentation, Vic Camacho reminded us about the threat itself and then what Cohesity is doing to combat this threat proactively.

To see why ransomware really is the biggest threat to businesses, one only has to look at how rapidly it has developed in recent years. Like most security threats and responses, it has been a game of cat and mouse.

At the start, ransomware simply encrypted the data, and the decryption keys were made available only after a ransom was paid – usually in cryptocurrency. Companies with good backup regimes started getting around it by simply restoring the data from backup. So, ransomware evolved and started targeting those backup copies as well, which prompted data protection companies to start offering “immutable” copies of data.
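The “immutable copy” idea can be sketched very simply: a write-once store that refuses to modify or delete a backup until its retention period expires, so even ransomware that gains admin credentials cannot destroy the copies. This is my own toy illustration of the concept – real implementations (object lock, retention lock and the like) are far more involved.

```python
# Toy illustration of an immutable (WORM-style) backup store: objects can
# be written once and cannot be overwritten or deleted while their
# retention lock is active.
import time

class ImmutableBackupStore:
    def __init__(self):
        self._objects = {}  # name -> (data, locked_until_epoch)

    def write(self, name, data, retention_seconds):
        if name in self._objects:
            # Overwriting an existing backup is always refused.
            raise PermissionError(f"{name} already exists and is immutable")
        self._objects[name] = (data, time.time() + retention_seconds)

    def delete(self, name):
        _, locked_until = self._objects[name]
        if time.time() < locked_until:
            # Deletion is refused until the retention period has passed.
            raise PermissionError(f"{name} is under retention lock")
        del self._objects[name]

    def read(self, name):
        return self._objects[name][0]

store = ImmutableBackupStore()
store.write("daily-backup", b"backup contents", retention_seconds=3600)
print(store.read("daily-backup"))  # restoring always works
```

With this in place, an attacker who encrypts the primary data and then tries to overwrite or delete the backup gets a `PermissionError` instead – which is precisely why ransomware authors moved on to the exfiltration tactics described next.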

Now, we’re faced with yet another evolution: shortly after breaching defences, the ransomware quietly starts exfiltrating data from the organisation before triggering its encryption. That stolen data can later be used in several ways to force a company to pay up – whether because the company doesn’t want its secrets sold to the highest bidder, or because the leakage of personal or sensitive information held by the company would cause irreparable damage to its reputation.

Either way, it could be disastrous for any company to be in that situation, which is why data protection companies are working hard to develop methods using AI/ML to detect such events proactively and as early as possible. The problem is made more complex because it’s not just data security that is at risk: most organisations also don’t know the precise location of all their sensitive data, or whether its protection meets compliance standards. After all, how can you even protect data if you don’t know whether, and where, it resides?
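To give a flavour of the kind of signal such detection can use (this is a generic statistical illustration, not Cohesity’s algorithm): mass encryption by ransomware typically causes a sudden spike in the daily change rate of backed-up data, and even a simple outlier test over the historical change rates can flag it. The data values below are made up for the example.

```python
# Illustrative anomaly check: flag a backup whose daily change rate is far
# outside the historical norm, as mass encryption of files would cause.
from statistics import mean, stdev

def is_anomalous(history, todays_change_rate, threshold=3.0):
    """True if today's change rate is >threshold std-devs above the mean."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return todays_change_rate > mu
    return (todays_change_rate - mu) / sigma > threshold

# Typical daily change rates: the fraction of data modified between backups
normal_days = [0.02, 0.03, 0.025, 0.018, 0.022, 0.027, 0.02]

print(is_anomalous(normal_days, 0.021))  # -> False (an ordinary day)
print(is_anomalous(normal_days, 0.85))   # -> True  (mass-encryption spike)
```

Production systems use far richer features (entropy of file contents, file-type changes, access patterns) and ML models rather than a single z-score, but the principle of catching the event early from backup-side telemetry is the same.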

Cohesity Threat Defence Architecture

To combat that challenge, Cohesity has created its Threat Defence Architecture, which enables a modular approach to data protection defences. Not only does it handle the common data protection tasks you would expect, but it also allows third-party solutions to integrate with it – providing a more flexible and robust protection environment.

Cohesity already has the earlier incarnations of ransomware covered but, to battle the newer threats, it is working on a couple of new services that will be available shortly: Fort Knox and DataGovern. Keep your eyes peeled for details as they come out. Vic finished the presentation by taking us through some best practices that will help you better protect your precious data.

I can’t possibly do justice to both presentations in a few words, so I do recommend you watch the videos for full details. They are not too long so it could be time well spent learning what’s new with these offerings.

Fun Time!

After the two excellent presentations, it was time to have some fun. Patrick had arranged a multiplayer game of “Walkabout Mini Golf” between the attendees, which I think is a brilliant idea. He had sent Oculus Quest 2 headsets to the attendees in advance, so we had them all charged up and ready.

We had an enormous amount of fun, and it was a splendid example of doing an informal, fun activity together while being remote. Patrick hopes to kit out more attendees like this in the future so that we can have regular chats and gaming events in virtual reality – and even VR vRetreats!

Sadly, I had to leave for a bit but, amazingly, was able to return and finish the game! Later, Gareth and I discussed vRetreat and the fun we had on our podcast episode, along with some of the concerns that interacting in VR environments brings. So, do have a listen if you haven’t already (and subscribe to the podcast while you’re there!)

It was an excellent event – for me particularly – as I always jump at the chance to meet the vCommunity and this is the perfect way to do it – tech and fun bundled together. Long may it continue!