We talk to Tobias Flitsch of Nebulon about the rise of edge computing and how it is driving demand for more storage that must be secure, resilient and centrally manageable
- Antony Adshead, Storage Editor
Published: 03 Aug 2022
In this podcast, we look at how the rise of edge processing affects topologies from datacentres out to remote locations, the constraints the edge imposes and the growth of data services in those locations.
Flitsch discusses how topologies are evolving to get around the challenges of latency and bandwidth, and why that means storage must be resilient, secure and centrally manageable.
Adshead: What are the implications for storage of the rise of edge networks?
Flitsch: What's happening right now is that we're seeing a lot of organisations re-architecting their IT infrastructure topology, because they're either in the middle of their digital transformation journey or already through most of it.
IT is ultimately about data and information processing, and cloud was, and still is, a key enabler for digital transformation. That's because services can be quickly instantiated and scaled throughout the transformation journey.
So, many organisations, as part of their digital transformation, have leveraged public cloud services and spun up new services there. As companies become more digital, more data-driven, more data-centric, and better understand how best to use their digital assets, their reasons and requirements for more data access and data processing change or become more refined.
So, where and how they process data, and for what purpose, are now key decision criteria for them, especially for IT architecture and topology. It's no longer only cloud or the datacentre. Now edge plays an integral role.
I know edge can be a tricky word, because you can get a different definition depending on who you ask.
Edge, to me, means putting servers, storage and other devices outside the core datacentre or public cloud, and closer to the data source and the users of the data, which could be people or machines. And how close? That's a matter of the specific application's needs.
We're seeing an increase in the number of data producers, but also the need for faster and continuous access to data, and you can see that there is a need to provide more capacity and data services locally at the edge sites.
There are a couple of reasons for that. Low-latency applications that you often find in industrial settings cannot tolerate the round-trip latency between an edge site and a core datacentre or a cloud when accessing a database, for instance.
So, local data must support latency-sensitive applications. There are also remote office and branch office applications that don't have the luxury of a high-bandwidth, low-latency access network to a corporate datacentre. But users still need to collaborate and exchange large amounts of data, and content distribution and collaboration networks depend on local storage and caching to minimise bandwidth utilisation and therefore optimise costs.
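The bandwidth-saving role of local caching can be sketched as a simple read-through cache, where repeat reads of the same object are served from edge-site storage instead of crossing the WAN again. This is an illustrative sketch only, not any particular product's implementation; all names here are hypothetical.

```python
from collections.abc import Callable

class ReadThroughCache:
    """Keep fetched objects locally so repeat reads avoid the WAN
    round-trip to the central datacentre. Hypothetical example."""

    def __init__(self, fetch_remote: Callable[[str], bytes]):
        self._fetch_remote = fetch_remote   # call over the WAN (slow, metered)
        self._local: dict[str, bytes] = {}  # edge-site storage (fast, local)
        self.remote_reads = 0               # counts WAN fetches

    def read(self, key: str) -> bytes:
        if key not in self._local:          # miss: fetch once from the core
            self._local[key] = self._fetch_remote(key)
            self.remote_reads += 1
        return self._local[key]             # hit: served locally

# Usage: two reads of the same object cost only one WAN fetch.
cache = ReadThroughCache(fetch_remote=lambda key: f"payload:{key}".encode())
cache.read("report.pdf")
cache.read("report.pdf")
print(cache.remote_reads)  # → 1
```

The same idea scales up to the content distribution and collaboration networks mentioned above: every cache hit at the edge is bandwidth not consumed on the access network.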
Lastly, there's the driver of unreliable networks. We're seeing significant growth in data analytics, but not all data sources and locations can benefit from a reliable high-bandwidth network to ensure continuous data flow to the analytics service, which is often in the cloud.
So, local caching, data optimisation and, at the extreme, doing the data analytics directly at the edge site require reliable, dense and versatile storage to support those needs. What this means for storage is that there is a growing demand for dense, highly available and low-maintenance storage systems at the edge.
Adshead: What are the challenges and opportunities for storage with the rise of edge computing?
Flitsch: If you look at storage specifically from an edge perspective, it really needs to adapt to the demands of the specific application at the edge. Previously, we've always deployed storage and storage systems in central datacentres with plenty of rack and floor space, power and cooling, access to auxiliary infrastructure services, management tools, skilled service personnel and, of course, strong security measures.
Most of this is simply not available at the typical edge site, which means storage solutions have to adjust and work around those restrictions, and that's a real challenge.
Take the issue of security, for example. I recently spoke with a manager in the transportation business who is in charge of their organisation's 140 edge sites, which are set up in a hub-and-spoke topology around their redundant core datacentres.
They cannot depend on skilled personnel at these edge sites, and it's challenging to secure these facilities, so key infrastructure could be tampered with and it would be very difficult to tell.
Because these edge sites are linked to the core datacentre, this puts their entire infrastructure at risk, not to mention the issue of data exfiltration or perpetrators stealing storage devices, for instance.
I believe this is the main challenge right now: securing infrastructure and data at the edge, especially with the rise of ransomware attacks and other cyber security threats.
But I believe a reliable data protection and speedy recovery solution can address this issue.
I also think that modern infrastructure and storage can address the other challenges I mentioned if it's centrally and remotely manageable, if it's dense and highly redundant, and if it's affordable and has features to deliver data services.
Finally, I believe the need for local storage at the edge will continue to grow and become even more important to customers, and I believe the advantages of having data accessible at low latency with resiliency outweigh those challenges for storage by a long way.