Distributed edge computing: a cost-effective way to harness the future
By Matthew Oostveen
The Gartner Hype Cycle report for 2020 posits that edge computing is close to the Peak of Inflated Expectations.
Despite this viewpoint, edge computing is undeniably having a meaningful impact on our lives and, in the process, transforming the way we live.
A constant stream of new use cases is emerging as technology and connectivity converge and evolve. Consequently, this year, edge computing will come of age as more sensors and machines come online, connected through the surge in 5G connectivity.
The global pandemic drove the biggest shift to digitalisation the world has ever seen, as every business on the planet raced to use technology to survive while disruption flooded whole industries. Distributed edge computing brings data storage and processing closer to where it’s needed, improving latency and saving bandwidth.
Edge computing is a technology that wears many hats, spanning the Internet of Things (IoT) and the convergence of IT and operational technology (OT). In its purest form, it is about automating technology to improve human life, and the applications are almost limitless.
For example, sensors in fields can track crop health and trigger fertiliser or pesticide application, self-driving machines can work 24/7 on mining sites, automated factories can operate without pause, and tiny drones can revolutionise surgery.
THE CHANGES FOR IT INFRASTRUCTURE
Leveraging edge computing effectively requires a strategic approach that builds distributed edge capabilities into IT solutions from the outset. The sheer volume and complexity of the data involved will also need to be addressed, and decision-makers may need assistance in determining what’s most important.
IDC (International Data Corporation) research projects that IoT devices alone will generate over 90ZB of data by 2025. With edge computing comprising more than just IoT devices, the data load generated by distributed edge computing applications will be astronomical.
Data centres will need to be designed, built, and even located with this in mind. Putting data centres physically closer to where the data is generated is necessary to overcome performance issues associated with managing such massive amounts of data.
Beyond location, data centres will also need to be able to accommodate massive amounts of unstructured data and be built on cloud-native technologies such as containers, to support a wider variety of application needs.
In reality, moving all data to a single, ideal central location is cost-prohibitive, so applications and infrastructure will likely become more distributed as well.
This marks a shift from a pure core-cloud approach to one in which a distributed cloud at the edge works in tandem with the core cloud.
IT’S NOT WITHOUT ITS CHALLENGES
These smaller data centres need speed, flexibility, and operational simplicity in every location, and that doesn’t come without its own set of challenges.
Latency has become a greater concern in recent years due to big data, streaming services, and IoT. Slow performance is no longer tolerable for the modern user.
Edge data centres solve this problem by providing a high-performing, cost-effective way to deliver functionality and content at speed.
Edge sites are small and often number in the thousands, so not all data can exist at every site. Instead, edge data centres act as a connectivity bridge into larger, central data centres. Processing data as close to the user as possible reduces latency and improves the customer experience.
New architectures are being built where edge applications generate, store, and interact with data at the edge.
Coupling these edge sites with core data centres means synchronising data between them and keeping less frequently used data in the core. This requires infrastructure that supports data access and movement, and that runs applications in a much more distributed manner.
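As a rough illustration of that edge-to-core tiering idea, the Python sketch below keeps recently accessed objects at the edge and migrates anything that has gone unread for a set period back to a core store. The Store class, the STALE_AFTER_SECONDS threshold, and the tier_to_core() helper are all hypothetical, illustrative names, not any particular vendor's implementation.

```python
# Minimal sketch (assumptions only): an edge-to-core tiering policy where
# objects not served recently at the edge are migrated to the core data
# centre to free local capacity.
import time

STALE_AFTER_SECONDS = 24 * 3600  # hypothetical threshold: 24 hours without access


class Store:
    """Toy in-memory object store standing in for an edge or core storage tier."""

    def __init__(self, name):
        self.name = name
        self.objects = {}      # key -> bytes
        self.last_access = {}  # key -> unix timestamp of last read/write

    def put(self, key, data):
        self.objects[key] = data
        self.last_access[key] = time.time()

    def get(self, key):
        self.last_access[key] = time.time()
        return self.objects[key]

    def delete(self, key):
        self.objects.pop(key, None)
        self.last_access.pop(key, None)


def tier_to_core(edge, core, now=None):
    """Move objects the edge has not served recently into the core store."""
    now = now or time.time()
    stale_keys = [
        key for key, seen in edge.last_access.items()
        if now - seen > STALE_AFTER_SECONDS
    ]
    for key in stale_keys:
        core.put(key, edge.objects[key])  # copy to the core first, then drop locally
        edge.delete(key)
    return stale_keys


if __name__ == "__main__":
    edge, core = Store("edge-site-1"), Store("core-dc")
    edge.put("sensor/telemetry-0001", b"...")
    # Pretend the object has gone unread for two days.
    edge.last_access["sensor/telemetry-0001"] -= 2 * 24 * 3600
    print(tier_to_core(edge, core))  # ['sensor/telemetry-0001']
```

In a real deployment the same policy would run against object or file storage APIs rather than in-memory dictionaries, but the principle is identical: hot data stays at the edge, cold data moves to the core.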
The second challenge is that edge applications are generally being built as microservices. This means they can start, stop, and scale as users come and go from the services delivered at the edge.
Because edge sites are small, not every application can have its own dedicated servers or storage; they must all share infrastructure.
Containerisation helps solve this problem by spinning applications and their storage up and down on shared, heterogeneous infrastructure, with each instance lasting only as long as a user or device is accessing a particular cell tower or edge data centre.
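To make that per-session lifecycle concrete, here is a minimal Python sketch using the Docker SDK (the docker package): a containerised service is started when a device attaches to an edge site and torn down when the session ends. The image name, session identifiers, and memory limit are placeholders; a production edge platform would more likely delegate this to an orchestrator such as Kubernetes.

```python
# Minimal sketch of a per-session container lifecycle at an edge site, using
# the Docker SDK for Python (pip install docker). Image name and session IDs
# are placeholders for illustration only.
import docker

client = docker.from_env()  # talks to the local Docker daemon at the edge site


def start_session(session_id: str):
    """Spin up a containerised service instance for one user/device session."""
    return client.containers.run(
        "nginx:alpine",                      # placeholder service image
        name=f"edge-session-{session_id}",
        detach=True,
        mem_limit="128m",                    # keep the footprint small on shared hardware
        labels={"edge.session": session_id},
    )


def end_session(session_id: str):
    """Tear the instance down once the user or device leaves this edge site."""
    container = client.containers.get(f"edge-session-{session_id}")
    container.stop(timeout=5)
    container.remove()


if __name__ == "__main__":
    start_session("device-42")  # device attaches to this cell tower / edge data centre
    end_session("device-42")    # device roams away; resources are reclaimed
```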
THE WAY OF THE FUTURE
Distributed edge computing is creating opportunities for technology to dramatically alter the way humans interact with the world.
From smart cities and more effective food production to automated mining and manufacturing, medical advances, and technologies not yet even dreamed of, distributed edge computing is the way of the future.
Combined with other emerging technologies such as 5G and artificial intelligence, reality will soon come to more closely resemble science fiction, with humankind as the winner.
About the author
Matthew Oostveen is the chief technology officer (CTO) for Pure Storage in the Asia-Pacific and Japan region.
Reference:
Gartner, Hype Cycle for Edge Computing, 2020 - https://www.gartner.com/en/documents/3989981/summary-translation-hype-cycle-for-edge-computing-2020