The role of cloud native at the edge


Analyst firm Gartner predicts that by 2025, three-quarters of enterprise-generated data will be created and processed at the edge – that is, outside a traditional data centre or cloud, and closer to end users.

The Linux Foundation defines edge computing as the delivery of computing capabilities to the logical extremes of a network in order to improve the performance, security, operating cost and reliability of applications and services. “By shortening the distance between devices and the cloud resources that serve them, edge computing mitigates the latency and bandwidth constraints of today’s internet, ushering in new classes of applications,” the foundation explains in its Open Glossary of Edge Computing.

Edge computing has been riding a wave of hype for several years now, and many consider it “the Wild West”: there’s a high volume of chaotic activity in the area, with plenty of duplicated effort, as technologists all vie to find the best solutions.

“It’s early doors,” says Brian Partridge, research director at 451 Research. “Vendors and service providers are throwing stuff at the wall to see what sticks. Enterprises are experimenting, investors are making large bets. In short, the market is thrashing, crowded and there’s a lot of confusion.”

A synergy between cloud native and the edge

Edge computing opens up many possibilities for organisations looking to scale their infrastructure and support more latency-sensitive applications. As cloud native infrastructures were created to improve flexibility, scalability and reliability, many developers are looking to replicate these benefits close to the data’s source, at the edge.

“Cloud native can help organisations fully leverage edge computing by providing the same operational consistency at the edge as it does in the cloud,” notes Priyanka Sharma, general manager of the Cloud Native Computing Foundation (CNCF).

“It offers high levels of interoperability and compatibility through the use of open standards and serves as a launchpad for innovation based on the flexible nature of its container orchestration engine. It also enables remote devops teams to work faster and more efficiently,” she points out.

Benefits of using cloud native at the edge

One benefit of using cloud native at the edge is the ability to complete faster rollbacks: edge deployments that break or have bugs can be rapidly returned to a working state, says William Fellows, co-founder and research director of 451 Research.
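In a Kubernetes-based edge stack, for example, that rollback is a one-line operation. The commands below are a sketch only – `edge-gateway` is a hypothetical deployment name, and the commands assume a cluster is already configured:

```shell
# Roll a broken edge deployment back to its previous working revision
kubectl rollout undo deployment/edge-gateway

# Or pin it to a specific known-good revision from the rollout history
kubectl rollout history deployment/edge-gateway
kubectl rollout undo deployment/edge-gateway --to-revision=3

# Watch the rollback complete
kubectl rollout status deployment/edge-gateway
```

Because the orchestrator keeps previous revisions of the deployment spec, recovery doesn’t require re-imaging or manually reconfiguring remote edge hardware.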

“We’re also seeing more granular, layered container support whereby updates are portioned into smaller chunks or targeted at limited environments and thus don’t require an entire container image update. Cloud native microservices provide an immensely flexible way of developing and delivering fine-grain service and control,” he adds.
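That layered behaviour comes from how container images are built: each instruction produces a cacheable layer, and only changed layers need to be pushed or pulled. A minimal illustrative Dockerfile (names and versions are assumptions, not from the source) might order its layers so the frequently changing application sits last:

```dockerfile
# Base layer: rarely changes, cached at the edge
FROM alpine:3.19

# Dependency layer: changes occasionally
RUN apk add --no-cache curl

# Application layer: changes often – on most updates this is the
# only layer an edge node has to download
COPY ./app /usr/local/bin/app
CMD ["/usr/local/bin/app"]
```

Structuring images this way keeps routine updates to a few megabytes rather than a full image pull, which matters on bandwidth-constrained edge links.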

There are also financial benefits to taking the cloud native path: the reduced bandwidth usage and streamlined data handling that cloud native enables can cut costs for businesses.

“It can also allow a consumption-based pricing approach to edge computing without a large upfront CapEx spend,” notes Andrew Buss, IDC research director for European enterprise infrastructure.

However, it wouldn’t be the “Wild West” out there right now if cloud native were the perfect solution. Several challenges remain to be worked on, including security concerns.

“Containers are very appealing because they’re lightweight, but they’re actually very bad at ‘containing’,” points out Ildikó Vancsa, ecosystem technical lead at the Open Infrastructure Foundation (formerly the OpenStack Foundation).

“This means they don’t provide the same level of isolation as virtual machines, which can lead to every container running on the same kernel being compromised. That’s unacceptable from a security perspective. We should see this as a challenge that we still need to work on, not a downside to applying cloud native principles to edge computing,” she explains.

There’s also the complexity of dealing with highly modular systems, so organisations interested in moving towards cloud native edge computing should prepare by investing the time and resources needed to implement it effectively.

What should businesses be thinking about when embarking on cloud native edge computing?

Cloud native edge solutions are still relatively rare; IDC’s European Enterprise Infrastructure and Multicloud survey from May 2020 showed that the biggest edge investments are still on-premises.

“However, we expect this to shift in the coming years as cloud native edge solutions become more widely available and mature and we have more use cases that take advantage of cloud as part of their design,” says Gabriele Roberti, Research Manager for IDC's European Vertical Markets, Customer Insights and Analysis team, and Lead for IDC's European Edge Computing Launchpad.

For those businesses eager to take the leap, Partridge recommends starting with the application vision, requirements and expected outcomes. After targeting edge use cases that can support a desired business objective – such as lowering operations costs – you can then turn your attention to the system required.




Laura Foster, programme manager for tech and innovation at techUK, reiterates that it’s important to build a use case that works for your business needs.

“There’s an exciting ecosystem of service providers, innovators and collaboration networks that can help build the right path for you, but the journey towards cloud native edge computing also needs to go hand in hand with cultural change,” she points out.

“Emerging technologies, including edge computing, will pioneer innovation, but only if businesses push for change. Retraining and reskilling workforces is a fundamental part of an innovation journey and can often be the key to getting it right,” she concludes.

Keri Allan

Keri Allan is a freelancer with 20 years of experience writing about technology and has written for publications including the Guardian, the Sunday Times, CIO, E&T and Arabian Computer News. She specialises in areas including the cloud, IoT, AI, machine learning and digital transformation.