While the distributed nature of hybrid cloud has long been its appeal, there is value in bringing one’s cloud assets together in a more organized way. The idea of standardizing a heterogeneous environment might sound like a contradiction in terms, but working to eliminate silos and supercharge interoperability to control cloud costs has now become essential.
Industry observers tell ITPro that organizations increasingly need to standardize hybrid-cloud infrastructures, either in part for better automation and services or across the board. This isn't easy, warns Grant Caley, chief technologist UK&I at NetApp.
"When people broke away from the mainframe, they went for standardizing compute on x86, networking on Cisco at that time. They were still locking themselves into a vendor to some degree because it's really hard not to," Caley explains.
"This may be one way of achieving less of that. At least you can choose where you want to build rather than having to be focused into a particular cloud."
More flexibility can result from appropriate standardization across horizontal layers, and while public cloud providers offer "great" services and capabilities, Caley argues they vary in how those are delivered.
"There's value, typically from a cost or security perspective of being able to build services in your own data center," he adds. "But choosing the right layers [to standardize] is hard."
Analyzing and standardizing operations such as automation, provisioning, backup, disaster recovery, or compute, so that the "run-book" doesn't have to be rewritten for every change, saves resources, Caley adds.
Standardization for governance and compliance
Tina Howell, chief cloud officer at Xdesign, agrees that deploying services like Google Anthos on AWS or using AWS Outposts and Microsoft Azure Stack Hub on-prem, has made standardizing hybrid cloud easier.
"The main thing is investing upfront, ensuring consistency by building to standards and principles," she says, noting that one’s specific approach depends on cloud strategy. Do you want to be on AWS in five years, or have best-of-breed across a poly-cloud?
With the right configs and tools, you can build each environment in exactly the same way: one capability, people trained only once, and consistency enforced while holding teams to account. This makes what can be a difficult process much easier.
"If they can't govern themselves then, realistically, this is where it all starts to fail," Howell warns.
If config is out of sync or versions become incompatible — for example, if your version three is sitting in the cloud while version one sits in the data center — you can effectively have multiple "clouds" to manage while your hybrid cloud remains unstandardized.
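That version-mismatch failure mode is easy to check for mechanically. A minimal sketch, with an invented inventory, of counting how many "effective clouds" an estate has drifted into:

```python
# Sketch: detect the "version three in the cloud, version one in the
# data center" drift described above. The inventory is invented for
# illustration; a real check would read it from tooling or tags.

deployments = {
    "aws":        {"platform_version": "3.0"},
    "datacenter": {"platform_version": "1.0"},
    "azure":      {"platform_version": "3.0"},
}

def find_drift(deployments: dict) -> dict:
    versions = {site: d["platform_version"] for site, d in deployments.items()}
    unique = set(versions.values())
    # One version everywhere means one cloud to manage; more than one
    # means you are effectively running several.
    return {"effective_clouds": len(unique), "by_site": versions}

report = find_drift(deployments)
print(report["effective_clouds"])  # 2: the estate has drifted
```

Running a check like this in CI turns "multiple clouds to manage" from a surprise into a failed build.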
Instead, organizations often jump into their chosen cloud setup or on-prem deployment, realizing only afterward that it won't meet their requirements. Within 18 months or so, these firms are forced to start over. "With a lot of clients, they have been going round in circles," Howell adds, noting that "cloud can be cheap" if correctly architected.
Benjamin Brial, founder of engineering platform Cycloid, says achieving governance can mean "owning all" your automation in Git repositories, instead of relying on the cloud provider.
"Secondly, use open source software and new automation to make a new database, to reduce how people need to accelerate in the DevOps area. This also reduces the number of tools and the number of clouds, because automation and open source is one piece you need to modernize in cloud infrastructure and to bring more governance," Brial says.
“Choose and use only one configuration management approach, with no overlay on top, whether it's Ansible, SaltStack, Puppet, or whatever. The last part is the continuous integration/continuous delivery (CI/CD) path, which should also be rationalized.
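Brial's "one configuration management approach, no overlay" rule can be enforced as a repository check. A minimal sketch, assuming invented marker files for each tool (real repositories vary):

```python
from pathlib import Path

# Marker files loosely associated with each configuration-management
# tool. These mappings are an assumption for illustration only.
CM_MARKERS = {
    "ansible": ["ansible.cfg", "playbook.yml"],
    "puppet": ["manifests/site.pp"],
    "saltstack": ["salt/top.sls"],
}

def cm_tools_in_repo(repo: Path) -> set[str]:
    """Return the set of CM tools whose marker files appear in the repo."""
    found = set()
    for tool, markers in CM_MARKERS.items():
        if any((repo / marker).exists() for marker in markers):
            found.add(tool)
    return found

def check_single_tool(repo: Path) -> None:
    """Fail (e.g. in CI) when more than one CM approach is present."""
    tools = cm_tools_in_repo(repo)
    if len(tools) > 1:
        raise SystemExit(f"multiple CM tools found: {sorted(tools)}")
```

A check like this makes the rationalization Brial describes a property the pipeline verifies, rather than a convention teams must remember.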
"Centralize whatever CD tools you're using. There is no project on the top of older CD that can centralize JFK [files], GitHub, and the one from the cloud provider. You can't bring governance if you have multiple tools. It's not possible," Brial warns.
Continuous costs along the pipeline
Brial adds that hiring for cloud and DevOps at scale can also prove hard, and that such projects are less likely to succeed, especially within budget. Organizations must also tackle carbon emissions and sustainable transformation, with environmental, social, and corporate governance (ESG) pressures only growing, so it's critical that cloud projects and costs are better governed.
Anthony Kesterton, principal solution architect at Red Hat, notes that standardizing can make it easier to monitor and secure one’s data, instead of having to manage multiple different offerings that perhaps don't play well together, which can expose an organization to risk and even financial liability.
"It's even more important as you look at more exotic architectures — for instance, ARM is becoming so much more popular as a server," Kesterton says. "You see [standardization] happening now in containers."
Salesforce is migrating 200,000 out-of-support CentOS Linux 7 systems to standardize on Red Hat Enterprise Linux 9, which Kesterton says offers "portability and comfort".
Matthew Parin, Hyperforce senior product director at Salesforce, says standardization enables streamlined integration across its various cloud platforms, minimizing complexity in data management, security, and compliance. The firm is also leveraging artificial intelligence (AI) across the infrastructure.
"Within our service-owner driven approach to engineering, (we) require individual service teams to not leak implementation-specific details through their application programming interface (API), enabling individual service teams to move quickly with greater confidence that their changes won't slow or impact other service owners," Parin tells ITPro.
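The principle Parin describes, not leaking implementation details through an API, can be shown in a short sketch. All names here are illustrative, not Salesforce's actual code:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AccountView:
    """The only shape other service teams ever see."""
    account_id: str
    display_name: str

class AccountService:
    def __init__(self) -> None:
        # Private storage format: this dict could move to a different
        # database or schema without changing the API below.
        self._rows = {"a1": {"name": "Acme", "region": "eu-west-1", "shard": 7}}

    def get_account(self, account_id: str) -> AccountView:
        row = self._rows[account_id]
        # 'region' and 'shard' are implementation details and never leak
        # through the returned type.
        return AccountView(account_id=account_id, display_name=row["name"])

svc = AccountService()
view = svc.get_account("a1")
assert not hasattr(view, "shard")  # internals stay internal
```

Because callers only depend on `AccountView`, the owning team can change storage details "with greater confidence" that other teams won't be impacted, which is the point of the quote above.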
David Walker, EMEA field chief technology officer (CTO) at Yugabyte, explains that hybrid cloud standardization is a market phase ultimately driving further innovation. In a brand new market, such as the cloud market around 15 years ago, vendors rush ahead to develop the very best differentiated cloud offerings and reap rewards through first-mover advantage.
"But having done that, it's hard to move things from there unless you start to get some standard centralization in the middle. It's very hard for that whole market to innovate across the top," Walker says.
Agnostic cloud architectures needed
The infrastructure sitting underneath shouldn't depend on whether you're on Amazon, Google, or any other vendor; the providers should just be interchangeable models, which means the move to standardization is about opening up the market as a matter of principle.
"Regulators will come in eventually if there's an unfair market advantage in any industry," Walker says. "So you may as well start thinking about how you're going to standardize and embrace all this. And so 'standardized hybrid cloud' is not a contradiction, it's a creative tension."
Des Hudson, chief operating officer at Advania, emphasizes that customers want more consistency in the hybrid cloud environment, and organizations should heed this.
"We have a lot of capability and skills in Microsoft 365 and equally in Azure. We're taking our clients to public cloud, we don't have significant amounts of hybrid cloud," he says.
Before confirming further investment in hybrid cloud, consider the type of applications in your portfolio and your estate, in part to be sure standardization is the right solution for future iterations of your environment, Hudson suggests.
Tom Fairbairn, distinguished engineer at Solace, notes that, with the right messaging approach, moving data between legacy apps, cloud-native apps, and services can be easier. But integrating cloud-native tech, microservices, the Internet of Things (IoT), and mobility remains challenging.
"If you can move workloads across your public cloud environment, you can pick the cheapest place to run your tasks, get access to the latest technologies in the cloud, and have easy scaling," he says.
Fairbairn notes that data sovereignty is another driver. "Sure, if you're super-large, you might tell a hyperscaler exactly where you'd like data to reside, while it might not be so easy for others. But [hybrid cloud] is not easy because there's not a great deal of standardization."
Fleur Doidge is a journalist with more than twenty years of experience, mainly writing features and news for B2B technology or business magazines and websites. She writes on a shifting assortment of topics, including the IT reseller channel, manufacturing, datacentre, cloud computing and communications. You can follow Fleur on Twitter.