If you’re at all involved in the world of IT, it’s almost inevitable that you’ll run into containers at some point. In many ways, they’re the engine that powers the modern web, and the technology is gaining popularity among organisations of all sizes as a tool for developing robust, highly scalable applications.
However, while you’re almost certainly aware of containers, you might not be familiar with how they came to be so beloved by software developers. The history of containers is rooted in the concept of process separation: sandboxing services running on the same hardware for security, resource control and other reasons. While the concept is similar to (and in some respects an evolution of) virtualisation, the two are not quite the same thing.
The first real forerunner of the technology was FreeBSD’s ‘jails’ feature. Released in 2000, this allowed admins to split FreeBSD systems into smaller, independent environments. Jails were followed in 2006 by Google’s Process Containers, which isolated and controlled the resources used by groups of processes; later renamed control groups (cgroups) and merged into the Linux kernel, this work underpinned LXC and, eventually, Docker. Docker offered the first fully fledged container management ecosystem, and when it hit the scene in 2013, it immediately started making waves.
The next several years saw rapid advancements, as innovative startups and tech providers pushed the bleeding edge of container development forward, while slow and careful enterprise adoption ensured that the technology continued to mature. Over time, the Google-created Kubernetes orchestration system came to be seen as stable and enterprise-friendly enough to merit mainstream adoption, and containers have now become an accepted element of infrastructures around the world.
Contain your excitement
Part of the reason these services, and containerisation in general, proved so popular is that they give administrators much more control over how and where they run their apps. While virtualisation has been around since the 90s, every virtual machine runs its own separate copy of an operating system; containers, by contrast, run on top of a single OS, sharing the elements they need to while keeping their own binaries and dependencies safely isolated.
In addition to the actual application or process it’s housing, a container also includes its own copy of the specific binaries and configuration files required to run that application exactly as its creators intended. This ensures that it will run predictably in any environment, without the prospect of different software versions introducing compatibility issues.
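This packaging principle is easiest to see in a Dockerfile. The sketch below is purely illustrative (the Python service, file names and versions are hypothetical), but it shows how an image pins a base OS layer, exact dependency versions and the application’s own files into one reproducible unit:

```dockerfile
# Pin a specific base image version so every environment starts identical
FROM python:3.11-slim

WORKDIR /app

# Install the exact dependency versions listed in requirements.txt
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Bundle the application code and its configuration into the image
COPY app.py config.yaml ./

# Every environment runs the same command against the same files
CMD ["python", "app.py"]
```

Because everything the app needs is baked into the image, the container behaves the same on a developer’s laptop, a test server or a production cluster.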
For this reason, containers are incredibly portable; they are pre-packaged with only the software they need in order to run, which means they can be dropped onto any host with a compatible kernel without requiring extra configuration or software installation. They’re also much more lightweight than VMs, because multiple containers can share the same OS, rather than each application requiring its own instance. They take up less space, require fewer resources and boot much faster than VMs, which is a big bonus for efficiency-seeking businesses.
A further advantage of containers is that they allow for a modular approach to application development. Rather than constructing an application inside a single giant VM, organisations can break an application up into its constituent elements and run them in individual containers. This means that if one element experiences an issue, it doesn’t bring the whole application down with it; furthermore, it enables developers to swap components out as and when they need to without having to rebuild the whole application.
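This modular approach can be sketched with a Docker Compose file. The services and image names below are hypothetical, but the structure shows the idea: each component runs in its own container, and any one of them can be updated or restarted without rebuilding the rest:

```yaml
# docker-compose.yml: a hypothetical application split into three components.
services:
  web:
    image: example/storefront:2.4     # hypothetical image names and tags
    ports:
      - "80:8080"
    depends_on:
      - api
  api:
    image: example/orders-api:1.9     # swap this tag to replace just this component
    environment:
      DATABASE_URL: "postgres://db:5432/orders"
    depends_on:
      - db
  db:
    image: postgres:16
    volumes:
      - db-data:/var/lib/postgresql/data
volumes:
  db-data:
```

If the `api` component needs replacing, only its image tag changes; the `web` and `db` containers keep running untouched.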
Containerised applications are more portable, more resilient, and cheaper and faster to run than many bare-metal or VM-based equivalents; because they can be quickly and easily duplicated, they’re also more scalable. This makes them perfect for web-based applications, which may need to be hosted across cloud, on-prem or hybrid environments, and thus benefit hugely from the predictability and low resource consumption that containers offer. The vast majority of web-scale SaaS applications are delivered via containers, and they’re ideal for services that need to be fast, stable and responsive.
Architecting for success
There are many benefits to containerisation, but while all this may sound disarmingly easy, an effective container delivery strategy requires more than just knowing how to use Docker or Kubernetes. A number of complementary and supporting functions are necessary to properly utilise containers. Most of these functions revolve around principles of agile and DevOps; part of the appeal of containers is the ability they provide to rapidly iterate on software, but they also require rapid iteration in order to maximise their impact.
Continuous integration and continuous delivery (CI/CD) are often key to this. Using these processes allows containers to be built, improved, tested and fed into an organisation’s infrastructure at a rapid pace with minimal downtime. Proper use of continuous testing also means that these containers will be more stable, and ongoing monitoring as part of a good DevOps programme will enable organisations to quickly detect and remedy any faults.
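A container CI/CD pipeline might look something like the following sketch, written in GitHub Actions syntax purely as an example; the image name, registry and test command are all hypothetical, and other CI systems follow the same build–test–push pattern:

```yaml
# Hypothetical pipeline: build a container image, test it, then publish it.
name: build-and-ship
on:
  push:
    branches: [main]
jobs:
  build-test-push:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Build the image once; the same artefact moves through every later stage
      - run: docker build -t registry.example.com/orders-api:${{ github.sha }} .
      # Run the test suite inside the freshly built container
      - run: docker run --rm registry.example.com/orders-api:${{ github.sha }} pytest
      # Only an image that passed its tests is pushed for deployment
      - run: docker push registry.example.com/orders-api:${{ github.sha }}
```

Tagging each image with the commit hash means any version running in production can be traced straight back to the code that produced it.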
So are containers set to dominate the enterprise landscape, pushing out VMs and establishing themselves as the only real way to run applications? Well, not quite. VMs still have their uses; they’re better suited to instances where a diverse set of operating systems needs to be run on the same server, and as VMware CEO Pat Gelsinger is fond of pointing out, most hyperscale public cloud providers actually run their containers inside virtual machines.
The reality for most businesses will instead likely be a mixed estate, with containers and VMs utilised in parallel for different workloads. What we are going to see, however, is a continual increase in container adoption within the enterprise, as tooling gets more sophisticated and easier to integrate with existing technologies and workflows. This mixed approach will also extend to the infrastructure that containers run on; while the scalable nature of containers makes them a natural fit for public cloud services, these benefits can also be achieved in hybrid or on-premise systems by utilising the flexible consumption models and easy on-premise scaling offered by HPE’s GreenLake IT-as-a-service portfolio.
We may also hit a talent crunch; container management is already one of the most highly sought-after skills in the IT industry, and as it grows in importance, the demand for competent container architects could spark bidding wars and talent shortages. Businesses that want to implement containerised applications without engaging in these bidding wars, however, can also make use of HPE’s Pointnext Services, which apply the industry giant’s expert talents and specialisations to the construction, architecture and operation of its customers’ IT estates, with deep knowledge of cutting-edge container technologies like Kubernetes, Mesosphere and Docker. The combination of Pointnext and GreenLake allows customers, with HPE’s assistance, to construct a flexible, performant and robust container architecture, which can be run on GreenLake infrastructure and scaled up, down and out as their capacity needs evolve.
The key for businesses that want to put themselves ahead of the curve is to look at their roadmaps and figure out how they want to integrate containers into their ecosystems in the years to come. That way, they can gradually build their competency over time, without having to suddenly pivot years down the road. Put simply, containers are here to stay.
ITPro is a global business technology website providing the latest news, analysis, and business insight for IT decision-makers. Whether it's cyber security, cloud computing, IT infrastructure, or business strategy, we aim to equip leaders with the data they need to make informed IT investments.