Containers go mainstream

Revera Head of Portfolio and Professional Services Mike Walls lifts the lid on containers

2019 looks like it will be the year containers gain mainstream adoption. There has been plenty of talk and tinkering over the last few years, but not much enterprise uptake. More recently we have seen customers move beyond the experimentation stage into end-to-end container adoption to support deployment efficiency, DevOps practices, and microservices architectures.

Containers add value by providing a standard way to package your application's code, configuration, and dependencies into a single object. This allows the application and its runtime to be abstracted from the environment in which they actually run.
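As a rough illustration of that packaging step, a Dockerfile is one common way to describe the single object; the base image, file layout, and start command below are hypothetical placeholders for a simple Node.js service, not a prescription.

```dockerfile
# Minimal sketch: bundle code, configuration, and dependencies into one image.
# The base image, paths, and start command are illustrative assumptions.
FROM node:10-alpine

WORKDIR /app

# Copy the dependency manifest first so dependency layers can be cached
COPY package*.json ./
RUN npm install --production

# Copy application code and configuration into the image
COPY . .

# The resulting image runs unchanged on a laptop, in a data centre, or in the cloud
EXPOSE 8080
CMD ["node", "server.js"]
```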

The major benefit of this is being able to deploy and port an application consistently and quickly across virtually any environment - whether the target is a private data centre, a public cloud, or even a developer's desktop.

Additional benefits include version control of deployments and the ability to scale very quickly, since containers' lightweight nature allows for fast start and stop cycles. Containers can also help reduce operational complexity by providing a consistent, isolated operating environment for each application.
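As a sketch of those two points, a Kubernetes Deployment pins a specific, version-controlled image tag and can be scaled by changing a single replica count; the names, labels, registry, and tag below are hypothetical.

```yaml
# Minimal sketch of a versioned, quickly scalable container deployment.
# Names, labels, and the image tag are illustrative assumptions.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: orders-api
spec:
  replicas: 3                # scale up or down by changing this number
  selector:
    matchLabels:
      app: orders-api
  template:
    metadata:
      labels:
        app: orders-api
    spec:
      containers:
      - name: orders-api
        image: registry.example.com/orders-api:1.4.2   # immutable, version-controlled tag
        ports:
        - containerPort: 8080
```

Rolling the image tag forward or back then becomes an auditable change to this file rather than a manual server update.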

There are a number of container technologies available. Docker is the most widely used container image format, while Kubernetes has become the leading container orchestration and management platform and has done much to enable mainstream adoption. There are also various container platforms such as Red Hat OpenShift, Pivotal Cloud Foundry, and Rancher, and the public cloud providers have developed their own container service offerings - all of which build on Docker and Kubernetes to varying degrees.

Whilst containers undoubtedly provide a number of benefits in application deployment and management, there are some things to consider before adoption.

Whilst containers provide an additional layer of security by sandboxing an application rather than letting it run directly on the host operating system, they do change your security posture. You will need a strategy for securing API access, and with east-west traffic becoming prominent, traditional perimeter firewalls are no longer enough. With this in mind, consideration needs to be given to automating and software-defining your networking, including the use of micro-segmentation to secure applications.
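One hedged example of micro-segmentation at the container layer is a Kubernetes NetworkPolicy that only admits east-west traffic from a named front-end tier; the namespace, labels, and port below are hypothetical.

```yaml
# Sketch of micro-segmentation: only pods labelled app=frontend may reach
# the payments pods on port 8080; all other ingress is denied.
# Namespace, labels, and port are illustrative assumptions.
apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: payments-allow-frontend
  namespace: payments
spec:
  podSelector:
    matchLabels:
      app: payments
  policyTypes:
  - Ingress
  ingress:
  - from:
    - podSelector:
        matchLabels:
          app: frontend
    ports:
    - protocol: TCP
      port: 8080
```

Enforcement of policies like this depends on the network plugin in use, which is part of why networking needs to be considered alongside the container platform itself.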

Container deployments also don't necessarily replace the need for all existing IT tools. For example, you will likely still need to automate elements such as networking and your release management processes, and you also need to consider how you will log and monitor your container environments.

Public cloud service offerings have recently matured to reduce some of these headaches, either by baking solutions into the offering or by providing seamless integration with their other services. For example, Google Cloud (where Kubernetes originated) provides Google Kubernetes Engine, AWS has Elastic Container Service and Fargate, and Azure has its Container Instances and Azure Kubernetes Service.

A consideration with public cloud providers' native services is that they can differ in APIs, feature sets, and more, which may mean a loss of workload portability between platforms. If you have a single cloud provider strategy then happy days - public cloud container services will vastly reduce the complexity of implementing and managing containers. However, if you have an on-premises or hybrid strategy - say development in the public cloud and production in-country because of sovereignty needs - then there are some additional considerations to be wary of.

Revera has moved customers onto public cloud container services such as AWS Fargate and Google Kubernetes Engine, whilst we also provide in-country hybrid container services through Azure Stack, Red Hat OpenShift, and Rancher. If you would like some assistance or direction adopting containers, or want to know more, contact us at enquiry@revera.co.nz.

 

26 February 2019