The age of containerisation and its impact on security
Fri, 30th Jun 2017

A defining characteristic of software development and deployment is its constant evolution. Virtual machines (VMs) first appeared in the 1960s and were soon scaled to commercial levels. Customers saw VMs as an ideal way to consolidate servers, support disaster recovery and optimise IT. Since 2010, server virtualisation has become the norm rather than the exception. This model was supported by traditional “waterfall” software release methodologies, with operations and software development working in separate silos.

But these traditional models are changing, fuelled by demand for increased productivity, accelerated innovation and faster go-to-market release of digital products and services. This has forced a rethink of how application development and deployment take place. Rather than keeping software development and operations separate, the two are now being integrated into a single, unified function: DevOps.

DevOps brings with it the concepts of continuous integration (CI) and continuous delivery (CD), which means that small incremental changes to applications are being made all the time. This stands in stark contrast to the traditional waterfall development methodology.

The trend towards DevOps is also forcing a rethink of how software is deployed. Rather than the legacy virtualisation model used for years, a more efficient model has emerged. This technology, called containerisation, makes more efficient use of hardware resources by letting multiple segregated applications share a single operating system instance.
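
To make that concrete, the sketch below uses the Docker SDK for Python (the docker package) to start two segregated application containers on one host; the image and container names are illustrative only.

```python
import docker

# Connect to the local Docker daemon using environment defaults.
client = docker.from_env()

# Two segregated applications share the host's single operating system kernel,
# each with its own filesystem, process namespace and network stack.
web = client.containers.run("nginx:alpine", detach=True, name="demo-web")
cache = client.containers.run("redis:alpine", detach=True, name="demo-cache")

print("Running containers:", [c.name for c in client.containers.list()])

# Tear both down again; cleanup takes seconds, not minutes.
for c in (web, cache):
    c.stop()
    c.remove()
```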

Benefits of container technologies

Container technologies have been revolutionary in their ability to enable fast application deployment and migration to the cloud. They not only reduce the resources required for deployment, but can also be brought online or deleted in a matter of seconds. Containers also allow applications and processes to be deployed consistently across multiple clouds, which facilitates faster, more confident enterprise adoption of cloud services.

Containers are lightweight and portable, packaging everything needed at runtime, including all code and supporting libraries. In fact, containerisation takes IT automation to a whole new level when combined with orchestration technologies, such as Kubernetes, which facilitate fully elastic computing models.
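
As an illustration of that elasticity, the hedged sketch below uses the official Kubernetes Python client to scale a hypothetical Deployment named web to ten replicas; the orchestrator then schedules the additional containers across the cluster.

```python
from kubernetes import client, config

# Load cluster credentials from the local kubeconfig (e.g. ~/.kube/config).
config.load_kube_config()
apps = client.AppsV1Api()

# Ask the orchestrator for ten replicas of a hypothetical "web" Deployment;
# Kubernetes creates or destroys containers until reality matches the request.
apps.patch_namespaced_deployment_scale(
    name="web",
    namespace="default",
    body={"spec": {"replicas": 10}},
)
```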

We are now seeing the evolution of software-defined infrastructure (SDI), where services can be provisioned and automatically deployed across large compute infrastructures. Applications developed using containerisation are an important part of this model.

Unlike VMs, containers share an operating system and use only the resources they need to run the application they're hosting, resulting in greater efficiency. Hundreds of containers can run on just one server, saving valuable data centre budget.
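
The sketch below, again using the Docker SDK for Python, shows the kind of per-container resource cap that makes such density possible; the limits shown are arbitrary examples.

```python
import docker

client = docker.from_env()

# Cap a single container at 64 MB of RAM and half a CPU core. With limits in
# this range, one server can host a very large number of containers.
worker = client.containers.run(
    "alpine:3",
    command="sleep 300",
    detach=True,
    mem_limit="64m",
    nano_cpus=500_000_000,  # 0.5 CPU, expressed in billionths of a core
)

print(worker.name, worker.status)
worker.stop()
worker.remove()
```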

Not surprisingly, the remarkable benefits of container-based DevOps deployment, a technology still in its infancy, have led to an incredible rate of adoption.

In a 2016 global survey, 53 percent of companies with at least 100 employees had either deployed or were in the process of evaluating containers. And once an organisation invests in containers, their use expands rapidly. In its report, Docker Containers Will Impact Enterprise Storage Infrastructure, Gartner predicts that by 2022, more than 20 percent of enterprise primary storage capacity will be deployed to support container workloads, up from less than one percent today.

But containerisation comes with growing pains. As with any new technology development, industry excitement often overshadows security until organisations recognise the need to address the new set of risks.

Security implications of containers

As traditional VMs typically exist for weeks, security issues are more likely to be caught during a weekly or monthly scan and patch cycle. But containers can exist for just minutes, so the periodic scanning approach simply doesn't work. Beyond this, containers cannot be scanned using traditional vulnerability assessment tools, since the services those tools require are not present in container images. This means new methods of vulnerability assessment must be embraced.
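
One way to embrace that shift is to assess the image itself rather than a running host. The sketch below is a simplified illustration only: it inventories the packages baked into an Alpine-based image by running a throwaway container, and the vulnerability lookup is a hypothetical placeholder rather than a real feed.

```python
import docker

client = docker.from_env()

def installed_packages(image_tag: str) -> list[str]:
    # The image, not a long-lived host, is the unit of assessment: run a
    # throwaway container just long enough to read its package inventory.
    output = client.containers.run(
        image_tag,
        command="apk info -v",  # assumes an Alpine-based image
        remove=True,
    )
    return output.decode().splitlines()

def has_known_vulnerabilities(packages: list[str]) -> bool:
    # Hypothetical placeholder for a lookup against a CVE/advisory feed.
    return False

packages = installed_packages("alpine:3")
if has_known_vulnerabilities(packages):
    raise SystemExit("Image failed vulnerability assessment; do not promote it.")
```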

DevOps teams are often focused on speed rather than security. Some platforms, such as Docker, allow users to pull pre-built images from public repositories. While efficient, this can expose networks to unknown threats and vulnerabilities.
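
One mitigation is to pull a specific tag and pin the image's content digest, so the image that was reviewed is exactly the one deployed. A minimal sketch with the Docker SDK for Python, using an arbitrary public image:

```python
import docker

client = docker.from_env()

# Pull an explicit tag rather than the mutable "latest".
image = client.images.pull("redis", tag="alpine")

# Record the content digest; unlike a tag, a digest cannot be silently
# re-pointed by the public repository after the image has been reviewed.
digests = image.attrs.get("RepoDigests", [])
print("Pinned digest:", digests[0] if digests else "none recorded")
```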

It is the very advantages of containers that make them such a security risk:

  • Containerisation allows the dynamic and elastic deployment of software, but this means the attack surface is constantly evolving.
  • A key strength of containers is their ability to be spun up or down almost instantly. But with an average lifespan of just a few hours or days, how can security teams effectively assess risk?
  • Their short lifespan and ability to be instantly deleted or replaced, key benefits when speed of deployment is the only concern, can leave security teams with little visibility.

A major concern is that enterprise adoption and deployment of DevOps and containers is outpacing the approach used to secure them.

Tenable's 2017 Global Cybersecurity Assurance Report Card found that security teams' confidence in their organisations' ability to mitigate risk in these categories is worryingly low, with containerisation platforms (52 percent) and DevOps environments (57 percent) each receiving a failing grade.

Integrating security into the DevOps innovation cycle

Organisations continue to struggle with container security because traditional security approaches, first designed for physical servers and then for VMs, cannot cope with today's dynamic IT environment.

Server- or device-centric security simply doesn't work in the new app-centric enterprise, which means organisations must rethink their approach to cybersecurity. Vulnerability scanning needs to be integrated into the DevOps innovation cycle. Each container must be scanned as it's built, before it's put into production and while it's in production. Doing so will ensure that vulnerability and malware detection occurs throughout the lifecycle of every container.
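
A hedged sketch of what those three checkpoints might look like in code follows; scan_image is a hypothetical stand-in for whatever scanner an organisation actually uses.

```python
import docker

client = docker.from_env()

def scan_image(reference: str) -> bool:
    # Hypothetical hook for the organisation's scanner of choice.
    # Returns True when no blocking vulnerabilities or malware are found.
    return True

def build_stage(path: str, tag: str) -> str:
    # Scan as the container image is built.
    client.images.build(path=path, tag=tag)
    if not scan_image(tag):
        raise SystemExit("Build-time scan failed; image rejected.")
    return tag

def pre_production_stage(tag: str) -> None:
    # Scan again before promotion, in case new advisories appeared since build.
    if not scan_image(tag):
        raise SystemExit("Pre-production scan failed; promotion blocked.")

def production_stage() -> None:
    # Containers may live only minutes, so keep assessing what is running now.
    for container in client.containers.list():
        ref = container.image.tags[0] if container.image.tags else container.image.id
        if not scan_image(ref):
            print(f"Running container {container.short_id} uses a flagged image.")
```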

Recognising the critical importance of vulnerability scanning, Gartner expects that “by 2018, 60 percent of enterprises will mandate container vulnerability scanning, up from less than 30 percent in 2016.” [Gartner source: “Security Considerations and Best Practices for Securing Containers” by Neil MacDonald published Nov. 10, 2016].

Introducing security into the DevOps process at the same speed as DevOps itself is starting to gain momentum as organisations transition into DevSecOps. DevOps teams are having a larger say in networking and infrastructure security tools, and as containers have grown in popularity, so too have the tools available to help secure them.

As more enterprises migrate their existing and customer-facing apps into container environments, it will become critical that organisations recognise and address the need for a new security approach, ensuring that security is integrated into the DevOps process from the very beginning. This is the only way for organisations to get the operational benefits of containers, while also reducing their level of exposure and risk.