If you’re like most organizations, you have many policies, processes, and resources in place to control what technology gets introduced into your infrastructure. You rely on these checks and balances to mitigate risk.
What you might not know is that while containers can dramatically accelerate your ability to deliver applications, they also introduce many risks if you do not follow recommended practices. For example, Trevor Jay with the Red Hat Product Security Team observes:
“Docker brings with it an entirely new attack surface in the form of its automated fetching and installation mechanism, ‘docker pull.’ It may be counter-intuitive, but ‘docker pull’ both fetches and unpacks a container image in one step.”
Consider the example of a file attached to an email. While email attachments are a popular and convenient way to send documents and other files, they are also a common source of viruses and malware. You use caution when opening attachments, even if they appear to have been sent by someone you know, and this same caution should be taken when unpacking container images. Unfortunately, “docker pull” does not allow you to separate receiving the image from unpacking it, in the way that email does.
Containers not only introduce many unknown suppliers of container images as building blocks into your infrastructure; simply mistyping an image name during a “docker pull” can also lead to a system compromise through malicious code.
For this reason, Red Hat recommends that you protect your infrastructure when using Docker images by:
Making sure to only use content from trusted sources (which you should be doing anyway, regardless of whether it involves containers or not).
Separating the download and unpack/installation steps when working with container images.
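As a minimal sketch of the first recommendation above (not an official Red Hat tool), you could gate “docker pull” behind an allowlist of trusted registries, so that a mistyped or untrusted image name is rejected before anything is fetched or unpacked. The registry names below are illustrative assumptions, not a vetted list.

```shell
# is_trusted: accept only image names that come from an allowlisted registry.
# The registries listed here are examples; substitute your own trusted sources.
is_trusted() {
    case "$1" in
        registry.access.redhat.com/*) return 0 ;;        # Red Hat registry (example)
        registry.internal.example.com/*) return 0 ;;     # hypothetical internal mirror
        *) return 1 ;;                                   # everything else is refused
    esac
}

# safe_pull: only invoke "docker pull" if the image name passes the allowlist.
safe_pull() {
    if is_trusted "$1"; then
        docker pull "$1"
    else
        echo "refusing to pull untrusted image: $1" >&2
        return 1
    fi
}
```

For the second recommendation, tools such as skopeo can copy an image from a registry into a local directory (for example, `skopeo copy docker://<image> dir:<path>`) so you can inspect its manifest and layers before loading it into the Docker daemon, keeping the download and unpack steps separate.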
You also need to be sure that the container images you include in your infrastructure are safe to consume and certified to run wherever you choose to deploy them – whether on premises on physical hardware, in virtual machines, in a private cloud, or in a public cloud.
At Red Hat, we’re working to advance both container technology and the supporting ecosystem to make Linux containers enterprise-consumable, much as we did with Linux. Red Hat is currently developing containerization solutions to offer the advantages of:
Trusted access to digitally signed container images that are certified to run on certified container hosts.
An integrated application delivery platform from application container to deployment target built on open standards.
Proven container portability with deployment across physical hardware, hypervisors, private clouds, and public clouds.
Whether an application is deployed to a physical server, a virtual machine, a public cloud instance, or a Linux container, IT teams must have confidence that their infrastructure, systems, and critical data will remain secure and uncompromised. Regardless of innovation, enterprises need the same level of security, reliability, and trust as any “traditional” application – and containers are no exception.
Image credit: opensource.com