DockerCon 2015 and the Future of the Container Technology

In my previous post, I told the story of how I came to find out about Docker, and how Docker has improved my development practice as a freelancer. In this post, I want to talk about why I’m excited by the announcements, hot tools, and cool demos unveiled at this year’s Docker conference, DockerCon, held in San Francisco in June.

The theme of this year’s DockerCon was “Docker in production”. Around two thousand people attended, four times as many as the previous year, which was the first ever DockerCon. This surge in attendance reflects the phenomenal success and rising popularity of Docker.

Here are my take-aways from the conference.

Runtime Solutions

Keynote speeches highlighted major achievements of the past year.

Developers and ops folks are familiar with the phrase: “It works on my machine, but not on the production server!” Docker’s container infrastructure solves this issue (which we call the runtime problem). The challenge of packaging and distributing images is solved by Docker Hub, the Docker registry for storing and distributing Docker images.

In addition to private and public repositories on Docker Hub, the Docker company now offers a commercial product: an on-premises Docker registry suitable for companies with strict ownership and security requirements.

Building and Deploying Distributed Applications

To address the challenges of building and deploying distributed applications in a scalable, reliable, and secure way, a number of tools have emerged:

  • Docker Compose: makes it easy to describe the components and services of a distributed application so that it can be deployed repeatably on Docker’s container infrastructure.
  • Docker Machine: provisions machines, locally or with cloud providers, and configures them as Docker hosts, so that deploying and scaling distributed applications with all their services is managed smoothly and transparently.
  • Docker networking: the underlying networking infrastructure, which Docker has re-worked as part of its effort to re-engineer how developers and ops folks approach distributed applications.
  • Docker Swarm: native clustering for Docker, pooling several Docker hosts into a single virtual host so that an application can move transparently from development to production while taking full advantage of Docker.
  • Docker Notary: strengthens the security of images and the Docker platform. This tool is still being evaluated, but it promises tightened security and trust in the hosting and distribution of Docker images and application code.
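To make the Compose idea concrete, here is a minimal sketch of a docker-compose.yml file in the format Compose used at the time; the service names and images are illustrative assumptions on my part, not an example shown at the conference:

```yaml
# Two services of a hypothetical distributed application:
# a web front end built from the local Dockerfile, and a Redis backend.
web:
  build: .
  ports:
    - "5000:5000"   # expose the app on port 5000 of the host
  links:
    - redis         # make the redis service reachable under the hostname "redis"
redis:
  image: redis      # official Redis image pulled from Docker Hub
```

A single `docker-compose up` then builds and starts both containers together, which is what makes the deployment repeatable.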

While the number of tools being developed around Docker keeps increasing, the Docker company and early adopters call for “incremental” change: companies and developers are advised to adopt the tools that improve their processes one at a time, without feeling pressure to swap out their entire existing tool sets. Examples of integrating Docker with existing tool sets are also being investigated and promoted.

Cool Demos

There were a number of cool demos showing how some of the new tools work, including:

  • Use of Docker Machine with a cloud provider to deploy an application and later scale it to serve more requests.
  • Transparent migration of a containerised application from one region to another without shutting down the application.
  • Demonstration of Notary’s ability to protect against untrusted sources injecting malicious code into application images.
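The Docker Machine demo can be sketched in a few commands; the cloud driver, token variable, host names, and image name below are illustrative assumptions, not the exact commands used on stage:

```shell
# Provision a Docker host on a cloud provider (here: the DigitalOcean driver).
docker-machine create --driver digitalocean \
    --digitalocean-access-token "$DO_TOKEN" web-1

# Point the local docker client at the new host.
eval "$(docker-machine env web-1)"

# Deploy the application container on it.
docker run -d -p 80:5000 myorg/myapp

# Later, scale out by provisioning further hosts the same way.
docker-machine create --driver digitalocean \
    --digitalocean-access-token "$DO_TOKEN" web-2
```

The point of the demo was that the same workflow works unchanged whether the hosts are local virtual machines or cloud instances.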

Feedback From Cloud Providers and End-Users

A number of cloud providers and Docker users provided feedback on using Docker in production or integrating Docker in their existing development and deployment processes.

Microsoft, IBM, and Amazon showed how they use, and plan to use, the Docker platform to offer their customers Docker-compatible containers and related tools, from development through to deployment. Business Insider and the General Services Administration (GSA) of the US Federal Government shared positive experiences of integrating Docker into their production software stacks.

Looking to the Future

Lastly, important announcements were made concerning the next tools the Docker team will be working on and the future direction of the Docker project.

Solomon Hykes, Docker’s CTO, announced that a number of plumbing tools have been extracted from the Docker code base by popular demand. Of the tens of thousands of lines of code that constitute the Docker platform, roughly 50% is plumbing! Docker has plumbing for interacting with native Linux and Windows capabilities, and for networking, service discovery, leader election, security, and more. These plumbing tools have been isolated and can now be used outside the Docker platform. Security work, and more precisely the integration of the work that has been done on Notary, will also be prioritised.

Docker also announced a product, called Project Orca, that integrates all of these tools in a unified and transparent manner. Working code was demonstrated, but the project is still under active development.

There were also developments in how the Docker open source project will move forward. Given that Docker has emerged as the de facto standard in container software, major players in the industry have agreed to promote it to a de jure standard, i.e. one formalised in a documented open specification.

For this to happen, a number of decisions have been made, the most important of which was the creation of the Open Container Initiative (OCI), which is now responsible for promoting and protecting the standard.

Docker’s container runtime code, renamed runC, has been donated as the reference implementation for the OCI. In partnership with the Linux Foundation, the founding members of the OCI are going to put in place a lightweight governance model that will steer the work in this area without becoming a barrier to innovation and creativity.


As a freelance developer who has often had to worry about the full application stack, I warmly welcome the addition of Docker to my workflow.

Docker has provided a unified, repeatable, reliable, and broadly standardised way of dealing with the infrastructural aspect of building, running, and deploying applications.

My excitement is further reinforced by the large array of tools and products being developed around Docker, many of which were unveiled at DockerCon 2015, including Docker Machine, Docker networking, Docker Swarm, Notary, and Project Orca.

The fact that there is a broad consensus in the industry to promote Docker as the standard container model is an additional reason for optimism.

About the Author

Dr Mazembo Mavungu Eddy is a Tech and Social Science Writer, Developer, Senior Research Consultant and Founder of Dielais. He lives in France and provides training and consultancy on Docker and related container technologies. Get in touch by writing to mazemb_eddy at or dielainfos at, or follow him on Twitter: @mazembo
